Post · Replies · Boosts · Views · Activity

Where in the App Store are the Background Assets keys visible: BAEssentialMaxInstallSize / BAMaxInstallSize
The Background Assets keys (BAEssentialMaxInstallSize / BAMaxInstallSize) declared in the app's Info.plist will be displayed to users on the App Store, as outlined in the WWDC video and supported by Apple's documentation. Could you please clarify where exactly on the App Store product page these values will be visible? Is the size shown on the App Store the sum of the app bundle size and the sizes specified in these keys in the Info.plist?
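For reference, a minimal sketch (my addition, assuming the keys are declared as integer byte counts in the app's Info.plist, which is my reading of the Background Assets documentation) that reads the declared values back at runtime:

    import Foundation

    // Hypothetical check: read the Background Assets size keys declared in Info.plist.
    // Values are assumed to be integers expressed in bytes.
    let info = Bundle.main.infoDictionary
    let essentialSize = info?["BAEssentialMaxInstallSize"] as? Int ?? 0
    let maxSize = info?["BAMaxInstallSize"] as? Int ?? 0
    print("BAEssentialMaxInstallSize: \(essentialSize) bytes, BAMaxInstallSize: \(maxSize) bytes")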
0
0
48
6d
APNS: Unexpired, priority=10, type=alert sometimes do not appear in Notification Center when coming back online
System information: iPhone 13, iOS 17.6.1

Steps to reproduce:
1. Open my app, causing it to register for an APNS token.
2. Kill my app to make sure it is not in the foreground.
3. Send a push notification with a payload similar to this: {"aps":{"alert":{"title":"My App Name","body":"10:24am 🚀🚀🚀"}},"price":19,"clock":175846989,"time":1731001868.379526} and the following attributes: Expiry: (date 7 days from now), Type: Alert, Priority: High (10), Payload size: 141 bytes.
4. The notification appears in the Notification Center, as expected.
5. Turn on Airplane Mode (WiFi off).
6. Wait between 60 seconds and 8 hours (varies).
7. Send the same notification payload/attributes again.
8. Wait between 60 seconds and 8 hours (varies).
9. Turn on WiFi.
10. Wait 1-30 minutes (varies).

Expected behavior: The notification appears in the Notification Center.

Actual behavior: Push notifications from other apps immediately appear in the Notification Center. Roughly 30% of the time, the push notification(s) from my app never arrive, even after waiting 30 minutes. Roughly 70% of the time, the notification appears in the Notification Center and everything works fine.

Thoughts: Expiry must be set correctly, because I've seen my notifications get queued and then delivered (correctly) in the CloudKit Push Notification tool. Identical notifications (payload, APNS headers, etc.) are also sent to other devices at the same time, and they receive them just fine. It must not be my iPhone's notification settings, because notifications appear correctly when online. I've tried restarting the iPhone; it did not fix the issue. So it seems it must be unexpected behavior in APNS, or something broken with my specific phone. I'm not sure what else I could possibly do to make sure the notifications arrive. This breaks the entire experience of my app: I need to be able to notify users of incoming messages so they do not miss them.
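As a diagnostic idea (my addition, not part of the original report): after coming back online, it can help to distinguish "never delivered" from "delivered but not noticed" by dumping what is currently sitting in Notification Center for the app:

    import UserNotifications

    // Hypothetical diagnostic sketch: list the notifications APNs actually delivered
    // to Notification Center for this app.
    func dumpDeliveredNotifications() {
        UNUserNotificationCenter.current().getDeliveredNotifications { notifications in
            for note in notifications {
                let content = note.request.content
                print("delivered at \(note.date): \(content.title) - \(content.body)")
            }
        }
    }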
6
0
171
1w
Critical alert delay
I am sending push notifications with critical alerts to the app, but there is a significant delay. If the number of target devices is 1,000 or fewer, notifications are received normally within a few seconds to a minute. Once the number of target devices exceeds 1,000, some devices receive the notification quickly (within a few seconds to a minute) and others receive it late (3 to 15 minutes), in batches of several hundred. In severe cases, notifications to more than 80% of the devices are delayed.

Example: if we send 3,000 notifications at once:
1 minute: 400 delivered
5 minutes: 1,000 delivered
10 minutes: 1,000 delivered
13 minutes: 600 delivered
(The 5-, 10-, and 13-minute timings change every time and are not at regular intervals.)

We understand that, according to the push notification specifications, sending several thousand messages at once is not a problem. Please let me know if there is a rule, such as sending 1,000 items at a time, that would let us deliver quickly and with minimal delay.
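Not from the original post, but for context on how large batches are usually sent: APNs expects many requests multiplexed over a few long-lived HTTP/2 connections rather than one connection per notification. A rough sketch, with the topic and token-based auth details as assumptions:

    import Foundation

    // Hypothetical sketch: fan the sends out over one URLSession, which multiplexes
    // the requests onto a shared HTTP/2 connection to APNs.
    func sendAll(deviceTokens: [String], payload: Data, jwt: String, topic: String) async {
        let session = URLSession(configuration: .ephemeral)
        await withTaskGroup(of: Void.self) { group in
            for token in deviceTokens {
                group.addTask {
                    var request = URLRequest(url: URL(string: "https://api.push.apple.com/3/device/\(token)")!)
                    request.httpMethod = "POST"
                    request.httpBody = payload
                    request.setValue("bearer \(jwt)", forHTTPHeaderField: "authorization")
                    request.setValue(topic, forHTTPHeaderField: "apns-topic")
                    request.setValue("alert", forHTTPHeaderField: "apns-push-type")
                    request.setValue("10", forHTTPHeaderField: "apns-priority")
                    _ = try? await session.data(for: request)
                }
            }
        }
    }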
3
0
125
6d
Issue: API Call Delays (5-10 Minutes) on Real Device in tvOS 18 After Call Completes, Works Fine in Debug Mode and Simulator
I am encountering an issue when making an API call using URLSession with DispatchQueue.global(qos: .background).async on a real device running tvOS 18. The code works as expected on tvOS 17 and in the simulator for tvOS 18, but on the real device with debug mode disabled, the data takes a few minutes (5 to 10) to load after the API call.

Code: here's the code I am using for the API call:

    appconfig.getFeedURLData(feedUrl: feedUrl, timeOut: kRequestTimeOut, apiMethod: ApiMethod.POST.rawValue) { (result) in
        self.EpisodeItems = Utilities.sharedInstance.getEpisodeArray(data: result)
    }

    func getFeedURLData(feedUrl: String, timeOut: Int, apiMethod: String, completion: @escaping (_ result: Data?) -> ()) {
        guard let validUrl = URL(string: feedUrl) else { return }
        var request = URLRequest(url: validUrl, cachePolicy: .useProtocolCachePolicy, timeoutInterval: TimeInterval(timeOut))
        let userPasswordString = "\(KappSecret):\(KappPassword)"
        let userPasswordData = userPasswordString.data(using: .utf8)
        let base64EncodedCredential = userPasswordData!.base64EncodedString(options: .lineLength64Characters)
        let authString = "Basic \(base64EncodedCredential)"
        let headers = [
            "authorization": authString,
            "cache-control": "no-cache",
            "user-agent": "TN-CTV-\(kPlateForm)-\(kAppVersion)"
        ]
        request.addValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpMethod = apiMethod
        request.allHTTPHeaderFields = headers
        let response = URLSession.requestSynchronousData(request as URLRequest)
        if response.1 != nil {
            do {
                guard let parsedData = try JSONSerialization.jsonObject(with: response.1!, options: .mutableContainers) as? AnyObject else {
                    print("Error parsing data")
                    completion(nil)
                    return
                }
                print(parsedData)
                completion(response.1)
                return
            } catch let error {
                print("Error: \(error.localizedDescription)")
                completion(response.1)
                return
            }
        }
        completion(response.1)
    }

    import Foundation

    public extension URLSession {
        static func requestSynchronousData(_ request: URLRequest) -> (URLResponse?, Data?) {
            var data: Data? = nil
            var responseData: URLResponse? = nil
            let semaphore = DispatchSemaphore(value: 0)
            let task = URLSession.shared.dataTask(with: request) { taskData, response, error in
                data = taskData
                responseData = response
                if data == nil, let error = error {
                    print(error)
                }
                semaphore.signal()
            }
            task.resume()
            _ = semaphore.wait(timeout: .distantFuture)
            return (responseData, data)
        }

        static func requestSynchronousDataWithURLString(_ requestString: String) -> (URLResponse?, Data?) {
            guard let url = URL(string: requestString.checkValidUrl()) else { return (nil, nil) }
            let request = URLRequest(url: url)
            return URLSession.requestSynchronousData(request)
        }
    }

Issue description:
Working scenario: The API call works fine on tvOS 17 and in the simulator for tvOS 18.
Problem: When running on a real device with tvOS 18 and debug mode disabled, the API call takes several minutes; the data only loads after 5 to 10 minutes. With debug mode enabled, it works fine.
Error messages:
Error Domain=WKErrorDomain Code=11 "Timed out while loading attributed string content" UserInfo={NSLocalizedDescription=Timed out while loading attributed string content}
NSURLConnection finished with error - code -1001
nw_read_request_report [C4] Receive failed with error "Socket is not connected"
Snapshot request 0x30089b3c0 complete with error: <NSError: 0x3009373f0; domain: BSActionErrorDomain; code: 1 ("response-not-possible")>
tcp_input [C7.1.1.1:3] flags=[R] seq=817957096, ack=0, win=0 state=CLOSE_WAIT rcv_nxt=817957096, snd_una=275546887

Environment:
Xcode version: 16.1
Real device: Model A1625 (32GB)
tvOS version: 18.1

Debugging steps I've taken:
I've verified that the issue does not occur in debug mode.
I've confirmed that the API call works fine on tvOS 17 and in the simulator (tvOS 18).
The error suggests a network timeout (-1001) and a socket connection issue ("Socket is not connected").

Questions:
Is this a known issue with tvOS 18 on real devices?
Are there any specific settings or configurations in tvOS 18 that could be causing the timeout error in non-debug mode?
Could this be related to how URLSession or networking behaves differently in release mode?

I would appreciate any help or insights into this issue!
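One thing worth ruling out (my suggestion, not part of the original post): blocking a .background QoS queue on a semaphore while waiting for URLSession's callback can be heavily deprioritized on device. A minimal sketch of the same request written with async/await so no thread is parked; feedUrl, timeOut, and apiMethod are carried over from the post, everything else is assumed:

    import Foundation

    // Hypothetical rewrite sketch: perform the request without a semaphore so no
    // background-QoS thread is blocked while waiting for the response.
    func getFeedURLData(feedUrl: String, timeOut: Int, apiMethod: String) async -> Data? {
        guard let url = URL(string: feedUrl) else { return nil }
        var request = URLRequest(url: url,
                                 cachePolicy: .useProtocolCachePolicy,
                                 timeoutInterval: TimeInterval(timeOut))
        request.httpMethod = apiMethod
        do {
            let (data, _) = try await URLSession.shared.data(for: request)
            return data
        } catch {
            print("Request failed: \(error)")
            return nil
        }
    }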
1
0
138
1w
iOS 18.2 beta (22C5131E): Speech Recognition Discards Previously Transcribed Audio
I was testing SFSpeechRecognizer on my real device running iOS 18.2 beta and found that even when the result's isFinal field is true, the result does not contain the entire conversation's transcription. I came across some blog posts saying this was fixed in an 18.1 beta; is that not the case for the 18.2 beta? Example code:

    recognitionTask = recognizer.recognitionTask(with: request) { [weak self] result, error in
        guard let self = self else { return }
        if let error = error {
            DispatchQueue.main.async {
                self.errorMessage = "Transcription failed: \(error.localizedDescription)"
                self.isTranscribing = false
            }
        } else if let result = result, result.isFinal {
            // HERE!
        }
    }
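A possible workaround sketch (my addition, assuming partial results are still delivered before the truncated final one): keep the latest transcription yourself instead of relying on the final result alone.

    // Hypothetical workaround: track the best transcription seen so far.
    var transcriptSoFar = ""

    recognitionTask = recognizer.recognitionTask(with: request) { result, error in
        if let result {
            // Remember every update; partial results normally cover the audio so far.
            transcriptSoFar = result.bestTranscription.formattedString
        }
        if (result?.isFinal ?? false) || error != nil {
            // Use transcriptSoFar here rather than only the final result's text.
            print("Transcript: \(transcriptSoFar)")
        }
    }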
4
0
194
1w
iOS Application Bluetooth Permission
Is it possible for an app's Bluetooth permission to be turned off as a result of changing the Bluetooth library the iOS application uses, for example because of Apple's security requirements or OS-related factors? There are two applications, Application A and Application B, that control Bluetooth devices. Application A uses a third-party Bluetooth library to control them, and Application B uses a different third-party Bluetooth library to do the same. Both applications work without any issues. However, when the Bluetooth library used in Application B was changed to the one used in Application A, the Bluetooth permission for Application B sometimes turned off. Since Application A and Application B each operate without issues on their own, we believe the problem is not with the Bluetooth libraries themselves. Given the above, is it possible that changing the Bluetooth library could cause the app's Bluetooth permission to be turned off due to Apple's security requirements or OS-related factors?
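One way to narrow this down (my suggestion, not part of the original question) is to log the system-reported authorization state before and after the library swap, so you can tell whether the permission itself was reset or the new library is failing for another reason:

    import CoreBluetooth

    // Hypothetical diagnostic sketch: log the app's current Bluetooth authorization.
    func logBluetoothAuthorization() {
        switch CBManager.authorization {
        case .allowedAlways:  print("Bluetooth permission: allowed")
        case .denied:         print("Bluetooth permission: denied")
        case .restricted:     print("Bluetooth permission: restricted")
        case .notDetermined:  print("Bluetooth permission: not determined")
        @unknown default:     print("Bluetooth permission: unknown state")
        }
    }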
1
0
87
6d
Developing a driver to read HFS disks on macOS Sonoma and newer
The capability to read and write HFS disks on the Mac was removed a long time ago, and the capability to simply read them was also removed, since Catalina I think. That is surprising and sometimes frustrating. I still use a 90's MacBook for a few tasks and need, from time to time, to transfer files to a newer Mac or read some old files stored on 3.5" disks. The solution I use is to read the disk on an old Mac running Mac OS X 10.6 (I'm lucky enough to have kept one) and transfer the files to a USB stick or via AirDrop. As there is of course no USB port on that MacBook (and I no longer have a working 56k modem to transfer by mail), the only option besides 3.5" disks is using the PCMCIA port on the MacBook to write to an SD card that can be read on macOS Sonoma. But reading 3.5" disks directly would be great. Hence my questions for the forum:
How hard would it be to write such a driver for READING only HFS on macOS Sonoma?
There is some software like FuseHFS. Did anyone try it? Did anyone have a look at the source code (said to be open source)?
Does anyone know why Apple removed this capability (I thought it was a tiny piece of code compared to the GB of present macOS)?
Thanks for any insights on the matter.
2
0
156
1w
Documentation on implementing the Core Telephony framework
We are a carrier in the US and would like documentation on implementing native eSIM creation in our app and installing the eSIM on the device directly through the app. The Core Telephony framework is available to do that (https://developer.apple.com/documentation/coretelephony), but I did not find any step-by-step documentation on how to implement it. We would also like to understand how we can read the IMEI of the phone, as we already have carrier privileges on our developer account.
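For what it's worth, a minimal sketch of the eSIM install path as I understand it, assuming the carrier provisioning entitlement is granted and that the SM-DP+ address and matching ID come from your own backend (the parameter names here are placeholders):

    import CoreTelephony

    // Hypothetical sketch: install an eSIM plan via Core Telephony.
    func installPlan(smdpPlusAddress: String, matchingID: String) {
        let request = CTCellularPlanProvisioningRequest()
        request.address = smdpPlusAddress   // SM-DP+ server address
        request.matchingID = matchingID     // activation-code matching ID
        CTCellularPlanProvisioning().addPlan(with: request) { result in
            // result is a CTCellularPlanProvisioningAddPlanResult (.success, .fail, .unknown)
            print("eSIM install result: \(result.rawValue)")
        }
    }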
1
0
128
1w
DNS requests failing when NEPacketTunnelProvider is running.
Hi, TL;DR: On iOS, when my packet tunnel is running, can I exclude DNS requests from going into the tunnel? I have a test app, using Apple's AsyncDNSResolver, that makes a DNS call, and it works when the tunnel is not running. If the tunnel is running, it times out after 30 seconds and I get the error -65568. Here's how I'm setting up the tunnel:

    func setup(tunnelRemoteAddress: String) {
        let settings = NEPacketTunnelNetworkSettings(tunnelRemoteAddress: tunnelRemoteAddress)
        settings.ipv4Settings = NEIPv4Settings(addresses: [tunnelRemoteAddress], subnetMasks: ["255.255.255.255"])
        settings.ipv4Settings?.includedRoutes = [NEIPv4Route.default()]

        let proxySettings = NEProxySettings()
        proxySettings.httpEnabled = true
        proxySettings.httpServer = NEProxyServer(address: ProxyServerConfiguration.host, port: ProxyServerConfiguration.port)
        proxySettings.httpsEnabled = true
        proxySettings.httpsServer = NEProxyServer(address: LocalProxyServerConfiguration.host, port: LocalProxyServerConfiguration.port)
        proxySettings.excludeSimpleHostnames = true
        proxySettings.exceptionList = nil

        let dnsSettings = NEDNSSettings(servers: ["8.8.8.8"])
        settings.dnsSettings = dnsSettings
        settings.proxySettings = proxySettings

        setTunnelNetworkSettings(settings) { error in
            // ...
        }
    }

I've tried all combinations of setting/excluding the NEDNSSettings, but the DNS call always fails when the tunnel is running. Thanks for any help.
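Not from the original post, just an idea to try: since includedRoutes is the default route, one way to keep DNS traffic out of the tunnel is to exclude a host route to the resolver's address via excludedRoutes (using the 8.8.8.8 server from the snippet above; whether that fits your routing policy is a separate question):

    // Hypothetical sketch: route traffic to the DNS resolver outside the tunnel.
    let ipv4Settings = NEIPv4Settings(addresses: [tunnelRemoteAddress], subnetMasks: ["255.255.255.255"])
    ipv4Settings.includedRoutes = [NEIPv4Route.default()]
    ipv4Settings.excludedRoutes = [
        NEIPv4Route(destinationAddress: "8.8.8.8", subnetMask: "255.255.255.255")
    ]
    settings.ipv4Settings = ipv4Settings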
4
0
129
1w
Unexpected system behavior with 'getifaddrs'
On iOS beta, monitoring network usage using the getifaddrs API sporadically causes system volume spikes. This happens even though the application does not interact with any audio-related code. The issue persists across different polling intervals (e.g., 0.05 s to 1 s) and only occurs when invoking getifaddrs. Replacing the API calls with mock data eliminates the problem, suggesting a potential issue with getifaddrs in the beta environment. The application updates UI elements based on network activity, but the volume spikes occur independently of the UI or other observable app behavior.

Steps to reproduce:
1. Create an app that monitors network usage using the getifaddrs API.
2. Fetch network statistics on a timer (e.g., every 0.05 seconds).
3. Observe system behavior while running the app on the iOS beta.
4. Note sporadic volume spikes during app runtime.

Expected result: Polling network usage with getifaddrs should not affect system volume or other unrelated resources.

Actual result: System volume spikes occasionally when network statistics are retrieved using getifaddrs.

Environment: iOS 18.2 Beta, tested on a physical device (iPhone 15 Pro).
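For anyone trying to reproduce this, a minimal sketch of the kind of polling described above (assumed structure, not the reporter's exact code):

    import Darwin

    // Hypothetical sketch: sum per-interface byte counters via getifaddrs.
    func currentTrafficBytes() -> (received: UInt64, sent: UInt64) {
        var totals: (received: UInt64, sent: UInt64) = (0, 0)
        var ifaddrPtr: UnsafeMutablePointer<ifaddrs>?
        guard getifaddrs(&ifaddrPtr) == 0, let first = ifaddrPtr else { return totals }
        defer { freeifaddrs(ifaddrPtr) }
        for ptr in sequence(first: first, next: { $0.pointee.ifa_next }) {
            guard let addr = ptr.pointee.ifa_addr,
                  addr.pointee.sa_family == UInt8(AF_LINK),
                  let stats = ptr.pointee.ifa_data?.assumingMemoryBound(to: if_data.self) else { continue }
            totals.received += UInt64(stats.pointee.ifi_ibytes)
            totals.sent += UInt64(stats.pointee.ifi_obytes)
        }
        return totals
    }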
1
0
134
1w
Drivers are not visible in iOS 18 Public beta
Hello team, I am using the USBDriverKit and DriverKit frameworks in my application to communicate with a USB device. After updating my iPad to iPadOS 18 public beta, the option to enable drivers no longer appears on my application's Settings page. However, I am able to see that option on the developer beta of iPadOS 18. Can anyone guide me on how to proceed, as I am unable to use my USB devices?
8
1
567
Sep ’24
Allow "App" to find the devices on local network?
Hi, On macOS 15 beta 7, we get a network popup while launching the application: "Allow "App" to find the devices on local network?" We did not see this popup in older versions of macOS. We also see a new option in System Settings > Privacy & Security > Local Network. Is there a way to add the application's entry to "Local Network" through a command so that we can suppress this popup when launching the application? Regards, Prema Kumar
6
0
2.3k
Aug ’24
Local Network permission prompt for daemon on macOS 15
Hi Team, The OS is prompting for local network permission for our application, which runs as a root-level daemon. Per our analysis, the prompt appears to come from our own library, which gets network info by running /usr/sbin/system_profiler with "-xml -detailLevel basic SPNetworkDataType", iterating the items to find DNS.ServerAddresses, and then calling [NSHost hostWithAddress:IPAddress]. (When this library is not linked into the app there is no prompt, so most likely this is the code that triggers it.) Is this expected? Is there any other way to get the DNS host name without being prompted for local network permission on macOS 15?
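As an experiment (my addition, not a confirmed way around the prompt): the configured DNS server addresses can be read directly from the SystemConfiguration dynamic store instead of spawning system_profiler; whether a subsequent reverse lookup still triggers the Local Network prompt depends on what that lookup does on the wire.

    import SystemConfiguration

    // Hypothetical sketch: read DNS server addresses from the dynamic store.
    func configuredDNSServers() -> [String] {
        guard let store = SCDynamicStoreCreate(nil, "dns-reader" as CFString, nil, nil),
              let dns = SCDynamicStoreCopyValue(store, "State:/Network/Global/DNS" as CFString) as? [String: Any],
              let servers = dns["ServerAddresses"] as? [String] else {
            return []
        }
        return servers
    }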
21
1
1.2k
Sep ’24
SwiftData ModelContext Fetch Crashing
I'm currently using Xcode 16 Beta (16A5171c) and I'm getting a crash whenever I attempt to fetch using the ModelContext in my SwiftUI view (obtained from the environment). The crash happens specifically on iOS 18 simulators. I've opened feedback FB13831520, but it's worth noting that I can run the code described below on iOS 17+ simulators and devices just fine. I'm getting the following crash:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'The specified URI is not a valid Core Data URI: x-coredata:///MyApp/XXXXX-XXXXX-XXXX-XXXX-XXXXXXXXXXXX'

It's almost as if, on iOS 18, SwiftData is unable to find the file on the simulator to perform CRUD operations. All I'm doing in my project is fetching data using the modelContext:

    func contains(_ model: MyModel, in context: ModelContext) -> Bool {
        let objId = model.persistentModelID
        let fetchDesc = FetchDescriptor<MyModel>(predicate: #Predicate { $0.persistentModelID == objId })
        let itemCount = try? context.fetchCount(fetchDesc)
        return itemCount != 0
    }
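One experiment that may help isolate this (my suggestion, not from the original post): pin the store to an explicit file URL with a ModelConfiguration, so you can confirm whether the iOS 18 simulator is actually looking at the store file you expect. The file name here is a placeholder.

    import Foundation
    import SwiftData

    // Hypothetical sketch: build the container with an explicit store URL.
    let storeURL = URL.applicationSupportDirectory.appending(path: "MyApp.store")
    let configuration = ModelConfiguration(url: storeURL)
    let container = try ModelContainer(for: MyModel.self, configurations: configuration)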
8
5
1.3k
Jun ’24