Is there any threading assumption/requirement for ASIdentifierManager.advertisingIdentifier?
Please see the complete issue and stack trace here. The main thread was waiting for the worker thread, which was blocked in ASIdentifierManager.advertisingIdentifier.
Thread 68:
0 libsystem_kernel.dylib 0x188abf0f4 mach_msg_trap (in libsystem_kernel.dylib) + 8
1 libsystem_kernel.dylib 0x188abe5a0 mach_msg (in libsystem_kernel.dylib) + 72
2 libdispatch.dylib 0x188924880 _dispatch_mach_send_and_wait_for_reply (in libdispatch.dylib) + 500
3 libdispatch.dylib 0x188924d10 dispatch_mach_send_with_result_and_wait_for_reply$VARIANT$mp (in libdispatch.dylib) + 52
4 libxpc.dylib 0x188b8391c xpc_connection_send_message_with_reply_sync (in libxpc.dylib) + 204
5 Foundation 0x189aafa28 __NSXPCCONNECTION_IS_WAITING_FOR_A_SYNCHRONOUS_REPLY__ (in Foundation) + 12
6 Foundation 0x189892f60 -[NSXPCConnection sendInvocation:orArguments:count:methodSignature:selector:withProxy:] (in Foundation) + 3608
7 CoreFoundation 0x188f3276c ___forwarding___ (in CoreFoundation) + 552
8 CoreFoundation 0x188f3475c forwarding_prep_0 (in CoreFoundation) + 92
9 CoreServices 0x1b1896ce4 -[LSApplicationWorkspace deviceIdentifierForAdvertising] (in CoreServices) + 160
10 AdSupport 0x198f70a60 -[ASIdentifierManager advertisingIdentifier] (in AdSupport) + 56
I don't see any threading-related information in the documentation. I tried to recreate the scenario, but I could not reproduce the issue with the simplified test below.
#import "ViewController.h"
#import "AdSupport/AdSupport.h"
#import "AppTrackingTransparency/AppTrackingTransparency.h"
@interface ViewController () {
dispatch_queue_t _queue;
}
@end
@implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
_queue = dispatch_queue_create("IdentityWorkerQueue", DISPATCH_QUEUE_SERIAL);
dispatch_set_target_queue(_queue,
dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0));
if (@available(iOS 14, *)) {
[ATTrackingManager requestTrackingAuthorizationWithCompletionHandler:
^(ATTrackingManagerAuthorizationStatus status) {
NSLog(@"Status: %lu", (unsigned long)status);
}];
}
}
- (IBAction)action:(id)sender {
__block NSString *adId1 = nil;
dispatch_async(self->_queue, ^{
sleep(1);
adId1 = ASIdentifierManager.sharedManager.advertisingIdentifier.UUIDString;
});
__block NSString *adId2 = nil;
dispatch_sync(self->_queue, ^{
adId2 = adId1;
});
}
@end
Is there any threading assumption/requirement for ASIdentifierManager.advertisingIdentifier? For example, would it wait for the main thread to finish a certain task when an error happens?
I have an app already in the App Store that uses Core Data, and I want to allow users to upload their data to iCloud so that, if they delete the app or change devices, they don't lose it.
Every time the app is opened, a synchronization happens between Core Data and a JSON fixture file in order to fill the app with its base values (creating NSManagedObject instances from the aforementioned fixture).
Before the iCloud sync, the Core Data model had uniqueness constraints on several entities, but these had to be stripped since CloudKit doesn't support them. Most of these constraints were basically ids that come from the JSON and represent an item in our Firebase database.
Given that, I want to make the underlying CKRecord.ID the same as these ids, so I can avoid the situation where, if a person opens the app on a second device, the fixture data is duplicated, since the fixture runs before the iCloud sync happens.
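If making the record IDs match isn't possible, the fallback I'm considering is a de-duplication pass when the fixture runs, so a second device skips entries that have already synced down. A minimal sketch of what I mean (FixtureItem and remoteID are placeholders for my real entity and attribute names):
import CoreData

// Hypothetical dedupe-on-import pass: only create a managed object for a
// fixture entry if no object with the same remote id already exists.
// "FixtureItem" and "remoteID" are placeholder names.
func importFixture(_ entries: [[String: Any]], into context: NSManagedObjectContext) throws {
    for entry in entries {
        guard let id = entry["id"] as? String else { continue }

        let request = NSFetchRequest<NSManagedObject>(entityName: "FixtureItem")
        request.predicate = NSPredicate(format: "remoteID == %@", id)
        request.fetchLimit = 1

        // Skip entries that were already created locally or synced down from iCloud.
        if try context.count(for: request) == 0 {
            let item = NSEntityDescription.insertNewObject(forEntityName: "FixtureItem", into: context)
            item.setValue(id, forKey: "remoteID")
            // ...populate the remaining attributes from the fixture entry...
        }
    }
    if context.hasChanges { try context.save() }
}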
Is that possible? Any help would be appreciated.
I have a Safari web extension that needs the ability to open the popup when the user interacts with a modal coming from the content script.
There is a native message handler that comes with the Safari web extension when you first create it:
import SafariServices
import os.log
class SafariWebExtensionHandler: NSObject, NSExtensionRequestHandling {
    func beginRequest(with context: NSExtensionContext) {
        let item = context.inputItems[0] as! NSExtensionItem
        let message = item.userInfo?[SFExtensionMessageKey]
        os_log(.default, "Received message from browser.runtime.sendNativeMessage: %@", message as! CVarArg)

        let response = NSExtensionItem()
        response.userInfo = [SFExtensionMessageKey: ["Response to": message]]
        print("hit")
        context.completeRequest(returningItems: [response], completionHandler: nil)
    }
}
I have the permissions to send a native message. I've seen some examples online where you can access SFSafariApplication from SafariServices and open the popover, but I can't seem to access SFSafariApplication in this module.
sudo ifconfig en0 ether [MAC address]
Now results in
ifconfig: ioctl (SIOCAIFADDR): Can't assign requested address
I am seeing crash reports from our users at this line of code:
NSURL *url = [NSURL URLWithString:self->urlAddress];
self->urlAddress is never nil.
My understanding is that if self->urlAddress is malformed in some way then URLWithString should return nil, but it seems to crash the thread as below.
Does anybody have any suggestions as to what might be wrong?
Thread 3 name:
Thread 3 Crashed:
0 CoreFoundation 0x00000001a73704a0 CFStringGetLength + 60 (CFInternal.h:889)
1 CoreFoundation 0x00000001a739b5b4 _CFURLCreateWithURLString + 84 (CFURL.c:2046)
2 Foundation 0x00000001a8635908 +[NSURL(NSURL) URLWithString:relativeToURL:] + 52 (NSURL.m:463)
3 Dubline 0x0000000102f3391c __58-[settingsAccountViewController reloadDublineSettingsPage]_block_invoke + 1024 (settingsAccountViewController.m:273)
4 CFNetwork 0x00000001a79f23dc __40-[__NSURLSessionLocal taskForClassInfo:]_block_invoke + 540 (LocalSession.mm:687)
5 CFNetwork 0x00000001a7a04768 __49-[__NSCFLocalSessionTask _task_onqueue_didFinish]_block_invoke + 244 (LocalSessionTask.mm:584)
6 libdispatch.dylib 0x00000001a6fb9a84 _dispatch_call_block_and_release + 32 (init.c:1466)
7 libdispatch.dylib 0x00000001a6fbb81c _dispatch_client_callout + 20 (object.m:559)
8 libdispatch.dylib 0x00000001a6fc3004 _dispatch_lane_serial_drain + 620 (inline_internal.h:2557)
9 libdispatch.dylib 0x00000001a6fc3c34 _dispatch_lane_invoke + 456 (queue.c:3862)
10 libdispatch.dylib 0x00000001a6fce4bc _dispatch_workloop_worker_thread + 764 (queue.c:6589)
11 libsystem_pthread.dylib 0x00000001f304a7a4 0x1f3047000 + 14244
12 libsystem_pthread.dylib 0x00000001f305174c 0x1f3047000 + 42828
I took delivery of my first M1 Mac (an iMac running Big Sur 11.4) and with great anticipation installed my iOS VoIP app from the App Store.
I was greatly disappointed to see that:
There were no VoIP pushes to start an incoming call.
CallKit does not seem to work, so I get no audio.
Am I missing something? Are there some permissions or a configuration I might need to set?
Or is it just that CallKit and PushKit don't work, even though developer.apple.com states that they are supported on macOS 10.15+?
Any advice or guidance greatly appreciated.
Very disappointed :-(
I regularly see questions, both here on DevForums and in my Day Job™ at DTS, that are caused by a fundamental misunderstanding of how background execution works on iOS. These come in many different variants, for example:
How do I keep my app running continuously in the background?
If I schedule a timer, how do I get it to fire when the screen is locked?
How do I run code in the background every 15 minutes?
How do I set up a network server that runs in the background?
How can my app provide an IPC service to another one of my apps while it’s in the background?
How can I resume my app in the background if it’s been ‘force quit’ by the user?
The short answer to all of these is You can’t. iOS puts strict limits on background execution. Its default behaviour is to suspend your app shortly after the user has moved it to the background; this suspension prevents the process from running any code.
There’s no general-purpose mechanism for:
Running code continuously in the background
Running code at some specific time in the background
Running code periodically at a guaranteed interval
Resuming in the background in response to a network or IPC request
However, iOS does provide a wide range of special-purpose mechanisms for accomplishing specific user goals. For example:
If you’re building a music player, use the audio background mode to continue playing after the user has moved your app to the background.
If you’re building a timer app, use a local notification to notify the user when your timer has expired.
If you’re building a video player app, use AVFoundation’s download support.
Keep in mind that this is just a short list of examples. There are many other special-purpose background execution mechanisms, so you should search the documentation for something appropriate to your needs.
IMPORTANT Each of these mechanisms fulfils a specific purpose. Do not attempt to use them for some other purpose. Before using a background API, read clause 2.5.4 of the App Review Guidelines.
Additionally, iOS provides some general-purpose mechanisms for background execution:
To resume your app in the background in response to an event on your server, use a background notification (aka a ‘silent’ push). For more information, see Pushing background updates to your App.
To request a small amount of background execution time to refresh your UI, use BGAppRefreshTaskRequest (see the sketch just after this list).
To request extended background execution time, typically delivered overnight when the user is asleep, use BGProcessingTaskRequest.
To prevent your app from being suspended for a short period of time so that you can complete some user task, use a UIApplication background task. For more information on this, see UIApplication Background Task Notes.
To download or upload a large HTTP resource, use an NSURLSession background session.
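To make the app refresh case concrete, here's a minimal sketch (the identifier is a placeholder and must also be listed in your Info.plist under BGTaskSchedulerPermittedIdentifiers):
import BackgroundTasks

let refreshTaskID = "com.example.app.refresh"  // placeholder; must match Info.plist

// Call early during launch, before the app finishes launching.
func registerAppRefresh() {
    _ = BGTaskScheduler.shared.register(forTaskWithIdentifier: refreshTaskID, using: nil) { task in
        scheduleAppRefresh()  // always re-submit a request for next time
        task.expirationHandler = {
            // Cancel any in-flight work here.
        }
        // ...do a small amount of refresh work here...
        task.setTaskCompleted(success: true)
    }
}

func scheduleAppRefresh() {
    let request = BGAppRefreshTaskRequest(identifier: refreshTaskID)
    request.earliestBeginDate = Date(timeIntervalSinceNow: 15 * 60)  // a hint, not a guarantee
    do {
        try BGTaskScheduler.shared.submit(request)
    } catch {
        print("Could not schedule app refresh: \(error)")
    }
}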
All of these mechanisms prevent you from abusing them to run arbitrary code in the background. As an example, consider the NSURLSession resume rate limiter.
For more information about these limitations, and background execution in general, I strongly recommend that you watch WWDC 2020 Session 10063 Background execution demystified. It’s an excellent resource.
Specifically, this talk addresses a common misconception about the app refresh mechanism (BGAppRefreshTaskRequest and the older background fetch API). Folks assume that app refresh will provide regular background execution time. That’s not the case. The system applies a range of heuristics to decide which apps get app refresh time and when. This is a complex issue, one that I’m not going to try to summarise here, but the take-home message is that, if you expect that the app refresh mechanism will grant you background execution time, say, every 15 minutes, you’ll be disappointed. In fact, there are common scenarios where it won’t grant you any background execution time at all! Watch the talk for the details.
When the user ‘force quits’ an app by swiping up in the multitasking UI, iOS interprets that to mean that the user doesn’t want the app running at all. So:
If the app is running, iOS terminates it.
iOS also sets a flag that prevents the app from being launched in the background. That flag gets cleared when the user next launches the app manually.
This gesture is a clear statement of user intent; there’s no documented way for your app to override the user’s choice.
Note In some circumstances iOS will not honour this flag. The exact cases where this happens are not documented and have changed over time.
Finally, if you have questions about background execution that aren’t covered by the resources listed here, please open a new thread on DevForums with the details. Tag it appropriately for the technology you’re using; if nothing specific springs to mind, use Background Tasks. Also, make sure to include details about the specific problem you’re trying to solve because, when it comes to background execution, the devil really is in the details.
Share and Enjoy
—
Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"
Change history:
2024-03-21 Added a discussion of ‘force quit’.
2023-05-11 Added a paragraph that explains a common misconception about the app refresh mechanism. Made other minor editorial changes.
2021-08-12 Added more entries to the common questions list, this time related to networking and IPC. Made minor editorial changes.
2021-07-26 Extended the statement about what’s not possible to include “running code periodically at a guaranteed interval”.
2021-07-22 First posted.
On iOS, I want to add undo/redo and a close button. On iPadOS, I only need to add a close button.
What's your experience adding a close button to the PKToolPicker? Or at least getting the position of its window so I can add an overlapping box (even when it's floating)?
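In case it helps frame the question, this sketch shows the kind of thing I'm after, assuming frame(obscuredIn:) is the right hook for locating the picker (the button geometry is made up):
import PencilKit
import UIKit

// Sketch: find where the tool picker sits in my view so I can overlay a close
// button on it, even when the picker is floating on iPad.
func closeButtonFrame(for toolPicker: PKToolPicker, in view: UIView) -> CGRect? {
    let pickerFrame = toolPicker.frame(obscuredIn: view)
    guard !pickerFrame.isNull else { return nil }  // picker isn't obscuring this view
    // Place a 44x44 button at the picker's top-right corner (placeholder geometry).
    return CGRect(x: pickerFrame.maxX - 44, y: pickerFrame.minY, width: 44, height: 44)
}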
Hello
I've noticed that this product, heavily promoted on the ASC forums for many years, is no longer available from the Apple App Store.
Can anyone tell me the reason why the product is no longer supported?
Friends have asked me if it is 'safe' to use.
Is it?
Note to moderator: If I'm asking in the wrong places, please redirect my question. Thank you.
Background
I have an established app in the App Store which has been using NSPersistentCloudKitContainer since iOS 13 without any issues.
I've been running my app normally on an iOS device running the iOS 15 betas, mainly to see problems arise before my users see them.
Ever since iOS 15 (beta 4) my app has failed to sync changes - no matter how small the change. An upload 'starts' but never completes. After a minute or so the app quits to the Home Screen and no useful information can be gleaned from crash reports. Until now I've had no idea what's going on.
Possible Bug in the API?
I've managed to replicate this behaviour on the simulator and on another device when building my app with Xcode 13 (beta 5) on iOS 15 (beta 5).
It appears that NSPersistentCloudKitContainer has a memory leak and keeps ramping up RAM consumption (and CPU at 100%) until the operating system kills the app. No code of mine is running.
I'm not really an expert on these things, and I tried to use Instruments to see if that would show me anything. It appears to be related to NSCloudKitMirroringDelegate getting 'stuck' somehow, but I have no idea what to do with this information.
My Core Data database is not tiny, but not massive by any means, and NSPersistentCloudKitContainer had no problems syncing to iCloud prior to iOS 15 (beta 4).
If I restore my app data (from an external backup file - 700MB with lots of many-to-many, many-to-one relationships, CKAssets, etc.) the data all gets added to Core Data without an issue at all. The console log (see below) then shows that a sync is created, scheduled & then started... but no data is uploaded.
At this point the memory consumption starts, and all I see are 'backgroundTask' warnings (only related to CloudKit) with no code of mine running.
CoreData: CloudKit: CoreData+CloudKit: -[PFCloudKitExporter analyzeHistoryInStore:withManagedObjectContext:error:](501): <PFCloudKitExporter: 0x600000301450>: Exporting changes since (0): <NSPersistentHistoryToken - {
"4B90A437-3D96-4AC9-A27A-E0F633CE5D9D" = 906;
}>
CoreData: CloudKit: CoreData+CloudKit: -[PFCloudKitExportContext processAnalyzedHistoryInStore:inManagedObjectContext:error:]_block_invoke_3(251): Finished processing analyzed history with 29501 metadata objects to create, 0 deleted rows without metadata.
CoreData: CloudKit: CoreData+CloudKit: -[NSCloudKitMirroringDelegate _scheduleAutomatedExportWithLabel:activity:completionHandler:](2800): <NSCloudKitMirroringDelegate: 0x6000015515c0> - Beginning automated export - ExportActivity:
<CKSchedulerActivity: 0x60000032c500; containerID=<CKContainerID: 0x600002ed3240; containerIdentifier=iCloud.com.nitramluap.Somnus, containerEnvironment="Sandbox">, identifier=com.apple.coredata.cloudkit.activity.export.4B90A437-3D96-4AC9-A27A-E0F633CE5D9D, priority=2, xpcActivityCriteriaOverrides={ Priority=Utility }>
CoreData: CloudKit: CoreData+CloudKit: -[NSCloudKitMirroringDelegate executeMirroringRequest:error:](765): <NSCloudKitMirroringDelegate: 0x6000015515c0>: Asked to execute request: <NSCloudKitMirroringExportRequest: 0x600002ed2a30> CBE1852D-7793-46B6-8314-A681D2038B38
2021-08-13 08:41:01.518422+1000 Somnus[11058:671570] [BackgroundTask] Background Task 68 ("CoreData: CloudKit Export"), was created over 30 seconds ago. In applications running in the background, this creates a risk of termination. Remember to call UIApplication.endBackgroundTask(_:) for your task in a timely manner to avoid this.
2021-08-13 08:41:03.519455+1000 Somnus[11058:671570] [BackgroundTask] Background Task 154 ("CoreData: CloudKit Scheduling"), was created over 30 seconds ago. In applications running in the background, this creates a risk of termination. Remember to call UIApplication.endBackgroundTask(_:) for your task in a timely manner to avoid this.
Just wondering if anyone else is having a similar issue? It never had a problem syncing an initial database restore prior to iOS 15 (beta 4) and the problems started right after installing iOS 15 (beta 4).
I've submitted this to Apple Feedback and am awaiting a response (FB9412346). If this is unfixable I'm in real trouble (and my users are going to be livid).
Thanks in advance!
While in a text input area, when a Bluetooth keyboard is connected to an iOS device and the on-screen keyboard is hidden, is there a way to start Siri's dictation function?
(Besides activating the on-screen keyboard and clicking the microphone button.)
Hi!
I'm looking for some insight and guidance on using the Foundation.Process type with a PTY (pseudo-terminal) so that the subprocess can accept input and behave as if it were running in a terminal.
The reason for needing a PTY is that programs like ssh, or in my case xcodes, ask for user input, including passwords. Running these via Foundation.Process does not display the prompts to the user because the output is buffered (this works fine in the Xcode debugger console, but when running in a real terminal the buffering means the prompts are never displayed).
Looking at other threads, it seems like the correct approach here is to create a PTY and attach its file handles to the Process.
While I've got this to work to the point where prompts are now shown, I can't seem to figure out how to pass input back to the process, as it's now controlled by the PTY.
Here is my Process setup:
let process = Process()
// Set up the process with its path, arguments, etc...

// Set up the PTY handles.
var parentDescriptor: Int32 = 0
var childDescriptor: Int32 = 0
guard Darwin.openpty(&parentDescriptor, &childDescriptor, nil, nil, nil) != -1 else {
    fatalError("Failed to spawn PTY")
}

parentHandle = FileHandle(fileDescriptor: parentDescriptor, closeOnDealloc: true)
childHandle = FileHandle(fileDescriptor: childDescriptor, closeOnDealloc: true)

process.standardInput = childHandle
process.standardOutput = childHandle
process.standardError = childHandle
With this setup I then read the parent handle and output any result it gets (such as the input prompts):
parentHandle?.readabilityHandler = { handle in
    guard let line = String(data: handle.availableData, encoding: .utf8), !line.isEmpty else {
        return
    }
    logger.notice("\(line)")
}
When process.run() is executed, the program runs and I can see it ask for Apple ID: input in my terminal. However, when I type into the terminal, the process does not seem to react to this input.
I've tried forwarding FileHandle.standardInput:
FileHandle.standardInput.readabilityHandler = { handle in
    parentHandle?.write(handle.availableData)
}
But this doesn't seem to work either.
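My current theory is that my own terminal is still in canonical (line-buffered, echoing) mode, so keystrokes aren't reaching the child the way a real terminal session would deliver them. This is the raw-mode experiment I'm trying, assuming termios is the right tool here:
import Darwin
import Foundation

// Sketch (unverified assumption): switch our controlling terminal to raw mode so
// each keystroke is delivered immediately and un-echoed, then forward everything
// to the PTY's parent end and let the child's PTY handle echoing.
var originalTermios = termios()
tcgetattr(STDIN_FILENO, &originalTermios)

var raw = originalTermios
cfmakeraw(&raw)
tcsetattr(STDIN_FILENO, TCSANOW, &raw)

FileHandle.standardInput.readabilityHandler = { handle in
    parentHandle?.write(handle.availableData)
}

// Restore the original settings once the child exits:
// tcsetattr(STDIN_FILENO, TCSANOW, &originalTermios)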
What is the recommended way to setup a PTY with Foundation.Process for executing arbitrary programs and having them behave as if they were being run in a terminal context?
Most of the resources I found online are about other languages, and I'd like to stick with Foundation.Process vs. doing anything custom in C/C++ if possible, as it makes the code easier to reason about and maintain. The resources for Swift on this topic are lacking, and the open source projects I've checked out that claim to do this mostly require manually sending input to the PTY handle vs. accepting it from the user in a terminal.
Any insight / help is very much appreciated!
Hi, when trying to test my App Clip, if there's no link in the _XCAppClipURL environment variable in the scheme (or a TestFlight invocation), userActivity.webpageURL is "https://example.com", even though I don't have this link anywhere in my project.
This is my code for getting the link (I'm using UIKit):
func scene(_ scene: UIScene, continue userActivity: NSUserActivity) {
    guard userActivity.activityType == NSUserActivityTypeBrowsingWeb,
          let incomingURL = userActivity.webpageURL else {
        return
    }
    print("Incoming URL: \(incomingURL)")
}
I removed _XCAppClipURL from the environment variables in the scheme and when I run the code I get:
Incoming URL: https://example.com
Is this a bug? How can I get rid of this https://example.com URL?
Hi all,
I’m trying to use NSMetadataQuery on iOS to track changes to folders users have imported from elsewhere but, no matter what I try, I get no results.
Following the documentation for searching file metadata with NSMetadataQuery, I'm creating a live query (albeit in Swift) and listening for […]QueryDidFinishGathering and […]QueryDidUpdate. The former fires, with no results, and the latter never fires.
I’ve also tried following the Synchronizing Documents in the iCloud Environment example, adding the appropriate Ubiquity keys to my Info.plist and .entitlements file, with no change.
I’m importing files and folders using SwiftUI’s View.fileImporter(isPresented:allowedContentTypes:allowsMultipleSelection:onCompletion:), but can’t see how I might security-scope the NSMetadataQuery’s execution (if that’s even a thing?).
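If security-scoping is indeed required here, I assume it would involve the usual bookmark dance around the imported URLs, something like this sketch (error handling elided):
import Foundation

// Hypothetical sketch: persist access to an imported folder with a
// security-scoped bookmark, and re-open it before running the query.
func persistAccess(to url: URL) throws -> Data {
    guard url.startAccessingSecurityScopedResource() else {
        throw CocoaError(.fileReadNoPermission)
    }
    defer { url.stopAccessingSecurityScopedResource() }
    return try url.bookmarkData()
}

func restoreAccess(from bookmark: Data) throws -> URL {
    var isStale = false
    let url = try URL(resolvingBookmarkData: bookmark, bookmarkDataIsStale: &isStale)
    _ = url.startAccessingSecurityScopedResource()  // balance with stopAccessing... when done
    return url
}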
My test project is on GitHub, but the main parts are below…
My query method:
extension NSMetadataQueryUbiquitousExternalDocumentsTestApp {
    func findAccessibleFiles() {
        query.stop()
        fileMonitor?.cancel()

        fileMonitor = Publishers.MergeMany(
            [
                .NSMetadataQueryDidFinishGathering,
                .NSMetadataQueryDidUpdate
            ].map { NotificationCenter.default.publisher(for: $0) }
        )
        .receive(on: DispatchQueue.main)
        .sink { notification in
            query.disableUpdates()
            defer { query.enableUpdates() }

            foundItems = query.results as! [NSMetadataItem]
            print("Query posted \(notification.name.rawValue) with results: \(query.results)")
        }

        query.searchScopes = [
            NSMetadataQueryAccessibleUbiquitousExternalDocumentsScope
        ]
        query.predicate = NSPredicate(
            format: "%K LIKE %@",
            argumentArray: [NSMetadataItemFSNameKey, "*"]
        )
        query.sortDescriptors = [
            NSSortDescriptor(key: NSMetadataItemFSNameKey, ascending: true)
        ]

        if query.start() {
            print("Query started")
        } else {
            print("Query didn't start for some reason")
        }
    }
}
Info.plist:
[…]
<key>NSUbiquitousContainers</key>
<dict>
    <key>iCloud.com.stevemarshall.AnnotateML</key>
    <dict>
        <key>NSUbiquitousContainerIsDocumentScopePublic</key>
        <true/>
        <key>NSUbiquitousContainerName</key>
        <string>AnnotateML</string>
        <key>NSUbiquitousContainerSupportedFolderLevels</key>
        <string>ANY</string>
    </dict>
</dict>
[…]
I am trying to migrate to the new APNs Provider API.
Here is how I've been registering for push notifications:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    //-- Set up notifications.
    if ([application respondsToSelector:@selector(isRegisteredForRemoteNotifications)])
    {
        // iOS 8 notifications: registering for notifications and types.
        [application registerUserNotificationSettings:
            [UIUserNotificationSettings settingsForTypes:(UIUserNotificationTypeSound |
                                                          UIUserNotificationTypeAlert |
                                                          UIUserNotificationTypeBadge)
                                              categories:nil]];
        [application registerForRemoteNotifications];
    }
    else
    {
        // iOS < 8 notifications.
        _storyBoard = [UIStoryboard storyboardWithName:@"MainStoryboard" bundle:nil];
        [[UIApplication sharedApplication] registerForRemoteNotificationTypes:
            (UIRemoteNotificationTypeSound | UIRemoteNotificationTypeAlert)];
    }

    _storyBoard = [UIStoryboard storyboardWithName:@"MainStoryboard" bundle:nil];
    [[UIApplication sharedApplication] registerForRemoteNotificationTypes:
        (UIRemoteNotificationTypeSound | UIRemoteNotificationTypeBadge | UIRemoteNotificationTypeAlert)];

    if (launchOptions != nil)
    {
        NSDictionary *dictionary = [launchOptions objectForKey:UIApplicationLaunchOptionsRemoteNotificationKey];
        if (dictionary != nil)
        {
            NSLog(@"Launched from push notification: %@", dictionary);
            /*[self addMessageFromRemoteNotification:dictionary updateUI:NO];*/
        }
    }
    return YES;
}
Within the last week, I have been using the following terminal command from Sending Push Notifications Using Command-Line Tools to successfully send a test push notification to a testing device.
curl -v --header 'apns-topic: com.domain.appname' --header apns-push-type: alert --cert aps.cer --cert-type DER --key PushChatKey.pem --key-type PEM --data '{"aps":{"alert":"Test"}}' --http2 https://api.sandbox.push.apple.com/3/device/258ecf658e25256c8f06ddb1138d5d536ba0e760a96ebd12d3b1dbe112857c58
Recently, after creating a provisioning profile and adding it to Xcode, the app no longer prints the device token in the debug window.
After removing the provisioning profile from my Apple Developer account under Profiles, I tried using a backed-up version of the app, which still prints a device token to the debugger window.
When I copy the device token and enter it into the terminal command to send another test push notification, the terminal output is a 400 status code: {"reason":"BadDeviceToken"}* Closing connection 1
curl -v --header 'apns-topic: com.domain.appname' --header apns-push-type: alert --cert aps.cer --cert-type DER --key PushChatKey.pem --key-type PEM --data '{"aps":{"alert":"Hello From Faunna"}}' --http2 https://api.sandbox.push.apple.com/3/device/a146d82d4acea02c9ef6de5838174292d0e2cd18a40be17fb79334c5003a0058
* Could not resolve host: alert
* Closing connection 0
curl: (6) Could not resolve host: alert
* Trying 17.188.138.73...
* TCP_NODELAY set
* Connected to api.sandbox.push.apple.com (17.188.138.73) port 443 (#1)
* ALPN, offering h2
* ALPN, offering http/1.1
Enter PEM pass phrase:
* successfully set certificate verify locations:
* CAfile: /etc/ssl/cert.pem
CApath: none
* TLSv1.2 (OUT), TLS handshake, Client hello (1):
* TLSv1.2 (IN), TLS handshake, Server hello (2):
* TLSv1.2 (IN), TLS handshake, Certificate (11):
* TLSv1.2 (IN), TLS handshake, Server key exchange (12):
* TLSv1.2 (IN), TLS handshake, Request CERT (13):
* TLSv1.2 (IN), TLS handshake, Server finished (14):
* TLSv1.2 (OUT), TLS handshake, Certificate (11):
* TLSv1.2 (OUT), TLS handshake, Client key exchange (16):
* TLSv1.2 (OUT), TLS handshake, CERT verify (15):
* TLSv1.2 (OUT), TLS change cipher, Change cipher spec (1):
* TLSv1.2 (OUT), TLS handshake, Finished (20):
* TLSv1.2 (IN), TLS change cipher, Change cipher spec (1):
* TLSv1.2 (IN), TLS handshake, Finished (20):
* SSL connection using TLSv1.2 / ECDHE-RSA-AES256-GCM-SHA384
* ALPN, server accepted to use h2
* Server certificate:
* subject: CN=api.development.push.apple.com; OU=management:idms.group.533599; O=Apple Inc.; ST=California; C=US
* start date: Feb 8 21:41:22 2021 GMT
* expire date: Mar 10 21:41:22 2022 GMT
* subjectAltName: host "api.sandbox.push.apple.com" matched cert's "api.sandbox.push.apple.com"
* issuer: CN=Apple Public Server RSA CA 12 - G1; O=Apple Inc.; ST=California; C=US
* SSL certificate verify ok.
* Using HTTP2, server supports multi-use
* Connection state changed (HTTP/2 confirmed)
* Copying HTTP/2 data in stream buffer to connection buffer after upgrade: len=0
* Using Stream ID: 1 (easy handle 0x7fbd4700aa00)
> POST /3/device/a146d82d4acea02c9ef6de5838174292d0e2cd18a40be17fb79334c5003a0058 HTTP/2
> Host: api.sandbox.push.apple.com
> User-Agent: curl/7.64.1
> Accept: */*
> apns-topic: com.faunna.PushChat
> Content-Length: 37
> Content-Type: application/x-www-form-urlencoded
>
* Connection state changed (MAX_CONCURRENT_STREAMS == 1000)!
* We are completely uploaded and fine
< HTTP/2 400
< apns-id: 8DE6AA75-8E41-E95E-1FAF-51D93A8B3200
<
* Connection #1 to host api.sandbox.push.apple.com left intact
{"reason":"BadDeviceToken"}* Closing connection 1
What is causing the bad device token output in this setup? And how should I be registering for remote notifications from this point forward?
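For reference, my understanding of the modern registration flow (UserNotifications framework, iOS 10 and later; Swift for brevity) is sketched below. I'd appreciate confirmation that this is the right direction:
import UIKit
import UserNotifications

// Sketch of the modern flow: ask for authorization, then register with APNs.
func registerForPushNotifications(_ application: UIApplication) {
    UNUserNotificationCenter.current().requestAuthorization(options: [.alert, .sound, .badge]) { granted, _ in
        guard granted else { return }
        DispatchQueue.main.async {
            application.registerForRemoteNotifications()  // must run on the main thread
        }
    }
}

// The device token then arrives in the app delegate:
// func application(_ application: UIApplication,
//                  didRegisterForRemoteNotificationsWithDeviceToken deviceToken: Data) {
//     let token = deviceToken.map { String(format: "%02x", $0) }.joined()
//     print("Device token: \(token)")
// }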
Hello everyone!
Our team is in the middle of developing a new sleep tracking app with a bunch of additional features, including a custom alarm clock. We need the Critical Alerts entitlement for this feature to work, as only such notifications (the critical interruption level in iOS 15) can break through the Ring/Silent switch and Do Not Disturb, which is necessary as users obviously won't wake up without the alarm actually going off.
We've applied twice already, clearly stating the purpose of this entitlement, but both times received the same answer: "Unfortunately, this API is not designed for the use you've identified".
While the Apple Developer documentation on the matter is rather elusive and states that, among others, home- & health-related purposes are allowed, I couldn't find any straightforward info about our particular case. Moreover, I can easily name a few apps in the Health & Fitness category with exactly the same alarm clock functionality that works and obviously utilizes the Critical Alerts entitlement.
So I wonder, can anyone give a piece of advice on how to get this entitlement? Maybe we need to provide some additional info or meet some unannounced conditions? Any info will be highly appreciated.
I've set up a couple of domains (seemingly) successfully with iCloud+ Custom Domains. All the DNS entries are correct.
Now, when I attempt to add an email address I get the error "There was a problem with adding this email address. Please try again later."
This has been happening for over 24 hours. I'm not quite sure how to proceed - I know this is in beta, but I keep reading that it is working for other people, so it's hard to think this is a general problem.
I'm adding entirely new domains, so there is no chance that the email addresses are previously known by Apple or were used for a previous Apple ID.
Any ideas please?
How do I adjust the latency timer for the AppleUSBFTDI driver?
I am developing an app in Swift using Xcode on a MacBook Pro M1 running Big Sur, for clinical brain-computer interface (BCI) research. The app needs very low-latency streaming from an external USB device.
The external device is a headset which connects via Bluetooth to an FT231X chip mounted on a USB-Serial dongle. The FT231X chip reads timestamped EEG data from the headset. The issue is that the AppleUSBFTDI driver is buffering the packets coming in from the headset, which causes jitter in the timestamps.
Typically, with proprietary drivers from FTDI, the solution is to reconfigure them to reduce the latency timer to 1ms. The Info.plist is edited to add new key/value pairs.
Is there a similar solution for Apple's built-in driver?
This has come up a few times so I thought I’d write it down so that Future Quinn™ could point folks at it.
If you have any questions or feedback, please start a new thread and tag it with Files and Storage so that I see it.
Share and Enjoy
—
Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"
Extended Attributes and Zip Archives
In a traditional Unix file system, a file consists of a flat sequence of bytes and a very limited set of metadata. In contrast, a traditional Mac OS file had a second sequence of bytes, the resource fork, and various other bits of Mac metadata. This caused an interoperability problem between Mac and Unix systems because the latter has no place to store the Mac metadata.
To resolve this problem Apple introduced the AppleDouble file format. This stores the Mac’s data fork in the Unix file and all the Mac metadata in a companion AppleDouble file. If the file name was foo, this companion file was called ._foo. The leading dot meant that this companion file was hidden by many tools.
macOS does not typically need AppleDouble files because Apple file systems, like APFS and HFS Plus, have built-in support for Mac metadata. However, macOS will create and use AppleDouble files when working with a volume format that doesn’t support Mac metadata, like FAT32.
Similarly macOS will create and use AppleDouble files when working with archive file formats, like zip, that don’t have native support for Mac metadata.
macOS 10.4 added support for extended attributes at the Unix layer. For more on this, see the getxattr man page. As with traditional Mac metadata, this is supported natively by Apple file systems but must be stored in an AppleDouble file elsewhere.
Note At the API level the original Mac metadata is modelled as two well-known extended attributes, XATTR_RESOURCEFORK_NAME and XATTR_FINDERINFO_NAME.
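To make that concrete, here's a minimal sketch of listing a file's extended attribute names from Swift:
import Foundation

// Minimal sketch: list a file's extended attribute names via the Darwin API.
// The buffer comes back as NUL-separated UTF-8 names.
func xattrNames(at path: String) -> [String] {
    let length = listxattr(path, nil, 0, 0)
    guard length > 0 else { return [] }

    var buffer = [CChar](repeating: 0, count: length)
    guard listxattr(path, &buffer, length, 0) == length else { return [] }

    return buffer.split(separator: 0).map { chunk in
        String(decoding: chunk.map { UInt8(bitPattern: $0) }, as: UTF8.self)
    }
}

// On an Apple file system, a file with Mac metadata will typically show
// "com.apple.ResourceFork" (XATTR_RESOURCEFORK_NAME) and
// "com.apple.FinderInfo" (XATTR_FINDERINFO_NAME) here.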
When creating a zip archive macOS supports two approaches for storing AppleDouble files:
By default it stores the AppleDouble file next to the original file.
% ditto -c -k --keepParent root root-default.zip
% unzip -t root-default.zip
Archive: root-default.zip
testing: root/ OK
testing: root/nested/ OK
testing: root/nested/FileWithMetadata OK
testing: root/nested/._FileWithMetadata OK
No errors detected in compressed data of root-default.zip.
Alternatively, it can create a parallel hierarchy, rooted in __MACOSX, that holds all AppleDouble files.
% ditto -c -k --keepParent --sequesterRsrc root root-sequestered.zip
% unzip -t root-sequestered.zip
Archive: root-sequestered.zip
testing: root/ OK
testing: root/nested/ OK
testing: root/nested/FileWithMetadata OK
testing: __MACOSX/ OK
testing: __MACOSX/root/ OK
testing: __MACOSX/root/nested/ OK
testing: __MACOSX/root/nested/._FileWithMetadata OK
No errors detected in compressed data of root-sequestered.zip.
The latter is commonly used for mail attachments because it makes it easy for the recipient to discard all the Mac metadata.
Hi, we have a feature where we need to highlight some content. It was using the JavaScript functions window.getSelection() or document.getSelection(). These were working fine on iOS 14; after our devices were updated to iOS 15, we get null from both window.getSelection() and document.getSelection(). Could you please help us resolve this issue?
We appreciate your help.