Hi, I have an app, KeyPad, that lets you use your Mac keyboard as a Bluetooth keyboard for your iPhone, TV, iPad, another Mac, or a Windows PC.
https://apps.apple.com/us/app/keypad-bluetooth-keyboard/id1491684442?mt=12
Ever since the release of macOS Ventura this app no longer works. I filed a feedback report in December, and there has been no activity on it: FB11869248.
Essentially, if I call publishedServiceRecord with a 16-bit UUID in "0001 - ServiceClassIDList", bluetoothd crashes and kills all Bluetooth connections (mouse, keyboard, headset, everything) and then reconnects. This is 100% reproducible.
What is interesting is that if a user installs my app on macOS Monterey, it works, and if they upgrade to macOS Ventura, it continues to work. But if they install it fresh on Ventura, it will not work. Even more interesting: because my app was installed on Monterey, all OTHER similar apps will work on Ventura.
I suspect that this is some sort of permission issue, but I am not getting any support.
Has anybody seen something similar?
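For reference, the publishing step looks roughly like this. This is a minimal sketch rather than my shipping code: the exact Swift spelling of the IOBluetooth call and the data-element encoding for the 16-bit UUID are from memory and may need adjustment, and 0x1124 (the standard HID service class) is only used here as an example value.

import IOBluetooth

// Sketch: publish an SDP service record whose ServiceClassIDList
// (attribute 0x0001) holds a single 16-bit UUID.
// Note: dictionary encoding and method name are from memory.
let serviceDict: [String: Any] = [
    "0001 - ServiceClassIDList": [
        // One SDP data element: type 3 = UUID, size 2 bytes, value 0x1124 (example).
        ["DataElementType": 3, "DataElementSize": 2, "DataElementValue": 0x1124]
    ]
]

// On Monterey this publishes fine; on a fresh Ventura install this call is
// what appears to take bluetoothd down.
let record = IOBluetoothSDPServiceRecord.publishedServiceRecord(withServiceDictionary: serviceDict)
print(record == nil ? "Failed to publish service record" : "Service record published")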
Sam
I am able to open the Software Update section of the Settings app from my custom app. But when the Settings app is already open to a section other than Software Update, it opens to the previously viewed section rather than Software Update. The issue occurs on iOS 16.4.1.
Below is my code:
if let url = URL(string: "App-prefs:root=Settings&path=General/SOFTWARE_UPDATE_LINK") {
    UIApplication.shared.open(url)
}
I have also made some modifications in the Info section of the project; a screenshot is attached below.
The resources below say that I cannot get GPS satellite details (such as the satellite count) on iOS. I also don't see any satellite-count-related API in the Core Location framework documentation.
1) This is not possible using any publicly available iOS API. (2012/11/8)
https://stackoverflow.com/a/13309242/809671
2) There is no publicly available API that allows you to get this kind of detailed information about the GPS satellites. (2010/8/24)
https://stackoverflow.com/questions/3555732/is-there-a-way-to-get-detailled-gps-satellites-info-on-iphone
3) Confirmed that your app does not have access to the private API used to suss out that information. (2017)
https://developer.apple.com/forums/thread/73220
4) Since iOS does not provide the number of satellites, and since PhoneGap is designed as a subset of functionality that runs on all phone platforms, the number of satellites is not available. (2015/11/4)
https://stackoverflow.com/questions/33516688/phonegap-cordova-get-number-of-gps-satellites
5) The public SDK doesn't expose that. (2009/12/17)
https://stackoverflow.com/questions/1919553/is-it-possible-to-get-gps-satellite-numbers-via-iphone-sdk
6) Can iOS 8 provide info on satellites? No. (2014/8/20)
https://forums.macrumors.com/threads/can-ios-8-provide-info-on-satellites.1766861/
The posts above are a little old, so I want to double-confirm.
It would be great if someone from Apple could reply. Thanks.
I am seeing interesting behavior on iOS 16.4+ when I set the NEVPNProtocol includeAllNetworks flag to true as part of my tunnel's saved preferences.
My packet tunnel provider starts up and goes through the usual setup of adding routes (say we just add NEIPv4Route.default() to route everything) and eventually applies them via setTunnelNetworkSettings. After that, any subsequent call to cancelTunnelWithError puts the phone into a state where the tunnel provider goes away but the routes are apparently not cleaned up, so all network traffic on the device is dead. The only way to recover is to go into Settings -> VPN and select a different profile, or remove ours and go through installation again.
It only appears to happen on iOS 16.4+ devices; previous versions clean up just fine. Curious if anyone has seen this behavior? Thanks in advance.
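For context, here is a minimal sketch of the two pieces involved. It is not my exact code; the bundle identifier, server address, and tunnel addresses are placeholders.

import NetworkExtension

// App side: save a tunnel configuration with includeAllNetworks enabled.
func saveTunnelPreferences() {
    let manager = NETunnelProviderManager()
    let proto = NETunnelProviderProtocol()
    proto.providerBundleIdentifier = "com.example.app.tunnel"   // placeholder
    proto.serverAddress = "vpn.example.com"                     // placeholder
    proto.includeAllNetworks = true                             // the flag in question
    manager.protocolConfiguration = proto
    manager.isEnabled = true
    manager.saveToPreferences { error in
        if let error { print("save failed: \(error)") }
    }
}

// Provider side: route everything through the tunnel.
class PacketTunnelProvider: NEPacketTunnelProvider {
    override func startTunnel(options: [String: NSObject]?,
                              completionHandler: @escaping (Error?) -> Void) {
        let settings = NEPacketTunnelNetworkSettings(tunnelRemoteAddress: "203.0.113.1") // placeholder
        let ipv4 = NEIPv4Settings(addresses: ["10.0.0.2"], subnetMasks: ["255.255.255.255"])
        ipv4.includedRoutes = [NEIPv4Route.default()]   // route everything
        settings.ipv4Settings = ipv4
        setTunnelNetworkSettings(settings) { error in
            completionHandler(error)
        }
    }

    // Tearing the tunnel down like this is what leaves the device
    // without connectivity on iOS 16.4+ in my testing.
    func tearDown() {
        cancelTunnelWithError(nil)
    }
}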
How do I create a visionOS project in Xcode 15.0 beta?
With the new interactive widgets in iOS 17, is it possible to indicate that an action (button tap or toggle) has been triggered by firing haptic feedback to the user?
With interactive widgets in iOS 17, how can I prevent the app from opening when the user taps on the widget? Sometimes the user may miss the button, and opening the application is inappropriate in that case.
First, a massive thank you to the Passkeys team at Apple for opening up the APIs to allow third-party password manager apps to save and autofill passkeys in iOS 17! I wasn't expecting this so soon. Incredible work.
I have successfully implemented the new methods on ASCredentialProviderViewController, up to the point where our app's extension is now presented when a user is prompted to "Create a passkey?". However, two things are not entirely clear to me from this point on:
When the user chooses our app to create a passkey by tapping "Continue", the prepareInterfaceToProvideCredential(for credentialRequest: ASCredentialRequest) method is called. Should I be handling passkey creation within this method? At this point I was really expecting prepareInterface(forPasskeyRegistration:) to be called instead.
Are new passkeys automatically generated and returned by AuthenticationServices during this flow, or is it down to the developer to generate a new passkey here? I ask because the documentation for prepareInterface(forPasskeyRegistration:) seems to imply the former, stating: "This method will present your extension's UI for user authentication before creating the passkey."
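For reference, this is roughly how I have the two entry points wired up at the moment. It is a trimmed sketch; the body of the registration method is a placeholder, since that flow is exactly what I'm unsure about.

import AuthenticationServices

class CredentialProviderViewController: ASCredentialProviderViewController {

    // This is what actually gets called after the user taps "Continue".
    override func prepareInterfaceToProvideCredential(for credentialRequest: ASCredentialRequest) {
        // Unexpectedly ends up here for registration as well.
        // Should passkey creation be handled in this path?
        print("provide credential; is passkey request: \(credentialRequest is ASPasskeyCredentialRequest)")
    }

    // This is the method I expected to be called for "Create a passkey?".
    override func prepareInterface(forPasskeyRegistration registrationRequest: ASCredentialRequest) {
        // Presumably: authenticate the user, generate and store the key pair,
        // then complete the registration via extensionContext. (Placeholder.)
    }
}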
Thanks again.
Hi all,
I'm trying to implement passkey registration in my app.
This is the implementation I have:
ASAuthorizationPlatformPublicKeyCredentialProvider *provider =
    [[ASAuthorizationPlatformPublicKeyCredentialProvider alloc]
        initWithRelyingPartyIdentifier:[[jsonDict objectForKey:@"rp"] objectForKey:@"id"]];
ASAuthorizationPlatformPublicKeyCredentialRegistrationRequest *request =
    [provider createCredentialRegistrationRequestWithChallenge:[jsonDict objectForKey:@"challenge"]
                                                          name:[[jsonDict objectForKey:@"user"] objectForKey:@"name"]
                                                        userID:[[jsonDict objectForKey:@"user"] objectForKey:@"id"]];
ASAuthorizationController *controller =
    [[ASAuthorizationController alloc] initWithAuthorizationRequests:[NSArray arrayWithObject:request]];
controller.delegate = self;
controller.presentationContextProvider = self;
[controller performRequests];
But I get this error:
Remote proxy object error handler invoked with error: Error Domain=NSCocoaErrorDomain Code=4097 "Couldn’t communicate with a helper application."
Connection to agent service interrupted with error: Error Domain=NSCocoaErrorDomain Code=4097 "Couldn’t communicate with a helper application."
ASAuthorizationController credential request failed with error: Error Domain=NSCocoaErrorDomain Code=4097 "Couldn’t communicate with a helper application."
And this is the NSError I get in authorizationController:didCompleteWithError:
authorizationController error: Error Domain=NSCocoaErrorDomain Code=4097 "connection to service named com.apple.AuthenticationServicesCore.AuthenticationServicesAgent" UserInfo={NSDebugDescription=connection to service named com.apple.AuthenticationServicesCore.AuthenticationServicesAgent}
Everything is defined correctly as far as I can tell (Associated Domains in the project and the .well-known file on the server).
Any reason I can't continue?
Edit: I'm on iOS 17.0 (21A5248v) and Xcode beta (15A5160n), if it matters.
Thanks
Hi,
I was wondering, after watching the WWDC23 session "Meet Core Location for spatial computing": does Apple Vision Pro have GPS, or does it provide Core Location functionality via Wi-Fi?
Also, in Unity, we use Input.location to get latitude and longitude. When developing in Unity with Apple Vision Pro, do we use Input.location to get latitude and longitude?
Best regards.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
I upgraded to macOS 14 and my smart card reader quit working. It works in Safe Mode and while booting up, then stops functioning.
The "Allow accessories to connect" setting is not visible. I can find it via search, but I cannot select it.
MacBook Air M1, Sonoma Beta.
I would like to create an Apple Watch-only app that queries data such as blood oxygenation, heart rate variability, step count, energy consumed, and other data of a similar nature recorded over the past month, and performs calculations on it. I read in the HealthKit documentation that Apple Watch synchronizes data with iPhone and periodically deletes older data, and that I can get the earliest date from which data is available with earliestPermittedSampleDate().
Is there a risk that, in general, when querying data up to a month old, the data will no longer be available?
I need the app to work properly without needing an iPhone.
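For illustration, this is the kind of query I have in mind: a minimal sketch using step count, with the HealthKit authorization request omitted. The identifiers and date math would be adjusted per metric.

import HealthKit

let healthStore = HKHealthStore()

// Sketch: total step count over the past month, clamped to the earliest
// date HealthKit will accept in a sample predicate on this device.
// (HealthKit authorization request omitted for brevity.)
func queryMonthlySteps() {
    guard let stepType = HKQuantityType.quantityType(forIdentifier: .stepCount) else { return }

    let now = Date()
    let oneMonthAgo = Calendar.current.date(byAdding: .month, value: -1, to: now)!
    // Samples older than this date cannot be requested at all.
    let earliest = healthStore.earliestPermittedSampleDate()
    let start = max(oneMonthAgo, earliest)

    let predicate = HKQuery.predicateForSamples(withStart: start, end: now, options: .strictStartDate)
    let query = HKStatisticsQuery(quantityType: stepType,
                                  quantitySamplePredicate: predicate,
                                  options: .cumulativeSum) { _, statistics, error in
        let steps = statistics?.sumQuantity()?.doubleValue(for: .count()) ?? 0
        print("Steps since \(start): \(steps), error: \(String(describing: error))")
    }
    healthStore.execute(query)
}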
Hi all,
Does anyone know the dimensions of screenshots from Vision Pro?
Thanks
I'm planning to use 3D video in "windows" and "volumes" in my visionOS app. What's the easiest way to capture 3D video for this?
Thanks and cheers!
I have an iPad app I've just started testing on visionOS, and it's gone pretty well so far except for one issue: none of the long-press or swipe gestures in my List work.
The app is SwiftUI-based, so I'm using a List with the swipeActions and contextMenu modifiers.
Could these be broken or unsupported, or am I not understanding how to trigger them in the simulator?
For a long press, I'd assume just holding down the mouse button should work. This appears to work in Safari.
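For reference, this is essentially the pattern I'm using, as a stripped-down sketch rather than my actual model code:

import SwiftUI

struct ItemListView: View {
    @State private var items = ["First", "Second", "Third"]

    var body: some View {
        List {
            ForEach(items, id: \.self) { item in
                Text(item)
                    // Long press should show this menu, but it never appears
                    // for me in the visionOS simulator.
                    .contextMenu {
                        Button("Rename") { /* ... */ }
                        Button("Delete", role: .destructive) { /* ... */ }
                    }
                    // Swiping the row should reveal this action.
                    .swipeActions(edge: .trailing) {
                        Button("Delete", role: .destructive) {
                            items.removeAll { $0 == item }
                        }
                    }
            }
        }
    }
}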
I'm working on an application for viewing AMF models on macOS, using RealityKit. AMF supports several different ways to color models, including per-vertex color (where the color of a triangle is interpolated from vertex to vertex) as well as per-face color (where the color of the triangle is the same across the entire face).
I'm trying to figure out how to support those color models using a RealityKit mesh. Apple's documentation (https://developer.apple.com/documentation/realitykit/modifying-realitykit-rendering-using-custom-materials) talks about per-vertex colors, but I haven't found a way to create a mesh that includes per-vertex colors, other than using a texture map (which might be the correct solution).
Can someone give me some pointers?
This is my second attempt to publish the Mac version of my app, and it was rejected again due to the same error: 'Library missing'.
The library IS included with the build.
The extracted .app from .xcarchive runs without a problem on different machines, both M1 and Intel. So, I don't know what's wrong.
Can somebody please help me?
I can provide the complete .ips from Apple if needed.
{
  "code": 1,
  "flags": 518,
  "namespace": "DYLD",
  "indicator": "Library missing",
  "details": [ "(terminated at launch; ignore backtrace)" ],
  "reasons": [
    "Library not loaded: @rpath/TitaniumKit.framework/Versions/A/TitaniumKit",
    "Referenced from: <85BA8613-0157-3B28-99AF-E73F1E579B72> /Applications/TiDesigner.app/Contents/MacOS/TiDesigner",
    "Reason: tried:
      '/usr/lib/swift/TitaniumKit.framework/Versions/A/TitaniumKit' (no such file, not in dyld cache),
      '/System/Volumes/Preboot/Cryptexes/OS/usr/lib/swift/TitaniumKit.framework/Versions/A/TitaniumKit' (no such file),
      '/usr/lib/swift/TitaniumKit.framework/Versions/A/TitaniumKit' (no such file, not in dyld cache),
      '/System/Volumes/Preboot/Cryptexes/OS/usr/lib/swift/TitaniumKit.framework/Versions/A/TitaniumKit' (no such file),
      '/System/Library/Frameworks/TitaniumKit.framework/Versions/A/TitaniumKit' (no such file, not in dyld cache),
      (security policy does not allow @ path expansion)"
  ]
}
Hi,
I would like to learn how to create custom materials using Shader Graph in Reality Composer Pro. I would like to know more about Shader Graph in general, including node descriptions and how the material's display changes when nodes are connected. However, I cannot find a manual for Shader Graph in Reality Composer Pro. This leaves me totally clueless on how to create custom materials.
Thanks.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
Hello,
When an iOS app runs on Vision Pro in compatibility mode, is there a flag such as isiOSAppOnVision to determine the underlying OS at runtime, just like ProcessInfo.isiOSAppOnMac? It would be useful for optimizing the app for visionOS.
Things I've already checked that don't help:
#if os(xrOS) does not work in compatibility mode, since the code is not recompiled.
UIDevice.current.userInterfaceIdiom returns .pad instead of .reality.
Thanks.
I'd like to allow the speech synthesizer to play on the device speaker while simultaneously mixing with a phone call. I've tried a number of different configurations, but I'm unable to find one that achieves this, or that allows mixing with a phone call at all.
There is a flag, mixToTelephonyUplink, that seems to suggest at least some mixing with a phone call is possible using the speech synthesizer, but I'm currently unable to find much documentation about this flag beyond the basic API docs.
I've had some luck at least getting the synthesizer to always play through the speaker with the following audio session configuration, but the sound is never mixed with a phone call. Instead, it is ducked and muted while the phone call takes place. I've tried quite a few combinations of category and overrides, but nothing seems to work quite as I'd expect.
synthesizer.mixToTelephonyUplink = true

let audioSession = AVAudioSession.sharedInstance()
try? audioSession.setCategory(.playback, mode: .voicePrompt, options: [.mixWithOthers, .defaultToSpeaker])
try? audioSession.setActive(true, options: [])
try? audioSession.overrideOutputAudioPort(.speaker)
Is there some kind of documentation for this that's off the beaten path that I'm somehow missing? I'm going to continue with guess-and-check, but I'm starting to think this flag, and the functionality it implies, was never fully implemented.