Hello,
I have written the following App Intent and I can access it via Shortcuts, but I can't get Siri to pick it up. I want it to have a dynamic book title (which could be anything) so that the user can say "Add (book name) to my (app name)". I need it to work from iOS 17.1 onwards, and I have added Siri as a capability for my iOS app.
import AppIntents

struct AddBookToReadingListIntent: AppIntent {
    static var title: LocalizedStringResource = "Add my Book"

    @Parameter(title: "Book Title", requestValueDialog: "What's the title of the book you want to add?")
    var bookTitle: String

    static var parameterSummary: some ParameterSummary {
        Summary("Add my '\(\.$bookTitle)'")
    }

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        return .result(value: "Added '\(bookTitle)' to your app")
    }
}
struct AppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AddBookToReadingListIntent(),
            phrases: [
                "Add \(\.$bookTitle) in \(.applicationName)"
            ],
            shortTitle: "Add Book to app name",
            systemImageName: "book"
        )
    }
}
I am also currently having the same issue after updating to iOS 18 on an iPhone 16 Pro Max. Whether wired or wireless, the sound quality is poor and very mono. At some point it corrects itself, then you touch the phone screen and it's back to mono again. It's a big issue when making phone calls. Nothing responds and nothing else resolves it. My Bluetooth works fine. Please fix this… so tired of this.
Hi,
I am trying to determine whether the Mac that is running my app has an active screen sharing session. Is there a way to detect this, potentially using system APIs or a system command?
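One workaround I've considered (not an official API, and it's an assumption on my part that this holds on current macOS versions): screensharingd is the daemon that services Screen Sharing connections, so checking whether it is running is a rough proxy for an active session. A minimal sketch, which may also not be possible from a sandboxed app:

import Foundation

// Rough proxy for an active Screen Sharing session: check whether the
// screensharingd daemon is currently running. An assumption, not an
// API guarantee; may be blocked by the App Sandbox.
func screenSharingDaemonIsRunning() -> Bool {
    let task = Process()
    task.executableURL = URL(fileURLWithPath: "/usr/bin/pgrep")
    task.arguments = ["-x", "screensharingd"]
    task.standardOutput = Pipe()
    do {
        try task.run()
        task.waitUntilExit()
        return task.terminationStatus == 0 // pgrep exits 0 on a match
    } catch {
        return false
    }
}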
Any help would be greatly appreciated, thank you!
I'm using Live Activity features in my app, but I want to customize the user experience across different Apple devices. Specifically, I'd like to:
• Keep Live Activity enabled and functioning on the iPhone
• Disable or prevent Live Activity from appearing on the connected Apple Watch
Is this level of device-specific control possible with Live Activity? If so, what's the best approach to implement this functionality?
What I've tried:
I've looked through Apple's documentation on Live Activity, but couldn't find specific information about device-level control. I've experimented with ActivityKit, but haven't found a clear way to distinguish between iPhone and Apple Watch when pushing updates.
Hello,
I am a developer currently working on a personal contact management app.
What is the app?
My app stores additional information beyond basic contact details. Therefore, instead of using the Contacts framework, I manage contact objects using Core Data.
What am I trying to achieve?
I want to display additional information on the caller ID screen when a call is received from a number stored in my app.
What have I tried?
I’ve attempted the following methods without success:
1. Call Directory Extension:
I thought this method would allow me to display additional information from Core Data on the call screen (see the sketch after this list). However, I learned that when a call is received, iOS first searches for the phone number in the Contacts app and consults the extension only if no match is found. Therefore, displaying contact information from my app seems unfeasible.
2. Custom Call UI:
Using CallKit seemed like a viable option to display the necessary information during a call, but it appears to only be possible with VoIP apps. My app does not support VoIP calls, so this method was also not implementable.
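For context, here is roughly what a Call Directory Extension identification handler looks like; a minimal sketch, with a hypothetical phone number and label standing in for values loaded from Core Data:

import CallKit

final class CallDirectoryHandler: CXCallDirectoryProvider {
    override func beginRequest(with context: CXCallDirectoryExtensionContext) {
        // Hypothetical entry standing in for data loaded from Core Data.
        // Identification entries must be added in ascending numeric order.
        let entries: [(CXCallDirectoryPhoneNumber, String)] = [
            (821012345678, "Jane Doe - VIP Client")
        ]
        for (number, label) in entries {
            context.addIdentificationEntry(withNextSequentialPhoneNumber: number, label: label)
        }
        context.completeRequest()
    }
}

As noted above, these labels are only consulted when the number has no match in the user's Contacts, which is exactly the limitation described.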
I am wondering if there are any technologies available that could help me achieve my goal, or if there’s something I might be missing. Any advice would be greatly appreciated.
Thank you!
If a similar question has been asked, I apologize for the repetition.
I have an application that updates contacts in bulk. It used to take 2 seconds to update 100 contacts on iOS versions before 18; on iOS 18 it takes almost 2 minutes, which is severely affecting our app's performance.
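For reference, a minimal sketch of the kind of bulk update in question, assuming the standard pattern of batching everything into a single CNSaveRequest (names here are illustrative):

import Contacts

// Batch all updates into one CNSaveRequest and execute it once.
// This is the pattern whose performance regressed on iOS 18.
func updateContactsInBulk(_ contacts: [CNMutableContact]) throws {
    let store = CNContactStore()
    let request = CNSaveRequest()
    for contact in contacts {
        request.update(contact) // mutable copies fetched earlier
    }
    try store.execute(request)
}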
On iPhone devices with a Dynamic Island (e.g. iPhone 15 Pro Max), the proximity sensor takes about 3 seconds to activate. Is this normal?
The iPhone 13 responds almost instantly, but the iPhone 15 Pro Max seems to take a while.
import UIKit

class ProximitySensorManager: NSObject {
    // Shared instance for global access (optional)
    static let shared = ProximitySensorManager()

    // Proximity sensor activation flag
    private(set) var isSensorEnabled: Bool = false

    // Start observing proximity sensor changes.
    // The manager observes the notification itself, because the
    // proximityStateChanged selector is defined on this class.
    func enableProximitySensor() {
        guard !isSensorEnabled else { return }
        isSensorEnabled = true
        UIDevice.current.isProximityMonitoringEnabled = true
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(proximityStateChanged),
            name: UIDevice.proximityStateDidChangeNotification,
            object: nil
        )
    }

    // Stop observing proximity sensor changes
    func disableProximitySensor() {
        guard isSensorEnabled else { return }
        isSensorEnabled = false
        UIDevice.current.isProximityMonitoringEnabled = false
        NotificationCenter.default.removeObserver(
            self,
            name: UIDevice.proximityStateDidChangeNotification,
            object: nil
        )
    }

    // Proximity sensor state change handler
    @objc private func proximityStateChanged() {
        if UIDevice.current.proximityState {
            print("Proximity sensor detected close object")
            // Additional functionality can be added here (e.g. lowering the screen brightness)
        } else {
            print("Proximity sensor detected no object")
        }
    }

    deinit {
        disableProximitySensor() // Clean up observer on deinitialization
    }
}
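For completeness, a short usage sketch of the manager above from a hypothetical call screen:

import UIKit

class CallViewController: UIViewController {
    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        ProximitySensorManager.shared.enableProximitySensor()
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        ProximitySensorManager.shared.disableProximitySensor()
    }
}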
I've implemented GADMobileAds and my Info.plist file includes the following entry:
<key>SKAdNetworkItems</key>
<array>
    ...
    <dict>
        <key>SKAdNetworkIdentifier</key>
        <string>tl55sbb4fm.skadnetwork</string> <!-- Verve Group -->
    </dict>
    ...
</array>
Unfortunately, I still get the following error:
...[Default] <Google> <Google:HTML> 1 required SKAdNetwork identifier(s) missing from Info.plist. Missing network(s): Verve. See [Enable SKAdNetwork to track conversions] (https://googlemobileadssdk.page.link/enable-skadnetwork).
I've tried everything from cleaning the build folder to adding all of Verve Group's other SKAdNetworkIdentifier strings.
What am I missing?
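One sanity check that may help (my own suggestion, not from Google's documentation): read the merged Info.plist of the built app at runtime and confirm the identifier actually made it into the bundle, which rules out editing the wrong target's plist:

import Foundation

// Print whether the built app's Info.plist really contains the Verve ID.
if let items = Bundle.main.object(forInfoDictionaryKey: "SKAdNetworkItems") as? [[String: String]] {
    let ids = items.compactMap { $0["SKAdNetworkIdentifier"] }
    print(ids.contains("tl55sbb4fm.skadnetwork") ? "Verve ID present" : "Verve ID missing")
}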
Our app has these settings configured in its Info.plist:
<key>LSSupportsOpeningDocumentsInPlace</key>
<true/>
<key>UIFileSharingEnabled</key>
<true/>
We have an Enterprise app, distributed via MDM, that was created approximately 6 years ago in Xcode 9 or 10.
In iOS 17 the app's Document directory is correctly listed as a folder in the Files app under "On My iPhone".
After updating a device to iOS 18, the app's Document directory is no longer listed under "On My iPhone". However, if you search for the app's name, the search results do show the app's folder.
If I then run the app from Xcode directly onto an iOS 18 device, the app's Document directory folder is not listed under "On My iPhone" in the Files app, and searching for it no longer finds it. If I run the app directly onto an iOS 17 device, the app's Document directory is, correctly, listed as before.
I created a new test project in Xcode 15 and ran it on an iOS 18 device, and it works as expected, so the issue seems to affect only older projects.
This is the exact same problem occurring for other developers:
https://stackoverflow.com/questions/79025597/ios-apps-document-folder-no-longer-accessible-from-files-app-after-ios-18-updat
I am writing an app that records accelerometer data in the background. When we call sensor.accelerometerData(from:to:), the for loop iterating over the CMSensorDataList suspends when the watch screen turns off and resumes when the app returns to the foreground. Is there a way to keep the processing running until it completes? As a hack I used the Location Updates capability in Background Modes, but acquiring that capability just for background execution would get my app rejected from the App Store.
Please note that our data comprises larger sets, spanning hours. We retrieve the data in batches of 30 minutes each, but even these do not complete.
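For reference, a minimal sketch of the retrieval loop in question; the Sequence shim is the usual boilerplate for iterating CMSensorDataList from Swift, and the 30-minute window mirrors the batching described above:

import CoreMotion

// Standard shim so CMSensorDataList works with for-in from Swift.
extension CMSensorDataList: Sequence {
    public func makeIterator() -> NSFastEnumerationIterator {
        NSFastEnumerationIterator(self)
    }
}

let recorder = CMSensorRecorder()
let end = Date()
let start = end.addingTimeInterval(-30 * 60) // one 30-minute batch

if let list = recorder.accelerometerData(from: start, to: end) {
    for sample in list {
        guard let data = sample as? CMRecordedAccelerometerData else { continue }
        // This is the loop that suspends when the watch screen turns off.
        print(data.startDate, data.acceleration)
    }
}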
I'm trying to add Siri support to my app for sending voice messages. I've implemented INSendMessageIntentHandling in my main app target.
It looks like it's getting as far as recording the voice message and passing my intent handler an INSendMessageIntent with an audio attachment, but I'm not able to read the attachment file.
func handle(
    intent: INSendMessageIntent,
    completion: @escaping (INSendMessageIntentResponse) -> Void
) {
    if let attachment = intent.attachments?.first,
       let audioFile = attachment.audioMessageFile,
       let fileURL = audioFile.fileURL
    {
        // This branch runs
        // fileURL is "file:///var/mobile/tmp/SiriMessages/89F738F7-6092-439A-B4FA-2DD9A99F0EED.caf"
        let result = processMessageAudio(url: fileURL)
        completion(result)
        return
    }

    // This line isn't reached
    completion(.init(code: .failure, userActivity: nil))
}

private func processMessageAudio(url: URL) -> INSendMessageIntentResponse {
    var fileRef: ExtAudioFileRef?

    if url.startAccessingSecurityScopedResource() {
        logDebug("File access allowed")
    } else {
        // This branch runs
        logDebug("File access not allowed")
    }
    defer {
        url.stopAccessingSecurityScopedResource()
    }

    let openStatus = ExtAudioFileOpenURL(url as CFURL, &fileRef)
    // openStatus is -54 (kAudio_FilePermissionError)

    return INSendMessageIntentResponse(code: .failure, userActivity: nil)
}
I'm not sure what I'm missing. It looks like there should be an audio file, and Siri shows a preview of the audio for confirmation.
On and off I've been trying to figure out how to do hang detection in-application (at least from the user's point of view). Qualitatively what I'd like to do is have a process which runs sample(1) on the application after it's been unresponsive for more than a second or so. Basically, an in-app replacement for Spin Control. The problem I've been stuck on is: how do I tell?
There used to be Core Graphics SPI (CGSRegisterNotifyProc with a value of kCGSEventNotificationAppIsUnresponsive) for doing this, but it doesn't work anymore (either due to sandboxing or system-wide security changes, I can't tell which but it doesn't matter).
One thought I had was to have an XPC service which would expect to receive a checkin once per second from the host (via a timer set up by the host). If it didn't, it would start sample(1). This seems pretty heavyweight to me, since it means that once per second, I'm going to be consuming cycles to check in with the service. But I haven't been able to come up with a scheme that doesn't include some kind of check-in by the target process.
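To make the check-in idea concrete, here is a minimal in-process sketch (my own illustration, not an existing API): a watchdog thread posts a block to the main queue and treats a missed deadline as a hang. The XPC variant would replace the logging with a message to the helper that runs sample(1):

import Foundation

final class HangWatchdog: Thread {
    private let threshold: TimeInterval

    init(threshold: TimeInterval = 1.0) {
        self.threshold = threshold
        super.init()
        name = "hang.watchdog"
    }

    override func main() {
        while !isCancelled {
            let semaphore = DispatchSemaphore(value: 0)
            DispatchQueue.main.async { semaphore.signal() }
            if semaphore.wait(timeout: .now() + threshold) == .timedOut {
                // The main thread missed its check-in; this is where the
                // helper/XPC service would be told to start sampling.
                NSLog("Main thread unresponsive for > %.1f s", threshold)
                semaphore.wait() // block until the hang clears
            }
            Thread.sleep(forTimeInterval: 0.5)
        }
    }
}

// Usage: HangWatchdog().start()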
Are there any APIs or strategies that I could use to accomplish this? Or is there some entitlement which would allow the application to request "application became unresponsive"/"application became responsive" notifications from the window server?
Hi everyone,
I’m having trouble getting my iPhone 11 to detect a DWM3001CDK as an accessory using the Apple Nearby Interaction app. Here’s the background:
Two years ago, I successfully tested UWB ranging between the same devices (iPhone 11 and DWM3001CDK, which is based on the Qorvo DW3110 IC and an nRF52833 SoC with Bluetooth 5.2). At that time, the Nearby Interaction app was in beta and worked well for my tests. Now, with the stable version of the app, I’m encountering an issue.
Here’s what I’ve done so far:
I erased the DWM3001C and flashed it with the Qorvo Nearby Interaction firmware (v3.2.0, "DWM3001CDK-QANI-FreeRTOS_full_QNI_3_0_0.hex") using J-Flash Lite V7.86g on Windows.
With this configuration, I can connect the iPhone 11 to the accessory using the Qorvo NI apps, both in the foreground and background.
However, when I compile and run the project "ImplementingSpatialInteractionsWithThirdPartyAccessories" (available on the Apple Developer website) on my iPhone 11 (running iOS 17.7), the app remains stuck on the "Scanning for accessory" screen and doesn’t find the device, even though I’ve given the app permission to use Bluetooth.
Could this be due to an issue with the firmware I flashed on the DWM3001CDK, or might there be something else causing the problem?
Any help or insights would be appreciated!
Thanks in advance.
Hi, as I understand it, there is no API call on iOS to enumerate all possible sources of files (by sources I mean connected SMB shares, iCloud, Google Drive, etc.); the only paths I can get are the app sandbox, the app group container, and their iCloud equivalents. On macOS I can get the list of mount points using getmntinfo(), plus the app/group sandbox and whatever standard locations my sandboxed app has been granted access to. Are there other paths I can get?
I also want to know how to determine the volume of a file the user picks with a file picker. Say they picked 10 files from Google Drive and another 5 from local storage: if I encounter errors on the Google Drive files, I want to stop working on all 10 of them, but to do that I need to be able to determine that they are on that particular volume. Is there a way to do this programmatically?
Ex: gdrive on iOS : "/private/var/mobile/Containers/Shared/AppGroup/6208BBEE-24BF-4CC9-A9ED-846F987C0442/File Provider Storage/39822865/1P8WD1tWEaq81ZB_DodTTZhXm0p00QaF7/test.txt"
on MacOS:
"/Users/username/Library/CloudStorage/GoogleDrive-useremailid/My Drive"
I am developing an iOS and watchOS app that share data using App Groups. The App Group identifier is group.(myteamid).dev.christopherallen.vtipz. The iOS app saves a QR code URL to the shared App Group container, and the watchOS app reads this URL to display the QR code image.
Steps to Reproduce:
1. Configure both the iOS and watchOS targets to use the App Group group.(myteamid).dev.christopherallen.vtipz in the "Signing & Capabilities" tab in Xcode.
2. Save the QR code URL to the shared App Group container in the iOS app using UserDefaults.
3. Attempt to read the QR code URL from the shared App Group container in the watchOS app using UserDefaults.
Code Snippets:
iOS App Code to Save URL:
private func saveQrCodeUrlToAppGroup(url: URL) {
    if let sharedDefaults = UserDefaults(suiteName: "group.(myteamid).dev.christopherallen.vtipz") {
        sharedDefaults.set(url.absoluteString, forKey: "qrCodeUrl")
    }
}
watchOS App Code to Read URL:
private func loadQrCodeImage() {
    if let sharedDefaults = UserDefaults(suiteName: "group.(myteamid).dev.christopherallen.vtipz"),
       let urlString = sharedDefaults.string(forKey: "qrCodeUrl"),
       let url = URL(string: urlString) {
        fetchAndSaveImage(from: url)
    } else {
        showError = true
    }
}
Error Encountered:
When attempting to read the QR code URL from the shared App Group container in the watchOS app, the following error is logged:
Couldn't read values in CFPrefsPlistSource<0x300b19880> (Domain: group.(myteamid).dev.christopherallen.vtipz, User: kCFPreferencesAnyUser, ByHost: Yes, Container: (null), Contents Need Refresh: Yes): Using kCFPreferencesAnyUser with a container is only allowed for System Containers, detaching from cfprefsd
Troubleshooting Steps Taken:
1. Confirmed that the App Group identifier is correctly set up in the Apple Developer portal.
2. Ensured that both the iOS and watchOS targets have the same App Group enabled in the "Signing & Capabilities" tab in Xcode.
3. Verified that the App Group identifier used in the code matches exactly with the one configured in Xcode.
Request for Assistance:
I am seeking assistance to understand why this error occurs and how to resolve it.
• Are there any additional steps required to properly access the shared App Group container?
• Any insights or suggestions to resolve this issue would be greatly appreciated.
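One avenue I'm also considering, in case App Group containers simply aren't shared across devices (each device keeps its own local container): moving the value over WatchConnectivity instead. A minimal sketch of the iPhone side, reusing the qrCodeUrl key from above; it assumes WCSession.default has already been activated with a delegate:

import WatchConnectivity

// iOS side: queue the QR code URL for delivery to the paired watch.
func sendQrCodeUrl(_ url: URL) {
    guard WCSession.isSupported(),
          WCSession.default.activationState == .activated else { return }
    // transferUserInfo queues delivery even if the watch app isn't running.
    _ = WCSession.default.transferUserInfo(["qrCodeUrl": url.absoluteString])
}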
I've been trying to create a CSR for about two hours now and I cannot find Certificate Assistant anywhere. I can open Keychain Access; on the left I have login, iCloud, System, and System Roots, and there are six submenus under Keychain Access: All Items, Passwords, Secure Notes, My Certificates, Keys, and Certificates. I have searched for both "Certificate Assistant" and "Certificate Signing Request" in the search bar, and neither is anywhere to be found. I've looked at the Developer Account help and read in several places what you are supposed to do; I've seen the illustrations where you enter your email and leave the CA email blank. I just can't find it anywhere in Keychain Access. It is really well described in the Developer Account help, and the eskimo makes it sound easy too, only nothing appears in my Keychain Access. I've scrolled through all of the submenus trying to find it, and it is nowhere to be found. Any help would be much appreciated.
Since I updated to iOS 18, the CallKit-linked caller name is not displayed on the CarPlay screen.
CarPlay displays only "{App Name} Caller ID".
On iOS 17.x, CarPlay displayed the caller name of the CallKit-linked contact.
I think CarPlay should behave the same as it did on iOS 17.
Please review it.
A few weeks ago, I submitted a request for the Family Controls & Personal Device Usage Entitlement to enable my app to access the Managed Settings and Device Activity frameworks in the Screen Time API. The app is nearly complete, with the only remaining component being the Family Controls capability.
As of now, I haven’t received a response regarding the request. I’m wondering if anyone else has experienced a similar situation or can provide insight into the following:
• How long does it typically take to receive feedback on this type of entitlement request?
• Is there a way to check if a request is still active and under review?
• Is a finished website required for the entitlement to be granted?
Any information or guidance would be greatly appreciated!
I tried to add a few yearly reminders, but every reminder is being set to today's date next year.
Example: a reminder for 5th October gets set to 30/9/25.
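To make the report concrete, here is a sketch of how such a yearly reminder would be created via EventKit (an assumed setup for illustration; the title and date are hypothetical), which may help anyone trying to reproduce the shift:

import EventKit

// Create a yearly reminder with an explicit due date (5 October), so the
// date can't be inferred from "today".
func addYearlyReminder(to store: EKEventStore) throws {
    let reminder = EKReminder(eventStore: store)
    reminder.calendar = store.defaultCalendarForNewReminders()
    reminder.title = "Yearly reminder"
    reminder.dueDateComponents = DateComponents(year: 2025, month: 10, day: 5)
    reminder.addRecurrenceRule(EKRecurrenceRule(recurrenceWith: .yearly, interval: 1, end: nil))
    try store.save(reminder, commit: true)
}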