Handle requests for your app’s services from users using Siri or Maps.

Posts under SiriKit tag

40 Posts

Post · Replies · Boosts · Views · Activity

INPlayMediaIntentHandling, Unable to Trigger App to Play Specific Podcast
I have an app that can play podcasts via Siri requests, e.g. "Hey Siri, play [Podcast Name]". I'm using INPlayMediaIntentHandling (the SiriKit media-domain intents) rather than the newer App Intents framework, because it lets the system select my app for audio playback without the user having to name the app in the request. This works well for the many podcasts I've tested, with one exception. There's a podcast called "The Headlines", and when I test the request "Hey Siri, play The Headlines", my app is never selected. Instead, Apple Podcasts begins playback of a show called "NPR News Now". Oddly, if the Apple Podcasts app is deleted, my app is still not selected; Siri responds with "I don't see an app for that. You'll need to download one" and shows a button to open the App Store. Additionally, if I do add the app name to the request using this style of intent, Siri responds with "[App Name] hasn't added support for that with Siri." I'd still like to accomplish this without requiring the app name in the Siri request.

There's nothing complex in my setup:

The target declares one supported intent, INPlayMediaIntent, with "Podcasts" selected as a supported media category.
The Siri entitlement is enabled.
My INSiriAuthorizationStatus is .authorized.

My intent handler is specified in my AppDelegate as follows:

func application(_ application: UIApplication, handlerFor intent: INIntent) -> Any? {
    return IntentHandler.shared
}

My intent handler is simple:

final class IntentHandler: NSObject, INPlayMediaIntentHandling {
    static let shared = IntentHandler()

    func handle(intent: INPlayMediaIntent) async -> INPlayMediaIntentResponse {
        print("IntentHandler: processing intent: \(intent)")
        /** code to start playback based on information found in `intent` **/
    }
}

When I ask Siri to "Play The Headlines", my handler is not called at all. For all other supported shows, the print statement executes and playback begins as expected. Is there any way I can get my app selected instead of Apple Podcasts for this request?
1 · 0 · 72 · 3d
Issue with Siri Not Displaying Prompt in App Intent
Hello everyone, I'm currently working on an App Intent for my iOS app, and I've encountered a frustrating issue with how Siri prompts for a category selection. Here's an overview of what I'm dealing with:

extension Category: AppEntity, @unchecked Sendable {
    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Category")

    typealias DefaultQueryType = ShortcutsCategoryQuery
    static var defaultQuery: ShortcutsCategoryQuery = ShortcutsCategoryQuery()
}

struct ShortcutsCategoryQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [Category] {
        let context = await ModelContext(sharedModelContainer)
        let categories = try CategoryDataProvider(context: context).getItems()
        return categories.filter { identifiers.contains($0.id) }
    }

    func entities(matching string: String) async throws -> [Category] {
        return try await suggestedEntities()
    }

    func suggestedEntities() async throws -> [Category] {
        let context = await ModelContext(sharedModelContainer)
        do {
            let categories = try CategoryDataProvider(context: context).getItems()
            if categories.isEmpty {
                print("No categories found.")
            }
            return categories.map { category in
                Category(
                    id: category.id,
                    name: category.name,
                    stringSymbol: category.stringSymbol,
                    symbol: category.symbol,
                    stringColor: category.stringColor,
                    color: category.color
                )
            }
        } catch {
            print(error)
            return []
        }
    }
}

The issue arises when I use Siri to invoke the intent. Siri correctly asks me to select a category, but it does not display any options unless I say something Siri recognizes, like "Casa" (House) or "*****" (Test), in Portuguese. Only then does it show the list of available categories. I would like the categories to appear immediately when Siri asks for a selection. I've already tried refining ShortcutsCategoryQuery and debugging various parts of my code, but nothing seems to fix this behavior.
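The post doesn't show the intent that triggers the prompt. A minimal sketch of what it might look like, assuming the Category entity above; the intent name and dialog text are hypothetical:

import AppIntents

// Hypothetical intent for illustration only; the original post does not include it.
struct SelectCategoryIntent: AppIntent {
    static var title: LocalizedStringResource = "Select Category"

    // Siri prompts for this parameter; the selectable options come from ShortcutsCategoryQuery.
    @Parameter(title: "Category", requestValueDialog: "Which category?")
    var category: Category

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Placeholder behavior: just confirm the selection.
        return .result(dialog: "Category selected.")
    }
}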
1 · 0 · 117 · 2d
continue UserActivity in AppDelegate not working after firing completion(INSendPaymentIntentResponse(code: .failureRequiringAppLaunch, userActivity: userActivity ))
func handle(intent: INSendPaymentIntent, completion: @escaping (INSendPaymentIntentResponse) -> Void) {
    if let _ = intent.payee, let currencyAmount = intent.currencyAmount {
        let userActivity = NSUserActivity(activityType: "com.rapipay.nye.test.failure")
        userActivity.userInfo = ["amount": Int(truncating: currencyAmount.amount!)]
        completion(INSendPaymentIntentResponse(code: .failureRequiringAppLaunch, userActivity: userActivity))
        return
    }
}

func application(_ application: UIApplication, continue userActivity: NSUserActivity, restorationHandler: @escaping ([any UIUserActivityRestoring]?) -> Void) -> Bool {
    print("ddddd")
    return true
}

This code works perfectly on the simulator, and even if I pass nil for userActivity, the continue-userActivity callback is called. On a device, however, it is not called. Siri works, asks for the payee name and amount, the handler is called, and the app opens, but the intent / userActivity is never passed to it.
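One thing the post doesn't mention is whether the app adopts the UIScene lifecycle. If it does, the system delivers the continued activity to the scene delegate, not the app delegate. A sketch under that assumption (the activity type is taken from the post; everything else is illustrative):

import UIKit

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    // In scene-based apps this callback, not application(_:continue:restorationHandler:),
    // receives the NSUserActivity handed back from the Intents extension.
    func scene(_ scene: UIScene, continue userActivity: NSUserActivity) {
        guard userActivity.activityType == "com.rapipay.nye.test.failure" else { return }
        let amount = userActivity.userInfo?["amount"] as? Int
        print("Continuing payment flow with amount: \(String(describing: amount))")
    }
}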
4 · 0 · 166 · 2w
App Intents is not able to start an outgoing call with CallKit when the app is backgrounded
I am currently working on integrating an app with Siri, adding support for starting VoIP calls and sending messages. Although I understand SiriKit is recommended for calling and messaging, I would like to allow users to select a profile to use for calling. As far as I am aware, selecting a profile to call from is not something SiriKit supports, so we decided to go with App Intents to allow more control over the parameters used to start calls.

After integrating VoIP calling with App Intents, I noticed CallKit is not able to start calls when the App Intent is invoked from the background. I get the following error:

Error Domain=com.apple.CallKit.error.requesttransaction Code=6 "(null)"

This seems to correspond to CXErrorCodeRequestTransactionError.invalidAction. The error only happens when the intent is invoked from the background. Setting the App Intent property openAppWhenRun to true solves the issue, as it brings the app to the foreground before running the intent. However, I would like to support starting calls from the background so users don't have to unlock their phones before starting a call with Siri, making it a truly hands-free experience.

I suspect the desired behavior is possible, most likely with SiriKit, as some well-known VoIP calling apps (WhatsApp, Messenger, etc.) exhibit the behavior I described. Is there any way to start calls from the background with App Intents, or is this behavior exclusive to SiriKit?

I have pasted three code snippets below that reproduce the issue. I am on Xcode 15.3 and macOS Sonoma 14.6.1, testing on iOS 16.6.1.

To demonstrate the issue I have created the following CXProviderDelegate:

class CallManager: NSObject, CXProviderDelegate {
    func startCall() {
        let callKitProvider = CXProvider(configuration: CXProviderConfiguration())
        callKitProvider.setDelegate(self, queue: nil)
        let callKitController = CXCallController()

        let recipient = CXHandle(type: .generic, value: "Demo Outgoing Call")
        let uuid = UUID()
        let startCallAction = CXStartCallAction(call: uuid, handle: recipient)
        let transaction = CXTransaction(action: startCallAction)

        callKitController.request(transaction) { error in
            if let error {
                print(error)
            } else {
                print("no errors")
            }
        }

        callKitProvider.reportOutgoingCall(with: uuid, connectedAt: nil)
    }

    func providerDidReset(_ provider: CXProvider) {
        // no-op, not required to demonstrate the issue
    }
}

Then, I have a UIViewController that is the only screen of this example app:

class ViewController: UIViewController {
    @IBOutlet weak var startCallButton: UIButton!

    override func viewDidLoad() {
        super.viewDidLoad()
        startCallButton.addTarget(self, action: #selector(buttonTapped), for: .touchUpInside)
    }

    @objc func buttonTapped() {
        let manager = CallManager()
        manager.startCall()
    }
}

As for App Intents, I put together a very simple intent to trigger the start of an outgoing call:

struct StartCall: AppIntent {
    static var title: LocalizedStringResource = "Start Call"
    static var openAppWhenRun = false

    func perform() async throws -> some IntentResult {
        let manager = CallManager()
        manager.startCall()
        return .result()
    }
}

When the UIViewController is presented and I tap the button to start a call, the green call banner appears and "no errors" is printed to the console, as intended. However, when I open the Shortcuts app and run the App Intent, the green banner does not appear and the message Error Domain=com.apple.CallKit.error.requesttransaction Code=6 "(null)" is printed to the console.
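For reference, a minimal sketch of the SiriKit route the post speculates about, using INStartCallIntentHandling. Whether this path actually permits a fully background call start is exactly the open question here, and the class and response choice are illustrative only:

import Intents

// Illustrative handler for the SiriKit calling domain (in-app or in an Intents extension).
class StartCallIntentHandler: NSObject, INStartCallIntentHandling {
    func handle(intent: INStartCallIntent, completion: @escaping (INStartCallIntentResponse) -> Void) {
        // .continueInApp hands an NSUserActivity to the app, which then drives CallKit
        // with a CXStartCallAction as in the CallManager above.
        let activity = NSUserActivity(activityType: NSStringFromClass(INStartCallIntent.self))
        completion(INStartCallIntentResponse(code: .continueInApp, userActivity: activity))
    }
}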
1 · 0 · 141 · 2w
[iOS 18] My app no longer appears in the Shortcuts app's actions.
My app is no longer displayed as a target for Shortcuts app actions. It was displayed up through iOS 17.7. I tried iOS 18.0 and 18.0.1, but it does not appear. Shortcuts created before iOS 18 still work fine. Only INPlayMediaIntent is supported, and the app targets iOS 15 and above, so no extension is used; the intent is handled directly in the app. Is anyone else suffering from the same problem?
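For context, a minimal sketch of the extension-less, in-app handling path the post describes; the PlayMediaIntentHandler type is hypothetical:

import Intents
import UIKit

// In-app intent handling (iOS 14+): the app delegate vends the handler,
// so no Intents extension is required.
class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication, handlerFor intent: INIntent) -> Any? {
        if intent is INPlayMediaIntent {
            return PlayMediaIntentHandler()
        }
        return nil
    }
}

class PlayMediaIntentHandler: NSObject, INPlayMediaIntentHandling {
    func handle(intent: INPlayMediaIntent, completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // Start playback here, then report success.
        completion(INPlayMediaIntentResponse(code: .success, userActivity: nil))
    }
}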
1 · 0 · 198 · 4w
Siri not picking up app intent
Hello, I have written the following App Intent and I can access it via Shortcuts, but I can't get Siri to pick it up. I want it to have a dynamic book title (which could be anything), so that the user can say "Add (book name) to my (app name)". I need it to work on iOS 17.1 onwards, and I have added Siri as a capability for my iOS app.

import AppIntents

struct AddBookToReadingListIntent: AppIntent {
    static var title: LocalizedStringResource = "Add my Book"

    @Parameter(title: "Book Title", requestValueDialog: "What's the title of the book you want to add?")
    var bookTitle: String

    static var parameterSummary: some ParameterSummary {
        Summary("Add my '\(\.$bookTitle)'")
    }

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        return .result(value: "Added '\(bookTitle)' to your app")
    }
}

struct AppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AddBookToReadingListIntent(),
            phrases: [
                "Add \(\.$bookTitle) in \(.applicationName)"
            ],
            shortTitle: "Add Book to app name",
            systemImageName: "book"
        )
    }
}
2 · 0 · 172 · 2w
How can I access an audio attachment from INSendMessageIntent?
I'm trying to add Siri support to my app for sending voice messages. I've implemented INSendMessageIntentHandling in my main app target. It looks like it's getting as far as recording the voice message and passing my intent handler an INSendMessageIntent with an audio attachment, but I'm not able to read the attachment file.

func handle(
    intent: INSendMessageIntent,
    completion: @escaping (INSendMessageIntentResponse) -> Void
) {
    if let attachment = intent.attachments?.first,
       let audioFile = attachment.audioMessageFile,
       let fileURL = audioFile.fileURL {
        // This branch runs
        // fileURL is "file:///var/mobile/tmp/SiriMessages/89F738F7-6092-439A-B4FA-2DD9A99F0EED.caf"
        let result = processMessageAudio(url: fileURL)
        completion(result)
        return
    }

    // This line isn't reached
    completion(.init(code: .failure, userActivity: nil))
}

private func processMessageAudio(url: URL) -> INSendMessageIntentResponse {
    var fileRef: ExtAudioFileRef?

    if url.startAccessingSecurityScopedResource() {
        logDebug("File access allowed")
    } else {
        // This branch runs
        logDebug("File access not allowed")
    }

    defer {
        url.stopAccessingSecurityScopedResource()
    }

    let openStatus = ExtAudioFileOpenURL(url as CFURL, &fileRef)
    // openStatus is -54 (kAudio_FilePermissionError)

    return INSendMessageIntentResponse(code: .failure, userActivity: nil)
}

I'm not sure what I'm missing. It looks like there should be an audio file, and Siri shows a preview of the audio for confirmation.
2 · 0 · 194 · Oct ’24
Siri's voice invocation to open App and pass the intent
Hi all. Requirement: "Search (placeholder) in (myApp)". When the user speaks this phrase, Siri should open the app and pass along the placeholder. This worked for me only when I used an AppEnum (with a specific, predefined set of values) together with an AppEntity. I want the placeholder to be dynamic, not defined via an AppEnum. I have observed this feature working fine in the YouTube, Spotify, and WhatsApp apps. Is there anything else these apps add specifically to make this work? Also, in these apps' Siri settings there is a toggle named "Use with Ask Siri". Could someone please help me understand how that option is enabled?
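For reference, a minimal sketch of the pattern media apps typically use: the App Shortcut phrase interpolates a parameter whose type is an AppEntity whose values the app supplies at runtime. All names below are hypothetical:

import AppIntents

// Hypothetical entity whose instances are provided by the app at runtime.
struct SearchTermEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Search Term")
    static var defaultQuery = SearchTermQuery()

    var id: String
    var displayRepresentation: DisplayRepresentation { DisplayRepresentation(title: "\(id)") }
}

struct SearchTermQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [SearchTermEntity] {
        identifiers.map { SearchTermEntity(id: $0) }
    }

    func suggestedEntities() async throws -> [SearchTermEntity] {
        // Values Siri can recognize in the phrase; refresh them by calling
        // MyAppShortcuts.updateAppShortcutParameters() when they change.
        ["news", "sports", "music"].map { SearchTermEntity(id: $0) }
    }
}

struct SearchIntent: AppIntent {
    static var title: LocalizedStringResource = "Search"

    @Parameter(title: "Term")
    var term: SearchTermEntity

    func perform() async throws -> some IntentResult { .result() }
}

struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: SearchIntent(),
            phrases: ["Search \(\.$term) in \(.applicationName)"],
            shortTitle: "Search",
            systemImageName: "magnifyingglass"
        )
    }
}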
5 · 0 · 345 · Sep ’24
iOS 18: Siri not passing string parameters to AppIntents if the string is a question
Xcode Version 16.0 (16A242d), iOS 18, Swift.

There seems to be a behavior change on iOS 18 when using AppShortcuts and AppIntents to pass string parameters. After Siri prompts for a string property's requestValueDialog, if the user makes a statement, the string is passed. If the user's statement is a question, however, the string is not sent to the AppIntent and instead Siri attempts to answer that question.

Example code:

struct MyAppNameShortcuts: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AskQuestionIntent(),
            phrases: [
                "Ask \(.applicationName) a question",
            ]
        )
    }
}

struct AskQuestionIntent: AppIntent {
    static var title: LocalizedStringResource = .init(stringLiteral: "Ask a question")
    static var openAppWhenRun: Bool = false

    static var parameterSummary: some ParameterSummary {
        Summary("Search for \(\.$query)")
    }

    @Dependency
    private var apiClient: MockApiClient

    @Parameter(title: "Query", requestValueDialog: .init(stringLiteral: "What would you like to ask?"))
    var query: String

    // perform is not called if the user asks a question such as "What color is the moon?"
    // in response to requestValueDialog. On iOS 17, the same string is passed through.
    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog & ShowsSnippetView {
        print("Query is: \(query)")
        let queryResult = try await apiClient.askQuery(queryString: query)
        let dialog = IntentDialog(
            full: .init(stringLiteral: queryResult.answer),
            supporting: .init(stringLiteral: "The answer to \(queryResult.question) is...")
        )
        let view = SiriAnswerView(queryResult: queryResult)
        return .result(dialog: dialog, view: view)
    }
}

Given the above mock code:

iOS 17:
Say "Hey Siri, ask (AppName) a question."
Siri responds "What would you like to ask?"
Say "What color is the moon?"
The string "What color is the moon?" is passed to the AppIntent.

iOS 18:
Say "Hey Siri, ask (AppName) a question."
Siri responds "What would you like to ask?"
Say "What color is the moon?"
Siri answers the question "What color is the moon?" itself.
Follow the above steps again and instead reply "Moon"; "Moon" is passed to the AppIntent.

Basically, any interrogative string parameter seems to be intercepted and handled by Siri proper rather than passed to the provided AppIntent on iOS 18.
1 · 0 · 426 · Oct ’24
Sirikit "Create" category show create button but siri don't accept "create" command
I am developing a Siri feature in my app. In the custom intent, I chose the "Create" category, and when I run the command, a Create button is displayed. But strangely, when I say "create", there is no response. Saying "yes", "okay", or "sure" all work; only "create" doesn't. If a Create button is shown, the "create" command should also work. Can anyone help me? I want the "create" command to work as well.
1 · 0 · 267 · Sep ’24
INPlayMediaIntent `mediaSearch` mediaName unreliable when searching for playlists
We are working on an app that uses INPlayMediaIntent to allow users to select and play music with Siri. In building out this feature, we have noticed that when selecting playlists to play, Siri consistently leaves information out of the intent that we use to resolve the media to play in the app. There is generally no rhyme or reason as to why some information is left out. Walking through a few test cases, here are the phrases and the corresponding mediaSearch values we receive when testing:

"Hey Siri, play the playlist happy songs in the app" (this is a working example)

▿ Optional<INMediaSearch>
  - some : <INMediaSearch: 0x114050780> {
      reference = 0;
      mediaType = 5;
      sortOrder = 0;
      albumName = <null>;
      mediaName = happy songs;
      genreNames = ( );
      artistName = <null>;
      moodNames = ( );
      releaseDate = <null>;
      mediaIdentifier = <null>;
  }

"Hey Siri, play the playlist my favorites in the app" (this fails with a null mediaName)

▿ Optional<INMediaSearch>
  - some : <INMediaSearch: 0x114050600> {
      reference = 0;
      mediaType = 5;
      sortOrder = 0;
      albumName = <null>;
      mediaName = <null>;
      genreNames = ( );
      artistName = <null>;
      moodNames = ( );
      releaseDate = <null>;
      mediaIdentifier = <null>;
  }

"Hey Siri, play the playlist working out playlist in the app" (this fails as the term "playlist" is excluded)

▿ Optional<INMediaSearch>
  - some : <INMediaSearch: 0x114050ae0> {
      reference = 0;
      mediaType = 5;
      sortOrder = 0;
      albumName = <null>;
      mediaName = working out;
      genreNames = ( );
      artistName = <null>;
      moodNames = ( );
      releaseDate = <null>;
      mediaIdentifier = <null>;
  }

"Hey Siri, play the playlist recently added in the app" (this fails with a null mediaName)

▿ Optional<INMediaSearch>
  - some : <INMediaSearch: 0x1140507e0> {
      reference = 0;
      mediaType = 5;
      sortOrder = 0;
      albumName = <null>;
      mediaName = <null>;
      genreNames = ( );
      artistName = <null>;
      moodNames = ( );
      releaseDate = <null>;
      mediaIdentifier = <null>;
  }

Based on the above, Siri seems to ignore playlists named "Recently Added", "My Favorites", and playlists that have the word "playlist" in them, such as "Working Out Playlist". To rectify this, we attempted to set the INVocabulary for the playlist titles that a user has in the app, as suggested in this WWDC session: https://developer.apple.com/videos/play/wwdc2020/10060/

let vocabulary = INVocabulary.shared()
vocabulary.setVocabularyStrings(NSOrderedSet(array: [
    "my favorites",
    "recently added",
    "working out playlist"
]), of: .mediaPlaylistTitle)

This seems to have no effect. We understand the note at https://developer.apple.com/documentation/sirikit/registering_custom_vocabulary_with_sirikit/ stating that "a few minutes" should be waited before testing custom vocabulary, but waiting upwards of 20 minutes and even restarting the device did not result in any of the custom vocabulary making a difference. If these playlist names are set in AppIntentVocabulary.plist, "Recently Added" and "My Favorites" can be discovered as playlists, but the other failing test cases remain failing. The obvious shortcoming here is that this file is not dynamic.

<key>ParameterVocabularies</key>
<array>
    <dict>
        <key>ParameterNames</key>
        <array>
            <string>INPlayMediaIntent.playlistTitle</string>
        </array>
        <key>ParameterVocabulary</key>
        <array>
            <dict>
                <key>VocabularyItemIdentifier</key>
                <string>working out playlist</string>
                <key>VocabularyItemSynonyms</key>
                <array>
                    <dict>
                        <key>VocabularyItemPhrase</key>
                        <string>working out playlist</string>
                    </dict>
                </array>
            </dict>
            <dict>
                <key>VocabularyItemIdentifier</key>
                <string>recently added</string>
                <key>VocabularyItemSynonyms</key>
                <array>
                    <dict>
                        <key>VocabularyItemPhrase</key>
                        <string>recently added</string>
                    </dict>
                </array>
            </dict>
            <dict>
                <key>VocabularyItemIdentifier</key>
                <string>my favorites</string>
                <key>VocabularyItemSynonyms</key>
                <array>
                    <dict>
                        <key>VocabularyItemPhrase</key>
                        <string>my favourites</string>
                    </dict>
                    <dict>
                        <key>VocabularyItemPhrase</key>
                        <string>my favorites</string>
                    </dict>
                </array>
            </dict>
        </array>
    </dict>
</array>

Given the above, our questions are as follows:

Is there documentation on how Siri populates the mediaSearch in INPlayMediaIntent and how/why information may be left out?
Why does setting custom vocabulary with INVocabulary seem to have no effect, while the same vocabulary in AppIntentVocabulary.plist does?
Is the behavior we are experiencing expected, or should this be reported as a bug?

We've published the test app that we are using to debug this functionality at this link: https://github.com/awojnowski/SiriTest
3 · 0 · 405 · Aug ’24
Disable App Shortcuts without releasing a new build
I'm trying to create an App Shortcut so that users can interact with one of my app's features using Siri. I would like to be able to turn this shortcut on or off at runtime using a feature toggle. Ideally, I would be able to do something like this:

struct MyShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        // This shortcut is always available
        AppShortcut(
            intent: AlwaysAvailableIntent(),
            phrases: ["Show my always available intent with \(.applicationName)"],
            shortTitle: "Always Available Intent",
            systemImageName: "infinity"
        )

        // This shortcut is only available when "myCoolFeature" is available
        if FeatureProvider.shared.isAvailable("myCoolFeature") {
            AppShortcut(
                intent: MyCoolFeatureIntent(),
                phrases: ["Show my cool feature in \(.applicationName)"],
                shortTitle: "My Cool Feature Intent",
                systemImageName: "questionmark"
            )
        }
    }
}

However, this does not work, because the existing buildOptional implementation is limited to components of type (any _AppShortcutsContentMarker & _LimitedAvailabilityAppShortcutsContentMarker)?. All other attempts at making appShortcuts dynamic have resulted in shortcuts not working at all. I've tried:

Creating a makeAppShortcuts method that returns [AppShortcut] and invoking it from within appShortcuts.
Extending AppShortcutsBuilder to support a buildOptional block that isn't restricted to a component type of (any _AppShortcutsContentMarker & _LimitedAvailabilityAppShortcutsContentMarker)?.
Extending AppShortcutsBuilder to support buildArray and using compactMap(_:) to return an empty array when the feature is disabled.

I haven't used SiriKit before, but it appears that shortcut suggestions could be set at runtime by invoking setShortcutSuggestions(_:), meaning that what I'm trying to do would be possible there. I'm not against using SiriKit if I have to, but my understanding is that the App Intents framework is meant to be a replacement for SiriKit, and the prompt within Xcode to replace older custom intents with App Intents indicates that that is indeed the case.

Is there something obvious that I'm just missing, or is this simply not possible with the App Intents framework? Is the App Intents framework not meant to replace SiriKit, and should I just use that instead?
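For comparison, a minimal sketch of the runtime SiriKit mechanism the post refers to, INVoiceShortcutCenter.setShortcutSuggestions(_:). The activity type and titles are hypothetical:

import Intents

// Runtime-controlled shortcut suggestions in the SiriKit world (illustrative only).
func updateShortcutSuggestions(featureEnabled: Bool) {
    guard featureEnabled else {
        // Clearing the suggestions effectively removes the feature's shortcut suggestion.
        INVoiceShortcutCenter.shared.setShortcutSuggestions([])
        return
    }

    let activity = NSUserActivity(activityType: "com.example.myCoolFeature") // hypothetical type
    activity.title = "Show my cool feature"
    activity.isEligibleForPrediction = true

    INVoiceShortcutCenter.shared.setShortcutSuggestions([INShortcut(userActivity: activity)])
}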
2 · 1 · 612 · Aug ’24
CarPlay / Siri: single message display
We have an app with CarPlay messaging capability and have successfully integrated it so it can read out the list of unread messages. However, if a single message arrives and our push notification appears, we are not able to have the Siri UI automatically read or announce only that single message. The 'list' UI appears (Siri: "Would you like to read your messages...") when tapping the notification, whereas we would like the 'item' UI to appear immediately with the "reply, repeat, don't reply" buttons.

Our intent handler service looks like this; it's basically the auto-generated one for a new Intents extension:

import Intents

// As an example, this class is set up to handle Message intents.
// You will want to replace this or add other intents as appropriate.
// The intents you wish to handle must be declared in the extension's Info.plist.

// You can test your example integration by saying things to Siri like:
// "Send a message using <myApp>"
// "<myApp> John saying hello"
// "Search for messages in <myApp>"

class IntentHandler: INExtension, INSendMessageIntentHandling, INSearchForMessagesIntentHandling, INSetMessageAttributeIntentHandling {

    override func handler(for intent: INIntent) -> Any {
        // This is the default implementation. If you want different objects to handle different intents,
        // you can override this and return the handler you want for that particular intent.
        return self
    }

    // MARK: - INSendMessageIntentHandling

    // Implement resolution methods to provide additional information about your intent (optional).
    func resolveRecipients(for intent: INSendMessageIntent, with completion: @escaping ([INSendMessageRecipientResolutionResult]) -> Void) {
        if let recipients = intent.recipients {
            // If no recipients were provided we'll need to prompt for a value.
            if recipients.count == 0 {
                completion([INSendMessageRecipientResolutionResult.needsValue()])
                return
            }

            var resolutionResults = [INSendMessageRecipientResolutionResult]()
            for recipient in recipients {
                let matchingContacts = [recipient] // Implement your contact matching logic here to create an array of matching contacts
                switch matchingContacts.count {
                case 2 ... Int.max:
                    // We need Siri's help to ask user to pick one from the matches.
                    resolutionResults += [INSendMessageRecipientResolutionResult.disambiguation(with: matchingContacts)]
                case 1:
                    // We have exactly one matching contact
                    resolutionResults += [INSendMessageRecipientResolutionResult.success(with: recipient)]
                case 0:
                    // We have no contacts matching the description provided
                    resolutionResults += [INSendMessageRecipientResolutionResult.unsupported()]
                default:
                    break
                }
            }
            completion(resolutionResults)
        } else {
            completion([INSendMessageRecipientResolutionResult.needsValue()])
        }
    }

    func resolveContent(for intent: INSendMessageIntent, with completion: @escaping (INStringResolutionResult) -> Void) {
        if let text = intent.content, !text.isEmpty {
            completion(INStringResolutionResult.success(with: text))
        } else {
            completion(INStringResolutionResult.needsValue())
        }
    }

    // Once resolution is completed, perform validation on the intent and provide confirmation (optional).
    func confirm(intent: INSendMessageIntent, completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // Verify user is authenticated and your app is ready to send a message.
        let userActivity = NSUserActivity(activityType: NSStringFromClass(INSendMessageIntent.self))
        let response = INSendMessageIntentResponse(code: .ready, userActivity: userActivity)
        completion(response)
    }

    // Handle the completed intent (required).
    func handle(intent: INSendMessageIntent, completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // Implement your application logic to send a message here.
        let userActivity = NSUserActivity(activityType: NSStringFromClass(INSendMessageIntent.self))
        let response = INSendMessageIntentResponse(code: .success, userActivity: userActivity)
        completion(response)
    }

    // Implement handlers for each intent you wish to handle. As an example for messages,
    // you may wish to also handle searchForMessages and setMessageAttributes.

    // MARK: - INSearchForMessagesIntentHandling

    func handle(intent: INSearchForMessagesIntent, completion: @escaping (INSearchForMessagesIntentResponse) -> Void) {
        // Implement your application logic to find a message that matches the information in the intent.
        let userActivity = NSUserActivity(activityType: NSStringFromClass(INSearchForMessagesIntent.self))
        let response = INSearchForMessagesIntentResponse(code: .success, userActivity: userActivity)
        // Initialize with found message's attributes
        response.messages = [INMessage(
            identifier: "identifier",
            conversationIdentifier: "convo1",
            content: "I am so excited about SiriKit!",
            dateSent: Date(),
            sender: INPerson(personHandle: INPersonHandle(value: "sarah@example.com", type: .emailAddress), nameComponents: nil, displayName: "Sarah", image: nil, contactIdentifier: nil, customIdentifier: nil),
            recipients: [INPerson(personHandle: INPersonHandle(value: "+1-415-555-5555", type: .phoneNumber), nameComponents: nil, displayName: "John", image: nil, contactIdentifier: nil, customIdentifier: nil)],
            messageType: .text
        )]
        completion(response)
    }

    // MARK: - INSetMessageAttributeIntentHandling

    func handle(intent: INSetMessageAttributeIntent, completion: @escaping (INSetMessageAttributeIntentResponse) -> Void) {
        // Implement your application logic to set the message attribute here.
        let userActivity = NSUserActivity(activityType: NSStringFromClass(INSetMessageAttributeIntent.self))
        let response = INSetMessageAttributeIntentResponse(code: .success, userActivity: userActivity)
        completion(response)
    }
}

Is there specific configuration required to allow display of a single message when tapping on the notification?
3 · 0 · 367 · Sep ’24
Siri Intent Dismiss callback issue
I am presenting the Siri shortcut screen from the viewDidLoad method, as follows:

override func viewDidLoad() {
    super.viewDidLoad()

    // Present the Siri Shortcut screen to add the Card Payment intent
    let viewController = INUIAddVoiceShortcutViewController(shortcut: INShortcut(intent: self.cardPaymentIntent)!)
    viewController.modalPresentationStyle = .pageSheet
    // Setting Delegate
    viewController.delegate = self
    self.present(viewController, animated: true, completion: nil)
}

// Delegate Method Conformance :: INUIAddVoiceShortcutViewControllerDelegate

@available(iOS 12.0, *)
func addVoiceShortcutViewController(_ controller: INUIAddVoiceShortcutViewController, didFinishWith voiceShortcut: INVoiceShortcut?, error: Error?) {
    controller.dismiss(animated: true, completion: nil)
    // The issue is here. Whether we add the shortcut or dismiss the screen without adding it, this delegate gets called.
}

@available(iOS 12.0, *)
func addVoiceShortcutViewControllerDidCancel(_ controller: INUIAddVoiceShortcutViewController) {
    controller.dismiss(animated: true, completion: nil)
}

// Card Payment Intent
public var cardPaymentIntent: CardPaymentIntent {
    let intent = CardPaymentIntent()
    intent.suggestedInvocationPhrase = NSLocalizedString("Pay my credit card", comment: "")
    return intent
}

Whenever I present the Siri shortcut screen, whether I add the shortcut or dismiss the screen without adding it, the shortcut ends up added, and addVoiceShortcutViewController(_:didFinishWith:error:) is called every time. Any solution? When I dismiss the screen without adding, I don't want the shortcut to be added.
1 · 0 · 441 · Sep ’24
Unable to observe Focus status changes from an Intents app extension
I am writing a communication app that relies on the INShareFocusStatusIntentHandling protocol. However, it appears this API is not functional, even with the proper permissions and entitlements. Given the example code here, I am unable to trigger the logs in the INShareFocusStatusIntentHandling extension. In this code, when enabling or disabling Focus, line 33 of IntentHandler.swift never gets logged, even though FocusStatusCenter is authorized for the parent app, UserNotifications is authorized for the parent app (target), and the Communication Notifications entitlement has been added to the parent app. I am also unable to hit any breakpoints in the extension. It seems as if the extension is simply never triggered, maybe even broken at the OS level. I have tried both iOS 17 and the latest iOS 18 beta. Other users have reported similar difficulties on these forums.

To replicate:

Install the example app.
Give the app UserNotifications and FocusStatus permissions.
Background the app.
Change the Focus status.
Check the console.
Notice that the extension handler is never triggered/logged.
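For context, a minimal sketch of what such an Intents-extension handler typically looks like; the original example code isn't reproduced in the post, and the logging subsystem here is hypothetical:

import Intents
import os

// Illustrative Focus-status handler for an Intents extension.
class IntentHandler: INExtension, INShareFocusStatusIntentHandling {
    private let logger = Logger(subsystem: "com.example.app", category: "FocusStatus")

    override func handler(for intent: INIntent) -> Any {
        return self
    }

    func handle(intent: INShareFocusStatusIntent, completion: @escaping (INShareFocusStatusIntentResponse) -> Void) {
        // intent.focusStatus?.isFocused reports whether a Focus mode is now active.
        logger.log("Focus changed, isFocused = \(String(describing: intent.focusStatus?.isFocused))")
        completion(INShareFocusStatusIntentResponse(code: .success, userActivity: nil))
    }
}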
1 · 0 · 461 · Jul ’24
Muting Siri via App Intents
We're using App Intents to launch and control our app via Siri. Siri's responses have been fairly random: some show a "Done" popup, others give a verbal confirmation, and others say "I'm sorry, there's been a problem". The latter is bogus and doesn't look good to potential investors when the app is actually working fine. There appears to be no way in code, that I've been able to find so far, to tell Siri to STFU and let us handle our own errors. Otherwise, is there a way to supply Siri with a dictionary of stored messages that could be triggered from inside the app?
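A sketch of the degree of control that does exist, assuming an intent of your own: returning a dialog from perform() and catching errors yourself (rather than throwing them) determines most of what Siri says. The intent, helper, and dialog text below are hypothetical:

import AppIntents

struct LaunchFeatureIntent: AppIntent {
    static var title: LocalizedStringResource = "Launch Feature"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        do {
            try await doTheWork() // placeholder for the app's real logic
            // Supplying a dialog replaces the generic "Done" confirmation.
            return .result(dialog: "All set.")
        } catch {
            // Catching instead of throwing avoids Siri's generic
            // "I'm sorry, there's been a problem" response.
            return .result(dialog: "That didn't work, please try again in the app.")
        }
    }

    private func doTheWork() async throws {
        // Hypothetical app call.
    }
}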
1 · 0 · 565 · Jul ’24
SiriKit Extension still needed with AppIntents?
I have followed the SoupChef example in migrating Custom Intents from SiriKit to AppIntents. However, we only require one iOS release back, so we can require iOS 17. Thus, I eliminated everything that was strictly for backwards compatibility, most notably the SiriKit Extension that required enormous amounts of code to try to coordinate with the real app which worked poorly anyway. I tested for example that an NFC tag Automation created in Shortcuts works to execute an AppIntent while the app is backgrounded. I am now receiving a beta report that indicates someone trying to execute one of our migrated AppIntents from their HomePod is not working, and they say it used to work sometimes (not all the time). I'm sure most such cases used to require the SiriKit Extension in the old SiriKit world. I am terrified that I may need to rebuild that monster once again when the new (to me) AppIntent API seemed so beautiful without it and seemed to work without it. The AppIntent API documentation seems to indicate that SiriKit Extensions are no longer related or required. What is the truth here? Do I need to re-implement everything twice in the SiriKit Extension like a barbarian, or can we live in the new world with AppIntents? Thank you.
1 · 0 · 713 · Jul ’24