Hi!
When my device is set to English, both search and the Shortcuts app automatically show multiple shortcuts, parametrised for each value of the AppEnum, which is what I expected. When my device is set to German, I get only the basic AppShortcut without the (optional) parameter.
I am using an AppEnum (see below) for the parametrised phrases and localise the phrases into German with an AppShortcuts String Catalog added to my project.
Everything else seems to work, I can use my AppShortcut in the Shortcuts app and invoke it via Siri in both English and German.
The Shortcuts app displays the values correctly using the localized strings.
Any ideas?
import AppIntents

class ApolloShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: GetIntent(),
            phrases: [
                "Get data from \(.applicationName)",
                "Get data from \(.applicationName) for \(\.$day)",
                "Get data from \(.applicationName) for the \(\.$day)"
            ],
            shortTitle: "Get Data",
            systemImageName: "wand.and.sparkles")
    }
}

enum ForecastDays: String, AppEnum {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Day"

    static var caseDisplayRepresentations: [Self: DisplayRepresentation] = [
        .today: DisplayRepresentation(title: LocalizedStringResource("today", table: "Days")),
        .tomorrow: DisplayRepresentation(title: LocalizedStringResource("tomorrow", table: "Days")),
        .dayAfterTomorrow: DisplayRepresentation(title: LocalizedStringResource("dayAfterTomorrow", table: "Days"))
    ]

    case today
    case tomorrow
    case dayAfterTomorrow

    var displayName: String {
        String(localized: .init(rawValue), table: "Days")
    }
}
App Intents
Extend your app's custom functionality to support system-level services, like Siri and the Shortcuts app.
WWDC videos suggest that existing apps should continue using the old SiriKit domains, such as INPlayMediaIntent. But what about new apps for playing audio? Should we implement Siri functionality for audio playback using the old SiriKit domains, or should we create our own AppEntities and trigger them via custom AudioPlaybackIntent implementations?
Interactive widgets require an AppIntent and don’t support the old INPlayMediaIntent. To achieve the same functionality as the Music app widgets, it seems logical to adopt the new AudioPlaybackIntent. However, I can't find any information about this in the documentation.
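For what it's worth, a minimal AudioPlaybackIntent along those lines might look something like the sketch below; PlayerManager is a placeholder for whatever playback controller the app uses, not an API from the documentation.

import AppIntents

// Sketch only: AudioPlaybackIntent tells the system that performing the intent
// changes audio playback state, so a widget can trigger it without opening the app.
struct ResumePlaybackIntent: AudioPlaybackIntent {
    static var title: LocalizedStringResource = "Resume Playback"
    static var description = IntentDescription("Resumes audio playback.")

    func perform() async throws -> some IntentResult {
        // PlayerManager is a hypothetical app-side playback controller.
        PlayerManager.shared.resume()
        return .result()
    }
}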
Imagine we have an Xcode workspace containing two projects:
MyLibrary.xcodeproj holding a framework target
MyShortcutsApp.xcodeproj holding an app target which consumes MyLibrary framework
Both targets define App Intents and the ones from MyLibrary are exposed via AppIntentsPackage accordingly.
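For reference, the package wiring described above looks roughly like this; the type names are illustrative, only the project names come from the post.

import AppIntents

// In MyLibrary.xcodeproj (framework target)
public struct MyLibraryPackage: AppIntentsPackage { }

// In MyShortcutsApp.xcodeproj (app target), re-exporting the framework's intents
struct MyShortcutsAppPackage: AppIntentsPackage {
    static var includedPackages: [any AppIntentsPackage.Type] {
        [MyLibraryPackage.self]
    }
}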
When trying to wrap the App Intent from the framework as an App Shortcut and passing localized AppShortcutPhrases, I see the following compile error:
".../Resources/de.lproj/AppShortcuts.strings:11:1: error: This AppShortcut does not map to a known action (MyLibraryIntent specified). (in target 'MyShortcutsApp' from project 'MyShortcutsApp')"
If I use the same localized App Shortcut phrases for an App Intent that is defined locally in the app target, everything works fine; the same is true if I use the framework-provided App Intent in an App Shortcut without passing any localized phrases.
This is happening with Xcode 16.0 (16A242d), with 16.1 (16B40) and with 16.2 beta 2 (16C5013f).
I already raised this issue via FB15701779 which contains a sample project to reproduce and to further analyze the issue.
Thanks for any hint on how to solve that.
Frank
I was able to add shortcuts with parameters and use them from the Shortcuts app in iOS 17; nevertheless, the Siri intent never worked.
I upgraded my app and my phone to iOS 18.
Now, the shortcut only appears in the Shortcuts app if no parameter is added to it. When I try to set a parameter, the shortcut no longer appears in the Shortcuts app.
struct ShortcutsProvider: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenAppIntent(),
            phrases: [
                "Show \(\.$screen) in \(.applicationName)"
            ],
            shortTitle: "Open",
            systemImageName: "iphone.badge.play"
        )
    }
}
struct OpenAppIntent: AppIntent {
    static var title: LocalizedStringResource = "Show"
    static let description = IntentDescription("Shows a screen.")
    static var openAppWhenRun: Bool = true
    static var authenticationPolicy = IntentAuthenticationPolicy.alwaysAllowed

    @Parameter(title: "screen")
    var screen: String

    @MainActor
    func perform() async throws -> some IntentResult {
        return .result()
    }
}
extension ScreenOption: AppEntity {
    struct OpenAppQuery: EntityQuery {
        @IntentParameterDependency<OpenAppIntent>(\.$screen)
        var openAppIntent

        func entities(for: [ScreenOption.ID]) async throws -> [ScreenOption] {
            return []
        }

        func suggestedEntities() async throws -> [ScreenOption] {
            return []
        }
    }

    var displayRepresentation: DisplayRepresentation {
        .init(stringLiteral: "\(title)")
    }

    static var defaultQuery: OpenAppQuery = OpenAppQuery()
    static var typeDisplayRepresentation: TypeDisplayRepresentation = .init(name: "Screen")
}
extension ScreenOption: EntityIdentifierConvertible {
    static func entityIdentifier(for entityIdentifierString: String) -> ScreenOption? {
        allCases.filter { $0.rawValue == entityIdentifierString }.first
    }

    public var entityIdentifierString: String {
        rawValue
    }

    public init?(entityIdentifierString: String) {
        guard let screenOption = ScreenOption.entityIdentifier(for: entityIdentifierString)
        else { return nil }
        self = screenOption
    }
}
My requirement is to open a specific screen of my app when the user says "Start Sleep meditation for 10 minutes", where "Sleep" and "10 minutes" are dynamic values in the phrase. Is it possible to get the values just from the phrase, or do I need to have Siri ask "which meditation?" and then "how much time?"? I am planning to use AppIntent and AppShortcut, along with Entities, but I am unable to open the shortcut when Siri is invoked with the phrase discussed above.
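As a rough illustration of the moving parts involved, the sketch below resolves one value (the meditation type) from the phrase via an AppEnum and collects the duration through a separate parameter; all type names here are hypothetical, not from the poster's project.

import AppIntents

// Illustrative only: a parameter referenced in an App Shortcut phrase needs a
// fixed, enumerable set of values (an AppEnum or AppEntity) so Siri can match it.
enum MeditationType: String, AppEnum {
    case sleep, focus, relax

    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Meditation"
    static var caseDisplayRepresentations: [Self: DisplayRepresentation] = [
        .sleep: DisplayRepresentation(title: "Sleep"),
        .focus: DisplayRepresentation(title: "Focus"),
        .relax: DisplayRepresentation(title: "Relax")
    ]
}

struct StartMeditationIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Meditation"
    static var openAppWhenRun: Bool = true

    @Parameter(title: "Meditation")
    var meditation: MeditationType

    // In this sketch the duration is asked for separately, not parsed from the phrase.
    @Parameter(title: "Minutes")
    var minutes: Int

    func perform() async throws -> some IntentResult {
        // Navigate to the requested screen here.
        return .result()
    }
}

struct MeditationShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartMeditationIntent(),
            phrases: [
                "Start \(\.$meditation) meditation in \(.applicationName)"
            ],
            shortTitle: "Start Meditation",
            systemImageName: "moon.zzz"
        )
    }
}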
I'm curious if anyone else has figured out why an intent defined in the intents file never seems to appear in the Shortcuts app on macOS.
I'm following the steps outlined in "Meet Shortcuts for macOS" from WWDC 2021.
https://developer.apple.com/videos/play/wwdc2021/10232
I build and run my app, launch Shortcuts, and the intent I defined refuses to show up!
There's one caveat - I allowed Xcode to update to 16.1, and mysteriously the intent became available in Shortcuts.app. When I went to add a second intent, I see the same as above - it simply never shows up in Shortcuts.app.
I have a few intents I'd like to write/add, but this build/test cycle is really slowing me down.
This app is a completely fresh Swift-AppKit app, I've never archived it, so there shouldn't be more than one copy on disk. I have also cleaned the build folder, restarted Xcode, restarted Shortcuts, restarted my machine entirely...
Anyone see this before and find a workaround? Any advice on how to give Shortcuts.app a kick in the rear to try and find my second intent?
I have an app with a shared internal framework, a main app target, and a widget target. In my shared framework, I have an AppIntent, FooIntent. In addition, I have an AppIntentsPackage
public struct FooIntentsPackage: AppIntentsPackage { }
also in the framework. Finally, in the widget target, I reference that package:
struct FooAppIntents: AppIntentsPackage {
    static var includedPackages: [any AppIntentsPackage.Type] { [FooIntentsPackage.self] }
}
However, when I run this, I get a bunch of these errors:
metadata `_$s8Internal15FooAppIntentsV' did not match any imported symbol.
I've tried turning off Strip Linked Product in both the Framework and the Widget, to no avail. Any ideas?
I have implemented ShowInAppSearchResultsIntent and AppShortcutsProvider, but on iOS 18.1+ I am getting an error in the console: "Failed to generate TargetContentIdentifier for criteria."
On iOS 18.0 it works fine.
The code I have implemented:
@AssistantIntent(schema: .system.search)
struct SearchIntent: ShowInAppSearchResultsIntent {
    // static let title: LocalizedStringResource = "Search in Cineverse for"
    static let searchScopes: [StringSearchScope] = [.general]

    @Parameter(requestValueDialog: IntentDialog("What would you like to search for?"))
    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        let searchString = criteria.term
        print("Searching for \(searchString)")
        return .result()
    }
}
class AppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: SearchIntent(),
            phrases: [
                "using \(.applicationName) search for",
                "search on \(.applicationName) app"
            ],
            shortTitle: "Search Movie",
            systemImageName: "magnifyingglass"
        )
    }
}
I am working to add Spotlight indexing for my app entities as discussed in WWDC24's video "What's New in App Intents".
That video goes over the IndexedEntity protocol and the integration with Spotlight via CSSearchableItemAttributeSet.
What I'm seeing though does not match the video. In the video, the presenter goes through the sort of progressive approach you can take to getting this data into Spotlight starting with the basics and then expanding to include more support depending on how much the developer wants to do.
What I'm seeing is that if you conform to IndexedEntity, your entities will appear in Spotlight using the name derived from
public var displayRepresentation: DisplayRepresentation
So, that works. Name appears... BUT the next part of the video goes into how to expand your implementation with more metadata for Spotlight via CSSearchableItemAttributeSet. The issue I'm seeing is that once that's implemented, the items disappear from Spotlight, almost like that implementation is overriding the base implementation in a way that no longer functions.
My expectation is that an item with custom attributes would use them in Spotlight as appropriate, not disappear from search, i.e. what's shown in the video should work.
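For context, the conformance pattern in question looks roughly like the sketch below; the entity name and attributes are illustrative, not taken from the sample project.

import AppIntents
import CoreSpotlight
import UniformTypeIdentifiers

// Sketch only: conforming to IndexedEntity surfaces the entity in Spotlight under
// its displayRepresentation; overriding attributeSet is the step that should add
// extra metadata on top of that.
struct TrailEntity: AppEntity, IndexedEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Trail"
    static var defaultQuery = TrailQuery()

    var id: UUID
    var name: String
    var summary: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }

    // Custom Spotlight attributes, the part that appears to break indexing.
    var attributeSet: CSSearchableItemAttributeSet {
        let attributes = CSSearchableItemAttributeSet(contentType: .content)
        attributes.displayName = name
        attributes.contentDescription = summary
        return attributes
    }
}

struct TrailQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [TrailEntity] { [] }
    func suggestedEntities() async throws -> [TrailEntity] { [] }
}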
I've got a sample project here:
https://hanchor.s3.amazonaws.com/misc/IndexingTest.zip
To reproduce with the sample:
1. Build and run. Indexing is set up in the init() method, so it will just run.
2. Go to Spotlight and search for 'Huntersblau', a string included in the content set. At this point you should see a result - good!
3. Stop the app, go back, and uncomment the var attributeSet: CSSearchableItemAttributeSet implementation in IndexingTestApp.swift. This will provide custom attributes to Spotlight.
4. Repeat steps 1 and 2 - you'll see it no longer appears in the search results: when CSSearchableItemAttributeSet is implemented, the item drops out of Spotlight.
Hello all, I'm finding myself with a compile error when trying to use a defined UTType for Transferable conformance when the type is also an AppEntity.
The compiler error is
Could not determine the identifier of `.todo`. Please use a UTType defined by the UniformTypeIdentifiers framework
However, said compiler error only shows up after adding AppEntity conformance.
So, in order to reproduce:
1. Create any type, conforming to Codable:
struct Todo: Codable {
    var id: UUID
    var title: String
    var completed: Bool
}
2. Create a UTType extension for the new type:
extension UTType {
    public static let todo: UTType = UTType(exportedAs: "org.nameghino.types.todo")
}
3. Add Transferable conformance:
extension Todo: Transferable {
    static var transferRepresentation: some TransferRepresentation {
        CodableRepresentation(contentType: .todo)
        ProxyRepresentation(exporting: \.title)
    }
}
At this point, the code compiles correctly on Xcode 16.2 beta 2 (16C5013f)
4. Add AppEntity conformance:
extension Todo: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "todo_title"
    static var defaultQuery = Todo.Query()

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }

    struct Query: EntityQuery {
        typealias Entity = Todo

        init() {}

        func entities(for identifiers: [UUID]) async throws -> [Todo] {
            return []
        }

        func suggestedEntities() async throws -> [Entity] {
            return []
        }
    }
}
Now the code no longer compiles, producing the aforementioned error.
I noticed that with iOS 18, when adding a widget to Control Center, there is now a grouping system. I'm interested in the Capture group, which contains native widgets from Apple as well as third-party apps like Instagram and Blackmagic Cam; widgets in this group open the camera. My widget also opens the camera in my app. How can I add it to this group?
So, I've declared an AppIntent that indicates my app can "Open files" that conform to UTType.image.
I've got a @AssistantEntity(schema: .files.file) and a
@AssistantIntent(schema: .files.openFile) declared.
So I navigate to the Files app, Quick Look an image, and open Type to Siri.
I tell Siri "open this in " and all it does is act like "open ". No breakpoint is hit in my intent's perform method.
Am I doing something wrong? How can I test these cross-app behaviors?
Are they... not actually possible? Does an "OpenIntent" only work on my app's own URLs and not on file URLs from other apps?
Hello everyone,
I'm currently working on an App Intent for my iOS app, and I’ve encountered a frustrating issue related to how Siri prompts for a category selection. Here’s an overview of what I’m dealing with:
extension Category: AppEntity, @unchecked Sendable {
    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Category")

    typealias DefaultQueryType = ShortcutsCategoryQuery
    static var defaultQuery: ShortcutsCategoryQuery = ShortcutsCategoryQuery()
}
struct ShortcutsCategoryQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [Category] {
        let context = await ModelContext(sharedModelContainer)
        let categories = try CategoryDataProvider(context: context).getItems()
        return categories.filter { identifiers.contains($0.id) }
    }

    func entities(matching string: String) async throws -> [Category] {
        return try await suggestedEntities()
    }

    func suggestedEntities() async throws -> [Category] {
        let context = await ModelContext(sharedModelContainer)
        do {
            let categories = try CategoryDataProvider(context: context).getItems()
            if categories.isEmpty {
                print("No categories found.")
            }
            return categories.map { category in
                Category(
                    id: category.id,
                    name: category.name,
                    stringSymbol: category.stringSymbol,
                    symbol: category.symbol,
                    stringColor: category.stringColor,
                    color: category.color
                )
            }
        } catch {
            print(error)
            return []
        }
    }
}
The issue arises when I use Siri to invoke the intent. Siri correctly asks me to select a category but does not display any options unless I say something that Siri recognizes, like "Casa" (House) or "*****" (Test) in Portuguese. Only then does it show the list of available categories.
I would like the categories to appear immediately when Siri asks for a selection. I've already tried refining the ShortcutsCategoryQuery and debugging various parts of my code, but nothing seems to fix this behavior.
I'm returning the following result in one of my AppIntents:
return .result(value: "Done!", dialog: IntentDialog("Speed limit \(speedLimit)"))
With iOS 18.0.1, it nicely confirmed the result of the user's command by saying e.g. "Speed limit 60" and showing it at the top of the screen.
With iOS 18.1, it only shows/says "That's done" or "Done" at the bottom of the screen.
Am I missing something that changed in the AppIntents API since iOS 18.1?
I am working on adding indexing to my App Entities via IndexedEntity. I already index my content separately via Spotlight.
Watching 'What's New in App Intents', this is covered well, but I have a question.
Do I need to implement both CSSearchableItem's associateAppEntity and a custom implementation of attributeSet in my IndexedEntity conformance? It seems duplicative, but I can't tell from the video whether you're supposed to do both or just one or the other.
Hi devs,
My app has Control Center widgets and interactive widgets; both use the same app intent to update sharing status (stored in an App Group UserDefaults).
When the app is running in the background, both the Control Center widget and the interactive widgets reload with the correct status. However, if the app is not running in the background and I tap the interactive widget, the interactive widget reloads with the correct status but the Control Center widget doesn't; and vice versa, the Control Center widget reloads with the correct status but the interactive widgets don't.
struct ChangeAppStatusIntent: AudioRecordingIntent, CustomIntentMigratedAppIntent, PredictableIntent, LiveActivityIntent, SetValueIntent {
    ... // data and setup

    @MainActor
    func perform() async throws -> some IntentResult {
        // change the AppGroup data for app status
        WidgetCenter.shared.reloadAllTimelines()
        ControlCenter.shared.reloadAllControls()
        return .result()
    }
}
Is there any way to fix this issue?
@available(iOS 18.0, *)
@AssistantIntent(schema: .system.search)
struct SearchIntent: AppIntent {
    static let title: LocalizedStringResource = "Search <app name redacted>"
    static let searchScopes: [StringSearchScope] = [.general]

    @Parameter(title: "Criteria")
    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        MyDependencyManager.shared.performSearch(with: criteria.term)
        return .result()
    }
}
// In AppShortcutsProvider
AppShortcut(
    intent: SearchIntent(),
    phrases: [
        "Find \(\.$criteria) in \(.applicationName)",
        "Search for \(\.$criteria) in \(.applicationName)",
    ],
    shortTitle: "Search <app name redacted>",
    systemImageName: "magnifyingglass"
)
The search works when using the Shortcuts app, but not when using Siri. For example, if I ask Siri "Search for in " it just does a Google search or prompts me to install the app from the App Store (I am running a debug build via Xcode). I can't get this to work from Spotlight either.
I am using Xcode 16 and running the app on an iPhone 16 with iOS 18.2 beta and Apple Intelligence turned on.
What am I doing wrong here? I cannot find any other information about this intent or how to properly set it up and use it.
I'm adding widget interactivity to my home screen widgets via buttons and AppIntents, but I'm running into some interesting behavior in the way the timeline is reloaded afterwards.
I'm following this guide from Apple
https://developer.apple.com/documentation/widgetkit/adding-interactivity-to-widgets-and-live-activities
The widget is guaranteed to be reloaded when a button with an intent is pressed, but whenever the AppIntent finishes its perform action, the widget timeline is always reloaded twice. It's also interesting to note that both reloads happen after the perform method: if you add a 10-second sleep in perform, nothing happens for 10 seconds, then both reloads happen.
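For context, the setup follows the standard pattern from that guide; a rough sketch is below, with type names of my own rather than from the sample project.

import AppIntents
import SwiftUI

// Sketch only: the button performs the intent, and WidgetKit reloads the
// widget's timeline after perform() returns.
struct RefreshIntent: AppIntent {
    static var title: LocalizedStringResource = "Refresh"

    func perform() async throws -> some IntentResult {
        // Update shared state (e.g. an App Group store) here.
        return .result()
    }
}

struct ReloadCounterView: View {
    let reloadCount: Int

    var body: some View {
        VStack {
            Text("Reloads: \(reloadCount)")
            Button(intent: RefreshIntent()) {
                Label("Refresh", systemImage: "arrow.clockwise")
            }
        }
    }
}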
The issue with this is two-fold:
1. Calculating and rendering the entire widget timeline can involve networking- and DB-intensive operations, so I would ideally like to avoid doing all the work twice, saving the user's battery and processing.
2. Even worse, sometimes data on the server changes in between the split-second duplicate timeline reloads, causing the widget to flash one state and then update to another a second later, which is not a good user experience.
I have a sample project which shows the issue and is very easy to reproduce.
The widget simply keeps track of the number of reloads.
To reproduce:
1. Add the widget to the home screen.
2. Press the refresh button and observe that the timeline refresh count always goes up by 2.
I've filed a Feedback and attached the sample project and screen recording for anyone to reproduce.
FB15595835
Hello,
I am implementing an App Intent which asks the user for a currency amount:
private func loadAmountList(forNumber number: String) async throws -> [NSDecimalNumber] { ... }

@MainActor
func perform() async throws -> some IntentResult & ShowsSnippetView {
    let list = try await loadAmountList(forNumber: fixedNumber).compactMap {
        currencyFormatter.string(from: $0)
    }
    throw $amount.needsDisambiguationError(among: list, dialog: "app_intent_sim_amount_prompt")
}
If I start this intent from Siri, the attached screenshot is shown, but no matter what I say ("10 EURO", "ten", "10", "10€"...) Siri never understands anything and keeps reshowing the dialog over and over again.
If I instead tap one of the choices, the intent execution proceeds correctly.
How can I solve the problem?
Thanks
Hello,
I am implementing an App Intent which shows a confirmation dialog before proceeding with the operation execution.
It works fine when the intent is started from a shortcut, but it always fails when started from Siri: I obtain the error message depicted in the attached screenshot ("An error occurred, try again").
That message appears as soon as the requestConfirmation method is called in the perform method of my App Intent:
try await requestConfirmation(actionName: .do, dialog: "app_intent_sim_confirmation_message") {
    SIMRechargeIntentSummaryView(...)
}
...
How can I solve the problem?
Thanks