I'm playing with the new Vision API for iOS 18, specifically with the new CalculateImageAestheticsScoresRequest API.
When I try to perform the image observation request I get this error:
internalError("Error Domain=NSOSStatusErrorDomain Code=-1 \"Failed to create espresso context.\" UserInfo={NSLocalizedDescription=Failed to create espresso context.}")
The code is pretty straightforward:
if let image = image {
    let request = CalculateImageAestheticsScoresRequest()
    Task {
        do {
            let cgImg = image.cgImage!
            let observations = try await request.perform(on: cgImg)
            let description = observations.description
            let score = observations.overallScore
            print(description)
            print(score)
        } catch {
            print(error)
        }
    }
}
I'm running it on an M2 Mac using the simulator.
Is it a bug? What's wrong?
I am very new to App Intents and I am trying to add them to my On Device LLM ChatBot app so my users can get answers to any questions anywhere in iOS.
I have the following code and it is working wonderfully in the Shortcuts app.
import AppIntents

struct AskAi: AppIntent {
    static var openAppWhenRun: Bool = false
    static let title: LocalizedStringResource = "Ask Ai About"
    static let description = "Gets an answer from Ai for your question."

    @Parameter(title: "Question")
    var question: String

    static var parameterSummary: some ParameterSummary {
        Summary("Ask Ai About \(\.$question)")
    }

    @MainActor
    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        let bot: Bot = Bot()
        await bot.respond(to: self.question)
        return .result(
            value: bot.output
        )
    }
}
class AppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AskAi(),
            phrases: [
                "Ask \(.applicationName) \(\.$question)",
                "Get \(.applicationName) answer for \(\.$question)",
                "Open \(\.$question) using \(.applicationName) ",
                "Using \(.applicationName) get help with \(\.$question)"
            ],
            shortTitle: "Ask Ai",
            systemImageName: "sparkles"
        )
    }
}
I can create a shortcut for this AppIntent, and that allows me to, for example, speak the response.
On iOS 18 beta 1, I can invoke my shortcut by the shortcut name I set in the Shortcuts app, and that works.
It does not work at all when I just ask Siri any of the phrases I have defined.
The Info.plist has an app name alias defined, just to be sure.
I even added the Siri capability in Xcode-beta.
I also tried using the ProvidesDialog return type too.
Whatever I do the AppIntent is invisible to Siri.
Siri tries to search the web, looks for my app name in Contacts, or gives an error about Apple Cash, which has nothing to do with what I was asking.
Is there anything else I am missing for setting up iOS AppIntents to work with Siri?
Using App Shortcuts with App Intents, Siri only responds to the first shortcut defined in the AppShortcutsProvider below.
struct MementoShortcuts: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: SaveLinkIntent(),
            phrases: [
                "Add a link to \(.applicationName)",
                "Add \(\.$url) to \(.applicationName)",
                "Make a new link in \(.applicationName)",
                "Create a new link in \(.applicationName) from \(\.$url)"
            ],
            shortTitle: "Add Link",
            systemImageName: "link.badge.plus"
        )
        AppShortcut(
            intent: LinkViewedIntent(),
            phrases: [
                "Mark a link I saved in \(.applicationName) as viewed",
                "Mark \(\.$link) as viewed in \(.applicationName)",
                "Set link in \(.applicationName) to viewed",
                "Change status of \(\.$link) to viewed in \(.applicationName)",
            ],
            shortTitle: "Mark Link as Viewed",
            systemImageName: "book"
        )
    }
}
I have tried switching the order, and Siri always uses the one that comes first. Both show up in the Shortcuts app as app shortcuts, but only one shortcut is recognized by Siri, even if I say the other one's phrase.
iOS 18 App Intents while supporting iOS 17
Hello,
I have an existing app that supports iOS 17. I already have three App Intents but would like to add some of the new iOS 18 app intents like ShowInAppSearchResultsIntent.
However, I am having a hard time using #available or @available to limit this ShowInAppSearchResultsIntent to iOS 18 only while still supporting iOS 17.
Obviously, ShowInAppSearchResultsIntent needs to use @AssistantIntent, which is iOS 18 only, so I mark that struct as @available(iOS 18, *). That works as expected. The trouble begins when I need to add this "SearchSnippetIntent" intent to the AppShortcutsProvider. See the code below:
struct SnippetsShortcutsAppShortcutsProvider: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        // iOS 17+
        AppShortcut(intent: SnippetsNewSnippetShortcutsAppIntent(), phrases: [
            "Create a New Snippet in \(.applicationName) Studio",
        ], shortTitle: "New Snippet", systemImageName: "rectangle.fill.on.rectangle.angled.fill")

        AppShortcut(intent: SnippetsNewLanguageShortcutsAppIntent(), phrases: [
            "Create a New Language in \(.applicationName) Studio",
        ], shortTitle: "New Language", systemImageName: "curlybraces")

        AppShortcut(intent: SnippetsNewTagShortcutsAppIntent(), phrases: [
            "Create a New Tag in \(.applicationName) Studio",
        ], shortTitle: "New Tag", systemImageName: "tag.fill")

        // iOS 18 only
        AppShortcut(intent: SearchSnippetIntent(), phrases: [
            "Search \(.applicationName) Studio",
            "Search \(.applicationName)"
        ], shortTitle: "Search", systemImageName: "magnifyingglass")
    }

    let shortcutTileColor: ShortcutTileColor = .blue
}
The iOS 18-only AppShortcut shows the following error, but none of the suggested fixes seem to work. Maybe I am going about it the wrong way.
'SearchSnippetIntent' is only available in iOS 18 or newer
Add 'if #available' version check
Add @available attribute to enclosing static property
Add @available attribute to enclosing struct
Thanks in advance for your help.
The documentation for translationTask(source:target:action:) says it should translate when content appears, but this isn't happening. I’m only able to translate when I manually associate that task with a configuration, and instantiate the configuration.
Here’s the complete source code:
import SwiftUI
import Translation

struct ContentView: View {
    @State private var originalText = "The orange fox jumps over the lazy dog"
    @State private var translationTaskResult = ""
    @State private var translationTaskResult2 = ""
    @State private var configuration: TranslationSession.Configuration?

    var body: some View {
        List {
            // THIS DOES NOT WORK
            Section {
                Text(translationTaskResult)
                    .translationTask { session in
                        Task { @MainActor in
                            do {
                                let response = try await session.translate(originalText)
                                translationTaskResult = response.targetText
                            } catch { print(error) }
                        }
                    }
            }

            // THIS WORKS
            Section {
                Text(translationTaskResult2)
                    .translationTask(configuration) { session in
                        Task { @MainActor in
                            do {
                                let response = try await session.translate(originalText)
                                translationTaskResult2 = response.targetText
                            } catch { print(error) }
                        }
                    }

                Button(action: {
                    if configuration == nil {
                        configuration = TranslationSession.Configuration()
                        return
                    }
                    configuration?.invalidate()
                }) { Text("Translate") }
            }
        }
    }
}
How can I automatically translate a given text when it appears using the new translationTask API?
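For comparison, here is a minimal sketch of what I would expect to make the translation run automatically (an untested assumption on my part): seed the configuration state up front instead of leaving it nil until a button press, so that .translationTask(configuration) fires as soon as the view appears. The AutoTranslatingText view and the language choices are hypothetical.

import SwiftUI
import Translation

// Untested sketch: a non-nil configuration at creation time should trigger
// the translation task when the view appears. View name and languages are
// hypothetical.
struct AutoTranslatingText: View {
    let originalText: String

    @State private var translated = ""
    @State private var configuration = TranslationSession.Configuration(
        source: Locale.Language(identifier: "en"),
        target: Locale.Language(identifier: "fr")
    )

    var body: some View {
        Text(translated)
            .translationTask(configuration) { session in
                do {
                    let response = try await session.translate(originalText)
                    await MainActor.run { translated = response.targetText }
                } catch { print(error) }
            }
    }
}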
Hi everyone,
I was wondering how accurate the Hand Classification ML is. For example, is it possible to distinguish the different letters of the sign language alphabet, or is it only capable of recognizing simple poses like a thumbs-up?
I noticed the code snippets are missing from the WWDC 2024 session 10210 video, "Bring your app's core features to users with App Intents" (https://developer.apple.com/videos/play/wwdc2024/10210/). It would be useful if those could be added. I also noticed the transcript is missing from the web version, though it is in the Developer app, which is odd.
I'm trying to use a custom system image for my App Shortcut:
AppShortcut(
    intent: AppOpenIntent(),
    phrases: ["Open \(.applicationName)"],
    shortTitle: "Open App",
    systemImageName: "custom.image"
)
I've added custom.image to Images.xcassets for my project, but the shortcuts icon is always blank in the Shortcuts app. Is it possible to use custom system images for app shortcuts?
I'm looking for a solution to take a picture or point the camera at a piece of clothing and match that image with an image the user has stored in my app.
I'm storing the data in a Core Data database as a Binary Data object. Since the users also take the pictures they store in the database, I think I cannot use pre-trained Core ML models.
I would like the matching to be done on-device if possible, instead of going to an external service. An external service would probably describe the item based on what the AI sees, but then I still could not match that description against the images stored in the app.
Does anyone know if this is possible with frameworks such as Vision or VisionKit?
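One direction that looks plausible to me is comparing Vision feature prints, which would run entirely on-device without a custom-trained model; a minimal sketch under that assumption (the helper name is hypothetical, error handling simplified):

import Vision

// Sketch: compute a feature print for each image and measure the distance
// between them; smaller distances mean more similar images. The helper name
// is hypothetical.
func featurePrintDistance(between a: CGImage, and b: CGImage) throws -> Float {
    func featurePrint(for image: CGImage) throws -> VNFeaturePrintObservation {
        let request = VNGenerateImageFeaturePrintRequest()
        try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
        guard let observation = request.results?.first else {
            throw CocoaError(.featureUnsupported)
        }
        return observation
    }

    var distance: Float = 0
    try featurePrint(for: a).computeDistance(&distance, to: featurePrint(for: b))
    return distance
}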
I just downgraded from iOS 18 to iOS 17.5.1 and Siri is not working anymore; I cannot even set it up.
I have already reset the phone and it is still not working. :(
Adding the openAppWhenRun property to an AppIntent for a ControlWidgetButton causes the following error when the control is tapped in Control Center:
Unknown NSError The operation couldn’t be completed. (LNActionExecutorErrorDomain error 2018.)
Here's the full ControlWidget and AppIntent code that causes the error:
Should controls be able to open apps after the AppIntent runs, or is this a bug?
I am developing an iPhone application. When I start testing in a simulator or on an actual device, I get the following message depending on the model. I don't see any problem with the actual operation of the app, but I don't know how to resolve this error.
#FactoryInstall Unable to query results, error: 5
Unable to list voice folder
Unable to list voice folder
Unable to list voice folder
Unable to list voice folder
Unable to list voice folder
I have tried to resolve the problem by following the steps below, but it hasn't had any effect on the error. Is it OK to leave this error as it is? If you know how to resolve it, please let me know.
Clear the Xcode cache: go to the path of the DerivedData folder and delete all of its contents.
Clean the build folder: select "Product" -> "Clean Build Folder" from the Xcode menu.
Restart: restart Xcode and the simulator.
Software update: make sure you are using the latest versions of Xcode and macOS.
Reinstallation: uninstall Xcode and reinstall it.
Reset the simulator.
I want to implement an AppIntent in my framework, which was created with Objective-C as its language. When I add a new Swift file to the framework and implement the AppIntent, then import the framework into my project, the AppIntent doesn't work. But I can use an AppIntent directly in the project.
I want to implement the AppIntent in my Objective-C framework because we need to provide a fast path to certain functionality.
I created a struct that conforms to AppIntent and created a projectName-Bridging-Header.h in my framework, just like I did in my project. It seems like the framework doesn't work; I can't find the intent in Shortcuts.
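One thing I'm wondering is whether the framework also needs to opt in via AppIntentsPackage so the host app can discover intents defined there; a minimal sketch of that idea, with hypothetical type names:

import AppIntents

// Hypothetical names. In the framework target: declare an App Intents package
// so intents defined in the framework can be discovered.
public struct MyFrameworkIntentsPackage: AppIntentsPackage {}

// In the app target: reference the framework's package so its intents are
// included in the app's App Intents metadata.
struct MyAppIntentsPackage: AppIntentsPackage {
    static var includedPackages: [any AppIntentsPackage.Type] {
        [MyFrameworkIntentsPackage.self]
    }
}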
I'm getting widespread reports from users trialling iOS 17.6 public beta that Siri Shortcuts are failing whenever they enter any text that looks like a URL.
It's getting reported to me because my app happens to have an app intent with a string parameter which can contain a URL in some circumstances.
However, it's easily reproducible outside of my app: just create a two-line shortcut like the one below. If you change "This is some text" to "https://www.apple.com", the shortcut will fail:
In iOS 17.5 entering "https://www.apple.com" works fine.
I've raised feedback on this (FB14206088) but can anyone confirm that this is indeed a bug and not some weird new feature of Shortcuts where the contents of a variable can somehow change the type of a variable?
It would be very, very bad if this were so.
I have followed the SoupChef example in migrating Custom Intents from SiriKit to AppIntents. However, we only require one iOS release back, so we can require iOS 17. Thus, I eliminated everything that was strictly for backwards compatibility, most notably the SiriKit Extension that required enormous amounts of code to try to coordinate with the real app which worked poorly anyway.
I tested for example that an NFC tag Automation created in Shortcuts works to execute an AppIntent while the app is backgrounded.
I am now receiving a beta report that indicates someone trying to execute one of our migrated AppIntents from their HomePod is not working, and they say it used to work sometimes (not all the time). I'm sure most such cases used to require the SiriKit Extension in the old SiriKit world. I am terrified that I may need to rebuild that monster once again when the new (to me) AppIntent API seemed so beautiful without it and seemed to work without it. The AppIntent API documentation seems to indicate that SiriKit Extensions are no longer related or required. What is the truth here? Do I need to re-implement everything twice in the SiriKit Extension like a barbarian, or can we live in the new world with AppIntents?
Thank you.
Hi,
I have an AppEnum and I'm trying to use a custom SF Symbol as the DisplayRepresentation.Image, but it's not working. I get a blank image when the AppEnum picker appears.
The custom SF Symbol is stored in the target's asset catalog, and it works in the target (e.g. Image(named: "custom_sfsymbol")). The issue occurs when I try to use it in a DisplayRepresentation:
static var caseDisplayRepresentations: [Self: DisplayRepresentation] = [
    .sample: DisplayRepresentation(
        title: "sample_title",
        image: DisplayRepresentation.Image(named: "custom_sfsymbol")
    ),
]
Can we use a custom SF Symbol in a DisplayRepresentation.Image?
We're using App Intents to launch and control our app via Siri. Siri's responses have been fairly random: some with a "Done" popup, others with a verbal confirmation, others saying "I'm sorry, there's been a problem". The latter is bogus and doesn't look good to potential investors when the app is actually working fine.
There appears to be no way in code that I've found so far to tell Siri to stay quiet and let us handle our own errors.
Otherwise, is there a means to supply Siri with a dictionary of pre-stored messages that could be triggered from inside the app?
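At least for the success case, it looks like returning an explicit dialog from perform() should let the app decide what Siri says instead of leaving it to chance; a minimal sketch with a hypothetical intent:

import AppIntents

// Hypothetical intent: conforming the result to ProvidesDialog and returning
// a dialog supplies the exact phrase Siri speaks or shows on success.
struct StartSceneIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Scene"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // ... do the real work in the app here ...
        return .result(dialog: "Okay, the scene is running.")
    }
}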
Hi,
I am working on creating an EntityPropertyQuery for my app entity. I want the user to be able to use Shortcuts to search by a property in a related entity, but I'm struggling with the syntax for that.
I know the documentation for 'EntityPropertyQuery' suggests this should be possible with a different initializer for the 'QueryProperty' that takes in an 'entityProvider', but I can't figure out how it works.
For example, my CJPersonAppEntity has 'emails', which is of type CJEmailAppEntity, which has a property 'emailAddress'. I want the user to be able to find the 'person' by looking up an email address.
When I try to provide this as a Property to filter by inside CJPersonAppEntityQuery, I get a syntax error:
static var properties = QueryProperties {
    Property(\CJPersonEmailAppEntity.$emailAddress, entityProvider: { person in
        person.emails // error
    }) {
        EqualToComparator { NSPredicate(format: "emailAddress == %@", $0) }
        ContainsComparator { NSPredicate(format: "emailAddress CONTAINS %@", $0) }
    }
}
The error says "Cannot convert value of type '[CJPersonEmailAppEntity]' to closure result type 'CJPersonEmailAppEntity'"
So it's not expecting an array, but an individual email item. But how do I provide that without running the predicate query that's specified in the closure?
So I tried something like this, just returning something without worrying about correctness:
Property(\CJPersonEmailAppEntity.$emailAddress, entityProvider: { person in
    person.emails.first ?? CJPersonEmailAppEntity() // satisfy compiler
}) {
    EqualToComparator { NSPredicate(format: "emailAddress == %@", $0) }
    ContainsComparator { NSPredicate(format: "emailAddress CONTAINS %@", $0) }
}
and it built the app, but it failed at a later step, 'Extracting app intents metadata':
error: Entity CJPersonAppEntity does not contain a property named emailAddress. Ensure that the property is wrapped with an @Property property wrapper
So I'm not sure what the correct syntax for handling this case is, and I can't find any other examples of how it's done. Would love some feedback for this.
I would like to split up my intents into smaller intents with more atomic pieces of functionality that I can then call one intent from another. For example:
struct SumValuesIntent: AppIntent {
    static var title: LocalizedStringResource { "Sum Values" }

    let a: Int
    let b: Int

    init(a: Int, b: Int) {
        self.a = a
        self.b = b
    }

    init() {
        self.init(a: 0, b: 0)
    }

    func perform() async throws -> some IntentResult {
        let sum = a + b
        print("SumValuesIntent:", sum)
        return .result(value: sum)
    }
}

struct PrintValueIntent: AppIntent {
    static var title: LocalizedStringResource { "Print Value" }

    let string: String

    init(string: String) {
        self.string = string
    }

    init() {
        self.init(string: "")
    }

    func perform() async throws -> some IntentResult {
        print("PrintValueIntent:", string)
        return .result()
    }
}
What is the best way to chain intents like these? I tried
    .result(opensIntent: PrintValueIntent(string: String(describing: sum)))
as the return value of SumValuesIntent.perform, but that doesn't seem to work. Then I tried
    try await PrintValueIntent(string: String(describing: sum)).perform()
and that works, but I'm not sure that's the correct way to do it.
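To frame the question another way: the alternative I'm weighing is to not nest intents at all, but to factor the shared work into a plain helper that both perform() implementations call; a minimal sketch with hypothetical names:

import AppIntents

// Hypothetical names: instead of one intent invoking another intent's
// perform(), both intents call a shared helper, so the atomic functionality
// lives outside the intent layer.
enum ValuePrinter {
    static func show(_ string: String) {
        print("ValuePrinter:", string)
    }
}

struct SumAndPrintIntent: AppIntent {
    static var title: LocalizedStringResource { "Sum and Print" }

    @Parameter(title: "A") var a: Int
    @Parameter(title: "B") var b: Int

    func perform() async throws -> some IntentResult & ReturnsValue<Int> {
        let sum = a + b
        ValuePrinter.show(String(describing: sum)) // shared helper, not a nested intent
        return .result(value: sum)
    }
}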