Hi, Apple Intelligence has been stuck on the "Preparing" step for 3 days and I don't know what I can do. Can you help me, please?
(macOS Sequoia 15.1 Beta 3, Mac mini M1)
Explore the power of machine learning and Apple Intelligence within apps. Discuss integrating features, share best practices, and explore the possibilities for your app here.
When building and running a fresh sample app with an @AssistantIntent on iOS 18.1 Beta 3, the app crashes immediately. Using a sample Assistant Intent from the developer documentation site reproduces this. Removing the @AssistantIntent macro allows the app to run. I'm using Xcode 16.1 beta.
dyld[1278]: Symbol not found: _$s10AppIntents15AssistantSchemaV06IntentD0VAC0E0AAWP Referenced from: <DC018008-EC0E-3251-AAFC-5DEB51863F17> /private/var/containers/Bundle/Application/2726C2CE-0255-4692-A7CA-B343146D4A83/Runner.app/Runner.debug.dylib Expected in: <E9AF073B-B6E0-31B8-88AA-092774CEEE3D> /System/Library/Frameworks/AppIntents.framework/AppIntents
(FB14949135)
Recently, deep learning projects have been getting larger, and sometimes loading models has become a bottleneck. I download the .mlpackage format CoreML from the internet and need to use compileModelAtURL to convert the .mlpackage into an .mlmodelc, then call modelWithContentsOfURL to convert the .mlmodelc into a handle. Generally, generating a handle with modelWithContentsOfURL is very slow. I noticed from WWDC 2023 that it is possible to cache the compiled results (see https://developer.apple.com/videos/play/wwdc2023/10049/?time=677, which states "This compilation includes further optimizations for the specific compute device and outputs an artifact that the compute device can run. Once complete, Core ML caches these artifacts to be used for subsequent model loads."). However, it seems that I couldn't find how to cache in the documentation.
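I couldn't find a public API for controlling that internal cache either, but the compile step itself can be cached manually so that subsequent launches skip compileModelAtURL entirely. A minimal sketch (the helper names and paths here are assumptions, not a documented pattern):

```swift
import CoreML
import Foundation

// Hypothetical helper: derive a stable cache location for the compiled model.
func compiledModelURL(for packageURL: URL, in cacheDirectory: URL) -> URL {
    let name = packageURL.deletingPathExtension().lastPathComponent + ".mlmodelc"
    return cacheDirectory.appendingPathComponent(name)
}

// Compile the .mlpackage only when no cached .mlmodelc exists, then load it.
func loadCachedModel(packageURL: URL, cacheDirectory: URL) throws -> MLModel {
    let cachedURL = compiledModelURL(for: packageURL, in: cacheDirectory)
    if !FileManager.default.fileExists(atPath: cachedURL.path) {
        // MLModel.compileModel(at:) writes the .mlmodelc to a temporary
        // location; move it into the cache so it survives across launches.
        let tempURL = try MLModel.compileModel(at: packageURL)
        _ = try FileManager.default.replaceItemAt(cachedURL, withItemAt: tempURL)
    }
    return try MLModel(contentsOf: cachedURL)
}
```

Keeping the .mlmodelc at a stable path also gives Core ML's own device-specific artifact cache a consistent key to hit on later loads, though that internal behavior isn't something the post or documentation confirms in detail.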
Apple Intelligence is here, but I have some problems. First of all, the Settings page often shows that something is being downloaded. Is this normal? Also, the Predictive Code Completion model in Xcode seems to have been suddenly deleted and needs to be re-downloaded, and the error "The operation couldn't be completed. (ModelCatalog.CatalogErrors.AssetErrors error 1.)" has occurred. Detailed log:
The operation couldn’t be completed. (ModelCatalog.CatalogErrors.AssetErrors error 1.)
Domain: ModelCatalog.CatalogErrors.AssetErrors
Code: 1
User Info: {
DVTErrorCreationDateKey = "2024-08-27 14:42:54 +0000";
}
--
Failed to find asset: com.apple.fm.code.generate_small_v1.base - no asset
Domain: ModelCatalog.CatalogErrors.AssetErrors
Code: 1
--
System Information
macOS Version 15.1 (Build 24B5024e)
Xcode 16.0 (23049) (Build 16A5230g)
Timestamp: 2024-08-27T22:42:54+08:00
I have updated to macOS Sequoia, but I do not see "Apple Intelligence & Siri" in Settings; I can only see Siri.
Xcode 15.3 AppIntentsSSUTraining warning: missing the definition of locale # variables.1.definitions
Hello!
I've noticed that adding localizations for AppShortcuts triggers the following warnings in Xcode 15.3:
warning: missing the definition of zh-Hans # variables.1.definitions
warning: missing the definition of zh-Hans # variables.2.definitions
This occurs with both legacy strings files and String Catalogs.
Example project: https://github.com/gongzhang/AppShortcutsLocalizationWarningExample
I'm trying to create an App Shortcut so that users can interact with one of my app's features using Siri. I would like to be able to turn this shortcut on or off at runtime using a feature toggle.
Ideally, I would be able to do something like this.
struct MyShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        // This shortcut is always available
        AppShortcut(
            intent: AlwaysAvailableIntent(),
            phrases: ["Show my always available intent with \(.applicationName)"],
            shortTitle: "Always Available Intent",
            systemImageName: "infinity"
        )
        // This shortcut is only available when "myCoolFeature" is available
        if FeatureProvider.shared.isAvailable("myCoolFeature") {
            AppShortcut(
                intent: MyCoolFeatureIntent(),
                phrases: ["Show my cool feature in \(.applicationName)"],
                shortTitle: "My Cool Feature Intent",
                systemImageName: "questionmark"
            )
        }
    }
}
However, this does not work because the existing buildOptional implementation is limited to components of type (any _AppShortcutsContentMarker & _LimitedAvailabilityAppShortcutsContentMarker)?.
All other attempts at making appShortcuts dynamic have resulted in shortcuts not working at all. I've tried:
Creating a makeAppShortcuts method that returns [AppShortcut] and invoking this method from within the appShortcuts
Extending AppShortcutsBuilder to support a buildOptional block that isn't restricted to a component type of (any _AppShortcutsContentMarker & _LimitedAvailabilityAppShortcutsContentMarker)?
Extending AppShortcutsBuilder to support buildArray and using compactMap(_:) to return an empty array when the feature is disabled
I haven't used SiriKit before but it appears that shortcut suggestions were set at runtime by invoking setShortcutSuggestions(_:), meaning that what I'm trying to do would be possible. I'm not against using SiriKit if I have to but my understanding is that the App Intents framework is meant to be a replacement for SiriKit, and the prompt within Xcode to replace older custom intents with App Intents indicates that that is indeed the case.
Is there something obvious that I'm just missing or is this simply not possible with the App Intent framework? Is the App Intent framework not meant to replace SiriKit and should I just use that instead?
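Until the builder supports arbitrary availability checks, one workaround is to leave the shortcut statically declared and gate the behavior inside the intent itself, returning an explanatory dialog when the feature is off. A minimal sketch, assuming a hypothetical FeatureProvider implementation standing in for the one in the post:

```swift
import AppIntents

// Hypothetical feature-flag store standing in for the post's FeatureProvider.
final class FeatureProvider {
    static let shared = FeatureProvider()
    private var enabled: Set<String> = []
    func setAvailable(_ feature: String, _ on: Bool) {
        if on { enabled.insert(feature) } else { enabled.remove(feature) }
    }
    func isAvailable(_ feature: String) -> Bool { enabled.contains(feature) }
}

struct MyCoolFeatureIntent: AppIntent {
    static var title: LocalizedStringResource = "Show My Cool Feature"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // The shortcut stays registered with the system; the gate lives here.
        guard FeatureProvider.shared.isAvailable("myCoolFeature") else {
            return .result(dialog: "This feature isn't available right now.")
        }
        // … run the real feature …
        return .result(dialog: "Showing my cool feature.")
    }
}
```

The trade-off is that the phrase remains visible to Siri even when the feature is disabled; the intent simply declines politely instead of disappearing.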
Hello, I have macOS 15.1 on my MacBook Air M2 and it runs amazingly well, with fine battery life too. I am aware that a beta has problems, but Apple Intelligence cannot be activated properly. I live in the EU, but many people still got it, provided that they really set everything to American; unfortunately, nothing happened for me despite the very long wait. I got onto the waiting list, and it stops at "Preparing" for almost a whole day (about 24 hours). I also restarted the waiting list by changing region, but with no success.
Hello,
I still have problems activating Apple Intelligence, even on Beta 2. I had recently written a post here, but had to make a new one because I accidentally marked the old post as solved. I tried restarting everything and reinstalling macOS; it always gets stuck at "Preparing", and no download is visible in storage. I even tried it with a VPN. I constantly use a mobile hotspot; could that be the cause? Unfortunately, I don't have Wi-Fi at home, only an unlimited mobile data contract.
MacBook Air M2 with 170 GB of free storage.
I wanted to join Apple Intelligence after I updated my iPhone 15 Pro to iOS 18.1 beta, but it still shows that I'm on the waitlist. It has been almost one day! Why? Is this normal?
I can successfully train an ActionClassifier using CreateML. However, I get crashes when I attempt to do the same asynchronously.
The model parameters and training data sources are the same in both cases:
let modelParameters = MLActionClassifier.ModelParameters(validation: validationDataSet, batchSize: 5, maximumIterations: 10, predictionWindowSize: 120, targetFrameRate: 30)
let trainingDataSource = MLActionClassifier.DataSource.directoryWithVideosAndAnnotation(at: myStudyParticipantURLFinal, annotationFile: documentURLFinal, videoColumn: "file", labelColumn: "category", startTimeColumn: "startTime", endTimeColumn: "endTime")
The only thing I add to attempt asynchronous training is sessionParameters:
let sessionDirectory = URL(fileURLWithPath: "\(NSHomeDirectory())/test")
// Session parameters can be provided to the `train` method.
let sessionParameters = MLTrainingSessionParameters(
    sessionDirectory: sessionDirectory,
    reportInterval: 10,
    checkpointInterval: 100,
    iterations: 10
)
These are passed to the final train method:
let trainJob = try MLActionClassifier.train(trainingData: trainingDataSource, parameters: modelParameters, sessionParameters: sessionParameters)
The job crashes saying it cannot find plist files. I notice that only one plist file is written: meta.plist
It seems there should also be a parameters.plist written, but it is not there.
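One thing worth checking before digging further (a guess based on the symptoms, not a confirmed fix): make sure the session directory actually exists before calling train, since meta.plist, parameters.plist, and the checkpoints are all written there. A Foundation-only sketch of the setup:

```swift
import Foundation

// Create the session directory up front so MLTrainingSessionParameters has a
// valid location to write meta.plist, parameters.plist, and checkpoints into.
let sessionDirectory = URL(fileURLWithPath: NSHomeDirectory())
    .appendingPathComponent("test", isDirectory: true)
try FileManager.default.createDirectory(at: sessionDirectory,
                                        withIntermediateDirectories: true)
```

If the directory exists and the crash persists, the partial meta.plist suggests the session setup itself is failing mid-write, which would be worth a feedback report with the crash log attached.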
The action “bla bla” could not run because of an internal error, and only on one specific phone.
I have tried updating the Shortcuts app and restarting my phone, but nothing works.
Hi,
I am working on creating a EntityPropertyQuery for my App entity. I want the user to be able to use Shortcuts to search by a property in a related entity, but I'm struggling with how the syntax for that looks.
I know the documentation for 'EntityPropertyQuery' suggests that this should be possible with a different initializer for the 'QueryProperty' that takes in a 'entityProvider' but I can't figure out how it works.
For example, my CJPersonAppEntity has 'emails' of type CJEmailAppEntity, which has a property 'emailAddress'. I want the user to be able to find the 'person' by looking up an email address.
When I try to provide this as a Property to filter by inside CJPersonAppEntityQuery, I get a syntax error:
static var properties = QueryProperties {
    Property(\CJPersonEmailAppEntity.$emailAddress, entityProvider: { person in
        person.emails // error
    }) {
        EqualToComparator { NSPredicate(format: "emailAddress == %@", $0) }
        ContainsComparator { NSPredicate(format: "emailAddress CONTAINS %@", $0) }
    }
}
The error says "Cannot convert value of type '[CJPersonEmailAppEntity]' to closure result type 'CJPersonEmailAppEntity'"
So it's not expecting an array, but an individual email item. But how do I provide that without running the predicate query that's specified in the closure?
So I tried something like this, just returning something without worrying about correctness:
Property(\CJPersonEmailAppEntity.$emailAddress, entityProvider: { person in
    person.emails.first ?? CJPersonEmailAppEntity() // satisfy compiler
}) {
    EqualToComparator { NSPredicate(format: "emailAddress == %@", $0) }
    ContainsComparator { NSPredicate(format: "emailAddress CONTAINS %@", $0) }
}
and it built the app, but it failed at the 'Extracting app intents metadata' build step:
error: Entity CJPersonAppEntity does not contain a property named emailAddress. Ensure that the property is wrapped with an @Property property wrapper
So I'm not sure what the correct syntax for handling this case is, and I can't find any other examples of how it's done. Would love some feedback for this.
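One workaround that sidesteps the cross-entity lookup entirely (a sketch under assumptions, not a confirmed API pattern): flatten the related values onto the parent entity as an @Property, so EntityPropertyQuery can filter on it directly. The `emailAddresses` property, the `name` property, and the initializer below are all invented for illustration:

```swift
import AppIntents

struct CJPersonAppEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Person"
    static var defaultQuery = CJPersonAppEntityQuery()

    var id: String

    @Property(title: "Name")
    var name: String

    // Flattened from the related email entities, so a query comparator can
    // match on it directly instead of reaching through the relationship.
    @Property(title: "Email Addresses")
    var emailAddresses: [String]

    init(id: String, name: String, emailAddresses: [String]) {
        self.id = id
        self.name = name
        self.emailAddresses = emailAddresses
    }

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}

struct CJPersonAppEntityQuery: EntityQuery {
    func entities(for identifiers: [CJPersonAppEntity.ID]) async throws -> [CJPersonAppEntity] {
        [] // look up people by identifier here
    }
}
```

This trades a denormalized property for a query the metadata extractor understands, since the property now genuinely lives on CJPersonAppEntity.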
Apple's sample code 'Trails' supports multiple scenes, but everything uses shared state across the scenes. Put the app in Split View mode with two windows of the app running and navigate: both mirror each other. This works as designed, since it uses a shared 'navigation model' across all scenes.
https://developer.apple.com/documentation/appintents/acceleratingappinteractionswithappintents
I would like to know if there is a supported or recommended way to modify individual scene storage from within the perform body of an AppIntent. The objective is to have App Shortcuts that launch different tabs in a TabView or different selections in a List.
In short, I want to deep link to features, but account for more than one scene being open on iPad and only have programmatic navigation happen on the scene that is 'foremost' or the 'activated' one in Split View.
I have it working with either a @Dependency or posting a Notification with my main ContentView listening to the other end, but it changes all scenes.
I've been running Sequoia 15.1 since it was released. Soon thereafter I was taken off the waitlist, and I had been using Apple Intelligence until this morning.
My first hint something was wrong was that Writing Tools, which I'd been using extensively, disappeared. I tried in another app, and it wasn't there, either.
I then looked at the Siri icon in my menu bar - which looks different under Apple Intelligence - and it had been reverted to the old icon.
I then checked my Apple Intelligence settings and, sure enough, not only was it off, but I'd been returned to the waitlist.
My iOS and iPadOS devices continue working just fine with Apple Intelligence. Only my MacBook Pro is experiencing this issue.
Has anyone else seen this?
func testMLTensor() async {
    // Note: Array(repeating:) evaluates Float.random(in:) once, so every
    // scalar is identical; use a map if distinct random values are needed.
    let t1 = MLTensor(shape: [2000, 1], scalars: [Float](repeating: Float.random(in: 0.0...10.0), count: 2000), scalarType: Float.self)
    let t2 = MLTensor(shape: [1, 3000], scalars: [Float](repeating: Float.random(in: 0.0...10.0), count: 3000), scalarType: Float.self)
    for _ in 0...50 {
        let t = Date()
        let x = t1 * t2
        // Materialize the result; MLTensor operations execute lazily, so
        // timing the multiplication alone measures only graph construction.
        _ = await x.shapedArray(of: Float.self)
        // timeIntervalSinceNow is negative for past dates, hence the minus.
        print("MLTensor", -t.timeIntervalSinceNow * 1000, "ms")
    }
}

await testMLTensor()
The above code took more time than expected, especially in the early stage of iteration.
It's been over 2 hours and my connection is fine. Already tried restarting a couple of times
Well, hello there.
Correct me if I'm wrong: will Apple Intelligence be available only to US developers, so that if you want to use it in your app you have to move to the US?
Because right now, not only is its usage limited to the US, but every new API usage is prohibited too.