What KPI requirements might Apple Intelligence impose on cellular networks, e.g. regarding latency, throughput, or jitter?
What is the expected effect on iPhone energy consumption?
Apple Intelligence
Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.
Trying to experiment with Genmoji per the WWDC documentation and samples, but I don't seem to get the Genmoji keyboard.
I see this error in my log:
Received port for identifier response: <(null)> with error:Error Domain=RBSServiceErrorDomain Code=1 "Client not entitled" UserInfo={RBSEntitlement=com.apple.runningboard.process-state,
NSLocalizedFailureReason=Client not entitled, RBSPermanent=false}
elapsedCPUTimeForFrontBoard couldn't generate a task port
Is anything presently supported for developers? All I have done here is a simple app with a UITextView and code for:
textView.supportsAdaptiveImageGlyph = true
Any thoughts?
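For reference, a minimal UIKit setup that should be enough to surface the Genmoji option on devices with Apple Intelligence enabled (a sketch: the view controller and layout are illustrative; only `supportsAdaptiveImageGlyph` comes from the post above):

```swift
import UIKit

final class GlyphViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(textView)
        textView.frame = view.bounds.insetBy(dx: 16, dy: 80)
        // Genmoji are delivered as NSAdaptiveImageGlyph attachments,
        // so the text view must opt in to adaptive image glyphs.
        if #available(iOS 18.0, *) {
            textView.supportsAdaptiveImageGlyph = true
        }
    }
}
```

Note that the RBSServiceErrorDomain "Client not entitled" message is frequently reported as harmless RunningBoard log noise, so it may be unrelated to whether the Genmoji keyboard appears.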
I’m currently developing an app that features a main view with a UITableView. When users select a row, they are navigated to a detail view that contains a UITextField. This UITextField already supports Writing Tools.
My question is: when a user long-presses a UITableView cell, is it possible to add a Writing Tools option to the context menu so users can interact with Writing Tools more conveniently, for example to summarize the cell's detail text?
I'm trying to disable Writing Tools for a specific TextField using .writingToolsBehavior(.disabled), but when running the app on my iPhone 16 Pro with Apple Intelligence enabled, I can still use Writing Tools on the text box. I also see no difference with .writingToolsBehavior(.limited).
Is there something I'm doing wrong or is this a bug?
Sample code below:
import SwiftUI

struct ContentView: View {
    @State var text = ""

    var body: some View {
        VStack {
            TextField("Enter Text", text: $text)
                .writingToolsBehavior(.disabled)
        }
        .padding()
    }
}

#Preview {
    ContentView()
}
I signed up for Apple Intelligence on my iPad Air (M1) and then updated my phone today. It tells me that I've already been put on a waitlist I didn't even join, and it's been stuck there for two days now.
I have an iPhone 15 Pro Max on iOS 18.1. I tried to join Apple Intelligence about three weeks ago, but I'm still stuck on the waitlist. I've already tried everything recommended by Apple, including changing my region and Siri's language to English (US). Can anyone help me figure out how to solve this issue?
Xcode Version 16.0 (16A242d)
iOS18 - Swift
There seems to be a behavior change on iOS 18 when using AppShortcuts and AppIntents to pass string parameters. After Siri prompts for a string parameter via requestValueDialog, the string is passed if the user makes a statement. If the user's statement is a question, however, the string is not sent to the AppIntent; instead, Siri attempts to answer that question.
Example Code:
struct MyAppNameShortcuts: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AskQuestionIntent(),
            phrases: [
                "Ask \(.applicationName) a question",
            ]
        )
    }
}

struct AskQuestionIntent: AppIntent {
    static var title: LocalizedStringResource = .init(stringLiteral: "Ask a question")
    static var openAppWhenRun: Bool = false

    static var parameterSummary: some ParameterSummary {
        Summary("Search for \(\.$query)")
    }

    @Dependency
    private var apiClient: MockApiClient

    @Parameter(title: "Query", requestValueDialog: .init(stringLiteral: "What would you like to ask?"))
    var query: String

    // perform() is not called if the user asks a question such as
    // "What color is the moon?" in response to requestValueDialog.
    // On iOS 17, the same string is passed through.
    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog & ShowsSnippetView {
        print("Query is: \(query)")
        let queryResult = try await apiClient.askQuery(queryString: query)
        let dialog = IntentDialog(
            full: .init(stringLiteral: queryResult.answer),
            supporting: .init(stringLiteral: "The answer to \(queryResult.question) is...")
        )
        let view = SiriAnswerView(queryResult: queryResult)
        return .result(dialog: dialog, view: view)
    }
}
Given the above mock code:

iOS 17:
1. "Hey Siri"
2. "Ask (AppName) a question"
3. Siri responds "What would you like to ask?"
4. Say "What color is the moon?"
5. The string "What color is the moon?" is passed to the AppIntent.

iOS 18:
1. "Hey Siri"
2. "Ask (AppName) a question"
3. Siri responds "What would you like to ask?"
4. Say "What color is the moon?"
5. Siri answers the question "What color is the moon?" itself.
6. Follow the steps above again and instead reply "Moon".
7. "Moon" is passed to the AppIntent.

Basically, any interrogative string parameter seems to be intercepted and handled by Siri proper rather than being passed to the provided AppIntent on iOS 18.
Yesterday, after updating to iOS 18.1, I joined the Apple Intelligence waitlist on my iPhone 15 Pro. About an hour later I noticed the message "Support for processing Apple Intelligence on device is downloading." A day later it is still displaying the same message. I have strong Wi-Fi, I'm plugged in to power with a full battery, and there are 750 GB available in storage. From what I've been able to find online, this isn't the typical experience, and it probably isn't going to complete the process at this point. Any advice on how to proceed and get Apple Intelligence installed and working would be greatly appreciated.
I just updated to 18.1 beta 4 and the only Apple Intelligence features I can use are the new Siri and photo Clean Up. The rest are unavailable. Apple Intelligence takes up 589 MB on my iPhone.
Hey, just curious whether Apple Intelligence will be available on the iPhone 15 Plus as well in October, or whether there is a way for iPhone 15 Plus owners to join the Apple Intelligence waitlist. Please let me know!
I need to add the AI Image Playground to my iOS app with UIKit. WWDC 2024 introduced a new Image Playground API, but I haven't found any official documentation yet. How can I add it?
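For what it's worth, the iOS 18.1 SDK exposes this through the ImagePlayground framework. A sketch of presenting the sheet from UIKit follows (API names are from the 18.1 SDK as I understand it — verify against the headers; the concept text is just an example):

```swift
import UIKit
import ImagePlayground

@available(iOS 18.1, *)
final class PlaygroundHostViewController: UIViewController, ImagePlaygroundViewController.Delegate {

    func showImagePlayground() {
        // The sheet is only usable on Apple Intelligence-capable devices.
        guard ImagePlaygroundViewController.isAvailable else { return }
        let playground = ImagePlaygroundViewController()
        playground.delegate = self
        // Optionally seed the playground with a starting concept.
        playground.concepts = [.text("sunny beach")]
        present(playground, animated: true)
    }

    // Called with a temporary file URL for the generated image.
    func imagePlaygroundViewController(_ controller: ImagePlaygroundViewController,
                                       didCreateImageAt imageURL: URL) {
        dismiss(animated: true)
    }

    func imagePlaygroundViewControllerDidCancel(_ controller: ImagePlaygroundViewController) {
        dismiss(animated: true)
    }
}
```

SwiftUI apps can use the `imagePlaygroundSheet` modifier instead of presenting the view controller directly.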
As a user, when viewing a photo or image, I want to be able to tell Siri, "add this to ", similar to the example from the WWDC presentation where a photo is added to a note in the Notes app.
Is this possible with app intent domains as they are documented?
I see domains like open-file and open-photo, but I don't know if those are appropriate for this kind of functionality.
I've got Apple Intelligence working on my iPhone 15 Pro Max, and Siri 2.0 works as expected; however, the options below don't seem to be working or appearing:
AI in Mail
AI in Notes
Clean Up in Photos (stuck on downloading)
Not sure if my setup is wrong or it's just not available for me yet.
Hi,
I have an existing app with AppEntities defined that works on iOS 16 and iOS 17. The AppEntities also have an EntityPropertyQuery defined, so they work as 'find intents'. I want to use the new @AssistantEntity on iOS 18 while supporting the previous versions. What's the best way to do this?
For example, I have a 'person' AppEntity:
@available(iOS 16.0, macOS 13.0, watchOS 9.0, tvOS 16.0, *)
struct CJLogAppEntity: AppEntity {
    static var defaultQuery = CJLogAppEntityQuery()
    ...
}

struct CJLogAppEntityQuery: EntityPropertyQuery {
    ...
}
How do I adopt this with @AssistantEntity(schema: .journal.entry) for iOS18, while maintaining compatibility with iOS16 and 17?
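One approach (a sketch, not verified against the final SDK; `CJLogAssistantEntity` is a hypothetical name, and the property list follows the `.journal.entry` schema as shown in the WWDC24 session) is to leave the existing entity untouched for iOS 16/17 and add a parallel, availability-gated type that adopts the macro:

```swift
import AppIntents
import CoreLocation

// The original CJLogAppEntity and its EntityPropertyQuery keep
// serving iOS 16/17; this type exists only on iOS 18 and later.
@available(iOS 18.0, *)
@AssistantEntity(schema: .journal.entry)
struct CJLogAssistantEntity {
    static var defaultQuery = Query()

    let id: UUID

    // Properties expected by the .journal.entry schema (assumed
    // from the WWDC24 sample; the macro validates them at compile time).
    var title: String?
    var message: AttributedString?
    var mediaItems: [IntentFile]
    var entryDate: Date?
    var location: CLPlacemark?

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title ?? "Entry")")
    }

    struct Query: EntityStringQuery {
        func entities(for identifiers: [CJLogAssistantEntity.ID]) async throws -> [CJLogAssistantEntity] { [] }
        func entities(matching string: String) async throws -> [CJLogAssistantEntity] { [] }
    }
}
```

Any intents that surface the new entity would then be gated with `@available(iOS 18.0, *)` as well, so the iOS 16/17 code path continues to use CJLogAppEntity unchanged.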
I am checking actual behavior on iOS 18.1 beta 3 devices, but the following items are not functioning:
Image Playground
Image Wand
Genmoji
Please let me know:
Are the above three items available on iOS 18.1 beta 3?
If available, are any operations other than enabling Apple Intelligence required to use the features?
Hello. Mac mini M1 (2020), macOS 15.1 (24B5035e), Apple Intelligence enabled, but there is no Clean Up function in the Photos app. Is that normal in your opinion? Thank you.
I am working on an app that would refine text the user wrote without the user having to select the text and then interact with the options.
For example, one use case is where the user talks into the microphone and dictates text, which is refined immediately.
Is this something Apple Intelligence or Writing Tools can assist with?
I just installed iOS 18.1 Beta 3 on my iPad M4 (I was previously on 18.0 betas).
I did the same thing on my iPhone 15 Pro Max, which works perfectly.
However on the iPad, it seems to be stuck on 99% and won't complete downloading.
The status message near the top keeps switching between "downloading" and "will continue later on WiFi".
Note: I'm connected to my home Wi-Fi, which is very fast, and my iPhone was on the same network and downloaded quickly without issue.
Is there a way to reset and start again since it's stuck? This is really frustrating.
This has been going on for several hours at this point.
Hi, Apple Intelligence has been stuck on the preparing step for three days and I don't know what to do. Can you help me, please?
(macOS Sequoia 15.1 Beta 3, Mac mini M1)
When building and running a fresh sample app with an @AssistantIntent on iOS 18.1 Beta 3, the app immediately crashes. Using a sample Assistant Intent from the developer documentation site reproduces this. Removing the @AssistantIntent macro allows the app to run. Using Xcode 16.1 beta.
dyld[1278]: Symbol not found: _$s10AppIntents15AssistantSchemaV06IntentD0VAC0E0AAWP Referenced from: <DC018008-EC0E-3251-AAFC-5DEB51863F17> /private/var/containers/Bundle/Application/2726C2CE-0255-4692-A7CA-B343146D4A83/Runner.app/Runner.debug.dylib Expected in: <E9AF073B-B6E0-31B8-88AA-092774CEEE3D> /System/Library/Frameworks/AppIntents.framework/AppIntents
(FB14949135)