So, I was organizing my Home Screen. Then I lost track of where I was and planned to just move an app to the App Library, but instead I deleted it. How do I get it back?
Apple Intelligence
Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.
I'm using an iPhone 15 Pro in Australia on iOS 18.2 beta 1, but there's no mention of Apple Intelligence anywhere on the phone. Settings looks identical, without the Apple Intelligence menu, and there's no Image Playground image generation. It's as if my phone doesn't know it exists.
I wanted to join the Apple Intelligence waitlist after I updated my iPhone 15 Pro to the iOS 18.1 beta, but it still shows that I'm on the waitlist. It has been almost one day! Why? Is this normal?
Hello everyone,
I hope you’re all doing well. I’m not a developer, but I have an idea for an iOS app that I’d love to get your thoughts on. I wanted to share it here to gather feedback from this knowledgeable community and to learn from your expertise.
Idea Overview: Real-Time AI Running Coach for iOS
The concept is an iOS application that provides personalized, real-time running coaching by leveraging on-device data sources and Apple’s latest technologies. The app aims to offer an adaptive and motivating running experience while ensuring user privacy through on-device processing.
Key Features:
• Personalized Coaching:
• Utilize real-time biometric data and personal insights to deliver AI-driven coaching tailored to the user’s mental and physical state.
• Analyze health metrics, activity data, mood check-ins, and more to provide context-based motivational feedback.
• Privacy First:
• All data processing occurs on-device using Apple’s frameworks like Core ML, ensuring no personal data leaves the device.
• Adaptive Motivation:
• Implement Natural Language Processing to analyze user inputs like journal entries or mood check-ins.
• Generate personalized coaching cues based on historical performance and mood trends.
• Performance Enhancement:
• Offer dynamic adjustments to pace, route, and strategy in real time to help improve running performance.
• Seamless integration with Apple Watch for real-time data collection and haptic feedback.
Technologies and Frameworks Involved:
• HealthKit: Access health metrics such as heart rate, distance run, VO₂ max, sleep patterns, etc.
• Core ML: On-device machine learning for real-time data analysis without latency.
• Natural Language Processing: Analyze personal inputs for better coaching personalization.
• Core Motion & Core Location: Track motion data and location services for runs.
• AVFoundation & Speech: Provide real-time voice feedback and coaching cues.
• SiriKit Integration: Allow users to initiate workouts and receive updates via Siri.
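To make the real-time coaching loop concrete, here is a minimal sketch that combines two of the frameworks listed above: Core Motion for live pace and AVFoundation/Speech for spoken cues. The class name, target pace, and cue wording are illustrative assumptions, not part of the proposal or a working product.

import CoreMotion
import AVFoundation

// Minimal sketch: read live pace from the pedometer and speak a coaching cue.
// Requires NSMotionUsageDescription in Info.plist.
final class RunCoachSketch {
    private let pedometer = CMPedometer()
    private let speech = AVSpeechSynthesizer()
    private let targetPaceSecondsPerMeter = 0.36   // hypothetical ~6:00 min/km target

    func startRun() {
        guard CMPedometer.isPaceAvailable() else { return }
        pedometer.startUpdates(from: Date()) { [weak self] data, error in
            guard let self, error == nil,
                  let pace = data?.currentPace?.doubleValue else { return }
            // currentPace is in seconds per meter; a larger value means a slower runner.
            let cue = pace > self.targetPaceSecondsPerMeter
                ? "Pick up the pace a little."
                : "Great pace, keep it steady."
            self.say(cue)
        }
    }

    func stopRun() {
        pedometer.stopUpdates()
    }

    private func say(_ text: String) {
        guard !speech.isSpeaking else { return }   // don't talk over the previous cue
        speech.speak(AVSpeechUtterance(string: text))
    }
}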
Target Audience:
• Runners of all levels seeking personalized coaching that adapts to their mental and physical states.
• Users who prioritize privacy and want AI-driven insights without their data leaving the device.
• Tech-savvy fitness enthusiasts who use iOS devices and Apple wearables.
Questions for the Community:
1. Feasibility: Is this idea technically achievable using current iOS frameworks and technologies?
2. Data Access: Are there limitations in accessing and processing the necessary data on-device, especially regarding privacy and permissions?
3. Potential Challenges: What hurdles might developers face in creating such an app, and how could they be addressed?
4. Advice: As someone without a technical background, what steps would you recommend I take to move this idea forward?
I truly appreciate any feedback or insights you can provide. I’m excited about the potential of this idea but also aware there may be complexities I’m not considering. Thank you for taking the time to read this!
Best regards,
Paul
Hello,
I’m working on a program that analyzes video files frame by frame to detect human poses in each frame. However, during the process of reading observations from the stream, the analysis frequently stops with the following error:
[LOG_ERROR] /Library/Caches/com.apple.xbs/Sources/MediaAnalysis/VideoProcessing/VCPHumanPoseImageRequest.mm[85]: code -18
[LOG_ERROR] /Library/Caches/com.apple.xbs/Sources/MediaAnalysis/VideoProcessing/VCPHumanPoseImageRequest.mm[178]: code -18
The error was caught and printed using a do-catch block, and here is the output:
Error Domain=NSOSStatusErrorDomain Code=-18 "Error: failed to processImage" UserInfo={NSLocalizedDescription=Error: failed to processImage}
While the do-catch block helps prevent the app from crashing, the frames following the error cannot be analyzed.
I’m hoping to understand the cause of this error, or find a way to skip the problematic frames and continue analyzing the subsequent ones.
My development environment is Xcode Version 16.0 (16A242d) and iOS 18.0.
Thank you for your help. (Attaching my code below.)
import AVFoundation
import Vision

let videoProcessor = VideoProcessor(videoURL)
let bodyPoseRequest = DetectHumanBodyPoseRequest()
let asset = AVURLAsset(url: videoURL)
let videoTrack = try await asset.loadTracks(withMediaType: .video).first

// Add the body-pose request and start streaming observations for the whole video.
let bodyPoseStream = try await videoProcessor.addRequest(bodyPoseRequest)
videoProcessor.startAnalysis()

do {
    for try await observations in bodyPoseStream {
        guard let observation = observations.first else { continue }
        if let timeRange = observation.timeRange {
            // do something...
        }
    }
} catch {
    // Once the stream throws (code -18 here), iteration ends and later frames are not delivered.
    print("\(error.localizedDescription)")
}
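One possible direction, sketched below as an untested alternative rather than a fix for the stream error: decode the frames yourself with AVAssetReader and run the classic VNDetectHumanBodyPoseRequest on each frame, so every frame gets its own do-catch and a failing frame can simply be skipped. The function name is illustrative.

import AVFoundation
import Vision

// Sketch: per-frame pose detection so a single failing frame doesn't end the analysis.
func analyzePosesSkippingBadFrames(videoURL: URL) async throws {
    let asset = AVURLAsset(url: videoURL)
    guard let track = try await asset.loadTracks(withMediaType: .video).first else { return }

    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    )
    reader.add(output)
    guard reader.startReading() else { return }

    while let sampleBuffer = output.copyNextSampleBuffer() {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { continue }
        let request = VNDetectHumanBodyPoseRequest()
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        do {
            try handler.perform([request])
            if let observation = request.results?.first {
                let time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
                print("Pose detected at \(time.seconds)s with \(observation.availableJointNames.count) joints")
            }
        } catch {
            // A failure here only affects this frame; keep going with the next one.
            print("Skipping frame: \(error.localizedDescription)")
        }
    }
}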
With Apple Intelligence enabled I am seeing a 10-15 second delay in push notifications.
This is causing issues with my alarm system, doorbells, and 2FA requests.
I'm using an iPhone 16 Pro.
I first noticed this after installing the iOS 18.1 developer beta 5 and enabling Apple Intelligence.
It continued when beta 6 was released.
I filed FB15389900.
Can vector embeddings be used in a SwiftData Model?
If so, are there resources available to learn more about it, or at least a guide on how to make it work?
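For what it's worth, here is a minimal sketch under the assumption that the embedding is stored as a plain [Float] property on the model (SwiftData can persist arrays of simple Codable values) and that similarity search is done in memory after fetching, since SwiftData has no built-in vector index. The model and function names are illustrative.

import Foundation
import SwiftData

// A SwiftData model that carries its embedding alongside the text it represents.
@Model
final class NoteChunk {
    var text: String
    var embedding: [Float]   // e.g. the output of an embedding model

    init(text: String, embedding: [Float]) {
        self.text = text
        self.embedding = embedding
    }
}

// Cosine similarity computed in app code; fetch candidates and rank them manually.
func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
    guard a.count == b.count, !a.isEmpty else { return 0 }
    var dot: Float = 0, normA: Float = 0, normB: Float = 0
    for i in a.indices {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    let denom = normA.squareRoot() * normB.squareRoot()
    return denom == 0 ? 0 : dot / denom
}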
Why is it taking so long to download Apple Intelligence on my iPhone 15 Pro Max?
How long should the download take?
I'm on 5G, and it has now been downloading for 4 weeks.
Every time I tap "Join the waitlist," there is no reaction.
My phone just redirects me back to this page.
Why does Apple sell the iPhone 16 with Apple Intelligence that won't work in Europe? You do not sell a car without an engine. Wouldn't it have been better to wait to bring it to Europe?
I don't see the Apple Intelligence tab, even with iOS 18.1, language and region set to United States, and Siri in English (US). What can I do?
Xcode Version 16.0 (16A242d)
iOS18 - Swift
There seems to be a behavior change on iOS 18 when using AppShortcuts and AppIntents to pass string parameters. After Siri prompts for a string property via requestValueDialog, the string is passed if the user makes a statement. If the user's statement is a question, however, the string is not sent to the AppIntent; instead, Siri attempts to answer that question.
Example Code:
import AppIntents

struct MyAppNameShortcuts: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AskQuestionIntent(),
            phrases: [
                "Ask \(.applicationName) a question",
            ]
        )
    }
}

struct AskQuestionIntent: AppIntent {
    static var title: LocalizedStringResource = .init(stringLiteral: "Ask a question")
    static var openAppWhenRun: Bool = false

    static var parameterSummary: some ParameterSummary {
        Summary("Search for \(\.$query)")
    }

    @Dependency
    private var apiClient: MockApiClient

    @Parameter(title: "Query", requestValueDialog: .init(stringLiteral: "What would you like to ask?"))
    var query: String

    // perform() is not called if the user asks a question such as "What color is the moon?"
    // in response to requestValueDialog. On iOS 17, the same string is passed through.
    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog & ShowsSnippetView {
        print("Query is: \(query)")
        let queryResult = try await apiClient.askQuery(queryString: query)
        let dialog = IntentDialog(
            full: .init(stringLiteral: queryResult.answer),
            supporting: .init(stringLiteral: "The answer to \(queryResult.question) is...")
        )
        let view = SiriAnswerView(queryResult: queryResult)
        return .result(dialog: dialog, view: view)
    }
}
Given the above mock code:
iOS 17:
1. "Hey Siri"
2. "Ask (AppName) a question"
3. Siri responds "What would you like to ask?"
4. Say "What color is the moon?"
5. The string "What color is the moon?" is passed to the AppIntent.
iOS 18:
1. "Hey Siri"
2. "Ask (AppName) a question"
3. Siri responds "What would you like to ask?"
4. Say "What color is the moon?"
5. Siri answers the question "What color is the moon?" itself.
6. Follow the steps above again and instead reply "Moon".
7. "Moon" is passed to the AppIntent.
Basically, on iOS 18 any interrogative string parameter seems to be intercepted and handled by Siri itself rather than passed to the provided AppIntent.
Hey guys,
If your download is still stuck at 99% even after switching Wi-Fi on and off, I believe I have a fix.
Turn your cellular data completely off. This will force the device to use only Wi-Fi.
Then the download will restart and actually complete.
Trying to experiment with Genmoji per the WWDC documentation and samples, but I don't seem to get the Genmoji keyboard.
I see this error in my log:
Received port for identifier response: <(null)> with error:Error Domain=RBSServiceErrorDomain Code=1 "Client not entitled" UserInfo={RBSEntitlement=com.apple.runningboard.process-state,
NSLocalizedFailureReason=Client not entitled, RBSPermanent=false}
elapsedCPUTimeForFrontBoard couldn't generate a task port
Is anything presently supported for developers? All I have done here is a simple app with a UITextView and this code:
textView.supportsAdaptiveImageGlyph = true
Any thoughts?
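For reference, a minimal configuration sketch is below. It assumes a device and region where Apple Intelligence and Genmoji are enabled; whether this alone surfaces the Genmoji option in the emoji keyboard is not guaranteed.

import UIKit

// Sketch of the assumed prerequisites: an editable text view that accepts
// rich text, since Genmoji arrive as NSAdaptiveImageGlyph attachments.
final class GenmojiTextViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.isEditable = true
        textView.allowsEditingTextAttributes = true
        textView.supportsAdaptiveImageGlyph = true
        view.addSubview(textView)
    }
}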
As a user, when viewing a photo or image, I want to be able to tell Siri, "add this to ", similar to the example from the WWDC presentation where a photo is added to a note in the Notes app.
Is this... possible with app intent domains as they are documented?
I see domains like open-file and open-photo, but I don't know whether those are appropriate for this kind of functionality.
I’m currently developing an app that features a main view with a UITableView. When users select a row, they are navigated to a detail view that contains a UITextField. This UITextField already supports Writing Tools.
My question is: when a user long-presses a UITableView cell, is it possible to add a Writing Tools option to the context menu so users can interact with Writing Tools more conveniently, for example to summarize the cell's detail text?
I need to add the AI Image Playground to my iOS app with UIKit. WWDC 2024 introduced a new Image Playground API, but I haven't found any official documentation yet, so how can I add it?
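For context, here is a hedged sketch of presenting the Image Playground sheet from UIKit using the ImagePlayground framework (iOS 18.1+). The delegate method names should be verified against the SDK headers; treat this as an illustration rather than confirmed sample code, and the prompt text is purely an example.

import UIKit
import ImagePlayground

@available(iOS 18.1, *)
final class PlaygroundHostViewController: UIViewController, ImagePlaygroundViewController.Delegate {

    func presentImagePlayground() {
        let playground = ImagePlaygroundViewController()
        playground.concepts = [.text("a friendly robot watering plants")]  // illustrative prompt
        playground.delegate = self
        present(playground, animated: true)
    }

    // Assumed delegate callback: called with a file URL for the generated image.
    func imagePlaygroundViewController(_ imagePlaygroundViewController: ImagePlaygroundViewController,
                                       didCreateImageAt imageURL: URL) {
        imagePlaygroundViewController.dismiss(animated: true)
        // Load and use the image, e.g. UIImage(contentsOfFile: imageURL.path)
    }

    func imagePlaygroundViewControllerDidCancel(_ imagePlaygroundViewController: ImagePlaygroundViewController) {
        imagePlaygroundViewController.dismiss(animated: true)
    }
}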
I signed up for Apple Intelligence on my iPad Air (M1) and then updated my phone today. It tells me that I've already been put on a waitlist I didn't even join, and it's been stuck on that for 2 days now.
I have an iPhone 15 Pro Max on iOS 18.1. I tried to join Apple Intelligence about three weeks ago, but I'm still stuck on the waitlist. I've already tried everything recommended by Apple, including changing my region and Siri's language to English (US). Can anyone help me figure out how to solve this issue?
Yesterday, after updating to iOS 18.1, I joined the Apple Intelligence waitlist on my iPhone 15 Pro. About an hour later I noticed the message "Support for processing Apple Intelligence on device is downloading." A day later it is still displaying the same message. I have strong Wi-Fi, I'm plugged into power with a full battery, and there are 750 GB of storage available. From what I have been able to find online, this isn't the typical user experience, and it probably isn't going to complete the process at this point. Any advice on how to proceed and get Apple Intelligence installed and working would be greatly appreciated.