Apple Intelligence

Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.

Seeking Feedback on an Idea: Real-Time Siri Running Coach for iOS
Hello everyone, I hope you’re all doing well. I’m not a developer, but I have an idea for an iOS app that I’d love to get your thoughts on. I wanted to share it here to gather feedback from this knowledgeable community and to learn from your expertise.

Idea Overview: Real-Time AI Running Coach for iOS
The concept is an iOS application that provides personalized, real-time running coaching by leveraging on-device data sources and Apple’s latest technologies. The app aims to offer an adaptive and motivating running experience while ensuring user privacy through on-device processing.

Key Features:
• Personalized Coaching:
  • Utilize real-time biometric data and personal insights to deliver AI-driven coaching tailored to the user’s mental and physical state.
  • Analyze health metrics, activity data, mood check-ins, and more to provide context-based motivational feedback.
• Privacy First:
  • All data processing occurs on-device using Apple frameworks like Core ML, ensuring no personal data leaves the device.
• Adaptive Motivation:
  • Implement natural language processing to analyze user inputs like journal entries or mood check-ins.
  • Generate personalized coaching cues based on historical performance and mood trends.
• Performance Enhancement:
  • Offer dynamic adjustments to pace, route, and strategy in real time to help improve running performance.
  • Seamless integration with Apple Watch for real-time data collection and haptic feedback.

Technologies and Frameworks Involved:
• HealthKit: Access health metrics such as heart rate, distance run, VO₂ max, sleep patterns, etc.
• Core ML: On-device machine learning for real-time data analysis without latency.
• Natural Language Processing: Analyze personal inputs for better coaching personalization.
• Core Motion & Core Location: Track motion data and location services for runs.
• AVFoundation & Speech: Provide real-time voice feedback and coaching cues.
• SiriKit Integration: Allow users to initiate workouts and receive updates via Siri.

Target Audience:
• Runners of all levels seeking personalized coaching that adapts to their mental and physical states.
• Users who prioritize privacy and want AI-driven insights without their data leaving the device.
• Tech-savvy fitness enthusiasts who use iOS devices and Apple wearables.

Questions for the Community:
1. Feasibility: Is this idea technically achievable using current iOS frameworks and technologies?
2. Data Access: Are there limitations in accessing and processing the necessary data on-device, especially regarding privacy and permissions?
3. Potential Challenges: What hurdles might developers face in creating such an app, and how could they be addressed?
4. Advice: As someone without a technical background, what steps would you recommend I take to move this idea forward?

I truly appreciate any feedback or insights you can provide. I’m excited about the potential of this idea but also aware there may be complexities I’m not considering. Thank you for taking the time to read this!

Best regards,
Paul
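A minimal sketch (not from the post above) of the kind of on-device loop the framework list implies: HealthKit heart-rate samples driving spoken cues via AVSpeechSynthesizer. The class name, thresholds, and cue wording are illustrative assumptions; a shipping app would also need HealthKit entitlements, a read-authorization request, and (for true real-time data) a watchOS workout session.

import HealthKit
import AVFoundation

final class RunCoach {
    private let healthStore = HKHealthStore()
    private let speech = AVSpeechSynthesizer()

    /// Observes new heart-rate samples as they arrive in HealthKit and
    /// speaks a cue when the rate drifts outside an illustrative zone.
    func startCoaching() {
        guard let heartRate = HKObjectType.quantityType(forIdentifier: .heartRate) else { return }
        let query = HKAnchoredObjectQuery(type: heartRate,
                                          predicate: nil,
                                          anchor: nil,
                                          limit: HKObjectQueryNoLimit) { [weak self] _, samples, _, _, _ in
            self?.handle(samples)
        }
        query.updateHandler = { [weak self] _, samples, _, _, _ in
            self?.handle(samples)
        }
        healthStore.execute(query)
    }

    private func handle(_ samples: [HKSample]?) {
        guard let sample = (samples as? [HKQuantitySample])?.last else { return }
        let bpm = sample.quantity.doubleValue(for: HKUnit.count().unitDivided(by: .minute()))
        if bpm > 175 {
            // Illustrative threshold, not a real training-zone recommendation.
            speak("Heart rate is high, ease off the pace.")
        } else if bpm < 120 {
            speak("You have room to push a little harder.")
        }
    }

    private func speak(_ cue: String) {
        speech.speak(AVSpeechUtterance(string: cue))
    }
}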
0 replies · 0 boosts · 216 views · Oct ’24
"failed to processImage" in videoProcessor
Hello, I’m working on a program that analyzes video files frame by frame to detect human poses in each frame. However, during the process of reading observations from the stream, the analysis frequently stops with the following error:

[LOG_ERROR] /Library/Caches/com.apple.xbs/Sources/MediaAnalysis/VideoProcessing/VCPHumanPoseImageRequest.mm[85]: code -18
[LOG_ERROR] /Library/Caches/com.apple.xbs/Sources/MediaAnalysis/VideoProcessing/VCPHumanPoseImageRequest.mm[178]: code -18

The error was caught and printed using a do-catch block, and here is the output:

Error Domain=NSOSStatusErrorDomain Code=-18 "Error: failed to processImage" UserInfo={NSLocalizedDescription=Error: failed to processImage}

While the do-catch block helps prevent the app from crashing, the frames following the error cannot be analyzed. I’m hoping to understand the cause of this error, or find a way to skip the problematic frames and continue analyzing the subsequent ones. My development environment is Xcode Version 16.0 (16A242d) and iOS 18.0. Thank you for your help. (Attaching my code below.)

let videoProcessor = VideoProcessor(videoURL)
let bodyPoseRequest = DetectHumanBodyPoseRequest()
let asset = AVURLAsset(url: videoURL)
let videoTrack = try await asset.loadTracks(withMediaType: .video).first
let bodyPoseStream = try await videoProcessor.addRequest(bodyPoseRequest)
videoProcessor.startAnalysis()

do {
    for try await observations in bodyPoseStream {
        guard let observation = observations.first else { continue }
        if let timeRange = observation.timeRange {
            /// do something...
        }
    }
} catch {
    print("\(error.localizedDescription)")
}
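One possible workaround sketch, offered as an assumption rather than confirmed guidance: instead of the stream-based VideoProcessor, where a single failing frame ends the async sequence, pull frames with AVAssetReader and run the older VNDetectHumanBodyPoseRequest per frame, so a failure can be skipped and analysis continues.

import AVFoundation
import Vision

func detectPoses(in videoURL: URL) async throws {
    let asset = AVURLAsset(url: videoURL)
    guard let track = try await asset.loadTracks(withMediaType: .video).first else { return }

    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    )
    reader.add(output)
    guard reader.startReading() else { return }

    // Decode one frame at a time; each frame gets its own do-catch,
    // so a -18 failure on one frame does not stop the rest of the pass.
    while let sampleBuffer = output.copyNextSampleBuffer() {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { continue }
        let request = VNDetectHumanBodyPoseRequest()
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer)
        do {
            try handler.perform([request])
            if let observation = request.results?.first {
                // do something with this frame's observation...
                _ = observation
            }
        } catch {
            print("Skipping frame: \(error.localizedDescription)")
        }
    }
}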
0 replies · 1 boost · 161 views · Oct ’24
iOS 18: Siri not passing string parameters to AppIntents if the string is a question
Xcode Version 16.0 (16A242d), iOS 18, Swift

There seems to be a behavior change on iOS 18 when using AppShortcuts and AppIntents to pass string parameters. After Siri prompts with a string property’s requestValueDialog, if the user makes a statement, the string is passed. If the user’s statement is a question, however, the string is not sent to the AppIntent and instead Siri attempts to answer that question.

Example code:

struct MyAppNameShortcuts: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AskQuestionIntent(),
            phrases: [
                "Ask \(.applicationName) a question",
            ]
        )
    }
}

struct AskQuestionIntent: AppIntent {
    static var title: LocalizedStringResource = .init(stringLiteral: "Ask a question")
    static var openAppWhenRun: Bool = false

    static var parameterSummary: some ParameterSummary {
        Summary("Search for \(\.$query)")
    }

    @Dependency
    private var apiClient: MockApiClient

    @Parameter(title: "Query", requestValueDialog: .init(stringLiteral: "What would you like to ask?"))
    var query: String

    // perform is not called if the user asks a question such as "What color is the moon?"
    // in response to requestValueDialog. On iOS 17, the same string is passed through.
    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog & ShowsSnippetView {
        print("Query is: \(query)")
        let queryResult = try await apiClient.askQuery(queryString: query)
        let dialog = IntentDialog(
            full: .init(stringLiteral: queryResult.answer),
            supporting: .init(stringLiteral: "The answer to \(queryResult.question) is...")
        )
        let view = SiriAnswerView(queryResult: queryResult)
        return .result(dialog: dialog, view: view)
    }
}

Given the above mock code:

iOS 17:
1. "Hey Siri, ask (AppName) a question."
2. Siri responds "What would you like to ask?"
3. Say "What color is the moon?"
4. The string "What color is the moon?" is passed to the AppIntent.

iOS 18:
1. "Hey Siri, ask (AppName) a question."
2. Siri responds "What would you like to ask?"
3. Say "What color is the moon?"
4. Siri answers the question "What color is the moon?" itself.
5. Follow the steps above again and instead reply "Moon": "Moon" is passed to the AppIntent.

Basically, on iOS 18 any interrogative string parameter seems to be intercepted and handled by Siri itself rather than delivered to the provided AppIntent.
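One experiment worth trying, sketched here as an assumption and untested against the iOS 18 behavior described above: declare the parameter as optional and request the missing value from perform() via needsValueError, then compare how Siri routes the follow-up reply. MockApiClient and SiriAnswerView are the poster’s own types and are omitted.

import AppIntents

struct AskQuestionExperimentIntent: AppIntent {
    static var title: LocalizedStringResource = "Ask a question"

    // Optional, so the system does not force a value before perform() runs.
    @Parameter(title: "Query")
    var query: String?

    func perform() async throws -> some IntentResult & ProvidesDialog {
        guard let query else {
            // Hands control back to the system, which prompts for the value.
            throw $query.needsValueError("What would you like to ask?")
        }
        return .result(dialog: "You asked: \(query)")
    }
}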
1 reply · 0 boosts · 476 views · Sep ’24
Genmoji developer support
Trying to experiment with Genmoji per the WWDC documentation and samples, but I don’t seem to get the Genmoji keyboard. I see this error in my log:

Received port for identifier response: <(null)> with error: Error Domain=RBSServiceErrorDomain Code=1 "Client not entitled" UserInfo={RBSEntitlement=com.apple.runningboard.process-state, NSLocalizedFailureReason=Client not entitled, RBSPermanent=false}
elapsedCPUTimeForFrontBoard couldn't generate a task port

Is anything presently supported for developers? All I have done here is build a simple app with a UITextView and this code:

textView.supportsAdaptiveImageGlyph = true

Any thoughts?
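For reference, a minimal sketch of the setup being attempted, assuming the iOS 18 SDK; it does not explain the "Client not entitled" log by itself, and the read-back helper assumes the adaptiveImageGlyph attribute key.

import UIKit

final class GenmojiViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.allowsEditingTextAttributes = true   // keep rich-text editing on
        textView.supportsAdaptiveImageGlyph = true     // opt the text view in to Genmoji
        view.addSubview(textView)
    }

    /// Enumerates any Genmoji the user has inserted, reading them back
    /// as NSAdaptiveImageGlyph values attached to the attributed text.
    func currentGlyphs() -> [NSAdaptiveImageGlyph] {
        var glyphs: [NSAdaptiveImageGlyph] = []
        let range = NSRange(location: 0, length: textView.attributedText.length)
        textView.attributedText.enumerateAttribute(.adaptiveImageGlyph, in: range, options: []) { value, _, _ in
            if let glyph = value as? NSAdaptiveImageGlyph { glyphs.append(glyph) }
        }
        return glyphs
    }
}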
0 replies · 0 boosts · 239 views · Sep ’24
Can Writing Tools be accessed In UITableView contextMenu?
I’m currently developing an app that features a main view with a UITableView. When users select a row, they are navigated to a detail view that contains a UITextField. This UITextField already supports Writing Tools. My question is: when a user long-presses a UITableView cell, is it possible to add a Writing Tools option to the context menu, allowing users to interact with Writing Tools more conveniently? For example, summarizing the cell’s detail text.
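A hypothetical sketch of where such an option would plug in: a custom action in the row’s context menu via UITableViewDelegate. As far as I can tell, iOS 18.0 exposes no public API to invoke Writing Tools directly from an arbitrary context menu, so the action body below is only a placeholder; the class name and action title are illustrative.

import UIKit

final class ItemListViewController: UIViewController, UITableViewDelegate {
    func tableView(_ tableView: UITableView,
                   contextMenuConfigurationForRowAt indexPath: IndexPath,
                   point: CGPoint) -> UIContextMenuConfiguration? {
        UIContextMenuConfiguration(identifier: nil, previewProvider: nil) { _ in
            let summarize = UIAction(title: "Summarize",
                                     image: UIImage(systemName: "text.append")) { _ in
                // Placeholder: navigate to the detail view and rely on the
                // UITextField's built-in Writing Tools support there.
            }
            return UIMenu(children: [summarize])
        }
    }
}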
0 replies · 0 boosts · 305 views · Sep ’24
Apple Intelligence download issue
Yesterday, after updating to iOS 18.1, I joined the Apple Intelligence waitlist on my iPhone 15 Pro. About an hour later I noticed the message "Support for processing Apple Intelligence on device is downloading." A day later it is still displaying the same message. I have strong Wi-Fi, I'm plugged in to power with a full battery, and there are 750 GB of storage available. From what I have been able to find online, this isn't the typical user experience, and it probably isn't going to complete the process at this point. Any advice on how to proceed and get Apple Intelligence installed and working would be greatly appreciated.
16 replies · 8 boosts · 6.9k views · Sep ’24