I use a data cable to connect my Nikon camera to my iPhone, and in my project I use the ImageCaptureCore framework. I can now read the photos on the camera's memory card, but when I press the camera's shutter to take a picture, the camera does not respond, even though the connection between the camera and the phone is fine. The camera screen then shows a picture of a laptop. Why is that? I hope someone can help me.
Not sure when it happened, but I can no longer play explicit songs in my app using MusicKit JS v3.
I've turned off restrictions and made sure I have access to explicit content on...
My phone (Screen Time)
My computer (Screen Time)
My iPad (Screen Time)
music.apple.com (Settings)
and I still get this error in the console when I try to play a song:
CONTENT_RESTRICTED: Content restricted
at set isRestricted (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:296791)
at SerialPlaybackController._prepareQueue (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:318357)
at SerialPlaybackController._prepareQueue (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:359408)
at set queue (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:308934)
at https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:357429
at Generator.next (<anonymous>)
at asyncGeneratorStep$j (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:351481)
at _next (https://js-cdn.music.apple.com/musickit/v3/musickit.js:28:351708)
My app is consistently crashing for a specific user on macOS 14.3 (iMac 24-inch, M1, 2021) when their library is retrieved in full. The user says they have 36k+ songs in their library, which includes purchased music.
This is the code making the call:
var request = MusicLibraryRequest<Album>()
request.limit = 10000
let response = try await request.response()
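As a possible workaround I'm considering paging the request instead of asking for 10,000 items at once. A sketch (untested; I haven't verified that nextBatch() paging behaves the same for library requests as it does for catalog responses):

import MusicKit

// Untested workaround sketch: fetch the library in smaller pages
// instead of one 10,000-item request.
func fetchAlbumsInPages() async throws -> [Album] {
    var request = MusicLibraryRequest<Album>()
    request.limit = 500                                // smaller page size
    let response = try await request.response()

    var albums = Array(response.items)
    var batch = response.items
    while batch.hasNextBatch, let next = try await batch.nextBatch() {
        albums.append(contentsOf: next)
        batch = next
    }
    return albums
}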
I'm aware of a similar (?) crash, FB13094022 (https://forums.developer.apple.com/forums/thread/736717), that was reportedly fixed in 14.2. I'm not sure if this is a separate issue or related.
I've submitted a new report, FB13573268, for it.
CrashReporter Key: 0455323d871db6008623d9288ecee16c676248c6
Hardware Model: iMac21,1
Process: Music Flow
Identifier: com.third.musicflow
Version: 1.2
Role: Foreground
OS Version: Mac OS 14.3
NSInternalInconsistencyException: No identifiers for model class: MPModelSong from source: (null)
0 CoreFoundation +0xf2530 __exceptionPreprocess
1 libobjc.A.dylib +0x19eb0 objc_exception_throw
2 Foundation +0x10f398 -[NSAssertionHandler handleFailureInMethod:object:file:lineNumber:description:]
3 MediaPlayer +0xd59f0 -[MPBaseEntityTranslator _objectForPropertySet:source:context:]
4 MediaPlayer +0xd574c -[MPBaseEntityTranslator _objectForRelationshipKey:propertySet:source:context:]
5 MediaPlayer +0xd5cd4 __63-[MPBaseEntityTranslator _objectForPropertySet:source:context:]_block_invoke_2
6 CoreFoundation +0x40428 __NSDICTIONARY_IS_CALLING_OUT_TO_A_BLOCK__
7 CoreFoundation +0x402f0 -[__NSDictionaryI enumerateKeysAndObjectsWithOptions:usingBlock:]
8 MediaPlayer +0xd5c1c __63-[MPBaseEntityTranslator _objectForPropertySet:source:context:]_block_invoke
9 MediaPlayer +0x11296c -[MPModelObject initWithIdentifiers:block:]
10 MediaPlayer +0xd593c -[MPBaseEntityTranslator _objectForPropertySet:source:context:]
11 MediaPlayer +0xd66c4 -[MPBaseEntityTranslator objectForPropertySet:source:context:]
12 MediaPlayer +0x1a7744 __47-[MPModeliTunesLibraryRequestOperation execute]_block_invoke
13 iTunesLibrary +0x16d84 0x1b4e1cd84 (0x1b4e1cd30 + 84)
14 CoreFoundation +0x5dec0 __invoking___
15 CoreFoundation +0x5dd38 -[NSInvocation invoke]
16 Foundation +0x1e874 __NSXPCCONNECTION_IS_CALLING_OUT_TO_REPLY_BLOCK__
17 Foundation +0x1cef4 -[NSXPCConnection _decodeAndInvokeReplyBlockWithEvent:sequence:replyInfo:]
18 Foundation +0x1c850 __88-[NSXPCConnection _sendInvocation:orArguments:count:methodSignature:selector:withProxy:]_block_invoke_3
19 libxpc.dylib +0x10020 _xpc_connection_reply_callout
20 libxpc.dylib +0xff18 _xpc_connection_call_reply_async
21 libdispatch.dylib +0x398c _dispatch_client_callout3
22 libdispatch.dylib +0x21384 _dispatch_mach_msg_async_reply_invoke
23 libdispatch.dylib +0xad24 _dispatch_lane_serial_drain
24 libdispatch.dylib +0xba04 _dispatch_lane_invoke
25 libdispatch.dylib +0x16618 _dispatch_root_queue_drain_deferred_wlh
26 libdispatch.dylib +0x15e8c _dispatch_workloop_worker_thread
27 libsystem_pthread.dylib +0x3110 _pthread_wqthread
28 libsystem_pthread.dylib +0x1e2c start_wqthread
I have trained a model to classify some symbols using Create ML.
In my app I am using VNImageRequestHandler and VNCoreMLRequest to classify image data.
If I use a CVPixelBuffer obtained from an AVCaptureSession, the classifier runs as I would expect: if I point it at the symbols it works fairly accurately, so I know the model is trained reasonably well and works in my app.
If I try to use a CGImage obtained by cropping a section out of a larger image (from the gallery), the classifier does not work. It always seems to return the same result; although the confidence is not exactly 1.0 and varies for each image, it is within several decimal places of it, e.g. 0.9999.
If I pause the app when I have the cropped image and use the debugger to obtain the cropped image (via the little eye icon and then open in preview), then drop the image into the Preview section of the MLModel file or in Create ML, the model correctly classifies the image.
If I scale the cropped image to the same size I get from my camera, and convert the CGImage to a CVPixelBuffer with the same size and colour space as the camera (1504 × 1128, kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange), then I get some difference in output. It's not accurate, but it returns different results if I specify the 'centerCrop' or 'scaleFit' options. So I know that 'something' is happening, but it's not the correct thing.
I was under the impression that passing a cgImage to the VNImageRequestHandler would perform the necessary conversions, but experimentation shows this is not the case. However, when using the preview tool on the model or in Create ML this conversion is obviously being done behind the scenes because the cropped part is being detected.
What am I doing wrong?
tl;dr
My model works, as backed up by using video input directly and by dropping cropped images into the preview sections.
Passing the cropped images directly to the VNImageRequestHandler does not work.
Modifying the cropped images can produce different results, but I cannot see what I should be doing to get reliable results.
I'd like my app to behave the same way the preview behaves: I give it a cropped part of an image, it does some processing, it goes to the classifier, and it returns the same result as in Create ML. A sketch of my request code follows.
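Here is that sketch; "SymbolClassifier" is the Create ML-generated class for my model (the name is mine, not an Apple API):

import Vision
import CoreML

// Minimal sketch of my classification path for a cropped CGImage.
func classifySymbol(in cgImage: CGImage) throws -> VNClassificationObservation? {
    let mlModel = try SymbolClassifier(configuration: MLModelConfiguration()).model
    let vnModel = try VNCoreMLModel(for: mlModel)

    let request = VNCoreMLRequest(model: vnModel)
    request.imageCropAndScaleOption = .centerCrop      // also tried .scaleFit

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
    return (request.results as? [VNClassificationObservation])?.first
}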
hdiutil bug?
When making a DMG image of the entire contents of the user1 profile (i.e. using -srcfolder /Users/user1) with hdiutil, the program fails with:
/Users/user1/Library/VoiceTrigger: Operation not permitted hdiutil: create failed - Operation not permitted
The complete command used was: "sudo hdiutil create -srcfolder /Users/user1 -skipunreadable -format UDZO /Volumes/testdmg/test.dmg"
And, of course, the user had local admin rights. I was using Sonoma 14.2.1 on a MacBook Pro (Intel, with a T2 chip).
What I would have expected, assuming that /VoiceTrigger cannot be copied for whatever reason, would be for hdiutil to skip that file or folder and continue, then at the end produce a log listing the files/folders not included and the reason for their exclusion. The fact that hdiutil just ended immediately looks to me like a bug. Or what else could explain the problem described?
I am trying to use the Speech Synthesizer to speak the pronunciation of a word in British English rather than play a local audio file which I had before. However, I keep getting this in the debugger:
#FactoryInstall Unable to query results, error: 5 Unable to list voice folder Unable to list voice folder Unable to list voice folder IPCAUClient.cpp:129 IPCAUClient: bundle display name is nil Unable to list voice folder
Here is my code; any suggestions?
func playSampleAudio() {
    // NOTE: a synthesizer held only in a local constant can be deallocated
    // before it finishes (or even starts) speaking; see the sketch below.
    let speechSynthesizer = AVSpeechSynthesizer()
    let speechUtterance = AVSpeechUtterance(string: currentWord)

    // Search for a voice with a British English accent.
    let voices = AVSpeechSynthesisVoice.speechVoices()
    var foundBritishVoice = false
    for voice in voices {
        if voice.language == "en-GB" {
            speechUtterance.voice = voice
            foundBritishVoice = true
            break
        }
    }
    if !foundBritishVoice {
        print("British English voice not found. Using default voice.")
    }

    // Configure the utterance's properties as needed.
    speechUtterance.rate = AVSpeechUtteranceDefaultSpeechRate
    speechUtterance.pitchMultiplier = 1.0
    speechUtterance.volume = 1.0

    // Speak the word.
    speechSynthesizer.speak(speechUtterance)
}
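Edit: one thing I'm testing is keeping the synthesizer alive beyond the function call, since a locally scoped AVSpeechSynthesizer can be deallocated before speech finishes. A minimal sketch (the class and names are mine):

import AVFoundation

// Keep the synthesizer as a stored property so it outlives the call.
final class WordSpeaker {
    private let synthesizer = AVSpeechSynthesizer()   // retained by the instance

    func speak(_ word: String) {
        let utterance = AVSpeechUtterance(string: word)
        // Returns nil if no en-GB voice is installed; AVSpeechSynthesizer
        // then falls back to the default voice.
        utterance.voice = AVSpeechSynthesisVoice(language: "en-GB")
        synthesizer.speak(utterance)
    }
}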
I have an app that allows users to send messages to each other via a link. I'm receiving many requests from my users to add songs. Can I use the Apple Music API to add 30-second song previews along with the user's message? If users want to listen to the full song, there will be a link to the full song on Apple Music. My app contains ads and subscriptions.
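For context, the flow I have in mind looks roughly like this (a sketch using MusicKit's preview assets; whether the licensing terms allow this in an ad-supported app with subscriptions is exactly what I'm asking):

import MusicKit
import AVFoundation

// Sketch: fetch a catalog song and build a player for its ~30-second
// preview. The caller keeps the returned AVPlayer alive while it plays.
func previewPlayer(for songID: MusicItemID) async throws -> AVPlayer? {
    let request = MusicCatalogResourceRequest<Song>(matching: \.id, equalTo: songID)
    let response = try await request.response()
    guard let url = response.items.first?.previewAssets?.first?.url else { return nil }
    return AVPlayer(url: url)
}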
Is there a way to retrieve a list of all the favorited artists for a given user via the Apple Music API? I see endpoints for playback history and resources added to the library. But I want to retrieve the full list of favorited artists for the given user and I don't see any obvious choices in the documentation.
Thanks in advance!
Running in a Mac (Catalyst) target, or on Apple Silicon (Designed for iPad):
Just accessing playbackStoreID on an MPMediaItem logs this error in the console:
-[ITMediaItem valueForMPMediaEntityProperty:]: Unhandled MPMediaEntityProperty subscriptionStoreItemAdamID.
The value returned is always “”.
This works as expected on iOS and iPadOS, returning a valid playbackStoreID.
import SwiftUI
import MediaPlayer

@main
struct PSIDDemoApp: App {
    var body: some Scene {
        WindowGroup {
            Text("playbackStoreID demo")
                .task {
                    let authResult = await MPMediaLibrary.requestAuthorization()
                    if authResult == .authorized {
                        if let item = MPMediaQuery.songs().items?.first {
                            let persistentID = item.persistentID
                            let playbackStoreID = item.playbackStoreID // <--- Here
                            print("Item \(persistentID), \(playbackStoreID)")
                        }
                    }
                }
        }
    }
}
Xcode 15.1, also tested with Xcode 15.3 beta 2.
macOS Sonoma 14.3.1
FB13607631
Hello,
I'm playing with ScreenCaptureKit. My scenario is very basic: I capture frames and assign them to a CALayer to display in a view (for previewing), basically what Apple's sample app does. I have two screens, and I capture only one of them, which has no app windows on it. My app also excludes itself from capture.

When I place my app on the screen that is not being captured, I observe that most didOutputSampleBuffer calls receive frames with Idle status, which is expected for a screen that is not updating. However, if I bring my capturing app (which, to remind, is excluded from capture) to the captured screen, the frames start coming with Complete status (i.e. holding pixel buffers). This is not what I expect: from a capture perspective the screen is still not updating, as the app is excluded. The preview the app displays proves that, showing an empty desktop.

So it seems that updating a CALayer triggers a screen-capture frame regardless of whether the app is included or excluded. This looks like a bug to me, and it can lead to a noticeable waste of resources and battery when frames are not just displayed on screen but also processed somehow and/or sent over a network. I'm also observing another issue caused by this behavior, where capture hangs when queueDepth is set to 3 in this same scenario (but I'll describe that separately).
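For reference, here is roughly how I build the content filter that excludes my app (simplified):

import ScreenCaptureKit

// Build a filter for one display that excludes this app from capture.
func makeFilter(for display: SCDisplay) async throws -> SCContentFilter {
    let content = try await SCShareableContent.excludingDesktopWindows(
        false, onScreenWindowsOnly: true)
    let ownApp = content.applications.filter {
        $0.bundleIdentifier == Bundle.main.bundleIdentifier
    }
    return SCContentFilter(display: display,
                           excludingApplications: ownApp,
                           exceptingWindows: [])
}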
Please advise whether I should file a bug somewhere, or whether there is a rational explanation for this behavior.
Thank you
I haven't found any really thorough documentation or guidance on the use of CIRAWFilter.linearSpaceFilter. The API documentation calls it
An optional filter you can apply to the RAW image while it’s in linear space.
Can someone provide insight into what this means and what the linear space filter is useful for? When would we use this linear space filter instead of a filter on the output of CIRAWFilter?
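For concreteness, this is how I understand it would be attached (a sketch; the CIExposureAdjust filter and the 0.5 EV value are just for illustration):

import CoreImage

// Attach a filter that runs while the RAW data is still scene-linear,
// i.e. before gamma/tone mapping is applied.
func developRAW(at url: URL) -> CIImage? {
    guard let rawFilter = CIRAWFilter(imageURL: url) else { return nil }

    let exposure = CIFilter(name: "CIExposureAdjust")!
    exposure.setValue(0.5, forKey: kCIInputEVKey)

    rawFilter.linearSpaceFilter = exposure
    return rawFilter.outputImage
}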
Thank you.
I have a 3D image, but when I insert it into my project it comes in without colors. Does anyone know why?
Hello, we are embedding a PHPickerViewController with UIKit (adding the picker as a child view controller, embedding its view, and calling didMove(toParent:)) in our app, using compact mode. We are disabling the following capabilities: .collectionNavigation, .selectionActions, .search.
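The embedding looks roughly like this (simplified; the container and parent are ours):

import PhotosUI
import UIKit

// Embed the picker as a child view controller (iOS 17 APIs).
func embedPicker(in parent: UIViewController, container: UIView) {
    var configuration = PHPickerConfiguration()
    configuration.mode = .compact
    configuration.disabledCapabilities = [.collectionNavigation, .selectionActions, .search]

    let picker = PHPickerViewController(configuration: configuration)
    parent.addChild(picker)
    container.addSubview(picker.view)
    picker.view.frame = container.bounds
    picker.didMove(toParent: parent)
}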
One of our users, on iOS 17.2.1 with an iPhone 12, encountered a crash with the following stack trace:
Crashed: com.apple.main-thread
0 libsystem_kernel.dylib 0x9fbc __pthread_kill + 8
1 libsystem_pthread.dylib 0x5680 pthread_kill + 268
2 libsystem_c.dylib 0x75b90 abort + 180
3 PhotoFoundation 0x33b0 -[PFAssertionPolicyCrashReport notifyAssertion:] + 66
4 PhotoFoundation 0x3198 -[PFAssertionPolicyComposite notifyAssertion:] + 160
5 PhotoFoundation 0x374c -[PFAssertionPolicyUnique notifyAssertion:] + 176
6 PhotoFoundation 0x2924 -[PFAssertionHandler handleFailureInFunction:file:lineNumber:description:arguments:] + 140
7 PhotoFoundation 0x3da4 _PFAssertFailHandler + 148
8 PhotosUI 0x22050 -[PHPickerViewController _handleRemoteViewControllerConnection:extension:extensionRequestIdentifier:error:completionHandler:] + 1356
9 PhotosUI 0x22b74 __66-[PHPickerViewController _setupExtension:error:completionHandler:]_block_invoke_3 + 52
10 libdispatch.dylib 0x26a8 _dispatch_call_block_and_release + 32
11 libdispatch.dylib 0x4300 _dispatch_client_callout + 20
12 libdispatch.dylib 0x12998 _dispatch_main_queue_drain + 984
13 libdispatch.dylib 0x125b0 _dispatch_main_queue_callback_4CF + 44
14 CoreFoundation 0x3701c __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 16
15 CoreFoundation 0x33d28 __CFRunLoopRun + 1996
16 CoreFoundation 0x33478 CFRunLoopRunSpecific + 608
17 GraphicsServices 0x34f8 GSEventRunModal + 164
18 UIKitCore 0x22c62c -[UIApplication _run] + 888
19 UIKitCore 0x22bc68 UIApplicationMain + 340
20 WorkAngel 0x8060 main + 20 (main.m:20)
21 ??? 0x1bd62adcc (Missing)
Please share if you have any ideas as to what might have caused this, or what to look at in such a case. Unfortunately, I haven't been able to reproduce it myself.
Hello, I have a song on Apple Music. When this song is found on Shazam, I want it to appear with a video clip, like the example linked below. Is there any way you can help with this?
Example: https://www.youtube.com/watch?v=St8smx2q1Ho
My Music: https://music.apple.com/us/album/tam-ba%C4%9F%C4%B1ms%C4%B1z-t%C3%BCrkiye/1689395789?i=1689395790
Thanks.
The MusicKit JS V3 documentation has this article about Native WebViews.
For some reason, my app fails to do anything after musicKitInstance.authorize() is called. I think it is not being recognized as a WebView and is trying to open the authorization window as if it were running in a browser; but since it's not a browser, there's no way for it to open one.
Is there something I can do to make this work? I have tried many things, including setting the user agent to a few different ones for web views and seeing if it behaves differently but didn't see any different behavior.
Is there something musickit.js checks for, when determining if it's running in a WebView, that I can leverage to ensure the Native WebView workflow is followed?
I cannot find an album where isCompilation is true, even when the album clearly consists of songs by various artists. For example:
Album(
  id: "567979803",
  title: "Earth's Answer",
  artistName: "Brian Keane, Deuter, James Newton, ...many more",
  isCompilation: false
)
The compilation checkbox for this album in the Apple Music Catalog Get Info dialog is also not checked.
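For reference, this is how I'm reading the flag (a sketch using the album ID above):

import MusicKit

// Read the compilation flag straight from the catalog.
func checkCompilationFlag() async throws {
    let request = MusicCatalogResourceRequest<Album>(
        matching: \.id, equalTo: MusicItemID("567979803"))
    let response = try await request.response()
    if let album = response.items.first {
        print(album.title, String(describing: album.isCompilation))
    }
}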
Is this field NEVER SET in the catalog or the MusicKit API?
If there IS an album where isCompilation is true, I'd like its ID to use for a test case.
If not, can this be added to the API?
Hey there,
I have been trying to add the ability for a user to edit the playback rate of a song in their Apple Music library to my application. I have found that whenever the playback rate is changed, the music pauses for about half a second. I was wondering what causes this, and whether there is any way around it. For context, I am using the SystemMusicPlayer from MusicKit.
I am currently experimenting with the MusicKit API, specifically whether and how it could replace iTunesLibrary on macOS and MediaPlayer on iOS. My question is specifically about DRM-free tracks that the user has available locally (e.g. purchased, matched, or uploaded tracks):
Is there a way to access the on-disk URL of locally available and DRM-free tracks?
In the iTunesLibrary framework on macOS there is ITLibMediaItem.location, and on iOS I can get this path via MPMediaItem.assetURL, but MusicKit's Song.url seems to be nil for all of these tracks in my tests.
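For comparison, this is how I currently get the on-disk URLs via iTunesLibrary on macOS, which works as expected:

import iTunesLibrary

// Read the on-disk URLs of local tracks with the iTunesLibrary framework.
func localTrackURLs() throws -> [URL] {
    let library = try ITLibrary(apiVersion: "1.0")
    return library.allMediaItems.compactMap { $0.location }
}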
Hello, I don't run a podcast, so I am not referring to the Apple Podcasts Connect platform; I have been trying to get in contact with someone at Apple Podcasts. I would like to talk to developer support, or someone who could consult on how best to approach something I'd like to build as an open-source tool. I listen to a lot of podcasts and would like an analytics dashboard and a toolset for taking notes on the podcasts I listen to on Apple Podcasts. Even just having analytics, with access to all of the info, would be a good start. I need to be able to plug into an API and pull all of that data from my account. Is there any way I can access this, or talk to someone about it? I assume I have a lot of historical data from all of the shows I'm subscribed to, and I would like to visualize all of it. Is this possible? From my research, it seems there is no way to access this information from the Podcasts app. Is there any infrastructure for this?
Is there any way to play panoramic or 360 videos in an immersive space, without using VideoMaterial on a sphere?
I've tried local videos in 4K and 8K quality, and all of them look pixelated using this approach.
I tried both the simulator and a real device, and I can never get high-quality playback.
If the video is played in a regular 2D player, on the other hand, it shows the expected quality.