Hello! I'm trying to find my own window or SCWindow in a SCDisplay. This is so I can automatically change the SCDisplay to record when I drag a window from a display to another. Is there any way to check the windows that are contained in every SCDisplay? Thank you in advance!
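A minimal sketch of one approach, assuming the app knows its own window's CGWindowID (myWindowID below is a placeholder): fetch the current SCShareableContent and match the window to the display whose frame contains the window's midpoint, since SCWindow.frame and SCDisplay.frame appear to share the same global coordinate space.
import ScreenCaptureKit

// Sketch: find the SCDisplay currently containing a given window.
// myWindowID is an assumed placeholder for the app's own window ID.
func display(containing myWindowID: CGWindowID) async throws -> SCDisplay? {
    let content = try await SCShareableContent.excludingDesktopWindows(
        false, onScreenWindowsOnly: true)
    guard let window = content.windows.first(where: { $0.windowID == myWindowID }) else {
        return nil
    }
    let center = CGPoint(x: window.frame.midX, y: window.frame.midY)
    // The containing display is the one whose frame holds the window's center.
    return content.displays.first { $0.frame.contains(center) }
}
Calling this again after the drag ends (e.g. from a window-move notification) would let the recorder switch its SCContentFilter to the new display.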
General
Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.
I'm getting a variety of errors when I call prepareToPlay on MPMusicPlayerController. Sometimes they happen, sometimes they don't. I'm trying to play songs from the Apple Music service; when I don't get the errors, playback works just fine. I'm running iOS 13.5.1 on my iPhone XS and using Xcode 11.5. This is my code:
let applicationMusicPlayer = MPMusicPlayerController.applicationMusicPlayer
applicationMusicPlayer.setQueue(with: [trackID])
applicationMusicPlayer.prepareToPlay(completionHandler: { error in
    if let error = error {
        print(error.localizedDescription)
        return
    }
    DispatchQueue.main.async {
        applicationMusicPlayer.play()
    }
})
These are the various errors I'm getting:
[SDKPlayback] Failed to prepareToPlay error: Error Domain=MPMusicPlayerControllerErrorDomain Code=2 "Queue was interrupted by another queue" UserInfo={NSDebugDescription=Queue was interrupted by another queue}
[SDKPlayback] Failed to prepareToPlay error: Error Domain=MPMusicPlayerControllerErrorDomain Code=9 "Preparing queue timed out" UserInfo={NSDebugDescription=Preparing queue timed out}
[SDKPlayback] Failed to prepareToPlay error: Error Domain=MPMusicPlayerControllerErrorDomain Code=6 "Failed to prepare to play" UserInfo={NSDebugDescription=Failed to prepare to play}
[SDKPlayback] applicationQueuePlayer _establishConnectionIfNeeded timeout [ping did not pong]
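Since the failures are intermittent, one mitigation I've been experimenting with is simply retrying prepareToPlay; the retry count and delay below are arbitrary assumptions, not a documented fix.
import MediaPlayer

// Sketch: retry prepareToPlay a few times before giving up.
// maxAttempts and the 1-second delay are arbitrary choices.
func prepareAndPlay(trackID: String, attempt: Int = 1, maxAttempts: Int = 3) {
    let player = MPMusicPlayerController.applicationMusicPlayer
    player.setQueue(with: [trackID])
    player.prepareToPlay { error in
        if let error = error {
            print("prepareToPlay attempt \(attempt) failed: \(error.localizedDescription)")
            guard attempt < maxAttempts else { return }
            DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
                prepareAndPlay(trackID: trackID, attempt: attempt + 1, maxAttempts: maxAttempts)
            }
            return
        }
        DispatchQueue.main.async { player.play() }
    }
}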
I'm trying to create a new playlist on the device that shows up in Apple Music, and to add a selection of MP3s to it, all from within a small iOS app.
The MP3s are either a stream of bytes or a flat file already stored on the device; the app itself generates them (they aren't downloaded, they're created in app and then stored in the app's local storage space).
The idea is that the created tracks show up in a specific playlist on the device.
Now, there appears to be some confusion as to which framework I need to use.
I've found MPMediaPlayer, which appears to let me create a playlist via the getPlaylist call, although the documentation seems pretty sparse and I can't find many examples of how to use it.
It looks like a UUID is passed in, but there is no documentation on what this UUID is or where it comes from. If I want to create a new playlist, I presume I need to generate a UUID and then store it locally so I can access that playlist again later, yes?
There's an addItem call that looks like it's how you add a track to a playlist, but there's no documentation on how to generate an entry. The documentation for this function talks about a product ID without describing what it is or where it comes from. Is it a GUID? Is it a name or description? Does it have to be unique? I'm assuming the product ID refers to whatever is being added to the playlist, but the documentation doesn't explain what it refers to. Is it a media item, or is a media item what gets created when the entity the product ID refers to is added to the playlist?
I'm assuming I can create an NSURL for the stored file that is the actual MP3 sample, but what I do with that in order to add it as a playlist entry is unknown. I'm sure there is a mechanism to do this; it's just not clear what it is.
The process isn't really explained anywhere, and some illumination would be helpful.
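For what it's worth, here is a sketch of my current understanding of the getPlaylist flow: the UUID is one the app generates and persists itself (the UserDefaults key below is my own), and the playlist is created on the first call and fetched on later calls. My reading, which may be wrong, is that addItem(withProductID:) takes an Apple Music store ID rather than anything derived from a local file, which would mean locally generated MP3s can't be added through that call.
import MediaPlayer

// Sketch: generate and persist a UUID, then create/fetch the playlist.
// "MyPlaylistUUID" is an assumed key of my own choosing.
func fetchOrCreatePlaylist(completion: @escaping (MPMediaPlaylist?) -> Void) {
    let defaults = UserDefaults.standard
    let uuid: UUID
    if let stored = defaults.string(forKey: "MyPlaylistUUID"),
       let existing = UUID(uuidString: stored) {
        uuid = existing
    } else {
        uuid = UUID()
        defaults.set(uuid.uuidString, forKey: "MyPlaylistUUID")
    }
    let metadata = MPMediaPlaylistCreationMetadata(name: "My Generated Tracks")
    MPMediaLibrary.default().getPlaylist(with: uuid, creationMetadata: metadata) { playlist, error in
        completion(error == nil ? playlist : nil)
    }
}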
I need to duck the audio coming from ApplicationMusicPlayer while playing a local file using AVAudioPlayer.
I've tried using the duckOthers option as follows, but it doesn't work:
let appAudioSession = AVAudioSession.sharedInstance()
do {
    try appAudioSession.setCategory(.playAndRecord, mode: .default, options: .duckOthers)
} catch {
    print("Failed to set audio session category: \(error)")
}
Maybe this is because there's one session for the entire app, and ApplicationMusicPlayer is using it?
This is a fairly critical problem for my application, since Music content is always much louder than locally recorded content. Any insight appreciated.
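For reference, here is the shape of the approach I'd expect to work, on the assumption that ducking is applied when the session is activated and lifted when it is deactivated (and that .playback suffices; swap in .playAndRecord if recording is needed). localFileURL is a placeholder.
import AVFoundation

var localPlayer: AVAudioPlayer?

// Sketch: activate the ducking session only while the local file plays.
func playLocalFileDuckingOthers(localFileURL: URL) throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .default, options: .duckOthers)
    try session.setActive(true) // ducking takes effect on activation
    localPlayer = try AVAudioPlayer(contentsOf: localFileURL)
    localPlayer?.play()
}

func localPlaybackFinished() throws {
    // Deactivating with this option lets other audio return to full volume.
    try AVAudioSession.sharedInstance().setActive(false, options: .notifyOthersOnDeactivation)
}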
if balloon == yellow1_balloon {
    soundFile = "Sounds/newblop.wav"
    playSound()
    balloon.isHidden = true
    poppedImages.isHidden = false
    poppedImages.animationImages = ["popyellow-1", "popyellow-2", "popyellow-3",
                                    "popyellow-4", "popyellow-5", "popyellow-6",
                                    "popyellow-7"]
        .compactMap({ name in
            UIImage(named: name)
        })
    let x: CGFloat = yellow1_balloon.frame.origin.x
    let y: CGFloat = yellow1_balloon.frame.origin.y
    poppedImages.frame.origin.x = x
    poppedImages.frame.origin.y = y
    poppedImages.animationDuration = 1.0
    poppedImages.animationRepeatCount = 1
    poppedImages.startAnimating()
    score = score + 10
    scoreLbl.text = String(score)
    return
}
The x, y coordinates are always the same as when yellow1_balloon is first created, not where it ends up after being touched.
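If the balloon is moved with a UIView or Core Animation animation, that would explain this: the model frame keeps its original value while the animation is in flight, and only the presentation layer reflects the on-screen position. A small sketch of reading it (assuming the balloon is animated this way):
// The model layer keeps the pre-animation frame; the presentation layer
// reflects what is currently on screen mid-animation.
let onScreenFrame = yellow1_balloon.layer.presentation()?.frame ?? yellow1_balloon.frame
poppedImages.frame.origin = onScreenFrame.origin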
I would like to save the depth map from ARDepthData as a .tiff, but I notice the distances in my output TIFF are incorrect. Objects that are close are reported to be slightly farther away, and walls that are around 4 meters away from me have a recorded value of 2 meters. I am using this code to write the TIFF:
import UIKit
import CoreImage

// Save method
extension CVPixelBuffer {
    func saveDepthMapToTIFF(to path: URL) {
        let ciImage = CIImage(cvPixelBuffer: self)
        let context = CIContext()
        do {
            try context.writeTIFFRepresentation(
                of: ciImage,
                to: path,
                format: .Lf,
                colorSpace: CGColorSpaceCreateDeviceGray()
            )
        } catch {
            print("Failed to write TIFF: \(error)")
        }
    }
}

// Calling the save
arFrame.sceneDepth?.depthMap.saveDepthMapToTIFF(to: depthMapPath)
I am reading the file like this in Python
import tifffile
import matplotlib.pyplot as plt

depth_map = tifffile.imread("test.tiff")
plt.imshow(depth_map)
plt.colorbar()
which creates this image:
The farthest parts of the room should be around 4 meters, not 2. The dark blue spot on the lower right is closer than half a meter away.
Notably, the depth map contains the distance from the camera plane to each region, not the distance from the camera sensor to the region. Even after correcting for this, though, the depth map remains about the same.
Is there an issue with how I am saving the depth image? Is there a scale factor or format error?
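One way to isolate the problem is to bypass Core Image entirely and dump the raw Float32 values, then compare them in Python; if the raw values are correct, the discrepancy is in the TIFF write rather than in ARKit. A sketch (the row-padding handling and output format are my own):
import CoreVideo
import Foundation

// Sketch: copy the Float32 depth values row by row (bytesPerRow may
// include padding) and write them out as a raw binary file.
func saveDepthMapRaw(_ buffer: CVPixelBuffer, to url: URL) throws {
    CVPixelBufferLockBaseAddress(buffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }
    let width = CVPixelBufferGetWidth(buffer)
    let height = CVPixelBufferGetHeight(buffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
    guard let base = CVPixelBufferGetBaseAddress(buffer) else { return }
    var floats = [Float]()
    floats.reserveCapacity(width * height)
    for row in 0..<height {
        let rowPtr = base.advanced(by: row * bytesPerRow)
            .assumingMemoryBound(to: Float32.self)
        floats.append(contentsOf: UnsafeBufferPointer(start: rowPtr, count: width))
    }
    try floats.withUnsafeBufferPointer { try Data(buffer: $0).write(to: url) }
}
In Python, np.fromfile(path, dtype=np.float32).reshape(height, width) should then show the unconverted values.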
Access to fetch at 'https://play.itunes.apple.com/WebObjects/MZPlay.woa/wa/webPlayback' from origin 'http://localhost:5173' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
I am using ReplayKit's RPScreenRecorder to record my app. When I use it in a mixed immersive space, nothing is actually recorded. The video is entirely blank.
Is this a feature or a bug? I am trying to record everything the user sees, including passthrough. Is there another way to do this?
I'm trying to find a library item by title and artist, but the request returns 0 items. Below is an example for an existing track in my library.
With title filter commented out, it successfully gives me all library items for that artistName.
If I add the title filter, or have only the title filter, I get 0 items. Why is that?
var request = MusicLibraryRequest<MusicKit.Track>()
// request.filter(matching: \.title, equalTo: "Crises (Remastered 2013)")
request.filter(matching: \.artistName, equalTo: "Mike Oldfield")
let response = try await request.response()
I can find the track by filtering the returned tracks by artist, but I feel this might not be an ideal approach if I have a bunch of tracks to find, possibly by different artists.
The reason I'm not querying by id is that I'm planning to do this sort of query for non-Apple-Music items, and if I'm not mistaken there is no cross-device id for those (even with Sync Library on). If I have the app on multiple devices with the same Apple ID looking at the same library, I want device 2 to find the track you interacted with on device 1. If there are better ways to solve this, any ideas are welcome.
Appreciate any help.
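In the meantime, the workaround I've sketched filters by artist via the library request and matches the title in memory; the case-insensitive contains check is my own heuristic, since the library title may carry suffixes like "(Remastered 2013)" that an exact equality filter would have to match verbatim.
import MusicKit

// Sketch: request by artist, then match the title locally.
func findLibraryTrack(title: String, artistName: String) async throws -> MusicKit.Track? {
    var request = MusicLibraryRequest<MusicKit.Track>()
    request.filter(matching: \.artistName, equalTo: artistName)
    let response = try await request.response()
    return response.items.first { track in
        track.title.localizedCaseInsensitiveContains(title)
            || title.localizedCaseInsensitiveContains(track.title)
    }
}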
I was about to finally sign up for the Apple Music affiliate program, since one of my apps provides the MusicKit subscription offer. However, it looks like the program was renamed and now restricts usage to artists and record labels. (https://partners.applemediaservices.com)
Is this correct? Can I, as an indie developer, not earn commission on the Apple Music subscriptions I drive from my apps?
After thoroughly scouring the internet numerous times, I came to the realization that Apple has not documented their User Authentication flow. Instead, developers are often directed to use their JavaScript solution, which proves to be insufficient and impractical for many projects.
Could we please request a comprehensive documentation outlining the process of generating a Music User Token from the Developer Token? This would greatly benefit developers seeking to integrate Apple Music functionality into their projects.
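For native apps there is at least a sparsely documented path: SKCloudServiceController can exchange a developer token for a Music User Token once the user authorizes access. A sketch (developerToken is assumed to be a JWT you generated and signed with your MusicKit private key):
import StoreKit

// Sketch of the native user-token flow.
func requestMusicUserToken(developerToken: String,
                           completion: @escaping (String?) -> Void) {
    SKCloudServiceController.requestAuthorization { status in
        guard status == .authorized else { completion(nil); return }
        SKCloudServiceController().requestUserToken(forDeveloperToken: developerToken) { token, error in
            if let error = error {
                print("User token error: \(error.localizedDescription)")
            }
            completion(token)
        }
    }
}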
Hello everyone, my name is Joshua Osagie. For 2 months now, I have been trying to build my own music application, but unfortunately I can't, because I assumed I would get an API from Apple Music or Spotify that would grant me access to over 100 million songs. Even for a paid API, my research suggests it's practically impossible. So please, if anyone has an idea of what I can do to bring this application to life, I would really appreciate it. Or if anyone could share an idea of how to get millions of songs into my app, I would be really grateful.
I'm working on an application that aims to deliver a DJ-like experience by overlapping songs and implementing volume fading. During development, I've hit a roadblock due to the MusicPlayerController singleton behavior on iOS, which seems to support playback of only one audio stream at a time and doesn't support overlapping songs or fading volume between tracks.
I understand that Apple Music content is protected and that playback through Apple MusicKit must respect the DRM and licensing agreements. However, I've noticed an application called "Mixonset" that seems to be able to stream songs from Apple Music, use the music player, and create an overlapping effect of songs for users.
Could anyone share insights on how Mixonset might be achieving this within the constraints of the Apple MusicKit? Is there any approach that I could explore to implement similar functionality, such as overlapping songs or crossfading while adhering to Apple's guidelines and without violating any terms of service?
Any advice or direction towards documentation and API capabilities that could support these features would be greatly appreciated.
Thank you for your assistance.
Hello,
I'm wondering if there is a way to programmatically write a series of UIImages into an APNG, similar to what the code below does for GIFs (credit: https://github.com/AFathi/ARVideoKit/tree/swift_5). I've tried implementing a similar solution, but it doesn't seem to work; my code is included below.
I've also done a lot of searching and have found lots of code for displaying APNGs, but have had no luck with code for writing them.
Any hints or pointers would be appreciated.
func generate(gif images: [UIImage], with delay: Float, loop count: Int = 0, _ finished: ((_ status: Bool, _ path: URL?) -> Void)? = nil) {
    currentGIFPath = newGIFPath
    gifQueue.async {
        let gifSettings = [kCGImagePropertyGIFDictionary as String: [kCGImagePropertyGIFLoopCount as String: count]]
        let imageSettings = [kCGImagePropertyGIFDictionary as String: [kCGImagePropertyGIFDelayTime as String: delay]]
        guard let path = self.currentGIFPath else { return }
        guard let destination = CGImageDestinationCreateWithURL(path as CFURL, __UTTypeGIF as! CFString, images.count, nil)
        else { finished?(false, nil); return }
        //logAR.message("\(destination)")
        CGImageDestinationSetProperties(destination, gifSettings as CFDictionary)
        for image in images {
            if let imageRef = image.cgImage {
                CGImageDestinationAddImage(destination, imageRef, imageSettings as CFDictionary)
            }
        }
        if !CGImageDestinationFinalize(destination) {
            finished?(false, nil); return
        } else {
            finished?(true, path)
        }
    }
}
My adaptation of the above code for APNGs (doesn't work; outputs empty file):
func generateAPNG(images: [UIImage], delay: Float, count: Int = 0) {
    let apngSettings = [kCGImagePropertyPNGDictionary as String: [kCGImagePropertyAPNGLoopCount as String: count]]
    let imageSettings = [kCGImagePropertyPNGDictionary as String: [kCGImagePropertyAPNGDelayTime as String: delay]]
    guard let destination = CGImageDestinationCreateWithURL(outputURL as CFURL, UTType.png.identifier as CFString, images.count, nil)
    else { fatalError("Failed") }
    CGImageDestinationSetProperties(destination, apngSettings as CFDictionary)
    for image in images {
        if let imageRef = image.cgImage {
            CGImageDestinationAddImage(destination, imageRef, imageSettings as CFDictionary)
        }
    }
}
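One difference from the working GIF version stands out: the APNG function never calls CGImageDestinationFinalize, and since the destination only writes to disk on finalize, that alone would explain the empty file. A corrected sketch (outputURL is assumed to be a writable .png file URL):
import UIKit
import ImageIO
import UniformTypeIdentifiers

func generateAPNG(images: [UIImage], delay: Float, count: Int = 0, outputURL: URL) {
    let apngSettings = [kCGImagePropertyPNGDictionary as String: [kCGImagePropertyAPNGLoopCount as String: count]]
    let imageSettings = [kCGImagePropertyPNGDictionary as String: [kCGImagePropertyAPNGDelayTime as String: delay]]
    guard let destination = CGImageDestinationCreateWithURL(outputURL as CFURL, UTType.png.identifier as CFString, images.count, nil) else {
        print("Failed to create image destination")
        return
    }
    CGImageDestinationSetProperties(destination, apngSettings as CFDictionary)
    for image in images {
        if let imageRef = image.cgImage {
            CGImageDestinationAddImage(destination, imageRef, imageSettings as CFDictionary)
        }
    }
    // Finalize performs the actual write; without it the file stays empty.
    if !CGImageDestinationFinalize(destination) {
        print("Failed to finalize APNG destination")
    }
}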
Hi everyone! Are there any plans or existing alternatives to include the date a track was added to a playlist in Apple Music's API[1]? This functionality exists on Spotify[2] (their "added_at" attribute), and it would be helpful for ordering tracks retrieved from playlists. Thanks in advance for any help!
[1]https://developer.apple.com/documentation/applemusicapi/get_a_catalog_playlist_s_relationship_directly_by_name
[2]https://developer.spotify.com/documentation/web-api/reference/get-playlists-tracks
I often find that basic actions in MusicKit are incredibly slow compared to Apple's Music app. I've tried different versions, devices, networks, and Apple's sample code throughout the last several years, and it is always the same. Does anyone else have this issue?
I'm using MusicKit on iOS 15 in my app. I want to detect the user changing the player's playback state while the app is in the background.
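A sketch of one route, using the system music player's playback-state notification from MediaPlayer (whether it is delivered while the app is backgrounded is an assumption to verify):
import MediaPlayer

final class PlaybackStateObserver {
    private let player = MPMusicPlayerController.systemMusicPlayer

    func start() {
        player.beginGeneratingPlaybackNotifications()
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(stateChanged),
            name: .MPMusicPlayerControllerPlaybackStateDidChange,
            object: player)
    }

    @objc private func stateChanged() {
        print("Playback state changed: \(player.playbackState.rawValue)")
    }

    deinit {
        player.endGeneratingPlaybackNotifications()
    }
}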
I am developing an app that uses a data cable to link a camera. The first time I enter the page I can detect the camera device, but when I exit the page and enter it again, I can no longer detect the linked camera.
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view.
self.view.backgroundColor = [UIColor whiteColor];
[self addImageCaptureCore];
}
- (void)viewDidAppear:(BOOL)animated {
[super viewDidAppear:animated];
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.0 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
[self checkCameraConnection];
});
}
- (void)checkCameraConnection {
if (@available(iOS 13.0, *)) {
NSArray<ICDevice *> *connectedDevices = self.browser.devices;
if (connectedDevices.count > 0) {
NSLog(@"Camera is connected");
} else {
NSLog(@"Camera is not connected");
}
}
else {
// Fallback on earlier versions
}
}
- (void)viewWillDisappear:(BOOL)animated {
[super viewWillDisappear:animated];
if (@available(iOS 13.0, *)) {
if (self.cameraDevice) {
if (self.cameraDevice.hasOpenSession) {
[self.cameraDevice requestCloseSession];
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.5 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
[self.browser stop];
self.browser.delegate = nil;
self.browser = nil;
});
}
else {
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.5 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
[self.browser stop];
self.browser.delegate = nil;
self.browser = nil;
});
}
}
} else {
// Fallback on earlier versions
}
}
- (void)addImageCaptureCore {
if (@available(iOS 13.0, *)) {
ICDeviceBrowser *browser = [[ICDeviceBrowser alloc] init];
browser.delegate = self;
[browser start];
self.browser = browser;
}
else {
}
}
#pragma mark - ICDeviceBrowserDelegate
- (void)deviceBrowser:(ICDeviceBrowser*)browser didAddDevice:(ICDevice*)device moreComing:(BOOL) moreComing API_AVAILABLE(ios(13.0)){
NSLog(@"Device name = %@",device.name);
if ([device isKindOfClass:[ICCameraDevice class]]) {
if ([device.capabilities containsObject:ICCameraDeviceCanAcceptPTPCommands]) {
ICCameraDevice *cameraDevice = (ICCameraDevice *)device;
cameraDevice.delegate = self;
[cameraDevice requestOpenSession];
self.cameraDevice = cameraDevice;
}
}
}
- (void)deviceBrowser:(ICDeviceBrowser*)browser didRemoveDevice:(ICDevice*)device moreGoing:(BOOL) moreGoing API_AVAILABLE(ios(13.0)){
if (self.cameraDevice) {
if (self.cameraDevice.hasOpenSession) {
[self.cameraDevice requestCloseSession];
self.cameraDevice.delegate = nil;
self.cameraDevice = nil;
}
else {
self.cameraDevice.delegate = nil;
self.cameraDevice = nil;
}
}
}
#pragma mark - ICCameraDeviceDelegate
- (void)cameraDevice:(ICCameraDevice*)camera didAddItems:(NSArray<ICCameraItem*>*) items API_AVAILABLE(ios(13.0)){
if (items.count > 0) {
ICCameraItem *latestItem = items.lastObject;
NSLog(@"name = %@",latestItem.name);
}
}
#pragma mark - ICDeviceDelegate
- (void)device:(ICDevice*)device didOpenSessionWithError:(NSError* _Nullable) error API_AVAILABLE(ios(13.0)){
if (error) {
NSLog(@"Failed to open session %@",error.localizedDescription);
}
else {
NSLog(@"open session success");
}
}
- (void)device:(ICDevice*)device didCloseSessionWithError:(NSError* _Nullable)error API_AVAILABLE(ios(13.0)){
if (error) {
NSLog(@"close session error = %@",error.localizedDescription);
}
else {
NSLog(@"didCloseSession");
}
}
- (void)didRemoveDevice:(ICDevice*)device {
}
When I play a livestream I get the error -12888, "Playlist File unchanged for longer than 1.5 * target duration". I also read about error -12888 on page 170 of the documentation (https://docs.huihoo.com/apple/wwdc/2018/502_measuring_and_optimizing_hls_performance.pdf), but I still don't understand the cause. Can you explain the reason for this error?
Hello everyone,
I was playing a livestream when I received the error -16831, "START-TIME is too close to live", from an AVPlayerItemNewErrorLogEntry. I don't know why this error is returned. Can you explain the reason for this error?