Sandbox link: https://codesandbox.io/p/sandbox/webrtc-ios-lasted-issue-jzx9h5
Issue: black video screen when the URL changes.
Steps to reproduce:
Get the source code from the sandbox repo above.
Install packages with "npm install".
Start the local web app under HTTPS with "HTTPS=true npm start".
Update the URL by clicking the "Update URL search param" button.
OS: iOS 17.4.1
Browser: Safari
Device: iPhone 11 Pro
Can anyone help?
Note: it works on an iPhone X running iOS 16.
Video of the issue: https://streamable.com/rj07u8
Is it possible to edit the Grouping of a song from the user's library using the Apple Music API?
According to what I read in the documentation, this is not possible, but it's something I would really love, so I'd better ask.
Thanks in advance!
I am using the command line below to generate AES-128 HLS:
mediafilesegmenter -iso-fragmented --encrypt-key-file=my.key -S -f /Volumes/Samsung/pattern/vision_pro/hls/*** /Volumes/Samsung/pattern/vision_pro/***.mov
but it always generates SAMPLE-AES, even after I removed -S:
#EXT-X-KEY:METHOD=SAMPLE-AES,URI="enc.key",IV=0x7316166d6a85f56f3d4606eaebc3aa44
How can I generate AES-128 HLS? Thanks.
I have an old ScreenCaptureKit sample downloaded in Oct 2022.
That sample worked in Oct 2022, but as of Apr 2024 it does not work on Sonoma 14.4.1 on an M1 MacBook. It only shows a black screen.
I also downloaded the updated ScreenCaptureKit sample and tested it. It works on Sonoma 14.4.1 on the M1 MacBook. I noticed the latest sample has SCContentSharingPicker and other changes.
My screen capture application is based on the old ScreenCaptureKit sample, and it only shows a black screen.
Do I have to add SCContentSharingPicker and SCContentSharingPickerObserver to my application to capture the screen on Sonoma?
Is the old way of capturing the screen, without SCContentSharingPicker, no longer supported on Sonoma?
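For reference, my understanding of the picker flow in the updated sample, as a minimal sketch using the public SCContentSharingPicker API on macOS 14+ (my own reduction, not code copied from the sample; I have not confirmed this is actually required):

import ScreenCaptureKit

final class PickerCoordinator: NSObject, SCContentSharingPickerObserver {
    func activatePicker() {
        let picker = SCContentSharingPicker.shared
        picker.add(self)        // register for picker callbacks
        picker.isActive = true  // enable the system picker
        picker.present()        // show the picker UI to the user
    }

    // Called when the user picks (or changes) the content to share.
    func contentSharingPicker(_ picker: SCContentSharingPicker,
                              didUpdateWith filter: SCContentFilter,
                              for stream: SCStream?) {
        // Start a new SCStream with `filter`, or update the existing stream.
    }

    // Called when the user dismisses the picker without choosing content.
    func contentSharingPicker(_ picker: SCContentSharingPicker,
                              didCancelFor stream: SCStream?) {}

    // Called if the system picker fails to start.
    func contentSharingPickerStartDidFailWithError(_ error: Error) {}
}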
I am using MusicKit's ApplicationMusicPlayer to play music in my app. Everything works fine as long as I'm not playing large playlists that contain hundreds of songs. When I try to play a collection of more than roughly 300 songs, I always get an error message saying:
"Prepare to play failed" UserInfo={NSDebugDescription=Prepare to play failed, NSUnderlyingError=0x121d42dc0 {Error Domain=MPMusicPlayerControllerErrorDomain Code=9 "Remote call timed out" UserInfo={NSDebugDescription=Remote call timed out}}}))
It doesn't matter whether the songs are downloaded to the device or not.
I am aware that there is another initializer for the player's queue that accepts Playlist instances, but in my app users can sort playlist tracks in a different order than the default, which makes that initializer not feasible for me.
I tried everything I could think of. I even fell back on MPMusicPlayerController and passed an array of MPMusicPlayerPlayParameters to it, but the result was the same.
typealias QueueEntry = ApplicationMusicPlayer.Queue.Entry

let player = ApplicationMusicPlayer.shared
let entries: [QueueEntry] = tracks
    .compactMap {
        guard let song = $0 as? Song else { return nil }
        return QueueEntry(song)
    }

Task(priority: .high) { [player] in
    do {
        player.queue = .init(entries, startingAt: nil)
        try await player.play() // prepareToPlay failed
    } catch {
        print(error)
    }
}
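One workaround I'm considering, purely as an untested sketch (it assumes the timeout scales with the size of the initial queue, which I have not verified, and the chunk size of 100 is arbitrary): seed the queue with a small first chunk and append the remainder after playback starts, using MusicPlayer.Queue's insert(_:position:):

let songs: [Song] = tracks.compactMap { $0 as? Song }
let firstChunk = Array(songs.prefix(100))
let remainder = Array(songs.dropFirst(100))

Task {
    do {
        // Start playback with a queue small enough to prepare quickly.
        player.queue = ApplicationMusicPlayer.Queue(for: firstChunk)
        try await player.play()
        // Append the rest, preserving the user's custom order.
        try await player.queue.insert(remainder, position: .tail)
    } catch {
        print(error)
    }
}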
Hi team, some of our users are getting a crash in QuartzCore, but we are not sure of the exact reason for it. Can you please help us with it? The app is crashing in production.
Xcode version: 15.0
Platform: iOS
Below is the crash stack trace:
Crashed: com.apple.main-thread
0 libsystem_kernel.dylib 0xa974 __pthread_kill + 8
1 libsystem_pthread.dylib 0x60ec pthread_kill + 268
2 libsystem_c.dylib 0x75b80 abort + 180
3 QuartzCore 0x98ba8 CA::Render::Encoder::grow(unsigned long) + 288
4 QuartzCore 0x97e50 CA::Render::Vector::encode(CA::Render::Encoder*) const + 112
5 QuartzCore 0x10a76c CA::Render::KeyframeAnimation::encode(CA::Render::Encoder*) const + 68
6 QuartzCore 0x975ec CA::Render::Array::encode(CA::Render::Encoder*) const + 172
7 QuartzCore 0x75204 CA::Context::commit_animation(CA::Layer*, CA::Render::Animation*, void*) + 236
8 QuartzCore 0x72998 CA::Layer::commit_animations(CA::Transaction*, double ()(CA::Layer, double, void*), void ()(CA::Layer, CA::Render::Animation*, void*), void ()(CA::Layer, __CFString const*, void*), CA::Render::TimingList* ()(CA::Layer, void*), void*) + 956
9 QuartzCore 0x2b930 invocation function for block in CA::Context::commit_transaction(CA::Transaction*, double, double*) + 148
10 QuartzCore 0x2b838 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 368
11 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
12 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
13 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
14 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
15 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
16 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
17 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
18 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
19 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
20 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
21 QuartzCore 0x6f5b0 CA::Context::commit_transaction(CA::Transaction*, double, double*) + 11212
22 QuartzCore 0x661bc CA::Transaction::commit() + 648
23 QuartzCore 0x65e64 CA::Transaction::flush_as_runloop_observer(bool) + 88
24 CoreFoundation 0x35d3c CFRUNLOOP_IS_CALLING_OUT_TO_AN_OBSERVER_CALLBACK_FUNCTION + 36
25 CoreFoundation 0x34738 __CFRunLoopDoObservers + 552
26 CoreFoundation 0x33e50 __CFRunLoopRun + 1028
27 CoreFoundation 0x33968 CFRunLoopRunSpecific + 608
28 GraphicsServices 0x34e0 GSEventRunModal + 164
29 UIKitCore 0x22aedc -[UIApplication _run] + 888
30 UIKitCore 0x22a518 UIApplicationMain + 340
31 SwiftUI 0x1033860 OUTLINED_FUNCTION_39 + 600
32 SwiftUI 0x10336a8 OUTLINED_FUNCTION_39 + 160
33 SwiftUI 0xc4f9fc get_witness_table 7SwiftUI4ViewRzlAA15ModifiedContentVyxAA30_EnvironmentKeyWritingModifierVySbGGAaBHPxAaBHD1__AgA0cI0HPyHCHCTm + 364
34 Evie Ring 0x324620 main + 10 (MovanoRingApp.swift:10)
35 ??? 0x1ad632d84 (Missing)
I am working on a ScreenCaptureKit sample with SCContentSharingPickerObserver.
My target is SwiftUI-based and calls an Objective-C class method. I added [MyTarget]-Bridging.h and [MyTarget]-Swift.h.
I get a compile error, unknown class name SCContentSharingPickerObserver, in [MyTarget]-Swift.h, but I do not know how to fix it since [MyTarget]-Swift.h is an Xcode-generated file.
My macOS deployment target is 14.0 and the Swift language version is 5.
Does anyone know how to fix this error, or should I wait for an Xcode update?
When accessing the REST API, if you apply 'include=albums' to a 'catalog//songs' endpoint request with a filter on ISRC, the API will, without fail, return a 504 error status.
If you remove 'include=albums' and/or replace it with something like 'include=artists', it works fine.
It has been like this for months, and we need to get album details back with these requests.
Could the Apple team please respond and verify the issue, as it's blocking production for us.
Thanks.
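For illustration, the request shape being described, with a hypothetical storefront of us and a placeholder ISRC:

GET https://api.music.apple.com/v1/catalog/us/songs?filter[isrc]={isrc}&include=albums   <- returns 504
GET https://api.music.apple.com/v1/catalog/us/songs?filter[isrc]={isrc}&include=artists  <- works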
I am using ScreenCaptureKit to capture windows and record them. But from macOS Sonoma onwards I see a weird behaviour when I try to capture a window that is in full screen mode: the CMSampleBuffer returned by ScreenCaptureKit has empty space at the top of the full screen window content, and the contentRect attachment in the CMSampleBuffer includes this empty space. So there is no way to know what the actual window content in the CMSampleBuffer is.
The CaptureSample sample code provided by Apple does not enumerate full screen windows. I changed it to enumerate them, and the issue reproduces there as well.
Attaching an image showing the empty space. Has anybody encountered this issue?
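For reference, I read the attachment roughly like this (a sketch following the pattern in Apple's ScreenCaptureKit sample code):

import CoreMedia
import ScreenCaptureKit

// Returns the contentRect attachment of a ScreenCaptureKit sample buffer.
// On the affected full screen windows, this rect includes the empty space.
func contentRect(of sampleBuffer: CMSampleBuffer) -> CGRect? {
    guard let attachments = CMSampleBufferGetSampleAttachmentsArray(
            sampleBuffer, createIfNecessary: false) as? [[SCStreamFrameInfo: Any]],
          let info = attachments.first,
          let rectValue = info[.contentRect]
    else { return nil }
    return CGRect(dictionaryRepresentation: rectValue as! CFDictionary)
}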
When I fetch images and pass them to
await faceapi.fetchImage(label);
I get ReferenceError: Can't find variable: FileReader.
Please help with this.
Hi,
I just found that I cannot play the 3D movie stream (MV-HEVC/UHD/Dolby Vision/Dolby Digital/Dolby Atmos) from https://developer.apple.com/streaming/examples.
When I click View 3D example (fMP4), I get the issue below. My macOS is 14.4.1 on an M2 chip.
Hi,
I just generated an HDR10 MV-HEVC file; the mediainfo is below:
Color range : Limited
Color primaries : BT.2020
Transfer characteristics : PQ
Matrix coefficients : BT.2020 non-constant
Codec configuration box : hvcC+lhvC
Then I generated the segment files with the command below:
mediafilesegmenter --iso-fragmented -t 4 -f av_1 av_new_1.mov
then uploaded the segment files and prog_index.m3u8 to a web server.
I find that I cannot play the HLS stream in Safari; the URL is http://ip/vod/prog_index.m3u8.
However, if I remove the Transfer characteristics: PQ tag when generating the MV-HEVC file, run the same mediafilesegmenter command, and upload the files to the web server, the new version of the HLS stream does play in Safari.
Is there any way to play HLS PQ video in Safari? Thanks.
Hey,
I've been trying to fetch my recently played Apple Music songs for an app I'm working on, and I want to access the lastPlayedDate field. If I'm not mistaken, this field should exist for a Song according to Apple's documentation:
https://developer.apple.com/documentation/musickit/song/lastplayeddate
However, whenever I try to fetch this data, the lastPlayedDate field is always nil. All the other data I'm looking for fetches without issue. Here's the code I'm using:
//Request as described in Apple MusicKit
//https://developer.apple.com/documentation/musickit/musicrecentlyplayedrequestable
var request = MusicRecentlyPlayedRequest<Song>()
request.limit = 30
do {
    let response = try await request.response()
    let songs = response.items.compactMap { song -> RecentlyPlayedSong? in
        let songName = song.title
        let songArtist = song.artistName
        let songAlbum = song.albumTitle
        let artwork: MusicArtworkType
        let preview_url = song.previewAssets?.first?.url?.absoluteString
        if let appleMusicArtwork = song.artwork {
            print("Found a song, \(song) with lastPlayedDate \(song.lastPlayedDate)")
            artwork = .AppleMusic(appleMusicArtwork)
            return RecentlyPlayedSong(name: songName, artist: songArtist, album: songAlbum, artwork: artwork, preview_url: preview_url, lastPlayedDate: song.lastPlayedDate ?? Date())
        }
        return nil // no artwork available; skip this entry
    }
} catch {
    print(error)
}
I'm trying to map the response into a custom struct I made, but here's a sample of what's getting printed to the logs:
Found a song, Song(id: "1676362342", title: "pwdr Blu (feat. Brother.)", artistName: "Kx5, deadmau5 & Kaskade") with lastPlayedDate nil
Found a song, Song(id: "881289980", title: "Worlds Apart (feat. Kerli)", artistName: "Seven Lions") with lastPlayedDate nil
Found a song, Song(id: "1501540431", title: "What’s Done Is Done", artistName: "Seven Lions & HALIENE") with lastPlayedDate nil
Even though I just listened to these songs a few minutes ago. Has anyone run into this issue before? Are there any settings I need to change to get this to show?
I'm trying to generate a developer token with the following code in Node.js:
const jwt = require('jsonwebtoken')
const { TEAM_ID, KID, APPLE_PRIVATE_KEY } = require('./secret')

const expiration = 36000;
const currentTime = Math.floor(Date.now() / 1000);
const expirationTime = currentTime + expiration;

const options = {
    algorithm: 'ES256',
    header: {
        alg: "ES256",
        kid: KID
    }
}

const payload = {
    iss: TEAM_ID,
    iat: currentTime,
    exp: expirationTime
}

const newToken = jwt.sign(payload, APPLE_PRIVATE_KEY, options)
console.log('1111111111111111111111', newToken)
When testing my newToken with curl, I'm getting a 401 response.
Please help.
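The curl check is something like the following, against the Apple Music API's token-test endpoint (assuming the generated token is in $TOKEN):

curl -H "Authorization: Bearer $TOKEN" https://api.music.apple.com/v1/test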
How can we implement MusicKit in a Flutter app?
I'm using the systemMusicPlayer to play music and want to update the playback time using addObserver forKeyPath.
[self setMusicPlayer: [MPMusicPlayerController systemMusicPlayer]];
I've tried these two methods:
[self addObserver:self forKeyPath:@"musicPlayer.currentPlaybackTime" options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionInitial context:&musicPlayer];
[self.musicPlayer addObserver:self forKeyPath:@"currentPlaybackTime" options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionInitial context:&musicPlayer];
I do get the initial values for currentPlaybackTime in:
-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
but I never get any calls when the player is playing the song (the whole point).
If I set currentPlaybackTime to a specific value (scrubbing manually with a slider), I get calls with the values I set (useless, since I already know what I'm setting them to).
How are we supposed to track the playback time without just polling it constantly?
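For completeness, the polling fallback mentioned above would look roughly like this (sketched in Swift for brevity; the 0.5 s interval is arbitrary):

import MediaPlayer

let player = MPMusicPlayerController.systemMusicPlayer
// currentPlaybackTime does not appear to emit KVO change notifications
// while playing, so read it periodically on the main run loop instead.
let timer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { _ in
    let time = player.currentPlaybackTime
    // Update the playback UI with `time` here.
}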
Since iOS 12 it has become difficult to detect the end of playback using the system music player.
In earlier iOS versions, the now-playing item would be set to nil and you would receive a notification that the player had stopped.
In iOS 12 and later, nowPlayingItem still contains the current song, and the only notification you get is MPMusicPlayerControllerPlaybackStateDidChangeNotification with the playbackState set to MPMusicPlaybackStatePaused.
Pressing pause in my car (or from any remote control) generates the same conditions, making it difficult to correctly detect the difference.
It would be nice if a notification were added for playback finishing (similar to the other players).
Any suggestions?
So I meant to make a shared album but made a shared library. That being said, I deleted the shared album, but my family cannot remove it from their phones.
I am capturing a screenshot with SCScreenshotManager's captureImageWithFilter. The resulting PNG has the same resolution as a PNG taken with Command-Shift-3 (4112x2658) but is 10x larger (14.4 MB vs 1.35 MB).
My SCStreamConfiguration uses the SCDisplay's width and height and sets the color space to kCGColorSpaceSRGB.
I currently save to a file by initializing an NSBitmapImageRep using initWithCGImage, representing it as PNG with representationUsingType: NSBitmapImageFileTypePNG, and then calling writeToFile:atomically:.
Is there some configuration or compression I can use to bring the PNG size more closely in line with a Command-Shift-3 screenshot?
Thanks!
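In code, the save path described above is roughly this (simplified, in Swift):

import AppKit

// Encode a CGImage as PNG via NSBitmapImageRep and write it atomically.
func savePNG(_ image: CGImage, to url: URL) throws {
    let rep = NSBitmapImageRep(cgImage: image)
    guard let data = rep.representation(using: .png, properties: [:]) else {
        throw CocoaError(.fileWriteUnknown)
    }
    try data.write(to: url, options: .atomic)
}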
When I connect my MacBook to my living room AirPort (an older-gen wall wart) via the Music app, the music output in both rooms is synced.
When I try to set up a Multi-Output Device in Audio MIDI Setup, I'm not able to get them synced. I'm outputting to the same devices, they're all at the same sample rate, and I've played with the various settings (Primary Clock Source and Drift Sync). What gives? How are these connections different?
Intel MacBook Pro 2018 running Sonoma 14.5