I have noticed a problem when a PHAsset creation request is made with the resource type PHAssetResourceType.photoProxy.
let creationRequest = PHAssetCreationRequest.forAsset()
creationRequest.addResource(with: .photoProxy, data: photoData, options: nil)
creationRequest.location = location
creationRequest.isFavorite = true
After successfully saving the resulting asset through PHPhotoLibrary.shared().performChanges, I could verify it in the Photos app.
The created photo was initially marked as Favorite, and the location was added to its info as expected. The title of the image also changed from "Today" to "".
Shortly afterwards, the photo was refreshed and the location data was purged. The title, however, remained unchanged and still displays the .
This refresh was also observable in code: the PHPhotoLibraryChangeObserver protocol's func photoLibraryDidChange(_ changeInstance: PHChange) receives a change notification for the same asset, and the asset no longer has any location information. The isFavorite value persists correctly.
After debugging for a few hours, I discovered that changing the resource type to .photo fixes this issue. Location data is not removed in the Photos app, and no refresh callback is seen in func photoLibraryDidChange(_ changeInstance: PHChange).
I initially used .photoProxy because, in my AVCapturePhotoCaptureDelegate implementation, the callback I always receive is func photoOutput(_ output: AVCapturePhotoOutput, didFinishCapturingDeferredPhotoProxy deferredPhotoProxy: AVCaptureDeferredPhotoProxy?, error: Error?). This is where I capture the photo data, as photoData = deferredPhotoProxy?.fileDataRepresentation().
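For reference, here is a condensed sketch of the capture path described above; savePhotoToLibrary is a hypothetical helper standing in for the PHAssetCreationRequest code shown earlier:
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishCapturingDeferredPhotoProxy deferredPhotoProxy: AVCaptureDeferredPhotoProxy?,
                 error: Error?) {
    guard error == nil,
          let photoData = deferredPhotoProxy?.fileDataRepresentation() else { return }
    // Saving this data with resource type .photo (instead of .photoProxy) avoided
    // the location purge in my tests; savePhotoToLibrary is a placeholder helper.
    savePhotoToLibrary(photoData)
}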
Hello,
I used kAudioDevicePropertyDeviceIsRunningSomewhere to check if an internal or external microphone is being used.
My code works well for the internal microphone and for microphones connected with a cable.
External microphones connected over Bluetooth, however, do not report their status.
The status request always succeeds, but the property is always reported as inactive.
The main relevant parts of my code:
static inline AudioObjectPropertyAddress
makeGlobalPropertyAddress(AudioObjectPropertySelector selector) {
    AudioObjectPropertyAddress address = {
        selector,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster,
    };
    return address;
}

static BOOL getBoolProperty(AudioDeviceID deviceID,
                            AudioObjectPropertySelector selector) {
    AudioObjectPropertyAddress const address = makeGlobalPropertyAddress(selector);
    UInt32 prop;
    UInt32 propSize = sizeof(prop);
    OSStatus const status =
        AudioObjectGetPropertyData(deviceID, &address, 0, NULL, &propSize, &prop);
    if (status != noErr) {
        return 0; // this line never gets executed in my tests. The call above always
                  // succeeds, but it always gives back a "false" status.
    }
    return static_cast<BOOL>(prop == 1);
}

...

__block BOOL microphoneActive = NO;
iterateThroughAllInputDevices(^(AudioObjectID object, BOOL *stop) {
    if (getBoolProperty(object, kAudioDevicePropertyDeviceIsRunningSomewhere) != 0) {
        microphoneActive = YES;
        *stop = YES;
    }
});
What could cause this and how could it be fixed?
Thank you for your help in advance!
I'm working on a little light and sound controller in Swift, driving DMX lights and audio.
For the audio portion, I need to play a bunch of looping sounds (long-duration MP3s), and occasionally play sound effects (short-duration sounds, varying formats). I want all of this mixed into selected channels on specific devices. That is, I might have one audio stream going to the left channel, and a completely different one going to the right channel.
What's the right API to do this from Swift? Core Audio? AVPlayer stuff?
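For illustration, here is a minimal sketch of the kind of routing I have in mind, assuming AVAudioEngine turns out to be an appropriate choice (the file names are placeholders):
import AVFoundation

// Sketch only: one looping bed panned hard left and one-shot effects panned hard
// right, mixed to the default output device.
let engine = AVAudioEngine()
let loopPlayer = AVAudioPlayerNode()
let effectPlayer = AVAudioPlayerNode()

engine.attach(loopPlayer)
engine.attach(effectPlayer)
engine.connect(loopPlayer, to: engine.mainMixerNode, format: nil)
engine.connect(effectPlayer, to: engine.mainMixerNode, format: nil)

loopPlayer.pan = -1.0   // left channel
effectPlayer.pan = 1.0  // right channel

let loopFile = try AVAudioFile(forReading: URL(fileURLWithPath: "loop.mp3"))
let loopBuffer = AVAudioPCMBuffer(pcmFormat: loopFile.processingFormat,
                                  frameCapacity: AVAudioFrameCount(loopFile.length))!
try loopFile.read(into: loopBuffer)

try engine.start()
loopPlayer.scheduleBuffer(loopBuffer, at: nil, options: .loops, completionHandler: nil)
loopPlayer.play()

// Later, fire a short effect on the other channel.
let effectFile = try AVAudioFile(forReading: URL(fileURLWithPath: "effect.wav"))
effectPlayer.scheduleFile(effectFile, at: nil, completionHandler: nil)
effectPlayer.play()
Whether this, or something lower level (a separate engine per output device, or Core Audio directly), is the recommended way to target specific devices and channels is exactly what I'm unsure about.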
iOS 18 added support for the WebRTC HEVC RFC 7789 RTP Payload Format (112001659). How can I determine whether iOS 18 WebRTC uses hardware or software decoding for HEVC?
Hi,
trying to wrap my head around FxPlug in Xcode. I already sell Final Cut Pro titles for a company. These titles were built in Motion.
However, they want me to move them into an app, and I'm looking for any help on how to accomplish this.
What the app should do is:
Allow users with an active subscription to our website to access the titles within FCPX, and deny access if they are not an active subscriber.
Hi,
Could you please explain how to use SF Symbols animations in Final Cut? I would greatly appreciate your help.
Thank you!
I have generated FCPXML, but I can't figure out the issue:
<?xml version="1.0"?>
<fcpxml version="1.11">
<resources>
<format id="r1" name="FFVideoFormat3840x2160p2997" frameDuration="1001/30000s" width="3840" height="2160" colorSpace="1-1-1 (Rec. 709)"/>
<asset id="video0" name="11a(1-5).mp4" start="0s" hasVideo="1" videoSources="1" duration="6.81s">
<media-rep kind="original-media" src="file:///Volumes/Dropbox/RealMedia Dropbox/Real Media/Media/Test/Test AE videos, City, testOLOLO/video/11a(1-5).mp4"/>
</asset>
<asset id="video1" name="12(4)r8 mute.mp4" start="0s" hasVideo="1" videoSources="1" duration="9.94s">
<media-rep kind="original-media" src="file:///Volumes/Dropbox/RealMedia Dropbox/Real Media/Media/Test/Test AE videos, City, testOLOLO/video/12(4)r8 mute.mp4"/>
</asset>
<asset id="video2" name="13 mute.mp4" start="0s" hasVideo="1" videoSources="1" duration="6.51s">
<media-rep kind="original-media" src="file:///Volumes/Dropbox/RealMedia Dropbox/Real Media/Media/Test/Test AE videos, City, testOLOLO/video/13 mute.mp4"/>
</asset>
<asset id="video3" name="13x (8,14,24,29,38).mp4" start="0s" hasVideo="1" videoSources="1" duration="45.55s">
<media-rep kind="original-media" src="file:///Volumes/Dropbox/RealMedia Dropbox/Real Media/Media/Test/Test AE videos, City, testOLOLO/video/13x (8,14,24,29,38).mp4"/>
</asset>
</resources>
<library>
<event name="Untitled">
<project name="Untitled Project" uid="28B2D4F3-05C4-44E7-8D0B-70A326135EDD" modDate="2024-04-17 15:44:26 -0400">
<sequence format="r1" duration="4802798/30000s" tcStart="0s" tcFormat="NDF" audioLayout="stereo" audioRate="48k">
<spine>
<asset-clip ref="video0" offset="0/10000s" name="11a(1-5).mp4" duration="0/10000s" format="r1" tcFormat="NDF"/>
<asset-clip ref="video1" offset="12119/10000s" name="12(4)r8 mute.mp4" duration="0/10000s" format="r1" tcFormat="NDF"/>
<asset-clip ref="video2" offset="22784/10000s" name="13 mute.mp4" duration="0/10000s" format="r1" tcFormat="NDF"/>
<asset-clip ref="video3" offset="34544/10000s" name="13x (8,14,24,29,38).mp4" duration="0/10000s" format="r1" tcFormat="NDF"/>
</spine>
</sequence>
</project>
</event>
</library>
</fcpxml>
Any ideas?
Hello everyone, I have been receiving this same crash report for the past month whenever I try to export a Final Cut Pro project. The export gets to about 88% completion, then the application crashes and I get the attached report. Any leads on how to fix this would be greatly appreciated! Thank you.
-Lauren
1. In FxPlug 4.3, the FxRemoteWindowAPI protocol does not provide a way to set window.frame.origin.
2. If I create a custom NSWindow, calling [Window setLevel:NSFloatingWindowLevel]; in FxPlug has no effect either.
3. How can I keep my window in front of Final Cut Pro without interfering with the use of Final Cut Pro?
NSPanel *panel = [[myPanel alloc] initWithContentRect:NSMakeRect(100, 100, 400, 300) styleMask:NSWindowStyleMaskTitled | NSWindowStyleMaskClosable
backing:NSBackingStoreBuffered
defer:NO];
[panel setLevel:NSFloatingWindowLevel]; // has no effect????
[panel makeKeyAndOrderFront:self];
Question: in FxPlug 4.3, setLevel cannot place the panel in front of Final Cut Pro and Motion?
Help~~~ I haven't found an answer anywhere!
1.In the FxRemoteWindowAPI protocol, there is no way to set window.frame.origin.
2.When using NSWindow, you cannot set [Window setLevel:NSFloatingWindowLevel].
3.How can I keep the window in front of Final Cut Pro without affecting the normal use of Final Cut Pro?
I'm trying to create code to generate an fcpxml file so I can automate Final Cut Pro timeline (project) creation. Here's an xml element that FCP successfully imports (and successfully creates a project/timeline).
<project name="2013-08-09 19_23_07 (id).mov">
<sequence format="r1">
<spine>
<asset-clip ref="r2" offset="0s" name="2013-08-09 19_23_07 (id).mov" start="146173027/60000s" duration="871871/60000s" tcFormat="DF" audioRole="dialogue"></asset-clip>
</spine>
</sequence>
</project>
The XML element above was generated by exporting a simple timeline with a single clip. The problem I'm having is that the media asset has timecode, which gives a start time relative to that timecode. When I try to remove the timecode attributes and change the start time to "0s"
<asset-clip ref="r2" offset="0s" name="2013-08-09 19_23_07 (id).mov" start="0s" duration="871871/60000s" audioRole="dialogue"></asset-clip>
FCP complains with the import error:
2013-08-09 19_23_07 (id).fcpxml Invalid edit with no respective media. (/fcpxml[1]/project[1]/sequence[1]/spine[1]/asset-clip[1])
I guess the question is, does AVAsset provide a way to get the timecode information and the timecode based start offset, or is there a way to tell FCP to use a default start time independent of timecode?
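For what it's worth, here is a rough sketch of how I imagine the timecode-based start could be read from the asset, assuming a standard 'tmcd' timecode track with 32-bit samples (this is an illustration, not a verified solution):
import AVFoundation
import CoreMedia

// Sketch: read the first sample of the asset's timecode track (if present) and
// return its frame number. Assumes a 32-bit big-endian 'tmcd' sample.
func startTimecodeFrameNumber(of asset: AVAsset) async throws -> Int32? {
    guard let track = try await asset.loadTracks(withMediaType: .timecode).first else {
        return nil // no timecode track
    }
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
    reader.add(output)
    reader.startReading()

    guard let sampleBuffer = output.copyNextSampleBuffer(),
          let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return nil }

    var rawFrameNumber: Int32 = 0
    CMBlockBufferCopyDataBytes(blockBuffer,
                               atOffset: 0,
                               dataLength: MemoryLayout<Int32>.size,
                               destination: &rawFrameNumber)
    return Int32(bigEndian: rawFrameNumber)
}
Turning that frame number into an HH:MM:SS:FF value (or into the start attribute FCP expects) would presumably also need the frame duration and drop-frame flags from the track's CMTimeCodeFormatDescription, which is the part I'm unsure about.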
Xcode 16:
VT_EXPORT void
VT_EXPORT OSStatus
VTPixelTransferSessionCreate(
CM_NULLABLE CFAllocatorRef allocator,
CM_RETURNS_RETAINED_PARAMETER CM_NULLABLE VTPixelTransferSessionRef * CM_NONNULL pixelTransferSessionOut) API_AVAILABLE(macos(10.8), ios(16.0), tvos(16.0), visionos(1.0)) API_UNAVAILABLE(watchos);
Xcode 15:
VT_EXPORT OSStatus
VTPixelTransferSessionCreate(
CM_NULLABLE CFAllocatorRef allocator,
CM_RETURNS_RETAINED_PARAMETER CM_NULLABLE VTPixelTransferSessionRef * CM_NONNULL pixelTransferSessionOut) VT_AVAILABLE_STARTING(10_8);
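If I'm reading the annotations correctly, with the Xcode 16 SDK the call now carries a minimum OS requirement on iOS/tvOS, so it presumably needs an availability guard; a minimal sketch of that assumption:
import VideoToolbox

// Sketch: annotate the wrapper to match the Xcode 16 availability of
// VTPixelTransferSessionCreate (macOS 10.8, iOS 16, tvOS 16).
@available(iOS 16.0, tvOS 16.0, macOS 10.8, *)
func makePixelTransferSession() -> VTPixelTransferSession? {
    var session: VTPixelTransferSession?
    let status = VTPixelTransferSessionCreate(allocator: kCFAllocatorDefault,
                                              pixelTransferSessionOut: &session)
    return status == noErr ? session : nil
}
Call sites built with an earlier iOS deployment target would then wrap the call in an if #available(iOS 16.0, *) check.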
In the Apple Developer documentation at https://developer.apple.com/documentation/avfoundation/avcapturedevice/discoverysession you can find the sentence:
You can also key-value observe this property to monitor changes to the list of available devices.
But how do you use it?
I tried it with the code below and tested it on my MacBook with EarPods.
When I disconnect the EarPods, nothing happens.
MacBook Air M2
macOS Sequoia 15.0.1
Xcode 16.0
import Foundation
import AVFoundation
let discovery_session = AVCaptureDevice.DiscoverySession.init(deviceTypes: [.microphone], mediaType: .audio, position: .unspecified)
let devices = discovery_session.devices
for device in devices {
print(device.localizedName)
}
let device = devices[0]
let observer = Observer()
discovery_session.addObserver(observer, forKeyPath: "devices", options: [.new, .old], context: nil)
let input = try! AVCaptureDeviceInput(device: device)
let queue = DispatchQueue(label: "queue")
var output = AVCaptureAudioDataOutput()
let delegate = OutputDelegate()
output.setSampleBufferDelegate(delegate, queue: queue)
var session = AVCaptureSession()
session.beginConfiguration()
session.addInput(input)
session.addOutput(output)
session.commitConfiguration()
session.startRunning()
let group = DispatchGroup()
let q = DispatchQueue(label: "", attributes: .concurrent)
q.async(group: group, execute: DispatchWorkItem() {
sleep(10)
session.stopRunning()
})
_ = group.wait(timeout: .distantFuture)
class Observer: NSObject {
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
print("Change")
if keyPath == "devices" {
if let newDevices = change?[.newKey] as? [AVCaptureDevice] {
print("New devices: \(newDevices.map { $0.localizedName })")
}
if let oldDevices = change?[.oldKey] as? [AVCaptureDevice] {
print("Old devices: \(oldDevices.map { $0.localizedName })")
}
}
}
}
class OutputDelegate : NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
print("Output")
}
}
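If KVO on devices turns out not to fire for this case, one alternative worth trying (an assumption on my part, not a confirmed fix) is listening for the connect/disconnect notifications that AVCaptureDevice posts:
import AVFoundation

// Sketch: observe device connection changes via notifications instead of KVO.
// queue: nil delivers the block on the posting thread, which avoids relying on
// the main run loop in a command-line script like the one above.
let center = NotificationCenter.default

let connectToken = center.addObserver(forName: AVCaptureDevice.wasConnectedNotification,
                                      object: nil, queue: nil) { notification in
    if let device = notification.object as? AVCaptureDevice {
        print("Connected: \(device.localizedName)")
    }
}

let disconnectToken = center.addObserver(forName: AVCaptureDevice.wasDisconnectedNotification,
                                         object: nil, queue: nil) { notification in
    if let device = notification.object as? AVCaptureDevice {
        print("Disconnected: \(device.localizedName)")
    }
}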
Good afternoon
since I installed iOS 18 on my iPhone 15 Pro, I have had problems using Apple CarPlay with my Ford Puma with SYNC 3. More specifically, problems with audio commands, selecting audio tracks, Bluetooth, etc.
Are you aware of this?
Thanks a lot
Regards
Alberto
I have a recent post outlining a similar question here. This time, though, I'm confident that inserting an array of Track into ApplicationMusicPlayer.shared.queue works, but I'm not sure how to initialize the queue so that it displays the song title and artwork, for example. I'm also not sure how to get the current queue item's artist and album information, which I feel should be easy, so maybe I'm missing something obvious. Hopefully this paints a picture of what I'm trying to do; I'm posting the necessary code here to help debug/figure out this problem.
import SwiftUI
import MusicKit
struct PlayBackView: View {
@Environment(\.scenePhase) var scenePhase
@Environment(\.openURL) private var openURL
// Adding Enum Here for Question Sake
enum PlayState {
case play
case pause
}
@State var song: Track
@Binding var songs: [Track]?
@State var isShuffled = false
@State private var playState: PlayState = .pause
@State private var songTimer: Int = Int.random(in: 5...30)
@State private var roundTimer: Int = 5
@State private var isTimerActive = false
// @State private var volumeValue = VolumeObserver()
@State private var isFirstPlay = true
@State private var isDancing = false
@State private var player = ApplicationMusicPlayer.shared
private var isPlaying: Bool {
return (player.state.playbackStatus == .playing)
}
let timer = Timer.publish(every: 1, on: .main, in: .common).autoconnect()
var playPauseImage: String {
switch playState {
case .play:
"pause.fill"
case .pause:
"play.fill"
}
}
var body: some View {
VStack {
// Album Cover
HStack(spacing: 20) {
if let artwork = player.queue.currentEntry?.artwork {
ArtworkImage(artwork, height: 100)
} else {
Image(systemName: "music.note")
.resizable()
.frame(width: 100, height: 100)
}
VStack(alignment: .leading) {
/*
This is where I want to display song title, album title, and artist name for example
*/
// Song Title
Text(player.queue.currentEntry?.title ?? "Unable to Find Song Title")
.font(.title)
.fixedSize(horizontal: false, vertical: true)
// Album Title
// Text(player.queue.currentEntry ?? "Album Title Not Found")
// .font(.caption)
// .fixedSize(horizontal: false, vertical: true)
// I don't know what the subtitle actually grabs
Text(player.queue.currentEntry?.subtitle ?? "Artist Name Not Found")
.font(.caption)
}
}
.padding()
// Play/Pause Button
Button(action: {
handlePlayButton()
isFirstPlay = false
}, label: {
Text(playState == .play ? "Pause" : isFirstPlay ? "Play" : "Resume")
.frame(maxWidth: .infinity)
})
.buttonStyle(.borderedProminent)
.padding()
.font(.largeTitle)
.tint(.red)
}
.padding()
// Maybe I should use the `.task` modifier here?
.onAppear {
// I'm sure this code could be improved but don't think it'll help answer the question at the moment.
Task {
if let songs = songs {
do {
if isShuffled {
let shuffledSongs = songs.shuffled()
try await player.queue.insert(shuffledSongs, position: .tail)
handlePlayButton()
} else {
try await player.queue.insert(songs, position: .tail)
}
} catch {
print(error.localizedDescription)
}
}
}
}
}
private func handlePlayButton() {
Task {
if isPlaying {
player.pause()
playState = .pause
isTimerActive = false
} else {
playState = .play
await playTrack()
isTimerActive = true
}
}
}
@MainActor
private func playTrack() async {
do {
try await player.play()
} catch {
print(error.localizedDescription)
}
}
}
//#Preview {
// PlayBackView()
//}
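For the artist/album part of the question, one approach that looks plausible to me (an assumption based on the queue entry's item property, not something I've confirmed end to end) is to unwrap the underlying Song or MusicVideo:
import MusicKit

// Sketch: pull artist and album strings out of the current queue entry, assuming
// the entry wraps a Song or MusicVideo.
func currentEntryInfo(for player: ApplicationMusicPlayer) -> (artist: String, album: String)? {
    guard let entry = player.queue.currentEntry, let item = entry.item else { return nil }
    switch item {
    case .song(let song):
        return (song.artistName, song.albumTitle ?? "Unknown Album")
    case .musicVideo(let video):
        return (video.artistName, video.albumTitle ?? "Unknown Album")
    @unknown default:
        return nil
    }
}
The Text views in the VStack above could then show the artist and album strings from this helper instead of relying on subtitle.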
Hello everyone, I am using the QRCodeScanner library in my project. QR code scanning was working in earlier iPadOS versions, but in iPadOS 18 it has stopped working.
Case-ID: 9391388
Our application uses timed metadata as part of a rating control system.
We noticed a problem in production, and diagnosis shows that we stop receiving timed metadata on iOS 18 only.
Our live streams are primed with metadata at least once per second, but we are seeing extended gaps in receiving this content, in excess of 10 minutes.
We have also observed that this happens more as the player climbs the bitrate ladder, and it doesn't happen if we cap to a low resolution,
i.e. a preferredMaximumResolution of 768x432.
Furthermore, if we throttle network conditions after we stop receiving metadata, we start receiving it again.
The following is a simple example that demonstrates the above behaviour. Unfortunately, I cannot share the metadata-primed live stream endpoint publicly, but I can provide it privately to Apple to reproduce the problem.
import UIKit
import AVKit
class ViewController: UIViewController, AVPlayerItemMetadataOutputPushDelegate {
var player: AVPlayer?
var itemMetadataOutput: AVPlayerItemMetadataOutput?
override func viewDidAppear(_ animated: Bool) {
guard let url = URL(string: "endpoint redacted") else {
return
}
let player = AVPlayer(url: url)
let controller = AVPlayerViewController()
controller.player = player
self.player = player
present(controller, animated: true) {
player.play()
let currentItem = player.currentItem
let itemMetadataOutput = AVPlayerItemMetadataOutput(identifiers: nil)
self.itemMetadataOutput = itemMetadataOutput
self.itemMetadataOutput?.setDelegate(self, queue: .main)
currentItem?.add(itemMetadataOutput)
}
}
public func metadataOutput(_ output: AVPlayerItemMetadataOutput,
didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
from track: AVPlayerItemTrack?) {
print("received metadata \(Date())")
}
}
I’m currently working on an iOS project that involves loading and playing stereoscopic/spatial videos. I’m using the AVFoundation framework, specifically AVURLAsset, but I’m having trouble determining how to correctly load and handle stereoscopic videos.
I would like to know:
Any guidance or code snippets would be greatly appreciated; I'm not understanding the Apple developer videos very well...
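In case it helps to show where I am, here is a minimal sketch of how I currently assume stereoscopic detection might work with AVURLAsset, using the iOS 17+ media characteristic (this is an assumption on my part, not something I've verified against real spatial content):
import AVFoundation

// Sketch: check whether any video track advertises stereo multiview content.
@available(iOS 17.0, macOS 14.0, visionOS 1.0, *)
func isStereoscopic(url: URL) async throws -> Bool {
    let asset = AVURLAsset(url: url)
    let videoTracks = try await asset.loadTracks(withMediaType: .video)
    for track in videoTracks {
        let characteristics = try await track.load(.mediaCharacteristics)
        if characteristics.contains(.containsStereoMultiviewVideo) {
            return true
        }
    }
    return false
}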
Thank you in advance for your help!
Best,
Lau
Hi,
In my app I am using MusicLibraryRequest<Artist> to fetch all of the artists in someone's library collection. With this response I then fetch each artist's albums: artist.with([.albums]).
The response from this only gives albums in the user's library collection. I would like to augment it with all of the albums for an artist from the full catalogue.
I'm using MusicKit and targeting iOS18 and visionOS 2.
Could someone please point me towards the best way to approach this?
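One approach that seems plausible to me (an assumption, not confirmed as the intended pattern) is to resolve each library artist against the catalog by name and then load that catalog artist's albums:
import MusicKit

// Sketch: given a library Artist, search the Apple Music catalog for a matching
// artist and load their catalog albums. Matching by name is a simplification and
// may need refinement for artists with ambiguous names.
func catalogAlbums(forLibraryArtist libraryArtist: Artist) async throws -> MusicItemCollection<Album>? {
    var searchRequest = MusicCatalogSearchRequest(term: libraryArtist.name, types: [Artist.self])
    searchRequest.limit = 1
    let searchResponse = try await searchRequest.response()

    guard let catalogArtist = searchResponse.artists.first else { return nil }

    let detailedArtist = try await catalogArtist.with([.albums])
    return detailedArtist.albums
}
The returned collection may be paginated, so artists with large discographies would presumably need hasNextBatch / nextBatch() to fetch everything.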