Hey everyone,
I'm working on an iOS app where I use AVPlayer to play videos and then process them through Metal to apply effects. The app has controls that let users tweak these effects in real time, and I want the final processed video to be streamed via AirPlay. I use a custom rendering layer backed by a Metal texture to display the processed video on the screen, and that works as intended.
The problem is that when I try to AirPlay the video after feeding it the processed Metal frames, it just streams the original video from AVPlayer, not the version with all the Metal effects.
The final processed output is a Metal texture that gets rendered in an MTKView. I even tried capturing that texture and sending it through a new AVPlayer setup, but AirPlay still grabs the original, unprocessed video instead of the final, fully rendered output. The AirPlayed video also has the full duration of the original built in, so it isn't even 'live streaming' the wrong feed.
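For reference, here is roughly how I capture the processed texture today (a minimal sketch, assuming a BGRA texture; the function name and the blocking CPU copy are mine, and a real pipeline would presumably use a CVMetalTextureCache-backed pixel buffer pool instead):

import CoreVideo
import Metal

// Sketch: copy a processed BGRA Metal texture into a CVPixelBuffer so it
// can be wrapped in a CMSampleBuffer and enqueued on something like an
// AVSampleBufferDisplayLayer, instead of letting AVPlayer's original
// frames reach AirPlay.
func makePixelBuffer(from texture: MTLTexture) -> CVPixelBuffer? {
    var pixelBuffer: CVPixelBuffer?
    let attrs = [kCVPixelBufferIOSurfacePropertiesKey as String: [:]] as CFDictionary
    guard CVPixelBufferCreate(kCFAllocatorDefault, texture.width, texture.height,
                              kCVPixelFormatType_32BGRA, attrs, &pixelBuffer) == kCVReturnSuccess,
          let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    // Blocking CPU copy of the GPU result; fine for a test, slow for production.
    if let base = CVPixelBufferGetBaseAddress(buffer) {
        texture.getBytes(base,
                         bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                         from: MTLRegionMake2D(0, 0, texture.width, texture.height),
                         mipmapLevel: 0)
    }
    return buffer
}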
I need help figuring out how to make AirPlay stream the live, processed video with all the effects, not just the raw video. Any ideas? Happy to share my code if that helps but I'm not sure I have the right underlying approach yet.
Thanks!
We have a production Metal app with a complex multithreaded Metal pipeline.
When everything is operating smoothly, it works great.
Even when extremely overloaded, it can go days at a time without crashing, but that still isn't good enough for our users.
Unfortunately, since I have zero visibility into an id<MTLBuffer>'s lifetime, I have no way of knowing when Metal is "done" with it.
When overloaded, stale Metal render passes need to be 'aborted', which results in Metal callbacks not being called.
For example, these callbacks may not be called after an aborted pass:
id<MTLCommandBuffer> m_cmdbuf;

[m_cmdbuf addScheduledHandler:^(id<MTLCommandBuffer> cb) {
    cpr->scheduled = MachAbsoluteTime();
}];

[m_cmdbuf addCompletedHandler:^(id<MTLCommandBuffer> cb) {
    cpr->completed = MachAbsoluteTime();
}];
For the moment, our workaround is a system which waits a few seconds after we "think" a rendering pass should be done with all its (aborted) resources before releasing buffers. This is not ideal, to say the least.
So, in summary, my question is: it would be nice to be able to 'query' an id<MTLBuffer> to know when Metal is done with it, so that we know it's safe to release it along with our own internal resources.
Is there any such (undocumented) mechanism? I have exhaustively read all existing Metal documentation many times.
An idea that I've been toying with: it would be nice to have something akin to Zombie detection running all the time, but for id<MTLBuffer> and id<MTLTexture> objects only.
In OpenGL, it was OK to use a released texture: you might display a bad frame, but not CRASH! Is there any similar option for id<MTLTexture>?
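For what it's worth, the closest approximation we've found (Swift for brevity; the function names are ours) is polling MTLCommandBuffer's documented status property, plus the usual idiom of capturing resources in the completed handler so ARC keeps them alive until the GPU finishes. Neither helps when the handlers never fire for an aborted pass:

import Metal

// The closest thing to a completion 'query' we know of: MTLCommandBuffer
// exposes a documented, pollable status.
func isGPUDone(with commandBuffer: MTLCommandBuffer) -> Bool {
    return commandBuffer.status == .completed || commandBuffer.status == .error
}

// The usual lifetime idiom: the strong capture of `resources` is released
// only when the completed handler runs, i.e. after the GPU is finished.
// That is exactly what breaks for us when a pass is aborted and the
// handler is never invoked.
func encodeAndTrack(commandBuffer: MTLCommandBuffer, resources: [MTLBuffer]) {
    commandBuffer.addCompletedHandler { _ in
        _ = resources // released here, after GPU completion
    }
    commandBuffer.commit()
}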
We have been having a mysterious crash in our media server app that I've never seen before. After fixing a number of other rare thread-safety crashes relating to Metal buffers, this rare crash now happens inside a Metal dispatch queue named com.Metal.CompletionQueueDispatch.
I have no clue what is happening here. It looks to me like Metal ends up calling abort(): judging by the demangling_terminate_handler and _objc_terminate frames below, abort() is reached via std::terminate, i.e. something thrown on the completion queue is going uncaught.
All of the other threads in the crash log appear to be in a normal state.
Thread 70 Crashed:: updateAllMedia Dispatch queue: com.Metal.CompletionQueueDispatch
0 libsystem_kernel.dylib 0x1af572d38 __pthread_kill + 8
1 libsystem_pthread.dylib 0x1af5a7ee0 pthread_kill + 288
2 libsystem_c.dylib 0x1af4e2330 abort + 168
3 libc++abi.dylib 0x1af562b18 abort_message + 132
4 libc++abi.dylib 0x1af552a3c demangling_terminate_handler() + 312
5 libobjc.A.dylib 0x1af4481c8 _objc_terminate() + 160
6 libc++abi.dylib 0x1af561eb4 std::__terminate(void (*)()) + 20
7 libc++abi.dylib 0x1af561e50 std::terminate() + 64
8 libdispatch.dylib 0x1af3e4288 _dispatch_client_callout4 + 40
9 libdispatch.dylib 0x1af40053c _dispatch_mach_msg_invoke + 464
10 libdispatch.dylib 0x1af3eb784 _dispatch_lane_serial_drain + 376
11 libdispatch.dylib 0x1af40125c _dispatch_mach_invoke + 456
12 libdispatch.dylib 0x1af3eb784 _dispatch_lane_serial_drain + 376
13 libdispatch.dylib 0x1af3ec438 _dispatch_lane_invoke + 444
14 libdispatch.dylib 0x1af3eb784 _dispatch_lane_serial_drain + 376
15 libdispatch.dylib 0x1af3ec404 _dispatch_lane_invoke + 392
16 libdispatch.dylib 0x1af3f6c98 _dispatch_workloop_worker_thread + 648
17 libsystem_pthread.dylib 0x1af5a4360 _pthread_wqthread + 288
18 libsystem_pthread.dylib 0x1af5a3080 start_wqthread + 8
Note that the thread name "updateAllMedia" is a misnomer: this thread appears to be a general Metal dispatch queue. I wish Metal had a debugging option that named its internal threads (something like setThreadName).
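Since the frames go through _objc_terminate, one diagnostic I'm considering (a sketch; installExceptionLogger is my own name) is installing an uncaught-exception handler early in launch to log whatever is being thrown before the abort:

import Foundation

// Diagnostic sketch: call early in launch (e.g. from
// application(_:didFinishLaunchingWithOptions:)) to log any uncaught
// Objective-C exception before the process terminates.
func installExceptionLogger() {
    NSSetUncaughtExceptionHandler { exception in
        NSLog("Uncaught exception: %@\nStack: %@", exception, exception.callStackSymbols)
    }
}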
The following flag generates a prior-definition warning from #include <__config>, even though that header wraps the definition in an #ifndef guard. I'm building C++17 on the latest Xcode (14.5). Is this documented anywhere?
-D_LIBCPP_ENABLE_ASSERTIONS=1
I'm trying to update my projects to use Swift 6. If I change the project settings to use Swift 6, my app crashes when I add a closure to the SCNAnimation animationDidStop property. The error is raised inside the SceneKit renderingQueue and indicates that the callback is being called on the wrong queue.
Maybe I need to do something in my code to fix this, but I can't seem to make it work. Or maybe it's a SceneKit bug?
If you create a new game template in Xcode using SceneKit and replace the contents of GameViewController.swift with the following you will see the app crash after it is launched.
import UIKit
import SceneKit

class GameViewController: UIViewController {

    let player: SCNAnimationPlayer = {
        let a = CABasicAnimation(keyPath: "opacity")
        return SCNAnimationPlayer(animation: SCNAnimation(caAnimation: a))
    }()

    override func viewDidLoad() {
        super.viewDidLoad()

        let scnView = self.view as! SCNView
        scnView.scene = SCNScene()

        // Change the project settings to use Swift 6.
        // Setting this closure will then cause a _dispatch_assert_queue_fail
        // EXC_BREAKPOINT error in the scenekit.renderingQueue.SCNView queue;
        // the only thing on the stack is:
        // "%sBlock was %sexpected to execute on queue [%s (%p)]"
        player.animation.animationDidStop = { (a: SCNAnimation, b: SCNAnimatable, c: Bool) in
            print("stopped")
        }

        scnView.scene?.rootNode.addAnimationPlayer(player, forKey: nil)
        player.play()
    }
}
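One workaround that might avoid the assertion, assuming the closure doesn't need main-actor state: a @Sendable closure is not inferred to be @MainActor-isolated, so SceneKit can invoke it on its rendering queue without tripping the queue check. A sketch, not a confirmed fix:

player.animation.animationDidStop = { @Sendable (a: SCNAnimation, b: SCNAnimatable, c: Bool) in
    // Not MainActor-isolated, so the rendering-queue callback is legal.
    print("stopped")
    Task { @MainActor in
        // Hop back to the main actor for any UI work.
    }
}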
I have various USDZ files in my visionOS app. Loading the USDZ files works quite well. I only have problems with the positioning of the 3D model. For example, I have a USDZ file that is displayed directly above me. I can't move the model or perform any other actions on it. If I sit on a chair or stand up again, the 3D model automatically moves with me. This is my source code for loading the USDZ files:
struct ImmersiveView: View {
    @State var modelName: String
    @State private var loadedModel = Entity()

    var body: some View {
        RealityView { content in
            if let usdModel = try? await Entity(named: modelName) {
                print("====> \(modelName) : \(usdModel) <====")
                let bounds = usdModel.visualBounds(relativeTo: nil).extents
                usdModel.scale = SIMD3<Float>(1.0, 1.0, 1.0)
                usdModel.position = SIMD3<Float>(0.0, 0.0, 0.0)
                usdModel.components.set(CollisionComponent(shapes: [.generateBox(size: bounds)]))
                usdModel.components.set(HoverEffectComponent())
                usdModel.components.set(InputTargetComponent())
                loadedModel = usdModel
                content.add(usdModel)
            }
        }
    }
}
For now, I only want the 3D models from the USDZ files to be displayed; later on, I want to be able to move them via gestures. Moving the models is step 2. First, I need to make sure the models are displayed correctly. What have I forgotten or done wrong?
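For reference, the first thing I plan to try (a sketch, assuming the immersive space origin sits at floor level at the launch position) is an explicit placement in front of the viewer instead of leaving the model at the origin:

// Hypothetical placement: roughly eye height, 2 m in front of the
// assumed floor-level origin.
usdModel.position = SIMD3<Float>(0.0, 1.5, -2.0)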
I'm trying to change things mid-game by replacing existing equipment with other variants or adding new pieces. For example, I want to change the board (it could be just the board's display/model).
I tried adding the equipment mid-game, but TabletopKit doesn't appear to render new items added after it initializes.
I'm not sure if I'm just doing something wrong or if there's some trick to resetting the renderer.
One other option: is it possible to change the model or texture of an existing model entity?
Is it possible to join a TabletopKit Seat while using the VisionOS simulator? Seems in the simulator, I'm able to interact with the board's equipment as the floating simulated camera with a PlayerID, but none of my seats have PlayerIDs associated with them.
I have a legacy app that draws using OpenGL; in particular, it draws lines using glLineStipple. This is on a MacBook Pro M3, but it also happens on an x86-based Mac.
This causes the following messages to be output to the terminal the app was run from:
FALLBACK (log once): Fallback to SW vertex for line stipple
FALLBACK (log once): Fallback to SW vertex processing, m_disable_code: 2000
FALLBACK (log once): Fallback to SW vertex processing in drawCore, m_disable_code: 2000
Is there a way of suppressing these messages?
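The bluntest workaround I can think of (a sketch in Swift; the same freopen call works from C or C++), assuming the messages are written to stderr:

import Darwin

// Heavy-handed: redirects the whole process's stderr to /dev/null, which
// swallows the FALLBACK messages but also every other stderr diagnostic.
_ = freopen("/dev/null", "w", stderr)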
Is it possible with iOS 18 to use RealityView with world tracking but without the camera feed as background?
With content.camera = .worldTracking the background is always the camera feed, and with content.camera = .virtual the device's position and orientation don't affect the view point. Is there a way to make a mixture of both?
My use case is that my app "Encyclopedia GalacticAR" shows astronomical objects and a skybox (a huge sphere), like a VR view of planets, as you can see in the left image. Now that iOS 18 offers RealityView for iOS and iPadOS, I would like to make use of it, but I haven't found a way to display my skybox as environment, instead of the camera feed.
I filed the suggestion FB14734105 but hope that somebody knows a workaround...
I am trying to make a shader that resembles a laser like this:
I've been experimenting with a basic Fresnel shader to start, but the Fresnel shader has a problem at high viewing angles where the top has a very different color than the rest of the capsule.
This might work for a laser shader once inverted and fine tuned:
However, when viewed from the top, it doesn't look so good anymore:
Ideally, the purple edge is always ONLY on the edge, and the rest of the surface is the light pink color, no matter the viewing angle. How can I accomplish this to create something that looks like a laser?
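In case it helps frame the question, here is the math I'm considering (illustrative Swift, not actual shader code; rimFactor is my own name): project the beam-axis component out of the surface normal so the end caps behave like the cylinder body before computing the rim term:

import simd

// Illustrative only: a rim factor that ignores the normal's component
// along the beam axis, so the capsule's caps don't flip to the edge
// color when viewed end-on.
func rimFactor(normal: SIMD3<Float>, viewDir: SIMD3<Float>, beamAxis: SIMD3<Float>) -> Float {
    let radial = normal - beamAxis * simd_dot(normal, beamAxis)
    guard simd_length(radial) > 1e-4 else { return 0 } // on a cap: body color
    return 1.0 - abs(simd_dot(simd_normalize(radial), simd_normalize(viewDir)))
}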
Has anyone come across the issue that setting GKLocalPlayer.local.authenticateHandler breaks a RealityView's world tracking on iOS / iPadOS 18 beta 5?
I'm in the process of upgrading my app to make use of the much appreciated RealityView unification, using RealityView not only on visionOS but now also on iOS and iPadOS. In my RealityView, I enable world tracking on iOS like this:
content.camera = .worldTracking
However, device position and orientation were ignored (the camera remained static) and there was no camera pass-through. Then I discovered that the issue disappeared when I removed the line
GKLocalPlayer.local.authenticateHandler = { viewController, error in
    // ... some more code ...
}
So I filed FB14731139 and hope that it will be resolved before the release of iOS / iPadOS 18.
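In the meantime, the workaround I'm testing is to defer authentication until the RealityView has had a chance to start world tracking (just a sketch; MyRealityView is a placeholder and the one-second delay is arbitrary):

import GameKit
import SwiftUI

struct ContentView: View {
    var body: some View {
        MyRealityView() // placeholder for the view containing the RealityView
            .task {
                // Untested idea: don't let Game Center authentication race
                // the world-tracking setup. The delay is arbitrary.
                try? await Task.sleep(for: .seconds(1))
                GKLocalPlayer.local.authenticateHandler = { viewController, error in
                    // ... some more code ...
                }
            }
    }
}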
Good morning everyone,
I'm building a simple game (my first game) using SwiftUI and SpriteKit that contains multiple views. The game is based on a main scene loaded into the GameView using a SpriteView. From there, using buttons, I move from one scene to another with self.scene?.view?.presentScene(...) and some cool transitions (.crossFade(withDuration: 0.5)).
But I'm not sure if this is the best approach. I would appreciate some guidance, because I cannot find any material discussing the best way to build proper navigation with SpriteKit.
Do you have an updated article, tutorial, or reference that I can follow to learn about the best way to implement navigation in a SpriteKit game?
What I'm doing right now works, but it has limitations, for example when I want to mix SwiftUI views and SpriteKit scenes. I want to add a Credits scene with text and images that I'd like to build in SwiftUI, and a Statistics scene with some cool graphics to show the players, but I don't know whether I can navigate from an SKScene into a SwiftUI View, or whether I need a completely different approach. Can I add UI components directly in a SpriteKit scene instead of using a different navigation system and full SwiftUI views?
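For what it's worth, the direction I've been sketching is to let SwiftUI own the screen-level navigation and treat the SpriteKit scene as just another view (GameScene and CreditsView below are placeholder types):

import SpriteKit
import SwiftUI

final class GameScene: SKScene {} // placeholder for your real scene

struct CreditsView: View { // plain SwiftUI credits screen
    var body: some View { Text("Credits") }
}

enum Screen {
    case game, credits
}

struct RootView: View {
    @State private var screen: Screen = .game

    var body: some View {
        switch screen {
        case .game:
            // SpriteView hosts the SKScene; scene-to-scene transitions can
            // stay inside SpriteKit, while screen changes just flip `screen`.
            SpriteView(scene: GameScene(size: CGSize(width: 390, height: 844)))
        case .credits:
            CreditsView()
        }
    }
}

The scene could report events back (e.g. "player tapped Credits") through a callback or a shared ObservableObject, so SwiftUI flips between the SpriteView and pure SwiftUI screens.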
I really appreciate any help you can provide. As you can see, I'm a little bit lost 😅
Thanks a lot in advance 🙏
visionOS 2 beta 5: Unity text shader errors.
I have a test application that draws a large number of simple textured polygons (sprites).
Setting CAMetalLayer's displaySyncEnabled to FALSE will cause load on InterruptEventSourceBridge thread in kernel_task.
(In this case, nanosleep() is used to adjust the number of Metal commands per unit time so that both cases are approximately the same.)
This appears to be a drawing-related thread, but there is no overhead when displaySyncEnabled is TRUE.
What are these differences?
A specific application is the SDL test program, SDL/test/testsprite.c.
https://github.com/libsdl-org/SDL/issues/10475
I have a game for iOS where I use CADisplayLink to animate a simulation, and for some reason the animation is not getting the full 120 Hz on capable devices (like the iPhone 15 Pro). When I enable a 120 Hz refresh target, the animation is capped at only 90 Hz. This looks terrible, because the animation works best when the rate doubles cleanly (30, 60, 120, 240, etc.).
The really bizarre thing is that when I turn on Screen Recording, my frame rate instantly jumps to 120, and everything looks perfectly smooth. My game has never looked better on iPhone! When recording is stopped, the animation drops back down to 90 fps. What in the world is going on?
[displayLink setPreferredFrameRateRange:CAFrameRateRangeMake(100, 240, 120)]; // min, max, preferred
[displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
(Also, CADisableMinimumFrameDurationOnPhone is set to true in Info.plist.)
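One variation I haven't tried yet (a sketch in Swift; it mirrors the Objective-C setPreferredFrameRateRange: call above): pin the range to exactly 120 and see whether the scheduler still settles at 90:

// Sketch: request exactly 120 Hz instead of a 100-240 range.
displayLink.preferredFrameRateRange = CAFrameRateRange(minimum: 120, maximum: 120, preferred: 120)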
I have an AR game using ARKit with SceneKit that works just fine in iOS 17.
In the iOS 18 betas, the AR background image shows black instead of showing the real world. As a result there's no tracking and obviously the whole game is useless.
I narrowed down the issue to showing the Game Center Access Point.
My app has ViewController 1 (VC1) showing the main menu and that's where I want to show the GC Access Point. From there you open VC2 which shows a list of levels. Selecting any level will open VC3 which has the ARScene.
Following is the code I use to start Game Center in VC1:
GKLocalPlayer.local.authenticateHandler = { gcAuthVC, error in
    let isGameCenterReady = (gcAuthVC == nil) && (error == nil)

    if let viewController = gcAuthVC {
        self.present(viewController, animated: true, completion: nil)
    }

    if error != nil {
        print(error?.localizedDescription ?? "")
    }

    if isGameCenterReady {
        GKAccessPoint.shared.location = .topLeading
        GKAccessPoint.shared.showHighlights = true
        GKAccessPoint.shared.isActive = true
    }
}
When switching to VC2 I run GKAccessPoint.shared.isActive = false so that the Access Point will no longer show in any of the following VCs. I tried running it in VC1, VC2, and again in VC3 - it doesn't change anything. Once I reach VC3, the background is black.
If in VC1 I don't run GKAccessPoint.shared.isActive = true, so I don't activate the access point, the behavior is as follows:
If I wait until after the Game Center login animation completes and closes on its own and then I proceed to VC2 and VC3, the camera image will show correctly
If I quickly move to VC2 before the Game Center login animation has completed, so my code will close it by setting active = false, and then I continue to VC3, I will see the black background problem.
So it does look like activating the access point and then de-activating it causes the issue. BTW, if I activate the access point and leave it on in all VCs, the same black background issue persists.
Other than that, when I'm in VC3 with the black background and I switch to another app (so my game moves to the background), when it returns to the foreground, the camera suddenly shows the real world correctly!
I tried to manually reset the AR session by pausing and restarting it, but that didn't change anything. Also, when I check with the debugger, it looks like when the app comes back to the foreground it also doesn't run the session start code.
But something does seem to reset itself, so I wonder what that is. Maybe I could trigger the same thing manually in my code?
I repeat that everything works just fine in iOS 17 and below. This problem only started with the iOS 18 beta (currently on beta 5, but it started in some of the previous betas as well).
So could this be a bug in iOS 18?
As a workaround, I could check the iOS version and, if it's iOS 18, not activate the access point (hoping the user doesn't jump to VC2 too quickly) and show my own button that opens Game Center. But I'd rather give users the full experience, with their own avatar and the highlights showing up. Plus, some users will certainly move quickly to VC2, and that will be an awful experience.
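That fallback would look something like this (a sketch; showCustomGameCenterButton is a hypothetical method for my own UI):

if #available(iOS 18.0, *) {
    // Skip the access point on iOS 18 for now because of the black
    // background issue; fall back to a custom button instead.
    showCustomGameCenterButton() // hypothetical fallback UI
} else {
    GKAccessPoint.shared.location = .topLeading
    GKAccessPoint.shared.showHighlights = true
    GKAccessPoint.shared.isActive = true
}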
Any help would be greatly appreciated. Thanks!
I have some older code running an ARView in .nonAR mode with a perspective camera that is moved around. It seems that if I run this on an iPhone with the iOS 18 beta or the iOS 18.1 beta, the view ignores the camera and looks incorrect.
Hi, I have a question: is it possible to create an occlusion material that hides only specific entities? For example, I would like to create a mask that hides a cube entity sitting in front of a sphere entity, so that through the occlusion I can see the sphere but not the cube.