I am sure others will agree with me on this. I personally don't like the way the new reactions look; there are too many different colors. I honestly prefer the old grey version of the reactions to text messages. The extra emoji option is okay, but the color change for the heart, thumbs-up, and the other reactions is not the best. Autocorrect is horrible in this new update, by the way.
General
Delve into the world of graphics and game development. Discuss creating stunning visuals, optimizing game mechanics, and share resources for game developers.
I'll leave this here for anyone who's interested: it is possible, to a limited extent, to use Windows VR on an ARM Mac. Right now it's just some demos, but I am still working on solutions: https://www.youtube.com/watch?v=qbucnU0dpDo&t=431s&ab_channel=NightSightProductions
So I get JPEG data in my app. Previously I was using the higher-level NSBitmapImageRep API and just feeding the JPEG data to it.
But now I've noticed on Sonoma that if I get a JPEG in the CMYK color space, the NSBitmapImageRep renders mostly black and is corrupted. So I'm trying to drop down to the lower-level APIs. Specifically, I grab a CGImageRef and am trying to use the Accelerate API to convert it to another format (to hopefully work around the issue):
CGImageRef sourceCGImage = CGImageCreateWithJPEGDataProvider(jpegDataProvider,
                                                             NULL,
                                                             shouldInterpolate,
                                                             kCGRenderingIntentDefault);
Now I use vImageConverter_CreateWithCGImageFormat with the following values for the source and destination formats:
Source format: (derived from sourceCGImage)
bitsPerComponent = 8
bitsPerPixel = 32
colorSpace = (kCGColorSpaceICCBased; kCGColorSpaceModelCMYK; Generic CMYK Profile)
bitmapInfo = kCGBitmapByteOrderDefault
version = 0
decode = 0x000060000147f780
renderingIntent = kCGRenderingIntentDefault
Destination format:
bitsPerComponent = 8
bitsPerPixel = 24
colorSpace = (DeviceRGB)
bitmapInfo = 8197
version = 0
decode = 0x0000000000000000
renderingIntent = kCGRenderingIntentDefault
But vImageConverter_CreateWithCGImageFormat fails with kvImageInvalidImageFormat. If I change the destination format to 32 bitsPerPixel and include alpha in the bitmapInfo, vImageConverter_CreateWithCGImageFormat no longer returns an error, but I get a black image, just like with NSBitmapImageRep.
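One detail worth noting: a bitmapInfo of 8197 decodes to kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst, which describes a 32-bit pixel and therefore contradicts bitsPerPixel = 24; that mismatch alone can produce kvImageInvalidImageFormat. Below is a minimal sketch, assuming Swift and the Accelerate overlay, of a destination format whose fields agree with one another (sourceCGImage is the image created above):

import Accelerate
import CoreGraphics

// Sketch only: bitmapInfo 8197 (= kCGBitmapByteOrder32Little |
// kCGImageAlphaNoneSkipFirst) implies a 32-bit pixel, so pairing it with
// bitsPerPixel = 24 is invalid. Use a self-consistent 24-bit, no-alpha format.
func makeRGBConverter(for sourceCGImage: CGImage) -> vImageConverter? {
    // Derive the source format directly from the CMYK CGImage.
    guard let srcFormat = vImage_CGImageFormat(cgImage: sourceCGImage),
          // 24-bit RGB, no alpha, default byte order: internally consistent.
          let dstFormat = vImage_CGImageFormat(
              bitsPerComponent: 8,
              bitsPerPixel: 24,
              colorSpace: CGColorSpaceCreateDeviceRGB(),
              bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue))
    else { return nil }
    return try? vImageConverter.make(sourceFormat: srcFormat,
                                     destinationFormat: dstFormat)
}

If the converter is created but the output is still black, the source's decode array is worth inspecting too: Adobe-produced CMYK JPEGs often store inverted ink values, which is a classic cause of black or corrupted decodes.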
In my app, I have an ARView that has cameraMode set to nonAR.
I occasionally hide the ARView when it is not needed and reveal it again later.
While the ARView is hidden, I'd like to pause the animation to save iPhone battery life. I'd also like to do this when I know that the animation in my scene has paused and the contents of the view, although still visible, are static.
This was possible using SceneKit, but I can't seem to find an equivalent way to do it using RealityKit.
At least as of iOS 18, a hidden ARView with an empty scene appears to use approximately 30% of the CPU.
How can I pause ARView so that it won't use the battery unnecessarily?
Thank you for considering this question.
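One possible stopgap for the ARView question above, offered as an assumption rather than a documented RealityKit API: detach the view from the hierarchy instead of merely hiding it, on the theory that an off-hierarchy view stops being driven. A sketch (class and property names are illustrative):

import UIKit
import RealityKit

final class SceneHostViewController: UIViewController {
    // nonAR camera mode, as in the question.
    let arView = ARView(frame: .zero, cameraMode: .nonAR,
                        automaticallyConfigureSession: false)

    /// Instead of arView.isHidden = true, remove the view entirely while idle.
    func setSceneVisible(_ visible: Bool) {
        if visible {
            arView.frame = view.bounds
            arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
            view.addSubview(arView)
        } else {
            arView.removeFromSuperview()
        }
    }
}

Whether this actually stops the render loop is worth verifying in Instruments; a real pause API would still be the better answer.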
I have a legacy OpenGL fixed-pipeline app which has been ported from Windows (32-bit) to 64-bit macOS.
The problem: with a scene containing only a non-positional light, everything works great. If I add a positional spotlight, the two lights interact and I get incorrect results.
This problem does not occur on x86_64 Macs. It does occur on Apple Silicon, whether the app runs as x86_64 under Rosetta or as native ARM64.
So it's either an Apple Silicon OpenGL driver behaviour my code is triggering, or something in the on-chip Apple Silicon graphics.
Here is the "normal" case: the spotlight is to the right:
Here, I have moved the spotlight down (Y = 1). Notice the black areas on the cube. That's incorrect.
Now, I turn off the spotlight by commenting out the "makeALight" call for the spotlight (light 6). Now, the cube is evenly lit.
Here is the test code I use to generate the lights. You will need to install glfw with brew to build it.
main.cpp
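Since main.cpp is attached rather than inlined, here is the gist of the positional vs. non-positional distinction the post relies on: in the fixed pipeline, the w component of GL_POSITION is the switch. A rough Swift sketch (assumptions: a current legacy-GL context, and light index 6 to mirror the spotlight mentioned above):

import OpenGL.GL

// w == 0 -> directional ("non-positional") light: a direction, no origin.
func makeDirectionalLight() {
    let direction: [GLfloat] = [0, 0, 1, 0]   // w = 0
    glLightfv(GLenum(GL_LIGHT0), GLenum(GL_POSITION), direction)
    glEnable(GLenum(GL_LIGHT0))
}

// w == 1 -> positional light; adding a cutoff makes it a spotlight.
func makeSpotLight(x: GLfloat, y: GLfloat, z: GLfloat) {
    let position: [GLfloat] = [x, y, z, 1]    // w = 1
    let aim: [GLfloat] = [0, 0, -1]
    glLightfv(GLenum(GL_LIGHT6), GLenum(GL_POSITION), position)
    glLightfv(GLenum(GL_LIGHT6), GLenum(GL_SPOT_DIRECTION), aim)
    glLightf(GLenum(GL_LIGHT6), GLenum(GL_SPOT_CUTOFF), 30)
    glEnable(GLenum(GL_LIGHT6))
}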
Hello,
I’m trying to run Age of Mythology Retold on my Mac using the Game Porting Toolkit. Unfortunately, the game crashes before it opens. Has anyone experienced something similar or have any suggestions on how to resolve it?
Thank you!
Hello,
I want to create a painting app for iOS, and I saw that many examples use a CAShapeLayer to draw a UIBezierPath.
As I understand it, Core Animation uses the GPU, so I was wondering how this is implemented on the GPU. In other words, how would you do it with Metal or OpenGL?
I can only think of continuously updating a texture in response to the user's drawing, but that seems like a very resource-intensive operation...
Thanks
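On the "continuously updating a texture" idea in the painting question above: it need not be expensive if only the stroke's dirty rectangle is re-uploaded per segment. A minimal sketch, assuming Swift + Metal, a BGRA8 canvas, and CPU-side Core Graphics rasterization (the UIKit/CG vertical flip is ignored for brevity; all names are illustrative):

import Metal
import CoreGraphics

final class StrokeCanvas {
    let texture: MTLTexture
    private var pixels: [UInt8]            // CPU-side BGRA8 backing store
    private let width: Int, height: Int

    init?(device: MTLDevice, width: Int, height: Int) {
        self.width = width; self.height = height
        pixels = [UInt8](repeating: 255, count: width * height * 4)  // white
        let desc = MTLTextureDescriptor.texture2DDescriptor(
            pixelFormat: .bgra8Unorm, width: width, height: height, mipmapped: false)
        guard let tex = device.makeTexture(descriptor: desc) else { return nil }
        texture = tex
        texture.replace(region: MTLRegionMake2D(0, 0, width, height),
                        mipmapLevel: 0, withBytes: pixels, bytesPerRow: width * 4)
    }

    // Rasterize one stroke segment with Core Graphics, then upload only
    // the dirty rectangle instead of the whole canvas.
    func addSegment(from a: CGPoint, to b: CGPoint, lineWidth: CGFloat) {
        let dirty = CGRect(x: min(a.x, b.x), y: min(a.y, b.y),
                           width: abs(a.x - b.x), height: abs(a.y - b.y))
            .insetBy(dx: -lineWidth, dy: -lineWidth)
            .intersection(CGRect(x: 0, y: 0, width: width, height: height))
            .integral
        guard !dirty.isEmpty else { return }
        pixels.withUnsafeMutableBytes { buf in
            guard let ctx = CGContext(
                data: buf.baseAddress, width: width, height: height,
                bitsPerComponent: 8, bytesPerRow: width * 4,
                space: CGColorSpaceCreateDeviceRGB(),
                bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                    | CGBitmapInfo.byteOrder32Little.rawValue) else { return }
            ctx.setStrokeColor(red: 0, green: 0, blue: 0, alpha: 1)
            ctx.setLineWidth(lineWidth)
            ctx.setLineCap(.round)
            ctx.move(to: a); ctx.addLine(to: b); ctx.strokePath()
            // Upload just the touched region; bytesPerRow is the full stride.
            let offset = Int(dirty.minY) * width * 4 + Int(dirty.minX) * 4
            texture.replace(region: MTLRegionMake2D(Int(dirty.minX), Int(dirty.minY),
                                                    Int(dirty.width), Int(dirty.height)),
                            mipmapLevel: 0,
                            withBytes: buf.baseAddress!.advanced(by: offset),
                            bytesPerRow: width * 4)
        }
    }
}

The texture is then sampled by a full-screen quad each frame; since only a small region changes per touch event, the per-frame upload cost stays roughly proportional to the brush size, not the canvas size.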
So, I've been messing around with SteamVR on Apple Silicon, and it runs as expected under Rosetta translation; I've even got a game to run. But for some reason SteamVR cannot detect a headset, even one it has drivers for, such as the 2017 Vive. Is there any explanation as to why? Since SteamVR otherwise works as expected, that leads me to believe it's something with macOS.
I am currently working on a project where I aim to overlay the camera feed obtained via the Apple Vision Pro's camera access API to align perfectly with the user's perspective in Vision Pro.
However, I've noticed a discrepancy between the captured camera feed and the actual view from the user's perspective. My assumption is that this difference might be related to lens distortion correction or the lack thereof.
Unfortunately, I'm not entirely sure how the camera feed is being corrected or processed. For the overlay, I'm using a typical 3D CG approach where a texture captured from the background plane is projected onto a surface. In this case, the "background capture" is the camera feed that I'm projecting.
If anyone has insights or suggestions on how to align the camera feed with the user's perspective more accurately, any information would be greatly appreciated.
The attached image shows the difference between the camera feed and the field of view from the user's actual perspective.
I want to align the camera feed image to the user's perspective.
I am creating a 3D model from multiple images using a photogrammetry session. When the session generates an OBJ file and I measure the distance between two points, the distance comes out in inconsistent units: sometimes meters, sometimes centimeters, sometimes another unit altogether. How can I tell the photogrammetry session to always create the model in millimeters?
The Minecraft Launcher gives the message "minecraft launcher quit unexpectedly" when opened. This began happening after I updated to macOS Sequoia Beta 15.0 (24A5327a).
Anyone know a fix?
LogUnrealMathTest: FAILED: VectorCos: Ref vs Vec
LogUnrealMathTest: Bad(0.000000): (0.707107 0.500000 0.342019 0.173648) (0.000000 0.000000 0.000000 0.000000)
LogUnrealMathTest: FAILED: VectorCos: Ref vs Vec
LogUnrealMathTest: Bad(0.000000): (-0.707107 -0.500000 -0.342021 -0.173648) (-0.000000 -0.000000 -0.000000 -0.000000)
LogMac: Error: appError called: Fatal error: [File:/Users/enginej3/Desktop/UE4/Engine/Source/Runtime/Core/Private/Tests/Math/UnrealMathTest.cpp] [Line: 1652]
VectorIntrinsics Failed.
I get this error when running after a successful Xcode build; it causes the Unreal editor to crash.
I'm trying to understand the view matrix. Here is the relevant part of the original code:
private func updateGameState() {
    /// Update any game state before rendering
    uniforms[0].projectionMatrix = projectionMatrix
    let rotationAxis = SIMD3<Float>(1, 1, 0)
    let modelMatrix = matrix4x4_rotation(radians: rotation, axis: rotationAxis)
    let viewMatrix = matrix4x4_translation(0.0, 0.0, -8.0)
    uniforms[0].modelViewMatrix = simd_mul(viewMatrix, modelMatrix)
    rotation += 0.01
}
If the view matrix is initialized with x = -0.5, as in let viewMatrix = matrix4x4_translation(-0.5, 0.0, -8.0), the cube in the MetalView moves to the left.
I think it should move to the right-hand side, because the view matrix is the camera position. Am I wrong?
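The observed behavior is consistent once the view matrix is treated as the inverse of the camera's transform rather than as the camera's position itself. A short sketch (matrix4x4_translation is the helper from the same Xcode template; .inverse comes from simd):

import simd

// The view matrix maps world space into eye space, so it is the INVERSE
// of the camera's own placement in the world.
let cameraWorldTransform = matrix4x4_translation(0.5, 0.0, 8.0)  // camera at x = +0.5
let viewMatrix = cameraWorldTransform.inverse                    // == matrix4x4_translation(-0.5, 0.0, -8.0)
// So passing matrix4x4_translation(-0.5, 0, -8) directly as the view matrix
// corresponds to a camera sitting at (0.5, 0, 8): the camera moved right,
// which makes the cube appear to move left.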
I'm using an iPhone 14 Pro Max on iOS 18.0 beta 5.
My AirPods are on the latest firmware.
When I play Arena Breakout (暗区突围), my teammates say my headset picks up a loud crackling/buzzing noise; sometimes it's not there, and sometimes it suddenly appears.
When I use the phone's microphone instead, the problem doesn't occur.
The AirPods are not within the recall scope, but they are still under warranty.
I have a 3x3 matrix which I need to apply to a UIImage and save in the Documents folder. I successfully converted the 3x3 matrix (represented as [[Double]]) to a CATransform3D, but then I got stuck trying to figure out how to apply it to a UIImage.
The only place I can apply it is the transform property of a UIView (or a UIImageView when working with a UIImage), but that has nothing to do with the UIImage itself: I can't save a UIImage with all the transformations from the transformed UIImageView.
And the Core Graphics methods (like concatenate for CGContext) only work with affine transformations, which doesn't suit me.
Please give me a hint as to which direction I should look in. Does Apple have native methods for this, or do I have to use 3rd-party frameworks?
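A hedged suggestion rather than a definitive answer: Core Image can apply a full 3x3 (projective) matrix indirectly, because a projective transform is completely determined by where it sends the image's four corners, and CIPerspectiveTransform accepts exactly those four points. A sketch, assuming a row-major [[Double]] homography (note Core Image's bottom-left origin; pre-flip the matrix if it assumes a top-left origin):

import UIKit
import CoreImage

// Apply a row-major 3x3 homography to one point (with perspective divide).
func applyHomography(_ m: [[Double]], to p: CGPoint) -> CGPoint {
    let x = m[0][0] * Double(p.x) + m[0][1] * Double(p.y) + m[0][2]
    let y = m[1][0] * Double(p.x) + m[1][1] * Double(p.y) + m[1][2]
    let w = m[2][0] * Double(p.x) + m[2][1] * Double(p.y) + m[2][2]
    return CGPoint(x: x / w, y: y / w)
}

// Warp a UIImage by mapping its four corners through the matrix.
func warp(_ image: UIImage, with m: [[Double]]) -> UIImage? {
    guard let input = CIImage(image: image),
          let filter = CIFilter(name: "CIPerspectiveTransform") else { return nil }
    let e = input.extent
    func corner(_ x: CGFloat, _ y: CGFloat) -> CIVector {
        CIVector(cgPoint: applyHomography(m, to: CGPoint(x: x, y: y)))
    }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(corner(e.minX, e.maxY), forKey: "inputTopLeft")
    filter.setValue(corner(e.maxX, e.maxY), forKey: "inputTopRight")
    filter.setValue(corner(e.minX, e.minY), forKey: "inputBottomLeft")
    filter.setValue(corner(e.maxX, e.minY), forKey: "inputBottomRight")
    guard let output = filter.outputImage,
          let cg = CIContext().createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cg)
}

The result can then be written to the Documents folder with pngData() and Data.write(to:).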
Hello, I'm writing to report an issue (or a documentation error).
I am using the entity-component architecture provided by the GameplayKit framework. Additionally, I want to take advantage of the user interface provided by the Scene Editor; this is essential if I want to involve more people in the project.
The issue occurs when linking the Scene Editor data with the GKScene of the aforementioned framework.
The first issue arises when adding a component through the interface, as shown in the image.
Then at that moment:
if let scene = GKScene(fileNamed: "GameScene") {
    // Get the SKScene from the loaded GKScene
    if let sceneNode = scene.rootNode as! GameScene? {
        ...
    }
}
Here, scene.rootNode is nil, and the scene is not presented.
However, I can work around this issue by initializing the scene separately:
if let scene = GKScene(fileNamed: "GameScene") {
    // Load the SKScene separately instead
    if let sceneNode = SKScene(fileNamed: "GameScene") as! GameScene? {
        ...
    }
}
But from here, two issues arise:
The node contains a component, and the scene has been loaded separately
When trying to access a specific entity through its SKSpriteNode:
self.node?.entity // Is nil
It becomes very difficult to access a specific entity. When adding a component, an entity is automatically created, as demonstrated in the attached screenshot.
I only have one way to access this entity, and since there is only one, it's easy:
sceneNode.entities[0]
But even so, it's not very useful, because when I try to access its components, it turns out they don't exist.
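For comparison, the stock Xcode SpriteKit game template bridges the two worlds by copying the GameplayKit data onto the scene after loading; note that entities and graphs there are stored properties the template adds to its GameScene subclass, not SpriteKit API. Roughly (view is assumed to be the presenting SKView):

import SpriteKit
import GameplayKit

if let scene = GKScene(fileNamed: "GameScene"),
   let sceneNode = scene.rootNode as? GameScene {
    // Hand the GKScene's GameplayKit objects to the SKScene subclass,
    // as the Xcode game template does.
    sceneNode.entities = scene.entities
    sceneNode.graphs = scene.graphs
    sceneNode.scaleMode = .aspectFill
    view.presentScene(sceneNode)   // `view` is a hypothetical SKView
}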
I just wanted to mention this because it would be very helpful for me if this issue could be resolved.
Thank you very much in advance.
I have tested my application on iOS 15, 16, and 17; VisionKit read values in the horizontal direction. Once I updated my device to iOS 18.0 beta, values are read in the vertical direction.
The build was generated with Xcode 13.4.1.
Could the team please help me understand why this happens, and whether anything needs to change at the code level?
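If the values are read with Vision's text recognition (an assumption, since the post doesn't show code), pinning the image orientation explicitly is one way to rule the OS's own inference in or out when comparing iOS 17 and 18. A sketch:

import Vision

// Pass the orientation explicitly so results don't depend on what the
// OS infers from the image; change .up to match how the image was captured.
func recognizeText(in cgImage: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate
    let handler = VNImageRequestHandler(cgImage: cgImage, orientation: .up, options: [:])
    try handler.perform([request])
    let observations = request.results ?? []
    return observations.compactMap { $0.topCandidates(1).first?.string }
}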
The following generates a prior-definition warning from #include <__config>, even though that header has an #ifndef guard. I'm using C++17 on the latest Xcode (14.5). Is this documented anywhere?
-D_LIBCPP_ENABLE_ASSERTIONS=1
I have a legacy app that draws using OpenGL; in particular, it draws lines using glLineStipple. This is on a MacBook Pro M3, but it also happens on an x86-based Mac.
This causes the following messages to be output to the terminal the app was run from:
FALLBACK (log once): Fallback to SW vertex for line stipple
FALLBACK (log once): Fallback to SW vertex processing, m_disable_code: 2000
FALLBACK (log once): Fallback to SW vertex processing in drawCore, m_disable_code: 2000
Is there a way of suppressing these messages?
I have a game for iOS where I use CADisplayLink to animate a simulation, and for some reason the animation is not getting the full 120hz on capable devices (like iPhone 15 Pro). When I enable a 120hz refresh target, the animation is capped at only 90hz. This looks terrible because the animation works best when doubled (30, 60, 120, 240, etc).
The really bizarre thing is that when I turn on Screen Recording, my frame rate instantly jumps to 120, and everything looks perfectly smooth. My game has never looked better on iPhone! When recording is stopped, the animation drops back down to 90 fps. What in the world is going on?
[displayLink setPreferredFrameRateRange:CAFrameRateRangeMake(100, 240, 120)]; // Min, Max, Preferred
[displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
(Also, CADisableMinimumFrameDurationOnPhone is set to true in Info.plist.)