I have an immersive environment with a skybox that uses a PNG image inside a sphere. I added an IBL, but I'm not sure what the best format or prep method is for the IBL image.
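For reference, this is roughly how I'm applying the IBL in RealityKit (a minimal sketch; "Skydome" is a placeholder name for the bundled image, and rootEntity holds my environment meshes):

let environment = try await EnvironmentResource(named: "Skydome")
// Light the scene with the image.
rootEntity.components.set(
    ImageBasedLightComponent(source: .single(environment), intensityExponent: 1.0)
)
// Meshes only react to the IBL if they also carry a receiver component.
rootEntity.components.set(
    ImageBasedLightReceiverComponent(imageBasedLight: rootEntity)
)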
I have tried several different images for my IBL, and each gives a very different vibe from what I have in Blender.
My question is: how can I create an IBL that comes closest to Blender's Cycles rendering engine?
That's a rather difficult question to answer, though, so let me break it into smaller questions first.
1) Does the IBL image need to be black-and-white, or will colour work?
From my tests: colour works just as well. But why does Apple only show black-and-white ones? Should we use black-and-white?
2) What is the best file format for an IBL? Any pros/cons, or should we just test each format and compare visually?
From my tests: PNG, OpenEXR (.exr), and Radiance HDR (.hdr) all work. But which format is recommended?
3) Will IBL on visionOS create shadows for us? In Blender, an HDRI gives shadows.
From my tests: no, IBL does not produce shadows on your loaded environment/meshes. Is "shadow baking" the only solution for the time being?
4) Looking at a scene in Blender that uses an HDRI as global lighting, how can we best prep the IBL image so the lighting comes closest to Blender's Cycles rendering engine?
From my tests, I tried the following (screenshots below):
A) Render just the Blender HDRI (without meshes) via a 360-degree panoramic camera.
→ Used as an IBL, this makes everything too bright.
B) Render the entire Blender scene via a 360-degree panoramic camera.
→ Used as an IBL, this makes everything washed out and yellowish.
C) Use the Sunlight.png from the sample project.
→ With this IBL the scene is too dark.
D) Use the SystemIBL.exr from another sample project.
→ With this IBL the scene looks very flat and not realistic at all.
Here is each IBL described above (A–D) along with sample screenshots from the simulator:
[screenshots A–D attached]
The atmosphere I'm aiming for, as per Blender's Cycles rendering engine:
[reference screenshot attached]
Can anyone help me with questions 1–4 above?
It would give me some insight into how to create immersive environments with realistic lighting and shadows. : )
Much appreciated!
— Luca
Hi there -
Where would a dev go these days to get an initial understanding of SceneKit?
The WWDC videos linked in various places seem to be gone?!
For example, the SceneKit page at https://developer.apple.com/scenekit/ features a "session videos" link that comes up without any results.
Any advice?
Cheers,
Jan
I have a RealityView and I want to add an entity with an attachment.
Assume I have a viewModel managing my entities, and that addEntityGesture() adds a new entity under the rootEntity.
RealityView { content, attachments in
    // Load the initial content.
    content.add(viewModel.rootEntity)
} update: { updateContent, updateAttachments in
    // Respond to SwiftUI state changes here.
} attachments: {
    // Declare attachment views here.
}
.gesture(addEntityGesture())
I know that we can create attachments in the attachments closure and add them as entities in the make closure, but what if I want to add an entity with an attachment on the fly? Roughly what I'm after is sketched below.
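One approach I'm considering is to key each attachment off the view model's state, so that declaring a new ID creates the attachment and the update closure parents it to the new entity. A minimal sketch, assuming hypothetical pendingEntities and entityIDs properties on the view model, and that each entity's name is its attachment ID:

RealityView { content, attachments in
    content.add(viewModel.rootEntity)
} update: { content, attachments in
    // Runs when observed state changes: parent each new attachment to its entity.
    for entity in viewModel.pendingEntities {
        if let attachment = attachments.entity(for: entity.name) {
            attachment.position = [0, 0.15, 0]  // hover above the entity
            entity.addChild(attachment)
        }
    }
} attachments: {
    // One attachment per entity ID; re-evaluated as state changes.
    ForEach(viewModel.entityIDs, id: \.self) { id in
        Attachment(id: id) {
            Text(id)
        }
    }
}
.gesture(addEntityGesture())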
I've found that SceneKit crashes very frequently on iOS 17, across all devices.
Here is the crash trace:
Crashed: com.apple.scenekit.renderingQueue.SCNView0x15878c630
0 SceneKit 0x3eee4 C3DMatrix4x4GetAffineTransforms + 344
1 SceneKit 0x30208 C3DAdjustZRangeOfProjectionInfos + 140
2 SceneKit 0x2c0a90 C3DCullingContextSetupPointOfViewMatrices + 700
The attachment has the whole log:
Crash Log
Does anybody know how to fix it?
In Reality Composer Pro, when I import a USDZ model and insert it into the scene, Reality Composer Pro removes the model's own material by default, but I don't want that. How can I stop Reality Composer Pro from removing the model's material?
I'm attempting to put two mesh draws into a MTLRenderCommandEncoder with a memory barrier between them. I'm also using image blocks in the fragment functions in the two pipelines. Something like this:
[encoder setRenderPipelineState:pipeline1];
[encoder drawMeshThreadgroups:threadgroupsPerGrid
    threadsPerObjectThreadgroup:threadsPerObjectThreadgroup
      threadsPerMeshThreadgroup:threadsPerMeshThreadgroup];

// Make the first draw's buffer writes visible to the second draw.
[encoder memoryBarrierWithScope:MTLBarrierScopeBuffers
                    afterStages:MTLRenderStageMesh
                   beforeStages:MTLRenderStageObject];

[encoder setRenderPipelineState:pipeline2];
[encoder drawMeshThreadgroups:threadgroupsPerGrid
    threadsPerObjectThreadgroup:threadsPerObjectThreadgroup
      threadsPerMeshThreadgroup:threadsPerMeshThreadgroup];
I get a strange error:
Execution of the command buffer was aborted due to an error during execution. Too many unique viewports, scissor rectangles or depth-bias values to support memoryless render pass attachments. (0000000c:kIOGPUCommandBufferCallbackErrorExceededHardwareLimit)
I'm not using multiple viewports or scissor rectangles, and I'm not using depth bias. I don't have memoryless attachments either, though as mentioned, I am using imageblocks. Without the memory barrier I don't get the error. Using memoryBarrierWithResources: rather than memoryBarrierWithScope: makes no difference.
This is on an M2 Max running macOS 14.3 beta (23D5033f).
I can't tell if I encountered a real limitation or a Metal driver bug.
Hey everyone, I'm running into an issue where my USDZ model does not show up in Reality Composer Pro. It was exported from Blender as a USD and converted in Reality Converter.
See the attached image:
It's strange, because the USDZ model appears fine in Preview. But once I bring it into RCP, I get this pop-up and the model does not appear.
I'm not sure how to resolve this "multiple root level" issue. If anyone can point me in the right direction or offer any feedback, it's much appreciated! Thank you!
I have installed the Apple GameKit plug-in, and when it's time to build the packages I go to the Unity Package Manager, find the .tgz file, and open it, but the Apple Core package does not appear in the list inside Package Manager.
Using Unity 2022.2.2.
[08:53:18] [Package Manager Window] Error adding package: file:./Assets/External Packages/unityplugins-main/Build/fbx20133_converter_mac.pkg.tgz.
Unable to add package [file:./Assets/External Packages/unityplugins-main/Build/fbx20133_converter_mac.pkg.tgz]:
The file [/var/folders/39/drmm3vg15tsggs7nt7z360lh0000gn/T/.tmp-9365-Q2GPWMjILnkO/package.json] cannot be found
(The same error repeats on every retry: 08:56:07, 08:59:31, 09:03:11, ...)
Hi,
I am using Xcode's frame capture to profile my app's shader, and I have some questions about the per-line profiling statistics. Please see the two screenshots of my compute shader first.
Begin: [screenshot]
End: [screenshot]
The first image shows the head of the shader; the profile shows that the shader entry function takes 72.44% of the time.
At the end of the shader, the profile shows that the closing brace '}' takes 60.45%.
Here are my questions:
How should I interpret the profile data, and what is the real performance breakdown of this shader?
Why doesn't the shader entry function account for 100% of the time?
Can someone help me answer these questions?
Thanks!
Boson
I've been attempting to use the new CAMetalDisplayLink to simplify the code needed to sync my rendering with the display across Apple platforms. One thing I noticed since moving to CAMetalDisplayLink is that the Metal Performance HUD, which I had previously been using to analyze the total memory used by my app (among other things), suddenly no longer appears.
This issue can be reproduced with the Frame Pacing sample from WWDC23.
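For reference, the display-link setup looks roughly like this (a minimal sketch of the pattern from that sample; render(to:) stands in for my own drawing code):

let displayLink = CAMetalDisplayLink(metalLayer: metalLayer)
displayLink.preferredFrameRateRange = CAFrameRateRange(minimum: 60, maximum: 120, preferred: 120)
displayLink.delegate = self
displayLink.add(to: .main, forMode: .default)

// CAMetalDisplayLinkDelegate callback: the update supplies the drawable directly.
func metalDisplayLink(_ link: CAMetalDisplayLink, needsUpdate update: CAMetalDisplayLink.Update) {
    render(to: update.drawable)
}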
Does anyone from Apple know whether this is expected behavior, or have an idea of how to get this working properly?
I've filed FB13495684 for official review.
Metal offers both threadgroup_barrier() and simdgroup_barrier(). I understand the need for threadgroup barriers: without them it would not be possible to rely on correct cooperation between threads in a threadgroup, as different threads can execute on different SIMD partitions at different times. But I don't really get simdgroup_barrier(). It was my impression that all threads in a simdgroup execute in lockstep, so if one thread in a simdgroup makes progress, all other active threads in the simdgroup are also guaranteed to make progress. If that were not the case, we'd need to insert simdgroup barriers pretty much any time we read or write any storage or perform SIMD-scoped operations. Apple doesn't seem to use simdgroup_barrier() in any of its sample code, and in fact it seems to be a no-op on current Apple Silicon hardware.
Is there a situation where I need to use simdgroup barriers, or is this a superfluous operation?
P.S. It seems that Apple engineers are as confused by this as I am; see https://github.com/ml-explore/mlx/blame/1f6ab6a556045961c639735efceebbee7cce814d/mlx/backend/metal/kernels/scan.metal#L355
Hello,
I am creating a cross-platform application with Flutter. The problem is that when I launch my application on my MacBook, only a black page is displayed. This is a recurring problem with all Flutter applications on this Mac.
When I debug my application, this is what appears in the console:
Error submitting command buffer.
2023-12-27 15:58:12.468 tranzic[2333:21044] Error Domain=MTLCommandBufferErrorDomain Code=4 "Ignored (for causing prior/excessive GPU errors) (00000004:kIOAccelCommandBufferCallbackErrorSubmissionsIgnored)" UserInfo={NSLocalizedDescription=Ignored (for causing prior/excessive GPU errors) (00000004:kIOAccelCommandBufferCallbackErrorSubmissionsIgnored), MTLCommandBufferEncoderInfoErrorKey=(
"<errorState: MTLCommandEncoderErrorStatePending, label: (null), debugSignposts: (null)>",
"<errorState: MTLCommandEncoderErrorStatePending, label: (null), debugSignposts: (null)>",
"<errorState: MTLCommandEncoderErrorStatePending, label: (null), debugSignposts: (null)>"
)}
(The same "Error submitting command buffer" block repeats a few seconds later, at 15:58:18.)
I have a MacBook Pro (mid-2012) running macOS Monterey, and here's an issue I opened on the Flutter repo with more details: https://github.com/flutter/flutter/issues/137859
Hello,
I'm exporting a 3D model with a shader created in Reality Composer Pro's Shader Graph to a USDZ file for viewing in QuickLook.
Once exported, the USDZ file appears in QuickLook without the shader material.
When I import the file back into Xcode, the material renders properly.
Is it possible to publish MaterialX shaders so they can be viewed in QuickLook?
Hopefully it is, since the push toward MaterialX and USDZ is supposed to be about universal support :)
Any guidance is appreciated!
I am using GCSystemGestureStateDisabled to stop the home button from being captured by the system gesture recognizer, so that the system menu is not triggered while my app is using the gamepad.
This was working well on macOS 14.1 but stopped working after I migrated to 14.2: boundToSystemGesture still reports YES after button.preferredSystemGestureState = GCSystemGestureStateDisabled is assigned.
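For reference, here is roughly the code in question (a Swift sketch of what I'm doing; the real app iterates the connected controllers):

for controller in GCController.controllers() {
    guard let home = controller.physicalInputProfile.buttons[GCInputButtonHome] else { continue }
    home.preferredSystemGestureState = .disabled
    // Still true on 14.2 after assigning .disabled above.
    print(home.isBoundToSystemGesture)
}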
Is it possible that some change introduced in 14.2 is causing the issue?
Hi,
I'm trying to build a demo app with the Metal framework on visionOS.
I've found that when I enable an immersive space it is fully immersive, and Metal renders the background in black. Inspecting the demo code, the clear color is (0, 0, 0, 0).
I wonder if it is possible to render the way RealityKit does in mixed reality, where 3D models appear in the real world without a black background. Any idea or explanation?
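For context, the demo drives Metal through CompositorServices roughly like this (a minimal sketch; the render loop details are omitted):

ImmersiveSpace(id: "metal") {
    CompositorLayer { layerRenderer in
        // Render loop driven by layerRenderer frames; clears to (0, 0, 0, 0).
    }
}
.immersionStyle(selection: .constant(.full), in: .full)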
I'm using an Anaconda environment:
tensorflow-macos 2.15
Keras 2.15
Python 3.11.5
macOS 14.1 on an M2
I guess the problem is with PyCharm, because the code runs; the error is: Cannot find reference 'keras' in 'imported module tensorflow | __init__.py'.
I previously built a model on simple MNIST and it works, but it shows the same problem.
I have tried different references and versions of Python, and I've changed environments at least three times, but it doesn't work.
I'm working on a game that was working perfectly with Game Center (a remote game with one remote player), but since last Sunday I've been getting errors when I try to start a remote game.
I get two errors. One says FAILED when I invite a friend to play; in this case, the other device never gets the notification.
The other error sends the notification, but when I tap on it (on the other device), it fails, saying it couldn't communicate with the server. The main device says "INVITED" but nothing else.
I haven't found anyone else having the same issue, so I wonder if it's my fault, although I haven't changed that part of the code since the last time I tested it.
Is anyone else here having similar problems, or does anyone know what I should review?
Thank you all, and have a great year!
Hi,
When compiling shaders, the metal command-line tool has more options than MTLDevice's newLibraryWithSource:.
For instance, man metal mentions 10 levels of optimization (-O0, -O1, -O2, -O3, -Ofast, -Os, ...) while the MTLCompileOptions documentation only shows two (default and size).
Is there a way to pass -O2 as the optimization level to newLibraryWithSource:?
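For reference, this seems to be all the runtime API exposes (Swift sketch):

let options = MTLCompileOptions()
options.optimizationLevel = .size   // only .default and .size are available
let library = try device.makeLibrary(source: shaderSource, options: options)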
Thanks
Good afternoon. I had a developer account, and for years I developed several gaming applications for Apple.
A few years ago I looked at my old developer email, and there was a confirmation of a payment from Apple for the profit I earned on my apps. Could you check whether Apple left any payments pending on my old developer account?
I would appreciate an answer, because even my old games were removed from the App Store.
Thank you!
In our multiplayer game prototype, we experience a ping of 300 ms (at best) when using Game Center and GKMatch to send data between players over the GKMatch.SendDataMode.unreliable channel. This latency is not suitable for a real-time game.
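For reference, we send our per-tick state roughly like this (a sketch; encodeInput() stands in for our own serialization):

let payload = encodeInput()   // a small payload, a few dozen bytes
try match.send(payload, to: match.players, dataMode: .unreliable)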
When we tested alternative services like Unity's Relay under identical conditions (location, devices, and Wi-Fi), we achieved a ping of 120 ms.
Is a ping of 300 ms typical when using Game Center?
I can think of possible reasons in case it's not, but I can't be sure:
Is there different behavior (servers relaying peer-to-peer connections) when the game is not yet released on the store?
We're in Europe; maybe this is normal in Europe and better in the US?