AR Quick Look


Allow users to see incredibly detailed object renderings in real-world surroundings, with support for audio playback, using AR Quick Look.

Posts under AR Quick Look tag

10 Posts

Post · Replies · Boosts · Views · Activity

How do I write a USDZ with the Quick Look configuration feature?
This week I was watching https://developer.apple.com/videos/play/wwdc2024/10105/ with the amazing "configuration" feature to change the color or mesh straight in Quick Look. I tried lots of workarounds, but nothing brought me success. How do I write the usda files? Any time I overwrite the usda, even with just a "{}" inside, Reality Composer Pro refuses to open the file again. Where is the developer in the tutorial writing the usda? And how is the usda compressed into the usdz? (None of the compressors I tried accepted the modified usda file.) This is the code suggested in the video:

```usda
#usda 1.0
(
    defaultPrim = "iPhone"
)

def Xform "iPhone" (
    variants = {
        string Color = "Black_Titanium"
    }
    prepend variantSets = ["Color"]
)
{
    variantSet "Color" = {
        "Black_Titanium" {
        }
        "Blue_Titanium" {
        }
        "Natural_Titanium" {
        }
        "White_Titanium" {
        }
    }
}
```

But I don't understand how to do it with my own files.
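(Editor's note: one possible route, not shown in the video, is to round-trip the archive with Pixar's USD command-line tools. This is a sketch assuming the tools are installed, e.g. via `pip install usd-core`, and the file names are placeholders. The key detail: a .usdz is a zip archive whose entries must be stored uncompressed and 64-byte aligned, which is why general-purpose zip compressors produce files that Quick Look and Reality Composer Pro reject.)

```shell
# Unpack the current layer into an editable .usda
usdcat iPhone.usdz -o iPhone.usda

# ...edit iPhone.usda to add the variantSet shown in the video...

# Sanity-check the edited layer before repacking
usdchecker iPhone.usda

# Repack: usdzip writes the uncompressed, aligned zip layout
# that Quick Look expects, unlike a generic zip tool
usdzip iPhone.usdz iPhone.usda
```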
Replies: 3 · Boosts: 0 · Views: 233 · Activity: 2w
Quick Look Augmented Reality Button Greyed Out
Hi, we are having problems with iOS Quick Look not working. Specifically, the AR button is greyed out after the scene / AR model has been opened previously. This is all running off our web app. What we have figured out is that clearing the device's cache solves the issue: the greyed-out button turns blue and clickable again. We are seeing this issue very inconsistently, though, on iPad as well as iPhone, and on both newer and older iOS versions. We would be very happy for any responses and advice on solving this issue, as this behaviour makes the Quick Look function, great as it is when it works, unviable for production because it doesn't work consistently. Best regards
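(Editor's note: one workaround worth testing, given that clearing the cache fixes it, is serving the model URL with a version query string so the device cannot hand Quick Look a stale cached copy. The URL and `v` parameter below are hypothetical.)

```html
<!-- Hypothetical markup: bump ?v= whenever the .usdz changes
     so the cached copy is bypassed -->
<a rel="ar" href="https://example.com/models/chair.usdz?v=2">
    <img src="https://example.com/models/chair.png">
</a>
```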
Replies: 0 · Boosts: 0 · Views: 277 · Activity: May ’24
AR Quick Look in SwiftUI
Hi, I'm new to iOS dev. I am developing an app with an AR function. I found a few tutorials about AR Quick Look; however, they all use storyboards. Is there any way to use SwiftUI to present AR Quick Look?

ContentView.swift

```swift
import SwiftUI

struct ContentView: View {
    @State private var isPresented = false

    var body: some View {
        VStack {
            Button {
                isPresented = true
                print("click")
            } label: {
                Text("Click to AR")
                    .font(.title)
                    .fontWeight(.bold)
                    .padding()
                    .background()
                    .cornerRadius(16)
            }
            .sheet(isPresented: $isPresented) {
                ARView()
            }
            .padding()
        }
    }
}

#Preview {
    ContentView()
}
```

ARView.swift

```swift
import SwiftUI

struct ARView: UIViewControllerRepresentable {
    typealias UIViewControllerType = QuickViewController

    func makeUIViewController(context: Context) -> QuickViewController {
        QuickViewController()
    }

    func updateUIViewController(_ uiViewController: QuickViewController, context: Context) {
        // Note: SwiftUI may call this more than once; guard against
        // re-presenting if the preview appears multiple times.
        uiViewController.presentARQuickLook()
    }
}
```

QuickViewController.swift

```swift
import UIKit
import QuickLook
import ARKit

class QuickViewController: UIViewController, QLPreviewControllerDelegate, QLPreviewControllerDataSource {
    // How many models to present
    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    // Provide the model to display
    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        let url = Bundle.main.url(forResource: "bear", withExtension: "usdz")! // Load file URL
        let preview = ARQuickLookPreviewItem(fileAt: url)
        return preview
    }

    func presentARQuickLook() {
        let previewController = QLPreviewController()
        previewController.dataSource = self
        present(previewController, animated: true)
        print("Open AR model!")
    }

    override func viewDidLoad() {
        super.viewDidLoad()
    }
}
```
Replies: 0 · Boosts: 0 · Views: 630 · Activity: Nov ’23
Quick Look AR shows "Object requires a newer version of iOS" on iOS 17 when using .reality files
Hi there. When hosting on my server a no-doubt-well-formed AR file, namely the "CosmonautSuit_en.reality" from Apple's examples (https://developer.apple.com/augmented-reality/quick-look/), the infamous and annoying "Object requires a newer version of iOS." message appears, even though I'm running iOS 17.1 on my iPad, the very latest available version. All works flawlessly on iOS 16 and below. Of course, my markup follows the required format, namely:

```html
<a rel="ar" href="https://artest.myhost.com/CosmonautSuit_en.reality">
    <img class="image-model" src="https://artest.myhost.com/cosmonaut.png">
</a>
```

Accessing this same .reality file from the aforementioned Apple page works fine. Why is it not working from my hosting server? For your information, when I host a USDZ instead, also from Apple's page of examples, such as the toy_drummer_idle.usdz file, all works flawlessly. Again, I'm using the same markup schema:

```html
<a rel="ar" href="https://artest.myhost.com/toy_drummer_idle.usdz">
    <img class="image-model" src="https://artest.myhost.com/toy_drummerpng">
</a>
```

Also, when I delete the rel="ar" option, the AR experience is launched, but with an extra step that goes through an ugly poster (generated by AR Quick Look on the fly), which ruins the UX/UI of my web app. This behavior is, by the way, the same one you get when accessing the .reality file directly by typing its URL into Safari's address bar. Any tip on this? Thanks for your time.
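(Editor's note: a frequent cause of the "newer version of iOS" message when a file works from Apple's server but not one's own is the Content-Type the server returns. A sketch for an Apache .htaccess follows: `model/vnd.usdz+zip` is Apple's documented type for USDZ, while `model/vnd.reality` for .reality files is an assumption that should be verified against the response headers Apple's own server sends for the same file.)

```
AddType model/vnd.usdz+zip .usdz
AddType model/vnd.reality .reality
```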
Replies: 2 · Boosts: 0 · Views: 724 · Activity: Oct ’23
Face Anchor in Reality Composer: Enabling Ball Movement Based on Head Tilts
Using the face anchor feature in Reality Composer, I'm exploring the potential for generating content movement based on facial expressions and head movement. In my current project, I've positioned a horizontal wood plane on the user's face, and I've added some dynamic physics-enabled balls on the wood surface. While I've successfully anchored the wood plane to the user's head movements, I'm facing a challenge with the balls. I'm aiming to have these balls respond to the user's head tilts, effectively rolling in the direction of the head movement. For instance, a tilt to the right should trigger the balls to roll right, and likewise for leftward tilts. However, my attempts thus far have not yielded the expected results, as the balls seem to be unresponsive to the user's head movements. The wood plane, on the other hand, follows the head's motion seamlessly. I'd greatly appreciate any insights, guidance, or possible solutions you may have regarding this matter. Are there specific settings or techniques I should be implementing to enable the balls to respond to the user's head movement as desired? Thank you in advance for your assistance.
Replies: 0 · Boosts: 0 · Views: 553 · Activity: Oct ’23
Issue with Quick Look AR App Missing Textures on 3D Models After Upgrading to iPadOS 17
Hello Apple community,

I hope this message finds you well. I'm writing to report an issue that I've encountered after upgrading my iPad to iPadOS 17. The problem seems to be related to the Quick Look AR application, which I use extensively for 3D modeling and visualization. Prior to the upgrade, everything was working perfectly fine. I create 3D models in Reality Composer and export them as USDZ files for use with Quick Look AR. However, after the upgrade to iPadOS 17, I've noticed a rather troubling issue.

Problem Description: When I view my 3D models using Quick Look AR on iPadOS 17, some of the 3D models exhibit a peculiar problem. Instead of displaying the correct textures, they show a bright pink texture in their place. This issue occurs only when I have subsequent scenes added to the initial scene. Strangely, the very first scene in the sequence displays the textures correctly.

Steps to Reproduce:
1. Create a 3D model in Reality Composer.
2. Export the model as a USDZ file.
3. Open the USDZ file using Quick Look AR.
4. Observe that the textures appear correctly on the initial scene.
5. Add additional scenes to the model.
6. Navigate to the subsequent scenes.
7. Notice that some of the 3D models display a pink texture instead of the correct textures (see picture).

Expected Behavior: The 3D models should consistently display their textures, even when multiple scenes are added to the scene sequence.

Workaround: As of now, there doesn't seem to be a viable workaround for this issue, which is quite problematic for my work in 3D modeling and visualization.

I would greatly appreciate any insights, solutions, or workarounds that the community might have for this problem. Additionally, I would like to know if others are experiencing the same issue after upgrading to iPadOS 17. This information could be helpful for both users and Apple in addressing this problem.

Thank you for your attention to this matter, and I look forward to hearing from the community and hopefully finding a resolution to this Quick Look AR issue. Best regards
Replies: 2 · Boosts: 1 · Views: 786 · Activity: Oct ’23
Wrong distance from vertical plane in AR QuickLook for USDZ with more than one object
Hi! I manually created a USDZ with one cube, anchored to a wall (vertical plane):

```usda
#usda 1.0
(
    defaultPrim = "Root"
    metersPerUnit = 1
    upAxis = "Y"
)

def Xform "Root" (
    assetInfo = {
        string name = "Root"
    }
    kind = "component"
)
{
    def Xform "Geom" (
        prepend apiSchemas = ["Preliminary_AnchoringAPI"]
    )
    {
        # token preliminary:anchoring:type = "plane"
        # token preliminary:planeAnchoring:alignment = "vertical"
        matrix4d xformOp:transform = ( (0, 0, 0, 0), (0, 0, 0, 0), (0, 0, 0, 0), (0, 0, 0.2, 1) )

        def Xform "Group"
        {
            def Cube "cube_0"
            {
                float3[] extent = [(-1, -1, -1), (1, 1, 1)]
                uniform bool doubleSided = 1
                rel material:binding = </Root/Materials/material_0>
                matrix4d xformOp:transform = ( (0.1, 0, 0, 0), (0, 0.1, 0, 0), (0, 0, 0.01, 0), (0, 0, 0, 1) )
                uniform token[] xformOpOrder = ["xformOp:transform"]
            }
        }
    }
}
```

It's displayed correctly in AR Quick Look. When I add a second cube to the USD(Z):

```usda
#usda 1.0
(
    defaultPrim = "Root"
    metersPerUnit = 1
    upAxis = "Y"
)

def Xform "Root" (
    assetInfo = {
        string name = "Root"
    }
    kind = "component"
)
{
    def Xform "Geom" (
        prepend apiSchemas = ["Preliminary_AnchoringAPI"]
    )
    {
        # token preliminary:anchoring:type = "plane"
        # token preliminary:planeAnchoring:alignment = "vertical"
        matrix4d xformOp:transform = ( (0, 0, 0, 0), (0, 0, 0, 0), (0, 0, 0, 0), (0, 0, 0.2, 1) )

        def Xform "Group"
        {
            def Cube "cube_0"
            {
                float3[] extent = [(-1, -1, -1), (1, 1, 1)]
                uniform bool doubleSided = 1
                rel material:binding = </Root/Materials/material_0>
                matrix4d xformOp:transform = ( (0.1, 0, 0, 0), (0, 0.1, 0, 0), (0, 0, 0.01, 0), (0, 0, 0, 1) )
                uniform token[] xformOpOrder = ["xformOp:transform"]
            }

            def Cube "cube_1"
            {
                float3[] extent = [(-1, -1, -1), (1, 1, 1)]
                uniform bool doubleSided = 1
                rel material:binding = </Root/Materials/material_0>
                matrix4d xformOp:transform = ( (0.1, 0, 0, 0), (0, 0.1, 0, 0), (0, 0, 0.01, 0), (0.3, 0, 0, 1) )
                uniform token[] xformOpOrder = ["xformOp:transform"]
            }
        }
    }
}
```

AR Quick Look displays the scene ~10 cm from the wall, and the more cubes the scene has, the greater the distance from the wall (here it is for two cubes). I also tried recreating the scene in Reality Composer on iPhone. Everything is OK for one cube, and OK for two cubes when previewing in the app (ARKit?), but when I export the scene from Reality Composer on macOS to USDZ we again see the wrong distance for two cubes and more. For tests I use an iPhone 13 Pro Max with iOS 16.3.1.
Replies: 2 · Boosts: 0 · Views: 598 · Activity: Jul ’23
iOS 17 AR QuickLook: Support for multiple UV channels
Is there support for using multiple UV channels in AR Quick Look in iOS 17? One important use case would be to put a tiling texture in an overlapping, tiling UV set while mapping ambient occlusion to a separate, unwrapped, non-overlapping UV set. This is very important for authoring 3D content that combines high-resolution surface detail with high-quality ambient occlusion data while keeping file size to a minimum.
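(Editor's note: for context, this is how two UV sets are declared at the USD level. Each UV set is a `texCoord2f` primvar, and each texture selects its set through a `UsdPrimvarReader_float2`'s `varname` input. The sketch below is illustrative; names such as `st1`, `DetailTex`, and `AOTex` are made up, and whether AR Quick Look actually samples the second set is exactly the open question here.)

```usda
def Mesh "Card"
{
    # UV set 1: overlapping, tiled coordinates for the detail texture
    texCoord2f[] primvars:st = [(0, 0), (4, 0), (4, 4), (0, 4)] (
        interpolation = "vertex"
    )
    # UV set 2: unwrapped, non-overlapping coordinates for baked AO
    texCoord2f[] primvars:st1 = [(0, 0), (1, 0), (1, 1), (0, 1)] (
        interpolation = "vertex"
    )
    rel material:binding = </Mat>
}

def Material "Mat"
{
    token outputs:surface.connect = </Mat/Surface.outputs:surface>

    def Shader "Surface"
    {
        uniform token info:id = "UsdPreviewSurface"
        color3f inputs:diffuseColor.connect = </Mat/DetailTex.outputs:rgb>
        float inputs:occlusion.connect = </Mat/AOTex.outputs:r>
        token outputs:surface
    }

    def Shader "STReader"
    {
        uniform token info:id = "UsdPrimvarReader_float2"
        token inputs:varname = "st"     # reads the tiled set
        float2 outputs:result
    }

    def Shader "ST1Reader"
    {
        uniform token info:id = "UsdPrimvarReader_float2"
        token inputs:varname = "st1"    # reads the unwrapped AO set
        float2 outputs:result
    }

    def Shader "DetailTex"
    {
        uniform token info:id = "UsdUVTexture"
        asset inputs:file = @detail.png@
        float2 inputs:st.connect = </Mat/STReader.outputs:result>
        float3 outputs:rgb
    }

    def Shader "AOTex"
    {
        uniform token info:id = "UsdUVTexture"
        asset inputs:file = @ao.png@
        float2 inputs:st.connect = </Mat/ST1Reader.outputs:result>
        float outputs:r
    }
}
```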
Replies: 2 · Boosts: 0 · Views: 1.3k · Activity: Aug ’23