Include the following details (edit as applicable):
- Issue category: Real-time Mapping-Depth / ARDK Documentation / Unity Example Package / Developer Tools / Scanning Framework
- Device type & OS version: iOS
- Host machine & OS version: Mac
- Issue Environment: On Device
- Xcode version: 13
- ARDK version: 3
- Unity version: 2021.3.38f1
Description of the issue:
I have a problem when trying to set up playback for app testing. I tried it on the “Normal Mesh” scene from the Lightship samples. Two errors occur:
[Error] Request kDepthSemantics_Frame missed a callback, likely because OnReceiveFrameData() takes longer than it should
InvalidOperationException: No environment prefab set. Pick an environment in a Scene View with Overlays → XR Environment.
UnityEngine.XR.Simulation.BaseSimulationSceneManager.SetupEnvironment () (at Library/PackageCache/com.unity.xr.arfoundation@5.1.0/Runtime/Simulation/BaseSimulationSceneManager.cs:88)
UnityEngine.XR.Simulation.SimulationSessionSubsystem+SimulationProvider.SetupSimulation () (at Library/PackageCache/com.unity.xr.arfoundation@5.1.0/Runtime/Simulation/Subsystems/SimulationSessionSubsystem.cs:136)
UnityEngine.XR.Simulation.SimulationSessionSubsystem+SimulationProvider.Initialize () (at Library/PackageCache/com.unity.xr.arfoundation@5.1.0/Runtime/Simulation/Subsystems/SimulationSessionSubsystem.cs:50)
UnityEngine.XR.Simulation.SimulationSessionSubsystem+SimulationProvider.Start () (at Library/PackageCache/com.unity.xr.arfoundation@5.1.0/Runtime/Simulation/Subsystems/SimulationSessionSubsystem.cs:73)
UnityEngine.SubsystemsImplementation.SubsystemWithProvider`3[TSubsystem,TSubsystemDescriptor,TProvider].OnStart () (at /Users/bokken/build/output/unity/unity/Modules/Subsystems/SubsystemWithProvider.cs:55)
UnityEngine.SubsystemsImplementation.SubsystemWithProvider.Start () (at /Users/bokken/build/output/unity/unity/Modules/Subsystems/SubsystemWithProvider.cs:10)
UnityEngine.XR.ARFoundation.ARSession.StartSubsystem () (at Library/PackageCache/com.unity.xr.arfoundation@5.1.0/Runtime/ARFoundation/ARSession.cs:396)
UnityEngine.XR.ARFoundation.ARSession+d__39.MoveNext () (at Library/PackageCache/com.unity.xr.arfoundation@5.1.0/Runtime/ARFoundation/ARSession.cs:384)
UnityEngine.SetupCoroutine.InvokeMoveNext (System.Collections.IEnumerator enumerator, System.IntPtr returnValueAddress) (at /Users/bokken/build/output/unity/unity/Runtime/Export/Scripting/Coroutines.cs:17)
My recording runs as an empty asset in the app background (with the app features not working) until I activate “Niantic Lightship Simulation” in “XR Management”. After I check this box, the errors I described above appear.
Please help.
BR,
Greg
Hello,
Playback and simulation are two different debug features, and only one can be used at a time. Would you be able to clarify which of the two you’re attempting to use so that I can better assist you?
Hi,
I read that simulation needs to be enabled when playback is used. I tried two cases:
- Playback active without simulation enabled. Result: I see my recording, but the AR features can’t see it.
- Playback with simulation enabled. Result: the errors I described above.
I’m sorry, it seems I misunderstood how the settings should be. So let’s consider only my first configuration (which is the one that should be set): playback without simulation enabled. Any idea why AR can’t recognize the recording?
Can you show me the rest of your setup for Occlusion? Mainly the Occlusion Manager?
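In the meantime, for reference, here is roughly the minimal occlusion setup I would expect on the AR camera. This is a sketch using AR Foundation’s AROcclusionManager (property names are from AR Foundation 5.x); treat it as a reference point rather than your exact configuration:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Minimal sketch: request environment depth so that virtual content
// (like the red cube) can be occluded by the real world.
// Attach this next to the ARCameraManager on the camera under the XR Origin.
[RequireComponent(typeof(AROcclusionManager))]
public class OcclusionSetupCheck : MonoBehaviour
{
    void Start()
    {
        var occlusion = GetComponent<AROcclusionManager>();

        // Occlusion only happens if environment depth is requested.
        occlusion.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;

        // Log the mode the depth provider actually granted at runtime.
        Debug.Log($"Environment depth mode: {occlusion.currentEnvironmentDepthMode}");
    }
}
```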
Hi,
This is not only about occlusion; the same problem occurs in Meshing and other samples.
Occlusion and the others work perfectly after building the app. Anyhow, below you can see the Occlusion Manager settings (in case they give any clue).
BR,
Greg
Hi,
Yes, I followed those instructions. The playback recordings are loaded and visible when the test is running (you can see it in the screenshot from the previous post).
BR,
Greg
I believe the screenshots you sent show how you set up playback. The tutorial I linked shows how to create a recording for playback (it’s not possible to use just any video; it has to be recorded through Lightship).
Can you see if you get the same results using the sample playback recordings that we provide here: https://lightship.dev/docs/ardk/how-to/unity/setting_up_playback/
They’re located under step 1.
I am also having trouble with this error in the “NormalMeshes” scene.
Request kDepthSemantics_Frame missed a callback, likely because OnReceiveFrameData() takes longer than it should
The screen is dark.
Device type & OS version: iOS
Host machine & OS version: M3 MacBook Pro
Issue Environment: In Editor
Xcode version: 13
ARDK version: 3.6.0 and 3.5.0 (tried both)
Unity version: 2022.3.16f1 and 2022.3.33f1 (Apple Silicon) (tried both)
Hi Kazuki,
Can you create a new topic for your question and include these details? While I understand the error involves playback in both of your cases, each of you is hitting the issue with different features. To avoid confusion, it would be best if I assist you both in separate topics.
Hi,
My screenshot shows that the playback recording doesn’t work. Please pay attention to the red cube, which should be occluded. I’m pasting it one more time below:
It was recorded via Lightship, following the instructions from the tutorial you linked. Please read this thread from the beginning; you will find this information there. I have the impression that I’m repeating myself.
Anyhow, I downloaded the recorded content I found under the link you gave (“ghandi_statue”). It results in the following error:
[Error] [2024-06-26 21:46:34.334] [0x70000cfac000] Request kDepthSemantics_Frame missed a callback, likely because OnReceiveFrameData() takes longer than it should
Ok, can you show me a screenshot of the file structure of the playback recording? I’d like to take a look to make sure nothing is missing.
Something is wrong with the recording. I created the mesh scene from scratch (previously I used “NormalMesh” from the Lightship samples). It now works with the “ghandi_statue” recording (downloaded from the Lightship resources). With my own recording it still doesn’t work.
Ok, I think you may be going one folder too deep. Can you remove ‘/chunk0’ from your dataset path? If that doesn’t work, can you show me a screenshot of what the folder layout looks like outside the chunk0 folder?
If I go one level up in the folder structure, it doesn’t work at all.
I noticed that the “ghandi” example has an additional JSON file, “camera.json”. In my chunk folder I have only “capture.json”. See below:
Thanks, let me see what I can dig up on this. The capture and camera JSON files should be at the top level of the main folder, not within the chunk folder. Have you tried creating a new recording to see if you can get it working a second time around?
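If it helps, a quick editor-side sanity check along these lines can confirm the layout. This is a hypothetical helper, not part of the ARDK API; it only checks that capture.json sits at the top level of the dataset folder rather than inside chunk0:

```csharp
using System.IO;
using UnityEngine;

// Hypothetical helper (not ARDK API): verifies that a playback dataset
// folder has its JSON metadata at the top level, next to the chunk folders.
public static class PlaybackDatasetCheck
{
    public static bool LooksValid(string datasetPath)
    {
        // capture.json (and camera.json) are expected beside chunk0, not inside it.
        bool captureAtTop = File.Exists(Path.Combine(datasetPath, "capture.json"));
        bool captureInChunk = File.Exists(Path.Combine(datasetPath, "chunk0", "capture.json"));

        if (!captureAtTop && captureInChunk)
            Debug.LogWarning("capture.json is inside chunk0; it should be at the top level of the dataset folder.");
        else if (!captureAtTop)
            Debug.LogWarning($"capture.json not found in {datasetPath}.");

        return captureAtTop;
    }
}
```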
Hello,
Were you able to try creating a new recording? It doesn’t look like this is a bug.