Setting ARSessionManager Runtime Environment to Live Device when running in Editor causes errors

Issue Category: Networking?
ARDK 1.0.1
Unity 2020.3.1f
Editor on Mac

Bug reproduction steps:
Use the ARNetworkingSceneManager prefab in a scene and set the ARSessionManager Runtime Environment to Live Device. Run in the editor and you’ll see a null reference in ARSessionManager.cs, caused by ARSessionFactory.Create failing to return an ARSession.
I get similar problems when setting the NetworkSessionManager Runtime Environment to Live Device.
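As far as I can tell it boils down to something like this (a simplified sketch of my understanding, not the actual ARSessionManager code; the RuntimeEnvironment value and factory overload are from my reading of the ARDK 1.x API, so treat the exact names as assumptions):

```csharp
using Niantic.ARDK;
using Niantic.ARDK.AR;
using UnityEngine;

public class LiveDeviceInEditorRepro : MonoBehaviour
{
    private void Start()
    {
        // With the Runtime Environment forced to Live Device there is no
        // device-backed session available inside the editor, so the factory
        // has nothing to hand back...
        IARSession arSession = ARSessionFactory.Create(RuntimeEnvironment.LiveDevice);

        if (arSession == null)
        {
            // ...and the next use of the session inside ARSessionManager is
            // what surfaces as the NullReferenceException.
            Debug.LogError("ARSessionFactory.Create returned no session in the editor");
        }
    }
}
```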

What I’m trying to do is connect the Editor instance to the instance of the app running on my phone (no problem that the AR environment won’t match).
Thanks!

Hi Ross,

I tried the repro steps with mixed results. Running only the ARNetworkingSceneManager, with the ARSessionManager and NetworkSessionManager scripts set to Live Device, I didn’t receive any errors and was able to connect. My settings look like this:

However, I did get the error when I had the Manage Using Unity Lifecycle option selected. Are you running a custom scene or using the included ARDK Examples “ARNetworking” scene? It also helps to know the exact error messages. Will you please provide a screenshot or copy and paste of the Unity console error descriptions?

Hello Erik - thanks for the quick response. Yes, sorry, I should’ve given some more detail there. It is a custom scene that I’m working on. The setup follows the video tutorial from the manual (which I can’t seem to find now): I add the ARNetworkingSceneManager to the scene, don’t manage it with the Unity lifecycle, and instead a button press sets the session ID and calls ARNetworkingManager.EnableFeatures();
(If I do switch to using Unity Lifecycle I seem to get the same error anyway.)
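For reference, the button wiring is roughly this (a trimmed-down sketch, not my exact code; the namespace and field names here are assumptions, and the session ID is set elsewhere in my scene):

```csharp
using Niantic.ARDK.Extensions;
using UnityEngine;

// Attached in the scene and hooked up to a UI Button's OnClick event,
// with "Manage Using Unity Lifecycle" left unticked on the managers.
public class JoinSessionButton : MonoBehaviour
{
    // The ARNetworkingManager from the ARNetworkingSceneManager prefab.
    [SerializeField] private ARNetworkingManager _arNetworkingManager;

    // Called by the button after the session ID has been set on the manager
    // (in my scene that comes from an input field).
    public void OnJoinPressed()
    {
        _arNetworkingManager.EnableFeatures();
    }
}
```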

I also get the same error if I try to use Live Device on the Pong HLAPI scene, whether it’s managed with Unity Lifecycle or not, so maybe it’s clearer for us to talk about that one. I’ll upload screens from my PongHLAPI tests.
(I can’t remember if I added the ARLog at line 239 - I think I did, but otherwise it’s just breaking on the .Deinitialised line that follows.)
There’s a reference to device logs I should check in the error message - will that be useful?
Thanks!




[screenshot: Unity console errors from the PongHLAPI test]

Hi Ross,
I’m having some trouble following along with your implementation for this. If you can provide a step by step description of what you are doing, and maybe a link to the video that you referenced earlier, that would be helpful. I tested what I could based on the information provided, and I found the errors, but they didn’t impede building to device. Once the scene was built, a connection would attempt to initialize. This worked both with the Unity editor and the device as host or peer, with both the Runtime Environment set to Live Device and with Manage Using Unity Lifecycle selected. Also, please note that trying to use Pong as a player in the Unity editor probably won’t work since you likely won’t be able to localize using the computer’s webcam.

Hi Erik… thanks for your replies - it’s possible that what I’m trying to do doesn’t really make any sense, so apologies if that’s the case.
The tutorial I followed was this one: Niantic AR Development Kit (ARDK): Creating Shared AR Experiences (“Creating shared experiences in Niantic…”).
I guess what I was hoping was that I could set the Runtime Environment in the editor to Live Device and that my editor instance would then connect to my phone instance (this does work, as you said), but that the ARSession in my editor instance would still be created and work as if I hadn’t set it to Live Device (which doesn’t work - the error is an ARSession null reference). Without a valid ARSession instance, lots of parts of my project fail. So, yes, the AR wouldn’t be able to localise properly, but I’d still be able to run through parts of the experience and see the messages etc. that were being sent and received…
Thanks,
Ross



“To ensure all clients can interact in a shared AR environment, clients must synchronize using mapping data from the environment generated by ARDK. This involves scanning the environment to create mapping data that all clients can synchronize to. This process is also known as AR localization.”

While the editor instance will attempt to connect (bypassing the errors we both found), it can’t be used as a Live Device, since it will be unable to localize (the in-Unity camera can’t gather mesh data). Multipeer networking can be tested in the Unity editor through Virtual Studio using the Mock Session tutorial, and I’d also recommend checking out the multiplayer experience documentation.

Those can be found here:
https://lightship.dev/docs/multiplayer_experience.html

https://lightship.dev/docs/vs_mock_mode.html
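As a very rough sketch of what the mock route looks like in code (the factory overloads and enum values here are from my reading of the ARDK 1.x API, so defer to the docs above if anything differs):

```csharp
using System.Text;
using Niantic.ARDK;
using Niantic.ARDK.AR;
using Niantic.ARDK.AR.Configuration;
using Niantic.ARDK.Networking;
using UnityEngine;

public class MockModeSmokeTest : MonoBehaviour
{
    private void Start()
    {
        // Everything runs against Virtual Studio's mock implementations, so the
        // editor gets a valid IARSession and IMultipeerNetworking without a device.
        IARSession arSession = ARSessionFactory.Create(RuntimeEnvironment.Mock);
        IMultipeerNetworking networking = MultipeerNetworkingFactory.Create(RuntimeEnvironment.Mock);

        // Start the (mocked) AR session so the rest of the project has a
        // working session to reference.
        arSession.Run(ARWorldTrackingConfigurationFactory.Create());

        // Join a named session; messaging can then be exercised entirely
        // in-editor, even though there is no real camera feed to localize against.
        networking.Join(Encoding.UTF8.GetBytes("test-session"));
    }
}
```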

I hope this helps.