There is currently no supported way to do remote editing in Unity. This is a known issue and a highly requested feature, and it is on the roadmap. I am opening this thread to discuss workarounds and ideas for approaching this in ARDK 2.
The recommended approach in ARDK 2 goes like this:
This is where I expected everything to be automatic, that I could just drag and drop content as with other products, but it takes a few extra steps. Basically, you load that scene and replace the spawned object with the mesh you converted from Draco to OBJ. You then assign your AR content as a child of that mesh.
You compile the app, attempt to localize, and once localized you instantiate the OBJ. It will not be in place, so you will need to build a UI to adjust its XYZ position and XYZ rotation.
With that UI in the app, you instantiate the content as described and then adjust the position and rotation of the AR object until it matches the real-world location perfectly.
Once you are confident, you hit save.
Then users need a separate app to retrieve the location and load the payloads (the content).
(Thanks DiegoUSDZ on Discord)
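For anyone attempting this workflow, here is a minimal sketch of the adjust-and-save step. It assumes the content has already been instantiated as a child of the transform that follows the localized wayspot anchor, and it stashes the offset in PlayerPrefs purely for illustration; in practice you would store the offset (together with the anchor payload) somewhere the viewer app can reach, such as your own backend. None of the names below are ARDK API calls, just plain Unity.

```csharp
using UnityEngine;

// Minimal sketch of the manual adjust-and-save step described above.
// Assumes `content` has already been instantiated as a child of the
// transform that follows the localized wayspot anchor. Persistence via
// PlayerPrefs is purely illustrative; a real setup would push the offset
// (and the anchor payload) to a backend the viewer app can reach.
public class ContentOffsetAdjuster : MonoBehaviour
{
    [System.Serializable]
    private struct Offset
    {
        public Vector3 position;
        public Vector3 eulerAngles;
    }

    [SerializeField] private Transform content;       // AR content under the anchor
    [SerializeField] private float moveStep = 0.01f;  // metres per nudge
    [SerializeField] private float rotateStep = 1f;   // degrees per nudge

    private const string SaveKey = "content_offset";  // hypothetical storage key

    // Wire these to +/- controls for each axis.
    public void Nudge(Vector3 axis)
    {
        content.localPosition += axis * moveStep;
    }

    public void Rotate(Vector3 axis)
    {
        content.localRotation *= Quaternion.Euler(axis * rotateStep);
    }

    // "Once you are confident, you hit save."
    public void Save()
    {
        var offset = new Offset
        {
            position = content.localPosition,
            eulerAngles = content.localEulerAngles
        };
        PlayerPrefs.SetString(SaveKey, JsonUtility.ToJson(offset));
        PlayerPrefs.Save();
    }

    // The viewer app calls this after localizing to reapply the saved pose.
    public void Restore()
    {
        if (!PlayerPrefs.HasKey(SaveKey))
            return;

        var offset = JsonUtility.FromJson<Offset>(PlayerPrefs.GetString(SaveKey));
        content.localPosition = offset.position;
        content.localEulerAngles = offset.eulerAngles;
    }
}
```

The viewer app would then parent its own copy of the content under the same anchor after localizing and call Restore() to reapply the saved offset.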
This solution is… not great. It creates a lot of steps and headaches, and many layers of abstraction between what you need and what you get.
I know remote editing is on the roadmap, but I wanted to suggest what might be an easy solution:
In the Wayfarer app you can either localize or scan. However, if the user were to localize first and then scan, the new model could be placed relative to the wayspot, and the model could be arranged so that it matches perfectly when set at (0, 0, 0).
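If that were possible, placement would boil down to parenting the authored model under the localized anchor and zeroing its local pose. A rough sketch (the anchorTransform field stands in for whatever transform ARDK drives after localization; the name is mine, not an ARDK API):

```csharp
using UnityEngine;

// Sketch of the "localize first, then place at the wayspot origin" idea.
// `anchorTransform` stands in for whatever transform follows the wayspot
// anchor once localization succeeds; the field name is illustrative.
public class PlaceRelativeToWayspot : MonoBehaviour
{
    [SerializeField] private Transform anchorTransform; // follows the localized anchor
    [SerializeField] private GameObject contentPrefab;  // authored against the wayspot origin

    public GameObject PlaceContent()
    {
        // Parenting under the anchor and zeroing the local pose drops the model
        // exactly on the wayspot origin, so no manual adjustment UI is needed
        // as long as the model was authored against that same origin.
        var content = Instantiate(contentPrefab, anchorTransform);
        content.transform.localPosition = Vector3.zero;
        content.transform.localRotation = Quaternion.identity;
        return content;
    }
}
```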
Another approach is the one Neogoma's Stardust SDK takes. They have a menu right in Unity where you can type in the ID of a "wayspot" and it loads a point cloud into the editor to reference against. That would be amazing to have here.
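Something like that could even be approximated today if a scan's point cloud could be exported to a simple text file. Purely as a sketch, assuming a hypothetical export of one "x y z" point per line, this editor-friendly component draws the cloud with gizmos so content can be lined up against it in the scene view; nothing here is an existing ARDK or Stardust feature:

```csharp
using System.Globalization;
using System.IO;
using System.Linq;
using UnityEngine;

// Rough sketch of the Stardust-style workflow suggested above: pull a point
// cloud for a wayspot into the editor as a placement reference. The file
// format (one "x y z" line per point) and the idea of exporting it from a
// scan are assumptions, not existing features.
[ExecuteAlways]
public class WayspotPointCloudPreview : MonoBehaviour
{
    [SerializeField] private string pointCloudPath = "Assets/wayspot_points.xyz"; // hypothetical export
    [SerializeField] private float pointSize = 0.01f;

    private Vector3[] _points;

    private void OnEnable()
    {
        if (!File.Exists(pointCloudPath))
            return;

        _points = File.ReadLines(pointCloudPath)
            .Select(line => line.Split(' '))
            .Where(parts => parts.Length >= 3)
            .Select(parts => new Vector3(
                float.Parse(parts[0], CultureInfo.InvariantCulture),
                float.Parse(parts[1], CultureInfo.InvariantCulture),
                float.Parse(parts[2], CultureInfo.InvariantCulture)))
            .ToArray();
    }

    private void OnDrawGizmos()
    {
        // Gizmo spheres are fine for a modestly sized reference cloud.
        if (_points == null)
            return;

        Gizmos.color = Color.cyan;
        foreach (var p in _points)
            Gizmos.DrawSphere(transform.TransformPoint(p), pointSize);
    }
}
```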