We are currently developing an app with Lightship ARDK and need to figure out how to integrate the AR features into it. Our plan so far is to build the UI and frontend with React/Ionic, communicating with the backend via an API. Only when the user explicitly chooses to should AR be enabled and the view switched from the UI to the camera feed.
Has anyone tried to embed a Lightship Scene into a proper app yet?
If so, how did you manage building for iOS/Android separately?
Are there any unexpected difficulties (with Unity in general)?
Is there going to be a guide/tutorial by Lightship on how to embed it into other apps?
Does anyone have experience with the capabilities of the Unity UI? Is it sufficient for building a proper UI?
I am aware that this isn't really a feature request, but I don't know where else I should post this question.
Please refer to the AR Voyage documentation, which provides a good starting point. It covers the Lightship demo sample, an example of a full ARDK app that uses several ARDK features.
You can develop an app with Lightship ARDK, but please review the terms of service beforehand, located at Niantic Lightship Developer Platform Terms of Service and License Agreement – Niantic Lightship.
In addition, there is also a minimum OS version requirement for using ARDK features. You can check out our Runtime Requirements documentation for more information on that.
Good luck on your ARDK journey. Thanks.
I don't know about React/Ionic, but you should be able to run Lightship within a Unity Activity launched from a native app. Take a look here: Unity - Manual: Using Unity as a Library in other applications. I have tested it on Android and it works.
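To illustrate the approach above, here is a minimal sketch of launching the Unity player from a native Android app, assuming you have exported your Unity project and added the resulting `unityLibrary` module to your Gradle project as described in the "Using Unity as a Library" manual page. The layout and button IDs are hypothetical; only `UnityPlayerActivity` comes from the Unity export itself.

```kotlin
// Hypothetical sketch: launch the Unity/ARDK view from a native Android app.
// Assumes the exported "unityLibrary" module is already added as a Gradle
// dependency; R.layout.activity_main and R.id.start_ar_button are placeholders.
import android.content.Intent
import android.os.Bundle
import android.widget.Button
import androidx.appcompat.app.AppCompatActivity
import com.unity3d.player.UnityPlayerActivity // provided by the unityLibrary module

class MainActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        // Keep the regular app UI in native Activities, and only switch to the
        // Unity camera/AR view when the user explicitly asks for it.
        findViewById<Button>(R.id.start_ar_button).setOnClickListener {
            startActivity(Intent(this, UnityPlayerActivity::class.java))
        }
    }
}
```

Finishing the Unity Activity (e.g. via a back button) returns the user to the native UI, which matches the "AR only on demand" flow described in the original question.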