Include the following details (edit as applicable):
- Issue category: Semantic Segmentation
- Device type & OS version: Android / iOS
- Host machine & OS version: Mac / Windows / Mac on Big Sur x.x
- Issue Environment: Unity Remote / Unity Mock / On Device / Dev Portal
- Xcode version: Latest
- ARDK version: Latest
- Unity version: 2020.3.LTS
Description of the issue:
I am playing around with the Semantic Segmentation and Masking templates. If I place an object in the sky segmentation, how can I detect a touch on that object? The object is rendered via masking onto a flat 2D texture (a RawImage) so that real-world objects can occlude it through the RawImage mask. So is there a way to detect a touch on the 3D object being rendered onto the RawImage? Using semantic masking I can tell when a touch lands on the sky, but I want to know when it lands on the object I placed in that segmentation. Hit tests are for detecting real-world surfaces, but what if I want to detect when a user touches an AR GameObject I placed on the screen?
For example, in the Semantic Masking template there are 2 Hot Air Balloons masked to the sky segmentation. How could I detect if a user touches one of those Hot Air Balloons?
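In case it helps frame the question: since the balloons are presumably still real 3D objects in the scene (the semantic mask only controls where they show through), one approach I would expect to work is a screen-space raycast against their colliders. A minimal sketch, assuming the balloons have Collider components and a "Balloon" tag (both names here are my own, not from the template):

```csharp
using UnityEngine;
using Niantic.ARDK.Utilities.Input.Legacy;

// Hypothetical sketch: raycast from the AR camera through the touch point
// and check whether it hits one of the masked balloon objects.
public class BalloonTouchDetector : MonoBehaviour
{
    [SerializeField] private Camera _arCamera;

    void Update()
    {
        if (PlatformAgnosticInput.touchCount == 0)
            return;

        var touch = PlatformAgnosticInput.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
            return;

        var ray = _arCamera.ScreenPointToRay(touch.position);

        // Assumes each balloon has a Collider and the tag "Balloon".
        if (Physics.Raycast(ray, out RaycastHit hit) &&
            hit.collider.CompareTag("Balloon"))
        {
            Debug.Log("Touched balloon: " + hit.collider.name);
        }
    }
}
```

The masking shader would not affect this at all, since the raycast works in world space against colliders, not against the rendered RawImage, but I have not verified this against the template itself.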
Also, when using Semantic Segmentation, can you include UI Buttons? When I call touch.IsTouchOverUIObject() I get the following error:
NullReferenceException: Object reference not set to an instance of an object
Niantic.ARDK.Utilities.Input.Legacy.PlatformAgnosticInput.IsTouchOverUIObject (UnityEngine.Touch touch) (at Assets/ARDK/Utilities/Input/Legacy/PlatformAgnosticInput.cs:110)
Niantic.ARDK.Templates.ObjectMaskingController.Update () (at Assets/LightshipHUB/Runtime/Scripts/ObjectMaskingController.cs:141)
presumably because the semantic masking is still updating the buffer and hasn't finished rendering. This is my first experiment with this, so maybe I am missing how I can include UI buttons while semantic masking is active? The buttons on the screen are not registering touches, and their onClick methods are never called.
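For what it's worth, a NullReferenceException at that line may also simply mean the scene has no EventSystem, since IsTouchOverUIObject queries the UI event system internally (this is my guess from the stack trace, not something I've confirmed in the ARDK source). A sketch of a guard that avoids the exception either way:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using Niantic.ARDK.Utilities.Input.Legacy;

// Hypothetical workaround: only call IsTouchOverUIObject when an
// EventSystem actually exists, so the check cannot throw. Also make
// sure the scene contains an EventSystem with an input module,
// otherwise UI Buttons will never receive touches at all.
public class SafeUITouchCheck : MonoBehaviour
{
    void Update()
    {
        if (PlatformAgnosticInput.touchCount == 0)
            return;

        var touch = PlatformAgnosticInput.GetTouch(0);

        // Guard: skip the UI check entirely if no EventSystem is present.
        bool overUI =
            EventSystem.current != null &&
            touch.IsTouchOverUIObject();

        if (!overUI)
        {
            // Handle AR-scene touches here.
        }
    }
}
```

If the EventSystem is missing, that would also explain the buttons' onClick handlers never firing, independent of the masking.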