- Issue category: Semantic Segmentation
- Device type & OS version: Android and iOS
- Host machine & OS version: Windows
- Issue Environment: On Device
- Xcode version: ?
- ARDK version: 2.4.1
- Unity version: 2023.1.0a26
Description of the issue:
I’m developing an app in which the floor is replaced with a floor texture of your choice. The app uses Semantic Segmentation/Confidence to detect what is floor, and in many cases it works very well! However, I’ve recently discovered that under some conditions the app can’t track parts of the floor, like the example in the image above, where I’m indoors and strong sunlight is coming through the window. The bright areas on the floor are then hard for the app to recognize as floor.
In this example Semantic Confidence is used, and for each pixel the floor texture gets an alpha value depending on the confidence value (that’s why you can see through the floor texture in some areas). But in the brightest area there is no texture at all, because the confidence values there are close to 0.
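For reference, the per-pixel confidence-to-alpha mapping described above can be sketched as follows. This is a minimal NumPy illustration, not the app's actual Unity code; the `low`/`high` thresholds and the smoothstep ramp are assumptions I'm using to show the idea:

```python
import numpy as np

def confidence_to_alpha(confidence, low=0.2, high=0.6):
    """Map a per-pixel floor-confidence buffer (0..1) to texture alpha.

    Confidence below `low` becomes fully transparent, above `high`
    fully opaque, with a smoothstep ramp in between. Thresholds are
    illustrative, not values from the original app.
    """
    t = np.clip((confidence - low) / (high - low), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)  # smoothstep easing

# A bright, washed-out patch with near-zero confidence stays transparent,
# which reproduces the hole described above.
conf = np.array([[0.95, 0.70],
                 [0.30, 0.02]])
alpha = confidence_to_alpha(conf)
```

With this kind of mapping, the hole in the bright area is a direct consequence of the near-zero confidence values, so the fix has to come from the segmentation input rather than the blending step.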
Is there anything I can do to improve the tracking in situations like this, or is this a case that Semantic Confidence simply can’t handle any better?