Low Semantic Confidence values in (partially) bright environments

  • Issue category: Semantic Segmentation
  • Device type & OS version: Android and iOS
  • Host machine & OS version: Windows
  • Issue Environment: On Device
  • Xcode version: ?
  • ARDK version: 2.4.1
  • Unity version: 2023.1.0a26

Description of the issue:

Hi!

I’m developing an app in which the floor is replaced with a floor texture of your choice. The app uses Semantic Segmentation/Confidence to detect what is floor, and in many cases it works very well! However, I’ve recently discovered that under some conditions the app can’t track parts of the floor, like in the example image above, where I’m indoors and strong sunlight is coming through the window. The bright areas on the floor are then hard for the app to recognize as floor.

In this example Semantic Confidence is used, and for each pixel the floor texture gets an alpha value based on the confidence value (that’s why you can see through the floor texture in some areas). But in the brightest area there is no texture at all, because the confidence values there are close to 0.
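For reference, the masking step looks roughly like the simplified sketch below. The component and the `_ConfidenceMask` shader property name are placeholders I made up for this post; the actual project reads the confidence texture from the ARDK semantic segmentation manager, which isn’t shown here.

```csharp
using UnityEngine;

// Simplified sketch of the masking step described above: the floor material's
// shader is assumed to sample a confidence texture (the "_ConfidenceMask"
// property name is an assumption) and use it as the per-pixel alpha of the
// replacement floor texture.
public class FloorConfidenceMask : MonoBehaviour
{
    [SerializeField] private Material _floorMaterial;

    private static readonly int ConfidenceMask = Shader.PropertyToID("_ConfidenceMask");

    // Called whenever a new confidence frame for the floor channel is available.
    public void ApplyConfidence(Texture confidenceTexture)
    {
        // Where confidence is ~0 (e.g. the over-bright patch of floor), the
        // shader's alpha goes to 0 and the replacement texture disappears.
        _floorMaterial.SetTexture(ConfidenceMask, confidenceTexture);
    }
}
```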

Is there anything I can do to improve the detection in situations like this, or are cases like this something Semantic Confidence simply can’t handle any better?

Hi Johan, our segmentation capabilities get better with each iteration of the framework, but as you’ve seen there are still a few situations that can stump it. A couple of things to try: push back some of the other channels with the ARSemanticSegmentationManager, and keep the camera as level with the horizon as possible. The full guide on the use of Semantic Segmentation can be found here. Keep an eye on the community pages for more updates and tips from fellow users. Thank you!
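If you want to see which channels respond at all in that bright patch, a rough probe like the one below can help. The channel names and the confidence textures you pass in are placeholders for whatever your setup provides; the exact accessors on ARSemanticSegmentationManager in ARDK 2.4 are described in the guide linked above and aren’t shown here.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Diagnostic sketch: sample each channel's confidence texture at a viewport
// point (e.g. the centre of the bright patch) and log the values, to see
// which channels, if any, still react there.
public static class ChannelConfidenceProbe
{
    public static void LogConfidencesAt(
        Vector2 viewportPoint, Dictionary<string, Texture2D> confidenceByChannel)
    {
        foreach (var entry in confidenceByChannel)
        {
            // Confidence textures are assumed to be single-channel (value in .r)
            // and CPU-readable.
            float confidence =
                entry.Value.GetPixelBilinear(viewportPoint.x, viewportPoint.y).r;

            Debug.Log($"{entry.Key}: {confidence:0.00}");
        }
    }
}
```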

Thank you for the quick response!

I tried the Segmentation example in ARDK-Examples, but unfortunately no other channel seems to react to those bright areas. I will keep an eye out here for future updates. Thank you!