Plane detection does not recognise some window sills


  • Issue category: ARDK
  • Device type & OS version: iOS / iPhones without LiDAR, e.g. iPhone 12 mini, iPhone 14, iPhone 15
  • Host machine & OS version: Mac
  • Issue environment: On Device
  • Xcode version: 15.2
  • ARDK version: 3.3.0-2402121803
  • Unity version: 2022.3.21f1

Description of the issue:
For our client we are developing an application that lets users place blinds on window sills. We chose Lightship because it offered much better object stability than ARKit. We started developing the application on Lightship 2.5.2 but recently updated to 3.3.0.
The flow of the application requires the user to first select a plane on the window sill. This worked very well on 2.5.2, but after updating to 3.3.0 we noticed that multiple window sills (all of them white) are no longer recognised when we use an iPhone without LiDAR. I’m attaching a screenshot to better visualise the problem: the white dots mark a recognised plane on the floor, but the window sill is not detected.


I can provide more examples, but as a new user I can post only a single image.

Hi Ksysu,

Thanks for choosing Lightship! An important difference between Lightship 2.x and the latest versions is that Lightship 3 is built to work synergistically with ARFoundation, meaning that Lightship 3 doesn’t “reinvent the wheel.” As a result, the setup you had in 2.5.2 won’t work out of the box in a 3.3.0 build. You’ll want to make sure you’re using the ARPlaneManager in the way ARFoundation expects. To do so, you could follow an ARFoundation plane detection tutorial or guide such as this one: https://www.youtube.com/watch?v=mDLmqhhY-6, or feel free to provide screenshots or additional information about the setup of your game objects, their properties, their components, etc., so I can assist you further.
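For reference, a standard ARFoundation-style plane setup looks roughly like the sketch below. This is illustrative only (the class name PlaneSetup is made up, and it assumes AR Foundation 5.x with an ARPlaneManager on your XR Origin); your exact setup may differ:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch: attach to the same GameObject as your ARPlaneManager
// (typically the XR Origin). Class and field names are hypothetical.
public class PlaneSetup : MonoBehaviour
{
    ARPlaneManager planeManager;

    void Awake()
    {
        planeManager = GetComponent<ARPlaneManager>();
        // Window sills are horizontal surfaces; requesting horizontal
        // detection explicitly rules out a detection-mode mismatch.
        planeManager.requestedDetectionMode = PlaneDetectionMode.Horizontal;
    }

    void OnEnable() => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // Log newly detected planes so you can see what the subsystem finds.
        foreach (var plane in args.added)
            Debug.Log($"Detected plane {plane.trackableId}, alignment: {plane.alignment}");
    }
}
```

If your planes log and visualise correctly with a setup like this, the ARPlaneManager side is fine and we can look elsewhere.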

Kind regards,
Maverick L.

Hello Maverick_Liberty,
Thank you for your answer, but I’m afraid we misunderstood each other. My problem isn’t that we are unable to transition from 2.5.2 to 3.3.0; detected planes are correctly visualised by our application.
The problem is that we have noticed a significant decrease in the number of surfaces recognised as planes compared to 2.5.2 when using iPhones without LiDAR (the issue disappears on devices with LiDAR). The most notable example for us is white window sills. Here is another example of a sill that is not recognised even though it worked on 2.5.2:


Hopefully that is a better explanation of the issue that we are facing.

Here is the same sill on 2.5.2:

When you get closer in the 3.3.0 version, is it detecting the window sill? I mentioned the differences between the versions because the changed implementation and default settings compared to 2.x are most likely contributing to this. If you uncheck Use LiDAR if Available in Depth settings, do you get a consistent result between all devices?

When you get closer in the 3.3.0 version, is it detecting the window sill?

Moving back and forth sometimes helps with detecting a sill, but on some days it is not recognised at all. Sometimes the space between window panes gets recognised as a plane, but not the sill itself.

If you uncheck Use LiDAR if Available in Depth settings, do you get a consistent result between all devices?

No, it seems to have no effect on plane detection. A phone with LiDAR still detects every surface with almost no delay, while a phone without LiDAR struggles to detect a white window sill.

Here are our settings for the SDK:
Niantic SDK settings

Would you mind sending a minimal reproduction project to my private messages? Getting my hands on the project would help tremendously in tracking down what’s causing this.

Thank you!