Issue category: Image Detection
Device type & OS version: Android / iOS
Issue Environment: On Device
Xcode version: 13.2.1
ARDK version: 1.1.0
Unity version: 2019.4.10f
Description of the issue: When building to mobile (Android and iPhone) and testing image detection against a real-world object that was previously captured, the image is sometimes detected successfully and sometimes not. I would like to understand what factors are taken into account when comparing the reference image with the real-world object, and what the maximum distance is at which the image can still be detected.
Hello. Regarding the longest distance at which you can detect an object: that depends on several factors, such as the size of the object, local lighting conditions, and the object's overall visual features.
For example, when scanning an object in the environment to reach a stable state, it's recommended to position the object around 1-5 meters from the camera, with 5 meters being around the sweet spot, depending on the size of the object being scanned. Also, physical objects with distinct features are easier to detect than, say, a cube or a plane with a printed pattern on it.
So, as a general rule of thumb, around 5 meters from an object about the size of a chair should work.
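One way to reason about the "chair-sized object at 5 meters" rule of thumb is through the object's angular size in the camera frame, since detection depends on both object size and distance. The sketch below is purely illustrative geometry, not an ARDK API or an official threshold; the ~11-degree cutoff is just what a ~1 m object subtends at 5 m, used here as a hypothetical baseline:

```python
import math

def angular_size_deg(object_size_m: float, distance_m: float) -> float:
    """Angle (in degrees) that an object of the given size subtends
    at the given distance from the camera."""
    return math.degrees(2 * math.atan(object_size_m / (2 * distance_m)))

# A chair-sized object (~1 m) viewed from 5 m subtends roughly 11 degrees.
chair_at_5m = angular_size_deg(1.0, 5.0)
print(round(chair_at_5m, 1))  # ~11.4

def likely_detectable(object_size_m: float, distance_m: float,
                      min_angle_deg: float = 11.0) -> bool:
    """Illustrative-only heuristic: treat anything subtending less than
    the baseline angle as likely too small or too far to detect reliably."""
    return angular_size_deg(object_size_m, distance_m) >= min_angle_deg

print(likely_detectable(1.0, 3.0))  # closer -> larger angular size -> True
print(likely_detectable(0.2, 5.0))  # small object far away -> False
```

This kind of back-of-envelope check can help explain intermittent detection: the same object drifting a few meters farther away, or viewed under worse lighting, may drop below the effective size at which its features are still distinguishable.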
More information can be found in the Image Detection section of our documentation. I hope this helps.