We are using ARDK meshing to scan the surrounding world and place objects on the generated mesh. However, we ran into a problem: at distances beyond ~5 meters the mesh generates very slowly (not an FPS issue; it just needs to collect more data to build the mesh), and nothing is generated at distances beyond ~10 meters. As a result, we can't display objects correctly over long distances: they usually 'fall' under the surface, and occlusion hides half of the object's body.
But since occlusion does work at large distances, maybe there is a way to analyze the occlusion data to create super-low-poly meshes or planes at far distances. Is this possible? Or is there a simpler solution to our problem?
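For what it's worth, here is a rough sketch of the idea, independent of ARDK's actual API (which may expose depth data differently): take a depth map like the one the occlusion system uses, unproject samples to 3D points using the camera intrinsics, and fit a single coarse plane to them with least squares. The function names and intrinsics here are illustrative assumptions, not ARDK calls.

```python
import numpy as np

def unproject(depth, fx, fy, cx, cy):
    """Unproject a depth map (meters) into 3D camera-space points.

    depth: (h, w) array of depth values.
    fx, fy, cx, cy: assumed pinhole camera intrinsics.
    """
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    x = (us - cx) * depth / fx
    y = (vs - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

def fit_plane(points):
    """Least-squares plane fit via SVD.

    Returns (normal, d) such that normal . p + d = 0 for points p
    on the plane; the normal is the direction of least variance.
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return normal, -normal.dot(centroid)

# Toy example: a synthetic depth map of a flat wall 2 m away.
depth = np.full((48, 64), 2.0)
pts = unproject(depth, fx=50.0, fy=50.0, cx=32.0, cy=24.0)
normal, d = fit_plane(pts)
# The recovered plane should be z = 2 in camera space,
# i.e. normal ~ (0, 0, +/-1) and |d| ~ 2.
```

In practice you would sample far-away depth pixels only, fit one or a few such planes per region, and use them as stand-in colliders so placed objects stop 'falling' through unmeshed areas; whether the raw depth buffer is accessible at those ranges is something Niantic would need to confirm.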
Hi Alex,
Niantic is continuing to work on increasing the distance at which the world can be meshed, but at present we do not have a specific feature on our roadmap that would enable this. Improvements are rolling out regularly, so please keep an eye on the announcements for any news.