Context:
I’m trying to get the lowest position of any triangle created by meshing in order to have some sort of ground awareness for my flying object. So I loop through all the MeshFilters, check the minimum of each one’s bounds, and keep only the smallest Y found.
Description of the issue:
Every time, the lowest triangle found is clearly below the ground. I can easily have meshes with block coords (x, 1, z) even though I start the app approximately 0.7 m above the ground. So my function always returns -1.4 on the Y axis for the lowest point, which is impossible in the real world.
Illustration:
I printed the block coords of each chunk created by ARDK, with the text placed at the bounds center of each mesh. It may be difficult to see, but I started the app at the height of the table, which is 0.7 m. Yet blocks (0, 1, 1) and (1, 1, 1) are detected, and the lowest point according to the mesh bounds is -1.4 m (in Unity world space).
Question:
Even with a LiDAR device (iPhone 12 Pro) I can reproduce this issue. Is this ARDK-related, or have I misunderstood the block coords and the (1, -1, 1) scaling used to get the transforms into Unity world coordinates?
-
Issue category: Meshing
-
Device type & OS version: Android, iOS (Samsung Galaxy S9, iPhone 11, iPhone 12 Pro)
-
Host machine & OS version: Windows
-
Issue Environment : On Device
-
ARDK version: 1.1.0
-
Unity version: 2019.4.33f1
Hello Angeline,
I think you are on the right track with the block coords and the (1, -1, 1) scaling. Helpers like the ARMesh prefab and ARMeshManager take care of that transformation, which most of the time only amounts to applying a scale of (1, -1, 1) to the transform of the mesh’s GameObject. Block coordinates are integers. Assuming the default block size, (0, 0, 0) is the coordinate of the block containing vertices and triangles between 0.0 m and 1.4 m on every axis; (-1, -1, -1) is the block containing vertices and triangles between -1.4 m and 0.0 m on every axis.
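As a quick illustration of that mapping (a minimal sketch, not ARDK API; the 1.4 m default block size comes from the description above and the helper name is made up):

// Hypothetical helper illustrating the block-coordinate layout described above.
public static class BlockCoords
{
    // Default block size assumed from the explanation above: 1.4 m per block.
    public const float BlockSize = 1.4f;

    // Range a block covers on one axis: block 0 spans 0.0 m..1.4 m,
    // block -1 spans -1.4 m..0.0 m, and so on.
    public static (float min, float max) AxisRange(int blockCoord, float blockSize = BlockSize)
    {
        float min = blockCoord * blockSize;
        return (min, min + blockSize);
    }
}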
The Garden Mesh tutorial may be worth a look: each chunk is assessed for its viability to spawn a plant (number of vertices plus an available horizontal or vertical vertex), and then a plant is spawned on that vertex. One easy check is to define a height limit. Y = 0 is where the device camera starts the session, or you can dynamically set the height based on the lowest plane found (usually the floor); see the sketch after the links below.
https://lightship.dev/docs/meshing_addendum_lowlevel.html
https://lightship.dev/docs/meshing.html
https://lightship.dev/docs/meshing_tutorial_garden.html
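A minimal sketch of the height-limit check mentioned above (not code from the Garden Mesh tutorial; the floorY value and method name are illustrative):

using UnityEngine;

// Illustrative filter: reject any candidate vertex that falls below a chosen
// floor height. floorY could be 0 (the session origin) or the Y of the lowest
// detected plane, as described above.
public static class HeightLimit
{
    public static bool IsAboveFloor(Vector3 worldVertex, float floorY, float tolerance = 0.05f)
    {
        return worldVertex.y >= floorY - tolerance;
    }
}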
One easy check is to define a height limit. Y = 0 is where the device camera starts the session, or you can dynamically set the height based on the lowest plane found (usually the floor).
This is exactly what we’re trying to do. But the lowest plane is under the real floor so we can’t define the height between the start of the session (0y) and the real floor.
Is ARDK accurate enough to guarantee that every vertex and triangle is above the floor and not under it? When I start the session 0.7 m above the floor (in the real world), the lowest vertices are positioned at -1.4 m, not at -0.7 m as they are supposed to be.
Sorry to see you are still having trouble with mesh coordinates falling below floor level. The screenshot was helpful, though are you able to provide any others? Are you implementing custom code that you are able to share? Any reference data you can provide would be helpful. I’ve reached out to Engineering, though it may take some time for a possible resolution to be found. Thank you for your patience.
Sorry for the delayed response. I have reduced the project to the bare minimum so your team can test it.
What it does: you scan your environment, and each detected block is shown as in the previous picture I sent, meaning its coordinates are displayed. In addition, debug logs are printed in Update() to show the lowest bound of the frame. Feel free to check the scripts doing the work if you think there might be a mistake there.
Let me know if you have any concerns.
You can download it here
var minBound = _meshListener.ARMeshFilters[i].sharedMesh.bounds.min;
From the Unity docs: Mesh.bounds is the axis-aligned bounding box of the mesh in its local space (that is, not affected by the transform). Note that the Renderer.bounds property is similar but returns the bounds in world space. Since the function always returns -1.4 on the Y axis for the lowest point, using Renderer.bounds instead should solve the issue.
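A minimal sketch of that suggestion, assuming ARMeshFilters is a list of MeshFilter components as in the snippet above (the class and method names are illustrative):

using System.Collections.Generic;
using UnityEngine;

public static class MeshBoundsUtil
{
    // Illustrative: take the lowest world-space Y across the AR meshes by
    // reading Renderer.bounds, which is already an axis-aligned box in world space.
    public static float GetLowestWorldY(IList<MeshFilter> meshFilters)
    {
        float lowestY = float.MaxValue;
        foreach (var filter in meshFilters)
        {
            var meshRenderer = filter.GetComponent<MeshRenderer>();
            if (meshRenderer == null)
                continue;

            // No TransformPoint call needed: Renderer.bounds is in world space.
            lowestY = Mathf.Min(lowestY, meshRenderer.bounds.min.y);
        }

        return lowestY;
    }
}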
Yes, I was aware of this; that’s why, two lines after this, I use TransformPoint().
From the Unity docs: Transform.TransformPoint transforms a position from local space to world space.
You can see it in my code:
var minBound = _meshListener.ARMeshFilters[i].sharedMesh.bounds.min;
minBound = _arMeshesRoot.TransformPoint(minBound); // bounds in World Space
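One general Unity caveat that may be worth checking here (a sketch, not a confirmed explanation of the -1.4 m result): with a negative scale such as (1, -1, 1) on the transform, TransformPoint(bounds.min) is not guaranteed to be the lowest corner in world space, because the Y flip can turn the local minimum corner into the world-space maximum. Transforming both corners and taking the componentwise minimum avoids relying on that, assuming the transform only applies translation and scale (with rotation you would need all eight corners):

using UnityEngine;

// Illustrative helper: transform both corners of a local-space bounding box
// and take the componentwise minimum, so a (1, -1, 1) scale cannot turn the
// local "min" corner into a world-space "max" corner.
public static class BoundsUtil
{
    public static Vector3 WorldMinCorner(Transform root, Bounds localBounds)
    {
        Vector3 a = root.TransformPoint(localBounds.min);
        Vector3 b = root.TransformPoint(localBounds.max);
        return Vector3.Min(a, b);
    }
}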
We are continuing to run tests on this issue and hope to have a solution soon. Thank you for your patience.
Thank you for the support and follow-up on this. Much appreciated.
Hi @Angeline_G ,
Just wanted to update, and to let you know that we’re still looking into the behavior that you’re seeing with the meshing feature.
In the meantime, you may be able to accomplish the same thing using plane anchors to find the lowest ground plane instead of trying to do so with meshing. For more information, please see our documentation for Tracking AR Anchors and Detecting Planes.
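If it helps, a minimal sketch of that idea is below (the OnPlaneAnchorAdded hook is hypothetical, not a specific ARDK callback; wire it up to whatever plane/anchor event your integration exposes):

using UnityEngine;

// Illustrative component: remember the lowest plane anchor seen so far and
// treat its height as the floor.
public class LowestPlaneTracker : MonoBehaviour
{
    public float FloorY { get; private set; } = float.MaxValue;

    // Hypothetical hook: call this with the world-space position of each
    // plane anchor the AR session reports.
    public void OnPlaneAnchorAdded(Vector3 anchorWorldPosition)
    {
        if (anchorWorldPosition.y < FloorY)
            FloorY = anchorWorldPosition.y;
    }
}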
Thanks @David_Quevedo, that’s good to know. However, for this project we use meshing to interact with the world through a fluid simulation. It’s already a bit demanding for the device, so I wonder whether using AR planes at the same time would be OK or not.
That’s a good question. I don’t have an immediate answer as to how much overhead plane anchors would add to the app, so I’ve reached out to see if we can get a clearer picture of what that may be.
We’re also still looking into the behavior that you’re seeing with meshing, but given the current timing of being so close to the weekend, I would not expect an update for at least a couple of days.
Hi Angeline, sorry for the delay. It turns out that we don’t have benchmark numbers in place yet to give a definitive answer on how much overhead using AR planes would add to the project. But, keeping in mind that meshing can be sensitive to noise, planes might still be a better path forward than trying to do this with meshing.
In regard to the behavior that you are seeing, would it be possible for you to save the mesh, and then inspect it in the editor to verify that there are no floating artifacts beneath the floor? Any additional screenshots or video that you can share could also be helpful in debugging the behavior.
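For reference, one possible way to do that when reproducing inside the Unity editor (a sketch; the class name and asset path are arbitrary):

#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

// Illustrative editor-only helper: copy a runtime mesh into a saved asset so
// it can be opened and inspected after the session ends.
public static class MeshDumper
{
    public static void SaveMeshAsset(MeshFilter filter, string assetPath = "Assets/DumpedARMesh.asset")
    {
        var meshCopy = Object.Instantiate(filter.sharedMesh);
        AssetDatabase.CreateAsset(meshCopy, assetPath);
        AssetDatabase.SaveAssets();
    }
}
#endif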