I’m currently working on a multiplayer prototype that is supposed to use AR Meshing (ARDK 3.3). However, I’m not quite sure how to get it working correctly. I set it up the normal way: adding an empty game object to my XR Origin, adding the Lightship meshing extension (Mesh Manager), and creating an AR Mesh prefab.
So far it seems like both phones are doing their own meshing, but it’s not shared.
Can anyone tell me whether the game object that holds the Mesh Manager, or the AR Mesh prefab, should have a NetworkObject script attached, or what else I need to do to get a coherent shared mesh?
Hi Josh, you can see it in this video, starting at 0:14. The mesh is not the same on both devices. I thought that in Shared AR it might create a “monolithic” mesh based on both players’ input.
Hi Josh, I tested the prototype shown above again today, and the major issue is that, since both players have their own meshing, they can’t shoot each other due to the height difference / different depth perception of the meshes. It would be awesome if there were a solution for this.
By default, the Network Manager component does not automatically share meshing, and the Mesh Manager handles meshing locally. If you use SharedSpaceManager with either VPS or Image Tracking, it should make it so that all objects under the Shared Space root match up with physical space. However, this might not be what you need, as it might not be super performant.
Could you give me more details about what you are trying to use the mesh for? Is it simply for determining hit detection between the players and is there a reason you decided to go with the shared meshing approach?
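For anyone following along, here is a rough sketch of the SharedSpaceManager setup Josh mentions, using image tracking for colocalization. The names follow the Lightship Shared AR sample scene; the room name, capacity, and image size are placeholder values, so verify everything against your ARDK 3.3 install.

```csharp
// Hedged sketch: colocalize clients via a shared image target so that
// anything parented under the shared-space root lines up across devices.
using Niantic.Lightship.SharedAR.Colocalization;
using UnityEngine;

public class ColocalizationBootstrap : MonoBehaviour
{
    [SerializeField] private SharedSpaceManager _sharedSpaceManager;
    [SerializeField] private Texture2D _targetImage;            // printed marker both players scan
    [SerializeField] private float _targetImageSizeMeters = 0.3f;

    void Start()
    {
        _sharedSpaceManager.sharedSpaceManagerStateChanged += OnSharedSpaceStateChanged;

        var tracking = ISharedSpaceTrackingOptions.CreateImageTrackingOptions(
            _targetImage, _targetImageSizeMeters);
        var room = ISharedSpaceRoomOptions.CreateLightshipRoomOptions(
            "my-room", 4, "demo room"); // placeholder room settings

        _sharedSpaceManager.StartSharedSpace(tracking, room);
    }

    private void OnSharedSpaceStateChanged(
        SharedSpaceManager.SharedSpaceManagerStateChangeEventArgs args)
    {
        // Once tracking is established, content placed under the shared
        // origin sits on the same physical spot on every device.
        if (args.Tracking)
            Debug.Log("Colocalized; safe to start or join the networked session.");
    }
}
```

Note that this only aligns coordinate frames between devices; each phone still builds its own local mesh.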
Thank you, Josh. Yes, I need shared meshing because in my game both players should be running around on the same map. Without shared meshing, it sometimes happens that the players are not at the same height, for example, and can’t shoot each other at all.
I’m currently recording a tutorial for my YouTube channel, Alive Studios, and would like to explain shared meshing in that video. That’s why I wanted to ask you.
The thing is, I’m already using the SharedSpaceManager; it is added to my XR Origin.
Here is the way I init my Shared AR Session:
using System;
using System.Collections;
using System.Collections.Generic;
using Colocalization;
using GameManager;
using Niantic.Lightship.SharedAR.Colocalization;
using Unity.Netcode;
using UnityEngine;
using UnityEngine.Serialization;
using UnityEngine.UI;
public class StartNetworkingManagerSharedARImage : NetworkBehaviour
{
[SerializeField] private SharedSpaceManager _sharedSpaceManager;
    const int MAX_AMOUNT_CLIENTS_ROOM = 4;
    // ... (remainder of the initialization code omitted)
}
It appears that shared meshing isn’t built-in functionality, so you would have to write your own NetworkBehaviour that replicates what the Mesh Manager does locally, making meshes appear for remote clients. Something to also consider is that consistently sending that many vertices over the network would be inefficient.
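A minimal sketch of that "roll your own" approach, assuming Netcode for GameObjects: the host serializes a mesh and pushes it to clients through the custom messaging manager so everyone collides against the host's geometry. The class name and wiring are illustrative; a real version would need chunking, compression, and throttling, since full mesh updates are bandwidth-heavy.

```csharp
// Hedged sketch: host broadcasts its locally-generated mesh to all clients
// via Netcode's named messages; clients rebuild it for collision.
using Unity.Collections;
using Unity.Netcode;
using UnityEngine;

public class MeshBroadcaster : NetworkBehaviour
{
    const string MessageName = "SharedMeshChunk";

    public override void OnNetworkSpawn()
    {
        if (!IsHost)
            NetworkManager.CustomMessagingManager.RegisterNamedMessageHandler(
                MessageName, OnMeshChunkReceived);
    }

    // Host-side: call whenever the local Mesh Manager updates a mesh block.
    public void BroadcastMesh(Mesh mesh)
    {
        var verts = mesh.vertices;
        var tris = mesh.triangles;
        int size = sizeof(int) * 2
                 + verts.Length * sizeof(float) * 3
                 + tris.Length * sizeof(int);

        using var writer = new FastBufferWriter(size, Allocator.Temp);
        writer.WriteValueSafe(verts.Length);
        foreach (var v in verts) writer.WriteValueSafe(v);
        writer.WriteValueSafe(tris.Length);
        foreach (var t in tris) writer.WriteValueSafe(t);

        NetworkManager.CustomMessagingManager.SendNamedMessageToAll(
            MessageName, writer, NetworkDelivery.ReliableFragmentedSequenced);
    }

    // Client-side: rebuild the host's mesh locally so both players shoot
    // against the same geometry instead of their own scan.
    private void OnMeshChunkReceived(ulong senderId, FastBufferReader reader)
    {
        reader.ReadValueSafe(out int vertCount);
        var verts = new Vector3[vertCount];
        for (int i = 0; i < vertCount; i++) reader.ReadValueSafe(out verts[i]);

        reader.ReadValueSafe(out int triCount);
        var tris = new int[triCount];
        for (int i = 0; i < triCount; i++) reader.ReadValueSafe(out tris[i]);

        var mesh = new Mesh { vertices = verts, triangles = tris };
        mesh.RecalculateNormals();
        // Assign to a MeshCollider parented under the shared-space root here.
    }
}
```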
That being said, if that map is going to always be the same, you could simply generate a mesh of that map beforehand for the host to use.
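If the pre-baked route fits your game, a sketch could look like the following, assuming Netcode for GameObjects: scan the play space once, save the mesh as a prefab with a NetworkObject and MeshCollider, and have the host spawn it. The class and field names here are made up for illustration.

```csharp
// Hedged sketch: host spawns one pre-scanned map mesh so every client
// collides against identical geometry instead of its own live scan.
using Unity.Netcode;
using UnityEngine;

public class PrebakedMapSpawner : NetworkBehaviour
{
    // Prefab containing the baked map mesh, a MeshCollider, and a NetworkObject.
    [SerializeField] private NetworkObject _bakedMapPrefab;

    public override void OnNetworkSpawn()
    {
        if (IsHost)
        {
            var map = Instantiate(_bakedMapPrefab);
            map.Spawn(); // replicates the map object to all connected clients
        }
    }
}
```

With colocalization active, spawning the map at the shared origin keeps it aligned to the same physical space on every device.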
Some alternative approaches could rely on server-side hitboxes to compensate for the inaccuracies in the y-positioning. Alternatively, you could redirect the bullet during the on-fire event so that its path is corrected whenever the y offset isn’t too egregious and stays within limits you set.
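The server-side forgiveness idea could be sketched roughly like this, assuming Netcode for GameObjects; the tolerance value and validation logic are made-up tuning choices, not an official pattern.

```csharp
// Hedged sketch: validate shots on the host and forgive small vertical
// offsets caused by the two devices' differing mesh heights.
using Unity.Netcode;
using UnityEngine;

public class ShotValidator : NetworkBehaviour
{
    [SerializeField] private float _maxYForgiveness = 0.4f; // metres; tune per playtest

    [ServerRpc(RequireOwnership = false)]
    public void FireServerRpc(Vector3 origin, Vector3 direction, ulong targetClientId)
    {
        var target = NetworkManager.ConnectedClients[targetClientId].PlayerObject;
        float yError = Mathf.Abs(origin.y - target.transform.position.y);

        // Flatten the shot onto the target's height before raycasting, so a
        // purely vertical mesh disagreement cannot turn a hit into a miss.
        Vector3 flatOrigin = new Vector3(origin.x, target.transform.position.y, origin.z);
        Vector3 flatDir = new Vector3(direction.x, 0f, direction.z).normalized;

        if (yError <= _maxYForgiveness
            && Physics.Raycast(flatOrigin, flatDir, out RaycastHit hit)
            && hit.transform == target.transform)
        {
            // Register the hit authoritatively on the server.
            Debug.Log($"Hit confirmed on client {targetClientId}");
        }
    }
}
```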