Getting the correct transform for the camera feed when rendering it to texture

ARDK 1.0.1
Unity 2020.3.1f1
Mac
Xcode beta 13.2

Hello,
I’ve followed this topic in order to render the camera feed to a texture.
I then want to display this texture exactly as it would appear if the ARRenderingManager were set to render to ‘Camera.’
(I’m doing this because I want to use some pixels from the live feed at various points in the application.)
The way I’m currently attempting to do this is:

  • At startup I create a RenderTexture at screen resolution.
  • Assign that RenderTexture to the ARRenderingManager, and also to the material of a plane (which fits the screen exactly) that is drawn by its own camera behind everything else.
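
A minimal sketch of those two steps, assuming the ARRenderingManager’s render target is pointed at the exposed texture via its Inspector settings (the class, field, and plane names here are my own, not from ARDK):

```csharp
using UnityEngine;

public class CameraFeedToTexture : MonoBehaviour
{
    // Full-screen plane that displays the feed, assigned in the Inspector.
    public Renderer backgroundPlane;

    // Exposed so the ARRenderingManager's texture target can be set to it.
    public RenderTexture FeedTexture { get; private set; }

    void Start()
    {
        // RenderTexture at screen resolution (no depth buffer needed for a feed).
        FeedTexture = new RenderTexture(Screen.width, Screen.height, 0);
        FeedTexture.Create();

        // Show the live feed on the plane drawn behind everything else.
        backgroundPlane.material.mainTexture = FeedTexture;
    }
}
```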

This works OK, except there is a slight lack of synchronisation between the live feed and the mesh, suggesting that I should be cropping the live feed slightly / transforming it in some way.
It’s different on different devices; the video above is from an iPhone 13 Pro…

Can you give me any clues on how I get to the correct transformation?
Thanks!

Do you copy the projectionMatrix of the Camera connected to the ARRenderingManager onto the Camera that renders your virtual scene?
That’s what I did to fix that synchronisation error.

Thanks - I believe I’m doing that already, thanks to your post on the other thread… will double-check that it’s working as expected.

Hi Ross_Styants1,

I am trying to do the same thing and I have two questions:

  1. Where do you place the plane (whose material is set to the rendertexture from the ARRenderingManager) in the scene relative to the camera?
  2. How are you setting the projection matrix of the camera that renders your virtual scene? Are you doing it in a script? If so, where?

Thanks!
Jennifer

Hi Jennifer - I’ve actually turned this functionality off in my project for now, but I think the way I did it was:

  1. Made a separate orthographic camera with a Depth of -10 (so it rendered behind everything) that only rendered the plane showing the render texture.
  2. Copied the AR camera’s projection matrix onto the virtual-scene camera:
    • VirtualObjectsCamera.projectionMatrix = ARCamera.projectionMatrix;
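
That background camera setup could be sketched like this (the layer name “CameraFeed” is hypothetical - the point is only that this camera draws nothing but the feed plane):

```csharp
using UnityEngine;

public class BackgroundFeedCamera : MonoBehaviour
{
    void Start()
    {
        var cam = GetComponent<Camera>();
        cam.orthographic = true;
        cam.depth = -10f; // render behind every other camera

        // Only render the layer containing the feed plane
        // (hypothetical layer name - use whatever layer your plane is on).
        cam.cullingMask = 1 << LayerMask.NameToLayer("CameraFeed");
    }
}
```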

Hope that helps !

Wow, thanks so much for the quick reply. Is the orthographic camera a child of the ARCamera? Where in the code do you set the projectionMatrix? Is it in an update loop? Thanks again!

Hello - nope, the orthographic camera is just on its own in the scene somewhere; it just renders the live feed as the background.
And I set the projection matrix in this callback, I think:
ARSessionFactory.SessionInitialized += OnSessionInitialized;
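
Put together, the hookup would look roughly like this (the `AnyARSessionInitializedArgs` handler signature is my recollection of ARDK 1.x and may differ; the two camera fields are hypothetical names you’d assign in the Inspector):

```csharp
using Niantic.ARDK.AR;
using UnityEngine;

public class ProjectionSync : MonoBehaviour
{
    public Camera arCamera;           // camera the ARRenderingManager drives
    public Camera virtualSceneCamera; // camera rendering the virtual scene

    void OnEnable()
    {
        ARSessionFactory.SessionInitialized += OnSessionInitialized;
    }

    void OnDisable()
    {
        ARSessionFactory.SessionInitialized -= OnSessionInitialized;
    }

    private void OnSessionInitialized(AnyARSessionInitializedArgs args)
    {
        // Copy the AR camera's projection onto the scene camera so the
        // virtual content lines up with the rendered camera feed.
        virtualSceneCamera.projectionMatrix = arCamera.projectionMatrix;
    }
}
```

Note this only copies the matrix once, when the session initializes; if the projection changes later (e.g. on an orientation change), you may need to repeat the copy in an update loop.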

Awesome. Thank you!