Xcode 13.2 beta
I’ve got a post-processing screen effect that needs to know the FOV of the AR camera: it creates another camera childed to the ARSceneCamera, and that camera’s settings need to match those used to render the objects over the AR scene.
The field of view setting on my ARSceneCamera is not used. I can get ARSession.CurrentFrame.Camera, which is an IARCamera. There’s no direct FOV value on it, but there are Intrinsics, which give FocalLength (x, y) and PrincipalPoint. I’m not sure whether I can calculate the FOV from these without knowing the lens size?
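For reference, with a pinhole camera model you don’t actually need the physical lens size: when the intrinsics’ focal length is expressed in pixels, the field of view follows from the focal length and the image resolution alone. A minimal sketch of the math (this assumes `FocalLength` is in pixel units of the same image whose dimensions you pass in; how you obtain the image resolution from the ARDK is left as an exercise, the names below are just placeholders):

```csharp
using UnityEngine;

public static class FovUtil
{
    // Standard pinhole relation: fov = 2 * atan(dimension / (2 * focalLength)),
    // valid when focalLength is expressed in pixels of that same image.
    public static float FieldOfViewDegrees(float focalLengthPixels, float imageDimensionPixels)
    {
        return 2f * Mathf.Atan(imageDimensionPixels / (2f * focalLengthPixels)) * Mathf.Rad2Deg;
    }
}

// Usage (illustrative): vertical FOV from the y focal length and image height.
// float verticalFov = FovUtil.FieldOfViewDegrees(intrinsics.FocalLength.y, imageHeightPixels);
```

Unity’s `Camera.fieldOfView` is the vertical FOV, so the y focal length and image height are the pair you’d normally want.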
Or is there an easier way for me to get hold of this? Thanks!
The ARSceneCamera does come with FOV adjustment, though there could be limitations on field of view with deferred rendering based on settings outside of Lightship. Can you explain more about your second camera implementation? You mentioned intrinsics - are you using the CameraIntrinsics.cs script from the ARDK to implement the FOV components? Is this specifically for all post-processing? Is it overlaid on the original camera? What are the secondary camera’s settings (orthographic, perspective)? I’m taking a further look into this issue, but found some other documentation you may find helpful.
Thanks Erik - in my ARDK 1.0.1, at least, the FOV value of the ARSceneCamera doesn’t appear to be effective. If I set it to 120, for example, and then query it while the AR session is running, it still returns 120, even though the AR scene is clearly not rendering with a FOV of 120…
However, I can just use the projection matrix instead, and then it works: if I set the childed camera’s projection matrix to match the ARSceneCamera’s projection matrix, it all lines up. Thanks for your help!
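For anyone landing here later, the fix described above can be as simple as copying the projection matrix every frame in `LateUpdate`, after the AR session has updated the scene camera. A rough sketch using standard Unity API (`Camera.projectionMatrix`); the component and field names are illustrative, not part of the ARDK:

```csharp
using UnityEngine;

// Attach to the child post-processing camera. Keeps its projection
// in sync with the AR scene camera so the overlay lines up exactly.
[RequireComponent(typeof(Camera))]
public class MatchARProjection : MonoBehaviour
{
    public Camera arSceneCamera;   // assign the ARSceneCamera in the Inspector
    private Camera childCamera;

    void Awake()
    {
        childCamera = GetComponent<Camera>();
    }

    void LateUpdate()
    {
        // Copy the projection matrix the AR session set on the scene camera.
        // Assigning projectionMatrix overrides the (ineffective) fieldOfView value.
        childCamera.projectionMatrix = arSceneCamera.projectionMatrix;
    }
}
```

Note that once you assign `projectionMatrix` manually, Unity stops deriving it from `fieldOfView` until you call `ResetProjectionMatrix()`, which is exactly the behavior wanted here.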
That’s excellent Ross, nice job. I can’t wait to see the awesome effects you achieve in AR with post processing using Lightship.