Acquiring camera RGB frame

I’m looking to render the AR camera feed to a texture so I can run some additional computer vision algorithms and/or image processing on it.

With plain AR Foundation it’s quite easy to acquire an AR camera frame on the CPU or GPU (Image capture | AR Foundation | 5.1.1), but with Niantic’s ARDK patching/overwriting AR Foundation functions this doesn’t seem to be possible.
ARDK 2.5 still documented how to do this using the ARRenderingManager and the ARKitFrameRenderer in the ARDK 2.5 Examples, and it was also covered on this forum under “Get AR Scene Camera output to Texture / Material + to Camera”.
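For reference, the CPU path with plain AR Foundation looks roughly like this (a minimal sketch based on the AR Foundation image-capture docs; the class and field names are my own, and you’d assign the ARCameraManager reference yourself):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class CpuImageExample : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager; // the manager on the AR Camera
    Texture2D texture;

    void OnEnable()  => cameraManager.frameReceived += OnFrameReceived;
    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        // Ask the camera subsystem for the latest frame on the CPU.
        if (!cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            return;

        using (image) // XRCpuImage wraps a native resource and must be disposed
        {
            // Convert the native camera image to RGBA32 directly into the texture's buffer.
            var conversionParams = new XRCpuImage.ConversionParams(
                image, TextureFormat.RGBA32, XRCpuImage.Transformation.MirrorY);

            if (texture == null || texture.width != image.width || texture.height != image.height)
                texture = new Texture2D(image.width, image.height, TextureFormat.RGBA32, false);

            image.Convert(conversionParams, texture.GetRawTextureData<byte>());
            texture.Apply();
            // `texture` now holds the latest camera frame for CV / image processing.
        }
    }
}
```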

None of this seems to work for ARDK 3+. Going down the rabbit hole I end up at UnityEngine.XR.ARSubsystems > XRCameraSubsystem, which is what links the Unity camera to the device camera. I can see that a function like TryAcquireLatestCpuImage exists there, but it doesn’t pass the call on to ARCore; instead it throws a “NotSupportedException: getting camera image is not supported by this implementation”.

Any suggestions on where else to look or what other methods to use? I basically just want the one camera to render both to the screen and to a RenderTexture I can use for additional processing.
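For the GPU side, the approach I had in mind (untested against ARDK; the ARCameraBackground reference and the RenderTexture are things you’d assign yourself) is to blit the camera background material into my own RenderTexture every frame:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class CameraToRenderTexture : MonoBehaviour
{
    [SerializeField] ARCameraBackground arCameraBackground; // on the AR Camera
    [SerializeField] RenderTexture targetTexture;           // created in the project or at runtime

    void Update()
    {
        // The background material samples the external camera texture, so blitting it
        // with a null source copies the current camera frame into targetTexture while
        // the ARCameraBackground keeps drawing to the screen as usual.
        Graphics.Blit(null, targetTexture, arCameraBackground.material);
    }
}
```

Whether the Lightship-provided background material behaves the same way here is exactly what I’m unsure about.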

Full stack trace:

NotSupportedException: getting camera image is not supported by this implementation
Niantic.Lightship.AR.Subsystems.Playback.LightshipPlaybackCameraSubsystem+LightshipPlaybackProvider.TryAcquireLatestCpuImage (UnityEngine.XR.ARSubsystems.XRCpuImage+Cinfo& cinfo) (at Library/PackageCache/com.nianticlabs.lightship@25b2beb33d/Runtime/Subsystems/Playback/LightshipPlaybackCameraSubsystem.cs:264)
UnityEngine.XR.ARSubsystems.XRCameraSubsystem.TryAcquireLatestCpuImage (UnityEngine.XR.ARSubsystems.XRCpuImage& cpuImage) (at Library/PackageCache/com.unity.xr.arfoundation@5.1.0/Runtime/ARSubsystems/CameraSubsystem/XRCameraSubsystem.cs:494)
UnityEngine.XR.ARFoundation.ARCameraManager.TryAcquireLatestCpuImage (UnityEngine.XR.ARSubsystems.XRCpuImage& cpuImage) (at Library/PackageCache/com.unity.xr.arfoundation@5.1.0/Runtime/ARFoundation/ARCameraManager.cs:299)
ARCameraToTexture.OnCameraFrameReceived (UnityEngine.XR.ARFoundation.ARCameraFrameEventArgs eventArgs) (at Assets/ARCameraToTexture.cs:31)
UnityEngine.XR.ARFoundation.ARCameraManager.InvokeFrameReceivedEvent (UnityEngine.XR.ARSubsystems.XRCameraFrame frame) (at Library/PackageCache/com.unity.xr.arfoundation@5.1.0/Runtime/ARFoundation/ARCameraManager.cs:491)
UnityEngine.XR.ARFoundation.ARCameraManager.Update () (at Library/PackageCache/com.unity.xr.arfoundation@5.1.0/Runtime/ARFoundation/ARCameraManager.cs:351)

gist with simplified code

Appreciate any help. Thanks!

(Sorry, first post, so I’m not allowed to include more links for extra context.)

Hello Roelof,

Are you attempting to test this out using the Remote Playback feature that Lightship provides in the Unity editor? I see that the error regarding the lack of support is being thrown by the Playback subsystem, which would be expected because Playback currently only works with Lightship features, not AR Foundation features. You’d need to build to a device to test AR Foundation features.
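If you want the same scene to run in the editor without hitting that exception, one rough workaround (just a sketch, not an official API; the class and field names are placeholders for your own handler) is to guard the CPU-image call so it only runs in device builds:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class CpuImageEditorGuard : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;

    void OnEnable()  => cameraManager.frameReceived += OnCameraFrameReceived;
    void OnDisable() => cameraManager.frameReceived -= OnCameraFrameReceived;

    void OnCameraFrameReceived(ARCameraFrameEventArgs eventArgs)
    {
        XRCpuImage image;
        try
        {
            // The Playback provider throws NotSupportedException here in the editor,
            // so guard the call instead of relying only on the bool return value.
            if (!cameraManager.TryAcquireLatestCpuImage(out image))
                return;
        }
        catch (System.NotSupportedException)
        {
            return; // CPU images are only available in device builds for now
        }

        using (image)
        {
            // ...convert / process the frame as in your existing code...
        }
    }
}
```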

Hi Jesus, thanks for answering.
I see, yes, that makes sense. The same code does indeed work on device (although it crashes, and I haven’t traced down why yet).
Is there a supported way to get a CPU image using the replay functionality? If not, is it on the roadmap?

If you’re on Android, I’d suggest running Android Logcat in Unity to see what you get at the moment of the crash.
I’m not aware of any timelines for supporting AR Foundation features with ARDK’s Remote Playback, but it looks like AR Foundation does have something similar called XR Simulation.

I’d love to be able to develop using Niantic’s ARDK and its features though! :slight_smile: Is there a way Lightship Playback supports grabbing a camera frame (e.g. as a Texture2D or RenderTexture)?

I’ll look into this after the weekend and get back to you, but I don’t think it will be possible due to the occlusion coming from Unity’s end. This doesn’t mean that you can’t use ARDK features; it just means that our testing tools only work with our features, so for the Unity features you’d need to build to the device.

Hi @Jesus_Hernandez, any update on this? I’m just looking for a way to capture what the AR camera frame looks like at a particular moment.

Hi Roelof,

If I’m understanding correctly, it was only with the ARDK Remote Playback feature that grabbing the camera frame wasn’t working, correct? It was working on the device, right?

Unfortunately, after having a look, it isn’t possible to test the AR Foundation features with the ARDK Remote Playback feature at the moment; only ARDK features work. This doesn’t mean you’re unable to use ARDK features, it’s just that testing the AR Foundation aspects would require you to build to your device.

Indeed, just for the Remote Playback feature; on device it works fine. If possible, I’d love for this to make it onto the ARDK roadmap, as it would be quite helpful for more complex AR experiences.

Thanks Jesus!

I’ve noted it down as feedback. Be on the lookout for future updates to ARDK; we’re always working on improving it.