URP doesn't render live video camera feed

  • Device type & OS version: iOS 15
  • Host machine & OS version: Mac
  • Issue Environment : On Device
  • Xcode version: 13.2.1
  • ARDK version: 1.3
  • Unity version: 2021.3.1f

Description of the issue:
As soon as URP is enabled in Project Settings, I cannot see the camera feed; the camera just renders whatever is selected as the Environment Background Type. The Render Pipeline Asset is set to ‘ArdkUrpAsset’ with the default renderer ‘ArdkUrpAssetRenderer’, and both the ARSessionFeature and DepthMeshRendererFeature are enabled (as specified in the thread “Is there any particular settings in need for Unity URP project?”).

I don’t know if this is relevant to the problem, but the log reports:

No ARSessionFeature was found added to the active Universal Render Pipeline Renderer.
Niantic.ARDK.Rendering.ARSessionBuffersHelper:AddBackgroundBuffer(Camera, CommandBuffer)
Niantic.ARDK.Rendering.ARFrameRenderer:ConfigurePipeline(RenderTarget, Resolution, Resolution, Material)

I also found a report of this issue on the forum, but it had no replies whatsoever: Error: No ARSessionFeature was found added to the active Universal Render Pipeline Renderer, and black screen on iOS build

So I wonder whether I’m doing something wrong or whether this is a more general problem. I would appreciate any hints.


Hello Dmitrii,

We are trying to reproduce the issue on our end. In the meantime, can you please provide screenshots of your inspector windows, specifically of the Renderer List of Main Camera and Renderer Features of Ardk Urp Asset Renderer?



here you go

Thanks for the screenshots.

Can you please navigate to Edit > Project Settings > Quality and ensure that each quality level for the required build target (iOS here) is using the ArdkUrpAsset?
In addition, please make sure that ArdkUrpAsset is set as the Scriptable Render Pipeline Settings asset under Project Settings > Graphics.

Hope this helps resolve your issue. Please reach out if you need more help.


Everything is set up as described, but there is still no video feed.

Hi @Dmitrii , would you be able to try this with a new project using the following steps? I’m not sure how your current project is set up, so running through a project with a minimal setup might help us zero in on what’s happening for your project. Apologies if the steps are verbose. I wanted to make sure that we are on the same page in regard to the project’s setup.

  1. Create a new Unity project using the 3D (URP) template.
  2. Import ardk-1.3.1.unitypackage and ardk-examples-1.3.1.unitypackage into your project.
  3. Add your API key to the ArdkAuthConfig asset.
  4. In Build Settings, switch the platform to iOS.
  5. Open Player Settings and add a Camera Usage Description and Location Usage Description.
  6. Go to Project Settings > Graphics and change the Scriptable Render Pipeline Settings to ArdkUrpAsset (Universal Render Pipeline Asset).
  7. Go to Project Settings > Quality.
  8. Select Performant, and then change the Render Pipeline Asset to ArdkUrpAsset (Universal Render Pipeline Asset).
  9. Repeat the previous step for the Balanced and High Fidelity quality settings.
  10. In Build Settings, remove any Scenes In Build that are present, then add the PlaneAnchors scene to the Scenes In Build.
  11. Build the app and deploy it to your device, and then launch and test it.
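Steps 6–9 above can also be scripted so the assignment is never missed. The following is a minimal Unity Editor sketch (not part of ARDK; the menu path and asset lookup are assumptions for illustration) that assigns the ArdkUrpAsset both to the Graphics settings and to every quality level, mirroring the manual steps:

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;
using UnityEngine.Rendering;

public static class ArdkPipelineAssigner
{
    // Hypothetical helper menu item; adjust the asset search string
    // if your project renamed the ARDK URP asset.
    [MenuItem("Tools/Assign ArdkUrpAsset To All Quality Levels")]
    static void AssignToAllQualityLevels()
    {
        var guids = AssetDatabase.FindAssets("ArdkUrpAsset t:RenderPipelineAsset");
        if (guids.Length == 0)
        {
            Debug.LogError("ArdkUrpAsset not found in the project.");
            return;
        }

        var path = AssetDatabase.GUIDToAssetPath(guids[0]);
        var asset = AssetDatabase.LoadAssetAtPath<RenderPipelineAsset>(path);

        // Step 6: default pipeline (Project Settings > Graphics).
        GraphicsSettings.renderPipelineAsset = asset;

        // Steps 7-9: per-quality-level override (Project Settings > Quality).
        int original = QualitySettings.GetQualityLevel();
        for (int i = 0; i < QualitySettings.names.Length; i++)
        {
            QualitySettings.SetQualityLevel(i, false);
            QualitySettings.renderPipeline = asset;
        }
        QualitySettings.SetQualityLevel(original, false);

        Debug.Log($"Assigned {asset.name} to Graphics settings and all quality levels.");
    }
}
#endif
```

`QualitySettings.renderPipeline` applies to the currently active quality level, which is why the loop switches levels before assigning; forgetting the per-level assignment is exactly what produces the "No ARSessionFeature was found" error and the missing camera feed.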

Please let me know how it goes. Thank you!

Okay, apparently you need to repeat the same procedure for every quality level. That fixed the issue.

Yes, that’s correct. It looks like we don’t call this information out in our documentation specifically, but we’ve put in a feature request to get this added. In any case, I’m glad that it’s working for you now.

This topic was automatically closed after 11 days. New replies are no longer allowed.