How to get mixed reality video into Unity?

Here is a rough example of getting the video pass-through image into Unity.

This project shows how to pull the YUV camera streams from the headset, convert them to RGB textures, and then undistort the image and render it as the skybox background.
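The document does not specify which color space or coefficients the camera streams use, so as a general illustration, a per-pixel YUV-to-RGB conversion using the standard full-range BT.601 coefficients might look like this (the function name is hypothetical):

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV sample (0-255) to 8-bit RGB.
    Illustrative only; the actual stream format may differ."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    # Clamp to the valid 8-bit range and round to integers.
    clamp = lambda x: max(0, min(255, int(round(x))))
    return clamp(r), clamp(g), clamp(b)

# Neutral gray stays gray, and full luma with neutral chroma is white.
print(yuv_to_rgb(128, 128, 128))  # (128, 128, 128)
print(yuv_to_rgb(255, 128, 128))  # (255, 255, 255)
```

In practice this conversion is done for every pixel of every frame, which is why it is usually performed in a shader or offloaded from the main thread.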

Open the VSTBackground scene and press Play. What you see is the camera image, now rendered in Unity instead of by the Varjo compositor, as is normally done. Note that the quality and colors of the video pass-through image are not as good as they normally are. You can easily compare the two if you open the Varjo Base Analytics window and toggle Video pass-through on and off while the example application is running.

The CPU image conversion API has synchronous and asynchronous modes. The project includes examples for all supported modes, although one of the asynchronous modes has a known memory leak. We recommend using the asynchronous request option, which is enabled by default. You can also use the synchronous mode if you want, but bear in mind that the conversion is computationally expensive, so running it on the main thread will cause the framerate to drop.
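The difference between the two modes can be sketched in a language-neutral way. This is not the Varjo API, just the general pattern: the synchronous call blocks the calling thread until the conversion is done, while the asynchronous request submits the work and collects the result on a later frame, leaving the main (render) thread free. `convert_frame` is a hypothetical stand-in for the expensive conversion:

```python
from concurrent.futures import ThreadPoolExecutor

def convert_frame(yuv_frame):
    # Stand-in for the expensive CPU YUV-to-RGB conversion.
    return [min(255, value + 1) for value in yuv_frame]

frame = [10, 20, 30]

# Synchronous mode: the call blocks the main thread until the work is done,
# which stalls rendering for the duration of the conversion.
sync_result = convert_frame(frame)

# Asynchronous request mode: submit the work to a worker thread and collect
# the result later, so the main loop can keep rendering in the meantime.
with ThreadPoolExecutor(max_workers=1) as pool:
    request = pool.submit(convert_frame, frame)
    # ... main loop continues rendering here ...
    async_result = request.result()  # collect on a later frame

assert sync_result == async_result
```

Both modes produce the same image; the asynchronous mode simply trades a frame or two of latency for a steady framerate.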


Contact Varjo Support

Didn't find what you are looking for? Reach out to Varjo Support for help.