How to get a mixed reality video to Unity?
THIS ARTICLE IS OBSOLETE
Here is a rough example of getting a video pass-through image into Unity from the XR-3.
This project shows how to pull the YUV camera streams from the headset, convert them to RGB textures, and undistort and render the image as the skybox background.
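The YUV-to-RGB step mentioned above is a standard color-space conversion. Below is a minimal sketch in C of a limited-range BT.601 conversion for a single pixel; the function name and fixed-point coefficients are illustrative (they are not part of the Varjo SDK), and the actual matrix used by the camera streams may differ (e.g. BT.709 or full range):

```c
#include <stdint.h>

/* Clamp an integer to the 0..255 byte range. */
static uint8_t clamp8(int v) {
    return (uint8_t)(v < 0 ? 0 : (v > 255 ? 255 : v));
}

/* Convert one limited-range BT.601 YUV sample to 8-bit RGB.
 * Fixed-point coefficients (scaled by 1024, with rounding) approximate:
 *   R = 1.164*(Y-16) + 1.596*(V-128)
 *   G = 1.164*(Y-16) - 0.813*(V-128) - 0.391*(U-128)
 *   B = 1.164*(Y-16) + 2.018*(U-128)
 */
void yuv_to_rgb(uint8_t y, uint8_t u, uint8_t v,
                uint8_t *r, uint8_t *g, uint8_t *b) {
    int c = (int)y - 16;
    int d = (int)u - 128;
    int e = (int)v - 128;
    *r = clamp8((1192 * c + 1634 * e + 512) >> 10);
    *g = clamp8((1192 * c - 833 * e - 400 * d + 512) >> 10);
    *b = clamp8((1192 * c + 2066 * d + 512) >> 10);
}
```

In the example project this conversion happens on the GPU per pixel; the CPU version above is only meant to show the arithmetic.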
Open the VSTBackground scene and press Play. What you see is the camera image rendered in Unity instead of by the Varjo compositor, as is normally done. Note that the quality and colors of the video pass-through image are not as good as they normally are. You can easily compare the two if you open the Varjo Base Analytics window and toggle video pass-through on and off while the example application is running.
The CPU image conversion API has synchronous and asynchronous modes. The project includes examples for all supported modes (note that one of the asynchronous modes had a memory leak at the time of writing). We recommend the asynchronous request option, which is enabled by default. You can also use the synchronous mode if you want, but bear in mind that the conversion is computationally heavy, so running it on the main thread will cause the framerate to drop.
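The reason the synchronous mode hurts the framerate comes down to a general pattern: offload the heavy conversion to a worker thread and let the render loop poll for the result instead of blocking. A minimal sketch of that pattern in plain C with pthreads; the `ConversionJob` type and the copy standing in for the conversion are placeholders, not Varjo API calls:

```c
#include <pthread.h>
#include <stdatomic.h>
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Placeholder job for an expensive per-frame conversion (e.g. YUV -> RGB). */
typedef struct {
    const uint8_t *input;   /* source frame bytes */
    uint8_t *output;        /* converted frame bytes */
    size_t size;
    atomic_int done;        /* 0 = in flight, 1 = result ready */
} ConversionJob;

static void *convert_worker(void *arg) {
    ConversionJob *job = (ConversionJob *)arg;
    /* Stand-in for the real conversion work: here just a copy. */
    memcpy(job->output, job->input, job->size);
    atomic_store(&job->done, 1);
    return NULL;
}

/* Kick off an asynchronous conversion and return the worker thread.
 * The render loop checks job->done each frame; when it flips to 1,
 * the converted frame can be uploaded as a texture without ever
 * having stalled the main thread. */
pthread_t start_conversion(ConversionJob *job) {
    atomic_store(&job->done, 0);
    pthread_t thread;
    pthread_create(&thread, NULL, convert_worker, job);
    return thread;
}
```

The synchronous mode is equivalent to running `convert_worker` inline on the main thread, which is why it drops frames when the conversion takes longer than the frame budget.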