I am trying to get an AR/VR app running with the help of a third-party Unity extension and the HTC Vive. There are a couple of problems:
1) Getting the feed from the Vive's front camera has proven harder than expected. It is not enough to use it as a WebcamTexture or with the SteamVR "trackedObject" function; I need to be able to access it as a normal webcam of some sort. How can I make my computer accept the Vive front camera as a normal webcam? Can I somehow create a virtual webcam on my PC that can trick the third-party extension?
2) Using the Vive as a "screen" to play the video on. In the end the extension processes the video, recognizing my AR markers and placing the augmenting content in the video, which should then of course be displayed by the eyewear. The question is: how can I use the Vive as a screen, as if I were watching a movie, without any 360° function or VR content?
I hope there is someone here who can help me. (A square in front of me with the video on it is not very immersive; I am looking for something more like the Room View from SteamVR, or some other way to just display the video.)
Yes, you can use a WebcamTexture (or preferably a Texture2D). For it to be listed as an available webcam, you must disable the camera in the SteamVR settings if it is enabled. This is not intuitive, which is why I recommend using a Texture2D, which you can obtain through the OpenVR or SteamVR plugin APIs. Basically, you obtain the frame buffer from the tracked camera and use it for your texture before passing it to your library (e.g. OpenCV).
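The Texture2D route can be sketched roughly like this, using the SteamVR Unity plugin's tracked-camera helper. This is a minimal sketch, not a drop-in implementation: it assumes the plugin's `SteamVR_TrackedCamera` class (check the member names against your plugin version), and the hand-off to the AR library is left as a placeholder.

```csharp
using UnityEngine;
using Valve.VR;

// Sketch: grab the Vive front-camera feed as a Texture2D via the
// SteamVR Unity plugin's tracked-camera helper, then hand each frame
// to your marker-tracking library.
public class FrontCameraFeed : MonoBehaviour
{
    private SteamVR_TrackedCamera.VideoStreamTexture videoSource;

    void OnEnable()
    {
        // 'true' requests the undistorted view; pass false for the raw image.
        videoSource = SteamVR_TrackedCamera.Source(true);
        videoSource.Acquire(); // start the video streaming service
    }

    void Update()
    {
        Texture2D frame = videoSource.texture; // latest camera frame
        if (frame == null)
            return;

        // Pass 'frame' to your library here (e.g. copy the pixel data
        // into an OpenCV Mat). For now, just show it on this object.
        GetComponent<Renderer>().material.mainTexture = frame;
    }

    void OnDisable()
    {
        videoSource.Release(); // stop streaming when done
    }
}
```

Note that for this route the camera must be *enabled* in the SteamVR settings — the opposite of the WebcamTexture route above.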
As for a full-screen view (which I don't really recommend, even though it may seem more immersive): you can simply extend your quad or plane to fit the field of view and have it follow your HMD (as a child of the HMD camera). An example of a full-screen filter is available here: https://github.com/dariol/ViveOpenCVExample
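The quad-scaling step can be sketched as follows. This is a minimal example under some assumptions: the script is attached to a standard Unity Quad, and `hmdCamera` and `distance` are illustrative names you would set yourself.

```csharp
using UnityEngine;

// Sketch: scale a quad so it exactly fills the camera's vertical field
// of view at a given distance, and parent it to the HMD camera so it
// follows head movement. Attach this to the quad.
public class FullScreenQuad : MonoBehaviour
{
    public Camera hmdCamera;      // assign the HMD (eye) camera in the Inspector
    public float distance = 1f;   // metres in front of the camera

    void Start()
    {
        // Parent to the camera so the quad follows the HMD.
        transform.SetParent(hmdCamera.transform, false);
        transform.localPosition = new Vector3(0f, 0f, distance);
        transform.localRotation = Quaternion.identity;

        // Height that spans the vertical FOV at 'distance':
        // h = 2 * d * tan(fov / 2); width follows from the aspect ratio.
        float height = 2f * distance *
            Mathf.Tan(hmdCamera.fieldOfView * 0.5f * Mathf.Deg2Rad);
        float width = height * hmdCamera.aspect;
        transform.localScale = new Vector3(width, height, 1f);
    }
}
```

A unit Quad is 1×1 in local units, so setting its local scale to the computed width and height makes it span the whole view at that distance.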