08-05-2018 08:24 AM
We need to grab the dual camera images and simply show them on the Vive Pro, and later possibly add some NPR filters. We got it partially working with the SRWorks SDK, OpenCV, and OpenVR, but we are wondering if there is a simpler way to show the images right away after grabbing the frames from the SDK.
Our solution so far is to grab the OpenCV Mat, convert it to an OpenGL texture, and display it through OpenVR. It is very slow, and to be honest the result doesn't feel real. If we could eliminate OpenVR and OpenGL, we might get better performance. By the way, the official DLL version 0.7.5.0 was a nightmare: we had to debug our code for 3 days to find out it won't work simultaneously with OpenGL. The post on this forum with the attached 0.7.5.1 DLLs helped and works perfectly.
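For context, the pipeline described above looks roughly like the sketch below. This is a minimal illustration, not our exact code; the BGR input format, the pre-created texture id, and the gamma color space are assumptions:

```cpp
#include <openvr.h>
#include <opencv2/opencv.hpp>
#include <GL/gl.h>

// Upload one camera frame (assumed to be a BGR cv::Mat) into an existing
// GL texture and hand it to the OpenVR compositor for one eye.
void submitEye(vr::EVREye eye, GLuint tex, const cv::Mat& bgrFrame)
{
    cv::Mat rgba;
    cv::cvtColor(bgrFrame, rgba, cv::COLOR_BGR2RGBA);   // GL-friendly layout

    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8,
                 rgba.cols, rgba.rows, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba.ptr());

    vr::Texture_t vrTex = { (void*)(uintptr_t)tex,
                            vr::TextureType_OpenGL,
                            vr::ColorSpace_Gamma };
    vr::VRCompositor()->Submit(eye, &vrTex);
}
```

One common slowdown in this kind of pipeline is reallocating the texture with glTexImage2D every frame; updating a preallocated texture with glTexSubImage2D (or streaming through a PBO) is usually faster.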
So, my questions essentially are:
1- Is there any way to show a texture/image/picture directly on the Vive Pro screens?
2- Is there a one-to-one mapping between the front cameras and the Vive Pro screens? I assume not, because of the asymmetric nature of the Vive lenses. You cannot simply swap the render buffer with textures filled by the left and right cameras' distorted images in an OpenGL application, right? I've read somewhere that the optical center of each eye leans inwards, so are the cameras aligned the same way?
3- What happens when you change the IPD on the Vive? Since the front cameras are fixed, is the picture shifted automatically by software before rendering, or does the user need to shift the images manually?
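To make question 3 concrete, here is a back-of-the-envelope sketch of the kind of per-eye shift software would have to apply when the user's IPD differs from the fixed camera baseline. This is not anything from SRWorks; the pinhole model, the focal length, and the reference depth are all assumptions for illustration:

```cpp
#include <cmath>

// Per-eye horizontal shift (in pixels) needed to re-center a fixed camera
// pair for a given user IPD, using a simple pinhole model tuned to one depth.
// ipdM/baselineM in meters, fPx = focal length in pixels, depthM in meters.
double eyeShiftPx(double ipdM, double baselineM, double fPx, double depthM)
{
    double perEyeOffsetM = (ipdM - baselineM) / 2.0; // each eye takes half the difference
    return fPx * perEyeOffsetM / depthM;             // pinhole projection of that offset
}
```

For example, with a hypothetical 64 mm camera baseline, a 70 mm IPD, a 600 px focal length, and content at 2 m, each eye's image would shift by about 0.9 px; when IPD equals the baseline, the shift is zero. The depth dependence is exactly why a single 2D shift can never be correct for all objects at once.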
Thank you for any feedback.
08-05-2018 09:10 PM
A time-space warping technique is applied in SRWorks, so for a mostly static scene the perceived latency is mitigated to a minimum when you move your head quickly with the HMD.
I guess you may be concerned about latency with moving objects, but the programming language is not a key factor contributing to latency. It mostly depends on how your pipeline responds to your use case. The fastest tested approach is to render your camera texture to an overlay using OpenVR directly.
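For anyone following the thread, the overlay route mentioned above looks roughly like this sketch. The overlay key/name strings, the 1 m width, and the 1 m forward placement are arbitrary choices, and error handling is omitted:

```cpp
#include <openvr.h>
#include <GL/gl.h>

// Create an HMD-locked overlay once, then update its texture every frame.
vr::VROverlayHandle_t createCameraOverlay()
{
    vr::VROverlayHandle_t overlay = vr::k_ulOverlayHandleInvalid;
    vr::VROverlay()->CreateOverlay("camera.passthrough", "Camera Passthrough",
                                   &overlay);
    vr::VROverlay()->SetOverlayWidthInMeters(overlay, 1.0f);

    // Lock the overlay 1 m in front of the HMD (-Z is forward in OpenVR).
    vr::HmdMatrix34_t xform = {{ {1,0,0,0}, {0,1,0,0}, {0,0,1,-1.0f} }};
    vr::VROverlay()->SetOverlayTransformTrackedDeviceRelative(
        overlay, vr::k_unTrackedDeviceIndex_Hmd, &xform);

    vr::VROverlay()->ShowOverlay(overlay);
    return overlay;
}

void updateOverlay(vr::VROverlayHandle_t overlay, GLuint cameraTex)
{
    vr::Texture_t tex = { (void*)(uintptr_t)cameraTex,
                          vr::TextureType_OpenGL, vr::ColorSpace_Gamma };
    vr::VROverlay()->SetOverlayTexture(overlay, &tex);
}
```

Because an overlay bypasses the per-eye render loop and lens-distortion pass of a full 3D scene, it avoids most of the OpenGL work in the OP's current pipeline.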
08-06-2018 06:13 AM
Thank you, DanY. So, by the last paragraph, you mean we can get the camera feeds directly inside OpenVR? Right now we are using OpenVR, but in conjunction with SRWorks: SRWorks > distorted images > convert to texture > show with OpenVR.
08-06-2018 07:45 PM
You could refer to the sample $(openvr)\samples\tracked_camera_openvr_sample included in OpenVR SDK for front camera access.
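For readers who don't have the sample handy, the tracked-camera API it demonstrates works roughly like this condensed sketch (error checking mostly omitted; the choice of the distorted frame type is just one option):

```cpp
#include <openvr.h>
#include <vector>
#include <cstdint>

// Grab one distorted front-camera frame from the HMD, in the style of
// tracked_camera_openvr_sample (condensed, minimal error handling).
std::vector<uint8_t> grabFrontCameraFrame(uint32_t& width, uint32_t& height)
{
    vr::IVRTrackedCamera* cam = vr::VRTrackedCamera();

    bool hasCamera = false;
    cam->HasCamera(vr::k_unTrackedDeviceIndex_Hmd, &hasCamera);
    if (!hasCamera) return {};

    uint32_t bufferSize = 0;
    cam->GetCameraFrameSize(vr::k_unTrackedDeviceIndex_Hmd,
                            vr::VRTrackedCameraFrameType_Distorted,
                            &width, &height, &bufferSize);

    vr::TrackedCameraHandle_t handle = INVALID_TRACKED_CAMERA_HANDLE;
    cam->AcquireVideoStreamingService(vr::k_unTrackedDeviceIndex_Hmd, &handle);

    std::vector<uint8_t> frame(bufferSize);
    vr::CameraVideoStreamFrameHeader_t header;
    cam->GetVideoStreamFrameBuffer(handle,
                                   vr::VRTrackedCameraFrameType_Distorted,
                                   frame.data(), bufferSize,
                                   &header, sizeof(header));

    cam->ReleaseVideoStreamingService(handle);
    return frame; // RGBA pixels; on a stereo HMD both cameras arrive in one frame
}
```

In a real loop you would acquire the streaming service once at startup, poll GetVideoStreamFrameBuffer per frame, and release on shutdown, rather than acquiring and releasing per frame as in this condensed version.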