05-29-2018 08:13 AM
I'm creating a simulation in which the user is inside a vehicle and can see certain critical components of the vehicle; otherwise they are looking at the depth camera display so they can interact with real-world objects.
The issue I'm having is as I'm looking around inside the vehicle, I'm experiencing a parallax effect where the depth camera seems to slide around behind the rendered objects rather than with them.
Does anyone have any advice for how to mitigate this effect?
05-29-2018 10:34 PM
Do you mean that the distance between you and objects in the real world seems closer than you thought?
05-30-2018 10:23 AM
No, I'm talking about the visible shift between objects spawned in the scene and the pass-through video feed. For example, if you open the demo app, place a ball on the ground, and walk around it with pass-through video enabled, the ball's position will shift relative to real-world objects even though the ball isn't moving.
I've been able to mitigate it somewhat by using a real-world object as a sort of "anchor" and making sure my virtual objects don't shift much relative to it. There may be no better fix at the moment, though I'm hoping someone out there has a better solution.
05-30-2018 08:32 PM
We have also observed the phenomenon you mentioned. We are figuring out if the shifting problem can be fixed. If you have any suggestion please feel free to post here. Thanks!
05-30-2018 11:05 PM
I'm no expert but I've been thinking about doing something similar, so maybe this can help.
I imagine this problem happens when the latencies of the video feed and the rendered images don't match. The video probably has more latency and therefore "drags behind" the rendered image.
Your solution of "anchoring" the virtual object sounds like a good idea to mitigate the effect but it seems like the only real solution would be to match the latencies between the video feed and rendered image.
Assuming the video feed latency is higher, that leaves two options:
1) Increase rendering latency to match video feed
2) Reduce video feed latency to match rendering
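To get a feel for how big the mismatch looks on screen, here is a rough back-of-the-envelope estimate. All the numbers (head turn rate, latency gap, focal length in pixels) are illustrative assumptions, not measured values for any particular headset:

```python
import math

def parallax_shift_px(angular_velocity_deg_s, latency_delta_s, focal_length_px):
    """Approximate on-screen drag (in pixels) caused by a latency mismatch
    while the head rotates at a constant angular velocity."""
    drag_deg = angular_velocity_deg_s * latency_delta_s  # angular lag of the feed
    return focal_length_px * math.tan(math.radians(drag_deg))

# Assumed numbers: a 90 deg/s head turn, 50 ms of extra video latency,
# and a ~600 px focal length (varies per headset/camera).
shift = parallax_shift_px(90.0, 0.050, 600.0)
print(round(shift, 1))  # roughly 47 px of apparent slide
```

Even a few tens of milliseconds of extra latency produces a very noticeable slide during normal head motion, which matches what people are describing in this thread.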
Increasing rendering latency should be easy enough but it is obviously bad for overall immersion.
Reducing the video feed latency might be possible if you "shift" the image based on the latest position/orientation of the headset just before displaying it, similar to the reprojection that already happens for the rendered image. Turning or moving your head quickly will then cause some visual artifacts, though.
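The shift idea above can be sketched in a few lines. This is a simplified, pure-yaw, small-rotation version only (translation, roll, and pitch are ignored, and the 600 px focal length is an assumption); a real implementation would reproject with the full rotation delta between the capture pose and the latest pose:

```python
import math

def reproject_shift_px(capture_yaw_deg, display_yaw_deg, focal_length_px):
    """Horizontal pixel shift to apply to the camera image so it lines up
    with the latest head orientation. Pure-yaw sketch: the image is shifted
    opposite to the head turn that happened since the frame was captured."""
    delta = math.radians(display_yaw_deg - capture_yaw_deg)
    return -focal_length_px * math.tan(delta)

# Head turned 2 degrees to the right since the camera frame was captured,
# so the image should slide left to compensate:
print(round(reproject_shift_px(0.0, 2.0, 600.0), 1))
```

This is essentially a 2D approximation of timewarp applied to the camera feed instead of the rendered frame.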
08-30-2018 04:38 PM
Hi there, this shifting problem is preventing us from using Vive Pro trackers to match virtual and real objects. If any progress is made on this front, please let us know! It's very frustrating not to be able to use the Vive Pro headset because of this problem. The shifting is very strong, and virtual and real objects get completely misaligned.
08-30-2018 04:40 PM
After a quick analysis, it seems it might be an update-order problem. The camera position appears to be updated in the C++ driver, and the Unity object is updated later, so the plane that renders the image background moves with some delay relative to the tracked camera within the unit.
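If that's right, the effect can be modeled as a one-frame-stale pose. A toy sketch (the per-frame yaw values and the one-frame delay are hypothetical, just to show the mechanism):

```python
def apparent_offset(yaw_per_frame, frame):
    """Toy model of the update-order bug: the background (camera feed) plane
    is placed using the previous frame's head pose, while virtual objects use
    the current pose, so the feed trails by exactly one frame of head motion."""
    current = yaw_per_frame[frame]       # pose used for virtual objects
    delayed = yaw_per_frame[frame - 1]   # stale pose used for the feed plane
    return current - delayed

# Head yaw sampled each frame (degrees); hypothetical steady 1.5 deg/frame turn.
yaw = [0.0, 1.5, 3.0, 4.5]
print(apparent_offset(yaw, 3))  # the feed lags the virtual scene by 1.5 deg
```

If the cause really is update ordering, sampling the camera pose once per frame and using that same pose for both the feed plane and the virtual objects should remove the relative slide, even if some absolute latency remains.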
08-31-2018 05:55 AM
Yeah, that's one of the typical errors in AR, called dynamic registration error. I don't know if there is a way to synchronize image rendering with HMD pose updates....
But @pablocael do you also have static errors? Like when the virtual object has a wrong position in general?
Because this is what I get, marking the middle of my room in both the virtual and real world and comparing them:
seems to me like a few cm of pose error...