I'm working on a rather unusual task for the HTC Vive. Our company specializes in computational photography, and we decided to run an experiment with the Vive. Our main goal is to improve the quality of the images a user sees in the headset (in other words, the stream of rendered frames). It's well known that the picture degrades towards the edges of the Vive's lenses, and we believe we can compensate for this programmatically. To achieve that, I need a way to intercept the frames that have been prepared for rendering to the Vive, process them in our own shader, and then render the result to the Vive's screen. I'd really appreciate it if someone could help me sort out this development issue. Thanks in advance!
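For context, the kind of per-pixel remapping we have in mind can be sketched as below. This is only an illustrative Python sketch of a radial-distortion model (Brown-Conrady style); the coefficients `K1`/`K2` and the `predistort` function are made-up placeholders, not real Vive lens parameters — in practice those would have to be measured or obtained from the runtime:

```python
# Hypothetical radial pre-distortion sketch. K1/K2 are illustrative
# coefficients only, NOT actual HTC Vive lens parameters.
K1, K2 = 0.22, 0.24

def predistort(x, y):
    """Map an ideal (undistorted) normalized image coordinate to the
    source coordinate to sample from, so that the lens's own radial
    distortion is cancelled when viewed through the optics."""
    r2 = x * x + y * y                     # squared radius from center
    scale = 1.0 + K1 * r2 + K2 * r2 * r2   # radial scaling term
    return x * scale, y * scale

# The center of the image is unaffected; points near the edge are
# pushed outwards, which is the effect we want to counteract in a shader.
print(predistort(0.0, 0.0))
print(predistort(0.5, 0.0))
```

In a real implementation this remapping would run per-pixel in a fragment shader over the eye buffers rather than in Python.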