02-27-2017 01:16 AM - last edited on 02-27-2017 01:07 PM by Rockjaw
I've received two Vive Tracker Development Kits and have been experimenting over the weekend.
I made a video showing how to do the SteamVR setup and how to integrate the Trackers inside the Unreal Engine 4.
I also created a setup very similar to what Noitom did with their Hi5 VR gloves.
If you have any questions, please let me know.
03-06-2017 07:49 AM
Very helpful post! Do you know how to pair three VIVE Trackers at the same time? I can only pair up to two.
03-20-2017 11:18 PM
I really like your work and wish you luck.
We want to test the Vive Tracker on our television virtual set, using it with a UE4 camera. We'd like to buy just a Vive Tracker + 2 base stations first, without buying the full Vive kit. Is it possible to use only 2 base stations with a Vive Tracker? Will they work together without the Vive headset and controllers?
I look forward to your reply.
03-22-2017 01:22 PM
The Trackers are independent of the controllers and the HMD, but without the controllers and the HMD you can't do the room-scale calibration, which is a necessary step for the room-scale setup. So even if you buy just the Trackers and the base stations, you will still need the controllers + HMD to make everything work.
An alternative could be to borrow someone's Vive, do the calibration the first time using your base stations and their HMD/controllers, and after that is done simply use the Trackers + base stations on their own.
I haven't tried that myself, so in a couple of days I'll run a test to see if everything works properly and give you a definitive answer.
06-01-2017 01:42 PM
I'm running into some issues and confusion.
Device IDs 1 and 2 (or 3 and 4) are not working. I can confirm that 3 and 4 are my controllers, but the trackers don't respond as 1 and 2.
For debugging I'm using Get Valid Tracked Device Ids, and only the two controllers (3 and 4) show up. I assume the trackers are supposed to appear as well? I've also tried a Device Type of Static and Other with no luck. I assume Static shows my lighthouses?
And of course the trackers show as paired in SteamVR.
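For context on why this happens: SteamVR assigns every tracked device an index and a device class, and Vive Trackers report the class GenericTracker, not Controller, so they never appear when only controller IDs are queried. A minimal Python sketch of that enumeration logic (the function name and device table are illustrative, not the actual UE4 Blueprint API):

```python
# Conceptual sketch, NOT UE4 code: how SteamVR device indices map to
# device classes, and why filtering for controllers misses the Trackers.
from enum import Enum

class DeviceClass(Enum):
    # Values follow the OpenVR ETrackedDeviceClass convention.
    INVALID = 0
    HMD = 1
    CONTROLLER = 2
    GENERIC_TRACKER = 3       # Vive Trackers report this class
    TRACKING_REFERENCE = 4    # base stations / lighthouses

def valid_device_ids(devices, wanted_class):
    """Return the indices whose class matches, roughly what a node like
    'Get Valid Tracked Device Ids' does when filtered by device type."""
    return [i for i, cls in enumerate(devices) if cls is wanted_class]

# Hypothetical device table: HMD, 2 base stations, 2 controllers, 2 trackers.
devices = [
    DeviceClass.HMD,                 # index 0
    DeviceClass.TRACKING_REFERENCE,  # index 1
    DeviceClass.TRACKING_REFERENCE,  # index 2
    DeviceClass.CONTROLLER,          # index 3
    DeviceClass.CONTROLLER,          # index 4
    DeviceClass.GENERIC_TRACKER,     # index 5
    DeviceClass.GENERIC_TRACKER,     # index 6
]

print(valid_device_ids(devices, DeviceClass.CONTROLLER))       # [3, 4]
print(valid_device_ids(devices, DeviceClass.GENERIC_TRACKER))  # [5, 6]
```

So in this sketch, asking for controllers only ever yields 3 and 4; the trackers live under their own class and must be queried separately.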
06-23-2017 08:24 AM
Hey, regarding the trackers and lighthouses: yes, it is possible. I have it set up now.
8 trackers and 2 lighthouses running off a Surface, all battery-powered.
You don't need the headset to define the room scale and calibrate the floor; it can be done with a tracker.
08-23-2017 08:18 AM
Hi Nicolas, I had a question regarding your integration of the Perception Neuron mocap data; I am trying to accomplish literally this exact setup for my thesis project. I have the suit integrated and am bringing the BVH data into UE4, but I am currently using a full-body avatar. Did you have to retarget the hand skeletal mesh to get it to work right with the suit, or did it work with the Vive hand mesh without that step? I am also wondering how you are sending the BVH data to two separate skeletal meshes at the same time?
My current setup requires a Vive Tracker for tracking a digital MIDI keyboard, and I am currently using only one tracker at the root of the Perception Neuron suit to track it. I believe your setup would be much more accurate in identifying the exact location of the hands, and would also be a huge time saver when showing my thesis to my panel, since I wouldn't have to recalibrate the suit for people of different heights.
Thanks for the work you have done so far, and I really appreciate any insight you are willing to provide on your setup.
08-28-2017 03:34 AM
The setup for the PN suit works like this:
The Trackers give the position/orientation of the hands; the fingers are driven by the mocap data streamed from Axis Pro to UE4 in real time.
In short, what I did was modify the VR Pawn and add what I need: the two hands (which come from the same skeleton, not two different skeletons, since it's the Mannequin character modified in Maya by removing the entire body except the hands), plus the Perception Neuron node inside the VR Pawn BP and a Perception Neuron Manager inside the scene, and that's it.
The actual position of the player is given by the Vive itself, so the PN mocap data is used just to track the fingers.
If you want very precise results you need multiple Trackers, because the suit gives you an approximate position that is never the same each time you move and return to the same spot (a downside of an IMU tracking system).
Regarding the retargeting: if you're using the plugin from the Noitom website, the retargeting is already done for you (check the tutorial about it; it's very easy to set up the entire thing).
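The split described above — root transform from the Tracker, finger joints from the suit — can be sketched in a few lines. This is a conceptual Python sketch only (the real setup lives in UE4 Blueprints), and all names, joint labels, and the frame data here are illustrative assumptions, not the Perception Neuron plugin's actual API:

```python
# Conceptual sketch, NOT the actual UE4/Noitom plugin API:
# the Vive Tracker supplies each hand's world position/orientation,
# while the streamed Perception Neuron data drives only the finger joints.
from dataclasses import dataclass, field

@dataclass
class HandPose:
    position: tuple                                       # world-space, from the Vive Tracker
    orientation: tuple                                    # world-space, from the Vive Tracker
    finger_rotations: dict = field(default_factory=dict)  # local joint rotations, from mocap

def update_hand(tracker_pos, tracker_rot, mocap_frame):
    """Blend the two sources: root transform from the Tracker,
    finger joints from the realtime mocap stream."""
    pose = HandPose(position=tracker_pos, orientation=tracker_rot)
    # Only finger joints come from the suit; arm/body joints are ignored
    # because the IMU-based position drifts, while the Tracker's does not.
    for joint, rotation in mocap_frame.items():
        if joint.startswith(("thumb", "index", "middle", "ring", "pinky")):
            pose.finger_rotations[joint] = rotation
    return pose

# Hypothetical frame of streamed data (joint names are made up):
frame = {"index_01": (0, 15, 0), "thumb_01": (5, 0, 0), "upperarm_l": (40, 0, 0)}
hand = update_hand((10, 0, 120), (0, 90, 0), frame)
print(hand.finger_rotations)  # the arm joint is dropped; only fingers remain
```

The design point is the same as in the post: absolute placement comes from optical tracking, and the drifty IMU data is trusted only for relative finger motion.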