I'm currently working on a research project where we need to track a participant's hand movements. Could this be done with a Vive Tracker? We wouldn't need any VR; we just want to place the lighthouses somewhere, strap a Vive Tracker to the wrist, and read location data (X, Y, Z coordinates).
How much effort would it take to implement something like this? It would be best if the data could be processed in C# after it has been retrieved (streamed). Is there some kind of API for the tracker that provides this data?
This could be done with a Vive Tracker. Trackers are best at resolving larger objects such as gaming accessories and entire limbs, so if you only need the position of the wrist, you're set. A simple strap or mount is enough to attach one. The trackers will tell you where in the room the hands are, but not where individual fingers are. There are also guides for using SteamVR without a Vive headset.
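Running SteamVR without a headset usually means enabling its "null" (virtual HMD) driver and telling the runtime not to require an HMD. The exact file locations and keys vary between SteamVR versions, so treat the following as a sketch of the commonly cited settings: in `Steam/config/steamvr.vrsettings` you would add something like

```json
{
  "steamvr": {
    "requireHmd": false,
    "forcedDriver": "null",
    "activateMultipleDrivers": true
  }
}
```

and set `"enable": true` in the null driver's own `default.vrsettings` (under the SteamVR install's `drivers/null/resources/settings/` directory). With that in place, SteamVR will start and track lighthouse devices without a headset plugged in.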
For finer resolution, the cheapest option is a depth camera such as a Leap Motion, but these have a limited field of view. You can mount one to an HMD and potentially even fuse its data with the trackers'.
You can get the data directly from OpenVR/SteamVR, but it may be hard to work with in that raw format. You can use something like Unity to capture and convert the data into a coordinate system that fits your needs.
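If you'd rather skip Unity, here is a minimal sketch of polling tracker positions straight from OpenVR in C#, assuming you reference Valve's official C# bindings (`openvr_api.cs` plus the native `openvr_api.dll` from the OpenVR SDK) and have SteamVR running; it's illustrative rather than production-ready:

```csharp
using System;
using System.Threading;
using Valve.VR; // Valve's C# bindings from the OpenVR SDK

class TrackerPoller
{
    static void Main()
    {
        // Connect as a background app: no rendering, just tracking data.
        var error = EVRInitError.None;
        CVRSystem system = OpenVR.Init(ref error, EVRApplicationType.VRApplication_Background);
        if (error != EVRInitError.None)
        {
            Console.WriteLine($"OpenVR init failed: {error}");
            return;
        }

        var poses = new TrackedDevicePose_t[OpenVR.k_unMaxTrackedDeviceCount];
        while (true)
        {
            // Fetch the latest poses of all tracked devices.
            system.GetDeviceToAbsoluteTrackingPose(
                ETrackingUniverseOrigin.TrackingUniverseStanding, 0f, poses);

            for (uint i = 0; i < OpenVR.k_unMaxTrackedDeviceCount; i++)
            {
                // Only look at Vive Trackers, not controllers or base stations.
                if (system.GetTrackedDeviceClass(i) != ETrackedDeviceClass.GenericTracker)
                    continue;
                if (!poses[i].bPoseIsValid)
                    continue;

                // The pose is a 3x4 matrix; its last column holds the position
                // in meters relative to the calibrated play-space origin.
                HmdMatrix34_t m = poses[i].mDeviceToAbsoluteTracking;
                Console.WriteLine($"Tracker {i}: X={m.m3:F3} Y={m.m7:F3} Z={m.m11:F3}");
            }
            Thread.Sleep(10); // ~100 Hz polling; adjust to your needs
        }
    }
}
```

Note that OpenVR hands you raw room-space coordinates (Y up, meters); if your analysis needs a different origin or axis convention, that conversion is exactly the part Unity would otherwise do for you.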