We recently had a 2-day TensorFlow hackathon in the group, and I decided to use it as an opportunity to play with one of our new toys, the Noitom Hi5 VR gloves:
The gloves provide finger tracking and, when combined with HTC Vive Trackers on the wrists, full hand tracking in VR, meaning you can effectively see your hands in VR. This opens up some exciting possibilities for interaction and control in VR.
To develop interaction schemes, we need ways of detecting hand gestures and poses. This could be achieved with hand-crafted heuristics or with various machine learning methods. In the 2 days of the hackathon, I developed a prototype that classifies hand poses using neural networks in TensorFlow, with the result displayed in VR in real time. The code is available over at GitHub.
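To give a flavour of the approach, here is a minimal sketch of such a pose classifier: a small feed-forward network mapping finger-joint angles to pose probabilities. It uses plain NumPy for illustration rather than the TensorFlow code in the repo, and the feature layout, layer sizes, and pose labels are assumptions for the example, not the prototype's actual values.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 15   # e.g. 3 joint angles per finger, 5 fingers (assumed layout)
N_HIDDEN = 32     # hidden-layer width (assumed)
N_POSES = 4       # e.g. fist, open hand, point, thumbs-up (assumed labels)

# Randomly initialised weights stand in for trained network parameters.
W1 = rng.normal(0.0, 0.1, (N_FEATURES, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0.0, 0.1, (N_HIDDEN, N_POSES))
b2 = np.zeros(N_POSES)

def classify_pose(angles):
    """Forward pass: joint angles -> softmax probabilities over pose classes."""
    h = np.maximum(0.0, angles @ W1 + b1)    # ReLU hidden layer
    logits = h @ W2 + b2
    exp = np.exp(logits - logits.max())      # numerically stable softmax
    return exp / exp.sum()

# One simulated glove reading: joint angles in radians.
probs = classify_pose(rng.uniform(0.0, np.pi, N_FEATURES))
print(probs)
```

In the real prototype the same forward pass runs in TensorFlow with weights learned from the labelled glove data, and the class with the highest probability is the detected pose.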
I spent the first morning collecting labelled data from myself and some members of the…