Hand Pose Detection in Virtual Reality with Deep Learning

Mike O'Connor

We recently had a two-day TensorFlow hackathon in the group, and I decided to use it as an opportunity to play with one of our new toys: the Noitom Hi5 VR gloves.

[GIF: demo of the Hi5 gloves tracking hand movement]

The gloves provide finger tracking and, when combined with HTC Vive Trackers on the wrists, full hand tracking, meaning you can effectively see your hands in VR. This opens up some exciting possibilities for interaction and control.

To develop interaction schemes, we need ways of detecting hand gestures and poses. This could be achieved with heuristics or with any of several machine learning methods. Over the two days of the hackathon, I developed a prototype that classifies hand poses using a neural network in TensorFlow, with the results displayed in VR in real time. The code is available over on GitHub.
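This excerpt doesn't go into the model details, but a minimal TensorFlow sketch of this kind of pose classifier might look like the following. The feature layout (flattened joint rotations from the gloves), the layer sizes, and the pose labels are all illustrative assumptions, not the original prototype.

```python
import numpy as np
import tensorflow as tf

# Assumed feature layout: ~20 finger joints per hand, 4 quaternion components each.
NUM_FEATURES = 20 * 4
NUM_POSES = 5  # e.g. fist, open hand, point, thumbs-up, pinch (illustrative labels)

# A small fully connected classifier, roughly the size of network that suits a
# short prototype. The architecture is a guess, not the one from the hackathon.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_FEATURES,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(NUM_POSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# X: (num_samples, NUM_FEATURES) joint-rotation features, y: integer pose labels.
# Random placeholder data here; in practice this would be the recorded glove data.
X = np.random.rand(1000, NUM_FEATURES).astype("float32")
y = np.random.randint(0, NUM_POSES, size=1000)
model.fit(X, y, epochs=10, batch_size=32, validation_split=0.2)

# At runtime, each new frame of glove data can be classified and the predicted
# pose fed back into the VR scene.
frame = np.random.rand(1, NUM_FEATURES).astype("float32")
pose_id = int(np.argmax(model.predict(frame, verbose=0)))
```

Because inference on a network this small takes well under a millisecond per frame, classifying every tracking update and displaying the result in VR in real time is straightforward.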

I spent the first morning collecting labelled data from myself and some members of the…
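For context, a labelled-data recording loop for this kind of setup might look roughly like the sketch below. The read_glove_frame callable and the 60 Hz sampling rate are placeholders standing in for whatever the Hi5 SDK actually exposes; they are not a real API.

```python
import csv
import time

def record_pose_samples(read_glove_frame, label, seconds, path="pose_data.csv"):
    """Append (features..., label) rows for `seconds` seconds of glove data.

    read_glove_frame: hypothetical function returning the current joint
    rotations as a flat list of floats, one call per frame.
    """
    end = time.time() + seconds
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        while time.time() < end:
            features = read_glove_frame()
            writer.writerow(list(features) + [label])
            time.sleep(1 / 60)  # assume ~60 Hz sampling
```

Holding each pose for a few seconds per person gives a few hundred frames per class, which is plenty for a classifier of this size.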

