Featured on the Leap Motion blog
Deaf people have a great deal to offer society, yet they are often left out of critical dialogues because of communication barriers.
ASL Tutor is like Rosetta Stone for sign language; it helps people learn American Sign Language in a fun, easy way. We show users an image of a particular sign, then use skeletal tracking combined with machine learning to detect the position and orientation of their hand and determine which sign they are making.
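Under the hood, the pipeline is straightforward: each frame, the Leap Motion reports a hand skeleton, we flatten it into a fixed-length feature vector (palm orientation plus per-finger directions), and a classifier trained on labeled examples picks the most likely letter. Here's a minimal sketch of that idea, assuming the Leap Motion v2 Python bindings and a scikit-learn k-nearest-neighbors classifier; the feature set, the model choice, and the `load_training_samples` helper are all illustrative, and the actual implementation in the repo may differ.

```python
# Minimal sketch: Leap Motion skeletal features -> fixed-length
# vector -> nearest-neighbor classification. Assumes the Leap Motion
# v2 Python bindings and scikit-learn; the real project's features
# and model may differ.
import Leap
from sklearn.neighbors import KNeighborsClassifier

def hand_features(hand):
    """Flatten one tracked hand into a fixed-length feature vector:
    palm orientation plus a unit direction for each of the five fingers."""
    features = [hand.palm_normal.x, hand.palm_normal.y, hand.palm_normal.z,
                hand.direction.x, hand.direction.y, hand.direction.z]
    for finger in hand.fingers:  # ordered thumb -> pinky
        d = finger.direction
        features.extend([d.x, d.y, d.z])
    return features

# X: feature vectors captured while someone holds each letter's sign;
# y: the matching letter labels ("A", "B", ...). load_training_samples
# is a hypothetical helper standing in for pre-collected data.
X, y = load_training_samples()

model = KNeighborsClassifier(n_neighbors=5)
model.fit(X, y)

controller = Leap.Controller()
frame = controller.frame()  # latest tracking frame
if not frame.hands.is_empty:
    letter = model.predict([hand_features(frame.hands[0])])[0]
    print("Detected sign: %s" % letter)
```

Something like nearest-neighbor can work here because the fingerspelled alphabet is mostly static hand poses; letters that involve motion (like J and Z) would need features accumulated across multiple frames.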
Matt Tinsley and I built this in under 24 hours at TAMUHack 2015. Source code is available on GitHub.
I wrote more about the process of building it in my writeup of TAMUHack 2015.
Currently, ASL Tutor only recognizes the alphabet, but we'd like to extend it to recognize any sign.
The "game" aspect of it was mostly devised as a fun way to demo our sign language recognition at a hackathon. Turns out that it's legitimately fun, but I think there are other applications of this technology as well. For example, it would be cool to build an on-the-fly translator for ASL.