A downloadable tool for Windows and Android

Most current input methods for smart glasses rely on acoustic sensing, so we set out to build a new input method based on gesture recognition. We use a Leap Motion controller and Google Cardboard to simulate smart glasses with a depth sensor, and SQLite as the database. A Windows computer receives the Leap Motion data and interprets it into gestures. The device currently supports six gestures, classified by an SVM, with an average accuracy of 80% (measured by cross-validation). So far we have only developed a Traditional Chinese version; we look forward to adding support for other languages.
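
To illustrate the classification step, here is a minimal sketch of training an SVM on gesture feature vectors and estimating accuracy with cross-validation, using scikit-learn. The feature vectors below are synthetic stand-ins for real Leap Motion hand data, and the names (N_GESTURES, N_FEATURES) are illustrative, not taken from the project's code.

```python
# Hypothetical sketch: SVM gesture classification with cross-validated
# accuracy, mirroring the approach described above. Feature vectors are
# random stand-ins for real Leap Motion data (e.g. fingertip positions).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
N_GESTURES = 6            # the project defines six gestures
SAMPLES_PER_GESTURE = 30  # illustrative sample count
N_FEATURES = 12           # illustrative feature dimensionality

# Synthetic, well-separated clusters standing in for recorded gestures.
X = np.vstack([
    rng.normal(loc=3.0 * g, scale=0.5, size=(SAMPLES_PER_GESTURE, N_FEATURES))
    for g in range(N_GESTURES)
])
y = np.repeat(np.arange(N_GESTURES), SAMPLES_PER_GESTURE)

clf = SVC(kernel="rbf", C=1.0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

In practice the features would come from Leap Motion frames rather than random draws, but the training and evaluation loop has the same shape.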

The order in which the gestures are used is described in the uploaded GestureOrder.pdf.


More information

Published: 1 year ago
Status: Prototype
Category: Tool
Platforms: Windows, Android
Rating: (1)
Author: TzuChanChuang
Tags: input-method

Install instructions

1. Extract "LeapMotionGesture.zip" and "gestcorgi-ar-ui-26fb2ffa2e71_2.zip".

2. Open gestcorgi-ar-ui-26fb2ffa2e71_2/gestcorgi-ar-ui-26fb2ffa2e71_2 as a Unity project and run it.

3. Find and run LeapMotionGesture/LeapMotionGesture/bin/Debug/LeapMotionGesture.exe (do this on a different computer, or the tool will be difficult to use).

4. After that, a QR code will appear. Hold the QR code in front of the first computer's camera to establish the connection (as shown in the screenshot). You can then make gestures with the Leap Motion, which should be plugged into the second computer.

Download

gestcorgi-ar-ui-26fb2ffa2e71_2.zip (55 MB)
Gesture Order.pdf (176 kB)
LeapMotionGesture.zip (61 MB)
