Eye-Wear Computing

You might have seen the J!NS academic videos by now; I added embedded versions at the end of the post.

Below is the full video of the sneak peek of our work in the J!NS promotion. Special thanks to Shoya Ishimaru and Katsuma Tanaka, two talented students whom Koichi Kise Sensei (Osaka Prefecture University) and I are co-supervising. Check out their other (private) work if you are into programming for smartphones, Mac, iOS, and Google Glass ;). The video is a summary of research work mostly done by Shoya.

In the video we show applications using an early prototype of J!NS MEME, smart glasses with integrated electrodes to detect eye movements (Electrooculography, EOG) and motion sensors (accelerometer and gyroscope) to monitor head motion. We show several demonstrations: a simple eye movement visualization, and detection of left/right eye motion and blinks. Users can also play a game, “Blinky Bird”, in which they help a bird avoid obstacles using eye movements. Using a combination of blinks, eye movements, and head motion we can additionally detect reading and talking behavior, giving people a long-term view of their reading, talking, and walking activity over the day.
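To give a rough idea of how EOG-based blink detection can work in principle (our actual features and algorithms are still unpublished, see below, so this is purely an illustrative sketch, not the prototype's method): a blink shows up as a short, sharp spike in the vertical EOG channel, so even a naive amplitude-plus-duration check on a synthetic signal finds it. All parameter names and values here are made-up assumptions.

```python
import numpy as np

# Hypothetical parameters -- NOT from the J!NS MEME prototype.
BLINK_THRESHOLD_UV = 100.0  # assumed spike amplitude threshold (microvolts)
MAX_BLINK_SAMPLES = 40      # blinks are short: cap run length at ~400 ms @ 100 Hz

def detect_blinks(vertical_eog: np.ndarray) -> list[tuple[int, int]]:
    """Return (start, end) sample indices of candidate blinks.

    Naive sketch: a blink is a contiguous run of samples above an
    amplitude threshold that is short enough to be a blink rather
    than a sustained vertical gaze shift.
    """
    above = vertical_eog > BLINK_THRESHOLD_UV
    blinks = []
    start = None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i                      # spike begins
        elif not flag and start is not None:
            if i - start <= MAX_BLINK_SAMPLES:
                blinks.append((start, i))  # short enough -> count as blink
            start = None
    return blinks

# Synthetic demo signal: baseline noise plus two blink-like spikes.
rng = np.random.default_rng(0)
signal = rng.normal(0, 5, 500)
signal[100:115] += 150  # first "blink"
signal[300:312] += 140  # second "blink"

print(detect_blinks(signal))  # e.g. [(100, 115), (300, 312)]
```

A real pipeline would of course be more robust than this (adaptive thresholds, saccade rejection, fusion with the accelerometer and gyroscope for activities like reading or talking), but the spike-shaped signature is the basic intuition.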

Publications are still pending, so I cannot talk about the features, algorithms used, etc. In the meantime, here is a demo we gave at UbiComp this year.

J!NS Academic Video:

Oh, and if you haven’t had enough: here’s an extended interview with Inami Sensei and me. You can see me wearing the optical camouflage for the first time at 0:04 :D (very short).