Prof. Yasushi Yagi showed how to infer intention from gait analysis. Interestingly, he also presented research on the relationship between gaze and gait.
Dr. Alireza Fathi presented cool work on egocentric cameras. He showed how to estimate gaze from egocentric cameras during cooking tasks and psychological studies.
Prof. Hanako Yoshida explores social learning in infants (equipping children with mobile eye trackers … awesome!), inferring developmental stages and giving more insight into the learning process.
Prof. Masahiro Shiomi spoke about his research on adapting robot behavior to fit into social public spaces (videos of people running away from a robot included ;) ). Currently, they focus on service robots and model their behavior on successful human service personnel.
Prof. Yoichi Sato presented work on detecting visual attention. They use visual saliency on video to train an appearance-based eye tracker. Really interesting work — I had a chance to talk a bit more with Yusuke Sugano, cool research :)
Of course, Koichi also gave an overview of our work. If you want to read more, check out the IEEE Computer article.
I’m looking forward to the main conference. Here’s a tag cloud using the abstracts of ACPR and ASVAI papers:
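A tag cloud is essentially a word-frequency visualization over a text corpus. As a minimal sketch of the counting step behind one (the `tag_frequencies` helper, stop-word list, and sample snippets are my own illustration, not the actual ACPR/ASVAI abstracts or tooling):

```python
import re
from collections import Counter

def tag_frequencies(abstracts, top_n=10):
    """Count word frequencies across abstracts, skipping very short
    words and a tiny hand-picked stop-word list."""
    stop = {"the", "and", "for", "with", "that", "this", "are", "our", "from"}
    words = re.findall(r"[a-z]+", " ".join(abstracts).lower())
    counts = Counter(w for w in words if len(w) > 2 and w not in stop)
    return counts.most_common(top_n)

# Example with made-up abstract snippets:
sample = ["Gaze estimation from egocentric video.",
          "Appearance-based gaze tracking using visual saliency."]
print(tag_frequencies(sample, top_n=3))
```

A real tag cloud would then scale each word's font size by its frequency; libraries such as `wordcloud` for Python can render the image directly from the same counts.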
We will present demonstrations and new results on eye tracking with commodity tablets/smartphones, as well as a sharing infrastructure for our document annotation system on smartphones.