Detail of Publication

Text Language: English
Authors: Shoya Ishimaru
Title: Activity Recognition with Google Glass: Combining Head Motion and Eye Blink Frequency
Reviewed or not: Not reviewed
Month & Year: March 2014
Abstract: This thesis demonstrates how information about eye blink frequency and head motion patterns derived from Google Glass sensors can be used to distinguish different types of high-level activities. While it is well known that eye movement is correlated with user activities, the aim of this research is to show that (1) eye blink frequency data from an unobtrusive, commercial platform that is not a dedicated eye tracker is good enough to be useful, and (2) adding head motion pattern information significantly improves the recognition rates. The method is evaluated on a data set containing five activity classes (reading a book, watching a video, solving mathematical tasks, sawing cardboard, and talking) recorded from eight participants, yielding 67% recognition accuracy for eye blink frequency alone and 82% when extended with head motion patterns.
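The abstract describes combining a blink-frequency feature with head-motion features from the Glass IMU to classify activity windows. The thesis does not publish its pipeline here; the sketch below is a minimal illustration under assumed feature choices (blink rate plus gyroscope-magnitude statistics) and an assumed nearest-centroid classifier, not the author's actual method.

```python
# Illustrative sketch only: feature set and classifier are assumptions for
# demonstration, not the pipeline described in the thesis.
import numpy as np

ACTIVITIES = ["read", "watch video", "solve math", "saw", "talk"]

def extract_features(blink_times, gyro, window_s):
    """Build one feature vector per time window.

    blink_times: blink timestamps (s) falling inside the window
    gyro: (N, 3) angular-velocity samples from the head-worn IMU
    window_s: window length in seconds
    """
    blink_rate = len(blink_times) / window_s          # blinks per second
    magnitude = np.linalg.norm(gyro, axis=1)          # head-motion intensity
    return np.array([blink_rate, magnitude.mean(), magnitude.std()])

class NearestCentroid:
    """Minimal classifier: assign a window to the closest class centroid."""

    def fit(self, X, y):
        y = np.asarray(y)
        self.labels_ = sorted(set(y))
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.labels_])
        return self

    def predict(self, X):
        # Euclidean distance from each sample to each class centroid
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return [self.labels_[i] for i in d.argmin(axis=1)]
```

In this sketch, combining head-motion statistics with the blink-rate feature corresponds to simply appending extra dimensions to the feature vector before classification; the reported gain from 67% to 82% in the abstract refers to the thesis's own evaluation, not to this illustration.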