User Gaze Detection on Arbitrary Objects Using Physical Sensors and an Eye Tracker in a Real Environment
Recent advances in mobile eye tracking technologies open up possibilities for gaze-based human-computer interaction systems in real environments. Since eye movements supply dynamic visual information, detecting gaze (observed when a person looks at a specific region for a certain time) enables the system to trigger specific events once user attention is recognized. To detect such user gaze in a real environment, existing approaches typically rely on image-based object recognition methods. This limits the capability of the application, because such methods are not applicable to unknown objects for which the system has not been trained. This paper presents a method to detect user gaze on arbitrary objects by using physical sensors in combination with an eye tracker. Experimental results show that the performance of the proposed method is comparable to that of the existing image-based method while extending applicability to arbitrary objects. Furthermore, we present a prototypical application that uses the proposed method to demonstrate its adaptability to an open scenario.
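To make the dwell-based notion of gaze concrete, the following is a minimal illustrative sketch (not the paper's implementation): a gaze event is reported when consecutive eye-tracker samples remain inside an object's region for at least a dwell threshold. All names, the rectangular region model, and the 300 ms threshold are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Region:
    """Axis-aligned rectangular region of an object in gaze coordinates."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def detect_gaze(samples, region, dwell_ms=300.0):
    """samples: iterable of (timestamp_ms, x, y) gaze points.
    Returns True once the gaze stays inside `region` for at least
    `dwell_ms` milliseconds without leaving it."""
    start = None  # timestamp when the gaze entered the region
    for t, x, y in samples:
        if region.contains(x, y):
            if start is None:
                start = t
            if t - start >= dwell_ms:
                return True  # dwell threshold reached: gaze detected
        else:
            start = None  # gaze left the region, reset the dwell timer
    return False
```

For example, a stream of samples sampled every 100 ms that stays inside the region for four samples would satisfy a 300 ms dwell threshold, while a stream that briefly exits the region resets the timer.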