Relevance Feedback from Eye Movements for Proactive IR
Jarkko Salojärvi, Kai Puolamäki and Samuel Kaski
In: Machine Learning Meets the User Interface, 12 December 2003, Whistler, Canada.
This ongoing work is part of a larger project that aims to complement, or even replace, the laborious explicit relevance feedback in information retrieval with implicit feedback. Technology for measuring eye movements is beginning to mature, and the movements carry rich information about the attention and interest patterns of the user. The problem is that the signal is very noisy and the correspondence between eye fixation patterns and the user's attention is not always one-to-one.
Eye movements have previously been used as an alternative input modality in human-computer interfaces, and recently in a proactive dictionary that is activated automatically. To our knowledge, they have not been used in information retrieval before. We have implemented a pilot system and use it to develop models for detecting cues of relevance in eye movements. We began with preliminary experiments on whether simple HMM-based models are capable of discriminating relevant texts from irrelevant ones.
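The abstract does not spell out the discrimination procedure, but a common HMM-based scheme is to fit one model per class and label a new observation sequence by whichever model assigns it the higher likelihood. The sketch below illustrates this under purely hypothetical assumptions: two hand-set two-state HMMs (a "skim" and a "read" state) emitting three fixation-duration symbols, with a scaled forward algorithm computing the log-likelihood. The state structure, symbol alphabet, and all probability values are illustrative, not the authors' actual models.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a symbol sequence under a discrete HMM,
    computed with the forward algorithm (scaled at each step for
    numerical stability).
    pi: initial state probabilities, shape (S,)
    A:  state transition matrix, shape (S, S)
    B:  emission probabilities, shape (S, V)
    obs: sequence of observation-symbol indices in range(V)."""
    alpha = pi * B[:, obs[0]]
    log_lik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        log_lik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return log_lik

# Hypothetical two-state HMMs over three fixation-duration symbols:
# 0 = short fixation, 1 = medium, 2 = long.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.3, 0.7]])
# "Relevant" model: reading a relevant text is assumed to produce
# more long fixations (illustrative values only).
B_rel = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.3, 0.6]])
# "Irrelevant" model: skimming is assumed to produce mostly short
# fixations (illustrative values only).
B_irr = np.array([[0.7, 0.2, 0.1],
                  [0.4, 0.4, 0.2]])

def classify(obs):
    """Label a fixation sequence by comparing its log-likelihood
    under the two class-conditional HMMs."""
    ll_rel = forward_log_likelihood(obs, pi, A, B_rel)
    ll_irr = forward_log_likelihood(obs, pi, A, B_irr)
    return "relevant" if ll_rel > ll_irr else "irrelevant"

print(classify([2, 2, 1, 2, 2, 1, 2]))  # many long fixations -> relevant
print(classify([0, 0, 1, 0, 0, 0, 1]))  # mostly short fixations -> irrelevant
```

In practice the per-class emission and transition parameters would be estimated from labeled fixation data (e.g. with Baum-Welch) rather than set by hand as here.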