Learning to Rank Images from Eye Movements
Combining multiple information sources can improve the accuracy of search in information retrieval. This paper presents a new image search strategy that combines image features with implicit feedback from users' eye movements, using both to rank images. To scale to larger data sets, we present a perceptron formulation of the Ranking Support Vector Machine algorithm. We report initial results on inferring the rank of images presented on a page from simple image features and users' implicit feedback. The results show that the perceptron formulation improves ranking accuracy, and that fusing eye movements and image histograms yields better rankings than either feature source alone.
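The pairwise ranking approach underlying a Ranking SVM can be trained with a perceptron-style update, as the abstract suggests. The sketch below is an illustrative pairwise ranking perceptron, not the authors' exact formulation: for each pair of items where one should outrank the other, the weight vector is nudged along the feature difference whenever the current scores violate the ordering. Function names and the epoch/learning-rate parameters are assumptions for illustration.

```python
import numpy as np

def ranking_perceptron(pairs, n_features, epochs=10, lr=1.0):
    """Pairwise ranking perceptron (illustrative sketch).

    pairs: iterable of (better, worse) feature vectors, where
           `better` should be ranked above `worse`.
    Returns a weight vector w scoring items by w @ x.
    """
    w = np.zeros(n_features)
    for _ in range(epochs):
        for better, worse in pairs:
            diff = better - worse
            if w @ diff <= 0:      # ordering violated (or tied)
                w += lr * diff     # update on the difference vector
    return w

def rank(w, items):
    """Return item indices sorted by descending score w @ x."""
    return sorted(range(len(items)), key=lambda i: -(w @ items[i]))
```

Fusing eye-movement features with image histograms can then be as simple as concatenating the two feature vectors per image before forming the preference pairs.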