Contextual Information Access with Augmented Reality
Antti Ajanki, Mark Billinghurst, Hannes Gamper, Toni Järvenpää, Melih Kandemir, Samuel Kaski, Markus Koskela, Mikko Kurimo, Jorma Laaksonen, Kai Puolamäki, Teemu Ruokolainen and Timo Tossavainen
In: IEEE International Workshop on Machine Learning for Signal Processing (MLSP), 29 Aug - 1 Sep 2010, Kittilä, Finland.
We have developed a prototype platform for contextual information access in mobile settings. Objects, people, and the environment are treated as contextual channels, or cues, to further information. Based on gaze, speech, and other implicit feedback signals, the system infers which of the contextual cues are relevant, retrieves information relevant to those cues, and presents it with Augmented Reality (AR) techniques on a handheld or head-mounted display. The augmented information itself becomes a set of potential contextual cues, and its relevance is assessed in turn to provide further information. In essence, the platform turns the real world into an information browser that proactively focuses on the information inferred to be most relevant to the user. We present the first pilot application, a Virtual Laboratory Guide, and its early evaluation results.
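The loop described in the abstract — inferring cue relevance from implicit feedback, retrieving information for relevant cues, and feeding retrieved annotations back in as new cues — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the cue names, the gaze-dwell scoring heuristic, the `retrieve` stand-in, and the threshold are all assumptions introduced for exposition.

```python
# Hypothetical sketch of the relevance-feedback loop, assuming gaze dwell
# time as the implicit feedback signal. All names and data are illustrative.

def infer_relevance(cues, gaze_dwell_ms):
    """Score each cue by its share of total gaze dwell time."""
    total = sum(gaze_dwell_ms.get(c, 0) for c in cues) or 1
    return {c: gaze_dwell_ms.get(c, 0) / total for c in cues}

def retrieve(cue):
    """Stand-in for the retrieval step; maps a cue to annotations."""
    knowledge = {
        "person:guide": ["annotation:research_topics"],
        "object:poster": ["annotation:abstract"],
    }
    return knowledge.get(cue, [])

def browse_step(cues, gaze_dwell_ms, threshold=0.3):
    """One iteration: cues judged relevant trigger retrieval, and the
    retrieved annotations join the cue set as potential new cues."""
    scores = infer_relevance(cues, gaze_dwell_ms)
    new_cues = []
    for cue, score in scores.items():
        if score >= threshold:
            new_cues.extend(retrieve(cue))
    return cues + [c for c in new_cues if c not in cues]

cues = ["person:guide", "object:poster"]
dwell = {"person:guide": 800, "object:poster": 200}
print(browse_step(cues, dwell))
# → ['person:guide', 'object:poster', 'annotation:research_topics']
```

Here the guide receives 80% of the dwell time, crosses the relevance threshold, and its retrieved annotation enters the cue set, so the next iteration can assess it in turn — the "augmented information becomes a cue" behaviour the abstract describes.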