We present a system for making community-driven websites easily accessible from the latest mobile devices. Many of these devices contain an ensemble of sensors, such as cameras, GPS, and inertial sensors. We demonstrate how these sensors can be used to bring the information contained in sites like Wikipedia to users in a far more immersive manner than text or maps. We have collected a large database of images and articles from Wikipedia and show how a user can query this database simply by snapping a photo. Our system uses the location sensors to assist with image matching and the inertial sensors to provide a unique and intuitive user interface for browsing the results.
Douglas Gray, Igor Kozintsev, Yi Wu, Horst W. Haus
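As a rough illustration of how a location fix can assist image matching (a sketch under assumptions, not the authors' actual pipeline), the Python snippet below prunes a hypothetical database of geotagged Wikipedia images to those near the query photo's GPS coordinates before ranking them with a placeholder visual-similarity score. All names here (`haversine_km`, `location_filtered_match`, the dictionary fields, the dot-product score) are illustrative assumptions rather than the system's real components.

```python
import math

# Hypothetical geotagged database entry: {"title", "lat", "lon", "descriptor"},
# where "descriptor" stands in for whatever image feature vector the system extracts.

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two GPS fixes."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def visual_score(d1, d2):
    """Placeholder similarity: dot product of two feature vectors."""
    return sum(a * b for a, b in zip(d1, d2))

def location_filtered_match(query_descriptor, query_gps, database, radius_km=1.0):
    """Keep only database images near the query's GPS fix, then rank by visual similarity."""
    qlat, qlon = query_gps
    nearby = [e for e in database
              if haversine_km(qlat, qlon, e["lat"], e["lon"]) <= radius_km]
    return sorted(nearby,
                  key=lambda e: visual_score(query_descriptor, e["descriptor"]),
                  reverse=True)

# Toy usage: two candidate articles, only one within 1 km of the query photo.
db = [
    {"title": "Golden Gate Bridge", "lat": 37.8199, "lon": -122.4783, "descriptor": [0.9, 0.1]},
    {"title": "Alcatraz Island",    "lat": 37.8267, "lon": -122.4230, "descriptor": [0.2, 0.8]},
]
print(location_filtered_match([1.0, 0.0], (37.8205, -122.4790), db, radius_km=1.0))
```

The point of the sketch is the ordering of operations: restricting candidates by geographic distance first keeps the expensive visual-matching step confined to images that could plausibly depict the user's surroundings.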