Augmented Reality on a Segway
In September I was in San Francisco for a couple of days to attend the MobileHCI conference and the MobiVis workshop – not just for fun and looking at cool new stuff, I also presented our work on mobile augmented reality. My colleagues Michael Königs, Prof. Dr. Leif Kobbelt and I tried out whether you could build an augmented reality application (a game in this case) that uses only image-based methods for localisation. This means that instead of relying on the (inaccurate) GPS of your smartphone, you send an image to a remote server that figures out where you are and what you are looking at. This has the added bonus of working exactly the same indoors and outdoors.
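To make the idea a bit more concrete, here is a rough sketch of what the client side of such a localisation service could look like. The response format, field names and confidence threshold are my own illustrative assumptions – the actual protocol of our system is not part of this post:

```python
import json

# Hypothetical JSON reply from a vision-based localisation server
# (illustrative assumption, not the real protocol from the paper).
SAMPLE_RESPONSE = json.dumps({
    "position": {"lat": 50.7753, "lon": 6.0839},  # somewhere in Aachen
    "heading_deg": 135.0,                         # viewing direction
    "confidence": 0.92,                           # quality of the image match
})

def parse_localisation(response_text, min_confidence=0.5):
    """Turn the server's reply into a (lat, lon, heading) tuple,
    or None if the image match is too unreliable to use."""
    data = json.loads(response_text)
    if data["confidence"] < min_confidence:
        return None  # better to fall back than to teleport the player
    pos = data["position"]
    return (pos["lat"], pos["lon"], data["heading_deg"])

pose = parse_localisation(SAMPLE_RESPONSE)
```

The nice part compared to raw GPS: because the reply comes from matching the camera image, the same code path works indoors and outdoors, and you also get the viewing direction for free.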
During the game you had to interact with historical figures and solve quests Monkey-Island-style, but instead of clicking where you want to go, you actually had to go there, as the setting was the real city centre of Aachen. Since the game was intended for tourists discovering the city, we added a Segway for faster travelling – and additional fun ;-)
Of course we built authoring tools, conducted a user study and tested the image-based localisation method from both a technical and a usability standpoint (you can find the details in the paper). The results, summed up: if you want to build immersive AR apps, you should look into image-based localisation and computer vision – GPS & compass alone will never be good enough. The users loved the point & click setting in the real world (and driving a Segway), and even our inexperienced, non-tech-savvy users had no problems with the localisation and gaming metaphors.
The server for finding out your position wasn’t built in this project; it is based on the work of our coworkers. On YouTube you can find videos of an older tech demo:
You can find the paper titled “A Framework for Vision-based Mobile AR Applications” on the MobiVis website for further information.