“Snapshot augmented reality” on the iPhone lets you take a photo and augment it with virtual objects or geolocated information. To do so, the application sends the image to an InstantVision server via WiFi or UMTS. The overlaid image is sent back to the iPhone and displayed a few seconds later.
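The round trip can be sketched as a single HTTP upload. This is only a minimal illustration, not the actual protocol: the endpoint URL and the header names carrying the GPS fix are assumptions, since neither is documented here.

```python
import urllib.request

def build_snapshot_request(jpeg_bytes, lat, lon):
    """Build the HTTP request that uploads one snapshot to a
    (hypothetical) InstantVision endpoint; the server would reply
    with the overlaid JPEG a few seconds later."""
    return urllib.request.Request(
        "http://instantvision.example.com/augment",  # placeholder URL
        data=jpeg_bytes,                             # the photo itself
        headers={
            "Content-Type": "image/jpeg",
            # GPS fix taken on the phone; header names are assumptions
            "X-Geo-Lat": str(lat),
            "X-Geo-Lon": str(lon),
        },
    )

req = build_snapshot_request(b"\xff\xd8...jpeg data...", 49.87, 8.65)
# A real client would now send it via urllib.request.urlopen(req)
# over WiFi/UMTS and display the returned overlay image.
```

Because the heavy lifting happens server-side, the client really only needs a camera API and an HTTP stack.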
This lightweight solution sidesteps the major problems of augmented reality on mobile devices: hardware availability, computing power, and battery life. It is also compatible with Apple’s iPhone SDK, which still forbids access to the live video stream. The client is a very slim application that takes a photo and sends it to an InstantVision server. The server extracts the geolocation from the image and runs database-driven poster tracking on it to calculate the iPhone’s camera pose. For this, a reference image and a prepared 3D scene are needed for each point of interest (POI); a Flickr image and a small X3D scene are enough.
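The first server-side step can be sketched as a nearest-POI lookup: use the photo's geolocation to pick the POI whose reference image and X3D scene are then handed to the poster tracker. The database layout, POI entries, and coordinates below are illustrative assumptions, not the real data model.

```python
import math

# Hypothetical POI database: each entry pairs a geolocation with the
# reference image and prepared X3D scene that poster tracking needs.
POI_DB = [
    {"name": "town_hall", "lat": 49.872, "lon": 8.651,
     "reference": "town_hall_flickr.jpg", "scene": "town_hall.x3d"},
    {"name": "castle", "lat": 49.865, "lon": 8.659,
     "reference": "castle_flickr.jpg", "scene": "castle.x3d"},
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two WGS84 coordinates."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearest_poi(lat, lon):
    """Pick the POI closest to the photo's geolocation; its reference
    image and X3D scene are then fed to the poster tracker."""
    return min(POI_DB, key=lambda p: haversine_km(lat, lon, p["lat"], p["lon"]))

poi = nearest_poi(49.871, 8.650)  # geolocation extracted from the photo
```

The pose estimation itself (matching the photo against `poi["reference"]`) is the computationally expensive part, which is exactly why it lives on the server rather than on the phone.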
Thanks to the distributed architecture, the slim client can easily be ported to Android, Symbian, and other platforms.
More about the project:
See also: the instantmini X3D browser on the iPhone