X3DOM: X3D via WebGL

Johannes and his team are working hard on making X3DOM a reality. X3DOM is a solution for a seamless and HTML5-conformant integration of 3D content in web pages. Based on WebGL and JavaScript, it does not need a plug-in anymore. It sounds like the best chance for a 3D internet in years.
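To show what "HTML5-conformant integration" means in practice, here is a minimal sketch of an X3DOM page: the 3D scene is written as ordinary HTML elements and rendered via WebGL by the x3dom.js runtime. The script and stylesheet paths are placeholders; use the files from the X3DOM distribution.

```html
<!-- Minimal X3DOM page sketch: the <x3d> scene is plain HTML, no plug-in needed. -->
<html>
  <head>
    <script src="x3dom.js"></script>          <!-- X3DOM runtime (path is a placeholder) -->
    <link rel="stylesheet" href="x3dom.css">  <!-- default X3DOM styles -->
  </head>
  <body>
    <x3d width="400px" height="300px">
      <scene>
        <shape>
          <appearance>
            <material diffuseColor="1 0 0"></material>
          </appearance>
          <box></box>
        </shape>
      </scene>
    </x3d>
  </body>
</html>
```

The scene graph lives in the DOM, so the red box above can be manipulated with ordinary JavaScript and CSS like any other element.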

Let’s hope Apple enables WebGL on the iPhone soon so that augmented reality apps can be rendered in the browser. TwittARound would be happy.


iTACITUS Final Review in Turin

iTACITUS is finished. After 32 months the EU-funded project came to an end. The final review took place at Reggia Venaria Reale in Italy.


Fraunhofer IGD presented the final version of our augmented reality application on UMPCs (live markerless tracking) and iPhone (Snapshot AR). The application shows context-aware overlays of the architecture and interior of the palace. We only used available historic media like drawings and paintings. This enables affordable augmented reality for cultural heritage, compared to expensive 3D reconstructions. For a seamless integration of the overlays we invented Reality Filtering (VAST 2008), which renders the environment in the style of the overlay (black-and-white drawing, painting).



My paper “Cultural Heritage Layers” about iTACITUS was accepted at VSMM 2009 in Vienna.

Allard Pierson Museum in Amsterdam

Last week I finished a project for the “A Future for the Past” exhibition at the Allard Pierson Museum in Amsterdam. We set up two augmented reality cultural heritage applications about Satricum and the Forum Romanum. Two large photographs in the exhibition are superimposed with 3D reconstructions and context-sensitive information, presented on a new version of the MovableScreen and on UMPCs.
Project documentation at IGD



I did not even have time for fast-forward sightseeing in Amsterdam.



Augmented Reality Sightseeing at CeBIT 2009

As Fraunhofer IGD, we are presenting the project “Augmented Reality Sightseeing” at the BMBF booth (hall 9, booth 40) during CeBIT 2009 in Hannover. It deals with “20 years fall of the Berlin Wall” and is split into three components on two devices: iPhone and UMPC.

Like in iTACITUS, we are displaying historic photos over today’s view of the Brandenburg Gate and the Reichstag. In addition, the urban development of Berlin and the Wall are presented on a satellite image of Berlin on a table. Via augmented reality video see-through, urban grain plans from 1940–2008 are augmented on the table in 3D.

It works live on a UMPC and via Snapshot Augmented Reality on the iPhone. For the latter, we set up a tracking server that receives an image, augments it and sends it back to the iPhone.

German TV was also there:
Video from 11:40

Winchester iTACITUS Review

In November we had the second review of the iTACITUS project. For it, I finished an augmented reality application for Winchester Castle’s Great Hall. It shows overlay information about three points of interest: Arthur’s Round Table, the court and an outside scene with the Sheriff of Hampshire.


The main task was to present these scenes on the Sony UX ultra-mobile PC. Additionally, I showed the outside scene on the iPhone with my Snapshot Augmented Reality application.


The project finishes in May 2009.

Reality Filtering at Reggia Venaria Reale

I was in Turin for a week to test my reality filtering approach at Reggia Venaria Reale. At three places in the hunting palace I am adding historic drawings of lost architecture and paintings to reality. Reality is rendered in the style of the historic drawing in order to integrate it seamlessly and to create a living drawing.
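The core idea of reality filtering can be sketched in a few lines. This is an illustrative toy, not the IGD implementation: the camera frame is rendered in the style of the overlay (here, a black-and-white drawing via a grayscale conversion) and composited with the historic drawing wherever the drawing has content.

```python
# Toy sketch of the reality-filtering idea: stylize the camera frame to match
# the black-and-white overlay, then composite. Pixels are plain tuples here;
# a real system would do this per-frame on the GPU.

def to_drawing_style(pixel):
    """Convert an (r, g, b) camera pixel into a grayscale 'drawing' pixel."""
    r, g, b = pixel
    gray = int(0.299 * r + 0.587 * g + 0.114 * b)  # standard luma weights
    return (gray, gray, gray)

def composite(frame, overlay):
    """Blend a drawing overlay over the stylized camera frame.

    Overlay pixels are (r, g, b, a); a == 0 means 'show the environment'.
    """
    out = []
    for cam_px, ov_px in zip(frame, overlay):
        r, g, b, a = ov_px
        if a == 0:
            out.append(to_drawing_style(cam_px))  # filtered reality
        else:
            out.append((r, g, b))                 # historic drawing wins
    return out

frame = [(200, 50, 50), (10, 200, 10)]       # two camera pixels
overlay = [(0, 0, 0, 255), (0, 0, 0, 0)]     # drawing covers pixel 0 only
print(composite(frame, overlay))
```

Because the environment itself is stylized, people walking through the scene are converted to the drawing style too, which is what makes the result read as one coherent "living drawing".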

Diana Palace’s architecture was revised three times. I am overlaying the two former versions on the current architecture. The reality filter integrates them seamlessly; even passers-by become part of the living drawing.

In the Diana Hall there are two grey areas where frescos once were. Via augmented reality I am adding black-and-white drawings of the paintings at their real positions. All overlays contain spatial audio comments that start as soon as the visitor looks at them.
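The "starts as soon as the visitor looks at it" trigger can be sketched as a simple angular test between the camera's viewing direction (known from tracking) and the direction to the overlay. All names and the 15° threshold below are illustrative assumptions, not the actual iTACITUS code.

```python
import math

# Hypothetical gaze trigger: start the audio comment once the angle between
# the tracked viewing direction and the direction to the overlay falls below
# a threshold. Vectors are plain 3-tuples.

def angle_between(v1, v2):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    # clamp to avoid domain errors from floating-point noise
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def looks_at(view_dir, poi_dir, threshold_deg=15.0):
    """True if the visitor is (roughly) looking at the point of interest."""
    return angle_between(view_dir, poi_dir) < threshold_deg

# Visitor looks straight ahead; the fresco is almost straight ahead too.
print(looks_at((0, 0, -1), (0.1, 0, -1)))   # → True (about 6° off-axis)
print(looks_at((0, 0, -1), (1, 0, 0)))      # → False (90° away)
```

In practice one would also debounce the trigger (require the gaze to dwell for a moment) so that a quick pan across the hall does not start every comment at once.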

iPhone Snapshot Augmented Reality

“Snapshot Augmented Reality” on the iPhone lets you take a photo and augment it with virtual objects or geolocated information. The application sends the image to an InstantVision server via WiFi or UMTS. The overlaid image is sent back to the iPhone and displayed a few seconds later.

This lightweight solution eliminates the major problems of augmented reality on mobile devices: hardware availability, computing power and battery life. And it is compatible with Apple’s iPhone SDK, where live video streaming is still forbidden. The client is a very slim application that takes a photo and sends it to an InstantVision server. The server extracts the geolocation of the image and runs database-driven poster tracking on it to calculate the iPhone’s camera pose. For this, a reference image and a prepared 3D scene of each point of interest (POI) are needed; a Flickr image and a small X3D scene are enough.
Thanks to the distributed architecture, the slim client can be ported to Android, Symbian, etc.
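The client/server round trip described above can be sketched as follows. This is a hypothetical illustration of the architecture, not the InstantVision API: the recognition step is stood in for by a trivial string match, where the real server runs poster tracking to recover a full camera pose.

```python
# Hypothetical sketch of the snapshot round trip: the slim client uploads a
# photo, the server matches it against the POI database and returns the
# augmented image. All names and data shapes here are illustrative.

def server_augment(photo, poi_database):
    """Match the photo against known POIs; paste the overlay on success."""
    for poi in poi_database:
        if poi["reference"] in photo["features"]:   # stand-in for poster tracking
            return {"image": photo["image"] + "+" + poi["overlay"],
                    "poi": poi["name"]}
    return {"image": photo["image"], "poi": None}   # nothing recognized

def client_snapshot(camera_image, features, send):
    """Take a photo, send it over the (slow) link, display the result later."""
    return send({"image": camera_image, "features": features})

poi_db = [{"name": "Brandenburg Gate",
           "reference": "gate",
           "overlay": "historic_photo"}]
result = client_snapshot("img001", "gate sky people",
                         lambda photo: server_augment(photo, poi_db))
print(result)  # → {'image': 'img001+historic_photo', 'poi': 'Brandenburg Gate'}
```

Because the client only captures, uploads and displays, all heavy lifting stays on the server, which is exactly why the same thin client can be ported to other platforms.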


More about the project:

See also: instantmini X3D browser on the iPhone