WEB3D & SIGGRAPH 2008 – part two

the presentation of the paper “Adapting X3D for Multi-Touch environments” at the WEB3D conference and our demos at the WEB3D showcase were really successful. tobias showed his mixed reality light simulation and i presented the radiohead iphone experiment.


finally i met don brutzman and leonard daly, who wrote the first X3D book, X3D: Extensible 3D Graphics for Web Authors. now i have a signed copy.


at SIGGRAPH johannes and dirk gave their “Don’t be a WIMP” class. we showed some stereo rendering, wiimote 6DOF and augmented reality demos afterwards. in between we had to run to fetch the google t-shirts..


together with “Rome Reborn” we exhibited the iTACITUS technology. via instantvision’s markerless tracking we showed 3d models of roman monuments on top of a floor map and posters of Rome.

next year's WEB3D conference will be at IGD in darmstadt!

instantmini and beta5

we are about to release our x3d player for the iPhone! patrick just finished the OpenGL ES based application today. it is built with the official iPhone SDK and will be available in the AppStore soon.


beta5 of instantplayer will be released today, too. the experimental BrowserTexture is one of my favorite features. XMLHttpRequest was postponed to beta6. meanwhile i found a workaround via a TCPClient backend for loading and parsing XML.
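the post doesn't show what that workaround looks like, so purely as an illustration (the function names and code below are my own stand-in, not instantreality's actual TCPClient API), here is a rough python sketch of the idea: build a raw HTTP request for a plain TCP connection, then split off the headers and parse the body as XML:

```python
import xml.etree.ElementTree as ET

def build_http_get(host, path):
    """Build a raw HTTP/1.0 GET request, suitable for sending over a
    plain TCP socket when no XMLHttpRequest object is available."""
    return ("GET %s HTTP/1.0\r\nHost: %s\r\nConnection: close\r\n\r\n"
            % (path, host)).encode("ascii")

def parse_xml_body(response_text):
    """Split the HTTP headers from the body and parse the body as XML."""
    _, _, body = response_text.partition("\r\n\r\n")
    return ET.fromstring(body)

# a simulated response instead of a live socket round-trip
resp = ("HTTP/1.0 200 OK\r\nContent-Type: text/xml\r\n\r\n"
        "<scene><node name='box'/></scene>")
root = parse_xml_body(resp)
print(root.find("node").get("name"))  # -> box
```

in a real client the request bytes from `build_http_get` would be written to a connected socket and the full response read back before parsing.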

all news and interesting demos will be presented at WEB3D and SIGGRAPH next week. see you there..

rendering thom yorke on the “In Rainbows” LP in augmented reality

when radiohead released their “In Rainbows” album i bought the limited edition deluxe discbox. it’s good for markerless tracking! now thom yorke’s virtual head is rendered on it in augmented reality.

in our itacitus project at IGD alain and yulian developed a “poster tracker” for markerless tracking. it takes a photo of a flat object (poster, LP, iPhone screen) and extracts visual features. from these, instantreality calculates the correct camera position relative to the tracked object. this makes it very easy to write augmented reality applications without deep knowledge of computer vision.
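instantvision's tracker itself isn't public, but the geometry behind "camera position relative to a flat object" is textbook: once feature matching yields a homography H between the reference image and the camera frame, the pose can be factored out of H = s·K·[r1 r2 t]. a minimal pure-python sketch, with made-up camera intrinsics and a homography constructed from a known pose rather than real tracking:

```python
import math

def mat_mul(A, B):
    """Multiply two 3x3 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_inv(A):
    """Invert a 3x3 matrix via the adjugate."""
    a, b, c = A[0]; d, e, f = A[1]; g, h, i = A[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    adj = [[e*i - f*h, c*h - b*i, b*f - c*e],
           [f*g - d*i, a*i - c*g, c*d - a*f],
           [d*h - e*g, b*g - a*h, a*e - b*d]]
    return [[adj[r][s] / det for s in range(3)] for r in range(3)]

def pose_from_homography(K, H):
    """Recover rotation columns r1, r2, r3 and translation t from a
    plane-to-image homography H = s * K * [r1 r2 t]."""
    M = mat_mul(mat_inv(K), H)
    m1, m2, m3 = ([M[r][c] for r in range(3)] for c in range(3))
    s = 1.0 / math.sqrt(sum(x * x for x in m1))   # fix the scale via |r1| = 1
    r1 = [s * x for x in m1]
    r2 = [s * x for x in m2]
    t  = [s * x for x in m3]
    # third rotation column is the cross product r1 x r2
    r3 = [r1[1]*r2[2] - r1[2]*r2[1],
          r1[2]*r2[0] - r1[0]*r2[2],
          r1[0]*r2[1] - r1[1]*r2[0]]
    return r1, r2, r3, t

# hypothetical intrinsics and a homography built from a known pose
K = [[800, 0, 320], [0, 800, 240], [0, 0, 1]]
th = math.radians(30)
P = [[math.cos(th), -math.sin(th),  0.1],
     [math.sin(th),  math.cos(th), -0.2],
     [0.0,           0.0,           2.0]]
H = mat_mul(K, P)
r1, r2, r3, t = pose_from_homography(K, H)   # recovers the pose in P
```

a real tracker additionally has to find the feature matches, estimate H robustly and re-orthogonalize the rotation; this sketch only shows the final pose factorization step.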


ok. i trained the poster tracker for the “In Rainbows” cover and added the 3D data of “House of Cards” to it via peter’s script. now thom is singing on top of the album. thanks mario for the mac version!



the tracking works on the iphone, too. well, only when the cover image is displayed on the iphone screen. but we really are porting instantreality to the iphone.


radiohead’s ‘house of cards’ data in real-time 3d

yesterday radiohead released the 3d data of their video ‘house of cards’ under the creative commons license. brilliant!

we downloaded the data this morning and started to render it in real-time 3d. imagine a music video that you can walk through! we finished the first visualizations this evening.

peter eschler and i wrote a python script that converts radiohead’s csv coordinate data into an x3d / instantreality particleset. he already started interpreting the intensity values as z-depth, color and dot-size. you can download the python script (bsd license) at his website: http://www.pyjax.net/blog/1/2008/07/15/converting…
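peter's actual script is the one linked above; as a rough, hypothetical stand-in (using a plain X3D PointSet and mapping intensity to a grayscale vertex color, not necessarily how his script does it), such a conversion can look like this:

```python
import csv, io

def csv_to_x3d_pointset(csv_text, z_scale=1.0):
    """Convert 'x,y,z,intensity' CSV rows into a minimal X3D PointSet.
    Intensity (0..255) is mapped to a grayscale per-vertex color."""
    points, colors = [], []
    for row in csv.reader(io.StringIO(csv_text)):
        if not row:
            continue
        x, y, z, intensity = (float(v) for v in row)
        points.append("%g %g %g" % (x, y, z * z_scale))
        g = intensity / 255.0
        colors.append("%g %g %g" % (g, g, g))
    return (
        "<Shape>\n"
        "  <PointSet>\n"
        "    <Coordinate point='%s'/>\n"
        "    <Color color='%s'/>\n"
        "  </PointSet>\n"
        "</Shape>" % (", ".join(points), ", ".join(colors))
    )

sample = "0,0,10,255\n1,2,12,128\n"
print(csv_to_x3d_pointset(sample))
```

the resulting `<Shape>` fragment can be pasted into an x3d scene; interpreting intensity as dot-size, as peter did, would need an extension node such as a particleset instead of the standard PointSet.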

the real-time 3d data looks impressive on our new heyewall 2.0 – a multi-touch projection wall with 8160 x 4000 pixel resolution. over the next days we will add wiimote / balance board interaction for controlling time and distortion.


flickr album

youtube videos

peter created an atomizer of thom’s virtual copy.

reality filter in cultural heritage project

at fraunhofer igd i am leading the augmented reality part of the eu funded project iTACITUS. the project deals with mobile augmented reality for cultural heritage sites.

in april we tested our markerless tracking and my reality filtering system at the unesco world heritage site “reggia venaria reale” near turin in italy. the markerless tracking is based on alain’s and yulian’s poster tracker. you only have to take a reference image of the background to get stable tracking. that’s how tracking should work.

my reality filtering system renders the whole environment as a sketch in order to augment historic drawings on the site. one example is the temple of diana at the site. it was located at the end of a long creek in the large gardens. only its ruins and two historic drawings are left. standing on a viewing platform, visitors look through the display of a mobile computer. as soon as they look at the temple’s position, the video on the screen is rendered like a sketch and the drawing of the temple fits into the environment. the whole garden becomes a real-time drawing.
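the actual filter isn't described in the post, but sketch-style rendering of a video image typically builds on image-space edge detection. a toy sobel edge filter in pure python, just to illustrate that building block (not the system's real implementation):

```python
def sobel_magnitude(img):
    """Approximate per-pixel edge strength of a grayscale image
    (a list of rows) with the Sobel operator -- the basic ingredient
    of a sketch-style (edges-only) rendering of a video frame."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# a vertical brightness step -> strong edge response at the step only
frame = [[0, 0, 0, 255, 255, 255]] * 4
edges = sobel_magnitude(frame)
```

drawing only the strong-edge pixels in dark strokes over a paper-colored background already gives a crude sketch look; a real-time system would of course run an equivalent filter on the GPU.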


there is a flash video of the tests on the project site:

ps: mark from boingboing just wrote about a frog design concept for reality filtering.

architectural multi-touch application at cebit 2008

at cebit 2008 we are introducing instantware. it is a sub-brand of instantreality for our custom interaction devices and services like the multi-touch table and heyewall, movablescreen and our mobile augmented reality applications.

the multi-touch table will be presented at the BMBF booth (hall 9, b40). johannes and yvonne built an ubercool architectural application. it features several scalable plans of messe frankfurt’s new hall 11. multiple users can move and zoom these plans as known from other multi-touch applications.

but the key feature is the tile with a high-quality 3d view of the building. they implemented the multi-touch 3d camera gestures jens and i introduced last year: you grab a plan with the left hand. one finger of the right hand moves the camera through the 3d model. the second finger defines the orientation of the camera. this enables incredible cinematic camera movements in 3d. when the second finger points at a certain object on the plan and the first finger moves around it, the 3d camera orbits that object while keeping it in focus.
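the gesture mapping itself isn't spelled out in the post; a toy version of the basic idea (all names, coordinate conventions and scale factors below are made up) could map the two right-hand touch points on the plan to a camera position and yaw like this:

```python
import math

def camera_from_touches(pos_touch, aim_touch, plan_to_world=10.0, height=1.7):
    """Map two touch points on a 2D floor plan (normalized 0..1 plan
    coordinates) to a 3D camera pose: the first finger sets where the
    camera stands, the second finger sets where it looks (yaw around
    the up axis)."""
    px, py = pos_touch
    ax, ay = aim_touch
    # the camera stands on the floor at the first finger, at eye height
    cam = (px * plan_to_world, height, py * plan_to_world)
    # yaw: direction from the first finger towards the second finger
    yaw = math.atan2(ax - px, ay - py)
    return cam, yaw

# second finger straight "ahead" of the first on the plan -> yaw 0
cam, yaw = camera_from_touches((0.5, 0.5), (0.5, 0.8))
```

the orbit behavior described above then falls out naturally: while the aiming finger stays fixed on an object and the position finger circles it, the computed yaw keeps pointing at that object every frame.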

more images from cebit 2008 at:

multi-touch table released

last week we set up an installation for coperion group (global market leader for compounding systems) at k trade fair in duesseldorf: a large multi-touch table and an 8 meter wide hd projection showing a complete interactive plastics production process.


the application was developed together with jan, sebastian and chrisi from design & systems institute.

the visualization is rendered with instantreality’s clustering support. it makes clustering x3d applications incredibly easy.

fraunhofer igd – project website