x3d mashup proof of concept

here is an experiment in creating x3d worlds from real-time online data.

x3d is xml. thus, with some xslt, any rss feed can be converted into x3d.

on a local mac or linux box that's simple. using freely available online tools is more challenging. i tried yahoo pipes, google mashups and w3c's xslt service. here is the proof of concept:

1. use an rss or get some data together in pipes:

2. write an xslt stylesheet that extracts the interesting parts from the rss and puts them into the right places in x3d, e.g. image links into the url field of an imagetexture. this xslt gets the title of the feed and puts it into a screentextoverlay in a foreground:

3. the translation can happen on a local computer or server. i used w3c's xslt web service. the result is a short x3d file that prints the feed's title in the foreground:
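steps 2 and 3 can also be sketched in plain java with the jdk's built-in xslt engine. this is only an illustration: the screentextoverlay attribute name and the x3d skeleton below are guessed for the example, not the exact instantreality syntax.

```java
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerException;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import java.io.StringReader;
import java.io.StringWriter;

public class RssToX3d {
    // minimal stylesheet: pull the feed title out of the rss and wrap it in
    // an x3d skeleton. node and attribute names are illustrative only.
    static final String XSL =
        "<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>" +
        "<xsl:output method='xml' indent='yes'/>" +
        "<xsl:template match='/'>" +
        "<X3D><Scene><Foreground><ScreenTextOverlay>" +
        "<xsl:attribute name='text'><xsl:value-of select='rss/channel/title'/></xsl:attribute>" +
        "</ScreenTextOverlay></Foreground></Scene></X3D>" +
        "</xsl:template></xsl:stylesheet>";

    // apply the stylesheet to an rss document and return the x3d as a string
    public static String transform(String rss) throws TransformerException {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(XSL)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(rss)), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        String rss = "<rss><channel><title>hello x3d</title></channel></rss>";
        System.out.println(transform(rss));
    }
}
```

the same stylesheet, saved as a file, is what the online xslt service applies to the live feed.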

unfortunately instantplayer does not load the file without an .x3d ending.

architectural multi-touch application at cebit 2008

at cebit 2008 we are introducing instantware. it is a sub-brand of instantreality for our custom interaction devices and services like the multi-touch table, the heyewall, the movablescreen and our mobile augmented reality applications.

the multi-touch table will be presented at the BMBF booth (hall 9, b40). johannes and yvonne built an ubercool architectural application. it features several scalable plans of messe frankfurt's new hall 11. multiple users can move and zoom these plans as known from other multi-touch applications.

but the key feature is the tile with a high-quality 3d view of the building. they implemented the multi-touch 3d camera gestures jens and i introduced last year: you grab a plan with the left hand. one finger of the right hand moves the camera through the 3d model, while a second finger defines the orientation of the camera. this enables incredible cinematic camera movements in 3d. when the second finger points at a certain object on the plan and the first finger moves around it, the 3d camera orbits that object while keeping it in focus.
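the orbit gesture boils down to simple math: keep a fixed distance to the focused object and recompute the camera position from the finger's angle on the plan. this is a schematic reconstruction of the idea, not the code from the table.

```java
public class OrbitCamera {
    // orbit a camera around a target at a fixed radius; the first finger's
    // angle on the plan drives the azimuth, so the object stays in focus.
    // schematic sketch only -- not the actual cebit application code.
    public static double[] orbit(double[] target, double radius,
                                 double azimuth, double height) {
        double x = target[0] + radius * Math.cos(azimuth);
        double z = target[2] + radius * Math.sin(azimuth);
        return new double[] { x, target[1] + height, z };
    }

    public static void main(String[] args) {
        double[] target = { 10, 0, 5 };
        // sweep the finger once around the object: eight camera positions
        for (int i = 0; i < 8; i++) {
            double[] eye = orbit(target, 4.0, i * Math.PI / 4, 2.0);
            System.out.printf("eye: %.2f %.2f %.2f%n", eye[0], eye[1], eye[2]);
        }
    }
}
```

the view direction is then simply target minus eye, which is what keeps the building centered while the camera circles it.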

more images from cebit 2008 at:

processing applets on playstation 3

i finally got ubuntu gutsy 7.10 working on my playstation 3. the current firmware 2.10 caused some problems with the wireless connection, but i found a fixed kernel on psubuntu.com. the only remaining problem is the poor performance due to the ps3's 256 mb of ram and the software rendering.

processing itself does not work yet, but with ibm's java i got my processing applets running by calling them via appletviewer in the shell:

appletviewer index.html (it works with urls, too)


instantplayer and opensg do not compile yet. i tried to build opensg from deb source packages, but johannes will set up a build system for me next week.

multi-touch table released

last week we set up an installation for coperion group (global market leader for compounding systems) at k trade fair in duesseldorf: a large multi-touch table and an 8 meter wide hd projection showing a complete interactive plastics production process.


the application was developed together with jan, sebastian and chrisi from design & systems institute.

the visualization is rendered with instant reality’s cluster. it makes clustering x3d applications incredibly easy.

fraunhofer igd – project website

processing sketches as interactive textures

i managed to get processing sketches working as textures in instant reality (x3d). now it is possible to simply write animated and interactive textures in processing and use them in 3d scenes. this is done by an applet wrapper that renders a processing sketch offscreen and copies its pixels[] array into an x3d SFImage.


here is an example of how to load a processing applet as a texture in instant player.

ProcessingTexture.class is a wrapper class that loads a processing applet and puts its framebuffer into an SFImage. it also sends mouse events from instant player to the applet.
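the core of such a wrapper is converting the sketch's pixels[] array (packed 0xAARRGGBB ints in processing) into the flat rgba byte layout a texture image expects. a hedged sketch of just that conversion; the actual SFImage plumbing and the instant reality java api are not shown here.

```java
public class PixelCopy {
    // convert processing-style pixels[] (0xAARRGGBB ints) into an rgba byte
    // array, the kind of flat buffer a texture image field consumes.
    // the real ProcessingTexture wrapper also handles SFImage and events.
    public static byte[] argbToRgba(int[] pixels) {
        byte[] rgba = new byte[pixels.length * 4];
        for (int i = 0; i < pixels.length; i++) {
            int p = pixels[i];
            rgba[i * 4]     = (byte) ((p >> 16) & 0xFF); // red
            rgba[i * 4 + 1] = (byte) ((p >> 8)  & 0xFF); // green
            rgba[i * 4 + 2] = (byte) ( p        & 0xFF); // blue
            rgba[i * 4 + 3] = (byte) ((p >> 24) & 0xFF); // alpha
        }
        return rgba;
    }

    public static void main(String[] args) {
        // one opaque reddish pixel as a tiny demonstration
        byte[] rgba = argbToRgba(new int[] { 0xFF112233 });
        System.out.println("bytes per pixel: " + rgba.length);
    }
}
```

the wrapper runs this copy once per frame, which is why power-of-two sketch sizes (see the steps below) keep the texture upload cheap and compatible.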

in order to load your processing applet you have to follow these steps:

  • download instant player and install it
  • download the zip with the wrapper class and the examples and extract it
  • write your processing sketch (with power-of-two dimensions: 256×256, 512×512, 1024×512, ..), export it and copy the jar into the same folder as the example
  • open ProcessingTexture.x3d in a text editor
  • add your jar file to the “javaClassPath” in the ContextSetup node at the top of the file
  • change the value of the script node’s “name” field to the name of your applet (without .class or .pde)
  • open the x3d file in instant player. you should see your applet on a cube

download: instantprocessing.zip

siggraph 07

this year johannes and one of my students are exhibiting at siggraph. johannes presents our software instantreality and demos at khronos group's booth. marion won the space time competition and shows her work "ce.real", which she did in my course "experimental interfaces" last semester.

three of my works are shown at the khronos booth:

an 8-bit-style world works as a real-time information graphic. freshly posted flickr images, mapped onto boxes, fall from the sky onto a dump of images in real time. the world features realistic real-time shadows and physics, so the falling boxes produce huge chaos.
a katamari damacy-like ball with physical forces rolls through the dump to increase the chaos and to give an impression of the concept's game capabilities.
sorting the dumps by tags and time will come soon.


bubble mirror
the installation converts silhouettes of persons into physical bubble particles swimming to the surface.


trails in the fog
a minimalistic scene of trees and ascending elements in white fog rendered on a colorkeepbackground. the silent scene turns into positive chaos and renders incredible, unforeseen images.


multitouch for immersive environments

this week my student jens finished his diploma thesis at h-da and igd. he developed a multitouch table and software for navigating immersive vr environments. as multitouch has become a standard technology and everyone is able to build a table, it's the applications that matter now.

together we created a very straightforward interaction concept for navigating and evaluating virtual architecture visualizations. the movement in virtual space becomes cinematic when using two fingers for movement, orientation and lookat at the same time.

the setup consists of a table installation in front of igd's 18-megapixel heyewall. the content is a 3d architecture visualization of messe frankfurt's new building. on the table there is a bird's-eye view of the area. tracking is done by harald's visionlib blob detection. it communicates with instant player (x3d) via the TUIO protocol.

pointing with a finger at a spot lets the camera fly to that point in the 3d scene. moving the finger moves the camera. with a second finger the orientation of the camera is set. by pointing at a certain building, the camera will look at it. by pointing with the second finger at a building and moving the first finger around it, the camera orbits the building while looking at it.


project summary

tutorials, tutorials, tutorials..

finally we put instant reality's tutorial section online. i wrote several tutorials about instant player's interface, arduino / serial communication, marker tracking, apple's sudden motion sensor, java in x3d, followers and dampers, etc.

together with don brutzman's x3d book this could be a turning point for x3d development. in the last years x3d was a hidden mystery documented by the web3d specification. only a closed circle of people was able to read and understand the spec: a kind of x3d secret society. i learned it from our developers at the institute. my students learned it from me. like in joseph weizenbaum's modern cathedrals of science.

i am teaching x3d in the "augmented reality" module of my master course, too.