this year johannes and one of my students are exhibiting at siggraph. johannes presents our software instantreality and demos at khronos group’s booth. marion won the space-time competition and shows her work “ce.real”, which she created in my course “experimental interfaces” last semester.
three of my works are shown at the khronos booth:
an 8-bit-style world works as a real-time information graphic. freshly posted flickr images, mapped onto boxes, fall from the sky onto a dump of images in real time. the world features realistic real-time shadows and physics, so the falling boxes produce a huge chaos.
a katamari-damacy-like ball driven by physical forces rolls through the dump to increase the chaos and to give an impression of the game capabilities of the concept.
sorting the dumps by tags and time will come soon.
the installation converts silhouettes of persons into physical bubble particles floating to the surface.
trails in the fog
a minimalistic scene of trees and ascending elements in white fog, rendered on a color-key background. the silent scene turns into positive chaos and renders incredible, unforeseen images.
this week my student jens finished his diploma at h-da and igd. he developed a multitouch table and software for navigating immersive vr environments. now that multitouch has become a standard technology and everyone is able to build a table, it’s the applications that matter.
together we created a very straightforward interaction concept for navigating and evaluating virtual architecture visualizations. the movement in the virtual space becomes cinematic when two fingers control movement, orientation and look-at at the same time.
the setup consists of a table installation in front of igd‘s 18-megapixel heyewall. the content is a 3d architecture visualization of messe frankfurt’s new building. the table shows a bird’s-eye view of the area. tracking is done by harald’s visionlib blob detection, which communicates with instant player (X3D) via the TUIO protocol.
pointing with a finger at a spot lets the camera fly to that point in the 3d scene. moving the finger moves the camera. a second finger sets the orientation of the camera. pointing at a certain building makes the camera look at it. pointing with the second finger at a building and moving the first finger around it makes the camera orbit the building while keeping it in view.
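the two-finger mapping above can be sketched in a few lines; this is a minimal python sketch with illustrative function and field names, not instantreality’s actual api:

```python
import math

def camera_from_fingers(pos_finger, look_finger):
    """map two touch points on the bird's-eye table map to a camera pose:
    the first finger sets the position, the second the look-at direction.
    names and coordinate conventions are assumptions for illustration."""
    cx, cy = pos_finger
    tx, ty = look_finger
    yaw = math.atan2(ty - cy, tx - cx)  # orient the camera toward the target
    return {"position": (cx, cy), "yaw": yaw}

def orbit_step(center, cam_pos, delta_angle):
    """move the camera around a building (center) by delta_angle radians,
    keeping the distance constant so the building stays in view."""
    cx, cy = center
    px, py = cam_pos
    r = math.hypot(px - cx, py - cy)                  # current orbit radius
    a = math.atan2(py - cy, px - cx) + delta_angle    # advance around the circle
    return (cx + r * math.cos(a), cy + r * math.sin(a))
```

a camera looking along the table’s x-axis yields a yaw of zero, and a quarter-turn orbit around the origin moves a camera from (1, 0) to (0, 1).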
finally we put instantreality’s tutorial section online. i wrote several tutorials about instant player’s interface, arduino / serial communication, marker tracking, apple’s sudden motion sensor, java in x3d, followers and dampers, etc.
together with don brutzman’s x3d book this could be a turning point for x3d development. in recent years x3d was a hidden mystery documented only by the web3d specification. only a closed circle of people was able to read and understand the spec, a kind of x3d secret society. i learned it from our developers at the institute. my students learned it from me. like in joseph weizenbaum’s modern cathedrals of science.
i am teaching x3d in the “augmented reality” module of my master course, too.
this issue of page magazine features some quotes from me regarding virtual worlds and open standards.
digg introduced their api with a visualization contest. it is limited to flash-based contributions. anyway, it is a nice occasion to use instant player for visualizing real-time data with X3D on our institute’s projection systems.
there are two visualization sketches on different output systems. in both, the stories grow as ice-like typography trees in real time. each category grows its own tree of upcoming stories. in the center grows the tree with the front-page stories.
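the grouping behind the trees is simple: each incoming story feeds its category’s tree, and promoted stories additionally feed the central front-page tree. a minimal python sketch, where the story dict layout is an assumption and not digg’s actual api schema:

```python
from collections import defaultdict

def grow_trees(stories):
    """sort incoming stories into one typography tree per category;
    promoted stories also feed the central front-page tree.
    the 'category'/'title'/'promoted' keys are illustrative."""
    trees = defaultdict(list)  # category -> list of story titles (branches)
    for story in stories:
        trees[story["category"]].append(story["title"])
        if story.get("promoted"):
            trees["frontpage"].append(story["title"])
    return dict(trees)
```

each call with a fresh batch of stories extends the per-category branch lists, which the renderer can then grow in real time.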
DiggTrees on heyewall
due to its very high resolution the heyewall is an ideal projection system for visualizations with a high information density. with 18 megapixels (6000×3000 pixels) it is possible to read even the tiniest typography. that’s why we do not have to zoom anymore; instead we simply come closer to the screen to read small items.
DiggTrees on MovableScreen
the visualization on the MovableScreen is a more spatial approach. the trees grow in a circle around the front-page stories. by rotating the display we can move around the circle and read details.
during the last months i realized my idea of a movable screen as an interaction device at the IGD. it is again a minimalistic approach, integrating / hiding complex technology in familiar hardware. i simply put a 24″ iMac on a rotating pillar and tracked the rotation with a rotation encoder. in order to create the effect of a “window into virtual reality” i am tracking the user’s head and map the position onto the virtual camera. primarily it runs the instantreality framework for visualization. there are also working prototypes of Second Life, Google Earth and hopefully soon Nasa Worldwind Java.
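the “window into virtual reality” mapping combines the two sensors: the rotation encoder places the screen on its circle around the pillar, and the tracked head position offsets the virtual camera from the screen center. a minimal python sketch with assumed names and axis conventions:

```python
import math

def movable_screen_camera(pillar_angle, head_offset, radius=1.0):
    """sketch of the head-coupled camera mapping (names and axes assumed):
    pillar_angle comes from the rotation encoder, head_offset is the
    tracked head position in screen-local (x, y, z) coordinates."""
    # screen center on its circle around the pillar
    sx = radius * math.cos(pillar_angle)
    sz = radius * math.sin(pillar_angle)
    hx, hy, hz = head_offset
    # rotate the head offset from screen-local into world coordinates
    cam_x = sx + hx * math.cos(pillar_angle) - hz * math.sin(pillar_angle)
    cam_z = sz + hx * math.sin(pillar_angle) + hz * math.cos(pillar_angle)
    return (cam_x, hy, cam_z)
```

with the pillar at angle zero and the head centered 1.5 units up, the virtual camera sits at (1.0, 1.5, 0.0); rotating the display or moving the head shifts the camera accordingly, which is what creates the window effect.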
last week we presented the MovableScreen at the Hannover Messe trade fair at the booth of the BMBF. the software shows the simulation of a Coperion bulk materials plant that i created together with my colleagues at the design&systems institute using the instantreality framework. we have already sold two MovableScreens!
i am happy to have been working on our mixed reality software for the last years. this weekend johannes finally released the public beta 0 of instantplayer at Web3D 2007 in italy. setting up the application, the whole design and the website was a lot of work during the last weeks. finally this great platform-independent piece of software is available to the public for os x and windows.
right now we are writing a wiimote backend for our mixed reality system “instant reality“. meanwhile we have connected the wiimote to our high-resolution wall via glovepie. it’s nice to play wii-style in stereo on an 18-megapixel display.
last week linden labs released the source code of the second life client. we began thinking about what we could do with it.
first we started bringing second life to fraunhofer igd‘s heyewall, an 18-megapixel projection system. marcus and johannes managed to get the client working on our clusters. there were some funny results because some textures were missing.
right now i am working on getting the navigation running on my movableDisplay and the emagin 3dvisor. next i will try to get real-life sensors from an arduino into the metaverse.
yesterday, former digital media students of the design faculty at fh wuerzburg met to show their diploma theses from back then and their current work. philipp heidkamp, didier stricker and joachim sauter gave talks on the topic.
marco put it quite well: sauter’s talks are always immensely inspiring, but at the same time a little frustrating.
a quote by sauter confirmed my own experience: “you just have to try it, and then it works after all.”
for the students such an event is immensely important. for our generation, the exhibitors, sauter’s visit to wuerzburg in 2001 was the motivational push for our studies.