at SIGGRAPH johannes and dirk gave their “Don’t be a WIMP” class. afterwards we showed some stereo rendering, wiimote 6DOF and augmented reality demos. in between we had to run to fetch the google t-shirts.
together with “Rome Reborn” we exhibited the iTACITUS technology. via instantvision’s markerless tracking we showed 3d models of roman monuments on top of a floor map and posters of Rome.
next year's WEB3D conference will be at IGD in darmstadt!
we are about to release our x3d player for the iPhone! patrick just finished the OpenGL ES based application today. it is built with the official iPhone SDK and will be available in the AppStore soon.
beta5 of instantplayer will be released today, too. the experimental BrowserTexture is one of my favorite features. XMLHttpRequest was postponed to beta6. meanwhile i found a solution via a TCPClient backend for loading and parsing XML.
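the idea behind the TCPClient workaround is simple: read the XML document over a raw TCP connection and hand the bytes to an XML parser. here is a minimal self-contained python sketch of that idea — the in-process server, host, and port are stand-ins for illustration, not the instantreality API:

```python
# sketch: fetch an XML document over a raw TCP socket and parse it.
# the tiny in-process server below just stands in for a real data source.
import socket
import threading
import xml.etree.ElementTree as ET

XML_DOC = b'<Scene><Shape DEF="box"/></Scene>'  # hypothetical sample document

def serve_once(srv):
    """Accept one connection, send the document, close."""
    conn, _ = srv.accept()
    conn.sendall(XML_DOC)
    conn.close()

def fetch_and_parse(host, port):
    """Read everything the peer sends over TCP and parse it as XML."""
    chunks = []
    with socket.create_connection((host, port)) as sock:
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return ET.fromstring(b"".join(chunks))

srv = socket.socket()
srv.bind(("127.0.0.1", 0))   # any free local port
srv.listen(1)
threading.Thread(target=serve_once, args=(srv,), daemon=True).start()

root = fetch_and_parse("127.0.0.1", srv.getsockname()[1])
print(root.tag, root[0].get("DEF"))
```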
all news and interesting demos will be presented at WEB3D and SIGGRAPH next week. see you there.
when radiohead released their “In Rainbows” album i bought the limited edition deluxe discbox. it’s good for markerless tracking! now thom yorke’s virtual head is rendered on it in augmented reality.
in our itacitus project at IGD alain and yulian developed a “poster tracker” for markerless tracking. it takes a photo of a flat object (poster, LP, iPhone screen) and extracts visual features. in instantreality it calculates the correct camera position relative to the tracked object. thus it is very easy to write augmented reality applications without deep knowledge of computer vision.
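for a flat object like a poster, “calculating the camera position” boils down to estimating a homography between the plane and the camera image and decomposing it into a pose using the camera intrinsics. a minimal numpy sketch of that step — the function names and the synthetic camera here are my own illustration, not the actual itacitus code:

```python
import numpy as np

def estimate_homography(obj_pts, img_pts):
    """DLT estimate of the homography H mapping planar points (x, y) to pixels (u, v)."""
    rows = []
    for (x, y), (u, v) in zip(obj_pts, img_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.array(rows))
    return vt[-1].reshape(3, 3)   # null vector = flattened H

def pose_from_homography(H, K):
    """Recover camera rotation R and translation t relative to the plane z=0."""
    M = np.linalg.inv(K) @ H
    lam = 1.0 / np.linalg.norm(M[:, 0])
    if M[2, 2] < 0:               # poster must lie in front of the camera
        lam = -lam
    r1, r2 = lam * M[:, 0], lam * M[:, 1]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    t = lam * M[:, 2]
    return R, t

# synthetic check: project the four poster corners with a known camera ...
K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
R_true, t_true = np.eye(3), np.array([0.1, -0.05, 2.0])
corners = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
img_pts = []
for x, y in corners:
    p = K @ (R_true @ np.array([x, y, 0.0]) + t_true)
    img_pts.append((p[0] / p[2], p[1] / p[2]))

# ... and recover the pose from the correspondences alone
H = estimate_homography(corners, img_pts)
R, t = pose_from_homography(H, K)
```

in a real tracker the corner correspondences come from matching the extracted features of the reference photo against the live frame.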
ok. i trained the poster tracker for the “In Rainbows” cover and added the 3D data of “House of Cards” to it via peter’s script. now thom is singing on top of the album. thanks mario for the mac version!
the tracking works on the iphone, too. ok, so far only when the cover image is displayed on the iphone's screen. but we really are porting instantreality to the iphone.
the real-time 3d data looks impressive on our new heyewall 2.0 – a multi-touch projection wall with 8160 x 4000 pixel resolution. over the next days we will add wiimote / balance board interaction for controlling time and distortion.
last week we finished the basic hardware setup of the new heyewall 2.0 at fraunhofer igd. it's a multi-touch projection wall with 8160 x 4000 pixel resolution, all hardware accelerated by a 48 pc cluster running our instantcluster software on ubuntu linux.
in april we tested our markerless tracking and my reality filtering system at the unesco world heritage site “reggia venaria reale” near turin in italy. the markerless tracking is based on alain's and yulian's poster tracker. you only have to take a reference image of the background to get stable tracking. that's how tracking should work.
my reality filtering system renders the whole environment as a sketch in order to augment historic drawings on the site. one example is the diana temple at the site. it was located at the end of a long creek in the large gardens; only its ruins and two historic drawings are left. standing on a viewing platform, visitors look through the display of a mobile computer. as soon as they look at the temple's position, the video on the screen is rendered like a sketch and the drawing of the temple fits into the environment. the whole garden becomes a real-time drawing.
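a common way to get such a sketch look is edge extraction on the camera image: strong gradients become dark strokes, flat areas stay paper-white, and a line drawing blends in naturally. a tiny pure-python sobel sketch of that idea (not the actual filtering pipeline, which runs on the GPU video stream):

```python
def sketchify(img):
    """Turn a grayscale image (list of rows, values 0..255) into a
    white-background sketch: edges become dark strokes."""
    h, w = len(img), len(img[0])
    out = [[255] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # sobel gradients in x and y
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            mag = min(255, abs(gx) + abs(gy))
            out[y][x] = 255 - mag   # dark stroke where the edge is
    return out

# synthetic camera frame: dark square on a bright background
frame = [[50 if 3 <= x < 7 and 3 <= y < 7 else 200 for x in range(10)]
         for y in range(10)]
sketch = sketchify(frame)
```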
at cebit 2008 we are introducing instantware. it is a sub-brand of instantreality for our custom interaction devices and services like the multi-touch table and heyewall, the movablescreen and our mobile augmented reality applications.
the multi-touch table will be presented at the BMBF booth (hall 9, b40). johannes and yvonne built an ubercool architectural application. it features several scalable plans of messe frankfurt's new hall 11. multiple users can move and zoom these plans as known from other multi-touch applications.
but the key feature is the tile with a high-quality 3d view of the building. they implemented the multi-touch 3d camera gestures jens and i introduced last year: you grab a plan with the left hand. one finger of the right hand moves the camera through the 3d model, the second finger defines the orientation of the camera. this enables incredible cinematic camera movements in 3d. when the second finger points at a certain object on the plan and the first finger moves around it, the 3d camera orbits that object while keeping it in focus.
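the orbit part of the gesture has a simple core: the second finger pins a world-space focus point, the first finger's angle around it places the camera on a circle, and the camera orientation is just the look-at direction toward the pinned point. a minimal python sketch of that mapping, with hypothetical names — the real application of course ties this to live touch input:

```python
import math

def orbit_camera(focus, radius, angle, height):
    """Place the camera on a circle around `focus` and aim it at the focus.
    `angle` would come from the first finger moving around the pinned point."""
    cx = focus[0] + radius * math.cos(angle)
    cz = focus[2] + radius * math.sin(angle)
    cam_pos = (cx, focus[1] + height, cz)
    # look-at direction: normalized vector from camera to the focused object,
    # so the object stays in the center of the view while orbiting
    d = [focus[i] - cam_pos[i] for i in range(3)]
    n = math.sqrt(sum(c * c for c in d))
    view_dir = tuple(c / n for c in d)
    return cam_pos, view_dir

# dragging the first finger sweeps the camera around the object;
# the distance to the focus stays constant the whole time:
focus = (2.0, 0.0, 1.0)
p0, d0 = orbit_camera(focus, radius=5.0, angle=0.0, height=0.0)
p1, d1 = orbit_camera(focus, radius=5.0, angle=math.pi / 2, height=0.0)
```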
last week we set up an installation for coperion group (global market leader for compounding systems) at k trade fair in duesseldorf: a large multi-touch table and an 8 meter wide hd projection showing a complete interactive plastics production process.