this week my student jens finished his diploma thesis at h-da and igd. he developed a multitouch table and software for navigating through immersive vr environments. now that multitouch has become a standard technology and anyone can build a table, it's the applications that matter.
together we created a very straightforward interaction concept for navigating and evaluating virtual architecture visualizations. the movement through the virtual space becomes cinematic when two fingers control movement, orientation and look-at at the same time.
the setup consists of a table installation in front of igd's 18 megapixel heyewall. the content is a 3d architecture visualization of messe frankfurt's new building. the table shows a bird's-eye view of the area. tracking is done by harald's visionlib blob detection, which communicates with instant player (X3D) via the TUIO protocol.
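to give an idea of what the TUIO side of such a pipeline looks like, here is a minimal sketch of a cursor tracker for `/tuio/2Dcur` messages. it assumes the osc messages have already been decoded into plain argument lists (a real setup would use an osc/udp library underneath); the class name and structure are illustrative, not the actual visionlib or instant player code.

```python
class TuioCursorTracker:
    """tracks active 2d cursors from tuio /tuio/2Dcur messages."""

    def __init__(self):
        self.cursors = {}  # session id -> (x, y), normalized 0..1

    def handle(self, args):
        cmd = args[0]
        if cmd == "alive":
            # the alive message lists all currently active session ids;
            # anything not listed has been lifted off the table
            alive = set(args[1:])
            self.cursors = {s: p for s, p in self.cursors.items() if s in alive}
        elif cmd == "set":
            # set message: session id, x, y, velocity x/y, acceleration
            s, x, y = args[1], args[2], args[3]
            self.cursors[s] = (x, y)
        # "fseq" frame-sequence markers are ignored in this sketch


tracker = TuioCursorTracker()
tracker.handle(["alive", 1, 2])
tracker.handle(["set", 1, 0.25, 0.50, 0.0, 0.0, 0.0])
tracker.handle(["set", 2, 0.75, 0.40, 0.0, 0.0, 0.0])
tracker.handle(["alive", 2])  # finger 1 lifted, only cursor 2 survives
print(tracker.cursors)
```

the two surviving gestures (one finger, two fingers) can then be read straight out of `tracker.cursors` each frame.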
pointing with one finger at a spot lets the camera fly to that point in the 3d scene, and dragging the finger moves the camera. a second finger sets the orientation of the camera: pointing it at a certain building makes the camera look at it. keeping the second finger on a building while moving the first finger around it orbits the camera around the building while it stays in view.
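the two-finger mapping above boils down to simple ground-plane math: the first finger drives the camera position, the second the look-at target, and the orbit behaviour falls out for free because the heading is recomputed every frame. a minimal sketch, assuming normalized table coordinates and an illustrative table-to-world scale and eye height (the function name and parameters are assumptions, not the actual implementation):

```python
import math


def camera_from_fingers(f1, f2, scale=100.0, eye_height=1.7):
    """maps two table points (x, y in 0..1) to a camera pose.

    f1 drives the camera position on the ground plane,
    f2 the look-at target; returns (position, yaw in radians).
    """
    cam = (f1[0] * scale, eye_height, f1[1] * scale)
    target = (f2[0] * scale, eye_height, f2[1] * scale)
    dx = target[0] - cam[0]
    dz = target[2] - cam[2]
    yaw = math.atan2(dx, dz)  # heading from camera toward the target
    return cam, yaw


# finger 1 at the origin, finger 2 to the "east" on the table:
pos, yaw = camera_from_fingers((0.0, 0.0), (1.0, 0.0))
print(pos, yaw)
```

because yaw is derived from the vector between the two fingers each frame, dragging f1 on a circle around a fixed f2 automatically produces the orbit-while-looking-at-it behaviour described above.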