Workshop Project Descriptions
Lead Investigator: Stefan Greuter Stefan.email@example.com
This project explores a series of stereoscopic multi-user experiences that are displayed in the Virtual Room. The Virtual Room is a revolutionary visualisation laboratory which can be used as an interactive, immersive, three-dimensional and stereoscopic environment. Audiences of up to 40 participants can simultaneously experience the illusion that whatever is contained within the Virtual Room is physically contained within the confines of the eight screens. Interactive games, animations and video can be viewed from a changing perspective as members of the audience walk around the perimeter of the Virtual Room.
Particular projects that will be demonstrated are:
VROOM Tennis is a three-dimensional stereoscopic multi-player tennis game experience based on one of the earliest arcade video games, the one that launched the video game industry: Pong. The aim of the game is to defeat opponents in a simulated tennis-like game in which a ball bounces within the octagonal VROOM space while each player defends their screen with a paddle controlled by their left or right hand through a Microsoft Kinect device. The last remaining player wins.
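The core mechanic described above, a ball rebounding inside an octagonal arena, can be sketched as a simple reflection off each wall's inward normal. The wall numbering, arena radius and the specific functions below are illustrative assumptions, not the project's actual implementation.

```python
import math

NUM_WALLS = 8
ARENA_RADIUS = 5.0  # metres; assumed, not taken from the project

def wall_normal(wall_index):
    """Inward-facing unit normal of wall `wall_index` (0..7) of a
    regular octagon centred on the origin."""
    angle = 2 * math.pi * wall_index / NUM_WALLS
    return (-math.cos(angle), -math.sin(angle))

def reflect(velocity, normal):
    """Reflect a velocity vector about a wall normal: v' = v - 2(v.n)n."""
    vx, vy = velocity
    nx, ny = normal
    dot = vx * nx + vy * ny
    return (vx - 2 * dot * nx, vy - 2 * dot * ny)

# A ball moving straight at wall 0 bounces straight back;
# if the player's paddle is not in the way, the point is lost.
v_in = (1.0, 0.0)
v_out = reflect(v_in, wall_normal(0))
print(v_out)
```

In a full game loop, the ball's position would be tested against the nearest wall each frame, and the reflection only applied when the defending player's paddle covers the impact point.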
VROOM Video Performances by Australian artist, filmmaker and academic Dr Shaun Wilson. In a series of short videos shot in the stereoscopic VROOM format, Wilson directs dance performances to music and monologues shot with several 3D cameras.
The Glub is a 3D animation by Ash Curkpatrick, a recent RMIT Animation and Interactive Media graduate. The animation is about a boy who brings home an alien creature that starts eating whatever is in its way. It features cartoon-style animation set in a fully modelled cartoon-style bedroom environment, and can be viewed from eight perspectives in stereo 3D.
Lead Investigators: Garry Keltie/Marita Cooper/Peter Wilson
CYCLE-SIM aimed to design an experiment to measure human-factors responses to proposed urban designs and to feed the results back into an iterative design process. The project studied users in a human-scale virtual simulation and examined differences in cyclist perception and behaviour between differing designs of cycle transit environments. While game-like, the application used an applied-gaming approach combined with immersive display techniques to create a compelling scenario. The initial study revealed issues with the experimental design, game level design, input devices and display systems, and proposed considerations for further simulation work.
Positional Tracking Demonstration for Augmented Environments
Lead Investigators: Chris Barker/Joshua Batty
The demonstration will present a proof-of-concept installation for the creation and display of augmented desktop environments, using line-of-sight tracking to present an anamorphic immersive display that exists beyond the screen.
Current screen-based displays limit the user to two-dimensional representations of virtual worlds. Albertian perspective dominates the representation of other spaces through the creation of a fixed viewpoint. This demonstration makes use of open source libraries for the Microsoft Kinect or consumer web cameras to enable the real-time construction of a point of view calculated in 3D space. It presents a novel mapping algorithm enabling the user to map game environments onto the real world. It combines technology for video mapping and parallax mapping to enable greater telepresence in mixed reality (or augmented) environments.
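The head-coupled perspective idea described above is commonly realised as an off-axis (asymmetric) projection frustum: the tracked eye position is used to skew the view so the scene appears to extend beyond the physical screen. The following is a minimal sketch of that geometry; the screen dimensions, clipping planes and function name are assumptions for illustration, not the demonstration's actual code.

```python
SCREEN_W, SCREEN_H = 0.52, 0.32   # physical screen size in metres (assumed)
NEAR, FAR = 0.1, 100.0            # clipping planes (assumed)

def off_axis_frustum(eye_x, eye_y, eye_z):
    """Frustum bounds (left, right, bottom, top) at the near plane for a
    tracked eye at (eye_x, eye_y, eye_z) metres from the screen centre.
    The screen lies in the z = 0 plane and eye_z > 0 is viewing distance."""
    # Similar triangles: project the screen edges onto the near plane.
    scale = NEAR / eye_z
    left   = (-SCREEN_W / 2 - eye_x) * scale
    right  = ( SCREEN_W / 2 - eye_x) * scale
    bottom = (-SCREEN_H / 2 - eye_y) * scale
    top    = ( SCREEN_H / 2 - eye_y) * scale
    return left, right, bottom, top

# A centred viewer gets a symmetric frustum.
print(off_axis_frustum(0.0, 0.0, 0.6))
# A viewer who steps to the right gets a skewed frustum,
# which produces the parallax effect described above.
print(off_axis_frustum(0.2, 0.0, 0.6))
```

These four bounds are exactly the parameters an asymmetric projection call (such as OpenGL's `glFrustum`) expects, so each frame the tracked head position can be fed straight into the renderer.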
Lead Investigators: Jonathan Duckworth/Peter Wilson
ELEMENTS is an interactive table top environment that supports movement assessment and rehabilitation for patients recovering from Traumatic Brain Injury. Brain injured patients frequently exhibit impaired upper limb function including reduced range of motion, accuracy of reaching, inability to grasp and lift objects, or perform fine motor movements. The Elements system responds to this level of disability by using an intuitive desktop workspace that affords basic gestural control.
The ELEMENTS system provides the patient with a suite of software applications for composing with sound and visual feedback that promote artistic activity and playful interaction. Painting and sound mixing are expressed through the patient's upper-limb control of soft, graspable user interfaces. These environments are designed to engage the patient's interest in practising otherwise limited movement skills.
Through playful interaction, users can seek out new effects, sounds and visual features to see how they work. In doing so, patients discover new ways of relating to their body and relearn their movement capabilities in a self-directed fashion.
Remote Musical Collaboration and Gestural Interactions
Lead Investigators: Barry Hill/Joshua Batty
This demonstration will present a performance between two musicians and a dancer, demonstrating the networking capabilities of a live link to Salford University and the use of gestural input to manipulate audio effects in real time. By assigning musical control to the dancer, they become, in a sense, the third musician in the group and also take on the role of a conductor. The dancer is then able to use their own body as a musical instrument. The project makes use of open source libraries and custom software, and presents a novel mapping system to assign gestural input to musical effects.
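One common way such a gesture-to-effect mapping works is to normalise a tracked body parameter (here, hand height) and map it onto an effect parameter with a perceptually even curve. The sketch below is a hypothetical illustration of that idea; the parameter ranges, the exponential curve and the function name are assumptions, not the project's actual mapping system.

```python
CUTOFF_MIN = 100.0    # Hz, assumed lower bound of the filter sweep
CUTOFF_MAX = 8000.0   # Hz, assumed upper bound

def hand_height_to_cutoff(height):
    """Map a normalised hand height (0.0 = lowest tracked point,
    1.0 = highest) to a filter cutoff frequency. The mapping is
    exponential so that equal hand movements sound perceptually even."""
    height = min(max(height, 0.0), 1.0)  # clamp to the tracked range
    return CUTOFF_MIN * (CUTOFF_MAX / CUTOFF_MIN) ** height

print(hand_height_to_cutoff(0.0))  # 100.0
print(hand_height_to_cutoff(1.0))  # 8000.0
```

In a live setup, values like this would typically be streamed each frame from the tracker to the audio engine (for example over OSC), so the dancer's movement drives the effect continuously.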