The University of Pennsylvania has installed a virtual reality system that allows a participant full-body interaction with a virtual environment without the hassle of bulky, dizzying 3-D glasses.
Key to the installation, dubbed LiveActor, is the pairing of an optical motion capture system to monitor the body's movements with a stereo projection system to immerse users in a virtual environment. The combination lets users interact with characters embedded within virtual worlds.
LiveActor users wear a special suit fitted with 30 sensors positioned on different parts of the body. The system tracks these sensors as an actor moves around a stage roughly 10 feet by 20 feet, allowing a virtual character to recreate the user's movements precisely and without noticeable lag. The system can also project images onto the array of screens surrounding the LiveActor stage.
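The loop described above (sensor positions in, avatar pose out) can be sketched roughly as follows. This is a hypothetical illustration, not the LiveActor implementation: the marker names, frame format and retargeting logic are all invented for the example.

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by markers a-b-c, e.g. an
    elbow angle from shoulder, elbow and wrist marker positions."""
    v1 = tuple(ai - bi for ai, bi in zip(a, b))
    v2 = tuple(ci - bi for ci, bi in zip(c, b))
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def retarget(frame):
    """Map one captured frame of marker positions onto avatar joint
    rotations (only the right elbow shown here)."""
    return {
        "right_elbow": joint_angle(
            frame["r_shoulder"], frame["r_elbow"], frame["r_wrist"]
        ),
    }

# One captured frame: a fully extended right arm along the x-axis,
# so the elbow angle should come out as 180 degrees.
frame = {
    "r_shoulder": (0.0, 1.5, 0.0),
    "r_elbow": (0.3, 1.5, 0.0),
    "r_wrist": (0.6, 1.5, 0.0),
}
pose = retarget(frame)
```

A real pipeline would run this per frame (typically 60+ Hz), smooth the marker data, and solve for a full skeleton rather than a single joint, but the frame-in, pose-out structure is the same.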
While stereo projection systems have in the past been limited to relatively static observation and navigation, such as architectural walk-throughs, games and medical visualizations, LiveActor can simulate nearly any environment or circumstance, chart user reactions and train users to behave in new ways. Unlike actual humans, virtual characters can be scripted to behave consistently in a certain way.
> from *New Virtual Reality Array Allows Immersive Experience Without the Disorienting 3-D Goggles*. May 12, 2003
> Tele-Immersion Demonstration: Milestone of Grid Computing. November 27, 2002
> First Transatlantic Touch: Virtual Reality Touch. November 4, 2002
> First Large-Scale VR Environment for Business Applications. March 8, 1999
> Stereo Play