
Whatever happened to ... Virtual Reality?




Twenty years after the first wave of hype, VR is making a comeback in NASA laboratories.

NASA


June 21, 2004: In the Matrix sequels, the metal gates to the city of Zion are operated by air traffic controllers who work inside a virtual control tower: a computer-generated, heavenly-white space where controllers use fancy virtual control panels to guide sci-fi hovercraft.

This fantasy scenario must seem familiar to anyone who rode the wave of VR hype during the 1980s. Helmet-mounted displays, power gloves, 3D sights and sounds: these technologies were supposed to make immersive environments commonplace, revolutionizing everything from video games to stock market analysis to psychotherapy.

It didn't happen.

Right: At NASA/Ames, Dr. Stephen Ellis models a VR helmet.

"The technology of the 1980s was not mature enough," explains Stephen Ellis, who leads the Advanced Displays and Spatial Perception Laboratory at NASA's Ames Research Center. VR helmets and their optics were too heavy. Computers were too slow. Touch-feedback systems often didn't work. The only things consistently real about VR were headaches and motion sickness--common side effects of '80s-era helmets.


Twenty years later, things have improved. Computers are thousands of times faster; VR peripherals are lighter and deliver a greater sense of feedback and immersion. And, importantly, researchers are beginning to understand crucial human factors; they're eliminating nausea and fatigue from the VR experience.

Once again, virtual reality seems promising, and NASA is interested.

Picture this: an astronaut on Mars sends a rover out to investigate a risky-looking crater. Slip-sliding down the crater wall, the rover sends signals back to the Mars Base where the astronaut, wearing VR goggles and gloves, feels like she herself is on the slope. Is the find important enough to risk venturing out in person? VR helps decide.

In another scenario, astronauts could use VR to perform repairs on the outside of their spacecraft by controlling a human-like robot, such as the Robonaut being developed at Johnson Space Center (JSC).

Left: An artist's concept of the NASA Robonaut. Image credit: John Frassanito & Associates, Inc.

Ellis, who holds advanced degrees in psychology and behavioral science, evaluates VR for space applications. At the moment he's investigating user interfaces for robots such as AERCam, short for Autonomous Extravehicular Robotic Camera: spherical free-flying robots being developed at JSC to inspect spacecraft for trouble spots. AERCam is designed to float outside a vehicle such as the ISS or the space shuttle, using small xenon-gas thrusters and solid-state cameras to view the vehicle's outer surfaces and find damage in places (such as the shuttle's underside) where a human spacewalker or the orbiter's robotic arm can't safely go.

The current plan is to use a laptop and a normal, flat monitor to operate AERCam. But Ellis is conducting research, funded by NASA's Office of Biological and Physical Research, to see if a virtual environment might be a better option. With a VR system, the astronaut could maneuver the melon-sized AERCam with standard hand controls while intuitive head movements rotate AERCam to let the astronaut "look around."
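The head-coupled "look around" control described above can be sketched as a rate-limited slew loop: each update, the robot's camera turns toward wherever the astronaut's head is pointing, but no faster than the robot can physically rotate. This is only an illustrative sketch; the function name, the 30 degrees-per-second limit, and the 10 Hz update rate below are assumptions, not details of the actual AERCam interface.

```python
def slew_camera(cam_deg, target_deg, max_rate_deg_s, dt_s):
    """Move the camera angle one time step toward the head-tracker
    target, limited by the robot's maximum slew rate."""
    error = target_deg - cam_deg
    max_step = max_rate_deg_s * dt_s          # largest allowed change this step
    step = max(-max_step, min(max_step, error))
    return cam_deg + step

# A sudden 90-degree head turn is followed gradually by the camera:
cam = 0.0
for _ in range(5):
    cam = slew_camera(cam, 90.0, 30.0, 0.1)   # 30 deg/s limit, 10 Hz updates
print(cam)  # 15.0 -- half a second in, the camera has turned 15 degrees
```

Rate-limiting like this keeps the robot's thruster and gimbal demands bounded even when the operator's head moves abruptly.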

Ellis' research is necessary because, he says, "VR isn't always the best choice." For example, at Wright-Patterson Air Force Base researchers have tested VR interfaces for pilots. Time after time, their tests showed that pilots perform better with traditional panel-mounted displays.

Why? No one is sure, but here's one possibility: The field of view of the VR helmet was narrower than the pilots' natural peripheral vision. Ellis believes these helmets effectively divorced the pilots from the cockpit--the environment in which they learned to fly.

"There are some surprisingly simple ergonomic issues that can interfere with VR interfaces," adds Ellis. For example, "in the early 1990s Mattel sold the PowerGlove (a simple VR glove) as a novel way to control video games. It was cool. But kids quickly discovered that it's very tiring to hold your hand up in front of you long enough to play an entire game." You'd have to be an athlete to use it. (The glove is no longer sold.)

Right: The Mattel PowerGlove was cool, but tiring.

Since the 1980s there has been a dawning awareness among researchers that human factors are crucial to VR. Age, gender, health and fitness, peripheral vision, posture, the sensitivity of the vestibular system: all of these things come into play. Even self-image matters. One study showed that people wearing VR helmets like to glance down and see their own virtual body. It helps "ground them" in the simulation. And the body should be correct: arms, legs, torso; male for men; female for women.

For every virtual environment, there is a human-computer interface, and if the interface doesn't match the person … game over.

To address these human factors, Ellis's group performs fundamental research on human senses and perception. One central concern is how people cope with "latencies," or delays, in the VR system. When you swing your head, does the virtual view follow immediately, or is there a split-second lag? If your eyes and your inner ear (where vestibular organs sense orientation) send conflicting reports to the brain, you might need a motion-sickness bag.

Below: Organs in the inner ear (the vestibular system) affect human balance. Vestibular adaptation is a key human factor in VR interface design.

"The question is how much delay can you tolerate?" Ellis says. For movement within the virtual environment to feel natural, most people need the delay to be less than 15 milliseconds (thousandths of a second), according to his group's research.
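The delay in question is often called motion-to-photon latency: the total time from a head movement to the updated image reaching the eye, accumulated across the tracking, rendering, and display stages. A minimal sketch of that budget check, using the roughly 15 millisecond tolerance from the group's research (the stage timings below are made-up example numbers):

```python
def motion_to_photon_ms(tracker_ms, render_ms, display_ms):
    """Total delay from a head movement to the updated image:
    the sum of the tracking, rendering, and display stages."""
    return tracker_ms + render_ms + display_ms

def feels_natural(latency_ms, tolerance_ms=15.0):
    """True if the end-to-end lag is within the ~15 ms tolerance
    most people need for motion to feel natural."""
    return latency_ms <= tolerance_ms

# Hypothetical pipeline: 4 ms tracking + 6 ms rendering + 8 ms display
print(feels_natural(motion_to_photon_ms(4.0, 6.0, 8.0)))  # False -- 18 ms is over budget
```

The point of summing the stages is that no single component has to be slow for the system to fail: three individually fast stages can still add up past the threshold where eyes and inner ear start to disagree.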

Bernard Adelstein, Durand Begault, and Elizabeth Wenzel, colleagues of Ellis who work in the Advanced Displays Laboratory at Ames, have discovered that, because sounds in a virtual environment can be generated much faster than touch feedback from a VR glove, sound can help compensate for the delay in touch. For example, when grabbing a virtual object, the immediate "click" sound of contact enhances the user's tactile perception of realism.
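The trick is one of timing: because a contact sound can be synthesized almost instantly while force feedback from a glove arrives later, the early "click" masks the haptic lag. A toy sketch of that scheduling idea, with illustrative latency numbers that are assumptions, not measured values from the Ames laboratory:

```python
def contact_feedback_times(contact_ms, audio_latency_ms=2.0, haptic_latency_ms=40.0):
    """Given the moment of virtual contact, return when the audio
    'click' and the glove's force pulse actually reach the user.
    The click is generated almost immediately, masking the slower
    haptic response."""
    return contact_ms + audio_latency_ms, contact_ms + haptic_latency_ms

# The user grabs a virtual object at t = 100 ms:
audio_t, haptic_t = contact_feedback_times(100.0)
print(audio_t, haptic_t)  # 102.0 140.0 -- the sound arrives 38 ms before the touch
```

Because the brain readily binds a sound to a nearly simultaneous touch, the prompt audio cue makes the whole contact event feel more immediate than the glove alone could.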

The years of research are finally beginning to pay off, Ellis says. "The fully immersive, head-mounted system is getting to be high enough fidelity for practical use. We'll probably have the AERCam experiment running by August."

The Matrix will take a little longer.
