
Virtual reality: Preparing workers for space, deserts or deep oceans

[Image: an astronaut on a spacewalk. Credit: NASA]

In the future, as the projected limits of human exploration extend beyond the moon to Mars and even further destinations, some workplaces will be in deep space with low- and no-gravity environments. How can workers be trained for extreme environments like space, deserts or deep oceans in a safe, cost-effective manner?

A research team that includes three Texas A&M College of Architecture scholars is developing design principles for virtual reality training simulators for workers to flourish in these environments in a three-year, $1.2 million study funded by the National Science Foundation.

“Our main goal is to understand how spatial cognitive processing differs in altered gravitational and visual environments, and how virtual reality-based simulation can help train workers to adapt to such environments,” said Manish Dixit, Texas A&M assistant professor of construction science and the study’s principal investigator.

Current approaches to training workers for operations in altered environments are expensive, insufficient and risky, he said.

“To train workers for microgravity conditions, parabolic flights are used to experience a microgravity-like environment through free fall,” he said. “This is expensive and risky. Each flight has to perform multiple parabolic climbs and descents to achieve a few seconds of microgravity.”

Traditional scuba diving training methods are also risky, he said. “In 2014, nearly 1,220 emergency room visits due to scuba injuries were reported in the U.S. alone, resulting in 188 deaths.”

Dixit and Texas A&M visualization researchers Ann McNamara and André Thomas will measure and analyze subjects’ eye movements, brain electrical activity, cognitive strategies, mental workload and reaction times during behavioral tests in simulated normal and altered environments, then apply the results to create a framework for a training simulation or game.
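To illustrate the kind of comparison such behavioral tests enable, the sketch below contrasts reaction times from a normal-gravity baseline with those from a simulated altered environment. The trial values, condition names and the percentage-slowdown measure are illustrative assumptions for this example, not the study's actual protocol, data or analysis method.

```python
# Hypothetical sketch: comparing reaction times between a normal-gravity
# baseline and a simulated altered (e.g. microgravity-like) VR condition.
# All numbers below are made up for illustration.
from statistics import mean, stdev

# Reaction times in seconds for one subject across repeated trials
baseline_rt = [0.42, 0.45, 0.40, 0.44, 0.43]  # normal environment
altered_rt = [0.55, 0.61, 0.58, 0.57, 0.60]   # simulated altered environment

def summarize(trials):
    """Return the mean and sample standard deviation of reaction times."""
    return mean(trials), stdev(trials)

base_mean, base_sd = summarize(baseline_rt)
alt_mean, alt_sd = summarize(altered_rt)

# A simple effect measure: how much slower responses are in the altered
# environment, expressed as a percentage of the baseline mean.
slowdown_pct = 100 * (alt_mean - base_mean) / base_mean
print(f"baseline {base_mean:.3f}s, altered {alt_mean:.3f}s, "
      f"slowdown {slowdown_pct:.1f}%")
```

In a real protocol, per-trial measures like these would be aligned with eye-tracking and brain-activity recordings to characterize how spatial cognition changes across environments.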

“This study will create new knowledge in behavioral and physiological domains of cognitive science and lead to a better understanding of spatial cognitive processing in altered environments,” said Dixit. It will also lead, he said, to the invention, evaluation, and application of innovative methods and tools that use virtual reality, eye tracking and brain wave data to design scenario-based simulations and games for workforce training.

In the study, the College of Architecture researchers are also teamed with Joseph Orr and Jyotsna Vaid, faculty members of the Texas A&M Department of Psychological & Brain Sciences, and Greg Chamitoff, a former astronaut and professor of aerospace engineering.

The study’s funding is from the Human Technology Frontier Program of the National Science Foundation’s Division of Computer and Network Systems.