Nobody thinks much about how to navigate from the sofa to the refrigerator. Memorizing sensory cues along the way, like the sight of the dining room table or a kitchen countertop, requires no conscious effort. You may not realize it, but your brain is processing data faster than any supercomputer, rapidly constructing a mental map from sensory inputs as you move through your house.

When healthy, we don’t appreciate this gift. However, spatial memory is one of the first functions to deteriorate in several neurological conditions, including epilepsy and Alzheimer’s disease. That is why UCLA neurophysicist Mayank Mehta devotes his career to studying how the brain, and especially the cells in a region called the hippocampus, learns to create our perception of space and time.

“The hippocampus is a special part of our brain that has evolved to learn and perceive abstract ideas, such as space and time,” said Mehta, a UCLA professor of neurology and neurobiology. “Damage to the hippocampus manifests as many neurological diseases such as depression, epilepsy and Alzheimer’s.”

If scientists better understood how the hippocampus perceives space and time, Mehta says, they could figure out how to better diagnose and treat debilitating neurological diseases that impair many forms of learning and memory. That’s why Mehta, who is also a professor of physics and astronomy in the UCLA College, and colleagues in his laboratory are pioneering the use of virtual reality to determine how neurons make mental maps of space and to uncover the cellular basis of learning and memory.

As Mehta notes, all animals (including humans) “agree 100 percent on concepts of abstract space and time.” And all animals calculate where they are in space in the same way, or else they would bump into one another and into the objects around them.

Such universal perception of space-time is possible because a set of hippocampal neurons, called “place cells,” fire in an orderly fashion as any animal navigates a jungle, a living room or a maze, laying down a map of space-time without even trying.
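As a rough illustration of that idea (a toy model, not drawn from Mehta’s work), place cells are often described as neurons whose firing rate peaks near one preferred location. The short Python sketch below uses hypothetical Gaussian firing fields along a one-meter track, with arbitrary field centers and widths, to show how an orderly sequence of such cells can encode position.

import numpy as np

# Toy model, not from the article: each simulated "place cell" fires most
# strongly near one preferred spot on a 1-meter track (a Gaussian firing field).
track = np.linspace(0.0, 1.0, 200)   # positions sampled along the track (meters)
centers = np.linspace(0.1, 0.9, 8)   # arbitrary preferred locations of 8 model cells
width = 0.08                         # arbitrary firing-field width (meters)

# rates[i, j] = relative firing rate of cell i when the animal is at position j
rates = np.exp(-((track[None, :] - centers[:, None]) ** 2) / (2 * width ** 2))

# As the animal moves along the track, the cells fire one after another in an
# orderly sequence; reading out which cell is most active recovers position.
decoded_position = centers[np.argmax(rates, axis=0)]

In this toy picture, the “map” is simply the ordered set of firing fields. Mehta’s question is how real hippocampal neurons build such fields from raw sights, sounds and smells.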

Three scientists won the Nobel Prize in Physiology or Medicine in 2014 for discovering those place cells. However, even after that discovery, scientists were unsure exactly how place cells work. How do they turn sensory inputs such as light, sounds and smells into a universal perception of space and time?

To resolve this question, Mehta concluded, one would need to tease apart the different sensory stimuli that constitute space-time. This is impossible to do in the real world, but it can be accomplished readily in the virtual world.

To test spatial memory while controlling for sensory cues, scientists for decades used “water mazes,” which involved placing rodents in pools of water and testing their ability to swim to an escape platform. The rationale was that immersion in water blocked interfering sensory input, like stray odors or sounds. But a water maze is unsuitable for diagnosing a person who may have symptoms of Alzheimer’s, whereas a virtual maze can be used for both humans and other species.

In one experiment, the researchers compared rats’ ability to locate rewards hidden in a real-world maze — where rats can use sights, sounds and odors to navigate — with their ability to locate rewards in a virtual maze, where they are limited to using only visual cues.

First, though, an interdisciplinary team that included UCLA students in neuroscience, psychology, computer science, engineering and physics had to build the virtual reality apparatus from scratch. They designed the hardware, including a ball-shaped treadmill, and trained the rats to become comfortable in virtual reality and to play virtual games.

The findings were surprising. The rats in the virtual maze found hidden rewards as skillfully as rats in a real maze. Rats, it turned out, can see quite well. In virtual reality, the rats could use visual cues to directly drive responses in the hippocampus, showing that they perceived virtual space as well as humans do.

Despite this, the neural activity in the rats’ hippocampus was highly abnormal in virtual reality. More than half of the hippocampal neurons shut down, and the remaining neurons fired in a disordered fashion, a pattern Mehta calls “alarmingly different” from the place cell activity rats display while navigating a real maze.

These studies demonstrate that the brain needs inputs from multiple senses to construct a working spatial map, Mehta said. If vision says you are moving through space but the sounds and smells say otherwise, as is the case in virtual reality, the neural activity becomes very strange. Given the similarities in how rodents and humans process space, similar results would also be expected in human brains.

“VR breaks the laws of physics,” Mehta said. “It removes the consistent relationship between different stimuli in the world that all the animals have used for millions of years. This results in abnormal activity patterns in the brain.”

Defining how we perceive space is just a jumping-off point for Mehta’s research.

“Our brains, including the hippocampus, evolved to perceive time and space,” Mehta said. “Now that we don’t have to run for our lives, we use that hardware to write symphonies and trade stocks. If we can understand how this hardware computes space-time, we might be able to answer multiple questions about how we learn and create other complex behaviors, and how we can diagnose and treat learning and memory disorders.”