The brain is a mysterious place. Even as the field of A.I. attempts to emulate its bio-based neuronal firings, there is much we don't yet know.
The brain weighs just 3 pounds, yet a lot can go wrong among its 100 billion (or so) neurons. Traumatic brain injury is a major cause of death and disability, 1 in 26 Americans will develop epilepsy in their lifetime, and some 5 million Americans are currently living with Alzheimer's.
But imagine a future where patients walk around with neuroprosthetic implants that stimulate brain function and constantly monitor its oscillations, relaying the readings wirelessly to medical experts in real time via body sensors. That scenario isn't far off.
PCMag met with Dr. Nanthia Suthana, Assistant Professor-in-Residence at the David Geffen School of Medicine at UCLA, where she uses VR, motion capture, and brain implants to find out how we encode memories.
Dr. Suthana is still in the relatively early stages of her career, having received her Ph.D. in Neuroscience from UCLA less than a decade ago. So she brings none of the outmoded preconceptions of "how things have always been done." In fact, it was a deep desire to "solve the unsolvable" that first sparked her curiosity about the brain and its working, or malfunctioning, parts.
"When I was a student, studying abroad in Europe, I was inspired by reading about different patient cases which were considered either unsolvable, or a mystery, in terms of why they were exhibiting the symptoms they had, due to some form of brain damage," Dr. Suthana told PCMag. "I was particularly fascinated by the famous case of 'Patient HM', who suffered from epilepsy and had his medial temporal lobe removed, which is crucial for memory formation. In fact, this is the area of the brain I now study."
To further her research, Dr. Suthana worked in collaboration with Nader Shaterian and Interactive Lab to create a "VR Stadium" at UCLA. It's an impressive setup: multiple motion-capture cameras in a rig attached to the ceiling, which interact wirelessly with subjects wearing motion-capture suits, virtual markers/beacons, Samsung Gear VR headsets (and others), and EEG caps with 64-channel electrodes.
The implant, meanwhile, is the RNS System from Silicon Valley-based NeuroPace, which tracks brain activity.
"We take patients through a routine," explained Dr. Suthana. "Observing them navigating spatial environments via locations which are marked, asking them to recall and repeat sequences, and recording data wirelessly as they're moving around. We take the data and analyze theta oscillations and their relationship to the patient's speed, direction, and presence/absence of location markers. We then examine whether, statistically, theta oscillations can predict memory strength."
Because Dr. Suthana's technological/neurological combination in the VR Stadium records data deep within the brain while the patient is in motion, she can see how declarative memory—both semantic (facts/figures) and episodic (personal, e.g. "your last trip to Paris")—is laid down in the brain. Until now, this type of data from inside the hippocampus could be recorded only in rats, or in human patients confined to a hospital bed after brain surgery.
"This work is bridging the gap between animal experiments and what we can now do with humans," confirmed Dr. Suthana. "The implant…was developed to treat seizures, but we're using it for research here; [it's] crucial to our ability to pinpoint what's going on in the brain. Previously, where the EEG records brain waves, as a large signal, the implant, which the surgeon inserts with an electrode, and is about 1.5 millimeters in diameter, records deep in the brain, picking up the voltage fluctuations of neurons, such as theta oscillations, which are about 8 Hz per second. The patients I work with usually have four recording locations on the implant from which I can observe them wirelessly."
By putting patients inside VR environments, which can be controlled and modified, Dr. Suthana is able to see exactly what happens when new memories are laid down and when prior memories are retrieved by the hippocampus for examination and reintegration. Her work will be used to create a computational model of the human medial temporal memory system, which can inform a future generation of neuroprosthetic devices.
"We can use these findings in the future to develop treatments to push that particular brain area into a state that's conducive to learning," she confirmed. "The other aspect, which is harder to do, is to improve old memories, but those were formed a long time ago. The hippocampus already 'did its thing' and it's now less involved because the memory has become very distributed across the brain. Essentially, it [may] be easier to enhance the process of forming memories. So, if a patient has something like Alzheimer's, we'll want to put in the neuroprosthetics early so we can start to stimulate these areas. Any later and there's a lot of degeneration or progressive cell death already."
By 2037, Dr. Suthana predicts she'll have many, many patients in the field using a combination of future technological devices, enabling them to recover from, or at least live with, debilitating brain diseases.
"I envision much more advanced, wireless, battery-less, implants carrying out single cell recording and oscillations of brain waves, as well as modulating brain function through stimulating. My lab will then be able to do so much more in terms of our research with patients as they go about their everyday lives, while we monitor and modify brain activity. There's an exciting future of insideables, or implants, linked to wearables, and emerging telemedicine devices, ahead."