Brain surgeries are opening windows for neuroscientists, but ethical questions abound

In 2019, Kate Folladori spent a month sitting in a hospital room hoping she’d have a seizure. Since her diagnosis with epilepsy nearly 20 years earlier, a series of medications had failed to bring relief. Now, a team at Baylor St. Luke’s Medical Center had placed wire electrodes into her brain to record neural activity. The doctors hoped to learn where her seizures originated—and whether she might be a candidate for tissue-removing surgery or a brain stimulation implant to suppress them.

As the weeks wore on, Folladori got restless. Time became warped by boredom, and her surroundings felt surreal. “One moment that I remember specifically was it was raining outside … and it felt to me like I was watching a television show where it was raining.”

Breaking the monotony were visits from a group of neuroscientists who recorded activity in Folladori’s brain while she did simple tasks. She might press a button when a cue appeared on a computer screen or watch short videos intended to evoke different moods. The studies weren’t aimed at helping Folladori or even at treating epilepsy; they addressed more basic questions about vision and emotion in the brain. But for Folladori, they were a rare bright spot. “[Having] people from the outside to make you laugh, and to give you something to do, and to give you a goal—that was everything to me,” she says.

Folladori, in turn, offered something rare and valuable to the research team, led by her neurosurgeon, Sameer Sheth of Baylor College of Medicine. The intimate view of brain activity the scientists gleaned from those tests is impossible without invasive surgery, which would be unethical to perform solely for research’s sake.

To try to pinpoint the source of her seizures, Kate Folladori spent weeks with electrodes in her brain, which allowed her to participate in research. (Photo: Joe Folladori)

People who take part in these intracranial studies—often during epilepsy monitoring or brain surgery performed while the patient is awake—“are giving an invaluable gift,” says Khara Ramos, a former director of the neuroethics program at the National Institutes of Health (NIH) who is now at the Dana Foundation. Noninvasive methods of studying brain function, such as functional magnetic resonance imaging and electroencephalography, can “give you good spatial resolution or good temporal resolution, but not both,” she says. But a fine wire placed in contact with brain tissue can detect the activity of neurons with millimeter and millisecond precision. And researchers can relate that activity to a person’s real-time report of the experience.

“We can essentially gain access to the very basic neural mechanism of the human condition,” says Itzhak Fried, a neurosurgeon at the University of California (UC), Los Angeles.

Thanks partly to the rise of invasive brain stimulation treatments for diseases such as Parkinson’s and epilepsy and to a recent U.S. federal funding program, intracranial human neuroscience is burgeoning. “There has been a significant expansion of groups that are capable of doing this work,” says Winston Chiong, a neurologist and ethicist at UC San Francisco.

But the research opportunities that come with intimate access to people’s brains also raise complex ethical issues. Basic science studies tacked onto medical procedures typically offer no clinical benefit to participants. People are often recruited into a study as they prepare for serious surgery, sometimes by an investigator who is also their surgeon.

“There is a really unique situation of vulnerability that patients are in,” Chiong says.

He and others have raised questions about how to verify that patients’ participation really is voluntary, how to make clear to participants that the research is separate from clinical care, and how to ensure that researchers’ desire to collect useful data doesn’t compromise or interfere with that care.

Those concerns have motivated one group of researchers to develop a set of ethical commitments to guide studies in the field, published today in Neuron. “I’ve been heartened by the conscientiousness of the neurosurgical community that we have,” Chiong says, “but there’s certainly opportunities for abuse.”

Piggybacking on a surgery to explore basic brain function isn’t new. Starting in the 1930s, Canadian neurosurgeon Wilder Penfield treated patients for epilepsy by removing small regions of the brain. During the operation, he also explored their exposed brains, stimulating the tissue with an electrical probe and asking the patients, who were awake, what they experienced. Such experiments led to the famous homunculus: a map of which brain regions represent various body parts.

In the past 20 years, researchers have benefited from the rise of other skull-penetrating medical treatments. Those include Folladori’s seizure-monitoring electrodes and implanted devices that deliver electrical stimulation to stop seizures, treat severe obsessive-compulsive disorder, and control symptoms of movement disorders such as Parkinson’s disease. Implanted stimulation devices are also being studied for other conditions, including posttraumatic stress disorder and depression.

A view from beneath the skull

Scientists can run invasive studies of the human brain only in special cases. Medical devices implanted to assess or treat certain conditions offer the chance to gather additional data for research. Listening in on neurons at close range can yield basic insights into brain function.

Examples of different implanted devices. (Graphic: C. Bickel/Science)

Awake surgeries to insert such devices or resect tumors can sometimes be paused briefly for an unrelated experiment. Fried estimates roughly 30 groups in North America now do intracranial human neuroscience in epilepsy surgery patients—up from fewer than 10 when he started in the field, about 20 years ago.

Researchers can also tap into therapeutic devices that stay in the brain long-term, some of which both deliver electrical stimulation and read out neural activity. Such implants are still underused sources of neural data, says UC Los Angeles (UCLA) neuroscientist Nanthia Suthana, who has used their recordings to study learning, memory, and spatial navigation. Another rare opportunity comes from people with paralysis or limb loss. Some of these patients agree to have neural recording devices implanted for research studies that may lead to new brain-computer interface approaches to restore lost movement or communication.

Intracranial research faces a unique set of constraints. For one, researchers typically can’t record from any brain region they want. “We adjust our question to where the electrodes are,” Fried says.

Because a brain region above the ears called the temporal lobe is among the most common sites of seizures, Fried and others have designed much of their research around its functions, which include memory and language processing. For example, recordings by Fried’s team in epilepsy patients have revealed the underpinnings of the “memory moment”—when neurons encoding a memory activate, about 1 second before a person reports that memory coming to mind.

The precise locations of electrodes also vary between patients, making data hard to align across participants, notes Evelina Fedorenko, a neuroscientist at the Massachusetts Institute of Technology. Her team relies on intracranial recordings to study how the brain uses both general and language-specialized mechanisms to understand language. Another issue for the field, she says, is that because eligible participants are scarce, there’s little incentive to conduct experiments that aim to replicate previous results rather than break new ground. “People just want to test whatever new cool hypothesis they have,” Fedorenko says.

In a further challenge, many powerful research tools used in lab animals, including genetic manipulation of brain cells, are simply off limits in people. When grant applications for human intracranial research come up for review, says Jim Gnadt, a program director at the National Institute of Neurological Disorders and Stroke, “it’s hard for them to compete with the critter studies because they’re not as invasive, they’re not as modern.” So in 2017, the NIH neuroscience technology initiative, Brain Research Through Advancing Innovative Neurotechnologies (BRAIN), created a program specifically to fund research that takes advantage of intracranial human recordings and to encourage interdisciplinary collaboration.

A consortium of investigators supported by the program has become a key forum for ethical discussions, and in the current Neuron paper, they lay out an ethical framework. Chiong, who was not involved in writing the paper, thinks other researchers will take it seriously. “There’s going to be a fair amount of pressure to make sure you’re operating within that framework,” he says. “Investigators are kind of looking around at what other people are doing and wanting to be sure that everybody’s playing by the same rules.”

One tenet of the new paper: Scientific considerations should not influence clinical decisions. That guideline might sound straightforward. But for some procedures, including implanting epilepsy monitoring electrodes, multiple methods are acceptable, says Nader Pouratian, a member of the consortium and a neurosurgeon at the University of Texas Southwestern Medical Center. Surgeons use their discretion in clinical decisions that, in turn, influence what research data can be collected.

For example, debate is ongoing in deep brain stimulation (DBS) surgery about whether patients should be under general anesthesia or awake for part of the procedure, Sheth notes. Many doctors have switched to asleep procedures for patient comfort and convenience, he says, whereas other clinicians assert that having patients responsive as surgeons determine where to place the implant can lead to better outcomes.

Unresponsive patients can’t answer questions or do tasks for a research study. When asleep DBS surgery became standard at Sheth’s center in 2019, he was faced with asking patients to agree to an awake surgery that was “still a very appropriate way of doing it, but not how I usually do it.” Uncomfortable with posing that question, Sheth stopped doing research involving such patients.

How to recruit participants into those studies is itself fraught. Bioethicists have long discouraged “dual-role consent,” in which a physician who is also a study investigator invites a patient to participate. Patients may feel a sense of obligation or obedience to the physician in charge of their care, the thinking goes—and may misinterpret the study as having therapeutic benefit.

But Pouratian says some investigators in the NIH consortium argued that they were best placed to consult with patients and obtain consent because they understood both the study and the complexities of the brain surgery itself. He himself feels “a little conflicted” about leading the consent discussion. “They’re my patients—of course they’re going to want to consent for me,” he says. Pouratian and UCLA bioethicist Ashley Feinsinger have an NIH grant to study the motivations of participants in nontherapeutic intracranial studies and their perceptions of risks and benefits. Feedback so far suggests trust in a physician or researcher plays an important role in how patients think about their participation.

Folladori can attest to that. “I really liked Dr. Sheth, and that was part of why I wanted to [participate in research],” she says. “If someone else had asked, I don’t think I would have said no, but I wonder if my feelings about it going in would have been different.”

Pouratian, Sheth, and others now use a hybrid consent process: A surgeon introduces the study and is available to answer questions, but another member of the study team not involved in the patient’s care walks through the consent documents and the signing process. The new Neuron paper says the consent process can vary across studies and institutions, “as long as the distinction between clinical care and research is explicit.”

Measuring and communicating risk is also challenging. Most researchers agree that asking a person to play a few computer games or answer questions in the epilepsy monitoring unit carries little risk beyond fatigue. Harder to quantify is the risk of experiments done midsurgery, which can extend a patient’s time in the operating room, typically by 20 to 30 minutes.

Very long surgeries are associated with higher rates of infection than short ones, Sheth notes. But how much additional risk comes from extending a surgery from 3.5 hours to 4? “One could assume it’s very small,” he says, “but it may not be zero.”

Sometimes researchers temporarily place an extra strip of electrodes over the surface of the brain during a surgery to collect more data. “I’m very clear [with patients] that we’re doing something additionally that we normally would not do,” Pouratian says of those situations. In a study published in April 2021 in Neurosurgery, he and colleagues analyzed 367 surgeries to implant DBS devices. Temporary placement of additional electrodes for research didn’t come with higher rates of complications such as bleeding in the brain, they found. Yet Pouratian still tells patients this step carries risks. “There’s a risk with everything we do, clinically speaking,” he says, even if “it doesn’t increase the overall adverse event rate.”

How well patients understand and remember what they are told about risks and other study details is another uncertainty. Neuroethicist Anna Wexler of the Perelman School of Medicine at the University of Pennsylvania and her team surveyed 22 people with Parkinson’s who had agreed to participate in research during surgery. (The study recorded brain activity during eye movements to explore how the brain makes rapid, flexible decisions.) Encouragingly, no participants had the erroneous impression that the study held direct benefits for them.

But by about 1 week after the informed-consent process, only about 23% could recall either of the two study risks communicated to them—an increased risk of infection and a potential loss of confidentiality associated with sharing their data.

Wexler notes that the patients might have understood the risks better at the time they were communicated. She adds that little prior evidence is available for comparison on how well patients with Parkinson’s recall information about either research or their treatment.

Still, the authors suggest future studies might explore ways to improve understanding and retention, such as a “teach-back” approach, in which participants explain details of the consent form to study staff.

Feinsinger and Pouratian are pursuing a different question: What do patients see as the value of joining these studies? At the annual meeting of the International Neuroethics Society in November 2021, Feinsinger presented feedback from interviews with 14 people, conducted between 2 months and 2 years after they took part in nontherapeutic research during implantation of DBS electrodes for a movement disorder. The conversations revealed a strong faith that basic science would pay off in future treatments for their own or other brain diseases.

That was the case for Corey Westgate, who took part in studies by Suthana’s group at UCLA that relied on readouts from Westgate’s implanted seizure-preventing device. After decades of struggling with convulsive seizures, “I want this to stop,” she says, “and if me doing research can help that, then I would love to do it as much as I can.”

Suthana says the studies weren’t focused on treating epilepsy; they explored how the brain navigates through space and remembers landmarks. But the results could improve understanding of memory impairments common in epilepsy patients, she says.

Feinsinger notes that researchers need to make sure patients’ hopes are realistic. “Are we responsible for inferences patients will make about the translational likelihood of this research?” she asks. “I think to some extent, we are.”

For Folladori, a month in the epilepsy monitoring unit allowed her to participate in several studies, but she never had the seizure her doctors were waiting for. Fortunately, they used signs of abnormal activity from some of the implanted electrodes, among other clues, to find a target for a stimulation device that has kept her seizure-free for 2 years.

The experience has shaped her attitude toward research. “The reason I’m here is because of the scientific process that took place before me,” she says. “If I can participate in that in any way whatsoever, then I will absolutely do that.”

“Well,” she adds, reconsidering, “maybe not any way whatsoever.”

source: sciencemag.org