University of California, Berkeley neuroscientists have tracked the progress of a thought through the brain, showing clearly how the prefrontal cortex at the front of the brain coordinates activity to help us act in response to a perception.

Recording the electrical activity of neurons directly from the surface of the brain, the scientists found that for a simple task, such as repeating a word presented visually or aurally, the visual and auditory cortices reacted first to perceive the word. The prefrontal cortex then kicked in to interpret the meaning, followed by activation of the motor cortex in preparation for a response. During the half-second between stimulus and response, the prefrontal cortex remained active to coordinate all the other brain areas.

For a particularly hard task, like determining the antonym of a word, the brain required several seconds to respond, during which the prefrontal cortex recruited other areas of the brain, presumably including memory networks not directly visible to the electrodes. Only then did the prefrontal cortex hand off to the motor cortex to generate a spoken response. The quicker the handoff, the faster people responded.

Interestingly, the researchers found that the brain began to prepare the motor areas to respond very early, during initial stimulus presentation, suggesting that we get ready to respond even before we know what the response will be.

“This might explain why people sometimes say things before they think,” said Avgusta Shestyuk, a senior researcher in UC Berkeley’s Helen Wills Neuroscience Institute and lead author of a paper reporting the results in the current issue of Nature Human Behaviour.

The findings, including the key role played by the prefrontal cortex in coordinating all the activated regions of the brain, are in line with what neuroscientists have pieced together over the past decades from studies in monkeys and humans.

“These very selective studies have found that the frontal cortex is the orchestrator, linking things together for a final output,” said co-author Robert Knight, a UC Berkeley professor of psychology and neuroscience and a professor of neurology and neurosurgery at UCSF. “Here we have eight different experiments, some where the patients have to talk and others where they have to push a button, where some are visual and others auditory, and all found a universal signature of activity centered in the prefrontal lobe that links perception and action. It’s the glue of cognition.”

While other neuroscientists have used functional magnetic resonance imaging (fMRI) and electroencephalography (EEG) to record activity in the thinking brain, the UC Berkeley scientists employed a much more precise technique, electrocorticography (ECoG), which records from several hundred electrodes placed on the brain surface and detects activity in the thin outer region, the cortex, where thinking occurs. ECoG provides better time resolution than fMRI and better spatial resolution than EEG, but requires access to epilepsy patients undergoing highly invasive surgery involving opening the skull to pinpoint the location of seizures.

Clues from epilepsy patients

The current study employed 16 epilepsy patients who agreed to participate in experiments while undergoing epilepsy surgery at UC San Francisco and California Pacific Medical Center in San Francisco, Stanford University in Palo Alto and Johns Hopkins University in Baltimore.

“This is the first step in looking at how people think and how people come up with different decisions; how people basically behave,” said Shestyuk, who recorded from the first patient 10 years ago. “We are trying to look at that little window of time between when things happen in the environment and us behaving in response to it.”

Once the electrodes were placed on the brains of each patient, Shestyuk and her colleagues conducted a series of eight tasks that included visual and auditory stimuli. The tasks ranged from simple, such as repeating a word or identifying the gender of a face or a voice, to complex, such as determining a facial emotion, uttering the antonym of a word or assessing whether an adjective describes the patient’s personality.

During these tasks, the brain showed four different types of neural activity. Initially, sensory areas of the auditory and visual cortex activated to process audible or visual cues. Next, areas primarily in the sensory and prefrontal cortices activated to extract the meaning of the stimulus. The prefrontal cortex remained active throughout, coordinating input from different areas of the brain. Finally, the prefrontal cortex stood down as the motor cortex activated to generate a spoken response or an action, such as pushing a button.

“This persistent activity, primarily seen in the prefrontal cortex, is a multitasking activity,” Shestyuk said. “fMRI studies often find that when a task gets progressively harder, we see more activity in the brain, and the prefrontal cortex in particular. Here, we are able to see that this is not because the neurons are working really, really hard and firing all the time, but rather, more areas of the cortex are getting recruited.”

In sum, Knight said, “Sustained activity in the prefrontal cortex is what guides a perception into an action.”

Video: https://www.youtube.com/watch?v=sU8SZwNK9QE

Video: https://www.youtube.com/watch?time_continue=2&v=PgU4s_U2Lb4

Story Source

Materials provided by University of California – Berkeley. Note: Content may be edited for style and length.


Journal Reference:

  1. Matar Haller, John Case, Nathan E. Crone, Edward F. Chang, David King-Stephens, Kenneth D. Laxer, Peter B. Weber, Josef Parvizi, Robert T. Knight, Avgusta Y. Shestyuk. Persistent neuronal activity in human prefrontal cortex links perception and action. Nature Human Behaviour, 2017; 2 (1): 80. DOI: 10.1038/s41562-017-0267-2
