New tool may decode what you are seeing in real time

New York: Researchers, including one of Indian origin, have developed new computational tools that can instantly decode brain signals involved in viewing images.

“We were trying to understand, first, how the human brain perceives objects in the temporal lobe, and second, how one could use a computer to extract and predict what someone is seeing in real time?” explained Rajesh Rao from the University of Washington.

The researchers found that electrodes in patients’ temporal lobes carry information that, when analysed, enables scientists to predict what object patients are seeing.

“Clinically, you could think of our result as a proof of concept toward building a communication mechanism for patients who are paralysed or have had a stroke and are completely locked-in,” he said.

Temporal lobes process sensory input and are a common site of epileptic seizures. Situated behind mammals’ eyes and ears, the lobes are also involved in Alzheimer’s disease and other dementias, and appear somewhat more vulnerable than other brain structures to head trauma.

The research was published in the journal PLOS Computational Biology.

The study involved seven epilepsy patients receiving care at a Seattle hospital.

In the experiment, electrodes from multiple temporal-lobe locations were connected to powerful computational software that extracted two characteristic properties of the brain signal: “event-related potentials” and “broadband spectral changes.”

Rao characterised the former as likely arising from “hundreds of thousands of neurons being co-activated when an image is first presented,” and the latter as “continued processing after the initial wave of information.”
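The two signal properties the researchers extracted can be illustrated in a few lines of code. The following is a hypothetical sketch on simulated data, not the study's actual pipeline: the sampling rate, trial count, and frequency band are assumptions chosen only for illustration.

```python
import numpy as np

FS = 1000                      # sampling rate in Hz (assumed)
rng = np.random.default_rng(0)

# Simulate 40 one-second trials of single-electrode voltage
# recorded after an image is shown.
trials = rng.normal(0.0, 1.0, size=(40, FS))

# 1) Event-related potential: voltage averaged across trials,
#    capturing neurons co-activated when the image first appears.
erp = trials.mean(axis=0)                          # shape (FS,)

# 2) Broadband spectral change: mean high-frequency power per trial,
#    reflecting continued processing after the initial wave.
spectrum = np.abs(np.fft.rfft(trials, axis=1)) ** 2
freqs = np.fft.rfftfreq(FS, d=1.0 / FS)
band = (freqs >= 70) & (freqs <= 150)              # band edges assumed
broadband_power = spectrum[:, band].mean(axis=1)   # one value per trial

print(erp.shape, broadband_power.shape)
```

Averaging across trials emphasises the time-locked response, while per-trial high-frequency power tracks ongoing activity; together they give a feature vector per trial that a decoder can classify.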

Using electrodes implanted in the temporal lobes of awake patients, the scientists decoded brain signals at nearly the speed of perception.

Further, analysis of patients’ neural responses to two categories of visual stimuli – images of faces and houses – enabled the scientists to subsequently predict which images the patients were viewing, and when, with better than 95 percent accuracy.
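The prediction step amounts to a two-class decoding problem: given a trial's feature vector, decide whether the patient was viewing a face or a house. The paper's actual decoder is not described here; the sketch below uses a simple nearest-centroid rule on synthetic, well-separated data purely to show the shape of such an analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic per-trial feature vectors (e.g. ERP samples plus broadband
# power): 50 "face" and 50 "house" trials, separated by a mean shift.
faces = rng.normal(0.0, 1.0, size=(50, 8))
houses = rng.normal(2.0, 1.0, size=(50, 8))

X = np.vstack([faces, houses])
y = np.array([0] * 50 + [1] * 50)      # 0 = face, 1 = house

# Hold out the last 10 trials of each class for testing.
train = np.r_[0:40, 50:90]
test = np.r_[40:50, 90:100]

# Fit: one centroid per class from the training trials.
centroids = np.stack([X[train][y[train] == c].mean(axis=0) for c in (0, 1)])

# Predict: assign each held-out trial to the nearest centroid.
dists = np.linalg.norm(X[test][:, None, :] - centroids[None], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y[test]).mean()
print(accuracy)
```

On real neural data the classes overlap far more than in this toy example, which is what makes the reported better-than-95-percent accuracy, at near-perceptual speed, notable.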

IANS