Washington: We all make decisions influenced by existing biases or beliefs, and these decisions can at times seem irrational. But a recent study suggests that the brain is capable of making logical decisions by setting these previously held notions aside.
The research has highlighted the importance that the brain places on the accumulation of evidence during decision-making, as well as how prior knowledge is assessed and updated as the brain incorporates new evidence over time.
According to the study's senior author Michael Shadlen (MD, PhD), we interact with the world every day, and our brains constantly form opinions and beliefs about our surroundings. Sometimes that knowledge is gained through education or through feedback we receive. But in many cases, we learn not from a teacher, but from the accumulation of our own experiences.
For example, consider an oncologist who has to determine the best course of treatment for a patient diagnosed with cancer. Based on her prior knowledge and previous experiences with cancer patients, the doctor may already have an opinion about what combination of treatments to recommend, even before examining the new patient's complete medical history.
But each new patient brings new information, or evidence, that must be weighed against the doctor's prior knowledge and experience. The central question the researchers asked was whether, and to what extent, that prior knowledge is modified when someone is presented with new or conflicting evidence.
The researchers asked human participants to watch a group of dots as they moved across a computer screen, like grains of sand blowing in the wind. Over a series of trials, participants judged whether each new group of dots tended to move to the left or right, a tough decision as the movement patterns were not always immediately clear.
As new groups of dots were shown again and again across several trials, the participants were also asked to judge whether the computer program generating the dots appeared to have an underlying bias.
Without telling the participants, the researchers had indeed programmed a bias into the computer. The movement of the dots was not evenly distributed between rightward and leftward motion but was instead skewed towards one direction. By altering the strength and direction of the bias across different blocks of trials, the researchers could study how people gradually learned the direction of the bias and then incorporated that knowledge into their decision-making.
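To make the block design concrete, here is a minimal sketch in Python of how a hidden directional bias can be programmed into a sequence of trials. The function name and the specific strength values are illustrative assumptions, not the study's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_block(n_trials, bias_strength, bias_direction):
    """Generate one block of motion-direction trials.

    bias_direction: +1 (rightward) or -1 (leftward).
    bias_strength: probability (above 0.5) that a trial's net motion
    follows the biased direction.
    Returns an array of +1/-1 trial directions.
    """
    follows_bias = rng.random(n_trials) < bias_strength
    return np.where(follows_bias, bias_direction, -bias_direction)

# Three blocks with different hidden biases (illustrative values only).
for strength, direction in [(0.8, +1), (0.6, -1), (0.7, +1)]:
    block = make_block(100, strength, direction)
    print(f"bias {direction:+d} at strength {strength:.0%}: "
          f"{np.mean(block == +1):.0%} of trials moved rightward")
```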
The study took two approaches to evaluate how the bias was learned: first, by monitoring the influence of the bias on participants' decisions and their confidence in those decisions; and second, by asking participants to report the most likely direction of movement in a block of trials.
Both approaches demonstrated that the participants used sensory evidence to update their beliefs about the directional bias of the dots, and they did so without being told whether their decisions were correct.
Dr Zylberberg, one of the researchers, said, "Originally, we thought that people were going to show a confirmation bias and interpret ambiguous evidence as favouring their pre-existing beliefs. But instead we found the opposite: people were able to update their beliefs about the bias in a statistically optimal manner."
The researchers argue that this occurred because the participants' brains were considering two situations simultaneously: one in which the bias exists, and another in which it does not.
Dr Wolpert, professor of neuroscience at Columbia University Irving Medical Center, said, "The brain performed counterfactual reasoning by asking 'What would my choice and confidence have been if there were no bias in the motion direction?' Only after doing this did the brain update its estimate of the bias."
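In the spirit of that description, here is a minimal sketch of statistically optimal (Bayesian) updating over competing hypotheses about the bias, including a "no bias" hypothesis carried alongside the biased ones. This is a simplified illustration, not the paper's model: it treats each trial's direction as directly observed, ignoring the sensory noise that made the motion judgments difficult, and the hypothesis probabilities are assumptions of our own.

```python
import numpy as np

rng = np.random.default_rng(1)

# Three hypotheses about the hidden bias; each gives P(trial moves right).
# The specific values are assumptions for illustration.
hypotheses = {"left-bias": 0.3, "no-bias": 0.5, "right-bias": 0.7}
log_post = {h: 0.0 for h in hypotheses}  # uniform prior, in log space

def update(observed_right):
    """One Bayesian update of the belief about the bias after a trial."""
    for h, p_right in hypotheses.items():
        likelihood = p_right if observed_right else 1.0 - p_right
        log_post[h] += np.log(likelihood)

# Simulate 200 trials drawn from a true rightward bias of 0.7.
for _ in range(200):
    update(rng.random() < 0.7)

# Normalize the log posteriors and report the belief in each hypothesis.
z = np.logaddexp.reduce(list(log_post.values()))
for h in hypotheses:
    print(f"P({h} | data) = {np.exp(log_post[h] - z):.3f}")
```

Because the update is done in log space, each trial simply adds a log-likelihood term, which is one common way to model the gradual accumulation of evidence the study describes.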
The researchers were struck by the brain's ability to maintain and move between these multiple, realistic representations with an almost Bayesian-like, mathematical quality.
The findings appeared in the journal Neuron.
ANI