Dark side of cognitive illusions explored in bias research

A joint team of scientists has explored a cognitive bias known as the “illusion of causality,” finding that the bias is not limited to false beliefs about what was originally learned: it can also prevent new information from being learned later on, even when the original information is false and the newer information is true.

Dr. Helena Matute

“Our ability to associate causes to effects is quite fallible,” Dr. Helena Matute, Professor of Psychology and Director of the Experimental Psychology Laboratory at Universidad de Deusto and lead researcher on the study, told The Speaker. “It often works well, but it very often is subject to illusions.”

The study involved two groups of student volunteers, both of which observed medical patients undergoing drug treatment. The first group (called “high illusion”) saw mostly patients who had taken a drug, and most of those patients recovered. The second group (“low illusion”) saw mostly patients who had not taken the drug, and most of those patients also recovered.

Both groups saw some patients who took the drug and some who did not, but each group saw more of one kind than the other. Across the board, around 70 percent of patients recovered, regardless of whether they took the drug.
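
In the causal-learning literature this study belongs to, the objective drug-outcome relationship is usually summarized with the contingency index ΔP: the recovery rate with the drug minus the recovery rate without it. A minimal sketch in Python (the 70 percent recovery rate is from the article; the trial counts are illustrative assumptions, not the study's actual numbers):

```python
def delta_p(recovered_drug, total_drug, recovered_no_drug, total_no_drug):
    """Contingency index: P(recovery | drug) - P(recovery | no drug)."""
    return recovered_drug / total_drug - recovered_no_drug / total_no_drug

# Phase 1: about 70% of patients recover whether or not they take the drug.
# The high illusion group simply sees far more drug trials than no-drug trials.
print(delta_p(70, 100, 7, 10))  # 0.0 -> the drug makes no difference
```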

The high illusion group more frequently concluded the drug had a helpful effect.

In a second round of experiments, both groups witnessed the same thing: half the patients received the drug and half did not. Those who received the drug recovered 90 percent of the time, while those who did not receive it had only a 70 percent rate of recovery.
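
Reusing delta_p from the sketch above with the second-phase rates (the 90 and 70 percent figures are from the article; the even split of 50 patients per condition is an assumption):

```python
# Phase 2: half the patients receive the drug and half do not.
print(delta_p(45, 50, 35, 50))  # ~0.2 -> a genuine positive contingency
```

An unbiased learner tracking this contingency should now credit the drug with the extra recoveries.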

The high illusion group was less likely to recognize the drug’s effectiveness; its members attributed the recoveries to the drug they had witnessed in the first phase of the study.

The researchers suspect that the high illusion group’s belief that the first drug was effective prevented the group from learning new information from the second round of experiments.

The study has relevance for false medical practices. Matute thinks it is important that people be exposed to accurate medical information early, before quackery has a chance to reach them.

“Yes, it is very important. But it might be even more important to provide people with excellent training on cognitive biases and cause-effect illusions, so that they will be interested in learning scientific methods in general, not just related to medicine. And ideally, this should start quite early in life, maybe before 10, and continue through life. The reason is that you cannot teach people all the details about medicine, present and future, and all the details about all other things they will need to know in their life. That is impossible. But if you teach them to think scientifically, they have the tools to protect themselves against quackery and against many other frauds.”

The researchers ran a test of this idea, too, two years ago.

They convinced a group of teenagers that a metal wristband improved physical and mental abilities and that the teenagers should buy the wristband.

The researchers next ran some of the teenagers through a crash course on what had just happened: they pointed out the weaknesses of the arguments in favor of the wristbands, explained the principle of baseline comparison, and taught them about the illusion of causality.

Afterwards, the researchers had the teenagers play a computer simulation in which they could administer a drug to patients to see whether it was effective. The teenagers who had received the crash course ran more trials without the drug, the baseline comparison needed to tell whether the drug really was effective.
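
The value of those extra no-drug trials is easy to demonstrate. In the sketch below (illustrative only, not the researchers’ actual simulation), patients recover spontaneously 70 percent of the time and the drug does nothing; medicating everyone produces a recovery rate that looks like efficacy, while withholding the drug on some trials exposes the zero contingency:

```python
import random

random.seed(0)

BASE_RATE = 0.7  # spontaneous recovery rate, mirroring the study's first phase

def trial(gave_drug: bool) -> bool:
    """One simulated patient; in this sketch the drug has no effect at all."""
    return random.random() < BASE_RATE

# Untrained strategy: medicate every patient. Recovery looks impressively
# high, but there is no baseline to compare it against.
treated = [trial(True) for _ in range(100)]
print(sum(treated) / len(treated))  # ~0.7, easy to misread as efficacy

# Trained strategy: withhold the drug on some trials to estimate the baseline.
untreated = [trial(False) for _ in range(100)]
print(sum(treated) / len(treated) - sum(untreated) / len(untreated))  # ~0.0
```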

“Teaching scientific methods and scientific thinking to everyone,” Matute said. “Showing people that we are not ready to detect cause-effect relations with the naked eye, showing them that we all suffer illusions, that we often believe that A causes B when they are just co-occurring. Thus, teaching people that we need the help of controlled experiments to test whether a treatment is working. If there is no evidence supporting it, we should be aware that we should not trust our personal experience; it is too biased.”

However, the illusion of causality can affect not just patients, but doctors, too.

“They are humans and are subject to the same cognitive biases as other people. They might feel that a treatment is working when it is not. But they have the scientific literature and reviews of current research to check whether a treatment is supported by evidence.”

“We need to be aware of these mistakes in order to be able to protect ourselves against them,” Matute concluded. “The only protection that we humans have developed against these cause-effect illusions is the scientific method. So, let’s use it. And let’s teach everybody how to use it!”

The study, “The dark side of cognitive illusions: When an illusory belief interferes with the acquisition of evidence-based knowledge,” by Ion Yarritu, Helena Matute, and David Luque, was published in the British Journal of Psychology.
