I will not see until I believe: How to learn to change your point of view?

We constantly distort reality in our favor, we rarely notice it, and we admit we were wrong even less often. These weaknesses of human thinking are what make propaganda and advertising work, and manipulation of public opinion on social networks relies on them too. We are especially bad at reasoning about things tied to our beliefs and faith. How can you catch yourself making a mistake?

“Once having accepted any belief, the human mind begins to draw everything in to strengthen and confirm it. Even if there are more examples refuting this belief than confirming it, the intellect either overlooks them or dismisses them as negligible,” wrote the English philosopher Francis Bacon. Anyone who has taken part in internet discussions knows exactly what he meant.

Psychologists have long been trying to explain why we are so reluctant to change our point of view. Bacon's conjecture, advanced nearly four hundred years ago, is now backed up by hundreds of scientific studies. And the better we understand our mental distortions, the more likely we are to learn to resist them.

I won't see until I believe

The limits of human irrationality can only be guessed at. Any psychology student can use a couple of simple tests to prove that you are biased and prejudiced. And this is not about ideology or personal prejudice, but about the most basic mechanisms of our thinking.

In 2018, scientists from the University Medical Center Hamburg-Eppendorf showed experiment participants several videos. The participants had to determine which direction white dots were moving on a black screen. Since many of the dots moved erratically, this was not so easy to do.

The scientists noticed that once participants had made their first decision, they unconsciously stuck to it afterwards. “Our decisions become an incentive to take into account only the information that agrees with them,” the researchers conclude.

This is a well-known cognitive bias called confirmation bias. We seek out data that supports our point of view and ignore anything that contradicts it. In psychology, this effect has been vividly documented across a wide variety of studies.

In 1979, students at the University of Texas were asked to study two academic papers on the death penalty. One argued that the death penalty helps reduce crime; the other refuted this claim. Before the experiment began, participants were asked how they felt about the death penalty, and then they were asked to rate the credibility of each study.

Instead of weighing the arguments of the opposing side, the participants only reinforced their initial opinion. Those who supported the death penalty became even more ardent supporters, and those who opposed it became even more ardent opponents.

In a classic 1975 experiment, Stanford University students were shown pairs of suicide notes. In each pair, one note was fictional and the other had been written by a real suicide victim. The students had to tell the real note from the fake one.

Some of the participants turned out to be excellent detectives, successfully handling 24 pairs out of 25. Others seemed hopeless, correctly identifying only ten notes. In fact, the scientists had deceived the participants: both groups had performed about equally well.

In the second stage, the participants were told that the results were bogus and were asked to estimate how many notes they had actually identified correctly. This is where the fun began. The students in the “good results” group remained confident that they had done well on the task, much better than the average student. Students with “poor scores” continued to believe they had failed miserably.

As the researchers note, "once formed, impressions remain remarkably stable." We refuse to change our point of view, even when it turns out that there is absolutely no basis behind it.

Reality is unpleasant

People do a very bad job of impartially evaluating facts and weighing arguments. Even the most rational judgments in fact arise under the influence of unconscious desires, needs, and preferences. Researchers call this "motivated reasoning." We do our best to avoid cognitive dissonance: the conflict between established opinions and new information.

In the mid-1950s, the American psychologist Leon Festinger studied a small sect whose members believed the end of the world was imminent. The date of the apocalypse was predicted down to a specific day: December 21, 1954. The apocalypse, however, never came that day. Some members began to doubt the prophecy, but the group soon received a message from God, which said: your group radiated so much faith and goodness that you saved the world from destruction.

After this event, the behavior of the sect's members changed dramatically. Where previously they had avoided outside attention, they now began to actively spread their faith. According to Festinger, proselytizing became their way of resolving cognitive dissonance. It was an unconscious but in its own way logical decision: the more people share our beliefs, the more strongly it seems to prove that we are right.

When we see information that is consistent with our beliefs, we feel genuine satisfaction. When we see information that contradicts our beliefs, we perceive it as a threat. Physiological defense mechanisms kick in, and the capacity for rational thinking is suppressed.

It is unpleasant. We are even willing to pay money to avoid confronting opinions that do not fit into our belief system.

In 2017, scientists at the University of Winnipeg asked 200 Americans how they felt about same-sex marriage. Those who supported the idea were offered the following deal: read eight arguments against same-sex marriage and get ten dollars, or read eight arguments in support of same-sex marriage and get only seven dollars. Opponents of same-sex marriage were offered the same deal with the terms reversed.

In both groups, almost two-thirds of the participants agreed to receive less money rather than face the opposing position. Apparently, three dollars is still not enough to overcome our deep reluctance to listen to those who disagree with us.

Of course, we are not always so stubborn. Sometimes we are ready to change our opinion on an issue quickly and painlessly - but only if we are sufficiently indifferent to it in the first place.

In a 2016 experiment, scientists at the University of Southern California presented participants with several neutral statements, for example, "Thomas Edison invented the light bulb." Almost everyone agreed, citing what they had learned at school. Then the participants were shown evidence contradicting the first statement, for example, that other inventors had created electric lighting before Edison (this evidence was fabricated). Faced with the new information, almost everyone changed their original opinion.

In the second part of the experiment, the researchers presented the participants with political statements, for example, "The United States should limit its military spending." This time the reaction was completely different: the participants reinforced their original beliefs rather than questioning them.

“In the political part of the study, we saw a lot of activity in the amygdala and the insular cortex. These are the parts of the brain strongly associated with emotions, feelings, and ego. Identity is an inherently political concept, so when people feel that their identity is being attacked or questioned, they get thrown off course,” the researchers sum up.

Opinions that have become part of our "I" are very difficult to change or refute. We ignore or deny anything that contradicts them. Denial is a basic psychological defense mechanism in stressful and anxious situations that call our identity into question. It is a fairly primitive mechanism: Freud attributed it to children. But sometimes it works miracles.

In 1974, Hiroo Onoda, a second lieutenant in the Japanese army, surrendered to the Philippine authorities. He had been hiding in the jungle on Lubang Island for nearly 30 years, refusing to believe that World War II was over and Japan had been defeated. He believed he was waging a guerrilla war behind enemy lines, although in reality he fought only with the Philippine police and local peasants.

Onoda heard radio reports about the surrender of the Japanese government, the Tokyo Olympics, and the economic miracle, but he dismissed them all as enemy propaganda. He admitted his mistake only when a delegation arrived on the island headed by his former commander, who thirty years earlier had given him the order "not to surrender and not to commit suicide." After the order was rescinded, Onoda returned to Japan, where he was greeted almost as a national hero.

Giving people information that contradicts their beliefs, especially emotionally charged ones, is quite ineffective. Anti-vaxxers believe that vaccines cause autism not simply because they are uneducated. The belief that they know the cause of the disease provides considerable psychological comfort: if greedy pharmaceutical corporations are to blame for everything, at least it is clear whom to be angry with. Scientific evidence offers no such answers.

This does not mean, of course, that we should excuse unfounded and dangerous prejudices. But the methods we use to combat them often produce the opposite result.

If facts don't help, what can help?

How to persuade without facts

In The Enigma of Reason, cognitive scientists Hugo Mercier and Dan Sperber attempt to answer the question of what causes our irrationality. In their view, the main task our minds learned to solve over the course of evolution was life in a social group. We needed reason not to search for truth, but to avoid losing face in front of our fellow tribesmen. We care more about the opinion of the group we belong to than about objective knowledge.

If people feel that something threatens their identity, they are rarely able to take someone else's point of view into account. This is one of the reasons why discussions with political opponents are usually pointless.

“People who are trying to prove something cannot evaluate another person's arguments, because they regard them in advance as an attack on their own picture of the world,” the researchers say.

But even if we are biologically programmed to be narrow-minded conformists, this does not mean that we are doomed.

“People may not want to change, but we have the ability to change, and the fact that many of our self-justifying delusions and blind spots are built into the way the brain works is no excuse to give up trying. The brain also pushes us to eat a lot of sugar, but most of us have learned to eat vegetables with appetite, not just cake. Is the brain wired so that we flare up with anger when we are attacked? Fine, but most of us have learned to count to ten and then find alternatives to the simple decision of pouncing on the other guy with a club.”

- from the book by Carol Tavris and Elliot Aronson, "Mistakes Were Made (But Not by Me)"

The Internet has given us access to huge amounts of information - but it has also allowed us to filter that information so that it confirms our point of view. Social media has connected people around the world - but it has also created filter bubbles that quietly shield us from opinions we do not accept.

Instead of trading arguments and stubbornly defending our opinions, it is better to try to understand how we arrived at a given conclusion. Perhaps we should all learn to conduct dialogues by the Socratic method. The aim of a Socratic dialogue is not to win the argument, but to reflect on the reliability of the methods we use to build our picture of reality.

It is unlikely that the cognitive errors psychologists have found apply only to Stanford students. We are all irrational, and there are reasons for this. We strive to avoid cognitive dissonance, exhibit confirmation bias, and deny our own mistakes while being very critical of the mistakes of others. In the era of "alternative facts" and information wars, it is very important to remember this.

Perhaps truth can be found in dialogue, but first you have to enter into that dialogue. Knowledge about the mechanisms that distort our thinking should be applied not only to opponents, but also to ourselves. If the thought "aha, this fully matches my convictions, therefore it must be true" occurs to you, don't rejoice; instead, look for information that could cast doubt on your conclusion.
