New Age Islam

Spiritual Meditations (14 March 2017, NewAgeIslam.Com)


Why We Believe Obvious Untruths


By Philip Fernbach and Steven Sloman

March 3, 2017

How can so many people believe things that are demonstrably false? The question has taken on new urgency as the Trump administration propagates falsehoods about voter fraud, climate change and crime statistics that large swaths of the population have bought into. But collective delusion is not new, nor is it the sole province of the political right. Plenty of liberals believe, counter to scientific consensus, that G.M.O.s are poisonous, and that vaccines cause autism.

The situation is vexing because it seems so easy to solve. The truth is obvious if you bother to look for it, right? This line of thinking leads to explanations of the hoodwinked masses that amount to little more than name calling: “Those people are foolish” or “Those people are monsters.”

Such accounts may make us feel good about ourselves, but they are misguided and simplistic: They reflect a misunderstanding of knowledge that focuses too narrowly on what goes on between our ears. Here is the humbler truth: On their own, individuals are not well equipped to separate fact from fiction, and they never will be. Ignorance is our natural state; it is a product of the way the mind works.

What really sets human beings apart is not our individual mental capacity. The secret to our success is our ability to jointly pursue complex goals by dividing cognitive labor. Hunting, trade, agriculture, manufacturing — all of our world-altering innovations — were made possible by this ability. Chimpanzees can surpass young children on numerical and spatial reasoning tasks, but they cannot come close on tasks that require collaborating with another individual to achieve a goal. Each of us knows only a little bit, but together we can achieve remarkable feats.

Knowledge isn’t in my head or in your head. It’s shared.

Consider some simple examples. You know that the earth revolves around the sun. But can you rehearse the astronomical observations and calculations that led to that conclusion? You know that smoking causes cancer. But can you articulate what smoke does to our cells, how cancers form and why some kinds of smoke are more dangerous than others? We’re guessing no. Most of what you “know” — most of what anyone knows — about any topic is a placeholder for information stored elsewhere, in a long-forgotten textbook or in some expert’s head.

One consequence of the fact that knowledge is distributed this way is that being part of a community of knowledge can make people feel as if they understand things they don’t. Recently, one of us ran a series of studies in which we told people about some new scientific discoveries that we fabricated, like rocks that glow. When we said that scientists had not yet explained the glowing rocks and then asked our respondents how well they understood how such rocks glow, they reported not understanding at all — a very natural response given that they knew nothing about the rocks. But when we told another group about the same discovery, only this time claiming that scientists had explained how the rocks glowed, our respondents reported a little bit more understanding. It was as if the scientists’ knowledge (which we never described) had been directly transmitted to them.

The sense of understanding is contagious. The understanding that others have, or claim to have, makes us feel smarter. This happens only when people believe they have access to the relevant information: When our experimental story indicated that the scientists worked for the Army and were keeping the explanation secret, people no longer felt that they had any understanding of why the rocks glowed.

The key point here is not that people are irrational; it’s that this irrationality comes from a very rational place. People fail to distinguish what they know from what others know because it is often impossible to draw sharp boundaries between what knowledge resides in our heads and what resides elsewhere.

This is especially true of divisive political issues. Your mind cannot master and retain sufficiently detailed knowledge about many of them. You must rely on your community. But if you are not aware that you are piggybacking on the knowledge of others, it can lead to hubris.

Recently, for example, there was a vociferous outcry when President Trump and Congress rolled back regulations on the dumping of mining waste in waterways. This may be bad policy, but most people don’t have sufficient expertise to draw that conclusion because evaluating the policy is complicated. Environmental policy is about balancing costs and benefits. In this case, you need to know something about what mining waste does to waterways and in what quantities these effects occur, how much economic activity depends on being able to dump freely, how a decrease in mining activity would be made up for from other energy sources and how environmentally damaging those are, and on and on.

We suspect that most of those people expressing outrage lacked the detailed knowledge necessary to assess the policy. We also suspect that many in Congress who voted for the rollback were equally in the dark. But people seemed pretty confident.

Such collective delusions illustrate both the power and the deep flaw of human thinking. It is remarkable that large groups of people can coalesce around a common belief when few of them individually possess the requisite knowledge to support it. This is how we discovered the Higgs boson and increased the human life span by 30 years in the last century. But the same underlying forces explain why we can come to believe outrageous things, which can lead to equally consequential but disastrous outcomes.

That individual ignorance is our natural state is a bitter pill to swallow. But if we take this medicine, it can be empowering. It can help us differentiate the questions that merit real investigation from those that invite a reactive and superficial analysis. It also can prompt us to demand expertise and nuanced analysis from our leaders, which is the only tried and true way to make effective policy. A better understanding of how little is actually inside our own heads would serve us well.

Philip Fernbach is a cognitive scientist and professor of marketing at the University of Colorado’s Leeds School of Business. Steven Sloman is a professor of cognitive, linguistic and psychological sciences at Brown University. They are the authors of the forthcoming “The Knowledge Illusion: Why We Never Think Alone.”

Source: nytimes.com/2017/03/03/opinion/sunday/why-we-believe-obvious-untruths.html?emc=eta1&_r=0

URL: https://newageislam.com/spiritual-meditations/believe-obvious-untruths/d/110383
