What makes rational people believe irrational things, by Dan Ariely

Published by Heligo Books

In journalism, objectivity is the holy grail, a prize to be pursued but never attained. This is because objectivity is an illusion, and yet understanding why it’s elusive is critically important for journalists, so that they remain aware of – and try to limit – the slants and prejudices in the stories they produce.

The rule, therefore, in any reputable media house is always to find more than one source or opinion for a story and, in contested terrain, to balance a report with diverse and opposing viewpoints. It’s the best shot we have at the unicorn of objectivity.

Enter social media and the yawning infinity of the internet, enhanced by artificial intelligence (AI), and what you have is a huge melting pot of information: best-practice journalism mixed in with a great deal of intentionally fake or skewed news, and few tools, aside from verifying authoritative sources, to discern which is which.

Misbelief, by Dan Ariely, professor of behavioural economics at Duke University in the US, is a deep dive into why misinformation (fake news) and disinformation (propaganda) proliferate like the viruses they are, which in turn provides insight into where your own truth filters may be lacking. By the end of it, you will certainly get to “know thy enemy” – an enemy that, it turns out, lurks within you as much as outside of you.

The premise of his book is a deconstruction of how one might fall for a conspiracy theory, a process he likens to descending a “funnel of misbelief”.

“When first entering this funnel, a person might have a few niggling questions about accepted truths in science, health, politics, the media etc… As people progress down the funnel of misbelief, they reach a point where healthy scepticism evolves into a reflexive mistrust of anything ‘mainstream’, and genuine open-mindedness slides into dysfunctional doubt,” he writes. 

Ariely identifies the constituent elements of the funnel of misbelief as emotional, cognitive, personality and social. Emotions drive many of our actions, he writes, and in the funnel of misbelief they centre on stress and the need to manage it, setting up the conditions for the other elements to come into play.

The cognitive element is how we rationalise something when motivated to move in one or other direction. “Cognitive bias kicks in and leads us to seek information that fills that need, regardless of its accuracy,” he says. “Then the story gets more complex: we construct narratives to get to the conclusions we want to get to.” 

The personality element is the combination of traits that make up a person’s distinctive character, which explains why, for example, two people exposed to the same stressors can react very differently. One personality trait that Ariely spotlights as a protector against misinformation is “intellectual humility”, an openness to changing a belief when there is good enough evidence to question it. Narcissism, on the other hand, lends itself to misbelief, he says, largely because in the narcissist’s world there must always be someone or something else to blame.

The social element in the funnel of misbelief is essentially the pressure to belong, a deep human need. Social groups are solidified by shared and consolidated viewpoints and positions, Ariely says, especially groups formed in response to a stressful event or situation, such as Covid-19.

“Once people reach a certain point in the funnel,” he writes, “they are so deeply embedded in social networks of people who share their views that those networks and the social forces within them begin to play an outsized role in accelerating misbelief and making it very hard for them to escape the funnel.” 

Actions (committing time and resources to a cause) fuel the belief in that cause, even if that cause turns out to be unreliable, Ariely argues. “The vast majority of possible actions are social: talking to other people, protesting, posting unverified information, reacting online, arguing with those who don’t agree, breaking ties with former friends and family, and so on.” 

The most damaging effect of the funnel of misbelief is the erosion of trust, Ariely says, which is one of the basic ingredients of a functioning society. “A lack of trust creates real risks for our ability to work together and overcome future obstacles together… We trust our doctors, lawyers, car mechanics. We trust Amazon with our credit card information.

“We trust that the government will set safety standards for roads, bridges and elevators, and then we trust corporations to comply with these standards. We trust in democracy, the police, the fire department and the justice system… Though the importance of trust in our modern society is somewhat hidden from sight, it does play a crucial role.” 

Psychology and cognitive neuroscience make much of the relevance of pattern recognition, which allows us to predict and expect what is coming, and Ariely draws on this research to help explain why the more we hear something, the more we believe it. 

“The more we encounter a piece of information (or misinformation), the more intensely the piece of information is coded in our brains as familiar and true, and the ‘stickier’ it becomes,” he writes. Joseph Goebbels took advantage of this human quirk when he said, “Repeat a lie often enough and it becomes the truth”. The antidote, Ariely suggests, is to reduce exposure to an untruth and to counter it each time with a truth, along with the evidence supporting that truth.

It’s not as easy as that, though, because of what Ariely describes as “solution aversion”, which is where ideology kicks in. It means that if we don’t like a proposed solution to a problem, we use motivated reasoning to deny that the problem exists in the first place. Propose cutting back fossil fuels as a solution to climate change to someone whose livelihood depends on selling coal, for example, and they may well deny that climate change exists at all.

“If we want to change people’s opinions, we need to understand in more detail where their resistance is coming from,” Ariely writes. “Often it is resistance to the solution, packaged in motivated reasoning, which means that until we come up with solutions that are more acceptable to both sides, one side will not give the information a chance.” 

By now, you’ll have an idea of how complex a person’s belief system is, and why blunting the force of misinformation and disinformation requires a multidisciplinary approach – one that goes far beyond policing social media or treating everything with suspicion (and yes, you do need to be suspicious of everything).

Getting to the root of the problem, then, is what this book mostly sets out to do, rather than laying out a map of how to tackle it. That said, it is peppered with good directives on ways to armour yourself, and Ariely has inserted columns titled “Hopefully Helpful”, containing useful tips such as “don’t assume malice” – that is, resist the instinct, when something bad happens, to imagine it was done intentionally.

One of these tips looks at how to “paradoxically persuade”: if someone tells you that all pharma companies are evil, agree with them, then suggest they stop taking all medications and cancel their medical aid. “This approach turns out to be rather effective at making people reconsider their extreme positions,” he writes, tongue in cheek.

This brings me to my last observation about this book: Misbelief thankfully eschews lofty pontificating in favour of an easy-to-understand, often humorous and anecdotal style, with plenty of relatable examples and research to illustrate and give weight to Ariely’s theories. It’s an important book for our time.

Helen Grange is a seasoned journalist and editor, with a career spanning over 30 years writing and editing for newspapers and magazines in South Africa. Her work appears primarily on Independent Online (IOL), as well as The Citizen and Business Day newspapers, focussing on business trends, women’s empowerment, entrepreneurship and travel. Magazines she has written for include Noseweek, Acumen, Forbes Africa, Wits Business Journal and UJ Alumni magazine. Among NGOs she has written or edited for are Gender Links and INMED, a global humanitarian development organisation.
