Making people confirm our favored conclusions

Most of us have ways of making other people confirm our favored conclusions without ever engaging them in conversation. Consider this: To be a great driver, lover, or chef, we don’t need to be able to parallel park while blindfolded, make ten thousand maidens swoon with a single pucker, or create a pâte feuilletée so intoxicating that the entire population of France instantly abandons its national cuisine and swears allegiance to our kitchen. Rather, we simply need to park, kiss, and bake better than most other folks do. How do we know how well most other folks do? Why, we look around, of course—but in order to make sure that we see what we want to see, we look around selectively.

For example, volunteers in one study took a test that ostensibly measured their social sensitivity and were then told that they had flubbed the majority of the questions. When these volunteers were then given an opportunity to look over the test results of other people who had performed better or worse than they had, they ignored the tests of the people who had done better and instead spent their time looking over the tests of the people who had done worse.

The bottom line is this: The brain and the eye may have a contractual relationship in which the brain has agreed to believe what the eye sees, but in return the eye has agreed to look for what the brain wants.

Daniel Gilbert, Stumbling on Happiness

Going to places where you don’t normally go

An old joke says that if you torture the data long enough, it will confess. With enough work, you can distort data to make it say what you want it to say.

We all hold some beliefs, and that’s fine. It’s all part of being human. What’s not OK, though, is when we let those beliefs inadvertently get in the way of how we form our hypotheses.

We can see this tendency in our everyday lives. We often interpret new information in a way that makes it compatible with our existing beliefs. We read the news on sites that conform most closely to our views. We talk to people who are like us and hold similar opinions. We avoid disconfirming evidence because it might lead us to change our worldview, which we may be afraid to do.

One way to fight this bias is to critically examine all your beliefs and try to find disconfirming evidence for each of your theories. By that, I mean actively seeking out such evidence by going to places where you don’t normally go, talking to people you don’t normally talk to, and generally keeping an open mind.

Rahul Agarwal writing in Built In

Motivated reasoning

Motivated reasoning is thinking through a topic with the aim, conscious or unconscious, of reaching a particular kind of conclusion. In a football game, we see the fouls committed by the other team but overlook the sins of our own side. We are more likely to notice what we want to notice. Experts are not immune to motivated reasoning. Under some circumstances their expertise can even become a disadvantage. 

People with deeper expertise are better equipped to spot deception, but if they fall into the trap of motivated reasoning, they are able to muster more reasons to believe whatever they really wish to believe.

Tim Harford, How to Make the World Add Up

Motivated reasoning

When we identify too strongly with a deeply held belief, idea, or outcome, a plethora of cognitive biases can rear their ugly heads. Take confirmation bias, for example. This is our inclination to eagerly accept any information that confirms our opinion, and undervalue anything that contradicts it. It’s remarkably easy to spot in other people (especially those you don’t agree with politically), but extremely hard to spot in ourselves because the biasing happens unconsciously. But it’s always there. 

Criminal cases where jurors unconsciously ignore exonerating evidence and send an innocent person to jail because of a bad experience with someone of the defendant’s demographic. The growing inability to hear alternative arguments in good faith from other parts of the political spectrum. Conspiracy theorists swallowing any unconventional belief they can get their hands on.

We all have some deeply held belief that immediately puts us on the defensive. Defensiveness doesn’t mean that belief is actually incorrect. But it does mean we’re vulnerable to bad reasoning around it. And if you can learn to identify the emotional warning signs in yourself, you stand a better chance of evaluating the other side’s evidence or arguments more objectively.

Liv Boeree writing in Vox