You Can’t Opt Out of Life

Imagine that three people see a twenty-dollar bill on the front seat of an unlocked car. Each person walks past and leaves the cash there. Why? The first person wanted to take the money but passed up the opportunity for fear of punishment if caught in the act. The second rejected the temptation out of a conviction that God makes certain rules that people are to follow, and one of those rules is that we shouldn’t take things that don’t belong to us. The third refrained from taking the money because of empathy—awareness of how frustrated and angry she herself would be if some of her money were stolen.

The action is the same for each individual—no one took the money. But people do things for reasons, and the reasons behind the same action in the case above vary significantly. The bumper-sticker-sized version of the first person’s ethics is “Whatever you do, don’t get caught,” while that of the second person is “Thou shalt not steal.” The final person builds her morality around “Do unto others as you would have them do unto you.” These different reasons grow out of differences in theories about what constitutes right behavior.

Though none of the three people may have been immediately conscious of these theories at work, the theories were there, and they guided each person’s behavior.

Also consider the motives or the reasons behind the action. Why they did what they did—the theoretical basis of their actions—is significant.

The reality is we must make decisions about the ethical issues confronting us, and we must have a theoretical foundation on which to build and evaluate these decisions.

In other words, the issue is not whether we have a theory, but whether we are conscious of the theory we do have and believe it is the best available guide for our life. We do not choose to be ethicists; we cannot opt out of that. The real question is whether we are going to be good ethicists.

Steve Wilkens, Beyond Bumper Sticker Ethics

Bumper Sticker Catch Phrases

We need to be careful about staking the important ethical decisions in our lives on bumper sticker catch phrases. The problem is that the ideas expressed in these bite-sized pronouncements have broader implications.

While the ethical aspect that is explicit in the bumper sticker may look good at first glance, other ideas that follow from it may not be so attractive. Most of us have heard or used the cliché “When in Rome, do as the Romans do,” and it can sound like worthwhile advice. But what if the standard practices of the “Romans” stand in direct conflict with your moral or religious convictions? This is why we need to get behind the cliché itself.

Before we commit ourselves to any bumper sticker, we want to make certain that we can accept all that is implied in the slogan.

Steve Wilkens, Beyond Bumper Sticker Ethics

A new approach to lie detection

Researchers from the University of Amsterdam's Leugenlab (Lie Lab) have developed a new approach to lie detection through a series of lab experiments.

Participants were free to use all possible signals—from looking people in the eye to looking for nervous behavior or a particularly emotional story—to assess whether someone was lying.

In this situation, they found it difficult to distinguish lies from truths and scarcely performed above chance level. When instructed to rely only on the amount of detail (place, person, time, location) in the story, they were consistently able to discern lies from truths.

Bachelor's students from the UvA and Master's students from the UvA and the UM carried out data collection, control experiments and replication studies for the research in the context of their theses. 

Read more online at The University of Amsterdam

Your Inner Voice Can Mislead You

It’s very disturbing when you realize that our brains are fiction-making machines. We make up all kinds of crazy things to help us feel better and to justify the decisions that we’ve made. The inner voice is the one who arbitrates a lot of that maneuvering around the truth, so we have to be very careful. It’s a master storyteller and far more important than you may realize.

Jim Loehr, performance psychologist and cofounder of the Human Performance Institute, quoted in Fast Company

Imagination inflation can lead to false memories

Imagination inflation refers to the tendency of people who, when asked to imagine an event vividly, sometimes begin to believe, when asked about it later, that the event actually occurred. Adults who were asked "Did you ever break a window with your hand?" were more likely on a later life inventory to report that they believed this event had occurred during their lifetimes. It seems that asking the question led them to imagine the event, and the act of having imagined it had the effect, later, of making them more likely to think it had occurred (relative to another group who answered the question without having previously imagined it occurring).

Accounts that sound familiar can create the feeling of knowing and be mistaken for truth. This is one reason that political or advertising claims that are not factual but are repeated can gain traction with the public, particularly if they have emotional resonance. Something you once heard that you hear again later carries a warmth of familiarity that can be mistaken for memory, a shred of something you once knew and cannot quite place but are inclined to believe. In the world of propaganda, this is called "the big lie" technique—even a big lie told repeatedly can come to be accepted as truth.

Peter C. Brown and Henry L. Roediger III, Make It Stick: The Science of Successful Learning

What does it mean for a human being to possess the truth?

Kierkegaard’s concern is really not with the adequacy of a philosophical theory of truth, but with the question of what it means for a human being to possess the truth. To grasp the significance of this, we must not think of truth in the way characteristic of contemporary philosophy, focusing on the properties of propositions, but in the way ancient thinkers conceived of truth. For Socrates and Plato, at least as Kierkegaard understood them, having the truth meant having the key to human life, possessing that which makes it possible to live life as it was intended to be lived.

C Stephen Evans, Introduction: Kierkegaard’s life and works

Availability bias

People give their own memories and experiences more credence than they deserve, making it hard to accept new ideas and theories. Psychologists call this quirk the availability bias. It’s a useful built-in shortcut when you need to make quick decisions and don’t have time to critically analyze lots of data, but it messes with your fact-checking skills.

Marc Zimmer writing in The Conversation

Why are conspiracy theories popular?

People who feel powerless or vulnerable are more likely to endorse and spread conspiracy theories. This is seen in online forums where people’s perceived level of threat is strongly linked to proposing conspiracy theories. Conspiracy theories allow people to cope with threatening events by focusing blame on a set of conspirators. People find it difficult to accept that “big” events (e.g., the death of Princess Diana) can have an ordinary cause (driving while intoxicated). A conspiracy theory satisfies the need for a “big” event to have a big cause, such as a conspiracy involving MI5 to assassinate Princess Diana. For the same reason, people tend to propose conspiratorial explanations for events that are highly unlikely. Conspiracy theories act as a coping mechanism to help people handle uncertainty.

Stephan Lewandowsky & John Cook, The Conspiracy Theory Handbook

How does this information make me feel?

We don’t need to become emotionless processors of numerical information – just noticing our emotions and taking them into account may often be enough to improve our judgment. Rather than requiring superhuman control of our emotions, we need simply to develop good habits. Ask yourself: how does this information make me feel? Do I feel vindicated or smug? Anxious, angry or afraid? Am I in denial, scrambling to find a reason to dismiss the claim?

Before I repeat any statistical claim, I first try to take note of how it makes me feel. It’s not a foolproof method against tricking myself, but it’s a habit that does little harm, and is sometimes a great deal of help. Our emotions are powerful. We can’t make them vanish, and nor should we want to. But we can, and should, try to notice when they are clouding our judgment.

Tim Harford, How to Make the World Add Up

The Backfire Effect

Once something is added to your collection of beliefs, you protect it from harm. You do it instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens them instead. Over time, the backfire effect helps make you less skeptical of those things which allow you to continue seeing your beliefs and attitudes as true and proper.

David McRaney  

You Have Your Truth, I Have Mine

Some people... maintain that morality is not dependent on society but rather on the individual. “Morality is in the eye of the beholder.” They treat morality like taste or aesthetic judgments: person-relative.

On the basis of (moral) subjectivism, Adolf Hitler and serial murderer Ted Bundy could be considered as moral as Gandhi, as long as each lived by his own standards, whatever those might be.

Although many students say they espouse subjectivism, there is evidence that it conflicts with others of their moral views. They typically condemn Hitler as an evil man for his genocidal policies. A contradiction seems to exist between subjectivism and the very concept of morality.

Louis Pojman, Ethical Theory

Tell me a story

We naturally avoid ambiguity. We want black and white, right or left, up or down. The greys of life are so distasteful that when a cause is attached to any set of facts, we assume the "facts" are more likely to have really happened.

Nassim Taleb in his book The Black Swan points out that if you ask someone, "How many people are likely to have lung cancer in the U.S.?" you might get a response like "half a million." But if you make one change to the question and ask, "How many people are likely to have lung cancer in the U.S. because of smoking cigarettes?" you would get a much higher number. Why is that? Taleb suggests we tend to believe an idea is more likely to be true when a cause is attached to it.

Joey seemed happily married but killed his wife.

Joey seemed happily married but killed his wife to get her inheritance. 

The first statement is broader and accommodates more possibilities. The second statement is more specific and less likely to be true. But if you ask people which is more likely, more of them would say the second statement. Why? The second statement tells us a story.

The narrative misguides us. We want an explanation, a back story. That's why it’s hard for us to look at a series of facts without weaving an explanation into them and tying the facts to the because. We like a good story, even when it misleads us about what is true. That's why you should be careful whenever you come across a because. Connecting causes to particular events must be handled with care.

Stephen Goforth

Casting Doubt

A lot of people still think of propaganda as the art of making lies sound truthful, but...they want to make truthfulness an irrelevant category. It’s not about proving something, it’s about casting doubt. Most political ideologies have not been about casting doubt — they’ve claimed to be telling the truth about the way the world is or should be. But this new propaganda is different. Putin isn’t selling a wonderful communist future. He’s saying, we live in a dark world, the truth is unknowable, the truth is always subjective, you never know what it is, and you, the little guy, will never be able to make sense of it all — so you need to follow a strong leader.

Sean Illing & Peter Pomerantsev in a Vox interview

 

When the facts change

According to David Perkins of Harvard University, the brighter people are, the more deftly they can conjure up post-hoc justifications for arguments that back their own side. Brainboxes are as likely as anyone else to ignore facts which support their foes. John Maynard Keynes, a (famously intelligent) British economist, is said to have asked someone: “When the facts change, I change my mind. What do you do, sir?” If they were honest, most would reply: “I stick to my guns.”

from The Economist 

Few people can detect a liar

In daily life, without the particular pressures of politics, people find it hard to spot liars. Tim Levine of the University of Alabama, Birmingham, has spent decades running tests that allow participants (apparently unobserved) to cheat. He then asks them on camera if they have played fair. He asks others to look at the recordings and decide who is being forthright about cheating and who is covering it up. In 300 such tests people got it wrong about half of the time, no better than a random coin toss. Few people can detect a liar. Even those whose job is to conduct interviews to dig out hidden truths, such as police officers or intelligence agents, are no better than ordinary folk.

The Economist 

Conspiracy Theories

A conspiracy theory is an attempt to force a story on a set of disparate, though often distantly related facts and observations. But the real world is not a narrative, not a clever mystery to be unraveled by amateur detectives. Every baroque edifice of conspiracy rests upon a foundational belief that there is a singular truth that diligent investigation will reveal, even if the shape of that truth branches and swirls in an infinite fractal. What this mindset cannot accept is that there may be many simple truths for many disturbing facts.

Jacob Bacharach writing in The Outline

Why We Lie

A life of total dedication to the truth means... a life of total honesty. It means a continuous and never-ending process of self-monitoring to assure that our communications – not only the words that we say but also the way we say them – invariably reflect as accurately as humanly possible the truth or reality as we know it. Such honesty does not come painlessly. The reason people lie is to avoid the pain of challenge and its consequences.

M Scott Peck, The Road Less Traveled

The Madman’s Narrative

Consider that two people can hold incompatible beliefs based on the exact same data. Does this mean that there are possible families of explanations and that each of these can be equally perfect and sound? Certainly not. One may have a million ways to explain things, but the true explanation is unique, whether or not it is within our reach. 

In a famous argument, the logician W. V. Quine showed that there exist families of logically consistent interpretations and theories that can match a given series of facts. Such insight should warn us that mere absence of nonsense may not be sufficient to make something true.

Nassim Taleb, The Black Swan