Your brain has roughly 86 billion neurons and some 100 trillion connections, yet conscious thought commands only a small fraction of what it does.
The human brain is a natural wonder. By some estimates, it produces more than 50,000 thoughts each day and performs 100,000 chemical reactions each second. With this much processing power, you might think you are not that biased. But you are, and we all are.
A cognitive bias is a systematic way our mind skews our thinking and decisions. If you look up cognitive bias on Wikipedia, there is a list of over a hundred. Some social psychologists believe our cognitive biases help us process information more efficiently, especially in dangerous situations. The problem is that these biases still operate when we are not in danger and can lead to serious errors in judgment. For example, you tend to look for information that confirms your beliefs and ignore information that challenges them. This is called confirmation bias. The contents of your bookshelf and the bookmarks in your web browser are a direct result of it.
Logical fallacies are another quirk of the mind. They are like maths problems involving language, in which you skip a step or get turned around without realizing it. They are arguments in which you reach a conclusion without all the facts, either because you don’t care to hear them or because you have no idea how limited your information is. Logical fallacies can also be the result of wishful thinking.
Researchers have found that only one in every 166 people believes they are more biased than the average person. This tendency to see ourselves as less biased than other people is called the bias blind spot.
Let’s explore some of the most common types of cognitive biases and logical fallacies that have rooted themselves in our lives. Awareness is the best defence against them, so pay careful attention to how they influence you.
Priming: When a stimulus from the past affects the way you behave, think, or perceive another stimulus later on, it is called priming. Every perception, whether or not you consciously notice it, sets off a chain of related ideas in your neural network. Pencils make you think of erasers or pens; blackboards make you think of classrooms; for many people primed by years of news coverage, the word “Muslim” wrongly triggers thoughts of terrorism. Priming happens to you all the time, and though you are unaware of it, it changes the way you behave.
Confirmation Bias: Confirmation bias is the tendency to seek out information that supports our pre-existing beliefs. In other words, we form an opinion first and then seek out evidence to back it up, rather than basing our opinions on facts. For example, if you are thinking about buying a particular make of new car (say, a Hyundai or a Toyota), you suddenly see that car all over the roads. If you just ended a long-term relationship, every song you hear seems to be written about love. If you are having a baby, you start to see babies everywhere!
Hindsight Bias: You often look back on things you have just learned and assume you knew or believed them all along. For example: “I knew they were going to lose”; “I saw this coming”; “I had a feeling you might say that.” How many times have you said something similar and believed it? You tend to edit your memories so you don’t seem like a dimwit when things happen that you couldn’t have predicted. When you learn things you wish you had known all along, you go ahead and assume you did know them. This tendency is just part of being a person, and it is called hindsight bias.
Dunning-Kruger Effect: This is the tendency for unskilled individuals to overestimate their own ability, and for experts to underestimate theirs. The Dunning-Kruger effect is what makes Britain’s Got Talent, Pop Idol and The X Factor possible. At the local karaoke bar you might be the best singer in the room. Up against the entire country? Not so much.
Burden of Proof Fallacy: The burden of proof lies with the person making a claim; it is not upon anyone else to disprove it. The inability, or disinclination, to disprove a claim does not render that claim valid, nor give it any credence whatsoever. However, it is important to note that we can never be certain of anything, so we must weigh any claim against the available evidence; dismissing something merely because it hasn’t been proven beyond all doubt is also fallacious reasoning. For example: someone declares that a teapot is, at this very moment, in orbit around the Sun between the Earth and Mars, and insists that because no one can prove him wrong, his claim must be valid.
Shifting the Burden of Proof: One common way to shift the burden of proof is to commit a logical fallacy known as the argument from ignorance. It occurs when a proposition is assumed to be true because it has not yet been proved false, or assumed to be false because it has not yet been proved true.
Argument From Ignorance: The misconception is – when you can’t explain something, you focus on what you can prove. But the truth is – when you are unsure of something, you are more likely to accept strange explanations.
This is when you decide something is true or false because you can’t find evidence to the contrary. You don’t know what the truth is, so you assume any explanation is as good as another. Maybe those lights were alien spacecraft, maybe not. You don’t know, so you think the likelihood they were intergalactic visitors is roughly the same as those lights being from a helicopter far away. You can’t disprove something you don’t know anything about, and the argument-from-ignorance fallacy can make you feel as though something is possible because you can’t prove otherwise. The same holds true for leprechauns and unicorns, fairies and monsters. These things aren’t more likely just because you can’t prove they don’t exist. Some people think the Holocaust didn’t happen, or that human beings never walked on the moon, but there is plenty of evidence for both. People who refuse to believe such things claim they need more evidence before they can change their minds, but no amount of evidence will satisfy them. Any shred of doubt allows them to argue from ignorance.
Argument From Authority: The misconception is – you are more concerned with the validity of information than the person delivering it. But the truth is – the status and credentials of an individual greatly influence your perception of that individual’s message.
Insisting that a claim is true simply because a valid authority or expert on the issue said it was true, without any other supporting evidence offered. For example: According to Person 1, who is an expert on issue Y, Y is true. Therefore, Y is true.
You naturally look to those in power as having something special you lack, a spark of something you would like to see inside yourself. This is why people sometimes subscribe to the beliefs of celebrities who endorse exotic religions or denounce sound medicine.
Another example: Richard Dawkins, an evolutionary biologist and perhaps the foremost expert in the field, says that evolution is true. Therefore, it’s true.
Explanation: Richard Dawkins certainly knows about evolution, and he can confidently tell us that it is true, but that doesn’t make it true. What makes it true is the preponderance of evidence for the theory.
If a celebrity footballer tells you to buy a particular brand of batteries, ask yourself if the footballer seems like an expert on electrochemical energy storage units before you take his word.
Argument From Popularity: Using the popularity of a premise or proposition as evidence for its truthfulness. This fallacy is very difficult to spot because our “common sense” tells us that if something is popular, it must be good, true, or valid. But this is not so, especially in a society where clever marketing, social and political weight, and money can buy popularity. In argumentation theory, an argument from popularity is a fallacious argument that concludes a proposition must be true because many or most people believe it, often concisely encapsulated as: “If many believe so, it is so.”
- Nine out of ten people in the United States claim this bill is a bad idea; therefore, this bill is bad for the people.
- Fifty million Elvis fans can’t be wrong; 100 million Taylor Swift fans can’t be wrong.
- Everyone’s doing it; therefore, it must be good.
- Everyone thinks God exists. Therefore, God exists.
- In a court of law, the jury votes by majority; therefore, it will always make the correct decision.
- Many people buy extended warranties; therefore, it is wise to buy them.
- Millions of people agree with my viewpoint; therefore, it must be right.
- The majority of this country voted for this president; therefore, this president must, objectively, be a good president.
- My family or tribe holds this as a truth; therefore, everyone who disagrees is simply wrong.
Circular Reasoning: This is also known as circular logic, a logical fallacy in which the arguer begins with what they are trying to end with. The components of a circular argument are often logically valid, because if the premises are true, the conclusion must be true. Circular reasoning often takes the form: “A is true because B is true; B is true because A is true.” Circularity can be difficult to detect if it involves a longer chain of propositions.
For Example: The Bible is the Word of God because God tells us it is…in the Bible.
Explanation: This is a very serious circular argument on which many people base their entire lives. It is like getting an e-mail from a Nigerian prince offering to give you his billion-dollar fortune — but only after you wire him a “good will” offering of $50,000. Of course, you are skeptical until you read the final line of the e-mail: “I, Prince Nubadola, assure you that this is my message, and it is legitimate. You can trust this e-mail and any others that come from me.” Now you know it is legitimate, because it says so in the e-mail.
False Dilemma: A false dilemma is a type of informal fallacy in which something is falsely claimed to be an “either/or” situation, when in fact there is at least one additional option.
A false dilemma can arise intentionally, when the fallacy is used in an attempt to force a choice or outcome; the opposite of this fallacy is the false compromise. The false dilemma can also arise simply by accidental omission of additional options rather than by deliberate deception. For example: “Kim spoke out against capitalism, therefore she must be a communist” (she may be neither capitalist nor communist). “Roger opposed an atheist argument against Islam, so he must be a Muslim” (his opposition by itself does not make him a Muslim; Roger might be an atheist who disagrees with the logic of that particular argument). Additionally, a false dilemma can result from a habitual tendency, whatever the cause, to view the world with limited sets of options.
Brand Loyalty: The misconception is – you prefer the things you own over the things you don’t because you made rational choices when you bought them. But the truth is – you prefer the things you own because you rationalize your past choices to protect your sense of self.
In the 21st century, the Internet has changed the way people argue. Check any Facebook post, comment section, forum, or message board and you will find people going at it, debating why their chosen product is better than the other guy’s. Coke vs. Pepsi, Hyundai vs. Toyota, Coles vs. Woolworths, Nikon vs. Canon, Mac vs. PC, iPhone vs. Android: it goes on and on. Usually these arguments are between men, because men will defend their egos no matter how slight the insult. Someone who repeatedly argues online, writing a dozen paragraphs defending his favorite thing or slandering a competitor, is quickly branded a “fanboy”. If the product is an unnecessary luxury, there is a greater chance the customer will become a fanboy, because he had to choose to spend a big chunk of money on it. It is the choosing of one thing over another that leads to narratives about why you did it, which usually tie in to your self-image.
The Just-World Fallacy: The misconception is – people who are losing at the game of life must have done something to deserve it. But the truth is – the beneficiaries of good fortune often do nothing to earn it, and bad people often get away with their actions without consequences.
It is common in fiction for the bad guys to lose and the good guys to win. This is how you would like to see the world: just and fair. In psychology, the tendency to believe that this is how the real world works is called the just-world fallacy. It is the tendency to react to horrible misfortune, like homelessness or drug addiction, by believing the people stuck in these situations must have done something to deserve it. The just-world fallacy helps you build a false sense of security. You want to feel in control, so you assume that as long as you avoid bad behavior, you won’t be harmed. You feel safer when you believe those who engage in bad behavior end up on the street, or pregnant, or addicted, or raped.
You have heard that what goes around comes around, or maybe you have seen a person get what was coming to them and thought, “That’s karma for you.” You want to believe those who work hard and sacrifice get ahead and those who are lazy and cheat do not. This, of course, is not always true. Success is often greatly influenced by when you were born, where you grew up, the socioeconomic status of your family, and random chance. All the hard work in the world can’t change those initial factors.
The Straw Man Fallacy: The misconception is – when you argue, you try to stick to the facts. But the truth is – in any argument, anger will tempt you to re-frame your opponent’s position.
When you are losing an argument, you often use a variety of deceptive techniques to bolster your opinion. You aren’t trying to be sneaky, but the human mind tends to follow predictable patterns when you get angry with other people and do battle with words. It works like this: when you get into an argument about something personal, or something more public and abstract, you sometimes resort to constructing a character whom you find easier to refute, argue with, and disagree with, or you create a position the other person isn’t even suggesting or defending. This is a straw man.
The straw man fallacy takes the facts and assertions of your opponent and replaces them with an artificial argument you feel more comfortable dealing with. It follows a familiar pattern: you first build the straw man, then you attack it, then you point out how easy it was to defeat, and then you come to a conclusion. Straw men can also be born out of ignorance. Keep in mind that whoever does this is using a logical fallacy, and even if that person succeeds, he or she didn’t really win.
The Ad Hominem Fallacy: The misconception is – if you can’t trust someone, you should ignore that person’s claims. But the truth is – what someone says and why they say it should be judged separately.
Sometimes an argument can get so heated that you start calling the other person names. You attack the person instead of the position that person has taken. It is easier to disagree with someone you see as nasty or ignorant. Calling someone a bigot, an idiot, or an asshole feels good, but it does not prove you right or that person wrong. You don’t always notice when you are doing it. When you assume someone is incorrect based on who that person is or what group he or she belongs to, you have committed the ad hominem fallacy.
Self-Serving Bias: The misconception is – you evaluate yourself based on past successes and defeats. But the truth is – you excuse your failures and see yourself as more successful, more intelligent, and more skilled than you are.
You tend to accept credit when you succeed, but blame bad luck, unfair rules, difficult instructions, bad bosses, cheaters, and so on when you fail. When you are doing well, you take the credit; when you are doing badly, you think the world is to blame. This behavior can be observed in board games and elections, group projects and final exams. When things are going your way, you attribute everything to your amazing skills, but once the tide turns, you look for external factors to blame.
This sort of thinking also spreads to the way you compare yourself to others. Research shows just about all of us think we are more competent than our co-workers, more ethical than our friends, friendlier than the general public, more intelligent than our peers, more attractive than the average person, less prejudiced than people in our region, younger-looking than people the same age, better drivers than most people we know, better children than our siblings, and that we will live longer than the average lifespan. You don’t believe you are an average person, but you do believe everyone else is. This tendency, which springs from self-serving bias, is called the illusory superiority effect.
According to the research, you pay closer attention to the successes and failures of friends than to those of strangers. You compare yourself to those who are close to you in order to judge your own worth. When you compare your skills, accomplishments, and friendships with those of others, you tend to accentuate the positive and eliminate the negative. You are a liar by default, and you lie most to yourself. If you fail, you forget it. If you win, you tell everyone.
Fundamental Attribution Error: This is the tendency to blame others when things go wrong, explaining their behavior by their character instead of looking objectively at the situation. In particular, you may blame or judge someone based on a stereotype or a perceived personality flaw. For example, if you’re in a car accident and the other driver is at fault, you’re more likely to assume he or she is simply a bad driver than to consider whether bad weather played a role or the other driver had a heart attack.
Bandwagon Effect: The tendency to do or believe things because others do or believe them. For example, your leader may believe someone is great at their job, and you go along with it. As more people come to believe in something, others “hop on the bandwagon” regardless of the underlying evidence. In classic conformity studies, up to 75% of participants gave at least one answer they knew was false, simply because others around them had given the same incorrect answer, whereas fewer than 1% answered incorrectly on their own.
Halo Effect: A person’s overall impression of someone influences our feelings and thoughts about that person’s character as a whole. It is the perception, for example, that someone who does well in one area will automatically perform well at something else, regardless of whether the tasks are related. Under the halo effect, we tend to lump positive qualities together and assume that where one attractive quality exists, others also exist.
Important Questions to Ask:
Daniel Kahneman, author of Thinking, Fast and Slow (2011), recommends asking three questions to minimize the impact of cognitive biases on your decision making:
1) Is there any reason to suspect that the people making the recommendation are biased by self-interest, overconfidence, or attachment to past experiences? Realistically speaking, it is almost impossible for people not to have these three influence their decisions.
2) Have the people making the recommendation fallen in love with it? Again, this is almost inevitable because, in most cases, people wouldn’t make the recommendation unless they loved it.
3) Was there groupthink, or were there dissenting opinions within the decision-making team? This risk can be mitigated before the decision-making process begins by assembling a team of people who will proactively offer opposing viewpoints and challenge the conventional wisdom of the group.
In answering each of these questions, you must look closely at how each bias may be woven into the recommendations that have been offered, and separate the bias from the recommendation’s real value. If a recommendation doesn’t stand up to scrutiny on its own merits, free of cognitive bias, it should be discarded.
Only by filtering out the cognitive biases that are sure to arise while decisions are being made can you be confident that, at the end of the day, the best decision for you and your people was made based on the best available information.