There's thinking and then there is critical thinking.
If a baseball bat and a ball cost a total of $1.10, and the bat costs $1 more than the ball, then how much does the ball cost?
If you answered 10¢, you were thinking. If you answered 5¢, you were thinking critically. The bat costs a dollar more than the ball, so if the ball cost 10¢ the bat would cost $1.10 and the pair would cost $1.20; the ball costs 5¢, the bat costs $1.05, and together they cost $1.10. This example comes from a book you should read called Thinking, Fast and Slow, by Daniel Kahneman. The problem is that our brain inclines us toward thinking the answer must be 10¢, and because we saw that "truth" in a flash, we embrace an error as if it were a fact, and might even fight to defend it.

If you dislike something you don't understand, that's thinking. If you like something because you are familiar with it, that's thinking. If you hang on to your initial opinions in the face of contradictory evidence, that's thinking. If you dislike people who disagree with you, that's thinking too. As you can tell from that list of inclinations toward errors, you should give up thinking.

We resist critical thinking because thinking, going with our gut or our traditions, is fast and satisfying. The only thing more desirable than getting the right answer eventually is getting the wrong answer quickly. Critical thinking, on the other hand, is slow and cognitively taxing, and we don't always know the math (or the logic, or what someone else is thinking). We don't always know what we need to know, and that makes us want to jump to something we do know rather than take the time to learn what we need to learn. A couple of hours pondering hard problems and you're tired; everybody is.
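If you would rather see the bat-and-ball arithmetic checked mechanically than argued in prose, here is a minimal sketch (mine, not Kahneman's) that simply tries every possible price and keeps the one that satisfies both conditions:

# A quick check of the bat-and-ball arithmetic, a sketch rather than anything
# from the book: the ball costs x, the bat costs x + 1.00, and together they
# must cost 1.10.
for cents in range(0, 111):              # try every ball price from $0.00 to $1.10
    ball = cents / 100
    bat = ball + 1.00                    # the bat costs a dollar more than the ball
    if abs(ball + bat - 1.10) < 1e-9:
        print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")   # prints: ball = $0.05, bat = $1.05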
A traveler comes to a fork in the road heading to two villages. In one village the people always tell lies, and in the other village the people always tell the truth. The traveler needs to conduct business in the village where everyone always tells the truth. A man from one of the villages is standing in the middle of the fork, but there's no indication of which village he resides in. The traveler approaches the man and asks him just one question. From the man's answer, he knows which road to follow. What is the question? (David Lieberman, You Can Read Anyone)
Getting the right answer doesn't help if you asked the wrong question.
What follows is more of a parlor trick than a piece of direct instruction in reasoning, but it makes a useful point. Play along.
I bet you a dollar I know what your answers are.
If I was right, it's because of how the number 9 works. Any number you chose of your own volition would lead to the number 4. Given the rules of the game, 4 = D and the subsequent rules led to the answers I bet you gave. That's not magic, it's manipulation. My point is simply that getting the right answer can distract you from looking more closely at the questions you are trying so hard to answer.
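The trick's exact instructions aren't reproduced here, so the sketch below assumes the common version: pick any number, multiply it by 9, add its digits until a single digit remains, then subtract 5. Because every multiple of 9 has a digital root of 9, everyone lands on 4 no matter what number they started with:

# Assumes the familiar multiply-by-9 version of the trick; these rules are a
# reconstruction for illustration, not necessarily the exact ones used above.
def digital_root(n: int) -> int:
    """Repeatedly sum the digits of n until only a single digit remains."""
    while n > 9:
        n = sum(int(d) for d in str(n))
    return n

for pick in range(1, 1000):              # whatever number you might have chosen
    assert digital_root(pick * 9) - 5 == 4
print("Every choice from 1 to 999 ends at 4 (the letter D).")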
If you go into an observational setting thinking you know what you are looking for, your powers of concentration may keep you from seeing something important.
The goal of critical thinking is to overcome egocentricity. The goal of workplace-based writing and research is to overcome corporate bias.
In addition to inertia and overconfidence, critical thinking is impeded by how our brains work when unsupervised (cognitive biases), by errors in logic (fallacies), and by errors in statistical inference. What follows is an overview of those three sources of errors in judgment and decision making. You should make a point of remembering the concepts that follow. You should also make a point of searching daily for examples in your world, in your own thinking as well as in other people's. It is always easier to find these errors in other people's thinking than in our own, so you should seek feedback from people who don't see the world the way you do.
Be advised, however, that knowing these errors in thinking does not inoculate you against them. Human beings are hard-wired for these defaults. Being constantly vigilant (and skeptical) will help, but you will succumb to default thinking from time to time anyway. We all do.
Naïve realism -- believing that you see the world clearly and understand it perfectly. If you hear yourself saying "in reality" or "obviously," stop and think critically. What assumptions are you making that make it obvious to you but that might not be shared by others? Two people can look at the same object and see something different, either because they are looking at it from different angles or because they interpret it differently. The latter happens most often when the object is abstract, like a concept or an idea. Just as the objects in our field of vision are assembled by our brains and interpreted by our minds, so our expectations, experiences, and assumptions influence what we infer about the parts of the world we can't see directly. "He must have been guilty or else why would he have run?" ("What It Means," Drive-By Truckers) is a rhetorical question for some people (the answer implicit in the question), but for others it's a real question. Perhaps he ran because experience taught him that cops are dangerous.
Apophenia -- the human tendency to see patterns in random data, to mistake coincidence for meaning. Can you see the outline of a dolphin in the rose (image, right)? While there's nothing inherently wrong with seeing faces in toast or trees in clouds, basing arguments or beliefs on random patterns is very uncritical thinking, especially since once we come to believe something, it is very hard to un-believe (or unsee) it. [Technically, the dolphin is an example of pareidolia, which is a subset of apophenia.]
Given: 2, 3, 5, 7; what's the next number? 11? Sure, if it is safe to assume that the first four items in the set are there because they are the prime numbers in sequence. What if that's not why they are there? What if you were looking at a code of some kind and the regularity of the numbers hasn't yet manifested itself? Maybe the next number is 16 and the one after that is 13. In that case you made a false, albeit reasonable, assumption. If you ran a random number generator long enough, printing each number as it was generated, eventually you would see the first four primes in sequence. But they wouldn't really be the first four primes, just those numbers randomly displayed in what could easily be mistaken for a meaningful sequence.
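Here is a minimal sketch of that last point; the digit range, the window of four, and the seed are arbitrary choices for illustration:

# A stream of uniformly random digits will, given enough draws, contain the
# run 2, 3, 5, 7 even though the generator "knows" nothing about primes.
import random

random.seed(42)                          # any seed works; this just makes the run repeatable
target = [2, 3, 5, 7]
window = []
for draws in range(1, 1_000_000):
    window.append(random.randint(0, 9))
    window = window[-4:]                 # keep only the last four digits seen
    if window == target:
        print(f"Saw 2, 3, 5, 7 in sequence after {draws} random digits.")
        break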
Let's say you are trying to figure out why people like to drink coffee in the coffee shop in the library instead of leaving the building to get coffee elsewhere, and you ask that question of the first person who walks in the door. Would you bet the next person would have the same reason? Maybe if you asked 20 people and discovered that there were only four answers, then you might bet a small amount of money on number 21 having one of the four answers. But that person might have a fifth answer, one you haven't heard yet. How many library goers would you need to ask before you achieved saturation, the point where you haven't heard a new answer in so long that you can safely figure any new answer could be treated as unique, or at least so rare as to be irrelevant? Well, how much longer can you afford to stand there asking people as they walk in?
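A toy simulation of that stopping problem is sketched below. The pool of reasons, their popularity, and the stopping rule (no new answer in the last 15 interviews) are all made-up numbers for illustration, not a real study design:

# Keep interviewing until nothing new has turned up for a while, then stop.
import random

random.seed(1)
answers = ["convenience", "studying anyway", "meeting friends", "price", "habit"]
weights = [0.40, 0.30, 0.15, 0.10, 0.05]      # hypothetical popularity of each reason

seen, since_new, interviewed = set(), 0, 0
while since_new < 15:                         # stop after 15 interviews with no new reason
    reply = random.choices(answers, weights)[0]
    interviewed += 1
    if reply in seen:
        since_new += 1
    else:
        seen.add(reply)
        since_new = 0
print(f"Reached saturation after {interviewed} interviews; heard {len(seen)} distinct reasons.")

Run it a few times and you will sometimes stop before the rarest reason ever appears, which is exactly the risk the paragraph above describes.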
Confabulation -- making up a story to explain what you can't understand (mythologizing). People dislike doubt so much, and want so badly to feel smart, that we will fill any gap with a story and then accept the story as truth. Often, the story replaces the experience and fiction is all that's left.
The illusion of coherence -- just as we will mythologize to avoid confronting our ignorance, we will assimilate (explain away) or dismiss (ignore) any evidence that contradicts our beliefs. Human beings, most of us anyway, are so troubled by ambiguity that we would rather be wrong than uncertain. But painting over the cracks doesn't fix the foundation.
Stories are powerful because they make sense of the world for us, they make action meaningful, and they justify behavior, good or bad. But they are just stories; they are not reality. Reality includes a bunch of noise and a bunch of data we mistake for noise, as well as a bunch of noise we mistake for data.
Confirmation bias -- we humans are attracted to evidence that supports our beliefs and we tend to ignore or discount contradictory evidence. Once we believe something, we are hard-wired to find proof everywhere. Because we also tend to like people like us, that is, people who share our views, we tend to surround ourselves with people who confirm our prior beliefs, which makes it even harder to think critically. This bias for confirmation is exacerbated by the way our Internet practices tend to filter what we see to conform to our preferences, our likes, our friends' likes, and so on. It's quite easy to mistake the block where you live for the whole world.
Fundamental attribution error -- believing that your failures are caused by bad luck and your successes by hard work, while others' failures are the result of character flaws and their successes just luck or a system rigged in their favor. People who tend to make excuses and blame others when they aren't getting what they want are gripped by the fundamental attribution error. The reality is closer to this: every one of us lives in a set of interrelated systems, some of which support us and some of which impede us, but none of us succeeds or fails alone.
Normalcy bias -- thinking that everything is just fine when disaster looms. A great example of this kind of thinking en masse was the Atlanta "Snowpocalypse" of 2014. Two inches of snow was forecast for metro Atlanta, and everyone went about their business as usual. Next thing you know, the roads got icy, the highways backed up, the surface streets gridlocked, and thousands of people spent the night in their freezing cars.
Availability bias -- letting your personal experience influence your perceptions and inferences. Your experience might not be representative or even relevant. If you watch the news, you are likely to have a more negative worldview than if you don't, largely because "the news" means homicides, domestic violence, kidnappings, spectacular car accidents, fires and so on. These dramatic images color your worldview if you don't contextualize them. The availability bias enables the "information bubble" that most of us live in because the examples we use to enable our thinking tend to come from reinforcing sources.
In the context of workplace-based writing and research, availability means you need to pay special attention to the order you ask questions in. One question may cause a person to think about something that changes their mood and thus influences how they answer the next question or questions. Put that question lower down in the list and the bias might go away.
Framing -- how you see something, how you define a situation (Is this a threat or an opportunity? A problem or a possibility?), directly affects how you respond. If you can look at things from different perspectives, you can capture insights that might lead you to different responses and therefore different decisions. If someone offers you a choice between two alternatives, you should always think: what about both, or neither? The wider your perspective, the more options you will see, and the less likely you are to get tunnel vision, like the person who is so focused on a goal that when he achieves it, he has no idea what to do next.
You've no doubt taken a basic Philosophy class and so are familiar with the logical fallacies. If you haven't or you don't remember them, you should look them up. For our purposes, there are only a few of special importance.
A business proposal is an argument. It offers a description of a situation that indicates a problem and proposes a solution. For it to be successful, it has to be convincing. To be convincing, you need to focus on the evidence. The proposal itself may not be evidence heavy, depending on the nature of your intended audience, but you need to know what you are doing.
For hundreds of years people assumed that emotions warped decisions, because there were plenty of examples of people deciding to do something and later regretting the decision. From this perspective, a perfect world would be a perfectly rational world. We now think that pure rationality is rare and, when present, is in fact a sign of a cognitive disorder (alexithymia). We need to feel in order to think critically. We just need to be aware of our own states and the states of others. If a person is indifferent, knowing why may be important. If a person is agitated or distracted, their state suggests they may change their mind later when they come down. That doesn't mean what they think while stressed is necessarily wrong, only that they may think differently once they come down.
Feelings are caused by the feeler's interpretations and inferences, not the actions of others. As the authors of Crucial Conversations say, other people don't make you mad; you make you mad. As the Daoist tradition has always maintained, suffering comes from within.
In statistics, the mean is the sum of the numbers in a set divided by the number of elements in the set, what we commonly call the average. In the case of a coin toss, assuming 1 = heads and 2 = tails, one would expect to get an average close to 1.5 ((1 + 2)/2 = 1.5). The more times you toss the coin, the closer you would expect to get to 1.5. But of course you could get an even split with just two tosses.
With a coin toss, we can only come up with an average if we identify heads with one number and tails with another, like 1 and 2, but given that a toss is either heads or tails, the idea of an average isn't real. We can get the number, but it has no referent in the real world. That may or may not be a problem, depending on what we are trying to do, but it's a point worth remembering, since people often mistake numbers for reality. Precision often masquerades as truth.
The mode is whichever outcome occurs most frequently in a given trial; with an even split of heads and tails, there is no mode.
[Figure: the proportion (P) of heads and of tails plotted against the number of trials.]
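For readers who want to watch that convergence rather than take it on faith, here is a minimal sketch; the toss counts and the seed are arbitrary:

# Code 1 = heads and 2 = tails, toss a fair coin, and watch the running mean
# drift toward 1.5 as the number of trials grows.
import random

random.seed(7)
for n in (2, 10, 100, 10_000, 1_000_000):
    tosses = [random.choice((1, 2)) for _ in range(n)]
    print(f"{n:>9} tosses -> mean {sum(tosses) / n:.4f}")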
We Americans, by and large, don't like to be told what to think. Our country is founded on individuality and the idea that everyone is entitled to his or her opinion, no matter how demonstrably false that opinion is. We distrust authority in general, and most of us can think of half a dozen vivid examples of breaches of public trust that seem to warrant our wholesale rejection of authority. Our logic textbooks tell us that arguments from authority are fallacious. Any decent logic textbook, however, also tells us that just because an argument is invalid doesn't mean the conclusion is false.
The problem I have with the wholesale rejection of authority is that we don't always know enough to make good decisions without listening to expert testimony. We have to be able to understand enough science to differentiate science that is currently valid, acceptable to other scientists, from junk science, while still understanding that scientific thought evolves and what is scientifically valid today may be disproven in the future. To say that a theory is just a theory is to fundamentally misunderstand what the word theory means. There is a big difference between a speculation and a theory.
"A frightening Myth about Sex Offenders" is a disturbing piece of video journalism about how junk science gets written into legislation. The video is clearly (I think) designed to shock and outrage people (the devil term "pedophile" and the opening scene make it's emotional designs clear to me) and as such makes for interesting discussions about rhetoric as well as grounds for considering how what we want to believe influences not only how we see the world but how we change the world to best fit our perceptions of it. The problem with arguments from authority is that they are only as good as the grounds which grant authority and here we seem to have evidence that no less an authority than the Supreme Court can base it's decisions on junk science, thus undermining its authoritativeness without lessening it authority. It's still the Supreme Court, no matter why it says what it says. (link)
Before you reject an argument from authority solely because it is an argument from authority, you need to understand what the authority's opinions are based on. And if you can't understand, then you may need to confront the unpleasant fact that maybe you should submit to the authority anyway. And yet at the very same time, you can't let an authority abuse its power or make proclamations beyond its sphere of knowledge. The doctors in the 1950s who said Lucky Strike cigarettes were good for your throat because the tobacco was toasted should have had their licenses revoked, but of course by the time the world fully embraced that fact as a fact, they were dead.