CRITICAL THINKING & SOCRATIC
QUESTIONING MASTERY
HOW TO GET THE RIGHT ANSWERS AND BUILD WISE
ARGUMENTS
THINKNETIC
Did You Know That 93% Of CEOs Agree That This Skill Is More Important Than
Your College Degree?
Here's just a fraction of what you'll discover inside:
How to shortcut the famous Malcolm Gladwell "10,000 Hours Rule" to
become an expert critical thinker, fast
What a WW2 pilot and the people of Romania can teach you about
critical thinking - this is the KEY to not making huge mistakes
Actionable, easy exercises to drive home every point covered. You
won't "read and forget" this book
Our educational system simply doesn't teach us how to think...
...and it's unlikely this is information you've ever learned anywhere else - until
now.
A glimpse into what you'll discover inside:
If your thinking is flawed and what it takes to fix it (the solutions are
included)
Tried and true hacks to elevate your rationality and change your life for
the better
Enlightening principles to guide your thoughts and actions (gathered
from the wisest men of all time)
(Or simply go to thinknetic.net)
CONTENTS
The Critical Thinking Effect
Introduction
1. What It Means To Be A Critical Thinker In This Day And Age
2. What Keeps Us From Getting To The Truth?
3. Why Having A Scientifically Skeptical Mind Helps You Discover
The Truth
4. Why The Media Can Make Or Break Our Thinking
5. Everyday Lies And Deception
6. Pseudoscience Versus Science
Afterword
References
The Socratic Way Of Questioning
Introduction
1. The Crucial Role Of Critical Thinking
2. The Socratic Method Of Thinking
3. Traits Of A Socratic Mind
4. Questioning: The Heart Of The Socratic Method
5. The Skillful Art Of Asking The Right Questions
6. Getting It Right: Points To Remember And Apply
Afterword
References
One Final Word From Us
Continuing Your Journey
The Team Behind Thinknetic
Disclaimer
THE CRITICAL THINKING EFFECT
UNCOVER THE SECRETS OF THINKING CRITICALLY AND
TELLING FACT FROM FICTION
INTRODUCTION
Have you ever wondered why there seems to be so much
misinformation out there? With fake news stories reaching a
hundred times more users on Twitter than real news stories,
perhaps it is no wonder we have a problem separating the
facts from nonsense. That leads us to another question: who
comes up with the facts, and how do they know what is real?
Why do some people believe official or scientific
explanations, whereas others prefer to believe alternatives?
Is deception an unavoidable part of human existence? Can
we develop techniques to cope with this influx of potential
deception and get to the truth?
We all learned critical thinking at school and college, but
how often do we apply it in everyday life? Even more
importantly, how often do we interact with others who fail to
apply a critical thinking approach to their work and personal
lives? Many people happily believe in nonsensical ideas,
whether these be innocent misconceptions or (in some
cases) dangerous misinformation. By developing your
critical thinking skills even further, you can learn to deal
with these difficult people and present a coherent case for
why they should perhaps research their views further.
A critical thinking mindset also helps us be more efficient
and effective at work, enabling us to focus on relevant
information and discard unconnected or inaccurate
information. This leaves more time for the things and people
we enjoy. If you strive for balance, it is worth taking the time
to hone these skills and learn to apply them in a broader
range of situations.
Many of us are perfectionists, and there is nothing wrong
with that. We know that perfectionists deliver better results
due to their high standards, but why not streamline your
perfectionism and do the same or better with less effort?
That is where the techniques in this book come in.
This book walks you through the details of using critical
thinking to separate truth from falsehood. It contains
specific examples, illustrative stories, thorough
explanations, and practice exercises; all backed up by a range
of scientific studies conducted by experts in their fields.
Each chapter is self-contained and logically organized,
meaning you do not have to read it all in one sitting to make
sense of the contents. You will understand how to discern
the truth in a variety of contexts, from online news to
scientific claims, to personal interactions. You will learn
more about the scientific method, including scientific
skepticism and how to tell science from non-science.
You will also find out how your mind might block you from
figuring out the truth with a discussion of common biases,
heuristics, and fallacies, including ways to overcome them.
Knowing more about these will also help you deal more
effectively with difficult people, including those trying to sell
you phony ideas or dodgy products.
By learning more about how to sort the truth from the lies,
you can reach better conclusions and make better decisions.
You can also help others who do not yet have this knowledge
to feel more confident in themselves.
This book’s author is Camilla J. Croucher. She graduated with
a Ph.D. in cognitive science from the University of Cambridge
in 2007 and then completed a postdoctoral fellowship at
City, University of London. Her academic specialisms include
emotional memory, visual processing, and judgment and
decision-making. She is outstanding at statistics but does
not want to bore you with that. She has also worked in retail,
clinical research management, e-commerce, learning
support, and (of course) freelance writing. Her interest in the
illusions created by the eye and mind remained constant
across each of these roles, and she very much enjoyed
composing this text for you. The book is not a definitive
guide on thinking critically, and it is not a philosophy text.
Instead, it summarizes reliable research findings and
experts’ opinions and suggests techniques and practice
exercises that you can use in your daily life. So, are you ready
to explore what it means to be a critical thinker?
Dr. Camilla J. Croucher
1
WHAT IT MEANS TO BE A CRITICAL THINKER
IN THIS DAY AND AGE
Chancellor Sham Anderson took a deep breath as she
hung up the phone. Marvin Keller, the Chair of the
University Finance Committee, was not answering her
calls. That morning's big news was the resignation of the
Director of Loofberger Inc., which sent the company's stock
plummeting.
Naturally, Anderson now had serious concerns about the
University's multi-million dollar investment in that very
company. The decision to invest in a new corporation touting
a completely novel product was risky, of course. However,
the University had the funds, and the science and technology
faculty were pressing hard for the investment.
Building a new space research center would allow the
University to build probes to look for life on various moons
within our solar system. This would not be possible without
a vast amount of money.
The company performed excellently from the outset,
suggesting a low risk. The Finance Committee quickly
decided the prestige was worth the risk. They would invest in
Loofberger.
Soon, the first hints emerged that Loofberger was in trouble.
The product was not selling as expected. The share price
slowly declined. Nevertheless, Keller assured the Committee
that this was a blip and that things would pick up soon.
Sham startled as her cell phone rang: an unknown number. A
somber voice confirmed that Loofberger had gone bust: the
University lost its entire investment.
Hindsight revealed several clues.
Marvin Keller had failed to disclose that Loofberger's
Director was his brother-in-law. As a lifelong friend as well as
a colleague, Anderson had not thought to question his
recommendation. A few months after finalizing the deal,
Keller took an extended sabbatical to work on special
projects at his lake house.
What about Loofberger's initial promising market data? It
turned out to be too good to be true—a fairly clever fake.
The University never got its space center, and many of the
staff involved relocated to other, less embarrassing organizations.
Anderson’s peers advised her to take early retirement, which
she did. The student body voted to paint the canteen ceiling
black in memory of the loss, and little paint chips flaked off,
making it look like the night sky.
The finance expert in this scenario had questionable
motives, but nobody had thought to question them. Expert
advice is often trustworthy, but we should not take
everything at face value. Question how you know that source
is an expert and how they know what they know.
Making complex decisions is difficult when we have limited
information, but it is still important to investigate its source
and content by gathering evidence to support our decisions
and conclusions.
Critical thinking is a powerful approach that can help you to
make better decisions. Critical thinkers do not passively
receive information. Instead, they apply the rules of logic
and their own experience to properly interpret the messages
they hear and see [1].
Getting Started With Critical Thinking
Our intuitive reasoning processes work well for everyday
purposes, but they may lead us in the wrong direction when
applying them in more complex situations. This is because
our brains love a good shortcut. In contrast, critical thinking
reduces shortcuts and automatic processing: it is a self-
disciplined process.
That is not to say that critical thinking is completely
different from intuitive thinking and reasoning; it may be
more effortful, but anybody can think critically. The
principles are fairly simple, but we know that people do not
apply them consistently.
Faulty reasoning is commonplace. People smoke, despite the
known health dangers. They gamble to excess, selling
property and losing loved ones to feed their addiction.
Critical thinking is vital to avoid faulty reasoning pitfalls, as
it helps us become better at solving problems. Unfortunately,
faulty reasoning can become habitual. For instance, if
somebody reasons to themselves that ‘astrology is just a bit
of fun, we do not need to prove it like a science,’ that is fine,
but what if they apply the same reasoning to the next
pseudoscientific idea? This ‘slippery slope’ can make us
vulnerable to more unproven ideas and even manipulation [2].
Even though critical thinking is composed of simple
principles, you must practice to get better. The best critical
thinkers share several common traits [3]:
They ask questions: They identify relevant information and
pair it with abstract ideas. They draw valid conclusions and
then test them.
They never assume that they got it right the first time: They try
different approaches and ways of framing the problem.
They conclude by reasoning, as opposed to rationalizing:
Reasoning means using logic to reach a conclusion, whereas
rationalizing means finding a logic that fits the conclusion [4].
In other words, rationalizing is reasoning done backward.
They are superb at connecting with others: This is not only
about sharing their ideas clearly; it is about listening and co-
operation. You can even enhance your critical thinking by
working with people with highly developed critical thinking
skills [5].
So why is critical thinking important? For one thing, it helps
us to distinguish between facts, opinions, claims, and
evidence. We must delineate these four closely connected
concepts because others may use them to persuade,
misinform, or even manipulate us.
Facts
What exactly is a fact? More importantly, how do you know
something is a fact? What if it is merely an opinion or a
claim?
A fact is a piece of information that we can verify. We can
observe it, or we can find out it is true from a reliable source.
For example, the atomic number of carbon is six; the US Civil
War took place between 1861 and 1865; Armstrong was the
first to walk on the Moon. These are all facts.
Without diving deep into philosophy, we can note here that
‘truth’ is an abstract idea. Nobody can say for certain what is
real (or fake, come to that). Later, when we look at scientific
skepticism, we will examine this in more detail. For now,
let’s assume that truth can exist and that critical thinking at
least brings us closer to it.
As a critical thinker, you should inspect any so-called fact
you encounter. How do you know it is a fact and not an
assumption? Try to verify the ‘fact’ yourself; it may be an
assumption if it cannot be verified. On investigating, you
may find the assumption is incorrect.
In some cases, assumptions are the best we can do. For
example, if you were designing a novel product, at first, you
might assume that it would appeal to customers who buy
related products. Later, you could gain evidence using
market research.
When you investigate a given fact, you might find that it is
outdated or even a total misconception. Scrutinize facts, and
learn to recognize the good facts and reject the bad ones.
Opinions
Opinions may resemble facts, but they are subjective
judgments. People often misrepresent opinions as facts,
perhaps because strongly held opinions may even feel
factual. Opinions are always evaluative or comparative, even
if they use the same form as a fact by stating that something
‘is’ something. Saying that something is the best must,
therefore, be an opinion.
Take this statement:
“Joseph Bloggs is the best downhill skier because they have
won the most gold medals.”
This sentence is an opinion based on a fact. You can verify or
falsify the fact that they won the most medals by reading medal
tables. The opinion that such a fact makes Joseph Bloggs the
best downhill skier cannot be verified: it is somebody’s
perspective.
A new skier may be the best, even when they have not won
anything yet. They might be able to beat Joseph Bloggs in
every race, but as long as medal count is taken as the measure
of skiing ability, the new skier cannot be said to be the best.
Our motivations, attitudes, and emotional states have huge
effects on our opinions [6][7]. This renders opinions vulnerable
to all sorts of biases; not surprisingly, two people with
identical information can very easily hold opposite opinions.
Of course, opinions can change completely over time and
need not be based on facts at all.
Claims
Like opinions, claims are often wrongly presented as facts.
Claims may be factual, but a claim is by definition an
assertion presented without proof. Therefore, distinguishing
claims from facts is easy; you just need to check whether the
source supplies any evidence for the claim.
Claims can be implied rather than stated. ‘Before and after’
photos in beauty adverts are a good example. The adverts
may or may not overtly claim that the treatment improves
the skin, but the skin certainly looks healthier in the ‘after’
photo.
Companies produce adverts to make viewers spend money
rather than showing them the truth, resulting in advertisers
presenting claims as facts. But claims crop up in the wild,
too.
Conspiracists claim that mankind did not land on the Moon
in 1969 and that NASA faked the mission using camera tricks in a
television studio. We can say this is a claim because there is
no evidence of the proposed fakery.
A fake Moon landing would entail faking a lot of evidence.
Fake technical data and fake filmed footage are only the
beginning. NASA would have had to persuade its entire
staff to give fake testimony, not to mention produce fake
paperwork.
Evidence
It is not just conspiracy nuts who persist even when faced
with overwhelming evidence against their beliefs [8]. We all do
it. At times, we are all guilty of ignoring or
misunderstanding evidence. This leads us to an important
question: what exactly is evidence, and how should we use
it?
Evidence is an everyday term, but as critical thinkers, we
need a more technical definition. Evidence refers to a body of
information that supports a given position.
We typically use evidence to prove or disprove a belief or
decide whether a judgment or opinion is valid. Of course, you
need evidence from different sources.
A good body of evidence comes from multiple reliable
sources. Imagine overhearing a conversation at a party.
Somebody claims that ‘investments are a great way to make
money.’ A successful investor is listening; he nods
enthusiastically and starts bragging about the huge profits
he has made. Wouldn’t you want to hear the other side?
The more evidence supports a conclusion, the more likely
that conclusion is to be true. You might collect evidence from
pre-existing sources or decide to gather your own.
Picture a range of experts who are interested in why people
fall into problem gambling. A medic does not agree that
sociology surveys are the best way to research this, but a
sociology professor thinks they are the only way that makes
sense.
However, the two researchers would examine different
aspects of addiction. The medic in this example decides to
look at physical differences in the bodies and brains of
addicts and non-addicts; perhaps pre-existing variation
predicts who can gamble casually without becoming
addicted. In contrast, the sociologist wants to look at
socioeconomic factors like gamblers’ family situations,
housing issues, and poverty.
The gambling study could involve neuroscience, interviews
with gamblers, big data science, and more, in addition to
surveys and clinical studies. All these approaches are helpful
because they look at the problem at different levels. The
resulting body of evidence, taken together and processed
according to good logic, could generate more robust data
than the medic or the professor alone. The group can
investigate all potential causes of gambling and compare
how well all the different factors predict who becomes a
problem gambler.
In conclusion, uncertainty is a good thing because it drives
us to examine problems in more depth. You can never gather
all the facts or examine all the evidence. The best you can do
is test your ideas and beliefs and improve them as you go
along, based on a wide range of evidence.
Facts Versus Non-Facts
Critical thinkers evaluate everyday information rather than
simply absorbing it. You must be clear about the division
between facts and opinions, fiction, and emotions.
Facts Versus Opinions
Firstly, discerning facts from opinions is vital. Remember,
we can prove facts, whereas we cannot prove opinions
(subjective points of view). Alleged facts bombard us daily,
but these are often opinions in disguise.
Business documents, and even scientific reports, sometimes
report opinions as facts. Authors may commit this error
while chasing positive results to persuade others to agree
with them. Sometimes, they may misunderstand how to
verbalize the information. For example, it is incorrect to
present opinions based on data as though the opinions were
the data.
The same facts presented differently can lead to opposite
conclusions. The danger is that readers then treat the
opinions as facts.
Facts Versus Fiction
Secondly, we need to separate fact from fiction. This is more
difficult than it sounds. Labeled fiction is not a concern,
although fantasists may claim that it is somehow real. The
internet is full of people who believe in unlikely things.
One such unlikely belief is that aliens have left Stargates all
around the universe, including on Earth. These Stargates
allow beings to travel instantly from one planet to another.
You might recall a long-running TV drama on this theme.
There are many more examples of people taking science
fiction as fact. However, subtler categories of fiction do exist.
Conspiracy theories are good examples of probable fictions
presented as facts, whether due to error, lies, wishful
thinking, or confusion [9][10].
Facts Versus Emotions
It is also vital to distinguish facts from emotions. Just as we
can feel so confident in opinions that we report them as
factual, our emotions can seduce us into behaving as though
they too are real [11]. We also believe that arguments that
support our pre-existing attitudes are stronger [12].
Telling facts and emotions apart may seem easy, especially if
you regard yourself as a rational thinker, but it is not always
obvious.
Let’s consider a married couple having a heated argument.
One yells at the other:
“I hate you! You’re always telling me what to do. I wish we’d
never met!”
These ideas feel vivid and extremely real at the time, but
once the speaker has cooled off and reconciled with their
partner, they will regret saying these words. They realize
that the ideas were expressions of emotions rather than the
truth about their relationship.
Attraction forms the basis of many relationships, and
attraction is another emotion that colors our perceptions.
Earlier in the relationship just mentioned, the partners
might have said things like:
“You are the most beautiful person I have ever seen. You’re
not like other people; you’re better.”
Again, these ideas feel concrete and factual at the time, but
in reality they are expressions of emotion, just like the
angry words in the argument.
Positive emotion may be more enjoyable, but it is no more
real than negative emotion.
So, how can we tell the difference between facts and
emotions?
First, let’s draw a line between emotional quality and
emotional state [13].
Emotional quality is the emotion that somebody wants to
convey in a picture, article, advert, or other messages.
Usually, charity adverts show people or animals in a pitiful
state, perhaps crying children, to make the viewer feel sad
and guilty. In this case, sadness is the emotional quality of
the advert.
Your emotional state is how you feel right now: calm,
excited, wistful, or nervous.
You may feel the intended emotional quality in response to
the charity ad or not. That is irrelevant. To critically assess
the message, you only need to appraise what emotion you
are supposed to feel and be aware of it as you process the
message.
When assessing a claim or message, note whether the author
is mistaking their own emotions for facts. People often feel
strongly about causes, and we know that emotions drive us
to justify our decisions and actions.
We are all affected by our emotional states and past
experiences. The tricky thing is that emotions feel urgent at
the time. Realizing that emotions affect your cognition is
only the first step.
To compensate for this, we should try to step back from our
own core affect (how we feel as we take in information) and
instead view the information objectively. Pause and
recognize the emotion for what it is, a separate entity from
the information you are processing. Similarly, a message’s
emotional quality is separate from its meaning.
We need to compensate for emotions like this because our
emotional states affect our reasoning processes significantly.
Moreover, we use our emotions to rationalize our decisions
and behavior.
Shopping is a good example. Consumers often buy things
they do not want or need, which is not a problem itself, as it
supports a healthy economy. Excessive spending can become
an issue, though. For example, somebody might be unable to
resist spending $300 on a new pair of sneakers when they also
need to pay their rent, justifying the purchase because they
really want them.
Research suggests that clinical staff, such as nurses and
doctors, regularly make decisions biased by emotions [14].
Ideally, medical reasons, not emotional ones, should inform
choices like whether to discontinue treatment. Therefore,
medics educate their trainees to be as objective as possible,
even in the face of highly emotive situations.
Emotional intelligence is hugely important for these
clinicians. They must not let their emotional responses drive
their decisions, but at the same time, they must be empathic
and kind to patients and their families. Nevertheless, the
study shows that decisions remain biased even under these
conditions.
Whether you are a doctor or not, you probably make
important decisions every day, so be mindful of your
emotional state when reading or receiving a message. If you
agree or disagree very strongly, your emotional brain might be
biasing you towards noticing only consistent information.
We are more liable to agree with arguments that support
what we already feel [15]. This emotional bias means you could
miss important details and facts.
In contrast to our innate emotions, we can deliberately learn
critical thinking and logic [16][17]. As intelligent animals, we
have the power to figure out new ways of thinking, develop
these throughout civilization, and pass the techniques down
to future generations.
Compare this to learning a language. We are born with the
capacity for it but need to acquire the pieces (letter sounds,
words, and so on) and put them together into something
meaningful and useful.
To assess information rationally and avoid mingling facts,
opinions, fictions, and emotions, we need to practice and use
our critical thinking skills. Then, we can make informed
judgments and decisions that are more likely to be effective.
The Message Behind The Message
Before we can make sense of the information we receive, we
must fully understand the entire message. We cannot
evaluate the information successfully if we do not
investigate who made it and why: the content is only part of
the story.
Source Of Message
Firstly, find out about the source. Sources are individuals or
organizations, and the following advice applies to both.
A source may be an expert on some topics and naive about
others. Sources may be biased, have special interests in
certain topics, or pet theories. They may be more or less
reliable, more or less trustworthy. Think about the following
aspects of the source:
Is it an academic or government publication? We can generally
assume these are more trustworthy than commentators. This
is because their vested interest lies in providing accurate
information for the population, whereas commentators’
motivation is more variable.
Is the source paid (or rewarded in some other way) for conveying
the message? Publishers can and do pay experts to
communicate specific information.
Where do they get their information? Is it a primary or
secondary source? Secondary sources can misquote primary
sources. They might even treat other secondary sources as if
they were primary sources. This magnifies errors and
misconceptions. Find the original information if you want to
assess it fairly.
What does your own experience tell you? If the source is somebody
you know, perhaps you know that they make outlandish
claims quite often. This could factor into your assessment.
When analyzing messages, especially from people you know,
remember that people’s reasoning skills vary. The source
may not be aware of all the aspects just described, and they
may feel that they have made a very good case. Perhaps with
a good debate, you can help them to improve.
At times, we all forget our deep-seated assumptions and
motivations. Do not forget that critical thinking takes
practice.
Purpose Of Message
Next, examine why the source composed the message.
Knowing a message’s purpose may alert you to possible
distortions and half-truths. What was their real motivation?
Here, you need to view the message’s fine details. If it is on a
website, what kind of website? For example, somebody’s
private blog has a different purpose from a government
website. See whether they have declared any interest in the
products or topics they discuss; like influencers, blog writers
are often given ‘freebies’ in exchange for promoting the
product.
A message might not be an obvious advert, but still be a
promotional text. For example, companies often feature
blogs about their products and services; you would not
necessarily take these texts at face value. Instead, think
about what interest the company might have in the topic:
web traffic, affiliate links, direct purchases, or simply to get
you reading more about pet insurance.
People make persuasive messages for many reasons, and
they can be subtle. Analyze the language to detect whether
the message might be covert persuasion rather than
unbiased information. Persuasive texts may feature many
adjectives and adverbs, chatty language, and high-school
rhetorical devices like alliteration and the ‘rule of three.’
Word choices also reveal the author or speaker’s biases and
opinions. Say you are reading reviews of a scientific book
about climate change. One reviewer refers to the ‘climate
scare,’ whereas the other calls it the ‘climate emergency.’
They hold opposing opinions, but in context both phrases
refer to the same thing.
Another aspect of purpose is that the source may prefer one
conclusion or decision from the outset. They might then
filter out and distort the evidence to support the position
they have already chosen. You can tackle this issue by using
alternative sources to research the topic and filling in those
gaps yourself.
Field Of Work
As well as the source’s motivation and the message’s
purpose, you must understand at least something about their
field of work. This is even more important if it is not your
specialty. You need to get to grips with the basics.
Firstly, what are the fundamental goals? Imagine a hospital
where radiographers and nurses work together to produce
and analyze magnetic resonance images. Radiographers aim
to produce the best images possible, whereas nurses aim to
keep patients comfortable and well-informed about the
procedure. Sometimes these goals might clash since the
scanning procedure is uncomfortable and noisy. Specialist
staff at all workplaces need to work together in this way to
be effective.
Similarly, to assess the truth or falsehood of a message, you
must understand the sphere the source of the message works
in. This contextual information enables you to judge the
message on its own merits. After all, there is no point judging
the quality of a radiographer’s work in the same way you
would judge nursing care.
Secondly, what basic concepts or assumptions does the
source employ? Individuals may not even be aware of their
basic assumptions, but you, as a critical thinker, should be
able to discern them.
In everyday life, a basic assumption might be that when you
enter a table service restaurant, you wait in line, and then
somebody shows you to a table. You do not have to ask
somebody what to do; you just know. Similarly, physicists
assume that light’s speed is a universal constant; they do not
attempt to measure it in every experiment.
Finally, what kinds of data do they use to expand their
knowledge and inform their decisions? Whether you agree
with the specific methods or not, try to assess them fairly
rather than from a prejudiced position. Be flexible yet
rigorous, like a scientist. Research the message behind the
message you receive, and put your critical thinking skills to
good use.
Reliability And Reputation Of Source
Critical thinking is an extremely powerful approach, which
most people do not use most of the time. One way to dissect
claims more effectively is to be aware of our psychological
tendencies. A further strategy is to gather evidence from
multiple sources and to assess each source's reliability and
reputation. When a piece of information seems factual, the
next step is to analyze how reliable its source is.
Look at the source of information: Is it primary or secondary?
Primary sources are the originators of the information,
whereas secondary sources are based on primary sources.
If secondary, check whether they report accurately and
completely: Sometimes, secondary sources can leave things
out or report information selectively to support their own
argument, which may differ from the primary source’s
argument. Find the primary source and see how different it
is. You may observe that the secondary source reports their
opinions about the facts.
Now, look for evidence of reputation. If the source claims
they are a world authority on kidney tumors, find out more
about them.
Where do they work? University, hospital, or private company?
You can then assess their employer’s reputation as well.
What do other experts (or world authorities) say about them? Are
they mainstream or fringe? Accepted authority or
controversial?
Have they won prestigious awards, grants, or contracts? These all
indicate that somebody is recognized and successful in their
field.
What organizations do they belong to? This could include
overarching professional bodies (e.g., a therapist belonging
to the American Psychological Association) or more niche
bodies (e.g., the Society For Neuroscience).
What else have they written about? If somebody has written a
huge amount on diverse topics, they may be journalists or
interested amateurs, rather than an expert on the particular
topic.
By assessing the reliability of information sources, we can
exclude those with weak foundations. An accepted authority
on a subject is likely to be a more reliable source than a
relatively inexperienced person who is new to the field.
Similarly, a person or organization with a sharp focus on the
subject at hand is likely to be more reliable than a generalist
since their experience and knowledge run so much deeper.
They are also more likely to base their information on
primary sources rather than secondary or tertiary sources.
Once you have determined the source’s reliability and
reputation, you can put this together with the other factors
we have discussed. This framework enables you to interpret
messages more actively, empowering you to make better
decisions and arrive at more accurate conclusions.
Action Steps
Now that we have explored the features of critical thinking
and how to interpret messages better, it is time to put some
of these ideas into action. Try these suggested exercises.
1. The Fact Check
Identify a purported fact, either from your work or the
media. This can be anything, as long as you can read it in
context and research it to analyze it.
Suggestions:
There is life on other planets in the universe.
Humans only use 10% of their brains.
People with a college degree earn more money than
people without.
Use critical thinking to evaluate your chosen fact
systematically. Use these questions as guidance:
Where does the fact come from?
Is it a reliable source?
How do you know?
What is the author's motivation for presenting this
fact in this context?
What is the evidence for this fact?
Is the evidence presented objectively, or is there
evidence of bias?
Is it a fact, or is it an opinion?
Has emotion influenced it?
Is the source using emotion to try to influence you
(the reader)?
Feel free to ask any other relevant questions you can think
of, based on what we have looked at in this chapter. Now that
you have done this once, you have a framework for assessing
messages you receive using critical thinking.
2. Observational Study
Firstly, visualize a person you think has good critical
thinking skills. Write a few notes about them using the
questions below, or make a mind map.
What kind of person are they? What have they said or done
that makes you think they are great at critical thinking?
What outcomes do they produce?
Examine the evidence you have written down, and conclude
whether this person is a good critical thinker. Perhaps bring
this exercise to mind next time you speak to them or witness
their critical thinking, and make a few more observations.
Now repeat the same process for somebody you think has
poor critical thinking skills, including what makes you think
they are bad at critical thinking. Put your notes or mind
maps side by side and compare them.
This exercise will help you focus on the good (or bad) critical
thinkers’ traits and behaviors. It also starts you thinking
about the real-world applications of critical thinking.
Summary
In the story at the beginning, the University relied on its
staff to disclose conflicts of interest, and they trusted the
market data that the company reported. However, multiple
factors, including misplaced trust in Keller, led them to
invest in a failing company.
A poor decision cost the University more than just money.
Could this have been prevented if the Finance Committee had
applied what we have learned in this chapter? Perhaps.
Emotions played a role in the investment: the desire for
success, trust in Keller. They appraised the company’s
success incorrectly due to inadequate evidence (they relied
on the market data). Keller, the investment recommendation
source, turned out to be unreliable due to having a personal
interest in the company.
In the story, the University did not have all the information
needed to make the correct decision. No doubt, you will have
been in similar situations yourself. Hopefully, the techniques
covered so far have equipped you with more tools to deal
with information you encounter in the future.
Apart from features of the information we receive, what else
keeps us from getting to the truth? The answer is complex,
and we will delve into it very soon in the next chapter.
Takeaways
1. Critical thinkers must distinguish between facts, opinions,
claims, and evidence.
2. You should be realistic and even humble about your
knowledge. However, pairing logic with your own experience
is a key part of thinking critically.
3. Remember to assess the author and their motivation, as
well as the message.
4. Use multiple reliable sources, including other people, to
help you reason towards better conclusions and decisions.
2
WHAT KEEPS US FROM GETTING TO THE
TRUTH?
“Mom! Dad! I need to speak to you!” the kid yelled. He
had just got back from his first day at grade school,
and he had serious beef with his parents.
“What is it?” asked the concerned parents.
“The other kids all laughed at me.”
A sad tale of juvenile bullying, you might think. Yes, but
there was more to it. The kid had started school with
something fairly crucial missing from his social life.
His parents were overjoyed when he was born. As high
achievers themselves, they wanted their children to do well
in life.
The kid’s father had heard about an interesting research
study. He spoke with his spouse, and they both agreed it
could not harm their child.
The study was the famous Mozart Effect. First published in
the early 1990s, this experiment indicated that students who
listened to Mozart did better on certain cognitive tests than
those who did not listen to Mozart [1]. The students performed
as though their IQ was 8-9 points higher than those who
listened to a relaxation tape or silence. Furthermore, a
prestigious scientific journal published the study.
This got parents, as well as scientists, very excited.
Everybody wanted to grab those extra IQ points for their
child. There may even have been a boom in sales of baby
headphones and Best Of Mozart CDs (this was the
1990s, remember).
Our family took this to an extreme, however. The kid had
passed unnoticed through kindergarten, but by grade school,
his deficit was apparent. Shockingly, he had never listened to
anything other than Mozart.
What is more, his test scores were average at best, and he was the
victim of several bullying incidents within the first few
weeks of school.
That was when his Mom decided to investigate further.
Scientists found the Mozart Effect very hard to replicate, but
they kept trying. More often than not, Mozart listeners
performed about as well as those who listened to different
music or silence [2, 3].
The kid's Mom also found out that the cognitive
enhancement effect was small and probably lasted only a short
while after the music finished; anything mildly
stimulating made people do a bit better on the tests.
What she regretted, though, was naming her son Wolfgang.
With the Mozart effect, one experimental study became so
well-known that people did not even notice the subsequent
studies. Other studies were less dramatic and therefore did
not grab the parents’ attention.
Is Mozart special? In a musical sense, of course. But there is
probably not a special Mozart module in the brain that
switches on superior learning processes.
The failure to replicate the Mozart Effect suggests that the
original effect was due to general characteristics of the
music, like complexity or interestingness. Aspects of the
experimental situation might also have led to these
seemingly impressive results [4, 5].
Recent analysis suggests that scientists published more
‘Mozart-positive’ results due to publication bias. This is
similar to confirmation bias, which we will look at in detail
in this chapter.
Our brains construct our perceptions 6 and memories, so we
need to constantly evaluate and question our ideas. Our
brains construct an impression of a three-dimensional world
based on a two-dimensional projection on the retina. We
perceive three dimensions even in two-dimensional
drawings in a way that feels automatic 7. Similarly, the first
idea that comes to mind from memory could easily result
from our cognitive processes rather than being a true
reflection or record of reality. Therefore, we must strive to
become more aware of how our minds can distort reality.
Thinking critically (or using sound reasoning) is not as
simple as it seems. All of us harbor deeply ingrained habits
that influence our judgment of people, events, situations,
and issues.
How Our Brains Short-Circuit Our Logic
Thinking logically, like any variety of thinking, uses our
everyday cognitive processes and systems. In brief, these
include attention, memory, judgment, as well as decision-
making and problem-solving. Beliefs, emotions, fallacies,
biases, and heuristics all affect our cognitive processes.
We have to be realistic about this, but not unduly
pessimistic. Our judgment and decisions will tend to be
pushed in different directions by distorted perceptions and
memories.
Fortunately, we can reduce these tendencies by knowing
what the distortions are and how they work. We can
compensate for them, but firstly, we should define and
explain each of the key terms.
Beliefs
Beliefs are an important part of human life. We all hold prior
beliefs about things, people, and ideas, and one generation
passes them on to the next via social learning. Sometimes,
we believe what we want to believe despite evidence against
it; we can refer to this as wishful thinking 8.
So, where do erroneous beliefs come from? Our brains do not
intend to deceive us, but knowing the truth is not always
their main concern. Erroneous beliefs are a byproduct of the
psychologically adaptive process of social learning 9. Social
learning supports many useful tasks, such as learning our
native language. As social creatures, we need social cohesion
and shared experiences, and we start paying attention to
other humans (and potentially learning from them) as
infants 10. So, it is only natural that we are so open to
acquiring ideas directly from others, especially those we
trust.
Second-hand information has great potential to lead to false
or distorted beliefs. Humans love to tell good stories, and the
storyteller may highlight certain aspects and ignore others,
either to make the story more entertaining or to emphasize
certain parts of it 11.
In turn, prior beliefs can lead to biased perceptions of
people, objects, and events, thereby affecting future
perceptions and experiences. People can then pass these
biased beliefs onto others. This may remind you of the
children’s game Telephone or Chinese Whispers, in which
one person whispers a verbal message to the next along a
long line. The original message disappears by the end of the
game.
Another aspect of our beliefs is that we tend to believe what
we want to believe 12, and this includes our beliefs about
ourselves. We may adopt socially acceptable beliefs to avoid
being rejected by others 13. Like many of our psychological
tendencies, there is nothing wrong with this, but at times it
could obstruct our critical thinking.
Emotions
Social emotions such as trust and the desire for acceptance
can affect what we believe, but emotions have huge effects
on cognition. Psychologists have documented mood-congruent
effects in memory and attention [14, 15].
This means that people tend to notice and remember
information that fits with their current mood; you may
observe this phenomenon casually in everyday life now that
you are looking for it. For example, when somebody feels
joyful, they might notice beautiful scenery or enjoy a good
meal more than when they are in a neutral mood. Our
emotions, therefore, influence not only what information
goes in but also how our minds process it.
In controlled experiments, a scared or sad person is more
likely to perceive others’ faces as threatening or negative.
Someone experiencing a happy, exuberant mood is more
likely to label faces as friendly. The first person might be
more likely to recall unpleasant events from their own life,
whereas the second would recall more happy and joyful
experiences [16, 17].
This example illustrates that memory retrieval is an active
process; your memory is not like a library issuing you the
same memory every time. Instead, the cognitive system
reconstructs the memory each time 18.
Fallacies
The term fallacy often refers to commonly held false beliefs,
including some examples of folk wisdom. For example, many
people believe that more babies are born during the full
moon 19. In fact (verifiable, reliable fact, that is!), no more
babies are born on the full moon than during any other
phase of the moon.
False belief fallacies can affect our reasoning processes if we
assume that pieces of received wisdom are true without
examining them in more detail.
Fallacies also include logical fallacies, which are errors of
reasoning such as the non sequitur. To reason
properly, we must make sure that our conclusions follow
logically from our arguments’ premises. The study of logical
fallacies has a lengthy history, and there are many of them
[20].
Heuristics And Biases
Biases are another important feature of the cognitive system
that affects how our brains absorb and process information.
Attentional biases send our attention towards or away from
certain things. We also experience unintentional biases in
decision-making and judgment.
An interesting bias to note is egocentric bias, in which
egocentric means ‘towards the self.’ People consistently rate
their abilities as above average [21]. Can you see what is wrong
here? By definition, most people cannot perform above
average, because at most half of any group can sit above its
middle value. Scientists have observed this effect in all sorts
of situations, from self-assessed leadership qualities to the
likelihood of getting cancer, and in all sorts of people, from
students to professors [22].
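To see the arithmetic behind this point, here is a minimal sketch with made-up test scores (the numbers are invented purely for illustration):

```python
import statistics

# Made-up test scores for a group of ten people (illustrative only).
scores = [55, 60, 62, 65, 70, 71, 75, 80, 85, 90]

# The median is the middle value of the group.
median = statistics.median(scores)  # 70.5 here

# Count how many people score strictly above the median.
above = sum(1 for s in scores if s > median)

# By definition, at most half of any group can exceed its own median,
# so "most people are above average" cannot be literally true.
assert above <= len(scores) / 2
print(above, "of", len(scores), "score above the median")
```

Whatever scores you plug in, the assertion holds: a majority cannot sit above its own group’s midpoint.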
Biases are important to understand here because they can
lead directly to fallacies, and they may also support
erroneous beliefs.
Heuristics are mental shortcuts [23, 24]. One example is the
availability heuristic, in which we use the most easily available
information to solve a problem. We could also call heuristics
rules of thumb: quick methods for solving problems without
sitting down to do a lot of math or logic. Heuristics are useful,
but they yield approximate answers and can lead us to get
things wrong [25].
In some situations, heuristics can lead to systematic biases.
This is a serious issue for critical thinking because people
then ignore other relevant information.
So why do we have heuristics? During evolution, our ancestors
needed fast, efficient solutions to problems, not precisely
correct ones. They succeeded often enough to survive;
otherwise, they would not be our ancestors. As a result, our
brains happily hold on to errors as long as what we know works
well enough in practice, even when our daily experience
sometimes contradicts the heuristic.
Examples: Why They Are A Problem
Emotions can cause problems when they interfere with
logical reasoning. Think about aphorisms that we use in
speech every day. For example, nobody wants to ‘let their
heart rule their head,’ but perhaps they want to follow their
‘gut instinct.’ Which is right?
The answer is not straightforward. Our emotional state can
make a huge difference in how we perceive and interpret
incoming information. Moods are transient, so we regard
decisions and conclusions made under highly emotional
conditions as unreliable. However, emotions are far more
vivid to us than cold reasoning processes 26. Accounting for
their influence is, therefore, a difficult task.
Fallacies of logic can occur without awareness, or they can be
used deliberately as a manipulative tool. Both can get in the
way of us knowing the truth. Fallacies are a particular
problem because we do not reason things out in isolation.
Often, the outcome of one reasoning process feeds into the
next.
For example, a CEO might use critical thinking to work out a
plan to expand into a new business area, beginning by
figuring out which products would work best, then moving
on to selecting a team to lead the new venture, then on to
planning the expansion in more detail. If they fall into
logical fallacies in the project’s initial stages, the decisions
may not be optimal, putting later stages at risk.
Cognitive biases and heuristics can make us believe the
impossible. For example, given specific probabilities of two
events, logic enables us to work out both events’
probabilities. If event A is 50% likely to happen, and event B
is only 10% likely to happen, logically, you would expect
people to realize that the two together are even less likely
than event B.
However, this is not what happens. Even medical
professionals judging the likelihood of symptoms make this
mistake 27. This example shows how pervasive our cognitive
biases are and that they sometimes happen without
awareness 28. Biases like this can easily affect people’s
thought processes without them realizing it, leading to
unfortunate consequences like bad investments and
misdiagnoses.
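The probability logic above can be sketched in a few lines. The figures are the hypothetical 50% and 10% from the example, not data from any study:

```python
# Hypothetical probabilities, chosen only for illustration.
p_a = 0.50  # event A is 50% likely
p_b = 0.10  # event B is 10% likely

# If the two events are independent, the chance of BOTH
# occurring is the product of their probabilities.
p_both = p_a * p_b  # 0.05, i.e. 5%

# Even without independence, a conjunction can never be more
# likely than its least likely component.
assert p_both <= min(p_a, p_b)
print(f"P(A and B) = {p_both:.2f}")  # prints P(A and B) = 0.05
```

Judging the conjunction as more likely than event B alone, as people routinely do, therefore contradicts basic probability.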
Top Ten Brain Twisters
In our business of separating sense from nonsense, certain
fallacies are particularly relevant. We need to watch out for
these and, where possible, be careful not to commit them
ourselves in everyday debates and discussions. Some of these
are quite subtle; others are obvious once you know what to
look for.
1. Ad Hominem Fallacy
Ad hominem is Latin for "to the person"; in practice it means
attacking the person rather than their point or
conclusion [29, 30]. You might witness this fallacy in a political
debate.
For example, one politician argues passionately against a
new shopping mall in the town, but their opponent points
out that they live in that town and the new mall would bring
a lot of extra noise and traffic to the area. The opponent
argues that the first politician is therefore concerned for
themselves, not necessarily for the residents.
Here, the first politician described a concept, but the other
proceeded to attack the first as a person, ignoring the
debate’s topic. Attacking the opponent is not an effective
way to argue against their idea, so we describe ad hominem
as a fallacy. Like the other factors described here, this fallacy
can lead to divergence from important topics. People
sometimes use it deliberately to divert attention and
discussion away from certain topics.
There are two types of ad hominem 31. The circumstantial
variety is when a source is speaking hypocritically, and
somebody else points it out. This type of ad hominem may
constitute a legitimate argument, but it is still a logical
fallacy. The second variety is abusive ad hominem, where
somebody uses another’s personal traits to attack their idea,
where the traits are unrelated to the idea.
In practice, ad hominem rebuttals are not always irrelevant.
Let us think about a political debate. One politician attacks
the other’s personality or life choices. But what if these are
relevant to the argument?
This example illustrates circumstantial ad hominem: the
opponent points out the first politician’s hypocrisy. Suppose
the first politician had no obvious self-interest in canceling
the new mall. In that case, the opponent could still attack
them to convince the populace that they were not
trustworthy and discredit their opinion. This is abusive ad
hominem, a fallacy we should certainly try to avoid.
2. Hasty Generalization
Hasty generalization is another important fallacy that we
need to understand. It means jumping to a conclusion based
on too little evidence. A more technical definition is
generalizing from a sample or single instance to a whole
population. However, the sample may be too small or not
representative of the general case.
Imagine a friend saying:
“My Grandpa lived to be ninety-six years old, and he drank a
bottle of whiskey every day of his life!”
Unfortunately, Grandpa does not prove that alcohol is a
recipe for a long and healthy life. This anecdote, a single
example, does not outweigh decades of medical evidence.
Generations of thinkers have described this fallacy. Aristotle
discussed it first, followed by many more scientists and
philosophers. Alternative names for hasty generalization
include faulty generalization, the fallacy of accident, the
fallacy of neglecting qualifications, and many others 32.
Hasty generalization is easy to commit. People under
pressure in busy jobs, seen as authorities on the topic at
hand, might mistakenly draw conclusions too early. Hasty
generalization can also lead to wrongly assuming that every
instance is the same, based on one or two examples. It can
also lead to people ignoring situations where their
conclusion is false. In the example of Grandpa and his
whiskey, the speaker focuses on the single example at the
general case’s expense.
You can see how hasty generalization could become a serious
problem and prevent us from getting to the truth.
3. Bandwagon Fallacy
The bandwagon fallacy means falling into the trap of
thinking that the majority is always right. People commit
this fallacy when they agree with the majority without
seeking further information 33.
A classic psychological study by Solomon Asch revealed that
many people will agree with the majority opinion even when
they can see that the majority is wrong [34]. The experiment’s
task was shockingly simple: participants had to choose the
longest of a few lines of clearly different lengths.
The experimenters put individual participants in groups with
fake participants, and all the fake ones chose a line other
than the longest line.
Asch’s study showed that many people agreed with the
majority but then expressed concern and confusion because
the majority gave the wrong answer. The experiment put
people into an unnatural situation, but we can also see the
bandwagon effect in real-life scenarios.
In real life, the majority opinion is often fine, and we can
choose to follow it without dire consequences 35. For
example, most people would agree that dogs make good pets
and rhinoceroses do not. Choosing a pet is a relatively benign
decision, though.
In contrast, turbulent environments lead to more copying;
the correct path is harder to discern in more ambiguous
situations 36. Think about how this relates to a high-
pressure business environment, where the situation may be
highly complex, to begin with, and changes rapidly. In these
situations, organizations follow each others’ business
decisions more than in a calm and stable business
environment 37.
People and organizations jump on bandwagons for many
reasons. They may genuinely believe it is the best option, or
they may see others they admire jumping on the same
bandwagon, which gives that choice more credence 38.
However, the bandwagon effect is a failure to apply logic and
our own experience. Information about the majority’s
opinions and choices is easy to obtain and quick to process,
but the majority is not always right. Even the majority opinion
of a group of experts is not always correct.
4. Straw Man Fallacy
People use the straw man fallacy to influence others. It
involves changing somebody's point or argument to set up
an easy target, then knocking it down using your own
arguments 39. It is the logical equivalent of slapping the
person standing next to your opponent.
Straw man arguments are extremely common. Here is an
example. Two politicians hold a public debate a couple of
weeks before a local election. Sam McAdams makes an
announcement:
"We will not invest in waste disposal in the city. Instead, we
will reorganize our facilities and raise efficiency by 200% in
my first two months in office."
The crowd cheers, but the opponent has something to say.
"I cannot believe that Mr. McAdams proposes scaling back
the workforce at Waste Disposal!"
The debate proceeds; McAdams tries to point out that he
never suggested getting rid of staff.
People may deliberately set up a straw man to knock it down,
and it can make a big difference. In this example, the
opponent set up the straw man to get McAdams to discuss a
different topic. It certainly steered the debate and probably
had a significant effect on the spectators.
5. Confirmation Bias
Confirmation bias is a bias towards information that
confirms what we think we already know. Take this example:
Jayshree firmly believes that all Hollywood actors over 30
years old have had cosmetic surgery. Every time she sees
somebody whose face looks smoother than last year, she
points it out to her friends.
What do you think Jayshree says when she watches a movie
and the actors look no different? Nothing, of course. It is
unremarkable that the actors have aged normally. Jayshree
notices evidence that supports her belief, but she is oblivious
to the evidence against it.
Confirmation bias is extremely common, affecting what
information we notice and what information we seek out 40.
People have a strong tendency to seek out information that
confirms their beliefs due to a strong desire to maintain
those beliefs 41. Returning to our example, Jayshree might
search the internet for ‘celebrity plastic surgery’
information, but she would not be looking for information
on who has not had plastic surgery.
When faced with a message, beware of confirmation bias. It
is similar to wishful thinking: sometimes we believe what we
want to believe, and evidence supporting what we believe
grabs our attention.
6. Anchoring
Anchoring occurs when we over-rely on the most prominent
feature of a situation, person, or object; this may be the first
piece of information we encountered or the information that
we feel is most important. Anchoring strongly affects our
judgment and estimation [42]. Anchors are mainly numerical.
For example, someone taking out car finance might choose
to focus on the interest rate, displayed in large figures on the
website, rather than processing additional information.
Anchoring biases not only our judgments but also our
estimates. If you go to a car showroom, you may have room
to negotiate. Nonetheless, your mind anchors your initial
offer around the price quoted on the window. This is known
as anchoring and adjustment: the first number we see biases
our subsequent thinking [43, 44].
Psychology experiments show that different anchor points
can lead to vastly different decisions. Furthermore, the
anchor does not even need to be related to the question to
influence a person’s answer [45, 46]. This shows that anchoring
is pervasive and, to some extent, automatic.
Anchoring is sometimes also known as a heuristic, and it
does enable our minds to take a shortcut and stop processing
more information. However, it is sometimes automatic and,
at other times, more conscious 47. Automatic anchoring is
more like a suggestion: the anchor primes somebody’s
estimate or choice by activating similar numbers or ideas in
the mind, and the person experiencing this may not be aware
of it.
On the other hand, deliberate anchoring is when you
consciously adjust your initial estimate to get closer to the
real answer. This process is more controlled, but people
typically stop adjusting too early, meaning the anchor still
biases their final response. We are more likely to stop
adjusting too early if we are under time pressure or are
multitasking [48, 49].
7. False Consensus
This bias comes from social psychology, the study of
personality and social interaction. False consensus focuses
on how we see ourselves relative to other people. Like the
arsonist who might have once said, 'Well, everyone loves to
set fires, don't they?', we overestimate how common our
actions or traits are in the general population.
This bias emerges when people hear about other people's
responses [50, 51]. Whether we read others’ answers to a set of
questions or hear about decisions made in a scenario, we see
other people's responses as more common and typical when
they match our own. Conversely, we see others' responses as
strange and uncommon when they diverge from our own.
False consensus effects are larger when the question is more
ambiguous. One study asked people specific questions like
‘are you the eldest child?’ and more general questions like
‘are you competitive?’ The study reported a much more
pronounced false consensus effect with more generic
questions 52. This provides more evidence for the effect and
suggests that when people have more room to interpret the
question in their way, they perceive others as more similar.
8. Halo Effect
The halo effect is not about angels; think about the type of
halo you see around a streetlamp in the mist. This bias
occurs when something is seen positively because of an
association with something positive, like the light from the
streetlamp spreading out as it refracts through the mist
particles. You could call this the ‘glory by association’ bias.
We all know that first impressions matter in our
relationships. This bias is part of that. Our initial
impressions of people and things can create a halo, overly
influencing what we think of them.
When people have to rate others on positive criteria like
competence or intelligence, their ratings are influenced by
how warm and friendly they seem to be 53. The halo effect
even occurs for traits we know are unrelated, such as height
and intelligence.
As you can imagine, the same applies to objects and ideas.
Companies like to use beautiful people and scenery in their
adverts and promotions because this gives potential
customers a positive impression of the company and the
product.
9. Availability Heuristic
The availability heuristic affects us when we have to judge
probability or frequency 54. We assume things we can
imagine or recall easily are more common or more likely.
Another way to conceptualize this is to assume that the first
things we think of are the most important [55, 56].
You can see how the availability heuristic can be useful.
When deciding where to take a vacation, your first thought is
more likely to be somewhere you want to visit rather than an
obscure destination you have barely heard of. The desired
destination is more available in your memory, as well as
more vivid.
This heuristic draws on several characteristics of human
memory [57, 58, 59]. Firstly, the recency effect: we have better
memories for recent events or things we have seen or heard
recently. Secondly, we remember things that make us feel
emotional. Finally, we recollect personally relevant and vivid
information far better than dry, boring stuff. Any of these or
all of them together can create high availability.
The opposite is also true. If you cannot think of many
instances of something, you will think it is less common or
less probable. When researchers asked participants for a very
large number of advantages of something, such as their
college course, the participants found it hard to think of
enough. These students then rated their course as worse than
those who had to
think of fewer advantages 60.
This example seems paradoxical at first, but not when you
think of it in terms of availability. The course’s positive
aspects felt less common to those who were asked for more
because they could not think of the full set of advantages
requested. This illustrates how the availability heuristic
could be a problem, depending on questioning techniques.
If we can call examples to mind easily, we think events are
more likely to have happened before or to happen again in
the future. For instance, people worry that terrorist attacks
are possible or even probable. A young graduate’s family
warns them against moving to New York, Paris, or London
because of 'all the terrorists.' These attacks are readily
available to people's minds, so they feel that attacks are
more likely than they are.
Availability is a useful heuristic because it allows us to make
rapid judgments and decisions. People are more influenced
by availability when they process information quickly and
automatically, for example, when feeling happy or distracted
61.
10. Representativeness Heuristic
The representativeness heuristic happens when we judge
things according to how similar they are to things we already
know about 62,63.
Representativeness operates when we have to judge
probabilities. Specifically:
Categorizing items: the probability that this object, person, or
situation belongs to a given category.
Origins: the probability that this event comes from a given
process.
Projections: the probability that a given process will generate
an event in the future.
Here is an example of the representativeness heuristic in
action. Someone says:
"Ryan wears glasses, so I think he is a computer scientist
rather than a farmer."
The speaker has a stereotype of a typical computer scientist
in their mind, and Ryan fits that stereotype on one criterion.
Hence, they categorize him as a computer scientist (see 64
for the original example).
When they use the representativeness heuristic, people are
often extremely confident, although a vague impression,
rather than a range of evidence, determined their choice.
They have not considered the percentage of farmers and
computer scientists in the general population, for example.
Therefore, this heuristic is likely to lead to incorrect
conclusions at times, and it probably fuels some of our
human failings, such as prejudice and discrimination.
How To Un-Bias Your Brain
Beliefs and Emotions
The first step in dealing with erroneous beliefs is to use
logic: look at what else must be true for the stated belief to
be true 65.
Next, realize that some beliefs are simply false, and you can
prove this by finding evidence against them. You can easily
disprove a friend who firmly believes that Martina
Navratilova has won the most Grand Slams in women’s
tennis history. This type of belief is a mistaken fact, so you
can assess it the same way you would assess a purported fact
or an unsubstantiated claim. You can perform a quick web
search and find the correct answer from a reliable source,
such as the awarding body (the Women’s Tennis
Association).
In contrast, other beliefs are not falsifiable. These may be
acceptable on their own but at the same time incompatible
with other beliefs. In this situation, you need to assess each
belief’s veracity and arrive at a new understanding.
For example, people who believe the Earth is flat must also
hold other, consistent beliefs. They must believe that
modern-day transport companies lie about the distances
involved in traveling close to the North and South Poles. If
the world were flat, one of these poles would be at the
center, and the other would be a loop around the edge of the
world. Logically, you would only need to disprove one of the
beliefs to falsify the whole set.
Emotions are potentially more difficult to deal with. Instead
of suppressing or ignoring your own emotions, acknowledge
that they exist and can affect your judgment. This simple
change can remove some of their power and help you avoid
falling into the trap of rationalizing 66.
Discerning emotions from facts is the main way you can
avoid getting side-tracked by your core affect and by the
emotional content of messages and communications. Refer
back to Chapter 1 for more details.
Fallacies, Biases, And Heuristics
To combat these three, we first need to acknowledge that we
are human and our minds work in this way. There is nothing
intrinsically wrong with it. We can follow this with some
principles of critical thinking:
Examine the facts of the matter: Make sure you consider
everything that could be relevant and ensure that the
information is factual rather than beliefs or opinions.
Take a mindful approach: Realize that situations are
constantly in flux. Get the best information you can at the
time.
Question everything: Only make assumptions when the facts
are not yet available.
Compensate for biases: Sometimes this is straightforward,
other times less so.
Draw your own conclusions: Ideally, these should be based on a
full understanding of the available facts and in the light of
your own extensive experience.
To deal with the ad hominem fallacy, understand that
circumstantial ad hominem is sometimes a valid way to
critique a person if their circumstances or traits are relevant
to what they are arguing. Do not be tempted to commit it
yourself unless you are deliberately using it as a tactic to
throw off your opponent. Abusive ad hominem, on the other hand, converts
a discussion into an exchange of personal attacks. When you
encounter ad hominem arguments, point out that attacking
the person does not harm the idea and steer the discussion
back to the facts.
If you want to make better decisions, reject hasty
generalizations. Watch out for others jumping to conclusions
that are not justified. Make sure they specify any
qualifications. For example, a source may tell you that
mobile advertising always generates revenue. Can this be
true? There are probably some hidden qualifications here;
perhaps they mean that mobile advertising usually generates
revenue when targeted at the right customers. Note the
change from ‘always’ to ‘usually’: absolute words like
‘always’ can cue you to other people’s over-generalizations.
Avoiding the bandwagon fallacy appears easy, but it does
involve more work than simply accepting what you see or
hear, like all of our critical thinking principles. Be alert and
process the information actively, not passively. Remain open
to alternative information and solutions, embrace alternative
perspectives, and realize that the majority can easily be
wrong. Keep scanning for new information, see information
and evidence in context, and do not be tempted to
over-simplify.
Remember that the bandwagon fallacy, and many of those
discussed here, result from mental shortcuts. A mindful
approach helps to compensate for this and can be extremely
helpful 67,68.
The straw man fallacy is fairly easy to recognize. Be alert to
the speaker’s arguments or claims. Watch out for somebody
restating an argument in their own words. Have they added
or omitted anything? Have they changed the argument?
Combat the influence of the availability heuristic and
confirmation bias by going beyond your first thoughts and
encouraging others to do so as well. Acknowledge that your
initial idea could be correct, but search for evidence that
disproves it. Keep multiple possibilities in mind. This gives
you a firmer foundation going forward.
When dealing with numerical data, watch out for anchoring
effects. Ensure you have enough time to perform calculations
in full to avoid mistakenly anchoring your final answer to an
interim solution.
Be aware that completely unrelated values can easily anchor
numerical estimates. Moreover, people and other sources
may intentionally attempt to anchor your responses,
numerical or otherwise. A mindful approach helps in these
situations, and you could even try to re-anchor your
estimates. For example, in a financial negotiation where the
other party has suggested an initial figure, you could
contemplate a very different figure (even if it is only in your
mind) to compensate for the possible anchoring effect.
Anchoring may explain over-optimism in complex projects
as we may overestimate our chances of success based on
success in the early stages. Accept that your best estimate
may still be wrong, and follow a process of critical thinking
by assessing new information and evidence in full and
integrating it into your ongoing projects.
False consensus affects how we perceive other people and
their opinions and how they perceive us. Try to avoid making
assumptions about other people, and if others make
incorrect assumptions about you, point it out politely.
In the professional world, false consensus at an
organizational level could lead to an array of problems.
Suppose managers at a company assume that their
competitor companies all work in the same way as their own.
In that case, they could miss innovative opportunities by not
absorbing different ways of working. Conversely, they may
not realize when they have a competitive advantage they
could exploit. It is worth taking the time to investigate the
facts of the matter at hand.
Overcoming the halo effect means being objective in our
judgments. Treat separate elements as separate elements:
for example, a beautifully painted scene on the side of an
old, rusty car should not (of course) make you think it is a
good car. Be aware that a good experience of something has
a lot of influence, sometimes more than it should. Remember
not to let the halo effect blind you to things (or people)
becoming worse over time. Assess things individually, on
their own merits.
The availability heuristic could be particularly problematic
for critical thinking. When we try to think rationally, we
must go beyond the obvious and examine situations in detail.
The availability heuristic pushes us in the opposite direction,
but we can push back. For example, if you have to make a
difficult decision or judgment call, write a long list of pros
and cons. This will help you focus on the whole situation
rather than allowing the most available information to hijack
your decision-making.
The representativeness heuristic can be tricky to address
because people are so confident in their judgments. To
prevent yourself from falling into it, find out the base rate of
whatever you are judging. In the example given earlier, the
speaker judged someone to be a computer scientist rather
than a farmer because he wore glasses. A more reliable way
to decide would be to look at what percentage of people work
as farmers and computer scientists and then choose the
most common.
In general, finding out the base rate or baseline figures is an
extremely useful way of dodging the representativeness
heuristic when judging which category something belongs to. When
working with processes, you can gather more data or repeat
a process several times and observe the outcomes: a larger
sample size is more likely to give you a truly representative
result.
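To see how a base-rate check can overturn a stereotype-driven judgment, here is a minimal sketch of the farmer-versus-computer-scientist example. Every number below is invented purely for illustration; the point is only that a more common occupation can remain the better guess even when the stereotype fits:

```python
# Hypothetical base-rate illustration: all numbers are made up.
# base_rate: assumed share of the workforce in each occupation.
# p_glasses: assumed chance that someone in that occupation wears glasses.
base_rate = {"farmer": 0.02, "computer scientist": 0.01}
p_glasses = {"farmer": 0.40, "computer scientist": 0.60}

# Bayes' rule (unnormalized): P(job | glasses) is proportional
# to P(job) * P(glasses | job).
scores = {job: base_rate[job] * p_glasses[job] for job in base_rate}
total = sum(scores.values())
posterior = {job: s / total for job, s in scores.items()}

for job, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(f"P({job} | wears glasses) = {p:.2f}")
# Even though glasses fit the computer-scientist stereotype better,
# the higher assumed base rate makes "farmer" the more probable category.
```

With these made-up figures, "farmer" comes out ahead despite the glasses, which is exactly what the representativeness heuristic causes us to overlook.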
Why You Should Never Be Afraid To Change Your Mind
The biases and heuristics discussed here make up a tiny
fraction of those that philosophers, psychologists, and
economists reckon our brains work with. Remembering that
these are a natural feature of how our minds work and can be
somewhat automatic, we need to challenge beliefs and ways
of thinking without feeling threatened or making others feel
threatened.
In logic, true premises and valid logic lead to a true
conclusion. If the ‘facts’ are wrong, the argument is not
valid. The conclusion may still be true by coincidence, but
the argument given does not justify that conclusion. All the
factors listed in this chapter, and many more, can lead us to
misapprehend the facts.
An argument’s premises may even be assumptions rather
than facts, but this results in a weaker argument. An
assumption is something presented as factual but without
evidence. Sometimes assumptions are the best we can get in
a given situation. For example, there are times when we
simply cannot find any evidence one way or the other.
Scientists and other innovators run into this issue fairly
often. Even so, you should have less confidence in a
conclusion derived from assumptions and gather evidence to
support or reject the assumptions where possible.
Psychologists first defined the term ‘cognitive dissonance’ in
the 1950s 69. It has been a popular idea ever since and has
gained general acceptance 70. When we come across new
information that does not fit our current idea or conclusion,
our brains work to make it fit or reject the new information.
When contradicting beliefs lead to cognitive dissonance, our
minds may try to hold on to both beliefs by
compartmentalizing them: our minds keep the beliefs apart,
so we do not experience the conflict and can ignore it 71.
Instead, try inspecting the beliefs and see whether one or
both need to be updated based on additional evidence.
Note that cognitive dissonance is a rationalizing process. Our
brains find it hard to hold contradictory information, so
facts may get distorted or ignored in the effort to return to a
state of equilibrium.
When working things out for yourself, reduce dissonance by
focusing on the process of logic rather than the conclusion
72. Make sure the logic is sound. As long as all the premises
are true and the conclusion follows from the premises, you
can accept the conclusion that emerges from your reasoning
process.
A further psychological trait that makes people susceptible
to biases is that we have fairly poor intuitive math skills.
Even those with good academic and professional math skills
do not always apply them in everyday reasoning 73,74.
For instance, imagine a lottery that draws five numbers from
a pool of fifty. One week, the winning numbers come out as
1, 2, 3, 4, and 5. What is your first reaction to that result?
Highly improbable, some people would say. Perhaps so
improbable that it suggests cheating.
These lottery numbers are just as likely as any other
combination of five numbers drawn randomly from the set.
However, they do not match our expectations of randomness;
people expect to see numbers scattered across the range. It would be
far more surprising if the same set of numbers - any
numbers - came up three weeks in a row.
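The arithmetic here is easy to check. A short Python sketch counts the possible five-number draws from a pool of fifty; the set 1 through 5 has exactly the same chance as any other specific set:

```python
from math import comb

# Number of distinct ways to draw 5 numbers from a pool of 50.
n_draws = comb(50, 5)
# Probability of any ONE specific set, e.g. {1, 2, 3, 4, 5}.
p_specific_set = 1 / n_draws

print(n_draws)         # 2118760
print(p_specific_set)  # about 1 in 2.1 million -- the same for every set
```

The draw only *feels* suspicious because 1-2-3-4-5 looks like a pattern, not because it is any less probable.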
In conclusion, you should never be afraid of new information
and of changing your mind. Be open-minded, allow growth
in yourself, account for all the factors discussed here, and
share your critical thinking skills with friends and
colleagues.
Other Cognitive Factors That Affect Our Thinking
In addition to the factors already discussed, other features of
our cognitive system and the information itself can affect
our thinking.
Pattern Recognition
Our brains are incredibly good at recognizing patterns.
People often perceive faces in facelike configurations, like
the Man in the Moon, known as visual pareidolia. A large
area of our visual brain is dedicated to face processing, so it
is not surprising that we perceive them even when they are
not there 75.
Pareidolia is automatic: people do not try to see these
patterns; they just do 76. You have almost certainly had this
experience. Countless internet memes show objects like
houses and cars that look like faces. Sometimes it can take a
few moments for the pattern to resolve itself into the image.
Still, other times it strikes you straight away, and it is
difficult or impossible to go back to see the image as a less
meaningful pattern.
Pareidolia can occur in other senses: hearing Satanic
messages in music played backward or ghostly voices in
radio static.
Automatic pattern perception reflects tendencies similar to
those behind optical illusions, such as flat images that
appear three-dimensional. These are not just fun and games. Both pattern
recognition and false perceptions could lead to false beliefs,
and people can and do seek information to support them.
In summary, our brains are incredibly good at recognizing
patterns yet poor at statistics 77. We regularly perceive
meaning in random stimuli.
Missing And Hidden Data
Information you are not aware of sometimes affects
whatever you are trying to reason about. The danger is that
missing data could be crucial; if you had it, your conclusion
or decision might be completely different.
In medical trials, missing data is common. For example, in a
study of patients who have had a stroke, clinicians might not
be able to get data for all their research questions from all
the patients. Some would be unable to complete certain
tasks, whereas others would. One way to account for this is
to use statistics to fill in the gaps, such as replacing missing
data points with the average value from all the other patients 78.
Researchers plan their clinical trials in great detail, usually
building in methods to compensate for missing data. You
could consider doing this for your projects where applicable:
plan how to compensate for unobtainable data.
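As a minimal sketch of the "fill in the gaps" idea, here is mean imputation, one of the simplest such methods. The scores below are invented, and real clinical trials typically use more sophisticated techniques, but the principle is the same:

```python
# Mean imputation: replace each missing value (None) with the
# average of the observed values. Data are made up for illustration.
scores = [12.0, None, 9.5, 11.0, None, 10.5]

observed = [s for s in scores if s is not None]
mean = sum(observed) / len(observed)  # 43.0 / 4 = 10.75

imputed = [s if s is not None else mean for s in scores]
print(imputed)  # [12.0, 10.75, 9.5, 11.0, 10.75, 10.5]
```

The filled-in list keeps every participant in the analysis at the cost of slightly understating the data's true variability, which is why researchers plan their missing-data strategy in advance.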
Additionally, hidden information might result from
confirmation bias when people ignore or fail to report
occurrences that disprove an idea 79. Discovering these
occurrences is vital if we want to undo the confirmation bias.
Regression To The Mean And The Hot Hand
Although this is not a math test, you should be aware of
regression to the mean. This is neither a fallacy nor a bias
but a characteristic of data. Regression to the mean occurs
when somebody repeatedly takes a test or performs a task. A
very high or low score, an outlier, may occur, but then the
data goes back towards the previous average 80,81,82.
This phenomenon explains why a great year for a sports
team is more likely to be followed by a worse year than by
another great year. Performance improvements can and do
occur, but we cannot judge a single great year as though it
reflected an average improvement. Excellent performance is
a combination of baseline ability and random good luck 83.
Regression to the mean can have interesting effects in the
real world. One scientist worked with military flight
instructors, one of whom reported that when he praised a
cadet’s performance, they usually did worse the next time 84.
The instructor thought that praise made people worse at
flying airplanes. However, their particularly good flight was
an outlier, resulting in the cadet regressing to the mean on
their next performance.
Similarly, extremely poor performance is more likely to be
followed by an average performance than by another dismal
one.
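A small simulation makes this concrete. Assuming, as above, that each performance is a fixed baseline ability plus random luck (the particular numbers here are arbitrary), outstanding first attempts are almost always followed by something closer to average:

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

def performance(skill):
    # One attempt = fixed skill + random luck.
    return skill + random.gauss(0, 10)

skill = 50
first = [performance(skill) for _ in range(10_000)]
second = [performance(skill) for _ in range(10_000)]

# Look only at attempts whose FIRST score was an outlier (top 5%).
cutoff = sorted(first)[int(0.95 * len(first))]
pairs = [(f, s) for f, s in zip(first, second) if f >= cutoff]

worse = sum(s < f for f, s in pairs) / len(pairs)
print(f"{worse:.0%} of outstanding first attempts were followed by a worse second one")
```

Praise and criticism play no role in this simulation, yet the apparent "decline" after a great attempt shows up anyway; that is regression to the mean at work.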
Generally, we have low awareness of the regression to the
mean effect because we fail to account for chance as much as
perhaps we should. Regression to the mean also feeds into
some of the biases and heuristics already discussed 85. The
next fallacy illustrates a similar point.
The hot hand fallacy is the belief that, following a good
performance, subsequent attempts will also be successful.
Commentary on team sports like basketball sometimes cites
this fallacy 86.
It is related to regression to the mean: regression to the
mean is the real situation, whereas the hot hand fallacy is
what people think will happen. It is also related to
confirmation bias: people notice when the hot hand effect
happens but do not notice when it does not 87.
The hot hand fallacy can also apply to casino games, which
players sometimes perceive as non-random. Casino players
also exhibit the opposite error: the gambler's fallacy, the
belief that because they have lost many times, a win is due 88.
These gamblers’ fallacies suggest that even when random
chance is the main factor affecting the outcome, people
persist in perceiving patterns. Imagine what our brains
might be doing when it is not so obvious that chance
determines the outcome!
Action Steps
Our brains do a great deal of information processing that we
are not always aware of. We are quite fortunate to have all
these short-cuts making processing more efficient. Try these
suggested exercises to explore these ideas further before we
move on.
1. Fantastic Fallacies, And Where To Find Them
Find a list of fallacies, biases, and heuristics in an online
encyclopedia or psychology website. Notice how many there
are. Read some of them and make a note of your thoughts.
You could look at things like:
Which ones might people be unaware of when they
encounter them?
Are they rhetorical devices used deliberately to
persuade? Or are they quite automatic?
Are they related to those we have talked about?
Can you think of examples from the media or from
your own life that fit the definitions?
How might you combat these in your own or others’
thinking?
2. Un-attacking The Person
Find an argument, such as a transcript of a debate, in which
someone uses the ad hominem or straw man fallacy
deliberately to divert the debate. Rewrite it (or part of it),
staying focused on the actual topic. Compare your version
to the original, perhaps show it to a colleague or friend, and
think about which version reaches a more logical outcome.
Summary
The story at the beginning of this chapter illustrates that
sometimes we get it wrong, even when we exercise good
critical thinking skills. Our cognitive
processes may be sophisticated, but they are also
economical. In the story, the parents believed they were
benefiting their son by playing Mozart because they believed
the high-profile research paper suggesting that Mozart
made people more intelligent.
The parents only read the initial research study on the
Mozart Effect. They did not follow it up: hasty
generalization. They did not realize that other scientists had
found it so hard to replicate the Mozart Effect. They fell into
confirmation bias by only noticing media reports praising
(and confirming) the Mozart Effect.
The halo effect may have operated too because Mozart is
generally accepted as one of the best classical composers. If
it had been an obscure composer, would the paper have
gained such a high profile? The population found it easy to
fall in love with the idea that Mozart's music was special in
yet another way.
Nor were the parents skeptical; if they had been, they would
have researched the effect for themselves rather than taking
it at face value. Scientists aim to be skeptical at all stages of
their workflow, from ideas to analyzing the data from
completed research. The next chapter elucidates scientific
skepticism in greater detail.
Takeaways
1. Our minds abound with fallacies, beliefs, emotions, biases,
and heuristics, all of which impact our perceptions and how
we process information.
2. These can have massive effects, so we need to remove
their effects if we want to reach solid conclusions and make
good decisions.
3. It may not be possible to overcome these biases induced by
our minds completely, but critical thinking can help.
3
WHY HAVING A SCIENTIFICALLY SKEPTICAL
MIND HELPS YOU DISCOVER THE TRUTH
Fifteen-year-old Alanna Thomas burst into tears and
buried her face in her hands.
“I’m so sorry,” she gasped. She looked up at the police
officer standing over her. “I did it, I did… I pushed him off.
I’m sorry...”
On the other side of town, local journalist Lin Rodriguez also
buried her head in her hands. She needed to get this article
finished, but the story was so complex. It was hard to know
what was real.
Two weeks prior, Mr. Gomez, a science teacher at Mildenhall
High School, had been found floating face-up in a flooded,
disused quarry. Lin remembered his classes. He was strict but
somehow still inspiring. She would never have studied
forensic sciences at college if it were not for Mr. Gomez.
Not everyone had liked him at high school, but Lin could not
imagine why this local academic had fallen to such a violent
death. Events like this did not happen in their small town;
the community was in shock. Naturally, the rumors began as
soon as the news broke. Murder? Suicide? Misadventure?
Nobody knew, but everybody was talking about it.
Lin’s boss sent her to the scene as soon as he heard, and she
interviewed the forensics team as they painstakingly
collected evidence. They had covered the body, but the lead
investigator told Lin that Mr. Gomez had some suspicious
bruises. They found two different sets of footprints around
the top of the cliff too.
The next day, further evidence came to light. A local man
told police he was walking his dog in the area the previous
night and had heard somebody making their way through
the undergrowth not far from the cliff. The area was
overgrown with brambles, and he could hear they were
having some difficulty. He reckoned this was not long after
Mr. Gomez had his lethal fall.
Lin asked around to find out who might know more. If it was
suicide, perhaps Mr. Gomez had expressed sadness or pain in
the days and weeks before his death. She questioned
colleagues at school and heard a few interesting morsels of
information.
Four separate people highlighted the same concern: a small
group of students appeared to have a rather nasty grudge
against this particular teacher. They even reported social
media threads detailing certain students’ fantasies about
playing nasty tricks on him, like keying his car or even
harming him personally. These groups consisted of students
the other teachers agreed were outcasts. One of the students
was Alanna Thomas, a shy girl who was a local attorney’s
daughter.
Lin investigated the social media posts and found several
distressing threads. Sure enough, ‘let’s kill Mr. Gomez’ came
up more than once.
The problem was, Lin just could not believe that any of these
disaffected children would murder their teacher. Priding
herself on her skepticism, she looked for and found an
alternative explanation.
Alanna Thomas’ father was aiming for a promotion: he
wanted to become a district attorney. Furthermore, a group
of powerful local business owners was firmly against this
idea. Mr. Thomas was a keen environmentalist, and
everybody expected his appointment to scupper their plans
to build a large power plant on the edge of Mildenhall.
Instead of an angry schoolgirl, it was surely more likely that
somebody had hired a professional killer to neutralize Mr.
Thomas by implicating his daughter in a murder case.
Besides, the child was the perfect stooge. She was known to
hold a grudge against her teacher and be a social misfit who
would crumble under police questioning.
As we have seen, that is exactly what happened. Alanna’s
tearful confession formed the backbone of the case against
her. She was easily tall and strong enough to have pushed
Mr. Gomez off the cliff while he was out walking his dog at
night, a habit which the whole town knew about.
Lin published her investigation. Following Lin’s article, the
police dropped their case. Officially, they concluded that
Alanna’s confession was unreliable and that there was not
enough other evidence to proceed. Mr. Gomez had fallen
into the quarry; it was a terrible accident.
On the same day, Lin received an anonymous email. It said:
“You should have listened. Gomez was murdered.”
The sender attached a high-resolution photograph taken
from the top of the cliff. The time and date were exactly
right, and so was the location data. The image showed
Alanna Thomas standing at the edge and down below the
body of a man face down in the water.
If Lin had been properly skeptical throughout her
investigation, events would not have taken such a dire turn.
She doubted the first explanation–that
Alanna had killed her teacher–so much that she came up
with an even less plausible alternative. She convinced herself
and others that it was true, even though her conspiracy
theory had less evidence to support it than the police’s
theory. Ultimately, the truth eluded everybody.
Skepticism is not simple cynicism. Skeptics keep an open
mind, doubting every explanation rather than believing they
have arrived at a final answer. Taking a skeptical approach
based on scientific principles helps us get closer to true
conclusions rather than settling for what we want to be true
1. Critical thinkers can guard themselves against being
misled into believing lies or mistaken information.
What Is Scientific Skepticism?
In general usage, skepticism refers to an attitude of doubt.
Skeptics in the media often criticize ideas they see as
unlikely, such as alien abductions or conspiracy theories.
Scientific skeptics are prepared either to believe or disbelieve
claims, depending on a fair analysis of the evidence.
Scientists, as natural skeptics, spend many years gathering
evidence before publishing their findings. What is more,
scientists never claim to discover the truth; they just update
the current understanding. There is no end to scientific
inquiry.
You can only apply a scientifically skeptical approach to
claims that are verifiable and falsifiable. ‘Verifiable’ means
that you can test the concept or claim 2. In the early 20th
century, European philosophers spent a lot of time thrashing
out what verifiability means. Something is verifiable if you
can find out that it is true by observing or measuring it.
Some philosophers include logical verification within this
definition, but scientists prefer to focus on claims that they
can test in the real world. These include whether tooth decay
predicts tooth loss or whether investing in education
correlates with an improved local economy. Science focuses
on things we can measure, usually quantitatively.
To be clear, you do not have to prove the idea for it to be
verifiable. For example, we can say that life may be
discovered in other solar systems in the future, although we
do not possess many methods for doing so at the moment.
This claim is verifiable because we would be able to travel
there in the future and observe whether life exists or not.
Current science measures planetary environments by using
telescopes to detect atmospheric chemistry, which can reveal
life conditions and may reveal chemical signatures of life 3.
An example of an unverifiable claim might be: ‘a child once
swallowed a whole bicycle wheel and survived.’ We cannot
verify this because we do not have the information to
identify the child. Anecdotal reports like this may simply be
mistakes or deceptions. Even if the source is reliable, they
may pass on unreliable information, so you should be
particularly skeptical about second-hand evidence 4.
Claims that we cannot verify are, therefore, not scientific.
Instead, we can call them ‘metaphysical’ in the case of
faith-based claims 5,6 or refer to them as beliefs or
unverified claims. The problem with verification is that we
can rarely verify anything with 100% certainty 7. That is why
falsifiability is so important and has somewhat eclipsed the
concept of verifiability in recent decades.
Falsifiability means that it is possible to disprove a claim or
proposal 8. For instance, if somebody stated that Robert de
Niro was born in New Zealand, you could falsify the
statement by obtaining evidence from birth records.
This criterion is more powerful than the verifiability
criterion because it is easier to find ways to disprove claims
than prove them. Falsification is the usual principle for
modern scientific research: scientists, rather than trying to
prove something, try to reject its opposite. Falsifiability also
helps scientists to maintain objectivity while conducting and
analyzing research.
Scientists engage in two broad types of research: observation
and experiment. Both of these have their merits and can be
used to investigate claims. Observational studies are
excellent for gathering initial evidence on a topic. In an
observational study, the scientists do not alter anything
about the situation they are studying. They simply record
data and categorize it to see whether two or more situations
differ.
For example, a psychologist might think that women spend
more money on sun protection cream than men in hot
weather. To conduct a study on this topic, they would
compare sun cream sales between men and women across
various temperatures.
However, they are not trying to verify their idea that women
spend more on sun cream. They would be testing the idea
that sales were no different to see whether they could falsify
it. If the money spent was different, they could conclude that
perhaps they were right and do more studies. If the money
spent were no different, they would have to accept that
result. Perhaps women and men do not differ in reality, or
perhaps some study aspect affected the results.
Observational studies can be very informative, but
experimental studies are more powerful. In an experiment,
the scientist controls as many variables as possible, aiming
to keep everything equal except the variable of interest.
As an example, you might want to design an experiment to
see whether giving salespeople bonuses led to more sales in
the following year. To do this, you would have to give some
people a bonus and others no bonus and measure their
performance. (Perhaps you could give the non-bonus group
a bonus later on).
Another key scientific principle is replicability. Do we find
the same result when we repeat the same research? If a
research result is a one-off, it is not reliable. The idea that
listening to Mozart made people more intelligent was
difficult to replicate 9, even though one study had found this
result 10.
One difficulty with adopting a scientific approach is that
people tend to prefer positive results. We prefer to draw
conclusions based on events instead of non-events,
reflecting our preference for meaningfulness over
randomness 11. Sometimes, people like a scientific result so
much that they ignore other results that falsify it, much like
confirmation bias 12,13.
Science uses observation and measurement; therefore it
mostly uses inductive rather than deductive reasoning.
Deductive reasoning is when you argue from the general case
to the specific case. If all penguins can swim, and Benjamin
is a penguin, it follows that Benjamin can swim.
Inductive reasoning flows in the opposite direction. You
argue from the specific to the general case. If every penguin
you see is black and white, you conclude that all penguins
are black and white. Seeing a single blue penguin falsifies the
statement.
In the penguin example given above, seeing a blue penguin
would force us to change our conclusion to ‘most penguins
are black and white, and some are blue,’ which would be
acceptable until we get further information. Perhaps
somebody discovers a new location filled with blue penguins,
tipping the balance so that we change our conclusion again,
now stating that ‘most penguins are blue.’
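The penguin example can be sketched in a few lines of Python. This is purely an illustration of our own; the sightings list and the function name are invented:

```python
# Inductive generalization: every penguin observed so far is black and
# white, so we provisionally conclude that all penguins are.
def all_black_and_white(sightings):
    """Return True while the universal claim survives the evidence."""
    return all(color == "black-and-white" for color in sightings)

sightings = ["black-and-white", "black-and-white", "black-and-white"]
print(all_black_and_white(sightings))  # True: the claim holds so far

# A single blue penguin is enough to falsify the universal claim.
sightings.append("blue")
print(all_black_and_white(sightings))  # False: the claim is falsified
```

Note the asymmetry: no number of black-and-white sightings can prove the universal claim, but one counterexample disproves it.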
Claims based on faith fall outside the domain of science because they are neither verifiable nor falsifiable. This does
not mean they are false, simply that we cannot investigate
them using the scientific method. We cannot label religious
beliefs as true or false.
Scientific anomalies like dark matter and dark energy are
somewhat similar in that people debate whether they exist
and how to explain them if they do exist. However, in
these cases, we can say that we are awaiting an explanation,
and scientific methods can potentially explain them (since
both matter and energy are core topics for physics).
How Critical Thinking And Scientific Skepticism Work Together
Scientific skepticism works together with critical thinking to
help us discern truth from non-truth. The techniques of
scientific inquiry involve examining all the evidence before
concluding. This process makes us less error-prone. Neither scientists nor skeptics are completely free of error and bias, but both aim to be as objective as possible.
As you might imagine, it is often impossible to examine all
the evidence. For example, a team of biologists cannot
dissect every single member of a certain species to examine
their inner workings, and social scientists cannot expect a
100% return rate for their questionnaires. In these cases,
scientists calculate how many individuals’ data points are
likely to give a fair representation of the entire population,
perhaps perform a smaller pilot study to check, and use a
sample of that size for their study.
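A common way to choose such a sample size is the textbook formula for estimating a population proportion, n = z²p(1−p)/e². Here is a minimal sketch in Python, with illustrative numbers of our own (95% confidence, a 5% margin of error, and the most conservative guess p = 0.5); this is not a formula from the text:

```python
from math import ceil

# Sample size needed to estimate a population proportion:
# n = z^2 * p * (1 - p) / e^2
z = 1.96  # z-score for 95% confidence
p = 0.5   # assumed proportion (0.5 is the most conservative choice)
e = 0.05  # desired margin of error

n = ceil(z**2 * p * (1 - p) / e**2)
print(n)  # 385 respondents, for a large population
```

Tightening the margin of error or raising the confidence level pushes the required sample size up quickly, which is why researchers balance precision against cost.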
When analyzing data, we have to beware of false positives
and false negatives. Like a diagnostic test for a disease, a
scientific experiment cannot be 100% accurate; chance
factors can intervene and create a rogue result that does not
reflect the reality of the situation.
A false positive result happens when an experiment
randomly shows that something does happen or does affect
something else, but the result was actually due to chance. In
contrast, a false negative result is when the effect is real but,
again by chance, did not show up in the experiment. The
danger in both cases is that investigators might accept the
false result and either miss something important or proceed
to investigate something unimportant 14.
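How easily can chance alone produce a false positive? A back-of-the-envelope calculation in Python; the coin-flip scenario and the 15-heads cutoff are our own illustration, not taken from the research cited:

```python
from math import comb

# Probability that a perfectly fair coin shows 15 or more heads in
# 20 flips - a result a careless investigator might read as evidence
# that the coin is biased (a false positive).
flips, cutoff = 20, 15
p_false_positive = sum(comb(flips, k) for k in range(cutoff, flips + 1)) / 2**flips
print(round(p_false_positive, 3))  # 0.021 - about 1 experiment in 50
```

Run the "experiment" often enough and such rogue results are guaranteed to appear somewhere, which is one reason replication matters.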
We do have ways to combat false results. We can repeat the
same experiment several times in different circumstances to
examine replicability. We can also use control conditions in
experiments to reveal what happens under slightly different
circumstances 15.
Applied sciences like medical research may use a placebo as a
control condition. A placebo is an inactive substance (or
other types of intervention) that resembles the active
treatment enough that people cannot tell them apart. The
researchers tell participants they may receive the placebo or
the active treatment. Due to ethical concerns around
withholding something that might benefit sick people, new
treatments are often tested against the current treatment.
Similar to the example given earlier, deciding not to provide
some salespeople a bonus as part of a research study would
be unfair.
Critical thinking creates a framework of doubt: like
scientists, we question things constantly and gather
evidence to draw more reliable conclusions. Like a scientist,
try to avoid falling into the trap of thinking you have
discovered the absolute truth. It is better to remain open-
minded and flexible and update your current understanding
by adding new evidence as you discover it. At the same time,
remember that some theories have better logic and evidence
to support them than others 16.
If you want to judge whether a theory is a good scientific
theory, apply the following principles:
Falsifiability: recall that claims that we cannot disprove are
not scientific, and we should label them as beliefs or
assumptions rather than theories.
Occam’s razor: if we have two or more competing theories,
we should use the simplest one (until we have more
evidence). Some thinkers state this as accepting the theory
that forces us to make the fewest assumptions. Also, theories
should not contain extra elements that make no difference to
the whole 17.
Explanatory power: a good theory should adequately explain
its subject better than competing theories. New evidence can
render it a better or worse fit for the data. A theory can also
be thrown out in the light of new evidence, in which case its
explanatory power drops to zero 18.
Predictive power: this refers to the theory’s ability to generate
testable predictions. For instance, in chemistry, the periodic
table of the elements predicted elements that did not exist
yet, which scientists then discovered or made in the lab.
Similarly, Darwin predicted a pollinating moth for a
specifically shaped flower, and botanists later discovered a
moth that fit his prediction 19.
To illustrate these points, imagine a friend telling you this
tale:
“My Grandad's liver held out until he was ninety-six years
old, despite drinking a bottle of whisky every day of his life!”
Unfortunately, Grandad does not prove that alcohol is fine for your liver. The claim is falsifiable, and looking at the whole population of drinkers falsifies it; Grandad was atypical. This story also illustrates confirmation bias, treating a single example as though it outweighed decades of medical evidence. Occam’s razor tells us
that Grandad is the exception. His good luck does not bode
well for other heavy drinkers: your friend’s theory has low
explanatory and predictive power.
With so many considerations at play, we need to continue to
be skeptical of conclusions even when we feel confident. This
does not mean we should doubt others or ourselves to
excess, only that we should remain open to changing our
minds.
Why We Need To Be Skeptical
One theory states that our minds are hardwired to make
decisions, and once we decide, we are reluctant to change
our minds. This is because we have two different ‘actors’ or
systems within our minds that process information. We can
call these the fast system and the slow system 20.
Have you ever wondered why some experts seem to be able
to make decisions and solve problems instantly? One
explanation for expertise is that well-studied skills become
virtually automatic with practice. This includes thinking as
well as physical skills. These rapid mental actions use the
fast system, which also deals with recognizing emotions
from people’s voices and automatic reading of whole words
in fluent readers.
The slow system does deliberate thinking, such as when we
have to do a difficult calculation. You may notice your mind needing to concentrate, effortfully retrieving information from memory, and working through the problem in
sequential stages. This system is highly affected by
distractions, which is why sometimes you might find
yourself concentrating so hard that somebody has to say
your name several times to get your attention.
The fast and slow systems work together, too. The fast
system recruits the slow system to help with difficult tasks,
and sometimes we all experience the conflict between
automatic and effortful processing. One example could be
resisting the urge to criticize somebody if you get angry: the
fast system drives the hot-headed emotional behavior, but
the slow system keeps it in check.
Like skills, prior beliefs and things we think we know can become automatic, almost like mental reflexes. If we want to
overcome them, we need to make a significant effort, and
even then, the automatic ‘gut reaction’ can remain.
That is why we need to continue to be skeptical of our
conclusions even when we are confident. We must be wary of
our mind’s flaws, including tendencies like being overly
influenced by other people and our own emotions and prior
beliefs, and our inherent biases and the brain’s preference
for taking shortcuts.
However, the good news is that we can change our cognitive
habits through practice. We can even see changes in the
brain with practice. Neuroplasticity means that our brains
can reorganize themselves quite extensively, and this is not
only the case for younger people whose brains are still
maturing. ‘Map reorganization’ in the brain is particularly
interesting. Research shows that learning and practicing
skills lead to growth in related brain areas, whereas dropping
the practice leads these areas to shrink back towards their
baseline size 21.
It can be difficult to challenge other people’s reluctance to
change their minds. Throwing a lot of facts and evidence at
them may only make things worse. Instead, while
maintaining awareness of any cultural and social factors that
might be feeding into their opinions, try coaxing them to
consciously think about their attitudes (use the slow system
of thought) and remind them why evidence is important 22.
It may sound contradictory, but we need both skepticism and
open-mindedness 23,24. We can define open-mindedness as a
set of mental habits:
Thinking flexibly and avoiding rigidity.
Accepting views that may contradict each other, at
least until you have evaluated them.
Avoiding getting blinkered by your own beliefs.
Striving to avoid bias even when you disagree with a
claim.
Being willing to explore new ideas.
To some people, these ways of thinking may seem
incompatible with skepticism. Certainly, the everyday
definition of skepticism focuses more on being critical and
challenging ideas than being open to them. A skeptical
person may appear to be closed to new ideas until they
obtain further evidence, in contrast to the open-minded
stance portrayed above. But recall that a key part of
skepticism is being open to changing your position and
seeing other perspectives.
Opponents of a skeptical approach may argue that
skepticism is paradoxical because skepticism itself is a belief system. These opponents argue that skeptics use ad
hominem, straw men, and similar techniques to discredit
potential miracle discoveries 25,26. However, true skeptics
take a balanced view and are always open to the idea they
might be wrong. ‘Pseudo-skeptics’ are those people who
almost exclusively disbelieve and deny claims; they are
similar to ‘debunkers’ whose mission is to try to disprove
claims 27.
To summarize, think of skepticism and open-mindedness as
two complementary aspects of the same process: critical
thinking. They are not incompatible. Note that an open-
minded attitude allows you to salvage the good parts of bad
ideas, whereas a strict skeptic would throw everything out.
Open-mindedness is a key part of creativity and innovation.
Lucidity And Metacognition
Lucidity is an open-minded state where we can see past our
prior beliefs and perceive reality as it is. This is a great state
to aim for if you want to appreciate new information fully.
People have an inbuilt immunity to new ideas and prefer to
stick with what they know. People may even perceive new
ideas as threatening, making a kind of automatic assumption that what they already believe must be better than the novel claim.
That is why discovering the true cause or process of
something is only the first step. Scientists must then
continue to investigate and try to convince others that their
theory is correct 28. At the same time, they must maintain a
skeptical viewpoint, acknowledging that they might be
wrong.
People have trouble with new ideas, particularly new
scientific ideas, for a few reasons. Firstly, the true causes of
phenomena are not usually simple or obvious. Secondly,
people often get the causes wrong when considering things
they feel strongly about. Thirdly, it is difficult to discover
how to get to the correct explanations (that is why science is
always seeking to improve scientific methods and
knowledge). Fourthly, people need to be continuously
motivated to discover the real causes and, subsequently, to
promote novel explanations 29. You can see why scientists
are such busy people.
Evidence shows that high critical thinking skills are
associated with high metacognitive awareness 30.
Metacognitive awareness means having awareness and
control over how you process information; it is a self-
reflective process, known as ‘thinking about thinking’ 31.
We can use metacognition to guide our own learning,
development, and decision-making. People with high
metacognitive awareness have excellent knowledge about
their own cognitive processes and their outcomes. Like
critical thinking, metacognition is teachable and can
improve with practice 32. For example:
John knows he has a poor prospective memory - he always
forgets to do things he has said he will do. His family and
colleagues often get irate about this. However, John has
excellent metacognitive skills: he knows his memory is poor,
enabling him to do something about it. He trains his memory
by setting reminders and writing task lists. After a while, he
does not need to set reminders anymore; he goes straight to
the lists.
You can imagine that somebody with poor metacognitive
skills might not have been as successful. John was not afraid
to admit he had a minor memory problem and was able to
solve it.
Interestingly, student teachers with more experience showed
higher metacognitive awareness and critical thinking skills
(assessed by questionnaire) 33. This was a correlation, so we
do not know whether metacognition causes critical thinking
or the other way round; alternatively, they may draw on the
same underlying skills and habits. Since critical thinking
means deliberately using sophisticated thinking skills to
solve problems, going beyond intuition, and using high-level
analytical skills, it seems reasonable to suppose that it
relates to metacognition.
Paul and Elder 34 describe nine intellectual standards that
should help us think both lucidly and metacognitively about
ideas. These are standards that scientists strive to meet in
their communications, and they give you a helpful
framework whether you are composing an argument or
receiving one from another source:
Clarity: to reason about a claim, we must be clear about what it means. Therefore, when you are communicating, you
need to aim for maximum clarity as well. This standard is a
prerequisite for all the other standards.
Accuracy: you may not have access to resources to check the
accuracy of all points made, but you can assess it by thinking
about whether the claim is verifiable and whether the source
is trustworthy.
Precision: information should be appropriately precise for the
point under discussion. A claim could be accurate but
imprecise; for example, ‘the company’s profits fell last year’
is less precise than saying they fell by 18% last financial
year.
Relevance: we might reason clearly, accurately, and precisely,
but this is pointless if we deviate from the core topic.
Depth: this means dealing with the complexities and
relationships of the concept under discussion rather than
over-simplifying it.
Breadth: this means taking in multiple (relevant) points of
view and recognizing alternative perspectives on the issue.
For example, business strategies often look at
environmental, ethical, and social concerns, as well as
economic factors.
Logic: this means ensuring that the arguments work
logically: does the evidence lead to the conclusion, and does
the argument have internal consistency?
Significance: this is related to relevance, but sometimes
relevant points are trivial. We need to ensure that our
reasoning focuses on the important aspects of the problem.
Fairness: our reasoning should be bias-free and honest. We
should aim not to argue only for our own interests. Others
may interpret unfair arguments as attempts to manipulate
and deceive them.
Hopefully, you can see how these standards relate to
scientific skepticism and communication. All of these
standards apply to science but also to our everyday lives, to both work-related and personal problems. Therefore, they are
useful to remember when composing or reading claims and
other communications.
Looking Beyond Our Prior Learning
Scientific skepticism is not always easy. We can only reach
the truth if we work hard to see past the received wisdom
and assumptions that society taught us in our youth.
The postmodern view says that truth is not absolute but
subjective. Declarations are therefore always up for debate.
As any scientist or critical thinker will tell you, all we have is
our best current understanding. Truth is constantly evolving
in the light of new evidence. Postmodernism goes much
further than this.
Postmodernism is not a single theory but a way of looking at
things. Scholars have applied it to many different domains,
mainly literature, the arts, theology, and philosophy 35.
However, here we are concerned with scientific skepticism
and how to get to the truth, focusing on postmodern views of
science and philosophy.
Postmodernism consists of the following key ideas 36:
1. There is no objective reality outside of human experience.
2. Scientific and historical 'facts', therefore, cannot be true
or false because humans concocted the very idea of reality.
3. Science and technology cannot change human existence
for the better. (Some postmodernists believe science is a
dark force rather than a way of humanity progressing).
4. Reason and logic are not universal; they are only valid in
their own domains.
5. All (or nearly all) human nature is socially acquired rather
than hard-wired.
6. Human language does not reflect reality directly; instead,
it is completely fluid and only reflects how people refer to
things within their own cultural and historical context.
7. We cannot gain knowledge about reality, and nor can we
back up our knowledge using evidence or logic.
8. We cannot formulate grand theories that explain wide-
ranging phenomena; postmodernists believe these are a kind
of totalitarianism that disallows other views.
As you can see, postmodernism contains some useful ideas.
However, it is difficult to take a completely postmodernist
view and still expect to explain anything or figure anything
out. You could say that postmodernism fails to explain
anything but, at the same time, claims to offer an alternative
to traditional scientific methods. Paradoxically,
postmodernism decries grand theories but is itself a grand
theory 37.
Postmodernism was most popular in the mid to late 20th
century, particularly the 1990s, when academics collectively
published over 100 articles per year 38. The postmodern
movement provoked a great deal of popular debate. Critics of
the approach say that it encourages people to think of
science as no more useful than pseudosciences like astrology
39. (Although good scientific theories predict future events, astrology does not have this power.)
Some postmodern ideas support skepticism and open-
mindedness, but its core suggests that we can never discover
anything because reality does not exist. Mainstream
scientists and philosophers alike seem to have more faith
that we can discover the truth, but postmodern attitudes
persist 40. On the other hand, postmodernism does
encourage us to take a broader view of ideas and look beyond
traditional categories, so it is similar to the idea of scientific
skepticism (even though postmodernism is skeptical of
science!).
Scientific Revolutions
Thomas Kuhn was a philosopher and scientist who wrote
about how science moves forward. His work heavily
influenced the postmodern view, but he did not argue that
science is anti-progress. Instead, he said that we do ‘normal
science,' and knowledge moves forward in jumps, which he
called paradigm shifts. ‘Paradigm’ refers to the prevailing
world view or scientific approach of its time. For example,
history saw a great paradigm shift away from classical
Newtonian physics when Einstein advanced his theory of
relativity 41,42.
Normal science is an incremental process. Small advances
taken together, debated by scientists in journals and
conferences, gradually increase knowledge. Scientists predict
many discoveries in advance during normal science, based
on theories that they believe have solid foundations.
Education imparts received wisdom to budding scientists,
and they become fluent in its specific methods and language
and continue research along the established lines.
A paradigm shift results from a crisis in science. The existing
theory can no longer explain observations, or a radical new
theory gets proposed that explains things better than the old
one.
Examples of paradigm shifts in science:
1. Copernicus’ proposal that the Sun, rather than the Earth,
lay at the center of the Solar System.
2. Lavoisier’s discovery that chemical elements combined to
make molecules with various properties, superseding
alchemical views of chemistry.
3. In the 1880s, the ‘germ theory’ that tiny organisms (rather
than bad air) caused diseases.
A paradigm shift means a change in what scientists study, how they study it, how society views that topic, and what conclusions are acceptable. These are huge shifts, hence the alternative
term: scientific revolution.
So what fuels paradigm shifts? There are three major
influences.
Firstly, anomalies. Scientific anomalies happen when
scientists find things they cannot explain. If enough of these
happen, a new idea could gain momentum and lead to
fundamental changes (a paradigm shift). Small anomalies
may occur in science all the time but go unnoticed or unrecorded because nobody is looking for them.
Secondly, new technology (ways of measuring things) can
fuel paradigm shifts. For example, applying medical imaging techniques to the psychological sciences led to the new field of functional brain imaging around the turn of the 21st century.
Finally, when a new paradigm appears, scientists need to
compare the new and old paradigms with each other and
with observations. Some may be looking to verify the new
paradigm and falsify the old one; others will do the opposite.
Everybody works to find out which theory fits the facts
better. They do 'extraordinary science' to see what is going
on and rewrite the textbooks. Extraordinary science helps to
complete the paradigm shift from old to new.
So what happens afterward? We might casually call outdated
science incorrect, but it was fine in its own time. Outdated
science took steps toward the truth, and the new science
grew out of the old. The discoveries made might still stand
but get interpreted differently under the new paradigm.
Science keeps going between paradigm shifts because people
like to solve problems. Even if the progress is slow and
piecemeal, new research is important. Scientists may work
within the established boundaries or try to push things
forward slightly all the time. As long as their approach is
similar enough to their contemporaries, their results
comprise mainstream science.
Kuhn’s critics proposed that science does progress between
the paradigm shifts. For example, Einstein’s theory of
general relativity began as a theoretical description. Later,
other scientists found empirical evidence and general
relativity led to a wealth of knowledge and technology that
we would not have had otherwise.
Where postmodernism gets interesting is in its applications
to real-world settings like management and education. A
postmodern approach in these areas fosters an open-minded
attitude: if the establishment is no more correct than
anybody else, everybody's ideas are potentially valuable. If
there is no objective truth, a new business process or
teaching technique is never guaranteed to succeed, nor is it
guaranteed to fail 43. That is quite liberating.
Action Steps
We have examined scientific skepticism in detail, with the
aim of helping us get to the truth. Why not have a go at these
optional exercises and apply some of the ideas we have
discussed?
1. Opening The Mind
Write a skeptical and open-minded proposition or theory of
your own. It may be helpful to use something trivial for this
practice exercise. It can be as simple as ‘Why I should get my
driveway resurfaced this summer,’ or ‘An explanation of
why I choose not to dye my hair.’ Use the following helpful
habits of mind 44:
a. Gather as much evidence as possible. For instance, what is
the current state of your driveway, and what are the risks of
not getting it resurfaced?
b. Beware of false positives and false negatives in the
evidence. For example, you might read that driveway
surfaces typically fail after five years, but check who wrote
this and what they base it on, and see what other sources
say.
c. Think broadly: consider everything that might possibly
impact the proposal or theory. This might include personal
finances, the broader economy, environmental concerns -
whatever factors are most relevant to your proposal.
d. Consider what somebody with the opposite opinion to
yours would write: how they would explain it and/or what
they might decide. This will help you maintain an objective
perspective.
2. Metacognition Exercise
It is normal and natural to be resistant to changing our
minds, but we learned here that reflecting on our own
cognitive habits can help enhance them. Use this quick
questionnaire as a self-reflection exercise, or rate somebody
who you know well. Adapted from Snelson 45.
a. How would you rate your ability to accept any new minor
idea with a lot of evidence to support it?
I accept it immediately
I accept it after doing my own research
I need a few weeks to absorb the new idea
I do not change my mind over minor ideas
b. How would you rate your ability to accept any new major
idea with a lot of evidence to support it?
I accept it immediately
I accept it after doing my own research
I need a few weeks to absorb the new idea
I do not change my mind over major ideas
c. How would you rate your ability to accept any new
revolutionary idea with a lot of evidence to support it?
I accept it immediately
I accept it after doing my own research
I need a few weeks to absorb the new idea
I do not change my mind over revolutionary ideas
3. Standard Process
Analyze an article to check whether it meets the intellectual
standards suggested by Paul & Elder 46. Choose something
like an editorial discussing a controversial topic. Is it:
Clear?
Accurate?
Precise?
Relevant?
Deep?
Broad?
Logical?
Significant?
Fair?
Summary
The story that began this chapter showed us that people
reach faulty conclusions even when they try to keep an open
mind and discover the truth: the police thought they had
solved the crime, and Lin thought she had found a better
explanation. They were both wrong.
With a truly skeptical attitude, somebody would have
doubted both explanations, put them to one side, and
investigated further. They would have been open to
alternative explanations and would not have been averse to
changing their mind even once they thought they had the
correct answer.
Scientific skepticism is not easy. It takes vigilance and
discipline to learn, but like critical thinking and other skills
that we discuss here, you can hone your skills. The processes
can become more automatic and less effortful as you develop
your expertise.
Next, we will look at how to deal with claims you see in the
media. That includes social media, so it should be a great
way to practice your skeptical attitude!
Takeaways
1. When assessing claims, act like a scientist: see whether the
claim is verifiable and falsifiable. If not, perhaps somebody
is asking you to believe something without sufficient reason.
2. When making decisions and forming conclusions, keep a
balance between skepticism and open-mindedness.
3. To reach the truth, aim for lucidity. Sweep your
preconceptions out of the way and experience the world as it
really is, without your previous experience blinkering you to
new facts and evidence.
4. Keep the postmodernist view in mind: perhaps we can
never know the truth, and perhaps meaning is completely
relative. If that is the case, many things are possible.
4
WHY THE MEDIA CAN MAKE OR BREAK OUR
THINKING
Zion Davis was sitting in his corner office early on a Friday afternoon, signing off business expenses. So far,
so normal. He had had a long, busy week, so the routine
task appealed to him. As a manager, he was generally well-
liked, not least because his civil engineering background
gave him credibility with the office staff.
As he was nearing the end of the task, an email notification
pinged onto his screen: “Interesting read.” The message was
from Alastair, a junior team member he worked closely
with, so he decided to take a look. Almost immediately, he
wished he had left it until Monday.
The email contained an attachment: the environmental
report he had been waiting for. He proposed ‘rewilding’ a
section of the development, alongside the approved
construction of a visitor center and venue for various
outdoor sports activities. Alastair included a web link which
he said Zion should take a look at.
Zion clicked through. He scrolled through lengthy, emotive
paragraphs about the ‘failure’ of corporate environmental
endeavors like theirs. He sighed. This was going to take the
weekend to sort out.
Later, Zion strolled into the open-plan office Alastair and the
other juniors shared. Immediately, his team bombarded him
with questions:
“How can the company justify this?”
“This is outrageous! I can’t believe our company would do
this!”
A few people argued back in favor of the company.
The conflict was not restricted to his immediate team,
though: the MD, Milton Skelpie, phoned Zion, ordering him to call him back in private immediately.
“I thought we were clear. This project is over 80% ecological
work specifically to support wildlife, so why are the
environmentalists up in arms about it? And why are half of
your team on their side?”
“It’s this article, Sir. They’re saying that we should leave
nature to take over by itself and that anything we do will
make it worse. The article’s bogus, Sir-”
The MD cut him off.
“Sort this out, Zion. I’m relying on you.”
Milton hung up. Zion took a deep breath, stood tall, and re-
entered the shared office. He looked around at his team.
“Everybody relax. We can tackle this together. So, we have
some problems. I’ve read this article suggesting that our
wildlife park project will harm the local environment up in
Washbrook, and my boss told me that several of you have
been posting about it on social media today. I’m going to
ignore that because we have bigger problems. As I said, we
can all work together to solve this.”
The next few days were some of the most hectic and
challenging Zion had ever experienced. He delegated
research tasks to several different staff members. He asked
them to research the article, examining the platform that
had published it, who had written it, where they got their
information from, articles that cited this one, the whole
gamut. He received a diverse set of reports.
Two of the staff, Meredith and Marco, had picked up one
interesting fact: the article used some of the same phrases
(and misspellings) as a blog post published towards the start
of the project. A fringe group wrote the blog, and their main
purpose seemed to be to block any kind of development.
Members encouraged each other to lie and exaggerate to get
their point across; they planted misinformation to stoke
readers’ emotions and make them angry.
Zion was ready to present his findings to his superiors when
the MD visited in person, completely out of the blue.
“Zion, this is serious. The local county council is now
concerned that we misled them in our planning application.
Local residents are protesting at the site, stopping
construction vehicles from entering and chaining themselves
to trees. We’ve already lost thousands because of the delays
to this project.”
Zion called Meredith and Marco into his meeting with the
MD.
“Sir, I would like to introduce you to the only two people in
this office who picked up that this article is hogwash.”
They spent the next hour presenting their comprehensive,
detailed findings of the article that had caused so much
trouble: the source was not credible; the story was a
distorted mishmash of second-hand information and
opinions; it played on people’s emotions; it misrepresented
the science. As Zion said, it was hogwash.
Happily, this convinced the MD. He was so impressed that he
assigned Zion and his team to write a well-researched article
on the topic for the company’s website.
After a difficult week, the project was back on track, and Zion
had gained even more respect from his team and his
superiors than he could ever have expected. Still, the fake
news article had almost caused a catastrophe.
Have you ever had an experience like this? Perhaps, or
perhaps not. The point of the story is that a lack of media
literacy can have potentially huge consequences.
Several of Zion’s staff believed what they read without
investigating where the story came from; they failed to seek
further information, which led to conflict. It could even have
harmed the business. Zion’s quick, decisive action averted a
potential crisis. Even better, he used his critical thinking
skills to produce a report and web article discrediting the
disparaging claims.
Also, let us not forget the councilors and local residents who
also fell for the disinformation, including the protestors, who
were likely ashamed when they realized they had disrupted a
project that aligned with their values rather than opposed
them. Surely they would rather have spent their time and
energy protesting against something that genuinely deserved it?
Critical Thinking And The Mass And Social Media
This chapter’s story illustrates that people vary in how much
they believe what they read in the media. Some of the
characters discovered something they found upsetting and
took to social media to spread the word. This had multiple
impacts on the characters, the corporation they worked at,
and its stakeholders (local authorities and residents), not to
mention the threat to the wildlife park itself.
Today’s society relies on mass media and social media as its
main sources of information. However, the information
these sources publish is not always what it seems.
Consequently, both types of media can be harmful to
people's wellbeing. Media consumers who do not discern
between truth and falsehood, either because they decide not
to or do not know how to, can ultimately suffer.
When we use the media, it is important to apply the rules of
logic and our own experience, just as in other situations. We
need to read beyond the articles themselves to understand
their purposes and the effects the authors intended to have
on their readers.
The news is a great example of emotional manipulation.
News outlets use forceful language in headlines to stoke
readers’ emotions, but the story is sometimes less exciting
than it sounds. Fake news overtly appeals to people’s
emotions, and this is one reason why it spreads so
effectively. People tend to read fake news in an unskeptical
way, sharing it with others and thereby spreading it further.
Big News Or Fake News?
Fake news is big news these days, but what exactly is it?
Journalism experts define fake news as news that somebody
has deliberately made up to deceive the public, instead of
satirical comedy based on news or innocent mistakes that
the news media might make from time to time. It is also not
published by traditional media that adhere to traditional
standards of journalistic accuracy 1.
Therefore, instead of simply reading a message and
believing it, we should consider the source and look for
alternative perspectives elsewhere. This shifts the
information we receive towards an overall balance, and any
biases will become more obvious by contrast.
Many media outlets publish in good faith; they may be
politically biased, and they may want to entertain and
inform, but their news pieces are at least based on real
events. Some types of news, however, are deliberately
written to mislead and even deceive readers.
Fakers design fake news to appeal to readers’ emotions and
inflame prejudices and divisions. The purpose of fake news is
usually to make money or influence political opinions. None
of this is new, but fake news really took off in the era of
instant access to news stories and social media.
So how can we spot fake news? Like online scams, fake news
websites often imitate credible sources, and creators of fake
news are getting better at making it look official 2. One
famous fake news publication used the domain name
abcnews.com.co, mimicking a genuine ABC News address. Fake news may
also use falsified images, which people find increasingly
difficult to spot 3. It may also feel too good or too shocking to
be true since fake news is composed to make people feel
highly emotional.
Why And How Do People Get Lured In By Fake News?
Part of the answer is other people. A large study analyzed ten
years’ worth of verified true and false news stories on one
famous social media platform and found that fake news
spread significantly ‘faster, further, deeper and more
broadly’ 4. Users were more likely to spread fake news
stories than true stories, with fake stories reaching up to 100
times as many readers. Several factors probably fed into this.
Fake news stories were more novel and, therefore, more
attention-grabbing. People prefer to absorb novel
information instead of run-of-the-mill information, and
social media may be popular because it supplies more novel
data. On social media, people interact with unique networks
of other people who supply them with both information and
entertainment. Novel information flows particularly well on
social media 5.
It might seem self-explanatory that fake news is highly
novel because it reports things that have not happened in
real life. However, the investigators controlled for novelty
statistically and concluded that this alone could not explain
the viral nature of fake news 6. Therefore, there were other
factors at play.
The same study showed that fake news elicited stronger
negative emotions - fright, sadness, and disgust - and more
surprise in users, judging by their reactions. True news
stories made users express more sadness, but also joy and
trust. Surprising visual scenes attract people’s gaze 7, as do
scenes that elicit negative emotions 8, so these factors
probably enhanced attention to the fake news stories.
Additionally, people are more likely to spread
misinformation if they think it will create emotional effects.
This is a powerful driver of both the spread and persistence
of fake news 9. The social aspect of social media is extremely
powerful. For many users, the desire to make an impression
on other people is probably stronger than the desire to
communicate the truth.
Perhaps surprisingly, bots did not spread fake news any
faster or wider than they spread real news 10. Therefore, the
persistence of the fake news stories was due to human users.
Rather than sharing true news stories, they preferred to
spread things they found novel and shocking.
A further characteristic of fake news is emotional
manipulation: its writing style and structure elicit strong
emotions in response to a particular idea and then shift
readers’ responses onto another idea. For instance, a fake
news post might accuse a political party of racism and then
aim to transfer the resulting anger onto other policies
declared by that group, in a kind of negative halo effect. Fake
news also sometimes uses readers’ negative feelings about
one topic to elicit negative feelings about a different topic 11.
'Clickbait' is web content that draws people in by appealing
to common feelings and goals (such as making money,
improving social relationships, or finding out the truth about
something). It is often news-like. The designers dress up
false or misleading information to look plausible in order
to entice people to click through.
Clickbait makes the user’s goal seem easy and within reach,
with a knock-on effect of dialing down critical thinking and
making the information feel believable. The information
itself does not matter, as the sponsor company has already
made its money once the user clicks through to the site 12.
Examples Of Media Deceptions
Mass media and social media convince people to believe
certain things, regardless of whether they are fact, fiction, or
something in between. There are innumerable examples
from historical and contemporary media.
In 1782, during the American Revolutionary War, Benjamin
Franklin concocted a fake newspaper supplement about Native
Americans and the British uniting to scalp 700 Americans.
He then sent the false story to his friends, who sent it to
their friends, and so on, and the story even made it into the
real newspapers 13, 14.
An example of fake news closely mimicking real news came
in the most shared US fake news story of 2016: “Obama
Signs Executive Order Banning the Pledge of Allegiance in
Schools Nationwide” 15. The graphics and web address
resembled a journalistic news source, resulting in over two
million shares within two months.
A more troubling example relates to the backlash against
vaccinations. The anti-vaccination movement arose from
Andrew Wakefield’s notorious, discredited 1998 article
linking the MMR vaccine with autism 16. Most of Wakefield’s
colleagues later retracted the article to clarify that they had
found no causal link, and in recognition of the harmful
effects the misinterpretation of their results had led to 17.
However, retractions are rarely as popular as the original
untrue story 18.
Finally, in 2017, fake news reports circulated on social media
stating that someone had murdered the president of South
Sudan and that his aide was plotting a military coup. The
posts aimed to stir up further violence in the country that
was already suffering due to civil war, but it turned out that
the reports were untrue and originated outside of South
Sudan 19.
We must wonder how nefarious people achieve this
deception. Is it something about how they present the
information? Fake news often features appealing storytelling
and sensationalism, and plays on compelling emotions like
fear and desire. But what about how readers receive and
process the information?
What Is Media Literacy And Why Is It Important?
Media literacy means applying information literacy
principles when you read media reports. Information literacy
means knowing when you need information and finding,
evaluating, and using the appropriate information for your
needs. Researchers believe that the public’s information
literacy is highly variable, particularly when it comes to
assessing the quality and truth of information. This is also
true for social media information, which has no inbuilt
quality control system; at least traditional media has editors
and journalistic standards 20, 21.
Media literacy is more important now than ever before
because of the sheer number of messages we encounter
every day. Media is easier to access than in the past: most
people are constantly connected and can catch up on the
news simply by reaching into their pockets.
Several factors may underlie people’s reluctance to be
skeptical about what they read in the media and social
media. These include a lack of awareness about information
literacy and the need to read information critically. Further
obstacles include confirmation bias, increasingly convincing
fake news items, and the rapid progress of technology that
we use to consume news 22, 23.
According to large-scale surveys, most American adults get
at least some of their news from social media. In recent years
this has expanded to include adults over 50 years old for the
first time 24. This means it is vital for us to apply media
literacy principles when we read and share information.
Practical Ways To Apply Critical Thinking When Reading Articles
So how can we avoid being influenced by fake news? One
idea is to treat it as fun fiction and delineate it clearly in your
mind from real news. Get your news from trusted sources.
You could also find one of the numerous online fact-
checking websites and use those as part of your
investigation.
We already possess intuitive strategies for assessing the
truth of things we read and hear, but these enjoy varying
levels of success. We tend to rely on a few features of the
information 25, but each of these is prone to error:
Compatibility: does the new data fit with our current beliefs?
If so, it feels right to us, and the evidence weighs more
strongly in favor of the new data also being true. People
prefer information that fits with their world view.
Internal consistency: is the information plausible and
consistent with itself? People prefer good stories where the
plot points follow from one another, and the characters
behave in a realistic way.
Credibility: we make a quick assessment of the source, but
surface characteristics, such as whether it features a familiar
person, often grab our attention rather than contextual
information like the story’s purpose or where it appears.
Consensus: whether others also believe the story is a rough
indicator of its accuracy, but we may over-rely on it. For
instance, if people we perceive as similar to ourselves believe
something, we are more likely to believe it even if there are
other signals that it is untrue.
When reading articles, bear these four points in mind and
use them to try to aid objectivity. However, these are
descriptive: they tell us how the mind works by default,
rather than giving us the best method to appraise news
stories critically. Keep an open mind about the story, and
assess it using a more critical mindset than the one your
intuitive decision-making processes might tempt you into.
Rather than relying on intuitions, teachers tell their students
to assess online information using the CRAAP test 26, 27.
Students learn to assess whether the information is current,
relevant, authoritative, and accurate, and to examine its purpose.
As you are already a critical thinker, you are most likely
familiar with how to assess information against these
criteria, but here is a brief reminder:
Current: check the dates of both composition and publication,
look for any updates or retractions, and research further
sources that reference this one for the most up-to-date
information.
Relevant: assess whether the material applies to what you
were looking for, whether the language fits the topic,
and whether the author has covered it properly. In the case of
articles you did not specifically search for, think about
whether the title and introductory section present the topic
fairly; sometimes, fake news articles use misleading
headlines and images to draw readers in.
Authoritative: look for the author’s credentials and determine
whether they are qualified and experienced to write on the
topic. You can also examine whether they cover the content
in a logical and appropriate way.
Accurate: this can be more difficult to check for news stories
and particularly social media stories. If the author cites
sources, see if they are academic or official sources. If they
quote scientific results, you can see if they were published in
mainstream journals (indicating that the scientific
community, in general, accepts these results as valid
investigations) and see whether other scientists have
replicated the results.
Purpose: review the details to see whether the author uses the
article to sell products or attract visitors to their site. You
may find evidence of vested interests and/or biased opinions
that affect how they treat the topic.
Although CRAAP is a useful checklist, some authors have
criticized it, as you need to spend a serious amount of time
investigating the source website itself. Lateral reading is an
alternative approach that we can use to assess online sources
more quickly and fully. This entails fact-checking beyond
the site or story itself, including performing a web search for
the source or author 28, 29. It is also important to know that
search engine results and social media feeds are heavily
personalized. Still, not all students using online information
fully understand these facts 30.
As well as lateral reading, expert fact-checkers can
accurately categorize web sources by taking bearings. They
evaluate the source website using the ‘About Us’ section and
visit other online sources almost straight away to look for
wider information about the source’s authority and
trustworthiness 31.
Establishing whether the source is credible is key to using a
critical thinking approach to gaining information from the
media. However, the rise of fake news may correlate with the
general public’s diminishing trust in experts and
government sources 32. The internet and social media lie at
the root of this trend. People increasingly believe they can
find out anything by going online and that all opinions are
equally valid, whether expert or otherwise 33. This means it
is becoming increasingly challenging to distinguish between
credible and non-credible sources, but remember that a
critical thinking approach can help you get closer to the
truth.
It is enormously important to assess the credibility of the
source. Using lateral reading, you can gauge whether they
consistently report facts and whether they admit and
publicize mistakes in their reporting. You could also check
mediabiasfactcheck.com, which summarizes global news
outlets in terms of their overall accuracy percentage, and
highlights political bias. This is useful for evaluating sources
you may not have come across before.
You can also consider whether the article meets the
standards described in the Society of Professional
Journalists’ Code of Ethics. Their ‘Seek Truth And Report It’
criterion covers many of the standards discussed in this
chapter 34. Remember that most social media sources are not
answerable to this code.
You can also use online tools to verify the content. Some
helpful sites include:
Factcheck.org: focuses on US political claims across various
media.
Snopes: investigates various categories of potential
misinformation, including hoaxes and urban legends as well
as news items and political claims.
Factscan.ca: fact-checker for Canadian politics.
BBC Reality Check: includes fact-checking and explainer
articles for the UK and international current affairs topics.
Action Steps
1. Media Literacy Practice
Perform a general web search for a topic of interest and
assess two of the resulting articles or webpages using CRAAP
and lateral reading. Notice whether the two approaches give
you different impressions of the sites, perhaps even leading
to different conclusions.
2. Deep Dive
Choose a news story that interests you; perhaps it relates to
your business or personal concerns. It could be one that you
found during Action Step 1. It should be sufficiently complex
and mainstream for you to find at least four different
sources. Research the information these sources report,
choosing as diverse a set of sources as you can. For example,
look for left- and right-wing sources from both the mass and social
media. Chart on a piece of paper what they agree on and
what they disagree on. Can you see different 'facts' reported?
What about word choices indicating bias? You can repeat this
exercise in the future if you want to assess another news
story in depth.
Summary
This chapter’s story showed how fake news concocted by
extremists snowballed and nearly spelled disaster for a
company and a community. This fictional story’s message
was serious: many real-world fake news stories have had
terrible consequences. The few examples given here should
give you an idea.
The mass media is not immune to getting things wrong.
Still, even journalistic outlets vary in the quality standards
they set for themselves, so it is important to apply your
critical thinking skills here too. Three of the characters in
the story displayed great analytical skills in picking apart the
mess of blogs and social media posts that led to the
misinformation problem. They presented their findings
rationally and calmly, which defused the situation. In the end,
this positive outcome may even have enhanced the
company’s reputation.
Next, we move on to look at how others try to deceive us to
our faces and how we can sort the truth from the lies in
these everyday situations.
Takeaways
1. To separate sense from nonsense in mass media and social
media, we need to apply the rules of logic and use our own
expertise.
2. We need to be alert to fake news, which is deliberately
concocted to fool people, and not confuse it with real news,
satire, editorial opinion, propaganda, or advertisement 35.
3. Take a skeptical approach even if the story feels true, and
beware of ‘news’ that seems too extreme to be true.
4. You can use media literacy tools and resources, such as
CRAAP and lateral reading, to evaluate the source publication
and the author, recognize bias and opinions, and assess the
accuracy of claims.
5
EVERYDAY LIES AND DECEPTION
After reading the fine print, Alicia decided she was happy
with the terms of the business loan.
She had recently met Aaron Lowen, a business development
consultant from Howsville, the next town along from hers.
He strolled into her ice-cream parlor and quickly persuaded
her to open another cafe in Howsville. She refused at first:
she liked running a single site business, and her customers
found it charming to buy ice cream from a family-run
concern. The expansion was too risky.
However, Aaron insisted.
“They don’t even have real gelato in Howsville! With this
artisan Italian ice-cream, you’ll make a fortune! I promise
you, there are no decent ice-cream cafes at all.”
A smile flitted across Aaron’s face. Quickly, he looked serious
again.
“Is this really a good opportunity?” Alicia asked.
“Yes, definitely,” Aaron grinned.
Alicia noticed a strange wobble of the head, but thought no
more of it.
So here she was: opening a new café. Once the loan was in
place, it was all hands on deck.
However, Aaron had not been a hundred percent truthful. A
local gourmet ice-cream company was running trucks and
pop-up cafes across town, and they had no qualms about
targeting her new store. Sometimes the local kids would even
come in to criticize her product:
“Not as good as Toni’s.”
The trouble at the new branch rapidly damaged the entire
business. It seemed time to cut their losses. Then, vandals
broke into the new café. They wrecked the displays and
littered ice-cream and toppings everywhere. Alicia closed for
the day, and her employees cleaned up while she called the
police and the insurance company.
This was almost the end of the whole company, but Alicia
smiled and kept going. Her sister-in-law gifted her some of
the profit from her own business, which kept Alicia afloat for
a while. Sadly, the new café was still not viable, so Alicia
decided to close down.
On the last day, they organized an ‘everything must go’
event, with half-price ice-creams for all the local high
school and college kids. Late in the afternoon, this turned
into free ice-creams for all.
Alicia confided in a middle-aged lady who was enjoying a
cookies-and-cream cone. The lady was sympathetic:
“It’s very sad, but Aaron from Toni’s has such a good grasp
of the local business environment and so many friends and
contacts in the town. You were brave to compete with him.”
“Aaron who?”
“Aaron Lowen, our local entrepreneur. He’s involved in most
of the businesses in town, and even wants to open up in your
town as well. Can I get some white chocolate sprinkles with
this?”
In a flat tone, Alicia directed her to ask at the counter. So
Aaron had lied.
Finally, she had found the missing piece of the puzzle: Aaron
was deliberately trying to put her out of business, and it had
almost worked. He had almost cost her everything. If Aaron
had succeeded, he would have been the number one ice
cream seller in both towns!
She had to applaud his audacity: pop-up ice cream cafes and
trucks, rather than fixed premises, meant she had not
discovered that there was already a popular artisan ice cream
maker in Howsville. So she was back to her initial position,
but it could have been much worse.
A few months later, things had improved. Sympathetic locals
who heard about the diabolical deception flocked to Alicia’s
home town cafe. It was a warm spring, so she added two
bicycle-based ice cream sellers. All this led to record sales, as
well as bad publicity for her rival Aaron.
Alicia was intelligent and successful, but she missed the
signs of deception. Aaron gave away some clues: the quick
smile that flitted across his face when he claimed there were
no ice-cream cafes in Howsville, and the head wobble when
he confirmed that it was a good opportunity, betrayed his
real opinions. He promised something that sounded too good
to be true, and he appeared trustworthy, using his expertise
as a business consultant to add credence to his claims.
Alicia noticed these clues but did not know how to interpret
them. She did not know that even accomplished liars reveal
themselves occasionally, as the human body and face
express our emotions even when we work hard to suppress
them.
Most people are basically honest, but one deliberate
deception could potentially cost us a lot. Therefore, as well
as examining claims and evidence in detail and being
skeptical about ideas, we need to look at other clues that can
tell us if somebody is lying, whether their falsehood is
unintentional or deliberate.
How To Spot A Liar
Outside of the media, the people we interact with every day
expose us to a great deal of information, much of which is
true. Lies are deliberate conscious deceptions, in which
people either conceal something or falsify information 1, so
people have a huge interest in methods for detecting when
somebody is lying.
There is no single clue that tells us somebody is lying.
Instead, we must draw tentative conclusions based on as
much evidence as we can find 2.
We can apply critical thinking to the content of what people
say, and when we interact face to face, there are several
additional sources that can give us clues as to whether
somebody may be lying. Liars can accidentally reveal the
truth by leaking information or emotions they are trying to
hide. A few behaviors might clue us in (but note that these
behaviors rarely reveal the content of the lie).
Liars sometimes work hard to conceal a lot of emotion, and
we can detect this cover-up by gathering evidence from their
faces, bodies, and voices, for instance, if somebody seems
panicky. People who tell the truth expect others to believe
them, so they appear more relaxed 3.
The words used provide the first set of clues. Three ways that
somebody’s words can suggest deception are: making errors
in repeated facts; slips of the tongue (stating the real fact or
situation by mistake); and saying far too much. In the latter
case, you might identify a liar because their explanation is
overly elaborate and detailed, suggesting they are trying
desperately to convince you.
However, liars focus on faking their words and facial
movements, whereas their voice and bodily movements are
less easy to falsify 4. Scientists have studied interpersonal
signals from faces, voices, and body language extensively in
lie detection.
In terms of interpersonal signs, many people believe that the
eyes give away true feelings, and liars often deliberately try
to appear truthful using their eyes, for example by making
plenty of eye contact. However, the impression conveyed by
eye gaze differs across cultures. Some cultures regard direct gaze as
disrespectful, which may affect suspicion of guilt when
police officers arrest or question people of different
ethnicities 5. Because eye gaze is so deliberate, it is perhaps a
poor indicator of somebody's inner feelings.
Blinking is more spontaneous than eye gaze, and pupil
dilation is not under conscious control. Therefore, these
provide more reliable signals of genuine feelings. Changes in
frequency of blinking and wide pupils could signal emotional
arousal associated with deception, but this is inconclusive
since they are signs of general emotional arousal. Pupil size
was the best indicator of lying in a meta-analysis comparing
various signs of tension in liars 6. Additional bodily signs of
tension include sweating, pale face, and flushing, but these
are general to emotional arousal and not specific to lying.
Facial signals are extremely complex. Liars’ faces
communicate two interesting strands of information: what
they are trying to communicate and what they are trying to
hide. Liars often try to conceal their true feelings, with
varying success. Good ways of reading faces to detect
deception include 7:
Passing expressions: people often express their feelings
spontaneously but then quite quickly suppress them. This
rapid masking of expressions can be a clue to dishonesty.
Micro-expressions: these are much faster than the passing
expressions described above. We do not typically perceive
them, but we can see them on paused or slow-motion videos.
Trained psychiatrists can observe them in normal
conversation, having learned to do so through their
professional practice.
Specific parts of the face: some areas of the face are more
informative than others. For example, people fake smiles
using only the mouth and lower eyelids, whereas a genuine
smile of happiness features raised cheeks and wrinkles at the
corners of the eyes. People also find it difficult to fake the
pursed lips of anger.
Smiles: if somebody smiles too much or at odd times during
the conversation, they may be using a social smile to conceal
nerves. If their smile disappears rapidly or slips off the face
in unnatural steps, they may be using a fake smile to conceal
other emotions.
In terms of voice clues, listen for pauses and a lack of
fluency. If the speaker often pauses or for a long time,
hesitates more than usual, or uses filler sounds like ‘ah’ and
‘um,’ they may be improvising. The high cognitive load of
composing a lie while speaking to you might be taking a toll
on their verbal coherence 8. However, this may be unreliable.
One linguistic study found fewer uses of ‘um’ and similar
words in lies, suggesting that these fluency errors are
perhaps part of normal speech rather than a sign of high
cognitive load 9.
Raised pitch is a further vocal signal of lying 10,11. People find
fear and anger difficult to hide vocally, especially if the lie
they are telling makes them feel that way, or they are
worried you have figured out their deceptiveness 12.
According to some acoustic research, people produce shorter
utterances with fewer syllables, speak more slowly and take
longer to respond, and vary more in pitch and intensity
when they lie 13. A meta-analysis also suggests that liars say
less and provide fewer details than truth-tellers 14.
All of these modes of non-verbal communication provide
hints that somebody might be deceiving you. However,
bodily clues might be the best ones to look for. Experiments
show that observers are worse at picking up deception from
facial and vocal cues than bodily cues. Participants who only
saw bodily postures and gestures were much better at
spotting the liar 15.
Remember that everybody has a different baseline. You are
likely to be better at detecting lies from somebody you know
than a stranger. It helps to distinguish between these three
categories of bodily movement:
Emblems: movements like nodding the head, shaking the
head, and shrugging. They are often intentional
communicative gestures but can also occur without
conscious awareness. Unintentional emblems are often
smaller versions and may reveal how somebody really feels.
Illustrators: these often spontaneous gestures are what you
might call gesticulation. Many of us use our hands a lot
when speaking: we draw pictures in the air and mime
actions. People use fewer illustrators when they lie, perhaps
because it is more effort or they are uncertain about what
they are saying 16.
Manipulators: pinching, stroking, scratching, hair twisting,
and similar movements, including fiddling with small
objects. People engage in more of these gestures if they are
nervous, but also when they are feeling relaxed, so it is not a
great clue to deception.
Be cautious when interpreting faces, voices, and body
language, and remember each of the methods described
above is only a single clue. Listen closely to what the person
says as well as observing how they behave. So can we find
any more certainty?
What They Say Or What They Do?
There is no magic formula to discern whether somebody is
trying to deceive you or is giving you misinformation.
Further, there are numerous myths and misconceptions
about how to detect lies. Some experts claim that a
significant portion of the police and customs training
materials on lie detection from interpersonal cues are
incorrect 17,18. This suggests the scale of the problem, but
remember that police and customs are working with
individuals that they have never seen before, who will be
feeling victimized and perhaps even panicky due to being
apprehended.
The clues described here are simply clues. They can give you
an idea of whether somebody is deceiving you: with an
abundance of clues, the likelihood of deception rises.
However, the fact that you are dealing with likelihood rather
than certainty raises the problem of false positives and
negatives. The risks and consequences obviously differ
depending on the situation. Is it worse to doubt the truth or
to believe a lie?
For instance, anybody working in law enforcement or other
high-stakes occupations should be extremely cautious about
how they interpret the interpersonal signs of deception.
They should keep an open and skeptical mind and continue
to gather evidence.
A further problem is that lying is a social interaction, and if
somebody feels uncomfortable, they may give off signals
similar to those of deception. If anyone has ever falsely
accused you of lying, you will remember how it feels.
Similarly, shy individuals or people who find themselves in
an aversive situation may be mistaken for deceivers simply
because they feel nervous and untrusting. In both scenarios,
the accuser could misinterpret the accused’s discomfort as
guilt signals, further fueling the false accusations.
Ideally, get a solid baseline so that you know the person and
their general demeanor. People in close relationships (both
romantic entanglements and friendships) may be better at
detecting each other’s lies, but they are also better at
deceiving each other. One reason for this is that they are
familiar with each other’s typical behavior and modes of
speech and so can fake it more easily. Another reason is that
their desire to maintain a positive relationship leads them to
ignore the signs of deceit 19.
Romantic relationships are paradoxical in terms of honesty.
Most people agree that they want a potential romantic
partner to be honest, whether the imagined relationship is
long-term or dating, but they do not want extreme and
absolute honesty. Instead, many people realize that
sometimes deception helps build self-esteem and can be an
act of kindness.
In very emotionally close relationships, the evidence on lie
detection is mixed: sometimes it seems that romantic
partners are better at detecting lies in each other; other
evidence suggests the opposite 20. Perhaps it depends on the
specific, highly personal details of the relationship.
To detect lies of all types more successfully, we need to look
at the communication in context, including its content. We
are more likely to succeed using a rational approach,
comparing what they say to other evidence and our prior
knowledge than by over-relying on signals from the
potential liar’s face or voice 21,22.
In terms of what they say, liars give less consistent accounts,
told in a less personal way, that feel less plausible to
listeners. Their stories seem to be told from a distance and
are less clear; these effects are more reliable than non-
verbal cues such as eye gaze and expressions 23.
Another clue is the structure of the story. Liars are more
liable to tell you what has happened in the order that it
happened, whereas somebody speaking the truth moves
around the story’s timeline. This structured approach
suggests that liars have carefully composed their story but,
ironically, end up relating something that sounds less
natural 24.
Nobody Is Immune
Everybody is susceptible to believing lies and half-truths
25,26,27. Even the experts get it wrong sometimes.
People are overconfident in detecting lies from the face and
voice, which acts as a barrier to finding the truth. Faces, in
particular, are used for communication and expression, so
any expressions read from the face need not reflect how
somebody really feels. Further, communication via the face
varies in different cultures and also among individuals. We
cannot extract any solid rules for detecting a liar from their
face or voice with so much variability, although they do
provide some clues. We would be better off using evidence
and trying to establish the facts 28.
Interestingly, one study suggests that trained police officers
may be no better or worse at detecting lies whether they
focus on the content of the lies, the person’s face and voice,
or their body language. Their accuracy was
50/50. They may as well have flipped a coin. This illustrates
that even trained professionals might be fairly poor at
working out when somebody is lying, despite high
confidence. A similar study showed that practice in itself
improved people’s performance, but instructions to attend to
certain cues (face, voice, and body language) had no effect
29.
One reason people get it wrong is that they assume that
others are honest 30,31,32: we believe others by default. Most of
us tell the truth most of the time, so the bias towards belief
normally leads to correct conclusions and better cooperation.
We rarely question others’ honesty unless something makes
us suspicious; however, there is evidence that many people
are poor liars even when they attempt to deceive 33.
Evidence suggests that lying is not particularly common. A
study that asked people how many lies they told over a day
showed that most people reported no lies at all, the vast
majority of the rest told only one or two, and a small number
of ‘prolific liars’ told half of all the lies reported. You might suspect that
people lied to the researchers, but that would not explain the
distribution of the responses. Further, hundreds of people
participated in three separate experiments, adding credence
to the conclusions 34.
When people reflect on how they have detected lies
successfully in the past, their answers point towards two
strategies: comparing the lie to the available evidence and
persuading the deceiver to confess. Obtaining evidence relies
on getting contextual information around the lie, so you are
likely to be better at detecting lies within your own domain
of expertise. Surmising that somebody has a motive to lie
raises detection accuracy to almost 100%, and probing
questions are useful. Again, experts are better at this 35.
If you suspect somebody is lying to you, encourage them to
talk. Get the person to repeat their story and listen out for
factual errors and inconsistency 36.
We all need to be aware of our own motivations, emotions,
and preconceptions and do our best to avoid letting these
color our perceptions of others. Overall, it is difficult to
decode when somebody is lying to us. Luckily, in the case of
our social relationships, minor lies are often inconsequential
or even positive. However, modern life is full of scams and
other deceptions, which could be potentially very damaging.
Examples: Is Truth Rarer Than Fiction?
Here, we examine common scams and deceptions that
unscrupulous people and companies might use to sell things
or convince people to believe false ideas. We can look for the
signs of deception already described, but there are many
problems with this. Firstly, those who deliberately set out to
deceive people are more likely to be practiced liars, so they
may display few signs of stress when they are deceptive.
Secondly, in the case of online deception, we may not be
faced with a person at all, so we have to use other clues in
the message itself.
Three of the most common online scams 37 are:
Advance fee scam: if you use email, you have probably
received one of these. A message purporting to originate
from a relative of a wealthy person asks for your assistance
with accessing their funds. They need a small amount of
cash in advance, and will pay you a huge fee once they have
accessed the treasure trove. Other advance fee scams include
pyramid schemes and ‘work from home’ type business
opportunities. Fraudsters circulate these scams to thousands
of people, but only a few need to succumb for the scammer
to make a profit.
Online selling scams: auction fraud, not delivering goods, and
not paying for goods purchased are the most common.
Sometimes identity theft and misuse of credit cards are also
involved. ‘Scareware’ is another online selling scam, in
which fake popups tell the user their computer has a virus
and they need to pay for a special tool to remove it.
Investment scams: when a swindler convinces people to invest
in a fake business. The business sounds genuine and has a
professional-looking website, which anybody could easily
create with a small outlay and some web design skills,
leading a few people to believe it is genuine.
Like many other forms of deception, these three all rely on
people falling for a falsified desirable idea: the large payout
following the small outlay, the honest seller or buyer, or the
profitable investment. People’s emotions may override their
more critical faculties and lead them to fall for something
that many others would find unlikely 38. The fraud may even
be so convincing that it fools people who are confident
in their skills at scam-spotting. The assumption that people
are mostly honest probably also plays a role 39,40.
The more a fraudulent claim plays on people’s hopes and
wishes, the more people will believe it 41. So, it is important
to report scammers to the authorities, but what if you are
unsure whether something is a scam? The US federal government
42 and the UK Citizens Advice Bureau 43 give useful advice.
All of these are signs that you should be suspicious and not
hand over any personal information:
Does it seem too good to be true?
Is the contact unexpected?
Do they pretend to be from a trusted organization
such as your bank or social security?
Do they ask for your personal data such as PIN or
password?
Is there a problem you need to solve (like a huge tax
bill) or an unexpected prize?
Do they rush you to act immediately?
Do they ask you to pay in an unusual way like money
transfer or gift card or send you a check (that later
turns out to be fake)?
Outside of straightforward scams, real businesses and
organizations sometimes engage in trading practices that
may be illegal or at least dodgy 44, for example:
Fake reviews and testimonials: these are common on online
marketplaces. Sometimes celebrity testimonials are used,
and it is difficult to tell whether the celebrity gave their
permission.
Unfounded predictions and promises: this may be illegal if the
company knew a specific claim was untrue, but fanciful
advertising claims are usually allowed.
Bait advertising: this is when a company advertises a product
for sale, but does not have a 'reasonable supply.' The bait
product lures people in; then the seller persuades them to
buy something else.
Misleading guarantees, conditions, or warranties: for example, a
seller cannot make you take an extended warranty, but the
salesperson might try to imply this; this con relies on
customers not knowing the details of the business' legal
obligations.
With so many companies and individuals trying to make
money from us, it is sensible to keep in mind that if
something seems too good to be true, it probably is.
However, it would be cynical and destructive to apply this
attitude to our everyday interactions and relationships.
Remember, there are only a few prolific liars around, and
they are probably busy running online scams.
Action Steps
Now that we have looked at how to use critical thinking and
evidence to spot lies and deception in everyday life, it is time
to apply some of this knowledge. Try the following action
steps.
1. The Lying Game
Play a game of lie detection with somebody close to you.
Each of you can prepare a handful of lies and truths that you
will try to convince the other person are true. Remember this
is a fun learning exercise, so use humorous or innocuous
facts about yourselves that the other person does not
necessarily know. Use some of the techniques covered in the
chapter to convince them and try to detect the lies correctly,
and have a conversation afterward about how it went.
2. Proof Of Lies
Try some of the techniques for spotting a liar. Find an online
video from a few years ago of somebody you know is lying
because someone else exposed them or they confessed. This
could be from politics, an interview with a public figure, or a
televised court case. Watch the video in slow motion and look
out for some of the signals we have examined in this
chapter:
Physical signs of tension.
Fleeting and micro facial expressions.
Shifty body language such as small emblem gestures.
You could then do the same but listen for any acoustic
signals, such as raised pitch and frequent hesitation, perhaps
comparing their verbal behavior to an example when you
know they are not lying.
Summary
In the story at the start of this chapter, it turned out that the
business consultant had seriously misled the business
owner: the rival company was a genuine threat to her
business’ expansion after all. How could she have picked up
on this?
Unfortunately, there is no surefire way to tell if somebody is
deceiving you, especially if it is somebody you do not know
well. However, Alicia could have checked the facts: did the
other neighborhood have real gelato? Was the promise that
there were no decent ice cream cafes in that town too good to
be true? The deceiver also showed a possible micro-
expression (a fleeting smile at an odd time) and an emblem
gesture when he slightly shook his head, possibly revealing
that he was saying the opposite of the truth. She might have
been able to figure it out, but perhaps assumed that this man
was telling the truth because most people are honest.
In the next chapter, we will explore what some people might
call a special category of scam. We look at pseudoscience and
how to distinguish it from real science and technology.
Takeaways
1. Tune into the visible and audible signs of potential
deception: you can learn them through careful observation
and practice. However, you need to apply critical thinking to
what they say and pair this with a keen observation to get
closer to the truth.
2. There is no surefire way to detect lies, but knowing the
person or establishing a baseline will help. Even a host of
behavioral clues cannot prove that somebody is lying.
3. People believe others by default, and research suggests
this is warranted as most people are honest.
4. Selling products and ideas is perhaps the exception; scams
and frauds are sadly very common, but you can detect them
and overcome them using a skeptical, analytical approach.
6
PSEUDOSCIENCE VERSUS SCIENCE
When Marlon’s Mom moved to her retirement
apartment, he noticed something strange. The
apartment was spotlessly clean, but he found
the same small object in every corner of every room and
window recess.
Marlon assumed that the previous resident must have gone
crackers. He or she had stashed a horse chestnut in every
corner they could find. The removal men carried on moving
his mom’s possessions in, whistling happily as they wedged
the large couch into the small living room. Marlon heard a
tiny wood-like object roll along the floor underneath the
couch.
“Excuse me, guys,” he said. “I don’t think Mom wants those
chestnuts everywhere. Can you put them in the trash,
please?”
The two assistants put down an oak dresser and looked to
their foreman for guidance, but Marlon’s Mom interjected
before he could say a word.
“They’re fine, gentlemen. Please carry on,” she said to the
removal men, giving Marlon a pointed look.
As the removal men carried on, Marlon looked to his Mom in
confusion.
“Isn’t it odd that they just left these chestnuts everywhere?
Why don’t you want them thrown out?” he asked.
His Mom gave him a superior look.
“They keep the bugs away, Marlon. It’s a tried and tested
natural remedy. I would have thought you would approve.”
Marlon could not help but burst out laughing, but his mother
was clearly serious.
“Proven? Who proved it?” he asked once he had his breath
back.
“Not your new-fashioned scientists. Housewives have
known about it forever. Spiders are scared of the fumes they
give off or something like that. My Grandmother taught my
mother, and my mother taught me. Did you ever see a spider
in my house? I thought not.”
Marlon was sensible enough to mumble in agreement and
then drop the conversation. He had to admit he had never
seen a spider in his Mom’s house, but she spent an awful lot
of time dusting.
In fact, science has found no evidence for Marlon’s Mom’s
belief that horse chestnuts deter spiders 1. This particular
erroneous belief is benign, but it illustrates the point that
sometimes people simply believe in received wisdom.
Marlon’s mother believed her home was spider-free because
of the chestnuts, exhibiting confirmation bias. Still, as
Marlon’s inner voice hinted to him, the lack of spiders was
more likely due to her constant cleaning.
You might conclude that the mother’s belief in the chestnut
deterrent was a harmless superstition, but are all
superstitions harmless? Where it gets more debatable is the
case of pseudosciences. These are more complex and far-
reaching than superstitions; they involve entire belief
systems.
A pseudoscience is a collection of beliefs or practices that
people mistakenly regard as scientific. Sciences challenge
their own claims and look for evidence that might prove
these claims false through systematic observation and
experiment. In contrast, pseudosciences aim to look for
evidence that supports their claims, seeking confirmation
rather than falsification.
How (And Why) Science Works
Scientists analytically explore the world. The scientific
process is the most reliable way of understanding the world
because it involves hypothesis-testing and various
mechanisms to scrutinize the conclusions and evidence.
Essentially, science centers on things we can observe and
measure. We can define science as:
“An interconnected series of concepts and conceptual
schemes that have developed as a result of experimentation
and observation and are fruitful of further experimentation
and observation” 2.
Conceptual schemes mean theories and models, and
hypothesis testing means generating a testable idea and then
using observation or experiment to test it. If we cannot
observe something or test it by experimenting, it is probably
not a scientific idea. Scientists have an array of methods and
techniques to scrutinize evidence and draw conclusions;
many of these are specialized to the individual field of study.
To understand whether a conclusion is valid, we should
examine the techniques used to collect the evidence as well
as the evidence itself. We can then make our own judgment
about the reliability of any associated claims.
Science should employ a rational approach, actively looking
to make sense of the world logically. Scientists describe their
current understanding of a given situation with reference to
the evidence, as well as an assessment of their confidence in
that understanding 3.
Science makes certain very basic assumptions, including that
objective reality exists and that we can observe and measure
it, and find out the truth about things 4. Some adherents of
pseudosciences might question these fundamentals, making
it difficult for science to compete fairly with pseudoscience.
Validity is the term research scientists use for whether their
research methodology is relevant to what they want to study.
By assessing validity, they cue themselves to think clearly
and stay on track. For example, a biologist would consider
blood tests a valid way of measuring liver function, but this
technique would not work for all biological variables.
Scientific research studies involve several formal stages. At
each stage, the investigators try to remain as objective and
bias-free as possible. You can think of the stages described
below as a repeating process because scientists use their
observations and experiments to develop their theories,
leading to refinement of the theory, which leads to further
research questions 5,6.
Observation: Many scientific studies begin with observations.
This is when somebody observes something interesting by
chance, without changing anything to see what would
happen. For example, a nurse might notice that patients with
a certain condition seem to recover faster when doctors
prescribe them aspirin rather than paracetamol.
Formulate a research question: this is quite specific, but not
usually something that investigators can cover in a single
study or experiment. It takes the form of “does variable A
affect variable(s) B”. A research question for our example
might be ‘does aspirin improve the symptoms of the
condition?’ The investigators might do several studies to
examine this.
Narrow the question: many research questions are too broad
to form testable hypotheses, so scientists must reduce them
to questions they can examine in a single study or set of
studies. Narrower questions for our example could be: do
patients given aspirin spend fewer nights in the hospital, do
they live longer, or do they experience a lower rate of relapse
compared to those given paracetamol?
Conduct research: gather data that forms evidence from
experiments or observations. Observational studies gather
data without changing the situation, whereas experiments
change one variable while keeping everything else constant.
Research may be naturalistic, like our example with hospital
patients, or scientists might contrive a more tightly
controlled artificial situation.
Analyze data: scientists use descriptive and inferential
statistical methods to compare data against baselines, over
time, or between two or more experimental conditions.
There are a huge number of methods available to analyze
qualitative and quantitative data, and this is such a
specialized area that statisticians work as collaborators and
consultants in all scientific fields.
Draw a conclusion: delineate the most likely explanation of
the data while also discussing alternative explanations. This
does not imply criticizing or outright rejecting alternative
explanations since the chosen interpretation could still be
wrong. Further evidence might result in the scientific
community re-interpreting the result.
The scientific community is a key player in the continued
effort of scientists to reach a better estimate of the truth.
Scientists write specialist articles and present their data at
conferences to spark debate among their peers. Most
scientific papers are published in peer-reviewed journals,
where experts scrutinize the article before publication 7. This gives
readers reassurance that the methods are appropriate and
the data justify the conclusions.
What Is Pseudoscience?
Now that we have a clear definition of what science is and its
methods, we need to define pseudoscience. As the prefix
‘pseudo’ implies, pseudoscience refers to beliefs and
activities that might resemble science but are not science 8.
We call them ‘not science’ because they diverge from
mainstream scientific theory, and in some cases, scientific
methods cannot test them either 9.
The line between science and pseudoscience is not always
clear, though. Investigators working in pseudoscience are
free to employ hypothesis-testing and scientific techniques
to examine evidence and conclusions. But they sometimes
make mistakes in the process, producing misinformation
and incorrect conclusions.
Examples of pseudoscience:
Alternative medicine: alternative therapists sometimes fail to
specify how the therapy works or make general references to
things like the energy that the practitioner harnesses or
directs into the client's body. It would be difficult to devise
an adequate control situation to compare to these therapies.
Pseudoscientific therapies often rely on hearsay rather than
clinical trials, and this can be subject to confirmation bias
and the hasty generalization fallacy 10.
Psychic powers: many people across the world believe in
supernatural powers like extra-sensory perception and
clairvoyance. Believers and scientists alike find it difficult to
test these ideas, and although many have tried, the evidence
is inconclusive 11.
Astrology: predicting people’s personality traits and future
events from the position of the stars, the Moon, and planets
is another ancient practice that appeals to people across the
world. The predictions are vague and often not falsifiable,
and therefore have not been tested in a rigorous way like
scientific theories 12. Investigators have found no correlation
between star signs and personality traits 13.
We should not confuse folk remedies and young sciences
with pseudosciences. Be skeptical of ancient traditions: they
might work and might not, but age alone does not imply
efficacy 14. We should also be open-minded about young
sciences while they establish their methods and world views,
although further scientific investigation may falsify them.
One example is germ theory, which the scientific
establishment thought was implausible at first, but further
investigations confirmed that microbes, not foul air, caused
diseases 15.
People and communities hold biases, but so do scientists.
The history of science shows that socio-cultural contexts
affect how scientists work, despite their drive to be bias-
free. For example, the mistaken idea that the surface of the
human brain looked like the gut influenced early scientific
drawings of the brain, even though the artist could see a real
brain in front of them 16.
Why Do People Believe In Pseudosciences?
Confirmation bias and emotional appeal are two reasons why
pseudoscience draws people in, but people do not adopt
unscientific beliefs for no reason. Some pseudosciences are
ancient, even predating modern science and medicine, and
in these cases, people are sticking with what they already
believe. We know that people are reluctant to change their
minds once they have decided 17.
According to surveys conducted in 1993 and 2000, most
American college students did not know the difference
between astronomy (the scientific study of the cosmos) and
astrology. Scientific education can help people distinguish
pseudoscience from real science, but they may hold onto the
pseudoscience as well, particularly if it is a widespread
tradition like astrology. Newspapers have been publishing
horoscopes for generations, and their appearance alongside
news may enhance their believability 18.
However, demonstrations of scientific reasons behind
phenomena can cause people to revise their beliefs 19,20. For
example, many people worldwide see solar and lunar
eclipses as supernatural events, but they revise their belief
when scientists demonstrate that they can predict eclipses in
advance 21.
Some evidence points to shared traits among believers of
pseudoscience. People with a ‘conspiracy mentality’ and
lower knowledge of science were more likely to believe in
pseudoscience. Conspiracy mentality consists of distrust and
paranoia aimed at authorities like governments, which
correlates with certain personality traits. This mentality
correlates with conspiracy theorizing, which correlates with
rejecting scientific evidence 22.
Another characteristic of believers in pseudoscience is a
more intuitive thinking style: they engage in the faster, more
automatic reasoning style that Kahneman described 23.
However, instructing people to use the slow system and
think more critically can reduce automatic belief in
unfounded claims 24.
Distinctions Between Science and Pseudoscience
One major difference between science and pseudoscience is
that pseudoscience seeks confirmation, whereas science
seeks falsification. When people claim that pseudoscience
effects a cure for a disease, they work back from the result
and conclude that the pseudoscientific intervention caused it
25. Real scientists, in contrast, are skeptical of their own
findings and theories, alongside being highly motivated to
discover the truth 26.
Some pseudosciences are not testable and rely on people
having faith and belief that they work or explain things. In
contrast, scientific ideas are measurable and testable by
definition. However, scientific ideas sometimes contradict
common sense, and pseudoscientific ideas sometimes seem
to make more sense to laypeople 27.
A further difference is that pseudosciences do not change
and develop in the same way as sciences. Instead, they either
remain the same or change randomly, whereas scientists add
new ideas to their science based on research refining their
theories. This generates more hypotheses for them to
investigate. Pseudosciences develop more haphazardly, with
new ideas having no necessary relationship to the previous
ones 28,29, whereas sciences usually develop gradually with
occasional drastic paradigm shifts 30,31.
Additionally, pseudoscientists do not usually publish their
work in peer-reviewed journals. We can also check their
sources, as pseudoscientific practitioners do not always
reference scholarly sources 32.
Pseudosciences are popular because people would like to
believe them; they are exciting and capture people’s
imagination, perhaps more so than mainstream science.
Pseudosciences may be harmless, although arguably some
are not, such as those that divert people from seeking
effective treatment. More broadly, believing in untrue things
is perhaps disempowering because it prevents people from
accessing the truth 33.
How To Approach Ideas With An Open Mind
To sum up, critical thinkers must be able to approach a
pseudoscientific issue with an open mind, ready to follow the
evidence and the arguments wherever they lead. Apply your
skepticism and reasoning skills.
If you think an idea may be pseudoscientific, look for its
sources, examine the logic of the argument, and consider
whether the author has a motive for ‘selling’ the
pseudoscience, such as profiting from related products 34. This approach works:
both critical thinking and a skeptical approach reduce
people’s belief in unfounded pseudosciences 35. There are a
few useful ideas you can apply when reading or hearing
about pseudosciences.
Firstly, remember Occam’s razor: the best theory is the one
that forces you to make the fewest assumptions. The
simplest explanation is often the best. Some pseudosciences
ask us to believe in unfalsifiable and unmeasurable ideas like
a universal energy that can flow through crystals, and we can
apply Occam’s razor to such ideas 36. The simpler
explanation is the placebo effect, a well-documented
phenomenon whereby people’s expectations alone can make
them feel better.
Secondly, apply the burden of proof principle: when
somebody asks society to believe something that
diverges from our accepted knowledge about the world, they
should supply evidence to support their claims, rather than
making it society’s job to disprove them.
Thirdly, Sagan’s balance principle says that extraordinary
claims require extraordinary evidence. Therefore, minor
evidence like a handful of cures or correct predictions cannot
‘prove’ a pseudoscience 37, 38.
Although many scientists deride pseudosciences, remember
that all sciences have to start somewhere, and it takes time
to gather the evidence. By keeping an open mind and
analyzing things rationally, you can keep abreast of new
developments while avoiding getting sucked in by nonsense.
Action Steps
1. Detective Work
Make a brief list of possible pseudosciences and use your
skills to gather evidence and decide whether you think they
are real science or pseudoscience. If you need ideas, choose a
couple of these:
Iridology
Mycology
Homeopathy
Neurochemistry
Geomorphology
Macrobiotics
2. Study Skills
Devise a scientific theory within your field of expertise, and
plan an investigation. This could be something work-related,
within a leisure pastime (such as sports or creative work), or
something silly and fun. Whatever you choose, aim to be
thorough. It is fine if you cannot conduct the study for real,
for example, because it would demand too much time or too
many resources, or raise ethical issues.
Work through the general scientific method to hone your
idea and generate something you can test. Make casual
observations, formulate a research question, narrow this to a
testable hypothesis, and consider how you would analyze the
data. If you are not a statistical expert, never fear - you can
always draw a graph and compare the data visually.
Finally, consider what valid conclusions you could draw from
different results. Congratulations, you have just proved you
are not a pseudoscientist!
Summary
In the anecdote at the start of the chapter, we met Marlon,
who was confused by his Mom’s insistence on keeping horse
chestnuts in the corners of her apartment. She said this was
a well-known way of keeping spiders out of the home, but
she could not explain why horse chestnuts put them off. This
vague explanation is similar to pseudoscience: people might
believe something works, but they do not know why.
Marlon’s Mom believed the practice worked because it was
traditional, and she also exhibited confirmation bias. Even
Marlon succumbed to it slightly when he reflected that he
had never seen a spider in his Mom’s house. However, such
personal impressions are less reliable than objective evidence.
A scientific approach to any idea requires observation,
followed by defining a solid research question that you can
test in the real world. This kind of study does not always get
done for pseudoscience. In many instances, it cannot be done
because there is no adequate control condition to compare to
pseudoscientific practice. Overall, science and pseudoscience
alike provide us with ample opportunities to exercise our
critical faculties.
Takeaways
1. Scientific methods and processes are the most reliable
ways to explore and find out about the world.
2. However, not everything that resembles science is actual
science. Mistakes and misrepresentations in the form of
pseudoscience can tempt people towards incorrect
conclusions.
3. Pseudosciences persist for many reasons, including
inherent biases, wishful thinking, tradition, and certain
personality traits.
4. Keep an open mind about novel ideas, but remember that
some ideas are more useful than others because they help us
understand and predict the world.
AFTERWORD
Marvin lazed on the decking at his lake house, watching the
fish whirling around in the clear water. His work phone
vibrated on the kitchen counter, but he let it ring. He knew
the call spelled no good for his summer retreat.
Hours later, the evening drew in, and Marvin finally got
around to checking his missed calls. He was surprised to see
that his bank had called him and left a message. That was
unexpected. He managed to call them on the number they
left and got through to an operator straight away. The line
was terrible, but the voice on the other end sounded urgent.
“I can’t hear you,” said Marvin.
“Give me your password, Mr. Keller,” the crackly voice said.
“Of course…” Marvin duly gave the operator his password
and further security details.
Apparently, there was a problem with his account, which
meant he had to wire his money to a different account
urgently.
Did you guess? Scammers targeted Marvin and managed to
get him to transfer all his funds to them. He did not even
notice until he returned from his lake house to double
trouble: his work colleagues had realized he had ripped them
off, and he realized he had gained nothing because he had
fallen for a telephone banking scam.
In this example, the protagonist was lax about checking the
credentials of the person calling him. The signs were there,
particularly the unscheduled contact and urgency of
transferring the funds. His lack of skepticism about the call
ended up costing him a lot.
Separating sense from nonsense is a massively difficult task,
not least because potential deceptions bombard us all the
time, almost as if they were waiting in line for us to drop our
guards. However, we can get closer to the truth by applying
critical thinking techniques to information we encounter
each day. In summary:
Critical thinking approach: this means reasoning logically,
using evidence rather than working to justify conclusions we
desire. Gathering information to argue for a predetermined
conclusion is easy but wrong. With critical thinking, we can
be sure that our decisions are conscious, deliberate, and
based on facts. We must be clear about the difference
between facts, opinions, and claims. We must know about
the role emotions play in human cognition. Lastly, we must
seek evidence relating to purported facts, including
researching the source of and reason for any message.
Our complex minds: how our brains work can lead to blurred
boundaries between truth and non-truth, or even getting
things completely wrong without even being aware of it.
Humans are emotional creatures with a drive to learn from
and believe others, so, unsurprisingly, misinformation
spreads. Furthermore, biases, fallacies, and heuristics all
have a significant influence on our thinking, sometimes
without us ever becoming aware of it.
Scientific skepticism: this is an attitude that can help gauge
the truth of claims. Be like a scientist and question whether a
claim you hear can be verified or falsified. Scientific
skepticism means overcoming our natural inclination to
process information quickly and automatically, and instead
stepping back, slowing down, and really analyzing what we
encounter. Skepticism means doubt, not necessarily
disbelief, and it works best with an open-minded outlook.
The media: social media and the mass media are the major
sources of information for the vast majority of people these days, but
they vary in reporting accuracy. Some information can even
be completely false, designed to lure people in to spend
money and/or time on websites run by shysters. Use media
literacy techniques like lateral reading to get a deeper
understanding of the information you see in the media,
rather than taking it at face value.
Deception: dishonesty is fairly widespread outside of the
media, too. Most people are honest enough about the things
that matter, but we would all be wise to stay alert for the
signs that people are lying to us. Faces, voices, and body
language all provide clues, but we should pay attention to
what they say as well. Similarly, be alert to the signs of
fraudsters using scams like advance payment schemes.
Pseudosciences: these are explanations or techniques that claim a
scientific basis or approach, but they are distinct from
sciences in several ways. Science uses a cycle of observation,
testing, and refinement of theories and methods, aiming to
advance knowledge in a specific area. In contrast,
pseudosciences are sometimes difficult to test in a truly
scientific manner. However, cynics sometimes mislabel
progressive science as pseudoscience, so we should do our
best to assess new ideas in an open-minded and skeptical
manner.
In conclusion, now that you have the tools required to
separate fact from fiction, make sure to do your critical
thinking as well as you can and work to develop it. Critical
thinking helps you recognize and avoid harmful and useless
thought patterns. It helps you to reach better conclusions. It
improves the quality of your thinking, raising your chances
of achieving your goals. Good luck!
REFERENCES
1. What It Means To Be A Critical Thinker In This Day And Age
1 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
2 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
3 The Foundation For Critical Thinking (2019). Critical Thinking: Where To Begin.
Available at: https://www.criticalthinking.org/pages/critical-thinking-where-
to-begin/796 (Accessed: 14th December 2020)
4 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
5 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
6 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
7 Taber, C.S. & Lodge, M. (2006). Motivated skepticism in the evaluation of
political beliefs. American Journal Of Political Science, 50(3), pp. 755-769. doi:
10.1111/j.1540-5907.2006.00214.x
8 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
9 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
10 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
11 Stanovich, K.E., West, F.R., Toplak, M.E. (2013). Myside Bias, Rational
Thinking, and Intelligence. Current Directions in Psychological Science, 22(4) pp.
259–264. doi: 10.1177/0963721413480174
12 Taber, C.S. & Lodge, M. (2006). Motivated skepticism in the evaluation of
political beliefs. American Journal Of Political Science, 50(3), pp. 755-769. doi:
10.1111/j.1540-5907.2006.00214.x
13 Russell, J. A. (2003). Core affect and the psychological construction of emotion.
Psychological Review, 110(1), 145–172. doi: 10.1037/0033-295X.110.1.145
14 Kozlowski, D., Hutchinson, M., Hurley, J., Rowley, J.,
Sutherland, J. (2017). The role of emotion in clinical decision making: an
integrative literature review. BMC Medical Education, 17(1), p255. doi:
10.1186/s12909-017-1089-7
15 Taber, C.S. & Lodge, M. (2006). Motivated skepticism in the evaluation of
political beliefs. American Journal Of Political Science, 50(3), pp. 755-769. doi:
10.1111/j.1540-5907.2006.00214.x
16 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
17 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
2. What Keeps Us From Getting To The Truth?
1 Rauscher, F.H., Shaw, G.L. & Ky, K.N. (1993). Music and spatial task
performance. Nature, 365, p611. doi: 10.1038/365611a0
2 Nantais, K. & Schellenberg, G.E. (1999). The Mozart effect: an artifact of
preference. Psychological Science 10(4), pp370-373. doi: 10.1111/1467-9280.00170
3 Pietschnig, J., Voracek, M., & Formann, A. K. (2010). Mozart effect–Shmozart
effect: A meta-analysis. Intelligence, 38(3), pp314–323. doi:
10.1016/j.intell.2010.03.001
4 Taber, C.S. & Lodge, M. (2006). Motivated skepticism in the evaluation of
political beliefs. American Journal Of Political Science, 50(3), pp. 755-769. doi:
10.1111/j.1540-5907.2006.00214.x
5 Stanovich, K.E., West, F.R., Toplak, M.E. (2013). Myside Bias, Rational Thinking,
and Intelligence. Current Directions in Psychological Science, 22(4) pp. 259–264.
doi: 10.1177/0963721413480174
6 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
7 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
8 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
9 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
10 Frank M.C., Vul E., Johnson S.P. (2009). Development of infants' attention to
faces during the first year. Cognition, 110(2), pp160-170. doi:
10.1016/j.cognition.2008.11.010.
11 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
12 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
13 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
14 Bower, G. H., Monteiro, K. P., & Gilligan, S. G. (1978). Emotional mood as a
context for learning and recall. Journal of Verbal Learning & Verbal Behavior, 17(5),
pp573–585. doi: 10.1016/S0022-5371(78)90348-1.
15 Bower, G. H. (1981). Mood and memory. American Psychologist, 36(2), pp129–
148. doi: 10.1037/0003-066X.36.2.129
16 Bower, G. H., Monteiro, K. P., & Gilligan, S. G. (1978). Emotional mood as a
context for learning and recall. Journal of Verbal Learning & Verbal Behavior, 17(5),
pp573–585. doi: 10.1016/S0022-5371(78)90348-1.
17 Bower, G. H. (1981). Mood and memory. American Psychologist, 36(2), pp129–
148. doi: 10.1037/0003-066X.36.2.129
18 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
19 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
20 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
21 Heck, P.R., Simons, D.J., Chabris, C.F. (2018) 65% of Americans believe they
are above average in intelligence: Results of two nationally representative
surveys. PLoS ONE, 13(7), e0200103. doi: 10.1371/journal.pone.0200103
22 Heck, P.R., Simons, D.J., Chabris, C.F. (2018) 65% of Americans believe they
are above average in intelligence: Results of two nationally representative
surveys. PLoS ONE, 13(7), e0200103. doi: 10.1371/journal.pone.0200103
23 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
24 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
25 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
26 Russell, J.A. (2003) Core affect and the psychological construction of emotion.
Psychological Review, 110(1), pp145-172 doi: 10.1037/0033-295x.110.1.145
27 Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning:
The conjunction fallacy in probability judgment. Psychological Review, 90, 293-
315. doi:10.1037/0033-295X.90.4.293
28 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
29 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
30 Yap, A. (2013) Ad Hominem Fallacies, Bias, and Testimony. Argumentation,
27(2), pp97-109. doi: 10.1007/s10503-011-9260-5
31 Walton, D.N. (1987) The ad Hominem argument as an informal fallacy.
Argumentation, 1, pp317–331. doi: 10.1007/BF00136781
32 Walton, D. (1999) Rethinking the Fallacy of Hasty Generalization.
Argumentation, 13, pp161–182. doi: 10.1023/A:1026497207240
33 Law, S (2006) Thinking tools: The bandwagon fallacy. Think, 4(12), pp. 111. doi:
10.1017/S1477175600001792
34 Asch, S. E. (1956). Studies of independence and conformity: I. A minority of
one against a unanimous majority. Psychological Monographs: General and Applied,
70(9), 1–70. doi: 10.1037/h0093718
35 Sternberg, R.J. & Halpern, D.F. (Eds.) (2020) Critical Thinking In Psychology (2nd
Ed.). Cambridge, UK: Cambridge University Press.
36 Rosenkopf, L., Abrahamson, E. (1999). Modeling Reputational and
Informational Influences in Threshold Models of Bandwagon Innovation
Diffusion. Computational & Mathematical Organization Theory, 5, pp361–384 doi:
10.1023/A:1009620618662.
37 Fiol, C.M. & O'Connor, E.J. (2003). Waking up! Mindfulness in the face of
bandwagons. Academy of Management Review, 28, pp 54-70. doi:
10.5465/AMR.2003.8925227.
38 Fiol, C.M. & O'Connor, E.J. (2003). Waking up! Mindfulness in the face of
bandwagons. Academy of Management Review, 28, pp 54-70. doi:
10.5465/AMR.2003.8925227.
39 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
40 Nickerson, R. (1998). Confirmation bias: A ubiquitous phenomenon in many
guises. Review of General Psychology, 2, pp175-220. doi: 10.1037/1089-2680.2.2.175
41 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
42 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
43 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
44 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
45 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
46 Wilson, T. D., Houston, C. E., Etling, K. M., & Brekke, N. (1996). A new look at
anchoring effects: Basic anchoring and its antecedents. Journal of Experimental
Psychology: General, 125, pp387-402. doi: 10.1037/0096-3445.125.4.387
47 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
48 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
49 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
50 Ross, L., Greene, D., House, P. (1977) The “false consensus effect”: An
egocentric bias in social perception and attribution processes. Journal of
Experimental Social Psychology, 13 (3), pp279-301. doi: 10.1016/0022-
1031(77)90049-X.
51 Marks, G., & Miller, N. (1987). Ten years of research on the false-consensus
effect: An empirical and theoretical review. Psychological Bulletin, 102(1), 72–90.
doi: 10.1037/0033-2909.102.1.72
52 Gilovich, T. (1990). Differential construal and the false consensus effect.
Journal of Personality and Social Psychology, 59(4), pp623–634. doi: 10.1037/0022-
3514.59.4.623
53 Nisbett, R. E., & Wilson, T. D. (1977). The halo effect: Evidence for unconscious
alteration of judgments. Journal of Personality and Social Psychology, 35(4), pp250–
256. doi: 10.1037/0022-3514.35.4.250
54 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
55 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
56 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
57 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
58 Baddeley, A (1997). Human Memory: Theory And Practice. (Revised Ed.). Hove,
UK: Psychology Press.
59 Tulving, E. (1983). Elements Of Episodic Memory. New York: Oxford University
Press.
60 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
61 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
62 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
63 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
64 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
65 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
66 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
67 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
68 Fiol, C.M. & O'Connor, E.J. (2003). Waking up! Mindfulness in the face of
bandwagons. Academy of Management Review, 28, pp 54-70. doi:
10.5465/AMR.2003.8925227.
69 Festinger, L. (1957). A Theory Of Cognitive Dissonance. Stanford, CA: Stanford
University Press.
70 Miller, M.K., Clark, J.D., Jehle, A. (2015) Cognitive Dissonance Theory (Festinger).
In: The Blackwell Encyclopaedia Of Sociology.
doi: 10.1002/9781405165518.wbeosc058.pub2
71 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
72 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
73 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
74 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
75 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
76 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
77 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
78 Little, R.J., D'Agostino, R., Cohen, M.L., Dickersin, K., Emerson, S.S., Farrar,
J.T., Frangakis, C., Hogan, J.W., Molenberghs, G., Murphy, S.A., Neaton, J.D.,
Rotnitzky, A., Scharfstein, D., Shih, W.J., Siegel, J.P., Stern, H. (2012) The
prevention and treatment of missing data in clinical trials. New England Journal Of
Medicine, 367(14), pp1355-60. doi: 10.1056/NEJMsr1203730
79 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
80 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
81 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
82 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
83 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
84 Heck, P.R., Simons, D.J., Chabris, C.F. (2018) 65% of Americans believe they
are above average in intelligence: Results of two nationally representative
surveys. PLoS ONE, 13(7), e0200103. doi: 10.1371/journal.pone.0200103
85 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
86 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
87 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
88 Ayton, P., & Fischer, I. (2004) The hot hand fallacy and the gambler’s fallacy:
Two faces of subjective randomness? Memory & Cognition, 32, pp1369–1378. doi:
10.3758/BF03206327
3. Why Having A Scientifically Skeptical Mind Helps You Discover The
Truth
1 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
2 The Editors of Encyclopaedia Britannica (2016). Verifiability Principle.
Encyclopædia Britannica. Available at https://www.britannica.com/topic/
verifiability-principle (Accessed January 15, 2021)
3 American Institute Of Physics (2018). Science Strategies Chart Course for
Detecting Life on Other Worlds https://www.aip.org/fyi/2018/science-strategies-
chart-course-detecting-life-other-worlds (Accessed 1 February 2021)
4 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
5 Ayer, A. J. (1936). Language, Truth, And Logic. London, UK: V. Gollancz.
6 Shankar, S. (2017) Verifiability And Falsifiability As Parameters For Scientific
Methodology. International Journal of Education & Multidisciplinary Studies, 7(2),
pp130-137. doi: 10.21013/jems.v7.n2.p10
7 Shankar, S. (2017) Verifiability And Falsifiability As Parameters For Scientific
Methodology. International Journal of Education & Multidisciplinary Studies, 7(2),
pp130-137. doi: 10.21013/jems.v7.n2.p10
8 Popper, K. (1963) Conjectures And Refutations: The Growth Of Scientific Knowledge.
London, UK: Routledge & Kegan Paul.
9 Rauscher, F.H., Shaw, G.L. & Ky, K.N. (1993). Music and spatial task
performance. Nature, 365, p611. doi: 10.1038/365611a0
10 Nantais, K. & Schellenberg, G.E. (1999). The Mozart effect: an artifact of
preference. Psychological Science 10(4), pp370-373. doi: 10.1111/1467-9280.00170
11 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
12 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
13 Pietschnig, J., Voracek, M., & Formann, A. K. (2010). Mozart effect–Shmozart
effect: A meta-analysis. Intelligence, 38(3), pp314–323. doi:
10.1016/j.intell.2010.03.001
14 Neyman, J.; Pearson, E. S. (1933). The testing of statistical hypotheses in
relation to probabilities a priori. Mathematical Proceedings of the Cambridge
Philosophical Society, 29(4), pp492–510. doi: 10.1017/s030500410001152x.
15 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
16 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
17 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
18 Schupbach, J., & Sprenger, J. (2011). The Logic of Explanatory Power.
Philosophy of Science, 78(1), pp105-127. doi:10.1086/658111
19 Arditti, J., Elliott, J., Kitching, I. & Wasserthal, L. (2012). ‘Good Heavens what
insect can suck it’– Charles Darwin, Angraecum sesquipedale and Xanthopan
morganii praedicta. Botanical Journal of the Linnean Society, 169, pp403–432. doi:
10.1111/j.1095-8339.2012.01250.x.
20 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
21 Grafman, J. (2000) Conceptualizing functional neuroplasticity. Journal of
Communication Disorders, 33(4), 345-356, doi: 10.1016/S0021-9924(00)00030-7.
22 Liu, D.W.C. (2012) Science Denial and the Science Classroom. CBE - Life
Sciences Education, 11(2) pp129-134.
23 Sagan C. (1987) The Burden Of Skepticism. Skeptical Inquirer, 12(1) https://
skepticalinquirer.org/1987/10/the-burden-of-skepticism/
24 Dwyer, C. (2017). Critical Thinking: Conceptual Perspectives and Practical
Guidelines. Cambridge: Cambridge University Press. doi:10.1017/9781316537411
25 Sagan C. (1987) The Burden Of Skepticism. Skeptical Inquirer, 12(1) https://
skepticalinquirer.org/1987/10/the-burden-of-skepticism/
26 Truzzi, M. (1987) On Pseudo-Skepticism. Zetetic Scholar, 12/13, pp3-4.
27 Truzzi, M. (1987) On Pseudo-Skepticism. Zetetic Scholar, 12/13, pp3-4.
28 Snelson, J.S. (1992) The Ideological Immune System: Resistance to New Ideas
in Science. Skeptic, 1(4).
29 Snelson, J.S. (1992) The Ideological Immune System: Resistance to New Ideas
in Science. Skeptic, 1(4).
30 Çakici, D., Metacognitive Awareness and Critical Thinking Abilities of Pre-
Service EFL Teachers, Journal of Education and Learning, 7(5) pp116-129. doi:
10.5539/jel.v7n5p116
31 Flavell, J. (1979). Metacognition and Cognitive Monitoring: A New Area of
Cognitive-Developmental Inquiry. American Psychologist, 34, 906-911.
32 Schraw, G. (1998) Promoting general metacognitive awareness. Instructional
Science, 26, pp113–125. doi: 10.1023/A:1003044231033
33 Çakici, D., Metacognitive Awareness and Critical Thinking Abilities of Pre-
Service EFL Teachers, Journal of Education and Learning, 7(5) pp116-129. doi:
10.5539/jel.v7n5p116
34 Paul, R. & Elder, L. (2013) Critical Thinking: Intellectual Standards
Essential to Reasoning Well Within Every Domain of Human Thought, Part Two.
Journal Of Developmental Education, 37(1).
35 Wilterdink, N. (2002). The sociogenesis of postmodernism. European Journal of
Sociology, 43(2), pp190-216.
36 Duignan, B. (2020) Postmodernism. Encyclopedia Britannica, https://www.
britannica.com/topic/postmodernism-philosophy. (Accessed 22 January 2021).
37 Wilterdink, N. (2002). The sociogenesis of postmodernism. European Journal of
Sociology, 43(2), pp190-216.
38 Wilterdink, N. (2002). The sociogenesis of postmodernism. European Journal of
Sociology, 43(2), pp190-216.
39 Duignan, B. (2020) Postmodernism. Encyclopedia Britannica, https://www.
britannica.com/topic/postmodernism-philosophy. (Accessed 22 January 2021).
40 Dennett, D.C. (2013). On Wieseltier V. Pinker in The New Republic: Let's Start
With A Respect For Truth. Edge, https://www.edge.org/conversation/
daniel_c_dennett-dennett-on-wieseltier-v-pinker-in-the-new-republic.
(Accessed 22 January 2021).
41 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
42 Kuhn, T. S. (1970). The structure of scientific revolutions (2nd Ed.). University of
Chicago Press: Chicago.
43 Wilterdink, N. (2002). The sociogenesis of postmodernism. European Journal of
Sociology, 43(2), pp190-216.
44 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
45 Snelson, J.S. (1992) The Ideological Immune System: Resistance to New Ideas
in Science. Skeptic, 1(4).
46 Paul, R. & Elder, L. (2013) Critical Thinking: Intellectual Standards
Essential to Reasoning Well Within Every Domain of Human Thought, Part Two.
Journal Of Developmental Education, 37(1).
4. Why The Media Can Make Or Break Our Thinking
1 Watson, C.A. (2018) Information Literacy in a Fake/False News World: An
Overview of the Characteristics of Fake News and its Historical Development.
International Journal of Legal Information, 46(2), pp. 93-96.
2 McDermott, R. (2019) Psychological Underpinnings of Post-Truth in Political
Beliefs. Political Science & Politics, 52(2), pp218-222.
3 Watson, C.A. (2018) Information Literacy in a Fake/False News World: An
Overview of the Characteristics of Fake News and its Historical Development.
International Journal of Legal Information, 46(2), pp. 93-96.
4 Vosoughi, S., Roy, D., Aral, S. (2018) The spread of true and false news online.
Science, 359, pp1146-1151. doi: 10.1126/science.aap9559
5 Aral, S. & Van Alstyne, M.W. (2011). The Diversity-Bandwidth Tradeoff.
American Journal of Sociology, 117(1). doi: 10.2139/ssrn.958158
6 Vosoughi, S., Roy, D., Aral, S. (2018) The spread of true and false news online.
Science, 359, pp1146-1151. doi: 10.1126/science.aap9559
7 Itti, L. & Baldi, P. (2009). Bayesian surprise attracts human attention, Vision
Research, 49 (10), pp1295-1306. doi: 10.1016/j.visres.2008.09.007.
8 Vuilleumier, P. (2005). How brains beware: neural mechanisms of emotional
attention. Trends in Cognitive Sciences, 9(12), pp585-94. doi:
10.1016/j.tics.2005.10.011
9 Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012).
Misinformation and Its Correction: Continued Influence and Successful
Debiasing. Psychological Science in the Public Interest, 13(3), 106–131. doi:
10.1177/1529100612451018
10 Vosoughi, S., Roy, D., Aral, S. (2018) The spread of true and false news online.
Science, 359, pp1146-1151. doi: 10.1126/science.aap9559
11 McDermott, R. (2019) Psychological Underpinnings of Post-Truth in Political
Beliefs. Political Science & Politics, 52(2), pp218-222.
12 Osborne, C.L. (2018). Programming to Promote Information Literacy in the Era
of Fake News. International Journal of Legal Information, 46(2), pp101-109. doi:
10.1017/jli.2018.21
13 Watson, C.A. (2018) Information Literacy in a Fake/False News World: An
Overview of the Characteristics of Fake News and its Historical Development.
International Journal of Legal Information, 46(2), pp. 93-96.
14 LaGarde, J. & Hudgins, D. (2018) Fact Vs. Fiction: Teaching Critical Thinking Skills
in the Age of Fake News. International Society for Technology in Education.
15 Watson, C.A. (2018) Information Literacy in a Fake/False News World: An
Overview of the Characteristics of Fake News and its Historical Development.
International Journal of Legal Information, 46(2), pp. 93-96.
16 Niedringhaus, K.L. (2018). Information Literacy in a Fake/False News World:
Why Does it Matter and How Does it Spread? International Journal of Legal
Information, 46(2), pp97-100. doi:10.1017/jli.2018.26
17 Murch, S.H., Anthony, A., Casson, D.H., Malik, M., Berelowitz, M., Dhillon,
A.P., Thomson, M.A., Valentine, A., Davies, S.E., Walker-Smith, J.A. (2004)
Retraction of an interpretation. Lancet. 363(9411):750. doi: 10.1016/S0140-
6736(04)15715-2. Erratum for: Lancet. 1998 Feb 28;351(9103):637-41.
18 Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012).
Misinformation and Its Correction: Continued Influence and Successful
Debiasing. Psychological Science in the Public Interest, 13(3), 106–131.
doi:10.1177/1529100612451018
19 Niedringhaus, K.L. (2018). Information Literacy in a Fake/False News World:
Why Does it Matter and How Does it Spread? International Journal of Legal
Information, 46(2), pp97-100. doi:10.1017/jli.2018.26
20 Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012).
Misinformation and Its Correction: Continued Influence and Successful
Debiasing. Psychological Science in the Public Interest, 13(3), 106–131.
doi:10.1177/1529100612451018
21 Osborne, C.L. (2018). Programming to Promote Information Literacy in the Era
of Fake News. International Journal of Legal Information, 46(2), pp101-109. doi:
10.1017/jli.2018.21
22 McDermott, R. (2019) Psychological Underpinnings of Post-Truth in Political
Beliefs. Political Science & Politics, 52(2), pp218-222.
23 Osborne, C.L. (2018). Programming to Promote Information Literacy in the Era
of Fake News. International Journal of Legal Information, 46(2), pp101-109. doi:
10.1017/jli.2018.21
24 Shearer, E. & Gottfried, J. (2017). News Use Across Social Media Platforms
2017, Pew Research Center. https://www.journalism.org/2017/09/07/news-use-
across-social-media-platforms-2017/
25 Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012).
Misinformation and Its Correction: Continued Influence and Successful
Debiasing. Psychological Science in the Public Interest, 13(3), 106–131.
doi:10.1177/1529100612451018
26 LaGarde, J. & Hudgins, D. (2018) Fact Vs. Fiction: Teaching Critical Thinking Skills
in the Age of Fake News. International Society for Technology in Education.
27 Blakeslee, Sarah (2004) "The CRAAP Test," LOEX Quarterly, 31(3). Available at:
https://commons.emich.edu/loexquarterly/vol31/iss3/4
28 Fielding, J.A. (2019) Rethinking CRAAP: Getting students thinking like fact-
checkers in evaluating web sources. College & Research Libraries News, 80(11),
pp.620-622. doi: 10.5860/crln.80.11.620
29 Wineburg, S. & McGrew, S. (2017) Lateral Reading: Reading Less and Learning
More When Evaluating Digital Information. Stanford History Education Group
Working Paper No. 2017-A1, Available at http://dx.doi.org/10.2139/ssrn.3048994
30 Fielding, J.A. (2019) Rethinking CRAAP: Getting students thinking like fact-
checkers in evaluating web sources. College & Research Libraries News, 80(11),
pp.620-622. doi: 10.5860/crln.80.11.620
31 Wineburg, S. & McGrew, S. (2017) Lateral Reading: Reading Less and Learning
More When Evaluating Digital Information. Stanford History Education Group
Working Paper No. 2017-A1, Available at http://dx.doi.org/10.2139/ssrn.3048994
32 Edelman Trust Barometer 2021. Available at https://www.edelman.com/sites/
g/files/aatuss191/files/2021-01/2021-edelman-trust-barometer.pdf
33 Niedringhaus, K.L. (2018). Information Literacy in a Fake/False News World:
Why Does it Matter and How Does it Spread? International Journal of Legal
Information, 46(2), pp97-100. doi:10.1017/jli.2018.26
34 Society of Professional Journalists (2014). SPJ Code Of Ethics. https://www.spj.
org/ethicscode.asp [accessed 12 Feb 2021]
35 Osborne, C.L. (2018). Programming to Promote Information Literacy in the Era
of Fake News. International Journal of Legal Information, 46(2), pp101-109. doi:
10.1017/jli.2018.21
5. Everyday Lies And Deception
1 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
2 Vrij, A. (2004). Guidelines to catch a liar. In P. Granhag & L. Strömwall (Eds.),
The Detection of Deception in Forensic Contexts (pp. 287-314). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511490071.013
3 DePaulo, B., & Morris, W. (2004). Discerning lies from truths: Behavioural cues
to deception and the indirect pathway of intuition. In P. Granhag & L. Strömwall
(Eds.), The Detection of Deception in Forensic Contexts (pp. 15-40). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511490071.002
4 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
5 Vrij, A. (2004). Guidelines to catch a liar. In P. Granhag & L. Strömwall (Eds.),
The Detection of Deception in Forensic Contexts (pp. 287-314). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511490071.013
6 DePaulo, B., & Morris, W. (2004). Discerning lies from truths: Behavioural cues
to deception and the indirect pathway of intuition. In P. Granhag & L. Strömwall
(Eds.), The Detection of Deception in Forensic Contexts (pp. 15-40). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511490071.002
7 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
8 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
9 Arciuli, J., Mallard, D., & Villar, G. (2010). “Um, I can tell you're lying”:
Linguistic markers of deception versus truth-telling in speech. Applied
Psycholinguistics, 31(3), pp397-411. doi:10.1017/S0142716410000044
10 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
11 Rockwell, P., Buller, D., & Burgoon, J. (1997). Measurement of deceptive voices:
Comparing acoustic and perceptual data. Applied Psycholinguistics, 18(4), 471-484.
doi:10.1017/S0142716400010948
12 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
13 Rockwell, P., Buller, D., & Burgoon, J. (1997). Measurement of deceptive voices:
Comparing acoustic and perceptual data. Applied Psycholinguistics, 18(4), 471-484.
doi:10.1017/S0142716400010948
14 DePaulo, B., & Morris, W. (2004). Discerning lies from truths: Behavioural
cues to deception and the indirect pathway of intuition. In P. Granhag & L.
Strömwall (Eds.), The Detection of Deception in Forensic Contexts (pp. 15-40).
Cambridge: Cambridge University Press. doi:10.1017/CBO9780511490071.002
15 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
16 Vrij, A. (2004). Guidelines to catch a liar. In P. Granhag & L. Strömwall (Eds.),
The Detection of Deception in Forensic Contexts (pp. 287-314). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511490071.013
17 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
18 Bull, R. (2004). Training to detect deception from behavioural cues: Attempts
and problems. In P. Granhag & L. Strömwall (Eds.), The Detection of Deception in
Forensic Contexts (pp. 251-268). Cambridge: Cambridge University Press.
doi:10.1017/CBO9780511490071.011
19 Knapp, M. (2006). Lying and Deception in Close Relationships. In A. Vangelisti
& D. Perlman (Eds.), The Cambridge Handbook of Personal Relationships (pp. 517-
532). Cambridge: Cambridge University Press.
doi:10.1017/CBO9780511606632.029
20 Knapp, M. (2006). Lying and Deception in Close Relationships. In A. Vangelisti
& D. Perlman (Eds.), The Cambridge Handbook of Personal Relationships (pp. 517-
532). Cambridge: Cambridge University Press.
doi:10.1017/CBO9780511606632.029
21 DePaulo, B., & Morris, W. (2004). Discerning lies from truths: Behavioural
cues to deception and the indirect pathway of intuition. In P. Granhag & L.
Strömwall (Eds.), The Detection of Deception in Forensic Contexts (pp. 15-40).
Cambridge: Cambridge University Press. doi:10.1017/CBO9780511490071.002
22 Levine, T.R. (2014). Truth-default Theory (TDT): A Theory of Human
Deception and Deception Detection. Journal of Language and Social Psychology, 33,
pp378-92. doi: 10.1177/0261927X14535916
23 DePaulo, B., & Morris, W. (2004). Discerning lies from truths: Behavioural
cues to deception and the indirect pathway of intuition. In P. Granhag & L.
Strömwall (Eds.), The Detection of Deception in Forensic Contexts (pp. 15-40).
Cambridge: Cambridge University Press. doi:10.1017/CBO9780511490071.002
24 Vrij, A. (2004). Guidelines to catch a liar. In P. Granhag & L. Strömwall (Eds.),
The Detection of Deception in Forensic Contexts (pp. 287-314). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511490071.013
25 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
26 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
27 DePaulo, B., & Morris, W. (2004). Discerning lies from truths: Behavioural
cues to deception and the indirect pathway of intuition. In P. Granhag & L.
Strömwall (Eds.), The Detection of Deception in Forensic Contexts (pp. 15-40).
Cambridge: Cambridge University Press. doi:10.1017/CBO9780511490071.002
28 Levine, T.R. (2014). Truth-default Theory (TDT): A Theory of Human
Deception and Deception Detection. Journal of Language and Social Psychology, 33,
pp378-92. doi: 10.1177/0261927X14535916
29 Bull, R. (2004). Training to detect deception from behavioural cues: Attempts
and problems. In P. Granhag & L. Strömwall (Eds.), The Detection of Deception in
Forensic Contexts (pp. 251-268). Cambridge: Cambridge University Press.
doi:10.1017/CBO9780511490071.011
30 Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012).
Misinformation and Its Correction: Continued Influence and Successful
Debiasing. Psychological Science in the Public Interest, 13(3), 106–131.
doi:10.1177/1529100612451018
31 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
32 Levine, T.R. (2014). Truth-default Theory (TDT): A Theory of Human
Deception and Deception Detection. Journal of Language and Social Psychology, 33,
pp378-92. doi: 10.1177/0261927X14535916
33 Levine, T.R. (2014). Truth-default Theory (TDT): A Theory of Human
Deception and Deception Detection. Journal of Language and Social Psychology, 33,
pp378-92. doi: 10.1177/0261927X14535916
34 Serota, K.B., Levine, T. & Boster, F.J. (2010). The Prevalence of Lying in
America: Three Studies of Self-Reported Lies. Human Communication Research, 36,
pp2-25
35 Levine, T.R. (2015). New and Improved Accuracy Findings in Deception
Detection Research. Current Opinion in Psychology, 6, pp1-5 doi:
10.1016/j.copsyc.2015.03.003.
36 Vrij, A. (2004). Guidelines to catch a liar. In P. Granhag & L. Strömwall (Eds.),
The Detection of Deception in Forensic Contexts (pp. 287-314). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511490071.013
37 Clough, J. (2010). Fraud. In Principles of Cybercrime (pp. 183-220). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511845123.008
38 Hancock, P. (2015). The Psychology of Deception. In Hoax Springs Eternal: The
Psychology of Cognitive Deception (pp. 61-71). Cambridge: Cambridge University
Press.
39 Levine, T.R. (2014). Truth-default Theory (TDT): A Theory of Human
Deception and Deception Detection. Journal of Language and Social Psychology, 33,
pp378-92. doi: 10.1177/0261927X14535916
40 Levine, T.R. (2015). New and Improved Accuracy Findings in Deception
Detection Research. Current Opinion in Psychology, 6, pp1-5 doi:
10.1016/j.copsyc.2015.03.003.
41 Hancock, P. (2015). The Psychology of Deception. In Hoax Springs Eternal: The
Psychology of Cognitive Deception (pp. 61-71). Cambridge: Cambridge University
Press.
42 Federal Trade Commission (2020). How To Avoid A Scam. https://www.
consumer.ftc.gov/articles/how-avoid-scam [Accessed 7 February 2021]
43 Citizens Advice (2019) Check If Something Might Be A Scam. https://www.
citizensadvice.org.uk/consumer/scams/check-if-something-might-be-a-
scam/ [Accessed 7 February 2021]
44 NSW Government. Misleading Representations And Deceptive Conduct. https://
www.fairtrading.nsw.gov.au/buying-products-and-services/advertising-and-
pricing/misleading-or-deceptive-conduct [Accessed 13 February 2021]
6. Pseudoscience Versus Science
1 Evon, D. (2015) Natural repellent for spiders? Snopes.com. Available at
https://www.snopes.com/fact-check/walnut-and-spiders/ [Accessed
6 February 2021]
2 Harrington, M. (2020). The Varieties of Scientific Experience. In The Design of
Experiments in Neuroscience (pp. 1-12). Cambridge: Cambridge University Press.
doi:10.1017/9781108592468.002
3 Gauch, H. (2012). Scientific Method in Brief. Cambridge: Cambridge University
Press. doi:10.1017/CBO9781139095082
4 Gauch, H. (2012). Scientific Method in Brief. Cambridge: Cambridge University
Press. doi:10.1017/CBO9781139095082
5 Harrington, M. (2020). The Varieties of Scientific Experience. In The Design of
Experiments in Neuroscience (pp. 1-12). Cambridge: Cambridge University Press.
doi:10.1017/9781108592468.002
6 Gauch, H. (2012). Scientific Method in Brief. Cambridge: Cambridge University
Press. doi:10.1017/CBO9781139095082
7 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
8 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
9 Bensley, D. (2020). Critical Thinking and the Rejection of Unsubstantiated Claims.
In R. Sternberg & D. Halpern (Eds.), Critical Thinking in Psychology (pp. 68-102).
Cambridge: Cambridge University Press. doi:10.1017/9781108684354.005
10 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
11 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
12 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
13 Percy, J., & Pasachoff, J. (2005). Astronomical pseudosciences in North America. In
J. Pasachoff & J. Percy (Eds.), Teaching and Learning Astronomy: Effective
Strategies for Educators Worldwide (pp. 172-176). Cambridge: Cambridge
University Press. doi:10.1017/CBO9780511614880.026
14 Harrington, M. (2020). The Varieties of Scientific Experience. In The Design of
Experiments in Neuroscience (pp. 1-12). Cambridge: Cambridge University Press.
doi:10.1017/9781108592468.002; Gauch, H. (2012). Scientific Method in Brief.
Cambridge: Cambridge University Press. doi:10.1017/CBO9781139095082
15 Kuhn, T. S. (1970). The structure of scientific revolutions (2nd Ed.). University of
Chicago Press: Chicago.
16 Harrington, M. (2020). The Varieties of Scientific Experience. In The Design of
Experiments in Neuroscience (pp. 1-12). Cambridge: Cambridge University Press.
doi:10.1017/9781108592468.002; Gauch, H. (2012). Scientific Method in Brief.
Cambridge: Cambridge University Press. doi:10.1017/CBO9781139095082
17 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
18 Percy, J., & Pasachoff, J. (2005). Astronomical pseudosciences in North America. In
J. Pasachoff & J. Percy (Eds.), Teaching and Learning Astronomy: Effective
Strategies for Educators Worldwide (pp. 172-176). Cambridge: Cambridge
University Press. doi:10.1017/CBO9780511614880.026
19 Bensley, D. (2020). Critical Thinking and the Rejection of Unsubstantiated Claims.
In R. Sternberg & D. Halpern (Eds.), Critical Thinking in Psychology (pp. 68-102).
Cambridge: Cambridge University Press. doi:10.1017/9781108684354.005
20 Narlikar, J. (2005). Astronomy, Pseudoscience, and Rational Thinking.
Highlights of Astronomy, 13, 1052-1054. doi:10.1017/S1539299600018116
21 Narlikar, J. (2005). Astronomy, Pseudoscience, and Rational Thinking.
Highlights of Astronomy, 13, 1052-1054. doi:10.1017/S1539299600018116
22 Landrum, A.R. & Olshansky, A. (2019) The role of conspiracy mentality in
denial of science and susceptibility to viral deception about science. Politics and
the Life Sciences, 38(2), pp193-209
23 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
24 Bensley, D. (2020). Critical Thinking and the Rejection of Unsubstantiated Claims.
In R. Sternberg & D. Halpern (Eds.), Critical Thinking in Psychology (pp. 68-102).
Cambridge: Cambridge University Press. doi:10.1017/9781108684354.005
25 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
26 Lakatos, I. (1978). Introduction: Science and pseudoscience. In J. Worrall & G.
Currie (Eds.), The Methodology of Scientific Research Programmes: Philosophical
Papers (pp. 1-7). Cambridge: Cambridge University Press.
doi:10.1017/CBO9780511621123.002
27 Lakatos, I. (1978). Introduction: Science and pseudoscience. In J. Worrall & G.
Currie (Eds.), The Methodology of Scientific Research Programmes: Philosophical
Papers (pp. 1-7). Cambridge: Cambridge University Press.
doi:10.1017/CBO9780511621123.002
28 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
29 Harrington, M. (2020). The Varieties of Scientific Experience. In The Design of
Experiments in Neuroscience (pp. 1-12). Cambridge: Cambridge University Press.
doi:10.1017/9781108592468.002
30 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
31 Kuhn, T. S. (1970). The structure of scientific revolutions (2nd Ed.). University of
Chicago Press: Chicago.
32 Harrington, M. (2020). The Varieties of Scientific Experience. In The Design of
Experiments in Neuroscience (pp. 1-12). Cambridge: Cambridge University Press.
doi:10.1017/9781108592468.002
33 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
34 Harrington, M. (2020). The Varieties of Scientific Experience. In The Design of
Experiments in Neuroscience (pp. 1-12). Cambridge: Cambridge University Press.
doi:10.1017/9781108592468.002; Gauch, H. (2012). Scientific Method in Brief.
Cambridge: Cambridge University Press. doi:10.1017/CBO9781139095082
35 Bensley, D. (2020). Critical Thinking and the Rejection of Unsubstantiated Claims.
In R. Sternberg & D. Halpern (Eds.), Critical Thinking in Psychology (pp. 68-102).
Cambridge: Cambridge University Press. doi:10.1017/9781108684354.005
36 Bridgstock, M. (2009). Modern skepticism. In Beyond Belief: Skepticism,
Science and the Paranormal (pp. 86-110). Cambridge: Cambridge University
Press. doi:10.1017/CBO9780511691676.006
37 Bridgstock, M. (2009). Modern skepticism. In Beyond Belief: Skepticism,
Science and the Paranormal (pp. 86-110). Cambridge: Cambridge University
Press. doi:10.1017/CBO9780511691676.006
38 Sagan, C. (1997). The Demon-Haunted World. London: Headline.
1. The Crucial Role Of Critical Thinking
1 https://www.dictionary.com/browse/critical-thinking Dictionary.com. Accessed
on 20 April 2021.
2 The APA Delphi Report, Critical Thinking: A Statement of Expert Consensus for
Purposes of Educational Assessment and Instruction (1990). ERIC Doc. No.: ED 315
423, as cited by Facione, P.A., in “Critical Thinking: What It Is and Why It
Counts”, p23.
3 Adapted from: Definitions of Logic, retrieved on 29 April 2021. https://
examples.yourdictionary.com/examples-of-logic.html
4 https://idioms.thefreedictionary.com/the+received+wisdom
5 https://forum.wordreference.com/threads/received-wisdom.2903508/
6 Galbraith, J. K. (1958) The Affluent Society, Houghton Mifflin
7 https://www.collinsdictionary.com/dictionary/english/wisdom
2. The Socratic Method Of Thinking
1 http://www.forbes.com/quotes/9496/ http://www.goodreads.com/quotes/
6885-judge-a-man-by-his-questions-rather-than-by-his http://www.
brainyquote.com/quotes/quotes/v/voltaire100338.html
2 https://en.wikiquote.org/wiki/Voltaire#Misattributed
3 Gaston de Lévis, P. M. (1808), Maximes et réflexions sur differents sujets de
morale et de politique, Volume 1, p5, Charles Gosselin
4 https://www.biography.com/scholar/socrates Accessed on 9 May 2021
5 https://www.military-history.org/feature/thinkers-at-war-socrates.htm
Accessed on 7 May 2021
6 https://www.military-history.org/feature/thinkers-at-war-socrates.htm
Accessed on 7 May 2021
7 https://www.biography.com/scholar/socrates Accessed on 9 May 2021
8 Nails, Debra, "Socrates", The Stanford Encyclopedia of Philosophy (Spring
2020 Edition), Edward N. Zalta (ed.), URL = https://plato.stanford.edu/archives/
spr2020/entries/socrates/
9 Socrates: His Life and Times
10 Kreeft P, Dougherty T, (2010) Socratic Logic, Edition 3.1 (St. Augustine’s
Press) p 123
11 ibid. p 123
12 ibid. p 125
13 ibid. p 124
14 ibid.
15 Eldred, K. (2013) 1.2 Arguments - Types of Reasoning, Pima Community
College. Accessed at https://courses.lumenlearning.com/atd-pima-philosophy/
chapter/1-2-arguments-types-of-reasoning/ on 25 May 2021
16 Rothchild, I. (2006) Induction, Deduction, and the Scientific Method. The
Society for the Study of Reproduction, Inc.
17 Richard Muller, Professor of Physics at UC Berkeley, author of Now: The
Physics of Time, quoted on https://www.forbes.com/sites/quora/2017/01/05/the-
hardest-and-most-important-part-of-the-scientific-method-staying-
objective/ accessed 21 May 2021
18 Paul, R. and Elder, L. (2010). The Miniature Guide to Critical Thinking
Concepts and Tools. Dillon Beach: Foundation for Critical Thinking Press.
19 ibid.
20 Kreeft P, Dougherty T, (2010) Socratic Logic, Edition 3.1 (St. Augustine’s
Press) p 144.
3. Traits Of A Socratic Mind
1 https://en.wikipedia.org/wiki/Cardinal_virtues accessed 6 June, 2021.
2 Paul, R. and Elder, L. (2010). The Miniature Guide to Critical Thinking Concepts
and Tools. Dillon Beach: Foundation for Critical Thinking Press.
3 Resnick, B. (2019) Intellectual humility: the importance of knowing you might
be wrong. https://www.vox.com/science-and-health/2019/1/4/17989224/
intellectual-humility-explained-psychology-replication accessed 30 May 2021.
4 Steve Wozniak interview with Mark Milian (Dec 8, 2010). http://edition.cnn.
com/2010/TECH/innovation/12/08/steve.wozniak.computers/index.html accessed
8 June 2021
5 Collins Concise English Dictionary, 8th edition, (2012). HarperCollins
Publishers. p 105
6 Dr. Chuba Okadigbo, as reported in the Nigerian Daily Post (2017) https://
dailypost.ng/2017/10/23/fani-kayode-urges-buhari-take-okadigbos-advice/
accessed 10 June 2021
7 Paul, R. and Elder, L. (2010). The Miniature Guide to Critical Thinking Concepts
and Tools. Dillon Beach: Foundation for Critical Thinking Press.
8 Ibid.
9 Ibid.
10 Facione, P. A., “Cultivating A Positive Critical Thinking Mindset,” © 2016
Measured Reasons LLC. p 7 Based in part on material from chapter 2 of Think
Critically, Facione and Gittens, 2016, Pearson Education.
11 George Carlin quotes. https://www.goodreads.com/quotes/679083-forget-
the-politicians-the-politicians-are-put-there-to-give accessed 6 June 2021
4. Questioning: The Heart Of The Socratic Method
1 Toffler, Alvin. “Future Shock,” (1970). Random House. p 211
2 Nascimento, G. https://www.dailymaverick.co.za/opinionista/2021-06-14-
confessions-of-a-white-south-african-on-youth-day-in-2021/ accessed 14
June 2021
3 Kreeft P, Dougherty T, (2010) Socratic Logic, Edition 3.1 (St. Augustine’s Press)
p 124.
4 Westacott, Emrys. (2020). Summary and Analysis of Plato's 'Euthyphro'.
Retrieved from https://www.thoughtco.com/platos-euthyphro-2670341 on 16
June 2021
5 Watch this Elon Musk interview from the 19:45 mark https://www.youtube.
com/watch?v=lS3nIyetS4I&t=1185s accessed on 6 July 2021
6 Steve Jobs takes a full 18 seconds before responding https://www.youtube.com/
watch?v=FF-tKLISfPE&t=78s accessed on 6 July 2021
7 https://www.inc.com/justin-bariso/why-intelligent-minds-like-elon-musk-
steve-jobs-embrace-rule-of-awkward-silence.html accessed on 6 July 2021
8 Paul, R. and Elder, L. (2007). The Thinker’s Guide to The Art of Socratic
Questioning. Foundation for Critical Thinking Press.
9 Daniel J. Levitin As quoted by Berger, W (2018) The book of beautiful questions:
the powerful questions that will help you decide, create, connect, and lead.
Bloomsbury Publishing, New York
10 Berger, W (2018) The book of beautiful questions: the powerful questions that
will help you decide, create, connect, and lead. Bloomsbury Publishing, New York
11 Carl Sagan’s last interview in 1996 on Charlie Rose. Available on YouTube:
www.youtube.com/watch?v=U8HEwO-2L4w. accessed on 6 July 2021
12 Combating the scourge of disinformation https://www.dailymaverick.co.za/
article/2021-06-25-influence-in-africa-combating-the-scourge-of-
disinformation/ accessed on 4 July 2021
5. The Skillful Art Of Asking The Right Questions
1 https://www.telegraph.co.uk/news/uknews/9959026/Mothers-asked-nearly-
300-questions-a-day-study-finds.html accessed on 10 July 2021
2 Berger, W (2018) The book of beautiful questions. Bloomsbury Publishing, New
York.
3 Harris, P. (2012) Trusting What You’re Told: How Children Learn from Others.
Harvard University Press, Cambridge, MA.
4 Mead, M. (1928, 1961), Coming of Age in Samoa. William Morrow and Co. p246.
5 https://hsm.stackexchange.com/questions/3692/is-the-questions-that-can-
t-be-answered-over-answers-that-can-t-be-question and https://en.
wikiquote.org/wiki/Talk:Richard_Feynman#Not_a_quote both accessed 16 July
2021
6 David, J. (2018). How the American Education System Suppresses Critical
Thinking retrieved from https://observer.com/2018/01/american-education-
system-suppresses-critical-thinking/ accessed 18 July 2021.
7 Paul, R. W., Martin, D, and Adamson K, (1989). Critical Thinking Handbook: High
School, A Guide for Redesigning Instruction. Foundation for Critical Thinking.
retrieved from http://web.sonoma.edu/users/s/swijtink/teaching/
philosophy_101/role.htm accessed on 24 April 2021
8 Erasmus-Kritzinger, E., Bowler, A. & Goliath, D. (2009). Effective Communication,
Van Schaik Publishers, pp21-23
9 Paul, R. and Elder, L. (2007). The Thinker’s Guide to The Art of Socratic
Questioning. Foundation for Critical Thinking Press.
10 Rowlands, M. (2008). The Philosopher and the Wolf, Granta Publications,
London. p148
11 Resolution 65/309 Happiness: Towards a Holistic Approach to Development
(2011) https://digitallibrary.un.org/record/715187?ln=en#record-files-collapse-
header accessed on 24 July 2021
12 Stearns, P. N. (2012) The History of Happiness. https://hbr.org/2012/01/the-
history-of-happiness accessed on 25 July 2021
13 As quoted in: Lexical Investigations: Happiness. retrieved from https://www.
dictionary.com/e/happiness/ accessed on 26 July 2021.
14 Cilliers J. (2021) South Africa’s security sector is in crisis https://www.
dailymaverick.co.za/article/2021-07-21-south-africas-security-sector-is-in-
crisis-immediate-reform-is-needed-to-ensure-national-stability/
6. Getting It Right: Points To Remember And Apply
1 Vonnegut, K. (1973) Breakfast of Champions. Delacorte Press, New York
2 Collins Concise English Dictionary eighth edition, (2012) HarperCollins
Publishers
3 Ibid.
4 Critical Thinking In a Nutshell (2021). Thinknetic. pp. 29–34
5 Paul, R. and Elder, L. (2010) The Miniature Guide to Critical Thinking Concepts
and Tools. Dillon Beach: Foundation for Critical Thinking Press.
6 Ibid.
7 Ibid.
Afterword
1 Rowlands, M. (2008). The Philosopher and the Wolf, Granta Publications, London.
p. 98
THE SOCRATIC WAY OF
QUESTIONING
HOW TO USE SOCRATES’ METHOD TO UNCOVER THE
TRUTH AND ARGUE WISELY
INTRODUCTION
Gustave Flaubert wrote, “Let us think of nothing, neither of
the future nor of ourselves, for to think is to suffer.”
Flaubert is right, but only up to a point, and only from a
particular point of view. First, we need to see what he
was going through that caused him suffering and pain when he
thought. The quote is from a letter, written at midnight on
August 4, 1846, to his mistress, Louise Colet. “Twelve hours
ago we were still together, and at this very moment
yesterday I was holding you in my arms!” he writes. In this
context, the source of his pain is obvious. He is in love and
the suffering stems from his thoughts of her.
However, most of us do not see thinking as a cause of
suffering. Usually, this is because many people consider
thinking to be something that happens when we are not busy
and our thoughts flit, like a bee, without pausing or
considering deeply. There is no pain here, unless your
thoughts touch on a painful memory. But this isn’t really
thinking. It is daydreaming, simply allowing your mind to
wander.
When you need to apply your mind to a specific problem, or
to a major decision that may affect your future or your
family's, thinking can be painful–particularly to a
mind not used to concentrated thought. And it’s unlikely
that you will go through life without being presented with
problems and dilemmas that trouble you. For example:
You need to decide whether to accept the new job you
have been offered in a different city.
You need to plan for uncertainty because of climate
change.
You need to factor in changes necessary because of
the current pandemic.
These are important decisions in your life, but you put off
making them because you fear making a decision that you
will later regret. You believe in your own abilities, and know
you are usually right. After all, you have more experience and
are more knowledgeable than most people you know.
But you have deep concerns about the future because of the
uncertainty around climate change and the pandemic. You
worry about how the ‘new normal’ might affect your life.
How will it be for you financially and health-wise?
At the same time, all the news about disinformation has you
distrusting what you read and hear. Yes, there are sources you
still consider authoritative and believe, but the
doubts creep in and unsettle you.
However, you can still achieve your goals and realize your
dreams despite what the socio-economic outlook might
bring. The book you have in your hands will help you resolve
all your dilemmas and uncertainty.
Most of you have probably come across the concept of
critical thinking. A good proportion of you will have heard
about the Socratic Method. But what do these two have in
common and how do they relate to each other? Take a short
journey into the mindset of Socrates and critical thinking
through the pages of this book and discover exactly what it
involves.
In it, you will find techniques to help you think rationally,
communicate with reason, and ask only meaningful
questions to get the answers that you want. You will learn
how to ‘get to the truth’ of matters that concern you,
without being influenced by what others think or say. As a
result, you will be equipped to reach considered decisions
and arrive at sound conclusions in all aspects of your life.
I am a published author and professional writer, with an
award-winning short story and a BBC radio play under my
belt. A solid background in systems analysis, design and
development taught me the power of deep analysis and to
look beyond the obvious and expect the unexpected. This is
my tenth book, and in a way, a lot of my previous writing,
such as my second book, Systems Analysis and Design, has led
to my writing a book on the Socratic method and critical
thinking.
My childhood in various wayside stations in the African
bush, and my school years in boarding schools in what was
Southern Rhodesia (now Zimbabwe) set the tone for my
search for answers that neither my parents nor my teachers
could provide.
Never mind trying to find the meaning of life; I wanted to
know what a communist was and why the deputy head of
the school had called me a communist organizer. To put it in
context, it was 1965–the year the government of Rhodesia
unilaterally declared independence from Britain. I was 14
years old at the time and had been partly responsible for a
“food riot” at school, a protest over the quality of hostel
food. Not much made sense to me in those days. This was my
sixth year in boarding school and I felt like an unwanted
child, an orphan.
This is no reflection on my parents, who were loving,
upright, and well-meaning people. But their world was not
my world. It was the decade of anti-Vietnam war protests, of
rock ‘n’ roll, and teenagers around the world were
questioning the way things were done and why they had to
be this way.
This was an unknown area for me, so I pushed boundaries,
got into trouble for it, and have never stopped questioning
since then. When I received a request to write this book, it
was as if all those years of being an outsider suddenly had a
point of focus. All my travels around the world, all my
experiences bumping into walls and falling down rabbit
holes were prompting me to write this. In some ways, it has
answered questions I haven’t asked yet. I hope you get as
much out of reading it as I got out of writing it.
1
THE CRUCIAL ROLE OF CRITICAL THINKING
D ave was 27 when he moved to a small city in the
Eastern Cape province of South Africa. He had enrolled
at the local university and was looking forward to
student life, leaving the stresses of alternating between
being a soldier and a farmer in what had just become
Zimbabwe. One Saturday in January, he rode his motorcycle
to an out-of-town hotel where a friend worked as a barman.
Dave declined Brad’s invitation to sleep over. On his way
home, as he crested a hill on a curved stretch of road, he saw
a car on his side of the road coming straight at him.
Larry's sister got married on the same Saturday in January.
As Larry left the wedding reception to drive home, friends
and family tried to dissuade him, knowing how much he had
drunk. Invitations to stay over with them for the night and
leave the next day came from all sides. Larry laughed, said he
was fine to drive, that he knew the road and had travelled from
his smallholding to town hundreds of times. Waving off their
protests, he got into his car, and in a few minutes, he was
beyond the city limits.
Larry is not sure what happened next. He remembers seeing
a single headlamp coming over the hill directly in front of
him.
What did Larry think when he got into his car? What were his
friends thinking when he left, knowing he was in no state to
drive? If they were thinking, they were not thinking clearly.
Dave died that night. What did he think when he declined
Brad’s suggestion that he sleep over? He lived alone and had
no pets needing his attention. If he was thinking at all, it was
muddled thinking.
I am fairly certain that none of the role players that fateful
night knew anything about critical thinking. And the results
were deadly.
Of course, not every event where you fail to think critically
will end in tragedy. So, what exactly is critical thinking? And
will it make a difference in your life? To answer the second
question first, critical thinking will undoubtedly make a
positive difference to you and your worldview.
As for what critical thinking is, it's a way of thinking about
the big and small questions you will face for as long as
you're alive – a way of examining events and ideas instead of
simply accepting them at face value. One definition that is
all over the internet is that
critical thinking is “disciplined thinking that is clear,
rational, open-minded, and informed by evidence.” 1
While the above definition is concise, it is also limited. It
doesn’t mention the important aspect of context or the
methods and standards you should apply to the process. The
definition below, from Peter Facione and the Delphi Project,
is more precise. “Purposeful, reflective judgment which
manifests itself in reasoned consideration of the evidence,
context, methods, standards, and conceptualization in
deciding what to believe or what to do.” 2
We will open a window onto the elements of critical thinking
now, and we will also begin to examine the traits, the
mindset, and the standards that you should apply in your
considerations and deliberations.
The Elements Of Critical Thinking
The ability to think clearly and logically is at the heart of
critical thinking. Fortunately, clear and logical thinking are
skills that you can learn.
Critical thinking is far more than a skill. It is a habit and is
also called a mindset or a habit of mind. In addition, it is a
combination of traits and attitudes. To examine the traits of
critical thinkers, the list prepared by Dr. Peter Facione for
the executive summary of the Delphi Report is a good place
to start.
Table 1: Traits of Critical Thinkers
Critical thinking goes beyond the traits listed in Table 1. But
before expanding on the traits and attitudes of critical
thinkers, think about the number of people you know who
think critically. Now think of the number of people you know
who might think deeply, even philosophically, but are not
critical thinkers. Compare the numbers in those two groups
to the total number of people you know. Now ask yourself if
you could safely, and truthfully, say, “Many people I know
don’t think critically.” Or “most people I know don’t think
critically.”
It is certainly the case with many people I know. Instead of
thinking critically and questioning the status quo, they will
excuse the situation by saying things like, “It’s the norm,”
or “It’s par for the course” (and please excuse the clichés,
but it’s how they talk). These are people who are capable of
critical thinking, both educationally and intellectually, but
have settled into a comfort zone or become complacent
about the world around them.
Let’s examine those two platitudes and ask: what if nobody
had ever questioned the norm? Would we have the science
and technology servicing the world that we have today?
Would we have the extraordinary artworks of van Gogh, or
Picasso, or Dali? Would we have the superlative literary
works available in libraries around the world?
Then, some people say, “It’s always been like this,” and
“what difference can one person make?” Here, we could
question what the adverb ‘always’ means in this context.
Does it mean ever since time began, or is it more limited,
such as in your father’s memory? As for questioning what
difference one person can make, consider this story.
A woman who lives in a coastal city goes for long walks on
the beach every day. Each time she walks, she always comes
back with one piece of rubbish she did not take to the beach,
such as a plastic bottle or a crisps wrapper. Friends point out
to her the volume of litter on the beach and ask, “Why
bother? What difference does it make?”
And she would reply, “Because next time I come here, I will
know there is one piece of rubbish less than there would
have been if I’d done nothing.”
To take this story one step further, what if all her friends
started doing the same thing, and if the idea caught on and
became a habit of more and more people? Clean beaches,
anyone?
Consider also that Socrates was one person. If he had not
begun questioning, you would not be reading this book. We
might still be living in a city like ancient Athens.
You can make a difference, even if you act on your own.
Besides, you never know who it might encourage to examine
the roadblocks and handbrakes in their thinking.
Critical thinking does not start with analyzing and
questioning the major and immediate changes you need to
make to the world you live in. It starts with you making
small changes in your life – and the first change is to begin
thinking critically. Perhaps you can start by interrogating
any superstitions you may have learned at the feet of your
grandparents or in the lap of your mother.
For example, if you spill salt at the table and throw a pinch
over your shoulder, why do you do that? What will happen if
you don’t throw a pinch over your shoulder? Answering that
something terrible will happen is not precise enough.
Establish exactly what this terrible thing will be. And ask
yourself why you still follow this obscure practice.
Is Friday the 13th an unlucky day in your family or your
beliefs? If so, why is that?
Even if walking under a ladder is not considered unlucky, it
still makes sense not to do so. If anyone is working up top,
there is always the possibility something could fall on you.
But even if nobody is up the ladder, it still makes sense not
to walk under it. If, for example, someone walking around
the outside of the ladder stumbles against the leg or
dislodges the ladder, it could fall on you.
Examining your superstitions is a fairly easy way to begin
interrogating aspects of your life you have previously
accepted without question. If you would like something a bit
more challenging, examine your behavior, your moods, and
irritations.
Establish when, and under what circumstances, you are at
your worst. Next, analyze what events, words, or actions
change your mood and what pushes the buttons that trigger
a despicable you. Discover what brings out the worst in you,
and go on to question how it makes you feel.
Make notes as you go along to help clarify and review the
steps and choices you make. Now, knowing how you feel
when your worst self surfaces, ask what it was that brought
that aspect of your shadow self to the fore. Then question the
shadow as to why it is now front and center.
Skills You Need to Think Critically
Critical thinking requires more than understanding and
applying the elements of critical thinking listed earlier. You
will also need specific skills. Some of these you may already
have, and even if you don’t have all of these skills, they can
be learned. To think critically, you will need the skills listed
in Table 2. You will not always apply these skills in the order
they appear, but you need to know them and understand how
they are applied.
Table 2: Skills for critical thinking
Have you ever wondered about how our mind works? For
example, can you arrive at a judgment without first
understanding the situation? Of course, some people will
jump to conclusions without understanding the situation and
without even the most elementary reasoning. To understand
the logical connection between ideas, you need to be rational
and think clearly.
First, you would need to assess if there is any logical
connection between the ideas. Just because someone says
there is a connection, it doesn’t mean it is strong or valid.
For example, Alan walked under a ladder on his way to work
this morning. At lunchtime, Alan’s boss fired him. As a
result, Alan now believes walking under ladders is unlucky
because he got fired the day he walked under a ladder. There
is no valid connection between being fired and walking
under a ladder, apart from both happening on the same day.
However, it is common to hear people link two separate
events and conclude that the first event caused the second.
People find it easy to understand simple cause-and-effect
stories such as this. Does this mean we naturally look
for easy ways to explain the random events we stumble
across regularly? Are we on a constant search for meaning?
If that is the case, we are far better off when we think
critically. As critical thinkers, we train our minds to question
a bit more deeply and are therefore not fooled easily. We
examine events, statements, and outcomes to test their
validity. We analyze and evaluate.
Before you can reach a reasoned decision, one that is based
on evidence and supported by verifiable data, you have to use
the skills for critical thinking listed in Table 2. These skills
do not define your ability to reason, but they support,
deepen, and strengthen your reasoning.
Let’s look at an example of where applying the traits and
skills of critical thinking might make a difference.
Claire S. runs a school in the center of a small city. The
school is on the 4th and 5th floors of a building across from
the municipal building. It is within easy reach of public
transport, but it has no playground or sports facilities. An
opportunity arises to move the school to the grounds of an
old hotel in a rural area 10 miles from the city center. The
hotel buildings are in good condition and include a
swimming pool and spacious grounds for outdoor activities,
all overlooking a dam cradled in the arms of an ancient
mountain range.
The rent is lower out of town; the buildings are clustered,
making it easy to set up classrooms around an admin
building; and there is no high-rise building in sight. The village
surrounding the hotel is expanding quickly, as a large
poultry farm has set up operations a couple of miles away.
In the city, where the school is currently, there is easy access
to public transport and shops, with many parents wanting
education for their children. Parents can drop their children
at school and be at the office minutes later.
Now, let’s do some critical thinking about this issue for
Claire. And we will do it in an objective way, which means
doing it without bringing our biases and prejudices along.
First, we identify the arguments concerning the moving of
the school. The current school premises do not have playing
fields or sports facilities. Because the school is spread over
two floors of a multi-story building, there is constant noise
of children and staff clattering up and down the stairs between
admin offices, staff rooms, and classrooms. Rents are high in
the city center.
Next, we need to evaluate the two choices – stay in the city
or move to the country – and determine how valid or strong
the supporting arguments are. On the face of it, there are
strong arguments for moving the school. And, at first glance,
it would seem that there are no or few arguments against
moving the school.
We assess any weaknesses in assumptions or points where
assumptions are vague or where little evidence for the
argument exists. Let’s also examine what implications there
are for moving and for staying.
Moving will affect staff, pupils, and parents. Staying in the
city means some parents can drop their children at school
and be at work in five minutes. Moving means they need to
allow for a 20-mile round trip that will take at least 30
minutes in the best traffic conditions.
So Claire must expect to lose some pupils. The question is,
how many will leave? The next question is: how long will it
be before the school picks up students from the village, the
staff at local holiday hotels, and the surrounding farming
community? Claire has done no real research. She did not
send out any questionnaires, so we have no viable way to
assess this. We are in the realm of guesswork. Staff may also
leave because:
They now have to travel further to get to work and
back.
They may have children at another school and usually
go home at lunch to feed their children.
They may have a child at the same school, but they
would take the child to their grandmother, then
return to work in the city. They can’t do that if the
school moves.
If we examine the move as a business proposition, we have
to ignore the scenic views of the dam and the mountains.
They have no commercial or academic value in making the
decision. Of course, if you were doing this exercise for a spa
offering meditation and reflection, the natural setting would
carry more value than a city center setting.
At this point, as an entrepreneur or an investor, you would
want to see financial reports, but we do not cover that here.
You must decide based on the given case study. Claire can replace
staff who leave, although it will be disruptive to the
students, and if more than two teachers leave, she could
struggle for a couple of months.
Parents who still want to keep their children at the school
could start lift clubs. The lower rent will help cover gaps left
by students who leave, but the question keeps coming back
to how many students the school can afford to lose and how
quickly a gap can be filled.
Of course, to do any sort of effective analysis and evaluation
will require a basic understanding of logic.
The Logic Behind A Critical Mindset
To use our ability to reason effectively, we need to
understand the basic principles of logic. Logic is a process
for arriving at a conclusion based on given information. As
such, it’s a tool you should use in your life.
Logic is a fundamental part of philosophy, as well as being
an area of mathematics. There are essentially two main
branches in the philosophical world of logic–informal logic
and formal logic.
Critical thinking has been referred to as “informal” logic,
although the structure and discipline required for critical
thinking are anything but informal. More accurately,
informal logic is seen as the process used for daily reasoning
and casual debate. Formal logic, insofar as we are interested
in it, is called classical elementary logic or first-order logic.
Before we can move on, you need to understand a few terms.
The first is “argument.” Most people are familiar with its
everyday meaning, namely a heated quarrel or a
shouted dispute. In the logical sense, however, an argument is
one or more statements or propositions supported by one or
more premises and claims, leading to one conclusion, either
stated or inferred. Thus, we can describe the elements of
logic as follows: 3
Propositions – statements used as the foundation of
a logical argument. A proposition is either accurate
(true) or not accurate (false).
Premises – also called claims, these are used to build
the argument and provide support for it.
Conclusions – the result of the argument, drawn
from the premises or inferred from them.
Let’s look at an example. The argument is as follows:
Proposition: Every person who lives in Arkansas lives
in the USA.
Premises: Every person who lives in the USA lives in
North America.
Conclusion: Every person who lives in Arkansas lives
in North America.
Explanation: The conclusion must be true because the
premises are true and the conclusion follows logically
from them.
This uses a process called deductive reasoning. Other
arguments might use inductive reasoning. In some cases, the
argument could use both deductive and inductive reasoning.
We look at inductive reasoning later, but for now, here is
another example of deductive reasoning.
Proposition: All trees have leaves.
Premises: The trees in my neighborhood lose their
leaves in winter.
Conclusion: Trees lose their leaves in winter.
Explanation: Both the proposition and the premise
are true, but the conclusion is only true in certain
circumstances. If we infer that all trees lose their
leaves in winter, then the conclusion is false.
To correct the conclusion, we must add one word to the
beginning. Doing so also removes the need to infer anything.
The changed conclusion should say:
Deciduous trees lose their leaves in winter. The sentence is
more specific, and the conclusion is now true.
The discussion around the above argument is an example of
checking the proof of an argument. In effect, we tested the
logical accuracy of the premises and the conclusion. For an
argument to be sound, all the premises have to be true and
the conclusion has to follow logically from them. An argument
that does not meet these conditions is unsound or invalid.
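As a sketch of why the Arkansas deduction holds, the argument can be modeled as set containment. The names below are invented purely for illustration; the point is that if one set is contained in a second, and the second in a third, the first must be contained in the third, no matter what the sets hold.

```python
# A toy model of deductive validity as set containment.
# The people and sets are invented for illustration only.
arkansas = {"Ava", "Ben"}            # people living in Arkansas
usa = arkansas | {"Cho", "Dev"}      # premise: everyone in Arkansas lives in the USA
north_america = usa | {"Eli"}        # premise: everyone in the USA lives in North America

# If both premises hold, the conclusion cannot fail:
premises_hold = arkansas <= usa and usa <= north_america
conclusion_holds = arkansas <= north_america

print(premises_hold, conclusion_holds)  # True True
```

Try adding or removing names: as long as the two premise containments hold, the conclusion containment holds too, which is what makes the argument valid rather than merely plausible.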
You might think that common sense would lead you to the
same end as the above discussion. However, if we define
common sense as what you expect every person to have in a
given situation, you can see that logic is deeper and more
precise. For example, exactly what is meant by “every
person” in the above definition? Does it mean every person
on the planet, or does it mean only people with a specific
educational background? If you analyze this, you will see
that common sense is not so common.
When you analyze and evaluate an argument, you may be
suspicious of the evidence presented and want to question it.
You may also find inferences that you can draw based on the
premises. As soon as you start making inferences, or
questioning assumptions, check regularly whether your
prejudices or biases are affecting your objectivity. Be
aware that emotions will also play a role, so you need to put
emotional distance between yourself and the issue. All too
often, emotional or biased stances lead to faulty reasoning,
which can affect our judgment.
A big part of approaching statements and questions without
bias or emotion is keeping an open mind. An open mind is
particularly important when assessing the alternatives, as
you need to view the issue from different positions or points
of view. In a way, trying to see things from a new point of
view is like exploring boundaries. Imagine that you are part
of a small group of blindfolded people led into a darkened
room. The room contains an elephant. Each one of you is led
to a different part of the room and asked to describe what
you feel in front of you. Because you are each feeling a
different part of the elephant–this person feels the tusk, that
person feels a leg, another person feels the body–you all
describe the elephant differently.
It is only when you remove the blindfold that you see the
whole picture. Seeing things from an alternative position is
like removing the blindfold from the people in the room with
the elephant.
Asking questions, the heart of Socratic questioning, is not
you trying to be difficult–though there may well be people who
accuse you of that. Rather, it's you trying to deepen your
understanding of the situation. For example, when you learn
that other people have values that significantly differ from
yours, you may begin to question their beliefs. Depending on
how you go about questioning, their reactions will vary. At
the extremes, reactions will range from defensiveness,
mocking, and attempts to humiliate you, to anger. However,
your questions should also go beyond that. If it is part of
your search for truth and honesty in reasoning, you should
also be questioning your own beliefs, not only those of other
people.
To return to the story about Dave and Larry at the beginning
of this chapter, Dave was my brother. It took days for the
news of his death to reach me. I was a continent away,
camping on the south coast of Portugal.
At the time, I knew nothing of critical thinking. In all my
thinking about that event over the decades that followed, I
always thought that Dave was an innocent victim of Larry’s
irresponsible behavior. At least, that was my belief until I
examined the events with a critical mind.
Knowing my brother, he probably had a couple of drinks
while visiting his friend, the hotel bartender. Did that play a
role in how the night ended for those two young men?
A few years ago, my sister was in a queue for some official
business in which she had to give her maiden name. The
woman behind her in the queue said, “I know that name
from a long time ago. Did you have a brother who died in a
motorcycle accident outside Grahamstown?” It turns out she
was Larry’s sister, the one who got married the night Larry
and Dave met head-on. Apparently, Larry has never been
able to settle down, stay in any job for long, or maintain any
relationship since that night.
Hearing how Larry was struggling with his life brought home
to me, in a forcible way, how seemingly small decisions can
have a significant impact on your future. Of course, this is an
extreme example of the results of an unthinking or badly
informed decision. But, if you apply any critical thinking to
the event, you need to pay as much attention to analyzing
the possible outcomes as you pay to your analysis of the
probable outcomes.
This event, and my subsequent evaluation of it many years
later, also brought home how easy it is to make judgments
without applying critical thinking processes. For example,
because this accident involved my brother, for decades I
brought to my thinking all the biases and prejudices that
close family ties carry, and blamed Larry for the accident.
But the truth is, nobody will ever really know what happened
on that road that night. We only know the outcome, and
carry our wounds, and bear the scars.
Exercises And Tasks
Any exercises, tasks, or challenges are entirely optional. We
include them to help focus your attention on important
aspects of your journey through the Socratic thinking and
questioning processes.
Exercise on received wisdom
Read the following discussion on received wisdom, then
answer the questions below it.
The term “received wisdom” has been used for decades
without any real agreement about its meaning. One
definition is: “The usual way of doing things, the normal
procedures to follow.” But this is too limited and more
closely resembles definitions for common practice or the
status quo.
Another definition is: “common knowledge that is held to be
true, but may not be.” 4 This is closer to a fuller definition
because it expresses some doubt. Then there is this
definition, paraphrased from a discussion group:
Received wisdom is opposable to (but does not necessarily
oppose) knowledge and scientific reason. 5
In other words, it could be a block to personal growth, to
being objective, to an inquisitive mind. As things stand
today, received wisdom appears exempt from questioning.
But it shouldn’t be. It should be the starting point for any
argument you analyze.
Conventional wisdom is commonly used interchangeably with
received wisdom, but they are not the same–although they
both impose unhelpful restrictions and limitations on your
rights to ask questions. Conventional wisdom, first used by
John Kenneth Galbraith in The Affluent Society, 6 described a
set of beliefs and thought patterns that society finds
comfortable and acceptable. It is, in essence, a widely held
belief on which most people act. However, all too frequently,
the result of what is “comfortable and acceptable” for
society leads to opposition to new ideas or barriers and
hurdles to progress.
Given that “Wisdom is the ability to use your experience and
knowledge to make sensible decisions or judgments” 7, it’s
time to start questioning your received wisdom. You have
two tasks.
1. Think back to your childhood or your school days, and look
for received wisdom that you can question. The aim is not to
prove received wisdom wrong but rather to question the
validity of old beliefs and whether they still have value in the
light of the 21st century. These beliefs could come from a
parent, a grandparent, a teacher, a college professor, or a
mentor at work.
The received wisdom could be in the form of value systems
or beliefs, such as a handshake is all you need to seal a deal.
You pick the received wisdom for yourself. Ask whether you
still live by the handed-down values, or whether you have
adjusted how you see the world.
Earlier in this chapter, we looked at questioning
superstitions, and this exercise is part of a similar process. It
questions beliefs handed down through the generations.
2. Read the discussion on received wisdom again, except this
time, you are to examine it with a questioning mind. Apply
what you have learned about critical thinking to the
reasoning process, break it into bite-sized chunks, and
analyze each part of it. Question the meaning of each
definition and whether it adds value and weight to the
argument.
Chapter Summary
The elements of critical thinking, and how they are used,
consist of:
Being open-minded, in that you look for new or
different perspectives.
Analytical in your approach to questioning.
Systematic in building an argument or finding
weaknesses in an argument.
Inquisitive, if not by nature, then by training.
Judicious in your approach to evaluating the evidence
before you reach any conclusions.
Truth-seeking and ethical and ready to acknowledge
being wrong.
Confident in your reasoning and in examining
alternatives.
When you work with the elements of critical thinking, you
need to bring specific skills to bear on them: skills such as
observation, reflection, interpretation, problem solving,
analysis, and evaluation.
Understanding the logic that informs a critical mindset helps
you to interrogate arguments presented to you and to build
strong arguments of your own.
Critical thinking is vital to the Socratic method of
questioning and thinking. The systematic application of logic
supported by evaluation and analysis, the courage to
question deeply, confidence in your thinking, and the
humility to admit you were wrong, all embody the spirit of
Socratic dialogue. In the next chapter, we will cover:
Who Socrates was, and how he became known for his
method of thinking and questioning. We will work through
the methods developed to gain insight and find the truth.
By exploring the boundaries of an issue and questioning
assumptions, the Socratic ways led to the discovery of
universal definitions and inductive arguments. These have
come to be regarded as the essence of the scientific method
of inquiry.
You will learn to first understand an issue, then look for any
weaknesses in an argument. In the process, you learn to
reflect on alternatives and acknowledge limitations.
The Socratic method is effective and beneficial because:
It starts with an initial definition or opinion about a
subject.
Then it asks a question that raises an exception to
that definition or opinion.
The resulting dialogue yields a better definition or
alternate opinion.
2
THE SOCRATIC METHOD OF THINKING
“Judge a man by his questions, not his answers” is a
well-known quote attributed to Voltaire. Websites such as
Forbes.com, GoodReads.com, and BrainyQuote.com 1 all tell
you it was Voltaire. You’ll find the quote in various formats,
such as cards and posters, on Pinterest and Instagram, and
on hundreds of pages across the Internet. Except there is no
record that Voltaire ever wrote this.
If you dig deeper (by rephrasing the question, or asking
another question, or going beyond the first ten items
presented by a search engine), you’ll find that Wikiquote
says this quote is misattributed. 2 The original is from a book
of maxims by Pierre Marc Gaston de Lévis. In the original
French the quote is:
Il est encore plus facile de juger de l’esprit d’un homme
par ses questions que par ses réponses. 3
— PIERRE MARC GASTON DE LÉVIS
This translates as: “It is even easier to judge a man’s mind
by his questions than by his answers.”
This is just one example of how misleading the Internet can
be. But the relevance of the quote to the Socratic method of
questioning is undeniable, as is the result of looking a little
further than the stated “fact.”
“The unexamined life is not worth living” is one of the most
famous quotes by Socrates. However, the only source we
have for this is Plato’s Apology. All we can question here is
Plato’s memory of exactly what Socrates said. This is not to
say that we doubt Plato; the point is that there are no
boundaries to what you can or can’t question. No doubt, had
Socrates been in a position to speak for himself, he too
would have encouraged you to question.
What we know about Socrates' life comes from only a few
sources. These are:
The plays of Aristophanes, in which Socrates is
satirized and ridiculed.
The writing of Xenophon, a soldier, philosopher and
historian.
The dialogues of Plato, who is most sympathetic
towards Socrates.
It is unlikely that any of the versions present a complete or
even an accurate picture of Socrates. But all three agree on
two things: Socrates was physically unattractive, and he had
a brilliant mind. Collectively, the three accounts give us a
unique portrayal.
Socrates was also a man who ignored the standards of dress
and cleanliness that were common in Athens. He wore
shabby robes, grew his hair long, and walked around
barefoot in a society with refined standards of cleanliness
and comeliness. This may have added to the distaste
Athenian society came to feel for him.
Socrates (470-399 BCE) was a stonemason and soldier in
Athens before building his reputation as a philosopher. The
son of a stonemason and a midwife, he received a basic
education and learned his father's craft at a young age. As a
stonemason, Socrates must have had well-muscled arms and
torso, but he did not have the classic beauty seen in Grecian
statues or on Grecian urns.
By law, all able-bodied Athenian males were to “serve as
citizen soldiers, on-call for duty from ages 18 until 60.” 4
According to Cambridge philosopher Iain King, “Socrates
had two years’ military training in his early twenties.” 5 King
also writes that men who could afford the armor joined the
city’s army as hoplites, or armored infantry, which Socrates
did. Socrates had peacetime deployments to the Athenian
borders, but he was 37 before his first active service in a war.
His war record begins with a three-year campaign to subdue
Potidaea, a city-state attempting to break away from Athens.
On the way home, the victorious Athenian army was
ambushed, suffering severe losses. It was here that Socrates
saved the life of Alcibiades, a man who went on to become a
leading military general and politician. 6 Socrates' next
active service was five years later, at the Battle of Delium,
with his final military service at Amphipolis when he was
almost 48.
This is Socrates’ background: fighting for his country and
working as a stonemason.
It is difficult to say exactly when Socrates began the
questioning that led to his fame as a philosopher. According
to Plato, Socrates had a desire to learn; Plato records
Socrates’ eagerness to acquire the writings of Anaxagoras, a
leading contemporary philosopher. 7 We can assume
Socrates had an insatiable curiosity and that this, in
combination with his desire to learn, played a role in
developing his questioning methods.
Socrates was 45 years older than Plato and Xenophon, both
of whom were regarded as students of his, and they could
only have known him in his fifties and sixties, or for the last
15 years of his life. 8 Aristophanes, the third source of our
knowledge of Socrates, had already parodied Socrates in the
play Clouds, in 423 BCE, when Plato and Xenophon were still
infants. Socrates was 46 when the play premiered, so he had
probably been asking his probing questions and building a
reputation for some years before Aristophanes created an
unflattering character based on him.
So, who was being questioned by Socrates, and why?
Initially, it was the politicians, merchants, community
leaders, or those believed to be wise. But that doesn’t tell us
why they were being questioned.
Who, What, Where, When, Why, and How
Chaerephon, a friend of Socrates, asked the Oracle of Delphi
“if there was anyone wiser than Socrates.” 9 To which the
Oracle replied that there was not. Socrates did not accept
this and, certain there was a wiser man in Athens, set out to
find him. So he began to ask questions of the elders, the
statesmen, the teachers, and the politicians, or at least those
of them who were thought to be wise and sagacious.
His effort to prove the Oracle of Delphi wrong, in itself an
unheard-of impudence in Athens, is what gave rise to the
Socratic method of questioning. We must keep in mind that
all Socrates wanted to do was establish the truth, and
hopefully, find someone who knew more than he did. But we
all know that your truth might differ from my truth. In
addition, Socrates needed his questions to be understood, so
he required a common understanding of what any given
thing is. Instead, he found that commonly used words,
terms, and actions did not mean the same thing to
everybody.
Let’s look at one example. The differing answers to the
question “What is justice?” said more about loose
interpretations of the term than about any distinct meaning
of justice itself.
Person X might believe that justice means getting what you
deserve. In that case, who decides what any given person
deserves?
Person Y might define it in scriptural terms, from one or
another sacred book: justice is an eye for an eye. This is only
a short step from equating justice with retribution.
Person Z may declare justice to be what is fair, what is
ethical, and what follows natural law. In that case, we now
need to define natural law and fairness as well.
As a result of this lack of agreement on common things, it
became apparent that things needed to be defined with
greater precision. So Socrates began asking questions to establish
definitions that held true under all conditions. His efforts to
arrive at the truth embarrassed some people, and created
animosity and resentment among others, particularly the
rich and powerful.
However, one thing was clear. “Definition is crucial to the
Socratic Method." 10 Socrates would apparently spend at
least half of each dialogue defining the terms being
discussed and questioned. Plato's Republic, which is probably
the most famous Socratic dialogue, “spends most of its time
in defining just one term, justice." 11
Today, we have easy access to dictionaries, either at home, in
a library, or online, so you can quickly find a definition of
justice or any other word. An ideal definition defines a thing
in such a way that there can be no confusion or doubt about
the meaning of it. This begins with separating the nominal
definition, or the meaning of the word itself, from the
essence of the thing defined.
This difference is best expressed by Peter Kreeft and Trent
Dougherty in their book, Socratic Logic. They write:
“Nominal definitions are definitions of a name, or a word,
not necessarily of a reality. They answer the question, ‘How
is this word used?’ rather than ‘what is this thing?’" 12
A definition, to be effective, must be both clear and distinct.
In other words, a definition must express “the maximum, or
perfect, idea of a thing; but if we cannot have perfect clarity,
we should at least have perfect distinctness.” 13
The minimum standard for a definition is that it must
differentiate the thing defined from every other thing. It
must be so distinct that we do not mistake it for some other
thing.
By exploring the boundaries of a given issue and questioning
assumptions, Socrates set new standards for how words and
terms are defined. The Socratic way led to the establishment
of universal rules for definitions.
Kreeft and Dougherty list the six rules required for
definitions to be logically acceptable. 14
1. A definition should be coextensive with the thing defined:
neither too broad nor too narrow. (This is the most
important rule and the hardest to obey. It concerns the
extension of the term rather than the comprehension.)
2. A definition should be clear, not obscure.
3. A definition should be literal, not metaphorical.
4. A definition should be brief, not long.
5. A definition should be positive, not negative, if possible.
(Only negative realities call for negative definitions.)
6. A definition should not be circular. (The term defined
cannot appear in the definition.)
An argument can be made for excluding Rule 5, since it
allows for both positive and negative definitions; in that
case, it is a stated preference rather than a rule. But we are
not going to make that argument here. We will look at deductive and inductive
arguments now and the influence of the Socratic method on
the Scientific method.
Arguments And The Scientific Method
The methods of questioning used by Socrates to such telling
effect led to the establishment of deductive and inductive
arguments, and laid the groundwork for universal
definitions. This combination of reasoning and definitions,
along with the critical thinking methodology, can be
regarded as the essence of the scientific method.
In Chapter 1, we had a brief look at deductive reasoning, but
didn’t look at inductive reasoning or define either of them
clearly.
Deductive reasoning: a process of inference that
supports a conclusion with certainty.
Inductive reasoning: a process of inference providing
strong enough support to offer high probability (but
not absolute certainty) for the conclusion. 15
Deductive reasoning is also called top-down reasoning, and
it uses a “general to specifics” approach to building an
argument. For example:
Everything with seeds inside it is a fruit.
A cucumber has seeds inside it.
Therefore, a cucumber is a fruit.
This is a logically valid argument: if the premises are true,
the conclusion must be true. “Valid” and “invalid” are the
attributes that describe the logical form of deductive
arguments; “true” and “false” are the attributes that you
apply to the premises.
With deductive arguments, if the premises are true then it
follows that the conclusion is true. It’s important to note
that it is the form of the argument that is valid or invalid,
not whether the premises are true or false. The premises can
be false, but the logical form can be valid, as in this example:
All birds can fly.
All snakes are birds.
Therefore, all snakes can fly.
In any world where both premises were true, the conclusion
would have to be true as well; that is what makes the
argument valid. Even though the premises are false, as they
are in this case, the logical form is still valid.
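The point that validity belongs to an argument’s form, not its content, can even be checked mechanically. The sketch below is our own illustration in Python (not part of the original text): it models “All X are Y” as a subset relation and exhaustively tests two argument forms over a small universe, one valid and one superficially similar but invalid.

```python
from itertools import combinations

def powerset(universe):
    """Every possible subset of the universe, as frozensets."""
    items = sorted(universe)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

def all_are(xs, ys):
    # "All X are Y" modeled as: the set X is contained in the set Y.
    return xs <= ys

def form_is_valid(premises_hold, conclusion_holds, universe=(1, 2, 3)):
    """A form is valid if no interpretation of the terms A, B, C makes
    the premises true while the conclusion is false."""
    for a in powerset(universe):
        for b in powerset(universe):
            for c in powerset(universe):
                if premises_hold(a, b, c) and not conclusion_holds(a, b, c):
                    return False  # found a counterexample
    return True

# "All B are F; all S are B; therefore all S are F" (the birds/snakes form):
valid = form_is_valid(
    lambda a, b, c: all_are(a, b) and all_are(c, a),  # premises
    lambda a, b, c: all_are(c, b))                    # conclusion
print(valid)  # True: no interpretation breaks it, so the form is valid

# "All A are B; all C are B; therefore all C are A" (undistributed middle):
invalid = form_is_valid(
    lambda a, b, c: all_are(a, b) and all_are(c, b),
    lambda a, b, c: all_are(c, a))
print(invalid)  # False: A = {1}, B = {1, 2}, C = {2} is a counterexample
```

Because the check quantifies over every possible interpretation, it captures exactly what “valid” means: the conclusion cannot be false while the premises are true, no matter what the terms stand for.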
Inductive reasoning is something we all do without thinking
much, or realizing we are reasoning inductively. As you go
about your daily business your brain is processing hundreds
of snippets of information. Most of us are also instinctively
making connections between what we see or hear and
drawing conclusions based on that. For example, the last two
buses didn’t stop here, so probably the next one won’t stop
either.
Inductive reasoning is also called bottom-up reasoning. It
uses a “specific to general” approach to building an
argument, as in this example:
Most birds with two wings can fly.
This bird has two wings.
Therefore, this bird can probably fly.
This is a logically strong argument: the premises, if true,
make the conclusion highly probable. Inductive arguments
are not valid or invalid; they are inductively “strong” or
“weak,” and strength is the attribute that indicates their
logical force. An inductive argument is strong when its
premises, taken together, give the conclusion a high
probability of being true. Even a strong inductive argument
with true premises can, unlike a valid deductive argument,
still turn out to have a false conclusion; induction offers
high probability, not certainty.
For both deductive and inductive reasoning, a good
argument supports its conclusion when it is logically strong
and all of its premises are true.
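The earlier bus example captures how inductive conclusions carry probability rather than certainty. As a small illustration (our own sketch, not from the text), Laplace’s rule of succession is one classical way to turn a run of observations into a probability for the next case:

```python
def laplace_estimate(successes, trials):
    """Laplace's rule of succession: after seeing `successes` out of
    `trials`, estimate the probability the next trial is a success."""
    return (successes + 1) / (trials + 2)

# The bus example: the last two buses did not stop here (0 stops in 2 trials).
p_next_stops = laplace_estimate(0, 2)
print(p_next_stops)  # 0.25: the next bus probably won't stop, but it might
```

Note how the estimate never reaches 0 or 1: however many buses pass without stopping, the rule leaves room for the conclusion to be wrong, which is exactly the character of inductive strength.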
Critical thinking and the Socratic questioning method, along
with inductive and deductive reasoning, all come together in
the scientific method. In its simplest form, the scientific
method is a practical way of gaining knowledge.
Irving Rothchild says, “Being a good scientist requires
patience, perseverance, imagination, curiosity, and
scepticism.” He goes on to say that you also need to know
how to ask the right questions, how to observe before
judging, and how to interpret what you see from different
points of view. 16
We recognize all these traits and attitudes from what we
have already learned in this book. Let’s look at a deductive
argument based on Rothchild’s statement.
Premise 1: A scientist requires perseverance, patience,
imagination, curiosity, and skepticism, can ask the right
questions, knows how to observe before judging, and can see
an issue from different points of view.
Premise 2: A critical thinker requires imagination, curiosity,
and skepticism, can ask the right questions, knows how to
observe before judging, and can see things from different
points of view.
Conclusion: All scientists are critical thinkers.
You can change the arguments around a bit to yield a
conclusion that all critical thinkers are scientists. But this
conclusion is not supported by the evidence, no matter how
you present it. Of course, an argument could be made against
the conclusion that all scientists are critical thinkers.
Richard Muller, emeritus professor of physics at the
University of California, Berkeley, has also written on the
difference between a scientist and a non-scientist. 17
The Socratic method has grown into the scientific method
because it works – and it works well. Critical thinking, and
Socratic questioning, have many uses in a variety of business
situations. Socratic questioning is commonly used by lawyers
because it lends itself to interrogatory techniques. For
example, a lawyer will build a series of questions around the
central issue, intending to expose contradictions in any
testimony being given.
In other business areas, critical thinking helps in assessing
business opportunities or in analyzing possible competition.
In essence, any activity that requires analysis or assessment
will benefit from applying critical thinking, whether it is
evaluating customer service or allocating resources to
projects.
Critical thinking is also a superb personal growth tool. It is
the process we use to think about and assess the basis for
our beliefs. In doing so, we examine the assumptions
underlying our lives, and the role that assumptions play in
our ideas and actions.
When applying critical thinking techniques, it is important
you first learn to understand the issue. This is a case of you
needing to understand rather than a case of you wanting to
be understood. Until you understand the issue, you’re not in
any position to look for weaknesses in an argument. To
identify weaknesses in an argument, question the
assumptions. What are the assumptions based on, and do
they really support the argument?
Look for distinctness and clarity in all of the definitions, and
look at the issue from alternative points of view. In exploring the
boundaries of an issue, you learn to see it in greater depth,
and with broader ramifications. In other words, you see the
issue from new perspectives, which is a vital part of the
Socratic mindset.
If the issue in question is complex, or has unclear elements,
the best course is to break it down into chunks. If it is a
subject that you have limited experience of, or includes
elements outside your areas of expertise, acknowledge your
limitations. This shows intellectual humility, which we
discuss in the next chapter.
Let’s recap how the Socratic method works and why it
works. Socratic questioning begins with you asking for an
opinion, or for a definition of a word or phrase.
Building on the initial definition or opinion, ask a question
to help get greater clarity on some aspect, to find a more
precise definition. The resulting dialogue is advantageous to
both parties, and usually yields a better definition or an
alternate opinion.
The Socratic method works if both parties approach the
dialogue honestly and in search of truth. It works because it
can refine definitions, and thus deepen understanding; it
can broaden an opinion, and thus give it greater depth. It
also reminds us to look for bias in our opinions, and to
acknowledge it if it’s there. We need to be mindful of our
prejudices and biases, and of the potential for self-deception
that they carry.
Universal Intellectual Standards
The elements of critical thinking and the skills you need to
bring to bear on them were discussed in Chapter 1. In
addition to those skills, you have tools, in the form of
universal intellectual standards to help you to determine the
quality of reasoning. Richard Paul and Linda Elder (2006)
state, “The ultimate goal is for the standards of reasoning to
become infused in all thinking so as to become the guide to
better and better reasoning.” 18 Strong critical thinking
requires having a command of the intellectual standards.
They have the advantage of highlighting areas that are open
to questioning in your search for truth. The intellectual
standards, and the things you can ask about, are: 19
Clarity
Could you elaborate on what you mean?
Could you give me an example?
Accuracy
How could we check on this?
How could we find out if that is true?
Precision
Could you give me more details?
Could you be more exact?
Relevance
How does that relate to the problem?
How does that help us with the issue?
Depth
What are the things that make this difficult?
What are the complexities of this question?
Breadth
Do we need to look at this from another perspective?
How can we look at this in other ways?
Logic
Does all of this make sense together?
Does what you say follow from the evidence?
Significance
Is this the central idea to focus on?
Which of these facts are most important?
Fairness
Is my thinking justifiable in context?
Is my purpose fair given the situation?
Am I distorting my concepts to get what I want?
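For readers who like to keep such checklists in working form, the nine standards and their probing questions can be stored as a simple lookup structure. This Python sketch is our own illustration of one way to do that, not something from Paul and Elder:

```python
# The nine universal intellectual standards, each paired with the
# probing questions listed above. The dictionary is illustrative only;
# the standards and questions themselves are from Paul and Elder.
INTELLECTUAL_STANDARDS = {
    "Clarity": ["Could you elaborate on what you mean?",
                "Could you give me an example?"],
    "Accuracy": ["How could we check on this?",
                 "How could we find out if that is true?"],
    "Precision": ["Could you give me more details?",
                  "Could you be more exact?"],
    "Relevance": ["How does that relate to the problem?",
                  "How does that help us with the issue?"],
    "Depth": ["What are the things that make this difficult?",
              "What are the complexities of this question?"],
    "Breadth": ["Do we need to look at this from another perspective?",
                "How can we look at this in other ways?"],
    "Logic": ["Does all of this make sense together?",
              "Does what you say follow from the evidence?"],
    "Significance": ["Is this the central idea to focus on?",
                     "Which of these facts are most important?"],
    "Fairness": ["Is my thinking justifiable in context?",
                 "Is my purpose fair given the situation?",
                 "Am I distorting my concepts to get what I want?"],
}

def probing_questions(standard):
    """Return the questions to ask when testing one standard."""
    return INTELLECTUAL_STANDARDS[standard]

print(probing_questions("Accuracy")[0])  # How could we check on this?
```

Kept in this form, the standards become a practical prompt list you can run through whenever you evaluate an argument.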
The processes and methodologies that come from Socrates’
search for truth are still as effective and as important today
as they were 2500 years ago. Given the amount of
disinformation we receive daily in our information society,
critical thinking and Socratic questioning are needed more
than they ever were in the past. So, what is truth? How do we
define it?
Aristotle defined it in the most appealing and most sensible
way. He said, "If a man says of what is that it is, or of what is
not that it is not, he speaks the truth, but if he says of what
is not that it is, or of what is that it is not, he does not speak
the truth." 20
The appeal of the definition lies in the fact that, although
the sentence consists of 48 words, every one of them is a
single syllable.
Exercises And Tasks
1. Richard Muller, Professor of Physics at UC Berkeley, tells a
story about a visiting scientist who made a presentation to
Muller’s research group. After the visiting scientist had left,
Muller asked his students to identify the one statement
made that had him rolling his eyes. Unfortunately, none
were able to name the specific statement that had so
infuriated Muller. The offending statement was:
“What I am trying to prove is the following …”
a. Think about this, and consider why Muller would have
been upset by this statement. (Hint: Think in terms of the
elements of critical thinking.)
b. Is there another way to phrase the statement of intent?
(Note: Muller’s reasoning and his preferred statement of
intent are given at the end of the exercises)
2. Read the statement below, posted by JJ* to a social media
discussion group, then answer the questions that follow it.
“I am 26, never had a job, and ruined my life with wrong
decisions. Is there a book that can help me change my life?”
*Not their real initials.
Analyze the statement first.
a. What does JJ mean by “wrong decisions”?
b. Were the decisions made by somebody else, and if so, why
does JJ go along with the decisions?
c. What does “ruined my life” mean to a 26-year-old?
Now analyze the question.
d. Can JJ be more specific about the type of change he or she
wants in their life?
e. What area of their life is affected: health, career, family,
relationships, emotions?
f. In what way does the book change anything? Isn’t JJ just
shifting the decision to “change my life” to somebody else?
Model answer to question 1: Muller says: “To me, that
indicated a bias in favor of one particular answer. With that
bias, there would be great difficulty in evaluating the data in
a truly objective way.”
He says the statement of intent should have been: “There is
speculation that ‘X’ is valid. I would like to test that
hypothesis, and see if it is true or false.”
Chapter Summary
Socrates, an Athenian stonemason, soldier, and philosopher
who died about 2500 years ago, is known and honored for
his thinking and ethics, and for his method of questioning.
We can all benefit from the methods he
developed to gain insight into the world around him and to
find the true meaning of things.
What made him particularly effective was that his method of
questioning was probing and direct, yet remained respectful.
One of the legacies of his questioning methods is the set of
standards established for the definitions of things. For a
definition to be perfect, it must be perfectly clear and
perfectly distinct. The minimum standard for a definition is
that it must differentiate the thing defined from every other
thing. It must be so distinct that we do not mistake it for
some other thing.
The Socratic method of inquiry uses questions to clarify
beliefs, expose contradictions in arguments raised,
understand any assumptions, and probe the evidence and the
reasons used to support them. Socrates pushed the
boundaries of the purpose of philosophy by expanding it to
include trying to understand personal values as well as the
place of humanity in the greater scheme of things. His
passion for detailed and specific answers inspired the
development of formal logic systems.
Structured arguments, as seen with inductive and deductive
reasoning, offer established ways to test a new hypothesis.
This became possible as the scientific method grew out of
the Socratic method. It is popular and well regarded because
it works – and it works well.
The universal intellectual standards can help you to test and
probe the quality of reasoning in any argument, and they
deepen your insights into Socratic thinking and questioning.
The universal intellectual standards are:
Table 3: Universal Intellectual Standards
3
TRAITS OF A SOCRATIC MIND
Socrates was a humble man, from humble beginnings. He
portrayed that in the image he projected, hanging
around the marketplace shabbily dressed, unkempt, and
barefoot, even though he probably didn’t think of it in terms
of his image. His humility was also evident in his thinking,
believing as he did, that he knew nothing. To think and
reason from a perspective of intellectual humility means you
need a constantly curious mind, a wide-open mind, and a
listening mindset. In truth, you also need a thirst for truth.
Socrates was, to an extent, focused on ethics. His search was
not just for truth, but also for what is good and right. We
know we each have our own concept of what is true, what is
good and what is right. And your truth and right might differ
from my truth and right. They may also differ from country
to country, and from culture to culture. Socrates was
searching for objective truth, a truth that would hold up to
examination in all situations and for all people. If something
is objectively good, it must be good for all parties.
For example, a logging operation in a forest where there are
threatened or endangered species, or a coal mine that will
affect the local pastoral community, might be good for a few
business people, and for a few politicians, but not good for
the local people who live in the area. Whose good takes
precedence in this situation?
How can we know ourselves if we do not examine our lives,
our beliefs? How stable is the foundation on which you build
your thoughts and behavior? Are your rights, or good, or
truth more important than my rights, good, or truth? Does
where I live make a difference? Does my language or my skin
pigmentation make a difference?
We have now moved into the area of virtue, specifically the
cardinal virtues. Wikipedia lists them as Prudence, Justice,
Fortitude, and Temperance, and says, “They form a virtue
theory of ethics.” 1 Three of the names are not in common
use today, but it is the meaning behind them that counts.
Table 4 lists them by different names, with other meanings
to help clarify what each one represents.
Table 4: The Cardinal Virtues
Plato, apparently, thought sound-mindedness to be the
foremost virtue. How, in this instance, would Plato define
sound-mindedness? Sound could be synonymous with
healthy, thus healthy-minded. But again, what yardstick do
we use to measure healthy mindedness?
To think like Socrates and to effectively use his method of
questioning, you will need to assume the freedom to
question statements like “sound-mindedness is the
foremost virtue,” and then pursue the argument until you
have a definition all parties agree to. You will also need to
develop specific traits, or mental characteristics, a task that
requires consistent practice and constant awareness.
Whether you see the traits as habits of mind or think of them
as part of a mindset, the effort to develop these traits or
characteristics is rewarding and satisfying.
Richard Paul and Linda Elder call them intellectual traits,
and say “Consistent application of the standards of thinking
to the elements of thinking result in the development of
intellectual traits.” 2
In effect, by consistently applying the universal intellectual
standards to the elements of critical thinking, you will
develop the mental and intellectual traits of a Socratic mind.
Paul and Elder go on to list the traits as:
Intellectual humility
Intellectual courage
Intellectual empathy
Intellectual autonomy
Intellectual integrity
Intellectual perseverance
Confidence in reason
Fair-mindedness
Let’s go through these, expand on what they are and what
they are not, and reflect on where and how you can apply
them to improve your critical thinking.
Intellectual Humility
In the words of Mark Leary, intellectual humility is “the
recognition that the things you believe in might in fact be
wrong.” 3 Intellectual humility is the scientist working to
disprove their own hypothesis. Acknowledging your
limitations, or admitting you were wrong about something,
requires courage. It also requires an ability to identify bias,
pretentiousness, and conceit, and to work on eliminating
them from your thinking.
If you are truly curious about the things around you,
intellectual humility is a necessity; otherwise, you’ll limit
your learning.
Socrates questioned people he believed knew more than he
did. This is intellectual humility in action.
Intellectual Courage
A Japanese proverb says, “If you believe everything you read,
you better not read.” Or you need to question why you believe
everything you read. To do that takes courage, because part
of your reading matter will be books and papers you have
grown up trusting, or that you rely on to keep up to date
with your profession, or that guide you spiritually. When you
start to question your beliefs and values, you will need lots of
intellectual courage.
I doubt that you believe everything you read on the internet,
but we all find our computers, smartphones, and the
internet an almost irresistible attraction. So much so that we
need to be online, one way or another, all day, every day.
Now question your compulsion to be online. What are you
getting in return for giving your devices so much control
over you? Even when you’re shopping, or in your car, or
juggling the cooking for a five-course meal, you’re online
and talking or texting.
In an interview for CNN, Steve Wozniak said, “All of a
sudden, we've lost a lot of control. We can't turn off our
internet; we can't turn off our smartphones; we can't turn
off our computers. You used to ask a smart person a
question. Now, who do you ask? It starts with g-o, and it's
not God.” 4
Question your devotion to social media, the internet, and to
always being on. What part of your life will suffer if you’re
unconnected for six hours a day, or ten? Would you suffer
emotionally or spiritually? Would you experience physical
withdrawal symptoms?
Intellectual Empathy
Being conscious of the need to understand others, and of the
need to see things from their perspective is where
intellectual empathy begins. To understand the needs of
others, we must put our agendas aside, park our prejudices,
and try to see other views, other reasoning, and remember
those times where we were wrong, despite being convinced
we were right. Intellectual empathy requires that you not
just put yourself in someone else’s shoes, but also that you
walk in them for two months.
Intellectual Autonomy
Intellectual autonomy is the freedom to form your own
beliefs, and to recognize how and where you limit your
thinking, if you limit it at all. As with intellectual humility, courage,
empathy, and intellectual integrity, your starting point is a
consciousness of what is around you, what is within you, and
what impact these influences, feelings, and impressions
have on the rational control of your beliefs and values.
Collins Concise English Dictionary defines autonomy as “the
right or state of self-government … the freedom to
determine one’s own actions, behavior.” 5 It should go
without saying that this would require a well-developed
sense of social responsibility and high moral standards.
Autonomy without taking any responsibility for outcomes, or
autonomy without standards, opens the doors for immoral
or corrupt behavior.
Your ability to think critically must underlie intellectual
autonomy. To begin with, you need to gain control over your
thinking and be aware of what influences your thought
processes. You need to make a commitment to yourself to
analyze and evaluate your beliefs. Look at the evidence that
you use to justify your values, think about your behavior
patterns and the reasons you act as you do. If somebody else
believes and acts as you do, would it be acceptable in all
situations? If not, why do you tolerate it in yourself?
Intellectual Integrity
We should be living our lives with integrity, whether
intellectual or otherwise. If we cannot be honest with
ourselves, how can we be honest with others? Acting with
integrity means being consistent in all your doing,
thinking, and dealing. The standards you apply to the
thinking and behavior of other people should apply to your
own thinking and behavior.
Intellectual integrity requires a recognition of fairness in
how and where you apply standards. If you are honest
enough to admit to errors in your thinking or actions, you
are beginning to apply fair-mindedness.
Intellectual Perseverance
Along with integrity you need perseverance, because in spite
of good intentions to be honest and ethical, you will
encounter times when the breakthrough to intellectual
insights is difficult and frustrating. You may struggle to
adhere to rational principles in the face of irrational ideas
and beliefs–yours and those of others. You may have to
wrestle with confusion or uproot your prejudices, because to
achieve full understanding takes focused effort.
If you are afraid of what you might find by pushing through
the difficulties and obstructions, then you need to find some
other way to get to the truth, such as a constructive
argument with yourself, or a close companion who will help
push you through to your goal.
To cultivate a positive critical thinking mindset, begin by
affirming, on a daily basis, your intention to live by the
values embodied in critical thinking. Then live by those
values for that day.
Confidence In Reason
Whatever obstacles you encounter in your critical thinking
journey, your ultimate success is, to some extent, dependent
on your ability to reason. If you trust your logic, and believe
that what you are doing is in the best interests of humanity,
and that it serves your own higher interests, you will be able
to argue with confidence in your reasoning. However, this is
not the full picture. Without good listening skills you might
find yourself arguing at cross purposes. If you cannot hear
the opposing arguments, listening with a careful, attentive
ear to what the other side has to say, then all your reasoning
is of little value.
Listening is a key element of your critical thinking skills. In
the words of jazz saxophonist McCoy Mrubata, asked what a
musician should do while someone else is playing their solo:
“Listen. Don’t fiddle with your instrument. Listen, because
you need to respond to the conversation.”
Try to encourage friends and family to come to their own
conclusions by using their own innate faculties. You can only
do this if you carefully listen to what is said, think logically,
form rational arguments, and draw reasoned conclusions. If
you aim to persuade others to become reasonable people, do
it by means of a confidently reasoned argument.
Fair-Mindedness
Objectivity is the goal here. Are there other points of view to
take into account? Are my biases working against the best
outcome? Do I have any vested interests that might influence
my thinking and decision making?
Being conscious of the need to treat all viewpoints with the
same care and attention, without reference to entrenched
beliefs of friends and community, is extremely difficult. Yet,
you need to make an effort to fulfill the demands of fair-
mindedness. Approach all such situations with an inquiring
mind. This is where you will benefit from the curiosity of a
child.
As you can see, these intellectual traits support each other,
and applying one without the others achieves little. You will
also have noticed that emotion does not play a role in critical
thinking. The reason should be obvious: emotion is the
enemy of reason. If you have any emotional attachments to
your beliefs, whether personal, religious or political beliefs,
you need to look closely at the reasons behind the
attachments.
Dr. Chuba Okadigbo, a Nigerian philosopher, political
scientist, and academic, puts it well: “If you are emotionally
attached to your tribe, religion or political leaning to the
point that truth and justice become secondary
considerations, your education is useless. Your exposure is
useless. If you cannot reason beyond petty sentiments, you
are a liability to mankind.” 6
Paul and Elder encourage you to make a habit of thinking
critically, make a habit of using the intellectual traits, and
use these skills to: 7
Raise vital questions and formulate them clearly and
precisely.
Gather relevant information about the question and
assess it, using logic and abstract ideas to interpret it
effectively.
Arrive at well-considered conclusions and be
prepared to test them against intellectual standards,
using relevant criteria to do so.
Approach the issue with open-minded curiosity,
apply alternative thought systems to recognize any
assumptions and assess the implications and
consequences.
Communicate effectively with other people to arrive
at workable solutions to complex problems.
In addition, maintain your curiosity throughout to keep your
mind agile and active. An agile and active mind is important
for critical thinking because it opens you up to new ideas and
new possibilities.
Under characteristics of a well-cultivated critical thinker,
Paul and Elder mention that when approaching a question, it
helps to know what type of question it is. “Is it a question
with one definitive answer? Is it a question that calls for a
subjective choice? Or does the question require you to
consider competing points of view?” 8 The question types
are graphically illustrated in Figure 1.
To establish what type of question it is, first ask if there are
relevant facts we need to consider to answer the question.
If there are facts to consider, and the facts alone settle the
question, it is a one-system question. If the facts can be
interpreted differently, then the question is open to debate
and is a multi-system question. If there are no facts to
consider, then there is no system and the answer is a matter
of personal preference. 9
Figure 1: The three types of questions
In Cultivating A Critical Thinking Mindset, Dr. Peter Facione
discusses the value of critical thinking. He says, “if we value
critical thinking, we desire to be ever more truth-seeking,
open-minded, mindful of consequences, systematic,
inquisitive, confident in our critical thinking, and mature in
our judgment.” 10 This is a good way to summarize the
traits and methods we cover in this chapter. But it does not
end there. We need to be on the lookout for ways to improve
our critical thinking skills.
One of the ways to improve those skills is to be on the
lookout for opportunities to use critical thinking to support
you in making decisions and to influence how you solve
problems. Instead of just reacting, take some time to be
reflective and thoughtful. The more you look for such
opportunities, the more you will find. Make it a habit.
Forgive yourself if you go back to your old ways of tackling
problems and making decisions. Forgive yourself if you miss
an opportunity to practice critical thinking. It’s not a
disaster, so be gentle with yourself. Humility, integrity,
courage, open-mindedness, and empathy are ideals we
strive to achieve. There will be missteps and stumbles, but
do not let them stop your journey to become a strong critical
thinker.
Dr. Peter Facione refers to a “strong critical thinker” as
opposed to a “good critical thinker” because, as he says, good
is too ambiguous. Is a good critical thinker also an ethical
person? In this context, good could be interpreted as a
judgment on the person’s ethics or critical thinking skills.
For example, a lawyer defending a multinational corporation
in court for violating environmental legislation may be adept
at presenting well-reasoned arguments, and skillful at
finding the weaknesses in opposing arguments, but that
doesn’t make them good in the ethical sense of the word. If
that lawyer uses these same skills to mislead and exploit
gullible people, or perpetrate a fraud, you can see the lawyer
as a skilled, or strong critical thinker, but hardly as an
ethical thinker.
Critical thinking is useful in decision making, but like any
tool or process, it can be used in unworthy ways. Facione
uses the example of the revelations that Victor Crawford
made in a 60 Minutes interview. Crawford, a lobbyist for the
tobacco industry, “admits that he deliberately misled and
manipulated legislators and the general public.”
This is nicely summed up by the comedian George Carlin,
who says, “They spend billions of dollars every year
lobbying ... lobbying, to get what they want ... Well, we know
what they want. They want more for themselves and less for
everybody else, but I’ll tell you what they don’t want ... they
don’t want a population of citizens capable of critical
thinking. They don’t want well-informed, well-educated
people capable of critical thinking. They’re not interested in
that ... that doesn’t help them. That’s against their
interests.” 11
Exercises And Tasks
Think about Steve Wozniak’s quote regarding how fixated we
are with our electronic devices:
“All of a sudden, we've lost a lot of control. We can't turn off
our internet; we can't turn off our smartphones; we can't
turn off our computers. You used to ask a smart person a
question. Now, who do you ask? It starts with g-o, and it's
not God.”
Think about the loss of control he mentions.
Bring this up with your family for discussion. Could you
reach an agreement on no devices at the dinner table?
Try this: Each day for a week, record how much time you are
separated from your electronic lifeline. Then work on
increasing the periods you are not connected, not carrying
your device from the bedroom to bathroom, not staring at a
computer screen.
Report back to your family on the outcome of your one-week
trial. Ask them to try it with you for a week.
Could you persuade them to continue this habit beyond one
week?
Chapter Summary
Socrates was a humble man and projected that humility
throughout his life. His philosophical search, focused on
ethics and truth, extended to what is good and right. This is
in line with the virtues of Prudence, Justice, Fortitude, and
Temperance, which form a “virtue theory of ethics.”
Table 4: The Cardinal Virtues
To think like Socrates, and to effectively use his method of
questioning, you need to assume the freedom to question
statements like “sound-mindedness is the foremost virtue,”
and then pursue the argument until you have a definition all
parties agree to. You will also need to develop specific traits,
or mental characteristics, which requires consistent practice
and constant awareness. By consistently applying the
universal intellectual standards to the elements of critical
thinking, you will develop the mental and intellectual traits
of a Socratic mind. The traits are:
Intellectual humility
Intellectual courage
Intellectual empathy
Intellectual autonomy
Intellectual integrity
Intellectual perseverance
Confidence in reason
Fair-mindedness
Use the intellectual traits to:
Raise vital questions and formulate them clearly and
precisely.
Gather relevant information about the question and
assess it using logic and abstract ideas to interpret it
effectively.
Arrive at well-considered conclusions and be
prepared to test them against intellectual standards
using relevant criteria to do so.
Approach the issue with open-minded curiosity and
apply alternative systems of thought to recognize
assumptions, and assess the implications.
Communicate effectively with other people to arrive
at workable solutions to complex problems.
Maintain your curiosity throughout because curiosity will
keep your mind agile and active. An agile and active mind is
important for critical thinking because it opens you up to
new ideas, new possibilities.
When approaching a question, it helps to know whether you
are dealing with a one-system question, a multi-system
question, or a no-system question.
Practice critical thinking and forgive yourself if you miss an
opportunity to do so. Humility, integrity, courage, open-
mindedness, and empathy are ideals we strive to achieve.
There will be missteps and stumbles, but do not let them
stop your journey to become a strong critical thinker.
4
QUESTIONING: THE HEART OF THE SOCRATIC
METHOD
In many shamanic societies, if you came to a
shaman or medicine person complaining of being
disheartened, dispirited, or depressed, they would
ask one of four questions.
When did you stop dancing?
When did you stop singing?
When did you stop being enchanted by stories?
When did you stop finding comfort in the sweet
territory of silence?
— GABRIELLE ROTH
This quote is attributed to Gabrielle Roth, and can be
found in many places on the internet with her name as
the author. While the quote is in Roth’s book, Maps to
Ecstasy: The Healing Power of Movement, the quoted text is in
the foreword and is written by Angeles Arrien.
This is another example of an incorrect attribution on the
internet, but its importance here is as an example of
knowing what questions to ask. Fortunately, you do not need
to be a shaman, or a medicine person, to know what
questions to ask. Next time you go to a doctor, pay attention
to the questions your doctor asks. Or, next time you consult a
mechanic about strange and unusual noises from your car,
pay attention to the questions they ask. Essentially, they are
working through a list of questions that will help them
diagnose the problem.
Please understand that knowing what questions to ask is a
matter of practice. Yes, in the case of your doctor or
mechanic, they have some in-depth training and specialist
knowledge. But do not underestimate your depth of
knowledge about the problems and issues that concern you
and trouble you, whether they are business and career-
related, or home and family-related. If you can approach
them from a dispassionate and detached point of view, so
much the better. Emotional distance lets you spot helpful
clues, find a significant pattern, or identify a source of
additional information.
If necessary, ask a close friend to help you debate the issue.
Let them play devil’s advocate, present an opposing view,
and push you to critically examine your position. Ask
questions about the source of the information or problem. Is
it supported by any other source? Can it be verified
independently? You may be surprised at how much you can
establish with a little effort. But most of all, you are
practicing and establishing your critical thinking and
analytical habits.
To help you break through your standard thought patterns
and access the questioning mindset you strive for, have a
close look at what you have been conditioned to think and
believe. Question what you have been taught. How accurate is
the history you were taught at school? How valid are the
scientific facts we learned in childhood? For example,
Newtonian physics is no longer the only physics that
matters. This is not to say that what you already know is
wrong, but accept the possibility that it might be wrong.
In the book Future Shock, Alvin Toffler is reputed to have
said: “The illiterate of the 21st century will not be those who
cannot read and write but those who cannot learn, unlearn
and relearn.” While the general idea of what Toffler said is
valid, he didn’t say it quite like that.
The original Toffler quote is: “By instructing students how
to learn, unlearn and relearn, a powerful new dimension can
be added to education … Tomorrow's illiterate will not be the
man who can't read; he will be the man who has not learned
how to learn.” 1
To compound the issue, the misquote has been
misappropriated and modified to read: “The illiterate of the
21st Century will not be those that cannot read or write, but
those who cannot unlearn the many lies they have been
conditioned to believe.” 2
If there is anything to learn from this, it is to question every
source, question every statement–particularly if it comes
from the internet. That, and to know that learning is a
continuous process, not something you stop doing after
college.
What Is Piety?
In Chapter 2, in the discussion on definitions, we looked at
the words of Peter Kreeft and Trent Dougherty, who said, “…
if we cannot have perfect clarity, we should at least have
perfect distinctness. If we cannot know exactly what a thing
is, we should at least know what it isn't, that is, know its
limits.”
This is important and was what Socrates was trying to
establish in his dialogue with Euthyphro on piety. To quote
Kreeft and Dougherty again, “The minimum for an
acceptable definition is that it at least distinguishes the
thing defined from all other things, so that we will not
confuse it with other things.” 3
One of the charges that Socrates faced was impiety, in that
he offended the gods of Athens by introducing new gods. On
the day of his trial, Socrates encountered Euthyphro, who
was at the People’s Court for another matter. The story
below comes from a summary of Plato's “Euthyphro” by
Emrys Westacott, a professor of philosophy at Alfred
University. We begin with Westacott’s definitions of the term
"piety" which, he says, has two senses:
1. A narrow sense: knowing and doing what is correct in
religious rituals. For example, knowing what prayers should
be said on any specific occasion or knowing how to perform
a sacrifice.
2. A broad sense: righteousness; being a good person. 4
On seeing Euthyphro, Socrates expresses his delight at
finding someone who is, by Euthyphro’s own claims, an
expert on piety. “This is just what I need in my current
circumstances,” says Socrates, and he asks Euthyphro to
explain to him what piety is.
“Piety is prosecuting wrongdoers,” says Euthyphro, “and
that is what I am doing now. Impiety is failing to prosecute
wrongdoers.”
Socrates objects, saying that what Euthyphro describes as
piety is simply an example of piety, but it does not define the
wholeness of the concept of piety.
“Yes, yes,” says Euthyphro, “I was just getting to that. Piety
is what is loved by the gods, impiety is what is hated by the
gods.”
Again Socrates objects, saying, “Euthyphro, we all know that
the gods sometimes disagree with each other about
questions of justice. So, we agree that some things are loved
by some gods and disliked by other gods. Therefore the
things the gods disagree about will be both pious and
impious, which makes no sense.”
To which Euthyphro replies, “Yes, that is true, Socrates. But
what I meant was that piety is what is loved by all the gods.
Impiety is what all the gods hate.”
Socrates accepts this as a good starting point for defining
piety, but then poses the question that is the key to this
dialogue and goes to its heart. He asks, respectfully, “Do the
gods love piety because it is pious, or is it pious because the
gods love it?”
This is a delicious question, along the lines of: are works of
art in museums because they are works of art, or do we call
them “works of art” because they are in museums?
What Socrates has done is bring the argument around to the
beginning. What is piety? What makes an action pious? Is
something pious only because the gods see it as pious, or do
the gods love actions such as helping a stranger in need
because such actions have the property of piety?
This line of questioning results in Euthyphro trying to clarify
the position by saying that piety is concerned with caring for
the gods. This too, is subjected to the questioning of
Socrates, who says the notion of care, in this sense, is not
clear.
Socrates asks if this sense of care is the same care a dog
owner gives to the dog or is it care in the sense of slaves
caring for their owner? If piety is concerned with caring for
the gods, what is the goal of giving such care? Is the goal to
improve the health and wellbeing of the gods (as it would be
in the case of caring for a dog), or is the goal to improve the
comfort and contentment of the gods (as it would be in the
case of slaves caring for their owner)? Euthyphro can't say
what the goal of such care is but changes tack and gives his
fifth definition of piety. He says, “Piety is saying and doing
what is pleasing to the gods at prayer and sacrifice.”
Socrates points out that this definition takes them back to
the third definition, but in a disguised form, namely that
some gods will disagree about what it is that is pleasing to
them. At this point, Euthyphro decides he has better things
to do and takes his leave.
You can see how Socrates developed his line of questioning
and how he applied logic to the answers given by Euthyphro.
Of course, logic and questioning rely on critical thinking.
A Useful Technique
When you engage in a dialogue with friends or colleagues,
you need to carefully consider what has been said or what
you will say next. Take your time when you think about what
question to ask next or how to answer. This is called the
awkward silence or the pregnant pause. And it is a powerful
tool in your critical thinking skillset.
When faced with a difficult question, pause and think deeply
about how you want to answer. Often, as the silence
lengthens, the temptation is to rush and fill it with words. A deliberate
awkward silence is a pause of at least 10 or 15 seconds. You
may think a pause as long as 10 seconds is too long, so
practice delaying your answer to give yourself time to consider all
the possibilities.
You can break the silence with a sigh, as used here 5 by Elon
Musk. Or with an incomplete phrase, as in this Q&A session 6
with Steve Jobs, as he responds to a question made more
difficult because it ends with what can only be an insult.
As these examples illustrate, the awkward silence is a
powerful tool.
It can help you give deeper, more analytical, more
thoughtful answers.
It can help you get to the root of problems more
effectively, leading to greater understanding. 7
These examples of the awkward silence are not directly
related to critical thinking, but the pause itself is a useful thinking tool.
Critical thinking is not something that should be rushed.
Instead, every statement that forms part of your argument
needs to be carefully considered. Does the logic hold true in
the current context? Does the conclusion follow naturally, or
is it contrived?
It helps to pause for a minute, if necessary, and recap the
flow of the argument. Is there a direct and logical connection
from the main ideas to the conclusion?
Critical thinking is the Socratic way of thinking. It forms the
framework for the systematic and disciplined exploration of
the fundamental concepts, principles, and theories on which
our 21st-century society is built. It is made possible by
conducting a dialogue that digs deep, engaging with the
issues and problems that confront modern society. It should
be focused. It must be disciplined. It has to be systematic.
When faced with words like critical and argument, it is easy
to assume that the tone of the debate should be antagonistic
and that the aim is to destroy the stance or viewpoint of the
other party. But this is not the case. Remember that Socrates
was humble, and his dialogues were conducted from the
standpoint of humility. Therefore, treat your interlocutor
with respect. Your aim is to focus their thinking through a
series of questions that encourage a reassessment of a belief
or theory. It is not about trying to prove anyone wrong.
Rather it is a sincere attempt to grow or increase the
knowledge and understanding of all parties.
Socratic questioning is a process that may or may not end
with a satisfactory conclusion. This is because you engage in
questioning to deepen your understanding instead of trying
to prove a point. If, in the process, you cause the other
person to rethink their initial premise or see things from a
different perspective, you have both grown in understanding
and knowledge. The idea, at all times, is to avoid causing
conflict.
When you begin to consider the arguments presented, you
could face a dilemma over where to start. What do you
question? How do you question it? What questions could you
possibly ask? Begin by summarizing the argument presented
and clarifying the terms used, such as words or phrases that
might be ambiguous. Drs. Richard Paul and Linda Elder, in
The Thinker’s Guide to The Art of Socratic Questioning, set out
eight possible areas to question. 8
1. Questioning goals and purposes.
Is there an agenda, either hidden or on display? Do you
understand the thought processes at work? Here are a few
questions that focus on purpose in thinking:
What is your purpose right now?
What was your purpose when you made that
comment?
What is the central aim of this line of thought?
What other goals do you need to consider?
2. Questioning questions.
What is the thought that gives rise to this question?
Questions that focus on questions in thinking include:
I am not entirely sure what question you are raising.
Could you explain it?
Is this question the best one to focus on, or is there a
more pressing question to address?
That might be the question from a conservative
viewpoint, but what about a liberal viewpoint?
What questions are you not asking that should be
asked?
3. Questioning information and experience.
If you don’t understand the background information, such as
facts and experiences that inform and support the question,
focus your questions on the information you need:
On what information do you base that comment?
What experience convinced you of this? Could bias
distort your experience?
How do you know this information is accurate? How
can you verify it?
4. Questioning inferences and conclusions.
It is common to create meaning from inferences and then to
draw conclusions from inferences. If you do not fully
understand the thought behind the inferences then question
them. Questions that focus on inferences include:
How did you reach that conclusion?
Could you explain your reasoning?
Is there a plausible alternative conclusion?
5. Questioning concepts and ideas.
You cannot fully understand a line of thinking until you
understand the concepts that shape it. Questions that focus
on concepts in thinking include:
Could you explain the main idea that you use in this
line of reasoning?
Do you have all the facts, or do you need to rethink
how you label the facts?
Is the question a legal, a theological, or an ethical
one?
6. Questioning assumptions.
You cannot fully understand a thought until you understand
what it takes for granted. Questions that focus on
assumptions include:
What are you taking for granted here?
Why do you assume that?
What other assumptions underlie your point of view?
7. Questioning implications and consequences.
All thought begins somewhere, sometimes based on
assumptions, and that thought has implications and
consequences. If you do not fully understand the
implications and consequences, you should question the
thought. Implications and consequences to consider include:
What do you imply when you say…?
If you do this, what is the likely result?
Have you considered the consequences of this?
8. Questioning viewpoints and perspectives.
All thinking has a point of view or frame of reference. You
cannot fully understand the thinking until you can see the
point of view or grasp the frame of reference. Questions that
you should consider here include:
From what point of view are you seeing this?
Is there another point of view to consider?
What frame of reference makes the most sense given
the situation?
The typical question many people ask is designed to get a
specific answer. What differentiates this type
of question from Socratic questioning is that the Socratic
method is not simply about getting an answer. It is about
getting you or your partner to think more deeply about an
issue. By creating a series of questions that build up a line of
reasoning, or build up your arguments, your dialogue will
either help you reach a reasonable conclusion or bring you or
your interlocutor to a state of puzzlement or perplexity, what
the Greeks called aporia.
In the hands of a practiced questioner, it does not matter
what answer is given, as Socratic questioning will take any
answer and form a new line of questioning. In all of these
cases, keep the implications of subtext in mind. For example,
when your life partner says, “Let’s go out for a meal
tonight,” what is really being said?
It could mean “I am tired of eating what you cook, and don’t
want to cook myself.”
Or it could mean, “I don’t want to eat from a tray on my lap
while watching tired reruns on TV.”
Or perhaps it means “I want to change the dynamic of our
relationship.”
Be aware, and be alive to the possibilities. It will also help to
know that clear-headed thinking is difficult in the 21st
century–more difficult than it has been in the past. The
problem is the sheer volume of information we consume
every hour of every day. Between work, family, the climate
crisis, and the coronavirus pandemic, we face volume overload.
volume overload. Psychologist and critical-thinking expert
Daniel Levitin says the amount of information coming in
puts a strain on our ability to evaluate it. “We’ve become less
critical in the face of information overload. We throw up our
hands and say, ‘It’s too much to think about.’” 9
Warren Berger argues that if we want to improve our
abilities to make considered decisions, we need to hone our
critical thinking skills. He suggests a set of questions and a
willingness to ask them consistently. Not simply to ask
them, but to consider the answers thoughtfully before
passing judgment.
Five all-purpose questions for better thinking
How can I see this with fresh eyes?
What might I be assuming?
Am I rushing to judgment?
What am I missing?
What matters most? 10
What are the decisions in your life that you make with little
thought? What are the choices you make based on little more
than a gut instinct? Of course, many of our daily decisions,
such as what to have for dinner or what time to leave for
work, do not need critical thinking. But with the advent of
“fake news” and disinformation, you need to start assessing
things more critically. And in doing so, watch for the
distorting influence from your cognitive biases.
A few short months before his death in 1996, in what was
probably his last interview, astronomer Carl Sagan said to
his interviewer: “If we are not able to ask skeptical
questions, to interrogate those who tell us that something is
true, to be skeptical of those in authority, then we are up for
grabs for the next charlatan, political or religious, who
comes ambling along.” 11
Exercises And Tasks
Fake news has become a buzzword in recent years. The
spectrum of fake news ranges from seemingly benign
misinformation to outright disinformation that is blatantly
dishonest and intended to manipulate and/or confuse
people.
Members of the public made the comments below at a recent
webinar to discuss ways to combat the scourge
of disinformation. After reading the comments, consider
whether you have any opinion about misinformation and
disinformation. Then answer the question at the end of the
comments.
Comments from a webinar on misinformation 12
Frank P: How does one realistically challenge and hold
politicians accountable for divisive and patently
irresponsible rhetoric? In South Africa, this problem is
rampant.
Richard V: @Frank P - by teaching our children from an early
age that they have a right to question. By teaching critical
thinking at secondary school.
Kamo M: What do you think can be done to educate
consumers of mis/disinformation on how to be discerning?
They are the biggest victims, especially with local govt
elections coming up.
Pam T: Yes, I couldn't agree more. We need to keep our
political leaders to account. This is long overdue.
Michael C: @Richard V. Fully agree. Teaching critical
independent thinking should be the essence of education.
It's the best defense against an increasingly sophisticated
disinformation industry.
Theuns O: On the principle of censorship, there have been
cases where a social media platform has exercised bias by
'cherry picking’ narratives that suit the people that run it. If
these platforms also 'shadow ban' opposing opinions, how
can a citizen inform themselves by finding the truth between
different positions?
Kim B: The problem is that we are not critical thinkers. When
we receive news, whether on social media, mainstream
media or from the government, we do not interrogate it. I
believe we need to encourage more critical thinking and it
starts in the classroom and at home. Parents should nurture
critical thinking in their children, but how do you do it when
we don't want to be critical thinkers ourselves? Social media
is, by its nature, manipulative in what is in circulation. We
should both challenge, and be open to challenge, some of the
things circulated on social media.
Frank P: Critical thinking is an absolute given, but in a
country where so many have little or no education, the
problem of fake news and populist rhetoric (which is mostly
opinion based), is widely accepted at face value. This is a real
problem and political rhetoric plays into this vacuum with
undue influence. Somehow there needs to be some legal
recourse available too.
Aniedi O: The New York Supreme Court suspended Rudy Giuliani's
law license for misinformation. How do we hold people to
account who intentionally misinform Africans?
Questions on taking a stance on disinformation
1. Would you be prepared to take a public stand against
misinformation and/or disinformation?
2. Do you think there should be legislation and an
international court where matters of deliberate
misinformation and/or disinformation can be challenged?
3. If yes, would you write a letter to a newspaper or phone
into a talk radio show and state your case?
4. If not, how would you justify your thinking or stance to
not go public with your thoughts? Write a letter to yourself
setting out your reasoning, and mail it to yourself.
5. If you said you would write a letter or phone in to a talk
show, draft your letter or speech setting out your reasoning
and mail it to yourself.
Chapter Summary
Knowing what questions to ask is a matter of practice. Do not
underestimate your depth of knowledge about the problems
and issues that concern you and trouble you.
To break through your standard thought patterns and access
a questioning mindset, have a close look at what you have
been conditioned to think and believe. Question what you
have been taught. This is not to say that what you already
know is wrong, but accept the possibility that it might be
wrong.
Socratic questioning is a process that may or may not end
with a satisfactory conclusion. This is because you engage in
questioning to deepen your understanding instead of trying
to prove a point.
Clear-headed thinking is difficult in the 21st century. Warren
Berger suggests a set of questions you should ask
consistently and consider the answers thoughtfully before
passing judgment.
Five all-purpose questions for better thinking
How can I see this with fresh eyes?
What might I be assuming?
Am I rushing to judgment?
What am I missing?
What matters most?
What are the decisions in your life that you make with little
thought? With the advent of “fake news” and
disinformation, you need to start assessing things more
critically. And in doing so, watch for the distorting influence
from your cognitive biases.
5
THE SKILLFUL ART OF ASKING THE RIGHT
QUESTIONS
“The only thing I know is that I know nothing.” This is
often quoted as an indication of Socrates’ humility.
Not many people have the courage to say it about
themselves and truly mean it. The question is, did Socrates
really know nothing, or was it a persona that he used as a
mask to question the nobility, the priests, the soldiers, the
random people he spoke to in the marketplace?
Whatever the answer is to that question, the one thing we
can say with certainty is that Socrates knew how to ask the
right questions. What we don’t know is how he came to know
which questions to ask. Did he, for example, lie in bed
preparing lists of questions for different people? Or did he
just get better and better as he asked more and more
questions, becoming more proficient through constant
practice?
In all probability, it was a combination of preparation and
practice, driven by an inquisitive intellect and a search for
meaning and knowledge. Something else we cannot know is
what Socrates was like as a child. Can we assume he asked as
many questions as any three- or four-year-old child from the
20th or 21st century?
There is little doubt that knowing what questions to ask is
both an art and a skill. As a child, asking questions is a way
of filling in the vast gaps between the world experienced and
the world understood. The cup bumped off the table falls to
the floor. The ball thrown up in the air falls to the ground. So
why do birds and airplanes stay up in the air? Why don’t they
fall?
“Studies have shown that the four-year-old child may ask
anywhere from one hundred to three hundred questions a
day.” 1 Asking questions at this age requires two things. First
the child needs “enough awareness to know that one does
not know,” and second, they need the initiative to start to
remedy the state of unknowing.
Which raises the question: are you still asking questions? Not
the “did you finish this,” or “where did you leave that,” type
of questions, rather the “why do I think that,” or “why do I
react in that way,” type of questions. If you’re not still
questioning your life, your choices, your motives, then when
did you stop, and why did you stop?
These are not idle questions. If you want to practice Socratic
questioning, you need all the questioning practice you can
get. But if you have stopped asking questions, you need to
establish why, and deal with that blockage.
The Enemies Of Questioning
In The Book of Beautiful Questions, Warren Berger discusses
the “five enemies of questioning.” 2 He lists these as: Fear,
Knowledge, Bias, Hubris, and Time.
Fear
Children begin as fearless questioners; then they learn that
asking questions carries risks. These risks come from both
adults and other children. As children grow, the risks of
asking questions include being embarrassed for what they
should know but perhaps don’t know, and ridiculed for
asking a question that is off-topic or has an obvious answer.
Knowledge
Knowledge is a potential enemy in two ways. First, because
you believe you know so much, you stop learning and stop
updating your knowledge. Second, and this is perhaps even
more damaging, we as a society, and you as an individual,
don’t know as much as we think we do. This is particularly
dangerous when society, on a global scale, is undergoing rapid
change. This is something Socrates discovered in ancient
Athens by questioning those citizens recognized as being
wise and knowledgeable. They didn’t know as much as they
claimed to know or were thought to know.
Bias
Berger says the next two enemies of questioning, bias and
hubris, are related to each other. And they may be, but we
will discuss them separately. Bias is a recurring theme in this
book and has already come up as an issue in every chapter,
particularly in the discussion on universal intellectual
standards. Berger makes the interesting statement that
“some of them are hardwired in us; others may be based on
our own limited experiences.” The result of a bias, whatever
its source, is that you are not as open to questions that
challenge that view. Biases and assumptions go hand in
hand, so question all your assumptions and check them for
bias.
Hubris
This is an unusual word, one which means excessive pride or
arrogance, with the emphasis on excessive being the
important factor. It is excessive pride in your ability or your
knowledge that leads you to believe that your views are
correct and that they are not biases. Berger makes the
important point that if you lack humility, you’ll probably do
less questioning, that you will say things like: “If I don’t
know it already, it can’t be that important,” or “I don’t have
to sit through intelligence briefings because I’m a really
smart guy.”
Time
Time, or the lack of it, is often used as an excuse for not
questioning. And it starts in school, or perhaps even earlier.
The parent who does not answer a question because they are
too busy, and the teacher who does not have time to fully
answer questions in class, both play into the statement that
time is money.
Four-year-old children are not aware of the five enemies of
questioning. And they have an insatiable curiosity. According
to child psychologist Paul Harris, children discover that they
can easily get the information they want from parents or
older family members in their first few years. 3 They learn
that using certain combinations of words and vocal
inflection often results in curiosity being satisfied.
So the four-year-old child flows with their curiosity and
constantly asks questions, always based on “why?” No
matter what your response, the child will ask “why” again
and again. Why? Why? Why? Until, in desperation, or
frustration, you say …? What do you say? “Because I say so!”
Or is it “because pigs can’t fly in the middle of July”? To
which a child, who still has no concept of sarcasm, says,
“Don’t be silly. Of course pigs can’t fly!”
The parental response to the constant questioning, or the
response of any adult to the questioning, should be to
answer the questions, gently and patiently. Of course, it is
too much to expect that an older child will respond with an
answer–they are more likely to respond in mocking tones
and humiliate the questioning child. Sometimes the adult
responds rudely, abruptly, or angrily–more like a child than
an adult. Perhaps answers like that play a role in
discouraging questions when that child is an adult.
Curiosity is an act of wondering, a speculative look at the
environment. Before inhibitions, biases, received wisdom,
and their own acquired knowledge about how the world
works become a burden for children, they have minds that
are open and expansive. The answers to their questions help
them to come to terms with their place in the world. If they
can formulate the question, they deserve the answer.
In her 1928 book, Coming of Age in Samoa, Margaret Mead
wrote “Children must be taught how to think, not what to
think.” 4 This makes absolute sense, so much so, that we
would be justified in expecting every school to teach children
how to think. Of course, there is also the vital counterpart of
thinking, which is questioning.
Richard Feynman, the physicist and Nobel prize laureate, is
reputed to have said, “Don't just teach your students to read.
Teach them to question what they read, what they study.
Teach them to doubt. Teach them to think.” However, this is
a disputed quote, and nobody has cited a specific source for
where or when he said it. Another disputed Feynman quote is
this: “I would rather have questions that can't be answered
than answers that can't be questioned." 5 This quote is also
attributed to George Carlin. So take your pick, Carlin or
Feynman. Both are worthy people to quote, and both quotes
are relevant in any discussion about children, questions, and
thinking.
Key in any discussion about questioning, especially about
children asking questions, are when and why we stop asking
questions. This applies whether we are children or adults.
Sir John Abbott, the 3rd Prime Minister of Canada, is reputed
to have told a story about a very wise man who was asked,
‘How do you know so much about everything?' The wise man
replied, 'By never being afraid or ashamed to ask questions
about anything of which I was ignorant.' This story brings us
closer to the real reasons for not asking questions–it is fear
and perhaps shame at being seen to be ignorant.
Pop psychology articles talk about inhibitions and fear of
being embarrassed as reasons for not asking questions. But
that is not the complete picture, and the underlying reasons
go beyond that. The reasons for avoiding asking questions
may be seated in our education system. Not all education
systems worldwide are run in the same way, so what applies
in, for example, France, may not apply in the USA, which in
turn is different from South Africa.
At secondary schools in colonial Africa, teachers expect you
to accept whatever information they pass on, complete with
its biases and prejudices. It is rare for a student to question
anything. This is unfortunate because to arrive at a true
understanding, you need to ask questions. An ability to think
means asking questions, as you cannot arrive at true
understanding by answers alone. An opinion piece by author
Joe David in the Observer in 2018 says, “It is very rare to find
a student with a fresh point of view, derived from clear
thinking, secured in place by sound knowledge.” 6 And that
is a damning indictment of our schooling systems.
Questions challenge authority, and authority, like
bureaucracy and the Official Mind, does not like being
challenged. At some point we learn that any questioning of
authority can get us into trouble. And another door closes on
the curious and questioning mind.
If nobody was allowed or encouraged to ask questions we
would not have the scientific or technological advances we
have seen in the last half a century. This is neatly summed
up by a statement in the Critical Thinking Handbook: “Had no
questions been asked by those who laid the foundation for a
field – for example, Physics or Biology – the field would
never have been developed in the first place.” 7 Claim your
right to question, but be careful not to cause offence
through insensitive comments or an abrupt, brusque
manner.
How to Ask Questions
The way you ask questions will define the sort of results
you get. Keep in mind that you want cooperation from your
interlocutor, and
you want them to consider the implications of the questions,
not the manner of delivery or tone of voice you use. In other
words, avoid being condescending, judgmental, or
aggressive in tone or delivery. Frame your questions in a way
that comes from a place of humility, not superiority.
Skilled questioning requires patience and practice. Don’t be
surprised if your first attempts fall short of the mark. It
should not make a difference what your line of questioning
is, what the subject matter is, or the avenue of approach, as
long as the questions are framed and asked with respect for
your interlocutor. The avenues of thought and the type of
questions you ask should not be limited in any way.
A line of questioning can serve any one or more of these
purposes:
Investigate the truth of a theory or opinion.
Elicit and develop an idea or thought that is not yet
developed or actualized.
Lead the interlocutor to a conclusion that is either
logical or valid, foreseen or unforeseen by the
questioner.
Elicit a statement or conclusion that can be further
examined for truth or falsity.
For every line of inquiry, there are a number of questions
you should first ask yourself. See if any of the following
seven examples apply to you:
1. To analyze your goals and purposes.
What am I really after in this situation?
Are my goals reasonable?
Am I acting in good faith?
Do I have an agenda, whether hidden or obvious?
2. To question definitions relating to problems and issues.
Is this a reasonable way to put the question at issue?
Am I loading the question by the way I put it? Am I
biasing it?
Am I framing the question in a self-serving way?
Am I asking a question to pursue a selfish interest?
3. To assess the information base of your thinking.
On what information do I base my thinking?
Is this a legitimate source of information?
Is there another source of information to consider?
Am I considering all the relevant information or only
the ones that support my view?
Am I distorting the value of the relevant information
in a self-serving way?
Am I refusing to check the accuracy of some
information to avoid changing my view?
4. To rethink a conclusion or interpretation.
Am I coming to an illogical conclusion because it is in my
interest?
Am I refusing to look at this situation more logically?
If I do look at the situation more logically, will I have
to behave differently?
5. To identify and check your assumptions.
What am I assuming or taking for granted?
Are those assumptions reasonable?
Are they in any way self-serving or one-sided?
Do I make egocentric assumptions in my thinking,
such as, “Everyone always blames me?”
Are my expectations of others reasonable or am I
using double standards?
6. To analyze your own point of view.
Am I refusing to consider another point of view to
maintain my self-serving view?
Am I taking into account the viewpoint of others, or
am I just going through the motions of listening
without actually paying attention?
Am I honestly trying to understand the situation from
another perspective, or am I merely trying to win an
argument?
7. To follow through on the implications of your thinking.
Do I genuinely think through the implications or
possible consequences, or would I rather not consider
them?
Do I avoid thinking through implications because I
don't want to know what they are?
Do I avoid thinking through implications because if I
know them, I will have to change my thinking?
Listening
Listening is one of the most important aspects of
questioning and is often overlooked. It is easy to overlook
because it has nothing to do with analysis or the framing of
questions. But without solid listening skills, any analysis is
likely to be limited to what you want to hear, as opposed to
what is being said.
The art of listening is a crucial skill for effective
communication. “To listen properly means paying careful
attention to what the other person is saying, absorbing the
information, judging it and acting on it.” 8 However, the art
of listening is not a natural skill, so it must be learned, and
this requires effort and practice. During a conversation, the
receiver needs to listen attentively, critically and
appreciatively.
Attentive listening requires you to listen for the speaker’s
central idea. You want to know the purpose or goal of the
conversation as soon as possible. Once you know what the
conversation is about you will know if the subject matter is
relevant and important to you personally. In this way you
resist distractions and concentrate on the message.
Critical listening means you listen to analyze, evaluate, and
judge the speaker’s intention. Is it to motivate, to persuade,
to confess? Is the tone used to convey the message polite,
insincere, rational, impatient, frustrated? Check your
understanding of what the speaker said by asking for
additional information or asking the speaker to rephrase
what they said.
Appreciative listening means you listen first and then react.
This is the key to effective listening. Try to see the idea or
concept from the speaker’s point of view. Do not get excited
about any point made before you are sure that you fully
understand it. Avoid impatience and allow the speaker to
complete the message. Watch out for biases relating to
differences of perception, personality, status or culture.
Good and careful listening skills go a long way to making
sure that you hear what is being said, and what is not being
said.
The Power of Questions
To practice Socratic questioning you will need at least one
person who is willing to work with you. Ideally, a small
group is better, as you can all learn from each other’s
mistakes. There are very few rules for framing and asking
questions, but following them will help create a meaningful
discussion. First, remember that Socratic questioning is a
discussion, not an argument or debate. Paul and Elder, in
The Thinker's Guide to the Art of Socratic Questioning, 9 list a few
short rules for the questioning process.
It is led by one person who only asks questions.
The questioner steers or guides the discussion based
on the questions asked.
The questions should be systematic and disciplined.
The purpose of the discussion is to help everyone
examine the complexities of the topic and dig beyond
surface issues.
The power of questioning comes from encouraging us to
think more deeply about the things we think we know and to
discover what we don’t know. Questioning has the power to:
Direct your train of thought and maybe even derail it.
Drive you to learn, to create, to experiment, and to
improve.
Discover something new or rediscover a long-
forgotten passion.
Put things together in new or different ways.
Remember things.
Resolve issues.
Understand people better.
The keys to good questions, and perhaps to asking the
“right” questions, lie partly in listening, which has already
been discussed, and in following a logical sequence to frame
the question. Good questions are also tied up in how you
think about what you know, and realizing that the gap
between what you know and what you understand is where
you find the right questions.
Each of us is different and has individual needs, desires, and
dreams. By using Socratic thinking and questioning, each
individual acquires the logic and critical thinking skills that
are the keys to unlock their own truth.
If you work through the exercises and tasks as you read this
book, two things may happen. First, you will discover the
power of Socratic questioning. Second, you will scratch
beneath the surface more regularly and dig a bit deeper
into some of your treasured beliefs.
Knowing how to ask the right questions and asking those
questions can help you make significant changes in your life.
For example:
Questions lead to discoveries about yourself, about
the functions of science, and about the good and bad
aspects of society.
Questions can alert you to your assumptions and
prejudices, not just the received prejudices, but also
those acquired through experience.
Questions foster creativity in that they have a way of
helping you see the world through new eyes.
Questions can help you to solve problems through
specific and logically constructed approaches to areas
of difficulty in your life.
Some of the benefits of questions are that they can help you:
Connect with others.
Deepen existing relationships.
Become better in your work or business.
Strengthen your leadership skills.
In the realm of critical thinking, three types of simple
questions are often overlooked in favor of more complex
questions. Consider how often you ask the wh- questions,
namely, who, what, where, when, and why. How is not a wh-
word, but it forms an important companion to the five wh-
questions.
Who is questioning, who is affected?
What you ask, and include “what if” questions.
Where do the ideas come from, are they reliable, can
they be verified?
When will this happen? Just do it.
Why? Just step back, pause, and stop knowing it all.
How will you ask the question, will it work?
These types of questions can help you better execute your
plans, focus your actions, and improve your results.
It is important to remember that knowing how to ask the
right questions is deeply rooted in the Socratic way. It
involves all the aspects you have been reading about since
Chapter 1. For example, critical thinking is vital, as is
applying logic to your thinking, observing the universal
intellectual standards, and questioning your mindset to
arrive at the truth.
In its pure form, and executed with the right intent, the
Socratic method has specific positive outcomes. It will help
you discover things about yourself and others that will
benefit you and them.
The Truth About Happiness
In his book, The Philosopher and the Wolf, Mark Rowlands has
a chapter called The Pursuit of Happiness and Rabbits. We
are not concerned with the pursuit of rabbits here, and they
are included in the chapter title because Rowlands speculates
whether the wolf is happy when he is chasing rabbits, even
though he seldom catches any.
In the chapter, Rowlands claims that “According to many
philosophers, happiness is intrinsically valuable.” 10 In other
words, we value happiness for itself, not for any other
reason. This differs from the value we put on things such as
money or medicine, which we value for what they can do for
us, like providing food and shelter, or pain relief.
Rowlands says that since the late 20th century, happiness
has gained a much higher profile than that of a purely
philosophical question; in western culture, it has become big business.
Studies tell us that we are, materially, far richer than our
forebears, but we are no happier than they were. Studies also
tell us, according to Rowlands, that we are happiest when
having sex and unhappiest when talking to the boss. All this
gives rise to the question: Is the whole world obsessed with
happiness, or is it only the western world?
In this section, you are going to be challenged to use critical
thinking and Socratic questioning to reach your own
conclusions about happiness. Based on the information
provided below, analyze the arguments, apply your logic,
challenge the assumptions, and check for bias and
prejudices–yours and in the arguments presented. Decide on
what questions you would ask, and how you would go about
interrogating the concept of happiness.
The World Happiness Report, first published in April 2012,
and published annually since then, is more than a way for
people to see how their country compares against other
nations. It also, according to the original United Nations
resolution, provides information to help guide policymakers
to “pursue the elaboration of additional measures that better
capture the importance of the pursuit of happiness and
wellbeing in development with a view to guiding their public
policies.” 11
You may already have a question or two, such as:
What do you mean by obsessed?
How do you define happiness?
Before we get to those questions, if there is an obsession
with happiness, when did it start? According to historian
Peter N. Stearns, writing in the Harvard Business Review in
2012, the drive to happiness is relatively modern in Western
culture. He says there is general agreement among the
historians working on the subject that, at the level of
rhetoric, at least, “a significant shift occurred in Western
culture around 250 years ago.” 12 Or, to put it in perspective,
during the Age of Enlightenment and shortly before the
American Declaration of Independence.
The Age of Enlightenment brought with it the idea that
happiness was the attainment of a worthy life. This does not
deny or exclude any influence or teaching of a religious
nature, but neatly supplements it in some ways. In itself, the
attainment of a worthy life is not that different from the
Aristotelian belief that happiness is a by-product of a life of
virtue. This ideal is also captured in the American
Declaration of Independence, where the three unalienable
human rights are summed up in the phrase, “Life, Liberty,
and the pursuit of Happiness.”
However, you may ask: what happiness did the founding
fathers of American independence have in mind? And how
does it differ from the concept of 21st-century happiness?
In 2005, Anthony Kennedy, a Justice of the US Supreme
Court, said in a lecture that for the framers of the
Declaration of Independence, “happiness meant that feeling
of self-worth and dignity you acquire by contributing to your
community and to its civic life.” 13 This supports the social
responsibility aspect of a different, and perhaps more
meaningful, definition of happiness.
To dig a bit deeper, let’s look at what was happening during
the Age of Enlightenment. First, there was a general growth
in science. Isaac Newton’s Philosophiae Naturalis Principia
Mathematica (1687) had shown how to synthesize different
theories and laws into a coherent view of a mechanical
universe. Then, philosophers such as
Thomas Hobbes, Rene Descartes, John Locke, and Voltaire,
were wrestling with determinism, rationalism, religion, and
ethics. Groups of thinkers and writers across what is now
western civilization were exchanging knowledge and ideas.
The resulting processes of industrialization and
urbanization, and the advances in education, saw a period in
which society had the leisure and knowledge to consider
seeking happiness in this life, rather than waiting for union
with God in the next.
So, at the end of the 18th century, happiness was to be
found in the self-worth and dignity arising from your
contribution to societal wellbeing. By the end of the 20th
century, happiness had become an egotistical pursuit of self-
gratification. The early days of the 21st century show no
signs of anything changing soon.
Examining this short discourse on happiness and the role
it plays in society raises several questions. Here are a
few questions and reminders to get you started on your
interrogation and analysis. And some words of
encouragement from Immanuel Kant:
“Dare to know! Have courage to use your own reason!”
Five questions to help get you started:
1. Is happiness an inherent feature of the human condition?
If so, does this open new opportunities to understand
important aspects of our social experience?
2. What challenges emerge for you as you reexamine your
relationship with happiness?
3. If there is an obsession with happiness, when, in your
opinion, did it start?
4. Is happiness an inalienable human right?
5. Was there an awareness of happiness before the 18th
century, and is there evidence of it existing as a concept?
As you draw up your own list of possible questions, be aware
of any blind spots you may have to a particular line of
thinking. Be vigilant for bias and prejudice creeping in.
Look for assumptions, and look for the weaknesses that
those assumptions present. For example, are the arguments
presented supported by facts, and can the facts be verified?
Exercises And Tasks
In June 2021, a newsletter from the Nelson Mandela
Foundation said the focus of Nelson Mandela International
Day for 2021 would be on two critical challenges faced by
many countries around the world, including South Africa.*
These challenges are:
1. Food insecurity;
2. What can only be described as cultures of lawlessness.
The facts that inform these challenges are:
Covid-19 has deepened patterns of poverty and
inequality.
The number of people going hungry is growing.
Social cohesion is under severe strain.
Evidence of diminishing respect for 'the rule of law' is
apparent everywhere.
Jakkie Cilliers of the Institute for Security Studies (ISS),
writing for ISS Today, says: “Citizens don’t obey the rules
because the governing African National Congress sets a poor
example.” 14 Several questions present themselves and
press for attention, and new lines of inquiry open up for
examination. Using your critical thinking
skills, analyze and interrogate the extract from the
newsletter, and answer the questions below.
1. What does 'the rule of law' mean for those who are
starving?
2. What does 'the rule of law' mean to someone unable to put
food on the family table due to the failures of societal
systems and structures?
3. What is the basis for social bonding in contexts where
constitutions and laws do not match the lived reality of the
great majority?
4. Is the achievement of food security for all imaginable
without the rule of law?
It is interesting to note how prescient this call to action was.
The call went out in June 2021; by 11 July, a week before
International Mandela Day, South Africa was deep in the grip
of riots and looting–the twin outcomes of poverty and
lawlessness.
Chapter Summary
Part of the art and skill is knowing the enemies of
questioning, which are:
Fear: Children begin as fearless questioners, then learn that
asking questions carries risks, so they stop questioning.
Knowledge: Knowledge becomes an enemy when you believe
you know so much that you stop learning, even though you
don’t know as much as you think you do.
Bias: Biases go hand in hand with assumptions, so question
all your assumptions.
Hubris: Excessive pride in your ability or your knowledge
leads you to believe that your views are correct. If you lack
humility, you’ll do less questioning.
Time: The lack of time is often used as an excuse for not
questioning.
Skilled questioning requires patience and practice.
1. Analyze your goals and purposes.
2. Question definitions relating to problems and issues.
3. Assess the information base of your thinking.
4. Rethink a conclusion or interpretation.
5. Identify and check your assumptions.
6. Analyze your own point of view.
7. Follow through on the implications of your thinking.
The power of questioning comes from encouraging you to
think more deeply about things. By using Socratic thinking
and questioning, each individual acquires the logic and
critical thinking skills that are the keys to unlock their own
truth.
6
GETTING IT RIGHT: POINTS TO REMEMBER
AND APPLY
“True wisdom comes to each of us when we realize how
little we understand about life, ourselves, and the
world around us.” This is credited to Socrates, and it
echoes the sentiment of the opening quote from Chapter 5,
“The only thing I know is that I know nothing.” The quote
reminds us of how limited we become when we think we
know much but do not. Confidence in our understanding and
knowledge limits our growth because it introduces a subtle
bias into our thinking.
To master the art and skill of Socratic questioning, you first
need to master critical thinking – the foundation that
supports penetrating and incisive questioning. Your ultimate
goal with Socratic questioning is to find a framework to
generate sound arguments, valid conclusions, and,
consequently, the answers you are looking for.
Before you can frame any question, you need to gather all
the information presented. To do this, you must hear what is
being said and what is being presented as facts. You also
need to truly listen and hear what is not being said.
Sometimes, you may find that the question you need to ask
lies in the subtext. Besides, it is not possible to formulate
any question without first hearing the argument.
The next step is to understand all the terms used and all the
concepts presented. These are the ideas around which the
argument is built. If anything is unclear or vague, ask
probing yet respectful questions. The goal is not to win the
argument but to find answers that benefit everybody
involved.
In your quest for deeper understanding, start with questions
about the origin of the information presented. In other
words, from where are the facts derived? Is it from personal
experience, or is it received wisdom? Received wisdom, in
the words of Kurt Vonnegut, is “The things other people
have put into my head … are out of proportion with one
another, are out of proportion with life as it really is…” 1
However the information is derived, and wherever it comes
from, question its source. If it cannot be independently
verified, it must remain suspect.
Of course, you should also question any conclusions that are
drawn. Are assumptions or inferences involved in arriving at
the conclusion? Does the logic hold true?
Expect the unexpected. This is a maxim you can apply to
other aspects of your life. And it certainly applies to critical
thinking. As soon as you start breaking arguments into their
component parts, and analyzing them separately, all sorts of
unexpected results may pop up. This is partly a result of
trying to see the premises from all the different points of
view.
To help you see different viewpoints, imagine that you are
doing some scenario planning. The most unlikely of possible
views must be considered even if it is such a remote
possibility as to seem ridiculous. For example, how many
corporate and government organizations factored a
pandemic into their planning 10 years ago? Or five years ago?
Perhaps the resulting turmoil of the Covid-19 years could
have been lessened if a pandemic had been on the radar. With this
in mind, when you do your scenario planning, you need to
turn yourself into a futurist, or a futuristic thinker. Consider
the worst case scenario, and the best case scenario, and
make contingency plans for the unexpected. Then consider
all the possibilities in between and plan for them as well. In
all cases, you have to plan for uncertainty, or at least tolerate
it when it crops up.
Be a good listener. Hear all angles, spoken and unspoken,
some of which you can only pick up from non-verbal
communication. You need to hear it all if you are to actively
participate in the dialogue.
Remember the Universal Intellectual Standards from Chapter
2 of this book? Apply them because they highlight areas that
are open to questioning in your search for truth.
Knowing What To Ignore
To get the best from the Socratic method, start the dialogue
by defining the terms under discussion. Unless both parties
agree on the terms, you will find yourselves talking at cross-
purposes.
It helps to have an idea of what the other person does or
doesn’t know. For example, an animal trainer might know a
bit about quantum mechanics, but you cannot expect them
to have an in-depth understanding of Heisenberg’s
uncertainty principle. Rumi, the 13th-century Persian poet,
scholar, and mystic, put this quite well when he said, “The
art of knowing is knowing what to ignore.”
Fallacies are one of those things that you need to ignore
when working with the Socratic method. However, you need
to recognize them before you can ignore them. When you
examine an argument, or put together your own arguments,
watch out for fallacies.
A common use of “fallacy” is as a synonym for “false,”
“untrue,” or “incorrect.” However, it is also used to describe
faulty reasoning in an argument, in which case it is then a
“logical fallacy.”
The Collins English Dictionary defines a fallacy as “an
incorrect or misleading notion or opinion based on
inaccurate facts or invalid reasoning.” 2 In logic, it has a
different meaning: “an error in reasoning that renders an
argument logically invalid.” 3
In a deductive argument, a fallacy makes the entire
argument invalid. In an inductive argument, a fallacy does
not invalidate the argument, but it weakens it. Below are a
few common fallacies. This is not a complete list of fallacies,
and be aware that some fallacies have more than one name.
Ad Hominem
Ad hominem means “against the man,” and refers to the
practice of using a personal attack instead of using sound
reasoning and rational arguments to refute an argument.
Strawman Argument
The term “man of straw” is an expression used to describe a
person of no substance, someone you cannot rely on. In the
strawman argument, the real issue or position is not
addressed, but a side issue, without substance, is created and
addressed, hoping that this will be seen as refuting the
argument.
Red Herring Fallacy
A “red herring” fallacy is creating a distraction from the
argument, usually by inserting a topic that appears, at first
glance, to be relevant but does not really address the issue. A
red herring fallacy does not clarify anything, rather it
distracts and confuses.
Appeal to Ignorance
An appeal to ignorance is a situation where ignorance is used
as a premise to support an argument. This is almost always a
fallacy. It relies on the ignorance of most people (we all have
areas of which we are totally ignorant) and is a manipulative
tactic that exploits a knowledge gap.
Slippery Slope Fallacy
You may have used this fallacy before or had it used on you,
or both. It’s a case of you saying, “But I can’t cut the lawn
today! If I do, I can’t go to that swimming party, and then
I’ll never get a real girlfriend, and I’ll end up single and
spend the rest of my life living with you and Dad!” This
fallacy starts from an
innocuous premise and works through a list of gradually
worsening scenarios, all of which are pure conjecture.
Circular Argument
A circular argument is an argument that just repeats earlier
assumptions or restates an earlier premise without arriving
at any new conclusion. In effect, the premise is used as a
conclusion, and the conclusion is used as a premise. For
example, if red is red because blue is blue, then blue is blue
because red is red.
Equivocation
Equivocation is when a word or sentence is used to
deliberately mislead or confuse. It does this by sounding like
it’s saying one thing while actually saying something else.
Equivocation is synonymous with ambiguity and elusiveness.
False Dilemma
The false dilemma fallacy, also called the either-or fallacy,
only gives you an either-or choice. As with many situations
in life, there are seldom only two choices. The either-or
fallacy fails because it oversimplifies a range of possible
options. Sometimes, the choice could be a “both and”
situation.
Appeal to Authority
This fallacy usually cites some well-known person as an
expert in a field to support an argument. Rather than
presenting concrete evidence, an appeal to authority presents
what some media personality says as fact, ignoring the
possibility that this person is not actually an authority.
Appeal to Pity
This fallacy relies on emotions such as pity or compassion,
which means the appeal is unlikely to be factual and is
probably irrelevant. An appeal to pity is manipulative and
relies on a feel-good factor rather than logic and facts.
One possible way of diverting or deflecting fallacies is to
learn how to answer a question with a question. For example:
Person A: “How did you get your hair to look like that?”
This could be the opening line of a straw man fallacy, or a
personal attack (ad hominem) on you. In either case, you
don’t want to respond with: “What does my hair look like?”
That would simply encourage your interlocutor. A better
response would be: “Could you please explain the relevance
of that statement to me?” Or “Could you please rephrase
that in the context of your original assumption?” The last
thing you want is to get sucked into an exchange of insults.
It’s a battle nobody wins.
Of course, the questions are unlikely to be about mundane
issues such as hair. For the purposes of this chapter, we will
look at scenarios that may concern you, or impact on you
more directly – issues like how a viral pandemic, or climate
change, might affect your decision-making.
Two Frameworks To Guide Questioning
Critical thinking is easier when you have a framework to
guide your questioning processes. Frameworks help to
remind you of sequences and relationships between questions.
Bloom’s Taxonomy
Bloom’s taxonomy is usually depicted as a pyramid, and it
offers a practical path through the critical thinking process.
As with any logical process, whether technical or scientific,
you need to work through the steps, one by one, and each
one is potentially challenging. Skipping a step will mean an
incomplete or inaccurate result.
Figure 2: Bloom’s Taxonomy
The steps below are based on “Critical Thinking In a Nutshell,”
also published by Thinknetic. 4
1. Remember: Step one is to remember relevant details,
including facts, terms, and concepts you are familiar with,
and sources of information, such as books or websites.
2. Understand: Once you have the relevant materials, study
them until you feel you have a full understanding of the issue
at hand. Do not skip this step or skimp on engaging with it.
This is a vital step for everybody, from expert to novice. Do
not move on until you know you can explain all the facts and
concepts to a stranger. You cannot apply the information if
you don’t understand it.
3. Apply: Once you understand the problem, ask yourself:
How does all the information and knowledge apply to
the question at hand?
What is the most valuable information, and what is
the least valuable for solving this problem?
Do I have a complete set of information, and do I fully
understand the problem? Is there anything I am
missing that could help me understand the problem
better?
4. Analyze: What are the major elements of the problem you
face? Break the problem down into its component parts and
define each part and the role it plays. Once you have done
this, examine the links between components and establish
how one influences another. If none of this makes much
sense, you may not have all the information you need and
may have to go back to Step 2 or Step 3.
5. Evaluate: At this point, you might think all you need to do
is put the finishing touches to your analysis and you’ll be
done. But, although you’ve worked hard to get here, you now
need to look at it critically and subject it to rigorous
criticism. The idea is to find any flaws in your analysis, even
seemingly insignificant ones.
Evaluate your analysis based on two criteria:
Does it make sense internally? Are the definitions
precise, and is the analysis based on verifiable
information?
Does it make sense externally? Are there sources of
information outside of your analysis that could bring
key claims into question? Is there information you
examined but did not take into account?
If you find any flaws, this is the best time to fix them.
6. Create: This is the final stage where you take all of the
elements you worked through and combine them into a
cohesive plan. Make sure that your conclusions stand up to
scrutiny and that they are valid and practical.
You may need to adjust some elements of the plans. This is
not unusual and does not represent a failure on your part. All
plans should be seen as “works in progress,” and
adjustments are sometimes necessary as circumstances
change.
The Paul-Elder Framework
The second framework you will learn is the Paul-Elder
framework. You are already familiar with aspects of this
method, as it is covered in earlier chapters. Paul and Elder
use three elements, or sections, in their framework. They
are:
Intellectual standards
Reasoning, or the elements of thought
Intellectual traits
Intellectual standards
These are the universal intellectual standards we discussed
in Chapter 2 and repeated earlier in this chapter. They
inform, and are applied to, the elements of thought.
Reasoning: the elements of thought
When applying your reasoning powers, remember that all
reasoning:
1. has a purpose;
2. is an attempt to work something out, to settle a question,
or solve a problem;
3. is based on assumptions;
4. is done from some point of view;
5. is based on data, information, and evidence;
6. is expressed through and shaped by concepts and ideas;
7. contains inferences or interpretations by which we draw
conclusions and give meaning to data;
8. leads somewhere or has implications and consequences.
Intellectual traits
The ultimate aim of this model is to develop the intellectual
traits so vital to critical thinking.
This framework is best summarized in the Paul-Elder
framework graphic below. 5
Figure 3: The Paul-Elder framework
Questions To Ask
The Socratic method relies on various ‘question types’ to
generate the most complete and correct information for
exploring issues, ideas, emotions, and thoughts.
When the Intellectual Standards are applied to the Elements
of Thought to develop the Intellectual Traits, you need a
basket of questions you can ask. For some guidance on these
questions, refer to Table 5 below.
Table 5: Questions for the intellectual standards
To get really good at Socratic questioning, you need to
practice, practice, and then practice some more. Given that it
is an art and a skill, you will only get better with more effort
and practice.
Below are a few scenarios for you to work through as
exercises. As with every aspect of critical thinking, you must
apply your mind to these and work through them in a way
that provides you with usable results. We will do the first two
together, using the Paul-Elder method for the first one and
Bloom’s taxonomy for the second one.
Scenario 1
Jonathan runs a small business supplying and fitting
aluminum and glass windows and doors. Business is slow,
and he needs to find ways to increase sales or reduce costs.
He is also considering supplying wrought-iron garden
furniture to customers of the plant nursery run by his wife.
Perhaps it is time to close or sell his business and become
more involved in the cottage industry his wife has built up.
His worry for the aluminum and glass business is that it will
not recover in the face of Covid-19 business closures and job
losses. His other concern is that the plant nursery will not
support them both, even with the garden furniture side of it.
His big fear here is the impact of the climate crisis on the
ability of his wife to protect seedlings from heat and/or
drought and/or localized flooding.
Help Jonathan work out his issues.
First, what is his purpose? From what we know, he has two
issues, and may be conflating them. We need to look at the
future of his window and door business as a separate
exercise from the nursery.
What assumptions does he make in his deliberations?
Jonathan thinks the viral pandemic will have a negative
effect on the economy, that economic recovery is years away
if it ever happens. But what assumption does he make to
arrive at this conclusion?
Perhaps he needs to look at this from a different perspective.
A different point of view could be to see this as a call to
increase his marketing, run specials, or run a competition in
which the 100th order gets a 50% discount on materials.
On what data is his gloomy economic outlook based? If it is
based on information and evidence, what are the sources,
and are they credible? Are there commentators who believe
an economic boom is just around the corner?
There must be a source that helps shape the concepts and
ideas at the root of his worry. Is he a member of a business
group or a group on social media where gloom is the default
position? If so, he is in a self-created echo chamber where all
the information supports his view. It sounds like a change is
due.
What are the inferences or interpretations used by Jonathan
to draw the conclusions he came to? Having a negative view
is not a bad thing in itself, as it lays out the worst case
scenario. Then, if you cannot accept the consequences of the
worst case scenario, you need to look for a way to avoid it.
Jonathan’s other problem is related to the impact of the
climate crisis on his wife’s business. What assumptions does
he make about the nursery? Are they valid? Are there any
aspects that can be mitigated by a different approach, such
as using drip irrigation and/or shade cloth?
In all these steps, apply universal intellectual standards. Ask
the questions to establish aspects such as clarity, relevance,
fairness, and depth.
Now, using the same process that you helped Jonathan work
through for his business, ask questions about his
assumptions and why his wife is not involved in making
decisions about the nursery. After all, it is her business. What
assumptions is Jonathan making about the potential impact
of climate change? He should consider ways of mitigating
climate change, such as rainwater tanks, drip irrigation, and
mulching heavily to retain water in the soil and protect the
soil from being baked.
Scenario 2
For this exercise, we will use Bloom’s taxonomy: Remember,
Understand, Apply, Analyze, Evaluate, Create.
Belinda and Arthur have two young teenagers and intend to
send both children to college. However, Arthur has lost his
job as a logistics manager as a result of a pandemic-affected
economy. Analyzing the family’s projected income and
projected household expenses, they realize that there will
not be much left to put aside for college fees, or at least not
the colleges they had in mind. If Arthur doesn’t get another
job fairly soon, they are going to have to revise their plans.
Help Belinda and Arthur find possible solutions to their
dilemma.
Remember: In this case, remember the original plan. Dig up
old planning sheets, budget notes, records of decisions
made, and how they were to be implemented. If Belinda and
Arthur do not have written records, they must start by
making notes of what they can remember.
Understand: The next step is to study and discuss the notes
until they have a full understanding of the issue. How big is
the gap between what they have and what they still need?
Are there other sources of income to bridge the gap? Engage
with the children and tell them what the situation is. They
have younger and fresher eyes, and they might see
something Belinda and Arthur missed.
Apply: Now, take all the information and apply it to the
problem. Do Belinda and Arthur have a complete set of
information? Of that information, what is the most valuable
and what is the least valuable for reaching a viable
conclusion? Is there anything else to consider?
Analyze: Break the problem down into its component parts
and define each part and the role it plays. Work with the
feedback from the children and examine the links between
the parts and how one influences another.
Evaluate: Collate and organize the various inputs and look at
the result critically. Subject it to a searching examination. Does it
make sense internally? Are the definitions precise and is the
information verifiable? Does it make sense externally? Is
there information you did not take into account? Fix any
flaws you find.
Create: Combine all of the elements you worked with into a
cohesive plan. Make sure that your conclusions are valid and
practical.
Nobody said Socratic questioning and critical thinking would
be easy. That’s one of the reasons these skills are so rare.
Under Exercises and Tasks at the end of this chapter there
are a few more generic scenarios to help you polish your
skills. Apply one of the two frameworks illustrated above to
each scenario. Have fun with your creativity and curiosity.
Templates And Question Guides
Work through these sample question guides and templates
for practice in critical thinking. Use them for training and
work settings, or to help you analyze and work through
issues you are faced with. The first template is for an
analysis of an article, film, or problem. The second template
is specifically for problem-solving.
Template 1: Analysis of article/film/issue 6
1. The main purpose of this article/film/issue is
___________________.
State what you think the primary purpose is. Be as accurate as
possible.
2. The key question that the article/film/issue addresses is
_____________.
Work out the key question/s presented.
3. The most important information in this article/film/issue
is ______________.
Work out the facts, experiences, and data used to support the
conclusions.
4. The main inferences/conclusions in this article/film/issue
are ____________.
Identify the key conclusions presented.
5. The key concepts we need to understand in this
article/film/issue are ________.
By these concepts the author means ____________.
What are the important ideas you need to understand to follow
the reasoning?
6. The assumptions that underpin the thinking are
___________.
Is anything being taken for granted, and might it be questioned?
7a. If we take this line of reasoning seriously, the
implications are _____________.
What are the likely consequences if we take this line of reasoning
seriously?
7b. If we ignore this line of reasoning, the implications are
_______________.
What are the likely consequences if we fail to take this line of
reasoning seriously?
8. The main point of view in this article/film/issue is
_________________________.
What are the creators looking at, and how are they seeing it?
Template 2: Problem-Solving 7
Every problem is easier to solve if you have a structured
approach to problem solving. This template provides a
working guide, no matter what the problem is.
1. What is your purpose? Decide on your goals and needs and
regularly review them. Recognize problems or obstacles to
reaching your goals and satisfying your needs.
2. Specify each problem, being as precise and clear as you
can.
3. Determine the kind of problem facing you. For example, is
it internal or external, a physical obstacle or a mental block,
and what do you have to do to solve it?
4. Differentiate between problems that are within your
control and problems over which you have no control. Focus
on the problems you can potentially solve.
5. Work out what information you need to overcome the
problem. Then make an effort to find it.
6. Analyze the information you collect, interpret it, look at it
from different angles, then draw reasonable inferences.
7. Decide what actions you can take and whether they are
short term or long term options. Recognize your limitations
in terms of resources, such as time and money.
8. Evaluate your options. Consider their advantages and
disadvantages, for both the short term and long term.
9. Plan a strategy. This may involve direct action, or a
carefully considered wait-and-see approach.
10. When you act, be aware of how your action could
negatively affect the outcome of the problem. Be ready to
revise your strategy if necessary. Be prepared to restate the
problem and to repeat your analysis as more information
becomes available.
Exercises and Tasks
1. Think of an important decision you had to make in the
past, such as buying your first house, selling up to move to
another part of the country, or changing career direction.
It doesn’t matter what it was as long as you had options, and
it wasn’t easy to choose between them. Next:
a. Write down all the alternatives you had to the decision
ultimately made.
b. Write down why you chose one over the other.
c. Were these reasons based on facts you thoroughly
researched or on assumptions?
d. What assumptions informed that decision?
e. How do you know these assumptions to be true? Have you
examined their validity?
f. Do you make many of your decisions based on unfounded
assumptions? What are some examples?
2. Apply what you have learned in this chapter to these
scenarios:
a. You are trying to interpret an angry friend’s needs,
expressed through a rush of emotion and snide comments.
You would like to give your friend some help and support.
How would you go about doing this, without alienating your
friend through asking probing questions?
b. You are a manager and need to settle a dispute between
members of your staff. What was a healthy competitive
environment in the sales team has become an area of spiteful
and bitter contention. One high-performing individual
seems to be the target of the spitefulness of other staff. You
need to summarize the facts, assess the alternatives and be
fair to all sides of the dispute. How do you approach this
while remaining as objective as possible?
c. You are the first to arrive at the scene of an accident.
Describe how you will analyze the situation, evaluate
priorities, and decide what actions to take and in what order
you will take them.
3. Brain exercises
Reading is a great source of pleasure, and it is just as
good for increasing your understanding and knowledge.
Even better, reading is one of the best brain exercises,
particularly if you read critically. Think about what you read,
analyze it, question the sources of the information. Are they
credible? Can the results of the reported experiment be
replicated? Is it consistent with earlier results?
The following four brain exercises have been around for
decades and first crossed the author’s desk as an email about
25 years ago. They are all on the internet, in different
formats, on different websites, so it is almost impossible to
know where they originate.
a. A traveler arrives in a small town that he has never visited
before. He knows nothing about the town or its inhabitants,
but he needs a haircut and shave. There are only two barber
shops in town, both on the main road. The man studies each
of them with care. One shop is neat and tidy. The barber is
sweeping the floor while waiting for his next customer. The
other barber's shop is untidy. Everything looks a bit run
down. The barber, reclining in a chair while waiting for his
next customer, is scruffy, with untidy hair and beard. Both
shops charge the same for a haircut and shave. After careful
consideration, the traveler decides on the scruffy barber for
his haircut. Why?
b. A convicted murderer is given a choice between three
rooms. The first room is full of raging fires, the second room
contains assassins with loaded guns, and the third is full of
lions that haven't eaten in 3 years. Which room is the safest?
c. A woman shoots her husband. Then she holds him
underwater for over 5 minutes. Finally, she hangs him. Thirty
minutes later, they both go out and enjoy a wonderful meal
together. How can this be?
d. This is an unusual paragraph. I'm curious as to just how
quickly you can find out what is so unusual about it. It looks
so ordinary and plain that you would think nothing was
wrong with it. In fact, nothing is wrong with it! It is highly
unusual though. Study it and think about it, but you still may
not find anything odd. But if you work at it a bit, you might
find out.
The answers to these four brain exercises are all available on
the internet, but you can work them out for yourself with a
little critical thinking and some careful analysis.
Chapter Summary
To master the art and skill of Socratic questioning, you first
need to master critical thinking. Your ultimate aim with
Socratic questioning is to find a framework to use to
generate sound arguments, valid conclusions and,
consequently, the answers you are looking for.
Strong critical thinking requires having a command of the
intellectual standards. They highlight areas that are open to
questioning in your search for truth.
Critical thinking is easier when you have a framework to
remind you of relationships between questions. The two that
were discussed are Bloom’s taxonomy and the Paul-Elder
framework.
The Socratic method relies on various ‘question types’ to
generate the most complete and correct information for
exploring issues, ideas, emotions, and thoughts.
To get really good at Socratic questioning you need to
practice, practice, and then practice some more. Given that it
is an art and a skill, you will only get better the more effort
and practice you put in.
AFTERWORD
In The Philosopher and the Wolf, Mark Rowlands writes about
what philosophers call an epistemic duty, “the duty to
subject one’s beliefs to the appropriate amount of critical
scrutiny.” 1 This is similar to a Socrates quote you may
remember from chapter 2, “the unexamined life is not worth
living.” The difference is that where Socrates made a
statement to promote questioning, Rowlands calls it a duty.
So, whether you are an ancient or a modern philosopher, or
not a philosopher at all, you owe it to yourself to question
your beliefs and examine their origins.
Knowledge of Socratic reasoning and questioning will help
you understand and apply critical thinking techniques.
Once you understand the Socratic method and critical
thinking, apply your logic to generate sound conclusions
from well-considered and carefully analyzed premises. The
result will be decisions that stand up to scrutiny.
The role of critical thinking is to help you think clearly
and rationally and to understand the logical connections
between ideas and concepts. Central to critical thinking is a
well-developed ability to reason.
When you work with the elements of critical thinking, you
need specific skills such as observation, reflection,
interpretation, problem solving, analysis, and evaluation.
Critical thinking requires confidence in your thinking as you
analyze and evaluate the arguments presented. It is the
systematic application of logic supported by the courage to
question deeply, and the humility to admit you may be
wrong. Understanding the logic that informs a critical
mindset helps you to interrogate arguments presented to
you and to build strong arguments of your own.
The Greek philosopher Socrates was known for his search to
gain insights and get to the truth. Through the
establishment of universal definitions and inductive
arguments, the Socratic Method of thinking is credited with
helping found the essence of the scientific method.
Socrates' search was focused on ethics and truth, and
extended to what is good and right. To think and reason in a
Socratic way you need to develop intellectual humility, open-
mindedness, an inquiring mind, and a thirst for truth.
Socratic questioning is a process that may or may not end
with a satisfactory conclusion. This is because you engage in
questioning to deepen your understanding instead of trying
to prove a point. Break through your standard thought
patterns and access a questioning mindset. Have a close look
at what you have been conditioned to think and believe.
There are five questions you should ask consistently and
consider thoughtfully before passing judgment:
How can I see this with fresh eyes?
What might I be assuming?
Am I rushing to judgment?
What am I missing?
What matters most?
With the advent of “fake news” and disinformation, start
assessing things more critically, and watch for the distorting
influence of your cognitive biases.
Learning about the Socratic way of questioning is not
enough. Asking the right questions is both an art and a skill.
The power of questioning comes from how it encourages you
to think more deeply about things. By using Socratic
thinking and questioning, you acquire the logic and critical
thinking skills that are the keys to unlock your own truth.
You need to understand the enemies of questioning, and why
people fail to ask questions. Common reasons are:
Fear: Children begin as fearless questioners, then learn that
asking questions carries risks.
Knowledge: Knowledge becomes an enemy when you believe
you know so much that you stop learning.
Bias: Biases go hand in hand with assumptions, so question
all your assumptions.
Hubris: Excessive pride in your ability or your knowledge
leads you to believe that your views are correct.
Time: The lack of time is often used as an excuse for not
questioning.
Your ultimate aim with Socratic questioning is to find a
framework to help you to generate sound arguments, valid
conclusions and, consequently, the answers you are looking
for. Strong critical thinking requires having a command of
the intellectual standards. They highlight areas that are open
to questioning in your search for truth.
In Socratic questioning and critical thinking you have two
powerful tools. To get really good at using them, practice,
practice, and then practice some more. And the best time to
do it is now, while the knowledge is fresh. So what are you
waiting for? Start using Socratic questioning and critical
thinking today.
REFERENCES
1. What It Means To Be A Critical Thinker In This Day And Age
1 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
2 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
3 The Foundation For Critical Thinking (2019). Critical Thinking: Where To Begin.
Available at: https://www.criticalthinking.org/pages/critical-thinking-where-
to-begin/796 (Accessed: 14th December 2020)
4 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
5 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
6 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
7 Taber, C.S. & Lodge, M. (2006). Motivated skepticism in the evaluation of
political beliefs. American Journal Of Political Science, 50(3), pp. 755-769. doi:
10.1111/j.1540-5907.2006.00214.x
8 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
9 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
10 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
11 Stanovich, K.E., West, R.F., Toplak, M.E. (2013). Myside Bias, Rational
Thinking, and Intelligence. Current Directions in Psychological Science, 22(4) pp.
259–264. doi: 10.1177/0963721413480174
12 Taber, C.S. & Lodge, M. (2006). Motivated skepticism in the evaluation of
political beliefs. American Journal Of Political Science, 50(3), pp. 755-769. doi:
10.1111/j.1540-5907.2006.00214.x
13 Russell, J. A. (2003). Core affect and the psychological construction of emotion.
Psychological Review, 110(1), 145–172. doi: 10.1037/0033-295X.110.1.145
14 Kozlowski, D., Hutchinson, M., Hurley, J., Rowley, J., Sutherland, J. (2017).
The role of emotion in clinical decision making: an integrative literature
review. BMC Medical Education, 17(1), p255. doi: 10.1186/s12909-017-1089-7
15 Taber, C.S. & Lodge, M. (2006). Motivated skepticism in the evaluation of
political beliefs. American Journal Of Political Science, 50(3), pp. 755-769. doi:
10.1111/j.1540-5907.2006.00214.x
16 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
17 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
2. What Keeps Us From Getting To The Truth?
1 Rauscher, F.H., Shaw, G.L. & Ky, K.N. (1993). Music and spatial task
performance. Nature, 365, p611. doi: 10.1038/365611a0
2 Nantais, K. & Schellenberg, G.E. (1999). The Mozart effect: an artifact of
preference. Psychological Science 10(4), pp370-373. doi: 10.1111/1467-9280.00170
3 Pietschnig, J., Voracek, M., & Formann, A. K. (2010). Mozart effect–Shmozart
effect: A meta-analysis. Intelligence, 38(3), pp314–323. doi:
10.1016/j.intell.2010.03.001
4 Taber, C.S. & Lodge, M. (2006). Motivated skepticism in the evaluation of
political beliefs. American Journal Of Political Science, 50(3), pp. 755-769. doi:
10.1111/j.1540-5907.2006.00214.x
5 Stanovich, K.E., West, R.F., Toplak, M.E. (2013). Myside Bias, Rational Thinking,
and Intelligence. Current Directions in Psychological Science, 22(4) pp. 259–264.
doi: 10.1177/0963721413480174
6 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
7 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
8 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
9 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
10 Frank M.C., Vul E., Johnson S.P. (2009). Development of infants' attention to
faces during the first year. Cognition, 110(2), pp160-170. doi:
10.1016/j.cognition.2008.11.010.
11 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
12 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
13 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
14 Bower, G. H., Monteiro, K. P., & Gilligan, S. G. (1978). Emotional mood as a
context for learning and recall. Journal of Verbal Learning & Verbal Behavior, 17(5),
pp573–585. doi: 10.1016/S0022-5371(78)90348-1.
15 Bower, G. H. (1981). Mood and memory. American Psychologist, 36(2), pp129–
148. doi: 10.1037/0003-066X.36.2.129
16 Bower, G. H., Monteiro, K. P., & Gilligan, S. G. (1978). Emotional mood as a
context for learning and recall. Journal of Verbal Learning & Verbal Behavior, 17(5),
pp573–585. doi: 10.1016/S0022-5371(78)90348-1.
17 Bower, G. H. (1981). Mood and memory. American Psychologist, 36(2), pp129–
148. doi: 10.1037/0003-066X.36.2.129
18 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
19 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
20 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
21 Heck, P.R., Simons, D.J., Chabris, C.F. (2018) 65% of Americans believe they
are above average in intelligence: Results of two nationally representative
surveys. PLoSONE, 13(7), e0200103. doi: 10.1371/journal.pone.0200103
22 Heck, P.R., Simons, D.J., Chabris, C.F. (2018) 65% of Americans believe they
are above average in intelligence: Results of two nationally representative
surveys. PLoSONE, 13(7), e0200103. doi: 10.1371/journal.pone.0200103
23 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
24 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
25 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
26 Russell, J.A. (2003) Core affect and the psychological construction of emotion.
Psychological Review, 110(1), pp145-172 doi: 10.1037/0033-295x.110.1.145
27 Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning:
The conjunction fallacy in probability judgment. Psychological Review, 90, 293-
315. doi:10.1037/0033-295X.90.4.293
28 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
29 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
30 Yap, A. (2013) Ad Hominem Fallacies, Bias, and Testimony. Argumentation,
27(2), pp97-109. doi: 10.1007/s10503-011-9260-5
31 Walton, D.N. (1987) The ad Hominem argument as an informal fallacy.
Argumentation, 1, pp317–331. doi: 10.1007/BF00136781
32 Walton, D. (1999) Rethinking the Fallacy of Hasty Generalization.
Argumentation, 13, pp161–182. doi: 10.1023/A:1026497207240
33 Law, S (2006) Thinking tools: The bandwagon fallacy. Think, 4(12), pp. 111. doi:
10.1017/S1477175600001792
34 Asch, S. E. (1956). Studies of independence and conformity: I. A minority of
one against a unanimous majority. Psychological Monographs: General and Applied,
70(9), 1–70. doi: 10.1037/h0093718
35 Sternberg, R.J. & Halpern, D.F. (Eds.) (2020) Critical Thinking In Psychology (2nd
Ed.). Cambridge, UK: Cambridge University Press.
36 Rosenkopf, L., Abrahamson, E. (1999). Modeling Reputational and
Informational Influences in Threshold Models of Bandwagon Innovation
Diffusion. Computational & Mathematical Organization Theory, 5, pp361–384 doi:
10.1023/A:1009620618662.
37 Fiol, C.M. & O'Connor, E.J. (2003). Waking up! Mindfulness in the face of
bandwagons. Academy of Management Review, 28, pp 54-70. doi:
10.5465/AMR.2003.8925227.
38 Fiol, C.M. & O'Connor, E.J. (2003). Waking up! Mindfulness in the face of
bandwagons. Academy of Management Review, 28, pp 54-70. doi:
10.5465/AMR.2003.8925227.
39 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
40 Nickerson, R. (1998). Confirmation bias: A ubiquitous phenomenon in many
guises. Review of General Psychology, 2, pp175-220. doi: 10.1037/1089-2680.2.2.175
41 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
42 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
43 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
44 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
45 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
46 Wilson, T. D., Houston, C. E., Etling, K. M., & Brekke, N. (1996). A new look at
anchoring effects: Basic anchoring and its antecedents. Journal of Experimental
Psychology: General, 125, pp387-402. doi: 10.1037/0096-3445.125.4.387
47 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
48 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
49 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
50 Ross, L., Greene, D., House, P. (1977) The “false consensus effect”: An
egocentric bias in social perception and attribution processes. Journal of
Experimental Social Psychology, 13 (3), pp279-301. doi: 10.1016/0022-
1031(77)90049-X.
51 Marks, G., & Miller, N. (1987). Ten years of research on the false-consensus
effect: An empirical and theoretical review. Psychological Bulletin, 102(1), 72–90.
doi: 10.1037/0033-2909.102.1.72
52 Gilovich, T. (1990). Differential construal and the false consensus effect.
Journal of Personality and Social Psychology, 59(4), pp623–634. doi: 10.1037/0022-
3514.59.4.623
53 Nisbett, R. E., & Wilson, T. D. (1977). The halo effect: Evidence for unconscious
alteration of judgments. Journal of Personality and Social Psychology, 35(4), pp250–
256. doi: 10.1037/0022-3514.35.4.250
54 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
55 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
56 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
57 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
58 Baddeley, A (1997). Human Memory: Theory And Practice. (Revised Ed.). Hove,
UK: Psychology Press.
59 Tulving, E. (1983). Elements Of Episodic Memory. New York: Oxford University
Press.
60 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
61 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
62 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
63 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
64 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
65 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
66 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
67 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
68 Fiol, C.M. & O'Connor, E.J. (2003). Waking up! Mindfulness in the face of
bandwagons. Academy of Management Review, 28, pp 54-70. doi:
10.5465/AMR.2003.8925227.
69 Festinger, L. (1957). A Theory Of Cognitive Dissonance. Stanford, CA: Stanford
University Press.
70 Miller, M.K., Clark , J.D., Jehle, A. (2015) Cognitive Dissonance Theory (Festinger).
In: The Blackwell Encyclopaedia Of Sociology.
doi.org/10.1002/9781405165518.wbeosc058.pub2
71 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
72 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
73 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
74 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
75 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
76 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
77 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
78 Little, R.J., D'Agostino, R., Cohen, M.L., Dickersin, K., Emerson, S.S., Farrar,
J.T., Frangakis, C., Hogan, J.W., Molenberghs, G., Murphy, S.A., Neaton, J.D.,
Rotnitzky, A., Scharfstein, D., Shih, W.J., Siegel, J.P., Stern, H. (2012) The
prevention and treatment of missing data in clinical trials. New England Journal Of
Medicine, 367(14), pp1355-60. doi: 10.1056/NEJMsr1203730
79 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
80 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
81 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
82 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
83 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
84 Heck, P.R., Simons, D.J., Chabris, C.F. (2018) 65% of Americans believe they
are above average in intelligence: Results of two nationally representative
surveys. PLoSONE, 13(7), e0200103. doi: 10.1371/journal.pone.0200103
85 Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics
and biases. Science, 185, pp1124-1130. doi: 10.1126/science.185.4157.1124
86 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
87 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
88 Ayton, P., & Fischer, I. (2004) The hot hand fallacy and the gambler’s fallacy:
Two faces of subjective randomness? Memory & Cognition, 32, pp1369–1378. doi:
10.3758/BF03206327
3. Why Having A Scientifically Skeptical Mind Helps You Discover The
Truth
1 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
2 The Editors of Encyclopaedia Britannica (2016). Verifiability Principle.
Encyclopædia Britannica. Available at https://www.britannica.com/topic/
verifiability-principle (Accessed January 15, 2021)
3 American Institute Of Physics (2018). Science Strategies Chart Course for
Detecting Life on Other Worlds https://www.aip.org/fyi/2018/science-strategies-
chart-course-detecting-life-other-worlds (Accessed 1 February 2021)
4 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
5 Ayer, A. J. (1936). Language, Truth, And Logic. London, UK: V. Gollancz.
6 Shankar, S. (2017) Verifiability And Falsifiability As Parameters For Scientific
Methodology. International Journal of Education & Multidisciplinary Studies, 7(2),
pp130-137. doi: 10.21013/jems.v7.n2.p10
7 Shankar, S. (2017) Verifiability And Falsifiability As Parameters For Scientific
Methodology. International Journal of Education & Multidisciplinary Studies, 7(2),
pp130-137. doi: 10.21013/jems.v7.n2.p10
8 Popper, K. (1963) Conjectures And Refutations: The Growth Of Scientific Knowledge.
London, UK: Routledge & Kegan Paul.
9 Rauscher, F.H., Shaw, G.L. & Ky, K.N. (1993). Music and spatial task
performance. Nature, 365, p611. doi: 10.1038/365611a0
10 Nantais, K. & Schellenberg, G.E. (1999). The Mozart effect: an artifact of
preference. Psychological Science 10(4), pp370-373. doi: 10.1111/1467-9280.00170
11 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
12 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
13 Pietschnig, J., Voracek, M., & Formann, A. K. (2010). Mozart effect–Shmozart
effect: A meta-analysis. Intelligence, 38(3), pp314–323. doi:
10.1016/j.intell.2010.03.001
14 Neyman, J. & Pearson, E. S. (1933). The testing of statistical hypotheses in
relation to probabilities a priori. Mathematical Proceedings of the Cambridge
Philosophical Society, 29(4), pp492–510. doi: 10.1017/s030500410001152x
15 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
16 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
17 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
18 Schupbach, J., & Sprenger, J. (2011). The Logic of Explanatory Power.
Philosophy of Science, 78(1), pp105-127. doi:10.1086/658111
19 Arditti, J., Elliott, J., Kitching, I. & Wasserthal, L. (2012). ‘Good Heavens what
insect can suck it’– Charles Darwin, Angraecum sesquipedale and Xanthopan
morganii praedicta. Botanical Journal of the Linnean Society, 169, pp403–432. doi:
10.1111/j.1095-8339.2012.01250.x.
20 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
21 Grafman, J. (2000) Conceptualizing functional neuroplasticity. Journal of
Communication Disorders, 33(4), 345-356, doi: 10.1016/S0021-9924(00)00030-7.
22 Liu, D.W.C. (2012) Science Denial and the Science Classroom. CBE - Life
Sciences Education, 11(2) pp129-134.
23 Sagan C. (1987) The Burden Of Skepticism. Skeptical Inquirer, 12(1) https://
skepticalinquirer.org/1987/10/the-burden-of-skepticism/
24 Dwyer, C. (2017). Critical Thinking: Conceptual Perspectives and Practical
Guidelines. Cambridge: Cambridge University Press. doi:10.1017/9781316537411
25 Sagan C. (1987) The Burden Of Skepticism. Skeptical Inquirer, 12(1) https://
skepticalinquirer.org/1987/10/the-burden-of-skepticism/
26 Truzzi, M. (1987) On Pseudo-Skepticism. Zetetic Scholar, 12/13, pp3-4.
27 Truzzi, M. (1987) On Pseudo-Skepticism. Zetetic Scholar, 12/13, pp3-4.
28 Snelson, J.S. (1992) The Ideological Immune System: Resistance to New Ideas
in Science. Skeptic, 1(4).
29 Snelson, J.S. (1992) The Ideological Immune System: Resistance to New Ideas
in Science. Skeptic, 1(4).
30 Çakici, D., Metacognitive Awareness and Critical Thinking Abilities of Pre-
Service EFL Teachers, Journal of Education and Learning, 7(5) pp116-129. doi:
10.5539/jel.v7n5p116
31 Flavell, J. (1979). Metacognition and Cognitive Monitoring: A New Area of
Cognitive-Developmental Inquiry. American Psychologist, 34, 906-911.
32 Schraw, G. (1998) Promoting general metacognitive awareness. Instructional
Science, 26, pp113–125. doi: 10.1023/A:1003044231033
33 Çakici, D., Metacognitive Awareness and Critical Thinking Abilities of Pre-
Service EFL Teachers, Journal of Education and Learning, 7(5) pp116-129. doi:
10.5539/jel.v7n5p116
34 Paul, R. & Elder, L. (2013) Critical Thinking: Intellectual Standards
Essential to Reasoning Well Within Every Domain of Human Thought, Part Two.
Journal Of Developmental Education, 37(1).
35 Wilterdink, N. (2002). The sociogenesis of postmodernism. European Journal of
Sociology, 43(2), pp190-216.
36 Duignan, B. (2020) Postmodernism. Encyclopedia Britannica, https://www.
britannica.com/topic/postmodernism-philosophy. (Accessed 22 January 2021).
37 Wilterdink, N. (2002). The sociogenesis of postmodernism. European Journal of
Sociology, 43(2), pp190-216.
38 Wilterdink, N. (2002). The sociogenesis of postmodernism. European Journal of
Sociology, 43(2), pp190-216.
39 Duignan, B. (2020) Postmodernism. Encyclopedia Britannica, https://www.
britannica.com/topic/postmodernism-philosophy. (Accessed 22 January 2021).
40 Dennett, D.C. (2013). On Wieseltier V. Pinker in The New Republic: Let's Start
With A Respect For Truth. Edge, https://www.edge.org/conversation/
daniel_c_dennett-dennett-on-wieseltier-v-pinker-in-the-new-republic.
(Accessed 22 January 2021).
41 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
42 Kuhn, T. S. (1970). The structure of scientific revolutions (2nd Ed.). University of
Chicago Press: Chicago.
43 Wilterdink, N. (2002). The sociogenesis of postmodernism. European Journal of
Sociology, 43(2), pp190-216.
44 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
45 Snelson, J.S. (1992) The Ideological Immune System: Resistance to New Ideas
in Science. Skeptic, 1(4).
46 Paul, R. & Elder, L. (2013) Critical Thinking: Intellectual Standards
Essential to Reasoning Well Within Every Domain of Human Thought, Part Two.
Journal Of Developmental Education, 37(1).
4. Why The Media Can Make Or Break Our Thinking
1 Watson, C.A. (2018) Information Literacy in a Fake/False News World: An
Overview of the Characteristics of Fake News and its Historical Development.
International Journal of Legal Information, 46(2), pp. 93-96.
2 McDermott, R. (2019) Psychological Underpinnings of Post-Truth in Political
Beliefs. Political Science & Politics, 52(2), pp218-222.
3 Watson, C.A. (2018) Information Literacy in a Fake/False News World: An
Overview of the Characteristics of Fake News and its Historical Development.
International Journal of Legal Information, 46(2), pp. 93-96.
4 Vosoughi, S., Roy, D., Aral, S. (2018) The spread of true and false news online.
Science, 359, pp1146-1151. doi: 10.1126/science.aap9559
5 Aral, S. & Van Alstyne, M.W. (2011). The Diversity-Bandwidth Tradeoff.
American Journal of Sociology, 117(1), doi: 10.2139/ssrn.958158
6 Vosoughi, S., Roy, D., Aral, S. (2018) The spread of true and false news online.
Science, 359, pp1146-1151. doi: 10.1126/science.aap9559
7 Itti, L. & Baldi, P. (2009). Bayesian surprise attracts human attention, Vision
Research, 49 (10), pp1295-1306. doi: 10.1016/j.visres.2008.09.007.
8 Vuilleumier P. (2005). How brains beware: neural mechanisms of emotional
attention. Trends In Cognitive Science, 9(12), pp585-94. doi:
10.1016/j.tics.2005.10.011
9 Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012).
Misinformation and Its Correction: Continued Influence and Successful
Debiasing. Psychological Science in the Public Interest, 13(3), 106–131. doi:
10.1177/1529100612451018
10 Vosoughi, S., Roy, D., Aral, S. (2018) The spread of true and false news online.
Science, 359, pp1146-1151. doi: 10.1126/science.aap9559
11 McDermott, R. (2019) Psychological Underpinnings of Post-Truth in Political
Beliefs. Political Science & Politics, 52(2), pp218-222.
12 Osborne, C.L. (2018). Programming to Promote Information Literacy in the Era
of Fake News. International Journal of Legal Information, 46(2), pp101-109. doi:
10.1017/jli.2018.21
13 Watson, C.A. (2018) Information Literacy in a Fake/False News World: An
Overview of the Characteristics of Fake News and its Historical Development.
International Journal of Legal Information, 46(2), pp. 93-96.
14 LaGarde, J. & Hudgins, D. (2018) Fact Vs. Fiction: Teaching Critical Thinking Skills
in the Age of Fake News. International Society for Technology in Education.
15 Watson, C.A. (2018) Information Literacy in a Fake/False News World: An
Overview of the Characteristics of Fake News and its Historical Development.
International Journal of Legal Information, 46(2), pp. 93-96.
16 Niedringhaus, K.L (2018). Information Literacy in a Fake/False News World:
Why Does it Matter and How Does it Spread? International Journal of Legal
Information, 46(2), pp97-100. doi:10.1017/jli.2018.26
17 Murch, S.H., Anthony, A., Casson, D.H., Malik, M., Berelowitz, M., Dhillon,
A.P., Thomson, M.A., Valentine, A., Davies, S.E., Walker-Smith, J.A. (2004)
Retraction of an interpretation. Lancet. 363(9411):750. doi: 10.1016/S0140-
6736(04)15715-2. Erratum for: Lancet. 1998 Feb 28;351(9103):637-41.
18 Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012).
Misinformation and Its Correction: Continued Influence and Successful
Debiasing. Psychological Science in the Public Interest, 13(3), 106–131.
doi:10.1177/1529100612451018
19 Niedringhaus, K.L (2018). Information Literacy in a Fake/False News World:
Why Does it Matter and How Does it Spread? International Journal of Legal
Information, 46(2), pp97-100. doi:10.1017/jli.2018.26
20 Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012).
Misinformation and Its Correction: Continued Influence and Successful
Debiasing. Psychological Science in the Public Interest, 13(3), 106–131.
doi:10.1177/1529100612451018
21 Osborne, C.L. (2018). Programming to Promote Information Literacy in the Era
of Fake News. International Journal of Legal Information, 46(2), pp101-109. doi:
10.1017/jli.2018.21
22 McDermott, R. (2019) Psychological Underpinnings of Post-Truth in Political
Beliefs. Political Science & Politics, 52(2), pp218-222.
23 Osborne, C.L. (2018). Programming to Promote Information Literacy in the Era
of Fake News. International Journal of Legal Information, 46(2), pp101-109. doi:
10.1017/jli.2018.21
24 Shearer, E. & Gottfried, J. (2017). News Use Across Social Media Platforms
2017, Pew Research Center. https://www.journalism.org/2017/09/07/news-use-
across-social-media-platforms-2017/
25 Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012).
Misinformation and Its Correction: Continued Influence and Successful
Debiasing. Psychological Science in the Public Interest, 13(3), 106–131.
doi:10.1177/1529100612451018
26 LaGarde, J. & Hudgins, D. (2018) Fact Vs. Fiction: Teaching Critical Thinking Skills
in the Age of Fake News. International Society for Technology in Education.
27 Blakeslee, Sarah (2004) "The CRAAP Test," LOEX Quarterly, 31(3). Available at:
https://commons.emich.edu/loexquarterly/vol31/iss3/4
28 Fielding, J.A. (2019) Rethinking CRAAP: Getting students thinking like fact-
checkers in evaluating web sources. College & Research Libraries News, 80(11),
pp.620-622. doi: 10.5860/crln.80.11.620
29 Wineburg, S. & McGrew, S. (2017) Lateral Reading: Reading Less and Learning
More When Evaluating Digital Information. Stanford History Education Group
Working Paper No. 2017-A1, Available at http://dx.doi.org/10.2139/ssrn.3048994
30 Fielding, J.A. (2019) Rethinking CRAAP: Getting students thinking like fact-
checkers in evaluating web sources. College & Research Libraries News, 80(11),
pp.620-622. doi: 10.5860/crln.80.11.620
31 Wineburg, S. & McGrew, S. (2017) Lateral Reading: Reading Less and Learning
More When Evaluating Digital Information. Stanford History Education Group
Working Paper No. 2017-A1, Available at http://dx.doi.org/10.2139/ssrn.3048994
32 Edelman trust barometer 2021. Available at https://www.edelman.com/sites/
g/files/aatuss191/files/2021-01/2021-edelman-trust-barometer.pdf
33 Niedringhaus, K.L. (2018). Information Literacy in a Fake/False News World:
Why Does it Matter and How Does it Spread? International Journal of Legal
Information, 46(2), pp97-100. doi:10.1017/jli.2018.26
34 Society of Professional Journalists (2014). SPJ Code Of Ethics. https://www.spj.
org/ethicscode.asp [accessed 12 Feb 2021]
35 Osborne, C.L. (2018). Programming to Promote Information Literacy in the Era
of Fake News. International Journal of Legal Information, 46(2), pp101-109. doi:
10.1017/jli.2018.21
5. Everyday Lies And Deception
1 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
2 Vrij, A. (2004). Guidelines to catch a liar. In P. Granhag & L. Strömwall (Eds.),
The Detection of Deception in Forensic Contexts (pp. 287-314). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511490071.013
3 DePaulo, B., & Morris, W. (2004). Discerning lies from truths: Behavioural cues
to deception and the indirect pathway of intuition. In P. Granhag & L. Strömwall
(Eds.), The Detection of Deception in Forensic Contexts (pp. 15-40). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511490071.002
4 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
5 Vrij, A. (2004). Guidelines to catch a liar. In P. Granhag & L. Strömwall (Eds.),
The Detection of Deception in Forensic Contexts (pp. 287-314). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511490071.013
6 DePaulo, B., & Morris, W. (2004). Discerning lies from truths: Behavioural cues
to deception and the indirect pathway of intuition. In P. Granhag & L. Strömwall
(Eds.), The Detection of Deception in Forensic Contexts (pp. 15-40). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511490071.002
7 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
8 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
9 Arciuli, J., Mallard, D., & Villar, G. (2010). “Um, I can tell you're lying”:
Linguistic markers of deception versus truth-telling in speech. Applied
Psycholinguistics, 31(3), pp397-411. doi:10.1017/S0142716410000044
10 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
11 Rockwell, P., Buller, D., & Burgoon, J. (1997). Measurement of deceptive voices:
Comparing acoustic and perceptual data. Applied Psycholinguistics, 18(4), 471-484.
doi:10.1017/S0142716400010948
12 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
13 Rockwell, P., Buller, D., & Burgoon, J. (1997). Measurement of deceptive voices:
Comparing acoustic and perceptual data. Applied Psycholinguistics, 18(4), 471-484.
doi:10.1017/S0142716400010948
14 DePaulo, B., & Morris, W. (2004). Discerning lies from truths: Behavioural
cues to deception and the indirect pathway of intuition. In P. Granhag & L.
Strömwall (Eds.), The Detection of Deception in Forensic Contexts (pp. 15-40).
Cambridge: Cambridge University Press. doi:10.1017/CBO9780511490071.002
15 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
16 Vrij, A. (2004). Guidelines to catch a liar. In P. Granhag & L. Strömwall (Eds.),
The Detection of Deception in Forensic Contexts (pp. 287-314). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511490071.013
17 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
18 Bull, R. (2004). Training to detect deception from behavioural cues: Attempts
and problems. In P. Granhag & L. Strömwall (Eds.), The Detection of Deception in
Forensic Contexts (pp. 251-268). Cambridge: Cambridge University Press.
doi:10.1017/CBO9780511490071.011
19 Knapp, M. (2006). Lying and Deception in Close Relationships. In A. Vangelisti
& D. Perlman (Eds.), The Cambridge Handbook of Personal Relationships (pp. 517-
532). Cambridge: Cambridge University Press.
doi:10.1017/CBO9780511606632.029
20 Knapp, M. (2006). Lying and Deception in Close Relationships. In A. Vangelisti
& D. Perlman (Eds.), The Cambridge Handbook of Personal Relationships (pp. 517-
532). Cambridge: Cambridge University Press.
doi:10.1017/CBO9780511606632.029
21 DePaulo, B., & Morris, W. (2004). Discerning lies from truths: Behavioural
cues to deception and the indirect pathway of intuition. In P. Granhag & L.
Strömwall (Eds.), The Detection of Deception in Forensic Contexts (pp. 15-40).
Cambridge: Cambridge University Press. doi:10.1017/CBO9780511490071.002
22 Levine, T.R. (2014). Truth-default Theory (TDT): A Theory of Human
Deception and Deception Detection. Journal of Language and Social Psychology, 33,
pp378-92. doi: 10.1177/0261927X14535916
23 DePaulo, B., & Morris, W. (2004). Discerning lies from truths: Behavioural
cues to deception and the indirect pathway of intuition. In P. Granhag & L.
Strömwall (Eds.), The Detection of Deception in Forensic Contexts (pp. 15-40).
Cambridge: Cambridge University Press. doi:10.1017/CBO9780511490071.002
24 Vrij, A. (2004). Guidelines to catch a liar. In P. Granhag & L. Strömwall (Eds.),
The Detection of Deception in Forensic Contexts (pp. 287-314). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511490071.013
25 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
26 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
27 DePaulo, B., & Morris, W. (2004). Discerning lies from truths: Behavioural
cues to deception and the indirect pathway of intuition. In P. Granhag & L.
Strömwall (Eds.), The Detection of Deception in Forensic Contexts (pp. 15-40).
Cambridge: Cambridge University Press. doi:10.1017/CBO9780511490071.002
28 Levine, T.R. (2014). Truth-default Theory (TDT): A Theory of Human
Deception and Deception Detection. Journal of Language and Social Psychology, 33,
pp378-92. doi: 10.1177/0261927X14535916
29 Bull, R. (2004). Training to detect deception from behavioural cues: Attempts
and problems. In P. Granhag & L. Strömwall (Eds.), The Detection of Deception in
Forensic Contexts (pp. 251-268). Cambridge: Cambridge University Press.
doi:10.1017/CBO9780511490071.011
30 Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012).
Misinformation and Its Correction: Continued Influence and Successful
Debiasing. Psychological Science in the Public Interest, 13(3), 106–131.
doi:10.1177/1529100612451018
31 Ekman, P. (1992). Telling Lies: Clues To Deceit In The Marketplace, Politics, And
Marriage. New York: W.W. Norton.
32 Levine, T.R. (2014). Truth-default Theory (TDT): A Theory of Human
Deception and Deception Detection. Journal of Language and Social Psychology, 33,
pp378-92. doi: 10.1177/0261927X14535916
33 Levine, T.R. (2014). Truth-default Theory (TDT): A Theory of Human
Deception and Deception Detection. Journal of Language and Social Psychology, 33,
pp378-92. doi: 10.1177/0261927X14535916
34 Serota, K.B., Levine, T. & Boster, F.J. (2010). The Prevalence of Lying in
America: Three Studies of Self-Reported Lies. Human Communication Research, 36,
pp2-25
35 Levine, T.R. (2015). New and Improved Accuracy Findings in Deception
Detection Research. Current Opinion in Psychology, 6, pp1-5 doi:
10.1016/j.copsyc.2015.03.003.
36 Vrij, A. (2004). Guidelines to catch a liar. In P. Granhag & L. Strömwall (Eds.),
The Detection of Deception in Forensic Contexts (pp. 287-314). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511490071.013
37 Clough, J. (2010). Fraud. In Principles of Cybercrime (pp. 183-220). Cambridge:
Cambridge University Press. doi:10.1017/CBO9780511845123.008
38 Hancock, P. (2015). The Psychology of Deception. In Hoax Springs Eternal: The
Psychology of Cognitive Deception (pp. 61-71). Cambridge: Cambridge University
Press.
39 Levine, T.R. (2014). Truth-default Theory (TDT): A Theory of Human
Deception and Deception Detection. Journal of Language and Social Psychology, 33,
pp378-92. doi: 10.1177/0261927X14535916
40 Levine, T.R. (2015). New and Improved Accuracy Findings in Deception
Detection Research. Current Opinion in Psychology, 6, pp1-5 doi:
10.1016/j.copsyc.2015.03.003.
41 Hancock, P. (2015). The Psychology of Deception. In Hoax Springs Eternal: The
Psychology of Cognitive Deception (pp. 61-71). Cambridge: Cambridge University
Press.
42 Federal Trade Commission (2020). How To Avoid A Scam. https://www.
consumer.ftc.gov/articles/how-avoid-scam [Accessed 7 February 2021]
43 Citizens Advice (2019) Check If Something Might Be A Scam. https://www.
citizensadvice.org.uk/consumer/scams/check-if-something-might-be-a-scam/
[Accessed 7 February 2021]
44 NSW Government. Misleading Representations And Deceptive Conduct. https://
www.fairtrading.nsw.gov.au/buying-products-and-services/advertising-and-
pricing/misleading-or-deceptive-conduct [Accessed 13 February 2021]
6. Pseudoscience Versus Science
1 Evon, D. (2015) Natural repellent for spiders? Snopes.com. Available at https://
www.snopes.com/fact-check/walnut-and-
spiders/#:~:text=Lastly,%20the%20idea%20that%20spiders%20are% [Accessed
6 February 2021]
2 Harrington, M. (2020). The Varieties of Scientific Experience. In The Design of
Experiments in Neuroscience (pp. 1-12). Cambridge: Cambridge University Press.
doi:10.1017/9781108592468.002
3 Gauch, H. (2012). Scientific Method in Brief. Cambridge: Cambridge University
Press. doi:10.1017/CBO9781139095082
4 Gauch, H. (2012). Scientific Method in Brief. Cambridge: Cambridge University
Press. doi:10.1017/CBO9781139095082
5 Harrington, M. (2020). The Varieties of Scientific Experience. In The Design of
Experiments in Neuroscience (pp. 1-12). Cambridge: Cambridge University Press.
doi:10.1017/9781108592468.002
6 Gauch, H. (2012). Scientific Method in Brief. Cambridge: Cambridge University
Press. doi:10.1017/CBO9781139095082
7 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
8 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
9 Bensley, D. (2020). Critical Thinking and the Rejection of Unsubstantiated Claims.
In R. Sternberg & D. Halpern (Eds.), Critical Thinking in Psychology (pp. 68-102).
Cambridge: Cambridge University Press. doi:10.1017/9781108684354.005
10 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
11 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
12 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
13 Percy, J., & Pasachoff, J. (2005). Astronomical pseudosciences in North America. In
J. Pasachoff & J. Percy (Eds.), Teaching and Learning Astronomy: Effective
Strategies for Educators Worldwide (pp. 172-176). Cambridge: Cambridge
University Press. doi:10.1017/CBO9780511614880.026
14 Harrington, M. (2020). The Varieties of Scientific Experience. In The Design of
Experiments in Neuroscience (pp. 1-12). Cambridge: Cambridge University Press.
doi:10.1017/9781108592468.002; Gauch, H. (2012). Scientific Method in Brief.
Cambridge: Cambridge University Press. doi:10.1017/CBO9781139095082
15 Kuhn, T. S. (1970). The structure of scientific revolutions (2nd Ed.). University of
Chicago Press: Chicago.
16 Harrington, M. (2020). The Varieties of Scientific Experience. In The Design of
Experiments in Neuroscience (pp. 1-12). Cambridge: Cambridge University Press.
doi:10.1017/9781108592468.002
Gauch, H. (2012). Scientific Method in Brief. Cambridge: Cambridge University
Press. doi:10.1017/CBO9781139095082
17 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
18 Percy, J., & Pasachoff, J. (2005). Astronomical pseudosciences in North America. In
J. Pasachoff & J. Percy (Eds.), Teaching and Learning Astronomy: Effective
Strategies for Educators Worldwide (pp. 172-176). Cambridge: Cambridge
University Press. doi:10.1017/CBO9780511614880.026
19 Bensley, D. (2020). Critical Thinking and the Rejection of Unsubstantiated Claims.
In R. Sternberg & D. Halpern (Eds.), Critical Thinking in Psychology (pp. 68-102).
Cambridge: Cambridge University Press. doi:10.1017/9781108684354.005
20 Narlikar, J. (2005). Astronomy, Pseudoscience, and Rational Thinking.
Highlights of Astronomy, 13, 1052-1054. doi:10.1017/S1539299600018116
21 Narlikar, J. (2005). Astronomy, Pseudoscience, and Rational Thinking.
Highlights of Astronomy, 13, 1052-1054. doi:10.1017/S1539299600018116
22 Landrum, A.R. & Olshansky, A. (2019) The role of conspiracy mentality in
denial of science and susceptibility to viral deception about science. Politics and
the Life Sciences, 38(2), pp193-209
23 Kahneman, D. (2011). Thinking, Fast And Slow. New York: Farrar, Straus and
Giroux.
24 Bensley, D. (2020). Critical Thinking and the Rejection of Unsubstantiated Claims.
In R. Sternberg & D. Halpern (Eds.), Critical Thinking in Psychology (pp. 68-102).
Cambridge: Cambridge University Press. doi:10.1017/9781108684354.005
25 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
26 Lakatos, I. (1978). Introduction: Science and pseudoscience. In J. Worrall & G.
Currie (Eds.), The Methodology of Scientific Research Programmes: Philosophical
Papers (pp. 1-7). Cambridge: Cambridge University Press.
doi:10.1017/CBO9780511621123.002
27 Lakatos, I. (1978). Introduction: Science and pseudoscience. In J. Worrall & G.
Currie (Eds.), The Methodology of Scientific Research Programmes: Philosophical
Papers (pp. 1-7). Cambridge: Cambridge University Press.
doi:10.1017/CBO9780511621123.002
28 Ruscio, J. (2006). Critical Thinking In Psychology: Separating Sense From Nonsense
(2nd ed.). Belmont, CA.: Thomson/Wadsworth.
29 Harrington, M. (2020). The Varieties of Scientific Experience. In The Design of
Experiments in Neuroscience (pp. 1-12). Cambridge: Cambridge University Press.
doi:10.1017/9781108592468.002
30 Novella S. (2012) Your Deceptive Mind: A Scientific Guide To Critical Thinking Skills.
Chantilly, Va.: The Teaching Company.
31 Kuhn, T. S. (1970). The structure of scientific revolutions (2nd Ed.). University of
Chicago Press: Chicago.
32 Harrington, M. (2020). The Varieties of Scientific Experience. In The Design of
Experiments in Neuroscience (pp. 1-12). Cambridge: Cambridge University Press.
doi:10.1017/9781108592468.002
33 Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility Of Human
Reasoning In Everyday Life. New York: The Free Press.
34 Harrington, M. (2020). The Varieties of Scientific Experience. In The Design of
Experiments in Neuroscience (pp. 1-12). Cambridge: Cambridge University Press.
doi:10.1017/9781108592468.002
Gauch, H. (2012). Scientific Method in Brief. Cambridge: Cambridge University
Press. doi:10.1017/CBO9781139095082
35 Bensley, D. (2020). Critical Thinking and the Rejection of Unsubstantiated Claims.
In R. Sternberg & D. Halpern (Eds.), Critical Thinking in Psychology (pp. 68-102).
Cambridge: Cambridge University Press. doi:10.1017/9781108684354.005
36 Bridgstock, M. (2009). Modern skepticism. In Beyond Belief: Skepticism,
Science and the Paranormal (pp. 86-110). Cambridge: Cambridge University
Press. doi:10.1017/CBO9780511691676.006
37 Bridgstock, M. (2009). Modern skepticism. In Beyond Belief: Skepticism,
Science and the Paranormal (pp. 86-110). Cambridge: Cambridge University
Press. doi:10.1017/CBO9780511691676.006
38 Sagan, C. (1997). The Demon-Haunted World. London: Headline.
1. The Crucial Role Of Critical Thinking
1 https://www.dictionary.com/browse/critical-thinking Dictionary.com. Accessed
on 20 April 2021.
2 The APA Delphi Report, Critical Thinking: A Statement of Expert Consensus for
Purposes of Educational Assessment and Instruction. 1990. ERIC Doc. No.: ED 315
423, as cited by: Facione, P.A., in “Critical Thinking: What It Is and Why It
Counts”, p23.
3 Adapted from: Definitions of Logic, retrieved on 29 April 2021. https://
examples.yourdictionary.com/examples-of-logic.html
4 https://idioms.thefreedictionary.com/the+received+wisdom
5 https://forum.wordreference.com/threads/received-wisdom.2903508/
6 Galbraith, J. K. (1958) The Affluent Society, Houghton Mifflin
7 https://www.collinsdictionary.com/dictionary/english/wisdom
2. The Socratic Method Of Thinking
1 http://www.forbes.com/quotes/9496/; http://www.goodreads.com/quotes/
6885-judge-a-man-by-his-questions-rather-than-by-his; http://www.
brainyquote.com/quotes/quotes/v/voltaire100338.html
2 https://en.wikiquote.org/wiki/Voltaire#Misattributed
3 Gaston de Lévis, P. M. (1808), Maximes et réflexions sur differents sujets de
morale et de politique, Volume 1, p5, Charles Gosselin
4 https://www.biography.com/scholar/socrates Accessed on 9 May 2021
5 https://www.military-history.org/feature/thinkers-at-war-socrates.htm
Accessed on 7 May 2021
6 https://www.military-history.org/feature/thinkers-at-war-socrates.htm
Accessed on 7 May 2021
7 https://www.biography.com/scholar/socrates Accessed on 9 May 2021
8 Nails, Debra, "Socrates", The Stanford Encyclopedia of Philosophy (Spring
2020 Edition), Edward N. Zalta (ed.), URL = https://plato.stanford.edu/archives/
spr2020/entries/socrates/
9 Socrates: His Life and Times
10 Kreeft P, Dougherty T, (2010) Socratic Logic, Edition 3.1 (St. Augustine’s
Press) p 123
11 ibid. p 123
12 ibid. p 125
13 ibid. p 124
14 ibid.
15 Eldred, K. (2013) 1.2 Arguments - Types of Reasoning, Pima Community
College. Accessed at https://courses.lumenlearning.com/atd-pima-philosophy/
chapter/1-2-arguments-types-of-reasoning/ on 25 May 2021
16 Irving Rothchild (2006) Induction, Deduction, and the Scientific Method. The
Society for the Study of Reproduction, Inc.
17 Richard Muller, Professor of Physics at UC Berkeley, author of Now: The
Physics of Time, quoted on https://www.forbes.com/sites/quora/2017/01/05/the-
hardest-and-most-important-part-of-the-scientific-method-staying-
objective/ accessed 21 May 2021
18 Paul, R. and Elder, L. (2010). The Miniature Guide to Critical Thinking
Concepts and Tools. Dillon Beach: Foundation for Critical Thinking Press.
19 ibid.
20 Kreeft P, Dougherty T, (2010) Socratic Logic, Edition 3.1 (St. Augustine’s
Press) p 144.
3. Traits Of A Socratic Mind
1 https://en.wikipedia.org/wiki/Cardinal_virtues accessed 6 June, 2021.
2 Paul, R. and Elder, L. (2010). The Miniature Guide to Critical Thinking Concepts
and Tools. Dillon Beach: Foundation for Critical Thinking Press.
3 Resnick, B. (2019) Intellectual humility: the importance of knowing you might
be wrong. https://www.vox.com/science-and-health/2019/1/4/17989224/
intellectual-humility-explained-psychology-replication accessed 30 May 2021.
4 Steve Wozniak interview with Mark Milian (Dec 8, 2010). http://edition.cnn.
com/2010/TECH/innovation/12/08/steve.wozniak.computers/index.html accessed
8 June 2021
5 Collins Concise English Dictionary, 8th edition, (2012). HarperCollins
Publishers. p 105
6 Dr. Okadigbo Chuba, as reported in the Nigerian Daily Post (2017) https://
dailypost.ng/2017/10/23/fani-kayode-urges-buhari-take-okadigbos-advice/
accessed 10 June 2021
7 Paul, R. and Elder, L. (2010). The Miniature Guide to Critical Thinking Concepts
and Tools. Dillon Beach: Foundation for Critical Thinking Press.
8 Ibid.
9 Ibid.
10 Facione, P. A., “Cultivating A Positive Critical Thinking Mindset,” © 2016
Measured Reasons LLC. p 7 Based in part on material from chapter 2 of Think
Critically, Facione and Gittens, 2016, Pearson Education.
11 George Carlin quotes. https://www.goodreads.com/quotes/679083-forget-
the-politicians-the-politicians-are-put-there-to-give accessed 6 June 2021
4. Questioning: The Heart Of The Socratic Method
1 Toffler, Alvin. “Future Shock,” (1970). Random House. p 211
2 Nascimento, G. https://www.dailymaverick.co.za/opinionista/2021-06-14-
confessions-of-a-white-south-african-on-youth-day-in-2021/ accessed 14
June 2021
3 Kreeft P, Dougherty T, (2010) Socratic Logic, Edition 3.1 (St. Augustine’s Press)
p 124.
4 Westacott, Emrys. (2020). Summary and Analysis of Plato's 'Euthyphro'.
Retrieved from https://www.thoughtco.com/platos-euthyphro-2670341 on 16
June 2021
5 Watch this Elon Musk interview from the 19:45 mark https://www.youtube.
com/watch?v=lS3nIyetS4I&t=1185s accessed on 6 July 2021
6 Steve Jobs takes a full 18 seconds before responding https://www.youtube.com/
watch?v=FF-tKLISfPE&t=78s accessed on 6 July 2021
7 https://www.inc.com/justin-bariso/why-intelligent-minds-like-elon-musk-
steve-jobs-embrace-rule-of-awkward-silence.html accessed on 6 July 2021
8 Paul, R. and Elder, L. (2007). The Thinker’s Guide to The Art of Socratic
Questioning. Foundation for Critical Thinking Press.
9 Daniel J. Levitin As quoted by Berger, W (2018) The book of beautiful questions:
the powerful questions that will help you decide, create, connect, and lead.
Bloomsbury Publishing, New York
10 Berger, W (2018) The book of beautiful questions: the powerful questions that
will help you decide, create, connect, and lead. Bloomsbury Publishing, New York
11 Carl Sagan’s last interview in 1996 on Charlie Rose. Available on YouTube:
www.youtube.com/watch?v=U8HEwO-2L4w. accessed on 6 July 2021
12 Combating the scourge of disinformation https://www.dailymaverick.co.za/
article/2021-06-25-influence-in-africa-combating-the-scourge-of-
disinformation/ accessed on 4 July 2021
5. The Skillful Art Of Asking The Right Questions
1 https://www.telegraph.co.uk/news/uknews/9959026/Mothers-asked-nearly-
300-questions-a-day-study-finds.html accessed on 10 July 2021
2 Berger, W (2018) The book of beautiful questions. Bloomsbury Publishing, New
York.
3 Harris, P. (2012) Trusting What You’re Told: How Children Learn from Others.
Harvard Press, Boston.
4 Mead, M. (1928, 1961), Coming of Age in Samoa. William Morrow and Co. p246.
5 https://hsm.stackexchange.com/questions/3692/is-the-questions-that-can-
t-be-answered-over-answers-that-can-t-be-question and https://en.
wikiquote.org/wiki/Talk:Richard_Feynman#Not_a_quote both accessed 16 July
2021
6 David, J. (2018). How the American Education System Suppresses Critical
Thinking retrieved from https://observer.com/2018/01/american-education-
system-suppresses-critical-thinking/ accessed 18 July 2021.
7 Paul, R. W., Martin, D, and Adamson K, (1989). Critical Thinking Handbook: High
School, A Guide for Redesigning Instruction. Foundation for Critical Thinking.
retrieved from http://web.sonoma.edu/users/s/swijtink/teaching/
philosophy_101/role.htm accessed on 24 April 2021
8 Erasmus-Kritzinger, E., Bowler, A., & Goliath, D. (2009). Effective Communication,
Van Schaik Publishers, pp21-23
9 Paul, R. and Elder, L. (2007). The Thinker’s Guide to The Art of Socratic
Questioning. Foundation for Critical Thinking Press.
10 Rowlands, M. (2008). The Philosopher and the Wolf, Granta Publications,
London. p148
11 Resolution 65/309 Happiness: Towards a Holistic Approach to Development
(2011) https://digitallibrary.un.org/record/715187?ln=en#record-files-collapse-
header accessed on 24 July 2021
12 Stearns, P. N. (2012) The History of Happiness. https://hbr.org/2012/01/the-
history-of-happiness accessed on 25 July 2021
13 As quoted in: Lexical Investigations: Happiness. retrieved from https://www.
dictionary.com/e/happiness/ accessed on 26 July 2021.
14 Cilliers J. (2021) South Africa’s security sector is in crisis https://www.
dailymaverick.co.za/article/2021-07-21-south-africas-security-sector-is-in-
crisis-immediate-reform-is-needed-to-ensure-national-stability/
6. Getting It Right: Points To Remember And Apply
1 Vonnegut, K. (1973) Breakfast of Champions. Delacorte Press, New York
2 Collins Concise English Dictionary eighth edition, (2012) HarperCollins
Publishers
3 Ibid.
4 (2021) Critical Thinking In A Nutshell. Thinknetic. pp29-34
5 Paul, R. and Elder, L. (2010) The Miniature Guide to Critical Thinking Concepts
and Tools. Dillon Beach: Foundation for Critical Thinking Press.
6 Ibid.
7 Ibid.
Afterword
1 Rowlands, M. (2008). The Philosopher and the Wolf, Granta Publications, London.
p98
ONE FINAL WORD FROM US
If this book has helped you in any way, we’d appreciate it if you left a review on
Amazon. Reviews are the lifeblood of our business. We read every single one and
incorporate your feedback in developing future book projects.
To leave an Amazon review simply click below:
(Or go to: smarturl.it/tcter or scan the code with your camera)
CONTINUING YOUR JOURNEY
Those Who Keep Learning, Will Keep Rising In
Life.
— CHARLIE MUNGER (BILLIONAIRE, INVESTOR, AND
WARREN BUFFETT’S BUSINESS PARTNER)
The most successful people in life are those who enjoy learning and asking
questions, understanding themselves and the world around them.
In our Thinknetic newsletter we’ll share with you our best thinking improvement
tips and tricks to help you become even more successful in life.
It’s 100% free and you can unsubscribe at any time.
Besides, you’ll hear first about our new releases and get the chance to receive
them for free or highly discounted.
As a bonus, you’ll get our bestselling book Critical Thinking In A Nutshell & 2
thinking improvement sheets completely for free.
You can sign up by clicking on the link below:
(Or go to thinknetic.net or simply scan the code with your camera)
THE TEAM BEHIND THINKNETIC
Michael Meisner, Founder and CEO
When Michael got into publishing books on Amazon, he
found that his favorite topic, the thinking process and its
results, was tackled in a far too complex and unengaging
way. So he set out to make his ideal a reality: books that
are informative, entertaining, and can help people achieve
success by thinking things through.
This ideal became his passion and profession. He built a
team of like-minded people and is in charge of the strategic
part and brand orientation, as he continues to improve and
extend his business.
Diana Spoiala, Publishing Manager
From idea to print, there is a process involving researching
and designing the book, writing and editing it, and providing
it with the right covers. Diana oversees this process and
ensures the quality of each book. Outside work, she dedicates
most of her time to cultivating her innate love for reading and
writing literature, poetry, and philosophy.
Theresa Datinguinoo, Research and Outline Mastermind
Theresa derives “immense satisfaction from putting
together ideas to provide a solid framework for an engaging
story and seeing the final product come to life.” Her
professional background is in human resources management
and psychology, but she has always enjoyed writing articles
and blog posts about any subject.
Doris Lam, Senior Content Editor
Doris has been editing print media since 2005 as the Chief
Copy Editor and Program Coordinator for several
environmental agencies. She is committed to helping writers
achieve clarity, always up for the challenge of making
everyone's writing a masterpiece. For more information
about Doris’s great work, visit www.dorissiu.com.
Nerina Badalic, Senior Content Editor
Throughout the years, Nerina wrote articles, short stories,
and songs. As an editor, she helps authors bring out the best
in themselves to produce manuals, theses, articles, and books
that are valuable and useful to readers. Nerina continues to
explore the arts that surround the world of words:
communication, marketing, design, music, and
photography.
Francesca Scotti-Goetz, Newsletter Writer and Social Media
Community Manager
An observer first and a copywriter second, Francesca has a
passion for the intersection of art with humanity; social
issues with media; thinking with creativity. She spends her
weekends in Amsterdam with a camera and a notebook, and
her weekdays harnessing her discoveries to effectively
engage with Thinknetic’s worldwide community.
Contributors:
David Brant Yu
David is committed to carefully reviewing the profiles of the
many aspiring writers for Thinknetic, ensuring that the most
skilled and talented ones join the team. His voracious
reading habit and interests in philosophy and current affairs
help him carry out his work critically. In his spare time, David
does freelance copyediting and English tutoring.
Evangeline Obiedo
Evangeline completes our books’ journey to getting
published. She pays attention to all the details, making sure
that every book is properly formatted. Her love for learning
extends into the real world - she loves traveling and
experiencing new places and cultures.
DISCLAIMER
The information contained in this book and its components
is meant to serve as a comprehensive collection of strategies
that the author of this book has researched. Summaries,
strategies, tips, and tricks are only recommendations by the
author, and reading this book will not guarantee that one’s
results will exactly mirror the author’s results.

The author of this book has made all reasonable efforts to
provide current and accurate information for the readers of
this book. The author and his associates will not be held
liable for any unintentional errors or omissions that may be
found.

The material in this book may include information by third
parties. Third party materials comprise opinions expressed
by their owners. As such, the author of this book does not
assume responsibility or liability for any third party
material or opinions.

The publication of third party material does not constitute
the author’s guarantee of any information, products,
services, or opinions contained within that material. Use of
third party material does not guarantee that your results
will mirror our results. Publication of such third party
material is simply a recommendation and an expression of
the author’s own opinion of that material.

Whether because of the progression of the Internet or
unforeseen changes in company policy and editorial
submission guidelines, what is stated as fact at the time of
this writing may become outdated or inapplicable later.

This book is copyright ©2022 by Thinknetic, with all rights
reserved. It is illegal to redistribute, copy, or create
derivative works from this book in whole or in part. No part
of this book may be reproduced or retransmitted in any
form whatsoever without the express written and signed
permission of the author.