Civic Decision
satisfactory. It is therefore a process which can be more or less rational or irrational and can be based on explicit or tacit
knowledge and beliefs. Tacit knowledge is often used to fill the gaps in complex decision-making processes.[3] Usually,
both of these types of knowledge, tacit and explicit, are used together in the decision-making process.
Human performance has been the subject of active research from several perspectives:
Psychological: examining individual decisions in the context of a set of needs, preferences and values the individual
has or seeks.
Cognitive: the decision-making process is regarded as a continuous process integrated in the interaction with the
environment.
Normative: the analysis of individual decisions concerned with the logic of decision-making, or communicative
rationality, and the invariant choice it leads to.[4]
A major part of decision-making involves the analysis of a finite set of alternatives described in terms of evaluative
criteria. Then the task might be to rank these alternatives in terms of how attractive they are to the decision-maker(s) when
all the criteria are considered simultaneously. Another task might be to find the best alternative or to determine the relative
total priority of each alternative (for instance, if alternatives represent projects competing for funds) when all the criteria
are considered simultaneously. Solving such problems is the focus of multiple-criteria decision analysis (MCDA). This
area of decision-making, although very old, has attracted the interest of many researchers and practitioners and is still
highly debated as there are many MCDA methods which may yield very different results when they are applied on exactly
the same data.[5] This leads to the formulation of a decision-making paradox. Logical decision-making is an important part
of all science-based professions, where specialists apply their knowledge in a given area to make informed decisions. For
example, medical decision-making often involves a diagnosis and the selection of appropriate treatment. But naturalistic decision-making research shows that in situations with higher time pressure, higher stakes, or increased ambiguity, experts may use intuitive decision-making rather than structured approaches. They may follow a recognition-primed decision that fits their experience and arrive at a course of action without weighing alternatives.[6]
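To make the MCDA task described above concrete, the following is a minimal weighted-sum sketch in Python. The projects, criteria, scores, and weights are hypothetical, and the weighted sum is only one of many MCDA aggregation rules; as noted above, different methods can rank the same data differently.

```python
# Minimal weighted-sum ranking sketch for a multi-criteria decision problem.
# The alternatives, criterion scores, and weights below are hypothetical.

criteria_weights = {"cost": 0.5, "benefit": 0.3, "risk": 0.2}

# Scores are assumed to be normalized to [0, 1], where higher is better
# (e.g. "cost" here is already inverted so that cheap -> high score).
alternatives = {
    "Project A": {"cost": 0.8, "benefit": 0.6, "risk": 0.7},
    "Project B": {"cost": 0.4, "benefit": 0.9, "risk": 0.5},
    "Project C": {"cost": 0.6, "benefit": 0.7, "risk": 0.9},
}

def weighted_score(scores: dict[str, float]) -> float:
    """Aggregate the criterion scores of one alternative into a single priority value."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

ranking = sorted(alternatives, key=lambda a: weighted_score(alternatives[a]), reverse=True)
for name in ranking:
    print(f"{name}: {weighted_score(alternatives[name]):.2f}")
```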
The decision-maker's environment can play a part in the decision-making process. For example, environmental complexity
is a factor that influences cognitive function.[7] A complex environment is an environment with a large number of different
possible states which come and go over time.[8] Studies done at the University of Colorado have shown that more complex environments correlate with higher cognitive function, which means that a decision can be influenced by the location. One experiment measured the complexity of a room by the number of small objects and appliances present; a simple room had fewer of them. Cognitive function improved with the higher measure of environmental complexity, making it easier to think about the situation and make a better decision.[7]
It is important to differentiate between problem solving, or problem analysis, and decision-making. Problem solving is the
process of investigating the given information and finding all possible solutions through invention or discovery.
Traditionally, it is argued that problem solving is a step towards decision making, so that the information gathered in that
process may be used towards decision-making.[9][page needed]
Analysis paralysis
Main article: Analysis paralysis
When a group or individual is unable to make it through the problem-solving step on the way to making a
decision, they could be experiencing analysis paralysis. Analysis paralysis is a state in which a person is unable to make a decision, in effect paralyzing the outcome.[12][13] Some of the main causes of analysis paralysis are an overwhelming flood of incoming data and the tendency to overanalyze the situation at hand.[14]
There are said to be three different types of analysis paralysis.[15]
The first is analysis process paralysis. This type of paralysis is often spoken of as cyclical: the decision-maker is unable to make a decision because they get stuck going over the same information again and again for fear of making the wrong decision.
The second is decision precision paralysis. This paralysis is cyclical, just like the first, but instead of going over the same information, the decision-maker finds new questions and information in their analysis, which leads them to explore further possibilities rather than make a decision.
The third is risk uncertainty paralysis. This paralysis occurs when the decision-maker wants to eliminate any uncertainty, but examining the available information cannot get rid of all of it.
Extinction by instinct
On the opposite side of analysis paralysis is the phenomenon called extinction by instinct. Extinction by instinct
is the state a person is in when they make careless decisions without detailed planning or a thorough, systematic process.[16] It can be countered by introducing structure, such as checks and balances, into a group or into one's own life. Analysis paralysis is the exact opposite, where a group's schedule becomes saturated by too much of a checks-and-balances structure.[16]
Groupthink is another occurrence that falls under the idea of extinction by instinct. Groupthink occurs when members of a group place the "value of the group (and their being part of it) higher than anything else", creating a habit of making decisions quickly and unanimously. In other words, a group stuck in groupthink is participating in the phenomenon of extinction by instinct.[17]
Information overload
Main article: Information overload
Information overload is "a gap between the volume of information and the tools we have to assimilate" it.[18] Information is used in decision-making to reduce or eliminate uncertainty.[19] Excessive information affects
problem processing and tasking, which affects decision-making.[20] Psychologist George Armitage Miller
suggests that humans’ decision making becomes inhibited because human brains can only hold a limited amount
of information.[21] Crystal C. Hall and colleagues described an "illusion of knowledge", meaning that when individuals encounter too much knowledge it can interfere with their ability to make rational decisions.[22] Other names for information overload are information anxiety, information explosion, infobesity, and infoxication.[23][24][25][26]
Decision fatigue
Main article: Decision fatigue
Decision fatigue occurs when a sizable amount of decision-making leads to a decline in decision-making skills. People who make decisions over an extended period of time begin to lose the mental energy needed to analyze all possible solutions. It is speculated that decision fatigue happens only to those who believe willpower has a limited capacity.[27] Impulsive decision-making and decision avoidance are two possible paths that extend from decision fatigue. Impulsive decisions are made more often when a person is tired of analyzing situations or solutions; the response is to act rather than think.[27] Decision avoidance is when a person evades the situation entirely by never making a decision. Decision avoidance differs from analysis paralysis in that it avoids the situation entirely, while in analysis paralysis one continually looks at the decisions to be made yet remains unable to make a choice.[28]
Post-decision analysis
Evaluation and analysis of past decisions is complementary to decision-making. See also Mental
accounting and Postmortem documentation.
Neuroscience
Decision-making is a region of intense study in the fields of systems neuroscience and cognitive neuroscience. Several brain structures, including the anterior cingulate cortex (ACC), orbitofrontal cortex, and the overlapping ventromedial prefrontal cortex, are believed to be involved in decision-making processes.
A neuroimaging study[29] found distinctive patterns of neural activation in these regions depending on whether
decisions were made on the basis of perceived personal volition or following directions from someone else.
Patients with damage to the ventromedial prefrontal cortex have difficulty making advantageous decisions.[30][page needed]
A common laboratory paradigm for studying neural decision-making is the two-alternative forced choice task
(2AFC), in which a subject has to choose between two alternatives within a certain time. A study of a two-
alternative forced choice task involving rhesus monkeys found that neurons in the parietal cortex not only
represent the formation of a decision[31] but also signal the degree of certainty (or "confidence") associated with
the decision.[32] A 2012 study found that rats and humans can optimally accumulate incoming sensory evidence to make statistically optimal decisions.[33] Another study found that lesions to the ACC in the macaque resulted in impaired decision-making in the long run of reinforcement-guided tasks, suggesting that the ACC may be involved in evaluating past reinforcement information and guiding future action.[34] It has recently been argued
that the development of formal frameworks will allow neuroscientists to study richer and more naturalistic
paradigms than simple 2AFC decision tasks; in particular, such decisions may involve planning and information
search across temporally extended environments.[35]
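The evidence-accumulation account used in many of these 2AFC studies is often formalized as a sequential-sampling (drift-diffusion) process. The sketch below is a minimal simulation of that general idea, with made-up drift, noise, and bound parameters; it is not the specific model fitted in the cited work.

```python
import random

# Minimal drift-diffusion sketch of a two-alternative forced choice (2AFC):
# noisy evidence accumulates over time until it crosses one of two bounds.
# Drift, noise, and bound values are illustrative, not fitted to any study.

def simulate_trial(drift=0.1, noise=1.0, bound=10.0, dt=1.0):
    evidence = 0.0
    steps = 0
    while abs(evidence) < bound:
        evidence += drift * dt + random.gauss(0.0, noise) * dt ** 0.5
        steps += 1
    choice = "A" if evidence > 0 else "B"
    return choice, steps  # decision and number of time steps taken

random.seed(0)
trials = [simulate_trial() for _ in range(1000)]
prop_a = sum(c == "A" for c, _ in trials) / len(trials)
mean_rt = sum(t for _, t in trials) / len(trials)
print(f"proportion choosing A: {prop_a:.2f}, mean decision time: {mean_rt:.1f} steps")
```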
Emotions
Main article: Emotions in decision-making
Emotion appears able to aid the decision-making process. Decision-making often occurs in the face
of uncertainty about whether one's choices will lead to benefit or harm (see also Risk). The somatic marker
hypothesis is a neurobiological theory of how decisions are made in the face of uncertain outcomes.[36] This
theory holds that such decisions are aided by emotions, in the form of bodily states, that are elicited during the
deliberation of future consequences and that mark different options for behavior as being advantageous or
disadvantageous. This process involves an interplay between neural systems that elicit emotional/bodily states
and neural systems that map these emotional/bodily states.[37] A recent lesion mapping study of 152 patients with
focal brain lesions conducted by Aron K. Barbey and colleagues provided evidence to help discover the neural
mechanisms of emotional intelligence.[38][39][40]
Decision-making techniques
Decision-making techniques can be separated into two broad categories: group decision-making techniques and
individual decision-making techniques. Individual decision-making techniques can also often be applied by a
group.
Group
Consensus decision-making tries to avoid "winners" and "losers". Consensus requires that a majority
approve a given course of action, but that the minority agree to go along with the course of action. In other
words, if the minority opposes the course of action, consensus requires that the course of action be
modified to remove objectionable features.
Voting-based methods:
o Majority requires support from more than 50% of the members of the group. Thus, the bar for action is
lower than with consensus. See also Condorcet method.
o Plurality, where the largest faction in a group decides, even if it falls short of a majority.
o Score voting (or range voting) lets each member score one or more of the available options, specifying both preference and intensity of preference. The option with the highest total or average is chosen. This method has experimentally been shown to produce the lowest Bayesian regret among common voting methods, even when voters are strategic.[41] It addresses issues of the voting paradox and majority rule; a minimal tallying sketch appears after this list. See also approval voting.
o Quadratic voting allows participants to cast their preference and intensity of preference for each
decision (as opposed to a simple for or against decision). As in score voting, it addresses issues of
voting paradox and majority rule.
Delphi method is a structured communication technique for groups, originally developed for collaborative forecasting but also used for policy making.[42]
Dotmocracy is a facilitation method that relies on the use of special forms called Dotmocracy sheets, which allow large groups to collectively brainstorm and recognize agreement on an unlimited number of ideas they have each written.[43]
Participative decision-making occurs when an authority opens up the decision-making process to a group of
people for a collaborative effort.
Decision engineering uses a visual map of the decision-making process based on system dynamics and can
be automated through a decision modeling tool, integrating big data, machine learning, and expert
knowledge as appropriate.
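As a concrete illustration of the score-voting tally mentioned above, here is a minimal sketch. The voters, options, and scores are hypothetical; a real election would also need rules for ties and for how scores are normalized.

```python
# Minimal score (range) voting tally: each voter scores each option 0-5;
# the option with the highest total (equivalently, average) wins.
# Voters, options, and scores are hypothetical.

ballots = [
    {"Option X": 5, "Option Y": 2, "Option Z": 0},
    {"Option X": 3, "Option Y": 4, "Option Z": 1},
    {"Option X": 1, "Option Y": 5, "Option Z": 4},
]

totals = {}
for ballot in ballots:
    for option, score in ballot.items():
        totals[option] = totals.get(option, 0) + score

winner = max(totals, key=totals.get)
print(totals)             # {'Option X': 9, 'Option Y': 11, 'Option Z': 5}
print("winner:", winner)  # Option Y
```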
Individual
Decisional balance sheet: listing the advantages and disadvantages (benefits and costs, pros and cons) of
each option, as suggested by Plato's Protagoras and by Benjamin Franklin.[44]
Expected-value optimization: choosing the alternative with the highest probability-weighted utility,
possibly with some consideration for risk aversion. This may involve considering the opportunity cost of
different alternatives. See also Decision analysis and Decision theory.
Satisficing: examining alternatives only until the first acceptable one is found. The opposite
is maximizing or optimizing, in which many or all alternatives are examined in order to find the best
option.
Acquiescence to a person in authority or an "expert"; "just following orders".
Anti-authoritarianism: taking the most opposite action compared to the advice of mistrusted authorities.
Flipism, e.g. flipping a coin, cutting a deck of playing cards, and other random or coincidence methods – or prayer, tarot cards, astrology, augurs, revelation, or other forms of divination, superstition or pseudoscience.
Automated decision support: setting up criteria for automated decisions.
Decision support systems: using decision-making software when faced with highly complex decisions or
when considering many stakeholders, categories, or other factors that affect decisions.
Steps
A variety of researchers have formulated similar prescriptive steps aimed at improving decision-making.
GOFER
In the 1980s, psychologist Leon Mann and colleagues developed a decision-making process called GOFER,
which they taught to adolescents, as summarized in the book Teaching Decision Making To Adolescents.[45] The
process was based on extensive earlier research conducted with psychologist Irving Janis.[46] GOFER is
an acronym for five decision-making steps:[47]
DECIDE
In 2008, Kristina Guo published the DECIDE model of decision-making, which has six parts:[48]
Other
In 2007, Pam Brown of Singleton Hospital in Swansea, Wales, divided the decision-making process into seven
steps:[49]
In 2009, professor John Pijanowski described how the Arkansas Program, an ethics curriculum at the University
of Arkansas, used eight stages of moral decision-making based on the work of James Rest:[50]: 6
1. Establishing community: Create and nurture the relationships, norms, and procedures that will
influence how problems are understood and communicated. This stage takes place prior to and during
a moral dilemma.
2. Perception: Recognize that a problem exists.
3. Interpretation: Identify competing explanations for the problem, and evaluate the drivers behind those
interpretations.
4. Judgment: Sift through various possible actions or responses and determine which is more justifiable.
5. Motivation: Examine the competing commitments which may distract from a more moral course of
action and then prioritize and commit to moral values over other personal, institutional or social
values.
6. Action: Follow through with action that supports the more justified decision.
7. Reflection in action.
8. Reflection on action.
Group stages
There are four stages or phases that should be involved in all group decision-making:[51]
Orientation. Members meet for the first time and start to get to know each other.
Conflict. Once group members become familiar with each other, disputes, little fights and arguments occur.
Group members eventually work it out.
Emergence. The group begins to clear up vague opinions by talking about them.
Reinforcement. Members finally make a decision and provide justification for it.
It is said that establishing critical norms in a group improves the quality of decisions, while the majority of
opinions (called consensus norms) do not.[52]
Conflicts in group socialization can be divided into functional and dysfunctional types. Functional conflicts mostly involve questioning the manager's assumptions in decision-making, whereas dysfunctional conflicts involve personal attacks and other actions that decrease team effectiveness. Functional conflicts are the better route to higher-quality decision-making, owing to the increased team knowledge and shared understanding they produce.[53]
In reality, however, there are some factors that affect decision-making abilities and cause people to make
irrational decisions – for example, to make contradictory choices when faced with the same problem framed in
two different ways (see also Allais paradox).
Rational decision making is a multi-step process for making choices between alternatives. The process of rational decision making favors logic, objectivity, and analysis over subjectivity and insight. Irrational decision-making, by contrast, runs counter to logic: decisions are made in haste and outcomes are not considered.[55]
One of the most prominent theories of decision making is subjective expected utility (SEU) theory, which
describes the rational behavior of the decision maker.[56] The decision maker assesses different alternatives by
their utilities and the subjective probability of occurrence.[56]
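In SEU terms, each alternative's value is the sum of the utilities of its possible outcomes weighted by the decision-maker's subjective probabilities. A minimal sketch, with hypothetical probabilities and utilities:

```python
# Subjective expected utility (SEU): sum of subjective probability x utility
# over the possible outcomes of each alternative. Numbers are hypothetical.

alternatives = {
    "take the new job": [(0.6, 80), (0.4, -20)],   # (subjective probability, utility)
    "keep current job": [(0.9, 30), (0.1, -5)],
}

def seu(outcomes):
    """Probability-weighted sum of utilities for one alternative."""
    return sum(p * u for p, u in outcomes)

best = max(alternatives, key=lambda a: seu(alternatives[a]))
for name, outcomes in alternatives.items():
    print(f"{name}: SEU = {seu(outcomes):.1f}")
print("choose:", best)  # the alternative with the highest SEU
```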
Rational decision-making is often grounded in experience and in theories that are able to put this approach on solid mathematical grounds so that subjectivity is reduced to a minimum; see, for example, scenario optimization. A rational decision is generally seen as the best or most likely decision to achieve the set goals or outcome.[57]
When it comes to the idea of fairness in decision-making, children and adults differ much less. Children are able to understand the concept of fairness in decision-making from an early age. Toddlers and infants, ranging from 9 to 21 months, understand basic principles of equality. The main difference found is that more complex principles of fairness in decision-making, such as contextual and intentional information, do not emerge until children get older.[59]
Adolescents
During their adolescent years, teens are known for their high-risk behaviors and rash decisions. Research[60] has
shown that there are differences in cognitive processes between adolescents and adults during decision-making.
Researchers have concluded that differences in decision-making are not due to a lack of logic or reasoning, but
more due to the immaturity of psychosocial capacities that influence decision-making. Examples of these undeveloped capacities include impulse control, emotion regulation, delayed gratification and resistance to peer pressure. In the past, researchers have thought that adolescent behavior was
simply due to incompetency regarding decision-making. Currently, researchers have concluded that adults and
adolescents are both competent decision-makers, not just adults. However, adolescents' competent decision-
making skills decrease when psychosocial capacities become present.
Research[61] has shown that risk-taking behaviors in adolescents may be the product of interactions between the
socioemotional brain network and its cognitive-control network. The socioemotional part of the brain processes
social and emotional stimuli and has been shown to be important in reward processing. The cognitive-control
network assists in planning and self-regulation. Both of these sections of the brain change over the course
of puberty. However, the socioemotional network changes quickly and abruptly, while the cognitive-control
network changes more gradually. Because of this difference in change, the cognitive-control network, which
usually regulates the socioemotional network, struggles to control the socioemotional network when
psychosocial capacities are present.[clarification needed]
When adolescents are exposed to social and emotional stimuli, their socioemotional network is activated as well
as areas of the brain involved in reward processing. Because teens often gain a sense of reward from risk-taking
behaviors, their repetition becomes ever more probable due to the reward experienced. In this, the process
mirrors addiction. Teens can become addicted to risky behavior because they are in a high state of arousal and
are rewarded for it not only by their own internal functions but also by their peers around them. A recent study
suggests that adolescents have difficulties adequately adjusting beliefs in response to bad news (such as reading
that smoking poses a greater risk to health than they thought), but do not differ from adults in their ability to
alter beliefs in response to good news.[62] This creates biased beliefs, which may lead to greater risk taking.[63]
Adults
Adults are generally better able to control their risk-taking because their cognitive-control system has matured
enough to the point where it can control the socioemotional network, even in the context of high arousal or when
psychosocial capacities are present. Also, adults are less likely to find themselves in situations that push them to
do risky things. For example, teens are more likely to be around peers who peer pressure them into doing things,
while adults are not as exposed to this sort of social setting.[64][65]
Selective search for evidence (also known as confirmation bias): People tend to be willing to gather facts
that support certain conclusions but disregard other facts that support different conclusions. Individuals
who are highly defensive in this manner show significantly greater left prefrontal cortex activity as
measured by EEG than do less defensive individuals.[67]
Premature termination of search for evidence: People tend to accept the first alternative that looks like it
might work.
Cognitive inertia is the unwillingness to change existing thought patterns in the face of new circumstances.
Selective perception: People actively screen out information that they do not think is important (see
also Prejudice). In one demonstration of this effect, discounting of arguments with which one disagrees (by
judging them as untrue or irrelevant) was decreased by selective activation of right prefrontal cortex.[68]
Wishful thinking is a tendency to want to see things in a certain – usually positive – light, which can distort
perception and thinking.[69]
Choice-supportive bias occurs when people distort their memories of chosen and rejected options to make
the chosen options seem more attractive.
Recency: People tend to place more attention on more recent information and either ignore or forget more distant information (see Semantic priming). The opposite effect, concerning the first information encountered, is termed the primacy effect.[70][page needed]
Repetition bias is a willingness to believe what one has been told most often and by the greatest number of
different sources.
Anchoring and adjustment: Decisions are unduly influenced by initial information that shapes our view of
subsequent information.
Groupthink is peer pressure to conform to the opinions held by the group.
Source credibility bias is a tendency to reject a person's statement on the basis of a bias against the person,
organization, or group to which the person belongs. People preferentially accept statements by others that
they like (see also Prejudice).
Incremental decision-making and escalating commitment: People look at a decision as a small step in a
process, and this tends to perpetuate a series of similar decisions. This can be contrasted with zero-based
decision-making (see Slippery slope).
Attribution asymmetry: People tend to attribute their own success to internal factors, including abilities and
talents, but explain their failures in terms of external factors such as bad luck. The reverse bias is shown
when people explain others' success or failure.
Role fulfillment is a tendency to conform to others' decision-making expectations.
Underestimating uncertainty and the illusion of control: People tend to underestimate future uncertainty
because of a tendency to believe they have more control over events than they really do.
Framing bias: This is best avoided by increasing numeracy and presenting data in several formats (for
example, using both absolute and relative scales).[71]
o Sunk-cost fallacy is a specific type of framing effect that affects decision-making. It involves an
individual making a decision about a current situation based on what they have previously invested in
the situation.[54]: 372 An example would be an individual who refrains from dropping a class they are likely to fail because they feel they have already put so much work into the course.
Prospect theory involves the idea that when faced with a decision-making event, an individual is more likely to take on risk when evaluating potential losses, and more likely to avoid risk when evaluating potential gains. This can influence one's decision-making depending on whether the situation entails a threat or an opportunity (a sketch of the standard value function appears after this list).[54]: 373
Optimism bias is a tendency to overestimate the likelihood of positive events occurring in the future and
underestimate the likelihood of negative life events.[72] Such biased expectations are generated and
maintained in the face of counter-evidence through a tendency to discount undesirable information. [73] An
optimism bias can alter risk perception and decision-making in many domains, ranging from finance to
health.
Reference class forecasting was developed to eliminate or reduce cognitive biases in decision-making.
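The gain/loss asymmetry that prospect theory describes (referenced in the list above) is usually captured by a value function that is concave for gains, convex for losses, and steeper for losses. The sketch below uses the commonly cited Tversky and Kahneman (1992) parameter estimates; treat it as an illustration, not a definitive specification.

```python
# Prospect-theory value function: concave for gains, convex and steeper for
# losses, so a loss "hurts" more than an equal-sized gain "pleases".
# Parameters are the commonly cited Tversky & Kahneman (1992) estimates.

ALPHA = 0.88   # curvature for gains
BETA = 0.88    # curvature for losses
LAMBDA = 2.25  # loss-aversion coefficient

def value(x: float) -> float:
    """Subjective value of a gain (x >= 0) or loss (x < 0) relative to a reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

print(value(100))   # ~57.5   subjective value of gaining 100
print(value(-100))  # ~-129.5 subjective value of losing 100 (larger in magnitude)
```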
In groups, people generate decisions through active and complex processes. One method consists of three steps:
initial preferences are expressed by members; the members of the group then gather and share information
concerning those preferences; finally, the members combine their views and make a single choice about how to
face the problem. Although these steps are relatively ordinary, judgements are often distorted by cognitive and motivational biases, including "sins of commission", "sins of omission", and "sins of imprecision".[74][page needed]
Cognitive styles
Optimizing vs. satisficing
Main article: Maximization (psychology)
Herbert A. Simon coined the phrase "bounded rationality" to express the idea that human decision-making is
limited by available information, available time and the mind's information-processing ability. Further
psychological research has identified individual differences between two cognitive styles: maximizers try to
make an optimal decision, whereas satisficers simply try to find a solution that is "good enough". Maximizers
tend to take longer making decisions due to the need to maximize performance across all variables and make
tradeoffs carefully; they also tend to more often regret their decisions (perhaps because they are more able than
satisficers to recognize that a decision turned out to be sub-optimal).[75]
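The contrast can be made concrete computationally: a satisficer stops at the first option that clears an aspiration threshold, while a maximizer evaluates every option before choosing. A minimal sketch with hypothetical option values and threshold:

```python
# Satisficing vs. maximizing over a stream of options. Values are hypothetical
# "quality" scores; the aspiration threshold is the satisficer's cutoff.

options = [62, 71, 85, 90, 78, 95]

def satisfice(options, threshold=80):
    """Return the first option that is 'good enough', examining as few as possible."""
    for examined, value in enumerate(options, start=1):
        if value >= threshold:
            return value, examined          # chosen value, number of options examined
    return max(options), len(options)       # fall back if nothing clears the bar

def maximize(options):
    """Examine every option and return the best one."""
    return max(options), len(options)

print("satisficer:", satisfice(options))  # (85, 3) -- stops early at a good-enough option
print("maximizer:", maximize(options))    # (95, 6) -- examines all, finds the best
```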
Katsenelinboigen states that apart from the methods (reactive and selective) and sub-methods (randomization,
predispositioning, programming), there are two major styles: positional and combinational. Both styles are
utilized in the game of chess. The two styles reflect two basic approaches to uncertainty: deterministic
(combinational style) and indeterministic (positional style). Katsenelinboigen's definitions of the two styles are as follows.
In defining the combinational style in chess, Katsenelinboigen wrote: "The combinational style features a clearly
formulated limited objective, namely the capture of material (the main constituent element of a chess position).
The objective is implemented via a well-defined, and in some cases, unique sequence of moves aimed at
reaching the set goal. As a rule, this sequence leaves no options for the opponent. Finding a combinational
objective allows the player to focus all his energies on efficient execution, that is, the player's analysis may be
limited to the pieces directly partaking in the combination. This approach is the crux of the combination and the
combinational style of play.[77]: 57
"Unlike the combinational player, the positional player is occupied, first and foremost, with the elaboration of
the position that will allow him to develop in the unknown future. In playing the positional style, the player must
evaluate relational and material parameters as independent variables. ... The positional style gives the player the
opportunity to develop a position until it becomes pregnant with a combination. However, the combination is
not the final goal of the positional player – it helps him to achieve the desirable, keeping in mind a
predisposition for the future development. The pyrrhic victory is the best example of one's inability to think
positionally."[78]
Other studies suggest that national or cross-cultural differences in decision-making exist across entire
societies. For example, Maris Martinsons has found that American, Japanese and Chinese business leaders each
exhibit a distinctive national style of decision-making.[82]
The Myers-Briggs typology has been the subject of criticism regarding its poor psychometric properties. [83][84][85]
The rational style is an in-depth search for, and a strong consideration of, other options and/or information
prior to making a decision. In this style, the individual would research the new job being offered, review
their current job, and look at the pros and cons of taking the new job versus staying with their current
company.
The intuitive style is confidence in one's initial feelings and gut reactions. In this style, if the individual
initially prefers the new job because they have a feeling that the work environment is better suited for them,
then they would decide to take the new job. The individual might not make this decision as soon as the job
is offered.
The dependent style is asking for other people's input and instructions on what decision should be made. In
this style, the individual could ask friends, family, coworkers, etc., but the individual might not ask all of
these people.
The avoidant style is averting the responsibility of making a decision. In this style, the individual would not
make a decision. Therefore, the individual would stick with their current job.
The spontaneous style is a need to make a decision as soon as possible rather than waiting to make a
decision. In this style, the individual would either reject or accept the job as soon as it is offered.
6 reasons why we make bad decisions and how to avoid them
1. We seek social approval
We do make a lot of bad decisions because of our beliefs, because we are
ignorant, because we refuse to put in the effort or otherwise ignore signs that are
right in front of us. But we also make bad decisions when we know what's
better. Regardless of our beliefs or knowledge, within certain social contexts,
we may choose what’s easy over what’s right.
We know what a perfectly good choice is, but cave in to the social context in
which the decision is made. When everyone around you is saying something
and you have something entirely different to say, it may be hard to make the
right decision for yourself:
Think of a decision in front of an authority figure like your boss. Do you
express your disagreement or nod in agreement with the most popular
decision?
When part of a group discussion, do you find your thinking swayed towards
the collective opinion?
Stanford University sociologist Mark Granovetter calls this the threshold model of collective behavior, which states that an individual's behavior depends on the number of other individuals already engaging in that behavior. The threshold is the number or proportion of others who must make a decision before you decide to go along with their opinion. Different individuals have different thresholds, which may be influenced by factors like socioeconomic status, education, age, personality, etc.
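Granovetter's model can be simulated in a few lines: each person joins once the number of people already participating meets their personal threshold, and adoption cascades (or stalls) accordingly. The thresholds below are hypothetical.

```python
# Granovetter-style threshold model: each person adopts a behavior once the
# count of people already doing it reaches their personal threshold.
# Thresholds below are hypothetical.

thresholds = [0, 1, 2, 3, 4, 5, 6, 7, 8, 10]  # one value per person

adopters = 0
changed = True
while changed:
    changed = False
    new_adopters = sum(1 for t in thresholds if t <= adopters)
    if new_adopters != adopters:
        adopters = new_adopters
        changed = True

print(f"{adopters} of {len(thresholds)} people end up joining")
# With these thresholds the cascade reaches 9 of 10 people. If the person with
# threshold 1 had threshold 2 instead, it would stall after a single adopter --
# Granovetter's point that small threshold shifts can change the collective outcome.
```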
People who don’t give in to social context are the ones who don’t need approval
from others to tell them what a good decision is. Even if it means opposing and standing up to everyone else, they aren't afraid to do it. They don't give in to peer
pressure and groupthink. They aren’t afraid to stand out. They place more
emphasis on being right than being liked. They don’t make decisions based on
social approval.
To catch yourself, ask these questions:
Another source of bad decisions is relying solely on our instinct. In his classic Thinking, Fast and Slow, Daniel Kahneman, a psychologist and economist notable for his work on the psychology of judgement and decision-making, stresses that our gut instinct can fail us when we apply familiar patterns of experience to unrelated circumstances or situations. We make quick decisions based on our past experience, which is a key source of noise and bias. He asks us to resist premature intuition and says "Intuition should not be banned, but it should be informed, disciplined and delayed."
To catch yourself, ask these questions:
Are you relying solely on your intuition to make this decision? What data
supports your thinking?
How are you sure this data is not biased?
What other dissenting pieces of information have you considered?
What do others have to say about this decision?
Follow this simple rule: Combine your intuitive reasoning with scientific
knowledge and data. Ask others for disconfirming opinions and evidence to
enable you to see both sides of the argument and not only the one that appeals to
you.
It’s tempting to give in to good outcomes with small upside that are easily
accessible and visible to us without weighing in on the potentially large
downside of these decisions in the future. Our experiences and beliefs also limit
our ability to go beyond the natural and seek hard truths by asking difficult
questions, exploring unknown territories, and doubting what may seem like an
obvious choice.
Am I getting a lot of benefit by pursuing this option for the costs I need to
incur?
What am I willing to give up to continue on this path?
What else can be worth pursuing if I give this up?
Follow this simple rule: When you find it difficult to let go of something, ask
yourself: Is it grit, blindness or my own ego getting in the way?
She writes, "I'm willing to bet that your best decision preceded a good result and
the worst decision preceded a bad result. I have yet to come across someone
who doesn’t identify their best and worst results rather than their best and worst
decisions. I never seem to come across anyone who identifies a bad decision
where they got lucky with the result, or a well-reasoned decision that didn’t pan
out. We link results with decisions even though it is easy to point out
indisputable examples where the relationship between decisions and results isn’t
so perfectly correlated.” In other words, we equate the quality of a decision with
the quality of its outcome.
She explains why this happens: "When we work backward from results to figure
out why those things happened, we are susceptible to a variety of cognitive
traps, like assuming causation when there is only a correlation, or cherry-
picking data to confirm the narrative we prefer. We will pound a lot of square
pegs into round holes to maintain the illusion of a tight relationship between our
outcomes and our decisions.” We are not only bad at separating luck and skill,
we also refuse to accept that some results can be beyond our control.
Drawing a strong connection between our results and the quality of the
decisions preceding them puts us at the risk of making bad decisions every day.
We fail to learn from our past decisions as we fail to separate bad decisions
from good ones.
She suggests, "What makes a decision great is not that it has a great outcome. A
great decision is the result of a good process, and that process must include an
attempt to accurately represent our own state of knowledge. That state of
knowledge, in turn, is some variation of I’m not sure.”
Do I think this decision is good or bad purely on the basis of its outcome?
Did I follow the right process to make this decision?
Was I missing critical information that could have led to a different
decision?
Follow this simple rule: Instead of trying to make a decision where you are
100% sure, embrace uncertainty. Evaluate different options based on the
probability that a specific outcome will occur. Your experience and knowledge
of those around you will determine the accuracy of your evaluations.
Summary
1. Learning to catch bad decisions is the only way to enhance the quality of our
future decisions and avoid making decisions we end up regretting.
2. Even with the knowledge of a perfectly good decision, we can incline
towards collective opinion within certain social contexts.
3. We stick to our opinion by rejecting disconfirming pieces of evidence and
collecting information that matches our point of view.
4. We optimize for a small gain in the moment without evaluating the costs we
need to incur in future.
5. Our ego traps us with a bad decision we made earlier and instead of
changing course, we continue investing in it.
6. We identify the quality of our decision based on its outcome instead of
focusing on the process that leads to improvement.
7. When tired, hungry or emotionally charged, we tend to make really terrible
decisions.