
Figure 8.14 Proactive and Retroactive Interference

Retroactive and proactive interference can both influence memory.

The Structure of LTM: Categories, Prototypes, and Schemas

Memories that are stored in LTM are not isolated but rather are linked together into categories—
networks of associated memories that have features in common with each other. Forming
categories, and using categories to guide behavior, is a fundamental part of human nature.
Associated concepts within a category are connected through spreading activation, which occurs
when activating one element of a category activates other associated elements. For instance,
because tools are associated in a category, reminding people of the word “screwdriver” will help
them remember the word “wrench.” And, when people have learned lists of words that come
from different categories (e.g., as in Note 8.33 "Retrieval Demonstration"), they do not recall the

information haphazardly. If they have just remembered the word “wrench,” they are more likely
to remember the word “screwdriver” next than they are to remember the word “dahlia,” because
the words are organized in memory by category and because “dahlia” is activated by spreading
activation from “wrench” (Srull & Wyer, 1989). [12]
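
For readers comfortable with a little programming, the sketch below (in Python) models spreading activation over a tiny, made-up semantic network; the concepts, link weights, and decay value are illustrative assumptions rather than parameters from the research cited above.

    # A minimal sketch of spreading activation in a toy semantic network.
    # The categories, weights, and decay factor are illustrative assumptions.
    from collections import defaultdict

    # Associative links between concepts (larger weight = closer association)
    links = {
        "screwdriver": {"wrench": 0.8, "hammer": 0.7},
        "wrench": {"screwdriver": 0.8, "hammer": 0.6},
        "hammer": {"screwdriver": 0.7, "wrench": 0.6},
        "dahlia": {"rose": 0.9},
        "rose": {"dahlia": 0.9},
    }

    def spread_activation(start, steps=2, decay=0.5):
        """Activate `start`, then pass a decaying share of activation to its neighbors."""
        activation = defaultdict(float)
        activation[start] = 1.0
        frontier = {start}
        for _ in range(steps):
            next_frontier = set()
            for node in frontier:
                for neighbor, weight in links.get(node, {}).items():
                    boost = activation[node] * weight * decay
                    if boost > activation[neighbor]:
                        activation[neighbor] = boost
                        next_frontier.add(neighbor)
            frontier = next_frontier
        return dict(activation)

    # Retrieving "wrench" leaves "screwdriver" and "hammer" partially activated,
    # while "dahlia" receives no activation at all, which is why the related tool
    # words are the more likely ones to be recalled next.
    print(spread_activation("wrench"))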

Some categories have defining features that must be true of all members of the category. For
instance, all members of the category “triangles” have three sides, and all members of the
category “birds” lay eggs. But most categories are not so well-defined; the members of the
category share some common features, but it is impossible to define which are or are not
members of the category. For instance, there is no clear definition of the category “tool.” Some
examples of the category, such as a hammer and a wrench, are clearly and easily identified as
category members, whereas other members are not so obvious. Is an ironing board a tool? What
about a car?

Members of categories (even those with defining features) can be compared to the
category prototype, which is the member of the category that is most average or typical of the
category. Some category members are more prototypical of, or similar to, the category than
others. For instance, some category members (robins and sparrows) are highly prototypical of the
category “birds,” whereas other category members (penguins and ostriches) are less prototypical.
We retrieve information that is prototypical of a category faster than we retrieve information that
is less prototypical (Rosch, 1975). [13]

Mental categories are sometimes referred to as schemas—patterns of knowledge in long-term memory that help us organize information. We have schemas about objects (that a triangle has
three sides and may take on different angles), about people (that Sam is friendly, likes to golf,
and always wears sandals), about events (the particular steps involved in ordering a meal at a
restaurant), and about social groups (we call these group schemas stereotypes).

Schemas are important in part because they help us remember new information by providing an
organizational structure for it. Read the following paragraph (Bransford & Johnson,
1972) [14] and then try to write down everything you can remember.

The procedure is actually quite simple. First you arrange things into different groups. Of course,
one pile may be sufficient depending on how much there is to do. If you have to go somewhere
else due to lack of facilities, that is the next step; otherwise you are pretty well set. It is important
not to overdo things. That is, it is better to do too few things at once than too many. In the short
run this may not seem important, but complications can easily arise. A mistake can be expensive
as well. At first the whole procedure will seem complicated. Soon, however, it will become just
another facet of life. It is difficult to foresee any end to the necessity for this task in the
immediate future, but then one never can tell. After the procedure is completed, one arranges the
materials into different groups again. Then they can be put into their appropriate places.
Eventually they will be used once more and the whole cycle will then have to be repeated.
However, that is part of life.

It turns out that people’s memory for this information is quite poor, unless they have been told
ahead of time that the information describes “doing the laundry,” in which case their memory for
the material is much better. This demonstration of the role of schemas in memory shows how our
existing knowledge can help us organize new information, and how this organization can
improve encoding, storage, and retrieval.

The Biology of Memory

Just as information is stored on digital media such as DVDs and flash drives, the information in
LTM must be stored in the brain. The ability to maintain information in LTM involves a gradual
strengthening of the connections among the neurons in the brain. When pathways in these neural
networks are frequently and repeatedly fired, the synapses become more efficient in
communicating with each other, and these changes create memory. This process, known as long-
term potentiation (LTP), refers to the strengthening of the synaptic connections between neurons
as result of frequent stimulation (Lynch, 2002). [15] Drugs that block LTP reduce learning,
whereas drugs that enhance LTP increase learning (Lynch et al., 1991). [16] Because the new
patterns of activation in the synapses take time to develop, LTP happens gradually. The period of
time in which LTP occurs and in which memories are stored is known as the period
of consolidation.

Memory is not confined to the cortex; it occurs through sophisticated interactions between new
and old brain structures (Figure 8.17 "Schematic Image of Brain With Hippocampus, Amygdala,
and Cerebellum Highlighted"). One of the most important brain regions in explicit memory is the
hippocampus, which serves as a preprocessor and elaborator of information (Squire,
1992). [17] The hippocampus helps us encode information about spatial relationships, the context
in which events were experienced, and the associations among memories (Eichenbaum,
1999). [18] The hippocampus also serves in part as a switching point that holds the memory for a
short time and then directs the information to other parts of the brain, such as the cortex, to
actually do the rehearsing, elaboration, and long-term storage (Jonides, Lacey, & Nee,
2005). [19] Without the hippocampus, which might be described as the brain’s “librarian,” our
explicit memories would be inefficient and disorganized.

Figure 8.17 Schematic Image of Brain With Hippocampus, Amygdala, and Cerebellum Highlighted

Different brain structures help us remember different types of information. The hippocampus is particularly important in explicit memories, the cerebellum is particularly important in implicit memories, and the amygdala is particularly important in emotional memories.

While the hippocampus is handling explicit memory, the cerebellum and the amygdala are
concentrating on implicit and emotional memories, respectively. Research shows that the
cerebellum is more active when we are learning associations and in priming tasks, and animals
and humans with damage to the cerebellum have more difficulty in classical conditioning studies
(Krupa, Thompson, & Thompson, 1993; Woodruff-Pak, Goldenberg, Downey-Lamb, Boyko, &
Lemieux, 2000). [20] The storage of many of our most important emotional memories, and
particularly those related to fear, is initiated and controlled by the amygdala (Sigurdsson,
Doyère, Cain, & LeDoux, 2007). [21]

Evidence for the role of different brain structures in different types of memories comes in part
from case studies of patients who suffer from amnesia, a memory disorder that involves the
inability to remember information. As with memory interference effects, amnesia can work in
either a forward or a backward direction, affecting retrieval or encoding. For people who suffer
damage to the brain, for instance, as a result of a stroke or other trauma, the amnesia may work
backward. The outcome is retrograde amnesia, a memory disorder that produces an inability to
retrieve events that occurred before a given time. Demonstrating the fact that LTP takes time
(the process of consolidation), retrograde amnesia is usually more severe for memories that
occurred just prior to the trauma than it is for older memories, and events that occurred just
before the event that caused memory loss may never be recovered because they were never
completely encoded.

Organisms with damage to the hippocampus develop a type of amnesia that works in a forward
direction to affect encoding, known as anterograde amnesia. Anterograde amnesia is the
inability to transfer information from short-term into long-term memory, making it impossible to
form new memories. One well-known case study was a man named Henry Gustav Molaison
(before he died in 2008, he was referred to only as H. M.) who had parts of his hippocampus
removed to reduce severe seizures (Corkin, Amaral, González, Johnson, & Hyman,

1997). [22] Following the operation, Molaison developed virtually complete anterograde amnesia.
Although he could remember most of what had happened before the operation, and particularly
what had occurred early in his life, he could no longer create new memories. Molaison was said
to have read the same magazines over and over again without any awareness of having seen them
before.

Cases of anterograde amnesia also provide information about the brain structures involved in
different types of memory (Bayley & Squire, 2005; Helmuth, 1999; Paller, 2004). [23] Although
Molaison’s explicit memory was compromised because his hippocampus was damaged, his
implicit memory was not (because his cerebellum was intact). He could learn to trace shapes in a
mirror, a task that requires procedural memory, but he never had any explicit recollection of
having performed this task or of the people who administered the test to him.

Although some brain structures are particularly important in memory, this does not mean that all
memories are stored in one place. The American psychologist Karl Lashley (1929) [24] attempted
to determine where memories were stored in the brain by teaching rats how to run mazes, and
then lesioning different brain structures to see if they were still able to complete the maze. This
idea seemed straightforward, and Lashley expected to find that memory was stored in certain
parts of the brain. But he discovered that no matter where he removed brain tissue, the rats
retained at least some memory of the maze, leading him to conclude that memory isn’t located in
a single place in the brain, but rather is distributed around it.

Long-term potentiation occurs as a result of changes in the synapses, which suggests that
chemicals, particularly neurotransmitters and hormones, must be involved in memory. There is
quite a bit of evidence that this is true. Glutamate, a neurotransmitter and a form of the amino
acid glutamic acid, is perhaps the most important neurotransmitter in memory (McEntee &
Crook, 1993). [25] When animals, including people, are under stress, more glutamate is secreted,
and this glutamate can help them remember (McGaugh, 2003). [26] The
neurotransmitter serotonin is also secreted when animals learn, and epinephrine may also
increase memory, particularly for stressful events (Maki & Resnick, 2000; Sherwin,
1998). [27] Estrogen, a female sex hormone, also seems critical, because women who are

experiencing menopause, along with a reduction in estrogen, frequently report memory
difficulties (Chester, 2001). [28]

Our knowledge of the role of biology in memory suggests that it might be possible to use drugs
to improve our memories, and Americans spend several hundred million dollars per year on
memory supplements with the hope of doing just that. Yet controlled studies comparing memory
enhancers, including Ritalin (methylphenidate), ginkgo biloba, and amphetamines, with placebo
drugs find very little evidence for their effectiveness (Gold, Cahill, & Wenk, 2002; McDaniel,
Maier, & Einstein, 2002). [29] Memory supplements are usually no more effective than drinking a
sugared soft drink, which also releases glucose and thus improves memory slightly. This is not to
say that we cannot someday create drugs that will significantly improve our memory. It is likely
that this will occur in the future, but the implications of these advances are as yet unknown
(Farah et al., 2004; Turner & Sahakian, 2006). [30]

Although the most obvious potential use of drugs is to attempt to improve memory, drugs might
also be used to help us forget. This might be desirable in some cases, such as for those suffering
from posttraumatic stress disorder (PTSD) who are unable to forget disturbing memories.
Although there are no existing therapies that involve using drugs to help people forget, it is
possible that they will be available in the future. These possibilities will raise some important
ethical issues: Is it ethical to erase memories, and if it is, is it desirable to do so? Perhaps the
experience of emotional pain is a part of being a human being. And perhaps the experience of
emotional pain may help us cope with the trauma.

KEY TAKEAWAYS

• Information is better remembered when it is meaningfully elaborated.

• Hermann Ebbinghaus made important contributions to the study of learning, including modeling the forgetting curve, and studying the spacing effect and the benefits of overlearning.

• Context- and state-dependent learning, as well as primacy and recency effects, influence long-term memory.

• Memories are stored in connected synapses through the process of long-term potentiation (LTP). In addition to the cortex, other parts of the brain, including the hippocampus, cerebellum, and the amygdala, are also important in memory.

• Damage to the brain may result in retrograde amnesia or anterograde amnesia. Case studies of patients with amnesia can provide information about the brain structures involved in different types of memory.

• Memory is influenced by chemicals including glutamate, serotonin, epinephrine, and estrogen.

• Studies comparing memory enhancers with placebo drugs find very little evidence for their effectiveness.
EXERCISES AND CRITICAL THINKING

1. Plan a course of action to help you study for your next exam, incorporating as many of the techniques mentioned in this section as possible. Try to implement the plan.

2. Make a list of some of the schemas that you have stored in your memory. What are the contents of each schema, and how might you use the schema to help you remember new information?

3. In the film “Eternal Sunshine of the Spotless Mind,” the characters undergo a medical procedure designed to erase their memories of a painful romantic relationship. Would you undergo such a procedure if it were safely offered to you?

[1] Nickerson, R. S., & Adams, M. J. (1979). Long-term memory for a common object. Cognitive Psychology, 11(3), 287–307.

[2] Craik, F. I., & Lockhart, R. S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning &

Verbal Behavior, 11(6), 671–684; Harris, J. L., & Qualls, C. D. (2000). The association of elaborative or maintenance rehearsal

with age, reading comprehension and verbal working memory performance. Aphasiology, 14(5–6), 515–526.

[3] Rogers, T. B., Kuiper, N. A., & Kirker, W. S. (1977). Self-reference and the encoding of personal information. Journal of

Personality & Social Psychology, 35(9), 677–688.

[4] Symons, C. S., & Johnson, B. T. (1997). The self-reference effect in memory: A meta-analysis. Psychological Bulletin, 121(3),

371–394.

[5] Bahrick, H. P. (1984). Semantic memory content in permastore: Fifty years of memory for Spanish learned in school. Journal

of Experimental Psychology: General, 113(1), 1–29.

[6] Driskell, J. E., Willis, R. P., & Copper, C. (1992). Effect of overlearning on retention. Journal of Applied Psychology, 77(5), 615–

622.

[7] Godden, D. R., & Baddeley, A. D. (1975). Context-dependent memory in two natural environments: On land and

underwater. British Journal of Psychology, 66(3), 325–331.

[8] Jackson, A., Koek, W., & Colpaert, F. (1992). NMDA antagonists make learning and recall state-dependent. Behavioural

Pharmacology, 3(4), 415.

[9] Marian, V., & Kaushanskaya, M. (2007). Language context guides memory content. Psychonomic Bulletin and Review, 14(5),

925–933.

[10] Bower, G. H. (1981). Mood and memory. American Psychologist, 36, 129–148; Eich, E. (2008). Mood and memory at 26:

Revisiting the idea of mood mediation in drug-dependent and place-dependent memory. In M. A. Gluck, J. R. Anderson, & S. M.

Kosslyn (Eds.), Memory and mind: A festschrift for Gordon H. Bower (pp. 247–260). Mahwah, NJ: Lawrence Erlbaum Associates.

[11] Baddeley, A., Eysenck, M. W., & Anderson, M. C. (2009). Memory. New York, NY: Psychology Press.

[12] Srull, T., & Wyer, R. (1989). Person memory and judgment. Psychological Review, 96(1), 58–83.

[13] Rosch, E. (1975). Cognitive representations of semantic categories. Journal of Experimental Psychology: General, 104(3),

192–233.

[14] Bransford, J. D., & Johnson, M. K. (1972). Contextual prerequisites for understanding: Some investigations of

comprehension and recall. Journal of Verbal Learning & Verbal Behavior, 11(6), 717–726.

[15] Lynch, G. (2002). Memory enhancement: The search for mechanism-based drugs. Nature Neuroscience, 5(Suppl.), 1035–

1038.

[16] Lynch, G., Larson, J., Staubli, U., Ambros-Ingerson, J., Granger, R., Lister, R. G.,…Weingartner, H. J. (1991). Long-term

potentiation and memory operations in cortical networks. In C. A. Wickliffe, M. Corballis, & G. White (Eds.), Perspectives on

cognitive neuroscience (pp. 110–131). New York, NY: Oxford University Press.

[17] Squire, L. R. (1992). Memory and the hippocampus: A synthesis from findings with rats, monkeys, and

humans. Psychological Review, 99(2), 195–231.

[18] Eichenbaum, H. (1999). Conscious awareness, memory, and the hippocampus. Nature Neuroscience, 2(9), 775–776.

[19] Jonides, J., Lacey, S. C., & Nee, D. E. (2005). Processes of working memory in mind and brain. Current Directions in

Psychological Science, 14(1), 2–5.

[20] Krupa, D. J., Thompson, J. K., & Thompson, R. F. (1993). Localization of a memory trace in the mammalian brain. Science,

260(5110), 989–991; Woodruff-Pak, D. S., Goldenberg, G., Downey-Lamb, M. M., Boyko, O. B., & Lemieux, S. K. (2000).

Cerebellar volume in humans related to magnitude of classical conditioning. Neuroreport: For Rapid Communication of

Neuroscience Research, 11(3), 609–615.

[21] Sigurdsson, T., Doyère, V., Cain, C. K., & LeDoux, J. E. (2007). Long-term potentiation in the amygdala: A cellular mechanism

of fear learning and memory. Neuropharmacology, 52(1), 215–227.

[22] Corkin, S., Amaral, D. G., González, R. G., Johnson, K. A., & Hyman, B. T. (1997). H. M.’s medial temporal lobe lesion:

Findings from magnetic resonance imaging. The Journal of Neuroscience, 17(10), 3964–3979.

[23] Bayley, P. J., & Squire, L. R. (2005). Failure to acquire new semantic knowledge in patients with large medial temporal lobe

lesions. Hippocampus, 15(2), 273–280; Helmuth, L. (1999). New role found for the hippocampus. Science, 285, 1339–1341;

Paller, K. A. (2004). Electrical signals of memory and of the awareness of remembering. Current Directions in Psychological

Science, 13(2), 49–55.

[24] Lashley, K. S. (1929). The effects of cerebral lesions subsequent to the formation of the maze habit: Localization of the

habit. In Brain mechanisms and intelligence: A quantitative study of injuries to the brain (pp. 86–108). Chicago, IL: University of

Chicago Press.

[25] McEntee, W., & Crook, T. (1993). Glutamate: Its role in learning, memory, and the aging

brain. Psychopharmacology, 111(4), 391–401.

[26] McGaugh, J. L. (2003). Memory and emotion: The making of lasting memories. New York, NY: Columbia University Press.

[27] Maki, P. M., & Resnick, S. M. (2000). Longitudinal effects of estrogen replacement therapy on PET cerebral blood flow and

cognition. Neurobiology of Aging, 21, 373–383; Sherwin, B. B. (1998). Estrogen and cognitive functioning in

women. Proceedings of the Society for Experimental Biological Medicine, 217, 17–22.

[28] Chester, B. (2001). Restoring remembering: Hormones and memory. McGill Reporter, 33(10). Retrieved

from http://www.mcgill.ca/reporter/33/10/sherwin

[29] Gold, P. E., Cahill, L., & Wenk, G. L. (2002). Ginkgo biloba: A cognitive enhancer? Psychological Science in the Public Interest,

3(1), 2–11; McDaniel, M. A., Maier, S. F., & Einstein, G. O. (2002). “Brain-specific” nutrients: A memory cure? Psychological

Science in the Public Interest, 3(1), 12–38.

[30] Farah, M. J., Illes, J., Cook-Deegan, R., Gardner, H., Kandel, E., King, P.,…Wolpe, P. R. (2004). Neurocognitive enhancement:

What can we do and what should we do? Nature Reviews Neuroscience, 5(5), 421–425; Turner, D. C., & Sahakian, B. J. (2006).

Analysis of the cognitive enhancing effects of modafinil in schizophrenia. In J. L. Cummings (Ed.), Progress in neurotherapeutics

and neuropsychopharmacology (pp. 133–147). New York, NY: Cambridge University Press.

8.3 Accuracy and Inaccuracy in Memory and Cognition
LEARNING OBJECTIVES

1. Outline the variables that can influence the accuracy of our memory for events.

2. Explain how schemas can distort our memories.

3. Describe the representativeness heuristic and the availability heuristic and explain how they may lead to errors in judgment.

As we have seen, our memories are not perfect. They fail in part due to our inadequate encoding
and storage, and in part due to our inability to accurately retrieve stored information. But
memory is also influenced by the setting in which it occurs, by the events that occur to us after
we have experienced an event, and by the cognitive processes that we use to help us remember.
Although our cognition allows us to attend to, rehearse, and organize information, cognition may
also lead to distortions and errors in our judgments and our behaviors.

In this section we consider some of the cognitive biases that are known to influence
humans. Cognitive biases are errors in memory or judgment that are caused by the inappropriate
use of cognitive processes (Table 8.3 "Cognitive Processes That Pose Threats to Accuracy"). The
study of cognitive biases is important both because it relates to the important psychological
theme of accuracy versus inaccuracy in perception, and because being aware of the types of
errors that we may make can help us avoid them and therefore improve our decision-making
skills.

Table 8.3 Cognitive Processes That Pose Threats to Accuracy

Source monitoring
Description: The ability to accurately identify the source of a memory.
Potential threat to accuracy: Uncertainty about the source of a memory may lead to mistaken judgments.

Confirmation bias
Description: The tendency to verify and confirm our existing memories rather than to challenge and disconfirm them.
Potential threat to accuracy: Once beliefs become established, they become self-perpetuating and difficult to change.

Functional fixedness
Description: When schemas prevent us from seeing and using information in new and nontraditional ways.
Potential threat to accuracy: Creativity may be impaired by the overuse of traditional, expectancy-based thinking.

Misinformation effect
Description: Errors in memory that occur when new but incorrect information influences existing accurate memories.
Potential threat to accuracy: Eyewitnesses who are questioned by the police may change their memories of what they observed at the crime scene.

Overconfidence
Description: When we are more certain that our memories and judgments are accurate than we should be.
Potential threat to accuracy: Eyewitnesses may be very confident that they have accurately identified a suspect, even though their memories are incorrect.

Salience
Description: When some stimuli (e.g., those that are colorful, moving, or unexpected) grab our attention, making them more likely to be remembered.
Potential threat to accuracy: We may base our judgments on a single salient event while we ignore hundreds of other equally informative events that we do not see.

Representativeness heuristic
Description: The tendency to make judgments according to how well the event matches our expectations.
Potential threat to accuracy: After a coin has come up “heads” many times in a row, we may erroneously think that the next flip is more likely to be “tails” (the gambler’s fallacy).

Availability heuristic
Description: The idea that things that come to mind easily are seen as more common.
Potential threat to accuracy: We may overestimate the crime statistics in our own area because these crimes are so easy to recall.

Cognitive accessibility
Description: The idea that some memories are more highly activated than others.
Potential threat to accuracy: We may think that we contributed more to a project than we really did because it is so easy to remember our own contributions.

Counterfactual thinking
Description: When we “replay” events such that they turn out differently (especially when only minor changes in the events leading up to them make a difference).
Potential threat to accuracy: We may feel particularly bad about events that might not have occurred if only a small change had occurred before them.

Source Monitoring: Did It Really Happen?

One potential error in memory involves mistakes in differentiating the sources of information. Source monitoring refers to the ability to accurately identify the source of a
memory. Perhaps you’ve had the experience of wondering whether you really experienced an
event or only dreamed or imagined it. If so, you wouldn’t be alone. Rassin, Merckelbach, and
Spaan (2001) [1] reported that up to 25% of college students reported being confused about real
versus dreamed events. Studies suggest that people who are fantasy-prone are more likely to
experience source monitoring errors (Winograd, Peluso, & Glover, 1998), [2] and such errors also

occur more often for both children and the elderly than for adolescents and younger adults
(Jacoby & Rhodes, 2006). [3]

In other cases we may be sure that we remembered the information from real life but be
uncertain about exactly where we heard it. Imagine that you read a news story in a tabloid
magazine such as the National Enquirer. Probably you would have discounted the information
because you know that its source is unreliable. But what if later you were to remember the story
but forget the source of the information? If this happens, you might become convinced that the
news story is true because you forget to discount it. The sleeper effect refers to attitude change
that occurs over time when we forget the source of information (Pratkanis, Greenwald, Leippe, &
Baumgardner, 1988). [4]

In still other cases we may forget where we learned information and mistakenly assume that we
created the memory ourselves. Kaavya Viswanathan, the author of the book How Opal Mehta
Got Kissed, Got Wild, and Got a Life, was accused of plagiarism when it was revealed that many
parts of her book were very similar to passages from other material. Viswanathan argued that she
had simply forgotten that she had read the other works, mistakenly assuming she had made up
the material herself. And the musician George Harrison claimed that he was unaware that the
melody of his song “My Sweet Lord” was almost identical to an earlier song by another
composer. The judge in the copyright suit that followed ruled that Harrison didn’t intentionally
commit the plagiarism. (Please use this knowledge to become extra vigilant about source
attributions in your written work, not to try to excuse yourself if you are accused of plagiarism.)

Schematic Processing: Distortions Based on Expectations

We have seen that schemas help us remember information by organizing material into coherent
representations. However, although schemas can improve our memories, they may also lead to
cognitive biases. Using schemas may lead us to falsely remember things that never happened to
us and to distort or misremember things that did. For one, schemas lead to the confirmation bias,
which is the tendency to verify and confirm our existing memories rather than to challenge and
disconfirm them. The confirmation bias occurs because once we have schemas, they influence
how we seek out and interpret new information. The confirmation bias leads us to remember

information that fits our schemas better than we remember information that disconfirms them
(Stangor & McMillan, 1992), [5] a process that makes our stereotypes very difficult to change.
And we ask questions in ways that confirm our schemas (Trope & Thompson, 1997). [6] If we
think that a person is an extrovert, we might ask her about ways that she likes to have fun,
thereby making it more likely that we will confirm our beliefs. In short, once we begin to believe
in something—for instance, a stereotype about a group of people—it becomes very difficult to
later convince us that these beliefs are not true; the beliefs become self-confirming.

Darley and Gross (1983) [7] demonstrated how schemas about social class could influence
memory. In their research they gave participants a picture and some information about a fourth-
grade girl named Hannah. To activate a schema about her social class, Hannah was pictured
sitting in front of a nice suburban house for one-half of the participants and pictured in front of
an impoverished house in an urban area for the other half. Then the participants watched a video
that showed Hannah taking an intelligence test. As the test went on, Hannah got some of the
questions right and some of them wrong, but the number of correct and incorrect answers was
the same in both conditions. Then the participants were asked to remember how many questions
Hannah got right and wrong. Demonstrating that stereotypes had influenced memory, the
participants who thought that Hannah had come from an upper-class background remembered
that she had gotten more correct answers than those who thought she was from a lower-class
background.

Our reliance on schemas can also make it more difficult for us to “think outside the box.” Peter
Wason (1960) [8] asked college students to determine the rule that was used to generate the
numbers 2-4-6 by asking them to generate possible sequences and then telling them if those
numbers followed the rule. The first guess that students made was usually “consecutive
ascending even numbers,” and they then asked questions designed to confirm their hypothesis
(“Does 102-104-106 fit?” “What about 404-406-408?”). Upon receiving information that those
guesses did fit the rule, the students stated that the rule was “consecutive ascending even
numbers.” But the students’ use of the confirmation bias led them to ask only about instances
that confirmed their hypothesis, and not about those that would disconfirm it. They never
bothered to ask whether 1-2-3 or 3-11-200 would fit, and if they had they would have learned
that the rule was not “consecutive ascending even numbers,” but simply “any three ascending

numbers.” Again, you can see that once we have a schema (in this case a hypothesis), we
continually retrieve that schema from memory rather than other relevant ones, leading us to act
in ways that tend to confirm our beliefs.
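
The logic of the demonstration can be made concrete with a short, hypothetical Python sketch; the rule definitions and test triples below simply mirror the examples in the study and are not part of the original materials.

    def students_hypothesis(a, b, c):
        # "Consecutive ascending even numbers"
        return a % 2 == 0 and b == a + 2 and c == b + 2

    def actual_rule(a, b, c):
        # "Any three ascending numbers"
        return a < b < c

    confirming_tests = [(102, 104, 106), (404, 406, 408)]
    disconfirming_tests = [(1, 2, 3), (3, 11, 200)]

    for triple in confirming_tests + disconfirming_tests:
        print(triple, "hypothesis:", students_hypothesis(*triple),
              "actual rule:", actual_rule(*triple))

    # The confirming triples satisfy both rules, so they cannot tell the two apart;
    # only the disconfirming triples reveal that the hypothesis is too narrow.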

Functional fixedness occurs when people’s schemas prevent them from using an object in new
and nontraditional ways. Duncker (1945) [9] gave participants a candle, a box of thumbtacks, and
a book of matches, and asked them to attach the candle to the wall so that it did not drip onto the
table below (Figure 8.19 "Functional Fixedness"). Few of the participants realized that the box
could be tacked to the wall and used as a platform to hold the candle. The problem again is that
our existing memories are powerful, and they bias the way we think about new information.
Because the participants were “fixated” on the box’s normal function of holding thumbtacks,
they could not see its alternative use.

Figure 8.19 Functional Fixedness

In the candle-tack-box problem, functional fixedness may lead us to see the box only as a box and not as a potential

candleholder.

Misinformation Effects: How Information That Comes Later Can Distort Memory

A particular problem for eyewitnesses such as Jennifer Thompson is that our memories are often
influenced by the things that occur to us after we have learned the information (Erdmann,
Volbert, & Böhm, 2004; Loftus, 1979; Zaragoza, Belli, & Payment, 2007). [10] This new
information can distort our original memories such that we are no longer sure which is the real
information and which was provided later. The misinformation effect refers to errors in memory
that occur when new information influences existing memories.

In an experiment by Loftus and Palmer (1974), [11] participants viewed a film of a traffic
accident and then, according to random assignment to experimental conditions, answered one of
three questions:

“About how fast were the cars going when they hit each other?”

“About how fast were the cars going when they smashed each other?”

“About how fast were the cars going when they contacted each other?”

As you can see in Figure 8.20 "Misinformation Effect", although all the participants saw the
same accident, their estimates of the cars’ speed varied by condition. Participants who had been
asked about the cars “smashing” each other estimated the highest average speed, and those who
had been asked the “contacted” question estimated the lowest average speed.

Figure 8.20 Misinformation Effect

Participants viewed a film of a traffic accident and then answered a question about the accident. According to

random assignment, the verb in the question was either “hit,” “smashed,” or “contacted.” The

wording of the question influenced the participants’ memory of the accident.

Source: Adapted from Loftus, E. F., & Palmer, J. C. (1974). Reconstruction of automobile destruction: An example of

the interaction between language and memory. Journal of Verbal Learning & Verbal Behavior, 13(5), 585–589.

In addition to distorting our memories for events that have actually occurred, misinformation
may lead us to falsely remember information that never occurred. Loftus and her colleagues
asked parents to provide them with descriptions of events that did (e.g., moving to a new house)
and did not (e.g., being lost in a shopping mall) happen to their children. Then (without telling
the children which events were real or made-up) the researchers asked the children to imagine
both types of events. The children were instructed to “think real hard” about whether the events
had occurred (Ceci, Huffman, Smith, & Loftus, 1994). [12] More than half of the children
generated stories regarding at least one of the made-up events, and they remained insistent that
the events did in fact occur even when told by the researcher that they could not possibly have
occurred (Loftus & Pickrell, 1995). [13] Even college students are susceptible to manipulations
that make events that did not actually occur seem as if they did (Mazzoni, Loftus, & Kirsch,
2001). [14]

The ease with which memories can be created or implanted is particularly problematic when the
events to be recalled have important consequences. Therapists often argue that patients may
repress memories of traumatic events they experienced as children, such as childhood sexual

abuse, and then recover the events years later as the therapist leads them to recall the
information—for instance, by using dream interpretation and hypnosis (Brown, Scheflin, &
Hammond, 1998). [15]

But other researchers argue that painful memories such as sexual abuse are usually very well
remembered, that few memories are actually repressed, and that even if they are it is virtually
impossible for patients to accurately retrieve them years later (McNally, Bryant, & Ehlers, 2003;
Pope, Poliakoff, Parker, Boynes, & Hudson, 2007). [16] These researchers have argued that the
procedures used by the therapists to “retrieve” the memories are more likely to actually implant
false memories, leading the patients to erroneously recall events that did not actually occur.
Because hundreds of people have been accused, and even imprisoned, on the basis of claims
about “recovered memory” of child sexual abuse, the accuracy of these memories has important
societal implications. Many psychologists now believe that most of these claims of recovered
memories are due to implanted, rather than real, memories (Loftus & Ketcham, 1994). [17]

Overconfidence

One of the most remarkable aspects of Jennifer Thompson’s mistaken identification of Ronald Cotton
was her certainty. But research reveals a pervasive cognitive bias toward overconfidence, which
is the tendency for people to be too certain about their ability to accurately remember events and
to make judgments. David Dunning and his colleagues (Dunning, Griffin, Milojkovic, & Ross,
1990) [18] asked college students to predict how another student would react in various situations.
Some participants made predictions about a fellow student whom they had just met and
interviewed, and others made predictions about their roommates whom they knew very well. In
both cases, participants reported their confidence in each prediction, and accuracy was
determined by the responses of the people themselves. The results were clear: Regardless of
whether they judged a stranger or a roommate, the participants consistently overestimated the
accuracy of their own predictions.

Eyewitnesses to crimes are also frequently overconfident in their memories, and there is only a
small correlation between how accurate and how confident an eyewitness is. The witness who
claims to be absolutely certain about his or her identification (e.g., Jennifer Thompson) is not

much more likely to be accurate than one who appears much less sure, making it almost
impossible to determine whether a particular witness is accurate or not (Wells & Olson,
2003). [19]

I am sure that you have a clear memory of when you first heard about the 9/11 attacks in 2001,
and perhaps also when you heard that Princess Diana was killed in 1997 or when the verdict of
the O. J. Simpson trial was announced in 1995. This type of memory, which we experience along
with a great deal of emotion, is known as a flashbulb memory—a vivid and emotional memory of
an unusual event that people believe they remember very well (Brown & Kulik, 1977). [20]

People are very certain of their memories of these important events, and frequently
overconfident. Talarico and Rubin (2003) [21] tested the accuracy of flashbulb memories by
asking students to write down their memory of how they had heard the news about either the
September 11, 2001, terrorist attacks or about an everyday event that had occurred to them
during the same time frame. These recordings were made on September 12, 2001. Then the
participants were asked again, either 1, 6, or 32 weeks later, to recall their memories. The
participants became less accurate in their recollections of both the emotional event and the
everyday events over time. But the participants’ confidence in the accuracy of their memory of
learning about the attacks did not decline over time. After 32 weeks the participants were
overconfident; they were much more certain about the accuracy of their flashbulb memories than
they should have been. Schmolck, Buffalo, and Squire (2000) [22] found similar distortions in
memories of news about the verdict in the O. J. Simpson trial.

Heuristic Processing: Availability and Representativeness

Another way that our information processing may be biased occurs when we use heuristics,
which are information-processing strategies that are useful in many cases but may lead to errors
when misapplied. Let’s consider two of the most frequently applied (and misapplied) heuristics:
the representativeness heuristic and the availability heuristic.

In many cases we base our judgments on information that seems to represent, or match, what we
expect will happen, while ignoring other potentially more relevant statistical information. When
we do so, we are using the representativeness heuristic. Consider, for instance, the puzzle

presented in Table 8.4 "The Representativeness Heuristic". Let’s say that you went to a hospital,
and you checked the records of the babies that were born today. Which pattern of births do you
think you are most likely to find?

Table 8.4 The Representativeness Heuristic


Time          List A    List B

6:31 a.m.     Girl      Boy
8:15 a.m.     Girl      Girl
9:42 a.m.     Girl      Boy
1:13 p.m.     Girl      Girl
3:39 p.m.     Boy       Girl
5:12 p.m.     Boy       Boy
7:42 p.m.     Boy       Girl
11:44 p.m.    Boy       Boy

Using the representativeness heuristic may lead us to incorrectly believe that some patterns of observed events are
more likely to have occurred than others. In this case, list B seems more random, and thus is judged as more likely
to have occurred, but statistically both lists are equally likely.

Most people think that list B is more likely, probably because list B looks more random, and thus
matches (is “representative of”) our ideas about randomness. But statisticians know that any
pattern of four girls and four boys is mathematically equally likely. The problem is that we have
a schema of what randomness should be like, which doesn’t always match what is
mathematically the case. Similarly, people who see a flipped coin come up “heads” five times in
a row will frequently predict, and perhaps even wager money, that “tails” will be next. This
behavior is known as the gambler’s fallacy. But mathematically, the gambler’s fallacy is an
error: The likelihood of any single coin flip being “tails” is always 50%, regardless of how many
times it has come up “heads” in the past.
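
If you want to verify the arithmetic yourself, the rough Python sketch below will do it; the 50/50 birth odds, the fair coin, and the simulation size are simplifying assumptions.

    from fractions import Fraction
    import random

    # Any specific sequence of eight independent 50/50 births has probability (1/2)^8,
    # whether it looks orderly (List A) or random (List B).
    p_list_a = Fraction(1, 2) ** 8
    p_list_b = Fraction(1, 2) ** 8
    print(p_list_a, p_list_b, p_list_a == p_list_b)    # 1/256 1/256 True

    # Gambler's fallacy check: after five heads in a row, tails is still about 50%.
    random.seed(0)
    tails_after_streak = 0
    streaks_seen = 0
    while streaks_seen < 20_000:
        flips = [random.random() < 0.5 for _ in range(6)]   # True = heads
        if all(flips[:5]):                                  # first five flips were heads
            streaks_seen += 1
            tails_after_streak += not flips[5]              # count a tails on flip six
    print(tails_after_streak / streaks_seen)                # close to 0.5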

Our judgments can also be influenced by how easy it is to retrieve a memory. The tendency to
make judgments of the frequency or likelihood that an event occurs on the basis of the ease with
which it can be retrieved from memory is known as the availability heuristic (MacLeod &

Campbell, 1992; Tversky & Kahneman, 1973). [23] Imagine, for instance, that I asked you to
indicate whether there are more words in the English language that begin with the letter “R” or
that have the letter “R” as the third letter. You would probably answer this question by trying to
think of words that have each of the characteristics, thinking of all the words you know that
begin with “R” and all that have “R” in the third position. Because it is much easier to retrieve
words by their first letter than by their third, we may incorrectly guess that there are more words
that begin with “R,” even though there are in fact more words that have “R” as the third letter.
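
A rough way to check the letter counts is sketched below in Python; the word-list path is only an assumed example (a common Unix location), and any plain-text list of English words could be substituted.

    # Count words that begin with "r" versus words with "r" as the third letter.
    # The path below is an assumed Unix word-list location, not a guaranteed file.
    WORDLIST = "/usr/share/dict/words"

    with open(WORDLIST) as f:
        words = [w.strip().lower() for w in f if w.strip().isalpha()]

    starts_with_r = sum(w.startswith("r") for w in words)
    r_as_third = sum(len(w) >= 3 and w[2] == "r" for w in words)

    # The text's claim is that the second count is the larger one, even though
    # words beginning with "r" are far easier to bring to mind.
    print(f"begin with 'r': {starts_with_r}; 'r' as third letter: {r_as_third}")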

The availability heuristic may also operate on episodic memory. We may think that our friends
are nice people because we see and remember them primarily when they are around us (their friends, to whom they are, of course, nice). And the traffic might seem worse in our own
neighborhood than we think it is in other places, in part because nearby traffic jams are more
easily retrieved than are traffic jams that occur somewhere else.

Salience and Cognitive Accessibility

Still another potential for bias in memory occurs because we are more likely to attend to, and
thus make use of and remember, some information more than other information. For one, we
tend to attend to and remember things that are highly salient, meaning that they attract our
attention. Things that are unique, colorful, bright, moving, and unexpected are more salient
(McArthur & Post, 1977; Taylor & Fiske, 1978). [24] In one relevant study, Loftus, Loftus, and
Messo (1987) [25] showed people images of a customer walking up to a bank teller and pulling
out either a pistol or a checkbook. By tracking eye movements, the researchers determined that
people were more likely to look at the gun than at the checkbook, and that this reduced their
ability to accurately identify the criminal in a lineup that was given later. The salience of the gun
drew people’s attention away from the face of the criminal.

The salience of the stimuli in our social worlds has a big influence on our judgment, and in some
cases may lead us to behave in ways that we would have been better off avoiding. Imagine, for instance, that
you wanted to buy a new music player for yourself. You’ve been trying to decide whether to get
the iPod or the Zune. You checked Consumer Reports online and found that, although the
players differed on many dimensions, including price, battery life, ability to share music, and so

forth, the Zune was nevertheless rated significantly higher by owners than was the iPod. As a
result, you decide to purchase the Zune the next day. That night, however, you go to a party, and
a friend shows you her iPod. You check it out, and it seems really cool. You tell her that you
were thinking of buying a Zune, and she tells you that you are crazy. She says she knows
someone who had one and it had a lot of problems—it didn’t download music correctly, the
battery died right after the warranty expired, and so forth—and that she would never buy one.
Would you still buy the Zune, or would you switch your plans?

If you think about this question logically, the information that you just got from your friend isn’t
really all that important. You now know the opinion of one more person, but that can’t change
the overall rating of the two machines very much. On the other hand, the information your friend
gives you, and the chance to use her iPod, are highly salient. The information is right there in
front of you, in your hand, whereas the statistical information from Consumer Reports is only in
the form of a table that you saw on your computer. The outcome in cases such as this is that
people frequently ignore the less salient but more important information, such as the likelihood
that events occur across a large population (these statistics are known as base rates), in favor of
the less important but nevertheless more salient information.

People also vary in the schemas that they find important to use when judging others and when
thinking about themselves. Cognitive accessibility refers to the extent to which knowledge is
activated in memory, and thus likely to be used in cognition and behavior. For instance, you
probably know a person who is a golf nut (or fanatic of another sport). All he can talk about is
golf. For him, we would say that golf is a highly accessible construct. Because he loves golf, it is
important to his self-concept, he sets many of his goals in terms of the sport, and he tends to
think about things and people in terms of it (“if he plays golf, he must be a good person!”). Other
people have highly accessible schemas about environmental issues, eating healthy food, or
drinking really good coffee. When schemas are highly accessible, we are likely to use them to
make judgments of ourselves and others, and this overuse may inappropriately color our
judgments.

Counterfactual Thinking

In addition to influencing our judgments about ourselves and others, the ease with which we can
retrieve potential experiences from memory can have an important effect on our own emotions.
If we can easily imagine an outcome that is better than what actually happened, then we may
experience sadness and disappointment; on the other hand, if we can easily imagine that a result
might have been worse than what actually happened, we may be more likely to experience
happiness and satisfaction. The tendency to think about and experience events according to
“what might have been” is known as counterfactual thinking (Kahneman & Miller, 1986; Roese,
2005). [26]

Imagine, for instance, that you were participating in an important contest, and you won the silver
(second-place) medal. How would you feel? Certainly you would be happy that you won the
silver medal, but wouldn’t you also be thinking about what might have happened if you had been
just a little bit better—you might have won the gold medal! On the other hand, how might you
feel if you won the bronze (third-place) medal? If you were thinking about the
counterfactuals (the “what might have beens”) perhaps the idea of not getting any medal at all
would have been highly accessible; you’d be happy that you got the medal that you did get,
rather than coming in fourth.

Tom Gilovich and his colleagues (Medvec, Madey, & Gilovich, 1995) [28] investigated this idea
by videotaping the responses of athletes who won medals in the 1992 Summer Olympic Games.
They videotaped the athletes both as they learned that they had won a silver or a bronze medal
and again as they were awarded the medal. Then the researchers showed these videos, without
any sound, to raters who did not know which medal which athlete had won. The raters were
asked to indicate how they thought the athlete was feeling, using a range of feelings from
“agony” to “ecstasy.” The results showed that the bronze medalists were, on average, rated as
happier than were the silver medalists. In a follow-up study, raters watched interviews with many
of these same athletes as they talked about their performance. The raters indicated what we
would expect on the basis of counterfactual thinking—the silver medalists talked about their
disappointments in having finished second rather than first, whereas the bronze medalists
focused on how happy they were to have finished third rather than fourth.

You might have experienced counterfactual thinking in other situations. Once I was driving
cross-country, and my car was having some engine trouble. As I neared the end of the journey, I
really wanted to make it home; I would have been extremely disappointed if the car had
broken down only a few miles from my home. Perhaps you have noticed that once you get close
to finishing something, you feel like you really need to get it done. Counterfactual thinking has
even been observed in juries. Jurors who were asked to award monetary damages to others who
had been in an accident offered them substantially more in compensation if they barely avoided
injury than they offered if the accident seemed inevitable (Miller, Turnbull, & McFarland,
1988). [29]

Psychology in Everyday Life: Cognitive Biases in the Real World


Perhaps you are thinking that the kinds of errors that we have been talking about don’t seem that important. After all,

who really cares if we think there are more words that begin with the letter “R” than there actually are, or if bronze

medal winners are happier than the silver medalists? These aren’t big problems in the overall scheme of things. But it

turns out that what seem to be relatively small cognitive biases on the surface can have profound consequences for

people.

Why would so many people continue to purchase lottery tickets, buy risky investments in the stock market, or gamble

their money in casinos when the likelihood of them ever winning is so low? One possibility is that they are victims of

salience; they focus their attention on the salient likelihood of a big win, forgetting that the base rate of the event

occurring is very low. The belief in astrology, which all scientific evidence suggests is not accurate, is probably driven

in part by the salience of the occasions when the predictions are correct. When a horoscope comes true (which will, of

course, happen sometimes), the correct prediction is highly salient and may allow people to maintain the overall false

belief.

People may also take more care to prepare for unlikely events than for more likely ones, because the unlikely ones are

more salient. For instance, people may think that they are more likely to die from a terrorist attack or a homicide than

they are from diabetes, stroke, or tuberculosis. But the odds are much greater of dying from the latter than the former.

And people are frequently more afraid of flying than driving, although the likelihood of dying in a car crash is

hundreds of times greater than dying in a plane crash (more than 50,000 people are killed on U.S. highways every

year). Because people don’t accurately calibrate their behaviors to match the true potential risks (e.g., they drink and drive or don’t wear their seatbelts), the individual and societal level costs are often quite large (Slovic, 2000). [30]

Salience and accessibility also color how we perceive our social worlds, which may have a big influence on our

behavior. For instance, people who watch a lot of violent television shows also view the world as more dangerous (Doob & Macdonald, 1979), [31] probably because violence becomes more cognitively accessible for them. We also unfairly overestimate our contribution to joint projects (Ross & Sicoly, 1979), [32] perhaps in part because our own

contributions are highly accessible, whereas the contributions of others are much less so.

Even people who should know better, and who need to know better, are subject to cognitive biases. Economists, stock

traders, managers, lawyers, and even doctors make the same kinds of mistakes in their professional activities that people make in their everyday lives (Gilovich, Griffin, & Kahneman, 2002). [33] Just like us, these people are victims of

overconfidence, heuristics, and other biases.

Furthermore, every year thousands of individuals, such as Ronald Cotton, are charged with and often convicted of

crimes based largely on eyewitness evidence. When eyewitnesses testify in courtrooms regarding their memories of a

crime, they often are completely sure that they are identifying the right person. But the most common cause of innocent people being falsely convicted is erroneous eyewitness testimony (Wells, Wright, & Bradfield, 1999). [34] The

many people who were convicted by mistaken eyewitnesses prior to the advent of forensic DNA and who have now

been exonerated by DNA tests have certainly paid for all-too-common memory errors (Wells, Memon, & Penrod,
[35]
2006).

Although cognitive biases are common, they are not impossible to control, and psychologists and other scientists are

working to help people make better decisions. One possibility is to provide people with better feedback about their

judgments. Weather forecasters, for instance, learn to be quite accurate in their judgments because they have clear

feedback about the accuracy of their predictions. Other research has found that accessibility biases can be reduced by

leading people to consider multiple alternatives rather than focus only on the most obvious ones, and particularly by

leading people to think about outcomes opposite to the ones they are expecting (Lilienfeld, Ammirati, &
Landfield, 2009). [36] Forensic psychologists are also working to reduce the incidence of false identification by helping
police develop better procedures for interviewing both suspects and eyewitnesses (Steblay, Dysart, Fulero, & Lindsay, 2001). [37]

KEY TAKEAWAYS

• Our memories fail in part due to inadequate encoding and storage, and in part due to the inability to accurately

retrieve stored information.

• The human brain is wired to develop and make use of social categories and schemas. Schemas help us remember new

information but may also lead us to falsely remember things that never happened to us and to distort or

misremember things that did.

• A variety of cognitive biases influence the accuracy of our judgments.


EXERCISES AND CRITICAL THINKING

1. Consider a time when you were uncertain if you really experienced an event or only imagined it. What impact did this

have on you, and how did you resolve it?

2. Consider again some of the cognitive schemas that you hold in your memory. How do these knowledge structures bias

your information processing and behavior, and how might you prevent them from doing so?

3. Imagine that you were involved in a legal case in which an eyewitness claimed that he had seen a person commit a

crime. Based on your knowledge about memory and cognition, what techniques would you use to reduce the

possibility that the eyewitness was making a mistaken identification?

[1] Rassin, E., Merckelbach, H., & Spaan, V. (2001). When dreams become a royal road to confusion: Realistic dreams,

dissociation, and fantasy proneness. Journal of Nervous and Mental Disease, 189(7), 478–481.

[2] Winograd, E., Peluso, J. P., & Glover, T. A. (1998). Individual differences in susceptibility to memory illusions. Applied

Cognitive Psychology, 12(Spec. Issue), S5–S27.

[3] Jacoby, L. L., & Rhodes, M. G. (2006). False remembering in the aged. Current Directions in Psychological Science, 15(2), 49–

53.

[4] Pratkanis, A. R., Greenwald, A. G., Leippe, M. R., & Baumgardner, M. H. (1988). In search of reliable persuasion effects: III.

The sleeper effect is dead: Long live the sleeper effect. Journal of Personality and Social Psychology, 54(2), 203–218.

[5] Stangor, C., & McMillan, D. (1992). Memory for expectancy-congruent and expectancy-incongruent information: A review of

the social and social developmental literatures. Psychological Bulletin, 111(1), 42–61.

[6] Trope, Y., & Thompson, E. (1997). Looking for truth in all the wrong places? Asymmetric search of individuating information

about stereotyped group members. Journal of Personality and Social Psychology, 73, 229–241.

[7] Darley, J. M., & Gross, P. H. (1983). A hypothesis-confirming bias in labeling effects. Journal of Personality and Social

Psychology, 44, 20–33.

[8] Wason, P. (1960). On the failure to eliminate hypotheses in a conceptual task. The Quarterly Journal of Experimental

Psychology, 12(3), 129–140.

[9] Duncker, K. (1945). On problem-solving. Psychological Monographs, 58, 5.

[10] Erdmann, K., Volbert, R., & Böhm, C. (2004). Children report suggested events even when interviewed in a non-suggestive

manner: What are its implications for credibility assessment? Applied Cognitive Psychology, 18(5), 589–611; Loftus, E. F. (1979).

The malleability of human memory. American Scientist, 67(3), 312–320; Zaragoza, M. S., Belli, R. F., & Payment, K. E. (2007).

Misinformation effects and the suggestibility of eyewitness memory. In M. Garry & H. Hayne (Eds.), Do justice and let the sky

fall: Elizabeth Loftus and her contributions to science, law, and academic freedom (pp. 35–63). Mahwah, NJ: Lawrence Erlbaum

Associates.

[11] Loftus, E. F., & Palmer, J. C. (1974). Reconstruction of automobile destruction: An example of the interaction between

language and memory. Journal of Verbal Learning & Verbal Behavior, 13(5), 585–589.

[12] Ceci, S. J., Huffman, M. L. C., Smith, E., & Loftus, E. F. (1994). Repeatedly thinking about a non-event: Source

misattributions among preschoolers. Consciousness and Cognition: An International Journal, 3(3–4), 388–407.

[13] Loftus, E. F., & Pickrell, J. E. (1995). The formation of false memories. Psychiatric Annals, 25(12), 720–725.

[14] Mazzoni, G. A. L., Loftus, E. F., & Kirsch, I. (2001). Changing beliefs about implausible autobiographical events: A little

plausibility goes a long way. Journal of Experimental Psychology: Applied, 7(1), 51–59.

[15] Brown, D., Scheflin, A. W., & Hammond, D. C. (1998). Memory, trauma treatment, and the law. New York, NY: Norton.

[16] McNally, R. J., Bryant, R. A., & Ehlers, A. (2003). Does early psychological intervention promote recovery from

posttraumatic stress? Psychological Science in the Public Interest, 4(2), 45–79; Pope, H. G., Jr., Poliakoff, M. B., Parker, M. P.,

Boynes, M., & Hudson, J. I. (2007). Is dissociative amnesia a culture-bound syndrome? Findings from a survey of historical

literature. Psychological Medicine: A Journal of Research in Psychiatry and the Allied Sciences, 37(2), 225–233.

[17] Loftus, E. F., & Ketcham, K. (1994). The myth of repressed memory: False memories and allegations of sexual abuse (1st

ed.). New York, NY: St. Martin’s Press.

[18] Dunning, D., Griffin, D. W., Milojkovic, J. D., & Ross, L. (1990). The overconfidence effect in social prediction. Journal of

Personality and Social Psychology, 58(4), 568–581.

[19] Wells, G. L., & Olson, E. A. (2003). Eyewitness testimony. Annual Review of Psychology, 277–295.

[20] Brown, R., & Kulik, J. (1977). Flashbulb memories. Cognition, 5, 73–98.

[21] Talarico, J. M., & Rubin, D. C. (2003). Confidence, not consistency, characterizes flashbulb memories. Psychological Science,

14(5), 455–461.

[22] Schmolck, H., Buffalo, E. A., & Squire, L. R. (2000). Memory distortions develop over time: Recollections of the O. J.

Simpson trial verdict after 15 and 32 months. Psychological Science, 11(1), 39–45.

[23] MacLeod, C., & Campbell, L. (1992). Memory accessibility and probability judgments: An experimental evaluation of the

availability heuristic. Journal of Personality and Social Psychology, 63(6), 890–902; Tversky, A., & Kahneman, D. (1973).

Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5, 207–232.

[24] McArthur, L. Z., & Post, D. L. (1977). Figural emphasis and person perception. Journal of Experimental Social Psychology,

13(6), 520–535; Taylor, S. E., & Fiske, S. T. (1978). Salience, attention and attribution: Top of the head phenomena. Advances in

Experimental Social Psychology, 11, 249–288.

[25] Loftus, E. F., Loftus, G. R., & Messo, J. (1987). Some facts about “weapon focus.” Law and Human Behavior, 11(1), 55–62.

[26] Kahneman, D., & Miller, D. T. (1986). Norm theory: Comparing reality to its alternatives. Psychological Review, 93, 136–

153; Roese, N. (2005). If only: How to turn regret into opportunity. New York, NY: Broadway Books.

[27] Medvec, V. H., Madey, S. F., & Gilovich, T. (1995). When less is more: Counterfactual thinking and satisfaction among

Olympic medalists. Journal of Personality & Social Psychology, 69(4), 603–610.

[28] Medvec, V. H., Madey, S. F., & Gilovich, T. (1995). When less is more: Counterfactual thinking and satisfaction among

Olympic medalists. Journal of Personality & Social Psychology, 69(4), 603–610.

[29] Miller, D. T., Turnbull, W., & McFarland, C. (1988). Particularistic and universalistic evaluation in the social comparison

process. Journal of Personality and Social Psychology, 55, 908–917.

[30] Slovic, P. (Ed.). (2000). The perception of risk. London, England: Earthscan Publications.

[31] Doob, A. N., & Macdonald, G. E. (1979). Television viewing and fear of victimization: Is the relationship causal? Journal of

Personality and Social Psychology, 37(2), 170–179.

[32] Ross, M., & Sicoly, F. (1979). Egocentric biases in availability and attribution. Journal of Personality and Social Psychology,

37(3), 322–336.

[33] Gilovich, T., Griffin, D., & Kahneman, D. (2002). Heuristics and biases: The psychology of intuitive judgment. New York, NY:

Cambridge University Press.

[34] Wells, G. L., Wright, E. F., & Bradfield, A. L. (1999). Witnesses to crime: Social and cognitive factors governing the validity of

people’s reports. In R. Roesch, S. D. Hart, & J. R. P. Ogloff (Eds.), Psychology and law: The state of the discipline (pp. 53–87).

Dordrecht, Netherlands: Kluwer Academic Publishers.

[35] Wells, G. L., Memon, A., & Penrod, S. D. (2006). Eyewitness evidence: Improving its probative value. Psychological Science

in the Public Interest, 7(2), 45–75.

[36] Lilienfeld, S. O., Ammirati, R., & Landfield, K. (2009). Giving debiasing away: Can psychological research on correcting

cognitive errors promote human welfare? Perspectives on Psychological Science, 4(4), 390–398.

[37] Steblay, N., Dysart, J., Fulero, S., & Lindsay, R. C. L. (2001). Eyewitness accuracy rates in sequential and simultaneous lineup

presentations: A meta-analytic comparison. Law and Human Behavior, 25(5), 459–473.

8.4 Chapter Summary

Memory and cognition are the two major interests of cognitive psychologists. The cognitive
school was influenced in large part by the development of the electronic computer. Psychologists
conceptualize memory in terms of types, stages, and processes.

Explicit memory is assessed using measures in which the individual being tested must
consciously attempt to remember the information. Explicit memory includes semantic and
episodic memory. Explicit memory tests include recall memory tests, recognition memory tests,
and measures of relearning (also known as savings).

Implicit memory refers to the influence of experience on behavior, even if the individual is not
aware of those influences. Implicit memory is made up of procedural memory, classical
conditioning effects, and priming. Priming refers both to the activation of knowledge and to the
influence of that activation on behavior. An important characteristic of implicit memories is that
they are frequently formed and used automatically, without much effort or awareness on our part.

Sensory memory, including iconic and echoic memory, is a memory buffer that lasts only very
briefly and then, unless it is attended to and passed on for more processing, is forgotten.

Information that we turn our attention to may move into short-term memory (STM). STM is
limited in both the length and the amount of information it can hold. Working memory is a set of
memory procedures or operations that operates on the information in STM. Working memory’s
central executive directs the strategies used to keep information in STM, such as maintenance
rehearsal, visualization, and chunking.

Long-term memory (LTM) is memory storage that can hold information for days, months, and
years. The information that we want to remember in LTM must be encoded and stored, and then

retrieved. Some strategies for improving LTM include elaborative encoding, relating information
to the self, making use of the forgetting curve and the spacing effect, overlearning, and being
aware of context- and state-dependent retrieval effects.

Memories that are stored in LTM are not isolated but rather are linked together into categories
and schemas. Schemas are important in part because they help us encode and retrieve
information by providing an organizational structure for it.

The ability to maintain information in LTM involves a gradual strengthening of the connections
among the neurons in the brain, known as long-term potentiation (LTP). The hippocampus is
important in explicit memory, the cerebellum is important in implicit memory, and the amygdala
is important in emotional memory. A number of neurotransmitters are important in consolidation
and memory. Evidence for the role of different brain structures in different types of memories
comes in part from case studies of patients who suffer from amnesia.

Cognitive biases are errors in memory or judgment that are caused by the inappropriate use of
cognitive processes. These biases are caused by the overuse of schemas, the reliance on salient
and cognitively accessible information, and the use of rule-of-thumb strategies known as
heuristics. These biases include errors in source monitoring, the confirmation bias, functional
fixedness, the misinformation effect, overconfidence, and counterfactual thinking. Understanding
the potential cognitive errors we frequently make can help us make better decisions and engage
in more appropriate behaviors.

Chapter 9
Intelligence and Language
How We Talk (or Do Not Talk) about Intelligence
In January 2005, the president of Harvard University, Lawrence H. Summers, sparked an uproar during a

presentation at an economic conference on women and minorities in the science and engineering workforce. During

his talk, Summers proposed three reasons why there are so few women who have careers in math, physics, chemistry,

and biology. One explanation was that it might be due to discrimination against women in these fields, and a second

was that it might be a result of women’s preference for raising families rather than for competing in academia. But

Summers also argued that women might be less genetically capable of performing science and mathematics—that

they may have less “intrinsic aptitude” than do men.

Summers’s comments on genetics set off a flurry of responses. One of the conference participants, a biologist at the

Massachusetts Institute of Technology, walked out on the talk, and other participants said that they were deeply

offended. Summers replied that he was only putting forward hypotheses based on the scholarly work assembled for

the conference, and that research has found genetics to be very important in many domains,

compared with environmental factors. As an example, he mentioned the psychological disorder of autism, which was

once believed to be a result of parenting but is now known to be primarily genetic in origin.

The controversy did not stop with the conference. Many Harvard faculty members were appalled that a prominent

person could even consider the possibility that mathematical skills were determined by genetics, and the controversy

and protests that followed the speech led to the first-ever faculty vote on a motion expressing a “lack of confidence” in a
Harvard president. Summers resigned his position, in large part as a result of the controversy, in 2006 (Goldin,
Goldin, & Foulkes, 2005). [1]

The characteristic that is most defining of human beings as a species is that our large cerebral
cortexes make us very, very smart. In this chapter we consider how psychologists conceptualize
and measure human intelligence—the ability to think, to learn from experience, to solve
problems, and to adapt to new situations. We’ll consider whether intelligence involves a single
ability or many different abilities, how we measure intelligence, what intelligence predicts, and
how cultures and societies think about it. We’ll also consider intelligence in terms of nature
versus nurture and in terms of similarities versus differences among people.

Intelligence is important because it has an impact on many human behaviors. Intelligence is
more strongly related than any other individual difference variable to successful educational,
occupational, economic, and social outcomes. Scores on intelligence tests predict academic and
military performance, as well as success in a wide variety of jobs (Ones, Viswesvaran, &
Dilchert, 2005; Schmidt & Hunter, 1998). [2] Intelligence is also negatively correlated with
criminal behaviors—the average intelligence quotient (IQ) of delinquent adolescents is about 7
points lower than that of other adolescents (Wilson & Herrnstein, 1985) [3]—and positively
correlated with health-related outcomes, including longevity (Gottfredson, 2004; Gottfredson &
Deary, 2004). [4] At least some of this latter relationship may be due to the fact that people who
are more intelligent are better able to predict and avoid accidents and to understand and follow
instructions from doctors or on drug labels. Simonton (2006) [5] also found that among U.S.
presidents, the ability to effectively lead was well predicted by ratings of the president’s
intelligence.

The advantages of having a higher IQ increase as life settings become more complex. The
correlation between IQ and job performance is higher in more mentally demanding occupations,
such as physician or lawyer, than in less mentally demanding occupations, like clerk or
newspaper delivery person (Salgado et al., 2003). [6] Although some specific personality traits,
talents, and physical abilities are important for success in some jobs, intelligence predicts
performance across all types of jobs.

Our vast intelligence also allows us to have language, a system of communication that uses
symbols in a regular way to create meaning. Language gives us the ability to communicate our
intelligence to others by talking, reading, and writing. As the psychologist Steven Pinker put it,
language is “the jewel in the crown of cognition” (Pinker, 1994). [7] Although other species
have at least some ability to communicate, none of them have language. In the last section of this
chapter we will consider the structure and development of language, as well as its vital
importance to human beings.

[1] Goldin, G., Goldin, R., & Foulkes, A. (2005, February 21). How Summers offended: Harvard president’s comments

underscored the gender bias we’ve experienced. The Washington Post, p. A27. Retrieved

from http://www.washingtonpost.com/wp-dyn/articles/A40693-2005Feb20.html

[2] Ones, D. S., Viswesvaran, C., & Dilchert, S. (2005). Cognitive ability in selection decisions. In O. Wilhelm & R. W. Engle

(Eds.), Handbook of understanding and measuring intelligence (pp. 431–468). Thousand Oaks, CA: Sage; Schmidt, F., & Hunter,

J. (1998). The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 85 years

of research findings. Psychological Bulletin, 124(2), 262–274.

[3] Wilson, J. Q., & Herrnstein, R. J. (1985). Crime and human nature. New York, NY: Simon & Schuster.

[4] Gottfredson, L. S. (2004). Life, death, and intelligence. Journal of Cognitive Education and Psychology, 4(1), 23–46;

Gottfredson, L. S., & Deary, I. J. (2004). Intelligence predicts health and longevity, but why? Current Directions in Psychological

Science, 13(1), 1–4.

[5] Simonton, D. K. (2006). Presidential IQ, openness, intellectual brilliance, and leadership: Estimates and correlations for 42

U.S. chief executives. Political Psychology, 27(4), 511–526.

[6] Salgado, J. F., Anderson, N., Moscoso, S., Bertua, C., de Fruyt, F., & Rolland, J. P. (2003). A meta-analytic study of general

mental ability validity for different occupations in the European Community. Journal of Applied Psychology, 88(6), 1068–1081.

[7] Pinker, S. (1994). The language instinct (1st ed.). New York, NY: William Morrow.

9.1 Defining and Measuring Intelligence


LEARNING OBJECTIVES

1. Define intelligence and list the different types of intelligences psychologists study.

2. Summarize the characteristics of a scientifically valid intelligence test.

3. Outline the biological and environmental determinants of intelligence.

Psychologists have long debated how to best conceptualize and measure intelligence (Sternberg,
2003). [1] These questions include how many types of intelligence there are, the role of nature
versus nurture in intelligence, how intelligence is represented in the brain, and the meaning of
group differences in intelligence.

General (g) Versus Specific (s) Intelligences

In the early 1900s, the French psychologist Alfred Binet (1857–1911) and his colleague Théodore
Simon (1873–1961) began working in Paris to develop a measure that would differentiate
students who were expected to be better learners from students who were expected to be slower
learners. The goal was to help teachers better educate these two groups of students. Binet and
Simon developed what most psychologists today regard as the first intelligence test, which
consisted of a wide variety of questions that included the ability to name objects, define words,
draw pictures, complete sentences, compare items, and construct sentences.

Binet and Simon (Binet, Simon, & Town, 1915; Siegler, 1992) [2] believed that the questions
they asked their students, even though they were on the surface dissimilar, all assessed the basic
abilities to understand, reason, and make judgments. And it turned out that the correlations
among these different types of measures were in fact all positive; students who got one item
correct were more likely to also get other items correct, even though the questions themselves
were very different.

On the basis of these results, the psychologist Charles Spearman (1863–1945) hypothesized that
there must be a single underlying construct that all of these items measure. He called the
construct that the different abilities and skills measured on intelligence tests have in
common the general intelligence factor (g). Virtually all psychologists now believe that there is a
generalized intelligence factor, g, that relates to abstract thinking and that includes the abilities to
acquire knowledge, to reason abstractly, to adapt to novel situations, and to benefit from
instruction and experience (Gottfredson, 1997; Sternberg, 2003).[3] People with higher general
intelligence learn faster.

Soon after Binet and Simon introduced their test, the American psychologist Lewis Terman
(1877–1956) developed an American version of Binet’s test that became known as the Stanford-
Binet Intelligence Test. The Stanford-Binet is a measure of general intelligence made up of a
wide variety of tasks including vocabulary, memory for pictures, naming of familiar objects,
repeating sentences, and following commands.

Although there is general agreement among psychologists that g exists, there is also evidence
for specific intelligence (s), a measure of specific skills in narrow domains. One empirical result
in support of the idea of s comes from intelligence tests themselves. Although the different types
of questions do correlate with each other, some items correlate more highly with each other than
do other items; they form clusters or clumps of intelligences.

One distinction is between fluid intelligence, which refers to the capacity to learn new ways of
solving problems and performing activities, and crystallized intelligence, which refers to the
accumulated knowledge of the world we have acquired throughout our lives (Salthouse,
2004). [4] These intelligences must be different because crystallized intelligence increases with
age—older adults are as good as or better than young people in solving crossword puzzles—
whereas fluid intelligence tends to decrease with age (Horn, Donaldson, & Engstrom, 1981;
Salthouse, 2004). [5]

Other researchers have proposed even more types of intelligences. L. L. Thurstone (1938) [6] proposed that there were seven clusters of primary mental abilities, made up of word
fluency, verbal comprehension, spatial ability, perceptual speed, numerical ability, inductive
reasoning, and memory. But even these dimensions tend to be at least somewhat correlated,
showing again the importance of g.

One advocate of the idea of multiple intelligences is the psychologist Robert Sternberg.
Sternberg has proposed a triarchic (three-part) theory of intelligence, according to which people
may display more or less analytical intelligence, creative intelligence, and practical intelligence.
Sternberg (1985, 2003) [7] argued that traditional intelligence tests assess analytical intelligence,
the ability to answer problems with a single right answer, but that they do not well assess
creativity (the ability to adapt to new situations and create new ideas) or practicality (e.g., the
ability to write good memos or to effectively delegate responsibility).

As Sternberg proposed, research has found that creativity is not highly correlated with analytical
intelligence (Furnham & Bachtiar, 2008), [8] and exceptionally creative scientists, artists,
mathematicians, and engineers do not score higher on intelligence than do their less creative
peers (Simonton, 2000).[9] Furthermore, the brain areas that are associated with convergent

thinking, thinking that is directed toward finding the correct answer to a given problem, are
different from those associated with divergent thinking, the ability to generate many different
ideas for or solutions to a single problem (Tarasova, Volf, & Razoumnikova, 2010). [10] On the
other hand, being creative often takes some of the basic abilities measured by g, including the
abilities to learn from experience, to remember information, and to think abstractly (Bink &
Marsh, 2000). [11]

Studies of creative people suggest at least five components that are likely to be important for
creativity:

Expertise. Creative people have carefully studied and know a lot about the topic that they are
working in. Creativity comes with a lot of hard work (Ericsson, 1998; Weisberg, 2006).[12]

Imaginative thinking. Creative people often view a problem in a visual way, allowing them to see
it from a new and different point of view.

Risk taking. Creative people are willing to take on new but potentially risky approaches.

Intrinsic interest. Creative people tend to work on projects because they love doing them, not
because they are paid for them. In fact, research has found that people who are paid to be
creative are often less creative than those who are not (Hennessey & Amabile, 2010). [13]

Working in a creative environment. Creativity is in part a social phenomenon. Simonton (1992) [14] found that the most creative people were supported, aided, and challenged by other
people working on similar projects.

The last aspect of the triarchic model, practical intelligence, refers primarily to intelligence that
cannot be gained from books or formal learning. Practical intelligence represents a type of “street
smarts” or “common sense” that is learned from life experiences. Although a number of tests
have been devised to measure practical intelligence (Sternberg, Wagner, & Okagaki, 1993;
Wagner & Sternberg, 1985), [15] research has not found much evidence that practical intelligence
is distinct from g or that it is predictive of success at any particular tasks (Gottfredson,
2003). [16] Practical intelligence may include, at least in part, certain abilities that help people

perform well at specific jobs, and these abilities may not always be highly correlated with
general intelligence (Sternberg, Wagner, & Okagaki, 1993). [17] On the other hand, these abilities
or skills are very specific to particular occupations and thus do not seem to represent the broader
idea of intelligence.

Another champion of the idea of multiple intelligences is the psychologist Howard Gardner
(1983, 1999). [18] Gardner argued that it would be evolutionarily functional for different people to
have different talents and skills, and proposed that there are eight intelligences that can be
differentiated from each other (Table 9.1 "Howard Gardner’s Eight Specific Intelligences").
Gardner noted that some evidence for multiple intelligences comes from the abilities of autistic
savants, people who score low on intelligence tests overall but who nevertheless may have
exceptional skills in a given domain, such as math, music, art, or in being able to recite statistics
in a given sport (Treffert & Wallace, 2004). [19]

Table 9.1 Howard Gardner’s Eight Specific Intelligences


Intelligence: Description

Linguistic: The ability to speak and write well

Logico-mathematical: The ability to use logic and mathematical skills to solve problems

Spatial: The ability to think and reason about objects in three dimensions

Musical: The ability to perform and enjoy music

Kinesthetic (body): The ability to move the body in sports, dance, or other physical activities

Interpersonal: The ability to understand and interact effectively with others

Intrapersonal: The ability to have insight into the self

Naturalistic: The ability to recognize, identify, and understand animals, plants, and other living things

Source: Adapted from Gardner, H. (1999). Intelligence reframed: Multiple intelligences for the 21st century. New

York, NY: Basic Books.

The idea of multiple intelligences has been influential in the field of education, and teachers have
used these ideas to try to teach differently to different students. For instance, to teach math
problems to students who have particularly good kinesthetic intelligence, a teacher might
encourage the students to move their bodies or hands according to the numbers. On the other
hand, some have argued that these “intelligences” sometimes seem more like “abilities” or
“talents” rather than real intelligence. And there is no clear conclusion about how many
intelligences there are. Are sense of humor, artistic skills, dramatic skills, and so forth also
separate intelligences? Furthermore, and again demonstrating the underlying power of a single
intelligence, the many different intelligences are in fact correlated and thus represent, in part, g
(Brody, 2003). [20]

Measuring Intelligence: Standardization and the Intelligence Quotient

The goal of most intelligence tests is to measure g, the general intelligence factor. Good
intelligence tests are reliable, meaning that they are consistent over time, and also
demonstrate construct validity, meaning that they actually measure intelligence rather than
something else. Because intelligence is such an important individual difference dimension,
psychologists have invested substantial effort in creating and improving measures of intelligence,
and these tests are now the most accurate of all psychological tests. In fact, the ability to
accurately assess intelligence is one of the most important contributions of psychology to
everyday public life.

Intelligence changes with age. A 3-year-old who could accurately multiply 183 by 39 would
certainly be intelligent, but a 25-year-old who could not do so would be seen as unintelligent.
Thus understanding intelligence requires that we know the norms or standards in a given
population of people at a given age. The standardization of a test involves giving it to a large
number of people at different ages and computing the average score on the test at each age level.
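To make the norming step concrete, here is a minimal Python sketch that computes the average raw score at each age level from a small, entirely invented sample; real standardization samples contain thousands of test takers and the scoring is more elaborate, but the norming logic is the same.

```python
# A rough illustration of standardization: average raw test scores at each age
# level from a small, invented standardization sample.
from collections import defaultdict

# Each tuple is (age in years, raw score on the test); the numbers are made up.
sample = [(8, 21), (8, 25), (8, 23), (10, 30), (10, 34), (10, 32),
          (12, 38), (12, 40), (12, 42)]

totals = defaultdict(lambda: [0, 0])  # age -> [sum of scores, number of people]
for age, score in sample:
    totals[age][0] += score
    totals[age][1] += 1

age_norms = {age: total / count for age, (total, count) in totals.items()}
print(age_norms)  # {8: 23.0, 10: 32.0, 12: 40.0}
```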

It is important that intelligence tests be standardized on a regular basis, because the overall level
of intelligence in a population may change over time. The Flynn effect refers to the observation
that scores on intelligence tests worldwide have increased substantially over the past
decades (Flynn, 1999).[21] Although the increase varies somewhat from country to country, the

average increase is about 3 IQ points every 10 years. There are many explanations for the Flynn
effect, including better nutrition, increased access to information, and more familiarity with
multiple-choice tests (Neisser, 1998).[22] But whether people are actually getting smarter is
debatable (Neisser, 1997). [23]

Once the standardization has been accomplished, we have a picture of the average abilities of
people at different ages and can calculate a person’s mental age, which is the age at which a
person is performing intellectually. If we compare the mental age of a person to the person’s
chronological age, the result is the intelligence quotient (IQ), a measure of intelligence that is
adjusted for age. A simple way to calculate IQ is by using the following formula:

IQ = mental age ÷ chronological age × 100.

Thus a 10-year-old child who does as well as the average 10-year-old child has an IQ of 100 (10
÷ 10 × 100), whereas an 8-year-old child who does as well as the average 10-year-old child
would have an IQ of 125 (10 ÷ 8 × 100). Most modern intelligence tests are based on the relative
position of a person’s score among people of the same age, rather than on the basis of this
formula, but the idea of an intelligence “ratio” or “quotient” provides a good description of the
score’s meaning.
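The ratio formula and the two worked examples above can be expressed directly in a short sketch. The deviation_iq function at the end illustrates the modern approach of scoring relative to same-age peers; the mean of 100 and standard deviation of 15 it uses are conventional scaling values for tests such as the WAIS, assumed here rather than taken from this chapter.

```python
def ratio_iq(mental_age, chronological_age):
    """Historical ratio IQ: mental age divided by chronological age, times 100."""
    return mental_age / chronological_age * 100

print(ratio_iq(10, 10))  # 100.0 -- a 10-year-old performing like an average 10-year-old
print(ratio_iq(10, 8))   # 125.0 -- an 8-year-old performing like an average 10-year-old

def deviation_iq(raw_score, age_group_mean, age_group_sd, mean=100, sd=15):
    """Modern deviation-style IQ: a score's position relative to same-age peers,
    rescaled to a conventional mean of 100 and standard deviation of 15
    (these scaling values are assumptions, not figures given in the chapter)."""
    return mean + sd * (raw_score - age_group_mean) / age_group_sd

print(deviation_iq(raw_score=115, age_group_mean=100, age_group_sd=10))  # 122.5
```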

A number of scales are based on the IQ. The Wechsler Adult Intelligence Scale (WAIS) is the
most widely used intelligence test for adults (Watkins, Campbell, Nieberding, & Hallmark,
1995).[24] The current version of the WAIS, the WAIS-IV, was standardized on 2,200 people
ranging from 16 to 90 years of age. It consists of 15 different tasks, each designed to assess
intelligence, including working memory, arithmetic ability, spatial ability, and general
knowledge about the world (see Figure 9.4 "Sample Items From the Wechsler Adult Intelligence
Scale (WAIS)"). The WAIS-IV yields scores on four domains: verbal, perceptual, working
memory, and processing speed. The reliability of the test is high (more than 0.95), and it shows
substantial construct validity. The WAIS-IV is correlated highly with other IQ tests such as the
Stanford-Binet, as well as with criteria of academic and life success, including college grades,
measures of work performance, and occupational level. It also shows significant correlations
with measures of everyday functioning among the mentally retarded.

The Wechsler scale has also been adapted for preschool children in the form of the Wechsler
Preschool and Primary Scale of Intelligence (WPPSI-III) and for older children and adolescents
in the form of the Wechsler Intelligence Scale for Children (WISC-IV).

Figure 9.4 Sample Items From the Wechsler Adult Intelligence Scale (WAIS)

Source: Adapted from Thorndike, R. L., & Hagen, E. P. (1997). Cognitive Abilities Test (Form 5): Research

handbook. Chicago, IL: Riverside Publishing.

The intelligence tests that you may be most familiar with are aptitude tests, which are designed
to measure one’s ability to perform a given task, for instance, to do well in college or in
postgraduate training. Most U.S. colleges and universities require students to take the Scholastic
Assessment Test (SAT) or the American College Test (ACT), and postgraduate schools require
the Graduate Record Examination (GRE), Medical College Admissions Test (MCAT), or the
Law School Admission Test (LSAT). These tests are useful for selecting students because they
predict success in the programs that they are designed for, particularly in the first year of the
program (Kuncel, Hezlett, & Ones, 2010).[25] These aptitude tests also measure, in part,
intelligence. Frey and Detterman (2004) [26] found that the SAT correlated highly (between
about r = .7 and r = .8) with standard measures of intelligence.

Intelligence tests are also used by industrial and organizational psychologists in the process
of personnel selection. Personnel selection is the use of structured tests to select people who are
likely to perform well at given jobs (Schmidt & Hunter, 1998). [27] The psychologists begin by
conducting a job analysis in which they determine what knowledge, skills, abilities, and personal
characteristics (KSAPs) are required for a given job. This is normally accomplished by surveying
and/or interviewing current workers and their supervisors. Based on the results of the job
analysis, the psychologists choose selection methods that are most likely to be predictive of job
performance. Measures include tests of cognitive and physical ability and job knowledge tests, as
well as measures of IQ and personality.

The Biology of Intelligence

The brain processes underlying intelligence are not completely understood, but current research
has focused on four potential factors: brain size, sensory ability, speed and efficiency of neural
transmission, and working memory capacity.

There is at least some truth to the idea that smarter people have bigger brains. Studies that have
measured brain volume using neuroimaging techniques find that larger brain size is correlated
with intelligence (McDaniel, 2005), [28] and intelligence has also been found to be correlated with
the number of neurons in the brain and with the thickness of the cortex (Haier, 2004; Shaw et al.,
2006).[29] It is important to remember that these correlational findings do not mean that having

more brain volume causes higher intelligence. It is possible that growing up in a stimulating
environment that rewards thinking and learning may lead to greater brain growth (Garlick,
2003), [30] and it is also possible that a third variable, such as better nutrition, causes both brain
volume and intelligence.

Another possibility is that the brains of more intelligent people operate faster or more efficiently
than the brains of the less intelligent. Some evidence supporting this idea comes from data
showing that people who are more intelligent frequently show less brain activity (suggesting that
they need to use less capacity) than those with lower intelligence when they work on a task
(Haier, Siegel, Tang, & Abel, 1992). [31] And the brains of more intelligent people also seem to
run faster than the brains of the less intelligent. Research has found that the speed with which
people can perform simple tasks—such as determining which of two lines is longer or pressing,
as quickly as possible, one of eight buttons that is lighted—is predictive of intelligence (Deary,
Der, & Ford, 2001). [32] Intelligence scores also correlate at about r = .5 with measures of
working memory (Ackerman, Beier, & Boyle, 2005), [33] and working memory is now used as a
measure of intelligence on many tests.

Although intelligence is not located in a specific part of the brain, it is more prevalent in some
brain areas than others. Duncan et al. (2000) [34] administered a variety of intelligence tasks and
observed the places in the cortex that were most active. Although different tests created different
patterns of activation, as you can see in Figure 9.5 "Where Is Intelligence?", these activated areas
were primarily in the outer parts of the cortex, the area of the brain most involved in planning,
executive control, and short-term memory.

Figure 9.5 Where Is Intelligence?

fMRI studies have found that the areas of the brain most related to intelligence are in the outer parts of the cortex.

Source: Adapted from Duncan, J., Seitz, R. J., Kolodny, J., Bor, D., Herzog, H., Ahmed, A.,…Emslie, H. (2000). A

neural basis for general intelligence. Science, 289(5478), 457–460.

Is Intelligence Nature or Nurture?

Intelligence has both genetic and environmental causes, and these have been systematically
studied through a large number of twin and adoption studies (Neisser et al., 1996; Plomin,
DeFries, Craig, & McGuffin, 2003). [35] These studies have found that between 40% and 80% of
the variability in IQ is due to genetics, meaning that overall genetics plays a bigger role than
does environment in creating IQ differences among individuals (Plomin & Spinath,
2004). [36] The IQs of identical twins correlate very highly (r = .86), much higher than do the

scores of fraternal twins who are less genetically similar (r = .60). And the correlation between
the IQs of parents and their biological children (r = .42) is significantly greater than the
correlation between parents and adopted children (r = .19). The role of genetics gets stronger as
children get older. The intelligence of very young children (less than 3 years old) does not
predict adult intelligence, but by age 7 it does, and IQ scores remain very stable in adulthood
(Deary, Whiteman, Starr, Whalley, & Fox, 2004). [37]
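One common back-of-the-envelope way to connect twin correlations like these to a heritability percentage, although it is not described in this chapter, is Falconer's formula, which doubles the difference between the identical-twin and fraternal-twin correlations.

```python
# Falconer's rough heritability estimate: h2 = 2 * (r_identical - r_fraternal).
# The correlations are the ones quoted in the text for identical and fraternal twins.
r_identical = 0.86
r_fraternal = 0.60

heritability = 2 * (r_identical - r_fraternal)
print(round(heritability, 2))  # 0.52 -- roughly half the variability, within the 40%-80% range
```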

But there is also evidence for the role of nurture, indicating that individuals are not born with
fixed, unchangeable levels of intelligence. Twins raised together in the same home have more
similar IQs than do twins who are raised in different homes, and fraternal twins have more
similar IQs than do nontwin siblings, which is likely due to the fact that they are treated more
similarly than are siblings.

The fact that intelligence becomes more stable as we get older provides evidence that early
environmental experiences matter more than later ones. Environmental factors also explain a
greater proportion of the variance in intelligence for children from lower-class households than
they do for children from upper-class households (Turkheimer, Haley, Waldron, D’Onofrio, &
Gottesman, 2003). [38] This is because most upper-class households tend to provide a safe,
nutritious, and supporting environment for children, whereas these factors are more variable in
lower-class households.

Social and economic deprivation can adversely affect IQ. Children from households in poverty
have lower IQs than do children from households with more resources even when other factors
such as education, race, and parenting are controlled (Brooks-Gunn & Duncan,
1997). [39] Poverty may lead to diets that are undernourishing or lacking in appropriate vitamins,
and poor children may also be more likely to be exposed to toxins such as lead in drinking water,
dust, or paint chips (Bellinger & Needleman, 2003). [40] Both of these factors can slow brain
development and reduce intelligence.

If impoverished environments can harm intelligence, we might wonder whether enriched environments can improve it. Government-funded preschool programs such as Head Start are
designed to help children learn. Research has found that attending such programs may increase

intelligence for a short time, but these increases rarely last after the programs end (McLoyd,
1998; Perkins & Grotzer, 1997). [41] But other studies suggest that Head Start and similar
programs may improve emotional intelligence and reduce the likelihood that children will drop
out of school or be held back a grade (Reynolds, Temple, Robertson, & Mann 2001). [42]

Intelligence is improved by education; the number of years a person has spent in school
correlates at about r = .6 with IQ (Ceci, 1991). [43] In part this correlation may be due to the fact
that people with higher IQ scores enjoy taking classes more than people with low IQ scores, and
they thus are more likely to stay in school. But education also has a causal effect on IQ.
Comparisons between children who are almost exactly the same age but who just do or just do
not make a deadline for entering school in a given school year show that those who enter school
a year earlier have higher IQ than those who have to wait until the next year to begin school
(Baltes & Reinert, 1969; Ceci & Williams, 1997). [44] Children’s IQs tend to drop significantly
during summer vacations (Huttenlocher, Levine, & Vevea, 1998), [45] a finding that suggests that
a longer school year, as is used in Europe and East Asia, is beneficial.

It is important to remember that the relative roles of nature and nurture can never be completely
separated. A child who has higher than average intelligence will be treated differently than a
child who has lower than average intelligence, and these differences in behaviors will likely
amplify initial differences. This means that modest genetic differences can be multiplied into big
differences over time.

Psychology in Everyday Life: Emotional Intelligence


Although most psychologists have considered intelligence a cognitive ability, people also use their emotions to help

them solve problems and relate effectively to others. Emotional intelligence refers to the ability to accurately identify,

assess, and understand emotions, as well as to effectively control one’s own emotions (Feldman-Barrett & Salovey, 2002; Mayer, Salovey, & Caruso, 2000). [46]

The idea of emotional intelligence is seen in Howard Gardner’s interpersonal intelligence (the capacity to understand

the emotions, intentions, motivations, and desires of other people) and intrapersonal intelligence (the capacity to

understand oneself, including one’s emotions). Public interest in, and research on, emotional intelligence became

widely prevalent following the publication of Daniel Goleman’s best-selling book, Emotional Intelligence: Why It Can Matter More Than IQ (Goleman, 1998). [47]

There are a variety of measures of emotional intelligence (Mayer, Salovey, & Caruso, 2008; Petrides & Furnham, 2000). [48] One popular measure, the Mayer-Salovey-Caruso Emotional Intelligence Test (http://www.emotionaliq.org), includes items about the ability to understand, experience, and manage emotions,

such as these:

• What mood(s) might be helpful to feel when meeting in-laws for the very first time?

• Tom felt anxious and became a bit stressed when he thought about all the work he needed to do. When his

supervisor brought him an additional project, he felt ____ (fill in the blank).

• Contempt most closely combines which two emotions?

1. anger and fear

2. fear and surprise

3. disgust and anger

4. surprise and disgust

• Debbie just came back from vacation. She was feeling peaceful and content. How well would each of the following

actions help her preserve her good mood?

o Action 1: She started to make a list of things at home that she needed to do.

o Action 2: She began thinking about where and when she would go on her next vacation.

o Action 3: She decided it was best to ignore the feeling since it wouldn't last anyway.

One problem with emotional intelligence tests is that they often do not show a great deal of reliability or construct
validity (Føllesdal & Hagtvet, 2009). [49] Although it has been found that people with higher emotional intelligence are
also healthier (Martins, Ramalho, & Morin, 2010), [50] findings are mixed about whether emotional intelligence
predicts life success—for instance, job performance (Harms & Credé, 2010). [51] Furthermore, other researchers have
questioned the construct validity of the measures, arguing that emotional intelligence really measures knowledge
about what emotions are, but not necessarily how to use those emotions (Brody, 2004), [52] and that emotional
intelligence is actually a personality trait, a part of g, or a skill that can be applied in some specific situations—for
instance, academic and work situations (Landy, 2005). [53]

Although measures of the ability to understand, experience, and manage emotions may not predict effective

behaviors, another important aspect of emotional intelligence—emotion regulation—does. Emotion regulation refers

to the ability to control and productively use one’s emotions. Research has found that people who are better able to

override their impulses to seek immediate gratification and who are less impulsive also have higher cognitive and

social intelligence. They have better SAT scores, are rated by their friends as more socially adept, and cope with

frustration and stress better than those with less skill at emotion regulation (Ayduk et al., 2000; Eigsti et al., 2006; Mischel & Ayduk, 2004). [54]

Because emotional intelligence seems so important, many school systems have designed programs to teach it to their

students. However, the effectiveness of these programs has not been rigorously tested, and we do not yet know

whether emotional intelligence can be taught, or if learning it would improve the quality of people’s lives (Mayer & Cobb, 2000). [55]
KEY TAKEAWAYS

• Intelligence is the ability to think, to learn from experience, to solve problems, and to adapt to new situations.

Intelligence is important because it has an impact on many human behaviors.

• Psychologists believe that there is a construct that accounts for the overall differences in intelligence among people,

known as general intelligence (g).

• There is also evidence for specific intelligences (s), measures of specific skills in narrow domains, including creativity

and practical intelligence.

• The intelligence quotient (IQ) is a measure of intelligence that is adjusted for age. The Wechsler Adult Intelligence

Scale (WAIS) is the most widely used IQ test for adults.

• Brain volume, speed of neural transmission, and working memory capacity are related to IQ.

• Between 40% and 80% of the variability in IQ is due to genetics, meaning that overall genetics plays a bigger role than

does environment in creating IQ differences among individuals.

• Intelligence is improved by education and may be hindered by environmental factors such as poverty.

• Emotional intelligence refers to the ability to identify, assess, manage, and control one’s emotions. People who are

better able to regulate their behaviors and emotions are also more successful in their personal and social encounters.

EXERCISES AND CRITICAL THINKING

1. Consider your own IQ. Are you smarter than the average person? What specific intelligences do you think you excel

in?

2. Did your parents try to improve your intelligence? Do you think their efforts were successful?

3. Consider the meaning of the Flynn effect. Do you think people are really getting smarter?

4. Give some examples of how emotional intelligence (or the lack of it) influences your everyday life and the lives of

other people you know.

[1] Sternberg, R. J. (2003). Contemporary theories of intelligence. In W. M. Reynolds & G. E. Miller (Eds.), Handbook of

psychology: Educational psychology (Vol. 7, pp. 23–45). Hoboken, NJ: John Wiley & Sons.

[2] Binet, A., Simon, T., & Town, C. H. (1915). A method of measuring the development of the intelligence of young children (3rd

ed.) Chicago, IL: Chicago Medical Book; Siegler, R. S. (1992). The other Alfred Binet. Developmental Psychology, 28(2), 179–190.

[3] Gottfredson, L. S. (1997). Mainstream science on intelligence: An editorial with 52 signatories, history and

bibliography. Intelligence, 24(1), 13–23; Sternberg, R. J. (2003). Contemporary theories of intelligence. In W. M. Reynolds & G.

E. Miller (Eds.), Handbook of psychology: Educational psychology (Vol. 7, pp. 23–45). Hoboken, NJ: John Wiley & Sons.

[4] Salthouse, T. A. (2004). What and when of cognitive aging. Current Directions in Psychological Science, 13(4), 140–144.

[5] Horn, J. L., Donaldson, G., & Engstrom, R. (1981). Apprehension, memory, and fluid intelligence decline in

adulthood. Research on Aging, 3(1), 33–84; Salthouse, T. A. (2004). What and when of cognitive aging. Current Directions in

Psychological Science, 13(4), 140–144.

[6] Thurstone, L. L. (1938). Primary mental abilities. Psychometric Monographs, No. 1. Chicago, IL: University of Chicago Press.

[7] Sternberg, R. J. (1985). Beyond IQ: A triarchic theory of human intelligence. New York, NY: Cambridge University Press;

Sternberg, R. J. (2003). Our research program validating the triarchic theory of successful intelligence: Reply to

Gottfredson. Intelligence, 31(4), 399–413.

[8] Furnham, A., & Bachtiar, V. (2008). Personality and intelligence as predictors of creativity. Personality and Individual

Differences, 45(7), 613–617.

[9] Simonton, D. K. (2000). Creativity: Cognitive, personal, developmental, and social aspects. American Psychologist, 55(1),

151–158.

[10] Tarasova, I. V., Volf, N. V., & Razoumnikova, O. M. (2010). Parameters of cortical interactions in subjects with high and low

levels of verbal creativity. Human Physiology, 36(1), 80–85.

[11] Bink, M. L., & Marsh, R. L. (2000). Cognitive regularities in creative activity. Review of General Psychology, 4(1), 59–78.

[12] Ericsson, K. (1998). The scientific study of expert levels of performance: General implications for optimal learning and

creativity. High Ability Studies, 9(1), 75–100; Weisberg, R. (2006). Creativity: Understanding innovation in problem solving,

science, invention, and the arts. Hoboken, NJ: John Wiley & Sons.

[13] Hennessey, B. A., & Amabile, T. M. (2010). Creativity. Annual Review of Psychology, 61, 569–598.

[14] Simonton, D. K. (1992). The social context of career success and course for 2,026 scientists and inventors. Personality and

Social Psychology Bulletin, 18(4), 452–463.

[15] Sternberg, R. J., Wagner, R. K., & Okagaki, L. (1993). Practical intelligence: The nature and role of tacit knowledge in work

and at school. In J. M. Puckett & H. W. Reese (Eds.), Mechanisms of everyday cognition (pp. 205–227). Hillsdale, NJ: Lawrence

Erlbaum Associates; Wagner, R., & Sternberg, R. (1985). Practical intelligence in real-world pursuits: The role of tacit

knowledge. Journal of Personality and Social Psychology, 49(2), 436–458.

[16] Gottfredson, L. S. (2003). Dissecting practical intelligence theory: Its claims and evidence. Intelligence, 31(4), 343–397.

[17] Sternberg, R. J., Wagner, R. K., & Okagaki, L. (1993). Practical intelligence: The nature and role of tacit knowledge in work

and at school. In J. M. Puckett & H. W. Reese (Eds.), Mechanisms of everyday cognition (pp. 205–227). Hillsdale, NJ: Lawrence

Erlbaum Associates.

[18] Gardner, H. (1983). Frames of mind: The theory of multiple intelligences. New York, NY: Basic Books; Gardner, H.

(1999). Intelligence reframed: Multiple intelligences for the 21st century. New York, NY: Basic Books.

[19] Treffert, D. A., & Wallace, G. L. (2004, January 1). Islands of genius. Scientific American, 14–23. Retrieved

from http://gordonresearch.com/articles_autism/SciAm-Islands_of_Genius.pdf

[20] Brody, N. (2003). Construct validation of the Sternberg Triarchic abilities test: Comment and reanalysis. Intelligence, 31(4),

319–329.

[21] Flynn, J. R. (1999). Searching for justice: The discovery of IQ gains over time. American Psychologist, 54(1), 5–20.

[22] Neisser, U. (Ed.). (1998). The rising curve. Washington, DC: American Psychological Association.

[23] Neisser, U. (1997). Rising scores on intelligence tests. American Scientist, 85, 440–447.

[24] Watkins, C. E., Campbell, V. L., Nieberding, R., & Hallmark, R. (1995). Contemporary practice of psychological assessment

by clinical psychologists. Professional Psychology: Research and Practice, 26(1), 54–60.

[25] Kuncel, N. R., Hezlett, S. A., & Ones, D. S. (2010). A comprehensive meta-analysis of the predictive validity of the graduate

record examinations: Implications for graduate student selection and performance. Psychological Bulletin, 127(1), 162–181.

[26] Frey, M. C., & Detterman, D. K. (2004). Scholastic assessment or g? The relationship between the scholastic assessment

test and general cognitive ability. Psychological Science, 15(6), 373–378.

Saylor URL: http://www.saylor.org/books Saylor.org


432
[27] Schmidt, F. L., & Hunter, J. E. (1998). The validity and utility of selection methods in personnel psychology: Practical and

theoretical implications of 85 years of research findings. Psychological Bulletin, 124, 262–274.

[28] McDaniel, M. A. (2005). Big-brained people are smarter: A meta-analysis of the relationship between in vivo brain volume

and intelligence. Intelligence, 33(4), 337–346.

[29] Haier, R. J. (2004). Brain imaging studies of personality: The slow revolution. In R. M. Stelmack (Ed.), On the psychobiology

of personality: Essays in honor of Marvin Zuckerman(pp. 329–340). New York, NY: Elsevier Science; Shaw, P., Greenstein, D.,

Lerch, J., Clasen, L., Lenroot, R., Gogtay, N.,…Giedd, J. (2006). Intellectual ability and cortical development in children and

adolescents. Nature, 440(7084), 676–679.

[30] Garlick, D. (2003). Integrating brain science research with intelligence research.Current Directions in Psychological Science,

12(5), 185–189.

[31] Haier, R. J., Siegel, B. V., Tang, C., & Abel, L. (1992). Intelligence and changes in regional cerebral glucose metabolic rate

following learning. Intelligence, 16(3–4), 415–426.

[32] Deary, I. J., Der, G., & Ford, G. (2001). Reaction times and intelligence differences: A population-based cohort

study. Intelligence, 29(5), 389–399.

[33] Ackerman, P. L., Beier, M. E., & Boyle, M. O. (2005). Working memory and intelligence: The same or different

constructs? Psychological Bulletin, 131(1), 30–60.

[34] Duncan, J., Seitz, R. J., Kolodny, J., Bor, D., Herzog, H., Ahmed, A.,…Emslie, H. (2000). A neural basis for general

intelligence. Science, 289(5478), 457–460.

[35] Neisser, U., Boodoo, G., Bouchard, T. J., Jr., Boykin, A. W., Brody, N., Ceci, S. J.,…Urbina, S. (1996). Intelligence: Knowns and

unknowns. American Psychologist, 51(2), 77–101; Plomin, R. (2003). General cognitive ability. In R. Plomin, J. C. DeFries, I. W.

Craig, & P. McGuffin (Eds.), Behavioral genetics in the postgenomic era (pp. 183–201). Washington, DC: American Psychological

Association.

[36] Plomin, R., & Spinath, F. M. (2004). Intelligence: Genetics, genes, and genomics.Journal of Personality and Social

Psychology, 86(1), 112–129.

[37] Deary, I. J., Whiteman, M. C., Starr, J. M., Whalley, L. J., & Fox, H. C. (2004). The impact of childhood intelligence on later

life: Following up the Scottish mental surveys of 1932 and 1947. Journal of Personality and Social Psychology, 86(1), 130–147.

[38] Turkheimer, E., Haley, A., Waldron, M., D’Onofrio, B., & Gottesman, I. I. (2003). Socioeconomic status modifies heritability

of IQ in young children. Psychological Science, 14(6), 623–628.

[39] Brooks-Gunn, J., & Duncan, G. J. (1997). The effects of poverty on children. The Future of Children, 7(2), 55–71.

Saylor URL: http://www.saylor.org/books Saylor.org


433
[40] Bellinger, D. C., & Needleman, H. L. (2003). Intellectual impairment and blood lead levels [Letter to the editor]. The New

England Journal of Medicine, 349(5), 500.

[41] McLoyd, V. C. (1998). Children in poverty: Development, public policy and practice. In W. Damon, I. E. Sigel, & K. A.

Renninger (Eds.), Handbook of child psychology: Child psychology in practice (5th ed., Vol. 4, pp. 135–208). Hoboken, NJ: John

Wiley & Sons; Perkins, D. N., & Grotzer, T. A. (1997). Teaching intelligence. American Psychologist, 52(10), 1125–1133.

[42] Reynolds, A. J., Temple, J. A., Robertson, D. L., & Mann, E. A. (2001). Long-term effects of an early childhood intervention

on educational achievement and juvenile arrest: A 15-year follow-up of low-income children in public schools. Journal of the

American Medical Association, 285(18), 2339–2346.

[43] Ceci, S. J. (1991). How much does schooling influence general intelligence and its cognitive components? A reassessment of

the evidence. Developmental Psychology, 27(5), 703–722.

[44] Baltes, P. B., & Reinert, G. (1969). Cohort effects in cognitive development of children as revealed by cross-sectional

sequences. Developmental Psychology, 1(2), 169–177; Ceci, S. J., & Williams, W. M. (1997). Schooling, intelligence, and

income. American Psychologist, 52(10), 1051–1058.

[45] Huttenlocher, J., Levine, S., & Vevea, J. (1998). Environmental input and cognitive growth: A study using time-period

comparisons. Child Development, 69(4), 1012–1029.

[46] Feldman-Barrett, L., & Salovey, P. (Eds.). (2002). The wisdom in feeling: Psychological processes in emotional

intelligence. New York, NY: Guilford Press; Mayer, J. D., Salovey, P., & Caruso, D. (2000). Models of emotional intelligence. In R.

J. Sternberg (Ed.), Handbook of intelligence (pp. 396–420). New York, NY: Cambridge University Press.

[47] Goleman, D. (1998). Working with emotional intelligence. New York, NY: Bantam Books.

[48] Mayer, J. D., Salovey, P., & Caruso, D. R. (2008). Emotional intelligence: New ability or eclectic traits. American

Psychologist, 63(6), 503–517; Petrides, K. V., & Furnham, A. (2000). On the dimensional structure of emotional

intelligence. Personality and Individual Differences, 29, 313–320.

[49] Føllesdal, H., & Hagtvet, K. A. (2009). Emotional intelligence: The MSCEIT from the perspective of generalizability

theory. Intelligence, 37(1), 94–105.

[50] Martins, A., Ramalho, N., & Morin, E. (2010). A comprehensive meta-analysis of the relationship between emotional

intelligence and health. Personality and Individual Differences, 49(6), 554–564.

[51] Harms, P. D., & Credé, M. (2010). Emotional intelligence and transformational and transactional leadership: A meta-

analysis. Journal of Leadership & Organizational Studies, 17(1), 5–17.

[52] Brody, N. (2004). What cognitive intelligence is and what emotional intelligence is not. Psychological Inquiry, 15, 234–238.

Saylor URL: http://www.saylor.org/books Saylor.org


434
[53] Landy, F. J. (2005). Some historical and scientific issues related to research on emotional intelligence. Journal of

Organizational Behavior, 26, 411–424.

[54] Ayduk, O., Mendoza-Denton, R., Mischel, W., Downey, G., Peake, P. K., & Rodriguez, M. (2000). Regulating the

interpersonal self: Strategic self-regulation for coping with rejection sensitivity. Journal of Personality and Social Psychology,

79(5), 776–792; Eigsti, I.-M., Zayas, V., Mischel, W., Shoda, Y., Ayduk, O., Dadlani, M. B.,…Casey, B. J. (2006). Predicting cognitive

control from preschool to late adolescence and young adulthood.Psychological Science, 17(6), 478–484; Mischel, W., & Ayduk,

O. (Eds.). (2004). Willpower in a cognitive-affective processing system: The dynamics of delay of gratification. New York, NY:

Guilford Press.

[55] Mayer, J. D., & Cobb, C. D. (2000). Educational policy on emotional intelligence: Does it make sense? Educational

Psychology Review, 12(2), 163–183.

9.2 The Social, Cultural, and Political Aspects of Intelligence


LEARNING OBJECTIVES

1. Explain how very high and very low intelligence are defined and what it means to have them.

2. Consider and comment on the meaning of biological and environmental explanations for gender and racial differences

in IQ.

3. Define stereotype threat and explain how it might influence scores on intelligence tests.

Intelligence is defined by the culture in which it exists. Most people in Western cultures tend to
agree with the idea that intelligence is an important personality variable that should be admired
in those who have it. But people from Eastern cultures tend to place less emphasis on individual
intelligence and are more likely to view intelligence as reflecting wisdom and the desire to
improve the society as a whole rather than only themselves (Baral & Das, 2004; Sternberg,
2007). [1] And in some cultures, such as the United States, it is seen as unfair and prejudicial to
argue, even at a scholarly conference, that men and women might have different abilities in
domains such as math and science and that these differences might be caused by genetics (even
though, as we have seen, a great deal of intelligence is determined by genetics). In short,
although psychological tests accurately measure intelligence, it is cultures that interpret the
meanings of those tests and determine how people with differing levels of intelligence are
treated.

Extremes of Intelligence: Retardation and Giftedness

The results of studies assessing the measurement of intelligence show that IQ is distributed in the
population in the form of a normal distribution (or bell curve), which is the pattern of scores
usually observed in a variable that clusters around its average. In a normal distribution, the bulk
of the scores fall toward the middle, with many fewer scores falling at the extremes. The normal
distribution of intelligence (Figure 9.6 "Distribution of IQ Scores in the General Population")
shows that on IQ tests, as well as on most other measures, the majority of people cluster around
the average (in this case, where IQ = 100), and fewer are either very smart or very dull. Because
the standard deviation of an IQ test is about 15, this means that about 2% of people score above
an IQ of 130 (often considered the threshold for giftedness), and about the same percentage score
below an IQ of 70 (often considered the threshold for mental retardation).
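
As a rough check on these figures (a back-of-the-envelope calculation, assuming IQ scores are exactly normally distributed with a mean of 100 and a standard deviation of 15), an IQ of 130 sits two standard deviations above the mean:

\[
z = \frac{130 - 100}{15} = 2, \qquad P(Z > 2) \approx 0.023,
\]

so a little over 2% of scores fall above 130, and by symmetry about the same proportion falls below 70.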

Although Figure 9.6 "Distribution of IQ Scores in the General Population" presents a single


distribution, the actual IQ distribution varies by sex such that the distribution for men is more
spread out than is the distribution for women. These sex differences mean that about 20% more
men than women fall in the extreme (very smart or very dull) ends of the distribution (Johnson,
Carothers, & Deary, 2009). [2] Boys are about five times more likely to be diagnosed with the
reading disability dyslexia than are girls (Halpern, 1992), [3] and are also more likely to be
classified as mentally retarded. But boys are also about 20% more highly represented in the
upper end of the IQ distribution.

Figure 9.6 Distribution of IQ Scores in the General Population

The normal distribution of IQ scores in the general population shows that most people have about average

intelligence, while very few have extremely high or extremely low intelligence.

Extremely Low Intelligence

One end of the distribution of intelligence scores is defined by people with very low
IQ. Mental retardation is a generalized disorder ascribed to people who have an IQ below 70,
who have experienced deficits since childhood, and who have trouble with basic life skills, such
as dressing and feeding oneself and communicating with others (Switzky & Greenspan,
2006). [4] About 1% of the United States population, most of them males, fulfill the criteria for
mental retardation, but some children who are diagnosed as mentally retarded lose the
classification as they get older and learn to function better in society. A particular vulnerability
of people with low IQ is that they may be taken advantage of by others, and this is an important
aspect of the definition of mental retardation (Greenspan, Loughlin, & Black, 2001). [5] Mental
retardation is divided into four categories: mild, moderate, severe, and profound. Severe and
profound mental retardation is usually caused by genetic mutations or accidents during birth,
whereas mild forms have both genetic and environmental influences.

One cause of mental retardation is Down syndrome, a chromosomal disorder leading to mental
retardation caused by the presence of all or part of an extra 21st chromosome. The incidence of

Down syndrome is estimated at 1 per 800 to 1,000 births, although its prevalence rises sharply in
those born to older mothers. People with Down syndrome typically exhibit a distinctive pattern
of physical features, including a flat nose, upwardly slanted eyes, a protruding tongue, and a
short neck.

Societal attitudes toward individuals with mental retardation have changed over the past decades.
We no longer use terms such as “moron,” “idiot,” or “imbecile” to describe these people,
although these were the official psychological terms used to describe degrees of retardation in
the past. Laws such as the Americans with Disabilities Act (ADA) have made it illegal to
discriminate on the basis of mental and physical disability, and there has been a trend to bring the
mentally retarded out of institutions and into our workplaces and schools. In 2002 the U.S.
Supreme Court ruled that the execution of people with mental retardation is “cruel and unusual
punishment,” thereby ending this practice (Atkins v. Virginia, 2002). [6]

Extremely High Intelligence

Having extremely high IQ is clearly less of a problem than having extremely low IQ, but there
may also be challenges to being particularly smart. It is often assumed that schoolchildren who
are labeled as “gifted” may have adjustment problems that make it more difficult for them to
create social relationships. To study gifted children, Lewis Terman and his colleagues (Terman
& Oden, 1959)[7] selected about 1,500 high school students who scored in the top 1% on the
Stanford-Binet and similar IQ tests (i.e., who had IQs of about 135 or higher), and tracked them
for more than seven decades (the children became known as the “termites” and are still being
studied today). This study found, first, that these students were not unhealthy or poorly adjusted
but rather were above average in physical health and were taller and heavier than individuals in
the general population. The students also had above average social relationships—for instance,
being less likely to divorce than the average person (Seagoe, 1975). [8]

Terman’s study also found that many of these students went on to achieve high levels of
education and entered prestigious professions, including medicine, law, and science. Of the
sample, 7% earned doctoral degrees, 4% earned medical degrees, and 6% earned law degrees.
These numbers are all considerably higher than what would have been expected from a more

general population. Another study of young adolescents who had even higher IQs found that
these students ended up attending graduate school at a rate more than 50 times higher than that in
the general population (Lubinski & Benbow, 2006). [9]
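
To put those percentages in perspective (a back-of-the-envelope calculation, taking the sample size of roughly 1,500 reported above at face value), they correspond to approximately

\[
0.07 \times 1{,}500 \approx 105 \ \text{doctorates}, \qquad 0.04 \times 1{,}500 \approx 60 \ \text{medical degrees}, \qquad 0.06 \times 1{,}500 \approx 90 \ \text{law degrees},
\]

far more of each than would be expected among 1,500 people drawn at random from the general population.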

As you might expect based on our discussion of intelligence, kids who are gifted have higher
scores on general intelligence (g). But there are also different types of giftedness. Some children
are particularly good at math or science, some at automobile repair or carpentry, some at music
or art, some at sports or leadership, and so on. There is a lively debate among scholars about
whether it is appropriate or beneficial to label some children as “gifted and talented” in school
and to provide them with accelerated special classes and other programs that are not available to
everyone. Although doing so may help the gifted kids (Colangelo & Assouline, 2009), [10] it also
may isolate them from their peers and make such provisions unavailable to those who are not
classified as “gifted.”

Sex Differences in Intelligence

As discussed in the introduction to Chapter 9 "Intelligence and Language", Lawrence Summers’s


claim about the reasons why women might be underrepresented in the hard sciences was based in
part on the assumption that environment, such as the presence of gender discrimination or social
norms, was important but also in part on the possibility that women may be less genetically
capable of performing some tasks than are men. These claims, and the responses they provoked,
provide another example of how cultural interpretations of the meanings of IQ can create
disagreements and even guide public policy. The fact that women earn many fewer degrees in
the hard sciences than do men is not debatable (as shown in Figure 9.9 "Bachelor’s Degrees
Earned by Women in Selected Fields (2006)"), but the reasons for these differences are.

Figure 9.9 Bachelor’s Degrees Earned by Women in Selected Fields (2006)

Women tend to earn more degrees in the biological and social sciences, whereas men earn more in engineering,

math, and the physical sciences.

National Science Foundation (2010). Downloaded

from: http://www.nsf.gov/statistics/nsf08321/content.cfm?pub_id=3785&id=2

Differences in degree choice are probably not due to overall intelligence because men and
women have almost identical intelligence as measured by standard IQ and aptitude tests (Hyde,
2005). [11] On the other hand, it is possible that the differences are due to variability in
intelligence, because more men than women have very high (as well as very low) intelligence.
Perhaps success in the mathematical and physical sciences requires very high IQ, and this favors
men.

There are also observed sex differences on some particular types of tasks. Women tend to do
better than men on some verbal tasks, including spelling, writing, and pronouncing words
(Halpern et al., 2007), [12] and they have better emotional intelligence in the sense that they are
better at detecting and recognizing the emotions of others (McClure, 2000). [13]

On average, men do better than women on tasks requiring spatial ability, such as the mental
rotation tasks shown in Figure 9.10 (Voyer, Voyer, & Bryden, 1995). [14] Boys tend to do better
than girls on both geography and geometry tasks (Vogel, 1996). [15] On the math part of the
Scholastic Assessment Test (SAT), boys with scores of 700 or above outnumber girls by more
than 10 to 1 (Benbow & Stanley, 1983), [16] but there are also more boys at the lowest end of the
distribution.

Figure 9.10

Men outperform women on measures of spatial rotation, such as the task shown here, but women are better at

recognizing the emotions of others.

Source: Adapted from Halpern, D. F., Benbow, C. P., Geary, D. C., Gur, R. C., Hyde, J. S., & Gernsbacher, M. A.

(2007). The science of sex differences in science and mathematics. Psychological Science in the Public Interest, 8(1),

1–51.

Although these differences are real, and can be important, keep in mind that, like virtually all group
differences, the average difference between men and women is small compared with the
average differences within each sex. There are many women who are better than the average man
on spatial tasks, and many men who score higher than the average woman in terms of emotional

intelligence. Sex differences in intelligence allow us to make statements only about average
differences and do not say much about any individual person.

Although society may not want to hear it, differences between men and women may be in part
genetically determined, perhaps by differences in brain lateralization or by hormones (Kimura &
Hampson, 1994; Voyer, Voyer, & Bryden, 1995). [17] But nurture is also likely important
(Newcombe & Huttenlocher, 2006). [18] As infants, boys and girls show few or no differences in
spatial or counting abilities, suggesting that the differences occur at least in part as a result of
socialization (Spelke, 2005). [19] Furthermore, the number of women entering the hard sciences
has been increasing steadily in recent years, again suggesting that some of the differences
may have been due to gender discrimination and societal expectations about the appropriate roles
and skills of women.

Racial Differences in Intelligence

Although their bell curves overlap considerably, there are also differences in where members of
different racial and ethnic groups cluster along the IQ line. The bell curves for some groups
(Jews and East Asians) are centered somewhat higher than for Whites in general (Lynn, 1996;
Neisser et al., 1996). [20] Other groups, including Blacks and Hispanics, have averages somewhat
lower than those of Whites. The center of the IQ distribution for African Americans is about 85,
and that for Hispanics is about 93 (Hunt & Carlson, 2007). [21]

The observed average differences in intelligence between groups have at times led to malicious
and misguided attempts to try to correct for them through discriminatory treatment of people
from different races, ethnicities, and nationalities (Lewontin, Rose, & Kamin, 1984). [22] One of
the most egregious was the spread of eugenics, the proposal that one could improve the human
species by encouraging or permitting reproduction of only those people with genetic
characteristics judged desirable.

Eugenics became immensely popular in the United States in the early 20th century and was
supported by many prominent psychologists, including Sir Francis Galton. Dozens of
universities, including those in the Ivy League, offered courses in eugenics, and the topic was
presented in most high school and college biology texts (Selden, 1999). [23] Belief in the policies

of eugenics led the U.S. Congress to pass laws designed to restrict immigration from other
countries supposedly marked by low intelligence, particularly those in eastern and southern
Europe. And because more than one-half of the U.S. states passed laws requiring the sterilization
of low-IQ individuals, more than 60,000 Americans, mostly African Americans and other poor
minorities, underwent forced sterilizations. Fortunately, the practice of sterilization was
abandoned between the 1940s and the 1960s, although sterilization laws remained on the books
in some states until the 1970s.

One explanation for race differences in IQ is that intelligence tests are biased against some
groups and in favor of others. By bias, what psychologists mean is that a test predicts
outcomes—such as grades or occupational success—better for one group than it does for
another. If IQ is a better predictor of school grade point average for Whites than it is for Asian
Americans, for instance, then the test would be biased against Asian Americans, even though the
average IQ scores for Asians might be higher. But IQ tests do not seem to be racially biased
because the observed correlations between IQ tests and both academic and occupational
achievement are about equal across races (Brody, 1992). [24]

Another way that tests might be biased is if questions are framed such that they are easier for
people from one culture to understand than for people from other cultures. For example, even a
very smart person will not do well on a test if he or she is not fluent in the language in which the
test is administered, or does not understand the meaning of the questions being asked. But
modern intelligence tests are designed to be culturally neutral, and group differences are found
even on tests that only ask about spatial intelligence. Although some researchers still are
concerned about the possibility that intelligence tests are culturally biased, it is probably not the
case that the tests are creating all of the observed group differences (Suzuki & Valencia,
1997). [25]

Research Focus: Stereotype Threat


Although intelligence tests may not be culturally biased, the situation in which one takes a test may be. One

environmental factor that may affect how individuals perform and achieve is their expectations about their ability at a

task. In some cases these beliefs may be positive, and they have the effect of making us feel more confident and thus

better able to perform tasks. For instance, research has found that because Asian students are aware of the cultural

stereotype that “Asians are good at math,” reminding them of this fact before they take a difficult math test can
improve their performance on the test (Walton & Cohen, 2003). [26] On the other hand, sometimes these beliefs are

negative, and they create negative self-fulfilling prophecies such that we perform more poorly just because of our

knowledge about the stereotypes.

In 1995 Claude Steele and Joshua Aronson tested the hypothesis that the differences in performance on IQ tests

between Blacks and Whites might be due to the activation of negative stereotypes (Steele & Aronson,
1995). [27] Because Black students are aware of the stereotype that Blacks are intellectually inferior to Whites, this

stereotype might create a negative expectation, which might interfere with their performance on intellectual tests

through fear of confirming that stereotype.

In support of this hypothesis, the experiments revealed that Black college students performed worse (in comparison

to their prior test scores) on standardized test questions when this task was described to them as being diagnostic of

their verbal ability (and thus when the stereotype was relevant), but that their performance was not influenced when

the same questions were described as an exercise in problem solving. And in another study, the researchers found

that when Black students were asked to indicate their race before they took a math test (again activating the

stereotype), they performed more poorly than they had on prior exams, whereas White students were not affected by

first indicating their race.

Steele and Aronson argued that thinking about negative stereotypes that are relevant to a task that one is performing

creates stereotype threat—performance decrements that are caused by the knowledge of cultural stereotypes. That is,

they argued that the negative impact of race on standardized tests may be caused, at least in part, by the performance

situation itself. Because the threat is “in the air,” Black students may be negatively influenced by it.

Research has found that stereotype threat effects can help explain a wide variety of performance decrements among

those who are targeted by negative stereotypes. For instance, when a math task is described as diagnostic of

intelligence, Latinos and Latinas perform more poorly than do Whites (Gonzales, Blanton, & Williams,
2002). [28] Similarly, when stereotypes are activated, children with low socioeconomic status perform more poorly in

math than do those with high socioeconomic status, and psychology students perform more poorly than do natural
science students (Brown, Croizet, Bohner, Fournet, & Payne, 2003; Croizet & Claire, 1998). [29] Even groups who

typically enjoy advantaged social status can be made to experience stereotype threat. White men perform more poorly

on a math test when they are told that their performance will be compared with that of Asian men (Aronson, Lustina,
Good, Keough, & Steele, 1999), [30] and Whites perform more poorly than Blacks on a sport-related task when it is

described to them as measuring their natural athletic ability (Stone, 2002; Stone, Lynch, Sjomeling, & Darley,
1999). [31]

Research has found that stereotype threat is caused by both cognitive and emotional factors (Schmader, Johns, &
Forbes, 2008). [32] On the cognitive side, individuals who are experiencing stereotype threat show an increased

vigilance toward the environment as well as increased attempts to suppress stereotypic thoughts. Engaging in these

behaviors takes cognitive capacity away from the task. On the affective side, stereotype threat occurs when there is a

discrepancy between our positive concept of our own skills and abilities and the negative stereotypes that suggest

poor performance. These discrepancies create stress and anxiety, and these emotions make it harder to perform well

on the task.

Stereotype threat is not, however, absolute; we can get past it if we try. What is important is to reduce the self-doubts

that are activated when we consider the negative stereotypes. Manipulations that affirm positive characteristics about

the self or one’s social group are successful at reducing stereotype threat (Marx & Roman, 2002; McIntyre, Paulson, &
Lord, 2003). [33] In fact, just knowing that stereotype threat exists and may influence our performance can help
alleviate its negative impact (Johns, Schmader, & Martens, 2005). [34]

In summary, although there is no definitive answer to why IQ bell curves differ across racial and
ethnic groups, and most experts believe that environment is important in pushing the bell curves
apart, genetics can also be involved. It is important to realize that, although IQ is heritable, this
does not mean that group differences are caused by genetics. Although some people are naturally
taller than others (height is heritable), people who get plenty of nutritious food are taller than
people who do not, and this difference is clearly due to environment. This is a reminder that
group differences may be created by environmental variables and may also be reduced through
appropriate environmental interventions, such as educational and training programs.

KEY TAKEAWAYS

• IQ is distributed in the population in the form of a normal distribution (frequently known as a bell curve).

• Mental retardation is a generalized disorder ascribed to people who have an IQ below 70, who have experienced

deficits since childhood, and who have trouble with basic life skills, such as dressing and feeding oneself and

communicating with others. One cause of mental retardation is Down syndrome.

• Extremely intelligent individuals are not unhealthy or poorly adjusted, but rather are above average in physical health

and taller and heavier than individuals in the general population.

• Men and women have almost identical intelligence, but men have more variability in their IQ scores than do women.

• On average, men do better than women on tasks requiring spatial ability, whereas women do better on verbal tasks

and score higher on emotional intelligence.

• Although their bell curves overlap considerably, there are also average group differences for members of different

racial and ethnic groups.

• The observed average differences in intelligence between racial and ethnic groups have at times led to malicious

attempts to correct for them, such as the eugenics movement in the early part of the 20th century.

• The situation in which one takes a test may create stereotype threat—performance decrements that are caused by

the knowledge of cultural stereotypes.


EXERCISES AND CRITICAL THINKING

1. Were Lawrence Summers’s ideas about the potential causes of differences between men and women in math and hard

sciences careers offensive to you? Why or why not?

2. Do you think that we should give intelligence tests? Why or why not? Does it matter to you whether or not the tests

have been standardized and shown to be reliable and valid?

3. Give your ideas about the practice of providing accelerated classes to children listed as “gifted” in high school. What

are the potential positive and negative outcomes of doing so? What research evidence has helped you form your

opinion?

4. Consider the observed sex and racial differences in intelligence. What implications do you think the differences have

for education and career choices?

[1] Baral, B. D., & Das, J. P. (2004). Intelligence: What is indigenous to India and what is shared? In R. J. Sternberg

(Ed.), International handbook of intelligence (pp. 270–301). New York, NY: Cambridge University Press; Sternberg, R. J. (2007).

Intelligence and culture. In S. Kitayama & D. Cohen (Eds.), Handbook of cultural psychology (pp. 547–568). New York, NY:

Guilford Press.

[2] Johnson, W., Carothers, A., & Deary, I. J. (2009). A role for the X chromosome in sex differences in variability in general

intelligence? Perspectives on Psychological Science, 4(6), 598–611.

[3] Halpern, D. F. (1992). Sex differences in cognitive abilities (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.

[4] Switzky, H. N., & Greenspan, S. (2006). What is mental retardation? Ideas for an evolving disability in the 21st century.

Washington, DC: American Association on Mental Retardation.

[5] Greenspan, S., Loughlin, G., & Black, R. S. (2001). Credulity and gullibility in people with developmental disorders: A

framework for future research. In L. M. Glidden (Ed.), International review of research in mental retardation (Vol. 24, pp. 101–

135). San Diego, CA: Academic Press.

[6] Atkins v. Virginia, 536 U.S. 304 (2002).

[7] Terman, L. M., & Oden, M. H. (1959). Genetic studies of genius: The gifted group at mid-life (Vol. 5). Stanford, CA: Stanford

University Press.

[8] Seagoe, M. V. (1975). Terman and the gifted. Los Altos, CA: William Kaufmann.

[9] Lubinski, D., & Benbow, C. P. (2006). Study of mathematically precocious youth after 35 years: Uncovering antecedents for

the development of math-science expertise. Perspectives on Psychological Science, 1(4), 316–345.

[10] Colangelo, N., & Assouline, S. (2009). Acceleration: Meeting the academic and social needs of students. In T. Balchin, B.

Hymer, & D. J. Matthews (Eds.), The Routledge international companion to gifted education (pp. 194–202). New York, NY:

Routledge.

[11] Hyde, J. S. (2005). The gender similarities hypothesis. American Psychologist, 60(6), 581–592.

[12] Halpern, D. F., Benbow, C. P., Geary, D. C., Gur, R. C., Hyde, J. S., & Gernsbacher, M. A. (2007). The science of sex differences

in science and mathematics. Psychological Science in the Public Interest, 8(1), 1–51.

[13] McClure, E. B. (2000). A meta-analytic review of sex differences in facial expression processing and their development in

infants, children, and adolescents. Psychological Bulletin, 126(3), 424–453.

[14] Voyer, D., Voyer, S., & Bryden, M. P. (1995). Magnitude of sex differences in spatial abilities: A meta-analysis and

consideration of critical variables. Psychological Bulletin, 117(2), 250–270.

[15] Vogel, G. (1996). School achievement: Asia and Europe top in world, but reasons are hard to find. Science, 274(5291), 1296.

[16] Benbow, C. P., & Stanley, J. C. (1983). Sex differences in mathematical reasoning ability: More facts. Science, 222(4627),

1029–1031.

[17] Kimura, D., & Hampson, E. (1994). Cognitive pattern in men and women is influenced by fluctuations in sex

hormones. Current Directions in Psychological Science, 3(2), 57–61; Voyer, D., Voyer, S., & Bryden, M. P. (1995). Magnitude of

sex differences in spatial abilities: A meta-analysis and consideration of critical variables. Psychological Bulletin, 117(2), 250–

270.

[18] Newcombe, N. S., & Huttenlocher, J. (2006). Development of spatial cognition. In D. Kuhn, R. S. Siegler, W. Damon, & R. M.

Lerner (Eds.), Handbook of child psychology: Cognition, perception, and language (6th ed., Vol. 2, pp. 734–776). Hoboken, NJ:

John Wiley & Sons.

[19] Spelke, E. S. (2005). Sex differences in intrinsic aptitude for mathematics and science? A critical review. American

Psychologist, 60(9), 950–958.

[20] Lynn, R. (1996). Racial and ethnic differences in intelligence in the United States on the differential ability scale. Personality

and Individual Differences, 20(2), 271–273; Neisser, U., Boodoo, G., Bouchard, T. J., Jr., Boykin, A. W., Brody, N., Ceci, S.

J.,…Urbina, S. (1996). Intelligence: Knowns and unknowns. American Psychologist, 51(2), 77–101.

[21] Hunt, E., & Carlson, J. (2007). Considerations relating to the study of group differences in intelligence. Perspectives on

Psychological Science, 2(2), 194–213.

[22] Lewontin, R. C., Rose, S. P. R., & Kamin, L. J. (1984). Not in our genes: Biology, ideology, and human nature (1st ed.). New

York, NY: Pantheon Books.

[23] Selden, S. (1999). Inheriting shame: The story of eugenics and racism in America. New York, NY: Teachers College Press.

[24] Brody, N. (1992). Intelligence (2nd ed.). San Diego, CA: Academic Press.

[25] Suzuki, L. A., & Valencia, R. R. (1997). Race-ethnicity and measured intelligence: Educational implications. American

Psychologist, 52(10), 1103–1114.

[26] Walton, G. M., & Cohen, G. L. (2003). Stereotype lift. Journal of Experimental Social Psychology, 39(5), 456–467.

[27] Steele, C. M., & Aronson, J. (1995). Stereotype threat and the intellectual performance of African Americans. Journal of

Personality and Social Psychology, 69, 797–811.

[28] Gonzales, P. M., Blanton, H., & Williams, K. J. (2002). The effects of stereotype threat and double-minority status on the

test performance of Latino women. Personality and Social Psychology Bulletin, 28(5), 659–670.

[29] Brown, R., Croizet, J.-C., Bohner, G., Fournet, M., & Payne, A. (2003). Automatic category activation and social behaviour:

The moderating role of prejudiced beliefs. Social Cognition, 21(3), 167–193; Croizet, J.-C., & Claire, T. (1998). Extending the

concept of stereotype and threat to social class: The intellectual underperformance of students from low socioeconomic

backgrounds. Personality and Social Psychology Bulletin, 24(6), 588–594.

[30] Aronson, J., Lustina, M. J., Good, C., Keough, K., & Steele, C. M. (1999). When white men can’t do math: Necessary and

sufficient factors in stereotype threat. Journal of Experimental Social Psychology, 35, 29–46.

[31] Stone, J. (2002). Battling doubt by avoiding practice: The effects of stereotype threat on self-handicapping in White

athletes. Personality and Social Psychology Bulletin, 28(12), 1667–1678; Stone, J., Lynch, C. I., Sjomeling, M., & Darley, J. M.

(1999). Stereotype threat effects on Black and White athletic performance. Journal of Personality and Social Psychology, 77(6),

1213–1227.

[32] Schmader, T., Johns, M., & Forbes, C. (2008). An integrated process model of stereotype threat effects on

performance. Psychological Review, 115(2), 336–356.

[33] Marx, D. M., & Roman, J. S. (2002). Female role models: Protecting women’s math test performance. Personality and Social

Psychology Bulletin, 28(9), 1183–1193; McIntyre, R. B., Paulson, R. M., & Lord, C. G. (2003). Alleviating women’s mathematics

stereotype threat through salience of group achievements. Journal of Experimental Social Psychology, 39(1), 83–90.

[34] Johns, M., Schmader, T., & Martens, A. (2005). Knowing is half the battle: Teaching stereotype threat as a means of

improving women’s math performance. Psychological Science, 16(3), 175–179.

9.3 Communicating With Others: The Development and Use of Language


LEARNING OBJECTIVES

1. Review the components and structure of language.

2. Explain the biological underpinnings of language.

3. Outline the theories of language development.

Human language is the most complex behavior on the planet and, at least as far as we know, in
the universe. Language involves both the ability to comprehend spoken and written words and to
create communication in real time when we speak or write. Most languages are oral, generated
through speaking. Speaking involves a variety of complex cognitive, social, and biological
processes including operation of the vocal cords, and the coordination of breath with movements
of the throat and mouth, and tongue. Other languages are sign languages, in which the
communication is expressed by movements of the hands. The most common sign language is
American Sign Language (ASL), currently spoken by more than 500,000 people in the United
States alone.

Although language is often used for the transmission of information (“turn right at the next light
and then go straight,” “Place tab A into slot B”), this is only its most mundane function.
Language also allows us to access existing knowledge, to draw conclusions, to set and
accomplish goals, and to understand and communicate complex social relationships. Language is

fundamental to our ability to think, and without it we would be nowhere near as intelligent as we
are.

Language can be conceptualized in terms of sounds, meaning, and the environmental factors that
help us understand it. Phonemes are the elementary sounds of our language, morphemes are the
smallest units of meaning in a language, syntax is the set of grammatical rules that control how
words are put together, and contextual information is the elements of communication that are not
part of the content of language but that help us understand its meaning.

The Components of Language

A phoneme is the smallest unit of sound that makes a meaningful difference in a language. The
word “bit” has three phonemes, /b/, /i/, and /t/ (in transcription, phonemes are placed between
slashes), and the word “pit” also has three: /p/, /i/, and /t/. In spoken languages, phonemes are
produced by the positions and movements of the vocal tract, including our lips, teeth, tongue,
vocal cords, and throat, whereas in sign languages phonemes are defined by the shapes and
movement of the hands.

There are hundreds of unique phonemes that can be made by human speakers, but most
languages only use a small subset of the possibilities. English contains about 45 phonemes,
whereas other languages have as few as 15 and others more than 60. The Hawaiian language
contains only about a dozen phonemes, including 5 vowels (a, e, i, o, and u) and 7 consonants (h,
k, l, m, n, p, and w).

In addition to using different sets of phonemes, speakers of different languages are able to hear
the difference between some phonemes but not others, because the phoneme is actually a category of
sounds that are treated alike within the language. This is known as the categorical
perception of speech sounds. English speakers can differentiate the /r/ phoneme from the /l/
phoneme, and thus “rake” and “lake” are heard as different words. In Japanese, however, /r/ and
/l/ are the same phoneme, and thus speakers of that language cannot tell the difference between
the word “rake” and the word “lake.” Try saying the words “cool” and “keep” out loud. Can you
hear the difference between the two /k/ sounds? To English speakers they both sound the same,
but to speakers of Arabic these represent two different phonemes.

Infants are born able to understand all phonemes, but they lose their ability to do so as they get
older; by 10 months of age a child’s ability to recognize phonemes becomes very similar to that
of the adult speakers of the native language. Phonemes that were initially differentiated come to
be treated as equivalent (Werker & Tees, 2002). [1]

Figure 9.11

When adults hear speech sounds that gradually change from one phoneme to another, they do not hear the

continuous change; rather, they hear one sound until they suddenly begin hearing the other. In this case, the change

is from /ba/ to /pa/.

Source: Adapted from Wood, C. C. (1976). Discriminability, response bias, and phoneme categories in

discrimination of voice onset time. Journal of the Acoustical Society of America, 60(6), 1381–1389.

Whereas phonemes are the smallest units of sound in language, a morpheme is a string of one or
more phonemes that makes up the smallest units of meaning in a language. Some morphemes,
such as one-letter words like “I” and “a,” are also phonemes, but most morphemes are made up
of combinations of phonemes. Some morphemes are prefixes and suffixes used to modify other
words. For example, the syllable “re-” as in “rewrite” or “repay” means “to do again,” and the
suffix “-est” as in “happiest” or “coolest” means “to the maximum.”

Syntax is the set of rules of a language by which we construct sentences. Each language has a
different syntax. The syntax of the English language requires that each sentence have a noun and

a verb, each of which may be modified by adjectives and adverbs. Some syntaxes make use of
the order in which words appear, while others do not. In English, “The man bites the dog” is
different from “The dog bites the man.” In German, however, only the article endings before the
noun matter. “Der Hund beisst den Mann” means “The dog bites the man” but so does “Den
Mann beisst der Hund.”

Words do not possess fixed meanings but change their interpretation as a function of the context
in which they are spoken. We use contextual information—the information surrounding
language—to help us interpret it. Examples of contextual information include the knowledge
that we have and that we know that other people have, and nonverbal expressions such as facial
expressions, postures, gestures, and tone of voice. Misunderstandings can easily arise if people
aren’t attentive to contextual information or if some of it is missing, as it may be in
newspaper headlines or in text messages.

Examples in Which Syntax Is Correct but the Interpretation Can Be Ambiguous


• Grandmother of Eight Makes Hole in One

• Milk Drinkers Turn to Powder

• Farmer Bill Dies in House

• Old School Pillars Are Replaced by Alumni

• Two Convicts Evade Noose, Jury Hung

• Include Your Children When Baking Cookies

The Biology and Development of Language

Anyone who has tried to master a second language as an adult knows the difficulty of language
learning. And yet children learn languages easily and naturally. Children who are not exposed to
language early in their lives will likely never learn one. Case studies, including Victor the “Wild
Child,” who was abandoned as a baby in France and not discovered until he was 12, and Genie, a
child whose parents kept her locked in a closet from 18 months until 13 years of age, are
(fortunately) among the few known examples of such deprived children. Both of these children
made some progress in socialization after they were rescued, but neither of them ever developed
language (Rymer, 1993). [2] This is also why it is important to determine quickly if a child is deaf
and to begin immediately to communicate in sign language. Deaf children who are not exposed

to sign language during their early years will likely never learn it (Mayberry, Lock, & Kazmi,
2002). [3]

Research Focus: When Can We Best Learn Language? Testing the Critical Period
Hypothesis
For many years psychologists assumed that there was a critical period (a time in which learning can easily occur) for

language learning, lasting between infancy and puberty, and after which language learning was more difficult or
impossible (Lenneberg, 1967; Penfield & Roberts, 1959). [4] But more recent research has provided a different

interpretation.
An important study by Jacqueline Johnson and Elissa Newport (1989) [5] using Chinese and Korean speakers who had

learned English as a second language provided the first insight. The participants were all adults who had immigrated

to the United States between 3 and 39 years of age and who were tested on their English skills by being asked to

detect grammatical errors in sentences. Johnson and Newport found that the participants who had begun learning

English before they were 7 years old learned it as well as native English speakers but that the ability to learn English

dropped off gradually for the participants who had started later. Johnson and Newport also found a correlation

between the age of acquisition and the variance in the ultimate learning of the language. While early learners were

almost all successful in acquiring their language to a high degree of proficiency, later learners showed much greater

individual variation.

Johnson and Newport’s finding that children who immigrated before they were 7 years old learned English fluently

seemed consistent with the idea of a “critical period” in language learning. But their finding of a gradual decrease in

proficiency for those who immigrated between 8 and 39 years of age was not—rather, it suggested that there might

not be a single critical period of language learning that ended at puberty, as early theorists had expected, but that

language learning at later ages is simply better when it occurs earlier. This idea was reinforced in research by Hakuta,
Bialystok, and Wiley (2003), [6] who examined U.S. census records of language learning in millions of Chinese and

Spanish speakers living in the United States. The census form asks respondents to describe their own English ability

using one of five categories: “not at all,” “not well,” “well,” “very well,” and “speak only English.” The results of this

research dealt another blow to the idea of the critical period, because it showed that regardless of what year was used

as a cutoff point for the end of the critical period, there was no evidence for any discontinuity in language-learning

potential. Rather, the results (Figure 9.12 "English Proficiency in Native Chinese Speakers") showed that the degree of

success in second-language acquisition declined steadily throughout the respondent’s life span. The difficulty of

learning language as one gets older is probably due to the fact that, with age, the brain loses its plasticity—that is, its

ability to develop new neural connections.

Figure 9.12 English Proficiency in Native Chinese Speakers

Hakuta, Bialystok, and Wiley (2003) found no evidence for critical periods in language learning. Regardless of level

of education, self-reported second-language skills decreased consistently across age of immigration.

Source: Adapted from Hakuta, K., Bialystok, E., & Wiley, E. (2003). Critical evidence: A test of the critical-period

hypothesis for second-language acquisition. Psychological Science, 14(1), 31–38.

For the 90% of people who are right-handed, language is stored and controlled by the left
cerebral cortex, although for some left-handers this pattern is reversed. These differences can
easily be seen in the results of neuroimaging studies that show that listening to and producing
language creates greater activity in the left hemisphere than in the right. Broca’s area, an area toward the
front of the left hemisphere near the motor cortex, is responsible for language production (Figure
9.13 "Drawing of Brain Showing Broca’s and Wernicke’s Areas"). This area was first localized
in the 1860s by the French physician Paul Broca, who studied patients with lesions to various
parts of the brain. Wernicke’s area, an area of the brain next to the auditory cortex, is responsible
for language comprehension.

Figure 9.13 Drawing of Brain Showing Broca’s and Wernicke’s Areas

For most people the left hemisphere is specialized for language. Broca’s area, near the motor cortex, is involved in

language production, whereas Wernicke’s area, near the auditory cortex, is specialized for language

comprehension.

Evidence for the importance of Broca’s and Wernicke’s areas in language is seen in patients who
experience aphasia, a condition in which language functions are severely impaired. People with
Broca’s aphasia have difficulty producing speech, whereas people with damage to Wernicke’s
area can produce speech, but what they say makes no sense and they have trouble understanding
language.

Learning Language

Language learning begins even before birth, because the fetus can hear muffled versions of
speaking from outside the womb. Moon, Cooper, and Fifer (1993) [7] found that infants only two
days old sucked harder on a pacifier when they heard their mothers’ native language being
spoken than when they heard a foreign language, even when strangers were speaking the

languages. Babies are also aware of the patterns of their native language, showing surprise when
they hear speech that has a different pattern of phonemes than those they are used to (Saffran,
Aslin, & Newport, 2004). [8]

During the first year or so after birth, and long before they speak their first words, infants are
already learning language. One aspect of this learning is practice in producing speech. By the
time they are 6 to 8 weeks old, babies start making vowel sounds (“ooohh,” “aaahh,” “goo”) as
well as a variety of cries and squeals to help them practice.

At about 7 months, infants begin babbling, engaging in intentional vocalizations that lack
specific meaning. Children babble as practice in creating specific sounds, and by the time they
are 1 year old, the babbling uses primarily the sounds of the language that they are learning (de
Boysson-Bardies, Sagart, & Durand, 1984). [9] These vocalizations have a conversational tone
that sounds meaningful even though it isn’t. Babbling also helps children understand the social,
communicative function of language. Children who are exposed to sign language babble in sign
by making hand movements that represent real language (Petitto & Marentette, 1991). [10]

At the same time that infants are practicing their speaking skills by babbling, they are also
learning to better understand sounds and eventually the words of language. One of the first words
that children understand is their own name, usually by about 6 months, followed by commonly
used words like “bottle,” “mama,” and “doggie” by 10 to 12 months (Mandel, Jusczyk, & Pisoni,
1995). [11]

The infant usually produces his or her first words at about 1 year of age. It is at this point that the
child first understands that words are more than sounds—they refer to particular objects and
ideas. By the time children are 2 years old, they have a vocabulary of several hundred words, and
by kindergarten their vocabularies have increased to several thousand words. By fifth grade most
children know about 50,000 words and by the time they are in college, about 200,000.

The early utterances of children contain many errors, for instance, confusing /b/ and /d/, or /c/
and /z/. And the words that children create are often simplified, in part because they are not yet
able to make the more complex sounds of the real language (Dobrich & Scarborough,
1992). [12] Children may say “keekee” for kitty, “nana” for banana, and “vesketti” for spaghetti in

part because it is easier. Often these early words are accompanied by gestures that may also be
easier to produce than the words themselves. Children’s pronunciations become increasingly
accurate between 1 and 3 years, but some problems may persist until school age.

Most of a child’s first words are nouns, and early sentences may include only the noun. “Ma”
may mean “more milk please” and “da” may mean “look, there’s Fido.” Eventually the length of
the utterances increases to two words (“mo ma” or “da bark”), and these primitive sentences
begin to follow the appropriate syntax of the native language.

Because language involves the active categorization of sounds and words into higher level units,
children make some mistakes in interpreting what words mean and how to use them. In
particular, they often make overextensions of concepts, which means they use a given word in a
broader context than appropriate. A child might at first call all adult men “daddy” or all animals
“doggie.”

Children also use contextual information, particularly the cues that parents provide, to help them
learn language. Infants are frequently more attuned to the tone of voice of the person speaking
than to the content of the words themselves, and are aware of the target of speech. Werker, Pegg,
and McLeod (1994) [13] found that infants listened longer to a woman who was speaking to a
baby than to a woman who was speaking to another adult.

Children learn that people are usually referring to things that they are looking at when they are
speaking (Baldwin, 1993), [14] and that the speaker’s emotional expressions are related to the
content of their speech. Children also use their knowledge of syntax to help them figure out what
words mean. If a child hears an adult point to a strange object and say, “this is a dirb,” they will
infer that a “dirb” is a thing, but if they hear them say, “this is one of those dirb things,” they
will infer that it refers to the color or other characteristic of the object. And if they hear the word
“dirbing,” they will infer that “dirbing” is something that we do (Waxman, 1990). [15]

How Children Learn Language: Theories of Language Acquisition

Psychological theories of language learning differ in terms of the importance they place on
nature versus nurture. Yet it is clear that both matter. Children are not born knowing language;
they learn to speak by hearing what happens around them. On the other hand, human brains,
unlike those of any other animal, are prewired in a way that leads them, almost effortlessly, to
learn language.

Perhaps the most straightforward explanation of language development is that it occurs through
principles of learning, including association, reinforcement, and the observation of others
(Skinner, 1965). [16] There must be at least some truth to the idea that language is learned,
because children learn the language that they hear spoken around them rather than some other
language. Also supporting this idea is the gradual improvement of language skills with time. It
seems that children modify their language through imitation, reinforcement, and shaping, as
would be predicted by learning theories.

But language cannot be entirely learned. For one, children learn words too fast for them to be
learned through reinforcement. Between the ages of 18 months and 5 years, children learn up to
10 new words every day (Anglin, 1993). [17] More importantly, language is more generative than
it is imitative. Generativity refers to the fact that speakers of a language can compose sentences
to represent new ideas that they have never before been exposed to. Language is not a predefined
set of ideas and sentences that we choose when we need them, but rather a system of rules and
procedures that allows us to create an infinite number of statements, thoughts, and ideas,
including those that have never previously occurred. When a child says that she “swimmed” in
the pool, for instance, she is showing generativity. No adult speaker of English would ever say
“swimmed,” yet it is easily generated from the normal system of producing language.

Other evidence that refutes the idea that all language is learned through experience comes from
the observation that children may learn languages better than they ever hear them. Deaf children
whose parents do not speak ASL very well nevertheless are able to learn it perfectly on their
own, and may even make up their own language if they need to (Goldin-Meadow & Mylander,
1998). [18] A group of deaf children in a school in Nicaragua, whose teachers could not sign,
invented a way to communicate through made-up signs (Senghas, Senghas, & Pyers,
2005). [19] The development of this new Nicaraguan Sign Language has continued and changed
as new generations of students have come to the school and started using the language. Although
the original system was not a real language, it is becoming closer and closer every year, showing
the development of a new language in modern times.

The linguist Noam Chomsky is a believer in the nature approach to language, arguing that human
brains contain a language acquisition device that includes a universal grammar that underlies all
human language (Chomsky, 1965, 1972). [20] According to this approach, each of the many
languages spoken around the world (there are between 6,000 and 8,000) is an individual example
of the same underlying set of procedures that are hardwired into human brains. Chomsky’s
account proposes that children are born with a knowledge of general rules of syntax that
determine how sentences are constructed.

Chomsky differentiates between the deep structure of an idea (how the idea is represented in the
fundamental universal grammar that is common to all languages) and the surface structure of the
idea (how it is expressed in any one language). Once we hear or express a thought in surface
structure, we generally retain the deep structure but quickly forget the exact surface structure. At the end of a lecture, you will
remember a lot of the deep structure (i.e., the ideas expressed by the instructor), but you cannot
reproduce the surface structure (the exact words that the instructor used to communicate the
ideas).

Although there is general agreement among psychologists that babies are genetically
programmed to learn language, there is still debate about Chomsky’s idea that there is a universal
grammar that can account for all language learning. Evans and Levinson (2009) [21] surveyed the
world’s languages and found that none of the presumed underlying features of the language
acquisition device were entirely universal. In their search they found languages that did not have
noun or verb phrases, that did not have tenses (e.g., past, present, future), and even some that did
not have nouns or verbs at all, even though a basic assumption of a universal grammar is that all
languages should share these features.

Bilingualism and Cognitive Development

Although it is less common in the United States than in other countries, bilingualism (the ability
to speak two languages) is becoming more and more frequent in the modern world. Nearly one-
half of the world’s population, including 18% of U.S. citizens, grows up bilingual.

In recent years many U.S. states have passed laws outlawing bilingual education in schools.
These laws are in part based on the idea that students will have a stronger identity with the
school, the culture, and the government if they speak only English, and in part based on the idea
that speaking two languages may interfere with cognitive development.

Some early psychological research showed that, when compared with monolingual children,
bilingual children performed more slowly when processing language, and their verbal scores
were lower. But these tests were frequently given in English, even when this was not the child’s
first language, and the children tested were often of lower socioeconomic status than the
monolingual children (Andrews, 1982). [22]

More current research that has controlled for these factors has found that, although bilingual
children may in some cases learn language somewhat more slowly than do monolingual children
(Oller & Pearson, 2002), [23] bilingual and monolingual children do not significantly differ in the
final depth of language learning, nor do they generally confuse the two languages (Nicoladis &
Genesee, 1997). [24] In fact, participants who speak two languages have been found to have better
cognitive functioning, cognitive flexibility, and analytic skills in comparison to monolinguals
(Bialystok, 2009). [25] Research (Figure 9.15 "Gray Matter in Bilinguals") has also found that
learning a second language produces changes in the area of the brain in the left hemisphere that
is involved in language, such that this area is denser and contains more neurons (Mechelli et al.,
2004). [26] Furthermore, the increased density is stronger in those individuals who are most
proficient in their second language and who learned the second language earlier. Thus, rather
than slowing language development, learning a second language seems to increase cognitive
abilities.

Figure 9.15 Gray Matter in Bilinguals

Andrea Mechelli and her colleagues (2004) found that children who were bilingual had increased gray matter density (i.e., more neurons) in cortical areas related to language in comparison to monolinguals (panel a), that gray matter density correlated positively with second language proficiency (panel b), and that gray matter density correlated negatively with the age at which the second language was learned (panel c).

Source: Adapted from Mechelli, A., Crinion, J. T., Noppeney, U., O’Doherty, J., Ashburner, J., Frackowiak, R. S., & Price, C. J. (2004). Structural plasticity in the bilingual brain: Proficiency in a second language and age at acquisition affect grey-matter density. Nature, 431, 757.

Can Animals Learn Language?

Nonhuman animals have a wide variety of systems of communication. Some species
communicate using scents; others use visual displays, such as baring the teeth, puffing up the fur,
or flapping the wings; and still others use vocal sounds. Male songbirds, such as canaries and
finches, sing songs to attract mates and to protect territory, and chimpanzees use a combination
of facial expressions, sounds, and actions, such as slapping the ground, to convey aggression (de
Waal, 1989). [27] Honeybees use a “waggle dance” to direct other bees to the location of food
sources (von Frisch, 1956). [28] The language of vervet monkeys is relatively advanced in the
sense that they use specific sounds to communicate specific meanings. Vervets make different
calls to signify that they have seen either a leopard, a snake, or a hawk (Seyfarth & Cheney,
1997). [29]

Despite their wide-ranging abilities to communicate, efforts to teach animals to use language have had
only limited success. One of the early efforts was made by Catherine and Keith Hayes, who
raised a chimpanzee named Viki in their home along with their own children. But Viki learned
little and could never speak (Hayes & Hayes, 1952). [30] Researchers speculated that Viki’s
difficulties might have been in part because she could not produce the words with her vocal cords,
and so subsequent attempts were made to teach primates to speak using sign language or by
using boards on which they can point to symbols.

Allen and Beatrix Gardner worked for many years to teach a chimpanzee named Washoe to sign
using ASL. Washoe, who lived to be 42 years old, could label up to 250 different objects and
make simple requests and comments, such as “please tickle” and “me sorry” (Fouts,
1997). [31] Washoe’s adopted daughter Loulis, who was never exposed to human signers, learned
more than 70 signs simply by watching her mother sign.

The most proficient nonhuman language speaker is Kanzi, a bonobo who lives at the Language
Learning Center at Georgia State University (Savage-Rumbaugh & Lewin, 1994). [32] As you
can see in Note 9.44 "Video Clip: Language Recognition in Bonobos", Kanzi has a propensity
for language that is in many ways similar to humans’. He learned faster when he was younger
than when he got older, he learns by observation, and he can use symbols to comment on social
interactions, rather than simply for food treats. Kanzi can also create elementary syntax and
understand relatively complex commands. Kanzi can make tools and can even play Pac-Man.

Video Clip: Language Recognition in Bonobos

The bonobo Kanzi is the most proficient known nonhuman language speaker.

And yet even Kanzi does not have a true language in the same way that humans do. Human
babies learn words faster and faster as they get older, but Kanzi does not. Each new word he
learns is almost as difficult as the one before. Kanzi usually requires many trials to learn a new
sign, whereas human babies can speak words after only one exposure. Kanzi’s language is
focused primarily on food and pleasure and only rarely on social relationships. Although he can
combine words, he generates few new phrases and cannot master syntactic rules beyond the level
of about a 2-year-old human child (Greenfield & Savage-Rumbaugh, 1991). [33]

In sum, although many animals communicate, none of them have a true language. With some
exceptions, the information that can be communicated in nonhuman species is limited primarily
to displays of liking or disliking, and related to basic motivations of aggression and mating.
Humans also use this more primitive type of communication, in the form of nonverbal
behaviors such as eye contact, touch, hand signs, and interpersonal distance, to communicate
their like or dislike for others, but they (unlike animals) also supplement this more primitive
communication with language. Although other animal brains share similarities to ours, only the
human brain is complex enough to create language. What is perhaps most remarkable is that
although language never appears in nonhumans, language is universal in humans. All humans,
unless they have a profound brain abnormality or are completely isolated from other humans,
learn language.

Language and Perception

To this point in the chapter we have considered intelligence and language as if they are separate
concepts. But what if language influences our thinking? The idea that language and its
structures influence and limit human thought is called linguistic relativity.

The most frequently cited example of this possibility was proposed by Benjamin Whorf (1897–
1941), an American linguist who was particularly interested in Native American languages.
Whorf argued that the Inuit people of Canada (sometimes known as Eskimos) had many words
for snow, whereas English speakers have only one, and that this difference influenced how the
different cultures perceived snow. Whorf argued that the Inuit perceived and categorized snow in
finer details than English speakers possibly could, because the English language constrained
perception.

Although the idea of linguistic relativity seemed reasonable, research has suggested that
language has less influence on thinking than might be expected. For one, in terms of perceptions
of snow, although it is true that the Inuit do make more distinctions among types of snow than do
English speakers, the latter also make some distinctions (think “powder,” “slush,” “whiteout,”
and so forth). And it is also possible that thinking about snow may influence language, rather
than the other way around.

In a more direct test of the possibility that language influences thinking, Eleanor Rosch
(1973) [34] compared people from the Dani culture of New Guinea, who have only two terms for
color (“dark” and “bright”), with English speakers who use many more terms. Rosch
hypothesized that if language constrains perception and categorization, then the Dani should
have a harder time distinguishing colors than would English speakers. But her research found
that when the Dani were asked to categorize colors using new categories, they did so in almost
the same way that English speakers did. Similar results were found by Frank, Everett,
Fedorenko, and Gibson (2008), [35] who showed that the Amazonian tribe known as the Pirahã,
who have no linguistic method for expressing exact quantities (not even the number “one”), were
nevertheless able to perform matching tasks with large numbers without difficulty.

Although these data led researchers to conclude that the language we use to describe color and
number does not influence our underlying understanding of the sensations themselves, another
more recent study has questioned this assumption. Roberson, Davies, and Davidoff
(2000) [36] conducted another study with Dani participants and found that, at least for some
colors, the names that they used to describe colors did influence their perceptions of the colors.
Other researchers continue to test the possibility that our language influences our perceptions,
and perhaps even our thoughts (Levinson, 1998), [37] and yet the evidence for this possibility is,
as of now, mixed.

KEY TAKEAWAYS

• Language involves both the ability to comprehend spoken and written words and to speak and write. Some languages are sign languages, in which the communication is expressed by movements of the hands.

• Phonemes are the elementary sounds of our language, morphemes are the smallest units of meaningful language, syntax is the set of grammatical rules that control how words are put together, and contextual information consists of the elements of communication that help us understand its meaning.

• Recent research suggests that there is not a single critical period of language learning, but that language learning is simply better when it occurs earlier.

• Broca’s area is responsible for language production. Wernicke’s area is responsible for language comprehension.

• Language learning begins even before birth. An infant usually produces his or her first words at about 1 year of age.

• One explanation of language development is that it occurs through principles of learning, including association, reinforcement, and the observation of others.

• Noam Chomsky argues that human brains contain a language acquisition device that includes a universal grammar that underlies all human language. Chomsky differentiates between the deep structure and the surface structure of an idea.

• Although other animals communicate and may be able to express ideas, only the human brain is complex enough to create real language.

• Our language may have some influence on our thinking, but it does not affect our underlying understanding of concepts.

EXERCISES AND CRITICAL THINKING

1. What languages do you speak? Did you ever try to learn a new one? What problems did you have when you did this? Would you consider trying to learn a new language?

2. Some animals, such as Kanzi, display at least some language. Do you think that this means that they are intelligent?

[1] Werker, J. F., & Tees, R. C. (2002). Cross-language speech perception: Evidence for perceptual reorganization during the first year of life. Infant Behavior & Development, 25(1), 121–133.
[2] Rymer, R. (1993). Genie: An abused child’s flight from silence. New York, NY: HarperCollins.
[3] Mayberry, R. I., Lock, E., & Kazmi, H. (2002). Development: Linguistic ability and early language exposure. Nature, 417(6884), 38.
[4] Lenneberg, E. (1967). Biological foundations of language. New York, NY: John Wiley & Sons; Penfield, W., & Roberts, L. (1959). Speech and brain mechanisms. Princeton, NJ: Princeton University Press.
[5] Johnson, J. S., & Newport, E. L. (1989). Critical period effects in second language learning: The influence of maturational state on the acquisition of English as a second language. Cognitive Psychology, 21(1), 60–99.
[6] Hakuta, K., Bialystok, E., & Wiley, E. (2003). Critical evidence: A test of the critical-period hypothesis for second-language acquisition. Psychological Science, 14(1), 31–38.
[7] Moon, C., Cooper, R. P., & Fifer, W. P. (1993). Two-day-olds prefer their native language. Infant Behavior & Development, 16(4), 495–500.
[8] Saffran, J. R., Aslin, R. N., & Newport, E. L. (2004). Statistical learning by 8-month-old infants. New York, NY: Psychology Press.
[9] de Boysson-Bardies, B., Sagart, L., & Durand, C. (1984). Discernible differences in the babbling of infants according to target language. Journal of Child Language, 11(1), 1–15.
[10] Petitto, L. A., & Marentette, P. F. (1991). Babbling in the manual mode: Evidence for the ontogeny of language. Science, 251(5000), 1493–1496.
[11] Mandel, D. R., Jusczyk, P. W., & Pisoni, D. B. (1995). Infants’ recognition of the sound patterns of their own names. Psychological Science, 6(5), 314–317.
[12] Dobrich, W., & Scarborough, H. S. (1992). Phonological characteristics of words young children try to say. Journal of Child Language, 19(3), 597–616.
[13] Werker, J. F., Pegg, J. E., & McLeod, P. J. (1994). A cross-language investigation of infant preference for infant-directed communication. Infant Behavior & Development, 17(3), 323–333.

[14] Baldwin, D. A. (1993). Early referential understanding: Infants’ ability to recognize referential acts for what they are. Developmental Psychology, 29(5), 832–843.
[15] Waxman, S. R. (1990). Linguistic biases and the establishment of conceptual hierarchies: Evidence from preschool children. Cognitive Development, 5(2), 123–150.
[16] Skinner, B. F. (1965). Science and human behavior. New York, NY: Free Press.
[17] Anglin, J. M. (1993). Vocabulary development: A morphological analysis. Monographs of the Society for Research in Child Development, 58(10), v–165.
[18] Goldin-Meadow, S., & Mylander, C. (1998). Spontaneous sign systems created by deaf children in two cultures. Nature, 391(6664), 279–281.
[19] Senghas, R. J., Senghas, A., & Pyers, J. E. (2005). The emergence of Nicaraguan Sign Language: Questions of development, acquisition, and evolution. In S. T. Parker, J. Langer, & C. Milbrath (Eds.), Biology and knowledge revisited: From neurogenesis to psychogenesis (pp. 287–306). Mahwah, NJ: Lawrence Erlbaum Associates.
[20] Chomsky, N. (1965). Aspects of the theory of syntax. Cambridge, MA: MIT Press; Chomsky, N. (1972). Language and mind (Extended ed.). New York, NY: Harcourt, Brace & Jovanovich.
[21] Evans, N., & Levinson, S. C. (2009). The myth of language universals: Language diversity and its importance for cognitive science. Behavioral and Brain Sciences, 32(5), 429–448.
[22] Andrews, I. (1982). Bilinguals out of focus: A critical discussion. International Review of Applied Linguistics in Language Teaching, 20(4), 297–305.
[23] Oller, D. K., & Pearson, B. Z. (2002). Assessing the effects of bilingualism: A background. In D. K. Oller & R. E. Eilers (Eds.), Language and literacy in bilingual children (pp. 3–21). Tonawanda, NY: Multilingual Matters.
[24] Nicoladis, E., & Genesee, F. (1997). Language development in preschool bilingual children. Journal of Speech-Language Pathology and Audiology, 21(4), 258–270.
[25] Bialystok, E. (2009). Bilingualism: The good, the bad, and the indifferent. Bilingualism: Language and Cognition, 12(1), 3–11.
[26] Mechelli, A., Crinion, J. T., Noppeney, U., O’Doherty, J., Ashburner, J., Frackowiak, R. S., & Price, C. J. (2004). Structural plasticity in the bilingual brain: Proficiency in a second language and age at acquisition affect grey-matter density. Nature, 431, 757.
[27] De Waal, F. (1989). Peacemaking among primates. Cambridge, MA: Harvard University Press.
[28] Von Frisch, K. (1956). Bees: Their vision, chemical senses, and language. Ithaca, NY: Cornell University Press.

[29] Seyfarth, R. M., & Cheney, D. L. (1997). Behavioral mechanisms underlying vocal communication in nonhuman primates. Animal Learning & Behavior, 25(3), 249–267.
[30] Hayes, K. J., & Hayes, C. (1952). Imitation in a home-raised chimpanzee. Journal of Comparative and Physiological Psychology, 45, 450–459.
[31] Fouts, R. (1997). Next of kin: What chimpanzees have taught me about who we are. New York, NY: William Morrow.
[32] Savage-Rumbaugh, S., & Lewin, R. (1994). Kanzi: The ape at the brink of the human mind. Hoboken, NJ: John Wiley & Sons.
[33] Greenfield, P. M., & Savage-Rumbaugh, E. S. (1991). Imitation, grammatical development, and the invention of protogrammar by an ape. In N. A. Krasnegor, D. M. Rumbaugh, R. L. Schiefelbusch, & M. Studdert-Kennedy (Eds.), Biological and behavioral determinants of language development (pp. 235–258). Hillsdale, NJ: Lawrence Erlbaum Associates.
[34] Rosch, E. H. (1973). Natural categories. Cognitive Psychology, 4(3), 328–350.
[35] Frank, M. C., Everett, D. L., Fedorenko, E., & Gibson, E. (2008). Number as a cognitive technology: Evidence from Pirahã language and cognition. Cognition, 108(3), 819–824.
[36] Roberson, D., Davies, I., & Davidoff, J. (2000). Color categories are not universal: Replications and new evidence from a stone-age culture. Journal of Experimental Psychology: General, 129(3), 369–398.
[37] Levinson, S. C. (1998). Studying spatial conceptualization across cultures: Anthropology and cognitive science. Ethos, 26(1), 7–24.

9.4 Chapter Summary


Intelligence—the ability to think, to learn from experience, to solve problems, and to adapt to
new situations—is more strongly related than any other individual difference variable to
successful educational, occupational, economic, and social outcomes.

The French psychologist Alfred Binet and his colleague Henri Simon developed the first
intelligence test in the early 1900s. Charles Spearman called the construct that the different
abilities and skills measured on intelligence tests have in common the general intelligence factor,
or simply “g.”

There is also evidence for specific intelligences (s), measures of specific skills in narrow
domains. Robert Sternberg has proposed a triarchic (three-part) theory of intelligence, and
Howard Gardner has proposed that there are eight different specific intelligences.

Good intelligence tests are reliable and have construct validity. Intelligence tests are the
most accurate of all psychological tests. IQ tests are standardized, which allows calculation of
mental age and the intelligence quotient (IQ).

The Wechsler Adult Intelligence Scale (WAIS) is the most widely used intelligence test for
adults. Other intelligence tests include aptitude tests such as the Scholastic Assessment Test
(SAT), American College Test (ACT), and Graduate Record Examination (GRE), and structured
tests used for personnel selection.

Smarter people have somewhat larger brains, which operate more efficiently and faster than the
brains of the less intelligent. Although intelligence is not located in a specific part of the brain, it
is more prevalent in some brain areas than others.

Intelligence has both genetic and environmental causes, and between 40% and 80% of the
variability in IQ is heritable. Social and economic deprivation, including poverty, can adversely
affect IQ, and intelligence is improved by education.

Emotional intelligence refers to the ability to identify, assess, manage, and control one’s
emotions. However, tests of emotional intelligence are often unreliable, and emotional
intelligence may be a part of g, or a skill that can be applied in some specific work situations.

About 3% of Americans score above an IQ of 130 (the threshold for giftedness), and about the
same percentage score below an IQ of 70 (the threshold for mental retardation). Males are about
20% more common at these extremes than are females.

Women and men show overall equal intelligence, but there are sex differences on some types of
tasks. There are also differences in which members of different racial and ethnic groups cluster
along the IQ line. The causes of these differences are not completely known. These differences
have at times led to malicious, misguided, and discriminatory attempts to try to correct for them,
such as eugenics.

Language involves both the ability to comprehend spoken and written words and to create
communication in real time when we speak or write. Language can be conceptualized in terms of
sounds (phonemes), meaning (morphemes and syntax), and the environmental factors that help
us understand it (contextual information).

Language is best learned during the critical period between 3 and 7 years of age.

Broca’s area, an area of the brain in the front of the left hemisphere near the motor cortex, is
responsible for language production, and Wernicke’s area, an area of the brain next to the
auditory cortex, is responsible for language comprehension.

Children learn language quickly and naturally, progressing through stages of babbling, first
words, first sentences, and then a rapid increase in vocabulary. Children often make
overextensions of concepts.

Some theories of language learning are based on principles of learning. Noam Chomsky argues
that human brains contain a language acquisition device that includes a universal grammar that
underlies all human language and that allows generativity. Chomsky differentiates between the
deep structure and the surface structure of an idea.

Bilingualism is becoming more and more frequent in the modern world. Bilingual children may
show better cognitive functioning and flexibility than do monolingual children.

Nonhuman animals have a wide variety of systems of communication. But efforts to teach
animals to use human language have had only limited success. Although many animals
communicate, none of them have a true language.

Chapter 10
Emotions and Motivations
Captain Sullenberger Conquers His Emotions
He was 3,000 feet up in the air when the sudden loss of power in his airplane put his life, as well as the lives of 150 other passengers and crew members, in his hands. Both of the engines on Flight 1549 had shut down, and his options for a safe landing were limited.

Sully kept flying the plane and alerted the control tower to the situation:

This is Cactus 1539…hit birds. We lost thrust in both engines. We’re turning back towards La Guardia.

When the tower gave him the compass setting and runway for a possible landing, Sullenberger’s extensive experience allowed him to give a calm response:

I’m not sure if we can make any runway…Anything in New Jersey?

Captain Sullenberger was not just any pilot in a crisis, but a former U.S. Air Force fighter pilot with 40 years of flight experience. He had served as a flight instructor and the Airline Pilots Association safety chairman. Training had quickened his mental processes in assessing the threat, allowing him to maintain what tower operators later called an “eerie calm.” He knew the capabilities of his plane.

When the tower suggested a runway in New Jersey, Sullenberger calmly replied:

We’re unable. We may end up in the Hudson.

The last communication from Captain Sullenberger to the tower advised of the eventual outcome:

We’re going to be in the Hudson.

He calmly set the plane down on the water. Passengers reported that the landing was like landing on a rough runway. The crew kept the passengers calm as women, children, and then the rest of the passengers were evacuated onto the boats of the rescue personnel that had quickly arrived. Captain Sullenberger then calmly walked the aisle of the plane to be sure that everyone was out before joining the 150 other rescued survivors (Levin, 2009; National Transportation Safety Board, 2009). [1]

Some called it “grace under pressure,” and others the “miracle on the Hudson.” But psychologists see it as the ultimate in emotion regulation—the ability to control and productively use one’s emotions.

The topic of this chapter is affect, defined as the experience of feeling or emotion. Affect is an
essential part of the study of psychology because it plays such an important role in everyday life.
As we will see, affect guides behavior, helps us make decisions, and has a major impact on our
mental and physical health.

The two fundamental components of affect are emotions and motivation. Both of these words
have the same underlying Latin root, meaning “to move.” In contrast to cognitive processes that
are calm, collected, and frequently rational, emotions and motivations involve arousal, or our
experiences of the bodily responses created by the sympathetic division of the autonomic nervous
system (ANS). Because they involve arousal, emotions and motivations are “hot”—they
“charge,” “drive,” or “move” our behavior.

When we experience emotions or strong motivations, we feel the experiences. When we become
aroused, the sympathetic nervous system provides us with energy to respond to our environment.
The liver puts extra sugar into the bloodstream, the heart pumps more blood, our pupils dilate to
help us see better, respiration increases, and we begin to perspire to cool the body. The stress
hormones epinephrine and norepinephrine are released. We experience these responses as
arousal.

An emotion is a mental and physiological feeling state that directs our attention and guides our
behavior. Whether it is the thrill of a roller-coaster ride that elicits an unexpected scream, the
flush of embarrassment that follows a public mistake, or the horror of a potential plane crash that
creates an exceptionally brilliant response in a pilot, emotions move our actions. Emotions
normally serve an adaptive role: We care for infants because of the love we feel for them, we
avoid making a left turn onto a crowded highway because we fear that a speeding truck may hit
us, and we are particularly nice to Mandy because we are feeling guilty that we didn’t go to her
party. But emotions may also be destructive, such as when a frustrating experience leads us to
lash out at others who do not deserve it.

Motivations are closely related to emotions. A motivation is a driving force that initiates and
directs behavior. Some motivations are biological, such as the motivation for food, water, and
sex. But there are a variety of other personal and social motivations that can influence behavior,
including the motivations for social approval and acceptance, the motivation to achieve, and the
motivation to take, or to avoid taking, risks (Morsella, Bargh, & Gollwitzer, 2009). [2] In each
case we follow our motivations because they are rewarding. As predicted by basic theories of
operant learning, motivations lead us to engage in particular behaviors because doing so makes
us feel good.

Motivations are often considered in psychology in terms of drives, which are internal states that
are activated when the physiological characteristics of the body are out of balance, and goals,
which are desired end states that we strive to attain. Motivation can thus be conceptualized as a
series of behavioral responses that lead us to attempt to reduce drives and to attain goals by
comparing our current state with a desired end state (Lawrence, Carver, & Scheier,
2002). [3] Like a thermostat on an air conditioner, the body tries to maintain homeostasis, the
natural state of the body’s systems, with goals, drives, and arousal in balance. When a drive or
goal is aroused—for instance, when we are hungry—the thermostat turns on and we start to
behave in a way that attempts to reduce the drive or meet the goal (in this case to seek food). As
the body works toward the desired end state, the thermostat continues to check whether or not
the end state has been reached. Eventually, the need or goal is satisfied (we eat), and the relevant
behaviors are turned off. The body’s thermostat continues to check for homeostasis and is always
ready to react to future needs.
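
To make the thermostat analogy concrete, the cycle described above can be sketched as a simple feedback loop. The sketch below (in Python) is only an illustration under assumed values; the drive level, set point, and the seek_food step are hypothetical placeholders rather than anything specified in the text.

    # A minimal sketch of drive reduction as a feedback loop (illustrative only;
    # the drive, set point, and behavior are hypothetical placeholders).
    def seek_food():
        # Stands in for the goal-directed behavior that reduces the drive
        # (e.g., finding and eating food); returns how much the drive drops.
        return 0.3

    def regulate(drive_level, set_point=0.2):
        # Compare the current state with the desired end state and act until they match.
        while drive_level > set_point:   # the "thermostat" check: is the system out of balance?
            drive_level -= seek_food()   # behave in a way that reduces the drive
        return drive_level               # homeostasis restored; the behavior switches off

    print(regulate(drive_level=1.0))     # the loop runs until the drive falls below the set point

The numbers are arbitrary; the point is only the structure of the loop: monitor the current state, compare it with the desired end state, act, and stop when the set point is reached.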

In addition to more basic motivations such as hunger, a variety of other personal and social
motivations can also be conceptualized in terms of drives or goals. When the goal of studying for
an exam is hindered because we take a day off from our schoolwork, we may work harder on our
studying on the next day to move us toward our goal. When we are dieting, we may be more
likely to have a big binge on a day when the scale says that we have met our prior day’s goals.
And when we are lonely, the motivation to be around other people is aroused and we try to
socialize. In many, if not most cases, our emotions and motivations operate out of our conscious
awareness to guide our behavior (Freud, 1922; Hassin, Bargh, & Zimerman, 2009; Williams,
Bargh, Nocera, & Gray, 2009). [4]

We begin this chapter by considering the role of affect on behavior, discussing the most
important psychological theories of emotions. Then we will consider how emotions influence our
mental and physical health. We will discuss how the experience of long-term stress causes
illness, and then turn to research on positive thinking and what has been learned about the
beneficial health effects of more positive emotions. Finally, we will review some of the most
important human motivations, including the behaviors of eating and sex. The importance of this
chapter is not only in helping you gain an understanding the principles of affect but also in
helping you discover the important roles that affect plays in our everyday lives, and particularly
in our mental and physical health. The study of the interface between affect and physical
health—the principle that “everything that is physiological is also psychological”—is a key
focus of the branch of psychology known as health psychology. The importance of this topic has
made health psychology one of the fastest growing fields in psychology.

[1] Levin, A. (2009, June 9). Experience averts tragedy in Hudson landing. USA Today. Retrieved from http://www.usatoday.com/news/nation/2009-06-08-hudson_N.htm; National Transportation Safety Board. (2009, June 9). Excerpts of Flight 1549 cockpit communications. USA Today. Retrieved from http://www.usatoday.com/news/nation/2009-06-09-hudson-cockpit-transcript_N.htm
[2] Morsella, E., Bargh, J. A., & Gollwitzer, P. M. (2009). Oxford handbook of human action. New York, NY: Oxford University Press.
[3] Lawrence, J. W., Carver, C. S., & Scheier, M. F. (2002). Velocity toward goal attainment in immediate experience as a determinant of affect. Journal of Applied Social Psychology, 32(4), 788–802.
[4] Freud, S. (1922). The unconscious. The Journal of Nervous and Mental Disease, 56(3), 291; Hassin, R. R., Bargh, J. A., & Zimerman, S. (2009). Automatic and flexible: The case of nonconscious goal pursuit. Social Cognition, 27(1), 20–36; Williams, L. E., Bargh, J. A., Nocera, C. C., & Gray, J. R. (2009). The unconscious regulation of emotion: Nonconscious reappraisal goals modulate emotional reactivity. Emotion, 9(6), 847–854.

10.1 The Experience of Emotion


LEARNING OBJECTIVES

1. Explain the biological experience of emotion.

2. Summarize the psychological theories of emotion.

3. Give examples of the ways that emotion is communicated.

The most fundamental emotions, known as the basic emotions, are those of anger, disgust, fear,
happiness, sadness, and surprise. The basic emotions have a long history in human evolution,
and they have developed in large part to help us make rapid judgments about stimuli and to
quickly guide appropriate behavior (LeDoux, 2000). [1] The basic emotions are determined in
large part by one of the oldest parts of our brain, the limbic system, including the amygdala, the
hypothalamus, and the thalamus. Because they are primarily evolutionarily determined, the basic
emotions are experienced and displayed in much the same way across cultures (Ekman, 1992;
Elfenbein & Ambady, 2002, 2003; Fridland, Ekman, & Oster, 1987), [2] and people are quite
accurate at judging the facial expressions of people from different cultures. View Note 10.8
"Video Clip: The Basic Emotions" to see a demonstration of the basic emotions.

Video Clip: The Basic Emotions

Not all of our emotions come from the old parts of our brain; we also interpret our experiences to
create a more complex array of emotional experiences. For instance, the amygdala may signal
fear when it senses that the body is falling, but that fear may be interpreted completely
differently (perhaps even as “excitement”) when we are falling on a roller-coaster ride than when
we are falling from the sky in an airplane that has lost power. The cognitive interpretations that
accompany emotions—known as cognitive appraisal—allow us to experience a much larger and
more complex set of secondary emotions, as shown in Figure 10.2 "The Secondary Emotions".
Although they are in large part cognitive, our experiences of the secondary emotions are
determined in part by arousal (on the vertical axis of Figure 10.2 "The Secondary Emotions")
and in part by their valence—that is, whether they are pleasant or unpleasant feelings (on the
horizontal axis of Figure 10.2 "The Secondary Emotions").

Figure 10.2 The Secondary Emotions

The secondary emotions are those that have a major cognitive component. They are determined by both their level of arousal (low to high) and their valence (pleasant to unpleasant).

Source: Adapted from Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39, 1161–1178.

When you succeed in reaching an important goal, you might spend some time enjoying your
secondary emotions, perhaps the experience of joy, satisfaction, and contentment. But when your
close friend wins a prize that you thought you had deserved, you might also experience a variety
of secondary emotions (in this case, the negative ones)—for instance, feeling angry, sad,
resentful, and ashamed. You might mull over the event for weeks or even months, experiencing
these negative emotions each time you think about it (Martin & Tesser, 2006). [3]

The distinction between the primary and the secondary emotions is paralleled by two brain
pathways: a fast pathway and a slow pathway (Damasio, 2000; LeDoux, 2000; Ochsner, Bunge,
Gross, & Gabrielli, 2002). [4] The thalamus acts as the major gatekeeper in this process (Figure
10.3 "Slow and Fast Emotional Pathways"). Our response to the basic emotion of fear, for
instance, is primarily determined by the fast pathway through the limbic system. When a car
pulls out in front of us on the highway, the thalamus activates and sends an immediate message
to the amygdala. We quickly move our foot to the brake pedal. Secondary emotions are more
determined by the slow pathway through the frontal lobes in the cortex. When we stew in
jealousy over the loss of a partner to a rival or reflect on our win in the big tennis match, the
process is more complex. Information moves from the thalamus to the frontal lobes for cognitive
analysis and integration, and then from there to the amygdala. We experience the arousal of
emotion, but it is accompanied by a more complex cognitive appraisal, producing more refined
emotions and behavioral responses.

Figure 10.3 Slow and Fast Emotional Pathways

There are two emotional pathways in the brain (one slow and one fast), both of which are controlled by the thalamus.

Although emotions might seem to you to be more frivolous or less important in comparison to
our more rational cognitive processes, both emotions and cognitions can help us make effective
decisions. In some cases we take action after rationally processing the costs and benefits of
different choices, but in other cases we rely on our emotions. Emotions become particularly
important in guiding decisions when the alternatives between many complex and conflicting
alternatives present us with a high degree of uncertainty and ambiguity, making a complete
cognitive analysis difficult. In these cases we often rely on our emotions to make decisions, and
these decisions may in many cases be more accurate than those produced by cognitive processing
(Damasio, 1994; Dijksterhuis, Bos, Nordgren, & van Baaren, 2006; Nordgren & Dijksterhuis,
2009; Wilson & Schooler, 1991). [5]

The Cannon-Bard and James-Lange Theories of Emotion

Recall for a moment a situation in which you have experienced an intense emotional response.
Perhaps you woke up in the middle of the night in a panic because you heard a noise that made
you think that someone had broken into your house or apartment. Or maybe you were calmly
cruising down a street in your neighborhood when another car suddenly pulled out in front of
you, forcing you to slam on your brakes to avoid an accident. I’m sure that you remember that
your emotional reaction was in large part physical. Perhaps you remember being flushed, your
heart pounding, feeling sick to your stomach, or having trouble breathing. You were
experiencing the physiological part of emotion—arousal—and I’m sure you have had similar
feelings in other situations, perhaps when you were in love, angry, embarrassed, frustrated, or
very sad.

If you think back to a strong emotional experience, you might wonder about the order of the
events that occurred. Certainly you experienced arousal, but did the arousal come before, after,
or along with the experience of the emotion? Psychologists have proposed three different
theories of emotion, which differ in terms of the hypothesized role of arousal in emotion (Figure
10.4 "Three Theories of Emotion").

Figure 10.4 Three Theories of Emotion

The Cannon-Bard theory proposes that emotions and arousal occur at the same time. The James-Lange theory proposes that emotion is the result of arousal. Schachter and Singer’s two-factor model proposes that arousal and cognition combine to create emotion.

If your experiences are like mine, as you reflected on the arousal that you have experienced in
strong emotional situations, you probably thought something like, “I was afraid and my heart
started beating like crazy.” At least some psychologists agree with this interpretation. According
to the theory of emotion proposed by Walter Cannon and Philip Bard, the experience of the
emotion (in this case, “I’m afraid”) occurs alongside our experience of the arousal (“my heart is
beating fast”). According to the Cannon-Bard theory of emotion, the experience of an emotion is
accompanied by physiological arousal. Thus, according to this model of emotion, as we become
aware of danger, our heart rate also increases.

Although the idea that the experience of an emotion occurs alongside the accompanying arousal
seems intuitive to our everyday experiences, the psychologists William James and Carl Lange
had another idea about the role of arousal. According to the James-Lange theory of emotion, our
experience of an emotion is the result of the arousal that we experience. This approach proposes
that the arousal and the emotion are not independent, but rather that the emotion depends on the
arousal. The fear does not occur along with the racing heart but occurs because of the racing
heart. As William James put it, “We feel sorry because we cry, angry because we strike, afraid
because we tremble” (James, 1884, p. 190). [6] A fundamental aspect of the James-Lange theory
is that different patterns of arousal may create different emotional experiences.

There is research evidence to support each of these theories. The operation of the fast emotional
pathway (Figure 10.3 "Slow and Fast Emotional Pathways") supports the idea that arousal and
emotions occur together. The emotional circuits in the limbic system are activated when an
emotional stimulus is experienced, and these circuits quickly create corresponding physical
reactions (LeDoux, 2000). [7] The process happens so quickly that it may feel to us as if emotion
is simultaneous with our physical arousal.

On the other hand, and as predicted by the James-Lange theory, our experiences of emotion are
weaker without arousal. Patients who have spinal injuries that reduce their experience of arousal
also report decreases in emotional responses (Hohmann, 1966). [8] There is also at least some
support for the idea that different emotions are produced by different patterns of arousal. People
who view fearful faces show more amygdala activation than those who watch angry or joyful
faces (Whalen et al., 2001; Witvliet & Vrana, 1995), [9] we experience a red face and flushing
when we are embarrassed but not when we experience other emotions (Leary, Britt, Cutlip, &
Templeton, 1992), [10] and different hormones are released when we experience compassion than
when we experience other emotions (Oatley, Keltner, & Jenkins, 2006). [11]

The Two-Factor Theory of Emotion

Whereas the James-Lange theory proposes that each emotion has a different pattern of arousal,
the two-factor theory of emotion takes the opposite approach, arguing that the arousal that we
experience is basically the same in every emotion, and that all emotions (including the basic
emotions) are differentiated only by our cognitive appraisal of the source of the arousal. The
two-factor theory of emotion asserts that the experience of emotion is determined by the intensity
of the arousal we are experiencing, but that the cognitive appraisal of the situation determines
what the emotion will be. Because both arousal and appraisal are necessary, we can say that
emotions have two factors: an arousal factor and a cognitive factor (Schachter & Singer,
1962): [12]

emotion = arousal + cognition

In some cases it may be difficult for a person who is experiencing a high level of arousal to
accurately determine which emotion she is experiencing. That is, she may be certain that she is
feeling arousal, but the meaning of the arousal (the cognitive factor) may be less clear. Some
romantic relationships, for instance, have a very high level of arousal, and the partners
alternatively experience extreme highs and lows in the relationship. One day they are madly in
love with each other and the next they are in a huge fight. In situations that are accompanied by
high arousal, people may be unsure what emotion they are experiencing. In the high arousal
relationship, for instance, the partners may be uncertain whether the emotion they are feeling is
love, hate, or both at the same time (sound familiar?). The tendency for people to incorrectly
label the source of the arousal that they are experiencing is known as the
misattribution of arousal.

In one interesting field study by Dutton and Aron (1974), [13] an attractive young woman
approached individual young men as they crossed a wobbly, long suspension walkway hanging
more than 200 feet above a river in British Columbia, Canada. The woman asked each man to
help her fill out a class questionnaire. When he had finished, she wrote her name and phone
number on a piece of paper, and invited him to call if he wanted to hear more about the project.
More than half of the men who had been interviewed on the bridge later called the woman. In
contrast, men approached by the same woman on a low solid bridge, or who were interviewed on
the suspension bridge by men, called significantly less frequently. The idea of misattribution of
arousal can explain this result—the men were feeling arousal from the height of the bridge, but
they misattributed it as romantic or sexual attraction to the woman, making them more likely to
call her.

Research Focus: Misattributing Arousal


If you think a bit about your own experiences of different emotions, and if you consider the equation that suggests that emotions are represented by both arousal and cognition, you might start to wonder how much was determined by each. That is, do we know what emotion we are experiencing by monitoring our feelings (arousal) or by monitoring our thoughts (cognition)? The bridge study you just read about might begin to provide you an answer: The men seemed to be more influenced by their perceptions of how they should be feeling (their cognition) than by how they actually were feeling (their arousal).

Stanley Schachter and Jerome Singer (1962) [14] directly tested this prediction of the two-factor theory of emotion in a well-known experiment. Schachter and Singer believed that the cognitive part of the emotion was critical—in fact, they believed that the arousal that we are experiencing could be interpreted as any emotion, provided we had the right label for it. Thus they hypothesized that if an individual is experiencing arousal for which he has no immediate explanation, he will “label” this state in terms of the cognitions that are created in his environment. On the other hand, they argued that people who already have a clear label for their arousal would have no need to search for a relevant label, and therefore should not experience an emotion.

In the research, male participants were told that they would be participating in a study on the effects of a new drug, called “suproxin,” on vision. On the basis of this cover story, the men were injected with a shot of the neurotransmitter epinephrine, a drug that normally creates feelings of tremors, flushing, and accelerated breathing in people. The idea was to give all the participants the experience of arousal.

Then, according to random assignment to conditions, the men were told that the drug would make them feel certain ways. The men in the epinephrine-informed condition were told the truth about the effects of the drug—they were told that they would likely experience tremors, their hands would start to shake, their hearts would start to pound, and
their faces might get warm and flushed. The participants in the epinephrine-uninformed condition, however, were told something untrue—that their feet would feel numb, that they would have an itching sensation over parts of their body, and that they might get a slight headache. The idea was to make some of the men think that the arousal they were experiencing was caused by the drug (the informed condition), whereas others would be unsure where the arousal came from (the uninformed condition).

Then the men were left alone with a confederate who they thought had received the same injection. While they were waiting for the experiment (which was supposedly about vision) to begin, the confederate behaved in a wild and crazy (Schachter and Singer called it “euphoric”) manner. He wadded up spitballs, flew paper airplanes, and played with a hula-hoop. He kept trying to get the participant to join in with his games. Then right before the vision experiment was to begin, the participants were asked to indicate their current emotional states on a number of scales. One of the emotions they were asked about was euphoria.

If you are following the story, you will realize what was expected: The men who had a label for their arousal (the informed group) would not be experiencing much emotion because they already had a label available for their arousal. The men in the uninformed group, on the other hand, were expected to be unsure about the source of the arousal. They needed to find an explanation for their arousal, and the confederate provided one. As you can see in Figure 10.6 “Results From Schachter and Singer, 1962” (left side), this is just what they found. The participants in the uninformed condition were more likely to be experiencing euphoria (as measured by their behavioral responses with the confederate) than were those in the informed condition.

Then Schachter and Singer conducted another part of the study, using new participants. Everything was exactly the same except for the behavior of the confederate. Rather than being euphoric, he acted angry. He complained about having to complete the questionnaire he had been asked to do, indicating that the questions were stupid and too personal. He ended up tearing up the questionnaire that he was working on, yelling “I don’t have to tell them that!” Then he grabbed his books and stormed out of the room.

What do you think happened in this condition? The answer is the same thing: The uninformed participants experienced more anger (again as measured by the participants’ behaviors during the waiting period) than did the informed participants (Figure 10.6 “Results From Schachter and Singer, 1962”, right side). The idea is that because cognitions are such strong determinants of emotional states, the same state of physiological arousal could be labeled in many different ways, depending entirely on the label provided by the social situation. As Schachter and Singer put
