Chapter 5&6 Psychology
LEARNING - according to behaviorists, a relatively permanent change in behavior that results from experience; according to cognitive theorists, the process by which organisms make relatively permanent changes in the way they represent the environment because of experience.
Learning, for cognitive psychologists, may be shown by changes in behavior, but learning itself is a mental process.
CLASSICAL CONDITIONING involves ways in which we learn to associate events with other events.
It is involuntary, automatic learning. Cognitive psychologists view classical conditioning as the learning
of relationships among events.
Classical conditioning is a simple form of learning in which a neutral stimulus comes to evoke the response usually evoked by another stimulus after being paired repeatedly with that other stimulus. Put another way, it is a simple form of learning in which organisms come to anticipate or associate events with one another.
Classical conditioning focuses on how organisms form anticipations about their environments.
In classical conditioning, involuntary responses such as salivation or eye blinks are often conditioned.
REFLEX a simple unlearned response to a stimulus; reflexes are unlearned and evoked by certain stimuli.
STIMULUS an environmental condition that elicits a response.
Pavlov discovered that reflexes can also be learned,
or conditioned, by association.
In his initial experiments, Pavlov trained dogs to
salivate when he sounded a tone or a bell. Pavlov termed
these trained salivary responses conditional reflexes.
Conditional reflexes are generally referred to as conditioned responses. Classical conditioning is a simple form of learning in which one stimulus comes to evoke the response usually evoked by another stimulus.
Cognitive psychologists view classical conditioning as the
learning of relationships among events.
unconditioned stimulus (UCS) a stimulus that elicits a
response from an organism prior to conditioning. (The
meat powder)
unconditioned response (UCR) an unlearned response to an unconditioned stimulus.
orienting reflex an unlearned response in which an organism attends to a stimulus.
Salivation in response to the meat powder is an unlearned response (a UCR).
conditioned response (CR) a learned response to a conditioned stimulus.
Salivation in response to the tone (the conditioned stimulus) is a learned response (a CR).
conditioned stimulus (CS) a previously neutral stimulus that elicits a conditioned response because it has
been paired repeatedly with a stimulus that already elicited that response. (the tone became a learned
stimulus)
Therefore, salivation can be either a CR or a UCR, depending on the method used to evoke the response
EXTINCTION AND SPONTANEOUS RECOVERY
Extinction enters the picture when times—and the relationships between events—change.
EXTINCTION the process by which stimuli lose their ability to evoke learned responses because the events that had followed the stimuli no longer occur. (The learned responses are said to be extinguished.) In other words, extinction is the process by which the CS loses the ability to elicit the CR because the CS is no longer associated with the UCS.
Extinction does not erase the learned response; rather, it inhibits the response. The response remains available in the future under the "right" conditions.
Example: a child may learn to connect hearing a car pull into the driveway (a CS) with the arrival of his or
her parents (a UCS). Thus, the child may squeal with delight (squealing is a CR) when he or she hears the
car. After moving to a new house, the child’s parents may commute by public transportation. The sound of
a car in a nearby driveway may signal a neighbor’s, not a parent’s, homecoming.
When a CS (such as the sound of a car) is no longer followed by a UCS (a parent’s homecoming), the CS
loses its ability to elicit a CR.
SPONTANEOUS RECOVERY the recurrence of an extinguished response as a function of the passage of time.
Organisms tend to show spontaneous recovery of extinguished CRs as a function of the passage of
time.
In the wild, a water hole may contain water for only a couple of months during the year. But evolution would favor the survival of animals that associate the water hole with the thirst drive, so that they return to it when it again holds water. In this way, spontaneous recovery helps organisms adapt to situations that recur from time to time.
ADAPTATION requires us to respond similarly (or generalize) to stimuli that are equivalent in function and to
respond differently to (or discriminate between) stimuli that are not.
GENERALIZATION is the tendency for a CR to be evoked by stimuli that are similar to the stimulus to which the
response was conditioned.
Organisms must also learn that (1) many stimuli perceived as being similar are functionally different,
and (2) they must respond adaptively to each.
Example: During the first couple of months of life, babies can discriminate their mother's voice from those of other women.
DISCRIMINATION in conditioning, the tendency for an organism to distinguish between a CS and similar
stimuli that do not forecast a UCS
Pavlov showed that a dog conditioned to salivate in response to circles could be trained not to salivate
in response to ellipses. After a while, the dog no longer salivated in response to the ellipses.
In HIGHER-ORDER CONDITIONING, a previously neutral stimulus (e.g., hearing the word stove or seeing the
adult who had done the tickling enter the room) comes to serve as a learned or CS after being paired
repeatedly with a stimulus that has already become a learned or CS (e.g., seeing the stove or hearing the
phrase “kitchie-coo”).
Pavlov demonstrated higher-order conditioning by first conditioning a dog to salivate in response to a
tone. He then repeatedly paired the shining of light with the sounding of the tone. After several
pairings, shining the light (the higher-order conditioned stimulus) came to evoke the response
(salivation) that had been elicited by the tone (the first-order CS).
TASTE AVERSION
Taste aversions are adaptive because they motivate organisms to avoid harmful foods. Only one association may be required.
Whereas most kinds of classical conditioning require that the UCS and CS be close together in time, in taste aversion the UCS (in this case, nausea) can occur hours after the CS (in this case, the flavor of food). Taste aversion also challenges the view that organisms learn to associate any stimuli that are linked in time.
BIOLOGICAL PREPAREDNESS readiness to acquire a certain kind of CR due to the biological makeup of the
organism.
humans (and other primates) may be biologically prepared by evolutionary forces to rapidly develop
fears of certain animals, including snakes, that could do them harm. People also seem to be prepared
to fear thunder, threatening faces, sharp objects, darkness, and heights—all of which would have been
sources of danger to our ancestors and which, to some degree, may still threaten us.
COUNTERCONDITIONING a fear-reduction technique in which pleasant stimuli are associated with fear-
evoking stimuli so that the fear-evoking stimuli lose their aversive qualities.
In counterconditioning, an organism learns to respond to a stimulus in a way that is incompatible with
a response that was conditioned earlier.
relaxation is incompatible with a fear response. The reasoning behind counterconditioning is this: if
fears, as Watson had shown, could be conditioned by painful experiences like a clanging noise, perhaps
fears could be counterconditioned by substituting pleasant experiences.
FLOODING - Flooding, like counterconditioning, is a behavior therapy method for reducing fears.
In flooding, the client is exposed to the fear-evoking stimulus until fear is extinguished.
Little Albert, for example, might have been placed in close contact with a rat until his fear had become
extinguished. In extinction, the CS (in this case, the rat) is presented repeatedly in the absence of the
UCS (the clanging of the steel bars) until the CR (fear) is no longer evoked.
SYSTEMATIC DESENSITIZATION in which the client is gradually exposed to fear-evoking stimuli under
circumstances in which he or she remains relaxed.
Example: while feeling relaxed, Little Albert might have been given an opportunity to look at photos of rats or to see rats from a distance before they were brought closer.
OPERANT CONDITIONING organisms learn to do things—or not to do things—because of the consequences of
their behavior.
Classical conditioning focuses on how organisms form anticipations about their environments. Operant
conditioning focuses on what they do about them.
In operant conditioning, voluntary responses such as pecking at a target, pressing a lever, or skills
required for playing tennis are acquired or conditioned.
defined as a simple form of learning in which an organism learns to engage in certain behavior because
of the effects of that behavior.
we learn to engage in operant behaviors, also known simply as operants, that result in presumably
desirable outcomes such as food, a hug, an A on a test, attention, or social approval.
LAW OF EFFECT: a response (such as string pulling) would be, to use Thorndike's term, "stamped in" (i.e., strengthened) in a particular situation (such as being inside a puzzle box) by a reward (escaping from the box and eating). Punishments, in Thorndike's terminology, "stamp out" responses; that is, organisms learn not to behave in ways that bring on punishment.
In Thorndike's view, pleasant events stamp in responses, and unpleasant events stamp them out.
REINFORCE to follow a response with a stimulus that increases the frequency of the response. Remember that
reinforcers are not defined as pleasant events but rather as stimuli that increase the frequency of behavior.
OPERANT BEHAVIOR behavior that operates on, or manipulates, the environment.
Skinner’s supporters point out that focusing on discrete behavior creates the potential for helpful
changes. For example, in helping people combat depression, one psychologist might focus on their
“feelings.” A Skinnerian would focus on cataloging (and modifying) the types of things that “depressed
people” do. Directly modifying depressive behavior might also brighten clients’ self-reports about their
“feelings of depression.”
Any stimulus that increases the probability that responses preceding it—whether pecking a button in a
Skinner box or studying for a quiz—will be repeated serves as a REINFORCER.
REINFORCERS are defined as stimuli that increase the frequency of behavior.
POSITIVE REINFORCER a reinforcer that when presented increases the frequency of an operant. Increase
the probability that a behavior will occur when they are applied. Food and approval usually serve as
positive reinforcers.
NEGATIVE REINFORCER a reinforcer that when removed increases the frequency of an operant. Increase
the probability that a behavior will occur when the reinforcers are removed.
IMMEDIATE VERSUS DELAYED REINFORCERS
The short-term consequences of behavior often provide more of an incentive than the long-term consequences.
PRIMARY REINFORCER an unlearned reinforcer whose effectiveness is based on the biological makeup of the organism and not on learning.
Examples: food, water, warmth (positive reinforcers), and pain (a negative reinforcer) all serve as primary reinforcers.
SECONDARY REINFORCERS/ CONDITIONED REINFORCERS - acquire their value through being associated
with established reinforcers.
In operant conditioning, the ensuing events are reinforcers.
The EXTINCTION of learned responses results from the repeated performance of operant behavior
without reinforcement.
In other words, reinforcers maintain operant behavior or strengthen habitual behavior in operant
conditioning.
SPONTANEOUS RECOVERY is adaptive in operant conditioning as well as in classical conditioning.
Reinforcers may once again become available after time elapses.
Reinforcers are known by their effects, whereas rewards and punishments are known by how they feel.
POSITIVE PUNISHMENT is the application of an aversive stimulus to decrease unwanted behavior, such as
spanking, scolding, or a parking ticket.
NEGATIVE PUNISHMENT is the removal of a pleasant stimulus, such as removing a student’s opportunity to
talk with friends in class by seating them apart, or removing a student’s opportunity to mentally escape from
class by taking his or her smart phone or tablet computer.
DISCRIMINATIVE STIMULUS in operant conditioning, a stimulus that indicates that reinforcement is available.
Example: green or red lights indicate whether behavior (in the case of the pigeon, pecking a button) will be reinforced (by a food pellet being dropped into the cage).
CONTINUOUS REINFORCEMENT a schedule of reinforcement in which every correct response is reinforced.
With partial reinforcement, by contrast, not every correct response is reinforced. You can get a person "hooked" on gambling by fixing the game to allow heavy winnings at first. Then you gradually space out the winnings (reinforcements) until gambling is maintained by infrequent winning, or even no winning at all.
Responses that have been maintained by partial reinforcement are more resistant to extinction than
responses that have been maintained by continuous reinforcement.
FIXED-INTERVAL SCHEDULE, a fixed amount of time—say, a minute—must elapse before the correct response
will result in a reinforcer. A schedule in which a fixed amount of time must elapse between the previous and
subsequent times that reinforcement is available
An organism's response rate falls off after each reinforcement and then picks up again as the time when reinforcement will occur approaches.
Examples: Car dealers use fixed-interval reinforcement schedules when they offer incentives for buying up
the remainder of the year’s line in summer and fall. Similarly, you learn to check your email only at a
certain time of day if your correspondent writes at that time each day.
VARIABLE-INTERVAL SCHEDULE a schedule in which a variable amount of time must elapse between the previous and subsequent times that reinforcement is available.
Example: if we know that the boss might call us in for a report on the progress of a certain project at any time (variable-interval), we are likely to keep things in a state of reasonable readiness at all times. Similarly, if you receive email from your correspondent irregularly, you are likely to check your email regularly for his or her communication, but with less eagerness.
FIXED-RATIO SCHEDULE, reinforcement is provided after a fixed number of correct responses have been
made. With a fixed-ratio schedule, it is as if the organism learns that it must make several responses before
being reinforced.
VARIABLE-RATIO SCHEDULE a schedule in which reinforcement is provided after a variable number of correct
responses.
Example: In a 10:1 variable-ratio schedule, the mean number of correct responses that must be made before a correct response is reinforced is 10, but the ratio of correct responses to reinforcements might vary from, say, 1:1 to 20:1 on a random basis.
Slot machines tend to pay off on variable-ratio schedules, and players can be seen popping coins into them
and yanking their “arms” with barely a pause. For gamblers, the unpredictability of winning maintains a high
response rate.
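The difference between the ratio schedules above can be illustrated with a small simulation. This is a hypothetical sketch, not anything from the text; the function names and the probability-based approximation of the variable-ratio schedule are my own assumptions.

```python
import random

def fixed_ratio_reinforcements(n_responses, ratio):
    # Fixed-ratio: every `ratio`-th correct response is reinforced.
    return sum(1 for r in range(1, n_responses + 1) if r % ratio == 0)

def variable_ratio_reinforcements(n_responses, mean_ratio, rng):
    # Variable-ratio (approximated here): each response is reinforced with
    # probability 1/mean_ratio, so reinforcement arrives on average once per
    # `mean_ratio` responses, but the moment of payoff is unpredictable.
    return sum(1 for _ in range(n_responses) if rng.random() < 1 / mean_ratio)

rng = random.Random(0)
print(fixed_ratio_reinforcements(100, 10))          # exactly 10 payoffs
print(variable_ratio_reinforcements(100, 10, rng))  # roughly 10; varies by run
```

The unpredictability of the variable-ratio payoff is, as the text notes, what keeps slot-machine players responding at a high rate.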
RECALL
Ebbinghaus read lists of nonsense syllables aloud to the beat of a metronome and then tested how many he could produce from memory. Psychologists also often use lists of pairs of nonsense syllables, called PAIRED ASSOCIATES, to measure recall. In a recall task, the person must retrieve a syllable, with another syllable serving as a cue.
RELEARNING
To study the efficiency of relearning, Ebbinghaus devised the method of savings. First, he recorded the number of
repetitions required to learn a list of nonsense syllables or words. Then, he recorded the number of repetitions required
to relearn the list after a certain amount of time had elapsed. Next, he computed the difference in the number of
repetitions to determine the savings. If a list had to be repeated 20 times before it was learned, and 20 times again after
a year had passed, there were no savings. Relearning, that is, was as tedious as the initial learning. If the list could be
learned with only 10 repetitions after a year had elapsed, however, half the number of repetitions required for learning
had been saved.
METHOD OF SAVINGS a measure of retention in which the difference between the number of repetitions originally required to learn a list and the number of repetitions required to relearn the list after a certain amount of time has elapsed is calculated.
SAVINGS the difference between the number of repetitions originally required to learn a list and the number of
repetitions required to relearn the list after a certain amount of time has elapsed.
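The savings computation is simple arithmetic; a minimal sketch in Python (the function and variable names are my own, not Ebbinghaus's):

```python
def savings_percent(original_repetitions, relearning_repetitions):
    # Savings = repetitions saved at relearning, expressed as a percentage
    # of the repetitions originally required to learn the list.
    saved = original_repetitions - relearning_repetitions
    return 100 * saved / original_repetitions

# The two cases from the text:
print(savings_percent(20, 20))  # 0.0  -> no savings; relearning as tedious as learning
print(savings_percent(20, 10))  # 50.0 -> half the repetitions were saved
```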
INTERFERENCE THEORY
According to this view, we also forget material in short-term and long-term memory because newly learned material
interferes with it.
RETROACTIVE INTERFERENCE
new learning interferes with the retrieval of old learning. For example, a medical student may memorize the names of
the bones in the leg through rote repetition. Later he or she may find that learning the names of the bones in the arm
makes it more difficult to retrieve the names of the leg bones, especially if the names are similar in sound or in relative
location on each limb.
PROACTIVE INTERFERENCE
older learning interferes with the capacity to retrieve more recently learned material. High school Spanish may pop in
when you are trying to retrieve college French or Italian words. All three are Romance languages, with similar roots and
spellings. Previously learned Japanese words probably would not interfere with your ability to retrieve more recently
learned French or Italian, because the roots and sounds of Japanese differ considerably from those of the Romance
languages.
REPRESSION
automatic ejection of painful memories and unacceptable urges from conscious awareness.
DISSOCIATIVE AMNESIA loss of memory of personal information that is thought to stem from psychological conflict or
trauma
INFANTILE AMNESIA inability to recall events that occur prior to the age of three or so; also termed childhood amnesia
Infantile amnesia has little to do with the fact that the episodes occurred in the distant past.
When he interviewed people about their early experiences, Freud discovered that they could not recall episodes
that had happened prior to the age of three or so and that recall was cloudy through the age of five.
For example, a structure of the limbic system (the hippocampus) that is involved in the storage of memories
does not become mature until we are about two years old (Allene et al., 2012; Madsen & Kim, 2016). In
addition, myelination of brain pathways is incomplete for the first few years, contributing to the inefficiency of
information processing and memory formation.
In any event, we are unlikely to remember episodes from the first two years of life unless we are reminded of
them from time to time as we develop.
ANTEROGRADE AMNESIA failure to remember events that occurred after physical trauma because of the effects of the
trauma. there are memory lapses for the period following a trauma such as a blow to the head, an electric shock, or an
operation. The ability to pay attention, the encoding of sensory input, and rehearsal are all impaired.
RETROGRADE AMNESIA failure to remember events that occurred before a physical trauma; the trauma prevents people from remembering events that took place before the accident.