Chapter 5 & 6 Psychology

CHAPTER 5 LEARNING

LEARNING - according to behaviorists, a relatively permanent change in behavior that results from experience
• according to cognitive theorists, the process by which organisms make relatively permanent changes in the way they represent the environment because of experience
• Learning, for cognitive psychologists, may be shown by changes in behavior, but learning itself is a mental process.
CLASSICAL CONDITIONING involves ways in which we learn to associate events with other events.

• It is involuntary, automatic learning. Cognitive psychologists view classical conditioning as the learning of relationships among events.
• a simple form of learning in which a neutral stimulus comes to evoke the response usually evoked by another stimulus by being paired repeatedly with the other stimulus.
• a simple form of learning in which organisms come to anticipate or associate events with one another.
• Classical conditioning focuses on how organisms form anticipations about their environments.
• In classical conditioning, involuntary responses such as salivation or eye blinks are often conditioned.
REFLEX a simple, unlearned response to a stimulus; reflexes are evoked by certain stimuli without prior learning.
STIMULUS an environmental condition that elicits a response.
• Pavlov discovered that reflexes can also be learned, or conditioned, by association.
• In his initial experiments, Pavlov trained dogs to salivate when he sounded a tone or a bell. Pavlov termed these trained salivary responses conditional reflexes; conditional reflexes are generally referred to as conditioned responses.
UNCONDITIONED STIMULUS (UCS) a stimulus that elicits a response from an organism prior to conditioning (the meat powder).
UNCONDITIONED RESPONSE (UCR) an unlearned response to an unconditioned stimulus.
ORIENTING REFLEX an unlearned response in which an organism attends to a stimulus.
• Salivation in response to the meat powder is an unlearned response.
CONDITIONED RESPONSE (CR) a learned response to a conditioned stimulus.
• Salivation in response to the tone (the conditioned stimulus) is a learned response.
CONDITIONED STIMULUS (CS) a previously neutral stimulus that elicits a conditioned response because it has been paired repeatedly with a stimulus that already elicited that response (the tone became a learned stimulus).
Therefore, salivation can be either a CR or a UCR, depending on the method used to evoke the response.
EXTINCTION AND SPONTANEOUS RECOVERY
Extinction enters the picture when times, and the relationships between events, change.
EXTINCTION the process by which stimuli lose their ability to evoke learned responses because the events that had followed the stimuli no longer occur. (The learned responses are said to be extinguished.)
• the process by which a CS loses the ability to elicit a CR because the CS is no longer associated with the UCS.
• Extinction does not erase the conditioned response; rather, extinction inhibits the response. The response remains available for the future under the "right" conditions.
Example: a child may learn to connect hearing a car pull into the driveway (a CS) with the arrival of his or her parents (a UCS). Thus, the child may squeal with delight (squealing is a CR) when he or she hears the car. After moving to a new house, the child's parents may commute by public transportation. The sound of a car in a nearby driveway may signal a neighbor's, not a parent's, homecoming. When a CS (such as the sound of a car) is no longer followed by a UCS (a parent's homecoming), the CS loses its ability to elicit a CR.
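The acquisition and extinction of a CR can be pictured with a toy simulation. The Python sketch below is an illustration only, not a model from the text: it assumes a simple linear update of "associative strength" between CS and UCS, with a made-up learning rate.

# Toy illustration (assumption: a simple linear associative-strength update).
# The CS-UCS association strengthens on pairing trials and weakens on
# CS-alone (extinction) trials.

def run_trials(strength, n_trials, ucs_follows_cs, learning_rate=0.3):
    """Return the strength after a block of trials, plus the trial-by-trial values."""
    history = []
    for _ in range(n_trials):
        target = 1.0 if ucs_follows_cs else 0.0
        strength += learning_rate * (target - strength)
        history.append(round(strength, 2))
    return strength, history

strength = 0.0
strength, acquisition = run_trials(strength, 10, ucs_follows_cs=True)   # tone paired with meat powder
strength, extinction = run_trials(strength, 10, ucs_follows_cs=False)   # tone presented alone

print("acquisition:", acquisition)   # climbs toward 1.0: the CS comes to elicit the CR
print("extinction: ", extinction)    # falls toward 0.0: the CR is inhibited, not erased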
SPONTANEOUS RECOVERY the recurrence of an extinguished response as a function of the passage of time.
• Organisms tend to show spontaneous recovery of extinguished CRs as a function of the passage of time.
• In the wild, a water hole may contain water for only a couple of months during the year. But evolution would favor the survival of animals that associate the water hole with the thirst drive from time to time so that they return to it when it again holds water.
• Spontaneous recovery helps organisms adapt to situations that recur from time to time.
ADAPTATION requires us to respond similarly (or generalize) to stimuli that are equivalent in function and to
respond differently to (or discriminate between) stimuli that are not.
GENERALIZATION is the tendency for a CR to be evoked by stimuli that are similar to the stimulus to which the response was conditioned.
• Organisms must also learn that (1) many stimuli perceived as being similar are functionally different, and (2) they must respond adaptively to each.
• Example: During the first couple of months of life, babies can discriminate their mother's voice from those of other women.
DISCRIMINATION in conditioning, the tendency for an organism to distinguish between a CS and similar stimuli that do not forecast a UCS.
• Pavlov showed that a dog conditioned to salivate in response to circles could be trained not to salivate in response to ellipses. After a while, the dog no longer salivated in response to the ellipses.
HIGHER-ORDER CONDITIONING a previously neutral stimulus (e.g., hearing the word stove or seeing the adult who had done the tickling enter the room) comes to serve as a learned, or conditioned, stimulus after being paired repeatedly with a stimulus that has already become a learned, or conditioned, stimulus (e.g., seeing the stove or hearing the phrase "kitchie-coo").
• Pavlov demonstrated higher-order conditioning by first conditioning a dog to salivate in response to a tone. He then repeatedly paired the shining of a light with the sounding of the tone. After several pairings, shining the light (the higher-order conditioned stimulus) came to evoke the response (salivation) that had been elicited by the tone (the first-order CS).
TASTE AVERSION
• Taste aversions are adaptive because they motivate organisms to avoid harmful foods.
• Only one association may be required.
• Whereas most kinds of classical conditioning require that the UCS and CS be close together in time, in taste aversion the UCS (in this case, nausea) can occur hours after the CS (in this case, the flavor of food).
• Taste aversion also challenges the view that organisms learn to associate any stimuli that are linked in time.
BIOLOGICAL PREPAREDNESS readiness to acquire a certain kind of CR due to the biological makeup of the organism.
• Humans (and other primates) may be biologically prepared by evolutionary forces to rapidly develop fears of certain animals, including snakes, that could do them harm. People also seem to be prepared to fear thunder, threatening faces, sharp objects, darkness, and heights, all of which would have been sources of danger to our ancestors and which, to some degree, may still threaten us.
COUNTERCONDITIONING a fear-reduction technique in which pleasant stimuli are associated with fear-
evoking stimuli so that the fear-evoking stimuli lose their aversive qualities.
• In counterconditioning, an organism learns to respond to a stimulus in a way that is incompatible with a response that was conditioned earlier.
• Relaxation is incompatible with a fear response. The reasoning behind counterconditioning is this: if fears, as Watson had shown, could be conditioned by painful experiences like a clanging noise, perhaps fears could be counterconditioned by substituting pleasant experiences.
FLOODING - like counterconditioning, flooding is a behavior therapy method for reducing fears.
• In flooding, the client is exposed to the fear-evoking stimulus until fear is extinguished.
• Little Albert, for example, might have been placed in close contact with a rat until his fear had become extinguished. In extinction, the CS (in this case, the rat) is presented repeatedly in the absence of the UCS (the clanging of the steel bars) until the CR (fear) is no longer evoked.
SYSTEMATIC DESENSITIZATION a method in which the client is gradually exposed to fear-evoking stimuli under circumstances in which he or she remains relaxed.
• Example: While feeling relaxed, Little Albert might have been given an opportunity to look at photos of rats or to see rats from a distance before they were brought closer.
OPERANT CONDITIONING organisms learn to do things, or not to do things, because of the consequences of their behavior.
• Classical conditioning focuses on how organisms form anticipations about their environments. Operant conditioning focuses on what they do about them.
• In operant conditioning, voluntary responses such as pecking at a target, pressing a lever, or the skills required for playing tennis are acquired or conditioned.
• defined as a simple form of learning in which an organism learns to engage in certain behavior because of the effects of that behavior.
• We learn to engage in operant behaviors, also known simply as operants, that result in presumably desirable outcomes such as food, a hug, an A on a test, attention, or social approval.
LAW OF EFFECT: a response (such as string pulling) would be, to use Thorndike's term, "stamped in" (i.e., strengthened) in a particular situation (such as being inside a puzzle box) by a reward (escaping from the box and eating). But punishments, using Thorndike's terminology once again, "stamp out" responses. That is, organisms learn not to behave in ways that bring on punishment.
• Thorndike's view is that pleasant events stamp in responses, and unpleasant events stamp them out.
REINFORCE to follow a response with a stimulus that increases the frequency of the response. Remember that
reinforcers are not defined as pleasant events but rather as stimuli that increase the frequency of behavior.
OPERANT BEHAVIOR behavior that operates on, or manipulates, the environment.

• Skinner's supporters point out that focusing on discrete behavior creates the potential for helpful changes. For example, in helping people combat depression, one psychologist might focus on their "feelings." A Skinnerian would focus on cataloging (and modifying) the types of things that "depressed people" do. Directly modifying depressive behavior might also brighten clients' self-reports about their "feelings of depression."
Any stimulus that increases the probability that the responses preceding it (whether pecking a button in a Skinner box or studying for a quiz) will be repeated serves as a REINFORCER.
REINFORCERS are defined as stimuli that increase the frequency of behavior, not as pleasant events.
POSITIVE REINFORCER a reinforcer that, when presented, increases the frequency of an operant. Positive reinforcers increase the probability that a behavior will occur when they are applied. Food and approval usually serve as positive reinforcers.
NEGATIVE REINFORCER a reinforcer that, when removed, increases the frequency of an operant. Negative reinforcers increase the probability that a behavior will occur when the reinforcers are removed.
IMMEDIATE VERSUS DELAYED REINFORCERS
The short-term consequences of behavior often provide more of an incentive than the long-term consequences.
PRIMARY REINFORCER an unlearned reinforcer whose effectiveness is based on the biological makeup of the organism and not on learning.
Example: food, water, warmth (positive reinforcers), and pain (a negative reinforcer) all serve as primary reinforcers.
SECONDARY REINFORCERS / CONDITIONED REINFORCERS acquire their value through being associated with established reinforcers.
• In operant conditioning, the ensuing events are reinforcers.
• The EXTINCTION of learned responses results from the repeated performance of operant behavior without reinforcement.
• In other words, reinforcers maintain operant behavior or strengthen habitual behavior in operant conditioning.
• SPONTANEOUS RECOVERY is adaptive in operant conditioning as well as in classical conditioning. Reinforcers may once again become available after time elapses.
• Reinforcers are known by their effects, whereas rewards and punishments are known more by how they feel.
POSITIVE PUNISHMENT is the application of an aversive stimulus to decrease unwanted behavior, such as
spanking, scolding, or a parking ticket.
NEGATIVE PUNISHMENT is the removal of a pleasant stimulus, such as removing a student’s opportunity to
talk with friends in class by seating them apart, or removing a student’s opportunity to mentally escape from
class by taking his or her smart phone or tablet computer.

DISCRIMINATIVE STIMULUS in operant conditioning, a stimulus that indicates that reinforcement is available.
Example: green or red lights indicate whether behavior (in the case of the pigeon, pecking a button) will be reinforced (by a food pellet being dropped into the cage).
CONTINUOUS REINFORCEMENT a schedule of reinforcement in which every correct response is reinforced.
• You can get a person "hooked" on gambling by fixing the game to allow heavy winnings at first. Then you gradually space out the winnings (reinforcements) until gambling is maintained by infrequent winning, or even no winning at all.
• Responses that have been maintained by partial reinforcement are more resistant to extinction than responses that have been maintained by continuous reinforcement.
FIXED-INTERVAL SCHEDULE, a fixed amount of time—say, a minute—must elapse before the correct response
will result in a reinforcer. A schedule in which a fixed amount of time must elapse between the previous and
subsequent times that reinforcement is available
• An organism's response rate falls off after each reinforcement and then picks up again as the time when reinforcement will occur approaches.
Examples: Car dealers use fixed-interval reinforcement schedules when they offer incentives for buying up
the remainder of the year’s line in summer and fall. Similarly, you learn to check your email only at a
certain time of day if your correspondent writes at that time each day.
VARIABLE INTERVAL SCHEDULE a schedule in which a variable amount of time must elapse between the
previous and subsequent times that reinforcement is available.
Example: If we know that the boss might call us in for a report on the progress of a certain project at any time (variable-interval), we are likely to keep things in a state of reasonable readiness at all times. If you receive email from your correspondent irregularly, you are likely to check your email regularly for his or her communication, but with less eagerness.
FIXED-RATIO SCHEDULE, reinforcement is provided after a fixed number of correct responses have been
made. With a fixed-ratio schedule, it is as if the organism learns that it must make several responses before
being reinforced.
VARIABLE-RATIO SCHEDULE a schedule in which reinforcement is provided after a variable number of correct
responses.
Example: In a 10:1 ratio schedule, the mean number of correct responses that would have to be made before a subsequent correct response is reinforced is 10, but the ratio of correct responses to reinforcements might be allowed to vary from, say, 1:1 to 20:1 on a random basis.
Slot machines tend to pay off on variable-ratio schedules, and players can be seen popping coins into them
and yanking their “arms” with barely a pause. For gamblers, the unpredictability of winning maintains a high
response rate.
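To make the contrast concrete, here is a small Python sketch (an illustration, not part of the text) comparing a fixed-ratio 10 schedule, which pays off on every 10th response, with a variable-ratio schedule whose mean ratio is 10:1 and which pays off unpredictably.

import random

# Illustration: which of 50 responses would be reinforced under each schedule.
# The probability 0.1 for the variable-ratio schedule is an assumption chosen
# so that its mean ratio matches the fixed-ratio 10 schedule.

def fixed_ratio_10(response_number):
    """Reinforce every 10th correct response."""
    return response_number % 10 == 0

def variable_ratio_10(response_number):
    """Reinforce each correct response with probability 1/10."""
    return random.random() < 0.1

for name, schedule in (("fixed-ratio 10", fixed_ratio_10), ("variable-ratio 10", variable_ratio_10)):
    reinforced = [n for n in range(1, 51) if schedule(n)]
    print(name, "-> reinforced on responses:", reinforced)

# The fixed-ratio list is perfectly predictable (10, 20, 30, ...); the
# variable-ratio list is not, which is why responding continues with barely a pause.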

APPLICATION OF OPERANT CONDITIONING


BIOFEEDBACK TRAINING (BFT) is based on operant conditioning. It has enabled people and lower animals to learn to control autonomic responses to attain reinforcement. In BFT, people receive reinforcement in the form of information. For example, we can learn to emit alpha waves (the kind of brain wave associated with relaxation) through feedback from an electroencephalograph, which measures brain waves. People use other instruments to learn to lower muscle tension, heart rates, and blood pressure.
SHAPING is a procedure for teaching complex behaviors that at first reinforces approximations of the target
behavior. We can teach complex behaviors by shaping. Shaping reinforces progressive steps toward the
behavioral goal.
For example, it may be wise to smile and say, “Good,” when a reluctant newcomer gathers the courage to get
out on the dance floor, even if your feet are flattened by his initial clumsiness.
SUCCESSIVE APPROXIMATIONS behaviors that are progressively closer to the target behavior. As training proceeds, we come to expect more before we are willing to provide reinforcement. We reinforce successive approximations of the goal.
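A minimal Python sketch of this logic, assuming a made-up numeric "target behavior" and reinforcing only responses that come closer to it than the current habit (so the standard for reinforcement tightens as training proceeds):

import random

# Hypothetical illustration of shaping: reinforce successive approximations,
# expecting a closer approximation before reinforcement is given again.

random.seed(1)
target = 100.0    # the behavioral goal, on an arbitrary scale
habit = 20.0      # the learner's current typical response

for trial in range(1, 26):
    response = habit + random.uniform(-10.0, 10.0)      # responses vary around the habit
    if abs(target - response) < abs(target - habit):    # a closer approximation?
        habit = response                                 # reinforce it: it becomes the new habit
        outcome = "reinforced"
    else:
        outcome = "ignored"                              # no reinforcement; the habit is unchanged
    print(f"trial {trial:2d}: response {response:6.1f} ({outcome}), habit now {habit:6.1f}")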
But teachers can learn to use behavior modification to reinforce children when they are behaving
appropriately and, when possible, to extinguish misbehavior by ignoring it. Teachers also frequently use time
out from positive reinforcement to discourage misbehavior. In this method, children are placed in a drab,
restrictive environment for a specified period, usually about 10 minutes, when they behave disruptively. While
isolated, they cannot earn the attention of peers or teachers, and no reinforcers are present.
PROGRAMMED LEARNING (B.F. SKINNER)- This method assumes that any complex task can be broken down
into a number of small steps. These steps can be shaped individually and then combined in sequence to form
the correct behavioral chain. Programmed learning does not punish errors. Instead, correct responses are
reinforced, usually with immediate feedback. In programmed learning, one can learn without making
mistakes.
COGNITIVE FACTORS IN LEARNING
In addition to concepts such as association and reinforcement, cognitive psychologists use concepts such as
mental structures, schemas, templates, and information processing.
COGNITIVE MAP a mental representation of the layout of one’s environment. Tolman concluded that the rats
had learned about the mazes by exploring them even when they were unrewarded for doing so. He
distinguished between learning and performance. Rats apparently created a cognitive map of a maze. Even
though they were not externally motivated to follow a rapid route through the maze, they would learn fast
routes just by exploring it.
LATENT LEARNING learning that is hidden or concealed
CONTINGENCY THEORY suggests that learning occurs only when the conditioned stimulus (CS) provides information about the unconditioned stimulus (UCS); that is, the view that learning occurs when stimuli provide information about the likelihood of the occurrence of other stimuli.
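One common way to express this idea numerically (an illustration, not a formula given in the text) is the contingency delta-P: the probability of the UCS when the CS is present minus its probability when the CS is absent. The trial counts below are invented for the example.

# Illustration: delta_p = P(UCS | CS present) - P(UCS | CS absent).

def delta_p(ucs_with_cs, cs_trials, ucs_without_cs, no_cs_trials):
    """How much information the CS provides about the UCS (between -1 and 1)."""
    return ucs_with_cs / cs_trials - ucs_without_cs / no_cs_trials

# The tone is informative: the UCS follows it on 18 of 20 tone trials but
# occurs on only 2 of 20 trials without the tone.
print(delta_p(18, 20, 2, 20))    # 0.8 -> strong contingency; conditioning is expected

# The tone is uninformative: the UCS is equally likely with or without it.
print(delta_p(10, 20, 10, 20))   # 0.0 -> no contingency; little or no learning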
OBSERVATIONAL LEARNING the acquisition of knowledge and skills through the observation of others (who are called models) rather than by means of direct experience. Studies of observational learning show that we can acquire skills by observing the behavior of others. Observational learning is not acquired through reinforcement; we can learn through observation without engaging in overt responses at all.
In the terminology of observational learning, a person who engages in a response that is imitated is a MODEL (an organism that engages in a response that is then imitated by another organism).
MIRROR NEURONS neurons that fire when an animal observes the behavior of another and that tend to stimulate imitative behavior. Mirror neurons also allow us to anticipate other people's intentions when they reach for things.
If children watch two to four hours of television a day, they will have seen 8,000 murders and another 100,000
acts of violence by the time they have finished elementary school
CHAPTER 6 MEMORY
SENSORY MEMORY the type or stage of memory first encountered by a stimulus; sensory memory holds impressions briefly, but long enough so that a series of perceptions are psychologically continuous. Although sensory memory holds impressions briefly, it is long enough so that a series of perceptions seem to be connected.
MEMORY TRACE an assumed change in the nervous system that reflects the impression made by a stimulus. Sperling concluded that the memory trace of visual stimuli decays within a second.
ICON a mental representation of a visual stimulus that is held briefly in sensory memory.
ICONIC MEMORY the sensory register that briefly holds mental representations of visual stimuli.
EIDETIC IMAGERY the maintenance of detailed visual memories over several minutes.
ECHO a mental representation of an auditory stimulus (sound) that is held briefly in sensory memory.
ECHOIC MEMORY the sensory register that briefly holds mental representations of auditory stimuli. The sensory register that holds echoes.

SHORT TERM MEMORY


• "Working memory"
• the type or stage of memory that can hold information for up to a minute or so after the trace of the stimulus decays
• In short-term memory, the image tends to fade significantly after 10 to 12 seconds if it is not repeated or rehearsed.
• You need to rehearse new information to "save" it, but you may need only the proper cue to retrieve information from long-term memory.
SERIAL-POSITION EFFECT the tendency to recall more accurately the first and last items in a series. They serve as the visual or auditory boundaries for the other stimuli.
CHUNK a stimulus or group of stimuli that are perceived as a discrete piece of information.
As we noted earlier, researchers have found that the average person is comfortable with remembering about seven integers at a time, the number of integers in a telephone number. Psychologists say that the appearance of new information in short-term memory displaces the old information.
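A short Python sketch (an illustration; the capacity of seven is taken from the "about seven" rule of thumb above, not a fixed fact) shows both displacement and the payoff of chunking:

from collections import deque

# Model short-term memory as a buffer that holds about seven chunks;
# once it is full, each new chunk displaces the oldest one.
short_term_memory = deque(maxlen=7)

for digit in "5551234987":           # ten separate digits: the first three get displaced
    short_term_memory.append(digit)
print(list(short_term_memory))       # ['1', '2', '3', '4', '9', '8', '7']

short_term_memory.clear()
for chunk in ("555", "1234", "987"): # the same number as three chunks fits easily
    short_term_memory.append(chunk)
print(list(short_term_memory))       # ['555', '1234', '987']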
LONG TERM MEMORY (LTM)
• the type or stage of memory capable of relatively permanent storage
• Some psychologists (Freud was one) believed that nearly all of our perceptions and ideas are stored permanently.
• Memories are distorted by our biases and needs, by the ways in which we conceptualize our worlds.
• SCHEMA is a way of mentally representing the world, such as a belief or expectation, which can influence our perception of persons, objects, and situations.
• Your long-term memory is a biochemical "hard drive" with no known limits on the amount of information it can store.
• Now and then, it may seem that we have forgotten, or "lost," a long-term memory such as the names of elementary-school classmates, yet it may be that we cannot find the proper cues to retrieve them.
• Research associates long-term memory processing with activity in certain parts of the brain, notably the prefrontal area of the cerebral cortex.
FLASHBULB MEMORIES It appears that we tend to remember events that are surprising, important, and emotionally stirring more clearly. Such events can create flashbulb memories, which preserve experiences in detail. It is easier to discriminate stimuli that stand out. Such events are striking in themselves. The feelings caused by them are also special.
We tend to organize information according to a hierarchical structure. A hierarchy is an arrangement of items (or chunks of information) into groups or classes according to common or distinct features. As we work our way up a hierarchy, we find more encompassing, or superordinate, classes to which the items below them belong.
THE TIP-OF-THE-TONGUE PHENOMENON the feeling that information is stored in memory although it cannot
be readily retrieved; also called the feeling-of-knowing experience.
Brown and McNeill also suggested that our storage systems are indexed according to cues that include both
the sounds and the meanings of words—that is, according to both acoustic and semantic codes. By scanning
words similar in sound and meaning to the word on the tip of the tongue, we sometimes find a useful cue and
retrieve the word for which we are searching.
At such times, the problem lies not in retrieval but in the original processes of learning and memory—that is,
encoding and storage.
CONTEXT-DEPENDENT MEMORY
• The context in which we acquire information can also play a role in retrieval.
• information that is better retrieved in the context in which it was encoded and stored, or learned
STATE-DEPENDENT MEMORY information that is better retrieved in the physiological or emotional state in which it was encoded and stored, or learned
• We sometimes retrieve information better when we are in a biological or emotional state similar to the one in which we encoded and stored the information.
FORGETTING
• Forgetting is defined as failure to recognize a syllable that has been read before.
NONSENSE SYLLABLES are meaningless sets of two consonants with a vowel sandwiched in between. Because
nonsense syllables are intended to be meaningless, remembering them should depend on simple acoustic
coding and maintenance rehearsal rather than on elaborative rehearsal, semantic coding, or other ways of
making learning meaningful. They are thus well suited for use in the measurement of forgetting.
MEMORY TASKS USED IN MEASURING FORGETTING
RECOGNITION In a recognition task, one simply indicates whether an item has been seen before or which of a number of items is paired with a stimulus (as in a multiple-choice test). We can recognize correct answers more easily than we can recall them unaided; this is why multiple-choice tests are easier than fill-in-the-blank or essay tests.

RECALL
Ebbinghaus would read lists of nonsense syllables aloud to the beat of a metronome and then see how many he could produce from memory. Psychologists also often use lists of pairs of nonsense syllables, called PAIRED ASSOCIATES, to measure recall. In a recall task, the person must retrieve a syllable, with another syllable serving as a cue.
RELEARNING
To study the efficiency of relearning, Ebbinghaus devised the method of savings. First, he recorded the number of
repetitions required to learn a list of nonsense syllables or words. Then, he recorded the number of repetitions required
to relearn the list after a certain amount of time had elapsed. Next, he computed the difference in the number of
repetitions to determine the savings. If a list had to be repeated 20 times before it was learned, and 20 times again after
a year had passed, there were no savings. Relearning, that is, was as tedious as the initial learning. If the list could be
learned with only 10 repetitions after a year had elapsed, however, half the number of repetitions required for learning
had been saved.
METHOD OF SAVINGS a measure of retention in which the difference between the number of repetitions originally required to learn a list and the number of repetitions required to relearn the list after a certain amount of time has elapsed is calculated.
SAVINGS the difference between the number of repetitions originally required to learn a list and the number of repetitions required to relearn the list after a certain amount of time has elapsed.
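Savings is usually reported as a percentage of the original learning effort. The small Python function below simply restates the worked example above (20 repetitions originally; 20 or 10 repetitions at relearning).

# Savings expressed as a percentage of the repetitions originally required.

def savings_percent(original_repetitions, relearning_repetitions):
    saved = original_repetitions - relearning_repetitions
    return 100.0 * saved / original_repetitions

print(savings_percent(20, 20))   # 0.0  -> no savings; relearning was as tedious as learning
print(savings_percent(20, 10))   # 50.0 -> half the repetitions were saved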

INTERFERENCE THEORY
According to this view, we also forget material in short-term and long-term memory because newly learned material
interferes with it.
RETROACTIVE INTERFERENCE
new learning interferes with the retrieval of old learning. For example, a medical student may memorize the names of
the bones in the leg through rote repetition. Later he or she may find that learning the names of the bones in the arm
makes it more difficult to retrieve the names of the leg bones, especially if the names are similar in sound or in relative
location on each limb.
PROACTIVE INTERFERENCE
older learning interferes with the capacity to retrieve more recently learned material. High school Spanish may pop in
when you are trying to retrieve college French or Italian words. All three are Romance languages, with similar roots and
spellings. Previously learned Japanese words probably would not interfere with your ability to retrieve more recently
learned French or Italian, because the roots and sounds of Japanese differ considerably from those of the Romance
languages.
REPRESSION

• automatic ejection of painful memories and unacceptable urges from conscious awareness.
DISSOCIATIVE AMNESIA loss of memory of personal information that is thought to stem from psychological conflict or
trauma
INFANTILE AMNESIA inability to recall events that occur prior to the age of three or so; also termed childhood amnesia

• Infantile amnesia has little to do with the fact that the episodes occurred in the distant past.
• When he interviewed people about their early experiences, Freud discovered that they could not recall episodes that had happened prior to the age of three or so and that recall was cloudy through the age of five.
• For example, a structure of the limbic system (the hippocampus) that is involved in the storage of memories does not become mature until we are about two years old (Allene et al., 2012; Madsen & Kim, 2016). In addition, myelination of brain pathways is incomplete for the first few years, contributing to the inefficiency of information processing and memory formation.
• In any event, we are unlikely to remember episodes from the first two years of life unless we are reminded of them from time to time as we develop.
ANTEROGRADE AMNESIA failure to remember events that occurred after physical trauma because of the effects of the trauma. There are memory lapses for the period following a trauma such as a blow to the head, an electric shock, or an operation. The ability to pay attention, the encoding of sensory input, and rehearsal are all impaired.
RETROGRADE AMNESIA the source of trauma prevents people from remembering events that took place before the accident.

THE BIOLOGY OF MEMORY


ENGRAMS were viewed as electrical circuits in the brain that corresponded to memory traces, that is, neurological processes that paralleled experiences. An engram is an assumed electrical circuit in the brain that corresponds to a memory trace.
Researchers have been able to study how experience is reflected at the synapses of specific neurons. The sea snail will reflexively withdraw its gills when it receives an electric shock, in the way a person will reflexively withdraw a hand from a hot stove or a thorn. In one kind of experiment, researchers precede the shock with a squirt of water. After a few repetitions, the sea snail becomes conditioned to withdraw its gills when squirted with the water. When sea snails are conditioned, they release more serotonin at certain synapses. As a consequence, transmission at these synapses becomes more efficient as trials (learning) progress (Kandel et al., 2014; Squire & Kandel, 2008).
LONG-TERM POTENTIATION
enhanced efficiency in synaptic transmission that follows brief, rapid stimulation.
The HIPPOCAMPUS is vital in storing new information, even if we can retrieve old information without it. Rather than serving as a permanent storehouse of memories, it is involved in relaying sensory information to parts of the cortex. The hippocampus is also involved in the where and when of things. Adults with hippocampal damage may be able to form new procedural memories, even though they cannot form new episodic ("where and when") memories.
The LIMBIC SYSTEM is largely responsible for integrating these pieces of information when we recall an event.
The PREFRONTAL CORTEX is the executive center in memory (Nee et al., 2013). It appears to empower people with
consciousness—the ability to mentally represent and become aware of experiences that occur in the past, present, and
future. It enables people to mentally travel back in time to re-experience the personal, autobiographical past.

The THALAMUS is involved in the formation of verbal memories.
