Why Does Peer Instruction Benefit Student Learning?
Tullis and Goldstone, Cognitive Research: Principles and Implications (2020) 5:15
https://doi.org/10.1186/s41235-020-00218-5
Abstract
In peer instruction, instructors pose a challenging question to students, students answer the question individually,
students work with a partner in the class to discuss their answers, and finally students answer the question again. A
large body of evidence shows that peer instruction benefits student learning. To determine the mechanism for
these benefits, we collected semester-long data from six classes, involving a total of 208 undergraduate students
being asked a total of 86 different questions related to their course content. For each question, students chose their
answer individually, reported their confidence, discussed their answers with their partner, and then indicated their
possibly revised answer and confidence again. Overall, students were more accurate and confident after discussion
than before. Initially correct students were more likely to keep their answers than initially incorrect students, and
this tendency was partially but not completely attributable to differences in confidence. We discuss the benefits of
peer instruction in terms of differences in the coherence of explanations, social learning, and the contextual factors
that influence confidence and accuracy.
Keywords: Group decisions, Peer instruction, Metacognition, Confidence, Decision making
Despite wide variations in its implementation, peer instruction consistently benefits student learning. Switching classroom structure from didactic lectures to one centered around peer instruction improves learners’ conceptual understanding (Duncan, 2005; Mazur, 1997), reduces student attrition in difficult courses (Lasry, Mazur, & Watkins, 2008), decreases failure rates (Porter, Bailey-Lee, & Simon, 2013), improves student attendance (Deslauriers, Schelew, & Wieman, 2011), and bolsters student engagement (Lucas, 2009) and attitudes toward their course (Beekes, 2006). Benefits of peer instruction have been found across many fields, including physics (Mazur, 1997; Pollock, Chasteen, Dubson, & Perkins, 2010), biology (Knight, Wise, & Southard, 2013; Smith, Wood, Krauter, & Knight, 2011), chemistry (Brooks & Koretsky, 2011), physiology (Cortright, Collins, & DiCarlo, 2005; Rao & DiCarlo, 2000), calculus (Lucas, 2009; Miller, Santana-Vega, & Terrell, 2007), computer science (Porter et al., 2013), entomology (Jones, Antonenko, & Greenwood, 2012), and even philosophy (Butchart, Handfield, & Restall, 2009). Additionally, benefits of peer instruction have been found at prestigious private universities, two-year community colleges (Lasry et al., 2008), and even high schools (Cummings & Roberts, 2008). Peer instruction not only benefits the specific questions posed during discussion, but also improves accuracy on later, similar problems (e.g., Smith et al., 2009).

One of the consistent empirical hallmarks of peer instruction is that students’ answers are more frequently correct following discussion than preceding it. For example, in introductory computer science courses, post-discussion performance was higher on 70 out of 71 questions throughout the semester (Simon, Kohanfars, Lee, Tamayo, & Cutts, 2010). Further, gains in performance from discussion are found on many different types of questions, including recall, application, and synthesis questions (Rao & DiCarlo, 2000). Performance improvements are found because students are more likely to switch from an incorrect answer to the correct answer than from the correct answer to an incorrect answer. In physics, 59% of incorrect answers switched to correct following discussion, but only 13% of correct answers switched to incorrect (Crouch & Mazur, 2001). Other research on peer instruction shows the same pattern: 41% of incorrect answers are switched to correct ones, while only 18% of correct answers are switched to incorrect (Morgan & Wakefield, 2012). On qualitative problem-solving questions in physiology, 57% of incorrect answers switched to correct after discussion, and only 7% of correct answers to incorrect (Giuliodori, Lujan, & DiCarlo, 2006).

There are two explanations for improvements in pre-discussion to post-discussion accuracy. First, switches from incorrect to correct answers may be driven by selecting the answer from the peer who is more confident. When students discuss answers that disagree, they may choose whichever answer belongs to the more confident peer. Evidence about decision-making and advice-taking substantiates this account. First, confidence is correlated with correctness across many settings and procedures (Finley, Tullis, & Benjamin, 2010). Students who are more confident in their answers are typically more likely to be correct. Second, research examining decision-making and advice-taking indicates that (1) the less confident you are, the more you value others’ opinions (Granovskiy, Gold, Sumpter, & Goldstone, 2015; Harvey & Fischer, 1997; Yaniv, 2004a, 2004b; Yaniv & Choshen-Hillel, 2012) and (2) the more confident the advisor is, the more strongly they influence your decision (Kuhn & Sniezek, 1996; Price & Stone, 2004; Sah, Moore, & MacCoun, 2013; Sniezek & Buckley, 1995; Van Swol & Sniezek, 2005; Yaniv, 2004b). Consequently, if students simply choose their final answer based upon whoever is more confident, accuracy should increase from pre-discussion to post-discussion. This explanation suggests that switches in answers should be driven entirely by a combination of one’s own initial confidence and one’s partner’s confidence. In accord with this confidence view, Koriat (2015) shows that an individual’s confidence typically reflects the answer most often given by the group. When the answer most often given by group members is incorrect, peer interactions amplify the selection of and confidence in incorrect answers. Correct answers have no special draw. Rather, peer instruction merely amplifies the dominant view through differences in individuals’ confidence.

In a second explanation, working with others may prompt students to verbalize explanations, and verbalizations may generate new knowledge. More specifically, as students discuss the questions, they need to create a common representation of the problem and answer. Generating a common representation may compel students to identify gaps in their existing knowledge and construct new knowledge (Schwartz, 1995). Further, peer discussion may promote students’ metacognitive processes of detecting and correcting errors in their mental models. Students create more new knowledge and better diagnostic tests of answers together than alone. Ultimately, then, the new knowledge and improved metacognition may make the correct answer appear more compelling or coherent than incorrect options. Peer discussion would draw attention to coherent or compelling answers, more so than students’ initial confidence alone, and the coherence of the correct answer would prompt students to switch away from incorrect answers. Similarly, Trouche, Sander, and Mercier (2014) argue that interactions in a group prompt argumentation and discussion of reasoning. Good arguments and reasoning should be more compelling in changing individuals’ answers than confidence alone. Indeed, in a reasoning task known to benefit from careful deliberation,
Tullis and Goldstone Cognitive Research: Principles and Implications (2020) 5:15 Page 3 of 12
in other classes were not required to tell their answer or their confidence to their peer. Finally, the questions appeared at any point during the class period for the cognitive psychology classes, while the questions typically happened at the beginning of each class for the other classes.

Results
Analytic strategy
Data are available on the OpenScienceFramework: https://mfr.osf.io/render?url=https://osf.io/5qc46/?action=download%26mode=render.

For most of our analyses, we used linear mixed-effects models (Baayen, Davidson, & Bates, 2008; Murayama, Sakaki, Yan, & Smith, 2014). The unit of analysis in a mixed-effects model is the outcome of a single trial (e.g., whether or not a particular question was answered correctly by a particular participant). We modeled these individual trial-level outcomes as a function of multiple fixed effects - those of theoretical interest - and multiple random effects - effects for which the observed levels are sampled out of a larger population (e.g., questions, students, and classes sampled out of a population of potential questions, students, and classes).

Linear mixed-effects models solve four statistical problems involved with the data of peer instruction. First, there is large variability in students’ performance and the difficulty of questions across students and classes. Mixed-effects models simultaneously account for random variation both across participants and across items (Baayen et al., 2008; Murayama et al., 2014). Second, students may miss individual classes and therefore may not provide data across every item. Similarly, classes varied in how many peer instruction questions were posed throughout the semester and in the number of students enrolled. Mixed-effects models weight each response equally when drawing conclusions (rather than weighting each student or question equally) and can easily accommodate missing data. Third, we were interested in how several different characteristics influenced students’ performance. Mixed-effects models can include multiple predictors simultaneously, which allows us to test the effect of one predictor while controlling for others. Finally, mixed-effects models can predict the log odds (or logit) of a correct answer, which is needed when examining binary outcomes (i.e., correct or incorrect; Jaeger, 2008).

We fit all models in R using the lmer() function of the lme4 package (Bates, Maechler, Bolker, & Walker, 2015). For each mixed-effects model, we included random intercepts that capture baseline differences in difficulty of questions, in classes, and in students, in addition to multiple fixed effects of theoretical interest. In mixed-effects models with hundreds of observations, the t distribution effectively converges to the normal, so we compared the t statistic to the normal distribution for analyses involving continuous outcomes (i.e., confidence; Baayen, 2008). P values can be directly obtained from Wald z statistics for models with binary outcomes (i.e., correctness).

Does accuracy change through discussion?
First, we examined how correctness changed across peer discussion. A logit model predicting correctness from time point (pre-discussion to post-discussion) revealed that the odds of correctness increased by 1.57 times (95% confidence interval (conf) 1.31–1.87) from pre-discussion to post-discussion, as shown in Table 2. In fact, 88% of students showed an increase or no change in accuracy from pre-discussion to post-discussion. Pre-discussion to post-discussion performance for each class is shown in Table 3.

Table 2 The effect of time point (pre-discussion to post-discussion) on accuracy using a mixed-effects logit model
Fixed effect                  β̂      SE     Wald z   p
Intercept                     0.68    0.19   3.515    .0004
Time point (pre to post)      0.45    0.09   5.102    < .0001

We further examined how accuracy changed from pre-discussion to post-discussion for each question, and the results are plotted in Fig. 1. The data show a consistent improvement in accuracy from pre-discussion to post-discussion across all levels of initial difficulty.

We examined how performance increased from pre-discussion to post-discussion by tracing the correctness of answers through the discussion. Figure 2 tracks the percent (and number of items) correct from pre-discussion to post-discussion. The top row shows whether students were initially correct or incorrect in their answer; the middle row shows whether students agreed or disagreed with their partner; the last row shows whether students were correct or incorrect after discussion. Additionally, Fig. 2 shows the confidence associated with each pathway. The bottom line of each entry shows the students’ average confidence; in the middle white row, the confidence reported is the average of the peer’s confidence.

Broadly, only 5% of correct answers were switched to incorrect, while 28% of incorrect answers were switched to correct following discussion. Even for the items on which students were initially correct but disagreed with their partner, only 21% of answers were changed to incorrect answers after discussion. However, out of the items where students were initially incorrect and disagreed with their partner, 42% were changed to the correct answer.

Does confidence predict switching?
Differences in the amount of switching to correct or incorrect answers could be driven solely by differences in confidence, as described in our first theory mentioned earlier. For this theory to hold, answers with greater confidence must have a greater likelihood of being correct.
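The odds ratios reported in this section follow directly from the logit coefficients in Table 2: exponentiating a coefficient gives the multiplicative change in odds, and exponentiating β̂ ± 1.96 × SE gives the Wald-style 95% confidence interval. The models themselves were fit in R with lme4; the short Python sketch below only reproduces this conversion arithmetic from the published estimates (an illustration, not a re-analysis):

```python
import math

def odds_ratio_with_ci(beta, se, z_crit=1.96):
    """Convert a logit coefficient and its standard error into an
    odds ratio with a Wald-style 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z_crit * se),
            math.exp(beta + z_crit * se))

# Time point (pre to post) from Table 2: beta-hat = 0.45, SE = 0.09
or_est, lo, hi = odds_ratio_with_ci(0.45, 0.09)
print(f"OR = {or_est:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
# OR = 1.57, 95% CI [1.31, 1.87], matching the reported values
```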
To examine whether initial confidence is associated with initial correctness, we calculated the gamma correlation between correctness and confidence in the answer before discussion, as shown in the first column of Table 4. The average gamma correlation between initial confidence and initial correctness (mean (M) = 0.40) was greater than zero, t (160) = 8.59, p < 0.001, d = 0.68, indicating that greater confidence was associated with being correct.

Changing from an incorrect to a correct answer, then, may be driven entirely by selecting the answer from the peer with the greater confidence during discussion, even though most of the students in our sample were not required to explicitly disclose their confidence to their partner during discussion. We examined how frequently students chose the more confident answer when peers disagreed. When peers disagreed, students’ final answers aligned with the more confident peer only 58% of the time. Similarly, we tested what the performance would be if peers always picked the answer of the more confident peer. If peers always chose the more confident answer during discussion, the final accuracy would be 69%, which is significantly lower than actual final accuracy (M = 72%, t (207) = 2.59, p = 0.01, d = 0.18). While initial confidence is related to accuracy, these results show that confidence is not the only predictor of switching answers.

Does correctness predict switching beyond confidence?
Discussion may reveal information about the correctness of answers by generating new knowledge and testing the coherence of each possible answer. To test whether the correctness of an answer added predictive power beyond the confidence of the peers involved in discussion, we analyzed situations in which students disagreed with their partner. Out of the instances when partners initially disagreed, we predicted the likelihood of keeping one’s answer based upon one’s own confidence, the partner’s confidence, and whether one’s answer was initially correct.
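The confidence-following benchmark described above can be computed mechanically from paired responses. Below is a minimal Python sketch of that decision rule on hypothetical data (the dyad tuples are invented for illustration, and the tie-handling choice is ours: on disagreement both partners adopt the more confident member’s answer, while ties leave each student’s own answer in place):

```python
def accuracy_if_follow_confidence(dyads):
    """Each dyad: (answer_a, conf_a, answer_b, conf_b, correct_answer).
    Returns the fraction of individual final answers that would be
    correct if, on disagreement, both partners adopted the more
    confident member's answer (ties: each keeps their own answer)."""
    correct = total = 0
    for ans_a, conf_a, ans_b, conf_b, key in dyads:
        if ans_a == ans_b or conf_a == conf_b:
            finals = [ans_a, ans_b]
        elif conf_a > conf_b:
            finals = [ans_a, ans_a]
        else:
            finals = [ans_b, ans_b]
        correct += sum(f == key for f in finals)
        total += len(finals)
    return correct / total

# Hypothetical example: a confident wrong partner drags a correct one along
dyads = [("A", 4, "B", 2, "B"),   # more confident peer is wrong
         ("C", 3, "C", 5, "C")]   # agreement on the correct answer
print(accuracy_if_follow_confidence(dyads))  # 0.5
```

Applying this rule to the actual response data is what yields the 69% benchmark reported above, which students outperformed.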
Fig. 1 The relationship between pre-discussion accuracy (x axis) and post-discussion accuracy (y axis). Each point represents a single question.
The solid diagonal line represents equal pre-discussion and post-discussion accuracy; points above the line indicate improvements in accuracy
and points below represent decrements in accuracy. The dashed line indicates the line of best fit for the observed data
Fig. 2 The pathways of answers from pre-discussion (top row) to post-discussion (bottom row). Percentages indicate the portion of items from
the category immediately above in that category, the numbers in brackets indicate the raw numbers of items, and the numbers at the bottom of
each entry indicate the confidence associated with those items. In the middle, white row, confidence values show the peer’s confidence.
Turquoise indicates incorrect answers and yellow indicates correct answers
The results of a model predicting whether students keep their answers are shown in Table 5. For each one-point increase in one’s own confidence, the odds of keeping one’s answer increased 1.25 times (95% conf 1.13–1.38). For each one-point decrease in the partner’s confidence, the odds of keeping one’s answer increased 1.19 times (1.08–1.32). The beta weight for one’s own confidence did not differ from the beta weight of the partner’s confidence, χ2 = 0.49, p = 0.48. Finally, if one’s own answer was correct, the odds of keeping one’s answer increased 4.48 times (2.92–6.89). In other words, the more confident students were, the more likely they were to keep their answer; the more confident their peer was, the more likely they were to change their answer; and finally, if a student was correct, they were more likely to keep their answer.

To illustrate this relationship, we plotted the probability of keeping one’s own answer as a function of the difference between one’s own and their partner’s confidence for initially correct and incorrect answers. As shown in Fig. 3, at every confidence level, being correct led to keeping one’s answer equally or more frequently than being incorrect.

As another measure of whether discussion allows learners to test the coherence of the correct answer, we analyzed how discussion impacted confidence when partners’ answers agreed. We predicted confidence in answers from the interaction of time point (i.e., pre-discussion versus post-discussion) and being initially correct for situations in which peers initially agreed on their answer. The results, displayed in Table 6, show that confidence increased from pre-discussion to post-discussion by 1.08 points and that confidence was greater for initially correct answers (than incorrect answers) by 0.78 points. As the interaction between time point and initial correctness shows, confidence increased more from pre-discussion to post-discussion when students were initially correct (as compared to initially incorrect). To illustrate this relationship, we plotted pre-confidence against post-confidence for initially correct and initially incorrect answers when peers agreed (Fig. 4). Each plotted point represents a student; the diagonal blue line indicates no change between pre-confidence and post-confidence. The graph reflects that confidence increased more from pre-discussion to post-discussion for correct answers than for incorrect answers, even when we only consider cases where peers agreed.

If students engage in more comprehensive answer testing during discussion than before, the relationship
Table 4 The gamma correlation between accuracy and confidence before and after discussion for each class
Class                               Pre-gamma   Post-gamma   SD of difference   Paired t test comparing pre to post a   Cohen’s d
Cognitive Psych (Psych) 2015        0.60        0.79         0.52               t (18) = 1.22, p = 0.24                 0.29
Cognitive Psych (Psych) 2017        0.27        0.40         0.74               t (37) = 2.29, p = 0.02                 0.38
Decision Making (Ed Psych) 2016     0.36        0.56         0.46               t (22) = 3.21, p = 0.004                0.47
Decision Making (Ed Psych) 2017     0.47        0.44         0.46               t (33) = 0.24, p = 0.81                 − 0.04
Learning Theories (Ed Psych) 2016   0.18        0.28         0.45               t (11) = 1.57, p = 0.14                 0.23
Learning Theories (Ed Psych) 2018   0.43        0.37         0.37               t (13) = 0.58, p = 0.57                 − 0.16
Overall                             0.40        0.48         0.55               t (139) = 2.98, p = 0.003               0.24
a Gamma correlation requires that learners have variance in both confidence and correctness before and after discussion. Degrees of freedom are reduced because many students did not have requisite variation
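The gamma values in Table 4 are Goodman-Kruskal gamma correlations: across all pairs of a student’s responses, gamma = (C − D) / (C + D), where C counts pairs that confidence and correctness order the same way and D counts pairs they order oppositely; tied pairs are dropped, which is why students lacking variance in either variable contribute no estimate. A small Python sketch of the statistic (illustrative, not the authors’ analysis code):

```python
def goodman_kruskal_gamma(confidence, correctness):
    """Gamma = (C - D) / (C + D) over all response pairs; ties ignored.
    Returns None when every pair is tied (no variance in a variable)."""
    concordant = discordant = 0
    n = len(confidence)
    for i in range(n):
        for j in range(i + 1, n):
            dc = confidence[i] - confidence[j]
            dk = correctness[i] - correctness[j]
            if dc * dk > 0:
                concordant += 1
            elif dc * dk < 0:
                discordant += 1
    if concordant + discordant == 0:
        return None
    return (concordant - discordant) / (concordant + discordant)

# Higher confidence always paired with correct answers -> gamma = 1.0
print(goodman_kruskal_gamma([1, 2, 5, 6], [0, 0, 1, 1]))  # 1.0
```

The None case mirrors the table’s footnote: students whose confidence or correctness never varied yield no concordant or discordant pairs and are excluded, reducing the degrees of freedom.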
Table 5 Logit mixed-level regression analysis
Fixed effect                         β̂        SE     Wald z   p
Intercept                            − 0.18    0.13   1.36     .17
Own confidence (mean-centered)       0.22      0.05   4.16     < .0001
Partner confidence (mean-centered)   − 0.18    0.05   3.51     .0005
Own correct                          1.50      0.22   6.73     < .0001
The results of a logit mixed-level regression predicting keeping one’s answer from one’s own confidence, the peer’s confidence, and the correctness of one’s initial answer for situations in which peers initially disagreed

Table 6 Mixed-level regression analysis predicting confidence
Fixed effect                   β̂      SE     t value   p
Intercept                      5.63    0.21   26.66
Time point (pre vs post)       1.08    0.14   7.98      < .0001
Initial correct                0.78    0.13   6.05      < .0001
Time point × Initial correct   0.33    0.15   2.14      .03
The results of the mixed-level regression predicting confidence in one’s answer from the time point (pre- or post-discussion), the correctness of one’s answer, and their interaction for situations in which peers initially agreed
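The coefficients in Table 5 can be converted into predicted probabilities of keeping one’s answer with the inverse logit. The sketch below uses only the fixed-effect estimates (ignoring the random intercepts) and illustrates how the reported betas map onto probabilities; it is not part of the original analysis:

```python
import math

def p_keep(own_conf_c, partner_conf_c, own_correct,
           b0=-0.18, b_own=0.22, b_partner=-0.18, b_correct=1.50):
    """Predicted probability of keeping one's answer after a disagreement,
    using the fixed-effect estimates from Table 5 (confidences are
    mean-centered, so 0 means average confidence)."""
    logit = (b0 + b_own * own_conf_c + b_partner * partner_conf_c
             + b_correct * own_correct)
    return 1.0 / (1.0 + math.exp(-logit))

# At average confidence for both partners:
print(round(p_keep(0, 0, 1), 2))  # 0.79 for an initially correct student
print(round(p_keep(0, 0, 0), 2))  # 0.46 for an initially incorrect student
```

At average confidence for both partners, an initially correct student is predicted to keep their answer roughly 79% of the time versus roughly 46% for an initially incorrect student, which is the gap the "Own correct" coefficient captures.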
between confidence in their answer and the accuracy of their answer should be stronger following discussion than it is before. We examined whether confidence accurately reflected correctness before and after discussion. To do so, we calculated the gamma correlation between confidence and accuracy, as is typically reported in the literature on metacognitive monitoring (e.g., Son & Metcalfe, 2000; Tullis & Fraundorf, 2017). Across all students, the resolution of metacognitive monitoring increases from pre-discussion to post-discussion (t (139) = 2.98, p = 0.003, d = 0.24; for a breakdown of gamma calculations for each class, see Table 4). Confidence was more accurately aligned with accuracy following discussion than preceding it. The resolution between student confidence and correctness increases through discussion, suggesting that discussion offers better coherence testing than answering alone.

Discussion
To examine why peer instruction benefits student learning, we analyzed student answers and confidence before and after discussion across six psychology classes. Discussing a question with a partner improved accuracy across classes and grade levels with small to medium-sized effects. Questions of all difficulty levels benefited from peer discussion; even questions where less than half of students originally answered correctly saw improvements from discussion. Benefits across the spectrum of question difficulty align with prior research showing improvements when even very few students initially know the correct answer (Smith et al., 2009). More students switched from incorrect answers to correct answers than vice versa, leading to an improvement in accuracy following discussion. Answer switching was driven by a student’s own confidence in their answer and their partner’s confidence. Greater confidence in one’s answer indicated a greater likelihood of keeping the answer; a partner’s greater confidence increased the likelihood of changing to their answer.

Switching answers depended on more than just confidence: even when accounting for students’ confidence
Fig. 3 The probability of keeping one’s answer in situations where one’s partner initially disagreed as a function of the difference between
partners’ levels of confidence. Error bars indicate the standard error of the proportion and are not shown when the data are based upon a single
data point
Fig. 4 The relationship between pre-discussion and post-discussion confidence as a function of the accuracy of an answer when partners agreed.
Each dot represents a student
levels, the correctness of the answer impacted switching behavior. Across several measures, our data showed that the correctness of an answer carried weight beyond confidence. For example, the correctness of the answer predicted whether students switched their initial answer during peer disagreements, even after taking the confidence of both partners into account. Further, students’ confidence increased more when partners agreed on the correct answer compared to when they agreed on an incorrect answer. Finally, although confidence increased from pre-discussion to post-discussion when students changed their answers from incorrect to correct ones, confidence decreased when students changed their answer away from the correct one. A plausible interpretation of this difference is that when students switch from a correct answer to an incorrect one, their decrease in confidence reflects the poor coherence of their final incorrect selection.

Whether peer instruction resulted in optimal switching behaviors is debatable. While accuracy improved through discussion, final accuracy was worse than if students had optimally switched their answers during discussion. If students had chosen the correct answer whenever one of the partners initially chose it, the final accuracy would have been significantly higher (M = 0.80 (SD = 0.19)) than in our data (M = 0.72 (SD = 0.24), t (207) = 6.49, p < 0.001, d = 0.45). While this might be interpreted as “process loss” (Steiner, 1972; Weldon & Bellinger, 1997), that would assume that there is sufficient information contained within the dyad to ascertain the correct answer. One individual selecting the correct answer is inadequate for this claim because they may not have a compelling justification for their answer.

When we account for differences in initial confidence, students’ final accuracy was better than expected. Students’ final accuracy was better than that predicted from a model in which students always choose the answer of the more confident peer. This over-performance, often called “process gain”, can sometimes emerge when individuals collaborate to create or generate new knowledge (Laughlin, Bonner, & Miner, 2002; Michaelsen, Watson, & Black, 1989; Sniezek & Henry, 1989; Tindale & Sheffey, 2002). Final accuracy reveals that students did not simply choose the answer of the more confident student during discussion; instead, students more thoroughly probed the coherence of answers and mental models during discussion than they could do alone.

Students’ final accuracy emerges from the interaction between the pairs of students, rather than solely from individuals’ sequestered knowledge prior to discussion (e.g., Wegner, Giuliano, & Hertel, 1985). Schwartz (1995) details four specific cognitive products that can emerge through working in dyads. First, dyads force verbalization of ideas through discussion, and this verbalization facilitates generating new knowledge. Students may not create a coherent explanation of their answer until they engage in discussion with a peer. When students create a verbal explanation of their answer to discuss with a peer, they can identify knowledge gaps and construct new knowledge to fill those gaps. Prior research examining the content of peer interactions during argumentation in upper-level biology classes has shown that these kinds of co-construction happen frequently; over three quarters of statements during discussion involve an exchange of claims and reasoning to support those claims (Knight et al., 2013). Second, dyads have more information processing resources than individuals, so they can solve more complex problems. Third, dyads may foster greater motivation than individuals. Finally, dyads may stimulate the creation of new, abstract representations of knowledge, above and beyond what one would expect from the level of abstraction created by individuals.
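The optimal-switching benchmark discussed above is a “truth wins” rule: a dyad is scored as if both members end up correct whenever either member initially chose the correct answer. A toy Python sketch of that upper bound, with individual-level scoring and hypothetical dyads invented for illustration:

```python
def truth_wins_accuracy(dyads):
    """Upper-bound accuracy if both partners switched to the correct
    answer whenever either member initially chose it.
    Each dyad: (answer_a, answer_b, correct_answer)."""
    correct = total = 0
    for ans_a, ans_b, key in dyads:
        # Both members count as correct if either one started correct
        correct += 2 if key in (ans_a, ans_b) else 0
        total += 2
    return correct / total

dyads = [("A", "B", "B"),   # one partner correct -> both scored correct
         ("C", "C", "C"),   # both correct
         ("D", "E", "F")]   # neither correct
print(round(truth_wins_accuracy(dyads), 2))  # 0.67
```

Computed over the actual dyads, this rule gives the 0.80 ceiling against which the observed final accuracy of 0.72 is compared.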
to communicate with their partner; to create common Peer instruction may benefit from the generation of
ground and facilitate discourse, dyads negotiate common explanations, but social influences may compound those
representations to coordinate different perspectives. The benefits. Social interactions may help students monitor
common representations bridge multiple perspectives, so and regulate their cognition better than self-explanations
they lose idiosyncratic surface features of individuals’ repre- alone (e.g., Jarvela et al., 2015; Kirschner, Kreijns, Phielix,
sentation. Working in pairs generates new knowledge and & Fransen, 2015; Kreijns, Kirschner, & Vermeulen, 2013;
tests of answers that could not be predicted from individuals’ Phielix, Prins, & Kirschner, 2010; Phielix, Prins, Kirsch-
performance alone. ner, Erkens, & Jaspers, 2011). Peers may be able to judge
More broadly, teachers often put students in groups so the quality of the explanation better than the explainer.
that they can learn from each other by giving and receiving In fact, recent research suggests that peer instruction fa-
help, recognizing contradictions between their own and cilitates learning even more than self-explanations (Ver-
others’ perspectives, and constructing new understandings steeg, van Blankenstein, Putter, & Steendijk, 2019).
from divergent ideas (Bearison, Magzamen, & Filardo, Not only does peer instruction generate new know-
1986; Bossert, 1988-1989; Brown & Palincsar, 1989; Webb ledge, but it may also improve students’ metacognition.
& Palincsar, 1996). Giving explanations to a peer may en- Our data show that peer discussion prompted more
courage explainers to clarify or reorganize information, thorough testing of the coherence of the answers. Specif-
recognize and rectify gaps in understandings, and build ically, students’ confidences were better aligned with ac-
more elaborate interpretations of knowledge than they curacy following discussion than before. Improvements
would have alone (Bargh & Schul, 1980; Benware & Deci, in metacognitive resolution indicate that discussion pro-
1984; King, 1992; Yackel, Cobb, & Wood, 1991). Prompting vides more thorough testing of answers and ideas than
students to explain why and how problems are solved facili- does answering questions on one’s own. Discussion facil-
Generating self-explanations facilitates conceptual learning more than reading the problem solutions twice without self-explanations (Chi, de Leeuw, Chiu, & LaVancher, 1994; Rittle-Johnson, 2006; Wong, Lawson, & Keeves, 2002). Self-explanations can prompt students to retrieve, integrate, and modify their knowledge with new knowledge; self-explanations can also help students identify gaps in their knowledge (Bielaczyc, Pirolli, & Brown, 1995; Chi & Bassock, 1989; Chi, Bassock, Lewis, Reimann, & Glaser, 1989; Renkl, Stark, Gruber, & Mandl, 1998; VanLehn, Jones, & Chi, 1992; Wong et al., 2002), detect and correct errors, and facilitate deeper understanding of conceptual knowledge (Aleven & Koedinger, 2002; Atkinson, Renkl, & Merrill, 2003; Chi & VanLehn, 2010; Graesser, McNamara, & VanLehn, 2005). Peer instruction, while leveraging these benefits of self-explanation, also goes beyond them by involving what might be called "other-explanation" processes: processes recruited not just when explaining a situation to oneself but to others. Mercier and Sperber (2019) argue that much of human reason is the result of generating explanations that will be convincing to other members of one's community, thereby compelling others to act in the way that one wants.

Conversely, students receiving explanations can fill in gaps in their own understanding, correct misconceptions, and construct new, lasting knowledge. Fellow students may be particularly effective explainers because they can better take the perspective of their peer than the teacher can (Priniski & Horne, 2019; Ryskin, Benjamin, Tullis, & Brown-Schmidt, 2015; Tullis, 2018). Peers may be better able than expert teachers to explain concepts in familiar terms and direct peers' attention to the relevant features of questions that they do not understand (Brown & Palincsar, 1989; Noddings, 1985; Vedder, 1985; Vygotsky, 1981). Explaining an answer to a peer also facilitates the metacognitive processes of detecting errors and assessing the coherence of an answer.

Agreement among peers has important consequences for final behavior. For example, when peers agreed, students very rarely changed their answer (less than 3% of the time). Further, large increases in confidence occurred when students agreed (as compared to when they disagreed). In contrast, disagreements likely engaged different discussion processes and prompted students to combine different answers. Whether students weighed their initial answer more than their partner's initial answer remains debatable. When students disagreed with their partner, they were more likely to stick with their own answer than switch; they kept their own answer 66% of the time. Even when their partner was more confident, students only switched to their partner's answer 50% of the time. The low rate of switching during disagreements suggests that students weighed their own answer more heavily than their partner's answer. In fact, across prior research, deciders typically weigh their own thoughts more than the thoughts of an advisor (Harvey, Harries, & Fischer, 2000; Yaniv & Kleinberger, 2000).

Interestingly, peers agreed more frequently than expected by chance. When students were initially correct (64% of the time), 78% of peers agreed. When students were initially incorrect (36% of the time), peers agreed 43% of the time. Pairs of students, then, agree more than expected by a random distribution of answers throughout the classroom. These data suggest that students group themselves into pairs based upon the likelihood of sharing the same answer. Further, these data suggest that student understanding is not randomly distributed throughout the physical space of the classroom.
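The "more than chance" comparison can be made concrete with a quick back-of-the-envelope check. The sketch below uses the base rates and observed agreement rates reported in the text; the single-correct-option and evenly spread three-distractor assumptions are ours, added purely for illustration.

```python
# Chance-agreement bounds under random pairing, compared with the observed
# agreement rates reported in the text. Assumptions (ours, for illustration):
# one correct option per question; wrong answers spread evenly over k distractors.

p_correct = 0.64       # base rate of initially correct answers (from the text)
p_incorrect = 0.36     # base rate of initially incorrect answers
k = 3                  # assumed number of distractors

# A correct student's randomly chosen partner agrees only if the partner is
# also correct (all correct students share the single correct answer).
chance_agree_given_correct = p_correct                 # 0.64

# An incorrect student's random partner must be incorrect AND hold the same
# wrong answer; with evenly spread distractors:
chance_agree_given_incorrect = p_incorrect / k         # 0.12
# Even in the worst case (every incorrect student picks the same distractor),
# chance agreement for incorrect students cannot exceed p_incorrect = 0.36.

observed = {"correct": 0.78, "incorrect": 0.43}        # from the text

print(observed["correct"] > chance_agree_given_correct)   # True: 0.78 > 0.64
print(observed["incorrect"] > p_incorrect)                # True: 0.43 > 0.36
```

Observed agreement exceeds the chance benchmark in both cases, and for incorrect students it exceeds even the most generous upper bound, which is the basis for inferring that students sit near peers of similar understanding.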
Tullis and Goldstone Cognitive Research: Principles and Implications (2020) 5:15 Page 10 of 12
Across all classes, students were instructed to work with a neighbor to discuss their answer. Given that neighbors agreed more than predicted by chance, students seem to sit near, and pair with, peers who share their same levels of understanding. Our results from peer instruction reveal that students physically locate themselves near students of similar abilities. Peer instruction could potentially benefit from randomly pairing students together (i.e., not with a physically close neighbor) to generate the most disagreements and generative activity during discussion.

Learning through peer instruction may involve deep processing as peers actively challenge each other, and this deep processing may effectively support long-term retention. Future research can examine the persistence of gains in accuracy from peer instruction. For example, whether errors that are corrected during peer instruction stay corrected on later retests of the material remains an open question. High- and low-confidence errors that are corrected during peer instruction may result in different long-term retention of the correct answer; more specifically, the hypercorrection effect suggests that errors committed with high confidence are more likely to be corrected on subsequent tests than errors committed with low confidence (e.g., Butler, Fazio, & Marsh, 2011; Butterfield & Metcalfe, 2001; Metcalfe, 2017). Whether hypercorrection holds for corrections from classmates during peer instruction (rather than from an absolute authority) could be examined in the future.

The influence of partner interaction on accuracy may depend upon the domain and kind of question posed to learners. For simple factual or perceptual questions, partner interaction may not consistently benefit learning. More specifically, partner interaction may amplify and bolster wrong answers when factual or perceptual questions lead most students to answer incorrectly (Koriat, 2015). However, for more "intellective tasks," interactions and arguments between partners can produce gains in knowledge (Trouche et al., 2014). For example, groups typically outperform individuals on reasoning tasks (Laughlin, 2011; Moshman & Geil, 1998), math problems (Laughlin & Ellis, 1986), and logic problems (Doise & Mugny, 1984; Perret-Clermont, 1980). Peer instruction questions that allow for student argumentation and reasoning, therefore, may produce the greatest benefits for student learning.

The underlying benefits of peer instruction extend beyond the improvements in accuracy seen from pre-discussion to post-discussion. Peer instruction prompts students to retrieve information from long-term memory, and these practice tests improve long-term retention of information (Roediger III & Karpicke, 2006; Tullis, Fiechter, & Benjamin, 2018). Further, feedback provided by instructors following peer instruction may guide students to improve their performance and correct misconceptions, which should benefit student learning (Bangert-Drowns, Kulik, & Kulik, 1991; Thurlings, Vermeulen, Bastiaens, & Stijnen, 2013). Learners who engage in peer discussion can use their new knowledge to solve new, but similar, problems on their own (Smith et al., 2009). Generating new knowledge and revealing gaps in knowledge through peer instruction, then, effectively supports students' ability to solve novel problems. Peer instruction can be an effective tool to generate new knowledge through discussion between peers and to improve student understanding and metacognition.

Acknowledgements
Not applicable.

Authors' contributions
JGT collected some data, analyzed the data, and wrote the first draft of the paper. RLG collected some data, contributed significantly to the framing of the paper, and edited the paper. The authors read and approved the final manuscript.

Authors' information
JGT: Assistant Professor in Educational Psychology at University of Arizona. RLG: Chancellor's Professor in Psychology at Indiana University.

Funding
No funding supported this manuscript.

Availability of data and materials
Data and materials are available on the Open Science Framework: https://mfr.osf.io/render?url=https://osf.io/5qc46/?action=download%26mode=render.

Ethics approval and consent to participate
The ethics approval was waived by the Indiana University Institutional Review Board (IRB) and the University of Arizona IRB, given that these data are collected as part of normal educational settings and processes.

Consent for publication
No individual data are presented in the manuscript.

Competing interests
The authors declare that they have no competing interests.

Author details
1Department of Educational Psychology, University of Arizona, 1430 E. Second St., Tucson, AZ 85721, USA. 2Department of Psychology, Indiana University, Bloomington, IN, USA.

Received: 8 October 2019 Accepted: 25 February 2020

References
Aleven, V., & Koedinger, K. R. (2002). An effective metacognitive strategy: Learning by doing and explaining with a computer-based cognitive tutor. Cognitive Science, 26, 147–179.
Atkinson, R. K., Renkl, A., & Merrill, M. M. (2003). Transitioning from studying examples to solving problems: Effects of self-explanation prompts and fading worked-out steps. Journal of Educational Psychology, 95, 774–783.
Baayen, R. H. (2008). Analyzing linguistic data: A practical introduction to statistics. Cambridge: Cambridge University Press.
Baayen, R. H., Davidson, D. J., & Bates, D. M. (2008). Mixed-effects modeling with crossed random effects for subjects and items. Journal of Memory and Language, 59, 390–412.
Bangert-Drowns, R. L., Kulik, J. A., & Kulik, C.-L. C. (1991). Effects of frequent classroom testing. Journal of Educational Research, 85, 89–99.
Bargh, J. A., & Schul, Y. (1980). On the cognitive benefit of teaching. Journal of Educational Psychology, 72, 593–604.
Bates, D., Maechler, M., Bolker, B., & Walker, S. (2015). Fitting linear mixed-effects models using lme4. Journal of Statistical Software, 67, 1–48.
Bearison, D. J., Magzamen, S., & Filardo, E. K. (1986). Sociocognitive conflict and cognitive growth in young children. Merrill-Palmer Quarterly, 32(1), 51–72.
Beatty, I. D., Gerace, W. J., Leonard, W. J., & Dufresne, R. J. (2006). Designing effective questions for classroom response system teaching. American Journal of Physics, 74(1), 31–39.
Beekes, W. (2006). The "millionaire" method for encouraging participation. Active Learning in Higher Education, 7, 25–36.
Benware, C. A., & Deci, E. L. (1984). Quality of learning with an active versus passive motivational set. American Educational Research Journal, 21, 755–765.
Bielaczyc, K., Pirolli, P., & Brown, A. L. (1995). Training in self-explanation and self-regulation strategies: Investigating the effects of knowledge acquisition activities on problem solving. Cognition and Instruction, 13, 221–251.
Bossert, S. T. (1988–1989). Cooperative activities in the classroom. Review of Research in Education, 15, 225–252.
Brooks, B. J., & Koretsky, M. D. (2011). The influence of group discussion on students' responses and confidence during peer instruction. Journal of Chemical Education, 88, 1477–1484.
Brown, A. L., & Palincsar, A. S. (1989). Guided, cooperative learning and individual knowledge acquisition. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 393–451). Hillsdale: Erlbaum.
Butchart, S., Handfield, T., & Restall, G. (2009). Using peer instruction to teach philosophy, logic and critical thinking. Teaching Philosophy, 32, 1–40.
Butler, A. C., Fazio, L. K., & Marsh, E. J. (2011). The hypercorrection effect persists over a week, but high-confidence errors return. Psychonomic Bulletin & Review, 18(6), 1238–1244.
Butterfield, B., & Metcalfe, J. (2001). Errors committed with high confidence are hypercorrected. Journal of Experimental Psychology: Learning, Memory, and Cognition, 27(6), 1491.
Caldwell, J. E. (2007). Clickers in the large classroom: Current research and best-practice tips. CBE-Life Sciences Education, 6(1), 9–20.
Chi, M., & VanLehn, K. A. (2010). Meta-cognitive strategy instruction in intelligent tutoring systems: How, when and why. Journal of Educational Technology and Society, 13, 25–39.
Chi, M. T. H., & Bassock, M. (1989). Learning from examples via self-explanations. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 251–282). Hillsdale: Erlbaum.
Chi, M. T. H., Bassock, M., Lewis, M., Reimann, P., & Glaser, R. (1989). Self-explanations: How students study and use examples in learning to solve problems. Cognitive Science, 13, 145–182.
Chi, M. T. H., de Leeuw, N., Chiu, M. H., & LaVancher, C. (1994). Eliciting self-explanations improves understanding. Cognitive Science, 18, 439–477.
Cortright, R. N., Collins, H. L., & DiCarlo, S. E. (2005). Peer instruction enhanced meaningful learning: Ability to solve novel problems. Advances in Physiology Education, 29, 107–111.
Crouch, C. H., & Mazur, E. (2001). Peer instruction: Ten years of experience and results. American Journal of Physics, 69, 970–977.
Cummings, K., & Roberts, S. (2008). A study of peer instruction methods with school physics students. In C. Henderson, M. Sabella, & L. Hsu (Eds.), Physics education research conference (pp. 103–106). College Park: American Institute of Physics.
Deslauriers, L., Schelew, E., & Wieman, C. (2011). Improved learning in a large-enrollment physics class. Science, 332, 862–864.
Duncan, D. (2005). Clickers in the classroom: How to enhance science teaching using classroom response systems. San Francisco: Pearson/Addison-Wesley.
Finley, J. R., Tullis, J. G., & Benjamin, A. S. (2010). Metacognitive control of learning and remembering. In M. S. Khine & I. M. Saleh (Eds.), New science of learning: Cognition, computers and collaborators in education. New York: Springer Science & Business Media, LLC.
Giuliodori, M. J., Lujan, H. L., & DiCarlo, S. E. (2006). Peer instruction enhanced student performance on qualitative problem solving questions. Advances in Physiology Education, 30, 168–173.
Graesser, A. C., McNamara, D., & VanLehn, K. (2005). Scaffolding deep comprehension strategies through AutoTutor and iSTART. Educational Psychologist, 40, 225–234.
Granovskiy, B., Gold, J. M., Sumpter, D., & Goldstone, R. L. (2015). Integration of social information by human groups. Topics in Cognitive Science, 7, 469–493.
Harvey, N., & Fischer, I. (1997). Taking advice: Accepting help, improving judgment, and sharing responsibility. Organizational Behavior and Human Decision Processes, 70, 117–133.
Harvey, N., Harries, C., & Fischer, I. (2000). Using advice and assessing its quality. Organizational Behavior and Human Decision Processes, 81, 252–273.
Henderson, C., & Dancy, M. H. (2009). The impact of physics education research on the teaching of introductory quantitative physics in the United States. Physical Review Special Topics: Physics Education Research, 5(2), 020107.
Jaeger, T. F. (2008). Categorical data analysis: Away from ANOVAs (transformation or not) and towards logit mixed models. Journal of Memory and Language, 59, 434–446.
James, M. C. (2006). The effect of grading incentive on student discourse in peer instruction. American Journal of Physics, 74(8), 689–691.
Jarvela, S., Kirschner, P., Panadero, E., Malmberg, J., Phielix, C., Jaspers, J., … Jarvenoja, H. (2015). Enhancing socially shared regulation in collaborative learning groups: Designing for CSCL regulation tools. Educational Technology Research and Development, 63(1), 125–142.
Jones, M. E., Antonenko, P. D., & Greenwood, C. M. (2012). The impact of collaborative and individualized student response system strategies on learner motivation, metacognition, and knowledge transfer. Journal of Computer Assisted Learning, 28(5), 477–487.
King, A. (1992). Facilitating elaborative learning through guided student-generated questioning. Educational Psychologist, 27, 111–126.
Kirschner, P. A., Kreijns, K., Phielix, C., & Fransen, J. (2015). Awareness of cognitive and social behavior in a CSCL environment. Journal of Computer Assisted Learning, 31(1), 59–77.
Knight, J. K., Wise, S. B., & Southard, K. M. (2013). Understanding clicker discussions: Student reasoning and the impact of instructional cues. CBE-Life Sciences Education, 12, 645–654.
Koriat, A. (2015). When two heads are better than one and when they can be worse: The amplification hypothesis. Journal of Experimental Psychology: General, 144, 934–950. https://doi.org/10.1037/xge0000092
Kreijns, K., Kirschner, P. A., & Vermeulen, M. (2013). Social aspects of CSCL environments: A research framework. Educational Psychologist, 48(4), 229–242.
Kuhn, L. M., & Sniezek, J. A. (1996). Confidence and uncertainty in judgmental forecasting: Differential effects of scenario presentation. Journal of Behavioral Decision Making, 9, 231–247.
Lasry, N., Mazur, E., & Watkins, J. (2008). Peer instruction: From Harvard to the two-year college. American Journal of Physics, 76(11), 1066–1069.
Laughlin, P. R. (2011). Group problem solving. Princeton: Princeton University Press.
Laughlin, P. R., Bonner, B. L., & Miner, A. G. (2002). Groups perform better than individuals on letters-to-numbers problems. Organizational Behavior and Human Decision Processes, 88, 605–620.
Laughlin, P. R., & Ellis, A. L. (1986). Demonstrability and social combination processes on mathematical intellective tasks. Journal of Experimental Social Psychology, 22, 177–189.
Lucas, A. (2009). Using peer instruction and i-clickers to enhance student participation in calculus. Primus, 19(3), 219–231.
Mazur, E. (1997). Peer instruction: A user's manual. Upper Saddle River: Prentice Hall.
Mercier, H., & Sperber, D. (2019). The enigma of reason. Cambridge: Harvard University Press.
Metcalfe, J. (2017). Learning from errors. Annual Review of Psychology, 68, 465–489.
Michaelsen, L. K., Watson, W. E., & Black, R. H. (1989). Realistic test of individual versus group decision making. Journal of Applied Psychology, 64, 834–839.
Miller, R. L., Santana-Vega, E., & Terrell, M. S. (2007). Can good questions and peer discussion improve calculus instruction? Primus, 16(3), 193–203.
Morgan, J. T., & Wakefield, C. (2012). Who benefits from peer conversation? Examining correlations of clicker question correctness and course performance. Journal of College Science Teaching, 41(5), 51–56.
Moshman, D., & Geil, M. (1998). Collaborative reasoning: Evidence for collective rationality. Thinking and Reasoning, 4, 231–248.
Murayama, K., Sakaki, M., Yan, V. X., & Smith, G. M. (2014). Type I error inflation in the traditional by-participant analysis to metamemory accuracy: A generalized mixed-effects model perspective. Journal of Experimental Psychology: Learning, Memory, and Cognition, 40, 1287–1306.
Newbury, P., & Heiner, C. (2012). Ready, set, react! Getting the most out of peer instruction using clickers. Retrieved October 28, 2015, from http://www.cwsei.ubc.ca/Files/ReadySetReact_3fold.pdf
Nielsen, K. L., Hansen-Nygård, G., & Stav, J. B. (2012). Investigating peer instruction: How the initial voting session affects students' experiences of group discussion. ISRN Education, 2012, article 290157.
Noddings, N. (1985). Small groups as a setting for research on mathematical problem solving. In E. A. Silver (Ed.), Teaching and learning mathematical problem solving (pp. 345–360). Hillsdale: Erlbaum.
Perez, K. E., Strauss, E. A., Downey, N., Galbraith, A., Jeanne, R., Cooper, S., & Madison, W. (2010). Does displaying the class results affect student discussion during peer instruction? CBE-Life Sciences Education, 9, 133–140.
Perret-Clermont, A. N. (1980). Social interaction and cognitive development in children. London: Academic Press.
Phielix, C., Prins, F. J., & Kirschner, P. A. (2010). Awareness of group performance in a CSCL-environment: Effects of peer feedback and reflection. Computers in Human Behavior, 26(2), 151–161.
Phielix, C., Prins, F. J., Kirschner, P. A., Erkens, G., & Jaspers, J. (2011). Group awareness of social and cognitive performance in a CSCL environment: Effects of a peer feedback and reflection tool. Computers in Human Behavior, 27(3), 1087–1102.
Pollock, S. J., Chasteen, S. V., Dubson, M., & Perkins, K. K. (2010). The use of concept tests and peer instruction in upper-division physics. In M. Sabella, C. Singh, & S. Rebello (Eds.), AIP conference proceedings (Vol. 1289, p. 261). New York: AIP Press.
Porter, L., Bailey-Lee, C., & Simon, B. (2013). Halving fail rates using peer instruction: A study of four computer science courses. In SIGCSE '13: Proceedings of the 44th ACM technical symposium on computer science education (pp. 177–182). New York: ACM Press.
Price, P. C., & Stone, E. R. (2004). Intuitive evaluation of likelihood judgment producers. Journal of Behavioral Decision Making, 17, 39–57.
Priniski, J. H., & Horne, Z. (2019). Crowdsourcing effective educational interventions. In A. K. Goel, C. Seifert, & C. Freska (Eds.), Proceedings of the 41st annual conference of the Cognitive Science Society. Austin: Cognitive Science Society.
Rao, S. P., & DiCarlo, S. E. (2000). Peer instruction improves performance on quizzes. Advances in Physiology Education, 24, 51–55.
Renkl, A., Stark, R., Gruber, H., & Mandl, H. (1998). Learning from worked-out examples: The effects of example variability and elicited self-explanations. Contemporary Educational Psychology, 23, 90–108.
Rittle-Johnson, B. (2006). Promoting transfer: Effects of self-explanation and direct instruction. Child Development, 77, 1–15.
Roediger III, H. L., & Karpicke, J. D. (2006). Test-enhanced learning: Taking memory tests improves long-term retention. Psychological Science, 17, 249–255.
Ryskin, R., Benjamin, A. S., Tullis, J. G., & Brown-Schmidt, S. (2015). Perspective-taking in comprehension, production, and memory: An individual differences approach. Journal of Experimental Psychology: General, 144, 898–915.
Sah, S., Moore, D. A., & MacCoun, R. J. (2013). Cheap talk and credibility: The consequences of confidence and accuracy on advisor credibility and persuasiveness. Organizational Behavior and Human Decision Processes, 121, 246–255.
Schwartz, D. L. (1995). The emergence of abstract representations in dyad problem solving. The Journal of the Learning Sciences, 4, 321–354.
Simon, B., Kohanfars, M., Lee, J., Tamayo, K., & Cutts, Q. (2010). Experience report: Peer instruction in introductory computing. In Proceedings of the 41st SIGCSE technical symposium on computer science education.
Smith, M. K., Wood, W. B., Adams, W. K., Wieman, C., Knight, J. K., Guild, N., & Su, T. T. (2009). Why peer discussion improves student performance on in-class concept questions. Science, 323, 122–124.
Smith, M. K., Wood, W. B., Krauter, K., & Knight, J. K. (2011). Combining peer discussion with instructor explanation increases student learning from in-class concept questions. CBE-Life Sciences Education, 10, 55–63.
Sniezek, J. A., & Buckley, T. (1995). Cueing and cognitive conflict in judge–advisor decision making. Organizational Behavior and Human Decision Processes, 62, 159–174.
Sniezek, J. A., & Henry, R. A. (1989). Accuracy and confidence in group judgment. Organizational Behavior and Human Decision Processes, 43, 1–28.
Son, L. K., & Metcalfe, J. (2000). Metacognitive and control strategies in study-time allocation. Journal of Experimental Psychology: Learning, Memory, and Cognition, 26, 204–221.
Steiner, I. D. (1972). Group processes and productivity. New York: Academic Press.
Thurlings, M., Vermeulen, M., Bastiaens, T., & Stijnen, S. (2013). Understanding feedback: A learning theory perspective. Educational Research Review, 9, 1–15.
Tindale, R. S., & Sheffey, S. (2002). Shared information, cognitive load, and group memory. Group Processes & Intergroup Relations, 5(1), 5–18.
Trouche, E., Sander, E., & Mercier, H. (2014). Arguments, more than confidence, explain the good performance of reasoning groups. Journal of Experimental Psychology: General, 143, 1958–1971.
Tullis, J. G. (2018). Predicting others' knowledge: Knowledge estimation as cue-utilization. Memory & Cognition, 46, 1360–1375.
Tullis, J. G., Fiechter, J. L., & Benjamin, A. S. (2018). The efficacy of learners' testing choices. Journal of Experimental Psychology: Learning, Memory, and Cognition, 44, 540–552.
Tullis, J. G., & Fraundorf, S. H. (2017). Predicting others' memory performance: The accuracy and bases of social metacognition. Journal of Memory and Language, 95, 124–137.
Turpen, C., & Finkelstein, N. (2007). Understanding how physics faculty use peer instruction. In L. Hsu, C. Henderson, & L. McCullough (Eds.), Physics education research conference (pp. 204–209). College Park: American Institute of Physics.
Van Swol, L. M., & Sniezek, J. A. (2005). Factors affecting the acceptance of expert advice. British Journal of Social Psychology, 44, 443–461.
VanLehn, K., Jones, R. M., & Chi, M. T. H. (1992). A model of the self-explanation effect. Journal of the Learning Sciences, 2(1), 1–59.
Vedder, P. (1985). Cooperative learning: A study on processes and effects of cooperation between primary school children. Westerhaven: Rijkuniversiteit Groningen.
Versteeg, M., van Blankenstein, F. M., Putter, H., & Steendijk, P. (2019). Peer instruction improves comprehension and transfer of physiological concepts: A randomized comparison with self-explanation. Advances in Health Sciences Education, 24, 151–165.
Vygotsky, L. S. (1981). The genesis of higher mental functioning. In J. V. Wertsch (Ed.), The concept of activity in Soviet psychology (pp. 144–188). Armonk: Sharpe.
Webb, N. M., & Palincsar, A. S. (1996). Group processes in the classroom. In D. C. Berliner & R. C. Calfee (Eds.), Handbook of educational psychology (pp. 841–873). New York: Macmillan Library Reference USA; London: Prentice Hall International.
Wegner, D. M., Giuliano, T., & Hertel, P. (1985). Cognitive interdependence in close relationships. In W. J. Ickes (Ed.), Compatible and incompatible relationships (pp. 253–276). New York: Springer-Verlag.
Weldon, M. S., & Bellinger, K. D. (1997). Collective memory: Collaborative and individual processes in remembering. Journal of Experimental Psychology: Learning, Memory, and Cognition, 23, 1160–1175.
Wieman, C., Perkins, K., Gilbert, S., Benay, F., Kennedy, S., Semsar, K., et al. (2009). Clicker resource guide: An instructor's guide to the effective use of personal response systems (clickers) in teaching. Vancouver: University of British Columbia. Available from http://www.cwsei.ubc.ca/resources/files/Clicker_guide_CWSEI_CU-SEI.pdf
Wong, R. M. F., Lawson, M. J., & Keeves, J. (2002). The effects of self-explanation training on students' problem solving in high school mathematics. Learning and Instruction, 12, 23.
Yackel, E., Cobb, P., & Wood, T. (1991). Small-group interactions as a source of learning opportunities in second-grade mathematics. Journal for Research in Mathematics Education, 22, 390–408.
Yaniv, I. (2004a). The benefit of additional opinions. Current Directions in Psychological Science, 13, 75–78.
Yaniv, I. (2004b). Receiving other people's advice: Influence and benefit. Organizational Behavior and Human Decision Processes, 93, 1–13.
Yaniv, I., & Choshen-Hillel, S. (2012). Exploiting the wisdom of others to make better decisions: Suspending judgment reduces egocentrism and increases accuracy. Journal of Behavioral Decision Making, 25, 427–434.
Yaniv, I., & Kleinberger, E. (2000). Advice taking in decision making: Egocentric discounting and reputation formation. Organizational Behavior and Human Decision Processes, 83, 260–281.

Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.