
ASSESSMENT

The Role of Assessment in the Student Learning Process

Carmen Fuentealba

ABSTRACT
Assessment is a powerful learning tool that can enhance learning and education. The process of student assessment should align with curricular goals and educational objectives. Identifying the assessment strategies necessary for the proper evaluation of students' progress within individual programs is as important as establishing curricular content and delivery methods. The purpose of this paper is to discuss elements to be considered in assessment design and implementation as well as common challenges encountered during this process. Elements to be considered during assessment design include purpose of assessment, domains to be tested, and characteristics of the assessment tools to be employed. Assessment tools are evaluated according to four main characteristics: relevance, feasibility, validity, and reliability. Based on the evidence presented in the literature, the use of a variety of assessment tools is recommended to match diverse domains and learning styles. The assessment cycle concludes with the evaluation of the results and, based on these, the institution, program, or course can make changes to improve the quality of education. If assessment design aligns with educational outcomes and instructional methods, it improves the quality of education and supports student learning.

Key words: student assessment, assessment design, curriculum, learning, feasibility, validity, reliability

BACKGROUND

Identifying assessment strategies to properly evaluate students' progress within individual programs is as important as establishing curricular content and delivery methods. The purpose of this paper is to discuss elements to be considered in assessment design and implementation as well as common challenges encountered during this process.

The process of student assessment should align with curricular goals and educational objectives.1–4 The assessment and evaluation of students' learning are connected to educational outcomes and instructional methods in crucial ways (see Figure 1). The cycle is a dynamic, continuous, and collaborative effort that can potentially start at any point, since the main goal is to maintain a mutually supportive system between learning outcomes, assessment, and learning activities. Ideally, once educational outcomes and instructional methods are established, assessment strategies and procedures are designed to test students' achievements as they relate to a variety of educational objectives (such as theoretical knowledge, clinical and communication skills, clinical reasoning, and professionalism).5

The first step in assessment design is to recognize the crucial role it holds in the improvement of the educational environment, as it not only supports curricular planning and quality assurance but is also an essential part of the learning process. For example, attempts to improve the educational environment might include more hands-on exposure to clinical or professional activities during the first year of the curriculum. Any conclusions regarding the effect of the change(s) on the educational environment (for example, increase in retention of information, motivation, improvement in grades) should be linked to an assessment strategy such as the administration of a test/re-test after a few weeks, a comparison of grades prior to and after implementation, or the development of a survey to measure student satisfaction.
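To make the grade-comparison strategy concrete, the short Python sketch below contrasts the same students' grades before and after a curricular change using a paired t-test. This is an illustrative addition, not part of the original article: the grade data are invented, SciPy is assumed to be available, and the 0.05 threshold is the conventional choice rather than a recommendation from the paper.

```python
# Illustrative sketch (not from the article): comparing grades before
# and after a curricular change with a paired t-test.
from scipy import stats  # assumes SciPy is installed

# Hypothetical grades for the same eight students, pre- and post-change.
grades_before = [62, 70, 55, 81, 66, 74, 59, 77]
grades_after = [68, 75, 61, 84, 70, 80, 63, 79]

t_stat, p_value = stats.ttest_rel(grades_after, grades_before)

print(f"mean before: {sum(grades_before) / len(grades_before):.1f}")
print(f"mean after:  {sum(grades_after) / len(grades_after):.1f}")
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p (e.g., < 0.05) suggests the difference in grades is unlikely
# to be due to chance alone; it does not by itself establish cause.
```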
Assessment takes into consideration the needs and motivations of stakeholders who are internal or external to the department, faculty, and university, and it therefore serves many purposes that often support and compete with one another.6 Balancing different expectations is one of the main challenges faced by administrators and educators in higher education in general and those in medical education in particular.

There has been an evolution in the overall appreciation of the role of assessment within the learning environment, one that challenges the early view that assessment is merely a process to monitor learning for the purposes of accountability or certification and that the responsibility to assess students' learning falls on individual instructors. Additionally, traditional grading practices are often questioned because academic performance comprises a wide range of abilities and skills. Since high-quality assessments encourage further learning, considerable time is spent designing assessment strategies that recognize the need to foster an environment that supports self-directed, life-long learners.2,7 For example, in Objective Structured Clinical Examinations (OSCEs) students rotate through various stations to carry out specific tasks. Each station is designed to evaluate a specific clinical or professional skill matching pre-determined curricular objectives. Although OSCEs provide good information regarding student performance, the preparation and administration of the exam necessitate both time and resources.8

There is agreement that assessment is a powerful tool to enhance both teaching and learning. The assessment process can provide information that both teachers and students can use as feedback in assessing themselves and one another. As a result, new approaches have developed in assessment with increased emphasis on initiatives that will promote the use of assessment-for-learning and assessment-as-learning strategies. In both cases, teachers and students share responsibility in the evaluation process as they set goals that are compatible with external expectations and self-motivation.6,9


Figure 1: Relationship between the assessment of students, educational objectives, and instructional methods.

OSCEs are a good example of an assessment-as-learning strategy. Teachers are embracing the use of technologies such as clickers or hand-held wireless response systems to provide and obtain immediate feedback. These hand-held wireless response systems allow instructors to include questions during lectures or learning activities. The results are immediately presented to the class, often in the form of a graph. This information can be used by the students to assess their progress and by the instructor to assess instructional effectiveness. Thus, a teacher's course or lesson planning should provide opportunities for both the learner and teacher to obtain and use information about their progress toward educational goals. The process also needs to be flexible so as to respond to initial and emerging ideas and skills. Planning should include strategies that ensure that learners understand the goals they are pursuing and the criteria that will be applied during assessment. Planning should also consider how learners will receive feedback, how they will take part in assessing their learning, and how they will be helped to make further progress.2 Assessment is part of effective planning for both teaching and learning, which means that teachers require the professional knowledge and skills to plan for assessment, observe learning, analyze and interpret evidence of learning, give feedback to learners, and support learners as they assess themselves.

Faculty members are now required to understand how students learn, recognize the relationship between assessment and instruction, and examine students' performance and their understanding and application of relevant knowledge, attitudes, and skills.3,4,6 The teacher's role in the evolving field of assessment requires a shift from provider of information and collector of data to facilitator of student learning.6 However, professional development opportunities for educators are not always available, and there are conflicting attitudes with respect to testing. Many teachers at all levels of the educational system feel that assessment in general is a necessary nuisance.10 A collection of opinions from teachers includes the following: "I give tests because I have to give grades"; "tests are necessary evils: I hate them and students hate taking them"; "assessments don't have anything to do with teaching and learning"; "assessing students is easy, it's just a matter of asking questions."3 In addition, instructors across disciplines tend to underestimate the importance of assessment and are often reluctant to discuss the topic with colleagues.7 Institutions and administrators should play a role in guiding the process by promoting faculty interaction and recognizing the scholarly activity of faculty members with a passion for veterinary education.

One of the obvious benefits of the accreditation processes across educational institutions is the promotion of accountability; as noted in medical schools, "accreditors are paying closer attention to how well schools provide measurable assurances that students learn what the faculties set out to teach."11

STEPS TO CONSIDER WHEN DEVELOPING A STUDENT ASSESSMENT PLAN

Assessment supports, drives, and improves learning.12–16 As a result of the close relationship between curriculum, instruction, and assessment, once the important role of assessment in effective teaching and learning is recognized, the reasonable sequence of events is the selection of assessment tools appropriate to both teaching and assessment. The first step in assessment design is to establish its purpose and expected outcome. Similarly, in the assessment-driven course design approach, preparations for evaluation precede decisions regarding course content.16,17

SELECTION OF ASSESSMENT STRATEGIES

Once the assessment content has been agreed upon, adequate tools are selected and the cycle concludes with the evaluation of assessment results, which can be helpful in redesigning teaching methods and strategies.2,18 The evaluation cycle is a dynamic process whereby assessment strategies and tools are continuously reviewed in terms of both the quality of the evidence they yield and the effect they have on future learning.7 Assessment may take place at different levels. For example, passing rates, course averages, and faculty evaluation (that is, student evaluations of instructors' teaching, peer evaluations, and faculty self-evaluation of teaching effectiveness) might be best assessed centrally (by unit, department, or college), whereas an individual instructor could use periodic assessment of instructional methods and exams to improve the learning environment in her/his course. To evaluate multiple-choice questions, it is common to use systematic item analysis, an unbiased system that detects questions that need to be modified or deleted.
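The item analysis just mentioned can be as simple as two statistics per question. The sketch below (an added illustration, not the article's own procedure) computes a difficulty index (proportion of students answering correctly) and an upper-lower discrimination index for each multiple-choice item. The response matrix and the flagging thresholds are hypothetical assumptions; published rules of thumb vary.

```python
# Illustrative item analysis for multiple-choice questions (data invented).
# rows = students, columns = items; 1 = correct, 0 = incorrect.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 1, 1],
]

n_students = len(responses)
n_items = len(responses[0])
totals = [sum(row) for row in responses]

# Rank students by total score and compare the top and bottom thirds.
ranked = sorted(range(n_students), key=lambda s: totals[s], reverse=True)
k = max(1, n_students // 3)
upper, lower = ranked[:k], ranked[-k:]

for item in range(n_items):
    # Difficulty index: proportion of all students answering correctly.
    difficulty = sum(responses[s][item] for s in range(n_students)) / n_students
    # Discrimination index: high scorers' success rate minus low scorers'.
    discrimination = (sum(responses[s][item] for s in upper) -
                      sum(responses[s][item] for s in lower)) / k
    # Thresholds below are common rules of thumb, assumed for illustration.
    flag = " <- candidate for modification or deletion" if (
        difficulty < 0.3 or difficulty > 0.9 or discrimination < 0.2) else ""
    print(f"item {item + 1}: difficulty={difficulty:.2f}, "
          f"discrimination={discrimination:.2f}{flag}")
```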


Careful review of assessment results and learning objectives might result in the selection of new assessment tools or learning activities. The following are some elements to be considered in the selection of assessment strategies:

• Purpose of the assessment;
• Domains to be tested;
• Characteristics of assessment tools.

Purpose of Assessment
The planning process will answer questions such as the following: Why do we assess? How will the assessment be used? How will it impact student learning? How will it impact the curriculum?

The purpose of the assessment might focus on assessing student learning or assessing the program. Assessment should provide guidance and feedback to the learner, and it often serves as a form of learning motivation. Decisions about what to assess are inevitably related to decisions about how to assess. These decisions, in turn, should be linked with what students have learned and the ways in which they have done so; assessment tools are selected accordingly.2–4

How will the assessment be used? The results of the assessment can be used to provide baseline information (to establish the starting point or prior knowledge), to provide information to guide instruction and judge students' progression of learning (formative assessment), or to serve the general purposes of assigning a grade or making decisions regarding students' progression within the program (summative exam).3

A formative exam is a low-stakes assessment that is integrated into the act of teaching and provides valuable information about how well students are progressing toward pre-established or planned expectations. Formative assessment guides learning, provides reassurance, promotes reflection, and serves as a powerful tool to reinforce students' motivation to learn. There are numerous examples of formative assessment in veterinary education, including questions posted online that provide immediate feedback and the use of hand-held devices or clickers during learning activities such as lectures, tutorials, and laboratory sessions. A specific example of a formative exam is the pathology mock practical exam, in which 10 stations are prepared with questions or specimens similar to those used in the final practical exam. Answers for each question are provided at the end of the exam or discussed with the students. The main objectives of the mock exam are to familiarize students with the testing environment and to offer them an opportunity for self-evaluation.

A summative exam informs the student and teacher about the levels of conceptual understanding and performance that the student has reached. Summative exams result in a grade or specific outcome (pass/fail). Assessing student learning at the end of a period of instruction without offering feedback limits the student's opportunities to reflect on how to improve his/her performance and demonstrate that improvement. Using both formative and summative assessment methods provides the educator, department, and/or institution with a better understanding of how and what students learn. In addition, interpretations of student achievement can help identify learning experiences that do or do not promote expected outcomes. Interpreting students' achievement over time and sharing assessment results with students enables learners to understand their strengths and weaknesses and to reflect on improvement strategies.2–4 The results of these assessments might motivate the desire to improve, change, or introduce new pedagogical approaches.2–4

Domains to be Tested
Consensus about methods of capturing student learning is a crucial part of assessment design. It is important to determine which quantitative and qualitative methods, and which combinations of the two, will provide useful and accurate measures of student achievement. Decisions about whether to use standardized tests or locally designed assessment methods—for example, case studies, simulations, portfolios, OSCEs, development of concept maps—should be based on how well a method aligns with what and how students have learned at an institution or within a program and how well a method measures the domain it is supposed to measure.2 There are general guidelines available in the literature, such as the ones created by Nightingale et al.,19 who list eight broad categories of learning outcomes and suggest suitable assessment methods within each category.

The main goal is to select a method that aligns with the overall aims of the program and that effectively assesses learning objectives such as knowledge, competencies, attitudes, or skills. Ideally, different kinds of information should be gathered about student learning, and the domains to be tested (below) might require a range of assessment methods depending on student qualities or abilities:

• Declarative knowledge—"what" knowledge
• Procedural knowledge—"how" knowledge, technical skills
• Application knowledge—use of knowledge in similar settings and in different contexts
• Problem solving, clinical reasoning, critical thinking
• Understanding—learner's synthesis of concepts, processes, and skills
• Attitude—professionalism

Characteristics of Assessment Tools
The acceptance, success, and implementation of assessment methods depend on the validity, reliability, acceptability, and feasibility of the final assessment protocol. Awareness of the perceptions of stakeholders facilitates development and implementation.19 Decisions about the quality of assessment methods are generally based on quantifiable factors such as validity and reliability. Educational effect, feasibility, and acceptability are also important factors to be considered when evaluating assessment tools, but they are difficult to measure.20–23

A simple approach to selecting assessment tools is to consider the following characteristics:



• Relevance;
• Feasibility;
• Validity;
• Reliability.

Relevance and educational effect/impact should be obvious to both the learner and the educator. Some of the questions to consider are as follows: Is the process of evaluation appropriate to the jobs to be performed by the student after qualification? Does the process of evaluation address valued content and skills? Does it match program objectives? Does it reflect current thinking in the field? Is it appropriate in the context of discipline and social needs?

In terms of feasibility, some appropriate questions to ask are as follows: Is the process of evaluation doable? Is it acceptable? Is it practical? Can it be implemented in practice? Does it have an impact on other learning activities? Is it accessible by current technology? Several factors affect feasibility:

• The time required for preparation, testing, and scoring, and the time required for the learner to prepare for the test, take the test, and "recover" from the test;
• The availability of human, economic, and technological resources; the affordability and cost effectiveness of resources; and the quantity of necessary resources;
• The degree to which the process of evaluation is realistic in intent and design;
• The number of students;
• The ease of constructing, administering, scoring, and reporting.

Validity refers to the ability of a specific instrument to measure the attribute it is intended to measure under the conditions in which it is administered.24 To ensure that the assessment reflects the learning outcomes of the course, the main questions to consider are as follows: Does it measure what it intends to measure? Does it allow students to demonstrate what they know and are able to do? Is it effective? Is the decision resulting from the evaluation appropriate and fair? Does it consider different learning styles?

There are different types of validity—content, criterion, construct—clearly described in Gareis and Grant's book.3 When an educator questions whether the test measured what was intended to be taught, he/she is applying the concept of content validity. Content validity can be improved by preparing a list of content matter or skills to be tested, assigning weight to each item based on relative importance, and/or preparing a rubric or table of specifications.
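As a concrete illustration of a table of specifications, the sketch below allocates items on a fixed-length exam in proportion to topic weights. It is an added example under invented assumptions (the topics, weights, and 40-item exam length are hypothetical), not a blueprint taken from the article.

```python
# Illustrative table of specifications: topics and weights are invented.
blueprint = {
    "anatomy": 0.20,
    "physiology": 0.25,
    "pathology": 0.30,
    "clinical reasoning": 0.25,
}
total_items = 40  # hypothetical exam length

assert abs(sum(blueprint.values()) - 1.0) < 1e-9, "weights must sum to 1"

for topic, weight in blueprint.items():
    # Allocate items in proportion to each topic's relative importance;
    # rounding may require a final manual adjustment on real exams.
    print(f"{topic:<20} weight={weight:.0%}  items={round(weight * total_items)}")
```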
Factors that affect validity include the following:

• Inadequate testing environment (as a result of noise, room temperature, and/or quality of light);
• Insufficient and/or unclear instructions;
• Difficult and/or ambiguous phrasing of questions;
• Poorly constructed items or questions, including spelling errors and grammatical mistakes;
• Inappropriate level of difficulty;
• Inappropriate item or question;
• Improper arrangement of items (identifiable pattern of answers, clues);
• Insufficient number of assessment items;
• Insufficient time;
• Errors in scoring or subjective scoring.

Reliability is a measure of the reproducibility and consistency of the test, and it is expressed as a coefficient such as Cronbach's alpha. Reliability refers to the ability of an instrument to measure in a consistent manner or to deliver a similar score each time an individual is tested under similar conditions. It is not enough for a test to assess accurately; it must also do so consistently. In other words, if the test were to be administered a second time to the same group of learners, would the results be identical? Are the data collected consistent across applications within the classroom, school, university, and across institutions?
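For readers who want to see how such a coefficient is obtained, the following sketch computes Cronbach's alpha from a students-by-items score table using the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). This is an added illustration, not a calculation from the article, and the score matrix is invented.

```python
# Illustrative computation of Cronbach's alpha (scores invented).
from statistics import pvariance

# rows = students, columns = item scores on a 4-item instrument.
scores = [
    [2, 3, 3, 4],
    [1, 2, 2, 3],
    [4, 4, 3, 4],
    [2, 2, 1, 2],
    [3, 3, 4, 4],
]

k = len(scores[0])                     # number of items
items = list(zip(*scores))             # transpose: one score list per item
totals = [sum(row) for row in scores]  # each student's total score

# alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
alpha = (k / (k - 1)) * (1 - sum(pvariance(i) for i in items) / pvariance(totals))
print(f"Cronbach's alpha = {alpha:.3f}")
```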
Reliability is a term that, as Bone argues,

    academics are reluctant to discuss but concerns the impact of subjective influences on the assessment processes. If you were off sick for a prolonged period and a colleague marked your examination would the outcome be similar? With huge piles of scripts how can you ensure that the scripts you mark at the beginning are dealt with in exactly the same way as those you mark last? How does this year's cohort compare with last year's given that the subject-matter and/or your approach to it may have changed radically in the interim?7(p.6)
The validity of scoring and the validity of the instrument used have an impact on reliability. There are other measures that improve reliability, such as increasing the length of the test, maintaining consistent test conditions, and using appropriate levels of difficulty and discrimination.3 Although validity and reliability are crucial components of the accuracy of a test, they are not synonymous. Reliability is a precondition for validity, but it does not guarantee it. As Figure 2 illustrates, an assessment device can be reliable but not necessarily valid. Therefore, it is important to incorporate methods that determine both the reliability and validity of assessment tools.25

The Joint Committee on Standards for Educational Evaluation, accredited by the American National Standards Institute, added "propriety" as an attribute of appropriate assessment.26 Propriety standards help "ensure that student evaluations are conducted legally, ethically and with due regard for the well-being of the students being evaluated and other people affected by the evaluation results."26 The propriety standards determined by the Committee are as follows:

• Service to Students, which should promote sound education principles, the fulfillment of institutional missions, and effective student work;
• Appropriate Policies and Procedures;



• Access to Evaluation Information, which stipulates that confidentiality is maintained and privacy is protected;
• Treatment of Students, which maintains that learners should be treated with respect in all aspects of the evaluation process;
• Rights of Students, which must be consistent with applicable laws as well as with fairness and human rights principles so that students' rights and welfare are protected;
• Balanced Evaluation, which identifies both strengths and weaknesses.26

Figure 2: Visual representation of validity and reliability of a test (modified from Gareis and Grant, 2006).3

The assessment cycle concludes with the evaluation of the results and, based on the evidence gathered, the institution, program, or course can make changes to improve the quality of education. The assessment cycle then begins again to discover whether the proposed changes or innovations had a positive impact on student achievement.

CHALLENGES DURING ASSESSMENT DESIGN

One of the main challenges during assessment design arises from the application of unclear assessment decisions unrelated to the purpose of the assessment and/or to the content to be assessed. A common mistake that teachers make is to try to assess everything instead of a sample of the content that was taught and learned. At the institutional level, trying to change everything at once without adequate buy-in or involvement from staff is a recipe for disaster. Allocating insufficient time for teachers to decide how best to change the strategies that they use with their students, and promoting good ideas without enough resources, may lead to frustration and decreased engagement. Changing student outcomes and assessment without teacher input and buy-in often results in resistance to change or in the use of ineffective shortcuts if change is undertaken.27 An effective strategy to mitigate this problem is to engage faculty by identifying a small group of educators who are committed to innovation to serve as agents of change.5

Teachers should be supported in the development of the skills necessary to plan for assessment and to interpret evidence of learning through professional development opportunities throughout their academic careers. However, veterinary educators have only a few professional development opportunities and become discouraged or susceptible to the risks associated with persistent attempts to re-invent the wheel. Although some institutions value innovation and scholarly activity in education, there is a lack of institutional support for it, and teaching suffers from its perceived diminished value as opposed to research and/or professional service, especially during promotion and tenure decisions. These factors contribute to the decreased level of enthusiasm to embrace the role of educator in most academic environments. However, there is increased awareness that activities that teach the teachers need to be promoted and supported. Educational conferences such as the International Association of Medical Science Educators (IAMSE) meeting, the Ottawa Conference on the Assessment of Competence in Medicine, the AAVMC Educational Symposium, and the AAVMC Veterinary Educators Collaborative provide excellent opportunities for learning and networking. Additionally, instructors can enroll in teacher-development workshops offered in education departments within their academic institution, or apply the principles of self-directed learning and study the material available online or in the peer-reviewed literature in the general field of medical education. A rarely used, yet very effective and motivating, strategy is to access the local expertise present in most veterinary schools, whereby individuals with a strong interest in veterinary education can share information and help to further develop this field.

The main strategies to minimize the impact of challenges during assessment design are to demonstrate a consistent commitment to student learning and to maintain open and clear communication channels between stakeholders. Once members of an institution or department share a commitment to matching assessments to the purpose of assessment, instructional content, and student performance goals, everyone involved will value the advantages of collaboration.2 Awareness of the importance of engaging stakeholders and ensuring effective and genuine responses to their opinions and expectations is crucial to the success of implementing any assessment system.19 For example, an institution or department planning to use OSCEs as an assessment tool might benefit from discussing with instructors the rationale, educational advantages, and overall cost effectiveness of this testing modality.

If assessment design is aligned with educational outcomes and instructional methods, it improves the quality of education and supports student learning. Institutions, departments, and instructors are encouraged to interpret results and share information to enhance institutional effectiveness and the veterinary profession.

REFERENCES

1 Dunn L. Selecting methods of assessment [Internet]. Oxford (UK): Oxford Centre for Staff and Learning Development; 2002 [cited 2009 Dec 23]. Available from: http://www.brookes.ac.uk/services/ocsd/2_learntch/briefing_papers/methods_assessment.pdf.
2 Maki PL. Developing an assessment plan to learn about student learning. J Acad Libr. 2002;28:8–13. doi:10.1016/S0099-1333(01)00295-6



3 Gareis CR, Grant LW. Teacher-made assessments: how to connect curriculum, instruction and student learning. Larchmont (NY): Eye on Education; 2006.
4 Amin Z, Eng KH. Basics in medical education. 2nd edition. New York: World Scientific Publishing; 2009.
5 Vasconcelos MVL, Rodarte RS, Junior PM. Aligning assessment with the curriculum: identifying faculty's innovative methods. Med Educ. 2009;43:1088–9. doi:10.1111/j.1365-2923.2009.03460.x
6 Earl LM. Assessment as learning. Thousand Oaks (CA): Corwin; 2003.
7 Bone A. Ensuring successful assessment [Internet]. Coventry (UK): The National Centre for Legal Education; 1999 [cited 2010 Apr 26]. Available from: http://www.ukcle.ac.uk/resources/assessment/bone.html.
8 Davis MH, Ponnamperuma GG, McAleer S, Dale VHM. The objective structured clinical examination (OSCE) as a determinant of veterinary clinical skills. J Vet Med Educ. 2006;33:578–87. doi:10.3138/jvme.33.4.578
9 Black PJ, Wiliam D. Inside the black box: raising standards through classroom assessment [Internet]. 1998 [cited 2009 Dec 23]. Available from: http://www.michigan.gov/documents/mde/Inside_the_Black_Box_184495_7.pdf.
10 Pead MJ. Assessment: Cinderella or sleeping beauty? Evolution of final examinations at the Royal Veterinary College. J Vet Med Educ. 2008;35:607–11. doi:10.3138/jvme.35.4.607
11 Kassebaum DG, Eaglen RH. Shortcomings in the evaluation of students' clinical skills and behaviors in medical school. Acad Med. 1999;74:842–9. doi:10.1097/00001888-199907000-00020
12 Muijtjens AMM, Hoogenboom RJI, Verwijnen GM, van der Vleuten CPM. Relative or absolute standards in assessing medical knowledge using progress tests. Adv Health Sci Educ. 1998;3:81–7. doi:10.1023/A:1009728423412
13 Larsen DP, Butler AC, Roediger HL III. Test-enhanced learning in medical education. Med Educ. 2008;42:959–66. doi:10.1111/j.1365-2923.2008.03124.x
14 Kromann C, Jensen M, Ringsted C. The effect of testing on skills learning. Med Educ. 2009;43:21–7. doi:10.1111/j.1365-2923.2008.03245.x
15 van der Vleuten CPM, Newble D. How can we test clinical reasoning? Lancet. 1995;345:1032–4. doi:10.1016/S0140-6736(95)90763-7
16 McLachlan JC. The relationship between assessment and learning. Med Educ. 2006;40:716–7. doi:10.1111/j.1365-2929.2006.02518.x
17 Kuper A, Reeves S, Albert M, Hodges BD. Assessment: do we need to broaden our methodological horizons? Med Educ. 2007;41:1121–3. doi:10.1111/j.1365-2923.2007.02945.x
18 Fowell SL, Southgate LJ, Bligh JG. Evaluating assessment: the missing link? Med Educ. 1999;33:276–81. doi:10.1046/j.1365-2923.1999.00405.x
19 Nightingale P, Te Wiata IT, Toohey S, Ryan G, Hughes C, Magin D. Assessing learning in universities. Sydney: University of New South Wales Press; 1996.
20 Murphy DJ, Bruce D, Eva KW. Workplace-based assessment for general practitioners: using stakeholder perception to aid blueprinting of an assessment battery. Med Educ. 2008;42:96–103. doi:10.1111/j.1365-2923.2007.02952.x
21 Norcini JJ, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach. 2007;29:855–71. doi:10.1080/01421590701775453
22 Rethans JJ, Norcini JJ, Baron-Maldonado M, Blackmore D, Jolly BC, LaDuca T, et al. The relationship between competence and performance: implications for assessing practice performance. Med Educ. 2002;36:901–9. doi:10.1046/j.1365-2923.2002.01316.x
23 Norcini JJ, Blank LL, Duffy FD, Fortna GS. The mini-CEX: a method for assessing clinical skills. Ann Intern Med. 2003;138:476–81.
24 Trent A. Outcomes assessment planning: an overview with applications in health sciences. J Vet Med Educ. 2002;29:9–19. doi:10.3138/jvme.29.1.9
25 Hecker K, Violato C. Validity, reliability, and defensibility of assessments in veterinary education. J Vet Med Educ. 2009;36:271–5. doi:10.3138/jvme.36.3.271
26 The Joint Committee on Standards for Educational Evaluation. Student evaluation standards [Internet]. 2003 [cited 2010 Feb 18]. Available from: http://www.jcsee.org/ses.
27 Corbett HD, Wilson BL. Testing, reform and rebellion. Norwood (NJ): Ablex; 1991.

AUTHOR INFORMATION

Carmen Fuentealba, MV, MSc, PhD, is Professor of Veterinary Pathology, Department of Ecosystem and Public Health, Faculty of Veterinary Medicine, University of Calgary, 3330 Hospital Drive NW, Calgary, AB T2N 4N1. E-mail: [email protected].

