ENERGIZING EDUCATION

A Literature Review of Assessment: What New Sonographic Faculty Should Know

REVA A. CURRY, PhD, RT(R), RDMS, FSDMS
NAYDEEN T. GONZALEZ-DeJESUS, MA

DOI: 10.1177/8756479310361374

Much attention has been paid by the public, lawmakers, institutions, and accrediting agencies to assessment and student learning outcomes. In the book Learner-Centered Assessment on College Campuses,¹ Huba and Freed write, "Assessment is a movement that began outside the academy in order to make institutions more accountable to external constituencies." Frye² reports that assessment for accountability is used for "resource allocation and fiscal efficiency."

In response, many colleges and universities have increased their assessment activity and prominently display assessment plans and activities on their Web sites. Interest in assessment is high: national and regional conferences on assessment are "sold out," and books, articles, and assessment "toolkits" abound for the professional to choose from. All of this can make the assessment environment overwhelming for sonographic educators, especially new faculty.

Assessment is defined as examining at a deeper level what students know and what they can do with what they know.³⁻⁶ It relies on a process of identifying student learning outcomes or objectives, planning how to measure the learning that has occurred, collecting data from various sources, analyzing and thinking about the data, and then examining ways to improve future learning.

This article defines commonly used assessment terms, provides a brief overview of the assessment cycle, shows new faculty how to examine their syllabi for assessment tools, and describes the relationship between program, department, school, and institutional assessment.

Assessment by Definition

First, it is important to have a common understanding of terms related to assessment. This is in itself challenging because some words have different meanings but are used interchangeably, while other terms have similar meanings but are used as if they were different from each other. Four pairs of words that fit this description are listed below. Which pairs are similar in meaning?

• Assessment and accountability
• Student learning outcomes and student learning goals
• Formative assessment and summative assessment
• Learning objectives and learning goals

Assessment and accountability do not mean the same thing but are often used interchangeably as if they do. Simply put, "When we assess our own performance, it's assessment; when others assess our performance, it's accountability."²
Direct performance indicators may be used for assessment and indirect performance indicators for accountability. These will be discussed in the assessment cycle.

Student learning outcomes and student learning goals usually mean the same thing. These terms refer to what students know and are able to do after learning has occurred.⁴,⁵ Performance indicators that document or measure this include direct and indirect as well as formative and summative measures. Formative and summative assessment are presented in the assessment cycle.

Formative assessment and summative assessment are not the same. Formative is incremental, whereas summative is comprehensive.⁶⁻⁹ An example of formative assessment is an ungraded miniquiz given at the end of a lecture to assess how well students understood the material that was just presented. It gives the student and instructor immediate feedback, and changes can be made to "get back on track" if needed. A comprehensive final examination is an example of summative assessment.⁸ In this example, feedback is intended mainly for the instructor or other program faculty, to determine how much the student knows about the course material.

Learning objective and learning goal do not mean the same thing. Goals are broad, whereas objectives are specific.⁷,¹⁰ Objectives further delineate the broad goal into measurable, specific aims. Although there is no set formula for the number of objectives used for each goal, an anecdotal review of goals and objectives listed on syllabi from several colleges shows a range of three to five course goals, with four to six objectives for each goal.

The Assessment Cycle

Assessment is frequently described as a four-step process.¹,⁷ Learning outcomes or objectives are developed first. Second, assessment tools are created to measure how well the student has learned. The third step is the actual teaching and learning process, during and after which assessment tools can be administered. The fourth step is collecting data from the assessment tools, studying the data, and using the results to make improvements in future teaching and learning.

This means assessment data are used to make informed decisions about how to improve learning the next time the same module or course is offered, which completes the cycle (see Figure 1).

FIGURE 1. An assessment cycle.

Let us look at the four steps in Figure 1 in more detail.

STEP 1: DEVELOP LEARNING GOALS AND OBJECTIVES

First, there are abundant resources to help new faculty develop learning goals and objectives. Educators can look to their college or university resources for assistance with writing course goals and specific objectives. Guidance on how to write objectives and goals is included in the resource material of teaching excellence or faculty resource centers, online and on campus, and is also available from the institution's curriculum review committee. In addition, Internet searches yield a plethora of suggestions on how to write good goals and objectives.

Second, new faculty should review their course syllabi regularly to ensure the goals and objectives for their course meet the mission, goals, and objectives of the sonographic educational program, its department, the school, and the institution.
Courses and programs that help meet the mission and goals of the department, school, and institution have greater validity to institutional mission and resource allocation than those that do not.²,¹¹ The interconnectedness described begins with the institution's mission statement, which flows downward to the course level. Simultaneously, goals and objectives starting at the course level should flow upward to support the institution's mission. A mutually supportive dynamic is thus achieved. This is shown in Figure 2.

FIGURE 2. The top-down approach ensures the learning experiences are in line with the institution's mission and goals. The nested levels shown are: Institutional Mission and Goals; School/College Mission and Goals; Department/Division Mission and Goals; Program Mission and Objectives; Course Goals and Objectives; Learning Experiences.

STEP 2: CREATE TOOLS TO ASSESS THE LEARNING THAT WILL TAKE PLACE

Which types of assessment measures work best for the objectives in the didactic, laboratory, clinical, or seminar course? New faculty can answer this through a review of the syllabus currently being used to determine what measures are already in place. Is it tests, quizzes, oral examinations, or all three, along with other means? How do the results inform faculty that learning has occurred? Are any changes needed?

Chickering and Gamson,¹² in their article "Seven Principles for Good Practice in Undergraduate Education," encourage faculty and students to work together to develop an optimum learning environment that includes the use of a variety of assessment tools.¹³,¹⁴ This type of learning environment includes the following:

• Optimizing contact between students
• Increasing contact with faculty in and outside of class
• Developing cooperation among students
• Encouraging active learning, which includes listening with purpose
• Reflecting on what has been learned
• Asking questions
• Receiving prompt feedback from the instructor

In addition, faculty who role-model for students their respect for diversity and the different ways students learn communicate their high expectations for student learning more effectively than those who do not.¹²⁻¹⁴ Chickering and Gamson¹²,¹³ reason that this type of learning community forms a foundation upon which a wide variety of assessment tools can be developed. The tools should be equitable and easy to understand, with expectations communicated to students well ahead of time (e.g., reviewing the grading rubric with students several weeks before the assignment is due).

Again, working with institutional resources such as the faculty development center and the assessment committee can provide tools the institution uses to assess student learning. When new faculty make a conscious effort to use these and other institutional resources, they set an example for students to do the same and to seek out additional learning resources in and out of the classroom. This helps build a learning community among faculty, their students, and the institution. Students' connection to their institution becomes a motivating factor to do well academically and can be assessed long after they graduate.¹⁴

One research study on students' premature departure from college showed that one of the most significant predictors of early departure was the lack of connection with the institution.¹⁵ On the other hand, the same study showed
student satisfaction to have the greatest positive impact on student retention. Thus, the importance of the seven principles is even more significant when viewed through the lens of alumni satisfaction and student attrition. In fact, what Chickering and Gamson¹² are actually describing are seven ways to help students connect with their institution!

STEP 3: ADMINISTERING ASSESSMENT TOOLS

Assessment tools are administered before, during, and/or after learning for incremental and summary assessment. The tools should be easy to administer and tally, and they should be embedded in the course as part of the learning process. For example, a 5-point Likert-type scale can be used to ascertain if there are any differences in student behavior before and after an intervention has been introduced to facilitate learning.

Assessment can be divided into two distinct categories: formative and summative, as well as direct and indirect.³,⁷⁻⁹,¹⁶⁻¹⁹ Formative assessment occurs as learning is happening, as compared with summative assessment, which occurs at the end of the unit of instruction. Table 1 shows the differences between formative and summative assessment.

TABLE 1. An Overview of the Differences Between Formative and Summative Assessment

Aspect of Assessment | Formative | Summative
Purpose of assessment | Immediate feedback to instructor and students on whether learning has occurred | Cumulative or comprehensive; checks learning at the end of a unit of instruction
When assessment is given | During the course or unit of instruction | At the end of the course or unit of instruction
Are grades given? | No | Yes
Examples of this assessment | Ungraded quizzes at the end of a lecture; student reflection and journaling; pre- and posttests | Unit tests or comprehensive tests
Student feedback to assessment | Essential to the assessment | Not required

One research study examined what effect group writing and individual reflective writing had on pre- and posttests in an economics class.¹⁶ Results showed that formative assessment had a "positive and significant" impact on pre- and posttest scores.¹⁶

A second category of assessment is direct versus indirect measures. Direct measures analyze student output, whether it is cognitive, affective, or psychosocial.²⁰ Indirect assessment is generally used to provide accountability to internal and external groups. It examines evidence of learning other than student output. One researcher called this the "reported perception of student mastery of learning outcomes."¹⁹ This work includes detailed comparisons of strengths and weaknesses in direct assessment tools (published tests, locally developed tests, embedded assignments and course activities, and portfolios) as well as the following indirect assessment tools: surveys, interviews, and focus groups. Formative and summative measures can be either direct or indirect, depending on the measure. Table 2 shows some types of direct and indirect measures of assessment.

TABLE 2. Differences Between Direct and Indirect Assessment

Direct | Indirect
Pass rates on licensure/certification examinations (summative) | Course grades (summative)
Portfolios and capstone projects (summative) | Admission, retention, and transfer rates (institutional)
Case studies (formative or summative) | Surveys, focus groups, and interviews (institutional)
Oral examination (formative or summative) | Honors and awards (summative)

Note: Summative or formative assessment is listed in parentheses next to each tool. Institutional data tend to be indirect measures and are also listed in the indirect category.
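To make the Step 3 tallying concrete, the short sketch below shows one way paired pre- and post-intervention responses from a 5-point Likert-type scale might be summarized. It is a minimal illustration only; the response data and variable names are hypothetical and are not drawn from this article.

```python
# Hypothetical tally of paired 5-point Likert-type responses collected before
# and after a learning intervention (the Step 3 example). All data invented.

paired_responses = [(2, 4), (3, 4), (3, 3), (2, 4), (4, 5), (3, 4), (2, 3)]

pre_mean = sum(pre for pre, _ in paired_responses) / len(paired_responses)
post_mean = sum(post for _, post in paired_responses) / len(paired_responses)
improved = sum(1 for pre, post in paired_responses if post > pre)
declined = sum(1 for pre, post in paired_responses if post < pre)
unchanged = len(paired_responses) - improved - declined

print(f"Class mean before: {pre_mean:.2f}, after: {post_mean:.2f}")
print(f"{improved} improved, {declined} declined, {unchanged} unchanged")
```

A simple summary like this is usually enough to show whether an intervention moved the class in the intended direction and is easy to embed in course files.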
STEP 4: DATA ANALYSIS AND USE

A critical part of assessment is using what we have learned from our assessment tools to improve the teaching and learning environment. This, in turn, should improve student learning outcomes. The study and use of results from the initial assessment are critical in producing measurable change in subsequent assessment cycles. The results may be shared at program faculty meetings and used to provide evidence of student learning for program and institution self-studies and site visits. For example, if assessment data were collected and used to improve learning in a second or even third assessment cycle, the end result would be three sets of data. These data, collected over time, could be used to strengthen program goals, shared with other faculty in the institution, and perhaps presented at conferences and published.

The type of assessment used determines the way in which students will receive feedback. For example, faculty may choose to review quiz results with students in great detail. This is especially important if the purpose is formative: the instructor and students need feedback right away, before preparing for the next session. However, faculty may choose not to review answers item by item with students, especially on tests or finals. The purpose here is summative: the grade itself is used to indicate the level of learning that has been mastered. When using this type of instrument, faculty can review aggregate test performance to identify concepts that may need to be covered again and include the material in subsequent meetings or assignments. To alleviate anxiety, students need to be informed at the beginning of the course of the differences between the types of assessment that will be used, how they will receive feedback, and the resources available to them outside the classroom, such as support services in tutoring, writing, and studying, that can help improve their learning and course grade. In this way, a classroom environment is created that is supportive of students and faculty.
in which students will receive feedback. For exam-
Exercise Reflection: Think about this exercise
ple, faculty may choose to review quiz results with
and write down any suggestions for improvement
students in great detail. This is especially important
to your syllabus, instruction, or data collection.
if the purpose is formative: the instructor and stu-
Consult with sonographic program faculty for
dents need feedback right away before preparing
additional guidance and direction if major changes
for the next session. However, faculty may choose
are needed.
not to review answers item by item with students,
especially on tests or finals. The purpose here is Program-Level Assessment
summative: the grade itself is used to indicate the
level of learning that has been mastered. When The same paradigm of goals, objectives, assess-
using this type of instrument, faculty can review ment tools, and data analysis is used for program
aggregate test performance to identify concepts assessment. Assessment strategies to determine if
which may need to be covered again and include student learning has occurred at the program level
the material in subsequent meetings or assign- may include capstone courses, embedded assess-
ments. To alleviate anxiety, students need to be ment, and simulated credentialing examinations.
informed at the beginning of the course the differ- Student learning at this level is guided by program
ences between the types of assessment that will be accreditation standards for sonographic educational
used, how they will receive feedback, and resources programs, through the Commission on Accredita-
available to them outside the classroom, such as tion of Allied Health Education Programs (CAA-
22
support services in tutoring, writing, and studying, HEP). CAAHEP is a national accrediting body
that can help improve their learning and course that, in cooperation with Joint Review Committees
grade. In this way, a classroom environment is cre- on Accreditation, sets forth program accreditation
ated that is supportive to students and faculty. standards for 20 health sciences professions, includ-
Now that the assessment cycle has been pre- ing diagnostic medical sonography. A sampling of
sented, let us examine current syllabi and mark the other 19 professions includes cardiovascular
areas that may be needed to support the assessment technologist, exercise scientist, medical illustrator,
cycle and student learning. orthotic and prosthetic practitioner, and respiratory
TABLE 3. Course-Level Assessment: A Four-Step Process

Step 1: Find Current Assessment Tools for Learning Objectives

Unit of Instruction | Learning Objectives | Assessment Tools
Course: didactic, laboratory, or clinical | As listed on the syllabus | For each learning objective, what assessment tools will you use to show the objective has been met?
Example: Laboratory | Example: Describe normal anatomy and abnormal findings on abdominal images.²¹ | Example: Ungraded oral competency using a rubric.

Step 2: Describe Current Assessment Tools

Assessment Tools | Formative or Summative | Direct or Indirect
List each tool separately | For each tool, is it formative or summative? | For each tool, is it direct or indirect?
Example: Rubric listing items to be covered in the competency and levels of performance (to be distributed to students to use to prepare for the assignment) | Example: Formative. Note: this would be a summative assessment if used at the end of the course and graded. | Example: Direct

Step 3: Review Raw Data for Data Analysis
The following information can be found in course files indicating results of assessment.

Dates Tool Is Used | Raw Data | Data Analysis
Dates that the assessment tool was used in the unit of instruction | Number and preliminary results | Discuss and reflect on the data. Is the learning objective being met? If not, what changes can be made to improve student learning?
Example: Every three weeks throughout the laboratory course. | Example: 20 students completed oral competencies three times during the laboratory course; 20 × 3 = 60 competencies completed. Competency includes student self-evaluation on performance. | Example: Data show students are having difficulty describing abnormal masses. Changes to be made: 1. Review criteria for abnormal masses at the next class meeting. 2. Give students 15 minutes extra practice in describing abnormal masses. 3. Ask students to rate their comfort level after the practice session.

Step 4: Change Implementation Strategy and Reanalyze
The following information can be found in course files.

Implementation Date for Changes | Results of Implementation | Reanalysis of Data
When were the changes initiated? | What were the results? | Are further changes indicated? If so, when will the changes be initiated?
Example: One week after the first oral competency was administered. | Example: Student performance improved in describing abnormal masses. | Example: Student average comfort level increased from a "3" to a "4" on a 5-point Likert-type scale. Add additional practice exercises to increase the comfort level to 4.25.
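The reanalysis arithmetic in steps 3 and 4 of Table 3 can also be scripted so it is easy to repeat each time new ratings are collected. The sketch below uses invented ratings; only the 4.25 target and the 5-point scale come from the table's worked example.

```python
# Hypothetical reanalysis of comfort-level ratings (Table 3, steps 3-4).
# The 4.25 target is from the table's example; the ratings are invented.

TARGET = 4.25
sessions = {
    "before practice change": [3, 3, 2, 4, 3],
    "after practice change":  [4, 4, 3, 5, 4],
}

for label, ratings in sessions.items():
    avg = sum(ratings) / len(ratings)
    status = "target met" if avg >= TARGET else f"below {TARGET} target"
    print(f"Average comfort {label}: {avg:.2f} ({status})")
```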
Close to 2000 programs nationwide have CAAHEP accreditation across the 20 health professions represented. Program accreditation through CAAHEP is important to ensure high standards in sonographic educational programs. The Joint Review Committee on Education in Diagnostic Medical Sonography (JRC-DMS) recommends standards and guidelines for sonographic educational programs to CAAHEP. The guidelines state that "there must be a written statement of the program's goals and learning domains (cognitive, psychomotor, affective)."²² Just like course goals, documentation of program goals is essential.

CAAHEP also stresses the importance of assessing DMS programs. In Section IV: Student and Graduate (Outcomes) Evaluation/Assessment, assessment of program goals is addressed: "The program must periodically assess its effectiveness in achieving its stated goals and learning domains."²² Therefore, implementing ongoing assessment is vital to the health of the program.

The JRC-DMS provides the National Education Curriculum (NEC) to supplement accreditation guidelines and assist programs in developing sonographic curriculum.²⁴ The NEC contains essential components for abdominal, OB/GYN, cardiac, and vascular curricula. Each component is supported by a rationale, objectives, and outlines. The NEC can also be used to ensure that program assessment is broad enough to address all areas within the curriculum.

Sonography program assessment should also complement the broader assessment goals of the department (i.e., a diagnostic imaging or radiological sciences department). The department's assessment should in turn complement that of the school or college (i.e., a School of Allied Health or College of Health Sciences). Last, all levels of assessment should culminate in supporting institutional assessment. Therefore, clear connections should exist between program, department, and institutional goals.

Institutional-Level Assessment

Correlations between program and institutional accreditation documents are helpful in establishing interconnectedness from the course level through the institutional level. For instance, the Middle States Commission on Higher Education (MSCHE) is one of seven regional accrediting bodies for higher education institutions. It states in its guidelines, Characteristics of Excellence in Higher Education, under Standard 14: Assessment of Student Learning, that member institutions must "articulate statements of expected student learning at the institutional, program and individual course levels."²⁵ This complements the earlier statements cited from the JRC-DMS programmatic accreditation standards and guidelines.

The institution's goals and its assessment plan are guided by regional accreditation standards and guidelines from one of the seven geographically represented commissions that comprise the broader Commission on Institutions in Higher Education (CIHE). Most institutions of higher education in the United States belong to CIHE. There are many similarities as well as distinct differences between programmatic and institutional accreditation. Like programmatic accreditation, the purpose of institutional accreditation is to ensure consistency in meeting high academic standards, in this case among institutions. Another similarity is that accreditation guidelines are developed by a commission or committee (by region instead of by profession) and list specific areas that must be documented and supported to receive accreditation or reaccreditation. Also, both programmatic and institutional accreditation processes require the development of a comprehensive self-study that addresses the items in the accreditation standards and guidelines. In addition, site visit teams are often required to actually visit the program or institution to verify information contained in the self-study. The submission of self-study documents, followed by site visits, generally occurs every 5 to 10 years, depending on the program or institutional accrediting body. Table 4 lists the commissions and the states and territories each covers.
TABLE 4. Commission on Institutions in Higher Education Member Commissions: States and Territories

Commission Name | States, Territories, International
Middle States Commission on Higher Education | Delaware, the District of Columbia, Maryland, New Jersey, New York, Pennsylvania, Puerto Rico, the U.S. Virgin Islands, and several locations internationally
North Central Association of Colleges and Schools, The Higher Learning Commission | Arkansas, Arizona, Colorado, Iowa, Illinois, Indiana, Kansas, Michigan, Minnesota, Missouri, North Dakota, Nebraska, Ohio, Oklahoma, New Mexico, South Dakota, Wisconsin, West Virginia, and Wyoming
New England Association of Schools and Colleges, Commission on Institutions of Higher Education | Connecticut, Maine, Massachusetts, New Hampshire, Rhode Island, and Vermont
Southern Association of Colleges and Schools, Commission on Colleges | Alabama, Florida, Georgia, Kentucky, Louisiana, Mississippi, North Carolina, South Carolina, Tennessee, Texas, Virginia, and Latin America and other international sites
Northwest Commission on Colleges and Universities | Alaska, Idaho, Montana, Nevada, Oregon, Utah, and Washington
Western Association of Schools and Colleges, Accrediting Commission for Community and Junior Colleges | California, Hawaii, and the Territories of Guam and Pacific Basin
Western Association of Schools and Colleges, Accrediting Commission for Senior Colleges and Universities | California, Hawaii, and the Territories of Guam and Pacific Basin

There are also major differences between programmatic and institutional accreditation. The differences lie mainly in what is reviewed and who is on the site visit team. Institutional accreditation examines the institution at an organizational level and does not look at programmatic professional requirements. Institutional site visit teams consist mainly of college administrators and do not include sonographic faculty. By contrast, the programmatic accreditation site visit team is made up of credentialed, professional sonographers who not only look at the administrative and academic issues of the institution but also determine whether the students are actually educated to perform sonographic scanning.

The institution's assessment plan, which documents how assessment is performed and how results are used, should be evident throughout the organization, as shown in Figure 3. Sonographic program administration and faculty should be familiar with assessment plans and goals at varying levels within the institution. This will ensure the program's goals and assessment plan are complementary, which is important in establishing the program's value to the institution and its ability to contribute toward meeting the institution's vision and goals.

FIGURE 3. Institutional goal setting and assessment planning: a top-down approach. The nested levels shown are: Institutional Assessment Plan (Academic and Student Support; Resource Utilization and Administration); College or School Assessment Plan; Departments or Divisions; Assessment Plan of Programs; Assessment Plan for Courses; Assessment of Learning Experiences.

Conclusion: Why Is the Assessment Cycle Important?

Program faculty, especially new faculty, should know about the assessment process to ensure student learning is taking place at levels acceptable to the program. Care should be taken to use assessment terms correctly, initiate assessment cycles for key courses, and ensure interconnectedness between course and program goals. New faculty are encouraged to learn more about their institution's assessment plan to better understand the role assessment plays in program and institutional mission, reaccreditation, and resource allocation.

References

1. Huba ME, Freed JE: Learner-Centered Assessment on College Campuses: Shifting the Focus From Teaching to Learning. Needham Heights, MA, Allyn & Bacon, 2000.
2. Frye R: Assessment, accountability and student learning outcomes [serial online]. 2007. http://www.ac.wwu.edu/~dialogue/issue2.html
3. Gamma Sigma Alpha Academic Greek Honorary at Bowling Green State University: Factors that impact student achievement and retention [research online]. 2009. http://www.gamasigmaalpha.org/research/research.pdf
4. Gentemann KM, Fletcher JJ, Potter DL: Refocusing the academic program review on student learning: the role of assessment. N Dir Inst Res 2006;1994:31–46.
5. Astin AW: Assessment for Excellence: The Philosophy and Practice of Assessment and Evaluation in Higher Education. Westport, CT, Greenwood, 1991.
6. Peterson MW, Augustine CH: Organizational practices enhancing the influence of student assessment information in academic decisions. Res Higher Educ 2000;41:21–52.
7. Suskie L: Assessing Student Learning: A Common Sense Guide. Boston, Anker, 2004.
8. Krause Center for Innovation: Formative and summative assessment. Foothill College [serial online]. http://www.krauseinnovatoincenter.org/ewyl/modules/module6-3.html
9. University of Texas at Arlington Active Learning Library: Active learning for critical thinking. http://activelearning.uta.edu/FacStaff/formsum.htm
10. Ball State University Offices of Academic Assessment and Institutional Research: Assessment Workbook. 2000. http://web.bsu.edu/IRAA/AA/WB/contents.htm
11. Southeastern Illinois College: Assessment plan. Adopted March 31, 2006. http://www.sic.cc.il/us/
12. Chickering AW, Gamson ZF: Seven principles for good practice in undergraduate education. AAHE Bull 1987;39:3–7.
13. Chickering AW, Ehrmann SC: Implementing the seven principles: technology as lever. AAHE Bull 1996;49:3–6.
14. Golich V: Thinking about assessment. Proceedings of the Conference of Chairs, Academic Program Assessment Panel; 1998. http://www.apsanet.org/imgtest/GolichThinkAssessment.pdf
15. Freeman JP, Hall EE, Bresciani MJ: What leads students to have thoughts, talk to someone about, and take steps to leave their institution? Coll Student J [serial online]. 2007. http://findaricles.com/p/articles/mi_m0FCR/is_4_41/ai_n27484163/pg_2/?tag=content;c011
16. Faulk D: Formative and summative assessment in economics principles courses: are applied exercises effective? Presented at: American Economic Association Annual Meeting; January 4–6, 2008; New Orleans, LA.
17. Cleveland State University Office of Student Learning Assessment: Examples of direct and indirect measures. 2008. http://www.csuohio.edu/offices/assessment/exmeasures.html
18. Missouri State University West Plains: Direct vs. indirect assessment [serial online]. 2007. http://www.wp.missouristate.edu/assessment/3122.htm
19. Allen MJ: Strategies for direct and indirect assessment of student learning, in Proceedings of the SACS-COC Summer Institute. Decatur, GA, Southern Association of Colleges and Schools, Commission on Colleges, 2008.
20. Bloom BS: Taxonomy of Educational Objectives: Handbook 1. Cognitive Domain. New York, Longman, 1956.
21. Curry RA, Tempkin BB: Ultrasonographic Educational Program: An Introduction to Normal Structure and Functional Anatomy. 2nd ed. Philadelphia, Saunders, 2004.
22. Commission on Accreditation of Allied Health Education Programs: Standards and Guidelines: Diagnostic Medical Sonography. Clearwater, FL, Commission on Accreditation of Allied Health Education Programs, 2007.
23. CAAHEP home page. http://www.caahep.org/
24. Joint Review Committee on Education in Diagnostic Medical Sonography (JRC-DMS): National Education Curriculum. http://www.jrcdms.org/nec/
25. Middle States Commission on Higher Education: Characteristics of Excellence in Higher Education: Eligibility Requirements and Standards for Accreditation. Philadelphia, Middle States Commission on Higher Education, 2006.

Correspondence: Reva A. Curry, PhD, RT(R), RDMS, FSDMS, The Richard Stockton College of New Jersey, PO Box 195, Pomona, NJ 08240. E-mail: [email protected].