CBM-Written Expression Directions
Tracking student growth in emerging writing skills can be confusing and time-consuming for teachers. However,
Curriculum-Based Measurement-Written Expression (CBM-WE) is an efficient, reliable method of formative student
assessment that yields numeric indicators that are instructionally useful--such as total words written, correctly spelled
words, and correct writing sequences (Gansle et al., 2006). CBM-WE probes are group-administered writing samples
with an administration time of about 4 minutes. CBM-Written Expression is therefore a powerful means to monitor a
student's progress in the mechanics and conventions of writing.
CBM-Written Expression: What It Measures. Teachers have several assessment options to choose from when
using CBM-Written Expression (Gansle et al., 2006; Wright, 1992):
• Total Words Written (TWW): This measure is a count of the total words written during the CBM-WE assessment.
Teachers might select Total Words Written as a progress-monitoring target if the student needs to focus on
writing fluency (getting more words onto the page).
• Correctly Spelled Words (CSW): This measure is a count of correctly spelled words written during the CBM-WE
assessment. If poor spelling is a blocker to student writing, the teacher may select this monitoring target.
• Correct Writing Sequences (CWS): This measure is a tabulation of correct 'writing sequences' written during the
CBM-WE assessment. One Correct Writing Sequence is scored whenever two adjacent units of writing (e.g., two
words appearing next to each other) are found to be correct in their punctuation, capitalization, spelling, and
syntactical and semantic usage. When the student is expected to have mastered the basic mechanics and
conventions of writing, Correct Writing Sequences are a useful method to track this group of interrelated skills.
CBM-Written Expression Fluency Measures: How to Access Resources. Teachers who wish to screen their
students in basic writing skills can obtain these free CBM-Written Expression assessment resources: (1) materials for
assessment, (2) guidelines for administration and scoring, and (3) research-based norms.
• Materials for assessment. Schools can create their own CBM Written Expression Fluency assessment materials
at no cost, using the Written Expression Probe Generator, a free online application:
http://www.interventioncentral.org/tools/writing-probe-generator
This program allows the user to customize and to generate printable story-starter worksheets in PDF format.
• Guidelines for administration and scoring. Instructions for preparing, administering, and scoring CBM-Written Expression assessments appear later in this document.
• Research-based norms. Research norms for Correctly Spelled Words and Correct Writing Sequences, including weekly growth rates, also appear later in this document.
References
Gansle, K. A., VanDerHeyden, A. M., Noell, G. H., Resetar, J. L., & Williams, K. L. (2006). The technical adequacy of
curriculum-based and rating-based measures of written expression for elementary school students. School
Psychology Review, 35, 435-450.
Malecki, C. K., & Jewell, J. (2003). Developmental, gender, and practical considerations in scoring curriculum-based
measurement writing probes. Psychology in the Schools, 40, 379-390.
McMaster, K., & Espin, C. (2007). Technical features of curriculum-based measurement in writing: A literature review.
Journal of Special Education, 41(2), 68-84.
Robinson, L. K., & Howell, K. W. (2008). Best practices in curriculum-based evaluation & written expression. In A.
Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 439-452). Bethesda, MD: National
Association of School Psychologists.
Tadatada, A. (2011). Growth rates of curriculum-based measurement-written expression at the elementary school
level. Unpublished master's thesis, Western Kentucky University, Bowling Green.
Wright, J. (1992). Curriculum-based measurement: A manual for teachers. Retrieved September 23, 2011, from
http://www.jimwrightonline.com/pdfdocs/cbaManual.pdf
Materials Needed. The examiner should have the following materials on hand for a CBM-Written Expression assessment:
• Student copy of CBM writing probe with story-starter (the process for creating story-starters is described below)
• Stopwatch
• Pencils for students
Schools can create these writing probes at no cost with the Written Expression Probe Generator (http://www.interventioncentral.org/tools/writing-probe-generator), a free online application that generates customizable, printable story-starter worksheets in PDF format.
The CBM writing probe in Figure 1 is an example of how such a probe might be formatted. (This particular probe was used in a 5th-grade classroom.)
Figure 1: Example of a CBM writing probe

CBM Writing Probe
One day, I was out sailing. A storm carried me far out to sea and wrecked
[blank lines follow on the worksheet for the student to continue the story]
If the student is not yet familiar with CBM-WE assessments, the teacher can administer one or more practice CBM-WE probes (using the administration guidelines above) and provide coaching and feedback as needed until assured that the student fully understands the assessment.
Scoring methods differ both in the amount of time that they require of the instructor and in the type of information that
they provide about a student's writing skills. Advantages and potential limitations of each scoring system are
presented below.
Total Words Written (TWW). The examiner counts up and records the total number of words written during the 3-
minute writing probe. Misspelled words are included in the tally, although numbers written in numeral form (e.g., 5,
17) are not counted. Calculating total words is the quickest of the scoring methods. A drawback, however, is that it yields
only a rough estimate of writing fluency (that is, of how quickly the student can put words on paper) without
examining the accuracy of spelling, punctuation, and other writing conventions. A 6th-grade student wrote the CBM
writing sample in Figure 2. Using the total-words scoring formula, this sample is found to contain 45 words, including
misspellings.
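The Total Words Written tally is mechanical enough to automate. The short Python sketch below is one possible implementation of the counting rules described above (misspellings count; numerals such as 5 or 17 do not); the function name and sample sentence are illustrative only, not part of any published scoring tool.

import re

def total_words_written(sample: str) -> int:
    """Count Total Words Written (TWW): every word the student wrote,
    including misspellings, but excluding numbers written as numerals."""
    count = 0
    for token in sample.split():
        # Strip surrounding punctuation so "sea." still counts as a word.
        word = token.strip(".,!?;:'\"()")
        # Skip tokens written purely as numerals (e.g., "5", "17").
        if word and not re.fullmatch(r"\d+", word):
            count += 1
    return count

sample = "I woud drink water from the ocean if I had 2 cups."
print(total_words_written(sample))  # 11 (the numeral "2" is not counted)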
Figure 2: CBM writing sample scored for Total Words Written (Total = 45 words)
Correctly Spelled Words. The examiner counts up only those words in the writing sample that are spelled correctly. Words are considered separately, not within the context of a sentence. When scoring, a good rule of thumb is to determine whether--in isolation--the word represents a correctly spelled term in English. If it does, the word is included in the tally. Assessing the number of correctly spelled words has the advantage of being quick. Also, by examining the accuracy of the student's spelling, this approach monitors to some degree a student's mastery of written language. As seen in Figure 3, our writing sample contains 39 correctly spelled words.
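Because each word is judged in isolation, the Correctly Spelled Words count can also be sketched in a few lines of Python. In the example below, the small word set stands in for a full English dictionary or spell-checker, and the function name and sample are illustrative only.

def correctly_spelled_words(sample: str, dictionary: set) -> int:
    """Count Correctly Spelled Words (CSW): each word is judged in isolation,
    so a real English word used in the wrong context still counts."""
    count = 0
    for token in sample.split():
        word = token.strip(".,!?;:'\"()").lower()
        if word and word in dictionary:
            count += 1
    return count

# A toy word set stands in for a complete English word list.
word_list = {"i", "would", "drink", "water", "from", "the", "ocean"}
sample = "I woud drink water from the ocean"
print(correctly_spelled_words(sample, word_list))  # 6 ("woud" is misspelled)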
Figure 3: CBM writing sample scored for Correctly Spelled Words
Correct Writing Sequences. When scoring correct writing sequences, the examiner goes beyond the confines of the
isolated word to consider units of writing and their relation to one another. Using this approach, the examiner starts at
the beginning of the writing sample and looks at each successive pair of writing units (writing sequence). Words are
considered separate writing units, as are essential marks of punctuation. To receive credit, writing sequences must
be correctly spelled and be grammatically correct. The words in each writing sequence must also make sense within
the context of the sentence. In effect, the student's writing is judged according to the standards of informal standard
American English. A caret (^) is used to mark the presence of a correct writing sequence.
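Judging grammar, semantics, and punctuation requires the examiner's eye, so Correct Writing Sequences are normally scored by hand on a caret-marked copy of the sample. The minimal Python sketch below automates only the final tally, counting the carets an examiner has already inserted; the marked line is the one scored in Figure 5 later in this document.

def correct_writing_sequences(marked_sample: str) -> int:
    """Tally Correct Writing Sequences (CWS) from an examiner-marked
    transcription: each caret (^) flags one correct adjacent pair of
    writing units (word-word, word-punctuation, or sentence-start-word)."""
    return marked_sample.count("^")

# Examiner-marked line: "woud" is misspelled, so no carets touch it.
marked = "^I woud drink^water^from^the^ocean"
print(correct_writing_sequences(marked))  # 5 correct writing sequences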
Figure 4: An illustration of selected scoring rules for correct writing sequences

^It^was^dark^.^Nobody ^could seen the^trees^of ^the forrest.

• Since the first word begins the sentence correctly, it is marked as a correct writing sequence.
• Because the period is considered essential punctuation, it is joined with the words before and after it to make two correct writing sequences.
• Grammatical or syntactical errors (here, "could seen") cannot be counted as correct writing sequences.
• Misspelled words (here, "forrest") cannot be counted as correct writing sequences.
The following scoring rules will aid the instructor in determining correct writing sequences:
• Correctly spelled words make up a correct writing sequence (reversed letters are acceptable, so long as they do
not lead to a misspelling):
Example
^Is^that^a^red^car^?
• Necessary marks of punctuation (excluding commas) are included in correct writing sequences:
Example
^Is^that^a^red^car^?
Not surprisingly, evaluating a writing probe according to correct writing sequences is the most time-consuming of the
scoring methods presented here. It is also the scoring approach, however, that yields the most comprehensive
information about a student's writing competencies. While further research is needed to clarify the point, it also
seems plausible that the correct writing sequence method is most sensitive to short-term student improvements in
writing. Presumably, advances in writing skills in virtually any area (e.g., spelling, punctuation) could quickly register
as higher writing sequence scores. Our writing sample in Figure 5 is found to contain 37 correct writing sequences.
Figure 5: CBM writing sample scored for Correct Writing Sequences (each correct writing sequence is marked with a caret (^))

^I woud drink^water^from^the^ocean   (5 correct writing sequences)
CBM-Written Expression: Research Norms*

Correctly Spelled Words (CSW): This measure is a count of correctly spelled words written during the CBM-WE assessment.

Grade | Fall CSW (Malecki & Jewell, 2003) | Fall: +/-1 SD (≈16th%ile to 84th%ile) | Spring CSW (Malecki & Jewell, 2003) | Spring: +/-1 SD (≈16th%ile to 84th%ile) | Weekly Growth (Tadatada, 2011)
1 | 5 | 1↔9 | 10 | 3↔17 | 0.45
2 | 20 | 10↔30 | 27 | 15↔39 | 0.46
3 | 32 | 19↔45 | 33 | 21↔45 | 0.37
4 | 38 | 26↔50 | 44 | 29↔59 | 0.26
5 | 48 | 31↔65 | 65 | 42↔88 | --
6 | 42 | 29↔55 | 56 | 41↔71 | --
Correct Writing Sequences (CWS): This measure is a tabulation of correct 'writing sequences' written during the CBM-WE assessment. One Correct Writing Sequence is scored whenever two adjacent units of writing (e.g., two words appearing next to each other) are found to be correct in their punctuation, capitalization, spelling, and syntactical and semantic usage.

Grade | Fall CWS (Malecki & Jewell, 2003) | Fall: +/-1 SD (≈16th%ile to 84th%ile) | Spring CWS (Malecki & Jewell, 2003) | Spring: +/-1 SD (≈16th%ile to 84th%ile) | Weekly Growth (Tadatada, 2011)
1 | 2 | 0↔4 | 7 | 1↔13 | 0.36
2 | 15 | 5↔25 | 24 | 11↔37 | 0.44
3 | 28 | 14↔42 | 31 | 18↔44 | 0.35
4 | 38 | 25↔51 | 42 | 26↔58 | 0.22
5 | 46 | 28↔64 | 63 | 40↔86 | --
6 | 41 | 27↔55 | 54 | 37↔71 | --
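To illustrate how the norm tables above can be read, here is a brief, hypothetical Python sketch: the CWS values from Malecki & Jewell (2003) are hard-coded as a lookup, and a student's score is reported as below, within, or above the +/-1 SD band for the chosen grade and season.

# Fall/spring CWS norms (Malecki & Jewell, 2003), keyed by grade:
# (fall mean, fall +/-1 SD range, spring mean, spring +/-1 SD range)
CWS_NORMS = {
    1: (2, (0, 4), 7, (1, 13)),
    2: (15, (5, 25), 24, (11, 37)),
    3: (28, (14, 42), 31, (18, 44)),
    4: (38, (25, 51), 42, (26, 58)),
    5: (46, (28, 64), 63, (40, 86)),
    6: (41, (27, 55), 54, (37, 71)),
}

def describe_cws_score(grade: int, season: str, score: float) -> str:
    """Compare a student's CWS score to the +/-1 SD band (roughly the
    16th to 84th percentile) for that grade and season."""
    _, fall_band, _, spring_band = CWS_NORMS[grade]
    low, high = fall_band if season == "fall" else spring_band
    if score < low:
        return f"below the +/-1 SD band ({low}-{high})"
    if score > high:
        return f"above the +/-1 SD band ({low}-{high})"
    return f"within the +/-1 SD band ({low}-{high})"

print(describe_cws_score(3, "fall", 12))  # below the +/-1 SD band (14-42)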
* Reported Characteristics of Student Sample(s) Used to Compile These Norms:

Malecki & Jewell, 2003: Number of students assessed: 946 total (Grade 1: fall 133, spring 123; Grade 2: fall 200, spring 156; Grade 3: fall 168, spring 109; Grade 4: fall 192, spring 182; Grade 5: fall 127, spring 120; Grade 6: fall 57, spring 54). Geographical location: northern Illinois; sample drawn from 5 suburban and rural schools across three districts. Socioeconomic status: not reported. Ethnicity of sample: not reported. English Language Learners in sample: not reported.

Tadatada, 2011: Number of students assessed: 1,004 total (Grade 1: 207; Grade 2: 208; Grade 3: 204; Grade 4: 220; Grade 5: 165). Geographical location: Bowling Green, KY; sample drawn from 5 elementary schools in a single district. Socioeconomic status: not reported. Ethnicity of sample: 64% White, 18% African-American, 13% Hispanic, 3% Asian, 3% Other. Limited English Proficiency in sample: 19%.
Where to Find Materials: Schools can create their own CBM-Written Expression assessment materials at no cost with the Written Expression Probe Generator (http://www.interventioncentral.org/tools/writing-probe-generator), a free online application that generates customizable, printable story-starter worksheets in PDF format.
Limitations of These Research Norms: Norms generated from small-scale research studies--like those used here--provide estimates of student academic performance based on a sampling from only one or two points in time, rather than a more comprehensive sampling across separate fall, winter, and spring screenings. These norms have also been compiled from a relatively small student sample that is not fully representative of a diverse 'national' population. Nonetheless, norms such as these are often the best information that is publicly available for basic academic skills and therefore have a definite place in classroom instructional decision-making.
These norms can be useful in general education for setting student performance outcome goals for core instruction
and/or any level of academic intervention. Similarly, these norms can be used to set performance goals for students
with special needs. In both cases, however, single-sample norms would be used only if more comprehensive
fall/winter/spring academic performance norms are not available.
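One common way to put the weekly growth rates in the tables above to work is simple arithmetic: multiply the expected weekly gain by the number of intervention weeks and add the result to the student's baseline score. The short Python sketch below illustrates this with a hypothetical 2nd-grade student; the figures are examples only, not a recommended goal.

def intervention_goal(baseline: float, weekly_growth: float, weeks: int) -> float:
    """Project an outcome goal by adding the expected weekly growth to the
    student's baseline score over the planned intervention period."""
    return baseline + weekly_growth * weeks

# Hypothetical case: a 2nd-grader writes 12 correct writing sequences (CWS)
# at baseline; the grade 2 weekly CWS growth norm is 0.44 (Tadatada, 2011).
goal = intervention_goal(baseline=12, weekly_growth=0.44, weeks=8)
print(round(goal, 1))  # 15.5 CWS expected after 8 weeks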