L2 Objective Test

What is an Objective Test?

An objective test is a test that has right or wrong answers and so can be marked objectively. It can be contrasted with a subjective test, which is evaluated by giving an opinion, usually based on agreed criteria.
What is an Objective Test?

◼ Objective tests require recognition and recall of subject matter.
◼ The forms vary: questions of fact, sentence completion, true-false, analogy, multiple-choice, and matching.
◼ They tend to cover more material than essay tests.
◼ They have one, and only one, correct answer to each question.
◼ They may require strict preparation, such as memorization.
◼ They typically assess lower-level skills such as knowledge, comprehension, and application (higher-order items are much more difficult to write).
Multiple-choice item format

A multiple-choice item consists of one or more introductory sentences followed by a list of two or more suggested responses. The student must choose the correct answer from among the responses listed.

Example: How many sides does a heptagon have? (stem)

A. Three (distractor)
B. Five (distractor)
C. Six (distractor)
D. Seven (keyed alternative)
Multiple-choice item format

Stem: the part of the item that asks the question. Write the stem so that students understand what question to answer.

Alternatives: the list of suggested responses, also called choices or options. They should always be arranged in a meaningful way (logically, numerically, or alphabetically). Why?

Keyed alternative and distractors: the alternative that is the best answer is called the keyed answer or keyed alternative. The remaining incorrect alternatives are called distractors or foils.
General guidelines for writing & formatting objective test items

Writing
◼ Objective test items should cover important content and skills.
◼ The reading level and vocabulary of each item should be as elementary as possible.
◼ Each objective item should be stated in an unambiguous manner; confusing sentence structure and wording should be avoided.
◼ Objective items should not consist of verbatim statements or phrases lifted from the text.
◼ Clues to the correct answer should not be provided.

Formatting
◼ Vary the types of items that appear on classroom tests.
◼ Group items similar in format together so that each type appears in a separate section.
◼ Each section should be preceded by clear directions.
◼ Within each section, order the items from easiest to most difficult.
◼ Although all item types will not appear on every test, they should be arranged in the following order: true-false, matching, short answer, multiple-choice, and essay.
◼ Provide adequate space for students to respond to each item.
◼ Avoid splitting an item between two pages.
Considerations before writing items
a) Similarity of Distractors

You can construct a test item for students at a specific level of a learning continuum. Students at this level (or above it) should be able to answer the item correctly; students lower on the continuum will not. Consider the following items:

1. In what year did the United States enter World War I?
   A. 1776
   B. 1812
   C. 1917
   D. 1981

2. In what year did the United States enter World War I?
   A. 1901
   B. 1917
   C. 1941
   D. 1950

3. In what year did the United States enter World War I?
   A. 1913
   B. 1915
   C. 1916
   D. 1917

You can easily see how the alternatives operate to make the item easy or difficult. Research shows that similarity among the alternatives increases the difficulty of an item (Green, 1984).
Considerations before writing items
b) Basic Purpose of Assessment Tasks

Based on the preceding example, some less knowledgeable students will probably answer some tasks correctly, and some more knowledgeable students will not. Even so, keep in mind this principle:

The basic purpose of an assessment task, whether or not it is a multiple-choice item, is to identify students who have attained a sufficient level of knowledge (skill, ability, performance).
Considerations before writing items
c) Variety of Multiple-Choice Items

Teachers and professional test developers use several varieties of multiple-choice items. Teachers usually find that the correct-answer, best-answer, incomplete-statement, and negative varieties are the most useful.
Considerations before writing items
d) Direct versus Indirect Assessments

A multiple-choice test can be a direct assessment of certain abilities. Well-written multiple-choice items can help directly assess a student's ability to discriminate and make correct choices; to comprehend concepts, principles, and generalizations; to infer and reason; to compute; and to interpret new data or information.

Multiple-choice items are only an indirect assessment of other important educational outcomes, such as the ability to:
• recall information,
• solve problems that are not well structured,
• organize personal thoughts,
• display thought processes, and
• work in groups.
Creating Basic Multiple-Choice Items
Five basic skills of the craft

You will create useful multiple-choice items if you learn how to do five things:
1) Focus items to assess specific learning targets.
2) Prepare the stem as a question or problem to be solved.
3) Write a concise and correct alternative.
4) Write distractors that are plausible.
5) Edit the item to remove irrelevant clues to the correct answer.

First-draft multiple-choice items should not be put on a test until they are edited and polished.
Crafting the stem of the item

✓ Ask or imply a direct question.
✓ Put the alternatives at the end.
✓ Control vocabulary and sentence structure.
✓ Avoid negatively worded stems.
✓ Avoid textbookish wording.
Crafting alternatives or foils

✓ Make alternatives plausible and functional.
✓ Keep alternatives homogeneous.
✓ Put repeated words in the stem.
✓ Use consistent, correct punctuation.
✓ Arrange the alternatives in a meaningful order.
Item Analysis

Item analysis is the analysis of the statistical characteristics of each item appearing on a test, for the purpose of deciding whether to retain or discard items.

Items should be evaluated:
✓ while items are being drafted (using a table of specifications, guidelines, etc.), and
✓ following test administration and scoring.
Item Analysis

Four basic statistics:
✓ Item difficulty: the proportion of students who answered the item correctly.
✓ Item discrimination: the difference between the proportions of correct answers in the high-scoring and low-scoring groups.
✓ Distractor analysis: examines patterns of response for the incorrect options.
✓ Reliability: overall consistency across all items.
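Of these four statistics, the distractor tally is the simplest to compute. The sketch below (in Python, with an invented item and invented student responses) counts how often each option was chosen:

```python
from collections import Counter

# Hypothetical data: the option each of eight students chose on one item.
choices = ["C", "B", "C", "A", "C", "C", "B", "D"]

def distractor_counts(choices, options="ABCD"):
    """Tally how many students chose each option. A distractor that
    almost no one selects is doing no work and is a candidate for revision."""
    counts = Counter(choices)
    return {opt: counts.get(opt, 0) for opt in options}

print(distractor_counts(choices))  # {'A': 1, 'B': 2, 'C': 4, 'D': 1}
```

If "C" is the keyed answer here, each distractor attracted at least one student, which is the pattern a teacher would hope to see.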
Item Analysis

Item difficulty:
✓ Symbolized by p.
✓ Computed by dividing the number of students who correctly answered an item by the number who attempted it.
✓ Can range from .00 (difficult) to 1.00 (easy).
✓ Consider revising any item where p < .20 or p > .85.
✓ Good judgment should be used in conjunction with statistical analyses: teachers could reasonably expect all students to correctly answer some items, in which case p = 1.00 simply indicates that all students have mastered the concept.
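The difficulty index follows directly from this definition. Here is a minimal Python sketch, using an invented 4-student, 3-item response matrix:

```python
# Rows = students, columns = items; 1 = correct, 0 = incorrect (invented data).
responses = [
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [0, 1, 0],
]

def item_difficulty(responses, item):
    """p = number who answered the item correctly / number who attempted it."""
    attempts = [row[item] for row in responses]
    return sum(attempts) / len(attempts)

for i in range(3):
    p = item_difficulty(responses, i)
    flag = "consider revising" if p < 0.20 or p > 0.85 else "ok"
    print(f"item {i}: p = {p:.2f} ({flag})")
```

The p < .20 / p > .85 thresholds from the slide are applied mechanically here; as noted above, a high p may simply mean the class has mastered the concept.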
Item Analysis

Item discrimination:
✓ Symbolized by D.
✓ Its purpose is to see how well each item discriminates between low- and high-scoring students (on the entire test).
✓ If an item functions well, most students in the high-scoring group will answer it correctly and most students in the low-scoring group will answer it incorrectly.
✓ Items may be positively or negatively discriminating.
✓ D typically ranges from +.10 to +.60; any negatively discriminating item should be revised or discarded.
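One common way to compute D is to compare the top and bottom fractions of students ranked by total score (27% is a frequent choice). The sketch below is one such implementation, not the only definition in use, and the score matrix is invented:

```python
def discrimination(responses, item, fraction=0.27):
    """D = p(upper group) - p(lower group), where the groups are the top
    and bottom `fraction` of students ranked by total test score."""
    ranked = sorted(responses, key=sum, reverse=True)
    n = max(1, round(len(ranked) * fraction))
    high, low = ranked[:n], ranked[-n:]
    p_high = sum(row[item] for row in high) / n
    p_low = sum(row[item] for row in low) / n
    return p_high - p_low

# Rows = students, columns = items; 1 = correct (invented data).
responses = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
print(discrimination(responses, 0, fraction=0.5))  # 0.5
```

A D of +.50 would fall comfortably in the typical +.10 to +.60 range; a negative D would flag the item for revision.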
Item Analysis

Distractor analysis:
✓ Informally examines patterns of responses across all options.

Reliability:
✓ Calculated as the KR-21 reliability coefficient.
✓ Ranges from .00 to 1.00; the desirable range for classroom tests is from .70 to 1.00.
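KR-21 is practical for classroom tests because it needs only the number of items and each student's total score: KR-21 = (k / (k − 1)) · (1 − M(k − M) / (k · s²)), where k is the number of items, M the mean total score, and s² the variance of the total scores. A minimal sketch with invented scores:

```python
def kr21(total_scores, k):
    """KR-21 reliability: (k / (k - 1)) * (1 - M * (k - M) / (k * s2)),
    with k = number of items, M = mean total score, s2 = score variance."""
    n = len(total_scores)
    mean = sum(total_scores) / n
    var = sum((x - mean) ** 2 for x in total_scores) / n
    return (k / (k - 1)) * (1 - mean * (k - mean) / (k * var))

# Invented total scores of eight students on a 10-item test.
scores = [8, 6, 9, 5, 7, 4, 10, 3]
print(round(kr21(scores, k=10), 2))  # 0.63
```

In this made-up example the coefficient falls below the .70 target for classroom tests, which would suggest adding items or improving weak ones.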
Validity and Reliability of Objective Test Items

Validity
• You must be able to answer the following:
  ✓ Am I measuring what I intend to measure?
  ✓ To what degree do I have confidence in the decisions I will make based on those measures?
• Of primary interest is content evidence of validity.

Reliability
• Established through the use of statistical analyses, specifically the KR-21 reliability coefficient.
THANK YOU