
TAP User’s Guide

Prepared by Marsha Lewis


Contents

Section 1 - Introduction and features

Section 2 - Installation and setup

Section 3 - Entering test data

Section 4 - Using random score generator

Section 5 - Saving data files

Section 6 - Analyzing tests with TAP

   1. Quick examinee results and item analysis
   2. Full results
   3. Spearman-Brown Formula for reliability
   4. Grade reports
   5. Options for calculating and viewing data

Glossary

References for formulas and terms

User agreement



1 Introduction and Features
The Test Analysis Program (TAP) is designed as a powerful, easy-to-use (and free!) test
analysis package for use by K-12 through graduate level instructors. The software’s
features also make it an excellent teaching tool for both undergraduate and graduate
level test design courses. As a test analysis package, TAP lets the user import test
score data from text files or enter test data directly into the program. For teaching purposes,
TAP generates sample sets of test data using a random seed. The user can set various
parameters for the sample data, including the difficulty of the test, number of scores,
number of test items and number of possible answers per item.

TAP output provides:

Examinee Analysis, including percentage correct, letter grade and confidence intervals
for each student and aggregate descriptive statistics for the group. TAP also generates
an instructor’s list of scores and wrong answers by student, as well as a report for each
student indicating his or her score, responses to each item and the correct answers to
items missed.

Item and Test Analysis, including item difficulty, point biserial, discrimination index and
various statistics if item deleted (KR20, scale mean and standard deviation, etc.). TAP
also calculates the number of additional items needed to increase reliability.

Options Analysis, including high group and low group item difficulty for correct
answer and distracters.

2 Installation and Setup


TAP operates in any Windows environment. To install TAP:

1. Insert the CD-ROM into the proper drive.
2. View the contents of the CD-ROM using Windows Explorer.
3. Click on the TAP icon. The program will launch and you are ready to begin.



3 Entering Test Data

TAP provides a simple method of entering test data for analysis. Scores can be
entered directly into TAP or imported as text files from another program. Users can
also choose to enter information on cognitive levels and content domain into the
“Table of Specifications” feature.

Entering test data directly into TAP

1. Under INPUT, select Enter New Data and click on Go To Data Editor.

Figure 3.1. Procedure for accessing test score input screen.

2. In the Data Editor screen, enter any descriptive information in the Title and
Comments sections (see Figure 3.2).

3. Input the number of examinees, number of items, missing data symbol and ID
label (student name) length in the appropriate fields. The data entry screen
will automatically adjust to the chosen parameters.

4. In the Answer Key field, enter the numbers corresponding to the correct
answers as a string with no delimiters (see the scoring sketch after Figure 3.2).

5. In the #Options field, enter the number of options corresponding to each question.

6. The Item Included field allows the user to eliminate items from the analysis or
set alternative correct answers. For instance, if the teacher realizes that an
item was incorrectly worded, he or she can enter an “N” under that item in the
Item Included field. This will remove the item from the test analysis and will
calculate student scores without the item. This feature also allows the user to
enter alternative answers for items. To set alternative correct answers:

a) Flag that item by placing an A in the Item Included field in the
Data Editor screen.
b) Under “Options” in the Data Editor screen, choose Set
Alternative Correct Answers. Enter the alternatives.

7. In the Data screen, enter the student identification information and scores.
Align the score data with the guide above the Data screen.

8. When all information is entered, click on either Save File or Close and Analyze
at the bottom of the Data Entry screen. The Save File button will allow you to
choose the location to save the entered data as a .tap file for future data entry
or analysis. The Close and Analyze button will allow you to immediately view
the test analysis data (see Section 6 for details on analysis).

Figure 3.2. TAP Data Editor screen.
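To make the undelimited Answer Key and response format concrete, the short sketch below shows how such strings could be scored outside of TAP. It is illustrative only; the function name, missing-data symbol and sample strings are assumptions, not part of TAP.

# Illustrative scoring of undelimited response strings against an answer key.
# Not TAP code; the names and sample data are made up for this example.
def score_examinee(responses, key, missing="9"):
    # One character per item; items marked with the missing symbol count as wrong.
    return sum(1 for r, k in zip(responses, key) if r != missing and r == k)

key = "3421"        # correct answers, one character per item
student = "3411"    # one student's responses in the same undelimited format
print(score_examinee(student, key))   # -> 3 of the 4 items correct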

Importing test data



Note: Under the Data menu in the Data Editor, TAP also provides the option of
importing data from an existing text (.txt) file.

Entering a table of specifications

A table of specifications allows instructors to develop test questions based on: 1)
the content domain covered, and 2) the cognitive skill level being tested. There are
several methods for defining cognitive skill level. Bloom’s Taxonomy is one
common framework, classifying objectives into six levels (knowledge,
comprehension, application, analysis, synthesis, evaluation).

Ideally, instructors want to mix the cognitive skill levels they are testing to ensure
that they are not just asking questions that require memorization of facts
(knowledge-level). Also, if more than one content domain is being covered on a
test, instructors want to ensure that the questions reflect all the content. A table of
specifications helps ensure that the instructor is testing across the appropriate
content areas and at the appropriate cognitive levels.

Table of Specifications for a 30-question General Business Test

                                       Cognitive Level
Content Domain                 knowledge   comprehension   application
Factors of production              6             4              2
Market/command economies           3             4              1
Measuring economic activity        2             5              4

In the Data Editor window the user can enter the table of specifications
information by choosing that option from the Options menu. Figure 3.3 illustrates the
Table of Specifications entry screen.



Figure 3.3. Table of Specifications entry screen.



4 Using Random Score Generator
For users who are demonstrating the use of the software to colleagues or teaching test
or item analysis, TAP generates sample sets of test score data using a random seed.
The user can modify parameters such as item difficulty and number of items.

From the first TAP screen under INPUT, select Generate Sample Data, then click on
Go To Data Editor.

The Data Generation Information options window appears (Figure 4.1). The user can
choose the difficulty level, number of cases (examinees), number of test items, and the
number of answer options per item. The seed for random data generation can also
be changed. Once the desired parameters are set, click Generate to create the sample
data set.

Figure 4.1. Data Generation Information window.

Generated scores will appear in the Data Editor. Analyze the data by clicking Close and
Analyze in the Data Editor window, or save it for future analysis by clicking the Save
button in the Data Editor window.
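As a rough sketch of what such a generator does (this is not TAP's actual algorithm; the probability model, names and values below are assumptions for illustration), sample responses can be simulated by letting each examinee answer each item correctly with a probability tied to the chosen difficulty and spreading wrong answers over the remaining options:

import random

# Sketch of sample-data generation: NOT TAP's internal algorithm.
def generate_sample(n_examinees, n_items, n_options=4, p_correct=0.6, seed=1):
    random.seed(seed)                      # the seed setting in Figure 4.1
    key = [random.randint(1, n_options) for _ in range(n_items)]
    data = []
    for _ in range(n_examinees):
        row = []
        for item in range(n_items):
            if random.random() < p_correct:           # answers correctly
                row.append(key[item])
            else:                                     # picks a wrong option
                wrong = [o for o in range(1, n_options + 1) if o != key[item]]
                row.append(random.choice(wrong))
        data.append(row)
    return key, data

key, data = generate_sample(n_examinees=23, n_items=30)

Lower values of p_correct produce a harder sample test, mirroring the difficulty setting in the Data Generation Information window.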



5 Saving Data Files
Test data entered by the user or created by TAP’s random data generator can
be saved as TAP files for archival purposes, future modification or analysis.

Data files can be saved in TAP’s data editor window.

1. Choose Save TAP file under the File menu.

2. Select the location and save the TAP file.

3. To open the file later, choose Open TAP file under the File menu in the Data Editor
screen.



6 Analyzing Tests with TAP

Once you have a set of test scores in the Data Editor, either by entering your own test
data (see Section 3 for the procedure) or generating sample scores for teaching purposes
(see Section 4 for the procedure), you can generate the analysis by clicking Analyze (F9).
Quick examinee results and quick item analysis appear. The quick results provide
summary information for the examinees as a group and the test as a whole.

To retrieve the full analysis click on the View Full Results box. The full results window
will open. This window contains several pages of analysis on examinees, individual
items, and the test as a whole.

TITLE: Sample Data Set #1
COMMENT:
***********************************************************************
Examinee Analysis
***********************************************************************
Ltr ~68% C.I. ~68% C.I. ~68% C.I.
Examinee Score Percent Grade (Raw Score) (Percents) (Letters)
-------------------- ----- ------- ----- ------------ ------------ ---------
Isaac Irt 23 48.94% F (20.3- 25.7) (43.3- 54.6) (F , D )
Valeria Validity 26 55.32% D (23.3- 28.7) (49.6- 61.0) (F , D )
Nancy Nominal 26 55.32% D (23.3- 28.7) (49.6- 61.0) (F , D )
James Judges 27 57.45% D (24.3- 29.7) (51.8- 63.1) (F , C )
Oscar Option 27 57.45% D (24.3- 29.7) (51.8- 63.1) (F , C )
Susan S.Brown 28 59.57% D (25.3- 30.7) (53.9- 65.2) (D , C )
Figure 6-1. View of first section of results from View Full Results command.

Figure 6-1 shows a portion of the first section of analysis, which provides score
information for each examinee.

To continue viewing the full analysis, use the down arrow in the lower right portion of
the TAP display window to scroll through the Full Results section. The examinee
analysis subsection contains additional information on scores, including a bar graph
and a stem-and-leaf display (Figure 6-2).

=====================
Stem-and-Leaf Display
=====================

Stem Leaves (width=10)
---- -----------------
2 . 3
2 . 6667789
3 . 122334444
3 . 58999
4 . 0
Figure 6.2. Stem and leaf display under Examinee Analysis
subsection of Full Results.

The stem and leaf display is one method of checking test scores for normality (when
rotated to the left 90° the curve should approximate a normal one). The plot also
provides a succinct look at each score. In a typical stem and leaf plot for a test score
distribution, the tens place of each score is the stem and the units place of each score
is the leaf. In Figure 6-2, for example, one person scored 23 and three people
scored 26.
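The display itself is easy to reproduce from a list of raw scores; the sketch below is illustrative only. (TAP splits each stem into a low half with leaves 0-4 and a high half with leaves 5-9, as in Figure 6-2, while this simplified version does not.)

from collections import defaultdict

# Build a basic stem-and-leaf display: tens digit = stem, units digit = leaf.
def stem_and_leaf(scores):
    stems = defaultdict(list)
    for s in sorted(scores):
        stems[s // 10].append(s % 10)
    for stem in sorted(stems):
        print(f"{stem:>2} . " + "".join(str(leaf) for leaf in stems[stem]))

stem_and_leaf([23, 26, 26, 26, 27, 27, 28, 29, 31, 32, 40])
# prints:
#  2 . 36667789
#  3 . 12
#  4 . 0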

The Full Results section provides detailed information on individual items, including
item difficulty, point biserial and discrimination index (Figure 6.3). The section also
provides various statistics if item deleted (KR20, scale mean and standard deviation,
etc.). The item information is helpful in determining how well an item functioned and
can be used to revise or discard bad test items, while identifying good test items that
can be used again.

***********************************************************************
Item and Test Analysis
***********************************************************************

        Number  Item   Disc.  # Correct    # Correct    Point     Adjusted
Item    Correct Diff.  Index  in High Grp  in Low Grp   Biserial  Pt. Bis.
------- ------- ------ ------ ----------- ----------- -------- --------
Item 01 23 1.000 0.000 10 (1.00) 6 (1.00) ***** *****
Item 02 9 0.391 0.500 5 (0.50) 0 (0.00) 0.402 0.312
Item 03 10 0.435 0.500 5 (0.50) 0 (0.00) 0.393 0.302
Item 04 20 0.870 0.233 9 (0.90) 4 (0.67) 0.251 0.184
Item 05 22 0.957 0.167 10 (1.00) 5 (0.83) 0.402 0.366
Figure 6.3. Example of item and test analysis information provided in Full Results.
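For readers who want to see how these statistics are defined, the sketch below computes item difficulty and the point biserial correlation from a 0/1 item-score matrix. It uses the standard textbook formulas, not TAP's source code, and the small data set is invented for illustration.

import math

# scores[e][i] = 1 if examinee e answered item i correctly, else 0 (illustrative data).
def item_difficulty(scores, i):
    col = [row[i] for row in scores]
    return sum(col) / len(col)                 # proportion answering correctly

def point_biserial(scores, i):
    col = [row[i] for row in scores]
    totals = [sum(row) for row in scores]
    n = len(col)
    mean_t = sum(totals) / n
    sd_t = math.sqrt(sum((t - mean_t) ** 2 for t in totals) / n)
    p = sum(col) / n
    q = 1 - p
    mean_correct = sum(t for t, c in zip(totals, col) if c) / sum(col)
    return (mean_correct - mean_t) / sd_t * math.sqrt(p / q)

scores = [[1, 1, 0], [1, 0, 1], [1, 1, 1], [0, 0, 0]]
print(item_difficulty(scores, 0), round(point_biserial(scores, 0), 3))   # -> 0.75 0.927

The Adjusted Pt. Bis. column in Figure 6.3 is presumably the same idea with the item excluded from each examinee's total before correlating, the usual correction for short tests.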

Item discrimination index


One example of a useful item analysis statistic is the item discrimination index. This
index indicates how well each item discriminated between the more knowledgeable
and less knowledgeable students. The item discrimination index is calculated by
subtracting the item difficulty of the low group from the item difficulty of the high
group.



The default item discrimination index calculation in TAP uses the top 27% of scores
and the bottom 27% of scores as the high and low groups. The user can change this
by choosing Set Percentages for Item Discrimination under the Options menu.
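A minimal sketch of that calculation, assuming 0/1 item scores and a simplistic handling of ties at the cutoff (TAP's exact grouping rules are not shown here):

# Discrimination index sketch: difficulty in the top 27% minus difficulty
# in the bottom 27% of total scores. Tie handling here is simplistic.
def discrimination_index(scores, i, pct=0.27):
    ranked = sorted(scores, key=sum, reverse=True)     # highest totals first
    k = max(1, int(round(len(ranked) * pct)))
    high, low = ranked[:k], ranked[-k:]
    p_high = sum(row[i] for row in high) / k
    p_low = sum(row[i] for row in low) / k
    return p_high - p_low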

The Full Results section provides individual item information, including high group and
low group item difficulty for correct answer and distracters (Figure 6.4). The correct
answer should have a positive difference (more of the high group chose the answer
than the low group). The incorrect answers (distracters) should have negative
differences (more of the low group chose them than the high group). If no one chose
a particular distracter, the teacher can examine that distracter to see if it could be
improved.

***********************************************************************
Options Analysis
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Item Frequencies and Percentages -- page 1
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
* is keyed answer, # is option that discriminates better than keyed answer

Item Group      Option 1    Option 2    Option 3    Option 4
---- ---------- ----------- ----------- ----------- -----------
1 TOTAL 1 (0.043) 0 (0.000) 22*(0.957) 0 (0.000)
High Group 0 (0.000) 0 (0.000) 10 (1.000) 0 (0.000)
Low Group 1 (0.167) 0 (0.000) 5 (0.833) 0 (0.000)
Difference -1(-0.167) 0 (0.000) 5 (0.167) 0 (0.000)

2 TOTAL 3 (0.130) 5 (0.217) 6 (0.261) 9*(0.391)
High Group 1 (0.100) 1 (0.100) 3 (0.300) 5 (0.500)
Low Group 2 (0.333) 3 (0.500) 1 (0.167) 0 (0.000)
Difference -1(-0.233) -2(-0.400) 2 (0.133) 5 (0.500)

Figure 6-4. Example of Options Analysis information provided in Full Results.

If Table of Specifications information was entered (see Section 3 for details on
entering this information), the analysis includes the average item difficulty by
content area and cognitive skill level. Figure 6-5 provides an example of the
Table of Specifications analysis.

***********************************************************************
Table of Specifications Analysis
***********************************************************************
======================
CONTENT AREA ANALYSIS:
======================
Average
Content Area Difficulty Items in Content Area
-------------------- ---------- ------------------------------
Factors of productio 0.691 1,8,10,11,15,21,22,23,24,25,26
Market/command econo 0.750 2,3,7,9,12,13,18,27
Measuring economic a 0.691 4,5,6,14,16,17,19,20,28,29,30

===============================
COGNITIVE SKILL LEVEL ANALYSIS:
===============================
Average
Cognitive Level Difficulty Items in Cognitive Level
-------------------- ---------- -----------------------------
knowledge 0.620 1,2,3,9,10,11,19,20,23,26
comprehension 0.738 4,5,6,7,8,12,13,14,15,16,24,25,27
application 0.771 17,18,21,22,28,29,30

========================================
CONTENT AND COGNITIVE COMBINED ANALYSIS:
========================================
Avg.
Content Area Cognitive Level Diff. Items
-------------------- -------------------- ----- --------------------
Factors of productio knowledge 0.600 1,10,11,23,26
Factors of productio comprehension 0.650 8,15,24,25
Factors of productio application 1.000 21,22
Market/command econo knowledge 0.733 2,3,9
Market/command econo comprehension 0.700 7,12,13,27
Market/command econo application 1.000 18
Measuring economic a knowledge 0.500 19,20
Measuring economic a comprehension 0.840 4,5,6,14,16
Measuring economic a application 0.600 17,28,29,30
Figure 6-5. Example of Table of Specifications Analysis
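The content-area averages above are simply the mean item difficulty over the items assigned to each area. A brief sketch of that bookkeeping, with invented item numbers and difficulties rather than the values in Figure 6-5:

# Average item difficulty by content area (illustrative data, not from Figure 6-5).
difficulties = {2: 0.80, 3: 0.70, 7: 0.75, 9: 0.75}          # item number -> difficulty
content_areas = {"Market/command economies": [2, 3, 7, 9]}   # area -> item numbers

for area, items in content_areas.items():
    average = sum(difficulties[i] for i in items) / len(items)
    print(f"{area:<30} {average:.3f}")    # -> Market/command economies       0.750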

TAP provides information on test length and reliability using the Spearman-Brown
Formula (see Glossary). The information is provided as part of the Full Results. To
calculate test length for various reliability coefficients, click on Spearman-Brown
Prophecy under the Analysis menu.
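The prophecy formula itself is standard; the sketch below (not TAP's code) shows both directions of the calculation, the reliability expected if the test is lengthened and the length factor needed to reach a target reliability:

# Spearman-Brown prophecy formula (standard textbook form, not TAP's code).
def predicted_reliability(r, k):
    """Reliability of a test made k times longer than the current one."""
    return k * r / (1 + (k - 1) * r)

def length_factor(r, target):
    """How many times longer the test must be to reach the target reliability."""
    return target * (1 - r) / (r * (1 - target))

r = 0.65                                  # current KR20 reliability (illustrative)
print(predicted_reliability(r, 2))        # doubling the test: about 0.79
print(length_factor(r, 0.80))             # need about 2.15 times as many items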

Additional Options

Options for calculating and presenting information can be set under the Options
menu.



Figure 6.6. Options Menu.

Under Set Percentages for Grades, the user can choose the percentages that
correspond to letter grades and then save the scale as the default grading
scale.

Under Use Letter Grades in Confidence Bands, TAP will indicate the range of letter
grades corresponding to the confidence interval around the score.

TITLE: Sample Data Set #1
COMMENT:
***********************************************************************
Examinee Analysis
***********************************************************************
Ltr ~68% C.I. ~68% C.I. ~68% C.I.
Examinee Score Percent Grade (Raw Score) (Percents) (Letters)
-------------------- ----- ------- ----- ------------ ------------ ---------
Isaac Irt 23 48.94% F (20.3- 25.7) (43.3- 54.6) (F , D )
Valeria Validity 26 55.32% D (23.3- 28.7) (49.6- 61.0) (F , D )
Nancy Nominal 26 55.32% D (23.3- 28.7) (49.6- 61.0) (F , D )
James Judges 27 57.45% D (24.3- 29.7) (51.8- 63.1) (F , C )
Oscar Option 27 57.45% D (24.3- 29.7) (51.8- 63.1) (F , C )
Susan S.Brown 28 59.57% D (25.3- 30.7) (53.9- 65.2) (D , C )
Figure 6-7. Example of Analysis when the Use Letter Grades in Confidence
Bands option is selected.



Saving Analysis

TAP provides options for saving the results of the analysis.

1. Under OUTPUT, type a file name in the white Results File Name box (the file
name will automatically appear in the TAP File Name box, assigning the same
file name to the .tap data file).

2. Click the Save Full Results box, and then choose the location to save the text file.

Saving the Results File creates a text (.txt) file containing the Full Results. This text
file can be opened in a word processing program such as Microsoft Word, WordPad
or Notepad. Note: For best results, set the font to Courier or Courier New (10 point)
and turn on the “word wrap” feature.

Figure 6.8. Saving Analysis.

3. To save the Quick Examinee Results and/or Quick Item Analysis as separate
files for quick reference, check these boxes before clicking on Save Full Results.
These files will be saved with the same name and .EXM and .ITM extensions,
and can also be opened in a word processing program.

4. In addition to the information provided in Full Results, TAP provides the option
of retrieving detailed test information on individual students. From the TAP File
menu, click on Save Grade Reports. Choose the location to save the data.

Social Studies Test 2
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Individual Grade Report for: Susan Spearman-Brown
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Total Number of Items Correct = 20
Percentage Correct = 66.67%

************************************************************

Item   Student  Correct
Number Response Response
------ -------- --------
1 1
2 1 2
3 2
4 2
5 1
Figure 6.9. Example of output from Save Grade Reports option under
File menu.

To view and print grade reports, open the Grade Report file in a
word processing program such as Microsoft Word, WordPad or Notepad. Note: For
best results, set the font to Courier or Courier New (10 point) and turn on the word
wrap feature.



Glossary
Confidence Interval An interval that consists of those values of a statistic that
have a given probability of including the population value. In terms of test scores, when
confidence intervals are constructed for a number of examinees, a specified percentage (for
instance, 95%) of these intervals are expected to contain the examinees’ actual scores.

Item Difficulty The proportion of examinees answering the item correctly (the lower the
proportion, the more difficult the item).

Item Discrimination Index A number (ranging from 1.00 to –1.00) representing the ability of
an item to distinguish between more and less knowledgeable students. The index is commonly
calculated as the difference between the proportion of students in the high group (e.g., top
quartile of scores) who answered the item correctly and the proportion in the low group (e.g.,
bottom quartile of scores) who did. An item discriminates if a larger proportion of students
from the upper group correctly answered the item than in the lower group.

KR20 (Kuder-Richardson Formula 20) A method of calculating the internal consistency
reliability of a test by dividing the test into components and calculating a reliability coefficient.

KR21 (Kuder-Richardson Formula 21) A method of calculating the internal consistency
reliability of a test using the mean and standard deviation of the scores.
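Written out, the two formulas look as follows. This is a sketch using the standard textbook forms and an assumed 0/1 score matrix; it is not TAP's implementation.

# Standard textbook forms of KR20 and KR21 (illustrative, not TAP's code).
def kr20(scores):                          # scores[e][i] is 1 if correct, else 0
    k = len(scores[0])                     # number of items
    totals = [sum(row) for row in scores]
    n = len(totals)
    mean_total = sum(totals) / n
    var_total = sum((t - mean_total) ** 2 for t in totals) / n
    sum_pq = sum(p * (1 - p) for p in
                 (sum(row[i] for row in scores) / n for i in range(k)))
    return (k / (k - 1)) * (1 - sum_pq / var_total)

def kr21(k, mean, sd):                     # needs only test length, mean and SD
    return (k / (k - 1)) * (1 - mean * (k - mean) / (k * sd ** 2))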

Kurtosis The relative steepness or flatness of a distribution—one measure of normality. A
relatively flat distribution is described as platykurtic; a relatively steep distribution is described
as leptokurtic. In general, kurtosis between –1 and 1 is considered normal (but only one
measure of normality).

Mean Score The arithmetic average of actual scores on a test.

Median Score The score below which 50 percent of the scores on a test fall.

Point Biserial Correlation A correlation, computed across examinees, between whether an
examinee answered a particular item correctly and the examinee’s number of correct answers
on the entire test.

Reliability The degree to which a test measures something consistently (calculated in this
program using the Kuder-Richardson Formula KR20). In general, for teacher-made tests that
take approximately 50 minutes to complete, the KR20 reliability should be between .60 and
.80. Shorter tests will likely have lower reliabilities. If the KR20 reliability is below .50, the test
does not adequately differentiate the performance of a student from the average performance
of the class as a whole.

Skewness A measure of symmetry of a distribution, used as one measure of normality. A
positively skewed distribution has many low scores and relatively few high scores; a negatively
skewed distribution has many high scores and relatively few low scores. In general, a
skewness between –1 and 1 is considered normal (but only one measure of normality).



Spearman-Brown Formula A method of estimating the effects that changes in test length will
have on the reliability of the test.

Standard Deviation An estimate of how far a set of scores deviates from the mean.

Standard Error of Measurement The standard deviation of the distribution of observed
scores for any individual.
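The ~68% confidence intervals in the Examinee Analysis output follow from the standard error of measurement. The sketch below assumes the conventional score ± 1 SEM construction and uses invented values for the standard deviation and reliability; it is an illustration, not TAP's code.

import math

# SEM = SD * sqrt(1 - reliability); a ~68% band is roughly score +/- 1 SEM.
# The SD and reliability values here are invented for illustration.
def sem(sd, reliability):
    return sd * math.sqrt(1 - reliability)

def band_68(score, sd, reliability):
    e = sem(sd, reliability)
    return (score - e, score + e)

print(band_68(score=26, sd=4.5, reliability=0.64))   # roughly (23.3, 28.7)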

Stem and Leaf Display A bar plot that provides a visual picture of the distribution as well as
much or all of the actual data. In a typical stem and leaf plot for a test score distribution, the
tens place of each score is the stem and the units place of each score is the leaf.



References for formulas and terms

Allen, M.J., & Yen, W.M. (1979). Introduction to measurement theory.
Prospect Heights, IL: Waveland Press.

Crocker, L., & Algina, J. (1986). Introduction to classical and modern test
theory. Orlando, FL: Harcourt Brace Jovanovich.

Harris, M. B. (1998). Basic statistics for behavioral science research.
Needham Heights, MA: Allyn & Bacon.

Oosterhof, A. (2001). Classroom applications of educational
measurement. Upper Saddle River, NJ: Merrill Prentice Hall.



User Agreement
Carefully read the following User Agreement (License, Terms of Use, and Disclaimer of Warranty). Use of the TAP
software program provided with this Agreement constitutes acceptance of these terms and conditions of use. If you
do not agree to the terms of this agreement, do not use the
TAP software program.

LICENSE
TAP is a copyrighted program and is NOT public domain. The user is granted license, not ownership, to use the
TAP software program on any computer subject to the restrictions described in the User Agreement and Disclaimer.

TAP is Freeware. The user is licensed to make an unlimited number of exact copies of the TAP software program,
to give these exact copies to any other person for their personal use,
and to distribute the TAP software program in its unmodified form only via disk, email, or local area network. If
these methods of distribution are unavailable, any person wanting to use the
TAP software program should be directed either to contact the author or to visit the author's Internet web site (the
URL is provided below and may be posted on any web site).

If you find the program useful, if you copy it for others, if you find problems or bugs in the program, or if you use
the program for teaching, educational, or consulting purposes, you are requested to inform the authors:

Author: Gordon P. Brooks, Ph.D.
Address: McCracken Hall, Ohio University, Athens, OH 45701
Telephone: 740-593-0880
Fax: 740-593-0477
Email: [email protected]
Web Page: http://oak.cats.ohiou.edu/~brooksg/tap.htm

TERMS OF USE
The TAP software program may be used and copied for personal use subject to the following
license restrictions:

§ the TAP software program shall be copied and supplied in its original, unmodified form;
§ the TAP software program shall not be sold or used for profit, nor may any amount or fee be charged for
use, rental, lease, or distribution of the program;
§ the TAP software program shall not be included or bundled with any other goods or services;
§ the TAP software program may not be decompiled, disassembled, or otherwise modified.

Any such unauthorized use without expressed, written permission granted by the author shall
result in immediate and automatic termination of this license.

DISCLAIMER OF WARRANTY
Every effort has been made to ensure the accuracy of the TAP program, the algorithms and subroutines used, and
the results produced by the TAP software program, both on screen and printed.

However, no warranty is expressed or implied concerning the function or fitness of the TAP software program,
subroutines, or results provided by the program. That is, the TAP software program is provided on an "as is" basis
without warranty of any kind. The author shall have neither liability nor responsibility to any person or entity with
respect to any liability, loss, or damage directly or indirectly arising from the use of or inability to use the TAP
software program or the results of the analyses provided by the TAP software program, even if the author has been
advised of the possibility of such damages or claims. In no event shall any liability exceed the license fee paid to
the author of the TAP software program. In the event of invalidity of any provision of this license, the user agrees
that such invalidity shall not affect the validity of the remaining portions of this license.

All rights not expressly granted here are reserved to the author of the TAP software program.

User’s Guide prepared by Marsha Lewis.
