What Is Usability in The Context of The Digital Library and How Can It Be Measured?
Judy Jeng ([email protected]) is a Ph.D. candidate at the School of Communication, Information, and Library Studies, Rutgers, The State University of New Jersey.

Digital library development, begun in the 1990s, has made significant progress thus far. Although there is still a long way to go before reaching their full potential, digital libraries are maturing (Fox 2002; Marcum 2002). However, the evaluation of digital libraries has not kept pace. As Saracevic (2000) has outlined, fundamental concepts remain to be clarified, such as: What is a digital library? What is there to evaluate? What are the criteria? How should they be applied in evaluation? Why evaluate digital libraries in the first place? Borgman (2002) has also stated that the digital library research community needs large test beds, including collections and testing mechanisms, as a means to evaluate new concepts. There is also a need for benchmarks for comparison between systems and services.

This research develops and evaluates methods and instruments for assessing the usability of digital libraries. Compared to other areas in digital library research, as Theng, Mohd-Nasir, and Thimbleby (2000a, 238) point out, "Little work is being done to understand the purpose and usability of digital libraries." Borgman et al. (2000, 229) also state, "Relatively little work has been done on evaluating the usability of digital libraries in any context." The same observations are made by Blandford, Stelmaszewska, and Bryan-Kinns (2001) as well as by Brogan (2003). Blandford and Buchanan (2002b) call for further work on methods for analyzing usability, including an understanding of how to balance rigor, appropriateness of techniques, and practical limitations.

This study contributes to the literature an understanding of usability, reviews what methods have been applied and their applicability, and proposes a suite of methods for evaluating the usability of academic digital libraries.

Digital libraries are organizations that provide the resources, including the specialized staff, to select, structure, offer intellectual access to, interpret, distribute, preserve the integrity of, and ensure the persistence over time of collections of digital works so that they are readily and economically available for use by a defined community or set of communities.

Francisco-Revilla et al. (2001) report that digital libraries are increasingly being defined as ones that collect pointers to Web-based resources rather than hold the resources themselves. A library's Web site is an example of this definition. Greenstein (2000) shares this view and says that the digital library is known less for the extent and nature of the collections it owns than for the networked information space it defines through its online services. Paepcke et al. (1996) also state that a digital library provides a single point of access to a wide range of autonomously distributed sources.

In addition, digital libraries may be seen as new forms of information institutions, multimedia information retrieval systems, or information systems that support the creation, use, and searching of digital content (Borgman 2002). Digital libraries also represent a new infrastructure and environment, created by the integration and use of computing, communications, and digital content on a global scale, that is destined to become an essential part of the information infrastructure in the twenty-first century (DELOS 2004).

In summary, digital libraries:

■ are an organized and managed collection of digital information;
■ are accessible over a network; and
■ may include services.

As Borgman (2002) states, "Digital libraries are not ends in themselves; rather, they are enabling technologies for digital asset management . . . electronic publishing, teaching and learning, and other activities. Accordingly, digital libraries need to be evaluated in the context of specific applications."
■ Dimensions of Usability
Usability is a multidimensional construct that can be examined from various perspectives. The term usability has been used broadly and means different things to different people. Some relate usability to ease of use or user-friendliness and consider it from an interface-effectiveness point of view. This view makes sense, as usability has its theoretical base in human-computer interaction, and many studies of usability focus on interface design. Kim (2002, 26), for instance, points out that "the difference between interface effectiveness and usability is not clear."

Usability can also be related to usefulness and usableness. Gluck (1997), for instance, made this distinction. Usableness refers to such questions as "Can I turn it on?" and "Can I invoke that function?" Usefulness refers to such questions as "Did it really help me?" and "Was it worth the effort?" Landauer (1995) distinguishes usability (ease of operation) from usefulness (serving an intended purpose), commenting that the two are hard to separate in the context of evaluation.

Usability has several attributes. The International Standards Organization (1994, 10) defines usability as "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use." Nielsen (1993) points out that usability has five attributes: learnability, efficiency, memorability, low error rate or easy error recovery, and satisfaction. Brinck, Gergle, and Wood (2002) share a similar perspective: usable systems are functionally correct, efficient to use, easy to learn and remember, error tolerant, and subjectively pleasing. In addition, Booth (1989) outlines four factors of usability: usefulness, effectiveness (ease of use), learnability, and attitude (likeability). Hix and Hartson (1993) classify usability into initial performance, long-term performance, learnability, retainability, advanced feature usage, first impression, and long-term user satisfaction. Hix and Hartson are unique in that they go one step further and differentiate performance and satisfaction into initial and long-term measures. The definitions given by ISO and Nielsen are the most widely cited.

Usability can also be grouped into two large categories: inherent usability (Kurosu and Kashimura 1995) and apparent usability (Kurosu and Kashimura 1995; Tractinsky 1997). Inherent usability is mainly related to the functional or dynamic part of interface usability. It includes those attributes that focus on how to make the product easy to understand, easy to learn, efficient to use, less error-prone, and pleasurable. Apparent usability, on the other hand, is more related to the visual impression of the interface. At times, inherent usability and apparent usability may be contradictory (Fu 1999). For example, in Web page design, graphics enhance apparent usability but slow down the system.

Usability has a user focus. Dumas and Redish (1993, 4) define usability to mean that "people who use the product can do so quickly and easily to accomplish their task." Clairmont, Dickstein, and Mills (1999) make the similar statement that "[u]sability is the degree to which a user can successfully learn and use a product to achieve a goal."

Usability is different from functionality. Dumas and Redish (1993) use the videocassette recorder (VCR) as an example to illustrate the difference between the two: VCRs may have high functionality (the features work as they were designed to work) but low usability (people cannot use them quickly and easily to accomplish their tasks). Usability has several aspects, including interface design, functional design, data and metadata, and computer systems and networks (Arms 2000). Usability is a property of the total system: all the components must work together smoothly to create an effective and convenient digital library.

Usability can be tackled from various directions. Blandford and Buchanan (2002a) suggest that usability is technical, cognitive, social, and design-oriented, and that it is important to bring these different perspectives together to share views, experiences, and insights. Indeed, digital library development involves interplay among people, organization, and technology. The usability issue should look at the system as a whole.

In addition to those views, usability can also be examined from the perspectives of graphic design, navigation, and content (Spool et al. 1999). Turner (2002) categorizes usability into navigation, page design, content, accessibility, media use, interactivity, and consistency.

Figure 1 compares these various perspectives on the attributes of usability.

■ Evaluation of Usability

There are a number of ways to evaluate usability. The techniques include formal usability testing; usability inspection; card sort; category membership expectation; focus groups; questionnaires; think-aloud; analysis of site usage logs; cognitive walkthrough; heuristic evaluation; claims analysis; concept-based analysis of surface and structural misfits (CASSM); and paper prototyping (Askin 1998; Blandford et al. 2004; Campbell 2001; Kantner and Rosenbaum 1997; Keith et al. 2003; Nielsen and Mack 1994; Popp 2001; Rosson and Carroll 2002; Snyder 2003). Usability testing of digital libraries has covered breadth of coverage, navigation, functionality, utility, interface, metadata appropriateness, and awareness of library resources.
The National Taiwan University Library used questionnaires to evaluate, among other things, its browsing and searching mechanisms and menu structure; formal usability testing was used to observe real users' use of the site.

Dorward, Reinke, and Recker (2002) evaluated Instructional Architect, which aims to increase the utility of NSDL resources for classroom teachers. The methods they employed included formal usability testing and focus groups. The evaluation centered on interface design and contents. It was suggested that an introductory tutorial, better graphics, and a preview screen should be incorporated.

The University of the Pacific applied the formal usability testing technique to measure students' awareness of library resources (Krueger, Ray, and Knight 2004). They recruited 134 students to perform eight tasks, including locating an article, locating a journal, finding the call number of a book, finding overdue information, finding a biography, and connecting from home. They found that 45 percent of participants were familiar with library resources and that 34 percent were regular users of library Web resources. They also found that the majority of their students knew how to search for books in the OPAC but that many floundered when asked to find similar information for journals. Another lesson the university learned was that they should have employed a smaller sample using purposeful sampling. This would have allowed them to gather more useful data by targeting small groups of students that represent demographic characteristics of interest.

Figure 2 presents a review of usability tests in academic digital libraries.
Figure 2. A review of usability tests in academic digital libraries

Digital library | Methods | Participants | Focus | Study
ACM, IEEE-CS, NCSTRL, NDLTD | formal usability test; questionnaire | 48 students (38 graduate, 10 undergraduate) | interface | Kengeri et al. (1999)
ACMDL, NCSTRL, NZDL | questionnaire; heuristic evaluation | 45 undergraduates | design and structure | Theng et al. (2000a, 2000b)
Alexandria | questionnaire; formal usability test | 23 students | interface | Thomas (1998)
CUNY+ | questionnaire | 10 students | interface | Oulanov & Pajarillo (2002)
DeLIver | transaction log; survey; interview; focus groups; formal usability test | 1,900 graduate students; 420 faculty | accessibility | Neumann & Bishop (1998); Bishop (2001)
DLESE, NSDL | focus groups | 36 teachers; 2 librarians | design | Sumner et al. (2003)
Instructional Architect | formal usability test; focus group | 26 teachers | interface, content | Dorward et al. (2002)
London Hospital | focus groups; interviews | 73 clinicians | accessibility | Adams & Blandford (2002)
MARIAN (Virginia Tech) | formal usability test; log analysis; questionnaire | students, faculty, staff | interface | France et al. (1999)
MIT | formal usability test | 29 (faculty, graduate, undergraduate, staff) | site design | Hennig (1999)
National Taiwan U. | questionnaire | 1,784 faculty and students | information architecture; browsing & searching mechanism; layout and display | Lan (2001)
NCSTRL | usability inspection | 3 usability experts | design, interface, functionality | Hartson et al. (2004)
SABIO | formal usability test; heuristic evaluation; design walk-through; card sorting | students | design | Dickstein & Mills (2000)
U. of Illinois at Chicago | formal usability test | 12 students | navigation | Augustine & Greene (2002)
U. of South Florida | formal usability test | 26 undergraduates | interface | Allen (2002)
U. of the Pacific | formal usability test | 134 students | awareness of library resources | Krueger et al. (2004)
Washington State U. | formal usability test | 12 students | navigation | Chisman et al. (1999)

Satisfaction is examined in the areas of ease of use, organization of information, labeling, visual appearance, contents, and error correction, and will be measured by Likert scales and questionnaires. Ease of use evaluates the user's perception of the ease of use of the system. Organization of information evaluates whether the system's structure, layout, and organization meet the user's satisfaction. Labeling evaluates, from the user's perception, whether the system provides clear labeling and whether the terminology used is easy to understand. Visual appearance evaluates the site's design to see whether it is visually attractive. Contents evaluates the authority and accuracy of the information provided. Error correction tests whether users can recover from mistakes easily and whether they make mistakes easily due to the system's design. Learnability measures learning effort. The learning effort takes into consideration how soon the subject begins to know how to perform tasks and how many tasks are completed correctly.

Figure 3 is a diagram illustrating this evaluation model. It is suspected that there exists an interlocking relationship among effectiveness, efficiency, and satisfaction. In addition, it will be interesting to examine how learnability interacts with effectiveness, efficiency, and satisfaction.

Figure 3. A proposed usability evaluation model: usability comprises effectiveness; efficiency; satisfaction (examined as ease of use, organization of information, labeling, visual appearance, content, and error correction); and learnability.
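The learnability measure lends itself to a simple computation. The following sketch contrasts performance on early tasks with performance on later ones; the task records, field names, and early/late split are illustrative assumptions, not the study's actual instruments.

```python
# Illustrative sketch: learnability as improvement across an ordered task
# sequence. The session data below are hypothetical, not the study's data.

from dataclasses import dataclass

@dataclass
class TaskResult:
    order: int        # position of the task in the test sequence
    correct: bool     # whether the task was completed correctly
    seconds: float    # time taken on the task

def learnability(results: list[TaskResult]) -> dict:
    """Compare early vs. late tasks: a rising correctness rate and a
    falling completion time suggest the site is learnable."""
    results = sorted(results, key=lambda r: r.order)
    half = len(results) // 2
    early, late = results[:half], results[half:]

    def accuracy(tasks):
        return sum(t.correct for t in tasks) / len(tasks)

    def mean_time(tasks):
        return sum(t.seconds for t in tasks) / len(tasks)

    return {
        "early_accuracy": accuracy(early),
        "late_accuracy": accuracy(late),
        "accuracy_gain": accuracy(late) - accuracy(early),
        "time_change": mean_time(late) - mean_time(early),  # negative = faster
    }

# Example: one participant's eight tasks (correct?, seconds)
session = [TaskResult(i, c, s) for i, (c, s) in enumerate(
    [(False, 310), (False, 250), (True, 240), (True, 200),
     (True, 150), (True, 140), (True, 120), (True, 90)])]
print(learnability(session))
```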
■ Usability Evaluation Instruments

A set of instruments is designed based on the evaluation model. University and college library Web sites are selected as an example to test the model and instruments for the purpose of this paper. The instruments include a pretest questionnaire and a set of test tasks. In addition, there is a post-test questionnaire that specifically examines satisfaction in the areas of ease of use, organization of information, clear labeling, visual appearance, contents, and error correction.
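As an illustration of how responses to such an instrument might be aggregated, the sketch below computes a mean score per satisfaction area from one participant's Likert responses. The item numbers mirror the post-test questionnaire reproduced at the end of this article; the grouping and the response values themselves are assumptions made for the example.

```python
# Illustrative sketch of scoring the post-test questionnaire. On these
# scales 1 is the favorable end. Responses below are invented; open-ended
# items (5-7, 13) would be analyzed qualitatively instead.

responses = {  # one participant's circled values, keyed by item number
    1: 2,   # ease of use
    2: 1,   # organization of information
    3: 3,   # labeling / terminology
    4: 2,   # visual appearance
    8: 4,   # error recovery
    9: 2,   # overall reaction
}

areas = {  # hypothetical mapping of scaled items to satisfaction areas
    "ease of use": [1],
    "organization of information": [2],
    "labeling": [3],
    "visual appearance": [4],
    "error correction": [8],
    "overall reaction": [9],
}

for area, items in areas.items():
    scores = [responses[i] for i in items if i in responses]
    mean = sum(scores) / len(scores)
    print(f"{area}: {mean:.1f} (1 = most satisfied, 5 = least)")
```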
■ Testing of the Model and Instruments
The earlier version of the model and instruments was tested with three students on the Rutgers University Libraries Web site. Revisions were made after the pilot study. The current version of the model and instruments is tested on the Rutgers University Libraries Web site (www.libraries.rutgers.edu) and the Queens College Library Web site (http://qcpages.qc.edu/Library). It is hoped that the model and instruments can be generalized for use in academic digital libraries.

The study employs a number of techniques, including formal usability testing, questionnaire, interview, think-aloud, and log analysis. The evaluation model and instruments in this study consider both the quantifiable elements of performance (time, accuracy rate, steps to complete tasks) and subjective criteria (satisfaction). Satisfaction is further examined in the areas of ease of use, organization of information, labeling, visual appearance, content, and error correction. The evaluation approach is empirical.
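A minimal sketch of how those quantifiable performance elements might be computed from recorded test sessions follows; the record layout and figures are invented for illustration and are not the study's data.

```python
# Illustrative sketch of the quantitative measures named above: accuracy
# rate (effectiveness) and time and steps per task (efficiency).
# The session records below are hypothetical.

sessions = [
    # (participant, task, completed_correctly, seconds, clicks)
    ("P01", 2, True, 95.0, 6),
    ("P01", 3, False, 240.0, 14),
    ("P02", 2, True, 120.0, 8),
    ("P02", 3, True, 180.0, 9),
]

def effectiveness(records):
    """Proportion of tasks completed correctly (accuracy rate)."""
    return sum(r[2] for r in records) / len(records)

def efficiency(records):
    """Mean time and mean steps over successfully completed tasks only."""
    done = [r for r in records if r[2]]
    return (sum(r[3] for r in done) / len(done),
            sum(r[4] for r in done) / len(done))

print(f"effectiveness: {effectiveness(sessions):.2f}")
mean_time, mean_clicks = efficiency(sessions)
print(f"efficiency: {mean_time:.0f} s, {mean_clicks:.1f} clicks per completed task")
```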
■ Results

While the primary interest of this study is to devise an evaluation model and a suite of instruments for evaluating the usability of academic digital libraries, the data collected in the study are used to explore the following usability issues.
The literature review indicated that there is a need for usability-testing benchmarks for comparison. For example, Theng, Mohd-Nasir, and Thimbleby (2000b) report that they had to make the assumption that if an area scores 75 percent or above for accuracy, the area is well implemented. The usability testing at MIT Libraries also reported that subjects had a 75 percent success rate (Hennig 1999). But, they wondered, is 75 percent high or low? The results of the usability testing of this study are forthcoming in the author's doctoral dissertation and will be contributed to the literature as a benchmark.
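One way to judge such a figure is to attach a confidence interval that reflects how many participants stand behind it. The sketch below uses the standard Wilson score interval for a binomial proportion; the counts are invented for illustration.

```python
# Illustrative sketch: a 75% success rate from a small sample is a loose
# estimate; the interval width makes that explicit. Counts are invented.

import math

def wilson_interval(successes: int, n: int, z: float = 1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# 18 of 24 participants completed a task: point estimate 75 percent.
low, high = wilson_interval(18, 24)
print(f"success rate 0.75, 95% CI [{low:.2f}, {high:.2f}]")
```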
In addition, this research examines user lostness and the navigation disorientation issue. User lostness has been reported by several studies, including Blandford, Stelmaszewska, and Bryan-Kinns (2001), Buttenfield (1999), Gullikson et al. (1999), Kengeri et al. (1999), and Spool et al. (1999), as well as by Theng, Mohd-Nasir, and Thimbleby (2000a). Indeed, navigation disorientation is among the biggest frustrations for Web users (Brinck, Gergle, and Wood 2002). This situation is particularly common with the increasing provision of digital library portals that provide links to various libraries from one Web site.
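Lostness can also be quantified directly from click logs. The sketch below implements one measure from the hypertext literature, Smith's (1996) lostness metric L = sqrt((N/S − 1)² + (R/N − 1)²); the page names and paths are invented, and the measure is shown only as an illustration of this kind of analysis, not as the instrument used in this study.

```python
# Illustrative sketch of a lostness measure, where S is the total number of
# pages visited (with revisits), N the number of distinct pages visited, and
# R the minimum number of pages required to complete the task.

import math

def lostness(path: list[str], optimal_length: int) -> float:
    """0 means a perfectly direct path; values near 1 indicate severe
    disorientation (backtracking and detours)."""
    S = len(path)          # all page views, including revisits
    N = len(set(path))     # distinct pages seen
    R = optimal_length     # pages on the shortest route to the target
    return math.sqrt((N / S - 1) ** 2 + (R / N - 1) ** 2)

direct = ["home", "databases", "article"]
wandering = ["home", "catalog", "home", "help", "home",
             "databases", "home", "databases", "article"]
print(f"direct path: {lostness(direct, 3):.2f}")        # 0.00
print(f"wandering path: {lostness(wandering, 3):.2f}")  # about 0.60
```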
This research also examines whether there exists an interrelated relationship among effectiveness, efficiency, and satisfaction. The results indicate this relationship. The results are reported in Jeng (2004) and will be available in the author's doctoral dissertation. Although there is an interlocking relationship among these three criteria, each has its own emphasis and should be measured separately.
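As an illustration of how such an interlocking relationship might be probed, the sketch below correlates hypothetical per-participant scores on the three criteria; the figures are invented and do not reflect the study's data.

```python
# Illustrative sketch: pairwise Pearson correlations among the three
# criteria. statistics.correlation requires Python 3.10 or later.

from statistics import correlation

effectiveness = [0.88, 0.75, 0.50, 0.63, 0.95, 0.70]  # accuracy rate
efficiency    = [140, 210, 380, 300, 120, 250]         # mean seconds per task
satisfaction  = [1.6, 2.2, 3.8, 3.1, 1.4, 2.5]         # mean rating, 1 = best

print(f"effectiveness vs. time: {correlation(effectiveness, efficiency):+.2f}")
print(f"effectiveness vs. satisfaction: {correlation(effectiveness, satisfaction):+.2f}")
print(f"time vs. satisfaction: {correlation(efficiency, satisfaction):+.2f}")
```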
■ Contribution

This paper contributes to the literature an evaluation model and a suite of instruments for evaluating the usability of academic digital libraries. It calls attention to potential usability differences due to age and culture, to the user lostness and navigation disorientation issue, and to the need for benchmarks. It discusses usability in the context of digital libraries and examines how it has been evaluated. This study will continue as doctoral dissertation research, and the results will be shared with the academic and professional communities.

Editor's Note: Ms. Jeng's article is the winner of the 2004 LITA/Endeavor Student Writing Award.

References
Hix, Deborah, and H. Rex Hartson. 1993. Developing user interfaces: Ensuring usability through product & process. New York: John Wiley.

International Standards Organization. 1994. Ergonomic requirements for office work with visual display terminals. Part 11: Guidance on usability (ISO DIS 9241-11). London: International Standards Organization.

Jeng, Judy. 2004. Usability evaluation of academic digital libraries: From the perspectives of effectiveness, efficiency, satisfaction, and learnability. In Proceedings of the 67th Annual Meeting of the American Society for Information Science and Technology, vol. 41, November 13–18, 2004, Providence, R.I., www.asis.org/Conferences/AM04/posters/180.doc.

Kantner, Laurie, and Stephanie Rosenbaum. 1997. Usability studies of www sites: Heuristic evaluation vs. laboratory testing. In Proceedings of the 15th Annual International Conference on Computer Documentation, 153–60.

Keith, Suzette, et al. 2003. An investigation into the application of claims analysis to evaluate usability of a digital library interface. Paper presented at the usability workshop of JCDL 2002, www.uclic.ucl.ac.uk/annb/DLUsability/Keith15.pdf.

Kengeri, Rekha, et al. 1999. Usability study of digital libraries: ACM, IEEE-CS, NCSTRL, NDLTD. International Journal on Digital Libraries 2: 157–69.

Kim, Kyunghye. 2002. A model of digital library information seeking process (DLISP model) as a frame for classifying usability problems. PhD diss., Rutgers Univ.

Krueger, Janice, Ron L. Ray, and Lorrie Knight. 2004. Applying Web usability techniques to assess student awareness of library Web resources. Journal of Academic Librarianship 30 (4): 285–93.

Kurosu, Masaaki, and Kaori Kashimura. 1995. Apparent usability vs. inherent usability: Experimental analysis on the determinants of the apparent usability. In Conference on Human Factors in Computing Systems, 292–93.

Lan, Su-Hua. 2001. A study of usability evaluation of information architecture of the University Library Web site: A case study of National Taiwan University Library Web site. Bulletin of the Library Association of China 67: 139–54.

Landauer, Thomas K. 1995. The trouble with computers: Usefulness, usability and productivity. Cambridge, Mass.: MIT Pr.

Lesk, Michael. 1997. Practical digital libraries: Books, bytes, and bucks. San Francisco: Morgan Kaufmann.

Marcum, Deanna B. 2002. Preface to The digital library: A biography, ed. Daniel Greenstein and Suzanne E. Thorin. Washington, D.C.: Digital Library Federation, Council on Library and Information Resources, www.clir.org/pubs/reports/pub109/pub109.pdf.

Neumann, Laura J., and Ann Peterson Bishop. 1998. From usability to use: Measuring success of testbeds in the real world, http://forseti.grainger.uiuc.edu/dlisoc/socsci_site/dpc-paper-98.html.

Nielsen, Jakob. 1993. Usability engineering. Cambridge, Mass.: Academic Pr.

Nielsen, Jakob, and Robert L. Mack, eds. 1994. Usability inspection methods. New York: Wiley.

Oulanov, Alexei, and Edmund F. Y. Pajarillo. 2002. CUNY+ Web: Usability study of the Web-based GUI version of the bibliographic database of the City University of New York (CUNY). The Electronic Library 20 (6): 481–87.

Paepcke, A., et al. 1996. Using distributed objects for digital library interoperability. Computer 29 (5): 61–68.

Popp, Mary Pagliero. 2001. Testing library Web sites: ARL libraries weigh in. Paper presented at the Association of College and Research Libraries 10th National Conference, Denver, Colo., March 15–18, www.ala.org/ala/acrl/acrlevents/popp.pdf.

Rosson, M. B., and J. M. Carroll. 2002. Usability engineering: Scenario-based development of human-computer interaction. San Francisco: Morgan Kaufmann.

Saracevic, Tefko. 2000. Digital library evaluation: Toward an evolution of concepts. Library Trends 49 (2): 350–69.

Shackel, B. 1986. Ergonomics in design for usability. In People & computers: Designing for usability. Proceedings of the second conference of the BCS HCI specialist group, ed. M. D. Harrison and A. F. Monk. Cambridge: Cambridge Univ. Pr.

Snyder, C. 2003. Paper prototyping: The fast and easy way to design and refine user interfaces. San Francisco: Morgan Kaufmann.

Spool, J. M., et al. 1999. Web site usability: A designer's guide. San Francisco: Morgan Kaufmann.

Sumner, Tamara, and Melissa Dawe. 2001. Looking at digital library usability from a reuse perspective. In Proceedings of the first ACM/IEEE-CS Joint Conference on Digital Libraries, 416–25.

Sumner, Tamara, et al. 2003. Understanding educator perceptions of "quality" in digital libraries. In Proceedings of the third ACM/IEEE-CS Joint Conference on Digital Libraries, 269–79.

Theng, Yin Leng, Norliza Mohd-Nasir, and Harold Thimbleby. 2000a. Purpose and usability of digital libraries. In Proceedings of the fifth ACM Conference on Digital Libraries, 238–39.

Theng, Yin Leng, Norliza Mohd-Nasir, and Harold Thimbleby. 2000b. A usability tool for Web evaluation applied to digital library design. Poster presented at the WWW9 Conference, Amsterdam, http://citeseer.nj.nec.com/theng00usability.html.

Thomas, Rita Leigh. 1998. Elements of performance and satisfaction as indicators of the usability of digital spatial interfaces for information-seeking: Implications for ISLA. PhD diss., Univ. of Southern California.

Tractinsky, Noam. 1997. Aesthetics and apparent usability: Empirically assessing cultural and methodological issues. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 115–22.

Turner, Steven. 2002. The HEP test for grading Web site usability. Computers in Libraries 22 (10): 37–39.

Vohringer-Kuhnt, Thomas. 2003. The influence of culture on usability, http://userpage.fu-berlin.de/~kuhnt/thesis/results.pdf.
Pretest Questionnaire

Thank you very much for agreeing to participate in this experiment. All of your personal data that we collect will be entirely confidential, viewed only by the experimenter, and shared only as part of group results. But first, we would like to gather a bit of background information about you, so that we will be better able to interpret your use of and reactions to the system.

Participant # ______

Date: _________________

Gender: ___ Male ___ Female

Age: _____

What is your current status:
_____ Undergraduate _____ Master's Student _____ Doctoral Student _____ Faculty

Major/Department: _______________________________

How many years have you been at Rutgers or Queens? _______

If you are from a foreign country, how long have you been in the U.S.? ________ years

Your original nationality: _________________

Ethnic group: ___ White ___ African American ___ Asian ___ Hispanic ___ Native American ___ Other: ___________

How often do you use the Library's Web site:
_____ Never used it
_____ Once or twice a semester
_____ Once or twice a month
_____ Once or twice a week
_____ Daily
Test Tasks

2. Does the library currently subscribe to a paper copy of Advertising Age?
1 2 3 4 5
Easy to use    Difficult to use
Your comment: __________________________________________________

3. Use a database to find an article about nursing homes and mental illness.
1 2 3 4 5
Easy to use    Difficult to use
Your comment: __________________________________________________

6. Find an encyclopedia article about French wine.
1 2 3 4 5
Easy to use    Difficult to use
Your comment: __________________________________________________

7. Find an e-book called "The story of mankind."
1 2 3 4 5
Easy to use    Difficult to use
Your comment: __________________________________________________

8. Can alumni enjoy interlibrary loan service?
1 2 3 4 5
Easy    Difficult
Your comment: __________________________________________________

9. Find instructions on how to set up your home computer to have remote access to the library's electronic resources.
1 2 3 4 5
Easy to find    Difficult to find
Your comment: __________________________________________________
Post-Test Questionnaire

Thanks again for participating in this experiment. This questionnaire gives you an opportunity to tell us your reactions to the system you used. Please circle a number on the scale to indicate your reactions. Please write comments to elaborate on your answers. I will go over your answers with you to make sure that I understand all of your responses. Thank you.

1. Please rate the ease of use of the Web site.
1 2 3 4 5
Easy    Difficult
Your comment: __________________________________________________

2. What do you think about the organization of information on the site?
1 2 3 4 5
Clear    Unclear
Your comment: __________________________________________________

3. What do you think about the terminology used in the site? Are categories clearly labeled?
1 2 3 4 5
Clear    Unclear
Your comment: __________________________________________________

4. Is the site visually attractive?
1 2 3 4 5
Attractive    Unattractive
Your comment: __________________________________________________

5. What is the best feature(s) of the site? __________________________________________________

6. What is the worst feature(s) of the site? __________________________________________________

7. What new content or features would you like to see on the site? __________________________________________________

8. Can you recover from mistakes easily?
1 2 3 4 5
Easy    Difficult
Your comment: __________________________________________________

9. Your overall reaction to the system:
1 2 3 4 5
Satisfied    Unsatisfied
Your comment: __________________________________________________

10. Do you feel lost while using the site?
_____ Yes _____ No
Your comment: __________________________________________________

11. Is the site easy to navigate?
_____ Yes _____ No
Your comment: __________________________________________________

12. When you click a button on the Web page, do you expect that the click will lead you to the correct answer?
_____ Yes _____ No
Your comment: __________________________________________________

13. Do you have any other comments about the Web site?
__________________________________________________
__________________________________________________