
What Is Usability in the Context of the Digital Library and How Can It Be Measured?

Judy Jeng

This paper reviews how usability has been defined in the context of the digital library and what methods have been applied and their applicability, and proposes an evaluation model and a suite of instruments for evaluating usability for academic digital libraries. The model examines effectiveness, efficiency, satisfaction, and learnability. It is found that there exists an interlocking relationship among effectiveness, efficiency, and satisfaction. The paper also examines how learnability interacts with these three attributes.

Digital library development, since its inception in the 1990s, has made significant progress thus far. Although there is still a long way to go before digital libraries reach their full potential, they are maturing (Fox 2002; Marcum 2002). However, the evaluation of digital libraries has not kept pace. As Saracevic (2000) has outlined, fundamental concepts remain to be clarified: What is a digital library? What is there to evaluate? What are the criteria? How should they be applied in evaluation? Why evaluate digital libraries in the first place? Borgman (2002) has also stated that the digital library research community needs large test beds, including collections and testing mechanisms, as a means to evaluate new concepts. There is also a need for benchmarks for comparison between systems and services.

This research is to develop and evaluate methods and instruments for assessing the usability of digital libraries. Compared to other areas in digital library research, as Theng, Mohd-Nasir, and Thimbleby (2000a, 238) point out, "Little work is being done to understand the purpose and usability of digital libraries." Borgman et al. (2000, 229) also state, "Relatively little work has been done on evaluating the usability of digital libraries in any context." The same observations are made by Blandford, Stelmaszewska, and Bryan-Kinns (2001) as well as by Brogan (2003). Blandford and Buchanan (2002b) call for further work on methods for analyzing usability, including an understanding of how to balance rigor, appropriateness of techniques, and practical limitations.

This study contributes to the literature an understanding of usability, reviews what methods have been applied and their applicability, and proposes a suite of methods for evaluating usability for academic digital libraries.

■ Definition of Digital Library

There are many different views in the literature on what digital libraries are. This paper does not intend to provide a comprehensive collection of the definitions of digital libraries, but rather representative ones.

Lesk (1997, 1) views digital libraries as "organized collections of digital information." Arms (2000, 2) views digital libraries as a "managed collection of information, with associated services, where the information is stored in digital formats and accessible over a network."

The Digital Library Federation (1999), representing the practical community, defines digital libraries as follows:

Digital libraries are organizations that provide the resources, including the specialized staff, to select, structure, offer intellectual access to, interpret, distribute, preserve the integrity of, and ensure the persistence over time of collections of digital works so that they are readily and economically available for use by a defined community or set of communities.

Francisco-Revilla et al. (2001) report that digital libraries are increasingly being defined as ones that collect pointers to Web-based resources rather than hold the resources themselves. A library's Web site is an example of this definition. Greenstein (2000) shares this view and says that the digital library is known less for the extent and nature of the collections it owns than for the networked information space it defines through its online services. Paepcke et al. (1996) also state that a digital library provides a single point of access to a wide range of autonomously distributed sources.

In addition, digital libraries may be seen as new forms of information institutions, multimedia information retrieval systems, or information systems that support the creation, use, and searching of digital content (Borgman 2002). Digital libraries also represent a new infrastructure and environment that has been created by the integration and use of computing, communications, and digital content on a global scale, destined to become an essential part of the information infrastructure in the twenty-first century (DELOS 2004).

In summary, digital libraries:

■ are an organized and managed collection of digital information;
■ are accessible over a network; and
■ may include services.

As Borgman (2002) states, "Digital libraries are not ends in themselves; rather, they are enabling technologies for digital asset management . . . electronic publishing, teaching and learning, and other activities. Accordingly, digital libraries need to be evaluated in the context of specific applications."

Judy Jeng ([email protected]) is a Ph.D. candidate at the School of Communication, Information, and Library Studies, Rutgers, The State University of New Jersey.

WHAT IS USABILITY IN THE CONTEXT OF THE DIGITAL LIBRARY AND HOW CAN IT BE MEASURED? | JENG 3
■ Dimensions of Usability

Usability is a multidimensional construct that can be examined from various perspectives. The term usability has been used broadly and means different things to different people. Some relate usability to ease of use or user-friendliness and consider it from an interface-effectiveness point of view. This view makes sense, as usability has its theoretical base in human-computer interaction, and many studies on usability focus on interface design. Kim (2002, 26), for instance, points out that "the difference between interface effectiveness and usability is not clear."

Usability can also be related to usefulness and usableness. Gluck (1997), for instance, made this assessment. Usableness refers to such questions as "Can I turn it on?" and "Can I invoke that function?" Usefulness refers to such questions as "Did it really help me?" and "Was it worth the effort?" Landauer (1995) distinguishes usability (ease of operation) from usefulness (serving an intended purpose), commenting that the two are hard to separate in the context of evaluation.

Usability has several attributes. The International Standards Organization (1994, 10) defines usability as "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use." Nielsen (1993) points out that usability has five attributes: learnability, efficiency, memorability, low error rate or easy error recovery, and satisfaction. Brinck, Gergle, and Wood (2002) share a similar perspective: usability means being functionally correct, efficient to use, easy to learn and remember, error tolerant, and subjectively pleasing. In addition, Booth (1989) outlines four factors of usability: usefulness, effectiveness (ease of use), learnability, and attitude (likeability). Hix and Hartson (1993) classify usability into initial performance, long-term performance, learnability, retainability, advanced feature usage, first impression, and long-term user satisfaction. Hix and Hartson are unique in that they take a step further to differentiate performance and satisfaction into initial and long-term measures. The definitions given by ISO and Nielsen are the most widely cited.

Usability can also be grouped into two large categories: inherent usability (Kurosu and Kashimura 1995) and apparent usability (Kurosu and Kashimura 1995; Tractinsky 1997). Inherent usability is mainly related to the functional or dynamic part of interface usability. It includes those attributes that focus on how to make the product easy to understand, easy to learn, efficient to use, less error-prone, and pleasurable. Apparent usability, on the other hand, is more related to the visual impression of the interface. At times, inherent usability and apparent usability may be contradictory (Fu 1999). For example, in Web page design, graphics may enhance apparent usability but slow down the system.

Usability has user focus. Dumas and Redish (1993, 4) define usability as meaning that "people who use the product can do so quickly and easily to accomplish their task." Clairmont, Dickstein, and Mills (1999) make the similar statement that "[u]sability is the degree to which a user can successfully learn and use a product to achieve a goal."

Usability is different from functionality. Dumas and Redish (1993) use the videocassette recorder (VCR) as an example to illustrate the difference between the two: VCRs may have high functionality (the features work as they were designed to work) but low usability (people cannot use them quickly and easily to accomplish their tasks). Usability has several aspects, including interface design, functional design, data and metadata, and computer systems and networks (Arms 2000). Usability is a property of the total system: all the components must work together smoothly to create an effective and convenient digital library.

Usability can be tackled from various directions. Blandford and Buchanan (2002a) suggest that usability is technical, cognitive, social, and design-oriented, and that it is important to bring these different perspectives together, to share views, experiences, and insights. Indeed, digital library development involves an interplay between people, organization, and technology, and the usability issue should look at the system as a whole.

In addition to those views, usability can also be examined from the perspectives of graphic design, navigation, and content (Spool et al. 1999). Turner (2002) categorizes usability into navigation, page design, content, accessibility, media use, interactivity, and consistency.

Figure 1 compares various perspectives on the attributes of usability.

■ Evaluation of Usability

There are a number of ways to evaluate usability. The techniques include formal usability testing; usability inspection; card sort; category membership expectation; focus groups; questionnaires; think-aloud; analysis of site usage logs; cognitive walkthrough; heuristic evaluation; claims analysis; concept-based analysis of surface and structural misfits (CASSM); and paper prototyping (Askin 1998; Blandford et al. 2004; Campbell 2001; Kantner and Rosenbaum 1997; Keith et al. 2003; Nielsen and Mack 1994; Popp 2001; Rosson and Carroll 2002; Snyder 2003). The areas of usability testing for digital libraries have covered breadth of coverage, navigation, functionality, utility, interface, metadata appropriateness, and awareness of library resources.

4 INFORMATION TECHNOLOGY AND LIBRARIES | JUNE 2005


Authors | Attributes
Booth (1989) | usefulness, effectiveness, learnability, attitude
Brinck et al. (2002) | functionally correct, efficient to use, easy to learn, easy to remember, error tolerant, and subjectively pleasing
Clairmont et al. (1999) | successfully learn and use a product to achieve a goal
Dumas & Redish (1993) | perform tasks quickly and easily
Furtado et al. (2003) | ease of use and learning
Gluck (1997) | usableness, usefulness
Guillemette (1995) | effectively used by target users to perform tasks
Hix & Hartson (1993) | initial performance, long-term performance, learnability, retainability, advanced feature usage, first impression, and long-term user satisfaction
ISO (1994) | effectiveness, efficiency, satisfaction
Kengeri et al. (1999) | effectiveness, likeability, learnability, usefulness
Kim (2002) | interface effectiveness
Nielsen (1993) | learnability, efficiency, memorability, errors, satisfaction
Oulanov & Pajarillo (2002) | affect, efficiency, control, helpfulness, adaptability
Shackel (1986) | effectiveness, learnability, flexibility, user attitude

Figure 1. Attributes of usability

The National Taiwan University Library used questionnaires to survey 1,784 users on usability (Lan 2001). They found the site's usability problems are mainly in the areas of information architecture and in the browsing and searching mechanisms. The study of CUNY+ (Oulanov and Pajarillo 2002) also employed a questionnaire as the primary method of usability assessment. The authors conducted a two-phase study to compare the usability of the text-based and Web-based CUNY sites. The criteria used were affect, efficiency, control, helpfulness, and adaptability.

Adams and Blandford (2002) reported on their study of accessibility at a large London-based hospital. They conducted focus groups and in-depth interviews with seventy-three hospital clinicians. Fifty percent of the participants were nurses, while the other fifty percent were senior and junior doctors, consultants, surgeons, managers, and IT department members. The study focused on two themes: (1) the perceived effectiveness of traditional and digital libraries as clinical resources; and (2) the impact of clinician status on control over and access to information. Participants responded that digital library technology provides remote access to materials, but that the system's usability is poor and it is time-consuming to access information.

Theng, Mohd-Nasir, and Thimbleby (2000a) utilized questionnaires and heuristic evaluation to measure the usability of the ACM Digital Library, the Networked Computer Science Technical Reference Library, and the New Zealand Digital Library. This study helps to understand the purpose of digital libraries.

Sumner and Dawe (2001) studied the usability of the Digital Library for Earth System Education (DLESE), focusing on its role in the process of educational resource reuse. One finding is that the design of the search results page is critical for supporting resource comprehension. Also, the library's metadata plays a central role in documenting the resource well enough to support comprehension and modification processes.

Sumner et al. (2003) again used DLESE to study usability, in addition to the National Science Digital Library (NSDL). The purpose of this study was to identify educators' expectations and requirements for the design of educational digital collections for classroom use. A series of five focus groups was conducted with a total of thirty-six teachers and two librarians to review eighteen Web sites. The participants indicated that content quality, advertising, bias, and design were important factors influencing their perceptions.

Hartson, Shivakumar, and Pérez-Quiñones (2004) applied the usability inspection method to evaluate the design and functionality of the Networked Computer Science Technical Reference Library (NCSTRL). They found NCSTRL's design was apparently functionally oriented rather than based on user task threads. Another finding of the usability inspection concerned the terminology used in NCSTRL: there was jargon, and the use of terms was designer-centered rather than user-centered.

The evaluation of DeLIver applied a mix of methods, including transaction log analysis, surveys, interviews, focus groups, and formal usability testing, to measure accessibility (Bishop 2001; Neumann and Bishop 1998). They learned that triangulation of data is crucial. The evaluation process allowed the evaluators to pursue the different social issues surrounding digital library use as well as to deal with specific usability issues.

The University of Arizona Library applied a number of methods to evaluate the usability of its library Web site, SABIO, including heuristic evaluation, walk-through, card sorting, and formal usability testing (Dickstein and Mills 2000). Heuristic evaluation was used to systematically inspect the user interface; walk-through was used to explore and to envision users' problems in the prototype stage; card sorting was used to assess organization

and menu structure; and formal usability testing was used to observe real users' use of the site.

Dorward, Reinke, and Recker (2002) evaluated the Instructional Architect, which aims to increase the utility of NSDL resources for classroom teachers. The methods they employed included formal usability testing and focus groups. The evaluation centered on interface design and contents. It was suggested that an introductory tutorial, better graphics, and a preview screen should be incorporated.

The University of the Pacific applied the formal usability testing technique to measure students' awareness of library resources (Krueger, Ray, and Knight 2004). They recruited 134 students to perform eight tasks, including locating an article, locating a journal, finding the call number of a book, finding overdue information, finding a biography, and connecting from home. They found 45 percent of participants were familiar enough with library resources and 34 percent were regular users of library Web resources. They also found that the majority of their students know how to search for books in the OPAC but that many flounder when asked to find similar information for journals. Another lesson the university learned was that it should have employed a smaller sample using purposeful sampling. This would have allowed gathering more useful data by targeting small groups of students that represent demographic characteristics of interest.

Figure 2 is a review of usability tests in academic digital libraries.

■ Usability Evaluation Model

This paper proposes an evaluation model for assessing the usability of digital libraries. The proposed evaluation model applies the definition of ISO 9241-11 (International Standards Organization 1994), which examines effectiveness, efficiency, and satisfaction. In addition, the model includes learnability (see figure 3). The ISO definition defines usability as "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use." The ISO definition, however, does not explicitly specify operational criteria on what to evaluate.

In the proposed model, effectiveness is to evaluate if the system as a whole can provide information and functionality effectively, and will be measured by how many answers are correct. Efficiency is likewise to evaluate if the system as a whole can be used to retrieve information efficiently, and will be measured by how much time it takes to complete tasks. Satisfaction will look into the areas of ease of use, organization of information, clear labeling, visual appearance, contents, and error corrections, and will be measured by Likert scales and questionnaires. Ease of use is to evaluate the user's perception of the ease of use of the system. Organization of information is to evaluate if the system's structure, layout, and organization meet the user's satisfaction. Labeling is to evaluate, from the user's perception, if the system provides clear labeling and if the terminology used is easy to understand. Visual appearance evaluates the site's design to see if it is visually attractive. Contents evaluate the authority and accuracy of the information provided. Error correction is to test if users may recover from mistakes easily and if they make mistakes easily due to the system's design. Learnability is to measure learning effort. The learning effort takes into consideration how soon the subject begins to know how to perform tasks and how many tasks are completed correctly.

Figure 3 is a diagram illustrating this evaluation model. It is suspected that there exists an interlocking relationship among effectiveness, efficiency, and satisfaction. In addition, it will be interesting to examine how learnability interacts with effectiveness, efficiency, and satisfaction.

■ Usability Evaluation Instruments

A set of instruments is designed based on the evaluation model. University and college library Web sites are selected as an example to test the model and instruments for the purpose of this paper. The instruments include a pretest questionnaire (see appendix A), a list of tasks (see appendix B), and a post-test questionnaire (see appendix C).

The pretest questionnaire collects demographic data, including gender, age, status (undergraduate, master's, or doctoral student), major, years at the institution, original nationality if coming from a foreign country, and familiarity with the site. There have been studies on how gender, age, and cultural differences affect how people interact with online information (Collins and Aguiñaga 2001; Duncker 2002; Vohringer-Kuhnt 2003). A university or college library Web site serves a diverse student body, including international students and students in a wide range of ages. It is interesting to examine how those demographic factors influence usability assessment.

The list of tasks includes nine questions that are representative of typical uses of a library's Web site. Three of those questions are to locate known items, including author, title, and e-book searching. Four are to use databases to find articles in electronic journals. Two are to locate information, such as eligibility for ILL services and how to set up remote access.

The subjects are asked to rank satisfaction with the system after each task and to write down comments.
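The model's operational measures (effectiveness as answer accuracy, efficiency as time on task, satisfaction as post-task Likert ratings) can be sketched in code. This is only an illustrative sketch: the record fields and the 1-7 rating scale are assumptions for illustration, not the study's actual data format.

```python
# Sketch: aggregating one subject's task sessions into the three ISO-based
# measures used by the proposed model. Field names are illustrative.
from dataclasses import dataclass
from statistics import mean


@dataclass
class TaskResult:
    correct: bool       # effectiveness input: was the answer correct?
    seconds: float      # efficiency input: time taken to complete the task
    satisfaction: int   # assumed 1-7 Likert rating given after the task


def usability_measures(results):
    """Summarize task results as effectiveness, efficiency, and satisfaction."""
    return {
        # share of tasks answered correctly
        "effectiveness": sum(r.correct for r in results) / len(results),
        # mean time per task (lower means more efficient)
        "efficiency_mean_seconds": mean(r.seconds for r in results),
        # mean post-task Likert rating
        "satisfaction_mean": mean(r.satisfaction for r in results),
    }


session = [TaskResult(True, 45.0, 6), TaskResult(False, 120.0, 3),
           TaskResult(True, 60.0, 5)]
print(usability_measures(session))
```

Measuring the three attributes separately, as here, is what allows the interlocking relationship among them to be examined afterward rather than assumed.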



Site | Methods | Subjects | Areas | Authors
ACM, IEEE-CS, NCSTRL, NDLTD | formal usability test, questionnaire | 48 students (38 graduate, 10 undergraduate) | interface | Kengeri et al. (1999)
ACMDL, NCSTRL, NZDL | questionnaire, heuristic evaluation | 45 undergraduate | design and structure | Theng et al. (2000a, 2000b)
Alexandria | questionnaire, formal usability test | 23 students | interface | Thomas (1998)
CUNY+ | questionnaire | 10 students | interface | Oulanov & Pajarillo (2002)
DeLIver | transaction log, survey, interview, focus groups, formal usability test | 1,900 graduate, 420 faculty | accessibility | Neumann & Bishop (1998); Bishop (2001)
DLESE, NSDL | focus groups | 36 teachers, 2 librarians | design | Sumner et al. (2003)
Instructional Architect | formal usability test, focus group | 26 teachers | interface, content | Dorward et al. (2002)
London Hospital | focus groups, interviews | 73 clinicians | accessibility | Adams & Blandford (2002)
MARIAN (Virginia Tech) | formal usability test, log analysis, questionnaire | students, faculty, staff | interface | France et al. (1999)
MIT | formal usability test | 29 (faculty, graduate, undergraduate, staff) | site design | Hennig (1999)
National Taiwan U. | questionnaire | 1,784 faculty and students | information architecture; browsing & searching mechanism; layout and display | Lan (2001)
NCSTRL | usability inspection | 3 usability experts | design, interface, functionality | Hartson et al. (2004)
SABIO | formal usability test, heuristic evaluation, design walk-through, card sorting | students | design | Dickstein & Mills (2000)
U. of Illinois at Chicago | formal usability test | 12 students | navigation | Augustine & Greene (2002)
U. of South Florida | formal usability test | 26 undergraduate | interface | Allen (2002)
U. of the Pacific | formal usability test | 134 students | awareness of library resources | Krueger et al. (2004)
Washington State U. | formal usability test | 12 students | navigation | Chisman et al. (1999)

Figure 2. Methods of usability evaluation

In addition, there is a post-test questionnaire that specifically examines satisfaction in the areas of ease of use, organization of information, clear labeling, visual appearance, contents, and error corrections.

■ Testing of the Model and Instruments

The earlier version of the model and instruments was

tested using three students at the Rutgers University Libraries Web site. Revisions were made after the pilot study. The current version of the model and instruments is tested at the Rutgers University Libraries Web site (www.libraries.rutgers.edu) and the Queens College Library Web site (http://qcpages.qc.edu/Library). It is hoped that the model and instruments can be generalized for use in academic digital libraries.

The study employs a number of techniques, including formal usability testing, questionnaire, interview, think-aloud, and log analysis. The evaluation model and instruments in this study consider both the quantifiable elements of performance (time, accuracy rate, steps to complete tasks) and subjective criteria (satisfaction). Satisfaction is further examined in the areas of ease of use, organization of information, labeling, visual appearance, content, and error correction. The evaluation approach is empirical.

Figure 3. A proposed usability evaluation model. (The diagram relates usability to four attributes: effectiveness, efficiency, satisfaction, and learnability, with satisfaction broken down into ease of use, organization of information, labeling, visual appearance, content, and error correction.)

■ Results

While the primary interest of this study is to devise an evaluation model and a suite of instruments for evaluating the usability of academic digital libraries, the data collected in the study are used to explore the following usability issues.

The literature review has indicated that there is a need for usability-testing benchmarks for comparison. For example, Theng, Mohd-Nasir, and Thimbleby (2000b) report that they had to make the assumption that if an area scores 75 percent and above for accuracy, the area is well implemented. The usability testing at MIT Libraries also reported that subjects had a 75 percent success rate (Hennig 1999). But, they wondered, is 75 percent high or low? The results of the usability testing in this study are forthcoming in the author's doctoral dissertation and will be contributed to the literature as a benchmark.

In addition, this research examines the user lostness and navigation disorientation issue. The user lostness issue has been reported by several studies, including Blandford, Stelmaszewska, and Bryan-Kinns (2001), Buttenfield (1999), Gullikson et al. (1999), Kengeri et al. (1999), and Spool et al. (1999), as well as by Theng, Mohd-Nasir, and Thimbleby (2000a). Indeed, navigation disorientation is among the biggest frustrations for Web users (Brinck, Gergle, and Wood 2002). This situation is particularly common with the increasing provision of digital library portals that provide links to various libraries from one Web site.

This research also examines whether there exists an interrelated relationship among effectiveness, efficiency, and satisfaction. The results indicate this relationship; they are reported in Jeng (2004) and will be available in the author's doctoral dissertation. Although there is an interlocking relationship among these three criteria, each has its own emphasis and should be measured separately.

■ Contribution

This paper contributes to the literature an evaluation model and a suite of instruments for evaluating the usability of academic digital libraries. It calls attention to potential usability differences due to age and culture, the user lostness and navigation disorientation issue, and the need for benchmarks. It discusses usability in the context of digital libraries and examines how it has been evaluated. This study will continue as doctoral dissertation research, and the results will be shared with academic and professional communities.

Editor's Note: Ms. Jeng's article is the winner of the 2004 LITA/Endeavor Student Writing Award.



References

Adams, A., and A. Blandford. 2002. Acceptability of medical digital libraries. Health Informatics Journal 8 (2): 58–66.
Allen, Maryellen. 2002. A case study of the usability testing of the University of South Florida's virtual library interface design. Online Information Review 26: 40–53.
Arms, William Y. 2000. Digital libraries. Cambridge, Mass.: MIT Pr.
Askin, A. Y. 1998. Effectiveness of usability evaluation methods as a function of users' learning stages. Master's thesis, Purdue Univ.
Augustine, Susan, and Courtney Greene. 2002. Discovering how students search a library Web site: A usability case study. College & Research Libraries 63 (4): 354–65.
Bishop, Ann Peterson. 2001. Logins and bailouts: Measuring access, use, and success in digital libraries. The Journal of Electronic Publishing 4 (2), www.press.umich.edu/jep/04-02/bishop.html.
Blandford, Ann, and George Buchanan. 2002a. Usability for digital libraries. Proceedings of the second ACM/IEEE-CS Joint Conference on Digital Libraries, 424.
Blandford, Ann, and George Buchanan. 2002b. Workshop report: Usability of digital libraries @ JCDL'02, www.uclic.ucl.ac.uk/annb/DLUsability/SIGIR.pdf.
Blandford, Ann, et al. 2004. Analytical usability evaluation for digital libraries: A case study. Proceedings of the fourth ACM/IEEE Joint Conference on Digital Libraries, 27–36.
Blandford, Ann, Hanna Stelmaszewska, and Nick Bryan-Kinns. 2001. Use of multiple digital libraries: A case study. Proceedings of the first ACM/IEEE-CS Joint Conference on Digital Libraries, 179–88.
Booth, Paul. 1989. An introduction to human-computer interaction. London: Lawrence Erlbaum Associates.
Borgman, Christine L. 2002. Fourth DELOS workshop. Evaluation of digital libraries: Testbeds, measurements, and metrics. Final report to the National Science Foundation, wwwold.sztaki.hu/conferences/deval/presentations.html.
Borgman, Christine L., et al. 2000. Evaluating digital libraries for teaching and learning in undergraduate education: A case
Collins, Kathleen, and José Aguiñaga. 2001. Learning as we go: Arizona State University West Library's usability experience. In Usability assessment of library-related Web sites: Methods and case studies, ed. N. Campbell, 16–29. Chicago: ALA.
DELOS. 2004. About the DELOS Network of Excellence on Digital Libraries, http://delos-noe.iei.pi.cnr.it.
Dickstein, Ruth, and Vicki Mills. 2000. Usability testing at the University of Arizona library: How to let the users in on the design. Information Technology and Libraries 19 (3): 144–51.
Digital Library Federation. 1999. A working definition of digital library, www.clir.org/diglib/dldefinition.htm.
Dorward, Jim, Derek Reinke, and Mimi Recker. 2002. An evaluation model for a digital library services tool. Proceedings of the second ACM/IEEE-CS Joint Conference on Digital Libraries, 322–23.
Dumas, Joseph S., and Janice C. Redish. 1993. A practical guide to usability testing. Norwood, N.J.: Ablex.
Duncker, Elke. 2002. Cross-cultural usability of the library metaphor. Proceedings of the second ACM/IEEE-CS Joint Conference on Digital Libraries, 223–30.
Fox, Edward A., and Shalini R. Urs. 2002. Digital libraries. Vol. 36 of Annual review of information science and technology, ed. Blaise Cronin, 503–89. Medford, N.J.: Information Today.
France, Robert K., et al. 1999. Use and usability in a digital library search system, www.dlib.vt.edu/Papers/Use_usability.PDF.
Francisco-Revilla, Luis, et al. 2001. Managing change on the Web. Proceedings of the first ACM/IEEE-CS Joint Conference on Digital Libraries, 67–76.
Fu, Limin Paul. 1999. Usability evaluation of Web page design. PhD diss., Purdue Univ.
Furtado, Elizabeth, et al. 2003. Improving usability of an online learning system by means of multimedia, collaboration, and adaptation resources. In Usability evaluation of online learning programs, ed. Claude Ghaoui, 69–86. Hershey, Pa.: Information Science Publ.
Gluck, Myke. 1997. A descriptive study of the usability of geospatial metadata. Annual Review of OCLC Research, www.oclc.org/research/publications/arr/1997/gluck/gluck_
study of the Alexandria Digital Earth ProtoType (ADEPT). frameset.htm (QY: accessed date?).
Library Trends 49 (2): 228-50. Greenstein, Daniel. 2000. Digital libraries and their challenges.
Brinck, Tom, Darren Gergle, and Scott D. Wood. (2002). Design- Library Trends 49 (2): 290–303.
ing Web sites that work: Usability for the Web. San Francisco: Guillemette, Ronald A. 1995. The evaluation of usability in
Morgan Kaufmann. interactive information systems. In Human factors in infor-
Brogan, Martha L. 2003. A survey of digital library aggregation mation systems: Emerging theoretical bases, ed. Jane M. Carey.
services. Washington, D.C.: The Digital Library Federation, Norwood, N.J.: Ablex.
Council on Library and Information Resources. Gullikson, Shelley, et al. 1999. The impact of information archi-
Buttenfield, Barbara. 1999. Usability evaluation of digital librar- tecture on academic Web site usability. The Electronic Library,
ies. Science and Technology Libraries, 17 (3/4): 39–59. 17 (5): 293–304.
Campbell, Nicole. 2001. Usability assessment of library-related Web Hartson, H. Rex, Priya Shivakumar, and Manuel A. Pérez-Qui-
sites: Methods and case studies. Chicago: ALA. ñones. 2004. Usability inspection of digital libraries: A case
Chisman, Janet, Karen Diller, and Sharon Walbridge. 1999. study. International Journal on Digital Libraries 4 (2): 108–23.
Usability testing: A case study. College & Research Libraries 60 Hennig, Nicole. 1999. Web site usability test, http://macfadden.
(6): 552–69. mit.edu:9500/webgroup/usability/results (QY: accessed
Clairmont, Michelle, Ruth Dickstein, and Vicki Mills. 1999. Test- date?).
ing of usability in the design of a new information gateway, www. Hix, Deborah, and H. Rex Hartson. 1993. Developing user inter-
library.arizona.edu/library/teams/access9798 (QY: accessed faces: Ensuring usability through product and process. New York:

WHAT IS USABILITY IN THE CONTEXT OF THE DIGITAL LIBRARY AND HOW CAN IT BE MEASURED? | JENG 9
10 INFORMATION TECHNOLOGY AND LIBRARIES | JUNE 2005


Appendix A. Pretest Questionnaire

Thank you very much for agreeing to participate in this experiment. All of your personal data that we collect will be entirely confidential, viewed only by the experimenter, and shared only as part of group results. But first, we would like to gather a bit of background information about you so that we will be better able to interpret your use of and reactions to the system.

Participant # ______

Date: _________________

Gender: ___ Male ___ Female

Age: _____

What is your current status:
_____ Undergraduate _____ Master’s Student _____ Doctoral Student _____ Faculty

Major/Department: _______________________________

How many years have you been at Rutgers or Queens? _______

If you are from a foreign country, how long have you been in the U.S.? ________ years

Your original nationality: _________________

Ethnic group: ___ White ___ African American ___ Asian ___ Hispanic ___ Native American ___ Other: ___________

How often do you use the Library’s Web site:
_____ Never used it
_____ Once or twice a semester
_____ Once or twice a month
_____ Once or twice a week
_____ Daily

Appendix B. Usability Testing Questions


The goal of this test is to evaluate the usability of the library’s Web site. I will ask you a series of questions and would like you to think out loud while you look for the answer. Some questions are easy and some are more difficult. Do not worry if you can’t find the answer every time. Please remember that we are testing the effectiveness of the site design and this is not a test of you. The whole test should take less than an hour. Thank you.

1. Does the library have a copy of Gone with the Wind, book format, by Margaret Mitchell?

Please rank from 1 to 5 regarding the ease of use of the system, 1 being the easiest and 5 being the most difficult.
1 2 3 4 5
Easy to use    Difficult to use
Your comment: ____________________________________

2. Does the library currently subscribe to a paper copy of Advertising Age?
1 2 3 4 5
Easy to use    Difficult to use
Your comment: ____________________________________

3. Use a database to find an article about nursing homes and mental illness.
1 2 3 4 5
Easy to use    Difficult to use
Your comment: ____________________________________

4. Find a journal article on gospel music.
1 2 3 4 5
Easy to use    Difficult to use
Your comment: ____________________________________

5. I am interested in investing in what are referred to as “callable securities.” Please find a recent article about them.
1 2 3 4 5
Easy to use    Difficult to use
Your comment: ____________________________________

6. Find an encyclopedia article about French wine.
1 2 3 4 5
Easy to use    Difficult to use
Your comment: ____________________________________

7. Find an e-book called “The story of mankind.”
1 2 3 4 5
Easy to use    Difficult to use
Your comment: ____________________________________

8. Can alumni use the interlibrary loan service?
1 2 3 4 5
Easy    Difficult
Your comment: ____________________________________

9. Find instructions on how to set up your home computer to have remote access to the library’s electronic resources.
1 2 3 4 5
Easy to find    Difficult to find
Your comment: ____________________________________
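Each task above yields a 1–5 ease-of-use rating per participant (1 = easiest, 5 = most difficult). A minimal sketch of how such ratings might be aggregated into per-task means is shown below; the participant IDs and scores are invented for illustration, not data from the study:

```python
from statistics import mean

# Hypothetical ratings: participant -> one 1-5 ease-of-use score
# per task in Appendix B (1 = easiest, 5 = most difficult).
ratings = {
    "P01": [1, 2, 4, 3, 5, 2, 1, 3, 4],
    "P02": [2, 1, 3, 2, 4, 1, 2, 2, 3],
    "P03": [1, 3, 5, 4, 5, 2, 1, 4, 5],
}

NUM_TASKS = 9

# Mean rating per task, averaged across participants.
task_means = [
    round(mean(scores[t] for scores in ratings.values()), 2)
    for t in range(NUM_TASKS)
]

for task, m in enumerate(task_means, start=1):
    print(f"Task {task}: mean ease-of-use = {m}")
```

Lower means would flag tasks the site supports well; higher means would point to candidate problem areas for redesign.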

Appendix C. Post-Test Questionnaire

Thanks again for participating in this experiment. This questionnaire gives you an opportunity to tell us your reactions to the system you used. Please circle a number on the scale to indicate your reactions. Please write comments to elaborate on your answers. I will go over your answers with you to make sure that I understand all of your responses. Thank you.

1. Please rate the ease of use of the Web site.
1 2 3 4 5
Easy    Difficult
Your comment: ____________________________________

2. What do you think about the organization of information on the site?
1 2 3 4 5
Clear    Unclear
Your comment: ____________________________________

3. What do you think about the terminology used in the site? Are categories clearly labeled?
1 2 3 4 5
Clear    Unclear
Your comment: ____________________________________

4. Is the site visually attractive?
1 2 3 4 5
Attractive    Unattractive
Your comment: ____________________________________

5. What are the best features of the site? ____________________________________

6. What are the worst features of the site? ____________________________________

7. What new content or features would you like to see on the site? ____________________________________

8. Can you recover from mistakes easily?
1 2 3 4 5
Easy    Difficult
Your comment: ____________________________________

9. Your overall reaction to the system:
1 2 3 4 5
Satisfied    Unsatisfied
Your comment: ____________________________________

10. Do you feel lost while using the site?
_____ Yes _____ No
Your comment: ____________________________________

11. Is the site easy to navigate?
_____ Yes _____ No
Your comment: ____________________________________

12. When you click a button on the Web page, do you expect that the click will lead you to the correct answer?
_____ Yes _____ No
Your comment: ____________________________________

13. Do you have any other comments about the Web site?
____________________________________
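The post-test instrument mixes 1–5 scale items with yes/no items, so the two types must be summarized differently. A hedged sketch, with invented responses for three hypothetical participants:

```python
# Hypothetical post-test responses (item number -> answers per participant).
scale_items = {
    1: [2, 1, 3],   # ease of use (1 = easy, 5 = difficult)
    9: [1, 2, 2],   # overall reaction (1 = satisfied, 5 = unsatisfied)
}
yes_no_items = {
    10: ["no", "no", "yes"],    # felt lost while using the site?
    11: ["yes", "yes", "yes"],  # site easy to navigate?
}

# Scale items: report the mean circled value.
scale_means = {q: sum(v) / len(v) for q, v in scale_items.items()}

# Yes/no items: report the proportion answering "yes".
yes_rates = {q: v.count("yes") / len(v) for q, v in yes_no_items.items()}

print("scale means:", scale_means)
print("yes rates:", yes_rates)
```

The open-ended items (5–7 and 13) would be analyzed qualitatively rather than numerically.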
