A. SYSTEM USABILITY
The researcher adopted the system usability questionnaire from Lewis, J. R. (1995), "IBM Computer Usability Satisfaction Questionnaires: Psychometric Evaluation and Instructions for Use." See Appendix A for a copy of the questionnaire. The researcher conducted the system usability test in the six (6) BISU campuses, namely: Bilar Campus, Main Campus, Candijay Campus, Clarin Campus, Calape Campus, and Balilihan Campus, from February to March 2013. The respondents were the librarians and library personnel from each campus. The researcher demonstrated the system's features in detail, such as the borrower's information profile, library item information, check-out, check-in, and reservation modules. The researcher also presented the library statistical reporting and provided the respondents with the system processes manual (see Appendix C). Each presentation took two to three hours. The respondents then performed hands-on library system processes, after which they answered the system usability questionnaire. The researcher collected responses from 16 respondents across the BISU campus libraries.
The ranges of the interpretative guide were computed by getting the interval value. The interval value for the system usability scale is approximately 0.9, computed as follows:
Interval = (Number of options – 1) / Number of options
         = (7 – 1) / 7
         ≈ 0.9
The interpretative guide for the interpretation of the statistical results of the system usability is presented in Table 8.
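As an illustration only (not part of the thesis), the following Python sketch shows how the interval value and the corresponding score bands of a 7-point scale could be generated; the interpretation labels are placeholders, since the actual labels are those listed in Table 8.

def interval(num_options):
    # Interval value = (number of options - 1) / number of options
    return (num_options - 1) / num_options

def build_ranges(num_options, labels):
    # Split the 1..num_options score range into equal bands, one per label.
    width = interval(num_options)
    ranges = []
    lower = 1.0
    for label in labels:
        upper = min(round(lower + width, 2), float(num_options))
        ranges.append((round(lower, 2), upper, label))
        lower = upper + 0.01  # the next band starts just above the previous one
    return ranges

print(round(interval(7), 2))  # 0.86, reported as roughly 0.9 in the text
placeholder_labels = ["Level %d" % i for i in range(1, 8)]  # hypothetical labels
for low, high, label in build_ranges(7, placeholder_labels):
    print("%.2f - %.2f: %s" % (low, high, label))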
B. WEB USABILITY
The aspects of Web usability were evaluated using a Web Usability Survey adopted from the survey developed by the Massachusetts Institute of Technology. See Appendix B for a copy of the questionnaire. The Web Usability Survey consisted of questions that rate the website's aspects in Navigation, Functionality, User Control, Language and Content, Online Help and User Guides, System and User Feedback, Consistency, Error Prevention and Correction, and Architectural and Visual Clarity.
During the administration of the study, the sample size was determined from a total population based on the approximate student population in all BISU campuses, which is 11,000. The sample size was rounded off to 386 respondents. The computation of the sample size using Slovin's formula is as follows:
n = N / (1 + Ne²)
  = 11,000 / (1 + (11,000)(0.05)²)
  ≈ 386
where n = the sample size
      N = the total population
      e = the margin of error
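For reference, a minimal Python sketch (not part of the thesis) of the same Slovin computation is shown below; the population of 11,000 and the 0.05 margin of error are the values stated above.

import math

def slovin(population, margin_of_error):
    # Slovin's formula: n = N / (1 + N * e^2), rounded up to a whole respondent.
    return math.ceil(population / (1 + population * margin_of_error ** 2))

print(slovin(11000, 0.05))  # 386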
Table 10 below presents the distribution of the respondents. Of the 386 respondents, 136 were BISU students, 56 were BISU faculty and staff, 165 were non-BISU students, and 29 were non-BISU professionals.
During the actual survey administration, a random sampling procedure was utilized. The administration involved two groups of respondents. The first group consisted of BISU students, faculty, and staff. The researcher conducted the web usability test in the six (6) BISU campuses, where the respondents accessed the existing library records of BISU through the Internet, from February to March 2013. These respondents were randomly selected during computer laboratory classes because the demonstration had to be conducted inside a computer laboratory. The researcher sought approval from the campus administrator, dean, or the instructor in charge of the laboratory room before the demonstration and testing process. The campus librarian also assisted during the conduct of the web usability evaluation. During the demonstration, the Online Public Access Catalog (OPAC) features were presented in detail through the web, and the respondents worked through the OPAC processes. After using the OPAC, the respondents were asked to answer the web usability survey questionnaire. A total of 192 BISU respondents participated in the web usability evaluation, of whom 136 were students and 56 were faculty members and staff.
The researcher conducted the second group of web usability testing at Silliman University, Dumaguete City, in order to obtain an evaluation from the community perspective. This group of respondents, consisting of students and other professionals, browsed and accessed the existing library records of BISU through the Internet on April 17-19, 2013. They were randomly selected during computer laboratory classes because the demonstration had to be conducted inside a computer laboratory. The researcher sought approval from the College Dean before the demonstration and testing process. During the demonstration, the Online Public Access Catalog (OPAC) features were presented in detail through the web, and the respondents worked through the OPAC processes. After using the OPAC, the respondents were asked to answer the web usability survey questionnaire. A total of 194 respondents in this group participated in the web usability evaluation, of whom 165 were students from Silliman University and 29 were other professionals.
Table 10. Distribution of the Respondents

Type of Respondents        No. of Respondents    Percentage
BISU Students                      136              35.23
BISU Faculty/Staff                  56              14.51
Non-BISU Students                  165              42.75
Non-BISU Professionals              29               7.51
Total                              386             100.00
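As a quick arithmetic check (a sketch, not from the thesis), the percentage column of Table 10 follows from dividing each group count by the total of 386:

counts = {
    "BISU Students": 136,
    "BISU Faculty/Staff": 56,
    "Non-BISU Students": 165,
    "Non-BISU Professionals": 29,
}
total = sum(counts.values())  # 386
for group, n in counts.items():
    # e.g. 136 / 386 * 100 = 35.23
    print("%s: %d (%.2f%%)" % (group, n, n / total * 100))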
Similar to the system usability evaluation, the ranges of the interpretative guide for web usability were computed by getting the interval value. The interpretative guide for the interpretation of the statistical results of the web usability is presented in Table 11 below.
Table 11. Interpretative Guide of the Web Usability

Table 12 (excerpt). Web Usability Evaluation Results, items 8.4–8.7 (Architectural and Visual Clarity)

Item                                                 Ratings by Respondent Group    Mean   Interpretation
8.4 White space is sufficient; pages are not
    too dense.                                       4.0   4.0   4.0   4.4           4.1   Very Good
8.5 Unnecessary animation is avoided.                4.3   4.5   4.3   4.6           4.4   Excellent
8.6 Colors used for visited and unvisited links
    are easily seen and understood.                  4.2   4.1   4.0   4.6           4.2   Very Good
8.7 Bold and italic text is used sparingly.          4.2   4.4   4.1   4.6           4.3   Excellent
AGGREGATE MEAN                                                                       4.2   Very Good
Based on the results shown in Table 12, the site demonstrated very good Web usability: an Excellent rating in Navigation (x̄ = 4.3), Very Good in Functionality (x̄ = 4.2), Very Good in User Control (x̄ = 4.2), Excellent in Language and Content (x̄ = 4.3), Very Good in Online Help and User Guides (x̄ = 4.0), Excellent in Consistency (x̄ = 4.3), Very Good in Error Prevention and Correction (x̄ = 4.2), and Very Good in Architectural and Visual Clarity (x̄ = 4.2). Table 12 also shows the mean and interpretation of each statement. The aggregate mean of the web usability is 4.2, interpreted as "Very Good." This implies that the OPAC's usability is very good. The results also suggest that the respondents are highly satisfied with the navigation, consistency, and ease of use of the web pages.
Table 12 (continuation). Summary of Web Usability Ratings by Category

Category                                  Ratings by Respondent Group    Mean   Interpretation
VII. Error Prevention and Correction                                     4.2    Very Good
VIII. Architectural and Visual Clarity    4.2   4.2   4.0   4.6          4.2    Very Good
AGGREGATE MEAN                                                           4.2    Very Good
C. WEB ACCESSIBILITY
The tool used for the Web accessibility evaluation of the Online Public Access Catalog (OPAC) was WAVE (http://wave.webaim.org/), a free web accessibility evaluation tool powered by WebAIM (Web Accessibility in Mind). It is used to aid humans in the Web accessibility evaluation process. Rather than providing a complex technical report, WAVE shows the original Web page with embedded icons and indicators that reveal the accessibility of that page. The tool works by scanning the OPAC website at http://libtest.bisu.edu.ph. It examines the syntax and structure of the website and determines whether the code follows Web accessibility guidelines. After scanning the site, the results showed that the web page had no Web accessibility problems. Figure 23 shows the actual result of the Web accessibility scan.
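WAVE itself is a hosted service, so its internals are not reproduced here; the following Python sketch is only an illustration, under the assumption that the OPAC remains reachable at http://libtest.bisu.edu.ph, of one basic check of the kind such a scan performs, namely flagging images that lack alternative text.

from html.parser import HTMLParser
from urllib.request import urlopen

class MissingAltChecker(HTMLParser):
    # Collects <img> tags that lack a non-empty alt attribute.
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if not attr_map.get("alt"):
                self.missing_alt.append(attr_map.get("src", "(unknown src)"))

def check_page(url):
    # Download the page and report images with missing or empty alt text.
    html = urlopen(url).read().decode("utf-8", errors="replace")
    checker = MissingAltChecker()
    checker.feed(html)
    return checker.missing_alt

problems = check_page("http://libtest.bisu.edu.ph")  # address cited in the text
print("%d image(s) without alt text" % len(problems))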
CONCLUSIONS
Based on the evaluation results, the developed library system is good in terms of usability and accessibility. The librarians and other intended users believe and are confident that the system is very usable. It has functions and features that are highly acceptable to the intended users. The system enables the librarians to maintain and organize library processes and information for better library business decision-making.
The proposed system is integrated into one centralized server, which is hosted and implemented at the main campus of the university. Likewise, the proposed library system offers modules that are highly acceptable to the librarians, such as acquisition, cataloging, circulation, OPAC, and administration. Moreover, graphical enterprise reporting was implemented as a business intelligence technique for decision support, which is also highly acceptable to the librarians.