Evaluating A Gamification Proposal For Learn - 2022 - International Journal of H
ARTICLE INFO

Keywords:
Usability
Nielsen heuristics
Gamification
Student experiment

ABSTRACT

This paper presents the results of an educational experiment conducted to determine whether an automated, card-based gamification strategy has an impact on the learning of Jakob Nielsen's 10 heuristic usability rules. The participants in the experiment were 55 students enrolled on a human-computer interaction course. According to the results of the experiment and the hypothesis tests performed to compare both traditional and gamified approaches, there were no significant differences (t(53) = 0.66, p = 0.52), although the scores attained by the students who used the gamification strategy were slightly better when evaluated one week later (M = 6.29 and M = 6.57 out of 10, respectively). Moreover, the students' perceptions reflect that the proposed tool is easy to use (MD = 4.00 out of 5) and useful as regards learning (MD = 4.00 out of 5). Further research is needed to determine whether incorporating other gamification elements, such as rankings, difficulty levels, and game modes, would have a positive impact on student motivation, engagement and performance.
* Corresponding author at: Faculty of Computer Science, Campus of Espinardo, Murcia, Spain.
E-mail address: [email protected] (J. Nicolás).
https://doi.org/10.1016/j.ijhcs.2022.102774
Received 13 April 2021; Received in revised form 11 January 2022; Accepted 18 January 2022
Available online 25 January 2022
1071-5819/© 2022 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
R. Sobrino-Duque et al., International Journal of Human-Computer Studies 161 (2022) 102774

1. Introduction

Usability is a product attribute that influences the quality of a software system. According to ISO 9241 (2018), usability is "the extent to which a system, product, or service can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use". Usability is important because it addresses pragmatic aspects of a product related to behavioral goals that the software must achieve (Guerino and Valentim, 2020).

Educators are continuously seeking ways in which to innovate the learning process in order to decrease the dropout rate and, in general, make this process more effective in terms of a better understanding of the subjects and consequently increase pass rates (Sharma and Sharma, 2021). One means employed to achieve this objective is that of adapting traditional learning methods to new pedagogical theories. These new approaches include the application of digital strategies by taking advantage of emerging technologies, such as the Internet, multimedia, and lately social networks and video games (Crittenden et al., 2019). Digital strategies concern not only technologies, but also the ways in which devices and software are used in order to enrich learning, whether inside or outside of the classroom. Moreover, these strategies can be used in both formal and informal learning, and can transform conventional learning ideas and activities into something new and meaningful (Johnson et al., 2016).

The concept of gamification originated from the digital media industry (Rodrigues et al., 2019). Several attempts have been made to define gamification. Barber (2021) defines gamification as "the application of gaming elements to non-gaming contexts", while Huotari and Hamari (2012) define it as "a process of enhancing a service with affordances for gameful experiences in order to support user's overall value creation." Other definitions include "the use of game thinking and game mechanics to engage users and solve problems" (Dale, 2014). Gamification has already been successfully used in education, marketing, organizational, health and environmental initiatives to achieve desirable outcomes by influencing user behavior (Bai et al., 2020). Interest in applying gamification to education is increasing, given its capacity to capture and sustain students' attention, which is a prerequisite for students' success in educational environments (Khalil et al., 2018).

A review of an academic bibliography carried out by Carrion et al. (2019) showed that the terms gamification and serious game share characteristics, although they are different. A serious game is defined as "any kind of interactive computer application that incorporates gamification principles and serves an educational purpose, or aims to achieve a predefined goal" (Meijer et al., 2021). It is widely recognized that
serious games provide numerous benefits, including improved student participation, timely assessment and feedback, and ultimately, the achievement of learning outcomes (Bai et al., 2020). Games of this nature have been proposed for many Computer Science domains, including Computer fundamentals (Sindre et al., 2009), Programming (Muratet et al., 2011; Haaranen et al., 2014; Hakulinen et al., 2013; O'Donovan et al., 2013), Operating systems (Hill et al., 2003), Information systems and Computer engineering (Barata et al., 2013), Information and computer technologies (Domínguez et al., 2013), Mathematics (Gordon et al., 2013), Computer organization and Cloud computing (E. Rodrigues et al., 2007), Software engineering, and Design & Usability (Labrador and Villegas, 2014).

The aim of this paper is to determine the impact of the use of gamification on the learning process of a specific well-known usability asset, specifically Jakob Nielsen's 10 usability heuristic rules (Nielsen, 2020). In this vein, the specific contributions of this paper are the following: (1) a systematic gamification process based on rigorous learning theories that underpins the game design for learning usability heuristics; (2) a user-centered, self-built tool called Heureka to support that learning process; and (3) an empirical evaluation of both process and tool, involving students enrolled on the User Interfaces course, a Human-Computer Interaction (HCI) subject on the Degree in Computer Engineering at the University of OMITTED FOR REVIEW (OMITTED FOR REVIEW).

The remainder of this paper is organized as follows. Section 2 introduces the background to the theory of gamification, while Section 3 reviews previous work on gamification experiences for usability learning. Section 4 introduces the process used to gamify Heureka, a web application whose intention is to allow users to practice the 10 heuristics for user interface design by Jakob Nielsen. The experiment methodology is presented in Section 5, and Section 6 shows the results of the statistical analysis carried out on the data obtained. Section 7 discusses the study findings and lessons learned considering the key objectives of the experiment carried out, while threats to validity are highlighted in Section 8. Finally, Section 9 provides some concluding remarks and shows an outline of future work.

2. Background

2.1. Gamification and serious games

One important element in the present-day learning process is the use of Information and Communication Technologies (ICT) owing to their acceptance by both students and teachers. Of the various ICT tools for education, there are resources with which to create work environments, or to share files, interactive games, and others (Carrion et al., 2019). Serious games are, along with gamification, currently growing among interactive games, and will likely continue to do so because of four factors: (1) the number of domains in which serious games are used is growing; (2) the serious games industry is closely related to the video game industry, the latter of which is in a state of permanent growth; (3) videogames are now part of our culture, and (4) serious games are increasingly open to the teaching of adults (Gounaridou et al., 2021).

Game elements can be defined as elements that are characteristic of games (Deterding et al., 2011). The game elements deliver information to the players and usually appear in forms of a user interface. Game elements can be classified on various levels of abstraction. A systematic mapping found 27 game elements distributed throughout 43 serious games (Dos Santos et al., 2018). These game elements were classified in three categories: dynamics, components, and mechanics. There were four elements in the dynamics group, of which Fantasy was the most frequently used element (17 games in total), while there were nine elements in the components group. The most frequently used elements in that group were Level (36 games), Quest (16 games) and Avatar (14 games). Lastly, 14 elements were collected in the mechanics group, in which Goal (21 games) and Point System (16 games) were the most frequently used elements.

Researchers have defined game mechanics in various ways, and there is no consensus as to a unified taxonomy, thus making it difficult to distinguish how game mechanics work analytically speaking (Schell, 2019). Game mechanics help build a narrative with which to keep users curious and make them look forward to the evolution of the gamified environment as they get involved with activities. Several game mechanics have been proposed, such as points, badges, levels, challenges, leaderboards, on-boarding, social engagement loops, and feedback (Zichermann and Cunningham, 2011).

A gamification strategy is defined as using game-based mechanics, aesthetics, and game thinking to engage people, motivate action, promote learning, and solve problems (Kapp, 2012). A gamification strategy is an integrated effort that provokes the user's proactive participation and action by applying game mechanics and elements to the non-game field (Kim, 2021). Previous studies have presented numerous gamification strategies. However, many of them consider neither the users nor the context, or are duplicates because of the inconsistent definitions suggested by several researchers (Kim, 2021). Several gamification strategies have been reported in literature (Zichermann and Cunningham, 2011): (1) Competition, (2) Relationship, (3) Challenge, (4) Compensation, (5) Achievement, (6) Self-expression, and (7) Usability.

2.2. Gamification-motivation theories

Despite the fact that gamification has been employed in different academic disciplines (Bai et al., 2020), motivational theories with which to define and analyze the foundations of gamification are underdeveloped, as claimed in a literature survey by Seaborn and Fels (2015). Some gamification frameworks have been defined in order to ease the application of these theories into the development of both rigorous and engaging gamification approaches. Seaborn and Fels (2015) evidence the existence of several theories and frameworks. The theories underpinning the gamification frameworks are the following: (1) Self-Determination Theory, (2) Intrinsic and Extrinsic Motivation, (3) Situational Relevance, (4) Situated Motivational Affordance, (5) Transtheoretical Model of Behavior Change, (6) Universal Design for Learning, and (7) User-Centered Design.

• The Self-Determination Theory, which specifically concerns the concepts of autonomy, competence and relatedness (Ryan and Deci, 2000), is the basis of the framework defined by Aparicio et al. (2012). This framework is divided into the following four steps: (1) identification of the main objective, stating the reasons for the use of gamification; (2) identification of the cross-cutting objective, defining what intrinsically motivating factors the system is intended to provide; (3) determination of the game mechanics, determining which game mechanics will be used on the basis of their relationship with the self-determination concepts, and finally, (4) the evaluation of the framework in the system applied, indicating how to evaluate the framework in those systems.

• Intrinsic and extrinsic motivation theories (Blohm and Leimeister, 2013; Nicholson, 2012) support user-centered frameworks for the so-called meaningful gamification, i.e., gamification based on intrinsic (or internal) motivation rather than extrinsic (or external) motivation. These user-centered frameworks aim to elucidate how gamification can operate on intrinsic and extrinsic motivators to elicit behavioral change and reframe activities such as learning. These frameworks specifically suggest that meaningful game elements are intrinsically motivating, regardless of the external rewards that may be associated with them. In this vein, a value-based gamification framework for designers who aim to foster and leverage intrinsic motivation was defined (Sakamoto et al., 2012). The five values that comprise the framework are: (1) information, as necessary and immediately available; (2) empathic values, drawn from
virtual characters and social involvement; (3) persuasive values, that is, information that provides a forward-looking perspective based on behaviors, actions, and outcomes; (4) economic values, related to charging and ownership; and (5) ideological values, defined as implicit beliefs through storytelling and a variety of message types. This framework is designed to be complementary to other gamification frameworks.

• The Theory of Situational Relevance (Wilson, 1973) is based on the importance that an individual places on a particular situation, although on the situation as s/he perceives it, and not as other people perceive it. Situational Relevance frameworks (Nicholson, 2012) imply that the user should make decisions concerning what is meaningful. The Theory of Situational Relevance should be circumscribed to the details of the situation that affect the individual.

• The Theory of Situated Motivational Affordance (Deterding, 2011) states that motivation is afforded when the relationship between the features of an object and the ability of a subject allows the subject to experience the satisfaction of such needs when interacting with the object. People perform activities if those activities promise to satisfy their motivational needs, such as competence, autonomy, or relatedness. Nicholson (2012) introduces a framework, stressing the need for a correspondence between the user's background and the game setting. In this framework, affordances (e.g., perceived opportunities for action on the elements of the user interface) are mapped onto motivational needs drawn from satisfaction theories of motivation (specifically from the Self-Determination Theory).

• The Transtheoretical Model of Behavior Change (TTM) (Sakamoto et al., 2012) is a theory of intentional change that focuses on the decision-making abilities of the individual rather than social and biological influences on behavior. This model grew from the systematic integration of more than 300 theories of psychotherapy, along with the analysis of the leading theories of behavior change.

• The Theory of Universal Design for Learning (UDL) addresses the need to provide educational resources for a wide spectrum of users, taking into account presentation, activity types and learning paths (Rose and Meyer, 2007).

• The User-Centered Design Theory (Norman, 1989) is a design philosophy that places the user at the center of the experience, and designs iteratively with the user's needs and desires in mind. User feedback is thus essential to specify and refine requirements and designs. Communication between the user and the system is also important, since the user should be aware at all times of the system status and the actions that can be taken.

In conclusion, all gamification frameworks share a core consisting of motivational theories, behavioral change and engagement (Kim, 2021). Furthermore, User-Centered Design is common to all theories (Nicholson, 2012), as it places the user in the foreground of the gamification strategy and can serve as a basis on which to combine other motivation theories.

3. Related work

In a systematic literature review, Wangenheim and Shull (2009) refer to 12 games, in which computer-based simulations led the list of the most used game type for learning. A breakdown of the studies by topics and learning domains reveals that most of them were developed in order to teach software project management skills, but usability is one of the disciplines for which less empirical evidence was found. The usability life cycle, the heuristic evaluation in the software projects development, the evaluation of the usability of a software system and the use of various techniques to address usability in systems are the main usability education related issues addressed by the scientific community (Barreto et al., 2015).

Carrol and Rosson (2006) used case-based learning as an instructional resource for the teaching of usability engineering. They analyzed the proposal that cases can be a minimalist-information design technique, that is, a design technique that (1) focuses on information in order to facilitate user action; (2) anchors information in the activity; (3) prevents, mitigates and leverages errors; and (4) develops user autonomy. These authors discuss a case study in which students performed all the steps related to the usability lifecycle.

With regard to usability audits, Tao (2005) presents an approach with which to integrate usability evaluation into behavior modeling for interactive systems in order to help students introduce usability concepts from the early stages of software development. Furthermore, Wahl (2000) presents a set of steps in which students were separated into groups and asked to develop a library automation system and evaluate the usability of the software developed by another group. In addition, Ludi (2005) proposed a process using a hands-on approach that allowed students to apply various techniques to address usability in systems. Students followed a usability testing process that gave them the opportunity to plan the process and methodology, recruit participants, conduct tests, and analyze test results. Moreover, Faulker and Culwin (2001) used students to conduct a usability study involving 124 users who analyzed a set of web pages and answered a set of questions related to the usability of these pages.

Despite the widespread use of game-playing elements to teach computer literacy, only two serious games, called UsabilityGame (Barreto et al., 2015) and UsabilityCity (Ferreira et al., 2014), were found to teach Jakob Nielsen's 10 usability heuristic rules.

UsabilityGame is a web application designed to complement university teaching in the field of HCI and to explain two processes: (1) the life cycle of usability engineering and (2) Jakob Nielsen's heuristic evaluation from a procedural point of view. The game experience involves two types of roles: teacher and student. The student takes on the role of Usability Engineer, whereas the teacher plays the role of Usability Engineer Leader. The process of evaluation and monitoring of students is guided by the instructor through a specific environment.

UsabilityGame is structured in three stages: (1) Requirements Analysis (the players analyze all scenarios, and from this analysis, define the requirements); (2) Design, Testing and Development of a prototype (given the requirements specification document, the students prototype an interface that meets that specification with a level of fidelity (low, medium or high) to be configured by the teacher using a prototyping tool embedded in the game); and (3) Heuristic Evaluation (given the captures of real interfaces with the problem areas highlighted visually, the students choose from the list of Nielsen heuristics that are being violated in each case).

The elements of gamification that UsabilityGame uses are: (1) interactivity, since the player can move between the game screens and check boxes to indicate their answers using a prototyping tool within the game itself; (2) narrative, because it is developed in videos and images with text balloons and cartoon drawings; (3) levels, i.e., the division of content in phases, and the need to overcome one in order to access the next phase, and finally (4) points and rankings, since the game updates the player's score at the end of each level. The winner is the person who accumulates the most points at the end of the three stages.

UsabilityCity is a web application (available in Portuguese) whose objective is to allow HCI students to learn the Nielsen heuristic evaluation method. The game does not require the intervention of a teacher and is structured in 5 phases, each one for two heuristics. If the two exercises in each phase are correct, the game moves on to the next phase. Some of the elements of gamification presented in UsabilityCity are interactivity, narrative, and levels. Firstly, (1) interactivity is supported, since the players can move between game screens and click onto characters in order to select their answers (each character represents a Nielsen heuristic). Secondly, (2) the narrative is shown by means of images with text balloons and cartoon drawings. The player takes on the role of an inspector (represented by a character with a magnifying glass) who must identify the problems of the so-called "UsabilityCity" in order to improve the lives of its residents (called Users). The heuristics
themselves are personified so as to draw the student's attention. Finally, (3) the levels divide the presentation of the content into unlockable phases.

The consulted bibliography allowed us to conclude that there is little evidence on the use of gamification elements in usability learning. This indicates a need for empirical research related to this topic and more specifically to Jakob Nielsen's heuristic rules.

4. Implementing gamification in Heureka

Alhammad and Moreno (2018) state that there are no systematic approaches with which to gamify software engineering education. Most primary studies analyzed in their systematic mapping reported no formal or structured approach. Seaborn and Fels (2015) point out that 87% of applied gamification research is not based on any theoretical foundation. To avoid this issue, the gamification process proposed by Herzig et al. (2015) was followed in the development of Heureka. These authors believe that gamification can be understood as a software development process. User-Centered Design (Don Norman, 2013) was another of the key methodologies employed when developing Heureka. This process is characterized by the fact that the users play a central role and that the entire design revolves around their interests and needs. Moreover, Nicholson (2012) argues that User-Centered Design can act as a nexus between the theoretical models underlying gamification systems. In this vein, Heureka is based on three theories that have been previously applied in interactive systems gamification: (1) the Self-Determination Theory; (2) the User-Centered Design Theory; and (3) the Operant Conditioning psychological theory (Skinner, 1965). Self-Determination Theory and User-Centered Design provide formal support for the interplay between individual psychological needs and self-motivation, whereas Operant Conditioning is an associative learning process consisting of the development of new behaviors based on their positive or negative consequences.

The gamification process presented by Herzig et al. (2015) is organized into eight workflows comprising numerous tasks and roles. For the sake of simplicity, the four high-level phases shown in Calderón et al. (2018) are used instead to articulate the explanation: (1) Business Modeling and Requirements, in which the application context is analyzed and business goals are documented; (2) Design, in which the gamification design is developed and playtested; (3) Implementation, during which the design is implemented as software artifacts and functionally tested; and (4) Monitoring and Adaptation, during which business goal achievement is measured, and subsequent design adaptations are conducted.

4.1. Business modeling and requirements

As stated by Herzig et al. (2015), several stakeholders and roles participate in gamification at some point of the process: gamification experts, domain experts, business experts, IT experts and end users. The authors of the present work played all roles except that of end users. An agile software development philosophy steered project management activities in Heureka. A Kanban board was used to visualize the workflow and task progression, using cards that move along "swimlanes" (i.e., pending, in progress, done) during the software development process (see Fig. 1).

Fig. 1. An example of visualization of workflow and task progression in iteration 3 using the Kanban Board.

The first step of the gamification process is a key aspect as regards allowing all the stakeholders —with the exception of the end users— to share a common ground in the business processes and to identify the general project goals and end users, who are the final recipients of the gamification effort. In this case, the project goal is to help players learn Jakob Nielsen's 10 heuristic rules of usability (described in Table 1) —although the resulting gamified system should be easy to adapt to different contexts in usability learning—. The end users are students of a User Interfaces subject.

The 10 Usability Heuristics created by Jakob Nielsen for HCI design were proposed in 1990 and can be considered the gold standard when evaluating usability heuristics. These heuristics are rules of thumb that must be adapted to specific interface types. Other sets of usability heuristics have been proposed in literature, which include modifications to Nielsen's heuristics and/or the addition of new heuristics in order to optimally design or evaluate specific aspects of user interfaces not covered by other heuristics. According to one systematic literature review (Quiñones and Rusu, 2017), a total of 68 usability heuristics have been created for specific domains, signifying that Nielsen's heuristics is a must-have material on an HCI course. However, learning such general rules proves to be a real challenge. This learning requires showing students many examples, thus enabling them to relate concrete situations, and move from general guidelines to particular scenarios. These rules have traditionally been learned by using a variety of resources: text, figures and videos (see e.g. Harley (2019)). Heureka was designed to be used without the intervention of an instructor and to require as few resources from the user as possible. Moreover, its gamification approach aims to motivate students by taking into account their user profile: consumer, exploiter, achiever and free spirit user types (cf. SubSection 4.2).

The end users and their use cases were then analyzed to elicit and document requirements that are aligned with the project goals. At this point, the end users were involved, and the participation of all other roles except the IT expert was also necessary. Epics and user stories were used to construct the Heureka requirements specification. These user stories were distributed in iterations on the basis of their priority,
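The requirements workflow above ends by distributing user stories across iterations according to their priority. As a minimal illustration of that planning step (the story texts, priorities and iteration capacity below are invented for illustration and are not taken from the actual Heureka backlog), in JavaScript, the language later named as Heureka's implementation language:

```javascript
// Distribute user stories into fixed-capacity iterations by priority.
// Backlog contents and the capacity of 2 are hypothetical examples.
function planIterations(stories, capacity) {
  // Sort a copy (highest priority first) so the backlog stays untouched.
  const ordered = [...stories].sort((a, b) => b.priority - a.priority);
  const iterations = [];
  for (let i = 0; i < ordered.length; i += capacity) {
    iterations.push(ordered.slice(i, i + capacity));
  }
  return iterations;
}

const backlog = [
  { story: "As a player, I answer heuristic quizzes", priority: 3 },
  { story: "As a player, I see my remaining lives", priority: 2 },
  { story: "As a player, I read heuristic explanations", priority: 1 },
];

const plan = planIterations(backlog, 2);
// plan[0] holds the two highest-priority stories; plan[1] the remainder.
```

Sorting a copy keeps the backlog itself unmodified, so the same stories can be re-planned if priorities change between iterations.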
6 Quests. The most prominent game element in Heureka is the quiz (i.e., questions and answers); quizzes represent the challenges that the player is required to overcome. During a game, the player is presented with a set of 10 questions that must be answered by clicking onto the right alternative (see Fig. 2).
7 Rewards. The system provides positive feedback when the player performs well in the game.
8 Celebrate. There are both minor (i.e., correct answer) and major (i.e., a complete quiz) achievements in the game, which represent the outcomes that the player will celebrate.
9 Punish. Punishment is applied in two ways: (1) when a wrong answer to a question is provided, the system sends the players negative feedback; and (2) when the players lose all their lives, the game ends and the players must start again from the beginning.
10 Win. The players win when they complete the quiz without losing all their lives. Fig. 3 depicts a player's states and transitions in the game.

Of the variety of gamification strategies discussed in literature (Kim, 2021), the categories (1) challenge, (2) compensation and (3) usability are applicable to Heureka. As stated above, (1) the player must undertake a quest within Heureka, which consists of meeting a knowledge challenge presented in the context of the gamified system. The Self-Determination Theory emphasizes the existence of a link between motivation and behavior. The gamification of the system aims to promote the players' motivation, signifying that the players should be motivated by the gamified experience and consequently modify their behavior in order to carry out the planned learning task. Challenge strategies should allow users to carry out achievable tasks while maintaining their motivation (Kim, 2021). With regard to (2) compensation, the player's interest and satisfaction are stimulated by means of a reinforcement schema based on the Operant Conditioning theory. This simple schema facilitates learning by integrating positive (e.g., celebrate, rewards) and negative (e.g., punish, permadeath) game mechanics and elements according to the player's inputs. What is more, in the case of (3) usability, Heureka offers explanations about the usability heuristics if the player needs help. The usability strategy, therefore, enhances the player's adaptability, flexibility of use, and usage behavior (Kim, 2021).

The user type was also part of the design of the gamification strategy employed for Heureka. Marczewski's typology (Klock et al., 2018) was considered, as it describes players according to their motivations for using the gamified system. More specifically, the extended version of this typology was used (Herbert et al., 2014). As a result, the player (especially the consumer and exploiter subtypes), achiever and free spirit user types (Marczewski, 2015; Tondello et al., 2016) were found to fit the proposed gamification strategy. Most users who enter a gamified system do so initially for the extrinsic rewards they can attain. The rewards are, therefore, an important design element in Heureka that meet the players' user-type needs. The fundamental idea is to attempt to convert them from reward-oriented users to intrinsically motivated users, as defined by the self-determination theory. Furthermore, the achiever user type is motivated by mastery: the design elements implemented in Heureka to suit achievers are challenges, learning new skills and quests. Finally, the free spirits are motivated by autonomy, act outside the control of others and enjoy exploration. The exploratory tasks are enabled by the replayability of Heureka, and Heureka's design element is more focused on the requirements of this user type, together with the fact that the system is conceived to be used independently by the players.

The design of the gamification strategy does not explicitly refer to the philanthropic and socializer player types, although elements of these player types are addressed implicitly. This is because Heureka was developed with the aim of supporting the learning of Jakob Nielsen's 10 heuristics, a specific content within the subject of User Interfaces, and elements of these player types are, therefore, addressed outside that context. For example, in User Interfaces classes, the emphasis is on the students being altruistic and helping each other without expecting a reward for it (philanthropic). Furthermore, in class, the students are intrinsically motivated by relationships, enjoying interacting with other students and creating social connections (socializers). These types of players are not mutually exclusive.

4.3. Implementation

Once a gamification strategy has been designed, the execution of several technical activities will crystallize into the final gamified system, namely: provisioning, implementation, testing and deployment. Although domain, gamification or business experts can help clarify or discuss the gamification concept, ICT experts are ultimately responsible for this workflow. There are two main options as regards implementing the gamification strategy: (1) using a general-purpose gamification platform, or (2) creating self-built solutions with which to support gamification. General-purpose gamification platforms can simplify the implementation process at the cost of reduced flexibility and higher
R. Sobrino-Duque et al. International Journal of Human - Computer Studies 161 (2022) 102774
Fig. 3. Heureka interface navigation map, following the notation of Constantine & Lockwood (1999).
integration effort (Calderón et al., 2018). This could be the solution of choice when there is a lack of experience in gamification, gamification complexity is low, or knowledge or resources are insufficient to create a self-built solution. Heureka was conceived as a self-built gamified web application. This decision made it easier to design a tool that would be aligned with the business goals defined, ensure control over the gamification engine, and streamline the processing, controlling and monitoring of the user data generated (Herzig et al., 2012; Maican et al., 2016). This made it possible to modify, change and adjust all aspects of the system to our needs without being limited by the functionality of a generic gamification platform. Heureka has been developed with JavaScript, HTML and CSS. This choice was based on the criteria of design flexibility and a low learning curve. Git was used for version control, in addition to which Handlebars.js (an HTML template engine) and JSON were used as supporting technologies. The HTML code for the usability quizzes presented by the gamified tool was dynamically generated from a database in JSON format (see Fig. 4).

As the construction of the gamified system progressed, it was necessary to carry out the usual validation and verification (V&V) activities. In the tests performed on Heureka, it was possible to verify the requirements with the prototype in operation. This included the technical testing of functional correctness and non-functional attributes, along with verifying design constraints thanks to the cooperation of domain, gamification and business experts. With regard to non-functional attributes, particular attention was paid to accessibility, by testing against the WCAG 2.1 accessibility guidelines. Moreover, QUnit was used as a unit testing framework in JavaScript, Express.js (Node.js) was used for the server code, while Heroku was employed as a cloud computing service for temporary deployment during validation with end users. Once all the tests had been passed, the tool was deployed and its access granted to all end users (available only at https://docentis.inf.um.es:5050/).

4.4. Monitoring and adaptation

When the gamified system is running, the operational end user data is processed and analyzed in order to discover whether the system is successful and to identify any possible modifications. In the case of Heureka, an empirical study was carried out as part of the monitoring tasks, given that the greatest challenge identified in gamification is the use of quantitative and qualitative data to obtain reliable information and adequately guide educators' decision making (Alhammad and Moreno, 2018). In this respect, a new functionality was added to Heureka that allowed the empirical study data of the experiment to be downloaded in JSON format once the users were interacting with the gamified tool. Examples of data that can be collected are the students' percentage of success in each of the heuristics and the average time they spent on each
heuristic. Detailed information on the procedure employed to design and conduct an experiment with which to evaluate our contribution to the use of gamification in usability learning is shown below.

5. Experimental methodology

5.1. Participants

In the academic year 2020/2021, a total of 66 students were enrolled on "User Interfaces", a 4th year, first-term course on the BSc in Computer Engineering at the University of OMITTED FOR REVIEW. Students have to complete a total of 6 ECTS credits, which are distributed into 2 h/week of lectures and 1 h 40 min/week of skills practice during a period of 15 weeks. The objective of this subject is to introduce students to HCI issues, including the development and auditing of usable and accessible user interfaces, paying special attention to the application of standards and style guides in web, mobile and desktop computer applications. The two key terms on this course are usability and accessibility.

The participants were recruited during the teaching of the subject, in which the teacher explained the purpose of the study, its duration and the activities to be carried out, and requested their verbal consent.

5.2. Design

One important aspect is the fact that, owing to the Covid-19 pandemic this year, all classes are being conducted via video conferencing (using Zoom, in our case), a software application that students have been familiar with from the beginning of the course. The experiment was, therefore, carried out remotely by means of video conference.

The students were given prior training in Jakob Nielsen's 10 rules of usability during the first 25 min of the experiment. They were then randomly assigned to two groups: (1) the control group (CG) and (2) the experimental group (EG). A total of 55 students attended the class and were divided into two groups by creating two Zoom rooms, i.e., each student was randomly assigned to one of two rooms. A total of 29 students participated in the CG experiment and 26 in the EG, thus representing a participation of 83.33% of the enrolled students.

The instructor provided the documentation on the Nielsen heuristics to the CG. The CG used a series of materials provided by the teacher that represented the traditional way of teaching Nielsen's principles. As Nielsen's heuristics are well known and widespread, a lot of resources are available on the Internet, and the materials provided to the students were basically a selection obtained from Internet sites. The CG students, therefore, spent 40 min using a set of links that contained information about the 10 Nielsen heuristic rules.

The EG received 10 minutes' training on the functioning of the Heureka tool, after which the group made use of Heureka by playing with the tool as much as they could for 30 min. The EG interacted with Heureka through a link provided in a guidelines document. Once the link was opened, the students recorded a video showing their interaction with the game. At the end of the time given to them, and following the instructions given by the teachers, the students uploaded these videos to the Multimedia Gallery of the User Interfaces subject in the virtual classroom of the University of OMITTED FOR REVIEW.

Finally, students from both groups accessed the exam area of the virtual classroom in order to take a 20-minute test (Exam1). Both groups subsequently had the opportunity to continue practicing during the following week with either the traditional materials or with the Heureka tool, depending on whether they had participated in the CG or the EG. After one week, they repeated the exam in the virtual classroom (Exam2). On this occasion, the students in the experimental group did not have to record videos of their interaction with Heureka.

5.3. Research goals

We shall present the objective of our empirical evaluation by following the recommendations of Basili and Dieter Rombach (1988), which are based on the application of the Goal/Question/Metric (GQM) method. Our goal is thus defined as follows:

• To analyze the Use of gamification in the learning of usability
• for the purposes of Evaluating and Improving the learning of usability
• in terms of the Adequacy of gamifying usability learning tools
• from the point of view of the Researcher
• in the context of Jakob Nielsen's usability heuristics.

The aforementioned goals were considered in order to pose the following research question:

• RQ. How does the use of a gamified tool affect the learning of usability heuristics?

5.4. Variable description

The next step was to decide on the variables to be used to carry out the experiment. Three independent variables were defined. TimePoint represents the points in time at which Exam1 and Exam2 were performed. EducationTool denotes the learning tool used: Heureka (EG) and UsabilityWebs (CG). Time_Inver_H represents the time interval spent on each heuristic: High_TimeInterval (time interval required to answer a heuristic, first tercile), Intermediate_TimeInterval (time interval required to answer a heuristic, second tercile) and Low_TimeInterval (time interval required to answer a heuristic, third tercile).

The dependent variables were the scores of those students who used Heureka, ScoreHeureka (M1); the scores of those students who used the usability webs, ScoreUsabilityWeb (M2); the scores attained by the students in the first exam, ScoreofExam1 (M3); the scores attained by the students in the second exam, ScoreofExam2 (M4); the success rate of the students who used Heureka for each heuristic, Success_Rate (M5); and the difference between the scores attained in the first and second exams, Score_Difference (M6).

5.5. Hypotheses

In order to answer the aforementioned research question, the following hypotheses were defined on the basis of the measures selected in the design of the empirical evaluation:

• H10, Null Hypothesis: Scores of students who used Heureka (M1-ScoreHeureka) are not affected by the timing of the students' evaluation (TimePoint).
• H11, Alternative Hypothesis: Scores of students who used Heureka (M1-ScoreHeureka) are affected by the timing of the students' evaluation (TimePoint).
• H20, Null Hypothesis: Scores of students who used usability websites (M2-ScoreUsabilityWeb) are not affected by the timing of the student's evaluation (TimePoint).
• H21, Alternative Hypothesis: Scores of students who used usability websites (M2-ScoreUsabilityWeb) are affected by the timing of the student's evaluation (TimePoint).
• H30, Null Hypothesis: Scores of the students in the first exam (M3-ScoreofExam1) are not affected by the type of learning tool used (EducationTool).
• H31, Alternative Hypothesis: Scores of the students in the first exam (M3-ScoreofExam1) are affected by the type of learning tool used (EducationTool).
• H40, Null Hypothesis: Scores of the students in the second exam (M4-ScoreofExam2) are not affected by the type of learning tool used (EducationTool).
• H41, Alternative Hypothesis: Scores of the students in the second exam (M4-ScoreofExam2) are affected by the type of learning tool used (EducationTool).
• H50, Null Hypothesis: The success rate of the students who used Heureka (M5-Success_Rate) is not affected by the time they spent answering the heuristics (Time_Inver_H).
• H51, Alternative Hypothesis: The success rate of the students who used Heureka (M5-Success_Rate) is affected by the time they spent answering the heuristics (Time_Inver_H).
• H60, Null Hypothesis: The difference between the scores attained in the first and second exams (M6-Score_Difference) is not affected by the type of learning tool used (EducationTool).
• H61, Alternative Hypothesis: The difference between the scores attained in the first and second exams (M6-Score_Difference) is affected by the type of learning tool used (EducationTool).

The objective of each hypothesis is explained as follows:

• Hypothesis H1. Attempts to discover the extent to which Heureka enables students to retain, in the short term, the knowledge learned as regards Jakob Nielsen's 10 Usability Heuristics.
• Hypothesis H2. Attempts to discover to what extent traditional usability learning allows students to retain, in the short term, the knowledge learned as regards Jakob Nielsen's 10 Usability Heuristics.
• Hypothesis H3. Attempts to discover the extent to which the type of learning approach used enables students to achieve better knowledge of Jakob Nielsen's 10 Usability Heuristics immediately after the intervention.
• Hypothesis H4. Attempts to examine the extent to which the type of learning approach used enables students, in the short term, to achieve better knowledge of Jakob Nielsen's 10 Usability Heuristics.
• Hypothesis H5. Attempts to explore which heuristics are more difficult to learn with the gamified approach and to discover whether this is related to the time the students required in order to understand them.
• Hypothesis H6. Attempts to ascertain which learning approach allows students to achieve a greater increase in knowledge as regards Jakob Nielsen's 10 Usability Heuristics.

5.6. Statistical analysis

The data gathered was analyzed using the SPSS 24.0 statistical software package and Microsoft Office Excel 2016. The Shapiro-Wilk W statistical test was applied in order to verify whether the scores in the two tasks had a normal distribution. A standard level of significance (0.05) was selected so as to reject the null hypothesis. The Levene test was used to verify variance homogeneity.

Wilcoxon and Student's t-tests for paired samples were used to search for any significant differences in the scores obtained by those students who made use of the usability websites and the Heureka tool in the first and second exams.

The Student's t-test for unpaired samples was used to discover any significant differences between the scores attained by those students who used the usability websites and those who used the Heureka tool in the first exam. Furthermore, the Mann-Whitney test was used to discover any significant differences between the scores attained by those students who used the usability websites and those who made use of the Heureka tool in the second exam. The Mann-Whitney test was also used to discover whether the learning tool had an effect on the differences between the scores attained in the first and second exams.

Finally, the Kruskal-Wallis test for unpaired samples was used to discover whether the time spent answering questions (high, intermediate and low time intervals) had an impact on the success rates of the heuristics.

6. Results

6.1. Participants' characteristics

A demographic analysis of the 55 computer science students showed that 87.27% of the participants were male (n = 48) and 12.73% were female (n = 7). The population sample had a similar academic background, in the 20- to 21-year-old age range.

6.2. Descriptive statistics

Table 1 shows that the CG students scored better in the first exam than in the second exam, while the scores attained by the EG students in the second exam were slightly higher than those attained in the first exam.

Table 2 shows the percentage of success of the students in the experimental group for each of the heuristics. The study revealed that the heuristics for which the students obtained better results, considering the data obtained after the teacher had explained the Nielsen heuristics, were h8 (85.45%) and h3 (80.35%), while heuristics h6 (55.35%) and h1 (63.79%) were the worst performers.

Upon considering the results obtained by the students from the control and experimental groups in both the first and second examinations, it was determined that the heuristics with the highest percentage of success in the control group were h2 and h8, while those with the worst results were h5 and h7. In the experimental group, the heuristics with the highest percentage of success were h2, h3 and h8, while heuristics h5 and h4 obtained the worst results. Table 3 shows the results obtained by the students in the control and experimental groups after the first and second tests.

The study revealed that the heuristics on which the students spent most time were h5 (A = 168.98 s) and h9 (A = 162.87 s), while the heuristics on which the students spent least time were h2 (A = 133.29 s) and h8 (A = 144.96 s). Table 4 shows the average time the students spent on each heuristic.

In order to determine whether the heuristics on which most and least time was spent influenced the students' success rate, Table 5 was organized by considering the average time that the students took to respond to each heuristic.

Table 6 shows the descriptive statistics (number of heuristics, mean, median, and standard deviation) obtained for success rate.

Table 7 shows the descriptive statistics (number of students, mean, median, and standard deviation) obtained for mean difference. On average, the students in the experimental group (using Heureka) achieved better results than those in the control group.

6.3. Data analysis

6.3.1. H1: TimePoint – ScoreHeureka (M1)
The result obtained by the related-samples t-test revealed that, considering the means of the two exams, there was no statistically significant improvement as regards the scores obtained by the students in the period of time just after being taught the 10 Jakob Nielsen usability heuristics (Exam1) and those obtained after the experiment was repeated one week later (Exam2), with the statistical result of t(25) = −0.75, p = 0.46.

Table 1
Descriptive statistics for student scores. "N": sample size; "M": mean; "Md": median; "SD": standard deviation.

Variables | N | M | Md | SD
ScoreUsabilityWeb:
Exam1 | 29 | 6.65 | 6.01 | 1.93
Exam2 | 29 | 6.44 | 7.34 | 2.32
ScoreHeureka:
Exam1 | 26 | 6.29 | 6.67 | 2.15
Exam2 | 26 | 6.57 | 6.34 | 1.69
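The related-samples t statistic reported in Section 6.3.1 can be reproduced with a few lines of code. SPSS 24.0 performed the actual analysis, so the following self-contained JavaScript routine is only an illustrative sketch, applied here to invented scores rather than the study's raw data (which is not published):

```javascript
// Paired-samples t statistic: t = mean(d) / (sd(d) / sqrt(n)), with df = n - 1,
// where d holds the per-student differences between the two exams.
function pairedT(x, y) {
  if (x.length !== y.length || x.length < 2) throw new Error("need two equal-length samples");
  const d = x.map((xi, i) => y[i] - xi);   // Exam2 minus Exam1, per student
  const n = d.length;
  const mean = d.reduce((a, b) => a + b, 0) / n;
  const ss = d.reduce((a, b) => a + (b - mean) ** 2, 0);
  const sd = Math.sqrt(ss / (n - 1));      // sample standard deviation of the differences
  return { t: mean / (sd / Math.sqrt(n)), df: n - 1 };
}

// Illustrative Exam1/Exam2 scores for five students (NOT the experiment's data).
const exam1 = [6.0, 5.5, 7.0, 6.5, 8.0];
const exam2 = [6.5, 6.0, 6.5, 7.0, 8.5];
const { t, df } = pairedT(exam1, exam2);   // t = 1.5, df = 4 for this toy data
```

A two-tailed p-value would then be obtained from the t distribution with df degrees of freedom, which is the step a statistical package performs internally.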
Table 2
Results obtained by students from the experimentation group after interacting with Heureka.

Heuristics | Matches | Total | Success rate
h1 - Visibility of system status | 37 | 58 | 63.79
h2 - Match between system and the real world | 70 | 97 | 72.16
h3 - User control and freedom | 45 | 56 | 80.35
h4 - Consistency and standards | 79 | 102 | 77.45
h5 - Error prevention | 36 | 56 | 64.28
h6 - Recognition rather than recall | 62 | 112 | 55.35
h7 - Flexibility and efficiency of use | 35 | 51 | 68.62
h8 - Aesthetic and minimalist design | 47 | 55 | 85.45
h9 - Help users recognize, diagnose, and recover from errors | 81 | 101 | 80.19
h10 - Help and documentation | 65 | 90 | 72.22

Table 3
Results obtained by the students from the control and experimental groups in the first and second exams.

Heuristics | Matches: First Exam (CG) | Matches: First Exam (EG) | Matches: Second Exam (CG) | Matches: Second Exam (EG) | Success rate: First Exam (CG) | Success rate: First Exam (EG) | Success rate: Second Exam (CG) | Success rate: Second Exam (EG)

Table 4
Heuristics on which the students from the experimentation group spent most and least time.

Heuristics | Total submissions | Seconds | Average
h1 | 58 | 8572 | 147.79
h2 | 97 | 12,930 | 133.29
h3 | 56 | 8781 | 156.80
h4 | 102 | 15,849 | 155.38
h5 | 56 | 9463 | 168.98
h6 | 112 | 17,566 | 156.83
h7 | 51 | 7770 | 152.35
h8 | 55 | 7973 | 144.96
h9 | 101 | 16,450 | 162.87
h10 | 90 | 13,148 | 146.08

Table 5
Heuristics organized according to the average time spent by students on each heuristic.

Heuristics | Average | Success Rate
Heuristics on which most time was spent:
h5 | 168.98 | 64.28
h9 | 162.87 | 80.19
h6 | 156.83 | 55.35
Heuristics on which an intermediate amount of time was spent:
h3 | 156.80 | 80.35
h4 | 155.38 | 77.45
h7 | 152.35 | 68.62
h1 | 147.79 | 63.79
Heuristics on which least time was spent:
h10 | 146.08 | 72.22
h8 | 144.96 | 85.45
h2 | 133.29 | 72.16

Table 6
Descriptive statistics for success rate. "N": number of heuristics; "M": mean; "Md": median; "SD": standard deviation.

Success_Rate | N | M | Md | SD
Most_Time Spent | 3 | 66.60 | 64.28 | 12.58
Intermediate_Time | 4 | 72.55 | 73.03 | 7.68
Least_Time Spent | 3 | 76.61 | 72.22 | 7.65

Table 7
Descriptive statistics for mean difference. "N": number of students; "M": mean; "Md": median; "SD": standard deviation.

6.3.2. H2: TimePoint – ScoreUsabilityWeb (M2)
In order to discover whether the scores of those students who learned by employing usability webs (M2-ScoreUsabilityWeb) were affected by the timing of the students' evaluation (TimePoint), a Wilcoxon signed-rank test was performed for the variable ScoreUsabilityWeb. The results obtained did not show a statistically significant change between the scores obtained by the students just after being taught the 10 Jakob Nielsen usability heuristics (Exam1) and those attained after the experiment was repeated one week later (Exam2), with the statistical result of Z = −0.09 and p = 0.99.

6.3.3. H3: EducationTool – ScoreofExam1 (M3)
An independent-samples test (Student's t) for the variable ScoreofExam1 was used to compare the scores obtained by the CG students with those from the EG in Exam1. The results revealed that there was no statistically significant difference between the two groups, with the statistic t(53) = 0.66 and p = 0.52.

6.3.4. H4: EducationTool – ScoreofExam2 (M4)
Although the EG students scored better in the second test (Exam2), the differences are not sufficiently statistically significant to be able to state that the students in one group achieved greater knowledge of Jakob Nielsen's heuristics than those in the other group. A Mann-Whitney U test for the ScoreofExam2 variable was applied to Exam2. The results revealed that there was no significant difference between the CG and the EG. In this case, the statistic obtained was U = 365 and p = 0.84.
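The per-heuristic figures reported in Tables 2 and 4 are simple ratios over the interaction logs that Heureka exports in JSON format (cf. Section 4.4). A minimal sketch of the computation, checked against the published h1 counts, is:

```javascript
// Success rate (Table 2) = matches / total submissions * 100.
// Average time (Table 4) = total seconds / total submissions.
function heuristicStats({ matches, submissions, seconds }) {
  return {
    successRate: (matches / submissions) * 100,
    avgSeconds: seconds / submissions,
  };
}

// h1 "Visibility of system status", using the counts published in Tables 2 and 4.
const h1 = heuristicStats({ matches: 37, submissions: 58, seconds: 8572 });
// h1.successRate ≈ 63.79 and h1.avgSeconds ≈ 147.79, matching the tables.
```

The field names are only an assumption for illustration; the actual schema of Heureka's exported data is not given in the paper.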
6.3.5. H5: Time_Inver_H – Success_Rate (M5)
[…] heuristic, χ2(2) = 1.47 and p = 0.48, with a mean percentage of success of 66.61 for High_TimeInterval, 72.55 for Intermediate_TimeInterval and 76.61 for Low_TimeInterval.

6.3.6. H6: EducationTool – Score_Difference (M6)
In order to determine whether the difference between the control group's scores in the first and second tests was greater than the difference between the scores attained by the students in the experimental group, an independent-samples test (Mann-Whitney U) was used for the variable Score_Difference. The results revealed that there was no statistically significant difference. The statistic U = 402.50 and p = 0.66.

6.4. Survey

A questionnaire concerning the participation in the experiment was filled out by the students. The aim of this survey was to collect feedback regarding the students' perceptions with respect to their experience with Heureka. A total of 14 questions were administered, employing a five-point Likert-type scale (1 = Completely disagree; 2 = Disagree; 3 = Neither agree nor disagree; 4 = Agree; 5 = Completely agree).

The questionnaire was designed by keeping in mind the Technology Acceptance Model (TAM) (Al-Qaysi et al., 2021), which helps to put the results into a perspective of technology adoption. The TAM highlights the need to be conscious of the socially constructed processes in which tools are deployed and used on a daily basis (Al-Qaysi et al., 2021). This is because, when users are presented with new technology, a number of factors, in particular perceived usefulness and perceived ease of use, influence their decision regarding how and when they will use this new technology. Table 8 shows the questions, means, standard deviations and medians for the students' answers.

Table 8
Means, standard deviations and medians of students' perceptions. "M": mean; "SD": standard deviation; "Md": median.

Id | Question | M | SD | Md
Block 1: Attitude of use
Q1 | I would use the Heureka tool if I needed to study Jakob Nielsen's heuristics. | 4 | 0.73 | 4
Q2 | I would use the Heureka tool to improve my performance in the subject User Interfaces. | 3.7 | 0.66 | 4
Block 2: Intention of use
Q3 | I would recommend the Heureka tool to future UI students. | 3.85 | 0.93 | 4
Q4 | I would recommend the Heureka tool in order to learn Jakob Nielsen's heuristics. | 4.2 | 0.83 | 4
Block 3: Perceived ease of use
Q5 | In general, I found the application intuitive. | 4.35 | 0.75 | 4.5
Q6 | I easily realized that I could use the 'View Definition' button to get help. | 3.95 | 1.00 | 4
Q7 | I easily realized that I had to press the 'Confirm Answer' button to set my final answer. | 4.4 | 0.82 | 5
Q8 | I easily noticed that some questions asked for an example that did not comply with the heuristics. | 3.8 | 1.06 | 4
Q9 | I knew for sure how many mistakes I could make at any given time. | 3.6 | 1.10 | 3.5
Q10 | I easily noticed when I got a question right or wrong. | 4.3 | 0.80 | 4.5
Q11 | When I missed a question, I easily realized the right one. | 4.1 | 0.72 | 4
Block 4: Perceived utility
Q12 | Using Heureka would make it easier for me to study Jakob Nielsen's heuristics. | 4.15 | 0.88 | 4
Q13 | I would have preferred to use the Heureka tool in class in order to learn about Jakob Nielsen's heuristics, instead of listening to the teacher's verbal explanation. | 3.5 | 1.15 | 4
Q14 | I would have preferred to have used the Heureka tool to review Jakob Nielsen's heuristics before the exam, instead of studying the slides. | 3.9 | 0.97 | 4

7. Discussion and lessons learned

7.1. Discussion

There is little consensus as to whether gamification has positive effects on performance. Previous studies have reported mixed findings. The gamification component is believed to be effective in enhancing students' motivation and improving their learning experience, engagement and performance (Legaki et al., 2020). Several studies (Manzano-León et al., 2021) have shown that the addition of game mechanics (such as badges, levels and leaderboards) has a positive effect on learner engagement. However, Zainuddin et al. (2020) claim that critics have argued that these mechanics create only extrinsic motivation, not intrinsic motivation; that is, learners complete a task simply to earn a badge, not for the satisfaction of gaining new knowledge and skills. An increase in performance was observed in the Heureka group between Exam1 and Exam2, over the period of a week, although this improvement was not statistically significant (H1). The Heureka group students played during the week and thus probably remembered the heuristics better thanks to repetition. This is partly in line with the results obtained in an empirical study carried out by Barreto et al. (2015), which compared the results of the pretest and post-test for students using UsabilityGame (experimental group) and the Monopoly board game (control group). Statistically significant improvements were found as regards learning the concepts of usability. This means that those students who used UsabilityGame learned heuristic evaluation concepts better than those who used the Monopoly board game.

Nevertheless, the results shown in Section 6 indicate that the use of Heureka did not help the students gain better academic results in Exam1 and Exam2 (hypotheses H3 and H4). We believe that the reason for these results lies in the fact that the students were exposed to the game for only a short amount of time. Another reason could be that some Heureka cards may recall several of Nielsen's heuristics at the same time, which could have created confusion in the learners. An exploration-based learning tool such as Heureka could be an intuitive and effective approach in domains less prone to subjective evaluation by the students. For instance, this is the case of teaching software process improvement, such as SPICE (Software Process Improvement Capability dEtermination) (Dorling and McCaffery, 2012). What is more, another factor to consider is that gamification may have less impact on usability heuristics than occurs in other disciplines, as the material and examples used in the traditional teaching of Nielsen heuristics can also be enjoyable.

These results show that further research is required in order to add new gamified elements and analyze which of them are most influential so as to achieve a better learning performance. Note that Heureka does not employ the game elements most frequently used in the literature, such as badges and leaderboards, which will be included in future work. This is because evidence has been found concerning a positive effect on academic performance in teaching programming fundamentals by using the "Clara" framework gamified with star ratings, badges and challenges (Bogdanovych and Trescak, 2016). A positive impact on academic performance was also evidenced when using the UDPiler compiler to teach C programming in comparison to using a non-gamified platform (Marín et al., 2019). UDPiler is gamified with points, badges and leaderboards. De Marcos et al. (2014) ascertained that a gamification learning approach improved academic achievement in practical assignments. However, a traditional e-learning approach was better for students in terms of knowledge. The result obtained in De Marcos' study could have been owing to the fact that the UDPiler compiler was not easy to use, as reported by the students in an attitudinal survey. Note that perceived usefulness and perceived ease of use are considered key variables that explain outcome measures such as performance (Marangunić and Granić, 2015). The positive effects of gamification may be blunted by a poorly designed tool.

Although the hypotheses test did not find a statistically significant
difference in the percentage of success between time interval groups, to the designer and guide educators’ decision-making. A video
extreme behavior in the percentage of success was observed for three heuristics. First, two heuristics attained lower success rates and needed more time to be answered, namely h5 – “Error prevention” and h6 – “Recognition rather than recall”. Extensive work can be found on tips to improve both the h5 and h6 heuristics, but these tips may be difficult to translate into Heureka cards: (1) on the one hand, in the case of h5, one of the main goals of a well-designed user interface is to prevent interaction problems, thus promoting error prevention by eliminating error-prone conditions, or by checking for them and presenting users with confirmation dialogs, so as to avoid “unconscious slips” and “conscious mistakes” (Sherwin, 2019). Norman identifies two categories of user errors (Laubheimer, 2015): slips occur “when a user is on autopilot, and takes the wrong actions in service of a reasonable goal”; mistakes occur “when a user has developed a mental model of the interface that is not correct, and forms a goal that does not suit the situation well”; (2) on the other hand, with regard to h6, some useful tips include providing easy access to the history and previously visited content, along with visible and intuitive interfaces (Budiu, 2014). According to Dix et al. (2004), one way in which to achieve the latter is by ensuring that the interface is synthesizable, i.e., users must be able to evaluate the effect of previous operations on the current state. As stated above, it would appear that some of these tips cannot be easily translated into static content, i.e., Heureka cards; if these heuristics are to be illustrated, dynamic visual resources are required. In contrast, heuristic h8 – “Aesthetic and minimalist design” attained the best success rate and required the least amount of time to be solved: this heuristic is intuitively well suited to representation on static cards.

With regard to subjective perceptions, the students pointed out that the Heureka tool is recommendable as regards learning Jakob Nielsen’s heuristics (M = 4.20). They also perceived Heureka to be intuitive (M = 4.35). The utility perceived by the subjects in our experiment confirms the findings obtained with UsabilityGame (Barreto et al., 2015). In our survey, the students affirmed that using Heureka would make it easier to study Jakob Nielsen’s heuristics (M = 4.15). Positive perceptions were also found in the UsabilityGame experiment, in which around 65% of the students responded that they strongly or partially agreed with the utility of the game as regards teaching the evaluation of heuristics. Moreover, more than 80% of the students believed that the idea of teaching usability through UsabilityGame was adequate. Motivation was similarly highlighted by more than 75% of the students who used UsabiliCity (Ferreira et al., 2014). These results are also similar to those of gamified experiences in other settings, such as UDPiler (Marín et al., 2019), in which more than 80% of the students stated that UDPiler helped them to obtain better results.

7.2. Lessons learned

Based on the experience reported in this paper, this section synthesizes some lessons learned in relation to the development of a gamified learning system in the domain of usability and user interface interaction.

On the software engineering methodology

• Adopt user-centered design to address the students’ interests and needs. The system must engage students’ attention for learning to be effective.
• Employ agile software development based on user stories, which reduces development time and makes it possible to provide the trainer with a non-complex gamified system in a short period of time.
• Select the best implementation strategy: either a general-purpose gamification platform, so as to reduce costs, or a self-built application, so as to achieve greater integration and flexibility. This decision should be based on previous knowledge, the available resources and the pre-established objectives.
• Develop a robust tracking and data collection system in advance. Data on the use of the gamified system is crucial to provide feedback. A recording tool, especially in the case of a subject on user interfaces, is an interesting instrument with which to analyze student interactions.

On the gamification process

• Follow a gamification process that gives the team a clear direction, helping to approach the problem in a systematic way in order to integrate the various elements involved (stakeholders, theories, methodologies and technologies) and to manage the activities necessary for the success of the project.
• Take into account all the key elements needed to design the gamification strategy, such as game elements, mechanics, user type, context and motivation. A conceptual framework that helps designers cover these principles, such as Gamicards (c.f. Section 4.2), can be useful.
• Determine the type of user targeted by the gamification effort in order to fine-tune the design of the gamification strategy and the design elements to be included in the gamified system.
• Plan sufficient time in which to validate and review the questions and answers used in the game. The usability domain is prone to questions that may later prove ambiguous, in the sense that these questions may refer to more usability issues than originally expected, so all the questions should be carefully reviewed by a team, especially if they are self-correcting multiple-choice questions, in which students are not allowed to reason their answers.

8. Threats to the validity of the study

The effect of different threats to validity has been analyzed in the present research, focusing on threats to internal, external and conclusion validity.

8.1. Internal validity

During the selection process, the students who participated in the experiment received prior preparation on Nielsen’s 10 Usability Heuristics for User Interface Design, thus counteracting the internal threat to validity that may be caused by the effect of the confounding variable related to prior knowledge and experience of this topic. In addition, the effect of this and other possible confounding variables, such as: (1) students’ motivation, (2) students’ personality and (3) students’ fatigue, was controlled by randomly forming the groups of students in the design of the experiment.

8.2. External validity

Students may feel overwhelmed when taking extensive exams, thus resulting in the fatigue effect. An exam comprising only 10 questions was, therefore, prepared, with one question for each heuristic. However, the small size of this experimental object could have been a threat to the external validity of the results. The size and complexity of the exam had to be limited, as it took place within a time-constrained HCI undergraduate course.

Internal consistency, construct validity and reliability were not quantified in the questionnaire used in the experiment. In order to mitigate the threat to construct validity, several reviews of the content of the questionnaire were carried out by three professionals in the field of usability engineering, and the TAM (Al-Qaysi et al., 2021) was used to discover the acceptance of the technology. In this regard, one of the instructors has ten years of experience in teaching usability, which could add reliability to the instrument developed. Errors related to some questions were identified and corrected in these reviews. In addition, we avoided writing negatives or double negatives in the survey questions, as respondents tend to spend a lot of time figuring out whether they agree or disagree with such questions. However, there is a general tendency toward assent rather than dissent (acquiescence) (J. M. Johnson et al., 2011),
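Some of the h5 advice above is easier to demonstrate interactively than on a static card. The following minimal Python sketch (all names are hypothetical and illustrative, not part of Heureka) shows the two defenses just described: constraining input to eliminate error-prone conditions, which guards against mistakes, and confirming an irreversible action, which catches slips:

```python
# Illustrative sketch of Nielsen's h5 ("Error prevention"):
# eliminate error-prone conditions and confirm destructive actions.

def choose_file(files: list, requested: str) -> str:
    """Constrain input to existing files, preventing 'mistakes'
    caused by a wrong mental model of what is on disk."""
    if requested not in files:
        raise ValueError(f"'{requested}' is not one of: {', '.join(files)}")
    return requested

def delete_file(name: str, confirm) -> bool:
    """Ask for confirmation before an irreversible action,
    catching 'slips' made on autopilot."""
    if confirm(f"Permanently delete '{name}'? [y/N] "):
        # ... the real deletion would happen here ...
        return True
    return False

# Usage: the confirmation callback is injected so it can be tested.
files = ["report.pdf", "draft.txt"]
name = choose_file(files, "draft.txt")
deleted = delete_file(name, confirm=lambda prompt: False)  # user declines
```

Injecting the confirmation callback keeps the destructive step testable; a real interface would supply an actual dialog instead.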
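As an illustration of the exam design just described, one question per heuristic to keep the test short, the following sketch assembles such an exam from a question bank; the bank, identifiers and helper are hypothetical, not the actual instrument used in the study:

```python
import random

# Hypothetical question bank (illustrative ids, not the study's items):
# heuristic id -> candidate questions.
BANK = {
    "h1": ["q1a", "q1b"],
    "h2": ["q2a", "q2b"],
    # ... entries for h3..h9 elided ...
    "h10": ["q10a"],
}

def build_exam(bank: dict, rng: random.Random) -> list:
    """Draw exactly one question per heuristic, in heuristic order,
    keeping the exam short to limit the fatigue effect."""
    ordered = sorted(bank.items(), key=lambda kv: int(kv[0][1:]))
    return [rng.choice(questions) for _, questions in ordered]

exam = build_exam(BANK, random.Random(7))
assert len(exam) == len(BANK)  # one item per heuristic
```

A fixed seed makes the draw reproducible, so every participant can receive the same exam.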
R. Sobrino-Duque et al. International Journal of Human - Computer Studies 161 (2022) 102774
signifying that the mean of all responses may tend toward the side of agreement.

Finally, when experiments are conducted with students, the external validity may also be threatened; if usability engineering professionals were used, the representativeness of the participants would be improved. Nevertheless, controlled experiments provide insights into issues and problems that can later be considered in industrial case studies (Arisholm et al., 2006). As suggested by Carver et al. (2003), the results obtained through empirical studies conducted with students are relevant to the progress of Software Engineering (Salman et al., 2015).

8.3. Conclusion validity

Fathian, 2020) (Klock et al., 2020).

CRediT authorship contribution statement

Raimel Sobrino-Duque: Methodology, Formal analysis, Writing – original draft, Writing – review & editing. Noelia Martínez-Rojo: Conceptualization, Software. Juan Manuel Carrillo-de-Gea: Methodology, Writing – review & editing. Juan José López-Jiménez: Data curation. Joaquín Nicolás: Conceptualization, Methodology, Writing – review & editing, Funding acquisition. José Luis Fernández-Alemán: Methodology, Validation, Formal analysis, Writing – review & editing, Funding acquisition.
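The acquiescence tendency just mentioned can be made concrete with a toy calculation; the helper functions and data below are illustrative only and do not reproduce the study's analysis:

```python
# Illustrative check for acquiescence in 5-point Likert data:
# if respondents tend to agree regardless of content, the grand
# mean drifts above the scale midpoint.

MIDPOINT = 3  # on a 1-5 scale

def reverse_score(response: int, scale_max: int = 5) -> int:
    """Map agreement with a negatively worded item back onto the
    positive direction: 1 <-> 5, 2 <-> 4, 3 stays 3."""
    return scale_max + 1 - response

def acquiescence_shift(responses: list) -> float:
    """Mean deviation from the scale midpoint; positive values
    suggest a tendency toward assent."""
    return sum(responses) / len(responses) - MIDPOINT

agreeable = [4, 5, 4, 4, 5, 4]
print(round(acquiescence_shift(agreeable), 2))  # prints 1.33
```

reverse_score shows the usual countermeasure when negatively worded items are used, a design that the survey deliberately avoided by wording all items positively.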
References

Carrion, M., Santorum, M., Flores, H., Aguilar, J., Perez, M., 2019. Serious game, gamified applications, educational software: a comparative study. In: Proceedings - 2019 International Conference on Information Systems and Software Technologies, ICI2ST 2019, pp. 55–62. https://doi.org/10.1109/ICI2ST.2019.00015.
Carroll, J.M., Rosson, M.B., 2006. Case studies as minimalist information. IEEE Trans. Prof. Commun. 49 (4), 297–310. https://doi.org/10.1109/TPC.2006.885836.
Carver, J., Jaccheri, L., Morasca, S., Shull, F., 2003. Issues in using students in empirical studies in software engineering education. In: Proceedings - International Software Metrics Symposium, pp. 239–249. https://doi.org/10.1109/METRIC.2003.1232471.
Constantine, L., Lockwood, L., 1999. Software for Use: A Practical Guide to the Models and Methods of Usage-Centered Design. Addison-Wesley.
Crittenden, W.F., Biel, I.K., Lovely, W.A., 2019. Embracing digitalization: student learning and new technologies. J. Market. Educ. 41 (1), 1–10. https://doi.org/10.1177/0273475318820895.
Dale, S., 2014. Gamification: making work fun, or making fun of work? Bus. Inf. Rev. 31 (2), 82–90. https://doi.org/10.1177/0266382114538350.
De-Marcos, L., Domínguez, A., Saenz-De-Navarrete, J., Pagés, C., 2014. An empirical study comparing gamification and social networking on e-learning. Comput. Educ. 75, 82–91. https://doi.org/10.1016/j.compedu.2014.01.012.
Deterding, S., 2011. Situated motivational affordances of game elements: a conceptual model. In: CHI 2011 Workshop “Gamification”, January 2011, pp. 2–4. http://gamification-research.org/chi2011/papers.
Deterding, S., Dixon, D., Khaled, R., Nacke, L., 2011. From game design elements to gamefulness: defining “gamification”. In: Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments, MindTrek 2011, pp. 9–15.
Dicheva, D., Dichev, C., Agre, G., Angelova, G., 2015. Gamification in education: a systematic mapping study. Educ. Technol. Soc. 18 (3), 75–88.
Dix, A., Finlay, J., Abowd, G.D., Beale, R., 2004. Human–Computer Interaction. Pearson Education.
Domínguez, A., Saenz-De-Navarrete, J., De-Marcos, L., Fernández-Sanz, L., Pagés, C., Martínez-Herráiz, J.J., 2013. Gamifying learning experiences: practical implications and outcomes. Comput. Educ. 63, 380–392. https://doi.org/10.1016/j.compedu.2012.12.020.
Dorling, A., McCaffery, F., 2012. The gamification of SPICE. Commun. Comput. Inf. Sci. 290 CCIS, 295–301. https://doi.org/10.1007/978-3-642-30439-2_35.
Dos Santos, L.A., Souza, M.R.D.A., Figueiredo, E., Dayrell, M., 2018. Game elements for learning programming: a mapping study. In: CSEDU 2018 - Proceedings of the 10th International Conference on Computer Supported Education, 2, pp. 89–101. https://doi.org/10.5220/0006682200890101.
Faulkner, X., Culwin, F., 2001. The internet as a resource for teaching non-internet topics. In: Proceedings - 2001 Symposium on Applications and the Internet Workshops, SAINT 2001, pp. 51–55. https://doi.org/10.1109/SAINTW.2001.998209.
Ferreira, B.M., Rivero, L., Lopes, A., Marques, A.B., Conte, T., 2014. UsabiliCity: Um Jogo de Apoio ao Ensino de Propriedades de Usabilidade de Software Através de Analogias. In: Anais do XXV Simpósio Brasileiro de Informática na Educação (SBIE 2014), 1, pp. 1273–1282. https://doi.org/10.5753/cbie.sbie.2014.1273.
Ferro, L.S., 2021. The Game Element and Mechanic (GEM) framework: a structural approach for implementing game elements and mechanics into game experiences. Entertain. Comput. 36 (July 2020), 100375. https://doi.org/10.1016/j.entcom.2020.100375.
Ferro, L.S., Walz, S.P., Greuter, S., 2014. Gamicards - an alternative method for paper-prototyping the design of gamified systems. In: International Conference on Entertainment Computing, 8770, pp. 11–18. https://doi.org/10.1007/978-3-662-45212-7_2.
Gordon, N., Brayshaw, M., Grey, S., 2013. Maximising gain for minimal pain: utilising natural game mechanics. ITALICS Innov. Teach. Learn. Inf. Comput. Sci. 12 (1), 27–38. https://doi.org/10.11120/ital.2013.00004.
Gounaridou, A., Siamtanidou, E., Dimoulas, C., 2021. A serious game for mediated education on traffic behavior and safety awareness. Educ. Sci. 11 (3), 127. https://doi.org/10.3390/educsci11030127.
Guerino, G.C., Valentim, N.M.C., 2020. Usability and user experience evaluation of natural user interfaces: a systematic mapping study. IET Software 14 (5), 451–467. https://doi.org/10.1049/iet-sen.2020.0051.
Haaranen, L., Ihantola, P., Hakulinen, L., Korhonen, A., 2014. How (not) to introduce badges to online exercises. In: SIGCSE 2014 - Proceedings of the 45th ACM Technical Symposium on Computer Science Education, pp. 33–38. https://doi.org/10.1145/2538862.2538921.
Hakulinen, L., Auvinen, T., Korhonen, A., 2013. Empirical study on the effect of achievement badges in TRAKLA2 online learning environment. In: Proceedings - 2013 Learning and Teaching in Computing and Engineering, LaTiCE 2013, pp. 47–54. https://doi.org/10.1109/LaTiCE.2013.34.
Harley, A., 2019. Usability Heuristic 1: Visibility of System Status (Video). Nielsen Norman Group. https://www.nngroup.com/videos/usability-heuristic-system-status/.
Herbert, B., Charles, D., Moore, A., Charles, T., 2014. An investigation of gamification typologies for enhancing learner motivation. In: Proceedings - 2014 International Conference on Interactive Technologies and Games, ITAG 2014, pp. 71–78. https://doi.org/10.1109/iTAG.2014.17.
Herzig, P., Ameling, M., Schill, A., 2012. A generic platform for enterprise gamification. In: Proceedings of the 2012 Joint Working Conference on Software Architecture and 6th European Conference on Software Architecture, WICSA/ECSA 2012, pp. 219–223. https://doi.org/10.1109/WICSA-ECSA.212.33.
Herzig, P., Ameling, M., Wolf, B., Schill, A., 2015. Implementing gamification: requirements and gamification platforms. Gamification Educ. Bus., 431–450. https://doi.org/10.1007/978-3-319-10208-5.
Hill, J.M.D., Ray, C.K., Blair, J.R.S., Carver, C.A., 2003. Puzzles and games: addressing different learning styles in teaching operating systems concepts. In: SIGCSE Bulletin (Association for Computing Machinery, Special Interest Group on Computer Science Education), pp. 182–186. https://doi.org/10.1145/792548.611964.
Huotari, K., Hamari, J., 2012. Defining gamification - a service marketing perspective. In: Proceedings of the IADIS International Conference Interfaces and Human Computer Interaction 2012, IHCI 2012, Proceedings of the IADIS International Conference Game and Entertainment Technologies 2012, pp. 17–22. https://doi.org/10.1145/2393132.2393137.
ISO 9241, 2018. Ergonomics of human-system interaction — Part 11: Usability: definitions and concepts. https://www.iso.org/obp/ui/#iso:std:iso:9241:-11:ed-2:v1:en.
Johnson, J.M., Bristow, D.N., Schneider, K.C., 2011. Did you not understand the question or not? An investigation of negatively worded questions in survey research. J. Appl. Bus. Res. (JABR) 20 (1), 75–86. https://doi.org/10.19030/jabr.v20i1.2197.
Johnson, L., Adams Becker, S., Cummins, M., Estrada, V., Freeman, A., Hall, C., 2016. Horizon Report - 2016 Higher Education Edition. NMC Horizon Rep. http://www.nmc.org/publications/2014-horizon-report-higher-ed.
Kapp, K.M., 2012. The Gamification of Learning and Instruction: Game-Based Methods and Strategies for Training and Education, 148. John Wiley & Sons.
Khalil, M., Wong, J., De Koning, B., Ebner, M., Paas, F., 2018. Gamification in MOOCs: a review of the state of the art. In: IEEE Global Engineering Education Conference, EDUCON, 2018-April, pp. 1629–1638. https://doi.org/10.1109/EDUCON.2018.8363430.
Kim, S., 2021. How a company’s gamification strategy influences corporate learning: a study based on gamified MSLP (Mobile social learning platform). Telematics Inform. 57 (September 2020), 1–19. https://doi.org/10.1016/j.tele.2020.101505.
Klock, A.C.T., Gasparini, I., Pimenta, M.S., Hamari, J., 2020. Tailored gamification: a review of literature. Int. J. Hum. Comput. Stud. 144 (September 2019). https://doi.org/10.1016/j.ijhcs.2020.102495.
Klock, A.C.T., Pimenta, M.S., Gasparini, I., 2018. A systematic mapping of the customization of game elements in gamified systems. In: XVII Brazilian Symposium on Computer Games and Digital Entertainment (SBGames 2018), pp. 1–8. http://www.sbgames.org/sbgames2018/proceedings-eng.
Labrador, E., Villegas, E., 2014. Sistema Fun Experience Design (FED) aplicado en el aula. ReVisión 7 (2). http://www.aenui.net/ojs/index.php?journal=revision&page=article&op=viewArticle&path%5B%5D=147&path%5B%5D=242.
Laubheimer, P., 2015. Preventing User Errors: Avoiding Conscious Mistakes. Nielsen Norman Group. https://www.nngroup.com/articles/user-mistakes/.
Legaki, N.Z., Xi, N., Hamari, J., Karpouzis, K., Assimakopoulos, V., 2020. The effect of challenge-based gamification on learning: an experiment in the context of statistics education. Int. J. Hum. Comput. Stud. 144 (November 2019). https://doi.org/10.1016/j.ijhcs.2020.102496.
Ludi, S., 2005. Providing students with usability testing experience: bringing home the lesson “the user is not like you”. In: Proceedings - Frontiers in Education Conference, FIE, 2005, pp. 6–11. https://doi.org/10.1109/fie.2005.1611949.
Maican, C., Lixandroiu, R., Constantin, C., 2016. Interactivia.ro - a study of a gamification framework using zero-cost tools. Comput. Human Behav. 61, 186–197. https://doi.org/10.1016/j.chb.2016.03.023.
Manzano-León, A., Camacho-Lazarraga, P., Guerrero, M.A., Guerrero-Puerta, L., Aguilar-Parra, J.M., Alias, A., 2021. Between level up and game over: a systematic literature review of gamification in education. 1–14.
Marangunić, N., Granić, A., 2015. Technology acceptance model: a literature review from 1986 to 2013. Univ. Access Inf. Soc. 14 (1), 81–95. https://doi.org/10.1007/s10209-014-0348-1.
Marczewski, A., 2015. Gamification, Game Thinking and Motivational Design. Even Ninja Monkeys Like to Play. CreateSpace Independent Publishing Platform. https://www.gamified.uk/wp-content/uploads/2018/10/Narrative-Chapter.pdf.
Marín, B., Frez, J., Cruz-Lemus, J., Genero, M., 2019. An empirical investigation on the benefits of gamification in programming courses. ACM Trans. Comput. Educ. 19 (1), 1–22. https://doi.org/10.1145/3231709.
Meijer, H.A.W., Graafland, M., Obdeijn, M.C., van Dieren, S., Goslings, J.C., Schijven, M.P., 2021. Serious game versus standard care for rehabilitation after distal radius fractures: a protocol for a multicentre randomised controlled trial. BMJ Open 11, 1–8. https://doi.org/10.1136/bmjopen-2020-042629.
Muratet, M., Torguet, P., Viallet, F., Jessel, J.P., 2011. Experimental feedback on Prog&Play: a serious game for programming practice. Comput. Graphics Forum 30 (1), 61–73. https://doi.org/10.1111/j.1467-8659.2010.01829.x.
Nasirzadeh, E., Fathian, M., 2020. Investigating the effect of gamification elements on bank customers to personalize gamified systems. Int. J. Hum. Comput. Stud. 143 (March 2019). https://doi.org/10.1016/j.ijhcs.2020.102469.
Nicholson, S., 2012. A user-centered theoretical framework for meaningful gamification. Games+Learning+Society 8.0 18, S66–S67. https://doi.org/10.1089/dia.2016.2506.
Nielsen, J., 2020. 10 Usability Heuristics for User Interface Design. Nielsen Norman Group. https://tfa.stanford.edu/download/TenUsabilityHeuristics.pdf.
Nielsen, J., 1993. Usability Engineering. Morgan Kaufmann.
Norman, D., 2013. The Design of Everyday Things: Revised and Expanded Edition. Basic Books. https://doi.org/10.1145/1340961.1340979.
Norman, D., 1989. The Psychology of Everyday Things. Doubleday.
O’Donovan, S., Gain, J., Marais, P., 2013. A case study in the gamification of a university-level games development course. In: ACM International Conference Proceeding Series, pp. 242–251. https://doi.org/10.1145/2513456.2513469.
Quiñones, D., Rusu, C., 2017. How to develop usability heuristics: a systematic literature review. Comput. Standards Interfaces 53 (March), 89–122. https://doi.org/10.1016/j.csi.2017.03.009.
Rodrigues, E., Schwan-Estrada, K.R.F., Fiori-Tutida, A.C.G., Stangarlin, J.R., Cruz, M.E.S., 2007. Orgânico contra Sclerotinia sclerotiorum pelo extrato de gengibre. 124–128.
Rodrigues, L., Oliveira, A., Rodrigues, H., 2019. Main gamification concepts: a systematic mapping study. Heliyon 5 (7), e01993. https://doi.org/10.1016/j.heliyon.2019.e01993.
Rose, D.H., Meyer, A., 2007. Teaching every student in the digital age: universal design for learning. Educ. Technol. Res. Dev. 55 (5), 521–525. https://doi.org/10.1007/s11423-007-9056-3.
Ryan, R.M., Deci, E.L., 2000. Intrinsic and extrinsic motivations: classic definitions and new directions. Contemp. Educ. Psychol. 25 (1), 54–67. https://doi.org/10.1006/ceps.1999.1020.
Sakamoto, M., Nakajima, T., Alexandrova, T., 2012. Value-based design for gamifying daily activities. In: Entertainment Computing - ICEC 2012, pp. 421–424.
Salman, I., Misirli, A.T., Juristo, N., 2015. Are students representatives of professionals in software engineering experiments? In: Proceedings - International Conference on Software Engineering, 1, pp. 666–676. https://doi.org/10.1109/ICSE.2015.82.
Sardi, L., Idri, A., Fernández-Alemán, J.L., 2017. A systematic review of gamification in e-Health. J. Biomed. Inform. 71, 31–48. https://doi.org/10.1016/j.jbi.2017.05.011.
Schell, J., 2019. The Art of Game Design: A Book of Lenses. CRC Press. https://doi.org/10.1201/b22101.
Seaborn, K., Fels, D.I., 2015. Gamification in theory and action: a survey. Int. J. Hum. Comput. Stud. 74, 14–31. https://doi.org/10.1016/j.ijhcs.2014.09.006.
Sharma, M.K., Sharma, R.C., 2021. Innovation framework for excellence in higher education institutions. Glob. J. Flexible Syst. Manage. https://doi.org/10.1007/s40171-021-00265-x.
Sherwin, K., 2019. Usability Heuristic 5: Error Prevention (Video). Nielsen Norman Group. https://www.nngroup.com/videos/usability-heuristic-error-prevention/.
Sindre, G., Natvig, L., Jahre, M., 2009. Experimental validation of the learning effect for a pedagogical game on computer fundamentals. IEEE Trans. Educ. 52 (1), 10–18. https://doi.org/10.1109/TE.2007.914944.
Skinner, B., 1965. Science and Human Behavior. Macmillan, New York. https://doi.org/10.1016/B978-012370509-9.00087-5.
Tao, Y., 2005. Work in progress - Introducing usability concepts in early phases of software. In: Proceedings - Frontiers in Education Conference, FIE, 2005, pp. 19–20.
Tondello, G.F., Wehbe, R.R., Diamond, L., Busch, M., Marczewski, A., Nacke, L.E., 2016. The gamification user types hexad scale. In: Proceedings of the 2016 Annual Symposium on Computer-Human Interaction in Play, pp. 229–243.
von Wangenheim, C.G., Shull, F., 2009. Voice of evidence: To game or not to game? IEEE Softw. 26 (2), 92–94.
Wahl, N.J., 2000. Student-run usability testing. In: Software Engineering Education Conference, Proceedings, pp. 123–131. https://doi.org/10.1109/CSEE.2000.827030.
Wilson, P., 1973. Situational relevance. Inf. Storage Retrieval 9 (8), 457–471. https://doi.org/10.1016/0020-0271(73)90096-X.
Zainuddin, Z., Chu, S.K.W., Shujahat, M., Perera, C.J., 2020. The impact of gamification on learning and instruction: a systematic review of empirical evidence. Educ. Res. Rev. 30 (March). https://doi.org/10.1016/j.edurev.2020.100326.
Zichermann, G., Cunningham, C., 2011. Gamification by Design: Implementing Game Mechanics in Web and Mobile Apps. O’Reilly Media, Inc. http://storage.libre.life/Gamification_by_Design.pdf.