
19ECB431: USABILITY DESIGN OF SOFTWARE APPLICATIONS
Dr. Padmaja Madugula
Assistant Professor
Department of Computer Science and Engineering
GITAM School of Technology (GST)
Visakhapatnam – 530045
Email: [email protected]
Mobile No: 9394696905
UNIT - II
Heuristic Evaluation: 10 Heuristic Principles, Examples. Heuristic Evaluation: Group Assignment initiation (Website and App). Evaluation of the key tasks of the app or website against the heuristic principles, with severity ratings and recommendations. Group Assignment Presentations and reviews.

Learning Outcomes:
After completion of this unit, the student will be able to:
• analyze Usability Heuristics for User Interface Design (L4)
• identify and fix usability issues (L3)
• identify effective designs using heuristic evaluation (L3)


Heuristic Evaluation
• Heuristic evaluation is an informal usability inspection technique developed by Jakob
Nielsen and his colleagues in which experts, guided by a set of usability principles
known as heuristics, evaluate whether user-interface elements, such as dialog boxes,
menus, navigation structure, online help, etc., conform to the principles.

• It helps to identify usability problems in a user interface design.
• Heuristic evaluation finds usability problems by inspection.
• When design guidelines are used in evaluation, they are called heuristics. The original set of heuristics was derived empirically from an analysis of 249 usability problems.
Heuristic principles
• The latest set is listed here, expanded to include some of the questions asked when doing an evaluation:
1) Visibility of system status
• Are users kept informed about what is going on?
• Is appropriate feedback about a user's action provided within a reasonable time?
2) Match between system and the real world
• Is the language used at the interface simple?
• Are the words, phrases, and concepts used familiar to the user?
3) User control and freedom
• Are there ways of allowing users to easily escape from places they unexpectedly find themselves in?
4) Consistency and standards
• Are the ways of performing similar actions consistent?
5) Help users recognize, diagnose, and recover from errors
• Are error messages helpful?
• Do they use plain language to describe the nature of the problem and suggest a way of solving it?
6) Error prevention
• Is it easy to make errors? If so, where and why?
7) Recognition rather than recall
• Are objects, actions, and options always visible?
8) Flexibility and efficiency of use
• Have accelerators (i.e., shortcuts) been provided that allow more experienced users to carry out tasks more quickly?
9) Aesthetic and minimalist design
• Is any unnecessary or irrelevant information provided?
10) Help and documentation
• Is help information provided that can be easily searched and easily followed?
1. Visibility of system status: users should be informed about what is going on with the system through appropriate feedback.
2. Match between system and the real world: the image of the system perceived by users and the presentation of information on screen should match the model users have of the system.
3. User control and freedom: users should not have the impression that they are controlled by the system.
4. Consistency and standards: users should not have to wonder whether different words, situations, or actions mean the same thing; design standards and conventions should be followed.
5. Error prevention: it is always better to design interfaces that prevent errors from happening in the first place.
6. Recognition rather than recall: the user should not have to remember information from one part of the system to another.
7. Flexibility and efficiency of use: both inexperienced and experienced users should be able to customize the system, tailor frequent actions, and use shortcuts to accelerate their interaction.
8. Aesthetic and minimalist design: any extraneous information is a distraction and a slowdown.
9. Help users recognize, diagnose, and recover from errors: error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
10. Help and documentation: the system should provide help when needed.
Contd.
• Some of these core heuristics are too general for evaluating new products coming onto the market. Nielsen (1999) suggests that the following heuristics are more useful for evaluating commercial websites, and makes them memorable with the acronym HOMERUN:

• High-quality content
• Often updated
• Minimal download time
• Ease of use
• Relevant to users' needs
• Unique to the online medium
• Netcentric corporate culture
Contd.
• Heuristic evaluation enables designers to evaluate an interface without users
– inspection, guided by a set of guidelines
• Economical technique to identify usability issues early in the design process
– no implementation or users required
– can be performed on existing interfaces
• Helps identify usability problems in the UI
– [Nielsen and Molich, 1990]
• HE = heuristics + procedure
– about 5 evaluators
– each evaluates UI against heuristics
– rate severity of each issue
– aggregate results
– devise design solutions
Heuristic Evaluation (HE) – Pros and Cons
• Pros
– Very cost effective
– Identifies many usability issues
– Heuristics can help highlight potential usability issues early in the design process
– It is a fast and inexpensive tool compared with other methods involving real users
• Cons
– Relies on interpretation of the guidelines
– Guidelines may be too generic
– Needs more than one evaluator to be effective
– Heuristic evaluation depends on the knowledge and expertise of the evaluators
– Heuristic evaluation is based on assumptions about what “good” usability is
– Heuristic evaluation can end up giving false alarms
Stages of Evaluation
• Briefing
– brief the evaluators; ensure each person receives the same briefing
– become familiar with the UI and domain
• Evaluation period
– compare UI against heuristics
– spend 1-2 hours with the interface; make a minimum of two passes through it
– take notes
• Debriefing session
– Prioritize problems; rate severity
– aggregate results
– discuss outcomes with design/development team
Doing Heuristic Evaluation
• Heuristic evaluation is one of the most straightforward evaluation
methods. The evaluation has three stages:
1. The briefing session in which the experts are told what to do. A
prepared script is useful as a guide and to ensure each person
receives the same briefing.
2. The evaluation period in which each expert typically spends 1-2
hours independently inspecting the product, using the heuristics
for guidance. The experts need to take at least two passes through
the interface.
Contd.
• The first pass gives a feel for the flow of the interaction and the product's scope.
• The second pass allows the evaluator to focus on specific interface elements in the context of the
whole product, and to identify potential usability problems.
• If the evaluation is for a functioning product, the evaluators need to have some specific user tasks
in mind so that exploration is focused. Suggesting tasks may be helpful but many experts do this
automatically.
• However, this approach is less easy if the evaluation is done early in design when there are only
screen mockups or a specification; the approach needs to be adapted to the evaluation
circumstances.
• While working through the interface, specification or mockups, a second person may record the
problems identified, or the evaluator may think aloud.
• Alternatively, she may take notes herself. Experts should be encouraged to be as specific as
possible and to record each problem clearly.
3. The debriefing session in which the experts come together to discuss their findings and to
prioritize the problems they found and suggest solutions.
Formal Evaluation Process for Performing Heuristic Evaluation
• Training
– Meeting for the design team & evaluators
– Introduce the application
– Explain the user population, domain, and scenarios
• Evaluation
– Evaluators work separately
– Generate a written report, or oral comments recorded by an observer
– Focus on generating problems, not on ranking their severity yet
– 1-2 hours per evaluator
• Severity Rating
– Evaluators prioritize all problems found (not just their own)
– Take the mean of the evaluators’ ratings
• Debriefing
– Evaluators & design team discuss results, brainstorm solutions
Severity Ratings
• Contributing factors
– Frequency: how common is the problem?
– Impact: how hard is it for users to overcome?
– Persistence: is it overcome once, or does it bother users repeatedly?
• Severity scale (a combination of frequency and impact)
– 0: not a usability problem
– 1: cosmetic problem; need not be fixed
– 2: minor problem; needs fixing, but low priority
– 3: major problem; needs fixing, high priority
– 4: catastrophic problem; imperative to fix
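The "take the mean of the evaluators' ratings" step is simple arithmetic; the following is a minimal Python sketch of it. The problem names and scores are invented for illustration, not taken from these slides.

from statistics import mean

# Hypothetical findings: each evaluator independently scores every
# problem on the 0-4 severity scale described above.
ratings = {
    "No feedback after form submit": [4, 3, 3],  # evaluators A, B, C
    "Link color is non-standard": [2, 1, 2],
    "Jargon in error message": [3, 3, 2],
}

# Average the evaluators' ratings and rank problems for the debriefing.
prioritized = sorted(
    ((mean(scores), problem) for problem, scores in ratings.items()),
    reverse=True,
)

for avg, problem in prioritized:
    print(f"{avg:.1f}  {problem}")

Sorting by the mean puts the catastrophic and major problems at the top of the debriefing agenda, which is exactly how the prioritized problem list feeds the design team.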
How To Do Heuristic Evaluation
• Justify every problem with a heuristic
• “Too many choices on the home page (Aesthetic & Minimalist Design)”
• Can’t just say “I don’t like the colors”
• List every problem
• Even if an interface element has multiple problems
• Go through the interface at least twice
• Once to get the feel of the system
• Again to focus on particular interface elements
• You don’t have to limit yourself to the 10 Nielsen heuristics
• Nielsen’s 10 heuristics are easier to compare against
• Our 7 general principles are easier still
Hints for Better Heuristic Evaluation
• Use multiple evaluators
• Different evaluators find different problems
• The more the better, but with diminishing returns (see the sketch after this list)
• Nielsen recommends 3-5 evaluators
• Alternate heuristic evaluation with user testing
• Each method finds different problems
• Heuristic evaluation is cheaper
• It’s OK for observer to help evaluator
• As long as the problem has already been noted
• This wouldn’t be OK in a user test
Evaluating Prototypes
• Heuristic evaluation works on:
• Sketches
• Paper prototypes
• Buggy implementations
• “Missing-element” problems are harder to find on sketches
• Because you’re not actually using the interface, you aren’t blocked by a feature’s absence
• So look harder for them
Writing Good Heuristic Evaluations
• Heuristic evaluations must communicate well to developers and managers
• Include positive comments as well as criticisms
• “Good: Toolbar icons are simple, with good contrast and few colors (minimalist
design)”
• Be tactful
• Not: “the menu organization is a complete mess”
• Better: “menus are not organized by function”
• Be specific
• Not: “text is unreadable”
• Better: “text is too small, and has poor contrast (black text on dark green
background)”
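A fixed record structure helps keep findings specific, justified by a heuristic, and tactful. Below is one possible sketch (the field names are illustrative, not from the source); the example entries reuse the comments above.

from dataclasses import dataclass

@dataclass
class Finding:
    """One heuristic-evaluation finding, written for developers and managers."""
    heuristic: str    # which principle the problem violates (or satisfies)
    severity: int     # 0-4 on the severity scale; 0 can flag a positive comment
    description: str  # specific and tactful, never "this is a complete mess"
    positive: bool = False

report = [
    Finding("Aesthetic and minimalist design", 0,
            "Toolbar icons are simple, with good contrast and few colors.",
            positive=True),
    Finding("Consistency and standards", 3,
            "Menus are not organized by function."),
    Finding("Aesthetic and minimalist design", 2,
            "Text is too small and has poor contrast "
            "(black text on a dark green background)."),
]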
Heuristic Examples

Examples: heuristic evaluation of websites
• MEDLINEplus is a medical information website created by the National Library of Medicine (NLM) to provide health information for patients, doctors, and researchers.
• In 1999 usability consultant Keith Cogdill was commissioned by NLM to evaluate
MEDLINEplus.
• Using a combination of his own knowledge of the users' tasks, problems that had
already been reported by users, and advice from documented sources
(Shneiderman, 1998a; Nielsen, 1993; Dumas and Redish, 1999), Cogdill
identified the seven heuristics listed below.
Home page of MEDLINEplus: https://medlineplus.gov/about/
Clicking Health Topics on the home page produced this page: https://medlineplus.gov/healthtopics.html
• Some of the heuristics resemble Nielsen's original set, but have been tailored for evaluating MEDLINEplus.
1) Internal consistency.
• The user should not have to speculate about whether different phrases or actions carry the same meaning.
2) Simple dialog.
• The dialog with the user should not include information that is irrelevant, unnecessary, or rarely needed. The dialog
should be presented in terms familiar to the user and not be system-oriented.
3) Shortcuts.
• The interface should accommodate both novice and experienced users.
4) Minimizing the user's memory load.
• The interface should not require the user to remember information from one part of the dialog to another.
5) Preventing errors.
• The interface should prevent errors from occurring.
6) Feedback.
• The system should keep the user informed about what is taking place.
7) Internal locus of control.
• Users who choose system functions by mistake should have an "emergency exit" that lets them leave the unwanted state
without having to engage in an extended dialog with the system.
• These heuristics were given to three expert evaluators who independently
evaluated MEDLINEplus. Their comments were then compiled and a meeting
was called to discuss their findings and suggest strategies for addressing
problems.
• The following points were among their findings:
Layout.
• All pages within MEDLINEplus have a relatively uncomplicated vertical design.
The home page is particularly compact, and all pages are well suited for printing.
The use of graphics is conservative, minimizing the time needed to download
pages.

Internal consistency.
• The formatting of pages and presentation of the logo are consistent across the website. Justification of text, fonts, font sizes, font colors, use of terms, and link labels are also consistent.
The experts also suggested improvements, including:
Arrangement of health topics.
• Topics should be arranged alphabetically as well as in categories. For example,
health topics related to cardiovascular conditions could appear together.
Depth of navigation menu.
• Having a higher "fan-out" in the navigation menu in the left margin would enhance usability; that is, more topics should be listed at the surface, giving many short menus rather than a few deep ones.
Turning design guidelines into heuristics for the web
• The following list of guidelines for evaluating websites was compiled from several
sources and grouped into three categories: navigation, access, and information design.
• These guidelines provide a basis for developing heuristics by converting them into
questions.
1. Navigation - One of the biggest problems for users of large websites is navigating
around the site. The phrase "lost in cyberspace" is understood by every web user. The
following six guidelines are intended to encourage good navigation design:
• Avoid orphan pages, i.e., pages that are not connected to the home page, because they lead users into dead ends.
Are there any orphan pages? Where do they lead?
• Avoid long pages with excessive white space that force scrolling.
Are there any long pages? Do they have lots of white space, or are they full of text or lists?
• Provide navigation support, such as a strong site map that is always present.
Is there any guidance, e.g. maps, navigation bar, menus, to help users find their
way around the site?
• Avoid narrow, deep, hierarchical menus that force users to burrow deep into the menu structure.
Are the menus narrow and deep? Do users have to burrow through several levels to find what they want?
• Avoid non-standard link colors.
What color is used for links? Is it blue or another color? If it is another color, then
is it obvious to the user that it is a hyperlink?
• Provide consistent look and feel for navigation and information design.
Are menus used, named, and positioned consistently? Are links used consistently?
2. Access - Accessing many websites can be a problem for people with slow Internet
connections and limited processing power. In addition, browsers are often not
sensitive to errors in URLs. Nielsen (1998) suggests the following guidelines:
• Avoid complex URLs.
Are the URLs complex? Is it easy to make typing mistakes when entering them?
• Avoid long download times that annoy users.
Are there pages with lots of graphics? How long does it take to download each
page?
3. Information design - Content comprehension and aesthetics contribute to users' understanding of the site and the impression it makes, as the activity below shows.
• Consider the following design guidelines for information design and, for each one, suggest a question that could be used in heuristic evaluation:
• Outdated or incomplete information is to be avoided (Nielsen, 1998). It creates a poor
impression with users.
• Good graphical design is important. Reading long sentences, paragraphs, and
documents is difficult on screen, so break material into discrete, meaningful chunks to
give the website structure (Lynch and Horton, 1999).
• Avoid excessive use of color. Color is useful for indicating different kinds of
information, i.e., cueing (Preece et al., 1994).
• Avoid gratuitous use of graphics and animation. In addition to increasing download
time, graphics and animation soon become boring and annoying (Lynch and Horton,
1999).
• Be consistent. Consistency both within pages (e.g., use of fonts, numbering,
terminology, etc.) and within the site (e.g., navigation, menu names, etc.) is important for
usability and for aesthetically pleasing designs.
Comment
• We suggest the following questions; you may have identified others:
• Outdated or incomplete information.
Do the pages have dates on them? How many pages are old and provide outdated information?
• Good graphical design is important.
Is the page layout structured meaningfully? Is there too much text on each page?
• Avoid excessive use of color.
How is color used? Is it used as a form of coding? Is it used to make the site bright and
cheerful? Is it excessive and garish?
• Avoid gratuitous use of graphics and animation.
Are there any flashing banners? Are there complex introduction sequences? Can they be short-
circuited? Do the graphics add to the site?
• Be Consistent.
Are the same buttons, fonts, numbers, menu styles, etc. used across the site? Are they used in
the same way?
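These guideline-derived questions can be collected into a per-page checklist for evaluators. A minimal sketch, with the questions condensed from the lists above (the data structure itself is illustrative, not from the source):

# Heuristic checklist built from the web guidelines above, grouped into
# the three categories used in the text.
CHECKLIST = {
    "Navigation": [
        "Are there any orphan pages? Where do they lead?",
        "Are there long pages that force scrolling?",
        "Is there guidance (site map, navigation bar, menus)?",
        "Are link colors standard, or obviously hyperlinks if not?",
        "Are menus used, named, and positioned consistently?",
    ],
    "Access": [
        "Are the URLs complex and easy to mistype?",
        "How long does each page take to download?",
    ],
    "Information design": [
        "Do pages carry dates? Is any information outdated?",
        "Is the page layout structured meaningfully?",
        "Is color used as coding, or is it excessive and garish?",
        "Do the graphics and animation add to the site?",
        "Are fonts, buttons, and menu styles used consistently?",
    ],
}

# Evaluators walk each key task and record yes/no plus notes per question.
for category, questions in CHECKLIST.items():
    print(category)
    for q in questions:
        print(f"  [ ] {q}")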
References
• Interaction Design: Beyond Human-Computer Interaction, 4th Edition, Jennifer Preece, Helen Sharp and Yvonne Rogers, Chapter 13
• http://web.mit.edu/6.813/www/sp17/classes/18-heuristic-evaluation/#reading_18_heuristic_evaluation
• https://uxplanet.org/10-usability-heuristics-and-how-to-apply-them-to-product-design-dd4a4a06d78c
THANK YOU
