Advanced Software Engineering

The document discusses testing and quality assurance (QA) approaches for agile software development. It notes that dependable systems often require extensive documentation that can conflict with agile methods' focus on iterative development and minimal documentation. While pure agile approaches are impractical for safety-critical systems, some agile techniques like test-driven development can be incorporated if the process is documented. Feature-driven development is presented as one agile method that focuses on designing and building features in iterations with frequent deliverables and progress monitoring. The document also covers agile testing principles and how testing differs between agile and traditional approaches.


Testing and QA in Agile Development

Session 11
Dependable systems and Agility
• Dependable software often requires certification:
– Both process and product documentation have to be
produced.
• Up-front requirements analysis is essential to:
– Discover requirements and requirements conflicts that
may compromise the safety and security of the system.
• Formal change analysis is essential to assess the
effect of changes on the safety and integrity of the
system.
• These conflict with the general approach in agile
development.

Dependable Processes and Agility
• An agile process may be defined that
incorporates techniques such as:
– Iterative development, test-first development and
user involvement in the development team.
– So long as the team follows that process and
documents their actions, agile methods can be
used.
• Additional documentation and planning are
essential, so ‘pure agile’ is impractical for
dependable systems engineering.
Safety and Agile Methods
• Safety-critical systems have high dependability requirements
– Need to be based on dependable processes:
• requirements management,
• change management and configuration control,
• system modeling, reviews and inspections, test planning, and test
coverage analysis
• Agile methods are not usually used for safety-critical systems
engineering
– Extensive process and product documentation is needed for
system regulation.
• Contradicts the focus in agile methods on the software itself.
– A detailed safety analysis of a complete system specification is
important.
• Contradicts the interleaved development of a system specification and
program.
• Some agile techniques such as test-driven development (TDD)
may be used

Feature-Driven Development- FDD
• FDD is a client-centric, architecture-centric, and
pragmatic software process:
– A feature (like a use case or user story) is the primary
source of requirements.
• A small, client-valued function expressed in the form
<action> <result> <object>.
– For example:
• "Calculate the total of a sale",
• "Validate the password of a user",
• "Authorize the sales transaction of a customer".

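As an illustration only (not part of the original FDD material), the sketch below shows one way a small feature list could be represented in code. It simply encodes the <action> <result> <object> template and the three example features from the slide above; the names Feature and backlog are hypothetical.

from dataclasses import dataclass

@dataclass
class Feature:
    """A small, client-valued function: <action> <result> <object>."""
    action: str   # e.g. "Calculate"
    result: str   # e.g. "the total"
    obj: str      # e.g. "of a sale"

    def __str__(self) -> str:
        return f"{self.action} {self.result} {self.obj}"

backlog = [
    Feature("Calculate", "the total", "of a sale"),
    Feature("Validate", "the password", "of a user"),
    Feature("Authorize", "the sales transaction", "of a customer"),
]

for feature in backlog:
    print(feature)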
Feature Driven Development (FDD)
• Does not cover the entire software
development process
– Focuses on the design and building phases
– The first three phases are done at the
beginning of the project.
– The last two phases are the iterative part of the
process.
– The FDD approach includes frequent and
tangible deliverables, along with accurate
monitoring of the progress of the project.

Develop, Build and Plan
• Develop an Overall Model:
– A domain expert presents the high-level system scope
and its context to the team members and the chief architect.
– Documented requirements such as use cases or functional
specifications are developed.
• Build a Features List :
– A categorized list of features to support the requirements is
produced
• Plan by Feature:
– The development team orders the feature sets according
to their priority and dependencies, and they are assigned
to chief programmers (an ordering sketch follows below).
– The classes identified in the first phase are assigned to
class owners (individual developers).
– Schedule and milestones are set for the feature sets.
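Purely as an illustrative sketch (not from the original slides), the snippet below shows one way "ordering by priority and dependencies" could be expressed: dependencies constrain the order via a topological sort, and priority breaks ties among feature sets that are ready. The feature-set names and their priority/dependency data are hypothetical.

from graphlib import TopologicalSorter

# Hypothetical feature sets: name -> (priority, prerequisite feature sets)
feature_sets = {
    "Sales totals":        (1, set()),
    "User accounts":       (1, set()),
    "Password checks":     (2, {"User accounts"}),
    "Sales authorization": (3, {"Sales totals", "Password checks"}),
}

sorter = TopologicalSorter({name: deps for name, (_, deps) in feature_sets.items()})
sorter.prepare()

plan = []
while sorter.is_active():
    # All feature sets whose prerequisites are done; order them by priority.
    ready = sorted(sorter.get_ready(), key=lambda name: feature_sets[name][0])
    plan.extend(ready)
    sorter.done(*ready)

print(plan)  # e.g. ['Sales totals', 'User accounts', 'Password checks', 'Sales authorization']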

Design & Build by Feature
• Features are selected from the feature set
and feature teams needed to develop these
features are chosen by the class owners.
– The design by feature and build by feature are
iterative procedures
• The team produces the sequence diagrams for the
assigned features.
• These diagrams are passed on to the developers who
implement the items necessary to support the design
for a particular feature.
– There can be multiple feature teams concurrently
designing and building their own set of features.
• The code developed is then unit tested and inspected
• After a successful iteration, the completed features are
promoted to the main build.
Agile Testing and Agile manifesto:
• Agile Testing is a software testing practice that follows
the principles of agile software development.
• Covers all the levels of testing and all types of testing.

Individuals and interactions over processes and tools


Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

How do these values affect testing?


Some Agile Principles
• Satisfy the customer through early and continuous delivery of valuable software.
• Working software is the primary measure of progress.
• Deliver working software frequently, from a couple of weeks to a couple of months.
• Welcome changing requirements, even late in development.
• The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.
• Business people and developers must work together daily throughout the project.
• Simplicity -- the art of maximizing the amount of work not done -- is essential.

http://www.agilemanifesto.org/
Agile Testing Principles
• Testing moves the project forward
• Testing is not a phase
• Everyone tests
• Shortening Feedback Loops
• Lightweight Documentation

Agile and Traditional Testing

Four main levels of testing are Unit Testing, Integration Testing, System Testing and Acceptance Testing.

Agile Test Strategy
• Each sprint or iteration is focused on only a few
requirements or user stories
– Documentation may not be as extensive
• A high-level agile test strategy is required as a guideline
– Purpose: to list best practices and some form of structure
that the teams can follow (Agile does not mean unstructured),
for example:
• No code may be written for a story until we first define
its acceptance criteria/tests
• A story may not be considered complete until all
its acceptance tests pass
(a test-first sketch of these two rules follows below)
– Mission statement, which must be supported by the Project
Scope, Boundaries and Key Requirements:
• To Constantly Deliver Working Software that Meets
Customer’s Requirements by means of Providing Fast
Feedback and Defect Prevention, rather than Defect
Detection.
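As a hedged illustration of the two guidelines above (not code from the slides), the sketch below writes the acceptance tests for a hypothetical "Calculate the total of a sale" story first; the story only counts as complete when these tests pass. The sale_total function is a stand-in implementation.

import unittest

# Stand-in implementation, written only after the acceptance tests below were agreed.
def sale_total(prices, tax_rate=0.0):
    """Calculate the total of a sale, including tax."""
    subtotal = sum(prices)
    return round(subtotal * (1 + tax_rate), 2)

class CalculateSaleTotalAcceptanceTests(unittest.TestCase):
    """Acceptance criteria for the story 'Calculate the total of a sale'."""

    def test_total_is_sum_of_item_prices(self):
        self.assertEqual(sale_total([10.00, 5.50]), 15.50)

    def test_total_includes_tax(self):
        self.assertEqual(sale_total([100.00], tax_rate=0.17), 117.00)

    def test_empty_sale_totals_zero(self):
        self.assertEqual(sale_total([]), 0.00)

if __name__ == "__main__":
    unittest.main()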
Agile Testing Strategies
• Agile testing life cycle spans four stages
– Iteration 0:
• Initial setup tasks are performed
– It includes identifying people for testing, installing testing
tools, and scheduling resources:
» Establishing a business case for the project
» Establishing the boundary conditions and the project scope
» Outlining the key requirements and use cases that will drive the
design trade-offs
» Outlining one or more candidate architectures
» Identifying the risks
» Estimating costs and preparing a preliminary project plan

Agile Testing Strategies
• Agile testing life cycle spans four stages
– Construction Iterations
• The majority of the testing occurs during this phase
– Observed as a set of iterations for an increment
» Requirement prioritization is practiced
» Team implements a hybrid of practices from XP,
Scrum etc.
• Testing is classified into:
– Confirmatory testing ~ verifying that the system fulfills the needs
(developer and acceptance testing)
– Investigative testing ~ the tester determines potential
problems in the form of defect stories
(continued…)

Agile Testing Strategies (Cont..)
• Agile testing life cycle spans four stages
– Construction Iterations
• Agile acceptance testing
– A combination of traditional functional testing and traditional
acceptance testing
» Developers and stakeholders do it together.
– Developer testing is a mix of traditional unit testing and
traditional service integration testing.
» Developer testing verifies both the application code and
the database schema.
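To make the last point concrete (an illustrative sketch, not material from the slides), the snippet below shows a developer test that checks both a piece of application code and the database schema it relies on, using an in-memory SQLite database; the customers table and its columns are hypothetical.

import sqlite3
import unittest

SCHEMA = "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL, email TEXT UNIQUE)"

def add_customer(conn, name, email):
    """Application code under test: insert a customer and return its id."""
    cur = conn.execute("INSERT INTO customers (name, email) VALUES (?, ?)", (name, email))
    return cur.lastrowid

class DeveloperTests(unittest.TestCase):
    def setUp(self):
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute(SCHEMA)

    def test_schema_has_expected_columns(self):
        # Verify the database schema itself, not just the code that uses it.
        columns = [row[1] for row in self.conn.execute("PRAGMA table_info(customers)")]
        self.assertEqual(columns, ["id", "name", "email"])

    def test_add_customer_returns_new_id(self):
        customer_id = add_customer(self.conn, "Ada", "ada@example.com")
        row = self.conn.execute("SELECT name FROM customers WHERE id = ?", (customer_id,)).fetchone()
        self.assertEqual(row[0], "Ada")

if __name__ == "__main__":
    unittest.main()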

Agile Testing Strategies (Cont..)
• Agile testing life cycle spans four stages
– Release End Game Or Transition Phase
• Activities
– Training of end users, support people and operational
people.
– Marketing of the product, finalization of system and user
documentation.
• Includes:
– Full system testing and acceptance testing
– During the end game, testers work on defect stories.
– Production
• After the release stage, the product moves to the
production stage.
Quality Assurance- QA
• Quality, in agile development, means code quality:
– Practices such as refactoring and TDD are used to ensure
that high-quality code is produced
• Quality management in agile development is
informal rather than document-based
• QA is a set of activities intended to ensure that:
– Products satisfy customer requirements in a
systematic, reliable fashion.
• In Scrum, QA is the responsibility of everyone,
not only the testers.
– QA is all the activities we do to ensure correct quality
during the development of new products.

Agile Testing Quadrants
(Quadrant diagram, redrawn as text. Left half: Supporting the Team (Q1, Q2); right half: Critique Product (Q3, Q4). Top: Business Facing (Q2, Q3); bottom: Technology Facing (Q1, Q4).)
• Q2 – Business facing, supporting the team (automated or manual): Functional Testing (Examples), Story tests, Prototypes, Simulations
• Q3 – Business facing, critique product (manual): Exploratory Testing, Scenarios, Usability testing, UAT (User Acceptance Testing), Alpha/Beta Testing
• Q1 – Technology facing, supporting the team (automated): Unit Tests, Component Tests
• Q4 – Technology facing, critique product (tools): Performance and Load Testing, Security Testing, “ilities” Testing

Agile Testing Quadrants
• Agile Quadrant I
– The internal code quality is the main focus
• It consists of test cases that are technology-driven and
are implemented to support the team; it includes:
– Unit Tests
– Component Tests
• Agile Quadrant II
– It contains test cases that are business driven and
are implemented to support the team.
– Focuses on the requirements.
– The kind of test performed
• Testing of examples of possible scenarios and workflows
• Testing of User experience such as prototypes
• Pair testing
Example Test Case
• Title: Login Page – Authenticate Successfully on gmail.com
• Description: A registered user should be able to successfully
login at gmail.com.
• Precondition: the user must already be registered with an
email address and password.
• Assumption: a supported browser is being used.
• Test Steps:
– Navigate to gmail.com
– In the ’email’ field, enter the email-ID of the registered user.
– Click the ‘Next’ button.
– Enter the password of the registered user
– Click ‘Sign In’
• Expected Result: A page displaying the gmail user’s inbox
should load, showing any new message at the top of the
page.
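An automated version of this manual test case might look like the sketch below (illustrative only; it is not from the slides). It assumes the third-party Selenium WebDriver package, a locally available Chrome driver, and hypothetical element locators ('identifierId', 'password', button texts), all of which may differ from the real gmail.com page.

# Illustrative sketch: automating the manual Gmail login test case with Selenium.
import time
from selenium import webdriver
from selenium.webdriver.common.by import By

TEST_EMAIL = "registered.user@gmail.com"   # hypothetical registered account
TEST_PASSWORD = "correct-password"         # hypothetical

driver = webdriver.Chrome()
try:
    driver.get("https://gmail.com")                                     # Step 1: navigate
    driver.find_element(By.ID, "identifierId").send_keys(TEST_EMAIL)    # Step 2: enter email
    driver.find_element(By.XPATH, "//span[text()='Next']").click()      # Step 3: click Next
    time.sleep(2)                                                        # crude wait for transition
    driver.find_element(By.NAME, "password").send_keys(TEST_PASSWORD)   # Step 4: enter password
    driver.find_element(By.XPATH, "//span[text()='Sign in']").click()   # Step 5: sign in
    time.sleep(5)
    assert "Inbox" in driver.title or "Gmail" in driver.title            # Expected result (rough check)
finally:
    driver.quit()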

Agile Testing Quadrants
• Agile Quadrant -III
– Provides feedback to quadrants one and two.
– Iteration reviews are carried out, which build confidence in the
product.
– The kind of testing done in this quadrant is
• Usability Testing, Pair testing with customers, User acceptance
Testing

• Agile Quadrant IV
– Focus on non-functional requirements
• Non-functional tests such as stress and performance testing
• Security testing with respect to authentication and hacking
• Infrastructure testing
• Data migration testing
• Scalability testing
• Load testing

Agile Testing Quadrants
(Quadrant detail, redrawn as text.)

Business facing:
• API / Service Testing — WHY: to ensure communication between components is working. WHO: Developers / Technical Architects. WHAT: new web services, components, controllers, etc. WHEN: as soon as a new API is developed. WHERE: local dev environment. HOW: automated.
• System Testing / Regression Testing / UAT — WHY: to ensure the whole system works when integrated. WHO: Business Analyst / Product Owner. WHAT: scenario testing, performance and security testing. WHEN: when acceptance testing is completed and ready. WHERE: staging environment. HOW: automated (WebDriver) and exploratory testing.

Technology facing:
• Unit Testing — WHY: to ensure code is developed correctly. WHO: Developers / Technical Architects. WHAT: all new code plus re-factoring of legacy code, as well as JavaScript unit testing. WHEN: as soon as new code is written. WHERE: local dev environment. HOW: automated (JUnit, PHPUnit).
• Acceptance Testing — WHY: to ensure the customer’s expectations are met. WHO: Developer / SDET / Manual QA. WHAT: verifying acceptance tests on the stories, verification of features. WHEN: when the feature is ready and unit tested. WHERE: test environment. HOW: automated (Cucumber).
Two views of Agile Testing
eXtreme Testing
• Automated unit testing
– Developers write tests
– Test-first development
– Daily builds with unit tests always 100% pass
• Functional testing
– Customer-owned
– Comprehensive
– Repeatable
– Automatic
– Timely
– Public
• Focus on automated verification – enabling agile software development

Exploratory Testing
• Manual testing by professional, skilled testers
• Freedom and flexibility for testers
• Optimized to find bugs
• Continually adjusting plans, re-focusing on the most promising risk areas
• Minimizing time spent on documentation
• Focus on manual validation – making testing activities agile
Product Backlog
• Most common cause of software development failure:
– Unclear requirements
– Different interpretation of requirements by different
members of the team.
• User stories should be simple, concise and unambiguous.
• The following format should be used to write user stories:
“As a [role], I want [feature], So that [benefit]”
• Example: As a user, I can backup my entire hard drive
– It is important not to forget the “Benefit” (the value added by developing the story).
• As a good guideline, it is best to follow the INVEST model for writing user stories.
A good user story should be (INVEST):
• Independent (of all others)
• Negotiable (not a specific contract for features)
• Valuable (or verifiable)
• Estimable (to a good approximation)
• Small (so as to fit within an iteration)
• Testable (in principle, even if there isn’t a test for it yet)
• Preventing Defects: while collecting user stories or conducting story workshops,
the PO, BA, Dev and QA must be involved.
(A small sketch of the story format follows below.)
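As an illustration only (not part of the slides), the sketch below encodes the role/feature/benefit template in a small data structure, using the backup example above. The class name UserStory and the benefit text are hypothetical.

from dataclasses import dataclass

@dataclass
class UserStory:
    role: str
    feature: str
    benefit: str

    def __str__(self) -> str:
        return f"As a {self.role}, I want {self.feature}, so that {self.benefit}."

story = UserStory(
    role="user",
    feature="to back up my entire hard drive",
    benefit="I can restore my files if the drive fails",   # hypothetical benefit for the example
)
print(story)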

QA challenges with Agile Software
Development
• Chances of error are higher
– Documentation is a lower priority
• New features are introduced quickly
– This reduces the time available to test teams
• Latest features vs. requirements vs. business value
• Testers are required to play a developer role
• Test execution cycles are highly compressed
• Very little time to prepare test plans

Stages of testing
1. Development testing:
– System testing during development involves:
• Integrating components to create a version of the
system
• Testing the integrated system.
– The focus is on testing the interactions between
components.
– System testing checks that components are
compatible, interact correctly and transfer the
right data at the right time across their
interfaces.
• System testing tests the emergent behavior of a
system.
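As a minimal, purely illustrative sketch (not from the slides), the test below exercises the interaction between two small components — a hypothetical order parser and a hypothetical price calculator — checking that the right data is transferred across their interface.

import unittest

# Two hypothetical components whose interaction is under test.
def parse_order(text):
    """Component A: parse 'item:qty' pairs into a list of (item, qty) tuples."""
    return [(item, int(qty)) for item, qty in
            (pair.split(":") for pair in text.split(",") if pair)]

def order_total(lines, price_list):
    """Component B: compute the total price from parsed order lines."""
    return sum(price_list[item] * qty for item, qty in lines)

class OrderIntegrationTest(unittest.TestCase):
    def test_components_transfer_the_right_data(self):
        prices = {"apple": 2, "pear": 3}
        lines = parse_order("apple:2,pear:1")
        # The parser's output feeds the calculator's input across the interface.
        self.assertEqual(lines, [("apple", 2), ("pear", 1)])
        self.assertEqual(order_total(lines, prices), 7)

if __name__ == "__main__":
    unittest.main()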
Stages of testing
1. Development testing
– Test-driven development (TDD) is an
approach to program development in which
testing and code development are interleaved.
• Tests are written before code and ‘passing’ the
tests is the critical driver of development.
– Develop the code incrementally, with a test for
that increment (see the sketch below).
• Move on to the next increment only when the current
code passes its test.
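A minimal TDD-style sketch (illustrative, not from the slides): the test for the next small increment — a hypothetical leap_year helper — is written first and fails, then just enough code is written to make it pass before the next increment is started.

import unittest

# Step 2: just enough code to make the already-written tests pass.
def leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Step 1: the tests for this increment are written before the code above.
class LeapYearTest(unittest.TestCase):
    def test_ordinary_leap_years(self):
        self.assertTrue(leap_year(2024))

    def test_century_years_are_not_leap_years(self):
        self.assertFalse(leap_year(1900))

    def test_every_fourth_century_is_a_leap_year(self):
        self.assertTrue(leap_year(2000))

if __name__ == "__main__":
    unittest.main()   # only when this passes do we move to the next increment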

Stages of testing
2. Release testing:
– A process of testing, where a separate testing
team test a complete version of the system
before it is released to users.
– Must show that:
• The system delivers its specified functionality,
performance and dependability,
• It does not fail during normal use.
– The objective of release testing is to check
that the system meets its requirements and is
good enough for external use (validation testing).
– It is usually a black-box testing process where
tests are only derived from the system
specification.
Stages of testing
2. Release testing:
– Performance testing
• Part of release testing may involve testing the
emergent properties of a system, such as
performance and reliability.
– examines responsiveness, stability, scalability, reliability,
speed and resource usage of software
• Load Testing: the load on the system is steadily increased
in order to check its performance as higher loads
are applied (a small sketch follows below).
• Stress Testing: the system is tested beyond its
normal expectations or operational capacity.
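A heavily simplified load-test sketch (illustrative only, not from the slides): it fires a configurable number of concurrent requests at a hypothetical URL using only the standard library and reports response times. Real load testing would normally use a dedicated tool such as JMeter or Locust.

import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost:8000/"   # hypothetical system under test
CONCURRENT_USERS = 20
REQUESTS_PER_USER = 5

def one_request(_):
    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=10) as response:
        response.read()
    return time.perf_counter() - start

def run_load_test():
    total = CONCURRENT_USERS * REQUESTS_PER_USER
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        timings = sorted(pool.map(one_request, range(total)))
    print(f"requests: {total}, median: {timings[len(timings) // 2]:.3f}s, "
          f"worst: {timings[-1]:.3f}s")

if __name__ == "__main__":
    run_load_test()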
Stages of testing
3. User Test:
– where users or potential users of a system
test the system in their own environment.
• Users provide input and advice on system testing.
– User testing is essential, even when
comprehensive system and release testing
have been carried out, because:
• Influences from the user’s working environment affect the
reliability, performance, usability and robustness of a system.
– These influences cannot be replicated in a testing environment.

Stages of testing
3. User Test:
– Types of user Tests
• Alpha testing
– Users work with the development team to test the
software at the developer’s site.
• Beta testing
– A release is made available to users to allow them to
experiment and to raise problems that they discover with
the system developers.
• Acceptance testing
– Customers test a system to decide whether or not it is
ready to be accepted and deployed in the customer
environment.
– Primarily for custom systems.
Stages of testing
3. User Test:
– Agile methods and acceptance testing
• The user is part of the development team and is
responsible for making decisions on the
acceptability of the system.
• Tests are defined by the user and are integrated
with other tests in that they are run automatically
when changes are made.
• There is no separate acceptance testing process.
• Main problem here is whether or not the
embedded user is ‘typical’ and can represent the
interests of all system stakeholders.
Testing method where user is not required
• Functional Testing:
– The software is tested for the functional
requirements
• Ad-hoc Testing:
– Testing is done without any formal Test Plan or
Test Case creation.
• Helps in deciding the scope and in learning the
application prior to starting any other testing.
• Exploratory Testing:
– similar to the ad-hoc testing and is done to
learn/explore the application

Testing method where user is not required
• Smoke Testing:
– Checks whether the application is ready for further
major testing and works properly, without
failing, at the most basic level
• Recovery Testing:
– Checks how quickly and how well the application can
recover from any type of crash
• Volume Testing:
– Tests the efficiency of the application.
• A huge amount of data is processed through the
application to check its extreme limits

Testing where user plays a role/user is required
• User Acceptance Testing:
– the software is handed over to the user to find out
• if the software meets the user expectations and works
as it is expected to
• Alpha Testing:
– The users are invited to the development center
– Users use the application and the developers note
every input or action carried out by the user
• Any type of abnormal behavior of the system is noted
and rectified by the developers.
• Beta Testing:
– The software is distributed as a beta version to the
users, and users test the application at their own sites.
• As the users explore the software, any exception or
defect that occurs is reported to the
developers.

Current Challenges
• Previously, testing was restricted almost exclusively to
desktop applications, but these days:
– Software has moved from PCs and browsers to smartphones,
tablets, and even wearable technology
– There is a proliferation of new devices and interfaces
• Why do Web-application testers find mobile software
testing hard?
– Even though:
• Testing methodologies are universal
• The logic and processes used for one environment should apply
to all environments
• Successful tests are adaptive by nature
– A few challenges remain:
• Testing for mobile and testing for the Web are not the same

Current Challenges
• Limited Size / View
– The most obvious difference is screen size.
• Responsive design is relatively easy to code for desktop and
laptop browsers (mostly come with predefined ratios)
– Mobile devices are much smaller.
• Aligning images and text becomes a real challenge
– Especially when you factor in features like portrait orientation (i.e.
the ability to rotate a mobile device and have the image flip
accordingly).
• Worse still, there is variation even when dealing with the same
manufacturer.
– For example, the iPhone 5 has a 4” display, whereas the iPhone 6
is 4.7”, iPhone 6 Plus (5.5”), iPad Mini (7.9”), and iPad standard
(9.7”),
• it becomes harder and harder to code and test mobile
applications that look “good” on all screens.
– Screens are not the only spatial constraint mobile
software testers face
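One common way to keep layouts testable across this zoo of screen sizes (an illustrative sketch, not material from the slides) is to run the same assertions against a table of viewports. The approximate viewport widths and the stand-in choose_layout function below are assumptions.

import unittest

# Approximate viewport widths in CSS pixels for a few of the devices mentioned above.
DEVICES = {
    "iPhone 5": 320,
    "iPhone 6": 375,
    "iPhone 6 Plus": 414,
    "iPad Mini": 768,
    "Desktop": 1280,
}

def choose_layout(width):
    """Stand-in for the application's responsive-layout logic."""
    if width < 480:
        return "single-column"
    if width < 1024:
        return "two-column"
    return "full"

class ResponsiveLayoutTest(unittest.TestCase):
    def test_layout_fits_every_device(self):
        for device, width in DEVICES.items():
            with self.subTest(device=device):
                layout = choose_layout(width)
                # Phones must never get the full desktop layout.
                if width < 480:
                    self.assertEqual(layout, "single-column")
                self.assertIn(layout, {"single-column", "two-column", "full"})

if __name__ == "__main__":
    unittest.main()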
Current Challenges
• Limited Storage and RAM
– The limited storage and processing power of
today’s mobile devices.
• Even high capacity phones can quickly fill up with
downloaded apps and multimedia.
• For browser-based software, such constraints are unlikely
– Desktop storage is essentially unlimited
(measured in terabytes).
– Cloud-based storage is easy to increase,
even if this requires charging higher prices to
end-users.

Current Challenges
• Internet Access
– With the exception of a few off-line browser
applications (e.g. Gmail), Web-based software
always requires an Internet connection.
• Mobile apps may or may not need online
access.
– When Internet is needed, mobile software
testers must factor in 3G and 4G – in addition
to normal Wi-Fi.

Current Challenges
• More Configurations
– The majority of today’s browsers follow the same
basic logic.
• There are exceptions to this rule, but:
– Chrome is not radically different from Internet Explorer.
– Firefox has more in common with Safari than Mozilla or
Apple.
– Not so in the mobile world.
• Various frameworks and platforms
– iOS, Android, Windows OS, and BlackBerry.
• Different hardware limitations specific to devices:
– Nokia, HTC, Sony, Samsung, Apple, etc.
– Even within the same manufacturer
• Siri doesn’t work with the iPhone 4 or earlier
Current Challenges
• More Configurations
– The frequency with which new mobile devices come to market:
• Web-based software testers nearly always have access to
the platforms they’re testing,
• Mobile app developers often have to use simulators and
emulators to test devices that have yet to become public.
– So many software updates.
• Mobile applications tend to go through more iterations (as do
their underlying operating systems).
– Browsers go through updates as well, and
major upgrades can render certain functions
obsolete.
• Can be resolved with patches and plugins

Current Challenges

• Input Interface
– Another major difference is how users
interface with mobile applications versus
Web-based software.
• With Web-based software, it is usually with keyboards
and mice (although even this is changing).
• With mobile applications, testers must
factor in touch screens, USB connections,
and even voice recognition

Current Challenges

• Performance Speed
– Speed is one area in which Web-based
software testers face a disadvantage.
– According to Google, delays of 400 milliseconds or longer
are enough for online users to notice the response time.
– You may have an easier time testing applications for the
Web, but your product can still fail if it doesn’t deliver
as promised in 399 milliseconds or less (see the latency-check
sketch below).
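A tiny response-time check along these lines (illustrative only; the 400 ms budget comes from the slide, while the URL and everything else are assumptions):

import time
import unittest
import urllib.request

URL = "http://localhost:8000/"   # hypothetical page under test
BUDGET_SECONDS = 0.400           # the 400 ms threshold discussed above

class ResponseTimeTest(unittest.TestCase):
    def test_page_loads_within_budget(self):
        start = time.perf_counter()
        with urllib.request.urlopen(URL, timeout=5) as response:
            response.read()
        elapsed = time.perf_counter() - start
        self.assertLess(elapsed, BUDGET_SECONDS,
                        f"page took {elapsed * 1000:.0f} ms, budget is 400 ms")

if __name__ == "__main__":
    unittest.main()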

Reference
• Software Engineering by Ian Sommerville
• Literature on Agile Software Development for the Agile platform
– http://agilemodeling.com/essays/fdd.htm
– https://www.guru99.com/agile-testing-a-beginner-s-guide.html
– https://www.tutorialspoint.com/agile_testing/agile_testing_tutorial.pdf
– https://www.testingexcellence.com/testing-quality-assurance-agile/

