
8. TESTING

Testing Concepts

 Testing

 Testing Methodologies
 Black box Testing:
 White box Testing.
 Gray Box Testing.
 Levels of Testing
 Unit Testing.
 Module Testing.
 Integration Testing.
 System Testing.
 User Acceptance Testing.
 Types Of Testing
 Smoke Testing.
 Sanity Testing.
 Regression Testing.
 Re-Testing.
 Static Testing.
 Dynamic Testing.
 Alpha-Testing.
 Beta-Testing.
 Monkey Testing.
 Compatibility Testing.
 Installation Testing.
 Ad-hoc Testing.
 Etc.
TCD (Test Case Documentation)
 STLC
 Test Planning.
 Test Development.
 Test Execution.
 Result Analysis.
 Bug-Tracing.
 Reporting.
 Microsoft Windows – Standards
 Manual Testing
 Automation Testing (Tools)
 WinRunner.
 TestDirector.
Software Testing is the process used to help identify the correctness, completeness,
security, and quality of developed computer software. Testing is a process of technical
investigation, performed on behalf of stakeholders, that is intended to reveal quality-related
information about the product with respect to the context in which it is intended to operate.
This includes, but is not limited to, the process of executing a program or application with the
intent of finding errors. Quality is not an absolute; it is value to some person. With that in
mind, testing can never completely establish the correctness of arbitrary computer software;
testing furnishes a criticism or comparison that compares the state and behavior of the
product against a specification. An important point is that software testing should be
distinguished from the separate discipline of Software Quality Assurance (SQA), which
encompasses all business process areas, not just testing.
Introduction:
In general, software engineers distinguish software faults from software failures. In case
of a failure, the software does not do what the user expects. A fault is a programming error
that may or may not actually manifest as a failure. A fault can also be described as an error
in the semantics of a computer program. A fault will become a failure if the
exact computation conditions are met, one of them being that the faulty portion of computer
software executes on the CPU. A fault can also turn into a failure when the software is ported
to a different hardware platform or a different compiler, or when the software gets extended.
Software testing is the technical investigation of the product under test to provide
stakeholders with quality related information.
Software testing may be viewed as a sub-field of Software Quality Assurance but typically
exists independently (and there may be no SQA areas in some companies). In SQA, software
process specialists and auditors take a broader view on software and its development. They
examine and change the software engineering process itself to reduce the number of faults
that end up in the code, or to deliver software faster.
Regardless of the methods used or the level of formality involved, the desired result of testing
is a level of confidence in the software so that the organization is confident that the software
has an acceptable defect rate. What constitutes an acceptable defect rate depends on the
nature of the software. An arcade video game designed to simulate flying an airplane would
presumably have a much higher tolerance for defects than software used to control an actual
airliner.
The software, tools, samples of data input and output, and configurations are all
referred to collectively as a test harness.

History

The separation of debugging from testing was initially introduced by Glenford J. Myers in
his 1979 book "The Art of Software Testing". Although his attention was on breakage testing,
it illustrated the desire of the software engineering community to separate fundamental
development activities, such as debugging, from that of verification. Drs. Dave Gelperin and
William C. Hetzel in 1988 classified the phases and goals of software testing as follows: until
1956 was the debugging-oriented period, when testing was often associated with debugging
and there was no clear difference between the two. From 1957 to 1978 came the
demonstration-oriented period, in which debugging and testing were distinguished; in this
period testing was meant to show that software satisfied its requirements. The period 1979-1982
is described as the destruction-oriented period, in which the goal was to find errors. 1983-1987 is
classified as the evaluation-oriented period: the intention was to provide product evaluation
and quality measurement throughout the software lifecycle. From 1988 onward testing was
seen as prevention-oriented: tests were to demonstrate that software satisfies its
specification, to detect faults, and to prevent faults. Dr. Gelperin chaired IEEE 829-1988
(the Test Documentation Standard), and Dr. Hetzel wrote the book "The Complete Guide to
Software Testing". Both works were pivotal to today's testing culture and remain a
consistent source of reference. Dr. Gelperin and Jerry E. Durant also went on to develop High
Impact Inspection Technology, which builds on traditional inspections but utilizes a test-driven
additive.

Testing:
 The process of executing a system with the intent of finding an error.
 Testing is defined as the process in which defects are identified, isolated, and
subjected to rectification, and in which it is ensured that the product is defect-free, in
order to produce a quality product and hence customer satisfaction.
 Quality is defined as justification of the requirements.
 A defect is nothing but a deviation from the requirements.
 A defect is nothing but a bug.
 Testing can demonstrate the presence of bugs, but not their absence.
 Debugging and testing are not the same thing!
 Testing is a systematic attempt to break a program or the AUT (Application Under Test).
 Debugging is the art or method of uncovering why the script/program did not
execute properly.
Testing Methodologies:
 Black box Testing: the testing process in which the tester performs testing on an
application without any knowledge of its internal structure.
Usually Test Engineers are involved in black box testing.
 White box Testing: the testing process in which the tester performs testing on an
application with knowledge of its internal structure.
Usually Developers are involved in white box testing.
 Gray Box Testing: the process in which a combination of black box and white
box techniques is used.
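The distinction can be illustrated with a small example (the discount function and its rule are hypothetical, not from the original text): a black-box test is written from the requirements alone and exercises only inputs and outputs, while a white-box test is written with knowledge of the internal branches so that boundary paths are covered.

```python
import unittest

def discount(total):
    """Hypothetical function under test: 10% off orders of 100 or more."""
    if total >= 100:
        return total * 0.9
    return total

class BlackBoxTests(unittest.TestCase):
    # Written from the requirement alone: "orders of 100 or more get 10% off".
    def test_small_order_not_discounted(self):
        self.assertEqual(discount(50), 50)

    def test_large_order_discounted(self):
        self.assertEqual(discount(200), 180.0)

class WhiteBoxTests(unittest.TestCase):
    # Written with knowledge of the code: >= is the branch boundary,
    # so the value 100 itself must be exercised.
    def test_boundary_value_hits_discount_branch(self):
        self.assertEqual(discount(100), 90.0)

if __name__ == "__main__":
    # exit=False keeps this usable when the module is imported or embedded.
    unittest.main(exit=False)
```

A gray-box tester would write cases like the black-box suite while using the branch knowledge above to choose the input values.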
Levels of Testing:

[Diagram: individual units are combined into modules (Module 1, Module 2, Module 3); inputs
and outputs are then verified as the modules are integrated, i.e. Unit Testing -> Module
Testing -> Integration Testing.]
System Testing: Presentation + Business + Database layers
UAT: User Acceptance Testing

STLC (SOFTWARE TESTING LIFE CYCLE)


Test Planning:
1. A Test Plan is a strategic document which describes the procedure for performing the
various kinds of testing on the total application in the most efficient way.
2. This document covers the scope of testing,
3. the objectives of testing,
4. the areas that need to be tested,
5. the areas that should not be tested,
6. scheduling and resource planning,
7. the areas to be automated and the various testing tools to be used.

Test Development:
1. Test case development (check list).
2. Test procedure preparation (description of the test cases).
3. Implementation of the test cases and observation of the results.
Result Analysis:
1. Expected value: nothing but the expected behavior of the application.
2. Actual value: nothing but the actual behavior of the application.
Bug Tracing: collect all the failed cases and prepare documents.
Reporting: prepare a document describing the status of the application.
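The result-analysis, bug-tracing, and reporting steps can be sketched as a simple comparison of expected against actual values, with the failures collected for the defect report (the test data here is illustrative, not from the original text):

```python
# Each case pairs a description with an expected and an actual value.
cases = [
    {"id": 1, "desc": "login with valid user", "expected": "home page", "actual": "home page"},
    {"id": 2, "desc": "save valid date", "expected": "data stored", "actual": "error 500"},
]

# Result analysis: compare expected vs. actual for every case.
for case in cases:
    case["result"] = "pass" if case["expected"] == case["actual"] else "fail"

# Bug tracing: collect all the failed cases for the defect report.
failed = [case for case in cases if case["result"] == "fail"]

# Reporting: summarize the status of the application.
print(f"{len(cases) - len(failed)} passed, {len(failed)} failed")
for case in failed:
    print(f"TC{case['id']}: expected {case['expected']!r}, got {case['actual']!r}")
```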

Types of Testing:
> Smoke Testing: the initial testing process in which the tester checks that all the
functionality of the application is available, so that detailed testing can then be performed
on it. (The main check is for the available forms.)
> Sanity Testing: a type of testing that is conducted on an application initially to check
for its proper behavior, that is, to check that all the functionality is available before
detailed testing is conducted.
> Regression Testing: one of the most important types of testing. Regression testing is the
process in which functionality that has already been tested is tested once again
whenever a new change is added, in order to check whether the existing functionality
remains the same.
> Re-Testing: the process in which testing is performed on some functionality that has
already been tested, to make sure that the defects are reproducible and to rule out
environment issues if any defects are present.
> Static Testing: testing that is performed on an application when it is not being
executed. Ex: GUI and document testing.
> Dynamic Testing: testing that is performed on an application while it is being
executed. Ex: functional testing.
> Alpha Testing: a type of user acceptance testing that is conducted on an application
just before it is released to the customer.
> Beta Testing: a type of UAT that is conducted on an application after it is released to
the customer, when it has been deployed into the real-time environment and is being
accessed by real users.
> Monkey Testing: the process in which abnormal, beyond-capacity operations are
performed on the application to check its stability in spite of abnormal user behavior.
> Compatibility Testing: the testing process in which products are tested on
environments with different combinations of databases, application servers, browsers,
etc., in order to check how far the product is compatible with all these environment and
platform combinations.
> Installation Testing: the process of testing in which the tester tries to install or
deploy the module into the corresponding environment by following the guidelines given
in the deployment document, and checks whether the installation is successful or not.
> Ad-hoc Testing: unlike formal testing, where a test case document is used, ad-hoc
testing is performed on an application without a test case document, in order to cover
features that are not covered in the test case document. It is also used for GUI testing,
which may involve cosmetic issues.
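Regression testing and re-testing are commonly automated so that existing functionality can be re-checked after every change. A minimal sketch of a regression suite (the function under test and its tax rule are hypothetical, not from the original text):

```python
import unittest

def add_tax(price):
    """Hypothetical function under test: add 5% tax, rounded to 2 places."""
    return round(price * 1.05, 2)

class RegressionSuite(unittest.TestCase):
    """Re-run after every change to confirm existing behavior is unchanged."""

    def test_whole_amount(self):
        self.assertEqual(add_tax(100), 105.0)

    def test_rounding(self):
        self.assertEqual(add_tax(19.99), 20.99)

if __name__ == "__main__":
    # exit=False keeps this usable when the module is imported or embedded.
    unittest.main(exit=False)
```

Re-testing a specific defect would add one more case here that reproduces the original failing input.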

TCD (Test Case Document):


Test Case Document Contains
 Test Scope (or) Test objective
 Test Scenario
 Test Procedure
 Test case
This is a sample test case document for the Academic Details screen of the student project:
Test Scope:
 Test coverage is provided for the "Academic Status Entry" screen of the student
module of the university management system application.
 Areas of the application to be tested.
Test Scenario:
 When office personnel use this screen for marks entry, they calculate the status
details, save the information on a per-student basis, and quit the form.
Test Procedure:
 The procedure for testing this screen is planned in such a way that the data entry,
status calculation, saving, and quitting operations are tested in terms of GUI testing,
positive testing, and negative testing, using the corresponding GUI test cases,
positive test cases, and negative test cases respectively.

Test Cases:
 Template for Test Case
T.C.No | Description | Expected value | Actual value | Result
1 | Enter user name and password | True/False | True | Home page
2 | Enter valid date to store in the database | Accurate/valid data | Valid date | Data stored successfully

Guidelines for Test Cases:


1. GUI Test Cases:
 Total no. of features that need to be checked
 Look & Feel
 Look for default values if any (date & time, if required)
 Look for spell check

Example for Gui Test cases:


T.C.No | Description | Expected value | Actual value | Result
1 | Check for all the features in the screen | The screen must contain all the features | |
2 | Check for the alignment of the objects as per the validations | The alignment should be proper | |

2. Positive Test Cases:


 The positive flow of the functionality must be considered
 Valid inputs must be used for testing
 Must have the positive perception to verify whether the requirements are justified.

Example for Positive Test cases:


T.C.No | Description | Expected value | Actual value | Result
1 | Check for the date and time auto-display | The date and time of the system must be displayed | |
2 | Enter a valid roll no into the student roll no field | It should accept | |

3. Negative Test Cases:


 Must have negative perception.
 Invalid inputs must be used for test.

Example for Negative Test cases:


T.C.No | Description | Expected value | Actual value | Result
1 | Try to modify the information in date and time | Modification should not be allowed | |
2 | Enter invalid data into the student details form and click Save | It should not accept invalid data; save should not be allowed | |

Login Page Test Cases
(The Actual column is recorded during test execution.)

Test Case Name: Login
Description: Validate Login - to verify that the login name on the login page must be greater than 1 character.
 Step: Enter a login name of less than 1 character (say "a") and a password, then click the Submit button.
   Expected: An error message "Login not less than 1 characters" must be displayed.
 Step: Enter a login name of 1 character (say "a") and a password, then click the Submit button.
   Expected: Login successful, or an error message "Invalid Login or Password" must be displayed.

Test Case Name: Pwd
Description: Validate Password - to verify that the password on the login page must be greater than 1 character.
 Step: Enter a password of less than 1 character (say nothing) along with a login name, then click the Submit button.
   Expected: An error message "Password not less than 1 characters" must be displayed.

Test Case Name: Pwd02
Description: Validate Password - to verify that the password on the login page must allow special characters.
 Step: Enter a password with special characters (say "!@hi&*P") along with a login name, then click the Submit button.
   Expected: Login successful, or an error message "Invalid Login or Password" must be displayed.

Test Case Name: Llnk
Description: Verify Hyperlinks - to verify whether the hyperlinks available at the left side of the login page are working or not.
 Step: Click the Sign Up link.
   Expected: The Home Page must be displayed.
 Step: Click the Sign Up link.
   Expected: The Sign Up page must be displayed.
 Step: Click the New Users link.
   Expected: The New Users Registration Form must be displayed.
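The login cases above translate naturally into automated checks. A sketch, assuming a hypothetical validate_login function that returns an error message string, or None on success:

```python
def validate_login(login, password):
    """Hypothetical login validator mirroring the cases above."""
    if len(login) < 1:
        return "Login not less than 1 characters"
    if len(password) < 1:
        return "Password not less than 1 characters"
    # In this sketch any non-empty credentials are accepted.
    return None

# Login name must not be empty.
assert validate_login("", "secret") == "Login not less than 1 characters"
# Password must not be empty.
assert validate_login("a", "") == "Password not less than 1 characters"
# Passwords may contain special characters.
assert validate_login("a", "!@hi&*P") is None

print("all login cases passed")
```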
Registration Page Test Cases
(The Actual column is recorded during test execution.)

Test Case Name: Registration
Description: Validate User Name - to verify that the user name on the registration page must be declared.
 Step: Enter a user name and click the Submit button.
   Expected: An error message "User Name must be declared".
Description: Validate Password - to verify that the password on the registration page must be declared.
 Step: Enter a password and click the Submit button.
   Expected: An error message "Password must be declared".
Description: Validate First Name - to verify that the first name on the registration page must be declared.
 Step: Enter a first name and click the Submit button.
   Expected: An error message "First Name must be declared".
Description: Validate Last Name - to verify that the last name on the registration page must be declared.
 Step: Enter a last name and click the Submit button.
   Expected: An error message "Last Name must be declared".
Description: Validate Address - to verify that the address on the registration page must be declared.
 Step: Enter an address and click the Submit button.
   Expected: An error message "Address must be declared".
Description: Validate Phone Number - to verify that the phone number on the registration page must be declared.
 Step: Enter a phone number and click the Submit button.
   Expected: An error message "Phone number must be declared".
Description: Validate Phone Number characters - to verify that the phone number on the registration page accepts only numeric values.
 Step: Enter a phone number containing characters (say "abc") and click the Submit button.
   Expected: An error message "Phone number must be numeric".
Description: Validate Phone Number validity - to verify that the phone number on the registration page must be a valid number.
 Step: Enter an invalid phone number (say "1234") and click the Submit button.
   Expected: An error message "Phone number must be a valid value".

Dispatch Goods Page Test Cases
(The Actual column is recorded during test execution.)

Test Case Name: Dispatch code
Description: Validate dispatch code - to verify that the dispatch code on the page must be greater than 1 character.
 Step: Enter a valid dispatch code and the dispatch goods, then click the Submit button.
   Expected: An error message "dispatch code" must be displayed.
 Step: Enter a dispatch code and the dispatch goods, then click the Submit button.
   Expected: Dispatch successful, or an error message "Invalid dispatch or dispatch goods" must be displayed.

Test Case Name: Dispatch goods1
Description: Validate Dispatch goods - to verify that the Dispatch goods page must be declared.
 Step: Enter the dispatch goods and dispatch code, then click the Submit button.
   Expected: An error message "Dispatch goods" must be displayed.

Test Case Name: Dispatch goods2
Description: Validate Dispatch goods - to verify that the Dispatch goods page must be declared.
 Step: Enter the dispatch goods and dispatch code, then click the Submit button.
   Expected: Dispatch of goods successful.

Add Material Page Test Cases
(The Actual column is recorded during test execution.)

Test Case Name: Add Material
Description: Validate Material Name - to verify that the material name on the page must be declared.
 Step: Enter a material name and click the Submit button.
   Expected: An error message "Material Name must be declared".
Description: Validate Material Data - to verify that the material data on the page must be declared.
 Step: Enter the material data and click the Submit button.
   Expected: An error message "Material Data must be declared".
Description: Validate Date - to verify that the date on the page must be declared.
 Step: Enter a date and click the Submit button.
   Expected: An error message "Date must be declared".
Description: Validate File Type - to verify that the file type on the page must be declared.
 Step: Enter a file type and click the Submit button.
   Expected: An error message "File type must be declared".

System Security Measures:

Data Security

Term Definition
Data security is the process of protecting information systems and their data from
unauthorized, accidental, or intentional modification, destruction, or disclosure. This
protection covers the confidentiality, integrity, and availability of these systems and data.

Risk assessment, mitigation and measurement are key components of data security. To
maintain a secure environment, data security protocols require that any changes to data
systems have an audit trail, which identifies the individual, department, time and date of any
system change. Companies utilize personnel, policies, protocols, standards, procedures,
software, hardware and physical security measures to attain data security. Data security may
include one or a combination of all of these.

Data security is not confined to the Information Services or Information Technology
departments, but involves various stakeholders, including senior management, the board of
directors, regulators, internal and external auditors, partners, suppliers, and shareholders.

Data security encompasses the security of the Information System in its entirety. The U.S.
National Information Systems Security Glossary defines Information Systems Security
(INFOSEC) as: "The protection of information systems against unauthorized access to or
modification of information, whether in storage, processing or transit, and against the denial
of service to authorized users or the provision of service to unauthorized users, including
those measures necessary to detect, document, and counter such threats."

Protecting data from unauthorized access is one component of data security that receives a
great deal of attention. The concern for data protection extends beyond corporate concerns
but is a high priority consumer interest as well. Data can be protected against unauthorized
access through a variety of mechanisms. Passwords, digital certificates and biometric
techniques all provide more secure methods of accessing data. Once the user has been
authorized or authenticated, sensitive information can be encrypted to prevent spying or
theft. However, even the most sophisticated data security programs and measures cannot
prevent human error. Security safeguards must be adhered to and protected to be effective.
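As the passage notes, passwords are one of the most common access-control mechanisms, and they must never be stored in plain text. A minimal sketch of salted password hashing using Python's standard library (the scheme and iteration count are illustrative choices, not from the original text):

```python
import hashlib
import hmac
import os

def hash_password(password):
    """Derive a salted hash; store the (salt, hash) pair, never the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Re-derive the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("s3cret")
print(verify_password("s3cret", salt, digest))  # correct password: True
print(verify_password("wrong", salt, digest))   # incorrect password: False
```

The random per-user salt ensures that two users with the same password still get different stored hashes.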

a. Creating user profiles and access rights

Step 1. Open SQL*Plus and log in as the administrator:
  User Name: --------
  Password: --------
  Host String: --------

Step 2. Create the user and grant privileges:

SQL> create user <username> identified by <password>;
User created.

SQL> grant dba to <username>;
Grant succeeded.

SQL> conn <username>/<password>;        (the newly created username and password)
Connected.

SQL> show user;
USER is "<username>"                    (check the username here)

SQL> select * from tab;
no rows selected

Step 3. Import the database dump:

Start --> Run --> imp
Username: <username>/<password>                            (press Enter)
Import file: EXPDAT.DMP > <give database dump file path>   (example: f:\xxx.dmp, then press Enter)
Enter insert buffer size (minimum is 8192) 30720>          (press Enter)
List contents of import file only (yes/no): no>            (enter n, then press Enter)
Answer y to each of the remaining prompts, pressing Enter after each.

SQL> select * from tab;
... rows selected

SQL> desc <tablename>;               (gives the column names and datatypes)
SQL> select * from <tablename>;      (lists the user ids, passwords, and login types)

Cost Estimation of the Project:

Total Metrics specializes in quantifying software development projects early in their lifecycle
and using their functional size as input into software project resource estimates of effort, cost,
team size and schedule. We use industry project data sourced from ISBSG combined with the
expert knowledge base of Knowledge PLAN™ to determine the likely productivity and
quality of the project.

Functional size can be determined as soon as the business requirements are identified. Our
project estimates use an independent 'top down' method of estimating, which complements the
standard 'bottom up' work breakdown methodologies. However, industry experience shows
that functional size based estimates are much more accurate early in the lifecycle of a project
and can be completed in under 3 days of effort, compared to work breakdown estimates,
which need detailed project information and take weeks to develop.

Our estimation techniques have been proven to be accurate and provide an independent
estimate of project budget and schedule requirements.

Why Estimate?
The accuracy of project estimates can have a dramatic impact on profitability. Software
development projects are characterized by regularly overrunning their budgets and rarely
meeting deadlines.
Effective software estimation is one of the most important software development activities;
it is also one of the most difficult. Underestimating a project leads to understaffing it,
under-scoping the quality assurance effort, and setting too short a schedule. That in
turn can lead to staff burnout, low quality, loss of credibility, missed deadlines, and
ultimately to an inefficient development effort that takes longer than normal. Overestimating a
project can be almost as bad. Parkinson's Law states that work expands to fill the time
available for its completion, which means that the project will take as long as estimated
even if it is overestimated. An accurate estimate is a critical part of the foundation of
efficient software development.

Total Metrics uses Functional Size Measures and Industry data to develop Project
Estimates:
Size is a major driver of effort, duration, and risk. Once Total Metrics has measured the
functional size of your project, estimates of cost, duration, effort, and defects can be
created.
Estimating is a critical business process, especially at the early stages of the project.
Enterprises have limited resources of personnel, time and budget, and proper estimating will
allow the leaders of the enterprise to properly allocate these limited resources to achieve the
highest benefits.
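The relationship between functional size and the resulting effort and schedule estimates can be sketched with a simple model. The delivery rate and working-hours figures below are illustrative placeholders, not published ISBSG or Knowledge PLAN values:

```python
def estimate_effort(function_points, hours_per_fp=8.0):
    """Rough effort estimate: functional size times an assumed delivery rate."""
    return function_points * hours_per_fp

def estimate_schedule_months(effort_hours, team_size, hours_per_month=140):
    """Convert effort into calendar months for a given team size."""
    return effort_hours / (team_size * hours_per_month)

# A hypothetical 350 function point project staffed with 4 people.
effort = estimate_effort(350)
months = estimate_schedule_months(effort, team_size=4)
print(f"effort: {effort:.0f} hours, schedule: {months:.1f} months")
```

In practice the delivery rate would come from calibrated industry data rather than a fixed constant, which is exactly why sources such as ISBSG project data matter.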

There are detailed estimating processes in many different industry sectors. In the building
industry, there are standard books with detailed methodologies for all of the craftsmen
required to build a home or office building. In engineering, there are similar guidelines based
on physics and chemistry. If these disciplines have strong estimating practices, then why is
software estimation's track record so abysmal?

The information technology (IT) trade magazines are constantly filled with stories of
runaway projects which have exceeded their original budgets by multiples of two and higher,
projects which failed to meet the users' requirements or delivery dates, and projects which
were cancelled after substantial financial investments. These failed projects seem to share
some consistent elements:

 Inadequate project definition
 Lack of scope control
 Poor or non-existent estimating process
 Misapplication of metrics
 Lack of project management
