Overview On Testing

This document discusses the system development life cycle (SDLC) model, which involves investigating, analyzing, designing, implementing, and maintaining information systems. The key phases of the SDLC model include system/information engineering and modeling, software requirements analysis, system analysis and design, code generation, testing, and maintenance. Software testing is an important part of the SDLC process and helps identify bugs and ensure requirements are met before public release. Different types of testing include functional, non-functional, and automated testing.

System Development Life Cycle Model (SDLC Model)

SDLC is the process of developing information systems through investigation, analysis, design, implementation and maintenance. SDLC is also known as information systems development or application development. This is also known as the Classic Life Cycle Model (or Linear Sequential Model, or Waterfall Method). It has the following activities:

1. System/Information Engineering and Modeling
2. Software Requirements Analysis
3. Systems Analysis and Design
4. Code Generation
5. Testing
6. Maintenance

System/Information Engineering and Modeling

As software is always part of a larger system (or business), work begins by establishing requirements for all system elements and then allocating some subset of these requirements to software. This system view is essential when software must interface with other elements such as hardware, people and other resources. The system is the basic and very critical requirement for the existence of software in any entity. So if the system is not in place, the system should be engineered and put in place. In some cases, to extract the maximum output, the system should be re-engineered and spruced up. Once the ideal system is engineered or tuned, the development team studies the software requirements for the system.

Software Requirement Analysis

This is also known as the feasibility study. In this phase, the development team visits the customer and studies their system. They investigate the need for possible software automation in the given system. By the end of the feasibility study, the team furnishes a document that holds the specific recommendations for the candidate system. It also includes the personnel assignments, costs, project schedule, and target dates. The requirements gathering process is intensified and focused specifically on software. To understand the nature of the program(s) to be built, the system engineer ("analyst") must understand the information domain for the software, as well as the required function, behavior, performance and interfacing. The essential purpose of this phase is to find the need and to define the problem that needs to be solved.

System Analysis and Design

In this phase, the software development process, the software's overall structure and its nuances are defined. In terms of client/server technology, the number of tiers needed for the package architecture, the database design, the data structure design, etc. are all defined in this phase. A software development model is created. Analysis and design are very crucial in the whole development cycle. Any glitch in the design phase could be very expensive to solve in a later stage of the software development, so much care is taken during this phase. The logical system of the product is developed in this phase.

Code Generation

The design must be translated into a machine-readable form. The code generation step performs this task. If the design is performed in a detailed manner, code generation can be accomplished without much complication. Programming tools like compilers, interpreters and debuggers are used to generate the code. Different high-level programming languages like C, C++, Pascal and Java are used for coding. The right programming language is chosen with respect to the type of application.

Testing

Once the code is generated, software program testing begins. Different testing methodologies are available to unravel the bugs that were committed during the previous phases. Different testing tools and methodologies are already available, and some companies build their own testing tools that are tailor-made for their own development operations.

Maintenance

Software will definitely undergo change once it is delivered to the customer. There are many reasons for the change. Change could happen because of some unexpected input values into the system. In addition, changes in the system could directly affect the software operations. The software should be developed to accommodate changes that could happen during the post-implementation period.

"hat is software testing: The goal of the testing activity is to find as many errors as possi/le /efore the user of the software finds them. "e can use testing to determine whether a program component meets its re!uirements. To accomplish its primary goal (finding errors or any of its secondary purposes (meeting re!uirements , software testing must /e applied in a systematic fashion. Testing involves operation of a system or application under controlled conditions and evaluating the results. #erification and #alidation ;erification and ;alidation (;<; is a Software Testing activity to enhance !uality of the software /eing /uilt. %t is planned and conducted systematically through out the software lifecycle. ;erification is the checking or testing of items, including software, for conformance and consistency with an associated specification. Software testing is 4ust one kind of verification, which also uses techni!ues such as reviews, analysis, inspections and walkthroughs. ;alidation is the process of checking that what has /een specified is what the user actually wanted. ;alidation activity may /egin when most or all software functions as per customer e0pectations. ;alidation testing provides final accurance that the software meets all functional, /ehavioural and performance re!uirements. =sually 3lack1/o0 testing is used for this activity #erification$ )re we /uilding the 7ro4ect right: #alidation$ )re we /uilding the right product: De%u in #s "estin

Debugging vs Testing

The term bug is often used to refer to a problem or fault in a computer; there are software bugs and hardware bugs. Software testing should not be confused with debugging. Debugging is the process of analyzing and locating bugs when software does not behave as expected. Although the identification of some bugs will be obvious from playing with the software, a methodical approach to software testing is a much more thorough means of identifying bugs. Debugging is therefore an activity that supports testing, but cannot replace testing. However, no amount of testing can be guaranteed to discover all bugs.

Common Problems

- Poor requirements: if requirements are unclear, incomplete, too general, or not testable, there will be problems.
- Unrealistic schedule: if too much work is crammed into too little time, problems are inevitable.
- Inadequate testing: no one will know whether or not the program is any good until the customer complains or systems crash.
- Requirements change: requests to pile on new features after development is underway are common.
- Miscommunication: if developers don't know what is needed or customers have erroneous expectations, problems are guaranteed.
- Poorly documented code: sufficient comments are not built into the source code, and requirement changes are not updated in the impacted documents.

Solutions

- Solid requirements: clear, complete, detailed, cohesive, attainable, testable requirements that are agreed to by all players; use prototypes to help nail down requirements.
- Realistic schedules: allow adequate time for planning, design, testing, bug fixing, re-testing, changes, and documentation; personnel should be able to complete the project without burning out.
- Adequate testing: start testing early on, re-test after fixes or changes, and plan adequate time for testing and bug fixing.
- Stick to initial requirements as much as possible: be prepared to defend against excessive changes and additions once development has begun, and be prepared to explain the consequences. If changes are necessary, they should be adequately reflected in related schedule changes. If possible, use a rapid prototype during the design phase so that customers can see what to expect. This will give them a higher comfort level with their requirements decisions and minimize changes later on.
- Communication: require walkthroughs and inspections when appropriate; make extensive use of group communication tools (e-mail, groupware, networked bug-tracking tools, change management tools, intranet capabilities, etc.); ensure that information and documentation are available and up to date, preferably electronic, not paper; promote teamwork and cooperation; use prototypes early on so that customers' expectations are clarified.

Testing Services & Types

WS Testing Practice provides a wide services portfolio, from Unit Testing right up to Application Certification. In general, testing services can be categorized into the following main types:

- Functional Testing
- Non-Functional Testing
- Automated Testing (for both functional and non-functional testing)
- Competitive Analysis Testing

Functional Testing

Testing the developed application against business requirements. Functional testing is done using the functional specifications provided by the client or by using design specifications, such as use cases, provided by the design team. Functional testing covers:

- Unit Testing
- Smoke Testing / Sanity Testing
- Integration Testing (Top-Down, Bottom-Up)
- Interface & Usability Testing (including Independent Focus Groups)
- System Testing
- Regression Testing
- Pre User Acceptance Testing (Alpha & Beta)
- User Acceptance Testing
- White Box Testing, Black Box Testing
- Globalisation and Localisation Testing (Regional Settings, Languages, etc.)

WS's Testing Practice has 300+ person-years of experience across various types of functional testing.

Non-Functional Testing

Testing the application against the client's environment and performance requirements. Non-functional testing is done based on the requirements and test scenarios defined by the client. Non-functional testing covers:

- Load and Performance Testing
- Ergonomics Testing
- Stress & Volume Testing
- Compatibility & Migration Testing
- Data Conversion Testing
- Security / Penetration Testing
- Operational Readiness Testing
- Installation Testing
- Security Testing (Application Security, Network Security, System Security)

Web Spiders Testing Practice has 175+ person-years of experience across various types of non-functional testing.

Automated Testing

Automated testing is the art of converting manual test cases into machine-executable code. The output of a test automation project is a test suite (or a set of test suites), which will be used by testers to verify the application time and again. Test automation is perceived as an efficiency improvement program that improves the time-to-market advantage for a product development organization. Automated testing automates the manual testing process currently in use. This requires that a formalized "manual testing process" exists in the company or organization. Minimally, such a process includes:

- Detailed test cases, including predictable "expected results", which have been developed from Business Functional Specifications and Design documentation.
- A standalone test environment, including a test database that is restorable to a known constant, so that the test cases can be repeated each time modifications are made to the application.
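As an illustrative sketch of this conversion (not the practice's actual tooling), the snippet below turns a manual test case — "insert an order with two line items and verify the order total" — into machine-executable form. The schema and values are invented; an in-memory SQLite database plays the role of a test database restored to a known constant before each run.

```python
import sqlite3
import unittest

class OrderTotalTest(unittest.TestCase):
    """Automated version of a manual test case:
    'Insert an order with two line items and verify the order total.'"""

    def setUp(self):
        # Restore the test database to a known constant before each test.
        self.db = sqlite3.connect(":memory:")
        self.db.execute(
            "CREATE TABLE line_items (order_id INTEGER, price REAL, qty INTEGER)"
        )
        self.db.executemany(
            "INSERT INTO line_items VALUES (?, ?, ?)",
            [(1, 9.99, 2), (1, 5.00, 1)],  # known baseline data
        )

    def test_order_total(self):
        # Expected result comes from the manual test case: 2 * 9.99 + 5.00.
        (total,) = self.db.execute(
            "SELECT SUM(price * qty) FROM line_items WHERE order_id = 1"
        ).fetchone()
        self.assertAlmostEqual(total, 24.98)

    def tearDown(self):
        self.db.close()

if __name__ == "__main__":
    unittest.main()
```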

"e/ Spiders Testing 7ractice has .F8 person1years of )utomation Testing e0perience. Competitive Analysis "estin Competitive )nalysis can /e defined as usa/ility, functionality or a performance evaluation where/y two or more competitive products are compared /y simulating an environment where the products are going to /e used. (e.g. %& ;s Eetscape, 2racle D3 ;s. SHL Server ;s D3' "e/ SpiderAs Competitive )nalysis Testing ensures that your product fits in perfectly in its competitive market. 2ur analysis matri0 makes sure that you get the most out of the analysis, whether you are a manufacturer or a /uyer. Dor 7roduct developers our /enchmark testing services provide results that form the /asis for !uality improvement or as a sales tool to show superiority of their products. Dor /uyers, itAs an effective way to help them make the right purchase decision. 3efore starting our analysis, we ensure that the right measurement parameters are identified on the /asis of which the analysis is carried out. %n order to understand this, we encourage the active participation of our customers. 2nce these are decided, our H) e0perts do the rest. "ec*nolo y Based Software "estin 1. !+, testin "estin Considerations for !+, CommunicationB aspects to /e tested areC I Tool tips and status /ar Missing information I &na/le$Disa/le tool/ar /uttons I "rong$misleading$confusing information I ?elp te0t and &rror messages I Training documents Dialog 3o0esB aspects to /e tested areC I Jey/oard actions I Mouse actions I Canceling I 2kaying I Default /uttons I Layout error I Modal I "indow /uttons I Si>a/le I Title$%con I Ta/ order I Display layout I 3oundary conditions I Sorting I )ctive window I Memory leak Command structureB aspects to /e tested areC I Menus I 7opup menus I Command Line 7arameters I State transitions 7rogram rigidityB aspects to /e tested areC I =ser options I Control I 2utput 7referencesB aspects to /e tested areC I =ser tailor1a/ility

I ;isual preferences I Locali>ation =sa/ilityB aspects to /e tested areC I )ccessi/ility I (esponsiveness I &fficiency I Comprehensi/ility I =ser scenarios I &ase of use Locali>ationB aspects to /e tested areC I Translation I &nglish1only dependencies I Cultural dependencies I =nicode I Currency I Date$Time I Constants I Dialog contingency -. Application "estin "estin Considerations for Application "estin Applications C, C88 )pplications "estin Memory leak detection Code coverage Static and dynamic testing Test coverage analysis (untime error detection )utomated component testing Measurement of maintaina/ility, porta/ility, comple0ity, standards compliance 3oundary conditions testing 9ava )pplications$)pplets )utomated component testing Dunctional testing 7erformance testing )pplet$application testing Code coverage 3oundary conditions testing "in *'1/ased )pplications Memory leak detection of win*' programs 7erformance testing Stress testing of "indows applications and system clients /. Application 0ro rammin ,nterface (A0,) testin "estin Considerations1,ssues The following tasks can /e automatedC #. '. *. +. -. .. G. Test1code /uilds Test1suites e0ecution (eport generation (eport pu/lishing (eport notification 7eriodic e0ecution of test1suites ) tool for test automation should /e identified very early in the pro4ect.

K. ) simulator (that services the )7% calls should /e developed /y the testing team. %n the a/sence of this simulator, there is a dependency on a Server (not developed /y the )7% development team . ;alidating the )7%s for user1friendlinessC Meaningful names Concise and short names Eum/er of arguments &asy1to1read names
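A minimal sketch of the simulator idea, using an invented payment API: the test suite talks to a stand-in object that services the same calls as the real server, so API tests can run without a live back end. The method names and behaviour here are assumptions for illustration only.

```python
class PaymentServerSimulator:
    """Stand-in that services the same calls as the real payment server.
    The API (authorize/capture) and its behaviour are invented for this
    example; a real simulator would mirror the actual API contract."""

    def __init__(self):
        self.authorized = {}

    def authorize(self, card_number, amount):
        # Simulate a deterministic server response: decline bad input.
        if not card_number.isdigit() or amount <= 0:
            return {"status": "DECLINED"}
        txn_id = len(self.authorized) + 1
        self.authorized[txn_id] = amount
        return {"status": "OK", "txn_id": txn_id}

    def capture(self, txn_id):
        # Only previously authorized transactions can be captured.
        if txn_id not in self.authorized:
            return {"status": "ERROR"}
        return {"status": "CAPTURED", "amount": self.authorized.pop(txn_id)}

# API test cases run against the simulator instead of a live server.
server = PaymentServerSimulator()
auth = server.authorize("4111111111111111", 25.00)
assert auth["status"] == "OK"
assert server.capture(auth["txn_id"])["status"] == "CAPTURED"
assert server.authorize("bad-card", 25.00)["status"] == "DECLINED"
```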

4. Middleware Testing

- Functional testing
- Interoperability testing
- Performance testing

5. Database/Backend (and database applications) testing

Testing considerations/issues: apart from testing (verifying and validating) the application itself, the following need to be considered for databases:

- Distributed environment
- Performance
- Integration with other systems
- Security aspects

Description/sub-tasks

Boundary conditions testing (data); aspects to be tested:
- Dataset
- Numeric
- Alpha
- Numerosity
- Field size
- Data structures
- Timeouts

Accuracy/integrity testing (data); aspects to be tested:
- Calculations (reports)
- Calculations (backend)
- Divide by zero
- Truncate
- Compatibility
- Test data
- Data consistency
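As a hedged illustration of accuracy/integrity checks, the snippet below exercises an invented report-side calculation for divide-by-zero handling and for agreement with the backend value up to rounding. The function name and rules are hypothetical, not from the document.

```python
import unittest

def average_unit_price(total, quantity):
    """Hypothetical report calculation under test: average price per
    unit, defined by the (invented) spec to be 0.0 when quantity is 0."""
    if quantity == 0:
        return 0.0  # spec says: never divide by zero, report 0.0
    return round(total / quantity, 2)  # reports show 2 decimal places

class AccuracyIntegrityTest(unittest.TestCase):
    def test_divide_by_zero_is_handled(self):
        self.assertEqual(average_unit_price(100.0, 0), 0.0)

    def test_report_matches_backend_within_rounding(self):
        # Backend keeps full precision; the report rounds to 2 places.
        backend_value = 100.0 / 3
        self.assertAlmostEqual(average_unit_price(100.0, 3),
                               backend_value, places=2)

if __name__ == "__main__":
    unittest.main()
```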

Database connectivity; aspects to be tested:
- Save
- Retrieval

Database schema testing; aspects to be tested:
- Databases and devices
- Tables, fields, constraints, defaults
- Keys and indices
- Stored procedures
- Error messages
- Triggers (update, insert and delete triggers)
- Schema comparisons
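A minimal sketch of an automated schema check against SQLite's catalog: the expected tables and columns below are invented, and a real suite would read its expectations from the baselined schema design rather than hard-coding them.

```python
import sqlite3

# Hypothetical expected schema, as it would come from the design baseline.
EXPECTED = {"customers": {"id", "name", "email"},
            "orders": {"id", "customer_id", "total"}}

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, email TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER
                         REFERENCES customers(id), total REAL);
""")

for table, expected_cols in EXPECTED.items():
    # Verify the table exists in the catalog.
    row = db.execute("SELECT name FROM sqlite_master WHERE type='table' "
                     "AND name=?", (table,)).fetchone()
    assert row is not None, f"missing table: {table}"
    # Verify its columns match the baseline (a schema comparison).
    actual_cols = {r[1] for r in db.execute(f"PRAGMA table_info({table})")}
    assert actual_cols == expected_cols, f"schema drift in {table}"

print("schema matches baseline")
```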

Security testing; aspects to be tested:
- Login and user security

6. Web Site/Page Testing

Link and HTML testing:
- Detecting HTML compatibility problems
- Checking Cascading Style Sheets
- Checking links and content
- Checking HTML syntax / validating HTML documents
- Detecting broken/dead links (a minimal link-check sketch appears after these lists)
- Web site performance analysis

Functional testing:
- Web site functional testing
- Testing for completeness and consistency of web pages

Performance testing:
- Load testing of web-based systems
- Reliability, performance and scalability testing of web applications
- Load and performance testing of web servers
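A minimal sketch of broken-link detection using only the Python standard library; real practice would use dedicated tools, and the starting URL here is a placeholder.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def check_links(page_url):
    """Fetch a page and report links that do not resolve."""
    html = urlopen(Request(page_url)).read().decode("utf-8", "replace")
    parser = LinkExtractor()
    parser.feed(html)
    for link in parser.links:
        target = urljoin(page_url, link)  # resolve relative links
        try:
            status = urlopen(Request(target, method="HEAD")).status
            print(f"OK   {status} {target}")
        except (HTTPError, URLError) as err:
            print(f"DEAD     {target} ({err})")

if __name__ == "__main__":
    check_links("https://example.com/")  # placeholder URL
```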

Testing Life Cycle

Web Spiders has a well-defined testing life cycle, which is applicable to all testing scenarios. The life cycle ensures that all the relevant inputs are obtained, the planning is adequately carried out, and the executions go according to plan. In addition, the results are obtained, reviewed and monitored. The life cycle also defines the interfaces into the overall quality management processes and the project delivery phases. The testing life cycle can be broadly classified into three different life cycle models, depending upon the type of application and the test strategy used:

- Application Testing Life Cycle
- Automation Testing Life Cycle
- Package Testing Life Cycle

Application "estin life cycle This life cycle is used for standard applications that are /uilt from various custom technologies and follow the normal or standard testing approach. The )pplication or custom1 /uild Lifecycle and its phases is depicted /elowC I (e!uirement Specification documents I Dunctional Specification documents I Design Specification documents (use cases, etc I =se case Documents I Test Trace1a/ility Matri0 for identifying Test Coverage

"est Requirements

"est 0lannin

I Test Scope, Test &nvironment I Different Test phase and Test Methodologies I Manual and )utomation Testing I Defect Mgmt, Configuration Mgmt, (isk Mgmt. &tc I &valuation < identification @ Test, Defect tracking tools

"est 7nvironment Setup

I Test 3ed installation and configuration I Eetwork connectivityAs I )ll the Software$ tools %nstallation and configuration I Coordination with ;endors and others I Test Tracea/ility Matri0 and Test coverage I Test Scenarios %dentification < Test Case preparation I Test data and Test scripts preparation I Test case reviews and )pproval I 3ase lining under Configuration Management I )utomation re!uirement identification I Tool &valuation and %dentification. I Designing or identifying Dramework and scripting I Script %ntegration, (eview and )pproval I 3ase lining under Configuration Management

"est Desi n

"est Automation

"est 78ecution and Defect "rackin

I &0ecuting Test cases I Testing Test Scripts I Capture, review and analy>e Test (esults I (aised the defects and tracking for its closure

"est Reports and Acceptance

I Test summary reports I Test Metrics and process %mprovements made I 3uild release I (eceiving acceptance

Automation "estin life cycle )dvantages of this automated software using the a/ove )ST life cycle. ?igh Huality to market Low Time to market (educed testing time Consistent test procedures (educed H) costs %mproved testing productivity %mproved product !uality I (e!uirement $ Dunctional Specification documents I Design Specification documents (use cases, etc I Test Tracea/ility Matri0 for identifying Test Coverage I Dunctional$ Eon1Dunctional and test data re!uirements I Test phases to /e automated and L of automation

AS" Requirements

AS" 0lannin

I )utomated Software Testing ()ST Scope I Tool &valuation and identification I )ST Methodologies and Dramework I 7repare and 3ase lining Scripting standard and )ST7lan

AS" 7nvironment Setup

I )ST Test 3ed installation and configuration I Eetwork connectivityAs I )ll the Software$ tools Licenses, %nstallation and configuration I Coordination with ;endors and others

AS" Desi n

I Test Script and test data preparation I Test scripts $ test data review and unit testing I %ntegration Testing Test scripts and testing I 3ase lining under Configuration Management

AS" 78ecution and Defect "rackin

I &0ecuting )ST Test Suit I Capture, review and analy>e Test (esults I Defects reporting and tracking for its closure

AS" Maintenance Reports and Acceptance

I )ST (esults and summary reports I Test Metrics and process %mprovements made I 3ase lining of )ST Test suits$ scripts$ test date etc for maintenance phase I ,etting )cceptance

0acka e "estin life cycle Testing life cycle followed for all the packaged applications like 2racle, S)7, Sie/el, C(M tools, Supply Chain management applications, etc are detailed in the /elow diagram.

0ro9ect 0reparation

I %dentifying the /usiness processes I 2rgani>ation of the pro4ect team I Setting up the communication channel I Jick start the pro4ect I %dentifying the infrastructure availa/ility I (eporting structure and pro4ect co1 ordination

Business Blueprintin

I (e!uirement Study I %dentifying the /usiness rules I Mapping the /usiness processes I %dentify the test conditions I Setting up the test environment for the system I Dorms the input needed for the configurations

Reali:ation

I Configuration < Customi>ation I )ctivating the /usiness rules I Development of certain flows I %dentifying certain flows not in the standard I Dorming the system configurations I =nit Testing

'inal 0reparation

I =ploading the master data I &nd user training I Simulating all the flows I Tie1up /etween interfaces I 2perational (eadiness Testing and =)T I Sign1off

Cut over; !o)live and Support

I Migrate to new system I Transfer all legacy /usiness applications I Communicate deployment status. I Support new system I Transfer ownership to system owner I Take customer acceptance after production deployment

Delivery Model

Given below is the generic service delivery model followed by the Web Spiders practice.

Onsite
- Resources at the onsite location
- Complete project execution onsite
- Project reporting to the Onsite Manager

Onsite-Offshore
- Resources at onsite and offshore locations
- Project execution at both onsite and offshore
- Project transition to offshore
- Project reporting to the Offshore Manager
- Project backed by proven processes, methodologies and support functions

Offshore
- Entire project execution offshore
- Project reporting to the Offshore Manager
- Installation and configuration of all software and tools
- Project backed by proven processes, methodologies and support functions

"e/ Spiders selects the ideal delivery model /ased on each customerAs specific /usiness case. ) very high1level workflow of the interaction process /etween the customer and "e/ Spiders is illustrated /elowC #. The client sends the introductory mail mentioning the outline of the H) 4o/ that he$she is looking for. '. The H) Manager goes through it and responds with a set of !ueries. *. The client responds and asks for the test plan and sample test cases. +. The H) Manager prepares the test plan and mails it along with a few sample test cases. -. The test plan is reviewed and finali>ed /y /oth the parties. .. The H) Manager discusses the pro4ect with the team mem/ers and the test cases are prepared. G. Test cases are e0ecuted for a particular version of the product, reviewed and mailed to the client. K. The /ugs are approved /y the client and de/ugged /y his$her development team. M. Depending on the pro4ect re!uirements, the client releases the ne0t version of the pro4ect for H) and the test plan and test cases are updated, accordingly. #F. The same process is repeated. 2nce all the /ugs have /een reported, and the pro4ect is re1tested and de1/ugged, the H) process concludes "est Delivera%les There are different test delivera/les at every phase of the SDLC. These delivera/les are provided /ased on the re!uirement once /efore the start of the test phase and there are other delivera/les that are produced towards the end$after completion of each test phase. )lso there are several test metrics that are collected at each phase of testing. 3elow are the details of the various test delivera/les corresponding to each test phase along with their test metrics. The standard delivera/les provided as part of testing areC Test Trace1a/ility Matri0

- Test Plan
- Testing Strategy
- Test Cases (for functional testing)
- Test Scenarios (for non-functional testing)
- Test Scripts
- Test Data
- Test Results
- Test Summary Report
- Release Notes
- Tested Build

"est Metrics There are several test metrics identified as part of the overall testing activity in order to track and measure the entire testing process. These test metrics are collected at each phase of the testing life cycle $SDLC and analy>ed and appropriate process improvements are determined and implemented as a result of these test metrics that are constantly collected and evaluated as a parallel activity together with testing /oth for manual and automated testing irrespective of the type of application. The test metrics can /e /roadly classified into the following three categories such asC

1. Project-related metrics, such as Test Size, number of test cases tested per day - automated (NTTA), number of test cases tested per day - manual (NTTM), number of test cases created per day - manual (TCED), total number of review defects (RD), total number of testing defects (TD), etc.

2. Process-related metrics, such as Schedule Adherence (SA), Effort Variance (EV), Schedule Slippage (SS), test case and script rework effort, etc.

3. Customer-related metrics, such as percentage of defects leaked per release (PDLPR), percentage of automation per release (PAPR), Application Stability Index (ASI), etc.
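As a hedged sketch, the snippet below computes two of these metrics under common industry definitions; the formulas are assumptions for illustration, not the document's own, since organizations define EV and PDLPR in their own quality manuals.

```python
def effort_variance(planned_hours, actual_hours):
    """Effort Variance (EV), assuming the common definition:
    deviation of actual effort from planned effort, as a percentage."""
    return (actual_hours - planned_hours) / planned_hours * 100.0

def defects_leaked_per_release(found_in_testing, found_in_production):
    """Percentage of Defects Leaked Per Release (PDLPR), assuming the
    common definition: production-found defects over all defects."""
    total = found_in_testing + found_in_production
    return found_in_production / total * 100.0 if total else 0.0

# Example with invented figures for one release:
print(f"EV    = {effort_variance(400, 460):.1f}%")         # 15.0%
print(f"PDLPR = {defects_leaked_per_release(97, 3):.1f}%")  # 3.0%
```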


Our testing services have broad exposure to, and deep expertise in, the automated test tools, test management tools and defect tracking tools that are used to smooth the testing process and make it more repeatable, with centralized tracking and a central repository.

1. Automated testing tools

Sample "ools descri%ed %elow$ Sl.(o. # ' * + . G K M #F "ool (ame Huick Test 7rofessional Silk 7erformer Silk Test "in(unner ..' Test director (ational (o/ot (ational suite Test Studio Clear Case (ational (ose H) "i>ard "ype of testin Dunctional and (egression Load, Stress and 7erformance Dunctional and (egression Dunctional and (egression &ffective test management ,=% Dunctional and test control #endor Mercury %nteractive %nc. Segue Software Segue Software Mercury %nteractive %nc. Mercury %nteractive %nc. (ational and %3M (ational and %3M

Configuration Management and change (ational and control (version %3M =nit $ Design changes and reengineering "e/ /ased applications or "e/ sites, "indows applications, or 9ava applications (ational and %3M Seapine Software

-. "est mana ement tools "*ese are some of t*e tools t*at are used to plan; mana e and track t*e testin activities; wit* t*eir correspondin vendors suc* as$ Sl.(o. "est Mana ement "ool #endor

# ' *

Test director Test Manager H) Director

Mercury (ational Compuware

"est case$ ) set of test inputs, e0ecution conditions, and e0pected results developed for a particular o/4ective, such as to e0ercise a particular program path or to verify compliance with a specific re!uirement "est plan$ ) document descri/ing the scope, approach, resources, and schedule of intended testing activities. %t identifies test items, the features to /e tested, the testing tasks, who will do each task, and any risks re!uiring contingency planning ,777 (,nstitute of 7lectrical and 7lectronics 7n ineers)
