SDLC Models Ebook
SDLC Models
Various software development approaches have been defined and designed for use during the software development process; these approaches are also referred to as "Software Development Process Models". Each process model follows a particular life cycle in order to ensure success in the process of software development.
Waterfall Model
The Waterfall approach was the first process model to be introduced and widely followed in software engineering to ensure the success of a project. In the Waterfall approach, the whole process of software development is divided into separate phases: Requirement Specifications, Software Design, Implementation, Testing, and Maintenance. These phases are cascaded so that a phase starts only when the defined set of goals for the previous phase has been achieved and signed off, hence the name "Waterfall Model". All the methods and processes undertaken in the Waterfall Model are highly visible.
System & Software Design: Before actual coding starts, it is important to understand what we are going to create and what it should look like. The requirement specifications from the first phase are studied in this phase and the system design is prepared. System design helps in specifying hardware and system requirements and in defining the overall system architecture. The system design specifications serve as input for the next phase of the model.
Implementation & Unit Testing: On receiving the system design documents, the work is divided into modules/units and actual coding starts. The system is first developed as small programs called units, which are integrated in the next phase. Each unit is developed and tested for its functionality; this is referred to as Unit Testing. Unit testing mainly verifies that the modules/units meet their specifications.
Integration & System Testing: As specified above, the system is first divided into units which are developed and tested for their functionality. These units are integrated into a complete system during the Integration phase and tested to check whether all modules/units coordinate with each other and the system as a whole behaves as per the specifications. After the software has been successfully tested, it is delivered to the customer.
Operations & Maintenance: This phase of the Waterfall Model is a virtually never-ending (very long) phase. Generally, problems with the developed system that were not found during the development life cycle come up after its practical use starts, so issues related to the system are resolved after deployment. Not all problems surface immediately; they arise from time to time and need to be solved; hence this process is referred to as Maintenance.
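The cascade of phases described above can be sketched as a strictly sequential pipeline. This is a hypothetical illustration, not part of the model itself; the phase names follow the text above:

```python
# Hypothetical sketch of the Waterfall cascade: each phase starts only
# after the previous one is signed off, and its output feeds the next.
WATERFALL_PHASES = [
    "Requirement Specifications",
    "System & Software Design",
    "Implementation & Unit Testing",
    "Integration & System Testing",
    "Operations & Maintenance",
]

def run_waterfall(execute_phase):
    """Run each phase in order; a phase receives the previous phase's output."""
    artifact = None
    for phase in WATERFALL_PHASES:
        artifact = execute_phase(phase, artifact)  # sign-off gates the next phase
    return artifact
```

The key property the sketch captures is that there is no way back: once `execute_phase` has run for a phase, the loop never revisits it, which is exactly the rigidity criticized in the Disadvantages below.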
Disadvantages
The disadvantage of waterfall development is that it does not allow for much
reflection or revision. Once an application is in the testing stage, it is very difficult to
go back and change something that was not well-thought out in the concept stage.
Alternatives to the waterfall model include joint application development (JAD), rapid
application development (RAD), synch and stabilize, build and fix, and the spiral
model.
Well, at least that's the way it's supposed to work in theory. In reality, there are a number of problems with this theoretical model, and these can cause delays and knock-on errors in the rest of the process. This section discusses some of the more common problems that project managers experience during the requirements phase, and suggests possible solutions.
Possibly the most common problem in the requirements analysis phase is that
customers have only a vague idea of what they need, and it's up to you to ask the
right questions and perform the analysis necessary to turn this amorphous vision
into a formally-documented software requirements specification that can, in turn, be
used as the basis for both a project plan and an engineering architecture.
• Ensure that you spend sufficient time at the start of the project on
understanding the objectives, deliverables and scope of the project.
• Make visible any assumptions that the customer is using, and critically
evaluate both the likely end-user benefits and risks of the project.
• Attempt to write a concrete vision statement for the project, which
encompasses both the specific functions or user benefits it provides and the
overall business problem it is expected to solve.
• Get your customer to read, think about and sign off on the completed
software requirements specification, to align expectations and ensure that
both parties have a clear understanding of the deliverable.
The second most common problem with software projects is that the requirements
defined in the first phase change as the project progresses. This may occur because
as development progresses and prototypes are developed, customers are able to
more clearly see problems with the original plan and make necessary course
corrections; it may also occur because changes in the external environment require
reshaping of the original business problem and hence necessitates a different
solution than the one originally proposed. Good project managers are aware of these
possibilities and typically already have backup plans in place to deal with these
changes.
It's quite common to hear a customer say something like "it's an emergency job and
we need this project completed in X weeks". A common mistake is to agree to such
timelines before actually performing a detailed analysis and understanding both of
the scope of the project and the resources necessary to execute it. In accepting an
unreasonable timeline without discussion, you are, in fact, doing your customer a
disservice: it's quite likely that the project will either get delayed (because it wasn't
possible to execute it in time) or suffer from quality defects (because it was rushed
through without proper inspection).
• Enter into a conversation about deadlines with your customer, using the
figures in your draft plan as supporting evidence for your statements.
Assuming that your plan is reasonable, such a conversation usually produces
a timeline acceptable to both parties.
Often, customers and engineers fail to communicate clearly with each other because
they come from different worlds and do not understand technical terms in the same
way. This can lead to confusion and severe miscommunication, and an important
task of a project manager, especially during the requirements analysis phase, is to
ensure that both parties have a precise understanding of the deliverable and the
tasks needed to achieve it.
• Take notes at every meeting and disseminate these throughout the project
team.
• Be consistent in your use of words. Make yourself a glossary of the terms that
you're going to use right at the start, ensure all stakeholders have a copy,
and stick to them consistently.
The scholars Bolman and Deal suggest that an effective manager is one who views
the organization as a "contested arena" and understands the importance of power,
conflict, negotiation and coalitions. Such a manager is not only skilled at operational
and functional tasks, but he or she also understands the importance of framing
agendas for common purposes, building coalitions that are united in their
perspective, and persuading resistant managers of the validity of a particular
position.
These skills are critical when dealing with large projects in large organizations, as
information is often fragmented and requirements analysis is hence stymied by
problems of trust, internal conflicts of interest and information inefficiencies.
• Review your existing network and identify both the information you need and
who is likely to have it.
• Cultivate allies, build relationships and think systematically about your social
capital in the organization.
• Persuade opponents within your customer's organization by framing issues in
a way that is relevant to their own experience.
• Use initial points of access/leverage to move your agenda forward.
Iterative Model
An iterative lifecycle model does not attempt to start with a full specification of
requirements. Instead, development begins by specifying and implementing just part
of the software, which can then be reviewed in order to identify further
requirements. This process is then repeated, producing a new version of the software
for each cycle of the model. Consider an iterative lifecycle model which consists of
repeating the following four phases in sequence:
- A Requirements phase, in which the requirements for the software are gathered
and analyzed. Iteration should eventually result in a requirements phase that
produces a complete and final specification of requirements.
- A Design phase, in which a software solution to meet the requirements is
designed.
- An Implementation and Test phase, when the software is coded, integrated and
tested.
- A Review phase, in which the software is evaluated, the current requirements are
reviewed, and changes and additions to requirements are proposed.
For each cycle of the model, a decision has to be made as to whether the software
produced by the cycle will be discarded, or kept as a starting point for the next cycle
(sometimes referred to as incremental prototyping). Eventually a point will be
reached where the requirements are complete and the software can be delivered, or
it becomes impossible to enhance the software as required, and a fresh start has to
be made.
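The cycle just described can be sketched as a loop that ends when the review phase proposes no further requirements. This is a hypothetical illustration; the `review` placeholder stands in for the stakeholder evaluation the text describes:

```python
# Hypothetical sketch of the iterative lifecycle: each cycle runs
# Requirements -> Implementation & Test -> Review, and the loop ends
# when the review proposes no further requirements.

def review(increment, requirements):
    # Placeholder: in a real project, stakeholders evaluate the software and
    # return newly discovered requirements (an empty list when satisfied).
    return []

def iterative_lifecycle(initial_requirements, max_cycles=10):
    requirements = list(initial_requirements)
    increments = []                      # the evolving product, one increment per cycle
    for cycle in range(1, max_cycles + 1):
        # Implementation and Test phase: build an increment covering the
        # requirements known so far.
        increment = f"increment {cycle}: {len(requirements)} requirement(s)"
        increments.append(increment)
        # Review phase: evaluate the software, propose changes/additions.
        proposed = review(increment, requirements)
        if not proposed:                 # requirements complete -> deliver
            break
        requirements.extend(proposed)    # feed the next Requirements phase
    return increments
```

The `max_cycles` bound mirrors the text's observation that iteration may have to stop with a fresh start rather than converge.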
V-Model
The V-Model is a software development model which can be considered an extension of the waterfall model. Instead of moving down in a linear way, the process steps are bent upwards after the coding phase, to form the typical V shape.
The V-Model demonstrates the relationships between each phase of the development
life cycle and its associated phase of testing.
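One common way to picture these relationships is to pair each development phase on the left leg of the V with the test phase on the right leg that verifies it. The phase names below follow the usual presentation of the model and are an assumption, since this section does not spell them out:

```python
# Hypothetical pairing of V-Model development phases (left leg of the V)
# with their associated test phases (right leg of the V).
V_MODEL_PAIRS = {
    "Requirements Analysis": "Acceptance Testing",
    "System Design": "System Testing",
    "Architecture Design": "Integration Testing",
    "Module Design": "Unit Testing",
}

for dev_phase, test_phase in V_MODEL_PAIRS.items():
    print(f"{dev_phase} is verified by {test_phase}")
```

The pairing makes the model's central claim concrete: test plans for each level can be written as soon as the corresponding development phase produces its artifact, long before coding begins.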
Spiral Model
History
The spiral model was defined by Barry Boehm in his 1988 article "A Spiral Model of Software Development and Enhancement". This model was not the first to discuss iterative development, but it was the first to explain why the iteration matters. As originally envisioned, the iterations were typically 6 months to 2 years
long. Each phase starts with a design goal and ends with the client (who may be
internal) reviewing the progress thus far. Analysis and engineering efforts are
applied at each phase of the project, with an eye toward the end goal of the project.
The spiral model, also known as the spiral lifecycle model, is a systems development
method (SDM) used in information technology (IT). This model of development
combines the features of the prototyping model and the waterfall model. The spiral
model is intended for large, expensive, and complicated projects.
1. The new system requirements are defined in as much detail as possible. This
usually involves interviewing a number of users representing all the external
or internal users and other aspects of the existing system.
2. A preliminary design is created for the new system.
3. A first prototype of the new system is constructed from the preliminary
design. This is usually a scaled-down system, and represents an
approximation of the characteristics of the final product.
4. A second prototype is evolved by a fourfold procedure: (1) evaluating the first
prototype in terms of its strengths, weaknesses, and risks; (2) defining the
requirements of the second prototype; (3) planning and designing the second
prototype; (4) constructing and testing the second prototype.
5. At the customer's option, the entire project can be aborted if the risk is
deemed too great. Risk factors might involve development cost overruns,
operating-cost miscalculation, or any other factor that could, in the
customer's judgment, result in a less-than-satisfactory final product.
6. The existing prototype is evaluated in the same manner as was the previous
prototype, and, if necessary, another prototype is developed from it according
to the fourfold procedure outlined above.
7. The preceding steps are iterated until the customer is satisfied that the
refined prototype represents the final product desired.
8. The final system is constructed, based on the refined prototype.
9. The final system is thoroughly evaluated and tested. Routine maintenance is
carried out on a continuing basis to prevent large-scale failures and to
minimize downtime.
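Steps 4 through 7 above form the model's risk-driven loop: evaluate the current prototype, abort at the customer's option if the risk is too great, otherwise plan, build, and test the next prototype. A minimal sketch, assuming a numeric risk score and a fixed cutoff (both hypothetical simplifications of the customer's judgment described in step 5):

```python
# Hypothetical sketch of the spiral model's risk-driven prototype loop.
RISK_THRESHOLD = 0.8  # assumed cutoff; a real project judges risk case by case

def spiral(first_prototype, assess_risk, customer_satisfied, evolve, max_spirals=10):
    prototype = first_prototype
    for _ in range(max_spirals):
        if assess_risk(prototype) > RISK_THRESHOLD:
            return None                   # project aborted at customer's option (step 5)
        if customer_satisfied(prototype):
            return prototype              # refined prototype -> build final system (step 8)
        prototype = evolve(prototype)     # plan, design, construct, test next prototype (step 4)
    return prototype
```

For example, with callables that approve the third prototype, `spiral` returns it after two evolutions; with an assessed risk above the cutoff, it returns `None` immediately, mirroring the abort option.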
Applications
For a typical shrink-wrap application, the spiral model might mean that you have a
rough-cut of user elements (without the polished / pretty graphics) as an operable
application, add features in phases, and, at some point, add the final graphics. The
spiral model is used most often in large projects. For smaller projects, the concept of
agile software development is becoming a viable alternative. The US military has
adopted the spiral model for its Future Combat Systems program.
RAD Model
Contents:
What is RAD?
Development Methodology
RAD Model Phases
Advantages and Disadvantages of RAD
Prototype Model
In many instances the client only has a general view of what is expected from the software product. In such a scenario, where there is an absence of detailed information regarding the input to the system, the processing needs, and the output requirements, the prototyping model may be employed. This model reflects an attempt to increase the flexibility of the development process by allowing the client to interact and experiment with a working representation of the product. The development process only continues once the client is satisfied with the functioning of the prototype. At that stage the developer determines the specifications of the client's real needs.
Contents:
Software Prototyping
Overview
Versions
Types of Prototyping
Advantages of Prototyping
Disadvantages of Prototyping
Best projects to use Prototyping
Methods
Tools
http://www.OneStopTesting.com