UNIT – III
Software Project Estimation & COCOMO and Risk Management
• The advantage of expert judgement is that it is quick and, if the expert is knowledgeable, it is often the most accurate estimate available for uncertain activities. The disadvantages are that an expert may not be available and that, even when one is, the expert often can provide no solid rationale for the estimate beyond, “That’s what I think it will take to do this”.
Decomposition Techniques
• The decomposition technique is most frequently used when estimating projects. Decomposition techniques use the divide-and-conquer approach: size, effort, and cost are estimated in a stepwise manner by breaking a project down into its main functions or related software engineering activities.
• These techniques are broadly classified as recursive decomposition, data decomposition, exploratory decomposition, and speculative decomposition. The recursive and data decomposition techniques are relatively general purpose, as they can be used to decompose a wide variety of problems.
Size-Oriented Metrics
• Size-oriented software metrics are derived by normalizing quality and/or productivity measures by the size of the software that has been produced.
• The organization builds a simple record of size measures for its software projects, based on past experience. This is one of the simplest and earliest metrics used to measure the size of computer programs, and it is a direct measure of the software. Size-oriented metrics are also used for measuring and comparing the productivity of programmers. The size measurement is based on lines of code: a line of code is defined as one line of text in a source file. While counting lines of code, the simplest standard is:
Don’t count blank lines
Don’t count comments
Count everything else
The size-oriented measure is not a universally accepted method. A simple set of size measures that can be developed is given below:
• Size = Kilo Lines of Code (KLOC)
• Quality = Number of faults / KLOC
• Effort = Person-months
• Cost = $ / KLOC
• Productivity = KLOC / person-month
• Documentation = Pages of documentation / KLOC
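To make the formulas above concrete, here is a minimal Python sketch of the size-oriented computations. All project figures in it (LOC, effort, faults, cost, documentation pages) are hypothetical values chosen only to exercise the formulas.

# Size-oriented metrics for a hypothetical project.
# All input figures below are illustrative, not real project data.
loc = 42_000            # non-blank, non-comment lines of code
effort_pm = 24          # total effort in person-months
faults = 105            # faults found
cost_dollars = 180_000  # total project cost
doc_pages = 900         # pages of documentation produced

kloc = loc / 1000                    # Size = Kilo Lines of Code
quality = faults / kloc              # faults per KLOC
productivity = kloc / effort_pm      # KLOC per person-month
cost_per_kloc = cost_dollars / kloc  # $ per KLOC
documentation = doc_pages / kloc     # documentation pages per KLOC

print(f"Size          = {kloc:.1f} KLOC")
print(f"Quality       = {quality:.2f} faults/KLOC")
print(f"Productivity  = {productivity:.2f} KLOC/person-month")
print(f"Cost          = ${cost_per_kloc:.2f}/KLOC")
print(f"Documentation = {documentation:.2f} pages/KLOC")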
Function Point
• Function-oriented software metrics use a measure of the functionality delivered by the application as a normalization
value. Since ‘functionality’ cannot be measured directly, it must be derived indirectly using other direct measures.
• Function-oriented metrics were first proposed by Albrecht, who suggested a measure called the function point.
• Function points are derived using an empirical relationship based on countable (direct) measures of software’s information
domain and assessments of software complexity.
• The FP count of an application is found by counting the number and types of functions used in the application. The various functions used in an application can be put into five types, as shown in the table:
Types of FP Attributes
Measurement Parameter               | Weight (Simple / Average / Complex)
Number of External Inputs (EI)      | 3 / 4 / 6
Number of External Outputs (EO)     | 4 / 5 / 7
Number of External Inquiries (EQ)   | 3 / 4 / 6
Number of Internal Files (ILF)      | 7 / 10 / 15
Number of External Interfaces (EIF) | 5 / 7 / 10
The function point count is then computed as
FP = Count Total * CAF, where CAF = 0.65 + 0.01 * ∑(fi)
Here, Count Total is the weighted sum over the five function types, and ∑(fi) is the sum of all 14 questionnaire ratings, which gives the complexity adjustment value/factor CAF (where i ranges from 1 to 14). Usually, a student is provided with the value of ∑(fi).
Also note that ∑(fi) ranges from 0 to 70, i.e.,
0 <= ∑(fi) <= 70
and CAF ranges from 0.65 to 1.35 because
When ∑(fi) = 0 then CAF = 0.65
When ∑(fi) = 70 then CAF = 0.65 + (0.01 * 70) = 0.65 + 0.7 = 1.35
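As a quick worked example (with an assumed questionnaire total): if each of the 14 GSCs is rated at an average influence of 3, then ∑(fi) = 14 * 3 = 42, and CAF = 0.65 + (0.01 * 42) = 1.07.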
Based on the FP measure of software, many other metrics can be computed:
• Errors/FP
• $/FP
• Defects/FP
• Pages of documentation/FP
• Errors/PM
• Productivity = FP/PM (effort is measured in person-months)
• $/Page of documentation
LOCs of an application can be estimated from FPs; that is, they are interconvertible. This process is known as backfiring. For example, 1 FP is equal to about 100 lines of COBOL code.
FP metrics are used mostly for measuring the size of Management Information System (MIS) software.
But the function points obtained above are unadjusted function points (UFPs). The UFPs of a subsystem are further adjusted by considering a set of 14 General System Characteristics (GSCs). The procedure for adjusting UFPs is as follows:
• The Degree of Influence (DI) of each of the 14 GSCs is assessed on a scale of 0 to 5: if a particular GSC has no influence, its weight is taken as 0, and if it has a strong influence, its weight is 5.
• The score of all 14 GSCs is totaled to determine Total Degree of Influence (TDI).
• Then Value Adjustment Factor (VAF) is computed from TDI by using the formula: VAF = (TDI * 0.01) + 0.65
Remember that the value of VAF lies within 0.65 to 1.35 because
When TDI = 0, VAF = 0.65
When TDI = 70, VAF = 1.35
VAF is then multiplied with the UFP to get the final FP count: FP = VAF * UFP
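A minimal Python sketch of the full counting procedure is given below. It assumes the standard Albrecht weights for average-complexity functions; the function counts and GSC ratings are made up purely for illustration.

# Function point computation: UFP -> VAF -> FP.
# Weights: standard Albrecht values for average-complexity functions.
# Counts and GSC ratings are hypothetical.
weights = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_logical_files": 10,
    "external_interface_files": 7,
}
counts = {
    "external_inputs": 24,
    "external_outputs": 16,
    "external_inquiries": 22,
    "internal_logical_files": 4,
    "external_interface_files": 2,
}

# Unadjusted function points: weighted sum over the five function types.
ufp = sum(counts[t] * weights[t] for t in weights)

# Degree of Influence (0..5) for each of the 14 GSCs (illustrative ratings).
gsc_ratings = [3, 2, 4, 3, 5, 1, 0, 2, 3, 4, 2, 3, 1, 2]
tdi = sum(gsc_ratings)       # Total Degree of Influence (0..70)
vaf = (tdi * 0.01) + 0.65    # Value Adjustment Factor (0.65..1.35)

fp = ufp * vaf               # final (adjusted) function point count
print(f"UFP = {ufp}, TDI = {tdi}, VAF = {vaf:.2f}, FP = {fp:.1f}")

# Backfiring: approximate LOC from FP (about 100 LOC per FP for COBOL).
print(f"Estimated COBOL LOC: {fp * 100:.0f}")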
Differentiate between FP and LOC
FP                             | LOC
1. FP is specification based.  | 1. LOC is analogy based.
2. FP is language independent. | 2. LOC is language dependent.
3. FP is user-oriented.        | 3. LOC is design-oriented.
4. It is extendible to LOC.    | 4. It is convertible to FP (backfiring).
Object Point
• Object points are a way of estimating effort size, similar to Source Lines of Code (SLOC) or Function Points.
• They are not necessarily related to objects in object-oriented programming; the objects referred to include screens, reports, and modules of the language. The number of raw objects and the complexity of each are estimated, and a weighted total object-point count is then computed and used as the basis for effort estimates.
• Object points are an approach used in software development effort estimation under models such as COCOMO II.
Assess object count: estimate the number of screens, reports, and 3GL components.
Classify objects as simple, medium, or difficult, depending on the values of their characteristic dimensions.
Weight the number in each cell using the following scheme. The weights reflect the relative effort required to implement an instance of that complexity level, as given in the table.
Determine object points: add all the weighted object instances to get one number, the object-point count.
Estimate the percentage of reuse you expect to be achieved in this project.
Compute the new object points to be developed as
NOP = (Object Points) * (100 - %reuse) / 100
where %reuse is the percentage of screens, reports, and 3GL modules reused from previous applications.
Determine a productivity rate (PROD, in NOP per person-month) depending on the developers’ experience and ICASE maturity, as given in the productivity table. The estimated effort is then NOP / PROD; a sketch of the whole computation is given below.
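The steps above can be summarized in the short Python sketch below. The complexity weights and the productivity rate are the values commonly quoted for the COCOMO II Application Composition model; the object counts and the reuse percentage are hypothetical.

# Object-point estimation (COCOMO II Application Composition style).
# Weights per object instance: (simple, medium, difficult).
weights = {
    "screen": (1, 2, 3),
    "report": (2, 5, 8),
    "3gl_component": (10, 10, 10),  # 3GL components carry a single weight
}
# Hypothetical object counts as (simple, medium, difficult) tuples.
counts = {
    "screen": (8, 4, 2),
    "report": (3, 2, 1),
    "3gl_component": (0, 0, 2),
}

# Object-point count: weighted sum over all object instances.
object_points = sum(
    n * w
    for obj in weights
    for n, w in zip(counts[obj], weights[obj])
)

reuse_pct = 20  # expected % reuse of screens, reports, and 3GL modules
nop = object_points * (100 - reuse_pct) / 100  # New Object Points

prod = 13               # productivity in NOP per person-month (nominal)
effort_pm = nop / prod  # estimated effort in person-months
print(f"Object points = {object_points}, NOP = {nop:.1f}, "
      f"effort = {effort_pm:.1f} person-months")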
COCOMO-81
COCOMO: The COCOMO (Constructive Cost Model) was proposed by Dr. Barry Boehm in 1981, which is why it is also known as COCOMO'81. It is a method for estimating the cost of a software project. According to Boehm, software cost estimation should be done through three models:
• Basic COCOMO Model
• Intermediate COCOMO Model
• Complete/Detailed COCOMO Model
COCOMO'81 models depend upon two main equations:
Development Effort: MM = a * (KDSI)^b
where MM (man-month / person-month / staff-month) is one month of effort by one person.
Note: In COCOMO'81, there are 152 hours per person-month. Depending on the organization, this value may differ from the standard by 10% to 20%.
Development Time: TDEV = 2.5 * (MM)^c
Note: The coefficients a, b, and c depend on the mode of development.
DEVELOPMENT MODES (with the standard Basic COCOMO coefficient values):
Mode         | a   | b    | c    | Typical project characteristics
Organic      | 2.4 | 1.05 | 0.38 | Small teams, familiar in-house environment, flexible requirements
Semidetached | 3.0 | 1.12 | 0.35 | Intermediate size and complexity, mixed team experience
Embedded     | 3.6 | 1.20 | 0.32 | Tight hardware, software, and operational constraints
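A minimal Python sketch of the Basic COCOMO'81 equations is given below, using the standard coefficient values for the three development modes; the 32 KDSI project size is an assumed figure.

# Basic COCOMO'81: effort MM = a * KDSI**b, schedule TDEV = 2.5 * MM**c.
# Coefficients (a, b, c) are the standard Basic COCOMO values per mode.
MODES = {
    "organic":      (2.4, 1.05, 0.38),
    "semidetached": (3.0, 1.12, 0.35),
    "embedded":     (3.6, 1.20, 0.32),
}

def basic_cocomo(kdsi: float, mode: str) -> tuple[float, float]:
    """Return (effort in person-months, development time in months)."""
    a, b, c = MODES[mode]
    mm = a * kdsi ** b    # development effort, person-months
    tdev = 2.5 * mm ** c  # development schedule, months
    return mm, tdev

# Example: a hypothetical 32 KDSI project in each development mode.
for mode in MODES:
    mm, tdev = basic_cocomo(32, mode)
    print(f"{mode:>12}: MM = {mm:6.1f} person-months, TDEV = {tdev:4.1f} months")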
Advantages of COCOMO’81
• COCOMO is transparent; you can see how it works unlike other models such as SLIM.
• Its cost drivers are particularly helpful to the estimator in understanding the impact of the different factors that affect project costs.
Drawbacks of COCOMO’81
• It is hard to estimate KDSI (thousand delivered source instructions) accurately early in the project, when most effort estimates are required.
• KDSI is actually not a size measure; it is a length measure.
• It is extremely vulnerable to misclassification of the development mode.
• Success depends largely on tuning the model to the needs of the organization, using historical data which is not always
available.
COCOMO-II
COCOMO-II is the revised version of the original COCOMO (Constructive Cost Model) and was developed at the University of Southern California. It is a model that allows one to estimate the cost, effort, and schedule when planning a new software development activity.
COCOMO 1 and COCOMO 2 are two cost estimation models developed by Barry Boehm and used for computing the cost of software development. The most basic difference between them is that COCOMO 1 provides estimates of the required effort and schedule, whereas COCOMO 2 provides estimates that represent one standard deviation around the most likely estimate.
COCOMO II has three different models:
The Application Composition Model: Suitable for projects built with modern GUI-builder tools. Based on new Object Points.
The Early Design Model: Used to get rough estimates of a project's cost and duration before its entire architecture has been determined. It uses a small set of new cost drivers and new estimating equations.
The Post-Architecture Model: This is the most detailed COCOMO II model. You'll use it after you've developed your project's overall architecture. It has new cost drivers, new line counting rules, and new equations.
COCOMO II can be used for the following major decision situations:
• Making investment or other financial decisions involving a software development effort
• Setting project budgets and schedules as a basis for planning and control
• Deciding on or negotiating tradeoffs among software cost, schedule, and functionality

S.No | COCOMO 1                       | COCOMO 2
7.   | The number of sub-models is 3. | The number of sub-models required is 4.

Note: Please refer to the book for an example of COCOMO-II.
Risk Management
• Risk management is the identification, evaluation, and prioritization of risks followed by coordinated and economical application of
resources to minimize, monitor, and control the probability or impact of unfortunate events or to maximize the realization of
opportunities.
• The risk management practice involves risk identification, analysis, prioritization, planning, mitigation, monitoring, and communication.
• To reduce risk, an organization needs to apply resources to minimize, monitor, and control the impact of negative events while maximizing positive events. A consistent, systematic, and integrated approach to risk management can help determine how best to identify, manage, and mitigate significant risks.
Risk and Risk Event
• Event risk refers to any unforeseen or unexpected occurrence that can cause losses for investors or other stakeholders in a
company or investment.
• A risk is the possibility that an undesirable event (called the risk event) could happen. Risks involve both uncertainty and loss. Proactive risk management is the process of trying to minimize the possible bad effects of risk events happening.
Risk Management and Software Project Management
• Risk management in software engineering is concerned with the various future harms that could affect the software due to minor or unnoticed mistakes in the software development project or process.
• Risk management is the most important issue involved in software project development. This issue is generally managed by Software Project Management (SPM). During the life cycle of software projects, various risks are associated with them. These risks are identified and managed by software risk management, which is a part of SPM.
• Some of the important aspects of risk management in software engineering are software risk management, risk classification, and strategies for risk management.
Purpose of Risk Management
• Anticipate and identify risks
• Minimize the impact/damage/loss