Unit 5: Software Effort Estimation - Slides
A successful project is one delivered ‘on time, within budget and with the
required quality’. Normally these targets are set and reasonable, and the
project manager then tries to meet them.
Even if a team is highly productive, there are limits to what can be achieved
within certain constraints, such as deadlines based on incorrect initial estimates.
Realistic estimates are therefore crucial.
Some of the difficulties of estimating arise from the complexity and invisibility
of software. Moreover, the intensely human activities that make up system
development cannot be treated in a purely mechanistic way.
Under-estimating tasks can compromise quality as inexperienced staff may rush to meet
deadlines, resulting in substandard work.
This recalls Weinberg's zeroth law of reliability: if a system does not have to be
reliable, it can meet any other objective. The impact of substandard work
often surfaces during testing, leading to difficult control problems and potential
project delays due to extensive rework.
An estimate is not really a prediction; it is a management goal. Boehm suggests that if the
actual cost is within 20% of the estimated cost, a skilled manager can influence the
outcome to align with the estimate, turning it into a self-fulfilling prophecy.
Unit 5: Software Effort Estimation
1. Historical data
2. Parameters to be estimated
3. Measure of work
Limitations of SLOC:
No precise definition - researchers have not been consistent on points such as whether
comment lines or data declarations should be included (see the sketch after this list).
Difficult to estimate at the start of a project - SLOC can be accurately computed only
after development of the software is complete. The count can only be guessed at the
beginning of a project, often leading to grossly inaccurate estimates.
Only a code measure - effort is required for carrying out all the life cycle activities, not just
coding.
Programmer-dependent - problem size can vary widely with the coding style of individual
programmers. This aspect alone renders any LOC-based size and effort estimations inaccurate.
Does not consider code complexity - two software components with the same SLOC will not
necessarily take the same time to write, even by the same programmer in the same
environment, because one component might be more complex. Attempts have been made to
find objective measures of complexity.
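A minimal sketch in Python (illustrative, not from the slides) of the "no precise definition" problem: the same source file yields different SLOC counts depending on whether comment-only and blank lines are included.

def count_sloc(source: str, include_comments: bool, include_blanks: bool) -> int:
    """Count 'lines of code' under a configurable counting rule."""
    count = 0
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped:                      # blank line
            if include_blanks:
                count += 1
        elif stripped.startswith("#"):        # comment-only line
            if include_comments:
                count += 1
        else:
            count += 1                        # executable or declaration line
    return count

sample = """# compute total
total = 0

for x in data:  # loop over items
    total += x
"""

print(count_sloc(sample, include_comments=True, include_blanks=True))    # 5
print(count_sloc(sample, include_comments=False, include_blanks=False))  # 3

The same file measures 5 or 3 SLOC depending on the counting rule, which is why inconsistent definitions undermine comparisons between projects.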
Various techniques are used for software effort estimation, and they can be broadly
categorized into two types: Expert Judgment Techniques and Algorithmic Techniques.
1. Expert Judgment Techniques:
a. Delphi Technique: Involves a group of experts who anonymously provide their
estimates on a particular task. A facilitator collects, summarizes, and redistributes the
estimates until a consensus is reached.
b. Expert Opinion: Involves seeking opinions and judgments from domain experts,
project managers, and experienced team members. This method is subjective and relies
on the expertise of individuals.
c. Analogy-Based Estimation: Estimates are derived by comparing the current project
with similar past projects. Requires a historical database of completed projects for
reference.
Note: Parkinson" method and "price to win" are not true effort or price prediction methods but rather
tools for defining project scope and determining a competitive pricing strategy. While they may not be
effective for prediction, they offer value as management techniques, as acknowledged by Boehm.
Unit 5: Software Effort Estimation
Software Effort Estimation Techniques:
Bottom-up Estimating - Bottom-up estimating is an approach to software effort
estimation where the project is broken down into smaller, more manageable
components, and estimates are made for each individual component. These estimates
are then aggregated to arrive at an overall estimate for the entire project.
With a large project, the process of breaking it down into tasks is iterative: each task
is decomposed into its component subtasks and these in turn could be further
analysed. It is suggested that this is repeated until you get tasks an individual could do
in a week or two.
It is really a separate process of producing a Work Breakdown Structure (WBS).
Break down the project into smaller, more manageable tasks or work packages.
Create a hierarchical structure known as the Work Breakdown Structure (WBS),
which organizes the tasks in a logical order.
The bottom-up approach in project planning is most effective during later stages
when detailed information is available. Using it earlier requires making assumptions
about the final system and project methods, which may lead to inaccuracies.
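A minimal sketch in Python (assumed task names and estimates, not from the slides) of bottom-up estimating: effort figures are attached to the leaf tasks of a WBS and rolled up to give the overall project estimate.

wbs = {
    "Design": {"UI design": 10, "Database design": 8},   # person-days
    "Build":  {"Coding": 40, "Unit testing": 15},
    "Deploy": {"Installation": 5, "User training": 7},
}

def roll_up(node):
    """Sum leaf estimates; a leaf is a number, a branch is a dict."""
    if isinstance(node, dict):
        return sum(roll_up(child) for child in node.values())
    return node

print(roll_up(wbs))  # 85 person-days for the whole project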
productivity = size/effort (for example, lines of code produced per person-month)
A more sophisticated way of doing this would be to use the statistical technique of least
squares regression to derive an equation of the form:
effort = constant1 + (size x constant2)
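A minimal sketch in Python (illustrative historical data, not from the slides) of deriving the two constants by least squares regression over past project records:

import numpy as np

size   = np.array([10, 25, 40, 55, 80])    # e.g. KLOC of past projects
effort = np.array([24, 53, 95, 130, 180])  # person-months actually spent

# Fit effort = constant1 + constant2 * size (degree-1 polynomial)
constant2, constant1 = np.polyfit(size, effort, 1)
print(f"effort = {constant1:.1f} + {constant2:.2f} * size")

# Predict effort for a new 60 KLOC project
print(constant1 + constant2 * 60)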
Some parametric models, such as that implied by function points, are focused on system
or task size, while others, such as COCOMO, are more concerned with productivity
factors.
Unit 5: Software Effort Estimation
Software Effort Estimation Techniques:
The Top-down Approach and Parametric Models -
The first step is to calculate the total effort required for a project. Once the
overall effort is determined, the challenge is to distribute or allocate portions
of that effort to different activities within the project.
The Euclidean distance between the source and the target is therefore the
square root of ((7 – 8)² + (15 – 17)²), that is 2.24.
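A minimal sketch in Python of the distance calculation above, as used in analogy-based estimation to match a new project against past projects by their effort-driver values (the two parameters per project are taken from the slide's example):

import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

source = (7, 15)   # parameter values of the new project
target = (8, 17)   # a past project from the historical database
print(round(euclidean(source, target), 2))  # 2.24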
3. External inquiry types are transactions initiated by the user which provide information
but do not update the internal files. The user inputs some information that directs the
system to the details required. Examples: Online queries, information retrieval requests.
4. Logical internal file types are the standing files used by the system. These are files
maintained within the boundary of the system that store data used by the application.
Examples: Databases (relational tables or entity types), data stores, or data repositories.
5. External interface file types allow for output and input that may pass to and from other
computer applications. Examples of this would be the transmission of accounting data
from an order processing system to the main ledger system or the production of a file of
direct debit details on a magnetic or electronic medium to be passed to the Bankers
Automated Clearing System (BACS). Files shared between applications would also be
counted here.
Unit 5: Software Effort Estimation
Software Effort Estimation Techniques:
Albrecht Function Point Analysis -
The analyst identifies each instance of each external user type in the application. Each
component is then classified as having either high, average or low complexity. The counts
of each external user type in each complexity band are multiplied by specified weights
(see Table) to get FP scores which are summed to obtain an overall FP count which
indicates the information processing size.
Summing the weighted counts of the five major components (or ‘external user types’) gives the
Unadjusted Function Point (UFP) count.
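A minimal sketch in Python of the UFP calculation. The weights below are the commonly published Albrecht/IFPUG values; the slides refer to a weights table that is not reproduced here, so treat them as an assumption.

WEIGHTS = {  # (low, average, high)
    "external_input":     (3, 4, 6),
    "external_output":    (4, 5, 7),
    "external_inquiry":   (3, 4, 6),
    "internal_file":      (7, 10, 15),
    "external_interface": (5, 7, 10),
}
LEVEL = {"low": 0, "average": 1, "high": 2}

def ufp(counts):
    """counts: {(external user type, complexity): number of instances}"""
    return sum(n * WEIGHTS[t][LEVEL[c]] for (t, c), n in counts.items())

print(ufp({
    ("external_input", "low"): 5,
    ("external_output", "average"): 3,
    ("internal_file", "high"): 2,
}))  # 5*3 + 3*5 + 2*15 = 60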
Unit 5: Software Effort Estimation
Software Effort Estimation Techniques:
Albrecht Function Point Analysis -
Function point analysis recognizes that the effort required to implement a computer-based
information system relates not just to the number and complexity of the features provided
but also to the operational environment of the system.
Fourteen factors have been identified which can influence the degree of difficulty
associated with implementing a system. The list that Albrecht produced related
particularly to the concerns of information system developers in the late 1970s and early
1980s. Some technology which was then new and relatively threatening is now well
established.
The technical complexity adjustment (TCA) calculation has had many problems.
After obtaining the Unadjusted Function Point count, it serves as a basis for further
adjustments to account for various complexity factors. The adjustments involve assigning
weights to different aspects of the software based on factors such as data
communications, distributed data processing, performance, and more.
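A minimal sketch in Python of this adjustment in its commonly published form (an assumption here, since the slides do not reproduce the formula): each of the fourteen factors is rated 0 (no influence) to 5 (strong influence), and the adjustment scales the UFP count by between 0.65 and 1.35.

def adjusted_fp(ufp, influence_ratings):
    assert len(influence_ratings) == 14
    assert all(0 <= r <= 5 for r in influence_ratings)
    tca = 0.65 + 0.01 * sum(influence_ratings)  # 0.65 to 1.35
    return ufp * tca

ratings = [3, 2, 4, 1, 0, 5, 3, 2, 2, 1, 3, 4, 0, 2]  # illustrative only
print(adjusted_fp(60, ratings))  # 60 * (0.65 + 0.32) = 58.2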
The Mark II method, sponsored by the CCTA (now the OGC), is a UK government project
standard. It was put forward as an improvement on, and replacement for, the Albrecht
(now IFPUG) method, hence the label 'Mark II'. Despite its refinements to the Albrecht
method, FPA Mark II remains a minority method, employed primarily in the United Kingdom.
Unit 5: Software Effort Estimation
Software Effort Estimation Techniques:
Function Points Mark II -
As with Albrecht, the information processing size is initially measured in unadjusted
function points (UFPs) to which a Technical Complexity Adjustment(TCA) can then be
applied. The assumption is that an information system comprises transactions which have
the basic structure shown in Figure 5.2.
Wi, We, and Wo are weightings derived by asking developers the proportions of effort
spent in previous projects developing the code dealing respectively with inputs, accessing
and modifying stored data and processing outputs.
The proportions of effort are then normalized into ratios, or weightings, which add up to
2.5. This process for calculating weightings is time consuming and most FP counters use
industry averages, which are currently 0.58 for Wi, 1.66 for We and 0.26 for Wo.
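A minimal sketch in Python of the Mark II UFP calculation using the industry-average weightings quoted above; the per-transaction counts are illustrative assumptions, not from the slides.

W_I, W_E, W_O = 0.58, 1.66, 0.26  # input, entity-access, output weightings

transactions = [
    # (input data element types, entity types referenced,
    #  output data element types) for each transaction
    (4, 2, 6),
    (2, 1, 3),
    (10, 4, 12),
]

ufp = sum(W_I * ni + W_E * ne + W_O * no for ni, ne, no in transactions)
print(round(ufp, 2))  # 0.58*16 + 1.66*7 + 0.26*21 = 26.36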
Project cost can be obtained by multiplying the estimated effort (in person-months,
from the effort estimate) by the manpower cost per month. Implicit in this
computation is the assumption that the entire project cost is incurred on account
of the manpower cost alone.
However, in addition to manpower cost, a project would incur several other types
of costs which we shall refer to as the overhead costs. The overhead costs would
include the costs of hardware and software required for the project and the
company overheads for administration, office space, etc.
Depending on the expected values of the overhead costs, the project manager has
to suitably scale up the cost estimated by using the COCOMO formula.
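A minimal sketch in Python (illustrative figures, assumed currency units) of scaling an effort estimate into a project cost, manpower plus overheads, as described above:

effort_pm = 24       # estimated effort in person-months
rate_per_pm = 8000   # manpower cost per person-month
overheads = 30000    # hardware, software, office space, administration, ...

manpower_cost = effort_pm * rate_per_pm
total_cost = manpower_cost + overheads
print(manpower_cost, total_cost)  # 192000 222000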
Unit 5: Software Effort Estimation
Staffing Pattern:
After the effort required to complete a software project has been estimated, the
staffing requirement for the project can be determined.
Putnam was the first to study the problem of what should be a proper staffing
pattern for software projects.
He extended the classical work of Norden who had earlier investigated the
staffing pattern of general research and development (R&D) type of projects.
In order to appreciate the staffing pattern desirable for software projects, we must
understand both Norden’s and Putnam’s results.
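A minimal sketch in Python of Norden's Rayleigh staffing curve, which Putnam later applied to software projects. The form below is the commonly published one and is an assumption here, since the slides do not reproduce the equation: E(t) = (K/td²) · t · exp(−t²/(2·td²)), where K is the total effort and td the time at which staffing peaks.

import math

def rayleigh_staffing(t, K, td):
    """Staffing level at time t for total effort K peaking at time td."""
    return (K / td**2) * t * math.exp(-t**2 / (2 * td**2))

K, td = 100.0, 12.0  # 100 person-months total, staffing peaks at month 12
for month in (3, 6, 12, 24):
    print(month, round(rayleigh_staffing(month, K, td), 2))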
Unit 5: Software Effort Estimation
Effect of Schedule Compression:
It is quite common for a project manager to encounter client requests to deliver products
faster, that is, to compress the delivery schedule. It is therefore important to understand
the impact of schedule compression on project cost. Putnam studied the effect of
schedule compression on the development effort and expressed it in the form of the
following equation:
pm_new = pm_org x (td_org / td_new)^4
where pm_new is the new effort, pm_org is the originally estimated effort, td_org is the
originally estimated time for project completion and td_new is the compressed schedule.
From this expression, it can easily be observed that when the schedule of a project is
compressed, the required effort increases in proportion to the fourth power of the degree of
compression. This means that a relatively small compression of the delivery schedule can result
in a substantial penalty in human effort. For example, if the estimated development time using
the COCOMO formula is one year, then in order to develop the product in six months, the total
effort required (and hence the project cost) increases 16 times.
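A minimal sketch in Python of the compression relation above, reproducing the slide's 12-month to 6-month example (the 100 person-month starting effort is illustrative):

def compressed_effort(pm_org, td_org, td_new):
    """New effort after compressing the schedule from td_org to td_new."""
    return pm_org * (td_org / td_new) ** 4

print(compressed_effort(100, 12, 6))  # 1600.0 -> a 16-fold increase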
Unit 5: Software Effort Estimation
Effect of Schedule Compression:
Boehm arrived at the result that there is a limit beyond which a software project cannot
reduce its schedule by buying any more personnel or equipment. This limit occurs
roughly at 75% of the nominal time estimate for small and medium sized projects.
Thus, if a project manager accepts a customer demand to compress the development
schedule of a typical (small or medium) project by more than 25%, he is very
unlikely to succeed. The main reason is that every project has only a limited number
of activities that can be carried out in parallel, and the sequential activities cannot be
speeded up by hiring any number of additional developers.