
Project Manager Interview Questions

This document discusses techniques for estimating software projects. It describes several common techniques, including lines of code, COCOMO II, IFPUG function point analysis, and related sizing methods. Function point analysis measures size based on the number of inputs, outputs, inquiries, internal logical files, and external interface files. Unadjusted function points are adjusted by general system characteristics and combined with historical productivity data to estimate effort, schedule, and cost. Function point analysis provides a standardized way to measure size that is independent of programming language or tools.

How do you handle non-productive team members?

Low productivity is typically caused by one or more of these factors:

- Lack of skills

- Not knowing what needs to be done, or how it needs to be done

- Lack of motivation

- Boredom

- Fear

- Time management issues

- Lack of commitment to the job

- Health issues

- Personal tensions in life

- Professional pressures, such as company politics

All of these can be addressed with a patient and supportive attitude toward the employee. After all, we are
human, and we respond to someone who has our best interests in mind.

Remedial measures include:

- Training to improve the skills required for the job

- Counselling to resolve personal and professional issues

- Encouragement to boost morale, where low self-esteem is the cause

- Firmness with those who have taken the job for granted and become careless, to refocus them on its
seriousness

All in all, these issues can be resolved once the underlying cause is identified.
Software Estimation Techniques
There are many models for software estimation available and prevalent in the industry. Researchers have been
working on formal estimation techniques since the 1960s. Early work was typically based on regression analysis or
on mathematical models borrowed from other domains; work during the 1970s and 1980s derived models from
historical data of various software projects. Among the many estimation models, expert estimation, COCOMO,
Function Point Analysis, and derivatives of function points such as Use Case Points and Object Points are the most
commonly used. While Lines of Code (LOC) is the most common size measure for 3GL programming and estimation
of procedural languages, IFPUG FPA, originally invented by Allan Albrecht at IBM, has been adopted by much of the
industry as an alternative to LOC for sizing the development and enhancement of business applications. FPA provides
a measure of functionality based on the end user's view of the application software. Some of the commonly used
estimation techniques are as follows:

Lines of Code (LOC): A formal method to measure size by counting the number of lines of code. Source
Lines of Code (SLOC) has two variants: physical SLOC and logical SLOC. Because the two measures can
vary significantly, care must be taken when comparing results from different projects, and clear counting
guidelines must be laid out for the organization.
IFPUG FPA: A formal method to measure the size of business applications. It introduces a complexity
factor for size, defined as a function of inputs, outputs, queries, external interface files, and internal
logical files.
Mark II FPA: Proposed and developed by Charles Symons; useful for measuring the size of functionality
in real-time systems where transactions have embedded data.
COSMIC Full Function Point (FFP): Proposed in 1999 and compliant with ISO 14143. Applicable to
business applications with data-rich processing, where complexity is determined by the capability to
handle large chunks of data, and to real-time applications, where functionality is expressed in terms of
logic and algorithms.
Quick Function Point (QFP): Derived from FPA and based on expert judgment. Mostly useful for arriving
at a ballpark estimate for budgetary and marketing purposes, or where a go/no-go decision is required
during the project selection process.
Object Points: Best suited to estimating customizations. Based on a count of raw objects, the complexity
of each object, and weighted points.
COCOMO 2.0: Based on COCOMO 81, which was developed by Barry Boehm. The model is motivated by
software reuse, application generators, economies or diseconomies of scale, and process maturity, and
helps estimate effort for sizes calculated in terms of SLOC, FPA, Mark II FP, or any other method.
Predictive Object Points: Tuned toward estimating object-oriented software projects. Calculated from
weighted methods per class, the count of top-level classes, the average number of children, and the
depth of the inheritance tree.
Estimation by Analogy: The cost of a project is computed by comparing it to a similar, completed project
in the same domain. The estimate is accurate only if comparable project data is available.
The estimation methods mentioned above use various factors that affect productivity or size, based on system
characteristics. COCOMO I uses 15 productivity factors, while COCOMO II uses 23; IFPUG FPA uses 14 General
System Characteristics to arrive at the adjusted function point count. Some of these methods are tuned to early
estimation during the proposal, project selection, or budgeting phase, while others are fairly detailed. The choice of
estimation approach depends on the availability of historical data, trained estimators, and tools, as well as on
schedule and cost constraints. Each estimation technique has its own advantages and disadvantages; ultimately, the
selection of a particular approach is driven by the goal of the estimation.
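As a concrete illustration of model-based estimation, the sketch below applies the Basic COCOMO 81 equations in organic mode (a simpler predecessor of the COCOMO II model mentioned above). The coefficients 2.4, 1.05, 2.5, and 0.38 are the published Basic COCOMO organic-mode constants; the 32 KLOC input is a hypothetical project size chosen for illustration.

```python
# Basic COCOMO 81 (organic mode):
#   Effort (person-months) = a * KLOC^b
#   Duration (months)      = c * Effort^d
A, B = 2.4, 1.05   # organic-mode effort coefficients
C, D = 2.5, 0.38   # organic-mode schedule coefficients

def cocomo_basic_organic(kloc: float) -> tuple:
    """Return (effort in person-months, duration in months) for a given size."""
    effort = A * kloc ** B
    duration = C * effort ** D
    return effort, duration

effort, duration = cocomo_basic_organic(32.0)  # hypothetical 32 KLOC project
print(f"Effort:       {effort:.1f} person-months")
print(f"Schedule:     {duration:.1f} months")
print(f"Avg staffing: {effort / duration:.1f} people")
```

For a 32 KLOC project this yields roughly 91 person-months over about 14 months; the full COCOMO II model would further scale these figures by its productivity factors.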

Function Point Analysis


Function Point Analysis was invented by Allan Albrecht at IBM in 1979. In 1986 the International Function Point Users
Group (IFPUG) was created, and in 1990 IFPUG released the first version of the Counting Practices Manual (CPM).
Version 4.0 of the CPM was released in 2000, and version 4.2 in 2004. Function Point Analysis provides an approach
to measure and present the size of software development, enhancements, and applications in terms of the business
users' understanding of functions and data files, and it provides a mechanism to consistently compare software
developed for different platforms, using different languages and tools. It also offers an excellent mechanism to
baseline requirements in terms of transactions and files as described by business users, and thereby helps control
scope creep.
At a conceptual level, Function Point Analysis (FPA) distinguishes two abstract categories of data handled by
elementary processes within the application boundary: data at rest (files) and data in motion (transactions).
Transactions that bring data from outside the application boundary to inside it are called External Inputs (EIs),
whereas External Outputs (EOs) and External Inquiries (EQs) are transactions that take data from a resting position
(files) to outside the application boundary. The following diagram outlines the process of Function Point Analysis.
The first step in Function Point Analysis is to define the application boundary as visualized by the business users,
independent of the technical implementation. The scope document, requirements matrix, functional specifications,
and data flow diagrams can be used as inputs. The next step involves calculating unadjusted function points from the
data files and transactional functions. Data Element Types (DETs) and Record Element Types (RETs) are considered
when rating the data files (Internal Logical Files (ILFs) and External Interface Files (EIFs)), while DETs and File Types
Referenced (FTRs) are considered when analyzing the transactional functions.
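The unadjusted function point count described above amounts to a weighted sum over the five function types. In the sketch below, the complexity weights are the standard IFPUG values from the Counting Practices Manual; the function counts are a hypothetical example project, assumed already rated low/average/high from its DETs, RETs, and FTRs.

```python
# Standard IFPUG complexity weights for the five function types.
WEIGHTS = {
    "EI":  {"low": 3, "average": 4,  "high": 6},   # External Inputs
    "EO":  {"low": 4, "average": 5,  "high": 7},   # External Outputs
    "EQ":  {"low": 3, "average": 4,  "high": 6},   # External Inquiries
    "ILF": {"low": 7, "average": 10, "high": 15},  # Internal Logical Files
    "EIF": {"low": 5, "average": 7,  "high": 10},  # External Interface Files
}

def unadjusted_fp(counts: dict) -> int:
    """Sum every counted function, weighted by its type and complexity rating."""
    return sum(
        n * WEIGHTS[ftype][complexity]
        for ftype, by_complexity in counts.items()
        for complexity, n in by_complexity.items()
    )

# Hypothetical counts after rating each function's complexity.
example = {
    "EI":  {"low": 6, "average": 2},
    "EO":  {"average": 4, "high": 1},
    "EQ":  {"low": 3},
    "ILF": {"average": 2},
    "EIF": {"low": 1},
}
print("Unadjusted FP:", unadjusted_fp(example))  # 87 for this example
```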
There are 14 General System Characteristics (GSCs) that are used to calculate the Value Adjustment Factor (VAF),
which is applied to the unadjusted function point count to arrive at the adjusted function point count. Adjusted
function points are used to determine the effort or cost and the optimum duration of development, and to derive the
development schedule. Cost, duration, and schedule are derived from the adjusted function points using the
organization's historical productivity data from projects in a similar domain, using similar technology and tools.
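The adjustment and effort-derivation steps above can be sketched as follows. The VAF formula (0.65 + 0.01 times the sum of the 14 GSC ratings, each rated 0 to 5) is the standard IFPUG calculation; the GSC ratings, the 400 UFP count, and the productivity rate of 10 hours per function point are hypothetical values standing in for an organization's historical data.

```python
# IFPUG Value Adjustment Factor: VAF = 0.65 + 0.01 * TDI,
# where TDI is the sum of the 14 GSC ratings (each 0..5),
# so VAF always falls in the range [0.65, 1.35].
def value_adjustment_factor(gsc_ratings: list) -> float:
    assert len(gsc_ratings) == 14 and all(0 <= r <= 5 for r in gsc_ratings)
    return 0.65 + 0.01 * sum(gsc_ratings)

# Hypothetical ratings for the 14 General System Characteristics.
gsc = [3, 2, 4, 3, 1, 0, 2, 3, 4, 2, 1, 3, 2, 5]
ufp = 400                        # hypothetical unadjusted function point count
vaf = value_adjustment_factor(gsc)
afp = ufp * vaf                  # adjusted function points

# Effort derived from historical productivity (hypothetical rate).
hours_per_fp = 10.0
effort_hours = afp * hours_per_fp
print(f"VAF = {vaf:.2f}, Adjusted FP = {afp:.0f}, Effort = {effort_hours:.0f} hours")
```

The GSC ratings here sum to 35, so the VAF is exactly 1.00 and the adjusted count equals the unadjusted one; a sum above 35 inflates the count, and a sum below 35 deflates it.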
The size of the software plays an important role in the measures used to deploy statistical process control and to
compare results across projects. These results help organizations plan improvement activities at the organization,
department, or unit level. The availability of trained professionals with a sound understanding of Function Point
Analysis helps an organization consistently deploy a statistical process control framework and deliver predictable
results.
