
Program: B.Tech
Subject Name: Project Management
Subject Code: CS-604
Semester: 6th
Department of Computer Science and Engineering
Subject Notes
CS-604 (B) Project Management
Unit -1
Topics to be covered
Evolution of software economics, improving software economics: reducing product size, software processes,
team effectiveness, automation through software environments. Principles of modern software management.

1. Evolution of software economics


Most software cost models can be abstracted into a function of five basic parameters: size, process,
personnel, environment, and required quality.
1. The size of the end product (in human-generated components), which is typically quantified in terms
of the number of source instructions or the number of function points required to develop the
required functionality
2. The process used to produce the end product, in particular the ability of the process to avoid non-
value-adding activities (rework, bureaucratic delays, communications overhead)
3. The capabilities of software engineering personnel, and particularly their experience with the
computer science issues and the applications domain issues of the project
4. The environment, which is made up of the tools and techniques available to support efficient software
development and to automate the process
5. The required quality of the product, including its features, performance, reliability, and
adaptability.
The relationship among these parameters and the estimated cost can be written as follows:
Effort = (Personnel)(Environment)(Quality)(Size^Process)
Several parametric models have been developed to estimate software costs; all of them can be generally
abstracted into this form. One important aspect of software economics (as represented within today's
software cost models) is that the relationship between effort and size exhibits a diseconomy of scale. The
diseconomy of scale of software development is a result of the process exponent being greater than 1.0.
Contrary to most manufacturing processes, the more software you build, the more expensive it is per unit
item.
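The diseconomy of scale can be made concrete with a small numeric sketch. The exponent and multiplier used below are illustrative assumptions, not calibrated values from any particular cost model; the point is only that an exponent above 1.0 makes the cost per unit of size grow as total size grows.

```python
# Minimal sketch of the diseconomy of scale (illustrative numbers only).

def effort(size_ksloc, process_exponent=1.2, multiplier=3.0):
    """Effort = multiplier * size^exponent, in person-months (assumed units)."""
    return multiplier * (size_ksloc ** process_exponent)

for size in (10, 100, 1000):  # thousands of source lines
    e = effort(size)
    print(f"{size:>5} KSLOC -> {e:9.1f} person-months "
          f"({e / size:6.2f} person-months per KSLOC)")
# Because the process exponent exceeds 1.0, the cost per KSLOC rises with size,
# unlike most manufacturing processes.
```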
Target objective: improved return on investment, characterized by an environment of off-the-shelf,
integrated tools; a size profile of roughly 70% component-based and 30% custom development; and a
managed/measured process.

Improving software economics: reducing software product size, improving software processes, improving
team effectiveness, improving automation, achieving required quality, peer inspections.
The old way and the new: the principles of conventional software engineering, the principles of modern
software management, transitioning to an iterative process.


2. Improving Software Economics


The five basic parameters of the software cost model correspond to five ways of improving software economics:

1. Reducing the size or complexity of what needs to be developed

2. Improving the development process

3. Using more-skilled personnel and better teams (not necessarily the same thing)

4. Using better environments (tools to automate the process)

5. Trading off or backing off on quality thresholds

These parameters are given in priority order for most software domains. Table 3-1 lists some of the
technology developments, process improvement efforts, and management approaches targeted at improving
the economics of software development and integration.

Important trends in improving software economics

COST MODEL PARAMETER                        TRENDS

Size                                        Higher order languages (C++, Ada 95, Java, Visual Basic, etc.)
(abstraction and component-based            Object-oriented (analysis, design, programming)
development technologies)                   Reuse
                                            Commercial components

Environment                                 Integrated tools (visual modeling, compiler, editor, debugger,
(automation technologies and tools)         change management, etc.)
                                            Open systems
                                            Hardware platform performance
                                            Automation of coding, documents, testing, analyses

Quality                                     Hardware platform performance
(performance, reliability, accuracy)        Demonstration-based assessment
                                            Statistical quality control

3. Reducing Software Product Size


The most significant way to improve affordability and return on investment (ROI) is usually to produce a
product that achieves the design goals with the minimum amount of human-generated source material.
Component-based development is introduced here as the general term for reducing the "source" language
size necessary to achieve a software solution. Reuse, object-oriented technology, automatic code production,
and higher order programming languages are all focused on achieving a given system with fewer lines of
human-specified source directives (statements). This size reduction is the primary motivation behind
improvements in higher order languages (such as C++, Ada 95, Java, Visual Basic, and fourth-generation
languages), automatic code generators (CASE tools, visual modeling tools, GUI builders), reuse of commercial
components (operating systems, windowing environments, database management systems, middleware,
networks), and object-oriented technologies (Unified Modeling Language, visual modeling tools, architecture
frameworks). The reduction is defined in terms of human-generated source material. In general, when size-
reducing technologies are used, they reduce the number of human-generated source lines.
LANGUAGES
Universal function points (UFPs) are useful estimators for language-independent, early life-cycle estimates.
The basic units of function points are external user inputs, external outputs, internal logical data groups,
external data interfaces, and external inquiries. SLOC metrics are useful estimators for software after a
candidate solution is formulated and an implementation language is known. Substantial data have been
documented relating SLOC to function points. Some of these results are shown in Table 3-2.

Language Expressiveness

LANGUAGE          SLOC PER UFP

Assembly              320
C                     128
FORTRAN 77            105
COBOL 85               91
Ada 83                 71
C++                    56
Ada 95                 55
Java                   55
Visual Basic           35
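As a hedged illustration of how these expressiveness ratios are used, the sketch below converts a hypothetical early function-point count into rough SLOC estimates for a few target languages. The component counts and the average-complexity weights are assumptions made for this example only, not values prescribed by the notes.

```python
# Sketch: early, language-independent sizing with UFPs, then conversion to SLOC
# using the table above. Counts and weights below are hypothetical.

SLOC_PER_UFP = {"Assembly": 320, "C": 128, "FORTRAN 77": 105, "COBOL 85": 91,
                "Ada 83": 71, "C++": 56, "Ada 95": 55, "Java": 55,
                "Visual Basic": 35}

# Commonly cited average weights per counted element (assumed for this sketch).
WEIGHTS = {"inputs": 4, "outputs": 5, "inquiries": 4,
           "internal_files": 10, "external_interfaces": 7}

counts = {"inputs": 20, "outputs": 15, "inquiries": 10,
          "internal_files": 5, "external_interfaces": 3}  # hypothetical system

ufp = sum(WEIGHTS[k] * counts[k] for k in counts)
print(f"Unadjusted function points: {ufp}")
for lang in ("C", "C++", "Java"):
    print(f"  ~{ufp * SLOC_PER_UFP[lang]:,} SLOC if implemented in {lang}")
```

Once an implementation language is chosen, the same UFP total maps to very different SLOC figures, which is why size-reduction through higher order languages matters.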
OBJECT-ORIENTED METHODS AND VISUAL MODELING
There has been a widespread movement in the 1990s toward object-oriented technology. I spend very little
time on this topic because object-oriented technology is not germane to most of the software management
topics discussed here, and books on object-oriented technology abound. Some studies have concluded that
object-oriented programming languages appear to benefit both software productivity and software quality.
The fundamental impact of object-oriented technology is in reducing the overall size of what needs to be
developed. The following observations, summarized by Booch, illustrate the interrelationships among the
dimensions of improving software economics.
1. An object-oriented model of the problem and its solution encourages a common vocabulary
between the end users of a system and its developers, thus creating a shared understanding of
the problem being solved.
2. The use of continuous integration creates opportunities to recognize risk early and make
incremental corrections without destabilizing the entire development effort.
3. An object-oriented architecture provides a clear separation of concerns among disparate
elements of a system, creating firewalls that prevent a change in one part of the system from
rending the fabric of the entire architecture.
Booch also summarized five characteristics of a successful object-oriented project.
1. A ruthless focus on the development of a system that provides a well understood collection of
essential minimal characteristics.
2. The existence of a culture that is centred on results, encourages communication, and yet is not
afraid to fail.
3. The effective use of object-oriented modelling.
4. The existence of a strong architectural vision.
5. The application of a well-managed iterative and incremental development life cycle.


REUSE

Reusing existing components and building reusable components have been natural software engineering
activities since the earliest improvements in programming languages. Software design methods have always
dealt implicitly with reuse in order to minimize development costs while achieving all the other required
attributes of performance, feature set, and quality. Try to treat reuse as a mundane part of achieving a
return on investment.
Most truly reusable components of value are transitioned to commercial products supported by organizations
with the following characteristics:

 They have an economic motivation for continued support.


 They take ownership of improving product quality, adding new features, and transitioning to new
technologies.
 They have a sufficiently broad customer base to be profitable.

The cost of developing a reusable component is not trivial. Figure 3-1 examines the economic trade-offs. The
steep initial curve illustrates the economic obstacle to developing reusable components.
Reuse is an important discipline that has an impact on the efficiency of all workflows and the quality of most
artifacts.
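Since Figure 3-1 itself is not reproduced in these notes, the following sketch illustrates the same trade-off with assumed numbers: a reusable component carries an up-front cost premium plus a small per-project integration cost, and it becomes cheaper than repeated custom builds only after several projects adopt it. All cost figures here are hypothetical.

```python
# Rough sketch of the reuse trade-off (all numbers are illustrative assumptions).

CUSTOM_COST = 100          # assumed cost to build a single-use component
REUSABLE_MULTIPLIER = 2.5  # assumed up-front premium for a reusable version
INTEGRATION_COST = 10      # assumed per-project cost to adopt the reusable part

for projects in range(1, 7):
    custom_total = CUSTOM_COST * projects
    reuse_total = CUSTOM_COST * REUSABLE_MULTIPLIER + INTEGRATION_COST * projects
    marker = "  <- reuse cheaper" if reuse_total < custom_total else ""
    print(f"{projects} project(s): custom={custom_total:5.0f}  "
          f"reusable={reuse_total:5.0f}{marker}")
# With these assumptions the reusable component breaks even around the third
# project, which is the "steep initial curve" obstacle described above.
```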

COMMERCIAL COMPONENTS

A common approach being pursued today in many domains is to maximize integration of commercial
components and off-the-shelf products. While the use of commercial components is certainly desirable as a
means of reducing custom development, it has not proven to be straightforward in practice. Following Table
identifies some of the advantages and disadvantages of using commercial components.

Advantages and Disadvantages of Commercial Components and Custom Software

APPROACH                 ADVANTAGES                                DISADVANTAGES

Commercial components    Predictable license costs                 Frequent upgrades
                         Broadly used, mature technology           Up-front license fees
                         Dedicated support organization            Recurring maintenance fees
                         Hardware/software independence            Dependency on vendor
                         Rich in functionality                     Run-time efficiency sacrifices
                                                                   Functionality constraints
                                                                   Integration not always trivial
                                                                   No control over upgrades and maintenance
                                                                   Unnecessary features that consume extra resources
                                                                   Often inadequate reliability and stability
                                                                   Multiple-vendor incompatibilities

Custom development       Complete change freedom                   Expensive, unpredictable development
                         Smaller, often simpler implementations    Unpredictable availability date
                         Often better performance                  Undefined maintenance model
                         Control of development and enhancement    Often immature and fragile
                                                                   Drain on expert resources

4. Improving Software Process


Process is an overloaded term. For software-oriented organizations, there are many processes and
subprocesses. Three distinct process perspectives are:

 Metaprocess: An organization's policies, procedures, and practices for pursuing a software-intensive
line of business. The focus of this process is on organizational economics, long-term strategies, and
software ROI.
 Macroprocess: A project's policies, procedures, and practices for producing a complete software
product within certain cost, schedule, and quality constraints. The focus of the macroprocess is on
creating an adequate instance of the metaprocess for a specific set of constraints.
 Microprocess: A project team's policies, procedures, and practices for achieving an artifact of the
software process. The focus of the microprocess is on achieving an intermediate product baseline
with adequate quality and adequate functionality as economically and rapidly as practical.

Although these three levels of process overlap somewhat, they have different objectives, audiences, metrics,
concerns, and time scales.

In a perfect software engineering world with an immaculate problem description, an obvious solution
space, a development team of experienced geniuses, adequate resources, and stakeholders with common
goals, we could execute a software development process in one iteration with almost no scrap and rework.
Because we work in an imperfect world, however, we need to manage engineering activities so that scrap
and rework profiles do not have an impact on the win conditions of any stakeholder. This should be the
underlying premise for most process improvement

5. Improving Team Effectiveness


Teamwork is much more important than the sum of the individuals. With software teams, a project manager
needs to configure a balance of solid talent with highly skilled people in the leverage positions. Some maxims
of team management include the following:
 A well-managed project can succeed with a nominal engineering team.
 A mismanaged project will almost never succeed, even with an expert team of engineers.
 A well-architected system can be built by a nominal team of software builders.


 A poorly architected system will flounder even with an expert team of builders.
Boehm's five staffing principles are:

1. The principle of top talent: Use better and fewer people


2. The principle of job matching: Fit the tasks to the skills and motivation of the people
available.
3. The principle of career progression: An organization does best in the long run by helping
its people to self-actualize.
4. The principle of team balance: Select people who will complement and harmonize with
one another
5. The principle of phase-out: Keeping a misfit on the team doesn't benefit anyone

Software project managers need many leadership qualities in order to enhance team effectiveness. The following
are some crucial attributes of successful software project managers that deserve much more attention:

 Hiring skills. Few decisions are as important as hiring decisions. Placing the right person in the
right job seems obvious but is surprisingly hard to achieve.
 Customer-interface skill. Avoiding adversarial relationships among stakeholders is a prerequisite
for success.
 Decision-making skill. The jillion books written about management have failed to provide a clear
definition of this attribute. We all know a good leader when we run into one, and decision-
making skill seems obvious despite its intangible definition.
 Team-building skill. Teamwork requires that a manager establish trust, motivate progress,
exploit eccentric prima donnas, transition average people into top performers, eliminate misfits,
and consolidate diverse opinions into a team direction.
 Selling skill. Successful project managers must sell all stakeholders (including themselves) on
decisions and priorities, sell candidates on job positions, sell changes to the status quo in the face of
resistance, and sell achievements against objectives. In practice, selling requires continuous
negotiation, compromise, and empathy

6. Improving Automation through Software Environments


The tools and environment used in the software process generally have a linear effect on the
productivity of the process. Planning tools, requirements management tools, visual modelling tools,
compilers, editors, debuggers, quality assurance analysis tools, test tools, and user interfaces provide crucial
automation support for evolving the software engineering artifacts. Above all, configuration management
environments provide the foundation for executing and instrumenting the process. At first order, the isolated
impact of tools and automation generally allows improvements of 20% to 40% in effort. However, tools and
environments must be viewed as the primary delivery vehicle for process automation and improvement, so
their impact can be much higher.

Automation of the design process provides payback in quality, the ability to estimate costs and
schedules, and overall productivity using a smaller team. Integrated toolsets play an increasingly important
role in incremental/iterative development by allowing the designers to traverse quickly among development
artifacts and keep them up-to-date.
Round-trip engineering is a term used to describe the key capability of environments that support iterative
development. As we have moved into maintaining different information repositories for the engineering
artifacts, we need automation support to ensure efficient and error-free transition of data from one artifact
to another. Forward engineering is the automation of one engineering artifact from another, more
abstract
representation. For example, compilers and linkers have provided automated transition of source code into
executable code. Reverse engineering is the generation or modification of a more abstract representation
from an existing artifact (for example, creating a visual design model from a source code representation).
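As a toy illustration of forward engineering, the sketch below generates a source-code skeleton from a very small, hypothetical model description. Real environments derive such skeletons from UML or other visual models; reverse engineering runs the transformation the other way, for example by parsing existing source to recover a model-level view.

```python
# Toy 'forward engineering' sketch: generate a code skeleton from an abstract
# model. The tiny model format is hypothetical, purely for illustration.

model = {
    "class": "Account",
    "attributes": ["owner", "balance"],
    "operations": ["deposit", "withdraw"],
}

def generate_class(m):
    lines = [f"class {m['class']}:"]
    params = ", ".join(m["attributes"])
    lines.append(f"    def __init__(self, {params}):")
    lines += [f"        self.{a} = {a}" for a in m["attributes"]]
    for op in m["operations"]:
        lines.append(f"    def {op}(self):")
        lines.append("        raise NotImplementedError  # body supplied by the developer")
    return "\n".join(lines)

print(generate_class(model))
# Reverse engineering would go the other way, e.g. using Python's ast module to
# recover the classes and operations from existing source code.
```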
Regarding the economic improvements associated with tools and environments: it is common for tool
vendors to make relatively accurate individual assessments of life-cycle activities to support claims about
the potential economic impact of their tools. For example, it is easy to find statements such as the following
from companies that market a particular class of tool.
 Requirements analysis and evolution activities consume 40% of life-cycle costs.
 Software design activities have an impact on more than 50% of the resources.
 Coding and unit testing activities consume about 50% of software development effort and
schedule.
 Test activities can consume as much as 50% of a project's resources.
 Configuration control and change management are critical activities that can consume as much
as 25% of resources on a large-scale project.
 Documentation activities can consume more than 30% of project engineering resources.
 Project management, business administration, and progress assessment can consume as much
as 30% of project budgets.
ACHIEVING REQUIRED QUALITY
Software best practices are derived from the development process and technologies. Table 3-5 summarizes
some dimensions of quality improvement.
Key practices that improve overall software quality include the following:
 Focusing on driving requirements and critical use cases early in the life cycle, focusing on
requirements completeness and traceability late in the life cycle, and focusing throughout the life
cycle on a balance between requirements evolution, design evolution, and plan evolution
 Using metrics and indicators to measure the progress and quality of an architecture as it evolves
from a high-level prototype into a fully compliant product
 Providing integrated life-cycle environments that support early and continuous configuration
control, change management, rigorous design methods, document automation, and regression
test automation
 Using visual modelling and higher level languages that support architectural control, abstraction,
reliable programming, reuse, and self-documentation
 Early and continuous insight into performance issues through demonstration-based evaluations
Conventional development processes stressed early sizing and timing estimates of computer program
resource utilization. However, the typical chronology of events in performance assessment was as
follows:

 Project inception. The proposed design was asserted to be low risk with adequate performance
margin.
 Initial design review. Optimistic assessments of adequate design margin were based mostly on
paper analysis or rough simulation of the critical threads. In most cases, the actual application
algorithms and database sizes were fairly well understood.
 Mid-life-cycle design review. The assessments started whittling away at the margin, as early
benchmarks and initial tests began exposing the optimism inherent in earlier estimates.
 Integration and test. Serious performance problems were uncovered, necessitating fundamental
changes in the architecture. The underlying infrastructure was usually the scapegoat, but the real
culprit was immature use of the infrastructure, immature architectural solutions, or poorly
understood early design trade-offs.


PEER INSPECTIONS: A PRAGMATIC VIEW


Peer inspections are frequently overhyped as the key aspect of a quality system. In my experience, peer reviews
are valuable as secondary mechanisms, but they are rarely significant contributors to quality compared with
the following primary quality mechanisms and indicators, which should be emphasized in the management
process:
 Transitioning engineering information from one artifact set to another, thereby assessing the
consistency, feasibility, understandability, and technology constraints inherent in the engineering
artifacts
 Major milestone demonstrations that force the artifacts to be assessed against tangible criteria
in the context of relevant use cases
 Environment tools (compilers, debuggers, analyzers, automated test suites) that ensure
representation rigor, consistency, completeness, and change control
 Life-cycle testing for detailed insight into critical trade-offs, acceptance criteria, and requirements
compliance
 Change management metrics for objective insight into multiple-perspective change trends and
convergence or divergence from quality and progress goals
Inspections are also a good vehicle for holding authors accountable for quality products. All authors of
software and documentation should have their products scrutinized as a natural by-product of the
process. Therefore, the coverage of inspections should be across all authors rather than across all
components.

THE PRINCIPLES OF CONVENTIONAL SOFTWARE ENGINEERING


1. Make quality #1. Quality must be quantified and mechanisms put into place to motivate its
achievement
2. High-quality software is possible. Techniques that have been demonstrated to increase quality
include involving the customer, prototyping, simplifying design, conducting inspections, and hiring the
best people
3. Give products to customers early. No matter how hard you try to learn users' needs during the
requirements phase, the most effective way to determine real needs is to give users a product and let
them play with it
4. Determine the problem before writing the requirements. When faced with what they believe is a
problem, most engineers rush to offer a solution. Before you try to solve a problem, be sure to
explore all the alternatives and don't be blinded by the obvious solution
5. Evaluate design alternatives. After the requirements are agreed upon, you must examine a variety of
architectures and algorithms. You certainly do not want to use an architecture simply because it was
used in the requirements specification.
6. Use an appropriate process model. Each project must select a process that makes the most sense for
that project on the basis of corporate culture, willingness to take risks, application area, volatility of
requirements, and the extent to which requirements are well understood.
7. Use different languages for different phases. Our industry's eternal thirst for simple solutions to
complex problems has driven many to declare that the best development method is one that uses the
same notation throughout the life cycle.
8. Minimize intellectual distance. To minimize intellectual distance, the software's structure should be
as close as possible to the real-world structure


9. Put techniques before tools. An undisciplined software engineer with a tool becomes a dangerous,
undisciplined software engineer
10. Get it right before you make it faster. It is far easier to make a working program run faster than it is
to make a fast program work. Don't worry about optimization during initial coding
11. Inspect code. Inspecting the detailed design and code is a much better way to find errors than testing
12. Good management is more important than good technology. Good management motivates people
to do their best, but there are no universal "right" styles of management.
13. People are the key to success. Highly skilled people with appropriate experience, talent, and training
are key.
14. Follow with care. Just because everybody is doing something does not make it right for you. It may be
right, but you must carefully assess its applicability to your environment.
15. Take responsibility. When a bridge collapses we ask, "What did the engineers do wrong?" Even when
software fails, we rarely ask this. The fact is that in any engineering discipline, the best methods can
be used to produce awful designs, and the most antiquated methods to produce elegant designs.
16. Understand the customer's priorities. It is possible the customer would tolerate 90% of the
functionality delivered late if they could have 10% of it on time.
17. The more they see, the more they need. The more functionality (or performance) you provide a user,
the more functionality (or performance) the user wants.
18. Plan to throw one away. One of the most important critical success factors is whether or not a
product is entirely new. Such brand-new applications, architectures, interfaces, or algorithms rarely
work the first time.
19. Design for change. The architectures, components, and specification techniques you use must
accommodate change.
20. Design without documentation is not design. I have often heard software engineers say, "I have
finished the design. All that is left is the documentation. "
21. Use tools, but be realistic. Software tools make their users more efficient.
22. Avoid tricks. Many programmers love to create programs with tricky constructs that perform a
function correctly, but in an obscure way. Show the world how smart you are by avoiding tricky code.
23. Encapsulate. Information-hiding is a simple, proven concept that results in software that is easier to
test and much easier to maintain.
24. Use coupling and cohesion. Coupling and cohesion are the best ways to measure software's inherent
maintainability and adaptability.
25. Use the McCabe complexity measure. Although there are many metrics available to report the
inherent complexity of software, none is as intuitive and easy to use as Tom McCabe's (a minimal
illustration follows this list).
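For item 25, a minimal sketch of McCabe's measure: for a single-entry, single-exit routine, cyclomatic complexity can be computed as the number of decision points plus one (equivalently V(G) = E - N + 2 on the control-flow graph). The keyword-counting shortcut below is a simplification for illustration only, not a full control-flow analysis.

```python
# Approximate McCabe cyclomatic complexity by counting decision keywords + 1.
# This is a teaching simplification of the graph-based definition V(G) = E - N + 2.

import re

DECISION_KEYWORDS = r"\b(if|elif|for|while|case|and|or|catch|except)\b"

def cyclomatic_complexity(source: str) -> int:
    return len(re.findall(DECISION_KEYWORDS, source)) + 1

sample = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    for _ in range(3):
        pass
    return "positive"
"""
print(cyclomatic_complexity(sample))  # 4: if, elif, for, plus 1
```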

7. Principles of Modern Software Management

The top 10 principles of modern software management are listed below. (The first five are the main themes
of my definition of an iterative process.)

1. Base the process on an architecture-first approach. This requires that a demonstrable balance
be achieved among the driving requirements, the architecturally significant design decisions, and
the life-cycle plans before the resources are committed for full-scale development.


2. Establish an iterative life-cycle process that confronts risk early. With today's sophisticated
software systems, it is not possible to define the entire problem, design the entire solution, build
the software, then test the end product in sequence. Instead, an iterative process that refines the
problem understanding, an effective solution, and an effective plan over several iterations
encourages a balanced treatment of all stakeholder objectives. Major risks must be addressed
early to increase predictability and avoid expensive downstream scrap and rework.
3. Transition design methods to emphasize component-based development. Moving from a line-of-
code mentality to a component-based mentality is necessary to reduce the amount of human-
generated source code and custom development.
4. Establish a change management environment. The dynamics of iterative development, including
concurrent workflows by different teams working on shared artifacts, necessitate objectively
controlled baselines.
5. Enhance change freedom through tools that support round-trip engineering. Round-trip engineering
is the environment support necessary to automate and synchronize engineering information in
different formats (such as requirements specifications, design models, source code, executable code,
test cases).

6. Capture design artifacts in rigorous, model-based notation. A model-based approach (such as
UML) supports the evolution of semantically rich graphical and textual design notations.
7. Instrument the process for objective quality control and progress assessment. Life-cycle
assessment of the progress and the quality of all intermediate products must be integrated
into the process.
8. Use a demonstration-based approach to assess intermediate artifacts. It is essential that the
software management process drive toward early and continuous demonstrations within the
operational context of the system, namely its use cases.
9. Plan intermediate releases in groups of usage scenarios with evolving levels of detail.
10. Establish a configurable process that is economically scalable. No single process is suitable for all
software developments.
TRANSITIONING TO AN ITERATIVE PROCESS
Modern software development processes have moved away from the conventional waterfall model, in which
each stage of the development process is dependent on completion of the previous stage.

The economic benefits inherent in transitioning from the conventional waterfall model to an iterative
development process are significant but difficult to quantify. As one benchmark of the expected economic
impact of process improvement, consider the process exponent parameters of the COCOMO II model.
(Appendix B provides more detail on the COCOMO model) This exponent can range from 1.01 (virtually no
diseconomy of scale) to 1.26 (significant diseconomy of scale). The parameters that govern the value of the
process exponent are application precedentedness, process flexibility, architecture risk resolution, team
cohesion, and software process maturity.
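To make the exponent range concrete, the sketch below compares nominal COCOMO II effort at the two endpoints quoted above. The multiplier 2.94 is the commonly cited COCOMO II.2000 calibration constant and is an assumption of this sketch; cost-driver effort multipliers are omitted.

```python
# Hedged illustration of the COCOMO II process exponent range (1.01 vs 1.26).
# Only the exponent endpoints come from the text; the constant is assumed.

NOMINAL_A = 2.94  # COCOMO II.2000 calibration constant (assumption for this sketch)

def cocomo_effort(ksloc, exponent):
    """Nominal effort in person-months, ignoring effort multipliers."""
    return NOMINAL_A * (ksloc ** exponent)

for size in (100, 1000):
    low = cocomo_effort(size, 1.01)   # mature, low-risk, cohesive project
    high = cocomo_effort(size, 1.26)  # unprecedented, inflexible, immature project
    print(f"{size:>5} KSLOC: {low:8.0f} vs {high:8.0f} person-months "
          f"({high / low:.1f}x penalty from process factors)")
# The gap widens with size: the larger the project, the more the process
# exponent parameters dominate the economics.
```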

The following paragraphs map the process exponent parameters of COCOMO II to my top 10 principles of a
modern process.

 Application precedentedness. Domain experience is a critical factor in understanding how to plan and
execute a software development project. For unprecedented systems, one of the key goals is to
confront risks and establish early precedents, even if they are incomplete or experimental. This is one
of the primary reasons that the software industry has moved to an iterative life-cycle process. Early
iterations in the life cycle establish precedents from which the product, the process, and the plans can be
elaborated in evolving levels of detail.
 Process flexibility. Development of modern software is characterized by such a broad solution space
and so many interrelated concerns that there is a paramount need for continuous incorporation of
changes. These changes may be inherent in the problem understanding, the solution space, or the
plans. Project artifacts must be supported by efficient change management commensurate with
project needs. A configurable process that allows a common framework to be adapted across a range
of projects is necessary to achieve a software return on investment.
 Architecture risk resolution. Architecture-first development is a crucial theme underlying a successful
iterative development process. A project team develops and stabilizes the architecture before developing
all the components that make up the entire suite of application components. An architecture-first and
component-based development approach forces the infrastructure, common mechanisms, and control
mechanisms to be elaborated early in the life cycle and drives all component make/buy decisions into
the architecture process.
 Team cohesion. Successful teams are cohesive, and cohesive teams are successful. Successful teams
and cohesive teams share common objectives and priorities. Advances in technology (such as
programming languages, UML, and visual modelling) have enabled more rigorous and understandable
notations for communicating software engineering information, particularly in the requirements and
design artifacts that previously were ad hoc and based completely on paper exchange. These model-
based formats have also enabled the round-trip engineering support needed to establish change
freedom sufficient for evolving design representations.
 Software process maturity. The Software Engineering Institute's Capability Maturity Model (CMM) is a
well-accepted benchmark for software process assessment. One of its key themes is that truly mature
processes are enabled through an integrated environment that provides the appropriate level of
automation to instrument the process for objective quality control.



We hope you find these notes useful.
You can get previous year question papers at https://qp.rgpvnotes.in.
If you have any queries or you want to submit your study notes, please write to us at
[email protected]
