Establishing an M&E System

An M&E system helps project teams and stakeholders understand whether the project is on track, achieving its goals, and delivering value to beneficiaries. It ensures accountability, supports decision-making, and facilitates learning for future projects.

Qualities and principles of indicators, and the processes involved in establishing and implementing monitoring and evaluation systems within an organization or project.
Presentation Outline

• Monitoring
• Evaluation
• Monitoring and Evaluation System
• Indicators
• Establishing an M&E system
• Implementing an M&E system
• Logical Framework

Monitoring
Monitoring is the routine collection and analysis of information to
track progress against set plans and check compliance with established
standards.
It helps identify trends and patterns, adapt strategies and inform
decisions for project/programme management.

Evaluation
A systematic and objective assessment of an on-going or completed
project, program, or policy, and its design, implementation and
results. The aim is to determine the relevance and fulfilment of
objectives, development efficiency, effectiveness, impact, and
sustainability.

Monitoring and evaluation (M&E)

In most cases an M&E system refers to all the indicators, tools and processes that you will use to measure whether a program has been implemented according to the plan (monitoring) and is having the intended effect (evaluation).

Aims of Monitoring and Evaluation

• Relevance: Do the objectives and goals match the problems or needs that are being addressed?

• Efficiency: Is the project delivered in a timely and cost-effective manner?

• Effectiveness: To what extent does the intervention achieve its objectives? What supportive factors and obstacles were encountered during implementation?

• Impact: What happened as a result of the project? This may include intended and unintended, positive and negative effects.

• Sustainability: Are there lasting benefits after the intervention is completed?
Common terms, definitions and examples

• Inputs: the financial, human, and material resources used for the development intervention. Examples: technical expertise, equipment, funds.

• Activities: actions taken or work performed. Examples: training workshops conducted, classroom blocks constructed.

• Outputs (short-term): the products, capital goods, and services that result from a development intervention. Examples: number of people trained, number of workshops conducted.

• Outcomes (medium-term): the likely or achieved short-term and medium-term effects or changes of an intervention's outputs. Examples: increased skills, new employment opportunities.

• Impact (long-term): the long-term consequences of the program; these may be positive or negative. Examples: improved standard of living, increased literacy levels/rates.
Indicators

• Are clues, signs or markers that measure one aspect of a program and
show how close a program is to its desired path and outcomes.

• Indicators show progress and help measure change.

• Indicators are realistic and measurable criteria of project progress.

• They are usually expressed as a percentage or a number.

• Indicators can be measured quantitatively or qualitatively.

Quantitative Indicators
They are measurements, hard facts and rigid numbers, percentages or
ratios that tell us whether the activities we have planned are actually
happening as intended.

Examples:
• The number of people attending a training
• The weight of fish caught

A hybrid indicator
Quantifies qualitative data, like an index.

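As an illustrative sketch of a hybrid indicator, the snippet below converts Likert-scale satisfaction ratings into a single numeric index. The scale, function name and rescaling are assumptions for illustration, not taken from the source:

```python
# Illustrative sketch: turning qualitative Likert-scale responses (1-5)
# into a single 0-100 index score (a "hybrid" indicator).
# The scale and rescaling are hypothetical, not from a specific framework.

def satisfaction_index(responses):
    """Average Likert responses (1-5) and rescale to a 0-100 index."""
    if not responses:
        raise ValueError("no responses to aggregate")
    mean = sum(responses) / len(responses)
    return round((mean - 1) / 4 * 100, 1)

# Example: survey answers from ten beneficiaries
scores = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]
print(satisfaction_index(scores))  # 70.0
```

The index keeps the qualitative data's meaning (level of satisfaction) while making it comparable across reporting periods.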
Qualitative Indicators:
• Qualitative indicators are usually indicators of change (outcomes).

• Qualitative data is based on opinions, feelings or viewpoints rather than hard facts. It also includes people's judgements and perceptions about a subject.

• Qualitative indicators are seen as subjective, unreliable and difficult to verify.

Examples:
• Level of satisfaction with the services
• Increased hopes among people for the betterment of democratic systems
• Greater freedom of expression
Qualities of an Indicator
Good indicators can be said to be roaring (ROARS):

• Relevant: It measures an important part of an objective or output;

• Objective: If two people measure the same indicator using the same
tool, they should get the same result. The indicator should be based
on fact, rather than feelings or impressions.

• Available: Indicators should be based on data that is readily available, or on data that can be collected with reasonable extra effort as part of the implementation of the (sub-)project.

• Realistic: It should not be too difficult or too expensive to collect the information.

• Specific: The measured changes should be attributable to the project, and they should be expressed in precise terms.
Examples of good indicators include:
% decrease in prevalence of water borne diseases
% increase in proper hand-washing practices
% increase in household income
% increase in per unit yield of maize crop

Example of a defined indicator and how it is calculated

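The slide's worked calculation is an image; as a hedged sketch, a percentage-change indicator such as "% increase in household income" could be computed as follows (the function name and figures are illustrative assumptions):

```python
# Hypothetical sketch of a percentage-change indicator, e.g.
# "% increase in household income" between baseline and endline values.

def percent_change(baseline, endline):
    """Indicator value: change relative to the baseline, as a percentage."""
    if baseline == 0:
        raise ValueError("baseline must be non-zero")
    return (endline - baseline) / baseline * 100

# Example: average monthly household income rose from 200 to 250
print(percent_change(200, 250))  # 25.0
```

A negative result would indicate a decrease, which suits indicators such as "% decrease in prevalence of water-borne diseases".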
Criteria for Selecting Indicators

• Valid and meaningful – an indicator should adequately reflect the phenomenon it is intended to measure and should be appropriate to the needs of the user.

• Sensitive and specific to the underlying phenomenon – sensitivity relates to how significantly an indicator varies according to changes in the underlying phenomenon.

• Grounded in research – awareness of the key influences and factors affecting outcomes.

• Statistically sound – indicator measurement needs to be methodologically sound and fit for the purpose to which it is being applied.
• Linked to policy or emerging issues – indicators should be selected to reflect important issues as closely as possible.
Criteria – Cont’d

• Intelligible and easily interpreted – indicators should be sufficiently simple to be interpreted in practice and intuitive in the sense that it is obvious what the indicator is measuring.

• Related where appropriate to other indicators – a single indicator often shows only part of a phenomenon and is best interpreted alongside other similar indicators.

• Ability to be disaggregated – indicators should be able to be broken down into population sub-groups or areas of particular interest, such as ethnic groups or regional areas.

• Consistency over time – the usefulness of indicators is directly related to the ability to track trends over time, so as far as possible indicators should be consistent.
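Disaggregation can be sketched in code as grouping indicator data by a sub-group field. The records and field names below are hypothetical, purely for illustration:

```python
# Illustrative sketch: disaggregating an indicator ("% of people trained")
# by a sub-group field such as region. Records and field names are
# hypothetical examples, not from any real dataset.
from collections import defaultdict

records = [
    {"region": "North", "trained": True},
    {"region": "North", "trained": False},
    {"region": "South", "trained": True},
    {"region": "South", "trained": True},
]

def disaggregate(rows, group_field, value_field):
    """Share of rows (as a %) where value_field is True, per group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [true_count, total]
    for row in rows:
        counts[row[group_field]][1] += 1
        if row[value_field]:
            counts[row[group_field]][0] += 1
    return {g: t / n * 100 for g, (t, n) in counts.items()}

print(disaggregate(records, "region", "trained"))
# {'North': 50.0, 'South': 100.0}
```

The same pattern extends to any sub-group of interest, such as gender, age band or ethnic group.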
Key Steps in Establishing an M&E System
1. Identify the purpose and scope of the M&E system
 Know your program/project
 Review the project/programme’s operational design (logframe)
 Identify key stakeholder informational needs and expectations
 Identify any M&E requirements
 Define the scope of major M&E events and functions

2. Plan for data collection and management
 Choose and define your indicators, and how they will be measured
 Determine the data collection methods, sources, and tools
 Determine the balance of quantitative and qualitative data
 Triangulate data collection sources and methods
 Determine sampling requirements
 Determine any surveys and define their frequency
 Establish stakeholder complaints and feedback mechanisms
 Plan for data management
Cont’d
3. Plan for data analysis
 Determine how data will be analysed
 Develop a data analysis plan
 Determine how data will be stored
 Define responsibilities in data analysis and storage

4. Plan for information reporting and utilization
 Plan for reporting
 Plan for information utilization and dissemination

5. Plan for M&E human resources and capacity building
 Assess the project/programme's human resources capacity for M&E
 Determine the extent of local participation
 Determine the extent of outside expertise
 Define the roles and responsibilities for M&E
 Plan to manage the project/programme team's M&E activities
Cont’d
6. Prepare the M&E budget
 Itemize M&E budget needs
 Incorporate M&E costs in the project/programme budget
 Review any donor budget requirements and contributions
 Plan for cost contingency
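The planning steps above typically culminate in an M&E plan table. As a minimal sketch, such a table could be represented as structured data; the field names (indicator, data_source, method, frequency, responsible) are assumed for illustration, not prescribed by the source:

```python
# Illustrative sketch only: rows of a simple M&E plan as Python data.
# Field names and row contents are hypothetical examples.

me_plan = [
    {
        "indicator": "% of trainees passing the post-test",
        "data_source": "training attendance and test records",
        "method": "written test after each workshop",
        "frequency": "quarterly",
        "responsible": "M&E officer",
    },
    {
        "indicator": "Level of satisfaction with services",
        "data_source": "beneficiary survey",
        "method": "sample survey with Likert-scale questions",
        "frequency": "annually",
        "responsible": "project manager",
    },
]

# A simple completeness check: every row must define all required fields
required = {"indicator", "data_source", "method", "frequency", "responsible"}
for row in me_plan:
    assert required <= row.keys()
print(len(me_plan))  # 2
```

Structuring the plan this way makes it easy to validate that every indicator has a named data source, collection frequency and responsible party before implementation starts.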

Implementing an M&E system
1. Assess capacities and conditions for implementing the system/plan
2. Build capacities of personnel responsible for implementing the plan

3. Develop a detailed work plan for the M&E plan, including:
• Each M&E activity (including updates to the M&E plan)
• The timing of each activity
• The party responsible for each activity
• The budget necessary for each activity

4. How much will it cost? Budget for the costs of:
• Information systems (data collection, processing, and analysis)
• Information dissemination and use
• Data quality control systems
• Coordination and capacity building
Core M&E system elements
1. The monitoring, evaluation and reporting plan is the heart of the
system.
• This is a plan (in table format) to manage the collection, analysis and
reporting of data.

2. Central documents guiding the implementation of the M&E plan.


• A guideline document for the M&E plan
 Forms and tools for data collection
• Reporting formats

3. Institutional arrangements
• Clearly defined roles and responsibilities for monitoring, evaluation and reporting
• Human and physical resources: sufficient budget, capacity/training
• Electronic database: are there information systems for data storage and analysis?
Institutional arrangements – Cont'd
• Continuous improvement: are there processes for using the evidence arising from the M&E system to improve projects and processes?
• Capacity building
The Logical Framework

• The logic of assumptions
