M&E Overview

Monitoring provides routine tracking of progress against agreed plans, while evaluation determines the worth or significance of an intervention as systematically and objectively as possible. Monitoring measures progress towards specific objectives; evaluation assesses impact and is more episodic. Both serve accountability purposes and offer analysis of why progress fell short. Together, monitoring and evaluation draw lessons from activities so as to maximize the desired impact.


December 2022

Quiz: Monitoring or Evaluation?
• E: attempts to determine as systematically and objectively as possible the worth or significance of an intervention, strategy or policy
• M: information for tracking progress according to previously agreed plans and schedules is routinely gathered
• M: provides elements of analysis as to why progress fell short of expectations
• M&E: serves accountability purposes
Quiz: M or E? (continued)
• M: measures progress in achieving specific objectives and results in relation to an implementation plan, whether for programmes, projects, strategies or activities
• E: is more episodic than M
Definition
• Monitoring can be defined as the ongoing process by which stakeholders obtain regular feedback on the progress being made towards achieving their goals and objectives.
• Evaluation is a rigorous and independent assessment of either completed or ongoing activities to determine the extent to which they are achieving stated objectives and contributing to decision-making.
Source: UNDP Handbook
Traditional vs. RBM&E
Traditional M&E focuses on the monitoring and evaluation of inputs, activities and outputs, that is, on programme implementation.
RBM&E, however, combines the traditional approach of monitoring implementation with the assessment of results.
Results-Based M&E – Why?
RBM&E:
• Provides momentum for establishing goals and objectives that address outcomes
• Provides timely information about progress and helps early identification of any weaknesses that require action
• Provides a view on how well outcomes are being achieved over time
• Promotes credibility and public confidence by reporting on the results of programmes
“An unexamined life is not worth living” (Socrates, in Plato’s Apology).

Monitoring and Evaluation is the function of project management whose aim is to learn from what you are doing and how you are doing it, with the objective of adjusting, correcting or reinforcing activities in the project so as to maximize the desired impact.
M&E therefore focuses on:
• Efficiency
• Effectiveness
• Impact
Evaluation:
• Based on the stated objectives, evaluation assesses the relevance, impact and sustainability of the project.
• Evaluation looks at what the project intended to achieve, what it has achieved, and how it has been achieved.
• Evaluation enables project managers and officers to understand and demonstrate the results of their work, determine the best strategies for achieving their goals, and document lessons learned to improve future programmes.
RBM life cycle approach
Putting planning, monitoring and evaluation together…
Designing and Building a Results-Based Monitoring and Evaluation System: 10 Steps
Step 1: Conducting a Readiness Assessment
Capacity assessment:
• Who has the technical skills to design and implement an M&E system?
• What data systems currently exist, and of what quality are they?
• What technology is available to support the data system?
• What resources are available to implement the system?
A readiness assessment is like constructing the foundation for a building. A good foundation provides support for all that is above it. It is below ground, not seen, but critical.
Step 2: Agreeing on Outputs/Outcomes to Monitor and Evaluate
Know where you are going before you get moving!
• Outcomes are what tell you whether you have been successful or not.
• Outcomes should be derived from the strategic priorities of the country. Are there stated national/sectoral goals?
• Have political promises been made that specify improved performance?
• Is authorizing legislation present?
• Use brainstorming, interviews, surveys, etc.
Step 3: Developing Key Indicators to Monitor Outcomes
• Indicator development is a core activity in building the M&E system and drives all subsequent data collection, analysis and reporting.
• Good indicators should be Clear, Relevant, Economic, Adequate and Monitorable. When selecting indicators, be sure to select more than one for each outcome (see the sketch after this list). Changing indicators from time to time helps reduce the chance that people will find a way to manipulate them.
• Indicators should be:
  – SMART
  – More than one for each outcome
  – Reviewed and changed over time
  – Backed by available data
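A minimal sketch, in Python, of how an outcome and its indicators could be recorded. The class names, fields and the simple adequacy check are illustrative assumptions, not part of any standard M&E toolkit.

from dataclasses import dataclass, field

@dataclass
class Indicator:
    """One measurable signal for an outcome (keep it Clear, Relevant,
    Economic, Adequate and Monitorable)."""
    name: str
    unit: str                      # e.g. "percent", "children"
    data_source: str               # where the measurement will come from
    baseline: float | None = None  # filled in at Step 4

@dataclass
class Outcome:
    statement: str
    indicators: list[Indicator] = field(default_factory=list)

    def is_adequately_covered(self) -> bool:
        # Rule of thumb from the text: more than one indicator per outcome,
        # and data must actually be available for each of them.
        return len(self.indicators) > 1 and all(i.data_source for i in self.indicators)

# Illustrative example: two indicators for one education outcome
enrolment = Outcome(
    statement="Pre-school enrolment improves in the district",
    indicators=[
        Indicator("Gross pre-school enrolment rate", "percent", "Annual school census"),
        Indicator("Number of new pre-school places created", "places", "District education records"),
    ],
)
print(enrolment.is_adequately_covered())  # True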
Step 4: Gathering Baseline Data on Indicators
• ‘Where are we today?’ Find out at the beginning of the intervention.
• Collecting baseline data essentially means taking the first measurements of the indicators to find out ‘where are we today?’
• In fact, one consideration when choosing indicators is the availability of baseline data, which will allow performance to be tracked relative to that baseline.
• Sources of baseline data can be either primary or secondary.
• Once you have chosen your baseline data, you will need to decide who is going to collect the data and how. You will need to develop the data collection instruments, such as forms for gathering information from files, records, interviews, surveys, etc.
A form: Building Baseline Information
Columns: Indicator | Data source | Data collection method | Who will collect data? | Frequency of collection | Cost and difficulty to collect | Who will analyze data? | Who will report data? | Who will use data?
Rows: 1. / 2. / 3. (one row per indicator; one filled-in row is sketched in code below)
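A hypothetical sketch of a single row of the form as a Python record; the field names follow the columns above, and the example values are invented for illustration.

from dataclasses import dataclass

@dataclass
class BaselineRow:
    """One row of the 'Building Baseline Information' form."""
    indicator: str
    data_source: str
    collection_method: str     # e.g. record review, interview, survey
    collector: str             # who will collect the data?
    frequency: str             # how often the data will be collected
    cost_and_difficulty: str   # rough rating of cost/difficulty to collect
    analyst: str               # who will analyze the data?
    reporter: str              # who will report the data?
    user: str                  # who will use the data?

row = BaselineRow(
    indicator="Gross pre-school enrolment rate",
    data_source="Annual school census",
    collection_method="Record review",
    collector="District M&E officer",
    frequency="Annually",
    cost_and_difficulty="Low",
    analyst="Planning unit",
    reporter="District M&E officer",
    user="Ministry of Education",
)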
Step 5: Setting the Targets
• Definition: Targets are quantifiable levels of the indicators that a country or organisation wants to achieve at a given point in time.
• For example, ‘pre-school enrolments will increase by 20% in the next five years over the baseline’ (a worked version of this arithmetic is sketched below).
Example: Education
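A small worked sketch of the example target in Python; the baseline number and time frame are assumptions for illustration only.

# Hypothetical baseline measured at Step 4; the plan calls for a 20% increase over it.
baseline_enrolment = 75_000   # children enrolled in pre-school today (assumed)
planned_increase = 0.20       # "increase by 20% ... over the baseline"
years_to_target = 5

target_enrolment = baseline_enrolment * (1 + planned_increase)
print(f"Target after {years_to_target} years: {target_enrolment:,.0f} children")
# Target after 5 years: 90,000 children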
Step 6: Building a Monitoring System
RBM&E tracks both implementation and results!
• Building an effective monitoring system involves administrative and institutional tasks such as establishing data collection and analysis guidelines, means of quality control, timeliness and costs, and guidelines on dissemination of the information and analysis.
• We have set indicators, targets and baselines. Now we need to move to a ‘system’ (a minimal tracking sketch follows).
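A minimal sketch of tracking results against the baseline and target set in the earlier steps; the data structure, numbers and reporting periods are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class Measurement:
    period: str      # e.g. "2023", "Q1 2024"
    value: float

def progress_report(baseline: float, target: float, measurements: list[Measurement]) -> None:
    """Print how far each period's actual value has moved from the baseline toward the target."""
    for m in measurements:
        achieved = (m.value - baseline) / (target - baseline)
        print(f"{m.period}: {m.value:,.0f} ({achieved:.0%} of the way to the target)")

# Illustrative data only
progress_report(
    baseline=75_000,
    target=90_000,
    measurements=[Measurement("2023", 78_000), Measurement("2024", 82_500)],
)
# 2023: 78,000 (20% of the way to the target)
# 2024: 82,500 (50% of the way to the target)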
A glance at different levels of indicators for an OVC intervention (the chain is also sketched in code below):
• Activity: the number of textbooks distributed in a district
• Output: the number of children who received textbooks
• Outcome: school attendance rate among children affected by HIV/AIDS
• Impact: literacy rate among adolescents orphaned or made vulnerable by HIV/AIDS
Source: Adapted from Guide to Monitoring and Evaluation of the National Response for Children Orphaned and Made Vulnerable by HIV/AIDS
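A tiny sketch of the same results chain as an ordered structure in Python; the level names follow the list above, and the rest is assumed for illustration.

from enum import IntEnum

class ResultLevel(IntEnum):
    """Levels of the results chain, from lowest to highest."""
    ACTIVITY = 1
    OUTPUT = 2
    OUTCOME = 3
    IMPACT = 4

# One indicator per level for the OVC example above
ovc_chain = {
    ResultLevel.ACTIVITY: "Number of textbooks distributed in a district",
    ResultLevel.OUTPUT: "Number of children who received textbooks",
    ResultLevel.OUTCOME: "School attendance rate among children affected by HIV/AIDS",
    ResultLevel.IMPACT: "Literacy rate among adolescents orphaned or made vulnerable by HIV/AIDS",
}
for level in sorted(ovc_chain):
    print(f"{level.name.title()}: {ovc_chain[level]}")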
Step 7: Role of Evaluations
• Evaluation can tell us:
  – Strategy: Are we doing the right things?
  – Operation: Are we doing things right?
  – Learning: Are there better ways of doing it?
• Evaluation can challenge the causal assumptions.
Evaluation – When?
• When an in-depth investigation is required
• When deciding whether or not to expand a pilot
• When there has been no improvement over a long period
• When divergent outcomes are reported
Step 8: Reporting M&E Findings
• Report to whom? In what form? At what intervals?
• Use easy-to-understand visual displays.
• Compare the current status to past data; look for patterns and trends (see the sketch after this list).
• Provide hints to problems (early warning).
• Report findings whether positive or negative.
• Demonstrate the value of the intervention.
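A small sketch of the "compare current status to past data" idea in Python; the threshold and the series of measurements are assumptions for illustration.

def trend_warning(history: list[float], min_growth: float = 0.0) -> str:
    """Compare the latest value with the previous one and flag slippage early."""
    if len(history) < 2:
        return "Not enough data yet to judge a trend."
    latest, previous = history[-1], history[-2]
    growth = (latest - previous) / previous
    if growth < min_growth:
        return f"Early warning: indicator fell {abs(growth):.1%} since the last period."
    return f"On track: indicator rose {growth:.1%} since the last period."

# Illustrative series of annual measurements
print(trend_warning([75_000, 78_000, 77_000]))
# Early warning: indicator fell 1.3% since the last period.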
Step 9: Using the Findings
• The value of an M&E system lies not simply in generating information, but in getting the information to the appropriate users in a timely fashion.
• Use of findings:
  – Helps formulate and justify budget requests
  – Triggers examination of what problems exist and what corrections are needed
  – Provides data for in-depth evaluations, etc.
Step 10: Sustaining the System
• Demand: build in a formal structure that requires regular reporting
• Clear roles and responsibilities: create clear, formal lines of authority and responsibility
• Trustworthy and credible information: the information system must be able to produce both good and bad news
• Accountability: consider the external stakeholders who have an interest in performance information
• Capacity: human and institutional
