
A Quick Guide to the MEAL DPro


Publisher
© Copyright 2021 PM4NGOs
DPro and its symbols are trademarks of PM4NGOs
The Guide to the MEAL DPro is jointly owned by Catholic Relief Services, the Humanitarian
Leadership Academy, Humentum and PM4NGOs.

ISBN: 978-1-7345721-9-3

This work is licensed under the Creative Commons Attribution-NonCommercial 4.0 International
License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc/4.0/.

Users are free to copy/redistribute and adapt/transform for non-commercial purposes.

Version Information
This is “A Quick Guide to the MEAL DPro” compiled by PM4NGOs Board Member, Peter Marlow, and
based on “A Guide to the MEAL DPro – Monitoring, Evaluation, Accountability and Learning for
Humanitarian and Development Professionals”, Version 1.0 dated April 2019

Version 1, December 2021

Preface
All people working in the international relief and development sectors need to understand MEAL -
Monitoring, Evaluation, Accountability and Learning. The Guide to the MEAL DPro provides a
certification-based and sector-wide standard which helps teams to design, plan and implement
MEAL in their projects by providing clear, practical guidance and tools that can immediately be
applied. This quick guide is a brief overview of the MEAL DPro guide. The full guide is downloadable
for free at https://mealdpro.org/ and contains many useful examples and case studies to illustrate
good practice. You will need to study the full guide if you are planning to take the certification
exam. Also, take a look at MEAL DPro Starter at https://mealdprostarter.org/ which gives on-
demand access to MEAL DPro tools for use in projects, which are also downloadable for free.



INTRODUCTION
MEAL: A key contributor to project success
The Guide to the MEAL DPro helps teams design, plan and implement monitoring, evaluation,
accountability and learning (MEAL) in their projects by providing clear, practical guidance and tools
for project team members. The Guide will also help MEAL officers who may be new to the sector or
new to the job. As a project manager or project team member, you will collaborate with MEAL
technical specialists to ensure that your systems are strong and that your MEAL data are timely and
accurate. Always remember that strong MEAL is critical to project success.

1. MEAL IN PROJECTS
By the end of this chapter, you will be able to:
✓ define the components, structure and purpose of MEAL;
✓ explain the benefits of a strong MEAL system;
✓ describe the relationship between MEAL and project management;
✓ identify the five phases of MEAL;
✓ describe the ethical standards and principles relevant to MEAL;
✓ understand the importance of participation and critical thinking in MEAL processes.

What is MEAL?
MEAL can be imagined as a puzzle made up of four unique pieces - monitoring, evaluation,
accountability and learning – only effective when the pieces are aligned, connected and working
together. We’ll look at each piece in turn to understand what they are - and what they are not.

M - Monitoring: The continual and systematic collection of data to provide information about
project progress.
E - Evaluation: The user-focused, systematic assessment of the design, implementation and results
of an ongoing or completed project.



Monitoring and evaluation are often discussed together as if they were a single, inseparable concept
(M&E). They differ in terms of purpose, frequency, timing and use of data.

Purpose
  Monitoring: Tracking inputs, activities and progress toward achievement of agreed outcomes and impacts.
  Evaluation: A systematic and objective assessment of the merit, value or worth of an ongoing or completed project.
Frequency
  Monitoring: Regular and ongoing during project implementation.
  Evaluation: Periodic, one-off events during and, if funding permits, after project implementation.
Responsibility
  Monitoring: Activities are conducted by members of the project team.
  Evaluation: Activities are often externally led, although they should involve the active participation of project staff.
Use of data
  Monitoring: Informs timely decision-making and short-term corrective action in support of adaptive management.
  Evaluation: Identifies potential course corrections and contributes to longer-term organizational learning.

Monitoring and evaluation data should always be used to inform management decisions, which, in
turn, promote Accountability and Learning.
A – Accountability: A commitment to balance and respond to the needs of all stakeholders
(including project participants, donors, partners and the organization itself) in the activities of the
project.
Projects embrace accountability by promoting transparent communications, aligning with standards
and best practices, being responsive and encouraging participation.
L – Learning: Having a culture and processes in place that enable intentional reflection. The aim of
learning is to make smarter decisions.
Projects learn by incentivizing learning, encouraging a spirit of curiosity, embedding the learning
experience, promoting adaptive management and sharing information.

The MEAL phase model


So, what does strong MEAL look like in practice? MEAL activities in projects are organized into five
phases. Each phase is covered in a separate chapter in the Guide.

Phase 1: Designing logic models


The first phase of the MEAL cycle involves designing logic models - theory of change, results
framework and Logical Framework - that show how the desired change will happen. These models
establish the strong foundations of MEAL because they explain the change the project is seeking to



achieve, the steps through which change will occur, and how change will be measured.
Phase 2: Planning MEAL activities
Working from the foundations of MEAL established in the logic models, you will need to develop
more detailed and comprehensive plans for MEAL, aligned with the larger project plan. There are a
number of tools to help you plan for MEAL.
Phase 3: Collecting MEAL data
Once MEAL planning is complete, you will need to develop and use tools to collect high-quality data
that measure progress, and help you make decisions and learn in a timely manner.
Phase 4: Analyzing MEAL data
Data analysis is conducted during and after project implementation according to the analysis plans
established during the MEAL planning phase.
Phase 5: Using MEAL data
To be of value, MEAL data need to be used. Data are used internally to inform management
decisions, and externally to inform communications and promote accountability.

Together, the five phases of MEAL form a loop that promotes continual, intentional accountability
and learning. Your project should use MEAL data to periodically revisit the logic, design and
implementation of the project and its MEAL system. Furthermore, based on your learning, you
should update the original project design and adjust the MEAL system accordingly, if needed.

Ethical standards in MEAL


When MEAL systems are designed and implemented correctly, projects have the capacity to track
progress, make informed decisions, and increase project impact. To ensure this is done properly,
organizations have created ethical principles covering the following themes:
• Representation of all relevant populations in the data collected.
• Informed consent for participation in data collection activities.
• Privacy and confidentiality of data.
• Safety of participants throughout data collection.
• Data minimization to ensure that only the data needed to answer MEAL questions are collected.
• Responsible data usage, storing it securely and destroying it when no longer needed.

Cross-cutting themes in MEAL


There are two cross-cutting themes that should be integrated into the design, development and
implementation of MEAL activities:
▪ Participation of stakeholders ensures that MEAL findings are relevant to the local context and
improves communication and understanding. It can also build local capacity in MEAL and
promote a more efficient allocation of resources.
▪ Critical thinking is a process of thinking that is clear, rational, open to different opinions and
informed by evidence. It helps to reduce the risk of bias and improve the quality of project data.

Adapting the MEAL DPro


Every project is different, and its MEAL system will reflect any number of factors, including an
organization’s policies and culture in support of MEAL, the project context, the project’s value and
duration, donor MEAL requirements, and the project’s complexity and associated risks. Therefore,
the MEAL DPro tools and processes should be adapted to suit your context.



2. DESIGNING LOGIC MODELS (Phase 1)
By the end of this chapter, you will be able to:
✓ Describe how project logic models contribute to establishing a strong foundation for MEAL;
✓ Compare and contrast the components, structure and purpose of theories of change, results
frameworks and Logframes;
✓ Explain the purpose of identifying assumptions in project logic models;
✓ Interpret the vertical and horizontal logic of Logframes;
✓ Understand the characteristics of a SMART indicator;
✓ Identify the most common measurement methods and when they are used.

Introduction
The first phase of the MEAL cycle involves designing logic models. A logic model is a systematic,
visual way to present a summarized understanding of a project and how it works. It helps project
teams articulate the desired long-term change and how it will be achieved. The information
contained in logic models is the principal input to MEAL system design and is used by many
stakeholders such as project proposal writers, project managers and their teams.
This chapter explores three commonly used project logic models: the Theory of Change (ToC), the
Results Framework (RF) and the Logical Framework (Logframe). It’s best to create them in sequence
as each logic model draws and builds on the information found in the previous models.

Theory of change
The Theory of Change (ToC) is a comprehensive and visual description of how and why a desired
change is expected to happen. It defines the long-term goal of a project and the broad strategic
areas of intervention. It then maps the building blocks or preconditions which need to be in place
for the long-term change to occur. It also identifies the assumptions that need to hold true for the
project to succeed, and the evidence that is available to support them. It is recommended that ToCs



are presented in a visual format so that they are easier to understand. An example of a ToC is
shown in the full Guide. There are free tools available online to help you to create one.
The ToC is made up of the following components:
• Long-term change is the desired lasting impact that the intervention aims to support.
• Preconditions & pathways of change:
o Preconditions are the building blocks of the ToC. They are the requirements that
must exist for the long-term change to take place;
o Domains of change are the broad strategic areas of intervention that most directly
contribute to achieving the long-term goal of the ToC;
o Pathways of change identify the connections between preconditions, how they
relate to each other and in what order. Most initiatives have multiple pathways that
contribute to the long-term goal.
• Assumptions are the conditions or resources outside the direct control of project
management, but that nevertheless must be met for progress to be made toward the
eventual achievement of the long-term goal.
Finally, as you review the ToC, you will find that some preconditions contribute to more than one
pathway of change. Also, keep an eye out for any “blind spots” and “prevailing myths” that may
undermine the validity of your logic models. Blind spots are unintentional omissions in thinking or
errors that happen because of habit, snap judgments or overconfidence. Prevailing myths include
misguided assumptions like “access equals use,” “knowledge equals action,” and “activities equal
outcomes.”
You should always treat the ToC as a living document which should be updated as needed or when
new information is reported.

Results Framework
Now that the ToC is complete, the next step is to translate its contents into a Results Framework
(RF). The RF maps out the logic of the project strategy like the ToC but only includes interventions
that are the direct responsibility of the project team. It is important that the project team clearly
identify and prioritize the criteria it will use to decide what will be included in the RF, and what will
not be included. These criteria can be summarized as: needs prioritization; external program
considerations; appropriateness; institutional capacity; resources availability; financial and economic
feasibility; technical feasibility and sustainability; strategic considerations; and portfolio
considerations. Once these strategic decisions have been made, you will be able to identify what is
inside - and outside - of the scope of the results framework, and you can begin mapping content
from the ToC to your results framework.
The Guide to the MEAL DPro uses a four-level RF model that includes a hierarchy of objectives:
• Goal describes the longer-term, wider development to which the project contributes.
• Strategic objectives (SOs) express the central purpose of the project and significant benefits,
often addressing the immediate causes of the core problem.
• Intermediate results (IRs) express the expected change(s) in behaviors, systems, policies or
institutions as a result of project outputs and activities. There may be more than one IR for
each SO.
• Outputs are the deliverables resulting from project activities. There may be more than one
output for each IR.
In translating the contents of the ToC into an RF:
• The goal level in the RF is consistent with the long-term change identified in the ToC.



• The strategic objectives level in the RF corresponds with the ToC statements found at the
domains of change level.
• The intermediate results and outputs levels correspond with the preconditions of the ToC.
Remember, however, that not all preconditions of the ToC are included in the RF, only the ones that
are the responsibility of your specific project.

The full Guide contains an example of how to translate ToC content into Results Framework
objective statements.

Logical Framework
Once the Results Framework is complete, the next step is to develop the project’s Logical
Framework or Logframe. The Logframe is a logic model that describes the key features of the project
(objectives, indicators, measurement methods and assumptions) and highlights the logical linkages
between them. With the inclusion of these additional items, the Logframe provides the basis for
later developing the MEAL plan.
Like the Theory of Change (ToC) and the Results Framework (RF), the Logframe is intended to
communicate the purpose and main components of a project as clearly and simply as possible.
However, the Logframe includes information that is missing in the ToC and the RF, specifically:
• Indicators are measures used to track progress, reflect change or assess project performance.
• Measurement methods identify how the project will gather the data to track indicator progress.
There are many variations of Logframes. This guide uses a five-level matrix structure:
Columns: Objectives statements | Indicators | Measurement methods | Assumptions
Rows: Goal | Strategic objectives | Intermediate results | Outputs | Activities
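As a minimal illustration (the field names and example content below are hypothetical, not taken from the Guide), one row of the matrix above can be sketched as a simple record, with one record per level of the objectives hierarchy:

```python
# Hypothetical sketch of a five-level Logframe as data. Field names and
# example content are illustrative only; adapt them to your own project.
from dataclasses import dataclass, field


@dataclass
class LogframeRow:
    level: str                      # Goal, Strategic Objective, Intermediate Result, Output or Activity
    objective_statement: str        # column 1: the "vertical logic"
    indicators: list = field(default_factory=list)           # column 2: SMART measures
    measurement_methods: list = field(default_factory=list)  # column 3: how data are gathered
    assumptions: list = field(default_factory=list)          # column 4: external conditions


# One illustrative row at the Output level
logframe = [
    LogframeRow(
        level="Output",
        objective_statement="500 farmers trained in improved practices",
        indicators=["# of farmers completing the full training course"],
        measurement_methods=["attendance records"],
        assumptions=["farmers remain available during the growing season"],
    ),
]

print(logframe[0].level, "-", logframe[0].indicators[0])
```

Keeping each row's indicators, measurement methods and assumptions together in one record mirrors the horizontal logic of the matrix: everything needed to track one objective statement sits side by side.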



Objectives statements (Column 1)
The first column of the Logframe includes the objectives statements that were first created for the
Results Framework. The objectives statements define the “vertical logic” of the project. This also
includes activities which describe the work to deliver the project outputs. At higher levels the
objectives statements tend to be strategic, whereas at lower levels they are more operational.

Assumptions (Column 4)
Before completing columns 2 and 3 of the Logframe (indicators and measurement methods), it is
helpful to complete column 4, the assumptions. These complement the “vertical logic” of the
objectives hierarchy by introducing the “horizontal logic” of the project. The vertical logic succeeds
only if the assumptions at each level of the Logframe hold true.

In principle, you can copy the ToC assumptions into your Logframe. The Guide contains a Decision
Tree for selecting Logframe assumptions.

Indicators (Column 2)
An Indicator is a measure used to track progress, reflect change or assess project performance. Each
Objectives Statement will require at least one indicator, and sometimes more depending on the
information you need. Also, the type of information will depend on which objectives statement the
indicator is intended to track. The key to a good set of indicators is their quality and usefulness.
Also, they should be SMART - Specific, Measurable, Achievable, Relevant, Time-bound. To save you
time and effort, explore whether there are standard, validated indicators that can be reused or
repurposed for your needs.
There are two types of indicators:
• Direct indicators track change by directly examining what you are trying to measure.
• Indirect or proxy indicators track change by examining markers that are generally accepted as
being proxies for what you are trying to measure. These are helpful when the result you are
attempting to monitor is difficult or too expensive to measure.
Lastly, you will need to decide whether your indicator will be quantitative or qualitative:
• Quantitative indicators are measures of quantities or amounts. They help you measure project
progress in the form of numerical information, such as numbers, percentages, rates (such as
birth rate) and ratios (such as number of men to number of women).



• Qualitative indicators measure judgments, opinions, perceptions and attitudes toward a given
situation or subject.
The SPICED framework was developed to help teams collaborate more effectively with communities
to develop objectives and indicators, particularly qualitative ones. Indicators developed collaboratively
are stronger when they are Subjective, Participatory, Interpreted and communicable, Cross-checked
and compared, Empowering, and Diverse and disaggregated.

Measurement methods (Column 3)


Measurement methods identify how the project will gather the data to track the indicators. There
are two methods:
• Quantitative methods collect data that can be counted and subjected to statistical analysis. They
measure quantities, whether they be pure numbers, ratios or percentages. Quantitative
indicators are very widely used in development projects as they give a very clear measurement,
and quantitative data are easy to compare over time (or between projects).
• Qualitative methods capture participants’ experiences using words, pictures and stories. Data is
collected through prompting questions that trigger reflection, ideas and discussion, looking at
why and how change is happening.
Each method has its own strengths and weaknesses, so a mixed-methods approach is often used.
This can strengthen your data if you incorporate a process called triangulation, the validation of
data through cross-verification from two or more sources. Primary data come from information
collected directly by the project’s team and stakeholders. However, when possible, consider using
secondary data sources as well. Secondary data come from information that is already available
through other published or unpublished sources.

3. PLANNING MEAL ACTIVITIES (Phase 2)


By the end of this chapter, you will be able to:
✓ Identify and describe the purpose, process and content of key MEAL planning tools:
o Performance management plan;
o Indicator Performance Tracking Table;
o Feedback-and-response mechanism flowchart;
o Learning plan;
o Planning tools for MEAL communications;
o Summary evaluation table;
o Evaluation terms of reference;
✓ Understand the various types of evaluation and the purpose of each;
✓ Explain why MEAL planning is important and understand its relationship to broader project
planning and project management.

MEAL planning tools


To plan MEAL activities in your project you need to answer the question “How will we collect,
analyze, interpret, use and communicate MEAL information through the life of the project?” These
planning tools help you to answer this question, taking into account your project’s complexity.
There are detailed examples of the use of these tools in the full Guide. However, you should be
aware that many donors specify the formats to be used so as to align with their own systems.
 Performance management plan (PMP), also known as a Monitoring and Evaluation plan, usually
consists of a table with the following columns:



o Objectives statements pulled from the Logframe, excluding the goal and activities lines;
o Indicators pulled from the Logframe with definitions as needed;
o Data collection with four sub-columns: measurement methods, timing and frequency, who
is responsible, and a list of respondents;
o Means of analysis with two sub-columns: the type of analysis to be used, and the different
subgroups (strata) of people taking part in your project;
o Use of information for communication and decision making.
 Indicator Performance Tracking Table (IPTT) helps teams track progress toward a project’s
indicator targets in an easy-to-read table format. The two key components of the IPTT are:
o Baseline: The value of an indicator before the implementation of an activity, against
which subsequent progress can be assessed;
o Target: The specific, planned level of change to be achieved during the project lifetime.
 Feedback-and-response mechanism (FRM) flowchart maps the flow of feedback from
stakeholders and identifies how the project will respond to the feedback it receives. The key to
a strong FRM is ensuring that communication is flowing in two directions:
o Feedback mechanisms: Communities provide feedback to the project team through
channels that include meetings, suggestion boxes, hotlines and others;
o Response mechanisms: The project team acknowledges receipt of the feedback and
provides appropriate responses to the community.
 Learning plan ensures that learning activities are intentionally planned, embedded and managed
throughout the life of the project. There are two elements:
o Learning enables adaptive management, an intentional approach to making decisions
and adjustments to the project in response to new information and changes in context.
This might be done through Learning-to-action discussions (LADs) which are specifically
planned discussions that bring staff together to reflect on data and understand project
progress. They take place throughout the data collection process;
o Organizational learning is the process by which an organization discovers and adapts to
new knowledge, through knowledge creation, transfer and retention.
A learning plan should specify, for each learning activity or process, the roles and
responsibilities, expected outcomes, timeline and resources, under the following headings:
improving the learning culture, embedding the learning process, investing in the capacity to
learn, and encouraging the sharing of learning.
 Planning tools for MEAL communications identify stakeholder information needs and help
ensure that MEAL communications are systematically planned and managed throughout the life
of the project. The communications plan should include the following information:
o Target stakeholders who need to receive communications in ways appropriate to them.
o Information needs for each stakeholder such as:
▪ Project goals & objectives, including project targets & who will receive support;
▪ Access to and use of feedback-and-response mechanisms;
▪ Project progress, changes and updates;
▪ Results of learning efforts.
o Appropriate communications methods to ensure that the project is transparent,
participatory and responsive. Written reports should take account of literacy levels.
Communications should be two-way where appropriate.
o Timing and frequency according to the project calendar as appropriate.
 Summary evaluation table (SET) describes planned evaluations, including priority questions,
timing and budget. Evaluation types are:
o Formative to improve or refine an existing project, up to the mid-point;
o Process to understand how well a project is being implemented;



o Impact or outcome to assess how well a project has met its goals;
o Summative to judge the performance of the project;
o Ex-post to assess the long-term sustainability of the project;
o Developmental evaluation to design a response to a known need;
o Empowerment evaluation seeks to improve project implementation;
o Meta evaluation is a systematic and formal evaluation of evaluations.
Once you are clear about the type(s) of evaluation you will conduct, you can begin filling out the
columns of the summary evaluation table: evaluation purpose, priority evaluation questions,
timing, anticipated start and end, and evaluation budget. To identify the evaluation questions,
which are clear statements of what you need to know from the evaluation, you need to establish
your evaluation criteria. These are a set of principles that guide the development of evaluation
questions and the overall evaluation planning process. They include relevance, efficiency,
effectiveness, impact and sustainability.
 Evaluation terms of reference (ToR) plans the specifics of each evaluation, including concise
evaluation questions, proposed methods, and roles and responsibilities. In writing the ToR it’s
essential to collaborate with stakeholders to manage expectations, and with the wider project
team to ensure that the project budget and calendar include the time and resources you will
need to conduct your evaluation. An evaluation ToR should include the following information:
o Project introduction and background;
o Evaluation purpose, audience and use explains why you are conducting the evaluation;
o Evaluation criteria and questions;
o Methodological approach (your donor may have suggested the type of evaluation);
o Evaluation roles and responsibilities of the team and how they will communicate;
o Evaluation deliverables and timeline;
o Evaluation logistics and other support.
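The baseline/target logic behind the Indicator Performance Tracking Table described above can be expressed as simple arithmetic. The sketch below is illustrative only (the function name and the example numbers are hypothetical, not from the Guide): given an indicator's baseline, its target, and the latest measured value, it reports what share of the planned change has been achieved.

```python
# Hypothetical sketch of the IPTT baseline/target calculation.
# Baseline: the indicator value before the activity began.
# Target: the planned level of change to be achieved during the project.
def percent_of_target_achieved(baseline: float, target: float, actual: float) -> float:
    """Return the percentage of the planned change (target - baseline) achieved so far."""
    planned_change = target - baseline
    if planned_change == 0:
        raise ValueError("target must differ from baseline")
    return round(100 * (actual - baseline) / planned_change, 1)


# e.g. baseline of 40% coverage, target of 80%, latest survey shows 60%:
# half of the planned change has been achieved.
print(percent_of_target_achieved(baseline=40, target=80, actual=60))  # -> 50.0
```

Reporting progress against the planned change, rather than against the raw target, avoids overstating progress when the baseline is already high.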

MEAL in project management


As your team plans for MEAL, it is critical that these plans are aligned with, and embedded into, the
larger project’s budget, timeline and staffing requirements in two ways:
 MEAL in the project calendar. It is good practice to build a Gantt chart specifically for all MEAL
activities identifying the start & end dates and expected duration of each. These include all
monitoring visits, evaluation activities, learning initiatives, feedback-and-response mechanisms,
communications efforts, and any reports that need to be created. You should use a
participatory approach, working with the project team and stakeholders to help identify
opportunities for scheduling efficiencies and reduce the risk of scheduling conflicts with project
implementation.
 MEAL in the project budget. Budgeting for MEAL is usually an iterative process. The initial step
toward establishing the MEAL budget takes place when the original project proposal is
developed. This original budget is a high-level estimate of costs based on initial estimates of the
MEAL activities that will be conducted.
After the proposal is approved, a more detailed budget needs to be created. Detailed budgets
are often activity-based. This means that the project creates accurate and complete budget
estimates by systematically listing, quantifying and costing out all the resources (e.g., staffing,
materials, equipment and travel) that are needed to run the MEAL activities for the project.
These MEAL activities are found in the MEAL planning documents and the MEAL Gantt chart.
It is important to consult with the budget and human resources offices within your organization
to verify and understand your particular budget process, rules and policies. Similarly, review any
donor requirements and regulations related to project MEAL.



4. COLLECTING MEAL DATA (Phase 3)
Now that your MEAL planning process is complete, the next step is to get started with collecting
data. Timely, high-quality data are the foundation upon which project teams can measure progress,
make decisions and learn.
By the end of this chapter, you will be able to:
✓ Explain the five elements of data quality;
✓ Describe the components of a basic data collection tool outline;
✓ Identify three primary methods of data collection and key characteristics of each
(questionnaires, interviews and focus group discussions);
✓ Explain the basic principles of sampling;
✓ Describe key steps in preparing to implement data collection tools;
✓ Identify generally accepted protocols and standards for responsible data management;
✓ Understand the basics of selecting databases and associated data entry and cleaning
practices.

Data quality
The data you collect will never be free of bias. Thus, you need to determine, with the help of your
stakeholders, what quality and quantity of data is “good enough” for your decision-making, learning
and accountability needs. It is useful to consider the following five data quality standards.
• Validity. Data are valid when they accurately represent what you intend to measure, i.e., the
data you collect help you measure the indicators outlined in your Performance Management
Plan (PMP).
• Reliability. Data are reliable when the collection methods used are stable and consistent.
Reliable data are collected by using tools such as questionnaires that can be implemented in the
same way multiple times.
• Precision. Data are precise when they have a level of detail that gives you an accurate picture of
what is happening and enables you to make good decisions. For example, precise data allow
you to compare results between men and women (if this is important for your project).
• Integrity. Data have integrity when they are accurate. Data should be free of the kinds of errors
that occur, consciously or unconsciously, when people collect and manage data.
• Timeliness. Timely data should be available when you need it for learning that informs decisions
and for communication purposes. Data are not useful to you when they arrive too late to inform
these processes.

Developing data collection tools


As you begin developing your data collection tools, it is a good idea to revisit the question, “What do
I need to know?” This should be answered by mapping out the indicators found in your
Performance Management Plan (PMP) and, if you are conducting an evaluation, the evaluation
questions in your Summary Evaluation Table (SET) and the Evaluation Terms of Reference (ToR). This
section explores three of the tools most frequently used to collect quantitative and qualitative data:
questionnaires, semi-structured interviews and focus group discussions.
Each tool is designed with a similar three-section outline:
1. Introduction to explain the project and the data collection process, why the information is
being collected, how the data will be used and protected, and who will have access to it.
2. The Questions to be asked to gather the data required. These should be well-organized and
well-structured, using jargon-free language. Include fields to record date, time, location etc.
3. Conclusion. All tools should close by offering the respondent a chance to ask questions and
provide feedback on the experience.
A Quick Guide to MEAL DPro 12
• Questionnaires are a quantitative data collection tool comprising a structured set of questions
designed to elicit specific information from respondents. It is good practice to design your
questionnaire to collect data on multiple indicators to save everyone's time. Closed-ended
questions provide a predefined list of answer options, which makes it easier to code responses
numerically for statistical analysis. Also, decide what media type is best to present the
questions and record responses, considering your target population and local logistics.
• Semi-structured interviews and focus group discussions are qualitative data collection tools
designed to explore and understand the rich depth and context of the respondent's
perspectives, opinions and ideas:
o A semi-structured interview is a guided discussion between an interviewer and a single
respondent.
o A focus group discussion is a guided discussion among respondents in a group. It is
crucial to recruit the right participants, typically 8 to 12.
Whether you are designing a semi-structured interview or a focus group discussion, the
key to strong qualitative data collection is to carefully plan the questions that will frame the
conversation. These planned questions are prepared ahead of time, carefully scripted, and
documented in an interview or discussion guide. Unlike the closed-ended questions used in
questionnaires, most questions in these guides are open-ended, allowing respondents
to give a free-form response in their own words. There are two types:
o Content-mapping questions are also known as opening questions. These are intended to
initiate the exploration of a topic by raising and broadly exploring an issue.
o Content-mining questions. In order to encourage the rich discussion or responses
desired by qualitative data collection, facilitators often follow content-mapping
questions with content-mining questions or probing questions. These are follow-up
questions that elicit more detail or explanation about a response to a content-mapping
question. They are unscripted and free-form so as to explore a topic more deeply.
Creating samples
Gathering data is expensive and time-consuming, making it difficult to speak to everyone. This is
why you need to identify a sample group (or subset) of respondents who will give you valid, reliable
and generalizable information to meet your needs. Sampling can be divided into two basic types:
• Random sampling is used when you plan to use quantitative methods and analysis. This
approach is appropriate when you need confidence that what is true for your sample is likely
true for the entire population (or a subgroup of the larger population).
Take steps to avoid sampling bias. This occurs when you do not take into consideration all
the available perspectives, ideas and opinions; as a result, your data will be less valid
(accurate) and cannot easily be generalized to the population you want to address. Two specific
types of bias can be especially problematic:
• Convenience sampling bias occurs when data are collected from respondents who are
easy to reach, or who are easy to work with. This runs the risk of over-representing
people located closer to main roads, or groups that are fluent in the predominant
language.
• Voluntary response bias occurs when data are collected disproportionately from self-
selected volunteers. This runs the risk of under-representing people with busy
schedules or people who travel frequently, and over-representing people with strong
opinions or specific agendas related to the project.
After you have considered all these factors, follow these four steps to identify a random sample:
1. Defining your population and the sampling unit. A population is a set of similar people,
items or events that is of interest for some question or experiment. When defining this,
clearly articulate your inclusion and exclusion criteria, such as geographic boundaries or
demographic characteristics. Next you should clearly identify your sampling unit, the
individual person, category of people, or object from whom/which the measurement
(observation) is taken.
2. Choosing a method to select your random sample. The main methods are:
o Simple random sample. Every unit of your population has an equal chance of
being selected.
o Systematic sample. All potential subjects are listed and numbered, and then
every nth (e.g., every 10th) person is selected until you reach your sample size.
o Cluster sampling. The population is divided into naturally occurring clusters
such as geographical areas, schools or places of employment. All the
clusters are listed, and a sample of clusters is randomly selected.
o Stratified sampling. This strategy allows you to analyze subgroups (or strata)
within the larger population by sampling from each stratum separately.
3. Determining your sample size. How well a sample represents the population is gauged
by two important statistics:
o Margin of error, the maximum expected difference between the true
population parameter and the sample estimate.
o Confidence level, the percentage of all possible samples that can be
expected to include the true population parameter.
4. Selecting your sample units. It is especially helpful if you can start by accessing a sample
frame: a specific list of units (men, women, households, individuals, children,
adolescents, etc.) that you will use to generate your sample. Other methods are
available, such as the random route method, where you sketch a map of the community
and generate a random route through it, interviewing at every nth house, for example.
• Purposive (selective) sampling is used to understand the experience or perspective of a
particular group by gaining a "deep" understanding at the level of the individual participant. It
helps you understand the change you see, unpack the meaning of that change and
develop explanations for it. However, because purposive sampling is non-random,
the data collected from the sample cannot be generalized to the general population. There are
two steps to identifying a purposive sample:
1. Identify the type of purposive sampling you desire. Start by clearly defining your
population and sample frame. Establish sampling criteria that are very clear about the
sampling units you intend to use.
2. Determine your sample size. This is calculated very differently in purposive sampling
than in random sampling. Often, qualitative data are used to triangulate, or cross-check,
quantitative or other qualitative data. Thus, purposive sample sizes must be considered
with triangulation needs in mind. You need to conduct enough focus group discussions
or interviews to test, reinforce and confirm the patterns that are emerging.
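The sample-size statistics described in step 3 of random sampling (margin of error and confidence level) can be turned into a rough calculation. The sketch below uses Cochran's formula for a simple random sample, assuming a large population and maximum variability (p = 0.5), the most conservative choice:

```python
import math

def sample_size(margin_of_error, z_score=1.96, p=0.5):
    """Cochran's formula for a simple random sample from a large
    population. z_score=1.96 corresponds to a 95% confidence level;
    p=0.5 assumes maximum variability in the population."""
    n = (z_score ** 2) * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n)

print(sample_size(0.05))  # 385 respondents for a ±5% margin of error
print(sample_size(0.03))  # a smaller margin of error requires a larger sample
```

Note how tightening the margin of error sharply increases the required sample size, which is one reason sampling decisions have budget implications.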
Using data collection tools
Once you have designed your tool and your sampling strategy, it is time to implement your data
collection effort. There are four steps that should be followed:
1. Translate your data collection tools. Is your project working in a region that uses multiple
languages? If so, then your tool will need to be translated so that it is not biased toward
those who speak the initial language of your tool.
2. Train data collectors and test your tools. Written instructions accompanying your collection
tool are essential. Often, additional training is also needed. Training data collectors serves
two purposes: building the skills of your data collectors and ensuring that your tool works as
it should.
3. Revise and finalize your tools. After you have tested your tool, any revisions can be
incorporated into your final document.
4. Plan for implementation and data management.
o Allow enough time for each data collection event.
o Choose a venue for interviews and focus group discussions that provides privacy and
an appropriate level of comfort.
o Identify how you intend to manage the data you collect. Who will be responsible for
entering the data and conducting data quality checks? How will you protect and
store completed questionnaires, and how will you protect privacy?
Managing data
Creating an effective data management system enables you to effectively analyze, interpret and use
the data you collect. There are four primary components:
• Data entry. Use digital devices to collect data if possible. If manual entry is used, ensure your
data entry staff are trained.
• Data cleaning is about detecting and removing errors and inconsistencies from data to improve
their quality. Conduct random quality checks, look for unexpected entries in the data, and remove
duplicate entries.
• Data storage and security. It is important to ensure that data are secure and protected against
unauthorized changes, copying, tampering, unlawful destruction, accidental loss (have a backup
policy), improper disclosure or unauthorized transfer.
• Data retention, disposal and de-identification. When it has been decided that data are no longer
needed - either following the end of the project or during its implementation -
all records and backups should be disposed of or adjusted so that it is impossible to identify the
data respondents. If you choose to retain data, de-identification can be done by anonymization
or pseudonymization.
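The cleaning checks described above (removing duplicates, looking for unexpected entries) can be sketched in a few lines; the record fields and the valid age range below are hypothetical examples:

```python
def clean_records(records, valid_ages=range(0, 121)):
    """Remove exact duplicates and flag out-of-range values for review.
    The 'id' and 'age' field names are hypothetical examples."""
    seen, cleaned, flagged = set(), [], []
    for rec in records:
        key = (rec["id"], rec["age"])
        if key in seen:
            continue  # drop duplicate entry
        seen.add(key)
        if rec["age"] not in valid_ages:
            flagged.append(rec)  # unexpected entry: review it, don't guess
        else:
            cleaned.append(rec)
    return cleaned, flagged

records = [{"id": 1, "age": 34}, {"id": 1, "age": 34}, {"id": 2, "age": 240}]
cleaned, flagged = clean_records(records)
print(len(cleaned), len(flagged))  # 1 1
```

Flagging rather than silently deleting suspect values keeps a record of quality issues, which matters later when you interpret the data.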
5. ANALYZING MEAL DATA (Phase 4)
Data become useful when you give them meaning, and this is done through analysis, visualization
and interpretation.
By the end of this chapter, you will be able to:
✓ Explain how your MEAL planning documents guide data analysis, visualization and
interpretation;
✓ Describe the purpose and processes of quantitative data analysis;
✓ Describe the purpose and processes of qualitative data analysis;
✓ Describe the purpose and process of data visualization;
✓ Explain how analysis leads to appropriate interpretation and the development of
conclusions and recommendations.
Introduction to data analysis
A careful review of your Performance Management Plan (PMP) or Monitoring and Evaluation plan
will tell you what data you will analyze, when and how you analyze them, and how you will use your
results.
Quantitative data are analyzed using statistical methods and software packages such
as Microsoft Excel or SPSS. The results are numerical and easily visualized using a graph, chart or
map.
Qualitative analysis is most often done by reading through qualitative data in the form of data
transcripts, such as notes from focus group discussions or interviews, to identify themes that emerge
from the data. This process is called content analysis or thematic analysis. It can be aided by
software, but is most often done using paper, pens and sticky notes.
The timing of your data analysis depends on when it is collected and the timing of your stakeholder
information needs. It is particularly important to coordinate your data analysis with the overall
project implementation calendar as you may need time and input from the wider non-MEAL project
team.
Quantitative data analysis basics
At a basic level, there are two kinds of quantitative analysis:
• Descriptive data analysis is the analysis of a data set that helps you describe, show or summarize
data in a meaningful way so that patterns might emerge.
• Inferential data analysis (also known as interpretive) enables you to use data from samples to
make statistical generalizations about the populations from which the data were drawn.
To understand your quantitative data, you need to understand variables, defined as any
characteristic, number or quantity that can be measured or counted. There are two categories:
independent and dependent; a dependent variable changes in response to other (independent) variables.
Quantitative data are classified into four fundamental levels of measurement:
• Nominal data are collected in the form of names (not numbers) and are organized by
category, for example gender, ethnicity, religion or place of birth. Information collected from
nominal data is very useful, even essential, as it enables basic descriptions of your project.
• Ordinal data have an order to them. They can be ranked from lesser to greater, for
example by scales measuring levels of satisfaction or levels of agreement. Strictly speaking,
ordinal data can only be counted; a consensus has not been reached among statisticians
about whether you can calculate an average for data collected using an ordinal scale.
• Interval data are expressed in numbers which can be analyzed statistically, for example
temperature or time. Distances between data points on an interval scale are always the same,
meaning that interval data can be counted, enabling more advanced statistical calculations.
• Ratio data are expressed in numbers, with the added element of an "absolute zero" value,
for example height and weight. Ratio data cannot be negative, and because ratio data have an
absolute zero, you can make statements such as "one object is twice as long as another."
No matter what their type, data are not particularly useful to you in their raw form. You need to
analyze the raw data before you can determine whether your program is meeting its targets, use it
to make decisions or start to communicate with your stakeholders.
For analyzing quantitative data using descriptive statistics, there are three categories of
calculations:
✓ Measures of frequency indicate how many times something occurred or how many responses
fit into a particular category. You can analyze frequencies by using two tools: frequency tables
for a single group and cross-tabulation tables for multiple groups.
✓ Measures of central tendency help identify a single value around which a group of data is
arranged, using three tools: the Mean is the average of a data set, identified by adding up all the
values and dividing by the number of values; the Median is the middle point of a data set, where
half the values fall below it and half above; the Mode is the most commonly occurring answer or
value. To determine which of these to use, there are three factors to consider:
o What type of data do you have (nominal, ordinal, interval or ratio)?
o Does your data set have outliers and/or is it skewed?
o What are you trying to show from your data?
✓ Measures of variability are the third set of calculations used to analyze data using descriptive
statistics. They tell you the spread or the variation of the values in a data set. Are the responses
very different from each other over the scale of possible responses or are they clustered in one
area? There are two tools to calculate the variability of the data set: the Range is the difference
between the lowest and highest values of a data set, and the Standard Deviation which
calculates how far responses differ (deviate) from the mean (average).
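These descriptive measures can be computed with any statistical package. As a small illustration using Python's standard statistics module on a hypothetical set of 1-5 satisfaction ratings:

```python
import statistics

# Hypothetical monitoring data: ten satisfaction ratings on a 1-5 scale
scores = [2, 3, 3, 4, 5, 3, 4, 2, 5, 3]

mean = statistics.mean(scores)           # central tendency: the average
median = statistics.median(scores)       # the middle value
mode = statistics.mode(scores)           # the most common value
value_range = max(scores) - min(scores)  # variability: spread of values
std_dev = statistics.stdev(scores)       # how far values deviate from the mean

print(mean, median, mode, value_range)  # 3.4 3.0 3 3
```

Here the mean, median and mode are close together and the range is small, suggesting the ratings cluster around 3 rather than being widely spread.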
Once you’ve calculated your descriptive statistics it’s useful to stop at this point and ask yourself
three questions:
• What are the maximum and minimum values for frequencies … what is the range? What do we
need to do next with our analysis if the range is very large?
• What is the spread of these values? Are they clustered in any way? Is the mean very different
from the mode? If so, what is our next step for analysis?
• What do our cross-tabulation (contingency) tables show us? Are there any interesting
differences or similarities between the subgroups identified in your Performance Management
Plan (PMP) or Monitoring and Evaluation plan?
Analyzing quantitative data using inferential analysis helps you to:
✓ Compare the significance of differences between groups. Are the differences that exist between
subgroups large enough to matter? Three tests are available to help you determine
whether the differences between the descriptive statistics for subgroups are significant. These
are t-tests, analysis of variance (ANOVA), and chi-square tests.
✓ Examine the significance of differences between variables to determine correlation and,
potentially, causation. Have your activities contributed to the changes you are seeing?
Regression analysis gives you an understanding of how changes to variable(s) affect other
variable(s). It gives you an understanding of correlation, a statistical measure (usually expressed
as a number) that describes the size and direction of the relationship between two or more
variables. It is important to note that correlation does not necessarily imply causation, which is
when changes to one or more variables are the result of changes in other variables. Proving
causation can be difficult, but two strategies can be used to increase your confidence that
causation exists between variables:
o The use of counterfactuals and control groups is a strategy usually used in impact
evaluations. These evaluations are designed to understand cause and effect between
your project and the outcomes you see. The “counterfactual” measures what happens
to the “control group,” a group of people who are not involved or impacted by your
project. During analysis and interpretation, you compare the results of your project
sample with the control group in an effort to demonstrate causation. This kind of study
requires a great deal of planning, structure and resources.
o Mixed-method approaches: Many experts believe that a higher level of certainty about
causation is possible using a mix of evidence to triangulate your results.
The full Guide contains worked examples of these forms of analysis. You will need to consult the
statistical experts on your team.
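As one small illustration of correlation (not a substitute for the worked examples in the full Guide), the Pearson correlation coefficient can be computed directly. The monitoring data below are hypothetical, and a high correlation alone does not demonstrate causation:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient: the size and direction (-1 to +1)
    of the linear relationship between two variables."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / math.sqrt(var_x * var_y)

# Hypothetical data: training sessions attended vs. test score
sessions = [1, 2, 3, 4, 5]
scores = [55, 61, 68, 70, 79]
r = pearson_r(sessions, scores)
print(round(r, 2))  # a value near +1 suggests a strong positive relationship
```

In practice you would pair a result like this with qualitative evidence or a control group before claiming your activities caused the change.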
Contribution Analysis: an alternative to causation. It is sometimes difficult to be confident about
causation in development settings. As a result, an alternative approach has been developed, called
contribution analysis. It is used in situations where rigorous sampling and data collection processes
are not possible and it would be unrealistic to attempt to establish statistical causation. Instead of
asking, "Did our project cause the changes we are seeing?" evaluators ask, "Did our project
contribute to the changes we are seeing?” Contribution Analysis is a process of clearly outlining a
contribution “story” by transparently following these six steps:
• Clearly define the questions that need to be answered;
• Clearly define the project’s theory of change and associated risks to it;
• Collect existing evidence supporting the theory of change (your conceptual frameworks);
• Assemble and assess your own project’s contribution story;
• Seek out additional evidence where necessary;
• Revise and conclude the contribution story.
By following and documenting these steps, contribution analysis can demonstrate that a project
contributed to change.
Quantitative analysis errors. As you consider quantitative analysis and the sampling decisions that
go along with it, there are two general types of quantitative analysis errors you need to be aware of:
• Type I error, wrongly concluding that your project has had an effect on the target population
when it has not. This is also called a false positive.
• Type II error, the opposite of the Type I error. This occurs when you wrongly conclude that
your project has not had an effect on the target population when it actually has. This is also
called an error of exclusion or a false negative.
To avoid Type I errors, you will want to plan for a smaller margin of error and a higher confidence
level when you select your sample from which to collect data. However, be careful not to set your
requirements too high. This can lead to Type II errors, where you fail to recognize important factors
that make a difference to your population or project implementation. One way to reduce the risk of
making a Type II error is by increasing your sample size, but this may have implications on your
budget.
Qualitative data analysis basics
Qualitative analysis works with words that combine to become ideas, opinions and impressions.
There are fewer rules than in quantitative analysis, and approaches vary. In general, the objective
of qualitative analysis is to identify key themes and findings, including among subgroups if you
have them, from all the notes you have collected from your interviews and focus group discussions.
Qualitative analysis is often called “content analysis” and requires multiple reviews of data (your
content) so that the data becomes more manageable. The process of becoming familiar with the
data will generate themes, which you will use in your analysis.
Qualitative analysis begins with the raw data, which can take many forms, such as recordings of
interviews or notes from focus group discussions. The raw data need to be organized so that they
are easy to review. You then need to complete the following three steps:
✓ Step 1: Code data: Begin to identify themes. Coding is a process that helps reduce the large
quantity of qualitative data you have into manageable units. Read through your data and apply
a category label that identifies a particular event, opinion, idea, etc. These codes will be mapped
in a matrix that will help you visualize the data and begin to interpret its meaning (next section).
There are two approaches to coding:
o Deductive coding is an approach in which codes are developed before the
data are reviewed, relating back to indicators in your PMP. During the review, the
codes are applied to the data.
o Inductive coding is an approach in which codes are developed as the data are
reviewed, using the specific words used by participants themselves. Themes emerge
naturally, and codes are built and modified during the coding process itself.
It is recommended to use a mix of both deductive and inductive coding to arrive at the most
comprehensive results.
✓ Step 2: Index the data. As you begin reading your transcripts, you may need to match concepts
and relevant quotations to the codes you have identified. This is called indexing: you essentially
tag the content from your transcripts using the codes from the previous step. Then, you create
a list of those tags and where they are in the data in the form of an index. Once you have
indexed your content, you will be able to review your codes and more easily find the different
concepts and relevant quotes related to the codes within your transcripts. You will also be able
to identify how dense a code is: how often the code appears and where, relative to the other
codes you created. Indexing is particularly important if you need to go back to find a
noteworthy idea or quotation when you are communicating your results.
✓ Step 3: Frame the data. At this point, you begin to put the qualitative data you are working with
into a form that can be understood. The most frequently used method of describing qualitative
data is a matrix - sometimes called the framework approach - which organizes your data
according to categories that are useful to you. The structure of a matrix will differ depending on
the type of data collection you are doing. A matrix helps you visualize and begin to interpret
your qualitative data, allowing you to arrive at meaningful conclusions. It’s also a good tool to
support your conclusions, which you can show to stakeholders if necessary.
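The coding and indexing steps above can be sketched in a few lines. The codebook, keywords and transcript below are hypothetical; in practice, codes would come from your PMP indicators (deductive) or emerge from the transcripts themselves (inductive):

```python
# A minimal sketch of deductive coding and indexing. The codes and
# keyword lists are hypothetical examples only.
CODEBOOK = {
    "WATER_ACCESS": ["well", "water", "pump"],
    "TRAINING": ["training", "workshop", "session"],
}

def index_transcript(lines):
    """Tag each transcript line with matching codes and build an
    index of code -> line numbers (the input to a framing matrix)."""
    index = {code: [] for code in CODEBOOK}
    for number, line in enumerate(lines, start=1):
        text = line.lower()
        for code, keywords in CODEBOOK.items():
            if any(word in text for word in keywords):
                index[code].append(number)
    return index

transcript = [
    "The new well saved us an hour each day.",
    "The training sessions were too short.",
    "We still walk far for water in the dry season.",
]
print(index_transcript(transcript))
# {'WATER_ACCESS': [1, 3], 'TRAINING': [2]}
```

The resulting index shows where each code appears and how dense it is, and makes it easy to retrieve quotations when framing the data in a matrix.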
Data visualization
Data visualization is the process of showing your data in a graph, picture or chart. This is easier than
poring over spreadsheets or reports, and helps to share detailed insights into data in the quickest
and most efficient way. It assists with:
• Analysis: Discovering relationships between, and patterns in, the data.
• Interpretation: Understanding and reflecting on patterns in the data set and then inferring new
information based on that interpretation.
• Communication: Making technical, statistical analysis understandable to people with limited
technical knowledge, and sharing your information in ways appropriate to your stakeholders.
Consider following these steps to ensure that your products are effective, especially if you intend to
use data visualization to aid communication to stakeholders:
✓ Step 1: Define the stakeholder(s) and your audience before designing a visualization. Keep in
mind that different people have different learning styles.
✓ Step 2: Define the data visualization content. Check your communications plan to determine the
“need-to-know” content for each of the stakeholders identified. Then, determine where a visual
will be most useful based on your findings, your information needs and the data available.
✓ Step 3: Design and test your visualization. Remember to keep it simple. Less is more with data
visualization. Do not crowd your visuals with too much data. Get started on paper, with the
audience-specific content that was identified. For each key audience identified, different visuals
or dashboards may need to be designed. The most common data visualization tools are:
o Bar chart: Shows multiple responses across different subgroups or points in time;
o Stacked column chart: Shows the variation in multiple variables or options across different
subgroups on different questions or different points in time;
o Pie chart: Shows composition of data set when component parts add up to 100 percent;
o Line chart: Shows the trends across different points in time;
o Scatter chart: Shows the relationship between two continuous variables or distribution
within a data set;
o Heat map: Shows the distribution of results across a geographic area, with greater
distributions represented by greater (“hotter”) color intensity;
o Line histogram: Shows the distribution within a range of numeric data;
o Data dashboards: Visually display a collection of key data points to monitor the status of a
project. A dashboard can include multiple visualization tools as its subcomponents.
✓ Step 4: Build your visualizations. Team members with skills and experience in digital software
can build data visualizations, using Microsoft Excel for example. To create more complex
visualizations, digital experts and MEAL staff may need to be consulted.
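As a minimal, tool-agnostic sketch of a bar chart (in practice you would build this in Microsoft Excel or a dashboard tool; the answer options and counts below are hypothetical):

```python
def text_bar_chart(frequencies, width=20):
    """Render a frequency table as a simple text bar chart - a stand-in
    for the bar charts you would build in Excel or a dashboard tool.
    Bars are scaled relative to the largest value."""
    largest = max(frequencies.values())
    lines = []
    for label, count in frequencies.items():
        bar = "#" * round(width * count / largest)
        lines.append(f"{label:<12} {bar} {count}")
    return "\n".join(lines)

# Hypothetical monitoring result: respondents per answer option
chart = text_bar_chart({"Satisfied": 40, "Neutral": 25, "Dissatisfied": 10})
print(chart)
```

Even this crude rendering makes the pattern in the data visible at a glance, which is the point of visualization: less is more.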
Interpreting quantitative and qualitative data
Quantitative analysis generates frequencies, averages and levels of difference that exist in your data.
Qualitative analysis identifies themes and patterns. Both types of analysis need to be interpreted to
make sense of the information they offer you. Together with your team and other important
stakeholders, you interpret your data set by giving meaning to it. The meaning you give it is the
story of your project, the story you will use to make project decisions and share your results with
others.
There is no prescribed process for interpreting data, but there are several recommended practices
for improving your data interpretation through enhanced participation and critical thinking:
✓ Creating visualizations of your results to help people better understand and interpret your data,
making sure your visualizations are used to give the full picture of the data and are not
misleading.
✓ Triangulating your data by presenting the results of both quantitative and qualitative analysis
together so that you can compare the results. Triangulation is the validation of data through
cross-verification of more than two sources.
✓ Convening a stakeholder meeting to interpret the data. This meeting should involve
stakeholders with different perspectives on the project.
✓ Planning an adequate amount of time in the project implementation plan to analyze and
interpret data.
✓ Making sure roles and responsibilities around interpretation are clear. Usually, the MEAL team
does the initial analysis while project staff organize and facilitate interpretation events.
As your team and stakeholders undertake data interpretation, you need to consider your
interpretation (and subsequent results and recommendations) for the following:
✓ Validity. Your interpretation is considered more valid if you can clearly demonstrate that it is
based on data that directly supports it.
✓ Reliability. Your interpretation will be considered more reliable if you can demonstrate the
consistency of your data analysis methods and their use across multiple data sets.
✓ Integrity. Your interpretation will be considered to have more integrity if you can demonstrate
that it is based on data collection and analysis processes that are relatively free of error and bias.
Data limitations to consider during interpretation:
Limitations related to data type. With qualitative data, you must be very clear about the fact that
your data represent only the perspectives of the people participating in the focus group discussions
or set of interviews. They should not be used to make broad generalizations about the population.
However, this information can be used to support other findings, such as those generated using
quantitative data. Quantitative data can tell you whether something happened, but possibly not
why. Whenever possible, combine quantitative data interpretations with supporting interpretations
from qualitative data.
Limitations related to sampling. You now know that there are different sampling methodologies.
Your sampling method and size have an impact on the kind of analysis and interpretation you can
conduct. For example, random sampling allows you to generalize to the larger population from
which the sample was selected. If your results fall within your desired margin of error, you can then
make more confident statements about how your project can benefit others.
Purposive sampling, on the other hand, is used to better understand a specific context or situation,
usually one in which you are hoping to triangulate data. Sometimes your best efforts to collect data
according to your purposive sampling plan are unsuccessful.
Limitations related to data quality. With any data, you must be explicit about any existing quality
issues and how they might influence your interpretation. The information you collect will never be
perfect. Questionnaires will have missing responses, focus group leaders may unintentionally
influence respondents, and self-reported responses may be improperly understood. Your
interpretation of both quantitative and qualitative data must incorporate your understanding of any
data quality issues.
Limitations related to bias. Bias has already been mentioned in different settings. Remember that
bias can be defined as any trend or deviation from the truth in data collection, analysis,
interpretation, and even publication and communication. There are various types of bias:
• Sampling bias is when certain types of respondents are more likely than others to be included in
your sample, as in convenience sampling and voluntary response bias. This bias compromises
the validity of your random sample.
• Data analysis bias occurs when your analysis includes, intentionally or unintentionally, practices
such as eliminating data that do not support your conclusion or using statistical tests that are
inappropriate for the data set.
• Data interpretation bias occurs when your interpretation does not reflect the reality of the data.
For example, the analysis team may:
o Generalize results to the wider population when they only apply to the group you
have studied.
o Make conclusions about causation when the sampling and collection designs do not
make this possible.
o Ignore Type I and Type II errors.
• Data publication and communication bias occurs when those publishing or reporting on project
results neglect, for example, to consider all results equally, whether positive or negative. For
example, there are many published and communicated success stories, but not as many
“failure” or “lessons learned” stories.
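To make the sampling bias point concrete, the short simulation below compares a random sample with a convenience sample of the same size. It is a hypothetical sketch, not drawn from the MEAL DPro guide: the village, income figures and sample sizes are invented purely for illustration.

```python
import random
from statistics import mean

# Hypothetical village of 1,000 households where the 200 households near
# the market road earn more than the 800 remote households.
random.seed(42)
incomes = ([random.gauss(300, 30) for _ in range(200)]   # near the road
           + [random.gauss(150, 30) for _ in range(800)])  # remote

true_mean = mean(incomes)

# Random sample: every household has an equal chance of selection.
random_sample = random.sample(incomes, 100)

# Convenience sample: only the 100 easiest-to-reach households,
# all of which happen to live near the road.
convenience_sample = incomes[:100]

print(f"True average income:    {true_mean:.0f}")
print(f"Random sample estimate: {mean(random_sample):.0f}")
print(f"Convenience estimate:   {mean(convenience_sample):.0f}")
```

The convenience estimate lands well above the true average because the easy-to-reach households are not representative, while the random sample, by giving every household an equal chance of selection, stays close to it.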
Validating, or testing, the themes and conclusions you generate from data analysis is always an
important part of the process. There are clear benefits to including multiple stakeholders when
validating themes and conclusions.

6. USING MEAL DATA (Phase 5)


In this chapter we explore the purpose and practice of using MEAL data to inform project
management and direction through a discussion of adaptive management. This chapter also
includes guidance for meeting key stakeholder information needs, particularly in the areas of
progress and evaluation reporting.
By the end of the chapter, you will be able to:
✓ Identify the key principles of adaptive management, including how they are incorporated
into the MEAL cycle;
✓ Describe how data are used in reporting and communication with stakeholders.

Adaptive management
In order to contribute to project improvements, MEAL information should be used as part of
ongoing project decision-making. Adaptive management encourages and supports this process.
Effective adaptive management collects and analyzes project monitoring and feedback data to help
project staff make collaborative, timely and informed decisions to ensure that project activities
deliver intended impact to participants within the approved time, scope and budget. Adaptive
management also contributes to internal and external learning.
A culture of adaptive management results from a series of intentional investments related to project
design, staffing, budgeting, decision-making and more. A project that embraces adaptive
management will respond affirmatively to the questions: Does your project have resources to
support learning? Are project decisions informed by evidence? Does your project accept
and encourage change?

Progress Reporting
Reporting and communication can be seen as the culmination of your data analysis process,
recognizing that the way you choose to present information in your reports is the final stage of
interpretation. High-quality, transparent reports in line with your donor or other internal and
external stakeholder requirements are vital. Good reporting captures and explains both the
successes and the challenges facing the project, and offers evidence of robust evaluative thinking in
the search for solutions. The guidance below is critical to creating reports that resonate with your
stakeholders and are useful.
Consult your project communications plan and data flow map to remind you of your
communications audience, purpose and timing.
Identify or develop report templates. Avoid duplicating effort when it comes to reporting.
Check whether your donor, organization or project already has a report template that you can (or
must) use. If you need to create a new one, ask your colleagues and stakeholders for examples they
find useful that you can adapt for your purposes.
Identify donor reporting requirements. Donors frequently specify their required reporting template
and schedule. Ensure that any templates you adapt or create also comply with these requirements.
Given the importance of reports, many donors and organizations have created detailed guidance on
how to create them. Check with your donor for their guidance on evaluation reporting. For example,
USAID gives extensive guidance on how to prepare an evaluation report.

FINALLY….
The Guide to MEAL DPro provides a certification-based and sector-wide standard which helps teams
to design, plan and implement MEAL in their projects by providing clear, practical guidance and tools
that can immediately be applied. We hope you now have a good understanding of the basics, but
remember this quick guide is only a brief overview of the MEAL DPro guide. If you want to learn
more and gain certification, you should study the full Guide to the MEAL DPro. It provides
important detail with lots of examples and practical advice for MEAL in projects in the real world.
It’s downloadable for free from https://mealdpro.org. Also, take a look at MEAL DPro Starter at
https://mealdprostarter.org/ which gives on-demand access to MEAL DPro tools for use in projects,
which are also downloadable for free.
