
Training as a Development Tool

Cecilia Otero

September 1997

Document No. PN-ACA-630

Research and Reference Services


United States Agency for International Development
Center for Development Information and Evaluation
Washington, DC 20523

Research and Reference Services Project is operated by the Academy for Educational Development
TRAINING AS A DEVELOPMENT TOOL

TABLE OF CONTENTS

Introduction
Highlights
Methodology

THEORY

Training in a Reengineered Context 8
- Definition of training impact
- Application of results-oriented framework to training systems
- Reengineering core values: Their relationship to training

Evaluation Models 15
- Donald Kirkpatrick
- Robert Brinkerhoff
- Human and Educational Resources Network Support Project

Training Indicators 32
- Criteria for developing indicators

Additional Monitoring and Measurement Tools 37
- Ways to isolate the effect of training on performance
- Assigning a monetary value to the improvement

Comments 42

PRACTICE

Training Activities in Support of Strategic Objectives 43

Sector Case Studies
- Economic Growth 44
  Privatization of Industries and Agriculture in Tajikistan, USAID/Central Asia
- Democracy and Governance 49
  Strengthening Municipal Governments, USAID/Bolivia
- Health 53
  Improved Quality with Equity in Health, USAID/El Salvador
- Education 60
  Increased Girls’ Basic Education, USAID/Morocco
- Training 65
  Enhancing the Human Resource Base of Nongovernmental and Governmental
  Organizations, USAID/Namibia

Documenting Training Activities - TraiNet 73

Resources 74
Training-Related Internet Sites 75

Acknowledgements

I am indebted to many individuals for their assistance and substantive input in this
study. I would like to express my deep appreciation to John Jessup,
USAID/G/HCD, whose invaluable expertise, advice, and encouragement guided
the research and writing of the study. Wendy Putman and Stacey Cramp, RRS,
provided their expert and careful editing skills. The RRS review team, composed
of Nick Mills, Dana Wichterman, and Anne Langhaug, made thoughtful
suggestions as to the overall format and content. The names and affiliations of the
training specialists who collaborated in the preparation of the case studies are
listed under the section Training in Support of Strategic Objectives on page 43.
ACRONYMS

ADR Alternative Dispute Resolution


ADS Automated Directives System
AOJ Administration of Justice
CLASP Caribbean and Latin American Scholarship Program
GTD Global Training for Development
HERNS Human and Educational Resources Network Support
HCD Human Capacity Development
HR Human Resources
HRD Human Resource Development
IR Intermediate Results
M&E Monitoring and Evaluation
NGO Nongovernmental Organization
NIS Newly Independent States
NPR National Performance Review
PDF Participant Data Form
PIO/P Project Implementation Order/Participant
PTMS Participant Training Management System
PVO Private Voluntary Organization
READ Reaching Out with Education to Adults in Development
SKA Skills, Knowledge, Attitudes
SO Strategic Objective
USAID United States Agency for International Development

TABLES

TABLE I Comparison of Traditional vs. Reengineered Training (p.11)
TABLE II Relationship of USAID Reengineering Core Values to the Training Process (p.14)
TABLE III Measuring Progress Toward Achievement of Strategic Objectives (p.30)
TABLE IV Criteria for Assessing Generic and Training Indicators (p.34)
TABLE V Examples of Training Indicators (p.35)
TABLE VI How to Isolate the Effects of Training on Performance Improvement (p.38)
TABLE VII How to Calculate the Value of Increased Production (p.40)
TABLE VIII How to Calculate Savings in Overtime (p.41)
TABLE IX HRD Constraints and HRD Gaps (education case study, USAID/Morocco) (p.62)
TABLE X Mechanisms Used to Monitor and Evaluate the Effectiveness and Impact of the
Training Program (training case study, USAID/Namibia) (p.70)
Introduction

In 1993 President Clinton created the National Performance Review (NPR) to reform
the practices and procedures employed by the federal government. With Vice President
Gore as its leader, the NPR initiative led to the Government Performance and Results
Act of 1993, which requires federal agencies to develop strategic plans, prepare annual
performance plans, and submit performance reviews. The United States Agency for
International Development (USAID) was selected as a "Reinvention Laboratory"
within NPR’s initiatives.

Following this mandate, in September 1995, USAID Administrator Brian Atwood
approved the Agency’s Strategic Framework and Indicators, 1995-1996. This
document articulates the guiding criteria and principles that the Agency must follow
to reengineer its systems. It provides a graphic explanation of the Agency’s goals and
objectives drawn from the strategies for sustainable development—protecting the
environment, building democracy, encouraging broad-based economic growth,
stabilizing world population growth, and providing humanitarian assistance. The
framework is used to review Missions’ programs, assist operating units in linking their
activities to Agency goals, assess the Agency’s performance, and enhance its ability to
manage for results.

In light of this directive, the Agency has reassessed the role of training in the
context of reengineered systems, expanding its function beyond individual
achievement to organizational improvement. This task involves a significant
shift from past training practices and requires developing appropriate design,
implementation, and evaluation tools to integrate training programs strategically in
support of other activities.

The purpose of this study is to examine how reengineering concepts and principles
apply to the training function and to provide trainers with approaches, ideas, and
strategies to design quality integrated programs, as well as monitor and measure
results. Five case studies representing different sector areas—economic growth,
democracy and governance, health, education, and training—are also included as
examples of reengineered training designs.
Highlights

This study is primarily intended to serve as a guide to USAID strategic objective (SO) teams
and training specialists as they confront the challenge of applying reengineering concepts and
integrating training activities into Missions’ strategic objectives. Its aim is to assist field
training staff in clarifying their function within SO teams; reiterate the crucial role they play
in demonstrating the link between training and strategic objectives; and examine useful
techniques and strategies that they can adapt to their own programs.

The Best Practices modules developed by the Human Resource Development Assistance
Project provide a detailed explanation, with a wide array of examples and illustrations, of the
major components and activities that comprise strategic training. Thus, this paper focuses
primarily on three topics: a discussion of training evaluation models, mechanisms for
developing training indicators as well as monitoring and measurement tools, and sector case
studies presented as examples of innovative training programs designed strategically. Every
effort was made in discussing each of these topics to bridge the gap between the theoretical
concepts stated in the reengineering initiatives and the reality experienced by those charged
with the implementation of reengineered training systems.

The first part of the study, entitled Theory, comprises four sections:

Training in a Reengineered Context provides a definition of training impact and briefly
discusses how reengineering guidelines affect each of the components of training. A table
contrasting concepts and practices under traditional and reengineered training is also provided,
as well as a brief discussion on how the Agency core values apply to training systems.

Evaluation Models synthesizes three commonly used training evaluation models developed by
Donald Kirkpatrick, Robert Brinkerhoff, and the USAID-funded HERNS project.

Training Indicators offers a general discussion of indicators and provides strategies and
recommendations for developing training indicators.

Additional Monitoring and Measurement Tools includes tools and mechanisms to isolate the
effect of training on performance improvement and calculate results in financial terms.

The second part of the study, Practice, presents five case studies of integrated training
programs designed strategically in support of other activities—economic growth, democracy
and governance, health, education, and training. The aim is to illustrate how reengineering
concepts and approaches are being applied in various sectors and give training specialists
the benefit of the experiences and expertise of others. A discussion of training results is also presented, as
well as a description of TraiNet, a database system designed to record and report training
activities and results.
Methodology

This study was prepared as a response to a request from USAID’s Center for Human
Capacity Development of the Global Bureau to identify training case studies and develop
monitoring and evaluation tools. The research involved a review of key documents dealing
with USAID reengineering concepts and practices; a review of major evaluations of USAID-
sponsored participant training programs; consultation with several training and evaluation
specialists; and searches in education/training databases and Internet web pages.

The case studies were prepared with the assistance of the training officers in the respective
Missions or their training contractors who provided the data and approved the final versions.
While the extensive interaction with these individuals was a most rewarding experience, it
was extremely time-consuming and not an efficient way of identifying and collecting the
information required. Prior to the transition from project-level activity to strategic objectives,
a significant amount of information related to projects and documents could be retrieved
through the USAID bibliographic database. Activities implemented under reengineering no
longer have an identifier, i.e., project number, and most of the information generated remains
in the field. To address this situation, G/HCD has developed an Agency-wide information
system, The Training Results and Information Network, TraiNet, to be used by field or
stateside training staff. It provides a standard mechanism for consistent data input and
collection that responds to reengineering guidelines. We must underscore the importance of
updating and maintaining this database on a regular basis as it will become the central
repository for USAID training-related information. (See the last section, Documenting Training
Activities, for a more detailed explanation of TraiNet.)
Training in a Reengineered Context

The United States Agency for International Development has long held the belief that
developing the human resource base of countries is a critical element in promoting sustainable
development. The variety and richness of training programs funded since the early 1960s
throughout the world attest to this belief. Stated in general terms, the overriding purpose of
these programs was to upgrade the skills and knowledge of participants who were selected
based on their personal merit or leadership potential. An effort was made to promote the
participation of women, indigenous peoples, and other disadvantaged groups.

Through these programs, effective and useful methodologies and evaluation tools were
developed and refined. The lessons and guidance they provide prepared the groundwork for
the design and implementation of strategic programs and allowed training specialists to
formulate agency-wide norms. Thus, as we discuss strategic training, we should emphasize
that we are not discarding many of the important and necessary elements developed in the
traditional training methodologies. In a reengineered Agency, "We must enhance the
traditional approach by shifting our emphasis to a results-oriented approach to training." (Best
Practices: 3)

The concepts and initiatives set forth by reengineering have reshaped the way the Agency
approaches and views development work. The Strategic Framework mandates that all country
activities show linkage to SOs and to Agency goals and objectives. For the training function,
this means that plans must be linked to technical activities of results packages. Thus,
reengineering has forced training to become more rigorous in responding to customer needs,
selecting participants, and measuring results. Its outcome will no longer be measured in terms
of number of people trained or their satisfaction with the training they received but on how
the training activity contributes to performance improvement and supports technical assistance
programs.

Definition of Training Impact

In redefining the role of training, the focus is on the function rather than the person, and
one must be prepared to address a totally different set of issues, such as: How did the
investment in training contribute to specific program outcomes or strategic goals? What
linkages can be established between training and the achievement of USAID’s strategies?
And what are the implications of these linkages in the approach to training?

The definition that we ascribe to training impact in a reengineered context must reflect the
new concepts being formulated. It will guide the principles and approaches we must follow in
a results-oriented training system and determine how all other aspects of the training process
are planned and managed from selection to evaluation. The forthcoming update of the
ADS 253 (Automated Directives System) provides a definition of training impact that focuses
on a functional approach:

[For the purposes of USAID-sponsored training], training impact is defined as the
improvement in individual job or organizational performance that can be directly
attributed to the skills, knowledge, and attitudes acquired during training.

Training professionals could argue that this definition is a restrictive, narrow one and excludes
the widely accepted notion that the purpose of training is to transfer knowledge. This may well
be a valid statement depending on the objectives of the training. However, if we are to view
training as a development tool used to achieve a strategic objective, we can no longer use the
concept of upgrading skills or imparting knowledge as the sole criterion for assessing its impact
or concluding that the training was a success. We must look beyond the individual attainment
and be able to assess, in quantifiable terms, how the investment in training contributed to
specific program outcomes. Thus, embedded in this definition is the concept that training does
not have an impact until the skills or knowledge acquired have been successfully applied in a
specified work situation and have resulted in a measurable improvement.

Application of Results-Oriented Framework to Training Systems

In The Learning Alliance, Robert Brinkerhoff asks the key question: "How will the training
design result in knowledge and behavior that lead to performance improvement that achieves
the goals of the organization?" The issues raised in this question, as well as in the definition
of training impact, illustrate the interconnection among the various components of a training
program. There is a sequential progression: Each phase builds on the previous one and
influences the decisions taken at the next level. Let us then briefly examine the implications
of these concepts at each of the phases of the training continuum, from needs assessment and
selection to evaluation.[1]

[1] See Best Practices Guide and companion subguides (1996) for an in-depth discussion
on how to plan, manage, and monitor the various components of training in a
reengineered context.

Under strategic training, a problem or need that is linked to a performance gap is
identified. We analyze the organizational context in which the job will be performed,
which will determine the skills that the employee needs to acquire or strengthen. A well
thought out, carefully executed analysis at the initial planning stages will lead to the
formulation of clear training objectives. If the objectives are not well articulated, the
training design will not contribute to an intermediate result, nor address the needs of
the customer. This analysis will guide the decisions that need to be made at each of the
subsequent phases of the training program.

     "Training—if well designed, implemented, applied, and transferred—can be an
     effective tool, whether used for an academic institution, an NGO, a community
     group, or a government institution."
                                                              USAID/El Salvador

The individuals selected to receive training are those who will perform the jobs that
contribute to organizational performance. Personal merit or leadership potential will no longer
be the criterion for selecting trainees. What we seek is a critical mass of participants who will
return to their jobs upon completion of the training and have an individual or collective
impact on their institutions.

Training designs will spell out the specific results that the training is intended to have, and
questionnaires, surveys, and evaluation tools will be designed based on the expected results.
In the traditional approach, it was assumed that returned participants would apply the new
skills and knowledge acquired, achieve professional gains, and make contributions to their
communities and society at large. The specific results expected, however, were not always
articulated. Thus, evaluators were forced to seek out returned participants, identify their
accomplishments, and report whatever results they observed.

Reengineering guidelines and practices call for a greater level of accountability. The same
rigor that should guide the design and implementation phases of a training program needs to
be observed when documenting and reporting results. Under a results framework, regular
monitoring is conducted, and adjustments are made periodically at the design and
implementation levels. Success is measured in terms of performance deficits addressed and
documented improvements that directly link to the strategic objectives.

Identifying these intended results at the design stage assists all those involved in the training
effort: providers are given a clear idea of the larger goals behind their particular program;
participants benefit from a program with clear expectations, objectives, and defined
applicability to their work; and USAID and its contractors have clear benchmarks with which
to measure program results.

Mission training specialists play a pivotal role in this effort. They must thoroughly familiarize
themselves with USAID reengineering concepts and practices, particularly as they are applied
in their respective Missions. The challenge for them lies in restructuring their function and
gaining recognition from SO teams that human resource development is at the core of
achieving sustainable results in each of the strategic areas.

Training specialists are expected to assist SO teams in aligning training activities with specific
objectives; justify the need for a timely training intervention; work closely with partner
institutions to support continuous staff development and learning; monitor and measure
improvements; and report results. The knowledge, expertise, and background that they bring
to this endeavor will be in great demand and certainly tested. This new and challenging
function requires a shift from planning and managing training as a process to shaping and
improving performance in support of organizational goals; a shift from being training
managers to providing strategic input. Thus, the main purpose of this study is to provide these
specialists with practical and useful tools, techniques, or approaches for the effective
stewardship required to design quality programs.

Table I below synthesizes and contrasts the differences between traditional
and reengineered training practices discussed above.

TABLE I

Comparison of Training Concepts Under Traditional and Reengineered Training

Needs Assessment

Traditional:   A general inventory of training needs was conducted.
Reengineered:  Needs that address specific gaps in job performance are identified.

Objectives

Traditional:   Training was the objective.
Reengineered:  Training is one of several development tools used to achieve a
               strategic objective.

Traditional:   Objectives were not linked to program goals; they were defined as
               learning results.
Reengineered:  Objectives show direct linkage to program goals.

Traditional:   Aimed at general improvement.
Reengineered:  Targets specific objectives. Has a precise and narrow focus.

Traditional:   Sought to strengthen the organization or institution, or to provide
               general institution building.
Reengineered:  Strives to improve individual and organizational performance through
               the application of new knowledge, skills, and attitudes.

Selection

Traditional:   Participants were selected based on individual merit, ability, or
               leadership qualities.
Reengineered:  Participants identified for training are those who will perform the
               jobs that will contribute to the organizational improvement. A
               critical mass of people is selected for maximum impact.

Design/Implementation

Traditional:   Historic preference for U.S.-based training.
Reengineered:  Choice of training, location, and duration should match real needs.

Traditional:   Training designs were based on the number of people trained or the
               needs of the participants.
Reengineered:  Design is targeted and based on the need to upgrade the performance
               of the institution. Training content shows linkage to strategic
               objectives.

Evaluation

Traditional:   Number of people trained was the indicator.
Reengineered:  SO teams identify training indicators prior to training, i.e., agree
               on the changes that training will bring.

Traditional:   Evaluation was based on outputs, such as number of people trained,
               or inputs, such as number of courses offered.
Reengineered:  Training is evaluated in terms of results, i.e., the improvements
               participants achieve in job performance or in the organization.

Traditional:   Quality of training was assessed based on participant satisfaction
               and individual results achieved.
Reengineered:  Results are assessed in terms of customer needs. Evaluation examines
               changes in specific performance areas, such as productivity,
               efficiency, safety, and quality of service.

Traditional:   Learning results and impact were not specified.
Reengineered:  Requires baseline data and targets. Indicates measurement of
               improvement and results.

Traditional:   Accountability rested on trainers only.
Reengineered:  Trainers, supervisors, and participants are accountable for results.

Role of Training Specialists

Traditional:   Training was the sole responsibility of the training office.
Reengineered:  Training specialists are integrated into SO teams, and together they
               participate in the planning, implementation, and monitoring of
               training.

Traditional:   Training specialists managed the training function and provided a
               specialized service.
Reengineered:  They become strategic partners. They assess needs, monitor progress,
               and report results.

Role of the Customer

Traditional:   Partner organizations had little input in the planning of training
               activities.
Reengineered:  Customers (participants, supervisors) provide input and are directly
               involved in the planning and implementation phases of all training
               components.

Traditional:   The benefit of training results to partner organizations was not
               always specified.
Reengineered:  Partner organizations are fully aware of the benefits derived from
               training their staff.

Traditional:   The participant alone was responsible for applying new skills.
Reengineered:  Application of new skills is the responsibility of the customers as
               well.

     Regardless of the form (training, education, development, or some combination
     or variant of these types), all HRD is alike in that it is not meant to be done
     for its own sake but rather to benefit the organization. (Brinkerhoff: 10)
Reengineering Core Values: Their Relationship to Training

Four core values have guided the Agency’s effort to restructure its operating systems: managing
for results, teamwork, customer focus, and empowerment and accountability. It is important to
analyze how we can use these values to guide our thinking and decision-making process; what
they mean in terms of planning and deciding training activities; and how they apply to the
various phases of the training process. Following is a brief description of the core values and
their application to training:

Managing for results - Means developing results-oriented strategic objectives and performance
indicators, routinely collecting and analyzing data to monitor program results, and using this
information to allocate resources.

For the training function, this means that objectives address the skills that need to be improved in
the workplace or organization. Mechanisms for collecting regular feedback allow for timely
changes in the design or implementation phases and provide an analytic base for measuring and
using results. Budget decisions are made based on results—on the actual improvements made.

Teamwork - Missions will establish strategic objective teams to design and manage their
programs. SO teams have the freedom and authority to plan their own activities and set goals.

In the training context, teams comprise training specialists, partners, customers, participants, or
beneficiaries, who develop the objectives and indicators and are in charge of monitoring and
conducting periodic evaluation activities.

Customer focus - The customer is involved in defining the activities that will best address their
needs. This means that Missions must include customers as part of the SO team, and they must
participate in all phases of program development.

In planning training, the customer is directly involved in assessing the organization’s performance
gaps; identifying the skills that need to be upgraded or acquired to address these gaps; selecting
the group of employees that need to be trained; deciding on the most appropriate training design;
and participating in the monitoring and feedback process.

This level of participation may be deemed too involved and time-consuming. Supervisors, in
particular, may feel that it detracts from other more pressing managerial functions. But the
importance of direct customer involvement and participation cannot be overemphasized. Building
the human resource base not only allows the customer and the organization to keep pace with
change, but it also provides a significant source of competitive advantage.

Empowerment and accountability - USAID/Washington will set directions and provide guidelines,
but field Missions will decide how to implement them. Training teams will have the
responsibility for allocating, managing, monitoring, and reporting on the resources expended.

The following table presents a graphic illustration of how the four core values are applied at each
of the major phases of the training process.

Training as a Development Tool


PPC\CDIE\DI\RRS 13
TABLE II

Relationship of USAID Reengineering Core Values to the Training Process

Managing for Results

Design/Planning:          Training objectives are defined in terms of performance
                          gaps. Training indicators have been established.
Implementation:           Training is implemented with focus on skill improvement.
                          Timely adjustments to design and implementation are based
                          on regular feedback.
Monitoring & Evaluation:  Results are reported in quantifiable terms and are linked
                          to SOs. Results are linked to budget decisions. Reduced
                          reporting requirements facilitate focus on results.

Teamwork

Design/Planning:          TEAM* develops and agrees on objectives and indicators.
Implementation:           TEAM input is sought to implement and manage project.
Monitoring & Evaluation:  M&E process and activities are conducted by TEAM at
                          regular intervals.

Customer Focus

Design/Planning:          Customer defines needs, i.e., performance gaps that
                          training can address.
Implementation:           Customer is involved in implementation process and
                          provides regular input.
Monitoring & Evaluation:  Customer experiences improvement in individual job and in
                          organizational performance. Customer satisfaction is
                          essential to future training programs.

Empowerment and Accountability

Design/Planning:          Training resources are allocated by SO teams.
Implementation:           Implementation of training activities is delegated to TEAM.
Monitoring & Evaluation:  TEAM has responsibility for M&E process. SO team reports
                          results in R4.

*TEAM refers to SO team, training specialists, partners, customers, participants, or beneficiaries.
Evaluation Models

If we are to rethink the role of training in a strategic context and assess its impact beyond
individual attainment, then the tools we use to measure results must reflect the new practices
being implemented under reengineering. How do we measure the effectiveness of training in
terms of results rather than inputs? How do we know if training is the appropriate medium for
meeting the performance gaps identified? Training professionals confront these issues
increasingly as they are pressured both to redefine the training function and justify their
investments. Regular monitoring and evaluation practices are not only the best means of
providing such justification but also an essential component of all training designs.

This section summarizes three training evaluation models designed by Donald Kirkpatrick,
Robert Brinkerhoff, and the USAID-funded HERNS project (Human and Educational
Resources Network Support). It is beyond the scope of this study to provide detailed
evaluation tools or techniques for the various levels of assessment described in each of the
models. The intent is rather to present criteria and guidelines useful in designing quality
monitoring mechanisms and to synthesize the issues—from a conceptual point of
view—related to measuring training effectiveness and efficiency.

It should be noted, however, that only the HERNS model was specifically designed to
evaluate USAID training programs. The concept of preparing change agents—individuals who
exert influence beyond the workplace, at the program level or in their communities and
society at large—is used by the HERNS model to measure impact at the highest level.

While the Kirkpatrick and Brinkerhoff models do not measure results beyond the workplace,
they do provide practical and valuable strategies, insights, explanations, or solutions
applicable to development training programs. Considering the numerous differences in the
programs, structures, and work environments in which SO teams operate, a single monitoring
and evaluation method most probably will not address all the issues related to a training
event. A sounder and more reliable strategy is to establish monitoring mechanisms that
permit training staff to collect, analyze, and report results on a regular basis. In other words,
let the strengths that each model offers reinforce the various needs of the
training program as best suited. The level and degree of effort expended at each level of
evaluation naturally depends on the needs, requirements, and resources of the program.

Four-Level Evaluation Model
Donald Kirkpatrick

Donald Kirkpatrick outlined the four levels of his widely used evaluation model in a series of
articles published in 1959. Often acclaimed for its practicality and simplicity, Kirkpatrick’s
model has certainly withstood the test of time. In the decades since the publication
of the articles, training and evaluation professionals have frequently quoted, applied, modified,
and expanded this model. And despite the numerous changes and innovations that training
concepts and designs have undergone over the years, this model continues to be a useful and
effective tool.

In 1994, Kirkpatrick published Evaluating Training Programs: The Four Levels, in which he
explains the concepts put forth in his series of articles and provides techniques along with a
set of guidelines for evaluating each level. The second part of the book provides case studies
of organizations that have used evaluation at different levels.

The four levels outlined by Kirkpatrick are:

Level 1 - Reaction
Level 2 - Learning
Level 3 - Behavior
Level 4 - Results

The author cautions that each level is important and none should be skipped in favor of the
level that is deemed most useful. Each level of evaluation provides essential information for
assessing the effectiveness of the following level. The motivation and interest of the trainees
(Level 1 - Reaction), for instance, has a direct influence on the level of learning that takes
place (Level 2 - Learning). Likewise, the amount of learning that takes place influences the
behavior (Level 3) of the person, without which there would be no results (Level 4). The
higher the level, the more involved, costly, and challenging the process becomes to
accomplish and assess.

Reaction - Level 1

Reaction measures the level of trainee satisfaction as to the location, training content, or
effectiveness of the trainer. If the trainees do not have a positive reaction to these, they will
not respond favorably to the material presented or skills taught. Thus, it is crucial to assess
the level of satisfaction of the participants at regular intervals during the training and make
the necessary adjustments based on the feedback received. The motivation and interest of the
trainees has a direct influence on the amount and level of learning that takes place.

Guidelines for evaluating reaction

- Determine what you want to find out
- Design a form that will quantify reactions
- Encourage written comments and suggestions
- Get 100 percent immediate response
- Develop acceptable standards
- Measure reactions against standards and take appropriate action
- Communicate reactions as appropriate

A short, yet well-constructed questionnaire should provide the necessary information to assess
Level 1 results. This is a relatively easy task, and one should aim to get a 100 percent
response. A positive response will not guarantee that participants will apply the content of the
training in the workplace, but a negative reaction, most likely, will prevent trainees from
going beyond this level.
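To make the idea of quantified reaction forms concrete, the following short Python sketch tallies form results against an acceptable standard, as the guidelines above suggest. It is an illustrative sketch only: the 1-5 scale, the 4.0 standard, and all names in the code are assumptions, not part of Kirkpatrick's model.

    # Minimal sketch of quantifying Level 1 (Reaction) results: collect ratings
    # on a form, compare each item's average against an acceptable standard,
    # and flag items that need action. The 5-point scale and the 4.0 standard
    # are illustrative assumptions.

    from statistics import mean

    ACCEPTABLE_STANDARD = 4.0  # assumed threshold on a 1-5 satisfaction scale

    def summarize_reactions(ratings_by_item: dict) -> dict:
        """Average the 1-5 ratings collected for each item on the reaction form."""
        return {item: mean(scores) for item, scores in ratings_by_item.items()}

    def items_needing_action(averages: dict) -> list:
        """Return form items whose average falls below the agreed standard."""
        return [item for item, avg in averages.items() if avg < ACCEPTABLE_STANDARD]

    # Example: ratings from a 100 percent response to a short reaction form.
    form_results = {
        "location": [5, 4, 4, 5, 3],
        "training content": [4, 5, 5, 4, 5],
        "effectiveness of trainer": [3, 3, 4, 2, 3],
    }

    averages = summarize_reactions(form_results)
    print(averages)                        # {'location': 4.2, ...}
    print(items_needing_action(averages))  # ['effectiveness of trainer']

Items that fall below the standard would then trigger the "take appropriate action" step, such as adjusting the venue or the trainer's approach during the program.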

Learning - Level 2

Kirkpatrick contends that learning takes place when attitudes are changed, knowledge is
increased, or skills are improved. Learning to use a new software program, for instance,
increases the skill level, while a program aimed at enhancing male involvement in family
planning would deal with cultural differences and seek to change attitudes. An evaluation tool
aimed at measuring learning must take into account the specific objectives of the training.

Guidelines for evaluating learning

- Use a control group, if practical
- Evaluate knowledge, skills, and/or attitudes both before and after the program
- Get a 100 percent response
- Use the results of the evaluation to take appropriate action

Kirkpatrick makes the point that when evaluating learning, we are also measuring the
effectiveness of the trainers. If the results at this level are not satisfactory, we may also need
to assess the training venues, as well as the expertise and training skills of the staff.
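A minimal sketch of the before-and-after measurement recommended above follows, including the optional control group. All scores and names are hypothetical; the point is only that learning attributable to the program is suggested by comparing gains, not by the trained group's post-test scores alone.

    # Minimal sketch of Level 2 (Learning) evaluation following the guidelines
    # above: test knowledge before and after the program and, where practical,
    # compare the trained group's gain with that of an untrained control group.
    # All scores and group names are illustrative assumptions.

    from statistics import mean

    def average_gain(pre_scores, post_scores):
        """Average improvement per person between pre-test and post-test."""
        return mean(post - pre for pre, post in zip(pre_scores, post_scores))

    trained_pre, trained_post = [52, 61, 48, 70], [78, 83, 74, 90]
    control_pre, control_post = [55, 60, 50, 68], [58, 61, 52, 70]

    trained_gain = average_gain(trained_pre, trained_post)  # 23.5 points
    control_gain = average_gain(control_pre, control_post)  # 2.0 points

    # The learning attributable to the program is suggested by the difference
    # in gains between the two groups.
    print(f"Gain attributable to training: {trained_gain - control_gain:.1f} points")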

Behavior - Level 3

This level tests whether participation in training has resulted in changes in behavior.
Participants may be asked to provide specific examples of how training has affected their job
performance. Kirkpatrick emphasizes the importance of evaluating levels one and two before
attempting to measure changes in behavior.

Guidelines for evaluating behavior

- Use a control group, if practical
- Allow time for behavior change to take place
- Evaluate before and after the program, if practical
- Survey and/or interview one or more of the following: trainees, their immediate
  supervisor, their subordinates, and others who often observe their behavior
- Get 100 percent response or a sampling
- Repeat the evaluation at appropriate times
- Consider cost versus benefits

For change in behavior to occur, two key conditions must be present: The person must have
an opportunity to put into practice the skills acquired and must encounter a favorable work
climate. The training program can teach the necessary skills and provide a conducive
environment to change, but providing the right climate is the responsibility of the participant’s
immediate supervisor. If learning took place, but no changes in behavior are observed, it may
be that the person does not have a supportive environment, or work conditions prevent
him/her from applying the new skills. Likewise, if the individual has shown improvement in
job performance, but no improvement is evident in the organization, then the climate of the
organization should be analyzed to assess the causes, rather than the training. All these are
important variables to consider before deciding whether or not the training has produced the
expected results at this level.

Results - Level 4

The first three levels assess the degree to which participants are pleased with the program,
acquire knowledge, and apply it to their jobs. Level 4 attempts to measure the final results
that took place due to participation in the training.

This is the most difficult level to evaluate and requires considerable time, skill, and resources.
At this level we measure the benefits to the organization that resulted from training. There are
numerous ways of measuring results: increased efficiency, reduced costs, better quality,
enhanced safety, greater profits. Again, the final objectives of the training program must be
defined in terms of the results expected. Kirkpatrick, however, does not address the fact that
impact measurement must take into account that other variables affect performance besides
training. (See section on Monitoring and Measurement Tools.)

Guidelines for evaluating results

- Use a control group, if practical
- Allow time for results to be achieved
- Measure both before and after the program, if practical
- Repeat the measurement at appropriate times
- Consider cost versus benefits
- Be satisfied with evidence if proof is not possible

The usefulness of the Kirkpatrick model lies in its logic, elegance, and applicability.
However, while it has been used widely to evaluate USAID-sponsored training, we must take
into account that it was not devised with development programs in mind, which seek ways of
measuring results linked to strategic objectives at the program level.

The chart below illustrates the chain of impact across the four evaluation levels in terms
of the value of the information each level provides, its power to show results, its
frequency of use, and its difficulty of assessment. Level 4 evaluation yields more valuable
information and has greater power to show results than the other levels. Level 1
evaluations are fairly common; Level 4 evaluations are far less frequent, largely because
of the difficulty of administering and assessing them. (Phillips 1994: 7)

______________________________________________________________________

Chain of             Value of         Power to        Frequency     Difficulty of
impact               information      show results    of use        assessment
______________________________________________________________________

Reaction (Level 1)   least valuable   least power     frequent      easy
Learning (Level 2)         |                |              |              |
Behavior (Level 3)         |                |              |              |
Results (Level 4)    most valuable    most power      infrequent    difficult
______________________________________________________________________

Six-Stage Evaluation Model
Robert Brinkerhoff

In his book Achieving Results from Training, Robert Brinkerhoff presents a six-stage training
evaluation model, which adds two initial steps to the Kirkpatrick model—evaluation of the
needs and goals of the training design. Brinkerhoff contends that crucial information needs to
be gathered at these first two stages before the decision is made to implement a training
program. He examines in considerable detail the issues that need to be resolved at each level
before moving to the next one and offers a wide variety of data collection techniques,
guidelines, and criteria crucial to making sound decisions and ensuring that the training
program pays off.

Before undertaking an evaluation exercise at any level, however, Brinkerhoff underscores the
importance of clarifying the need and purpose of the evaluation; the type of information that
should be collected at each stage; the audience for whom the information is gathered; how the
reporting will be conducted; and the key decisions and judgments that need to be made based
on the data collected at each step of the process.

While he provides a vast array of examples, checklists, and suggestions, he encourages
trainers to decide for themselves what important issues and questions to raise throughout the
process based on the unique circumstances and needs of their respective programs. An
evaluation effort (and for that matter a training program) should not be carried out if there is
no consensus or clear answers to the most important issues identified.

The six stages of Brinkerhoff’s evaluation model are:

Stage I - Evaluate Needs and Goals
Stage II - Evaluate the Design
Stage III - Evaluate Operation
Stage IV - Evaluate Learning
Stage V - Evaluate Usage and Endurance of Learning
Stage VI - Evaluate Payoff

These six stages represent a sequence in which each step is linked to the preceding one, as
well as to the following step. The issues that may arise at any stage are directly linked to the
decisions made in the preceding one and have a direct impact on the outcome of the
following stage. He refers to the "training decision-making cycle," i.e., problems that surface
pertaining to any of the stages may necessitate reviewing the decisions made at the previous
stages, examining the reliability of the data, or even returning to Stage I.

The following diagram illustrates the decision-making cycle of the six-stage model (p.27):

The Six-Stage Model as a Cycle

[Diagram: the six stages arranged in a circle. Stage I (Evaluate needs and goals)
leads to Stage II (Evaluate design), then Stage III (Evaluate operation), Stage IV
(Evaluate learning), Stage V (Evaluate usage and endurance of learning), and
Stage VI (Evaluate payoff), which feeds back into Stage I.]

This model shows the "recycling that takes place among the stages." For instance, if the
participants are not interested and motivated during the training (Stage III), is the design
appropriate (Stage II)? Is the training necessary (Stage I)? If the employees are not
applying the skills taught (Stage V), did they really learn them (Stage IV)? Are the new
skills still necessary for them (Stage I)? And if trainers cannot agree on the appropriate
design (Stage II), is training the answer to the problem (Stage I)? (p.33)
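The recycling logic can be made concrete with a small sketch that encodes the three examples above as a lookup from an observed problem to the earlier-stage questions it reopens. This is an illustrative reading of Brinkerhoff's cycle, not a procedure he prescribes; the wording of the entries simply restates the examples in the text.

    # Minimal sketch of Brinkerhoff's "recycling" idea: a problem observed at
    # one stage sends the evaluator back to earlier stages for re-examination.
    # The mapping encodes only the three examples given in the text.

    RECYCLE_TO = {
        # problem observed at a stage -> earlier-stage questions to revisit
        "III: trainees uninterested or unmotivated": [
            "II: is the design appropriate?",
            "I: is the training necessary?",
        ],
        "V: skills taught are not being applied": [
            "IV: were the skills really learned?",
            "I: are the new skills still necessary?",
        ],
        "II: no agreement on an appropriate design": [
            "I: is training the answer to the problem at all?",
        ],
    }

    def questions_to_revisit(problem):
        """Return the earlier-stage questions the cycle sends us back to."""
        return RECYCLE_TO.get(problem, [])

    for q in questions_to_revisit("V: skills taught are not being applied"):
        print(q)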

Following is a summary of the six-stage model. As stated above, it is beyond the scope of
this section to provide specific mechanisms or tools needed to conduct an evaluation exercise.
The intent rather is to synthesize the salient concepts and definitions that each stage addresses
and examine the guidelines, criteria, and key issues that need to be resolved throughout the
process.

Stage I - Evaluate Needs and Goals

The data collected at this level are used to analyze and prioritize the needs, problems, and
weaknesses of the organization and establish what training goals are worth pursuing.
This analysis also provides crucial information to determine whether training is the solution to
the weaknesses identified.

There are several situations that may call for a training solution, such as performance deficits,
organizational changes, or management decisions. Since Stage I analysis will provide a
framework for establishing the value of the training and determine its potential payoff, it is
directly linked to Stage VI evaluation, which determines whether or not the training was
worthwhile.

The following examples illustrate the relationship between the performance gaps identified
(Stage I) and the benefits sought (Stage VI). (Adapted from p. 33)

_____________________________________________________________________
Stage I                                   Stage VI
Needs (Reasons) for Training              Benefits (Criteria for Success)
_____________________________________________________________________

Lack of research skills for ongoing       Data is collected, analyzed, and
qualitative and quantitative data         reported regularly
collection

Inefficient decision-making procedures    Increase in productivity with
due to centralized systems                changes in local problem-solving
                                          ability
_____________________________________________________________________

These examples illustrate how organizational deficits are linked to corresponding criteria to
assess the results of the training. If consensus for Stage VI criteria cannot be reached, it is an
indication that either work at Stage I is not complete, or there is no need to do training.

In summary, the purposes of Stage I evaluation are to assess, validate, and rank the needs
and problems; clarify the causes of the problems; distinguish between needs and wants; and
determine the potential value in meeting these needs.

Stage I seeks data that will "predict" whether on-job behavior can and should
be changed, whether specific SKA [skills, knowledge, attitudes] changes would
be sufficient for changed behavior, and whether SKA changes are achievable
through a training intervention (p.26).

Stage II - Evaluate Training Design

This level assesses the appropriateness of the training design. It focuses on the issues that
must be considered after the decision is made to undertake a training activity, but before it is
implemented. At this stage, several designs may be proposed, and the strengths and
weaknesses of each assessed. The design finally adopted may represent a composite of the
best elements of several designs.

Careful analysis of the adequacy of the strategies and materials, as well as the training
methods and venues selected, will render a stronger and more effective design and allow the
process to move to the implementation stage. The inevitable weaknesses present in the design
will be revealed when it is actually put in operation, and the trainers will have to review it
and make the necessary adjustments. This is an example of what the author refers to as the
recycling process of the six-stage evaluation model.

Among the criteria suggested to guide the assessment of the training plan are:

Clarity and definition. Everyone involved in the training event—operating unit, customers,
and participants—must be able to readily understand the various components of the design.
This involves clear definition of the needs and goals to be addressed by the training; the
approaches and strategies developed; and the resources necessary to implement the programs.

Compatibility. The training format adopted and the materials selected must also reflect the
environment in which the training will take place, the cultural and ethnic make-up of the
participants, as well as their educational, professional, and social backgrounds.

Theoretical adequacy. The design must incorporate current research and sound theory related
to adult-learning practices.

Practicality and cost-effectiveness. The theory that supports the design might be excellent, but
if it requires unreasonable financial or human resources, it may not be a practical design. The
evaluation at this level should consider economic alternatives of implementing the training
without compromising the objectives.

Legality and ethics. The importance of considering this criterion at the design level cannot
be overemphasized, and the "criteria regarding ethics and legality are absolute and must not
be compromised." Trainers need to take into account and honor the needs, rights, and
expectations of the participants based on their customs and traditions, as well as ensure their
physical safety. (For USAID-sponsored training, this means that it must adhere to regulations
put forth in ADS 253.)

Stage II evaluation should carefully identify the objectives that a given design
will probably achieve, then compare these against the initial expectations to
assure that real and worthwhile needs are likely to be addressed (p.88).

Stage III - Evaluate Training Implementation

Once the training design is deemed appropriate, this stage monitors the training activities and
gathers feedback on the reaction and level of satisfaction of the participants. It assesses the
discrepancy between what is actually taking place in the training and what was planned or
expected. To solve the problems encountered at this level, trainers may need to refer back to
the training design (Stage II) and make the necessary adjustments.

Some useful techniques to conduct Stage III evaluation include:

Interviewing. Whether the interviews with participants are structured or informal, they are a
useful technique because they allow the trainers to ask follow-up questions and obtain more
detailed information.

Key participant method. This method involves selecting trainees who, because of their
expertise or leadership qualities, are able to provide thoughtful comments and insights.

Observations. One trainer observes another and records participant reactions and behaviors. If
an observation form is developed, it will render useful quantitative data on the reaction of the
participants.

Trainee rating and reaction forms. Questionnaires and surveys may be administered at regular
intervals during the training to gauge the satisfaction level of the participants. But
because this is the most commonly used method to evaluate reaction to the training,
participants may not pay much attention to the forms and provide superficial comments.
Nonetheless, their reaction is important in order to proceed to the next stage.

... Stage III process is one of observing and assessing the program’s progress,
noticing discrepancies, making revisions, and trying it out again, then
reobserving and reassessing to see if progress is now acceptable. This is the
process that makes training work and move toward payoff (p.96).

Stage IV - Evaluate Learning

Any training event, regardless of its scope or duration, aims to enhance the skill and
knowledge level of the participants. The extent to which this improvement has been achieved
is the measure of the effectiveness of the program. Stage IV determines the level of learning
and improvement that took place. If sufficient learning occurred, we can expect that it will be
applied in the workplace and results will be achieved.

The data gathered at this level are used to revise and refine the activities and strategies that
will ensure the desired transfer of learning. Brinkerhoff suggests the following uses for Stage
IV evaluation:

Gathering evidence that proves the effect of training—accountability. Trainers need to provide
evidence that the skill and knowledge level of the participants has improved.

Determining mastery of training results. This information is useful at three levels: it provides
feedback to the participants regarding their achievement, to the trainers regarding trainee
performance, and to the supervisors regarding the degree of skill mastery of their staff.

Looking for unintended results. If the unintended results are also undesirable, trainers need to
know this in order to reinforce those areas in the program that produce desirable results.

Planning for Stage V follow-up. Because Stage IV evaluation identifies weaknesses in the
achievement of skills and knowledge, it sets the framework for Stage V evaluation, which
assesses the application of these skills.

Marketing. By reviewing past learning results of training, particularly results based on
"reliable and valid measures of achievement," SO teams and customers can make informed
decisions on whether training is the appropriate answer.

A common definition of evaluation, particularly among educators, is that it
consists of defining objectives, specifying those objectives measurably, and then
assessing the extent to which learners have mastered those objectives. With a
few minor additions and caveats, this definition accurately captures the
spirit of Stage IV evaluation (p.113).

Stage V - Evaluate Usage of Learning

This level of evaluation indicates how well the trainees are using the knowledge and
skills acquired on the job. "It looks at actual performance, not ability to perform."

Stage V evaluation usually takes place at the workplace, which "represents the richest source
of data." Because transfer of training to the workplace does not take place exactly as planned,
evaluators should take into account the numerous steps and changes that occur from the
learning results phase (Stage IV) to the eventual application of these results. The purpose of
Stage V evaluation then is to record and analyze these steps and changes. Essentially, it
documents when, where, how well, and how often the training is being used; which skills are
and are not being used; and how long the effects of training have lasted.
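A minimal sketch of what such Stage V documentation might look like as a structured record follows; the field names and example values are illustrative assumptions, chosen only to mirror the when, where, how-well, and how-often questions above.

    # Minimal sketch of a structured record for Stage V documentation,
    # mirroring the questions above: when, where, how well, and how often each
    # trained skill is being used, and whether the effect has lasted.
    # Field names and example values are illustrative assumptions.

    from dataclasses import dataclass

    @dataclass
    class SkillUsageRecord:
        skill: str          # skill taught in the training
        observed_on: str    # when usage was observed (ISO date)
        location: str       # where the skill is applied
        frequency: str      # how often it is used on the job
        proficiency: int    # how well it is applied, e.g., rated 1-5
        still_in_use: bool  # whether the effect of training has lasted

    record = SkillUsageRecord(
        skill="qualitative data collection",
        observed_on="1997-06-15",
        location="district health office",
        frequency="weekly",
        proficiency=4,
        still_in_use=True,
    )
    print(record)

A file of such records, built from workplace surveys and interviews, would also serve as the before-and-after database for Stage VI that the text describes below.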

The author once again cautions that it is important to clearly define the explicit purposes and
uses of a Stage V evaluation before designing it. Below are some guidelines:

Revising training. This level determines the effective and ineffective ways in which the new
knowledge is being applied. It signals ways of improving the program to achieve transfer of
skills and knowledge at the level expected. Or it may be decided that other types of targeted
interventions at the workplace, such as providing peer support or greater guidance, are all that
is necessary.

Planning ahead for Stage VI evaluation. The benefits that training brings to the organization
cannot be assessed without an accurate understanding of how the new skills are actually being
applied. Stage V documents instances of appropriate application of the new skills, which
forms the basis for Stage VI inquiry.

Documenting and accounting for transfer of training. This provides crucial information to potential
participants as to what they can expect from training in terms of actual results in the
workplace. By documenting the before and after behaviors in the workplace, an evaluator also
develops a database for Stage VI evaluation.

It should be noted ... that the benefits to the organization derive not from what
was learned but from what actually gets used. This provides the basic reason
for being of Stage V evaluation: Training is not done for the sake of learning
alone but for the sake of providing value to the organization through improved
job performance (p.133).

Stage VI - Evaluate Payoff

By the time the evaluation process reaches this level, we can assume that the training was
successful, the participants are applying what they have learned, and an evaluator has
identified and recorded the extent to which changes have taken place in the workplace. The
aim of Stage VI evaluation then is to assess the value that these changes have brought to the
organization and whether this value was worth the effort given the time and resources
expended.

The sequence of events that follows a training intervention—from acquiring knowledge to changing behavior in the workplace to deriving benefit from the change—is an unpredictable and complicated one. The worth of the training is measured by documenting the benefits, assessing their value, and comparing them to the cost of the training. Thus, the key question posed by Stage VI inquiry, "Was the training worth it?", cannot be readily answered without first examining four issues:

- The benefits that have resulted from training
- The value of each of the benefits (monetary or otherwise)
- How the value of the program’s benefits compares to training costs
- The extent to which the initial training need or problem has been resolved

If we consider the six-stage model as a cycle, the data collected at the final stage are used to assess whether the training results have satisfactorily resolved the needs of the organization. The fourth point above addresses this issue, which leads directly back to the needs and goals that were identified at Stage I. To determine whether the training has paid off, it is crucial to show at this level of evaluation the link between Stage VI and Stage I. The answers derived from this analysis will guide the decision to either replicate or abandon future training programs in the same area.

Sometimes the value of the benefits can be assessed in monetary terms or cost savings and, therefore, can be easily measured. But in cases in which it is not feasible to measure the value of the improvement in financial terms—such as clean air or improved teamwork and morale—we have to use qualitative methods. And while these methods may be more subjective, the improvement should not be considered less valuable or beneficial.

Some important guidelines to follow when embarking on a Stage VI evaluation include:

Consider a broad range of training impact variables. This involves documenting benefits that
may not be directly linked to the needs but are nonetheless beneficial to the organization.

Look for specific training applications. The point here is that a list of specific applications
—as opposed to general statements of impact—is of greater use when making decisions about
the value of future programs.

Consider a wide range of cost factors. The worth of training must take into account the
numerous costs associated with this activity.

Refer to specific data from preceding evaluation stages. This guideline refers to the recycling
concept. Stage VI calls for attributing a value—monetary or otherwise—to the results. Thus,
the data obtained at Stages II and III, the design and implementation levels, are used to
estimate the cost of the training. Stage I data, as mentioned above, provide a basis for
determining the value. The crucial point that bears repeating is that Stage VI evaluation
should not be undertaken without reference to the data obtained at the previous levels.

Stage VI evaluation will have to work toward specifying, identifying, describing, and documenting those results of HRD that can be construed, in and of themselves, to be pay-offs—things of value to the organization. These "things of value" are the end points on the chain of events resulting from HRD.

The six-stage model represents an exhaustive evaluation exercise that is not always feasible given competing deadlines and reduced budgets. Nonetheless, Brinkerhoff’s advice is to consider each of the stages, if only briefly, to guide and bolster the training function and educate the customers as to its benefits.

HERNS Models

The HERNS (Human and Educational Resources Network Support) project carried out by
Aguirre International provides assistance to USAID Missions with the design, development,
and evaluation of training activities. Through this project, HERNS specialists have developed
performance monitoring systems and training impact evaluations for several Missions.

The model presented below was designed in 1995 for USAID/Egypt’s integrated monitoring
and evaluation (M&E) system. It links the sequence of events of the training cycle with key
M&E activities. Source: Development Training II Project, M&E System, USAID/Egypt.

Planning - Strategic, tactical, and operational. Key M&E activities include clarifying the link of training to the SO, establishing indicators, and collecting available baseline data.

Implementing - Pre-departure, training provision, re-entry, and follow-up. Key M&E activities include collecting trainee biographical data, pretraining data, and end-of-training evaluation data.

Applying - Post-training application of new skills to the workplace. Key M&E activities include follow-up questionnaires, interviews, focus groups, and institutional case studies.

Achieving Intermediate Results (Individual) - Trainee performance improvements. Key M&E activities include follow-up questionnaires, interviews, focus groups, and institutional case studies.

Achieving Intermediate Results (Institutional) - Partner institution performance improvements. Key M&E activities include follow-up questionnaires, interviews, focus groups, and institutional case studies.

Realizing Results - Impact on customers and achievement of SOs. Key activities should be included in M&E plans for results packages and SOs.

Subsequent monitoring and evaluation systems designed by HERNS evaluators continue to emphasize that training for results shifts the focus away from the individual trainee to the organization. Implementing training for results leads training managers to work closely with organizational sponsors—those who will have a direct bearing on how the development of human capital is used to achieve the goals of the organization. The focus on organizational performance is also characterized by a shift in monitoring and evaluation activities, as illustrated in TABLE III, prepared for USAID/El Salvador. The issues and questions that need to be addressed in a monitoring and evaluation system that links training to results are described for each level.

TABLE III Measuring Progress Toward Achievement of Strategic Objectives

The table tracks the training cycle through five stages: (1) strategically plan and implement training; (2) acquire skills, knowledge, and attitudes (SKA); (3) apply SKA and achieve the training activity objective; (4) contribute to the Results Package (RP) objective; and (5) contribute to the Strategic Objective. The same five monitoring questions are answered for each stage.

1. Strategically plan and implement training
Why is this being monitored? To judge the performance of SO/RP teams, training units, contractors, and providers in ensuring relevant and quality training programs.
What indicators will be used? Generic: degree of collaboration of all stakeholders in planning, including action planning that links training to SOs; degree of trainees’ satisfaction with training. Specific: to be determined by the RP team.
How will it be measured? Self-assessment of collaboration through focus groups with stakeholders; trainee satisfaction questionnaires.
When will it be measured? Focus groups: annually. Questionnaires: upon completion of training.
Who will be responsible? Focus groups: GTD contractor. Questionnaires: training unit.

2. Acquire skills, knowledge, and attitudes (SKA)
Why is this being monitored? To measure the increased capacity of trainees as a necessary precondition to improved performance in the workplace.
What indicators will be used? Generic: degree of change in SKA (pre/post). Specific: to be determined by the RP team.
How will it be measured? Training providers’ assessments of trainees; exit interviews and questionnaires.
When will it be measured? Upon completion of training.
Who will be responsible? Training provider and training unit.

3. Apply SKA and achieve the training activity objective
Why is this being monitored? To measure the improved performance of trainees as related to key institutional performance requirements.
What indicators will be used? Generic: percentage of trainees applying elements of training; percentage of action plans executed; percentage of trainees with increased responsibilities. Specific: to be determined by the RP/SO team.
How will it be measured? Trainee questionnaires; interviews and focus groups with supervisors; on-site observations.
When will it be measured? Within six months of completion of training.
Who will be responsible? Training unit with the RP team.

4. Contribute to the Results Package (RP) objective
Why is this being monitored? To measure progress toward improved institutional performance as a key intermediate result leading to an SO.
What indicators will be used, and how and when will they be measured? To be determined by the RP team.
Who will be responsible? RP team.

5. Contribute to the Strategic Objective
Why is this being monitored? To measure progress toward the SO.
What indicators will be used, and how and when will they be measured? To be determined by the SO team.
Who will be responsible? SO team.

Source: Human Capacity Development Activity Design. A HERNS Report. USAID/El Salvador, January 1997.

HERNS evaluation specialists underscore the importance of specifying who needs to know the
results and how these will be recorded. The results on how training contributes to
performance improvement in institutions can be used by the SO/RP teams and reported in
R4s. Results about the training activity itself assist managers and training contractors in
making adjustments in the design of future programs.

Change Agents

An earlier model was developed by Aguirre International to evaluate training under the CLASP2 program. Given the program’s goal of selecting and training young leaders in leadership and substantive skills, the evaluators identified the participants as change agents and documented their individual achievements in an ever-widening circle of influence. Change agents are defined as "individuals who have the capacity and motivation to initiate or effectively support sustainable development through their own actions and by their influence on the actions of others" (Training Impact and Development, 1994). The further a person moves from the center of the circle, however, the more difficult it is to quantify the contributions or to link them to the training experience. This model was developed prior to reengineering and does not reflect the current focus on results-oriented training. Nevertheless, it provides a useful conceptual framework for assessing the unanticipated consequences of training, such as those described under the Multiplier Effect activities in the economic growth and democracy and governance case studies.

2 Caribbean and Latin America Scholarship Program, the primary source of funding for participant training programs in Latin America from 1985 to 1995.

Training Indicators

One of the core reengineering values—managing for results—calls for establishing clearly defined strategic objectives and developing performance indicators to track and measure progress. "Performance indicators are measures that describe how well a program is achieving its objectives" (TIPS #6, Selecting Performance Indicators). They are the direct measure of the intermediate result and, consequently, are indispensable tools in determining the extent to which changes or improvements have occurred. Appropriate and carefully articulated indicators provide the mechanism to monitor progress, measure achievement, and collect, analyze, and report results.

If we view training as a tool that contributes to the achievement of a strategic goal, then training indicators must be derived from the technical intermediate results to which the training activity has been linked. Training indicators will allow SO teams to establish the relationship between training and the results expected and determine the value of training as a tool for achieving an objective.

Developing quality indicators is a challenging task. It requires considerable deliberation and refinement to ensure that they meet the established criteria. Representatives from the various units involved in training, including the participants and supervisors, need to participate in the process and agree on the results that will be measured. It should be a collaborative effort that brings consensus on the final indicators that will be selected.

Criteria for Developing Indicators

Indicators specify the type of data that should be collected, along with the unit of measurement to
be used. Only those indicators that can be measured at regular intervals, given the time and
resources available, should be selected. The criteria presented below define the characteristics of
quality indicators and the application of these criteria to training indicators.3

Direct - A direct indicator measures the result it is intended to measure. The indicator should not be
stated at a higher or lower level than the result being measured.
For training, this criterion means that the indicator measures only one specific job improvement.

Objective - The indicator is precise, clear, and understood by everyone. It measures only one
change at a time and there is no ambiguity as to the type of data that needs to be collected.
For objective training indicators, there is no doubt among all the groups involved—SO teams,
participants and supervisors—as to what job improvement is being measured.

Adequate - All the indicators selected should together measure the result adequately. The actual
number of indicators needed depends on the nature of the result, the level of resources available to
monitor performance, and the amount of information needed to make informed decisions.
Adequate training indicators determine improvement in job or organizational performance that can
be traced to the training.

3 Adapted from TIPS #6, Selecting Performance Indicators, 1996.

Quantitative and Qualitative - Quantitative indicators provide measures in numbers or percentages,
such as reduced turnover or increased efficiency. Qualitative indicators describe changes that cannot
be easily measured, such as improved attitude or morale. They often provide background
information useful in explaining or analyzing the numbers or percentages.
Quantitative training indicators would measure increased productivity or efficiency, while
qualitative indicators refer to changes in behavior.

Practical - The data can be collected with sufficient frequency and in a cost-effective manner.
Reengineering guidance states that between 3 and 10 percent of total program resources should be
allocated for performance monitoring and evaluation.
For training, practical means that SO teams or supervisors can administer easy-to-use monitoring
tools at regular intervals and at a reasonable cost.

Reliable - Refers to the reliability and validity of the data, i.e., if the procedures used to collect and
analyze the data were duplicated, the results would remain constant.
Reliable training indicators use valid surveys or questionnaires and the information collected can be
easily verified.

Given the multiplicity of training applications and approaches, it is advisable first to develop
a series of indicators and then select from that list the ones that will best measure the results.
When undertaking this activity, however, one should keep in mind the admonition provided
by Administrator Atwood in a recent communication:

These tools... should not be used in a rigid, mechanistic manner, stifling field
creativity or ignoring the reality that performance must be interpreted differently in
different settings. They should promote our knowledge of development and our ability
to assess whether we are making progress, not limit it.
(USAID General Notice, 2/7/97).

With these issues in mind, the next two tables were designed as tools to assist SO teams in
developing useful and appropriate training indicators. Table IV summarizes the criteria stated above
for assessing the validity of generic indicators and the application of these criteria to training
indicators. Table V provides examples of good and poor indicators judged against the established
set of criteria and defined according to Kirkpatrick’s evaluation levels. It is important to underscore,
however, that Level III and IV indicators require more rigorous evaluation methods and that
attribution of the improvement to the training experience needs to be specified.

TABLE IV Criteria for Assessing Generic and Training Indicators

Direct
Generic: Measures the result it is intended to measure.
Training: The indicator measures the specific job improvement in question.

Objective
Generic: It is precise, clear, and widely understood.
Training: There is no doubt among the SO team, participants, and supervisors as to what job improvement is measured.

Adequate
Generic: It measures the result needed to make informed decisions.
Training: The indicator determines improvement in job or organizational performance that can be traced to training.

Quantitative and Qualitative
Generic: The indicator is quantitative (numerical) or qualitative (descriptive).
Training: Quantitative indicators show increased productivity or efficiency; qualitative indicators show changes in behavior or attitudes.

Practical
Generic: Data can be collected in a timely and cost-effective fashion.
Training: SO teams or supervisors can regularly administer practical, cost-effective monitoring tools, such as surveys or questionnaires.

Reliable
Generic: Uses valid methods of data collection.
Training: Surveys or questionnaires used are reliable, and data can be easily verified.

Table V, below, provides examples of good and poor training indicators judged against this set of criteria.

TABLE V Examples of Training Indicators

Each indicator is classified by Kirkpatrick evaluation level and rated against the criteria defined in Table IV.

"Teachers are using locally relevant curriculum." (Level III)
Direct: Yes. Objective: Yes. Adequate: States improvement; indicate other contributing factors. Quantitative/Qualitative: Quantitative; should state the percentage of teachers. Practical: Yes. Reliable: Yes.

"Five ADR mechanisms created." (Level III)
Direct: Yes. Objective: Yes. Adequate: States improvement; indicate other contributing factors. Quantitative/Qualitative: Quantitative. Practical: Yes. Reliable: Yes.

"New performance appraisal systems established." (Level IV)
Direct: Yes. Objective: Yes. Adequate: States improvement; indicate other contributing factors. Quantitative/Qualitative: Quantitative. Practical: Yes. Reliable: Yes.

"80% reduction in the amount of time it takes to issue a license." (Level III)
Direct: Yes. Objective: Yes. Adequate: States improvement; indicate other contributing factors. Quantitative/Qualitative: Quantitative. Practical: Yes. Reliable: Yes.

"Environmental impact statements carried out in 75% of projects." (Level III)
Direct: Yes. Objective: Yes. Adequate: States improvement; indicate other contributing factors. Quantitative/Qualitative: Quantitative. Practical: Yes. Reliable: Yes.

"75% of projects are modified to comply with environmental impact statements." (Level IV)
Direct: Yes. Objective: Yes. Adequate: States improvement; indicate other contributing factors. Quantitative/Qualitative: Quantitative. Practical: Yes. Reliable: Yes.

"75% of employees report improved morale." (Level III)
Direct: Several elements comprise improved morale. Objective: May involve improved communication, teamwork, or less absenteeism. Adequate: States improvement; indicate other contributing factors. Quantitative/Qualitative: Qualitative (indicates change in attitude). Practical: Difficult to administer regular evaluation tools. Reliable: Difficult to verify data.

"Increased number of child survival practices used." (Level IV)
Direct: No; should be broken down into the rate of ORT use, the percentage of children vaccinated, and the number of cases of diarrhea reported. Objective: No; the exact change or improvement is not specified, and the improvement is too broad. Adequate: Linkage to training will have to be demonstrated. Quantitative/Qualitative: The quantitative percentage needs to be specified. Practical: No. Reliable: No; data cannot be verified easily.
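Where an SO team screens many candidate indicators, the Table IV criteria can be captured as a simple checklist. The sketch below is illustrative only (the class name, fields, and acceptance rule are assumptions of this document, not part of the TIPS guidance), but it shows how ratings like those in Table V might be recorded systematically:

```python
from dataclasses import dataclass

@dataclass
class TrainingIndicator:
    statement: str
    kirkpatrick_level: int  # 1 = reaction, 2 = learning, 3 = behavior, 4 = results
    direct: bool     # measures the specific job improvement in question
    objective: bool  # no ambiguity about what is being measured
    adequate: bool   # improvement can be traced to the training
    practical: bool  # can be monitored regularly at reasonable cost
    reliable: bool   # collected data can be verified

    def acceptable(self) -> bool:
        """An indicator is retained only if it satisfies every criterion."""
        return all((self.direct, self.objective, self.adequate,
                    self.practical, self.reliable))

license_time = TrainingIndicator(
    "80% reduction in the amount of time it takes to issue a license",
    kirkpatrick_level=3, direct=True, objective=True,
    adequate=True, practical=True, reliable=True)
print(license_time.acceptable())  # True: keep for the monitoring plan
```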

Additional Monitoring and Measurement Tools

Ways to Isolate the Effect of Training on Performance

The definition adopted for training impact states that we will measure improvements in job performance that can be directly attributed to training. Oftentimes, we observe that significant changes have taken place following a training event; but because training is only one of several inputs that influence results, we cannot attribute 100 percent of the improvement to the training experience. When reporting results, training specialists have found it especially challenging to isolate improvements from other nontraining variables, particularly since most evaluations are not designed to do so.

If the objective of the program was to improve performance in a specific area, and improvement can be recorded, Kirkpatrick would suggest that we should be satisfied with evidence instead of proof (see Kirkpatrick Level IV guidelines). Other evaluation specialists, however, have proposed ways of isolating the effects of training by using trend analyses, control groups, or forecasting. While these methods present persuasive arguments for isolating performance improvements, they either require an unreasonable level of effort and resources or would not be feasible to conduct in a development context. (However, see Witness Schools under the education case study, USAID/Morocco, for an example of a control group.)

The method described below—developed by Jack Phillips—is presented here because of its practicality and applicability. It represents a cost- and time-saving technique that is applied to Level IV evaluation. The data can be easily gathered, interpreted, and reported by the supervisors or the participants themselves. The author suggests developing a user-friendly form that asks respondents to estimate the percentage of the improvement derived from training, and then factoring in a confidence level. For instance, if a participant estimates that 80 percent of an improvement is due to training and is 90 percent confident about that estimate, multiply 80% x 90% = 72%, which indicates the overall confidence level. Multiply this figure by the degree of the improvement in order to isolate the portion attributable to training.

It would not be a practical exercise, however, to isolate the effects of training without having
collected data at the various levels of evaluation. This process begins once the participants
have had enough time to apply the new skills in the workplace and sufficient information can
be gathered on the results and improvements achieved.

Table VI provides a tool for isolating the effect of training by factoring in a confidence level
and indicating other variables that may have contributed to the improvement.

TABLE VI How to Isolate the Effect of Training on Performance Improvement

Improvement: 85 percent reduction in customer complaints
Estimated percentage of improvement derived from training: 80%
Estimated confidence level: 90%
Overall confidence percentage: 72% (80% x 90%)
Amount of improvement: 85%
Portion of improvement derived from training: 61% (72% x 85%)

Issues to consider: Indicate the basis for each estimate, note other factors that contributed to the improvement, and record the responses alongside the figures above.
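A minimal sketch of this confidence-weighting arithmetic, using the Table VI figures, follows; the function and variable names are illustrative, not part of Phillips’s published method:

```python
def portion_from_training(improvement, share_from_training, confidence):
    """Confidence-weighted estimate of the share of an observed improvement
    attributable to training: the participant's attribution estimate is
    discounted by his or her confidence in that estimate (after Phillips)."""
    overall_confidence = share_from_training * confidence  # 80% x 90% = 72%
    return improvement * overall_confidence

# Table VI example: an 85 percent reduction in customer complaints, with
# 80 percent attributed to training at a 90 percent confidence level.
portion = portion_from_training(0.85, 0.80, 0.90)
print(f"Portion of improvement attributable to training: {portion:.0%}")  # 61%
```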

To increase the reliability of this approach, management can review and confirm participants’ estimates. Supervisors may be aware of other factors not related to training that caused the improvement, and their estimates may be combined with those of the participants. Likewise, depending on the circumstances, estimates may be obtained from customers or subordinates. Granted, we are dealing with estimates, which carry an undeniable level of subjectivity; Phillips would argue, however, that the estimates come from a "credible source, the people who actually produce the improvement" ("Was It the Training?", Training and Development, March 1996).

Assigning a Monetary Value to the Improvement

The approach proposed in this section involves thinking of the improvements gained through training in financial terms. There are numerous instances in which assigning a monetary value to the improvement would be neither a practical nor a feasible exercise. However, if the improvement results in less time spent accomplishing a task, fewer errors or accidents, or reduced turnover, then financial benefits can be calculated in terms of staff time saved, lower fees paid by the organization, or increased production.

The two examples presented here illustrate how to calculate the value of the improvement and
how to calculate savings in overtime.

To convert the value of the improvement into monetary terms, concentrate on one indicator at a time. For instance, if the improvement consists of reducing the amount of time it takes to issue licenses, first determine the baseline—the number of licenses issued per week before training—along with the employee’s weekly salary. Show the difference in the number of licenses issued after training and calculate the percentage of the improvement. Multiply the employee’s salary by the percentage of the improvement to obtain the value of the improvement. (See Table VII)

To calculate savings in overtime, first indicate the target: a 50 percent reduction in overtime in a six-month period. Then determine the baseline, i.e., the amount paid in overtime prior to training. Establish the employee’s hourly salary and multiply it by the number of overtime hours worked per month to arrive at the monthly cost. Follow the same procedure for the six-month period being measured after training and calculate the savings. When presenting these results, a comparison should be made between the target established and the results achieved. (See Table VIII)

Converting results into monetary terms provides an additional way of measuring the benefits
of training to an organization. As stated in preceding sections, training begins as a response to
a need or problem in an organization. By calculating the value of the improvements, we bring
training full circle to the needs and problems it was meant to address. In times of reduced
funding, SO teams can use this data to decide which training activities should be funded, how
to manage resources more efficiently, or to justify increased expenditures on training.

TABLE VII How to Calculate the Value of Increased Production

Baseline - Number of licenses issued BEFORE training
Licenses issued per week: 14
Employee’s weekly salary (includes benefits): $175

Results - Number of licenses issued AFTER training
Licenses issued per week: 20
Employee’s weekly salary: $175

Value of increased production
Difference in number of licenses issued per week: 6
Percentage of improvement: 42% (6 ÷ 14)
Value of improvement: $73.50 per week ($175 x 42%)

A 42% improvement has a value of $73.50 per week.
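As a rough cross-check of the Table VII arithmetic, the short sketch below recomputes the figures in Python; the function and parameter names are illustrative, not part of any USAID tool:

```python
def value_of_improvement(units_before, units_after, weekly_salary):
    """Weekly dollar value of increased output, following the Table VII
    procedure: percentage improvement over the baseline, applied to the
    employee's weekly salary (benefits included)."""
    improvement = (units_after - units_before) / units_before  # 6 / 14
    return improvement, improvement * weekly_salary

pct, value = value_of_improvement(units_before=14, units_after=20, weekly_salary=175)
print(f"Improvement: {pct:.1%}; value: ${value:.2f} per week")
# Prints 42.9% and $75.00; Table VII truncates the percentage to 42%,
# which yields the $73.50 figure shown above.
```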

TABLE VIII How to Calculate Savings in Overtime

TARGET: 50% reduction in overtime in a six-month period

Baseline - Amount paid in overtime BEFORE training
Employee’s hourly salary: $5
Monthly overtime hours worked before training: 30
Monthly amount paid in overtime: $150 ($5 x 30)
Amount paid in a six-month period: $900 ($150 x 6)

Results - Amount paid in overtime AFTER training
Employee’s hourly salary: $5
Monthly overtime hours worked after training: 10
Monthly amount paid in overtime: $50 ($5 x 10)
Amount paid in a six-month period: $300
Six-month savings in overtime: $600 ($900 - $300)

Comparison of target established with results gained
Savings (six months): $600
Percentage of savings: 66% ($600 ÷ $900)
Exceeded target by: 16 percentage points
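The overtime calculation can be checked the same way; this sketch simply mirrors the Table VIII steps, again with illustrative names:

```python
def overtime_savings(hourly_rate, hours_before, hours_after, months, target):
    """Overtime savings over a measurement period, per the Table VIII
    procedure, compared against the reduction target."""
    paid_before = hourly_rate * hours_before * months  # $5 x 30 x 6 = $900
    paid_after = hourly_rate * hours_after * months    # $5 x 10 x 6 = $300
    savings = paid_before - paid_after                 # $600
    pct = savings / paid_before                        # ~0.667
    return savings, pct, pct - target

savings, pct, margin = overtime_savings(5, 30, 10, months=6, target=0.50)
print(f"Savings: ${savings} ({pct:.0%}); target exceeded by {margin:.0%}")
# Prints $600 (67%), exceeding the 50% target by 17 points; Table VIII
# rounds down to 66% and 16%.
```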

Comments

The previous discussion on evaluation models, indicators, and measurement tools underscores
that the monitoring and evaluation functions are not isolated academic tasks. They are integral
and essential components of training activities, beginning with the needs assessment. The
benefits derived from monitoring training progress and measuring results are significant. The
process allows us to account for the resources expended and justify the investment made;
provide a mechanism for regular revision and improvement of designs; and demonstrate that
carefully planned programs constitute an effective tool for achieving results.

When analyzing and reporting training results, we also need to gauge the level of commitment that the trainees and their supervisors have to the training, as well as the climate that the participants will find in the workplace upon return. The extent to which trainees are given the opportunity to apply the new skills, and the level of encouragement and support they receive, are important factors to consider before deciding whether the training has produced the expected results or whether the indicators were met.

Before deciding on the most appropriate evaluation model at any level, it is first important to
clarify the need for an evaluation as well as its audience. Agree upon the types of questions
that will best elicit the responses desired; carefully decide on the most appropriate evaluation
tool; and determine who in the organization should be involved in the evaluation surveys or
interviews.

Given the rapid organizational changes that some institutions experience, a single report or evaluation will probably become obsolete in a short period of time. A more credible approach is to develop a program that includes several monitoring and measurement mechanisms at different levels over regular intervals, using a variety of data collection methods and sources. The more approaches we use, the greater the reliability and credence given to the findings.

Training Activities in Support of Strategic Objectives

Sector Case Studies

This section features examples of training programs designed and carried out to support Mission
strategic objectives. They are offered as case studies to illustrate the variety of approaches and
creative applications that SO teams have used to implement the training function.

The following criteria guided the selection of the case studies: the objectives of the training show
direct linkage to the intermediate result(s); training was offered to a critical mass of carefully
selected participants who are in positions to effect change; involvement from the outset and at all
stages of the entities affected by the results sought; follow-on activities; and results that show
achievement of the intermediate result.

While each case study is presented in the format that best suits the training program described,
the overall areas covered in each study include: a background piece describing the situation in the
country, followed by an explanation of the training model, including training objectives and
selection criteria, the monitoring and measurement tools developed, and a summary of the results
achieved.

The case studies were prepared with the assistance of the respective Mission staff and/or training
contractors who provided the information and data reported. Their interest, assistance, and
collaboration in this effort have been invaluable. They enthusiastically shared their training
designs and plans, provided thoughtful insights and observations, maintained regular
communication with the author, and reviewed draft copies.

Following are the sectors represented in the case studies:

Economic Growth - Privatization of industries and agriculture, USAID/Tajikistan
Prepared with the assistance of Patrick Collins, NET Project, Academy for Educational Development, and Brooke Isham, USAID/Almaty.

Democracy and Governance - Administration of justice, USAID/Bolivia
Prepared with the assistance of Beatriz O’Brien, USAID/Bolivia.

Health - Improved quality with equity in health, USAID/El Salvador
Prepared with the assistance of Jaleh Torres, USAID/El Salvador, and Henry Kirsch, Development Associates.

Education - Increased girls’ basic education, USAID/Morocco
Prepared with the assistance of Monique Bidaoui, USAID/Morocco, and Meghan Donahue, AMEDIST.

Training - Enhancing the human resource base of nongovernmental and governmental organizations, USAID/Namibia
Prepared with the assistance of Leslie Long and Bonnie Mullinix, World Education.

ECONOMIC GROWTH
Privatization of Industries and Agriculture
in Tajikistan

USAID/Central Asia

Background

USAID has a regional Central Asia office in Almaty, Kazakhstan with one of its satellite
offices located in Tajikistan. The training program described in this case study was designed
for Tajik participants and represents USAID/Central Asia’s focus on economic restructuring
as the foundation for developing the private sector.

Following its independence from the Soviet Union in 1991, Tajikistan faced grave economic and social problems. Serious political and ethnic differences among the various factions led to civil war; numerous industries closed; unemployment and inflation were high; and basic goods and services, such as food, transportation, and public utilities, became dangerously scarce. Moreover, the human resource skill base was undermined by the large emigration of ethnic Russians and other non-Tajik groups.

Faced with this situation, the government of Tajikistan sought to restructure its economy through the privatization of targeted industries. The thrust was to stimulate economic growth by facilitating the transition from a centrally controlled to a market-based economy.

The USAID/CAR economic growth SO team assists the Tajik government as it defines and articulates its role in a market-led economy and formulates economic policies that promote growth and stability.

The USAID strategic objective for economic restructuring reads:

Foster the emergence of a competitive market-oriented economy in which the majority of the economic resources are privately owned and operated.

The intermediate result is:

Improved, more sustainable business operations.

Training Model

In order to assist Tajikistan’s transition to privatized industries, government officials, USAID staff, and training specialists designed three U.S.-based economic restructuring training programs that took place in 1994, ranging from five to six weeks each. The 27 participants selected for these programs were senior executives, senior government policymakers, and mid-level officials who represented a cross section of ministries and government agencies that have played a critical role in restructuring Tajikistan’s economy.

Although each of the three training programs was designed for a specific group of participants with a separate set of activities, the programs were conceptualized as an integrated whole. Program content progressed from general theory and exposure to U.S. policies and practices, to specific aspects of implementing economic procedures, to a final session on overall management structures and legal underpinnings. Participants ranged from mid- to senior-level officials, depending on the content of the training program. The last session served to wrap up what had been covered in the previous sessions so that policymakers at different levels of government could reach consensus on the changes that needed to be implemented. A degree of flexibility was built into the design of the program so that modifications could be made based on feedback from the participants.

The objectives for each of the training programs, along with the groups of government
officials who participated, are indicated below. Because sections I and III were designed for
senior-level officials, the objectives were the same.

Economic Restructuring I

Participants: Five senior executive officials

Objectives: To assist participants in defining an appropriate role for their government in a market-based economy by:
a) examining differences between market and command economies
b) focusing on how the government can establish a supportive environment through fiscal and monetary policy

To enhance participants’ leadership skills to bring about economic and political change that promotes sustainable economic growth and stability by:
a) examining leadership and managerial characteristics of exceptional leaders
b) allowing participants to meet with these types of leaders and discuss how they have effected positive change

Economic Restructuring II

Participants: Twenty mid-level officials from various ministries and government organizations

Objectives: Examine the advantages and disadvantages of privatization

Review case studies of revitalized industries, focusing on the role of the public sector, investors, and employees

Explore the government’s role in using fiscal and monetary policy, as well as regulation, in facilitating economic growth

Identify strategies for managing change

Economic Restructuring III

Participants: Two senior government officials selected from the president’s office

Objectives: Same as Economic Restructuring I

Results

An essential component of the training design was an explicit description of the intended results, which spelled out how participants would be able to apply the knowledge and skills acquired during their training and how the successful application of what was learned would support achievement of the relevant USAID strategic objective.

Individual programs were evaluated using a variety of qualitative and quantitative methods.
These included arrival and exit questionnaires, weekly interviews with groups and trainers to
monitor progress, and site visits to selected programs. In addition, debriefings were conducted
with returned participants, along with follow-on questionnaires. The following success stories
from participants who attended the economic restructuring programs reveal the types of bold
and innovative privatization and business measures undertaken that directly support
achievement of the intermediate result:

One participant fulfilled a contract that he signed with an American partner during training. The contract was designed to assist U.S. companies investing in Tajikistan and provided marketing, banking, and management training for Tajik specialists. In addition, the participant founded two wine and flour businesses.

Under the direction of another participant, privatization of a state corporation, the Center for Electric Assembling, began with 20 percent of the stock transferred to employees as of January 1996. Complete employee ownership is planned for the end of 1997. The participant was also instrumental in the creation of two companies, a U.S.-registered import-export company established with Russian partners and another involved in the transportation of combustible materials.

One participant planted a fruit orchard with a potential annual output of 1,000 tons. The irrigation systems installed for the orchard also provide drinking water for the population of the nearby valley.

Another participant developed a proposal to establish an information agency, Asia-Plus. He approached several private and domestic organizations for funding and obtained a grant from Mercy Corps International. The participant also set up an office and hired staff.

One participant developed a proposal to create a fund to conduct economic and
political analysis for government officials in support of political and economic
reform. The fund has been operating for about three years.

Follow-on questionnaires, typically administered after participants have been back from
training for at least six months, provide additional quantitative data and represent key
indicators of project success. The following statistics were reported in the September 1996
issue of the NIS Highlights, the monthly newsletter of the NIS Exchanges and Training
Project:

95 percent reported that they have used the knowledge gained during training
to effect policy decisions at the organizational level

88 percent reported that they have effected policy decisions that support the
further development of a free market economy

80 percent reported that they have effected policy decisions that support the
further development of a democratic system of government

97 percent reported that they have shared the ideas and techniques acquired in
training with their colleagues and supervisors

Multiplier Effect

Training programs customarily have multiplier effects, and participants often engage in a
variety of experiences—lectures, seminars, interviews, and writings—that they share with
fellow professionals.

One multiplier effect activity pertaining to this training program, however, stands out: Five Tajik senior government officials from the President’s Board on Economic Reform and from the Strategic Research Center who had participated in training traveled to Almaty to observe economic and privatization reforms in Kazakhstan. Upon their return, they organized a series of five-day seminars, which took place concurrently in the Leninabad and Khatlon oblasts.

The topics included: infrastructure of a market economy, developing credit mechanisms, state
support to small entrepreneurs, investment development, and economic restructuring.
Hundreds of local government officials, state managers, and small- and medium-size
businessmen attended these sessions, held in a variety of settings, such as government offices,
universities, factories, plants, and farms.

This event accomplished two objectives: The instructors helped institutionalize the training
they received by training others; and a wide range of professionals gained a comparative view
of how economic restructuring has been implemented in the United States and in Kazakhstan.

The success stories and results mentioned above are representative of effective initiatives undertaken to restructure the Tajik economy and revitalize industries. The initiatives focused on the role of the public sector, investors, and employees. Training prepared the participants to develop economic initiatives and provided them with the skills and resources to establish innovative mechanisms to bring about changes that promote economic growth.

"The guiding factor for successful planning is a focus on the intended use for the training once an individual has returned home."
- NIS Exchanges and Training Project, NIS Exchange Highlights, September 1996

DEMOCRACY AND GOVERNANCE
Administration of Justice

USAID/Bolivia

Background

Since 1995, USAID/Bolivia’s training and follow-on activities have been designed and carried out in direct support of the Mission’s strategic objectives.
USAID/Bolivia’s democracy SO team provides technical assistance to several key institutions
to help them develop more efficient, accessible, and transparent procedures. The SO team
determined that a critical mass of administration of justice (AOJ) professionals required
targeted training in order to effectively carry out important reforms in the sector.

The USAID strategic objective for democracy reads:

Social base of democracy broadened and governance strengthened.

The three intermediate results are:

1. Key elements of rule of law become more transparent, efficient, effective, accountable, and accessible.

2. National representation becomes more responsive to constituent needs and demands.

3. Local governments effectively respond to citizen needs and demands.

Training Model

Through 1995, most of the training in the sector took place in-country. When results
frameworks and intermediate results were developed, it was determined that exposure to the
U.S. justice system was critical to acquaint Bolivian professionals with different AOJ
mechanisms, procedures, and techniques. The training activity described in this case study
was linked to IR 1, and took place between September 1996 and March 1997.

A generic training model applied to all training activities was developed for the Bolivian
Peace Scholarship Program under CLASP (Caribbean and Latin America Scholarship
Program, USAID’s major funding source for training activities conducted in the region).

The five key elements in the training model are as follows:

1. Training objectives are defined with a clear focus on results. SO teams and
partners work together to define the training program and to select participants.

2. Selection of candidates takes into account their role and their potential
to become change agents in support of results sought in their sector.

3. Development of action plans and team building take place during predeparture orientation.

4. Intense and concentrated short-term training programs. Creating a critical mass of trained individuals through group training has been the preferred training model. Tailor-made training is designed to provide the best hands-on and practical programs to meet specific trainee/country requirements. Training programs also include a broad range of cultural and human interactions.

5. Follow-on programs promote, facilitate, and encourage the multiplier effect, results-achieving activities, and networking among returned participants.

AOJ Training Activity under Intermediate Result #1

1. Training objectives. Themes and areas of specialized training were defined according to the requirements outlined under IR 1: key elements of rule of law become more transparent, efficient, effective, accountable, and accessible. The specific objectives were:

Oral prosecutorial training
· To equip participants with skills and hands-on experience related to procedures and techniques under the oral prosecutorial criminal system, such as jury selection, evidence collection, interrogation, and public defense.
· Upon conclusion of training, the participants were expected to be able to apply the concepts and techniques learned in the implementation of the oral prosecutorial system.

Alternative dispute resolution (ADR) training
· To provide participants working in ADR in the public and private sectors with hands-on experience in the conciliation, mediation, and arbitration techniques used in the United States.
· Upon conclusion of training, the participants were expected to introduce into their institutions the mechanisms, procedures, and tools used in ADR to reduce the burden of unnecessary trials of cases that could have been resolved via mediation.

2. Selection of candidates. Selection of judges, prosecutors, public defenders, and
mediators was made in coordination with government partner institutions. Key
individuals in all nine departments of Bolivia who advocate reform and had the
potential to become change agents in the implementation, promotion, and
efficient management of AOJ reforms were identified.

3. Predeparture orientation. During predeparture orientation (two to three full days per group), participants had a chance to become acquainted and learn to think as a team, provide their expert views, assess the problems in the justice sector, and develop action plans.

4. Intense and concentrated training. U.S. universities designed three short-term courses that addressed specific Mission requirements in AOJ subject areas. In addition, a member of the training team visited Bolivia to become familiar with the participants, the AOJ environment in the country, and USAID staff. The participants were trained in the United States as a group, had on-site counterparts, participated in hands-on activities, and engaged in numerous cultural activities that exposed them to a wide range of U.S. democratic practices.

5. Follow-on program. Debriefing information from returned participants was used to improve or make adjustments to upcoming training programs. All the participants returned to their former jobs, and a few received promotions.

Results

Results achieved by participants in ADR and oral prosecutorial training:

Legislation on arbitration and conciliation was approved in March 1997; it includes a special article enabling the creation of Conciliation Centers throughout the country. This article was written and promoted for inclusion in the legislation by the ADR training participants.

A conciliators’ manual was developed by three of the returned participants and distributed extensively throughout urban and rural areas.

Support committees were created in three departments of the country to promote the use of ADR.

The Bolivian criminal process calls for the use of oral prosecutorial
mechanisms, but judges and prosecutors are not familiar with their use.
Following training, one judge resolved 25 cases in only one month using these
techniques.

Multiplier Effect Activities

The following activities took place within two to four months after training:

Four seminars on the oral prosecutorial system were conducted, training an additional 420 judges, prosecutors, and public defenders.

Three workshops were conducted on ADR concepts and techniques, training an additional 90 specialists.

One judge published a series of 12 articles in a major newspaper describing the contents, methodology, procedures, and applicability of the oral prosecutorial system in Bolivia.

A prosecutor produced a training manual describing and analyzing the major features of the oral prosecutorial system and its applicability to Bolivia.

It should be noted that returned participants have achieved concrete results and implemented
several multiplier-effect activities in less than six months following their training.
The Mission anticipates that in upcoming months, once the Criminal Procedures Code is
passed, participants will increasingly apply oral prosecutorial system procedures. The
democracy SO team maintains regular contact with the stakeholders and partners who
sponsored the ADR and AOJ trainees to promote, encourage, and monitor the application of
the skills acquired during training.

HEALTH
Improved Quality with Equity in Health

USAID/El Salvador

Background

El Salvador’s centralized health service delivery system concentrates the bulk of its services
in the San Salvador metropolitan area, where the majority of physicians from the public and
private sectors practice medicine. The rural population, by contrast, lives in areas where medical services are relatively inaccessible.

Traditionally, health service providers in the public and private sectors (particularly NGOs)
have not worked together and have tended to mistrust one another. A fundamental problem of
the public health system has been its emphasis on curative rather than preventive medicine.
Many NGOs, on the other hand, have well established community outreach and public
education programs. This, combined with the fact that NGO personnel live in the
communities in which they serve, makes them powerful proponents and instruments of
preventive health care. The Ministry of Public Health and Social Assistance (MSPAS) staff
has not taken full advantage of what these NGOs have to offer to El Salvador in terms of
reform of the national health care service delivery system.

Although the majority of managers and supervisors working in the public health care system
are qualified and educated, the government realized that to implement system reforms it must
be able to count on a critical mass of managers who are well versed in new concepts and
techniques that will support a complete modification of the system.

The USAID strategic objective for health reads:

Sustainable improvements in health of women and children achieved.

The three intermediate results are:

1. Increased use of appropriate child survival practices and services.

2. Increased use of appropriate reproductive health practices and services.

3. Enhanced policy environment to support sustainability of child survival and reproductive health programs.

Training Strategy

In consultation with MSPAS, the Salvadoran Social Security Institute (ISSS), health NGOs,
and training specialists, a comprehensive U.S. training program was designed for 110
participants in five separate groups. The participants were drawn from the MSPAS, ISSS, and
NGOs; the majority of them held mid-level management positions—regional supervisors,
financial managers, health unit administrators, division chiefs, project coordinators, and head
nurses. The thrust of the program was to provide participants with the technical skills
necessary to design and implement programs to improve the delivery of health services
throughout the country. Trainees were exposed to different models of the administration of
health services, successful and unsuccessful reform efforts, and effective coordination
mechanisms carried out by the public and private sectors.

Specifically, the program focused on:

· Role of health service organizations
· Decentralization of services
· Human resources management
· Financial resources management
· Cost-effective health services
· Participatory mechanisms in health service delivery
· Advances in areas of health care reform
· Integrated management information systems
· Health economics

Eight Steps that Modify Training Activities to Meet Reengineering Guidelines

The following eight steps indicate the sequence of events identified by the Mission that need
to take place during the training process:

- Strategic planning
- Needs assessment
- Specific purpose of training
- Training design
- Selection of participants
- Training
- Follow-on
- Monitoring and evaluation

The chart on the next page illustrates the interconnection of the eight steps and their
corresponding activities.

Note: Chart was translated from Spanish by the author: Ocho pasos en los que las actividades
de capacitación deben cambiar para responder a la reingeniería.

EIGHT STEPS THAT MODIFY TRAINING ACTIVITIES TO MEET REENGINEERING GUIDELINES
USAID/El Salvador

The eight steps form a continuous cycle, each feeding the next, with monitoring and evaluation feeding back into strategic planning:

1. Strategic Planning - Involvement of stakeholders; strengths, weaknesses, and limitations; options; goals and results.

2. Needs Assessment - Performance assessment; gaps in skills, knowledge, and attitudes (SKA).

3. Specific Purpose of Training - Institutional goals; agreements with stakeholders address intermediate results.

4. Training Design - Customized; adult training techniques; experience in total learning; action plans; innovative, participatory, and dynamic methodologies.

5. Selection of Participants - Directed to the target population; geared to several hierarchical levels; development of change agents; critical mass.

6. Training - Follows the design; in-country, third country, in the U.S., or at the workplace; integration of leadership skills.

7. Follow-on - Action plans; multiplier effect; continuous education; planned at the beginning.

8. Monitoring and Evaluation - Performance indicators are determined during the strategic planning phase; continuous monitoring.

All of these steps require shared responsibility and mutual collaboration between the training unit and the strategic objective teams.

Comments on the eight-step design:

Training is effective if planned with full participation and involvement of all stakeholders (trainee, institution, institutional contractor, results package team, and the training unit).

Training is more likely to bring desired results if the follow-on component is planned at the design stage.

Training needs to be fully integrated into the local institutional plans.

Training, if well designed, implemented, applied, and transferred, can be an effective and powerful tool whether used for an academic institution, an NGO, a community group, or a government institution.

Training can be tested on a pilot basis, modified, and expanded to cover an entire government sector or institution.

Lessons learned can be capitalized upon and applied to new initiatives with modifications, improvements, expansions, and creativity.

Human capacity development is an investment that is intended to remove individual and institutional performance constraints and contribute to the achievement of results and/or objectives. The final goal is sustainable development in the institutions the Mission works with, to the benefit of the Salvadoran people.

Results

Specific examples of changes and improvements to public health care services as a result of
the training program include:

A doctor at a medical unit reports that the hospital's administrators now rely heavily on feedback from both employees and clients to monitor and evaluate service quality. The hospital has installed suggestion boxes for client evaluation.

Another doctor started a public relations department, something no other ISSS facility has. She opened a desk next to the main entrance, so that staff she personally trained are able to respond to client requests or problems as soon as they enter the hospital.

One of the MSPAS doctors has set up a visiting nurses system through his newly established close working relationships with community leaders and local NGOs.

The director of an NGO considers that the greatest achievement of the training is that staff have learned to accomplish more through working together. The NGO has since been working with ISSS to review patients and make recommendations for medical care. This had never been the case before.

The ministry has recognized the potential of a number of former training participants by promoting them within its ranks. The MSPAS is now ready to proceed with modernization, a plan the training program has directly supported.

For the ISSS, which provides health care services to workers and their dependents, the most
significant improvement has been in the decentralization of the decision-making and problem-
solving processes. Unit personnel are now much more conscious of what they have, what they
need, and what they can obtain. On the service provision side, the biggest change has been the
focus on client satisfaction. ISSS participants have put together a complete training package for
employees in all units, which includes sessions on leadership and Total Quality Management.
They have been conducting training for several months and are using videos developed
specifically for the training.

A new vision exists for the health care service delivery system in El Salvador. The process of decentralization had an impact not only on administrative and financial procedures, but on treatment and service strategies as well. The linchpins of the improved public health care system are:

A client-centered approach. This reconfiguration of self-image reflects a shift in the hierarchy of priorities, from responsibility to central government agencies and policies to accountability to the client base, the Salvadoran public. Feedback from patients is solicited and is instrumental in defining needed changes in institutional policies and procedures.

Preventive rather than curative medicine. Government health care service providers
are launching large-scale public education campaigns to convey to people that their health
is also their responsibility, not just the government’s. These campaigns are conducted
through community-level talks given by health promoters, and public events such as health
fairs.

A realization of the importance of collaboration with NGOs and an appreciation of the richness of human resources in the private sector and at the community level. For MSPAS and ISSS, the benefits of this new relationship have been twofold:

a. Closer linkages with community-based NGOs mean closer ties to the communities
themselves, especially those in remote rural areas. This means faster service access
to these communities, as well as public education programs and a greater
acceptance on the part of community members.

b. Public sector agencies can take full advantage of NGO clinics and other services,
given their focus on prevention. Better preventive health programs ultimately mean
that fewer seriously ill patients need to go to public hospitals, which also results in
an economic savings for the public sector.

Performance Indicators established by USAID/El Salvador

USAID/El Salvador has capitalized on lessons learned from CLASP (Caribbean and Latin American Scholarship Program) training projects and has systematically built upon the successes achieved. The new Human Capacity Development (HCD) activity, due to start in FY 1998, will incorporate all of the elements of success identified under the Mission's strategic objectives/results packages.

For performance and impact monitoring of the new HCD activity, the Mission has established three main training indicators, as follows:

1. Trainees applying elements of their training to the workplace (expressed as percent)

This indicator measures the impact of training in the workplace by evaluating whether
trainees apply elements of their training. Information is obtained from trainee self-
evaluation as well as from selected supervisor evaluations. Baseline data is to be collected
in 1997.

2. Trainees with increased responsibilities (expressed as percent)

This indicator measures the impact of training in the workplace by examining whether
trainees assume greater responsibilities. Only those trainees with increased responsibilities
related to their training will be considered. Information is obtained by trainee self-
evaluation and supervisor evaluations. Baseline data is to be collected in 1997.

Training as a Development Tool


PPC\CDIE\DI\RRS 58
3. Trainee action contracts executed within six months of training (expressed as percent)

This indicator measures the impact of training in the workplace by examining whether trainees successfully complete their action contracts. These action contracts are agreed upon by both the individual trainee and his or her institution and involve the implementation of measurable activities. Baseline data is to be collected in 1997.

Data for each indicator will be collected on a rural/national and male/female/total basis.
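For readers who want to see the mechanics, the sketch below shows one way such indicator percentages could be tabulated and disaggregated from evaluation records. It is a minimal illustration only; the record fields, sample values, and grouping logic are assumptions for this sketch, not the Mission's actual monitoring system.

    from dataclasses import dataclass

    @dataclass
    class TraineeRecord:
        # Hypothetical fields, not the Mission's actual data schema.
        sex: str                     # "male" or "female"
        coverage: str                # "rural" or "national"
        applies_training: bool       # from self- and supervisor evaluations
        increased_resp: bool         # training-related responsibilities only
        months_to_contract: float    # months to execute the action contract

    def pct(flags):
        """Percentage of records for which a condition holds."""
        flags = list(flags)
        return 100.0 * sum(flags) / len(flags) if flags else 0.0

    def indicators(records):
        return {
            "applying training (%)": pct(r.applies_training for r in records),
            "increased responsibilities (%)": pct(r.increased_resp for r in records),
            "action contracts within 6 months (%)": pct(
                r.months_to_contract <= 6 for r in records),
        }

    # Invented sample records, for illustration only.
    records = [
        TraineeRecord("female", "rural", True, True, 5),
        TraineeRecord("male", "national", True, False, 8),
        TraineeRecord("female", "national", False, False, 4),
    ]

    # Disaggregate the way the Mission plans to report: by sex, by
    # rural/national coverage, and in total.
    for label, subset in [
            ("total", records),
            ("female", [r for r in records if r.sex == "female"]),
            ("male", [r for r in records if r.sex == "male"]),
            ("rural", [r for r in records if r.coverage == "rural"])]:
        print(label, indicators(subset))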

EDUCATION
Increased Girls’ Basic Education

USAID/Morocco

Background

In rural Morocco, only 22.5 percent of girls enroll in primary school, and only four out of ten girls complete the sixth year of the primary cycle. Many rural schools have multigrade classes, yet most primary school teachers lack the pedagogical background and practical skills necessary to teach in these settings. The coursework offered at the teachers' training colleges does not include multigrade teaching techniques; nor do student teachers acquire the skills necessary for adapting the curriculum to local needs or making it gender sensitive.

Faced with this situation, the Ministry of National Education (MNE) has developed the Rural
School Development Program (DSMR) to improve rural primary education in Morocco. In
partnership with parents, students, communities, local authorities, ministries, and NGOs, the
DSMR has set out to revolutionize rural education by improving the quality and relevance of
primary education and integrating primary schools into the communities. USAID, along with
other donors,4 is assisting the ministry in implementing this strategy, which will target the 13
most disadvantaged provinces in the country, as identified by the World Bank and the
government.

USAID’s first initiative in support of the MNE strategy is a training activity that consists of
testing new teaching interventions in 20 schools located in five of the 13 pilot provinces.

The USAID strategic objective for girls’ education reads:

Increased basic educational attainment among girls in selected rural areas.

The three intermediate results are:

1. Increased responsiveness of the primary school system to girls' educational needs.
2. Increased community involvement in girls' education.
3. Reduced operational constraints to girls' participation in primary school.

The training activity described here was developed in support of the first IR and, more specifically, the two lower-level results listed below:

- Multigrade, gender sensitive, locally relevant curriculum developed.
- Cadre of competent educators developed.

4 Other donors include UNICEF, UNDP, the World Bank, and the French government.

Training Strategy

Under USAID’s Training for Development project, an ambitious training plan was designed to
improve the teaching methodology in rural areas and to make the school system more
responsive to the needs of the regions. A series of in-country training interventions have been
implemented since the beginning of the current school year and will continue to take place
over the next one and a half years (1997 to mid-1998). The primary focus is to provide
educators with the necessary skills to develop effective teaching objectives, adapt locally
relevant and gender-sensitive curriculum, and manage multigrade classroom settings.

The core training group consists of primary school teachers, inspectors, school directors, and Teacher Training College faculty who work in the pilot regions, as well as ministry staff.

The five major components of the training strategy are outlined below:

1. Assessment of human resource constraints and performance gaps (see table on next page)

2. Identification of training skills and teaching techniques necessary to fill the performance gaps

3. Identification of training results and impact indicators, including preconditions to impact

4. Establishment of witness schools to serve as control groups

5. Development of a mechanism for systematic collection of education data

1. Assessment of Human Resource Constraints and Performance Gaps

Based on a joint effort involving MNE and USAID staff and training specialists, the training
needs of primary school teachers, inspectors, and directors were assessed. A plan was also
designed to determine what skills are needed to enable rural primary schools to offer a
relevant and participatory curriculum and what performance gaps prevent this from happening.

The following table illustrates the human resource constraints and performance gaps identified
under each of the two lower intermediate results.

TABLE IX: HR Constraints, 1997 Training Events, and HR Gaps

Lower Intermediate Result: Multigrade, gender-sensitive, locally relevant curriculum developed

HR constraints:
· lack of baseline data to measure how much curriculum adaptation has been accomplished
· lack of parental input in focus groups to discuss priority needs in curriculum adaptation
· lack of gender-sensitive materials, ethics, and pedagogy in teaching colleges

Lower Intermediate Result: Cadre of competent educators developed

HR constraints:
· teachers' lack of experience in student-oriented classrooms
· teachers' lack of experience in adapting classroom progress based on student abilities
· teachers' attitude and lack of enthusiasm for working in rural settings
· lack of teaching materials and creative supplies
· inadequate analysis of teacher recruitment and placement

Training events, 1997 (supporting both results):
· pilot school teachers and professors in multigrade classroom workshops (3/97)
· management skills workshop for teachers in each of the pilot regions (4/97)
· curriculum adaptation in the pilot regions (5/97)
· teacher conferences (7/97)
· curriculum adaptation for central team teachers in Rabat (9/97)
· multigrade classroom techniques for teachers in the pilot schools (10/97)
· curriculum adaptation for teachers in the pilot schools (11/97)

HR gaps (affecting both results):
· lack of awareness among community leaders, parents, and urban teachers about the importance of recruiting and keeping girls in school
· lack of female teachers and administrators from the local or urban areas
· lack of research skills for ongoing quantitative and qualitative data collection
· lack of management skills for ongoing monitoring to motivate, inform, practice, and apply workshop experiences
· lack of a school philosophy that reflects different realities and takes into consideration the introduction of students to a second language
· absence of a reward system that recognizes excellence in teaching and creativity
· absence of a teacher support network to help teachers feel connected nationally

2. Training skills and teaching techniques

The following training skills and teaching techniques were identified as necessary to lay the
groundwork for increased responsiveness to girls’ educational needs:

· Evaluation techniques to identify and analyze pedagogical objectives

· Management skills for application to pilot school operations

· Action plan development skills and techniques

· Training management skills

· Participatory teaching techniques

· Gender awareness

3. Training results and impact indicators

Qualitative and quantitative indicators were developed to measure the skills, knowledge, and
attitudes of the participants, as well as the number of primary schools offering an improved
multigrade curriculum.

· Evaluation strategies to identify pedagogical objectives adapted by inspectors and faculty of selected teacher training colleges (TTC)

Training indicators: Number of teachers adapting curricula to local circumstances. Increase in the number of lessons created based on pedagogical objectives.

· Management skills of inspectors and ministry staff improved

Training indicators: Increase in use of established communication channels; increase in skills development for managing change through group work.

· Creative and practical action plans developed

Training indicators: Number of decisions made locally and autonomously based on action plans.

· Quality and quantity of training management increased

Training indicators: Percentage of teachers capable of managing space, time, and varying conditions for multigrade and multiage classrooms in rural areas.

· Experience and confidence gained as skills improve

Training indicators: Percentage of teachers capable of identifying the objective of a lesson in the national textbooks, developing the lesson design, and planning the trimester. Number of schools that become self-managed as teachers, students, and the community become involved.

· Awareness of gender-sensitive classrooms increased

Training indicators: Number of girls responding to creative changes in the school environment that encourage them to stay in school.

4. Witness Schools

In each of the five pilot areas, a witness school was set up to serve as a control group. Staff at these schools will not be involved in the training, nor will the schools receive any assistance. Statistics will be gathered from the witness schools to compare and analyze against the performance of the pilot schools.
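To illustrate how the witness-school statistics might be used, the sketch below applies a simple difference-in-differences comparison: the average enrollment gain in pilot schools is netted against the gain observed in the untrained witness schools. The school names and figures are invented, and this calculation is only one possible way of analyzing such data, not the activity's prescribed method.

    # All figures below are invented for illustration.
    # Each entry: (girls enrolled at baseline, girls enrolled at follow-up).
    pilot_schools = {"Pilot school A": (40, 52), "Pilot school B": (35, 41)}
    witness_schools = {"Witness school A": (38, 40), "Witness school B": (33, 34)}

    def mean_gain(schools):
        """Average enrollment change across a group of schools."""
        gains = [after - before for before, after in schools.values()]
        return sum(gains) / len(gains)

    pilot_gain = mean_gain(pilot_schools)      # schools receiving training
    witness_gain = mean_gain(witness_schools)  # control schools, no assistance

    # The witness schools capture background trends; subtracting their gain
    # isolates the change plausibly attributable to the intervention.
    effect = pilot_gain - witness_gain
    print(f"Pilot gain: {pilot_gain:.1f} girls per school")
    print(f"Witness gain: {witness_gain:.1f} girls per school")
    print(f"Estimated intervention effect: {effect:.1f} girls per school")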

5. Mechanism for systematic collection of educational data

Annual targets have been established to determine enrollment and retention rates. USAID will
acquire the Education Automated Statistical Information System Toolkit (ED*ASSIST)
to collect, analyze, and report data. ED*ASSIST is an integrated set of tools designed to assist
ministries of education in planning and implementing systems used to collect educational
statistics in a timely, efficient, and reliable manner.
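The sketch below illustrates the kind of enrollment and retention tabulation such a system automates. It does not depict ED*ASSIST itself; the province names, enrollment figures, and target values are invented for illustration.

    # Invented data: (province, girls enrolled at start of year,
    # girls still enrolled at end of year, annual retention target).
    records = [
        ("Province 1", 500, 430, 0.80),
        ("Province 2", 620, 560, 0.80),
    ]

    for province, start, end, target in records:
        retention = end / start
        status = "target met" if retention >= target else "below target"
        print(f"{province}: retention {retention:.0%} "
              f"(target {target:.0%}; {status})")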

TRAINING
Enhancing the Human Resource Base of Nongovernmental and
Governmental Organizations

USAID/Namibia

Background

At independence, in March 1990, the new government of the Republic of Namibia inherited a legacy of apartheid policies. Virtually all the country's natural resources and most of its social services had been directed primarily to the most advantaged five percent of the population, while the needs of the majority were largely neglected. This created a dual economy in the classical colonial mode, with wide disparities in income and resource allocations. Seven years after independence, Namibia continues to struggle to overcome this economic and social heritage.

Over the past five years, approximately 70 percent of USAID/Namibia’s resources have been
invested in education and training. The goal of USAID’s assistance program is "the
strengthening of Namibia’s new democracy through the social, economic, and political
empowerment of Namibians historically disadvantaged by apartheid." (Results Review, 1997)

In keeping with the strategy to use PVOs and local NGOs to address the development needs
in Namibia, USAID initiated in FY 1992 a five-year NGO capacity building program. This
project, entitled Reaching Out with Education to Adults in Development (READ), was
designed to provide a combination of grants, training, and technical assistance to NGOs to
increase their capacity to deliver services and education to historically disadvantaged adults.
This case study focuses on the training component of this integrated support effort.

USAID/Namibia SOs address the need to develop long-neglected human resources. One SO
in particular places emphasis on fostering and strengthening the human and institutional
capacity of local NGOs engaged in adult training and/or civic advocacy across a wide range
of sectors. While the READ project addresses two of the Mission’s four SOs, its training
component falls primarily under the one listed below:

The USAID strategic objective for increasing the skills of NGO personnel reads:

Enhanced roles for historically disadvantaged Namibians in key public sector, NGO, and private sector organizations.

The two intermediate results are:

1. Increased number of historically disadvantaged Namibians acquiring enhanced managerial and technical skills and knowledge.

2. Improved access for trained historically disadvantaged Namibians to technical, managerial, and leadership positions.

Training Design

In the last three years, the READ project has provided training to more than 400 participants through a combination of workshop series, individual sectoral workshops, conferences, and seminars. Early evaluations indicated that training impact was greatest where participants had the opportunity to acquire targeted skills, apply these skills during field assignments in their organizations, and return to share their experiences. Thus, the core of the overall training design lies within three separate workshop series designed to increase the technical skills and professional qualifications of NGO personnel and to enhance their ability to transfer these new skills to others. Participants were selected from the staff of approximately 40 NGOs and two government ministries. Most training programs were designed and cofacilitated with NGO input. In the case of the training of trainers (ToT) series, building institutional capacity within NGOs to implement these workshops in the future has been a central part of the overall implementation strategy.

The components of the training are as follows:

Institutional building workshop series

Participants: An average of 20 NGO managers and administrators

Design: This is a three-month workshop series designed to strengthen the organizational development skills of NGO management. It was first offered in 1993/4 as an introductory series to both build skills and orient NGOs to READ project activities. It is currently being redesigned into a broader Organizational Development/Capacity Building series slated to be offered in 1997/8. The three areas covered in the first series were:

Institutional assessment. Covers review and assessment of NGO mission, goals, structure, planning, HRD activities, programs and services, financial resources, evaluation, and overall program management. Analysis and review of NGO stakeholder expectations are also addressed as part of the overall organizational assessment.

Action planning. Participants review and analyze the institutional assessment data collected during the field assignment and develop action plans to address the critical concerns identified.

Project development and proposal writing. Participants review terminology and components of proposals and acquire proposal writing skills.

Training of trainers (ToT) workshop series. Initially designed and offered in 1994/5, this
series has had a noticeable impact and has been in high demand ever since. It was also
conducted in 1996 and 1997, and plans exist for offering it again in 1998. After the first year,
the process of transferring responsibility to Namibian NGO partners for implementation and
building master trainer skills was actively undertaken. The 1997/1998 ToTs will be offered by
the local NGOs.

Participants: Experienced training staff of NGOs where the need to build capacity for exemplary participatory training skills and curriculum development has been prioritized.

Design: Spanning a period of 10 months, this comprehensive training in participatory training skills and curriculum development consists of four intensive two-week workshops with three field activities. The following topics are covered in the workshops and field assignments:

ToT 1: Introduction to participatory training. Participants acquire the skills to facilitate participatory training programs, design session plans and support materials, and develop a training needs assessment. The first field assignment involves implementing the training needs assessment.

ToT 2: Curriculum and materials development. Participants learn to develop training curricula and supporting teaching materials based on the needs of the target population. For the second field assignment, participants work in teams to pretest the training curricula developed during the course and gather information for the next session.

ToT 3: Curriculum and materials modification. Participants modify the training curricula and materials based on the above field pretesting activity. In addition, they translate each curriculum into a complete training manual. During the third field assignment, participants assess the impact of their training and provide technical assistance and follow-on support to their target participants.

ToT 4: Expanding training impact. Having completed the workshop series, participants take time to reflect on their personal growth and the impact their increased skills have had on their organizations. Analytical, training, and colleague support skills are refined, and participants develop plans for implementing skills and expanding organizational training capacity.

AIDS training of trainers workshop series

Participants: Health workers and trainers of NGOs that are working in the field of HIV/AIDS education.

Design: This series of workshops provides trainers with the skills to assist target groups in developing and implementing community-based AIDS programs (CBAP). The series consists of three two-week workshops with a three-month break for field work.

The workshops are planned as follows:

ToT A covers three aspects that serve as the foundation of a community-based program: establishing a CBAP, participatory training methodology, and facts about HIV/AIDS. During the field work exercise, participants implement the major steps in establishing a CBAP, namely: community identification, needs assessment, community mobilization, and sensitization of community leaders.

ToT B equips participants with the skills to train community groups, develop training materials, and facilitate community ownership of programs. The topics covered include community training practice, group dynamics, materials development, and networking. During the field work exercise, participants continue to work on establishing a CBAP by mobilizing communities to select AIDS educators and elect AIDS committees.

ToT C provides trainers with the skills needed to foster community ownership of the program and to monitor and evaluate community-based programs. The main topics covered include: basic counseling skills, fostering community ownership, community coping strategies, monitoring and evaluation, and phase-out of NGO support. As part of the field work exercises, participants provide training to AIDS educators, equip community members with the skills to evaluate the program, and design ways to sustain the program in the future.

Upon completion of the core ToT, participants' mastery of technical skills and leadership qualities is assessed based on the following areas of expertise: knowledge of training theories, facilitation skills, curriculum development skills, materials development skills, analytical skills, knowledge of training content, training implementation and management skills, communication and interpersonal skills, needs assessment, and monitoring and evaluation skills. The ability of participants to demonstrate and apply these skills qualifies them as Certified Participatory Trainers. HIV/AIDS Trainer Certification is based on similar criteria, with decreased emphasis on curriculum development and analytical skills and increased emphasis on comprehensive HIV/AIDS content knowledge. Completion of these ToT training series qualifies participants to develop and deliver participatory training and to help others within their organizations do the same.
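As a minimal sketch of how such an assessment might be operationalized, the snippet below checks a trainer's skills inventory against the ten areas of expertise listed above. The 1-to-5 scoring scale and the passing rule are invented for illustration and are not the READ project's actual certification procedure.

    # The skill areas come from the certification criteria above; the
    # scoring scale and threshold are invented assumptions.
    SKILL_AREAS = [
        "training theories", "facilitation", "curriculum development",
        "materials development", "analytical skills", "training content",
        "implementation and management", "communication and interpersonal",
        "needs assessment", "monitoring and evaluation",
    ]

    def certifiable(scores, minimum=3):
        """A trainer qualifies only if every area meets the minimum score."""
        return all(scores.get(area, 0) >= minimum for area in SKILL_AREAS)

    scores = {area: 4 for area in SKILL_AREAS}
    scores["analytical skills"] = 2
    print(certifiable(scores))   # False: one area falls below the threshold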

To effectively transfer implementation responsibility to NGOs, a parallel training and technical assistance strategy was introduced. A cadre of Master Trainers was selected within partner NGOs that had accepted the invitation to take on responsibility for the ToTs. This program is described below:

Master trainer support

Participants: Training staff of NGOs who have participated in a workshop series and expressed an interest in cofacilitating training for other NGO trainers.

Design: This intensive and targeted support involves working closely with trainers in the design and implementation of training courses for NGO staff. This support spans a period of not less than 12 months and consists of:

Practical Training, Part I: Initial priority is given to refining and enhancing the trainer's participatory training facilitation and implementation skills. Practical involvement in the design, redesign, and effective implementation of a successful training series is the primary focus.

Master Trainer Skills Deepening Workshop: Trainees participate in a 5-day workshop guided by specific trainer needs and the skills and knowledge indicators set out in the skills inventory.

Successful completion of the above workshops entitles participants to certification as Master Trainers.

Practical Training, Part II: In cases where Master Trainers are based with organizations that are responsible for taking over training series implementation, additional technical assistance and on-the-job training is provided to improve their management skills and to equip them to develop and adapt MIS and M&E systems.

In addition to the above, the READ project has actively supported the establishment of a National Trainers Network for Namibia. This network will help maintain, expand, and build on the connections established during training between individuals and organizations involved in training in the country. Also, to help the Ministry of Education deal with nonformal and participatory approaches to education and to enable it to interact with NGO efforts in the country, the READ project sponsored four staff of the Directorate of Adult Basic Education to participate in the ToT series, and an additional five staff to attend a Master's degree program at the Center for International Education, University of Massachusetts.

Monitoring and Evaluation Tools

Table X: Mechanisms Used to Monitor and Evaluate the Effectiveness and Impact of the Training Program

ToT Needs Assessment Questionnaire
Purpose: To collect background/baseline information on the needs of potential training participants.
Computer-based information: MS Access database
Implementation timetable: Beginning of training cycle

Participant Assessment (ToT Self-Assessment Forms, HIV/AIDS Training Appraisal Form, Daily Evaluation/Steering Committee session designs)
Purpose: To track progress of training through formative evaluation mechanisms and self-assessment tailored to the training topics and objectives.
Computer-based information: Lotus 123 spreadsheet and MS Access database
Implementation timetable: ToT 1 (pre and post) and ToT 4

Training Activity Reports
Purpose: To document relevant information on participation in training activities conducted under READ sponsorship.
Computer-based information: Lotus 123 training activity spreadsheet
Implementation timetable: End of training activity

Trainer Skills Inventories (Master Participatory Trainer, Certified Participatory Trainer, HIV/AIDS Trainer)
Purpose: To document and monitor skill levels required for trainer/participant certification.
Computer-based information: MS Word list and forms
Implementation timetable: Beginning of, during, and end of training program

Training Impact: Trainer Profiles; The Trainer's Net (newsletter), with articles, lists of participants, shifts in responsibility, and other mechanisms
Purpose: To document the impact of participatory training on participants and their NGO clients.
Computer-based information: MS Word merge file and MS Access database
Implementation timetable: To be completed and discussed within one month of conclusion of training

List of READ Project Training Materials
Purpose: To monitor READ-initiated contributions to participatory Namibian training literature.
Computer-based information: MS Word list
Implementation timetable: Upon publication/printing of manuals

Impact Assessment Matrix
Purpose: To identify the impact of project inputs (training, technical assistance, and subgrants) at the participant, organization, and client-of-organization levels.
Computer-based information: MS Word files
Implementation timetable: Yearly

Results

Ten of the 37 individuals who participated in the core ToT series have received official promotions. Of the 76 participants in both ToTs, the majority report an increase in job responsibilities and productivity. As a result of the increased commitment, effort, and skill level that ToT graduates bring to their responsibilities, managers within their organizations look to them to contribute outside their areas of responsibility. Thus, while position titles may not change, in the majority of cases participants report that their responsibilities and status within the organization have expanded as a result of their increased skills and abilities.

Following are examples of how participants have applied their skills in their organizations:

In addition to the eleven manuals produced during the ToT workshops, five participants from ToT'95 and ToT'96 have designed, produced, and published participatory training curricula and manuals for their organizations. These manuals are used in training and cover topics ranging from training of business trainers to training of community development committees and training of teachers.

A female participant from the Ministry of Education was appointed acting head
of the Training Division. As one of her innovations, she designed and
implemented nationwide regional training workshops for promoters to introduce
new formats and mechanisms for lesson planning based directly on content
introduced during ToT training. As a result, literacy promoters now focus on
techniques, have a greater understanding of what they should do, recognize what
materials they need, and conduct more effective literacy lessons.

A participant from ToT’97 was promoted to Head Trainer in a local NGO and
has integrated content learned during the ToT into the organization’s core training
offered to field workers. He incorporated some of the ToT’s more challenging
topics and tools for training design and analysis.

A trainer who moved from an NGO to a parastatal organization was elected as the first woman president of a trade union in Namibia. She attributes this achievement to her dedication to bringing broader support from private and parastatal organizations to the development efforts of the NGO community. To accomplish this, she applied the networking skills she acquired during the ToT.

Staff of the Ministry of Education and two NGOs are working together in ToT'97 to develop a training curriculum for English promoters from both organizations to strengthen skills in planning participatory English lessons.
A primary product of this collaboration will be a collection of sample lesson
plans based on the current English language texts used in adult literacy
classrooms. Overall, networking between the ministry and literacy NGOs has
expanded considerably as a result of contact initiated through the ToT series.

The Red Cross reports that its ability to serve its mission in general, and to undertake HIV/AIDS education activities in particular, has dramatically improved since its trainers completed the HIV/AIDS ToT series. Due to the impact realized through the introduction of a community-based approach, the organization is now extending that approach to its first-aid training.

Since the ultimate beneficiaries of this development intervention are the clients of the NGOs
(men and women living in rural and urban communities throughout Namibia), the real test of
training impact lies in the results found at this final level. Trainers report that the introduction
of participatory training skills empowers people to take actions that have a direct impact on
increased income, leadership, community development, and advocacy efforts.
A few specific examples follow:

Workers who are members of the Namibia Food and Allied Workers Union (NAFAU)
are demanding that their companies sign HIV/AIDS policy agreements with the Union
to protect the rights of workers.

Small-scale entrepreneurs, disenchanted with their inability to get loans from the banks, persistently approached their branch in Katutura and, with the help and support of COSEDA, their credit support NGO, encouraged the bank to reconsider its policy on loan size.

HIV/AIDS Committees are taking leadership and responsibility for mobilizing their communities against HIV/AIDS. HIV/AIDS Community Educators are actively contacting community members and interacting with traditional healers to change high-risk practices and behaviors. Control is successfully passing from the initiating NGOs to the community bases as planned.

In general, analysis of such reports supports the contention that this new approach to training
has planted seeds which continue to grow and encompass broader areas. In the process, it
effectively supports the growth of democratic interactions and a strong civil society.

Documenting Training Activities - TraiNet

The numerous contractors and grantees involved in designing and implementing training
programs employ a wide variety of reporting procedures, mechanisms, and formats to record
training-related information. The lack of a consistent, standardized reporting practice has resulted
in duplicate or incomplete data, making it impossible to account accurately for training costs,
number of participants trained, or results achieved.

The Training Results and Information Network (TraiNet), designed by Development InfoStructure in conjunction with G/HCD and M/IRM, represents an Agency-wide information management system for training that responds to reengineering practices. TraiNet will provide a consistent framework for systematic data input and collection, and will enable quantitative and qualitative analyses of training practices using consistent, standardized data formats. It will be used at several levels: for administrative and financial purposes, users will be able to automatically enroll trainees in health insurance, track departures and returns, and monitor costs; at the design, implementation, monitoring, and evaluation levels, training staff will be able to record as well as access information and practices pertaining to all stages of training.
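A hypothetical sketch of what a standardized training record might look like appears below. The field names and values are illustrative assumptions, not TraiNet's actual data schema; the point is simply that a single shared format lets costs, participant counts, and results be aggregated Agency-wide.

    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class TrainingActivity:
        # Illustrative fields only; not TraiNet's actual schema.
        strategic_objective: str   # the SO the training supports
        activity_name: str
        venue: str                 # in-country, third-country, or U.S.
        start_date: str            # ISO dates keep records sortable
        end_date: str
        participants: int
        cost_usd: float

    activity = TrainingActivity(
        strategic_objective="Improved Quality with Equity in Health",
        activity_name="Health services management workshop",
        venue="in-country",
        start_date="1997-06-02",
        end_date="1997-06-13",
        participants=22,
        cost_usd=48000.0,
    )

    # Serializing to a common format means every contractor and Mission
    # submits identical fields, which is what makes Agency-wide
    # aggregation of costs and participant counts possible.
    print(json.dumps(asdict(activity), indent=2))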

TraiNet, with greater applicability and fewer data entry requirements, will eliminate the Participant Training Management System (PTMS), the Participant Data Form (PDF), the Project Implementation Order/Participant (PIO/P), biodata forms, statements of expenditures, and budget worksheets. As the information evolves through the various stages of training, the data will be submitted and included in the TraiNet database. Development InfoStructure will receive this information and update the web page, <www.devis.com/traiNet>, on a regular basis.

Implementation: During Phase I (October 1996-February 1997), TraiNet was designed and
installed on the Intranet in USAID\Washington. Field testing started during Phase I and was
expanded during Phase II with visits to five Missions. Phase III—full installation—is scheduled
to start in January 1998 through a series of regional training workshops and country visits.
Working with G/HCD, Development InfoStructure will forward to stateside contractors and to all
Missions an information packet detailing the installation, use, and maintenance of the TraiNet
database. Contractors who do not receive this packet should contact Development InfoStructure at
(703) 525-6485 or <[email protected]>.

The importance of documenting and updating this information on a regular basis cannot be overemphasized. As the Agency has transitioned from project-level activity to strategic objectives, much of the information crucial to conducting research remains in the field and can no longer be accessed through the USAID Development Experience System database, the Agency's central repository of information. The performance tracking and reporting capabilities of TraiNet will enable field and Washington staff to document and report activities, generate centralized reports, and analyze and exchange information, with the intent of enhancing efficiency and learning from experience.

Resources

Aguirre International. 1994. Training Impact and Development: An Evaluation of the Impact of the
Cooperative Association of States for Scholarships (CASS) Program. HERNS Project. USAID.
Washington, DC. (PD-ABK-526)

Aguirre International. 1995. Development Training II Project, Monitoring and Evaluation System. HERNS Project. USAID/Egypt. Washington, DC. (PN-ABW-920)

Aguirre International. 1994. Human Capacity Development: People as Change Agents. HERNS Project. USAID. Washington, DC. (PN-ABX-315)

AMEX International, Inc. 1995. Impact Evaluation Resources Guide. USAID, Washington, DC.

AMEX International. 1996. USAID/Madagascar, Customers and Partners: A Four-Level Evaluation of Training Impact. USAID. Washington, DC. (PN-ABY-685)

AMEX International. 1995. "USAID/Senegal, Assessment of the Development Impact of Participant Training, 1961-1995." USAID. Washington, DC.

Brinkerhoff, Robert. 1989. Evaluating Training Programs in Business and Industry. Jossey-Bass Inc.,
San Francisco, California.

Brinkerhoff, Robert. 1987. Achieving Results from Training. Jossey-Bass Inc., San Francisco, CA.

Brinkerhoff, Robert, and Stephen Gill. 1994. The Learning Alliance. Jossey-Bass Inc., San
Francisco, California.

Broad, Mary and John Newstrom. 1992. Transfer of Training: Action Packed Strategies to Ensure
High Payoff from Training Investments. Addison-Wesley, Reading, Massachusetts.

Creative Associates International. 1995. How to Design a Country Training Strategy for its Impact
on Development. Washington, DC.

Gillies, John A. 1992. Training for Development. Academy for Educational Development, USAID, Washington, DC. (PN-ABL-295)

Kirkpatrick, Donald. 1994. Evaluating Training Programs, the Four Levels. Barrett-Koehler
Publishers, Inc., San Francisco, California.

Kirkpatrick, Donald. "Great Ideas Revisited." Training and Development. January 1996, pp.54-59.

Phillips, Jack. "Was it the Training?" Training and Development. March 1996, pp. 29-32.

Phillips, Jack. 1994. Measuring Return on Investment. Volume 1. American Society for Training and
Development, Alexandria, Virginia.

Phillips, Jack. "ROI: The Search for Best Practices." Training and Development. February 1996,
p.42-47.

Phillips, Jack. "How Much is the Training Worth?" Training and Development, April 1996,
p.20-24.

United States Agency for International Development. 1996. Program Performance Monitoring and
Evaluation at USAID. PPC\CDIE\PME.

United States Agency for International Development. 1996. Selecting Performance Indicators.
TIPS #6. PPC/CDIE/PME. Washington, DC. (PN-ABY-214)

United States Agency for International Development. 1995. The Agency’s Strategic Framework and
Indicators, 1995-1996. PPC/CDIE/PME. Washington, DC. (PN-ABX-284)

United States Agency for International Development. 1994. Strategies for Sustainable Development,
Washington, DC. (PN-ABQ-636)

United States Agency for International Development. 1992. Program Overview: Education and
Human Resources Development, Latin America and the Caribbean. EHRTS Project.
Washington, DC. (PD-ABD-012)

United States Agency for International Development. 1992. Training for Development.
EHRTS Project. Washington, DC. (PN-ABL-295)

United States Agency for International Development. 1986. Annotated Bibliography of Participant Training Evaluations, Studies, and Related Reports. Evaluation Occasional Paper #08. Washington, DC. (PN-AAU-752)

United States Agency for International Development. 1986. Review of Participant Training
Evaluation Studies. Evaluation Occasional Paper #11. Washington, DC. (PN-AAV-288)

Training-Related Internet Sites

www.astd.org - Site of the American Society for Training and Development, one of the leading
sources in the field of training and human resource development. Provides research, analysis and
practical information. See under Site Index for links to a wealth of resources.

www.shrm.org - Site of the Society for Human Resource Management; provides resources similar to
those of ASTD.

www.tcm.com/trdev - Provides links to a vast array of non-commercial sites, three on-line bookstores, and commercial suppliers. Links to Trdev-L, an excellent listserv (internet discussion group) created and managed by Penn State University.

www.trainingsupersite.com - An integrated site that provides training resources, including links to numerous publishers and Training Magazine.

www.universityassociates.com - University Associates provides consulting and training services and products. Click under Books and Materials for an excellent selection of resources.
