Introduction: Why does each program need an MRM manual?
Experience has shown that the key challenge in monitoring and results measurement (MRM) is ensuring that it is fully integrated with program management. Only when MRM is integrated with management will information and analysis be fully utilised to improve program strategy and implementation. For this integration to happen, all managers and staff have to be involved in the MRM system. Both the implementation team and the MRM team must fully understand the different aspects of the MRM system and each staff member has to be able to effectively implement his or her roles within the system.
In order for all staff to feel comfortable with the MRM system and confident in their roles within it, it is essential that they have sufficient guidance. Even those who are familiar with MRM will need guidance, as each program is different and aspects of the MRM system will be tailored accordingly. Therefore, each program should provide managers and staff with:
- induction training in the program’s MRM system;
- regular, on-the-job guidance in the operation and use of the MRM system;
- a comprehensive, tailored MRM manual specific to the program.
The program-specific MRM manual is the subject of this guidance note. The MRM manual provides staff with written guidance on why and how the MRM system works and their roles within it. It provides both a comprehensive overview of the system to accompany induction training and a handy reference on all aspects of the system for staff to check as needed. The DCED Standard on Results Measurement recognises the importance of this written guidance for staff and so includes it as one of the ‘compliance criteria’ against which program monitoring systems will be assessed during an audit. The compliance criterion reads:
8.3.3 All program staff have access to written guidance (e.g. a manual or staff guide) on how to implement all elements of results measurement (each of the sections above).
This guidance note provides an annotated outline of an MRM manual illustrated with examples from actual programs. The note will provide readers with a good understanding and practical examples of what to include in an MRM manual. For additional guidance, see the DCED guidelines for implementing the Standard.
Annexed to the Guidance Note are full MRM manuals from various programs. We are very grateful to the programs that have allowed us to publicly share their manuals with others. We invite other programs to share their manuals as well, in order to provide more examples for practitioners in our field. If your program would like to share its MRM manual, please email the DCED Secretariat, at Admin@Enterprise-Development.org.
Please note that neither the DCED nor the authors endorse any of the annexed manuals – readers will need to use their own judgement on the usefulness of each manual for them. We also encourage readers to be inspired by the ideas, but to limit copy/paste. Experience shows that it is important for an internal MRM manual to be tailored to a specific program in order to be useful to managers and staff.
We are indebted to the programs that provided the examples used in this note; these are listed below.
- AIP-PRISMA: The Australia-Indonesia Partnership for Promoting Rural Income through Support for Markets in Agriculture (AIP-PRISMA), funded by the Australian Department of Foreign Affairs and Trade and implemented by GRM, is a multi-year program that is a part of the Government of Indonesia’s mid-term development strategy to accelerate poverty reduction through inclusive economic growth.
- ALCP: The Alliances Lesser Caucasus Program (ALCP) is a market development program in Georgia funded by the Swiss Agency for Development and Cooperation (SDC) and implemented by Mercy Corps partnering with local NGOs, ICCN on gender, ethnicity, governance and disaster risk reduction, and IAAD on livestock.
- Katalyst (archive): Katalyst was a market development programme increasing the incomes of poor men and women in rural areas of Bangladesh. It has now ended, but its website provides an archive with a wealth of experience and achievement.
- MDF: MDF is a multi-country private sector development program funded by the Australian Government and implemented by Cardno, currently operating in Fiji, Timor-Leste and Pakistan. MDF stimulates investment, business innovation and regulatory reform to create additional jobs and increase income for poor women and men in rural and urban areas.
- M4C: Making Markets Work for the Jamuna, Padma and Teesta Chars (M4C) is a 5-year project funded by the Swiss Agency for Development and Cooperation and implemented by Swisscontact and Practical Action. The project aims to establish and strengthen sustainable market systems for the ‘economically active poor’ on river islands in northern and north-western Bangladesh.
- RisiAlbania: RisiAlbania is a youth employment project that focuses on increasing employment opportunities for young women and men aged 15-29 in Albania. It is funded by the Swiss Agency for Development and Cooperation (SDC) and is implemented by a consortium consisting of HELVETAS Swiss Intercooperation and Partners Albania, Center for Change and Conflict Management.
- Samarth-NMDP: The Samarth-Nepal Market Development Programme (NMDP) is a five-year, UK Aid-funded rural market development programme that aims to reduce poverty in Nepal by increasing the annual incomes of 300,000 smallholder farmers and small-scale entrepreneurs by GBP 80. The programme is implemented by Adam Smith International, The Springfield Centre and Swisscontact.
Overview of an MRM Manual
The MRM manual should be developed early on during the implementation phase of the program, generally right after the overall program management system has been developed. This timing helps to ensure that MRM will be integrated with program management right from the start. For example, it will be necessary to consider how the MRM approach will be integrated with key program management choices, such as how the program approach will be implemented and decisions made, how sector or situation assessments and key strategies will be developed, and what templates will be used for intervention plans, partnership agreements, etc.
Keep in mind, however, that the MRM manual should be a living document; adjustments and additions will be needed throughout the program. A number of programs have found that it is useful to thoroughly revise the MRM manual approximately one year into implementation when there is enough experience to assess what is working well and what needs to change to maximize efficiency and effectiveness. Additions or further adjustments are usually needed as the program matures, and requires different information in order to inform decision making. For example, when the scale of results increases or a program aims to assess progress in catalysing systemic change, the MRM system may need modifying to accommodate different information requirements.
The MRM manual should clearly explain how the MRM system is integrated with the overall program management system, as well as contain step-by-step guidelines on how to implement each element of the system. In general, the MRM manual should cover the following topics:
- what the program aims to achieve;
- how the program pursues its objective – the program approach;
- how the program plans its interventions, including the development of results chains;
- how the program plans results measurement, including the development of indicators and measurement plans;
- how the program measures results, including timing, sources, methods and strategies to assess attribution;
- who will perform each of the functions in the MRM system;
- how information on results is analysed and used in managing the program;
- how the program reports results.
It is usually easiest to write the MRM manual at the same time as, or immediately after, developing the MRM system. This process begins with conceptualising how the MRM system will be integrated with the overall program management process. Then the program can identify the required MRM processes and instruments (e.g. sector guides, intervention guides, data collection, etc.). The program managers must lead the process of developing the MRM system, so that it is set up to serve their needs. Documenting the system in the manual is often carried out by an MRM technical specialist in the program or a consultant. If a consultant is used, it is important that s/he works closely with the program team, both so that the manual reflects the tailored system of the program and so that the program team has a thorough understanding of the manual.
Annotated Outline of an MRM manual
Program Objectives and Overall Approach
The first section of the manual should clearly explain the following.
- Objectives of the program: what does the program aim to achieve?
- Scope of the program: who are the target beneficiaries, and what are the targeted sectors and geographic coverage?
- Overall approach and theory of change of the program: how are the program activities expected to lead to the desired impacts?
Overall Program Management and Results Measurement System
This section of the manual is very important because it describes how the MRM system is integrated with the management of the program. This section should clearly explain how MRM informs program decision making and provide an overview of the key instruments and processes used in both the program management and the MRM. The description of processes should include an overview of how to develop and adjust strategies, how to design and implement an intervention, how to measure the results of an intervention and how information on results will be used in management decision making and reporting. An example describing the integration of MRM and program management is included in the list of extracts at the end of this note.
Articulating Results Chains
Results chains are the backbone of the MRM system. This section should provide a clear step-by-step guideline on how to develop a results chain at appropriate program levels, such as sector and intervention. Generally, it should include the following topics:
- the links between the overall program log frame or theory of change and results chains at different levels of the program, as well as the typical structure of the results chains at different levels;
- guidelines/tips on developing sector results chains (if applicable);
- guidelines/tips on developing intervention results chains and documenting the assumptions and supporting research;
- guidelines/tips on how to outline expected systemic change (when applicable);
- guidelines on how and how often to review the results chains;
- guidelines on how to take the risk of displacement into account;
- timing, roles and responsibilities in developing the results chains.
Defining Indicators and Making Projections
To measure changes in each intervention results chain, the program team will define indicators for each step in the results chain. To fully capture the key change processes, a combination of qualitative and quantitative indicators should answer the following questions.
- Has the expected change actually happened?
- To what extent?
- How are the changes taking place?
- Why are the changes taking place or not taking place?
- To what extent are the expected changes sustainable?
To guide staff, this section of the manual should include the following topics:
- explanation of the common impact indicators used by the program, as well as the reasons for excluding any of the DCED-recommended indicators, if applicable;
- guidelines on how to define indicators including qualitative and sustainability indicators;
- a menu of frequently used indicators the staff can choose from and adjust to specific interventions;
- guidelines on how to make projections;
- timing, roles and responsibilities in the development of indicators and projections.
Attribution and Measurement
This section of the manual should provide guidelines on how to assess the attribution of changes to program interventions, how to develop a measurement plan and how to measure results. It is important to cover attribution and measurement together because the choice of which method to use to assess attribution will influence the measurement plan. On attribution, the manual should include the following topics:
- explanation of the concept of attribution;
- how to use quantitative and qualitative information on changes to assess the likely strength of causality along the results chain, as well as how to estimate the counterfactual;
- options for the attribution strategy that the program team can use to assess attribution and guidelines on how to choose an appropriate one.
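One common option for estimating the counterfactual, mentioned in many programs' attribution strategies, is to compare the change among those reached by an intervention with the change in a comparison group. The minimal sketch below illustrates the arithmetic of a simple difference-in-differences comparison; the function name and all figures are hypothetical, and real attribution assessments would also draw on the qualitative evidence described above.

```python
# Illustrative sketch (not from the guidance note): estimating attributable
# change with a simple difference-in-differences comparison.
# All names and figures are hypothetical.

def difference_in_differences(treat_before, treat_after, comp_before, comp_after):
    """Attributable change = change in the treatment group
    minus change in the comparison group (the counterfactual)."""
    return (treat_after - treat_before) - (comp_after - comp_before)

# Hypothetical average annual incomes (USD) for farmers reached by an
# intervention versus a comparison group not reached by it.
attributable = difference_in_differences(
    treat_before=500, treat_after=650,   # participating farmers
    comp_before=500, comp_after=550,     # comparison farmers
)
print(attributable)  # 100: income change attributable to the intervention
```

The comparison group's change (50) stands in for what would have happened anyway, so only the remainder of the treatment group's change (150 − 50 = 100) is claimed as attributable.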
On measurement, the manual should include the following topics:
- guidelines on how to develop a measurement plan;
- when and how to collect baseline information;
- a menu of data collection tools and tips for selecting tools;
- guidelines on conducting research;
- guidelines on how to assess and understand differentiated results by gender and any other cross-cutting issues;
- guidelines on how to identify any unintended effects;
- timing, roles and responsibilities for measurement.
Programs that aim to catalyse systemic change should also include a section in their manual on monitoring and measuring systemic change and the results from it. In this section, the manual should include the following topics:
- explanation of systemic change and guidelines on how to assess the progress toward the expected systemic change;
- guidelines on how to measure the results of systemic changes and how to assess attribution.
Analysing and Using Information in Management
One of the key functions of the MRM system is to provide useful information, and a mechanism to utilise that information, for managing and improving program implementation and strategy. Hence, this section should contain:
- a clear guideline on how to analyse information on results;
- a detailed description of the process for using information on results in decision-making at different levels of the program, including, for any structured review meetings:
  - inputs to the meeting,
  - a structure and agenda for the meeting,
  - participants and roles during the meeting,
  - expected documented outputs.
Aggregating Results
In order to report program-wide impacts, the program has to aggregate the impacts from all interventions across the whole program for the common impact indicators. In many cases, programs aggregate data on a few selected intermediate indicators as well. Usually, the program has to correct for overlap between interventions and sectors as part of the aggregation process. For example, some beneficiaries could benefit from several interventions; if the program simply adds up the number of beneficiaries from each intervention, it will count the same beneficiaries several times. Therefore, this section of the manual should include the following topics:
- a description of the aggregation system, covering how results will be aggregated and corrected for overlap;
- a clear guideline on the time-frame over which the impacts will be claimed and reported.
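The double-counting problem described above can be made concrete with a minimal sketch. The intervention names, beneficiary IDs and figures below are hypothetical; in practice, programs rarely hold exact ID lists and instead estimate overlap through surveys or beneficiary registers.

```python
# Illustrative sketch (not from the guidance note): aggregating beneficiary
# numbers across interventions while correcting for overlap.
# All intervention names and beneficiary IDs are hypothetical.

interventions = {
    "seed_supply": {1, 2, 3, 4, 5},
    "extension_services": {4, 5, 6, 7},   # 4 and 5 also benefit from seed_supply
    "market_linkages": {7, 8, 9},         # 7 also benefits from extension_services
}

# Naive aggregation: simply summing per-intervention counts
# double-counts beneficiaries reached by more than one intervention.
naive_total = sum(len(ids) for ids in interventions.values())

# Overlap-corrected aggregation: count each unique beneficiary once.
unique_total = len(set().union(*interventions.values()))

print(naive_total)   # 12: over-states program reach
print(unique_total)  # 9: each beneficiary counted once
```

The same logic applies at sector level: the manual should specify where overlap is likely, how it is estimated, and how the correction feeds into reported totals.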
Roles and Responsibilities
Implementing the MRM system effectively requires close co-operation between the program implementation and MRM staff. Each staff member needs to clearly understand his or her roles and responsibilities and how these relate to those of others. Otherwise, confusion over roles and responsibilities can lead to gaps in implementation of the MRM system and sometimes to conflicts between staff or departments. Hence, the MRM manual should include a detailed description of roles and responsibilities related to the MRM system. This should cover all of the activities described in the previous sections. It should also explain who leads each activity or output, who participates in it, who ensures its quality and who approves it. These details will help staff to understand how they should cooperate to complete each activity or output.
Formats and Tools
In order to make the MRM manual really useful to staff as a ‘one-stop-shop’ reference for the MRM system, it is important to include all the formats and tools relevant to the MRM system. These can be put into annexes to the MRM manual. These formats and tools will include:
- outlines for planning documents, such as a sector strategy report and an annual plan;
- templates for MRM documents such as an intervention guide including at least a summary of the intervention strategy, results chain, measurement plan and projections; and a research plan;
- outlines for internal reporting documents such as an intervention progress report and minutes of a review meeting; and
- outlines for external reporting documents, such as a semester or annual report and a beneficiary case study.
Many programs also add other annexes to their MRM manuals covering topics such as tips for conducting surveys or in-depth interviews, guidance on sampling in results research, or templates for agreements with partners. The MRM manual annexes are a good place for any templates or tips that staff are likely to use regularly in program management and MRM.
Benefits of a Program Specific MRM Manual
Experience has shown that programs with detailed and tailored MRM manuals tend to have better structured and more effective MRM systems. First, the act of documenting the MRM system often helps programs to structure their system and identify gaps where they have not yet fully developed a specific aspect of the system. Secondly, having a detailed MRM manual provides staff with an easy reference to answer questions, find a needed format or check that they have completed an MRM activity or output appropriately. Thirdly, a tailored MRM manual is also a valuable resource to assist in orienting and training new staff members. Finally, a clear MRM manual that both emphasizes and illustrates how to integrate MRM and program management helps staff to operationalise this essential integration in practice. To be really useful, the MRM manual/guidance has to be specific to the program context. Hence, to develop the MRM manual, it is important that the program does not just copy/paste examples from other programs. Instead, the program should use examples for inspiration and guidance to develop or improve their own MRM manual tailored to their own program.
Please note that neither the DCED nor the authors endorse any of the annexed manuals – readers will need to use their own judgement on the usefulness of each manual for them. We invite other programs to share their manuals as well, in order to provide more examples for practitioners in our field. If your program would like to share its MRM manual, please email Jim Tanburn, the Coordinator of the DCED Secretariat, at Tanburn@EnterpriseDevelopment.org.
Complete List of Examples and Manuals
Extracts from Manuals:
- Program Objectives and Scope – ALCP
- Overall Approach and Program Theory of Change – Samarth-NMDP
- Overall Program Management and Results Measurement System – MDF
- Links between Log Frame and Results Chains and Typical Structure of the Results Chains – RisiAlbania
- Guideline on How to Develop Sector Results Chains – M4C
- Guideline on How to Develop Intervention Results Chains and Document the Assumptions and Supporting Research – MDF
- Timing, Roles and Responsibilities in Developing Results Chains – ALCP
- How to Outline Systemic Changes in Results Chains – RisiAlbania
- How to Review Results Chains – Samarth-NMDP
- How to Take the Risk of Displacement into Account – MDF
- Description of the Program’s Common Impact Indicators – Samarth-NMDP
- Guidelines on How to Define Indicators – MDF
- Menu of Frequently Used Indicators – MDF
- Guidelines on Making Projections – PRISMA
- Guidelines on Attribution – PRISMA
- Guidelines on Developing a Measurement Plan – Samarth-NMDP
- Guidelines on Baseline Information Collection – MDF
- Menu of Data Collection Tools – RisiAlbania
- Tips for Selecting Tools – MDF
- Guidelines on Conducting Research – PRISMA
- Guidelines on How to Assess and Understand Differentiated Results by Gender and Other Cross-Cutting Issues – Samarth-NMDP
- Guidelines on How to Identify Unintended Effects – M4C
- Guidelines on Systemic Changes – Samarth-NMDP
- Guidelines on How to Analyze Results – M4C
- Guidelines on Structured Review Meetings – RisiAlbania
- Guidelines on Overlapping in Aggregation – PRISMA
- Guidelines on Measurement Period and Timeframe for Aggregation – Katalyst
- Roles and Responsibilities Matrix – Samarth-NMDP
Program MRM Manuals:
- AIP PRISMA Results Measurement Manual February 2015
- ALCP Monitoring and Evaluation Manual 2015
- Katalyst Monitoring and Results Measurement Version 2.0 April 2012. The annexes to this manual are also available.
- MDF Results Measurement Manual Version 2 February 2014
- M4C MRM Manual July 2013
- RisiAlbania Monitoring and Results Measurement Manual Version 2 April 2015. The annexes to this manual are also available.
- Samarth-NMDP Results Measurement System – User Manual Ver. 3 May 2015
This is one of 10 cases that have been developed by Hans Posthumus Consultancy. The HPC consortium was led by Hans Posthumus (HPC) and consisted of Aly Miehlbradt (MCL), Ben Fowler (MSA), Mihaela Balan, Nabanita Sen (OU), Phitcha Wanitphon and Wafa Hafiz (H&S). The preparation of these cases was supported by funds from the Swiss Agency for Development and Cooperation (SDC), provided through the DCED Trust Fund. We would like to thank them for providing the opportunity to work on this case. The examples used in this case are drawn from many contributing programs, to which we are indebted. We would like to extend our gratitude, in particular, to Goetz Ebbecke and Sadia Ahmed from AIP-PRISMA, Helen Bradbury from ALCP, Markus Kupper from Swisscontact, Harald Bekkers and Victoria Carter from MDF, Isabelle Fragniere from RisiAlbania and Tim Stewart and Sanju Joshi from SAMARTH-NMDP for allowing us to use their manuals as examples for the case and publicly sharing their manuals with others. We would also like to thank Hans Posthumus for his valuable input into the case.