Early in 2017, the IEU kick-started its LORTA Programme with a workshop in Bangkok where entities and agencies implementing 15 GCF-approved projects were invited to learn about impact evaluation and constructing evaluation designs.
In September, I had the privilege of travelling to Malawi and participating in the scoping mission for one of those projects -- Project FP002: Scaling up Modernized Climate Information and Early Warning Systems to Save Lives and Protect Agriculture in Malawi (M-CLIMES).
The aim in Malawi was to design a rigorous, feasible impact evaluation in collaboration with the project management team and relevant stakeholders. This blog looks at some of the activities conducted in the field and highlights a few key ‘take-aways’ that might be helpful for fellow evaluators.
On Monday, September 10th, we met the M-CLIMES team, including representatives from the Environmental Affairs Department (Malawi's National Designated Authority) and the Department of Disaster Management Affairs (DoDMA), along with staff from the UNDP coordination team. Together we discussed the team's objectives and secured buy-in from key stakeholders. The latter was important, as some stakeholders were initially wary of LORTA. Over coffee, we assured everyone we were not there to audit ongoing activities or evaluate staff, but to co-create mechanisms that would allow for impact measurement and real-time learning. The 'take-away' for me was: where possible, try to predict others' likely reactions in advance of your arrival and proactively address them.

Next, we met with a sub-group of stakeholders, consisting of the project management team and members of its consulting firm. Though we had reviewed the available documents exhaustively and planned preliminary ideas as thoroughly as possible, some project elements were still not fully formulated, such as the selection criteria and the baseline data needed ahead of training.
The fact that the rainy season was fast approaching added further complexity to designing a high-quality evaluation: the programme had to be implemented quickly, and we still had much to figure out. My 'take-away' on this occasion was: know in advance how far preparatory work has progressed, and be aware of constraints and potential threats (time, weather, resources, human capital, etc.).
On Tuesday, we delivered a capacity-building workshop for DoDMA and the other ministries involved in M-CLIMES. The workshop covered the utility of and need for evidence in evaluations, examined impact evaluation methods, and included a Theory of Change exercise. On Wednesday we travelled to Dedza and met with the Agricultural Extension Officers in charge of training farmers. The visit improved our targeting of farmer training, our handling of group concerns, and our engagement at all levels of the programme. On Thursday we met with the Department of Climate Change and Meteorological Services, an M-CLIMES implementing partner, and with their input refined our evaluation questions and identification strategies. Friday was our big day – the day we were to present the rough outline of our impact evaluation design to the project's stakeholders and discuss it with them.
Participants noted that evaluations should be included in the funding proposal preparation cycle to enable better planning and budgeting. They also noted the challenge of being among the first cycle of GCF-funded projects, saying there was still much to learn and that they had asked for too little funding in their original proposals. This rich discussion provided us with plenty of food for thought for future similar exercises.
Some of my key take-aways:
- Textbook scenarios for conducting capacity building in evaluations rarely occur. Activities will always be dictated by realities on the ground, where evaluations require the ownership of all the stakeholders involved. Evaluators and capacity builders thus need to be prepared to meet people halfway and approach their tasks with humility, with no preconceived ideas, and with an open heart and mind.
- The term “evaluation” can evoke negative connotations and cause defensiveness. When helping others develop evaluation designs, it is important to clearly define their objectives – with full consideration of local needs – while aligning them with the larger goals of maintaining accountability and building evidence for overall learning.
- Building and implementing a good impact evaluation is an arduous journey for all. Each step is important to ensuring collaborative success and should not be rushed. Encouraging consensus early will ensure smoother processes in the long run.
- Funding can be the elephant in the room. If this topic and other concerns are not addressed transparently, evaluation teams miss out on a huge opportunity to build trust with key partners. Acknowledging local concerns openly builds trust and facilitates stakeholder buy-in.
- Perhaps most importantly, do everything with humility. Check yourself for any trace of patronising behaviour, mindset, or tone, and celebrate the unique expertise and value that can sometimes only be found in local experience.
It's impossible to distil all that I learned. With every person I met and every word I heard, my knowledge of LORTA and how it will help the GCF grew exponentially. More importantly, so did my respect for the people on the ground working with those most affected by climate change.
Stay up to date with the IEU and its LORTA Programme through the IEU website.