
June 28, 2018

In Conversation with Lettie Miles and Lauren Fok on Conducting M&E at Zenex Foundation


Q&A with Lettie Miles 

Lettie Miles is a Senior Programme Manager at the Zenex Foundation. Her portfolio focuses on school programmes in Mathematics, Science and Language education. Her responsibilities include stakeholder engagement, programme design, project monitoring and management of external evaluations and research.

What is the nature of your role in the programmes that you manage?

I work with complex projects that have multiple partners and multiple intervention components. Most of the M&E that I do involves managing and coordinating multiple stakeholders, such as the Department of Basic Education (DBE), project partners and external evaluation agencies. My monitoring involves holding all aspects of the project and its partners together to drive a common vision and strategy, and creating an enabling environment for evaluations to be conducted and for learning to take place. I also ensure that implementation aligns with the vision and strategy.


What do you generally monitor and for what purpose?

We generally monitor contract compliance and budgets. This includes monitoring whether delivery is taking place as per the implementation plan, and monitoring quality by attending training sessions, school coaching sessions and the like. We also monitor to troubleshoot, problem-solve, facilitate engagement with stakeholders to learn and share ideas, and to manage risk and reputation. In essence, it is about keeping my eyes and ears on the ground so that I know the context and understand what works and does not work in practice.

What do you see as the value of monitoring?

The value of monitoring is that we can better understand the context in which we implement projects and the beneficiaries we serve. We can see the project in action and identify gaps in implementation and capacity issues. Monitoring gives insight into the nuances of the context and can help to manage risks in the project. These lessons can then be shared with the sector.

What have been the benefits and limitations of the evaluations of your projects?

The benefit of evaluations is that they provide an independent view of whether the intervention model works and whether it is giving us the desired outcomes. Evaluations help to elevate project thinking by explicitly clarifying the project logic. Below are some of the limitations I have identified with evaluations in the projects that I have worked on.

• The evaluation design is often limited by our budget constraints.

• Evaluations are usually commissioned after the start of project implementation.

• The complexity of our project designs creates difficulties with attribution.

• Often during the course of the evaluation, we realise that data is not available, accessible or helpful for the evaluation question.

• We try to plan for and pre-empt everything that may be a risk to the success of the evaluation, but we have realised that even with our planning, issues still arise.

Q&A with Lauren Fok 

Lauren is a Programme Manager at the Zenex Foundation, where she is responsible for projects in the Schools and Sector-Strengthening Programme initiatives. We sat with her to discuss her involvement in M&E for the programmes that she manages.

Zenex conducts clarification workshops at the beginning of an evaluation; how do you think these have helped project partners?

Clarification workshops in the context of Zenex projects have helped to:

1. Promote buy-in and ownership by all project stakeholders and partners.

2. Forge a common vision and build consensus.

3. Clarify roles and responsibilities of different partners.

4. Build relationships by understanding the various organisational cultures of partners.

Clarification workshops take place at the inception phase of an evaluation. They bring together all the intervention stakeholders to clarify and reach a common understanding of the objectives, theory of change, indicators and assumptions of an intervention. Unpacking assumptions is important during this process because assumptions provide the theoretical underpinning for why we chose a particular approach. The product of a clarification workshop is a document that articulates what we hope to achieve and how the activities we undertake will help us to achieve the desired outcomes.

Tell us about a time where you changed an implementation plan following feedback from an evaluation report.

I managed a Zenex Literacy project where the original plan was to conduct one-on-one coaching sessions with Foundation Phase teachers. The evaluation revealed that the school context did not allow for individual coaching sessions. We discovered that Foundation Phase teachers teach the whole day, and some of them could not stay for coaching after school because they were part of lift clubs. At the end of year one, the project revised the approach and adopted a blended approach that involved group coaching sessions at phase meetings and one-on-one sessions during classroom observations. There have been several other examples where preliminary evaluation findings have resulted in immediate revisions to the approach and remedial action to improve project implementation.

Have you had an experience where the findings of an evaluation were contrary to what you expected?

In instances where the findings of an evaluation are contrary to what is expected, we engage the evaluator. The evaluator may probe the finding further using additional sources. There are cases, however, where the findings are valid, and Zenex and the evaluators moderate the tone to ensure that the message is well received. We do evaluations to learn. We engage in this action research in order to test ideas and answer questions that we do not have answers to. It is therefore important for evaluations to be completely objective and to be communicated in a tone that fosters learning for all stakeholders.

You have been involved in sharing school-by-school feedback on evaluations; what has been the feedback of principals?

Research has shown that school communities and beneficiaries often do not receive feedback after participating in research studies. For Zenex, giving feedback is a critical part of the process. Zenex is not extractive; we understand that the responsibility runs both ways. The schools provide information, and we have a responsibility to provide feedback.

Schools and principals appreciate the feedback because it gives them an opportunity to respond, challenge findings, ask clarifying questions and learn in the process. Zenex has also tried to make the evaluation reports accessible through visual depiction of empirical data. Reports mirror school performance, and we deliver them with practical recommendations for how schools can improve.

 
