A Practitioner’s Experience of Data, Monitoring & Evaluation

Session Report
Mohd Ali Siddiqi

The International Summer School Program on “Data, Monitoring and Evaluation” is a two-month immersive, hands-on online certificate training course organized by IMPRI Impact and Policy Research Institute, New Delhi. Day 5 of the program, on July 1st, began with a session on “A Practitioner’s Experience of Data, Monitoring & Evaluation” by Dr Devender Singh, Global Studies Programme, University of Freiburg, Germany, and Visiting Senior Fellow, IMPRI.

Dr. Devender Singh conducted an illuminating session on A Practitioner’s Experience of Data, Monitoring & Evaluation for the IMPRI Impact and Policy Research Institute, New Delhi, on 1st July. The session began with a comprehensive overview of Monitoring and Evaluation (M&E). Dr. Singh emphasized that monitoring involves continuous review and data collection to assess whether desired outcomes are being achieved. Evaluation, on the other hand, entails systematic information gathering during or after a project to make judgments about effectiveness and inform future interventions.

The session delved into the importance of effectiveness and efficiency in evaluation. Dr. Singh highlighted that evaluations should impartially assess the achievement of expected and unexpected results, considering factors such as relevance, effectiveness, efficiency, impact, and sustainability. He emphasized the need for evaluations to provide credible, evidence-based information that can be incorporated in a timely manner into the decision-making processes of organizations and stakeholders. This underscored the essential role of M&E in evidence-based decision-making.

The session then explored the evolving perspective on M&E, shifting from a traditional approach of measuring and reporting progress to a current view that emphasizes learning, adaptation, and course correction. Dr. Singh introduced the concept of MEAL, which stands for Monitoring, Evaluation, Accountability, and Learning. MEAL encompasses both demonstrating impact and identifying areas for improvement. It underlines the importance of learning what works, understanding enabling and constraining factors, and making course corrections accordingly.

Accountability emerged as a key element of M&E, encompassing an organization’s responsibility to itself, stakeholders, donors, and beneficiaries. The session emphasized that evidence-based decision-making is crucial for effective planning, programming, budgeting, and implementation. M&E was highlighted as a means to an end, enabling organizations to enhance their performance and make a positive impact.

Dr. Singh presented a Program Action-Logic Model to demonstrate that evaluation should be integrated into the entire program lifecycle. Starting from the initial stages of program conception, evaluation plays a crucial role in assessing inputs, outputs, and outcomes. Inputs encompass various investments, such as staff, volunteers, time, money, research base, and materials. Outputs involve the activities conducted, services delivered, products developed, and partnerships formed. Outcomes encompass short-term impacts, such as learning, awareness, knowledge, attitudes, skills, and opinions, leading to medium-term results and ultimate impacts on social, economic, civil, and environmental frameworks.
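
To make the logic-model chain concrete, the sketch below expresses it as a simple data structure. This is a minimal illustration in Python; the class name, field names, and the example program entries are assumptions for illustration and are not drawn from Dr. Singh’s presentation.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class LogicModel:
    """Minimal sketch of a program action-logic model (illustrative only)."""
    inputs: List[str] = field(default_factory=list)                # investments: staff, volunteers, time, money, research base, materials
    outputs: List[str] = field(default_factory=list)               # activities conducted, services delivered, products, partnerships
    short_term_outcomes: List[str] = field(default_factory=list)   # learning, awareness, knowledge, attitudes, skills, opinions
    medium_term_outcomes: List[str] = field(default_factory=list)  # changes in behaviour and practice
    impact: List[str] = field(default_factory=list)                # long-term social, economic, civil, and environmental change


# Hypothetical example: a community education program
education_program = LogicModel(
    inputs=["staff", "volunteers", "funding", "teaching materials"],
    outputs=["workshops delivered", "peer educators trained"],
    short_term_outcomes=["improved awareness and knowledge among participants"],
    medium_term_outcomes=["changed attitudes and classroom practices"],
    impact=["better educational outcomes in the community"],
)
```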

The session continued with an exploration of the different types of evaluation: formative, process, and outcome evaluation. Dr. Singh shared his personal experience of running an educational program in Rajasthan, where he used process evaluation to understand the reasons behind high dropout rates among young people and peer educators. For outcome evaluation, he set indicators to measure the program’s success and assessed the outcomes accordingly.
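
As a simple illustration of the indicator-based outcome evaluation he described, one might compare target values against observed values for each indicator. The indicators and figures below are hypothetical and are not taken from the Rajasthan program.

```python
from typing import Dict

# Hypothetical outcome indicators with target vs. observed values (in percent)
indicators: Dict[str, Dict[str, float]] = {
    "course completion rate": {"target": 80.0, "observed": 62.5},
    "knowledge test pass rate": {"target": 70.0, "observed": 74.0},
    "peer educator retention": {"target": 90.0, "observed": 55.0},
}

# Report which outcome targets were met and by what margin
for name, values in indicators.items():
    gap = values["observed"] - values["target"]
    status = "met" if gap >= 0 else "not met"
    print(f"{name}: target {values['target']}%, observed {values['observed']}%, "
          f"{status} ({gap:+.1f} percentage points)")
```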

The session also highlighted the importance of employing a pluralistic methodological approach, combining quantitative and qualitative analysis. Dr. Singh emphasized the need to adopt a mixed methods approach from the outset to capture a comprehensive understanding of complex issues. He presented an example of his research on the relationship between social capital and HIV risk behavior, demonstrating the value of integrating different methods for a nuanced understanding.

Regarding the evaluation process, Dr. Singh discussed the involvement of internal and external evaluators. Internal evaluation can be conducted by project staff or beneficiaries, while external evaluation may involve donors or independent agencies. He highlighted the key requirements for conducting effective evaluations, including credibility, independence, ethics, and transparency. These requirements ensure that evaluations are unbiased, trustworthy, and attentive to ethics and the well-being of the individuals involved.

The session emphasized the need for proper planning and assessment in M&E, including the capacity building of staff and allocation of adequate resources. The involvement of stakeholders was highlighted as crucial, with the session emphasizing that evaluation should be demand-driven and not imposed from outside. It should be a shared responsibility among project staff and stakeholders to conduct and utilize the results of evaluations effectively.

Dr. Singh concluded the session by emphasizing that M&E should not be viewed as a mere exercise to be undertaken at the end of a program but rather as an integral part that should be initiated from the beginning and maintained throughout the program lifecycle. This comprehensive and insightful session provided participants with a deep understanding of the importance of M&E in evidence-based decision-making, learning, accountability, and program improvement.

The presentation was followed by a Q&A round, where participants had the opportunity to engage further with Dr. Singh and clarify their doubts and queries. The interactive nature of the session allowed for a deeper exploration of the topics discussed and enhanced the participants’ understanding of M&E practices.

Overall, Dr. Devender Singh’s session on A Practitioner’s Experience of Data, Monitoring & Evaluation was highly illuminating. His expertise and practical examples provided a comprehensive understanding of M&E principles, processes, and best practices, and the session served as a catalyst for participants to strengthen their understanding and application of M&E in their respective organizations and projects.

Acknowledgement: Mohd Ali Siddiqi is a research intern at IMPRI.

Read more event reports of IMPRI here:
How to Read Financial Statements? | Monitoring and Evaluation in Practice & Reporting | The Statistical System in India and an Introduction to Various Official and Other Databases

Authors

  • Tripta Behera

    Visiting Researcher and Assistant Editor, IMPRI

  • IMPRI

    IMPRI, a startup research think tank, is a platform for pro-active, independent, non-partisan and policy-based research. It contributes to debates and deliberations for action-based solutions to a host of strategic issues. IMPRI is committed to democracy, mobilization and community building.
