5 Trends we saw at SAMEA’s 2016 Capacity Development Workshop Series
26 October 2016
At our recent Biorhythm and Life Cycle of Programme Evaluation workshop, Creative Consulting & Development Works shared expert guidelines and insights on managing evaluations from the perspectives of both the commissioner and the evaluator.
The workshop was held on 6 and 7 October 2016 and focused on developing terms of reference, assessing an inception report, quality assurance across the phases of evaluation reports, and communicating evaluation findings. It formed part of a broader workshop series hosted by the South African Monitoring and Evaluation Association (SAMEA), the Department of Planning, Monitoring & Evaluation (DPME), CLEAR, the Zenex Foundation and others, held at the Hilton Doubletree in Cape Town.
Our workshop presenters and participants. From left to right: Nonhlanhla Buta, Quondile Hadebe, Leanne Adams (CC&DW), Mkhululi Mnyaka (CC&DW), Nirmala Govender, Lesego Taunyane, Kholofelo Sebothoma, Ruzelle Julie, Nicola van der Merwe (CC&DW), Koena Mwale
Some trends and insights that emerged:
- The workshop series was well attended, particularly by government representatives at national, provincial and local levels. This reflected both the need for, and the movement towards, monitoring and evaluation (M&E) as a core requirement of good governance within the public service.
- The discussion around professionalisation continues to drive the development of the parameters, standards and required competencies of evaluators within the sector. It is increasingly evident that a solid evaluation grounding, a strong theoretical foundation and well-established practical experience are essential as the evaluation industry consolidates and strengthens in South Africa. The interest in participating in various training options was indicative of the desire of evaluators and practitioners to build on their evaluation skills.
- Smaller venue spaces allowed for active, engaging round-table discussions. Compared to some of the larger venues, which were filled to capacity, these more intimate spaces saw higher levels of personal engagement and participation in workshop conversations and allowed for more active learning.
Workshop attendees seemed to be able to better relate to one another in this smaller setting and were able to share personalised examples of M&E queries and challenges. Trainers could also tailor responses and lead discussions that were most relevant.
- Despite recognising the importance of good monitoring and evaluation practices, government officials report that they still face many challenges in implementing M&E at ground level, due largely to funding and human resource constraints. Often, monitoring officers are expected to perform an evaluation function but have only been equipped with the skills to conduct project monitoring.
While monitoring is the ongoing analysis of project progress towards achieving planned results, evaluation provides an assessment of the efficiency, impact, relevance and sustainability of the project. This value judgement often requires a separate set of skills and is usually conducted by an independent, external agency.
There is a gap in the capacity development of government representatives who focus on M&E within their departments, even as their need for this development grows. Making time to invest in capacity development, through continued training, is important to bridge this gap and transfer skills.
- The ease of understanding and practicality of concepts taught and discussed in M&E training workshops is important. In the same way that evaluations need to be conducted with a utilisation-focused lens, training and capacity-building programmes in the evaluation sector need to concentrate less on high-level theory and more on learnings that can be easily implemented in everyday working settings.