Creative Consulting & Development Works

We are a research, evaluation and communications consultancy, serving nonprofits, governments and donors with innovative solutions within the development context.

The importance of evaluation in programme design

1 November 2017

A common theme across SAMEA’s many strands and amongst various stakeholders was that programmes need to be designed with monitoring and evaluation in mind. Even though the social development community appears to be making promising inroads towards incorporating this into the programme design process, as evaluators, we find that this is not always the case.

Sometimes, organisations enlist an evaluator to help with the design of a programme at conceptualisation. Leanne Adams presented work undertaken as part of a Master’s dissertation, explaining the methodology used to design a new programme together with an organisation. Her presentation showed that evaluators can help organisations plug monitoring and evaluation gaps early in the design phase.

Other times, evaluations need to shape themselves around an existing programme design. This was the case in CC&DW’s recent evaluation of the Zenex Learner Support Programme. CC&DW evaluators Leanne Adams and Fatima Mathivha presented the novel design used by the team for this evaluation as part of a Zenex Foundation-hosted panel discussion on quantitative designs used in education evaluations. This led to lively discussion among education stakeholders from government, private and NGO spaces.

The key takeaway was that although stronger quantitative evaluation designs (i.e. the “gold standard”) are often favoured within the evaluation and research communities, we should place sufficient importance on designs that adapt to the context in which they are intended to function. In these cases, such designs are, in fact, the gold standard: they are fit for purpose in their ability to work within an existing programme design and still produce empirically valid findings.

A key learning from these presentations was that, as evaluators, we sometimes have the luxury of introducing evaluative thinking at programme conceptualisation to design evidence-based programmes. But this is uncommon, not least because donor-based financing remains a significant challenge in the South African NGO landscape, and evaluators often find themselves balancing client wishes with programme constraints.

The debates at the conference seemed to produce more questions than answers about what evaluators can do to bridge this gap in evaluative thinking in the sector. Asking challenging questions and nurturing a space for conversation is part of continued learning and strengthening of the evaluation landscape in South Africa.
