Creative Consulting & Development Works

We are a research, evaluation and communications consultancy, servicing nonprofits, governments and donors with innovative solutions within the development context.

The ‘urban/semi-urban/rural’ classification debate – is it still relevant and adequate for evaluators?

1 November 2017

Creative Consulting & Development Works (CC&DW) team members attended the SAMEA conference from 23 to 27 October 2017 at the Hilton Hotel, Sandton, Johannesburg. SAMEA (the South African Monitoring & Evaluation Association) brings together delegates from government, the private sector, nonprofits, universities and international organisations to share knowledge on monitoring and evaluation.

Fia Janse van Rensburg and Sitho Mavengere presented a paper titled Is the seemingly simple ‘urban/semi-urban/rural’ classification still relevant and adequate, and are evaluators aware of the discourse on geographic typologies in South Africa? The paper was motivated by the observation that development thinking, evaluation practice and the evaluations that follow make continuous reference to the typology of ‘urban/semi-urban/rural areas’.

The key question in this paper is whether the ‘urban-rural’ dichotomy, or an ‘urban-semi-urban-rural’ continuum, actually exists. The typology is widely used, but differing definitions of ‘urban’ and ‘rural’ exist, so although the same label may be applied, the underlying conceptualisation can differ. There is no standard definition of urban or rural characteristics, as several different geographical typologies are in use in South Africa. Migration and changing settlement patterns expose the limitations of the ‘urban-rural’ classification and call into question its ability to respond adequately to the social, economic and developmental diversity of populations residing in areas classified as ‘urban’.

Evaluators need to be aware of how geographical classification can contribute to intervention effects. More importantly, evaluators should be able to identify and use the appropriate geographical classification system to ensure adequate coverage of beneficiary groups in an evaluation. In South Africa, informal settlements are developing within affluent areas, resulting in formal urban communities and informal communities living side by side. When evaluations are commissioned for these areas, the interventions are classified as located in urban areas, yet the setting is both urban and informal.

If evaluations are to measure whether programmes or projects produce their desired effects, geographical classification must be taken into account. Evaluators therefore need to understand contextual issues, including the appropriate geographical classification of the areas in which programmes are implemented in South Africa.

Dataviz enriches the research and evaluation landscape

1 November 2017

Creative Consulting & Development Works (CC&DW) facilitated a well-attended one-day workshop on Creative Data Visualisation at the recent 6th Biennial SAMEA Pre-Conference Capacity Building Series. Participants included South African local, provincial and national government representatives, international agencies and nonprofits, all looking to start their journey with data visualisation, aka “dataviz”.

CC&DW’s introductory workshop on Creative Data Visualisation is designed as a two-day course; it was condensed into a one-day workshop for the SAMEA conference to give participants a ‘taste’ of data visualisation. The course is designed to start participants’ data visualisation journeys and is aimed at those working in research, monitoring and evaluation, as well as anyone involved in presenting data. Given the overwhelmingly positive feedback, CC&DW will be offering the course to the public, organisations and government departments in 2018.

Participants will be introduced to the key principles and theories behind the visualisation of data, and participatory exercises will encourage the application of creative data visualisation tools. Presenting data visually to communicate effectively to various audiences, with maximum impact, is the future. Join our team in 2018 at one of our upcoming Creative Data Visualisation courses and start your exciting dataviz journey.
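As a small taste of the kind of principle such a course explores – that even a single line of well-chosen marks can communicate a trend at a glance – here is a hypothetical sketch (not course material) of a text-based “sparkline” in Python:

```python
# Hypothetical illustration: compress a numeric series into a one-line
# visual summary using Unicode block characters of increasing height.
BARS = "▁▂▃▄▅▆▇█"

def sparkline(values):
    """Map each value onto one of eight block heights (min -> ▁, max -> █)."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero for a flat series
    return "".join(BARS[(v - lo) * (len(BARS) - 1) // span] for v in values)

print(sparkline([2, 4, 8, 5, 9, 3]))  # → ▁▃▇▄█▂
```

Tiny as it is, the sketch reflects a core dataviz idea: strip away everything except the marks that carry the message.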

Quotes from Creative Data Visualisation workshop participants

CC&DW Creative Data Visualisation participant Moipone Malahleha from Gauteng Office of the Premier, with workshop presenter Fia van Rensburg.

Marethabile Maseela – World Vision Lesotho

The course was great. It’s something I wanted to know more about. I don’t think anything in regards to the course content should change. I would, however, want the course to be a two or three-day course. I was so re-charged from the course that I went out to buy books on data visualization and am reading them.

Phiwe Mncwabe – Gauteng Office of the Premier

The course was lovely. I thoroughly enjoyed myself. I would recommend this course to my colleagues. It helped me to look at data creatively. I would like more time to do the course; I think it would be best to go through the exercises with the facilitator rather than on my own. Overall it was a great workshop, and I would suggest that we have more than one day – perhaps two to three days would be sufficient.

Ann-Lhyn – DPME

The course was very insightful. However, it was rather short. I wanted to do more practical work, and if it were over two days, it would be even more amazing. I would definitely recommend it to my colleagues.

Mpumelelo Mahlalela – Kwadukuza Municipality

I enjoyed the course because it was interactive and very useful; I have learnt a lot of new things from it. The resources shared with us are very constructive and provided further insight into the course material. This is a constructive and mind-blowing course that I would love to recommend to my colleagues.

The one word that summarises SAMEA 2017

1 November 2017

Itumeleng Ramano, CC&DW Project and Training Coordinator, attended the recent SAMEA conference and took care of visitors to the CC&DW exhibition stand. She shares her experience and conference takeaways with us here.

My role at CC&DW is that of Project and Training Coordinator: I am responsible for overall project assistance and for the training component of the organisation. When I was given the opportunity to attend the SAMEA conference in Johannesburg, I was very excited. Upon arrival at the conference, the atmosphere was electric; people were excited and happy. When I started speaking with other delegates, I began to understand the true power behind M&E, informed by my background in capacity building, community mobilisation and strategic planning within the public health sector.

From attending talks at the conference, as well as my conversations with various people, I learned of the amazing calibre and importance of M&E. One of the most interesting conversations I had was with Mr Amadou Oumarou from the Niger High Commission, reporting to the Prime Minister. When Mr Oumarou approached our stand, he initially wanted a stress ball (granted, he confessed he had no idea what it was for). From that moment, we began talking about the conference and how important it is to have something like this in Africa. His interest in CC&DW was piqued when I offered him a company profile and explained the work we do. As a representative of Twende Mbele (a Kiswahili term for moving forward together), he spoke of seeking a potential partnership with CC&DW.

If there is one word I can use to summarise the conference, it is “Purpose”. This is what I saw in all the people I interacted with. They came to the conference with a purpose and left with a sense of renewal, revival and energy to continue the great work that they do.

The importance of evaluation in programme design

1 November 2017

A common theme across SAMEA’s many strands, and amongst various stakeholders, was that programmes need to be designed with monitoring and evaluation in mind. Even though the social development community appears to be making promising inroads towards incorporating this into the programme design process, as evaluators we find that this is not always the case.

Sometimes organisations enlist an evaluator to help with the design of a programme at conceptualisation. Leanne Adams presented work undertaken as part of a Master’s dissertation, explaining the methodology used to design a new programme together with an organisation. Her presentation showed that evaluators can help organisations plug monitoring and evaluation gaps early in the design phase.

At other times, evaluations need to shape themselves around an existing programme design. This was the case in CC&DW’s recent evaluation of the Zenex Learner Support Programme. CC&DW evaluators Leanne Adams and Fatima Mathivha presented the novel design used by the team for this evaluation in a panel discussion, hosted by the Zenex Foundation, on quantitative designs used in education evaluations. This led to much lively discussion between education stakeholders from government, private and NGO spaces.

The key takeaway was that although stronger quantitative evaluation designs (i.e. the ‘gold standard’) are often favoured within the evaluation and research communities, we should place enough importance on designs that adapt to the context in which they are intended to function. In these cases, such designs are in fact the gold standard: they are fit for purpose in their ability to work within an existing programme design and still produce empirically valid findings.

A key learning from these presentations was that, where evaluators have the luxury of introducing evaluative thinking at programme conceptualisation, they can help design evidence-based programmes. But this is uncommon: donor-based financing remains a significant challenge in the South African NGO landscape, and evaluators often find themselves balancing client wishes with programme constraints.

The debates at the conference seemed to produce more questions than answers about what evaluators can do to bridge this gap in evaluative thinking in the sector. Asking challenging questions and nurturing a space for conversation is part of continued learning and strengthening of the evaluation landscape in South Africa.

A new ‘Gold Standard’ in evaluation design

1 November 2017

CC&DW’s Mkhululi Mnyaka pilots a data collection tool.

The term ‘gold standard’ is contentious when speaking about evaluation designs. Often it refers to randomised control trials (RCTs): evaluation designs that replicate the experimental designs of the physical and biological sciences and help us to make causal claims. Some claim that this is the strongest and most robust evaluation design.

However, ask an evaluator off the bat what the best evaluation design is for a specific intervention, and you’re likely to get the ambiguous response “it depends”. Interventions don’t work like neat and tidy laboratory experiments – and one can argue that they shouldn’t, given the problems they address. Interventions should be bold, innovative and conceptualised to perform a specific function. Although this is great for social and human development, it can be challenging for evaluators.

In a landscape of interventions of all shapes and sizes, evaluators are presented with the task of being an educator, advocate, technician, and sometimes even a magician. At the core of an evaluator’s response to an evaluation should be the question ‘what is the purpose of this evaluation?’ More often than not, commissioners of evaluations are interested in outcomes and impact, but as a result of various factors (such as the programme design at conceptualisation and the time at which the evaluation is commissioned), an RCT becomes unfeasible. At this point evaluators need to do the best they can with what they have. To use the analogy of travelling from point A to point B: the best possible vehicle is not the Rolls-Royce envisioned, but rather a dirt bike.

Some work still needs to be done to shake off the stigma of not producing an evaluation designed using the often elusive ‘gold standard’ – in the hope that the ‘gold standard’ of evaluation will come to mean whatever design is fit for purpose.