When Creative Consulting & Development Works (CC&DW) was approached by the South African Institute of International Affairs as well as Konrad-Adenauer-Stiftung in South Africa to be part of the University of Cape Town (UCT) Careers evening, CC&DW welcomed the opportunity.
The evening started off on a high note, with many organisations in attendance, including the South African Navy, the Human Sciences Research Council, the German Academic Exchange Service, Cognia Law, the Human Rights Media Centre and the Heinrich Böll Stiftung. Each of these organisations was reaching out to students, sharing information and forging relationships.
Students from all walks of life and academic departments seized this opportunity with both hands. The hunger in their eyes truly made the event worthwhile.
When one explains CC&DW’s vision and mission to young students, their eyes immediately widen with expectation, wonder and, most of all, unwavering interest. A decade ago, mentioning the word “evaluator” was like speaking Greek to an Italian. The world of Monitoring & Evaluation (M&E) was then considered to be in its infancy. Now, with the need to account for policies implemented, project reviews coming into the spotlight, and growing investment in development and service delivery, M&E is on the rise and is a much-needed intervention.
After fourteen years of working in the development sector, CC&DW knows first-hand the importance of grooming students who want to pursue a career in M&E, especially young emerging evaluators who want to gain solid evaluation experience. CC&DW has to date hosted 39+ young students, mainly previously disadvantaged graduates, at our offices in Cape Town and Pretoria. We have mentored and helped these young researchers, communicators, evaluators and social entrepreneurs to launch their careers by gaining practical work experience. Through our if i could… internship programme we have also placed 100+ international graduates in internships with nonprofit organisations in Cape Town and New Delhi. Through mentoring and helping our youth we hope to change perceptions, encourage new ways of thought, identify and document learnings, make recommendations and contribute to an ever-changing society. There was a wonderful sense of curiosity at the UCT careers evening. The atmosphere was electric and the students were engaged. They showed a keen understanding and a desire to learn, contribute to society and build their careers in a meaningful way. It is this spirit and desire for learning that fuels development and motivates us the most.
This is what the Zenex Foundation did at its Breakfast Seminar in Gauteng on 11 October 2017, when the results of the evaluations of its Learner Support Programme were presented. By publicly sharing evaluation results without any sugar-coating, the Zenex Foundation showed how serious it is about the evaluation of its programmes. This is congruent with the rigour of the Zenex Foundation’s approach to evaluations.
Having undertaken one of these multi-faceted evaluations over a period of three years, Susannah Clarke and Leanne Adams of Creative Consulting & Development Works presented the findings to the audience at the seminar.
One of the comments from the audience at the breakfast meeting was of appreciation for the fact that Zenex Foundation did not try to hide what did not work so well. In fact, the presenters mentioned that Zenex Foundation wanted them to “tell the truth”.
Professional evaluators know that the courage to make a judgement comes with the territory – the 1983 Michael Scriven adage “what is good is good and what is bad is bad and it is the job of evaluators to decide which is which” is well-known in the evaluation community. This seems quite simple. But when faced with high levels of complexity, the good-bad dichotomy becomes an uncomfortable continuum with a fuzzy in-between space, where the mist of conditions, dosage, assumptions, unforeseen consequences, changes in the environment over time and many other variables challenges both the rigour of an evaluation and the sound judgement of the evaluation team to see the wood for the trees.
Although the judgement element is inherent in an evaluation, it is not always welcomed. Michael Patton refers to the uneasiness with evaluators’ incisive questions in the prelude to his 1997 book “Utilization-Focused Evaluation”, where he pictures the archangel asking God on the seventh day of creation, “…how do you know that what you have created is ‘very good’?” Then, on the eighth day, God responds: “Lucifer, go to hell”. (See Text Box) Indeed, the evaluator may be the proverbial bearer of bad news, who risks being (proverbially) killed. “Killing” the evaluator comes in many forms, ranging from requests to “soften” findings, to omit certain troublesome findings, to obscure certain issues, and even refusal to accept a report. These are risks that come with the territory and are real challenges to ethics in evaluation.
Attempts to influence evaluators’ independence are as shocking as naivety about the politics of evaluation is surprising. When an evaluation is conducted, there is a lot at stake. Ideally, evaluations inform decisions, and sound decisions at that. It is, unfortunately, a reality that in some environments, especially dysfunctional ones, evaluations can be used for the wrong purpose. It is therefore refreshing to encounter a situation where the data is allowed to “speak”. Only when programmes are willing to hear what has not worked so well is it possible to learn and improve, and to share that knowledge with others who may want to replicate the programme.
The Zenex Foundation’s approach of taking the good and the bad on the chin and committing to learning is commendable. It is in the mix of judgement and learning that progress can be made. The Breakfast Seminar on the Zenex Learner Support Programme demonstrated very clearly that a programme with an authentic intention to make a difference will be brave enough to learn.
Susannah Clarke and Leanne Adams will present findings and lessons learnt that can inform the education sector on these types of learner support programmes, alongside Professor Paul Hobden, who conducted an initial evaluation of the programme.
Notable speakers include Programme Director Ms Linda Vilakazi, CEO of the Oliver and Adelaide Tambo Foundation; Mr Edward Mosuwe, HOD of the Gauteng Provincial Department of Education; and Linda Zuze, Chief Research Specialist at the Human Sciences Research Council. Speakers will respond to evaluation learnings and offer critical reflections on learner support programmes.
Events that create a space to share and discuss evaluation findings within the South African evaluation community are rare. In hosting this seminar, the Zenex Foundation is contributing to strengthening evaluation in South Africa. These efforts also strengthen the education sector through the use of evidence-based approaches to the design, implementation and evaluation of projects and programmes.
Creative Data Visualisation is an excellent tool for re-framing very detailed information in a more user-friendly, accessible way, and is an ideal means of communicating complex evaluation findings. Infographics enable evaluation users to disseminate findings and increase the utility of the findings and recommendations. The infographic produced by CC&DW for the Zenex learner support programme evaluation has been used by the Foundation to disseminate information to project stakeholders.
CC&DW will be presenting a one-day workshop on the use of Creative Data Visualisation at the upcoming 6th biennial SAMEA conference on 23 October. Spaces are limited, so sign up now if you are interested in how to use Creative Data Visualisation in an evaluation.
The SAMEA 6th Biennial Pre-Conference Capacity Building Workshop Series will precede the 6th Biennial SAMEA Conference, with the theme ‘Purpose-driven Monitoring and Evaluation’, and features workshops by distinguished M&E professionals. CC&DW is proud to be selected as an M&E thought-leader and will host a one-day workshop covering Data Visualisation.
Date: 23 October 2017
Presenters: Sophia Van Rensburg and Susannah Clarke
Who should attend: This workshop is designed for individuals involved in presenting data, including researchers, M&E practitioners and programme officers.
Skills requirements: Participants should be competent in Microsoft Office, including Excel.
Data visualisation is a tool for transforming information that is buried in complex reports and statistics into innovative visual products that communicate concepts in a clear and actionable manner. This workshop introduces participants to techniques to improve standard data visualisation (diagrams and charts) and to create innovative visual products that can facilitate the communication of complex stories and ideas. The focus is on developing conceptual skills and critical thinking through a participatory methodology. Participants will have the opportunity to engage in practical learning activities to explore creative data visualisation.
By the end of the workshop, participants will be able to understand the power of creative thinking and strategic data visualisation; maximise the visual impact of standard graphs and diagrams; and create simple infographics.
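As a small illustration of the kind of chart clean-up the workshop deals with (this is a hedged sketch in Python with matplotlib, using hypothetical example data, not actual workshop material), the snippet below applies a few common tidying steps to a standard bar chart: removing chart junk, replacing the axis with direct data labels, and highlighting the key finding.

```python
# A minimal "declutter" sketch: a standard bar chart tidied with common
# data-visualisation steps. All data here is hypothetical/illustrative.
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

subjects = ["Maths", "Science", "English", "Life Skills"]
pass_rates = [62, 55, 78, 70]  # hypothetical percentages

fig, ax = plt.subplots(figsize=(5, 3))
bars = ax.bar(subjects, pass_rates, color="lightgrey")
bars[2].set_color("steelblue")             # highlight the key finding

for spine in ("top", "right", "left"):     # remove chart junk
    ax.spines[spine].set_visible(False)
ax.set_yticks([])                          # drop the y-axis in favour of...
for bar, rate in zip(bars, pass_rates):    # ...direct data labels
    ax.text(bar.get_x() + bar.get_width() / 2, rate + 1,
            f"{rate}%", ha="center")

ax.set_title("Pass rates by subject (illustrative data)")
fig.tight_layout()
fig.savefig("pass_rates.png")
```

The same principle, stripping away everything that does not carry information so that the message stands out, underpins more elaborate infographic work.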
Spaces are limited so book your seat today!
We held a technical assistance session on M&E Design in Cape Town in July 2017.
Have a look at what facilitator Dr Donna Podems highlighted as the main aims and objectives of this training:
“Since every organisation is unique, we understand that generic training is not always the most useful approach in building M&E skills. That’s why we’ve made sure that the technical assistance we provide is flexible yet guided to ensure core M&E concepts are understood and can be applied.
We engage with participants’ M&E needs and challenges. Participants actively apply their learnings about problem statements, activities and results to their own organisational programmes. This means that you walk away with a tangible product, in the form of an M&E framework, and can continue building this at your organisation.”
Here’s what participants had to say:
“Policy and strategy officials should attend this training as they are usually the people who develop indicator frameworks based on problem statements provided to them. Sometimes they have the correct problem statements but derive the wrong indicators and use incorrect data sources.”
“I’m more on the monitoring side and I’ve heard so much about evaluation. I was wondering how to connect the two and that is what this clarified for me. This filled the gap between monitoring and evaluation for me.”
“Donna put a great perspective on the way she explained core M&E concepts and the way that she placed less focus on the labels used.”
Here’s how participants will apply what they learnt to their work:
“The session gave me the foresight to be able to see what needs to be done in my organisation to make these changes happen back at the office. There is definitely some direction I will take using what I have learnt here at my organisation.”
“These exercises have pointed out that we need to make sure that we have the skills to do what we need to in order to deliver results.”
Let us know if you would like us to provide customised technical assistance in M&E design for your organisation – contact Itumeleng Ramano today on (021) 4482058 or firstname.lastname@example.org