Creative Consulting & Development Works

We are a research, evaluation and communications consultancy, serving nonprofits, governments and donors with innovative solutions in the development context.

Your groundbreaking data will go unnoticed if you can’t communicate its value

13 April 2017

Meet your Creative Data Visualisation facilitator, Fia van Rensburg, who is based at our new Pretoria office.

Fia has more than 18 years’ experience in the development sector, specializing in monitoring and evaluation. She has worked in provincial and national government as well as in international organizations. In addition to core skills in M&E and training facilitation, she is passionate about capacity building and is interested in integrating issues of gender and appreciative inquiry into her work.

There is a widespread misconception that monitoring and evaluation (M&E) is a field for geeks and nerds, relevant only to those who understand it. This couldn’t be further from the truth. Any industry that conducts interviews, gathers data or does research requires M&E skills. We interviewed Fia to find out more about her, the training and what participants can expect.

CC&DW: M&E, when correctly applied, has the power to improve systems and programmes, but not many see the creative side of it. Why did you get into the M&E field? What excites you about research?

Fia: Well, I’m just a child at heart! And I think it’s very important to be able to communicate effectively with people that are not involved in research and evaluation. Creative Data Visualization is a way of developing a language that can reach a much broader audience.

The fact is, things change in the world all the time, and we communicate differently in business nowadays. We WhatsApp and Skype, so why would we want to keep visualizing data the same way we did ten years ago instead of keeping up with developments in the world?

I am interested in making things fun and accessible. Research is alive! It’s not a dry and boring subject, and I think one needs to bring that out. People often complain about the lack of data in Africa, and we need to push that agenda more; how else will we plan and ensure that what we develop is based on evidence? Even casual communications, like emojis, have already started enabling us to use visuals to communicate differently.

CC&DW: How will the workshop be conducted? And what can participants look forward to?

Fia: We’ll look at some traditional ways of reporting and presenting data, and from there facilitate a new way of thinking about presentation: using what participants are already familiar with, and exploring how it can be used optimally to present data and entice people to apply the information differently. We’re building technical skills. Think about how sometimes you have an idea and you know how it looks, but you don’t have the skills to develop it.

CC&DW: The general assumption is that M&E is for development practitioners, government officials and non-profits, but who else needs these skills and this knowledge? Which other industries would benefit from these workshops?

Fia: Everybody who has to do a presentation, write a report or communicate data. The moment you do a presentation, you work with visuals and need visual mediums, and not everybody has access to tools like Prezi. With just what we have in MS Office, you can improve your data visualization. Any report has data in it. The corporate sector, the social sector, NGOs, construction and training, investment, and trainers and facilitators can all benefit from this.

CC&DW: What impact can the right or wrong visualization or presentation have on your data?

Fia: The advantage of correct and exciting data visualization is that you generate interest in your data and in how your evaluation report is used. Let’s face it: the quality of your evaluation depends on the actual use of the findings. That is a contentious statement, but there is some truth in it. However good your report, if it doesn’t get used, what’s the value? Data visualization skills will help you take that knowledge that sits on the shelf and put it to use.

One of the consequences of incorrect, poor or irrelevant data visualization is cultural insensitivity: using a thumbs-up for something that went well, for example, when in some cultures that gesture might be rude. You have to be aware of correct visualization tools. It’s a new language that people have to learn.

I think it’s very important to be able to communicate effectively with people across all disciplines, including those who may not have a deep understanding of research and evaluation.

No matter which industry or sector you’re involved in, if you need to present data, it is important to know which tools are available to you (both traditional and modern ones) and how to use them. As Fia mentioned, there is no value in having groundbreaking findings if you can’t communicate your information in a clear, concise and exciting manner.

Improve your visualization skills at our next Creative Data Visualisation workshop. At the end of the workshop, you will be able to:

  • Understand the power of creative thinking and strategic data visualization;
  • Maximize the visual impact of standard graphs and diagrams;
  • Design creative images that explain scope, relationships, and interactions between concepts and phenomena; and
  • Create simple infographics.
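The workshop itself works in MS Office, but the underlying principles — sort your data, label values directly instead of relying on a cluttered axis, and strip away chart junk — carry over to any tool. As a toy illustration of those principles (our own sketch, not workshop material), here is a minimal Python snippet that renders a sorted, directly labelled horizontal bar chart in plain text:

```python
def text_bar_chart(data, width=30):
    """Render a sorted, directly labelled horizontal bar chart.

    data: dict mapping category -> non-negative numeric value.
    """
    # Sort largest-first so the eye reads the ranking immediately.
    items = sorted(data.items(), key=lambda kv: kv[1], reverse=True)
    label_w = max(len(name) for name, _ in items)
    top = max(value for _, value in items)
    lines = []
    for name, value in items:
        bar = "█" * max(1, round(value / top * width))
        # Direct value labels replace a cluttered numeric axis.
        lines.append(f"{name:<{label_w}}  {bar} {value}")
    return "\n".join(lines)

print(text_bar_chart({"Gauteng": 42, "Western Cape": 28, "Limpopo": 11}))
```

The same three decisions — ordering, direct labelling, minimal decoration — are exactly what you would apply to a default Excel or PowerPoint chart.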

Register here before 20 April 2017 to get your early bird special.


Evaluation is serious business and learning about evaluation is serious fun

27 February 2017

We are serious about having fun when presenting our training workshops. While research and evaluation may come across as mundane to some, we have built appropriate experiential learning opportunities into our training workshops, ensuring active rather than purely theory-based learning.

Through our unique, interactive activities built into our training, we would like to share our passion for research and evaluation. In February, we held our Qualitative Data Collection training workshops in Cape Town and Pretoria. Join us for the upcoming Creative Data Collection training workshops in Cape Town (8 & 9 March) and Pretoria (14 & 15 March).

Copyright CC&DW

CC&DW Research intern, Sitho Mavengere and course participant, Ramona Mohanlall

Evaluation is serious business – in fact, one of the factors distinguishing evaluation from research is that evaluation inevitably involves a judgement. Knowing that an evaluation will influence decisions about programme design, whether or not a programme will continue to receive funding, and the direction of policy or legislation, can weigh heavily on the shoulders of an evaluator. Irrespective of our theoretical orientation, we cannot escape the reality that evaluators have to put their heads on the proverbial chopping block when they present findings and make recommendations. High levels of responsibility require high levels of professionalism, rigour, accuracy and accountability. Ongoing development of evaluation capacity is an important component of the professionalization of evaluation.


Qualitative data collection tools training in action

Sound theoretical knowledge is the non-negotiable basis for capacity building, and there are a number of excellent academic offerings on programme evaluation. At CC&DW we have recognised that, complementary to academic programmes, there is space for experiential learning through skills development training workshops that give participants the opportunity to learn through application and experimentation. Knowledge does not equal capacity. Deep capacity requires application of knowledge; the quality of application depends on skill; and the quality of skill depends on practice.

Steps in Action Learning | copyright

The classic professions include both theory and praxis. A medical doctor has to do a two-year internship after completing six years of formal study, even though extensive practical work is already integrated into the curriculum from the 3rd or 4th year, depending on the faculty. During their internship, junior doctors are registered with the Health Professions Council of South Africa (HPCSA) for “supervised practice”, and they are only allowed to practise independently after successfully completing the internship. Similarly, clinical psychologists do internships, and lawyers and accountants do articles.

Taking a page from this book, we make sure that we provide the opportunity to our workshop participants to get exposure to skills training and a taste of what it means to apply their knowledge and skills. CC&DW has developed a series of complementary training workshops which enable practical peer learning and are embedded in a sound theoretical foundation. All our courses have space for learning through fun activities, and elements of surprise have been built into each workshop. This makes evaluation theory and skills real and relevant, with specific activities through which participants can try out what they have learnt, and then reflect on their experiences. This is action learning in practice!

The Importance of an Evaluability Assessment in Research Analysis

24 January 2017


An evaluability assessment (EA) checks whether a programme is ready for an evaluation and, if so, how the evaluation should be designed to ensure maximum efficacy and to allow evaluators to measure the programme’s effects accurately. But the assessment goes further than merely indicating whether or not a programme can be evaluated.

An EA is an important step in determining which elements of a programme are open to further evaluation and which are not. This is especially important for non-profit organisations, whose activities and goals are often quite broad. An assessment can be undertaken jointly by the evaluator, programme staff, evaluation sponsor and other relevant stakeholders.

This is an especially important aspect of development in South Africa, a country that lacks data. This gap can mean that programme goals are unrealistic or lack clarity, and that there are no logical links between activities and objectives. Such inconsistencies between plans and what actually occurs result in evaluators being unable to determine which programme activities were implemented as planned, or to measure the true impact of the programme.


Evaluability process diagram/ Evaluability Assessment: A Tool for Programme Development in Corrections©


As a result, evaluations undertaken under such circumstances are limited and waste resources as well as programme staff and stakeholders’ time trying to evaluate a programme that is not ready for a comprehensive evaluation.

Evaluation effectiveness is improved by the EA process in three ways:

1) by stating, plainly and concisely, the programme’s measurable actions;

2) by clearly articulating goals in a realistic and measurable way; and

3) by making a rational link between the programme activities and envisioned goals.

It is also important that the environment is favourable for conducting an evaluation and that key stakeholders all agree for the evaluation to take place. The stakeholders need to agree to use the results of the evaluation to improve the programme, and programme staff must be willing to provide support in undertaking it.

If one or more of the above-mentioned conditions are not met, the programme is deemed unevaluable until further clarification and reassessment can take place.
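The readiness conditions above amount to a simple all-or-nothing checklist. As a hypothetical sketch (the flag names are ours for illustration, not a formal EA instrument), it could be encoded like this:

```python
from dataclasses import dataclass, fields

@dataclass
class EvaluabilityChecklist:
    """One flag per readiness condition discussed above."""
    measurable_actions_stated: bool       # 1) programme actions stated plainly and concisely
    goals_realistic_and_measurable: bool  # 2) goals clearly articulated
    activities_linked_to_goals: bool      # 3) rational links between activities and goals
    favourable_environment: bool          # the context allows an evaluation
    stakeholders_agree: bool              # key stakeholders agree to the evaluation
    staff_supportive: bool                # programme staff willing to provide support
    results_will_be_used: bool            # findings will feed back into the programme

def is_evaluable(checklist: EvaluabilityChecklist) -> bool:
    # A single failed condition renders the programme unevaluable
    # until it is clarified and the programme is reassessed.
    return all(getattr(checklist, f.name) for f in fields(checklist))
```

The design choice worth noting is the `all(...)`: there is no partial credit, because even one unmet condition sends the programme back for clarification and reassessment.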

An EA is a vital part of the evaluation process: when effectively conducted on a poorly designed programme, it can save programme staff and stakeholders the time and funding that would otherwise be wasted if the programme continued functioning unchanged.

Ethics in research and how to handle socio-economic challenges in fieldwork

18 January 2017

Ethics in research are extremely important and should always be maintained to ensure that researchers conduct their work in a professional manner. Simply put, ethics are the rules that distinguish between “right” and “wrong”, “good” and “bad”; they are the norms for conduct that distinguish between acceptable and unacceptable behaviour in society.

Fieldwork training

Copyright CC&DW

In research, and in fieldwork in particular, maintaining good ethics at all times is a must and should be the norm. These ethical characteristics include honesty, objectivity, integrity, carefulness, openness, respect and confidentiality, among others.

Conducting fieldwork is challenging and interesting at the same time. Doing it successfully requires properly trained and well-prepared fieldworkers. Before any fieldwork can commence, our fieldworkers are thoroughly trained over a period of 2–3 days on the context of the programme, ethics and many other issues, so that they can carry out the project successfully.


But in the field of social sciences, things are not as clear-cut or straightforward as we may like. People are unpredictable and complicated, so no matter how well-trained fieldworkers may be, working with communities can bring about unexpected challenges when they respond in very different and unpredictable ways.

An example of such a situation is dealing with requests for material things from study participants. These may include requests for money, clothes or anything else while conducting research, and they stem from the expectation that our team is there to offer material benefits in exchange for information. This is never the case.

On top of the ethics training that fieldworkers receive, it is also important to emphasize that they always need to be professional when dealing with different kinds of participants in different communities, regardless of socio-economic conditions.

Fieldworkers therefore always need to be trained in handling unexpected social situations in a professional and ethical manner. Treating people and their questions with respect and tact, and explaining the project terms to them in an understandable manner, will help ensure that everyone involved is on the same page about the data collection.

Register for our upcoming evaluation training workshops now

16 January 2017


We’re very pleased to present our evaluation training schedule for 2017. Click here to register now for these upcoming courses.

After you register, we’ll get in touch to finalise the details. And if you’d like to register now for one later on in the year, that’s definitely allowed — we’ll be sure to remind you closer to the time.