We are serious about having fun when presenting our training workshops. While research and evaluation may come across as mundane to some, we have built experiential learning opportunities into our training workshops, ensuring active rather than purely theory-based learning.
Through our unique, interactive activities built into our training, we would like to share our passion for research and evaluation. In February, we held our Qualitative Data Collection training workshops in Cape Town and Pretoria. Join us for the upcoming Creative Data Collection training workshops in Cape Town (8 & 9 March) and Pretoria (14 & 15 March).
Evaluation is serious business – in fact, one of the factors distinguishing evaluation from research is that evaluation inevitably involves a judgement. Knowing that an evaluation will influence decisions about programme design, whether a programme will continue to receive funding, and the direction of policy or legislation can weigh heavily on the shoulders of an evaluator. Irrespective of our theoretical orientation, we cannot escape the reality that evaluators have to put their heads on the proverbial chopping block when they draw conclusions and make recommendations. High levels of responsibility require high levels of professionalism, rigour, accuracy and accountability. Ongoing development of evaluation capacity is an important component of the professionalisation of evaluation.
Sound theoretical knowledge is the non-negotiable basis for capacity building, and there are a number of excellent academic offerings on programme evaluation. At CC&DW we have recognised that, complementary to academic programmes, there is space for experiential learning through skills development training workshops that provide participants with the opportunity to learn through application and experimentation. Knowledge does not equal capacity. Deep capacity requires application of knowledge; the quality of application depends on skill; and the quality of skill depends on practice.
The classic professions include both theory and praxis. A medical doctor has to do a two-year internship after completing six years of formal study, even though extensive practical work is already integrated into the curriculum from the 3rd or 4th year, depending on the faculty. During their internship, junior doctors are registered with the Health Professions Council of South Africa (HPCSA) for “supervised practice”, and they are only allowed to practise independently after successfully completing the internship. Similarly, clinical psychologists do internships, while lawyers and accountants do articles.
Taking a page from this book, we make sure that we provide the opportunity to our workshop participants to get exposure to skills training and a taste of what it means to apply their knowledge and skills. CC&DW has developed a series of complementary training workshops which enable practical peer learning and are embedded in a sound theoretical foundation. All our courses have space for learning through fun activities, and elements of surprise have been built into each workshop. This makes evaluation theory and skills real and relevant, with specific activities through which participants can try out what they have learnt, and then reflect on their experiences. This is action learning in practice!
An evaluability assessment (EA) is an assessment that helps to check whether a programme is ready for an evaluation and, if so, how the evaluation should be designed to ensure maximum efficacy and to allow evaluators to measure the programme’s effect accurately. But the assessment goes further than merely providing information on whether or not a programme can be evaluated.
An EA is an important step in determining which elements of a programme are open to further evaluation and which are not. This is especially important for non-profit organisations, whose activities and goals are often quite broad. An assessment can be undertaken jointly by the evaluator, programme staff, evaluation sponsor and other relevant stakeholders.
This is an especially important aspect of development in South Africa, a country that lacks data. This gap can mean that programme goals are unrealistic or lack clarity, and that there are no logical links between activities and objectives. Such inconsistencies between plans and what actually occurs result in evaluators being unable to determine which programme activities were implemented as planned, or to measure the true impact of the programme.
As a result, evaluations undertaken under such circumstances are of limited value and waste resources, as well as the time of programme staff and stakeholders, on a programme that is not ready for a comprehensive evaluation.
Evaluation effectiveness is improved by the EA process in three ways:
1) by stating, plainly and concisely, the programme’s measurable actions,
2) by clearly articulating goals in a realistic and measurable way, and
3) by making a rational link between the programme activities and envisioned goals.
It is also important that the environment is favourable for conducting an evaluation and that key stakeholders all agree to the evaluation taking place. The stakeholders need to agree to use the results of the evaluation to improve the programme, and programme staff must be willing to provide support in undertaking the evaluation.
If one or more of the above-mentioned conditions are not met, the programme is deemed unevaluable until further clarification and reassessment can be provided.
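The evaluability conditions above amount to a simple all-or-nothing checklist: a programme counts as evaluable only when every condition holds. As a minimal sketch (not an actual CC&DW tool — all field names here are illustrative assumptions), that logic could look like this:

```python
from dataclasses import dataclass

@dataclass
class Programme:
    """Illustrative evaluability checklist drawn from the conditions above."""
    measurable_actions_stated: bool     # actions stated plainly and concisely
    goals_realistic_measurable: bool    # goals clearly articulated and measurable
    activities_linked_to_goals: bool    # rational link between activities and goals
    environment_favourable: bool        # context allows an evaluation
    stakeholders_agree: bool            # stakeholders agree and will use the results
    staff_supportive: bool              # programme staff willing to support the work

def is_evaluable(p: Programme) -> bool:
    """A programme is evaluable only if every condition is met."""
    return all(vars(p).values())

# Example: a single unmet condition renders the programme unevaluable
programme = Programme(True, True, True, True, False, True)
print(is_evaluable(programme))  # False
```

The point of the sketch is the `all(...)` check: there is no partial credit — any failed condition sends the programme back for clarification and reassessment.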
An EA is vital to the evaluation process: when effectively conducted on a poorly designed programme, it can save programme staff and stakeholders the time and funding that would otherwise be wasted if the programme were to continue functioning unchanged.
Ethics in research are extremely important and should always be maintained to ensure that researchers conduct their work in a professional manner. Simply put, ethics are the rules that distinguish between “right” and “wrong”, “good” and “bad”: the norms of conduct that separate acceptable from unacceptable behaviour in society.
In research, and in fieldwork in particular, maintaining good ethics at all times is a must. These ethical principles include honesty, objectivity, integrity, carefulness, openness, respect and confidentiality, among others.
Conducting fieldwork is challenging and interesting at the same time. Doing it successfully requires properly trained and well-prepared fieldworkers. Before any fieldwork can commence, our fieldworkers are thoroughly trained over a period of 2–3 days on the context of the programme, ethics, and many other issues, in order to carry out the project successfully.
But in the field of social sciences, things are not as clear-cut or straightforward as we may like. People are unpredictable and complicated, so no matter how well trained fieldworkers may be, working with communities can bring unexpected challenges when people respond in very different ways.
An example of such a situation is dealing with requests for material things from the participants of a study. These may include requests for money, clothes or anything else while conducting research, and stem from the expectation that our team is there to offer material benefits in exchange for information. This is never the case.
Beyond the ethics training that fieldworkers receive, it is also important to emphasise that they must always remain professional when dealing with the different kinds of participants in different communities, regardless of socio-economic conditions.
So fieldworkers always need to be trained in handling unexpected social situations in a professional and ethical manner. Treating people and their questions with respect and tact, and explaining the project terms to them in an understandable manner will help make sure everyone involved is on the same page about the data collection.
We’re very pleased to present our evaluation training schedule for 2017. Click here to register now for these upcoming courses.
After you register, we’ll get in touch to finalise the details. And if you’d like to register now for one later on in the year, that’s definitely allowed — we’ll be sure to remind you closer to the time.
On Wednesday the 11th of January, we hosted a Design Thinking workshop in collaboration with the Western Cape Department of Economic Development and Tourism (DEDAT) and the Cape Craft and Design Institute (CCDI). The workshop served as an introduction and awareness initiative for a study the Department commissioned CC&DW to conduct, titled A Baseline Study on the Western Cape Design Eco-System. The Department seeks to identify strategic interventions that continue to position design as a catalyst for economic growth in the Province. To inform these interventions, the Department has commissioned this study to gain a better understanding of the provincial design landscape, its challenges and its opportunities.
Local business owners and designers from various sectors attended the workshop to understand what ‘Design Thinking’ really is, as well as the role design can play in their businesses.
So, what is it?
“Design Thinking has its core in human empathy…it’s not a quick fix, but it’s a real fix”, said guest speaker Johan van Niekerk, Industrial Design programme leader and lecturer at the Cape Peninsula University of Technology (CPUT). Design Thinking, which is often misunderstood, is not adopting a vague and “airy-fairy way” of thinking nor does it necessarily mean you need to start thinking like an artist. It refers to a broader, more creative approach to problem-solving compared to conventional practices.
How can I practically apply Design Thinking methodologies to my business?
Design Thinking methods and strategies are not limited to any particular industry. The process places greater emphasis on research and analysis in the initial phase of product conceptualisation. This means more time, money and effort spent understanding what your client needs, how the product or service should be made, and ultimately how to deliver it. Although this raises more questions and causes uncertainty at the beginning of creating a product or service, the end result is a useful product with a clear direction and a higher chance of success.
“Design is a human-centered and collaborative approach to problem solving, using a designed mindset to problem solving.” – Tim Brown
Here’s what participants of the Design Thinking workshop had to say about their experience. Keep an eye out for follow-up blogs during the lifespan of this study.