By Joaquín Navas, Ricardo Ramírez and Mariana López-Fernández
Communication and evaluation generally develop independently; however, they share common elements both in theory and in practice.
DECI (Designing Evaluation and Communication for Impact) is an action-research project financed by Canada's International Development Research Centre (IDRC). It was established in 2009 and is currently in its fourth phase. The DECI project combines evaluation and communication capacity building. This approach creates a decision-making framework that helps research projects be more strategic with their activities and more adaptable to unexpected situations in order to maximize their impact.
The objectives of the DECI Project are:
- To train regional mentors in evaluation and communication in East Africa, Asia and Latin America.
- To provide capacity building for partner project staff in evaluation and communication.
- To provide consulting services to develop evaluation plans and communication strategies.
- To explore methodological innovations through action-research.
The partners are research projects, also financed by IDRC, working in ICT-for-development fields such as open development, collaborative and open science, digital rights, and artificial intelligence for development.
Below, we provide a brief description of the utilization-focused evaluation (UFE) approach and contrast it with conventional evaluation approaches. In addition, we share specific features and principles of the DECI approach. We conclude with the main lessons learned and challenges that have emerged throughout the different stages of the DECI project.
UTILIZATION-FOCUSED EVALUATION AS A METHODOLOGICAL APPROACH
Utilization-focused evaluation (UFE; Patton, 2008) is a decision-making framework in which stakeholders – known as “primary intended users” – take ownership over the design and implementation of a project's evaluation. It should be noted that, from a purely theoretical perspective, UFE is not promoted as a participatory approach. However, it offers the opportunity to create spaces that are highly participatory. This is possible when conditions allow the primary intended users to take the driver's seat in evaluation and communication. In the case of DECI, the team takes on a mentoring role to facilitate the process.
The evaluations facilitated by DECI have been participatory in the sense that the primary intended users – often project managers and other key members of the project team – are the ones who define the purpose of the evaluation, the uses they want to give to the findings, and the evaluation questions.
Each project assigns an internal evaluator who is supported by a DECI mentor. The evaluator is responsible for gathering data, analyzing it, and preparing the reports, a process that unfolds in consultation with the primary intended users. It could therefore be said that in the case of DECI, while the evaluation design is highly participatory, its implementation follows a collaborative approach. The different levels of participation throughout the evaluation process invite reflection on what constitutes participatory evaluation and what does not. The DECI experience has demonstrated that utilization-focused evaluation contrasts with conventional evaluation in several respects, as shown in the following table.
THE DECI MODEL
While DECI is mainly a capacity-building initiative, it is also a research project on capacity-building methodology in evaluation and communication. The combination of service provision with research has allowed us to experiment; this has led to new perspectives and the definition of working principles. Below, we list the most relevant specific features of the DECI model:
Building capacity through a mentor
Evaluation capacity building is typically delivered through workshops where participants are exposed to evaluation theories and methods. Similarly, workshops on research communication usually feature conventional planning methods and tools (Quarry & Ramírez, 2014). In the DECI project, the workshop format has been dropped because various experiences yielded limited learning outcomes from this training modality. Instead, training is delivered through a mentor who offers timely, just-in-time support to the person the organization has chosen to facilitate the evaluation. The mentor offers support at a pace that matches each project's calendar and stage of development. This reflects experiential learning, through which teams from partner projects learn through action and reflection (Kolb, 1984). It allows mentor support to be adjusted to every context, since knowledge is shared in a timely fashion. Timing is important: support is provided at those moments when the person being trained can apply the steps in ways that are specific to their project's needs. This tailored approach appears to be more effective than offering pre-designed, standard steps, like recipes shared through a workshop. In our experience, it is also more cost-effective for the partner project.
The evaluator as process facilitator
In the UFE approach adopted by DECI, the role of the external evaluator is not that of an “expert” who intervenes at the end of a project to judge its achievements. Instead, the role is one of a facilitator who guides the project evaluator and the intended evaluation users through a set of decisions that must be taken throughout the evaluation process.
Evaluation as a framework for decision-making
Traditionally, project evaluations are carried out in response to donor requirements that emphasize accountability: verifying that objectives are being met and resources are being used in a satisfactory manner.
However, in the DECI process, and thanks to the flexibility of its donor IDRC, evaluation is seen as a tool for learning and decision-making throughout the life cycle of the projects. The UFE approach responds well to these aims because it proposes that the evaluation be designed and implemented from the beginning of the project rather than at the end. The primary intended users also formulate the questions that the evaluation should answer, so that the findings generated are useful. From this perspective, evaluation is carried out not to provide accountability to a donor, but rather as a tool for collective reflection and strategic management that enables organizational learning. It is worth adding that in the IDRC context, the partners already provide technical and financial reports on a regular basis, which means the accountability requirement is covered.
Integrating evaluation and communication into projects
Communication planning and evaluation design share some common steps, and DECI, as a research project, has tested methodological innovations while integrating them. The decision-making required to design evaluation and communication creates a space for internal reflection within project teams about how best to achieve outcomes and improve effectiveness. Moreover, when working with research projects that address new and emerging fields, adaptive management has become a priority.
Practical knowledge is only acquired through experience
Contrary to the notion of ‘best practices’, which suggests pre-determined “recipes”, practical wisdom refers to the ability to respond to every situation in a unique way, according to the specificities of each context (Schwartz & Sharpe, 2010).
The capacity-building approach adopted by the DECI project creates spaces where practical wisdom, rather than so-called ‘best practices’, guides decisions on evaluation and communication. We have learned that practical wisdom is acquired through accumulated experience. It guides the mentoring process as we test specific strategies suited to each project's evaluation, paying attention to detail as we facilitate. It is about responding to the unique features that arise as decisions are made about the evaluation and communication plans. This necessitates continual reflection and detailed documentation by the mentors for every experience, which leads to the preparation of case studies summarizing the process and outcomes.
The DECI experience has taught us some meaningful lessons, of which the following stand out.
Project readiness: a critical factor
The “readiness” of a project to receive capacity building is not an evaluation tool as such, but rather a management tool for assessing the enabling context (Ramírez, 2017). We have learned that readiness is a critical factor in implementing the capacity-building method we use in DECI. The notion of readiness comes from the UFE approach (Patton, 2008), which stresses the importance of verifying from the beginning: (i) whether there are staff interested in receiving training; (ii) whether the organization's directors approve the process and provide resources; (iii) whether the donor makes room for the evaluation by allowing the project teams to be included in the evaluation design, giving them the freedom to address a range of purposes and uses for the evaluation.
Readiness is also shaped by the balance of power between a donor and funding recipient. In the best-case scenario, the donor allows the partner a certain level of independence so that it can at least participate in the evaluation design.
These conditions enable the project teams to enjoy some ownership over their evaluation, which we found contributes to project teams acquiring an evaluative culture (Mayne, 2009). We have also learned that when the readiness context is not favourable (because most of the aforementioned conditions do not exist), it is better to suggest that the project team opt for a conventional type of evaluation.
The trust between the mentor and evaluator makes all the difference
An important factor in determining whether the capacity-building model proposed by DECI yields results is the level of trust between the mentor and the evaluator. Among the mechanisms used to generate this trust, the personal style of each mentor is key, which in turn is influenced by the mentor's familiarity with the project's context. In addition, flexibility and patience on the mentor's part are important for adjusting to the project's time requirements and assessing the context in order to help the evaluator pick up the pace or give them more time. Lastly, in the DECI context, the fact that the mentor does not represent the donor is a plus. Since DECI itself is also a research project funded by IDRC, it is subject to the same conditions as the partner projects we support; in that sense, we are peers.
The following are the most common activities through which DECI project mentors generate trust with their partners:
- Helping to simplify the scope and tasks related to evaluation and communication processes.
- Helping to clarify language when necessary and translate ideas or comments into evaluation uses and key evaluation questions.
- Helping to facilitate conversations between the trained evaluator and the primary intended users team.
- Helping the primary users focus more on the mid-term outcomes rather than the long-term impact that may be difficult to confirm in short-term projects.
- Guiding the evaluator so that the evaluation efforts are useful for the primary intended users as they make decisions, and providing tools and strategies that add value to what the partner project is seeking to achieve.
- Providing a safe space where the project's strategy can be adjusted when necessary.
Value is recognised at the end of the process
The UFE approach is difficult to understand when explained in writing. It is especially difficult to perceive the value it can add to a project when one has not experienced it, particularly when the staff assigned by the partner organization to work with DECI have no prior evaluation experience. However, experience shows that most organizations that collaborate with DECI end up convinced of the added value that the process provides. This approach is learned by experiencing the process. In addition to generating useful findings, organizations appreciate the spaces for reflection and internal dialogue that the mentoring provides. These often result in rethinking strategies, revealing non-explicit assumptions, and confirming the value of previous decisions. The process also provides professional development for the staff members who receive mentoring support.
The following are some of the challenges we have observed while implementing the DECI project:
Staff rotation in the partner projects
Staff changes within a partner project make continuity difficult. In many cases they require the mentor to start the process from scratch with a new person, which doubles the effort and delays the work. Solution: the mentor requires patience.
Lack of time among staff working in the partner projects
The people working in the partner projects are usually at full capacity, leaving little time to dedicate to evaluation or to planning a communication strategy. Solution: offer support with an open calendar to make the mentoring schedule as flexible as possible. The mentor should also have the practical wisdom to recognize and take advantage of moments when the process gains momentum in unexpected ways.
Lack of interest in the person to be trained
In some cases the organization accepts DECI's support and seems to fulfill the minimum conditions of readiness, but the person to be trained turns out to be less interested in learning. This tends to happen when the person has a fair amount of experience in evaluation or communication and sees no value in additional training. Solution: perseverance by the mentor to help the person realize that DECI's proposal can provide them with new knowledge worth gaining, which can also generate useful knowledge for the organization.
Integration between the evaluation design and the communication plan
Most organizations design communication in one way or another, and they generally have a strategy in place even if it is not documented. In contrast, most have significantly less experience when it comes to evaluation. Solution: design the evaluation and communication planning so that they converge, in order to enrich the organization's overall capacity. This requires perseverance, monitoring, and coordinated support by the mentors.
REFERENCES
Brodhead, D. & Ramírez, R. (2014). Readiness & mentoring: Two touchstones for capacity development in evaluation. Paper presented at the CDI Conference: Improving the use of M&E processes and findings, 20-21 March, Wageningen, The Netherlands.
Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice-Hall.
Mayne, J. (2009). Building an evaluation culture: The key to effective evaluation and results management. Canadian Journal of Program Evaluation 24(2): 1-30.
Patton, M. Q. (2008). Utilization-focused evaluation, 4th ed. Los Angeles, London, New Delhi, Singapore: Sage Publications.
Quarry, W. & Ramírez, R. (2014). Comunicación para otro desarrollo: Escuchar antes de hablar. Madrid: Editorial Popular.
Ramírez, R. (2011). Why “utilization focused communication” is not an oxymoron. Communication, media and development policy; BBC World Service Trust.
Ramírez, R. (2017). Un marco para la toma de decisiones en evaluación y comunicación: Resumen de investigación y comunicación. Revista de Comunicación y Ciudadanía Digital – COMMONS 6(1): 23-44.
Ramírez, R. & Brodhead, D. (2013). Las evaluaciones orientadas al uso: Guía para evaluadores. Penang: Southbound.
Ramírez, R., Quarry, W. & Guerin, F. (2015). Community note. Can participatory communication be taught? Finding your inner phronēsis. Knowledge Management for Development Journal 11(2): 101-111.
Schwartz, B., & Sharpe, K. (2010). Practical wisdom: The right way to do the right thing. New York: Riverhead Books.