Pathfinder: A Practical Guide to Advocacy Evaluation

Summary

This guide from Innovation Network is an introduction to advocacy evaluation. It outlines what the process involves and how this kind of evaluation differs from programme evaluation. The guide defines advocacy as "a wide range of activities conducted to influence decision makers at various levels." Its approach, learning-focused advocacy evaluation, is structured to produce an evaluation design that yields the type of information funders and advocates need to understand their progress, make mid-course corrections, and ultimately conduct successful advocacy and policy change projects.


The steps in the guide generate information that advocates can use to strengthen decision-making. The challenges it seeks to address include:

  • Time frame - sustaining an effort over time requires strong infrastructure and robust capacity, which keep an advocacy organisation viable for as long as it takes to achieve its ends.
  • Contribution, not attribution - assessing an effort's contribution to change, rather than attempting to prove attribution, yields useful information without alienating partners or depleting resources.
  • Documenting progress - interim measures of success show whether work is on track, inform advocates of their progress, and help them share success stories.


The eight steps of learning-focused advocacy evaluation are:

  1. Evaluation Purpose - examine the forces behind the evaluation and its intended audience. "Clarifying who will receive evaluation findings and how the findings will be used is important to the design of the evaluation plan."
  2. Roles and Responsibilities - study power dynamics among all involved parties; identify who should be part of the evaluation workgroup, how often it should meet, and how close the evaluator should be to the advocacy effort, balancing trust-building with objectivity.
  3. Theory of Change - seek out the ultimate goal(s) and the outcomes along the path to the intended goal, and alignment and agreement among advocates (and funders). "For the most part, advocacy occurs in a highly complex environment resulting in a theory of change that evolves over the life of the engagement."
  4. What to Measure - measure external changes (how the organisation is building support and allies, reading and reacting to opponents and to the climate, and making progress with decision-makers) and internal changes (such as capacity), and integrate the internal and external measurement into an evaluation timeline.
  5. Methodology and Data Collection - enable advocates to learn about their work, make more informed decisions, and be more likely to achieve success through up-to-date data. "Evaluation designs such as summative, quasi-experimental, and experimental designs - while they work very well in some contexts - are less suited to advocacy work. We recommend evaluation designs structured to collect and produce information during advocacy work. We have found that formative and developmental evaluation designs produce valuable information within acceptable timeframes." Decide how to collect data using such tools as monitoring and tracking - ongoing, systematic data collection, such as media tracking and meeting tracking, and telling the story to add context. For example, "If media tracking reports that an organization is getting more earned media, you can review a sample of media articles, analyzing the changes in the media portrayal about the organization and its issue over time."
  6. Analysis, Reflection, and Data Use - foster a culture of curiosity; encourage the evaluation team to review incoming evaluation data regularly as part of strategy meetings; and deliver fresh data ready for meaningful analysis and application.
  7. Communications and Reporting - use a flexible reporting style that captures and conveys what is actually happening, and supports learning. "Structure the evaluation around advocates’ work, rather than trying to shoehorn their work into your evaluation plan."
  8. Checking the Big Picture - review notes and thoughts; reflect on the evaluation purpose as identified; and suggest modifications of the evaluation to the workgroup if changes are needed.


The guide is available in three tailored editions: for advocates, evaluators, and funders.

Source

e-CIVICUS 476, February 25, 2010.