Breakout Report 9: Evaluating the impact of information and media projects – Knight Foundation

Presenter: FSG

Scribe: LuAnn Lovlin, The Winnipeg Foundation

Evaluating the impact of information and media projects requires breaking down your evaluation process into four main areas:

  1. Describe your project and identify your target audience

  2. Identify the evaluation’s purpose and key questions        

  3. Design your evaluation using effective methods

  4. Report/Communicate the evaluation findings

References and examples were drawn from the new publication IMPACT: A Practical Guide to Evaluating Community Information Projects and from the 2011 Reports from the Field.

Two questions were put to small groups for discussion:

  • Why did you choose this session out of the 10 available?

  • What is one thing you would like to learn or take away about evaluating the impact of information projects?

Discussion and report-backs surfaced a broad range of things participants wanted to learn or take away from the session. Highlights included:

  • Trying to influence the way communities plan for change and growth

  • Assess impact beyond numbers

  • How to measure change

  • Measure impact and be able to report on it in a timely fashion

  • What options are available to survey, when budget is an ongoing consideration?

  • It is always a question of balance – we never have enough resources to do everything we want to

  • We all need to cut through the clutter and think in simple terms when it comes to evaluation: ask yourself, what can you/your organization reasonably do, given resources and time?

  • How do you know your project will, or has, made a difference?

  • Was it worth the time, money, resources?

With those questions on the table, FSG built a framework for discussion that broke down each of the four areas and engaged participants to work through some of the important considerations of building effective project evaluations.

The facilitators’ experience working on evaluation of current community information projects indicates that many project grantees are having trouble clearly defining the target audience for their projects. Project evaluation should be guided by the question: “If your project is successful, what change, among which members of the community, do you expect to see, want to see, or hope to see?” These indicators are different. And how do you use the information gathered for further community engagement?

Participants suggested that talking about evaluation at the inception of a project crystallizes direction and effort. What is the issue, activity, change or outcome, and what does success look like? If we can’t identify that, we can’t evaluate it, so definitions around evaluation and what we are measuring must be clear. It is important when developing evaluation measures to ask the ‘so what’ of outcomes: “So what will change, and so what will it mean?”

Evaluation should be focused and simple. Trying to do too much with too many audiences is tough and may not get at the most important things you are trying to learn through evaluation. Goals of the evaluation (e.g., building capacity, creating awareness) may be more measurable outcomes. Another consideration with evaluation outcomes is: What is the impact of sharing information with the broader community? Does it help people feel more connected to their community? How do you measure this?

Participants also raised other questions: Is donor engagement a key indicator in evaluation? Is engaging new donors in an area of community interest also a valid evaluation indicator?

Participants broke into smaller working groups and addressed the following questions:

  • What are the characteristics of your target audience?

  • How many people are in your target audience?

  • How do you intend to reach and engage them?

  • How will you know you have reached and engaged your target audience?

No evaluation should be conducted without a clear, intended use of the findings.

A way to focus thinking around evaluation is to ask your organization/group: The purpose of our evaluation is to understand what? The findings will be used to do what?

When designing evaluation, consideration has to be given to stakeholders, which can be a number of different groups connected to your project. You must ask: Who will be entitled to use the outcomes of the evaluation? Will the information be used to make things better, build understanding and awareness? Are there political or persuasive uses for the findings? Will it help project staff, community members involved in the project, peers (best practices) and other partners?

It is important to ask and answer: ‘What do you hope to learn from your evaluation?’

Evaluation can also identify next or ongoing needs, and should take into consideration audience groups, how they are using the information and how we can keep them engaged. FSG suggested keeping evaluations short and very targeted (three to five questions).

Choosing the right evaluation methods (performance metrics) is important. The most common methods include:

  • Website analytics

  • Social media analysis

  • Online surveys/polls

  • Interviews

Participants found that online analytics require time to set up, to fully understand and to make the best use of. Social media tools were simpler for measuring and tracking ‘liking, posting and sentiment’ (how followers feel about the issue), but are often not broad enough to provide in-depth feedback. Surveys, interviews and online polls can provide a deeper understanding of impact. Some unexpected outcomes may also be realized from a more detailed evaluation.
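As a purely illustrative sketch of the kind of simple, targeted measurement discussed above: once a short (three-to-five-question) survey is collected, tallying the results needs nothing more than basic counting. The question wording, rating scale and response data here are hypothetical examples, not from the session.

```python
# Illustrative sketch: tallying responses to one question of a short
# evaluation survey. The question, 1-5 agreement scale, and data below
# are hypothetical, not drawn from the session report.
from collections import Counter

# Hypothetical ratings for: "The project helped me feel more
# connected to my community." (1 = strongly disagree, 5 = strongly agree)
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

counts = Counter(responses)                    # how many chose each rating
average = sum(responses) / len(responses)      # mean rating
# Share of respondents who rated 4 or 5 ("favorable")
favorable = sum(1 for r in responses if r >= 4) / len(responses)

print(f"Responses: {len(responses)}")
print(f"Average rating: {average:.1f}")
print(f"Favorable (4-5): {favorable:.0%}")
```

Even this minimal tally gives a reportable number (an average and a favorable share) that can be communicated quickly, in keeping with the suggestion to keep evaluation short and timely.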

Session participants agreed that it is important to communicate and report your evaluation findings. Before doing so, your organization should have identified whom you want to share your findings with and for what purpose. You should also know what uses of the information you hope to see and what communication strategies will be most effective for informing your audiences.