How Do We Know…If Rakes Work?

*This is the fourth post in our new “Ask the Evaluators!” series, exploring real-world assessment of recent media projects to help you make informed decisions about what can be measured, and good questions to ask along the way. Stay tuned for future posts and click here to see all of the articles.*

by Jessica Sperling, PhD

What is a Rake, and how do you evaluate it?

A “Rake” scratches the surface of an issue to engage people of different perspectives around common values. As Active Voice Lab explains:

“Rakes don’t require a traditional story arc—they can be observational, slices of life, profiles of a community under pressure, etc. While they might have a clear point of view, they always include and respect multiple perspectives… Instead of digging in deep, they use a broad, multi-character approach to make their points.”

Given this, some may wonder: how can we evaluate a film that uses such an expansive approach rather than advocating for a specific policy or action? This is a challenging but important question. Below, I take a look at the evaluation of one Rake to highlight overarching lessons – some that apply specifically to the evaluation of Rakes, but some that apply more broadly.

One example: Welcome to Shelbyville
The Story

The 2009 documentary Welcome to Shelbyville examines how residents of a rural Tennessee town experience community change. As described by the producer, Kim Snyder:

“Welcome to Shelbyville is a glimpse of America at a crossroads. In one town in the heart of America’s bible belt, a community grapples with rapidly changing demographics…the economy is in crisis, factories are closing, and jobs are hard to find…[T]hrough the vibrant and colorful characters of Shelbyville, the film explores immigrant integration and the interplay between race, religion, and identity.”

With this focus, the film does not closely examine one specific issue or drill down into a policy perspective. Rather, it presents an array of voices and perspectives to steer the audience toward the shared values and emotions of the varied individuals – and communities – it covers. It thus exemplifies a “Rake.”

The Evaluation (in a nutshell)

The film was part of the Shelbyville Multimedia (SMM) campaign, which was evaluated by Community Science as part of a project entitled “Evaluation of Active Voice’s Ecosystem of Change Model.” The SMM evaluation used a case study approach for two key partnerships and assessed the campaign’s successes and challenges. In this, it drew upon three main data sources:

  • Interviews with representatives of lead and local organizations partnering in the SMM campaign
  • Data from Active Voice, including staff time in working with partner organizations
  • Surveys of community screening attendees

Recommended Practices in Evaluating a Rake

This evaluation offers lessons and recommended practices that apply to other Rake films and their associated campaigns – and, in some cases, to media impact evaluation more generally.
Key take-aways drawn from this evaluation include:

1. Address not only the film, but also any associated campaign.

If a film has an associated campaign, and if you’re looking to understand and improve the film’s impact, you should include a clear focus on the campaign in your evaluation. This focus was central to and highly intentional in the SMM evaluation, as the campaign was core to the film’s potential impact on viewers and on community change. This may be particularly significant for a Rake, since a campaign can help frame how the film is perceived and whether the public perceives a specific, and perhaps policy-oriented, intent.

2. Clearly develop a Theory of Change for your film and/or campaign.

What kind of change do you believe you will make, and what is the pathway toward that change? This should be clearly documented through a Theory of Change, potentially using a logic model or another similar tool. It should identify anticipated proximal and distal outcomes, and it should account for assumptions and external factors. This step is vital for a strategic and thoughtful evaluation, and it also helps clarify one’s own understanding of a film and communicate its purpose to potential partners. As such, a logic model was included in the larger evaluation that contained the SMM case study; it helped to define the Ecosystem of Change model upon which the campaign was based.

3. Have a clear idea of what you think will happen (outcomes) as a result of the film and/or campaign.

Your documentation of outcomes will frame any impact measurement and related data collection. As such, be sure you are considering outcomes – especially those related to your initial idea – that are most relevant to a Rake. In the case of SMM, the film does not directly advocate for specific policy change; it would more likely serve as a point of entry to an issue. If you focus evaluation on outcomes too distant from your film, given its Rake orientation – for instance, if you focus measurement on whether viewers engage in direct action – you may encounter a lack of positive findings due to faulty conceptualization of outcomes rather than a true lack of impact.

Given the Rake focus, outcomes of interest are likely largely attitudinal or awareness-oriented: for instance, constructs could include awareness of an issue itself; empathy with characters, particularly those that may have previously been seen as unsympathetic; exposure to personal experience with an issue; or interest in learning more about an issue. The SMM community screening survey maintained this focus by including items oriented toward knowledge or awareness (e.g., gaining of new knowledge or ideas about immigrants and refugees). Yet, if a Rake film is part of a more pointed campaign oriented toward social change action, you may consider expanding outcomes to include behavioral constructs as well. Thus, the SMM evaluation also chose to include select behaviorally-oriented outcomes in the screening survey (e.g., likelihood of getting involved in immigrant integration activities after watching the film).

Finally, if working with partner entities, you should consider how the film collaboration impacts their organizations – for example, by increasing their public visibility. This is valuable for understanding the film’s varying types of impact, and it may also help a partner organization better understand what it gains from collaboration. Again, this focus was included in the SMM community screening survey.

4. Don’t forget about process evaluation

Though outcome evaluation – i.e., the change effected by your film – is of likely interest, many projects benefit from the inclusion of process evaluation. Process evaluation focuses not on whether a program was “successful” across various metrics, but rather on – no surprise! – the process and implementation of a program. This could include whether a program was implemented as intended, as well as what was challenging or successful in implementation. Often, outcome and process interact; for instance, ineffective implementation may help explain a lack of desired outcomes. A core focus on process evaluation can be particularly beneficial to an evaluation of a collaboration or partnership; as such, the SMM evaluation included a strong process orientation.

5. Use evaluation as a means of learning and informing future practice

Evaluation is not undertaken primarily to mark success or failure (despite what some may believe). Rather, it is meant to allow for informed learning and development. During and after an evaluation, one should take a careful look at the lessons learned and apply that information to ongoing work; we should understand supposed “failures” as opportunities for strategic improvement. In the case of the SMM campaign, lessons from its evaluation were applied to the two subsequent campaigns – Gaining Ground and Kids for Cash – and, more broadly, informed Active Voice’s Ecosystem of Change approach.