Evaluations can be useful tools for measuring whether a given activity, in this case an intergroup program or dialogue, achieves the goals originally set by the organizers, conveners, or participants. Evaluations are often key to learning from previous experiences and to shaping future events and efforts.

Unlike many other programs and initiatives, the impact of intergroup relations work – the forging of connections across race, culture, and other divides to deepen our collective potential for economic and social justice – does not easily convert into quantitative data. While other programs can readily cite how many clients received services or measure how tutoring affected reading scores, relationship building is a more amorphous outcome that traditional assessments cannot easily capture. This makes it challenging to evaluate the progress of many initiatives, particularly those that focus on relationship building exclusively, or at least before turning to issue areas of mutual concern.

Thus, many intergroup initiatives rely on anecdotal evidence, which is often solicited at the conclusion of a program or dialogue series. A popular education format is commonly used for these evaluations, in keeping with the pedagogical approach of the rest of the program.

Following the Latin American and Caribbean Community Center’s 2005 anti-oppression workshops, facilitators gathered immediate feedback in a popular education style. Participants were asked what parts of the workshops they loved, what parts they liked, and what parts could be improved. Feedback was documented on post-it notes and flip charts.

In addition to a written survey, Be Present, Inc. holds a verbal “highlights in learning” exercise after sessions, in which participants go around the room and share what they gained. These exchanges are captured on video.

In addition, while it is feasible to conduct evaluations at the end of program or dialogue sessions, the long-term work of breaking down barriers to envision and ultimately realize new intergroup dynamics is not easily captured from a short-term vantage point. Changes in attitudes and perspectives may appear in the short term, but the passage of time often better clarifies which changes have or have not taken hold within a community. Immediate evaluation, then, may miss ramifications that only the longer-term perspective of hindsight can reveal.

Gamaliel of Metro Chicago leaders followed up with participants after their intergroup meetings to clarify lingering concerns and quell any remaining misunderstandings. These one-on-ones gathered four main pieces of information: 1) one word that describes how the participant is feeling after the meeting; 2) the moments of tension and growth during the meeting; 3) how we performed as participants in the meeting; and 4) what content the participant learned from the meeting.

Be Present, Inc. emphasizes the organization’s interest in receiving evaluations and feedback well beyond the conclusion of the actual event. Annie Tobias and Arianna Robinson shared, “We’re also very open. Even if the participant doesn’t say something right away, we may get a call a few weeks later if they still have a feeling about something that happened. We’re open and we encourage feedback of all types. If you don’t have a fabulous experience, there’s a lot that can be learned and gained from that. That feedback is encouraged as well.”

Other organizations that have the time and resources to engage in larger-scale evaluations are able to gather data that extends well beyond the anecdotal. Some employ case studies in which evaluators seek to understand the impact of a given effort on the larger community. Others use a pre-test/post-test research model to better understand how participants and their communities changed over the course of a program or dialogue.

Helpful materials

This document (.pdf) from the Catalyst Project outlines an interactive way to do evaluation in a large group setting.

Human Rights Education Associates (HREA) produced a technical assistance guide for evaluation. This primer (.pdf) explores different evaluation methods and data collection techniques. It is also available in Spanish.

Image: Atelier Teee, http://creativecommons.org/licenses/by-nc-nd/2.