EVALUATION 2018: Speaking Truth to Power
Whether the phrase Speaking Truth to Power originated with the American Friends Service Committee's 1955 pamphlet on nonviolence, the writings of civil rights leader Bayard Rustin, a speech by ACLU president Patrick Malin, or even plays by Sophocles or Shakespeare, it has been used by social justice activists, clergy, academics, journalists, politicians, and evaluators to describe taking a stand and speaking out.
Join your evaluation colleagues in a lively, critical conversation about the role of evaluators and evaluation in Speaking Truth to Power.
As you think about how you might contribute to this important dialogue, consider these questions as catalysts for presentations, panels, posters, and even performances and other creative session formats for Evaluation 2018:
What responsibilities do we have as evaluators for Speaking Truth to Power? When? In what contexts or situations? With what consequences? At what risk or cost? To whom, with what expectations?
We look forward to engaging you in this important conversation at Evaluation 2018 in Cleveland, October 28 – November 3!
Submit a Proposal for Evaluation 2018
Join Leslie Goodyear for a live Coffee Break presentation to hear more about this year’s theme.
Details on the Evaluation 2018 Theme
February 16, 2018, 2:00 - 2:30 p.m. ET
Register Here
AZNET MEMBER PRESENTATIONS AT EVALUATION 2017 IN WASHINGTON, DC
Becoming a Resource in the Community: Connecting with Coalitions, Nonprofits, and Government Agencies to Learn About Data
Presenters: Wendy Wolfersteig, PhD, Grant Yoder, and Kathryn Dorsey Hamm, MPA
Date/Time: Thursday, November 9, 2017 / 4:30 PM - 5:15 PM
Track: Community Empowerment, Organizational Capacity Building
Evaluators are often asked to provide technical assistance related to evaluation and data utilization in order to build capacity in organizations, nonprofits, or government agencies. This technical assistance often earns high marks from participants, but the benefits of these trainings can be short lived. Teaching clients the underlying purposes and skills of secondary data collection can provide long-term benefits to both clients and evaluators. This roundtable will explore how we as evaluators can take the next steps toward becoming a tool and a resource for building lasting and meaningful relationships, and thus make a wider impact in communities. Drawing on a project of community data trainings conducted by the Southwest Interdisciplinary Research Center (SIRC) at ASU, the roundtable will discuss effective methods for involving coalitions, nonprofits, and government organizations in better understanding and using data.
Evaluating Shared Use Projects Using Community Based Participatory Research and Evaluation
Presenters: Eileen Eisen Cohen, Breanne Lott, Kenneth Steele, Rebecca Henry, Ilana Parsai
Date/Time: Thursday, November 9, 2017 / 1:15 PM - 2:00 PM
Track: Collaborative, Participatory & Empowerment Evaluation
Maricopa County experiences a large gap in quality of life and longevity from one zip code to another, the result of inequities related to social factors. One strategy that may help reduce these inequities is shared use. Maricopa County Department of Public Health and Vitalyst Health Foundation funded six organizations to implement a diverse set of shared-use projects, including a community garden, opening school gyms and libraries to the public after school hours, and offering GED and Zumba classes to the community. The purpose was to increase opportunities to learn, play, interact with other community members, and exercise; the long-term goal was to improve community health. In this presentation the evaluators will discuss each of the projects and the mixed methods used to evaluate them. The evaluation, guided by a Community-Based Participatory Research and Evaluation (CBPRE) framework, included community surveys, focus groups, stakeholder interviews, and field observations.
Let Us Show You the Ropes: A Process That Utilizes Evaluators-in-Training as Ideal Mentors
Presenters: Levi John Roth, Phillip Stoeklen, Deven Lee Wisner
Date/Time: Friday, November 10, 2017 / 8:00 AM - 9:30 AM
Track: Evaluation Managers and Supervisors
The Applied Research Center (ARC) at the University of Wisconsin-Stout specializes in program evaluation for educational institutions, nonprofits, and other organizations. Each year, the ARC hires students from the M.S. Applied Psychology program as graduate assistants (GAs). Unlike traditional research assistantships, the ARC provides practical experience interacting with stakeholders, writing technical business reports, managing project tasks and timelines, and conducting research in real-world settings. The ARC accomplishes this with a “Train the Trainer” model: office staff provide high-level training to GAs in evaluation practice, while also providing opportunities to mentor new employees on fundamental evaluation and research skills. This session will include an explanation of the ARC model for incorporating graduate students in evaluation and research projects. In addition, the ARC will describe and share some of the tools used to manage GA tasks and ensure project quality.
Who’s a Utilization-Focused Evaluator?: Exploring Professional Evaluator Perceptions of the Seventeen Steps to UFE
Presenters: Deven Lee Wisner, Phillip Stoeklen, Tiffany Lee Smith, Libby Smith
Date/Time: Wednesday, November 8, 2017 / 7:00 PM - 8:30 PM
Track: Evaluation Practice
Utilization-Focused Evaluation (UFE) is a popular approach used in many classrooms and practitioner settings. However, little research has examined evaluation practitioners' perceptions of Patton's (2011) 17-step process. The proposed presentation will serve as an opportunity to share professional evaluators' views on the use and perceived importance of Patton's 17-Step Checklist for UFE. This survey study is a quantitative expansion of a qualitative case study, presented at AEA 2016, of evaluation practitioners' views on the importance and use of the steps. Results from that prior investigation prompted the researchers to explore the practicality of the checklist and its most-used components, regardless of the evaluation model an evaluator subscribes to. This research is important to the growth and refinement of UFE and also serves as a reflective opportunity for attendees as evaluation practitioners.