
AZENet & Field Events

Your Gateway to Professional Growth and Networking Opportunities in Evaluation

Explore a curated selection of events tailored for evaluators seeking to expand their knowledge, connect with peers, and advance in the field. From AZENet's signature gatherings to broader field events, discover opportunities to engage, learn, and contribute to the vibrant evaluation community in Arizona.


All Upcoming Events

    • March 06, 2026
    • 9:00 AM - 4:30 PM
    • United Way Campus


    Leveling Up Together embraces a rapidly evolving landscape. This theme invites us to reflect on how we can stay forward-looking, community-centered, and grounded in meaningful impact as we meet new challenges. Connect and learn with peers on March 6, 2026, at the United Way Campus in downtown Tucson. Don’t miss this opportunity to be part of the future of evaluation!

    LOCATION

    United Way Campus in Tucson, Arizona

    KEYNOTE SPEAKERS

    Dr. Wendy Wolfersteig is a Research Professor Emerita at Arizona State University who has spent more than 25 years advancing evaluation, prevention, and data-driven strategies to reduce health disparities across Arizona. She partners with community agencies to improve outcomes around substance abuse, social determinants of health, and equity. Her work reflects the mission and values at the heart of our network.

    Claudia Jasso is the Executive Director of Amistades and a longtime social sector leader dedicated to advancing community integrity, trust-based relationships, and authentic representation. She has spent nearly two decades bridging grassroots, philanthropic, and governmental efforts to strengthen Latino and border communities in Arizona and beyond. Learn more about Claudia below.

    PRICING

    Conference registration pricing varies by membership type, with discounted rates for AZENet members. Not a member? Join today to access exclusive resources and save on registration. To receive member pricing, you must first become a member (student or regular) and be logged in to your account when you register; registration options are shown based on your active membership type.

    Take advantage of Early Bird pricing up until December 28th! Standard ticket prices will take effect on December 29th. 

    Ticket Type      Standard Price   Early Bird Price
    Member           $125             $110
    Non-Member       $155             $135
    Student Member   $95              $95




    CONFERENCE SPONSORS

    • United Way of Tucson and Southern Arizona
    • Collaborators Consulting Group

    • March 07, 2026
    • 9:00 AM - 10:00 AM
    • Room 222 ASU on United Way's Campus

    Motivational Interviewing for Leaders: Building Engagement Through Better Conversations

    with Kerri Rittschof, PhD, PMP, Director of Data Science & Analytics at the ASU Library

    Includes a detailed overview of MI’s origins, principles, and relevance to leadership, followed by the core techniques (OARS): guided discussion and examples of open questions, affirmations, reflections, and summaries as leadership tools.

    Workshop Objectives - 1) Introduce participants to the core principles of MI, including empathy, developing discrepancy, managing resistance, and supporting self-efficacy; 2) Demonstrate how these principles can enhance leadership effectiveness in building trust, fostering collaboration, and navigating change; 3) Equip leaders with the practical communication tools of the OARS framework (open-ended questions, affirmations, reflective listening, and summarizing) to use in real-world situations; and 4) Provide interactive opportunities for participants to apply these skills in leadership scenarios.

    Workshop Takeaways - 1) A clear understanding of MI and how its principles translate into effective leadership communication; 2) Practical, actionable skills through the OARS framework to navigate resistance, enhance accountability, and improve collaboration; and 3) Scenario-based activities to practice MI techniques.

    Presenters will conduct interactive activities and apply MI techniques to demonstrate learning and deepen skill integration.

    • March 07, 2026
    • 9:30 AM - 11:30 AM
    • Room 230 ASU on United Way's Campus

    Leveling Up Project Coordination: Revealing the Hidden Infrastructure Behind Project Success

    with Erin Cooper, PhD, and Stacey Kesten, PhD, Collaborators Consulting Group

    Evaluation projects succeed or stumble on their hidden infrastructure: the often-invisible coordination work that connects people, tasks, and information. This interactive workshop introduces a clear framework and practical tools to help attendees level up how they plan, communicate, document, and coordinate project workflows and the humans at the heart of them.

    This two-hour workshop blends brief didactic conceptual grounding with highly interactive, hands-on learning. After a short overview of project coordination as hidden infrastructure and description of five coordination practices, participants will engage in structured reflections, small-group discussions, and guided exercises that help them surface and analyze the coordination systems they already use.

    Throughout the session, the facilitators will demonstrate key coordination practices across planning, task management, meetings, communication, and documentation, with participants applying each concept to their own project contexts through short partner or small-group activities. A scenario-based group exercise allows participants to diagnose realistic coordination breakdowns and identify strategies or tools to resolve them. Facilitation techniques, such as a clear agenda, visible documentation, and clear communication, are intentionally used to model the practices being taught. The workshop concludes with an individual coordination planning exercise to support immediate application.

    • March 07, 2026
    • 9:30 AM - 11:30 AM
    • Room 226 ASU on United Way's Campus

    DIG in, Evaluators! Two Arizona-Born, Community-Engaged Tools for Developing Integrated Gardens

    with Theresa LeGros, MA, and Anvi Bhakta, MPH, University of Arizona

    Here, we share our participatory approach to creating two Developing Integrated Gardens (DIG) evaluation tools for school and community gardens, and how to use each in a "cycle of change." We expect attendees to come away with new ideas for engaging communities in creating or revising evaluation tools and a new understanding of two Arizona-born assessment tools for creating and sustaining gardens.

    Workshop objectives are:

    1.        Demonstrate participatory tool development methods. Participants will learn about and discuss our iterative, community-engaged process for creating evaluation tools, including expert panel formation, questionnaire design, pilot testing, and tool refinement across diverse contexts.

    2.        Practice applying the four-stage Cycle of Change. Through hands-on engagement with the tools and exploration of Arizona case examples, participants will experience each stage (till, sow, grow, reap) and explore how to embed evaluation into ongoing implementation cycles in their own work.

    3.        Develop strategies for interest holder engagement. Through facilitated discussion and small group work, participants will identify approaches for engaging communities in (1) designing evaluations, (2) implementing evaluations, and (3) selecting and using evaluation tools that themselves promote engagement.

    Participants will leave the workshop with:

    •        Access to the DIG in Schools and DIG in Community tools (available in English, Spanish, and Diné) and resources.

    •        Concrete examples and case studies demonstrating how Arizona schools and community gardens have used the tools for action planning, sustained program integration, and celebrating change.

    •        A framework for participatory tool development that integrates equity, community engagement, and trauma awareness into the evaluation process—applicable to any community-based program evaluation.

    •        Practical strategies for embedding evaluation into implementation cycles that transform evaluation from a one-time assessment to an ongoing process of community-engaged change.

    •        Small-group insights and peer learning from discussions about applying participatory evaluation principles in participants' own contexts, including considerations for diverse geographies, populations, and program types.

    • March 07, 2026
    • 10:30 AM - 11:30 AM
    • Room 222 ASU on United Way's Campus

    Operationalizing MAPP 2.0 Through Community-Driven Focus Groups

    with Dr. Maria Aguilar-Amaya and Chantel Welker, MS, from the ASU Southwest Interdisciplinary Research Center

    Includes a detailed overview of community-driven focus groups used to enhance a community health assessment. Handouts will be included.

    Participants will learn the methodology used to recruit specialized populations. They will also learn about the qualitative rigor applied in community health needs assessments to create a more scientific and reflexive practice.

    • March 07, 2026
    • 12:30 PM - 2:30 PM
    • Room 226 ASU on United Way's Campus

    A Hands-On Introduction to Ripple Effects Mapping

    with Janel Hansel, MPA, Grant Street Consulting

    Ripple Effects Mapping (REM) is a participatory evaluation method that elevates stories and relationships and reveals unanticipated impact. This interactive 2-hour workshop includes a live mini-REM session, allowing participants to experience the method directly and leave with tools to apply it in their own contexts.

    This 2-hour workshop provides an experiential introduction to REM, originally developed by Extension practitioners at Washington State University and the University of Idaho. REM blends appreciative inquiry interviews with visual mapping to reveal intended and unintended impacts, strengthen relationships, and support strategic learning. This workshop is designed with evaluation professionals in mind—including those grounded in quantitative methods who are looking for ways to capture complexity, meaning, and unexpected outcomes. Ripple Effects Mapping (REM) complements traditional measurement by providing contextual, relational, and narrative data that deepen the interpretation of numerical findings. Participants will learn how REM can reveal emergent themes, help refine logic models, generate new indicators, and surface outcomes that surveys or performance metrics may overlook.

    The format allows participants to experience a live mini-REM session, giving them practical, grounded insight into how REM actually unfolds with real groups. Through this hands-on experience, participants will see how stories become data, how connections are visualized, and how collective sense-making reveals insights that can inform evaluation, strategy, and organizational learning.

    Examples will be shared, along with practical tools for adapting REM to different settings. We will review sample agendas, appreciative inquiry prompts, mapping templates, thematic analysis structures, and guidance for integrating REM findings with existing qualitative and quantitative data.

    Workshop Objectives

    1.        Understand the purpose of Ripple Effects Mapping and the kinds of situations and questions it works best for—such as capturing stories of change, surfacing unanticipated impact, strengthening relationships, and supporting strategic learning.

    2.        Learn the core components of REM: prompt development, appreciative inquiry story harvesting, mapping ripples, theme development, and meaning-making.

    3.        Participate in a facilitated mini-REM experience to see how stories are used and how insights emerge visually and collectively.

    4.        Examine examples of REM maps, reports, and implementation models.

    5.        Identify considerations for preparing and facilitating REM, including group composition, power dynamics, hybrid/virtual adaptations, and time requirements.

    Participant Take-aways

    • Firsthand experience of REM in action—engaging with story prompts, mapping connections, and identifying emerging themes.

    • An overview of the REM workflow from planning to reporting.

    • Editable tools and templates: sample agendas, appreciative inquiry prompts, mapping structures, participant instructions, and reporting formats.

    • Ideas for how REM can support evaluation, strategic planning, community engagement, coalition strengthening, internal learning, and cross-sector storytelling.

    • Strategies for adapting REM to various contexts: small nonprofits, coalitions, foundations, education, government, and cross-system collaboration.

    The workshop balances conceptual grounding with hands-on practice, ensuring participants leave not only understanding REM but also feeling confident about when and how to use it.

    • March 07, 2026
    • 1:00 PM - 2:00 PM
    • Room 222 ASU on United Way's Campus

    The Whole Evaluator: Cultivating Balance and Sustainability in Practice

    with Eileen Eisen-Cohen, PhD, MSW, ASU School of Public Affairs

    Evaluation can be deeply rewarding—and deeply draining. The Whole Evaluator is a one-hour applied learning session designed to help professionals reconnect with balance, purpose, and wellbeing in their work. Through reflection, short activities, and peer exchange, evaluators will leave with a personalized “balance toolkit” to support resilience, creativity, and long-term engagement in their practice. Using science-based strategies and engaging activities, participants explore practical ways to maintain wellbeing, sharpen focus, and bring greater presence and engagement to their daily work.

    By the end of this session, participants will be able to:

    1.        Identify factors that contribute to imbalance and stress in evaluation work and their impact on professional effectiveness.

    2.        Apply science-based strategies to enhance focus, presence, and personal wellbeing within their daily evaluation practice.

    3.        Reflect on individual habits, patterns, and work environments that influence energy, engagement, and sustainability.

    4.        Build a toolbox of strategies for maintaining long-term balance and professional resilience.

    • March 07, 2026
    • 1:00 PM - 2:00 PM
    • Room 230 ASU on United Way's Campus

    Critical Issues in Program Evaluation: Learning Through the Practice of Complex Interviewing

    with Desire Toussoukpe, PhD Student, and Priscila Ledezma, PhD, University of Arizona

    This session explores evaluation challenges that arise when participants share critical information “off record” due to fear of being identified and complex team dynamics. Through scenarios, frameworks, and discussion, participants will examine how evaluators can balance ethical obligations, participant trust, and the need to provide meaningful formative feedback for program improvement.

    Evaluators frequently rely on interviews to understand program implementation, participant experience, and areas for improvement; however, evaluation becomes challenging when participants share critical information “off record” due to fear of retaliation, mistrust, or complex team dynamics. This session uses a real-world evaluation dilemma to explore how fear, power, and positionality influence what participants feel safe disclosing and how these dynamics can compromise data quality, ethical decision-making, and the utility of findings. Grounded in a power analysis framework (Friesen & Cimetta, 2025) and culturally responsive evaluation approaches (Hood et al., 2015), the session examines how evaluators can balance ethical obligations, participant trust, and the need to provide actionable formative feedback for program improvement.

    Participants will engage in interactive activities to analyze sensitive disclosures, assess risks to anonymity in small or politically complex settings, and practice strategies for responsibly navigating off-record requests. Through scenario-based discussion and collaborative problem-solving, attendees will explore tools for protecting participant identities, mitigating power imbalances, and communicating sensitive findings in ways that minimize harm while maintaining evaluation integrity. By the end of the session, participants will be able to identify common ethical dilemmas associated with off-record information, explain how power dynamics shape participant openness and data quality, and select reporting approaches that balance transparency, accuracy, and confidentiality. The session provides evaluators with practical, culturally responsive strategies for managing conflict, honoring participant voice, and ensuring that evaluation findings support meaningful and safe program improvement.

    • March 07, 2026
    • 2:30 PM - 3:30 PM
    • Room 222 ASU on United Way's Campus

    Eval-utionary Tech: Free Digital Tools That Rock Our World

    with Madeleine deBlois, M.S.Ed., Sc.D., and Terrace Ewinghill, MEd, University of Arizona Community Research, Evaluation, and Development Team

    This workshop showcases some of our favorite free digital tools to help evaluators focus on the information, not the futzing. We'll give an overview of the platforms/tools, and give participants a choose-your-own-adventure chance to try out the tools that interest them.

    This workshop is a practical skill-share of the online tools that we personally use and love. We know there are many options out there (and we hope to hear about some new-to-us ones from the audience), but we want to highlight the specific ones that have truly helped us at different stages of the evaluation process. Participants will need laptops and an internet connection to participate. Objective: to share what works. We will highlight specific websites and apps that our team loves, from finding secondary data for Arizona communities to creating better charts and visuals.

    Participant Take-aways:

    1.        An Expanded Digital Toolkit: Participants will leave with knowledge of specific digital resources and data sources that are applicable to their evaluation work in Arizona and beyond. Tools we plan to cover include: MAG Interactive Maps (https://maps.azmag.gov/) for accessing and wrangling data that can inform needs assessments, Lucid Spark/Zoom Whiteboard for online community engagement, DataWrapper for beating Excel at its own game, and Canva for data visualization/infographics.

    2.        Practical Experience: By the end of the session, participants will have hands-on familiarity with at least one new tool, lowering the barrier to adopting new technology in their daily practice.

    • March 07, 2026
    • 2:30 PM - 3:30 PM
    • Room 226 ASU on United Way's Campus

    Level-Up with Monthly Check-Ins: A Lightweight Monitoring Loop for Continuous Program Improvement

    with Qian Wang, PhD Student, and Adriana Cimetta, PhD, MPH, from Educational Psychology, College of Education, The University of Arizona

    This session introduces a practical, low-burden approach for bridging monitoring and evaluation through streamlined Monthly Check-Ins. Drawing from Project CAN, a statewide initiative addressing digital inequities in rural and low-income Arizona communities, the session illustrates how a lightweight monitoring loop can replace complex project-management systems while still producing timely, actionable data for continuous improvement.

    Project CAN partners work across diverse contexts to expand broadband access, strengthen remote learning supports, and deliver digital literacy and workforce development opportunities tailored to local needs. Early attempts to use a commercial project-management platform proved too time-consuming for partners, resulting in low adoption and inconsistent data. In response, the evaluation team developed a more accessible Monthly Check-In process that aligns with partners’ workflows and capacity while maintaining strong accountability and engagement.

    The Monthly Check-In model uses short Qualtrics prompts or brief semi-structured conversations to capture five key areas: activities completed, plans for the next period, challenges, needed supports, and evaluation needs. The evaluation team synthesizes this information into ongoing feedback for partners and integrates findings into quarterly and annual reports, creating a real-time improvement loop that supports swift adjustments, strengthens relationships, and enhances data quality across a large, multi-site initiative.

    This session will share insights from implementing this model, including how a low-burden system can sustain partner engagement, improve monitoring efficiency, and balance autonomy with accountability. Participants will leave with adaptable templates, design considerations, and strategies for implementing similar light-touch monitoring systems in their own evaluation contexts, particularly when working with distributed or capacity-strained partners.

    • March 07, 2026
    • 2:30 PM - 3:30 PM
    • Room 230 ASU on United Way's Campus

    Mismeasurement in Program Evaluation

    with Craig W. LeCroy, PhD, from LeCroy & Milligan Associates

    Progress in evidence-based practice has been substantially constrained by systematic mismeasurement. Although reliability and validity are widely treated as sufficient criteria for selecting instruments, many commonly used measures are poorly suited to detecting intervention effects. Drawing on examples from home visitation research, the workshop demonstrates how instruments originally designed to measure constructs or predict risk—such as parenting stress or child abuse potential—are routinely misapplied as outcome measures, resulting in null or mixed findings that may reflect measurement failure rather than program ineffectiveness. The provocative presentation critiques conventional research practices that privilege habituation and psychometric tradition over thoughtful alignment between intervention goals and measurement strategies. Central to the argument is the concept of sensitivity to change, or “validity for change,” which is often neglected despite being essential for outcome evaluation.


Arizona Evaluation Network is a non-profit 501(c)(6) organization dedicated to supporting and strengthening a vibrant community of aspiring and established Arizona-based program evaluators by promoting meaningful connections, elevating the evaluation field across Arizona, and facilitating continuous learning.

© 2024. All Rights Reserved by Arizona Evaluation Network.