2017 Spring Exchange

OPEG wishes to thank all those who participated and attended the event at Otterbein University in May.  Our theme of Expanding Your Evaluation Tool-Kit will continue through our Summer Stats and Fall Workshop events later this year.  Please check our home page for more information and opportunities to register.




Keynote Speaker:

Lori Wingate, PhD

Director of Research, The Evaluation Center at Western Michigan University


Margaret Hutzel, MPA, and Daniel Kloepfer, MPP

Ohio University

A Quick and Dirty Review of Common Types of Evaluations

Much has been written about the ever-growing, ever-evolving field of program evaluation, yet practitioners are often so immersed in evaluation tasks that there is little time to take stock. Program evaluations can take many different forms, and multiple evaluation models exist that evaluators can apply to each unique program need. But what types of program evaluations exist, and how does a novice evaluator choose the model best suited to a given situation? When is it appropriate to use a particular evaluation type? Why might an evaluator consider different approaches for different programs? This session addresses these questions and more through an introduction to, and discussion of, several different types of evaluations.

Matthew Barlet, BA

Community AIDS Network/Akron Pride Initiative
Grassroots: A Downstream Approach to Improving Program Quality Learning outcomes: Participants will learn that being downstream from decisions about grant reporting requirements does not make an organization helpless to change data collection, tracking and reporting outcomes. Data and data systems can be used to save countless hours and headaches, even if those data systems do not exist yet.


Erin Gerbec, PhD, Sheri Chaney Jones, MA, and Jay Seetharaman, BS, BA

Measurement Resources Company


Using Publicly Available Data to Measure Impact

This presentation demonstrates an alternative methodology for measuring research or program outcomes. Audience members will learn about the researchers' experience struggling to adapt traditional methods to a novel situation, which led to the creation of an innovative new tool aligned with meaningful outcomes for those who receive the service. The presentation focuses on evaluators taking an outcomes-focused approach that prioritizes impactful measurement when choosing between an existing methodology and creating a new tool.

Maria Green Cohen, MA, and Monica Hunter, PhD

PAST Foundation

Employing a Mixed-Methods Approach in the Evaluation Process

This presentation will explore internal evaluation as a collaborative process involving diverse stakeholders, ranging from educators in schools or district-level programs to organizations engaged in regional and national efforts to plan and implement STEM policy initiatives. An essential understanding of the cultural variables involved in STEM education transformation is best attained through a mixed-methods evaluation approach that combines qualitative and quantitative data collection and analysis in real time.

Jay Seetharaman, BS, BA, Sheri Chaney Jones, MA, and Elizabeth Pafford, MPA

Measurement Resources Company

Social Return on Investment (SROI): Methods and Implications

Shana Alford, MPP

Feeding America

Food Insecurity and Health: Two Questions that Changed the Landscape for Services and Evaluation

Food insecurity is a household-level economic and social condition of limited or uncertain access to adequate food (USDA), and it is interrelated with other needs and challenges that people may face. As evaluators, we seek standardized, validated tools that can be replicated to increase the accuracy and consistency of the data we collect. The two-question food insecurity screener, extracted from a longer module, is a great example to introduce to others. This session demonstrates how it is being used in non-traditional settings to connect people with the services and programs they need, while also contributing to a larger body of research on the effect of hunger on health.

Lana Rucks, PhD

The Rucks Group

Panel Discussion: Don’t Panic, Stay Calm & Carry On…


2016 Fall Workshop

OPEG wishes to thank all those who participated and attended the event at the ESC Center in Cleveland.  


Workshop Sessions Materials:

Participatory Analysis

Data Driven Cultures

Designing High Quality Statistical Graphs

©  2011 - 2014 Ohio Program Evaluators' Group

