Beginning Farmer Program Evaluation Resource Library

The Beginning Farmer Program Evaluation Resource Library is a compilation of materials to assist beginning farmer and rancher training programs in conducting evaluation.

This Resource Library was created as part of the Gaining Results through Evaluation Work (GREW) project, funded through a US Department of Agriculture Beginning Farmer and Rancher Development Program (BFRDP) grant. This project supports the development of strong, effective and long-lasting farmer and rancher training programs so that beginning farmers enter the field of farming and establish successful farm businesses.

This library contains hundreds of resources collected by the GREW team, all focused on running effective and thorough program evaluations. Some resources focus explicitly on farming projects, while others provide more general program evaluation instruction. You can use the topic-of-interest buttons below to browse for materials, or you can type a search directly into the “I’m looking for…” bar.

Please visit again – more resources will be added regularly.

If you have a resource you would like to see, have a resource you’d like to share, or have any feedback about the Resource Library, please contact nesfp@tufts.edu.

Source: Pennsylvania State University - Richard Stup

"Program evaluation is a powerful tool for demonstrating the value of Extension education to stakeholders. When presenting the results of evaluation, it is important to know exactly who the stakeholder is. As programs increasingly depend on client registration fees, it becomes essential to demonstrate to clients that they will receive return on their investment. This article points out an opportunity for Extension to improve programs and marketing by focusing evaluation to meet the decision needs of business organizations. Core evaluation articles and reports of successful Extension examples are reviewed."

Source: National Farm to School Network

"This resource is intended to help advance National Farm to School Network’s racial and social equity priority by increasing our understanding of the work in the context of structural, institutional, and interpersonal racism." It has many useful items and reflection questions that can be used for beginning farmer and rancher programs as well.

Source: Washington State University Extension - Debra Hansen Kollock

"Learn more about a promising follow-up, participatory group process designed to document the results of Extension educational efforts within complex, real-life settings. The method, known as Ripple Effect Mapping, uses elements of Appreciative Inquiry, mind mapping, and qualitative data analysis to engage program participants and other community stakeholders to reflect upon and visually map the intended and unintended changes produced by Extension programming. The result is not only a powerful technique to document impacts, but a way to engage and re-energize program participants."

Source: Ohio State University Extension - Thomas M. Archer, Karen Bruns; Stanford University Prevention Research Center - Catherine A. Heaney

"Whether evaluating impact of community-based programs is new to you, or you are an experienced evaluator, SAMMIE can help you expand your skills. SAMMIE represents Successful Assessment Methods and Measurement In Evaluation. It is a one-stop Web portal to valuable impact documentation resources. Through SAMMIE you can: [1] Access resources on 21 evaluation related topics; [2] Read the best literature on the Web related to program evaluation; [3] Ask an Expert your questions about program evaluation; and [4] Develop a personalized program evaluation plan. SAMMIE is available free of charge to anyone who has Web access."

Source: University of Kentucky

Links to evaluation resources for extension programs - University of Kentucky - Agriculture, Food and Environment

Source: PASA

After a workshop or other training event, the lead staff who participated in the event would use this template to complete a post-session survey. Event reflections can be aggregated and reviewed annually by staff to identify trends, programmatic improvements, and topics for future programming.

Source: Western Michigan University

This website offers numerous checklists to use when conducting evaluations to ensure needed steps are addressed. Examples include budget development, contract development, considerations for developing logic models, how to implement different types of evaluation approaches, evaluation design, interpreting evidence, reporting and using results, etc.
