The Niagara Peninsula Conservation Authority’s (NPCA) nature school is an alternative school for primary students in the Niagara Region, focused on building students’ self-confidence and connection with nature through daily outdoor experiences. During my placement with the NPCA, I was responsible for designing the purpose, scope, and evaluation tools for a program evaluation of the nature school.
What is evaluation and why is it important?
Program evaluation seeks to answer questions about a program’s operations and results through the application of various data collection methods. Evaluation plays an integral role in enhancing the overall quality of a program, as it helps identify the program’s circumstances, challenges, and setbacks.
My group members and I worked alongside the NPCA to evaluate the overall satisfaction of students participating in the nature school, their development of environmental stewardship, and day-to-day program delivery. Evaluating these criteria gave the NPCA insight into its program and a clearer picture of the program’s impact.
How do you get started?
Once you know what you are evaluating, it is important to think about how you will evaluate it. You must select appropriate assessment tools that will help you collect data relevant to your evaluation questions and the information you hope to learn.
Hoping to acquire accurate and reliable data, my group members and I set out to conduct weekly in-person observations of the nature school and to gather feedback from stakeholders, including students, teachers, supervisors, funders, and board members, using a series of surveys, questionnaires, and interviews. It was important that we learn what the program experience was like for all stakeholders.
Springing into Action
Amidst the pandemic and the uncertainty of 2020, my group's evaluation plan changed several times. We had intended to conduct in-person observations, gather surveys and questionnaires, and interview various stakeholders. However, as the school's delivery method and regulations changed with COVID-19, so did our evaluation plan and stakeholders’ capacity to participate in the project. Instead of postponing the evaluation, we chose to create a virtual logbook. The logbook allowed us to gather data on student satisfaction, the delivery of lessons and teachings, and evidence of environmental stewardship from the instructors’ perspective. Built with Google Forms, a free online survey tool, the logbook was a safe and effective way for us to collect information about the school’s daily functioning during the pandemic.
With our revised approach to data collection and assessment, we were able to gather valuable data for our evaluation. We re-examined our initial evaluation questions and goals, simplifying them to the most important ones, which allowed for the flexibility the nature program required. We also remained patient with program leaders, asking them to complete the logbook two to three times each week rather than daily, as we had originally hoped. In the end, we were grateful to be able to offer the NPCA tangible program recommendations based on the information gathered in the logbooks.
Overall, throughout the process of developing assessment tools for our program evaluation, we learned that it is important to re-evaluate, to re-assess, and ultimately to be flexible, as things do not always pan out as planned!
This blog is part of a series in collaboration with Brock University. Written by a student in the ‘Program Evaluation in Professional Practice’ course led by Dr. Corliss Bean and Dr. Meghan Harlow, this blog details a student’s first-hand experience conducting a program evaluation during a placement with the Niagara Peninsula Conservation Authority’s Ball’s Falls Nature School.
Take Your Learning Further:
- Newcomer, K. E., Hatry, H. P., & Wholey, J. S. (Eds.). Handbook of practical program evaluation.
- Two-year process evaluation of a pilot program to increase elementary children’s physical activity during school. https://doi.org/10.1016/j.evalprogplan.2018.01.009