Reflections on My Journey as a Program Evaluator
Do you think of yourself as an evaluator? I never did until recently …
Although evaluation has always been part of my job as a Program Manager, the idea of evaluation itself was never all that appealing to me. Chasing surveys from participants, analyzing data in Excel … I guess it all seemed a bit boring and un-sexy; the behind-the-scenes stuff that was necessary and important, but not particularly exciting.
I am the National Program Manager for The Youth and Philanthropy Initiative (YPI) Canada. YPI is a public charitable foundation that provides grants to social service organizations across the country through an innovative curricular program that has engaged over 450,000 Grade 9 and 10 students since 2002 in real-world experiences with philanthropy and advocacy. Students participating in our program study social issues in their community; get out into the real world to interview charity staff, volunteers and beneficiaries; and give pitch presentations to their peers, educating one another and advocating for grants for their chosen charities. Ultimately, a $5,000 grant from YPI is given to one charity for each school, as chosen by students.
We have a three-person staff team that conducts annual pre- and post-program questionnaires with students, feedback questionnaires with teachers and charity representatives, and occasional focus groups and interviews with various stakeholders. This evaluation program has been in place for the past four years, and I’ve managed it as part of my job since I started in 2015.
Even with all these evaluation responsibilities in my job description, I didn’t think of myself as a professional evaluator, nor did I think about the rest of my responsibilities as a Program Manager in the context of evaluation. To me, program management was about making frontline work happen, and that just wasn’t how I thought about evaluation.
Then I completed YouthREX’s 10-week online certificate, Understanding Program Evaluation for Youth Wellbeing, and it re-framed the way I think about evaluation. Evaluation isn’t just a part of my job; it is my job.
Here are three major take-aways from my experience in this certificate:
- Formalize your program theory: In my career as a Program Manager in the non-profit sector, I have created logic models for grant applications before, and have used them to report to funders. I have always found them useful … but this time, the opportunity to re-examine our logic model for YPI, without the expectations, guidelines or templates required by a funder, was really helpful for thinking about our program on a macro level. I dug deep, using the framework of short-term, medium-term, and long-term impact to examine what changes we really want to see as a result of our program. Asking questions about the things we are trying to shift made designing evaluation questions, and connecting those questions to evaluation tools (e.g. surveys, interview guides), flow naturally. With this groundwork, I was able to think critically about what we already measure and what gaps we should look to fill.
- “Monitoring” is evaluation: I always grouped monitoring and evaluation together, but thought of them as separate things. Through the course, I’ve learned to think of monitoring as evaluation. We learned about two types of program evaluation: process evaluation and outcome evaluation. Process evaluation looks at inputs (e.g. staff time, funding, curricular resources), activities (e.g. students researching and visiting charities for their YPI project), and outputs (e.g. number of participants, number of people learning about social issues, granting dollars distributed to communities). This is what I had thought of as monitoring, since no formal evaluation methods were involved; we collect most of this information through forms, phone calls, meetings, etc. I find it helpful to think of measuring these things as evaluation because this kind of measurement provides the scaffolding necessary to conduct an outcome evaluation. To make sense of outcomes (the changes you want to happen as a result of your program, your sphere of influence), you first need to fully understand your outputs: what you can reasonably control when running your program.
- Evaluation is a team effort: We are a small, but very mighty, team at YPI. We have three core team members, each charged with furthering our mission in a different way. The certificate explored the benefits of having more than one team member analyze and interpret the data. Triangulating by double- and even triple-checking the data to see it from different perspectives is something that we will certainly take forward at YPI.
Finalizing our evaluation plan was also a team effort: we discussed in depth what we actually want to learn through our evaluation, and how we might best get there. It’s finished for now, but it remains a working document that we will revisit on an ongoing basis. I took the YouthREX course, but my entire team benefited from the knowledge and tools that I gained.
Bonus: Learning is best when shared!
On behalf of the YPI team, I am pleased to share our full Evaluation Plan here. It is an example of what a small organization can do with limited resources, and I hope that it will initiate conversations in a community of practice for youth program evaluators and staff. The plan contains our evaluation questions, evaluation methodology, a key stakeholders matrix, logic model, a report card for our key performance indicators, plans for analysis and interpretation, and plans for use and sharing.
Now that I am more confident calling myself an evaluator and I see these processes as part of program management, we’re looking to expand our evaluation program at YPI. Evaluation can be fun and we’d like to innovate by using our evaluation program as a site of connection with our students. We are currently developing an additional participatory evaluation piece for students in the final stages of our program to share reflections on their experience with YPI. Stay tuned for what we learn on our evaluation journey…