Now and Going Forward – Why Evidence Still Matters
The sector is currently grappling with the impact of COVID-19 on service delivery. It is understandable that evaluation – designing and measuring for outcomes – may have moved down the list of priorities.
While we are at different stages of this journey, we wanted to explore how we can harness evidence and data through this crisis and why evidence matters even with the urgent priorities that we are all facing.
This blog highlights and responds to some of the takeaway messages we identified in an article written by Michael Quinn Patton, an international evaluation guru, on the implications of the current COVID-19 crisis for undertaking evaluation, measuring for outcomes, and collecting data. While Patton writes primarily for professional external evaluators, OPEN (Outcomes, Practice & Evidence Network) felt there were important points here for practitioners in the child and family service sector, during and after COVID-19.
So here goes – our reflections on the key takeaways from Michael’s post:
Do not forget data collection
As services adapt to this fast-paced environment, it is even more important to collect data about your program or services: what has changed, and the potential impact of those changes for clients as well as staff. You can start documenting the impact and needs caused by the crisis right now – emerging trends, how needs have changed, and potential options going forward are all vital. Also document any changes made to the implementation of your services or programs and their implications.
Our sector may take lessons from this situation that benefit our future delivery models.
Consider the ‘good enough’ standard of rigour
Patton encourages us to detach from narrow notions of rigour and embrace 'good enough' evidence as more appropriate to the uncertainty and urgency created by crisis conditions. During a crisis, quick, small-scale data collection that informs decisions as they are being made is more useful than waiting for an end-of-program, end-of-service, or post-crisis assessment of outcomes. End-of-program evaluations are likely to yield data that is 'too late and too little'. For instance, a small purposeful sample of staff and clients from diverse programs will deliver results faster and can be a 'good enough' evaluation.
For our sector, this is an important point to take forward. Getting started on designing and measuring for outcomes – thinking through how you can do this within your current context – is a better approach than waiting for the right circumstances, method, or expert to help.
Embrace change and think in systems
In a crisis, everything changes – program goals and outcomes, service delivery models, data collection plans, and timelines. Therefore, adapt rapidly and acknowledge the connections between personal, public, community, national, and global domains. It is an opportunity to move from micro to macro and back again – to think globally and act locally. Learn from intersecting trends at national and international levels and collaborate across sectors.
For our sector, revelations from this crisis have the potential to drive whole-system transformations towards a more sustainable and equitable future. This requires reflecting on how current systems are not working, and for whom, and identifying pathways to respond to both the short-term and long-term effects of this crisis. This is a great opportunity for us all.
Demonstrate the value and importance of evidence
Patton suggests that evaluators, and all who believe in building an evidence base about the outcomes of our work, need to lay the groundwork now so that organizations can commit to, and push for, continuous learning through monitoring and evaluation frameworks.
This matters for our sector: with funding likely to be restricted, it is even more important for organizations to understand the effects of this crisis on their clients and how they can support their needs in the most efficient and effective way possible going forward. These are the questions we can focus on as we use and create evidence.
Be a fact checker and evaluative thinker
Look for trustworthy, valid, and useful information before making decisions. Evaluate information from multiple data sources to check its validity. If evidence is missing locally, look at global trends or gather lessons from past emergencies. Cultivate a questioning mindset – what is working, and what is not (evaluative thinking) – to avoid making premature judgements from inadequate evidence.
And finally – to wrap it up…
Support others in our sector's evidence-focused community
Michael Patton encourages evaluators to buddy up, look past differences, and support each other in a shared commitment to evidence-informed decision-making and evidence- and outcomes-focused thinking to make a better world. Let's stay connected. Let's support each other. What we do matters. Stay healthy. Stay strong. Stay sane.
Wise words – and equally true in our sector, both during this crisis and moving forward. Stay connected to others interested in designing and measuring for outcomes and building the evidence base about the difference we make. Some of us may be in research or evaluation roles; others may just be starting to think about measuring the impact of their program or services – let's help each other. Share resources and what has worked for you. There may be differences in approach, methods, and experience, but we are all on a common journey to strengthen evidence-informed practice and decision-making to get the best outcomes for children and families.
This blog was originally posted on the OPEN Blog on April 30, 2020. If YouthREX can support your need for evidence and/or evaluation supports, please contact us.