Celebrating 10 Years of Building Evidence
As we continue to celebrate ten years of the Educate! Experience, we want to highlight the dedicated staff who have allowed Educate! to reach this milestone and those who are working to prepare Educate! for its next decade of impact.
Today, we are excited to spotlight Educate!’s Director of Monitoring and Evaluation, Meghan Mahoney. After four years at The Abdul Latif Jameel Poverty Action Lab (J-PAL) – a global research center working to reduce poverty by ensuring that policy is informed by scientific evidence – Meghan joined Educate! in 2017 as Director of Monitoring and Evaluation, overseeing our ever-evolving strategic learning agenda.
As part of a series of interviews with alumni, Meghan recently caught up with J-PAL to share more about the career she’s built in impact evaluation. Here are a few of our favorite highlights from the interview on J-PAL's blog.
Could you tell us a little bit about your background, and how that led to your decision to pursue a career in impact evaluation?
My first job after college was with an economic consulting firm in D.C. working on international trade policy research. During my time there, I grew to understand that while inclusive trade policy is an important tool for poverty alleviation, it was sometimes difficult to trace policy-level work to short-term improvements in the lives of individuals. But when I started looking into what works, I was surprised that there was still a dearth of rigorous evidence about which social policies achieved their goal of reducing poverty. This motivated me to pursue graduate studies in development economics and program evaluation, so that I could develop the research and analysis skills to design and test effective social programs, and the communication skills to share the findings with those who most needed the evidence.
Could you tell us a little about your role now?
As the Director of Monitoring and Evaluation, I oversee the development and execution of Educate!’s monitoring and evaluation strategy. Based on the impact that we would like to have, I work to articulate the organization’s theory of change, and create a plan to test and validate that theory of change through both continuous performance monitoring and rigorous research. This means that I collaborate with colleagues across various design and program implementation functions to determine what data they need to do their jobs better, and figure out how to get it to them.
In retrospect, are there any experiences or lessons learned from your time at J-PAL that you think are applicable in your role today?
My experience and lessons learned from piecing together evidence from different sectors or different contexts has been really helpful in my role today. There is a lot of evidence out there, thanks to the great work that has been done by J-PAL affiliates, IPA, and large international organizations such as the World Bank. But you can’t always draw direct parallels between the available evidence and the specific context that we work in. This means that I often have to look at similar programs from other contexts or sectors and think about whether or not we can use that evidence in Educate!’s work.
Educate!’s Monitoring and Evaluation Team (M&E) has played a crucial role in the growth and development of our programs over the last ten years. After learning more about Meghan’s pursuit of evidence critical to successful development, we wanted to sit down with her and learn more about her work at Educate!, as well as the M&E Team’s focus over the next ten years.
From your perspective, what would you say are Educate!'s M&E Team's greatest accomplishments since the Educate! Experience launched a decade ago?
Great question. I could go on for days about what I think our M&E team does well, but I will focus on three key achievements:
Monitoring and evaluating at scale: I think it’s much easier to build a system that serves you when you’re only in 50 schools. It’s harder to update that system to be lean and deliver the same quality of information when you’re in 500 or 1000 schools. It’s been a learning process, but our cultural commitment to Always Learning really serves us here!
Generating rigorous evidence: There is still a dearth of evidence regarding which programs, when implemented at the secondary school level, successfully improve youth outcomes, and not a lot of research in general causally tying secondary school to medium- or long-term skill and labor market impacts. While it’s certainly not easy, Educate! has made an important investment in and commitment to testing our solutions rigorously, so that we and others can learn what works. We’re conducting two randomized evaluations, have conducted a quasi-experimental evaluation, and also do a lot of qualitative work to make sure we understand the links in our theory of change. There’s more to do, but we’ve come really far.
Critically thinking about how to learn, and learn well: This is not just a success of the M&E team, but speaks to E!’s broader approach to learning. We’ve worked hard as an organization to figure out what data we need, when we need it, and how to design research to deliver that data. I feel so privileged to do this type of work at an organization that values it so much.
Looking ahead to Educate!’s next decade, what will be the biggest focus of the Monitoring and Evaluation team?
Educate! is thinking a lot about systems integration as a pathway to sustainable, long-term impact. In order to design a program that governments and their partners can implement, we need to design a monitoring system that can be integrated into their systems and processes. For us to be successful, we need to think about our monitoring and evaluation systems not only as a way for us to get the right information, but also as a way for the actors and stakeholders we partner with to get that information continuously.
Describe what Educate! means to you, on a personal level, in 10 words or less.
Helping education fulfill its promise to youth.