International development organizations like IYF, if they are to keep pace with the changing times and contexts in which they operate, must learn to grow, adapt, and evolve as effectively and efficiently as possible. That involves recognizing when a program, product, or approach is (or isn't) meeting expectations and figuring out how to fix it.
That's where an organization's Measurement, Evaluation, Research, and Learning (MERL) team comes into play. Their work, which often takes place behind the scenes, is critical not only because funders want assurance that their investments are yielding maximum returns, but because the young people and communities we serve deserve the very best outcomes. Recently, I caught up with three members of IYF's MERL team to ask a few questions about their field—what their work entails, why it's important, and how it can be advanced.
Briefly, what does MERL work involve as it’s practiced in the international development space?
Linda Fogarty, Director of Measurement, Evaluation, Research, and Learning (MERL)
Under the MERL umbrella is a range of activities, some targeting projects and others addressing larger questions important to the global youth development community. On the project side, we help projects understand if they’re doing the right things, at the right times, and in the right way to meet the needs of those we’re trying to reach. That means asking questions like: Are those most vulnerable able to access services? Are girls and women getting as much out of the projects as boys and men? Are people with disabilities getting the services they need? If not, how can we change our services to provide access?
Elizabeth Kim, Manager, Measurement, Evaluation, Research, and Learning (MERL)
To pick up on the idea of adaptive management, MERL work really does help guide program decision making. It helps to identify the successes and challenges of the program and to determine if we should keep doing what we’re doing or pivot to improve. In organizations like IYF, there are multiple projects with multiple goals that relate to overarching, institution-wide goals. On a larger scale, MERL allows us to measure progress on the whole, and to see whether we are moving the needle on youth development.
"MERL is important in the youth development space because it helps us be accountable, transparent with our resources, and ensures we are doing no harm and promoting ethical measurement, evaluation, research, and learning."
Amy Zangari, former Senior Technical Advisor, Measurement, Evaluation, Research, and Learning (MERL)
I’d just add that MERL is important in the youth development space because it helps us be accountable, transparent with our resources, and ensures we are doing no harm and promoting ethical measurement, evaluation, research, and learning. That means only collecting the data that we need and making sure to consult our participants when creating or adapting instruments so we’re not unintentionally creating harm in how we ask certain questions.
In the youth development space, what’s the most valuable kind of research/evidence and why? What factors determine the kind of research you conduct?
LF: We’re not an academic institution, so our primary purpose isn’t implementing research—it’s implementing the best programs we can. We want to be able to say with confidence that the approaches we have are working, that they are making an impact. That takes a serious commitment of funds, so if we’re already confident that our approaches are working, then it’s better to spend the money on programs. However, as needs and contexts change, we may need to invest in really good, rigorous research—such as impact evaluations—to ensure we aren’t wasting our money on projects that don’t work well. Over the last six years, for example, IYF invested in three impact evaluations to see if our life skills interventions are really making a difference in the lives of young people with respect to things like staying in school, getting jobs, and earning higher wages. This is valuable evidence—especially for country-level policy makers and implementers, and for the broader youth development community.
"The best research is actionable research, where the results give us information that we can act upon to improve programming."
EK: I agree, the best research is actionable research, where the results give us information that we can act upon to improve programming. But, in many ways, we are bound by finances, time, and resources. For example, multi-year randomized controlled trials (RCTs) are regarded as the gold standard for experimental research—and they are great in theory and awesome if we have the resources. But they also take a lot of time, and often there isn’t funding available after a program closes. So, there are trade-offs. And, as Linda said, we’re not an academic institution. If research doesn’t translate into improving outcomes or the ability to provide better services, then it’s not as valuable as research that might be a little simpler, or maybe even qualitative, but which provides rich information that leads to actionable next steps. In 2019, for example, we set out to create a new Life Skills Survey Tool (LiSST). The process arose from the need for a more reliable and valid survey tool for measuring life skills change. The result is a tool that we can use in future programs and stand behind with confidence. It will help us better deliver on our mission. It’s a great example of actionable research leading to real change.
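As an aside for readers curious what "reliability" means in practice for a survey instrument like the one described above: one widely used internal-consistency statistic is Cronbach's alpha, which checks whether a set of survey items measuring the same construct produce consistent responses. The sketch below is purely illustrative—the function name, example data, and thresholds are hypothetical and not drawn from IYF's actual LiSST methodology.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix.

    alpha = (k / (k - 1)) * (1 - sum(item variances) / variance of totals)
    where k is the number of items. Values near 1 suggest the items
    consistently measure the same underlying construct.
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)       # per-item sample variance
    total_variance = scores.sum(axis=1).var(ddof=1)   # variance of each respondent's total
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical example: 5 respondents answering 4 Likert-scale items (1-5)
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])
alpha = cronbach_alpha(responses)  # high alpha -> internally consistent scale
```

A common rule of thumb treats alpha above roughly 0.7 as acceptable for a scale, though the appropriate threshold depends on how the instrument will be used.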
AZ: Practical research—research that is used—is the most valuable. And sometimes it’s not appropriate to have an RCT, or other certain kinds of research, because beneficiary populations are so unique. It’s important to work with our staff, especially in country offices and local partner organizations, to make sure the level of rigor is appropriate and takes into consideration their needs, and the needs of those we serve. In the international development space, it’s also important for donors to understand what indicators or measurement tools are appropriate for the type of project they’re funding. For example, for a systems change project we should focus on the larger ecosystem, so we may not measure simply the total number of participants who went through a training, but rather how many teachers received upgraded skills or training.
What should MERL practitioners in the international development space (and donors who fund our projects) think about going forward?
"I would like to see donors being more flexible about how they think of evaluations. Evaluations … can be used to evolve a project as contexts and needs change and to make projects more dynamic."
LF: I would like to see donors being more flexible about how they think of evaluations. Evaluations aren’t just valuable as a summative tool to say, at the end of a program, whether it worked or not. Developmental evaluations, for example, can be used to evolve a project as contexts and needs change and to make projects more dynamic. I think this is starting to happen, and I hope it continues. It would also be great to see investments in long-term follow-up data collection to understand the impact of projects over the long haul.
EK: We in the international development space have a tendency to focus on descriptive statistics and output metrics, such as who we serve and how many trainings we put out. But there’s more to be explored about why things work, or don’t work. I hope we can begin looking more into these processes and mechanisms that explain how and why we achieved the change that we think we achieved. Doing so will inform where we invest and how we design our projects. I also want to reiterate the importance of equitable MERL, which is an important part of decolonizing international development. There’s a trend towards having field offices and local partners take more ownership of work. I’d like to explore how we can better partner with local offices and implementers to carry out MERL work that is relevant and impactful for their contexts.
AZ: As practitioners, it’s important that we continue to share stories—what worked, what didn’t work—so we are consistently building up MERL in the international development space. This will help us stay true to our principles and ensure that the data that’s being collected is being used to implement positive change.
Below, IYF's MERL team shares recent select research, tools, and resources.
- PTS Evidence Brief #1: Evidence of the Impact of IYF's Passport to Success (PTS) Life Skills Training
- Evidence Brief #2: Evidence on the Reliability and Validity of IYF's Life Skills Survey Tool (LiSST)
- Co-Creating for Impact: Five Steps to More Responsive Youth Programming
- COVID Check-In: How Young People are Coping With Mental Health Challenges in Tanzania
- COVID Check-In: Reinvention Through Skills Training in Mozambique
For updated content and a deeper dive into MERL at IYF, click here.
To discuss MERL at IYF or to inquire about MERL partnerships, please contact Linda Fogarty, Director of Measurement, Evaluation, Research and Learning.