It’s easy—and fashionable—to talk about engaging youth in development programming, but it’s much harder to make it happen. Everyone agrees it’s important, but commitment to meaningful youth engagement falls short without a strong plan and a structure for true collaboration.
Developmental Evaluation (DE) is one solution. First proposed by Michael Quinn Patton in his 2010 book Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use, DE is a well-established approach to adaptive management, or collaborating, learning, and adapting (CLA). With DE, projects stay flexible, adapting within complex and uncertain conditions to meet shifting needs.
In a recently published article I co-authored for Gates Open Research, my colleagues and I offer a practical approach to use DE for collaborative program design. Our example describes DE in an HIV-prevention (PrEP) protocol in a dynamic and complex context. But the approach is also ideal for international positive youth development (PYD) programming—which promotes engaging young people as agents for change—because, as my co-authors and I point out, the DE approach “is deeply and continuously informed by those the intervention is meant to serve and fundamentally rooted within the context of beneficiaries’ communities.”
For projects aimed at understanding and responding to young people’s evolving concerns, DE is a powerful companion. It’s a system for co-creation. In the article, we describe this simple 5-step process.
Step 1: Collect. Use data already available or find ways to collect data to track what’s most important to you. Talk to young people to continuously define the most urgent problems and rate program success. Pull together key data using simple graphics.
Step 2: Review. Invite a wide range of people to routinely review the data. Include program staff, partners, and government representatives, and—perhaps most importantly—young people themselves. For example, young people reviewing a difference in service use by girls compared to boys may have ideas about why the difference exists and what to do about it.
Step 3: Reflect. Think together about what the data means. How should the program change to better meet young people's needs? Consider actions within the control of those in the room. Who is being missed? What other information could make the program stronger? Who else should be invited to review and reflect on the data? Perhaps young people with disabilities are not represented, or private-sector partners who could speak to the relevance of work programs are missing.
Step 4: Record. Write down the reflections and planned actions. This improves accountability and keeps everyone on track and working together. In this step, young group members might use social media to post and share the plan and track progress, which puts them in an empowered position to hold others accountable.
Step 5: Act. DE is essentially utilization-focused: the data is collected, reviewed, and reflected on for the purpose of making improvements that address the needs of beneficiaries. Encourage actions by all partners. Engage young people as community leaders and ambassadors who advocate for their peers by bringing in subgroups that were not engaged at the outset. For example, young people could create messages to reach youth currently missed by services, or develop youth clubs to build civic engagement.
In the HIV example, we found that “used consistently, DE helped adapt and refine services, improve service access, reach target audiences and improve continuation rates.” And this was done only with and through the voices of those the program served. The same can be done for PYD programming, using DE to create a plan and structure to harness youth voices for responsive programming.
Linda Fogarty is IYF's Director of Measurement, Evaluation, Research and Learning (MERL).