Today more than ever philanthropies and nonprofits are tackling innovation in earnest, from impact investing to hiring innovation officers. And yet, as a sector, we rarely engage in research and development (R&D) to optimize our interventions before we invest in replication and scaling. We need intentional processes that enable us to take calculated risks, test and refine new approaches, and launch better-designed initiatives before we move to rigorous impact assessment.
Most grant-funded efforts are evaluated on their ability to meet static measures on predetermined outcomes. But how do you determine outcomes for an untested program—based on a set of hunches—that you think will produce better results?
Developmental evaluation (DE) turns the traditional role of evaluation on its head by focusing on right-time data and feedback to hone interventions. Think product prototyping, or beta testing, with a social purpose. Done well, DE can be a critical driver of innovation in the social sector.
Our experience evaluating the Bill & Melinda Gates Foundation’s Community Partnerships portfolio offers an example of DE in action.
Increasing college completion among low-income young adults so that they earn credentials with fair market value is one of the foundation’s education imperatives. Efforts to date aimed at college completion have resulted in a fragmented system that leaves many students behind, particularly students of color, low-income students, and first-generation students. The foundation wanted to know: What would happen if an entire community bought into a college completion agenda? What would it take, and who could champion it? How would you get partners—business, higher education, and school districts—to work together differently?
Since 2009, the foundation has supported seven cities with nearly 350 community stakeholders, two intermediaries, and the OMG Center as the evaluation partner to begin to tackle these questions together. Through our engagement, we identified five DE principles that can fuel R&D in the social sector:
• Reset expectations from impact assessment to learning. DE is not about reporting on the impact of the investment per se, but rather about using data to co-design an investment strategy that is most likely to lead to impact. Our role as evaluators is to work alongside the funder, intermediaries, grantees, and their partners, to document and assess which strategies, and under which conditions, are most effective in getting a community to take up a college completion agenda and increase student success. For example, the history of collaboration in a community or the level of formality among stakeholders has implications for both the strategy and the selection of appropriate progress measures.
• Use your theories as guides. The Community Partnerships work is not about testing an idealized model. In fact, at the beginning, the only elements of the theory of change (TOC) were four broad areas of work: building commitment, using data, developing partnerships, and addressing policy and practice change. The TOC was intentionally ambiguous—allowing for real R&D to take place as sites began to interpret the work based on their individual contexts and knowledge. Our role as evaluators is to reduce the ambiguity of the TOC by clearly articulating how change happens in diverse contexts.
• Fine-tune questions, methods, and measures as you learn. Just like the TOC, all of the elements of our evaluation plan keep shifting as we learn more from the work on the ground. We drop methodologies, introduce new questions and measures, and identify and measure outcomes simultaneously. For example, where we initially sought to measure commitment building, we now identify and track how stakeholders act differently as a result of their commitments. Does the mayor regularly ensure that new education initiatives support the completion efforts? Will the superintendent cut programs if they don’t support college readiness?
• Set up structures for shared reflection and strategizing. The power of R&D is in reflection on and fine-tuning of strategy based on timely information. Seeking as many perspectives and involving as many stakeholders as possible—despite the blurred lines it can create between the evaluator and partners—makes for richer conversation and a more grounded strategy that resonates with different players. It also builds collective trust and dedication.
• Diversify the talent of the evaluation team. DE is messy, fast-paced, and complicated, and it requires skills beyond content and methodological expertise. To make DE work, you need a team with sharp facilitation skills; expertise in identifying informal systems and power dynamics; strong advising aptitudes; and persistence in the face of ambiguity.
Implementing these practices won’t lessen the challenges inherent in addressing social inequities. But by using an evaluation approach more in tune with complex and rapidly shifting initiatives like the Community Partnerships portfolio, we can gain the insight necessary to develop more agile approaches to social change: approaches that are measurable, sustainable, and scalable.