Foundations and social entrepreneurs seeking to maximize the impact of their investments should consider the dramatic leveraging potential of program evaluation. Many foundations use program evaluation to monitor and learn from their programs. But some (including the David and Lucile Packard Foundation) have taken program evaluation to the next level and used it as a powerful strategic intervention for large-scale policy and systems change.

In a recently released report, we describe how the Packard Foundation used rigorous program evaluation, together with active networking, technical assistance, and communications to boost the impact of its work in children’s health insurance. While the report details a specific grantmaking strategy, the lessons from that experience (including engaging important audiences early and often, and delivering evaluation findings effectively) are directly relevant to many funders and strategists seeking to bring about meaningful social change.

The foundation’s experience using evaluation as a strategic intervention began with an effort to leverage an innovative local initiative, the Santa Clara Children’s Health Initiative (SCCHI), into a program of health insurance coverage for all children in California. SCCHI built on existing federal-state coverage programs for children in families with low to moderate incomes (Medicaid and CHIP, the Children’s Health Insurance Program) by adding a new, locally funded program for children not eligible for those programs and a single, very simple intake process for all three, thereby providing coverage for all children in the county. The foundation’s strategy was to support SCCHI and promote its dissemination to other California counties to create a “tipping point” that ultimately would lead to a statewide program of coverage for all children. Central to this strategy was a rigorous evaluation of SCCHI that would produce robust, policy-relevant findings in real time.

The findings from the evaluation were compelling: Access to care increased dramatically (for example, the proportions of children with a usual source of primary care and of dental care rose by 80 percent and 170 percent, respectively); children’s health outcomes as reported by their parents improved; and the frequency of three or more school absences a month was halved. In addition—and of particular interest to local leaders—by enrolling many previously uninsured but eligible children in Medicaid and CHIP, SCCHI brought substantial state and federal dollars into Santa Clara County. The foundation disseminated these findings to state and local officials using short issue briefs, press events, and in-person briefings, and supported replication of the SCCHI model in other counties by launching a technical assistance center. The findings also helped persuade other funders to support SCCHI-type programs in more than half of California’s 58 counties. Ultimately, the evaluation findings helped make the case for a new state program to cover all of California’s children. (Unfortunately, although the legislature passed the bill to establish the program, the governor vetoed it on fiscal grounds.)

Building on its experience in California, the foundation integrated strategic evaluation into a multi-state grantmaking strategy, Insuring America’s Children (IAC), designed to build momentum for a national program to cover all children. IAC combined funding for state-based children’s advocacy organizations with extensive networking, technical assistance, and communications support for advocates in both funded and unfunded states. For this initiative, the evaluation focused on identifying effective advocacy strategies, and the foundation shared the findings with advocates in all states and with other funders to help grow and strengthen the momentum-building effort and attract additional resources to it.

Here are some of the most important lessons we identified:

  • Beyond its value for program monitoring and internal learning, program evaluation—when coupled with sophisticated communications, technical assistance, and networking support—can be a powerful strategic intervention in its own right.
  • To maximize its impact on the policy process, the focus of the evaluation must be informed from the outset by the interests and concerns of the principal stakeholders, who should also be engaged in disseminating the findings.
  • To maximize the impact of evaluation findings, funders must frame them in terms most relevant to the policy process and must deliver them to strategically important audiences in real time.
  • While rigorous quantitative findings usually have the greatest traction with policy makers, less costly qualitative evaluations can also be of great value to advocates, funders, and others seeking to bring about change.
  • Effectively using evaluation as a strategic intervention can be highly labor-intensive and may require greater staffing capacity than many funders currently devote to evaluation. Evaluation and/or program staff should include trained research professionals with an understanding of the policy process and a thorough grasp of the design, implementation, and strategic application of program evaluations.

Using evaluation as a strategic intervention may not always be an appropriate way for funders to increase the impact of their investments. Some interventions may not lend themselves to rigorous evaluation; in other cases, the positive impact of an intervention may not emerge for a long time. But when credible, timely findings can add force to a comprehensive strategy, rigorous evaluation can be a highly effective tool for those seeking to bring about meaningful social change.