The latest winner of the Nobel Prize in Economics, Princeton’s Angus Deaton, was described by Justin Wolfers in the New York Times as “an influential counterweight against a popular strand of econometric practice arguing that if you want to know whether something works, you should just test it, preferably with a randomized control trial. In Mr. Deaton’s telling, the observation that a particular government intervention worked is no guarantee that it will work again, or in another context.”
Vincent DeVita, MD, former head of the National Cancer Institute and physician-in-chief of the Memorial Sloan Kettering Cancer Center, is also skeptical, but in a medical context. In his book, The Death of Cancer, he characterized evidence-based guidelines for the treatment of cancer as “backwards looking.” He wrote, “With cancer, things change too rapidly for doctors to be able to rely on yesterday’s guidelines for long. Reliance on such standards inhibits doctors from trying something new.”
Evaluation guru Thomas Schwandt also urges caution in how we approach documenting effectiveness. In the 2015 book Credible Evidence in Evaluation and Applied Research (S. Donaldson, C. Christie & M. Mark, Eds.), he wrote, “... the field of evaluation seems captivated by discussions of methods needed to produce evidence of impact ... [distracting] us from carefully attending to a variety of important issues related to evaluative evidence and its use.” He suggests that “the term evidence base must be interpreted with caution: To claim that evidence may figure importantly in our decisions is one thing; to claim it is the foundation for our actions is another. We would be well advised to talk about evidence-informed decision making instead.”
From a philanthropic perspective, Vivian Tseng, vice president of the William T. Grant Foundation, writes in a similar vein in “Evidence at the Crossroads”: “A narrow focus on evidence-based programs encourages people to run after silver bullet solutions that are not necessarily aligned with the myriad other interventions that they are running.”
These are compelling points of view. When it comes to addressing serious problems such as poverty, and race- and income-based disparities in health and education, the world is beginning to discover that the most effective interventions consist of far more than individual, circumscribed programs. This may help to explain why the tide seems to be shifting away from a narrow focus on experimental evidence of program impact.
Because I assumed we were only in the early stages of this realization, I was surprised that, in a session on this subject at last November’s American Evaluation Association meeting, the message that we need a broader approach to evidence was enthusiastically received. There seemed to be considerable agreement that a narrow focus on trying to identify which programs “work” is actually keeping us from getting better results, and that the social sector’s program-centric focus has rested on several erroneous assumptions: that individual, stand-alone programs can achieve ambitious goals; that if we know from RCTs that a program works in one place, it will work everywhere; and that innovation won’t be discouraged by an overarching reliance on programs that have been shown to work in the past.
No one questions the importance of evidence. But it is time for all of us to think more expansively about evidence as we strive to understand the world of today and to improve the world of tomorrow.
Don Berwick, health policy reformer extraordinaire (and my colleague in the Friends of Evidence), describes the situation this way: “The world we live in is a world of true complexity, strong social influences, tight dependence on local context—a world of uncertain predictions, a world less of proof than of navigation, less of final conclusions than of continual learning.” (“Eating Soup with a Fork,” Keynote, 2007 Forum on Quality Improvement in Health Care.)
To get better results in this complex world, we must be willing to shake off the intuition that certainty should be our highest priority. We must draw on, generate, and apply a broader range of evidence to take account of at least five factors that we have largely neglected in the past:
- The complexities of the most promising interventions
- The practice-based evidence that spotlights the realities and subtleties of implementation that account for success
- The importance of fitting interventions and strategies to the strengths, needs, resources, and values of particular populations and localities
- The heavy context-dependence of many of the most promising interventions
- The systematic learning and documentation that could inform future action
One way to accomplish this goal is for all those involved in intentional social change—including philanthropies, public policy makers, and nonprofit organizations—to pursue knowledge development in a way that ensures all public and philanthropic funding is evidence-informed, so that we can reliably achieve greater results at scale in tomorrow’s world. For a start, this would require:
- Investment in structures that could identify the common underlying elements of diverse attempts to reach similar goals
- The development and maintenance of directories that address contextual factors, specify whether and under what circumstances programs are likely to be effective in new settings and with new populations, and include work on systems and community change
- A means of identifying ways to make systems more hospitable to interventions that are evolving and improving, and of taking the challenges of implementation seriously
This approach to knowledge development and learning, in the United States at least, would contribute substantially to the nation’s capacity to solve big problems. Of course, solving big problems takes political will, not just more and better knowledge. But by becoming smarter in how we approach the generation, analysis, and application of knowledge and evidence, we can contribute mightily to building the needed political will.