Foundations

Stop Funding Duplicative Projects

Field scans provide crucial data about what has been funded and where funding gaps lie.

(Illustration by Aad Goudappel)

In late summer 2009, several months before the climate change conference in Copenhagen, I was asked to moderate a three-day discussion among a group of climate change experts in Europe. As leaders from Brazil, China, the United Kingdom, and the United States debated the national and international security implications of climate change, it became clear that leaders in many diplomatic and foreign policy circles understood little about climate change. Without better communication between the security and climate change communities, the chances of meaningful action in Copenhagen were slim.

What should be done? Some argued that a major research initiative be undertaken linking climate change with national and international security issues—a nexus we envisioned as “climate security.” But such an initiative would be expensive and time consuming. Had others done similar work that we didn’t know about? Without that background, how could we know what research would be meaningful? And was funding research even the right way to go?

I approached my client, the Planet Heritage Foundation, a newly created foundation that hosted the gathering, to fund an initial field scan. The scan would include interviews with the funding community in the United States and Europe as well as with leading research experts. Through the scan, we would learn what others in the field believed would be useful and additive and test our assumptions coming out of our three-day retreat. And the cost would be a fraction of launching a major new research effort.

Why a Scan?

The Planet Heritage Foundation was intrigued but initially pushed back: Why fund a scan? Couldn’t the money be better spent launching a research agenda, especially because climate change posed such a huge threat with limited time to act? Why spend money on a scan when the resources could be given to deserving NGOs right away?

These questions were legitimate. But my decade as founding director of the Philanthropy Workshop West at the William and Flora Hewlett Foundation had taught me the importance of understanding context, of having a framework for funding before launching an initiative. I had seen the value of scans undertaken by major foundations and the kinds of insights they provided about where additional philanthropic money could be truly well spent. Done thoroughly, field scans provide data not only about what has been funded, but also about where gaps exist; they point to underappreciated problems and where a small investment could unlock a major new area of much-needed engagement. Unfortunately, many field scans are kept proprietary by the commissioning funder, and their lessons are not shared with others.

In other cases, I had seen high-level field scans that examined funding flows without analyzing their effectiveness. Although useful to a degree, these scans often missed the critical questions: What is really working—and not? Where could more money make a significant difference?

Sadly, I have seen the fallout from funders who launched expensive initiatives without taking time to look around to see what else had been done before. It is the equivalent of “fire, ready, aim”—money spent on projects nearly identical to ones that others had already funded, resulting in a lost opportunity to learn from others’ trials, efforts, successes, and failures.

Why does all this matter? Shouldn’t foundations just fund good projects and assume that well-regarded NGOs know best how to spend the resources? The answer is astonishingly simple. Because the field of philanthropy historically has done a poor job of commissioning and sharing fundamental field research, funders often fund duplicative projects, NGOs unwittingly repeat the work of others before them, and we miss opportunities to fund work that is crying to be done but unnoticed in our haste. We fund projects that mirror other efforts without even knowing it. And ultimately, we waste money by not identifying the critical areas or organizations where more funds could be effectively used.

In many other fields of endeavor, including business, doing market analysis is fundamental. Few venture capitalists or investors would fund startup companies without understanding market trends, opportunities, and gaps. Making a substantial investment without that deep market understanding would be seen as foolish.

But in philanthropy, a field that historically has seldom commissioned or shared this kind of analysis, we too often leap to fund a promising project first, then ask later what else is needed or even what else already exists. We often start by asking which organization or individual is doing the best work, creating a funding bias toward well-known NGOs or the hottest newcomer on the block. In so doing, we often skip over the fundamental underlying questions: What needs exist in a given field? What has been tried before and succeeded—or not? And where can a precious additional philanthropic dollar make the biggest splash?

It was in this spirit that I recommended a field scan dedicated to understanding who was working in the important intersection between climate and security—and what was needed going forward. It could be a wise investment. The Planet Heritage Foundation agreed, and we launched a six-month scan.

What We Found

We began by having a team of analysts from Stanford University conduct an extensive literature review. To our surprise, we found a substantial body of reports, going back more than a decade, examining the links between climate and security. More than 20 groups in Europe and the United States had studied the issue in some depth. As we gathered these reports, it became apparent that an idea from our European discussions—the need for fundamental research on the link between climate and security issues—was less important than we had thought. Instead, there was near-uniform consensus that conveying the central insights from this research to policy circles, to inform decision making, was far more necessary.

As we began our interviews, which involved detailed conversations with more than 20 experts from the United States and elsewhere, we were surprised again, this time by requests from all sides that we share our findings. Researchers and funders who had examined this link, or who were at least intrigued by it, were grateful that a new funder was systematically looking at the field. Many of them knew vaguely of work others had done, but no one had the big picture. Many field scans are commissioned privately and never shared; the reasons are varied: a desire for confidentiality, sensitivity about criticizing other efforts, or proprietary considerations. The Planet Heritage Foundation believed that the value of sharing our findings outweighed those concerns and agreed to publish a version of the report.

We were also surprised to learn that only relatively modest amounts of money had been invested in this particular subset of the climate change agenda. Millions of dollars had been spent on climate change efforts overall, but only a tiny fraction on climate security, despite the existence of well-regarded groups that had achieved success in the area. In short, the scan pointed to a true funding gap: for small amounts of money, important but underfunded groups could do powerful, much-needed work.

What We Shared

Six months after launching the field scan we published a report, “Climate Change and National Security: A Field Map and Analysis of Funding Opportunities,” that has influenced the field in ways we never imagined. It has been featured in national conferences of major climate change funders, has spawned briefings for leading organizations in the field, and has found its way into influential policy circles. The report identified areas of both consensus and disagreement in the field, and it continues to provide a valuable roadmap to funders and analysts alike.

For me, the response reaffirmed the critical importance of taking time to understand context by undertaking and sharing field scans. I have since had the privilege of working with other funders to commission scans in divergent areas, ranging from understanding the root causes of, and possible solutions to, the epidemic of violence against women in the Democratic Republic of the Congo (DRC) to analyzing the aftermath of post-earthquake funding in Haiti. In each case, the scans identified where popular interventions were working, where others were not (and indeed had inadvertent negative consequences), and where a donor could tackle an issue's origins rather than just treat symptoms.

For example, the DRC analysis suggested that small investments in local nonprofits working on women’s leadership, media capacity-building, and land disputes could make a real difference in reducing violence against women. In Haiti, we learned that money spent on certain health interventions had negative consequences in the health sector and that underfunded areas included youth-focused professional development and rural agriculture.

More funders are realizing the importance of field scans, but we can do better. First, we should encourage comprehensive scans that not only list "who funds what," but also analyze program effectiveness. Second, we should find ways to share what we learn with other funders and nonprofits. Imagine what could happen if more funders were to share, at a minimum, the core findings of their reports or portions of their analyses. Sharing could foster honest discussion, encourage collaboration, minimize redundant funding, and redirect money to issues that are begging for attention.

To be sure, these scans take time and money to do well. And a scan alone will not invariably lead to better funding outcomes. But as we learned from our experience in Europe, money spent doing the analysis up front can be leveraged dramatically when shared more broadly and can point the way to interventions that may be genuine breakthroughs.

COMMENTS

  • BY Nomor Papersonpapers

    ON June 6, 2013 11:50 AM

    Do we really need more reports written about reports? Haven't we started researching away philanthropic money at the expense of programs that deliver real impact on the ground? Good projects stand on their own merit, even if they duplicate the efforts of other projects, just for different constituents. Research for advocacy and education is certainly valuable, but researching research seems a bit of a waste; after all, isn't this a problem Google solved long ago?

  • BY We loves more papers on papers!

    ON June 6, 2013 02:16 PM

    While it's true that papers on papers have gotten a bad name (sometimes by the very people who have published these papers), I still think more longitudinal research is needed on their impact on the personal gratification and feelings of self-worth of the funding sources.

  • BY Jim Fruchterman, Benetech

    ON June 7, 2013 03:25 PM

    I agree with Christine. As someone who does technology in the nonprofit sector, I see a steady stream of "new" ideas that are portrayed as breakthroughs likely to have massive impact. However, it seems that few people do the research to find out that the "breakthrough" has been tried multiple times and failed each time.

    I do recognize that sometimes the fifth attempt at a project or company idea can be the successful one. But usually that successful attempt is informed by the past failures, rather than bumbling in ignorance down many of the same bad side paths.

    Or there may be five players doing something similar. But you should understand why the project you pick is likely to have better impact!

  • BY Daniel F. Bassill

    ON June 12, 2013 08:20 AM

    Christine, thanks for the article. I recall being part of a planning group in the early 1990s that was seeking multi-year funding for a major foundation initiative. As I sat in the meetings, I realized that I had access to many more reports in my own library than this group was drawing from in its deliberations. Yet the group still ended up getting funded.

    I started leading a volunteer-based tutor/mentor program in Chicago in 1975, as a volunteer, while holding a full-time corporate advertising job. I had no previous experience leading such a program, so I began to collect information from various sources while also reaching out to others in the city to create "lunch & learn" sessions. Over many years this led me to build a database of peers and a library of information that others would visit and use.

    In 1993 I created the Tutor/Mentor Connection with the goal of collecting "all that was known" about inner-city tutoring/mentoring programs: who they were, where they were located, why they were needed, what tools each needs to constantly improve, and so on. The aim was to share this information to help existing programs improve, new programs start, and donors and business leaders become more strategic in how they support such programs.

    Since then I've amassed a huge library, which I outline with concept maps (http://tinyurl.com/TMI-library). I've also created a map-based directory of non-school tutor/mentor programs, which you can search by community area, type of program, and age group served. Leaders who are interested in investing in youth education, violence prevention, public health, and/or diversity initiatives can thus choose which neighborhood they want to support, then which programs in that neighborhood they want to partner with. In some cases there are few or no programs in areas with several thousand high-poverty youth.

    Thus, this is more than a scan. It's an ongoing effort to collect and share information that could be used by anyone who is interested in helping youth in high-poverty neighborhoods. Collecting the information is just one part of a four-part strategy that I describe in this PDF (http://tinyurl.com/TMI-4-part-strategy). All parts need to be funded and have ownership in many places.

    Yet, as you've suggested, it is difficult to find funding for such work, or to sustain it from year to year. The value of the library is underappreciated, and without advertising and facilitation dollars, few people find and use it.

    There is a growing emphasis on collective impact in articles on SSIR, yet the knowledge that supports such efforts needs to be collected and maintained on an ongoing basis.  I’ve not seen too many articles talking about this, or showing how it might be better funded.

  • BY Paula Cohen

    ON September 7, 2013 12:51 PM

    Reading Ms. Sherry's account of securing support for a 'field scan' for the Climate Change and Security project brought to mind how Knowledge Services equips organizations with tools and techniques to capture the kind of information a 'scan' surfaces and to avoid duplicating activities or 'reinventing the wheel,' but on an ongoing, consistent basis.
    The push-back and skepticism Ms. Sherry encountered when proposing the 'field scan' (funds are better spent on direct programming) is, unfortunately, all too familiar to knowledge and information practitioners working in the social purpose sphere. "Why spend money on a scan when the resources could be given to deserving NGOs right away?" As Ms. Sherry noted, a scan can be an expensive and time-consuming task, but without it, succeeding actions lack a solid base to move forward on. Consider the value of a strategic, well-defined, and funded knowledge services initiative aligned with the organization's mission: staff can readily access documents instead of endlessly surfing the shared drive; a knowledge audit reveals the kinds of information, knowledge, and strategic learning resources and services people require to do their work, how those resources and services are actually used, and how the organization's knowledge assets are produced (and by whom); and systems are in place to regularly monitor the competitive landscape. The significant value comes from sharing knowledge gained and lessons learned to arrive at a true, realistic assessment of programs and of the impact of dollars.


    As Ms. Sherry began pursuing Climate Change and Security, she observed that "we often skip over the fundamental underlying questions": What is really working—and not? What has been tried before and succeeded—or not? Where could more money make a significant difference? Had others done similar work that we didn't know about? These are serious gaps that, if addressed, would have a positive impact. There is no shortage of examples among social purpose organizations funding "projects that mirror other efforts without even knowing it" and missing "opportunities to fund work that is crying to be done but unnoticed in our haste." Jim Fruchterman's comment on research being the missing piece behind "breakthrough" ideas is exactly the point: upfront research as a routine practice uncovers whether an idea is truly unique, its record of success and failure, whether others are doing something similar, and the process undertaken by other, similar ventures. Capturing this information can make a tremendous difference in the time and resources invested.

    This 'field scan,' while a valuable investment, was a one-time initiative. Yet as Ms. Sherry's experience demonstrates, the information gaps the Climate Change project illuminated occur on an ongoing basis throughout the social sector. Consider the value of embedded practices and techniques that enable an organization to identify, capture, organize, access, share, and reuse its information and knowledge, so that the right information gets to the right people at the right time to make the right decisions. This inevitably serves to reduce redundancy along with other wasteful, time-consuming, and costly practices.

    The saga of the Climate Change and Security project offers an enlightening account of how easily projects at all levels can veer off track when capturing critical information is not a priority. Daniel F. Bassill, founder of the Tutor/Mentor Connection and Tutor/Mentor Institute, LLC, made an apt observation above: "There is a growing emphasis on collective impact in articles on SSIR, yet the knowledge that supports such efforts needs to be collected and maintained on an ongoing basis."
