When we launched the Hewlett Foundation’s Madison Initiative in 2014, we were excited to support nonprofits, advocates, and researchers who shared our audacious goal of improving the US Congress’s effectiveness in a polarized age. But after experiencing some of the all-too-common pitfalls that foundations can stumble into with grantees, we soon decided that we had to reevaluate our grant practices.

Some of these challenges came with the territory: Funders in the democracy field have long emphasized short-term, project-based grants. Funding tends to ebb and flow over the recurring two-year political cycles.

Yet some challenges were self-inflicted. As we developed our initiative, we wanted to learn by making a range of smaller bets. So we asked grant-seekers to provide us with theories of change, performance indicators, hypotheses they were testing, key risks and mitigation strategies, and so on. When proposals came back with incomplete or misconstrued responses, we gave grant-seekers more specific instructions and elaborate tables to complete. However, the situation didn’t improve. We began hearing half-in-jest comments from applicants about the difficulties they had filling out what one referred to as “the infamous Hewlett grid.”

Meanwhile, our program team felt stretched and worried that we were veering off course. Under the Hewlett Foundation’s intentionally lean staffing model, we had three people to orchestrate the initiative’s $15 million in annual grantmaking, and that was not going to change. We needed a more sustainable workload. We also sensed that we were not devoting sufficient time to learning from the grantees’ work as they reported back to us.

So 18 months into the initiative, we set out to redesign our grantmaking practices to make things better for our grantees and our team while clearing the way for more learning and impact. To do so, we realized that we needed some additional information and perspectives at the table. In partnership with our grants officer and Hewlett’s Effective Philanthropy Group, we researched best practices in foundation-grantee relations and had our grants officer conduct in-depth and confidential interviews with a half dozen Madison grantees to understand their needs and experiences in working with funders. We also analyzed all the grants we had already made to see what patterns were emerging.

After analyzing this rich array of information, we decided to make two substantial changes: in how we engage grantees in the proposal and reporting processes, and in the type of grants we make.

Simplifying Processes and Emphasizing Shared Learning

We recognized early on that we were asking grantees for a lot of information that we didn’t actually use in grantmaking decisions. We also came to appreciate that the budgets and metrics we were getting from our grantees were in effect artificial—created solely for proposing and reporting on our grant. Grantees were not using them in their day-to-day work.

Then our grants officer reported on the aggregate impressions from her confidential interviews with grantees. We got an earful. Our elaborate processes and requirements, stacked on top of those of their other funders, left grantees feeling frustrated, confused, and resigned to just go through the often duplicative make-work to get a grant from us.

We decided, as a first step, to pare down the information that we were asking of grantees. We now give them three simple prompts:

  • Assuming that things go really well for the work that would be supported by the proposed grant, please describe what success might look like in the longer term, i.e., beyond the grant period.
  • In the near term, during the grant period, what are 2-4 indicators that you would plan to use to assess your progress? Where possible, please specify indicators that you are already tracking to ensure they are relevant and useful.
  • What potential developments or challenges—internal or external—could make it harder to realize the longer-run vision described above?

We tell grantees that one to three paragraphs will suffice for each answer. For general support grants, we also ask them to send us their annual report. For project grants, we ask for a budget and a short description of the work being funded. To avoid mindless rework, we now tell all grantees wherever possible to send us write-ups and budgets they have already prepared for other funders, and that if we need additional information we will get back to them.

We also learned that grantees experienced our process as a black box. After all the work we asked them to do, they heard only crickets for weeks and in some cases months, interrupted by random requests for an updated wire transfer form or a copy of their IRS determination letter. They wanted (and deserved) clear and timely information about how and when we would review the grant proposal, when we would be back in touch, when they would actually receive the money if we proceeded, etc. So we began providing this information upfront, in the email in which we invited the grant application.

Another imperative of our redesign was to devote more time and attention to learning from our grantees’ work after we funded them. We were spending the vast majority of our time getting money to grantees, and our back-end process for grantee reports felt like an afterthought. We would prompt them by annual automated email, and they would send back a report into what must have seemed like a void.

How could we make this process more focused and fruitful for grantees and our team alike? As with our grant applications, an important first step was distilling the information we were requesting into a few essential questions:

  • What have been the biggest high points and/or promising developments that you experienced in your work over the past year?
  • What have been the biggest disappointments and/or challenges that you have experienced in your work over the past year?
  • What are the most important things you have learned over the grant period? Have these lessons led you to adjust your plans in any material respects?

We encourage grantees to refer back to the long-term vision and progress indicators they identified in their proposals when answering these questions. And here, too, we tell them when possible to simply send us recent reports they have prepared for other funders if they speak to most of our questions.

To personalize things and ensure grantees know we are taking their reports seriously, our program officers now craft customized emails to let grantees know a report is due and to flag any topics over and above the three core questions that we’d like them to address. We have also established the practice of talking with grantees to jointly reflect on and learn from each report they have submitted. We’ve found that these conversations almost always yield helpful information beyond the report, especially on how grantees are seeing and experiencing developments in the field.

Additionally, in our team meetings we now discuss grantee reports before turning to prospective grants to ensure that the former do not get short shrift. Every month, each program officer prepares a 3- to 5-bullet summary of the key takeaways from their recent grantee reports and follow-up calls. We share and discuss these with our full team so we can collectively connect the dots and make sense of what we are learning.

Moving to Larger, Longer-term, General Support Grants

In our redesign process, we changed not just how we were engaging with grantees but also the kinds of grants we were making. An analysis revealed that half our grants were funding short-term projects. No wonder we and our grantees felt harried! Our grantee interviews underscored what we knew from best-practice research and our own experience working in nonprofits: that larger, longer-term, general support grants are the most valued form of funding.

With this in mind, we decided to shift our default operating mode from providing short-term project funding to longer-term general support whenever possible and appropriate. We recognized this meant we would be making larger grants, but fewer of them. This shift presented obvious tradeoffs, experienced most directly by grantees whose short-term project funding we did not renew. But our remaining grantees benefited from more support.

We also benefited internally, not simply because the rate of activity became manageable for our small team, but also through more and better conversations about which grantees and areas warrant these deeper investments. We now spend less time refining project plans and budgets, and more time reflecting on the strengths and areas for development of our anchor grantees—how we can help them build on the former and shore up the latter.

With the reduced number of project grants that we now make, we decided to address another problem flagged by our research: the chronic underfunding of the indirect costs grantees incur in carrying out such projects, which erodes their capacity to function and sustain results over time. The Bridgespan Group has termed this dynamic “the nonprofit starvation cycle.”

We are determined to avoid falling into this cycle with our grantees. Our grant application now includes a paragraph describing our commitment to fully covering indirect costs and encouraging grantees to accurately represent them. We know from benchmarking with several of our grantees that their indirect costs typically range from 20 to 60 percent of their direct costs for a given project. So now when grant applicants report indirect costs below 20 percent, we double-check with them to make sure they aren’t inadvertently low-balling their budgets.

Since making these changes, our median grant size and median grant term have both increased by 25 percent. And most strikingly, the share of our grants budget going out the door as restricted funding for specified projects has fallen from 50 percent to less than 20 percent.

We can measure how these changes are affecting our funding patterns and internal processes, but of course the real test is whether they are helping our grantees by easing their administrative burdens, increasing their freedom and flexibility to respond to challenges and opportunities that arise, and improving their long-term sustainability.

We have gotten anecdotal feedback from grantees that they are indeed experiencing these as positive changes. But as this process reminded us, grantees are, for understandable reasons, not always forthcoming in giving their funders constructive feedback. So we will carefully consider the confidential grantee feedback that the Center for Effective Philanthropy will gather for us next year and compare it with the feedback we received from the same survey just before we implemented the changes.

Learning from Our Experience

Are the changes we made right for other teams of grantmakers? That depends on their circumstances and goals; one size does not fit all here. To help others decide whether our approach may be relevant, we’ll share some questions we found to be helpful prompts for discussion on our team:

  • Do we review and use all the information we ask grantees to produce in their grant applications?
  • Do our grantees clearly understand our decision-making processes and timelines?
  • Do we read, reflect on, and discuss all the reports that we ask grantees to prepare?
  • If we step back and are really honest with ourselves, do we think that our grantees find us easy to work with?
  • Are the size and duration of our grants preserving if not enhancing the financial sustainability of our grantees?
  • Do our grants provide grantees with the degrees of freedom they need to respond flexibly to developments in the field?
  • Is our internal workload sustainable?
  • Are we devoting ample time as a team to learning and adapting our strategy?

If the answer to one or more of these questions is “no,” you may find it beneficial to revisit and refine your grant practices. Your grantees certainly will appreciate it!
