COMMENTS
BY Albert Ruesga
ON May 23, 2007 08:19 PM
Hi, Perla. Thanks for this post. This is the only time I ever get to see you. How’s it going with GreatNonprofits?
Keep in mind that a lot of the formality in the sector comes in reaction to a real or perceived threat of audit by the IRS. A program officer might feel that she has enough information—through personal contact with a grantee—to approve or close a grant, yet nevertheless request a formal report for the official, auditable file.
BY Michael Charter
ON May 24, 2007 03:28 PM
Keep in mind also the nonprofits that rely primarily on federal, state, and local municipal grants for funding face a nearly overwhelming burden of reporting to the various government agencies. Talk about bureaucracy!
BY Laura Foulke
ON May 24, 2007 03:31 PM
This is a big problem. Foundations want to show outcomes and ask nonprofits to articulate their “theory of change” and the outcomes it seeks to achieve. Unfortunately, the vast majority of foundations don’t track or monitor outcomes. I find writing grant reports to be tedious and time-consuming, and they often don’t correspond to our work cycle/calendar. I am happy to write for those who read them, but they are few and far between… Usually it is a Grants Administrator / Assistant who calls to make sure the report is “on file.” I would think that the IRS would only be interested in documentation that the grant was spent in accordance with the grant agreement. This year, I submitted a report for a grantmaker (small family foundation) that has supported our work for many years. Once it was submitted, they requested a report from the year BEFORE. Said they needed it for the file?!? I was so confused why they would ask for a report from two years ago when I had just submitted one for this year. Am I crazy, or is there something wrong with this picture?
BY Laura Foulke
ON May 24, 2007 03:34 PM
Could you expand on what you mean when you write that “we’re not really a profession yet”? Who are “we” and what does it mean to be a nonprofit or foundation professional?
BY Richard Wong
ON May 24, 2007 03:36 PM
Dear Perla,
There is a much deeper issue involved. Foundations may each have their own grant reporting templates, which are laborious and repetitive, but within the nonprofit sector there is no universal standard for program outcome measurement: one that adapts to different program types, is user- and reader-friendly, and measures long-term, sustainable activities.
BY John Nash
ON May 24, 2007 05:26 PM
>In venture capital, funders rarely ask portfolio companies for progress reports.
>The response would be, “What? You want me to stop calling potential customers and write a report instead?”
Right on, Perla. Demands for reports that are rarely read, reacted to, or learned from reduce the “net grant” and thus reduce services to target groups (and presumably impact).
One challenge in moving to phone or lunch briefings is that program officers often claim they don’t have time to meet either. Presumably that’s why reports are requested—there’s just no time to meet. This is a tricky barrier to overcome. Perhaps more organizations could take a page from Yvon Chouinard’s notion of leading an examined life. This problem of reporting may only be solved through deliberate examination and reflection by funders on what is minimally necessary to know about progress and what the implications of their bureaucratic footprint truly are.
BY Mary
ON May 25, 2007 06:15 AM
I am a program officer, and I do read grant applications thoroughly. My workload is overwhelming: I review approximately 350-400 proposals a year, plus letters of inquiry, on top of my other responsibilities. I must say it is difficult to keep up with all of the paperwork, meet seven private foundation deadlines, and still try to meet individuals face-to-face or carve out time for site visits. I appreciate the time and effort that grant seekers put into their applications and evaluation reports. Although it is difficult to maintain efficient evaluation record keeping, we are trying to address this situation and provide outcomes and reports to our various foundation trustees so that they may see the impact of their grantmaking. In addition, reports will help us provide information that shows trends/changes in the community, as well as information about successful programs/organizations that we can share with individual donors who may be interested in providing separate grants.
BY Ingo Bensch
ON May 25, 2007 09:08 AM
I wonder whether the issue is one of content. Performance reports can be used to document outputs, outcomes, or net impacts. As an outsider to this process, I suspect that many performance reports emphasize outputs (e.g., clients served, public service announcements run, etc.), which are useful to have in program files for accountability reasons but have limited usefulness beyond that. Measuring outcomes is probably more useful for the foundation and more interesting to the board and others, because outcomes show what the grant helped to do (e.g., an increase in desired client response, a change in a social metric, etc.). Not surprisingly, that is harder to measure and more expensive. Indications of net impacts (i.e., outcomes that can be shown to be caused by the program) are even better because they show the program’s true effect and can yield insights into how to accomplish the goal more effectively. This is hardest to measure and usually outside the capabilities and resources of grantees. Foundations would need to ask for and fund this level of evaluation separately—and get the program evaluation profession involved more. I suppose the cost would need to be justified as an investment in future effectiveness.
BY Dorothy
ON May 25, 2007 04:30 PM
Thanks for raising this topic - and special thanks to Mary for weighing in. I would love to hear from other grantmakers on this. In fact, I’ve been wishing for a while that someone would do a survey on the topic—if there’s any way to design one that would get candid results.
As a development director, I’ve had debates with both my current and previous executive directors about whether it matters what goes into reports. My bosses have urged me not to spend much time on them. I’ve argued that while most grantmakers probably don’t read them, some do. Also, I used to work in philanthropy; so I know that sometimes grantee reports are used later for things like cluster evaluations, program reviews, or historical purposes. But I honestly don’t know to what extent, and I’m beginning to think my bosses might be right.
BY Maria
ON May 31, 2007 08:54 AM
I’ve just had this exact discussion with my boss. We used to request only financial reports at the end of the grant period for audit purposes, but we have just started to ask for brief narrative reports on results and outcomes. We don’t provide a format, though, and each grantee interprets “results and outcomes” differently. We mostly get process-related narratives… I read the reports thoroughly, provide informal (e-mail) reactions to grantees on the information provided, and try to share any insight gained with other grantees working on similar issues, but there isn’t an official foundation evaluation program to collect the information and facilitate the process of reporting to the board. We are currently in the process of developing one, but only for our main focus-area program.
BY Perla Ni
ON June 6, 2007 10:26 PM
Thanks, Mary and Maria, for chiming in and sharing your perspective from the other side of the fence. I think we’re all in agreement that it doesn’t help nonprofits, and it certainly doesn’t help foundations, if the main use is for foundations to have reports “on file.” Everyone seems overburdened.
Is there some other way to do this? This may sound old-fashioned, but I know of a public health grantmaker who intentionally decided about three years ago that he was going to make grants based upon “walking around the clinics and talking to the patients and doctors.” No formal reports. Not even a scheduled lunch meeting with the ED. He’ll just show up every couple of months or so at a clinic he funds and spend a couple of hours walking around, observing, and asking doctors and patients how things are going. Is his grantmaking any more or less effective than that of all the foundations implementing a “theory of change” or “evaluation process”? Unlike a lot of grantmakers, this particular grantmaker has met, talked to, and tried to understand the problem on a human level, from the perspective of people needing the service or people most directly involved in providing it. I think it’s kind of like being a war journalist who has been to the front lines. They’ve seen it, experienced it, and understand it at a level that a desk-bound policymaker may not be able to, no matter how many reports they read.
BY Clara
ON August 10, 2007 12:45 AM
One donor’s perspective…
Love the story of the grantmaker who evaluates programs based on ground-level conversations with patients and doctors. That is my preferred method of evaluation, coupled with informal lunches with the ED. Thank you for reminding me of the VC phone/in-person evaluation model. At the same time, for “larger” grants, I do need to see some sort of annual or progress report. However, I would VERY MUCH like to see them standardized across foundations. It makes absolutely no sense that the ED/dev director should write custom progress reports for each grantmaker. That’s nonsense: completely inefficient and a waste of precious time. Of course, that means grantmakers also have to be willing to “standardize” their definitions of programs.
Since I am utterly opposed to EDs reinventing work, what I do ask is that they give me copies of progress or annual reports they’ve given to other grantmakers. That way, I can still measure “progress” without adding more work to their load.