
Private foundations enjoy a unique degree of freedom to pursue their missions without the constraints that face many other nonprofit organizations. Such freedom can also be a challenge insofar as it comes without the external forces that can drive organizations to achieve results. As Thomas Tierney and Joel Fleishman note in their book Give Smart: Philanthropy That Gets Results, “in philanthropy excellence is self-imposed.” That reality contributed to our decision to create a new framework—called Foundation Performance Assessment—to help us assess The James Irvine Foundation’s performance.

Advancing Evaluation Practices in Philanthropy

This special supplement includes six articles that address basic principles and practices that inform efforts to monitor performance, track progress, and assess the impact of foundation strategies, initiatives, and grants. The supplement was sponsored by the Aspen Institute Program on Philanthropy and Social Innovation and underwritten with a grant from the Ford Foundation.

In this article we describe the Irvine Foundation’s approach to performance assessment and discuss some of the lessons we have learned over the past six years. We write from the dual vantage point of having been involved in creating and evolving the foundation’s approach to performance assessment as well as producing the annual report that describes our performance (report preparation is integrated into our assessment process). We define Foundation Performance Assessment as an effort to assess the organization’s performance by examining the various levers by which the institution can achieve its mission.
The Irvine Foundation was created in 1937 to benefit the people of California. In the 75 years since its inception, the foundation has awarded more than $1 billion in grants to thousands of organizations serving Californians. The majority of our grantmaking falls into three program areas: youth, arts, and California democracy. The youth program helps high school students build a strong foundation for success in both college and career. The arts program promotes engagement in the arts for all Californians. And our California democracy program advances effective public policy decision-making that is reflective of and responsive to all Californians. All three areas are guided by our mission of expanding opportunity for the people of California.

The Performance Assessment Framework

A major strategic planning process in 2003 led the Irvine Foundation to streamline its program focus and create a new approach to performance assessment. As a result of that comprehensive review and planning process, we updated our mission statement and trimmed six diverse program areas to three.

Once we had identified where we would focus and what we sought to achieve, we turned our attention to how to measure and assess our progress toward achieving our goals. We assembled an ad hoc committee of board members and staff to help shape this work, and we spent several months studying the best practices of other foundations that were pioneering foundation performance assessment. Although a few institutions—most notably the Robert Wood Johnson Foundation—stood out, we concluded that few foundations were taking a comprehensive approach to performance assessment, and there was therefore an opportunity for the Irvine Foundation to contribute to the development of new ideas and approaches in this emerging area.

With the few existing models in mind and the knowledge that foundation performance assessment was still nascent, we created the Performance Assessment Framework. With the Irvine Foundation board of directors as the primary audience, the framework addresses six areas—the first three focusing on our programmatic work, and the remaining three providing a more institution-wide view.

We evaluate our performance in these six areas by asking the following six questions. The results of this inquiry provide the basis for our yearly report to the board.

  1. What is the context in our program fields? This section of the performance assessment outlines information that helps our board understand how the Irvine Foundation’s work fits in a broader context. For example, we include external indicators and new research findings that are relevant to our program goals. We also report on grantmaking by peer foundations in similar areas. Board members have indicated that this context is valuable to their deeper understanding of the challenges and opportunities we face in each program area.

  2. What progress are we making toward our program goals? This section reports on evaluation findings and program progress indicators that track the impact of our grantmaking. The progress indicators are developed by each program team and cover a range of information, both quantitative and qualitative. The indicators are organized by the goals and priorities in each of our programs. In many respects, this section covers much of what is traditionally considered to be “evaluation” of a foundation’s work.

  3. How do lessons from our program work improve our approach? This section discusses how we have used our grant monitoring, evaluations, and other engagements in the field to inform and refine our strategies and implementation. Because the Irvine Foundation’s philosophy of evaluation is guided by continuous improvement and refinement, much of our work with the board during the year focuses on this set of questions. In this section of the annual performance report we summarize our activities and reinforce our commitment to ongoing improvement.

  4. How is the foundation exercising leadership in the field? This section, which shifts the focus from specific programs to the broader organization, assesses the ways we can use our leadership platform and voice to extend our impact and advance the foundation’s mission. We do not presume that leadership is conferred upon us by virtue of our resources, but we are also mindful that foundations can play important leadership roles, especially when done with humility and through authentic partnership. Here we assess leadership activities undertaken by the foundation as well as ways we help to frame discussion, often via publications. We are beginning to integrate social media measures into this work.

  5. How do key stakeholders perceive us, and how do their perceptions inform our work? This section reports feedback we have solicited about how important constituents view the foundation—typically collected through third-party surveys, confidential interviews, or other organized forums. We typically include results from Grantee Perception Reports administered by the Center for Effective Philanthropy, website user surveys, and other constituent feedback activities. We have found that the self-imposed requirement to include external feedback in our annual report encourages us to identify more feedback opportunities than we might otherwise—a good development.

  6. How are we performing on measures of financial health and organizational effectiveness? In this section we track a number of indicators related to the foundation’s investment performance, operating ratios and costs, board and staff diversity, and institutional developments. In contrast to other parts of the report, for this section we are able to draw on ready sources of financial benchmarks for comparable institutions in the field and for the foundation’s past performance.

It is important to note that when answering these six questions, we operate with the assumption that the measurement needs to fit the subject matter. As much as possible, we rely on quantitative data for maximum precision and clarity. For some topics in our framework, such as exercising leadership or gathering constituent feedback, a quantitative approach may not be as useful, so we try to balance the quantitative and qualitative.

Evolution, Refinement, and Improvement

Our Performance Assessment Framework has evolved using feedback from the board and our experience creating and using the annual performance reports. The most important evolution has been to change the sequence and emphasis of topics. The framework initially led with a review of new grantmaking during the reporting year, but we have reorganized the report to focus attention on the progress and results of our previously awarded grants. We include a grantmaking summary as an appendix to the annual performance report, but we rely on quarterly dashboards to keep the board members up to date on recent approvals.

An important goal in creating the Performance Assessment Framework was to create a view of the foundation’s work as a whole rather than as a series of parts. It was the quest for a holistic view that motivated us to organize the report by topics—the six questions—rather than by programs. That said, as a multipurpose foundation, we have three very distinct bodies of work in our three core programs, so although we take stock of the foundation as a whole, we do not attempt to aggregate measures into a single index for the foundation. Rather, we weigh progress and challenges to learn from each.

Audiences for Assessment

Over time we have broadened our thinking about the audiences for our performance assessment work. The main audience for our annual performance report is the foundation’s board of directors. The report is one of the primary deliverables for our annual retreat, at which we conduct in-depth conversations with the board about our work. As part of the evolution of our framework and with our board’s feedback, we came to define three additional audiences for our performance assessment work.

Our staff, especially the program staff, are a second important audience that can derive benefit from both the assessment process and results. The annual performance report represents a regular check-in on our ongoing process of strategy development and refinement. Our work in evaluation and performance assessment helps focus program staff on the goals and outcomes for their grantmaking. We have found that challenges in performance assessment often help uncover areas where our strategy needs refinement and elaboration. The process of reviewing progress indicators and other material to develop the annual performance report helps us reflect on how the progress we’ve made should inform the work ahead.

In addition to the two internal audiences, we believe that the analysis in our annual performance report can provide grantees and other funders—our third target audience—with a better understanding of how we define success in our work. Over time, we believe this understanding can facilitate collaboration toward shared goals. We reconceived the foundation’s public annual report (which is different from the annual report we provide to the board) to integrate performance assessment content so that our grantee partners have easier access to the information.

The general public is a fourth, albeit lower-priority, audience. By including more performance assessment in our public annual report and making it generally available, we seek to contribute to a broader understanding of philanthropy’s role in society beyond the grantmaking transaction.

Lessons Learned

Literature on measurement and evaluation reminds us that consciously building in opportunities for learning will help us use our results in actionable ways. At the Irvine Foundation, that wisdom has helped to guide our approach to performance assessment across the foundation. From the beginning, we knew we were experimenting, building on the work of pioneers in foundation-wide assessment such as the Robert Wood Johnson Foundation. We also knew we did not have the answers, and now, after six years of creating and evolving our framework, we know that we still have more to learn. At the same time, we are equally persuaded that our commitment to foundation-wide performance assessment has made us a better foundation—one in which a culture of reflection, learning, and refinement has become even more pronounced, and in which we actively find ways to use what we learn from performance assessments to improve our strategic thinking and, by extension, to deepen our impact.
Recognizing that our assessment framework remains a work in progress, we offer the following three lessons in the hope that they may inform others who are interested in foundation-wide performance assessment.

Lesson 1: The traditional structure of philanthropic activity can conflict with a commitment to performance assessment, so it is important to address related barriers and incentives.

Incentives within foundations are typically organized around the core activity of awarding grants. The IRS mandate to pay out grant dollars annually helps to drive the requisite work of grantmaking—proposal review, site visits, docket preparation—all leading up to board decisions about grant awards at designated intervals throughout the year. When one cycle concludes, another begins—or more often, cycles overlap and docket deadlines loom constantly for foundation staff. We have concluded that this structure tends to discourage careful reflection, thoughtful analysis, and distillation of lessons learned. Too often, these activities are considered luxuries at best and distractions at worst.

For a foundation to fully embrace a commitment to institution-wide performance assessment, careful consideration must be given to addressing this structural barrier, which tends to focus us primarily on the next deadline at the expense of what we are learning and how we are using that learning. We have also discovered a related temporal challenge. Because of regulations related to payout, grant budgets are often organized around an annual calendar. Our program goals and aspirations rarely follow annual timelines—nor should they, if they are sufficiently ambitious in scope. So how do we reconcile these different timelines?

At the Irvine Foundation, the discipline of producing an annual performance report to our board (and then sharing it publicly) has oriented us toward a sharper focus on reporting progress, not necessarily final results. To do so, we have had to orient ourselves toward identifying shorter- and medium-term indicators and measures of progress that we can track and report on in annual increments.

The need to clarify these indicators and measures of progress has been a valuable contribution to our strategy development and refinement as well, because it has forced us to articulate more clearly both the logic and sequence of outcomes we seek. We believe that any foundation committed to assessing its performance must determine how best to keep the grantmaking work moving forward while creating the space for consideration of broader progress assessment.

Lesson 2: Foundation performance assessment both requires and fosters a culture of reflection and learning that can lead to ongoing refinement and program improvement.

Much of today’s business literature speaks to the importance of building adaptive organizations that stay closely attuned to their external environments and retool strategies in ways that align with that ever-shifting context. Although foundations do not necessarily think in terms of “competitive advantage” or worry about going out of business, foundations do have an obligation to remain attentive to the context of their work.

Similarly, we need to find ways to create feedback loops that permit us to use what we learn to improve our strategies. A promising development over the past decade has been a more intentional focus on evaluation for learning and improvement rather than simply for auditing purposes or declaring success or failure. Although we certainly need to use evaluation to guide our understanding of whether we are succeeding or failing, we stand to benefit significantly if we can use what we learn to improve our strategies and their execution.

In this respect, performance assessment is inextricably linked to program strategy, and it has been our experience that a focus on assessment has improved the rigor and logic of our program strategy. As previously noted, a key contribution of this process has been defining progress indicators that allow us to determine whether we are making the kind of progress we seek in the short term, and if not, to understand what that might imply about our strategies. We have discovered that even the process of identifying these indicators can help us to surface possible instances of underdeveloped strategic thinking or unrealistic expected outcomes.

A commitment to foundation performance assessment can force us both to consider context and to build in opportunities for learning. From the beginning, our Performance Assessment Framework contained a section on what we call “program context.” It was designed to encourage us to collect data about the contextual issues related to our programs. We also annually collect data on other philanthropic investments in the particular areas we fund. Examining both of these contextual data sets helps us better understand the foundation’s contribution and track other philanthropy and the broader trends in our focus areas. Notably, this contextual data has been one of the areas our board has found most interesting. It has helped board members position our work in a broader context and has encouraged us to find other ways to expose the board to the environment and context for the foundation’s efforts.

Understanding the context and reflecting upon our learning, however, can be useful only if it can then be translated into action. This is where we have tried to demonstrate our commitment to adaptability without falling prey to the tendency of foundations to embrace the issue of the day. For example, in our California democracy program we have remained focused on the goal of advancing more representative public policy decision-making. As we have approached this priority from different angles, such as voter mobilization and civic engagement, the program team has analyzed the products of its grantmaking and adjusted its strategy as appropriate.

Lesson 3: Successful foundation performance assessment requires both buy-in and engagement from stakeholders at all levels.

The old adage that “leadership starts at the top” holds true for foundation performance assessment. Without demonstrated commitment and visible leadership from the board, the CEO, and other senior officers, foundation performance assessment simply cannot succeed. We put it this starkly because the effort required for foundation performance assessment mandates full institutional commitment and cannot be the province of just the evaluation director.

Obtaining the buy-in of the board from the very beginning was essential. We created a board task force to help us design our approach to foundation performance assessment, and from the outset we viewed the board as the lead initial stakeholder. We did so because we believed it would advance board members’ understanding of the foundation’s progress and impact—an important outcome by itself—but we also knew it would signal both internally and externally just how important this work was to the foundation.

For the process to take hold, it also requires broad-based buy-in from the staff. We were perhaps well served by the fact that our foray into foundation performance assessment coincided with the adoption of a new strategic plan and the staff changes that often accompany a shift in direction. The foundation’s staff embraced this approach from the outset, and it became part of the way we did business. Nonetheless, we recognize the need to ensure that staff view performance assessment not as an additional burden, but rather as a tool to inform better decisions.

Conclusion

This article explains how we have structured and evolved our foundation’s performance assessment. We are eager to share our approach, not just because we are persuaded that adopting an institutional view of performance assessment can improve a foundation’s work, but also because we want to create a broader community of foundations committed to learning from each other about this important dimension of our work. Fortunately, we have moved beyond the question of whether measurement and evaluation are useful, but there remains much to be explored about how the tools of measurement and evaluation can be applied across a foundation’s work. That’s where we have been experimenting for the past six years and where we know we still have a great deal to learn.

Among the questions that we have yet to explore is how an approach to foundation performance assessment can work in organizations of different sizes and scopes. Although the Irvine Foundation is a relatively large foundation, we are also regional in scope and focus on only a few program areas. How this approach can be applied in a foundation with a broader mandate and more diverse portfolio of grantmaking remains an open question.

We hope this contribution to the body of knowledge on performance assessment can support robust exploration of questions such as these. In the end, our institutions exist to provide a public benefit, and we must therefore embrace opportunities that might enable us to deliver on that promise more effectively. We believe that a commitment to foundation performance assessment offers one such opportunity, and we know there is more experimenting to be done, more learning to be accumulated, and many stories to be shared.