“We’ve done evaluations before, but the results weren’t that useful or interesting.”

“I barely have time to get to all the things on my to-do list, let alone to spend extra time reflecting and learning!”

These are common reactions when evaluators ask nonprofit and foundation staff members to participate in evaluation and organizational learning activities. While evaluators and organizational learning staff may take the importance of their work for granted, poorly designed efforts can result in busy work that does not add much value—in fact, it can contribute to negative perceptions of evaluation and learning. One solution may lie in an idea that has gained currency in other fields: design thinking.

While design thinking is primarily used to describe an approach to tackling social problems, as social sector evaluators (one of us is an evaluation officer at a healthcare foundation, the other an evaluation consultant at a social sector consulting firm), we have found that it can also increase the impact of evaluation and learning work at nonprofits and foundations.


Evaluation and learning efforts, when implemented well, provide rich information and insights that drive strategy and impact. A recent benchmarking study of foundations showed that organizations are increasingly integrating evaluation into the strategy cycle; however, there is considerable room for improvement. Here are three concepts drawn from design thinking that organizations can use to increase the effectiveness and use of evaluation and organizational learning.

1. Becoming User-Centered

An important tenet of design thinking is to put the user at the center of the issue and design solutions that would work for them. When the user is at the center, every aspect of the design process—including inspiration, ideation, and implementation—is built around their needs and experiences.

When implementing evaluations, this translates into gaining a solid understanding of who the end beneficiaries of the evaluation are, and then involving them as much as possible in the evaluation design, data collection, and analysis. The nonprofit YouthTruth, for example, collects and reports systematic perception data about school culture, including the quality of teaching and learning, from students—a voice that traditional educational evaluations often neglect.

The same holds true for organizational learning. At the California HealthCare Foundation, we have found that the most effective approaches to learning require a true understanding of our staff’s learning needs. From there, we design activities that leverage the collective knowledge of the organization. Rather than appointing an outside “expert” to teach others, this approach recognizes the expertise and contributions that staff members—the users—offer.

2. Identifying Latent Needs

A skillful design thinker not only puts the user at the center and identifies their explicit needs, but also uncovers “latent” needs. For example, designers working on improving the functionality of a toothbrush would likely ask questions about the overall user experience, not just about the toothbrush itself; they may observe users’ morning routines so that they can understand how brushing one’s teeth fits in.

The identification of latent needs has implications for the foundation of any evaluation: gathering information. Evaluators are constantly gathering information—and how they gather that information matters. Take, for example, an evaluation of a nonprofit service provider. Observing the people who receive services, and asking them to describe how they interact with the providers, how the services affect their lives, and how the program fits into their day, will yield more insightful data than a straightforward “How would you improve this program?”—even if the latter is the question the evaluator ultimately aims to answer.

3. Implementing Rapid Prototyping

A useful process that aids design thinkers is rapid prototyping—that is, building quick, inexpensive models and simulations of an end product and then testing them with users. The design process is a cascade of small tests and iterations rather than a large bet that could fail big.

For evaluation, this translates into flexible evaluation designs and short feedback loops around data and findings. Developmental Evaluation, an approach that has recently come to the fore in evaluating social innovations, thrives on rapid feedback loops and flexible, iterative design. FSG consultants use this type of evaluation with clients, relying on tools such as verbal debriefs, reviews, and two- to three-page learning memos on a regular basis rather than a single year-end report.

For the California HealthCare Foundation’s most recent organizational learning effort, we created a “grantmaking toolbox” to document new, effective, or innovative grantmaking tactics. Essentially a web-based database, it documents and describes fifty “tools” that the foundation can use in its grantmaking to increase impact (convenings, intermediaries, challenges, etc.), organized under eleven topics that represent common grantmaking challenges (such as learning about the field, supporting grantees, or spreading ideas). The intent of the toolbox is to encourage staff members to try new tactics. For example, it may suggest that a program officer try crowdsourcing as a way to generate new ideas. By listing knowledgeable people within the organization for each tool, it also facilitates internal collaboration and information sharing.

Most of all, the process of creating the toolbox demonstrated the value of rapid prototyping. We used an iterative, collaborative process to create four successive toolbox prototypes; each allowed us to collect detailed input from staff members to inform the design of the next version, fostering engagement that would not have occurred if a single person had been assigned to create the final version on their own. Moreover, it created a space where staff members could collaborate on a joint project and learn from each other.

Summary

Evaluation and learning initiatives play an important role in equipping social sector organizations with tools to solve chronic and complex problems. However, we need to evolve our approaches if we want evaluation to remain relevant and useful. By adopting concepts and practices from the field of design thinking, evaluators and evaluation and learning staff can ensure that their efforts contribute to—rather than detract from—the effectiveness of their organizations.

