Alice came to a fork in the road. “Which road do I take?” she asked. “That depends a good deal on where you want to go,” responded the Cheshire Cat. “I don’t know,” Alice answered. “Then,” said the Cheshire Cat, “It doesn’t matter which way you go.” –From “Alice in Wonderland”

In 1998, after finishing a two-year stint as a business analyst for McKinsey & Company, I was hired by Hillel: The Foundation for Jewish Campus Life, to help the organization identify where it was going and how well it was getting there. At McKinsey, I had undertaken several strategic growth projects with local nonprofits, but I was frustrated by the lack of tools for evaluating nonprofit success. My passion was to find a way to apply the analytic tools I had learned at McKinsey to help Hillel establish realistic improvement goals. I had no idea just how difficult that would be.

Hillel, a $50 million Washington, D.C.-based organization, has over 500 affiliates, known as “Hillel houses,” on campuses across North America. Founded in 1923, Hillel strives to maximize Jewish student connections to Jewish life during the university years.

By 1998, officials at Hillel headquarters were struggling to assess the effectiveness of the organization’s 105 largest Hillel “franchises,” relying largely on anecdotal evidence from directors and occasional site visits. Staff in Washington, D.C., for example, assumed the Rutgers University Hillel was up-and-coming – simply because its leadership conveyed that it was riding high. In fact, as we would later learn, the Hillel was not reaching its full potential and was in serious need of organizational change. But objective data was hard to come by.

Hillel hired me to direct the newly formed Strategic Services Group, charged with turning around this situation. In this position, I created a performance assessment tool I call the Peer Yardstick, designed to assess the performance of an organization with geographically dispersed franchises.1 What follows is our four-stage approach.

1. Understand How Performance Measurement Challenges Organizational Culture

I came to Hillel with a consulting firm mentality. The typical consulting project at McKinsey lasted about three months. I reasoned that I could quickly design and implement a quantitative system to help the Hillel franchises improve their performance. I also felt that, as a former student leader at Yale University’s Hillel, and given my new position, I had the authority to begin right away.

Yet after three months, I had made almost no progress. My mistake, in part, was failing to recognize the fundamental clash between my department’s mission of quantitative assessment and Hillel’s qualitative culture. Hillels had never been asked to submit data before, and changing that culture required a new approach.

I knew that I needed to get in touch with Hillel’s organizational culture. My first step was to join a young professional development trip to Israel with 30 peers from Hillel franchises. I then spent six months getting to know the directors, listening to their insights, and socializing with them. I also developed a field-based team composed of senior and mid-level local Hillel professionals who had demonstrated openness toward our new approach. I charged this team with helping generate buy-in to a measurement-based approach among their colleagues. One way that we did this was by having members of the field team present the first measurable results at the annual professional staff conference.

2. Develop an Evaluation Model

The power of the Peer Yardstick is that the same measures are used for all the franchises, permitting comparability. It uses statistical analysis to identify which among dozens of possible organizational factors drive desired outcomes. It enables franchises to set goals based on their peers’ performance.

To begin with, organizations must select readily quantifiable measures for mission and financial strength. At Hillel, we chose cumulative number of participants as the measure of our mission. Since Hillel competes for college students’ free time with countless other options, we reasoned that the cumulative number of Jewish students participating was a good proxy for success. We used fundraising dollars as a measure of financial strength, in part because underfunding was a good predictor of mission failure.

But telling a franchise to improve top-line goals such as participation or fundraising is insufficient to drive operational changes. So we hypothesized dozens of “key success factors” that might impact participation or fundraising. For example, we showed empirically that, all other factors being equal, if a Hillel formed 10 new student activity clubs – groups organized around specific interests like hiking or community service – it could be expected to generate a 16 percent increase in participation. We also emphasized the importance of growing Hillel’s student contact lists. Increasing the size of Hillel contact lists from 25 percent to 75 percent of the total Jews on campus could be expected to increase participation by 80 percent. In all, through empirical analysis, we identified about 20 significant factors.

For fundraising dollars, we hypothesized that the board’s leadership – as indicated by board member contributions – was critical. We found that a $10,000 increase in total board giving could be expected to leverage a 14 percent increase in total Hillel contributions.

We also deflated some widely held assumptions. It was assumed, for example, that participation was driven by university type – commuter, state residential, or private. The Peer Yardstick showed there were no significant differences in participation by campus type.

But to compare franchises, we did have to isolate the statistically significant variables outside management’s control. For example, we adjusted expectations for a Hillel based on the number of Jewish students on campus; fundraising expectations were adjusted according to each university’s alumni giving rate.
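
For readers curious about the mechanics, the sketch below shows one way an analysis of this kind could be run. It is a minimal illustration, not Hillel’s actual model: the variable names and data are invented, and it assumes a log-linear regression so that coefficients on the controllable factors translate into percentage effects, with the Jewish student population and campus type entering as controls.

# Illustrative sketch only: hypothetical variable names, synthetic data,
# and an assumed log-linear specification, not Hillel's actual dataset or model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 105  # roughly the number of large franchises surveyed

df = pd.DataFrame({
    "activity_clubs": rng.integers(2, 40, n),           # factor management controls
    "contact_list_coverage": rng.uniform(0.1, 0.9, n),  # share of Jewish students on the contact list
    "jewish_students": rng.integers(500, 6000, n),      # outside management's control
    "campus_type": rng.choice(["commuter", "state", "private"], n),
})
# Synthetic outcome: participation scales with campus size and grows with
# clubs and contact-list coverage, plus noise.
df["participation"] = (
    0.08 * df["jewish_students"]
    * np.exp(0.015 * df["activity_clubs"] + 1.2 * df["contact_list_coverage"])
    * rng.lognormal(0.0, 0.25, n)
)

# Log-linear model: coefficients on controllable factors translate into
# percentage effects, while Jewish population and campus type act as controls.
model = smf.ols(
    "np.log(participation) ~ activity_clubs + contact_list_coverage"
    " + np.log(jewish_students) + C(campus_type)",
    data=df,
).fit()

print(f"10 new clubs -> {np.expm1(10 * model.params['activity_clubs']):.0%} expected participation lift")
print(f"Contact list from 25% to 75% of campus -> {np.expm1(0.5 * model.params['contact_list_coverage']):.0%} lift")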

3. Create a Self-Assessment Survey That Generates Trust

To collect the data necessary to perform this empirical analysis, one needs a good survey. It is important to have local directors fill out the survey, empowering them to own the process and diminishing the likelihood of data being contested.

But how do you get 105 individual franchise directors to fill out a survey about themselves, and to do it accurately – especially when many feared that if the “truth” about their franchise’s underperformance came out, they might be fired? One key is to generate an atmosphere of trust.

We found that using self-reflexive language when referring to the survey helped set the right tone. Instead of calling it a “performance measurement survey,” we called it a “self-assessment survey.” When headquarters staff communicated with franchises – on the phone, at conferences, or through written communications – we always spoke of “helping your Hillel reach its potential” as opposed to “revamping underperforming Hillels.”

There were also incentives to complete the survey. Some were “sticks” – eligibility for headquarters grants and accreditation as a Hillel were tied to completing it. Yet such “sticks” were used sparingly because they bred resentment. More important were the “carrots” – money, mentoring, and consulting services for Hillels willing to undertake the strategic change process.

We monitored surveys carefully to ensure quality data. Some directors were simply estimating answers to quantitative questions – for instance, questions about fundraising dollars – off the top of their heads. To ensure quality control, headquarters began spot-checking figures, flagging unusual data points, and contacting directors to discuss how they reached their figures. Subsequent surveys included more detailed instructions on how to arrive at accurate numbers, and we distributed new fundraising software.
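
That spot-checking step can be partly automated. The sketch below is illustrative only, with invented field names and figures (the article does not describe the survey’s actual structure): it flags any reported value far out of line with the franchise’s peer group so headquarters staff know which directors to call.

# Hypothetical sketch of the spot-check; field names and figures are made up.
import pandas as pd

def flag_unusual(surveys: pd.DataFrame, field: str, ratio: float = 3.0) -> pd.DataFrame:
    """Return rows whose value for `field` is more than `ratio` times the
    peer-group median, or less than 1/`ratio` of it."""
    peer_median = surveys.groupby("peer_group")[field].transform("median")
    suspicious = (surveys[field] > ratio * peer_median) | (surveys[field] < peer_median / ratio)
    return surveys.assign(peer_median=peer_median).loc[suspicious]

surveys = pd.DataFrame({
    "franchise": ["A", "B", "C", "D", "E"],
    "peer_group": ["large"] * 5,
    "fundraising_dollars": [310_000, 295_000, 330_000, 2_900_000, 305_000],
})
print(flag_unusual(surveys, "fundraising_dollars"))  # franchise D gets flagged for follow-up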

In year one, only 50 percent of the Hillels returned surveys, and many questions went unanswered. By year four, our most recent year, 95 percent of Hillels filled out the survey, and few were incomplete.

4. Analyze and Disseminate Results, and Target Underperformers

Next, we analyzed the surveys and designed reports that showed how each Hillel compared with others in its peer group. Groupings were based on variables that maximize similarity, such as the number of Jews on campus or the university’s academic strength. Peer grouping made directors and boards feel comfortable with the results.
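
As a rough sketch of the grouping step, again with assumed field names rather than Hillel’s actual data, franchises can be binned into peer groups of similar campus size and each one compared with its group’s median on a given measure.

# Minimal sketch of peer grouping and comparison; field names are assumptions.
import pandas as pd

def peer_report(franchises: pd.DataFrame, measure: str, n_groups: int = 4) -> pd.DataFrame:
    """Bin franchises by campus size, then show each one against its peer-group median."""
    out = franchises.copy()
    # Quantile bins on the number of Jewish students keep peer groups roughly equal in size.
    out["peer_group"] = pd.qcut(out["jewish_students"], q=n_groups, labels=False)
    out["peer_median"] = out.groupby("peer_group")[measure].transform("median")
    out["pct_of_peer_median"] = out[measure] / out["peer_median"]
    return out.sort_values(["peer_group", "pct_of_peer_median"])

The same approach extends to additional grouping variables, such as academic strength.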

We disseminated PowerPoint summaries of the fieldwide results at all of our conferences and training sessions for staff, boards, and students. We followed up by offering to create a customized report for each Hillel. We encouraged Hillels to initiate contact with our group. When local Hillel directors or board members called, we spoke to them about using the data to initiate strategic change.

The top-performing Hillels requested the data as a marketing tool to show their boards and major donors just how strong their performance was. For some average performers, the reports themselves became the impetus for change. Meanwhile, we proactively monitored the poor performers, seizing on opportunities to begin the change process.

Here’s How It Works …

In 1999, the Hillel at Rutgers University had 540 total students participating out of a market of 4,500 Jewish students. The board was composed of local New Brunswick, N.J., community members, and it had an annual budget of $335,000. Its major funders were several New Jersey Jewish community foundations. The foundations had never been fully satisfied with the director and board’s depictions of the organization’s success, but without evidence to the contrary, they assumed the Hillel was in good shape. The board and director believed the organization was strong, based on improvements over prior years’ performance.

But compared with other Hillels at flagship state universities with similar numbers of Jewish students, Rutgers Hillel was performing well below its potential – one of the lowest in its peer group. Our 1999 self-assessment survey showed that Rutgers lagged on all the major performance measures – participation, number of activity clubs, income, board giving, and staffing.

Headquarters’ opportunity for involvement came when a salary dispute between the director and his board led both sides to seek my help. I quickly shifted the focus from the salary to the organization’s potential. Our strategy was to gather the major stakeholders – in this case, the community foundations – and present our findings. In January 2000, we laid out specific goals in each area, to be reached over several years – goals that were reasonable given peer performance. We targeted a 200 percent increase in participation, to be achieved through the creation of more clubs, more student empowerment, and more staff. We then explained that to meet these mission goals, the Hillel needed to increase its budget to nearly the peer average of $500,000. To do so, we needed a new board, one that could give at least the peer average of $33,000 annually – and a roughly 60 percent increase in the contributions of the foundations, from $69,000 to the peer average of $110,000. These recommendations were accepted because we had evidence in hand clearly showing the expected return on investment.
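
The arithmetic behind those targets is simple enough to write down. The sketch below merely restates the figures quoted above; the one assumption is that a 200 percent increase means tripling the 1999 baseline.

# Back-of-the-envelope goal setting for Rutgers, using the figures quoted above.
baseline_participation = 540  # 1999 participation
peer_average = {"budget": 500_000, "board_giving": 33_000, "foundation_giving": 110_000}

targets = {
    "participation": baseline_participation * 3,            # a 200 percent increase: 1,620 students
    "budget": peer_average["budget"],                        # up from $335,000
    "board_giving": peer_average["board_giving"],
    "foundation_giving": peer_average["foundation_giving"],  # up from $69,000
}
print(targets)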

We moved quickly to implementation. We reconstituted the board, asking many members to resign because they lacked the resources or influence to lead Rutgers to its potential. This was a very painful but necessary step. The Jewish community foundations filled the board seats from their own ranks, creating a powerful new board that could lead through financial contribution and vision.

One important result of this was that the new board and director now had a common set of measurable goals established by the Peer Yardstick to which they were both accountable. When the director chose to leave shortly thereafter, the new incoming director, Andrew Getraer, had a clear road map to guide him. Headquarters also paid for Rutgers Hillel to be guided by the director of a top-performing Hillel at the University of Maryland.

By the 2002-2003 academic year, Rutgers Hillel’s income had grown to $460,000, with board members contributing $40,000 and the community foundations contributing $132,000 – both at or above the goals we had set in 2000. The Hillel doubled its program staff. More important, participation has grown from 540 to 1,700 this year, putting Rutgers Hillel at the peer average. It now operates 30 clubs, including a newly formed Jewish meditation group and an alternative spring break community service group. The mailing list for contacting Jewish students has nearly doubled.

“It can be daunting in areas where we are underperforming,” said Getraer, “but I know we can get there because my peers have done it, and I try to learn from their experience.”

Stakeholder Benefits

By using data from a broad cross-section of franchises and establishing peer benchmarks, the Peer Yardstick framework generated benefits for local directors, headquarters, board members, donors, staff, and volunteers. Directors used the data to set priorities and hold staff and volunteers more accountable. (Getraer, for instance, now reviews the participation numbers with his staff every week.)

Donors and board members, meanwhile, can use the yardstick to measure the return on their investment, ensure accountability, and drive change from an informed perspective. Recently, for example, Rutgers Hillel approached a potential major donor with the Peer Yardstick data, showing how much it had grown and outlining its future goals. The donor was so impressed that she initiated the next call to Getraer to propose a substantial naming gift for the new facility Rutgers Hillel plans to build.

Finally, the Peer Yardstick gives headquarters a new capability to initiate change at struggling Hillels – and to attack problems aggressively. Headquarters can immediately show key stakeholders a struggling Hillel’s potential and the steps necessary to meet it.

In 2000, the Strategic Services Group at Hillel headquarters engaged in major change processes at 10 struggling Hillel franchises using the Peer Yardstick. Over the past three years, these franchises have grown their participation by 142 percent and increased their incomes by 50 percent on average, despite the difficult economy.

As Richard Joel, former Hillel president and international director, put it, the Peer Yardstick approach “helped Hillel create a culture of ‘what is’ and ‘what can be.’”

1 My former colleagues at Hillel, especially Rob Goldberg and Jay Rubin, as well as Professor Elizabeth Keating of Harvard University’s Kennedy School and Professor Karl Schmedders of Northwestern University’s Kellogg School of Management, helped influence my thinking in designing the Peer Yardstick.


Sacha Litman is principal consultant at Measuring Success, a firm that develops quantitative tools to help nonprofits improve their effectiveness. His current clients include the United Jewish Communities, which serves Jewish organizations across North America. He can be reached at sacha.litman@measuringsuccess.com.
