COMMENTS
BY john phillips
ON May 11, 2016 08:40 AM
Excellent article that every significant charity ought to take seriously and every aspiring NGO analyst and funder advisor ought to take to heart and make happen.
BY Unmesh Sheth
ON May 11, 2016 01:33 PM
Excellent article, Sara and Kate. As you already know, SoPact (http://sopact.com) is designed around many of the same principles. We take a different approach from products like B-Analytics by letting capital market funds define social, environmental, financial, and governance metrics. More importantly, you can create a hierarchy of programs and metrics, allowing you to handle a subset of grantees or investees under a portfolio at the site or program level, so changes can be made at that level instead of at the individual grantee level. In other words, you can manage variation by defining common or core metrics and then using the hierarchy to handle the variations!
I understand the point about SROI, but I must caution that without quality data, SROI does not provide a good evaluation. In other words, quality data does require measurement! So I am a bit confused by the title of the article!
- Founder, SoPact (http://sopact.com)
BY Veronica Olazabal
ON May 13, 2016 03:02 AM
Thanks, Sara. I like the connections you made across the fields of social impact measurement, evaluation, and analytics. I also saw a key point around “social impact analysts,” which to me is different from a basic data analyst: if I understand your thought piece correctly, this is someone who is expert both in understanding the stats and numbers and in the “social impact context” to which they are applying their art. To me, this combination is difficult to source.
BY Joanne Norris
ON May 13, 2016 09:50 AM
We do need to differentiate between measurement methodology and reporting, and how those reports are analyzed and interpreted. Highlighting the skill sets needed for both, and the need for comparative data, is key. Thanks for drawing on the historical context of accounting; it has great relevance for where we are today in social impact reporting.
BY Sandy Richardson
ON May 13, 2016 11:43 AM
This is an excellent article with many great insights relating to the practical side of measurement. A key requirement for all kinds of measurement is indicator or metric relevance. So many organizations and people seem to end up collecting data that they don’t see as having high informational value. If an indicator isn’t giving us good insights into the thing we seek to measure, it’s hard to see its value (and get excited about it).

I also appreciated your comment on the requirement to reduce the reporting burden. Of course, we should always be looking for better ways to collect data; we want to reduce the administrative burden. This opens up space to do higher-value activities that help transform data into knowledge and action. However, reducing the burden of reporting doesn’t always translate into acceptance of and follow-through on measurement activities. It often comes back to the relevance (and perceived value) of the data we are asking people to capture. In my experience, when people see the relevance and practical use of the data we are asking them to collect, they actually don’t see reporting activities as a burden. When we help them see the connections, people often appreciate having the opportunity to contribute to our knowledge and participate in the utilization of that knowledge for learning, improvement, and the production of high-value stakeholder/client/societal outcomes.
BY Julian King
ON May 14, 2016 10:08 PM
Excellent article - great to discover some like-minded fellow travellers. I once heard evaluation professor Jennifer Greene say “methods are always at the service of substance,” by which I believe she meant we have to think beyond measurement and use our good old Mark One brains to reach well-reasoned judgments - from data, yes, and also from stories, logic, values, and (shock, horror) even intuition. In the article linked below, I argue that explicit evaluative reasoning offers a valid, credible way to make sense of economic measures of social value alongside other factors.
http://aje.sagepub.com/content/early/2016/04/29/1098214016641211.full.pdf?ijkey=vIrr2wzArzTowB4&keytype=finite
BY Robin Wang
ON May 15, 2016 02:46 AM
I read your article on social impact reporting with interest. I wish to add one more point: there should be increased interaction between profit-oriented investors and those driven by social impact. From conversations with practitioners in social impact investing, it is notable that the dialogue is often divided along this line; adjusting social impact reporting would be one of many measures needed to change that.
BY Kate Ruff
ON May 15, 2016 11:08 AM
Thanks for all the comments here and on social media. They have been thought provoking and encouraging.
Some questions necessitate more than a tweet-sized response, so Sara and I are answering here.
Ryan asked ‘Who will pay the analysts?’ Good question. First, it is important to recognize that there are already many people getting paid to make decisions based on the actual and expected achievements of mission-driven organizations. They work at impact investing funds, community foundations, United Ways, professionalized family and corporate foundations, boutique consultancies, watchdogs, certification organizations, etc. What Sara and I hope we have pointed out is that while most (all?) of these folks encounter the “comparison problem,” current thinking advises them to look for “measurement solutions.” We argue that social capital markets will be improved by rethinking the strategies (including software; see Unmesh’s comment above) used to overcome the comparison problem. (That said, we don’t want to suggest that creating better social capital markets won’t take money. Getting to the next frontier requires a critical mass of organizations producing the kinds of reports that analysts need: fine-print details, consistency, and measures chosen from a menu that balances relevance and comparability. This will likely require large-scale, infrastructure-like investment.)
The ‘who’, both now and going forward, will be a mix of different players. Rick Jacobus (http://ssir.org/articles/entry/who_will_pay_for_data) spoke of business models. Sara and I foresee that there will always be a role for grant-funded analysis undertaken by academics and nonprofit research institutes. There will also be some analysis that end-consumers/donors/investors will pay for directly, and some that is bundled with other services. This mix of public and private funding, with a smattering of different models, is typical of analogous fields. Exactly how it shakes out in the field of social impact remains a question.
~Kate and Sara
BY Kevin jones
ON May 15, 2016 11:47 AM
this makes sense
BY John Perovich
ON May 17, 2016 12:14 PM
Great article. I totally agree that there is a lot of complication and nuance in impact measurement and that skilled analysts need to play a leading role in this process. Comparing conventional investments that are driven only by profit is a challenge in itself, and there is already a robust industry built up for that purpose. It makes sense that impact investments are more difficult to assess across geographies, issue areas, and so on as the field becomes more mainstream.
I do think there is still lots of room for improvement in measurement standards, particularly around federally supported social services in the United States (and elsewhere I imagine). For an issue area like homelessness where there are many local service providers competing for the same limited pool of resources, having a centralized database with standardized quality data would enable skilled analysts to identify the best investment opportunities.
Having standardized data enables a better comparison (by skilled analysts) where the important nuances and data-limitations can be considered alongside the investment priorities. It’s a good point in this article that skilled analysts play a key role in interpreting impact reports and data to help allocate resources towards the best investments. Nice work Sara and Kate!
BY G. Benjamin Bingham
ON May 17, 2016 02:57 PM
Moral imagination inspires entrepreneurs to break through barriers in new ways, moral technique based on experience is required to evaluate the outcome. Thanks for your consistent stance on this for over a decade!
BY Jenny Kassan
ON May 21, 2016 05:16 PM
I love this: “The market is best served when each organization can measure its social impact in the way that is most meaningful and insightful to its aim and operations.” As someone who works with social entrepreneurs, I know that many are frustrated by being forced into a one-size-fits-all reporting metric.
BY Kevin jones
ON May 21, 2016 05:18 PM
they are Jenny
BY Devon O Sanford
ON May 25, 2016 07:40 AM
Kate and Sara — excellent piece. I was particularly interested in your analysis of the challenges to standardizing social impact measurement. We’re working to address this problem through our Social Impact Projection certification (http://goodcompanyventures.org/programs/social-impact-projection), which quantifies an early-stage entrepreneur’s external social impact projections and communicates the data to impact investors. We will definitely need to continue thinking about who our “social impact analysts” will be as we move forward on the model.
BY Kevin jones
ON May 25, 2016 07:42 AM
interesting development Devon Sanford
BY Hiroko Kurihara
ON May 27, 2016 07:18 AM
As an entrepreneur new to the impact investment space, I’m interested in offering my investors the opportunity to measure our impact as we grow, and in translating “dividends” into tangible “multipliers of effect”. Identifying the best forms of analysis and anecdotes that communicate (perhaps in real time) how each investment dollar inspires or results in change might be a challenge in and of itself. I’d also like to see every “pro-sumer” be able to identify the impact measures important to them and apply them to every product or process out there in the world by tapping a shared database that consolidates all the analysis. Thank you, Sara (and Kate), for moving measurement forward!
BY Jindra Cekan
ON March 6, 2017 11:14 AM
Interesting, especially your point on “analysts think about social impact measurement holistically—reading across techniques, instead of just designing an evaluation or performance monitoring system for a particular program, or just applying a particular technique such as theory of change, sustainable livelihoods, or social return on investment (SROI)....These frameworks function as translating mechanisms that allow analysts to understand and compare different approaches.”
I have two questions. One, where are the country nationals themselves, assessing the social impact the project is having on their lives? And two, is impact bounded by the project duration? The impact that Valuing Voices evaluates is the sustained and emerging impacts after donor resources end. More here: http://valuingvoices.com/leading-in-challenging-times-sustained-and-emerging-impacts-evaluation-seies-reposted-from-medium-com/
BY Sara Olsen
ON June 15, 2017 12:14 AM
Hi Jindra, Thanks for your comment. To your first question, an emerging framework for skilled impact analysis considers stakeholder engagement to be a cross-cutting consideration from framing to measurement and analysis to communication. To your second, there are certainly impacts that occur well after a given project ends. I infer that you are asking about what is to be measured or accounted for/considered by an analyst. The answer would have to emerge from a consensus about norms in impact analysis which does not yet exist. I hope you will engage in helping to uncover it!