Today, data governs almost every aspect of our lives, shaping the opportunities we have, how we perceive reality and understand problems, and even what we believe to be possible. Philanthropy is particularly data-driven, relying on data to inform decision-making, define problems, and measure impact. But what happens when data design and collection methods are flawed, lack context, or contain critical omissions and misdirected questions? With bad data, data-driven strategies can misdiagnose problems and worsen inequities with interventions that don’t reflect what is needed.
Data justice begins by asking who controls the narrative. Who decides what data is collected and for which purpose? Who interprets what it means for a community? Who governs it? In recent years, affected communities, social justice philanthropists, and academics have all begun looking more deeply into the relationship between data and social justice in our increasingly data-driven world. But philanthropy can play a game-changing role in developing practices of data justice that more accurately reflect the lived experience of the communities being studied. Simply incorporating data justice principles into everyday foundation practice—and requiring it of grantees—would be transformative: It would not only revitalize research, strengthen communities, influence policy, and accelerate social change; it would also help address deficiencies in current government data sets.
When Data Is Flawed
Some of the most pioneering work on data justice has been done by Native American communities, who have suffered more than most from problems with bad data. A 2017 analysis of American Indian data challenges—funded by the W.K. Kellogg Foundation and the Morris K. Udall and Stewart L. Udall Foundation—documented how much data on Native American communities is of poor quality: inaccurate, inadequate, inconsistent, irrelevant, and/or inaccessible. The National Congress of American Indians (NCAI) even described Native American communities as “The Asterisk Nation,” because in many government data sets they are represented only by an asterisk denoting sampling errors instead of data points.
Where it concerns Native Americans, data is often not standardized: different government databases identify tribal members in at least seven different ways using different criteria; federal and state statistics often misclassify race and ethnicity; and some data collection methods don’t allow tribes to count tribal citizens living off the reservation. For over a decade, the Department of the Interior’s Bureau of Indian Affairs has struggled to capture the data it needs for a crucial labor force report it is legally required to produce; methodology errors and reporting problems have been so extensive that at times they prevented the report from being published at all. And when the Department of the Interior changed several reporting requirements in 2014 and combined data submitted by tribes with US Census data, it only compounded the problem, making historical comparisons more difficult. Moreover, Native Americans have charged that the Census Bureau significantly undercounts both the American Indian population and key indicators like joblessness.
Community Control
Why should statistics on Native Americans be generated or controlled by federal and state agencies, outside researchers, or other institutions without deep connections to the unique challenges these communities face, the historical contexts of their challenges, and the desires and needs of these communities?
To remedy the problem, American Indian tribes have partnered with universities and research institutions to create more participatory data, research, and evaluation practices. Variously referred to in academic circles as “community-based participatory research,” “equity approaches to research,” or “culturally responsive” research, these models treat communities as research partners and involve them in the process from beginning to end, from defining research questions to collecting and analyzing data. These new models not only produce more accurate data; they also do a better job of identifying actionable solutions and investments that make a difference. Community-based participatory research tends to get more community buy-in, and can help spot systemic problems that might be overlooked using more traditional methods.
For example, when Texas closed the Ysleta del Sur Pueblo tribe’s casino in 2002, outdated census data from 2000 didn’t reflect the economic decline the tribe experienced. So the tribe partnered with the University of Texas at El Paso to conduct a socio-economic survey and to develop an economic development plan from the data they produced. The initiative not only helped create jobs but successfully challenged the rejection of a grant application by the Department of Housing and Urban Development: After tribal members helped design new research questions and data collection methods, they showed that HUD had relied on incorrect and insufficient federal data in its initial evaluation.
In a similar case, when a 2000 Bureau of Indian Affairs report showed South Dakota’s Cheyenne River Sioux had an 88 percent unemployment rate, the Northwest Area Foundation helped fund a Voices research project—staffed mostly by tribal members—to help the tribe collect more accurate socio-economic data. Working with Colorado State University researchers, they demonstrated that because federal data was focused on deficits, it couldn’t recognize an extensive informal traditional arts and crafts sector, nor its development potential. Armed with this data—which showed, for example, that 78 percent of survey respondents participated in these micro-enterprises—the Northwest Area Foundation was able to help the tribe’s community development fund establish a micro-loan program for small arts businesses, as well as launch other targeted economic initiatives.
Principles for Data Justice
Data justice opens the door to greater impact and helps underserved communities build capacity to act on their own behalf, and philanthropy is uniquely positioned to accelerate its adoption.
Some principles and approaches to get started:
1. Take advantage of what already exists. A rich repository of information and experience already shows what works and what doesn’t, but much of it is tucked away in specialized journals, overlooked because projects don’t fit into current philanthropic categories, or concerns communities where few foundations work. Native Americans comprise about 2 percent of the population, for example, but receive only 0.4 percent of grants, and are often regarded as too specialized for particular foundations’ core work. As a result, the leadership and expertise of Native communities on data justice is not as widely influential as it should be.
The excellent “Chicago Beyond” research guidebook shows how Native American research partnership principles and models work in an urban context. It discusses promising data strategies, and how issues like unintended bias, aggregate data, and problematic government administrative data sets can lead to misleading data narratives and even cause harm.
2. Learn by doing. Fund a community-researcher partnership project for learning purposes and document what you learn. What unexpected insights and challenges emerged? What worked well, what didn’t, and why? Communities tend to seek actionable information to address an urgent problem. How did their priorities, goals, and expectations compare with yours? How did funder/nonprofit/community relations evolve?
3. Build expectations into grant guidelines. Philanthropy can exert tremendous leverage by adding community participation requirements to grant guidelines. For example, the American Indian research team at the Administration for Children and Families (ACF) at the Department of Health and Human Services already asks potential contractors and grantees to describe how (not if) they will involve the community in research and evaluation. They ask for specifics on hiring and other support, including what kind of training in data collection and evaluation the contractor or grantee will provide to the community to ensure it succeeds as a research partner.
For communities that have had bad research experiences or are suspicious of funders’ motives, it is particularly important to account for an initial phase of trust and relationship building in timelines and budgets. An ACF-sponsored working group of mostly tribal members has also produced an evaluation “roadmap” that highlights more collaborative practices—methods that would also apply to non-Native communities.
4. Experiment with other data collection methods. Philanthropic organizations like the National Committee for Responsive Philanthropy and large research firms like Mathematica Policy Research have found that emphasizing more qualitative data can be more accurate in conveying lived experience and context (and can be used alongside quantitative data). And by using rapid-cycle analytics, for example, researchers can assess information and shift priorities in real time, rather than wait years for results before they can do anything. There are many options to explore, and different methods (or mixed methods) can encourage community buy-in and yield better data.
5. Emphasize transparency and two-way communication. Data justice is built from trusting relationships among researchers, communities, and funders who all communicate effectively across cultural divides. Transparency is key to trust. Researchers and communities need to agree on research protocols, establish realistic expectations, and speak frankly about risks, benefits, privacy concerns, and data governance up front. The Department of the Interior could have avoided many of its labor report problems if it had engaged more with Native communities early on about research design and data collection methods, and had provided tribes with better guidance, training, and resources to collect data.
The NCAI proposed these measures among the remedies it suggested to the Department of the Interior. The Cheyenne River Sioux’s Voices project is an example of best practices: The project held community meetings to explain how the research process would work and to solicit input on designing data collection methods. To reinforce trust and a sense of ownership, it shared research results with tribal members first, before distributing them elsewhere, and conducted 45 presentations for more than 500 community members in three months. It also partnered with a local newspaper to publish a “Data Matter” column highlighting research results, in order to stimulate community interest and discussion of the findings.
6. Revisit notions of impact. Treating communities as research partners rather than subjects not only helps the communities themselves learn research and evaluation skills—and a better appreciation for how data can help them—but increases their capacity to use data to identify and address other problems and should be seen as an important measure of success in its own right.
The Ysleta del Sur Pueblo’s experience conducting surveys, for example, would later help them challenge the racial blood quantum criteria Congress had imposed on the tribe, which required individuals to prove they had at least one-eighth Ysleta del Sur Pueblo “blood” to claim tribal membership. Because such a requirement would cause the tribe’s population (also known as the Tigua) to decrease with successive generations, they had been petitioning Congress for decades, without success. By 2011, 66 percent of tribal members’ children did not qualify for tribal citizenship (even those who lived on the reservation, spoke the Tigua language, and practiced Tigua culture). But surveys the Ysleta del Sur Pueblo generated of tribal members’ views on citizenship bolstered their advocacy efforts and contributed to the law’s change in 2012. Finally, they were able to decide for themselves who could become a tribal member.
Small Community, Big Impact
These models show that where results are concerned, it matters less how big a population is than how the data process works. Even a small rural community with a unique cultural heritage can produce outsized impacts and contribute diverse perspectives and novel approaches. For example, the long research partnership between Arizona’s White Mountain Apache and Johns Hopkins University has produced so many valuable medical discoveries that the tribe has received awards from both the World Health Organization and UNICEF and is credited with saving 50 million lives. That’s impact.
Data justice is waiting for philanthropy to act.
Read more stories by Louise Lief.