People holding up their hands toward a futuristic-looking disco ball, surrounded by illustrations of technology in health, justice, social media, education, and design. (Illustration by Vreni Stollberger)

Something peculiar happened three years ago in Ramsey County, Minnesota. In a little-told tale of two grants, a progressive district attorney’s office had embarked on two externally funded and functionally separate interventions into the school-to-prison pipeline: deep community engagement (funded by a local family foundation) on the one hand, and the development of a data-driven predictive model to flag at-risk kids (funded by a national crime-and-data-focused nonprofit) on the other. At a glance, this could have been any community anywhere, creatively exploring solutions to its most pernicious problems. But what unfolded was in fact a microcosm, uniquely glimpsed, of the way emerging technology holds out promise on intractable issues while also carrying the capacity to harm both current and future publics—and of the way our communities offer us an alternative path forward.

When parents and community groups learned of the plan to use machine learning to predict their children’s future, the deeply engaged community in the Twin Cities first resisted, then refused, the data-sharing agreement that would have enabled the predictive model. Specifically, we at the Twin Cities Innovation Alliance, in coalition with community partners, rallied quickly to research the proposed technology intervention, learning that such risk-assessment tools were scientifically dubious and corrosive to civil rights. We created a data primer and hosted “algorithmic improv” sessions to demystify the relevant tech jargon. We experimented with community benefits agreements (borrowed from the commercial real estate development world) and algorithmic impact assessments, and we ultimately raised a clear and succinct alarm about the county’s plan, including recommendations for alternative interventions.

Putting the Public Interest in Front of Technology
This series, sponsored by the Ford Foundation, explores the pioneering new field of public interest technology and highlights the imperative to create and distribute technology that works for all.

Local officials, initially convinced that the community just didn’t understand the proposed technology, eventually came to see the wisdom of our community’s position: Decades of structurally biased data would feed into the development of an inevitably biased model that would claim to predict future delinquency, unfairly labeling children as risks to their community. Following the dissolution of the data-sharing agreement, the community coalition engaged system leaders in a restorative process to address harm, rebuild trust, and implement a more inclusive approach. This work culminated in the launch of the Transforming Systems Together community advisory committee, designed to include community members in county decision-making processes as equal voices.

In the wake of this experience, the Twin Cities Innovation Alliance and collaborators have become powerful messengers from an under-resourced reality—one in which communities define the vision for their future and design solutions to move toward it.

* * *

This story and many more like it, playing out in communities around the world, hint at a vast and largely untapped store of potential energy for reimagining our technological futures—a techtonic shift on the horizon. Where diverse groups of people are engaging with the interlocking challenges they face, power is being generated to resist, refuse, redirect, and revitalize how we collectively define “progress” and “innovation.” Seen in this light, a tradition of harmful technology development and deployment that has sped ahead of people power illuminates opportunities: to develop deeper relationships, to better understand our commitments and social contracts to each other, and to hold each other in community.

Rather than focusing on particular tools that can be described as being in the “public interest,” the most crucial conversations about public interest technology get to what’s at stake in shifting systems: whose values are centered, whose needs are considered, and what is likely to occur as a result of deploying new tools. Today, amidst an explosion of technology innovation, the stakes have never been higher: Our biosphere, our democracy, and the health and well-being of our individual bodies and collective communities are all directly impacted by technological change. Importantly, the network of actors and approaches coming together around public interest technology view these stakes as interconnected and interdependent.

As the closing piece in a series meant to memorialize an emergent moment for the public interest technology ecosystem, this article will reflect on the publics that are the central focus for the field and provide recommendations for funders and others interested in supporting public power to determine technology’s future.

Technology Can Never Be Neutral

While it is common to think of computational code as binary 1s and 0s, and data as purely mathematical, these tools instead represent a rich contextual fabric that loses resolution the further you get from what is being measured. People’s lived experiences, relationships, and communities are abstracted by these measurements, and without being situated in a framework of values (beliefs about the way the world works and should work), the resulting code and data don’t mean much.

Values are embedded in technical tools across the lifecycle of technology development, dictating (often invisibly) how technology works and who it works for. When a computer model widely used by health insurers to identify high-risk patients was found to be racially biased, external auditors were needed to clarify that cost was an inappropriate proxy for health. When Michigan replaced much of its unemployment insurance fraud detection staff with an algorithm that turned out to have a 97 percent false positive rate, leading to more than 40,000 false accusations against residents, the responsible vendor claimed its model was working exactly as intended. Without explicit attention paid to prioritizing a public interest values framework, business priorities like speed, scale, and profit—priorities that can actively trade off against public interest values—will too often be the default.

Even where positive values are made explicit, harms can still arise from technology in spite of its developers’ best intentions. This is partly because values are themselves contextual, expressed differently across communities and socioeconomic strata. Consider, for example, an online communications effort to advocate for a marginalized community that instead exposes that community to harassment when sensitive data is inadvertently leaked; or a platform designed for social connection that over time becomes an engine for misinformation.

Technology systems interact with social systems in complex ways, and as more and more new systems are layered together across sectors, complexity compounds, leading to downstream harms and unintended consequences that can be difficult to predict, inscrutable to assess, and nearly impossible to unwind. This is why professionals across disciplines must be equipped to understand complex sociotechnical dynamics and must be prepared to meet the downstream effects of technology with resilient systems of redress.

The emerging field of public interest tech draws expertise from a multiplicity of disciplines and backgrounds—including those with specific technology skills but also artists, activists, scholars, advocates, community leaders, journalists, and lawyers. Often, those with the most expertise to contribute to understanding the complexity of the issues facing our social systems are those with the most experience of marginalization and harm from technology change.

Rather than a shared focus on particular tools or even technology itself, the through-line for public interest tech is instead a values framework that centers the people most impacted by technology in determining its future. When we focus on technology and other STEM approaches too narrowly, we risk replicating decades of techno-solutionist failures. Instead, the public interest approach invites working together across disciplines to approach complex challenges with humility, historical understanding, and curiosity about the tools that can help us move forward.

The public interest tech ecosystem encompasses many sub-fields and movements, from civic tech to AI ethics to critical internet inquiry and beyond. Many stakeholders in public interest tech focus primarily on developing and nurturing these “technologists,” not technologies, and on the values, conditions, and capacities that need to be in place in our organizations and systems to ensure any innovation does not harm or leave our communities behind.

Public Interest Tech: Values, Principles, and Approaches

“Public interest” technology, “responsible” technology, “ethical” technology—these are not labels that can be applied to certain tools. To be sure, labels can be useful, for example in identifying tools that are accessible, interpretable, rights-based, community-led—these potential labels all imply meaningful departures from “tech as usual.” But rather than labels for tools, “public interest” and parallel frames on technology refer to overlapping efforts to define the values, trade-offs, and sociotechnical considerations that must be central to any effort to design or deploy technology that seeks to avoid doing harm.

Instead of referring to particular technologies or particular sectors, “public interest tech” implies a set of approaches, a diversity of contributors, and a supportive ecosystem of talent and infrastructure that includes planning for any tool to fail. What gets talked about as “unintended consequences” by tech developers are often actually seen as predictable harms by those who are negatively impacted. The fact that this phenomenon is so prevalent invites us to never deploy new technology without also providing resilient systems of governance, redress, and intervention.


The field is diverse, but there is broad agreement on a few things. Technology is never neutral: we ignore its political dimensions and power dynamics at peril to people, to the planet, and to the long-term potential for both profit and “progress,” however you define it. Technology is always dual use, and real harms result when technology interventions are narrowly developed and/or hastily deployed.

Technology is also not inevitable. The framing of “public interest” invites us to imagine beyond a popularly portrayed future of leaving this planet a dried-out husk as we colonize other rocks in outer space. We are thus invited to take a long lens on the future, but also to resist “longtermism,” a techno-fantastical and fatalistic vision of the future that assumes that human subjugation of nature on a planetary and cosmological scale is manifest destiny, thus abdicating responsibility for those alive today. Through public interest tech we are instead invited to ask, what is in the interest of current and future publics, without relying on the privileged possibility of planetary escape?

But while we have some clarity about what is not in the public interest, including narratives about the inevitability of particular tech futures, defining what is in the public interest is the ongoing work of the field, always in partnership with the communities most impacted by technological change. Even absent a formalized field of “public interest technology,” communities have been leading this charge. Specific sectors, such as civil society, play important roles in checking and balancing institutional and other forms of power, while collaborators from across sectors can and should work together to clarify the values trade-offs between different approaches. But communities must lead, in providing the vision for what health, justice, and progress should truly mean in a future where our ability to survive on this planet is at stake.

Implications for the Funder Community

Data and technology projects have the power to shape—or warp—reality, much as science-fiction writers not only help us envision the future but also bend what the world comes to look like. Funders hold significant power as well: directly, in terms of what gets funded, and more subtly, as narrative-shapers and vision-builders. Funders must be clear-eyed about their responsibility in defining the drivers of how to invest in civil society and in civic space.

At a minimum, we funders should be thoughtful about how our power shows up, transparent about how it functions, and committed to redistributing it more equitably. This can manifest as empowering our grantees within grant relationships, setting up mechanisms for impacted communities to guide grantmaking, and ensuring we fund only technology projects that rely on the participation and informed consent of those impacted.

This period of rapid technological change is akin to a living laboratory of approaches, rich with learning and possibilities for the future. But where scientific laboratories have evolved standards, norms, ethics, and codes of conduct, our sociotechnical ecosystem—our public—is significantly less protected. Laboratories imply the informed consent of those being experimented upon. Instead, today we see a vast and networked system of surveillance proceeding without consent and without clarity about what we may be trading off in a hunt for ever-richer and more comprehensive data. In her timely book Race After Technology, Ruha Benjamin talks about this as the “datafication of injustice, in which the hunt for more and more data is a barrier to acting on what we already know.”

Often, communities already know. Funder guidance with titles like “Why am I always being researched?” points toward an evolution of understanding not only how lived experience is valuable evidence, but also how it can be made meaningful in partnership with communities, rather than extracted from them. Communities like those in the Twin Cities, in Detroit, and in Oakland, where community-built organizations are letting values and community needs dictate technical futures, are demonstrating a critical set of approaches to the funding community, and their work invites resourcing, trust, and runway.

In the funder ecosystem, we’ve seen a push and pull between approaches that seek to mitigate the harms of technology and approaches that seek to enable innovation, from which is emerging a rich space of agreement. Funders are beginning to collaborate to resource a healthy enabling environment for innovation that can be socially positive, to build capacity for strategic decision-making relevant to technology, to invest in impacted communities that must drive collaboration, and to build toward resilient systems of justice and redress. For too long, the future has been imagined and built without those who are most impacted by technological change, and these default futures have been imposed on too many. More collaboration within the funder ecosystem will be required to enable imagining what alternative futures could look like at scale.

Funders need to move on multiple fronts at once, toward long-term structural change that prioritizes the public interest as it relates to technology. Across these fronts, a few lenses are useful to foreground:

  • Communities are where social issues are experienced and where problematic patterns are identified; therefore, communities are the richest source of wisdom for how to move forward. Creating feedback loops to hear from and move with communities most impacted by the issues funders seek to address is essential to positive long-term change. Putting communities first means developing pathways for communities to influence and inform systems—even or especially where interventions might need corrective actions. This can look like the parents and community groups who organized in Minnesota to refuse a biased predictive technology to flag “at-risk” children, reframe the problem, and repair relationships between the community and the county. It can look like community-led efforts to shore up digital security and reduce community surveillance in Harlem, New York. It can look like community-deployed wireless networks like those in Detroit.
  • Systems like education, child welfare, government, and public health often dictate how communities access essential services and opportunities. How these systems operate and evolve is too often dictated by market values and the logic of extractive capitalism. Given the importance of demonstrating ways of organizing our societies for long-term flourishing, there is an urgent need to reprogram our systems with a different values framework. Previous pieces in this series have detailed the changes possible when diverse teams of public interest technologists work with impacted communities to help rethink public systems, and the solutions they help develop are often low- or no-code: Redesigned forms and streamlined processes are sometimes all that’s needed to enable people to provide caring public services.
  • Values like transparency, accountability, and justice are often taken for granted in the language of social impact work, even as they are authentically held by communities. But even in work that transcends individual communities, it is useful to make a values framework explicit. Public interest values imply a meaningful departure from standard practice in most sectors, and need to carry through from language to action. Prioritizing public interest values across our funding and innovation activities is critical to ensuring we don’t continue to let priorities of domination, growth, and speed trade away the space for pursuing partnership, co-liberation, appropriate scale, and moving at the pace of trust. Where communities are impacted by a problem, let those communities define a values framework for making progress.


Techtonic Shift

The choices we make today about what to fund, who to listen to, and what stories to tell about the future matter. Techno-grift abounds, including a proliferation of crypto scams, a full-court press toward artificial general intelligence, and new platforms for surveillance and targeted advertising. At the same time, we have extraordinary opportunities to infuse resources into loving and resilient communities, like those in the Twin Cities example above, that have internalized and express values consistent with the public interest—whether those communities are showing up to resist, refuse, or reimagine the way we use emerging technology. Signals of how we can invest in these current and future publics also abound, some featured in this in-depth series. Working together as a diverse community of funders and across sectors is critical for sensemaking, for forecasting, and for collaboration.

Author adrienne maree brown observes that communities who organize are putting their hands directly on the future. Seen through this lens, movements for climate justice and just technology are different accents or translations of the same language, responding to the prominent narratives and consequences of domination, othering, and extraction that have guided innovation. In particular, dominant political, colonial, and corporate power structures have underpinned centuries of industrial, agricultural, and computational technologies. These concentrations of power are at odds with the tried-and-true hallmarks of equitable progress in every sector and society: collaborative interdependence, collective action, and community care. Rather than replicating the colonial project on a barren rock, public interest technology invites us to be attentive descendants and good ancestors toward a just transition on Earth.

We are invited to position the “public interest” not as a monolith, but a set of core, internalized values through which we can care for our collective well-being and foster a sense of stewardship for our complex and interdependent ecosystem. This is in service of a world where technology exists not as the central site of our focus, but as a generative space of activities through which we pursue and manifest possible futures. A world where tech infrastructure is not something done for or done to communities, but instead represents the bridges we construct together across old and new horizons, as the ground for movements just being imagined by the next generation.


Read more stories by Michelle Shevin, Aasim Shabazz & Marika Pfefferkorn.