Technologists with tablet, VR goggles, and laptop, looking up at thought bubbles with scenes of tech at work in communities. (Illustration by Vreni Stollberger)

Consider the video camera outside your window. Does it give you a sense of safety or of being watched? The wearable tracker on your wrist—will it help you instill better health habits or sell your private information to insurers and ad-tech companies? Will your child’s online schooling help them connect with teachers and friends or expose you and your household to surveillance?

Virtually every new technology tied into the massive, interconnected web of data and machine power undergirding the global internet has the potential for both social benefit and social harm. And communities that have been overpoliced and surveilled are more likely than others to experience the negative capabilities of new technology. As Simone Browne demonstrated in her book Dark Matters, contemporary tech-enabled surveillance practices are an extension of the long history of policing Black life in the United States from slavery onward.

Putting the Public Interest in Front of Technology
This series, sponsored by the Ford Foundation, explores the pioneering new field of public interest technology and highlights the imperative to create and distribute technology that works for all.

Yet even as vital critiques like Browne’s emerge, the trajectory of technological investment and development continues to produce powerful engines of prediction, decision-making, and tracking that governments and companies seamlessly apply in a social context. These become solutions in search of a problem, often applied to complex dilemmas surrounding social issues such as policing, criminal justice, health care, education, immigration, and social services.

Unfortunately, it’s becoming ever clearer that artificial intelligence and automated systems can create, reinforce, and deepen social injustice. This is because the predictions that machines make are not objective. Humans imagine, build, train, and deploy computational systems using flawed datasets that embed the historical bias of our social and political institutions. And the development of large computational models and infrastructure resources needed to sustain these systems creates an interconnected foundation of powerful, centralized surveillance systems with the potential to be exploited and abused.

The Problem With Tech Solutionism

It is easy to find instances of journalism, academic research, and industry forces promoting claims that computer vision or machine learning can solve problems by predicting a person’s sexual orientation or criminality, preventing petty theft, nudging us away from polarization in social media discourse, or assisting teachers in the classroom. These claims rest on an assumption that technology can correct for human error, pushing systems toward a standard of computational objectivity. Once a technology is introduced, framed, and sold with the purported purpose of correcting a set of socioeconomic problems, the suggested solution to address embedded bias is usually to improve these technical systems, to nudge, to “de-bias.”

Yet evidence is growing that the harms that accompany the deployment of new technologies in context are not simply the result of bias, but betray a fundamental mismatch between complex social issues and tech solutionism. Our fractured and deeply unequal society is facing compound, unprecedented challenges presented by climate change, rising authoritarianism, and ever-widening inequality. What if new technologies sold to solve social problems are simply ill-suited to the complexity of the environments where they are introduced? What if, sometimes, the answer isn’t technology?

From "smart cities" to "smart borders," the violence that underpins local forms of social displacement and global regimes of boundary-building is encoded in new surveillance technologies. These high-tech tracking systems appear more humane than police sweeps of unhoused encampments or whip-wielding border agents chasing migrants. But tech-mediated state violence reinforces existing hierarchies and presents new dangers due to the scale and scope of data weaponized against the most vulnerable.

The first step in countering the gospel of tech solutionism is simply to question the presumption that new or revised technologies are the solution to any given social problem. Communities impacted by technology must be able to resist and refuse its incursions if and as they experience harm. If refusal is not an option, then we are still trapped in a vision of the future created by a small sliver of humanity: powerful investors, industry leaders, elite technologists, and special interests.

We need to also consider that the technology that might be working just fine for some of us (now) could harm or exclude others—and that, even when the stakes seem trivial, a visionary ethos requires looking down the road to where things might be headed. Those who have been excluded, harmed, exposed, and oppressed by technology understand better than anyone how things could go wrong. Correcting the path we’re on means listening to those voices.

To change course, we also need to rethink assumptions about who might imagine, design, build, and oversee the technologies that are shaping our future. Other essays in this series have covered the need for a focus on technologists, researchers, leaders, and advocates who can advance public interest technology. In this article, we argue further that the perspectives and leadership of those who have been most impacted by rampant tech solutionism must be central in the visioning and design of healthy, just public interest tech ecosystems.

To rebuild from the roots up, social change leaders, investors, and tech visionaries of all stripes must look beyond analysis generated by the very culture that helped create the problem. After all, how could those occupying powerful positions in the tech industry—having directly benefited from the racist, sexist, and classist status quo—ever develop tools that would undo those very sources of power?

A ‘Just Tech’ Agenda

We need time, space, and resources to dream up and enact the societies and communities we want—life cannot be about just surviving and defending ourselves from systems of oppression.
—Safiya Umoja Noble, Just Tech steering committee member, MacArthur Fellow, and author of Algorithms of Oppression

In 2019, a group of leading scholars, organizers, educators, advocates, and artists at the intersection of technology and social justice convened to shape a new civil rights agenda focused on the impact of new technologies.

This group (including co-author Ruha Benjamin) developed the “Just Tech” framework, holding that to re-imagine and re-direct the interwoven trajectory of technology, society, government, and culture, we must collectively acknowledge and address the underlying structural injustices embedded in tech’s design, creation, and deployment. Further, efforts to build justice must go beyond responding and reacting to present harms to fully reimagine what a more intentionally chosen future with tech might look like, a vision to strive toward as we invent and innovate.

Between 2019 and 2022, with support from the Ford, MacArthur, and Surdna Foundations, the Just Tech framework moved into active development at the Social Science Research Council (SSRC) with the counsel of many more experts in the field. This collective envisioned and iterated on the idea of a fellowship that would support visionaries working on these issues, fostering a culture of care and collaboration to nurture the people who have the unique capacity to steer technology development toward justice.

Building on the SSRC’s tradition of supporting rigorous research to advance the public good, the SSRC’s Just Tech program was designed to highlight and interrogate questions of justice, power, and equity, creating an ecosystem of support for analysts, artists, and activists experimenting with both speculative and trusted methods.

The two-year Just Tech Fellowship was launched in November 2021 with an open call for proposals. In its inaugural year, the fellowship received 600 applications from social scientists, computer scientists, artists, organizers, social change leaders, and legal experts—visionaries across disciplines. From this pool, the Just Tech advisory board identified six outstanding fellows. During their terms, these emerging leaders will take on some of the thorniest issues at the intersection of technology and social justice:

Kim Gallon, founder of COVID Black, an organization that has taken on racial health disparities throughout the pandemic by telling empowering stories about Black life, will create a justice-centered framework for design and development of health information technology.

Chris Gilliard, a community college professor and widely published critic and advocate for civil rights in tech, will map novel surveillance practices and technologies to create a taxonomy for identifying and assessing their social impact and risk for marginalized communities.

Christine Miranda, a community organizer and digital director with Movimiento Cosecha, a national movement fighting for immigrants’ rights, will research and develop shared resources for decentralized digital organizing strategies.

Clarence Okoh, a civil rights attorney, will analyze the impact of carceral technologies on the civil and human rights of Black students in public school systems with longstanding histories of systemic racial discrimination.

Meme Styles, founder of MEASURE, a social enterprise creating antiracist evaluation tools and providing free data support for Black, Brown, and Indigenous-led organizations, will develop a data-sharing tool to enable strategic collaboration.

Rua Williams, a computer graphics designer and disability justice advocate, will partner with adaptive technology users, developers, and user-experience designers to develop a “Cyborg Maintenance” approach that advances collective self-determination by eliminating barriers to equipment access, maintenance, and customization.

These fellows will engage with and shape cultural conversations with the public, mobilize and work with affected communities, audit and address harms created by novel technology systems, and envision and develop new tools and prototypes. The program will support them by building critical connections, continually growing and deepening the reach of its network of Just Tech fellows, scholars, advisors, and affiliates, showcasing work and convening community conversations.

Just Tech aims to model a radical imaginary: a collective effort to expand our sense of what’s possible–to shift our cultural imagination away from the unjust tech future that right now seems inevitable toward a radically new field of possibility and potential. Once supported and recognized, leaders and changemakers like the Just Tech Fellows have the capacity not only to disrupt and dismantle the logic of tech solutionism, but to envision and create a world where humanity is free to choose a relationship with technology that embraces and manifests justice.

Rebuilding the Future of Tech

We are living at a critical moment. New technologies are rapidly being integrated into everyday life, often below the radar. Our intervention must be both timely and holistic: What does it mean to include new voices unless we create a context in which those voices are welcome and heard? To create those conditions, leaders in civil society and the private and public sectors must challenge institutional power and center the discussion on core social justice issues such as racism and structural inequality.

Institutions and philanthropies invested in social welfare and the public interest are uniquely positioned to intervene in this critical moment. They can invest in those working on the social implications of data-driven technologies; provide more substantive and lasting pathways for those who could be doing so; and legitimize and fund work that is led by and that centers the needs of marginalized people in particular, and civil society more widely. As series contributor Jake Porway notes, funders and civil society organizations can “redefine how these technologies are fundamentally designed.”

And while Just Tech’s support of the people who will reimagine and rebuild the future of tech is critical, it is just one piece of a bigger puzzle. The full suite of tools and measures needed to enable a world where technology operates in the public interest is much broader. To fully realize the potential influence of Just Tech leaders, foundations and donors should invest in tech innovation and design from the margins. Communities’ lived experience must inform tech products and implementation, as highlighted in the series essay on the University of Michigan’s Public Interest Technology Knowledge Network. We need true commitment to cross-sector, public-private efforts to create public interest standards that can guide the development and deployment of new technologies. We need public officials who have a full and accountable understanding of the implications of the technologies that they are governing.

As the world struggles to reckon with ongoing legacies of systemic discrimination and racial injustice, we must resist the assumption that the solutions to the problems created by technology are to be found in technology—and instead invest in the people, ideas, and frameworks that can deeply transform sociotechnical systems. Now is the time to build toward radically imagined tech futures that celebrate and manifest hope, joy, justice, and self-determination.


Read more stories by Greta Byrum & Ruha Benjamin.