Major revelations about data protection have emerged over the past year. Last spring, the political consulting firm Cambridge Analytica found itself at the center of a massive scandal after it was revealed that it had collected and misused sensitive personal data about Facebook users. This coincided with the launch of the European Union’s General Data Protection Regulation (GDPR), a government policy that enforces new rules relating to the protection of people’s personal data. Paired with high-profile data breaches from companies like Equifax, issues around ethical and responsible data use have infiltrated the public’s collective conscience.
I believe the social sector must address three core considerations as a part of its own practice of ethical and responsible data use: how we maintain an individual’s right to their data, how we secure that data, and how we ensure that no one uses it as a means to manipulate or mislead them.
Where the social sector stands with GDPR
The core feature of GDPR is the protection of individuals’ personal data. It outlines people’s right to control who has their data (consent) and how others can use it (processing). It also introduces mechanisms that allow individuals to reinforce their ownership over their personal data over time, including the right to request that an organization “forget” all the data it has on them. In this way, GDPR essentially represents a massive transfer of wealth. After decades of organizations freely mining personal data as an asset, GDPR dictates that individuals maintain ownership over that data. As a result, we have already seen whole business models fail; the social data mining company Klout, for example, ceased operations in response to the regulation.
Any framework for ethical data use must respect this paradigm. At the most basic level, if a user consents to joining your organization’s email marketing list, using that data for something beyond email marketing would be a misuse of that information and a violation of their rights. On a more sophisticated level, it has long been practical, lucrative, and relatively common in the sector to merge and mine credit agency and social platform data with real estate and wealth indexes to profile potential major donors. Today, this practice is a prime example of a privacy violation.
In terms of data breaches, the major ones we have witnessed in the last year have happened to massive private-sector organizations with information technology budgets in the millions. Yet the risk of data breaches at social sector organizations—where budgets, resources, and leadership attention are limited—remains high.
Implementing responsible and secure data protection measures can be a huge undertaking, with pressure to maintain pace with the private sector and only a fraction of the human and financial capital. Most organizations don’t understand the privacy implications of their data-collection practices and are ignorant of the harm they may cause. This is true even of organizations that outsource the majority of their information systems and marketing practices, where suppliers operate like a black box and may not maintain the same principles of data security.
Though there are a few exceptions like California’s Consumer Privacy Act, the reality is that most governments in North America are either unable or unwilling to pass regulations like GDPR. Outside Europe, therefore, it is up to the social sector to find a way to lead data privacy efforts through our values and practices. If we do nothing, we agree to the status quo—to continue putting ourselves and our supporters at risk.
How the social sector can lead the way in data privacy
Leading means developing policies and practices that move toward more-transparent collection and use of data, and allow us to speak directly and frankly to our supporters. We have an opportunity to build a code of practice that addresses ethical data collection and consent, embraces transparency, and reflects the values of the social sector. Even without robust resources, there are things organizations can do to attain a higher standard of personal data use and protection. Here are three:
1. Conduct an internal review of how you collect, store, and use constituent data across your organization.
Organizations should actively avoid collecting more data than necessary or gathering it through covert methods. Data append services and behavioral marketing platforms that discreetly track and identify user data can be the biggest culprits. Knowing the difference between solicited data (information a constituent willingly provides, such as their home or email address) and unsolicited data (information that lies within aggregate data, such as voter records or credit ratings, that users don’t know about) is also important, as is recognizing what data you realistically need to accomplish your goals when building ethical guidelines.
Organizations collect data in different ways and on different platforms: CRMs, advertising and marketing platforms, analytics platforms, social media platforms, and so on. It is important to know how information flows into these systems and how it serves your organization’s mission, and to ensure that its storage is legally compliant and secure from data breaches.
A Canadian national cancer organization I worked with, for example, wanted to use its database to survey existing cancer patients, but doing this inadvertently meant combining donor data with sensitive health information. After digging deeper, it realized the project was in contravention of Ontario’s Personal Health Information Protection Act and ultimately canceled it.
2. Practice radical transparency.
As long as organizations stay true to their missions, visions, and overall brand, constituents will continue to support their work. Communicate that you respect and care about their privacy, and you’re well on your way to a long-term, mutually beneficial relationship.
The annual transparency report from the social media site Reddit offers an excellent example of how to achieve this. While Reddit is not a charity, its community welcomes the company’s regular publication of government and law-enforcement requests for user information. This year’s thread demonstrates the kind of discussion and engagement that can come with transparency, and how it can positively affect users’ overall trust and support.
3. Contribute to the conversation and help build a better practice.
Many people are looking to engage and work together to find solutions to the data protection challenges facing the social sector. In a recent article, for example, 92nd Street Y’s Asha Curran and The Partnership on AI’s Julia Rhodes Davis discuss how vital it is for mission-driven organizations to seek individuals with digital/data fluency, who can help translate concepts like data protection across roles and departments. And the University of Ottawa’s Michael Geist constantly writes about the need to modernize regulations in Canada to catch up to the European Union, and to adopt standards that focus on maintaining individual rights.
To help build a better practice, organizations can:
- Ask vendors: What do your data use and protection practices look like? This can both expand your own organization’s knowledge and help you see the fuller picture.
- Ask yourself: If a data breach were to happen today, what would the impact be? What is our action plan? What is the risk? Bring these questions to your senior leadership team and ensure there is a proper risk assessment and crisis plan in place.
- If possible, create an internal team or task force to conduct an assessment and ensure that your organization is regularly evaluating practices as the sector continues to shift.
Data collection, use, and protection will continue to evolve and become even more complex. Inaction and maintaining the status quo will only put more social sector organizations at risk. While we wait for governments to step in and create policies that are genuinely in the best interest of the public, the social sector should strive to hold itself to a higher standard and thus stay true to the spirit of making meaningful change in the world.