New America

In Short

Centering Civil Rights in the Privacy Debate

(Photo: Lightspring / Shutterstock.com)

Can Congress prevent the disproportionate harm that irresponsible commercial data practices can inflict on marginalized communities?

The internet provides access to innumerable goods and services that many people rely on to survive, including jobs, housing, and general information. But nothing is free: In exchange for this access, companies often collect and process endless amounts of data on individuals. More than that, people have come to learn the hard way, primarily via data breaches, that processing so much data presents real privacy pitfalls: identity theft and lost income; more intangible harms like damaged reputation and emotional distress; and inconveniences like having to spend time changing passwords and updating privacy settings.

People who are part of marginalized communities are especially vulnerable to exploitative commercial data practices. This often results in data collection that fuels discrimination, including restricted access to opportunities like housing and employment. It also opens up avenues through which these communities can be targeted and exploited.

Put another way, as our lives increasingly shift online, so, too, have methods of discrimination, now built on individual data profiles, and our laws have been slow to keep up. It's thus vital for Congress, in particular, to conceptualize privacy as a civil right and to move the privacy debate to prioritize perspectives from the marginalized communities who are disproportionately harmed by these new dangers.

Indeed, privacy and civil rights ought to go hand in hand, but they often don't. In fact, civil rights discussions have largely been missing from the privacy debate, despite the reality that privacy infringements affect civil rights in myriad ways.

Erin Shields, the national field organizer for internet rights at the Center for Media Justice, explained some of these ways at a recent event on "Centering Civil Rights in the Privacy Debate," hosted by New America's Open Technology Institute and Color of Change. The "old regime of oppression and discrimination," she said, has been "updated and compounded by algorithms and the ability of corporations, third-party data brokers, social media, and the government to collect an extreme amount of information on us and also to guess about us, and deliver us services or not deliver us services based on those guesses."

Further, manipulated data can harm communities of color and low-income communities through over-policing, voter suppression, hate campaigns, online fraud, predatory schemes, digital redlining, and more. As Francella Ochillo, vice president of policy and general counsel at the National Hispanic Media Coalition, explained, it's critical that "privacy laws acknowledge the enduring economic, political, and cultural oppression that still exists in this country to this day."

For immigrant populations, there's also a crucial link between privacy and surveillance. One organization represented at the event, which has historically focused on immigrant rights, has shed light on the fact that the tech industry has played a key role in accelerating the targeting, surveillance, deportation, and detention of immigrants.

By using software provided by the data-mining company Palantir, for instance, Immigration and Customs Enforcement drew on information about immigrant children and their family members to track and detain undocumented immigrants, which then facilitated the family separation crisis. Tech companies' apparent willingness to participate in this entrapment of undocumented immigrants, along with the lack of transparency around the practice, only exacerbates fears that civil and human rights are being violated as companies profit off the commodification of data.

In addition, some corporations have collected personal information without clearly disclosing what's being collected and why, sometimes repurposing data for other uses without consent and building tools that make segregation worse. Consider how, while Facebook collected users' phone numbers as a security protocol against unauthorized logins, it also used that information to deliver targeted advertising. Researchers have already established that Facebook's ad delivery algorithm discriminates based on race and gender. The most recent such study, published last month by Northeastern University, the University of Southern California, and Upturn, finds that this discrimination persists in job listings and housing ad delivery, even when advertisers don't opt to target specific demographics and are trying to reach a broad audience.

Experts have long known that data practices can lead to discriminatory outcomes, but now there鈥檚 a growing body of evidence as further proof. So how can government and society begin to hold corporations accountable for these harmful consequences?

One possibility is legislation. Free Press and the Lawyers' Committee for Civil Rights Under Law recently published model legislation outlining how Congress can ensure that personal data isn't used to discriminate against protected classes in areas like employment, housing, and education. They also propose classifying online businesses as public accommodations, which would make it unlawful for them to discriminate against marginalized communities and restrict access. The model legislation would also give the Federal Trade Commission (FTC), the primary government agency responsible for regulating privacy, the ability to enforce the law. Senator Ed Markey (D-Mass.) has introduced a bill along similar lines.

Another possibility is empowering the FTC to protect privacy and civil rights through broad rulemaking and other changes in the agency's authority. Public Knowledge has advocated for these sorts of changes, which could help protect marginalized communities from these now-extensively documented civil rights harms. In general, Congress needs to ensure that the agency has the right resources and experts to understand and address these issues.

Ultimately, though, the debate on privacy ought to center perspectives from marginalized communities that are disproportionately affected, and the tech policy community more broadly needs to reflect the diversity of the United States.

Alisa Valentin, a communications justice fellow at Public Knowledge and another panelist at the event, has written about these issues to underscore the importance of diversifying the tech policy space. Affected people, especially black people, ought to be included in discussions about these issues, she said, particularly because stakeholders define "privacy" differently, depending on community and cultural background. (Notably, black and brown communities have already acknowledged the impact of data on their lives through their own research, including a report by the Stop LAPD Spying Coalition.)

As Ochillo urged at the event, the privacy debate can't exclude these critical voices. "You have to be persistent and clear about elevating your communities' message in language that they actually understand," she said. "Because there are lots of different stakeholders in this debate who might all agree on where we're going, but disagree on how we're going to get there. What are you doing to get your constituents' message into those core storerooms?"

More About the Authors

Becky Chao
