
In Short

Civil Rights in the Age of Big Data

Digital information about who we are and what we do is the currency of our digital economy and the substrate of our digital society. As the flow of information increases, fairness, justice, and equal opportunity become paramount to the design, deployment, and entrenchment of computation-based technologies in our lives. Today, a group of civil and human rights organizations, including the Open Technology Institute, has released a set of civil rights principles for big data that identify the critical importance of equity in the debate over big data's opportunities and threats.

Historically, civil and human rights concerns have tended to run in the background of heated conversations about government surveillance, corporate privacy intrusions, and national security. But examples that demonstrate palpable concerns are circulating in our public sphere. Recently, we heard references by President Obama to FBI surveillance of Dr. Martin Luther King and the targeting of movements for collective self-determination. There have been reports of sustained economic stagnation among communities of color caught up in a recursive loop of bad credit data. Meanwhile, we've heard of the use of surveillance technologies against low-income populations.

These episodes raise more than just questions of personal liberty. They force us to ask: how does big data categorize us into groups? When do predictive analytics persuade us to think or act based on patterns of behavior of people like us (or algorithmically deemed to be like us)? How are computational systems making decisions for us based on an aggregate self?

Our civil rights principles for big data provide an initial roadmap for addressing these questions. They ask companies and government to stop high-tech profiling, demand fairness in automated decision-making systems, preserve constitutional principles, enhance individuals' meaningful control of and access to personal data, and protect people from the consequences of inaccurate data. And they put these issues in conversation with a longer history of data discrimination: communities of color, especially poor communities of color, have historically shouldered the burden of government surveillance and corporate intrusions of privacy.

At OTI, we've seen concerns and anxieties about digital privacy arise in field work with members of underserved communities. Our field work shows that people trying hard to get by want fairness in their daily digital encounters. They want pragmatic solutions: practical ways to protect themselves from irresponsible or abusive behavior by those who transport their messages or information from one point to the next. And they wish to take advantage of the economic, social, and civic benefits advertised to them as they become "digitally included." Having been preyed upon before, they would prefer that access to new digital technologies break that trend.

Embraced by government agencies and the private sector, civil rights principles for big data can help make that opportunity a reality. But it depends on a willingness of all the relevant parties to recognize that technological systems are never neutral. Companies and government have the power to instantiate a set of norms commensurate with our expectations of equity. Fairness is a choice. It can be intentional.

Indeed, big data analytics and computer decision systems can be shaped to produce public value. In a piece about electronic health records, Frank Pasquale described the possibility for big data analysis to serve the public good. That possibility would require a trade-off: in exchange for government subsidies, parties involved in the development and operation of new, proprietary electronic health systems should make the data on these platforms* available to government agencies tasked with monitoring public health and related industries. Such access would be helpful not only for exposing fraud and abuse, but also for advancing medical research and making health and wellness a more equal opportunity.

Whether responding to the challenges or the opportunities, we should, and can, think more broadly about big data. Let's stop arguing about whether privacy is dead or the surveillance state inevitable. Let's focus on equity. Doing so will help us better understand what we have to gain in shaping our digital destinies.

*with consent and deidentified

More About the Authors

Seeta Peña Gangadharan
