Charlton McIlwain
Vice Provost for Faculty Engagement, Pathways & Public Interest Technology, New York University
This is an excerpt from , edited by Torie Bosch. Copyright © 2022 by Princeton University Press. Reprinted with permission from Princeton University Press.
In the early 1960s, the Black civil rights revolution raged in the streets across the United States. This quest to build a more racially just and equitable society happened right alongside the computer revolution. Soon the two fused with the advent of the Police Beat Algorithm (PBA), a software system to help police departments collect crime data and determine where to focus crime-fighting efforts, and one that would end up deeply affecting our society from the 1960s through the present.
Why did the Police Beat Algorithm come to exist? What problems prompted the need for its formulation? Who developed it, and to what ends? The answers to each of these questions collectively tell a story about how a little-known computational experiment laid the cornerstone for what would become today's surveillance infrastructure, one that has deeply and negatively affected communities of color across the globe.
In the early 1960s, IBM topped the list of the world's leading computing companies. It innovated not only new computer hardware and systems but new ways of thinking about the computer's role and utility in everyday society. In its 1965 annual report, IBM president Thomas J. Watson Jr. defined the computer as essentially a problem-solving tool and aligned the company's mission accordingly.
IBM's focus on problem-solving also dictated its marketing strategy. The company's marketing representatives didn't peddle prepackaged products. Rather, they engaged leaders in every major industry, from banking to transportation to the military, and simply asked, "What problem do you have?" Then they promised to marshal IBM's research and development strength to build customized solutions for its customers, solutions that could be broadly applied and widely scaled.
While IBM labored to market new computational solutions to social problems, uprisings materialized across the United States. In 1964 alone, so-called ghetto riots broke out in places like Harlem and Rochester in New York; Philadelphia, Pennsylvania; and Dixmoor, Illinois. These uprisings captivated the nation, as did the rampant white violence against those who marched for civil rights across the South. In a speech to Congress on March 15, 1965, President Lyndon Johnson proclaimed that America's "Negro problem" was America's problem. Citizens across the United States identified this fracture in "race relations" as the nation's most pressing dilemma.
For most white Americans, however, the urban uprisings that plagued the nation revealed Black Americans' penchant for violence and criminality, so much so that President Johnson's white Southern constituents thought solving America's crime problem should be his government's top priority. Heeding their agitation, Johnson formed the President's Commission on Law Enforcement and the Administration of Justice on July 23, 1965. The Commission's charge was to study the causes of, and find solutions to, America's crime problem.
Just 19 days later, one of the deadliest and costliest uprisings erupted in Watts, Los Angeles. One too many incidents of police brutality at the hands of the Los Angeles Police Department set off six days of unrest. Hundreds of LAPD officers flooded the streets. Fourteen thousand National Guard troops stormed the city. Law enforcement killed 34 Black residents and injured thousands more. More than $40 million worth of property was damaged during the siege.
Through the Watts uprisings, Black America sent a message to white America: We're fed up. We're tired of racism, discrimination, and police brutality. White Americans, however, saw Watts as confirmation of their prejudiced belief that Black people are lawless and violent. For the President's Crime Commission, white America's vision of the Watts uprisings put a face to the problem the president had called on them to solve, a problem that they felt required an extraordinary remedy. They found great potential in the new computing technologies that had already revolutionized war and national defense. Computing held so much promise that in the spring of 1966, following the Watts uprisings, Johnson added the Science and Technology Task Force to the Commission to introduce new computational solutions to crime. The president justified the task force's work by pointing to computing technology's success in war, national defense, and space exploration:
The scientific and technological revolution that has so radically changed most of American society during the past few decades has had surprisingly little impact upon the criminal justice system. In an age when many executives in government and industry, faced with decision making problems, ask the scientific and technical community for independent suggestions on possible alternatives and for objective analyses of possible consequences of their actions, the public officials responsible for establishing and administering the criminal law . . . have almost no communication with the scientific and technical community. More than two hundred thousand scientists and engineers are helping to solve military problems, but only a handful are helping to control the crimes that injure or frighten millions of Americans each year.
While the president and the Commission held great hope for the solutions the Science and Technology Task Force would produce, they placed their hopes more specifically in the one man whom they appointed to lead it: Saul I. Gass.
Gass was a mathematician and operations research pioneer. In 1958 he wrote the first textbook on linear programming, a mathematical modeling technique that optimizes an objective subject to linear relationships between variables and that, in applications like Gass's, seeks to shape human behavior. Gass went to work for IBM in 1960 as project manager for the company's contract to develop the real-time computational systems needed for Project Mercury, the United States' first manned space mission.
By 1965, when the president appointed Gass to lead the Science and Technology Task Force, Gass was managing all of IBM's federal system projects. By heading the task force, Gass signaled his agreement with the Johnson administration that policing was the institution best equipped to solve America's crime problem; to that end, he developed the Police Beat Algorithm.
The Police Beat Algorithm was designed to address two broad planning questions. First, how should police departments equitably divide a municipal area into beats along geographic and demographic lines? (Gass focused on "urban" areas based on population, crime levels, and demographic factors.) Second, how should police departments effectively deploy police resources (people, weapons, vehicles, etc.) across those divisions?
Interestingly, Gass frequently highlighted the need to solve these problems in order to develop "contingency riot and other emergency plans," a growing concern tied directly back to Watts and similar uprisings.
The Police Beat Algorithm predominantly addressed four problems associated with police operations: 1) pattern recognition, identifying crime patterns within a set of crime data; 2) profiling, associating crime patterns with probable suspects; 3) dragnetting, linking probable suspects of one crime with past crimes or arrests; and 4) patrol positioning, placing patrols within appropriate geographical divisions of the city based on where the most crimes take place and where, according to known suspect profiles, future crimes were judged most likely to occur.
This is where planning problems and operational problems intersected. The Police Beat Algorithm was designed to focus on patrol positioning. Doing so relied on one primary component, the availability of crime data, and two key computational techniques: norming and weighting. Norming refers to analyzing the data to determine "normal" and aberrant ranges of criminal activity, both across a geographical area and for particular groups of criminal suspects (white people versus Black people, for example). Weighting, in this instance, was a means to rank the severity of different crimes. For example, crimes like homicide, rape, burglary, larceny, and auto theft were weighted with a score of four, signifying the most severe forms of crime. Some of the arbitrary, or dare I say biased, nature of these weights can be seen in the lack of weighted differentiation between crimes against persons like homicide on the one hand, and property crimes like car theft on the other. Traffic accidents received a weighted score of two, and drunkenness a score of one. Geographical areas were weighted by the preponderance of crimes committed within their boundaries. The crime data, the statistical norms, the weights, and the geographical configurations of a city all figured into the Police Beat Algorithm.
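The weighting and norming described above can be sketched in a few lines of code. This is a minimal illustration, not Gass's actual implementation: the severity weights (four, two, one) come from the text, but the offense-to-weight mapping for every crime, the sample beat data, and the use of z-scores to operationalize "norming" are all assumptions for the sake of the example.

```python
from statistics import mean, pstdev

# Severity weights as described in the text; the full mapping is an
# illustrative assumption (only the offenses named in the text appear).
CRIME_WEIGHTS = {
    "homicide": 4, "rape": 4, "burglary": 4, "larceny": 4, "auto theft": 4,
    "traffic accident": 2,
    "drunkenness": 1,
}

def beat_score(incidents):
    """Weighting: sum the severity weights of all incidents in one beat."""
    return sum(CRIME_WEIGHTS.get(crime, 0) for crime in incidents)

def norm_scores(scores):
    """Norming (one plausible reading): express each beat's score as
    standard deviations from the citywide mean, so 'aberrant' beats
    stand out from 'normal' ones."""
    mu, sigma = mean(scores), pstdev(scores)
    return [(s - mu) / sigma if sigma else 0.0 for s in scores]

# Hypothetical incident logs for three beats.
beats = {
    "beat_1": ["homicide", "burglary", "drunkenness"],
    "beat_2": ["traffic accident", "drunkenness"],
    "beat_3": ["auto theft", "larceny", "burglary", "rape"],
}
scores = {b: beat_score(incidents) for b, incidents in beats.items()}
normed = dict(zip(scores, norm_scores(list(scores.values()))))
```

Note how the bias described in the text is baked into the arithmetic: because homicide and auto theft carry the same weight, a beat with several property crimes scores as "aberrant" as one with violent crime, and patrols concentrate wherever the weighted totals, not the harms, are highest.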
In one respect, the PBA was developed to address a problem that framed Black people, primarily those who were poor and lived in urban environments, as predominantly responsible for crime and, as a result, as the problem that needed to be solved. The Police Beat Algorithm was therefore predetermined to geographically locate, isolate, and target Black and brown communities for police profiling, surveillance, and the distribution and deployment of patrol and tactical units. All of the resulting data from these "solutions" could be used to forecast where crime was most likely to happen in the future and allow police to plan accordingly. To be sure, the framing of the problem, and the configuration of the Police Beat Algorithm itself, promised outcomes that were not so much predictive of future crime as they were self-fulfilling prophecies. Gass's PBA was essentially a proof of concept. Nevertheless, it was implemented in 1968 in the Kansas City, Missouri, Police Department's new Alert II Criminal Justice Information System. It was through this system that the PBA's racist impact was fully realized. Kansas City's "Operation Robbery Control" was just the first example of how the algorithm led police officials to make the tactical decision to concentrate police personnel and deploy weapons on what was essentially the whole of East Kansas City, home to the vast majority of the city's Black citizens.
Ultimately, the Police Beat Algorithm spawned thousands of similar systems designed and built throughout the seventies, eighties, nineties, and beyond. Over the decades, these algorithms have grown to include facial recognition, mobile surveillance, risk assessment, and other such tools used everywhere from local law enforcement to international security. The same logics and assumptions that motivated the creation of the PBA more than 50 years ago continue to permeate this array of contemporary law enforcement technologies. Fear of crime, still personified disproportionately by Black and brown people, continues to be greatly exaggerated, justifying exorbitant investment in developing more law enforcement technologies. Belief in the objectivity, infallibility, and predictive power of data continues to run rampant among technology purveyors, law enforcement personnel, public officials, and policy influencers. And stories about the disparate outcomes these technologies have on communities of color continue to roll in like a steady drumbeat. In these ways, today's law enforcement technologies are not new; they're just more sophisticated, insidious, ubiquitous, and impactful than when the PBA was first conceived more than half a century ago.