Phillip Atiba Goff is an assistant professor of social psychology at UCLA and co-founder and president of the Center for Policing Equity. This op-ed appeared May 2, 2014 on CNN.
Target knows when you get pregnant. Facebook knows when you break up with someone. Google knows when you have been in a car accident.
These are all (terrifying) consequences of our increasing lack of privacy. They are also results of the “big data” revolution that’s transforming our society. Whether you are a buyer, a seller or a hopeless romantic “researching” your old high school crush, chances are, some program is keeping track of all your clicks. This is what big data can do. By aggregating data from billions of people, analysts can predict what you are likely to do next.
Given the degree to which Amazon and Facebook have crushed their competition, companies that do not take advantage of big data are doomed to obsolescence.
Why then, with all the noise we hear about the rise of big data in consumerism, do we hear so little about big data in social justice? Is it not possible for big data to make the world a better place while making it creepier and less private?
The short answer is that big data can make the world a fairer place. But there is less immediate financial gain to be made from it, and some of the science is still in progress.
Perhaps the biggest reason is that we fear what big data will tell us about our democracy. Racism, sexism and other forms of discrimination are still very much alive. If we collect data about disparities, we might see far more ugliness in our society.
“Measuring fairness” is harder than it sounds. But big data can help us.
For example, some police departments are leveraging big data innovations, like the use of CompStat. You know, the scenes in “The Wire” where people talk about “bringing numbers down”?
CompStat is a way of mapping crime in regions of a given city and measuring patterns of crime in those regions. It has become a popular method for holding police executives accountable for crime rates, and it seems to work. All kinds of information are captured through CompStat: type of crime, time of day, GPS coordinates of the incident and even the demographics (e.g., race, gender, age) of the suspect.
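To make the categories above concrete, here is a hypothetical sketch of what one CompStat-style incident record might look like as a data structure. The field names and the `IncidentRecord` class are illustrative assumptions, not an actual CompStat schema; real systems vary by department.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class IncidentRecord:
    # Illustrative fields only, mirroring the categories named in the text.
    crime_type: str        # type of crime
    occurred_at: datetime  # time of day
    latitude: float        # GPS coordinates of the incident
    longitude: float
    suspect_race: str      # demographics of the suspect
    suspect_gender: str
    suspect_age: int

# A single (fictional) incident:
record = IncidentRecord(
    crime_type="burglary",
    occurred_at=datetime(2014, 5, 2, 23, 15),
    latitude=34.0522,
    longitude=-118.2437,
    suspect_race="white",
    suspect_gender="male",
    suspect_age=27,
)
```

Aggregating millions of records like this one is what lets analysts look for patterns by place, time and demographics.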
Why, then, don’t most police departments use CompStat-style tracking to measure officer fairness?
In part, it’s because we don’t know how to measure “fairness.” Simply comparing the percentage of white suspects an officer stops to the percentage of whites in the neighborhood is a common but terrible metric. That is because, if whites commit disproportionately more (or less) crime than their representation in the neighborhood, then using the neighborhood as a benchmark will mislead us.
This technique, called population benchmarking, is widely discredited by scholars and law enforcement alike. Comparing officers against each other makes it hard to tell whether all or none of the officers are engaged in biased policing, which troubles academics and civil liberties groups. And comparing officers in one district against officers in another is often an unfair comparison, given that income — and, often, crime — varies widely across districts.
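A small numeric sketch shows why population benchmarking can mislead. All the numbers below are hypothetical, chosen purely to illustrate the argument in the preceding paragraphs: a neighborhood where whites are 60% of residents but account for only 30% of observed offenses.

```python
# Hypothetical shares, for illustration only.
population_share = {"white": 0.60, "nonwhite": 0.40}  # neighborhood residents
offense_share = {"white": 0.30, "nonwhite": 0.70}     # observed offending

# An officer whose stops exactly mirror the offending pattern:
stops_share = {"white": 0.30, "nonwhite": 0.70}

# Population benchmarking compares stops to the resident population...
pop_ratio = stops_share["white"] / population_share["white"]

# ...while a benchmark based on offending compares stops to crime patterns.
offense_ratio = stops_share["white"] / offense_share["white"]

print(pop_ratio)      # 0.5 — looks like whites are under-stopped
print(offense_ratio)  # 1.0 — stops actually track offending exactly
```

The population benchmark flags a disparity (a ratio of 0.5) even though this hypothetical officer's stops match offending patterns perfectly (a ratio of 1.0) — which is exactly why using the neighborhood's demographics as the baseline can mislead in either direction.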
So what is a police chief to do? The answer is, at least collect the data.
Target can tell if you get pregnant because it has billions of data points about who buys what when. Hidden within the data are patterns of behavior — the “if this, then that” logic — that most of us do not recognize within our own decisions. This is how it recommends purchases to you.
There are billions of police encounters across the country every year. When we start to aggregate the data, we can figure out how to ask better questions and get better answers.
This year, my colleagues and I will begin collecting data for the nation's first database tracking police stops and use of force. Using big data to analyze police behavior across the country is the first step in harnessing these powers to make our society fairer. Although this justice database will not end unfairness in policing, it might give us tangible benchmarks from which to work in our conversations about race in the U.S. And it will let us learn how to ask the right questions to reduce racial disparities.
Just as Amazon was not remotely correct in thinking I might like Justin Bieber’s new album because I liked “Purple Rain,” big data may need time to get it right.
And although we must remain at least as vigilant over privacy concerns in law enforcement as we are over online shopping, it’s time to leverage the big data revolution to start helping us achieve a fairer society.