The present: predictive policing

Top cops around the country are announcing that predictive policing is upon us. Academic experiments have been running for a while, and last year, Microsoft announced it would be assisting police in developing technology for predictive policing purposes. Microsoft was somewhat cautious about framing the purpose of such technology: 'predictive policing is not about making arrests, but about preventing crime through more effective and efficient resource allocation.' Other commentators have excitedly noted, without irony, the similarities with the film Minority Report. This is no longer simply fantasy: a study from UCLA has found that predictive policing algorithms actually reduce crime in the field.

What are we to make of this? Should critics be silent, and agree that security is the beneficial outcome of a rather disquieting development in technology?

Predictive policing algorithms are based primarily on crime statistics: in other words, the past behaviour of law enforcement in tackling crime. It is therefore arguable that crime statistics do not reflect what crimes are occurring; a better way to think about this data set is that it provides a picture of the state's response to crime. This creates a real risk of the biases and social trends we see in everyday policing being reproduced in the supposedly more objective and scientific methodology of computerized predictive policing. Feeding data into automated processes without careful analysis of the assumptions being made can provide misleading answers to important questions.
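
The feedback loop described above can be sketched in a few lines of code. This is a minimal illustration with entirely hypothetical numbers, not a model of any real system: two districts have identical true crime rates, but one starts with more patrols, and recorded crime only accrues where police are deployed.

```python
# Hypothetical sketch: recorded crime reflects where police look,
# not where crime happens, so an initial disparity never corrects itself.

def run_feedback_loop(true_rates, initial_patrols, rounds=10):
    """Each round, recorded crime accrues in proportion to patrol presence
    times the true crime rate; the next round's 100 patrol units are then
    allocated in proportion to the crime recorded so far."""
    patrols = list(initial_patrols)
    recorded = [0.0 for _ in true_rates]
    for _ in range(rounds):
        # Crime is only recorded where patrols are present to observe it.
        for i, rate in enumerate(true_rates):
            recorded[i] += patrols[i] * rate
        # "Predictive" allocation: send patrols where the data says crime is.
        total = sum(recorded)
        patrols = [100 * r / total for r in recorded]
    return recorded, patrols

# Two districts with IDENTICAL true crime rates, but district A starts
# with more patrols (say, due to historical over-policing).
recorded, patrols = run_feedback_loop(true_rates=[0.05, 0.05],
                                      initial_patrols=[60, 40])
```

Even after many rounds, district A shows 50% more recorded crime and keeps 60% of the patrols, despite the underlying crime rates being identical: the historical bias is laundered into apparently objective statistics.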

The future of law enforcement?

A stark example of this specific problem and the dangers it creates, albeit in a slightly different context, was revealed in documents leaked by Edward Snowden. The Skynet program run by the NSA uses an algorithm applied to data to identify terrorists. This algorithm was developed by taking data about 'known terrorists' and comparing it with a wide range of behavioural data taken from mobile phone use.

After no doubt many taxpayer dollars and NSA man-hours, this algorithm's highest-rated target was Ahmad Zaidan, who is not a terrorist at all, but rather Al-Jazeera's bureau chief in Islamabad. The NSA documents refer to Zaidan as a MEMBER OF AL-QA'IDA, again, seemingly without irony.

There was a range of problems with the Skynet program. But one of the most obvious appears to be a rejection of one of the most basic principles of data science: correlation does not imply causation. While Zaidan may meet with known terrorism suspects, travel with them, and share social networks, he is clearly engaging in this behaviour as part of his role as a journalist. While Zaidan may fit the algorithm to identify terrorists perfectly, it is immediately obvious to any human that he doesn't actually belong in this category at all.
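
There is also a simpler arithmetic problem lurking here: when the target class is extremely rare, even a very accurate classifier will flag mostly innocent people. The base-rate calculation below uses illustrative numbers only (the population size, number of genuine targets, and error rates are assumptions, not figures from the Skynet documents).

```python
# Illustrative base-rate arithmetic: why a classifier applied to a whole
# population is dominated by false positives when the target class is rare.

def positive_predictive_value(population, base_rate,
                              sensitivity, false_positive_rate):
    """Fraction of people flagged by the classifier who truly belong
    to the target class (precision)."""
    actual_positives = population * base_rate
    true_positives = actual_positives * sensitivity
    false_positives = (population - actual_positives) * false_positive_rate
    return true_positives / (true_positives + false_positives)

# Suppose 55 million phone users, 2,000 genuine targets, and a classifier
# that catches 99% of targets with only a 0.1% false-positive rate.
ppv = positive_predictive_value(population=55_000_000,
                                base_rate=2_000 / 55_000_000,
                                sensitivity=0.99,
                                false_positive_rate=0.001)
```

Under these assumptions, under 4% of the people flagged are genuine targets; the other 96%+ are false positives like Zaidan. No plausible improvement in accuracy fixes this while the base rate stays so low.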

Indeed, our approach to terrorism is perhaps a future echo of more general trends in policing. The intelligence and law enforcement resources devoted to terrorism are inordinately large, and they have been deployed to prevent a social problem that is relatively tiny. The outcome has been over-policing of the worst kind: unnecessary surveillance, increased data collection and invasive investigation techniques. In short, our approach to counter-terrorism takes our current approaches to policing, both in data collection and its analysis, and turbo-charges them.

Perhaps most troubling of all is the outcome to which all of this leads: large numbers of terrorism convictions arising as a result of entrapment. With all this knowledge and these resources directed towards terrorism, the proverbial sledgehammer has ended up creating its own nuts to crack. The lesson we can learn from over-policing in an effort to predict human behaviour is that it becomes a self-fulfilling prophecy. These high levels of confected terror threats, heroically avoided thanks to the FBI, are perhaps the logical outcome of policing that has socially constructed our understanding of crime. If we add data science to this heady mix, there is a grave risk that it will provide 'a veneer of technological authority' to these practices.

Arresting these trends

This gives us pause to think about how the law can intervene in such debates to protect ourselves from these serious problems. One obvious strategy, at least initially, is to provide those working within the criminal justice system with transparency over the algorithms used in predictive policing. Currently, these algorithms are not publicly available. Such information ought to be considered vital to the protection offered by the fifth and fourteenth amendments. There are security implications, no doubt, in revealing this, but there are also risks if we do not. Such transparency will also allow the courts to test the reliability and accuracy of such programs in providing the reasonable suspicion (and probable cause) required by the fourth amendment. If courts allow predictive policing determinations to substitute for reasonable suspicion absent this kind of transparency, the fourth amendment will be weakened, arguably to an unprecedented degree.

Another policy alternative could be to decouple the idea of algorithmic trends in antisocial behaviour from law enforcement and criminal justice entirely. It is possible to imagine a world where this kind of data analysis is used to inform government spending and social programs. This is, of course, another way to reduce crime and resource burdens on policing. Yet, perhaps unsurprisingly in this political environment, it remains woefully under-explored.

Lastly, like so many moments in our post-Snowden world, the take-up of these kinds of programs gives us a chance to reflect upon our relationship with technology. Technology companies are now drawing on wider data sets than criminal statistics as inputs for predicting crime, in order to balance out the potential biases in crime statistics. Perhaps it is time to find more robust protections over data, technically and legally, including statutory protection over personal data and a requirement of informed consent from users before their data can be used.