Algorithmic policing technology
Mass data processing
Police are increasingly using algorithms as part of their law enforcement strategies. In the information age, the internet allows police to access and share vast and continually growing volumes of data. The material collected has fed databases and internal records management systems at every level of government. Advocates say such data can potentially be used to deter crime before it occurs. And if using that information allows police to be proactive, isn’t that preferable to merely reacting after a crime has occurred?
Predictive policing methods are not new, but they are perhaps not widely understood by the general public. According to a RAND Corporation report, this investigative tool, also referred to as forecasting, “is the application of analytical techniques – particularly quantitative techniques – to identify likely targets for police intervention and prevent crime or solve past crimes by making statistical predictions.”
The strategy allows law enforcement officials to “work more proactively with limited resources,” the study noted.
“The objective of these methods is to develop effective strategies that will prevent crime or make investigation efforts more effective,” according to the RAND study. “However, it must be understood at all levels that applying predictive policing methods is not equivalent to finding a crystal ball. For a policing strategy to be considered effective, it must produce tangible results.”
It is a technology in its infancy and, for that reason, it needs to be monitored closely, says Ottawa criminal defence lawyer Céline Dostaler.
“For the police, something like this can be helpful but they are going to have to work out a lot of kinks,” she says. “I think there’s going to be issues that will come out in court in the future. I would hope there would be guidelines, but I believe those will not be seen until after some trial and error. That is unfortunate because they are going to make many mistakes before getting it right.”
Data used to determine hotspots
The idea is simple enough. Computer programs help determine where to deploy police or to root out individuals who may be more likely to commit a crime or become a victim of one. In a recent study titled To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada [PDF], the Citizen Lab found “algorithmic policing methods often rely on the aggregation and analysis of massive volumes of data, such as personal information, communications data, biometric data, geolocation data, images, social media content, and policing data (such as statistics based on police arrests or criminal records).”
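To give a rough sense of the location-focused approach described above, a minimal sketch might aggregate past incident locations into grid cells and rank the cells by incident count. This is only an illustration of the general idea: the coordinates and grid size below are invented, and real deployed systems rely on far richer data and statistical models.

```python
from collections import Counter

# Hypothetical past incidents as (x, y) coordinates in kilometres.
incidents = [(0.2, 0.3), (0.4, 0.1), (2.1, 2.2),
             (2.3, 2.4), (2.2, 2.0), (5.0, 5.1)]

CELL_KM = 1.0  # grid cell size: 1 km x 1 km


def cell(point):
    """Map a coordinate to the grid cell that contains it."""
    x, y = point
    return (int(x // CELL_KM), int(y // CELL_KM))


# Count incidents per cell and rank cells from most to least active.
counts = Counter(cell(p) for p in incidents)
hotspots = counts.most_common()
print(hotspots)  # cells ranked by past incident count
```

Ranking cells by raw counts is the simplest possible "hotspot" model; it also illustrates the core criticism, since the output is driven entirely by where incidents were previously recorded.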
Opinion is split on the issue of algorithms used in policing. Critics are concerned about accountability and transparency and the risk of perpetuating racial profiling. Proponents say such methods can aid in more accurately predicting crime than traditional police methods.
Should we feel comforted or concerned?
Algorithmic surveillance technologies
As the Citizen Lab report notes, algorithmic policing technology can also provide police with “sophisticated” surveillance and monitoring functions, allowing for the automated collection of data, such as online images.
In its report, four types of algorithmic surveillance technologies were examined:
- Automated licence plate readers: Scan and identify the licence plate numbers of parked or moving vehicles;
- Social media surveillance software: Used to collect and analyze personal data from social media platforms. This information can be used to predict behavioural patterns, future activities, or relationships and connections;
- Social network analysis: Used to find social connections between people in particular networks and how they influence each other;
- Facial recognition: Biometric identification technology using algorithms to detect specific details about a face. A mathematical representation of those details is used in an attempt to find a match in a facial recognition database.
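The social network analysis item above can be made concrete with a minimal sketch: build a graph from pairwise links and score each person by degree centrality, i.e. the number of direct connections they have, which is one simple measure of position in a network. The names and links below are invented for illustration; actual analysis tools use many more signals and more sophisticated centrality measures.

```python
from collections import defaultdict

# Hypothetical social links (who is connected to whom).
links = [("ana", "ben"), ("ana", "cam"), ("ben", "cam"), ("cam", "dee")]

# Build an undirected graph as an adjacency map.
graph = defaultdict(set)
for a, b in links:
    graph[a].add(b)
    graph[b].add(a)

# Degree centrality: number of direct connections per person.
degree = {person: len(neighbours) for person, neighbours in graph.items()}
print(sorted(degree.items(), key=lambda kv: -kv[1]))
```

Note that in this sketch a single link ("friending" one person) is enough to place someone in the graph, which is exactly the concern about guilt by association raised later in this article.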
How widely known is the practice of algorithmic policing?
According to Citizen Lab, the “extent to which Canadian law enforcement agencies have begun to use algorithmic policing technologies is poorly understood.” However, the report notes that interest is growing among police forces across the country.
“The research conducted … found that multiple law enforcement agencies across Canada have started to use, procure, develop, or test a variety of algorithmic policing methods,” Citizen Lab found. “These programs include using and developing ‘predictive policing’ technologies – one location-focused and one person-focused – and using algorithmic surveillance tools, such as facial recognition technology and social network analysis.”
In Vancouver, police launched a program in 2016 to predict property crime. According to Global News, officials reported a 20 per cent decline in burglaries in the first six months.
However, the officer leading the program acknowledged that risks go along with the benefits, saying that the force is aware predictive policing can disproportionately target marginalized communities, Global reported.
For Dostaler, there is also the issue of predictive policing blurring the lines between the guilty and the innocent.
“It can be quite scary. We have to be careful when things are posted on social media. The police may be able to use that to determine whether you might be part of a group’s behaviour,” she says.
Dostaler explains that people add new contacts or may have old contacts on their social media platforms that draw police interest. However, a person should not be subjected to attention from law enforcement just because they friended someone on Facebook.
“It doesn’t mean they hang out with this person on a regular basis,” she says. “I would want to make sure this predictive analysis doesn’t unnecessarily target someone. It would be very frightening if, just because you knew a certain person, you would be under supervision as well and police might analyze you to predict whether you could be part of something.
“People have to start thinking about their social media platforms, who their friends are and what they might be posting,” Dostaler adds.
The Saskatoon Police Service, meanwhile, is developing an algorithm intended to help identify those at risk of going missing. Citizen Lab also reported the Toronto Police Service may be interested in developing a “location-focused algorithmic program” in the future.
How the United States has fared
Predictive policing has been employed with mixed results in the United States. According to the Brennan Center for Justice (BCJ), a nonpartisan law and policy institute, one of the earliest to get on board was the Los Angeles Police Department (LAPD), which began exploring predictive policing strategies in 2008.
One of the initiatives, which identified areas where gun violence was believed to be most likely to occur, was scrapped in 2019 when an audit found “the data program lacked oversight and that officers used inconsistent criteria to label people who were likely to commit violent crimes,” the Los Angeles Times reported.
In Chicago, law enforcement officers “ran one of the biggest person-based predictive policing programs” in the U.S., according to the BCJ. However, the Chicago Police Department’s “heat list” led to a legal battle which revealed that the list “far from being narrowly targeted, included every single person arrested or fingerprinted in Chicago since 2013,” according to the Brennan report. The program was scrapped at the beginning of 2020.
In Detroit, a man was falsely charged with theft in 2018 after being mistakenly identified by facial recognition software, in what the New York Times said was likely the first known case of its kind. The arrest has added fuel to the debate over racism in law enforcement.
In Part Two, we will discuss algorithmic surveillance safeguards.