Police work used to consist of data collection and traditional intelligence gathering. Now, it involves greater use of big data, face recognition, surveillance and other data-driven tools.
Vasilis Galis from the IT University of Copenhagen is the project manager for CUPP (“Critical Understanding of Predictive Policing”), a large international project researching aspects of policing such as the use of big data in investigations and preventive work – also called “predictive policing” – in the Nordic Region, the United Kingdom and the Baltic states.
The project, funded by NordForsk, aims to shed light on the human and social consequences of police forces’ increasing use of technological solutions, such as face recognition, CCTV, big data and other digital tools. According to Galis, the police are no longer limited to traditional intelligence work and database analysis.
“Recently, police work has changed from data collection after a crime has been committed to more preventive methods. The police no longer simply react. Instead, they now run simulations and map out a range of scenarios in an attempt to predict future events,” he says.
A new category of police officers
Galis explains that recent developments in the ability to predict future actions are rooted in the digitalisation of society as a whole. He believes it is vital to investigate the ramifications of this new technology, the types of data being collected and how this affects the whole foundation of police work.
“Digitalisation has created an entirely new category of police officers. Some still operate on the street in the traditional way, but we now suddenly have a new breed of analysts sitting at computers, using data to produce reports and forecasts. We want to understand the transition from traditional surveillance methods to the new forms in the digital landscape. This will also have consequences for the courts and how they approach these new forms of data collection. This is a largely unregulated field, which also raises questions about the nature of law enforcement,” he explains.
Built-in algorithmic bias
Galis points out that the research project is also relevant in the light of the growing debate concerning the underlying values and different forms of “bias” inherent in the algorithms that drive commonly used software. These algorithms are in no way neutral formulas that produce objective results. They are made by human beings. As such, researchers need to study how the algorithms are designed and identify the values that underpin them.
“The internet is a space for all types of activity and is, therefore, a mirror of society. Everyone produces data on a daily basis, e.g., via social media, various types of apps and consumer services such as Amazon and Google. Their data is not necessarily linked to criminal acts, but each individual user leaves digital traces that can be used to profile them. The Cambridge Analytica scandal and the manipulation of previous US elections revealed just how little data is required to predict human behaviour,” he elaborates.
“The algorithms reproduce stereotypes and also contain racial bias. We want to understand what happens when these algorithms are used in an area as sensitive as police work. In other words, we want to create transparency and an understanding of how the data is used. The novel aspect of our research consists of exploring the implications of this new ‘predictive policing’ phenomenon – not only for police work but for society as a whole.”
“In other words, we watch the watchers and study the tools they use. This development in law enforcement will be a key element in the state’s handling of hybrid threats and cyber security at an overarching level.”
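The feedback loop the researchers describe can be made concrete with a small, purely hypothetical sketch (ours, not the CUPP project's): a toy patrol allocator that assigns officers in proportion to past arrest counts. If the historical record over-represents one district, the model keeps directing patrols there, which generates more recorded incidents and deepens the skew.

```python
# Hypothetical illustration of algorithmic bias reproduction
# (not taken from the CUPP project or any real police system).

# Biased historical record: arrest counts reflect past patrol
# intensity, not necessarily true crime rates.
historical_arrests = {"district_a": 80, "district_b": 20}

def allocate_patrols(arrests, total_patrols=10):
    """Assign patrols proportionally to past arrest counts."""
    total = sum(arrests.values())
    return {d: round(total_patrols * n / total) for d, n in arrests.items()}

patrols = allocate_patrols(historical_arrests)
# More patrols in district_a -> more observed incidents there ->
# an even more skewed dataset next round. The algorithm is not
# neutral: it faithfully reproduces the bias in its input data.
print(patrols)  # {'district_a': 8, 'district_b': 2}
```

The point of the sketch is that no step in the code is malicious or even statistically wrong; the bias enters entirely through the historical data, which is why the project's focus on how data is collected and used matters.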
International partnership offers broader perspectives
The Nordic countries’ social and welfare state models are world-renowned. By looking at police work in relation to “big data,” the CUPP project focuses on one of the welfare state’s most important sectors.
“We want to shed light on an area that is currently characterised by a lack of transparency and knowledge. We aim to translate difficult procedures and political processes into more accessible scientific knowledge and also build bridges between civil society and digitalisation,” Galis continues.
“The United Kingdom and the Baltic states represent different kinds of state and welfare models. The UK has a long tradition of surveillance based on a strongly capitalist state. Former Soviet republics such as Estonia and Latvia also have long traditions of surveilling people, albeit based on a completely different paradigm. Collaboration between researchers from the Nordic Region, the United Kingdom and the Baltic states affords a much broader perspective and a greater degree of methodological and empirical diversity.
“My hope is to create a more extensive network of researchers studying police services from a critical vantage point,” he concludes.
FACTS ABOUT CUPP
The research project CUPP (Critical Understanding of Predictive Policing) is a collaboration between the following project partners:
- IT University of Copenhagen (Denmark)
- Union of IT Professionals (PROSA) (Denmark)
- University of St Andrews (United Kingdom)
- University of Latvia (Latvia)
- Baltic Studies Centre (Latvia)
- Tallinn University of Technology (Estonia)
- University of Oslo (Norway)
CUPP is a three-year research project and has received a grant of €1 million from NordForsk.