Automating Society Report 2020

Story

Swiss police automated crime predictions but have little to show for it

A review of three automated systems used by the Swiss police and judiciary reveals serious issues. Their real-world effects are impossible to assess due to a lack of transparency.

Predicting burglaries

Precobs has been used in Switzerland since 2013. The tool is sold by a German company that makes no secret of its debt to “Minority Report”, a science-fiction story in which “precogs” predict crimes before they occur. (The plot revolves around the frequent failures of the precogs and the subsequent cover-up by the police.)

The system tries to predict burglaries from past data, based on the assumption that burglars often operate in small areas. If a cluster of burglaries is detected in a neighborhood, the police should patrol that neighborhood more often to put an end to it, the theory goes.
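In pseudo-code terms, this amounts to counting recent reports per area and flagging clusters. The sketch below is a minimal illustration with made-up neighborhoods, time window, and trigger threshold; Precobs's actual parameters and data are not public.

```python
from datetime import date

# Hypothetical burglary reports: (date, neighborhood). Stand-in data only.
reports = [
    (date(2020, 3, 1), "Wiedikon"),
    (date(2020, 3, 3), "Wiedikon"),
    (date(2020, 3, 5), "Wiedikon"),
    (date(2020, 3, 4), "Altstetten"),
]

WINDOW_DAYS = 7   # assumed look-back window
THRESHOLD = 3     # assumed cluster size that triggers extra patrols

def near_repeat_alerts(reports, today):
    """Flag neighborhoods with a recent cluster of burglaries."""
    counts = {}
    for day, area in reports:
        if 0 <= (today - day).days <= WINDOW_DAYS:
            counts[area] = counts.get(area, 0) + 1
    return [area for area, n in counts.items() if n >= THRESHOLD]

print(near_repeat_alerts(reports, date(2020, 3, 6)))  # ['Wiedikon']
```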

Three cantons use Precobs: Zürich, Aargau, and Basel-Land, together accounting for almost a third of the Swiss population. Burglaries have fallen dramatically since the mid-2010s. The Aargau police even complained in April 2020 that there were now too few burglaries for Precobs to work with.

But burglaries fell in every Swiss canton, and the three that use Precobs are nowhere near the best performers. Between 2012-2014 (when burglaries were at their peak) and 2017-2019 (when Precobs was in use in the three cantons), the number of burglaries decreased everywhere, not just where the software was deployed. The decrease in Zürich and Aargau was smaller than the national average of -44%, making it unlikely that Precobs had much of an actual effect on burglaries.

A 2019 report by the University of Hamburg could not find any evidence of the efficacy of predictive policing solutions, including Precobs. No public documents detail how much Swiss authorities have spent on the system, but Munich paid 100,000 euros to install Precobs (operating costs not included).

Predicting violence against women

Six cantons (Glarus, Luzern, Schaffhausen, Solothurn, Thurgau, and Zürich) use the Dyrias-Intimpartner system to predict the likelihood that a person will assault their intimate partner. Dyrias stands for “dynamic system for the analysis of risk” and is also built and sold by a German company.

According to a 2018 report by Swiss public-service broadcaster SRF, Dyrias requires police officers to answer 39 “yes” or “no” questions about a suspect. The tool then outputs a score on a scale from one (harmless) to five (dangerous). While the total number of persons tested by the tool is unknown, a tally by SRF showed that 3,000 individuals were labeled “dangerous” in 2018 (though not all of these labels were necessarily produced by Dyrias).
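The questionnaire-to-score step can be pictured as a simple tally mapped onto bands. Everything below is a hypothetical stand-in, since the real questions, weights, and cut-offs used by Dyrias are not public.

```python
def risk_score(answers):
    """Map 39 yes/no answers to a score from 1 (harmless) to 5 (dangerous).
    Hypothetical: assumes each "yes" adds one point and uses made-up cut-offs."""
    yes_count = sum(answers)
    cutoffs = [5, 12, 20, 28]          # assumed boundaries between the five levels
    return 1 + sum(yes_count > c for c in cutoffs)

answers = [i % 3 == 0 for i in range(39)]   # example input with 13 "yes" answers
print(risk_score(answers))                  # 3
```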

The vendor of Dyrias claims that the software correctly identifies eight out of ten potentially dangerous individuals. However, another study looked at the false positives, individuals labeled dangerous who were in fact harmless, and found that six out of ten people flagged by the software should have been labeled harmless. In other words, Dyrias achieves its good detection rate only by assigning the “dangerous” label liberally. (The company behind Dyrias disputes these results.)
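The two figures are not contradictory; they describe different ratios. The vendor's claim is about sensitivity (the share of truly dangerous people who get flagged), while the study's finding is about the share of false alarms among those flagged. A worked example with made-up but mutually consistent numbers:

```python
# Hypothetical cohort in which both quoted figures hold at the same time.
actually_dangerous = 50   # truly dangerous people in the cohort
caught = 40               # 8 out of 10 of them are flagged (sensitivity 80%)
false_alarms = 60         # harmless people who are flagged anyway

flagged_total = caught + false_alarms
print(f"sensitivity: {caught / actually_dangerous:.0%}")                  # 80%, the vendor's claim
print(f"false alarms among flagged: {false_alarms / flagged_total:.0%}")  # 60%, the study's finding
```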

Even if the performance of the system were improved, its effects would still be impossible to assess. Justyna Gospodinov, the co-director of BIF-Frauenberatung, an organization that supports victims of domestic violence, told AlgorithmWatch that, while cooperation with the police was improving and systematic risk assessment was a good thing, she could not say anything about Dyrias. “When we take in a new case, we do not know whether the software was used or not,” she said.

Predicting recidivism

Since 2018, all justice authorities in German-speaking cantons use ROS (an acronym for “Risikoorientierter Sanktionenvollzug” or risk-oriented execution of prison sentences). The tool labels prisoners ‘A’ when they have no risk of recidivism, ‘B’ when they could commit a new offense, or ‘C’ when they could commit a violent crime. Prisoners can be tested several times, but subsequent tests will only allow them to move from category A to B or C and not the other way around.
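The one-way movement between categories can be expressed as a simple ratchet rule, sketched here under the assumption that a later, milder assessment is simply discarded:

```python
ORDER = {"A": 0, "B": 1, "C": 2}   # risk categories, from lowest to highest

def update_category(current, new_assessment):
    """Later tests can only move a prisoner from A to B or C, never back down."""
    return max(current, new_assessment, key=ORDER.get)

print(update_category("B", "A"))   # 'B', the milder result is ignored
print(update_category("B", "C"))   # 'C'
```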

A report by SRF, based on a 2013 study by the University of Zürich, revealed that only a quarter of the prisoners in category C committed further crimes upon release (meaning three out of four of those flagged as dangerous were false positives) and that only one in five of those who did commit further crimes had been placed in category C (meaning four out of five reoffenders were missed). A new version of the tool was released in 2017 but has yet to be reviewed.
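The two percentages describe different slices of the same confusion matrix. Below is a hypothetical release cohort consistent with both figures; the actual numbers from the Zürich study are not reproduced here.

```python
labeled_C = 100           # prisoners classified as likely violent reoffenders
reoffended_in_C = 25      # a quarter of category C actually reoffended
reoffenders_total = 125   # so category C captures only 1 in 5 of all reoffenders

false_alarm_share = (labeled_C - reoffended_in_C) / labeled_C
missed_share = (reoffenders_total - reoffended_in_C) / reoffenders_total

print(f"category C members who did not reoffend: {false_alarm_share:.0%}")  # 75%
print(f"reoffenders who were not in category C:  {missed_share:.0%}")       # 80%
```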

The French- and Italian-speaking cantons are working on an alternative to ROS, which should be deployed in 2022. While it keeps the same three categories, the new tool will only be used in conjunction with prisoner interviews, which will themselves be rated.

Mission: Impossible

Social scientists are sometimes very successful at predicting aggregate outcomes. In 2010, the Swiss statistics office predicted that the resident population of Switzerland would reach 8.5 million by 2020 (the actual 2020 population is 8.6 million). But no scientist would try to predict the date a given individual will die: life is simply too complicated.

In this regard, demography is no different from criminology. Despite claims to the contrary by commercial vendors, predicting individual behavior is likely to be impossible. In 2017, a group of scientists tried to settle the issue. They asked 160 teams of researchers to predict school performance, the likelihood of being evicted from home, and four other outcomes for thousands of teenagers, based on detailed data collected since birth, with thousands of data points available for each child. The results, published in April 2020, are humbling. Not only was no team able to predict any outcome with meaningful accuracy, but those that used artificial intelligence performed no better than teams that used only a few variables and basic statistical models.

Moritz Büchi, a senior researcher at the University of Zürich, is the only Swiss scholar who took part in this experiment. In an email to AlgorithmWatch, he wrote that while crime was not part of the outcomes under scrutiny, the insights gained from the experiment probably apply to predictions of criminality. This does not mean that predictions should not be attempted, Mr. Büchi wrote. But turning simulations into ready-to-use tools gives them a “cloak of objectivity” which can discourage critical thinking, with potentially devastating consequences for the people whose future is predicted.

Precobs, which does not attempt to predict the behavior of specific individuals, does not fall into the same category, he added. More policing could have a deterrent effect on criminals. However, the detection of hotspots relies on historical data. This might lead to the over-policing of communities where crime was reported in the past, in a self-reinforcing feedback loop.
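That feedback loop can be illustrated with a toy simulation: if patrols follow recorded crime, and crime only enters the statistics where patrols are present, a district that merely starts with more records keeps pulling ahead even when underlying crime is identical. This is an illustrative model with assumed dynamics, not a description of how Precobs allocates patrols.

```python
true_crime = {"A": 50, "B": 50}    # assumed identical underlying crime per year
recorded = {"A": 60, "B": 40}      # district A merely starts with more recorded reports

for year in range(1, 6):
    # Send most patrols to the district with the higher recorded count (70/30 split).
    top = max(recorded, key=recorded.get)
    patrol_share = {d: (0.7 if d == top else 0.3) for d in recorded}
    # Assumption: crime is only recorded where a patrol is present to observe it.
    for d in recorded:
        recorded[d] += round(true_crime[d] * patrol_share[d])
    print(year, recorded)
# District A's recorded lead grows every year despite equal true crime.
```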

Chilling effects


Despite their patchy track record and the evidence that predicting individual outcomes is close to impossible, Swiss law enforcement authorities keep using tools that claim to do just that. Their popularity is due in part to their opacity. Very little public information exists on Precobs, Dyrias, and ROS. The people impacted, who are overwhelmingly poor, rarely have the financial resources needed to challenge automated systems, as their lawyers usually focus on verifying the basic facts alleged by the prosecution.

Timo Grossenbacher, the journalist who investigated ROS and Dyrias for SRF in 2018, told AlgorithmWatch that finding people affected by these systems was “almost impossible”. Not for lack of cases: ROS alone is used on thousands of inmates each year. Instead, the systems' opacity prevents watchdogs from shedding light on algorithmic policing.

Without more transparency, these systems could have a “chilling effect” on Swiss society, according to Mr. Büchi of the University of Zürich. “These systems could deter people from exercising their rights and could lead them to modify their behavior,” he wrote. “This is a form of anticipatory obedience. Being aware of the possibility of getting (unjustly) caught by these algorithms, people may tend to increase conformity with perceived societal norms. Self-expression and alternative lifestyles could be suppressed.”