Mock crime prediction tool profiles MEPs as potential criminals

Several MEPs have been profiled as “at risk” of criminal behaviour after using a mock crime prediction tool created by the non-governmental organisation Fair Trials to highlight the discriminatory and unfair nature of predictive policing systems.

The online tool – which asks users questions designed to draw out the kinds of information police across Europe are using to “predict” whether someone will commit a crime – was launched on 31 January 2023, and has been used by MEPs and members of the public from across the European Union (EU).

Predictive policing systems can be used to profile both individuals and locations and make “predictions” about criminality. Fair Trials said these predictions are determined by a range of data about education, family life and background, engagement with public services such as welfare benefits and housing, ethnicity, nationality, credit scores, and whether someone has previously been in contact with police, even as a victim.
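
To illustrate the mechanism, such profiling typically reduces to a weighted score over background attributes. The sketch below is hypothetical: it is not Fair Trials’ published methodology or any real police system, and every feature name, weight and threshold in it is invented.

```python
# Illustrative sketch only: a naive weighted risk score over the kinds of
# background features described in the article. Feature names, weights and
# thresholds are hypothetical, not taken from any real police system.

RISK_WEIGHTS = {
    "prior_police_contact": 2.0,   # includes contact as a victim or witness
    "receives_welfare_benefits": 1.0,
    "social_housing": 1.0,
    "low_credit_score": 1.5,
    "left_school_early": 1.0,
}

def risk_score(profile: dict) -> str:
    """Sum the weights of every feature present in the profile and
    bucket the total into a risk label."""
    score = sum(w for key, w in RISK_WEIGHTS.items() if profile.get(key))
    if score >= 4:
        return "high risk"
    if score >= 2:
        return "medium risk"
    return "low risk"

# Someone flagged purely for poverty-correlated attributes:
print(risk_score({"receives_welfare_benefits": True,
                  "social_housing": True,
                  "low_credit_score": True}))  # -> "medium risk" (score 3.5)
```

Note that nothing in the example profile describes a criminal act, yet the person is still labelled “medium risk”, which is precisely the pattern the Fair Trials tool is designed to expose.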

People profiled as “at risk” on the basis of this information face a range of serious consequences, from being subject to regular stop and search to having their children removed by social services. The profiles and predictions are also used to inform pre-trial detention, prosecution, sentencing and probation decisions.

Fair Trials said that, as a result of the tool’s use, more than 1,000 emails were sent to MEPs by members of the public calling on them to ban predictive policing systems in the EU’s upcoming Artificial Intelligence (AI) Act.

“Our interactive predictive tool shows just how unjust and discriminatory these systems are. It might seem unbelievable that law enforcement and criminal justice authorities are making predictions about criminality based on people’s backgrounds, class, ethnicity and associations, but that is the reality of what is happening in the EU,” said Griff Ferris, senior legal and policy officer at Fair Trials.

“There’s widespread evidence that these predictive policing and criminal justice systems lead to injustice, reinforce discrimination and undermine our rights. The only way to protect people and their rights across Europe is to prohibit these criminal prediction and profiling systems, against people and places.”

Socialists and Democrats (S&D) MEP Petar Vitanov, who was profiled by the mock tool as having a “medium risk” of committing a crime in the future, said there is no place for such unreliable, biased and unfair systems in the EU.

“I have never thought that we will live in a sci-fi dystopia where machines will ‘predict’ if we are about to commit a crime or not,” he said. “I grew up in a low-income neighbourhood, in a poor Eastern European country, and the algorithm profiled me as a potential criminal.”

Renew MEP and member of the legal affairs committee, Karen Melchior, who was profiled as “at risk”, added that the automated judging of people’s behaviour will lead to discrimination and irrevocably alter people’s lives.

“We cannot allow funds to be misplaced from proper police work and well-funded as well as independent courts to biased and random technology,” she said. “The promised efficiency will be lost in the clean-up after wrong decisions – when we catch them. Worst of all, we risk destroying the lives of innocent people. The use of predictive policing mechanisms must be banned.”

Other MEPs profiled as “at risk” by the tool, and who subsequently expressed their support for banning predictive policing systems, include Cornelia Ernst, Tiemo Wölken, Birgit Sippel, Kim van Sparrentak, Tineke Strik and Monica Semedo.

Civil society groups such as Fair Trials, European Digital Rights (EDRi) and others have long argued that because the underlying information used in predictive policing systems is drawn from data sets that reflect the historical structural biases and inequalities in society, the use of such systems will result in racialised people and communities being disproportionately targeted for surveillance, questioning, detaining and, ultimately, imprisonment by police.

In March 2020, evidence submitted to the United Nations (UN) by the UK’s Equalities and Human Rights Commission (EHRC) said the use of predictive policing could replicate and magnify “patterns of discrimination in policing, while lending legitimacy to biased processes”.

Lawmakers have come to similar conclusions. In the UK, for example, a House of Lords inquiry into the police use of algorithmic technologies noted that predictive policing tools tend to produce a “vicious circle” and “entrench pre-existing patterns of discrimination” because they direct police patrols to low-income, already over-policed areas based on historic arrest data.

“Due to increased police presence, it is likely that a higher proportion of the crimes committed in those areas will be detected than in those areas which are not over-policed. The data will reflect this increased detection rate as an increased crime rate, which will be fed into the tool and embed itself into the next set of predictions,” it said.
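
A minimal simulation makes the inquiry’s point concrete. In this hypothetical model (all figures are invented and the dynamics deliberately simplistic), two areas have identical underlying offending, but one starts with more recorded crime because of historic over-policing:

```python
# Toy model of the feedback loop the Lords inquiry describes. Both areas
# have the SAME number of true offences; only the historic arrest data
# differs. Patrols follow recorded crime, and detections scale with
# patrol presence, so recorded crime reflects policing, not offending.
# All figures are invented for illustration.

TRUE_OFFENCES = {"A": 100, "B": 100}   # identical underlying crime
recorded = {"A": 60, "B": 40}          # biased historic arrest data

for period in range(1, 6):
    total = sum(recorded.values())
    # The "prediction": allocate patrols in proportion to past recorded crime.
    patrol_share = {area: recorded[area] / total for area in recorded}
    # Detections rise with patrol presence; recorded crime = detections.
    recorded = {area: round(TRUE_OFFENCES[area] * patrol_share[area])
                for area in recorded}
    print(f"period {period}: recorded crime = {recorded}")
```

Despite identical offending, the initial 60/40 split reproduces itself in every period: the tool never observes the crime its patrols are not sent to find, so the bias in the historic data is locked in rather than corrected.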

Although the two MEPs in charge of overseeing and amending the EU’s AI Act said in April 2022 that the use of AI-powered predictive policing tools to make “individualised risk assessments” should be prohibited on the basis that it “violates human dignity and the presumption of innocence”, the proposed prohibition extended only to those individualised assessments, and not to place-based systems used to profile areas or locations.

Sarah Chander, a senior policy adviser at EDRi, told Computer Weekly at the time that profiling neighbourhoods for the risk of crime has a similar effect to profiling individuals, in that it “can increase experiences of discriminatory policing for racialised and poor communities”.

Civil society groups have therefore called – on multiple occasions – for predictive policing systems to be completely banned.

While amendments are constantly being made to the AI Act, the limited ban on predictive policing systems has not yet been extended to place-based systems. MEPs’ next vote on the act is due to take place at the end of March 2023, with the exact date yet to be confirmed.
