UK plans to make the sharing of non-consensual deepfake porn illegal

The UK government says it plans to make the sharing of non-consensual pornographic deepfakes illegal, with offenders facing “potential time behind bars.”

The new offense is set to be added to the long-awaited and controversial Online Safety Bill, a mammoth piece of legislation that will rewrite the UK’s rules for policing harmful internet content. The government announced this morning that deepfakes would be covered in the legislation, along with strengthened laws against “downblousing” (taking explicit images down a woman’s top without consent). The bill’s passage was delayed this year by political chaos, but the UK government now plans to return it to parliament in December for further debate.

1 in 14 adults in England and Wales have been threatened with sharing intimate images

In its announcement, the government says that non-consensual deepfakes will be tackled as part of a wider initiative to stamp out revenge porn and other forms of “intimate image abuse.” It defines problematic deepfakes as “manufactured intimate images” shared without consent but does not offer a further definition of the technology involved. Current statistics suggest that 1 in 14 adults in England and Wales have been threatened with having intimate images of them shared without their consent.

Pornographic deepfakes made using machine learning techniques began to appear on the web toward the end of 2017. They quickly spread across internet forums like Reddit, with users making custom pornographic clips featuring the likenesses of celebrities and women they knew. Although some mainstream forums and porn sites banned deepfakes in response, the technology remains well-rooted in the lesser-seen corners of the internet. There are many cases of such images being used to humiliate, abuse, and intimidate women, but globally there is little legislation that outlaws their dissemination. In the US, only three states (Virginia, Texas, and California) have laws referencing deepfakes at all.

Although the first wave of pornographic deepfakes relied on AI methods that pasted targets’ faces onto existing video clips, newer technology — specifically, text-to-image AI models — makes creating NSFW deepfake images much easier and faster. These tools also blur the line between cartoonish and photorealistic depictions, which may further complicate any legislation designed to tackle the issue. At what point does a deepfake take on a specific person’s likeness?

Regarding the UK’s plans to tackle this threat, Professor Penney Lewis of the Law Commission (an independent body designed to review the laws of England and Wales) said she was pleased about the government’s planned changes.

“Taking or sharing intimate images of a person without their consent can inflict lasting damage,” said Lewis, who recommended the addition, according to The Guardian. “A new set of offences will capture a wider range of abusive behaviours, ensuring that more perpetrators of these deeply harmful acts face prosecution.”

Many advocacy groups, including those working on child safety, have welcomed the passage of the Online Safety Bill, saying the legislation is needed to better protect the UK’s internet users. However, the bill has also been criticized for its threat to free speech, particularly its provision targeting “legal but harmful” content.
