Sharing pornographic deepfakes to be illegal in England and Wales

 1 year ago
source link: https://www.bbc.com/news/technology-63669711
Image caption: The government says there are growing concerns about intimate images being shared without consent
By Monika Plaha & Joseph Lee
BBC News

A planned new law would make sharing pornographic deepfakes without consent a crime in England and Wales.

Tackling the rise in manipulated images, where a person's face is put on someone else's body, is part of a crackdown on the abuse of intimate pictures in the Online Safety Bill.

This law would also make it easier to charge people with sharing intimate photos without consent.

Prosecutors would no longer need to prove they intended to cause distress.

In some cases under the existing law, men have admitted sharing women's intimate images without consent, but have not been prosecuted because they said they did not intend to cause any harm.

The government says around one in 14 adults in England and Wales say they have been threatened with their intimate images being shared against their will.

It also says there are growing global concerns about technology being used to create fake pornographic images and video, with one website which creates nude images from clothed ones receiving 38 million visits last year.

In August, BBC Panorama exposed a network of men on the social media site Reddit who traded women's nudes online - including some which had been faked - as well as harassing them and threatening them.

The Law Commission said reporting such as this, along with campaigners' calls for stronger laws, helped to make a "compelling case" to government for reform.

It outlined recommendations earlier this year to ensure all examples of deliberately taking or sharing intimate images without consent are illegal.

The government said some of these - including specific laws against "downblousing", installing hidden cameras, and threatening to share someone's intimate images - would be dealt with in future legislation, but it did not offer any timescale.

Prime Minister Rishi Sunak had promised to criminalise downblousing, a term for taking photos down a woman's top without her consent, in this summer's Tory leadership contest. The move would bring it in line with an earlier law against "upskirting".

Announcing the measures, Justice Secretary Dominic Raab said the government accepted there were gaps in the law and it needed to adapt to the changing use of technology.

He said he wanted to "give women and girls the confidence that the justice system is on their side and will really come down like a ton of bricks on those who abuse or intimidate them".

Ayesha - not her real name - told Panorama in August how videos secretly filmed by a partner and faked images of her were being shared on Reddit, driving her to try to take her own life.

She told the BBC that the new announcement gave her hope that police could take action and the harassment she experienced would finally end.

"It will make a massive change to my life and to many other lives as well. We'll be actually able to live and to breathe in peace without having to be scared," she said.

Kate Isaacs, a deepfake porn victim and campaigner, successfully sought the removal of a deepfake video of her friend.

She then discovered that a Twitter user had created a fake video of her in retaliation.

She told BBC Radio 4's Today programme: "I went on Twitter, and the notification came through. I went onto the video, and it looked like me having sex. It was an interview I'd done with the BBC that they'd taken and superimposed onto this porn video.

"I had given advice to so many survivors on what you should do if you become a victim of this crime. But I just froze, and didn't follow any of my own advice. I think it was just too scary to comprehend and process at the time."

Ms Isaacs said a change in the law alone would not be enough to tackle the issue.

"I think it's deeply rooted in society at this point. We don't have time. The technology is running 1,000 miles quicker than we are."

Media caption: Georgie has been campaigning to change the law around intimate image abuse

One deepfake porn creator told the BBC earlier this year that the risk of prosecution could make him stop.

"If I could be traced online I would stop there and probably find another hobby," he said.

The Ministry of Justice also said it was looking at whether victims of intimate image abuse could be granted the same anonymity as victims of sexual offences, in line with the Law Commission's recommendations.

Prof Clare McGlynn at Durham University, an expert in image-based sexual abuse, told the BBC the changes were "a testament to the courage of women who have been speaking up", but added that it was "absolutely vital that anonymity is granted immediately".

"Victims tell us that not being able to be anonymous means they are more reluctant to report to the police and it means cases are more often dropped," she said.
