Taylor Swift and deepfake porn: What's the law?
By Emma Batha
Thursday, February 1, 2024, 07:58 AM
What’s the context?

Laws are lagging behind the proliferation of deepfake porn, leaving affected women, like Taylor Swift, with few options for justice

Deepfake porn images of Taylor Swift viewed by millions
AI tech fuels global spread of forged videos
Limited legislation worldwide hampers pursuit of justice

LONDON - The wildfire spread of fabricated porn images of pop megastar Taylor Swift has fuelled calls in the United States for strong legislation to tackle an explosion of deepfake sexual abuse facilitated by artificial intelligence.

The images - which grafted Swift's face onto another woman's body - attracted tens of millions of views on social media last week in what one lawyer said was the biggest such case to date.

There has been media speculation that the billionaire music icon could pursue legal action, but given the limited legislation around deepfake pornography, it is not clear how she might do so.

Here is a look at what laws are out there and why it is so hard to bring a case.

How big a problem is deepfake porn?

While politicians worry about the potential for fabricated images and videos to skew elections, the vast majority of deepfakes are non-consensual porn of women.

The first such clips were shared on the social media platform Reddit in 2017. Creating them required technical skill and numerous images of the targeted woman's face.

Today there are multiple apps that allow anyone to make deepfakes with just one photo and no expertise.

"I call it point-and-click violence against women. It's that easy now," said Adam Dodge, an attorney and founder of online safety company EndTAB.

The number of forged videos being created has skyrocketed.

More than 144,000 clips - over 14,000 hours of footage - were posted last year on the main sites for deepfake porn, according to independent analyst Genevieve Oh. This is more than the combined total for all preceding years.

Her research shows there have been more than 4.2 billion views on these sites since 2017.

Is deepfake porn illegal?

Only a handful of countries have laws around deepfake porn - although none uses that term. These include Australia, South Africa and Britain, where the new Online Safety Act carries a maximum penalty of two years in jail.

In South Korea, where K-pop stars have been frequent targets, the penalty for producing deepfake porn for profit is up to seven years in jail.

Colombia and Canada are among other countries considering legislation.

The United States has no federal law. About 10 states have introduced a patchwork of legislation, but some, like California, have not criminalised deepfake porn and only provide for civil lawsuits.

Last week, the White House described the Swift case as "alarming" and spokeswoman Karine Jean-Pierre said Congress should take legislative action.

On Tuesday, a bipartisan group of U.S. senators introduced a bill that would allow victims of deepfake porn images or videos to sue those producing and distributing them.

What are the hurdles in pursuing justice?

One of the biggest problems is discovering who made the clips, as creators use virtual private networks (VPNs) to hide their identities.

Even if the creator can be tracked down, they will likely be in a different jurisdiction, which may make it impossible to pursue them.

Persuading the police to take a case seriously is also a challenge.

Some women who have reported deepfake porn say officers have laughed off their complaints or told them to turn off their computers.

Given the global nature of the problem, rights organisation Equality Now said countries should cooperate and ensure that laws are accompanied by proper training of enforcement agencies.

Evidence is another hurdle. For most women the first priority is to get the content removed. This is no easy task, but if they do manage to get it taken down, they will lose proof it existed.

Amanda Manyame, Equality Now's digital rights expert, said that in order to bring a case, prosecutors would need the platform to issue a certificate proving the content was posted, but this cannot be done if it has been deleted.

Campaigners say tech companies should preserve evidence when they remove material from their platforms, but tech companies say this could breach privacy laws.

Has anyone ever brought a lawsuit over deepfake porn?

Lawyers told Context they did not know of any case in the U.S. or elsewhere.

Scarlett Johansson, a frequent target of deepfakes, looked into legal action a few years ago but decided not to proceed.

U.S. attorney Carrie Goldberg said she had helped a number of celebrities get deepfake material taken down but none had gone to court.

Civil lawsuits are costly, lengthy and can backfire if the defendant countersues.

A YouTuber, who goes by the name Gibi, told Context she had found it incredibly hard to bring a lawsuit against a man who had created deepfakes of her and other women, including students at the college he attended.

Although Gibi had amassed a mound of evidence, she thought it unlikely the case would proceed.

"It would cost just an unbelievable amount of time and money to most likely lose because of the lack of laws (around this)," she said.

Even if the U.S. does bring in legislation, lawyers said deepfake victims would still face problems pursuing justice.

"Not even Taylor Swift is going to be able to hold accountable someone creating and sharing videos in a jurisdiction beyond our reach," EndTAB's Dodge said.

What laws could Swift use?

Swift's press team could not immediately be reached for comment on whether she was considering legal action, but if she were to do so, her approach would depend on the state in which she is legally resident.

The singer has homes in Tennessee, New York and California, according to media reports.

Tennessee has no law around deepfake porn, but New York and California do.

Legal experts said targets of deepfake porn might also be able to employ other legislation including laws around copyright, defamation, invasion of privacy, image misappropriation and cyberstalking.

But these approaches have never been tested in court. In Swift's case, lawyers said the creators might argue that the images were not an infringement of copyright because they had been heavily altered.

Some commentators have suggested Swift could sue X and other social networks that enabled the proliferation of the images.

But lawyers said tech companies in the U.S. are largely protected from liability for information provided by a third party under Section 230 of the Communications Decency Act.

X has removed the images and even temporarily blocked searches for Taylor Swift to tackle their spread.

Attorney Goldberg said pursuing the individuals who made the images would not only be very hard but would do little to halt the abuse.

She said she would opt for suing the companies that created and sold the apps used to make the deepfakes, adding that it was totally foreseeable these apps would be used to harm people.

"If I were Taylor's lawyer ... I would be looking at the liability of those dangerous products," she said.

"We're not going to be able to go after a million people who shared and distributed (the image), but we can go after the products that were used to create it."