
Victims of Deepfake Porn Want Legislation to Provide Protection

By Azeezat Okunlola | Jun 9, 2023
Lauren suffered a nervous breakdown after watching a video of herself having sexual intercourse with a man she had never been intimate with. The body of the woman in the video may have been skinnier and paler than her own, but the face was unmistakably Lauren's.
 
The ways in which AI can be used to exploit people are expanding in tandem with its growing capabilities. In 2019, researchers found that "96% of deepfake victims are sexualized, and nearly all of them are women," as reported by Psychology Today.
 
Like many others, Lauren had fallen victim to deepfake porn, a growing genre of pornography in which a person's face is digitally placed onto another's body and shown engaging in sexual activity.
 
Deepfake porn videos can be made with simple face-swapping programmes found on most phones, so their makers don't need to be computer whizzes. The problem is likely to spread even faster as AI technology grows increasingly sophisticated.
 
Lauren, whose name has been changed for privacy, became the subject of deepfake porn after a man she met at the gym, Dan, asked her out on a date. She didn't give it much thought after telling him she wasn't interested, but the next time she saw Dan at the gym, she says things escalated. "He became aggressive and frustrated, saying I should just give him a chance," Lauren explained. "I should have told the gym management but I was embarrassed and just wanted to get out of there."
 
Lauren said that when she went back to the gym, a man followed her into the locker room and told her he needed to talk to her. "He told me Dan was showing people a video of us having sex," she revealed. "At first, I didn't believe him because, as I said, I'd never had sex with Dan, so it didn't seem possible."
 
Advances in deepfake and face-swapping technology, however, had made it possible. Lauren said Dan had created a video depicting the two of them engaging in sexual activity and was showing it to male patrons at the gym. A few days later, Dan uploaded the video to his Instagram story using Lauren's first and last name, boasting that he would make the fictitious video a reality.
 
Lauren said she reported Dan to the gym's administration, and he was subsequently banned from returning after having his membership cancelled. Still, she stopped going herself. "I didn't want to go knowing that a bunch of the guys had seen porn of me," she explained. "Even though it was fake, it still made me feel really ashamed and gross."
 
At a friend's urging, Lauren consulted an attorney, only to learn that a defamation claim would be without merit. She says she was also advised that she couldn't sue under revenge porn statutes: the material in question was not revenge porn but deepfake porn, which is not illegal under federal law because legislation has lagged behind the technology.
 
While several states, including Virginia and California, have approved legislation to crack down on deepfake pornography, victims may still be on their own at the federal level. For videos and photos to be termed image-based sexual assault, the breasts or genitals of the individual would have to be exposed, which is often not the case in deepfake pornography, according to Honza Cervenka, a lawyer who specialises in nonconsensual pornography, speaking to Refinery29. More sophisticated deepfake imagery "sort of falls through the cracks of many of the laws that were written with the original revenge pornography" in mind, Cervenka said.
 
The Iranian actress Uldouz Wallace was one of the celebrities whose private images were leaked online in the 2014 iCloud breach, along with Kirsten Dunst, Jennifer Lawrence, and Kate Upton. Wallace was 25 when her private photos were hacked, and she has since seen deepfake pornography created from her images. "It's several layers of different types of abuse," Wallace said of the deepfakes that followed the initial attack and disclosure. "Now I can't tell fake from real because there's so much of it."
 
Wallace recently joined the Sexual Violence Prevention Association (SVPA), whose mission is to "create a world where everyone can live free from the threat of sexual violence" through "advocacy, education, and community engagement." In an open letter, the SVPA is demanding that lawmakers outlaw deepfake porn. The letter states, "Right now, there are no [federal] laws banning the creation or distribution of deepfake porn. Until there are consequences, deepfake pornography will continue to increase."
 
SVPA's founder and CEO, Omny Miranda Martone, says the group is dedicated to promoting federal legislation against deepfake pornography and educating the public about its harms. "People are like, well, why do [victims] even care? It’s not real anyways. It’s not actually them," Martone said. "I don't think a lot of people fully understand the consent piece of this: that you don't have the person's consent and this is a violation of autonomy and privacy."
 
Rep. Joseph Morelle (D-N.Y.) has introduced the Preventing Deepfakes of Intimate Images Act to establish minimum protections against the proliferation of deepfake technology and other forms of artificial intelligence. "As artificial intelligence continues to evolve and permeate our society, it is critical that we take proactive steps to combat the spread of disinformation and protect individuals from compromising situations online," Morelle said. As of this writing, the bill has not passed the House of Representatives.