Protecting victims of non-consensual deepfake pornography
A new bill, the Take It Down Act, is spearheaded by U.S. Senator Amy Klobuchar. The legislation would finally give victims of non-consensual deepfake pornography federal legal recourse.
Non-consensual intimate images, or deepfake pornography, are becoming a significant issue for many Americans. A 2023 study from the U.S. cybersecurity firm Home Security Heroes reported over 950,000 deepfake videos online, a 550 percent increase from 2019. Everyday people are more likely to be affected by non-consensual deepfakes than public figures.
Molly Kelley, a woman from Otsego, Minnesota, is one of 85 women victimized by a close family friend, who used their likenesses in non-consensual deepfake pornography.
“When people hear about pornography made with deep fake technology, they think of high profile cases involving celebrities like Taylor Swift. I am proof that this can happen to anybody,” Kelley said. “My only crime was existing online and sharing photos on platforms like Instagram. The person who did this was not a stranger. I was not hacked. And my social media has never been public. He was someone I trusted and a close friend of over 20 years. The offender’s now ex-wife saw the images he created and notified us immediately.”
Several states, including Minnesota, have laws that offer some help for victims of pornographic deepfakes. However, Kelley says many victims live in states like Wisconsin, where there are few, if any, legal options.
“While Minnesota has laws against nonconsensual deepfake images, the response from law enforcement was inconsistent and it’s sometimes dismissive. The offender does not deny his actions and has faced no consequences to date,” Kelley said. “As for the victims in the other states, the message was painfully clear that there is no recourse.”
As artificial intelligence continues to develop, new safeguards are in the works to help protect people from having their likenesses used in non-consensual intimate images. Several U.S. senators are continuing to push for bipartisan legislation to protect victims of non-consensual deepfake pornography.
U.S. Senator Amy Klobuchar is spearheading the Take It Down Act, a federal bill that would criminalize publishing and distributing non-consensual deepfake pornography. The bill passed the U.S. Senate and is now moving to the U.S. House of Representatives for a vote. The legislation would also give victims the ability to have social media platforms take down deepfake images and videos within 48 hours of being notified.
“It is estimated that one in twelve American adults have had some type of image distributed without their consent. We are working to get it through the House,” Sen. Klobuchar said. “There are some opportunities at the end of the year here, where we could either have a separate vote. Maybe get it on another bill and that’s what we’re working to do right now.”
Cybersecurity experts recommend that parents limit how much their kids share photos, video, or audio online. If you see deepfake content, report it on the social media platform and then block the user spreading it.