Last week, the firm was quoted in a Bustle story entitled “A Fake Porn App Lets People Put Celebrities’ Faces In Videos & Here’s Why It’s So Dangerous.” The article discussed the legal implications of a phenomenon revealed in a recent Motherboard article: new applications that use artificial intelligence to swap celebrity faces onto porn performers’ bodies. In the piece, I addressed the question of whether revenge porn laws would capture the use of a person’s image on a third party’s body.
Bennet Kelley, founder of Internet Law Center, tells Bustle that with revenge porn already being a widespread issue online, the idea of an open-source technology like this app could present a whole lot of issues. At present, our current laws on revenge porn may not even be able to handle such a problem. . . .
“[The video] is not you but it is you. It’s a hybrid harm but it’s a real harm,” Kelley says. “It’s not a photo of your genitalia. It’s not a photo of your breasts. But it’s your face. So, I don’t think [present] legislation contemplates that. You could argue that it’s the same harm [as revenge porn].”
At this moment, there are no laws that specifically target machine-created fake videos, but scientists are trying to spot them with reverse-imaging and video-splicing. Kelley says that maybe down the line, state laws against revenge porn “could be expanded” to include such fake videos.
In the article, I also addressed the need to adjust privacy settings and to be prudent about what is posted on social media, and recommended using alerts to ensure you learn about uses of your name.
The full article is here.