If you’re on social media, you too can become the unwilling star of a pornographic video or image. Anyone with minimal technological skill can do this to you without your knowledge, as more than a dozen women in Nassau County learned to their grief.
Nassau County District Attorney Anne T. Donnelly announced that a Seaford man was sentenced today to six months’ jail time and 10 years’ probation with sex offender conditions for creating and sharing sexually explicit “deepfaked” images of more than a dozen underage women on a pornographic website. Patrick Carey, 22, pleaded guilty on December 12, 2022.
“Patrick Carey targeted these women, altering images he took from their social media accounts and the accounts of their family members and manipulating them using ‘deepfake’ technology to create pornography that he disseminated across the Internet,” said DA Donnelly.
According to the charges, from January 2021 through September 2021, Nassau County Police Department detectives from the Eighth Squad were contacted by approximately 11 women who had discovered images of themselves on a pornographic website. Many of the women indicated that the images, taken when they were in middle school and high school, had been lifted from their own social media accounts, altered to suggest they were engaging in sexual conduct, and posted to the website.
The images had been altered using what is known as a “deepfake,” convincingly superimposing the victims’ faces onto separate images of women engaging in sexual conduct. The posted images were also accompanied by personal identifying information, including full names, addresses, and telephone numbers, along with an invitation to receive pornography from other posters.
The implications of this case are noteworthy for two reasons, each a serious problem today. The first is that artificial intelligence is making frightening strides, and it is becoming increasingly difficult to distinguish what is real from what is fake. Making realistic fake videos, known as deepfakes, once required elaborate software to put one person’s face onto another’s body. Now many of the tools to create them are available to everyday consumers, even as smartphone apps, and often for little or no money, as Patrick Carey has shown.
Celebrities are being made to appear to endorse products through deepfakes. Joe Rogan, for example, found that he had been turned into the spokesperson for a “libido-boosting” coffee brand for men.
Rogan is not alone. An abundance of manipulated content has circulated on TikTok and elsewhere for years, though typically made with more homespun tricks like careful editing or swapping one audio clip for another. In one video on TikTok, Vice President Kamala Harris appeared to say that everyone hospitalized for Covid-19 was vaccinated. In fact, she said the patients were unvaccinated.
Graphika, a research firm that studies disinformation, spotted deepfakes of fictional news anchors that pro-China bot accounts distributed late last year, in the first known example of the technology’s being used for state-aligned influence campaigns.
But now, as A.I. advances, several new tools offer similar technology to everyday internet users, giving any one of us the chance to make convincing spoofs.
This new availability of the technology has some A.I. researchers worried about its dangers, and they have raised fresh concerns over whether social media companies are prepared to moderate the growing digital fakery. The question has become especially urgent as A.I. has made wondrous strides in the past year, and disinformation watchdogs are steeling themselves for a wave of digital fakes that could become ever harder to detect.
The second noteworthy aspect of the Patrick Carey case is that the technology is outpacing the legislative tools available to deal with violators. New York State currently has no criminal statutes addressing “deepfaked” or digitally manipulated images of a sexually explicit nature, leaving a significant loophole that can be exploited by child pornographers.
In response to this astonishing gap, DA Donnelly has proposed the “Digital Manipulation Protection Act.” “New York State currently lacks the adequate criminal statutes to protect victims of ‘deepfake’ pornography, both adults and children,” DA Donnelly said, adding that the act “would close the loopholes in the law that allow sexual predators and child pornographers to create sexually explicit digitally manipulated images and evade prosecution. We cannot protect New Yorkers without making these changes.”
Britt Paris, an assistant professor of library and information science at Rutgers University, encapsulated the significance of the problem: we have gone from “deepfakes” to “cheapfakes.” “It’s not just people with sophisticated computational technology and fairly sophisticated computational know-how. Instead, it’s a free app.”