X (formerly Twitter) takes swift action to take down deepfake nude images reported as copyright violations — but not when they're reported under "nonconsensual nudity," a study has found.
The paper, published by researchers at the University of Michigan and Florida International University, is an audit of X's reporting systems and hasn't yet been peer-reviewed, 404 Media reported. The researchers created five AI "personas" of young white women (to avoid introducing further variables of race, gender, and age) and then made 10 replica images of each, resulting in 50 images. To address the ethics of generating deepfake porn themselves, the researchers said these images underwent a "rigorous verification process" to ensure they didn't represent an existing individual.
They posted these images to X on 10 "poster accounts" they created, then used five additional X accounts to report the images. Twenty-five images were reported as Digital Millennium Copyright Act (DMCA) violations, and the other 25 were reported as nonconsensual nudity.
Researchers then waited three weeks to see the results of these reports. All 25 images reported for copyright were removed from X within 25 hours. In contrast, none of the images reported for nonconsensual nudity were removed within the three-week waiting period.
"Our findings reveal a significant disparity in the effectiveness of content removal processes between reports made under the DMCA and those made under X's internal nonconsensual nudity policy," the study states. "This highlights the need for stronger and directed regulations and protocols to protect victim-survivors."
X owner Elon Musk dissolved the platform's trust and safety council in 2022, but the site has recently opened up two dozen safety and cybersecurity positions in the U.S. Mashable has reached out to X for comment.
Earlier this year, WIRED found that victims of nonconsensual deepfake porn leveraged copyright laws to take down deepfakes on Google.
If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative's 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.