Twitter Blocks Searches For Taylor Swift After Deepfake Nudes Appear On The Site
Most of the time when I write about Twitter, I talk about what a dumpster fire the platform is. But today, I'm doing something different: I'm going to give it some praise. Twitter has blocked searches for Taylor Swift after deepfake nudes, meaning fake nude pictures, started to flood the site. The BBC has the details:
In a statement to the BBC, X’s head of business operations Joe Benarroch said it was a “temporary action” to prioritise safety.
When searching for Swift on the site, a message appears that says: “Something went wrong. Try reloading.”
Fake graphic images of the singer appeared on the site earlier this week.
Some went viral and were viewed millions of times, prompting alarm from US officials and fans of the singer.
Posts and accounts sharing the fake images were flagged by her fans, who populated the platform with real images and videos of her, using the words “protect Taylor Swift”.
The photos prompted X, formerly Twitter, to release a statement on Friday, saying that posting non-consensual nudity on the platform is “strictly prohibited”.
While this is one of the few good moves that Twitter has made lately, one wonders what would have happened if these images were not of Taylor Swift, but instead of a woman who doesn't have millions of fans to flag them on her behalf so that Twitter would take action. Would Twitter have acted to deal with the issue? I don't know. But it's a question worth asking. In any case, this highlights why strict laws against this sort of thing need to be enacted everywhere.