Twitter Failing To Deal With Child Sexual Abuse Material, Says Stanford Internet Observatory


The Twitter logo displayed on the exterior of Twitter headquarters. (Getty Images)

Twitter has failed to remove images of child sexual abuse in recent months, even after they were flagged as such, a new report will allege this week.

Stanford Internet Observatory researchers say the company failed to act on 40 items of child sexual abuse material (CSAM) identified between March and May of this year.

The researchers used Microsoft’s PhotoDNA to scan the material they collected for CSAM. PhotoDNA automatically hashes images and compares them against a database of known illegal images of minors maintained by the National Center for Missing & Exploited Children (NCMEC); the scan surfaced 40 matches.
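To make the mechanism concrete, here is a minimal sketch of the hash-and-compare approach that systems like PhotoDNA use. PhotoDNA itself is proprietary and access-restricted, so the sketch substitutes the open-source imagehash library for the real perceptual hash; the hash values, threshold, and file name are all hypothetical placeholders.

```python
# Minimal sketch of hash-and-compare detection. PhotoDNA is proprietary,
# so the open-source `imagehash` library (pip install imagehash pillow)
# stands in for the real perceptual hash. All hash values, the threshold,
# and the file name below are hypothetical placeholders.
import imagehash
from PIL import Image

# Placeholder list standing in for the database of hashes of known illegal
# images that NCMEC maintains for PhotoDNA partners.
KNOWN_HASHES = [
    imagehash.hex_to_hash("d1c4d1c4d1c4d1c4"),  # not real data
]

# Perceptual hashes tolerate re-encoding and resizing, so matching uses a
# Hamming-distance threshold rather than exact equality.
MATCH_THRESHOLD = 5

def is_known_match(path: str) -> bool:
    """Hash the image at `path` and compare it against the known-hash list."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)

print(is_known_match("sample.jpg"))  # True if within threshold of a known hash
```

The point of a perceptual hash, as opposed to a cryptographic one, is that it survives re-encoding, resizing, and small edits, which is what lets a platform match re-uploaded copies of known images against a hash list without ever storing the images themselves.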

The team reports that the investigation found problems with Twitter’s CSAM detection mechanisms, and that it flagged the issue to NCMEC in April, but the problem persisted.

Because the researchers had no Trust and Safety contact at Twitter, they approached a third-party intermediary to brief the company. Twitter was notified of the problem, and the issue appears to have been resolved by May 20.

Research such as this is about to become far harder, or at least far more expensive, following Elon Musk’s decision to charge $42,000 per month for the platform’s previously free API. The Stanford Internet Observatory has recently been forced to stop using the enterprise tier, and the free tier is write-only, making it useless for data collection. There are also concerns that researchers will be compelled to delete data collected under the previous agreement.
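For a sense of what that lost read access looks like in practice, here is a sketch using the third-party tweepy library against Twitter’s v2 recent-search endpoint; the bearer token and query are placeholders, and under the 2023 pricing this call sits behind a paid tier.

```python
# Sketch of the read access at stake, via the third-party tweepy library
# and Twitter's v2 recent-search endpoint. The bearer token and query are
# placeholders; under the pricing introduced in 2023 this call sits behind
# a paid tier, while the free tier is write-only.
import tweepy

client = tweepy.Client(bearer_token="YOUR_BEARER_TOKEN")

# Fetch a small sample of recent public tweets matching a query, the kind
# of bulk collection researchers previously did at no cost.
response = client.search_recent_tweets(query="from:TwitterSafety", max_results=10)
for tweet in response.data or []:
    print(tweet.id, tweet.text)
```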

The Stanford Internet Observatory has been a constant thorn in Twitter’s side since it highlighted disinformation spread on the platform during the 2020 U.S. presidential election; Musk has publicly derided the observatory’s work as propaganda.

The Wall Street Journal is expected to publish further findings from the research later this month.

The report notes that Twitter is neither the only platform dealing with CSAM nor the primary focus of the upcoming study, and the researchers credit Twitter for its work toward improving child safety.

In January, Twitter Safety claimed it was moving “faster than ever” to remove CSAM.

Several subsequent reports have shown that CSAM remains a problem on the platform. In February, The New York Times reported that, since Elon Musk’s takeover, Twitter has taken twice as long to remove CSAM flagged by child safety groups.

Twitter still responds to all press inquiries with an automated poop emoji.


