Mastodon, the decentralized network seen as a viable alternative to Twitter, is riddled with child sexual abuse material (CSAM), according to a new study from Stanford’s Internet Observatory, via The Washington Post. In just two days, the researchers found 112 instances of known CSAM across 325,000 posts on the platform; the first instance appeared after just five minutes of searching.
To conduct their research, the Internet Observatory scanned the 25 most popular Mastodon instances for CSAM. The researchers also employed Google’s SafeSearch API to identify explicit images, along with PhotoDNA, a tool that helps flag known CSAM. During their search, the team found 554 pieces of content that matched hashtags or keywords often used by child sexual abuse groups online, all of which were identified by Google SafeSearch as explicit with the “highest confidence.”
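As a rough illustration of the explicit-image side of that methodology, here is a minimal Python sketch of checking an image with Google’s SafeSearch detection (part of the Cloud Vision API). The study’s actual pipeline isn’t public, so the function name, the file handling and the VERY_LIKELY threshold (which maps onto the report’s “highest confidence” wording) are assumptions for illustration only.

```python
# Minimal sketch: flag an image as explicit using Google Cloud Vision's
# SafeSearch detection. The study's exact pipeline is not public; the
# VERY_LIKELY threshold below mirrors the report's "highest confidence"
# wording but is an assumption.
from google.cloud import vision


def is_flagged_explicit(image_bytes: bytes) -> bool:
    """Return True if SafeSearch rates the image 'adult' at its highest likelihood."""
    client = vision.ImageAnnotatorClient()  # reads GOOGLE_APPLICATION_CREDENTIALS
    response = client.safe_search_detection(image=vision.Image(content=image_bytes))
    annotation = response.safe_search_annotation
    # Likelihood ranges from UNKNOWN up to VERY_LIKELY (the highest confidence).
    return annotation.adult == vision.Likelihood.VERY_LIKELY


if __name__ == "__main__":
    with open("sample.jpg", "rb") as f:  # hypothetical local file
        print(is_flagged_explicit(f.read()))
```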
Open posting of CSAM is “disturbingly prevalent”
The fediverse also saw 713 uses of the top 20 CSAM-related hashtags on posts containing media, as well as 1,217 text-only posts that pointed to “off-site CSAM trading or grooming of minors”. The study states that the open posting of CSAM is “disturbingly prevalent”.
One example referenced the extended mastodon.xyz server outage we noted earlier this month, an incident caused by CSAM posted to Mastodon. In a post about the incident, the server’s sole maintainer said he was alerted to the offending content, but noted that moderation is done in his spare time and can take days; this isn’t a giant operation like Meta with a team of contractors around the world, it’s just one person.
While the maintainer said he took action against the content in question, the host of the mastodon.xyz domain had suspended it anyway, leaving the server inaccessible to users until he could reach someone to restore its listing. After the issue was resolved, the mastodon.xyz admin says the registrar added the domain to a “false positive” list to prevent future takedowns. However, as the researchers point out, “the reason the action took place was not a false positive.”
“We received more PhotoDNA hits in a two-day period than in the entire history of our organization doing any type of social media analysis, and it’s not even close,” David Thiel, one of the report’s researchers, said in a statement to The Washington Post. He attributed much of this to a lack of the tooling that centralized social media platforms use to address child safety concerns.
As decentralized networks like Mastodon grow in popularity, so do concerns about safety. Decentralized networks don’t take the same approach to moderation as mainstream sites such as Facebook, Instagram and Reddit. Instead, each decentralized instance has control over its own moderation, which can lead to inconsistencies across the fediverse. That’s why the researchers recommend that networks like Mastodon give moderators more robust tools, along with PhotoDNA integration and CyberTipline reporting.
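PhotoDNA itself is a closed Microsoft service, but the general idea behind it, comparing a perceptual hash of each uploaded image against a list of known-abuse hashes, can be sketched with the open-source imagehash library. The blocklist contents, the distance threshold and the moderation hook below are hypothetical stand-ins, not PhotoDNA’s actual algorithm.

```python
# Sketch of hash-based moderation in the spirit of PhotoDNA. PhotoDNA's
# algorithm and hash list are not public, so this uses the open-source
# imagehash library and a hypothetical blocklist purely for illustration.
import imagehash
from PIL import Image

# Hypothetical blocklist of known-bad perceptual hashes (64-bit, hex-encoded).
KNOWN_BAD_HASHES = {imagehash.hex_to_hash("d1c4a0b2e3f49587")}
MAX_DISTANCE = 4  # assumed Hamming-distance threshold for a "match"


def matches_blocklist(path: str) -> bool:
    """Return True if the image's perceptual hash is near a known-bad hash."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - bad <= MAX_DISTANCE for bad in KNOWN_BAD_HASHES)


def moderate_upload(path: str) -> None:
    if matches_blocklist(path):
        # In a real deployment this is where a CyberTipline report and
        # takedown would be triggered; printing is a placeholder.
        print(f"quarantined {path}: matched known-hash blocklist")
    else:
        print(f"accepted {path}")
```

Because matching happens on hashes rather than the images themselves, an instance never needs to hold a copy of the abusive material to screen against it, which is what makes this approach practical for small, volunteer-run servers.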