Mastodon, a decentralized social network positioned as an alternative to Twitter, has a serious problem with child sexual abuse material, according to researchers from Stanford University. In just two days, the researchers found over 100 instances of known CSAM across more than 325,000 posts on Mastodon. They also found hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken down for a period of time because CSAM had been posted to it. The researchers suggest that decentralized networks like Mastodon need more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • shiri@foggyminds.com · 1 year ago

    @mudeth I 110% agree with faeranne, especially in that this is much like the topic of encryption: people (especially politicians) keep arguing that we just need to magically come up with a solution that lets governments access all encrypted communication without impacting security, while somehow also preventing people from using existing encryption to bypass it entirely. It’s much like trying to legislate math into functioning differently.

    The closest you can get to a federated moderation protocol is basically just a standard way to report posts/users to admins.
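
    To make that concrete, here’s a minimal sketch of what such a standardized report already looks like in practice. It’s modeled loosely on the “Flag” activity type from the ActivityStreams vocabulary, which Mastodon uses to forward reports to remote admins; the exact fields and values below are illustrative assumptions on my part, not a spec.

    ```python
    # A rough sketch of a federated report, modeled on the ActivityStreams
    # "Flag" activity that Mastodon uses to forward reports to remote admins.
    # The exact fields a given server sends are an assumption here.
    import json

    def build_report(reporter: str, target_account: str, post_urls: list[str], comment: str) -> str:
        """Build a Flag payload: 'this actor reports these objects, for this reason'."""
        activity = {
            "@context": "https://www.w3.org/ns/activitystreams",
            "type": "Flag",
            "actor": reporter,                       # who is reporting
            "object": [target_account, *post_urls],  # the account plus the offending posts
            "content": comment,                      # free-text note for the remote admins
        }
        return json.dumps(activity, indent=2)

    # Hypothetical example values:
    print(build_report(
        "https://my.server/actor",
        "https://other.server/users/spammer",
        ["https://other.server/users/spammer/statuses/1"],
        "Spam / unsolicited advertising",
    ))
    ```

    Note that all this does is notify the remote admins; what they do with the report is still entirely up to them.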

    You could absolutely build blocklists that are shared around, but that’s already a thing and will never be universal.
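
    For what it’s worth, here’s a rough sketch of how those shared lists get consumed today. The CSV columns imitate Mastodon’s domain-block export as I understand it, so treat the exact format as an assumption; the point is that each admin still applies their own policy on top, which is exactly why no list ever becomes universal.

    ```python
    # A sketch of consuming a shared blocklist: pull a published list,
    # then apply your own local policy to it. The CSV columns mirror
    # Mastodon's domain-block export format -- treat them as an assumption.
    import csv
    import io

    SHARED_LIST = """#domain,#severity,#public_comment
    spam.example,suspend,Commercial spam
    edgy.example,silence,Repeated harassment reports
    """

    def load_blocklist(raw_csv: str) -> dict[str, str]:
        """Return {domain: severity} from a published blocklist."""
        reader = csv.DictReader(io.StringIO(raw_csv))
        return {row["#domain"].strip(): row["#severity"].strip() for row in reader}

    # Local policy: this admin auto-applies only full suspensions and reviews
    # the rest by hand. A different admin will make different choices.
    blocks = load_blocklist(SHARED_LIST)
    auto_applied = {domain: sev for domain, sev in blocks.items() if sev == "suspend"}
    print(auto_applied)  # {'spam.example': 'suspend'}
    ```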

    Basically what you’re describing is that someone should come up with a way to *force* me to apply moderation actions to my server that I disagree with; that somehow such a system would be immune to abuse (i.e. because it’s external to my server, it would magically avoid hackers and trolls manipulating it); and that I would have no choice in whether or not to allow that access, despite my running a server based on open source software whose code I can edit myself if I wish (but somehow, in this case, couldn’t edit to prevent the external moderation from working).

    You largely miss the point of my other arguments entirely: email is a perfect reference point because, despite being private rather than public, it faces all the same technical, social, and legal challenges. It’s just an older system with a slightly different purpose (which doesn’t change its technical foundations, only how it’s interacted with), but it’s the closest relative to ActivityPub with much, much larger adoption. These issues and topics have already been discussed there ad nauseam.

    And I didn’t say users would moderate themselves; we decide what is worth taking action on. If you’re not an admin, you choose whether or not something is worth reporting and whether or not the server you’re on is acceptable to your wants/needs. If you take issue with anti-vaxxers, climate change deniers, and nazis, and your server allows all of that (either on the server itself, or by having no issue with other servers that allow it)… then you move to a server that doesn’t.

    Finally, this doesn’t end in centralization, because of all the aforementioned gray areas. There are many things that I don’t consider acceptable on my server but that aren’t grounds for defederation.

    For example: I won’t tolerate the ignoring of minority voices on topics of cultural appropriation and microaggressions… but I don’t consider it a good idea to defederate other servers over it, because their admins often barely understand the issue themselves and I would be defederating 90% of the fediverse at that point. If I see it from my own users I will talk to them and take action as appropriate, but when it comes from other servers I’ll report it if the server looks remotely receptive.