An in-depth report reveals an ugly truth about isolated, unmoderated parts of the Fediverse. It’s a solvable problem, but not without its challenges.

  • Sean Tilley@lemmy.mlOPM · 11 months ago

    It’s a bit of an unknown, since the service is a proprietary black box. That said, my guess is it includes the following (a rough sketch of the hash pieces follows the list):

    • A database of perceptual hashes covering volumes and volumes of known CSAM
    • A means to generate new hashes from media
    • Infrastructure for adding and auditing more of it
    • A REST API for hash comparisons and reporting
    • Integration for pushing reports to NCMEC and law enforcement
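
    Since PhotoDNA is a proprietary black box, here’s a minimal sketch of the hash-generation and comparison pieces. It substitutes a simple average hash (aHash) for the real perceptual hash; the function names and the Hamming-distance threshold are illustrative assumptions, not anything from the actual service:

    ```python
    # Stand-in for a perceptual hash pipeline; aHash is far weaker than
    # PhotoDNA's actual (undisclosed) algorithm, but the shape is similar.
    from PIL import Image

    def average_hash(path: str, size: int = 8) -> int:
        """Downscale to a size x size grayscale image, threshold each pixel on the mean."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming(a: int, b: int) -> int:
        """Number of differing bits between two hashes."""
        return bin(a ^ b).count("1")

    def is_match(candidate: int, known: set[int], threshold: int = 5) -> bool:
        # Perceptual hashes match approximately: a small Hamming distance
        # means "visually near-identical", so an exact lookup isn't enough.
        return any(hamming(candidate, h) <= threshold for h in known)
    ```

    The comparison service behind a REST API would essentially wrap is_match around the shared database and return a match/no-match verdict plus a reporting hook.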

    None of those things are impossible or out of reach… but collecting a new database of hashes is challenging. Where do you get it from? How is it stored? Do you allow the public to access the hash data directly, or do you keep it secret, as all the other solutions do?
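
    On the storage question, here’s a toy example of how a single server might keep a private hash list; the SQLite schema and the source column are assumptions for illustration, not how PhotoDNA or NCMEC actually store theirs:

    ```python
    import sqlite3

    # Hypothetical private hash store for one server.
    conn = sqlite3.connect("hashes.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS known_hashes (
            hash   TEXT NOT NULL,  -- 64-bit perceptual hash, stored as hex
            source TEXT NOT NULL,  -- who contributed it, for auditing
            added  TEXT DEFAULT CURRENT_TIMESTAMP
        )
    """)

    def add_hash(h: int, source: str) -> None:
        conn.execute("INSERT INTO known_hashes (hash, source) VALUES (?, ?)",
                     (format(h, "016x"), source))
        conn.commit()

    def all_hashes() -> set[int]:
        return {int(row[0], 16) for row in conn.execute("SELECT hash FROM known_hashes")}
    ```

    Storing the hashes as hex text sidesteps SQLite’s signed 64-bit integer limit; whether the table is publicly readable or locked down is exactly the policy question above.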

    I’m imagining a solution where servers aggregate all of this data up to a dispatch platform like the one described above, possibly run by a non-profit or NGO, which then dispatches the data to NCMEC directly.
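
    From a server’s side, the dispatch step might look something like the sketch below. The endpoint URL, payload fields, and auth scheme are entirely hypothetical; NCMEC’s real reporting pipeline (the CyberTipline) has its own schema and registration process:

    ```python
    import requests

    # Hypothetical aggregation endpoint run by the non-profit dispatcher.
    DISPATCH_URL = "https://dispatch.example.org/api/v1/reports"

    def file_report(instance: str, media_hash: str, api_key: str) -> None:
        resp = requests.post(
            DISPATCH_URL,
            json={"instance": instance, "hash": media_hash, "category": "csam"},
            headers={"Authorization": f"Bearer {api_key}"},
            timeout=10,
        )
        resp.raise_for_status()  # the dispatcher forwards the report to NCMEC
    ```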

    The other thing to keep in mind is that solutions like PhotoDNA operate at a HUGE scale. I’m talking hundreds of thousands of pieces of reported media per year. It’s something that would require a lot of uptime and the ability to handle a significantly high volume of requests every day.

    • Elise@beehaw.org · 11 months ago

      Thanks for the thought you put into your answer.

      I’ve been thinking: CSAM is just one of the many problems communities face. For example, YouTube is unable to moderate transphobia properly, which has significant consequences as well.

      Let’s say we had an ideal federated copy of the existing system. It would still fail to detect many other types of antisocial behavior. All I’m saying is that the existing approach by M$ feels a bit like moral tunnel vision: trying to solve complex human social issues with some kind of silver bullet. It lacks nuance, when in fact this is a community management issue.

      Honestly I feel it’s really a matter of having manageable communities with strong moderation. And the ability to report anonymously, in case one becomes involved in something bad and wants out.

      Thoughts?