The best part of the fediverse is that anyone can run their own server. The downside of this is that anyone can easily create hordes of fake accounts, as I will now demonstrate.

Fighting fake accounts is hard, and most fediverse implementations currently have no effective way of filtering them out. I’m sure the developers will step in if this becomes a bigger problem. Until then, remember that votes are just a number.

  • danc4498@lemmy.world · 57 up / 2 down · 1 year ago

    This is the problem. All the algorithms are based on the upvote count. Bad actors will abuse this.

        • Protoknuckles@lemmy.world · 3 up · 1 year ago

          So the question becomes: how do we rank posts and comments in a way that isn’t based on upvotes, downvotes, or comment counts? I could see a trust value being computed for each user, based on trusted users marking others as trusted combined with a personal trust score, but that puts a barrier on new users and enforces echo chambers.

          What else could be tried?

          • TheOnlyMego@lemmy.world · 3 up · 1 year ago

            > that puts a barrier on new users and enforces echo chambers

            Only if trust starts at 0. A system where trust started high enough to not filter out posts and comments would avoid that issue.

          • danc4498@lemmy.world · 2 up / 1 down · 1 year ago

            Maybe instances should be assigned a rank for how dependable they are: length of time active, number of active users, stuff like that. Each instance would keep track of its own rankings for every instance it federates with, then put the upvote and those stats into a magic box to calculate the actual upvote value.
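
To make the first comment's point concrete: threadiverse "hot" rankings generally boil down to a log-scaled vote score divided by an age decay. The sketch below is only illustrative; the function name and constants are mine, not taken from Lemmy's actual ranking code.

```python
import math
from datetime import datetime, timezone

def hot_rank(upvotes: int, downvotes: int, published: datetime) -> float:
    """Illustrative 'hot' score: log-scaled net votes, decayed by post age.

    The shape (log of the score over a time-decay power) is typical of
    threadiverse ranking; the exact constants here are made up for illustration.
    """
    score = upvotes - downvotes
    hours = (datetime.now(timezone.utc) - published).total_seconds() / 3600
    # The logarithm dampens runaway scores; the +3 keeps its argument positive.
    return math.log10(max(1, score + 3)) / (hours + 2) ** 1.8

# Nothing here distinguishes 50 real upvotes from 50 scripted ones:
# a horde of fake accounts moves `score`, and therefore the rank, directly.
```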
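
Protoknuckles' web-of-trust idea, with TheOnlyMego's tweak of starting trust above the visibility threshold, might look something like this. Every constant, account name, and the two-hop depth limit here is a hypothetical illustration, not a feature of any existing fediverse software.

```python
DEFAULT_TRUST = 1.0         # new accounts start above the threshold (TheOnlyMego's point)
VISIBILITY_THRESHOLD = 0.5  # posts by authors below this trust would be hidden
ENDORSEMENT_BONUS = 0.25    # value of one "marked as trusted", scaled by the endorser's trust

# Who has marked whom as trusted. Hypothetical data, purely for illustration.
ENDORSEMENTS: dict[str, list[str]] = {
    "alice@lemmy.world": ["bob@lemmy.world", "carol@sh.itjust.works"],
    "bob@lemmy.world": ["carol@sh.itjust.works"],
}

def trust_score(user: str, depth: int = 2) -> float:
    """Personal baseline trust plus trust inherited from whoever endorsed this user."""
    score = DEFAULT_TRUST
    if depth == 0:
        return score
    for endorser, endorsed in ENDORSEMENTS.items():
        if user in endorsed:
            score += ENDORSEMENT_BONUS * trust_score(endorser, depth - 1)
    return score

def rank_post(author: str) -> float | None:
    """Rank a post by its author's trust instead of by vote counts.

    Returns None for hidden posts. Because DEFAULT_TRUST sits above the
    threshold, a brand-new user's post is shown rather than filtered out.
    """
    trust = trust_score(author)
    return trust if trust >= VISIBILITY_THRESHOLD else None

print(rank_post("carol@sh.itjust.works"))   # endorsed twice, ranks higher
print(rank_post("newuser@example.social"))  # no endorsements, still visible at 1.0
```

The echo-chamber concern doesn't fully disappear, though: endorsements still flow along existing social ties, so established clusters end up boosting each other.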
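
And danc4498's "magic box" for instance-level reputation could be as simple as discounting votes by the age and activity of the instance they arrive from. Again, the fields, constants, and instance names below are made up for illustration.

```python
import math
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class InstanceStats:
    """What one instance might track locally about each instance it federates with."""
    first_seen: datetime
    monthly_active_users: int

def instance_weight(stats: InstanceStats) -> float:
    """Illustrative 'magic box': older, busier instances get votes at closer to full weight."""
    age_days = (datetime.now(timezone.utc) - stats.first_seen).days
    age_factor = min(1.0, age_days / 365)  # ramps up over the first year of federation
    activity_factor = min(1.0, math.log10(stats.monthly_active_users + 1) / 4)  # saturates ~10k MAU
    return age_factor * activity_factor

def adjusted_score(votes_by_instance: dict[str, int],
                   stats_by_instance: dict[str, InstanceStats]) -> float:
    """Discount each remote instance's net votes by the local weight assigned to it."""
    return sum(net_votes * instance_weight(stats_by_instance[instance])
               for instance, net_votes in votes_by_instance.items())

stats = {
    "lemmy.world": InstanceStats(datetime(2023, 6, 1, tzinfo=timezone.utc), 40_000),
    "brand-new.example": InstanceStats(datetime.now(timezone.utc), 5_000),
}
print(adjusted_score({"lemmy.world": 50, "brand-new.example": 50}, stats))
```

In this example the fifty votes arriving from the day-old instance contribute nothing, while an instance that has federated for over a year counts at full weight. The flip side, as the thread already notes for user-level trust, is that legitimate new instances are penalized too.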