It’s not just about facts: Democrats and Republicans have sharply different attitudes about removing misinformation from social media

One person’s content moderation is another’s censorship when it comes to Democrats’ and Republicans’ views on handling misinformation.

  • bamboo@lemm.ee · 1 year ago

    I’m not looking for guaranteed amplifiers. I’m looking for an outcome where anyone can find an amplifier if that’s what they want. No party should be required to amplify anyone else. It’s possible in this situation that someone could fail to find an amplifier, but I’d like to minimize that by just having many platforms with different incentives such that it is unlikely that they would all align against any one person’s message.

    The fediverse is built on this concept. Every instance can moderate its own users and communities, and choose which other instances to federate with. It’s unlikely that a specific user would be unable to find an instance that accommodates them, even if larger instances won’t. This contrasts with traditional social media, where a single for-profit entity controls the entire network and can completely remove people and ideas it doesn’t want.
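
    As a rough illustration of that mechanism, here is a minimal sketch (hypothetical names and types, not Lemmy’s or Mastodon’s actual API) of the per-instance federation decision: every receiving instance applies its own defederation list and its own local bans, so being removed in one place doesn’t mean being removed everywhere.

    ```python
    # Hypothetical sketch, not real fediverse software: each instance keeps its own
    # defederation list and local bans, and decides independently whether to carry
    # a given author's posts.
    from dataclasses import dataclass, field


    @dataclass
    class Instance:
        domain: str
        blocked_instances: set[str] = field(default_factory=set)  # defederated domains
        banned_users: set[str] = field(default_factory=set)       # locally removed users

        def accepts(self, author: str, origin: str) -> bool:
            """Will this instance carry content from author@origin?"""
            if origin in self.blocked_instances:
                return False  # the whole origin instance is defederated
            if f"{author}@{origin}" in self.banned_users:
                return False  # this particular user has been moderated out locally
            return True


    # The same author can be rejected by one instance and carried by another,
    # because every instance applies its own policy.
    big = Instance("big.example", blocked_instances={"fringe.example"})
    small = Instance("small.example")
    print(big.accepts("alice", "fringe.example"))    # False: defederated
    print(small.accepts("alice", "fringe.example"))  # True: still federates
    ```

    The point is that there is no single accepts() check for the whole network; every instance runs its own, which is what makes total removal hard.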

    • partial_accumen@lemmy.world · 1 year ago

      I’m looking for an outcome where anyone can find an amplifier if that’s what they want.

      How is that not a guaranteed amplifier?

      but I’d like to minimize that by just having many platforms with different incentives such that it is unlikely that they would all align against any one person’s message.

      So if a person’s message is something clearly abhorrent, like white supremacy advocating violence against other races, you’re hopeful there is a platform where that person can amplify their voice and ideas to the general public?

      It’s unlikely that a specific user would be unable to find an instance that accommodates them, even if larger instances won’t.

      My guess is that as a user escalates their bigotry, if their instance still allows it, that instance will be defederated by nearly everyone, leaving them with their own echo chamber. How is this different from what groups like that did for decades before the internet with a privately mailed newsletter? This is essentially the situation we have at present.

      What are you advocating that would change this?