• 𝕽𝖚𝖆𝖎𝖉𝖍𝖗𝖎𝖌𝖍@midwest.social
    1 year ago

    This is at the root of the paradox of tolerance. If you tolerate intolerance, eventually intolerance dominates.

    Robert Anton Wilson wrote about Big Truths and Little Truths. Similarly, we can talk about Little Censorship and Big Censorship. I don’t know what those definitions are, but I’m sure that it’s not just a matter of scale, because the Paradox of Tolerance applies at all scales. I think the difference lies in what’s being censored: things that promote intolerance. And then there are things outside of intolerance that most of us agree should be squashed – child porn, hate speech, incitement to crimes against individuals, doxxing. But it’s a fine line, and you could argue that it’s better not to censor, and just make the sharing itself a crime.

    Personally, I don’t have clear definitions around this stuff, but I do think the Paradox of Tolerance is a real thing that’s been demonstrated countless times, and which should be heeded.

    • jet@hackertalks.com
      1 year ago

      The tolerance/intolerance discussion is interesting, and very sticky.

      If speech is criminally intolerable, then it should be up to the criminal justice system to prevent that speech, not digital platform providers enforcing their opinions. Or at least, that’s why I support the fediverse.

      "If there be any among us who would wish to dissolve this Union or to change its republican form, let them stand undisturbed as monuments of the safety with which error of opinion may be tolerated where reason is left free to combat it. I know, indeed, that some honest men fear that a republican government can not be strong, that this Government is not strong enough." — Thomas Jefferson, First Inaugural Address

      Personally I fall on the side of free and open discourse: we cannot be fearful of evil ideas, we must expose them to sunlight so that they shrink away before the minds of conscionable people.

      Rhetorically, I’ve seen many internet arguments use the paradox-of-tolerance idea to shut down any idea they don’t agree with. They wield it as a shield to prevent open debate. I think that hurts discourse and the finding of common ground; it polarizes people in a discussion.

      • We completely agree that it’s a difficult question, and a slippery slope. And also on the point of the government’s role.

        Do you then believe that privately run platforms shouldn’t have the right to choose what gets put on their platform? Or is it a matter of scale – like, Sxan’s GoToSocial server can do what it wants, but The-Platform-Formerly-Known-As-Twitter shouldn’t?

        I always think of the brigading that happens on “open” platforms. The Masses will effectively censor any real debate, but especially if they know there are no rules. How are we to deal with that?

        • jet@hackertalks.com
          1 year ago

          You bring up some excellent points. Right now, there are private organizations acting as de facto public squares. I think when they’re the only option, it gets muddy: if they’re going to be so essential to society, they have to operate like utilities, with no opinions beyond legal or illegal.

          For all the platforms that are just options, and not de facto public squares, I’m perfectly happy for them to have opinions about what can or cannot be said.

          Let’s take the fediverse as an example: any individual server can have its own opinions and enforce them through moderation – perhaps very heavy moderation. And that’s totally fine, because any group of people can run their own instance, with their own moderation policies, and it’s all on an equitable playing field.

          The brigading that we’re seeing on these open platforms is an early-adopter phenomenon, and groups tend to move together. The only real solution is heavy moderation on individual instances. So if you have a community that’s talking about fishing, the moderator should prevent brigading that has nothing to do with fishing. “Person I hate was caught fishing, we should ban them from fish, etc etc oh you support fish killing…” – the moderator should stop that.


          The litmus test I would use to determine whether a social media company is a public space, and should act by utility rules rather than private-club rules, would be: does the government use that platform to communicate with citizens?

          X-Twitter, Facebook… both have governments using them to communicate directly with their citizens, sometimes as the only means of communication. So they are de facto public squares and utilities.

          • So, I’ve left this on “unread” for so long only because until now I only used Lemmy on my phone, and I really hate typing long replies on my phone. I wanted to give your reply due consideration, though. Anyway, I’m embarrassed to have taken this long to respond.

            I agree with you about the public square, and I think you bring up an excellent point about these systems becoming “essential to society.” I think it’s a thing that is obvious to younger people, and almost completely invisible to older people. Even those of us who grew up during the IT boom decades and lived through the change may find it difficult to grok just how much of an impact this is having. I do think that people are generally well aware of how slow legislation is in adapting to rapid changes in society, but the impact you talk about has happened at such an accelerated rate that useful precedents are lacking. So we see legislators thrashing about more than usual, over- or under-reacting, and mostly in extreme ignorance.

            I see brigading in the fediverse as a worse problem than you do. It’s mob rule, and it goes largely unchecked – I feel – because moderators hesitate to risk being accused of censorship. I haven’t yet seen much of what Reddit suffers from – moderator affinity, where mods have a heavier hand with posters they disagree with – but the result is unchecked herd mentality cowing dissenters.

            But, maybe mob rule is good? I vacillate on this one. A well-functioning, healthy society has laws controlling gross topics, and social censure is used to moderate destructive elements. We don’t want a society where we have laws for every little infraction; in that society, every citizen is a criminal by default, and the government always has a legal justification to persecute everyone they want to (and let slip those they don’t). OTOH, we have what happened in the US in the 50’s, with mobs of white people harassing black integration students. I don’t know what the right answer is for this, honestly, but it is an issue in meatspace, and it’s as much or more of an issue online.

            Your litmus is good, I think, but risks being based largely on our current clueless government. As the generations age out, and younger generations take control, the government will become increasingly social-media savvy. I can easily see a future government having a communications department competent enough to hit nearly every social media platform, regardless of popularity. What about cross-posting? If we use that litmus, then if I were the government and wanted to control a platform, all I’d need to do is start posting to it, and now it qualifies as subject to regulation?

            I think I’ve said this before, but I’ll repeat it: I don’t have answers to any of these issues. I wish we could have a censorship-free internet; there was a time in the early history when most users were well-behaved and followed established etiquette. I think a lot of that may have been due to the lack of anonymity, but whatever the reason, we’ve been past that for decades, and we haven’t yet adapted.

            • jet@hackertalks.com
              1 year ago

              Thank you for the very thoughtful reply.

              The brigading is a huge problem and discourages people from joining Lemmy; we need highly opinionated, moderated communities to create “safe spaces” for niche communities and viewpoints. The inclusion of “user participation requirements” – like account age, interaction with a community, karma scores in the community – is necessary to help Lemmy grow.

              From a long-term stability-of-society perspective, absolute free speech is the only path forward. Yes, people we hate will have voices, and people who are criminal will have voices, but that is the price of giving everyone a voice. We only have to look at the diversity of “governments” globally to realize that having a community-focused, respectful government is a temporary thing. Governments change with time, as do those enforcing the rules. Just as a thought experiment, imagine you lived your entire life in every country, and imagine you wanted to advocate for 1. human rights, 2. a political opposition party. In many countries, that is aggressively stamped out – “don’t rock the boat”. In many global communities, doing 1 and 2 are great ways to embarrass powerful people and have a short life.

              I know many people will think, “yes, but… what about thing I don’t like, X”… If we create the digital tooling to ban X, whatever X is, then those in power will use that tooling to target everything else. Tools in the toolbox get used. It’s a difficult stance to be a free speech absolutist, it’s unpopular, but I think it’s necessary. I’m not saying communities have to suffer outsider speech intruding on their spaces, but platforms cannot be opinionated as a whole.

              You bring up very thoughtful points, and I agree censorship is necessary to grow communities, but censorship should never get larger than the community level. Platform-level censorship is bad for society in the long term.

                • jet@hackertalks.com
                  1 year ago

                  Moderation: not deplatforming, but putting rails on a specific discussion

                  Censorship: deplatforming, total limits on a topic in all places.

                  E.g., anyone can send mail through the post office; a newsletter editor moderates the received letters for inclusion in their publication.

                  So in a Lemmy context, it’s not censorship to have rules on an instance, but it would be censorship to deny people the ability to run an instance. Lemmy is very censorship resistant.

                  • Are you suggesting that there are no topics, no content, that should be censored? I’m not trying to walk you into Godwin’s law; I just don’t see how you address issues like CP, snuff porn, or hate speech and incitement. I personally would rather err on the conservative side of the Paradox of Tolerance than allow intolerance to take hold and take over. With total and complete freedom of expression, how do you prevent the emergence of populist oppressive movements like the Khmer Rouge, or the Nazi party? Or do you think the Paradox of Tolerance is flawed?

          • Facebones@reddthat.com
            1 year ago

            I think a publicly funded platform would be beneficial in today’s world. Nobody could be banned, but you could still block people individually. (Criminal stuff would still be criminal, and you could potentially be muted by govt entities, though.) All govt communication would go through this platform, so nobody could be “walled off” from govt comms. It would still function as social media as well, and people would still be free to twit/fb/whatevs – there would just no longer be govt entities there.

            It would also lay the framework to potentially move our voting systems into the 21st century IMO.

          • partizan
            1 year ago

            But governments prefer the current situation: they have channels to ask for removal, but zero liability, and the company is covered, since it’s their private platform and they can do as they please. So I don’t see why a government would declare social media a public square…

      • HumbertTetere@feddit.de
        1 year ago

        Being able to criminally prosecute someone requires knowing their identity. If this is the only approach, the real need to prevent anonymous internet usage will increase.

        • LinkOpensChest.wav@lemmy.one
          1 year ago

          Not to mention, in most communities I choose to be part of, I trust the judgment of the admins and moderators far more than the state’s “justice” system.