X continues to suck at moderating hate speech, according to a new report

A new report from the Center for Countering Digital Hate (CCDH) suggests X is failing to remove posts that violate its own community rules regarding misinformation and hate speech.

  • TheFeatureCreature@lemmy.world · 1 year ago

    Of course it does. It’s literally owned and run by a fascist. I really wish people would stop expecting Twitter to run and behave like it used to. It will not, and it never will again.

  • DeathWearsANecktie · 1 year ago

    Can we be sure they’re trying to moderate hate speech? I don’t think they are.

  • Obinice@lemmy.world · 1 year ago

    Isn’t hate speech what they’re going for?

    They’re run by literal fascists now, presumably they want it to be a platform for their kind?

  • sugarfree@lemmy.world · 1 year ago

    Don’t use it if you don’t like it. What’s with the obsession with harsh moderation over every facet of the internet? Is that really the future people want?

    • DoctorButts@kbin.melroy.org · 1 year ago

      Is that really the future people want?

      Maybe?

      Anecdotally, I feel like I’m seeing more people pre-emptively censoring their own swears on the fucking internet. Also there’s a weird and bad puritan sentiment towards all things sex-related that feels like it popped up out of nowhere in the last few years.

  • AutoTL;DR@lemmings.world (bot) · 1 year ago

    This is the best summary I could come up with:


    According to the CCDH, the reported posts, which largely promoted bigotry and incited violence against Muslims, Palestinians, and Jewish people, were collected from 101 separate X accounts.

    Just one account was suspended over their actions, and the posts that remained live accrued a combined 24,043,693 views at the time the report was published.

    X filed a lawsuit against the CCDH in July of this year over claims that the organization “unlawfully” scraped X data to create “flawed” studies about the platform.

    In a statement to The Verge, X’s head of business operations, Joe Benarroch, said that the company was made aware of the CCDH’s report yesterday and directed X users to read a new blog post that details the “proactive measures” it has taken to maintain the safety of the platform during the ongoing Israel-Hamas war, including removing 3,000 accounts tied to violent entities in the region and taking action against over 325,000 pieces of content that violate its terms of service.

    X claims that by choosing to only measure account suspensions, the CCDH has not accurately represented its moderation efforts and urged the organization to “engage with X first” to ensure the safety of the X community.

    After publication, Benarroch questioned the methodology of the CCDH’s study and claimed the organization only considers a post “actioned” after the account has been suspended.


    The original article contains 476 words, the summary contains 224 words. Saved 53%. I’m a bot and I’m open source!