In spring 2018, Mark Zuckerberg invited more than a dozen professors and academics to a series of dinners at his home to discuss how Facebook could better keep its platforms safe from election disinformation, violent content, child sexual abuse material, and hate speech. Alongside these secret meetings, Facebook was regularly making pronouncements that it was spending hundreds of millions of dollars and hiring thousands of human content moderators to make its platforms safer. After Facebook was widely blamed for the rise of “fake news” that supposedly helped Trump win the 2016 election, Facebook repeatedly brought in reporters to examine its election “war room” and explained what it was doing to police its platform, which famously included a new “Oversight Board,” a sort of Supreme Court for hard Facebook decisions.

Several years later, Facebook has been overrun by AI-generated spam and outright scams. Many of the “people” engaging with this content are bots who themselves spam the platform. Porn and nonconsensual imagery are easy to find on Facebook and Instagram. We have reported endlessly on the proliferation of paid advertisements for drugs, stolen credit cards, hacked accounts, and ads for electricians and roofers who appear to be soliciting potential customers with sex work. Its own verified influencers regularly have their bodies stolen by “AI influencers” in the service of promoting OnlyFans pages also full of stolen content.

Meta now responds to our questions about these problems inconsistently at best, and has declined repeated requests for on-the-record interviews for this and other investigations. Several of the professors who used to consult directly or indirectly with the company say they have not engaged with Meta in years. Some of the people I spoke to said that they are unsure whether their previous contacts still work at the company or, if they do, what they are doing there. Others have switched their academic focus after years of feeling ignored, or of being harassed by right-wing activists who accuse them of simply wanting to censor the internet.

  • Flying Squid@lemmy.world · 5 months ago

    I again report it if I have no interest in that sort of thing. At this point, the ‘suggested’ stuff is usually just a group with a meme that I can either ignore or enjoy and then just not join the group.

    But of course there are still exceptions.

    My method is not perfect, but it saves me a ton of grief and annoyance.

    • EatATaco · 5 months ago

      You’re a better man than I because those things drive me fucking nuts.