Very, Very Few People Are Falling Down the YouTube Rabbit Hole | The site’s crackdown on radicalization seems to have worked. But the world will never know what was happening before that.

  • intensely_human · +4 / −20 · 10 months ago

    Why do we need to know what happened before? A record of the past is just material radicals can use to radicalize others.

    • bhmnscmm@lemmy.world · +9 · 10 months ago

      Understanding past problems and their solutions is critical to solving the problems of the future.

      Do you really not see the importance of establishing a cause and effect relationship to past events?

    • Fisk400@feddit.nu · +7 · 10 months ago

      Understanding how things happened helps you avoid doing them again. When they say we don’t know what happened, they are not talking about individual videos; they are talking about the algorithm. Google made a radicalizing algorithm and then stopped, but because we don’t know why the first algorithm was so radicalizing, it could happen all over again, and we wouldn’t know until literally millions of people had already gone down the rabbit hole.

      • dragontamer@lemmy.world · +4 · 10 months ago (edited)

        Because clickbait works, and the treadmill of more exotic and/or clickbaity titles is infinite as online creators try to establish a niche.

        See Elsa from Frozen. Once “mainstream” Elsa videos are saturated, it’s impossible to make your video stand out. So you make “Pregnant Elsa” videos instead, which are more clickbaity and draw more eyeballs and ad revenue to your account. But then “Pregnant Elsa” videos get over-saturated too, so you go +1 on the extreme count with “Pregnant Elsa Gives Birth” videos. Etc., etc., etc.

        Next thing you know, sex videos are being shown to children who have clicked on too many Elsa videos in the YouTube radicalization treadmill.


        Cheap sexual content, cheap violent videos, and cheap “anger” videos are the natural result of online content creators trying to stand out. The more extreme you go, the less competition you have, so the more your audience connects with you. But the audience has to get there through the treadmill that steps them onto this path, and recommendation algorithms automatically send people down it (probably unintended behavior back then, but these days it’s a well-known phenomenon).
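The escalation dynamic described in this comment can be illustrated with a minimal toy simulation. To be clear, everything here is hypothetical: the extremity scores, the click model, and the greedy recommender are my own assumptions for illustration, not anything known about YouTube’s actual system.

```python
# Toy model of the "treadmill" dynamic. Assumptions (all hypothetical):
# (1) each video has an "extremity" score between 0.0 and 1.0,
# (2) a viewer clicks most readily on content slightly more extreme
#     than what they last watched (the "+1" escalation step), and
# (3) the recommender greedily maximizes expected clicks per round.

ITEMS = [i / 10 for i in range(11)]  # extremity scores 0.0 .. 1.0


def click_prob(user_level: float, item_level: float) -> float:
    # Appeal peaks just above the viewer's current comfort level.
    return max(0.0, 1.0 - 4 * abs(item_level - (user_level + 0.1)))


def recommend(user_level: float) -> float:
    # Greedy engagement maximization: serve whatever is most
    # likely to be clicked right now, with no long-term objective.
    return max(ITEMS, key=lambda item: click_prob(user_level, item))


level = 0.0          # viewer starts on mainstream content
history = [level]
for _ in range(8):   # each round: recommend, watch, adapt
    level = recommend(level)
    history.append(level)

print(history)       # extremity ratchets monotonically upward
```

No single step looks alarming (each recommendation is only marginally more extreme than the last), yet the per-round optimum walks the viewer steadily toward the extreme end, which is the point the comment is making about why the path, not any individual video, is what matters.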