I guess it shouldn’t bother me; after all, I try my best to avoid watching anything on YouTube that I didn’t go there to watch in the first place. Nevertheless, it’s hard not to see the clickbaity thumbnails for suggested videos to the right of the one I’m watching, and when I’m researching something, it’s undeniable that a suggested video on the same topic has occasionally been genuinely useful.

But this has really started to freak me out: YouTube has gone bananas recently. I’m constantly getting suggestions decrying woke this or woke that, in particular a lot of compilations of police bodycam footage with titles calling various people ‘idiots’ or ‘entitled’ or other terms suggesting a strong pro-authority angle. Those suggestions were annoying, but it’s getting disturbing now because they’re veering strongly towards incel themes. Today I’ve seen suggestions for: a video about a woman faking a rape accusation ‘caught on camera’, another about a ‘high value’ man winning in court against a woman who wanted him to pay reparations after she refused a DNA test, another about why men don’t approach women anymore, and on and on. They stick in my mind because I can see exactly the world view this constellation of fucking garbage is catering to.

On the one hand, I guess I could take some comfort in the fact that if the algorithms have gotten it this far wrong, then maybe Google really didn’t manage to snag as much data about me as I assumed. On the other hand, it’s definitely used something it reckons it knows about me to make these assumptions. I can’t for the life of me figure out where the fuck it got the idea I would like this, or why it’s so persistent despite the lack of positive reinforcement. I’m not signed in, so I can’t even attempt to manually give negative feedback, and as far as I can tell, people who do that say it doesn’t work anyway.

This all seems to have coincided with a dramatic change in my recent viewing habits, so I’m guessing that’s what triggered it, but it’s still weird as fuck that there’s apparently some overlap here. I’ve been in the market for a new phone, so I’ve been watching a lot of videos about various phones, and I was also considering a Pixel as one of the options because, if I went with that, I could use GrapheneOS. That has meant watching a lot of content on GrapheneOS as well. This does seem to have had the expected effect of a lot of suggestions along those lines, but it’s definitely coincided with the fucking alt-right starter kit. It’s a counterintuitive link, to say the least. Shit makes me want to puke. Normally when I see these kinds of surveillance-economy mechanisms at work, I just look at them with a detached kind of wry amusement at the shitty state of things, but this actually really did offend me. I mean, yuck, it makes my fucking skin crawl.

  • ___spannungsbogen@lemmy.world · 6 months ago

    It’s because you’re clearing out cookies and all that other stuff that you keep seeing it. The YouTube algorithm knows that the people who watch that garbage watch it often and for long periods of time, which makes those kinds of videos a top suggestion. Pretty sad and scary to think of so many people spending all their free time watching and rotting away inside, just soaking it all up to take with them the next time they interact with real people.

  • Schwim Dandy · 6 months ago

    I wouldn’t necessarily assume it has anything to do with you, your browsing habits, or your online history; it’s more about the fact that these companies know that outrage and extremism keep people engaged for longer. Moderate bipartisan discussion doesn’t pull the same numbers as loons shouting uninformed prejudices.

    • Jimmycrackcrack@lemmy.ml (OP) · 6 months ago (edited)

      The trouble with that idea, though, is that for a start I’ve always done this, and also, it’s very specific about which kind of rage bait and culture war it’s pushing. I don’t want to sound all “both sides”, but if I wanted to, I’m sure I could design a similar keep-’em-angry list of suggestions that looked very different. The suggestions could be filled with videos of police brutality, or of women being mistreated by misogynists; they could be about court cases where abusers were jailed, or about the lack of diversity and inclusion in film and media. All of that would be rage bait too; it’d all be about being aggrieved in some way. It’s a version I could sympathise with vastly more, but it would still be cultivating outrage and extremism.

      Something about how sudden this is, how hard it’s pushing, and the specific bias it’s pushing towards suggests something has changed. It’s like a dog with a bone. An interesting thing I’ve noticed is that NewPipe has a related-items button that I started pressing after watching something, just to compare with my desktop, and those items are, well… related; there’s no fucked-up shit. On one video it was suggesting a lot of gun content and I was starting to worry, but I realised that was only after watching a Slow Mo Guys video involving shooting, so it actually made sense. In all other cases it’s just something vaguely related to the topic of whatever I watched. Admittedly I don’t know if the related-items button is akin to suggestions, but if so, there’s a clear difference.

      Anyway, thanks for listening to me moan about YouTube recommending things I don’t like. I hope it stops.

    • pikasaurX4 · 6 months ago

      Yeah, it’s exactly this. If you have no data/cookies/account history, then you’re getting the default, which is all rage-bait, frustrating, polarizing stuff. If you want to see stuff you like fed directly to you, you have to give them some data.
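
      To make that concrete, here’s a minimal sketch of the cold-start logic (toy numbers and a made-up scoring function, not YouTube’s actual system): with no user history, the only signal left is global engagement, so whatever people binge wins the default feed.

      ```python
      # Toy cold-start recommender (hypothetical; not YouTube's real system).
      # With no per-user history, the only signal left is global engagement,
      # so whatever maximizes expected watch time wins by default.

      from dataclasses import dataclass

      @dataclass
      class Video:
          title: str
          avg_watch_minutes: float  # how long the average viewer stays
          return_rate: float        # fraction of viewers who come back for more

      def cold_start_score(v: Video) -> float:
          # No user vector to personalise with, so the score is just
          # expected future watch time from global stats.
          return v.avg_watch_minutes * (1.0 + v.return_rate)

      catalog = [
          Video("Calm 20-minute documentary", avg_watch_minutes=9.0, return_rate=0.1),
          Video("Outrage compilation #47", avg_watch_minutes=14.0, return_rate=0.6),
      ]

      # Rank the anonymous default feed purely by engagement.
      for v in sorted(catalog, key=cold_start_score, reverse=True):
          print(f"{cold_start_score(v):5.1f}  {v.title}")
      ```

      Under those assumptions the outrage compilation outranks the documentary every time, because its audience watches longer and comes back more, which is roughly the dynamic being described here.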

  • LinkOpensChest.wav@lemmy.blahaj.zone · 6 months ago

    Could it be that stuff is really popular in your location? The first time I tried TikTok it was nothing but batshit right-wing anti-woke content, and I’m pretty sure that was based on my location.

    • Jimmycrackcrack@lemmy.ml (OP) · 6 months ago

      That would be interesting, but it’s really only been going on for about two weeks, and I haven’t noticed my location pushing any further into the crazy in that timeframe.

    • Jimmycrackcrack@lemmy.ml (OP) · 6 months ago

      You’re not totally wrong; this does boil down to “person who deliberately avoided giving data to the machine that turns data into relevant recommendations is shocked to be given irrelevant recommendations”. But in my defence, I have always done this, and until now there was no obvious agenda to the suggested videos. They weren’t random, because clearly, despite my best efforts, YouTube can identify me from visit to visit, and there was a clear link between them and my viewing habits. At least that link was comprehensible, and there was no ideological basis to any of it.