• jay@beehaw.org · 22 points · 1 year ago (edited)

    wow, this is actually amazing.

    You’d think a rapidly developing service like Lemmy might face restrictions like that due to resources but Twitter? Mismanagement beyond belief.

  • key@lemmy.keychat.org · 19 points · 1 year ago (edited)

    By post he means tweet, right? Google says average tweet length is only around 30 characters and average word length around 5 characters. So let’s say it’s 8 words with abbreviation which would take 2 seconds to read. If it’s 4 seconds per tweet with scrolling then you can now only spend under an hour on Twitter without paying (not counting time spent replying). Good on him for fighting against social media addiction I guess.
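    The arithmetic checks out; a quick sketch, using the comment's assumed numbers (none of these are measured values):

```python
# Sanity check of the estimate above; all numbers are the commenter's
# assumptions, not measured values.
TWEETS_PER_DAY = 600       # the reported free-tier read limit
SECONDS_PER_TWEET = 4      # ~2 s reading + ~2 s scrolling

minutes_to_hit_limit = TWEETS_PER_DAY * SECONDS_PER_TWEET / 60
print(minutes_to_hit_limit)  # 40.0 -- indeed "under an hour"
```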

    • Thurgo · 11 points · 1 year ago (edited)

      I got rate limited in 25 minutes by refreshing my Following feed and reading about people getting rate limited. I don’t think it counts 600 unique tweets since I definitely reloaded the same tweets multiple times.

    • myxi@feddit.nl · 7 points · 1 year ago

      Dude is using 100% of his brain to find the best way to cut ad revenue.

    • grinde@programming.dev · 5 points · 1 year ago

      It’s basically everything. Tweets, quoted tweets, replies, and even ads all count against the limit. I’ve seen people saying they hit the limit in under 10 minutes of scrolling. One person said they only managed to post two tweets before being limited.

      And from what I understand spam bots are mostly unaffected since they’re already rate limited for reads (but not posts).

  • Solar Bear@slrpnk.net · 18 points · 1 year ago

    How the fuck is Reddit closing their API behind a ridiculous paywall only the SECOND stupidest social media move of the day

  • RandomBit@lemmy.sdf.org · 15 points · 1 year ago

    I’m shocked, SHOCKED that killing the API would lead to web scraping! That was a completely unpredictable outcome.

  • Lazycog@lemmy.one · 14 points · 1 year ago

    Let’s hope for another Twitter migration wave to Mastodon / other fediverse platforms!

  • sabreW4K3@lemmy.tf · 12 points · 1 year ago

    Rate limiting your core audience in their primary activity is completely batshit crazy. Thank fuck for Mastodon.

  • dark_stang@beehaw.org · 12 points · 1 year ago

    You know things are going well when you have to restrict content consumption on your content platform.

  • Grant_M@lemmy.ca · 12 points · 1 year ago

    This is the result of a deranged fascist being born with an apartheid silver spoon in his mouth.

  • chickenwing@lemmy.film · 12 points · 1 year ago

    You know the fediverse isn’t perfect but it seems more sustainable than these big social media companies that are not profitable. Reddit and Twitter make no real money but want to host everything on their website and I’m not entirely sure why. Image boards like 4chan purge all their data and the fediverse is spread out to a bunch of different servers. What’s the point of keeping everything forever on one server? Do they really think that all that junk data is valuable?

    Also, why did Reddit go from just hosting text to hosting images and videos? It used to be a link aggregation site; now it’s a never-leave-our-borders site. I don’t understand how that’s going to be profitable given how much hosting all that data is going to cost.

    Years ago I used to hit like 15 websites a day just for video game news and discussion then it became all reddit.

  • arctic pie (he/him)@beehaw.org · 11 points · 1 year ago (edited)

    So unless you pay Elon Musk $8/month, you can only load 600 tweets per day. That’s some fucked up shit right there man.

  • relevants@feddit.de · 11 points · 1 year ago

    300 posts is what, maybe 10 minutes of scrolling if you don’t actually engage with much of the content…? Brilliant idea.

    • grinde@programming.dev · 11 points · 1 year ago (edited)

      This is second-hand, so take it with a grain of salt, but I’ve seen mention of a bug that sometimes causes the same GraphQL query to be executed in an infinite loop (presumably they’re async requests, so the browser wouldn’t lock up and the user wouldn’t even notice).

      So they may essentially be getting DDOS’d by their own users due to a bug on their end.

      Edit: better info: https://sfba.social/@sysop408/110639435788921057
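      If the rumor is right, the failure mode would look something like this hypothetical sketch (assumed names and shapes throughout; nobody outside Twitter has seen the real client code):

```python
# Hypothetical sketch of a retry loop with no terminal condition: every
# failed response immediately triggers another request, and each retry
# itself counts against the rate limit -- a self-inflicted feedback loop.
# fetch_timeline and send_query are illustrative names, not Twitter's API.

def fetch_timeline(send_query, max_attempts=None):
    """Re-issue the same query until it succeeds."""
    attempts = 0
    while True:
        ok, data = send_query()  # send_query returns (success, payload)
        if ok:
            return data
        attempts += 1
        # BUG when max_attempts is None: no backoff and no cap, so a
        # persistently failing endpoint gets hammered forever.
        if max_attempts is not None and attempts >= max_attempts:
            raise RuntimeError("gave up after %d attempts" % attempts)
```

      With the buggy default (`max_attempts=None`) a server that starts returning errors turns every open tab into an unwitting load generator, which matches the “DDOS’d by their own users” description.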

    • Scrubbles@poptalk.scrubbles.tech · 3 points · 1 year ago

      Or it could be. It’s no coincidence that scraping went way up when he started charging for the API.

      Everyone with a brain knows that data will be retrieved somehow; the question is whether you want to offer a lower-cost API option or have them scrape the whole webpage.

    • AChiTenshi@vlemmy.net · 2 points · 1 year ago

      I suspect some of the backend is starting to fail, so the servers can’t keep up with demand.

    • jmcs@discuss.tchncs.de · 2 points · 1 year ago

      If it is, he’s even dumber than I thought. You stop scraping by setting a rate limit to something comfortable for humans but painfully slow for scrapers. Something like 60 tweets per minute would all but ensure that humans aren’t affected and that scrapers won’t get anywhere.
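      A limit like that is usually enforced with a token bucket; a minimal sketch, assuming 60 reads per minute with a burst capacity of 60 (both numbers from the comment, not from Twitter):

```python
import time

class TokenBucket:
    """60 reads/minute, refilled continuously, with room for short bursts."""

    def __init__(self, rate_per_min=60, capacity=60):
        self.rate = rate_per_min / 60.0   # tokens added per second
        self.capacity = capacity
        self.tokens = float(capacity)     # bucket starts full
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

      At the thread’s earlier estimate of ~4 seconds per tweet, a human draws ~15 tokens a minute and never empties the bucket, while a scraper pulling hundreds per minute stalls almost immediately.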

      • interolivary@beehaw.org · 3 points · 1 year ago

        Oh yeah I completely forgot about that particular idiocy, Elmo gets up to so much stupid shit that it’s hard to keep track.

        But I’d also be willing to bet money on this being somehow at least partially tied to ditching GCP, likely due to not being able to pay (at least that’s what’s implied by their refusing to pay the bill). I guess Elmo thought “how hard can running some servers be? I’m a rokit skientist” and decided to just skip paying the bill as a power move instead of trying to make a deal with Google, and now the remaining developers, ops people etc. – those poor bastards – are paying the price.

      • fidodo@beehaw.org · 2 points · 1 year ago

        That’s my bet too. They weren’t hosting the site itself on GCP, but they were using it for trust and safety services, and I bet one of those services was anti-scraping protection with things like IP blocking and captchas, which would explain why scraping suddenly became a problem for them the day their contract ended. It can’t be a coincidence.