• Ilovethebomb · 2 months ago

    Considering the hype and publicity GPT-4 produced, I don’t think this is actually a crazy amount of money to spend.

    • oce 🐆@jlai.lu · 2 months ago

      Yeah, I’m surprised at how low that is: a software engineer in a developed country costs about 100k USD per year,
      so 40M USD for training GPT-4 is the cost of 400 engineers for one year.
      They say salaries could make up to 50% of the total, which would put the whole effort at the equivalent of 800 engineers for one year.
      That doesn’t seem extreme.
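
      A quick back-of-the-envelope of that arithmetic in Python (the 40M training cost and the 100k salary are the assumptions from this comment, not official figures):

          # Engineer-year equivalence for the reported GPT-4 training cost.
          # Both inputs are rough assumptions from this thread, not OpenAI data.
          training_cost_usd = 40_000_000       # claimed training cost
          engineer_cost_per_year = 100_000     # rough fully loaded cost of one engineer

          compute_engineer_years = training_cost_usd / engineer_cost_per_year
          print(compute_engineer_years)        # 400.0

          # If salaries make up ~50% of the total, the total is roughly double that:
          total_engineer_years = compute_engineer_years * 2
          print(total_engineer_years)          # 800.0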

    • Voroxpete@sh.itjust.works · 2 months ago

      Comparatively speaking, a lot less hype than their earlier models produced. Hardcore techies care about incremental improvements, but the average user does not. If you try to describe to the average user what is “new” about GPT-4, other than “It fucks up less”, you’ve basically got nothing.

      And it’s going to carry on like this. New models are going to get exponentially more expensive to train, while producing less and less consumer interest each time, because “Holy crap, look at this brand new technology” will always be more exciting than “In our comparative testing, version 7 is 9.6% more accurate than version 6.”

      And for all the hype, the actual revenue just isn’t there. OpenAI are bleeding around $5-10bn (yes, with a b) per year. They’re currently trying to raise around $11bn in new funding just to keep the lights on. It costs far more to operate these models (even at the steeply discounted compute costs Microsoft are giving them) than anyone is actually willing to pay to use them. Corporate clients don’t find them reliable or adaptable enough to actually replace human employees, and regular consumers think they’re cool, but in a “nice to have” kind of way. The product isn’t essential enough to pay big money for, yet it can only be run profitably by charging big money.

    • huginn@feddit.it · 2 months ago

      The latest release, GPT-4o, costs about $600/hr per instance to run, based on the discussion I could find about it.

      If OpenAI is running 1k of those instances to service the demand (they’re certainly running more, since queries can take 30+ seconds), that’s on the order of $5bn/yr just keeping the lights on.
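
      A rough annualisation of those figures in Python (the $600/hr per-instance cost and the 1k-instance fleet size are both assumptions from this comment, not anything OpenAI has published):

          # Annual serving cost under the assumptions above.
          cost_per_instance_per_hour = 600   # USD/hr, claimed GPT-4o serving cost
          instances = 1_000                  # assumed fleet size
          hours_per_year = 24 * 365

          annual_cost = cost_per_instance_per_hour * instances * hours_per_year
          print(f"${annual_cost / 1e9:.2f}bn per year")   # $5.26bn per year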

        • huginn@feddit.it · 2 months ago

          The $3.4bn is their gross revenue - we have no idea what their operating costs are, since they refuse to share them.

          Some estimates say they’re burning 8 billion a year.