• MysticKetchup@lemmy.world

    Impressive, but being able to make a Breakout clone is one of the most basic tests. How well does it do with things that haven’t been made a million times before?

    • kescusay@lemmy.world

      I have a feeling that if it can make something highly complex, doing so will require a very high degree of precise and detailed instructions. You know what we call precise and detailed enough instructions to produce a working, sophisticated game?

      Code. It’s called code.

      • vzq@lemmy.blahaj.zone

        It’s a significant leap in abstraction. At least as big as the introduction of the first high level programming languages.

        • kescusay@lemmy.world

          Sigh… No, it’s really not. It’s a neat toy, and cleverly designed, but it’s not actually going to replace any developers.

          I don’t think people actually understand how many moving - and disconnected - parts there are in a real application. What that video shows is a set of built-in, generic 2D physics components mashed up with GPT, which interprets drawings plus text descriptions and guesses what you want from them.

          It’s neat. It’s a step up from those JavaScript ragdoll physics games from ten years ago, where you could draw shapes and have stuff bounce off them in interesting ways.

          Now tell it to store game saves somewhere. Now tell it to include a login screen, with OAuth integration. Now tell it to synthesize new, unique music - making sure not to violate copyright, of course. Now tell it to render its assets in 2.5D. Now tell it to include score sharing.
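
          To make that concrete, here’s a rough, hypothetical sketch of just the first item on that list - persisting a game save in the browser with localStorage. Every name and type here is made up for illustration; the point is that “store game saves somewhere” already implies decisions about schema, versioning, and failure handling that a one-line prompt never states.

          ```typescript
          // Hypothetical save format - not from the video or any real project.
          interface GameSave {
            version: number;   // needed the moment the save format changes
            score: number;
            level: number;
            savedAt: string;   // ISO timestamp, so saves can be ordered later
          }

          const SAVE_KEY = "breakout.save"; // made-up storage key

          function writeSave(save: GameSave): void {
            localStorage.setItem(SAVE_KEY, JSON.stringify(save));
          }

          function readSave(): GameSave | null {
            const raw = localStorage.getItem(SAVE_KEY);
            if (raw === null) return null;
            try {
              return JSON.parse(raw) as GameSave;
            } catch {
              return null; // corrupted save: fail soft instead of crashing the game
            }
          }
          ```

          And that’s the easy version - no cloud sync, no conflict resolution, no migration between save versions.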

          See what I mean? We are years - if not decades - away from AI being able to actually fully generate a useful, usable application from scratch, based on nebulous, imprecise instructions and guesswork. There are many, many things it simply can’t do right now, because doing them requires knowledge, and generative AI doesn’t know anything.

          • vzq@lemmy.blahaj.zone

            We don’t disagree on much. It’s going to be a long and exciting road.

            I would caution, however, against hanging your hat on all-or-nothing epistemology. It doesn’t really matter whether something or someone “knows” something if they can apply that information in a useful way. There are gradations of knowing (for humans too), and there are tasks that can be done productively at every step.

            If you’ve ever had an intern or a really green junior you know what I mean ;)

            • kescusay@lemmy.world

              Sure. I actually use AI - well, Copilot - regularly for my job, and I’m well aware of its capabilities. It’s useful.

              Every line of code it creates also has to be checked, because it often produces code that either includes hallucinations (e.g., references to functions and methods that don’t exist) or - worse - code that contains no errors as far as the IDE is concerned, but isn’t what I needed.
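
              For instance, here’s a made-up example of that second failure mode - nothing from my actual codebase, just the shape of it. Ask for “items updated in the last N days” and you can get something that type-checks perfectly but filters on the wrong field:

              ```typescript
              interface Item {
                name: string;
                createdAt?: Date;
                updatedAt?: Date;
              }

              // Looks plausible, compiles cleanly... and answers a different question,
              // because it checks creation time instead of update time.
              function recentlyUpdated(items: Item[], days: number): Item[] {
                const cutoff = Date.now() - days * 24 * 60 * 60 * 1000;
                return items.filter(
                  (i) => i.createdAt !== undefined && i.createdAt.getTime() >= cutoff
                );
              }
              ```

              The IDE has no complaints; only a human who knows what was actually asked for catches it.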

              It’s still helpful. I estimate it boosts my productivity by around 25%, which is huge. But if I were replaced by some MBA - or even a junior dev or intern - because my company became convinced it didn’t need senior developers anymore and that someone without my skills could just tell Copilot what to do, they’d either collapse or hire me back within a couple of months (and you’d better believe they’d have to offer me the moon for me to accept).

              Maybe someday a large language model will be built that can produce the 100,000 lines of code, in five different repositories, each with its own pipelines, plus all the auxiliary configuration files and mechanisms for storing/retrieving secrets and auth tokens and whatnot… that comprise just one of the applications I work on. Maybe.

              But that day sure as heck isn’t here yet. Things like Copilot are tools for developers right now, not tools to replace us. Believing they’re capable of replacing us now is as wrong-headed as believing “no-code” tools would replace us fifteen years ago.

              I honestly believe there’s a measure of jealousy in declarations that the days of software development by humans are numbered. What we do seems like magic to some people, and I think there’s an urge to demystify us by declaring us obsolete. It happens every few years when there’s something new (something created by developers, ironically) that purportedly does everything we do. Invariably, it doesn’t. If it’s good, like Copilot, it ends up in the toolbox alongside everything else we use. If it’s bad, like “no-code,” it doesn’t.

              But until something comes along that can comprehensively see the big picture of a complex application and implement it without human intervention every step of the way, I’m not going to start looking for a job in a different field.

              • vzq@lemmy.blahaj.zone

                When I compare it to the shift to high level languages, I don’t mean it casually. I mean it as a direct analogy.

                Business languages like COBOL were originally intended to be used directly by “non-programmers”. We know how that turned out. Programmers did not go extinct. In fact, their numbers grew enormously as more and more tasks came within economic reach of automation. The productivity gain from high-level languages (which is huge!) is directly responsible for this.

                I don’t think AI will make programmers disappear. But it will change the way the field is organized, the way the work is done, and the tasks that can be economically automated. And, here’s the thing, that goes for most knowledge economy jobs. Programming is just the most visible now.

                • kescusay@lemmy.world

                  Oh! Sure, I get where you’re coming from now, and I agree. For example, writing precise prompts for a large language model is quickly becoming a prerequisite for software development jobs.

                  All I’m saying is that those jobs will continue to exist, contrary to the breathless declarations I’ve been seeing that my job is doomed.

      • anlumo@lemmy.world

        Yes, but unlike with code, you can start from a vague description and iterate toward precision over time. Code has to be highly precise from the start.

        • kescusay@lemmy.world

          I’m sorry, but I don’t see it as significantly different. Code from the start is usually “hello world” followed by iterations to make it closer and closer to what you want. The syntax is more complicated (and far more flexible), but the process of narrowing in on what you want by adding more and more precise detail seems awfully similar to me.
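
          As a toy illustration (completely made up, but typical of the loop I mean), each pass pins down behaviour the previous one left unspecified - exactly like refining a prompt:

          ```typescript
          // Pass 1: "move the paddle" - works, but happily slides off the screen.
          function movePaddleV1(x: number, dx: number): number {
            return x + dx;
          }

          // Pass 2: "...and keep it inside the playfield" - the added detail becomes added code.
          function movePaddleV2(
            x: number,
            dx: number,
            paddleWidth: number,
            fieldWidth: number
          ): number {
            const next = x + dx;
            return Math.min(Math.max(next, 0), fieldWidth - paddleWidth);
          }
          ```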

    • AlternateRoute@lemmy.ca

      Yeah, we need to ask it to make something NEW, not something that probably appears in every programming-language example book the model was trained on.

    • mr_tyler_durden@lemmy.world

      Also, you know that wasn’t the first time he hit “generate”; he added multiple other prompts and text until he got it working, then demoed the final thing. It’s disingenuous at best, and the “might start looking for another job” bit just about made my eyes roll out of my head. It’s a fucking Breakout clone, and not a very good one at that. Give me a break.

    • emptiestplace@lemmy.ml

      I think you are vastly overestimating the uniqueness of most of what we do, and I think that’s probably an adequate rebuttal here, but for the sake of gratuitous verbosity, let’s say it weren’t: the hypothetical ‘thing’ to which you refer will almost always be made of many pieces that have been made a million times before. And as we can break a problem down to solve it and effectively produce something we consider novel, so too can it - especially with a bit of expert guidance.

      If a conventional expert can delegate pieces of their puzzle to ‘an LLM’, and achieve near-instantaneous results comparable in quality to what they might hope to eventually get from a team of motivated but less experienced folks, why wouldn’t they - and how does this not portend the obsolescence of human expertise as we know it? If that seems absurd, consider how expertise is gained.

      More directly, though not aimed at you: I am confident that anyone who shares your sentiment either has not spent any meaningful time working with GPT-4, or lacks the competencies necessary to assess and recognize the unmistakable glints of something new - of a higher level of comprehension and ability.

      It worries me, seeing so many intelligent people so wilfully unprepared for what is coming, oblivious to the fact that what we are arguing about today is already irrelevant. Even though things have been changing at a literal fuck-you rate, technologies are converging and progress is accelerating in ways that are necessarily incomprehensible even to the folks ostensibly driving it.

      We should already be way past this point, collectively. It isn’t going to take more than a couple of quick iterations to leave the naysayers in the same pool as DT supporters in terms of the sheer force of cognitive dissonance required to get through a day.

      It is ok that we aren’t special, but failing to come to terms with this reality … probably won’t bode well for us.

      • MysticKetchup@lemmy.world

        I have messed around with generative AI, and that is what led me to the conclusion that it’s just derivative replication of things humans have already done. Trying to direct the AI to create specific visions or wholly original things feels like trying to herd cats; it’s just not very good at it.

        While there are obvious applications for AI even if it is only useful for replicating things, it’s starting to feel like the whole thing is smoke and mirrors when it comes to how much AI is actually capable of. And they just keep saying “think of how good it will be in the future,” which makes it seem even more like the next crypto/NFT bubble - especially when AI companies are burning through money so fast that they’re bound to try to get industries dependent on their tech before squeezing and enshittifying them for all they’re worth.

        • emptiestplace@lemmy.ml

          The vast majority of things humans do (and receive monetary compensation for) are things humans have already done; the result of countless generations of failure-driven iteration.

          If you’re interested in this you might enjoy exploring the ideas around consciousness as an emergent property, and the work of Douglas Hofstadter.

          …and try GPT-4 before you write it off.

          • kescusay@lemmy.world

            For my job, I use Copilot, which is built on GPT-4, and I have zero concern that it’s going to replace me.

            It’s very useful, don’t get me wrong. A lot of the time it makes generating new code in existing applications a breeze (minus hallucinations and other mistakes, of course). But it simply can’t create whole new applications of any complexity from scratch, and it requires actual developers to check the code it does create. It doesn’t actually know what you want; it’s just auto-completing based on what its model decides you want.

            Again, it’s very good at that. But it’s not so good that you can replace a team of developers with just one… Or worse yet, with an MBA who thinks he can figure it out without paying anyone.

      • vzq@lemmy.blahaj.zone

        “I think you are vastly overestimating the uniqueness of most of what we do”

        I call it the “magical meat fallacy”.

  • burliman

    This attitude is so broken. You don’t need to look for a new job; you can just do more complex things now. You get to conduct an orchestra instead of playing a single instrument.

    • kescusay@lemmy.world

      Pretty sure the guy in the video is the developer of that application, and is joking (and hyping his app).

    • akwd169@sh.itjust.works

      By that logic, the company only needs one person from the orchestra and the rest are fired.

      But the other person who replied to you makes the most pertinent point anyway.