Sigh… No, it’s really not. It’s a neat toy, and cleverly designed, but it’s not actually going to replace any developers.
I don’t think people actually understand how many moving - and disconnected - parts there are in a real application. What that video shows is a set of built-in generic 2D physics components mashed up with GPT, which interprets drawings and text descriptions and guesses what you want from them.
It’s neat. It’s a step up from those JavaScript ragdoll physics games from ten years ago, where you could draw shapes and have stuff bounce off them in interesting ways.
Now tell it to store game saves somewhere. Now tell it to include a login screen, with OAuth integration. Now tell it to synthesize new, unique music - making sure not to violate copyright, of course. Now tell it to render its assets in 2.5D. Now tell it to include score sharing.
See what I mean? We are years - if not decades - away from AI being able to generate a useful, usable application entirely from scratch, based on nebulous, imprecise instructions and guesswork. There are many, many things it simply can’t do right now, because doing them requires knowledge, and generative AI doesn’t know anything.
We don’t disagree on much. It’s going to be a long and exciting road.
I would caution, however, against hanging your hat too much on an all-or-nothing epistemology. It doesn’t really matter whether something or someone “knows” something if they can apply that information in a useful way. There are gradations of knowing (for humans, too), and there are tasks that can be done productively at every step.
If you’ve ever had an intern or a really green junior you know what I mean ;)
Sure. I actually use AI - well, Copilot - regularly for my job, and I’m well aware of its capabilities. It’s useful.
Every line of code it creates also has to be checked, because it often produces code that either includes hallucinations (e.g., references to functions and methods that don’t exist) or - worse - code that contains no errors as far as the IDE is concerned, but isn’t what I needed.
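As a hypothetical sketch of those two failure modes (the snippets here are invented examples, not actual Copilot output), the first kind of error announces itself loudly, while the second runs cleanly and fails silently:

```python
# Failure mode 1: hallucinated API. Python lists have append(), not push();
# uncommenting the line below would raise AttributeError immediately.
scores = []
# scores.push(42)  # AttributeError: 'list' object has no attribute 'push'

# Failure mode 2: code the IDE accepts but that isn't what was asked for.
# Suppose the request was "return the top three scores" - sorted() is
# ascending by default, so this silently returns the BOTTOM three.
def top_three(values):
    return sorted(values)[:3]

print(top_three([10, 50, 30, 90, 20]))  # [10, 20, 30] - wrong, no error raised
```

The second case is why every generated line still has to be read by someone who knows what the code is supposed to do.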
It’s still helpful. I estimate that it boosts my productivity by around 25% or so, which is huge. But if I were replaced with some MBA - or even a junior dev or intern - because my company became convinced it didn’t need senior developers anymore and someone without my skills could just tell Copilot what to do, they’d either collapse or hire me back within a couple months (and you’d better believe they’d need to offer me the moon for me to accept).
Maybe someday, a large language model can be built that will produce the 100,000 lines of code, in five different repositories, each with its own pipelines, and all the auxiliary configuration files and mechanisms for storing/retrieving secrets and auth tokens and whatnot… that comprise just one of the applications I work on. Maybe.
But that day sure as heck isn’t here yet. Things like Copilot are tools for developers right now, not tools to replace us. Believing they’re capable of replacing us now is as wrong-headed as believing “no-code” tools would replace us fifteen years ago.
I honestly believe there’s a measure of jealousy in declarations that the days of software development by humans are numbered. What we do seems like magic to some people, and I think there’s an urge to demystify us by declaring us obsolete. It happens every few years when there’s something new (something created by developers, ironically) that purportedly does everything we do. Invariably, it doesn’t. If it’s good, like Copilot, it ends up in the toolbox alongside everything else we use. If it’s bad, like “no-code,” it doesn’t.
But until something comes along that can comprehensively see the big picture of a complex application and implement it without human intervention every step of the way, I’m not going to start looking for a job in a different field.
When I compare it to the shift to high level languages, I don’t mean it casually. I mean it as a direct analogy.
Business languages like COBOL were originally intended to be used directly by “non-programmers”. We know how that turned out. Programmers did not go extinct. In fact, their numbers grew enormously as more and more tasks came within economic reach of automation. The productivity increase of high-level languages (which is huge!) is directly responsible for that.
I don’t think AI will make programmers disappear. But it will change the way the field is organized, the way the work is done, and the tasks that can be economically automated. And, here’s the thing, that goes for most knowledge economy jobs. Programming is just the most visible now.
Oh! Sure, I get where you’re coming from now, and agree. For example, writing precisely for a large language model is going to become a prerequisite for software development jobs.
All I’m saying is that those jobs will continue to exist, contrary to the breathless declarations I’ve been seeing that my job is doomed.
It’s a significant leap in abstraction. At least as big as the introduction of the first high level programming languages.