Just because you can have a robot/machine that can efficiently churn out 1000 frozen lasagnas a second doesn't necessarily mean that Italian restaurants have been "outcompeted" or "left behind" by not using such a machine in their business.
Sometimes quality and responsibility matter. Even if a machine is really good at producing bug-free code, someone will often have to read and understand the code the machine produces and take responsibility for it, since machines cannot take responsibility for things.
In the end, you'll always need a human who understands the formal algorithmic language (i.e., a programmer) and is capable of parsing the formal program that will actually run, if you want to be able to trust that program. Any way of mapping an informal request (described imperfectly in an informal language) to a formal construct is going to be "lossy" and prone to errors, and you'll know this if you've ever used an automatic code generator. Just because someone is willing to blindly trust automatically generated code doesn't mean everyone else is: there are contexts in which you need a person to blame when things go wrong.
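A toy illustration of that lossiness in Python (nothing here comes from a real code generator; it's just standard-library behavior): the informal request "round the price to two decimals" admits at least two formal readings that disagree on the same input.

```python
# The informal spec "round the price to two decimals" maps to (at least)
# two formal programs that give different answers for the same value.
from decimal import Decimal, ROUND_HALF_UP

price = 2.675

# Reading 1: Python's built-in round() on a binary float.
# 2.675 is actually stored as 2.67499999..., so this prints 2.67.
print(round(price, 2))

# Reading 2: decimal arithmetic with "round half up", as a shopkeeper
# might have meant. This prints 2.68.
print(Decimal("2.675").quantize(Decimal("0.01"), rounding=ROUND_HALF_UP))
```

Neither program is "wrong"; the informal request simply never said which one was meant, and only someone who can read the formal version can tell the difference.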
Ok, but to continue this analogy, industrialization and the ability to create 1000 frozen lasagnas a second had an enormous impact on the world. Not only on the economics of production, but ultimately on human society.
Sure, handmade lasagna still exists, but the world looks nothing like it did 200 years ago.
Heh, most cooking these days is like using libraries to build an application.
I don't slaughter an animal, I buy a cut of meat.
It's very rare I make pasta, I rehydrate dried husks.
The cheese comes in some kind of jar or package. The vegetables come from a store.
This has been the general move in applications too. I see companies with very large programs that are just huge sets of node modules joined together with a small amount of code.
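The same pattern in Python terms (feedparser and the feed URL are just stand-ins I picked, not from any particular company): an "application" that is almost entirely other people's modules, held together by a few lines of glue.

```python
# Nearly all of the work here lives in the dependency; the "app"
# itself is a thin layer of glue code.
import feedparser  # third-party module does the fetching and parsing

def latest_headlines(feed_url: str) -> list[str]:
    # The only logic we actually wrote: pick the first five titles.
    return [entry.title for entry in feedparser.parse(feed_url).entries[:5]]

if __name__ == "__main__":
    for title in latest_headlines("https://news.ycombinator.com/rss"):
        print(title)
```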
> and take responsibility for it, since machines cannot take responsibility for things.
That's an interesting thought: people "taking responsibility" as a form of labour, for machines that stole their lunch. There would probably be around zero applicants for that job.
Responsibility is a complex quality needing capacity and competence. Right now even the manufacturers of "AI" are unable to assert much about behaviour.
Where "responsibility" exists around robotics and ML it will more
likely be a blunt legal instrument tied to ownership, like owning a
dangerous animal.
AI can be used to support that activity too. Models can just as well be used to explain existing code, possibly cranked out by another AI. I bet many companies are thinking about fine-tuning or LoRA-ing language models on their moldy codebases and outdated piles of documentation to make onboarding, refactoring, and routine extensions easier.
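A rough sketch of what that could look like with the Hugging Face stack (transformers + peft); the base model, file name, and hyperparameters are placeholders, not a tested recipe:

```python
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

base = "gpt2"  # placeholder; a real attempt would start from a code model
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(base)

# LoRA: train small low-rank adapters on top of the frozen base weights.
# "c_attn" is GPT-2's attention projection; other models name theirs differently.
model = get_peft_model(model, LoraConfig(
    r=8, lora_alpha=16, lora_dropout=0.05,
    target_modules=["c_attn"], task_type="CAUSAL_LM",
))

# Hypothetical plain-text dump of the internal codebase and docs.
data = load_dataset("text", data_files="internal_codebase.txt")["train"]
data = data.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True, remove_columns=["text"],
)

Trainer(
    model=model,
    args=TrainingArguments(output_dir="lora-out", per_device_train_batch_size=2),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()
```

The adapters are tiny compared to the base model, which is what makes this kind of thing plausible for a company to run on its own hardware.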
To interpret what AI models themselves are actually doing, researchers employ AI models as well.
The point is that people won't get paid to do it anymore. It has happened before: many activities that were replaced by technology have been almost forgotten (for example, the reader who was paid to read newspapers aloud to factory workers) or survive only as art or niche crafts. Careers built on these are either wildly successful or come with highly unsteady income, since they depend on the whims, not the needs, of people.
All disciplines evolve over time and those who fail or refuse to keep up will be left behind.