This is the precise reason I prefer embedded development. The challenge of fitting my entire application into a handful of kilobytes with just a KB or two of RAM is a lot of fun. I get to build a program that runs really fast on a very slow system.
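A taste of what that looks like in practice (a generic sketch, not tied to any particular MCU): everything is statically allocated up front, so your exact RAM footprint is visible at link time and there is no heap to fragment or overflow. A fixed-size ring buffer fed by an interrupt is about as classic as embedded patterns get.

```c
#include <stdint.h>

/* Fixed-size ring buffer for incoming bytes, e.g. from a UART.
   Statically allocated: the whole RAM cost (64 bytes + two
   indices) shows up in the linker map, and there's no malloc. */
#define RX_BUF_SIZE 64u  /* power of two so wraparound is a mask */

static volatile uint8_t rx_buf[RX_BUF_SIZE];
static volatile uint8_t rx_head; /* written by the ISR */
static volatile uint8_t rx_tail; /* read by the main loop */

/* Called from the receive interrupt; drops the byte when full. */
void rx_push(uint8_t byte) {
    uint8_t next = (uint8_t)((rx_head + 1u) & (RX_BUF_SIZE - 1u));
    if (next != rx_tail) {          /* not full */
        rx_buf[rx_head] = byte;
        rx_head = next;
    }
}

/* Called from the main loop; returns 1 on success, 0 if empty. */
int rx_pop(uint8_t *out) {
    if (rx_tail == rx_head)         /* empty */
        return 0;
    *out = rx_buf[rx_tail];
    rx_tail = (uint8_t)((rx_tail + 1u) & (RX_BUF_SIZE - 1u));
    return 1;
}
```

On a part with 1-2 KB of RAM you budget every buffer like this, which is exactly why you end up knowing what the machine is doing.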
It's a point of personal pride for me to really understand what the machine is and what it's doing. Programming this way means working with the machine rather than beating it into submission the way you do with high-level languages.
It seems a lot of programmers see the CPU as a black box, if they think about it at all. I wouldn't expect more than a couple of percent of programmers to truly grok the modern x86 architecture, but if you stop to consider how the CPU actually executes your code, you tend to make better decisions about things like memory layout and branching.
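To make that concrete, here's a minimal C sketch (my own illustration, nothing from this thread): both functions compute the same sum over the same array, but the first walks memory in the order C actually lays it out, so successive reads hit the same cache line, while the second strides a full row apart on every access and fights the cache the whole way.

```c
#include <stdio.h>

#define ROWS 1024
#define COLS 1024

static int grid[ROWS][COLS];

/* Row-major: matches C's memory layout, so consecutive
   accesses land on the same cache line. */
long sum_row_major(void) {
    long total = 0;
    for (int r = 0; r < ROWS; r++)
        for (int c = 0; c < COLS; c++)
            total += grid[r][c];
    return total;
}

/* Column-major: same arithmetic, but each access jumps
   COLS * sizeof(int) bytes, defeating the cache. */
long sum_col_major(void) {
    long total = 0;
    for (int c = 0; c < COLS; c++)
        for (int r = 0; r < ROWS; r++)
            total += grid[r][c];
    return total;
}

int main(void) {
    printf("%ld %ld\n", sum_row_major(), sum_col_major());
    return 0;
}
```

Nothing at the source level distinguishes them; the difference only shows up once you think about what the CPU does with the access pattern.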
In the same vein, very high-level languages are a big part of the problem. They're so far abstracted from the hardware that you can't reason about how your code will actually behave on any real machine. And beneath that sits an invisible iceberg of layer upon layer upon layer of abstraction and indirection, of unknowable, unreadable code, so there's no reasonable way to be sure your line of code does what you think and nothing else.
Modern software practices are bad and we should all throw away our computers and go back to the 8086. Just throw away the entire field of programming and start again.
I love embedded as a hobby, but God is it a silly statement to imply we should go back to low-level asm/C bs for everything; we would get so little done. Oh, but at least it would run fast.
The problem isn't high-level dev; it's companies skimping on the optimisation process.