Hacker News

What's fascinating is that developers have been automating away their jobs for decades with tools and libraries... so will this just add to the automation pressure, or be some sea change?

Hard to know, economically speaking...



Difficult to say. My bold prediction is that within the next two decades, web-dev bootcamps decline as the field matures and its barrier to entry rises, despite the pay being excellent.

How much less menial labor does one do with modern web frameworks relative to the state of the art in 2002?


I find the menial work is about the same.

Less was possible in 2002, so the menial tasks were different. But I think the split between interesting/novel code and boilerplate/menial work was about the same.


That is a bold prediction, since people are constantly reinventing the web. For a few years it really looked like Macromedia Flash was going to be the thing to learn. These days it's reactive UI components. Tomorrow, who knows? I can't see the web ever really maturing, as it's not terribly hard to apply a new paradigm every few years that gains traction and moves the state of the art.


"field matures" is the operative phrase. Flash jumped in a void that was left by a pretty nonfunctional html experience. But it was proprietary, presumably was an energy hog, insecure, and was essentially a stand-alone experience that didn't play nicely with the browser. Reactive UI components are based on web standards, and it is VERY hard to imagine web standards (HTML, CSS, and JavaScript in particular) being replaced by anything in the near future. They are the "C" of the web for better or worse.

And yes, I know WebAssembly is supposed to loosen JavaScript's hold on the web, but as far as I can see that has not happened yet, even though every browser now supports it. I am not exactly sure why.


For one thing, the JS VM, V8, is extremely good these days. I've been very curious about these questions, and in single-threaded contexts V8 seems to beat the JVM, with compute performance roughly equivalent to Rust in debug mode.

Since the bindings to the rest of the "essential" browser model are all geared towards JS, the appropriate mental model might be that JS is the "assembly" of the browser, even though it is not assembly in any meaningful compute/memory-model sense.
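That glue-code relationship can be sketched concretely. Below is a minimal hand-assembled Wasm module (the byte blob is written out by hand, not generated by any toolchain) that exports a single `add` function; note that the module itself cannot touch the DOM, fetch, or any other browser API, so JS has to sit in the middle:

```javascript
// A minimal Wasm binary exporting add(a, b) -> a + b, assembled by hand.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // "\0asm", version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // function 0 uses type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export it as "add"
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section, one body
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0; local.get 1; i32.add; end
]);

// All instantiation and every call crosses through JS glue like this.
const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(instance.exports.add(2, 3)); // 5
```

This runs as-is in Node or a browser console; the point is that even a "pure Wasm" app still needs a JS shim for anything the browser exposes.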


Tbh, while I know this argument is hard to push on most companies, "old web technologies" are just as relevant as they were a decade ago.

If I look at any project I've made in the last ten years with "hype technologies" (REST APIs everywhere, front-end frameworks so complex that the front end is literally a second codebase), they all could have been developed with an old, mature framework like Django, RoR, or ASP.NET.

Once those projects are done, you can clearly see that things like sending a form are incredibly complex and non-standard, resources are wasted everywhere, and only one to three APIs are consumed externally, while you maintain the dozen others for your own usage.
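To make the form-sending point concrete, here is a rough sketch (the field names and endpoint are made up for illustration) of the contrast: a plain `<form method="post">` gets serialization, navigation, and error pages from the browser for free, while an SPA front end typically hand-rolls all of it:

```javascript
// What the browser does for free with a plain <form>: a urlencoded body.
function classicFormBody(fields) {
  return new URLSearchParams(fields).toString();
}

// What an SPA typically reimplements by hand: serialize to JSON, POST,
// check status, parse the response, then update client-side state.
// ("/signup" is an illustrative endpoint, not from the thread.)
async function submitJson(url, fields) {
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(fields),
  });
  if (!res.ok) throw new Error(`request failed: ${res.status}`);
  return res.json();
}

console.log(classicFormBody({ name: "Ada", plan: "free" })); // name=Ada&plan=free
```

Neither version is wrong; the point is that the second one is application code you now own, test, and maintain.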

The sole reason we are on a more complex stack is that the industry magically decided that "web development" was in fact two different jobs.


The one thing I like about "modern" JS-driven websites, now versus then, is that interactive web software is in general much more maintainable now.


> Hard to know, economically speaking...

It's not hard to know at all; it's a well-studied area of economics.

https://en.wikipedia.org/wiki/Jevons_paradox



