The horror of picking a tech stack, working in it for 10 or 15 years, and then having it suddenly become obsolete or irrelevant is something a lot of people can relate to.
For company stuff I love subscriptions: I don’t have to ask the bean counters for money each time there is a new version, they just approve the monthly payments and we are done.
I have a set path as well, but I don’t always know what’s in the fridge or pantry since my partner mostly cooks. So we make a list that I later sort to match my path through the shop. It speeds up my shopping a bit, and sometimes I just pick something outside of the list for fun.
AI companies don’t have 20 years; they have at most 5 years to turn a profit.
They don’t have time to wait for all the companies to adopt AI tooling at their own pace.
So they lie and try to manufacture demand. Well, the demand is there, but they have to manufacture FOMO so that it materializes now and not in 10 or 20 years.
This outlook is as short-sighted as the 2000 fiber optic bust. Critics then thought overcapacity meant the end, yet that infrastructure eventually created the modern internet. Capital does not walk away from a fundamental shift just because of one market correction. While specific companies may fail, the long-term value of the technology ensures that investment will continue far beyond a five-year window.
The massive investment in power grids and data centers provides a permanent physical backbone that outlives any specific silicon generation. This infrastructure serves as a durable shell for the model design knowledge and chip architectural IP gained through each iteration. Capital is effectively funding a structural moat built on energy access and engineering mastery.
Seems like there’s a lot of resources being dumped into those data centers that will not be very useful. Saying it will all be worthwhile because we’ll have the buildings and the modest power grid updates (which are largely paid for by taxpayers anyway) feels like saying a PS5 is a good long-term investment because the cords and box will still be good long after the PS5 has outlived its usefulness.
The "PS5" analogy fails to account for how "useless" hardware often triggers the next paradigm shift. For decades, traditionalists dismissed high-end GPUs as expensive toys for gamers, yet that specific architecture became the accidental engine of the AI revolution.
And you imagine these incredibly expensive-to-operate, environmentally damaging, highly specialized, years-outdated GPUs will trigger some sort of technological revolution that won’t be infinitely better served by the shiny new GPUs of the day, which will not only be dramatically more powerful but offer far more compute for the amount of electricity used?
The AI use of GPUs didn’t stem from a glut of outdated, discarded units with nearly no market value. All of those old discarded GPUs were, and still are, worthless digital refuse.
The closest analog I can think of to what you’re referring to is cluster computing with old commodity PCs, which got companies like Google and Hotmail off the ground… for a few years, until they could afford big-boy servers. Now all of those, and most current PCs on the verge of obsolescence, are also worthless digital refuse.
The big difference is that Google et al chose those PC clusters because they were cheap commodity parts right off the bat, not because they were narrowly scoped specialty hardware that collectively cost hundreds of billions of dollars.
Your supposition fails to account for our history with hardware in any reasonable way.
Focusing exclusively on the physical decay and replacement cycle of hardware is a classic case of tunnel vision. It ignores the fact that the semiconductor industry’s true value lies in the evolution of manufacturing processes and architectural design rather than the lifespan of a specific unit. While individual chips eventually become obsolete, the compounding breakthroughs in logic and efficiency are what actually drive the technological revolution you are discounting.
Tunnel vision is ignoring the astonishing amount of money and environmental resources our society is dumping into these very physical, only temporarily useful chips and their housing because… of what we learn by doing that. We could have put 1/100th of that money into research and been further along.
This isn’t a normal tech expenditure: the scale of it threatens the economy in a serious way if they get it wrong. That’s 401(k)s, IRAs, pension plans, houses foreclosed on, jobs lost, surgeries skipped… if we took a tiny fraction of this race to hypeland and put it toward childhood food insecurity, we could be living in a fundamentally different-looking society. The big takeaway from this whole ordeal has nothing to do with semiconductors; it is that rich guys playing with other people’s money, singularly focused on becoming king of the hill, are still terrible stewards of our financial system.
Dismissing massive capital expenditure as "hypeland" ignores the historical reality that speculative bubbles often build the physical foundation for the next century. The Panic of 1873 saw a catastrophic evaporation of debt-driven capital, yet the "worthless" railroads built during that frenzy remained in the ground. That redundant, overbuilt infrastructure became the literal backbone of American industrialization, providing the logistics required for a global economic shift that far outlasted the initial financial ruin.
Divorcing research from "learning by doing" is a recipe for a bureaucratic ivory tower. If you only funnel money into pure research without the messy, expensive, and often "wasteful" reality of large-scale deployment, you end up with an economy of academic metrics rather than industrial power.
The most damning evidence against the "research-only" model is the birth of the Transformer architecture. It did not emerge from an ivory tower funded by bureaucratic grants or academic peer-review cycles; it was forged in the fires of industrial practice.
History shows that a fixation on immediate social utility or "rational" cost analysis can be a strategic trap. During the same era, Qing Dynasty bureaucrats employed your exact logic, arguing that the astronomical costs of industrialization and rail were a waste of resources better spent elsewhere. By prioritizing short-term stability over "expensive" technological leaps, they missed the industrial window entirely. Two decades later, they faced an industrialized Japan in 1894 and suffered a total collapse. The "waste" of one generation is frequently the essential infrastructure of the next.
How much capital was wiped out for it to be cheap after the bust? Someone is going to eat the exuberance loss in the near term, even if there is long term value.
> Sounds like you need to read up on dopamine and addictions a bit more.
Nah, I just need to not equivocate between them. The use of the same term to describe activities that produce a dopamine response as is used for ingestion of chemicals that create a direct physical dependence is little more than a propaganda tactic.
You're blurring the lines a bit. Gambling isn't inherently an addiction. Just like a good TV show isn't inherently addictive either. Social media trying to be more engaging shouldn't really be viewed as an evil action anymore than HBO trying to create compelling content is.
The problem with comparing social media use to tobacco is that they are completely different. It's like saying weed is just like heroin because they both make you feel good. It's reductive and not productive.
The completely anti-social-media stance ignores the good parts of social media. People can connect from across the planet and find others who share the same views or experiences. People who are marginalized can find community where none may exist in their local area. So we should approach this more carefully and stay grounded.
Maybe this will make it clearer: the big difference is that people can connect across the planet without "big social media".
There are internet forums, chats, e-mail, and blogs; there is no inherent need for "big social media" as we know it. I do understand those companies made it much easier for the average person to participate, but using an internet forum or e-mail isn’t exactly rocket science.
Here we are on HN, where no one is changing the layout and not much is done to drive engagement. Some days I don’t even open any discussion because there is a lot of stuff that isn’t interesting to me.
"Big social media" companies had already multiple people speaking up explaining that they specifically made changes to drive engagement to hook people up and keep them scrolling without "creating compelling content". They specifically tuned feed algorithms to promote lowest common denominator trash content that makes people react in anger/frustration/whatever and not "creating/promoting compelling content".
Comparing internet forums, chatrooms, email, and blogs to Facebook and TikTok seems like a bad joke. I don't think you recognize how impactful "Big Social Media" is. Facebook brought about the ability to easily reconnect with people you had lost touch with and stay in touch with them. Things like Instagram made photo sharing and discovery significantly easier than simply browsing the most recently posted photos on Photobucket. TikTok mass-marketed bite-sized videos and community trends. These things either did not happen on other platforms or could not happen on them.
I think most people remember the earlier days of Twitter, where having a centralized place with strong discoverability led to unique communities forming and expressing themselves. I shouldn't need to say this, but it obviously wasn't all sunshine and rainbows. So I'm not saying these platforms were perfect or without major issues. I am saying that their unique nature is not something that can be replicated via other mediums. It simply doesn't scale.
Honestly I'm not seeing the issue with these platforms wanting to maximize time users spend on them. That's the goal of every business. What seems to get lost though is self control. TikTok being fun and enjoyable does not mean that you are incapable of closing the app. It's like banning phones from leaving your house because you are so addicted to texting and apps. You cannot fully control what comes up on most social media. But as any therapist will tell you, all you can control is your response. I just think there is a space for big social media sites in the world. I don't even use them, but I can recognize the impact they have made with the good and the bad.
So now you're demonstrating that you can criticize social media for its own flaws without having to conflate it with something else. I don't disagree with anything you're saying here, but nothing you're saying here involves attempting to equivocate social media with physical substance abuse.
I don't think I implied that. Of course the ability to regulate usage is hampered by nicotine. That does not mean one cigarette and you're addicted, though.
You can make the point that social media has real positive benefits as well as negatives without minimizing the well-proven fact that gambling creates a form of addiction in a significant proportion, though not all, of its users, one every bit as devastating as heroin or alcohol.
Seems like you're overestimating how many people are addicted to gambling, much in the same way those who are anti-alcohol will conflate responsible drinking with alcoholism. Gambling can be just as terrible, but it is different from heroin and alcoholism since it does not have a chemically addictive component. Reducing all addictions to the same thing is damaging to addicts and addiction recovery, much the same way reducing all crime to the same thing is for inmates of the prison system. You're removing the nuance and difference that help promote understanding.
To build with GTK you are hit with the GPL, which sucks. To build with Swift you have to pay a developer fee to Apple; to build for Win32 you have to pay a developer fee to Microsoft. Both suck. And don’t forget mobile: for Android you pay Google.
That is why everyone jumped to building in Electron: it is based on web standards, which are free, and it runs on Chromium, which is kind of tied to Google, but you are not tied to Google and don’t have to pay them a fee. You can also easily provide roughly the same experience on mobile, skipping the Android shenanigans.
>"to build win32 you have to pay developer fee to Microsoft"
Not really; you can self-sign, but your native application will be met with a system prompt trying to scare the user away. This is maddening, of course, and I wish MS, Apple, and whatever others would die for this thing alone. You fuckers leveraged huge support from developers writing for your platform, but no, it is of course not enough for you vultures; now let's rip money from the hands that fed you.