
> The [human] brain seems to be hundreds of thousands to millions of times more energy efficient than any kind of current AI

I don't know about that... I've consumed quite a few calories in my lifetime directly, plus there is all the energy needed for me to live in a modern civilization and make the source material available to me for learning (schools, libraries, internet), and I still have only a minuscule fraction of the information in my head that a modern LLM does after a few months of training.

Translated into kWh, I've used very roughly 50,000 kWh just in terms of food calories... but a modern human uses between 20x and 200x as much energy in supporting infrastructure as the food calories they consume, so we're at about 1 to 10 GWh, which according to GPT5 is in the ballpark for what it took to train GPT3 or GPT4... GPT5 itself needing about 25x to 30x as much energy to train... certainly not hundreds of thousands to millions of times as much. And again, these LLMs have a lot more information encoded into them, available for nearly instant response, than even the smartest human does, so we're not really comparing apples with apples here.
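For anyone who wants to check the arithmetic, here it is as a tiny Python sketch. The inputs (roughly 2,000 kcal/day over ~60 years, the 20x-200x civilization multiplier, the 1-10 GWh training estimate for GPT3/GPT4-class models) are just the rough assumptions above, not measured figures:

    # Back-of-envelope: lifetime human energy vs. LLM training energy.
    # All inputs are rough assumptions, not measurements.
    KCAL_PER_DAY = 2000        # typical adult food intake
    YEARS = 60                 # rough "learning lifetime"
    KWH_PER_KCAL = 0.001163    # 1 kcal ~= 1.163 Wh

    food_kwh = KCAL_PER_DAY * 365 * YEARS * KWH_PER_KCAL
    print(f"food calories alone: ~{food_kwh:,.0f} kWh")    # ~50,000 kWh

    # 20x-200x for supporting civilization (schools, libraries, internet, ...)
    low_gwh = food_kwh * 20 / 1e6
    high_gwh = food_kwh * 200 / 1e6
    print(f"with infrastructure: ~{low_gwh:.0f} to {high_gwh:.0f} GWh")  # ~1 to 10 GWh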

In short, while I wouldn't rule out that the brain uses quantum effects somehow, I don't think there's any spectacular energy-efficiency there to bolster that argument.



> plus there is all the energy needed for me to live in a modern civilization and make the source material available to me for learning (schools, libraries, internet)

To be fair, this is true of LLMs too, and arguably more true for them than it is for humans. LLMs would've been pretty much impossible to achieve w/o massive amounts of digitized human-written text (though now ofc they could be bootstrapped with synthetic data).

> but a modern human uses between 20x and 200x as much energy in supporting infrastructure as the food calories they consume, so we're at about 1 to 10 GWh, which according to GPT5 is in the ballpark for what it took to train GPT3 or GPT4

But if we're including all the energy for supporting infrastructure on the human side, shouldn't we also include it for GPT? Mining the metals, fabricating the chips, etc.? Also, the "modern" is carrying a lot of the weight here. Pre-modern humans were still pretty smart and presumably nearly as efficient in their learning, despite using much less energy.



