
> human neuron is more complex than an AI neuron by a constant factor

The constant can still be out of reach: e.g. 100T neurons in the brain vs 100B in ChatGPT. Also, the brain may involve some quantum mechanics, for example, which would make the complexity difference not constant but, say, exponential.



> and also brain can involve some quantum mechanics

A neuroscientist once pointed this out to me when illustrating how many huge gaps there are in our fundamental understanding of how the brain works. The brain isn't just a series of direct electrical pathways - EMF transmission/interference is part of it. Unmodeled quantum effects are pretty much a guarantee.


Wikipedia says 100 billion neurons in the brain


Ok, I messed up: we need to compare LLM weights with synapses, not neurons, and wiki says there are 100-500T synapses in the human brain.


Ok, let's say 500T. Rumor is GPT-4 is currently 1T. Do you expect GPT-6 to be less than 500T? Non-sarcastic question. I would lean no.


So, say they trained GPT-4 with $10B of funding; then for a 500T model they would need $5T of funding.
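The arithmetic behind that estimate can be sketched as follows. This assumes training cost scales linearly with parameter count (a big simplification; real cost also depends on data and compute scaling), and the 1T / $10B figures are the rumored numbers from the thread, not confirmed:

```python
# Back-of-envelope extrapolation using the thread's rumored/assumed numbers.
# Assumption: training cost scales linearly with parameter count.

gpt4_params = 1e12      # rumored ~1T parameters for GPT-4
gpt4_cost = 10e9        # assumed ~$10B training funding
target_params = 500e12  # ~500T, matching the upper synapse-count estimate

scale = target_params / gpt4_params       # 500x more parameters
estimated_cost = gpt4_cost * scale        # linear extrapolation

print(f"Scale factor: {scale:.0f}x")                       # 500x
print(f"Estimated cost: ${estimated_cost / 1e12:.0f}T")    # $5T
```

Linear scaling is the most optimistic case here; if cost grows super-linearly in parameter count, the $5T figure is a lower bound.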



