Hacker News

Well, my son is a meat robot who's constantly ingesting information from a variety of sources, including but not limited to YouTube. His firmware includes a sophisticated real-time operating system that models reality in a way that allows him to interact with the world symbolically. I don't think his solving of the |i+1| question was grounded in linguistic similarity, but rather in a physical-model / visualization similarity.

So -- to a large degree "bucket of neurons == bucket of neurons", but the training data is different and the processing model isn't necessarily identical.

I'm not necessarily disagreeing as much as perhaps questioning the size of the neighborhood...



Heh, I guess it's a matter of perspective. Your son's head is not made of silicon, so in that sense it is a large neighborhood. But if you put them both behind a screen and only see the output, then the neighborhood looks smaller. Maybe it looks even smaller a couple of years in the future. It certainly looks smaller than it did a couple of years in the past.


From the meat robot perspective, the structure, operation, and organisation of the neurons are also significantly different.


Maybe Altman should just go have some kids and RLHF them instead.


Doesn't scale.

Too many years to max compute. All models limited lifespan inherent.

Avg $200k+ training cost over 18 years, in-house data center. More for reinforcement.

He's still 38. Gates took much longer to stop working 24/7.



