
Strange definition. Humans can't recursively self-improve, at least not in the way I assume you mean. That's more like the definition of the singularity.


Well, humans aren't machines. Why would a definition of AGI need to apply to humans? On the other hand, as we gain the ability to edit our DNA, I think recursive self-improvement over generations is on the table for our species.

I guess it would be useful to have a definition of weak AGI, but after reading Bostrom's Superintelligence, I struggle to imagine an AGI without a singularity or intelligence explosion. Expecting one without the other seems like wishful thinking.



