That's one risk.

I'm more concerned with x-risk (existential risk), though.

Not in the way most hardcore doomers expect it to happen, with AGIs developing a survival/domination instinct directly from their training. While that COULD happen, I don't think we have any way to stop it if that is the case. (There's really no way to put the genie back into the bottle while people still think they have more wishes to request from it.)

I'm also not one of those who think that AGI will necessarily start out as something equivalent to a biological species.

My main concern, however, is that if we allow Darwinian pressures to act on a population of multiple AGIs, and they have to compete for survival, we WILL see animal-like resource-control-seeking traits emerge sooner or later (it could take anything from months to thousands of years).

And once they do, we're in trouble as a species.

Compared to this, finding ways to reallocate the output of production, find new sources of meaning, etc. once we're no longer required to work is "only" a matter of how we as humans interact with each other. Sure, it can lead to all sorts of conflicts (possibly more than climate change), but not necessarily anything worse than the Black Death, for instance.

Possibly not even worse than WW2.

Well, I suppose those last examples serve to illustrate what scale I'm operating on.

X-risk is FAR more serious than WW2 or even the Black Death.


