
No, it doesn't. Kurzweil has been making this claim for the last decade based solely on the increasing speed of computers, ignoring the fact that we don't yet have any clue how general intelligence actually works. It doesn't matter how fast our computers are if we don't know what algorithms will give rise to "intelligence", and we've made virtually no headway in this field.

The examples of "AI" cited in the article are remarkable, but they are still either extremely narrow or not really intelligence at all. Siri, etc., are nothing more than text parsers that give a canned set of responses. The work on neural networks is interesting but still, at best, only a small component of actual AI. (Note: I'm not going to define an "actual AI". Yes, I know we keep moving the goalposts on what that would be. I'll know it when I see it, and so will you.)

I'm not saying it won't happen, but it will require a kind of conceptual breakthrough that we simply haven't had yet. To hype "the singularity is nigh!" at this point is dishonest: it trivializes the real problems and sets false expectations for industry and policy-makers.



His argument, if you read his work, isn't based solely on the increasing speed of computers.

His idea is that progress is exponential in all of the requisite areas. That includes algorithms, hardware, biology, neuroscience, and more.

> Siri, etc, are nothing more than text parsers that give a canned set of responses

We say the same about everything once we can do it with computers, because progress doesn't come magically. It's incremental. Yet much of what we have already is what used to be "science fiction" and, before that, "magic". I'm sure when AIs are passing the Turing test, we won't think any more of computers, but less of the Turing test.

This isn't an argument against Kurzweil's predictions, it's just moving the goalposts as you say. If strong AI comes, we'll move them all the way there and maybe even a bit past.

I disagree that we'll know it when we see it though. I think we'll deny it until we die and the new generation grows up in a world in which computers have rights.

> I'm not saying it won't happen, but it will require a type of conceptual breakthrough that we simply haven't had yet.

Finally, Kurzweil's argument does not require a breakthrough as a premise. His idea is to scan the brain at the neuronal level, then brute-force its simulation at the biochemical level. It's straightforward extrapolation.

You could propose that we will come across some roadblock in our exponential progress towards that goal, but in the absence of one, the null hypothesis is that progress will continue as it has. Then indeed, the singularity is nigh.
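To make the extrapolation concrete, here's a toy back-of-envelope sketch in Python. Every number in it is a placeholder I'm assuming for illustration (ops/sec needed for a whole-brain simulation, today's price-performance, the doubling time), not Kurzweil's actual figures; the point is only that the date falls straight out of the assumptions:

    import math

    # All three numbers below are assumptions for illustration only.
    required_ops = 1e16   # assumed ops/sec for a real-time whole-brain simulation
    current_ops  = 1e12   # assumed ops/sec you can buy for $1000 today
    doubling_yrs = 1.5    # assumed doubling time of price-performance

    doublings = math.log2(required_ops / current_ops)
    years = doublings * doubling_yrs
    print("~%.0f years until $1000 of hardware hits the target" % years)
    # With these made-up inputs the answer is roughly 20 years. Change any
    # assumption and the date moves, which is where a "roadblock" would show up.

That's all the argument is: pick your assumptions and the conclusion follows mechanically.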


Well, I'm skeptical of Kurzweil's prediction too, but I see where he's coming from. His prediction is based solely on computing power because he also sees general AI as a pure brute-force problem. It's the same approach he took to predict that a computer would beat a human at chess before 2000, and he got that one right.


Another one he got right was that we wouldn't think more of computers when it happened: we would simply think less of chess.


Yes, and that's precisely why he's wrong.


If intelligence is just an emergent property of a sufficiently complex system, we won't need more than brute force. That seems like a very reasonable hypothesis considering the wetware prior art, so I wouldn't be so quick to say he's wrong.

What I'm skeptical about are the predictions that we will reach something more intelligent than humans (how do we even quantify intelligence?), that it will improve our culture, and the other sci-fi stuff...

Even if those predictions turn out to be wrong, brute force could still get us somewhere interesting: not human-like intelligence, but something new and complementary.


If we look back 30 years to 1984 and try to estimate how much progress we've made, that might give us some indication of how much change we'll see in the next 30 years.

I'd argue, however, that since the rate of change is accelerating, maybe we should actually compare to 60 years ago. In 1954 we were practically in the stone ages.
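
A toy way to see why, assuming "capability" compounds and doubles every 30 years (the doubling period is made up purely for illustration):

    # Assumed model: capability doubles every 30 years, normalized to 1 in 2014.
    def capability(year, base_year=2014, doubling_years=30):
        return 2 ** ((year - base_year) / doubling_years)

    past_30 = capability(2014) - capability(1984)  # 0.5
    past_60 = capability(2014) - capability(1954)  # 0.75
    next_30 = capability(2044) - capability(2014)  # 1.0
    print(past_30, past_60, next_30)
    # Under this model the next 30 years brings more change than the last 30,
    # and even a bit more than the last 60, so 1954 is the fairer comparison.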



