
I agree. But you could ask which is more intelligent: recognising a trick question and balking, or recognising that the question as posed doesn’t quite make sense and offering a reformulation together with its answer. It’s not always clear whether something’s a trick, a mistake or a strangely worded (but nonetheless intentionally weird) question. So I think it would be very hard to get it to never fall for any tricks.


I think they've fixed it now, but it did seem to recognize popular trick questions, like "what weighs more, a ton of feathers or a ton of bricks?" It would answer with the typical explanation about density not mattering, etc.

But it used to fail on "what weighs more, 3 tons of feathers or 2 tons of bricks?"

So it seems less about whether something is a trick, and more about whether it matches a common question → answer pattern.
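To make that concrete, here's a minimal sketch of how you might probe a model with both the canonical question and the modified one. It assumes the OpenAI Python SDK and the model name "gpt-4o-mini"; both are illustrative choices, not what the commenters above actually tested.

    # Minimal sketch: compare a model's answers to the canonical trick
    # question and a variant where the amounts actually differ.
    # Assumes the OpenAI Python SDK (pip install openai) and an API key
    # in OPENAI_API_KEY; the model name is an arbitrary choice.
    from openai import OpenAI

    client = OpenAI()

    questions = [
        "What weighs more, a ton of feathers or a ton of bricks?",
        "What weighs more, 3 tons of feathers or 2 tons of bricks?",
    ]

    for q in questions:
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": q}],
        )
        print(q)
        print(resp.choices[0].message.content)
        print("---")
    # If the model pattern-matches on the familiar wording, it may claim
    # the second pair weighs the same, even though 3 tons > 2 tons.

Swapping in other rewordings of the second question is a quick way to check whether the answer tracks the numbers or just the familiar phrasing.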


It's the same with humans. I don't fail this question (in an on-the-spot response) because I fell for it as a kid, then learned the trick, then learned to be suspicious of the same trick in similarly worded questions.



