I would definitely fact-check search results as much as AI output, especially the info snippets that appear at the top of Google's SERPs.

For example, until a few months ago the results for "pork cooked temperature" and "chicken cooked temperature" returned incorrect values, boldly declaring too low a temperature right at the top of the page (I know these numbers can vary based on how long the meat is held at a given temperature, but I verified that Google was parsing the info incorrectly from the page it was referencing, pulling the temperature for the wrong kinds of meat). That was potentially dangerous incorrect info, IMO.

Snippets have become so useless I use a plugin to remove them.

What is ridiculous is that when, say, Stack Overflow has a good answer, it sits a few results down or on the next page, while some page-mill SEO site gets the snippet up top with a completely wrong or pathetically half-correct answer. It is so annoying that it has lowered my opinion of Google a lot recently.

> I would definitely fact-check search results as much as AI output, especially the info snippets that appear at the top of Google's SERPs.

Yes, so would I. And I also double-check things like Google Maps -- a tool I find very helpful but don't trust blindly. But... do most people think to take a close look at Google Maps to make sure it makes sense, and trust their own judgement if they disagree with the map? Will most people fact-check confident LLM outputs?