
Define what “working right” means. I think that’s the core of the issue with rolling out Siri as an LLM. People trust the answers Siri gives them, and LLMs hallucinate… that’s a problem. To me, a hallucination means something isn’t working right. That’s not just an Apple problem; it’s an industry problem that everyone mostly tries to whitewash.

