
It's a text generator that spits out tokens. It has absolutely no understanding of what it's saying. We as humans are attaching meaning to the generated text.

It's the humans that are hallucinating, not the text generator.
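For anyone who hasn't seen it spelled out, "spits out tokens" means an autoregressive sampling loop: score every vocabulary item given the context so far, turn the scores into probabilities, sample one, append it, repeat. A toy sketch in Python; `toy_logits` here is a made-up stand-in for the real transformer forward pass, which computes these scores from billions of learned weights:

    import numpy as np

    rng = np.random.default_rng(0)
    vocab = ["the", "cat", "sat", "on", "mat", "."]

    def toy_logits(context):
        # Stand-in for the model: one score per vocab item given the
        # context. A real LLM computes these with a transformer.
        return rng.normal(size=len(vocab))

    def sample_next(context, temperature=1.0):
        logits = toy_logits(context) / temperature
        probs = np.exp(logits - logits.max())  # softmax, numerically stable
        probs /= probs.sum()
        return rng.choice(len(vocab), p=probs)  # sample one token id

    tokens = ["the"]
    for _ in range(8):
        tokens.append(vocab[sample_next(tokens)])
    print(" ".join(tokens))

Whether that loop "understands" anything is exactly the question, but mechanically that's all the generation step is.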



This has already been researched: probing has found models inside the LLM, such as a map of the world - https://x.com/wesg52/status/1709551516577902782. Understanding is key to how so much data can be compressed into an LLM. There really isn't a better way to store all of it than to plainly understand it.
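For the curious, the technique behind that link is linear probing: fit a linear map from the model's hidden activations for place-name tokens to latitude/longitude, then check how well it predicts held-out places. A rough self-contained sketch below; the activations are synthetic (coordinates linearly embedded in a random subspace plus noise) and the dimensions are made up, whereas the real work probes actual LLM activations:

    import numpy as np

    rng = np.random.default_rng(0)
    d_model, n_places = 512, 1000

    # Pretend ground-truth (lat, lon) for n_places place names.
    coords = np.column_stack([
        rng.uniform(-90, 90, n_places),    # latitude
        rng.uniform(-180, 180, n_places),  # longitude
    ])

    # Synthetic "activations": location linearly embedded in the
    # residual stream plus noise. The probe tests exactly this
    # assumption: is location linearly decodable?
    W_true = rng.normal(size=(2, d_model))
    acts = coords @ W_true + rng.normal(scale=5.0, size=(n_places, d_model))

    # Fit a linear probe by least squares on half, evaluate on the rest.
    train, test = slice(0, 500), slice(500, None)
    probe, *_ = np.linalg.lstsq(acts[train], coords[train], rcond=None)
    pred = acts[test] @ probe

    err = np.abs(pred - coords[test]).mean(axis=0)
    print(f"mean abs error: lat {err[0]:.2f} deg, lon {err[1]:.2f} deg")

If the probe generalizes to places it never saw, the geometry of the world is linearly encoded in the activations, which is the sense in which the model has a "map."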



