alienicecream on May 15, 2024 | on: We gotta stop ignoring AI's hallucination problem
Intelligence is the ability to comprehend a state of affairs. The input and the output are secondary. What LLMs do is take the input and the output as primary and skip over the middle part, which is the important bit.