My thoughts exactly. Natural language semantics is imperfect and human reasoning is weird. Let's not mistake LLMs for a single source of absolute truth; treat them instead as funny, bullshitting assistants who happen to have read, and vaguely remember, a great deal of information.

