
Why would LLMs be less monetizable than search? People are still going to want to buy goods and services, and being the place they go to find out about those goods and services will be just as valuable as it is today. Perhaps even more so, since LLMs are able to answer questions that are difficult to formulate as search queries.


Probably because they are too expensive. I heard it costs about 7 cents per query on average for ChatGPT. That's more than a typical search ad pays.


With the search query as the prompt, they could probably cache the response and reuse it across identical queries. That might keep the average cost down to something reasonable.
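A minimal sketch of that caching idea, assuming a hypothetical `call_llm` function standing in for the expensive API request (not any real provider's SDK). Normalizing the query before hashing lets near-identical searches share one cached answer:

```python
import hashlib

# Hypothetical stand-in for the paid LLM API call whose per-query
# cost we want to amortize across repeated searches.
def call_llm(prompt: str) -> str:
    call_llm.calls += 1  # count how many paid queries actually happen
    return f"answer to: {prompt}"
call_llm.calls = 0

_cache: dict[str, str] = {}

def normalize(query: str) -> str:
    # Collapse case and whitespace so near-duplicate queries map
    # to the same cache slot.
    return " ".join(query.lower().split())

def cached_answer(query: str) -> str:
    key = hashlib.sha256(normalize(query).encode()).hexdigest()
    if key not in _cache:
        _cache[key] = call_llm(normalize(query))  # only pay on a miss
    return _cache[key]
```

With this, "Best laptop 2024" and "  best LAPTOP 2024 " trigger only one paid call; a real deployment would also need an eviction policy and staleness handling, which this sketch omits.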


Because most queries are not shopping queries. Yet ads are injected regardless to tempt you into buying something.



