benkarst's comments

Built with speed in mind. Each query takes about 1 second. Initial load time averages 1 second, and loading from cache takes 0.2s. Both of those averages are faster than Google's. But with a lot less functionality!

DuckDuckGo exists as a privacy-first search engine, so why not Komodo as a privacy-first LLM client?


It's not unemployment, it's self-actualized "me" time!


How is it even remotely legal for Shutterstock to cash in on something that photographers never gave consent for? OpenAI is mind-blowingly unethical.

Shutterstock was started in 2003.


Your notions of ethics are obsolete. You can't stop AI. And yes, it's bad news. But it's also inevitable.


Our legal system's idea of intellectual property wasn't designed for, and had no way of anticipating, what it means to use a copyrighted work as 'training data'. Sadly it still has no protections, and even if there were, it's not clear how they would be enforced.


Precisely. Our legal system has two issues. First, the definitions are imprecise, non-objective, and tethered to historical meanings that are no longer relevant. Second, a law you can't enforce is no law at all, even if it's defined perfectly.

AI works on the scale of seconds and nanoseconds. The legal system works on the scale of years. It's too slow and dysfunctional even for human matters; for AI it's hopeless. It's like a tree trying to catch up to a running cheetah.


What do you mean, never gave consent for? I helped a friend put their photos on there, and they most definitely did consent to practically anything.


I mean they never gave consent for their photos to be used as training data.


They did. Read the ToS.


So they built an entire company betting that little old Apple wouldn't mind them hacking a proprietary protocol? Hmm.


Altmanheimer


Greed is undefeated.


Side note: the $10B investment is less than half a percent of MSFT's $2.75T market cap.
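(Taking those figures at face value: 10B / 2,750B ≈ 0.0036, or roughly 0.36%.)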


This plays perfectly into the narrative that Sam wanted to take this godlike tech that Ilya created and commercialize it.

Sam chose greed over safety.


OpenAI never found a true identity under Sam. Sam pursued hypergrowth (treating it like a YC startup), while many in the company, including Ilya, wanted it to be a research company emphasizing AI safety.

Whether you're Team Sutskever or Team Altman, you can't deny it's been interesting to watch extremely talented people fundamentally disagree about what to do with godlike technology.

