Hacker News
RandyOrion | 10 months ago | on: The Llama 4 herd
People who downvoted this comment: do you really have GPUs with 80 GB of VRAM, or an M3 Ultra with 512 GB of RAM, at home?
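For scale, here is a rough back-of-envelope sketch of why those numbers come up. The model size (~400B total parameters, in the ballpark of the larger Llama 4 variant) and the comparison hardware are assumptions for illustration, not official specs; the calculation covers weights only and ignores KV cache and activations.

```python
def weight_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate memory needed just to hold model weights, in GB.

    1B parameters at 1 byte each is ~1 GB. Ignores KV cache,
    activations, and runtime overhead, so real needs are higher.
    """
    return params_billions * bytes_per_param

# Hypothetical ~400B-total-parameter MoE model (order of magnitude only)
PARAMS_B = 400
for label, bpp in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    need = weight_gb(PARAMS_B, bpp)
    print(f"{label}: ~{need:.0f} GB of weights vs 80 GB GPU / 512 GB Mac")
```

Even at aggressive 4-bit quantization, the weights alone (~200 GB under these assumptions) overflow a single 80 GB GPU, though they could fit in a 512 GB unified-memory machine.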
rfoo | 10 months ago
I don't. And I have no problem with not running open-weight models myself: there's an efficiency gap of about two orders of magnitude between a "pretend-I-can" home setup and serving them on hundreds of H100s to many thousands of users.