
Distilled R1 models != R1


You can run the full R1 (671B variant) locally as well so long as you have the hardware for it.

`ollama run deepseek-r1:671b`

will do that.


> long as you have the hardware for it.

You mean $100k in GPUs?


The full model can run on setups worth less than $10K. Here's a $6K build, for instance[0].

Granted, that's still expensive, but it is within the realm of something a hobbyist could put together.

[0]: https://x.com/carrigmat/status/1884244369907278106
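Rough arithmetic on the raw weight footprint shows why this lands in workstation rather than consumer territory. A minimal sketch (the 671B parameter count is from the thread; the bits-per-parameter values are common quantization widths, and real runtimes need additional room for KV cache and activations on top of the weights):

```python
# Back-of-the-envelope estimate of weight storage for a 671B-parameter model
# at different quantization widths. Does not include KV cache or activations.

PARAMS = 671e9  # total parameters in the full R1 model

def weight_memory_gb(params: float, bits_per_param: float) -> float:
    """Raw weight storage in gigabytes for a given quantization width."""
    return params * bits_per_param / 8 / 1e9

for label, bits in [("FP16", 16), ("Q8", 8), ("Q4", 4)]:
    print(f"{label}: ~{weight_memory_gb(PARAMS, bits):.0f} GB")
# FP16: ~1342 GB, Q8: ~671 GB, Q4: ~336 GB
```

Even at 4-bit quantization the weights alone exceed any consumer GPU's VRAM by an order of magnitude, which is why budget builds like the one above lean on large amounts of CPU RAM instead.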


Yeah, I mean, most users won't. Sorry if I came across as defensive; I've seen a few too many posts on social media claiming you can run the model on a consumer-grade GPU.



