
Latency to access an SSD is measured in nanoseconds, whereas latency across networks is typically measured in milliseconds. That's an orders-of-magnitude difference.

That being said, to an end user, the difference between 100 nanoseconds and 100 milliseconds is probably very small.



The difference is small for a single file. Then they try to start an app that loads 1000 files on startup (say, a game or Rails...), and those milliseconds turn into seconds, while the nanoseconds turn into still mostly imperceptible microseconds.
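
To put rough numbers on that scaling, here's a minimal sketch (Python, using illustrative ballpark per-file latencies rather than anything measured) of what 1000 sequential file loads cost at different per-access latencies:

    # Rough illustration: total cost of loading N files sequentially,
    # assuming a fixed, ballpark per-file access latency (not measured values).
    N_FILES = 1000

    approx_latency_s = {
        "RAM-like (~100 ns)":    100e-9,
        "SSD-like (~100 us)":    100e-6,
        "network-like (~10 ms)": 10e-3,
    }

    for name, per_file in approx_latency_s.items():
        total_ms = N_FILES * per_file * 1000
        print(f"{name}: {total_ms:.3f} ms for {N_FILES} files")

Under those assumptions, the nanosecond-scale case stays well under a millisecond in total, while the millisecond-scale case adds up to whole seconds.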


A nanosecond is 1 billionth of a second. A millisecond is 1 thousandth of a second.

1/1000000000 vs. 1/1000

Six orders of magnitude.

Edit: your point may hold true, as I'm not sure SSD access or seek times are actually in the 10 ns range.


Hmm, I thought it was more like 10s to 100s of microseconds of latency for an SSD.

I've seen estimates that Google Fiber latency (on a wired connection, to the nearest Google datacenter) could be anywhere from microseconds to tens of milliseconds, but I don't have any reliable sources or any expertise here myself.

I'm hoping that someone who does know might chime in?
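
For rough orientation, here's a sketch of commonly cited order-of-magnitude latency figures (approximations only; real numbers vary widely by hardware and network path, and no Google Fiber figure is included since I can't verify one):

    # Commonly cited ballpark latencies, order of magnitude only
    # (assumption: approximate figures, not measurements from any specific system).
    approx_latency_s = {
        "main memory reference":          100e-9,  # ~100 ns
        "SSD random read":                100e-6,  # ~100 us (microseconds, not ns)
        "round trip within a datacenter": 500e-6,  # ~0.5 ms
        "spinning disk seek":             10e-3,   # ~10 ms
        "cross-country network RTT":      50e-3,   # ~tens of ms
    }

    for op, secs in sorted(approx_latency_s.items(), key=lambda kv: kv[1]):
        print(f"{op}: {secs * 1e6:,.1f} us")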



