Latency to access an SSD is measured in nanoseconds, whereas latency across networks is typically measured in milliseconds. That's several orders of magnitude of difference.
That being said, to an end user, the difference between 100 nanoseconds and 100 milliseconds is - probably - very small.
The difference is small for a single file. Then they try to start an app that loads 1000 files on startup (say, a game or Rails...), and those milliseconds turn into seconds, while the nanoseconds turn into still mostly imperceptible microseconds. A quick back-of-the-envelope sketch of that scaling is below.
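To make the multiplication concrete, here's a minimal sketch in Python. The per-access latencies are assumed round numbers for illustration (they are not measurements, and the real figures are exactly what's being debated in this thread); the 1000-file count comes from the example above, and the loads are assumed to be sequential:

```python
# Back-of-the-envelope: total startup latency for an app that loads
# 1000 files sequentially, at different assumed per-access latencies.

N_FILES = 1000  # number of files loaded at startup (from the example above)

# Assumed per-access latencies in seconds, for illustration only.
latencies = {
    "local SSD, optimistic (~100 ns per access)": 100e-9,
    "local SSD, typical (~100 us per access)": 100e-6,
    "network round trip (~100 ms per access)": 100e-3,
}

for label, per_access in latencies.items():
    total_seconds = N_FILES * per_access
    print(f"{label}: {total_seconds * 1000:.3f} ms total for {N_FILES} sequential accesses")
```

With these numbers, the 100 ns case sums to 0.1 ms (imperceptible), the 100 us case to 100 ms (barely noticeable), and the 100 ms case to 100 seconds (painful), which is the point the parent comment is making.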
Hmm, I thought it was more like 10s to 100s of microseconds of latency for an SSD.
I've seen estimates that Google Fiber latency (on a wired connection, to the nearest Google datacenter) could be anywhere from microseconds to tens of milliseconds, but I don't have any reliable sources and I don't have any expertise here myself.
I'm hoping that someone who does know might chime in?