TL;DR: Simulating a low-noise environment is a lot easier than finding one in the real world. And that's not news.
Researchers simulated a 1 Tb/s transfer rate within a 100 MHz channel. Shannon's limit[1] explains how: set the noise term arbitrarily small, and your transfer rate can be arbitrarily large.
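To put numbers on that, here's a back-of-the-envelope sketch using the Shannon-Hartley formula (my own illustrative figures, not from the paper):

    import math

    B = 100e6   # channel bandwidth in Hz (100 MHz)
    R = 1e12    # claimed rate in bit/s (1 Tb/s)

    # Shannon-Hartley: C = B * log2(1 + S/N).
    # Hitting R inside B needs R/B bit/s/Hz, i.e. an SNR of 2**(R/B) - 1.
    spectral_eff = R / B                         # 10000 bit/s/Hz
    snr_db = 10 * spectral_eff * math.log10(2)   # ~= 10*log10(2**(R/B))

    print(f"Required spectral efficiency: {spectral_eff:.0f} bit/s/Hz")
    print(f"Required SNR: roughly {snr_db:.0f} dB")   # about 30,000 dB

An SNR in the tens of thousands of dB doesn't exist outside a simulation with the noise term dialed down.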
Bandwidth is popularly misunderstood to be some fixed range of spectrum, as if signals outside the range are attenuated to zero. But you can't attenuate a signal to zero. "Bandwidth" is measured[2] as the range over which attenuation is less than (say) 3 dB.
If not for noise, you could utilize a limited-bandwidth channel as if it had infinite bandwidth merely by amplifying frequencies outside the range in the exact proportion they are attenuated.
But every real system has noise. Overcoming this noise is the challenge of high speed transfers. There's no indication these "researchers" have accomplished anything whatsoever.
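Here's a minimal numpy sketch of that point, with a made-up channel roll-off (nothing here is from the article): inverting the attenuation recovers the signal perfectly over a noiseless channel, but the same inverse filter blows up whatever noise sits in the attenuated band.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 4096
    f = np.fft.rfftfreq(n)          # normalized frequency, 0 .. 0.5

    # Toy channel: steep roll-off outside a narrow "passband" near DC.
    H = 1 / (1 + (f / 0.02) ** 2)

    x = rng.standard_normal(n)      # wideband "signal"
    noise = 1e-3 * rng.standard_normal(n)

    def recover(noise_on):
        # Transmit through the channel, then amplify each frequency in
        # the exact proportion it was attenuated (divide by H).
        Y = np.fft.rfft(x) * H + np.fft.rfft(noise) * noise_on
        x_hat = np.fft.irfft(Y / H, n)
        return np.sqrt(np.mean((x_hat - x) ** 2))

    print("RMS error, noiseless channel:", recover(0.0))  # ~machine precision
    print("RMS error, tiny added noise :", recover(1.0))  # hundreds of times the noise floor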
>>by amplifying frequencies outside the range in the exact proportion they are attenuated.
This is true in the sense that you know a function everywhere if you can expand it, for example as a Taylor series with infinitely many terms. It is not true for discrete data, which is limited by the Nyquist rate: the DTFT of the data repeats, meaning you literally can't get at the higher frequencies to amplify.
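A toy numpy check of that (my own numbers): once you sample, a tone above the Nyquist rate produces exactly the same samples as its alias below it, so there's nothing up there left to amplify.

    import numpy as np

    fs = 1000                      # sample rate, Hz; Nyquist is fs/2 = 500 Hz
    t = np.arange(0, 1, 1 / fs)

    low = np.cos(2 * np.pi * 100 * t)           # 100 Hz tone
    high = np.cos(2 * np.pi * (100 + fs) * t)   # 1100 Hz tone, well above Nyquist

    # The sampled sequences are indistinguishable: the DTFT is periodic in fs.
    print(np.allclose(low, high))   # True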
I'm not sure what you mean - any real world test wouldn't be a discrete signal.
And yes, real physical channels are imperfect in other ways, for example not being perfectly linear, etc. Could you dig into that a little? Do you think these effects are more significant than noise? I'd love to hear about it.
>>any real world test wouldn't be a discrete signal.
Perhaps band-limited, because they have a clear start and stop. If they were infinite in time you wouldn't be able to encode data, since you could measure them at t = +/- infinity and retrieve all the signal information. Perhaps HN isn't the best place to have this discussion :-)
>>You think these effects are more significant than noise?
Group velocity dispersion in optical channels was at one point treated as noise before the appropriate transforms were worked out; I believe Infinera did this. In the case of remote sensing, Ewald's limiting sphere represents a limit on the scattering, meaning that you can't simply deconvolve and get better resolution.
How is this even physically possible? Is this for a small mobile device? It would imply a spectral efficiency of 10000 bit/s/Hz. To put that in perspective [1], 4G claims a best-case spectral efficiency of 30 (!) bit/s/Hz using 4x4 MIMO.
I'm not sure there are even enough physical degrees of freedom to allow that with a small device, given the wavelength at GHz frequencies is ~10 cm (i.e. sure, if you use 10000 antennas over a large area, you can have 10000 bit/s/Hz! In this case, I would estimate it would need ~2000 antennas). To be clear, for a single "degree of freedom" you are limited by the Shannon capacity [2], which we are already pretty close to in most technologies.
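Roughly the arithmetic behind that estimate (my own figures; the per-stream spectral efficiency is an assumption, just to show the scale):

    rate = 1e12          # claimed 1 Tb/s
    bandwidth = 100e6    # 100 MHz channel

    spectral_eff = rate / bandwidth
    print(f"{spectral_eff:.0f} bit/s/Hz total")          # 10000 bit/s/Hz

    # Assume each spatial stream tops out around a practical ~5 bit/s/Hz
    # (an order-of-magnitude figure for a good-SNR link); MIMO capacity
    # grows roughly linearly with the number of antennas.
    per_stream = 5
    print(f"~{spectral_eff / per_stream:.0f} antennas")  # ~2000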
So I'm skeptical. It's either a nice optical link, a massive device, or some high freq. radiation I'm not familiar with.
Not really; according to the update, the entire result is simulation-only. From your link:
> According to Tafazolli, the new class of Detector (a completely new approach) was tested through computer simulations (these simulated a real mobile/wireless environment) and were found to achieve the 1Tbps rate claimed. In our view that’s quite a bit different from conducting a practical test.
It doesn't look like there was any actual lab result.
Yeah, but this is optical, not radio. Optical transmission may technically be wireless (such as laser communications), but that is usually too unreliable for ordinary communications. Thus, optical is most often wired, as in fiber optics.
This test is largely bunk. It's over 100 feet, in direct line of sight, using a client that is a test device built exclusively for this (as in, it is not a phone, a laptop, or anything small and underpowered like that).
They say "5G speeds" but neglect to tell you this is not done using LTE or WiMAX (the only two protocols that the ITU-R recognizes for 4G; 5G will use descendants of these, as they are designed with many generations of forwards and backwards compatibility in mind).
Well, duh. This is obviously not a production device. Nothing about the title leads us to believe that this is a production device.
You generally have to achieve breakthroughs in lab settings before you can achieve them in real-world settings. 1Tbps is well beyond anything we have in production today.
Well, it also fails to say what it can do over distance. For instance, LTE-Advanced with high-order antenna configurations (it supports up to 32x32 MIMO, but let's say 8x8 for this example) could do 100 Mbit/s within a 25-mile sphere around the tower using a fixed install (i.e. a home internet router), according to some stuff I've read.
This is technology we have now that could be used to deploy a last-mile solution that would work in many communities, yet we're not deploying it; instead we're working on new tech to replace tech we're not even using (possibly just "yet", possibly forever).
[1] http://en.wikipedia.org/wiki/Noisy-channel_coding_theorem
[2] http://en.wikipedia.org/wiki/Cutoff_frequency