As cellular communication has progressed in the last two decades, we've rapidly approached the theoretical limits for wireless data transmission set by Shannon's Law. Every successive cellular generation has brought dramatic increases in data rates: 2G networks offered a maximum theoretical data rate of about 40 kbps, while today's 4G LTE-Advanced networks have peak theoretical data rates of 1 Gbps. 5G takes that a step further; next-generation networks will have peak theoretical data rates of 20 Gbps for downlink and 10 Gbps for uplink.
Theoretical peak rates are just that: theoretical. You probably don't see 1 Gbps download speeds on your LTE Android or iPhone handset. A more useful metric, defined by the International Telecommunication Union (ITU) for the IMT-2020 standard (essentially the 5G standard), is the user experienced data rate: the data rate available to users in at least 95% of the locations where the network is deployed, at least 95% of the time. By this measure, with a required minimum of 100 Mbps, 5G should be at least five times faster than average 4G speeds.
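To make that definition concrete, here's a minimal sketch of how the user experienced data rate could be estimated from field measurements. The throughput samples are synthetic, generated from an arbitrary distribution for illustration only; the key idea is that the rate met or exceeded 95% of the time is simply the 5th percentile of the measured distribution.

    import numpy as np

    # Hypothetical throughput samples (in Mbps) collected across many
    # locations and times in a deployed network. Synthetic data; the
    # distribution parameters are illustrative, not measurements.
    rng = np.random.default_rng(seed=42)
    samples_mbps = rng.lognormal(mean=5.5, sigma=0.6, size=10_000)

    # The rate that at least 95% of samples meet or exceed is the
    # 5th percentile of the distribution.
    user_experienced_rate = np.percentile(samples_mbps, 5)
    print(f"Estimated user experienced data rate: {user_experienced_rate:.0f} Mbps")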
To understand how 5G achieves these higher data rates, we need to dig into Shannon’s Law to see how engineers have tackled each of the limiting factors from previous generations.
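For reference, Shannon's Law (formally, the Shannon-Hartley theorem) bounds the error-free capacity C of a channel by its bandwidth B (in hertz) and its signal-to-noise ratio S/N:

    C = B × log₂(1 + S/N)

Bandwidth is the lever 5G pulls hardest on: LTE carriers top out at 20 MHz, while 5G NR supports channels up to 400 MHz in millimeter-wave spectrum. The sketch below compares the two under an assumed 20 dB SNR; the SNR figure is illustrative, and real links also gain from MIMO spatial streams and denser modulation, which this single-channel calculation ignores.

    import math

    def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
        """Shannon-Hartley capacity limit in bits per second."""
        return bandwidth_hz * math.log2(1 + snr_linear)

    snr = 10 ** (20 / 10)  # assumed 20 dB signal-to-noise ratio (illustrative)
    for label, bw_hz in [("20 MHz LTE carrier", 20e6),
                         ("400 MHz 5G mmWave carrier", 400e6)]:
        print(f"{label}: {shannon_capacity_bps(bw_hz, snr) / 1e6:,.0f} Mbps")

This prints roughly 133 Mbps for the LTE carrier and 2,663 Mbps for the 5G carrier: even before any other improvements, a twentyfold wider channel yields a twentyfold capacity increase.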
Note that we're completely ignoring latency here. Latency, the time it takes a packet to travel between your device and a server, isn't limited by Shannon's Law but has a huge impact on everyday Internet usage. We'll cover how 5G networks improve latency in a future post.