Every time a vendor puts together an ideal-condition, carefully crafted 5G data link that hits some new speed record, the media jump on the news like the discovery of a new black hole.
Then a gaggle of journalists call a gaggle of analysts and company PR people to analyze the event as if it really means something. News flash: it does not. Why? Because it is not real-world; it only presents a transmission event that is finely tweaked with proprietary hardware in an isolated test. At best, it is a glimpse of how well current equipment can perform.
One thing it does is impart the state of the art under ideal test conditions. That is all well and good, but it is hardly worth more than a few sentences in the news sections.
The most recent one came in a release from Ericsson and MediaTek, in which they combined mmWave and carrier aggregation to achieve a nearly 500 Mbps uplink. What makes this noteworthy is that, traditionally, uplink speeds in wireless networks have been dismal.
Let us back up a moment and drill down on this a bit. Traditionally, bandwidth allocation has been asymmetrical. Uplink speeds have been sacrificed on the download altar – typically 10 percent or less of the download speed. And for most consumer use cases it did not matter much, because most consumers download far more than they upload anyway. The enterprise is another story, and its use cases are quite different.
That has been changing over the last few years as consumers have caught on to activities that require more bandwidth. One example is creating and uploading their own music and video. Online gaming is another use case, and social media back-and-forth is also starting to gobble up more bandwidth. Therefore, the demand for wider upload pipes has been increasing, and with it come new ways to enable greater bandwidth, both symmetrical and, going forward, dynamic.
What all this points to is that next-generation wireless networks will require a different approach to how data flows.
Back to this test. It is an example of using both mmWave and LTE (of course, that is the definition of carrier aggregation in 5G). The mmWave part used 5G NSA, which is typical, but with proprietary equipment. The LTE is the standard 1900 MHz we all know and love.
The heavy lifting, of course, was done by the mmWave component at 39 GHz. Of the 495 Mbps, it hauled 425 Mbps; LTE handled the rest. The interesting part is that this test comprised aggregated carriers: four 100-MHz component carriers combined with one 20-MHz channel of LTE spectrum. Together, they hit the 495 Mbps record.
The claim is that it “doubled the current uplink speed on the market.” OK, but truth be told, current uplink speeds, in most cases, are not even close to 250 Mbps (see the data from Opensignal further on).
The star of the show is the mmWave spectrum. Of the 420 MHz of total bandwidth, the 400 MHz of mmWave handled 425 Mbps of the 495 Mbps. What I am getting at is that while throwing a 20-MHz channel at it to demonstrate carrier aggregation between 4G and 5G is not earth-shattering, it does show that it works. In reality, the 4G channel is relatively insignificant.
Now, if the test had used 400 MHz of mmWave and, say, four or five 20-MHz LTE channels and, following the logic, come in closer to 700 Mbps, I would be more impressed. And do not forget, this was over a 5G NSA setup.
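To make that extrapolation concrete, here is a quick back-of-the-envelope sketch in Python. It assumes, based on the test figures, that each 20-MHz LTE channel contributes roughly 70 Mbps and that the 400 MHz of mmWave holds its 425 Mbps; the larger channel counts are hypothetical, not something Ericsson tested.

```python
# Back-of-the-envelope aggregate uplink throughput, assuming each
# component contributes what it did in the Ericsson/MediaTek test.
MMWAVE_MBPS = 425          # four 100-MHz mmWave component carriers, combined
LTE_PER_CHANNEL_MBPS = 70  # one 20-MHz LTE channel in the test

def aggregate_uplink(lte_channels: int) -> int:
    """Total uplink (Mbps) for the mmWave block plus n LTE channels."""
    return MMWAVE_MBPS + lte_channels * LTE_PER_CHANNEL_MBPS

print(aggregate_uplink(1))  # the actual test configuration: 495 Mbps
print(aggregate_uplink(4))  # hypothetical: 705 Mbps
print(aggregate_uplink(5))  # hypothetical: 775 Mbps
```

Under that simple proportional assumption, four extra LTE channels would land just over the 700 Mbps mark.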
One can do carrier aggregation with both 4G and 5G all day long and get proportional results based on how much mmWave and how much LTE is aggregated. So, now we know that both mmWave and LTE can be combined (we knew that already) and the latest test simply showcases one particular configuration and what can be achieved.
The idea does have significance, however, because the uplink channel is becoming a major consideration in the evolution of networks. The pandemic is a classic case of how uplink bandwidth is becoming more important (as with the applications mentioned earlier). As well, expanded use of other applications, such as video conferencing and online learning, is ratcheting up.
Currently, upload speeds are dismal. Our friends at Opensignal, who seem to be the guardians of bandwidth metrics, just released a report on average upload speeds. The results were fairly consistent among the major carriers – between 30 and 40 Mbps (nowhere near the 250 Mbps Ericsson's claim implies).
This is also not going to go over well for things like sporting events, concerts, and the like, where attendees stream videos of the content to others. Multiply that by a venue full of such streaming and things will get dicey. Users expect high-quality streams, which eat up bandwidth, and frankly, sub-6 GHz frequencies are going to have a hard time providing enough bandwidth to meet these high-definition requirements.
Ericsson put a lot of emphasis on the carrier aggregation component. That is just being used to show carrier aggregation works (but we knew that already). This is not a particularly earth-shattering event.
The real excitement behind this will be the deployment of 5G mmWave, with or without carrier aggregation. Sub-6 GHz spectrum will not likely play a large part in this. What it will be good for is to manage those higher frequencies within the network. As I said, mmWave will do the heavy lifting.
The nice thing about this is that it puts a spotlight on a nagging issue that has to be addressed. While download speeds will continue to be the primary driver for bandwidth, upload speeds will have to be sufficient to meet the rising demand. The trick will be to juggle bandwidth between upload and download demand. This could be a prime case for network slicing; the key will be to do it dynamically.
Incidentally, when one looks at the raw numbers, each mmWave channel carried just over 106 Mbps; the LTE channel, 70 Mbps. To me, a 100-MHz channel carrying only about 50 percent more data than a 20-MHz channel has me questioning the technology here, especially in the mmWave spectrum.
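Put in terms of spectral efficiency (bits per second per hertz of spectrum), the gap is stark. A minimal sketch, using only the throughput and bandwidth figures reported for the test:

```python
# Spectral efficiency = throughput / bandwidth, from the reported numbers.
# Mbps divided by MHz reduces directly to bits per second per hertz.
def spectral_efficiency(mbps: float, mhz: float) -> float:
    """Return spectral efficiency in bps/Hz."""
    return mbps / mhz

mmwave = spectral_efficiency(425, 400)  # the 400-MHz mmWave block
lte = spectral_efficiency(70, 20)       # the single 20-MHz LTE channel

print(f"mmWave: {mmwave:.2f} bps/Hz, LTE: {lte:.2f} bps/Hz")
```

By these figures, the LTE channel was more than three times as spectrally efficient as the mmWave block (roughly 3.5 versus 1.06 bps/Hz), which is the real basis for questioning the mmWave numbers here.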
While this was just a show project, it does call attention to the lack of resources for uploading data. There will be several vectors that will push the upload bandwidth issue, so it is good that the industry is starting to pay attention to it.
Ernest Worthman is an executive editor with AGL Media Group.