If 6G becomes just 5G+ we’ll have made a big mistake

  • It looks like the general premise of this article's 6G is "semantic" technologies where the transmitter knows things like the fact that it's transmitting video, and can do smart things (prioritize, infer missing portions, and the like).

    Problem 1: So much for the OSI model. You're putting application-level knowledge in the link layer. Ugh. Not that it's entirely unprecedented, but there are reasons the layers are separated: heterogeneous apps need to work together on heterogeneous hardware stacks that have radically different lifecycles. The coordination problem of just rolling out IPv6 has been hard enough, and it's simple by comparison.

    Problem 2: I don't actually want the transmitter to know much about what it's transmitting. I want encryption. This doesn't kill the deal, but it does make it a lot harder, and it means you definitely 100% need explicit protocol-level support for all of the semantics (and can't just throw deep packet inspection at the problem to solve any of it).
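
    To make Problem 2 concrete, here's a minimal Python sketch (assuming the third-party `cryptography` package): once the payload is end-to-end encrypted, the link layer sees uniform-looking bytes, so any semantic hint has to travel as an explicit, unencrypted protocol field, modeled here as AES-GCM associated data, which is authenticated but visible. The `semantic_hint` field and the payload bytes are made up for illustration.

      # Minimal sketch (assumes `pip install cryptography`): encrypted payloads
      # are opaque to the radio, so semantics must be explicit protocol fields.
      import os
      from cryptography.hazmat.primitives.ciphers.aead import AESGCM

      key = AESGCM.generate_key(bit_length=128)
      aead = AESGCM(key)
      nonce = os.urandom(12)

      payload = b"pretend this is a video stream"           # hypothetical app data
      semantic_hint = b"content-type=video; priority=high"  # hypothetical hint field

      # The hint rides as AES-GCM associated data: authenticated (tamper-evident,
      # it must be presented again on decrypt) but NOT encrypted, so a base
      # station could read it. The payload itself looks like random noise.
      ciphertext = aead.encrypt(nonce, payload, semantic_hint)

      print("visible to the network:", semantic_hint)
      print("opaque to the network: ", ciphertext.hex()[:32], "...")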

  • No thank you. Just give me even more reliable and even faster transmission. I don't really want machines making best-effort guesses about what is happening at the other end of the communication pipeline just because the connection is now so bad, compared to previous technologies, that they have to.

    The rest can be done by software / hardware on top of that, if one so wishes; I don't see why it has to be integrated into the communication pipeline itself.

  • 5G seems to me like planned obsolescence of 4G with only an incremental improvement. 4G was supposed to be "Long-Term Evolution". I had thought that meant forward- and backward-compatibility for decades. Granted, latency is much lower. Did I get taken by marketing?

    With 6G deployment already being projected, when I don't even know anybody with a 5G handset, it also seems like a make-work technology.

    Deployment of a new wireless technology throughout a national wireless carrier, let alone all carriers in a country the size of the US, or in the world, is such a huge undertaking in terms of environmental resources that it seems almost irresponsible unless there is a significant improvement.

    2G was a quantum leap over 1G analog networks. In 2G networks like GSM, the digital communications stack was so tightly integrated, and as optimized as a 1990s video game, that even everyday lay users knew stories like "I could send an emergency SMS from the middle of nowhere even when there was no signal". It seems like that kind of robustness gets lost the more abstract and high-level these wireless generations become.

  • > When the wireless industry deploys 6G networks around 2030, they hope to usher in an era in which everything, not just our phones, is sensed, connected, and intelligent.

    And every tiny detail of everyone's life tracked in real time. What a horror.

  • I don’t get the benefits of the article’s vision of 6G. There is a good reason why we use the OSI model and separate problems that appear at the physical layer from those that appear at the application layer. I think of it as a powerful and universal interface specification. I couldn’t find a single argument in the article that justifies mingling some general machine learning with the actual physical data transmission. Please, 6G people: more bandwidth and lower latency is all we want.

  • All of this fanciful stuff is just at the wrong layer.

    > "making it possible for a device to infer missing video data based on context clues"

    Great, can't wait for your video streaming app with AI inference.

    > reduce the bandwidth, data rates, and energy consumption required for transmitting data

    OK, love to see your work on the compression standard that's going to be used in this codec you're designing, sounds great.

    This all sounds great. Stick it in your application.

    But it's nothing whatsoever to do with the network layer, which is going to continue to benefit from lower latency, lower overhead and higher bandwidth.

  • What about coverage?

    If you happen to live in a place where cell carriers choose not to cover (e.g., go 200 miles west from the East Coast) and you are stuck with a landline, you’ll find a scarlet letter that says ‘PROBABLE SPAM CALL’ on the caller ID of anybody you call.

    This is unacceptable. The last G is fiber optics and it is about time.

  • >At the University of Oulu in Finland, where I am a professor and head of the Intelligent Connectivity and Networks/Systems group, we’re working on a new research vision called VisionX.

    This pretty much sums up the narrative and incentive of the article. The paper [1] is linked in the article, but it's included here for those who might want to take a look at it.

    There is nothing wrong with the so-called dumb pipe other than the name, because networks in the 21st century are anything but dumb. That is especially true of wireless / 5G networks. Both semantics and efficiency are fundamentally tied to other factors, namely cost and QoS (I don't see how this "semantics" is any different from QoS), which 6G development is well aware of. And network vendors are already complaining about the complexity of 5G.

    I still remember there was some debate around 2019 that they should try to simplify everything in 6G. And unless some killer app imposes new requirements, I think the 4G / 5G transition will be the point where the technology hits the end of its S-curve, being good enough for 80% of people. As a matter of fact, I know a lot of people who are already happy with their 4G. They just want cheaper 4G, not faster and more expensive 5G. In terms of cost per unit of data, 5G will provide exactly that, but it is sort of counterintuitive for most consumers that 5G is actually cheaper. Nor will MNOs price it as such.

    [1] https://export.arxiv.org/abs/2108.05681

  • >Including level B technologies would look something like each video call participant locally predicting and rendering any missing portions of the video data if a glitch or network hiccup happens. Currently, we allocate significant amounts of time, energy, and computational resources to ensure very high transmission reliability. But instead, each participant’s machine—whether it’s a laptop, phone, or something else—would “fill in the blanks” by inferring what was missing based on what had arrived.

    So this is called interpolation, and it has probably been set to "enabled" inside the video menu of your HDTV for a decade. Prediction is also part of the video compression standard (a toy sketch follows the links below):

    https://en.wikipedia.org/wiki/Video_compression_picture_type...

    https://en.wikipedia.org/wiki/Motion_interpolation

    https://en.wikipedia.org/wiki/Interpolation_(computer_graphi...

    https://en.wikipedia.org/wiki/Motion_compensation
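
    For the curious, here's a toy numpy sketch of the simplest form of this: a plain linear blend between neighboring frames. Real codecs and TVs use motion-compensated prediction per the links above; this is only a stand-in to show the idea lives comfortably at the application layer.

      # Toy frame interpolation: reconstruct a missing frame as a weighted
      # blend of its neighbors (a stand-in for real motion compensation).
      import numpy as np

      def interpolate_frame(prev_frame, next_frame, t=0.5):
          """Estimate the frame at fraction t between two received frames."""
          blend = (1.0 - t) * prev_frame.astype(np.float32) \
                  + t * next_frame.astype(np.float32)
          return blend.round().astype(np.uint8)

      # Two hypothetical 4x4 grayscale frames; the one between them was lost.
      frame0 = np.zeros((4, 4), dtype=np.uint8)
      frame2 = np.full((4, 4), 200, dtype=np.uint8)
      print(interpolate_frame(frame0, frame2))  # every pixel ~100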

  • Almost all of 5G that has actually been deployed comes under the header of an evolution, over the long term, of what was called "Long Term Evolution" (LTE), before it was rebranded "4G" to make 5G look like a fundamentally new product. That includes an all-IP core network, full-duplex radios, IP voice, etc. All of that was in the LTE specs.

    What's really different in 5G was, predictably, difficult and expensive to deploy, and it remains very challenging. On top of that, it is much more complex to operate in a way that extracts the most advantage from things like beam steering, while avoiding pitfalls like poor penetration of mmwave bands into buildings, or even cars.

    Until what's called 5G Stand Alone (SA) is mastered and widely built-out, it is difficult to say what is buildable in 6G and what is a lab demo.

    On top of that, it is difficult to find real-world use cases that are uniquely enabled by even the most complete deployment of everything 5G claims to be able to do. If you have a 5G phone, what have you been able to do that you can't do on a wifi network? It's just more bits. Which is nice. As an evolution of the network.

  • I prefer a wired connection, and Wi-Fi for the "last mile".

  • This is using strange terminology, but it sounds like 'layer B' and 'layer C' are literally higher layers in the network stack. Why would those things be locked in to '6G'? Why not use them with existing 5G hardware or on my home fiber connection?

    Consider the following paragraph:

    >Including level B technologies would look something like each video call participant locally predicting and rendering any missing portions of the video data if a glitch or network hiccup happens. Currently, we allocate significant amounts of time, energy, and computational resources to ensure very high transmission reliability. But instead, each participant’s machine—whether it’s a laptop, phone, or something else—would “fill in the blanks” by inferring what was missing based on what had arrived. On a deeper level, machines would be able to reconstruct data with the same meaning as what was sent, even if it’s not the same on a bit-by-bit or pixel-by-pixel level. The machine learning techniques to do this already exist, though they are still relatively new: Two examples are variational autoencoders and generative adversarial networks. The latter in particular has gained attention in recent years because of its ability to develop deepfake images.

    The author is talking about video codecs. Certainly codecs can be much improved based on a higher-level understanding of the content. What does that have to do with networks?

  • Don't build content formats into communication tech. Imagine if TCP could only send text and GIFs, or HTTP only HTML and images. Sure, you could try to have 6G define everything all the way up, but why? Instead, understand how to decompose characteristics such as latency, multicasting, detection and reporting of missing data, etc., and build those mechanisms. Have the content use those capabilities to send, receive, and reconstruct whatever it needs using its own evolving formats and algorithms (a toy sketch of this split follows below).

    On top of that, there are the privacy/security issues to consider if Huawei or whatever hardware vendor had not only the bits but also all the semantic information.
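
    Here's a hedged sketch of what that decomposition might look like. All names are hypothetical; QUIC's unreliable DATAGRAM extension (RFC 9221) is the closest real-world analogue. The pipe moves bytes and reports losses but knows nothing about video; the application owns the format and the recovery strategy.

      # Toy decomposition: a content-agnostic transport plus an app that owns
      # its own format and loss-recovery policy. All names are made up.
      import random
      from typing import Callable, List

      class Transport:
          """Moves bytes and reports losses; knows nothing about 'semantics'."""

          def __init__(self, loss_rate: float = 0.2):
              self.loss_rate = loss_rate
              self.seq = 0
              self.loss_callbacks: List[Callable[[int], None]] = []
              self.delivered: List[bytes] = []

          def on_loss(self, callback: Callable[[int], None]) -> None:
              self.loss_callbacks.append(callback)

          def send(self, payload: bytes, *, reliable: bool = True) -> int:
              self.seq += 1
              if not reliable and random.random() < self.loss_rate:
                  for cb in self.loss_callbacks:   # report the loss, don't guess
                      cb(self.seq)
              else:
                  self.delivered.append(payload)
              return self.seq

      class VideoSender:
          """Application layer: decides what a loss means and how to recover."""

          def __init__(self, transport: Transport):
              self.transport = transport
              transport.on_loss(self.handle_loss)

          def send_frame(self, frame: bytes, keyframe: bool) -> None:
              # Keyframes are worth delivering reliably; delta frames are
              # cheap to lose and can be concealed (interpolated) on arrival.
              self.transport.send(frame, reliable=keyframe)

          def handle_loss(self, seq: int) -> None:
              print(f"delta frame {seq} lost; receiver will interpolate")

      sender = VideoSender(Transport())
      for i in range(10):
          sender.send_frame(b"frame-bytes", keyframe=(i % 5 == 0))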

  • I think the article should be: "If 6G is centralized, we have failed again."

    Add decentralized radio to phones now, or we will build our own phones: http://radiomesh.org

  • The one thing we need is a legal agreement that companies and service providers don’t water down the phrase “XG”. They use it as a marketing term rather than a technology, so the moment they start spending money on it they want to sell it, even if it’s not fully baked.

    Other than that, I don’t think we want semantic understanding of communication at the link layer. There are no positives there given the current state of security and privacy. Of course, it could usher in new research in security- and privacy-preserving techniques, and that could make for some interesting stuff.

  • So just 6G, but add at the end of the specification: "All data needs to be JSON-encoded with a >>Semantic fingerprint<< tag."

    I mean, there may well be semantic network ideas beyond just QoS that are actually interesting; it's just that, so far, no semantic network proposal has convinced me that the right problem description is anything other than "do the bit pushing really, really well."

  • Impressive guest article. In my opinion there is a huge telecom cold war, and giants like China and S. Korea are more focused on just winning this battle than on the quality of the tech.

  • The most important application to consider when developing the next generation of wireless communications technology is to ensure that 6G will have the robustness and reliability needed to deliver on the promise of doubling Zuckerberg’s net worth by making his portable VR pr0n world into a reality. Users should be able to strap on a VR helmet anytime, anywhere and partake in Metaverse activities that far exceed the imagination of even the most crazed Roman emperors.
