2. Global gigabit connectivity at ultra-low cost
This megatrend is actually the convergence of three: “global”, “gigabit” and “ultra-low cost”. The quest for coverage and for performance is nothing new, as a matter of fact; the novelty is in the “quality” and “quantity” foreseen by this trend.
Connectivity has kept improving over the last 150 years. However, it is only in the last 20 years, with the advent of low-cost wireless technology (particularly on the handset side, the cellphone), that we have seen tremendous growth.
What used to take 50 years in terms of usage adoption has been squeezed into a few years. There are now 5.5 billion people connected via a cellphone, 90% of them with a smartphone.
It is no longer just about people. Actually, if we look at the numbers, object connectivity is already dwarfing people connectivity in number of devices and in number of transactions (though not in terms of bandwidth: our consumption of movies keeps most bandwidth usage on our side, but this too will change in the coming years as streaming video from safety cameras takes the upper hand in bandwidth usage).
In this decade connectivity is expected to increase further in two dimensions:
- broader coverage, with the expectation of full planet coverage by 2035, accessible through normal consumer cellphones (today a special, expensive phone is required to access satellite networks, the only ones providing full coverage). This is expected to be achieved by new generations of satellites (like OneWeb, planning to have 48,000 satellites in its constellation, and Starlink, already serving the US and Canada with 540 satellites and expected to expand coverage in 2021/22 once 1,500 more satellites are deployed), both low-orbit constellations and cube-satellite constellations, and by the capability of cellphones to operate in the THz band, expected to become reality with 6G;
- higher bandwidth, delivered through greater spectrum availability (thanks to higher frequencies: the mmwave range, 30-300 GHz, in the second part of this decade, and the sub-millimetre range, above 300 GHz, in the next decade) coupled with denser networks (a number of access points 10 to 1,000 times today's, with the higher multiplier becoming effective with the deployment of 6G and of networks dynamically set up from the edges).
If “global” is easy to understand, the “gigabit” part is less straightforward, because it raises the question of whether “gigabit” connectivity will make a difference (there are sub-questions like “for whom” and “who will be willing to pay for it, and how much”, but these can be superseded by the third forecast, i.e. “at ultra-low cost”).
As has happened in the past, we can assume that once increased bandwidth is available someone will find a way to exploit it, and that eventually many people will be using it.
The graphic on the side presents a forecast from Cisco of the bandwidth that future applications may demand, in temporal order (from near to far down the lane):
- Ultra-high definition security cameras: 15 Mbps
- Ultra-high definition streaming (4k): 15 Mbps
- Virtual Reality streaming: 17 Mbps
- Self-driving vehicles diagnostics: 20 Mbps
- Cloud Gaming: 30 Mbps
- Ultra-High definition IP Video: 51 Mbps
- 8K wall television: 100 Mbps
- High Definition Virtual Reality: 167 Mbps
- Ultra-High Definition Virtual Reality: 500 Mbps
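To put the list above in perspective, a quick back-of-the-envelope check shows how many concurrent streams of each application would fit into a single 1 Gbps access link. This is illustrative arithmetic only (real links lose capacity to protocol overhead and contention), using the Cisco figures quoted above:

```python
# How many concurrent streams of each forecast application fit
# into a 1 Gbps link? Figures (in Mbps) from the Cisco list above.
# Illustrative only: ignores protocol overhead and contention.
apps_mbps = {
    "UHD security camera": 15,
    "UHD streaming (4K)": 15,
    "VR streaming": 17,
    "Self-driving vehicle diagnostics": 20,
    "Cloud gaming": 30,
    "UHD IP video": 51,
    "8K wall television": 100,
    "HD virtual reality": 167,
    "UHD virtual reality": 500,
}

link_mbps = 1000  # 1 Gbps

for app, rate in apps_mbps.items():
    print(f"{app}: {link_mbps // rate} concurrent streams")
```

Notice that even a full gigabit per user supports only two concurrent Ultra-High Definition Virtual Reality streams.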
Some of the above applications may require low latency (<10 ms) or very low latency (<2 ms) and will therefore require edge computing and edge / peer-to-peer communication, hence a quite different network architecture that, in principle, is already possible with 5G but will surely be implemented with 6G.
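The reason these latency budgets force computation toward the edge can be seen from physics alone: even ignoring all processing and queueing delay, propagation time caps how far away the serving node can be. A minimal sketch, assuming light in optical fibre travels at roughly 200,000 km/s (vacuum speed divided by the fibre's refractive index of about 1.5):

```python
# Propagation delay alone bounds the distance to the serving node.
# Assumes signal speed in fibre of ~200,000 km/s; all processing,
# queueing and radio-access delay is ignored, so real distances
# would be considerably shorter.
FIBRE_KM_PER_S = 200_000

def max_server_distance_km(round_trip_ms: float) -> float:
    """Greatest one-way distance reachable within a round-trip budget."""
    round_trip_s = round_trip_ms / 1000
    return FIBRE_KM_PER_S * round_trip_s / 2

print(max_server_distance_km(10))  # low latency: 1000.0 km
print(max_server_distance_km(2))   # very low latency: 200.0 km
```

A 2 ms budget keeps the server within about 200 km even in this best case, which is why such services cannot be run from a distant central cloud.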
Delivering gigabit capacity to a single user (not to a single cell) requires very dense networks and, of course, adequate technology. On the wireline side, optical fibre can deliver multiple Gbps already today. On the wireless side, we need sufficient spectrum to funnel 1 Gbps. Assuming 20 bits per Hz (a very high spectral efficiency, never reached in normal conditions, where 4-6 bits per Hz would be considered very good), to get 1 Gbps you need 50 MHz of spectrum (today's 5G allocated spectrum in Italy has a maximum of 80 MHz, and that is for the whole cell, not for a single user!). Hence the need to use mmwaves and sub-millimetre waves (in the THz range), which allow the allocation of broad spectrum. The evolution of electronics will make this feasible in the last part of this decade.
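The spectrum arithmetic above can be written out explicitly: the bandwidth required is simply the target bit rate divided by the spectral efficiency. A small sketch of the calculation:

```python
# Spectrum needed to deliver a target bit rate at a given
# spectral efficiency (bits per second per Hz).
def required_spectrum_mhz(target_gbps: float, bits_per_hz: float) -> float:
    return target_gbps * 1e9 / bits_per_hz / 1e6

# Idealised 20 bit/Hz efficiency, as assumed in the text:
print(required_spectrum_mhz(1, 20))  # 50.0 MHz
# A more realistic 5 bit/Hz:
print(required_spectrum_mhz(1, 5))   # 200.0 MHz
```

At a realistic 5 bits per Hz, a single gigabit user would need 200 MHz, well beyond the 80 MHz cell-wide maximum allocated to 5G in Italy, which is exactly why the broad bands available at mmwave and THz frequencies matter.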
Recapping: “global” and “gigabit” are reasonable targets. What about “ultra-low cost”?
Here is where I feel it gets really interesting!
If we look back we can see that the shift from wireline to wireless has dramatically slashed the cost of delivering bits. This is due to two factors:
- first, the fact that wireless infrastructures can scale (almost) in sync with demand. When traffic demand grows, you can deploy one more cell, and then another, right where it is needed. This makes investment much more effective;
- second, the shift of (part of) the infrastructure investment onto the customer. In fact, cellphones and smartphones are network equipment: they carry out functions that were once part of the infrastructure, like digitisation, access selection, and so on. Smartphones represent something like 70% of the overall cost of the end-to-end wireless infrastructure. Hence, telecom operators are covering only 30% of the cost, whilst in a wireline infrastructure they have to sustain 100% of it!
This decreases the cost, both perceived and real, to the end user: as the cost of smartphones decreases, so does the cost of connectivity.
This trend will continue in this decade and will accelerate further towards the end of the decade and the beginning of the next, as communication starts to be provided by the edges (networks deployed by third parties that are not interested in charging for access) and by the objects themselves. 6G will be the first system designed to create edge networks, in part formed by mesh networks created by objects. This is what will lead to ultra-low cost connectivity.