The Thousand Times Challenge: PART 2 … How to provide cellular data capacity?


CELLULAR DATA CAPACITY … A THOUSAND TIMES CHALLENGE?

It should be obvious that I am somewhat skeptical about all the excitement around cellular data growth rates and whether it's a 1,000x or 250x or 42x (see my blog on “The Thousand Times Challenge … The answer to everything about mobile data?”). Here I very much share Dean Bubley’s (Disruptive Wireless) critical view on the “cellular growth rate craze”. See Dean’s account in his recent blog “Mobile data traffic growth – a thought experiment and forecast”.

This obsession with cellular data growth rates is Largely Irrelevant, or only serves Hysteria and Cool Blogs, Twitter and Press Headlines (which, if nothing else, is occasionally entertaining).

What IS Important! is how to provide more (economical) cellular capacity, avoiding:

  • Massive Congestion and loss of customer service.
  • Economic devastation as the operator tries to supply network resources for an un-managed cellular growth profile.

(Source: adapted from K.K. Larsen “Spectrum Limitations Migrating to LTE … a Growth Market Dilemma?“)

To me the discussion of how to Increase Network Capacity by a factor THOUSAND is an altogether more interesting discussion than what the cellular growth rate might or might not be in 2020 (or any other arbitrarily chosen year).

Mallinson’s article “The 2020 Vision for LTE” in FierceWirelessEurope gives a good summary of this effort. Though my favorite account of how to increase network capacity, focusing on small cell deployment, is from Iris Barcia (@ibtwi) & Simon Chapman (@simonchapman) from Keima Wireless.

So how can we simply describe cellular network capacity?

Well … it turns out that Cellular Network Capacity can be described by 3 major components: (1) available bandwidth B, (2) (effective) spectral efficiency E and (3) number of cells deployed N.

The SUPPLIED NETWORK CAPACITY in Mbps (i.e., C) is equal to the AMOUNT OF SPECTRUM, i.e., available bandwidth, in MHz (i.e., B) multiplied by the SPECTRAL EFFICIENCY PER CELL in Mbps/MHz (i.e., E) multiplied by the NUMBER OF CELLS (i.e., N). In short: C = B × E × N.
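As a quick sanity check, the relationship can be sketched in a few lines of Python; the input numbers below are illustrative placeholders of mine, not measurements from any particular network:

```python
def supplied_capacity_mbps(bandwidth_mhz, efficiency_mbps_per_mhz, n_cells):
    """Supplied network capacity C = B x E x N, per link direction and per access technology."""
    return bandwidth_mhz * efficiency_mbps_per_mhz * n_cells

# Illustrative: 15 MHz of downlink spectrum, an effective 1.0 Mbps/MHz/cell, 12,000 cells.
capacity = supplied_capacity_mbps(15, 1.0, 12_000)
print(capacity)  # 180000.0 Mbps supplied across the network
```

Note that doubling any single factor doubles the supplied capacity, which is why the three levers compose multiplicatively.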

It should be understood that the best approach is to apply the formula on a per radio access technology basis, rather than across all access technologies. Also separate the analysis into Downlink capacity (i.e., from Base Station to Customer Device) and Uplink (from Customer Device to Base Station). If you average across many access technologies, or if you consider the total bandwidth B including spectrum for both Uplink and Downlink, the spectral efficiency needs to be averaged accordingly. Also bear in mind that there could be some inter-dependency between the (effective) spectral efficiency and the number of cells deployed, though that depends on what approach you choose to take to Spectral Efficiency.

It should be remembered that not all supplied capacity is equally utilized. Most operators have 95% of their cellular traffic confined to 50% or less of their Cells. So supplied capacity in half (or more) of most cellular operators’ networks remains substantially under-utilized (i.e., 50% or more of the radio network carries 5% or less of the cellular traffic … if you thought that Network Sharing would make sense … yeah it does … but it's a different story;-).

Therefore I prefer to apply the cellular capacity formula to geographically limited areas of the mobile network, rather than network wide. This allows for more meaningful analysis and avoids silly averaging effects.

So we see that providing network capacity is “pretty easy”: The more bandwidth or available spectrum we have, the more cellular capacity can be provided. The better and more efficient the air-interface technology, the more cellular capacity and quality we can provide to our customers. Last (but not least), the more cells we have built into our mobile network, the more capacity can be provided (though economics does limit this one).

The Cellular Network Capacity formula allows us to break down the important factors needed to solve the “1,000x Challenge”, which we should remember is based on a year 2010 reference (i.e., feels a little bit like cheating! right?;-) …

The Cellular Capacity Gain formula:

Basically, the Cellular Network Capacity Gain in 2020 (over 2010), or the Capacity we can supply in 2020, is related to how much spectrum we have available (compared to today or 2010), the effective spectral efficiency improvement relative to today (or 2010), and the number of cells deployed in 2020 relative to today (or 2010).

According to Mallinson’s article, the “1,000x Challenge” looks as follows (courtesy of SK Telecom):

According to Mallinson (and SK Telecom, see “Efficient Spectrum Resource Usage for Next Generation NW” by H. Park, presented at the 3GPP Workshop “on Rel.-12 and onwards”, Ljubljana, Slovenia, 11-12 June 2012), one should expect to have 3 times more spectrum available in 2020 (compared to 2010 for Cellular Data), 6 times more efficient access technology (compared to what was available in 2010) and 56 times higher cell density compared to 2010. Another important thing to remember when digesting the 3 x 6 x 56 is that this is an estimate from South Korea and SK Telecom, and to a large extent driven by South Korean conditions.
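The three factors compose multiplicatively; a one-line Python check of the arithmetic:

```python
# The SK Telecom decomposition of the "1,000x Challenge" (2020 vs. 2010).
spectrum_gain, efficiency_gain, density_gain = 3, 6, 56
total_gain = spectrum_gain * efficiency_gain * density_gain
print(total_gain)  # 1008 - i.e., roughly the headline "1,000x"
```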

Above I have emphasized the 2010 reference. It is important to remember this reference to better appreciate where the high ratios in the above come from. For example, in 2010 most mobile operators were using 1 to at most 2 carriers, or were in the process of upgrading to 2 carriers to credibly support HSPA+. Further, many operators had not transitioned to HSPA+, and a few had not even added HSUPA to their access layer. Furthermore, most Western European operators had on average 2 carriers for UMTS (i.e., 2×10 MHz @ 2100MHz). Some operators with a little excess 900MHz may have deployed a single carrier there and either postponed 2100MHz or only very lightly deployed the higher-frequency UMTS carrier in their top cities. In 2010, 3G population coverage (defined as having minimum HSDPA) was at maximum 80% in Western Europe and in most places maximum 60% in Central Eastern & Southern Europe. 3G geographical coverage on average across the European Union was less than 60% in 2010 (in Western Europe up to 80% and in CEE up to 50%).

OPERATOR EXAMPLE:

Take a European Operator with 4,000 site locations in 2010.

In 2010 this operator had deployed 3 carriers supporting HSPA @ 2100MHz (i.e., a total bandwidth of 2×15MHz).

Further in 2010 the Operator also had:

  • 2×10 MHz GSM @ 900MHz (with possible migration path to UMTS900).
  • 2×30 MHz GSM @ 1800MHz (with possible migration path to LTE1800).

By 2020 it has retained all its spectrum and gained:

  • 2×10 MHz @ 800MHz for LTE.
  • 2×20 MHz @ 2.6GHz for LTE.

For simplicity (and idealistic reasons) let’s assume that by 2020 2G has finally been retired. Moreover, let’s concern ourselves with cellular data at 3G and above service levels (i.e., ignoring GPRS & EDGE). Thus I do not distinguish between whether the air-interface is HSPA+ or LTE/LTE-Advanced.

OPERATOR EXAMPLE: BANDWIDTH GAIN 2010 – 2020:

The Bandwidth Gain part of the “Cellular Capacity Gain” formula is in general specific to individual operators and the particular future regulatory environment (i.e., in terms of new spectrum being released for cellular use). One should not expect a universally applicable ratio here. It will vary with a given operator’s spectrum position … Past, Present & Future.

In 2010 our Operator had 15MHz (for either DL or UL) supporting cellular data.

In 2020 the Operator should have 85MHz (for either DL or UL), which is almost a factor 6 more than in 2010. Don’t be concerned about this not being 3! After all, why should it be? Every country and operator will face different constraints and opportunities, and therefore there is no reason why 3 x 6 x 56 would be a universal truth!
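The bandwidth arithmetic of the operator example can be tallied in a short Python sketch (carrier counts taken from the example above):

```python
# Spectrum usable for cellular data, per link direction (MHz), for the example operator.
spectrum_2010_mhz = 15  # 3 x 5 MHz UMTS carriers @ 2100MHz
spectrum_2020_mhz = sum([
    15,  # 2100MHz retained (HSPA+/LTE)
    10,  # 900MHz refarmed from GSM (UMTS900)
    30,  # 1800MHz refarmed from GSM (LTE1800)
    10,  # 800MHz digital dividend (LTE)
    20,  # 2.6GHz (LTE)
])
bandwidth_gain = spectrum_2020_mhz / spectrum_2010_mhz
print(spectrum_2020_mhz, round(bandwidth_gain, 1))  # 85 MHz, a factor ~5.7
```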

If regulators and lawmakers were friendlier towards spectrum sharing, the boost in available spectrum for cellular data could be a lot bigger.

SPECTRAL EFFICIENCY GAIN 2010 – 2020:

The Spectral Efficiency Gain part of the “Cellular Capacity Gain” formula is more universally applicable to cellular operators at the same technology stage and with a similar customer mix. Thus, in general, for an apples-to-apples comparison more or less the same gains should be expected.

In my experience, Spectral Efficiency almost always gets experts’ emotions running high. More often than not there is a divide between experts (across Operators, Suppliers, etc.) on what would be an appropriate spectral efficiency to use in capacity assessments. Clearly everybody understands that the theoretical peak spectral efficiency does not reflect the real service experience of customers or the amount of capacity an operator has in its Mobile Network. Thus, in general an effective (or average) spectral efficiency is applied, often based on real network measurements or estimates derived from such.

When LTE was initially specified, its performance targets were referenced to HSxPA Release 6. The LTE aim was to get 3 – 4 times the DL spectral efficiency and 2 – 3 times the UL spectral efficiency. LTE-Advanced targets doubling the peak spectral efficiency for both DL and UL.

At maximum, expect the spectral efficiency to be:

  • @Downlink: 6 – 8 times that of Release 6.
  • @Uplink: 4 – 6 times that of Release 6.

Note that this comparison assumes an operator’s LTE deployment would move from 4×4 MIMO to 8×8 MIMO in Downlink and from 64QAM SISO to 4×4 MIMO in Uplink. Thus a quantum leap in antenna technology and substantial antenna upgrades over the period from LTE to LTE-Advanced would be on the to-do list of the mobile operators.
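Composing the initial LTE targets with the LTE-Advanced doubling gives exactly the ranges above; a minimal sketch:

```python
# Peak spectral-efficiency multipliers relative to HSxPA Release 6.
lte_dl_range, lte_ul_range = (3, 4), (2, 3)  # initial LTE targets vs. Release 6
lte_advanced_doubling = 2                    # LTE-Advanced target: double the peak

dl_range = tuple(x * lte_advanced_doubling for x in lte_dl_range)
ul_range = tuple(x * lte_advanced_doubling for x in lte_ul_range)
print(dl_range, ul_range)  # (6, 8) downlink, (4, 6) uplink
```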

In theory, for LTE-Advanced (and depending on the 2010 starting point), one could expect a factor 6 boost in spectral efficiency by 2020 compared to 2010, as put down in the “1,000x challenge”.

However, it is highly unlikely that all devices by 2020 would be LTE-Advanced. Most markets would still have at least 40% 3G penetration, and some laggard markets would still have a very substantial 2G base. While LTE would be growing rapidly, the share of LTE-Advanced terminals might be fairly low even in 2020.

Using a x6 spectral efficiency factor by 2020 is likely extremely optimistic.

A more realistic assessment would be a factor 3 – 4 by 2020 considering the blend of technologies in play at that time.
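To see why a blended factor of 3 – 4 is plausible, one can weight each access technology's efficiency multiplier by its share of traffic. The multipliers and shares below are purely illustrative assumptions of mine, not a forecast:

```python
# (efficiency multiplier vs. 2010, assumed share of 2020 traffic) - illustrative assumptions.
technology_mix = {
    "HSPA+ (3G)":   (1.5, 0.35),
    "LTE":          (3.0, 0.45),
    "LTE-Advanced": (6.0, 0.20),
}
blended_gain = sum(mult * share for mult, share in technology_mix.values())
print(round(blended_gain, 1))  # ~3.1 - in the 3 - 4 range, well below the theoretical x6
```

Even with an optimistic 20% LTE-Advanced traffic share, the blend lands near the bottom of the 3 – 4 range.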

INTERLUDE

The critical observer sees that we have reached a capacity gain (compared to 2010) of 6 x (3-4) or 18 to 24 times. Thus to reach 1,000x we still need between 40 and 56 times the cell density.

and that translates into a lot of additional cells!
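With the example operator's spectrum factor of ~6 and a blended efficiency factor of 3 – 4, the remaining density requirement falls straight out of the formula (numbers from the text above):

```python
# Cell-density factor still needed to reach 1,000x, given the other two gains.
target_gain = 1000
spectrum_gain = 6                      # operator example above (85 MHz vs. 15 MHz)
efficiency_low, efficiency_high = 3, 4 # blended spectral-efficiency range

density_if_eff3 = target_gain / (spectrum_gain * efficiency_low)   # ~55.6x cells
density_if_eff4 = target_gain / (spectrum_gain * efficiency_high)  # ~41.7x cells
print(round(density_if_eff4, 1), round(density_if_eff3, 1))  # 41.7 55.6
```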

CELL DENSITY GAIN 2010 – 2020:

The Cell Density Gain part of the “Cellular Capacity Gain” formula is in general specific to individual operators and the cellular traffic demand they might experience, i.e., there is no unique universal number to be expected here.

So to get to 1,000x the capacity of 2010 we need either magic or a 50+x increase in cell density (which some may argue would amount to magic as well) …

Obviously … this sounds like a real challenge … getting more spectrum and higher spectral efficiency is a piece of cake compared to 50+ times more cell density. Clearly our Mobile Operator would go broke if it were required to finance 50 x 4,000 = 200,000 sites (or rather cells; i.e., 3 cells = 1 macro site). The Opex and Capex requirements would simply NOT BE SUSTAINABLE.

50+ times site density on a macro scale is Economical & Practical Nonsense … the Cellular Network Capacity heuristic in such a limit works ONLY for localized areas of a Mobile Network!

The good news is that such macro-level densification would also not be required … this is where Small Cells enter the scene. This is where you turn to experts such as Simon Chapman (@simonchapman) from Keima Wireless, or similar companies specialized in providing intelligent small cell deployment. It’s clear that this is better done early on in the network design, rather than when the capacity pressure becomes a real problem.

Note that I am currently assuming that Economics and Deployment Complexity will not become challenging with a Small Cell deployment strategy … this (as we shall see) is not necessarily a reasonable assumption in all deployment scenarios.

Traffic is not equally distributed across a mobile network, as the chart below clearly shows (see also Kim K Larsen’s “Capacity Planning in Mobile Data Networks Experiencing Exponential Growth in Demand”):

20% of the 3G-cells carry 60% of the data traffic, and 50% of the 3G-cells carry as much as 95% of the 3G traffic.
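This kind of concentration statistic is easy to compute from per-cell traffic counters. Below is a minimal sketch on a purely hypothetical Zipf-like traffic sample; the real 20%/60% and 50%/95% figures come from network measurements, not from this toy distribution:

```python
def top_cell_traffic_share(traffic_per_cell, fraction):
    """Share of total traffic carried by the busiest `fraction` of cells."""
    ranked = sorted(traffic_per_cell, reverse=True)
    top = ranked[: max(1, int(len(ranked) * fraction))]
    return sum(top) / sum(ranked)

# Hypothetical per-cell traffic for 1,000 cells (Zipf-like skew, illustrative only).
sample = [1000 / rank**1.2 for rank in range(1, 1001)]
print(top_cell_traffic_share(sample, 0.20))  # share carried by the busiest 20% of cells
print(top_cell_traffic_share(sample, 0.50))  # share carried by the busiest 50% of cells
```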

The good news is that I might not need to worry too much about the half of my cellular network that only carries 5% of my traffic.

The bad news is that up to 50% of my cells might actually give me a substantial headache if I don’t have sufficient spectral capacity and enough customers on the most efficient access technology. That leaves me little choice but to increase my cellular network density, i.e., build more cells into my existing cellular grid.

Further, most of the data traffic is carried within the densest macro-cellular network grid (at least once an operator starts exhausting its spectral capacity with a traditional coverage grid). In a typical European City, ca. 20% of Macro Cells will have a range of 300 meters or less and 50% of the Macro Cells will have a range of 500 meters or less (see the chart below on “Cell ranges in a typical European City”).

Finding suitable and permissible candidates for macro-cellular cell splits below 300 meters is rather unlikely. Between 300 and 500 meters there might still be macro-cellular split optionality, and if so it would make the most sense to commence there (pending anticipated future traffic growth). Above 500 meters it’s usually fairly likely to find suitable macro-cellular site candidates (i.e., in most European Cities).

Clearly, if the cellular data traffic increase required a densification ratio of 50+ times the current macro-cellular density, a macro-cellular alternative might be out of the question even for cell ranges up to 2 km.

A new cellular network paradigm is required as the classical cellular network design breaks down!

Small Cell implementation is often the only alternative a Mobile Operator has to provide more capacity in a dense urban or high-traffic urban environment.

As Mobile Operators change their cellular design in dense urban and urban environments to respond to the increasing cellular data demand, what kind of economic boundaries would need to be imposed to make a factor 50x increase in cell density work out?

No Mobile Operator can afford to see its Opex and Capex pressure rise! (i.e., unless revenue follows or exceeds it, which might not be that likely).

For a moment … remember that this site density challenge is not limited to a single mobile operator … imagine that all operators (i.e., typically 3 – 5, except for India with 13+;-) in a given market need to increase their cellular site density by a factor 50. Even if there is (in theory) lots of space at street level for Small Cells … one could imagine the regulatory resistance (not to mention consumer resistance) if a city were to see demand for Small Cell locations increase by a factor 150 – 200.

Thus, Sharing Small Cell Locations and Supporting Infrastructure will become an important trend … which should also lead to Better Economics.

This brings us to The Economics of the “1,000x Challenge” … Stay tuned!

4 thoughts on “The Thousand Times Challenge: PART 2 … How to provide cellular data capacity?”

  1. Good analysis! It is also necessary to consider the relationship between cellular network capacity requirements and WiFi networks working over broadband Internet connections, which are also growing at an exponential rate.

    • Dr. Kim said:

      Thank you Kumar! I fully agree with you … the dynamics between Cellular & WiFi are very important. I can recommend checking out Thomas Wehmeier’s (@twehmeier) latest Informa/Mobidia analysis (based on a very impressive amount of data from around the world) on Cellular & WiFi smartphone usage. See http://www.informatandm.com/wp-content/uploads/2012/08/Mobidia_final.pdf.

