The Thousand Times Challenge: PART 2 … How to provide cellular data capacity?

CELLULAR DATA CAPACITY … A THOUSAND TIMES CHALLENGE?

It should be obvious that I am somewhat skeptical about all the excitement around cellular data growth rates and whether it's a 1,000x or 250x or 42x (see my blog “The Thousand Times Challenge … The answer to everything about mobile data?”). In this I very much share Dean Bubley's (Disruptive Wireless) critical view on the “cellular growth rate craze”. See Dean's account in his recent blog “Mobile data traffic growth – a thought experiment and forecast”.

This obsession with cellular data growth rates is Largely Irrelevant, or only serves Hysteria and Cool Blogs, Twitter and Press Headlines (which, if nothing else, is occasionally entertaining).

What IS Important! is how to provide more (economical) cellular capacity, avoiding:

  • Massive Congestion and loss of customer service.
  • Economical devastation as the operator tries to supply network resources for an un-managed cellular growth profile.

(Source: adapted from K.K. Larsen “Spectrum Limitations Migrating to LTE … a Growth Market Dilemma?“)

To me the discussion of how to Increase Network Capacity by a factor of a THOUSAND is an altogether more interesting discussion than what the cellular growth rate might or might not be in 2020 (or any other arbitrarily chosen year).

Mallinson's article “The 2020 Vision for LTE” in FierceWirelessEurope gives a good summary of this effort. Though my favorite account of how to increase network capacity, focusing on small cell deployment, is from Iris Barcia (@ibtwi) & Simon Chapman (@simonchapman) from Keima Wireless.

So how can we simply describe cellular network capacity?

Well … it turns out that Cellular Network Capacity can be described by 3 major components: (1) available bandwidth B, (2) (effective) spectral efficiency E and (3) number of cells deployed N.

The SUPPLIED NETWORK CAPACITY in Mbps (i.e., C) is equal to the AMOUNT OF SPECTRUM, i.e., available bandwidth, in MHz (i.e., B) multiplied by the SPECTRAL EFFICIENCY PER CELL in Mbps/MHz (i.e., E) multiplied by the NUMBER OF CELLS (i.e., N). In short: C = B × E × N.
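In code, supplied capacity is just the product of the three factors. A minimal sketch (the numbers in the example are purely illustrative, not taken from any particular network):

```python
def supplied_capacity_mbps(bandwidth_mhz, efficiency_mbps_per_mhz_cell, n_cells):
    """Supplied network capacity C = B x E x N."""
    return bandwidth_mhz * efficiency_mbps_per_mhz_cell * n_cells

# Illustrative only: 20 MHz of DL spectrum, an effective 1.5 Mbps/MHz/cell,
# and 3 cells (i.e., one 3-sector macro site).
print(supplied_capacity_mbps(20, 1.5, 3))  # 90.0 Mbps
```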

It should be understood that the best approach is to apply the formula on a per radio access technology basis, rather than across all access technologies. Also separate the analysis into Downlink capacity (i.e., from Base Station to Customer Device) and Uplink (from Customer Device to Base Station). If you average across many access technologies, or consider the total bandwidth B including spectrum for both Uplink and Downlink, the spectral efficiency needs to be averaged accordingly. Also bear in mind that there could be some inter-dependency between the (effective) spectral efficiency and the number of cells deployed, though that depends on what approach you choose to take to Spectral Efficiency.

It should be remembered that not all supplied capacity is equally utilized. Most operators have 95% of their cellular traffic confined to 50% or less of their Cells. So supplied capacity in half (or more) of most cellular operators' networks remains substantially under-utilized (i.e., 50% or more of the radio network carries 5% or less of the cellular traffic … if you thought that Network Sharing would make sense … yeah it does … but it's a different story;-).

Therefore I prefer to apply the cellular capacity formula to geographically limited areas of the mobile network, rather than network-wide. This allows for more meaningful analysis and avoids silly averaging effects.

So we see that providing network capacity is “pretty easy”: The more bandwidth or available spectrum we have, the more cellular capacity can be provided. The better and more efficient the air-interface technology, the more cellular capacity and quality we can provide to our customers. Last (but not least), the more cells we have built into our mobile network, the more capacity can be provided (though economics does limit this one).

The Cellular Network Capacity formula allows us to break down the important factors in solving the “1,000x Challenge”, which we should remember is based on a year 2010 reference (i.e., feels a little bit like cheating! right?;-) …

The Cellular Capacity Gain formula:

Basically, the Cellular Network Capacity Gain in 2020 (over 2010), or the Capacity we can supply in 2020, is: Capacity Gain = (B2020 / B2010) × (E2020 / E2010) × (N2020 / N2010), i.e., how much spectrum we have available in 2020 compared to today (or 2010), the effective spectral efficiency improvement relative to today (or 2010), and the number of cells deployed in 2020 relative to today (or 2010).

According to Mallinson's article, the “1,000x Challenge” looks as follows (courtesy of SK Telecom):

According to Mallinson (and SK Telecom, see “Efficient Spectrum Resource Usage for Next Generation NW” by H. Park, presented at the 3GPP Workshop “on Rel.-12 and onwards”, Ljubljana, Slovenia, 11-12 June 2012) one should expect to have 3 times more spectrum available in 2020 (compared to 2010 for Cellular Data), 6 times more efficient access technology (compared to what was available in 2010) and 56 times higher cell density compared to 2010. Another important thing to remember when digesting the 3 x 6 x 56 is: this is an estimate from South Korea and SK Telekom and to a large extent driven by South Korean conditions.
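Plugging the SK Telecom ratios into the Capacity Gain formula shows where the headline number comes from (and that 3 x 6 x 56 is actually 1,008, i.e., just over the round 1,000):

```python
# SK Telecom's decomposition of the "1,000x Challenge" (2010 reference year)
spectrum_gain   = 3   # 3x more spectrum for cellular data vs 2010
efficiency_gain = 6   # 6x more spectrally efficient access technology
density_gain    = 56  # 56x higher cell density
capacity_gain = spectrum_gain * efficiency_gain * density_gain
print(capacity_gain)  # 1008, i.e., ~1,000x over the 2010 baseline
```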

Above I have emphasized the 2010 reference. It is important to remember this reference to better appreciate where the high ratios in the above come from. For example, in 2010 most mobile operators were using 1 or at most 2 carriers, or were in the process of upgrading to 2 carriers to credibly support HSPA+. Further, many operators had not transitioned to HSPA+ and a few had not even added HSUPA to their access layer. Furthermore, most Western European operators had on average 2 carriers for UMTS (i.e., 2×10 MHz @ 2100MHz). Some operators with a little excess 900MHz may have deployed a single carrier there and either postponed 2100MHz or only very lightly deployed the higher-frequency UMTS carrier in their top cities. In 2010, 3G population coverage (defined as having minimum HSDPA) was at maximum 80% in Western Europe, and in Central Eastern & Southern Europe most places at maximum 60%. 3G geographical coverage on average across the European Union was in 2010 less than 60% (in Western Europe up to 80% and in CEE up to 50%).

OPERATOR EXAMPLE:

Take a European Operator with 4,000 site locations in 2010.

In 2010 this operator had deployed 3 carriers supporting HSPA @ 2100MHz (i.e., a total bandwidth of 2×15MHz).

Further in 2010 the Operator also had:

  • 2×10 MHz GSM @ 900MHz (with possible migration path to UMTS900).
  • 2×30 MHz GSM @ 1800MHz (with possible migration path to LTE1800).

By 2020 it has retained all its spectrum and gained:

  • 2×10 MHz @ 800MHz for LTE.
  • 2×20 MHz @ 2.6GHz for LTE.

For simplicity (and idealistic reasons) let's assume that by 2020 2G has finally been retired. Moreover, let's concern ourselves with cellular data at 3G and above service levels (i.e., ignoring GPRS & EDGE). Thus I do not distinguish between whether the air-interface is HSPA+ or LTE/LTE-Advanced.

OPERATOR EXAMPLE: BANDWIDTH GAIN 2010 – 2020:

The Bandwidth Gain part of the “Cellular Capacity Gain” formula is in general specific to individual operators and the particular future regulatory environment (i.e., in terms of new spectrum being released for cellular use). One should not expect a universally applicable ratio here. It will vary with a given operator’s spectrum position … Past, Present & Future.

In 2010 our Operator had 15MHz (for either DL or UL) supporting cellular data.

In 2020 the Operator should have 85MHz (for either DL or UL), which is almost a factor of 6 more than in 2010. Don't be concerned about this not being 3! After all, why should it be? Every country and operator will face different constraints and opportunities, and therefore there is no reason why 3 x 6 x 56 would be a universal truth!
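The operator example's bandwidth gain can be checked directly from the holdings listed above (figures per direction, i.e., for either DL or UL):

```python
b_2010 = 15                      # MHz: 3 UMTS carriers @ 2100MHz
b_2020 = 15 + 10 + 30 + 10 + 20  # MHz: 2100 + 900 + 1800 + 800 + 2600
bandwidth_gain = b_2020 / b_2010
print(b_2020, round(bandwidth_gain, 2))  # 85 MHz, ~5.67x -> "almost a factor 6"
```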

If Regulators and Lawmakers were more friendly towards spectrum sharing, the boost of available spectrum for cellular data could be a lot bigger.

SPECTRAL EFFICIENCY GAIN 2010 – 2020:

The Spectral Efficiency Gain part of the “Cellular Capacity Gain” formula is more universally applicable to cellular operators at the same technology stage and with a similar customer mix. Thus, in general, for an apples-to-apples comparison, more or less the same gains should be expected.

In my experience Spectral Efficiency almost always gets experts' emotions running high. More often than not there is a divide between experts (across Operators, Suppliers, etc.) over what would be an appropriate spectral efficiency to use in capacity assessments. Clearly everybody understands that the theoretical peak spectral efficiency does not reflect the real service experience of customers or the amount of capacity an operator has in its Mobile Network. Thus, in general an effective (or average) spectral efficiency is applied, often based on real network measurements or estimates derived from such.

When LTE was initially specified, its performance targets were referenced to HSxPA Release 6. The LTE aim was to get 3 - 4 times the DL spectral efficiency and 2 - 3 times the UL spectral efficiency. LTE-Advanced targets doubling the peak spectral efficiency for both DL and UL.

At maximum, expect the spectral efficiency to be:

  • @Downlink: 6 – 8 times that of Release 6.
  • @Uplink: 4 – 6 times that of Release 6.

Note that this comparison assumes an operator's LTE deployment would move from 4×4 MIMO to 8×8 MIMO in Downlink and from 64QAM SISO to 4×4 MIMO in Uplink. Thus a quantum leap in antenna technology and substantial antenna upgrades over the period from LTE to LTE-Advanced would be on the to-do list of the mobile operators.

In theory, for LTE-Advanced (and depending on the 2010 starting point) one could expect a factor 6 boost in spectral efficiency by 2020 compared to 2010, as put down in the “1,000x Challenge”.

However, it is highly unlikely that all devices by 2020 will be LTE-Advanced. Most markets will still have at least 40% 3G penetration, and some laggard markets will still have a very substantial 2G base. While LTE will be growing rapidly, the share of LTE-Advanced terminals might be fairly low even in 2020.

Using a 6x spectral efficiency factor by 2020 is likely extremely optimistic.

A more realistic assessment would be a factor 3 – 4 by 2020 considering the blend of technologies in play at that time.
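One way to see why the blended factor lands at 3 - 4 rather than 6 is a simple device-mix weighted average. The mix and the per-technology gains below are my own illustrative assumptions, not measured data:

```python
# Assumed (!) 2020 device blend: (share of devices, spectral-efficiency gain vs 2010)
blend = {
    "3G/HSPA+":     (0.40, 1.5),
    "LTE":          (0.45, 3.0),
    "LTE-Advanced": (0.15, 6.0),
}
blended_gain = sum(share * gain for share, gain in blend.values())
print(round(blended_gain, 2))  # 2.85 -- much closer to 3 than to the ideal 6
```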

INTERLUDE

The critical observer sees that we have reached a capacity gain (compared to 2010) of 6 x (3-4) or 18 to 24 times. Thus to reach 1,000x we still need between 40 and 56 times the cell density.

And that translates into a lot of additional cells!
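The residual density requirement follows directly from the Capacity Gain formula, taking the operator example's bandwidth gain of ~6 and the blended efficiency factor of 3 - 4:

```python
target_gain = 1000   # the "1,000x Challenge" vs the 2010 baseline
spectrum_gain = 6    # operator example: 85 MHz vs 15 MHz
for efficiency_gain in (3, 4):
    density_needed = target_gain / (spectrum_gain * efficiency_gain)
    print(round(density_needed, 1))  # ~55.6 (at 3x) and ~41.7 (at 4x)
```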

CELL DENSITY GAIN 2010 – 2020:

The Cell Density Gain part of the “Cellular Capacity Gain” formula is in general specific to individual operators and the cellular traffic demand they might experience, i.e., there is no unique universal number to be expected here.

So to get to 1,000x the capacity of 2010 we need either magic or a 50+x increase in cell density (which some may argue would amount to magic as well) …

Obviously … this sounds like a real challenge … getting more spectrum and higher spectral efficiency is a piece of cake compared to a 50+ times increase in cell density. Clearly our Mobile Operator would go broke if it were required to finance 50 x 4,000 = 200,000 sites (or 600,000 cells, as 3 cells = 1 macro site). The Opex and Capex requirements would simply NOT BE SUSTAINABLE.

50+ times site density on a macro scale is Economical & Practical Nonsense … The Cellular Network Capacity heuristics in such a limit work ONLY for localized areas of a Mobile Network!

The good news is that such macro-level densification would also not be required … this is where Small Cells enter the Scene. This is where you run to experts such as Simon Chapman (@simonchapman) from Keima Wireless or similar companies specialized in intelligent small cell deployment. It's clear that this is better done early on in the network design rather than when the capacity pressure becomes a real problem.

Note that I am currently assuming that Economics and Deployment Complexity will not become challenging with a Small Cell deployment strategy … this (as we shall see) is not necessarily a reasonable assumption in all deployment scenarios.

Traffic is not equally distributed across a mobile network, as the chart below clearly shows (see also Kim K Larsen's “Capacity Planning in Mobile Data Networks Experiencing Exponential Growth in Demand”):

20% of the 3G-cells carry 60% of the data traffic and 50% of the 3G-cells carry as much as 95% of the 3G traffic.

The good news is that I might not need to worry too much about the half of my cellular network that only carries 5% of my traffic.

The bad news is that up to 50% of my cells might actually give me a substantial headache if I don't have sufficient spectral capacity and enough customers on the most efficient access technology. That leaves me little choice but to increase my cellular network density, i.e., build more cells into my existing cellular grid.

Further, most of the data traffic is carried within the densest macro-cellular network grid (at least once an operator starts exhausting its spectral capacity with a traditional coverage grid). In a typical European City, ca. 20% of Macro Cells have a range of 300 meters or less and 50% of the Macro Cells have a range of 500 meters or less (see the chart below on “Cell ranges in a typical European City”).

Finding suitable and permissible candidates for macro-cellular cell splits below 300 meters is rather unlikely. Between 300 and 500 meters there might still be macro-cellular split optionality, and if so it would make the most sense to commence there (pending anticipated future traffic growth). Above 500 meters it is usually fairly likely to find suitable macro-cellular site candidates (i.e., in most European Cities).

Clearly, if the cellular data traffic increase would require a densification of 50+ times the current macro-cellular density, a macro-cellular alternative might be out of the question even for cell ranges up to 2 km.

A new cellular network paradigm is required as the classical cellular network design breaks down!

Small Cell implementation is often the only alternative a Mobile Operator has to provide more capacity in a dense urban or high-traffic urban environment.

As Mobile Operators change their cellular design, in dense urban and urban environments, to respond to the increasing cellular data demand, what kind of economic boundaries would need to be imposed to make a factor 50x increase in cell density work out?

No Mobile Operator can afford to see its Opex and Capex pressure rise! (i.e., unless revenue follows or exceeds it, which might not be that likely).

For a moment … remember that this site density challenge is not limited to a single mobile operator … imagine that all operators (i.e., typically 3 - 5, except for India with 13+;-) in a given market need to increase their cellular site density by a factor of 50. Even if there is (in theory) lots of space at street level for Small Cells … one could imagine the regulatory resistance (not to mention consumer resistance) if a city were to see demand for Small Cell locations increase by a factor of 150 - 200.

Thus, Sharing Small Cell Locations and Supporting Infrastructure will become an important trend … which should also lead to Better Economics.

This brings us to The Economics of the “1,000x Challenge” … Stay tuned!

The Thousand Times Challenge: PART 1 … The answer to everything about mobile data?

This is not PART 2 of “Mobile Data Growth…The Perfect Storm” … This is the story of the Thousand Times Challenge!

It is not unthinkable that some mobile operators will face very substantial problems with their cellular data networks due to rapid, uncontrollable or un-managed cellular data growth. Once cellular data demand exceeds the installed base supply of network resources, the customer experience will likely suffer and cellular data consumers will no longer get the same service level that they had prior to the onset of over-demand.

One might of course argue that consumers were (and in some instances still are) spoiled during the period when mobile operators had plenty of spectral capacity available (relative to their active customer base), with unlimited data plans and very little cellular network load. As more and more customers migrate to smartphones and 3G data services, it follows naturally that there will be increasingly less spectral resources available per customer.

The above chart (from “Capacity Planning in Mobile Data Networks Experiencing Exponential Growth in Demand”) illustrates such a situation, where customers' cellular data demand eventually exceeds the network capacity … which leads to a congested situation and less network resources per customer.

A mobile operator has several options that can mitigate the emergence of a capacity and spectrum crunch:

  1. Keep expanding and densifying the cellular network.
  2. Free up legacy (i.e., “old-technology”) spectrum and re-deploy it for the technology facing demand pressure.
  3. Introduce policy and active demand management on a per user / segment level.
  4. Allow customer service to degrade, becoming a provider of best-effort cellular data.
  5. Stimulate and design for structural off-loading (leveraging fixed as well as cellular networks).
  6. etc.

DEMAND … A THOUSAND TIMES FABLE?

Let me start by saying that cellular data growth does pose a formidable challenge for many mobile operators … already today … it's easy to show that even at modest growth rates cellular data demand gets pretty close to, or beyond, the cellular network resources available today and in the future. Unless we fundamentally change the way we design, plan and build networks.

However, Today The Challenge is Not network-wide … At present, it's limited to particular areas of the cellular networks … though as the cellular data traffic grows, the demand challenge does spread outwards and affects an ever higher share of the cellular network.

Lately 1,000 has become a very important number. It has become the answer to the Smartphone Challenge and the exponential growth of mobile data. 1,000 seems to represent both demand as well as supply. Qualcomm has made it their “mission in life” (at least for the next 8 years) to solve the magic 1,000 challenge. Mallinson's article “The 2020 Vision for LTE” in FierceWirelessEurope gives a slightly more balanced view on demand and target supply of cellular resources: “Virtually all commentators expect a 15 to 30-fold traffic increase over five years and several expect this growth trend to last a decade to 2020, representing a 250-1,000-fold increase.” (note: the cynic in me wonders about the “several”; it's more than 2, but is it much more than 3?)

The observant reader will see that the range between minimum and maximum is a factor of 4 … a rather large margin of error to plan for. If by 2020 the demand really were 1,000 times that of 2010, our Technologies had better be a lot better than that, as that would be an average with a long tail.

Of course most of us know that the answer really is 42! NOT 1000!

Joke aside … let's get serious about this 1,000 Fable!

Firstly, 1,000 is (according to Qualcomm) the expected growth of data between 2010 and 2020 … Thus if data was 42 in 2010 it would be 1000×42 by 2020. That would be a CAGR of 100% over the period, or a doubling of demanded data year in, year out, for 10 years.

… Well not really!

Qualcomm states that data demand in 2012 would be 10x that of 2010. Thus, it follows that data demand between 2012 and 2020 “only” would be 100x, or a CAGR of 78% over that period.

So in 2021 (1 year after we had 1,000x) we would see demand of ca. 1,800x, in 2022 (2 years after we solved the 1000x challenge) we would experience a demand of more than 3,000x, and so forth …
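The arithmetic behind those CAGRs and the post-2020 extrapolation (assuming the 78% p.a. trend simply continues past 2020) is straightforward:

```python
cagr_2010_2020 = 1000 ** (1 / 10) - 1  # ~1.0, i.e., ~100% p.a. over 10 years
cagr_2012_2020 = 100 ** (1 / 8) - 1    # ~0.78, i.e., ~78% p.a. over 8 years

# If the 78% p.a. trend simply continues after 2020:
demand_2021 = 1000 * (1 + cagr_2012_2020)       # ~1,778x -> "ca. 1,800x"
demand_2022 = 1000 * (1 + cagr_2012_2020) ** 2  # ~3,162x -> "more than 3,000x"
print(round(demand_2021), round(demand_2022))
```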

So great to solve the 1,000x challenge by 2020, but it's going to be like “peeing in your trousers on a cold winter day”. Yes, it will be warm, for a little while. Then it's going to be really cold. In other words, not going to help much structurally.

Could it be that this 1,000x challenge might be somewhat flawed?

  1. If All Commentators and Several Experts are to be believed, the growth worldwide is almost perfectly exponential with an annual growth rate between 70% and 100%.
  2. Growth is “unstoppable” -> unlimited sources for growth.

Actually most projections (from several expert sources;-) that I have seen do show substantial deceleration as the main sources of growth exhaust themselves, i.e., as the Early & Late Majority of customers adopt mobile data. Even Cisco's own “Global Mobile Data Traffic Forecast Update, 2011 – 2016” shows growth decelerating by an average of 20% per annum between 2010 and their 2014 projections (note: it's sort of “funny” that Cisco then decides that after 2014 growth no longer slows down but stays put at 78% … alas, artistic freedom I suppose?).

CELLULAR CUSTOMER MIGRATION

The following provides a projection of 2G, 3G and LTE uptake between 2010 (Actual) and 2020 (Expected). The dynamics are based on the latest Pyramid Research cellular projections for WEU, US, APAC, LA & CEE between 2010 and 2017. The “Last Mile”, 2018 – 2020, is based on reasonable dynamic extrapolations of the prior period with a stronger imposed emphasis on LTE growth. Of course Pyramid Research provides one view of the technology migration, and given the uncertainty in market dynamics and pricing policies it is simply one view on how the cellular telco world will develop. This said, I tend to find Pyramid Research getting reasonably close to actual developments, and the trends across the various markets are not that counter-intuitive.

For the US Market LTE is expected to grow very fast and reach a penetration level beyond 60% by 2020. For the other markets LTE is expected to evolve relatively sluggishly, with an uptake percentage of 20%+/-5% by 2020. It should be remembered that all projections are averages. Thus within a market, for a specific country or operator, the technology shares could very well differ somewhat from the above.

The growth rates for LTE customer uptake over the periods 2010/2011 – 2020 and 2015 – 2020, and the respective LTE share in 2020:

  • WEU 2010-2020: 87%, 2015 – 2020: 24%, share in 2020: 20%.
  • USA 2010-2020: 48%, 2015 – 2020: 19%, share in 2020: 62%.
  • APAC 2010-2020: 118%, 2015 – 2020: 61%, share in 2020: 30%.
  • CEE 2011-2020: 168%, 2015 – 2020: 37%, share in 2020: 20%.
  • LA 2010-2020: 144%, 2015 – 2020: 37%, share in 2020: 40%.

Yes, the LTE growth rates are very impressive when compared to the initial launch year with its very low initial uptake. As already pointed out in my Blog, growth rates referenced back to a penetration of less than 2% have little practical meaning. The average LTE uptake rate across all the above markets between 2012 and 2020 is 53%+/-17% (highest being APAC and lowest being USA).

What should be evident from the above technology uptake charts is that

  • 3G remains strong even in 2020 (though likely dominated by prepaid at that time).
  • 2G will remain for a long time in both CEE & APAC, even toward 2020.

In the scenario where we have a factor 100 growth in usage between 2012 and 2020, which is a CAGR of 78%, the growth of usage per user would be 16% p.a. at an annual uptake rate of 53%. However, without knowing the starting point of the LTE data usage (which initially will be very low as there are almost no users), these growth rates are not of much use and certainly cannot be used to draw any conclusions about congestion or network dire straits.
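The 16% per-user figure follows by dividing out the uptake growth from the total traffic growth:

```python
total_growth  = 100 ** (1 / 8)  # factor-100 usage growth 2012->2020 -> ~1.78 (78% CAGR)
uptake_growth = 1.53            # average annual LTE uptake rate of 53%
per_user_growth = total_growth / uptake_growth - 1
print(round(per_user_growth, 2))  # ~0.16 -> ~16% p.a. usage growth per user
```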

Example based on European Growth Figures:

A cellular network has 5 million customers, 50% Postpaid.

The network has 4,000 cell sites (12,000 sectors) that by 2020 cover both UMTS & LTE to the same depth.

In 2020 the operator has allocated 2×20 MHz for 3G & 2×20 MHz for LTE. The remaining 2G customers are on a single shared GSM network supporting all GSM traffic in the country with no more than 2×5MHz.

By 2020 the cellular operator has ca. 4 million 3G users and ca. 0.9 million LTE users (the remaining 100 thousand GSM customers are the real Laggards).

The 3G uptake growth rate '2010 – '2020 was 7%; between '10 – '12 it was 25%. 3G usage growth would not be very strong, as it's a blend of Late Majority and Laggards (including a fairly large Prepaid segment that appears to hardly use cellular data).

The LTE uptake growth rate '2010 – '2020 was 87%; between '10 – '12 it was 458%. The first 20% of LTE users would likely consist of Innovators and Early Adopters. Thus, usage growth of LTE should be expected to be more aggressive than for 3G.

Let's assume that 20% of the cell sites carry 50% of the devices and, for simplicity, also 50% of the data traffic (see for example my Slideshare presentation “Capacity Planning in Mobile Data Networks Experiencing Exponential Growth in Demand”, which provides evidence for such a distribution).

So we have ca. 800 3G users per sector (or ca. 40 3G users per sector per MHz). By 2020, one would likewise anticipate for LTE ca. 200 LTE users per sector (or ca. 10 LTE users per sector per MHz). Note that no assumptions on activity rate have been imposed.

Irrespective of growth rate, we need to ask ourselves whether 10 LTE users per sector per MHz would pose a congested situation (in the busy hour). Assume that the effective LTE spectral efficiency across a macro-cellular cell would be 5Mbps/MHz/sector. So the LTE users in a sector could on average share up to 100Mbps (@ 20MHz DL).

For 3G we would have 40 3G users per sector per MHz. Similar (very simple) considerations allow us to conclude that the 40 3G users would have no more than 40Mbps to share (under semi-ideal radio conditions and @ 20MHz DL). This could be a lot more demanding and customer-affecting than the resulting LTE demand, despite LTE having a substantially higher growth rate than we saw for 3G over the same period.
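The per-sector numbers above can be reproduced from the example's assumptions (the 5 and 2 Mbps/MHz/sector effective spectral efficiencies are the assumed values behind the 100Mbps and 40Mbps figures):

```python
sectors = 12000
hot_sectors = 0.20 * sectors   # 20% of sectors carry 50% of devices/traffic
users_3g, users_lte = 4.0e6, 0.9e6

per_sector_3g  = 0.5 * users_3g / hot_sectors   # ~833 -> "ca. 800" 3G users/sector
per_sector_lte = 0.5 * users_lte / hot_sectors  # ~188 -> "ca. 200" LTE users/sector
print(per_sector_3g / 20, per_sector_lte / 20)  # ~42 and ~9 users/sector/MHz

# Busy-hour sector throughput at the assumed effective spectral efficiencies:
lte_mbps = 5.0 * 20  # 5 Mbps/MHz/sector @ 20MHz DL -> 100 Mbps shared per sector
g3_mbps  = 2.0 * 20  # 2 Mbps/MHz/sector @ 20MHz DL ->  40 Mbps shared per sector
```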

High growth rates do not by default result in cellular network breakdown. It is the absolute traffic load (in the Busy Hour) that matters.

The growth of cellular data usage between 2010 and 2020 is likewise going to be awesome (it would be higher than the above technology uptake rates) … but also pretty meaningless.

Growth rates only matter in as much as growth brings an absolute demanded traffic level above the capability of the existing network and spectral resources (supplied traffic capacity).

Irrespective of whether a growth rate is high, medium or low … all can cause havoc in a cellular network … some networks will handle a 1,000x without much ado, others will tumble at 250x, whatever the reference point level (which also includes the network design and planning maturity levels).

However, what is important is how to provide more (economical) cellular capacity, avoiding:

  • Massive Congestion and loss of customer service.
  • Economical devastation as the operator tries to supply network resources for an un-managed cellular growth profile.

(Source: adapted from K.K. Larsen “Spectrum Limitations Migrating to LTE … a Growth Market Dilemma?“)

Facebook Values … Has the little boy spoken?

Facebook has lost ca. 450+ Million US$ per day since its IPO … or about 40 Billion US$ … in a little under 90 days (i.e., reference date 17-08-2012).

This is like losing an Economy such as the Seychelles every second day. Or a Bulgaria in less than 90 days. (Note: this is not to say that you could buy Bulgaria for $40B … well, who knows? 😉 … the comparison just serves to make the loss of Facebook value more tangible. Further, one should not take the suggested relationship between the market value of a corporation such as Facebook and the GDP of a country too seriously, as also pointed out by Dean Bubley @disruptivedean).

That's a lot of value lost in a very short time. I am sure Bulgarians, “Seychellois” and FB investors can agree on that.

40 Billion US Dollars? … It's a little less than 20 Mars Missions … or

40 Billion US Dollars could keep 35 thousand Americans in work for 50 years each!

So has the little boy spoken? Is the Emperor of Social Media Naked?

Illustration: THORARINN LEIFSSON http://www.totil.com

Let’s have a more detailed look at Facebook’s share price development since May 18th 2012.

The Chart below shows Facebook's share price journey, the associated book value, the corresponding sustainable share of Online Ad Spend (with an assumed 5-yr linear ramp-up from today's share) and the projected share of Online Ad Spend in 2012.

In the wisdom of looking backwards … is Facebook, the Super-Mario of Social Media, really such a bad investment? Or is this just a bump in a long and prosperous road ahead?

I guess it all rises and falls with whatever belief an investor has in Facebook's ability to capture sufficient Online Advertisement Spend. Online Ad Spend obviously includes the Holy Grail of Mobile Ad Revenues as well.

FB's revenue share of Online Ad Spend has risen steadily from 1.3% in 2009 to ca. 5% in 2011, and is projected to be at least 6% in 2012.

Take a look at FB's valuation (or book value), which at the time of the IPO (i.e., May 18th 2012) was ca. 80+ Billion US Dollars, equivalent to a share price of $38.32 per share (at closing).

In terms of sustainable business, such a valuation could be justifiable if FB could capture and sustain at least 23% of the Online Ad Spend in the longer run. Compare this with ca. 5% in 2011, and with Google's 40+% in 2011. AOL, which is in the Top 5 of companies best at conquering Online Advertisement Spend, had a share of Online Ad Spend a factor of 15 less than Google's. Furthermore, the Top 5 account for more than 70% of the Online Ad Spend in 2011. The remaining 30% of Online Ad Spend arises mainly from Asia-Pacific logographic, politically complicated, and Cyrillic-script countries, in which Latin-script Social Media & Search in general perform poorly (i.e., when it comes to capturing Online Ad Spend).

Don’t worry! Facebook is in the Top 5 list of companies getting a piece of the Online Advertisement pie.

It would appear likely that Facebook should be able to continue to increase its share of Online Ad Spend from today's fairly low level. The above chart shows that FB's current share price level (closing 17-August-2012) corresponds to a book value of ca. $40 Billion and a sustainable share of the Online Ad Spend of a bit more than 10%.

It would be sad if Facebook were never able to get more than 10% of the Online Ad Spend.

From this perspective:

A Facebook share price below $20 does seem awfully cheap!

Is it time to invest in Facebook? … at the moment it looks like The New Black is bashing Social Media!

So the share price of Facebook might drop further … as current investors try to off-load their shares (at least the ones that did not buy at, or immediately after, the IPO).

Facebook has 900+ Million (and approaching a Billion) users. More than 500+ Million of those 900+ Million Facebook users are active daily, massively using their Smartphones to keep updated with Friends and Fiends. In 2011 there were more than 215 Billion FB events.

Facebook should be a powerhouse for Earned and Owned Social Media Ads (sorry, this is really still Online Advertisement despite the Social Media tag) … we consumers are much more susceptible to friends' endorsements, or those of our favorite brands (for that matter), than to the mass-fabricated plain old online advertisement that most of us are blind to anyway (or get annoyed by, which from an awareness perspective is not necessarily un-intended).

All in all

Maybe the Little Boy will not speak up as the Emperor is far from naked!

METHODOLOGY

See my Social Media Valuation Blog “A walk on the Wild Side”.

The following has been assumed in the FB Valuation Assessment:

  1. WACC of 9.4%.
  2. In 2012 FB captures 6% of the total online ad spend.
  3. FB gains a sustainable share of online ad spend of X%.
  4. 5-year linear ramp-up from 6% in 2012 to X%, which is then maintained.
  5. Other revenues are 15% of total in 2012, linearly reduced to 10% after 5 years and then maintained.
  6. FB can maintain a free cash flow yield of 25%.
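For the curious, the valuation logic behind these assumptions can be sketched as a simple discounted cash flow. Note that the total online ad spend level, its growth rate and the terminal growth rate below are illustrative placeholders of mine, not figures from the assessment:

```python
# Hypothetical back-of-envelope FB valuation following the listed assumptions.
# Ad-spend level, ad-spend growth and terminal growth are made-up placeholders.
WACC = 0.094
FCF_YIELD = 0.25          # free cash flow as share of revenue (assumption 6)
AD_SPEND_2012 = 85e9      # assumed global online ad spend in 2012 (placeholder)
AD_GROWTH = 0.10          # assumed annual ad-spend growth (placeholder)
TERMINAL_GROWTH = 0.02    # assumed perpetuity growth (placeholder)

def fb_value(share_sustainable, share_2012=0.06, ramp_years=5):
    """Discounted free cash flow under a linear ad-spend-share ramp-up."""
    value = 0.0
    for year in range(1, ramp_years + 1):
        # linear ramp of FB's share of online ad spend (assumption 4)
        share = share_2012 + (share_sustainable - share_2012) * year / ramp_years
        ad_spend = AD_SPEND_2012 * (1 + AD_GROWTH) ** year
        # other revenues fall linearly from 15% to 10% of total (assumption 5)
        other = 0.15 - 0.05 * year / ramp_years
        revenue = share * ad_spend / (1 - other)
        fcf = revenue * FCF_YIELD
        value += fcf / (1 + WACC) ** year
    # terminal value as a growing perpetuity on the final-year cash flow
    terminal = fcf * (1 + TERMINAL_GROWTH) / (WACC - TERMINAL_GROWTH)
    value += terminal / (1 + WACC) ** ramp_years
    return value

for share in (0.10, 0.23):
    print(f"sustainable ad-spend share {share:.0%}: ~${fb_value(share) / 1e9:.0f}B")
```

Playing with the sustainable share X% in such a sketch reproduces the qualitative picture above: the gap between a ~10% and a ~23% sustainable share is what separates "awfully cheap" from IPO-level valuations.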

Mobile Data Growth … The Perfect Storm? (PART 1)

The Perfect Mobile Data Storm … the Smartphone Challenge and, by that, the Signalling Storm.

Mobile Operators hit by the Mobile Data Tsunami … tumbling over mobile networks … leading to

Spectrum Exhaustion

and

Cash Crunch

and

Financial disaster (as cost of providing mobile data exceeds the revenues earned from mobile data).

as Mobile Operators try to cope with hyper-inflationary growth of data usage.

Will LTE be ready in time?

Will LTE be sufficient to remedy the mobile data growth observed over the last couple of years?

The Mobile Industry would have been better off if Data Consumption had stayed “Fixed”? Right! …Right?

At this time my Twitter Colleague Dean Bubley (@Disruptivedean) will be near critical meltdown 😉 …

Dean Bubley (Disruptive Wireless) is deeply skeptical about the rhetoric around the mobile data explosion and tsunamis, as he has accounted for in a recent Blog “Mobile data traffic growth – a thought experiment and forecast”. Dean hints at possible ulterior motives behind the dark dark picture of the mobile data future painted by the Mobile Industry.

I do not share Dean's opinion (re: ulterior motives in particular; most of his other thoughts on cellular data growth are pretty OK!). It almost suggests a Grand Mobile Industry Conspiracy in play … giving the Telco Industry a little too much credit … rather than the simple fact that we as an industry (in particular the Marketing side of things) tend to be governed by the short term, being "slaves of anchoring bias" to the most recent information available to us (i.e., rarely more than the last 12 or so months).

Of course Technology Departments in the Mobile Industry use the hyper-growth of Cellular Data to get as much Capex as possible, ensuring sufficient capacity overhead can be bought and built into the Mobile Networks, mitigating the uncertainty and complexity of Cellular data growth.

Cellular Data is by its very nature a lot more difficult to forecast and plan for than the plain old voice service.

The Mobile Industry appears to suffer from Mobile Data Auctusphobia … The Fear of Growth (which is sort of "funny", as for the first ca. 4 – 5 years of UMTS we were all looking for growth of data, and of course the associated data revenues, that would make our extremely expensive 3G spectrum a somewhat more reasonable investment … ).

The Mobile Industry got what it wished for with the emergence of the Smartphone (Thanks Steve!).

Why Data Auctusphobia? … ?

Let's assume that an operator experienced a Smartphone growth rate of 100+% over the last 12 months. In addition, the operator also observes the total mobile data volume demand growing by 250+% (i.e., not uncommon annual growth rates between 2010 and 2011). It is very tempting (and also likely to be very wrong!) to use the historical growth rate going forward without much consideration for the underlying growth dynamics of technology uptake, migration and usage-per-user dynamics. Clearly one would be rather naive NOT to be scared about the consequences of a sustainable annual growth rate of 250% (irrespective of such thinking being flawed)!

The problem with this (naive) "forecasting" approach is that anchoring on the past is NOT likely to be a very good predictor of longer-term expectations.

THE GROWTH ESSENTIALS – TECHNOLOGY ADOPTION.

To understand mobile data growth, we need to look at minimum two aspects of Growth:

  1. Growth of users (per segment) using mobile data (i.e., data uptake).
  2. Growth of data usage per user segment (i.e., segmentation is important as averages across a whole customer base can be misleading).

i.e., Growth can be decomposed into the uptake rate of users and the growth of these users' data consumption, i.e., CAGR_Volume = (1 + CAGR_Users) x (1 + CAGR_Usage) – 1.
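A quick numerical sanity check of this decomposition (the two input rates are purely illustrative):

```python
# Illustrative check of CAGR_Volume = (1 + CAGR_Users) x (1 + CAGR_Usage) - 1
cagr_users = 1.00   # 100% annual growth in data users (made-up number)
cagr_usage = 0.75   # 75% annual growth in usage per user (made-up number)
cagr_volume = (1 + cagr_users) * (1 + cagr_usage) - 1
print(f"Volume CAGR = {cagr_volume:.0%}")  # 250%
```

Note that the two rates compound: doubling the users while usage per user grows 75% gives 250% volume growth, not 175%.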

The segmentation should be chosen with some care, although a split into Postpaid and Prepaid should be a minimum requirement. Further refinements would be to include terminal type & capabilities, terminal OS, usage categories, pricing impacts, etc. We see that the growth prediction process very rapidly gets fairly complex, involving a large number of uncertain assumptions. Needless to say, Growth should also be considered per Access Technology, i.e., split into GPRS/EDGE, 3G/HSPA, LTE/LTE-A and WiFi.

Let's have a look at the (simple) growth of a new technology, or in other words its adoption rate.

The above chart illustrates the most common uptake trend that we observe in mobile networks (and in many other situations of consumer product adoption). The highest growth rates are typically observed in the beginning. Over time the growth rate slows down as saturation is reached. In other words, the source of growth has been exhausted.

At Day ZERO there were ZERO 3G terminals and their owners.

At Day ONE some users had bought 3G terminals (e.g., Nokia 6630).

Going from Zero to Some, 3G terminals amount to an infinite growth rate … So Wow! … Helpful … Not really!

Some statistics:

In most countries it has taken on average 5 years to reach a 20% 3G penetration.

The real take-off moment of 3G uptake came with the introduction of the iPhone 3G (announced June 9, 2008) and HTC/Google G1 (October 2008) smartphones.

Simplified example: in 4 years a Mobile Operator's 3G uptake went from 2% to 20%, a compound annual growth rate (CAGR) of ca. 78%. Over the same period the average mobile (cellular!) data consumption per user increased by a factor of 15 (e.g., from 20MB to 300MB), which gives a growth rate of ca. 97% per annum. Thus the total volume today is ca. 150 times that of 4 years ago, equivalent to an annual growth rate of ca. 250%!
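The arithmetic of this simplified example, for those who like to check:

```python
# Reproducing the simplified example: uptake 2% -> 20% and 20MB -> 300MB over 4 years.
years = 4
cagr_uptake = (0.20 / 0.02) ** (1 / years) - 1          # ca. 0.78, i.e., ~78% p.a.
cagr_usage = (300 / 20) ** (1 / years) - 1              # ca. 0.97, i.e., ~97% p.a.
cagr_volume = (1 + cagr_uptake) * (1 + cagr_usage) - 1  # ca. 2.50, i.e., ~250% p.a.
volume_factor = (1 + cagr_volume) ** years              # ca. 150x over the 4 years
print(f"uptake {cagr_uptake:.0%}, usage {cagr_usage:.0%}, "
      f"volume {cagr_volume:.0%}, total x{volume_factor:.0f}")
```

The 150x total is simply the 10x uptake factor multiplied by the 15x usage factor; the 250% annual rate follows from compounding that over 4 years.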

In Geoffrey A. Moore's book "Crossing the Chasm" (on marketing and selling high-tech products to mainstream customers), adopters are segmented into (1) Innovators (i.e., first adopters), (2) Early Adopters, (3) Early Majority, (4) Late Majority and (5) The Laggards.

It is fairly common to ignore the Laggards in most analysis, as these do not cause direct problems for new technology adoption. However, in mobile networks Laggards can become a problem if they prevent the operator from re-farming legacy spectrum by refusing to migrate, e.g., preventing GSM 900MHz spectrum from being re-purposed to UMTS, or GSM 1800 to LTE.

Each of the stages defined by Geoffrey Moore corresponds to a different time period in the life cycle of a given product; mapped to the above chart on technology uptake it looks like this:

In the above “Crossing the Chasm” chart I have imposed Moore’s categories on a logistic-like (or S-curve shaped) cumulative distribution function rather than the Bell Shaped (i.e., normal distribution) chosen in his book.

3G adoption has typically taken ca. 5 +/- 1 years from launch to reach the stage of Early Majority.

In the mobile industry it's fairly common for a user to have more than 1 device (i.e., a handset typically combined with a data stick or tablet, as well as a private & work-related device split, etc.). In other words, there are more mobile accounts than mobile users.

In 2011, Western Europe had ca. 550 Million registered mobile accounts (i.e., as measured by active SIM cards) and a population of a little over 400 Million. Thus a mobile penetration of ca. 135%, or 160+% if we only consider the population with a disposable income.

The growth of 3G users (i.e., defined as somebody with a 3G-capable terminal) has been quite incredible, with initial annual growth rates exceeding 100%. Did this growth rate continue? NO it did NOT!

As discussed previously, it is absolutely to be expected to see very high growth rates in the early stages of technology adoption. The starting point is Zero or Very Low, and incremental additions weigh more in the beginning than later on in the adoption process.

The above chart (“CAGR of 3G Customer Uptake vs 3G Penetration”) illustrates the annual 3G uptake growth rate data points, referenced to the year of 10% penetration, for Germany, Netherlands and USA (i.e., which includes CDMA2000). It should be noted that 3G Penetration levels above 50+% are based on Pyramid Research projections.

The initial growth rates are large and then slow down as the 3G penetration increases.

As saturation is reached the growth rate comes almost to a stop.

3G saturation level is expected to be between 70% and 80+% … When LTE takes over!

For most Western European markets the saturation is expected to be reached between 2015 – 2018 and sooner in the USA … LTE takes over!

The (diffusion) process of technology uptake can be described by S-shaped curves (e.g., as shown in "Crossing the Chasm"). The simplest mathematical description is a symmetric logistic function (i.e., Sigmoid) that only depends on time. The top solid (black) curve shows the compound annual growth rate, referenced to the year of 10% 3G penetration, vs 3G penetration. Between 10% and 15% 3G penetration the annual growth rate is 140%, between 10% and 50% it is "only" 108%, and it drops to 65% at 90% 3G penetration (which might never be reached, as users start migrating to LTE).

The lower dashed (black) curve is a generalized logistic function that provides a higher degree of modelling flexibility, accounting for a non-symmetric adoption rate depending on the 3G penetration. No attempt at curve fitting to the data has been applied in the chart above; I find that the generalized logistic function can in general be made to agree well with actual uptake data. Growth here is more modest, with 72% (vs 140% for the simple logistic representation), 57% (vs 108%) and 35% (vs 65%): undershooting at the beginning of the growth process (from 10% -> 20%: Innovators & Early Adopters phase) but representing actual data well after 20% 3G penetration (Early and Late Majority).

Finally, I have also included the Gompertz function (also a sigmoid), represented by the light (grey) dashed line in between the Simple and Generalized Logistic functions. The Gompertz function has found many practical applications describing growth. The parameters of the Gompertz function can be chosen so that growth near the lower and upper boundaries differs (i.e., asymmetric growth dynamics near the upper and lower asymptotes).
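For reference, the three S-curve families discussed above can be written down compactly. The parameter values below are arbitrary illustrations, not fits to the chart data:

```python
import math

def logistic(t, t0=0.0, k=1.0):
    """Simple symmetric logistic (Sigmoid); saturates at 1.0, midpoint at t0."""
    return 1.0 / (1.0 + math.exp(-k * (t - t0)))

def generalized_logistic(t, t0=0.0, k=1.0, nu=0.5):
    """Richards-type generalized logistic; nu != 1 skews the curve, allowing
    different growth dynamics below and above the midpoint."""
    return (1.0 + math.exp(-k * (t - t0))) ** (-1.0 / nu)

def gompertz(t, t0=0.0, k=1.0):
    """Gompertz sigmoid; approaches the lower and upper asymptotes asymmetrically."""
    return math.exp(-math.exp(-k * (t - t0)))

# e.g., penetration in the years around the midpoint year t0
for t in (-4, -2, 0, 2, 4):
    print(t, round(logistic(t), 3),
          round(generalized_logistic(t), 3), round(gompertz(t), 3))
```

Scaling any of these by the saturation level (e.g., the 70% – 80+% mentioned above) and differencing year-on-year gives exactly the kind of declining CAGR-vs-penetration curves shown in the chart.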

As most mature 3G markets have passed 50% 3G penetration (i.e., eating into the Late Majority) and are approaching saturation, one should expect annual growth rates of 3G uptake to rapidly reduce. The introduction of LTE will also have a substantial impact on 3G uptake and growth.

Of course the above is a simplification of the many factors that should be considered. It is important that you:

  1. Differentiate between Prepaid & Postpaid.
  2. Consider segmentation (e.g., Innovator, Early Adopter, Early Majority & Late Majority).
  3. Keep projections self-consistent with market dynamics: i.e., Gross Adds, Churn, hand-down and upgrade dynamics within the Base, etc.

THE GROWTH ESSENTIALS – THE CELLULAR USAGE.

In the following I will focus on Cellular (or Mobile) data consumption. Thus any WiFi consumption on public, corporate or residential access points is deliberately not considered. Obviously, in cellular data demand forecasting WiFi usage can be important, as it is a potential source of cellular consumption via on-loading, in particular as new and better-performing cellular technologies are introduced (i.e., LTE / LTE-Advanced). Price plan policy changes might also result in higher on-load of the cellular network (at least if that network is relatively unloaded and has lots of spare capacity).

It should come as no surprise that today the majority of mobile data consumers are Postpaid.

Thus, most of the average data usage figures being reported are based on the Postpaid segment. This also implies that projecting future usage based on past and current usage could easily overshoot, particularly if Prepaid consumption were substantially lower than Postpaid data consumption. The interesting and maybe somewhat surprising observation is that Active Prepaid mobile data consumers can have a fairly high data consumption (obviously pending price plan policy). In the example shown below, for a Western European Operator with a ca. 50%:50% Postpaid – Prepaid mix, the Postpaid active mobile data consumers are 85% of the total Postpaid Base. The Mobile Data Active Prepaid base is only 15% of the Prepaid base (though growing fast).

The illustrated data set, which is fairly representative of an aggressive smartphone operation, has an average data consumption of ca. 100MB (based on the whole customer base) and an Active Average consumption of ca. 350MB, though fairly big consumptive variations are observed within various segments of the customer base.

The first 4 Postpaid price plans are Smartphone based (i.e., iOS and Android) and comprises 80% of all active devices on the Network. “Other Postpaid” comprises Basic Phones, Symbian and RIM devices. The Active Prepaid device consumption are primarily Android based.

We observe the following:

  1. Unlimited price plan results in the highest average volumetric usage (“Unlimited Postpaid” & “Postpaid 1″ price plans are comparable in device composition. The difference is in one being unlimited the other not).
  2. Unlimited average consumption dominated by long tail towards extreme usage (see chart below).
  3. Smartphone centric postpaid price plans tend to have a very high utilization percentage (90+%).
  4. Active Prepaid Data Consumption (200MB) is almost as high as that of less aggressive smartphone price plans (210MB) (this does, however, greatly depend on prepaid price policy).

The above chart “Cellular Data Consumption Distribution” illustrates the complexity of technology and cellular data consumption, even within different price plan policies. Most of the distributions consist of up to 4 sub-segments of usage profiles. Most notable are the higher consumption segment and the non-/very-low consumptive segment.

There are several observations worth mentioning:

  • Still a largely untapped Prepaid potential (for new revenue as well as additional usage).
  • 15% of Postpaid consumers are data inactive (i.e., Data Laggards).
  • 40% of the active Postpaid base consumes less than 100MB, or less than 1/4 of the average high-end Smartphone usage.

Clearly, the best approach to come to a meaningful projection of cellular data usage (per consumer) would be to consider all the above factors in the estimate.

However, there is a problem!

The Past Trends may not be a good basis for predicting Future Trends!

Using The Past we might risk largely ignoring:

  1. Technology Improvements that would increase cellular data consumption.
  2. New Services that would boost cellular data usage per consumer.
  3. New Terminal types that would lead to another leapfrog in cellular data consumption.
  4. Cellular Network Congestion leading to reduced growth of data consumption (i.e., reduced available speed per consumer, QoS degradation, etc..).
  5. Policy changes such as Cap or allowing Unlimited usage.

Improvements in terminal equipment performance (i.e., higher air-interface speed capabilities, more memory, better CPU performance, larger / better displays, …) should be factored into the cellular data consumption, as the following chart illustrates (for more details see also Dr. Kim’s Slideshare presentation on “Right Pricing Mobile Broadband: Examining The Business Case for Mobile Broadband”).

I like to think of every segment category as having its own particular average data usage. A very simple consideration (supported by real data measurements) would be to expect to find the extreme (or very high) data usage in the Innovator and Early Adopter segments; as more of the Majority (Early as well as Late) is considered, the data usage reduces. Eventually, in the Laggards segment, hardly any data usage is observed.

It should be clear that the above average usage-distribution profile is dynamic. As time goes by the distribution spreads out towards higher usage (i.e., the per-user per-segment inflationary consumption), while at the same time increasingly more of the customer base reflects the majority of a given operator's customer base (i.e., Early and Late Majority).
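This mix effect is easy to illustrate with made-up numbers: even when every segment's per-user usage grows, the blended average can still fall as low-usage majority users flood in:

```python
# Illustrative mix effect (all figures invented): users in millions, usage in MB/month.
early = {"users": 1.0, "mb": 500}   # Innovators + Early Adopters
late = {"users": 0.2, "mb": 50}     # Early/Late Majority just starting on data

def avg(segments):
    """Blended average usage across segments, weighted by user count."""
    total_mb = sum(s["users"] * s["mb"] for s in segments)
    total_users = sum(s["users"] for s in segments)
    return total_mb / total_users

before = avg([early, late])
# a year later: per-segment usage is up 30%, but the majority segment has grown 5x
early2 = {"users": 1.1, "mb": 650}
late2 = {"users": 1.0, "mb": 65}
after = avg([early2, late2])
print(f"average before: {before:.0f} MB, after: {after:.0f} MB")
```

With these (invented) numbers the blended average drops from 425MB to ca. 371MB, even though total volume grew and every segment consumes more per user: the dilution from the incoming majority outweighs the inflation within segments.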

Thus over time it would be reasonable to expect that:

The average volumetric consumption could develop to an average that is lower than when Innovators & Early Adopters dominated.

Well maybe!? Maybe not?!

The usage dynamics within a given price plan are non-trivial (to say the least), and we see in general a tendency towards the higher-usage sub-segments (i.e., within a given capped price plan). The following chart (below) is a good example of the data consumption within the same Capped Smartphone price plan over a 12-month period. The total number of consumers in this particular example has increased 2.5 times over the period.

It is clear from the above chart that over the 12-month period the higher-usage sub-segment has become increasingly popular. Irrespective of this, the overall average (including non-active users of this Smartphone price plan) has not increased over the period.

Though by no means does this need to be true for all price plans. The following chart illustrates the dynamics over a 12 month period of an older Unlimited Smartphone price plan:

Here we actually observe a 38% increase in the average volumetric consumption per customer. Over the period ca. 50% of the customers in this price plan have dropped out, leaving primarily heavy users enjoying the benefits of unlimited consumption.

There is little doubt that most mature developed markets with a long history of 3G/HSPA will have reached a 3G uptake level that includes most of the Late Majority segment.

However, for the prepaid segment it is also fair to say that most mobile operators have likely only just started to approach and appeal to Innovators and Early Adopters. The chart below illustrates the last 12 months of prepaid cellular consumptive behavior.

In this particular example ca. 90% of the Prepaid customer base are not active cellular data consumers (this is not an unusual figure), and even over the period this number has not changed substantially. The Active Prepaid users consume on average 40% more cellular data than 12 months ago. There is a strong indication that the prepaid consumptive dynamics resemble those of Postpaid.

Data Consumption is a lot more complex than the Technology Adoption of the Cellular Customer.

The data consumptive dynamics are, on a high level, pretty much as follows:

  1. Late (and in some cases Early) Majority segments commence consuming cellular data (this will drag down the overall average).
  2. Fewer non-active cellular data consumers (besides Laggards) -> having an upward pull on the average consumption.
  3. (In particular) Innovator & Early Adopter consumption increases within the limits of a given price plan (this will tend to pull up the average).
  4. General migration upwards to higher sub-segmented usage (pulling the overall average upwards).
  5. If Capped pricing is implemented (without any Unlimited price plans in effect), growth will slow down as consumers approach the cap.

We have also seen that it is sort of foolish to discuss a single data usage figure and create all kinds of speculative stories around such a number.

BRINGING IT ALL TOGETHER.

So what’s all this worth unless one can predict some (uncertain) growth rates!

WESTERN EUROPE (AT, BE, DK, FIN, F, DE,GR,IRL,IT,NL,N,P, ESP, SE, CH, UK,)

3G uptake in WEU was ca. 60% in 2011 (i.e., ca. 334 Million 3G devices). This corresponds to ca. 90% of all Postpaid customers and 32% of all Prepaid users having a 3G device. Of course it does not mean that all of these are active cellular data users. Actually, today (June 2012) ca. 35% of the postpaid 3G users can be regarded as non-active cellular data users, and for prepaid this number may be as high as 90%.

For Western Europe, I do not see many more 3G additions in the Postpaid segment. It will be more about replacement and the natural upgrade to more capable devices (i.e., higher air-interface speed, better CPU, memory, display, etc.). We will see an increasing migration from 3G Postpaid towards LTE Postpaid. This migration will really pick up between 2015 and 2020 (Western Europe lagging behind in LTE adoption in comparison with, for example, the USA and some of the Asia-Pacific countries). In principle this could also mean that growth of 3G Postpaid cellular data consumption could rapidly decline (towards 2020) and we would start seeing overall 3G Postpaid data traffic decline rather than increase.

Additional cellular data growth may come from the Prepaid segment. However, a very large proportion of this segment is still largely data-inactive in Western Europe. There are signs that, depending on the operator's prepaid price plan policy, prepaid consumption is fairly similar to Postpaid on a per-user basis.

3G Growth Projections for Western Europe (reference year 2011):

The above assumes that usage caps will remain; I have assumed these to average 2GB for WEU. It is further assumed that the Prepaid segment will remain largely dominated by Laggards (i.e., in-active cellular data users) and that the active Prepaid cellular data users have consumption similar to Postpaid.

Overall 3G cellular data growth for Western Europe comes to between 3x and no more than 4x (the latter for very aggressive prepaid cellular data uptake & growth) over the period 2011 to 2016.
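For perspective, the implied annual growth rates behind that 3x – 4x band are simple arithmetic (no new assumptions):

```python
# 3x to 4x growth over the 5 years 2011 -> 2016, expressed as annual rates
for factor in (3, 4):
    cagr = factor ** (1 / 5) - 1
    print(f"{factor}x over 5 years ~ {cagr:.0%} per annum")
```

That is roughly 25% – 32% per annum, a far cry from the 250% annual rates of the early smartphone years.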

Postpaid 3G cellular data growth will flatten and possibly decline towards 2020.

More aggressive LTE Smartphone uptake (though on average across Western Europe this appears unlikely) could further relieve 3G growth pains between 2015 – 2020.

Innovators & Early Adopters, who demand the most of the 3G Cellular Networks, should be expected to move quickly to LTE (as coverage is provided), off-loading the 3G networks over-proportionally.

The 3G cellular growth projections are an average consideration for Western Europe, where most of the postpaid 3G growth has already happened, with an average of 60% overall 3G penetration. As a rule of thumb: the lower the 3G penetration, the higher the CAGR growth rates (as measured from a given earlier reference point).

In order to be really meaningful and directly usable to a Mobile Operator, the above approach should be carried out for a given country's and a given operator's conditions.

The above growth rates are lower than, but within range of, what my Twitter Colleague Dean Bubley (@Disruptivedean) states as his expectations for Developed Markets in his Blog “Mobile data traffic growth – a thought experiment and forecast”. Not that this makes them more correct or more wrong! Though anyone who spends a little time on the growth fundamentals of existing Western European mobile data markets would not find this kind of growth rate surprising.

So what about LTE growth? … Well, given that we today (in Western Europe) have a very, very small installed base of LTE devices on our networks … the growth or uptake (seen on its own) is obviously going to be very HIGH for the first 5 to 7 years (depending on go-to-market strategies).

What will be particularly interesting with the launch of LTE is whether we will see an on-loading effect on the cellular LTE network from today's WiFi usage. Thomas Wehmeier (Principal Analyst, Telco Strategy, Informa @Twehmeier) has published two very interesting and study-worthy reports on Cellular and WiFi Smartphone Usage (see “Understanding today’s smartphone user: Demystifying data usage trends on cellular & Wi-Fi networks” from Q1 2012, as well as Thomas’s sequel report from a couple of weeks ago, “Understanding today’s smartphone user: Part 2: An expanded view by data plan size, OS, device type and LTE”).

THE CLIFFHANGER

Given the dramatic beginning of my Blog concerning the future of the Mobile Industry and Cellular data … and to be fair to the many valid objections that Dean Bubley has raised in his own Blog and in his Tweets … I do owe the reader who got through this story some answers …

I have no doubt (actually I know) that there are mobile operators (around the world) that already today are in dire straits with their spectral resources due to very aggressive data growth triggered by the Smartphone. Even where growth has slowed down as their 3G customers (i.e., the Postpaid segment) have reached the Late Majority (and are possibly fighting the Laggards), that lower growth rate still causes substantial challenges in providing sufficient capacity & not to forget quality.

Yes … 3G/HSPA+ Small Cells (and DAS-like solutions) will help mitigate the growing pains of mobile operators. Yes … WiFi off-load too. Yes … LTE & LTE-Advanced will help as well, though the last solution will not be much of a help before a critical mass of LTE terminals has been reached (i.e., ca. 20% = Innovators + Early Adopters).

Often forgotten: traffic management and policy remedies (not per se Fair Use Policy though!) are of critical importance too in the toolset for managing cellular data traffic.

Operators in emerging markets, and in markets with a relatively low 3G penetration, had better learn the Growth Lessons from AT&T and other similar Front Runners in the Cellular Data and Smartphone Game.

  1. Unless you manage cellular data growth from the very early days, you are asking for (in-excusable) growth problems.
  2. Being Big in terms of customers is not per se a blessing if you don't proportionally have the spectrum to support that Base.
  3. Don't expect to keep the same quality level throughout your 3G cellular data life-cycle!
  4. Accept that spectral overhead per customer will obviously dwindle as increasingly more customers migrate to 3G/HSPA+.
  5. Technology Laggards should be considered carefully, as they pose an enormous risk to spectral re-farming and migration to more data-efficient technologies.
  6. Short term (3 – 5 years) … LTE will not mitigate 3G growing pains (you have a problem today; it is going to get tougher, and then some, tomorrow).

Is Doom knocking on Telecom's Door? … Not very likely (or at least we don't need to open the door if we are smart about it) … Though if an Operator doesn't learn fast and isn't furiously passionate about economical operation and pricing policies … things might look a lot more gloomy than they need to be.

STAY TUNED FOR A PART 2 … taking up the last part in more detail.

ACKNOWLEDGEMENT

To the great friends and colleagues that have challenged, suggested, discussed, screamed and shouted (in general, shared the passion on this particular topic of Cellular Data Growth) about this incredibly important topic for our Mobile Industry (and increasingly Fixed Broadband). I am in particular indebted to Dejan Radosavljevik for bearing with my sometimes crazy data requests (at odd hours and moments) and, last but not least, for thinking along with me on what mobile data (cellular & WiFi) really means (though we have both come to the conclusion that being mobile is not what it means. But that is a different interesting story for another time).