Facebook Values … Has the little boy spoken?

Facebook has lost ca. 450+ Million US$ per day since its IPO … or about 40 Billion US$ … in a little under 90 days (i.e., reference date 17-08-2012).

This is like losing an Economy such as the Seychelles every second day. Or a Bulgaria in less than 90 days. (Note: this is not to say that you could buy Bulgaria for $40B … well, who knows? 😉 … the comparison just serves to make the loss of Facebook value more tangible. Further, one should not take the suggested relationship between the market value of a corporation such as Facebook and the GDP of a country too seriously, as also pointed out by Dean Bubley @disruptivedean).

That’s a lot of value lost in a very short time. I am sure Bulgarians, “Seychellians” and FB investors can agree to that.

40 Billion US Dollars? … It’s a little less than 20 Mars Missions … or

40 Billion US Dollar could keep 35 thousand Americans in work for 50 years each!

So has the little boy spoken? Is the Emperor of Social Media Naked?

Illustration: THORARINN LEIFSSON http://www.totil.com

Let’s have a more detailed look at Facebook’s share price development since May 18th 2012.

The chart below shows Facebook’s share price journey, the associated book value, the corresponding sustainable share of Online Ad Spend (with an assumed 5-year linear ramp-up from today’s share) and the projected share of Online Ad Spend in 2012.

In the wisdom of looking backwards … is Facebook, the Super-Mario of Social Media, really such a bad investment? Or is this just a bump in a long and prosperous road ahead?

I guess it all rises and falls with whatever belief an investor has in Facebook’s ability to capture sufficient Online Advertisement Spend. Online Ad Spend obviously includes the Holy Grail of Mobile Ad Revenues as well.

FB’s revenue share of Online Ad Spend has risen steadily from 1.3% in 2009 to ca. 5% in 2011 and is projected to be at least 6% in 2012.

Take a look at FB’s valuation (or book value), which at the time of the IPO (i.e., May 18th 2012) was ca. 80+ Billion US Dollars, equivalent to a share price of $38.32 (at closing).

In terms of sustainable business, such a valuation could be justifiable if FB could capture and sustain at least 23% of the Online Ad Spend in the longer run. Compare this with ca. 5% in 2011, and with Google’s 40+% in 2011. AOL, which is in the Top 5 of companies at capturing Online Advertisement Spend, had a share of Online Ad Spend a factor of 15 lower than Google’s. Furthermore, the Top 5 accounted for more than 70% of the Online Ad Spend in 2011. The remaining 30% of Online Ad Spend arises mainly from Asia-Pacific logo-graphic, politically complicated, and Cyrillic-dominated countries, in which Latin-based Social Media & Search in general perform poorly (i.e., when it comes to capturing Online Ad Spend).

Don’t worry! Facebook is in the Top 5 list of companies getting a piece of the Online Advertisement pie.

It would appear likely that Facebook should be able to continue to increase its share of Online Ad Spend from today’s fairly low level. The above chart shows that FB’s current share price level (closing 17-August-2012) corresponds to a book value of ca. $40 Billion and a sustainable share of the Online Ad Spend of a bit more than 10%.

It would be sad if Facebook were never able to capture more than 10% of the Online Ad Spend.

From this perspective:

A Facebook share price below $20 does seem awfully cheap!

Is it time to invest in Facebook? … at the moment it looks like bashing Social Media is The New Black!

So the share price of Facebook might drop further … as current investors try to off-load their shares (at least the ones that did not buy at, or immediately after, the IPO).

Facebook has 900+ Million (and approaching a Billion) users. More than 500 Million of those are active daily, massively using their Smartphones to keep updated with Friends and Fiends. In 2011 there were more than 215 Billion FB events.

Facebook should be a powerhouse for Earned and Owned Social Media Ads (sorry, this is really still Online Advertisement despite the Social Media tag) … we consumers are much more susceptible to a friend’s endorsements, or those of our favorite brands, than to the mass-fabricated plain old online advertisement that most of us are blind to anyway (or get annoyed by, which, from an awareness perspective, is not necessarily unintended).

All in all

Maybe the Little Boy will not speak up as the Emperor is far from naked!

METHODOLOGY

See my Social Media Valuation Blog “A walk on the Wild Side”.

The following has been assumed in the FB Valuation Assessment (a sketch of the resulting model follows the list):

  1. WACC of 9.4%.
  2. In 2012, FB captures 6% of total online ad spend.
  3. FB gains a sustainable share of online ad spend of X%.
  4. 5-year linear ramp-up from 6% in 2012 to X%, then maintained.
  5. Other revenues are 15% in 2012, linearly reduced to 10% after 5 years and then maintained.
  6. FB can maintain a free cash flow yield of 25%.
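For the pedestrian who wants to play along, here is a minimal sketch (in Python, my own illustration) of how these assumptions combine into a discounted free cash flow valuation. The ad-spend path interpolates the 2012 ($94B) and 2015 ($132B) eMarketer figures quoted elsewhere in this Blog and is held flat thereafter; the interpolation and the flat tail are my simplifying assumptions, not part of the original model.

```python
# Minimal sketch of the valuation model described by the assumptions above.
WACC = 0.094          # assumption 1
FCF_YIELD = 0.25      # assumption 6
AD_SPEND = [94, 106, 119, 132, 132, 132]  # $B, 2012..2017 (flat tail: my assumption)

def fb_value(sustainable_share, start_share=0.06, ramp_years=5,
             other_rev0=0.15, other_rev_end=0.10):
    """Discounted FCF with a 0%-growth perpetuity as terminal value."""
    value = 0.0
    for t, spend in enumerate(AD_SPEND, start=1):
        # linear ramp of the ad-spend share over ramp_years, then maintained
        share = start_share + (sustainable_share - start_share) * min(t, ramp_years) / ramp_years
        # other revenues fall linearly from 15% to 10% of total revenue
        other = other_rev0 + (other_rev_end - other_rev0) * min(t, ramp_years) / ramp_years
        revenue = spend * share / (1.0 - other)   # gross up ad revenue by other revenues
        fcf = revenue * FCF_YIELD
        value += fcf / (1 + WACC) ** t
    value += (fcf / WACC) / (1 + WACC) ** len(AD_SPEND)  # terminal value on last year's FCF
    return value

print(f"Sustainable share 10% -> ca. ${fb_value(0.10):.0f}B")
print(f"Sustainable share 23% -> ca. ${fb_value(0.23):.0f}B")
```

With these inputs, a sustainable share of 23% comes out around the ca. $80B IPO book value, and ca. 10% in the neighborhood of the $40B level discussed above (give or take modelling details).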

Mobile Data Growth … The Perfect Storm? (PART 1)

The Perfect Mobile Data Storm … the Smartphone Challenge and, by that, the Signalling Storm

Mobile Operators hit by the Mobile Data Tsunami … tumbling over mobile networks … leading to

Spectrum Exhaustion

and

Cash Crunch

and

Financial disaster (as cost of providing mobile data exceeds the revenues earned from mobile data).

as Mobile Operators try to cope with hyper-inflationary growth of data usage.

Will LTE be ready in time?

Will LTE be sufficient to remedy the mobile data growth observed over the last couple of years?

The Mobile Industry would have been better off if Data Consumption had stayed “Fixed”? Right! …Right?

At this time my Twitter Colleague Dean Bubley (@Disruptivedean) will be near critical meltdown 😉 …

Dean Bubley (Disruptive Wireless) is deeply skeptical about the rhetoric around the mobile data explosion and tsunamis, as he has recounted in a recent Blog, “Mobile data traffic growth – a thought experiment and forecast”. Dean hints at possible ulterior motives behind the dark, dark picture of the mobile data future painted by the Mobile Industry.

I do not share Dean’s opinion (re: ulterior motives in particular; most of his other thoughts on cellular data growth are pretty OK!). It almost suggests a Grand Mobile Industry Conspiracy in play … giving the Telco Industry a little too much credit … rather than the simple fact that we as an industry (in particular the Marketing side of things) tend to be governed by the short term, being “slaves of anchoring bias” to the most recent information available to us (i.e., rarely more than the last 12 or so months).

Of course Technology Departments in the Mobile Industry use the hyper-growth of Cellular Data to get as much Capex as possible, ensuring sufficient capacity overhead can be bought and built into the Mobile Networks, mitigating the uncertainty and complexity of Cellular data growth.

Cellular Data is by its very nature a lot more difficult to forecast and plan for than the plain old voice service.

The Mobile Industry appears to suffer from Mobile Data Auctusphobia … The Fear of Growth (which is sort of “funny”, as for the first ca. 4 – 5 years of UMTS we were all looking for growth of data, and of course the associated data revenues, that would make our extremely expensive 3G spectrum a somewhat more reasonable investment …).

The Mobile Industry got what it wished for with the emergence of the Smartphone (Thanks Steve!).

Why Data Auctusphobia?

Let’s assume that an operator experienced a Smartphone growth rate of 100+% over the last 12 months. In addition, the operator also observes the total mobile data volume demand growing by 250+% (i.e., not uncommon annual growth rates between 2010 and 2011). It is very tempting (and also likely to be very wrong!) to use the historical growth rate going forward without much consideration for the underlying dynamics of technology uptake, migration and usage per user. Clearly, one would be rather naive NOT to be scared about the consequences of a sustained annual growth rate of 250%! (irrespective of such thinking being flawed).

The problem with this (naive) “forecasting” approach is that anchoring on the past is NOT likely to be a good predictor of longer-term expectations.

THE GROWTH ESSENTIALS – THE TECHNOLOGY ADOPTION.

To understand mobile data growth, we need to look at a minimum of two aspects of Growth:

  1. Growth of users (per segment) using mobile data (i.e., data uptake).
  2. Growth of data usage per user segment (i.e., segmentation is important as averages across a whole customer base can be misleading).

i.e., Growth can be decomposed into the uptake rate of users and the growth of these users’ data consumption: CAGR_Volume = (1 + CAGR_Users) x (1 + CAGR_Usage) – 1.

The segmentation should be chosen with some care, although a split into Postpaid and Prepaid should be a minimum requirement. Further refinements would be to include terminal type & capabilities, terminal OS, usage categories, pricing impacts, etc., and we see that the growth prediction process very rapidly gets fairly complex, involving a large number of uncertain assumptions. Needless to say, Growth should be considered per Access Technology, i.e., split into GPRS/EDGE, 3G/HSPA, LTE/LTE-a and WiFi.
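To illustrate why the segment split matters, here is a small, purely illustrative sketch: blending two segments with very different user and usage growth gives an overall growth rate that neither segment actually experiences. All shares and rates below are made up for the example.

```python
# Illustrative only: blended volume growth from per-segment CAGRs using
# CAGR_Volume = (1 + CAGR_Users) * (1 + CAGR_Usage) - 1 per segment.

segments = {
    # name: (share of today's volume, user CAGR, usage-per-user CAGR)
    "postpaid": (0.90, 0.10, 0.25),
    "prepaid":  (0.10, 0.60, 0.40),
}

next_year = sum(share * (1 + users) * (1 + usage)
                for share, users, usage in segments.values())
print(f"Blended volume growth: {next_year - 1:.0%}")
# postpaid alone grows 37.5%, prepaid alone 124%, blended ca. 46%
```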

Let’s have a look at the (simple) growth of a new technology, or in other words the adoption rate.

The above chart illustrates the most common uptake trend that we observe in mobile networks (and in many other situations of consumer product adoption). The highest growth rates are typically observed in the beginning. Over time the growth rate slows down as saturation is reached. In other words, the source of growth has been exhausted.

At Day ZERO there were ZERO 3G terminals (and thus zero owners).

At Day ONE some users had bought 3G terminals (e.g., Nokia 6630).

Going from Zero to Some 3G terminals amounts to an infinite growth rate … So Wow! … Helpful? … Not really!

Some statistics:

In most countries it has taken on average 5 years to reach a 20% 3G penetration.

The KA moment of 3G uptake really came with the introduction of the iPhone 3G (June 9th, 2008) and HTC/Google G1 (October 2008) smartphones.

Simplified example: in 4 years a Mobile Operator’s 3G uptake went from 2% to 20%, a compounded annual growth rate (CAGR) of at least 78%. Over the same period the average mobile (cellular!) data consumption per user increased by a factor of 15 (e.g., from 20MB to 300MB), which gives us a growth rate of 97% per annum. Thus the total volume today is at least 150 times that of 4 years ago, equivalent to an annual growth rate of 250%!
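A quick check of that arithmetic, using the decomposition above (a small sketch):

```python
def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1.0 / years) - 1.0

users = cagr(0.02, 0.20, 4)   # 3G uptake: 2% -> 20% over 4 years
usage = cagr(20, 300, 4)      # per-user usage: 20MB -> 300MB over 4 years
volume = (1 + users) * (1 + usage) - 1

print(f"users {users:.0%}, usage {usage:.0%}, volume {volume:.0%}")
# -> users 78%, usage 97%, volume 250%
```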

In Geoffrey A. Moore’s book “Crossing the Chasm” (on marketing and selling high-tech products to mainstream customers), the adopter segments are mapped out as (1) Innovators (i.e., first adopters), (2) Early Adopters, (3) Early Majority, (4) Late Majority and (5) The Laggards.

It is fairly common to ignore the Laggards in most analyses, as these do not cause direct problems for new technology adoption. However, in mobile networks Laggards can become a problem if they prevent the operator from re-farming legacy spectrum by refusing to migrate, e.g., preventing GSM 900MHz spectrum from being re-purposed to UMTS, or GSM 1800 from being re-purposed to LTE.

Each of the stages defined by Geoffrey Moore corresponds to a different time period in the life cycle of a given product and, mapped onto the above chart on technology uptake, looks like this:

In the above “Crossing the Chasm” chart I have imposed Moore’s categories on a logistic-like (or S-curve-shaped) cumulative distribution function rather than the bell-shaped (i.e., normal) distribution chosen in his book.

3G adoption has typically taken ca. 5±1 years from launch to reach the stage of Early Majority.

In the mobile industry it’s fairly common for a user to have more than 1 device (i.e., a handset typically combined with a data stick or tablet, as well as a private & work-related device split, etc.). In other words, there are more mobile accounts than mobile users.

In 2011, Western Europe had ca. 550 Million registered mobile accounts (i.e., as measured by active SIM Cards) and a population of a little over 400 Million: thus a mobile penetration of ca. 135%, or 160+% if we only consider the population with a disposable income.

The growth of 3G users (i.e., defined as somebody with a 3G-capable terminal) has been quite incredible, with initial annual growth rates exceeding 100%. Did this growth rate continue? NO it did NOT!

As discussed previously, it is absolutely to be expected to see very high growth rates in the early stages of technology adoption. The starting point is zero or very low, and incremental additions weigh more in the beginning than later on in the adoption process.

The above chart (“CAGR of 3G Customer Uptake vs 3G Penetration”) illustrates the annual 3G uptake growth rate data points, referenced to the year of 10% penetration, for Germany, the Netherlands and the USA (which for the USA includes CDMA2000). It should be noted that 3G penetration levels above 50% are based on Pyramid Research projections.

The initial growth rates are large and then slow down as the 3G penetration increases.

As saturation is reached the growth rate comes almost to a stop.

3G saturation level is expected to be between 70% and 80+% … When LTE takes over!

For most Western European markets the saturation is expected to be reached between 2015 – 2018 and sooner in the USA … LTE takes over!

The (diffusion) process of technology uptake can be described by S-shaped curves (e.g., as shown in “Crossing the Chasm”). The simplest mathematical description is a symmetric logistic function (i.e., Sigmoid) that only depends on time. The top solid (black) curve shows the compounded annual growth rate, referenced to the year of 10% 3G penetration, vs 3G penetration. Between 10% and 15% 3G penetration the annual growth rate is 140%, between 10% and 50% it is “only” 108%, and it drops to 65% at 90% 3G penetration (which might never be reached as users start migrating to LTE).

The lower dashed (black) curve is a generalized logistic function that provides a higher degree of modelling flexibility, accounting for a non-symmetric adoption rate depending on the 3G penetration. No attempt at curve fitting to the data has been applied in the chart above; I find that the generalized logistic function can in general be made to agree well with actual uptake data. Growth here is more modest, with 72% (vs 140% for the simple logistic representation), 57% (vs 108%) and 35% (vs 65%): undershooting in the beginning of the growth process (from 10% -> 20%: Innovators & Early Adopters phase) but representing actual data after 20% 3G penetration (Early and Late Majority).

Finally, I have also included the Gompertz function (also a sigmoid), represented by the light (grey) dashed line in between the simple and generalized logistic functions. The Gompertz function has found many practical applications describing growth. The parameters of the Gompertz function can be chosen so growth near the lower and upper boundaries differs (i.e., asymmetric growth dynamics near the upper and lower asymptotes).
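For the interested reader, here are minimal sketches of the three S-curves discussed above. The parameter choices are purely illustrative and not fitted to any operator data; the generalized logistic is given in its Richards form, one of several equivalent parameterizations.

```python
import math

def logistic(t, t50=5.0, k=1.0):
    """Symmetric logistic (sigmoid): penetration vs time, 50% at t = t50."""
    return 1.0 / (1.0 + math.exp(-k * (t - t50)))

def gen_logistic(t, t50=5.0, k=1.0, nu=0.5):
    """Generalized logistic (Richards form): nu != 1 makes growth asymmetric."""
    return 1.0 / (1.0 + nu * math.exp(-k * (t - t50))) ** (1.0 / nu)

def gompertz(t, t_infl=5.0, k=1.0):
    """Gompertz: different dynamics near the lower and upper asymptotes."""
    return math.exp(-math.exp(-k * (t - t_infl)))

# Print the three curves side by side over illustrative "years since launch".
for t in range(0, 11, 2):
    print(t, round(logistic(t), 2), round(gen_logistic(t), 2), round(gompertz(t), 2))
```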

As most mature 3G markets have passed 50% 3G penetration (i.e., eating into the Late Majority) and are approaching saturation, one should expect annual 3G uptake growth rates to reduce rapidly. The introduction of LTE will also have a substantial impact on 3G uptake and growth.

Of course the above is a simplification of the many factors that should be considered. It is important that you:

  1. Differentiate between Prepaid & Postpaid.
  2. Consider segmentation (e.g., Innovator, First Adopter, Early Majority & Late Majority).
  3. Ensure projections are self-consistent with market dynamics: i.e., Gross Adds, Churn, hand-down and upgrade dynamics within the Base, etc.

THE GROWTH ESSENTIALS – THE CELLULAR USAGE.

In the following I will focus on Cellular (or Mobile) data consumption. Thus any WiFi consumption on public, corporate or residential access points is deliberately not considered. Obviously, in cellular data demand forecasting, WiFi usage can be important as a potential source of cellular consumption via on-loading, in particular as new and better-performing cellular technologies are introduced (i.e., LTE / LTE-advanced). Price plan policy changes might also result in higher on-load of the cellular network (at least if that network is relatively unloaded, with lots of spare capacity).

It should come as no surprise that today the majority of mobile data consumers are Postpaid.

Thus, most of the average data usage figures being reported are based on the Postpaid segment. This also implies that projecting future usage based on past and current usage could easily overshoot, particularly if Prepaid consumption were substantially lower than Postpaid data consumption. The interesting, and maybe somewhat surprising, observation is that active Prepaid mobile data consumers can have a fairly high data consumption (obviously depending on price plan policy). In the example shown below, for a Western European Operator with a ca. 50%:50% Postpaid – Prepaid mix, the Postpaid active mobile data consumers are 85% of the total Postpaid base; the mobile-data-active Prepaid base is only 15% (though growing fast).

The illustrated data set, which is fairly representative of an aggressive smartphone operation, has an average data consumption of ca. 100MB (based on the whole customer base) and an active average consumption of ca. 350MB, though fairly big variations in consumption are observed within various segments of the customer base.

The first 4 Postpaid price plans are Smartphone-based (i.e., iOS and Android) and comprise 80% of all active devices on the Network. “Other Postpaid” comprises Basic Phones, Symbian and RIM devices. The active Prepaid device consumption is primarily Android-based.

We observe the following:

  1. The Unlimited price plan results in the highest average volumetric usage (the “Unlimited Postpaid” & “Postpaid 1” price plans are comparable in device composition; the difference is that one is unlimited, the other is not).
  2. Unlimited average consumption is dominated by a long tail towards extreme usage (see chart below).
  3. Smartphone-centric postpaid price plans tend to have a very high utilization percentage (90+%).
  4. Active Prepaid data consumption (200MB) is almost as high as that of less aggressive smartphone price plans (210MB) (this is however greatly dependent on prepaid price policy).

The above chart, “Cellular Data Consumption Distribution”, illustrates the complexity of technology and cellular data consumption, even within different price plan policies. Most of the distributions consist of up to 4 sub-segments of usage profiles, most notably a higher-consumption segment and a non-/very-low-consumption segment.

There are several observations worth mentioning:

  • Still a largely untapped Prepaid potential (for new revenue as well as additional usage).
  • 15% of Postpaid consumers are data inactive (i.e., Data Laggards).
  • 40% of active Postpaid base consumes less than 100MB or less than 1/4 of the average high-end Smartphone usage.

Clearly, the best approach to come to a meaningful projection of cellular data usage (per consumer) would be to consider all the above factors in the estimate.

However, there is a problem!

The Past Trends may not be a good basis for predicting Future Trends!

Using The Past we might risk largely ignoring:

  1. Technology Improvements that would increase cellular data consumption.
  2. New Services that would boost cellular data usage per consumer.
  3. New Terminal types that would lead to another leapfrog in cellular data consumption.
  4. Cellular Network Congestion leading to reduced growth of data consumption (i.e., reduced available speed per consumer, QoS degradation, etc..).
  5. Policy changes such as Cap or allowing Unlimited usage.

Improvements in terminal equipment performance (i.e., higher air interface speed capabilities, more memory, better CPU performance, larger / better displays, …) should be factored into the cellular data consumption, as the following chart illustrates (for more details see also Dr. Kim’s Slideshare presentation on “Right Pricing Mobile Broadband: Examining The Business Case for Mobile Broadband”).

I like to think of every segment category as having its own particular average data consumption. A very simple consideration (supported by real data measurements) would be to expect to find the extreme (or very high) data usage in the Innovator and Early Adopter segments; as more of the Majority (Early as well as Late) is considered, the data usage reduces. Eventually, in the Laggard segment, hardly any data usage is observed.

It should be clear that the above average usage-distribution profile is dynamic. As time goes by, the distribution spreads out towards higher usage (i.e., per-user, per-segment inflationary consumption), while at the same time increasingly more of the customer base reflects the majority segments (i.e., Early and Late Majority) of a given operator’s customer base.

Thus over time it would be reasonable to expect that:

The average volumetric consumption could develop to an average that is lower than when Innovators & Early Adopters dominated.

Well maybe!? Maybe not?!

The usage dynamics within a given price plan are non-trivial (to say the least), and we see in general a tendency towards the higher-usage sub-segments (i.e., within a given capped price plan). The following chart is a good example of the data consumption within the same capped Smartphone price plan over a 12-month period. The total number of consumers in this particular example has increased 2.5 times over the period.

It is clear from the above chart that over the 12-month period the higher-usage sub-segment has become increasingly popular. Irrespective of this, the overall average (including non-active users of this Smartphone price plan) has not increased over the period.

Though by no means does this need to be true for all price plans. The following chart illustrates the dynamics over a 12-month period of an older Unlimited Smartphone price plan:

Here we actually observe a 38% increase in the average volumetric consumption per customer. Over the period, ca. 50% of the customers in this price plan have dropped out, leaving primarily heavy users enjoying the benefits of unlimited consumption.

There is little doubt that most mature developed markets with a long history of 3G/HSPA will have reached a 3G uptake level that includes most of the Late Majority segment.

However, for the prepaid segment it is also fair to say that most mobile operators are likely only starting to approach and appeal to Innovators and Early Adopters. The chart below illustrates prepaid cellular consumption behavior over the last 12 months.

In this particular example ca. 90% of the Prepaid customer base are not active cellular data consumers (not an unusual figure), and even over the period this number has not changed substantially. The active Prepaid users consume on average 40% more cellular data than 12 months ago. There is a strong indication that the prepaid consumption dynamics resemble those of Postpaid.

Data Consumption is a lot more complex than the Technology Adoption of the Cellular Customer.

At a high level, the data consumption dynamics are as follows:

  1. Late (and in some cases Early) Majority segments commence consuming cellular data (this will drag down the overall average).
  2. Fewer non-active cellular data consumers (besides Laggards) -> an upward pull on the average consumption.
  3. (In particular) Innovator & Early Adopter consumption increases within the limits of a given price plan (this will tend to pull up the average).
  4. General migration upwards to higher sub-segmented usage (pulling the overall average upwards).
  5. If Capped pricing is implemented (without any Unlimited price plans in effect), growth will slow down as consumers approach the cap.

We have also seen that it is sort of foolish to discuss a single data usage figure and try to create all kinds of speculative stories about such a number.

BRINGING IT ALL TOGETHER.

So what’s all this worth unless one can predict some (uncertain) growth rates!

WESTERN EUROPE (AT, BE, DK, FIN, F, DE, GR, IRL, IT, NL, N, P, ESP, SE, CH, UK)

3G uptake in WEU was ca. 60% in 2011 (i.e., ca. 334 Million 3G devices). This corresponds to ca. 90% of all Postpaid customers and 32% of all Prepaid users having a 3G device. Of course it does not mean that all of these are active cellular data users. Actually, today (June 2012) ca. 35% of the postpaid 3G users can be regarded as non-active cellular data users, and for prepaid this number may be as high as 90%.

For Western Europe, I do not see many more 3G additions in the Postpaid segment. It will be more about replacement and natural upgrades to more capable devices (i.e., higher air interface speed, better CPU, memory, display, etc.). We will see an increasing migration from 3G Postpaid towards LTE Postpaid. This migration will really pick up between 2015 and 2020 (Western Europe lagging behind in LTE adoption in comparison with, for example, the USA and some of the Asia-Pacific countries). In principle this could also mean that the growth of 3G Postpaid cellular data consumption could rapidly decline (towards 2020), and we would start seeing overall 3G Postpaid data traffic decline rather than increase.

Additional cellular data growth may come from the Prepaid segment. However, a very large proportion of this segment is still data-inactive in Western Europe. There are signs that, depending on the operator’s prepaid price plan policy, prepaid consumption is fairly similar to Postpaid on a per-user basis.

3G Growth Projections for Western Europe (reference year 2011):

The above assumes that usage caps will remain; I have assumed these to be 2GB (on average for WEU). Further, it is assumed that the Prepaid segment will remain largely dominated by Laggards (i.e., inactive cellular data users) and that the active Prepaid cellular data users have a consumption similar to Postpaid.

Overall 3G cellular data growth for Western Europe comes to between 3x and no more than 4x (the latter for very aggressive prepaid cellular data uptake & growth) over the period 2011 to 2016.
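For reference, the implied annual growth rates behind that 3x – 4x multiple (a quick sketch):

```python
def implied_cagr(multiple, years):
    """Annual growth rate implied by a total growth multiple over `years`."""
    return multiple ** (1.0 / years) - 1.0

for m in (3, 4):
    print(f"{m}x over 2011-2016 -> ca. {implied_cagr(m, 5):.0%} per annum")
# -> 3x: ca. 25% p.a.; 4x: ca. 32% p.a.
```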

Postpaid 3G cellular data growth will flatten and possibly decline towards 2020.

More aggressive LTE Smartphone uptake (though this appears unlikely on average across Western Europe) could further relieve 3G growth pains between 2015 and 2020.

Innovators & Early Adopters, who demand the most of the 3G cellular networks, should be expected to move quickly to LTE (as coverage is provided), off-loading the 3G networks over-proportionally.

The 3G cellular growth projections are an average consideration for Western Europe, where most of the postpaid 3G growth has already happened, with an average overall 3G penetration of 60%. As a rule of thumb: the lower the 3G penetration, the higher the CAGR (as measured from a given earlier reference point).

In order to be really meaningful and directly usable to a Mobile Operator, the above approach should be carried out for a given country and a given operator’s conditions.

The above growth rates are lower than, but within range of, what my Twitter Colleague Dean Bubley (@Disruptivedean) states as his expectations for Developed Markets in his Blog “Mobile data traffic growth – a thought experiment and forecast”. Not that this makes them more correct or more wrong! Though anyone who spends a little time on the growth fundamentals of existing Western European mobile data markets would not find this kind of growth rate surprising.

So what about LTE growth? … well, given that we today (in Western Europe) have a very, very small installed base of LTE devices on our networks … the growth or uptake (seen on its own) is obviously going to be very HIGH for the first 5 to 7 years (depending on go-to-market strategies).

What will be particularly interesting with the launch of LTE is whether we will see an on-loading effect on the cellular LTE network from today’s WiFi usage. Thomas Wehmeier (Principal Analyst, Telco Strategy, Informa @Twehmeier) has published two very interesting and study-worthy reports on Cellular and WiFi Smartphone usage (see “Understanding today’s smartphone user: Demystifying data usage trends on cellular & Wi-Fi networks” from Q1 2012, as well as Thomas’s sequel from a couple of weeks ago, “Understanding today’s smartphone user: Part 2: An expanded view by data plan size, OS, device type and LTE”).

THE CLIFFHANGER

Given the dramatic beginning of my Blog concerning the future of the Mobile Industry and Cellular data … and to be fair to many of the valid objections that Dean Bubley has raised in his own Blog and in his Tweets … I do owe the reader who got through this story some answers …

I have no doubt (actually I know) that there are mobile operators (around the world) that already today are in dire straits with their spectral resources due to very aggressive data growth triggered by the Smartphone. Even if growth has slowed down as their 3G customers (i.e., the Postpaid segment) have reached the Late Majority (and are possibly fighting Laggards), that lower growth rate still poses substantial challenges in providing sufficient capacity and, not to forget, quality.

Yes … 3G/HSPA+ Small Cells (and DAS-like solutions) will help mitigate the growing pains of mobile operators. Yes … WiFi off-load too. Yes … LTE & LTE-advanced will help as well, though the last solution will not be much of a help before a critical mass of LTE terminals has been reached (i.e., ca. 20% = Innovators + Early Adopters).

Often forgotten: traffic management and policy remedies (not per se Fair Use Policy, though!) are of critical importance too in the toolset for managing cellular data traffic.

Operators in emerging markets, and in markets with a relatively low 3G penetration, had better learn the Growth Lessons from AT&T and other similar Front Runners in the Cellular Data and Smartphone Game.

  1. Unless you manage cellular data growth from the very early days, you are asking for (inexcusable) growth problems.
  2. Being Big in terms of customers is not per se a blessing if you don’t have proportionally the spectrum to support that Base.
  3. Don’t expect to keep the same quality level throughout your 3G cellular data life-cycle!
  4. Accept that the spectral overhead per customer will obviously dwindle as increasingly more customers migrate to 3G/HSPA+.
  5. Technology Laggards should be considered carefully, as they pose an enormous risk to spectral re-farming and migration to more data-efficient technologies.
  6. Short term (3 – 5 years) … LTE will not mitigate 3G growing pains (you have a problem today; it’s going to get tougher, and then some, tomorrow).

Is Doom knocking on Telecom’s Door? … Not very likely (or at least we don’t need to open the door if we are smart about it) … Though if an Operator doesn’t learn fast and isn’t furiously passionate about economical operation and pricing policies … things might look a lot gloomier than they need to be.

STAY TUNED FOR A PART 2 … taking up the last part in more detail.

ACKNOWLEDGEMENT

To great friends and colleagues who have challenged, suggested, discussed, screamed and shouted (in general, shared the passion) about this incredibly important topic of Cellular Data Growth for our Mobile Industry (and increasingly Fixed Broadband). I am in particular indebted to Dejan Radosavljevik for bearing with my sometimes crazy data requests (at odd hours and moments) and, last but not least, for thinking along with me on what mobile data (cellular & WiFi) really means (though we have both come to the conclusion that being mobile is not what it means. But that is a different interesting story for another time).

Social Media Valuation … a walk on the wild side.

Lately I have wondered about Social Media companies and their financial valuations. Is it hot air in a balloon that can blow up any day? Or are the hundreds of millions and billions of US Dollars tied to Social Media valuations reasonable and sustainable in the longer run? The last question is particularly important, as more than 70% of the value in Social Media is 5 or more years out in the future. Social Media startup companies, without any turnover, are regularly being bought for, or able to raise money at a value in, the hundreds-of-millions US dollar range. Lately, Instagram was bought by Facebook for 1 Billion US Dollars. Facebook itself was valued at $100B at its IPO. Now, several months after the initial public offering, Facebook may have lost as much as 50% of the originally claimed IPO value.

Facebook, since its IPO, has lost ca. 500 Million US Dollars of value per day (as of 30-July-2012).

What is the valuation make-up of Social Media? And, more interestingly, what are the conditions that need to be met to justify $100B or $50B for Facebook, $8B for Twitter, $3B (as of 30-July-2012; $5B prior to Q2 financials), or $1B for Instagram, a 2-year-old company with a cool mobile phone Photo App? Are the Social Media business models real, or based on an almost religious belief that someday in the future they will return on the investment, justifying the amount of money pumped into them?

My curiosity and analytical “hackaton” got sparked by the following Tweet:

Indeed! What could possibly justify paying 1 Billion US Dollars for Instagram, which agreeably has a very cool FREE Smartphone Photo App (far better than Facebook’s own), BUT without any income?

  • Instagram, initially an iOS App, claims 50 Million mobile users (ca. 5 Million unique visitors and 31 Million page-views as of July 2012). 5+M photos are uploaded daily, with a total of 1+ Billion photos uploaded. No reported revenues to date. Prior to being bought by Facebook for $1 Billion, Instagram was reportedly preparing a new funding round at a valuation of 500 Million US$.
  • Facebook has 900M users, 526M (58%) active daily, and 500M mobile users (May 2012). 250M photos are uploaded daily, with a total of 150 Billion photos. Facebook generated ca. $3.7B in revenue in 2011, and its current market cap is ca. $61B (24 July 2012). 85% of FB revenue in 2011 came from advertisement.

The transaction gives a whole new meaning to “A picture is worth a Billion words”  … and Instagram is ALL about PICTURES & SOCIAL interactions!

Instagram is a (really cool & simple) mobile & smartphone-optimized App, something that would be difficult to say about FB’s mobile environment (in particular when it comes to the photo experience).

One thing is of course clear: if FB is willing to lay down $1B for Instagram, their valuation of it should be a good deal higher than $1B (i.e., ca. $4+B?). It will be very interesting to see how FB plans to monetize Instagram, though the acquisition might be seen as a longer-outlook protective move to secure Facebook’s share of the Mobile Market, which for Social Media will become much more important than traditional desktop access.

So how can we get a reality check on a given valuation?

Let’s first look at the main Business Models of today (i.e., how the money is, or will be, made):

  1. Capture advertising spend – typically online advertisement spend (total of $94B in 2012 out of an expected total Media Ad spend of $530B). With uptake of tablets traditional “printed media” advertising spend might be up for grabs as well (i.e., getting a higher share of the total Media Ad spend).
  2. Virtual Goods & credits (e.g., Zynga’s games and FB’s revenue share model) – The Virtual Economy has been projected to be ca. $3B in 2012 (cumulative annual growth rate of 35% from 2010).
  3. Paid subscriptions (e.g., LinkedIn’s Premium Accounts: Business Plus, Job Seeker, etc., or Spotify Premium, etc.).
  4. B2B Services (e.g., LinkedIn’s Hiring Solutions).

Online Advertisement Spend is currently the single biggest source of revenue for the Social Media business model. For example, Google (which is more internet search than Social Media) takes almost 50% of the total available online advertisement spend, and it accounts for more than 95% of Google’s revenues. In contrast, Facebook in 2011 captured only ca. 4+% of Online Ad Spend, which accounted for ca. 85% of FB’s total revenue. eMarketer (see http://www.emarketer.com/PressRelease.aspx?R=1008479) has projected that the total online advertisement spend by 2015 could be in the order of $132B (a +65% increase compared to 2011). The USA and Western Europe are expected to account for 67% of the $132B by 2015.

Virtual Goods are expected to turn over ca. $3B in 2012. The revenue potential from Social Networks and Mobile has been projected (see Lazard Capital’s Atul Bagga ppt on “Emerging Trends in Games-as-a-Service”) to be ca. $10B worldwide by 2015. If (and that is a very big if) the trend were to continue, the 2020 potential would be in the order of $60B (though I would consider this a maximum and very optimistic upside potential).

So how can a pedestrian get an idea about Social Media valuation? How can one get a reality check on these Billionaires being created en masse at the moment in the Social Media sphere?

“Just for fun” (and before I get really “serious”) I decided to see whether there is any correlation between a given valuation and the number of Unique Visitors (per month) and Pageviews (per month) … my possibly oversimplified logic being that if the main part of the Social Media business model is to get a share of the Online Advertisement Spend, there needs to be some sort of dependency on those (obviously what’s really important is the clickthrough rate, but let’s forget this for a moment or two):

The two charts (log-log scaled) show Valuation (in Billion US$) versus Unique Visitors (in Millions) and Pageviews (in Billions). While the correlations are not perfect, they are really not that crazy either. I should stress that the correlations are power-law correlations, NOT LINEAR, i.e., Valuation increases with a power of the number of unique and active users/visitors.

An interesting outlier is Pinterest. Let’s just agree that this does not per se mean that Pinterest’s valuation at $1.5B is too low! … it could also imply that the rest are somewhat on the high side! 😉

Note: Unique Visitors and Pageview statistics can be taken from Google’s DoubleClick Ad Planner. It is a wonderful source of domain attractiveness, usage and user information.

Companies considered in the charts: Google, Facebook, Yahoo, LinkedIn, Twitter, Groupon, Zynga, AOL, Pinterest, Instagram (@ $1B), Evernote, Tumblr, Foursquare, Baidu.
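For the curious, the fits behind the charts are of the simple form log10(Valuation) = a · log10(X) + b. The sketch below shows the mechanics with numpy; the data points are placeholders of mine, NOT the actual figures behind the charts.

```python
import numpy as np

# Hypothetical (visitors in Millions, valuation in $B) pairs, for illustration.
visitors = np.array([50.0, 150.0, 700.0, 1000.0])
valuation = np.array([1.5, 8.0, 60.0, 200.0])

# Fit log10(valuation) = a * log10(visitors) + b, i.e., a power law.
a, b = np.polyfit(np.log10(visitors), np.log10(valuation), 1)
print(f"Valuation ~ {10 ** b:.3f} * Visitors^{a:.2f}  ($B vs Millions)")
```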

That’s all fine … but we can (and should) do better than that!

eMarketer.com has given us an Online Advertisement Spend forecast (at least until 2015). In 2011, online advertisement amounted to 95% of Google’s revenue and at least 85% of Facebook’s. So we are pretty close to having an idea of the topline (or revenue) potential going forward. In addition, we also need to understand how that revenue translates into Free Cash Flow (FCF), which will be the basis for my simple valuation analysis. To get to a Free Cash Flow picture we could develop a detailed P&L model for the company of interest. Certainly an interesting exercise, but it would require “millions” of educated guesses and assumptions for a business that we don’t really know.

Modelling a company’s P&L is not really a peaceful walk for our interested pedestrian to take.

A little research using Google Finance, Yahoo Finance or, for example, Ycharts.com (nope! I am not being sponsored;-) will in general reveal a typical cash yield (i.e., the ratio of FCF to Revenue) for a given type of company in a given business cycle.

Examples of FCF performance relative to Revenues: Google has had an average FCF yield of 30% over the last 4 years; Yahoo’s 4-year average was 12% (between 2003 and 2007 Google and Yahoo had fairly similar yields). Facebook has been increasing its yield steadily from 2009 (ca. 16%) to 2011 (ca. 25%), while Zynga had 45% in 2010 and then down to 13% in 2011.

So, having an impression of the revenue potential (i.e., from eMarketer) and an idea of best-practice free cash flow yield, we can start getting an idea of the Value of a given company. It should of course be clear that we can also turn this simple analysis around and ask what the Revenue & Yield should be in order to justify a given valuation. This would give a reality check on a given valuation, as the Revenue should be in reasonable relation to market and business expectations.

Let’s start with Google (for the moment totally ignoring Motorola;-):

Nothing fancy! I am basically assuming Google can keep its share of Online Advertising Spend (as taken from eMarketer) and that Google can keep its FCF yield at a 30% level. The discount rate (or WACC) of 9% currently seems to be a fair benchmark (http://www.wikiwealth.com/wacc-analysis:goog). I am trying to be conservative and assume a 0% future growth rate (changing this will in general have a high impact on the Terminal Value). If all this comes true, Google’s value would be around 190 Billion US Dollars. Today (26 July 2012) Google Finance tells me that their Market Capitalization is $198B (see http://www.google.com/finance?q=NASDAQ:GOOG), which is 3% higher than the very simple model above.

How does the valuation picture look for Facebook (pre-Zynga results as of yesterday, 25 July 2012)?

First thought is HALLELUJAH … Facebook is really worth 100 Billion US Dollars! … ca. $46.7 per share… JAIN (as they would say in Germany) … meaning YESNO!

  • Only if Facebook can grow from capturing ca. 6% of the Online Advertisement Spend today to 20% in the next 5 – 6 years.
  • Only if Facebook can improve their Free Cash Flow Yield from today’s ca. 25% to 30%.
  • Only if Facebook’s other revenues (i.e., from Virtual Goods, Zynga, etc.) can grow to be 20% of its business.

What could possible go wrong?

  • Facebook fatigue … users leaving FB for something else (let’s be honest! FB has become a very complex user interface and “sort of sucks” on the mobile platforms. I guess that is one reason for the Instagram acquisition).
  • Disruptive competitors/trends (which FB cannot keep buying up before they get serious) … just a matter of time. I expect this to happen first in the Mobile segment and then spread to desktop/laptop.
  • Non-advertisement revenues (e.g., from Virtual Goods, Zynga, etc.) disappoint.
  • Increasing investments needed in infrastructure to support customer and usage growth (i.e., a negative impact on cash yields).
  • The Social Media business being much more volatile than the current hype would allow us to assume.

So how would a possible more realistic case look like for Facebook?

Here I assume that Facebook will grow to take 15% (versus 20% above) of the Online Ad Spend, and that Facebook can keep a 25% FCF yield (versus growing to 30% in the above model). The contribution from Other Revenues has been brought down to a level more in line with the Virtual Goods and Social Media Gaming expectations (see for example Atul Bagga, Lazard Capital Markets, analysis http://twvideo01.ubm-).

The more conservative assumptions (though with 32% annual revenue growth hardly a very dark outlook) result in a valuation of $56 Billion (i.e., a share price of ca. $26), a little more than half the previous (much) more optimistic outlook for Facebook. Not bad at all, of course … but maybe not what you want to see if you paid a premium for the Facebook share? Facebook’s current market capitalization (26 July 2012, 18:43 CET) is ca. $60B (i.e., $28/share).

So what is Facebook’s value? $100B (maybe not), $50+B, or around $60+B? Well, it all depends on how shareholders believe Facebook’s business will evolve over the next 5 – 10 (and beyond) years. If you are in for the long run, it would be better to be conservative and keep the lower valuation in mind rather than the $100B upside.

Very few of us actually sit down and do a little estimation ourselves (we follow others; in a certain sense we are financial lemmings). With a little bit of Google Search (yes, there is a reason why they are so valuable;-) and a couple of lines of Excel (or pen and paper), it is possible to get an educated idea about a certain valuation range and see whether the price you paid was fair or not.

Let’s just make a little detour!

Compare Facebook’s current market capitalization of ca. $60B (@ 26 July 2012, 18:43 CET) at $3.7B revenue (2011) and ca. $1B of free cash flow (2011). Clearly all the value is in anticipation of future business! Compare this with Deutsche Telekom AG, with a market capitalization of ca. $50B at $59B revenue (2011, down 6% vs 2010) and ca. $7.8B of free cash flow (2011). It is fascinating that a business with a well-defined business model, paying customers, healthy revenue (16x FB) and cash flow (8x FB) can be worth a lot less than a company that relies solely on the anticipation of a great future. The Facebook / Social Media business model’s future appears a lot more optimistic (the blissful unknown) than the traditional Telco business model (the “known” unknown). Social Media by 2015 is a game of maybe a couple of hundred Billions (mainly from advertisement, app sales and the virtual economy) versus Mobile Telecom (ignoring the fixed side), a Trillion+ (1,000 x Billion) business.

Getting back to Social Media and Instagram!

So coming back to Instagram … is it worth paying $1B for?

Let’s remind ourselves that Instagram is a Mobile Social Media photo-sharing platform (or application) serving Apple iOS (originally exclusively so) and Android. Instagram has ca. 50+M registered users (by Q1 2012), with 5+M photos uploaded per day and a total of 1+B photos uploaded. Instagram is a thoroughly optimized smartphone application. There are currently more than 460 photo apps, with 60Photos being second to Instagram in monthly usage (http://www.socialbakers.com/facebook-applications/category/70-photo).

Anyway, to get an idea about Instagram’s valuation potential, it would appear reasonable to assume that their Business Model would target the Mobile Advertisement Spend (which is a sub-set of Online Ad Spend). To get somewhere with our simple valuation framework I assume:

  1. Instagram can capture up to 10% of the Mobile Ad Spend by 2015 – 2016 (a possible Facebook boost effect, better payment deals, keeping ad revenue with Facebook).
  2. Instagram’s revenue share dynamics are similar to Facebook’s initial revenue growth from Online Ad Spend.
  3. Instagram can manage an FCF yield of 15% over the period analysed (there could be substantial synergies with Facebook’s capital expenditures).

In principle the answer to the question above is YES, paying $1B for Instagram would be worth it, as we get almost $5B from our small and simple valuation exercise … if one believes:

  1. Instagram can capture 10% of the Mobile Advertisement Spend (over the next 5 – 6 years).
  2. Instagram can manage a Free Cash Flow Yield of at least 15% by Year 6.

Interestingly, looking only at the next 5 years would indicate a value in the order of $500M. This is close to the rumored funding round that was in preparation before Facebook laid down $1B. However, and not surprisingly, most of the value for Instagram comes from beyond the 5 years: the Terminal Value amounts to 90% of the Enterprise Value.

For Facebook to break even on its investment, Instagram would need to capture no more than 3% of the Mobile Ad Spend over the 5-year period (assuming that the FCF yield remains at 10% and does not improve with scale).
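The break-even logic can be sketched as a simple search for the sustainable share at which the discounted cash flows equal the $1B price tag. The mobile ad-spend path below is a placeholder of mine (the text does not quote one), so the resulting share will differ from the ca. 3% above; the mechanics are the point.

```python
# Bisect on the sustainable mobile ad-spend share until the discounted FCF
# (including a 0%-growth terminal value) equals the acquisition price.

WACC, FCF_YIELD, PRICE = 0.09, 0.10, 1.0   # $B; 10% yield per the text
MOBILE_AD_SPEND = [4, 7, 11, 16, 21]       # $B, years 1..5 (hypothetical path)

def npv(share):
    """Discounted FCF over 5 years plus terminal value at a given share."""
    total = 0.0
    for t, spend in enumerate(MOBILE_AD_SPEND, start=1):
        fcf = spend * share * (t / len(MOBILE_AD_SPEND)) * FCF_YIELD  # linear ramp
        total += fcf / (1 + WACC) ** t
    # terminal value on the last year's FCF, 0% perpetual growth
    return total + (fcf / WACC) / (1 + WACC) ** len(MOBILE_AD_SPEND)

lo, hi = 0.0, 1.0
while hi - lo > 1e-6:                      # simple bisection on the share
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if npv(mid) < PRICE else (lo, mid)
print(f"Break-even sustainable share: ca. {lo:.1%}")
```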

Irrespective:

Most of the Value of Social Media is in the Expectations of the Future.

70+% of Social Media Valuation relies on the Business Model remaining valid beyond the first 5 years.

With this in mind, and knowing that the next 5 years will see a massive move from desktop-dominated to mobile-dominated Social Media, we should be somewhat nervous about desktop-originated Social Media businesses and whether they can and will make the transformation.

The question we should ask is:

Tomorrow, will today’s dot-socials be yesterday’s busted dot-coms?

PS

For the pedestrian who wants to get deeper into the mud of valuation methodologies, I can really recommend “Valuation: Measuring & Managing the Value of Companies” by Tim Koller, Marc Goedhart & David Wessels (http://www.amazon.com/Valuation-Measuring-Managing-Companies-Edition/dp/0470424656). Further, there are some really cool modelling exercises to be done on the advertisement spend projections and the drivers behind them, as well as on a deeper understanding (i.e., modelling) of the capital requirements and structure of Social Media business models.

In case of interest in the simple models used here and the various sources … don’t be a stranger … get in touch!

PSPS (as of 28-July-2012) – A note on Estimated Facebook Market Capitalization

In the above Facebook valuation commentary I have used the information from Google Finance (http://www.google.com/finance?q=facebook) and Yahoo Finance (http://finance.yahoo.com/q?s=FB), both basing their Market Capitalization estimates on 2.14B shares. MarketWatch (http://www.marketwatch.com/investing/stock/fb) appears to use 2.75B shares (i.e., 29% higher than Google & Yahoo). Obviously, MarketWatch’s market capitalization estimates are thus higher than what Google & Yahoo would estimate.

Mobile Data Consumption, the Average Truth? the Average Lie?

“Figures often beguile me” leading to the statement that “There are three kinds of lies: lies, damned lies, and statistics.” (Mark Twain, 1906).

We are so used to averages … Read any blog or newspaper article trying to capture a complex issue, and it’s more than likely that you are being told a story of averages … Adding to Mark Twain’s quote on lies: in our data-intense world, “The Average is often enough the road to an un-intentional Lie” … or just about “The Average Lie”.

Imagine this! Having (at the same time) your feet in the oven at 80C and your head in the freezer at -6C … You would be perfectly OK! On average! As your average temperature would equal 80C + (-6C) divided by 2, which is 37C, i.e., the normal and recommended body temperature for an adult human being. However, both your feet and your head are likely to suffer from such an experiment (which therefore really should not be tried out … or be left to Finns used to Sauna and icy water … though even the Finns seldom enjoy these simultaneously).

Try this! Add together the ages of the members of your household and divide by the number of members. This gives you the average age of your household … does the average age you calculated have any meaning? If you have young children or grandparents living with you, I think there is a fairly high chance that the answer to that question is NO! The average age of my family’s household is 28 years. However, this number is a meaningless average representation of my household: it is 20 times higher than my son’s age and about 40% lower than my own age.

Most numbers, most conclusions, most stories, most (average) analyses are based on an average representation of one or another Reality … and as such can easily lead to Reality Distortion.

When we are presented with averages (or mean values as it is also called in statistics), we tend to substitute Average with Normal and believe that the story represents most of us (i.e., statistically this means about 68% of us all). More often than not we sit back with the funny feeling that if what we just read is “normal” then maybe we are not.

On mobile data consumption (I’ll come back to Smartphone data consumption a bit later) … There is one (non-average) truth about mobile data consumption that has widely (and correctly) been communicated …

Very few mobile customers (10%) consume the very most of the mobile data traffic (90%).

(see for example: http://www.nytimes.com/2012/01/06/technology/top-1-of-mobile-users-use-half-of-worlds-wireless-bandwidth.html/).

Let’s just assume that a mobile operator claims an average monthly consumption of 200MB (source: http://gigaom.com/broadband/despite-critics-cisco-stands-by-its-data-deluge/), and let’s assume that 10% of the customer base generates 90% of the traffic. It follows that the high-usage segment has an average volumetric usage of 1,800MB and the low-usage segment an average volumetric usage of only 22MB. In other words, 10% of the customer base has 80+ times higher consumption than the remaining 90%. The communicated average consumption (taken across the whole customer base) of 200MB is actually 9 times higher than the average consumption of 90% of the customer base. It follows (with some use case exceptions) that the 10% high-usage segment consumes a lot more Network Resources and Time: the time the high-usage segment spends actively with their devices is likely to be a lot higher than for the 90% low-usage segment.

The 200MB is hardly normal! It is one of many averages that can be calculated. Obviously 200MB is a lot more “sexy” than stating that 90% of the customer base typically consumes 22MB.
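The segment arithmetic is easy to verify (a quick sketch):

```python
# An overall 200MB average, with 10% of users carrying 90% of the traffic.
AVG_MB, HIGH_USERS, HIGH_TRAFFIC = 200.0, 0.10, 0.90

high_avg = AVG_MB * HIGH_TRAFFIC / HIGH_USERS             # 1,800 MB
low_avg = AVG_MB * (1 - HIGH_TRAFFIC) / (1 - HIGH_USERS)  # ca. 22 MB

print(f"High-usage 10%: {high_avg:.0f} MB; remaining 90%: {low_avg:.0f} MB")
print(f"Ratio: {high_avg / low_avg:.0f}x; the headline average is "
      f"{AVG_MB / low_avg:.0f}x the 90%-segment average")
```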

Created using PiktoChart http://app.piktochart.com.

Do Care about Measurement and Data Processing!

What further complicates the consumption figures being quoted is how the underlying data has been measured, processed and calculated!

  1. Is the averaging done over the whole customer base?
  2. Is the averaging done over active customers?
  3. Or over a subset of active customers (i.e., 2G vs 3G, 3G vs HSPA+ vs LTE vs WiFi, smartphone vs basic phone, iPad vs iPhone vs Laptop, prepaid vs postpaid, etc.)?
  4. Or over a smaller subset based on particular sample criteria (i.e., iOS, Android, iPad, iPhone, Galaxy, price plan, etc.) or availability (mobile Apps installed, customer approval, etc.)?

Without knowing the basis of a given average number any bright analysis or cool conclusion might be little more than Conjecture or Clever Spin.

On Smartphone Usage

One of the most recently publicized studies on Smartphone usage comes from O2/Telefonica UK (Source: http://mediacentre.o2.co.uk/Press-Releases/Making-calls-has-become-fifth-most-frequent-use-for-a-Smartphone-for-newly-networked-generation-of-users-390.aspx). The O2 data provides an overview of average daily Smartphone usage across 10 use case categories.

O2’s Smartphone statistics have been broken down in detail by one of our industry’s brightest, Tomi Ahonen (a must-read: http://www.communities-dominate.blogs.com/, though it is drowning in his Nokia/Mr. Elop “Howler Letters”). Tomi points out the Smartphone’s disruptive replacement potential for many legacy consumer products (e.g., think: watch, alarm clock, camera, etc.).

The O2 Smartphone data is intuitive and exactly what one would expect! Boring, really! Possibly with the exception of Tomi’s storytelling (see above reference)! The data was so boring that The Telegraph (source: http://www.telegraph.co.uk/technology/mobile-phones/9365085/Smartphones-hardly-used-for-calls.html) had to conclude that “Smartphones [are] Hardly Used for Calls”. Relative to other uses, of course, not really an untruth.

Though The Telegraph did miss (or did not care about) the fact that both Calls and SMS appeared to be what one would expect (and why would a Smartphone generate more Voice and SMS than normal? … hmmm). Obviously, the Smartphone is used for a lot of other stuff than calling and SMSing! The data tells us that an average Smartphone user (whatever that means) spends ca. 42 minutes on web browsing and social networking, while “only” 22 minutes on Calls and SMS (actually, 9 minutes of SMS sounds more like a teenager than a high-end smartphone user … but never mind that!). There is lots of other stuff going on with that Smartphone. In fact, out of the total daily usage of 128 minutes, only 17% of the time (i.e., 22 minutes) is used for Plain Old Mobile Telephony Services (The POMTS). We do however find that both voice minutes and legacy messaging consumption are declining faster in the Smartphone segment than for Basic Phones (where they are declining rapidly as well), as OTT Mobile App alternatives substitute for POMTS (see the inserted chart from http://www.slideshare.net/KimKyllesbechLarsen/de-risking-the-broadband-business-model-kkl2411201108x).

I have no doubt that the O2 data represents an averaging across a given Smartphone sample; the question is how this data helps us to understand the real Smartphone user and his behavior.

So how did O2 measure this data?

(1) To be reliable and reasonable, data collection should be done by an App residing on the O2 customer’s smartphone. An alternative (2) would be deep packet inspection (dpi), but this would only capture network usage, which can be (and in most cases will be) very different from the time the customer actively uses his Smartphone. (3) Obviously the data could also be collected by old-fashioned questionnaires being filled in. This would be notoriously unreliable, and I cannot imagine this being the source.

Thus, I am making the reasonable guess that the Smartphone Data Collection is mobile App based.

"Thousand and 1 Questions": Does the data collected represent a normal O2 Smartphone user? Or a particular segment that doesn't mind having a Software Sniffer (i.e., The Sniffer) on the used device reporting their behavior? Is "The Sniffer" a standard, already installed (and activated?) App on all Smartphone devices? Only on a certain segment? Or is it downloadable (which would require a certain effort from the customer)? Is the collection done for both prepaid & contract customers, and for both old and new smartphones (i.e., usage patterns depend on OS version/type and device capabilities such as air interface speed DL & UL, CPU, memory management, etc.)? Is WiFi included or excluded? What about Apps running in the background (are these included)? Etc…

I should point out that it is always much easier to poke at somebody else's data analysis than it often is to collect, analyse and present such data. Though, depending on the answers to the above "1,000 + 1" questions, the O2 data either becomes a fair representation of an O2 Smartphone customer or "just" an interesting data point for one of their segments.

If the average Smartphone cellular (i.e., no WiFi blend) monthly consumption in the UK is ca. 450MB (+/-50MB), and if the consumer had an average cellular speed of 0.5Mbps (i.e., likely conservative, with the exception of streaming services which could be lower), one would expect that the Time spent consuming Network Resources would be no more than 120 minutes per month, or 5 minutes per day (@ R99 384kbps this would be ca. 6 min per day). Had I chosen a more sophisticated QoS distribution, the Network Consumption Time would in any case not change by an order of magnitude.

So we have 5 minutes of daily Mobile Data Network Time Consumption versus O2's Smartphone usage time of 106 minutes (excluding Calls & SMS) … a factor of 22 in difference!

For every minute of mobile data network consumption, the customer spends 20+ minutes actively with his device (i.e., reading, writing, playing, etc.).
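For the numerically inclined, here is a minimal Python sketch of the back-of-the-envelope arithmetic above; the 450 MB/month volume, the 0.5 Mbps average speed and the O2 daily-usage figures are simply the assumptions stated in the text.

```python
# Back-of-the-envelope: monthly cellular volume vs. time spent "on the network".
# Assumptions from the text: ca. 450 MB/month cellular volume, ca. 0.5 Mbps
# average effective throughput, 128 min/day total device time, 22 min/day Calls & SMS.

monthly_volume_mbit = 450 * 8            # 450 MB expressed in Mbit
avg_speed_mbps = 0.5                     # assumed average cellular throughput

network_min_per_month = monthly_volume_mbit / avg_speed_mbps / 60   # ~120 min
network_min_per_day = network_min_per_month / 30                    # ~4 min; text rounds up to 5

device_min_per_day = 128 - 22            # O2 total usage minus Calls & SMS = 106 min

print(f"Network time:       ~{network_min_per_day:.1f} min/day")
print(f"Active device time: {device_min_per_day} min/day")
# ~26x with these exact inputs; the text, rounding to 5 min/day, quotes ~22x
print(f"Ratio:              ~{device_min_per_day / network_min_per_day:.0f}x")
```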

So … can we trust the O2 Smartphone data?

Trend-wise the data certainly appears reasonable! Whether the data represents a majority of the O2 smartphone users or not … I have some doubts. However, without a more detailed explanation of the data collection, sampling, and analysis, it is difficult to conclude how representative the O2 Smartphone data really is of their Smartphone customers.

Alas, this is the problem with most of the mobile data user and usage statistics being presented to the public as an average (I have had my share of this challenge as well).

Clearly we spend a lot more time with our device than the device spends actively on the mobile network. This trend has been known for a long time from the fixed internet. O2 points out that the Smartphone, with its mobile applications, has become the digital equivalent of a "Swiss Army Knife" and as a consequence (as Tomi also points out in his Blog) is already in the process of replacing a host of legacy consumer devices, such as the watch, alarm clock, camera (both still pictures and video), books, music players and radios, and of course, last but not least, substituting The POMTS.

I have argued and shown examples that the Average Numbers we are presented with are notorious by character. What other choices do we have? Would it be better to report the Median rather than the Average (or Mean)? The Median divides a given consumptive distribution in half (i.e., 50% of customers have a consumption below the Median and 50% above). Alternatively, we could report the Mode, which would give us the most frequent consumption across our consumer distribution.

Of course, if consumer usage were distributed normally (i.e., a symmetric bell shape), Mean, Median and Mode would be one and the same (and we would all be happy and bored). No such luck!

Most consumptive behaviors tend to be much more skewed and asymmetric (i.e., "the few take the most") than the normal distribution (which most of us instinctively assume when we are presented with figures). Most people are not likely to spend much thought on how a given number is calculated. However, it might be constructive to provide the percentage of customers whose usage falls below the reported average. The reader should note that whenever that percentage differs from 50%, the consumptive distribution is skewed and the onset of Reality Distortion has occurred.
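To make the skew concrete, here is a small illustrative Python sketch. The lognormal distribution and its parameters are my assumption (a common stand-in for per-user data volumes), not O2's or anyone's actual data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical skewed usage distribution ("the few take the most"):
# a lognormal with purely illustrative parameters.
usage_mb = rng.lognormal(mean=5.0, sigma=1.2, size=100_000)

mean = usage_mb.mean()
median = np.median(usage_mb)
below_mean_pct = (usage_mb < mean).mean() * 100

print(f"Mean:   {mean:7.1f} MB")
print(f"Median: {median:7.1f} MB")   # well below the mean for a skewed distribution
print(f"{below_mean_pct:.0f}% of users consume less than the 'average'")
# For this lognormal, ~70%+ of users sit below the mean: reporting the average
# alone flatters the typical user's consumption.
```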

Wireless Broadband Access (BWA) Greenfield Ambition… (from March 2008)

In case you are contemplating starting a wireless broadband, maybe even mobile broadband, greenfield operation in Europe, there will be plenty of opportunity in the next 1 to 2 years. Will it be a great business in Western Europe's mature markets? Probably not, but it still might be worth pursuing. The mobile incumbents will have a huge edge when it comes to spectrum and capacity for growth, which will be very difficult to compete against for a Greenfield with comparably limited spectrum.

Upcoming 2.50 GHz to 2.69 GHz spectrum (i.e., 2.6 GHz for short) auctions, often referred to as the UMTS extension band spectrum, are being initiated in several European countries (United Kingdom, The Netherlands, Sweden, etc.). Thus, we are talking about 190 MHz of bandwidth up for sale to the highest bidder(s). Compare this with the UMTS auction at the 2.1 GHz band, which was 140 MHz. The European Commission has recommended splitting the 190 MHz into 2×70 MHz for FDD operations (basically known as the UMTS extension band in some countries) and a (minimum) 1×50 MHz part for TDD operation.

In general it is expected that incumbent mobile operators (e.g., Vodafone, T-Mobile, KPN, Orange, Telefonica/O2, etc.) will bid for the 2.6 GHz FDD spectrum, supplementing their existing UMTS 2.10 GHz spectrum and mitigating possible growth limitations they might foresee in the future. The TDD spectrum is in particular expected to be contended for by new companies, greenfield operations, as well as fixed-line operators (i.e., BT) with the ambition to launch broadband wireless access (BWA, i.e., WiMAX) networks. Thus, new companies which intend to compete with today's mobile operators and their mobile broadband data propositions. Furthermore, just as mobile operators with broadband data compete with the fixed broadband business (i.e., DSL & cable), so is it expected that the new players would likewise compete with both existing fixed and mobile broadband data propositions. Obviously, new businesses might not limit their business models to broadband data but could also provide voice offerings.

Thus, the competitive climate would become stronger as more players contend for the same customers and those customers' wallets.

Let's analyse the Greenfield's possible business model and the economic value of starting up a broadband data business in the mature markets of Western Europe. The analysis will be done at a fairly high level, which gives us an indication of the value of the Greenfield business model as well as the options a new business would have to optimize that value.

FDD vs TDD Spectrum

The 2.6 GHz auction is in its principles asymmetric, allocating more bandwidth to FDD-based operation than to TDD-based Broadband Wireless Access (BWA) deployment: 2×70 MHz vs 1×50 MHz. It appears fair to assume that most incumbent operators will target 2×20 MHz FDD, which coincides with the minimum bandwidth target for the Next-Generation Mobile Network (NGMN)/Long-Term Evolution (LTE) Network vision (ref: 3GPP LTE).

An entrant interested in (part of) the 1×50 MHz TDD spectrum would in the worst case need 3x the FDD carrier bandwidth to get an equivalent per-sector capacity as an FDD player, i.e., 2×20 MHz FDD is equivalent to 1×60 MHz TDD with a frequency re-use of 3 used by the TDD operator (see the sketch below). Thus, in a like-for-like comparison, a TDD player would have difficulty matching the incumbents' spectrum position at 2.6 GHz (ignoring that the incumbents have a significantly stronger spectrum position to begin with).
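A minimal Python sketch of the re-use arithmetic behind that rule of thumb. The 20 MHz and 60 MHz figures are the text's own; the UL/DL split of the TDD carrier is deliberately ignored here, as it is in the text's like-for-like comparison.

```python
# Per-sector usable spectrum = allocated spectrum / frequency re-use factor.
# A TDD player at re-use 3 needs ~3x the carrier bandwidth of an FDD player
# at re-use 1 to field the same 20 MHz per sector.

def per_sector_mhz(total_mhz: float, reuse: int) -> float:
    return total_mhz / reuse

fdd_sector = per_sector_mhz(20, reuse=1)   # one 20 MHz FDD carrier, re-use 1
tdd_sector = per_sector_mhz(60, reuse=3)   # 1x60 MHz TDD, re-use 3

print(fdd_sector, tdd_sector)              # both 20.0 MHz per sector
```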

Of course, better antenna systems (moving to re-use 1), improved radio resource management, higher spectral efficiency (i.e., Mbps/MHz), as well as improved overall link budgets might mitigate the possible disadvantage of the spectral asymmetry, benefiting the TDD player. However, those advantages are more a matter of time before competing access technologies bridge an existing performance gap (technology-equivalent tit-for-tat).

Comparing actual network performance of FDD-based UMTS/HSPA (High-Speed Packet Access) with WiMAX 802.16e-2005, the performance is roughly equivalent in terms of spectral efficiency. However, in Europe far more FDD-based than TDD-based spectrum has in general been allocated, which overall does result in considerable capacity and growth issues for TDD-based business models. The Long-Term Evolution (LTE) path is likely to be developed for both FDD- and TDD-based access, and equivalent performance might be expected in terms of bits-per-second per Hz.

Thus, it is likely that a TDD-based network would become capacity-limited sooner than a mobile operator having a full portfolio of FDD-based spectrum (i.e., 900 MHz (GSM), 1800 MHz (GSM), 2,100 MHz (FDD UMTS) and 2,500 MHz (FDD UMTS/LTE)) at its disposal. Therefore, a TDD-based business model could be expected to look different from an incumbent mobile operator's existing business model.

The Greenfield BWA Business Case

Assume that Greenfield BWA intends to start up its BWA business in a market with 17 million inhabitants, 7.4 million households, and a surface area of 34,000 km2. The Greenfield's business model is based on household coverage with focus on Urban and Sub-Urban areas, covering 80% of the population and 60% of the surface area.

It is worth mentioning that the valuation approach presented here is high-level and should not replace proper financial modelling and due diligence. This said, the following approach does provide good guidance on the attractiveness of a business proposition.

Greenfield BWA – The Technology Part

The first exercise the business modeller faces is to size the network consistent with the business requirements and vision. How many radio nodes would be required to provide coverage and support the projected demand – that is the question to ask! Given frequency and radio technology, it is relatively straightforward to provide a business-model estimate of the number of sites needed.

Using a standard radio engineering framework (e.g., the Cost231 Walfish-Ikegami cell range model (Ref.: Cost231)), a reasonable estimate can be made of the typical maximum cell range to be expected, subject to the radio environment (i.e., dense-city, urban, sub-urban and rural). Greenfield BWA intends to deploy (mobile) WiMAX at 2.6 GHz. Using the standard radio engineering formula, a 1.5 km uplink-limited cell range @ 2.6 GHz is estimated. Uplink-limited implies that the achievable range from the Customer Premise Equipment (CPE) to the Basestation (BS) is shorter than in the other direction, from BS to CPE. This is a normal situation, as the CPE is often the limiting factor in network deployment considerations.

The 1.5-km cell range we have estimated above should be compared with typical cell ranges observed in actual mobile networks (e.g., GSM900, GSM1800 and UMTS2100). Typically, in dense-city (i.e., Top-3 cities) areas the cell range is between 0.5 and 0.7 km depending on load. In urban/metropolitan radio environments we often find an average cell range between 2.0 and 2.5 km, depending on deployed frequency, cell load and radio environment. In sub-urban and rural areas one should expect an average cell range between 2.0 and 3.5 km, depending on frequency and radio environment. Typically, cell load will be more important in city and urban areas (i.e., less frequency dependence), while the frequency will be most important in sub-urban and rural areas (i.e., low frequency => higher cell range => fewer sites; higher frequency => lower cell range => more sites).

The cell range (i.e., 1.5 km) and the effective surface area targeted for network deployment (i.e., 20,000 km2) provide an estimate for the number of coverage-driven sites of ca. 3,300 BWA nodes. Whether more sites would be needed due to capacity limitations can be assessed once the market and user models have been defined.
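As a sanity check, a minimal Python sketch of that coverage-site arithmetic, assuming idealized hexagonal cells (the hexagon approximation is mine; the 1.5 km range and 20,000 km2 target area are from the text):

```python
import math

# Coverage-driven site count: hexagonal cell area = (3*sqrt(3)/2) * R^2.
cell_range_km = 1.5                     # uplink-limited range from the text
target_area_km2 = 20_000                # effective deployment area from the text

cell_area_km2 = (3 * math.sqrt(3) / 2) * cell_range_km ** 2   # ~5.85 km2
sites = target_area_km2 / cell_area_km2

print(f"~{sites:,.0f} coverage sites")  # ~3,400; the text rounds to ca. 3,300
```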

Using typical infrastructure pricing and site-build cost, the investment level for Western Europe (i.e., Capital expenses, Capex) should not exceed 350 million Euro for the network deployment, all included. Assuming that the related network operational expense can be limited to 10% (excluding personnel cost) of the cumulated Capex, we have a yearly network-related Opex of 35 million Euro (after the rollout target has been reached). After the final deployment target has been reached, the Greenfield should assume a capital expense level of minimum 10% of its service revenue.

It should not take Greenfield BWA more than 4 years to reach its rollout target. This can be further accelerated if Greenfield BWA can share existing incumbent network infrastructure (i.e., site sharing) or use independent tower companies' services. In the following, assume that the BWA site rollout can be done within 3 years of launch.

Greenfield BWA – The Market & Finance Part

Greenfield BWA will primarily target the household market with broadband wireless access services based on WiMAX (i.e., the 802.16e standard). Voice over IP will be supported and offered with the subscription.

Furthermore, Greenfield BWA intends to provide stationary as well as nomadic services to the household segment. In addition, Greenfield BWA will also provide some mobility in the areas where it provides coverage. However, this would not be its primary concern, and thus national roaming would not be offered (reducing roaming charges/costs).

Greenfield BWA reaches a steady-state (i.e., after final site rollout) customer market share of 20% of the covered household base: ca. 1.1 million household subscriptions, on which a blended revenue of €20 per household per month can be expected. Thus, a yearly service revenue of ca. 265 million Euro. From year 4 onwards, a maintenance Capex level of 25 million Euro is kept (i.e., ca. 10% of revenue).

Greenfield BWA manages its cost strictly and achieves an EBITDA margin of 40% from year 4 onwards (i.e., a total annual operational cost of ca. 160 million Euro).

The Depreciation & Amortisation (D&A) level is kept at €40 million annually (steady-state). Furthermore, Greenfield BWA has an effective tax rate of 30%.

Now we can estimate the free cash flow (FCF) Greenfield BWA would generate from the 4th year onwards:

(all in million Euro)
Revenue €265
- Opex €158
= EBITDA €106
- D&A €40 (ignoring spectrum amortization)
- Tax €20 (i.e., 30%)
+ D&A €40
= Gross Cash Flow €86
- Capex €25
= FCF €61

Assuming a zero percent FCF growth rate and operating with a 10% Weighted Average Cost of Capital (WACC; this could be largely optimistic for a pure Greenfield operation – 15% to 25% is not unheard of, to reflect the high risks), the perpetuity value from year 4 onwards would be €610 million. In Present Value this is €416 million. Netting off €288 million of discounted capital investment for the initial 3 years (network deployment) and adding the first 3 years' cumulated discounted EBITDA of €12 million provides a rather weak business case of ca. €140 million (upper) valuation prior to spectrum investment, whereof the bulk of the valuation arises from the continuation value (i.e., year 4 onwards).
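A minimal Python sketch of this valuation logic, using the text's own figures (the function shape and the timing conventions are simplifications of mine):

```python
# High-level Greenfield valuation: zero-growth perpetuity of steady-state FCF
# from year 4, discounted at WACC, net of the discounted 3-year build-out.
# All figures in million EUR, taken from the text.

def greenfield_value(fcf: float, wacc: float, steady_year: int,
                     build_capex_pv: float, early_ebitda_pv: float) -> float:
    perpetuity = fcf / wacc                                  # EUR 610M for 61/0.10
    perpetuity_pv = perpetuity / (1 + wacc) ** steady_year   # ~EUR 416M
    return perpetuity_pv - build_capex_pv + early_ebitda_pv

value = greenfield_value(fcf=61, wacc=0.10, steady_year=4,
                         build_capex_pv=288, early_ebitda_pv=12)
print(f"~EUR {value:.0f} million")      # ca. EUR 140 million, as in the text
```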

An alternative valuation would be to take a multiple of the (4th year) EBITDA as a sales-price valuation equivalent; typically one would expect between 6x and 10x the (steady-state) EBITDA, and thus €636 mio (6x) to €1,060 mio (10x).

The above valuation assumptions are optimistic and it is worthwhile to note the following;

1. €20 per month per household customer should be seen as an optimistic upper value; a lower and more realistic figure might not be much more than €15 per month.
2. 20% market share is ambitious, particularly after only 3 years of operation.
3. 40% margin with 15% customer share and 3,300 radio nodes is optimistic, but might be possible if Greenfield BWA can make use of Network Sharing and other cost synergies in relation to, for example, outsourcing.
4. A 10% WACC is assumed. This is rather low given the start-up scenario. I would not be surprised if this could be estimated to be as high as 15% to 20%.

If the lower boundaries of points 1 to 4 were applied to the above valuation logic, the business case would very quickly turn red (i.e., negative), leading to the conclusion of a significant business risk given the scope of the above business model. Our hypothetical Greenfield BWA should target paying a minimum license fee for the TDD spectrum; the upper boundary should not exceed €50 million to mitigate too-optimistic business assumptions.

The City-based Operation Model

Greenfield BWA could choose to focus its business model on the top-10 cities and their metropolitan areas. Let's assume that this captures 50% of the population (or households) as well as 15% of the surface area. This should be compared with the above assumptions of 80% population and 60% surface area coverage.

The key business drivers would look as follows (in parentheses the previous values are shown for reference):

Sites 850 (3,300) rollout within 1 to 2 years (3 years).
Capex €100 mio (€350) for initial deployment; after which €18 mio (€25).

Customers 0.74 mio (1.1)
Revenue €178 mio (€264)
EBITDA €72 mio (€106)
Opex €108 mio (€160)
FCF €38 mio (€61)
Value €210 mio (€140)

The city-based network strategy is about 50% more valuable than a more extensive coverage strategy would be.

An alternative valuation would be to take a multiple of the (3rd year) EBITDA as the sales-price valuation equivalent; typically one would expect between 6x and 10x the (steady-state) EBITDA, and thus €432 mio (6x) to €720 mio (10x).
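The same multiple arithmetic in a couple of lines of Python, for both scenarios (EBITDA figures from the text):

```python
# EBITDA-multiple sanity check: 6x-10x steady-state EBITDA (million EUR).
for name, ebitda in [("coverage-led", 106), ("city-based", 72)]:
    print(f"{name}: EUR {6 * ebitda} - {10 * ebitda} million")
# coverage-led: EUR 636 - 1060 million; city-based: EUR 432 - 720 million
```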

Interestingly (but not surprisingly!), Greenfield BWA would be better off focusing on a smaller network in areas of high population density, as this is financially more attractive. Greenfield BWA should avoid the coverage-based rollout strategy known from the mobile operator business model.

The question is how important it is for Greenfield BWA to provide coverage everywhere. If the target is primarily household-based customers with nomadic and static mobility requirements, then such a "coverage where the customer is" business model might actually work.

Source: http://harryshell.blogspot.de/2008/03/wireless-broadband-access-bwa.html

Did you know? Did you consider? (from March 2008)

In 2007 the European average mobile revenue per user (ARPU per month) was €28+/-€6; a drop of ca. 4% compared to 2006 (the EU inflation level in 2007 was ca. 2.3%).

Of the €28 ARPU, ca. 16% could be attributed to non-voice usage (i.e., €4.5).

Of the €4.5 Non-Voice ARPU, ca. 65% could be attributed to SMS usage (i.e., €3.0).

Thus, leaving €1.5 for non-voice (mobile) data services (i.e., 5.4% of total ARPU).
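The decomposition in a few lines of Python, using the figures quoted above (small rounding differences against the text's €4.5/€3.0/€1.5 are expected):

```python
# Decomposing the 2007 European monthly ARPU quoted above.
arpu = 28.0                    # EUR per user per month
non_voice = 0.16 * arpu        # ~EUR 4.5 non-voice
sms = 0.65 * non_voice         # ~EUR 3.0 of that is SMS
data = non_voice - sms         # ~EUR 1.5 left for mobile data

print(f"non-voice {non_voice:.2f}, SMS {sms:.2f}, data {data:.2f}")
print(f"data share of ARPU: {data / arpu:.1%}")   # ~5-6% of total ARPU
```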

The increase that most European countries have seen in their mobile Non-Voice Revenue has by far not been able to compensate for the drop in ARPU across most countries over the last 5 to 6 years.

Adding advanced data (e.g., UMTS and HSPA) capabilities to the mobile networks around Europe has not resulted in getting more money out of the mobile customer (but absolute revenue has grown due to customer intake).

Although most European UMTS/HSPA operators report a huge uptake (in relative terms) of Bytes generated by the customers, this is not reflected in the ARPU development.

Maybe it really does not matter, as long as the mobile operators' overall financial performance remains excellent (i.e., Revenues, Customers, EBITDA, Cash, …)?

Is it possible to keep healthy financial indicators with decreasing ARPU, huge data usage growth and investments into brand-new radio access technologies targeting the €1.5 per month per user?

Source: http://harryshell.blogspot.de/2008_03_01_archive.html

Winner of the 700-MHz Auction is … Google! (from April 2008)

The United States recently ended (March 2008) the auction of 5 blocks (see details below) of the analog TV spectrum band at 700 MHz. More specifically, the band between 698 – 763 MHz (UL) and 728 – 793 MHz (DL), with a total bandwidth of 2×28 MHz. In addition, a single band of 1×6 MHz in the 722 – 728 MHz range was likewise auctioned. The analog TV band is expected to be completely vacated by Q1 2009.

The USA 700 MHz auction result was an impressive total of $19.12 billion, spent buying the following spectrum blocks: A (2×6 MHz), B (2×6 MHz), C (2×11 MHz) and E (1×6 MHz). The D (2×5 MHz) block did not reach the minimum level. A total of 52 MHz (i.e., 2×23 + 1×6 MHz) of bandwidth was auctioned off.

Looking with European eyes at the available spectrum allocated per block, it is not very impressive (which is similar to other US frequency blocks per operator, e.g., AWS & PCS). The 700 MHz frequency is clearly very economical for radio network coverage deployment, in particular compared to the high-frequency AWS spectrum used by T-Mobile, Verizon and Sprint. However, the 6 to 11 MHz (UL/DL) is not very impressive from a capacity-sustainability perspective. It is quite likely that this spectrum would be exhausted rapidly, leading to a significant additional financial commitment to cell splits / capacity extensions.

This $19.12 billion for 52 MHz translates to $1.22 per MHz spectrum per Population @ 700 MHz.

This should be compared to the following historical auctions:
* $0.56/MHz/Pop @ 1,700 MHz in the 2006 US AWS auction.
* $0.15/MHz/Pop (USA Auction 22 @ 1999) to $4.74/MHz/Pop (NYC, Verizon).
* $1.23/MHz/Pop in the Canadian 2000 PCS1900 auction of 40 MHz.
* $5.94/MHz/Pop in the UK UMTS auction (2000), auctioning a total of 2×60 MHz FDD spectrum (TDD not considered).
* $7.84/MHz/Pop in the German UMTS auction (2000; 2×60 MHz FDD, TDD not considered).

(Note: the excesses of the European UMTS auctions clearly illustrate a different time and place.)
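For reference, the $1.22 headline figure in a couple of lines of Python (the ~301 million US population for 2008 is my assumption; price and bandwidth are from the text):

```python
# Price per MHz per population ("per pop") for the 700 MHz auction.
total_usd, bandwidth_mhz, population = 19.12e9, 52, 301e6   # population is an assumption
print(f"${total_usd / bandwidth_mhz / population:.2f}/MHz/Pop")   # ~$1.22
```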

What is particularly interesting is that Verizon "knocked out" Google by paying $4.74 billion for the nationwide C-block of 2×11 MHz, "beating" Google's offer of $4.6 billion.

However, Google does not appear too saddened by the outcome and … why should they! Google has to a great extent influenced the spectrum conditions allowing for open access (although it remains to be seen what this really means) to the C spectrum block; the USA Federal Communications Commission (FCC) has proposed to apply "open access" requirements for devices and applications on the nationwide spectrum block C (2×11 MHz).

Clearly Google should be regarded as the winner of the 700 MHz auction. They have avoided committing a huge amount of cash to spectrum and, on top of that, having to deploy even more cash to build and operate a wireless network (which is hardly their core business anyway).

Googling the Business Case
Google was willing to put down $4.6 billion for the 2×11 MHz @ 700 MHz. Let's stop and ask what their business case could possibly have looked like.

At 700 MHz, with not-too-ambitious bandwidth-per-user requirements, Google might achieve a typical cell range between 2.5 and 4 km (uplink-limited, i.e., limited by the user equipment's connection to the base station). Although in "broadcast/downlink" mode the cell range could be significantly larger (and downlink is all you really need for advertisement and broadcast ;-).

Assume Google's ambition was the top-100 cities and 1-2% of the USA surface area; they would need at least 30 thousand nodes. Financially (all included) this would likely result in $3 to $5 billion of network capital expense (Capex) and a technology-driven annual operational expense (Opex) of $300 to $500 million (in steady-state). On top of the spectrum price.

Using the above rough technology indicators, Google (if driven by sound financial principles) must have had a positive business case for a cash-out of minimum $8 billion over 10 years, including spectrum, discounted at a WACC of 8% (all in all being very generous), and an annual Technology Opex of minimum $300 million. On top of this come customer acquisition, sales & marketing, and building a wireless business operation (obviously they might choose to outsource all that jazz).
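A minimal sketch of that cash-out arithmetic in Python, using the lower ends of the text's ranges (treating Capex as an upfront lump sum is my simplification):

```python
# Rough present value of the would-be cash-out: $4.6B spectrum, $3B Capex,
# $0.3B/yr technology Opex over 10 years, discounted at an 8% WACC.
wacc = 0.08
spectrum, capex = 4.6e9, 3.0e9                       # upfront (simplification)
opex_pv = sum(0.3e9 / (1 + wacc) ** t for t in range(1, 11))

print(f"~${(spectrum + capex + opex_pv) / 1e9:.1f}B PV cash-out")  # ~$9.6B, i.e. >$8B
```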

… and then don't forget the customer device that needs to be developed for the 700 MHz band (note: GSM 750 MHz falls inside the C-band). It typically takes between 3 and 5 years to get a critical customer mass, and then only if the market is stimulated.

It would appear to be a better business proposition to let somebody else pay for spectrum, infrastructure, operation, etc., and just do what Google does best … selling advertisements and delivering search results … for mobile devices … maybe even agnostic to the frequency (this seems better than waiting until critical mass has been reached at 700 MHz).

But then again … Google reported for full-year 2007 $16.4 billion in advertising revenues (up 56% compared to the previous year) (see refs Google Investor Relations). Imagine what this could be if extended to the wireless / mobile market. Still lower than Verizon's 2007 full-year revenue of $23.8B (up 5.5% from 2006), but not that much lower considering the difference in growth rate.

The "successful" proud owners (Verizon, AT&T Mobility, etc.) of the 700 MHz spectrum might want to keep in mind that Google's business case for entering wireless must have been valued far beyond their proposed $4.6 billion.

Appendix:
The auction divided the former analog TV UHF spectrum into 5 blocks:
Block A: 2×6 MHz bandwidth (698–704 and 728–734 MHz); $3.96 billion.
Block B: 2×6 MHz bandwidth (704–710 and 734–740 MHz); $9.14 billion, dominated by AT&T Mobility.
Block C: 2×11 MHz bandwidth (746–757 and 776–787 MHz); Verizon, $4.74 billion.
Block D: 2×5 MHz bandwidth (758–763 and 788–793 MHz); no bids above the minimum.
Block E: 1×6 MHz bandwidth (722–728 MHz); Frontier Wireless LLC, $1.26 billion.

Source: http://harryshell.blogspot.de/2008/04/winner-of-700-mhz-auction-is-google.html

Google Openness – Your Click makes Google Tick (from April 2008)

In an open letter to the chairman of the USA Federal Communications Commission (FCC), Kevin J. Martin, Google pleads for openness in the recently ended (i.e., March 2008) auction of the 700 MHz spectrum:

1. Open Applications – users can gain access to and use any applications, services or content.
2. Open Devices – any device on any network.
3. Open Wholesale Services – Service Providers and Virtual Network Operators should get wholesale access to the 700 MHz based network(s) on reasonably non-discriminatory commercial terms.
4. Open Network Access – service providers and virtual network operators should be allowed to interconnect with the 700 MHz wireless network(s).

The first two points, Open Applications and Open Devices, are in principle independent of the 700 MHz auction, although they can of course be made mandatory in the particular auction requirements … and so they were.

The Open Wholesale Services point makes the mind boggle (well, at least mine) when figuring out funny wholesale models that would be non-discriminatory to both the wireless operator (having invested in spectrum and network) and the Googles-and-alike (GAAs … whoever other than Google that might be?). The biggest question for a network operator providing wholesale to GAAs is likely going to be how to get a piece of the Google advertisement revenue pie.

Open Applications
Within device capabilities and network possibilities, this does not sound like mission impossible. Obviously, if a wireless network is interconnected with the web (i.e., the 4th requirement), services and applications available in general to a device (pc, laptop, etc.) connected to the fixed internet would also be available to the mobile device.

However, there are particular services and applications that wireless operators might want to traffic-control and manage. Particularly in the case of having only an 11 MHz bandwidth available (i.e., USA nationwide C-band @ 700MHz) on the air interface, heavy peer-2-peer applications and streaming might result in severe congestion and loss of service quality. Thus, the ability to control and manage the Quality of Service per application / content category will be necessary in order to avoid a few heavy users jeopardizing the service quality of the majority of average wireless users.

The wireless operator however should have no problems in complying with Google Point 1.

Open Devices – Open Networks
This requirement might appear harmless and not worth worrying about. In principle, a user who has a subscription and pays the access price can have access to any network his device is capable of communicating with (not exactly true in most of today's mobile standards, such as GSM and UMTS/HSPA). Basically, WiFi hotspot access comes closest to such a business principle, and if the customer does not care about the mobile operator's services and content, this would suffice.

The mobile business model is (even technically) not built on principles of free access between wireless / mobile networks. A mobile subscription (post-paid or pre-paid) is associated with customer acquisition cost, often subsidising the user terminal. The subscription requires the customer to keep paying for a period of time to pay back the upfront customer investment made by the mobile operator.

Allowing a business model where customers can freely roam/move across networks might require a different financing mechanism (or none) for the consumer's device and access rights. A service provider with wholesale agreements with several wireless network operators could enable this for its customer base. For a traditional mobile operator such a model would not be very attractive, unless national roaming is invoked due to lack of coverage in a given area.

The Google proposal is, from a business-model perspective, very interesting (indeed likely disruptive), although it would also require some rethinking of the current mobile AAA (i.e., Authentication, Authorization, and Accounting) architecture.

Furthermore, one might fear that by moving to the proposed Google model, a few internet-based businesses would end up "owning" the customer data (g-search, gmail, g-chat, g-blog, g-msisdn, g-device, etc.), whereas customer-data ownership is currently spread out across several mobile and fixed telecommunication businesses. The legacy mobile / wireless operator becomes a bit-carrier paid by those few internet-based businesses.

Open Wholesale Services
For the Google business this is a really fun one to think about. How would that work for an entity such as Google, for which close to 100% of revenues come from advertising (i.e., the 2007 earnings show that 98.91% of their $16.594 billion came from advertising)?

The value for Google of going wireless is clearly in opening up a new channel for advertising. The growth potential of entering the mobile channel is potentially enormous, with mobile penetration approaching 100% and even far beyond in many European markets (closer to 120%+).

Normal telecommunication wholesale models are based on volumetric usage (i.e., Minutes or Bytes). However, Google would hardly trigger any direct volumetric usage, with the exception of the volume it takes to download google.com. Admittedly, there might be a considerable traffic stream arising from YouTube and some from gmail usage. Even following an advertising link will generate traffic, although not necessarily much additional volumetric usage. Of course, the question is how to distinguish between Google-generated traffic and non-Google traffic?

Furthermore, the price per advertisement click that Google earns could be significantly different from (i.e., higher than) the cost of the click under a standard volumetric wholesale model. Moreover, different clicks might have different values but still generate the same volume and associated cost.

Maybe the wireless operator should not care too much about how Google earns its money, as long as the traffic generated by providing access to happy Googlers and GAAs is recovered with a healthy margin and does not jeopardize the quality of service for other customers.

Open Network Access
Yeah, this sort of makes sense … without this, the first three points become rather academic. There are no essential technical barriers to interconnection.

Source: http://harryshell.blogspot.de/2008/04/google-openess-your-click-make-google.html

Backhaul Pains (from April 2008)

Backhaul, which is the connection between a radio node and the core network, is possibly providing mobile-wireless operators with the biggest headache ever (apart from keeping a healthy revenue growth in mature markets 😉). It can be difficult to come by in the right quantities and can be rather costly with conventional transmission cost structures. Backhaul is expected to have delayed the Sprint WiMAX rollout of their Xohm-branded wireless internet service. A Sprint representative is supposed to have said: "You need a lot of backhaul capacity to do what's required for WiMax." (see for example the WiMax.com blog)

What’s a lot?

Well … looking at the expected WiMAX speed per Base Station (BS) of up to 50 Mbps (i.e., 12 – 24x the typical backhaul supporting voice demand), it is clear that finding suitable and low-cost backhaul solutions might be challenging. Conventional leased lines would be grossly uneconomical, at least if priced conventionally; xDSL and Fiber-to-the-Premises (FTTP) infrastructure that could (economically?) support such bandwidth demand is not widely deployed yet.

Is this a Sprint issue only? Nope! …. Sprint cannot be the only mobile-wireless operator with this problem – for UMTS/HSPA mobile operators the story should be pretty much the same (unless an operator has a good and modern microwave backhaul network supporting the BS speed).

Backhaul Pains – Scalability Issues
The backhaul connection can be either via a Leased Line (LL) or a Microwave (MW) radio link. Sometimes a MW link can be leased as well and might even be called a leased line.

With microwave (MW) links one can easily deliver multiples of 2.048 Mbps (i.e., 10 – 100 Mbps) on the same connection for relatively low capital cost (€500 – €1,000 per 2.048 Mbps) and low operational expense. However, planning and deployment experience, as well as spectrum, are required.

In many markets network operators have been using conventional (fixed) leased lines, leased from incumbent fixed-line providers. The pricing model is typically based on an upfront installation fee (which might be capitalized) and a recurring monthly lease. On a yearly basis this operational expense can be in the order of €5,000 per 2.048 Mbps, i.e., 5x to 10x the amount of a MW connection. Some price models trade off the one-off installation fee against a lower lease cost.

Voice was the Good for Backhaul: before looking at the broadband wireless data bandwidth demand, it is worth noticing that in the good old Voice days (i.e., GSM, IS95, …) 1x to 2x 2.048 Mbps was more than sufficient to support most demands on a radio base station (BS).

Mobile-wireless broadband data enablers are the Bad and quickly becoming the Very Ugly for Backhaul: with the deployment of High Speed Packet Access (HSPA) on top of UMTS, and with WiMAX (a la Sprint), a BS can easily provide between 7.2 and 14.4 Mbps or higher per sector, depending on the available bandwidth. With 3 sectors per BS, the total supplied data capacity could (in theory …) be in excess of 21 Mbps per radio Base Station.

From the perspective of backhaul connectivity one would need at least an equivalent bandwidth of 10x 2.048 Mbps connections. Assuming such backhaul lease bandwidth is available in the first instance, with a conventional leased-line pricing structure such capacity would be very expensive, i.e., €50,000 per backhaul connection per year. Thus, for 1,000 radio nodes an operator would pay 50 million Euro on an annual basis (Opex directly hitting the EBITDA). This operational expense could be 8 times more than a voice-based leased-line operational expense.

Now that's a lot!

Looking a little ahead (i.e., the next couple of years), our UMTS and WiMAX based mobile networks will undergo the so-called Long-Term Evolution (LTE; FDD and TDD based) with expected radio node downlink (i.e., base station to user equipment) capacity between 173 Mbps and 326 Mbps, depending on the antenna system and available bandwidth (i.e., minimum 20 MHz spectrum per sector). Thus, over a 3-sectored BS, (theoretical) speeds in excess of 520 Mbps might be dreamed of (i.e., 253x 2.048 Mbps – and this is HUGE! :-). Alas, across a practical real-life deployed base station, no more than 1/3 of the theoretical speed should be expected (on average).
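Putting numbers on the pain, a small Python sketch of the leased-line arithmetic for both the HSPA/WiMAX case above and early LTE speeds (the round-up-to-whole-E1s convention is my simplification; €5,000 per E1 per year is the text's benchmark):

```python
import math

# Backhaul lease cost under a conventional E1 (2.048 Mbps) pricing structure.
E1_MBPS = 2.048
LEASE_EUR_PER_E1_YEAR = 5_000            # conventional benchmark from the text

def annual_lease_eur(site_mbps: float, sites: int) -> float:
    e1_lines = math.ceil(site_mbps / E1_MBPS)      # whole E1s per site
    return e1_lines * LEASE_EUR_PER_E1_YEAR * sites

print(annual_lease_eur(21, 1000) / 1e6)    # ~55 EUR M/yr (the text rounds to ~50)
print(annual_lease_eur(173, 1000) / 1e6)   # ~425 EUR M/yr at early-LTE speeds
```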

"Houston, we have a problem" … should be ringing in any CFO's / CTO's ears: (a) financially, near-future developments could significantly strain the Technology Opex budgets, and (b) technically, providing cost-efficient backhaul capacity that can sustain the promised land will be challenging.

A lot of the above possible cost can and should be avoided; looking at possible remedies, we have several options:

1. High-capacity microwave backhaul can prevent the severe increase in leased-line cost, provided spectrum and expertise are available. Financially, microwave deployment has the advantage of being mainly capital-investment driven, with little resulting additional operational expense per connection. It is expected that microwave solutions will be available in the next couple of years which can provide connection capacities of 100 Mbps and above.

Microwave backhaul solutions are clearly economical. However, it is doubtful that LTE speed requirements can be met even with the most efficient microwave backhaul solutions.

2. Move to different leased line (LL) pricing mechanisms such as flat pricing (eat all you can for x-Euro). Changing the LL pricing structure is not sufficient. At the same time providers of leased-line infrastructure will be “forced” (i.e., by economics and bandwidth demand) to move to new types of leased bandwidth solutions and architectures in order to sustain the radio network capabilities; ADSL is expected to develop from 8(DL)/1(UL) Mbps to 25(DL)/3.5(UL) Mbps with ADSL2+; VDSL (UL/DL symmetric) from ca. 100 Mbps to 250 Mbps with VDSL2 (ITU-T G.993.2 standard).

Clearly a VDSL2-based infrastructure could support today's HSPA/WiMAX requirements, as well as the initial bandwidth requirements of LTE. Although VDSL2-based networks are being deployed around Europe (and the world), they are not yet widely available.

Another promising means of supporting the radio-access bandwidth requirements is Fiber to the Premises (FTTP), such as offered by Verizon in certain areas of the USA (Verizon FiOS Service). With a Gigabit Passive Optical Network (GPON, ITU-T G.984 standard), maximum speeds of 2,400 Mbps (DL) and 1,200 Mbps (UL) can be expected. If available, FTTP to the base station would be ideal – provided that the connection is priced no higher than a standard 2.048 Mbps leased line today (i.e., the €5,000 benchmark). Note that for a mobile operator it could be acceptable to pay a large one-off installation fee, which could partly finance the FTTP connection to the base station.

Cost & Pricing Expectations
It is generally accepted by industry analysts that broadband wireless services are not going to add much to mobile operators' total service revenue growth. In optimistic revenue scenarios, data revenue compensates for stagnating/falling voice revenues. EBITDA margins will be (actually, already are!) under pressure, and the operational expenses will be violently scrutinized.

Thus, mobile operators deploying UMTS/HSPA, WiMAX and eventually (in the short-term) LTE cannot afford to have its absolute Opex increase. Therefore, if a mobile-wireless operator has a certain backhaul Opex, it would try to keep it at the existing level or reduce it over time (to mitigate possible revenue decline).

For the backhaul leased-capacity providers this is sort of bad news (or good? as it forces them to become economically more efficient) …. as they would have to finance their new fixed higher-bandwidth infrastructures (i.e., VDSL or FTTP) with little additional revenue from the mobile-wireless operators.

Economically it is not clear whether mobile-wireless cost-structure expectations will meet the leased-capacity providers total-cost of deploying networks supporting the mobile-wireless bandwidth demand.

However, for the provider of leased fixed bandwidth, providing VDSL2 and/or FTTP to the residential market should finance their deployment model.

With more than 90% of all data traffic being consumed in-house/indoors, and with VDSL2/Fiber-to-the-Home (FTTH) solutions being readily available to the homes (in urban environments at least) of business as well as residential customers, will mobile-wireless LTE base stations be loaded to the extent that very-high-capacity (i.e., beyond 50 Mbps) backhaul connections would be needed?

Source: http://harryshell.blogspot.de/2008/04/backhaul-pains.html