Can LEO Satellites Close the Gigabit Gap for Europe’s Unconnectables?

Is LEO satellite broadband a cost-effective and capable option for rural areas of Europe? Given that most seem to agree that LEO satellites will not replace mobile broadband networks, it seems only natural to ask whether LEO satellites might help achieve the EU Commission’s Digital Decade Policy Programme (DDPP) 2030 goal of having all EU households (HH) covered by gigabit connections delivered by so-called very high-capacity networks, including gigabit-capable fiber-optic and 5G networks, by 2030 (i.e., focusing only on the digital infrastructure pillar of the DDPP).

As of 2023, more than €80 billion had been allocated in national broadband strategies and through EU funding instruments, including the Connecting Europe Facility and the Recovery and Resilience Facility. However, based on current deployment trajectories and cost structures, an additional €120 billion or more is expected to be required to close the remaining connectivity gap for the approximately 15.5 million rural homes without a gigabit option in 2023. This brings the total investment requirement to over €200 billion. The shortfall is most acute in rural and hard-to-reach regions, where network deployment is significantly more expensive. In these areas, connecting a single household with high-speed broadband infrastructure, especially via FTTP, can easily exceed €10,000 in public subsidy, given the long distances and low density of premises. It would be a very “cheap” alternative for Europe if a non-EU-based (i.e., US) satellite constellation could close even part of the gigabit coverage gap. However, given current geopolitical factors, €200 billion could instead enable Europe to establish its own large LEO satellite constellation, provided it could match (or outperform) SpaceX’s unit economics rather than those of its own IRIS² satellite program.

In this article, my analysis focuses on direct-to-dish low Earth orbit (LEO) satellites with expected capabilities comparable to, or exceeding, those projected for SpaceX’s Starlink V3, which is anticipated to deliver up to 1 Terabit per second of total downlink capacity. For such satellites to represent a credible alternative to terrestrial gigabit connectivity, several thousand would need to be in operation. This would allow overlapping coverage areas, increasing effective throughput to household outdoor dishes across sparsely populated regions. Reaching such a scale may take years, even under optimistic deployment scenarios, highlighting the importance of aligning policy timelines with technological maturity.

GIGABITS IN THE EU – WHERE ARE WE, AND WHERE DO WE THINK WE WILL GO?

  • In 2023, Fibre-to-the-Premises (FTTP) rural HH coverage was ca. 52%. For the EU28, this means that approximately 16 million rural homes lack fiber coverage.
  • By 2030, projected FTTP deployment in the EU28 is expected to reach almost 85% of all rural homes (under so-called business-as-usual, BaU, conditions), leaving approximately 5.5 million households without it.
  • Due to inferior economics, it is estimated that approximately 10% to 15% of European households are “unconnectable” by FTTP (although not necessarily by FWA or broadband mobile in general).
  • The EC estimated (in 2023) that over 80 billion euros in subsidies had already been allocated in national budgets, with an additional 120 billion euros required to close the gigabit ambition gap by 2030 (i.e., over 10,000 euros per remaining rural household in 2023).

So, there is a considerable number of so-called “unconnectable” households within the European Union (i.e., EU28). These are, for example, isolated dwellings away from inhabited areas (e.g., settlements, villages, towns, and cities). They often lack the most basic fixed communications infrastructure, although some may have old copper lines or only relatively poor mobile coverage.

The figure below illustrates the actual state of FTTP deployment for rural households in 2023 (orange bars), as well as a rural deployment scenario that extends FTTP deployment to 2030 using the maximum of the previous year’s deployment level and the average of the last three years’ deployment levels. Any coverage level above 80% is assumed to grow by 1% per annum (an arbitrary choice). The data source is “Digital Decade 2024: Broadband Coverage in Europe 2023” by the European Commission. The FTTP pace has been chosen individually for suburban and rural areas to match the expectations expressed in the reports for 2030.
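Since the projection rule is stated tersely, here is a minimal sketch of one plausible reading of it, interpreting the “deployment level” used for extrapolation as the annual coverage gain; the function name, the 100% cap, and the exact handling of the 80% threshold are my own assumptions, not the code behind the figure:

```python
def project_fttp(coverage, years_ahead):
    """Project rural FTTP coverage (in %) forward. One plausible reading
    of the rule described above: each future year's gain is the larger of
    last year's gain and the average gain over the last three years; any
    level above 80% simply grows by 1 percentage point per year.
    `coverage` needs at least two historical values below the threshold."""
    cov = list(coverage)
    for _ in range(years_ahead):
        if cov[-1] >= 80.0:
            cov.append(min(cov[-1] + 1.0, 100.0))  # arbitrary 1% pa regime
        else:
            gains = [b - a for a, b in zip(cov, cov[1:])]
            step = max(gains[-1], sum(gains[-3:]) / len(gains[-3:]))
            cov.append(min(cov[-1] + step, 100.0))
    return cov

# e.g., four years of observed coverage, projected one year ahead
print(project_fttp([40.0, 44.0, 48.0, 52.0], 1))   # last value: 56.0
```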

ARE LEO DIRECT-TO-DISH (D2D) SATELLITES A CREDIBLE ALTERNATIVE FOR THE “UNCONNECTABLES”?

  • For Europe, a non-EU-based (i.e., US-based) satellite constellation could be a very cost-effective way of closing the gigabit coverage gap.
  • Megabit connectivity (e.g., up to 100+ Mbps) is already available today with SpaceX Starlink LEO satellites in rural areas with poor broadband alternatives.
  • The SpaceX Starlink V2 satellite can provide approximately 100 Gbps (V1.5 ~ 20+ Gbps), and its V3 is expected to deliver 1,000 Gbps within the satellite’s coverage area, with a maximum coverage radius of over 500 km.
  • The V3 may have 320 beams (or more), each providing approximately ~3 Gbps (i.e., 320 x 3 Gbps is ca. 1 Tbps). With a frequency re-use factor of 40, 25 Gbps can be supplied within a unique coverage area. With “adjacent” satellites (off-nadir), the capacity within a unique coverage area can be enhanced by additional beams that overlap the primary satellite (nadir).
  • With an estimated EU28 “unconnectable” household density of approximately 1.5 per square kilometer, a unique coverage area of 15,000 square kilometers would contain more than 20,000 households, sharing the roughly 25 Gbps available within that area.
  • At a peak-hour user concurrency of 15% and a per-user demand of 1 Gbps, the aggregate peak demand would reach 3 terabits per second (Tbps). Against a single 1 Tbps satellite, this implies an oversubscription ratio of approximately 3:1; alternatively, the demand could be served without oversubscription by three overlapping satellites.
  • This assumes a 100% take-up rate of the unconnectable HHs and that each would select a 1 Gbps service (assuming such would be available). In rural areas, the take-up rate may not be significantly higher than 60%, and not all households will require a 1 Gbps service.
  • This also assumes that there are no alternatives to a LEO satellite direct-to-dish service, which seems unlikely for at least some of the 20,000 “unconnectable” households. Given the typical coverage conditions attached to 5G spectrum licenses, one might hope for some decent 5G coverage; alas, it is unlikely to be gigabit in deep rural and isolated areas.
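The bullet-point arithmetic above can be condensed into a few lines. All figures (beam count, per-beam rate, re-use factor, household density) are the public estimates quoted above, not confirmed SpaceX specifications:

```python
# Back-of-envelope check of the V3-class figures in the bullets above
# (beam count, per-beam rate, and re-use factor are the public
# estimates quoted there, not confirmed SpaceX specifications).
beams, gbps_per_beam = 320, 3.0
total_gbps = beams * gbps_per_beam              # 960, i.e., "ca. 1 Tbps"
reuse_factor = 40
area_gbps = total_gbps / reuse_factor           # 24, i.e., ~25 Gbps per unique area

households = 20_000                             # ~1.5 HH/km^2 over ~15,000 km^2
concurrency, per_user_gbps = 0.15, 1.0
peak_demand_gbps = households * concurrency * per_user_gbps   # ~3,000, i.e., 3 Tbps
oversubscription = peak_demand_gbps / 1_000     # ~3:1 against one 1-Tbps satellite
```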

For example, consider the Starlink V1.5 satellite, which has a total capacity of approximately 25 Gbps, comprising 32 beams that deliver 800 Mbps per beam (with dual polarization) to a ground-based user dish. It can provide a maximum of 6.4 Gbps over a minimum area of ca. 6,000 km² at nadir, with an Earth-based dish directly beneath the satellite. If the coverage area is situated in a UK-based rural area, for example, we would expect to find, on average, 150,000 rural households, using an average of 25 rural homes per km². If a household demands 100 Mbps at peak, only about 60 households can be online at full load concurrently per area. With 10% concurrency, this implies a total of 600 households per area out of 150,000 homes. Thus, only 1 in 250 households could be allowed to subscribe to a Starlink V1.5 if the target is 100 Mbps per home and a concurrency factor of 10% within the coverage area. This is equivalent to stating that the oversubscription ratio is 250:1, and it reflects the tension between available satellite capacity and theoretical rural demand density. In rural UK areas, the household density is too high relative to available beam capacity to allow universal subscription at 100 Mbps unless more satellites provide overlapping service.

For a V1.5 satellite, we can support four regions (i.e., frequency reuse groups), each with a maximum throughput of 6.4 Gbps. Thus, the satellite can support a total of 2,400 households (i.e., 4 x 600) with a peak demand of 100 Mbps and a concurrency rate of 10%. As other satellites (off-nadir) can support the primary satellite, some areas’ demand may be served by two to three different satellites, providing a multiplier effect that increases the capacity on offer.

The Starlink V2 satellite is reportedly capable of supporting up to a total of 100 Gbps (approximately four times that of V1.5), while the V3 will support up to 1 Tbps, 40 times that of V1.5. The number of beams, and consequently the number of independent frequency groups, as well as spectral efficiency, are expected to improve over V1.5, all factors that will enhance the overall capacity of the newer Starlink satellite generations.
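For readers who want the V1.5 example as explicit arithmetic, here is a short sketch. It uses unrounded values, whereas the text rounds 64 down to 60 concurrent homes, 640 to 600 subscribers, and ~234:1 up to 250:1:

```python
# The V1.5 example above as explicit arithmetic (figures rounded in
# the text; real beam plans and link budgets will differ).
beams, mbps_per_beam = 32, 800
total_gbps = beams * mbps_per_beam / 1000          # 25.6, "approximately 25 Gbps"
reuse_groups = 4
area_gbps = total_gbps / reuse_groups              # 6.4 Gbps per unique area

households = 25 * 6_000                            # 150,000 homes in ~6,000 km^2
concurrent_homes = area_gbps * 1000 / 100          # ~64 homes at 100 Mbps each
subscribers = concurrent_homes / 0.10              # ~640 at 10% concurrency
oversubscription = households / subscribers        # ~234:1 (text: ~250:1)
satellite_total = reuse_groups * subscribers       # ~2,560 (text: ~2,400)
```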

By 2030, the EU28 rural areas are expected to achieve nearly 85% FTTP coverage under business-as-usual deployment scenarios. This would leave approximately 5.5 million households, referred to as “unconnectables,” without direct access to high-speed fiber. These households are typically isolated, located in sparsely populated or geographically challenging regions where fiber deployment becomes prohibitively expensive. Although there may be alternative broadband options, such as FWA, 5G mobile coverage, or copper, it is unlikely that such “unconnectable” homes could sustainably be given a gigabit connection.

This may be where LEO satellite constellations enter the picture as a possible alternative to deploying fiber-optic cables in uneconomical, “unconnectable” areas. The anticipated capabilities of Starlink’s third-generation (V3) satellites, offering approximately 1 Tbps of total downlink capacity with advanced beamforming and frequency reuse, would make them a viable candidate for serving low-density rural areas, assuming reasonable traffic models similar to those of an Internet Service Provider (ISP). With modest overlapping coverage from two or three such satellites, these systems could deliver gigabit-class service to tens of thousands of dispersed households without (much) oversubscription, even assuming relatively high concurrency and usage.

Considering this, there seems little doubt that an LEO constellation, just slightly more capable than SpaceX’s Starlink V3 satellite, appears to be able to fully support the broadband needs of the remaining unconnected European households expected by 2030. This also aligns well with the technical and economic strengths of LEO satellites: they are ideally suited for delivering high-capacity service to regions where population density is too low to justify terrestrial infrastructure, yet digital inclusion remains equally essential.

LOW-EARTH ORBIT SATELLITES DIRECT-TO-DISRUPTION.

In my blog “Will LEO Satellite Direct-to-Cell Networks make Terrestrial Networks Obsolete?”, I provided some straightforward reasons why LEO satellites with direct-to-unmodified-smartphone capabilities (e.g., Lynk Global, AST SpaceMobile) will not make existing cellular networks obsolete. They are of most value in remote or very rural areas with no cellular coverage (as explained very nicely by Lynk Global), offering a connection alternative to satellite phones such as Iridium, and are thus complementary to existing terrestrial cellular networks. Despite the hype, we should not expect a direct disruption of regular terrestrial cellular networks by LEO satellite D2C providers.

Of course, the question could also be asked whether LEO satellites direct to an outdoor (terrestrial) dish could threaten existing fiber-optic networks’ business case and value proposition. After all, the SpaceX Starlink V3 satellite, not yet operational, is expected to support 1 Terabit per second (Tbps) over a coverage area several thousand kilometers in diameter. It would no doubt be an amazing technological achievement for SpaceX to deliver a 10x leap in throughput from its present generation V2 (~100 Gbps).

However, while a V3-like satellite may offer an (impressive) total capacity of 1 Tbps, this capacity is not uniformly available across its entire footprint. It is distributed across multiple beams, potentially 256 or more, each with a bandwidth of approximately 4 Gbps (i.e., 1 Tbps / 256 beams). With a frequency reuse factor of, for example, 5, the effective usable capacity per unique coverage area becomes a fraction of the satellite’s total throughput. This means that within any given beam footprint, the satellite can only support a limited number of concurrent users at high bandwidth levels.

As a result, such a satellite cannot support more than roughly a thousand households with concurrent 1 Gbps demand in any single area (or, alternatively, about 10,000 households with 100 Mbps concurrent demand). This level of support would be equivalent to a small FTTP (sub)network serving no more than 20,000 households at a 50% uptake rate (i.e., 10,000 connected homes) and assuming a concurrency of 10%. A deployment of this scale would typically be confined to a localized, dense urban or peri-urban area, rather than the vast rural regions that LEO systems are expected to serve.
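The FTTP equivalence above follows directly from the stated assumptions:

```python
# The FTTP-equivalence claim above, under its stated assumptions
# (1 Tbps total capacity, 50% uptake, 10% concurrency at 1 Gbps).
satellite_tbps = 1.0
concurrent_1gbps_homes = satellite_tbps * 1000      # ~1,000 homes at 1 Gbps each
uptake, concurrency = 0.50, 0.10
homes_passed = round(concurrent_1gbps_homes / (uptake * concurrency))
print(homes_passed)   # 20000 -> the "no more than 20,000 households" above
```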

In contrast, a single Starlink V3-like satellite would cover a vast region, capable of supporting similar or greater numbers of users, including those in remote, low-density areas that FTTP cannot economically reach. The satellite solution described here is thus not designed to densify urban broadband, but rather to reach rural, remote, and low-density areas where laying fiber is logistically or economically impractical. Therefore, such satellites and conventional large-scale fiber networks are not in direct competition: satellites cannot match fiber’s density, scale, or cost-efficiency in high-demand areas. Instead, they complement fiber infrastructure and reinforce the case for hybrid infrastructure strategies, in which fiber serves the dense core and LEO satellites extend the digital frontier.

However, terrestrial providers must closely monitor their FTTP deployment economics and refrain from extending too far into deep rural areas beyond a certain household-density threshold, one that is likely to rise over time as satellite capabilities improve. The premise of this blog is that capable LEO satellites could, by 2030, serve unconnected households that are unlikely to have any commercial viability for terrestrial fiber and have no other gigabit coverage option. Within the EU28, this represents approximately 5.5 million remote households. A Starlink V3-like 1 Tbps satellite could provide a gigabit service (occasionally) to those households, and certainly hundreds of megabits per second per isolated household. Moreover, it is likely that more capable satellites will be launched over time, with SpaceX the most likely candidate for such an endeavor if it maintains its current pace of innovation. Such satellites will likely become increasingly interesting for household densities above 2 households per square kilometer. However, where an FTTP network has already been deployed, it seems unlikely that satellite broadband would render the terrestrial infrastructure obsolete, as long as the fiber service is priced competitively against the satellite alternative.

LEO satellite direct-to-dish (D2D) based broadband networks may be a credible and economical alternative to deploying fiber in low-density rural households. The density boundary of viable substitution for a fiber connection with a gigabit satellite D2D connection may shift inward (from deep rural, low-density household areas). This reinforces the case for hybrid infrastructure strategies, in which fiber serves the denser regions and LEO satellites extend the digital frontier to remote and rural areas.

THE USUAL SUSPECT – THE PUN INTENDED.

By 2030, SpaceX’s Starlink will operate one of the world’s most extensive low Earth orbit (LEO) satellite constellations. As of early 2025, the company has launched more than 6,000 satellites into orbit; however, most of these, including V1, V1.5, and V2, are expected to cease operation by 2030. Industry estimates suggest that Starlink could have between 15,000 and 20,000 operational satellites by the end of the decade, which I anticipate to be mainly V3 and possibly a share of V4. This projection depends largely on successfully scaling SpaceX’s Starship launch vehicle, which is designed to deploy 60 or more next-generation V3 satellites per mission at the current cadence. However, it is essential to note that while SpaceX has filed applications with the International Telecommunication Union (ITU) and obtained FCC authorization for up to 12,000 satellites, the frequently cited figure of 42,000 satellites includes additional satellites that are currently proposed but not yet fully authorized.

The figure above, based on an idea by John Strand of Strand Consult, provides an illustrative comparison of the rapid innovation and manufacturing cycles of SpaceX LEO satellites versus the slower progression of traditional satellite development and spectrum policy processes, highlighting the growing gap between technological advancement and regulatory adaptation. This is one of the biggest challenges that regulatory institutions and policy regimes face today.

Amazon’s Project Kuiper has a much smaller planned constellation. The Federal Communications Commission (FCC) has authorized Amazon to deploy 3,236 satellites under its initial phase, with a deadline requiring that at least 1,600 be launched and operational by July 2026. Amazon began launching test satellites in 2024 and aims to roll out its service in late 2025 or early 2026. On April 28, 2025, Amazon launched its first 27 operational Project Kuiper satellites aboard a United Launch Alliance (ULA) Atlas V rocket from Cape Canaveral, Florida, marking the start of the deployment of its planned 3,236-satellite constellation aimed at providing global broadband internet coverage. Though Amazon has hinted at potential expansion beyond its authorized count, any Phase 2 remains speculative and unapproved. If such an expansion were pursued and granted, the constellation could eventually grow to 6,000 satellites, although no formal filings have yet been made to support the higher number.

China is rapidly advancing its low Earth orbit (LEO) satellite capabilities, positioning itself as a formidable competitor to SpaceX’s Starlink by 2030. Two major Chinese LEO satellite programs are at the forefront of this effort: the Guowang (13,000) and Qianfan (15,000) constellations. So, by 2030, it is reasonable to expect that China will field a national LEO satellite broadband system with thousands of operational satellites, focused not just on domestic coverage but also on extending strategic connectivity to Belt and Road Initiative (BRI) countries, as well as regions in Africa, Asia, and South America. Unlike SpaceX’s commercially driven approach, China’s system is likely to be closely integrated with state objectives, combining broadband access with surveillance, positioning, and secure communication functionality. While it remains unclear whether China will match SpaceX’s pace of deployment or technological performance by 2030, its LEO ambitions are unequivocally driven by geopolitical considerations. They will likely shape European spectrum policy and infrastructure resilience planning in the years ahead. Guowang and Qianfan are emblematic of China’s dual-use strategy, which involves developing technologies for both civilian and military applications. This approach is part of China’s broader Military-Civil Fusion policy, which seeks to integrate civilian technological advancements into military capabilities. The dual-use nature of these satellite constellations raises concerns about their potential military applications, including surveillance and communication support for the People’s Liberation Army.

AN ILLUSTRATION OF COVERAGE – UNITED KINGDOM.

It takes approximately 172 Starlink beams to cover the United Kingdom, with 8 to 10 satellites overhead simultaneously. Persistent UK coverage requires a constellation on the order of 150 satellites across appropriate orbits. Starlink’s 53° inclination orbital shell is optimized for mid-latitude regions, providing frequent satellite passes and dense, overlapping beam coverage over areas like southern England and central Europe. This results in higher throughput and more consistent connectivity with fewer satellites. In contrast, regions north of 53°N, such as northern England and Scotland, lie outside this optimal zone and depend on higher-inclination shells (70° and 97.6°), which have fewer satellites and wider, less efficient beams. As a result, coverage in these northern areas is less dense, with lower signal quality and increased latency.
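As a rough cross-check of the ~172-beam figure, one can tile the UK’s land area (~244,000 km²) with circular beam footprints. The ~21 km effective beam radius used here is an illustrative assumption, not a Starlink specification, and the simple area division ignores overlap and coastline effects:

```python
import math

# Rough cross-check of the "~172 beams to cover the UK" figure by
# dividing the UK's land area by an assumed effective beam footprint.
uk_area_km2 = 244_000          # approximate UK land area
beam_radius_km = 21.0          # illustrative effective beam radius (assumption)
beams_needed = math.ceil(uk_area_km2 / (math.pi * beam_radius_km**2))
print(beams_needed)            # 177, in the ballpark of the ~172 quoted above
```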

For this blog, I developed a Python script, with fewer than 600 lines of code (it’s a physicist’s code, so unlikely to be super efficient), to simulate and analyze Starlink’s satellite coverage and throughput over the United Kingdom using real orbital data. By integrating satellite propagation, beam modeling, and geographic visualization, it enables a detailed assessment of regional performance from current Starlink deployments across multiple orbital shells. Its primary purpose is to assess how the currently deployed Starlink constellation performs over UK territory by modeling where satellites pass, how their beams are steered, and how often any given area receives coverage.

The simulation draws live TLE (Two-Line Element) data from Celestrak, a well-established source for satellite orbital elements. Using the Skyfield library, the code propagates the positions of active Starlink satellites over a 72-hour period, sampling every 5 minutes to track their subpoints across the United Kingdom. There is no hard limit on the simulation duration or sampling interval. Choosing a more extended simulation period, such as 72 hours, provides a more statistically robust and temporally representative view of satellite coverage by averaging out orbital phasing artifacts and short-term gaps. It ensures that all satellites complete multiple orbits, allowing for more uniform sampling of ground tracks and beam coverage, especially from shells with lower satellite densities, such as the 70° and 97.6° inclinations. This results in smoother, more realistic estimates of average signal density and throughput across the entire region.
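The heatmap accumulation at the heart of such a simulation can be sketched in a few lines of plain Python. This toy version counts how often each grid cell falls inside a sampled beam footprint; the grid bounds, cell size, and effective beam radius are all illustrative assumptions, not the values used in my actual script:

```python
import math

def coverage_grid(subpoints, lat0=49.5, lat1=61.0, lon0=-8.5, lon1=2.0,
                  step=0.25, beam_radius_km=24.0):
    """Toy version of the heatmap accumulation described above: count,
    per grid cell, how many sampled satellite subpoints place the cell
    centre inside an assumed circular beam footprint. Grid bounds and
    beam radius are illustrative assumptions, not Starlink parameters."""
    nlat, nlon = int((lat1 - lat0) / step), int((lon1 - lon0) / step)
    grid = [[0] * nlon for _ in range(nlat)]
    for slat, slon in subpoints:
        for i in range(nlat):
            glat = lat0 + (i + 0.5) * step
            for j in range(nlon):
                glon = lon0 + (j + 0.5) * step
                # flat-Earth distance approximation, adequate at beam scale
                dx = (glon - slon) * 111.32 * math.cos(math.radians(glat))
                dy = (glat - slat) * 110.57
                if math.hypot(dx, dy) <= beam_radius_km:
                    grid[i][j] += 1
    return grid
```

In the real script, the `subpoints` stream would come from Skyfield-propagated TLE positions sampled every 5 minutes over 72 hours; averaging the counts over the number of samples yields the time-averaged coverage density shown in the figures.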

Each satellite is classified into one of three orbital shells based on inclination angle: 53°, 70°, and 97.6°. These shells are simulated separately and collectively to understand their individual and combined contributions to UK coverage. The 53° shell dominates service in the southern part of the UK, characterized by its tight orbital band and high satellite density (see the Table below). The 70° shell supplements coverage in northern regions, while the 97.6° polar shell offers sparse but critical high-latitude support, particularly in Scotland and surrounding waters. The simulation assumes several (critical) parameters for each satellite type, including the number of beams per satellite, the average beam radius, and the estimated throughput per beam. These assumptions reflect engineering estimates and publicly available Starlink performance information, but are deliberately simplified to produce regional-level coverage and throughput estimates, rather than user-specific predictions. The simulation does not account for actual user terminal distribution, congestion, or inter-satellite link (ISL) performance, focusing instead on geographic signal and capacity potential.

These parameters were used to infer beam footprints and assign realistic signal density and throughput values across the UK landmass. The satellite type was inferred from its shell (e.g., most 53° shell satellites are currently V1.5), and beam properties were adjusted accordingly.

The table above presents the core beam-modeling parameters and satellite-specific assumptions used in the Starlink simulation over the United Kingdom. It includes general values for beam-steering behavior, such as Gaussian spread, steering limits, city-targeting probabilities, and beam-spacing constraints, as well as performance characteristics tied to specific satellite generations, to the extent they are known (e.g., Starlink V1.5, V2 Mini, and V2 Full). These assumptions govern the placement of beams on the Earth’s surface and the capacity each beam can deliver. For instance, the City Exclusion Radius of 0.25 degrees corresponds to a ~25 km buffer around urban centers, where beam placement is probabilistically discouraged. Similarly, the beam radius and throughput-per-beam values align with known design specifications submitted by SpaceX to the U.S. Federal Communications Commission (FCC), particularly for Starlink’s V1.5 and V2 satellites. The table also defines overlap rules, specifying the maximum number of beams that can overlap in a region and the maximum number of satellites that can contribute beams to a given point. This helps ensure that simulations reflect realistic network constraints rather than theoretical maxima.

Overall, the developed code offers a geographically and physically grounded simulation of how the existing Starlink network performs over the UK. It helps explain observed disparities in coverage and throughput by visualizing the contribution of each shell and satellite generation. This modeling approach enables planners and researchers to quantify satellite coverage performance at national and regional scales, providing insight into current service levels as well as a basis for studying future constellation evolution, which is not discussed here.

The figure illustrates a 72-hour time-averaged Starlink coverage density over the UK. The asymmetric signal strength pattern reflects the orbital geometry of Starlink’s 53° inclination shell, which concentrates satellite coverage over southern and central England. Northern areas receive less frequent coverage due to fewer satellite passes and reduced beam density at higher latitudes.

The image above presents the Starlink Average Coverage Density over the United Kingdom, the result of a 72-hour simulation using real satellite orbital data from Celestrak. It illustrates the mean signal exposure across the UK, where color intensity reflects the frequency and density of satellite beam illumination at each location.

At the center of the image, a bright yellow core indicating the highest signal strength is clearly visible over the English Midlands, covering cities such as Birmingham, Leicester, and Bristol. The signal strength gradually declines outward in a concentric pattern—from orange to purple—as one moves northward into Scotland, west toward Northern Ireland, or eastward along the English coast. While southern cities, such as London, Southampton, and Plymouth, fall within high-coverage zones, northern cities, including Glasgow and Edinburgh, lie in significantly weaker regions. The decline in signal intensity is especially apparent beyond 56°N latitude. This pattern is entirely consistent with what we know about the structure of the Starlink satellite constellation. The dominant contributor to coverage in this region is the 53° inclination shell, which contains 3,848 satellites spread across 36 orbital planes. This shell is designed to provide dense, continuous coverage to heavily populated mid-latitude regions, such as the southern United Kingdom, continental Europe, and the continental United States. However, its orbital geometry restricts it to a latitudinal range that ends near 53 to 54°N. As a result, southern and central England benefit from frequent satellite passes and tightly packed overlapping beams, while the northern parts of the UK do not. In particular, Scotland lies at or beyond the shell’s effective coverage boundary.

The simulation may indicate how Starlink’s design prioritizes population density and market reach. Northern England receives only partial benefit, while Scotland and Northern Ireland fall almost entirely outside the core coverage of the 53° shell. Although some coverage in these areas is provided by higher-inclination shells (specifically, the 70° shell with 420 satellites and the 97.6° polar shell with 227 satellites), these are sparser in both the number of satellites and orbital planes. Their beams may also be broader and thus less focused, resulting in lower average signal strength in high-latitude regions.

So, why is the coverage not textbook-nice hexagonal cells with uniform coverage across the UK? The simple answer is that real-world satellite constellations don’t behave like the static, idealized diagrams of hexagonal beam tiling often used in textbooks or promotional materials. What you’re seeing in the image is a time-averaged simulation of Starlink’s actual coverage over the UK, reflecting the dynamic and complex nature of low Earth orbit (LEO) systems like Starlink’s.

Unlike geostationary satellites, LEO satellites orbit the Earth roughly every 90 minutes and move rapidly across the sky. Each satellite only covers a specific area for a short period before passing out of view over the horizon. This movement causes beam coverage to constantly shift, meaning that any given spot on the ground is covered by different satellites at different times. While individual satellites may emit beams arranged in a roughly hexagonal pattern, these patterns move, rotate, and deform continuously as the satellite passes overhead. The beams also vary in shape and strength depending on their angle relative to the Earth’s surface, becoming elongated and weaker when projected off-nadir, i.e., when the satellite is not directly overhead.

Another key reason lies in the structure of Starlink’s orbital configuration. Most of the UK’s coverage comes from satellites in the 53° inclination shell, which is optimized for mid-latitude regions. As a result, southern England receives significantly denser and more frequent coverage than Scotland or Northern Ireland, which are closer to or beyond the edge of this shell’s optimal zone. Satellites serving higher latitudes originate from less densely populated orbital shells at 70° and 97.6°, which result in fewer passes and wider, less efficient beams.
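The “roughly every 90 minutes” figure follows from Kepler’s third law for a Starlink-like circular orbit at ~550 km altitude (spherical-Earth approximation):

```python
import math

# Sanity check on the "roughly every 90 minutes" orbital period above,
# for a Starlink-like circular orbit at ~550 km altitude.
MU = 3.986004418e14       # Earth's gravitational parameter, m^3 s^-2
R_EARTH = 6_371e3         # mean Earth radius, m
a = R_EARTH + 550e3       # semi-major axis of the circular orbit, m
period_min = 2 * math.pi * math.sqrt(a**3 / MU) / 60
print(round(period_min, 1))   # 95.5 -> "roughly every 90 minutes"
```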

The above heatmap does not illustrate a snapshot of beam locations at a specific time, but rather an averaged representation of how often each part of the UK was covered over a simulation period. This type of averaging smooths out the moment-to-moment beam structure, revealing broader patterns of coverage density instead. That’s why we see a soft gradient from intense yellow in the Midlands, where overlapping beams pass more frequently, to deep purple in northern regions, where passes are less common and less centered.

The figure illustrates an idealized hexagonal beam coverage footprint over the UK. For visual clarity, only a subset of hexagons is shown filled with signal intensity (yellow core to purple edge), to illustrate a textbook-like uniform tiling. In reality, satellite beams from LEO constellations, such as Starlink, are dynamic, moving, and often non-uniform due to orbital motion, beam steering, and geographic coverage constraints.

The two charts below provide a visual confirmation of the spatial coverage dynamics behind the Starlink signal strength distribution over the United Kingdom. Both are based on a 72-hour simulation using real Starlink satellite data obtained from Celestrak, and they accurately reflect the operational beam footprints and orbital tracks of currently active satellites over the United Kingdom.

This figure illustrates time-averaged Starlink coverage density over the UK with beam footprints (left) and satellite ground tracks (right) by orbital shell. The high density of beams and tracks from the 53° shell over southern UK leads to stronger and more consistent coverage. At the same time, northern regions receive fewer, more widely spaced passes from higher-inclination shells (70° and 97.6°), resulting in lower aggregate signal strength.

The first chart displays the beam footprints (i.e., the left side chart above) of Starlink satellites across the UK, color-coded by orbital shell: cyan for the 53° shell, green for the 70° shell, and magenta for the 97° polar shell. The concentration of cyan beam circles in southern and central England vividly demonstrates the dominance of the 53° shell in this region. These beams are tightly packed and frequent, explaining the high signal coverage in the earlier signal strength heatmap. In contrast, northern England and Scotland are primarily served by green and magenta beams, which are more sparse and cover larger areas — a clear indication of the lower beam density from the higher-inclination shells.

The second chart illustrates the satellite ground tracks (i.e., the right side chart above) over the same period and geographic area. Again, the saturation of cyan lines in the southern UK underscores the intensive pass frequency of satellites in the 53° inclined shell. As one moves north of approximately 53°N, these tracks vanish almost entirely, and only the green (70° shell) and magenta (97° shell) paths remain. These higher inclination tracks cross through Scotland and Northern Ireland, but with less spatial and temporal density, which supports the observed decline in average signal strength in those areas.

Together, these two charts provide spatial and orbital validation of the signal strength results. They confirm that the stronger signal levels seen in southern England stem directly from the concentrated beam targeting and denser satellite presence of the 53° shell. Meanwhile, the higher-latitude regions rely on less saturated shells, resulting in lower signal availability and throughput. This outcome is not theoretical — it reflects the live state of the Starlink constellation today.

The figure illustrates the estimated average Starlink throughput across the United Kingdom over a 72-hour window. Throughput is highest over southern and central England due to dense satellite traffic from the 53° orbital shell, which provides overlapping beam coverage and short revisit times. Northern regions experience reduced throughput from sparser satellite passes and less concentrated beam coverage.

The above chart shows the estimated average throughput of Starlink Direct-2-Dish across the United Kingdom, simulated over 72 hours using real orbital data from Celestrak. The values are expressed in Megabits per second (Mbps) and are presented as a heatmap, where higher-throughput regions are shown in yellow and green, and lower values fade into blue and purple. The simulation incorporates actual satellite positions and coverage behavior from the three operational inclination shells currently providing Starlink service to the UK. Consistent with the signal strength, beam footprint density, and orbital track density, the best quality and the most supplied capacity are available south of approximately 53°N.

The strongest throughput is concentrated in a horizontal band stretching from Birmingham through London to the southeast, as well as westward into Bristol and south Wales. In this region, the estimated average throughput peaks at over 3,000 Mbps, which can support more than 30 concurrent customers each demanding 100 Mbps within the coverage area or up to 600 households with an oversubscription rate of 1 to 20. This aligns closely with the signal strength and beam density maps also generated in this simulation and is driven by the dense satellite traffic of the 53° inclination shell. These satellites pass frequently over southern and central England, where their beams overlap tightly and revisit times are short. The availability of multiple beams from different satellites at nearly all times drives up the aggregate throughput experienced at ground level. Throughput falls off sharply beyond approximately 54°N. In Scotland and Northern Ireland, values typically stay well below 1,000 Mbps. This reduction directly reflects the sparser presence of higher-latitude satellites from the 70° and 97.6° shells, which are fewer in number and more widely spaced, resulting in lower revisit frequencies and broader, less concentrated beams. The throughput map thus offers a performance-level confirmation of the underlying orbital dynamics and coverage limitations seen in the satellite and beam footprint charts.
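The concurrency arithmetic above can be captured in a tiny helper (the function name and defaults are mine, for illustration only):

```python
def concurrency_budget(area_throughput_mbps, per_user_mbps=100.0,
                       oversubscription=20.0):
    """Translate an area's average throughput into concurrent users at a
    given per-user rate, and into households at an oversubscription rate."""
    concurrent_users = int(area_throughput_mbps // per_user_mbps)
    households = int(concurrent_users * oversubscription)
    return concurrent_users, households

# Southern England band: ~3,000 Mbps time-averaged area throughput.
print(concurrency_budget(3000))  # -> (30, 600)
```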

While the above map estimates throughput in realistic terms, it is essential to understand why it does not reflect the theoretical maximum performance implied by Starlink’s physical layer capabilities. For example, a Starlink V1.5 satellite supports eight user downlink channels, each with 250 MHz of bandwidth, which in theory amounts to a total of 2 GHz of spectrum. Similarly, if one assumes 24 beams, each capable of delivering 800 Mbps, that would suggest a satellite capacity in the range of approximately 19–20 Gbps. However, these peak figures assume an ideal case with full spectrum reuse and optimized traffic shaping. In practice, the estimated average throughput shown here is the result of modeling real beam overlap and steering constraints, satellite pass timing, ground coverage limits, and the fact that not all beams are always active or directed toward the same location. Moreover, local beam capacity is shared among users and dynamically managed by the constellation. Therefore, the chart reflects a realistic, time-weighted throughput for a given geographic location, not a per-satellite or per-user maximum. It captures the outcome of many beams intermittently contributing to service across 72 hours, modulated by orbital density and beam placement strategy, rather than theoretical peak link rates.
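The peak figures quoted above are straightforward to reproduce as plain arithmetic, based on the publicly cited V1.5-class numbers (the beam count and per-beam rate are the same assumptions as in the text):

```python
# Publicly cited Starlink V1.5-class figures, reproduced as arithmetic.
user_channels = 8              # Ku-band user downlink channels
channel_bw_mhz = 250.0         # MHz per channel
total_spectrum_ghz = user_channels * channel_bw_mhz / 1000.0  # 2.0 GHz

beams = 24                     # assumed beam count
beam_rate_mbps = 800.0         # assumed per-beam rate
peak_capacity_gbps = beams * beam_rate_mbps / 1000.0          # 19.2 Gbps

print(total_spectrum_ghz, peak_capacity_gbps)  # -> 2.0 19.2
```

The gap between this ~19–20 Gbps ceiling and the time-averaged map is exactly the effect of beam steering, pass timing, and sharing described above.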

A valuable next step in advancing the simulation model would be the integration of empirical user experience data across the UK footprint. If datasets such as comprehensive Ookla performance measurements (e.g., Starlink-specific download and upload speeds, latency, and jitter) were available with sufficient geographic granularity, the current Python model could be calibrated and validated against real-world conditions. Such data would enable the adjustment of beam throughput assumptions, satellite visibility estimates, and regional weighting factors to better reflect the actual service quality experienced by users. This would enhance the model’s predictive power, not only in representing average signal and throughput coverage, but also in identifying potential bottlenecks, underserved areas, or mismatches between orbital density and demand.

It is also important to note that this work relies on a set of simplified heuristics for beam steering, which are designed to make the simulation both tractable and transparent. In this model, beams are steered within a fixed angular distance from each satellite’s subpoint, with probabilistic biases against cities and simple exclusion zones (i.e., I operate with an exclusion radius of approximately 25 km or more). However, in reality, Starlink’s beam steering logic is expected to be substantially more advanced, employing dynamic optimization algorithms that account for real-time demand, user terminal locations, traffic load balancing, and satellite-satellite coordination via laser interlinks. Starlink has the crucial (and obvious) operational advantage of knowing exactly where its customers are, allowing it to direct capacity where it is needed most, avoid congestion (to an extent), and dynamically adapt coverage strategies. This level of real-time awareness and adaptive control is not replicated in this analysis, which assumes no knowledge of actual user distribution and treats all geographic areas equally.
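As a rough sketch of the kind of heuristic described, and emphatically not Starlink’s actual steering logic, the snippet below picks a beam target within a fixed ground distance of the satellite subpoint while honoring a simple city exclusion radius. All names, defaults, and the flat-earth offset approximation are my assumptions:

```python
import math
import random

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2)
         * math.sin(math.radians(lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def steer_beam(sub_lat, sub_lon, cities, max_offset_km=400.0,
               exclusion_km=25.0, tries=50, rng=None):
    """Pick a beam target within max_offset_km of the satellite subpoint,
    rejecting candidates inside the city exclusion radius. Returns
    (lat, lon), or None if no admissible target was found."""
    rng = rng or random.Random(42)
    for _ in range(tries):
        bearing = rng.uniform(0.0, 2.0 * math.pi)
        dist = rng.uniform(0.0, max_offset_km)
        # Flat-earth offset: ~111 km per degree of latitude.
        lat = sub_lat + (dist / 111.0) * math.cos(bearing)
        lon = sub_lon + (dist / (111.0 * math.cos(math.radians(sub_lat)))) * math.sin(bearing)
        if all(haversine_km(lat, lon, clat, clon) > exclusion_km
               for clat, clon in cities):
            return lat, lon
    return None
```

A real scheduler would replace the rejection sampling with demand-weighted optimization, but this captures the "steer near the subpoint, avoid exclusion zones" behavior the model uses.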

As such, the current Python code provides a first-order geographic approximation of Starlink coverage and capacity potential, not a reflection of the full complexity and intelligence of SpaceX’s actual network management. Nonetheless, it offers a valuable structural framework that, if calibrated with empirical data, could evolve into a much more powerful tool for performance prediction and service planning.

Median Starlink download speeds in the United Kingdom, as reported by Ookla, from Q4 2022 to Q4 2024, indicate a general decline through 2023 and early 2024, followed by a slight recovery in late 2024. Source: Ookla.com.

The decline in real-world median user speeds, observed in the chart above, particularly from Q4 2023 to Q3 2024, may reflect increasing congestion and uneven coverage relative to demand, especially in areas outside the dense beam zones of the 53° inclination shell. This trend supports the simulation’s findings: while orbital geometry enables strong average coverage in the southern UK, northern regions rely on less frequent satellite passes from higher-inclination shells, which limits performance. The recovery of the median speed in Q4 2024 could be indicative of new satellite deployments (e.g., more V2 Minis or V2 Fulls) beginning to ease capacity constraints, something future simulation extensions could model by incorporating launch timelines and constellation updates.

Illustrates a European-based dual-use Low Earth Orbit (LEO) satellite constellation providing broadband connectivity to Europe’s millions of unconnectables by 2030 on a secure and strategic infrastructure platform covering Europe, North Africa, and the Middle East.

THE 200 BILLION EUROS QUESTION – IS THERE A PATH TO EUROPEAN SPACE INDEPENDENCE?

Let’s start with the answer! Yes!

Is €200 billion, the estimated amount required to close the EU28 gigabit gap between 2023 and 2030, likely to enable Europe to build its own LEO satellite constellation, and potentially one that is more secure, inclusive, and strategically aligned with its values and geopolitical objectives? For comparison, the European Union’s IRIS² (Infrastructure for Resilience, Interconnectivity and Security by Satellite) program has been allocated a total budget of more than €10 billion to build 264 LEO satellites (at 1,200 km) and 18 MEO satellites (at 8,000 km), mainly through the European “Primes” (i.e., the usual “suspects” among legacy defense contractors), by 2030. For that amount, we should even be able to afford a dedicated European stratospheric drone program for real-world use cases, as opposed to, for example, Airbus’s (AALTO) fragile Zephyr platform, which, in my opinion, is more an impressive showcase of an innovative, sustainable (solar-driven) aerial platform than a practical, robust, high-performance communications platform.

A significant portion of this budget should be dedicated to designing, manufacturing, and launching a European-based satellite constellation. If Europe could match SpaceX’s satellite unit cost, rather than that of IRIS² (whose unit price tag appears rooted in legacy satellite platform thinking), it could launch a very substantial number of EU-based LEO satellites within 200 billion euros (and obviously also for a lot less). That would easily match SpaceX’s long-term plans and vastly surpass the satellites authorized under Kuiper’s first phase. To support such a constellation, Europe must invest heavily in launch infrastructure. Ariane 6 could be leveraged to scale up the Ariane program, or Europe could develop a reusable launch system, mirroring and improving upon the capabilities of SpaceX’s Starship. This would reduce long-term launch costs, boost autonomy, and ensure deployment scalability over the decade. Equally essential would be a robust ground segment: a European-wide ground station network, edge nodes, optical interconnects, and satellite laser communication capabilities.
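To make the unit economics concrete, here is a deliberately crude budget sketch. Every parameter, the per-satellite cost, the launch share, and the ground-segment/operations share, is an assumption for illustration, not a disclosed SpaceX or IRIS² figure:

```python
def affordable_satellites(budget_eur_bn, cost_per_sat_eur_m,
                          launch_share=0.3, ground_and_ops_share=0.3):
    """Satellites fundable from a budget after setting aside assumed shares
    for launch and for the ground segment / operations. Illustrative only."""
    sat_budget_eur_bn = budget_eur_bn * (1.0 - launch_share - ground_and_ops_share)
    return round(sat_budget_eur_bn * 1000.0 / cost_per_sat_eur_m)

# Assumed: 40% of a €200bn budget on satellites, at an all-in €2M per unit.
print(affordable_satellites(200.0, cost_per_sat_eur_m=2.0))  # -> 40000
```

The point of the sketch is sensitivity, not precision: at a legacy-style €20M per unit the same budget buys a tenth as many satellites, which is the difference between a Starlink-class constellation and an IRIS²-class one.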

Unlike Starlink, which benefits from SpaceX’s vertical integration, and Kuiper, which is backed by Amazon’s capital and logistics empire, a European initiative would rely heavily on strong multinational coordination. With 200 billion euros, and possibly less if the usual suspects (i.e., the “Primes”) are managed accordingly, Europe could close the technology gap rapidly, secure digital sovereignty, and ensure that it is not dependent on foreign providers for critical broadband infrastructure, particularly for rural areas, government services, and defense.

Could this be done by 2030? Doubtful, unless Europe can match SpaceX’s impressive pace of innovation. That is at least to match the 3 years (2015–2018) it took SpaceX to achieve a fully reusable Falcon 9 system and the 4 years (2015–2019) it took to go from concept to the first operational V1 satellite launch. Elon has shown it is possible.

KEY TAKEAWAYS.

LEO satellite direct-to-dish broadband, when strategically deployed in underserved and hard-to-reach areas, should be seen not as a competitor to terrestrial networks but as a strategic complement. It provides a practical, scalable, and cost-effective means to close the final connectivity gap, one that terrestrial networks alone are unlikely to bridge economically. In sparsely populated rural zones, where fiber deployment becomes prohibitively expensive, LEO satellites may make new rollouts unnecessary. In these cases, satellite broadband is not just an alternative. It may be essential. Moreover, it can also serve as a resilient backup in areas where rural fiber is already deployed, especially in regions lacking physical network redundancy. Rather than undermining terrestrial infrastructure, LEO extends its reach, reinforcing the case for hybrid connectivity models central to achieving EU-wide digital reach by 2030.

Instead of continuing to subsidize costly last-mile fiber in uneconomical areas, European policy should reallocate a portion of this funding toward the development of a sovereign European Low-Earth Orbit (LEO) satellite constellation. A mere 200 billion euros, or even less, would go a very long way in securing such a program. Such an investment would not only connect the remaining “unconnectables” more efficiently but also strengthen Europe’s digital sovereignty, infrastructure resilience, and strategic autonomy. A European LEO system should support dual-use applications, serving both civilian broadband access and the European defense architecture, thereby enhancing secure communications, redundancy, and situational awareness in remote regions. In a hybrid connectivity model, satellite broadband plays a dual role: as a primary solution in hard-to-reach zones and as a high-availability backup where terrestrial access exists, reinforcing a layered, future-proof infrastructure aligned with the EU’s 2030 Digital Decade objectives and evolving security imperatives.

Non-European dependence poses strategic trade-offs: The rise of LEO broadband providers such as SpaceX’s Starlink and China’s state-aligned Guowang and Qianfan underscores Europe’s limited indigenous capacity in the Low Earth Orbit (LEO) space. While non-EU options may offer faster and cheaper rural connectivity, reliance on foreign infrastructure raises concerns about sovereignty, data governance, and security, especially amid growing geopolitical tensions.

LEO satellites, especially those similar to or more capable than Starlink V3, can technically support the connectivity needs of Europe’s 2030 “unconnectable” (rural) households. Due to geography or economic constraints, these homes are unlikely to be reached by FTTP even under the most ambitious business-as-usual scenarios. A constellation of high-capacity satellites could serve these households with gigabit-class connections, especially when factoring in user concurrency and reasonable uptake rates.

The economics of FTTP deployment sharply deteriorate in very low-density rural regions, reinforcing the need for alternative technologies. By 2030, up to 5.5 million EU28 households are projected to remain beyond the economic viability of FTTP, down from 15.5 million rural homes in 2023. The European Commission has estimated that closing the gigabit gap from 2023 to 2030 requires around €200 billion. LEO satellite broadband may be a more cost-effective alternative, particularly with direct-to-dish architecture, at least for the share of unconnectable homes.

While LEO satellite networks offer transformative potential for deep rural coverage, they do not pose a threat to existing FTTP deployments. A Starlink V3 satellite, despite its 1 Tbps capacity, can serve the equivalent of a small fiber network, about 1,000 homes at 1 Gbps under full concurrency, or roughly 20,000 homes with 50% uptake and 10% busy-hour concurrency. FTTP remains significantly more efficient and scalable in denser areas. Satellites are not designed to compete with fiber in urban or suburban regions, but rather to complement it in places where fiber is uneconomical or otherwise unviable.
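The V3 arithmetic above is easy to verify (the function name and parameter layout are mine; the capacity, uptake, and concurrency figures are those quoted in the text):

```python
def homes_served(sat_capacity_gbps=1000.0, rate_gbps=1.0,
                 uptake=0.5, busy_hour_concurrency=0.1):
    """Homes a satellite can serve at full concurrency, and homes passed
    once uptake and busy-hour concurrency are factored in."""
    full_concurrency = round(sat_capacity_gbps / rate_gbps)
    homes_passed = round(sat_capacity_gbps
                         / (rate_gbps * uptake * busy_hour_concurrency))
    return full_concurrency, homes_passed

# Starlink V3 at ~1 Tbps and 1 Gbps per home:
print(homes_served())  # -> (1000, 20000)
```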

The technical attributes of LEO satellites make them ideally suited for sparse, low-density environments. Their broad coverage area and increasingly sophisticated beamforming and frequency reuse capabilities allow them to efficiently serve isolated dwellings, often spread across tens of thousands of square kilometers, where trenching fiber would be infeasible. These technologies extend the digital frontier rather than replace terrestrial infrastructure. Even with SpaceX’s innovative pace, it seems unlikely that this conclusion will change substantially within the next five years, at the very least.

A European LEO constellation could be feasible within a €200 billion budget: The €200 billion gap identified for full gigabit coverage could, in theory, fund a sovereign European LEO system capable of servicing the “unconnectables.” If Europe adopts leaner, vertically integrated innovation models like SpaceX (and avoids legacy procurement inefficiencies), such a constellation could deliver comparable technical performance while bolstering strategic autonomy.

The future of broadband infrastructure in Europe lies in a hybrid strategy. Fiber and mobile networks should continue to serve densely populated areas, while LEO satellites, potentially supplemented by fixed wireless and 5G, offer a viable path to universal coverage. By 2030, a satellite constellation only slightly more capable than Starlink V3 could deliver broadband to virtually all of Europe’s remaining unconnected homes, without undermining the business case for large-scale FTTP networks already in place.

CAUTIONARY NOTE.

While current assessments suggest that a LEO satellite constellation with capabilities on par with or slightly exceeding those anticipated for Starlink V3 could viably serve Europe’s remaining unconnected households by 2030, it is important to acknowledge the speculative nature of these projections. The assumptions are based on publicly available data and technical disclosures. Still, it is challenging to have complete visibility into the precise specifications, performance benchmarks, or deployment strategies of SpaceX’s Starlink satellites, particularly the V3 generation, or, for that matter, Amazon’s Project Kuiper constellation. Much of what is known comes from regulatory filings (e.g., FCC), industry reports and blogs, Reddit, and similar platforms, as well as inferred capabilities. Therefore, while the conclusions drawn here are grounded in credible estimates and modeling, they should be viewed with caution until more comprehensive and independently validated performance data become available.

THE SATELLITE’S SPECS – MOST IS KEPT A “SECRET”, BUT THERE IS SOME LIGHT.

Satellite capacity is not determined by a single metric, but instead emerges from a tightly coupled set of design parameters. Variables such as spectral efficiency, channel bandwidth, polarization, beam count, and reuse factor are interdependent. Knowing a few of them allows us to estimate, bound, or verify others. This is especially valuable when analyzing or validating constellation design, performance targets, or regulatory filings.

For example, consider a satellite that uses 250 MHz channels with 2 polarizations and a spectral efficiency of 5.0 bps/Hz. These inputs directly imply a channel capacity of 1.25 Gbps and a beam capacity of 2.5 Gbps. If the satellite is intended to deliver 100 Gbps of total throughput, as disclosed in related FCC filings, one can immediately deduce that 40 beams are required. If, instead, the satellite’s reuse architecture defines 8 x 250 MHz channels per reuse group with a reuse factor of 5, and each reuse group spans a fixed coverage area, both the theoretical and practical throughput within that area can be computed, further enabling the estimation of the total number of beams, the required spectrum, and the likely user experience. These dependencies mean that if the number of user channels, the full bandwidth, the channel bandwidth, the number of beams, or the frequency reuse factor is known, it becomes possible to estimate or cross-validate the others. This helps identify design consistency or highlight unrealistic assumptions.
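These interdependencies are simple enough to encode. The helpers below (names are mine) reproduce the worked example: 250 MHz channels, 2 polarizations, and 5.0 bps/Hz give 2.5 Gbps per beam, so a 100 Gbps satellite needs 40 beams:

```python
import math

def beam_capacity_gbps(channel_bw_mhz=250.0, polarizations=2,
                       spectral_eff_bps_hz=5.0):
    """Per-beam capacity from channel bandwidth, polarization reuse,
    and spectral efficiency."""
    return channel_bw_mhz * 1e6 * polarizations * spectral_eff_bps_hz / 1e9

def beams_required(target_gbps, per_beam_gbps):
    """Beams needed to reach a target satellite throughput."""
    return math.ceil(target_gbps / per_beam_gbps)

cap = beam_capacity_gbps()
print(cap, beams_required(100.0, cap))  # -> 2.5 40
```

Running the same helpers against a regulatory filing’s disclosed totals is exactly the cross-validation described above.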

In satellite systems like Starlink, the total available spectrum is limited. This is typically divided into discrete channels, for example, eight 250 MHz channels (as is the case for Starlink’s Ku-band downlink to the user’s terrestrial dish). A key architectural advantage of spot-beam satellites (e.g., with spots that are at least 50 to 80 km wide) is that frequency channels can be reused in multiple spatially separated beams, as long as the beams do not interfere with one another. This is not based on a fixed reuse factor, as seen in terrestrial cellular systems, but on beam isolation, achieved through careful beam shaping, angular separation, and sidelobe control (as also implemented in the above Python code for UK Starlink satellite coverage, albeit in much simpler ways). For instance, one beam covering southern England can use the same frequency channels as another beam covering northern Scotland, because their energy patterns do not overlap significantly at ground level. In a constellation like Starlink’s, where hundreds or even thousands of beams are formed across a satellite footprint, frequency reuse is achieved through simultaneous but non-overlapping spatial beam coverage. The reuse logic is handled dynamically on board or through ground-based scheduling, based on real-time traffic load and beam geometry.

This means that for a given satellite, the total instantaneous throughput is not only a function of spectral efficiency and bandwidth per beam, but also of the number of beams that can simultaneously operate on overlapping frequencies without causing harmful interference. If a satellite has access to 2 GHz of bandwidth and 250 MHz channels, then up to 8 distinct channels can be formed. These channels can be replicated across different beams, allowing many more than 8 beams to be active concurrently, each using one of those 8 channels, as long as they are separated enough in space. This approach allows operators to scale capacity dramatically through dense spatial reuse, rather than relying solely on expanding spectrum allocations. The ability to reuse channels across beams depends on antenna performance, beamwidth, power control, and orbital geometry, rather than a fixed reuse pattern. The same set of channels is reused across non-interfering coverage zones enabled by directional spot beams. Satellite beams can be “stacked on top of each other” up to the number of available channels, or they can be allocated optimally across a coverage area determined by user demand.
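A minimal sketch of spatial frequency reuse makes this concrete: greedily hand each beam the lowest-numbered channel not already in use within a minimum separation distance. The separation threshold and function names are my assumptions, not Starlink’s scheduler:

```python
import math

def _gc_dist_km(p, q):
    """Great-circle distance in km between two (lat, lon) points."""
    (la1, lo1), (la2, lo2) = p, q
    a = (math.sin(math.radians(la2 - la1) / 2) ** 2
         + math.cos(math.radians(la1)) * math.cos(math.radians(la2))
         * math.sin(math.radians(lo2 - lo1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def assign_channels(beam_centers, n_channels=8, min_sep_km=150.0):
    """Greedy spatial reuse: each beam takes the lowest channel not already
    used by an earlier beam closer than min_sep_km; None if all are taken."""
    placed = []  # (center, channel) pairs assigned so far
    for center in beam_centers:
        blocked = {ch for q, ch in placed
                   if ch is not None and _gc_dist_km(center, q) < min_sep_km}
        free = [ch for ch in range(n_channels) if ch not in blocked]
        placed.append((center, free[0] if free else None))
    return [ch for _, ch in placed]

# Beams over London and Inverness can share channel 0; a second London
# beam is forced onto channel 1.
print(assign_channels([(51.5, -0.1), (57.5, -4.2), (51.5, -0.1)]))  # -> [0, 0, 1]
```

Real systems decide isolation from antenna patterns and sidelobe levels rather than a fixed distance, but the greedy structure is the same: reuse is limited by geometry, not by a cellular-style reuse factor.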

Detailed specifications of commercial satellites, whether in operation or in the planning phase, are usually not publicly disclosed. However, companies are required to submit technical filings to the U.S. Federal Communications Commission (FCC). These filings include orbital parameters, frequency bands in use, EIRP, and antenna gain contours, as well as estimated capabilities of the satellite and user terminals. The FCC’s approval of SpaceX’s Gen2 constellation, for instance, outlines many of these values and provides a foundation upon which informed estimates of system behavior and performance can be made. The filings are not exhaustive and may omit sensitive performance data, but they serve as authoritative references for bounding what is technically feasible or likely in deployment.

ACKNOWLEDGEMENT.

I would like to acknowledge my wife, Eva Varadi, for her unwavering support, patience, and understanding throughout the creative process of writing this article.

FURTHER READINGS.

Kim K. Larsen, “Will LEO Satellite Direct-to-Cellular Networks Make Traditional Mobile Networks Obsolete?”, A John Strand Consult Report, (January 2025). This has also been published in full on my own Techneconomy blog.

Kim K. Larsen, “The Next Frontier: LEO Satellites for Internet Services.” Techneconomyblog (March 2024).

Kim K. Larsen, “Stratospheric Drones & Low Earth Satellites: Revolutionizing Terrestrial Rural Broadband from the Skies?” Techneconomyblog (January 2024).

Kim K. Larsen, “A Single Network Future“, Techneconomyblog (March 2024).

NOTE: My “Satellite Coverage Concept Model,” which I have applied to Starlink Direct-2-Dish coverage and Services in the United Kingdom, is not limited to the UK alone but can be straightforwardly generalized to other countries and areas.

The Next Frontier: LEO Satellites for Internet Services.

THE SPACE RACE IS ON.

If all current commercial satellite plans were to be realized within the next decade, we would have more, possibly substantially more, than 65 thousand satellites circling Earth. Today, that number is less than 10 thousand, with more than half of those realized by Starlink’s Low Earth Orbit (LEO) constellation over the last few years (i.e., since 2018).

While the “Arms Race” of the Cold War was mainly a contest between the USA and the former Soviet Union, the Space Race will, in my opinion, be “battled out” between the commercial interests of the West and the political interests of China (as illustrated in Figure 1 below). The current numbers strongly indicate that Europe, Canada, the Middle East, Africa, and APAC (minus China) will largely be left on the sidelines, watching the US and China impose, in theory, a “duopoly” in LEO satellite-based services. In practice, however, it will be a near-monopoly when considering the security concerns between the West and the (re-defined) Eastern bloc.

Figure 1 illustrates my thesis that we will see a Space Race over the next 10 years between one (or very few) commercial LEO constellations, represented by a Falcon-9-like design (for maybe too obvious reasons), and a Chinese state-owned satellite constellation. (Courtesy: DALL-E).

As of the end of 2023, more than 50% of launched and planned commercial LEO satellites are USA-based. Of those, the largest fraction (~75%) is accounted for by the US-based Starlink constellation. More than 30% are launched or planned by Chinese companies, headed by the state-owned Guo Wang constellation, rivaling Elon Musk’s Starlink in ambition and scale. Europe comes in at a distant third with about 8% of the total of fixed internet satellites. Apart from being disappointed, alas not surprised, by the European track record, it is somewhat more baffling that there are so few Indian, and no African, satellite constellations, given the obvious benefits such satellites could bring to India and the African continent.

India is a leading satellite nation with a proud tradition of innovative satellite design and manufacturing and a solid track record of satellite launches. However, regarding commercial LEO constellations, India has yet to seize the opportunity. Having previously worked on the economics and operationalization of a satellite ATC (i.e., a satellite service with an ancillary terrestrial component) internet service across India, it is mind-blowing (in my opinion) how much economic opportunity there is in replacing the vast terrestrial cellular infrastructure in rural India with satellite, not to mention the quantum leap in broadband service resilience and availability it could provide. According to the Starlink coverage map, regulatory approval in India for Starlink (US) services is still pending. In the meantime, Eutelsat’s OneWeb (EU) received regulatory approval in late 2023 for its satellite internet service over India in collaboration with Bharti Enterprises (India), which is also the largest shareholder in the recently formed Eutelsat Group with 21.2%. Moreover, Jio’s JioSpaceFiber satellite internet service was launched in several Indian states at the end of 2023, using the SES (EU) MEO O3b mPOWER satellite constellation. Despite the clear satellite know-how and capital available, there appears to be little activity in Indian-based LEO satellite development to take up the competition with international operators.

The African continent is attracting all the major LEO satellite constellations, such as Starlink (US), OneWeb (EU), Amazon Kuiper (US), and Telesat Lightspeed (CAN). However, getting regulatory approval for their satellite-based internet services is a complex, time-consuming, and challenging process across Africa’s 54 recognized sovereign countries. I would expect the Chinese-based satellite constellations (e.g., Guo Wang) to gain ground here as well, due to the strong ties between China and several African nations.

This article is not about SpaceX’s Starlink satellite constellation, although Starlink is mentioned a lot and used as an example. Recently, at the Mobile World Congress 2024 in Barcelona, talking to satellite operators (but not Starlink) providing fixed broadband satellite services, we joked about how long into a meeting we could go before SpaceX and Starlink would be mentioned (~5 minutes was the record, I think).

This article is about the key enablers (frequencies, frequency bandwidth, antenna design, …) that make up an LEO satellite service, the LEO satellite itself, the kind of services one should expect from it, and its limitations.

There is no doubt that LEO satellites of today have an essential mission: delivering broadband internet to rural and remote areas with little or no terrestrial cellular or fixed infrastructure to provide internet services. Satellites can offer broadband internet to remote areas with little population density and a population spread out reasonably uniformly over a large area. A LEO satellite constellation is not (in general) a substitute for an existing terrestrial communications infrastructure. Still, it can enhance it by increasing service availability and being an important remedy for business continuity in remote rural areas. Satellite systems are capacity-limited as they serve vast areas, typically with limited spectral resources and capacity per unit area.

In comparison, a terrestrial cellular network has much smaller coverage areas with demand-matched spectral resources. It is also easier to increase capacity in a terrestrial cellular system by adding more sectors or more sites in an area that warrants such investments. Adding more cells, and thus more system capacity, to satellite coverage requires a new generation of satellites with more advanced antenna designs, typically with more phased-array beams and more complex modulation and coding schemes that boost spectral efficiency, leading to increased capacity and quality for the services rendered to the ground. The principle of increasing system capacity by increasing the number of cells (i.e., cell splitting) works the same in satellite systems as it does in terrestrial cellular systems.
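A back-of-envelope comparison of capacity per unit area makes the point. The footprint and capacity numbers below are illustrative assumptions only, not measured values:

```python
def mbps_per_km2(total_mbps, area_km2):
    """Capacity density: shared throughput divided by the area it serves."""
    return total_mbps / area_km2

# Assumed: a ~20 Gbps satellite sharing its capacity over a ~1,000,000 km^2
# footprint, versus a 1 Gbps rural 5G site covering ~30 km^2.
satellite = mbps_per_km2(20_000, 1_000_000)
cell_site = mbps_per_km2(1_000, 30)
print(satellite, round(cell_site, 1))  # -> 0.02 33.3
```

Even with generous satellite assumptions, the terrestrial site delivers orders of magnitude more capacity per square kilometer, which is why satellites suit sparse demand and cells suit dense demand.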

So, on average, LEO satellite internet services to individual customers (or households), such as those offered by Starlink, are excellent for remote, sparsely populated areas with a nicely spread-out population. If we de-average this statement, however, a satellite coverage area may contain towns and settlements where the local population density is fairly high, even though it is very small over the larger footprint covered by the satellite. As the satellite’s capacity and quality are shared resources, serving towns and settlements of a certain size this way may not be the best approach to a sustainable and good customer experience, as the satellite’s resources are rapidly exhausted in such scenarios. Here, a hybrid architecture is of much better use: provide all customers in a town or settlement with the best service possible by leveraging the existing terrestrial communications infrastructure, cellular as well as fixed, combined with a satellite backhaul broadband connection between a satellite ground gateway and the broadband internet satellite. This is offered by several satellite broadband providers (from GEO, MEO, and LEO orbits) and has the beauty of not being limited to one provider. Unfortunately, this particular finesse is often overlooked amid the awe at the massive scale of the Starlink constellation.

AND SO IT STARTS.

When I compared the economics of stratospheric drone-based cellular coverage with that of LEO satellites and terrestrial cellular networks in my previous article, “Stratospheric Drones: Revolutionizing Terrestrial Rural Broadband from the Skies?”, it was clear that even though LEO satellites are costly to establish, they provide a substantial cost advantage over cellular coverage in rural and remote areas that are sparsely covered or not covered at all. Although the existing LEO satellite constellations have limited capacity compared to a terrestrial cellular network and would perform rather poorly over densely populated areas (e.g., urban and suburban areas), they can offer very decent fixed-wireless-access-like broadband services in rural and remote areas at speeds exceeding 100 Mbps, as shown by the Starlink constellation. Even if the provided speed and capacity are likely to be substantially lower than what a terrestrial cellular network could offer, they often provide the missing (internet) link. Anything larger than nothing remains infinitely better.

Low Earth Orbit (LEO) satellites represent the next frontier in (novel) communication network architectures, what we in modern lingo would call non-terrestrial networks (NTN), with the ability to combine both mobile and fixed broadband services, enhancing and substituting terrestrial networks. LEO satellites orbit significantly closer to Earth than their geostationary (GEO) counterparts at 36 thousand kilometers, typically at altitudes between 300 and 2,000 kilometers. This proximity gives them substantially reduced latency, higher bandwidth capabilities, and a more direct line of sight to receivers on the ground. It makes LEO satellites an obvious and integral component of non-terrestrial networks, which aim to extend the reach of existing fixed and mobile broadband services, particularly in rural, un- and under-served, or inaccessible regions, and to act as a high-availability element of terrestrial communications networks in the event of natural disasters (flooding, earthquakes, …) or military conflict, in which the terrestrial networks are taken out of operation.

Another key advantage of a LEO satellite is that the likelihood of a line-of-sight (LoS) to a point on the ground is very high, whereas for terrestrial cellular coverage it is, in general, quite low. In other words, the signal propagation from a LEO satellite closely approximates that of free space. Thus, most of the environmental signal-loss factors we must consider for a standard terrestrial cellular mobile network do not apply to our satellite, with signal propagation largely determined by the distance between the satellite and the ground (see Figure 2).

Figure 2 illustrates the difference between terrestrial cellular coverage from a cell tower and coverage from a Low Earth Orbit (LEO) satellite. The benefit of seeing the world from above is that environmental and physical factors have substantially less impact on signal propagation and quality; as the link approximates free-space propagation, attenuation is mainly determined by the Line-of-Sight (LoS) distance from the antenna to Earth. The situation is very different for a terrestrial cellular tower, whose radiated signal is substantially compromised by environmental factors.

Low Earth Orbit (LEO) satellites, compared to GEO- and MEO-based higher-altitude satellite systems, in general have simpler designs and smaller sizes, weights, and volumes. Their design and architecture are not just a function of technological trends but also a manifestation of their operational environment. The (relative) simplicity of LEO satellites also allows for more standardized production, permitting off-the-shelf components and modular designs that can be manufactured in larger quantities, as is the case with the CubeSat standard and SmallSats in general. The lower altitude of LEO satellites translates to a reduced distance from the launch site to the operational orbit, which inherently affects the economics of satellite launches. This proximity to Earth means that the energy required to propel a satellite into LEO is significantly less than that needed to reach geostationary Earth orbit (GEO), resulting in lower launch costs.

The advent of LEO satellite constellations marks an important shift in how we approach global connectivity. With the potential to provide ubiquitous internet coverage in rural and remote places with little or no terrestrial communications infrastructure, satellites are increasingly being positioned as vital elements in global communication. LEO satellites, as well as stratospheric drones, can provide economical internet access in remote areas, as addressed in my previous article, and play a significant role in disaster relief efforts. For example, when terrestrial communication networks are disrupted after a natural disaster, LEO satellites can quickly re-establish communication links to normal cellular devices or ad-hoc earth-based satellite systems, enabling efficient coordination of rescue and relief operations. Furthermore, they offer a resilient network backbone that complements terrestrial infrastructure.

The Internet of Things (IoT) benefits from the capabilities of LEO satellites, particularly in areas with little or no existing terrestrial communications network. IoT devices often operate in remote or mobile environments, from sensors in agricultural fields to trackers across shipping routes. LEO satellites provide reliable connectivity to IoT networks, facilitating many applications, such as non- and near-real-time monitoring of environmental data, seamless asset tracking over transcontinental journeys, and rapid deployment of smart devices in smart city infrastructures. As an example, let us look at the minimum requirements for establishing a LEO satellite constellation that can gather IoT measurements. At an altitude of 550 km, a satellite takes ca. 1.5 hours to return to a given point on its orbit. Earth rotates underneath it (see also below), which requires us to deploy several orbital planes to ensure continuous coverage throughout the 24 hours of a day (assuming this is required). Depending on the satellite antenna design, the target coverage area, and how often a measurement is required, a satellite constellation to support an IoT business may not require much more than 20 (lower measurement frequency) to 60 (higher measurement frequency, but far from real-time data collection) LEO satellites (@ 550 km).
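As a back-of-envelope check on constellation sizes of this order, the sketch below derives the orbital period from Kepler's third law and estimates the satellite count needed to revisit any equatorial point within a given interval. The swath width and revisit interval are illustrative assumptions, and the estimate deliberately ignores inclination, beam overlap, and the convergence of ground tracks at high latitudes.

```python
import math

MU = 3.986004418e14     # Earth's gravitational parameter, m^3/s^2
R_EARTH_KM = 6371.0     # Earth mean radius, km
EQUATOR_KM = 40075.0    # Earth's equatorial circumference, km

def orbital_period_h(altitude_km: float) -> float:
    """Orbital period (hours) of a circular orbit, from Kepler's third law."""
    a_m = (R_EARTH_KM + altitude_km) * 1e3   # semi-major axis in meters
    return 2 * math.pi * math.sqrt(a_m**3 / MU) / 3600.0

def iot_constellation_size(altitude_km: float, swath_km: float,
                           revisit_h: float) -> int:
    """Very rough satellite count to revisit any equatorial point within
    `revisit_h` hours. Each satellite crosses the equator twice per orbit,
    covering `swath_km` of longitude each time."""
    t_orb = orbital_period_h(altitude_km)
    coverage_per_sat_km = 2 * (revisit_h / t_orb) * swath_km
    return math.ceil(EQUATOR_KM / coverage_per_sat_km)

print(round(orbital_period_h(550), 2))          # ~1.59 hours, i.e. ca. 1.5 h
print(iot_constellation_size(550, 1000, 1.0))   # a few tens of satellites
```

With an assumed 1,000 km swath and a one-hour revisit target, the estimate lands in the same few-tens range as the 20–60 satellites quoted above; relaxing the revisit time shrinks the count roughly proportionally.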

For defense purposes, LEO satellite systems present unique advantages. Their lower orbits allow for high-resolution imagery and rapid data collection, which are crucial for surveillance, reconnaissance, and operational awareness. As typically more LEO satellites will be required, compared to a GEO satellite, such systems also offer a higher degree of redundancy in case of anti-satellite (ASAT) warfare scenarios. When integrated with civilian applications, military use cases can leverage the robust commercial infrastructure for communication and geolocation services, enhancing capabilities while distributing the system’s visibility and potential targets.

Standalone military LEO satellites are engineered for specific defense needs. These may include hardened systems for secure communication and resistance to jamming and interception. For instance, they can be equipped with advanced encryption algorithms to ensure secure transmission of sensitive military data. They can also carry tailored payloads for electronic warfare, signal intelligence, and tactical communications. For example, they can host sensors for detecting and locating enemy radar and communication systems, providing a significant advantage in electronic warfare. As the line between civilian and military space applications blurs, dual-use LEO satellite systems are emerging, capable of serving civilian broadband and specialized military requirements. It should be pointed out that there are also military applications, such as signal gathering, that may not be compatible with civil communications use cases.

In a military conflict, the distributed architecture and lower altitude of LEO constellations may offer some advantages regarding resilience and targetability compared to GEO- and MEO-based satellites. Their greater numbers (i.e., tens to thousands) compared to GEO, and the potential for quicker orbital resupply, can make them less susceptible to complete system takedown. However, their lower altitudes could make them accessible to various ASAT technologies, including ground-based missiles or space-based kinetic interceptors.

It is not uncommon to encounter academic researchers and commentators who give the impression that LEO satellites could replace existing terrestrial infrastructures and solve all terrestrial communications issues known to man. That is (of course) not the case. Often, such statements appear to be based on an incomplete understanding of the capacity limitations of satellite systems. Because satellites have excellent coverage with very large terrestrial footprints, the satellite capacity is shared over very large areas. For example, consider a LEO satellite at 550 km altitude. The satellite footprint, or coverage area (aka ground swath), is the area on the Earth’s surface over which the satellite can establish a direct line of sight. In our example, the footprint diameter would be ca. 5,500 kilometers, equivalent to an area of ca. 23 million square kilometers, more than twice that of the USA (or China or Canada). Before you get too excited, the satellite antenna will typically restrict the surface area the satellite actually covers. The extent of the observable world seen at any given moment by the satellite antenna is defined as the Field of View (FoV) and can vary from a few degrees (narrow beams, small coverage area) to 40 degrees or more (wide beams, large coverage areas). At a FoV of 20 degrees, the antenna footprint diameter would be ca. 2,400 kilometers, equivalent to a coverage area of ca. 5 million square kilometers.
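The footprint numbers can be sanity-checked with simple spherical geometry. The sketch below computes the full horizon-limited (line-of-sight) footprint, plus a smaller "service" footprint under an assumed minimum elevation angle; since vendors define "field of view" in different ways, the minimum-elevation variant is just one plausible reading, and the exact horizon figures come out slightly below the rounded numbers quoted above.

```python
import math

R = 6371.0  # Earth mean radius, km

def horizon_footprint(altitude_km: float):
    """Diameter (km) and area (km^2) of the region with a geometric
    line of sight to the satellite (elevation >= 0 degrees)."""
    lam = math.acos(R / (R + altitude_km))           # Earth central half-angle
    diameter = 2 * R * lam                           # great-circle diameter
    area = 2 * math.pi * R**2 * (1 - math.cos(lam))  # spherical-cap area
    return diameter, area

def service_footprint_diameter(altitude_km: float, min_elev_deg: float):
    """Footprint diameter (km) when a minimum elevation angle is enforced
    at the edge of coverage (one possible reading of 'FoV')."""
    eps = math.radians(min_elev_deg)
    eta = math.asin(R * math.cos(eps) / (R + altitude_km))  # nadir angle
    lam = math.pi / 2 - eps - eta                           # central half-angle
    return 2 * R * lam

d, a = horizon_footprint(550)
print(round(d), round(a / 1e6, 1))    # ~5,100 km and ~20 million km^2
print(round(service_footprint_diameter(550, 20)))   # roughly 2,250 km
```

The horizon-limited result (~5,100 km, ~20 million km²) is in the same ballpark as the ~5,500 km / ~23 million km² quoted above, with the difference down to rounding; a ~20° minimum elevation angle reproduces a footprint of roughly the 2,400 km scale.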

In comparison, at a FoV of 0.8 degrees, the antenna footprint would only be ca. 100 kilometers. If our satellite has a 16-beam capability, that translates into a coverage diameter of ca. 24 km per beam. For the Starlink system, based on the Ku band (13 GHz) and a beam (satellite-to-Earth) downlink capacity of ca. 680 Mbps (in 250 MHz), we would have ca. 2 Mbps per km² of coverage area. In comparison, a terrestrial rural cellular site with 85 MHz (downlink, base station antenna to customer terminal) would deliver 10+ Mbps per km² of coverage area.
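The per-km² comparison is simply the shared link capacity divided by the coverage area. A minimal sketch, where the satellite-beam figures come from the text and the terrestrial-site capacity and cell size are illustrative assumptions of my own:

```python
import math

def areal_capacity_mbps_per_km2(capacity_mbps: float,
                                cell_diameter_km: float) -> float:
    """Shared link capacity divided by the area of one (circular) cell."""
    area_km2 = math.pi * (cell_diameter_km / 2) ** 2
    return capacity_mbps / area_km2

# Satellite beam from the text: ~680 Mbps shared over a ~24 km beam.
sat = areal_capacity_mbps_per_km2(680, 24)
print(round(sat, 1))    # ~1.5 Mbps/km^2, i.e. "ca. 2" when rounded up

# Hypothetical rural macro site: ~600 Mbps downlink over an ~8 km cell
# (assumed values, consistent with the 10+ Mbps/km^2 quoted above).
cell = areal_capacity_mbps_per_km2(600, 8)
print(round(cell, 1))   # ~11.9 Mbps/km^2
```

The roughly order-of-magnitude gap in areal capacity is the crux of why satellites suit sparse demand: the same shared link is spread over a far larger area.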

It is always good to keep in mind that “a satellite’s mission is not to replace terrestrial communications infrastructures but to supplement and enhance them,” and, furthermore, that “satellites offer the missing (internet) link in areas where no terrestrial communications infrastructure is present.” Satellites offer superior coverage to any terrestrial communications infrastructure. Their limitations lie in providing capacity, and quality, at population scale, and in supporting applications and access technologies requiring very short latencies (e.g., less than 10 ms).

In the following, I will focus on the terrestrial coverage and services that LEO satellites can provide. By the end of this blog, I hope to have given you (the reader) a reasonable understanding of how coverage, capacity, and quality work in a (LEO) satellite system, and an impression of the key satellite parameters we can tune to improve them.

EARTH ROTATES, AND SO DO SATELLITES.

Before getting into the details of low Earth orbit satellites, let us briefly get a couple of basic topics off the table. Skipping this part may be a good option if you are already in the know about satellites. Or maybe carry on and get a good laugh at those terra firma cellular folks who forgot about the rotation of Earth 😉

From an altitude and orbit (around Earth) perspective, you may have heard of two types of satellites: GEO and LEO satellites. Geostationary (GEO) satellites are positioned in a geostationary orbit at ~36 thousand kilometers above Earth. That the satellite is geostationary means it rotates with the Earth and appears stationary from the ground, requiring only one satellite to maintain constant coverage over an area that can be up to one-third of Earth’s surface. Low Earth Orbit (LEO) satellites are positioned at altitudes between 300 and 2,000 kilometers above Earth and move relative to the Earth’s surface at high speeds, requiring a network, or constellation, to ensure continuous coverage of a particular area.

I have experienced that terrestrial cellular folks (like myself), when first thinking about satellite coverage, have some intuitive issues with it. We are not used to our antennas moving away from the targeted coverage area, nor to the targeted coverage area moving away from our antenna. The geometry and dynamics of terrestrial cellular coverage are simpler than they are for satellite-based coverage. For LEO satellite network planners, it is not rocket science (pun intended) that the satellites move around in their designated orbits over Earth at orbital speeds of ca. 7 to 8 km per second. Thus, at an altitude of 500 km, a LEO satellite orbits Earth approximately every 1.5 hours. Earth, thankfully, rotates. Compared to its GEO satellite “cousin,” the LEO satellite is not “stationary” from the perspective of the ground. Thus, as Earth rotates, the targeted coverage area moves away from the coverage provided by the orbiting satellite.
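The orbital speed and period follow directly from the gravitational balance of a circular orbit (the vis-viva relation); a minimal sketch:

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH_M = 6.371e6   # Earth mean radius, m

def circular_orbit(altitude_km: float):
    """Orbital speed (km/s) and period (minutes) of a circular orbit."""
    r = R_EARTH_M + altitude_km * 1e3     # orbital radius, m
    v = math.sqrt(MU / r)                 # circular-orbit speed, m/s
    t = 2 * math.pi * r / v               # period = circumference / speed
    return v / 1e3, t / 60.0

v, t = circular_orbit(550)
print(round(v, 2), round(t, 1))   # ~7.59 km/s and ~95.5 minutes
```

At 500–550 km the speed is ca. 7.6 km/s and the period ca. 1.5 hours, consistent with the figures above; note the speed is kilometers per second, an order of magnitude often misquoted.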

We need several satellites in the same orbit and several orbits (i.e., orbital planes) to provide continuous satellite coverage of a target area. This is very different from terrestrial cellular coverage of a given area (needless to say).

WHAT LEO SATELLITES BRING TO THE GROUND.

Anything is infinitely more than nothing. The Low Earth Orbit satellite brings the possibility of internet connectivity where there previously was nothing, either because too few potential customers spread over a large area make terrestrial-based services hugely uneconomical, or because the environment is too hostile to build normal terrestrial networks within reasonable economics.

Figure 3 illustrates a low Earth satellite constellation providing internet to rural and remote areas as a way to solve part of the digital divide challenge in terms of availability. Obviously, the affordability is likely to remain a challenge unless subsidized by customers who can afford satellite services in other places where availability is more of a convenience question. (Courtesy: DALL-E)

LEO satellites represent a transformative shift in internet connectivity, providing advantages over traditional cellular and fixed broadband networks, particularly for global access, speed, and deployment capabilities. As described in “Stratospheric Drones: Revolutionizing Terrestrial Rural Broadband from the Skies?”, LEO satellite constellations, or networks, may also be significantly more economical than equivalent cellular networks in rural and remote areas, where the economics of coverage by satellite, as depicted in Figure 3 above, are far better than those of traditional terrestrial cellular means.

One of the foremost benefits of LEO satellites is their ability to offer global coverage as well as reasonable broadband and latency performance that is difficult to match with GEO and MEO satellites. A geostationary satellite obviously also offers global broadband coverage, with a unit coverage area much more extensive than that of a LEO satellite, but it cannot offer very low-latency services, and it is more difficult for it to provide high data rates (in comparison to a LEO satellite). LEO satellites can reach the most remote and rural areas of the world, places where laying cables or setting up cell towers is impractical. This is a crucial step in delivering communications services where none exist today, ensuring that underserved populations and regions gain access to internet connectivity.

Another significant advantage is the reduction in latency that LEO satellites provide. Since they orbit much closer to Earth, typically at altitudes between 350 and 700 km, compared to their geostationary counterparts at 36 thousand kilometers, the time it takes for a communications signal to travel between the user and the satellite is significantly reduced. This lower latency is crucial for enhancing the user experience in real-time applications such as video calls and online gaming, making these activities more enjoyable and responsive.

An inherent benefit of satellite constellations is their ability for quick deployment. They can be deployed rapidly in space, offering a quicker solution to achieving widespread internet coverage than the time-consuming and often challenging process of laying cables or erecting terrestrial infrastructure. Moreover, the network can easily be expanded by adding more satellites, allowing it to dynamically meet changing demand without extensive modifications on the ground.

LEO satellite networks are inherently scalable. By launching additional satellites, they can accommodate growing internet usage demands, ensuring that the network remains efficient and capable of serving more users over time without significant changes to ground infrastructure.

Furthermore, these satellite networks offer resilience and reliability. With multiple satellites in orbit, the network can maintain connectivity even if one satellite fails or is obstructed, providing a level of redundancy that makes the network less susceptible to outages. This ensures consistent performance across different geographical areas, unlike terrestrial networks that may suffer from physical damage or maintenance issues.

Another critical advantage is (relative) cost-effectiveness compared to a terrestrial-based cellular network. In remote or hard-to-reach areas, deploying satellites can be more economical than the high expenses associated with extending terrestrial broadband infrastructure. As satellite production and launch costs continue to decrease, the economics of LEO satellite internet become increasingly competitive, potentially reducing the cost for end-users.

LEO satellites offer a promising solution to some of the limitations of traditional connectivity methods. By overcoming geographical, infrastructural, and economic barriers, LEO satellite technology has the potential to not just complement but effectively substitute terrestrial-based cellular and fixed broadband services, especially in areas where such services are inadequate or non-existent.

Figure 4 below provides an overview of LEO satellite coverage with fixed broadband services offered to customers in the Ku band and a Ka-band backhaul link to ground station gateways (GWs) that connect to, for example, the internet. Having inter-satellite communications (e.g., via laser links such as those used by Starlink satellites from version 1.5 onwards) allows for substantially fewer ground-station gateways. Inter-satellite laser links between intra-plane satellites are a distinct advantage in ensuring coverage for rural and remote areas where it might be difficult, very costly, or impractical to have a satellite ground station GW to connect to due to the lack of global internet infrastructure.

Figure 4 In general, a satellite is required to have LoS to its ground station gateway (GW); in other words, the GW needs to be within the coverage footprint of the satellite. For LEO satellites, which are at low altitudes, between 300 and 2,000 km, and thus have a much smaller footprint than MEO and GEO satellites, this would result in the need for a substantial number of ground stations. This is depicted in (a) above. With inter-satellite laser links (ISLL), e.g., those implemented by Starlink, it is possible to reduce the number of ground station gateways significantly, which is particularly helpful in rural and very remote areas. These laser links enable direct communication between satellites in orbit, which enhances the network’s performance, reliability, and global reach.

Inter-satellite laser links (ISLLs), also called optical inter-satellite links (OISLs), are an advanced communication technology utilized by satellite constellations, such as Starlink, to facilitate high-speed, secure data transmission directly between satellites. Inter-satellite laser links are today (primarily) designed for intra-plane communication within satellite constellations, enabling data transfer between satellites that share the same orbital plane. This is due to the relatively stable geometries and predictable distances between satellites in the same orbit, which facilitate maintaining the line-of-sight connections necessary for laser communications. ISLLs mark a significant departure from the traditional reliance on ground stations for inter-satellite communication, and they offer many benefits, including the ability to transmit data at speeds comparable to fiber-optic cables. Additionally, ISLLs enable satellite constellations to deliver seamless coverage across the entire planet, including over oceans and polar regions where ground station infrastructure is limited or non-existent. The technology also inherently enhances the security of data transmissions, thanks to the focused nature of laser beams, which are difficult to intercept.

However, the deployment of ISLLs is not without challenges. The technology requires a clear line of sight between satellites, which can be affected by their orbital positions, necessitating precise control mechanisms. Moreover, the theoretical limit to the number of satellites linked in a daisy chain is influenced by several factors, including the satellite’s power capabilities, the network architecture, and the need to maintain clear lines of sight. High-power laser systems also demand considerable energy, impacting the satellite’s power budget and requiring efficient management to balance operational needs. The complexity and cost of developing such sophisticated laser communication systems, combined with very precise pointing mechanisms and sensitive detectors, can be quite challenging and need to be carefully weighed against building satellite ground stations.

Cross-plane ISLL transmission, or the ability to communicate between satellites in different orbital planes, presents additional challenges, as it is technically very demanding to maintain a stable line of sight between satellites moving in different orbital planes. However, the potential for ISLLs to support cross-plane links is recognized as a valuable capability for creating a fully interconnected satellite constellation. The development and incorporation of cross-plane ISLL capabilities into satellites is an area of active research and development. Such capabilities would reduce the reliance on ground stations and significantly increase the resilience of satellite constellations. I see this as a next-generation topic, together with many other important developments described at the end of this blog. However, the power consumption of the ISLL is a point of concern that needs careful attention, as it will impact many other aspects of the satellite’s operation.

THE DIGITAL DIVIDE.

The digital divide refers to the “internet haves and have-nots,” or “the gap between individuals who have access to modern information and communication technology (ICT),” such as the internet, computers, and smartphones, and those who do not have access. This divide can be due to various factors, including economic, geographic, age, and educational barriers. Essentially, as illustrated in Figure 5, it is the difference between the “digitally connected” and the “digitally disconnected.”

The significance of the digital divide is considerable, impacting billions of people worldwide. As of 2023, it was estimated that a little less than 40% of the world’s population, or roughly 2.9 billion people, had never used the internet. This gap is most pronounced in developing countries, rural areas, and among older populations and economically disadvantaged groups.

The digital divide affects individuals’ ability to access information, education, and job opportunities and impacts their ability to participate in digital economies and the modern social life that the rest of us (i.e., the other side of the divide or the privileged 60%) have become used to. Bridging this divide is crucial for ensuring equitable access to technology and its benefits, fostering social and economic inclusion, and supporting global development goals.

Figure 5 illustrates the digital divide, that is, the gap between individuals with access to modern information and communication technology (ICT), such as the internet, computers, and smartphones, and those who do not have access. (Courtesy: DALL-E)

CHALLENGES WITH LEO SATELLITE SOLUTIONS.

Low-Earth-orbit satellites offer compelling advantages for global internet connectivity, yet they are not without challenges and disadvantages when considered substitutes for cellular and fixed broadband services. These drawbacks underscore the complexities and limitations of deploying LEO satellite technology globally.

The capital investment required and the ongoing costs associated with designing, manufacturing, launching, and maintaining a constellation of LEO satellites are substantial. Despite technological advancements and increased competition driving costs down, the financial barrier to entry remains high. Compared to their geostationary counterparts, the relatively short lifespan of LEO satellites necessitates frequent replacements, further adding to operational expenses.

While LEO satellites offer significantly reduced latency (round trip times, RTT ~ 4 ms) compared to geostationary satellites (RTT ~ 240 ms), they may still face latency and bandwidth limitations, especially as the number of users on the satellite network increases. This can lead to reduced service quality during peak usage times, highlighting the potential for congestion and bandwidth constraints. This is also the reason why the main business model of LEO satellite constellations is primarily to address coverage and needs in rural and remote locations. Alternatively, the LEO satellite business model focuses on low-bandwidth needs such as texting, voice messaging, and low-bandwidth Internet of Things (IoT) services.
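The RTT figures above are best-case propagation delays, i.e., the slant range traveled twice at the speed of light, with no processing or queueing added; a minimal sketch, assuming an overhead pass so the slant range equals the orbital altitude:

```python
C_KM_S = 299_792.458   # speed of light in vacuum, km/s

def min_rtt_ms(slant_range_km: float) -> float:
    """Best-case propagation round-trip time, user <-> satellite, in ms."""
    return 2 * slant_range_km / C_KM_S * 1e3

print(round(min_rtt_ms(550), 1))     # LEO at 550 km, overhead: ~3.7 ms
print(round(min_rtt_ms(35_786)))     # GEO at 35,786 km: ~239 ms
```

Real-world LEO latencies are higher (tens of milliseconds) once bent-pipe routing via a gateway, scheduling, and terrestrial transit are included, but the ~60x propagation advantage over GEO stands.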

Navigating the regulatory and spectrum management landscape presents another challenge for LEO satellite operators. Securing spectrum rights and preventing signal interference requires coordination across multiple jurisdictions, which can complicate deployment efforts and increase the complexity of operations.

The environmental and space traffic concerns associated with deploying large numbers of satellites are significant. The potential for space debris and the sustainability of low Earth orbits are critical issues, with collisions posing risks to other satellites and space missions. Additionally, the environmental impact of frequent rocket launches raises further concerns.

FIXED-WIRELESS ACCESS (FWA) BASED LEO SATELLITE SOLUTIONS.

Using the NewSpace Index database (updated December 2023), more than 6,463 internet satellites have been launched, of which 5,650 (~87%) are from Starlink, and 40,000+ satellites are planned for launch, of which SpaceX’s Starlink accounts for 11,908 (~30%). More than 45% of the satellites launched and planned support multi-application use cases, i.e., internet together with, for example, IoT (~4%) and/or Direct-to-Device (D2D, ~39%). The large D2D share is due to Starlink’s plans to provide services to mobile terminals with its latest satellite constellation. The first six Starlink v2 satellites with direct-to-cellular capability were successfully launched on January 2nd, 2024. Some care should be taken with the D2D share in the Starlink numbers, as it does not consider the different form factors of the version 2 satellite, not all of which include D2D capabilities.

Most LEO satellites, helped by the sheer quantity of Starlink satellites, operational and planned, support satellite fixed broadband internet services. It is worth noting that the Chinese Guo Wang constellation ranks second in terms of planned LEO satellites, with almost 13,000 planned, rivaling the Starlink constellation. After Starlink and Guo Wang are counted, there are only 34%, or ca. 16,000, internet satellites left in the planning pool, spread across 30+ satellite companies. While Starlink is privately owned (by Elon Musk), the Guo Wang (國網 ~ “The state network”) constellation is led by China SatNet, created by SASAC (China’s State-Owned Assets Supervision and Administration Commission), which oversees China’s biggest state-owned enterprises. I expect that such a LEO satellite constellation, which would be the second biggest LEO constellation, as planned by Guo Wang and controlled by the Chinese state, would be of considerable concern to the West due to the possibility of dual use (i.e., civil & military) of such a constellation.

Starlink coverage as of March 2024 (see Starlink’s availability map) does not include Russia, China, Iran, Iraq, Afghanistan, Venezuela, or Cuba (ca. 20% of Earth’s total land surface area). There are still quite a few countries in Africa and South-East Asia, including India, where regulatory approval remains pending.

Figure 6 NewSpace Index data of commercial satellite constellations in terms of total number of launched and planned (top) per company (or constellation name) and (bottom) per country.

While the term FWA, fixed wireless access, is not traditionally used to describe satellite internet services, the broadband services offered by LEO satellites can be considered a form of “wireless access” since they also provide connectivity without cables or fiber. In essence, LEO satellite broadband is a complementary service to traditional FWA, extending wireless broadband access to locations beyond the reach of terrestrial networks. In the following, I will continue to use the term FWA for the fixed broadband LEO satellite services provided to individual customers, including SMEs. As some of the LEO satellite businesses eventually also might provide direct-to-device (D2D) services to normal terrestrial mobile devices, either on their own acquired cellular spectrum or in partnership with terrestrial cellular operators, the LEO satellite operation (or business architecture) becomes much closer to terrestrial cellular operations.

Figure 7 Illustrating a Non-Terrestrial Network consisting of a Low Earth Orbit (LEO) satellite constellation providing fixed broadband services, such as Fixed Wireless Access, to individual terrestrial users (e.g., Starlink, Kuiper, OneWeb, …). Each hexagon represents a satellite beam inside the larger satellite coverage area. Note that, in general, there will be some coverage overlap between individual satellites, ensuring continuous service. The operating altitude of a LEO satellite constellation is between 300 and 2,000 km, with most aiming for 450 to 550 km. It is assumed that the satellites are interconnected, e.g., by laser links. The user terminal (UT) antenna dynamically orients itself toward the satellite with the best line-of-sight (in terms of signal quality) within the UT’s field-of-view (FoV). The FoV is not shown in the picture above so as not to overcomplicate the illustration.

Low Earth Orbit (LEO) satellite services like Starlink have emerged to provide fixed broadband internet to individual consumers and small to medium-sized enterprises (SMEs), targeting rural and remote areas, often where no other broadband solutions are available or where only poor legacy copper- or coax-based infrastructure exists. These services deploy constellations of satellites orbiting close to Earth to offer high-speed internet with the significant advantage of reaching rural and remote areas where traditional ground-based infrastructure is absent or economically unfeasible.

One of the most significant benefits of LEO satellite broadband is its ability to deliver connectivity with lower latency than traditional satellite internet delivered by geosynchronous satellites, enhancing the user experience for real-time applications. The rapid deployment capability of these services also means that areas in dire need of internet access can be connected much more quickly than by waiting for ground infrastructure development. Additionally, satellite broadband’s reliability is less affected by terrestrial challenges, such as natural disasters, that can disrupt other forms of connectivity.

The satellite service comes with its challenges. The cost of user equipment, such as satellite dishes, can be a barrier for some users, as can the installation process for the terrestrial satellite dish required to establish the connection to the satellite. Moreover, services might be limited by data caps or experience slower speeds after reaching certain usage thresholds, which can be a drawback for users with high data demands. Weather conditions can also impact signal quality, particularly at the higher frequencies used by the satellite, albeit to a lesser extent than for geostationary satellite services. However, the target areas where the fixed broadband satellite service is most suited are rural and remote areas with little or no terrestrial broadband infrastructure (terrestrial cellular broadband or wired broadband such as coax or fiber).

Beyond Starlink, other providers are venturing into the LEO satellite broadband market. OneWeb is actively developing a constellation to offer internet services worldwide, focusing on communities that are currently underserved by broadband. Telesat Lightspeed is also gearing up to provide broadband services, emphasizing the delivery of high-quality internet to the enterprise and government sectors.

Other LEO satellite businesses, such as AST SpaceMobile and Lynk Mobile, are taking a unique approach by aiming to connect standard mobile phones directly to their satellite network, extending cellular coverage beyond the reach of traditional cell towers. More about that in the section below (see “New Kids on the Block – Direct-to-Devices LEO satellites”).

I have been asked why I appeared somewhat dismissive of Amazon’s Project Kuiper in a previous version of this article, particularly compared to Starlink (I guess). The expressed mission is to “provide broadband services to unserved and underserved consumers, businesses in the United States, …” (FCC 20-102). Project Kuiper plans a broadband constellation of 3,226 microsatellites at 3 altitudes (i.e., orbital shells) around 600 km, providing fixed broadband services in the Ka-band (i.e., ~17-30 GHz). In its US-based FCC (Federal Communications Commission) filing, and in the subsequent FCC authorization, it is clear that the Kuiper constellation primarily targets contiguous coverage of the USA (but mentions that services cannot be provided in the majority of Alaska … funny, I thought that was a good definition of an underserved, remote, and sparsely populated area?). Amazon has committed to launch 50% (1,618 satellites) of its committed satellite constellation before July 2026 (until now, 2+ have been launched) and the remaining 50% before July 2029. There are, however, far fewer details on the Kuiper satellite design than are available for, for example, the various versions of the Starlink satellites. Given that Kuiper will operate in the Ka-band, there may be more frequency bandwidth allocated per beam than is possible in the Starlink satellites, which use the Ku-band for customer device connectivity. However, the Ka-band is at a higher frequency, which may result in more compromised signal propagation. In my opinion, based on the information from the FCC submissions and correspondence, the Kuiper constellation appears less ambitious compared to Starlink’s vision, mission, and tangible commitment in terms of aggressive launches, a very high level of innovation, and iterative development of its platform and capabilities in general. This may of course change over time and as more information becomes available on Amazon’s Project Kuiper.

FWA-based LEO satellite solutions – takeaway:

  • LoS-based and free-space-like signal propagation allows high-frequency signals (i.e., high throughput, capacity, and quality) to deliver near-ideal performance, impacted only by the distance between the satellite antenna and the ground terminal. This is, in general, not possible for terrestrial cellular infrastructure.
  • Provides satellite fixed broadband internet connectivity typically using the Ku-band in geographically isolated locations where terrestrial broadband infrastructure is limited or non-existent.
  • Lower latency (and round trip time) compared to MEO and GEO satellite internet solutions.
  • Current systems are designed to provide broadband internet services in sparsely populated and underserved (or unserved) regions where traditional terrestrial communications infrastructure is highly uneconomical and/or impractical to deploy.
  • As shown in my previous article (i.e., “Stratospheric Drones: Revolutionizing Terrestrial Rural Broadband from the Skies?”), LEO satellite networks may be an economically interesting alternative to terrestrial rural cellular networks in countries with large, sparsely populated rural areas that would require tens of thousands of cellular sites to cover. Hybrid models, with LEO satellite FWA-like coverage for individuals in rural areas and satellite backhaul to major settlements and towns, should be considered in large geographies.
  • Resilience to terrestrial disruptions is a key advantage. It ensures functionality even when ground-based infrastructure is disrupted, which is essential for maintaining the business continuity of an operator’s telecommunications services. In particular, hierarchical architectures combining, for example, GEO satellites, LEO satellites, and Earth-based transport infrastructure will result in very highly reliable network operations (possibly approaching ultra-high availability, although not with service parity).
  • Current systems are inherently capacity-limited due to their vast coverage areas (i.e., lower performance per unit coverage area). In the peak demand period, they will typically perform worse than terrestrial-based cellular networks (e.g., LTE or 5G).
  • In regions where modern terrestrial cellular and fixed broadband services are already established, satellite broadband may face challenges competing with these potentially cheaper, faster, and more reliable services, which are underpinned by the terrestrial communications infrastructure.
  • It is susceptible to weather conditions, such as heavy rain or snow, which can degrade signal quality. This may impact system capacity and quality, resulting in inconsistent customer experience throughout the year.
  • Must navigate complex regulatory environments in each country, which can affect service availability and lead to delays in service rollout.
  • Depending on the altitude, LEO satellites are typically replaced on a 5- to 7-year cycle due to atmospheric drag (which increases as altitude decreases; thus, the lower the altitude, the shorter a satellite’s life). This ultimately means that any improvements in system capacity and quality will take time to reach all customers.
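The capacity limitation per unit coverage area mentioned in the takeaways above can be illustrated with a back-of-the-envelope calculation: divide a satellite's downlink capacity among the subscribing households in its footprint. All figures below (capacity, footprint, household density, take-up, and overbooking factor) are illustrative assumptions, not specifications of any actual system.

```python
# Rough, illustrative sketch of per-household busy-hour throughput for a
# LEO FWA satellite. All numbers are assumptions for illustration only.

def per_household_mbps(total_capacity_gbps, coverage_area_km2,
                       households_per_km2, takeup_rate, overbooking=20):
    """Busy-hour throughput per subscribing household (Mbps).

    overbooking: statistical sharing factor -- not every subscriber
    is active at the same instant.
    """
    subscribers = coverage_area_km2 * households_per_km2 * takeup_rate
    capacity_mbps = total_capacity_gbps * 1000
    return capacity_mbps * overbooking / subscribers

# Example: a 100 Gbps satellite over a 500,000 km^2 footprint with a rural
# density of 0.5 households/km^2 and 20% take-up -> 50,000 subscribers.
rate = per_household_mbps(100, 500_000, 0.5, 0.2)
print(f"{rate:.0f} Mbps per active household")  # -> 40 Mbps with 20x overbooking
```

Varying the overbooking factor shows how quickly the busy-hour experience degrades as more subscribers become simultaneously active, which is the core of the peak-demand concern in the takeaway above.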

SATELLITE BACKHAUL SOLUTIONS.

Figure 8 illustrates the architecture of a Low Earth Orbit (LEO) satellite backhaul system used by providers like OneWeb as well as Starlink with their so-called “Community Gateway”. It showcases the connectivity between terrestrial internet infrastructure (i.e., satellite gateways) and satellites in orbit, enabling high-speed data transmission. The network consists of LEO satellites that communicate with each other (inter-satellite comms) using the Ku and Ka frequency bands. These satellites connect to ground-based satellite gateways (GW), which interface with Points of Presence (PoP) and Internet Exchange Points (IXP), integrating the space-based network with the terrestrial internet (WWW). Note: The indicated frequency bands (e.g., Ku: 12–18 GHz, Ka: 28–40 GHz) and data speeds illustrate the network’s capabilities.

LEO satellites providing backhaul connectivity, as shown in Figure 8 above, are extending internet services to the farthest reaches of the globe. These satellites offer many benefits, as already discussed above, in connecting remote, rural, and previously un- and under-served areas with reliable internet services. Many remote regions lack foundational telecom infrastructure, particularly the long-haul transport networks needed to carry traffic away from remote populated areas. Satellite backhaul not only offers a substantially more economical way of enhancing internet connectivity in remote areas but is often the only viable solution for connectivity.

Take, for example, Greenland. The world’s largest non-continental island, the size of Western Europe, is characterized by its sparse population and its distinct settlement pattern of communities unconnected by road, mainly along the West Coast (as well as a couple of settlements on the East Coast), shaped largely by its vast ice sheets and rugged terrain. With a population of around 56,000, primarily concentrated on the west coast, Greenland’s demographic distribution is spread over ca. 50+ settlements and about 20 towns. Nuuk, the capital, is the island’s most populous city, housing over 18,000 residents and serving as the administrative, economic, and cultural hub. Terrestrial cellular networks serve the settlements’ and towns’ communication and internet service needs, with the traffic carried back to the central switching centers by long-haul microwave links, sea cables, and satellite broadband connectivity. Some settlements’ connectivity needs can only be served by satellite backhaul, e.g., settlements on the East Coast such as Tasiilaq, with ca. 2,000 inhabitants, and Ittoqqortoormiit (an awesome name!), with around 400 inhabitants. LEO satellite backhaul solutions serving satellite-only communities, such as those operated and offered by OneWeb (Eutelsat), could provide a backhaul transport solution that matches FWA latency specifications, thanks to better round-trip-time performance than a GEO satellite backhaul solution.

It should also be clear that remote satellite-only settlements and towns may have communications service needs and demand that a localized 4G (or 5G) terrestrial cellular network with a satellite backhaul can serve much better than, for example, relying on individual ad-hoc connectivity solutions such as Starlink. When the area’s total bandwidth demand exceeds the capacity of an FWA satellite service, a localized terrestrial network solution with a satellite backhaul is, in general, better.

LEO satellites offer significantly reduced latency compared to their geostationary counterparts due to their closer proximity to the Earth. This reduction in delay is essential for a wide range of real-time applications and services, from adhering to modern radio access (e.g., 4G and 5G) requirements, VoIP, and online gaming to critical financial transactions, enhancing the user experience and broadening the scope of possible services and businesses.

Among the leading LEO satellite constellations providing backhaul solutions today are SpaceX’s Starlink (via their community gateway), aiming to deliver high-speed internet globally with a preference for direct-to-consumer connectivity; OneWeb, focusing on internet services for businesses and communities in remote areas; Telesat’s Lightspeed, designed to offer secure and reliable connectivity; and Amazon’s Project Kuiper, which plans to deploy thousands of satellites to provide broadband to unserved and underserved communities worldwide.

Satellite backhaul solutions – takeaway:

  • Satellite-backhaul solutions are an excellent, cost-effective way of providing an existing isolated cellular (and fixed access) network with high-bandwidth connectivity to the Internet (such as in remote and deep rural areas).
  • LEO satellites can reduce the need for extensive and very costly ground-based infrastructure by serving as a backhaul solution. For some areas, such as Greenland, the Sahara, or the Brazilian rainforest, it may not be practical or economical to connect by terrestrial-based transmission (e.g., long-haul microwave links or backbone & backhaul fiber) to remote settlements or towns.
  • A LEO-based backhaul solution supports applications and radio access technologies requiring a much lower round-trip time (RTT < 50 ms) than is possible with a GEO-based satellite backhaul. The achievable RTT will, however, also depend on where the LEO satellite ground gateway connects to the internet service provider.
  • The collaborative nature of a satellite-backhaul solution allows the terrestrial operator to focus on and have full control of all its customers’ network experiences, as well as optimize the traffic within its own network infrastructure.
  • LEO satellite backhaul solutions can significantly boost network resilience and availability, providing a secure and reliable connectivity solution.
  • Satellite-backhaul solutions require local ground-based satellite transmission capabilities (e.g., a satellite ground station).
  • The operator should consider that at a certain threshold of low population density, direct-to-consumer satellite services like Starlink might be more economical than constructing a local telecom network that relies on satellite backhaul (see above section on “Fixed Wireless Access (FWA) based LEO satellite solutions”).
  • Satellite backhaul providers require regulatory permits to offer backhaul services. These permits are necessary for several reasons, including the use of radio frequency spectrum, operation of satellite ground stations, and provision of telecommunications services within various jurisdictions.
  • The satellite lifetime in orbit is between 5 and 7 years, depending on the LEO altitude. MEO satellites (2,000 to 36,000 km altitude) last between 10 and 20 years, as do GEO satellites. This also dictates the modernization and upgrade cycle, as well as the timing of the ROI investment case and refinancing needs.
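The takeaway above about the population-density threshold, below which direct-to-consumer satellite services beat a local network with satellite backhaul, can be sketched as a simple breakeven calculation. All cost figures below are hypothetical placeholders chosen for illustration; a real business case would also weigh spectrum, civil works, capacity, and service quality.

```python
# Illustrative breakeven between (a) one shared local cellular site with
# satellite backhaul and (b) individual direct-to-consumer satellite kits.
# All cost figures are hypothetical placeholders, not vendor prices.

def breakeven_households(site_capex, site_opex_per_year,
                         kit_capex, kit_opex_per_year, horizon_years=7):
    """Number of households above which one shared local site (option a)
    becomes cheaper than per-household satellite kits (option b),
    over a given planning horizon."""
    site_total = site_capex + site_opex_per_year * horizon_years
    kit_total_per_hh = kit_capex + kit_opex_per_year * horizon_years
    return site_total / kit_total_per_hh

# Hypothetical: EUR 150k site capex + EUR 20k/yr opex, versus a EUR 500
# consumer kit + EUR 600/yr subscription, over a 7-year satellite lifetime.
n = breakeven_households(150_000, 20_000, 500, 600)
print(f"local site cheaper above ~{n:.0f} households")
```

With these placeholder numbers the local site wins above roughly 60 households; settlements below that size would, in this toy model, be served more economically by individual kits.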

NEW KIDS ON THE BLOCK – DIRECT-TO-DEVICE LEO SATELLITES.

A recent X-exchange (from March 2nd):

Elon Musk: “SpaceX just achieved peak download speed of 17 Mb/s from a satellite direct to unmodified Samsung Android Phone.” (note: the speed corresponds to a spectral efficiency of ~3.4 Mbps/MHz/beam).

Reply from user: “That’s incredible … Fixed wireless networks need to be looking over their shoulders?”

Elon Musk: “No, because this is the current peak speed per beam and the beams are large, so this system is only effective where there is no existing cellular service. This services works in partnership with wireless providers, like what @SpaceX and @TMobile announced.”
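The spectral-efficiency note attached to the first quote above is a one-line calculation: 17 Mbps delivered over what is understood to be a 5 MHz channel per beam (the 5 MHz figure is an assumption based on the T-Mobile USA PCS-band partnership discussed later in this section).

```python
# Quick check of the spectral-efficiency note: 17 Mbps over an assumed
# 5 MHz channel (T-Mobile PCS band) per beam.
peak_rate_mbps = 17.0
bandwidth_mhz = 5.0   # assumption: 5 MHz of PCS spectrum per beam

efficiency = peak_rate_mbps / bandwidth_mhz
print(f"{efficiency:.1f} Mbps/MHz/beam")  # -> 3.4, i.e., 3.4 bit/s/Hz
```

3.4 bit/s/Hz is a respectable figure for a cellular downlink, which is why the constraint is the small bandwidth and large beam, not the link quality.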

Figure 9 Illustrating LEO satellite direct-to-device communication in a remote area without any terrestrial communications infrastructure. The satellite is the only means of communication, whether via a normal mobile device or a classical satphone. (Courtesy: DALL-E).

Low Earth Orbit (LEO) Satellite Direct-to-Device technology enables direct communication between satellites in orbit and standard mobile devices, such as smartphones and tablets, without requiring additional specialized hardware. This technology promises to extend connectivity to remote, rural, and underserved areas globally, where traditional cellular network infrastructure is absent or economically unfeasible to deploy. The system can offer lower-latency communication by leveraging LEO satellites, which orbit closer to Earth than geostationary satellites, making it more practical for everyday use. The round trip time (RTT), the time it takes for the signal to travel from the satellite to the mobile device and back, is ca. 4 milliseconds for a LEO satellite at 550 km, compared to ca. 240 milliseconds for a geosynchronous satellite (at 36 thousand kilometers altitude).
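The quoted round-trip times follow directly from the speed of light. A minimal sketch, covering the space segment only (satellite directly overhead, ignoring processing, queuing, and ground-segment delays):

```python
# Minimum physical round-trip time (RTT) for the space segment only:
# the signal travels satellite -> device -> satellite at the speed of light.
C_KM_PER_S = 299_792.458  # speed of light in vacuum

def space_rtt_ms(altitude_km):
    """Two-way propagation delay (ms) for a satellite directly overhead.
    Ignores processing, queuing, and ground-segment delays."""
    return 2 * altitude_km / C_KM_PER_S * 1000

print(f"LEO, 550 km:     {space_rtt_ms(550):.1f} ms")     # ~3.7 ms
print(f"GEO, 35,786 km:  {space_rtt_ms(35_786):.1f} ms")  # ~238.7 ms
```

Real-world RTTs are higher, since the satellite is rarely directly overhead and the path continues through gateways to the terrestrial internet, but the two-orders-of-magnitude gap between LEO and GEO remains.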

The key advantage of a satellite in low Earth orbit is that the likelihood of a line-of-sight to a point on the ground is very high, whereas for terrestrial cellular coverage the likelihood of a line-of-sight is, in general, very low. In other words, the cellular signal propagation from a LEO satellite closely approximates that of free space. Thus, the various environmental signal loss factors we must consider for a standard terrestrial mobile network do not apply to our satellite. Put more simply, the signal propagation directly from the satellite to the mobile device is less compromised than it typically would be from a terrestrial cellular tower to the same mobile device. The difference between free-space propagation, which considers only distance and frequency, and terrestrial signal propagation models, which quantify all the gains and losses experienced by a terrestrial cellular signal, is very substantial and in favor of free-space propagation. As our Earth-bound cellular intuition of signal propagation often gets in the way of understanding the signal propagation from a satellite (or any antenna in the sky), I recommend writing down the math using the free-space propagation loss formula and comparing it with terrestrial cellular link budget models, such as the COST 231-Hata model (relatively simple) or the more recent 3GPP TR 38.901 model (complex). In rural and suburban areas, depending on the environment, indoor coverage may be marginally worse, fairly similar, or even better than from a terrestrial cell tower at a distance. This applies to both the uplink and downlink channels between the mobile device and the LEO satellite, and it is also the reason why higher frequencies (with more frequency bandwidth available) can work better on LEO satellites than in a terrestrial cellular network.
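To make the recommended comparison concrete, here is a minimal sketch computing the free-space path loss over the 550 km to a LEO satellite and the COST 231-Hata loss from a terrestrial tower only 5 km away, both at 1900 MHz. The tower height (30 m) and device height (1.5 m) are illustrative assumptions; despite the roughly 100-fold longer path, the free-space loss to the satellite comes out lower.

```python
import math

def fspl_db(d_km, f_mhz):
    """Free-space path loss (dB): only distance and frequency matter."""
    return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

def cost231_hata_db(d_km, f_mhz, h_base_m=30.0, h_mobile_m=1.5):
    """COST 231-Hata path loss (dB), small/medium city correction,
    valid roughly for f = 1500-2000 MHz and d = 1-20 km."""
    a_hm = ((1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m
            - (1.56 * math.log10(f_mhz) - 0.8))
    return (46.3 + 33.9 * math.log10(f_mhz)
            - 13.82 * math.log10(h_base_m) - a_hm
            + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km))

f = 1900  # MHz, e.g., the PCS band
print(f"FSPL, 550 km to LEO satellite: {fspl_db(550, f):.0f} dB")   # ~153 dB
print(f"COST 231-Hata, 5 km to tower:  {cost231_hata_db(5, f):.0f} dB")  # ~162 dB
```

So the clutter-laden 5 km terrestrial path loses roughly 9 dB more than the 550 km free-space path in this example, which is exactly the counter-intuitive point the paragraph above makes.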

However, despite its potential to dramatically expand coverage (after all, that is what satellites do), LEO satellite direct-to-device technology is not a replacement for terrestrial cellular services and terrestrial communications infrastructure, for several reasons: (a) Although the spectral efficiency can be excellent, the frequency bandwidth (in MHz) and data speeds (in Mbps) available through satellite connections are typically lower than those provided by ground-based cellular networks, limiting its use for high-bandwidth applications. (b) Satellite-based D2D services are, in general, capacity-limited and might not be able to handle the higher user densities typical of urban areas as efficiently as terrestrial networks, which are designed to accommodate large numbers of users through dense deployment of cell towers. (c) Environmental factors like buildings or bad weather can impact satellite communications’ reliability and quality more significantly than terrestrial services. (d) A satellite D2D service requires regulatory approval (per country), as the D2D frequency typically is licensed for terrestrial cellular services and will have to be coordinated and managed with any terrestrial use to avoid service degradation (or disruption) for customers using the same frequency on terrestrial cellular networks. The satellites must be able to switch off their D2D service when covering jurisdictions that have not granted approval or where the relevant frequencies are in terrestrial use.

Using the NewSpace Index database, updated December 2023, there are currently more than 8,000 Direct-to-Device (D2D), or Direct-to-Cell (D2C), satellites planned for launch, with SpaceX’s Starlink v2 accounting for 7,500. The remaining 795 satellites are distributed across 6 other satellite operators (e.g., AST SpaceMobile, Sateliot (Spain), Inmarsat (HEO orbit), Lynk, …). Looking at satellites designed for IoT connectivity, we get 5,302 in total, with 4,739 (not including Starlink) still planned, distributed across 50+ satellite operators. The average IoT satellite constellation, including what is currently planned, is ~95 satellites, with the majority targeting LEO. The satellite operators included in the 50+ count have confirmed funding of at least US$2 billion in total (half of the operators have confirmed funding without a disclosed amount). About 2,937 satellites (435 launched) are planned to serve IoT markets only (note: I think this seems a bit excessive). Swarm Technologies, a SpaceX subsidiary, ranks number 1 in terms of both launched and planned satellites, having launched at least 189 CubeSats (e.g., both 0.25U and 1U types) with an additional 150 planned. The second-ranked IoT-only operator is Orbcomm, with 51 satellites launched and an additional 52 planned. The remaining IoT-specific satellite operators have launched on average 5 satellites each and plan on average 55 (across 42 constellations).

There are also 3 satellite operators (i.e., Chinese-based Galaxy Space: 1,000 LEO-sats; US-based Mangata Networks: 791 MEO/HEO-sats, and US-based Omnispace: 200 LEO?-sats) that have planned a total of 2,000 satellites to support 5G applications with their satellite solutions and one operator (i.e., Hanwha Systems) has planned 2,000 LEO satellites for 6G.

The emergence of LEO satellite direct-to-device (D2D) services, as depicted in Figure 10 below, is at the forefront of satellite communication innovations, offering a direct line of connectivity between devices that bypasses the need for traditional ground-based cellular network infrastructure (e.g., cell towers). This approach benefits from the relatively short distance of hundreds of kilometers between LEO satellites and the Earth, reducing communication latency and broadening bandwidth capabilities compared to their geostationary counterparts. One of the key advantages of LEO D2D services is their ability to provide global coverage with an extensive number of satellites (i.e., in the 100s to 1,000s, depending on the targeted quality of service), ensuring that even the most remote and underserved areas have access to reliable communication channels. They are also critical in disaster resilience, maintaining communications when terrestrial networks fail due to emergencies or natural disasters.

Figure 10 This schematic presents the network architecture for satellite-based direct-to-device (D2D) communication facilitated by Low Earth Orbit (LEO) satellites, exemplified by collaborations like Starlink and T-Mobile US, Lynk Mobile, and AST Space Mobile. It illustrates how satellites in LEO enable direct connectivity between user equipment (UE), such as standard mobile devices and IoT (Internet of Things) devices, using terrestrial cellular frequencies and VHF/UHF bands. The system also shows inter-satellite links operating in the Ka-band for seamless network integration, with satellite gateways (GW) linking the space-based network to ground infrastructure, including Points of Presence (PoP) and Internet Exchange Points (IXP), which connect to the wider internet (WWW). This architecture supports innovative services like Omnispace and Astrocast, offering LEO satellite IoT connectivity. The network could be particularly crucial for defense and special operations in remote and challenging environments, such as the deserts or the Arctic regions of Greenland, where terrestrial networks are unavailable. As an example shown here, using regular terrestrial cellular frequencies in both downlink (~300 MHz to 7 GHz) and uplinks (900 MHz or lower to 2.1 GHz) ensures robust and versatile communication capabilities in diverse operational contexts.

While the majority of the 5,000+ satellite Starlink constellation serves user terminals at 13 GHz (Ku-band), at the beginning of 2024, SpaceX launched a few 2nd-generation Starlink satellites that support direct connections from the satellite to a normal cellular device (e.g., a smartphone), using 5 MHz of T-Mobile USA’s PCS band (1900 MHz). The targeted consumer service, as expressed by T-Mobile USA, provides texting capabilities across the USA in areas with no or poor existing cellular coverage. This is fairly similar to services presently offered in comparable coverage situations by, for example, AST SpaceMobile, OmniSpace, and Lynk Global LEO satellite services, with reported maximum downlink speeds approaching 20 Mbps. So-called direct-to-device, where the device is a normal smartphone without satellite connectivity functionality, is expected to develop rapidly over the next 10 years, with increasing supported user speeds (i.e., more utilized terrestrial cellular spectrum) and higher system capacity in terms of smaller coverage areas and a higher number of satellite beams.

Table 1 below provides an overview of the top 13 LEO satellite constellations targeting (fixed) internet services (e.g., Ku-band), IoT and M2M services, and direct-to-device (or direct-to-cell, D2C) services. The data has been compiled from the NewSpace Index website, with data as of 31 December 2023. The constellation rank is based on the number of satellites launched by the end of 2023. Two additional direct-to-cell (D2C, or direct-to-device, D2D) LEO satellite constellations are planned for 2024-2025. One is SpaceX’s Starlink 2nd generation, which launched at the beginning of 2024, using T-Mobile USA’s PCS band to connect (D2D) to normal terrestrial cellular handsets. The other is Inmarsat’s Orchestra satellite constellation, based on the L-band for mobile terrestrial services and the Ka-band for fixed broadband services. One new constellation (Mangata Networks, see also the NewSpace constellation information) targets 5G services. Two 5G constellations have already launched: Galaxy Space (Yinhe), with 8 LEO satellites launched and 1,000 planned using the Q- and V-bands (i.e., not a D2D cellular 5G service), and OmniSpace, which has launched two satellites and appears to have planned a total of 200. Moreover, there is currently one planned constellation targeting 6G, by the South Korean Hanwha Group (a bit premature, but interesting to follow nevertheless), with 2,000 6G (LEO) satellites planned.

Most currently launched and planned satellite constellations offering (or planning to provide) direct-to-cell services, including IoT and M2M, are designed for low-frequency-bandwidth services that are unlikely to compete with terrestrial cellular networks’ quality of service where reasonably good coverage (or better) exists.

Table 1 An overview of the Top-14 LEO satellite constellations targeting (fixed) internet services (e.g., Ku-band), IoT and M2M services, and Direct-to-Device (or direct-to-cell) services. The data has been compiled from the NewSpace Index website, with data as of 31 December 2023.

The deployment of LEO D2D services also involves navigating a complicated regulatory landscape, with the need for harmonized spectrum allocation across different regions. Managing interference with terrestrial cellular networks and other satellite operations is another interesting, albeit complex, challenge, requiring sophisticated solutions to ensure signal integrity. Moreover, despite the cost-effectiveness of LEO satellites in terms of launch and operation, establishing a full-fledged network for D2D services demands substantial initial investment, covering satellite development, launch, and the setup of supporting ground infrastructure.

LEO satellites with D2D-based capabilities – takeaway:

  • Provides lower-bandwidth services (e.g., GPRS/EDGE/HSDPA-like) where no existing terrestrial cellular service is present.
  • (Re-)use on Satellite of the terrestrial cellular spectrum.
  • D2D-based satellite services may become crucial in business continuity scenarios, providing redundancy and increased service availability to existing terrestrial cellular networks. This is particularly essential as a remedy for emergency response personnel in case terrestrial networks are not functional.
  • Limited capacity (due to the small assigned frequency bandwidth) over a large coverage area, serving rural and remote areas with little or no cellular infrastructure.
  • Securing regulatory approval for satellite services over independent jurisdictions is a complex and critical task for any operator looking to provide global or regional satellite-based communications. The satellite operator may have to switch off transmission over jurisdictions where no permission has been granted.
  • If the spectrum is also deployed on the ground, satellite use of it must be managed and coordinated (due to interference) with the terrestrial cellular networks.
  • Requires low- or non-utilized cellular spectrum in the terrestrial operator’s spectrum portfolio.
  • D2D-based communications require a more complex and sophisticated satellite design, including the satellite antenna, resulting in higher manufacturing and launch costs.
  • The IoT-only commercial satellite constellation “space” is crowded, with a total of 44 constellations (note: a few operators have several constellations). I assume that many of those plans will eventually not be realized. Note that SpaceX’s Swarm Technologies is leading and, based on the totals in the NewSpace Index database, will remain a leader from the sheer number of satellites once its plan has been realized. I expect we will see a Chinese constellation in this space as well, unless the capability is built into the Guo Wang constellation.
  • The satellite lifetime in orbit is between 5 and 7 years, depending on the altitude. This timeline also dictates the modernization and upgrade cycle, as well as the timing of the ROI investment case and refinancing needs.
  • Today’s D2D satellite systems are frequency-bandwidth limited. However, if so designed, satellites could provide a frequency asymmetric satellite-to-device connection. For instance, the downlink from the satellite to the device could utilize a high frequency (not used in the targeted rural or remote area) and a larger bandwidth, while the uplink communication between the terrestrial device and the LEO satellite could use a sufficiently lower frequency and smaller frequency bandwidth.

MAKERS OF SATELLITES.

In the rapidly evolving space industry, a diverse array of companies specializes in manufacturing satellites for Low Earth Orbit (LEO), ranging from small CubeSats to larger satellites for constellations similar to those used by OneWeb (UK) and Starlink (USA). Among these, smaller companies like NanoAvionics (Lithuania) and Tyvak Nano-Satellite Systems (USA) have carved out niches by focusing on modular and cost-efficient small satellite platforms typically below 25 kg. NanoAvionics is renowned for its flexible mission support, offering everything from design to operation services for CubeSats (e.g., 1U, 3U, 6U) and larger small satellites (100+ kg). Similarly, Tyvak excels in providing custom-made solutions for nano-satellites and CubeSats, catering to specific mission needs with a comprehensive suite of services, including design, manufacturing, and testing.

UK-based Surrey Satellite Technology Limited (SSTL) stands out for its innovative approach to small, cost-effective satellites for various applications, achieving the desired system performance, reliability, and mission objectives at a lower cost than traditional satellite projects, which easily run into hundreds of millions of USD. SSTL’s commitment to delivering satellites that balance performance and budget has made it a popular satellite manufacturer globally.

On the larger end of the spectrum, companies like SpaceX (USA) and Thales Alenia Space (France-Italy) are making significant strides in satellite manufacturing at scale. SpaceX has ventured beyond its foundational launch services to produce thousands of small satellites (250+ kg) for its Starlink broadband constellation, which comprises 5,700+ LEO satellites, showcasing mass satellite production. Thales Alenia Space offers reliable satellite platforms and payload integration services for LEO constellation projects.

With their extensive expertise in aerospace and defense, Lockheed Martin Space (USA) and Northrop Grumman (USA) produce various satellite systems suitable for commercial, military, and scientific missions. Their ability to support large-scale satellite constellation projects from design to launch demonstrates high expertise and reliability. Similarly, aerospace giants Airbus Defense and Space (EU) and Boeing Defense, Space & Security (USA) offer comprehensive satellite solutions, including designing and manufacturing small satellites for LEO. Their involvement in high-profile projects highlights their capacity to deliver advanced satellite systems for a wide range of use cases.

Together, these companies, from smaller specialized firms to global aerospace leaders, play crucial roles in the satellite manufacturing industry. They enable a wide array of LEO missions, catering to the burgeoning demand for satellite services across telecommunications, Earth observation, and beyond, thus facilitating access to space for diverse clients and applications.

ECONOMICS.

Before going into details, let’s spend some time on an example illustrating the basic components required for building a satellite and getting it to launch. Here, I point at a super cool alternative to the above-mentioned companies, the USA-based startup Apex, co-founded by CTO Max Benassi (ex-SpaceX and Astra) and CEO Ian Cinnamon. To get an impression of the macro-components of a satellite system, I recommend checking out the Apex webpage and “playing” with their satellite configurator. The basic package comes at a price tag of USD 3.2 million and a 9-month delivery window. It includes a 100 kg satellite bus platform, a power system, a communication system based on X-band (8 – 12 GHz), and a guidance, navigation, and control package. The basic package does not include a solar array drive assembly (SADA), which plays a critical role in the operation of satellites by ensuring that the satellite’s solar panels are optimally oriented toward the Sun. Adding the SADA costs an additional USD 500 thousand. The propulsion mechanism (e.g., chemical or electric; in general, there are more possibilities) is also not included (+ USD 450 thousand), nor are any services (e.g., payload & launch vehicle integration and testing, USD 575 thousand). Including the SADA, propulsion, and services, Apex will have a satellite launch-ready for close to USD 4.8 million.
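To make the arithmetic explicit, the configurator line items quoted above can be tallied in a few lines of Python (the item labels are my own shorthand, not Apex’s product names):

```python
# Tally of the Apex configurator line items quoted above (USD).
# Item labels are my own shorthand, not Apex's product names.
line_items = {
    "basic_package": 3_200_000,  # 100 kg bus, power, X-band comms, GNC
    "sada": 500_000,             # solar array drive assembly
    "propulsion": 450_000,       # e.g., chemical or electric
    "services": 575_000,         # payload & launch vehicle integration, testing
}
total = sum(line_items.values())
print(f"Launch-ready satellite (excl. payload and launch): USD {total:,}")
```

The sum comes to USD 4,725,000, the “close to USD 4.8 million” figure, before payload, launch, and ground-segment costs are even considered.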

However, we are not done. The above solution still excludes the so-called payload, i.e., the equipment or instruments required to perform the LEO satellite mission (e.g., broadband communications services), the actual satellite launch itself, and the operational aspects of a successful post-launch phase (i.e., ground infrastructure and operation center(s)).

Let’s take SpaceX’s Starlink satellite as an example illustrating mission and payload more clearly. The Starlink satellite’s primary mission is to provide fixed-wireless access broadband internet to Earth-based fixed antennas. The Starlink payload primarily consists of advanced broadband internet transmission equipment designed to provide high-speed internet access across the globe. This includes phased-array antennas for communication with user terminals on the ground, high-frequency radio transceivers to facilitate data transmission, and inter-satellite links allowing satellites to communicate in orbit, enhancing network coverage and data throughput.

The economic aspects of launching a Low Earth Orbit (LEO) satellite project span a broad spectrum of costs, from the initial concept phase to deployment and operational management. These projects commence with research and development, where significant investments are made in design, engineering, and the iterative process of prototyping and testing to ensure the satellite meets its intended performance and reliability standards in harsh space conditions (e.g., vacuum, extreme temperature variations, radiation, solar flares, high-velocity impacts with micrometeoroids and man-made space debris, erosion, …).

Manufacturing the satellite involves additional expenses, including procuring high-quality components that can withstand space conditions and assembling and integrating the satellite bus with its mission-specific payload. Ensuring the highest quality standards throughout this process is crucial to minimizing the risk of in-orbit failure, which can substantially increase project costs. The payload should be seen as the heart of the satellite’s mission. It could be a set of scientific instruments for measuring atmospheric data, optical sensors for imaging, transponders for communication, or any other equipment designed to fulfill the satellite’s specific objectives. The payload will vary greatly depending on the mission, whether for Earth observation, scientific research, navigation, or telecommunications.

Of course, there are many other types and more affordable options for LEO satellites than a Starlink-like one (although we should also not ignore the achievements of SpaceX and learn from them as much as possible). As seen from Table 1, we have a range of substantially smaller satellite types or form factors. The 1U (i.e., one unit) CubeSat is a satellite with a form factor of 10x10x11.35 cm that weighs no more than 1.33 kilograms. A rough cost range for manufacturing a 1U CubeSat could be from USD 50 to 100+ thousand, depending on mission complexity and payload components (e.g., commercial off-the-shelf versus application- or mission-specific designs). The range covers the costs associated with the satellite’s design, components, assembly, testing, and initial integration efforts. It does not, however, include other significant costs associated with satellite missions, such as launch services, ground station operations, mission control, and insurance, which are likely to (significantly) increase the total project cost. Furthermore, we have additional form factors, such as the 3U CubeSat (10x10x34.05 cm, <4 kg), with a manufacturing cost in the range of USD 100 to 500+ thousand; the 6U CubeSat (20x10x34 cm, <12 kg), which can carry more complex payload solutions than the smaller 1U and 3U, with a manufacturing cost in the range of USD 200 thousand to USD 1+ million; and 12U satellites (20x20x34 cm, <24 kg), which again support complex payload solutions and, in general, will be significantly more expensive to manufacture.
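The form factors above can be collected into a small lookup table; a sketch with the figures quoted in the text (costs exclude launch, ground segment, mission control, and insurance):

```python
# CubeSat form factors and rough manufacturing-cost ranges quoted above.
# Costs (USD) exclude launch, ground segment, mission control, and insurance.
cubesats = {
    "1U": {"dims_cm": (10, 10, 11.35), "max_kg": 1.33, "cost_usd": (50_000, 100_000)},
    "3U": {"dims_cm": (10, 10, 34.05), "max_kg": 4,    "cost_usd": (100_000, 500_000)},
    "6U": {"dims_cm": (20, 10, 34),    "max_kg": 12,   "cost_usd": (200_000, 1_000_000)},
}
for name, s in cubesats.items():
    w, d, h = s["dims_cm"]
    liters = w * d * h / 1000.0  # 1,000 cm^3 per liter
    lo, hi = s["cost_usd"]
    print(f"{name}: {liters:.2f} l, <{s['max_kg']} kg, USD {lo:,}-{hi:,}+")
```

Note that the 1U works out to roughly 1.1 liters, i.e., about one-thousandth of a cubic meter.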

Securing a launch vehicle is one of the most significant expenditures in a satellite project. This cost not only includes the price of the rocket and launch itself but also encompasses integration, pre-launch services, and satellite transportation to the launch site. Beyond the launch, establishing and maintaining the ground segment infrastructure, such as ground stations and a mission control center, is essential for successful satellite communication and operation. These facilities enable ongoing tracking, telemetry, and command operations, as well as the processing and management of the data collected by the satellite.

The SpaceX Falcon rocket is used extensively by other satellite businesses (see above Table 1) as well as by SpaceX for its own Starlink constellation network. The rocket has a payload capability of ca. 23 thousand kg and a volume handling capacity of approximately 300 cubic meters. SpaceX has launched around 60 Starlink satellites per Falcon 9 mission for the first-generation satellites. The launch cost per 1st-generation satellite would then be around USD 1 million, using the previously quoted USD 62 million (2018 figure) for a Falcon 9 launch. The second-generation Starlink satellites are substantially more advanced than the 1st generation. They are also heavier, weighing around a thousand kilograms. A Falcon 9 would only be able to launch around 20 generation-2 satellites (considering only the weight limitation), while a Falcon Heavy could lift ca. 60 2nd-generation satellites, but at a higher price point of USD 90 million (2018 figure). Thus, the launch cost per satellite would be between USD 1.5 million using Falcon Heavy and USD 3.1 million using Falcon 9. Although the launch cost is based on price figures from 2018, the efficiency gained from re-use may have kept the cost at that level or reduced it further, particularly with Falcon Heavy.
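The per-satellite launch economics above reduce to simple division; a back-of-the-envelope sketch using the 2018 list prices quoted in the text:

```python
# Launch cost per Starlink satellite, using the 2018 list prices quoted above.
scenarios = [
    # (label, launch price in USD, satellites per launch)
    ("Gen-1 on Falcon 9",     62e6, 60),
    ("Gen-2 on Falcon 9",     62e6, 20),  # ~1,000 kg each, mass-limited
    ("Gen-2 on Falcon Heavy", 90e6, 60),
]
for label, price_usd, n_sats in scenarios:
    print(f"{label}: ~USD {price_usd / n_sats / 1e6:.2f}M per satellite")
```

This reproduces the ~USD 1 million (gen-1 on Falcon 9), USD 3.1 million (gen-2 on Falcon 9), and USD 1.5 million (gen-2 on Falcon Heavy) figures.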

Satellite businesses looking to launch small volumes of satellites, such as CubeSats, have a variety of strategies at their disposal to manage launch costs effectively. One widely adopted approach is participating in rideshare missions, where the expenses of a single launch vehicle are shared among multiple payloads, substantially reducing the cost for each operator. This method is particularly attractive due to its cost efficiency and the regularity of missions offered by, for example, SpaceX. Prices for rideshare missions can start from as low as a few thousand dollars for very small payloads (like CubeSats) to several hundred thousand dollars for larger small satellites. For example, SpaceX advertises rideshare prices starting at USD 1 million for payloads up to 200 kg. Alternatively, dedicated small-launcher services cater specifically to the needs of small satellite operators, offering more tailored launch options in terms of timing and desired orbit. Launch service companies such as Rocket Lab (USA) and Astra (USA) have emerged, providing flexibility that rideshare missions might not, although at a slightly higher cost. However, these costs remain significantly lower than arranging a dedicated launch on a larger vehicle. For example, Rocket Lab’s Electron rocket, specializing in launching small satellites, offers dedicated launches with prices starting around USD 7 million for the entire launch vehicle carrying up to 300 kg. Astra has reported prices in the range of USD 2.5 million for a dedicated LEO launch with their (discontinued) Rocket 3 with payloads of up to 150 kg. The cost for individual small satellites will depend on their share of the payload mass and the specific mission requirements.

Satellite ground stations, which consist of arrays of phased-array antennas, are critical for managing the satellite constellation, routing internet traffic, and providing users with access to the satellite network. These stations are strategically located to maximize coverage and minimize latency, ensuring that at least one ground station is within the line of sight of satellites as they orbit the Earth. As of mid-2023, Starlink operated around 150 ground stations worldwide (also called Starlink Gateways), with 64 live and an additional 33 planned in the USA. The cost of constructing a ground station would be between USD 300 thousand and half a million, not including the physical access point, also called the point-of-presence (PoP), and the transport infrastructure connecting the PoP (and gateway) to the internet exchange where we find the internet service providers (ISPs) and the content delivery networks (CDNs). The PoP may add another USD 100 to 200 thousand to the ground infrastructure unit cost. The transport cost from the gateway to the internet exchange can vary a lot depending on the gateway’s location.

Insurance is a critical component of the financial planning for a satellite project, covering risks associated with both the launch phase and the satellite’s operational period in orbit. Insurance premiums generally run at between 5% and 20% of the total project cost, depending on the satellite value, the track record of the launch vehicle, mission complexity, mission duration (i.e., typically 5 – 7 years for a LEO satellite at 500 km), and so forth. Insurance can be broken up into launch insurance and coverage of the satellite once it is in orbit.

Operational costs, the Opex, include the day-to-day expenses of running the satellite, from staffing and technical support to ground station usage fees.

Regulatory and licensing fees, including frequency allocation and orbital slot registration, ensure the satellite operates without interfering with other space assets. Finally, at the end of the satellite’s operational life, costs associated with safely deorbiting the satellite are incurred to comply with space debris mitigation guidelines and ensure a responsible conclusion to the mission.

The total cost of an LEO satellite project can vary widely, influenced by the satellite’s complexity, mission goals, and lifespan. Effective project management and strategic decision-making are crucial to navigating these expenses, optimizing the project’s budget, and achieving mission success.

Figure 11 illustrates an LEO CubeSat orbiting above the Earth, capturing the satellite’s compact design and its role in modern space exploration and technology demonstration. Note that the CubeSat design comes in several standardized dimensions, with the reference design, also called 1U, being roughly one-thousandth of a cubic meter and weighing less than 1.33 kg. More advanced CubeSat satellites would typically be 6U or larger.

CubeSats (e.g., 1U, 3U, 6U, 12U):

  • Manufacturing Cost: Ranges from USD 50,000 for a simple 1U CubeSat to over USD 1 million for more complex missions supported by a 6U (or larger) CubeSat with advanced payloads (a 12U may again amount to several million US dollars).
  • Launch Cost: This can vary significantly depending on the launch provider and the rideshare opportunities, ranging from a few thousand dollars for a 1U CubeSat on a rideshare mission to several million dollars for a dedicated launch of larger CubeSats or small satellites.
  • Operational Costs: Ground station services, mission control, and data handling can add tens to hundreds of thousands of dollars annually, depending on the mission’s complexity and duration.

Small Satellites (25 kg up to 500 kg):

  • Manufacturing Cost: Ranges from USD 500,000 to over 10 million, depending on the satellite’s size, complexity, and payload requirements.
  • Launch Cost: While rideshare missions can reduce costs, dedicated launches for small satellites can range from USD 10 million to 62 million (e.g., Falcon 9) and beyond (e.g., USD 90 million for Falcon Heavy).
  • Operational Costs: These are similar to CubeSats, but potentially higher due to the satellite’s larger size and more complex mission requirements, reaching several hundred thousand to over a million dollars annually.

The range for the total project cost of a LEO satellite:

Given these considerations, the total cost range for a LEO satellite project can vary from as low as a few hundred thousand dollars for a simple CubeSat project utilizing rideshare opportunities and minimal operational requirements to hundreds of millions of dollars for more complex small satellite missions requiring dedicated launches and extensive operational support.

It is important to note that these are rough estimates, and the actual cost can vary based on specific mission requirements, technological advancements, and market conditions.

CAPACITY AND QUALITY

Figure 12 Satellite-based cellular capacity, or quality, measured by the unit or total throughput in Mbps, is approximately driven by the amount of spectrum (in MHz) times the effective spectral efficiency (in Mbps/MHz/unit) times the number of satellite beams resulting in cells on the ground.

The overall capacity and quality of satellite communication systems, given in Mbps, is, at a high level, the product of three key factors: (i) the amount of frequency bandwidth in MHz allocated to the satellite operations, multiplied by (ii) the effective spectral efficiency in Mbps per MHz over a unit satellite-beam coverage area, multiplied by (iii) the number of satellite beams that provide the resulting terrestrial cell coverage. In other words:

Satellite Capacity (in Mbps) =
Frequency Bandwidth in MHz ×
Effective Spectral Efficiency in Mbps/MHz/Beam ×
Number of Beams (or Cells)

Consider a satellite system supporting 8 beams (and thus the equivalent of 8 terrestrial coverage cells), each beam with 250 MHz allocated within the same spectral frequency range. Each beam can then efficiently support ca. 680 Mbps, achieved with an antenna setup that effectively provides a spectral efficiency of ~2.7 Mbps/MHz/cell (or beam) in the downlink (i.e., from the satellite to the ground). Moreover, the satellite will typically have another frequency and antenna configuration that establishes a robust connection to the ground station, which connects to the internet via, for example, third-party internet service providers. The 680 Mbps is then shared among the users within the satellite beam’s coverage; e.g., if you have 100 customers demanding a service, the speed each would experience on average would be around 7 Mbps. This may not seem very impressive compared to the cellular speeds we are used to getting with an LTE or 5G terrestrial cellular service. However, such speeds are, of course, much better than having no means of connecting to the internet.
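Plugging the example’s numbers into the capacity formula above gives the same figures (a sketch; the 2.7 Mbps/MHz/beam effective spectral efficiency and the 100 active users per beam are the assumptions stated in the text):

```python
# Worked example of: capacity = bandwidth x spectral efficiency x beams.
bandwidth_mhz = 250        # allocated per beam
spectral_eff = 2.7         # effective Mbps/MHz per beam (downlink)
beams = 8
active_users_per_beam = 100

per_beam_mbps = bandwidth_mhz * spectral_eff           # ~675 Mbps (ca. 680 in the text)
total_mbps = per_beam_mbps * beams                     # ~5.4 Gbps for the whole satellite
per_user_mbps = per_beam_mbps / active_users_per_beam  # ~7 Mbps on average
print(f"per beam: {per_beam_mbps:.0f} Mbps, "
      f"satellite total: {total_mbps / 1000:.1f} Gbps, "
      f"per user: {per_user_mbps:.1f} Mbps")
```

Doubling either the allocated bandwidth or the number of beams doubles the satellite’s total capacity, which is why both are such important design levers.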

Higher frequencies (i.e., in the GHz range) used to provide terrestrial cellular broadband services are, in general, quite sensitive to the terrestrial environment and non-LoS propagation. It is a basic principle of physics that signal propagation characteristics, including the range and penetration capabilities of electromagnetic waves, are inversely related to frequency. Vegetation and terrain become increasingly critical factors in higher-frequency propagation and the resulting quality of coverage. For example, trees, forests, and other dense foliage can absorb and scatter radio waves, attenuating signals. The type and density of vegetation, along with seasonal changes like foliage density in summer versus winter, can significantly impact signal strength. Terrains often include varied topographies such as housing, hills, valleys, and flat plains, each affecting signal reach differently. For instance, built-up, hilly, or mountainous areas may cause signal shadowing and reflection, while flat terrain might offer less obstruction, enabling signals to travel further. Cellular mobile operators tend to like high frequencies (GHz) for cellular broadband services, as it is possible to get substantially more system throughput, in bits per second, to deliver to demanding customers than at frequencies in the MHz range. As can be observed in Figure 12 above, the frequency bandwidth is a multiplier for satellite capacity and quality. At the same time, cellular mobile operators tend to “dislike” higher frequencies because of their poorer propagation conditions in terrestrial cellular networks, which result in the need for increased site densification at a significant incremental capital expense.

The key advantage of a LEO satellite is that the likelihood of a line-of-sight to a point on the ground is very high, whereas for terrestrial cellular coverage it would, in general, be very low. In other words, the signal propagation from a satellite closely approximates that of free space. Thus, all the various environmental signal-loss factors we must consider for a standard terrestrial mobile network do not apply to our satellite, which only has to overcome the distance from the satellite antenna to the ground.
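The “only distance to overcome” point can be quantified with the standard free-space path loss formula, FSPL(dB) = 20 log10(d_km) + 20 log10(f_GHz) + 92.45. The altitudes and frequencies below are my own illustrative assumptions, not figures from the text:

```python
import math

def fspl_db(distance_km: float, freq_ghz: float) -> float:
    """Free-space path loss in dB for a distance in km and frequency in GHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

# Illustrative assumptions: a LEO satellite overhead at 550 km on a 12 GHz
# Ku-band downlink, versus a rural terrestrial cell edge at 2 km and 2 GHz.
print(f"LEO, 550 km @ 12 GHz: {fspl_db(550, 12):.1f} dB")
print(f"Cell, 2 km @ 2 GHz:   {fspl_db(2, 2):.1f} dB")
```

Under these assumptions the satellite link loses roughly 64 dB more to distance alone, but, unlike the terrestrial link, it suffers essentially none of the vegetation, terrain, and building losses discussed above, which is what the satellite’s link budget trades on.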

Let us first look at the frequency component of the above satellite capacity and quality formula:

FREQUENCY SPECTRUM FOR SATELLITES.

The satellite frequency spectrum encompasses a range of electromagnetic frequencies allocated specifically for satellite communication. These frequencies are divided into bands, commonly known as L-band (e.g., mobile broadband), S-band (e.g., mobile broadband), C-band, X-band (e.g., mainly used by military), Ku-band (e.g., fixed broadband), Ka-band (e.g., fixed broadband), and V-band. Each serves different satellite applications due to its distinct propagation characteristics and capabilities. The spectrum bandwidth used by satellites refers to the width of the frequency range that a satellite system is licensed to use for transmitting and receiving signals.

Careful management of satellite spectrum bandwidth is critical to prevent interference with terrestrial communications systems. Since both satellite and terrestrial systems can operate on similar frequency ranges, there is a potential for crossover interference, which can degrade the performance of both systems. This is particularly important for bands like C-band and Ku-band, which are also used for terrestrial cellular networks and other applications like broadcasting.

Using the same spectrum for both satellite and terrestrial cellular coverage within the same geographical area is challenging due to the risk of interference. Satellites transmit signals over vast areas, and if those signals are on the same frequency as terrestrial cellular systems, they can overpower the local ground-based signals, causing reception issues for users on the ground. Conversely, the uplink signals from terrestrial sources can interfere with the satellite’s ability to receive communications from its service area.

Regulatory bodies such as the International Telecommunication Union (ITU) are crucial in mitigating these interference issues. They coordinate the allocation of frequency bands and establish regulations that govern their use. This includes defining geographical zones where certain frequencies may be used exclusively for either terrestrial or satellite services, as well as setting limits on signal power levels to minimize the chance of interference. Additionally, technology solutions like advanced filtering, beam shaping, and polarization techniques are employed to further isolate satellite communications from terrestrial systems, ensuring that both may coexist and operate effectively without mutual disruption.

The International Telecommunication Union (ITU) has designated several frequency bands for Fixed Satellite Services (FSS) and Mobile Satellite Services (MSS) that can be used by satellites operating in Low Earth Orbit (LEO). The specific bands allocated for satellite services, FSS and MSS, are determined by the ITU’s Radio Regulations, which are periodically updated to reflect global telecommunication’s evolving needs and address emerging technologies. Here are some of the key frequency bands commonly considered for FSS and MSS with LEO satellites:

V-Band 40 GHz to 75 GHz (microwave frequency range).
The V-band is appealing for Low Earth Orbit (LEO) satellite constellations designed to provide global broadband internet access. LEO satellites can benefit from the V-band’s capacity to support high data rates, which is essential for serving densely populated areas and delivering competitive internet speeds. The reduced path loss at lower altitudes, compared to GEO, also makes the V-band a viable option for LEO satellites. Due to its higher frequencies, however, the V-band is significantly more sensitive to atmospheric attenuation (e.g., oxygen absorption around 60 GHz), including rain fade, which is likely to affect signal integrity. This necessitates the development of advanced technologies for adaptive coding and modulation, power amplification, and beamforming to ensure reliable communication under various weather conditions. Several LEO satellite operators have expressed an interest in operationalizing the V-band in their satellite constellations (e.g., Starlink, OneWeb, Kuiper, Lightspeed). This band should be regarded as an emergent LEO frequency band.

Ka-Band 17.7 GHz to 20.2 GHz (Downlink) & 27.5 GHz to 30.0 GHz (Uplink).
The Ka-band offers higher bandwidths, enabling greater data throughput than lower bands. Not surprisingly, this band is favored by high-throughput satellite solutions and is widely used by fixed satellite services (FSS), making it ideal for high-speed internet services. However, it is more susceptible to absorption and scattering by atmospheric particles, including raindrops and snowflakes, which weakens the signal strength by the time it reaches the receiver. To mitigate rain fade effects in the Ka-band, satellite and ground equipment must be designed with higher link margins, incorporating more powerful transmitters and more sensitive receivers. Additionally, adaptive modulation and coding techniques can be employed to adjust the signal dynamically in response to changing weather conditions. Overall, the system is more costly and, therefore, primarily used for satellite-to-Earth ground station communications and high-performance satellite backhaul solutions.

For example, Starlink and OneWeb use the Ka-band to connect to satellite Earth gateways and points-of-presence, which connect to internet exchanges and the wider internet. It is worth noting that the terrestrial 5G band n257 (26.5 to 29.5 GHz) falls within the Ka-band’s uplink frequency range. Furthermore, SES’s mPower satellites, operating in Medium Earth Orbit (MEO), operate exclusively in this band, providing internet backhaul services.

Ku-Band 12.75 GHz to 13.25 GHz (Downlink) & 14.0 GHz to 14.5 GHz (Uplink).
The Ku-band is widely used for FSS satellite communications, including fixed satellite services, due to its balance between bandwidth availability and susceptibility to rain fade. It is suitable for broadband services, TV broadcasting, and backhaul connections. For example, Starlink and OneWeb satellites are using this band to provide broadband services to earth-based customer terminals.

X-Band 7.25 GHz to 7.75 GHz (Downlink) & 7.9 GHz to 8.4 GHz (Uplink).
The X-band in satellite applications is governed by international agreements and national regulations to prevent interference between different services and to ensure the efficient use of the spectrum. The X-band is extensively used for secure military satellite communications, offering advantages like high data rates and relative resilience to jamming and eavesdropping. It supports a wide range of military applications, including mobile command, control, communications, computer, intelligence, surveillance, and reconnaissance (i.e., C4ISR) operations. Most defense-oriented satellites operate at geostationary orbit, ensuring constant coverage of specific geographic areas (e.g., Airbus Skynet constellations, Spain’s XTAR-EUR, and France’s Syracuse satellites). Most European LEO defense satellites, used primarily for reconnaissance, are fairly old, with more than 15 years since the first launch, and are limited in numbers (i.e., <10). The most recent European LEO satellite system is the French-based Multinational Space-based Imaging System (MUSIS) and Composante Spatiale Optique (CSO), where the first CSO components were launched in 2018. There are few commercial satellites utilizing the X-band.

C-Band 3.7 GHz to 4.2 GHz (Downlink) & 5.925 GHz to 6.425 GHz (Uplink)
C-band is less susceptible to rain fade and is traditionally used for satellite TV broadcasting, maritime, and aviation communications. However, parts of the C-band are also being repurposed for terrestrial 5G networks in some regions, leading to potential conflicts and the need for careful coordination. The C-band is primarily used in geostationary orbit (GEO) rather than Low Earth Orbit (LEO), due to the historical allocation of C-band for fixed satellite services (FSS) and its favorable propagation characteristics. I haven’t really come across any LEO constellation using the C-band. GEO FSS satellite operators using this band extensively include SES (Luxembourg), Intelsat (Luxembourg/USA), Eutelsat (France), and Inmarsat (UK).

S-Band 2.0 GHz to 4.0 GHz
S-band is used for various applications, including mobile communications, weather radar, and some types of broadband services. It offers a good compromise between bandwidth and resistance to atmospheric absorption. Both Omnispace (USA) and Globalstar (USA) LEO satellites operate in this band. Omnispace is also interesting as they have expressed an intent to have LEO satellites supporting 5G services in band n257 (26.5 to 29.5 GHz), which falls within the uplink of the Ka-band.

L-Band 1.0 GHz to 2.0 GHz
L-band is less commonly used for fixed satellite services but is notable for its use in mobile satellite services (MSS), satellite phone communications, and GPS. It provides good coverage and penetration characteristics. Both Lynk Mobile (USA), offering Direct-2-Device, IoT, and M2M services, and Astrocast (Switzerland), with their IoT/M2M services, are examples of LEO satellite businesses operating in this band.

UHF 300 MHz to 3.0 GHz
The UHF band is more widely used for satellite communications, including mobile satellite services (MSS), satellite radio, and some types of broadband data services. It is favored for its relatively good propagation characteristics, including the ability to penetrate buildings and foliage. For example, Fossa Systems LEO pico-satellites (i.e., 1p form-factor) use this frequency for their IoT and M2M communications services.

VHF 30 MHz to 300 MHz

The VHF band is less commonly used in satellite communications for commercial broadband services. Still, it is important for applications such as satellite telemetry, tracking, and control (TT&C) operations and amateur satellite communications. Its use is often limited due to the lower bandwidth available and the higher susceptibility to interference from terrestrial sources. For example, Swarm Technologies (USA, a SpaceX subsidiary) uses 137 – 138 MHz (downlink) and 148 – 150 MHz (uplink), although it appears that they have stopped taking new devices onto their network. Orbcomm (USA) is another example of a satellite service provider using the VHF band for IoT and M2M communications. There is very limited capacity in this band due to the many other existing use cases, and LEO satellite companies appear to plan to upgrade to the UHF band or to piggyback on direct-2-cell (or direct-2-device) satellite solutions, enabling LEO satellite communications with 3GPP-compatible IoT and M2M devices.

SATELLITE ANTENNAS.

Satellites operating in Geostationary Earth Orbit (GEO), Medium Earth Orbit (MEO), and Low Earth Orbit (LEO) utilize a variety of antenna types tailored to their specific missions, which range from communication and navigation to observation (e.g., signal intelligence). The satellite’s applications influence the selection of an antenna, the characteristics of its orbit, and the coverage area required.

Antenna technology is intrinsically linked to spectral efficiency in satellite communications systems and, indeed, in any other wireless system. Antenna design influences how effectively a communication system can transmit and receive signals within a given frequency band, which is the essence of spectral efficiency (i.e., how much information per unit time, in bits per second, can I squeeze through my communications channel).

Thus, advancements in antenna technology are fundamental to improving spectral efficiency, making it a key area of research and development in the quest for more capable and efficient communication systems.
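The link between antenna performance and spectral efficiency can be made concrete with the Shannon bound, C/B = log2(1 + SNR): better antennas raise the received SNR, which raises the ceiling on achievable bits per second per Hz. The SNR values below are illustrative assumptions of mine, not figures from the text:

```python
import math

def shannon_spectral_efficiency(snr_db: float) -> float:
    """Upper bound on spectral efficiency (bit/s/Hz) at a given SNR in dB."""
    snr_linear = 10 ** (snr_db / 10)
    return math.log2(1 + snr_linear)

# Illustrative: each 10 dB of extra link margin (e.g., from antenna gain)
# substantially lifts the achievable spectral-efficiency ceiling.
for snr_db in (0, 10, 20):
    print(f"SNR {snr_db:>2} dB -> up to {shannon_spectral_efficiency(snr_db):.2f} bit/s/Hz")
```

Real systems operate below this bound, but the trend is exactly why the ~2.7 Mbps/MHz/beam figure used earlier depends so strongly on the antenna setup.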

Parabolic dish antennas are prevalent for GEO satellites due to their high gain and narrow beam width, making them ideal for broadcasting and fixed satellite services. These antennas focus a tight beam on specific areas on Earth, enabling strong and direct signals essential for television, internet, and communication services. Horn antennas, while simpler, are sometimes used as feeds for larger parabolic antennas or for telemetry, tracking, and command functions due to their reliability. Additionally, phased array antennas are becoming more common in GEO satellites for their ability to steer beams electronically, offering flexibility in coverage and the capability to handle multiple beams and frequencies simultaneously.

Phased-array antennas are indispensable for MEO satellites, such as those used in navigation systems like GPS (USA), BeiDou (China), Galileo (European), or GLONASS (Russian). These satellite constellations cover large areas of the Earth’s surface and can adjust beam directions dynamically, a critical feature given the satellites’ movement relative to the Earth. Patch antennas are also widely used in MEO satellites, especially for mobile communication constellations, due to their compact and low-profile design, making them suitable for mobile voice and data communications.

Phased-array antennas are very important for LEO satellite use cases as well, which include broadband communication constellations like Starlink and OneWeb. Their (fast) beam-steering capabilities are essential for maintaining continuous communication with ground stations and user terminals as the satellites quickly traverse the sky. Phased-array antennas also allow coverage to be optimized with both narrow and wide fields of view (from the perspective of the satellite antenna), letting the satellite operator trade off cell capacity against cell coverage.
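
The electronic beam steering mentioned above boils down to applying a progressive phase shift across the array elements. Below is a minimal sketch for a uniform linear array; the element count, spacing, and steering angle are illustrative assumptions:

```python
import math

def element_phase_shifts(n_elements: int, spacing_m: float,
                         wavelength_m: float, steer_deg: float) -> list:
    """Per-element phases (radians) that steer a uniform linear array
    to steer_deg off boresight: phi_i = i * k * d * sin(theta)."""
    k = 2 * math.pi / wavelength_m          # wavenumber
    dphi = k * spacing_m * math.sin(math.radians(steer_deg))
    return [i * dphi for i in range(n_elements)]

# Half-wavelength element spacing, steering 30 degrees off boresight:
shifts = element_phase_shifts(8, 0.5, 1.0, 30.0)
print([round(s, 3) for s in shifts])
```

Because these phase settings can be updated electronically in microseconds, the beam can track a fast-moving LEO satellite (or ground terminal) with no mechanical movement at all.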

Simpler dipole antennas are employed for more straightforward data relay and telemetry purposes in smaller satellites and CubeSats, where space and power constraints are significant factors. Reflectarray antennas, which offer a mix of high gain and beam-steering capabilities, are used in specific LEO satellites for communication and observation applications (e.g., for signal intelligence gathering), combining features of both parabolic and phased-array antennas.

Mission-specific requirements drive the choice of antenna for a satellite. For example, GEO satellites often use high-gain, narrowly focused antennas due to their fixed position relative to the Earth, while MEO and LEO satellites, which orbit much closer to the Earth’s surface, require antennas capable of maintaining stable connections with moving ground terminals or covering large geographical areas.

Advanced antenna technologies such as beamforming, phased arrays, and Multiple Input Multiple Output (MIMO) antenna configurations are crucial in managing and utilizing the spectrum more efficiently. They enable precise targeting of radio waves, minimizing interference, and optimizing bandwidth usage. This direct control over the transmission path and signal shape allows more data (bits) to be sent and received within the same spectral space, effectively increasing the communication channel’s capacity. In particular, MIMO antenna configurations and advanced antenna beamforming have enabled terrestrial mobile cellular access technologies (e.g., LTE and 5G) to push effective spectral efficiency, broadband speed, and capacity orders of magnitude beyond the older 2G and 3G technologies. Similar principles are being deployed today in modern advanced communications satellite antennas, providing increased capacity and quality within the cellular coverage area provided by the satellite beam.

Moreover, antenna technology developments like polarization and frequency reuse directly impact a satellite system’s ability to maximize spectral resources. Allowing simultaneous transmissions on the same frequency through different polarizations or spatial separations effectively doubles the capacity without needing additional spectrum.

WHERE DO WE END UP.

If all current commercial satellite plans were realized, within the next decade we would have more than 65 thousand satellites circling Earth, possibly substantially more. Today, that number is less than 10 thousand, with more than half of it accounted for by Starlink’s LEO constellation. Imagine the increase in, and the amount of, space debris circling Earth within the next 10 years. This will likely pose a substantial increase in operational risk for new space missions and will have to be addressed urgently.

Over the next decade, we may have at least 2 major LEO satellite constellations. One from Starlink, with an excess of 12 thousand satellites, and one from China, the Guo Wang, the state network, likewise with 12 thousand LEO satellites. One global satellite constellation is from an American-based commercial company; the other is a worldwide satellite constellation representing the Chinese state. It would not be too surprising if, by 2034, the two constellations had divided Earth between them: one part serviced by the commercial constellation (e.g., North America, Europe, parts of the Middle East, some of APAC including India, possibly some parts of Africa), and another part served by the Chinese-controlled LEO constellation providing satellite broadband service to China, Russia, significant parts of Africa, and parts of APAC.

Over the next decade, satellite services will undergo transformative advancements, reshaping the architecture of global communication infrastructures and significantly impacting various sectors, including broadband internet, global navigation, Earth observation, and beyond. As these services evolve, we should anticipate major leaps in satellite technologies, driven by innovation in propulsion systems, miniaturization of technology, advancements in onboard processing capabilities, the increasing use of AI and machine learning to leapfrog satellites’ operational efficiency and performance, breakthroughs in materials science reducing weight and increasing packing density, leaps in antenna technology, and, last but not least, much more efficient use of the radio frequency spectrum. Moreover, we will see breakthrough innovations that allow better co-existence and autonomous collaboration in frequency spectrum utilization between non-terrestrial and terrestrial networks, reducing the need for much regulatory bureaucracy that might anyway be replaced by decentralized autonomous organizations (DAOs) and smart contracts. This development will be essential as satellite constellations are being integrated into 5G and 6G network architectures as the non-terrestrial network cellular access component. This particular topic, like many in this article, is worth a whole new article on its own.

I expect that over the next 10 years we will see electronically steerable phased-array antennas as a notable advancement. These offer increased agility and efficiency in beamforming and signal direction. Their ability to swiftly adjust beams for optimal coverage and connectivity without physical movement makes them perfect for the dynamic nature of Low Earth Orbit (LEO) satellite constellations. This technology will become increasingly cost-effective and energy-efficient, enabling widespread deployment across various satellite platforms (not only LEO designs). The advances in phased-array antenna technology will facilitate a substantial increase in satellite system capacity by increasing the number of beams, varying the beam size (possibly down to the level of an individual customer ground station), and supporting multi-band operations within the same antenna.

Another promising development is the integration of metamaterials in antenna design, which will lead to more compact, flexible, and lightweight antennas. The science of metamaterials is super interesting: artificial materials are engineered to have properties not found in naturally occurring materials, with unique electromagnetic behaviors arising from their internal structure. Metamaterial antennas are going to offer superior performance, including better signal control and reduced interference, which is crucial for maintaining high-quality broadband connections. These materials are also important for substantially reducing the weight of the satellite antenna while boosting its performance. Thus, the technology will also support bringing the satellite launch cost down dramatically.

Although MIMO antennas are primarily associated with terrestrial networks, I would also expect massive MIMO technology to find applications in satellite broadband systems. Satellite systems, just like ground-based cellular networks, can significantly increase their capacity and efficiency by utilizing many antenna elements to simultaneously communicate with multiple ground terminals. This could be particularly transformative for next-generation satellite networks, supporting higher data rates and accommodating more users. The technology will increase the capacity and quality of the satellite system dramatically, as it has done in terrestrial cellular networks.

Furthermore, advancements in onboard processing capabilities will allow satellites to perform more complex signal processing tasks directly in space, reducing latency and improving the efficiency of data transmission. Coupled with AI and machine learning algorithms, future satellite antennas could dynamically optimize their operational parameters in real-time, adapting to changes in the network environment and user demand.

Additionally, research into quantum antenna technology may offer breakthroughs in satellite communication, providing unprecedented levels of sensitivity and bandwidth efficiency. Although still early, quantum antennas could revolutionize signal reception and transmission in satellite broadband systems. In the context of LEO satellite systems, I am particularly excited about utilizing the Rydberg effect to enhance system sensitivity, which could lead to massive improvements. The heightened sensitivity of Rydberg atoms to electromagnetic fields could be harnessed to develop ultra-sensitive detectors for radio frequency (RF) signals. Such detectors could surpass the performance of traditional semiconductor-based devices in terms of sensitivity and selectivity, enabling satellite systems to detect weaker signals, improve signal-to-noise ratios, and even operate effectively over greater distances or with less power. Furthermore, space could potentially be the near-ideal environment for operationalizing Rydberg antennas and communications systems, as space has a near-perfect vacuum, very low temperatures (in Earth’s shadow at least, or with proper thermal management), relative freedom from electromagnetic radiation (compared to Earth), and a micro-gravity environment that may facilitate long-range “communications” between Rydberg atoms. This particular topic may be further out in the future than “just” a decade from now, although it may also be with satellites that we will see the first promising results of this technology.

One key area of development will be the integration of LEO satellite networks with terrestrial 5G and emerging 6G networks, marking a significant step in the evolution of Non-Terrestrial Network (NTN) architectures. This integration promises to deliver seamless, high-speed connectivity across the globe, including in remote and rural areas previously underserved by traditional broadband infrastructure. By complementing terrestrial networks, LEO satellites will help achieve ubiquitous wireless coverage, facilitating a wide range of applications and use cases from high-definition video streaming to real-time IoT data collection.

The convergence of LEO satellite services with 5G and 6G will also spur network management and orchestration innovation. Advanced techniques for managing interference, optimizing handovers between terrestrial and non-terrestrial networks, and efficiently allocating spectral resources will be crucial. It would be odd not to mention it here, so artificial intelligence and machine learning algorithms will, of course, support these efforts, enabling dynamic network adaptation to changing conditions and demands.

Moreover, the next decade will likely see significant improvements in the environmental sustainability of LEO satellite operations. Innovations in satellite design and materials, along with more efficient launch vehicles and end-of-life deorbiting strategies, will help mitigate the challenges of space debris and ensure the long-term viability of LEO satellite constellations.

In the realm of global connectivity, LEO satellites will have bridged the digital divide, offering affordable and accessible internet services to the billions of people worldwide unconnected today. The estimate for 2023 is that about 3 billion people, almost 40% of the world’s population, have never used the internet. In the next decade, it must be our ambition that, with LEO satellite networks, this number is brought down to very near zero. This will have profound implications for education, healthcare, economic development, and global collaboration.

FURTHER READING.

  1. A. Vanelli-Coralli, N. Chuberre, G. Masini, A. Guidotti, M. El Jaafari, “5G Non-Terrestrial Networks.”, Wiley (2024). A recommended reading for deep diving into NTN networks of satellites, typically the LEO kind, and High-Altitude Platform Systems (HAPS) such as stratospheric drones.
  2. I. del Portillo et al., “A technical comparison of three low earth orbit satellite constellation systems to provide global broadband,” Acta Astronautica, (2019).
  3. Nils Pachler et al., “An Updated Comparison of Four Low Earth Orbit Satellite Constellation Systems to Provide Global Broadband” (2021).
  4. Starlink, “Starlink specifications” (Starlink.com page). The following Wikipedia resource is quite good as well: Starlink.
  5. Quora, “How much does a satellite cost for SpaceX’s Starlink project and what would be the cheapest way to launch it into space?” (June 2023). This link includes a post from Elon Musk commenting on the cost involved in manufacturing the Starlink satellite and the cost of launching SpaceX’s Falcon 9 rocket.
  6. Michael Baylor, “With Block 5, SpaceX to increase launch cadence and lower prices.”, nasaspaceflight.com (May, 2018).
  7. Gwynne Shotwell, TED Talk from May 2018. She quotes here a total of USD 10 billion as a target for the 12,000 satellite network. This is just an amazing visionary talk/discussion about what may happen by 2028 (in 4-5 years ;-).
  8. Juliana Suess, “Guo Wang: China’s Answer to Starlink?”, (May 2023).
  9. Makena Young & Akhil Thadani, “Low Orbit, High Stakes, All-In on the LEO Broadband Competition.”, Center for Strategic & International Studies CSIS, (Dec. 2022).
  10. AST SpaceMobile website: https://ast-science.com/ Constellation Areas: Internet, Direct-to-Cell, Space-Based Cellular Broadband, Satellite-to-Cellphone. 243 LEO satellites planned. 2 launched.
  11. Lynk Global website: https://lynk.world/ (see also FCC Order and Authorization). It should be noted that Lynk can operate within 617 to 960 MHz (Space-to-Earth) and 663 to 915 MHz (Earth-to-Space). However, only outside the USA. Constellation Area: IoT / M2M, Satellite-to-Cellphone, Internet, Direct-to-Cell. 8 LEO satellites out of 10 planned.
  12. Omnispace website: https://omnispace.com/ Constellation Area: IoT / M2M, 5G. Ambition to have the world’s first global 5G non-terrestrial network. Initial support 3GPP-defined Narrow-Band IoT radio interface. Planned 200 LEO and <15 MEO satellites. So far, only 2 satellites have been launched.
  13. NewSpace Index: https://www.newspace.im/ I find this resource to have excellent and up-to-date information on commercial satellite constellations.
  14. R.K. Mailloux, “Phased Array Antenna Handbook, 3rd Edition”, Artech House, (September 2017).
  15. A.K. Singh, M.P. Abegaonkar, and S.K. Koul, “Metamaterials for Antenna Applications”, CRC Press (September 2021).
  16. T.L. Marzetta, E.G. Larsson, H. Yang, and H.Q. Ngo, “Fundamentals of Massive MIMO”, Cambridge University Press, (November 2016).
  17. G.Y. Slepyan, S. Vlasenko, and D. Mogilevtsev, “Quantum Antennas”, arXiv:2206.14065v2, (June 2022).
  18. R. Huntley, “Quantum Rydberg Receiver Shakes Up RF Fundamentals”, EE Times, (January 2022).
  19. Y. Du, N. Cong, X. Wei, X. Zhang, W. Lou, J. He, and R. Yang, “Realization of multiband communications using different Rydberg final states”, AIP Advances, (June 2022). Demonstrating the applicability of the Rydberg effect in digital transceivers in the Ku and Ka bands.

ACKNOWLEDGEMENT.

I greatly acknowledge my wife, Eva Varadi, for her support, patience, and understanding during the creative process of writing this article.

On Cellular Data Pricing, Revenue & Consumptive Growth Dynamics, and Elephants in the Data Pipe.

I am getting a bit sentimental, as I haven’t written much about cellular data consumption for the last 10+ years. At the time, it did not take long for most folks in and out of our industry to realize that data traffic, and thereby (so many believed) the total cost of providing cellular data, would grow far beyond the associated data revenues, e.g., remember the famous scissor chart back in the early twenty-tens. Many believed (then) that cellular data growth would be the undoing of the cellular industry. In 2011, many believed that the industry only had a few more years before the total cost of providing cellular data would exceed the revenue, rendering cellular data unprofitable. Ten years later, our industry remains alive and kicking (though it might not want to admit it too loudly).

Much of the past fear was due to not completely understanding the technology drivers, e.g., bits per second is a driver; bytes, which price plans were structured around, not so much. The initially huge growth rates of observed data consumption did not make the unease smaller, i.e., it is often forgotten that a little more can represent a huge growth rate when you start with almost nothing. Moreover, we also did have big scaling challenges with 3G data delivery. It quickly became clear that 3G was not what it had been hyped to be by the industry.

And … despite the historical evidence to the contrary, there are still to this day many industry insiders that believe that a Byte lost or gained is directly related to a loss or gain in revenue in a linear fashion. Our brains prefer straight lines and linear thinking, happily ignoring the unpleasantries of the non-linear world around us, often created by ourselves.

Figure 1 illustrates linear or straight-line thinking (left side), preferred by our human brains, contrasting the often non-linear reality (right side). It should be emphasized that horizontal and vertical lines, although linear, are not typically something that instinctively enters the cognitive process of assessing real-world trends.

Of course, if the non-linear price plans for cellular data were as depicted above in Figure 1, such insiders would be right even if anchored in linear thinking (i.e., even in the non-linear example to the right, an increase in consumption (GBs) leads to an increase in revenue). However, when it comes to cellular data price plans, the price vs. consumption is much more “beastly,” as shown below (in Figure 2);

Figure 2 illustrates the two most common price plan structures in Telcoland; (a, left side) the typical step-function price logic that associates a range of data consumption with a price point, i.e., the price is a constant independent of the consumption over the data range. The price level is presented as price versus the maximum allowed consumption. This is by far the most common price plan logic in use. (b, right side) The “unlimited” price plan logic has one price level and allows for unlimited data consumption. T-Mobile US, Swisscom, and SK Telecom are all good examples of operators that have endorsed such unlimited pricing logic. The interesting fact is that most of those operators have several levels of unlimited tied to consumptive behavior, where above a given limit the customer may be throttled (i.e., the speed will be reduced compared to before reaching the limit), or (and!) the unlimited plan is tied to either radio access technology (e.g., 4G, 4G+5G, 5G) or a given speed (e.g., 50 Mbps, 100 Mbps, 1 Gbps, ..).

Most cellular data price plans follow a step-function-like pricing logic as shown in Figure 2 (left side), where within each level the price is constant up to the nominal data consumption value (i.e., purple dot) of the given plan, irrespective of the consumption. The most extreme version of this logic is the unlimited price plan, where the price level is independent of the volumetric data consumption. Although, “funny” enough, many operators have designed unlimited price plans that, in one way or another, depend on the customers’ consumption, e.g., after a certain level of unlimited consumption (e.g., 200 GB), cellular speed is throttled substantially (at least if the cell under which the customer demands resources is congested). So the “logic” is that if you want true unlimited, you still need to pay more than if you only require “unlimited”. Note, for the mathematically inclined, that the step function is regarded as (piecewise) linear … although our linear brains might not appreciate that finesse very much. Maybe the heuristic that “The brain thinks in straight lines” would be more precisely restated as “The brain thinks in continuous, non-constant, monotonic straight lines”.

Any increase in consumption within a given pricing-consumption level will not result in any additional revenue. Most price plans allow for considerable growth without incurring additional associated revenues.
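
The revenue insensitivity to within-tier consumption growth can be made concrete with a small sketch. The tiers below are hypothetical, not any operator’s actual plan:

```python
# Hypothetical tiered price plan: (data cap in GB, monthly price in EUR).
TIERS = [(5, 10.0), (20, 15.0), (50, 20.0), (float("inf"), 25.0)]

def monthly_price(consumption_gb: float) -> float:
    """Step-function pricing: the price is constant within each tier,
    regardless of how much of the tier's allowance is actually used."""
    for cap_gb, price_eur in TIERS:
        if consumption_gb <= cap_gb:
            return price_eur

# Consumption triples from 6 GB to 18 GB, yet revenue is unchanged,
# since both volumes fall inside the 20 GB tier:
print(monthly_price(6.0), monthly_price(18.0))
```

Only when a customer crosses a tier boundary (or upgrades plan) does the growth in bytes show up as growth in euros.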

NETHERLANDS vs INDONESIA – BRIEFLY.

I like to keep informed and updated about markets I have worked in, and about operators I have worked for and with. Having worked across the globe in many very diverse markets, and with operators in vastly different business cycles, gives an interesting perspective on our industry. Throughout my career, I have been super interested in the difference between Telco operations and strategies in so-called mature markets versus what today may be much more of a misnomer than 10+ years ago, emerging markets.

The average cellular consumption per customer (excluding WiFi) in Indonesia was ca. 8 GB per month in 2022. That consumption would cost around 50 thousand Rp (ca. 3 euros) per month. For comparison, in The Netherlands, that consumption profile would cost a consumer around 16 euros per month. As of May 2023, the median cellular download speed in The Netherlands was 106 Mbps (i.e., helped by countrywide 5G deployment; for 4G only, the speed would be around 60 to 80 Mbps), compared with 22 Mbps in Indonesia (i.e., where 5G has just been launched). Interestingly, although most likely coincidentally, an Indonesian cellular data customer would pay ca. 5 times less than a Dutch one for the same volumetric consumption. Note that for 2023, the average annual income in Indonesia is about one-quarter of that in the Netherlands. However, the Indonesian cellular consumer would also get about one-fifth of the quality, measured by downlink speed from the cellular base station to the consumer’s smartphone.
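
The comparison above can be summarized in a few lines, using the approximate figures quoted in the text:

```python
# Approximate figures from the text (2022/2023).
monthly_gb = 8                                   # consumption profile
price_eur = {"Indonesia": 3.0, "Netherlands": 16.0}
median_speed_mbps = {"Indonesia": 22, "Netherlands": 106}

for country in price_eur:
    print(f"{country}: {price_eur[country] / monthly_gb:.2f} EUR/GB")

price_ratio = price_eur["Netherlands"] / price_eur["Indonesia"]
speed_ratio = median_speed_mbps["Netherlands"] / median_speed_mbps["Indonesia"]
print(f"NL pays ~{price_ratio:.1f}x the price for ~{speed_ratio:.1f}x the speed")
```

So the ca. 5x price gap roughly mirrors the ca. 5x speed gap, while the ca. 4x income gap explains why the absolute price points can differ so much and still be (roughly) comparably affordable.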

Let’s go deeper into how effectively consumptive growth of cellular data is monetized… what may impact the consumptive growth, positively and negatively, and how it relates to the telco’s topline.

CELLULAR BUSINESS DYNAMICS.

Figure 3 Between 2016 and 2021, Western European Telcos lost almost 7% of their total cellular turnover (ca. 7+ billion euros over the markets I follow). This corresponds to a total revenue loss of ca. 1.4% per year over the period. To no surprise, the loss of cellular voice-based revenue has been truly horrendous, with an annual loss of ca. 30%, although the Covid years (2021 and 2022, for that matter) were good to voice revenues (as we found ourselves confined to our homes and a call away from our colleagues). On the positive side, cellular data-based revenues have “positively” contributed to the revenue in Western Europe over the period (we don’t really know the counterfactual), with an annual growth of ca. 4%. Since 2016, cellular data revenues have exceeded cellular voice revenues and in 2022 are expected to be around 70% of the total cellular revenue (for Western Europe). Cellular revenues have been and remain under pressure, even with a positive contribution from cellular data. The volume of cellular data (not including the contribution generated from WiFi usage) has continued to grow at a 38% annualized growth rate and is today (i.e., 2023) more than five times that of 2016. The annual growth rate of cellular data consumption per customer is somewhat lower, ranging from the mid-twenties to the high-thirties percent. Needless to say, the corresponding cellular ARPU has not experienced anywhere near similar growth. In fact, cellular ARPU has generally declined over the period.

Some observations on cellular data that are, in my opinion, obvious but worth making (I have come to realize that although I find them obvious, I am often confronted with a lack of awareness or understanding of them):

Cellular data consumption grows much (much) faster than the corresponding data revenue (i.e., 38% vs 4% for Western Europe).

The unit growth of cellular data consumption does not lead to the same unit growth in the corresponding cellular data revenues.

Within most finite cellular data plans (i.e., the non-unlimited ones), substantial data growth can be realized without resulting in a net increase of data-related revenues. This is, of course, trivially true for unlimited plans.
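
A quick back-of-the-envelope calculation shows what the decoupling of volume growth (ca. 38% per year) and data revenue growth (ca. 4% per year) implies for the effective price of a delivered gigabyte:

```python
# Western Europe, 2016-2021, per the figures quoted above.
vol_growth, rev_growth = 0.38, 0.04

# Implied annual change in effective revenue per GB:
per_gb_change = (1 + rev_growth) / (1 + vol_growth) - 1
print(f"Revenue per GB changes by {per_gb_change:.1%} per year")

# Compounded over the five years 2016-2021:
print(f"After 5 years, a GB earns {(1 + per_gb_change) ** 5:.0%} "
      "of its 2016 revenue")
```

In other words, the effective revenue per gigabyte falls by roughly a quarter every year, ending at about a quarter of its starting level after five years.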

The anticipated death of the cellular industry back in the twenty-tens was an exaggeration. The Industry’s death by signaling, voluptuous & unconstrained volumes of demanded data, and ever-decreasing euros per Bytes remains a fading memory and, of course, in PowerPoints of that time (I have provided some of my own from that period below). A good scare does wonders to stimulate innovation to avoid “Armageddon.” The telecom industry remains alive and well.

Figure 4 The latest data (up to 2022) from the OECD on mobile data consumption dynamics. Source data can be found at the OECD Data Explorer. The data illustrates the slowdown in cellular data growth, both from a per-customer perspective and in terms of total generated mobile data. Looking over the period, the 5-year cumulative growth rate for 2016 to 2021 is higher than that for 2017 to 2022, and the year-on-year growth from 2021 to 2022 was, in general, even lower. This indicates a general slowdown in mobile data consumption as 4G consumption (in Western Europe) saturates and 5G consumption is still picking up. Although this is not a full account of the observed growth dynamics over the years, given the data for 2022 was just released, I felt it was worth including these for completeness. Unfortunately, I have not yet acquired the cellular revenue structure (e.g., voice and data) for 2022; it is work in progress.

WHAT DRIVES CONSUMPTIVE DATA GROWTH … POSITIVE & NEGATIVE.

What drives the consumer’s cellular data consumption? A cellular operator with data analytics capabilities can easily check the list of positive and negative contributors to cellular data consumption below, as I have done with my team for many years.

Positive Growth Contributors:

  • Customer or adopter uptake. That is, new or old, customers that go from non-data to data customers (i.e., adopting cellular data).
  • Increased data consumption (i.e., usage per adopter) within the cellular data customer base, driven by many of the enablers below;
  • Affordable pricing and suitable price plans.
  • More capable Radio Access Technology (RAT), e.g., HSDPA → HSPA+ → LTE → 5G, and effectively higher spectral efficiency from advanced antenna systems. This will typically drive up per-customer data consumption to the extent that pricing is not a barrier to usage.
  • More available cellular frequency spectrum provisioned on the best RAT (in terms of spectral efficiency).
  • Good enough cellular network consistent with customer demand.
  • Affordable and capable device ecosystem.
  • Faster mobile device CPU leads to higher consumption.
  • Faster & more capable mobile GPUs lead to higher consumption.
  • Device screen size. The larger the screen, the higher the consumption.
  • Access to popular content and social media.

Figure 5 illustrates the description of data growth as depending on the uptake of Adopters and the associated growth rate α(t) multiplied by the Usage per Adopter and the associated growth rate of usage μ(t). The growth of the Adopters can typically be approximated by an S-curve reaching its maximum as there are few more customers left to adopt a new service or product or RAT (i.e., α(t)→0%). As described in this section, the growth of usage per adopter, μ(t), will depend on many factors. Our intuition of μ is that it is positive for cellular data and historically has exceeded 30%. A negative μ would be an indication of consumptive churn. It should not be surprising that overall cellular data consumption growth can be very large as the Adopter growth rate is at its peak (i.e., around the S-curve inflection point), and Usage growth is high as well. It also should not be too surprising that after Adopter uptake has reached the inflection point, the overall growth will slow down and eventually be driven by the Usage per Adopter growth rate.
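
The decomposition in Figure 5 can be sketched numerically: an S-curve for Adopters multiplied by exponentially growing Usage per Adopter. All parameters below are illustrative assumptions, not fitted to any market:

```python
import math

def adopters(t: float, total=100.0, t0=5.0, k=1.0) -> float:
    """Logistic (S-curve) adopter uptake over time t (years)."""
    return total / (1 + math.exp(-k * (t - t0)))

def usage_per_adopter(t: float, u0=1.0, mu=0.30) -> float:
    """Usage per adopter growing at a constant annual rate mu."""
    return u0 * (1 + mu) ** t

def total_volume(t: float) -> float:
    """Total consumption = Adopters x Usage per Adopter."""
    return adopters(t) * usage_per_adopter(t)

# Year-on-year percentage growth: very high while adoption ramps up,
# converging toward the usage growth rate mu once adoption saturates.
for t in (2, 5, 8, 12):
    g = total_volume(t + 1) / total_volume(t) - 1
    print(f"year {t}: total growth ≈ {g:.0%}")
```

Once adoption saturates (α(t)→0%), the remaining growth is carried by μ alone, which is exactly the slowdown pattern visible in the Western European data.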

Figure 6 Using the OECD data (OECD Data Explorer) for Western European mobile data consumption per customer from 2011 to 2022, the above illustrates the annual growth rate of per-customer mobile data consumption. Mobile data consumption is a blend of usage across the various RATs enabling packet data usage. There is a clear increase in annual growth after the introduction of LTE (4G), followed by a slowdown in annual growth, possibly due to reaching saturation in 4G adoption, i.e., α3G→4G(t) → 0%, leaving μ4G(t) to drive the cellular data growth. There is a relatively weak increase in 2021, and although the timing coincides with the 5G non-standalone (NSA) introduction (typically at 700 MHz or via dynamic spectrum sharing (DSS) with 4G, e.g., Vodafone-Ziggo NL using their 1800 MHz for 4G and 5G), the increase may be better attributed to the Covid lockdowns than to a spurt in data consumption due to the 5G NSA intro.

Anything that creates more capacity and quality (e.g., increased spectral efficiency, more spectrum, new, more capable RAT, better antennas, …) will, in general, result in an increased usage overall as well as on a per-customer basis (remember most price plans allow for substantial growth within the plans data-volume limit without incurring more cost for the customer). If one takes the above counterfactual, it should not be surprising that this would result in slower or negative consumption growth.

Negative growth contributors:

  • Cellular congestion causes increased packet loss, retransmissions, and deteriorating latency and speed performance. All in all, congestion may have a substantial negative impact on the customer’s service experience.
  • Throttling policies will always lower consumption and usage in general, as quality is intentionally lowered by the Telco.
  • Increased share of QUIC content on the network. The QUIC protocol is used by many streaming video providers (e.g., YouTube, Facebook, TikTok, …). The protocol improves performance (e.g., speed, latency, packet delivery, handling of network changes, …) and security. Services using QUIC will “bully” other applications that use TCP/IP, encouraging TCP/IP to back off from using bandwidth. In this respect, QUIC is not a fair protocol.
  • Elephant flow dynamics (e.g., few traffic flows causing cell congestion and service degradation for the many). In general, elephant flows, particularly QUIC based, will cause an increase in TCP/IP data packet retransmissions and timing penalties. It is very much a situation where a few traffic flows cause significant service degradation for many customers.

One of the manifestations of cell congestion is packet loss and packet retransmission. Packet loss due to congestion ranges from 1% to 5%, or even several times higher at moments of peak traffic or if the user is in a poor cellular coverage area. The higher the packet loss, the worse the congestion, and the worse the customer experience. The underlying IP protocols will attempt to recover a lost packet by retransmission. The retransmission rate can easily exceed 10% to 15% in cases of congestion. Generally, for a reliable and well-operated network, the packet loss should be well below 1%, and even as low as 0.1%. Likewise, one would expect a packet retransmission rate of less than 2% (I believe the target should be less than 1%).

Thus, customers that happen to be under a given congested cell (e.g., caused by an elephant flow) would incur a substantially higher rate of retransmitted data packets (i.e., 10% to 15% or higher) as the TCP/IP protocol tries to make up for lost data packets. The customer may experience substantial service quality degradation and, as a final (unintended) “insult”, often be charged for those additional retransmitted data volumes.

From a cellular perspective, once the congestion has been relieved, the cellular operator may observe that the volume on the previously congested cell actually drops. The reason is that packet loss and retransmission fall to a level far below the congested one (e.g., typically below 1%). As the quality improves for all customers demanding service from the previously overloaded (i.e., congested) cell, sustainable volume growth will commence, both in total and in the average consumption per customer. As will be shown below, for normal cellular data consumption and most (if not all) price plans, a few percentage points’ drop in data volume will not have any meaningful effect on revenues. Either the (temporary) drop happens within the boundaries of a given price plan level and thus has no effect on revenue, or the overall gainful consumptive growth, as opposed to data volume attributable to poor quality, far exceeds the volume lost by improving the capacity and quality of a congested cell.

Well-balanced and available cellular sites will experience positive and sustainable data traffic growth.

Congested and over-loaded cellular sites will experience a negative and persistent reduction of data traffic.

Actively managing the few elephant flows and their negative impact on the many will increase customer satisfaction, reduce consumptive churn, and increase data growth, easily compensating for the congestion-induced volume from packet retransmission. And unless an operator is consistently starved of radio access investments, or has poor radio access capacity management processes, most cell congestion can be attributed to so-called elephant flows.

CELLULAR DATA CONSUMPTION IN REAL NETWORKS – ON A SECTOR LEVEL.

And irrespective of what drives positive and negative growth, it is worth remembering that daily traffic variations, on a sector-by-sector basis as well as on the overall cellular network level, are entirely natural. An illustration of such natural sector variation over a (non-holiday) week is shown below in Figure 7 (c) for a sector in the top 20% of busiest sectors. In this example, the median variation over all sectors in the same week was around 10%. I often observe that even telco people (who should know better) find this natural variation quite worrisome, as it appears counterintuitive to their linear growth expectations. Proper statistical measurement & analysis methodologies must be in place if solid inferences and analysis are required on a sector (or cell) basis over a relatively short time period (e.g., a day, days, a week, weeks, …).

Figure 7 illustrates the daily variation in cellular data consumption over a (non-holiday) week. There are three examples: (a) a sector from the bottom 20% in terms of carried volume, (b) a sector with a median data volume, and (c) a sector taken from the top 20% of carried data volume. Across the three different sectors (low, median, high), we observe very different variations over the weekdays, ranging from an almost 30% variation between the weekly minimum (Tuesday) and the weekly maximum (Thursday) for the top-20% sector, to a variation in excess of 200% over the week for the bottom-20% sector. The charts also show another trend we observe in cellular networks regarding consumptive variations over time: busy sectors tend to have a lower weekly variation than less busy sectors. I should point out that I have made no effort to select particular sectors; I could easily find some (of the less busy sectors) with even wilder variations than shown above.

The day-to-day variation occurs naturally, based on the dynamic behavior of the customers served by a given sector or cell (in a sector). I am frequently confronted with technology colleagues (whom I respect for their deep technical knowledge) who appear to expect (data) traffic on all levels to increase monotonically, with a daily growth rate derived from the annual CAGR observed by comparing the end-of-period volume level with the beginning-of-period volume level. Most have not bothered to look at actual network data and do not understand (or, to put it more nicely, simply ignore) the naturally statistical behavior of traffic that drives hourly, daily, weekly, and monthly variations. If you let statistical variations that you have no control over drive your planning & optimization decisions, you will likely fail to decide on the business-critical ones you can control.

An example of a high-traffic (top-20%) sector’s complete 365-day variation in data consumption is shown below in Figure 8. We observe that the average consumption (or traffic demand) increases nicely over the year, with a bit of a slowdown (in this European example) during the summer vacation season (and similarly around official holidays in general). Seasonal variations occur naturally and will often result in a lower-than-usual daily growth rate and a change in daily variations. In the sector traffic example below, Tuesdays and Saturdays are (typically) lower than the average, and Thursdays are higher than average. The annual growth is positive despite the consumptive lows over the year, which would typically freak out my previously mentioned industry colleagues. Of course, every site, sector, and cell will have a different yearly growth rate, most likely close to a normal distribution around the gross annual growth rate.

Figure 8 illustrates a top-20% sector’s data traffic growth dynamics (in GB) over a calendar year’s 365 days. Tuesdays and Saturdays are likely to be below the weekly average data consumption, and Thursdays are more likely to be above. Furthermore, daily traffic growth slows around national holidays and during the summer vacation (i.e., July & August for this particular Western European country).

And to nail down the message: as shown in the example in Figure 9 below, every sector in your cellular network will have a different positive or negative growth rate from one time period to the next. The net effect over time (in terms of months rather than days or weeks) is positive as long as customers adopt the supplied RAT (i.e., if customers are migrating from 4G to 5G, it may very well be that 4G data consumption declines while 5G data consumption increases) and, of course, as long as the provided quality is consistent with the expected and demanded quality. Sectors with congestion, particularly so-called elephant-flow-induced congestion, will hurt the quality for the many, who may reduce their consumptive behavior and eventually churn.

Figure 9 illustrates the variation in growth rates across 15+ thousand sectors in a cellular network, comparing the demanded data volume per sector between two consecutive Mondays. Statistical analysis of the above data shows that the overall average value is ca. 0.49%, slightly skewed towards the positive growth rates (e.g., if you compared a Monday with a Tuesday, the histogram would typically be skewed towards the negative side of the growth rates, as Tuesday is a lower-traffic day than Monday). Also, at the risk of pointing out the obvious, the daily and weekly growth rates implied by an annual growth rate of, for example, 30% are relatively minute: ca. 0.07% and 0.49%, respectively.
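The compounding arithmetic behind those minute figures is easy to check. A quick sketch converting an annual growth rate into the equivalent daily and weekly compound rates (ca. 0.07% per day and ca. 0.5% per week for 30% annual growth, in line with the ca. 0.49% quoted above):

```python
# Convert an annual compound growth rate into the equivalent rate for a
# shorter period (daily, weekly, ...), assuming uniform compounding.
def periodic_rate(annual_rate: float, periods_per_year: float) -> float:
    return (1.0 + annual_rate) ** (1.0 / periods_per_year) - 1.0

daily = periodic_rate(0.30, 365)
weekly = periodic_rate(0.30, 365 / 7)

print(f"30% annual growth ~ {daily:.4%} per day, {weekly:.4%} per week")
```

So expecting every day or week to visibly show the annual CAGR is hopeless: the natural statistical variation (10% and more, as above) completely swamps a 0.07% daily trend.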

The examples above (Figures 7, 8, and 9) are from a year in the past when Verstappen had yet to win his first F1 championship. That particular weekend also featured no F1 race (or Sunday would have looked very different, i.e., much higher) or any other big sports event.

CELLULAR DATA PRICE PLAN LOGIC.

Figure 10 above is an example of the structure of a price plan, possibly represented slightly differently from how your marketeer would do it (and I am at peace with that). The upper left chart illustrates a price plan with 8 data volume intervals. This we can also write as (following the terminology of the lower right corner):

Thus, the p_1 package, allowing the customer to consume up to 3 GB, is priced at 20 (irrespective of whether the customer consumes less). For package p_5, a consumer would pay 100 for a data allowance of up to 35 GB. Of course, we assume that a consumer choosing this package would generally consume more than 24 GB, which is the limit of the next cheaper package (i.e., p_4).

The price plan example above clearly shows that each price level offers customers room to grow before upgrading to the next level. For example, a customer consuming no more than 8 GB per month, fitting into p_3, could increase consumption by 4 GB (+50%) before having to consider the next price plan level (i.e., p_4). This illustrates that even if a customer’s consumption grows substantially, one should not per se expect more revenue.
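The step-function logic just described can be sketched directly in code. The 3/5/12/24/35/100/200 GB levels and the prices 20, 30, 50, 100, and 160 are taken from the text and figure captions; the 50 GB level and the prices 80, 110, and 135 are my own fill-ins to complete the 8-level example:

```python
# Sketch of the step-function price-plan logic described in the text.
PLAN = [  # (GB ceiling of level, monthly price)
    (3, 20), (5, 30), (12, 50), (24, 80),
    (35, 100), (50, 110), (100, 135), (200, 160),
]

def monthly_price(consumption_gb: float) -> int:
    """Cheapest level whose allowance covers the month's consumption.
    Beyond 200 GB we assume (as the text does) a second 200 GB plan."""
    for ceiling, price in PLAN:
        if consumption_gb <= ceiling:
            return price
    return 2 * PLAN[-1][1]  # i.e., 320

# A customer growing from 8 GB to 12 GB pays the same; only at 12.1 GB
# does the next level kick in:
print(monthly_price(8), monthly_price(12), monthly_price(12.1))  # 50 50 80
```

Note how consumption can grow 50% (8 GB to 12 GB) with zero revenue effect; that is the whole point of the step function.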

Even though it should be reasonably straightforward that substantial growth in a customer base’s data consumption cannot be expected to lead to an equivalent growth in revenue, many telco insiders instinctively believe this should be the case. I believe the error may be due to many mentally linearizing the step-function price plans (see Figure 10, upper right side) and simply (but erroneously) believing that any increase (or decrease) in consumption directly results in an increase (or decrease) in revenue.

DATA PRICING LOGIC & USAGE DISTRIBUTION.

If we want to understand how consumptive behavior impacts cellular operators’ toplines, we need to know how the actual consumption distributes across the pricing logic. As a high-level illustration, Figure 11 (below) shows the data price step-function logic from Figure 10 with an overall consumptive distribution superimposed (orange solid line). It should be appreciated that while this provides a fairly clear way of associating consumption with pricing, it is an oversimplification at best. It will nevertheless allow me to crudely estimate the number of customers likely to have chosen a particular price plan matching their demand (and affordability). In reality, we will have customers who have chosen a given price plan but consume less than the limit of the next cheaper plan (and who, if consistently so, could save by moving to that plan). We will also have customers who consume more than their allowed limit. Usually, this results in the operator throttling the speed and sending a message to the customer that the consumption exceeds the limit of the chosen price plan. If a customer consistently overshoots the limits (by a given margin) of the chosen plan, it is likely that, eventually, the customer will upgrade to the next more expensive plan with a higher data allowance.

Figure 11 above illustrates, on the left side, a consumptive distribution (orange line), identified by its mean and standard deviation, superimposed on our price plan step-function logic example. The right side summarizes the consumptive distribution across the eight price plan levels. Note that there is a 9th level in case the 200 GB limit is breached (0.2% in this example). I am assuming that such customers pay twice the price of the 200 GB price plan (i.e., 320).

In the example of an operator with 100 million cellular customers, the consumptive distribution and the given price plan lead to a monthly topline of 7+ billion. However, with a consumptive growth rate of 30% to 40% annually per active cellular data user (on average), what kind of growth should we expect from the associated cellular data revenues?
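To see how a consumptive distribution superimposed on such a plan translates into topline, here is a crude sketch. The lognormal shape and its parameters (median ~15 GB, sigma = 1.0), as well as some of the tier prices, are my assumptions for illustration; the text only states the example yields 7+ billion per month for 100 million customers:

```python
import math

# Map a lognormal consumption distribution onto the step-function plan
# and sum up the expected monthly revenue for the whole customer base.
PLAN = [(3, 20), (5, 30), (12, 50), (24, 80),
        (35, 100), (50, 110), (100, 135), (200, 160)]
OVERFLOW_PRICE = 320  # two 200 GB plans, as assumed in the text
CUSTOMERS = 100e6

def lognorm_cdf(x: float, median: float, sigma: float) -> float:
    """CDF of a lognormal distribution parameterized by its median."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (sigma * math.sqrt(2))))

def monthly_revenue(median_gb: float, sigma: float) -> float:
    revenue, lower_cdf = 0.0, 0.0
    for ceiling, price in PLAN:
        upper_cdf = lognorm_cdf(ceiling, median_gb, sigma)
        revenue += (upper_cdf - lower_cdf) * price  # share of base in tier
        lower_cdf = upper_cdf
    revenue += (1.0 - lower_cdf) * OVERFLOW_PRICE   # the few beyond 200 GB
    return revenue * CUSTOMERS

print(f"estimated monthly revenue: {monthly_revenue(15.0, 1.0)/1e9:.1f} billion")
```

With these assumptions the sketch lands in the 7-8 billion per month range, consistent with the order of magnitude quoted in the text.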

Figure 12: In the above illustration, I have mapped the consumptive distribution to the price plan levels and then developed the beginning-of-period consumptive distribution (i.e., the light green curve) month by month until month 12 is reached (i.e., the yellow curve). I assume the average monthly consumptive cellular data growth is 2.5%, or ca. 35% after 12 months. Furthermore, I assume that the few customers falling outside the 200 GB limit will purchase another 200 GB plan. For completeness, the previous 12 months (the previous year) need to be simulated as well, in order to compare the total cumulative cellular data revenue between the current and previous periods.

Within the current period (shown in Figure 12 above), the monthly cellular data revenue CAGR comes out at 0.6%, or a total growth of 7.4% in monthly revenue between the beginning and the end of the period. Over the same period, the average data consumption (per user) grew by ca. 34.5%. Comparing the current year’s total data revenue to the previous year’s, we get an annual growth rate of 8.3%. It should therefore not be surprising that revenue growth can be far smaller than consumptive growth, given price plans such as the above.
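The compounding behind those two headline numbers is worth making explicit. A quick check of the arithmetic: 2.5% monthly consumption growth and 0.6% monthly revenue growth, each compounded over 12 months:

```python
# Verify the 12-month compounding of the monthly growth rates quoted in
# the text: consumption grows 2.5%/month, revenue only 0.6%/month.
monthly_consumption_growth = 0.025
monthly_revenue_growth = 0.006

consumption_after_12m = (1 + monthly_consumption_growth) ** 12 - 1
revenue_after_12m = (1 + monthly_revenue_growth) ** 12 - 1

print(f"consumption: +{consumption_after_12m:.1%}")  # ca. +34.5%
print(f"revenue:     +{revenue_after_12m:.1%}")      # ca. +7.4%
```

A roughly 4.5x gap between consumptive and revenue growth, purely from the step-function plan structure.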

It should be pointed out that the above illustration of consumptive and revenue growth simplifies the growth dynamics. For example, the simulation ignores seasonal swings over the 12-month period. Also, it attributes all consumption falling within a price range 1-to-1 to that particular price level, when in reality there is always spillover into both the upper and lower neighboring levels that will not incur higher or lower revenues. Moreover, while mapping the consumptive distribution to the price plan’s gigabyte intervals makes the simulation faster (and the setup certainly easier), it is also not a very accurate approach, given the coarseness of the intervals.

A LEVEL DEEPER.

While working with just one consumptive distribution, as in Figure 11 and Figure 12 above, allows for simpler considerations, it does not fully reflect the reality that every price plan level will have its own consumptive distribution. So let us go that level deeper and see whether it makes a difference.

Figure 13 above illustrates the consumptive distribution within a given price plan range, e.g., the “5 GB @ 30” price plan level for customers with a consumption higher than 3 GB and less than or equal to 5 GB. It should come as no surprise that some customers may not even reach 3 GB, even though they pay for (up to) 5 GB, and some may occasionally exceed the 5 GB limit. In the example above, 10% of customers have a consumption below 3 GB (and could have chosen the next cheaper plan of up to 3 GB), and 3% exceed the limits of the chosen plan (an event that may result in the usage speed being throttled). As the average usage within a given price plan level approaches the ceiling (e.g., 5 GB in the above illustration), the standard deviation will, in general, narrow accordingly, as customers jump to the next more expensive plan to meet their consumptive needs (e.g., the “12 GB @ 50” level in the illustration above).
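The spillover shares for such a within-level distribution are easy to sketch. Assuming the within-level consumption is roughly normal, a mean of ~3.81 GB and a standard deviation of ~0.63 GB (back-solved by me, not given in the text) reproduce the quoted 10% below 3 GB and 3% above 5 GB:

```python
import math

# Spillover out of the "5 GB @ 30" level under an assumed normal
# within-level consumption distribution.
def normal_cdf(x: float, mu: float, sigma: float) -> float:
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2))))

mu, sigma = 3.81, 0.63  # assumed; chosen to match the 10% / 3% in the text
below_floor = normal_cdf(3.0, mu, sigma)          # could downgrade to 3 GB plan
above_ceiling = 1.0 - normal_cdf(5.0, mu, sigma)  # throttling / upgrade candidates

print(f"below 3 GB: {below_floor:.1%}, above 5 GB: {above_ceiling:.1%}")
```

Shifting the mean towards the 5 GB ceiling (while narrowing sigma, as the text describes) moves customers from the downgrade tail into the upgrade tail.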

Figure 14 generalizes Figure 13 to the full price plan and, as illustrated in Figure 12, lets the consumption profiles develop over a 12-month period (the initial and +12-month distributions are shown in the above illustration). The difference between the initial and 12-month states is best appreciated in the four smaller figures, which break the price plan levels up into 0 to 40 GB and 40 to 200 GB.

The result in terms of cellular data revenue growth is comparable to that of the higher-level approach of Figure 12 (ca. 8% annual revenue growth vs. 34% overall consumptive annual growth rate). The detailed approach is, however, more complicated to get working and requires much more real data (which obviously should be available to operators in this day and age). One should note that, in the example price plan used in the figures above, at a 2.5% monthly consumptive growth rate (i.e., 34% annually) it would take a customer an average of 24 months (a spread of 14 to 35 months depending on the level) to traverse a price plan level from the beginning of the level (e.g., 5 GB) to the end of the level (e.g., 12 GB). It should also be clear that once a customer enters the highest price plan levels (e.g., 100 GB and 200 GB), little additional revenue can be expected from those customers over their consumptive lifetime.
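The traversal times follow directly from compound growth. A sketch, using the example plan's level boundaries (the 50 GB boundary is my fill-in), that reproduces the ca. 14-35 month spread with a ~24-month average:

```python
import math

# Months for a customer growing at 2.5%/month to traverse each price
# plan level, from the level's floor (GB) to its ceiling (GB).
MONTHLY_GROWTH = 0.025
LEVELS = [(3, 5), (5, 12), (12, 24), (24, 35), (35, 50), (50, 100), (100, 200)]

def months_to_traverse(floor_gb: float, ceiling_gb: float) -> float:
    return math.log(ceiling_gb / floor_gb) / math.log(1 + MONTHLY_GROWTH)

for floor, ceiling in LEVELS:
    print(f"{floor:>3} GB -> {ceiling:>3} GB: "
          f"{months_to_traverse(floor, ceiling):5.1f} months")
```

Wide levels in relative terms (5 to 12 GB is a 2.4x span) take ~35 months; narrow ones (35 to 50 GB, a 1.4x span) only ~14 months.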

The detailed approach illustrated above is particularly useful for testing a given price plan’s profitability and growth potential, given the particularities of the customers’ consumptive growth dynamics.

An additional finesse that could be considered in the analysis is an affordability approach, in which growth within a given price level slows down as the average consumption approaches the limit of that price level. This could be modeled by slowing the mean growth rate and allowing the variance to narrow as the density function approaches the limit. In my simpler approach, the consumptive distributions continue to grow at a constant growth rate. In particular, one should consider more sophisticated approaches to modeling the variance, as it determines the spillover into the less and more expensive levels. An operator should note that consumption that reduces, or consistently falls into the less expensive level, expresses consumptive churn. This should be monitored on a customer level as well as on a radio access cell level. Consumptive churn often reflects that the supplied radio access quality is out of sync with the customers’ demand dynamics and expectations. On a radio access cell level, the diligent operator will observe a sharp increase in retransmitted data packets and increased latency on a flow (and active customer) basis, both hallmarks of a congested cell.

WRAPPING UP.

To this day, 20-odd years after the first packet-data cellular price plans were introduced, I still have meetings with industry colleagues who state that they cannot implement quality-enhancing technologies for fear that data consumption, and with it their revenues, may reduce. Funnily enough, the fear is often that by improving the quality for the many customers being penalized by a few customers’ usage patterns (e.g., the elephants in the data pipe), data packet loss and TCP/IP retransmissions will reduce as the quality improves and more customers get the service they have paid for. This ignores the commonly established fact of our industry that improving the customer experience leads to sustainable growth in consumption, which consequently may also have a positive topline impact.

I am often surprised by how little understanding and feel Telco employees have for their own price plans, consumptive behavior, and the impact these have on their company’s performance. This may be due to the fairly complex price plans telcos are inventing, and our brain’s propensity for linear thinking certainly doesn’t make it easier. It may also be because Telcos rarely spend any effort educating their employees about their price plans and products (after all, employees often get all the goodies for “free”, so why bother?). Do a simple test at your next town hall meeting and ask your CXOs about your company’s price plans and their effectiveness in monetizing consumption.

So what to look out for?

Many in our industry have an inflated idea (to a fault) about how effective consumptive growth is being monetized within their company’s price plans.

Most of today’s cellular data plans can accommodate substantial growth without leading to equivalent associated data revenue growth.

The apparent disconnect between the growth rate of cellular data consumption (CAGR ~30+%), in its totality as well as on an average per-customer basis, and the cellular data revenue growth rate (CAGR < 10%) is simply due to the industry’s price plan structures allowing for substantial consumptive growth without proportional revenue growth.

ACKNOWLEDGEMENT.

I greatly acknowledge my wife, Eva Varadi, for her support, patience, and understanding during the creative process of writing this Blog.

FURTHER READING.

Kim Kyllesbech Larsen, Mind Share: Right Pricing LTE … and Mobile Broadband in general (A Technologist’s observations) (slideshare.net), (May 2012). A cool seminal presentation on various approaches to pricing mobile data. Contains a lot of data that illustrates how far we have come over the last 10 years.

Kim Kyllesbech Larsen, Mobile Data-centric Price Plans – An illustration of the De-composed. | techneconomyblog (February, 2015). Exploring UK mobile mixed-services price plans in an attempt to decipher the price of data, which at the time was (and often still is) a challenge to figure out due to (intentional?) obfuscation.

Kim Kyllesbech Larsen, The Unbearable Lightness of Mobile Voice. | techneconomyblog (January, 2015). On the demise of voice revenue and rise of data. More of a historical account today.

Tellabs “End of Profit” study executive summary (wordpress.com), (2011). This study very much echoed the increasing industry concern back in 2010-2012 that cellular data growth would become unprofitable and the industry’s undoing. The basic premise was that the explosive growth of cellular data, and thus the total cost of meeting the demand, would lead to a situation where the total cost per GB would exceed the revenue per GB within the next couple of years. This, by the way, was also a trigger point for many cellular-focused telcos to re-think their strategies towards the integrated telco with internal access to both fixed and mobile broadband.

B. de Langhe et al., “Linear Thinking in a Nonlinear World”, Harvard Business Review, (May-June, 2017). It is a very nice and compelling article about how difficult it is to get around linear thinking in a non-linear world. Our brains prefer straight lines and linear patterns and dependencies. However, this may lead to rather amazing mistakes and miscalculations in our clearly nonlinear world.

OECD Data Explorer A great source of telecom data, for example, cellular data usage per customer, and the number of cellular data customers, across many countries. Recently includes 2022 data.

I have used Mobile Data – Europe | Statista Market Forecast to better understand the distribution between cellular voice and data revenues. Most Telcos do not break out their cellular voice and data revenues from their total cellular revenues. Thus, in general, such splits are based on historical information where it was reported, extrapolations, estimates, or more comprehensive models.

Kim Kyllesbech Larsen, The Smartphone Challenge (a European perspective) (slideshare.net) (April 2011). I think it is sort of a good account for the fears of the twenty-tens in terms of signaling storms, smartphones (=iPhone) and unbounded traffic growth, etc… See also “Eurasia Mobile Markets Challenges to our Mobile Networks Business Model” (September 2011).

Geoff Huston, “Comparing TCP and QUIC”, APNIC, (November 2022).

Anna Saplitski et al., “CS244 ’16: QUIC loss recovery”, Reproducing Network Research, (May 2016).

RFC9000, “QUIC: A UDP-Based Multiplexed and Secure Transport“, Internet Engineering Task Force (IETF), (February 2022).

Dave Gibbons, What Are Elephant Flows And Why Are They Driving Up Mobile Network Costs? (forbes.com) (February 2019).

K.-C. Lan and J. Heidemann, “A measurement study of correlations of Internet flow characteristics” (February 2006). This seminal paper has inspired much other research on elephant flows. A flow should be understood as a unidirectional series of IP packets with the same source and destination addresses, port numbers, and protocol numbers. The authors define elephant flows as flows with a size larger than the mean plus three standard deviations of the sampled data, though it is important to point out that the exact definition is less important. Such elephant flows are typically few (less than 20%) but will cause cell congestion, reducing the quality for the many requiring service in an affected cell.

Opanga Networks is a fascinating and truly innovative company. Using AI, they have developed their solution around the idea of how to manage data traffic flows, reduce congestion, and increase customer quality. Their (N2000) solution addresses the particular network situation where a limited number of customers’ data usage takes up a disproportionate amount of resources within the cellular network (i.e., the problem with elephant flows). Opanga’s solution optimizes those congestion-impacting traffic flows, resulting in an overall increase in service quality and customer experience. The beauty of the solution is that the few traffic patterns causing the cellular congestion continue without degradation, while the many traffic patterns that were impacted by the few continue at their optimum quality level. Overall, many more customers are happy with their service. The operator avoids an investment with a relatively poor return and can either save the capital or channel it into a much higher IRR (internal rate of return) investment. I have seen tangible customer experience improvements exceeding 30 percent on congested cells, avoiding substantial RAN Capex and resulting Opex. And the beauty is that it does not involve third-party network vendors and can be up and running within weeks, with an investment that is easily paid back within a few months. Opanga’s product pipeline is tailor-made to alleviate telecom’s biggest and thorniest challenges. Their latest product, with the appropriate name Joules, enables substantial radio access network energy savings above and beyond the features telcos have installed from their Radio Access Network suppliers. Disclosure: I am associated with Opanga as an advisor to their industrial advisory board.

Mobile Data-centric Price Plans – An illustration of the De-composed.

How much money would it take for you to give up the internet? … for the rest of your life? And, maybe more importantly: how much do you want to pay for internet? The following cool video, “Would you give up the Internet for 1 Million Dollars”, hints at both of those questions and an interesting paradox!

The perception of value is orders of magnitude higher than the willingness to pay, i.e.,

“I would NOT give up Internet for life for a Million+ US Dollars … oh … BUT… I don’t want to pay more than a couple of bucks for it either” (actually, for a mature postpaid-rich market, the chances are that over your expected lifetime you will pay between 30 and 40 thousand US$ for mobile internet & voice & some messaging).

Price plans are fascinating! … Particularly the recent data-centric price plans bundling in legacy services such as voice and SMS.

Needless to say, a consumer today often needs an advanced degree in science to really understand the price plans they are being presented with. A high degree of trust is involved in choosing a given plan. The consumer usually takes what has been recommended by the shop expert (who most likely doesn’t have an advanced science degree either). This shop expert furthermore might (or might not) get a commission (i.e., a bonus) for selling you a particular plan, and in such a case is hardly the poster child of objectivity.

How do the pricing experts arrive at the prices they offer to the consumer? Are those plans internally consistent … or maybe not?

It becomes particularly interesting to study data-centric price plans that try to re-balance Mobile Voice and SMS.

How is 4G (i.e., in Europe also called LTE) being charged versus “normal” data offerings in the market? Does the mobile consumer pay more for quality? Or maybe less?

What is the real price of mobile data? … Clearly, it is not the price we pay for a data-centric price plan.

A Data-centric Tale of a Country called United & a Telecom Company called Anything Anywhere!

As an example of mobile data pricing and in particular of data-centric mobile pricing with Voice and SMS included, I looked at a Western European Market (let’s call it United) and a mobile operator called Anything Anywhere. Anything Anywhere (AA) is known for its comprehensive & leading-edge 4G network as well as several innovative product ideas around mobile broadband data.

In my chosen Western European country, United, voice revenues have rapidly declined over the last 5 years. Between 2009 and 2014, mobile voice revenues lost more than 36%, compared to an overall revenue loss of “only” 14%. This corresponds to a compound annual growth rate of minus 6.3% over the period. For an in-depth analysis of the incredible mobile voice revenue losses the mobile industry has incurred in recent years, see my blog “The unbearable lightness of mobile voice”.

Did this market experience a massive uptake in prepaid customers? No! Not at all … The prepaid share of the customer base went from ca. 60% in 2009 to ca. 45% in 2014. In other words, the postpaid base grew by 15 percentage points over the period and in 2014 was around 55%. This should usually have been a cause for great joy and an incredible boost in revenues. United is also a market that has largely managed not to capitalize economically on substantial market consolidation.

As in many other mobile markets, engaging in & embracing the mobile broadband data journey has been accompanied by a sharp decline in the overall share of voice revenue, from ca. 70% in 2009 to ca. 50% in 2014. An ugly trend when the total mobile revenue declines as well.

The smartphone penetration in United as of Q1 2014 was ca. 71%, with 32% iOS-based devices. Compare this to 2009, when the smartphone penetration was ca. 21%, with iOS making up around 75+%.

Our mobile operator AA has the following price plan structure (note: all information is taken directly from AA’s website and can be traced back if you guess which company it applies to):

  • Data-centric price plans with unlimited Voice and SMS.
  • Differentiated speed plans, i.e., 4G (average speed advertised as 12 – 15 Mbps) vs. Double Speed 4G (average speed advertised as 24 – 30 Mbps).
  • Plans that apply European Union-wide.
  • The option to pay less for the handset upfront but more per month (particularly attractive for expensive handsets such as iPhone or Samsung Galaxy top-range models).
  • A default contract period of 24 months, although a shorter period is possible as well.
  • SIM-only data-centric plans with unlimited voice & SMS.
  • Data-only SIM-only plans.
  • Furthermore, access to an extensive “WiFi Underground”. Tethering and VoIP, including voice-calling over WiFi, are allowed.

So here is an example of AA’s data-centric pricing for various data allowances. In this illustration, I have chosen to add an iPhone 6 Plus (why? well, I do love that phone, as it largely replaces my iPad outside my home!) with 128 GB of storage. This choice has no impact on the fixed and variable parts of the respective price plans. For the SIM-only plans in the data below, I have added the (Apple) retail price of the iPhone 6 Plus (light grey bars) to make the comparison more comparable. It should of course be clear that in the SIM-only plans, the consumer is not obliged to buy a new device.


  • Figure above: illustrates the total consumer cost, or total price paid over the period (in local currency), of different data plans for our leading Western European mobile operator AA. The first 9 plans shown above include an iPhone 6 Plus with 128GB memory. The last 5 are SIM-only plans, with the last 2 being Data-only SIM-only plans. The abbreviations are as follows. PPM: Pay per Month (little upfront for the terminal), PUF: Pay UpFront (for the terminal) and less per month, SIMO: SIM-Only plan, SIMDO: SIM Data-Only plan, xxGB: the xx amount of Giga Bytes offered in the plan, 2x indicates double 4G speed and 1x “normal” speed, the first UL indicates unlimited voice in the plan, the second UL indicates unlimited SMS in the plan, EU indicates that the plan also applies to countries in the EU without extra charges. So PPM20GB2xULULEU defines a Pay-per-Month plan (i.e., the handset is paid for over the contract period, leading to higher monthly charges) with a 20 GB allowance at Double (4G) Speed with Unlimited Voice and Unlimited SMS, valid across the EU. In this plan, you would pay 100 (in local currency) for an iPhone 6 Plus with 128 GB. Note that the local Apple Shop retail price of an iPhone 6 Plus with 128 GB is around 789 in local currency (of which ca. 132 is VAT) for this particular country. Note: for the SIM-only plans (i.e., SIMO & SIMDO), I have added the Apple retail price of an iPhone 6 Plus 128GB. It should furthermore be pointed out that the fixed service fee and the data consumption price do not vary with the choice of handset.

Suppose I really want that iPhone 6 Plus but do not want to pay the high price (even with discounts) that some price plans demand. AA offers me a 20GB 4G data plan: pay 100 upfront for the iPhone 6 Plus (with 128 GB memory) and then 63.99 (i.e., as this feels much cheaper than paying 64) per month for the next 24 months. After 24 months my total cost of the 20 GB plan would be 1,636. I could instead save 230 over the 24 months by paying 470 upfront for the iPhone (+370 compared to the previous plan & –319 compared to the Apple retail price). In this lower-cost plan my monthly cost of the 20 GB would be 38.99, or 25 (ca. 40%!) less per month.
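For what it is worth, the 24-month arithmetic above can be reproduced with a few lines of Python (the plan prices are those quoted in the text):

```python
# Total cost of ownership over a 24-month contract:
# upfront payment plus 24 monthly fees (local currency).
def tco(upfront: float, monthly: float, months: int = 24) -> float:
    return upfront + months * monthly

ppm = tco(100, 63.99)  # pay little upfront: 100 for the iPhone, 63.99/month
puf = tco(470, 38.99)  # pay upfront: 470 for the iPhone, 38.99/month

print(round(ppm))        # ca. 1,636 total over 24 months
print(round(puf))        # ca. 1,406 total over 24 months
print(round(ppm - puf))  # ca. 230 saved by paying the handset upfront
```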

The analysis shows that a "pay-less-upfront-and-more-per-month" subscriber would, after the 24 months, have paid at least ca. 761 for the iPhone 6 Plus (with 128GB). We will see later that the total price paid for the iPhone 6 Plus is in fact likely to be approximately 792, or slightly above today's retail price (based on Apple's pricing).

The Price of a Byte and all that Jazz

So what do the above data-price plans look like in terms of price per Giga Byte?

Although in most cases not very clear to the consumer, a data-centric price plan is structured around the price of the primary data allowance (i.e., the variable part) and the non-data bundled services included in the plan (i.e., the fixed service part representing non-data items).

There will be a variable price reflecting the data-centric price plan's data allowance and a "fixed" service fee that captures the price of bundled services such as voice and SMS. Based on the total price of a data-centric price plan, it will often appear that the higher the allowance, the cheaper your unit-data "consumption" (or allowance) becomes, indicating that volume discounts have been factored into the price plan. In other words, the higher the data allowance, the lower the price per GB of allowance.

This is often flawed logic and simply an artefact of the bundled non-data related services being priced into the plan. However, getting to that level of understanding requires a bit of analysis that most of us certainly don't do before a purchase.

price per giga byte

  • Figure above: illustrates the unit price of a Giga Byte (GB) across AA's various data-centric price plans. Note that the price plans can be decomposed into a variable, data-usage-attributable price (per GB) and a fixed service fee that accounts for the non-data services blended into the price. The "Data Consumption per GB" is the variable, data-usage-dependent part of the price plan, and the "Total price per GB" is the full price normalized to the plan's data allowance.

So with the above we have argued that the total data-centric price can be written as a fixed and a variable part:

P_{Tot} = P_{Fixed} + P_{Data}(U_{GB}) = P_{Fixed} + p_{GB}\,U_{GB}^{\beta}

As will be described in more detail below, the data-centric price P_{Tot} is structured into what can be characterized as a "Fixed Service Fee" P_{Fixed} and a variable "Data Consumption Price" P_{Data} that depends on a given price plan's data allowance U_{GB} (GB being Giga Byte). The "Data Consumption Price" P_{Data} is variable in nature, and while it might in principle be a complex function of the data allowance U_{GB}, it is typically of the form p_{GB}U_{GB}^{\beta} with the exponent \beta (Beta) being 1 or close to 1. In other words, the data consumption price is a linear (or approximately linear) function of the data allowance. In case \beta is larger than 1, data pricing gets progressively more expensive with increasing allowance (i.e., penalizing high consumption, or, as I believe, right-costing high consumption). For \beta lower than 1, data gets progressively cheaper with increasing data allowances, corresponding to volume discounts, with the danger of mismatching the data pricing with the cost of delivering the data.
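A minimal sketch of this two-part price model (the parameter values below are purely illustrative, not AA's actual figures):

```python
def plan_price(allowance_gb: float, fixed: float, p_gb: float, beta: float = 1.0) -> float:
    """Total monthly price = fixed service fee + data consumption price.

    beta > 1 makes data progressively more expensive (volume penalty),
    beta < 1 progressively cheaper (volume discount)."""
    return fixed + p_gb * allowance_gb ** beta

# Illustrative parameters: fixed fee of 26, 0.65 per GB, linear pricing (beta = 1).
for gb in (2, 10, 20, 50):
    print(gb, "GB ->", plan_price(gb, fixed=26, p_gb=0.65))
```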

The "Fixed Service Fee" depends on all the non-data related goodies that are added to the data-centric price plan, such as (a) unlimited voice, (b) unlimited SMS, (c) the price plan applying Europe-wide (i.e., the EU-option), (d) a handset subsidy recovery fee, (e) maybe a customer management fee, etc.

For most data-centric price plans, plotting the price divided by the allowance against the allowance U_{GB} on a log-log scale results in a fairly straight line.

examples of power-law behaviour

Nothing really surprising given the pricing math involved! It is instructive to see what actually happens when we take a data-centric price and divide it by the corresponding data allowance:

\frac{P_{Tot}}{U_{GB}} = \frac{P_{Fixed} + p_{GB}\,U_{GB}^{\beta}}{U_{GB}} \overset{\beta = 1}{=} p_{GB} + P_{Fixed}\,U_{GB}^{-1}

For very large data allowances U_{GB}, the price per GB asymptotically converges to p_{GB}, i.e., the unit price of a GB. As p_{GB} is usually a lot smaller than P_{Fixed}, there is another limit, where the allowance U_{GB} is relatively low and the fixed fee dominates, in which the per-GB price falls off linearly with the allowance (a slope of -1 in a log-log plot). Typically, for allowances from 0.1 GB up towards 50 GB, a non-linear slope of approximately -0.7±0.1 is observed, i.e., in between the linear and the constant pricing regimes.
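The two limits can be illustrated numerically, again with illustrative parameters (a fixed fee of 26 and a unit price of 0.65 per GB, not AA's actual values):

```python
import math

def price_per_gb(u_gb: float, fixed: float = 26.0, p_gb: float = 0.65) -> float:
    # Per-GB price of a linear (beta = 1) plan: p_gb + fixed / allowance.
    return p_gb + fixed / u_gb

# Average log-log slope of the per-GB price between 0.1 GB and 50 GB.
slope = (math.log(price_per_gb(50)) - math.log(price_per_gb(0.1))) / (math.log(50) - math.log(0.1))
print(round(slope, 2))       # between -1 (fixed-fee dominated) and 0 (unit-price dominated)
print(price_per_gb(10_000))  # converges towards p_gb = 0.65 for very large allowances
```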

We can also observe that if the total price of a data-centric price plan associated with a given data allowance is used to derive a price per GB, one would conclude that most mobile operators provide the consumer with volume discounts as they adopt higher data allowance plans: the GB gets progressively cheaper for higher-usage plans. As most data-centric price plans are in the range where p_{GB} is (a lot) smaller than P_{Fixed}U_{GB}^{-1}, it will appear that the unit price of data declines as the data allowance increases. However, in most cases this is likely an artefact of the fixed service fee, which reflects non-data related services and, unless the plan is a data-only bundle, can be a very substantial part of the data-centric price.

It is clear that normalizing the totality of a data-centric price plan by its data allowance, particularly when non-data services have been blended into the plan, will not reveal the real price of data. If used for assessing, for example, data profitability or other mobile-data-related financial KPIs, this approach may be of very little use.

data centric price dynamics

  • Figure above: illustrates the basic characteristics of a data-centric price plan normalized by the data allowance. The data for this example reflect AA's data-centric price plans at 2x 4G speed with bundled unlimited voice & SMS, applying EU-wide. We see that the Beta value corresponds to a volume discount (at values lower than 1) or a volume penalty (at values higher than 1).

Oh yeah! … The really "funny" part of most data-price-plan analyses (including my own past ones!) is that they are more likely to reflect the fixed service part (independent of the data allowance) of the data-centric price plan than the actual unit price of mobile data.

What to expect from AA’s data-centric price plans?

So in a rational world of data-centric pricing (assuming such a world exists), what should we expect of Anything Anywhere's price plans as advertised online?

  • The (embedded) price for unlimited voice would be the same irrespective of the data plan's allowed data usage (i.e., unlimited voice does not depend on the data plan).
  • The (embedded) price for unlimited SMS would be the same irrespective of the data plan's allowed data usage (i.e., unlimited SMS does not depend on the data plan).
  • You would pay more for having your plan extended to apply across the European Union compared to not having this option.
  • You would (actually, you should) expect to pay more per Mega Byte for the double-speed option as compared to the single-speed option.
  • If you decide to "finance" your handset purchase (i.e., the pay-less-upfront option) within a data plan, you should expect to pay more on a monthly basis.
  • Given that a data plan has a whole range of associated handsets, priced from free (i.e., included in the plan without extra upfront charge) to high-end, high-priced smartphones such as the iPhone 6 Plus 128 GB, you would not expect handset-related cost to have been priced into the data plan. Or if it is, it must be the lowest common denominator across the whole range of handsets offered at a given price plan.
  • Where the discussion becomes really interesting is how your data consumption should be priced: (1) you pay more per unit of data as you consume more on a monthly basis, (2) you pay the same per unit irrespective of your consumption, or (3) you get a volume discount making your units cheaper the more you consume.

Of course, the above holds if and only if the price plans have been developed in a reasonably self-consistent manner.

data price analysis

  • Figure above: illustrates AA's various data-centric price plans (taken from their website). Note that PPM represents low upfront (terminal) cost for the consumer and higher monthly cost, and PUF represents paying upfront for the handset and thus lower monthly costs as a consequence. In the PPM plan, operator AA allows the consumer to choose an iPhone 6 Plus 128GB (priced at 100 to 160) or an iPhone 6 Plus 64GB option (at a lower price, of course).

First, note that the price plans (with more than 2 data points) tend to be linear in the data usage allowance.
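That linearity can be exploited directly: an ordinary least-squares fit of total plan price against allowance yields the fixed service fee as the intercept and the per-GB price as the slope. The four data points below are hypothetical, chosen merely to resemble the fitted values discussed later in this article (a fixed fee of ca. 52 including handset financing, and ca. 0.65 per GB):

```python
# Hypothetical (allowance in GB, total monthly price) points lying on a straight line.
plans = [(2, 53.3), (10, 58.5), (20, 65.0), (50, 84.5)]

# Ordinary least squares for: price = fixed + p_gb * allowance.
n = len(plans)
sx = sum(x for x, _ in plans)
sy = sum(y for _, y in plans)
sxx = sum(x * x for x, _ in plans)
sxy = sum(x * y for x, y in plans)

p_gb = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope: variable price per GB
fixed = (sy - p_gb * sx) / n                      # intercept: fixed service fee

print(round(fixed, 1), round(p_gb, 2))            # recovers 52.0 and 0.65
```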

The Fixed Service Fee – The Art of Re-Capturing Lost Legacy Value?

In the following I define the Fixed Service Fee as the part of the total data-centric price plan that is independent of a given plan's data allowance. The logic is that this part contains all non-data related costs, such as unlimited voice, unlimited SMS, the EU-option, etc.

From AA's voice plans (250 minutes @ 10 per month & 750 minutes @ 15 per month) with unlimited SMS (& no data) it can be inferred that

  • The price of unlimited SMS can be no higher than 7.5. This, however, is likely to also include general customer maintenance costs.

The monthly customer maintenance cost (billing, storage, customer care & systems support, etc.) might be deduced from the SIM-only data-only package and would be

  • A price of monthly customer maintenance in the order of 5, which would imply that the unlimited SMS price is 2.5. Note that the market-average postpaid SMS ARPU in 2014 was ca. 8.40 (based on Pyramid Research data), and the market-average number of postpaid SMS per month was ca. 273.

From AA's SIM-only plan we get that the fixed portion of providing service (i.e., customer maintenance, unlimited voice & SMS usage) is 14, and thus

  • The price of unlimited voice should be approximately 6.5. Note that the market-average postpaid voice ARPU was ca. 12 (based on Pyramid Research data), and the market-average voice usage per month was ca. 337 minutes. Further, from the available limited voice price plans it can be deduced that unlimited voice must correspond to more than 1,000 minutes, or more than 3 times the national postpaid average.

The difference in the fixed part of the data-centric pricing between the data-centric SIM-only plan and the similar data-centric plan including a handset (i.e., all services the same except for the addition of the handset) can be regarded as a minimum handset financing cost, allowing the operator to recover some of the handset subsidy:

  • An equipment subsidy recovery cost of 7 (i.e., over a 24-month period this amounts to 168, which is likely to recover the average handset subsidy). Note that if the customer chooses to pay little upfront for the handset, the customer has to pay 26 extra per month in the fixed service fee. Thus the low-upfront option results in another 624 over the 24-month contract period. Interestingly, together with the initial 7 for handset subsidy recovery in the basic fixed service fee, such a customer will have paid 792 in handset recovery over the 24-month contract period (a bit more than the iPhone 6 Plus 128GB retail price).
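The handset-recovery arithmetic quoted above is easy to verify:

```python
months = 24

base_recovery = 7     # handset-subsidy recovery in the basic fixed service fee
financing_extra = 26  # extra monthly fee for the pay-little-upfront option

print(base_recovery * months)                      # 168 recovered via the basic fee
print(financing_extra * months)                    # another 624 over the contract
print((base_recovery + financing_extra) * months)  # 792 in total, a bit above the 789 retail price
```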

The price for allowing the data-centric price plan to apply European-Union-wide is

  • The EU-option (i.e., the plan being applicable within the EU) appears to be priced at ca. 5 (caution: the 2x4G vis-à-vis 1x4G difference could have been priced into this delta as well).

For the EU-option price it should be noted that the two plans being compared differ in more than just the EU-option. The plan without the EU-option is a data plan at "normal" 4G speed, while the EU-option plan supports double 4G speed. So in theory the additional EU-option charge of 5 could also include a surcharge for the additional speed.

Why an operator would add the double speed to the fixed service fee is a "bit" strange. The 2x4G speed option is clearly a variable trigger for cost (and for value in the customer's data usage) and should thus be part of the variable (i.e., Giga-Byte-dependent) part of the data-centric price plan.

In the following it is assumed that the derived difference can indeed be attributed to the EU-option, i.e., that the double speed has not been included in the monthly fixed service fee.

In summary, we get the following decomposition of the monthly fixed service fee in AA's data-centric price plans:

fixed part of data-centric pricing

  • Figure above: shows the composition of the monthly fixed service fee in AA's data-centric plans. Of course, in a SIM-only scenario the consumer would not have the handset recovery fee included in the price plan.

So, irrespective of the data allowance, a (postpaid) customer pays between 26 and 52 per month depending on whether handset financing is chosen (i.e., a low upfront payment at the expense of higher monthly cost).
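Summing the components inferred above reproduces that 26-to-52 range (a sketch of my decomposition, not AA's published breakdown):

```python
# Monthly fixed-service-fee components as inferred in the text (local currency).
fixed_fee = {
    "customer maintenance": 5.0,
    "unlimited SMS": 2.5,
    "unlimited voice": 6.5,
    "handset subsidy recovery": 7.0,
    "EU-option": 5.0,
}

base = sum(fixed_fee.values())  # without handset financing
financed = base + 26            # with the pay-little-upfront financing surcharge

print(base, financed)           # 26.0 and 52.0 per month
```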

Mobile data usage still has to happen!

The Price of the Mobile Data Allowance.

The variable data prices in the studied data-centric price plans are summarized in the table below, as well as in the figure:

| Price plan | 4G Speed | Price per GB |
|---|---|---|
| Pay Less Upfront & More per Month | Double | 0.61±0.03 |
| Pay Upfront & Less per Month | Double | 0.67±0.05 |
| SIM-Only | Single | 1.47±0.08 |
| SIM-Only Data-Only | Single | 2 (only 2 data points) |

variable data price analysis

The first thing that should make you stop in wonder is that a single-4G-speed Giga Byte is more than twice the price of a double-4G-speed Giga Byte. In need of speed? Well, AA's 2x4G price plans will give you a pretty good deal.

The second thing to notice is that it appears to be a really bad deal (with respect to the price per Byte) to be a SIM-only data-only customer.

The data-only customer pays 2 per GB, almost 3 times more than if you were to choose a subscription with a device, double speed, double unlimited (voice & SMS) and EU-wide applicability.

Agreed! In absolute terms the SIM-only data-only plan costs a lot less per month (9 less than the 20GB pay-device-upfront plan), and it is possible to run away after 12 months (versus the 24-month plans). One rationale for charging extra per Byte for a SIM-only data-only plan could be that the SIM card might be used in tablets or data-card/dongle products, which typically consume most, if not all, of a given plan's allowance. For normal devices and high-allowance plans, the average consumption can be quite a lot lower than the actual allowance, particularly over a 24-month period.

You might argue that this is all about how the data-centric price plans have been decomposed into a fixed service fee (supposedly the non-data-dependent component) and a data consumption price. However, even when considering the full price of a given price plan, the single-4G-speed plan is more expensive per Byte than the double-4G-speed one.

You may also argue that I am comparing apples and oranges (or even bananas, pending taste), as the double-4G-speed plans include a device and a price plan that applies EU-wide, versus the SIM-only plan that relies on the customer's own device and a price plan that only works domestically. All true of course … but why it should be more expensive to opt out of these extras is a bit beyond me, and why this should have an inflationary impact on the price per Byte … well, a bit of a mystery as well.

At least there is no (statistical) difference in the variable price of a Giga Byte whether the customer chooses to pay off her device over the 24-month contract period or pay (most of) it upfront.

For AA it doesn't seem to be of concern! … As 88% would come back for more (according to their website).

Obviously, this whole analysis makes the big assumption that the data-centric price plans have been somewhat rationally derived … which might not be the case!

And it assumes that rationally & transparently derived price plans are best for the consumer …

And it assumes that what is good for the consumer is also good for the company …

Is AA different in this respect from other operators around the world?

No! AA is not different from any other incumbent operator coming from a mobile-voice-centric domain!

Acknowledgement

I greatly acknowledge my wife Eva Varadi for her support, patience and understanding during the creative process of writing this blog.

Postscript – The way I like to look at (rational … whatever that means) data-centric pricing.

Firstly, it would appear that AA's pricing philosophy follows the industry standard of pricing mobile services, and in particular mobile data-centric services, by the data volume allowance. Non-data services are added to the data-centric price plan and in effect make up the largest part of the plan, even at relatively high data allowances:

standard pricing philosophy in mobile domain

  • Figure above: illustrates the typical approach to price-plan design in the telecom industry. Note that while not per se wrong, it often overweights the volume element of pricing and often results in sub-optimizing the quality and product aspects. Source: Dr. Kim K Larsen's Mind Share contribution at Informa's LTE World Summit, May 2012: "Right pricing LTE and mobile broadband in general (a Technologist's observations)".

Unlimited voice and SMS in AA's standard data-centric plans should clearly mitigate possible loss of, or migration away from, old-fashioned (i.e., circuit-switched) voice and SMS. However, both the estimated allowances for unlimited voice (6.5) and SMS (2.5) appear to be a lot lower than their classical standalone ARPUs for the postpaid category. This could certainly explain why this market (like many others in Western Europe) has lost a massive amount of voice revenue over the last 5 years. In other words, re-capturing or re-balancing legacy service revenues into data-centric plans still has some way to go in order to be truly effective (if it is possible at all, which is highly questionable at this time and age).

pricing_fundamentals

As a technologist, I am particularly interested in how technology costs and benefits are considered in data-centric price plans.

The big challenge for the pricing expert who focuses too much on volume is that the same volume can result from vastly different network qualities and speeds. The customer's handset will drive the experience of quality and certainly the consumption, and thereby differences in network load and thus technology cost. A customer with an iPhone 6 Plus is likely to load the mobile data network more (and thus incur higher cost) than a customer with a normal-screen smartphone 1 or 2 generations removed from the iPhone 6 Plus. It is even conceivable that a user with an iPhone 6 Plus will load the network more than a customer with a normal iPhone 6 (independent of the iOS version). This is very, very different from the voice and SMS volumetric considerations in legacy price plans, where the handset had little (or no) impact on network load relative to the usage.

For data-centric price plans to be consistent with the technology cost incurred, one should consider the following:

  • Higher "guaranteed" quality, typically speed or latency, should be priced higher per Byte than lower-quality plans (or at the very least not lower).
  • Higher volumetric allowances should be priced higher per Byte than lower volumetric allowances (or at the very least not lower).
  • Offering unlimited voice & SMS in data-centric plans (as well as other bundled goodies) should be carefully re-balanced to re-capture some of the lost legacy revenues.

That AA's data-centric plans for double speed appear to be cheaper per Byte than their plans at a lower data-delivery quality level is not consistent with costing. Of course, AA cannot really guarantee that the customer will get double 4G speed everywhere, and as such it may not be fair to charge substantially more than for single speed. However, that is of course not what appears to happen here.

AA's lowest data unit price is around 0.6–0.7 per Giga Byte (or 0.06–0.07 cents per Mega Byte). That price is very low and in all likelihood lower than their actual production cost of a GB or MB.

However, one may argue that as long as the total service revenue gained by a data-centric price plan recovers the production cost, as well as providing a healthy margin, then whether the applied data unit price is designed to recover the data production cost is maybe less of an issue.

In other words, data profitability may not matter as much as overall profitability. That said, it remains in my opinion inexcusable for a mobile operator not to understand its main (data) cost drivers and ensure they are recovered in its overall pricing strategy.

"Surely!", you may say. "Surely mobile operators know their cost structure and respective cost drivers, and their price plans reflect this knowledge?"

It is my observation that most price plans (data-centric or not) are developed primarily in response to competition (which of course is an important pricing element as well), rather than being firmly anchored in cost, value & profit considerations. Do operators really & deeply know their own cost structure and cost drivers? … Ahhh … in my opinion few really appear to!

The Unbearable Lightness of Mobile Voice.

  • Mobile data adoption can be (and usually is) very unhealthy for mobile voice revenues.
  • A Mega Byte of mobile voice is 6 times more expensive than a Mega Byte of mobile data (global average).
  • If customers paid the mobile data price for mobile voice, 50% of global mobile revenue would evaporate (based on 2013 data).
  • Classical mobile voice is not dead! Global mobile voice usage grew by more than 50% over the last 5 years, though global voice revenue remained largely constant (over 2009–2013).
  • Mobile voice revenues declined in most Western European & Central Eastern European countries.
  • Voice revenue in emerging mobile-data markets (i.e., Latin America, Africa and APAC) showed positive, although decelerating, growth.
  • Mobile applications providing high-quality (often high-definition) mobile Voice over IP should be expected to dent classical mobile voice revenues (just as apps have impacted SMS usage & revenue).
  • Most Western & Central Eastern European markets show an increasing decline in the price elasticity of mobile voice demand. In some markets (regions) voice demand even declined as voice prices were reduced (note: causality should not be deduced from this trend, though).
  • The art of re-balancing (or re-capturing) mobile voice revenue in data-centric price plans is non-trivial and prone to trial-and-error (but likely also unavoidable).

An Unbearable Lightness.

There is something almost perverse about how lightly the mobile industry tends to treat mobile voice. An unbearable lightness?

How often don't we hear Telco executives wish for All-IP and web-centric services for all? More and more mobile data-centric plans are being offered with voice as an afterthought, even though voice still constitutes more than 60% of the global mobile turnover (and in many emerging mobile markets even more), and even though classical mobile voice is more profitable than true mobile broadband access. "Has the train left the station" for voice and is it running off the track? In my opinion, it might have for some telecom operators, but surely not for all. Taking some time away from thinking about mobile data would already be an incredible improvement if it were spent on strategizing and safeguarding mobile voice revenues, which still are a very substantial part of the mobile business model.

Mobile data penetration is unhealthy for voice revenue. It is almost guaranteed that voice revenue will start declining as mobile data penetration reaches 20% and beyond. There are very few exceptions (i.e., Australia, Singapore, Hong Kong and Saudi Arabia) to this rule, as observed in the figure below. Much of this can be explained by the Telecoms' focus on mobile data and mobile-data-centric strategies that take the mobile voice business for granted, or as an afterthought, focusing on a future of All-IP services where voice is "just" another data service. Given the importance of voice revenues to the mobile business model, treating voice as an afterthought is maybe not the most value-driven strategy to adopt.

I should maybe point out that this is not per se a result of the underlying cellular All-IP technology. The fact is that cellular voice over an All-IP network is very well specified within 3GPP. Voice over LTE (i.e., VoLTE), or Voice over HSPA (VoHSPA) for that matter, is enabled by the IP Multimedia Subsystem (IMS). Both VoLTE and VoHSPA, or simply cellular Voice over IP (cellular VoIP as specified by 3GPP), are highly spectrally efficient (compared to their circuit-switched equivalents). Further, cellular VoIP can be delivered at a quality comparable to, or better than, high-definition (HD) circuit-switched voice. This is supported by recent Mean Opinion Score (MOS) measurements by Ericsson and, more recently (August 2014), by Signals Research Group & Spirent, who together have done very extensive VoLTE network benchmark tests, including comparisons of VoLTE with the voice quality of 2G & 3G voice as well as Skype ("Behind the VoLTE Curtain, Part 1. Quantifying the Performance of a Commercial VoLTE Deployment"). A further advantage of cellular VoIP is that it is specified to inter-operate with legacy circuit-switched networks via the circuit-switched fallback functionality. An excellent account of cellular VoIP, and VoLTE in particular, can be found in Miikka Poikselkä et al.'s great book "Voice over LTE" (Wiley, 2012).

It's not the All-IP technology that is wrong; it's the commercial & strategic thinking around voice in an All-IP world that leaves a lot to be wished for.

Voice over LTE provides much better voice quality than a non-operator-controlled (i.e., OTT) mobile VoIP application would be able to offer. But whether that quality is worth 5 to 6 times the price of data, that is the Billion $ question.

voice growth vs mobile data penetration

  • Figure above: illustrates the compound annual growth rates (2009 to 2013) of mobile voice revenue versus the mobile data penetration at the beginning of the period (i.e., 2009). As will be addressed later, it should be noted that the growth of mobile voice revenues does NOT only depend on mobile data penetration rates but also on a few other important factors, such as the addition of new unique subscribers, the minute price and the voice ARPU compared to the income level (to name a few). The analysis is based on Pyramid Research data. Abbreviations: WEU: Western Europe, CEE: Central Eastern Europe, APAC: Asia Pacific, MEA: Middle East & Africa, NA: North America, LA: Latin America.

In the following discussion, classical mobile voice should be understood as an operator-controlled voice service charged by the minute, or in equivalent economic terms (i.e., re-balanced data pricing). This is opposed to a mobile-application-based voice service (outside the direct control of the telecom operator) charged through the tariff structure of a mobile data package without imposed re-balancing.

If the industry charged a mobile voice minute the equivalent of what it charges a mobile Mega Byte … almost 50% of mobile turnover would disappear … so be careful AND be prepared for what you wish for!

There are at least a couple of good reasons why mobile operators should be very focused on preserving mobile voice as we know it (or approximately so), also in LTE (and any future standard). Even more so, mobile operators should try to avoid too many associations with non-operator-controlled Voice-over-IP (VoIP) smartphone applications (easier said than done, I know). It will be very important to define a future voice service on the All-IP mobile network that maintains its economics (i.e., pricing & margin) and does not get "confused" with mobile-data-based economics, with their substantially lower unit prices & questionable profitability.

Back in 2011 at the Mobile Open Summit, I presented "Who pays for Mobile Broadband" (both in London & San Francisco) with the following picture, drawing attention to some of the legacy service (e.g., voice & SMS) challenges our industry would be facing in the years to come from the many mobile applications developed and in development:

voice_future

One of the questions back in 2011 was (and wow, it still is!) how to maintain mobile ARPU & revenues at a reasonable level, as opposed to the massive loss of revenue and business-model sustainability that the mobile data business model appeared to promise (and pretty much still does), in particular given the threats (& opportunities) from mobile smartphone applications: mobile apps that provide mobile customers with attractive price arbitrage compared to their legacy prices for SMS and classical voice.

"IP killed the SMS Star" … Will IP do away with classical mobile voice economics as well?

Okay … let's just be clear about what is killing SMS (it's hardly dead yet). The mobile smartphone Messaging-over-IP (MoIP) app does the killing. However, the tariff structure of an SMS vis-à-vis that of a mobile Mega Byte (i.e., ca. 3,000x) is the real instigator of the deed, together with the sheer convenience of the mobile application itself.

As of August 2014, the top messaging & Voice-over-IP smartphone applications share ca. 2.0+ billion active users (not counting Facebook Messenger, and of course with overlap, i.e., active users having several apps on their device). WhatsApp is the number-one mobile communications app with about 700 million active users (up from 600 million active users in August 2014). Other smartphone apps are further away from WhatsApp's adoption figures: applications from Viber can boast 200+M active users, WeChat (predominantly popular in Asia) reportedly has 460+M active users, and good old Skype around 300+M active users. The impact of smartphone MoIP applications on classical messaging (e.g., SMS) is well evidenced. So far, mobile Voice-over-IP has not visibly dented the telecom industry's mobile voice revenues. However, the historical evidence is obviously no guarantee that it will not become an issue in the future (near, medium or far).

WhatsApp is rumoured to launch mobile voice calling in the first quarter of 2015 … Will this event be the undoing of operator-controlled classical mobile voice? WhatsApp has already taken the SMS scalp, with 30 billion WhatsApp messages sent per day according to the latest data from WhatsApp (January 2015). For comparison, the number of SMS sent over mobile networks globally was a bit more than 20 billion per day (source: Pyramid Research data). It will be very interesting (and likely scary as well) to follow how the WhatsApp Voice (over IP) service will impact telecom operators' mobile voice usage and, of course, their voice revenues. The industry appears to take the news lightly and is supposedly unconcerned about the prospect of WhatsApp launching a mobile voice service (see: "WhatsApp voice calling – nightmare for mobile operators?" from 7 January 2015) … My favourite lightness is Vodacom's (South Africa) "if anything, this vindicates the massive investments that we've been making in our network…." … Talking about the unbearable lightness of mobile voice … (68% of the mobile internet users in South Africa have WhatsApp on their smartphone).

Paying the price of a Mega Byte of mobile voice.

A Mega-Byte is not just a Mega-Byte … it is much more than that!

In 2013, the going Global average rate of a Mobile (Data) Mega Byte was approximately 5 US-Dollar Cents (or a Nickel). A Mega Byte (MB) of circuit-switched voice (i.e., ca. 11 Minutes @ 12.2 kbps codec) would cost you 30+ US$-cent or about 6 times that of a Mobile Data MB. Were you to send a MB of SMS (i.e., ca. 7,143 of them), that would cost you roughly 150 US$ (NOTE: US$ not US$-cents).

1 Mobile MB = 5 US$-cent Data MB < 30+ US$-cent Voice MB (6x mobile data) << 150 US$ SMS MB (3000x mobile data).
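The arithmetic behind these comparisons is easily checked. A minimal sketch, assuming 1 MB = 10^6 bytes, a 12.2 kbps codec and 140-byte SMS; the per-minute (~2.7 US$-cent) and per-SMS (~2.1 US$-cent) rates are backed out of the per-MB figures quoted above, not independent data points:

```python
# Back-of-envelope check of the Mega Byte comparisons above.
# Assumptions: 1 MB = 10^6 bytes, a 12.2 kbps voice codec, 140-byte SMS;
# the per-minute and per-SMS rates are backed out of the quoted per-MB figures.

MB = 1_000_000  # bytes

# Voice: minutes of conversation in one MB at 12.2 kbps
voice_minutes_per_mb = MB * 8 / 12_200 / 60      # ≈ 10.9, i.e., "ca. 11 Minutes"

# SMS: number of 140-byte messages in one MB
sms_per_mb = MB / 140                            # ≈ 7,143 messages

# Cost of a MB of each service (2013 global averages)
data_usd_per_mb = 0.05                           # 5 US$-cent
voice_usd_per_mb = 0.027 * voice_minutes_per_mb  # ≈ 0.30 US$, ca. 6x the data MB
sms_usd_per_mb = 0.021 * sms_per_mb              # ≈ 150 US$, ca. 3,000x the data MB

print(voice_minutes_per_mb, sms_per_mb, voice_usd_per_mb, sms_usd_per_mb)
```

The factors of ~6 for voice and ~3,000 for SMS drop straight out of these unit conversions.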

A Mega Byte of voice conversation is pretty unambiguous in the sense of being 11 minutes of a voice conversation (typically a dialogue, but it could be a monologue as well, e.g., voice mail or an angry better half) at a 12.2 kbps speech codec. How many mega bytes a given voice conversation will translate into depends on the underlying speech coding & decoding (codec) information rate, which typically is 12.2 kbps or 5.9 kbps (i.e., for 3GPP cellular-based voice). In general we would not be directly conscious of the speed (e.g., 12.2 kbps) at which our conversation is being coded and decoded, although we certainly would be aware of the quality of the codec itself and its ability to correct the errors that will occur in-between the two terminals. For the voice conversation itself, the parties that engage in the conversation are pretty much determining its duration.

An SMS is pretty straightforward and well defined as well, i.e., being 140 Bytes (or 160 7-bit characters). Again the underlying delivery speed is less important as for most purposes the SMS sending & delivery feels almost instantaneous (though the reply might not be).

All good … but what about a Mobile Data Byte? As a concept it could be anything or nothing. A Mega Byte of Data is Extremely Ambiguous. Certainly we get pretty upset if we perceive a mobile data connection to be slow. But the content, represented by the Byte, would obviously impact our perception of time and whether we are getting what we believe we are paying for. We are no longer masters of time. The Technology has taken over time.

Some examples: A Mega Byte of Voice is 11 minutes of conversation (@ 12.2 kbps). A Mega Byte of Text might take a second to download (@ 1 Mbps) but 8 hours to process (i.e., read). A Mega Byte of SMS might be delivered (individually & hopefully for you and your sanity spread out over time) almost instantaneously and would take almost 16 hours to read through (assuming English language and an average mature reader). A Mega Byte of graphic content (e.g., a picture) might take a second to download and milliseconds to process. Is a Mega Byte (MB) of streaming music that lasts for 11 seconds (@ 96 kbps) of similar value to a MB of Voice conversation that lasts for 11 minutes, or to a MB picture processed in milliseconds (that took a second to download)?

In my opinion the answer should be clearly NO … Such (somewhat silly) comparisons serve to show the problem with pricing and valuing a Mega Byte. It also illustrates the danger of the ambiguity of mobile data and why an operator should try to avoid bundling everything under the banner of mobile data (or at the very least be smart about it … whatever that means).

I am being a bit naughty in above comparisons, as I am freely mixing up the time scales of delivering a Byte and the time scales of neurological processing that Byte (mea culpa).

price of a mb 

  • Figure Above: Logarithmic representation of the cost per Mega Byte of a given mobile service. 1 MB of Voice roughly corresponds to 11 Minutes at a 12.2 kbps voice codec, i.e., ca. 1/25th of the monthly global MoU usage. 1 MB of SMS corresponds to ca. 7,143 SMSs which is a lot (actually really a lot). In the USA 7,143 would roughly correspond to a full year’s consumption. However, in WEU 7,143 SMS would be ca. 6+ years of SMS consumption (on average), and almost 12 years of SMS consumption in the MEA Region. Still, SMS remains disproportionately costly and is clearly an obvious service to be rapidly replaced by mobile data as it becomes readily available. Source: Pyramid Research.

The “Black” Art of Re-balancing … Making the Lightness more Bearable?

I recently had a discussion with a very good friend (from an emerging market) about how to recover lost mobile voice revenues in the mobile data plans (i.e., the art of re-balancing or re-capturing). Could we do without Voice Plans? Should we go all-in on the Data Package? Obviously, if you would charge 30+ US$-cent per Mega Byte of Voice, while you charge 5 US$-cent for Mobile Data, that might not go down well with your customers (or consumer interest groups). We all know that “window-dressing” and sleight-of-hand are important principles in presenting attractive pricing. So instead of Mega Byte voice we might charge per Kilo Byte (a lower numeric price), i.e., 0.029 US$-cent per kilo byte (note: 1 kilo byte is ca. 0.65 seconds @ 12.2 kbps codec). But in general consumers are smarter than that. Probably the best approach is to maintain a per time-unit charge or to blend the voice usage & pricing into the Mega Byte Data Price Plan (and hope you have done your math right).

Example (a very simple one): Say you have a 500 MB mobile data price plan at 5 US$-cent per MB (i.e., 25 US$). You also have a 300 Minute Mobile Voice Plan at 2.7 US$-cent a minute (or 30 US$-cent per MB). Now 300 Minutes corresponds roughly to 30 MB of Voice Usage and would be charged ca. 9 US$. Instead of having a Data & Voice Plan, one might have only the Data Plan charging (500 MB x 5 US$-cent/MB + 30 MB x 30 US$-cent/MB) / 530 MB or 6.4 US$-cent per MB (i.e., 1.4 US$-cent more than the pure data plan, or a ca. 30% surcharge for Voice on the Mobile Data Bytes). Obviously such a pricing strategy (while simple) does pose some strategic pricing challenges and certainly does not per se completely safeguard against voice revenue erosion. Keeping Mobile Voice separate from Mobile Data (i.e., Minutes vs Mega Bytes) in my opinion will remain the better strategy. Although such a minutes-based strategy is easily disrupted by innovative VoIP applications and data-only entrepreneurs (as well as Regulatory Authorities).
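The blended-plan arithmetic can be sketched as follows (numbers as in the example above; note the exact codec conversion gives ca. 27.5 MB, rounded to 30 MB in the example):

```python
# Re-balancing sketch: fold a 300-minute voice plan into a 500 MB data plan
# and derive the blended per-MB price (numbers from the example above).

# 300 minutes at a 12.2 kbps codec, expressed in MB
voice_mb_exact = 300 * 60 * 12_200 / 8 / 1_000_000  # ≈ 27.5 MB
voice_mb = 30                                       # rounded, as in the example
voice_charge = voice_mb * 0.30                      # 30 US$-cent per voice MB -> 9 US$

data_mb = 500
data_charge = data_mb * 0.05                        # 5 US$-cent per MB -> 25 US$

blended_per_mb = (data_charge + voice_charge) / (data_mb + voice_mb)
surcharge = blended_per_mb / 0.05 - 1               # surcharge over the pure data MB

print(round(blended_per_mb, 3), round(surcharge, 2))  # 0.064, 0.28 (ca. 30% surcharge)
```

The ~30% surcharge is thus just the voice revenue spread across the enlarged MB bucket; it disappears if customers substitute their minutes with VoIP while keeping the plan.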

Re-balancing (or re-capturing) the voice revenue in data-centric price plans is non-trivial and prone to trial-and-error. Nevertheless it is clearly an important pricing strategy area to focus on in order to defend existing mobile voice revenues from evaporating or being devalued by the mobile data price plan association.

Is Voice-based communication for the Masses (as opposed to SME, SOHO, B2B, Niche demand, …) technologically un-interesting? As a techno-economist I would say far from it. From GSM over HSPA towards LTE, we have observed a quantum leap, a factor of 10, in voice spectral efficiency (or capacity), a substantial boost in link budget (i.e., approximately 30% more geographical area can be covered with UMTS as opposed to GSM in apples-for-apples configurations) and of course increased quality (i.e., high-definition or crystal-clear mobile voice). The below Figure illustrates the progress in voice capacity as a function of mobile technology. The relative voice spectral efficiency data in the below figure has been derived from one of the best (imo) textbooks on mobile voice, “Voice over LTE” by Miikka Poikselkä et al. (Wiley, 2012):

voice spectral capacity

  • Figure Above: Abbreviation guide; EFR: Enhanced Full Rate, AMR: Adaptive Multi-Rate, DFCA: Dynamic Frequency & Channel Allocation, IC: Interference Cancellation. What might not always be appreciated is the possibility of defining voice over HSPA, similar to Voice over LTE. Source: “Voice over LTE” by Miikka Poikselkä et al. (Wiley, 2012).

If you do a Google Search on Mobile Voice you get ca. 500 Million results (note: Voice over IP only yields 100+ million results). Try that on Mobile Data and “sham bam thank you mam” you get 2+ Billion results (and projected to increase further). Most of us working in the Telecom industry spend very little time on voice issues and an over-proportionate amount of time on broadband data. When you tell your Marketing Department that a state-of-the-art 3G network can carry at least twice as much voice traffic as state-of-the-art GSM (and cover over 30% more area) they don’t really seem to get terribly excited? Voice is un-sexy!? an afterthought!? … (don’t even go brave and tell Marketing about Voice over LTE, aka VoLTE).

Is Mobile Voice Dead or at the very least Dying?

Is Voice un-interesting, something to be taken for granted?

Is Voice “just” data and should be regarded as an add-on to Mobile Data Services and Propositions?

From a Mobile Revenue perspective mobile voice is certainly not something to be taken for granted or just an afterthought. In 2013, mobile voice still accounted for 60+% of the total global mobile turnover, with non-voice services making up the remaining ca. 40%, of which SMS ca. 10%. There is a lot of evidence that SMS is dying out quickly with the emergence of smartphones and Messaging-over-IP-based mobile applications (SMS – Assimilation is inevitable, Resistance is Futile!). Not particularly surprising given the pricing of SMS and the many very attractive IP-based alternatives. So is there similar evidence of mobile voice dying?

NO! NIET! NEM! MA HO BU! NEJ! (not any time soon at least)

Let’s see what the data has to say about mobile voice.

In the following I only provide a Regional view, but should there be interest I have very detailed deep dives for most major countries in the various regions. In general there are bigger variations relative to the regional averages in the Middle East & Africa (i.e., MEA) as well as Asia Pacific (i.e., APAC) Regions, as there is a larger mix of mature and emerging markets with fairly large differences in mobile penetration rates and mobile data adoption in general. Western Europe, Central Eastern Europe, North America (i.e., USA & Canada) and Latin America are more uniform in the conclusions that can reasonably be inferred from the averages.

As shown in the Figure below, from 2009 to 2013 the total amount of mobile minutes generated globally increased by 50+%. Most of that increase came from emerging markets as a larger share of the population (in terms of individual subscribers rather than subscriptions) adopted mobile telephony. In absolute terms, the global mobile voice revenues did show evidence of stagnation, trending towards decline.

mobile revenues & mou growth 

  • Figure Above: Illustrates the development & composition of historical Global Mobile Revenues over the period 2009 to 2013. In addition it also shows the total estimated growth of mobile voice minutes (i.e., Red Solid Curve showing MoUs in units of Trillions) over the period. Sources: Pyramid Research & Statista. It should be noted that the actual numbers (over the period) from various data sources do not completely match. I have observed differences between sources of up to 15% in actual global values. While interesting, this difference does not alter the analysis & conclusions presented here.

If all voice minutes were charged at the current Rate of Mobile Data, approximately Half-a-Trillion US$ would evaporate from the Global Mobile Revenues.

So while mobile voice revenues might not be a positive growth story, it is still “sort-of” important to the mobile industry business.

Most countries in Western & Central Eastern Europe, as well as mature markets in the Middle East and Asia Pacific, show mobile voice revenue decline (in absolute terms and in their local currencies). In Latin America, Africa and the Emerging Mobile Data Markets in Asia-Pacific almost all exhibit positive mobile voice revenue growth (although most have decelerating growth rates).

voice rev & mous

  • Figure Above: Illustrates the annual growth rates (compounded) of total mobile voice revenues and the corresponding growth in mobile voice traffic (i.e., associated with the revenues). Some care should be taken as for each region US$ has been used as a common currency. In general each individual country within a region has been analysed based on its own local currency in order to avoid mixing up currency exchange effects. Source: Pyramid Research.

Of course revenue growth of the voice service will depend on (1) the growth of subscriber base, (2) the growth of the unit itself (i.e., minutes of voice usage) as it is used by the subscribers (i.e., which is likely influenced by the unit price), and (3) the development of the average voice revenue per subscriber (or user) or the unit price of the voice service. Whether positive or negative growth of Revenue results, pretty much depends on the competitive environment, regulatory environment and how smart the business is in developing its pricing strategy & customer acquisition & churn dynamics.

Growth of (unique) mobile customers obviously depends on the level of penetration, network coverage & customer affordability. Growth in highly penetrated markets is in general (much) lower than growth in less mature markets.

subs & mou growth

  • Figure Above: Illustrates the annual growth rates (compounded) of unique subscribers added to a given market (or region). Further to illustrate the possible relationship between increased subscribers and increased total generated mobile minutes the previous total minutes annual growth is shown as well. Source: Pyramid Research.

Interestingly, particularly for the North America Region (NA), we see an increase in unique subscribers of 11% per annum and hardly any growth in total voice minutes over the period. Firstly, note that the US Market will dominate the average of the North America Region (i.e., USA and Canada), having approx. 13 times more subscribers. One of the reasons for this no-minutes-growth effect is that the US market saw a substantial increase in the prepaid ratio (i.e., from ca. 19% in 2009 to 28% in 2013). Not only were new (unique) prepaid customers being added; a fairly large postpaid-to-prepaid migration also took place over the period. In the USA the minute usage of a prepaid subscriber is ca. 35+% lower than that of a postpaid subscriber (globally, prepaid minute usage is 2.2+ times lower than postpaid usage). In the NA Region (and of course likewise in the USA Market) we observe reduced voice usage over the period in both the postpaid & prepaid segments (based on unique subscribers). Thus an increased prepaid blend in the overall mobile base, with its relatively lower voice usage, combined with a general decline in voice usage, leads to pretty much zero growth in voice usage in the NA Market. Although the NA Region is dominated by USA growth (ca. 0.1% CAGR total voice growth), Canada likewise showed very minor growth in overall voice usage (ca. 3.8% CAGR). Both Canada & USA reduced their minute pricing over the period.
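The mix-shift effect alone can be quantified with a small sketch. The prepaid share (19% → 28%) and the ~35% lower prepaid usage are the figures quoted above; the postpaid MoU level is an illustrative assumption (the relative result does not depend on it):

```python
# Stylised US-like example: how a rising prepaid blend dampens per-subscriber
# voice usage, even with per-segment usage held constant.
# Assumptions: postpaid MoU level is illustrative; prepaid usage ~35% lower,
# prepaid share rising from 19% to 28% (figures quoted in the text).

postpaid_mou = 700                       # illustrative minutes/month per postpaid sub
prepaid_mou = postpaid_mou * (1 - 0.35)  # prepaid usage ca. 35% lower

def blended_mou(prepaid_share):
    return prepaid_share * prepaid_mou + (1 - prepaid_share) * postpaid_mou

change = blended_mou(0.28) / blended_mou(0.19) - 1
print(round(change * 100, 1))  # -3.4 (% change in blended MoU from the mix shift alone)
```

A ~3% drag from the mix shift alone, on top of declining per-segment usage, is enough to absorb the minute growth that 11% p.a. subscriber additions would otherwise have produced.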

  • Note on US Voice Usage & Revenues: note that in both the US and Canada the receiving party also pays (RPP) for receiving a voice call. Thus revenue-generating minutes arise from both outgoing and incoming minutes. This is different from most other markets where the Calling Party Pays (CPP) and only originating minutes are counted in the revenue generation. For example, in the USA the Minutes of Use per blended customer was ca. 620 MoU in 2013. To make that number comparable with, say, Europe’s 180 MoU, one would need to halve the US figure to 310 MoU, still a lot higher than the Western European blended minutes of use. The US bundles are huge (in terms of allowed minutes), and likewise the charges outside bundles (i.e., forcing the consumer into the next one), while the fixed fees tend to be high to very high (in comparison with other mobile markets). The traditional US voice plan would offer unlimited on-net usage (i.e., both calling & receiving party subscribing to the same mobile network operator) as well as unlimited off-peak usage (i.e., evening/night/weekends). It should be noted that many new US-based mobile price plans offer data bundles with unlimited voice (i.e., data-centric price plans). In 2013 approximately 60% of the US mobile industry’s turnover could be attributed to mobile voice usage. This number is likely somewhat higher as some data tariffs have voice usage (e.g., typically unlimited) embedded. In particular, the US mobile voice business model depends on customer migration to prepaid or lower-cost bundles as well as how well the voice usage is being re-balanced (and re-captured) in the data-centric price plans.

The second main component of the voice revenue is the unit price of a voice minute. Apart from the NA Region, all markets show substantial reductions in the unit price of a minute.

mou & minute price growth

  • Figure Above: Illustrating the annual growth (compounded) of the per minute price in US$-cents as well as the corresponding growth in total voice minutes. The most affected by declining growth is Western Europe & Central Eastern Europe although other more-emerging markets are observed to have decelerating voice revenue growth. Source: Pyramid Research.

Clearly from the above it appears that voice elasticity has broken down in most mature markets, with diminishing (or no) returns on further minute price reductions. Another way of looking at the loss (or lack) of voice elasticity is to look at the unit-price development of a voice minute versus the growth of the total voice revenues;

elasticity

  • Figure Above: Illustrates the growth of Total Voice Revenue and the unit-price development of a mobile voice minute. Apart from the Latin America (LA) and Asia Pacific (APAC) markets, there is clearly not much further point in reducing the price of voice. Obviously, there are other sources & causes affecting the price development of a mobile voice minute than pure elasticity gains (i.e., regulatory, competition, reduced demand/voice substitution, etc.). Note US$ has been used as the unifying currency across the various markets. Despite currency effects the trend is consistent across the markets shown above. Source: Pyramid Research.

While Western & Central-Eastern Europe (WEU & CEE) as well as the mature markets in the Middle East and Asia-Pacific show little economic gain in lowering the voice price, in the more emerging markets (LA and Africa) there are still net voice revenue gains to be made by lowering the unit price of a minute (although the gains are diminishing rapidly). Most of the voice growth in the emerging markets comes, however, from adding new customers rather than from growth in the demand per customer itself.

voice growth & uptake

  • Figure Above: Illustrating possible drivers for mobile voice growth (positive as well as negative); such as Mobile Data Penetration 2013 (expected negative growth impact), increased number of (unique) subscribers compared to 2009 (expected positive growth impact) and changes in the prepaid-postpaid blend (a negative %tage means postpaid increased its proportion, while a positive %tage translates into a higher proportion of prepaid compared to 2009). Voice tariff changes have been observed to have elastic effects on usage as well, although the impact changes from market to market depending on maturity. Source: derived from Pyramid Research.

With all the talk about Mobile Data, it might come as a surprise that Voice Usage is actually growing across all regions with the exception of North America. The sources of the Mobile Voice Minutes Growth are largely coming from

  1. Adding new unique subscribers (i.e., increasing mobile penetration rates).
  2. Transitioning existing subscribers from prepaid to postpaid subscriptions (i.e., postpaid tends to have (a lot) higher voice usage compared to prepaid).
  3. General increase in usage per individual subscriber (i.e., few markets where this is actually observed irrespective of the general decline in the unit cost of a voice minute).

To the last point (#3) it should be noted that the general trend across almost all markets is that Minutes of Use per Unique customer is stagnating and even in decline despite substantial per unit price reduction of a consumed minute. In some markets that trend is somewhat compensated by increase of postpaid penetration rates (i.e., postpaid subscribers tend to consume more voice minutes). The reduction of MoUs per individual subscriber is more significant than a subscription-based analysis would let on.

Clearly, Mobile Voice Usage is far from Dead

and

Mobile Voice Revenue is a very important part of the overall mobile revenue composition.

It might make very good sense to spend a bit more time on strategizing voice than appears to be the case today. If mobile voice remains just an afterthought of mobile data, the Telecom industry will lose massive amounts of Revenue and, last but not least, Profitability.

 

Post Script: What drives the voice minute growth?

An interesting exercise is to take all the data and run some statistical analysis on it to see what comes out in terms of main drivers for voice minute growth, positive as well as negative. The data available to me comprises 77 countries from WEU (16), CEE (8), APAC (15), MEA (17), NA (Canada & USA) and LA (19). I am furthermore working with 18 different growth parameters (e.g., mobile penetration, prepaid share of base, data adaptation, data penetration begin of period, minutes of use, voice arpu, voice minute price, total minute volume, customers, total revenue growth, sms, sms price, pricing & arpu relative to nominal gdp etc…) and 7 dummy parameters (populated with noise and unrelated data).

Two specific voice minute growth models emerge out of a comprehensive analysis of the data described above. The first model is as follows:

(1) Voice Growth correlates positively with Mobile Penetration (of unique customers), in the sense that higher penetration results in more minutes; it correlates negatively with Mobile Data Penetration at the beginning of the period (i.e., 2009 uptake of 3G, LTE and beyond), in the sense that higher mobile data uptake at the beginning of the period leads to a reduction of Voice Growth; and finally Voice Growth correlates negatively with the Price of a Voice Minute, in the sense that higher prices lead to lower growth and lower prices lead to higher growth. This model is statistically fairly robust (e.g., p-values < 0.0001), with all parameters having statistically meaningful confidence intervals (i.e., upper & lower 95% confidence bounds having the same sign).

The Global Analysis does pinpoint very rational drivers for mobile voice usage growth, i.e., that mobile penetration growth, mobile data uptake and the price of a voice minute are important drivers for total voice usage.

It should be noted that changes in the prepaid proportion do not appear statistically to impact voice minute growth.
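For the curious, the kind of cross-country regression behind model 1 can be sketched as follows. The data here is entirely synthetic (the Pyramid Research dataset is not reproduced) and the variable names are hypothetical; the point is only the model structure and the expected coefficient signs:

```python
# Synthetic illustration of "model 1": voice-minute growth regressed on mobile
# penetration (+), mobile data penetration at period start (-) and minute price (-).
# All data below is generated, NOT the actual 77-country dataset.
import numpy as np

rng = np.random.default_rng(0)
n = 77  # number of markets in the analysis

mobile_penetration = rng.uniform(0.4, 1.2, n)     # unique-customer penetration
data_penetration_2009 = rng.uniform(0.0, 0.5, n)  # 3G+ uptake at period start
minute_price = rng.uniform(0.01, 0.15, n)         # US$ per minute

# Generate growth with the signs the text reports, plus noise
voice_growth = (0.20 * mobile_penetration
                - 0.15 * data_penetration_2009
                - 0.80 * minute_price
                + rng.normal(0, 0.01, n))

# Ordinary least squares fit (intercept + 3 regressors)
X = np.column_stack([np.ones(n), mobile_penetration,
                     data_penetration_2009, minute_price])
coef, *_ = np.linalg.lstsq(X, voice_growth, rcond=None)
print(np.sign(coef[1:]))  # expected signs: +, -, -
```

A real analysis would of course add the p-values and confidence intervals discussed above (e.g., via a statistics package) rather than a bare least-squares fit.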

The second model provides a marginally better overall fit to the Global Data but yields slightly worse p-values for the individual descriptive parameters.

(2) The second model simply adds the Voice ARPU to (nominal) GDP ratio to the first model. This yields a negative correlation in the sense that a low ratio results in higher voice usage growth and a higher ratio in lower voice usage growth.

Both models describe the trends of voice growth dynamics reasonably well, although less convincingly for Western & Central Eastern Europe and other more mature markets, where the model tends to overshoot the actual data. One of the reasons for this is that the attempt was to describe the global voice growth behaviour across very diverse markets within a single model.

mou growth actual vs model

  • Figure Above: Illustrates the compound annual growth rate (between 2009 and 2013) of total annual generated voice minutes for 77 markets across 6 major regions (i.e., WEU, CEE, APAC, MEA, NA and LA). Model 1 is an attempt to describe the Global growth trend across all 77 markets within the same model. The Global Model is not great for Western Europe and part of the CEE, although it tends to describe the trends between the markets reasonably well.

w&cee growth

  • Figure Western & Central Eastern Region: the above illustrates the compound annual growth rate (2009 – 2013) of total generated voice minutes and the corresponding voice revenues. For Western & Central Eastern Europe, while the generated minutes have increased, the voice revenue has consistently declined. The average CAGR of new unique customers over the period was 1.2%, with the maximum being little less than 4%.

apac growth

  • Figure Asia Pacific Region: the above illustrates the compound annual growth rate (2009 – 2013) of total generated voice minutes and the corresponding voice revenues. For the Emerging markets in the region there is still positive growth in both minutes generated and voice revenue. In most of the mature markets the voice revenue growth is negative, as has been observed for mature Western & Central Eastern Europe.

mea growth

  • Figure Middle East & Africa Region: the above illustrates the compound annual growth rate (2009 – 2013) of total generated voice minutes and the corresponding voice revenues. For the Emerging markets in the region there is still positive growth in both minutes generated and voice revenue. In most of the mature markets the voice revenue growth is negative, as has been observed for mature Western & Central Eastern Europe.

    na&la growth

  • Figure North & Latin America Region: the above illustrates the compound annual growth rate (2009 – 2013) of total generated voice minutes and the corresponding voice revenues. For the Emerging markets in the region there is still positive growth in both minutes generated and voice revenue. In most of the mature markets the voice revenue growth is negative, as has been observed for mature Western & Central Eastern Europe.

P.P.S. Voice Tariff Structure

  • Typically the structure of a mobile voice tariff (or how the customer is billed) is as follows

    • Fixed charge / fee

      • This fixed charge can be regarded as an access charge and usually is associated with a given usage limit (i.e., $ X for Y units of usage) or bundle structure.
    • Variable per unit usage charge

      • On-net – call originating and terminating within same network.
      • Off-net – Domestic Mobile.
      • Off-net – Domestic Fixed.
      • Off-net – International.
      • Local vs Long-distance.
      • Peak vs Off-peak rates (e.g., off-peak typically evening/night/weekend).
      • Roaming rates (i.e., when customer usage occurs in foreign network).
      • Special number tariffs (i.e., calls to paid-service numbers).

    How fixed vis-a-vis variable charges are implemented will depend on the particularities of a given market, in general on service penetration and local vs long-distance charges.
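As a toy illustration of the structure above, a minimal bill calculation with a fixed fee, an included domestic bundle and variable per-category rates might look as follows (all rates and category names are illustrative assumptions, not any specific operator's tariff):

```python
# Toy mobile voice bill: fixed fee + included domestic bundle + variable
# per-minute rates by call category. All rates are illustrative assumptions.

RATES = {  # US$ per minute charged outside the bundle
    "on_net": 0.00,          # e.g., unlimited on-net
    "off_net_mobile": 0.05,
    "off_net_fixed": 0.04,
    "international": 0.50,
    "roaming": 0.80,
}
DOMESTIC = ("on_net", "off_net_mobile", "off_net_fixed")  # covered by the bundle

def monthly_bill(fixed_fee, bundle_minutes, usage):
    """usage: {category: minutes}. The bundle covers domestic minutes first."""
    bill, bundle_left = fixed_fee, bundle_minutes
    for category, minutes in usage.items():
        if category in DOMESTIC:
            covered = min(minutes, bundle_left)
            bundle_left -= covered
            minutes -= covered
        bill += minutes * RATES[category]
    return bill

# 300-minute domestic bundle at 20 US$; 350 domestic + 10 international minutes used
print(round(monthly_bill(20.0, 300, {"off_net_mobile": 350, "international": 10}), 2))  # 27.5
```

The out-of-bundle overage (50 minutes here) is exactly the mechanism that pushes consumers into the next, larger bundle, as discussed for the US market above.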

  • Acknowledgement

    I gratefully acknowledge my wife Eva Varadi for her support, patience and understanding during the creative process of writing this Blog. I certainly have not always been very present during the analysis and writing. Also many thanks to Shivendra Nautiyal and others for discussing and challenging the importance of mobile voice versus mobile data and how practically to mitigate VoIP cannibalization of Classical Mobile Voice.