Will LEO Satellite Direct-to-Cell Networks make Terrestrial Networks Obsolete?

THE POST-TOWER ERA – A FAIRYTALE.

From the bustling streets of New York to the remote highlands of Mongolia, the skyline had visibly changed. Where steel towers and antennas once dominated now stood open spaces and restored natural ecosystems. Forests reclaimed the land, and birds nested in trees undisturbed by the scarring presence of tall rural cellular towers. This transformation was not sudden but resulted from decades of progress in satellite technology, growing demand for ubiquitous connectivity, an increasingly urgent need to address the environmental footprint of traditional telecom infrastructures, and the economic need to dramatically reduce operational expenses tied up in tower infrastructure. By the time the last cell site was decommissioned, society stood at the cusp of a new age of connectivity by LEO satellites covering all of Earth.

The annual savings worldwide from making terrestrial cellular towers obsolete are estimated to amount to at least 300 billion euros in total cost, and moving cellular access to “heaven” is expected to avoid more than 150 million metric tons of CO2 emissions annually. The retirement of all terrestrial cellular networks worldwide would be akin to eliminating the entire carbon footprint of The Netherlands or Malaysia, and it would lead to a dramatic reduction in demand for the sustainable green energy sources that previously powered the global cellular infrastructure.

INTRODUCTION.

Recent posts and a substantial share of the commentary give the impression that we are heading towards a post-tower era where Elon Musk’s Low Earth Orbit (LEO) satellite Starlink network (together with competing options, e.g., AST SpaceMobile and Lynk, and no, I do not see Amazon’s Project Kuiper in this space) will make terrestrially-based tower infrastructure and earth-bound cellular services obsolete.

T-Mobile USA is launching its Direct-to-Cell (D2C) service via SpaceX’s Starlink LEO satellite network. The T-Mobile service is designed to work with existing LTE-compatible smartphones, allowing users to connect to Starlink satellites without needing specialized hardware or smartphone applications.

Since the announcement, posts and media coverage have declared the imminent death of the terrestrial cellular network. When it is pointed out that this may be a premature death sentence for an industry, its telecom operators, and their existing cellular mobile networks, it is also not uncommon to be told off as being too pessimistic and an unbeliever in Musk’s genius vision. Musk has on occasion made it clear that the Starlink D2C service is aimed at texts and voice calls in remote and rural areas, and, to be honest, the D2C service currently hinges on 2×5 MHz in T-Mobile’s PCS band, adding constraints to the “broadbandedness” of the service. The fact that the service doesn’t match the best of T-Mobile US’s 5G network quality (e.g., 205+ Mbps downlink) or even get near its 4G speeds should really not bother anyone, as the value of the D2C service is that it is available in remote and rural areas with little to no terrestrial cellular coverage and that you can use your regular cellular device with no need for a costly satellite service and satphone (e.g., Iridium, Thuraya, Globalstar).

While I don’t expect to (or even want to) change people’s beliefs, I do think it would be great to contribute to more knowledge and insights based on facts about what is possible with low-earth orbiting satellites as a terrestrial substitute and what is uninformed or misguided opinion.

The rise of LEO satellites has sparked discussions about the potential obsolescence of terrestrial cellular networks. With advancements in satellite technology and increasing partnerships, such as T-Mobile’s collaboration with SpaceX’s Starlink, proponents envision a future where towers are replaced by ubiquitous connectivity from the heavens. However, the feasibility of LEO satellites achieving service parity with terrestrial networks raises significant technical, economic, and regulatory questions. This article explores the challenges and possibilities of LEO Direct-to-Cell (D2C) networks, shedding light on whether they can genuinely replace ground-based cellular infrastructure or will remain a complementary technology for specific use cases.

WHY DISTANCE MATTERS.

The distance between you (your cellular device) and the base station’s antenna determines your expected service experience in cellular and wireless networks. In general, the farther you are from the serving base station, the poorer your connection quality and performance will be, all else being equal. As the distance increases, the signal weakening (i.e., path loss) grows rapidly, with the square of the distance in free space, reducing signal quality and making it harder for devices to maintain reliable communication. Closer proximity allows for stronger, faster, and more stable connections, while longer distances require more power and advanced technologies like beamforming or repeaters to compensate.

Physics tells us that a signal loses its strength (or power) with the square of the distance from the source of the signal (either the base station transmitter or the consumer device). This applies universally to all electromagnetic waves traveling in free space. Free space means that there are no obstacles, reflections, or scattering; no terrain features, buildings, or atmospheric conditions interfere with the propagating signal.

So, what matters to the Free Space Path Loss (FSPL)? That is, the signal loss over a given distance in free space:

  • The signal strength reduces (the path loss increases) with the square of the distance (d) from its source.
  • Path loss increases (i.e., signal strength decreases) with the (square of the) frequency (f). The higher the frequency, the higher the path loss at a given distance from the signal source.
  • A larger transmit antenna aperture reduces the path loss by focusing the transmitted signal (energy) more efficiently. An antenna aperture is an antenna’s “effective area” that captures or transmits electromagnetic waves. It is directly proportional to the antenna gain and inversely proportional to the square of the signal frequency (i.e., higher frequency → smaller aperture).
  • Higher receiver gain will also reduce the path loss.

$PL_{FS} \; = \; \left( \frac{4 \pi}{c} \right)^2 (d \; f)^2 \; \propto d^2 \; f^2$

$$FSPL_{dB} \; = 10 \; Log_{10} (PL_{FS}) \; = \; 20 \; Log_{10}(d) \; + \; 20 \; Log_{10}(f) \; + \; constant$$

The above equations show a strong dependency on distance; the farther away, the larger the signal loss, and the higher the frequency, the larger the signal loss. Relaxing some of the assumptions leading to the above relationship leads us to the following:

$FSPL_{dB}^{rs} \; = \; 20 \; Log_{10}(d) \; - \; 10 \; Log_{10}(A_t^{eff}) \; - \; 10 \; Log_{10}(G_{r}) \; + \; constant$

The last of the above equations introduces the transmitter’s effective antenna aperture (\(A_t^{eff}\)) and the receiver’s gain (\(G_r\)), telling us that larger apertures reduce path loss as they focus the transmitted energy more efficiently and that higher receiver gain likewise reduces the path loss (i.e., “they hear better”).

It is worth remembering that the transmitter antenna aperture is directly tied to the transmitter gain ($G_t$) when the frequency (f) has been fixed. We have

$A_t^{eff} \; = \; \frac{c^2}{4\pi} \; \frac{1}{f^2} \; G_t \; = \; 0.000585 \; m^2 \; G_t \;$ @ f = 3.5 GHz.
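As a quick sanity check, a few lines of Python (a minimal sketch; the speed of light and the 3.5 GHz frequency are the only inputs) reproduce the aperture-per-unit-gain factor quoted above:

```python
import math

C = 299_792_458.0  # speed of light in vacuum (m/s)

def aperture_per_unit_gain_m2(f_hz: float) -> float:
    """Effective aperture per unit of antenna gain: A_eff / G = c^2 / (4 * pi * f^2)."""
    return C**2 / (4 * math.pi * f_hz**2)

print(f"{aperture_per_unit_gain_m2(3.5e9):.6f} m^2 per unit gain at 3.5 GHz")
# -> 0.000584 m^2, matching the ~0.000585 m^2 factor above (the small
#    difference comes from rounding c to 3e8 m/s in the hand calculation)
```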

From the above, as an example, it is straightforward to see that the relative path loss difference between the two distances of 550 km (e.g., the typical altitude of an LEO satellite) and 2.5 km (a typical terrestrial cellular coverage range) is

$\frac{PL_{FS}(550 km)}{PL_{FS}(2.5 km)} \; = \; \left( \frac {550}{2.5}\right)^2 \; = \; 220^2 \; \approx \; 50$ thousand. So if all else was equal (it isn’t, btw!), we would expect that the signal loss at a distance of 550 km would be 50 thousand times higher than at 2.5 km. Or, in the electrical engineer’s language, at a distance of 550 km, the loss would be 47 dB higher than at 2.5 km.
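The same arithmetic in a short Python sketch, using the dB form of the FSPL equation from above (the two distances are the only inputs):

```python
import math

def fspl_delta_db(d1_km: float, d2_km: float) -> float:
    """Relative free-space path loss (dB) between two distances at the same frequency."""
    return 20 * math.log10(d1_km / d2_km)

print(f"linear ratio: {(550 / 2.5) ** 2:,.0f}x")           # -> 48,400x, i.e., ~50 thousand
print(f"dB difference: {fspl_delta_db(550, 2.5):.1f} dB")  # -> 46.8 dB, i.e., ~47 dB
```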

The figure illustrates the difference between (a) terrestrial cellular and (b) satellite coverage. A terrestrial cellular signal typically covers a radius of 0.5 to 5 km. In contrast, an LEO satellite signal travels a substantial distance to reach Earth (e.g., a Starlink satellite orbits at an altitude of 550 km). While the terrestrial signal propagates through the many obstacles it meets on its earthly path, the satellite signal’s propagation path would typically be free-space-like (i.e., no obstacles) until it penetrates buildings or other objects to reach consumer devices.

Let’s compare a terrestrial 5G 3.5 GHz advanced antenna system (AAS) 2.5 km from a receiver with a LEO satellite system at an altitude of 550 km. Note that I could have chosen a lower frequency, e.g., 800 MHz or the PCS 1900 band. While it would give me some advantages regarding path loss (i.e., $FSPL \; \propto \; f^2$), the available bandwidth is rather smallish and insufficient for state-of-the-art 5G services (imo!). From a free-space path loss perspective, independently of frequency, we need to overcome an almost 50 thousand times relative difference in distance squared (ca. 47 dB difference) in favor of the terrestrial system. In this comparison, it should be understood that the terrestrial and the satellite systems use the same carrier frequency (otherwise, one should account for the difference in frequency), and the only difference that matters (for the FSPL) is the difference in distance to the receiver.

Suppose I require that my satellite system has the same signal loss, in terms of FSPL, as my terrestrial system in order to aim at a comparable quality-of-service level. In that case, I have several options in terms of satellite enhancements. I could increase transmit power, although that would imply a transmit power 47 dB above the terrestrial system’s, or approximately 48 kW, which is likely impractical for the satellite due to power limitations. Compare this with the current Starlink transmit power of approximately 32 W (45 dBm), ca. 1,500 times lower. Alternatively, I could (in theory!) increase my satellite antenna aperture, leading to a satellite antenna with a diameter of ca. 250 meters, which is enormous compared to current satellite antennas (e.g., Starlink’s ca. 0.05 m² aperture for a single antenna and a total area in the order of 1.6 m² for the Ku/Ka bands). Finally, I could (super theoretically) boost my consumer device’s (e.g., smartphone’s) receive gain by 47 dB from today’s range of -2 dBi to +5 dBi. Achieving around 45 dBi or more of gain in a smartphone receiver seems unrealistic due to size, power, and integration constraints. As the target of LEO satellite direct-to-cell services is to support commercially available cellular devices used in terrestrial networks, only the satellite specifications can be optimized.
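To make the dB bookkeeping behind these three options concrete, here is a minimal sketch. The ~1 W terrestrial reference power and ~1 m² reference aperture are my assumptions, chosen because they reproduce the ~48 kW and ~250 m figures above:

```python
import math

DEFICIT_DB = 46.8  # the ~47 dB distance penalty derived earlier

def db_to_linear(db: float) -> float:
    return 10 ** (db / 10)

# Option 1: brute-force transmit power (assumed ~1 W terrestrial reference)
p_required_w = 1.0 * db_to_linear(DEFICIT_DB)
print(f"required transmit power: {p_required_w / 1e3:.0f} kW")  # -> ~48 kW

# Option 2: antenna aperture (assumed ~1 m^2 reference aperture)
a_required_m2 = 1.0 * db_to_linear(DEFICIT_DB)
d_required_m = math.sqrt(4 * a_required_m2 / math.pi)  # circular dish diameter
print(f"required dish diameter: {d_required_m:.0f} m")  # -> ~250 m

# Option 3: the device receiver would need the same +46.8 dB jump in gain
```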

Based on a simple free-space approach, it appears unreasonable that an LEO satellite communication system can provide 5G services at parity with a terrestrial cellular network to normal (unmodified) 5G consumer devices without satellite-optimized modifications. The satellite system’s requirements for parity with a terrestrial communications system are impractical (but not impossible) and, if pursued, would significantly drive up design complexity and cost, likely making such a system highly uneconomical.

At this point, you should ask yourself whether it is reasonable to treat a terrestrial cellular communication system as if it propagates in a “free-space”-like environment, where obstacles, reflections, and scattering are ignored. Is it really okay to presume that terrain features, buildings, or atmospheric conditions do not interfere with the propagation of the terrestrial cellular signal? Of course, the answer is that it is not okay to assume that. With this in mind, let’s see if it matters much compared to the LEO satellite path loss.

TERRESTRIAL CELLULAR PROPAGATION IS NOT HAPPENING IN FREE SPACE, AND NEITHER IS A SATELLITE’S.

The Free-Space Path Loss (FSPL) formula assumes ideal conditions where signals propagate in free space without interference, blockage, or degradation beyond the loss that naturally results from traveling a given distance. However, as we all experience daily, real-world environments introduce additional factors such as obstructions, multipath effects, clutter loss, and environmental conditions, necessitating corrections to the FSPL approach. Moving from one room of our house to another can easily change the cellular quality we experience (e.g., dropped calls, poorer voice quality, lower speed, falling back from 5G to 4G or even 2G, or no coverage at all). Driving through a city may also result in ups and downs in the cellular quality we experience. Some of these effects are summarized below.

Urban environments typically introduce the highest additional losses due to dense buildings, narrow streets, and urban canyons, which significantly obstruct and scatter signals. For example, the Okumura-Hata Urban Model accounts for such obstructions and adds substantial losses to the FSPL, averaging around 30–50 dB, depending on the density and height of buildings.

Suburban environments, on the other hand, are less obstructed than urban areas but still experience moderate clutter losses from trees, houses, and other features. In these areas, corrections based on the Okumura-Hata Suburban Model add approximately 10–20 dB to the FSPL, reflecting the moderate level of signal attenuation caused by vegetation and scattered structures.

Rural environments have the least obstructions, resulting in the lowest additional loss. Corrections based on the Okumura-Hata Rural Model typically add around 5–10 dB to the FSPL. These areas benefit from open landscapes with minimal obstructions, making them ideal for long-range signal propagation.

Non-line-of-sight (NLOS) conditions additionally increase the path loss, as signals must diffract or scatter to reach the receiver. This effect adds 10–20 dB in suburban and rural areas and 20–40 dB in urban environments, where obstacles are more frequent and severe. Similarly, weather conditions such as rain and foliage contribute to signal attenuation, with rain adding up to 1–5 dB/km at higher frequencies (above 10 GHz) and dense foliage introducing an extra 5–15 dB of loss.

The corrections for these factors can be incorporated into the FSPL formula to provide a more realistic estimation of signal attenuation. By applying these corrections, the FSPL formula can reflect the conditions encountered in terrestrial communication systems across different environments.
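To give a feel for the size of these corrections, here is a compact Python sketch of the classic Okumura-Hata median path loss model (valid roughly for 150–1,500 MHz and 1–20 km); the antenna heights are illustrative assumptions:

```python
import math

def hata_path_loss_db(f_mhz: float, d_km: float, h_base_m: float = 30.0,
                      h_mobile_m: float = 1.5, env: str = "urban") -> float:
    """Okumura-Hata median path loss (dB) for urban, suburban, or rural environments."""
    # Mobile antenna height correction (small/medium city variant)
    a_hm = ((1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m
            - (1.56 * math.log10(f_mhz) - 0.8))
    pl = (69.55 + 26.16 * math.log10(f_mhz) - 13.82 * math.log10(h_base_m) - a_hm
          + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km))
    if env == "suburban":
        pl -= 2 * (math.log10(f_mhz / 28)) ** 2 + 5.4
    elif env == "rural":  # Hata's "open area" correction
        pl -= 4.78 * (math.log10(f_mhz)) ** 2 - 18.33 * math.log10(f_mhz) + 40.94
    return pl

for env in ("urban", "suburban", "rural"):
    print(f"{env:8s}: {hata_path_loss_db(900, 2.5, env=env):.1f} dB at 900 MHz, 2.5 km")
```

Comparing these figures with the pure FSPL at the same distance and frequency (roughly 100 dB) illustrates the tens of dB of environment-dependent excess loss discussed above.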

The figure above illustrates the differences and similarities concerning the coverage environment for (a) terrestrial and (b) satellite communication systems. In most instances, the terrestrial signal loses strength as it propagates through the terrestrial environment due to vegetation, terrain variations, urban topology or infrastructure, and weather; ultimately, as the signal passes from the outdoor to the indoor environment, it is reduced further as it penetrates, for example, coated windows and outer and inner walls. The combination of distance, obstacles, and material penetration leads to a cumulative reduction in signal strength as the signal propagates through the terrestrial environment. For the satellite, as illustrated in (b), a substantial amount of signal strength is lost over the vast distance it has to travel before reaching the consumer. If no outdoor antenna connects with the satellite signal, the satellite signal will be further reduced as it penetrates roofs, multiple ceilings, multiple floors, and walls.

It is often assumed that a satellite system has a line of sight (LoS) to its users, without environmental obstructions in its signal propagation (besides atmospheric ones). The reasoning is not unreasonable, as the satellite sits above the consumers of its services, and it is, of course, a correct approach when the consumer has an outdoor satellite receiver (e.g., a dish) in direct LoS with the satellite. Moreover, historically, most satellite-to-Earth communication has relied on outdoor ground stations or outdoor dishes (e.g., placed on roofs or other suitable locations), where the outdoor antenna on Earth provides LoS to the satellite’s antenna and also compensates somewhat for the signal loss due to the distance to the satellite.

When considering a satellite direct-to-cell device, we no longer have the luxury of a satellite-optimized advanced Earth-based outdoor antenna to facilitate the communications between the satellite and the consumer device. The satellite signal has to close the connection with a standard cellular device (e.g., smartphone, tablet, …), just like the terrestrial cellular network would have to do.

However, 80% or more of our mobile cellular traffic happens indoors, in our homes, workplaces, and public places. If a satellite system had to replace existing mobile network services, it would also have to provide a service quality similar to what consumers get from the terrestrial cellular network. As shown in the above figure, this involves urban areas where the satellite signal will likely pass through a roof and multiple floors before reaching a consumer. Depending on housing density, buildings (shadowing) may block the satellite signal, resulting in substantial service degradation for the affected consumers. Even if the satellite signal does not face the same challenges as a terrestrial cellular signal, such as vegetation, terrain variations, and the horizontal dimension of urban topology (e.g., outer and inner walls, coated windows, …), the satellite signal would still have to overcome the vertical dimension of urban topologies (e.g., roofs, ceilings, floors, etc.) to connect to consumers’ cellular devices.

For terrestrial cellular services, the cellular network’s signal integrity will (always) have a considerable advantage over the satellite signal because of the proximity to the consumer’s cellular device. With respect to distance alone, an LEO satellite at an altitude of 550 km has to overcome a 50 thousand times (or 47 dB) path loss penalty compared to a cellular base station antenna 2.5 km away. Overcoming that penalty places demands on the satellite antenna design that seem highly challenging to meet and far from what is possible with today’s technology (and economics).

CHALLENGES SUMMARIZED.

Achieving parity between a Low Earth Orbit (LEO) satellite providing Direct-to-Cell (D2C) services and a terrestrial 5G network involves overcoming significant technical challenges. The disparity arises from fundamental differences in these systems’ environments, particularly in free-space path loss, penetration loss, and power delivery. Terrestrial networks benefit from closer proximity to the consumer, higher antenna density, and lower propagation losses. In contrast, LEO satellites must address far more significant free-space path losses due to the large distances involved and the additional challenges of transmitting signals through the atmosphere and into buildings.

The D2C challenges for LEO satellites are increasingly severe at higher frequencies, such as 3.5 GHz and above. As we have seen above, the free-space path loss increases with the square of the frequency, and penetration losses through common building materials, such as walls and floors, are significantly higher. For an LEO satellite system to achieve indoor parity with terrestrial 5G services at this frequency, it would need to achieve extraordinary levels of effective isotropic radiated power (EIRP), around 65 dBW, and narrow beamwidths of approximately 0.5° to concentrate power on specific service areas. This would require very high onboard power outputs, exceeding 1 kW, and large antenna apertures, around 2 m in diameter, to achieve gains near 55 dBi. These requirements place considerable demands on satellite design, increasing mass, complexity, and cost. Despite these optimizations, indoor service parity at 3.5 GHz remains challenging due to persistent penetration losses of around 20 dB, making this frequency better suited for outdoor or line-of-sight applications.

Achieving a stable beam with the small widths required for a LEO satellite to provide high-performance Direct-to-Cell (D2C) services presents significant challenges. Narrow beam widths, on the order of 0.5° to 1°, are essential to effectively focus the satellite’s power and overcome the high free-space path loss. However, maintaining such precise beams demands advanced satellite antenna technologies, such as high-gain phased arrays or large deployable apertures, which introduce design, manufacturing, and deployment complexities. Moreover, the satellite must continuously track rapidly moving targets on Earth as it orbits at around 7.8 km/s. This requires highly accurate and fast beam-steering systems, often using phased arrays with electronic beamforming, to compensate for the relative motion between the satellite and the consumer. Any misalignment in the beam can result in significant signal degradation or complete loss of service. Additionally, ensuring stable beams under variable conditions, such as atmospheric distortion, satellite vibrations, and thermal expansion in space, adds further layers of technical complexity. These requirements increase the system’s power consumption and cost and impose stringent constraints on satellite design, making it a critical challenge to achieve reliable and efficient D2C connectivity.

As the operating frequency decreases, the specifications for achieving parity become less stringent. At 1.8 GHz, the free-space path loss and penetration losses are lower, reducing the signal deficit. For a LEO satellite operating at this frequency, a 2.5 m² aperture (1.8 m diameter) antenna and an onboard power output of around 800 W would suffice to deliver an EIRP near 60 dBW, bringing outdoor performance close to terrestrial equivalency. Indoor parity, while more achievable than at 3.5 GHz, would still face challenges due to penetration losses of approximately 15 dB. However, the balance between the reduced propagation losses and achievable satellite optimizations makes 1.8 GHz a more practical compromise for mixed indoor and outdoor coverage.

At 800 MHz, the frequency-dependent losses are significantly reduced, making it the most feasible option for LEO satellite systems to achieve parity with terrestrial 5G networks. The free-space path loss decreases further, and penetration losses into buildings are reduced to approximately 10 dB, comparable to what terrestrial systems experience. These characteristics mean that the required specifications for the satellite system are notably relaxed. A 1.5 m² aperture (1.4 m diameter) antenna, combined with a power output of 400 W, would achieve sufficient gain and EIRP (~55 dBW) to deliver robust outdoor coverage and acceptable indoor service quality. Lower frequencies also mitigate the need for extreme beamwidth narrowing, allowing for more flexible service deployment.
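Back-of-the-envelope, the quoted figures combine as EIRP (dBW) = transmit power (dBW) + antenna gain (dBi). A small sketch, taking the 1.8 GHz and 800 MHz power and EIRP targets above as given and solving for the implied antenna gain:

```python
import math

def implied_gain_dbi(eirp_dbw: float, p_watts: float) -> float:
    """Antenna gain implied by an EIRP target and an onboard power budget."""
    return eirp_dbw - 10 * math.log10(p_watts)

# (band label, EIRP target in dBW, onboard power in W) from the paragraphs above
for band, eirp, power in [("1.8 GHz", 60, 800), ("800 MHz", 55, 400)]:
    print(f"{band}: ~{implied_gain_dbi(eirp, power):.0f} dBi of antenna gain needed")
```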

Most consumers’ cellular consumption happens indoors. Compared with an LEO satellite solution, these consumers are typically better served by existing 5G cellular broadband networks. When considering direct service to a normal cellular device, it would not be practical for an LEO satellite network, even an extensive one, to replace existing 5G terrestrial cellular networks and the services they support today.

This does not mean that LEO satellites cannot be of great utility when connecting to an outdoor Earth-based consumer dish, as is already evident in many remote, rural, and suburban places. The summary table above also shows that LEO satellite D2C services are feasible, without overly challenging modifications, in the lower cellular frequency ranges between 600 MHz and 1,800 MHz, at service levels close to terrestrial systems, at least in rural areas and for outdoor services in general. In indoor situations, the LEO satellite D2C signal is more likely to be compromised due to roof and multiple-floor penetration scenarios to which a terrestrial signal may be less exposed.

WHAT GOES DOWN MUST COME UP.

LEO satellite services that connect directly to unmodified mobile cellular devices get us all too focused on the downlink path from the satellite to the device. It seems easy to forget that unless you deliver a broadcast service, we also need the unmodified cellular device to communicate meaningfully back to the LEO satellite. The challenge for an unmodified cellular device (e.g., smartphone, tablet, etc.) to receive the satellite D2C signal has been explained extensively in the previous section. In the satellite downlink-to-device scenario, we can optimize the design specifications of the LEO satellite to overcome some (or most, depending on the frequency) of the challenges posed by the satellite’s high altitude (compared to a terrestrial base station’s distance to the consumer device). In the device direct-uplink-to-satellite direction, we have very little to no flexibility unless we start changing the specifications of the terrestrial device portfolio. Suppose we change the specifications for consumer devices to communicate better with satellites. In that case, we also change the premise and economics of the (wrong) idea that LEO satellites should be able to completely replace terrestrial cellular networks at service parity.

Achieving uplink communication from a standard cellular device to an LEO satellite poses significant challenges, especially when attempting to match the performance of a terrestrial 5G network. Cellular devices are designed with limited transmission power, typically in the range of 23–30 dBm (0.2–1 watt), sufficient for short-range communication with terrestrial base stations. However, when the receiving station is a satellite orbiting between 550 and 1,200 kilometers, the transmitted signal encounters substantial free-space path loss. The satellite must, therefore, be capable of detecting and processing extremely weak signals, often below -120 dBm, to maintain a reliable connection.

The free-space path loss in the uplink direction is comparable to that in the downlink, but the challenges are compounded by the cellular device’s limitations. At higher frequencies, such as 3.5 GHz, path loss can exceed 155 dB, while at 1.8 GHz and 800 MHz, it reduces to approximately 149.6 dB and 143.6 dB, respectively. Lower frequencies favor uplink communication because they experience less path loss, enabling better signal propagation over large distances. However, cellular devices typically use omnidirectional antennas with very low gain (0–2 dBi), poorly suited for long-distance communication, placing even greater demands on the satellite’s receiving capabilities.
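For orientation, the standard FSPL formula in engineering units is FSPL(dB) = 20 Log10(d in km) + 20 Log10(f in GHz) + 92.45. A minimal sketch, assuming a 550 km slant range (the exact figures shift by a few dB with the altitude and elevation angle assumed):

```python
import math

def fspl_db(d_km: float, f_ghz: float) -> float:
    """Free-space path loss in dB, with distance in km and frequency in GHz."""
    return 20 * math.log10(d_km) + 20 * math.log10(f_ghz) + 92.45

for f in (3.5, 1.8, 0.8):
    print(f"{f} GHz at 550 km: {fspl_db(550, f):.1f} dB")
# -> roughly 158 dB, 152 dB, and 145 dB, respectively
```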

The satellite must compensate for these limitations with highly sensitive receivers and high-gain antennas. Achieving sufficient antenna gain requires large apertures, often exceeding 4 meters in diameter for 800 MHz or 2 meters for 3.5 GHz, increasing the satellite’s size, weight, and complexity. Phased-array antennas or deployable reflectors are often used to achieve the required gain. Still, their implementation is constrained by the physical limitations and costs of launching such systems into orbit. Additionally, the satellite’s receiver must have an exceptionally low noise figure, typically in the range of 1–3 dB, to minimize internal noise and allow the detection of weak uplink signals.

Interference is another critical challenge in the uplink path. Unlike terrestrial networks, where signals from individual devices are isolated into small sectors, satellites receive signals over larger geographic areas. This broad coverage makes it difficult to separate and process individual transmissions, particularly in densely populated areas where numerous devices transmit simultaneously. Managing this interference requires sophisticated signal processing capabilities on the satellite, increasing its complexity and power demands.

The motion of LEO satellites introduces additional complications due to the Doppler effect, which causes a shift in the uplink signal frequency. At higher frequencies like 3.5 GHz, these shifts are more pronounced, requiring real-time adjustments to the receiver to compensate. This dynamic frequency management adds another layer of complexity to the satellite’s design and operation.
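A rough feel for the magnitudes, in Python; the ~7 km/s worst-case radial velocity is an assumption for low elevation angles (the satellite’s ~7.8 km/s orbital speed is never directed purely at the consumer):

```python
C = 299_792_458.0  # speed of light (m/s)

def max_doppler_hz(f_carrier_hz: float, v_radial_ms: float = 7_000.0) -> float:
    """First-order Doppler shift: f_d = (v_radial / c) * f_carrier."""
    return v_radial_ms / C * f_carrier_hz

for f_hz in (3.5e9, 1.8e9, 0.8e9):
    print(f"{f_hz / 1e9} GHz: up to +/-{max_doppler_hz(f_hz) / 1e3:.0f} kHz")
# -> roughly +/-82 kHz, +/-42 kHz, and +/-19 kHz, respectively
```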

Among the frequencies considered, 3.5 GHz is the most challenging for uplink communication due to high path loss, pronounced Doppler effects, and poor building penetration. Satellites operating at this frequency must achieve extraordinary sensitivity and gain, which is difficult to implement at scale. At 1.8 GHz, the challenges are somewhat reduced as the path loss and Doppler effects are less severe. However, the uplink requires advanced receiver sensitivity and high-gain antennas to approach terrestrial network performance. The most favorable scenario is at 800 MHz, where the lower path loss and better penetration characteristics make uplink communication significantly more feasible. Satellites operating at this frequency require less extreme sensitivity and gain, making it a practical choice for achieving parity with terrestrial 5G networks, especially for outdoor and light indoor coverage.

Uplink, the signal direction from the consumer device to the satellite, poses additional limitations on the usable frequency range. Such systems may be interesting from 600 MHz up to a maximum of 1.8 GHz, a range that is already challenging for uplink and downlink in indoor usage. Service in the lower cellular frequency range is feasible for outdoor usage scenarios in rural and remote areas and for non-challenging indoor environments (e.g., “simple” building topologies).

The premise that LEO satellite D2C services would make terrestrial cellular networks redundant everywhere by offering service parity appears very unlikely, and certainly not with the current generation of LEO satellites being launched. The altitude range of the LEO satellites (300 – 1200 km) and frequency ranges used for most terrestrial cellular services (600 MHz to 5 GHz) make it very challenging and even impractical (for higher cellular frequency ranges) to achieve quality and capacity parity with existing terrestrial cellular networks.

LEO SATELLITE D2C ARCHITECTURE.

A subscriber would realize they have LEO satellite Direct-to-Cell coverage through network signaling and notifications provided by their mobile device and network operator. Using this coverage depends on the integration between the LEO satellite system and the terrestrial cellular network, as well as the subscriber’s device and network settings. Here’s how this process typically works:

When a subscriber moves into an area where traditional terrestrial coverage is unavailable or weak, their mobile device will periodically search for available networks, as it does when trying to maintain connectivity. If the device detects a signal from a LEO satellite providing D2C services, it may indicate “Satellite Coverage” or a similar notification on the device’s screen.

This recognition is possible because the LEO satellite extends the subscriber’s mobile network. The satellite broadcasts system information on the same frequency bands licensed to the subscriber’s terrestrial network operator. The device identifies the network using the Public Land Mobile Network (PLMN) ID, which matches the subscriber’s home network or a partner network in a roaming scenario. The PLMN ID is a fundamental component of both terrestrial and LEO satellite D2C networks; it is the identifier that links a mobile consumer to a specific mobile network operator. It enables communication, access rights management, and network interoperability, and it supports services such as voice, text, and data.

The PLMN is also directly connected to the frequency bands used by an operator and any satellite service provider, acting as an extension of the operator’s network. It ensures that devices access the appropriately licensed bands through terrestrial or satellite systems and governs spectrum usage to maintain compliance with regulatory frameworks. Thus, the PLMN links the network identification and frequency allocation, ensuring seamless and lawful operation in terrestrial and satellite contexts.

In an LEO satellite D2C network, the PLMN plays a similar but more complex role, as it must bridge the satellite system with terrestrial mobile networks. The satellite effectively operates as an extension of the terrestrial PLMN, using the same Mobile Country Code (MCC) and Mobile Network Code (MNC) as the consumer’s home network or a roaming partner. This ensures that consumer devices perceive the satellite network as part of their existing subscription, avoiding the need for additional configuration or specialized hardware. When the satellite provides coverage, the PLMN enables the device to authenticate and access services through the operator’s core network, ensuring consistency with terrestrial operations. Consumer authentication, billing, and service provisioning thereby remain consistent across the terrestrial and satellite domains. In cases where multiple terrestrial operators share access to a satellite system, the PLMN facilitates the correct routing of consumer sessions to their respective home networks. This coordination is particularly important in roaming scenarios, where a consumer connected to a satellite in one region may need to access services through their home network located in another region.
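For illustration, the PLMN ID is simply the concatenation of the MCC and the MNC. A minimal sketch; the example ID 310-260 is T-Mobile USA’s well-known PLMN, used here purely for illustration:

```python
def parse_plmn(plmn_id: str, mnc_digits: int = 3) -> tuple[str, str]:
    """Split a PLMN ID into its MCC (always 3 digits) and MNC (2 or 3 digits)."""
    return plmn_id[:3], plmn_id[3:3 + mnc_digits]

mcc, mnc = parse_plmn("310260")
print(f"MCC={mcc} (USA), MNC={mnc} (T-Mobile USA)")
# A D2C satellite broadcasting this PLMN ID appears to the handset
# as its home network, so no new settings or SIM are needed.
```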

For a subscriber to make use of LEO satellite coverage, the following conditions must be met:

  • Device Compatibility: The subscriber’s mobile device must support satellite connectivity. While many standard devices are compatible with satellite D2C services using terrestrial frequencies, certain features may be required, such as enhanced signal processing or firmware updates. Modern smartphones are increasingly being designed to support these capabilities.
  • Network Integration: The LEO satellite must be integrated with the subscriber’s mobile operator’s core network. This ensures the satellite extends the terrestrial network, maintaining seamless authentication, billing, and service delivery. Consumers can make and receive calls, send texts, or access data services through the satellite link without changing their settings or SIM card.
  • Service Availability: The type of services available over the satellite link depends on the network and satellite capabilities. Initially, services may be limited to text messaging and voice calls, as these require less bandwidth and are easier to support in shared satellite coverage zones. High-speed data services, while possible, may require further advancements in satellite capacity and network integration.
  • Subscription or Permissions: Subscribers must have access to satellite services through their mobile plan. This could be included in their existing plan or offered as an add-on service. In some cases, roaming agreements between the subscriber’s home network and the satellite operator may apply.
  • Emergency Use: In specific scenarios, satellite connectivity may be automatically enabled for emergencies, such as SOS messages, even if the subscriber does not actively use the service for regular communication. This is particularly useful in remote or disaster-affected areas with unavailable terrestrial networks.

Once connected to the satellite, the consumer experience is designed to be seamless. The subscriber can initiate calls, send messages, or access other supported services just as they would under terrestrial coverage. The main differences may include longer latency due to the satellite link and, potentially, lower data speeds or limitations on high-bandwidth activities, depending on the satellite network’s capacity and the number of consumers sharing the satellite beam.

Managing a call on a Direct-to-Cell (D2C) satellite network requires specific mobile network elements in the core network, alongside seamless integration between the satellite provider and the subscriber’s terrestrial network provider. The service’s success depends on how well the satellite system integrates into the terrestrial operator’s architecture, ensuring that standard cellular functions like authentication, session management, and billing are preserved.

In a 5G network, the core network plays a central role in managing calls and data sessions. For a D2C satellite service, key components of the operator’s core network include the Access and Mobility Management Function (AMF), which handles consumer authentication and signaling. The AMF establishes and maintains connectivity for subscribers connecting via the satellite. Additionally, the Session Management Function (SMF) oversees the session context for data services. It ensures compatibility with the IP Multimedia Subsystem (IMS), which manages call control, routing, and handoffs for voice-over-IP communications. The Unified Data Management (UDM) system, another critical core component, stores subscriber profiles, detailing permissions for satellite use, roaming policies, and Quality of Service (QoS) settings.

To enforce network policies and billing, the Policy Control Function (PCF) applies service-level agreements and ensures appropriate charges for satellite usage. For data routing, elements such as the User Plane Function (UPF) direct traffic between the satellite ground stations and the operator’s core network. Additionally, interconnect gateways manage traffic beyond the operator’s network, such as the Internet or another carrier’s network.

The role of the satellite provider in this architecture depends on the integration model. If the satellite system is fully integrated with the terrestrial operator, the satellite primarily acts as an extension of the operator’s radio access network (RAN). In this case, the satellite provider requires ground stations to downlink traffic from the satellites and forward it to the operator’s core network via secure, high-speed connections. The satellite provider handles radio gateway functionality, translating satellite-specific protocols into formats compatible with terrestrial systems. In this scenario, the satellite provider does not need its own core network because the operator’s core handles all call processing, authentication, billing, and session management.

In a standalone model, where the LEO satellite provider operates independently, the satellite system must include its own complete core network. This requires implementing AMF, SMF, UDM, IMS, and UPF, allowing the satellite provider to directly manage subscriber sessions and calls. In this case, interconnect agreements with terrestrial operators would be needed to enable roaming and off-network communication.

Most current D2C solutions, including those proposed by Starlink with T-Mobile or AST SpaceMobile, follow the integrated model. In these cases, the satellite provider relies on the terrestrial operator’s core network, reducing complexity and leveraging existing subscriber management systems. The LEO satellites are primarily responsible for providing RAN functionality and ensuring reliable connectivity to the terrestrial core.

REGULATORY CHALLENGES.

LEO satellite networks offering Direct-to-Cell (D2C) services face substantial regulatory challenges in their efforts to operate within frequency bands already allocated to terrestrial cellular services. These challenges are particularly significant in regions like Europe and the United States, where cellular frequency ranges are tightly regulated and managed by national and regional authorities to ensure interference-free operations and equitable access among service providers.

The cellular frequency spectrum in Europe and the USA is allocated through licensing frameworks that grant exclusive usage rights to mobile network operators (MNOs) for specific frequency bands, often through competitive auctions. For example, in the United States, the Federal Communications Commission (FCC) regulates spectrum usage, while in Europe, national regulatory authorities manage spectrum allocations under the guidelines set by the European Union and CEPT (European Conference of Postal and Telecommunications Administrations). The spectrum currently allocated for cellular services, including low-band (e.g., 600 MHz, 800 MHz), mid-band (e.g., 1.8 GHz, 2.1 GHz), and high-band (e.g., 3.5 GHz), is heavily utilized by terrestrial operators for 4G LTE and 5G networks.

In March 2024, the Federal Communications Commission (FCC) adopted a groundbreaking regulatory framework to facilitate collaborations between satellite operators and terrestrial mobile service providers. This initiative, termed “Supplemental Coverage from Space,” allows satellite operators to use the terrestrial mobile spectrum to offer connectivity directly to consumer handsets and is an essential component of FCC’s “Single Network Future.” The framework aims to enhance coverage, especially in remote and underserved areas, by integrating satellite and terrestrial networks. The FCC granted SpaceX (November 2024) approval to provide direct-to-cell services via its Starlink satellites. This authorization enables SpaceX to partner with mobile carriers, such as T-Mobile, to extend mobile coverage using satellite technology. The approval includes specific conditions to prevent interference with existing services and to ensure compliance with established regulations. Notably, the FCC also granted SpaceX’s request to provide service to cell phones outside the United States. For non-US operations, Starlink must obtain authorization from the relevant governments. Non-US operations are authorized in various sub-bands between 1429 MHz and 2690 MHz.

In Europe, the regulatory framework for D2C services is under active development. The European Conference of Postal and Telecommunications Administrations (CEPT) is exploring the regulatory and technical aspects of satellite-based D2C communications. This includes understanding connectivity requirements and addressing national licensing issues to facilitate the integration of satellite services with existing mobile networks. Additionally, the European Space Agency (ESA) has initiated feasibility studies on Direct-to-Cell connectivity, collaborating with industry partners to assess the potential and challenges of implementing such services across Europe. These studies aim to inform future regulatory decisions and promote innovation in satellite communications.

For LEO satellite operators to offer D2C services in these regulated bands, they would need to reach agreements with the licensed MNOs with the rights to these frequencies. This could take the form of spectrum-sharing agreements or leasing arrangements, wherein the satellite operator obtains permission to use the spectrum for specific purposes, often under strict conditions to avoid interference with terrestrial networks. For example, SpaceX’s collaboration with T-Mobile in the USA involves utilizing T-Mobile’s existing mid-band spectrum (i.e., PCS1900) under a partnership model, enabling satellite-based connectivity without requiring additional spectrum licensing.

In Europe, the situation is more complex due to the fragmented nature of the regulatory environment. Each country manages its spectrum independently, meaning LEO operators must negotiate agreements with individual national MNOs and regulators. This creates significant administrative and logistical hurdles, as the operator must align with diverse licensing conditions, technical requirements, and interference mitigation measures across multiple jurisdictions. Furthermore, any satellite use of the terrestrial spectrum in Europe must comply with European Union directives and ITU (International Telecommunication Union) regulations, prioritizing terrestrial services in these bands.

Interference management is a critical regulatory concern. LEO satellites operating in the same frequency bands as terrestrial networks must implement sophisticated coordination mechanisms to ensure their signals do not disrupt terrestrial operations. This includes dynamic spectrum management, geographic beam shaping, and power control techniques to minimize interference in densely populated areas where terrestrial networks are most active. Regulators in the USA and Europe will likely require detailed technical demonstrations and compliance testing before approving such operations.

Another significant challenge is ensuring equitable access to spectrum resources. MNOs have invested heavily in acquiring and deploying their licensed spectrum, and many may view satellite D2C services as a competitive threat. Regulators would need to establish clear frameworks to balance the rights of terrestrial operators with the potential societal benefits of extending connectivity through satellites, particularly in underserved rural or remote areas.

Beyond regulatory hurdles, LEO satellite operators must collaborate extensively with MNOs to integrate their services effectively. This includes interoperability agreements to ensure seamless handoffs between terrestrial and satellite networks and the development of business models that align incentives for both parties.

TAKEAWAYS.

Direct-to-Cell LEO satellite networks face considerable technology hurdles in providing services comparable to terrestrial cellular networks.

  • Overcoming free-space path loss and ensuring uplink connectivity from low-power mobile devices with omnidirectional antennas.
  • Cellular devices transmit at low power (typically 23–30 dBm), making it difficult for uplink signals to reach satellites in LEO at 500–1,200 km altitudes.
  • Uplink signals from multiple devices within a satellite beam area can overlap, creating interference that challenges the satellite’s ability to separate and process individual uplink signals.
  • Developing advanced phased-array antennas for satellites, dynamic beam management, and low-latency signal processing to maintain service quality.
  • Managing mobility challenges, including seamless handovers between satellites and beams and mitigating Doppler effects due to the high relative velocity of LEO satellites.
  • The high relative velocity of LEO satellites introduces frequency shifts (i.e., Doppler Effect) that the satellite must compensate for dynamically to maintain signal integrity.
  • Addressing bandwidth limitations and efficiently reusing spectrum while minimizing interference with terrestrial and other satellite networks.
  • Scaling globally may require satellites to carry varied payload configurations to accommodate regional spectrum requirements, increasing technical complexity and deployment expenses.
  • Operating on terrestrial frequencies necessitates dynamic spectrum sharing and interference mitigation strategies, especially in densely populated areas, limiting coverage efficiency and capacity.
  • The need for frequent replacement of LEO satellites, due to their shorter lifespans, increases operational complexity and cost.

On the regulatory front, integrating D2C satellite services into existing mobile ecosystems is complex. Spectrum licensing is a key issue, as satellite operators must either share frequencies already allocated to terrestrial mobile operators or secure dedicated satellite spectrum.

  • Securing access to shared or dedicated spectrum, particularly negotiating with terrestrial operators to use licensed frequencies.
  • Avoiding interference between satellite and terrestrial networks requires detailed agreements and advanced spectrum management techniques.
  • Navigating fragmented regulatory frameworks in Europe, where national licensing requirements vary significantly.
  • Spectrum Fragmentation: With frequency allocations varying significantly across countries and regions, scaling globally requires navigating diverse and complex spectrum licensing agreements, slowing deployment and increasing administrative costs.
  • Complying with evolving international regulations, including those to be defined at the ITU’s WRC-27 conference.
  • Developing clear standards and agreements for roaming and service integration between satellite operators and terrestrial mobile network providers.
  • The high administrative and operational burden of scaling globally diminishes economic benefits, particularly in regions where terrestrial networks already dominate.
  • While satellites excel in rural or remote areas, they might not meet high traffic demands in urban areas, restricting their ability to scale as a comprehensive alternative to terrestrial networks.

The idea of D2C satellite networks making terrestrial cellular networks obsolete is ambitious but fraught with practical limitations. While LEO satellites offer unparalleled reach in remote and underserved areas, they struggle to match terrestrial networks’ capacity, reliability, and low latency in urban and suburban environments. The high density of base stations in terrestrial networks enables them to handle far greater traffic volumes, especially for data-intensive applications.

  • Coverage advantage: Satellites provide global reach, particularly in remote or underserved regions, where terrestrial networks are cost-prohibitive and often of poor quality or altogether lacking.
  • Capacity limitations: Satellites struggle to match the high-density traffic capacity of terrestrial networks, especially in urban areas.
  • Latency challenges: Satellite latency, though improving, cannot yet compete with the ultra-low latency of terrestrial 5G for time-critical applications.
  • Cost concerns: Deploying and maintaining satellite constellations is expensive, and they still depend on terrestrial core infrastructure (although the savings if all terrestrial RAN infrastructure could be avoided is also very substantial).
  • Complementary role: D2C networks are better suited as an extension to terrestrial networks, filling coverage gaps rather than replacing them entirely.

The regulatory and operational constraints surrounding the use of terrestrial mobile frequencies for D2C services severely limit scalability. This fragmentation makes it difficult to achieve global coverage seamlessly and increases operational and economic inefficiencies. While D2C services hold promise for addressing connectivity gaps in remote areas, their ability to scale as a comprehensive alternative to terrestrial networks is hampered by these challenges. Unless global regulatory harmonization or innovative technical solutions emerge, D2C networks will likely remain a complementary, sub-scale solution rather than a standalone replacement for terrestrial mobile networks.

FURTHER READING.

  1. Kim K. Larsen, “The Next Frontier: LEO Satellites for Internet Services.” Techneconomyblog, (March 2024).
  2. Kim K. Larsen, “Stratospheric Drones & Low Earth Satellites: Revolutionizing Terrestrial Rural Broadband from the Skies?” Techneconomyblog, (January 2024).
  3. Kim K. Larsen, “A Single Network Future,” Techneconomyblog, (March 2024).
  4. T.S. Rappaport, “Wireless Communications – Principles & Practice,” Prentice Hall (1996). In my opinion, it is one of the best graduate textbooks on communications systems. I bought it back in 1999 as a regular hardcover. I have not found it as a Kindle version, but I believe there are sites where a PDF version may be available (e.g., Scribd).

ACKNOWLEDGEMENT.

I greatly acknowledge my wife, Eva Varadi, for her support, patience, and understanding during the creative process of writing this article.

Spectrum in the USA – An Overview of Today and a New Tomorrow.

This week (Week 17, 2023), I submitted my comments and advice titled “Development of a National Spectrum Strategy (NSS)” to the United States National Telecommunications & Information Administration (NTIA) related to their work on a new National Spectrum Strategy.

Of course, one might ask why, as a European, I bother with the spectrum policy of the United States. So here is a bit of reasoning for why I bother with this super interesting and challenging topic of spectrum policy on the other side of the pond.

A EUROPEAN IN AMERICA.

As a European coming to America (i.e., USA) for the first time to discuss the electromagnetic spectrum of the kind mobile operators love to have exclusive access to, you quickly realize that Europe’s spectrum policy/policies, whether you like them or not, are easier to work with and understand. Regarding spectrum policy, whatever you know from Europe is not likely to be the same in the USA (though physics is still fairly similar).

I was very fortunate to arrive back in the early years of the third millennium to discuss cellular capacity, which quickly evolved (“escalated”) into discussions of the available cellular frequencies, the associated spectral bandwidth, and whether they really needed those 100 million US dollars for radio access expansions.

Why fortunate?

I was one of the first (from my company) to ask all those “stupid” questions instead of simply assuming that things surely must be the same as in Europe, and I ended up with the correct answer: in the USA, things are a “little” different and a lot more complicated in terms of the availability of frequencies and what feeds the demand … the spectrum bandwidth. My arrival was followed by “hordes” of other well-meaning Europeans with the same questions and presumptions, using European logic to solve US challenges. And that doesn’t really work (surprised you should not be). I believe my T-Mobile US colleagues and friends over the years must have felt like it was Groundhog Day all over again at every new European visit.

COMPARING APPLES AND ORANGES.

Looking at US spectrum reporting, it is important to note that it is customary to state the total amount of spectrum. Thus, for FDD spectrum bands, the reported number includes both the downlink and the uplink portions of the cellular frequency band in question. For example, when a mobile network operator (MNO) reports that it has, e.g., 40 MHz of AWS1 spectrum in San Diego (California), it means that it has 2×20 MHz (or 20+20 MHz): 20 MHz for downlink (DL) services and 20 MHz for uplink (UL) services. For FDD, both the DL and the UL parts are counted. In Europe, historically, we would mainly talk about half the spectrum for FDD spectrum bands. This is one of the first hurdles to get over in meetings and discussions; if not sorted out early, it can lead to some pretty big misunderstandings (to say the least). To be honest, and in my opinion, providing the full spectrum holding, irrespective of whether a band is used as FDD or TDD, is less ambiguous than the European tradition.
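A trivial helper makes the two reporting conventions explicit, using the 40 MHz AWS1 example from above:

```python
def us_style_mhz(per_direction_mhz: float) -> float:
    """US convention for FDD bands: report downlink + uplink together."""
    return 2 * per_direction_mhz

def eu_style_mhz(us_total_mhz: float) -> float:
    """Traditional European convention: quote one direction only (half the total)."""
    return us_total_mhz / 2

print(us_style_mhz(20))  # 2x20 MHz -> reported as 40 MHz in the USA
print(eu_style_mhz(40))  # 40 MHz US-style -> traditionally quoted as 20 MHz in Europe
```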

The second “hurdle” is understanding that a US-based MNO is likely to have substantial variation in its spectrum holdings across the US geography. An MNO may have 40 MHz (i.e., 2×20 MHz) of PCS spectrum in Los Angeles (California), only 30 MHz (2×15 MHz) of the same spectrum in New York, and only 20 MHz (2×10 MHz) in Miami (Florida). For example, the FCC (i.e., the regulator managing non-federal spectrum) uses 734 so-called Cellular Market Areas, or CMAs, and there is no guarantee that a mobile operator’s spectrum position will remain the same across these 734 CMAs. Imagine Dutch (or other European) mobile operators having a varying 700 MHz (used for 5G) spectrum position across the 342 municipalities of The Netherlands (or another European country). It takes a lot of imagination … right? And maybe that is why we Europeans shake our heads at the US spectrum fragmentation, or market variation, as opposed to our nice, neat, and tidy market-wide spectrum uniformity. But is the European model so much better (apart from being neat & tidy)? …

… One may argue that the US model allows for spectrum acquisition to be more closely aligned with demand, e.g., less spectrum is needed in low-population density areas and more is required in high-density population areas (where demand will be much more intense). As evidenced by many US auctions, the economics matched the demand fairly well. While the European model is closely aligned with our good traditions of being solid on average … with our feet in the oven and our head in the freezer … and on average all is pretty much okay in Europe.

Figures 1 and 2 below illustrate how a mobile operator’s spectrum bandwidth varies across the 734 US-defined CMAs in the AWS1 band and how that would look in Europe.

Figure 1 illustrates (left chart) the average MNO distribution of the USA AWS1 band (band 4) over the 734 Cellular Market Areas (CMA) defined by the FCC and (right chart) a typical European 3-MNO 2100-band (band 1) distribution across a country’s geographical area. As a rule of thumb for European countries, the spectrum is fairly uniformly distributed across the national MNOs. E.g., if you have 3 mobile operators, the 120 MHz available in band 1 will be divided equally among the 3, and if there are 4 MNOs, then it will be divided by 4. Either way, in Europe, an MNO’s spectrum position is fixed across the geography.

Figure 2 below is visually an even stronger illustration of mobile operator bandwidth variation across the 734 cellular market areas. The dashed white horizontal line shows what it would look like if the PCS band (a total of 120 MHz, or 2×60 MHz) were shared equally among the 4 main nationwide mobile operators, ending up at 30 MHz per operator across all CMAs. This would resemble what today is more or less the European situation, i.e., irrespective of regional population numbers, a mobile operator’s spectrum bandwidth at a given carrier frequency would be the same. The European model, of course, also implies that an operator can provide the same peak quality everywhere before load becomes an issue. The high variation in a US operator’s spectrum bandwidth may result in a relatively big variation in provided quality (i.e., peak speed in Mbps) across the different CMAs.

There is an alternative approach to spectrum acquisition, which the US model is much more suitable for, that may also be more spectrally efficient: aim at a target amount of Hz per customer (i.e., spectral overhead) and keep this constant across the various markets. Of course, there is a maximum realistic amount of bandwidth to acquire, governed by availability (e.g., for PCS, 120 MHz) and competing bidders’ strength. There will also be a minimum bandwidth level determined by the auction rules (e.g., 5 MHz) and a minimum acceptable quality level (e.g., 10 MHz). However, Figure 2 below reflects more opportunistic spectrum acquisition in CMAs with fewer than a million population, as opposed to a more intelligent design (possibly reflecting the importance, or lack thereof, of different CMAs to the individual operators). A minimal sketch of this planning rule follows Figure 2.

Figure 2 illustrates the bandwidth variation (orange dots) across the 734 cellular market areas for 4 nationwide mobile network operators in the United States. The horizontal dashed white line shows the case where the four main nationwide operators share the 120 MHz of PCS spectrum equally (fairly similar to the European situation): every MNO would have the same spectral bandwidth across every CMA. The Minimum – Growing – Maximum dashed line illustrates a different spectrum acquisition strategy, where the operator has fixed the amount of spectrum required per customer and keeps this as a planning rule between a minimum level (e.g., a unit of minimum auctioned bandwidth) and a realistic maximum level (e.g., determined by auction competition, auction rules, and availability).
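Here is a minimal sketch of that planning rule, assuming purely illustrative values for the Hz-per-customer target and the minimum and maximum bounds:

```python
def target_bandwidth_mhz(customers: int,
                         hz_per_customer: float = 30.0,     # illustrative target
                         min_mhz: float = 5.0,              # minimum auctioned lot
                         max_mhz: float = 120.0) -> float:  # e.g., total PCS availability
    """Hz-per-customer planning rule: size the spectrum holding to the expected
    customer base, clamped between the minimum auctionable bandwidth and the
    realistically acquirable maximum (availability, competition, auction rules)."""
    needed_mhz = customers * hz_per_customer / 1e6  # Hz -> MHz
    return max(min_mhz, min(max_mhz, needed_mhz))

# Illustrative CMAs of very different sizes:
for cma, customers in [("small rural CMA", 120_000),
                       ("mid-size CMA", 1_500_000),
                       ("large metro CMA", 6_000_000)]:
    print(f"{cma}: {target_bandwidth_mhz(customers):.0f} MHz")
# small rural CMA: 5 MHz (floor), mid-size CMA: 45 MHz, large metro CMA: 120 MHz (cap)
```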

Thirdly, so-called exclusive-use frequency licenses (as opposed to shared frequencies), as issued by the FCC, can be regarded, accounting-wise, as indefinite-lived intangible assets. Thus, once a US-based cellular mobile operator has acquired a given exclusive-use license, that license can be considered at the operator’s disposal in perpetuity. It should be noted that FCC licenses are typically issued for a fixed (limited) period, but renewals are routine.

This is a (really) big difference from European cellular frequency licenses that typically expire after 10 – 20 years, with the expired frequency bands being re-auctioned. A European mobile operator cannot guarantee its operation beyond the expiration date of the spectrum acquired, posing substantial existential threats to business and shareholder value. In the USA, cellular mobile operators have a substantially lower risk regarding business continuity as their spectrum, in general, can be regarded as theirs indefinitely.

The FCC also operates a shared-spectrum license model, as envisioned by the Citizens Broadband Radio Service (CBRS) in the 3.55 to 3.7 GHz frequency range (i.e., the C-band). A shared-spectrum license model allows several types of users (e.g., Federal and non-Federal) and use cases (e.g., satellite communications, radar applications, national cellular services, local community broadband services, etc.) to co-exist within the same spectrum band. Usually, such shared licenses come with firm protection of federal (incumbent) users: commercial use can co-exist with federal use, though with the federal use case taking priority over the non-federal. A really good overview of the CBRS concept can be found in “A Survey on Citizens Broadband Radio Service (CBRS)” by P. Agarwal et al. The Wireless Innovation Forum published in 2022 a piece on “Lessons Learned from CBRS,” which provides a fairly nuanced, although somewhat negative, view on spectrum sharing as observed in the field and within the premises of the CBRS priority architecture and management system.

Recent data from FCC’s 3.5 GHz (CBRS) Auction 105 indicate that shared-license spectrum is valued at a lower USD-per-MHz-pop (i.e., 0.14 USD per MHz-pop) than the exclusive-use license auctions at 3.7 GHz (Auction 107; 0.88 USD per MHz-pop) and 3.45 GHz (Auction 110; 0.68 USD per MHz-pop). The duration of the shared-spectrum license in the case of Auction 105 is 10 years, after which it can be renewed. Verizon and Dish Network were the two main telecom incumbents that acquired substantial spectrum in Auction 105. AT&T did not acquire any, and T-Mobile US picked up close to nothing (i.e., 8 licenses).

THE STATE OF CELLULAR PERFORMANCE – IN THE UNITED STATES AND THE REST OF THE WORLD.

Irrespective of how one feels about the many mobile cellular benchmarks around the industry (e.g., Ookla Speedtest, umlaut benchmarking, OpenSignal, etc.), these benchmarks do give an indication of the state of networks and how those networks utilize the spectral resources that mobile companies have often spent hundreds of millions, if not billions, of US dollars acquiring, not to mention the cost and time that spectrum clearing, or perfecting “second-hand” spectrum, may incur for those operators.

So how do US-based mobile operators perform in a global context? We can get an impression, although a very one-dimensional one, from Figure 3 below.

Figure 3 illustrates the comparative results of Ookla Speedtest data in median downlink speed (Mbps) for various countries. The selection of countries provides a reasonable representation of maximum and minimum values. To give an impression of the global ranking as of February 2023: South Korea (3), Norway (4), China (7), Canada (17), USA (19), and Japan (48). As a reminder, the statistic is based on the median of all measurements per country. Thus, half of the measurements were above the median speed value, and the other half were below. Note: median values from 2017 to 2020 are estimated, as Ookla only provided average numbers.

Ookla’s Speedtest rank (see Figure 3 above) positions the United States cellular mobile networks (on average) among the Top 20. Depending on the ambition level, that may be pretty okay or a disappointment. However, over the last 24 months, thanks to the fast 5G deployment pace at 600 MHz, 2.5 GHz, and C-band, the US has (on average) leapfrogged its network quality, which for many years did not improve much due to limited spectrum availability and the huge capital investments required. That is something the American consumer can greatly enjoy irrespective of the relative mobile network ranking of the US compared to the rest of the world. South Korea and Norway are ranked 3 and 4, respectively, regarding cellular downlink (DL) speed in Mbps. The figure also shows a significant uplift in speed at the time 5G was introduced in cellular operators’ networks worldwide.

How should we understand the supplied cellular network quality and capacity that consumers demand and hopefully also enjoy? Let’s start with the basics:

Figure 4 illustrates one of the most important (imo) relationships to understand about creating capacity & quality in cellular networks. You need frequency bandwidth (in MHz), the right technology boosting your spectral efficiency (i.e., the ability to deliver bits per unit Hz), and sites (sectors, cells, …) on which to deploy the spectrum and your technology. That’s pretty much it.

We might be able to understand some of the dynamics of Figure 3 using Figure 4, which illustrates the fundamental cellular quality (and capacity) relationship with frequency bandwidth, spectral efficiency, and the number of cells (or sectors or sites) deployed in a given country.
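As a back-of-the-envelope illustration of that relationship (a minimal sketch; the bandwidth, spectral-efficiency, and cell-count figures are assumptions chosen for illustration, not measured values):

```python
def cell_throughput_mbps(bandwidth_mhz: float, spectral_eff: float) -> float:
    """Throughput of a single cell: bandwidth times spectral efficiency (bps/Hz)."""
    return bandwidth_mhz * spectral_eff  # MHz x bps/Hz conveniently yields Mbps

def network_capacity_gbps(bandwidth_mhz: float, spectral_eff: float, cells: int) -> float:
    """Total network capacity scales (to first order) with the number of cells."""
    return cell_throughput_mbps(bandwidth_mhz, spectral_eff) * cells / 1e3

# Illustrative: 100 MHz of deployed spectrum at an effective 2 bps/Hz per cell.
print(cell_throughput_mbps(100, 2.0))           # 200 Mbps per cell
print(network_capacity_gbps(100, 2.0, 50_000))  # 10,000 Gbps across 50k cells
```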

Thus, a mobile operator can improve its cellular quality (and capacity) by deploying more of the spectrum it has acquired on its existing network, for example, via auctions, leasing, sharing, or other arrangements within the possibilities of whatever regulatory regime applies. This option will be exhausted once the operator’s frequency spectrum pool is fully deployed across the cellular network. That leaves an operator to wait for an upcoming new frequency auction or, if possible, attempt to purchase additional spectrum in the market (if regulation allows), which may ultimately include a merger with another spectrum-rich entity (e.g., AT&T’s attempt to take over T-Mobile US). All such spectrum initiatives may take a substantial amount of time to crystallize, while customers may experience a worsening of their quality. In Europe, licensed spectrum becomes available in cycles of 10 – 20 years. In the USA, exclusive-use licensed spectrum is typically a once-only opportunity to acquire (unless you acquire another spectrum-holding entity later, e.g., Metro PCS, Sprint, AT&T’s attempt to acquire T-Mobile, …).

Another part of the quality and capacity toolkit is for the mobile operator to choose appropriately spectrally efficient technologies that are supported by a commercially available terminal ecosystem. Firstly, migrate frequency and bandwidth away from currently deployed legacy radio-access technologies (e.g., 2G, 3G, …) to newer and spectrally more efficient ones (e.g., 4G, 5G, …). This migration, also called spectral re-farming, requires a balancing act between current legacy demand and the future expectations of demand in the newer technology. In a modern cellular setting, the choice of antenna technology (e.g., massive MiMo, advanced antenna systems, …) and type (e.g., multi-band) is incredibly important for boosting quality and capacity within the operator’s cellular network. Given that such choices may result in redesigning existing site infrastructure, they provide an opportunity to optimize the existing infrastructure for the best coverage of the consolidated spectrum pool. It is likely that the existing infrastructure was designed with a single or only a few frequencies in mind (e.g., PCS, PCS+AWS, …), as well as legacy antennas, and cellular performance is likely improved by considering the complete pool of frequencies in the operator’s spectrum holding. The mobile operator’s game should always be to achieve the best possible spectral efficiency considering demand and economics (i.e., deploying 64×64 massive MiMo all over a network may theoretically be the most spectrally efficient solution, but both demand and economics would rarely support such an apparently “silly” non-engineering strategy). In general, this will be the most frequently used tool in the operators’ quality/capacity toolkit. I expect to see an “arms race” between operators deploying the best and most capable antennas (where it matters), as it will often be the only way to differentiate on quality and capacity (if everything else is almost equal).
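A toy illustration of the re-farming trade-off (a minimal sketch; the per-technology spectral efficiencies are rough, assumed figures for illustration, not vendor data):

```python
# Rough, assumed spectral efficiencies (bps/Hz) per radio-access technology.
SPECTRAL_EFF = {"3G": 0.5, "4G": 1.5, "5G": 2.5}

def refarming_gain_mbps(bandwidth_mhz: float, from_tech: str, to_tech: str) -> float:
    """Per-cell capacity gained by re-farming a carrier from a legacy technology
    to a newer, more spectrally efficient one (same bandwidth, same sites)."""
    return bandwidth_mhz * (SPECTRAL_EFF[to_tech] - SPECTRAL_EFF[from_tech])

# Re-farming a 10 MHz (DL) carrier from 3G to 5G:
print(refarming_gain_mbps(10, "3G", "5G"))  # +20 Mbps per cell, before legacy-demand trade-offs
```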

Finally, the mobile operator can deploy more site locations (macro and small cells), if permitting allows, or more sectors through sectorization (e.g., 3 → 4, 4 → 5 sectors) or cell splits, if the infrastructure and landlord allow. If there remains unused spectral bandwidth in the operator’s spectrum pool, the operator may choose to add another cell (i.e., frequency band) to an existing site. In particular, adding new site locations (macro or small cell) is the most complex path to take and, of course, often also the least economic one.

Thus, to get a feeling for the Ookla Speedtest results of Figure 3, which are country averages, we need, as a starting point, the amount of spectral bandwidth available to the average cellular mobile operator. This is summarised in Table 1 below.

Table 1 provides, per country, the average amount of Low-band (≤ 1 GHz), Mid-band (1 GHz to 2.1 GHz), 2.3 & 2.5 GHz bands, Sub-total bandwidth before including the C-band, the C-band (3.45 to 4.2 GHz) and the Total bandwidth. The table also includes the Ookla Global Speedtest DL Mbps and Global Rank as of February 2023. I have also included the in-country mobile operator variation within the different categories, which may indicate what kind of performance range to expect within a given country.

It does not take too long to observe that there is only an apparently rather weak correlation between spectrum bandwidth (sub-total and total) and the observed DL speed (even after rescaling to downlink spectrum only). What is also important is, of course, how much of the spectrum is actually deployed. Typically, low and mid bands will be deployed extensively, while other high-frequency bands may only have been deployed selectively, and the C-band is only in the process of being deployed (where it is available). What also plays a role is to what degree 5G has been rolled out across the network, how much bandwidth has been dedicated to 5G (and 4G), and what type of advanced antenna system or massive MiMo capabilities have been chosen. And then, to provide a great service, a network must have a certain site density (or coverage) relative to the customers’ demand. Thus, it is to be expected that the number of mobile site locations, and the associated number of frequency cells and sectors, will play a role in the average speed performance of a given country.

Figure 5 illustrates how the DL speed in Mbps correlates with (a) the total amount of spectrum excluding the C-band (still not widely deployed), (b) customers per site, which provides a measure of the customer load at the site-location level: the more customers that load a site, or compete for radio resources (i.e., MHz), the lower the experience, and finally (c) sites times bandwidth relative to the number of customers: the higher this measure, the more quality can be provided (as observed in the positive correlation). The data is from Table 1.

Figure 5 shows that load (e.g., customers per site) and available capacity relative to customers (e.g., sites × bandwidth per customer) are strongly correlated with the experienced quality (e.g., speed in Mbps). The comparison between the United States and China is interesting: both countries have a fairly similar surface area (i.e., 9.8 vs. 9.6 million sq. km), the USA has a little less than a quarter of the population, and the average US mobile operator has about one-third of the customers of the average Chinese operator (note: China Mobile dominates the average). The Chinese operator, ignoring C-band, would have ca. 25 MHz, or ~+20% (~50 MHz, or ca. +10%, if C-band is included), more than the US operator. Regarding sites, China Mobile has been reported to have millions of cell site locations (incl. lots of small cells). The US operator’s site count is in the order of hundreds of thousands (though less than 200k currently, including small cells). Thus, Chinese mobile operators have between 5× and 10× the number of site locations compared to the American ones. While the difference in spectrum bandwidth has some significance (i.e., China +10% to 20% higher), the huge relative difference in site numbers is one of the determining factors in why China (i.e., 117 Mbps) achieves a better speed test score than the American one (i.e., 85 Mbps). While theoretically (and simplistically), one would expect the average Chinese mobile operator to be able to provide more than twice the speed of the American mobile operator, instead of “only” about 40% more, it goes to show that the radio environment is a “bit” more complex than the simplistic view.
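To make the simplistic expectation explicit (a minimal sketch; the ratios are the rough figures quoted above, not precise operator data):

```python
# Rough China-vs-USA ratios, per average operator, as quoted above (assumptions).
bandwidth_ratio = 1.2  # Chinese operator holds ~20% more spectrum (ex C-band)
site_ratio = 5.0       # 5x to 10x more site locations; take the low end
customer_ratio = 3.0   # ~3x more customers per average Chinese operator

# Simplistic per-customer capacity expectation: bandwidth x sites / customers.
expected = bandwidth_ratio * site_ratio / customer_ratio
print(f"Expected speed ratio (China/US): {expected:.1f}x")  # ~2.0x (more with 10x sites)

observed = 117 / 85
print(f"Observed speed ratio (China/US): {observed:.2f}x")  # ~1.38x, i.e., ~40% more
```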

Of course, the US-based operator could attempt to deploy even more sites where it matters. However, I very much doubt that this would be a feasible strategy given permitting and citizen resistance to increasing site density in areas where it actually would be needed to boost the performance and customer experience.

Thus, the operator in the United States must acquire more spectrum bandwidth and deploy it where it matters to their customers. They also need to continue to innovate, leapfrogging the spectral efficiency of their radio access technologies and deploying increasingly sophisticated antenna systems across their coverage footprint.

Sectorization (at existing locations), cell splits (adding existing spectrum to an existing site), and/or adding more sophisticated antenna systems are a matter of Capex prioritization and possibly of getting permission from the landlord. Acquiring new spectrum … well, that depends on such new spectrum somehow becoming available.

Where to “look” for more spectrum?

WHERE COULD MORE SPECTRUM COME FROM?

Within the so-called “beachfront spectrum,” covering the frequency range from 225 MHz to 4.2 GHz (according to the NTIA), only about 30% (ca. 1 GHz of bandwidth within the frequency range from 600 MHz to 4.2 GHz) is exclusively non-Federal, mainly held by the mobile operators as exclusive-use licenses deployed for cellular mobile services across the United States. Federal authorities exclusively use a bit less than 20% (~800 MHz) for communications, radars, and R&D purposes. This leaves ca. 50% (~2 GHz) of the beachfront spectrum shared between Federal authorities and commercial entities (i.e., non-Federal).

For cellular mobile operators, exclusive-use licenses would be preferable (note: at least given the current state of the relevant technology landscape), as they provide the greatest degree of operational control and the possibility to optimize spectral efficiency, avoiding unacceptable levels of interference either from or towards systems that may be sharing a given frequency range.

The options for re-purposing the Federal-only spectrum (~800 MHz) could, for example, be (a) moving radar systems’ operational frequency range out of the beachfront spectrum range to the degree that innovation and technology support such a migration, (b) modernizing radar systems with a focus on making these substantially more spectrally efficient and interference-resistant, or (c) migrating federal-only communications services to commercially available systems (e.g., federal-only 5G slicing), similar to the trend of migrating federal legacy data centers to the public cloud. Within the shared portion, with its ~2 GHz of bandwidth, it may be more challenging, as considerable commercial interests (other than mobile operators) have positioned their business at and around such frequencies, e.g., within the CBRS frequency range. This said, there might also be opportunities within the Federal use cases to shift applications towards commercially available communication systems or to shift them out of the beachfront range. Of course, in my opinion, it always makes sense to impose (and possibly finance) stricter spectral-efficiency conditions, triggering innovation in federal and commercial systems alike within the shared portion of the beachfront spectrum range. With such spectrum strategies, there appear to be compelling, high-likelihood opportunities for creating more spectrum for exclusive-use licensing that would safeguard future consumer and commercial demand and the continuous improvement in customer experience that comes with future demand and user expectations of the technology that serves them.

I believe that the beachfront should be extended beyond 4.2 GHz. For example, aligning with band 79, whose frequency range extends from 4.4 GHz to 5.0 GHz, would allow for a bandwidth of 600 MHz (e.g., China Mobile has 100 MHz in the range from 4.8 GHz to 4.9 GHz). Exploring additional re-purposing opportunities for exclusive-use licenses in what may be called the extended beachfront frequency range, from 4.2 GHz up to 7.2 GHz, should be conducted with priority. Such a study should also consider the possibility of moving spectrum under exclusive and shared federal use to other frequency bands and optimizing the current federal frequency and spectrum allocation.

The NTIA, that is, the National Telecommunications and Information Administration, is currently (i.e., in 2023) developing a National Spectrum Strategy (NSS) for the United States, together with the associated implementation plan. Comments and suggestions on the NSS were possible until the 18th of April, 2023. The National Spectrum Strategy should address how to create a long-term spectrum pipeline. It is clear that developing a coherent national spectrum strategy is critical to innovation, economic competition, national security, and maybe re-capturing global technology leadership.

So who is the NTIA? What do they do that the FCC doesn’t already do? (you may possibly ask).

WHO MANAGES WHAT SPECTRUM?

Two main agencies in the US manage the frequency spectrum: the FCC and the NTIA.

The Federal Communications Commission, the FCC for short, is an independent agency that exclusively regulates all non-Federal spectrum use across the United States. The FCC allocates spectrum licenses for commercial use, typically through spectrum auctions. New or re-purposed commercial spectrum has typically been reclaimed from other uses, both federal and existing commercial ones. Spectrum can be re-purposed either because newer, more spectrally efficient technologies become available (e.g., the transition from analog to digital broadcasting) or because it becomes viable to shift operation to other spectrum bands with less commercial value (and, of course, without jeopardizing existing operational excellence). It is also possible for spectrum previously reserved for exclusive federal use (e.g., military applications, fixed satellite uses, etc.) to be shared, as is the case with the Citizens Broadband Radio Service (CBRS), which allows non-federal parties access to 150 MHz in the 3.5 GHz band (i.e., band 48). However, it has recently been concluded that (centralized) dynamic spectrum sharing only works in certain use cases and is associated with considerable implementation complexities. Co-existence of multiple parties with possibly vastly different requirements within a given band is very much work in progress and may not be consistent with the commercialized spectrum operation required for high-quality broadband cellular services.

In parallel with the FCC, we have the National Telecommunications and Information Administration, NTIA for short. The NTIA is solely responsible for authorizing Federal spectrum use. It also acts as the President of the United States’ principal adviser on telecommunications policies, coordinating the views of the Executive Branch. The NTIA manages about 2,398 MHz (69%) within the so-called “beachfront spectrum” range of 225 MHz to 3.7 GHz (note: I would let that beachfront go to 7 GHz, to be honest). Of the total of 3,475 MHz, 591 MHz (17%) is exclusively for Federal use, and 1,807 MHz (52%) is shared (or coordinated) between Federal and non-Federal users. That leaves 1,077 MHz (31%) for exclusive commercial use under the management of the FCC.
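These shares are easy to sanity-check (a minimal sketch reproducing the quoted percentages):

```python
# Beachfront (225 MHz to 3.7 GHz) allocation figures quoted above, in MHz.
beachfront_total = 3_700 - 225  # 3,475 MHz
federal_exclusive, shared, commercial_exclusive = 591, 1_807, 1_077

assert federal_exclusive + shared + commercial_exclusive == beachfront_total
assert federal_exclusive + shared == 2_398  # the NTIA-managed portion (69%)

for name, mhz in [("Federal exclusive", federal_exclusive),
                  ("Shared Federal/non-Federal", shared),
                  ("Commercial exclusive (FCC)", commercial_exclusive)]:
    print(f"{name}: {mhz:,} MHz ({mhz / beachfront_total:.0%})")
# Federal exclusive: 591 MHz (17%), Shared: 1,807 MHz (52%), Commercial: 1,077 MHz (31%)
```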

The NTIA, in collaboration with the FCC, has been instrumental in freeing up substantial C-band spectrum, 480 MHz in total, of which 100 MHz is conditioned on prioritized sharing (i.e., Auction 105), for commercial and shared use; this spectrum has subsequently been auctioned off over the last 3 years, raising USD 109 billion. In US Dollars (USD) per MHz per population count (pop), that comes to, on average, ca. USD 0.68 per MHz-pop for the US C-band auctions, compared to USD 0.13 per MHz-pop for European C-band auctions and USD 0.23 per MHz-pop for APAC auctions. It should be remembered that United States exclusive-use spectrum licenses can be regarded as indefinite-lived intangible assets, while European spectrum rights expire after 10 to 20 years. This may explain a big part of the difference between US-based spectrum pricing and that of Europe and Asia.
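The quoted US average is straightforward to reproduce (a minimal sketch; the ~330 million population figure is my assumption for the pop count):

```python
def usd_per_mhz_pop(proceeds_usd: float, mhz: float, population: int) -> float:
    """Standard auction valuation metric: proceeds / (MHz x population covered)."""
    return proceeds_usd / (mhz * population)

# US C-band auctions: USD 109 billion for 480 MHz, over an assumed ~330 million pops.
print(f"{usd_per_mhz_pop(109e9, 480, 330_000_000):.2f} USD per MHz-pop")
# ~0.69, close to the quoted 0.68 average
```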

The NTIA and the FCC jointly manage all the radio spectrum of the United States, licensed (e.g., cellular mobile frequencies, TV signals, …) and unlicensed (e.g., WiFi, microwave ovens, …): the NTIA for Federal use and the FCC for non-Federal use (put simply). The FCC is responsible for auctioning spectrum licenses and is also authorized to redistribute licenses.

RESPONSE TO NTIA’S NATIONAL SPECTRUM STRATEGY REQUEST FOR COMMENTS.

Here are some of the key points to consider in developing a National Spectrum Strategy (NSS).

  • The NTIA National Spectrum Strategy (NSS) should focus on creating a long-term spectrum pipeline. Developing a coherent national spectrum strategy is critical to innovation, economic competition, national security, and global technology leadership.
  • NTIA should aim at significant amounts of spectrum to study and clear in order to build a pipeline. Repurposing at least 1,500 MHz of spectrum, perfected for commercial operations, is a good initial target, allowing the industry to continue to meet consumer, business, and societal demand. This requires more than 1,500 MHz to be identified for study.
  • NTIA should be aware that, in a global setting, mobile network quality strongly correlates with the spectrum available to the mobile operators for their broadband mobile services.
  • NTIA must remember that not all spectrum is equal. As it thinks about a pipeline, it must ensure its plans are consistent with the spectrum needs of the various use cases across the wireless sector. The NSS is a unique opportunity for NTIA to establish a more reliable process and consistent policy for making federal spectrum available for commercial use. NTIA should reassert its role, and that of the FCC, as the primary federal and commercial regulators of spectrum policy.

A balanced spectrum policy is the right approach. Given the current spectrum dynamics, the NSS should prioritize identifying exclusive-use licensed spectrum instead of, for example, attempting co-existence between commercial and federal use.

Spectrum-band sharing between commercial communications networks and federal communications or radar systems may impact the performance of all the involved systems. Such a practice compromises the level of innovation in modern commercial communications networks (e.g., 5G or 6G) forced to co-exist with the older legacy systems. It also discourages the modernization of legacy federal equipment.

Only high-power licensed spectrum can provide the performance necessary to support nationwide wireless networks with the scale, reliability, security, resiliency, and capabilities that consumers, businesses, and public sector customers expect.

Exclusive-use licensed spectrum provides unique benefits compared to unlicensed and shared spectrum. Unlicensed spectrum, while important, is only suitable for some types of applications, and licensed spectrum under shared-access frameworks such as CBRS is unsuited to serve as the foundation for nationwide mobile wireless networks.

Allocating new spectrum bands for exclusive-use licensing positively impacts the entire wireless ecosystem, including downstream investments by equipment companies and others who support developing and deploying wireless networks. Insufficient licensed spectrum means an increasingly deteriorating customer experience and lost economic growth, jobs, and innovation.

Other countries are ahead of the USA in developing plans for licensed spectrum allocations, targeting the full potential of the spectrum range from 300 MHz up to 7 GHz (i.e., the extended beachfront spectrum range), and those countries will lead the international conversation on licensed spectrum allocation. The NSS offers an opportunity to reassert U.S. leadership in these debates.

NTIA should also consider the substantial benefits and economic value of leading the innovation in modernizing the legacy, spectrally inefficient non-commercial communications and radar systems occupying vast spectrum resources.

Exclusive-use licensed spectrum has inherent characteristics that benefit all users in the wireless ecosystem.

Consumer demand for mobile data is at an all-time high and only continues to surge as demand grows for lightning-fast and responsive wireless products and services enabled by licensed spectrum.

With an appropriately designed and well-sized spectrum pipeline, demand growth will remain sustainable, as supplied spectrum capacity relative to demand will remain at or exceed today’s levels.

Networks built on licensed spectrum are the backbone of next-generation innovative applications like precision agriculture, telehealth, advanced manufacturing, smart cities, and our climate response.

Licensed spectrum is enhancing broadband competition and bridging the digital divide by enabling 5G services like 5G Fixed Wireless Access (FWA) in areas traditionally dominated by cable and in rural areas where fiber is not cost-effective to deploy.

NTIA should identify midband spectrum (e.g., ~2.5 GHz to ~7 GHz), and in particular frequencies above the C-band, for licensed use. That would be the sweet spot for leapfrogging the broadband speed and capacity necessary to power 5G and future generations of broadband communications networks.

The National Spectrum Strategy is an opportunity to improve the U.S. Government’s spectrum management process.

The NSS allows NTIA to develop a better and more consistent process for allocating spectrum and resolving disputes.

The U.S. should not adopt a new top-down, government-driven industrial policy to manage mobile networks. A central planning model would harm the nation, severely limiting innovation and private-sector dynamism.

Instead, we need better collaboration among government agencies, with the NTIA and the FCC as the U.S. Government agencies with clear authority over the nation’s spectrum. The NSS should also explore mechanisms to get federal agencies (and their associated industry sectors) to surface their concerns about spectrum allocation decisions early in the process and to accept NTIA’s role as a mediator in any dispute.

ACKNOWLEDGEMENT.

I greatly acknowledge my wife, Eva Varadi, for her support, patience, and understanding during the creative process of writing this article. Of course, throughout the years of being involved in T-Mobile US spectrum strategy, I have enjoyed many discussions and debates with US-based spectrum professionals, bankers, T-Mobile US colleagues, and very smart regulatory policy experts in Deutsche Telekom AG. I have the utmost respect for their work and the challenges they have faced and still face. For this particular work, I cannot thank Roslyn Layton, PhD, enough for nudging me into writing the comments to the NTIA. Thanks to that nudge, this little article is a companion to my submission about US spectrum as it stands today and what I would like to see in the upcoming National Spectrum Strategy. I very much recommend reading Roslyn’s far more comprehensive and worked-through comments on the NTIA NSS request. A final thank you to John Strand (who keeps away from LinkedIn ;-) of Strand Consult for challenging my way of thinking and for always stimulating new ways of approaching problems in our telecom sector. I very much appreciate our discussions.

ADDITIONAL MATERIAL.

  1. Kim Kyllesbech Larsen, “NTIA-2023-003. Development of a National Spectrum Strategy (NSS)”, National Spectrum Strategy Request for Comment Responses April 2023. See all submissions here.
  2. Roslyn Layton, “NTIA–2023–0003. Development of a National Spectrum Strategy (NSS)”, National Spectrum Strategy Request for Comment Responses, April 2023.
  3. Ronald Harry Coase, “The Federal Communications Commission”, The Journal of Law & Economics, Vol. 2 (October 1959), pp. 1–40. In my opinion, a must-read for anyone who wants to understand the US spectrum regulation and how it came about.
  4. Kenneth R. Carter, “Policy Lessons from Personal Communications Services: Licensed vs. Unlicensed Spectrum Access,” 2006, Columbus School of Law. An interesting perspective on licensed and unlicensed spectrum access.
  5. Federal Communication Commission (FCC) assigned areas based on the relevant radio licenses. See also FCC Cellular Market Areas (CMAs).
  6. FCC broadband PCS band plan, UL:1850-1910 MHz & DL:1930-1990 MHz, 120 MHz in total or 2×60 MHz.
  7. Understanding Federal Spectrum Use is a good piece from NTIA about the various federal use of spectrum in the United States.
  8. Ookla’s Speedtest Global Index for February 2023. In order to get the historical information use the internet archive, also called “The Wayback Machine.”
  9. I make extensive use of the Spectrum Monitoring site, which I can recommend as one of the most comprehensive sources of frequency allocation data worldwide that I have come across (and is affordable to use).
  10. FCC Releases Rules for Innovative Spectrum Sharing in 3.5 GHz Band.
  11. 47 CFR Part 96—Citizens Broadband Radio Service. Explains the hierarchical spectrum-sharing regime and the priorities given within the CBRS.