Telecom networks are the backbone of digital products and services which, now more than ever, have the potential to support all aspects of our daily lives. The evolution of digital products and services is a key driver and enabler of economic growth throughout the world, and the development of fixed broadband markets is a critical component to this end. Very high capacity networks (VHCN), in the European Commission's terminology, and enhanced connectivity for all citizens should be regarded as treasured assets for a national economy.
Countries have come a long way in the development of high capacity networks, such as FTTH. However, development and adoption are ongoing processes, and progress has been relatively uneven among countries around the world.
Exhibit 1: Percent of broadband lines based on FTTH access [Source: OECD]
In Europe, creating a level playing field and providing the right conditions for digital services and networks to flourish are central pillars of the Digital Single Market. In this context, the governing regulatory framework in the EU, the European Electronic Communications Code (EECC), sets out as one of its key objectives widespread access to very high capacity networks for all European citizens and businesses. The principle of widespread adoption of very high capacity networks is one of the targets that shape regulatory policy for fixed broadband markets in the EU and is an example that other countries around the world should follow.
Regulatory bodies must strike a very delicate balance between promoting investment and safeguarding competition in the market to maximise deployment and uptake of very high capacity networks. Regulators have at their disposal a toolbox of regulatory instruments to meet these objectives. We can summarise these tools into three main groups:
- Market analysis and related remedies
- Cost-based price setting
- Replicability-based regulation
Each of these instruments can be used in different ways depending on market conditions and regulatory objectives. In practice, regulators tend to use a combination of these tools to balance the benefits that each one brings. However, selecting the right mix of regulatory instruments is only one part of the equation. To maximise the benefits of a particular regulation, regulators must also overcome a series of challenges. In our experience, some of the most typical difficulties regulators face in fixed broadband markets are:
- Segmentation of geographical markets. Unlike traditional copper networks, very high capacity networks are commonly deployed by multiple infrastructure providers competing within a country. The scope of these providers can differ radically, with some deploying nationwide networks and others covering only small areas, such as a single municipality. Therefore, and in contrast with past practice, regulators need to investigate more granular geographical markets, in which operators other than the incumbent may be found to hold significant market power (SMP). In this context, regulators not only face the difficulty of working with very detailed geographical information; they also have to bear in mind that a joint-SMP designation is likely to be challenged in the courts (as is already happening). The market analysis should therefore be bullet-proof, from both an economic and a technical point of view.
- Granularity of costing tools. The evolution of market analysis is reshaping the definition of cost-based remedies. Traditional bottom-up cost models can no longer cope with the necessary granularity; big-data solutions are required instead. Regulators need to determine accurately the costs associated with particular areas and, this time, relying on the incumbent's databases alone (as many regulators have done in the past) is not enough to regulate non-incumbent SMP operators. Cost models should therefore become much more sophisticated, while remaining transparent and usable by all involved parties.
- Sending proper build-or-buy signals while ensuring cost recovery. Most (if not all) regulators base their price decisions on current costs since, in theory, this is the best approach to send proper build-or-buy signals to the market. However, regulators struggle when it comes to legacy infrastructure and its “current costs”. Should they assume that trenches dug decades ago are valued at current labour costs? What happens with fully depreciated assets? What is the proper value of assets that are reusable for very high capacity networks? And how can they balance avoiding a slowdown in the adoption of very high capacity networks with avoiding unreasonable cost over-recovery?
- Defining the proper methodology to ensure tariff replicability. Tariff replicability (and margin/price squeeze tests) is a well-known instrument for regulators. However, it is evolving in both relevance and complexity. In terms of relevance, some countries have started to rely more on ensuring replicability than on cost-based tariff imposition (this is, for instance, the preference shown by the European Commission). In terms of complexity, the growing prevalence of bundles (often including services from unregulated markets), the broad range of alternative operators to be represented by the reference operator, and the coexistence of regionally focused providers (with their regionally focused tariffs) alongside national providers all require a thorough revision of the replicability methodology used.
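To make the granularity challenge concrete, the sketch below shows the kind of per-area unit-cost computation a granular bottom-up model needs, deriving a cost per premises passed for each area rather than a single national average. Area identifiers and all figures are invented for illustration only.

```python
# Hypothetical sketch: per-area unit costs from detailed deployment records,
# instead of one national average. Area IDs and figures are invented.
from collections import defaultdict

records = [
    # (area_id, network_cost_eur, premises_passed)
    ("MUNI-001", 1_200_000.0, 8_000),
    ("MUNI-001",   300_000.0, 1_500),
    ("MUNI-002", 2_500_000.0, 5_000),
]

cost = defaultdict(float)
premises = defaultdict(int)
for area, c, p in records:
    cost[area] += c
    premises[area] += p

# Cost per premises passed, per geographical area
unit_cost = {a: cost[a] / premises[a] for a in cost}
print(unit_cost)
```

In a real model the records would number in the millions (hence the need for big-data tooling), but the principle is the same: costs must be attributable to the granular areas identified in the market analysis.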
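One way to make the current-cost valuation questions concrete is the tilted annuity, a device commonly used in current-cost models: it spreads a current-cost valuation over an asset's life while accounting for the expected drift in the asset's replacement price. The sketch below uses invented figures (investment, WACC, price trend, lifetime) purely for illustration.

```python
def tilted_annuity(investment: float, wacc: float, trend: float, life: int) -> float:
    """First-year charge that recovers `investment` over `life` years at a
    cost of capital `wacc`, assuming the asset's replacement price drifts
    at `trend` per year (trend = 0 reduces to the standard annuity)."""
    q = (1 + trend) / (1 + wacc)
    return investment * (wacc - trend) / (1 - q ** life)

# A trench revalued at today's civil-works prices (all figures hypothetical):
flat = tilted_annuity(1_000.0, wacc=0.08, trend=0.00, life=40)
tilted = tilted_annuity(1_000.0, wacc=0.08, trend=0.02, life=40)
print(round(flat, 1), round(tilted, 1))
```

The choice of the investment value fed into such a formula, current labour costs for a decades-old trench, or something lower, is precisely where the over-recovery debate lies.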
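At its core, a replicability (margin squeeze) test checks whether a retail bundle leaves a non-negative margin after deducting the regulated wholesale input and the reference operator's own retail costs. The minimal sketch below illustrates this logic; bundle names, prices, and costs are hypothetical, and a real test would add the bundling, reference-operator, and regional complications described above.

```python
# Minimal economic replicability (margin squeeze) test.
# All bundle definitions and figures are hypothetical.
from dataclasses import dataclass

@dataclass
class Bundle:
    name: str
    retail_price: float    # monthly retail price (EUR)
    wholesale_cost: float  # regulated wholesale input cost (EUR/month)
    retail_costs: float    # reference operator's own retail costs (EUR/month)

def is_replicable(bundle: Bundle) -> bool:
    """Pass if an equally efficient operator buying the wholesale input
    could match the retail price without a negative margin."""
    margin = bundle.retail_price - bundle.wholesale_cost - bundle.retail_costs
    return margin >= 0

bundles = [
    Bundle("FTTH 300 Mbps", retail_price=35.0, wholesale_cost=18.0, retail_costs=12.0),
    Bundle("FTTH 1 Gbps + TV", retail_price=50.0, wholesale_cost=24.0, retail_costs=28.0),
]

for b in bundles:
    print(b.name, "replicable" if is_replicable(b) else "margin squeeze")
```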
This write-up serves as an introduction to a series of articles in which we will dig deeper into these, and other, challenges that regulatory bodies face when regulating fixed broadband markets, and how to overcome them. Stay tuned for the next articles on fixed broadband regulation.