Andy is one of the key staff driving the engineering strategy at EE, the UK's largest mobile network. Here he explains why EE's network topology enables them to defer the inevitable expansion into small cell deployment, compares alternative future LTE network architectures and forecasts the likely timing and reasons for future small cell adoption.
There are conflicting industry views about whether mobile networks are already or soon to be massively overwhelmed by data traffic. What's your perspective on this?
Certainly we see a data tsunami, and it's not new – it started when the first iPhone was introduced on 2G. We saw this affect everyone across the industry, and the 3G version further increased traffic. Overall, I think we've been pretty good at responding, evolving the radio capacity to HSPA+ and upgrading backhaul from old E1 circuit links to Ethernet. While the projected "hockey stick" growth took a little longer to arrive than originally forecast, we've certainly witnessed it over the last few years. Clearly the growth over that time has been phenomenal, but the industry is responding proactively.
We are finding that the bulk of data traffic is concentrated on a relatively small number of sites (though more than 10% of them), and that these hot traffic areas are growing at an even faster rate than other parts of the network. Exceptions to this include rural broadband in areas such as Cumbria, where there is no fixed line broadband alternative. Indeed, some of our rural 4G sites have seen remarkably high traffic levels.
Mass gatherings, such as sports stadiums, need special solutions. Last year the UK telecoms industry performed very well throughout the 2012 Olympics, having prepared well in advance and deployed plenty of capacity.
We continue to keep a close eye on how traffic use is changing, wanting to understand the underlying dynamics of growth. Where large numbers want to view the same content simultaneously, broadcast and multicast features such as eMBMS might become appropriate. The highest peak in our traffic recently was when Andy Murray won Wimbledon – the video traffic throughput was intense – and eMBMS would fit that demand well. However, there has always been some scepticism, since most people want to watch unicast rather than broadcast content.
The vast bulk of users today are still on 3G. How will EE evolve the use of 2G, 3G and 4G radio technologies?
Our primary focus today is to provide mobile broadband capacity and availability through a mix of 3G and LTE.
It makes most sense to invest in 4G today because it provides better efficiency, a better end-user experience and any new equipment will have a longer lifetime. We also continue to invest in 3G because most of our customers have not yet migrated to 4G.
We have an ongoing RAN refresh program, which is replacing legacy GSM basestation equipment with the latest "Single RAN" equipment serving both 2G and 4G. Our 3G network is, in the main, shared with 3 UK. This uses a technical approach known as MORAN (Multi-Operator Radio Access Network), in which certain elements of the base station are shared, although we deploy dedicated carriers. The EE 4G network is a unilateral deployment.
I don't foresee us shutting down our GSM network anytime soon, as is happening in the US and Japan. For us, GSM is the underlying glue that provides the certainty of a roaming service – a common denominator found in virtually every country worldwide. There are also many M2M (machine to machine) devices sold on long term contracts that we continue to service. Our new base stations allow us to reduce the spectrum that needs to be allocated to 2G and transfer it across to 4G in the future should we wish to.
I'd expect GSM to be around until at least 2020. There's an interesting industry debate about whether 3G might be retired first, with much of today's 3G data traffic wanting to migrate to LTE in due course. Perhaps we'll see both 2G and 3G reaching the end of life around the same time in the next decade.
What is EE's Super Macro concept and how does that affect small cells?
It's a case of delivering maximum capability within a fixed budget to the benefit of as many subscribers as possible. We already had a tightly integrated 1800MHz GSM grid of cellsites, packed to such a density in urban and suburban areas that expansion is driven by capacity, not coverage. Additionally, we have substantial spectrum assets, including 3G with 4 carriers at 2100MHz and LTE at 800MHz, 1800MHz and 2600MHz.
Traditionally, most of these are in a 3-sector configuration, but we now have many 6-sector sites (and several in between with four or five sectors). If we find that one sector is particularly hot, we'd split only that one. In dense urban areas, there can be as little as 200 metres' spacing between sites.
We initially deployed 4G with 2 x 10MHz channels in the 1800MHz band, and have covered over 120 towns and cities to date. We have also introduced 'double speed' 4G to the largest 20 cities by increasing the 4G channel bandwidths from 2 x 10MHz to 2 x 20MHz in the 1800MHz band. These 20MHz channels will continue to roll out to become the base build of the 4G network.
We plan to launch a quad-speed LTE service next year, using Carrier Aggregation of both 2 x 20MHz at 1800MHz and 2 x 20MHz at 2600MHz, theoretically achieving up to 300Mbps download speeds with CAT6 LTE devices.
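As a rough sanity check on the quad-speed figure (the parameters below are standard LTE downlink values, not EE's own calculation), the 300Mbps Cat6 number can be reproduced from first principles:

```python
# Rough peak-rate estimate for LTE carrier aggregation (illustrative
# figures, not EE data): two 20MHz carriers, 2x2 MIMO, 64QAM downlink.

SUBCARRIERS_PER_RB = 12   # subcarriers in one resource block
RBS_20MHZ = 100           # resource blocks in a 20MHz LTE carrier
SYMBOLS_PER_MS = 14       # OFDM symbols per 1ms subframe (normal CP)
BITS_PER_SYMBOL = 6       # 64QAM
MIMO_LAYERS = 2           # 2x2 MIMO

def peak_rate_mbps(carriers: int) -> float:
    """Raw physical-layer bit rate, before control/reference overhead."""
    bits_per_ms = (carriers * RBS_20MHZ * SUBCARRIERS_PER_RB
                   * SYMBOLS_PER_MS * BITS_PER_SYMBOL * MIMO_LAYERS)
    return bits_per_ms / 1000.0  # bits per ms == kbit/s; /1000 gives Mbps

print(peak_rate_mbps(1))  # single 20MHz carrier: 201.6 Mbps raw
print(peak_rate_mbps(2))  # 2 x 20MHz aggregated: 403.2 Mbps raw
```

Allowing roughly a quarter of the raw rate for control channels and reference signals brings the aggregated figure down to around 300Mbps, which matches the Cat6 device ceiling the interview quotes.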
Our approach gives us the most flexibility and backward compatibility, expanding coverage and capacity and enhancing our capability. Elsewhere, each operator may choose a different strategy, and we would never criticize an alternative approach - there are many variables to be considered. Availability of spectrum, regulatory constraints (e.g. RF power levels, planning laws, competitive constraints) and economic factors all play a part in determining the most appropriate strategy. Our own strategy has delivered by building on our extensive spectrum assets and our numerous existing cellsites capable of expansion.
What's next after Super Macros and what are the technical issues to overcome?
A critical performance metric relates to efficiency of data capacity, measured in bits per second per hertz per square metre. I've been impressed by how high the traffic levels are that we can achieve using Super Macros alone, but this won't be enough for all our future needs.
We want to evolve from our Super Macro strategy into a full HetNet (Heterogeneous Network), with a single point of control for small cells integrated with Super Macros.
For this to work well, the X2 interface which co-ordinates between small and large cells will be vital. At the outset, it's important to understand what we are trying to achieve through this mechanism.
We are very likely to be sharing the same spectrum between small cells and super macros, and this will need eICIC (enhanced Inter-Cell Interference Co-ordination), or a similar mechanism, to provide coordinated scheduling across both network layers.
When we look to implement ABS (Almost Blank Subframes) patterns, we find that these don't change often – perhaps at most every five minutes and perhaps as little as once per hour. Although subscribers move around, the traffic hotspots might not. We find we can determine where to locate small cells to capture the maximum traffic capacity, meaning we only need to install a few to make a substantial difference.
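The ABS mechanism described above can be sketched minimally (the bitmap and period here are illustrative assumptions, not an operator configuration): the macro mutes a repeating pattern of subframes, and small-cell-edge users are scheduled only in those protected subframes, where macro interference is absent.

```python
# Minimal sketch of eICIC Almost Blank Subframes (illustrative pattern,
# not a real operator configuration). The macro is muted wherever the
# ABS bitmap holds 1; small-cell-edge users are scheduled only there.

ABS_PATTERN = [1, 0, 0, 0, 1, 0, 0, 0]  # repeats every 8 subframes (1ms each)

def macro_may_transmit(subframe: int) -> bool:
    """True when the macro layer is allowed to schedule this subframe."""
    return ABS_PATTERN[subframe % len(ABS_PATTERN)] == 0

def protected_subframes(window: int) -> list:
    """Subframes in [0, window) reserved for small-cell-edge users."""
    return [sf for sf in range(window) if not macro_may_transmit(sf)]

print(protected_subframes(16))  # [0, 4, 8, 12]
```

Because the pattern changes only every few minutes to an hour, as noted above, the coordination traffic needed between layers over X2 is modest.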
Latency across the X2 interface is a critical factor. Today, we route X2 in parallel with S1 via the security gateway, which gives us latency in the region of 10-20ms. That's less than the break/make time for mobility procedures in LTE and perfectly adequate to support eICIC and many of the other centrally coordinated techniques being discussed. These optimisation techniques do require time (phase) synchronisation between base stations, which is something we'll introduce as and when necessary.
If you look at some of the joint processing schemes being discussed today for CoMP (Co-ordinated Multi-Point), then the X2 interface requires very low latency, which would be hugely expensive. It would need dedicated dark fibre to each site or a complete wireless mesh.
This leads me to question the C-RAN (Cloud RAN) over D-RAN (Distributed RAN) approach as a capacity enhancement. It might work well where you have access to dark fibre at low cost to connect cellsites but we find that using our Super Macro strategy combined with 5 to 8 small cells and eICIC can deliver phenomenal capacity with a D-RAN.
There are further improvements that can be made, such as better optimization tools, better radios in the handsets or using MIMO in small cells. We have been pleasantly surprised at just how much capacity we have delivered already and are able to achieve by adding a small cell layer.
What's your view of the timing and scale of small cell growth?
When talking to our US operator colleagues, it's clear they are constrained by both limited spectrum holdings and the increasing demands from many 4G customers. This is driving a rapid take-up of small cells for capacity. I also expect some European operators will adopt small cells more quickly than others depending on their spectrum holdings and other factors.
We will continue trials through 2014, but in the main I think it will be 2015 at the earliest before we see large numbers of small cells in our network. Once it starts, we can expect to see significant volumes; Super Macros are our first step towards HetNets, but small cells are the essential next step.
As the small cell industry matures, we have to continue to learn together and strengthen the ecosystem, establishing mutually beneficial relationships throughout. There has to be a fair and reasonable price for each aspect of the solution, from site acquisition through to equipment cost.
We caught up with Andy again in 2014, when he walked us around some of EE's urban sites in London and explained their three stage evolution plan towards Small Cells and separately discussed issues for urban small cell backhaul.