As we move closer towards mass Small Cell deployment, several of the back office systems used for cellular network planning and deployment need to evolve. Rather than a series of independent, silo-based systems, we are starting to see greater interworking and information exchange that will ultimately reduce risk and increase the pace of introducing features, configuration options and vendor equipment.
Back office systems used to be independent and standalone
In the early days of cellular networks, it wasn't uncommon for point solutions to be introduced to deal with each new problem as it cropped up. Small, independent software programs were developed to address each area. Each has evolved at its own pace: some have become outdated or been discontinued, some were in-house IT projects, and many addressed issues since resolved by later generations of network equipment. Other back-office systems, however, remain essential to operating an efficient mobile network.
Three key functions within the planning and management of a cellular radio network include:
- RF planning, which uses theoretical models and maps to predict how the signals will disperse from each cellsite. This includes generating coverage and capacity maps, and evaluating the best locations for future sites and site expansion.
- Performance Management, which imports "Big Data" from the radio network and other sources to establish the actual performance achieved. Executive bonuses are often tied to the KPI metrics reported, which include average data rates, call drop rates and utilisation/efficiency levels.
- System test platforms, which emulate the end-to-end network using a combination of simulation of large numbers of end user devices and a few real small and large cells. These can verify/validate correct operation of new features/products, test corner cases such as overload, and establish the results of changing network configuration parameters.
Generally speaking, these systems have been operated independently and share little data. Issues identified in the live network may be reported to the RF planning team, who might use a different system to determine how best to address them. The evaluation of new products and features might be thorough in the lab, yet bear little resemblance to real-world conditions.
Connecting the dots
A trend is emerging which links these together, sharing results and data sets that improve each function directly. One example I saw at Mobile World Congress links ERCOM's Mobipass 4G network validation system with InfoVista's radio planning and performance management tools.
This shares data between the radio planning, performance management and system testbed to improve the performance of all - increasing the quality/accuracy of the testbed so it can more accurately diagnose real-world issues, and also increasing the predicted RF coverage and capacity results of the planning tool.
InfoVista's RF planning tool, Mentum Planet, continues to evolve to support Small Cells. The company recently added call-trace post-processing to its expanding portfolio by acquiring Aexio, an RF network optimisation software company.
The combined solution can geo-locate the traffic sources and link these to the macrocells providing service. Deployment of Small Cells can then be prioritised for hotspots where the macrocells are overloaded, drawing on a database of known potential Small Cell sites.
An example use case: a busy urban district in Japan served by five three-sector macro sites (i.e. 15 sectors) was fed 34 candidate points of presence. The tool recommended 18 new small cells, achieving 44% traffic offload. A key principle is to prioritise macrocell offload rather than the highest traffic hotspots – after all, if the macrocells can already serve the traffic adequately, there may be better places to invest in additional equipment.
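The offload-first principle can be sketched in a few lines. This is a hypothetical greedy selection – the candidate sites, traffic figures and scoring are illustrative inventions, not the tool's actual algorithm:

```python
# Hypothetical sketch of offload-driven small cell site selection.
# Candidate sites and traffic figures are illustrative, not from the article.

def select_small_cells(candidates, budget):
    """Greedily pick candidate sites by macro traffic offloaded.

    candidates: list of (site_id, offloadable_mbps), where offloadable_mbps
    is the overloaded macro-layer traffic a small cell at that site could
    absorb. Hotspots the macros already serve adequately score zero, so
    they are never chosen ahead of genuine offload opportunities.
    """
    ranked = sorted(candidates, key=lambda c: c[1], reverse=True)
    chosen, offloaded = [], 0.0
    for site_id, mbps in ranked[:budget]:
        if mbps <= 0:
            break  # remaining candidates offload nothing useful
        chosen.append(site_id)
        offloaded += mbps
    return chosen, offloaded

# Site "B" sits in a busy spot the macros already handle, so it is skipped.
sites = [("A", 12.0), ("B", 0.0), ("C", 7.5), ("D", 3.2)]
picked, total = select_small_cells(sites, budget=3)
print(picked, total)  # ['A', 'C', 'D'] 22.7
```

A real planning tool would of course weigh candidate sites against predicted coverage, backhaul availability and cost, not a single offload number.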
The propagation models of RF planning tools have been developed and refined over many years and are a critical factor in predicting network performance.
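To give a flavour of what sits underneath such tools, here is the simplest empirical building block: log-distance path loss. The constants and exponent below are illustrative; commercial planning tools layer calibrated clutter, terrain and antenna data on top of models like this:

```python
import math

def path_loss_db(distance_m, freq_mhz=2600.0, exponent=3.5, d0_m=1.0):
    """Log-distance path loss in dB at distance_m from the transmitter.

    Free-space loss (Friis) at a reference distance d0, then a
    log-distance roll-off with an environment exponent: roughly 2 in
    free space, 3-4 in urban clutter. Values here are illustrative.
    """
    # Free-space path loss at d0 (Friis formula, distance in km, freq in MHz)
    fspl_d0 = 32.44 + 20 * math.log10(d0_m / 1000.0) + 20 * math.log10(freq_mhz)
    return fspl_d0 + 10 * exponent * math.log10(distance_m / d0_m)

# Doubling the distance costs 10 * n * log10(2) ~= 10.5 dB at n = 3.5
print(round(path_loss_db(200) - path_loss_db(100), 1))  # 10.5
```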
Simulating the network
ERCOM's LTE network validation system is used in the lab to simulate large numbers of smartphones and mobile users actively generating traffic. This can be connected to real network equipment, macrocells, small cells, core network etc. to replicate an end-to-end mobile network.
Thousands of individual smartphones, each with its own traffic pattern and service usage, are emulated; some move around quickly while others are stationary. They interact with real macrocells and small cells, sending millions of signalling messages including RF measurement reports and driving a variety of traffic through the macrocells. The basestations do their work, directing handovers, allocating data channels and throttling overload conditions.
The compute performance of these systems shouldn't be underestimated: each of the thousands of emulated mobile devices dynamically accounts for changes in the behaviour of the radio channel at 1 millisecond resolution. This takes into account RF signal fading and interference from other devices and eNodeBs, independently for uplink and downlink.
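To illustrate the per-millisecond channel work involved, here is a toy sketch (not ERCOM's implementation): drawing one flat Rayleigh fading sample per 1 ms subframe for a single emulated device, as a stand-in for a full correlated channel model:

```python
import math
import random

random.seed(42)  # reproducible illustrative run

def rayleigh_gain_db():
    """One flat Rayleigh fading sample, returned as power gain in dB.

    Sum of two independent Gaussian components (in-phase and quadrature)
    with unit average power, typical of rich-scattering non-line-of-sight
    links. Average linear gain is 1.0, i.e. 0 dB.
    """
    i = random.gauss(0.0, math.sqrt(0.5))
    q = random.gauss(0.0, math.sqrt(0.5))
    power = i * i + q * q
    return 10 * math.log10(power)

# One second of 1 ms samples for a single emulated device; a testbed
# repeats this for thousands of devices, uplink and downlink separately.
trace = [rayleigh_gain_db() for _ in range(1000)]
mean_linear = sum(10 ** (g / 10) for g in trace) / len(trace)
print(mean_linear)  # hovers around 1.0 (0 dB average gain)
```

A production channel emulator would use correlated fading (e.g. Doppler-shaped taps) rather than independent draws, but the per-subframe sampling rate is the point here.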
The system can be used to test boundary conditions, such as determining behaviour in extreme overload or the response to rogue/misbehaving smartphone devices. By using real eNodeBs, the system reports real-world behaviour of actual products. This makes it ideal for testing new features, new products and/or significant system parameter changes prior to live deployment. I expect this would also be extremely useful for assessing SON features.
Virtual Drive Testing
By importing the network configuration from InfoVista's RF planning tool, the validation system can closely match the real world deployment. ERCOM market their Mobipass system as being able to conduct a "Virtual Drive Test" – similar to using a Flight Simulator rather than flying a real aeroplane. Example use cases involve debugging specific problem areas, such as busy urban street canyons or central plazas. It can be easier to determine what the issue is and assess the impact of configuration changes to resolve problems in the lab system rather than in the live network.
The same tests can be used to benchmark different vendor equipment by repeating the same scenarios. Although every vendor complies with the 3GPP standards, each has its own proprietary implementation and additional features. For example, the data scheduler algorithms used to allocate resources to each user can be very complex. Their behaviour can be characterised on the testbed and the dataset fed back into the RF planning tool to improve predicted performance.
Interworking between performance management, planning and validation systems is a virtuous circle that provides benefits for each of these important capabilities.
The ERCOM/InfoVista partnership is the first I've seen to share data in both directions, enabling better RF planning and more realistic network test emulation.
As we move towards HetNets with a mix of macrocells and small cells from different vendors, masterminded by centralised SON, we'll need these improved planning, monitoring and validation capabilities.