- Published on Thursday, 03 January 2013 17:32
- Written by David Chambers
Pricing plans for mobile network use can seem very complex and convoluted. Even the simplest voice and text bundles seem to have a wide range of exceptions and exclusions that can catch out the unwary.
The subject can be highly controversial, with operators wanting to maximise revenues, avoid network capacity overload yet ensure that customers perceive good value.
Is there an easy way to introduce clearly defined levels of service? Should these be specific to Wi-Fi use, and does it matter whether the Wi-Fi is incorporated into small cells?
Today's mostly single dimensional mobile data tariff plans
On the whole, data tariffs tend to be more straightforward and are based purely on volume. The introduction of 4G has seen some higher pricing for the faster speeds, such as separate allowances for 3G and 4G data, but most customers are paying per gigabyte. A few operators still offer unlimited tariff plans, though most have withdrawn them. Those that do may implement traffic shaping, which throttles back performance for the small number of really heavy users, avoiding the situation where 3% of users consume 30% or more of network resources.
There are many ways to address the growing demand for wireless data traffic: increasing capacity, offloading to Wi-Fi, traffic shaping and billing tariffs. Data caps, which limit the worst abuses by small numbers of subscribers, have been one of the most effective tools used to date. Traffic shaping, which prioritises different services according to their needs, has quietly been adopted and implemented by most carriers. In general, this leads to an improved service for all concerned, and avoids the worst abuses, but it has its own limits.
Slowing down data rates may not increase capacity
Data is mostly sold as a one-dimensional product, measured by data volume alone.
In some cases, end user speeds are limited, but depending on how this is implemented it may not release significant additional network capacity. The higher RF modulation schemes used to achieve the fastest data rates between the user's device and the nearest cell site can be reduced to force slower speeds, but this doesn't free up (significant) capacity for use elsewhere. It's better to run the RF network as fast as possible and throttle bandwidth elsewhere, typically in the core network, which can discriminate between users on different tariff plans and/or those who have exceeded their regular data allowance.
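The core-network approach can be sketched as a simple policy function. This is purely illustrative — the function name, rates and thresholds are hypothetical, not any operator's actual implementation:

```python
def core_throttle_kbps(used_mb: float, allowance_mb: float,
                       full_rate_kbps: int = 20_000,
                       throttled_kbps: int = 384) -> int:
    """Pick a downlink rate cap applied in the core network, not at the RF layer.

    The radio link still runs at the highest modulation it can; only the
    bandwidth allocated to this subscriber is reduced once their tariff
    allowance is exhausted. All numbers here are illustrative.
    """
    if used_mb <= allowance_mb:
        return full_rate_kbps   # within plan: no shaping
    return throttled_kbps       # over allowance: throttle, don't block

print(core_throttle_kbps(800, 1024))    # within a 1 GB allowance: full rate
print(core_throttle_kbps(1500, 1024))   # over allowance: throttled
```

The key design point is that the decision uses per-subscriber tariff state, which only the core network can see — the radio layer knows nothing about billing.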
Some network operators use the higher data rates to attract and retain customers, rather than selling the higher speeds at a premium. This helps reduce customer complaints where those higher data rates aren't available. It's been noticeable that LTE operators have been relatively conservative with the advertised speeds quoted for typical usage. For example, Verizon Wireless quote 15Mbps average nationwide download speed rather than the theoretically achievable peak speeds of 80 to 100Mbps.
Some data is less urgent
When I consider the types of data service used, some data is clearly urgent and needed quickly. Emails, instant messages and even Facebook status updates might fall into that class. It's typically where there is interaction with others – real-time communication – that is most important. Some types of service such as video streaming also place real-time demands on network capacity.
Other updates are less urgent and could easily take a day or more to get through. These include uploads of photos taken, synchronisation of Dropbox/cloud folders, software/app updates and libraries of reference documents. As long as the data update is handled securely, doesn't completely drain the battery and gets there in the end, I don't really mind how long it takes.
As an end user, it's more important to avoid nasty surprises when I receive my next monthly bill. Like many, I'd prefer to have a fixed cost for my mobile communication service and work within those limits. But I have little idea of exactly how much data I'm consuming each month and/or through which methods.
Some data transactions are much larger than others
Video traffic is commonly pointed out as taking up the lion's share of network capacity these days. With portable devices having retina displays that can do justice to HD, the potential demand for substantial capacity continues to grow.
But I'm also seeing a general trend in data size growth driven by faster internet connections. Webpages, presentations, apps and OS system updates (700 Mbytes for an iPhone) all seem to be on the increase.
Is two tier data delivery already in place?
For me today, this distinction occurs when docking my iPhone or iPad. Podcasts, photos and software updates are synchronised automatically. More recently, this happens automatically over Wi-Fi when the device is at home and connected to a power source.
Email clients can be configured not to download large attachments unless specifically requested. Ideal when roaming, where message headers can be read on the go but the larger accompanying files can be deferred until connected at a local Wi-Fi hotspot.
So without thinking about it, we are already conditioned to handle some types of data update more cost effectively, downloading/uploading when at home.
Other types of networks are different
Tariff plans for wireline broadband data often include a data cap these days, but the small print in some contracts excludes traffic at off peak hours, such as midnight to 6am, when you really can use as much as you want, including P2P file sharing.
Electricity users will be familiar with off-peak rates, where electric storage heaters are powered during the night and release warmth during the day.
What we haven't seen yet is these off-peak tariff plans becoming more visible to mobile users.
Is Wi-Fi offload the "off-peak" service?
Many smartphone and tablet users head for coffee shops and internet cafes where they can access broadband data more cheaply than by cellular. In some ways, this can be seen as the cheaper, slightly less convenient, off-peak service.
In some cases, the speed of the Wi-Fi service may be better than that of the cellular network. This makes watching videos and other high speed services more attractive.
In my experience though, the actual speed and connectivity of Wi-Fi in public areas can vary widely. You often don't know what the service will be like until you are using it. Competing providers offer Wi-Fi for free in many places. This makes it more difficult to charge consistently for.
WiGig, soon to offer Gigabit speeds across very short distances, will emphasise the use of short term/high speed data hotspots for synchronisation, as we discussed in a previous article.
Could mobile operators introduce an off-peak tariff?
This concept isn't new. Telefonica O2 UK used to count Wi-Fi access at 25% of the cost of cellular data, so that 4Mbytes of Wi-Fi cost the same as 1Mbyte of 3G. This may be a simpler way of communicating the price to end users, who recognise they will get more data for their money if they access Wi-Fi where possible.
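The O2 scheme above reduces to a simple weighted sum when computing billable usage. A sketch, assuming a hypothetical `billable_mb` helper:

```python
def billable_mb(cellular_mb: float, wifi_mb: float,
                wifi_weight: float = 0.25) -> float:
    """Count Wi-Fi traffic at a fraction of the cellular rate.

    With wifi_weight = 0.25 (the old O2 UK scheme described above),
    4 MB over Wi-Fi counts the same against the allowance as 1 MB over 3G.
    """
    return cellular_mb + wifi_weight * wifi_mb

print(billable_mb(0, 4))      # 4 MB of Wi-Fi bills as 1 MB
print(billable_mb(100, 400))  # mixed usage: 100 + 0.25 * 400 = 200 MB
```

The appeal is exactly what the article notes: a single number that end users can reason about, rather than a separate allowance per access technology.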
More recently O2 has offered Wi-Fi completely free in some parts of London, even for those on competing mobile networks.
Many other operators include access to public Wi-Fi services as part of their standard data plan, although it varies as to whether usage counts towards a monthly data cap or not.
A key advantage is that this is simple for the end user to understand and for application developers to cater for. More complex schemes, as suggested by other analysts, may be too ambitious at this stage.
Perhaps the future will differentiate between best effort and premium quality, regardless of whether Wi-Fi is being used
Mobile operators are working hard to adopt Wi-Fi and add control to the quality of service it delivers. It's attractive because of low cost (due to mass market takeup) and zero cost of spectrum. The short range of its signal (which is very low power) and variable quality of service (because spectrum is shared and usage varies widely) can be disadvantages.
One of the major initiatives between the Small Cell Forum and Wireless Broadband Alliance this year is to provide more seamless access to Service Provider Wi-Fi from mobile phones and control/manage the end user Quality of Experience.
This might lead to a situation where end users may be switched between Wi-Fi, 3G and LTE to make best use of spectrum, handset capabilities and available small cells. You may not be aware of which radio technology you are using at any given time. Wi-Fi may not be associated with the poorer rate of service delivery in specific situations.
Even so, it may still be worthwhile for operators to implement a two-tier data service to discriminate between competing traffic. The "best effort" and "premium" services could be delivered by any or all available technologies, and it may not necessarily be Wi-Fi that carries the lower cost service.
With more small cells now including Wi-Fi as an option, the longer term commercial tariff implications also need to be well thought through.