Steve Saunders, Founder and CEO of LightReading, published a provocative article about the current state of network virtualisation. He believes the industry has abandoned its proven standardisation process, resulting in confusion rather than the promised benefits. Will his approach solve the problem and get the industry back on track?
Steve Saunders has had a long career in media coverage of the telecom industry, founding and later selling the LightReading business for $33 million in 2005, only to buy it back in 2014. He's not afraid to be outspoken, and he was highly critical of the industry's lovefest with virtualisation in his opening keynote speech at the LightReading BCE conference.
His arguments are worth a read whether you agree or disagree, as are the audience comments.
The promised benefits of Virtualisation
Enterprise and public data centres have clearly benefitted from several industry initiatives in recent years, establishing huge server capacity in the Cloud. Amazon, Facebook, Microsoft and Google have all been able to achieve very low cost by commoditising hardware, using open source platforms and automating process management, while hosting a huge variety of operating systems and applications. Large enterprise businesses have adopted a similar strategy for their own internal data centres, also claiming significant cost and flexibility benefits. So it's not surprising that telecom operators want to copy much of that winning formula.
The first “big wins” of virtualisation have been to package up and run core network elements on standard hardware, removing the need for function-specific hardware and thus simplifying capacity allocation and growth in standard data centres.
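As a toy illustration of that idea (not any real NFV platform; all names here are made up for the sketch), network functions become interchangeable software deployed onto a pool of identical commodity servers, so growing capacity means adding another generic host rather than ordering bespoke hardware per function:

```python
# Toy sketch: network elements packaged as software ("VNFs") running on
# generic commodity servers, instead of one bespoke appliance per function.

class VirtualNetworkFunction:
    """A network element (e.g. a firewall or gateway) packaged as software."""
    def __init__(self, name, vendor):
        self.name = name
        self.vendor = vendor

class CommodityServer:
    """A standard x86 host in the data centre; can run any VNF."""
    def __init__(self, host_id, capacity=4):
        self.host_id = host_id
        self.capacity = capacity   # illustrative slot count
        self.vnfs = []

    def deploy(self, vnf):
        if len(self.vnfs) >= self.capacity:
            raise RuntimeError(f"{self.host_id} is full")
        self.vnfs.append(vnf)

# Capacity growth becomes "add another identical server", and any
# function can land on any host.
pool = [CommodityServer(f"host-{i}") for i in range(2)]
pool[0].deploy(VirtualNetworkFunction("firewall", "VendorA"))
pool[0].deploy(VirtualNetworkFunction("epc-gateway", "VendorB"))

print([v.name for v in pool[0].vnfs])  # ['firewall', 'epc-gateway']
```

The point of the sketch is only that allocation decisions move into software; real NFV orchestration adds lifecycle management, monitoring and much more on top.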
While many of the large vendors have packaged up their existing software in this way, that alone doesn't make a huge difference. A promised benefit was that this would open up the market and enable new vendors to supply software components. However, the same proprietary interfaces remain, the software still operates the same way, and it remains difficult to integrate other vendors' products.
Three Guilty Parties
Steve argues that the open source community lacks the discipline or control for focussed innovation. His view seems to be that open source projects are developed either in areas of huge end-user demand or where funding is available. The mission-critical quality control that telcos demand might be available in some areas (e.g. the Linux operating system) but perhaps less so in some of the more specific applications.
Enterprise vendors have been building enterprise clouds for ages but have had difficulties building the “Telco Cloud”. He points to a large failed virtualisation project for Telefonica in 2015.
Media and industry analysts got super excited by a story, spun by the open source crowd and the enterprise vendors, that was impossibly optimistic and wildly inaccurate. They had said that networks don't matter, that all the money is in apps, and that DevOps rules. That isn't necessarily proving to be the case.
An alternative initiative
Steve created a new organisation in 2015 called the NIA (New IP Agency), which appears to be a subsidiary of LightReading with him as the primary contact. The website lists 24 members to date, including some major vendors. The site is a little out of date, with a four-phase interoperability test schedule set out up to the end of 2016 but few updates over the past year. It's unclear whether the initiative has run out of momentum.
If it were easy to fix the problem, I'm sure it would have been done a long time ago.
Telco Survey Results
LightReading surveyed 150 CSPs to uncover the general perception of Virtualisation, Opensource and Digital Transformation.
As you can see from the figures above, the general trend is to be quite heavily engaged in virtualisation, and about 50/50 for open source. Steve positions himself in the 1.3% that thinks open source is actually bad for telecom. I'd suggest there are definitely some areas (e.g. the Linux operating system) which are clearly helpful and provide benefits, but it's certainly not a panacea.
More important for me was the last survey question, which rated process automation above all other aspects. I don't believe this is just about reducing the manpower needed to manage and operate networks; it also covers the ease of introducing new vendors, scaling up deployment and reconfiguration (yes, including for small cells), and improving overall service quality.
So for me, I'd also question how the industry expects to capture big wins from virtualisation, and would say it has become quite a distraction. From my perspective, the biggest take-away is the importance of process automation. This should be evolved in such a way that new vendors, of both software and hardware, can be introduced into existing networks. That will need standardisation and widespread adoption by both platform and application vendors throughout the industry.
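The standardisation argument can be sketched in a few lines of code. This is a toy example under an obvious assumption: if every vendor implements the same agreed interface, the operator's automation tooling never needs vendor-specific integration work, and suppliers become swappable. None of the names below correspond to a real telecom API.

```python
# Toy sketch of standardised interfaces enabling process automation.
# "PacketFilter" is a made-up stand-in for an agreed industry interface.
from abc import ABC, abstractmethod

class PacketFilter(ABC):
    """The standard interface every vendor implementation must satisfy."""
    @abstractmethod
    def allow(self, packet: dict) -> bool: ...

class VendorAFilter(PacketFilter):
    def allow(self, packet):
        return packet.get("port") != 23          # blocks telnet

class VendorBFilter(PacketFilter):
    def allow(self, packet):
        return packet.get("port") not in {23, 2323}

def automated_rollout(filter_impl: PacketFilter, traffic):
    """Automation code depends only on the standard interface,
    never on which vendor supplied the implementation."""
    return [p for p in traffic if filter_impl.allow(p)]

traffic = [{"port": 80}, {"port": 23}]
# Swapping VendorA for VendorB requires no change to the automation.
assert automated_rollout(VendorAFilter(), traffic) == [{"port": 80}]
assert automated_rollout(VendorBFilter(), traffic) == [{"port": 80}]
```

Without the shared interface, each vendor swap in this sketch would mean rewriting `automated_rollout`, which is essentially the integration burden the article complains about.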