
A brave new world for the telecom data centre


The telecommunications space is evolving. Now more than ever, companies need enormous data-processing capacity and low-latency capabilities, driven by the advent of next-generation technologies like 5G, IoT, augmented reality, and machine learning. Data centres need to be able to handle dizzying amounts of information at breakneck speed.

Consequently, the hardware required to handle this data is going through a radical transformation. This is necessary to meet consumer and business expectations, and these customers are only getting more demanding. Smart cities, smart homes, and industrial automation will shape the future, and many of these technologies require hyperscale cloud services or computing at the edge.

Over the years, telecom companies have invested huge amounts of money in increasing network capacity and data handling capabilities. Now, we are entering a new epoch for telecom data centres. We are exiting the world of proprietary physical devices once and for all and moving towards public and hybrid cloud-based environments.

So what does the future hold for the telecoms data centre, once many players’ most valuable asset? Here, we’ll look at what’s driving this change and how both technological and management solutions for data centres are changing – and why telecoms may be more relevant than ever as computing moves to the edge.

Defining the data centre in telecoms

In theory, there is no difference between a hyperscale data centre run by Google Cloud, AWS, or Azure, a colocation data centre supporting these facilities, and the data centres used by telecoms companies. Data are data; however, data centres operated by telecoms providers have a few requirements that go beyond your average information storage facility.

The primary function of a telecom data centre is to drive content delivery via both mobile and cloud services. These functions require very high connectivity, and typically a telecoms provider will have extra space to offer colocation and managed services to third parties. At face value, this seems like a good investment; it provides the services the company needs while opening up an extra revenue stream.

This business model has been received wisdom in the telecoms space for the past decade. Telecoms retain full data sovereignty, from the hardware to the information itself, and these assets generate supplementary profit. However, perspectives are starting to change. Before we dive into how they’re evolving, we’ll look closer at why.

What’s driving the evolution of the data centre?

As touched on in the introduction, we’re entering a new era of connectivity. Many regions are already transitioning to 5G, which has the capacity to link all sorts of intelligent devices. It achieves this by utilising the radio frequency spectrum more efficiently, sharing capacity among connected devices, increasing throughput, and consuming less energy than previous technologies, all at a lower total cost of ownership (TCO) for telecom operators.

From the user’s point of view, the most significant advantages of 5G are its speed and capacity, which improve the performance of multimedia applications and gaming apps, for example. Looking further ahead, it could enable real-time innovations such as autonomous vehicles and biometric security applications. Ultimately, 5G will bring faster speeds, lower latency and the ability to connect more devices to the network.

Autonomous vehicles are one part of the rapidly expanding Internet of Things (IoT), which is drastically changing consumer expectations and, with them, the business landscape. Connecting medical devices, solar panels, cars, and more means we need control over objects and environments that were previously beyond the reach of the Internet.

Certainly, this will give businesses new opportunities to increase revenue, reduce operating costs and build customer loyalty and satisfaction. Here’s just one illustration: the maker of an autonomous car can warn the driver of a possible failure in advance, based on data collected from the vehicle and analyzed using artificial intelligence and machine learning.

Then there’s augmented reality (AR), which enhances the physical world around us by overlaying digitized visual elements, sound, or other sensory stimuli. For example, an Airbus factory has given its workers smart glasses to help precisely position cabin seats and furnishings on commercial jets. This has resulted in an error rate of zero and a 500% improvement in productivity.

Undoubtedly, there are opportunities, but we need the infrastructure to run these technologies, and many of these solutions demand a move away from traditional data centre architecture. Major European edge network expansion is required to bring latency close to zero – and this is a substantial project. We also need new business models for the edge.

The move from proprietary physical devices into the cloud

This paradigm shift is a major event for telecoms. To build context, it’s useful to sketch a very brief history of the telecom data centre. The traditional data centre was a proprietary, on-site facility used mainly to support internal telecom functions.

These functions were deployed on huge, monolithic servers tightly coupled to the hardware. This set-up was certainly secure, but it tended to be inflexible: there were vendor lock-in issues, and the maintenance, rent, and energy bills associated with running these spaces were astronomical.

Gradually, virtualisation took hold. This ushered in a significant infrastructural transformation, as network functions such as the firewall, the packet core and the MMSC were re-implemented in virtual environments known as virtual machines. The virtualisation of these network functions opened the infrastructure up to third parties, since the functions mostly ran on commercial off-the-shelf hardware.

Undoubtedly, this improved technological and economic efficiency. However, telecoms found these tools difficult to scale and maintain on their own. The growing need for scalability, shorter time to market, and quick roll-outs and upgrades drove the move towards the cloud.

Into the cloud and beyond

The cloud delivers automated scaling, its defining advantage over traditional virtualisation infrastructure. Applications are decomposed into microservice-based architectures, running as cloud-native workloads rather than monolithic virtualised network functions.
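
To make “automated scaling” concrete, here is a minimal sketch of the decision at the heart of a horizontal autoscaler: scale the number of replicas in proportion to observed load. This is illustrative only, not any particular platform’s implementation, and the function name, parameters and thresholds are assumptions:

```python
import math

def desired_replicas(current_replicas: int,
                     observed_utilisation: float,
                     target_utilisation: float,
                     min_replicas: int = 1,
                     max_replicas: int = 20) -> int:
    """Scale the replica count in proportion to observed load.

    Core rule: desired = ceil(current * observed / target),
    clamped to the allowed range. (Hypothetical helper for illustration.)
    """
    if target_utilisation <= 0:
        raise ValueError("target_utilisation must be positive")
    raw = current_replicas * observed_utilisation / target_utilisation
    return max(min_replicas, min(max_replicas, math.ceil(raw)))

# Example: 4 replicas running at 90% CPU against a 60% target scale out to 6.
print(desired_replicas(current_replicas=4,
                       observed_utilisation=0.9,
                       target_utilisation=0.6))
```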

Thus far, telecoms have addressed their infrastructure needs by designing and managing their own data centres, using a private cloud for their network and other non-core operations. Building on this, many telecom operators attempted to diversify into cloud service provision – similar to the earlier colocation/managed services business model.

However, public cloud providers make this a tough market. Hyperscale operations like Google Cloud, AWS and Azure have made it difficult for telecom players to hold their own. We’re seeing the effects in the market: last year, France’s Orange signed a far-reaching deal with Google Cloud. Michael Trabbia, Orange’s chief technology officer, said in a recent interview: “We cannot pretend, because we don’t have the scale to redo what they have been able to build with worldwide scale.”

The great data centre sell-off wave

This is just one example of a larger trend. Many telecoms players aren’t just signing deals; they’re selling off assets. In 2017, Verizon announced it was selling off 24 of its data centres to Equinix for $3.6 billion. In May 2019, Telefónica sold 11 data centres to Asterion Industrial Partners for $600 million. The trend has continued: in October of last year, Mexican telecom giant Axtel sold three of its relatively new data centres to Equinix for $175 million. And this is but a handful of examples.

Ultimately, this is because telecoms companies know they can’t fight the battle on both fronts. Although colocation is a healthy industry – the market is expected to grow from $31.5 billion in 2017 to $62.3 billion by 2022 – there’s too much pressure from dedicated data centre providers such as Equinix. With telecoms itself becoming an increasingly competitive space, many of the big players have decided they’re better off focusing on their core business.

On top of this, the ongoing 5G rollout is projected to have cost more than $2.7 trillion by the end of 2020, and analysts at Greensill expect the process to account for another $1.1 trillion in expenditure before the implementation is “finished”. Despite these dizzying sums, top industry insiders expect the 5G rollout to drive higher returns and better shareholder value than data centre assets.

Moreover, not all is lost; public cloud players have a lot to offer telecoms. Hyperscalers can provide the expertise and agility that telecoms need to meet customer expectations, and they can facilitate better IT services, enhancing internal operations. But partnering doesn’t have to mean surrendering all data sovereignty: hybrid cloud options present a compelling solution.

Leveraging hybrid cloud infrastructure

At the end of the day, public cloud players provide scalability, lower TCO, and the quick roll-out of services. Many telecom players will find that the public cloud offers more than it takes away; in many ways, telecoms should embrace it rather than see it as a threat to their business. Big telecom players such as Orange have already brokered partnerships with hyperscalers to provide a better service to the end consumer through analytics, off-the-shelf solutions, and high-speed development.

This is hugely beneficial as telecoms see traffic volumes skyrocket with the advent of new services like streaming on over-the-top (OTT) platforms, which come under even more strain during major sporting events, festivals and the like. Telecom players started to collaborate with public cloud providers because the economies of scale on offer were too good to refuse.

To maintain a degree of data sovereignty, many telecom companies have adopted distributed and hybrid cloud models. Non-core applications such as business support systems (BSS), operations support systems (OSS), and billing and revenue management have been moved to public clouds, while core network operations remain in private clouds for enhanced security and control.

Hybrid cloud solutions help modernize on-premises data centres; essentially, these facilities are becoming smaller, easier to maintain and more cost-efficient. Armed with technologies like the TM Forum’s Open Digital Architecture and Red Hat’s OpenShift, the telecom data centre is becoming more agile. These tools integrate access to private and public cloud environments, easing workload migration between the two.
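
As a simplified illustration of that split, the sketch below encodes a placement policy that sends non-core workloads to a public cloud and keeps core network functions on a private cloud. The workload names and categories are hypothetical, not any operator’s actual catalogue:

```python
from enum import Enum

class CloudTarget(Enum):
    PUBLIC = "public cloud"
    PRIVATE = "private cloud"

# Hypothetical policy: non-core workloads go to the public cloud,
# core network functions stay on the private cloud for security and control.
PLACEMENT_POLICY = {
    "bss": CloudTarget.PUBLIC,
    "oss": CloudTarget.PUBLIC,
    "billing": CloudTarget.PUBLIC,
    "revenue_management": CloudTarget.PUBLIC,
    "packet_core": CloudTarget.PRIVATE,
    "subscriber_database": CloudTarget.PRIVATE,
}

def place_workload(name: str) -> CloudTarget:
    """Default to the private cloud for anything not explicitly listed."""
    return PLACEMENT_POLICY.get(name, CloudTarget.PRIVATE)

for workload in ("billing", "packet_core", "unknown_service"):
    print(f"{workload} -> {place_workload(workload).value}")
```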

The future of the telecoms data centre

So where are we heading? As with many aspects of IT, we seem to be moving towards an as-a-service model, where telecoms companies are turning to public cloud providers. In some cases, they’re even soliciting the services of the very same companies that bought out their data centres. This isn’t necessarily a bad thing; more often than not, outsourcing maintenance is far cheaper than owning and operating your own infrastructure.

However, telecoms companies may once again have a greater stake in the game as network operations shift towards the edge. Public cloud providers may be the gatekeepers of compute power, but telecoms hold the expertise in distributed network infrastructure. Micro data centres are increasingly popping up – and telecoms certainly know a thing or two about distributed coverage.

After all, there are few places in the world now where you can’t get a mobile phone signal. That is the result of decades of work by telecoms companies to provide connectivity across regions. Whether it’s negotiating the architecture of built-up, densely populated urban environments or improving infrastructure in rural areas, telecoms have long experience with distributed networks. Alongside hyperscalers, there is a golden opportunity to exploit these existing networks to roll out micro data centres and computing at the edge.

The wheels are already in motion: according to a recent press release from Google, Google Cloud and Nokia will partner to develop cloud-native solutions for communications service providers and enterprise customers. Deals like these will usher in a new epoch of digital transformation. The key is collaboration.
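
To make the edge opportunity concrete, here is a minimal, purely illustrative sketch of latency-aware routing: a request goes to the closest micro data centre that meets the application’s latency budget. The site names and round-trip times are invented for the example:

```python
# Illustrative only: site names and round-trip times (ms) are invented.
EDGE_SITES = {
    "central-cloud-region": 45.0,
    "metro-edge-paris": 8.0,
    "cell-site-edge-lyon": 3.5,
}

def pick_site(measured_rtt_ms: dict, latency_budget_ms: float) -> str:
    """Choose the lowest-latency site that fits the application's budget."""
    eligible = {site: rtt for site, rtt in measured_rtt_ms.items()
                if rtt <= latency_budget_ms}
    if not eligible:
        raise RuntimeError("no site satisfies the latency budget")
    return min(eligible, key=eligible.get)

# An AR application with a 10 ms budget lands on a nearby edge site.
print(pick_site(EDGE_SITES, latency_budget_ms=10.0))
```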

Eduardo is a technology professional with more than 30 years of experience in several multinational companies in the telecom and banking industries.

With a career spanning two continents and technology operations managed in more than 10 countries, he has led many successful transformation initiatives: digitizing processes, consolidating data centres, improving operational capabilities, creating new revenue streams, and enhancing customer satisfaction.
