FEATURE STORY / THE DIGITAL EDGE
PUBLISHED 3 JULY 2015
Millions of new users, billions of new devices, gigabytes more bandwidth consumed per device and user, mobile and fixed network content consumption, and fundamental changes in computing architecture are driving robust bandwidth demand – and the importance of cloud data centers “on network” – across the emerging markets corridor stretching from the Middle East across India to the rest of Asia, and from there spanning the globe.
In part, that opportunity exists because Asia and the Middle East are among the fastest-growing markets for Internet-related products. At the same time, the profound change in computing architecture also matters.
In the mainframe and minicomputer eras, computing happened in a glass room, with some demand for wide area communications. All that began to change in the personal computer era, when networked computing fabrics (inside the building, across the campus, across the wide area network) became more important.
In the Internet era, communications across the wide area network are as foundational as communications between a PC and its hard drive and peripherals once were. In other words, content and information are stored primarily remotely, in data centers, not locally on a personal computer hard drive or a local server. In the era of cloud computing, wide area communications and data centers lie at the heart of most computing operations.
That matters for enterprises and end users across South Asia and the emerging markets corridor because, according to Global Industry Analysts, the global market will be driven by developing markets across South Asia and Africa.
“Few, if any, can match our collection of assets across that corridor,” said Bill Barney, Global Cloud Xchange CEO.
And assets now matter. After several eras in which computing operations were increasingly dispersed and decentralized, the era of cloud computing – while supporting remote access for most applications – is built on a new degree of server concentration inside data centers that increasingly feature co-located application provider operations and multiple ways for wide area networks to interconnect.
Enterprise data centers have always been important loci for computing operations. In the cloud era, data centers – especially those hosting content and application provider servers – are essential, representing the source or destination of a majority of global Internet traffic.
It is a truism that Internet traffic now drives global capacity demand, with nearly all of that traffic generated by cloud-based applications and anchored by content sources residing in a relatively few big data centers. In fact, as much as 98 percent of Internet traffic now consists of content stored on servers.
If most global capacity demand once was driven by telephone company central offices, most global traffic now flows to, from and between data centers. The implications are simple and profound: computing now is built on a cloud architecture. In turn, the cloud is built on data centers, which are situated directly on global backbone networks to reduce latency.
In 2013, cloud operations accounted for 54 percent of total data center traffic. By 2018, cloud will account for 76 percent of total data center traffic, according to Cisco. Content demand, cloud storage and cloud computing shape bandwidth demand across the region and the globe.
“That is the foundation of the Global Cloud Xchange strategy,” said Wilfred Kwan, Global Cloud Xchange COO.
Aside from stimulating intra-region communications, demand for content across the emerging markets corridor also involves fetching content from outside the local area or region, driving international bandwidth patterns and raising the importance of cloud content servers, according to the Internet Society. Hence the strategic importance of fast, reliable optical networks that connect data centers directly onto the backbone.
India is key
If you draw a circle that includes China, India and Indonesia, encompassing Japan, Korea and the Philippines, there are more human beings–and therefore potential Internet users–inside that circle than everywhere else on the planet combined, according to the Oxford Internet Institute.
“India is key,” said Barney. “About 500 million people are coming online across India.” And since that demand will rely on enterprise content stores and computing facilities, a good argument can be made that the enterprise segment is a natural underpinning of the global computing architecture.
“Enterprise data centers today increasingly are about support for external and customer-facing operations based on cloud software and resources. Those Internet operations drive enterprise computing, which drives use of data centers, which drives use of global bandwidth,” Barney added.
Where clock speed across a personal computer’s internal bus once determined computing responsiveness and speed, co-located content servers sitting directly on a fast global wide area network now determine performance and end user experience.
The Global Cloud Xchange ecosystem (global bandwidth, data centers and services) has been purpose-built for the new era of cloud computing.
Cloud Changes KPIs
Cloud computing has changed the key performance indicators for any supplier of communications, information or entertainment services to end users, businesses and other organizations globally.
“Cloud computing has become so fundamental that it is driving international bandwidth patterns, growth and performance objectives.”
Quantitatively, the impact of cloud computing on data center traffic is clear. Most Internet traffic has originated or terminated in a data center since 2008, according to Cisco. By 2018, 76 percent of data center traffic will be cloud traffic.
“So the impact of cloud is clear. Internet traffic drives total traffic. Most Internet traffic originates or terminates in a data center. Most data center traffic is cloud traffic. That is why fiber is so important for the cloud ecosystem globally,” said Barney.
For a growing number of enterprises, end users, communications suppliers, data center operators and application providers, revenue and end user experience are dictated by the response of cloud computing facilities and the networks that support them.
When the core cloud data centers and cloud-supporting networks are flexible and fast, app response is fast, devices add lots of value and end users have pleasant experiences.
The reason is that the “data bus” has shifted from something “inside the PC” to the networks connecting data centers with each other and data centers with end users.
So aside from bandwidth, content providers need better latency performance, precisely because the data bus now is the global network connecting data centers with each end user device.
As PC experience once was enhanced when data resided on a fast solid state drive rather than a slower hard disk drive, so end user experience on all devices now is shaped by the speed and elasticity of cloud data center servers and the high speed global networks that support them.
As always, better latency performance hinges on network architecture. For content delivery, both end user experience and provider cost effectiveness dictate placing content as close to the consumer as possible.
But not all content can be cached close to each end user. Much has to be fetched from distant servers.
Increasingly, end user experience of cloud content therefore benefits from data centers located on-network, since latency performance when fetching distant content is better.
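The latency advantage of on-network placement follows directly from propagation delay: light in optical fiber travels at roughly 200,000 km/s, so every 1,000 km of one-way distance adds about 10 ms of round-trip time before any server processing begins. A minimal sketch of that arithmetic (the distances and fiber speed are illustrative assumptions, not measurements from any particular network):

```python
# Illustrative round-trip propagation delay for fetching content.
# Assumes light travels ~200,000 km/s in optical fiber (refractive
# index ~1.5); real paths add routing detours and equipment delay.

FIBER_SPEED_KM_PER_MS = 200.0  # ~200,000 km/s expressed per millisecond

def rtt_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds for a one-way distance."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

# Content cached in an on-network data center ~500 km away
# versus fetched from a server ~10,000 km away.
print(f"Nearby cache:  {rtt_ms(500):.1f} ms")    # 5.0 ms
print(f"Distant fetch: {rtt_ms(10000):.1f} ms")  # 100.0 ms
```

The twenty-fold gap is a floor set by physics, which is why shortening the path – by caching close to users or placing data centers directly on the backbone – matters more than any server-side optimization for distant fetches.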
The ability to distribute workloads also is enhanced when processes can be shared across all resources, elastically, as though all processes were functioning on a single machine, at a single location.
That requires low-latency, high-speed connections and low-latency servers and data centers already located on the network backbone, able to reconfigure resources “on the fly.”
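One way to picture that elastic sharing of work is a pool of workers whose size tracks demand, as a cloud scheduler adds or removes servers. A minimal, hypothetical sketch using Python’s standard thread pool (the task and pool sizes are illustrative only, not any vendor’s API):

```python
from concurrent.futures import ThreadPoolExecutor

def process(task_id: int) -> str:
    # Stand-in for a unit of work a cloud data center might run.
    return f"task-{task_id}-done"

def run_batch(tasks, workers: int) -> list:
    # "Elastic" is simulated by choosing the pool size per batch,
    # the way a real scheduler scales capacity with load.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process, tasks))

small = run_batch(range(3), workers=2)    # light load, few workers
large = run_batch(range(12), workers=8)   # heavier load, more workers
print(small)  # ['task-0-done', 'task-1-done', 'task-2-done']
```

In a single machine the pool shares one memory bus; spread across data centers, the same pattern only behaves “as though on a single machine” when the interconnecting network is fast and low-latency, which is the point of the paragraph above.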