Interconnected Data Centers’ Role in the Rise of Edge Computing

Edge computing – storing and processing data as close to the end user as possible – may be less well-known than the ubiquitous cloud, but it is perhaps today’s most important emerging enterprise technology. In this blog we’ll explain the technology trends driving the shift to the edge, and take a look at the role interconnected data centers will play in a future dominated by edge computing.

Slow is the new down

For a number of years, large content networks and other data-centric organizations like Facebook, Netflix and Google have been building data center networks that bring their resources closer to end users. This is a basic example of edge computing, and it has helped latency-sensitive services like content streaming and connected gaming become key features of today’s tech landscape.

Where the latency-sensitive technologies of 10 to 15 years ago were streaming and cloud services, today’s low-latency requirements are far more extreme – and the stakes much higher. Consider some of the best-known emerging technologies:

● Self-driving cars
● Industrial IoT
● Smart speakers
● Augmented and virtual reality
● Artificial intelligence and machine learning

What do the above have in common? They rely on extremely fast data transmission and processing speeds. The latency requirements for a self-driving car dwarf those of a Netflix or Spotify customer, and the margin for error is close to non-existent.
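To see why proximity matters so much for these latency budgets, it helps to look at the physics: even in the best case, a signal in optical fiber travels at roughly two-thirds the speed of light, or about 200 km per millisecond. The figures below are a minimal illustrative sketch of that propagation floor, not measurements of any real network:

```python
# Best-case round-trip propagation delay over optical fiber.
# Assumes ~200,000 km/s signal speed (about 2/3 of c) and ignores
# routing, queuing and processing delays, which only add to the total.

FIBER_KM_PER_MS = 200.0  # roughly 200 km of fiber per millisecond, one way

def round_trip_ms(distance_km: float) -> float:
    """Lower bound on round-trip time, in milliseconds, for a given distance."""
    return 2 * distance_km / FIBER_KM_PER_MS

# A request served from 1,000 km away spends at least 10 ms in transit
# before any processing happens; a facility 10 km away cuts that to 0.1 ms.
for km in (10, 100, 1000):
    print(f"{km:>5} km -> {round_trip_ms(km):.1f} ms round trip (best case)")
```

However fast the radio link or the processor, no amount of engineering removes this distance penalty – which is why moving compute closer to the user is the only way to shrink it.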

To keep up with the demand for speed, major telcos are rolling out 5G networks that promise speeds up to 10x faster than 4G, allowing huge quantities of real-time data to be rapidly transmitted back to computing locations.

But 5G is only one part of the equation – the massive amount of data that, say, a network of self-driving cars will generate must not only be transmitted but also processed faster than ever. That’s where edge computing comes in – and 5G makes it a necessity.

Interconnected data centers power the edge

Edge computing can take place anywhere processing power sits in close proximity to an end user or sensor, and a wide range of technologies can accomplish the task: micro data centers, public cloud services, mesh computing and many others can bring computational power closer to end users.

What makes interconnected data centers central to the edge of the future is that they provide both proximity to end users and proximity to the network “core,” where the global internet can be accessed quickly. They are more than edge facilities, but they can serve as edge facilities.

Interconnected edge data centers and colocation facilities sit close to end users and provide direct access to multiple disparate long-haul and metro carriers as well as direct cloud onramps, supplementing existing IT delivery options such as on-premises data centers, private clouds and public clouds.

These best-of-both-worlds facilities are critical to a future that not only includes CDNs and hybrid clouds, but self-driving cars and augmented reality. To learn more about how a Netrality interconnected data center can help support your future technology needs, reach out to our team.