Edge to Core Networking is Critical for IoT Devices, AI and Automation

The Flexential Lone Mountain data center near Las Vegas. (Photo: Flexential)
In this edition of Voices of the Industry, Jason Carolan, Chief Innovation Officer at Flexential, explores why networking from the edge to the core (and back to the edge) will be critical for IoT, AI, automation and other data-intensive applications.
Jason Carolan, Chief Innovation Officer, Flexential
Edge compute and edge networking are gaining rapid traction. The emergence of millions of IoT sensors and devices is producing enormous amounts of critical data, forcing businesses to move heuristic services closer to where the data is being generated. The ability to capture more data is being enabled by technology evolutions such as 5G, Wi-Fi 6 and the expansion of national fiber networks.
Workloads are becoming more distributed to support new-generation apps and the broader use of Machine Learning and AI to help make real-time decisions. Ultimately, these services connect back to larger-scale data centers, but the immediate data capture and connectivity happen very near the end sensor or application. According to IDC, by 2023 more than 50% of new enterprise IT infrastructure deployed will be at the edge. Data can be lost forever at the edge, making a solid architecture for edge technologies an important consideration.
The edge is critical for locally managing immediate data and controls, but delivery from the edge to the core (and back to the edge) will be just as critical. Network connectivity will need to scale to levels that were inconceivable three to five years ago. Moving analytical data and content back and forth between large core data centers and the near edge or far edge will require backbone capacity of 100 Gbps or more, if not Tbps.
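To make that scale concrete, a rough back-of-envelope calculation helps; the device count and per-device rate below are illustrative assumptions, not figures from the article.

```python
# Back-of-envelope estimate of aggregate edge traffic headed toward the core.
# The device count and per-device rate are illustrative assumptions.

devices = 2_000_000        # IoT sensors reporting through one regional edge tier
avg_rate_kbps = 50         # average upstream rate per device, in kilobits per second

aggregate_bps = devices * avg_rate_kbps * 1_000
print(f"Aggregate ingest: {aggregate_bps / 1e9:.0f} Gbps")   # -> Aggregate ingest: 100 Gbps

# Add a few thousand HD video feeds (~5 Mbps each) or several such edge tiers
# feeding the same core, and the backbone requirement moves toward Tbps.
```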
That is more difficult than it sounds. In fact, it is genuinely complicated and confusing for the average enterprise. Add the challenge of getting the data from the edge to the core and back again, and you find that a complex ecosystem of multiple players is required to make an edge deployment successful. There is also the question of how latency impacts applications: what is close enough? And if only 2-3% of edge data needs to be stored, how do you decide which data is important?
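As a sketch of how that "which data is important" decision might look in practice, an edge node can forward only readings that breach an alarm level or change meaningfully, and summarize or discard the rest. The function names and thresholds below are hypothetical, not part of any vendor's API.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float

def should_forward(reading: Reading, last_sent: dict,
                   min_delta: float = 0.5, alarm_level: float = 90.0) -> bool:
    """Decide whether a reading belongs in the small fraction sent to the core."""
    if reading.value >= alarm_level:
        return True                                  # always forward alarm conditions
    previous = last_sent.get(reading.sensor_id)
    if previous is None or abs(reading.value - previous) >= min_delta:
        return True                                  # forward meaningful changes
    return False                                     # steady-state data stays at the edge

# Example: only the first and third readings are sent upstream.
last_sent = {}
for r in [Reading("temp-01", 21.0), Reading("temp-01", 21.1), Reading("temp-01", 95.2)]:
    if should_forward(r, last_sent):
        last_sent[r.sensor_id] = r.value
        print("forward", r)
```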
Emerging edge applications are very latency sensitive, and these decisions need to be made locally to save time: for example, turning off manufacturing equipment because of a safety issue that could injure an employee, or processing audio data within a city to detect a gunshot and vector video cameras and emergency responders to the scene quickly, gathering critical data as it happens. Or perhaps a manufacturing or logistics company is building a local, cost-effective 5G network to enable its IoT services without the additional expense of paying national public 5G providers.
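A minimal sketch of that local-first pattern, assuming a hypothetical vibration sensor and an arbitrary safety threshold: the shutoff happens at the edge without waiting on a round trip to the core, and the event record is queued for asynchronous upload.

```python
import queue

core_upload_queue: queue.Queue = queue.Queue()   # drained later by a background uploader
VIBRATION_LIMIT_MM_S = 12.0                      # illustrative safety threshold, not a real spec

def on_sensor_sample(machine_id: str, vibration_mm_s: float, stop_machine) -> None:
    """Handle one sample locally; a safety action never waits on the core."""
    if vibration_mm_s > VIBRATION_LIMIT_MM_S:
        stop_machine(machine_id)                 # local, millisecond-scale decision path
        core_upload_queue.put({                  # the event record travels to the core afterward
            "machine": machine_id,
            "vibration_mm_s": vibration_mm_s,
            "action": "emergency_stop",
        })

# Example wiring with a stand-in shutoff function.
def fake_stop(machine_id: str) -> None:
    print(f"STOP issued to {machine_id}")

on_sensor_sample("press-07", 14.3, fake_stop)
print(core_upload_queue.get_nowait())
```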
In addition to those complexities, edge data centers are not meant to be the final solution, only the latest part of it. A robust connection back to a large, high-quality, highly scalable facility that can support high-density services, whether a core data center, hyperscaler or carrier hotel, is critical for the bi-directional movement of data between the edge and the core. Also up for consideration is how edge data centers must be distributed in order to be most efficient at collecting and managing data at the edge. Many edge apps will ultimately need AI or Machine Learning to process that data, at the edge and beyond. Efficient edge apps will be trained using massive amounts of data (the definition of "data gravity") and compute horsepower, housed only in large-scale, highly dense, network-capable facilities such as core and regional data centers. This helps sort and define what is, and is not, important at the edge.
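One hedged illustration of that split, using scikit-learn's IsolationForest purely as a stand-in (the article does not prescribe a framework): a model is trained where the pooled data and compute live, then the much smaller fitted artifact is shipped to edge sites, where it scores local data and decides what to forward.

```python
# Core side: train where the pooled data and compute horsepower live.
import pickle
import numpy as np
from sklearn.ensemble import IsolationForest

history = np.random.default_rng(0).normal(size=(100_000, 4))   # stand-in for pooled sensor history
model = IsolationForest(random_state=0).fit(history)
model_blob = pickle.dumps(model)                               # small artifact shipped to edge sites

# Edge side: load the trained model and keep only the records it flags as unusual.
edge_model = pickle.loads(model_blob)
new_samples = np.random.default_rng(1).normal(size=(1_000, 4))
keep = edge_model.predict(new_samples) == -1                   # -1 marks anomalies worth forwarding
print(f"Forwarding {keep.sum()} of {len(new_samples)} edge records to the core")
```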
Because the edge is so disparate, partnerships and collaboration are key enablers for edge compute and edge networking. This isn't something one company or provider can tackle alone; delivering a holistic solution takes a combination of multiple partnerships. There will need to be an open ecosystem that allows providers to work with technology platforms, other data centers and service providers to make it easy for enterprises and governments to deploy edge services.
It goes without saying that large colocation data centers will not be built in every Tier II or Tier III market, due to costs and local demand. Instead, smaller edge data centers with substantial power capabilities are emerging to drive edge deployments, relying on federated communication back to larger colocation environments via content delivery networks (CDNs).
Flexible, dynamic bandwidth will be required to absorb large bursts of traffic from the edge and to accommodate unforeseen growth trends, such as demand from the gaming, precision agriculture and automotive industries. Connectivity to hyperscalers, efficient subsea cables (for international connectivity) and robust metropolitan regions all work hand in hand with edge computing and core connectivity.
Jason Carolan is Chief Innovation Officer at Flexential. Flexential provides the managed network services, as well as the relationships, to make secure and scalable edge deployments easier. The FlexAnywhere™ platform provides access to other edge sites and cloud locations, as well as data centers across the world. Capturing more data at the edge creates more value within Flexential's ecosystem of partners and services, building a flywheel that draws in more workloads, all connected by FlexAnywhere™ to keep deployments easy and secure, regardless of where the edge "is."

Our Voices of the Industry feature showcases guest articles on thought leadership from sponsors of Data Center Frontier. For more information, see our Voices of the Industry description and guidelines.