Empowering the Edge: Dynamic, Disruptive Trends are Shaping the Digital World

September 16, 2020

Before COVID-19, and continuing through it, the Internet of Things (IoT), Machine Learning (ML) and Artificial Intelligence (AI) were the key drivers of digital transformation, commanding the lion's share of IT budgets. IoT refers to interconnected devices riding on your network or the internet, and analysts predict there will be tens of billions of them within a few short years. Imagine smart cities whose traffic lights and directional guidance route traffic to reduce congestion or improve air quality. Or devices in large-scale agriculture recording soil moisture and nitrogen levels, helping growers determine when to plant and when to harvest for optimal yields.

5G-enabled wireless networks accelerate this trend, allowing more and more devices to connect to the network. But what do you do with all this stranded yet useful data that exists outside the centralized data center? How do you access these data lakes? And more importantly, how do you harness the value of this data, and where does your processing need to be in relation to it? Is a centralized cloud strategy best, or is a decentralized, edge-computing or micro-cloud strategy preferable?
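As a rough illustration of why processing placement matters (the distances and labels below are hypothetical examples, not figures from this article), a back-of-the-envelope latency budget shows the physical floor that no amount of bandwidth can remove: light in optical fiber covers roughly 200 km per millisecond, so round-trip delay grows directly with distance from the data source.

```python
# Back-of-the-envelope latency budget: edge site vs. distant central cloud.
# Illustrative only; real networks add routing, queuing and serialization
# delay on top of this best-case propagation time.

FIBER_SPEED_KM_PER_MS = 200.0  # light in fiber travels ~200 km per millisecond


def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay over fiber, in milliseconds."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS


# Hypothetical deployment options for a latency-sensitive application.
for label, km in [("edge site (50 km)", 50),
                  ("regional cloud (800 km)", 800),
                  ("distant cloud (4000 km)", 4000)]:
    print(f"{label}: at least {round_trip_ms(km):.1f} ms round trip")
```

Even before any processing time, the distant cloud in this sketch carries a 40 ms round-trip floor, while the edge site stays under a millisecond, which is the gap that decides where real-time workloads must live.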

IT organizations are increasingly looking at cloud journeys as a multi-stage trip, moving applications out of the data center to enjoy economies of scale with hosted colocation providers. Those providers must now act as cloud consultants, helping clients determine which apps can be re-platformed for public clouds versus hosted private clouds, and providing high-speed, low-latency connectivity between those cloud nodes. Many organizations are working through these same decision trees and choosing to focus their resources on application development rather than infrastructure support.

Edge computing brings its own challenges. How do you gain relevant access to, and make good use of, data created at the core, the near edge and the edge? And how do you do so with the ultra-low latency required by technologies such as ML and AI? What new opportunities and use cases have emerged as a result of this growth in edge computing?

One good example is in-game sports betting. In-game betting through legalized sportsbooks has become increasingly popular, and for users to have a seamless experience, real-time data analytics are essential. In one case study, a leading bare metal automation platform provider wanted to develop an app that would let gamblers bet in-game on MLB games. The challenge was dealing with high-definition video and large, unstructured data sets. Processing requirements were also high, and the data processing needed to happen in close proximity to the sports venues to achieve low latency.

The solution? Colocation in a Tier 3 edge data center, with a dedicated bare metal environment and a purpose-built cloud instance close to where the data was being captured. Professional services were also used for security and infrastructure deployment. The result was a seamless application experience for in-game betting on real-time data, bringing computing to the edge of the client's network with very low latency. The customer was able to tap an entirely new revenue stream from existing media content.

Countless revenue opportunities exist today that never did before edge computing. With the network backbone more important than ever, organizations and IT leaders are strategically adopting core-to-edge architectures, driven by emerging edge use cases and the critical need for low-latency, high-bandwidth connections between devices. In fact, connected edge devices are projected to number in the trillions by 2025, and organizations that fail to adopt an edge deployment model could become this decade's version of Blockbuster Video.

Michael Silla

SVP of Design & Construction
