What is edge computing? The benefits of mobile edge computing and 5G
Edge computing is based on bringing computing resources closer to users, at the “edge” of the network. By placing cloud resources physically near the source of the data, instead of in data centers hundreds or thousands of miles away, edge computing can help critical, performance-impacting applications respond more quickly and efficiently.
In today’s network architecture, data is typically processed either on our devices, like PCs and smartphones, or in a centralized cloud (apps like Gmail and Dropbox run in such a cloud). The cloud provides infrastructure and other powerful capabilities like machine learning, and gives us unparalleled access to software and data, but performance can sometimes be slow or spotty. Edge computing attempts to overcome this performance issue.
Verizon first launched a Mobile Edge Compute (MEC) service with AWS in November. We’re calling it Verizon 5G Edge. It harnesses the benefits of 5G cellular technology to provide even faster access to the applications and data individuals and businesses need.
The Intersection of Mobile Edge Computing Technology and 5G
By the end of 2020, billions of connected devices are expected to be added to cellular networks, requiring both wide swaths of cellular spectrum and near-real-time processing with minimal latency. Verizon’s 5G Ultra Wideband network should help deliver on those demands. 5G technology is expected to play a key role in increasing the speed at which data travels between two locations, and edge computing will help shorten the distance between them.
Some Applications of Mobile Edge Computing Technology
Edge computing brings large servers and data centers, or the cloud, closer to the end user. This will help with applications like augmented reality, where the real-time nature of data processing is critical.
Without edge computing, data would likely need to travel much farther to a central cloud server, and the resulting latency, or lag time, could be noticeably longer.
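To make the distance-latency tradeoff concrete, here is a rough back-of-the-envelope sketch. It estimates only the physical propagation floor, assuming signals travel through fiber at roughly two-thirds the speed of light (about 200,000 km/s); the 10 km and 1,500 km distances are hypothetical examples, and real-world latency adds routing, queuing, and processing time on top of this minimum.

```python
# Propagation-delay floor for a network round trip, ignoring all
# routing, queuing, and processing overhead.

FIBER_SPEED_KM_PER_S = 200_000  # approx. speed of light in optical fiber

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time in milliseconds for a one-way distance."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

# Hypothetical nearby edge site vs. a distant central data center
print(f"Edge site, 10 km away:       {round_trip_ms(10):.1f} ms")
print(f"Central cloud, 1,500 km away: {round_trip_ms(1500):.1f} ms")
```

Even in this idealized model, moving compute from a data center 1,500 km away to an edge site 10 km away cuts the unavoidable round-trip delay by two orders of magnitude, which is why latency-sensitive applications such as augmented reality benefit from edge placement.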