5G and mixed reality glasses help change how we see the world

By: Chris Ashraf


Today, there is already a plethora of augmented reality (AR) use cases for businesses; however, bulky headsets and motion sickness have become major barriers to scaling usage. The cost of such devices has also made them unattainable for consumers, since expensive computing power must be built into the headset.

Enter 5G. With high bandwidth and low latency, as well as the power of mobile edge computing, 5G makes it possible for AR and mixed reality (MR) glasses to offload the majority of their workload to the edge of the network. With reduced compute and power needs, smart glasses are about to become significantly lighter and cheaper, ultimately increasing adoption.

Consider first responders wearing AR/MR glasses: while traveling to a scene, they'll be able to have images fed to the corner of their glasses to help plan rescue actions before they arrive on-site. The glasses can also display live video feeds from drones hovering over the scene, as well as street or topographical maps, to help them better prepare.

At Verizon’s 5G Lab in New York City, ThirdEye, creator of the world’s smallest mixed reality glasses, recently showed off this 5G use case. The company demonstrated how its smart glasses – along with its AR/MR software – can help bring about a new era of hands-free human interaction with computing at the edge. While wearing the glasses, users such as first responders can directly interact with surrounding objects or digital information placed in their field of view.

Additionally, field service workers (like auto mechanics) will be able to scan an object, such as a complex motor, with the glasses’ built-in camera and send images to a remote expert for help. They can then receive live audio/video guidance from the expert, displayed at the top right of their smart glasses, while working hands-free – instead of having to look down at a tablet or manual to repair the part.

“5G really helps in terms of latency,” said ThirdEye’s Nick Cherukuri. “For AR smart glasses we have to stream live video or live 3D models, which can end up using a lot of data. 5G will eventually help reduce latency to under ten milliseconds, which will be a total game changer.”

For consumers, using smart glasses over 5G will allow for more interactive and collaborative gaming experiences, as well as enhanced in-flight entertainment. Imagine being handed a pair of AR glasses preloaded with HD movies when boarding a plane. After take-off, you could stare at the ceiling – or even look out the window – and see your movie presented directly in front of you at a size much larger than the seat-back screen.

Stay tuned next week for another cool 5G demo from Verizon’s 5G Lab!

For related media inquiries, please contact Christina.moon.ashraf@verizon.com

About the author:

Chris Ashraf is an external communications manager at Verizon writing about 5G and network solutions.
