Will Edge Computing Help Cloud Infrastructure Meet Consumer Demand?


In the dawn of consumer electronics, content was the physical possession of consumers. You put a tape into your VCR and, voila, your movie came to life. Then personal computers arrived, and for a while we still loaded up DVDs or CD-ROMs. Eventually, though, that approach gave way to accessing central services such as Gmail and Netflix: we realized we could do things more efficiently if all of our data was “in the cloud.” And as computing centralized, it became apparent that economies of scale are a huge advantage in cloud computing. Before long, nearly every company was piggybacking on infrastructure owned by Google, Microsoft, IBM and Amazon.

For the most part, the traditional cloud has expanded to its limit in the U.S., though many overseas markets are still growing. Much of the remaining expansion in the cloud and in content delivery will therefore come from progress in “edge computing.”

Edge Computing Defined

Edge computing is extending the cloud out to the “edge,” that is, closer to consumers and the devices that rely on cloud services. To explain it simply: one giant grocery store in the middle of the country would be wildly impractical, but many branches in different locations are a far more realistic way to serve locals across the country. This model also makes sense in computing. A central cloud location is often farther away than is ideal for latency-sensitive applications such as autonomous vehicles.

The Latency Effect

Data takes time to travel, even across fiber optics, and an autonomous car’s central computer must communicate with the network quickly to do its job safely. Even something as innocuous as your Google Home is limited by this delay, called latency: when you ask it a question, it sends your voice data to the cloud, which processes the audio and returns the answer. With most experts predicting a boom in the adoption of voice-based systems, edge computing will be an important part of that growth.
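To make the delay concrete, here is a minimal sketch that times a round trip to two endpoints. The URLs are hypothetical placeholders; in practice you would compare a distant central region against a nearby edge location serving the same content.

```python
import time
import urllib.request

# Hypothetical endpoints for illustration only: a distant central
# region versus a nearby edge location serving the same content.
ENDPOINTS = {
    "central (distant region)": "https://central.example.com/ping",
    "edge (nearby location)": "https://edge.example.com/ping",
}

for label, url in ENDPOINTS.items():
    start = time.perf_counter()
    try:
        urllib.request.urlopen(url, timeout=5).read()
        elapsed_ms = (time.perf_counter() - start) * 1000
        print(f"{label}: {elapsed_ms:.1f} ms round trip")
    except OSError as err:
        print(f"{label}: request failed ({err})")
```

Even before any processing happens, physics sets a floor: a request that crosses the continent and back spends tens of milliseconds in transit alone, which a nearby edge location largely avoids.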

Bandwidth and Increasing Demand

Beyond latency, the sheer volume of data can be prohibitive for the central model, too. As another metaphor, imagine that a single massive reservoir in the middle of the U.S. were the only water source for every North American.

To meet demand at peak hours, an absurd volume of water would have to be moved. Just as pipes that enormous would be impractical to build, the central model is ill equipped to handle the volume of data, or bandwidth, that consumers demand. Creating edge locations that process data locally allows consumers’ growing latency and bandwidth needs to be met.
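A quick back-of-envelope calculation shows why. The figures below are illustrative assumptions, not measurements, but the shape of the result holds: serving the same streams from hundreds of edge sites turns one impossible pipe into many manageable ones.

```python
# Illustrative assumptions, not measurements.
STREAM_MBPS = 25                 # rough bitrate of one 4K video stream
CONCURRENT_STREAMS = 50_000_000  # hypothetical nationwide evening peak
EDGE_SITES = 500                 # hypothetical number of edge locations

total_mbps = STREAM_MBPS * CONCURRENT_STREAMS
central_tbps = total_mbps / 1_000_000            # all traffic through one core
per_edge_gbps = total_mbps / EDGE_SITES / 1_000  # traffic spread across sites

print(f"Central model: ~{central_tbps:,.0f} Tbps through a single core")
print(f"Edge model: ~{per_edge_gbps:,.0f} Gbps per site across {EDGE_SITES} sites")
```

The total traffic is the same either way; the difference is that in the edge model, most of it stays local instead of funneling through one backbone.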

Security and Personal Data

One interesting benefit of edge computing is a potential decrease in the amount of personal data sent to the cloud. For example, a login authorized via retina scan would typically be processed in the cloud. But moving AI chips, personal security data and the processing that uses them onto personal devices and local data centers means your data stays closer to you.
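As a minimal sketch of that data flow, consider verification that happens entirely on the device. Real biometric matching uses fuzzy template comparison rather than an exact hash; the hash below is a stand-in simply to show that only a pass/fail result, never the raw scan, needs to leave the device. All names here are hypothetical.

```python
import hashlib

def verify_locally(scan: bytes, enrolled_digest: str) -> bool:
    """Compare a fresh scan against a template digest stored on-device."""
    return hashlib.sha256(scan).hexdigest() == enrolled_digest

# Provisioned once at enrollment and kept on the device.
enrolled = hashlib.sha256(b"enrolled-retina-template").hexdigest()

# Stand-in for live sensor output.
fresh_scan = b"enrolled-retina-template"

# Only this boolean (or a signed token derived from it) goes upstream;
# the raw scan never leaves the device.
print("access granted" if verify_locally(fresh_scan, enrolled) else "access denied")
```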

IoT devices will probably shift toward edge computing for security and updates as well. When a smart thermostat or similar device runs an outdated, vulnerable Linux distribution but still has access to your network, you’re looking at a massive vulnerability. Just as web browsers are updated constantly without our noticing, it might be better to let IoT-device management and updating take place centrally in the same way.

For this reason, Microsoft and Amazon offer platforms such as Azure Sphere and AWS IoT Core, which keep connected devices talking to the cloud and receiving updates automatically, just as a web browser would.
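A device-side update check, in the spirit of those services, might look something like the sketch below. The endpoint, JSON fields and versioning scheme are hypothetical placeholders, not any vendor’s actual API.

```python
import json
import urllib.request

DEVICE_VERSION = "1.4.2"
UPDATE_URL = "https://updates.example.com/thermostat/latest"  # hypothetical

def check_for_update() -> None:
    """Poll a central service and report whether newer firmware exists."""
    with urllib.request.urlopen(UPDATE_URL, timeout=10) as resp:
        latest = json.load(resp)
    if latest["version"] != DEVICE_VERSION:
        # A real device would verify a signature before flashing the image.
        print(f"update available: {latest['version']} at {latest['image_url']}")
    else:
        print("firmware is current")

if __name__ == "__main__":
    check_for_update()
```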

How Will Edge Computing Progress?

Edge computing’s benefits to consumers create an incentive to expand infrastructure for the technology. That infrastructure requires sizable investment, though, and many of the applications that benefit most, such as virtual reality and augmented reality, are still too far out of the mainstream to really drive edge computing. For traditional applications, the benefits are modest enough that consumers won’t be pushing the envelope.

Thus, the development and implementation of edge technologies are proceeding steadily, with telecom companies such as AT&T building edge-computing test zones to gather real-world data before committing to wider rollouts. As self-driving cars, AR/VR and similar technologies progress, hyperscalers and telecom companies will surely rise to meet the demand, but until then we should expect slow but steady adoption of edge-computing infrastructure on all fronts.

Is Edge Computing a Concern?

Edge computing is a blanket term that covers many things. The broader trend of decentralizing more of the computing that runs our lives is inevitable. Consumers will continue to demand more from their devices, and the central model will never meet that demand unless we adapt. Edge computing could hand the cloud companies even more control over our digital world, but if history is any indication, we’ll gladly give it up as long as Bezos and Page keep the progress coming. And that progress is coming.

Imagine if car accidents were practically eliminated, and the time you now spend stuck in traffic could go to reading or watching shows once autonomous vehicles are perfected. Imagine if an augmented-reality eyepiece could transform your walk through the park into an adventure through Wonderland. Imagine if your phone could stream live 4K video with no buffering, no matter where you went. These things are all poised to happen soon thanks to edge computing.

About the Author

Brian McMullin is the operations director for Exit Technologies, an R2-certified global IT-asset-disposition (ITAD) company. He has over two decades of experience overseeing large real-estate contracting projects. Brian tweets at @ExitTech.
