Cloud | September 19, 2018

Edge Computing: Powering the “Thing” in the Industrial Internet of Things

John Martin
John Martin writes about technology, business, science, and general-interest topics. A former U.S. correspondent for The Economist (Science & Technology), he writes for the private sector, universities, and media, and can be reached at jm@jmagency.com.

Sometimes even blazing fast doesn’t cut it—just ask the runners who finished second to Usain Bolt in the 100-meter dash. Top speed is the whole point of the Industrial Internet of Things—getting data in as near real time as possible so you can automatically regulate a healthcare delivery device, stop a robotic arm in response to a safety alert, correct a drone course in-flight, or actuate braking in a driverless car.

Edge computing saves time where milliseconds of data latency are critical by placing small-form processing, along with storage and connectivity, at or close to the device where the data is generated. As the IIoT links millions of machines and objects to the web to collect data, sometimes you can’t wait for the information to travel to data centers and cloud servers. Don Duet, President and CEO of Vapor IO, told the Wall Street Journal that housing compute power closer to devices can cut that time to 2-5 milliseconds, from the 150-200 milliseconds it generally takes for data to travel to the cloud and back.
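To make the cited figures concrete, here is a minimal sketch of a latency-budget check. The function name, the 10 ms deadline, and the midpoint values are illustrative assumptions, not vendor numbers; only the 2-5 ms and 150-200 ms ranges come from the quote above.

```python
# Hypothetical latency-budget check for a time-critical actuation.
# Midpoints of the round-trip ranges cited above (illustrative):
CLOUD_ROUND_TRIP_MS = 175.0  # midpoint of 150-200 ms to the cloud and back
EDGE_ROUND_TRIP_MS = 3.5     # midpoint of 2-5 ms to a nearby edge node

def can_meet_deadline(round_trip_ms: float, deadline_ms: float) -> bool:
    """Return True if a sense -> decide -> actuate loop fits the deadline."""
    return round_trip_ms <= deadline_ms

# An assumed braking decision that must land within 10 ms:
print(can_meet_deadline(CLOUD_ROUND_TRIP_MS, 10.0))  # False
print(can_meet_deadline(EDGE_ROUND_TRIP_MS, 10.0))   # True
```

The point is not the arithmetic but the order-of-magnitude gap: a cloud round trip overshoots a millisecond-scale deadline by more than an order of magnitude, while an edge round trip fits comfortably.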

That’s critical for smart, autonomous devices. Peter Levine, a general partner at venture capital firm Andreessen Horowitz, explained to TechCrunch that a self-driving car is effectively “a data center on wheels, and a drone is a data center with wings, and a robot is a data center with arms and legs.” These devices are processing vast amounts of information, and need to do it fast.

Most observers agree that edge computing complements the cloud rather than replacing it. The edge can process IIoT data locally, such as by embedding vision processing and AI in security cameras. This minimizes the raw streams sent to the cloud for storage and analysis, provides redundancy against outages, and saves on connectivity costs and network and compute capacity. Edge computing is also a good fit for AI and machine learning: AI can act quickly at the edge where decisions are time-sensitive, while the remaining data flows to data centers and the cloud, where machine learning can work its magic on larger sets to unearth deeper patterns.
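The filtering pattern described above can be sketched in a few lines. This is a toy illustration, not a real camera pipeline: `detect_motion` stands in for on-device AI inference, and the byte strings stand in for frames.

```python
# Minimal sketch of edge filtering: run inference locally and forward
# only flagged frames to the cloud, discarding the rest on-device.
# All names here are illustrative stand-ins, not a real API.

def detect_motion(frame: bytes) -> bool:
    """Placeholder for on-device vision/AI inference."""
    return frame != b""  # illustrative: any non-empty frame counts as motion

def filter_at_edge(frames):
    """Yield only the frames worth uploading to the cloud."""
    for frame in frames:
        if detect_motion(frame):
            yield frame

raw_frames = [b"", b"car", b"", b"person"]
uploads = list(filter_at_edge(raw_frames))
# Only 2 of 4 raw frames leave the device; the raw stream stays local.
```

The design choice is the key point: the expensive inference runs where the data is generated, and only the distilled result crosses the network, which is exactly the bandwidth and capacity saving the paragraph describes.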

Security appears to be a toss-up. Some think it’s stronger at the edge, because the data can’t be intercepted or altered on its journey across the network to the cloud. Others say there might be more exposure and risk at the edge devices themselves.

Edge computing is a further iteration of technology democratization, the relentless drive to push processing power down to the user and device, whether it be a PC, smartphone, or IIoT object. Major cloud providers like Amazon and Google are now offering services to integrate devices with their clouds.

Satya Nadella, Microsoft CEO, put it this way: “Artificial intelligence is getting built into every experience, and we need to take data and compute to where it’s generated. Whether in an autonomous car, on a factory floor, or in a hospital, every one of these experiences is going to be powered by AI. We’re taking this intelligence and distributing it to the edge.”
