IIoT risks of relying heavily on edge computing

The sheer volume of data created by the Internet of Things (IoT) is increasing dramatically as the world becomes progressively more connected. There are projected to be a mind-boggling 75 billion IoT devices in the world by 2025. Meanwhile, edge computing is set to enter the mainstream as early as 2020.

This means that increasingly vast amounts of IoT data will be stored, processed and analyzed on the edge. While edge computing has its advantages, I am cautious of companies relying too heavily on it. I would urge all Industrial Internet of Things (IIoT) customers to avoid being taken in by the fervor behind edge computing. Here’s why.

Edge computing simply won’t provide an answer to all digital transformation problems. It does, however, serve an effective purpose if properly used. It enables optimization when an organization knows precisely what specific problem needs to be addressed in a specific place.

However, edge computing also denies enterprises the macro, business-critical information the IIoT is designed to deliver. For instance, the IIoT provides manufacturers with data from machines on the factory floor and presents it in a usable, integrated format. The risk of blocking companies from accessing this crucial data is what I call premature optimization.

Furthermore, Operational Technology (OT) and Information Technology (IT) will converge in the era of connectivity, machine learning and automation, otherwise known as Industry 4.0. As a result, companies may be tempted to spend unnecessarily on OT edge infrastructure when equivalent infrastructure already exists right next door in IT.

Most importantly, what is the point of OT/IT convergence and the Industrial Internet of Things? The underlying aim is to bring the physical and digital worlds together in order to enable smarter, more innovative and better-optimized business decisions. Automation, AI, machine learning, sensors, analytics, cloud, blockchain, 5G and – yes – edge technologies are interconnected to make things faster, more valuable, more customized and higher quality, at lower expense and greater profit. When it comes to business, the IIoT delivers better answers that were previously unattainable.

Edge computing applies analytics and algorithms where the data originates: onsite, at the local level, right on the factory floor. Why go through the expense of sending data to the cloud for answers when those answers can be found right there in the factory? If the problem is uniquely understood and isolated to that factory, machine or process, then processing data on the edge makes a lot of sense.
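To make that concrete, here is a minimal, hypothetical sketch of the kind of narrowly scoped problem that suits edge processing. The field names and vibration limit are illustrative assumptions, not a real factory API:

```python
# Hypothetical edge-side check on machine sensor readings.
# Field names and the vibration limit are illustrative assumptions.

def process_on_edge(readings, vibration_limit=7.0):
    """Flag readings that exceed a locally known limit.

    This suits edge processing only because the problem (excess
    vibration on a specific machine) is already precisely understood
    and isolated to this site.
    """
    alerts = []
    for r in readings:
        if r["vibration_mm_s"] > vibration_limit:
            alerts.append({"machine": r["machine"], "value": r["vibration_mm_s"]})
    return alerts

readings = [
    {"machine": "press-01", "vibration_mm_s": 3.2},
    {"machine": "press-02", "vibration_mm_s": 9.8},
]
print(process_on_edge(readings))
```

The point is that this only works well when the problem (here, excess vibration on a known machine) is already precisely understood and local.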

However, problems with edge computing arise when businesses haven’t identified the precise problems they’re trying to fix. Those same problems may also be hindering a company on a global scale rather than being isolated to a single factory. Edge computing therefore denies companies total network visibility, which is a major security risk: comprehensive visibility is vital for companies to protect themselves.

When problems are solved on the edge they are dealt with locally, on a micro level, which leads to premature optimization and deprives the entire ecosystem of the benefits of network-effect solutions. Better to send streams of important data to the cloud, where mighty layers of analytics, AI and related technologies with a global view can deliver macro-level, universal answers and much broader, more valuable optimization.
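One way to picture this is an edge handler that acts locally but still forwards the raw stream upstream, so the cloud keeps its macro view. This is only a sketch under assumed field names and thresholds, not a reference design:

```python
# Hypothetical sketch: solve the local problem AND preserve the raw
# data for global analytics. Names and thresholds are assumptions.

import json
from statistics import mean

def handle_batch(batch, cloud_queue, temp_limit_c=80.0):
    """Take a micro-level action locally, then ship the batch upstream."""
    # Local (micro) decision: immediate action on this factory floor.
    throttle = mean(r["temp_c"] for r in batch) > temp_limit_c
    # Global (macro) path: forward the untouched batch to the cloud,
    # so no potentially crucial data is thrown away at the edge.
    cloud_queue.append(json.dumps(batch))
    return throttle

queue = []
batch = [{"temp_c": 78.0}, {"temp_c": 85.0}]
print(handle_batch(batch, queue), len(queue))
```

The design choice is that the edge never becomes the data's final destination; it only adds a fast local reaction in front of the cloud path.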

Why self-limit with edge and run the risk of throwing potentially crucial data away, especially when we’re still in the early days of digital transformation? With edge, an organization may solve one local problem but fail to send important data to the cloud, which is vastly better equipped to use that data to solve multiple bigger problems across the network and to protect against cyber threats.

Connect OT edge to the public cloud by using the IT data centre as a private cloud

Many people think that sending all our IoT data to the cloud for a global view, in search of universal answers, is simply not realistic. Firstly, data volumes are exploding as sensors proliferate and broadcast at ever-shorter intervals. Secondly, do we really want to start building OT infrastructure to connect to the cloud? Who’s got the money for that?

I believe it is eminently feasible. First, data can be compressed in the IT data centre before being sent to the cloud, reducing the data stream by as much as a third without loss of data quality and saving the expense of extra storage. There are also existing and emerging data-centre solutions designed to accommodate the increasing flow of information generated by more sensors and the arrival of 5G. Second, we definitely do not need to build OT data centres at the edge of the network. Enterprise IT data centres already exist, and are being affordably configured to handle the tasks required (including using remote algorithmic IIoT techniques).
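As an illustration of that compression step: repetitive JSON telemetry compresses well, losslessly, with a stock codec such as gzip, though the exact ratio always depends on the data. The payload below is invented for the example:

```python
# Hypothetical pre-cloud compression in the IT data centre.
# The telemetry payload is fabricated; real ratios vary with the data.

import gzip
import json

telemetry = json.dumps([
    {"sensor": f"s{i:03d}", "temp_c": 21.5, "status": "OK"}
    for i in range(1000)
]).encode("utf-8")

compressed = gzip.compress(telemetry)
ratio = len(compressed) / len(telemetry)
print(f"{len(telemetry)} -> {len(compressed)} bytes ({ratio:.0%})")
```

Because gzip is lossless, the original stream is fully recoverable in the cloud – consistent with reducing the stream without loss of data quality.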

Based on my experience working with customers worldwide, the budget and skill sets required can be found more readily in IT. OT teams have their own optimization challenges to deal with; there is no need to load unnecessary IT responsibilities onto their full plates. And even if it could be done, given the average age of OT equipment (10 years or more) and its limited CPU power, there is no reason to build new OT infrastructure to connect the edge to the cloud.

Sending as much problem-solving data as possible to the cloud is not only feasible but smart. IT data centres that have already been built, and are in the process of upgrading for the challenges of Industry 4.0, are ideally suited to provide the compute power and storage needed as the middle step – a private cloud – between the edge and the public cloud.
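The three-tier flow – OT edge into the IT data centre acting as a private cloud, then on to the public cloud – can be sketched as follows. The class and variable names are purely illustrative assumptions:

```python
# Hypothetical sketch of the three-tier topology:
# OT edge -> IT data centre (private cloud) -> public cloud.
# All names here are invented for illustration.

class PrivateCloud:
    """IT data centre acting as the middle step between edge and public cloud."""

    def __init__(self, public_cloud):
        self.public_cloud = public_cloud  # upstream destination
        self.staged = []                  # batches held in the data centre

    def ingest(self, edge_batch):
        # Stage the batch locally (compute/storage in the data centre),
        # then relay it upstream for macro-level, global analytics.
        self.staged.append(edge_batch)
        self.public_cloud.append(edge_batch)

public_cloud = []
private = PrivateCloud(public_cloud)
private.ingest({"site": "factory-A", "readings": 120})
print(len(private.staged), len(public_cloud))
```

The private cloud here is deliberately a pass-through with local staging: the edge keeps none of the data to itself, and the public cloud retains the global view.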

The public cloud is best equipped to provide the business-optimization answers digitalization promises. Ultimately, relying too heavily on the edge this early in the game would be a mistake: the local answers edge computing develops and isolates could turn out to be the universal, network-wide optimizations needed all over the world, yet never discovered. Achieving full optimization can only be done by avoiding edge’s premature optimization.