LogistiVIEW analyzes a recent projection
Lost between tomorrow and yesterday,
Between now and then.
And now we're back where we started,
Here we go round again.
-- The Kinks
I'm certain Ray Davies didn't write those lyrics about the cyclical nature of computing platforms. Nonetheless, they came to mind as I listened to venture capitalist Peter Levine recently declare the imminent death of cloud computing. In his telling, "edge computing," rising from the ashes of the old distributed model, is tomorrow's buzzword. Briefly, edge computing refers to pushing more complex decision making to devices at the "edge" of the network. This does not mean servers, PCs, or even personal mobile devices, but robotics, controllers, sensors, and other members of the "Internet of Things" (IoT).
Take self-driving cars, Levine argues. There is no way a transportation system full of autonomous vehicles can rely on cloud computing to make navigational decisions. Each self-driving car (and for that matter, human-driven car) carries enough computing power to operate every control system, without access to a network. Therefore, he argues, cloud computing will soon be relegated to the back seat.
Similarly, the explosion of IoT devices and machine learning will make computing at the "edge of the network" more practical than relying on off-site resources, Levine continues.
While I agree with his predictions about the rise of edge computing, I disagree that it signals the death of the cloud. Edge and cloud are not mutually exclusive. I don't believe anyone ever suggested that a robot should be dependent on the cloud for basic mobility (a self-driving car would fall under this category). Every network device has some level of self-contained decision making, and this has always been governed by what is necessary and practical.
The cloud, and its partner-in-crime, Big Data, still have plenty of utility. First, not all systems are control systems. Enterprise applications run finance, HR, inventory, etc., all without heavy reliance on "edge" devices. Cloud deployment still makes a lot of sense for these applications. Second, even if the cloud is not involved in the low-level decisions or communications among edge devices, it can still be used for macro-level optimization, trend analysis, predictions and mode switches.
Imagine a system in which IoT data from many devices and locations is continually aggregated. The data is analyzed for trends that seem to predict certain events. When current operational data starts trending in one of those patterns, a centralized system instructs the appropriate edge devices to switch into different operating modes or configurations.
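That feedback loop can be sketched in a few lines. This is a minimal illustration, not any particular product's design: the class name, the rolling-average trend test, and the "conserve"/"normal" mode labels are all invented for the example.

```python
from collections import deque


class ModeSwitcher:
    """Hypothetical cloud-side monitor: aggregates readings reported by
    many edge devices and decides when to command a mode switch."""

    def __init__(self, window=5, threshold=80.0):
        # Keep only the most recent readings; deque drops old ones itself.
        self.readings = deque(maxlen=window)
        self.threshold = threshold

    def ingest(self, value):
        """Record one aggregated metric (e.g. a congestion score)."""
        self.readings.append(value)

    def decide_mode(self):
        """Return the operating mode the edge devices should adopt."""
        if not self.readings:
            return "normal"
        avg = sum(self.readings) / len(self.readings)
        return "conserve" if avg > self.threshold else "normal"


monitor = ModeSwitcher(window=3, threshold=80.0)
for reading in (70, 85, 95):  # metrics trending upward across sites
    monitor.ingest(reading)
print(monitor.decide_mode())  # trending high -> "conserve"
```

In a real deployment the `decide_mode` result would be pushed down to the affected devices over whatever command channel they already speak; the point is only that the trend analysis lives in the cloud while the resulting behavior change executes at the edge.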
Thinking of our own offering at LogistiVIEW, we do edge computing on our VIEW devices. For example, we decode the images captured on video to search for barcodes, QR codes, readable text (OCR), or even recognizable objects. All of this must happen on the device, in real time, in order to keep the workflow progressing and not delay the human. However, we have the ability to capture far more data that, while not immediately useful to the human, is potentially interesting to the business. Incidental scans, travel speeds and congestion patterns are all examples of information captured "at the edge," but best mined in the cloud.
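The split between real-time edge work and deferred cloud mining can be made concrete with a small sketch. This is illustrative only, assuming a hypothetical `decode_barcode` stand-in for whatever on-device recognition actually runs: the workflow-critical decode returns immediately, while incidental telemetry merely accumulates for a later batch upload.

```python
def decode_barcode(frame):
    # Placeholder for real on-device decoding (barcode, QR, OCR, etc.).
    return frame.get("barcode")


def process_frame(frame, telemetry_buffer):
    """Handle one captured frame on the device itself.

    The decoded result is returned synchronously so the human's
    workflow is never delayed; incidental data (travel speed,
    location) is only buffered, to be synced to the cloud later.
    """
    telemetry_buffer.append({
        "speed": frame.get("speed"),
        "location": frame.get("location"),
    })
    return decode_barcode(frame)


buffer = []
result = process_frame(
    {"barcode": "SKU-123", "speed": 1.2, "location": "A7"}, buffer)
print(result)        # "SKU-123" -- available in real time, at the edge
print(len(buffer))   # 1 -- telemetry waits for a cloud sync
```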
There's simply no reason to ring the death knell of the cloud just yet. Plenty of computing remains necessary and valuable several steps back from the edge.
Levine also predicts the emergence of new programming languages to deal with these edge devices. I agree. I believe we will need secure mechanisms to inject new algorithms and behaviors into edge devices from a central source. The algorithms may even be developed automatically, via artificial intelligence (AI), as part of the data analytics described above. Perhaps the data will show that certain code branches are unnecessary, harmful, or ripe for optimization. Perhaps a centralized application can determine which devices will benefit and which devices must not be altered. Regardless, the mechanism to push new behavior must be guaranteed secure, and the interfaces to injection points must be well-defined.
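One building block of such a mechanism is easy to sketch: the edge device refuses any behavior update whose signature it cannot verify. The sketch below uses a shared HMAC key from the Python standard library purely for illustration; a production system would more likely use asymmetric signatures and hardware-backed key storage, and the payload format here is invented.

```python
import hashlib
import hmac

# Illustrative shared secret, provisioned to the device at manufacture.
SHARED_KEY = b"device-provisioned-secret"


def sign_update(payload: bytes) -> str:
    """Central side: sign a behavior update before pushing it out."""
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()


def apply_update(payload: bytes, signature: str) -> bool:
    """Edge side: accept the new behavior only if the signature checks out."""
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, signature):
        return False  # reject tampered or unauthorized updates
    # ...hand the payload to a well-defined injection point here...
    return True


update = b"if congestion > 0.8: mode = 'slow'"
sig = sign_update(update)
print(apply_update(update, sig))         # True: authentic update applied
print(apply_update(update + b"!", sig))  # False: tampered payload rejected
```

Note the use of `hmac.compare_digest` rather than `==`: constant-time comparison avoids leaking signature bytes through timing, a small example of why the injection interface itself must be designed with security in mind.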
It should be noted that while Mr. Levine's employer does have vested interests in the rise and fall of certain technologies, his materials were presented as speculative.
On a final note, Levine describes the thought exercise he used to arrive at such projections. To paraphrase: identify something you consider integral to an industry, then imagine its replacement. While I disagree with some of his conclusions, I can see the value in every technologist and business strategist trying this exercise from time to time.