When Intelligence moves to the Edge
- Anamika Sarkar
- Aug 19, 2018
- 4 min read
As a term and an architecture, Edge computing has existed for quite some time. In the era of the Industrial IoT, however, it has moved to the center of the devices and technologies attached to the things in the Internet of Things. IoT is all about connecting what was previously unconnected; to analyze and leverage the information those devices produce, we need an architecture that enables the collection of this data. The goals of Edge computing are to save bandwidth, store data locally, and reduce time and costs by restricting the information that has to be transmitted and by decreasing network latency. Data, speed, and analytics are equally important. Edge computing pushes the intelligence, processing power, and communication capabilities of an edge gateway or appliance directly into devices such as programmable automation controllers. In a nutshell, Edge computing is a major key for IoT, and intelligence is shifting to this edge, which is what makes the technology unique and important. The following illustration contrasts the two computing models.
Image credits: MachBase
Why is Edge computing critical for IoT?
While many of today’s always-connected devices take advantage of Cloud computing, IoT manufacturers and developers are discovering the benefits of computing and analyzing on the devices themselves. This on-device approach helps reduce latency, lowers the dependence on the Cloud, and better manages the massive deluge of data the IoT generates. Additionally, on-device processing speeds up alerts while reducing the chance of recurrent false alarms. This ability to perform advanced processing and analytics on the device itself is what is termed Edge computing. Think of the “Edge” as the universe of internet-connected devices and gateways sitting in the field, the counterpart to the “Cloud.” Edge computing opens new possibilities for IoT applications, particularly those relying on Machine Learning for tasks such as object detection, face recognition, language processing, and obstacle avoidance.
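To make the false-alarm point concrete, here is a minimal, hypothetical sketch (not from any specific product): an on-device filter that raises an alert only after several consecutive out-of-range readings, so a single noisy spike never reaches the Cloud. The class name and parameters are illustrative assumptions.

```python
# Hypothetical on-device alert filter: suppresses one-off spikes by
# requiring a sustained threshold violation before alerting.

class EdgeAlertFilter:
    def __init__(self, threshold, consecutive_required=3):
        self.threshold = threshold                      # alert level
        self.consecutive_required = consecutive_required
        self._streak = 0                                # violations in a row

    def process(self, reading):
        """Return True only when the threshold is exceeded persistently."""
        if reading > self.threshold:
            self._streak += 1
        else:
            self._streak = 0                            # spike over: reset
        return self._streak >= self.consecutive_required

f = EdgeAlertFilter(threshold=80.0, consecutive_required=3)
readings = [75, 95, 76, 85, 90, 99]   # one spike, then a sustained rise
alerts = [f.process(r) for r in readings]
print(alerts)                          # [False, False, False, False, False, True]
```

Only the final reading triggers an alert: the isolated 95 is discarded on the device, which is exactly the kind of decision that is cheap locally but wasteful to defer to the Cloud.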
The rise of Edge computing is an iteration of a well-known technology cycle that begins with centralized processing and then evolves into more distributed architectures. Edge computing delivers tangible value in both consumer and industrial IoT use cases. It helps reduce connectivity costs by sending only relevant information rather than raw streams of sensor data. This is particularly valuable for devices that connect via LTE/cellular, such as smart meters or asset trackers. And when dealing with the massive amounts of data produced by the sensors in, for instance, an industrial facility or a mining operation, the ability to analyze and filter the data before sending it can lead to huge savings in network and computing resources.
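The "send only relevant information" idea can be sketched in a few lines. This is an illustrative assumption about what an edge gateway might do, not a reference to any real device: instead of uploading every raw sample over a metered LTE link, it reduces a window of readings to a compact summary plus any anomalous values.

```python
# Hypothetical edge-side summarization: a window of raw sensor samples
# is reduced to an aggregate payload; only out-of-range values travel
# upstream individually.

def summarize_window(samples, anomaly_threshold):
    """Reduce a window of raw readings to a compact payload."""
    anomalies = [s for s in samples if s > anomaly_threshold]
    return {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": sum(samples) / len(samples),
        "anomalies": anomalies,
    }

window = [21.0, 21.2, 20.9, 35.5, 21.1]        # five raw temperature samples
payload = summarize_window(window, anomaly_threshold=30.0)
print(payload["mean"], payload["anomalies"])    # 23.94 [35.5]
```

Five readings shrink to one small payload; scale the window up and the bandwidth saving over cellular becomes the cost saving the paragraph above describes.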
The Why of moving Edge Computing to IoT
In IoT you receive a lot of data, whether you leverage it in end-to-end ways or in specific, highly sensor-intensive and thus data-intensive environments. That data is, by definition, generated at the Edge, because your data-sensing and data-gathering devices sit at the Edge. The speed of data and analysis is essential in many industrial IoT applications, but it is also a key element of industrial transformation and of all the other areas where we move towards autonomous and semi-autonomous decisions made by systems, actuators, and various controls. This degree of autonomy is at the very core of many desired outcomes and goals in, say, Industry 4.0, as we move towards the next stage of the third platform, which is all about autonomy. We live in times where having the right insights fast enough can have enormous consequences.
Edge Computing and the IoT in 2018 and Beyond
There are applications and industries where, just at the level of sending data, traditional networks don’t suffice, or can’t be used at all. Satellite communications are one example, owing to their remoteness and the cost of sending all that data through. For reasons of bandwidth, cost, speed, and more, we need a faster, smarter, and cheaper approach. With real-time information a proven competitive differentiator, it is clear that amid the growing unstructured data deluge, of which IoT and sensor data are part, traditional approaches no longer fit.
According to a November 1, 2017 announcement regarding research on the Edge computing market across hardware, platforms, solutions, and applications (smart city, Augmented Reality, analytics, etc.), the global Edge computing market is expected to reach USD 6.72 billion by 2022, at a whopping compound annual growth rate (CAGR) of 35.4 percent. The major trends driving the market’s growth in North America are all too familiar: a growing number of IoT devices and dependency on them, the need for faster processing, increasing Cloud adoption, and increasing pressure on networks.
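As a back-of-the-envelope sanity check on that forecast, and assuming (my assumption, not stated in the announcement) that the 35.4% CAGR compounds annually over the five years from a 2017 baseline to 2022:

```python
# Working backwards from the cited 2022 target and CAGR.
# Assumption: five annual compounding periods, 2017 -> 2022.

target_2022 = 6.72   # USD billion, per the cited announcement
cagr = 0.354
years = 5

implied_2017_base = target_2022 / (1 + cagr) ** years
print(round(implied_2017_base, 2))   # 1.48
```

In other words, the forecast implies a market of roughly USD 1.5 billion in 2017, which conveys just how steep a 35.4% CAGR really is.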
Summarizing
Gartner defines Edge computing as “solutions that facilitate data processing at or near the source of data generation.” In the context of the Internet of Things (IoT), for example, the sources of data generation are usually things with sensors or embedded devices. Edge computing serves as the decentralized extension of campus networks, cellular networks, data center networks, or the Cloud. The growing number of IoT devices and our dependency on them, the need for faster processing, increasing Cloud adoption, and increasing pressure on networks all drive the Edge computing market. That’s where Edge computing and Cloud computing come into play together: if your data is generated at the Edge in IoT, why not bring your intelligence and analysis as close to the Edge, the source, as possible, with all the obvious benefits?