
Edge AI: Data Intelligence at the Edge Level

September 28, 2021

by Arvind Ramachandra & Munish Singh

According to a top consulting report, if the industry gets it right, linking the physical and digital worlds could generate up to $11.1 trillion a year in economic value by 2025. This linking has driven exponential growth in the data generated by IoT devices, creating a need to bring computational power to the individual device level through edge computing, rather than sending all data to the cloud for analysis.

Edge computing moves parts of the service-specific processing and data storage from the central cloud/datacenter to edge network nodes; when combined with Artificial Intelligence (AI), it brings intelligence to the device level. This helps build a smart, connected network of edge devices called Edge AI, also known as Edge AIoT (Artificial Intelligence of Things) or the Intelligent Internet of Things. To know more about Edge AI, please check out our blog on Edge AI: The Era of Distributed AI Computing.

With improvements in hardware and software, AI algorithms are also becoming compact enough to run in less storage space while providing better accuracy and precision. For example, the YOLOv3 and YOLOv4 weights files (with the Darknet architecture) were more than 200 MB, whereas the YOLOv5 weights file is less than 50 MB; similarly, there are various Tiny AI/ML versions of common algorithms. To know more about different algorithms in the computer vision area, please check out our whitepaper, A Study on Computer Vision Algorithm.

In Edge AI, the edge devices process data locally using AI/ML algorithms, reducing the need to send every data point to the cloud for analytics. This brings data intelligence to each edge node (or cluster of nodes) and increases the level of data processing and information filtering at the edge. It also lets the nodes learn more efficiently and effectively about the environment they operate in, and they can share the insights they have learned with other nodes to update themselves at regular intervals. In short, with the help of AI/ML these edge nodes can self-learn what constitutes typical and abnormal behaviour for a machine: for example, sensing soil moisture because a water pipeline leakage can lead to a rapid increase in rusting, autonomous vehicles sharing their data with each other, managing smart factories, digitizing healthcare and many more.
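To make the idea concrete, here is a minimal sketch of that edge-side filtering pattern: the node scores each sensor reading locally with a lightweight anomaly model and only offloads suspicious events to the cloud. The read_sensor() and publish_to_cloud() helpers are hypothetical placeholders for the device's own sensor driver and uplink, and the soil-moisture numbers are purely illustrative.

```python
# Minimal sketch: score readings locally, offload only anomalies to the cloud.
import numpy as np
from sklearn.ensemble import IsolationForest

# Train a lightweight anomaly model on a window of "normal" readings
# (e.g. soil-moisture values collected during healthy pipeline operation).
normal_window = np.random.normal(loc=0.30, scale=0.02, size=(500, 1))
model = IsolationForest(contamination=0.01, random_state=42).fit(normal_window)

def read_sensor():
    # placeholder for the real sensor driver
    return np.random.normal(loc=0.30, scale=0.02)

def publish_to_cloud(event):
    # placeholder for the MQTT/HTTP upload of the filtered event
    print("offloading to cloud:", event)

for _ in range(1000):
    reading = read_sensor()
    # IsolationForest returns -1 for anomalous samples, 1 for normal ones
    if model.predict([[reading]])[0] == -1:
        publish_to_cloud({"soil_moisture": float(reading), "flag": "possible leakage"})
```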

A simple illustration of AI/ML supporting an edge network

Below are the four important data-pattern factors that need to be considered while setting up the edge network (a small sketch of how these patterns can be captured in code follows the list):

  • a single device transmitting data on a single network
  • a single device transmitting data on multiple networks
  • multiple devices transmitting data on a single network
  • multiple devices transmitting data on multiple networks
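
As a small illustration, the four patterns can be represented explicitly so an edge gateway can choose an ingestion strategy per deployment. The DataPattern names and the classify() helper below are illustrative assumptions, not part of any standard API.

```python
# Sketch: enumerate the four device/network data patterns listed above.
from enum import Enum

class DataPattern(Enum):
    SINGLE_DEVICE_SINGLE_NETWORK = "1 device -> 1 network"
    SINGLE_DEVICE_MULTI_NETWORK = "1 device -> N networks"
    MULTI_DEVICE_SINGLE_NETWORK = "N devices -> 1 network"
    MULTI_DEVICE_MULTI_NETWORK = "N devices -> N networks"

def classify(num_devices: int, num_networks: int) -> DataPattern:
    # Map a deployment's device/network counts to one of the four patterns.
    if num_devices == 1 and num_networks == 1:
        return DataPattern.SINGLE_DEVICE_SINGLE_NETWORK
    if num_devices == 1:
        return DataPattern.SINGLE_DEVICE_MULTI_NETWORK
    if num_networks == 1:
        return DataPattern.MULTI_DEVICE_SINGLE_NETWORK
    return DataPattern.MULTI_DEVICE_MULTI_NETWORK

print(classify(num_devices=12, num_networks=1))  # MULTI_DEVICE_SINGLE_NETWORK
```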

We can apply the following five data intelligence levels to the above data patterns at the edge node level, with or without the cooperation of the cloud:

Figure-2: The five stages of data intelligence

We have considered space complexity as a factor to segregate some algorithm examples, but there can be many other factors such as time complexity, sample size, bias-variance tradeoff, parallel algorithms, etc.

As the above diagram shows, when data intelligence improves and moves from the cloud to the device level, the amount and distance of data offloading reduce, which can:

  • decrease the transmission latency of data offloading,
  • improve data privacy, and
  • reduce the cost of WAN bandwidth.

Data intelligence at the edge node level still has a long road ahead, as it requires computation-intensive algorithms to run locally. DNN models are very resource-intensive and need devices equipped with high-level processing capabilities. This increases the cost of data intelligence at the edge, and it is not compatible with existing legacy end devices, which provide only limited computing capacity.

To overcome the above challenges, we have been studying various architecture patterns and have identified three architectures for developing data intelligence at the edge level:

  • Centralized: Training happens and conclusions are stored in the cloud/datacenter. Therefore, this architecture corresponds to stage-1, stage-2, and stage-3 of figure-2.
  • Decentralized: Training happens and conclusions are stored at the edge node level. Therefore, this architecture corresponds to stage-4 of figure-2.
  • Hybrid: Training can happen at the individual edge nodes or centrally in the cloud/datacenter, and conclusions are shared between the edge and the cloud/datacenter. Therefore, this architecture corresponds to stage-2 and stage-3 of figure-2.

The different architectures of Edge AI
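
As an illustration of the hybrid pattern, the sketch below shows an edge node that runs inference locally while periodically refreshing its model from the cloud/datacenter. The fetch_latest_model(), read_sensor_batch() and send_summary() helpers are hypothetical placeholders for the deployment's own model registry, sensor drivers and uplink, and the path and interval values are assumptions.

```python
# Sketch of the hybrid pattern: local inference, periodic model refresh from the cloud.
import time
import joblib

MODEL_PATH = "/opt/edge/models/leakage_model.joblib"   # assumed local model cache
REFRESH_INTERVAL_S = 3600                              # assumed: pull a new model hourly

def fetch_latest_model(path):
    # placeholder: in a real deployment this would download the newest artifact
    # from the cloud registry; here it simply loads the cached copy
    return joblib.load(path)

def read_sensor_batch():
    # placeholder: return a batch of [demand, flow, pressure] rows from the sensors
    return [[0.42, 1.8, 3.1]]

def send_summary(labels):
    # placeholder: push only the inferences (not the raw stream) to the cloud
    print("edge -> cloud:", labels)

model = fetch_latest_model(MODEL_PATH)
last_refresh = time.time()

while True:
    batch = read_sensor_batch()
    send_summary(model.predict(batch).tolist())
    if time.time() - last_refresh > REFRESH_INTERVAL_S:
        model = fetch_latest_model(MODEL_PATH)   # periodic sync with the cloud
        last_refresh = time.time()
    time.sleep(60)
```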

Let’s take a real-time example for stage 3, where we tested a hybrid solution approach that processes leakage data from a water pipe using algorithms such as the Support Vector Classifier (SVC), Decision Tree Classifier, Random Forest Classifier (RF) and XGBoost (XGB), in such a way that the edge node processes a small amount of data in a given timeframe and updates its inferences at regular intervals. The dataset consists of the following parameters: timestamp, demand, flow, pressure and leakage measurement, which we collected from a sensor to achieve the following output.
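
A hedged sketch of that stage-3 experiment is shown below: it trains the four classifiers mentioned above on the pipe-leakage data and compares their accuracy. The file name pipe_leakage.csv and the exact column names are assumptions based on the description; swap in the real dataset to reproduce the results in the table.

```python
# Sketch: compare SVC, Decision Tree, Random Forest and XGBoost on leakage data.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from xgboost import XGBClassifier

df = pd.read_csv("pipe_leakage.csv")            # hypothetical file name
X = df[["demand", "flow", "pressure"]]          # assumed feature columns
y = df["leakage"]                               # assumed label: 0 = normal, 1 = leak
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

models = {
    "SVC": SVC(),
    "Decision Tree Classifier": DecisionTreeClassifier(),
    "Random Forest Classifier": RandomForestClassifier(),
    "XGBoost": XGBClassifier(),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {acc:.2%}")
```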

Pipe Leakage Detection Results

Algorithm Model              Accuracy
SVM                          84.10%
Decision Tree Classifier     90.23%
Random Forest Classifier     95.01%
XGBoost                      95.35%

In the next section of the series, we will review more of the technologies for building distributed deep neural networks (DNNs) for training at the edge level. For some time, the cloud is going to provide the necessary support for data intelligence at the edge, which can help reduce end-to-end transmission latency, computational load and energy consumption compared to a purely local execution approach; but as the hardware and software architecture of Edge AI improves, data intelligence is going to move swiftly from analytics in the cloud to the devices.

To know more about the architecture pattern for the above areas, please feel free to contact us at arvind.ramachandra@acsicorp.com

Contributors

Arvind Ramachandra

Vice President of Technology & Cloud Services

arvind.ramachandra@acsicorp.com

LinkedIn

Munish Singh

Deputy Manager, Research & Advisory AI/ML

munish.singh@acsicorp.com

LinkedIn

Get in touch with us for feasibility studies, joint workshops, pilots and ideation sessions.












