The Internet of Things (IoT) is already part of our lives through smartphones, smart TVs, and devices like the Echo Dot. But how long will IoT be able to meet market needs? What is the next step for this technology?
The biggest challenge of IoT is to maximize its potential and extract even more value from its applications. A great opportunity for this technology lies in its ability to collect data through intelligent devices. But to add value to data collection, it is necessary to apply Artificial Intelligence and Machine Learning to generate analysis and decision-making forecasts.
So, to empower this technology, it is possible to combine Artificial Intelligence (AI) with the Internet of Things (IoT), creating the AIoT. However, the architecture used for IoT applications may not be able to support the AIoT structure, which brings us to the EdgeAI concept.
AIoT: extracting the real value from IoT technology
In AIoT applications, IoT devices transfer the collected data over a WiFi connection to a cloud service, which processes the data using AI. A practical example of AIoT is video sensing, which integrates image processing and computing into IoT networks. The cameras in the environment capture the images (input) and transfer them to the IoT gateway, which transmits them to the cloud service, where image recognition is performed using Deep Learning (output). Figure 1 illustrates the architecture of this application.
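The camera-to-gateway-to-cloud flow above can be sketched as a minimal simulation. All names here (capture_frame, gateway_forward, cloud_inference) are illustrative stand-ins, not a real IoT API; the "model" simply returns a fixed label where a trained network would run.

```python
# Hypothetical sketch of the cloud-based AIoT video-sensing pipeline:
# camera (input) -> IoT gateway -> cloud service -> Deep Learning model (output).

def capture_frame(camera_id: int) -> dict:
    """Simulate a camera producing a frame of raw pixel data."""
    return {"camera": camera_id, "pixels": [0] * 64}

def gateway_forward(frame: dict) -> dict:
    """The IoT gateway adds routing metadata and forwards the whole raw frame upstream."""
    return {**frame, "route": "cloud"}

def cloud_inference(frame: dict) -> str:
    """Stand-in for a Deep Learning model running in the cloud service."""
    # A real deployment would run a trained network here; we return a fixed label.
    return "person" if frame["route"] == "cloud" else "unknown"

label = cloud_inference(gateway_forward(capture_frame(camera_id=1)))
```

Note that the full raw frame travels end to end before any decision is made, which is exactly where the latency discussed below comes from.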
In this model, computing power and data aggregation are concentrated in the cloud service, where AI powers real-time decisions and generates predictions. However, this structure may affect performance: the farther the cloud service is from the IoT device, the longer the data transfer takes and the more latency is introduced. Latency is critical in this application because it directly affects real-time decision-making, the most valuable outcome of data collection. It is in this context that the concept of EdgeAI was introduced, a new approach to solving the latency issue.
EdgeAI: is Edge Computing the way forward for AIoT?
The purpose of EdgeAI is to use Edge Computing to bring AI processing close to the device. This technology allows machine learning and deep learning to run on the edge, which enables new application opportunities. In an AIoT system with EdgeAI, data processing occurs on the IoT device instead of the cloud service, eliminating the data transfer and solving the latency issue. Figure 2 illustrates the same video-sensing example seen earlier, but now with the EdgeAI architecture divided into layers.
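The same video-sensing pipeline with EdgeAI can be sketched as follows, under the assumption of a compact on-device model. The function names are hypothetical; the point is that inference happens locally and only a small result, not the raw frame, leaves the edge.

```python
# Hypothetical sketch of the video-sensing pipeline with EdgeAI:
# inference runs on the device, so only a small label leaves the edge,
# instead of the full raw frame.

import json

def edge_inference(pixels: list) -> str:
    """Stand-in for a compact model (e.g. a quantized network) on the device."""
    return "person" if sum(pixels) >= 0 else "unknown"

frame = [0] * 64                        # raw frame stays on the device
label = edge_inference(frame)           # local, low-latency decision
payload = json.dumps({"label": label})  # only this summary is sent to the cloud

# The payload is far smaller than the serialized raw frame would be.
saved = len(json.dumps(frame)) - len(payload)
```

This also shows the trade-off mentioned below: the raw frame is discarded at the edge, so the cloud never receives it for further analysis.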
As we can see, Edge Computing enables the collection, storage, and analysis of data on IoT devices, supporting real-time decisions outside the cloud. The application gains real-time responses, increased security, and reduced computational cost. However, EdgeAI can also bring some limitations. Edge computing requires more hardware and peripheral devices. Furthermore, it can limit data processing: only a specific dataset can be analyzed at the edge, raw data is discarded, and only the necessary information is sent to the cloud. This is why organizations are advised to make a prior assessment to decide which structure best supports their AIoT application.
Cloud or Edge Computing?
Before deciding on the best AIoT architecture, it is important to consider the application requirements. First, it is necessary to understand whether the collection and analysis of the data will drive real-time decisions. If so, as mentioned earlier, EdgeAI operates under better conditions for real-time decision-making. But if real-time decisions are not required and some transmission latency is acceptable, a cloud service can be used.
Also, if large amounts of raw data need to be sent for analysis in the cloud, edge computing is not recommended, because it limits this purpose. A cloud service also enables greater autonomy for the control and governance of collected data. Finally, both infrastructures offer advantages and benefits, so it is important to choose the architecture that brings the best performance to the AIoT application and extracts the maximum potential from this technology.
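The criteria above can be condensed into a small, purely illustrative decision helper. The function and its parameters are hypothetical, and real assessments weigh many more factors (cost, security, governance), but it captures the two questions the article raises.

```python
# Hypothetical decision helper encoding the criteria above:
# real-time requirements favor the edge; heavy raw-data analysis favors the cloud.

def choose_architecture(needs_real_time: bool, needs_raw_data_in_cloud: bool) -> str:
    """Return a suggested AIoT architecture for the given requirements."""
    if needs_real_time and not needs_raw_data_in_cloud:
        return "edge"
    if needs_raw_data_in_cloud and not needs_real_time:
        return "cloud"
    # Conflicting or open requirements call for the prior assessment
    # recommended above rather than a default choice.
    return "assess both"
```

For example, a video-sensing application that must react instantly and can discard raw frames would map to `choose_architecture(True, False)`, i.e. the edge.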
Reference used in this article:
H. Li, K. Ota and M. Dong, “Learning IoT in Edge: Deep Learning for the Internet of Things with Edge Computing,” in IEEE Network, vol. 32, no. 1, pp. 96–101, Jan.-Feb. 2018, doi: 10.1109/MNET.2018.1700202.