Main Edge AI trends to watch

Edge AI has emerged as a game-changing technology for the industrial world. Industries with highly distributed critical assets will be the biggest beneficiaries of taking advanced computing to the Edge.

Technology

Since it was established as a field of study in 1956, Artificial Intelligence has gone through periods of optimism and pessimism in equal measure. There is no doubt that today we are witnessing one of great optimism.

Data Science is the third most sought-after job worldwide (in fact, in our recent study on the State of Edge Computing in Spain, Data Scientist is the most in-demand profile among Spanish companies), in a market that is experiencing exponential growth and is expected to reach 190 billion dollars by 2025.

Such is the prominence of AI in the market that it no longer makes sense to speak of it as a single technology, but rather as many branches that serve different uses and different industries.

This is perfectly illustrated by Gartner's Hype Cycle for 2021.

The rise of Edge AI

Among the trends identified as the most mature and closest to the production stage are the ones we recognise from our daily routine: the natural language processing we use when we talk with increasingly human-like chatbots, the computer vision that makes it possible to automate real-time video processing, and semantic search, which leads to better search results.

At the other extreme are more futuristic technologies that will not mature for at least 10 years. Some interesting examples are AI TRiSM (Trust, Risk and Security Management) technologies, which make it possible to govern AI models and make them more resilient to security and privacy attacks, and Transformers, which make it possible to adapt AI models to their context and will have a great impact on applications such as translation, automatic document creation, or the analysis of biological sequences.

Between the two extremes are other enabling technologies that will take 2 to 5 years to reach market maturity, which we can call the "near future of AI". Among these are human-centred AI, generative AI, the orchestration and automation of AI and, leading all the others on the maturity curve, AI at the Edge, also known as "Edge AI".

Edge AI and the distributed intelligence revolution for the industrial world

Edge AI can be summed up as the ability to execute artificial intelligence algorithms on devices (IoT devices, edge devices) located very close to the data source. This technology is growing because of a daunting statistic: more than 60% of industrial organisations do not have a Cloud infrastructure in place that helps them innovate efficiently.
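To make the idea concrete, here is a minimal sketch of what "executing AI at the edge" can look like in practice: a pre-trained model running on a gateway, processing local sensor readings so that only the result, not the raw data, ever leaves the device. The model file name and the sensor-reading helper are illustrative assumptions, not part of any specific product.

```python
# Minimal sketch: running a pre-trained model directly on an edge gateway,
# close to the data source, instead of streaming raw data to the cloud.
import numpy as np
from tflite_runtime.interpreter import Interpreter  # lightweight inference runtime for edge devices

interpreter = Interpreter(model_path="anomaly_detector.tflite")  # hypothetical model file
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

def read_local_sensors() -> np.ndarray:
    """Placeholder for reading values from locally attached sensors."""
    n_features = int(input_details[0]["shape"][1])
    return np.random.rand(1, n_features).astype(np.float32)

# Inference happens on the device itself; only the score needs to be shared.
features = read_local_sensors()
interpreter.set_tensor(input_details[0]["index"], features)
interpreter.invoke()
score = interpreter.get_tensor(output_details[0]["index"])
print("Anomaly score computed locally:", float(score.flatten()[0]))
```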

Top 5 most disruptive trends in Edge AI

1. Critical industries will be major drivers: from SCADA to Edge AI

At Barbara we are finding repeated patterns in the industries at the forefront of Edge AI. All of them manage large numbers of critical distributed assets. In other words, they are industries that face great challenges in terms of technological fragmentation, scalability and cybersecurity, challenges that can be minimised by executing AI algorithms at the edge. We can forecast that these industries, already familiar with concepts like IoT Edge and Edge Computing, will develop very ambitious and transformative use cases.

The SCADA systems in use since the 1980s serve similar purposes in terms of data collection and processing. However, SCADA systems need to be complemented by more modern technologies so that they can meet increasingly demanding requirements for interoperability, openness and security; this is where Edge AI can help and multiply the value of these systems.

2. Thin Edge will complement Thick Edge

There are different interpretations of what "edge" means when we refer to Edge AI. The Edge has depth. Traditionally, the edge has been identified with the network operator infrastructure closest to the user. When we talk about 5G networks, we refer to operators rolling out a multitude of nodes called "Multi-access Edge Computing" (MEC) nodes, used for processing data close to where it is generated. These nodes are installed on servers very similar to those found in a data centre designed to host cloud services, and they have both the potential and the resources to process complex AI algorithms. This is what some analysts call the "Thick" Edge.

However, there is another type of Edge node, directly connected to sensors and switches, installed on low-power devices such as gateways or concentrators, which runs simpler AI algorithms with response times very close to real time. This new type of Edge, called the "Thin" Edge, is very well suited to rapidly tackling large-scale projects involving remote locations or high security requirements.

3. Edge Mesh as the new paradigm to enable distributed Artificial Intelligence

Edge AI is traditionally based on decision models that are trained using large volumes of data. The model, consisting of a series of mathematical formulas, is installed on Edge nodes. From there, each node is able to make its own decisions depending on the data it receives and the model that has been installed.

The new paradigm, known as Edge Mesh, makes it possible for a node's decision to be conditioned by other nodes, as if it were a lattice network. A good example for understanding the power of this new architecture is a smart traffic system.

An Edge node can decide the timing of a traffic light using AI algorithms that take into account the number of cars and people detected by sensors. However, this decision could be usefully complemented by the decisions being made by other nodes in nearby streets.

The aim of Edge Mesh is to distribute intelligence amongst various nodes in order to offer better performance, response times and fault tolerance than more traditional architectures.
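The following sketch illustrates the traffic-light example above: a node first makes its own decision from local sensor data, then refines it using decisions shared by neighbouring nodes. The node names, the local decision rule and the congestion-weighted blending are assumptions chosen for clarity, not a description of any particular Edge Mesh implementation.

```python
# Illustrative Edge Mesh sketch: one node blends its local decision with
# the decisions published by nearby nodes, weighted by their congestion.
from dataclasses import dataclass
from typing import List

@dataclass
class NodeDecision:
    node_id: str
    green_seconds: float   # green-light duration proposed by that node
    congestion: float      # 0.0 (empty street) .. 1.0 (gridlock)

def local_decision(cars: int, pedestrians: int) -> float:
    """Stand-in for the local AI model: longer green time when more cars are detected."""
    return min(90.0, 10.0 + 2.0 * cars - 0.5 * pedestrians)

def mesh_decision(local_green: float, neighbours: List[NodeDecision]) -> float:
    """Condition the local decision on neighbouring nodes' decisions."""
    if not neighbours:
        return local_green
    total_weight = 1.0 + sum(n.congestion for n in neighbours)
    weighted = local_green + sum(n.green_seconds * n.congestion for n in neighbours)
    return weighted / total_weight

# The node decides on its own first, then adjusts using the mesh.
my_green = local_decision(cars=14, pedestrians=3)
nearby = [NodeDecision("junction-2", 55.0, 0.8), NodeDecision("junction-3", 20.0, 0.2)]
print("local:", my_green, "-> mesh-adjusted:", round(mesh_decision(my_green, nearby), 1))
```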

4. Lifecycle management using MLOps is increasingly important

As industry moves towards rolling out Edge AI with more distributed nodes and more complex training algorithms, the ability to manage the lifecycle of these trained models, and of the devices that execute them, will be key to the future of this technology.

In this sense, projects and companies that apply the DevOps philosophy to the development, roll-out and maintenance of AI algorithms will have an advantage. This approach is called MLOps, a combination of Machine Learning and DevOps.

But what is it exactly? MLOps aims to reduce the development, testing and deployment times of Edge AI models through the continuous integration of teams, development environments, testing and operations.
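As a rough illustration of the idea applied to the edge, the sketch below shows an edge node periodically checking a model registry and only swapping in a new model version after a validation gate. The registry URL, response fields and accuracy threshold are all hypothetical assumptions, not the API of any specific MLOps product.

```python
# Hedged MLOps sketch for Edge AI: pull a new model version from a registry
# only when it differs from the deployed one and passes a validation gate.
import json
import urllib.request

REGISTRY_URL = "https://registry.example.com/models/anomaly_detector"  # hypothetical endpoint
DEPLOYED_VERSION_FILE = "deployed_version.json"

def deployed_version() -> str:
    """Read the version currently running on this edge node."""
    try:
        with open(DEPLOYED_VERSION_FILE) as f:
            return json.load(f)["version"]
    except FileNotFoundError:
        return "none"

def latest_version() -> dict:
    """Ask the (hypothetical) registry which model version is current."""
    with urllib.request.urlopen(f"{REGISTRY_URL}/latest") as resp:
        return json.load(resp)

def validate(metrics: dict, min_accuracy: float = 0.9) -> bool:
    """Gate the rollout: only accept models that pass offline evaluation."""
    return metrics.get("test_accuracy", 0.0) >= min_accuracy

def maybe_update() -> None:
    info = latest_version()
    if info["version"] != deployed_version() and validate(info.get("metrics", {})):
        # Download the new model artifact and record the deployed version.
        urllib.request.urlretrieve(info["artifact_url"], "model.tflite")
        with open(DEPLOYED_VERSION_FILE, "w") as f:
            json.dump({"version": info["version"]}, f)
```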

5. Edge AI enables Sovereign Data Exchange

There is no doubt that data sharing will be paramount for improving processes in industry sectors with many stakeholders within the value chain.

Let's look at the electricity grid model of the near future: the Smart Grid. To be able to receive or offer a better service, it is essential for suppliers to be able to analyse and process information from a number of stakeholders such as prosumers, operators, distributors, aggregators, etc. Without a transparent, agile data exchange it will be impossible to reach the grid optimisation required by 2050.

Edge AI makes decentralised data processing possible, which will help overcome some of the obstacles the industry is currently facing, such as data security, privacy and sovereignty.

Interested in Edge AI? Check our latest blog post: 2023, the year of Edge AI