Deploying Intelligence on the Edge with Docker

The use of Docker containers to deploy applications on the Edge is becoming increasingly widespread. The rise of technologies such as Artificial Intelligence and Machine Learning brought to the Edge (one of the technology trends of 2022, according to surveys such as Red Hat's) is accelerating the adoption of Docker as the de facto standard for encapsulating and deploying intelligent applications.

However, managing deployments at scale requires tools that orchestrate the entire process in a simple, centralised and secure way.

Docker as a Software Development Enabler

Docker is a technology that allows applications to be developed and tested quickly and programmatically in any language by encapsulating them in standardised units called containers. These containers include everything needed for the application to run, including libraries, system tools, code and runtime.
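
As a hedged illustration, the following sketch uses the Docker SDK for Python (docker-py) to build and run such a container programmatically; the image tag edge-app:1.0 and the presence of a Dockerfile in the working directory are illustrative assumptions, not part of any particular product.

```python
# Minimal sketch with the Docker SDK for Python (docker-py).
# Assumes a Dockerfile describing the application lives in the current directory.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Build an image that bundles the code, runtime, libraries and system tools.
image, build_logs = client.images.build(path=".", tag="edge-app:1.0")

# Run the application; everything it needs travels inside the image.
container = client.containers.run("edge-app:1.0", name="edge-app", detach=True)
print(container.id, container.status)
```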

Docker brings enormous flexibility to application development teams: it gives them the freedom to develop applications in the programming language of their choice and to deploy them easily, without worrying about the hardware they will ultimately run on. It also simplifies maintenance and version control of the applications themselves, which is very useful for homogeneous deployments in distributed environments.

"This is precisely the defining characteristic of the Edge: these are distributed environments in which multiple devices, often with different hardware, run applications.
We recommend reading: Why everyone is talking about Docker in the Industrial IoT

Edge Computing, Artificial Intelligence and the use of Docker

Edge computing is a computing model in which data is processed at the edge of the network, that is, at nodes much closer to where the data is captured.

This model is expanding across industrial sectors and is driven by another technology trend that analysts see gaining momentum in 2022 as part of industrial digitisation strategies: Artificial Intelligence at the Edge (known as Edge AI).

This approach focuses on deploying algorithms close to where the data they consume originates, using Edge nodes that are installed and connected locally to the data sources themselves.

Sectors such as Electricity Distribution and the Water Industry are digitally transforming a large part of their business, and Edge AI is an enabler of that transformation. It is in these environments that Docker has become increasingly widespread for developing and deploying data exploitation algorithms.

Download the Industrial Edge Computing Barometer

However, working with Docker in distributed and often remote environments, such as those proposed by Edge Computing or Edge AI models, requires tools that allow control of the entire lifecycle of the edge nodes and the intelligence running on them.

How to work with Docker in Distributed Environments like the Edge

To speed up the work of deploying and running Docker on the Edge at scale, it is essential to have tools that allow at least the following actions to be performed securely (a minimal sketch of these operations follows the list):

a) deploy Docker containers on one or multiple Edge nodes at a time;

b) update the applications running on those devices at will; and

c) know what happens throughout the whole process, by means of log display screens.
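
As a rough illustration of what these operations involve (and not a description of how any particular platform implements them), the following sketch uses the Docker SDK for Python against remote Docker daemons. The node addresses, image tag and container name are hypothetical, and SSH access to each node's Docker daemon is assumed.

```python
# Illustrative sketch of operations a) to c) with the Docker SDK for Python.
# Node addresses, image tag and container name are hypothetical; SSH access
# to each node's Docker daemon is assumed.
import docker
from docker.errors import NotFound

NODES = ["ssh://operator@edge-node-01", "ssh://operator@edge-node-02"]
IMAGE = "registry.example.com/edge-app:2.1"

for node in NODES:
    client = docker.DockerClient(base_url=node)  # one Docker daemon per Edge node

    # a) / b) deploy or update: pull the new image and recreate the container
    client.images.pull(IMAGE)
    try:
        old = client.containers.get("edge-app")
        old.stop()
        old.remove()
    except NotFound:
        pass  # first deployment on this node
    container = client.containers.run(
        IMAGE,
        name="edge-app",
        detach=True,
        restart_policy={"Name": "always"},
    )

    # c) see what happened: print the most recent log lines from the container
    print(node, container.logs(tail=20).decode())
```

In practice, a centralised platform adds authentication, auditing and fleet-wide rollout policies on top of this kind of per-node interaction.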

At Barbara IoT, we have developed a platform that, among other things, includes a module for the cyber-secure governance of all the distributed intelligence on Edge nodes, making it easier to deploy, debug and update the applications that Data Science teams have developed.

In addition, the Barbara platform includes a set of additional features aimed at simplifying the management of large fleets of distributed Edge nodes.

If you are working on deploying applications on the Edge in industrial environments and would like us to show you how our platform can help you in the process, don't hesitate to ask us for a demo.