Machine Learning at the Edge and MLOps: How to Deploy It Successfully

We are still at the dawn of Machine Learning and Artificial Intelligence in industry. Yet, as we envision new use cases and develop them in our environments, we realise that future success depends on proper implementation today.


MLOps: agility as a cornerstone for machine learning growth

A good example is the use of Edge Computing applications in industry. With more and more distributed edge nodes executing increasingly complex AI-based algorithms, an edge infrastructure is needed that can manage the lifecycle of trained models, as well as the devices that run them.

In this sense, projects that apply the DevOps philosophy to the development, deployment and maintenance of AI algorithms are more likely to prosper. This approach is what we call MLOps, short for Machine Learning Operations: a combination of Machine Learning and DevOps.

It is highly challenging to automate and operationalise ML products, and as a result many ML endeavours fail to deliver on their expectations. The paradigm of Machine Learning Operations (MLOps) addresses this issue.

MLOps is a core function of Machine Learning engineering, focused on streamlining the process of taking machine learning models to production, and then maintaining and monitoring them.

However, the adoption of MLOps philosophy faces a number of challenges:

1) Boards of directors do not always see Machine Learning as a strategic point of the company and perceive this type of project as something difficult to measure and manage.

2) Machine Learning initiatives often work in isolation from each other, making it difficult to integrate processes across teams.

3) To achieve efficiency, model training requires large amounts of quality data, which generates significant costs in data accessibility, preparation and management.

4) Data science involves a lot of trial and error, which makes it difficult to plan a project's timeline.

MLOps is about breaking away from slow, linear practices to transform development processes into rapid, continuous iteration, allowing developers to constantly create and deploy innovative solutions.

It is based on DevOps principles and practices. Built on the foundations of efficiency, continuous integration, delivery and deployment, DevOps meets the needs of the agile enterprise to deliver innovation at scale.

MLOps is based on the following premises inherited from DevOps:

  • The user as the centre of everything.
  • Connecting data and services: Success depends on how well existing or new data platforms and services can be integrated, adapting to the circumstances.
  • Automation: To ensure constant, consistent and efficient delivery of business value.
  • Infrastructure management: Applications are deployed in a flexible infrastructure and services environment at platform level.

Applying these principles enables the delivery of ML-based innovation at scale, resulting in:

  • Faster deployment time to production of ML-based solutions.
  • A faster pace of experimentation, which drives innovation.
  • Quality assurance, reliability and ethical AI.

In short, MLOps enables data science and IT teams to collaborate and increase the pace of model development and deployment by monitoring, validating and governing machine learning models.
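The "validate and govern before deploying" loop described above can be sketched in a few lines. The following is a toy illustration, not a real pipeline: `ModelRegistry`, `promote_if_better` and the metric values are hypothetical stand-ins for the registry and CI/CD tooling an actual MLOps stack would use. The idea shown is the promotion gate: a candidate model only replaces the production model if it beats the current metric by a minimum margin.

```python
# Minimal sketch of an MLOps promotion gate (all names are hypothetical).

def promote_if_better(candidate_metric: float,
                      production_metric: float,
                      min_improvement: float = 0.01) -> bool:
    """Return True when the candidate model should be deployed."""
    return candidate_metric >= production_metric + min_improvement

class ModelRegistry:
    """Toy stand-in for a model registry tracking the production model."""

    def __init__(self):
        self.production = {"version": 1, "metric": 0.90}

    def submit(self, version: int, metric: float) -> str:
        # Gate the deployment on a measurable improvement.
        if promote_if_better(metric, self.production["metric"]):
            self.production = {"version": version, "metric": metric}
            return f"deployed v{version}"
        return f"rejected v{version}"

registry = ModelRegistry()
print(registry.submit(2, 0.89))  # worse than production -> rejected
print(registry.submit(3, 0.93))  # clear improvement -> deployed
```

Automating exactly this kind of check is what lets teams iterate quickly without sacrificing the quality assurance mentioned above.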

Barbara Platform as an enabler for MLOps

In this data scientist-led process, Barbara's Edge Platform makes it easy for data scientists to deploy, start, monitor, stop or update applications and models across thousands of distributed edge nodes. With the Barbara Edge Orchestrator, teams can collaborate and increase the pace of model development and deployment while monitoring, validating and governing Machine Learning models.

With Barbara's Edge Platform they can:

  • Increase productivity and improve operations by reducing variation between model iterations in industrial-grade scenarios, using reproducible models and combining them into models that can be automatically retrained for IoT Edge devices.
  • Automatically scale and deploy applications without code, automating the processes of building and deploying Machine Learning models to edge devices.
  • Easily and rapidly deploy highly accurate and reliable models anywhere, packaging models quickly as Docker containers and releasing them to production in a controlled manner.
  • Effectively manage the entire machine learning lifecycle, benefiting from the interoperability of the platform.

Implementing Machine Learning models at the Edge poses some challenges that Barbara can also help with:

  • IoT Machine Learning models change rapidly, so they need more frequent, automatic re-training. The Barbara Edge Orchestrator lets you download new algorithms to an Edge Node or upgrade existing models to newer versions.
  • IoT Machine Learning models typically rely on a wide variety of devices, with different technologies. Our extensive connector library allows Edge Nodes to connect to any sensor, actuator, PLC or industrial equipment to exchange information and commands.
  • IoT Edge solutions often need to run in different connection environments, so different connectivity standards must be supported. Barbara OS can work with the network that best suits your coverage, battery-consumption and bandwidth needs, offering connectivity through short-range technologies such as Wi-Fi or Zigbee and long-range standards such as 5G. It can also integrate with LPWAN networks such as LoRaWAN.
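To make the first of these challenges concrete, the sketch below shows the kind of update check an edge node could run: compare the locally deployed model version against the latest one published by an orchestrator, and download the newer artifact when one exists. This is a generic, hypothetical illustration; `fetch_latest` and `download` are stand-in callbacks, not Barbara's actual API.

```python
# Hypothetical sketch of an edge-node model update check.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class ModelInfo:
    name: str
    version: int
    url: str

def check_for_update(local: ModelInfo,
                     fetch_latest: Callable[[str], ModelInfo],
                     download: Callable[[str], bytes]) -> Optional[ModelInfo]:
    """Return the newer model's metadata after fetching it, else None."""
    latest = fetch_latest(local.name)
    if latest.version > local.version:
        download(latest.url)  # replace the on-device artifact
        return latest
    return None

# Toy usage with stubbed orchestrator callbacks:
local = ModelInfo("anomaly-detector", 3, "local://anomaly-detector")
latest = ModelInfo("anomaly-detector", 4, "https://registry.example/v4")
updated = check_for_update(local,
                           fetch_latest=lambda name: latest,
                           download=lambda url: b"model-bytes")
print(updated.version if updated else "up to date")  # prints 4
```

In a real deployment, an orchestrator automates this loop across thousands of nodes, which is precisely what makes frequent re-training practical at the Edge.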


As we have seen, MLOps is an essential discipline for any organisation that intends to remain relevant. The competitiveness of industries in the future will be defined by Machine Learning and MLOps. Companies that remain at a lower level of ML maturity will be at a significant disadvantage to those in a position to scale their ML efforts into a real business advantage.

Agility in technology development matters even more in business decisions. Budgets rarely arise specifically to improve process maturity, so the business technology leader is in a unique position to take the initiative and create projects that use machine learning as a catalyst for the company.

If you would like to learn more about how to implement MLOps at the Edge, get in touch.