IoT without Big Data is nothing
When we write about Industrial IoT, at Barbara we compare it to the nervous system of a company, imagining it as a network of sensors that collects valuable information from all corners of a production plant and stores it in a repository for data analysis and exploitation. In these articles, we emphasize the need to measure and obtain data in order to make informed decisions.
But what happens next? What should we do with all that data? We always talk about making good decisions based on reliable information, but although it may sound obvious, it is not always easy to achieve. In this article we will go a bit beyond IoT and focus on the data and how to exploit it.
We’ll talk about the analysis phase, the process that turns data into information first and then into knowledge; what some refer to as business logic. In the end we are not going that far from the area of IoT, because for us IoT without Big Data is meaningless.
Big Data and Data Analytics
In recent decades, especially in the last one, we have witnessed an incredible flood of data (structured and unstructured), mass-produced by the ubiquity of digital technologies. In the particular case of the industrial world, there is great interest in exploiting the value of this huge amount of information.
This need to process business data has given rise to Big Data, Data Science and Data Analytics, which we could define as the processes we follow to examine the data captured by our network of devices, with the aim of revealing hidden trends, patterns or correlations. Always with the underlying idea of improving the business with new types of knowledge.
There are different definitions of Big Data. The one used by Gartner considers three key aspects: the volume of data, its variety, and the velocity with which it is captured. These are the so-called 3Vs, although some extend them to 5Vs, adding the veracity of the data and the value it brings to the business.
We believe, though, that it does not make much sense to get into theoretical disquisitions about what is and what is not Big Data, because nowadays it covers practically everything.
IoT and Big Data
How do IoT and Big Data relate to each other? The main point of contact is usually a database. In general terms, we could say that the work of IoT ends at that database, i.e. its goal is to dump all the acquired data in a more or less orderly manner into a common repository. The domain of Big Data begins at that same repository, going back to the data to extract the information we need.
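To make this handoff concrete, here is a minimal sketch of the idea, using an in-memory SQLite table as a stand-in for the shared repository (the table and column names are illustrative, not any particular product's schema):

```python
import sqlite3

# Hypothetical repository: an in-memory SQLite table standing in for
# the common database where the IoT side dumps its telemetry.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE telemetry (sensor_id TEXT, temperature REAL)")

# IoT side: its work ends here, dumping acquired readings into the repository.
readings = [("s1", 21.5), ("s1", 22.0), ("s2", 19.8), ("s2", 20.2)]
db.executemany("INSERT INTO telemetry VALUES (?, ?)", readings)

# Big Data side: its work starts here, going back to the repository
# to turn raw readings into information (a per-sensor average).
rows = db.execute(
    "SELECT sensor_id, AVG(temperature) FROM telemetry "
    "GROUP BY sensor_id ORDER BY sensor_id"
).fetchall()
for sensor, avg_temp in rows:
    print(sensor, round(avg_temp, 2))
```

In a real deployment the repository would be a time-series or data-lake store and the query a full analytics pipeline, but the division of labor is the same: IoT writes, analytics reads.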
In any case, it is interesting to visualize this Big IoT Data Analytics as a toolbox. A box from which we will draw one tool or another depending on the type of information and knowledge we want to acquire from the data. Many of these tools are traditional algorithms, or improvements or adaptations of them, with very similar statistical and algebraic principles. Many were not even invented in this century, which surprises many people, who wonder why they are relevant now and were not then.
The quick answer is that the volume of data available is now much greater, but above all, that the computing power of today’s machines allows these techniques to be used on a much larger scale, giving, in a way, new uses to old methodologies.
But we do not want to give the impression that everything has already been invented and that the current wave of data analysis has brought nothing new; that is not true. The data ecosystem is very broad and has seen strong innovation in recent years.
One of the fastest-growing areas is Artificial Intelligence. Many will argue that it is not a recent invention, since the phenomenon was being discussed as early as 1956. However, Artificial Intelligence is so broad and its impact so big that it is often considered a discipline in its own right. The reality is that, in some way, it is part of Big Data or Data Analytics: another of those tools that was already in our toolbox and found a natural evolution with AIoT.
AIoT: the Artificial Intelligence of Things
The exponential growth in the volume of data requires new ways of analyzing it. In this context, Artificial Intelligence becomes particularly relevant. According to Forbes, the two main trends that are dominating the technology industry are precisely the Internet of Things (IoT) and Artificial Intelligence.
IoT and AI are two independent technologies that have a significant impact on multiple verticals. While IoT is the digital nervous system, AI becomes an advanced brain that makes the decisions that control the overall system. According to IBM, the true potential of IoT will only be achieved through the introduction of AIoT.
But what is Artificial Intelligence and how is it different from conventional algorithms?
We usually speak of Artificial Intelligence when a machine (Artificial) mimics the cognitive functions (Intelligence) of humans. That is, it solves problems in the same way a human would. Put another way, a machine is able to find new ways of understanding data, new algorithms to solve complex problems, without the programmer (and this is the key) knowing them beforehand, i.e. without the programmer explicitly programming them. So we could think of Artificial Intelligence, and in particular Machine Learning (the branch with the greatest projection within AI), as algorithms that invent algorithms.
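The "algorithms that invent algorithms" idea can be shown with the simplest possible learner. In this illustrative sketch the programmer never writes the rule y = 2x + 1; the machine recovers it from examples alone via ordinary least squares:

```python
import numpy as np

# Observed data generated by a hidden rule (y = 2x + 1). A conventional
# program would hard-code this rule; here it is only implicit in the data.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0

# Fit y ≈ a*x + b without telling the program what a and b are:
# the least-squares solver "discovers" the coefficients from examples.
A = np.column_stack([x, np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
print(round(a, 3), round(b, 3))  # the recovered rule: y = 2x + 1
```

Modern Machine Learning replaces this two-parameter line with models holding millions of parameters, but the principle is the same: the program learns the mapping rather than having it programmed in.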
More on Artificial Intelligence: The State of AI in 2020 by McKinsey
Edge AI and Cloud AI
The combination of IoT+AI brings us AIoT (Artificial Intelligence of Things), intelligent and connected systems that are able to make decisions on their own, evaluate the results of these decisions and improve over time.
This combination can be done in several ways, of which we would like to highlight two:
1. On the one hand we could continue to speak of that brain as a centralized system that processes all impulses and makes decisions. In this case we would be referring to a system in the cloud that centrally receives all telemetry and acts. It would be Cloud AI (Artificial Intelligence in the Cloud).
2. On the other hand, we must also talk about a very important part of the nervous system: reflexes. Reflexes are autonomous decisions that the nervous system makes without the need to send all the information to the central processor (the brain). These decisions are made in the periphery, close to the source where the data originated. This is called Edge AI (Artificial Intelligence at the Edge).
Related article: IoT Edge Computing, edge nodes and industrial use cases
Edge AI and Cloud AI Use Cases
Cloud AI provides a thorough analysis process that takes into account the entire system, whereas Edge AI gives us speed of response and autonomy. But as with the human body, these two ways of reacting are not mutually exclusive, but can be complementary.
As an example, a water control system can block a valve in the field the moment it detects a leak, to prevent major water losses, and, in parallel, notify the central system, where higher-level decisions can be made, such as opening alternative valves to channel the water through another circuit.
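A minimal sketch of that reflex, with hypothetical names and a made-up threshold: the edge node acts locally the moment it sees a leak and, in parallel, queues a notification for the cloud, where the higher-level decisions are made:

```python
# Assumed flow-loss threshold (liters/min) above which we call it a leak.
LEAK_THRESHOLD = 5.0

def edge_reflex(flow_in: float, flow_out: float, cloud_queue: list) -> str:
    """Runs on the edge device next to the valve (illustrative sketch)."""
    loss = flow_in - flow_out
    if loss > LEAK_THRESHOLD:
        # Reflex: act immediately, without waiting for the cloud.
        action = "valve_closed"
        # In parallel, notify the central system so it can take
        # higher-level decisions (e.g. opening an alternative circuit).
        cloud_queue.append({"event": "leak", "loss": loss, "action": action})
        return action
    return "valve_open"

cloud_queue = []  # stands in for the uplink to the Cloud AI side
print(edge_reflex(flow_in=100.0, flow_out=90.0, cloud_queue=cloud_queue))
```

The point of the split is latency and autonomy: the valve closes even if the uplink is down, while the cloud still receives the full picture for system-wide decisions.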
The possibilities are endless, and can go beyond this simple example of «reactive» maintenance, making it possible to predict events in advance and thus enable «predictive» maintenance.
Another example of AIoT can be found in Smart Grids, where smart devices at the edge analyze the electricity flows at each node and make load-balancing decisions locally, while, in parallel, they send all this data to the cloud for processing at a more national scale. Analysis at a macroscopic level would allow load-balancing decisions to be made at a regional level, or even electricity production to be increased or decreased, by shutting down hydroelectric plants or launching a power purchase from a neighboring country.
Related article: Edge Computing use case in the Electricity Sector
Research output on Edge Computing has exploded in recent years, going from about 2,000 papers annually to more than 25,000, according to Google Scholar. Furthermore, the International Data Corporation (IDC) predicts that the Edge Computing market is set to double in the next 4 years. According to Kevin Scott, CTO of Microsoft, edge intelligence is proving to be the last mile in the convergence of the digital and physical worlds.
At Barbara IoT we are enablers of both Big Data Analytics and Artificial Intelligence, in the Cloud and at the Edge. Request a personalized demo to learn how we can equip your company with the intelligent nervous system that will take it to the next level.