Many competing paradigms have emerged to keep pace with the advancement of new technologies. Today's 4.0 paradigms can transform industrial (and non-industrial) machinery across a wide range of innovative application scenarios.

Sustainability and efficiency are the keywords of the change that IoT solutions are bringing to various industrial sectors, through constant refinement of the algorithms developed and tested by research departments.

We are talking, for example, about Edge Computing, one of the most popular paradigms in the IoT sector of Industry 4.0.

Are there opportunities for integration of Cloud and Edge?

To remain competitive, companies need to constantly update their production lines with the latest technologies, prioritizing the industrial department’s needs. During the design phase of IoT technologies, particular attention must be paid to:

  • Monitoring assets from sensor data in both greenfield and brownfield scenarios;
  • Archiving and subsequently analyzing the extracted data;
  • Choosing corrective actions and optimizing the production process in real time based on these analyses;
  • Optimizing the parameters of individual machines, remotely, to improve their operation.

A purely centralized computing model would not be able to meet all of these needs, and would run into various problems, especially with traffic, latency, and security.

By bringing intelligence to the machine, however, using hardware that performs much of the necessary computation directly on the sensor data, less data is sent to the cloud: only what is needed for broader analyses where the time factor is less significant.
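As an illustration, an edge node can aggregate raw sensor readings locally and forward only compact summaries to the cloud. The following is a minimal sketch, not a production design; the window size and the shape of the summary are assumptions for the example.

```python
import statistics

def summarize_window(readings):
    """Reduce a window of raw sensor samples to the summary the cloud needs."""
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
        "min": min(readings),
    }

def edge_filter(raw_stream, window_size=100):
    """Yield one compact summary per window instead of forwarding every sample."""
    window = []
    for sample in raw_stream:
        window.append(sample)
        if len(window) == window_size:
            yield summarize_window(window)
            window = []

# 1,000 raw samples become just 10 cloud-bound messages.
raw = [20.0 + (i % 7) * 0.1 for i in range(1000)]
summaries = list(edge_filter(raw))
print(len(summaries))  # 10
```

The raw stream never leaves the plant; only the ten summaries would cross the network, which is exactly the traffic reduction described above.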

Interconnection between cloud platforms and edge platforms is possible.

A practical example: the case of a vibration sensor placed on an industrial machine.

If the goal is to detect a possible failure in advance from vibration patterns, the raw sensor data are analyzed and an alert is generated whenever a deviation from the standard intervals predicted by the model is detected.

In the centralized model, the vibration data would be transferred to the cloud and fed to an analysis tool there, and the result would be sent back from the cloud to the asset as an alert, possibly triggering an alarm signal to inform the operator of the anomaly.

In the edge computing model, however, since the vibration data are local to the asset, the computation is performed on a device located in the industrial plant itself, which makes raising the alarm faster and cheaper.

Figure 1. An edge device with a powerful database implemented on multiple use cases on the shop floor.

Cloud traffic therefore drops to zero, and latency depends only on the performance of the local network. In addition, the edge computing device can store the vibration data, building a vibration history that can be consulted over time.

But not all scenarios are so clear-cut

For example, one can imagine monitoring biogas production at a waste management site in real time, optimizing production based on energy prices and the state of all the wells. In this setup, the temperature, humidity, biogas, and oxygen level data for each well are local to the site.

However, actions to improve production must be carried out on the basis of external data (e.g. forecasts, energy costs, pipeline efficiency, etc.). In this case, the analytics pipeline is not completely local and could be conveniently placed in the cloud.

Yet, applying edge computing principles can still reduce costs and latency, thus improving reliability.

Each well, in fact, can be monitored locally by an edge device that stores all the readings and acts immediately on the well valves if the oxygen level rises above the alarm threshold.
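A minimal sketch of such a local well monitor follows. The threshold value, reading format, and the valve interface are assumptions for illustration; a real device would drive an actual actuator and sync its stored history to the cloud pipeline described above.

```python
# Hypothetical alarm threshold: percent oxygen in the biogas stream.
O2_ALARM_THRESHOLD = 2.0

class WellMonitor:
    """Local monitor for one well: stores readings, closes the valve on alarm."""

    def __init__(self, threshold=O2_ALARM_THRESHOLD):
        self.threshold = threshold
        self.history = []       # local store of every reading
        self.valve_open = True

    def on_reading(self, temperature, humidity, biogas, oxygen):
        self.history.append((temperature, humidity, biogas, oxygen))
        # Act locally, without waiting for a cloud round-trip.
        if oxygen > self.threshold and self.valve_open:
            self.valve_open = False  # stand-in for the real valve actuator
        return self.valve_open

well = WellMonitor()
well.on_reading(38.0, 60.0, 55.0, 0.8)  # normal reading: valve stays open
well.on_reading(38.2, 61.0, 54.0, 2.6)  # oxygen spike: valve closes at once
print(well.valve_open)    # False
print(len(well.history))  # 2
```

Only the stored history needs to travel to the cloud for the broader optimization; the safety-critical valve action stays entirely local.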

Towards sustainability 4.0

Edge Computing offers many advantages, and the environmental benefits that a paradigm like this can bring should not be underestimated:

  • Less overheating: centralized processing and storage of large volumes of data generates significant heat. Edge Computing avoids this problem by distributing data processing tasks across individual devices.
  • Less data transport: decentralizing data processing reduces the energy consumption that transporting and archiving data inevitably cause.
  • Reduced latency: faster, more efficient communication also means sustainability, since there is no need to wait for responses from physically distant data centers.
  • Synergy: a balanced mix of edge computing and cloud computing lets you use both tools in concert, improving energy consumption and using data communication strategically.

At a time when environmental awareness and technologies for better consumption management are widely discussed, the proper use of innovative paradigms is an advantage that should not be underestimated.

What is the Opportunity and Future of 5.0?

Companies are thus able to take the optimization of industrial processes to more advanced levels, where machines and plants can be monitored and controlled with minimal latency at the edge, while continuing to extract value from aggregate analyses in the cloud.

To date, the opportunities for integrating the two worlds are within reach, and they will become even more compelling with the introduction of 5G networks, which will add security and ease of deployment.

If you are interested in learning more about how the Edge Computing paradigm works and its potential in 4.0, you can read more in our white paper.