Simple, open and standardised cloud connectivity

June 8th, 2016, Published in Articles: EngineerIT


As information technology and automation technology continue to converge, cloud-based communication and data services are increasingly used in industrial automation projects. Beyond the scope of conventional control tasks, applications such as big data, data mining and condition or power monitoring enable the implementation of superior, forward-looking automation solutions.

Industry 4.0 and IoT strategies place strict requirements on the networking and communication capabilities of devices and services. Viewed from the traditional communication pyramid (Fig. 1), large quantities of data must be exchanged between field-level sensors and higher-level layers in these implementations. However, horizontal communication between PLC control systems also plays a critical role in modern production facilities. PC-based control technologies provide universal capabilities for horizontal communication and have become an essential part of present-day automation projects for exactly this reason. Products such as the TwinCAT IoT engineering and control software provide an ideal foundation for Industry 4.0 concepts and IoT communication. Moreover, new IoT-compatible I/O components enable easy-to-configure, seamless integration into public and private cloud applications.

Fig. 1: Communication pyramid.


Definition of business objectives for increasing the competitive edge

Industry 4.0 and IoT applications do not start with the underlying technology; in reality, the work begins much earlier. When implementing IoT projects, it is critically important to first examine the corporate business objectives and establish the benefits a company stands to gain from such projects. From an automation provider's perspective, two distinct customer categories can be defined: machine manufacturers and their end customers – in other words, the end users of the automated machines.

In the manufacturing sector in particular, there is an obvious interest in reducing in-house production costs, both through efficient and reliable production control and also by reducing the number of rejects produced. The traditional machine manufacturer pursues very similar objectives, and above all is interested in reducing the cost of the machine while maintaining or even increasing production quality. Optimising the machine’s energy consumption and production cycles, as well as enabling predictive maintenance and fault diagnostics, can also be rewarding goals. The last two points in particular offer the machine manufacturer a solid basis to establish services that can be offered to end customers as an additional revenue stream. Of course, what both customer categories ultimately want is for the machine or product to be designed more attractively and to increase competitiveness in the marketplace.

Collecting, aggregating and analysing process data

The process data generated during production provides the foundation for creating added value and for achieving the above-mentioned business objectives. This includes machine values that are recorded by a sensor and transmitted via a fieldbus to the PLC. This data can be analysed directly on the controller to monitor the status of a system using the condition monitoring libraries integrated in the automation software, thereby reducing downtime and maintenance costs.

Fig. 2: Analysis on the controller or server.


However, where there are several distributed controllers in production areas, analysing data from a single controller may not be sufficient. Aggregated data from multiple or even all controllers in a production system, or from a specific machine type, is often needed to perform adequate data analysis and make an accurate analytical statement about the overall system. This, in turn, requires a corresponding IT infrastructure. Previous implementations focussed on a central server system within the machine or corporate network, equipped with data storage, often in the form of a database system. Analysis software could then access the aggregated data directly in the database to perform the corresponding evaluations (Fig. 2).

Although this approach to data aggregation and analysis in production facilities certainly worked well, it also presented a number of problems, since the required IT infrastructure had to be made available first. The high hardware and software costs for the corresponding server system are obvious. The personnel costs, however, should not be overlooked either: because of the increasing complexity involved in networking production systems, especially across large numbers of distributed production locations, skilled personnel are necessary to implement such a system successfully in the first place. To complicate matters, the scalability of such a solution is very low. Ultimately, the physical limits of the server system are reached at some point, be it the available memory, the CPU power or the performance required for analyses. This often resulted in extensive manual conversion work when systems had to be extended with new machines or controllers: the central server system had to grow along with them in order to handle and process the additional data volume.

The path to the public cloud

Cloud-based communication and data services avoid the aforementioned disadvantages by providing the user with an abstract view of the underlying hardware and software systems. “Abstract” in this context means that a user does not have to give any thought to the respective server system when using a service; only the use of the service itself has to be considered. All maintenance and update work on the IT infrastructure is performed by the provider of the cloud system. Such cloud systems can be divided into public and private clouds.

Public cloud providers, such as Microsoft Azure or Amazon Web Services (AWS), offer users a range of services from their own data centres. These range from virtual machines, where the user retains control of the operating system and the applications installed on it, to abstracted communication and data services that can be integrated into an application. The latter include, for example, access to machine learning algorithms, which can make predictions and perform classifications regarding specific data states on the basis of certain machine and production information. The algorithms obtain the necessary input with the aid of the communication services.

Fig. 3: Publish/subscribe communication with public cloud services.


Such communication services are usually built on communication protocols based on the publish/subscribe principle. The resulting decoupling of all communicating applications offers definite advantages. On the one hand, the various communication participants no longer need to know each other – the time-consuming exchange of address information is eliminated, since all applications communicate via the central cloud service. On the other hand, data communication with the cloud service via the message broker (Fig. 3) involves a purely outgoing communication connection from the perspective of the terminal device, regardless of whether data is sent (publish) or received (subscribe). The advantages for configuring the IT infrastructure are immediately clear: no incoming communication connections have to be configured, for example in firewalls or other network devices. This significantly reduces IT infrastructure set-up time and maintenance costs.

The transport protocols used for this data communication, such as MQTT and AMQP, are exceptionally lean and standardised. In addition, various security mechanisms can also be anchored here, for example encryption of the data communication and authentication with the message broker. The standardised OPC UA communication protocol has likewise recognised the added value of publish/subscribe-based communication and taken appropriate steps to integrate this principle into its specification. An additional standard besides MQTT and AMQP is consequently available as a transport mechanism to the cloud.
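The decoupling described above can be illustrated with a minimal sketch of the publish/subscribe principle. The `Broker` class and topic names below are purely illustrative assumptions for this example, not the API of any specific MQTT or AMQP product: publishers and subscribers only know the broker and a topic string, never each other's addresses.

```python
from collections import defaultdict
from typing import Callable

class Broker:
    """Minimal central message broker: participants are decoupled and
    communicate only via topics, never directly with each other."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[str, str], None]):
        # Register a callback for all future messages on this topic.
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, payload: str):
        # Deliver the payload to every subscriber of this topic.
        for callback in self._subscribers[topic]:
            callback(topic, payload)

# A controller publishes a sensor value; a dashboard subscribes to it.
broker = Broker()
received = []
broker.subscribe("plant1/line3/temperature",
                 lambda topic, payload: received.append((topic, payload)))
broker.publish("plant1/line3/temperature", "72.4")
```

From the terminal device's perspective, both `publish` and `subscribe` are calls it initiates towards the broker, which mirrors the purely outgoing connections mentioned above.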

The private cloud

However, publish/subscribe mechanisms such as these are not limited to public cloud systems; they can also be used within the company or machine network. In the case of MQTT and AMQP, the required infrastructure can be installed and made available easily on any PC in the form of a message broker. This means that M2M scenarios can be implemented and that terminal devices such as smartphones can be connected to the controller, with access to these devices further secured by means of firewall systems (Fig. 4). The extensions of the OPC UA specification with regard to publish/subscribe will also simplify the configuration and use of 1:N communication scenarios within a machine network in the future.
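In MQTT-based 1:N scenarios like these, subscribers typically use topic filters with the standard wildcards defined in the MQTT specification: `+` matches exactly one topic level and `#` matches all remaining levels. A small sketch of that matching rule, with illustrative topic names:

```python
def topic_matches(topic_filter: str, topic: str) -> bool:
    """Check an MQTT-style topic filter against a concrete topic name.
    '+' matches exactly one level; '#' matches all remaining levels."""
    filter_levels = topic_filter.split("/")
    topic_levels = topic.split("/")
    for i, level in enumerate(filter_levels):
        if level == "#":
            return True          # multi-level wildcard: everything below matches
        if i >= len(topic_levels):
            return False         # filter is longer than the topic
        if level != "+" and level != topic_levels[i]:
            return False         # literal level does not match
    return len(filter_levels) == len(topic_levels)

# One dashboard subscription can cover many machines:
print(topic_matches("machine/+/status", "machine/press1/status"))  # True
print(topic_matches("machine/#", "machine/press1/status"))         # True
print(topic_matches("machine/+", "machine/press1/status"))         # False
```

This is how a single subscriber (for example a smartphone app) can receive status messages from every machine on the network without knowing the machines individually.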

Fig. 4: Publish/subscribe communication in the machine network.


Products for Industry 4.0 and IoT

Beckhoff provides users with a variety of components for simple and standardised integration into cloud-based communication and data services. The IoT products within its automation software platform offer varied functionality for exchanging process data via standardised publish/subscribe-based communication protocols and for accessing special data and communication services of public cloud providers. The corresponding services can be hosted in public cloud systems, such as Microsoft Azure or AWS, but can be used just as effectively in private cloud systems.

These IoT functions can either be accessed directly from the control program via special function modules, or configured outside the control program via an application such as the “TwinCAT IoT Data Agent”. The data to be transmitted can be selected easily via a graphical configurator and configured for transfer to a specific service. A major advantage here is that the Data Agent also allows cloud-based services to be integrated into older, existing TwinCAT systems. Process data can also be exported using the standardised OPC UA communication protocol, so that data from various systems can likewise be sent (Fig. 5). An additionally available smartphone app enables mobile display of a machine’s alarm and status messages.

Fig. 5: A configurable and easy-to-use cloud connection, including OPC UA.


If I/O signals are to be forwarded directly without a control program, a bus coupler such as the EK9160 IoT coupler allows I/O data to be parameterised for sending to a cloud service via an easy-to-configure website on the device. The bus coupler then independently sends the digital or analogue I/O values to the cloud service. An IoT coupling station consists of such a bus coupler together with powerful, ultra-fast EtherCAT terminals. The data is sent to the cloud service in a user-friendly, standardised JSON format and can also be transmitted in encrypted form if required.
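To make the JSON transport concrete, the sketch below builds and parses a payload of the general kind such a coupler might send. The field names (`timestamp`, `terminal`, `channel`, `value`) and the terminal designation are assumptions for illustration only, not the EK9160's actual documented schema.

```python
import json

def build_payload(terminal: str, channel: int, value: float) -> str:
    """Serialise one I/O sample as a JSON message (illustrative schema)."""
    message = {
        "timestamp": "2016-06-08T12:00:00Z",  # fixed example timestamp
        "terminal": terminal,                  # hypothetical terminal name
        "channel": channel,
        "value": value,
    }
    return json.dumps(message)

# A cloud service (or any JSON-aware consumer) can decode it directly:
decoded = json.loads(build_payload("EL3004", 1, 7.42))
print(decoded["value"])  # 7.42
```

The point of the standardised JSON format is exactly this symmetry: any service or analysis tool that understands JSON can consume the I/O values without device-specific parsing code.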

Extended mechanisms, such as local buffering of I/O data in the event of an interrupted internet connection, are also provided, along with a monitoring function for connected fieldbuses. The I/O signals can therefore be collected not only via EtherCAT, but also via other fieldbuses such as CANopen or PROFIBUS.
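The local-buffering behaviour can be sketched as a simple store-and-forward pattern: while the connection is down, samples are kept in a bounded local buffer (oldest samples dropped first when it fills) and flushed upstream on reconnect. The class and capacity below are illustrative assumptions, not the coupler's actual implementation.

```python
from collections import deque

class StoreAndForward:
    """Sketch of local I/O buffering across an interrupted connection."""
    def __init__(self, send, capacity=1000):
        self._send = send                    # callable delivering one sample upstream
        self._buffer = deque(maxlen=capacity)  # bounded: oldest dropped when full
        self.connected = True

    def sample(self, value):
        if self.connected:
            self._send(value)                # connection up: forward immediately
        else:
            self._buffer.append(value)       # connection down: buffer locally

    def reconnect(self):
        self.connected = True
        while self._buffer:                  # flush buffered samples in order
            self._send(self._buffer.popleft())

sent = []
sf = StoreAndForward(sent.append, capacity=3)
sf.sample(1)
sf.connected = False
sf.sample(2)
sf.sample(3)
sf.reconnect()
# sent is now [1, 2, 3] -- no samples were lost across the outage
```

The bounded buffer is a deliberate trade-off: on a small device with finite memory, dropping the oldest samples is preferable to running out of storage during a long outage.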

Analytics and machine learning

Once the data has been sent to a public or private cloud service, the next question is how it can be processed further. As previously mentioned, many public cloud providers offer various analytics and machine learning services that can be used to examine process data. In addition, Beckhoff offers its own analytics platform, which provides the relevant mechanisms for data analysis: all process-related machine data is recorded precisely and cyclically, so that all machine processes can be fully captured.

Fig. 6: IoT and cloud bus coupler.


Depending on requirements, this data can be stored for evaluation either locally on the machine controller or within a public or private cloud solution. TwinCAT Analytics uses TwinCAT IoT to connect to cloud solutions, ensuring seamless data communication. Generally speaking, this provides the means to create new business ideas and models for machine manufacturers and their end customers to capitalise on.


Industry 4.0 and IoT are on everyone’s minds, and these concepts become important wherever innovative new business models place requirements on the underlying infrastructure. This also drives the increasing convergence of IT and automation technologies. Cloud-based data services can help implement such automation projects, as they save the machine manufacturer or end customer from having to provide the corresponding IT expertise. With an IoT and cloud bus coupler, customers can integrate such cloud-based data services into the control project, and an analytics platform supports such projects further by facilitating comprehensive analysis of the recorded process data.

Contact Kenneth McPherson, Beckhoff Automation, Tel 011 795-2898
