FOCUS | Factory of the Future Clusters
Difficulties in accessing machines/systems on the shop floor, due to end users' internal security policies, were mitigated at the machine/system interface by implementing security measures; different options to avoid direct access to machines through the company network were investigated (e.g. a separate network, a mirror DB, etc.).
The secure connection relies on two technologies:
SSL (Secure Sockets Layer) is a cryptographic protocol to provide communications security and ensure privacy and data integrity between two or more communicating computer applications.
OpenSSL is an open-source software library that contains an implementation of the SSL protocol, used by NGINX (the web server) to serve the API over HTTPS.
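On the client side, an HTTPS connection to the NGINX-served API can be sketched with Python's standard library (this only illustrates TLS client configuration, not the project's actual client code):

```python
import ssl

# Sketch: a client-side TLS context with certificate verification enabled,
# as would be used when calling an HTTPS API served by NGINX.
context = ssl.create_default_context()

# The default context already enforces certificate and hostname checks.
print(context.verify_mode == ssl.CERT_REQUIRED)   # certificate validation on
print(context.check_hostname)                     # hostname verification on
```

Passing this context to an HTTPS client ensures the server's certificate chain and hostname are verified before any data is exchanged.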
Data communication between components is essential for the project. End users create data on their shop floor with sensors embedded in the machines and with new sensors integrated for the project. All these data are propagated through the system with data communication protocols, such as HTTP and AMQP, creating a data stream process in the system. Interoperability between the data communication protocols and brokers is crucial for successful data communication across the system. Various data sources work together and use different communication protocols; as a result, all these components and protocols should work seamlessly, and their interoperability is what makes this possible. A message broker, based on AMQP, was developed for the project's data communication. In the initial phases of the project, RESTful APIs also supported the early development of the components.
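The shape of the messages flowing through this stream can be sketched as follows (the field names are hypothetical, not the project's actual schema): a sensor reading is packaged as JSON before being published over AMQP or posted over HTTP.

```python
import json
import time

# Hypothetical sensor reading packaged as a message body.
reading = {
    "machine_id": "press-01",
    "sensor": "vibration",
    "value": 0.42,
    "unit": "mm/s",
    "timestamp": int(time.time()),
}

payload = json.dumps(reading)    # serialised body sent over AMQP or HTTP
restored = json.loads(payload)   # consumers decode it back to a dict
print(restored["sensor"])        # -> vibration
```

Because the same JSON body can travel over either protocol, producers and consumers stay decoupled from the transport in use.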
The Incremental Integration Strategy (IIS) provides a unified framework for all the distributed EU partners to work on common principles. By following the IIS, we try to ensure that the integration is executed successfully and in a timely manner. It defines a number of factors to monitor and steps to execute.
The IIS mandates that components are integrated incrementally and tested to ensure smooth interaction among them. Every component is combined incrementally, i.e. one by one until all components are integrated logically into the required application, instead of integrating the whole system at once and then testing the end product. Integrated components are tested as a group to ensure successful integration and data flow between them. The process is repeated until all components are combined and tested successfully. The tests included in the IIS are:
The IIS and the integration plan of the Z-Fact0r solution were based on the same APIs and protocols as the data exchange in the system. No new APIs were designed for the integration process, and the integration protocol implemented was derived from the IIS and the integration plan of the Z-Fact0r system.
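The incremental idea behind the IIS can be sketched in a few lines (the component names below are hypothetical stand-ins, not the project's modules): components are added to the pipeline one at a time, and the integrated group is re-tested after every addition rather than only at the end.

```python
def acquire(raw):       # stands in for a data-acquisition component
    return {"value": raw}

def detect(sample):     # stands in for a defect-detection component
    return {**sample, "defect": sample["value"] > 0.5}

def notify(result):     # stands in for a notification component
    return f"defect={result['defect']}"

pipeline = []
for component, probe in [
    (acquire, lambda out: "value" in out),
    (detect, lambda out: "defect" in out),
    (notify, lambda out: isinstance(out, str)),
]:
    pipeline.append(component)
    # Re-run the whole integrated group and test it before adding the next one.
    out = 0.7
    for stage in pipeline:
        out = stage(out)
    assert probe(out), f"integration test failed after adding {component.__name__}"

print(out)   # -> defect=True
```

Failures surface immediately after the component that caused them, which is exactly what integrating the whole system at once and testing only the end product cannot offer.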
During the integration phase the same communication protocols were used: HTTP and AMQP for data exchange. Wi-Fi connectivity was also used to integrate the various components and deploy their updates, on premises or in the cloud, during the integration process. Finally, FTP was used during the integration phase for quick file transfers on the shop floor premises.
The data exchange format throughout the project's components was JSON. JSON is lightweight, easy for humans to read and write, and provides all relevant information in a structured way. It is also easy to extend with further fields when necessary or to restructure for other components. XML was also used as a data exchange format; it shares the same characteristics with JSON in terms of ease of use and accessibility. An example of one of the JSON formats used in the project is given below to describe the prediction outputs:
Semantic interoperability is desired in the project. An ontology was created to describe all the entities participating in the project's components, system and communication protocols, as well as the entities provided by the end users. The Context Aware algorithm was based on this ontology to create the operation rules for the system, and provided the essential information to other components about the implementation of the solution. For example, the Context Aware algorithm provided the Reverse Supply Chain (RSC) with all the necessary information about the production line, the production stages and the return levels, and the RSC was then able to create a set of rules to be implemented by the end user.
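The idea of deriving RSC rules from ontology entities can be sketched as follows (entity and rule names are hypothetical illustrations, not the actual ontology): production-line entities are read and one rule is generated per relevant combination.

```python
# Hypothetical slice of the ontology describing a production line.
ontology = {
    "production_line": "line-A",
    "stages": ["cutting", "assembly", "inspection"],
    "return_levels": {"minor": 1, "major": 2},
}

def derive_rsc_rules(entities):
    # One illustrative rule per (stage, return level) combination.
    return [
        f"if defect at {stage} with level {level}, return {hops} stage(s) back"
        for stage in entities["stages"]
        for level, hops in entities["return_levels"].items()
    ]

rules = derive_rsc_rules(ontology)
print(len(rules))   # 3 stages x 2 return levels -> 6 rules
```

Because the rules are generated from the ontology rather than hard-coded, adding a stage or a return level for a new end user changes the rule set without changing the algorithm.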
The whole Z-Fact0r platform was able to work with other external applications through a message broker able to receive and send data to external systems. The interoperability level between the Z-Fact0r platform and external applications is essential for communication and integration purposes. Security and safety issues arise when different platforms cooperate, so the Z-Fact0r platform implemented an AAA mechanism (Access, Authorisation and Authentication) to ensure the safety of the platform during connections with external applications.
Access to the Z-Fact0r platform was given only to authorised users. The platform was installed either on the shop floor premises or on servers deployed by the technology-providing partners, creating a limited-access environment. There was also authorisation between the components and external applications, where each component was authorised by an authorisation server with its unique Bearer Token in order to subscribe to the message broker and publish or receive the available data. Further steps, such as user authentication, were not included in the project scope.
Z-Fact0r components were developed by different technology-providing partners, mostly as standalone components. As a result, many different components operated individually on each shop floor. An interoperability layer was necessary for the Z-Fact0r system to work as a whole solution. Various integration processes and extensive planning took place during the project, producing an integrated system as the final product. Interoperability between the components was the first essential prerequisite for the integration process: the components were designed in a way that allowed them to operate together without conflicts during data streaming and operation.
The Z-Fact0r architecture was based on the modular design of the components, followed by their integration into a complete system. For each component a specific architecture was defined by the responsible technology-providing partner, based on the use cases, scenarios, end user requirements and technical requirements. The design of each component was documented in the respective deliverable. Each component also followed the technological trends of its field and exploited the state of the art. An overall ontology of the Z-Fact0r system was created to include all possible actors, functions, assets, etc. All components were initially deployed as standalone applications and then an integration plan was implemented: the Z-Fact0r project followed the Incremental Integration Strategy (IIS), whereby the components were deployed on the shop floors and integrated as one.
The Z-Fact0r project used the AMQP and MQTT protocols for communication between the components. A message broker, called iLike, was developed by HOLONIX. iLike implemented the publish/subscribe mechanism for all components connected to it. The components were authorised in the iLike broker and repository with a Bearer Token and then used the mechanism to publish or receive data. An open API was used to create REST GET calls in order to initiate the communication between a component and the broker. The communication steps between the component and the broker were:
Data from the machines is sent into the cloud to the iLike broker using the MQTT protocol (a lightweight protocol that transports messages between devices); the broker stores the data as messages so that subscribers can retrieve the values.
An MQTT broker can easily scale from a single device to thousands, manages and tracks all client connection state, and permits secure connections.
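The publish/subscribe mechanism at the heart of the broker can be illustrated with a minimal in-process sketch (the real system runs MQTT/AMQP over the network; this only shows the topic-to-subscribers dispatch):

```python
from collections import defaultdict

class Broker:
    """Toy in-process stand-in for a publish/subscribe message broker."""

    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver the message to every subscriber of the topic.
        for callback in self.subscribers[topic]:
            callback(message)

broker = Broker()
received = []
broker.subscribe("sensors/temperature", received.append)
broker.publish("sensors/temperature", {"value": 21.5})
print(received)   # -> [{'value': 21.5}]
```

Publishers never reference subscribers directly; both sides only know the topic name, which is what lets independently developed components exchange data without conflicts.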
The Z-Fact0r hybrid framework, obtained by applying a software and hardware integration strategy, is installed on the industrial end users' shop floors. This architecture exploits features of relational databases and a triplestore, while using the blackboard architectural pattern, which ensures efficient and accurate data transfer among software applications and devices.
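The blackboard pattern mentioned above can be sketched as follows (the knowledge sources here are hypothetical examples): independent components read from and write to a shared data store instead of calling each other directly.

```python
# The shared blackboard: a common data store visible to all knowledge sources.
blackboard = {"raw": [3, 1, 2]}

def sorter(board):          # knowledge source 1: contributes a sorted view
    board["sorted"] = sorted(board["raw"])

def summariser(board):      # knowledge source 2: builds on sorter's output
    board["max"] = board["sorted"][-1]

# A simple controller invokes the knowledge sources over the shared store.
for source in (sorter, summariser):
    source(blackboard)

print(blackboard["max"])   # -> 3
```

Each knowledge source depends only on the state of the blackboard, not on the other sources, which mirrors how loosely coupled applications and devices can cooperate through a shared repository.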
A RESTful API over HTTP was chosen to fulfil the need to send intermediate or final results from the Z-Modules to the repository; the API uses JSON as the default exchange format and JWT (JSON Web Token) as the authentication mechanism.
JWT is a standard that defines a JSON-based format for exchanging information between services. JWTs are widely used to authenticate requests in web service authentication mechanisms: the client sends an authentication request to the server, the server generates a signed token and returns it to the client, which from that moment on uses it to authenticate subsequent requests.
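The signing scheme behind a JWT can be sketched with only the standard library (a real deployment would use a maintained JWT library; the secret and claims below are placeholders):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWTs use unpadded URL-safe base64.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

secret = b"server-side-secret"   # placeholder key held by the server

# header.payload.signature, each part base64url-encoded.
header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
payload = b64url(json.dumps({"sub": "component-1"}).encode())
signing_input = f"{header}.{payload}".encode()
signature = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
token = f"{header}.{payload}.{signature}"

def verify(token: str, secret: bytes) -> bool:
    # The server recomputes the signature over header.payload and compares.
    head, body, sig = token.split(".")
    expected = b64url(
        hmac.new(secret, f"{head}.{body}".encode(), hashlib.sha256).digest()
    )
    return hmac.compare_digest(sig, expected)

print(verify(token, secret))   # -> True
```

Because only the server knows the secret, a client cannot forge or alter a token without invalidating its signature, which is what makes the token safe to hand back for subsequent requests.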
To store data from different sources, including the data elaborated by the various Z-Modules, a Z-Fact0r data repository has been developed.
The first source of data is the temporal machine data coming from machine sensors. To store this data, Cassandra is used: a distributed NoSQL DBMS capable of handling large amounts of data across many servers while providing high availability.
The second source comprises other complex and structured production information, which is stored with the relational DBMS MySQL.
Another data source in the Z-Fact0r context is the output generated by the various modules that carry out the analyses.
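The relational side of the repository can be sketched with SQLite standing in for MySQL (table and column names are hypothetical, not the project's actual schema):

```python
import sqlite3

# In-memory SQLite database standing in for the MySQL repository.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE readings (machine_id TEXT, sensor TEXT, ts INTEGER, value REAL)"
)
con.executemany(
    "INSERT INTO readings VALUES (?, ?, ?, ?)",
    [("press-01", "vibration", 1, 0.42),
     ("press-01", "vibration", 2, 0.58)],
)

# Modules can then query, e.g., the latest value for a given sensor.
row = con.execute(
    "SELECT value FROM readings "
    "WHERE sensor = 'vibration' ORDER BY ts DESC LIMIT 1"
).fetchone()
print(row[0])   # -> 0.58
```

A relational store suits this structured production information, while the high-volume temporal sensor stream goes to Cassandra and module outputs are kept alongside, giving each source the storage model that fits it.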
All database schemas, communication protocols and security applications of the Z-Fact0r solution are designed to accommodate the scalability of the solution, and the technology can be applied in larger-scale projects without major changes. The one difference when dealing with big data is the use of a different database approach, such as MongoDB, which is more suitable for big data analysis.
Difficulties in setting up the initial data collection infrastructure on the pilot sites were mitigated through very careful selection of the data collection infrastructure: end users were constantly assisted in setting up the best approach to transfer data to the Z-Fact0r platform, taking into consideration the internal security policies of the companies.