MANUWORK | Balancing Human and Automation Levels for the Manufacturing Workplaces of the Future
01-10-2016 to 31-03-2020
01-10-2016 to 31-03-2020
01-10-2016 to 30-09-2019
01-10-2016 to 30-09-2019
01-09-2016 to 31-05-2021
01-10-2016 to 31-03-2020
Difficulties in setting up the initial data collection infrastructure on the pilot sites were mitigated through very careful selection of the data collection infrastructure: end users were constantly assisted in setting up the best approach to transfer data to the Z-Fact0r platform, taking into consideration the companies' internal security policies.
The Z-Fact0r system is a distributed system by design. The connections between the components and the data flow are authorised by the iLike machine so that the system maintains all security aspects.
The secure connections are encrypted and rely on two types of authentication:
X.509 certificates for client certification and TLS handshakes between the Z-Fact0r components and the iLike machines
Difficulties in having access to machines/systems on the shopfloor, due to end users' internal security policies, were mitigated by implementing security measures at the shopfloor machine/system interface; different options to avoid direct access to the machines through the company network were investigated (e.g. a separate network, a mirror DB, …).
Security mechanisms were implemented on the Z-Fact0r platform components. The basic architecture provides authorisation and authentication as the means of deploying a secure system architecture.
SSL (Secure Sockets Layer) is a cryptographic protocol to provide communications security and ensure privacy and data integrity between two or more communicating computer applications.
OpenSSL is an open-source software library that implements the SSL/TLS protocols; it is used by NGINX (web server) to serve the API over HTTPS.
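As an illustrative sketch (not the project's actual code), a TLS client context of the kind used for the certificate-based handshakes described above could be built with Python's standard ssl module; the certificate paths are placeholders:

```python
import ssl

# Build a TLS client context of the kind a Z-Fact0r component could use for
# the X.509 handshake with the iLike machines. Certificate paths are
# deployment-specific and shown here as optional placeholders.
def make_client_context(ca_file=None, cert_file=None, key_file=None):
    context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile=ca_file)
    context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocols
    if cert_file and key_file:
        # Present the client's X.509 certificate during the handshake
        context.load_cert_chain(certfile=cert_file, keyfile=key_file)
    return context

ctx = make_client_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: server cert is always verified
```

The context would then be passed to the HTTP or AMQP client library when opening the connection.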
Data communication between components is essential for the project. End users create data on their shop floors with sensors embedded in the machines and new sensors integrated for the project. All these data are propagated through the system with data communication protocols such as HTTP and AMQP, creating a data stream process in the system. Interoperability between the data communication protocols and brokers is crucial for successful data communication across the system: various data sources work together and use different communication protocols, so all these components and protocols should work together seamlessly. A message broker, based on AMQP, was developed for the project's data communication. In the initial phases of the project, RESTful APIs also supported the initial development of the components.
The Incremental Integration Strategy (IIS) provides a unified framework for all the distributed EU partners to work on common principles. By following the IIS, we try to ensure that the integration is executed successfully and in a timely manner. It defines a number of factors to monitor and steps to execute.
The IIS mandates that components are integrated and tested incrementally to ensure smooth interaction among them. Components are combined one by one until all of them are logically integrated into the required application, instead of integrating the whole system at once and then testing the end product. Integrated components are tested as a group to ensure successful integration and data flow between them. The process is repeated until all components are combined and tested successfully. The tests included in the IIS are:
The IIS and the integration plan of the Z-Fact0r solution were based on the same APIs and protocols as the data exchange in the system. No new APIs were designed for the integration process, and the integration protocol implemented was derived from the IIS and the integration plan of the Z-Fact0r system.
During the integration phase the same communication protocols were used: HTTP and AMQP for data exchange. Wi-Fi connectivity was used to integrate the various components and apply their updates on premises or on the cloud during the integration process. Finally, FTP was used during the integration phase for quick transfer of files on the shop floor premises.
The data exchange format throughout the project's components was JSON. JSON is lightweight, easy for humans to read and write, and presents all relevant information in a structured way. It is also easy to extend with further fields when necessary or to restructure for other components. XML was also used as a data exchange format; it offers similar readability and accessibility. An example of one of the JSON formats used in the project is given below to describe the prediction outputs:
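A hypothetical sketch of such a prediction-output message is shown below; all field names are assumptions for illustration, not the project's actual schema (which was defined in its deliverables):

```python
import json

# Hypothetical prediction-output message; field names are invented
# for illustration, not taken from the Z-Fact0r deliverables.
prediction = {
    "machineId": "M-01",
    "timestamp": "2019-05-14T10:32:00Z",
    "predictedDefect": "surface_scratch",
    "probability": 0.87,
}

# Serialise for transport over HTTP/AMQP, then parse on the receiving side
payload = json.dumps(prediction)
print(json.loads(payload)["probability"])  # 0.87
```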
Semantic interoperability is desired in the project. An ontology was created to describe all the entities participating in the project's components, system and communication protocols, as well as the entities provided by the end users. The Context Aware algorithm was based on this ontology to create the operation rules for the system. The algorithm provided the other components with the essential information about the implementation of the solution. For example, the Context Aware algorithm provided the Reverse Supply Chain (RSC) with all the necessary information about the production line, the production stages and the return levels, and the RSC was then able to create a set of rules to be implemented by the end user.
The whole Z-Fact0r platform was able to work with other external applications through a message broker that can receive data from and send data to external systems. The interoperability level between the Z-Fact0r platform and external applications is essential for communication and integration purposes. Security and safety issues arise when different platforms cooperate. The Z-Fact0r platform implemented an AAA mechanism (Access, Authorisation and Authentication) to safeguard the platform during connections with external applications.
Access to the Z-Fact0r platform was given only to authorised users. The platform was installed either on the shop floor premises or on servers deployed by the technical providing partners, creating a limited-access environment. There was also authorisation between the components and external applications: each component was authorised in an authorisation server with its unique Bearer Token in order to subscribe to the message broker and publish or receive the available data. Further steps, such as user authentication, were not included in the project scope.
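A minimal sketch of such a Bearer-Token authorisation check is given below; the token values and component names are invented for illustration (in the project, tokens were held by the authorisation server):

```python
# Illustrative registry of tokens issued by an authorisation server
# (values are invented; the real tokens were managed server-side).
AUTHORISED_TOKENS = {
    "token-for-component-A": "component-A",
    "token-for-component-B": "component-B",
}

def authorise(authorization_header: str):
    """Return the component name if the Bearer token is known, else None."""
    scheme, _, token = authorization_header.partition(" ")
    if scheme != "Bearer":
        return None
    return AUTHORISED_TOKENS.get(token)

print(authorise("Bearer token-for-component-A"))  # component-A
print(authorise("Bearer unknown-token"))          # None
```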
Z-Fact0r components were developed by the different technical providing partners as mostly standalone components. As a result, many different components worked individually on each shop floor. An interoperability level was necessary for the Z-Fact0r system to work as a whole solution. Various integration processes and extensive planning took place during the project and produced an integrated system as the final product. Interoperability between the components was the first essential prerequisite for the integration process. The components were designed in a way that allowed them to operate together without conflicts during data streaming and operation.
The Z-Fact0r architecture was based on the modular design of the components and their subsequent integration into a complete system. For each component a specific architecture was followed by the responsible technology providing partner, based on the use cases, scenarios, end-user requirements and technical requirements. The design of each component was documented in the respective deliverable. Each component also followed the technological trends of its field and exploited the state of the art. An overall ontology of the Z-Fact0r system was created to include all possible actors, functions, assets, etc. All components were initially deployed as standalone applications and then an integration plan was implemented. The Z-Fact0r project followed the Incremental Integration Strategy (IIS), whereby the components were deployed on the shop floors and integrated as one.
The Z-Fact0r project used the AMQP and MQTT protocols for the communication between the components. A message broker, called iLike, was developed by HOLONIX. iLike implemented the publish/subscribe mechanism for all components that connected to it. The components were authorised in the iLike broker and repository with a Bearer Token and then used the mechanism to publish or receive data. An open API was used to make REST GET calls in order to initiate the communication between a component and the broker. The communication steps between the component and the broker were:
The data from the iLike machines are sent to a broker in the cloud using the MQTT protocol (a lightweight protocol that transports messages between devices); the broker stores the data as messages so that subscribers can retrieve the values.
An MQTT broker can easily scale from a single device to thousands, manages and tracks all client connection states, and permits secure connections.
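The publish/subscribe mechanism described above can be sketched with a minimal in-memory broker (illustrative only; the real iLike broker speaks AMQP/MQTT over the network and tracks client state):

```python
from collections import defaultdict

# In-memory sketch of a topic-based publish/subscribe broker.
class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # Register a callback to receive every message on this topic
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Fan the message out to all subscribers of the topic
        for callback in self.subscribers[topic]:
            callback(message)

broker = Broker()
received = []
broker.subscribe("shopfloor/machine1/temperature", received.append)
broker.publish("shopfloor/machine1/temperature", {"value": 72.4})
print(received)  # [{'value': 72.4}]
```

A networked broker adds authentication, persistence and quality-of-service guarantees on top of this basic fan-out pattern.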
Wireless communication of the Z-Fact0r solution was based on the Wi-Fi protocol.
The Z-Fact0r hybrid framework, obtained by applying a software and hardware integration strategy, is installed on the industrial end users' shop floors. This architecture exploits features of relational databases and a triplestore while using the blackboard architectural pattern, which ensures efficient and accurate data transfer among software applications and devices.
There is little integration with legacy systems, such as CMMS or ERP for the Z-Fact0r solution.
A RESTful API over HTTP was chosen to fulfil the need to send intermediate or final results from the Z-Modules to the repository; the API uses JSON as the default exchange format and JWT (JSON Web Token) as the authentication mechanism.
JWT is a standard that defines a JSON format for exchanging information between various services. JWTs are widely used to authenticate requests in web services: the client sends an authentication request to the server, the server generates a signed token and returns it to the client, which from that moment on uses it to authenticate subsequent requests.
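That flow can be sketched with a minimal HS256 signer/verifier (illustrative only; a production system would use a vetted JWT library and also validate claims such as expiry):

```python
import base64, hashlib, hmac, json

# Minimal HS256 JWT sketch: header.payload.signature, all base64url-encoded.
def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = ".".join(
        b64url(json.dumps(part, separators=(",", ":")).encode())
        for part in (header, payload)
    )
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{b64url(sig)}"

def verify_jwt(token: str, secret: bytes) -> bool:
    signing_input, _, sig = token.rpartition(".")
    expected = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(b64url(expected), sig)

token = sign_jwt({"sub": "z-module-01"}, b"shared-secret")
print(verify_jwt(token, b"shared-secret"))  # True
```

The server signs the token once; every subsequent request can be checked with a single HMAC computation, without a database lookup.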
To store data from different sources, including the data elaborated by the various Z-Modules, a Z-Fact0r data repository was developed.
The first source of data is the temporal machine data coming from the machine sensors; it is stored in Cassandra, a distributed NoSQL DBMS capable of handling large amounts of data across many servers while providing high availability.
The second source is other complex and structured production information, which is stored in the relational DBMS MySQL.
Another data source in the Z-Fact0r context is the output generated by the various modules that carry out the analyses.
All database schemas, communication protocols and security applications of the Z-Fact0r solution are designed to accommodate the scalability of the solution. All of the technology can be implemented in larger-scale projects without major changes. The one difference when dealing with big data is the use of a different database approach, such as MongoDB, which is more suitable for big data analysis.
01-10-2016 to 31-10-2019
01-10-2016 to 31-03-2021
01-10-2016 to 30-09-2019
01-10-2016 to 30-09-2019
Data collection in a "manufacturing database" to enable real-time feedback on how deviations in the manufacturing process will impact part performance.
01-10-2016 to 30-09-2020
01-01-2017 to 30-06-2020
01-11-2015 to 31-10-2017
01-06-2017 to 31-05-2020
01-05-2015 to 31-07-2018
01-10-2017 to 31-03-2021
A Web API will return semantic data. The communication interface is the SPARQL query engine. The Z-BRE4K ontology is implemented with the Open Semantic Framework (OSF), an integrated software stack using semantic technologies for knowledge management. Furthermore, JSON-formatted data from the shop floor is transferred through an MQTT broker, to be finally stored in the I-LiKe machines' internal data repository. IDS connectors are used to transform data into the NGSI format and move the data to the ORION context broker, to be finally consumed by other applications. Also, the Quality Information Framework (QIF) standard guarantees interoperability, since it defines an integrated set of information models that enable the effective exchange of metrology data throughout the entire manufacturing quality measurement process, from product design to inspection planning to execution, analysis and reporting. OpenCPPS (part of AUTOWARE) will provide support for selected mainstream communication protocols and will define the proper interfaces for other communication protocols to be plugged in.
Orion is a C++ implementation of the NGSIv2 REST API binding, developed as part of the FIWARE platform, that allows the management of the entire lifecycle of context information, including updates, queries, registrations and subscriptions. It is an NGSIv2 server implementation for managing context information and its availability, allowing subscription to context information so that notifications are sent when some condition occurs. The Industrial Data Space fosters secure data exchange among its participants, while at the same time ensuring data sovereignty for the participating data owners. The architecture of the Industrial Data Space does not require central data storage capabilities but follows a decentralized approach, meaning that data physically remains with the respective data owner until it is transmitted to a trusted party. Thus, the Industrial Data Space is not a cloud platform, but an architectural approach to connecting various different platforms.
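As a hedged illustration of the NGSI format mentioned above, an NGSIv2 entity payload of the kind a connector could push to Orion might look like the following (the entity id, type and attribute names are assumptions, not taken from the project):

```python
import json

# Hypothetical NGSIv2 entity: each attribute is an object carrying
# its own "value" and "type", per the NGSIv2 data model.
def build_entity(entity_id: str, temperature: float) -> dict:
    return {
        "id": entity_id,
        "type": "Machine",
        "temperature": {"value": temperature, "type": "Number"},
    }

entity = build_entity("urn:ngsi:Machine:001", 21.5)
# A connector would then POST json.dumps(entity) to Orion's /v2/entities endpoint.
print(entity["temperature"]["value"])  # 21.5
```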
Ontology-based data integration is part of the Z-BRE4K solution. The ontology effectively combines data and/or information from multiple heterogeneous sources. The ontology semantics used by the SPL program are described in OWL. OWL follows the RDF syntax, so SPARQL is suitable for seamlessly querying an ontology defined in OWL. SPARQL will be used as the transformation language for converting semantic data to the corresponding syntactic data. IDS connectors are used in Z-BRE4K to guarantee interoperability among the various components that are not part of the Industrial Data Space. Part of the connectors' functionality is to transform data to/from the NGSI format so that it can be shared by the ORION context broker.
Z-BRE4K ontology contains information about all Z-BRE4K relevant data (metadata), linked in a way described by a controlled, shared vocabulary. The data relationships are part of the data itself, in one self-describing information package that is independent of any information system. In simple terms, this means that data from various sources can be easily harmonised. The shared vocabulary, and its associated links to an ontology, provide the foundation and the capabilities of machine interpretation, inference, and logic.
The Z-BRE4K solution is based on the blackboard architectural model. This model is mainly an artificial intelligence approach in which a common knowledge base, the "blackboard", is iteratively updated by a diverse group of specialist knowledge sources, starting with a problem specification and ending with a solution. Each knowledge source updates the blackboard with a partial solution when its internal constraints match the blackboard state. In this way, the specialists work together to solve the problem. The blackboard model was originally designed as a way to handle complex, ill-defined problems, where the solution is the sum of its parts. The blackboard component acts as a central repository system. The rest of the software applications (components) act independently on the common data structure stored on the blackboard; they respond to changes and create new reactions accordingly. Interaction between components is implemented via the blackboard.
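The pattern described above can be sketched in a few lines; the names and the toy rule are illustrative, not Z-BRE4K's actual knowledge sources:

```python
# Minimal blackboard sketch: knowledge sources watch a shared state and
# contribute partial solutions when their precondition matches it.
class Blackboard:
    def __init__(self):
        self.state = {}

class KnowledgeSource:
    def __init__(self, name, precondition, action):
        self.name, self.precondition, self.action = name, precondition, action

    def try_contribute(self, bb: Blackboard) -> bool:
        if self.precondition(bb.state):
            self.action(bb.state)
            return True
        return False

def run(bb, sources, max_rounds=10):
    # Iterate until no source can contribute anything new (a fixed point).
    for _ in range(max_rounds):
        if not any(ks.try_contribute(bb) for ks in sources):
            break
    return bb.state

bb = Blackboard()
bb.state["sensor_reading"] = 87.0
sources = [
    KnowledgeSource(
        "detector",
        lambda s: "sensor_reading" in s and "anomaly" not in s,
        lambda s: s.update(anomaly=s["sensor_reading"] > 80),
    ),
    KnowledgeSource(
        "advisor",
        lambda s: s.get("anomaly") is True and "suggestion" not in s,
        lambda s: s.update(suggestion="schedule maintenance"),
    ),
]
print(run(bb, sources))
```

Each specialist only reads and writes the shared state; neither knows the other exists, which is what makes the pattern suit loosely coupled components.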
The Z-BRE4K ontology supports real-time communication capabilities by providing an agreed, shared conceptualisation, an explicit formal specification, and the relations between objects, to support the predictive maintenance domain and data classification. Real-time data is gathered from the shop floor and sent through the MQTT broker to be consumed by the solution's prediction software, which produces predictions and suggestions. I-Like Machines provide a visualisation UI with real-time monitoring of relevant variables and comparison against meaningful thresholds.
01-10-2017 to 31-03-2021
01-10-2017 to 30-09-2020
01-09-2017 to 30-11-2022
01-11-2017 to 30-04-2021
01-10-2017 to 30-09-2021