HORSE | Smart integrated Robotics system for SMEs controlled by Internet of Things based on dynamic manufacturing processes
11-01-2015
-31-07-2020
11-01-2015
-31-07-2020
11-01-2015
-31-12-2018
11-01-2015
-31-10-2018
01-10-2016
-30-09-2019
Security and privacy risk assessment to identify countermeasures to be included in the A4BLUE framework.
Integration with automation mechanisms through plug-and-produce capabilities based on OPC UA.
10-01-2015
-30-09-2018
01-09-2016
-31-08-2019
Security-by-design approach. Development of a Security Framework composed of a core set of security mechanisms to guarantee the security, confidentiality, integrity and availability of the managed information for all authorised stakeholders in the supply chain, while at the same time maintaining suitable levels of IPR and knowledge protection. The COMPOSITION Security Framework applies blockchain technology to provide an audit trail for manufacturing and supply chain data, enabling both product data traceability and secure access for stakeholders. Cybersecurity mechanisms will be developed to monitor and protect against potential threats that could affect the COMPOSITION infrastructure.
The Security Framework together with the Blockchain layer is strongly interlaced with the COMPOSITION ecosystem, providing all the necessary security features such as authentication, authorisation, message integrity and message traceability.
1) Security Information & Event Management API, 2) BPMN standard as part of the Integrated Digital Factory Model, 3) RabbitMQ implementation, 4) OGC SensorThings-compliant API through the Integrated Digital Factory Metadata Model, 5) Part of the Integrated Digital Factory Model
01-10-2016
-30-09-2019
01-10-2016
-30-09-2019
01-10-2016
-30-09-2019
01-09-2016
-31-08-2019
01-09-2016
-30-11-2019
01-10-2016
-30-09-2019
01-10-2016
-31-10-2019
01-10-2016
-31-03-2020
01-10-2016
-31-03-2020
01-10-2016
-30-09-2019
01-10-2016
-31-03-2020
Difficulties in setting up the initial data collection infrastructure on the pilot sites were mitigated through careful selection of the data collection infrastructure: end users were constantly assisted in setting up the best approach for transferring data to the Z-Fact0r platform, taking into consideration the internal security policies of the companies.
X.509 Certificate for the client certification and TLS handshakes between the Z-Fact0r components and the iLike machines
The secure connections are protected by two types of authentication:
The Z-Fact0r system is a distributed system by design. The connections between the components and the data flow are authorised by the iLike machine so that the system maintains all security aspects.
Difficulties in accessing machines/systems on the shop floor, due to end users' internal security policies, were mitigated by implementing security measures at the shop-floor machine/system interface: different options for avoiding direct access to machines through the company network were investigated (e.g. separate network, mirror DB, …).
Implementation of security mechanisms on the Z-Fact0r platform components. The basic architecture provides authorisation and authentication as a means of deploying a secure system architecture.
SSL (Secure Sockets Layer) is a cryptographic protocol to provide communications security and ensure privacy and data integrity between two or more communicating computer applications.
OpenSSL is an open-source software library that contains an implementation of the SSL protocol used by NGINX (web server) to provide API over HTTPS.
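As a minimal sketch of the certificate-based connections described above, the snippet below configures a TLS client context that enforces server-certificate verification and a modern protocol floor. The commented-out certificate paths for mutual (client-certificate) authentication are placeholders, not the actual Z-Fact0r files.

```python
import ssl

# Build a TLS client context that enforces certificate verification,
# mirroring the X.509 handshakes described above.
def make_client_context() -> ssl.SSLContext:
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy SSL/TLS
    # For mutual authentication the client would also present its own
    # X.509 certificate (hypothetical placeholder paths, not loaded here):
    # ctx.load_cert_chain(certfile="client.crt", keyfile="client.key")
    return ctx

ctx = make_client_context()
```

A context built this way refuses unverified servers by default (`verify_mode` is `CERT_REQUIRED` and hostname checking is on).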
Data communication between components is essential for the project. End users generate data on their shop floors with sensors embedded in the machines and with new integrated sensors developed for the project. All these data are propagated through the system with data communication protocols such as HTTP and AMQP, creating a data-stream process in the system. Interoperability between the data communication protocols and brokers is crucial for successful data communication in the system: various data sources work together and use different communication protocols, so all these components and protocols must work together seamlessly. A message broker based on AMQP was developed for the project's data communication. In the initial phases of the project there were also RESTful APIs that helped in the initial development of the components.
The Incremental Integration Strategy (IIS) provides a unified framework for all the EU distributed partners, to work on common principles. By following the IIS, we try to ensure that the integration will be successfully executed in a timely manner. It defines a number of factors to monitor and steps to execute.
The IIS prescribes that components are integrated and tested incrementally to ensure smooth interaction among them. Components are combined one by one until all of them are logically integrated into the required application, instead of integrating the whole system at once and then testing the end product. Integrated components are tested as a group to ensure successful integration and data flow between them. The process is repeated until all components are combined and tested successfully. The tests included in the IIS are:
The IIS and the integration plan of the Z-Fact0r solution were based on the same APIs and protocols as the data exchange in the system. No new APIs were designed for the integration process, and the integration protocol implemented was derived from the IIS and the integration plan of the Z-Fact0r system.
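The incremental approach can be sketched with a toy pipeline: components are chained one at a time, and the partial pipeline is tested as a group after each addition. The component names here are illustrative stand-ins, not the actual Z-Fact0r modules.

```python
# Illustrative stand-ins for three components of a processing chain.
def sense(raw):           # data-collection stage
    return {"value": raw}

def detect(sample):       # defect-detection stage
    return {**sample, "defect": sample["value"] > 10}

def visualize(result):    # reporting stage
    return f"value={result['value']} defect={result['defect']}"

def integrate(*stages):
    """Compose the stages integrated so far into one pipeline."""
    def pipeline(x):
        for stage in stages:
            x = stage(x)
        return x
    return pipeline

# Step 1: integrate the first two components and test them as a group.
partial = integrate(sense, detect)
assert partial(12) == {"value": 12, "defect": True}

# Step 2: add the next component and re-test the grown pipeline.
full = integrate(sense, detect, visualize)
assert full(3) == "value=3 defect=False"
```

Each integration step leaves behind a tested subset, so a failure is localised to the most recently added component.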
During the integration phase the same communication protocols were used: HTTP and AMQP for data exchange. A Wi-Fi connection was also used for integrating the various components and applying their updates on premises or in the cloud during the integration process. Finally, FTP was used during the integration phase for quick transfer of files on the shop-floor premises.
The data exchange format throughout the project's components was JSON. JSON is lightweight, easy for humans to read and write, and provides all relevant information in a formatted way. It is also easy to extend with further fields when necessary, or to restructure for other components. XML was also used as a data exchange format; it has the same characteristics as JSON in terms of ease of use and accessibility. An example of one of the JSON formats used in the project is given below to describe the prediction outputs:
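The original example is not reproduced in this extract; the following is a hypothetical illustration of what such a prediction-output message could look like, with all field names assumed rather than taken from the actual Z-Fact0r schema.

```python
import json

# Hypothetical prediction-output message; field names are assumptions,
# not the actual Z-Fact0r schema.
prediction = {
    "machineId": "M-01",
    "timestamp": "2019-05-14T10:32:00Z",
    "predictedDefectProbability": 0.87,
    "recommendedAction": "inspect",
}

# Serialise for transport over HTTP/AMQP...
message = json.dumps(prediction)
# ...and a receiving component restores the same structure.
assert json.loads(message) == prediction
```

The round trip above is what makes JSON convenient as a common exchange format: any component can parse the message and pick out only the fields it needs.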
Semantic interoperability is desired in the project. An ontology was created to describe all the entities participating in the project's components, systems and communication protocols, as well as the entities provided by the end users. The Context Aware algorithm was based on this ontology to create the operation rules for the system. The algorithm provided the essential information to other components about the implementation of the solution. For example, the Context Aware algorithm provided the Reverse Supply Chain with all the necessary information about the production line, the production stages and the return levels, and the RSC was then able to create a set of rules to be implemented by the end user.
The whole Z-Fact0r platform was able to work with other external applications through a message broker that can receive data from and send data to external systems. The interoperability level between the Z-Fact0r platform and external applications is essential for communication and integration purposes. Security and safety issues arise when different platforms cooperate, so the Z-Fact0r platform implemented an AAA mechanism (Access, Authorisation and Authentication) to keep the platform secure during connections with external applications.
Access to the Z-Fact0r platform was given only to authorised users. The platform was installed either on the shop-floor premises or on servers deployed by the technical providing partners, creating a limited-access environment. There was also authorisation between the components and external applications: each component was authorised by an authorisation server with its unique Bearer Token in order to subscribe to the message broker and publish or receive the available data. Further steps, such as user authentication, were not included in the project scope.
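A minimal sketch of one such authorised call is shown below: a component builds a REST GET request carrying its Bearer token in the `Authorization` header. The URL and token are placeholders, and the request is only constructed here, not sent.

```python
import urllib.request

# Placeholder endpoint and token; in the real system the token would be
# issued to the component by the authorisation server.
BROKER_URL = "https://broker.example/api/subscribe"
TOKEN = "example-bearer-token"

# Build an authorised GET request towards the broker.
request = urllib.request.Request(
    BROKER_URL,
    headers={"Authorization": f"Bearer {TOKEN}"},
    method="GET",
)
```

The broker side would validate the token against the authorisation server before allowing the component to subscribe or publish.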
Z-Fact0r components were developed by different technical providing partners, mostly as standalone components. As a result, many different components worked individually on each shop floor, and an interoperability level was necessary for the Z-Fact0r system to work as a whole. Various integration processes and extensive planning took place during the project, resulting in an integrated system as the final product. Interoperability between the components was the first essential characteristic for the integration process: the components were designed in a way that allowed them to operate together without conflicts during data streaming and operation.
The Z-Fact0r architecture was based on the modular design of the components, followed by their integration into a complete system. For each component a specific architecture was defined by the responsible technology providing partner, based on the use cases, scenarios, end-user requirements and technical requirements. The design of each component was documented in the respective deliverable. Each component also followed the technological trends of its field and exploited the state of the art. An overall ontology of the Z-Fact0r system was created to include all possible actors, functions, assets, etc. All components were initially deployed as standalone applications, and an integration plan was then implemented. The Z-Fact0r project followed the Incremental Integration Strategy (IIS), whereby the components were deployed on the shop floors and integrated as one.
The Z-Fact0r project followed the AMQP and MQTT protocols for communication between the components. A message broker called iLike was developed by HOLONIX. iLike implemented the publish/subscribe mechanism for all components connected to it. The components were authorised in the iLike broker and repository with a Bearer Token and then used the mechanism to publish or receive data. An open API was used to create REST GET calls in order to initiate the communication between the component and the broker. The communication steps between the component and the broker were:
The data from the iLike machines are sent to a broker in the cloud using the MQTT protocol (a lightweight protocol that transports messages between devices). The broker stores the data as messages so that subscribers can retrieve the values.
An MQTT broker can easily scale from a single device to thousands, manages and tracks all client connection states, and permits secure connections.
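The store-and-deliver behaviour described above (the broker keeps the last message per topic so that late subscribers still receive it, analogous to MQTT retained messages) can be illustrated with a toy in-process broker. This sketches only the mechanism; a real deployment would use an actual MQTT broker.

```python
class ToyBroker:
    """Toy publish/subscribe broker with per-topic message retention."""

    def __init__(self):
        self.retained = {}      # topic -> last published payload
        self.subscribers = {}   # topic -> list of callbacks

    def publish(self, topic, payload):
        self.retained[topic] = payload            # store the message
        for callback in self.subscribers.get(topic, []):
            callback(topic, payload)              # fan out to subscribers

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)
        if topic in self.retained:                # deliver the stored value
            callback(topic, self.retained[topic])

broker = ToyBroker()
broker.publish("machine/temperature", 71.5)

received = []
broker.subscribe("machine/temperature", lambda t, p: received.append((t, p)))
# The subscriber gets the retained value even though it joined late.
assert received == [("machine/temperature", 71.5)]
```

This is the property the text highlights: sensors can publish without knowing who consumes the data, and consumers joining later still see the current values.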
Wireless communication in the Z-Fact0r solution was based on the Wi-Fi protocol.
The Z-Fact0r hybrid framework, obtained by applying a software and hardware integration strategy, is installed on the industrial end users' shop floors. This architecture exploits features of relational databases and a triplestore while using the blackboard architectural pattern, which ensures efficient and accurate data transfer among software applications and devices.
There is little integration with legacy systems, such as CMMS or ERP for the Z-Fact0r solution.
A RESTful API over HTTP was chosen to fulfil the need to send intermediate or final results from the Z-Modules to the repository; the API uses JSON as the default exchange format and JWT (JSON Web Token) as the authentication mechanism.
JWT is a standard that defines a JSON-format scheme for exchanging information between services. JWTs are widely used to authenticate requests in web-service authentication mechanisms: the client sends an authentication request to the server, the server generates a signed token and returns it to the client, which from that moment on uses it to authenticate subsequent requests.
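The sign-then-verify flow can be sketched with a minimal HMAC-SHA256 (HS256) JWT built from the standard library alone. This is an illustration of the mechanism, not production code: a real service would use a vetted JWT library, and the secret and claims below are placeholders.

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(claims: dict, secret: bytes) -> str:
    """Server side: issue a signed header.payload.signature token."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    signature = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{signature}"

def verify(token: str, secret: bytes) -> dict:
    """Server side on later requests: recompute and compare the signature."""
    header, payload, signature = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(signature, expected):
        raise ValueError("invalid signature")
    padded = payload + "=" * (-len(payload) % 4)   # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(padded))

secret = b"shared-secret"                    # placeholder key
token = sign({"sub": "z-module"}, secret)    # returned to the client
assert verify(token, secret) == {"sub": "z-module"}
```

The client never needs the secret; it simply replays the opaque token, and only a holder of the secret can produce a signature that verifies.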
To store data from different sources, including the data elaborated by the various Z-Modules, a Z-Fact0r data repository has been developed.
The first source of data is the temporal machine data coming from machine sensors. To store this data, Cassandra is used: a distributed NoSQL DBMS capable of handling large amounts of data across many servers while providing high availability.
The second source is other complex and structured production information, which is stored with the relational DBMS MySQL.
Another data source in the Z-Fact0r context is the output generated by the various modules that carry out the analyses.
All database schemas, communication protocols and security applications of the Z-Fact0r solution are designed to accommodate the scalability of the solution. All of this technology can be used in larger-scale projects without major changes. The one difference when dealing with big data is the use of a different database approach, such as MongoDB, which is more suitable for big-data analysis.
01-10-2016
-31-10-2019
01-10-2016
-31-03-2021
01-10-2016
-30-09-2019
01-10-2016
-30-09-2019
Data collection in a "manufacturing database" to enable real-time feedback on how deviations in the manufacturing process will impact part performance.
01-11-2015
-31-10-2017
01-05-2015
-31-07-2018
01-06-2017
-31-05-2020
01-09-2017
-28-02-2021
Open Platform Communication Unified Architecture (OPC-UA) is considered for UPTIME platform and for modular edge data collection and diagnosis of UPTIME_SENSE component.
The UPTIME platform is built upon the predictive maintenance concept, the technological pillars (i.e. Industry 4.0, IoT and Big Data, Proactive Computing) and the existing baseline tools (i.e. USG, preInO, PANDDA, SeaBAR, DRIFT) resulting in a unified information system for predictive maintenance. The extended UPTIME baseline tools (SENSE, DETECT, PREDICT, DECIDE, ANALYZE, FMECA, VISUALIZE) will address the various steps of the unified predictive maintenance approach and will incorporate interconnections with other industrial operations related to production planning, quality management and logistics management.
To ensure secure access, the UPTIME Platform offers appropriate authorisation and authentication mechanisms. These are based on JWT technology and are implemented using the Spring Security framework. Currently, JWT is used to ensure a secure log-in; as components are iteratively integrated, JWT will also be used to secure communications between components.
The UPTIME conceptual architecture was designed according to ISO/IEC/IEEE 42010 "Systems and software engineering – Architecture description" and mapped to RAMI 4.0 in order to ensure that it can represent predictive maintenance in the frame of Industry 4.0.
The UPTIME vision converges and synthesises predictive maintenance, proactive computing, Gartner's levels of industrial analytics maturity and ISO 13374 as implemented in MIMOSA OSA-CBM, in order to create a consistent basis for a generic predictive maintenance architecture in an IoT-based industrial environment. In this way, Operational Technology and Information Technology can also converge in the context of Industry 4.0.
One of the main functionalities of the UPTIME Platform is batch data analytics, implemented by the UPTIME_ANALYZE component, which analyses maintenance-related data from legacy and operational sources, and by the UPTIME_FMECA component, which provides estimation of possible failure modes. The interoperability interfaces with UPTIME end users' (e.g. Whirlpool) legacy systems are defined, specified and developed according to the latest practices and standards for APIs.
01-01-2019
-31-07-2022
Associated to QU4LITY Reference Architecture: Corporate Network/ Production OT Access Network: Deterministic Ethernet (TSN), OPC-UA, IDS/NGSI-LD
Details: OPC UA: an industrial M2M communication protocol for Interoperability; Information modelling
There was a risk that other developments made within this pilot would not follow the IDS reference architecture and would thus be incompatible. This would mean that certain applications could not be deployed and run within the proposed data space approach.
01-01-2019
-30-06-2023
01-01-2019
-31-12-2022