Comment: Service Oriented Architecture (SOA) concepts will be used to achieve a high degree of configurability, scalability and interoperability of the individual components, while maintaining the reliability, safety, certifiability and time-to-market benefits of off-the-shelf solutions.
Comment: The platform is based on a semantic database in which products, sensors, services, patterns and instances are linked. Their functionalities, characteristics and models are also linked to one another.
Comment: Digital Factory Models’ instances will gather data in a common format. This format is based on well-known standards such as BPMN, B2MML, gbXML and OGC, which build on XML and JSON syntax and provide a high level of simplicity, extensibility, interoperability and openness.
1) Security Information & Event Management API, 2) BPMN standard as part of the Integrated Digital Factory Model, 3) RabbitMQ implementation, 4) OGC SensorThings-compliant API through the Integrated Digital Factory Metadata Model, 5) Part of the Integrated Digital Factory Model
A Web API will return semantic data; the communication interface is the SPARQL query engine. The Z-BRE4K ontology is implemented with the Open Semantic Framework (OSF), an integrated software stack using semantic technologies for knowledge management. Furthermore, JSON-formatted data from the shop floor is transferred through an MQTT broker and finally stored in the I-LiKe machines' internal data repository. IDS connectors are used to transform data into the NGSI format and move it to the ORION context broker, where it is finally consumed by other applications. The Quality Information Framework (QIF) standard also guarantees interoperability, since it defines an integrated set of information models that enable the effective exchange of metrology data throughout the entire manufacturing quality measurement process – from product design to inspection planning to execution to analysis and reporting. OpenCPPS (part of AUTOWARE) will provide support for selected mainstream communication protocols and will define the proper interfaces for other communication protocols to be plugged in.
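A minimal sketch of the JSON-over-MQTT path described above, written in Python with the open-source paho-mqtt client; the broker address, topic and payload fields are illustrative assumptions rather than the actual project configuration.

import json
import paho.mqtt.client as mqtt

BROKER_HOST = "broker.example.org"          # hypothetical MQTT broker endpoint
TOPIC = "shopfloor/machine01/telemetry"     # hypothetical topic

# Illustrative shop-floor reading; field names are assumptions.
reading = {
    "machineId": "machine01",
    "timestamp": "2018-06-01T12:00:00Z",
    "temperature": 71.4,
    "vibration": 0.012,
}

client = mqtt.Client()
client.connect(BROKER_HOST, 1883, 60)
# Publish the reading as a JSON payload; a downstream subscriber would
# persist it in the internal data repository.
client.publish(TOPIC, json.dumps(reading), qos=1)
client.disconnect()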
Ontology-based data integration is part of the Z-BRE4K solution. The ontology effectively combines data and/or information from multiple heterogeneous sources. The ontology semantics used by the SPL program are described through OWL. Since OWL follows the RDF syntax, SPARQL is suitable for seamlessly querying an ontology defined in OWL, and SPARQL will also be used as the transformation language for converting semantic data into the corresponding syntactic data. IDS connectors are used in Z-BRE4K to guarantee interoperability among the various components that are not part of the Industrial Data Space. Part of the connectors' functionality is to transform data to/from the NGSI format so that it can be shared through the ORION context broker.
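The following sketch illustrates the OWL/SPARQL combination described above, using the open-source rdflib library in Python; the ontology file, namespace and class/property names are illustrative assumptions, not the actual Z-BRE4K vocabulary.

from rdflib import Graph

g = Graph()
g.parse("zbre4k_ontology.owl", format="xml")   # hypothetical ontology file (RDF/XML)

# Hypothetical query: list all sensors and the machines they are attached to.
SPARQL_QUERY = """
PREFIX ex: <http://example.org/zbre4k#>
SELECT ?sensor ?machine
WHERE {
    ?sensor a ex:Sensor ;
            ex:attachedTo ?machine .
}
"""

for row in g.query(SPARQL_QUERY):
    print(row.sensor, row.machine)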
The Z-BRE4K ontology contains information about all Z-BRE4K relevant data (metadata), linked in a way described by a controlled, shared vocabulary. The data relationships are part of the data itself, in one self-describing information package that is independent of any information system. In simple terms, this means that data from various sources can be easily harmonised. The shared vocabulary, and its associated links to an ontology, provide the foundation for machine interpretation, inference, and logic.
Comment: The communication middleware in the DIGICOR platform will be based on OPC-UA standard. In order to facilitate the flow of data to and from the middleware, the project will develop Data (harmonisation and transformation) Maps. The Data Maps will be offered as microservices to address the semantic and syntactic interoperability issues. In addition, standardised service interfaces will be defined to allow the discovery and enactment of services within the DIGICOR Platform.
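As a hedged illustration of the Data Map idea described above, the following Python sketch shows a small, stateless transformation that harmonises a source record into a shared target vocabulary; the field names and mapping are assumptions for illustration only, not the DIGICOR specification.

# Hypothetical mapping from a source schema to the shared vocabulary.
FIELD_MAP = {
    "temp_c": "temperatureCelsius",
    "ts": "timestamp",
    "mid": "machineId",
}

def harmonise(source_record: dict) -> dict:
    """Rename source fields to the shared vocabulary used by the middleware."""
    return {FIELD_MAP.get(key, key): value for key, value in source_record.items()}

# Deployed as a microservice, such a function would sit behind a standardised
# service interface and be invoked for every message crossing the middleware boundary.
print(harmonise({"temp_c": 21.5, "ts": "2018-06-01T12:00:00Z", "mid": "press-07"}))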
Data communication between components is essential for the project. End users create data on their shop floor with sensors embedded in the machines and with new integrated sensors developed for the project. All these data are propagated through the system with data communication protocols, such as HTTP and AMQP, creating a data-stream process in the system. Interoperability between the data communication protocols and brokers is crucial for successful data communication across the system: various data sources work together and use different communication protocols, so all these components and protocols must work together seamlessly. A message broker based on AMQP was developed for the project's data communication. In the initial phases of the project, RESTful APIs were also used to support the initial development of the components.
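A minimal sketch, assuming a RabbitMQ-style AMQP broker and the open-source pika client in Python, of publishing a shop-floor reading as described above; the host, queue name and payload fields are illustrative assumptions.

import json
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="sensor-data")   # hypothetical queue

# Illustrative message; field names are assumptions.
message = {"sensorId": "vib-03", "value": 0.017, "unit": "mm/s"}
channel.basic_publish(exchange="", routing_key="sensor-data", body=json.dumps(message))
connection.close()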
The Incremental Integration Strategy (IIS) provides a unified framework for all the distributed EU partners to work on common principles. By following the IIS, we try to ensure that the integration is executed successfully and in a timely manner. It defines a number of factors to monitor and steps to execute.
The IIS prescribes that components are integrated and tested incrementally to ensure smooth interaction among them. Components are combined one by one until all of them are logically integrated into the required application, instead of integrating the whole system at once and testing only the end product. The integrated components are tested as a group to verify successful integration and data flow between them, and the process is repeated until all components are combined and tested successfully. The tests included in the IIS are:
Semantic interoperability is desired in the project. An ontology was created to describe all the entities involved in the project's components, system and communication protocols, as well as the entities provided by the end users. The Context Aware algorithm was based on this ontology to create the operation rules for the system, and it provided the essential information to other components about the implementation of the solution. For example, the Context Aware algorithm provided the Reverse Supply Chain (RSC) with all the necessary information about the production line, the production stages and the return levels, and the RSC was then able to create a set of rules to be implemented by the end user.
The data exchange format throughout the project's components was JSON. JSON is lightweight, easy for humans to read and write, and provides all relevant information in a structured way. It is also easy to extend with further fields when necessary, or to restructure for other components. XML was also used as a data exchange format; it shares the same characteristics as JSON in terms of ease of use and accessibility. An example of one of the JSON formats used in the project, describing the prediction outputs, is given below:
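A minimal illustrative sketch in Python of the kind of prediction-output message described above; the concrete field names and values are assumptions, not the project's actual schema.

import json

# Hypothetical prediction output produced by a prognostics component.
prediction_output = {
    "machineId": "press-07",
    "timestamp": "2018-06-01T12:00:00Z",
    "predictedDefect": "surface_scratch",
    "probability": 0.87,
    "recommendedAction": "inspect_tooling",
}

print(json.dumps(prediction_output, indent=2))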
During the integration phase the same communication protocols were used: HTTP and AMQP for data exchange. Wi-Fi connectivity was also used for integrating the various components and deploying their updates on premises or in the cloud during the integration process of the system. Finally, FTP was used during the integration phase for quick transfer of files on the shop-floor premises.
The IIS and the Integration plan of the Z-Fact0r solution were based on the same APIs and protocols as the data exchange in the system. No new APIs were designed for the integration process, and the integration protocol implemented was derived from the IIS and the Integration plan of the Z-Fact0r system.
Comment: According to the RAMI 4.0 architecture, the “standard” way for Industrie 4.0 platforms to integrate legacy equipment (or any other kind of legacy “object”, including software components) will be to encapsulate it inside an ad-hoc Administration Shell wrapper, which will expose it as an I4.0 Component. The I4.0 interface specification has not been published yet, but a key enabling technology will probably be OPC UA, used as both a communication protocol and a data meta-model. In FAR-EDGE, OPC UA will be one of the supported field communication technologies.
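A hedged sketch of the wrapping approach described above (not the actual I4.0 Administration Shell specification): the following Python snippet exposes a hypothetical legacy machine as an OPC UA server using the open-source python-opcua library; the endpoint, namespace and node names are assumptions.

from opcua import Server

server = Server()
server.set_endpoint("opc.tcp://0.0.0.0:4840/legacy-wrapper/")   # hypothetical endpoint

idx = server.register_namespace("http://example.org/legacy")
machine = server.get_objects_node().add_object(idx, "LegacyMachine")
temperature = machine.add_variable(idx, "Temperature", 0.0)
temperature.set_writable()

server.start()
try:
    # In a real wrapper this would poll the legacy fieldbus/PLC and mirror
    # its values into the OPC UA address space.
    temperature.set_value(72.3)
finally:
    server.stop()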
Comment: The system blueprint is the FAR-EDGE RA. After having defined the requirements and the constraints for each block of the RA, a thorough analysis of the SotA was done, which led to the identification of some existing software components meeting the specs. We then identified the gaps that the project will need to fill in: not surprisingly, these were all key enabling technologies, like the distributed ledger. However, hardly anything is going to be built totally from scratch in FAR-EDGE. The distributed ledger, for example, will be a customization of a generic, open-source Blockchain platform (Hyperledger Fabric). Following the high-level functional decomposition defined in the FAR-EDGE RA, three Open APIs will be exposed by the Cloud layer of the FAR-EDGE Platform: Automation, Virtualization and Analytics. Automation will provide endpoints to monitor, control and manage automation workflows deployed on the Edge layer. Virtualization will create hooks for external simulation tools that need to read and manage factory models deployed on the Cloud layer (which are kept in sync with the real world by the lower layers). Analytics will allow the management of distributed data analysis processes running on the Edge layer and the collection of aggregated results.
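A purely hypothetical sketch of how an external tool might consume the Analytics Open API described above; the base URL, endpoint path and response fields are assumptions, since the actual FAR-EDGE API definitions are not given here.

import requests

BASE_URL = "https://faredge.example.org/api"    # hypothetical Cloud-layer host

# Hypothetical call: list distributed analytics processes running on the Edge
# layer and print their status.
resp = requests.get(f"{BASE_URL}/analytics/processes", timeout=10)
resp.raise_for_status()
for process in resp.json():
    print(process.get("id"), process.get("status"))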
Comment: The FAR-EDGE architecture defines a protocol abstraction layer (the “connectivity middleware” box in the diagram provided here – this name is provisional) to decouple the field from its Edge Computing infrastructure: this is where OPC UA compatibility is going to be introduced.
Comment: An IISF-based access control framework will allow the integration of additional external components into SAFIRE. An OPC-UA plug-in is provided for data ingestion. External systems can ingest data into the SAFIRE solution either by using pre-developed plug-ins (e.g. OPC-UA) or by developing their own plug-in modules.
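A minimal sketch of OPC-UA data ingestion in the spirit of the pre-developed plug-in mentioned above, using the open-source python-opcua client library; the server endpoint and node identifier are illustrative assumptions.

from opcua import Client

client = Client("opc.tcp://machine.example.org:4840/")   # hypothetical OPC-UA server
client.connect()
try:
    node = client.get_node("ns=2;i=1001")   # hypothetical node holding a sensor value
    value = node.get_value()
    print("ingested value:", value)
finally:
    client.disconnect()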
Comment: Goal: AUTOWARE will establish and push forward an open CPPS ecosystem, allowing SMEs to access all the different components in order to develop digital automation cognitive solutions for their manufacturing processes.
Leveraging a reference heterogeneous communications and networking architecture to support connectivity and data management in CPPS, allowing the interaction and interworking of the heterogeneous communications and networking technologies necessary to support the ubiquitous connectivity of CPPS.
Comment: Refrain from proprietary formats; if necessary, build adapters that go both ways (import/export).
Platform-independent micro-service architecture (micro-services are designed to be independent of the Bluemix stack, but can use it if it is available)
Standards compliance for product categorisation (eClass), business process specification (UBL), and manufacturing interoperation (oneM2M).
Comment: B2B collaboration: Semantic annotation of products and services.
Business process design and execution. Open ontologies.
UBL for business processes, eClass for products, and domain-specific ontologies aligned via a lightweight upper ontology
Comment: IEC-61499 establishes both a programming language and the rules for open communication protocols for real-time control of distributed applications. Interaction between non-real-time and real-time IIoT could be guaranteed through the standard.
Comment: A cross-platform acceptance of IEC-61499 for the real-time control domain for distributed applications guarantees an open and interoperable way of putting different platforms together with respect to this specific aspect.
Comment: The vf-OS publish-subscribe middleware will provide the data infrastructure and will be fully compatible with major industry standards (e.g. JBI, SCA, BPEL or WSDL). The IO Toolkit will implement connectors to devices or business software, with specific implementations for OPC UA and MQTT. Other standards such as ANSI/ISA-95 will also be followed.