Projects overview
ICP4Life | An Integrated Collaborative Platform for Managing the Product-Service Engineering Lifecycle
01-01-2015 – 01-01-2019
SYMBIO-TIC | Symbiotic Human-Robot Collaborative Assembly: Technologies, Innovations and Competitiveness
01-04-2015 – 01-04-2019
ambliFibre | adaptive model-based Control for laser-assisted Fibre-reinforced tape winding
01-09-2015 – 31-08-2018
BEinCPPS | Business Experiments in Cyber Physical Production Systems
01-11-2015 – 31-10-2018
Fortissimo 2 | Factories of the Future Resources, Technology, Infrastructure and Services for Simulation and Modelling 2
01-11-2015 – 31-10-2018
TWIN-CONTROL | Twin-model based virtual manufacturing for machine tool-process simulation and control
01-10-2015 – 30-09-2018
COMPOSITION | Ecosystem for Collaborative Manufacturing Processes – Intra- and Interfactory Integration and Automation
01-09-2016 – 31-08-2019
ConnectedFactories | Industrial scenarios for connected factories
01-09-2016 – 31-08-2019
INCLUSIVE | Smart and adaptive interfaces for INCLUSIVE work environment
01-10-2016 – 30-09-2019
Z-Fact0r | Zero-defect manufacturing strategies towards on-line production management for European factories
01-10-2016 – 31-03-2020
Within Z-Fact0r, the proposed (higher-level) DSS, with the support of the knowledge base and the online inspection module (first-level decision support at a single stage), produces, verifies and validates decisions aligned with the quality-control policies, production targets, desired product specifications and maintenance-management requirements. Key functional characteristics of the envisioned DSS include, among others, techniques for monitoring and predicting product quality, action prioritisation, root-cause analysis, and mitigation-planning algorithms (at product and workstation level). Moving beyond existing solutions that focus only on specific aspects of the production procedure, or that are restricted to diagnosis, the proposed DSS incorporates autonomous, hierarchical decision support based on process analytical technologies and newly developed, suitably adjusted knowledge-based systems, and combines product monitoring models with data analytics from heterogeneous sources. The envisioned DSS takes into account a wide set of factors and criteria, such as data uncertainty, lack of information and information quality, involvement of multiple actors, and real-time response. Thanks to the five intertwined zero-defect strategies (i.e. Z-PREDICT, Z-PREVENT, Z-DETECT, Z-REPAIR and Z-MANAGE), the overall solution contributes substantially to improving the overall performance and reliability of the targeted multi-stage manufacturing systems and the production agility (response to continuous adjustments in production targets).
DATAPIXEL provides the information associated with defect detection in the selected manufactured parts. This information is used as an input for developing the defect detection algorithms of the Z-Fact0r solution. Based on this input, a data-conditioning methodology has been developed to extract information concerning the defect position and type. This information will be used as a baseline for model validation, via comparison with the respective simulation results.
The procedure that has been used is the following (the selection and classification steps are sketched in code after the list):
- Image-based feature extraction: Convolutional Neural Networks (CNNs) and Variational Auto-Encoders (VAEs) are used as feature extractors. The CNNs define the features used for the classification between healthy and defected parts, while the VAEs learn latent representations that can also be used for image generation.
- Feature selection: An efficient filter feature selection (FS) method was developed for selecting informative and non-redundant feature subsets. In addition to enhanced accuracy rates and dimensionality reduction, the method has reasonably low computational demands. A robust and computationally efficient evaluation criterion with respect to patterns was defined, allowing the redundancy between features to be assessed. The proposed FS technique operates on a forward-selection basis, handling simultaneously both the discrimination power and the complementary characteristics of the extracted features. To decide on the number of retained features, a termination condition was introduced, thus avoiding the trial-and-error procedure usually employed by common FS techniques in the literature.
- Classification: The selected features were fed into a virtual classification module, whose role is to provide a decision on the workpiece condition. The technologies used were Artificial Neural Networks (ANNs) and deep learning algorithms.
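To make the selection and classification steps concrete, the sketch below runs a greedy forward filter selection (mutual information as a stand-in for discrimination power, mean absolute correlation as a stand-in for redundancy, with a simple termination threshold) followed by an ANN classifier. The criterion, threshold and random stand-in features are illustrative assumptions, not the exact method developed in Z-Fact0r.

```python
# Minimal sketch: forward filter feature selection + ANN classification.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))                   # stand-in for CNN/VAE features
y = (X[:, 3] + 0.5 * X[:, 10] > 0).astype(int)   # healthy vs. defected (toy labels)

relevance = mutual_info_classif(X, y, random_state=0)
selected, remaining = [], list(range(X.shape[1]))
eps = 1e-3                                        # termination threshold (assumed)

while remaining:
    def score(j):
        # Discrimination power minus redundancy with already selected features.
        if not selected:
            return relevance[j]
        red = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) for s in selected])
        return relevance[j] - red
    best = max(remaining, key=score)
    if score(best) < eps:                         # stop when the gain is negligible
        break
    selected.append(best)
    remaining.remove(best)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(X[:, selected], y)
print("selected features:", selected, "accuracy:", clf.score(X[:, selected], y))
```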
From a technological point of view, the Open vf-OS Platform provides elements covering the connected world, allowing the exchange of information and collaboration between companies along a value stream thanks to the cloud approach adopted (vf-Platform). Open vf-OS covers the stack from the control-device level upwards, where information from the systems (IoT, CPS, embedded systems) is gathered, processed and exploited.
SAFIRE | Cloud-based Situational Analysis for Factories providing Real-time Reconfiguration Services
01-10-2016 – 30-09-2019
ZAero | Zero-defect manufacturing of composite parts in the aerospace industry
01-10-2016 – 30-09-2019
5G-Ensure | 5G Enablers for Network and System Security and Resilience
01-11-2015 – 31-10-2017
Z-BRE4K | Strategies and Predictive Maintenance models wrapped around physical systems for Zero-unexpected-Breakdowns and increased operating life of Factories
01-10-2017 – 31-03-2021
The Z-BRE4K solution uses a variety of communication protocols: HTTP, OPC UA, IEEE 802.15.4e and WirelessHART. The Hypertext Transfer Protocol (HTTP) is an application protocol for distributed, collaborative, hypermedia information systems, and the foundation of data communication for the World Wide Web. OPC UA supports two protocols: the binary protocol (opc.tcp://Server) and Web Services (http://Server); otherwise OPC UA works completely transparently to the API. IEEE 802.15.4 is a technical standard which defines the operation of low-rate wireless personal area networks (LR-WPANs). It specifies the physical layer and media access control for LR-WPANs, and is maintained by the IEEE 802.15 working group, which defined the standard in 2003. WirelessHART is a wireless sensor networking technology based on the Highway Addressable Remote Transducer (HART) protocol. Developed as a multi-vendor, interoperable wireless standard, WirelessHART was defined for the requirements of process field device networks. The solution also uses the NGSI protocol, developed to manage context information. NGSI provides operations for managing the context information about context entities, for example the lifetime and quality of information, and for accessing (query, subscribe/notify) the available context information about context entities.
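As a small illustration of the NGSI operations just mentioned (create, query, subscribe/notify), the snippet below talks to an NGSI-v2 context broker over HTTP. The broker address, entity and attribute names are hypothetical; this is a generic NGSI-v2 sketch, not Z-BRE4K's actual context model.

```python
import requests

BROKER = "http://localhost:1026"   # assumed Orion-style NGSI-v2 context broker

# Create a context entity describing a machine's condition (hypothetical model).
entity = {
    "id": "urn:ngsi:Machine:press-01",
    "type": "Machine",
    "temperature": {"value": 71.5, "type": "Number"},
    "status": {"value": "running", "type": "Text"},
}
requests.post(f"{BROKER}/v2/entities", json=entity).raise_for_status()

# Query the available context information about the entity.
r = requests.get(f"{BROKER}/v2/entities/urn:ngsi:Machine:press-01")
print(r.json())

# Subscribe so a consumer is notified when the temperature attribute changes.
subscription = {
    "subject": {"entities": [{"id": "urn:ngsi:Machine:press-01"}],
                "condition": {"attrs": ["temperature"]}},
    "notification": {"http": {"url": "http://localhost:8000/notify"}},
}
requests.post(f"{BROKER}/v2/subscriptions", json=subscription).raise_for_status()
```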
The Z-BRE4K solution provides a big data analytics framework for identifying deterioration trends, to be extended towards prescriptive maintenance. Advanced data analysis tools are under development, to be applied to the quality and production data to realise zero-defect and zero-breakdown production. Furthermore, it involves models for anomaly detection that are capable of identifying the machine states where the operation deviated from the norm. This is achieved by collecting the data from the machine sensors in chunks of time and processing them in batch through deep learning models. The models try to recreate their inputs, and this results in an observable measure called the Reconstruction Error, which is used to identify states that the models are not capable of addressing sufficiently (which constitutes an anomaly).
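The reconstruction-error idea can be illustrated with a small autoencoder sketch: a model trained to recreate healthy data reconstructs anomalous chunks poorly. The data, network shape and the mean-plus-three-sigma threshold below are assumptions for illustration, not the project's deployed deep learning models.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, size=(500, 16))   # chunks of "healthy" sensor data

# Autoencoder: an MLP trained to map the inputs back onto themselves
# through a narrow bottleneck layer.
ae = MLPRegressor(hidden_layer_sizes=(8, 3, 8), max_iter=2000, random_state=0)
ae.fit(normal, normal)

def reconstruction_error(X):
    # Per-sample mean squared error between input and reconstruction.
    return np.mean((ae.predict(X) - X) ** 2, axis=1)

# Threshold learned from normal operation (mean + 3 sigma is one common rule).
err = reconstruction_error(normal)
threshold = err.mean() + 3 * err.std()

test = rng.normal(0, 1, size=(5, 16))
test[0] += 6.0                               # inject a deviation from the norm
flags = reconstruction_error(test) > threshold
print("anomaly flags:", flags)               # the first chunk should be flagged
```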
The Z-BRE4K solution will provide an ontology with annotation mechanisms that includes all the necessary information to perform predictive maintenance analysis and so achieve an extended operating life of assets in production facilities. This context includes the sensor data processing to be used as simulation input and the simulation process itself (physics-based modelling). It also covers the machine learning application, due to the use of prediction models in data-driven modelling. A Knowledge-Based System (KBS) will extract, store and retrieve all the relevant information, enriched with semantic annotations, to guarantee a prompt identification of criticalities. Shop-floor data is transformed into RDF (Resource Description Framework) data, a standard model for data interchange on the web, and stored in a triple-store database. The M3 Gage platform serves for fast verification of the machine, for condition monitoring, and as a data repository as well. It allows information interconnection from different data sources; furthermore, the architecture proposed by AUTOWARE provides the ability to establish data processing and computing at the most appropriate abstraction level in the hierarchy: Field, Workcell/Production Line, Factory and Enterprise. Different filtering and pre-processing algorithms are applied on the edge to clean real-time raw data and reduce unwanted noise. In addition, convolutional neural networks are used to process high-throughput video streams, providing a non-time-critical stream of features for cloud services.
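As a small illustration of the shop-floor-to-RDF step, the sketch below builds a few triples with rdflib and serialises them to Turtle, ready for loading into a triple store. The namespace and property names are invented for the example; the actual Z-BRE4K ontology is not reproduced here.

```python
from rdflib import Graph, Literal, Namespace, RDF, XSD

ZB = Namespace("http://example.org/zbre4k#")   # hypothetical namespace
g = Graph()
g.bind("zb", ZB)

# Represent one shop-floor sensor reading as RDF triples.
reading = ZB["reading-001"]
g.add((reading, RDF.type, ZB.SensorReading))
g.add((reading, ZB.machine, ZB["press-01"]))
g.add((reading, ZB.vibrationRms, Literal(0.42, datatype=XSD.double)))
g.add((reading, ZB.timestamp,
       Literal("2019-05-01T10:00:00", datatype=XSD.dateTime)))

print(g.serialize(format="turtle"))            # ready for a triple-store DB
```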
UPTIME will reframe the predictive maintenance strategy in a systematic and unified way, with the aim of fully exploiting the advancements in ICT and maintenance management. It examines the potential of big data in an e-maintenance infrastructure, taking into account Gartner's four levels of data analytics maturity and the principles of proactive computing.
UPTIME will enable manufacturing companies to reach Gartner's four levels of data analytics maturity for optimised decision making, each one building on the previous one: Monitor, Diagnose and Control, Manage, Optimize. It aims to optimise in-service efficiency and to increase accident mitigation capability by avoiding crucial breakdowns with significant consequences. The UPTIME components UPTIME_DETECT, UPTIME_PREDICT and UPTIME_ANALYZE aim to enhance the methodology framework for data processing and analytics. The key users of the UPTIME_DETECT and UPTIME_PREDICT components are data scientists, who are in charge of developing, testing and deploying algorithmic calculations on data streams. In this way, the component is able to identify the current condition of technical equipment and to give predictions. UPTIME_ANALYZE, on the other hand, is a data analytics engine driven by the need to leverage manufacturers’ legacy data and operational data related to maintenance, and to extract and correlate relevant knowledge.
In UPTIME, two data processing solutions are considered: (1) batch processing of data at rest, through massively parallel processing; and (2) real-time processing of data in motion, in which real-time data from heterogeneous sources are processed as a continuous "stream" of events (produced by some outside system or systems), with processing fast enough that all decisions are made without stopping the data stream and storing the information first.
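The difference between the two modes can be illustrated with a one-pass streaming statistic that never stores the stream. Welford's online algorithm below stands in for a real stream processing engine; UPTIME's actual engine is not specified here.

```python
import numpy as np

data = np.random.default_rng(0).normal(size=10_000)

# (1) Batch: all data at rest, processed at once.
batch_mean, batch_var = data.mean(), data.var()

# (2) Stream: events consumed one by one, without storing the stream.
count, mean, m2 = 0, 0.0, 0.0
for event in data:                      # imagine events arriving over time
    count += 1
    delta = event - mean
    mean += delta / count
    m2 += delta * (event - mean)        # Welford's online update
stream_var = m2 / count

print(batch_mean, mean)                 # the two modes agree
print(batch_var, stream_var)
```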
UPTIME's main functionalities are structured in three main modules, namely the edge, cloud and GUI modules.
- The UPTIME edge module will ensure data collection from machines, sensors, etc. and send it on for analysis. It may also include some additional functionalities which require real-time results.
- The UPTIME Cloud module contains all the advanced functionalities of the solution, which do not require a real time result. There we will analyse the data collected on the edge, as well as data received from relevant information systems, and provide the expected predictions. “Cloud” can refer to remote servers or an internal cloud within the customer’s Plant or Enterprise, as is deemed necessary by the customer.
- Lastly the GUI module, through which the user will interact with the previously mentioned functionalities, whether it is to view data or configure the solution.
The cloud-based infrastructure of the UPTIME platform includes 4 main components:
- UPTIME_DETECT and PREDICT component (extended version of the PreIno prototype) processes mainly timeseries-based data from the field to give further context to the data, e.g. to detect current conditions of technical equipment and to predict probable future conditions.
- UPTIME_ANALYZE (a newly developed prototype) is a data analytics engine driven by the need to leverage manufacturers’ legacy data and operational data related to maintenance, and to extract and correlate relevant knowledge.
- UPTIME_DECIDE component (extended version of the PANDDA prototype) implements a prescriptive analytics approach for proactive decision making in a stream-processing computational environment. It provides real-time prescriptions for the optimal maintenance actions and the optimal time for their implementation on the basis of streams of predictions about future failures (a toy decision rule is sketched after this list).
- UPTIME_FMECA (extended version of the DRIFT prototype) provides estimation of possible failure modes and of the evolution of risk criticalities through its data-driven FMECA approach.
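A hedged sketch of prescriptive decision making on a stream of failure predictions, in the spirit of UPTIME_DECIDE: act as soon as the expected cost of continuing to operate exceeds the cost of a preventive action. The cost figures, probabilities and decision rule are illustrative assumptions, not the PANDDA algorithm itself.

```python
C_PREVENTIVE = 1_000.0    # cost of a planned maintenance action (assumed)
C_BREAKDOWN = 10_000.0    # cost of an unexpected breakdown (assumed)

def prescribe(failure_probability):
    """Return the prescribed action for one incoming prediction event."""
    expected_breakdown_cost = failure_probability * C_BREAKDOWN
    if expected_breakdown_cost > C_PREVENTIVE:
        return "schedule maintenance now"
    return "keep operating"

# Stream of predicted failure probabilities arriving over time.
for p in (0.02, 0.05, 0.08, 0.12, 0.20):
    print(f"p(failure)={p:.2f} -> {prescribe(p)}")
```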
The UPTIME_SENSE component (USG prototype) is located in the edge-based infrastructure of the UPTIME platform. It aims to capture data from a wide variety of sources and cloud environments. SENSE brings configurable diagnosis capabilities to the edge, e.g. for real-time or off-the-grid applications. SENSE addresses 3 high-level functions, the first of which is sketched after the list:
- Sensor signal processing, which collects the signals from equipment or other sensors and pre-processes them before transmitting them to the cloud platform.
- Edge diagnosis, which provides optional state-detection diagnosis for certain use cases.
- Support functions, which cover the functions necessary for the correct operation of the edge module.
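To make the first SENSE function concrete, the sketch below smooths a raw sensor signal with a moving-average filter before forwarding it, and applies an optional threshold-based edge diagnosis. The filter width, threshold and alarm rule are illustrative assumptions, not the USG prototype's actual processing chain.

```python
import numpy as np

def preprocess(signal, window=5):
    """Smooth the raw signal with a moving average to reduce unwanted noise."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="valid")

def edge_diagnosis(signal, limit=3.0):
    """Optional on-edge state detection: flag if the smoothed level drifts."""
    return bool(np.max(np.abs(signal)) > limit)

raw = np.random.default_rng(0).normal(0, 1, size=100)
clean = preprocess(raw)
if edge_diagnosis(clean):
    print("alarm raised locally on the edge")
else:
    print("signal within limits; forwarding to cloud")  # stand-in for transmission
```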
PreCoM | Predictive Cognitive Maintenance Decision Support System
01-11-2017 – 31-10-2020
AI REGIO | Regions and DIHs alliance for AI-driven digital transformation of European Manufacturing SMEs
01-10-2020 – 30-09-2023
AI-PROFICIENT | Artificial Intelligence for improved PROduction efFICIEncy, quality and maiNTenance
01-11-2020 – 31-10-2023