SCOTT | Secure COnnected Trustable Things
01-05-2017 – 31-10-2020
01-05-2017 – 31-10-2020
01-05-2018 – 31-10-2021
01-06-2016 – 30-09-2019
01-07-2015 – 31-07-2018
01-06-2018 – 31-05-2021
01-01-2019 – 31-07-2022
Through innovative algorithms and statistical methods, possible data sources for predictive quality control can be identified and evaluated. Moreover, through the cooperation of all project partners, data access and acquisition along the whole process chain can be realized. With a focus on algorithms and methodology, a use-case-specific algorithm is going to be implemented and validated to maintain high prediction accuracy.
Data availability is a challenge: access to measurement data is limited, due to limited access to third-party systems.
There seems to be a relationship that would allow torque to be predicted from in-line data; this needs to be explored further.
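A first step in exploring such a relationship is to measure the linear correlation between an in-line signal and the measured torque. The sketch below uses the standard Pearson correlation coefficient with purely illustrative, hypothetical values; the actual in-line signals and torque measurements of the pilot are not shown in this document.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical in-line sensor signal vs. measured torque
inline_signal = [1.0, 1.2, 1.5, 1.7, 2.0, 2.1]
torque = [10.1, 10.9, 12.2, 12.8, 14.1, 14.3]
r = pearson_r(inline_signal, torque)
```

A value of r close to 1 or -1 would justify building a predictive model on the in-line data; a value near 0 would suggest looking at other data sources or non-linear methods.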
By applying sophisticated algorithms and methods to the acquired data, systematic failure root-cause detection supported by data analytics can be implemented. In addition, the desired solution path within this pilot improves knowledge of machine states and maintenance requirements at neuralgic points.
An AI vision algorithm developed by TNO (WP3) appears to filter out badly rated parts better than the installed algorithm. This can be an advantage when the product print changes, since the AI approach can catch up faster than traditional algorithm development.
For this trial, the acquired test data will be analyzed with regard to quality classification. In every test a part can pass or fail. Failed parts must be reworked, if possible, and brought back into the process. Sometimes parts are classified as failed even though they are good (false positives). This effect will be analyzed by machine learning algorithms and, if necessary, the classification parameterisation will be adapted. Additionally, 100% testing (every panel is tested automatically), combined with the bottleneck at out-of-line test stations, will be addressed by setting up failure prediction models for quality forecasting. This will be supported by data analysis of pre-reflow AOI (automated optical inspection).
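Before adapting the classification parameterisation, the false-positive rate of the installed classifier must be estimated from reworked parts whose true condition is known. A minimal sketch, assuming hypothetical labelled records (the trial's actual data format is not shown here):

```python
# Each record pairs the classifier verdict with the ground truth
# established after rework/re-inspection: ("fail"/"pass", "fail"/"pass").
records = [
    ("fail", "pass"),  # false positive: flagged as failed but actually good
    ("fail", "fail"),
    ("pass", "pass"),
    ("fail", "pass"),
    ("pass", "pass"),
    ("fail", "fail"),
]

flagged = [r for r in records if r[0] == "fail"]
false_positives = [r for r in flagged if r[1] == "pass"]
fp_rate = len(false_positives) / len(flagged)
```

A persistently high fp_rate on a given test station is the signal that its classification thresholds should be relaxed, reducing unnecessary rework.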
These data analysis and process optimization activities will be complemented by an economic evaluation to support in-process decisions and configuration changes. For the development of these applications, the main steps are data availability/access, data processing, and model development. The developed applications should be deployed on edge devices.
The Milling Digital Twin enables strategy design and quality control in milling processes using only software tools, simulation and virtual optimisation.
The Cockpit optimiser software provides an environment for the intelligent design of an automated cell together with the customer.
The Cockpit optimiser and the Milling Digital Twin use AI tools to accelerate the design and optimisation processes currently performed by operators.
Solutions to facilitate the operator's analytical thinking. The solution will help the operator correlate quality and process parameters in order to make decisions upstream in the process.
With the help of skilled production line workers, the data in the AI platform can be annotated and thereby produce the predictive models for ZDM autonomous quality inspection. The platform gives users the ability to monitor the Autonomous Quality (AQ) process and provide feedback for the ZDM.
To acquire quality data, all involved users and managers must understand some basic data science principles. Modern machine vision relies on large amounts of consistent data. The data acquisition process begins with an organized collection of samples, which should become an integral part of every standardized manufacturing process that involves automated quality inspection or ZDM.
There is a need to manage large data sets and big data, and for AI solutions for different manufacturing processes. Solutions need to support operators in decision-making.
Enable operators to work in a more complex environment while reducing the strain of administrative tasks and enabling easy production analytics by capturing information online instead of on paper.
Shopfloor worker (operator – technical support group): from a shopfloor perspective, new or altered job profiles should be defined. In essence the job profiles will remain the same, but operators and technical support groups need to understand and be able to work with these new technologies. This requires some basic knowledge of the (digitalized) systems. For operators, a lot can be captured in SOPs (Standard Operating Procedures), but technical support staff should also have some basic knowledge of the workings and the hardware/software side of the systems in order to support the shopfloor where needed.
The ZDM Autonomous Quality solutions are systems that perform tasks in an autonomous/automated way, requiring the intervention of an operator only when an operational tie-breaker is needed. In that case, the operator has to analyse the incident and provide a solution to the AQL system, interacting with it via an HMI.
Complete machine parameter correlation is realized, allowing machine operators to take into account all the assets from each workstation of the production line. This enhances their capacity compared to conventional analytics methods.
The end-to-end process supported by the overall architecture helps the operator and team leader in their daily activities to prevent and anticipate quality issues on the product as far as possible, via the analysis of a large amount of data linked together through the holistic semantic model.
01-01-2019 – 30-06-2023
01-01-2019 – 31-12-2022
01-10-2018 – 31-03-2021
The Industry4.E Lighthouse team has created careers resources targeted at citizens considering STEM careers, up-skilling or re-skilling, and SMEs interested in the training and resources available to get involved in Industry4.0.
Visit the Industry4.E careers webpage today and download our Industry4.0 careers opportunities flyer.
01-09-2017 – 29-02-2020
01-01-2020 – 31-12-2023
Operational services aim to collect product data on post-use Li-ion batteries about their use phase, in order to enable monitoring and full traceability of their life-cycle.
As the proactive exploitation of the DigiPrime platform enables car-monitored SOH (state of health) tracing and availability, less testing is needed to assess the residual capacity of the battery. Moreover, by knowing the structure of the battery packs, a decision support system can be implemented to adjust the de- and remanufacturing strategy accordingly and select the most suitable cells for re-assembly into second-life modules, thus unlocking a systematic circular value chain for Li-ion battery cell re-use. Furthermore, excessively degraded cells which cannot be re-used can be sent to high-value recycling, based on the knowledge of their material composition.
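The core of such a decision support system is a routing rule over the car-monitored SOH values. A minimal sketch, where the cell identifiers, SOH values and the reuse threshold are all illustrative assumptions rather than project data:

```python
def route_cells(cells, reuse_threshold=0.8):
    """Split cells into second-life reuse candidates and recycling,
    based on the car-monitored state of health (SOH, 0..1)."""
    reuse, recycle = [], []
    for cell_id, soh in cells:
        (reuse if soh >= reuse_threshold else recycle).append(cell_id)
    return reuse, recycle

# Hypothetical cells from a disassembled pack: (id, SOH)
cells = [("A1", 0.92), ("A2", 0.55), ("B1", 0.81), ("B2", 0.78)]
reuse, recycle = route_cells(cells)
```

In practice the threshold would depend on the second-life application, and the recycling branch would be further split by material composition to maximise recovery value.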
01-10-2019 – 30-09-2023
01-01-2020 – 31-12-2023
01-10-2019 – 31-03-2023
01-09-2019 – 30-11-2022
01-10-2019 – 31-03-2024
15-10-2019 – 14-10-2022
01-10-2017 – 31-03-2021
01-05-2015 – 31-07-2018
One of the objectives of the MANTIS project is to design and develop the human-machine interface (HMI) for the intelligent optimisation of production processes through the monitoring and management of their components. The MANTIS HMI should allow intelligent, context-aware human-machine interaction by providing the right information, in the right modality and in the best way for users when needed. To achieve this goal, the user interface should be highly personalised and adapted to each specific user or user role. Since MANTIS comprises eleven distinct use cases, the design of such an HMI presents a great challenge: any unification of the HMI design may impose constraints that result in an HMI with poor usability.
01-06-2017 – 31-05-2020
01-10-2017 – 31-03-2021
The Z-BRE4K solution uses a variety of communication protocols: HTTP, OPC UA, IEEE 802.15.4e and IEC WirelessHART. The Hypertext Transfer Protocol (HTTP) is an application protocol for distributed, collaborative, hypermedia information systems; HTTP is the foundation of data communication for the World Wide Web. OPC UA supports two protocols: the binary protocol (opc.tcp://Server) and Web Services (http://Server). Otherwise, OPC UA works completely transparently to the API. IEEE 802.15.4 is a technical standard which defines the operation of low-rate wireless personal area networks (LR-WPANs). It specifies the physical layer and media access control for LR-WPANs, and is maintained by the IEEE 802.15 working group, which defined the standard in 2003. WirelessHART is a wireless sensor networking technology based on the Highway Addressable Remote Transducer Protocol (HART). Developed as a multi-vendor, interoperable wireless standard, WirelessHART was defined for the requirements of process field device networks. The solution also uses the NGSI protocol. NGSI is a protocol developed to manage context information: it provides operations such as managing the context information about context entities (for example the lifetime and quality of information) and access (query, subscribe/notify) to the available context information about context entities.
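An NGSI-v2 context update is an HTTP request carrying a JSON entity document. The sketch below builds such a payload in the shape accepted by NGSI-v2 context brokers; the entity id, type and attribute names are illustrative assumptions, not taken from the Z-BRE4K deployment.

```python
import json

# Hypothetical machine entity in NGSI-v2 form: each attribute carries a
# value and a type, so the broker can manage quality-of-information metadata.
entity = {
    "id": "urn:ngsi-ld:Machine:press-001",
    "type": "Machine",
    "temperature": {"value": 71.4, "type": "Number"},
    "status": {"value": "running", "type": "Text"},
}
payload = json.dumps(entity)
# The payload would be POSTed to http://<broker>/v2/entities to register
# the entity; consumers then query or subscribe to it via the same API.
```

Subscriptions (subscribe/notify) use the same entity model: a consumer registers interest in, say, "temperature" changes and the broker pushes notifications when the context entity is updated.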
The Z-BRE4K solution provides a big data analytics framework for the identification of deterioration trends, to be extended towards prescriptive maintenance. Advanced data analysis tools are under development, to be applied to quality and production data to realise zero-defect and zero-breakdown production. Furthermore, it involves models for anomaly detection that are capable of identifying the machine states where operation deviated from the norm. This is achieved by collecting data from the machine sensors in chunks of time and processing them in batches through deep learning models. The models try to recreate their inputs, which results in an observable measure called the reconstruction error; this is used to identify states that the models are not capable of addressing sufficiently (which constitutes an anomaly).
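The reconstruction-error principle can be shown without a trained network. In the minimal sketch below, a per-window mean predictor stands in for the deep learning model: it reconstructs steady operation well (low error) but cannot reconstruct a deviating state (high error). The data, the stand-in model and the threshold are all illustrative assumptions.

```python
def reconstruction_error(window, model):
    """Mean squared error between a sensor window and its reconstruction."""
    recon = model(window)
    return sum((a - b) ** 2 for a, b in zip(window, recon)) / len(window)

def mean_model(window):
    """Stand-in 'model': reconstructs every sample as the window mean.
    A trained autoencoder would be used in practice."""
    m = sum(window) / len(window)
    return [m] * len(window)

normal = [1.0, 1.1, 0.9, 1.0, 1.05]     # steady operation
anomalous = [1.0, 1.1, 5.0, 1.0, 1.05]  # spike the model cannot reconstruct

threshold = 0.5  # assumed; tuned on normal-operation data in practice
errs = {name: reconstruction_error(w, mean_model)
        for name, w in [("normal", normal), ("anomalous", anomalous)]}
anomalies = [name for name, e in errs.items() if e > threshold]
```

The deep learning case works identically: the model is trained only on normal operation, so windows it reconstructs poorly are, by construction, states outside the norm.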
The Z-BRE4K solution will provide an ontology with annotation mechanisms that include all the information necessary to perform predictive maintenance analysis, in order to achieve an extended operating life of assets in production facilities. This context includes the sensorial data processing to be used as simulation inputs and the simulation process itself (physics-based modelling). It also includes the machine learning application, owing to the use of prediction models in data-driven modelling. A Knowledge Based System (KBS) will extract, store and retrieve all the relevant information, enriched with semantic annotations, to guarantee a prompt identification of criticalities. Shop floor data is transformed into RDF (Resource Description Framework) data, a standard model for data interchange on the web, and stored in a triple store DB. The M3 Gage platform also serves for fast verification of the machine, for condition monitoring, and as a data repository. It allows information from different data sources to be interconnected; furthermore, the architecture proposed by AUTOWARE provides the ability to establish data processing and computing at the most appropriate abstraction level in the hierarchy: Field, Workcell/Production Line, Factory and Enterprise. Different filtering and pre-processing algorithms are applied at the edge to clean real-time raw data and reduce unwanted noise. In addition, convolutional neural networks are used to process high-throughput video streams, providing a non-time-critical stream of features for cloud services.
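Transforming a shop-floor reading into RDF amounts to emitting subject-predicate-object statements. A minimal sketch in N-Triples-style syntax; the base URI and predicate names are hypothetical placeholders, not the project's actual ontology terms:

```python
# Hypothetical namespace; the real Z-BRE4K ontology terms are not shown here.
BASE = "http://example.org/zbre4k/"

def sensor_reading_to_triples(machine, sensor, value, unit):
    """Express one shop-floor reading as N-Triples-style statements."""
    subj = f"<{BASE}{machine}/{sensor}>"
    return [
        f'{subj} <{BASE}hasValue> "{value}" .',
        f'{subj} <{BASE}hasUnit> "{unit}" .',
        f"{subj} <{BASE}belongsTo> <{BASE}{machine}> .",
    ]

triples = sensor_reading_to_triples("press-001", "spindle-temp", 71.4, "Cel")
```

Once loaded into the triple store, such statements can be joined with the ontology's failure-mode and component descriptions, which is what lets the KBS answer questions no single data source could.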
The suggestion beyond the state of the art is to have intelligent machine simulators, so that an information- and knowledge-rich platform can provide an accurate account of the machine's current state and predictive (look-ahead) potential scenarios of the time, type, severity and risk of future breakdowns. Collected, processed, integrated and aggregated data will be structured and fed in real time into networked simulators, enabling advanced analysis and visualization to provide smart services, higher fidelity and prediction accuracy for production and manufacturing asset management. Different schemes for data collection configuration are implemented (ranging from dedicated IoT devices to independent methodologies) to collect raw data from sensors, pre-process and aggregate the information, and share the results with other services through an IDS connector. The Z-BRE4K IDS connectors have a reference architecture to ensure data sovereignty and integrity throughout this collection phase.
Within Z-BRE4K, semantic data modelling is used for interoperability. Semantic representation is used for machinery, critical components, failure modes as well as optimal conditions. Statistical methods and machine learning algorithms are used in offline mode to discover patterns in the data and associate them with specific events (pattern discovery, association rules), as well as to infer causality in cases such as quality control (quality estimation based on machine status).
Within Z-BRE4K, a novel software application will be developed and added to I-LiKe Machine’s tech stack: A Knowledge Base System (KBS) to extract, store, and retrieve all the relevant information enriched with semantic annotations to guarantee a prompt identification of criticalities in the process.
The KBS represents a step towards the implementation of novel and innovative solutions that are not yet common practice in manufacturing. The data repository takes the form of a triplestore, designed to store entities derived from triple collections representing subject-predicate-object relationships. On top of the repositories, a reasoning engine creates relationships and allows knowledge to be extracted and consumed by other applications.
In the framework of Z-BRE4K, an IoT approach is applied to integrate end-user machines into the Z-BRE4K platform. Through IoT gateways deployed on the shop floor, machine components are enabled to communicate their condition, sending sensor data to the cloud, where it is stored and analysed to provide predictive-maintenance-related information.
The UI's goal is to visualize maintenance data and component statuses (real-time data and relevant KPIs). The SPARQL web service is used to send custom SPARQL queries to the Semantic Framework RDF repository as a general-purpose querying web service. The UI can visualise the probability of breakdown and the RUL. CAD and CAE models would be useful for mapping these values onto a 3D visualisation.
Z-BRE4K provides the Semantic Framework as a RESTful web services API. Each request returns an HTTP status and, optionally, a document of result sets. Each result document can be serialized and may be expressed as RDF, pure XML or JSON. Operator input to the machine and threshold changes can be built into a UI, and these parameters can be monitored directly in the machine simulators. Furthermore, the dashboard application (M3 modules) will alert the shop floor operator about detected quality issues and suggest recommendations for production adjustment and machine maintenance.
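A general-purpose SPARQL querying web service typically takes the query text as a URL-encoded parameter of a GET request. The sketch below builds such a request; the endpoint URL, prefix and predicate name are hypothetical placeholders for the Semantic Framework's actual vocabulary.

```python
import urllib.parse

# Hypothetical query: machines whose remaining useful life drops below 100 h.
query = """
PREFIX zb: <http://example.org/zbre4k/>
SELECT ?machine ?rul WHERE {
  ?machine zb:hasRemainingUsefulLife ?rul .
  FILTER (?rul < 100)
}
"""

# Encode the query as the 'query' parameter of a GET request; many SPARQL
# endpoints also accept a 'format' hint for the result serialization.
request_url = "http://semantic-framework.example/sparql?" + urllib.parse.urlencode(
    {"query": query, "format": "json"}
)
```

The JSON result set returned by such a request is what the UI dashboards would consume to render the probability-of-breakdown and RUL views.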
The development of AUTOWARE apps will be supported by and linked to FIWARE. The AUTOWARE approach is to connect and extend the FIWARE for Industry resources and assets for the digital automation community, so that synergies emerge both in terms of multi-sided business opportunities and in the amount of resources made available to the cognitive automation community to build their autonomous solutions and apps.
FIWARE is a curated open source framework with components that can be assembled together with third-party ones to accelerate the development of Smart Solutions. The Orion Context Broker is the core component of any "Powered by FIWARE" platform. It enables the system to perform updates and to access the current state of the context.
The AUTOWARE platform was chosen because it is an open source project and is hardware agnostic.
The AUTOWARE platform will push forward and establish an open CPPS ecosystem. In the AUTOWARE framework, a collection of enablers has been defined as components/tools that will allow potential users (end users, system integrators or software developers) to easily apply the developed technical enablers in their daily work. Moreover, verification, validation and certification enablers will be introduced in the AUTOWARE platform.
Z-BRE4K's mission is to build a distributed software system solution embodying Industry 4.0 principles towards cyber-physical, digital, virtual and resource-efficient factories. The ultimate goal is to develop intelligent maintenance systems for increased reliability of production systems. Additionally, special attention will be given to processes, advancing technologies and products, integrating knowledge, training, technology and industrial development in a market-oriented environment. Z-BRE4K's intended impact on the European manufacturing industry is an increase in in-service efficiency by 24%.
Z-BRE4K will contribute to a productivity increase in different critical manufacturing processes, such as joining (GESTAMP), cutting (PHILIPS) and forming (GESTAMP, PHILIPS, SACMI), by providing analytics and suggestions in order to help minimize machine breakdowns. The main gain is a reduction in operational and maintenance costs. Furthermore, for the GESTAMP use case a real-time arc welding quality control system, based on infrared images, is being developed.
Z-BRE4K will make it possible to combine current manufacturing systems with current and new mechatronic systems. These combinations will lead to smarter manufacturing systems and thus a shorter ramp-up to higher quality and productivity.
Part of the Z-BRE4K project is the development of a novel embedded condition monitoring solution with cognitive capabilities, applying deep learning techniques to reduce the dimensionality of multimodal sensor data associated with a given machine/device and to provide meaningful features to predictive maintenance services in the cloud. The most suitable IoT edge devices (for an optimal trade-off between computational power and energy consumption), sensors providing relevant information on the condition of different components, and signal processing algorithms are proposed for different machines and processes. Data gathering is enabled by the installation of IoT gateways, where data in different protocols are homogenised and sent to the cloud for storage. Real-time data, relevant KPIs and information about component status are visualised through dedicated dashboards.
The modelling and simulation methods used in Z-BRE4K are mainly Finite Element Methods (FEM), in which complex problems and processes from the real world are simplified and solved using a numerical approach. First, an accurate digital model is created of the geometry and material properties of all involved objects, the boundary conditions between these objects, and process data (e.g. forces or temperature).
Then, the complex shape of all objects involved is approximated using a finite number of simple geometries (e.g. triangles), which simplifies the complex mathematical problem. A computer is capable of solving these mathematical operations at a rate impossible for humans and thus enables the user to analyse various scenarios, ranging from mechanical strains within the objects to rises in temperature or material fatigue. This information can be used to predict the remaining useful lifetime of a given tool.
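The FEM workflow described above (discretise the geometry, assemble a system of equations, apply boundary conditions, solve) can be shown on the simplest possible case: a 1-D elastic bar fixed at one end with an axial load at the other. This is a generic textbook sketch, not the project's simulation code; for this linear problem, FEM with linear elements reproduces the exact solution u(x) = F·x/EA.

```python
def bar_fem(n_elems, length, EA, load):
    """1-D axial bar FEM: fixed at x=0, point load at x=L.
    Returns nodal displacements (exact solution: u(x) = load*x/EA)."""
    n = n_elems + 1
    h = length / n_elems
    k = EA / h  # element stiffness
    # Assemble the global stiffness matrix from 2x2 element matrices
    K = [[0.0] * n for _ in range(n)]
    for e in range(n_elems):
        K[e][e] += k; K[e][e + 1] -= k
        K[e + 1][e] -= k; K[e + 1][e + 1] += k
    F = [0.0] * n
    F[-1] = load  # point load at the free end
    # Boundary condition: node 0 is fixed (u[0] = 0)
    for j in range(n):
        K[0][j] = 0.0
    K[0][0] = 1.0
    F[0] = 0.0
    # Gaussian elimination (no pivoting; matrix is well conditioned here)
    for i in range(n):
        for r in range(i + 1, n):
            f = K[r][i] / K[i][i]
            for c in range(i, n):
                K[r][c] -= f * K[i][c]
            F[r] -= f * F[i]
    # Back substitution
    u = [0.0] * n
    for i in range(n - 1, -1, -1):
        u[i] = (F[i] - sum(K[i][c] * u[c] for c in range(i + 1, n))) / K[i][i]
    return u

u = bar_fem(n_elems=4, length=1.0, EA=1000.0, load=50.0)
```

Real tooling simulations replace the 1-D bar with 3-D meshes, nonlinear materials and contact conditions, but the assemble-constrain-solve structure is the same.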
The simulation platform is deployed alongside the physical equipment to create intuitive maintenance control and management systems. The Z-BRE4K platform's simulation capabilities will estimate the remaining useful life, calling for maintenance and suggesting the optimal times to place orders for spare parts, reducing the related costs. The increased predictability of the system and the failure prevention actions will reduce the number of failures, maximise performance, and shorten repair/recovery times, reducing costs further.
By applying time series analysis, we are able to detect special events that are known (fault detection) or unknown (anomaly detection) during production. This information, correlated with sensor readings, is fed into machine learning algorithms that create estimates of Remaining Useful Life (RUL) and Health Indexes (HI) and forecast upcoming events (likelihood of failure). Special focus is given to techniques that can provide real-time information (fast computation and high accuracy) and that are scalable, in order to use new data as it becomes available. Additional information, such as mean time between failures based on historical data or expert opinion, CAE data, quality control data and real-time states, is also used in the design of the machine simulators.
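A simple, fast and incremental time-series detector of the kind described above is a rolling z-score: each new reading is scored against the statistics of the preceding window, and large deviations are flagged as events. A minimal sketch with illustrative readings and an assumed threshold of 3 standard deviations:

```python
import math

def rolling_zscores(series, window):
    """Z-score of each point against the preceding window of readings;
    large absolute values indicate a fault signature or an anomaly."""
    scores = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mean = sum(hist) / window
        var = sum((x - mean) ** 2 for x in hist) / window
        std = math.sqrt(var) or 1e-9  # guard against zero variance
        scores.append((i, (series[i] - mean) / std))
    return scores

# Hypothetical sensor readings with one spike at index 7
readings = [1.0, 1.1, 0.9, 1.0, 1.1, 0.95, 1.05, 4.0, 1.0]
events = [i for i, z in rolling_zscores(readings, window=5) if abs(z) > 3]
```

Because each score depends only on the last few readings, the method runs in real time and naturally incorporates new data as it arrives, matching the scalability requirement stated above.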
01-10-2017 – 30-09-2021
CloudiFacturing will extend the field of action of the technology developed in CloudFlow and CloudSME from the product development process to the production process, in order to leverage factory data with analytics algorithms and simulation tools.
Thanks to cloud resources, enough computing power is available to analyze different scenarios in a few days instead of several weeks.
Designers of CATMARINE and SKA are now able to achieve high-quality products by analyzing different manufacturing scenarios without wasting time, money and material.
The platform is able to optimize the resin injection points/vents and verify the presence of defects in the final product, thus ensuring complete and correct mold filling.
The outcomes of the project create the basis for improving the existing design of the water quench and will be used for the development of the new generation of nozzles.
It is expected that the new nozzle design, and thus the new water quench, will be available to customers within 5 years. These new products are expected to attract new clients: 5 new contracts in 1 year, increasing to 10 new contracts in 5 years, which will increase Ferram's turnover by 500k Euros in 1 year and 3.5 million Euros in 5 years after the experiment ends.
01-10-2017 – 30-09-2020
01-09-2017 – 30-11-2022