AI is one of the major mega-trends driving the fourth industrial revolution. While these technologies promise business sustainability and product and process quality, ever-changing market demands and a shortage of skilled workers, combined with the complexity of the technologies themselves, create an urgent need for new solutions: solutions that are agile, reusable, distributed, scalable, accountable, secure, standardized and collaborative.
To break the entry barriers for these technologies and unleash their potential, the knowlEdge project will develop a new generation of AI methods, systems and data management infrastructure. This framework will provide the means for secure management of distributed data and the computational infrastructure to execute the required analytic algorithms and redistribute knowledge, fostering a knowledge-exchange society.
To do so, knowlEdge proposes six major innovations in the areas of data management, data analytics and knowledge management:
(i) A set of AI services that use edge deployments as computational and live-data infrastructure, including a continuous-learning execution pipeline at the edge;
(ii) A digital twin of the shop-floor to test the AI models;
(iii) A data management framework deployed from the edge to the cloud, ensuring data quality, privacy and confidentiality and building a data-safe fog continuum;
(iv) Human-AI collaboration and domain-knowledge fusion tools that let domain experts inject their experience into the system, triggering automatic knowledge discovery so the system adapts to changes on its own;
(v) A set of standardization mechanisms for the exchange of trained AI models from one context to another;
(vi) A knowledge marketplace platform to distribute and exchange trained AI models.
The knowlEdge consortium consists of 12 partners from seven EU countries, and its solution will be tested and evaluated in three manufacturing sectors.
- Full knowlEdge website
- Initial knowlEdge website
- Exploitation Strategy & IPR Management
- Initial site-wide data collection and integration toolkit
- Evolutionary Requirement Engineering and Innovations [Initial/Updated/Final]
- Automated Kernel Search for Gaussian Processes on Data Streams
- Local Gaussian Process Model Inference Classification for Time Series Data
- Automated Model Inference for Gaussian Processes: An Overview of State-of-the-Art Methods and Algorithms
- Production Scheduling Optimization enabled by Digital Cognitive Platform
- Evaluating the Lottery Ticket Hypothesis to Sparsify Neural Networks for Time Series Classification