Flexibility in manufacturing means the ability to deal with a slightly or greatly mixed set of parts, to allow variation in part assembly and in process sequence, to change the production volume, and to change the design of the product being manufactured.
A lead time is the latency between the initiation and execution of a process; for example, the lead time between the placement of an order and the delivery of a new car by a manufacturer. (from https://en.wikipedia.org/wiki/Lead_time)
In business, engineering, and manufacturing, quality has a pragmatic interpretation as the non-inferiority or superiority of something; it's also defined as being suitable for its intended purpose (fitness for purpose) while satisfying customer expectations. (from https://en.wikipedia.org/wiki/Quality_(business))
Quality assurance (QA) is a way of preventing mistakes and defects in manufactured products and avoiding problems when delivering solutions or services to customers; which ISO 9000 defines as "part of quality management focused on providing confidence that quality requirements will be fulfilled". This defect prevention in quality assurance differs subtly from defect detection and rejection in quality control, and has been referred to as a shift left as it focuses on quality earlier in the process i.e. to the left of a linear process diagram reading left to right. (from https://en.wikipedia.org/wiki/Quality_control)
Productivity describes various measures of the efficiency of production. A productivity measure is expressed as the ratio of output to inputs used in a production process, i.e. output per unit of input. Productivity is a crucial factor in production performance of firms and nations. (from https://en.wikipedia.org/wiki/Productivity)
In systems engineering, dependability is a measure of a system's availability, reliability, and its maintainability, and maintenance support performance, and, in some cases, other characteristics such as durability, safety and security. In software engineering, dependability is the ability to provide services that can defensibly be trusted within a time-period. This may also encompass mechanisms designed to increase and maintain the dependability of a system or software. (from https://en.wikipedia.org/wiki/Dependability)
Business development entails tasks and processes to develop and implement growth opportunities within and between organizations. It is a subset of the fields of business, commerce and organizational theory. Business development is the creation of long-term value for an organization from customers, markets, and relationships. (from https://en.wikipedia.org/wiki/Business_development)
Occupational safety and health (OSH), also commonly referred to as occupational health and safety (OHS), occupational health or workplace health and safety (WHS), is a multidisciplinary field concerned with the safety, health, and welfare of people at work. (from https://en.wikipedia.org/wiki/Occupational_safety_and_health)
Material efficiency is a description or metric which expresses the degree in which raw materials are consumed, incorporated, or wasted, as compared to previous measures in construction / manufacturing projects or physical processes. Making a usable item out of thinner stock than a prior version increases the material efficiency of the manufacturing process. (from https://en.wikipedia.org/wiki/Material_efficiency)
Waste minimisation is a set of processes and practices intended to reduce the amount of waste produced. By reducing or eliminating the generation of harmful and persistent wastes, waste minimisation supports efforts to promote a more sustainable society. Waste minimisation involves redesigning products and processes and/or changing societal patterns of consumption and production. (from https://en.wikipedia.org/wiki/Waste_minimisation)
The efficiency and sustainability of the manufacturing of both current and future products is still very much determined by the processes that shape and assemble the components of these products.
Innovative products and advanced materials (including nano-materials) are emerging but are not yet exploited to their full advantage, since robust manufacturing methods to deliver these products and materials at large scale have not been developed. Research is needed to ensure that novel manufacturing processes can efficiently exploit the potential of novel products for a wide range of applications.
Integration of non-conventional technologies (e.g. laser, ultrasonic) towards the development of new multifunctional manufacturing processes (including in-process concepts: inspection, thermal treatment, stress relieving, machining, joining).
Mechatronics, which is also called mechatronic engineering, is a multidisciplinary branch of engineering that focuses on the engineering of both electrical and mechanical systems, and also includes a combination of robotics, electronics, computer, telecommunications, systems, control, and product engineering. (From https://en.wikipedia.org/wiki/Mechatronics)
Control technologies will further exploit increasing computational power and intelligence in order to meet the demands of increased speed and precision in manufacturing. Advanced control strategies will allow the use of lighter actuators and structural elements for obtaining very rigid and accurate solutions, replacing slower and more energy-intensive approaches. Learning controllers adapt the behaviour of systems to changing environments or system degradation, taking into account constraints and considering alternatives, thereby relying on robust industrial real-time communication technologies, system modelling approaches and distributed intelligence architectures.
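As a toy illustration of the closed-loop control idea described above (not from the source; the gains, time step and first-order plant model are all invented for the sketch), a discrete PID controller driving a simple plant toward a position setpoint might look like this:

```python
# Illustrative sketch only: a discrete PID controller and a made-up
# first-order plant (d(pos)/dt = (u - pos) / tau). All constants are
# arbitrary assumptions, not values from any real machine.

def pid_step(state, setpoint, measurement, kp, ki, kd, dt):
    """One PID update; state carries the integral and the last error."""
    error = setpoint - measurement
    integral = state["integral"] + error * dt
    derivative = (error - state["last_error"]) / dt
    state["integral"], state["last_error"] = integral, error
    return kp * error + ki * integral + kd * derivative

def simulate(setpoint=1.0, steps=2000, dt=0.01):
    """Run the loop and return the final plant position."""
    state = {"integral": 0.0, "last_error": 0.0}
    position, tau = 0.0, 0.2  # plant time constant (assumed)
    for _ in range(steps):
        u = pid_step(state, setpoint, position,
                     kp=2.0, ki=1.0, kd=0.05, dt=dt)
        position += (u - position) / tau * dt  # Euler step of the plant
    return position

if __name__ == "__main__":
    print(round(simulate(), 3))  # settles near the setpoint of 1.0
```

The integral term is what removes the steady-state error; a learning controller in the sense of the paragraph would additionally adapt the gains online as the plant degrades.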
Continuous monitoring of the condition and performance of the manufacturing system at component and machine level enables sustainable and competitive manufacturing, also by introducing autonomous diagnosis capabilities and context-awareness. Detecting, measuring and monitoring variables, events and situations will increase the performance and reliability of manufacturing systems. This involves advanced metrology, calibration and sensing, signal processing and model-based virtual sensing for a wide range of applications, e.g. event pattern detection, diagnostics, anomaly detection, prognostics and predictive maintenance.
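A minimal sketch of what such condition monitoring can look like in practice: a model-free anomaly detector that flags readings deviating strongly from a rolling window of recent history. The window size, threshold and vibration-like signal below are arbitrary assumptions for illustration.

```python
# Illustrative sketch (not from the source): rolling z-score anomaly
# detection over a machine signal, e.g. a vibration amplitude.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(signal, window=20, threshold=4.0):
    """Return indices whose value deviates more than `threshold`
    standard deviations from the rolling window of prior readings."""
    history = deque(maxlen=window)
    anomalies = []
    for i, x in enumerate(signal):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(x - mu) / sigma > threshold:
                anomalies.append(i)
        history.append(x)
    return anomalies

if __name__ == "__main__":
    # 40 baseline readings around 1.0, one spike, then normal again
    vibration = [1.0, 1.1, 0.9, 1.05, 0.95] * 8 + [5.0] + [1.0] * 5
    print(detect_anomalies(vibration))  # flags the spike at index 40
```

Real prognostics pipelines would replace the rolling statistics with calibrated sensor models or learned baselines, but the detect-deviation-from-expected structure is the same.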
Intelligent components enable the deployment of safe, energy-efficient, accurate and flexible or reconfigurable products and production systems. This includes the introduction of smart actuators and the use of advanced end-effectors composed of passive and active materials. Energy technologies are gaining importance, such as (super)capacitors, pneumatic storage devices, batteries and energy harvesting technologies.
Production equipment does not yet take full advantage of the benefits that new and advanced materials offer, and factories of the future will need more advanced equipment to meet the requirements for energy efficiency and environmental targets and to meet new demands for a connected world. The future will therefore see modern, lightweight, long-lasting/flexible and smart equipment able to produce current and future products for existing and new markets. There will be a step change in the construction of such equipment, leading to a sustainable manufacturing base able to deliver high added value products and customised production. Increased smartness in the manufacturing equipment also enables a systems approach with machines able to learn from each other and impacting on the human-machine interface.
Smarter equipment and manufacturing systems will have self-diagnosis (temperature, vibrations, noise) and embedded sensing, memory or active architectures, with functional materials allowing them to adjust work processes and operations to variances in structure, shape and material composition (right-first-time manufacture). The machine data captured through this inherent ‘smartness’ can be communicated between machines (M2M), at factory level and through supply chains, for a systems approach to manufacturing and to meeting customer demand.
New equipment components taking advantage of new designs and advanced materials (e.g. gears and transmissions providing longer equipment lifetimes, or active surfaces that can embed and release lubricant when needed (at higher pressures or temperatures)).
The Internet of Things (IoT) is a system of interrelated computing devices, mechanical and digital machines, objects, animals or people that are provided with unique identifiers (UIDs) and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction. (from https://en.wikipedia.org/wiki/Internet_of_things)
Advanced machine interaction with humans through the ubiquity of mobile devices will enable users to receive relevant production and enterprise-specific information regardless of their geographical location, tailored to the context and to the skills and responsibilities they own. Interactions with ICT infrastructures and equipment will become natural-language-like.
Data acquisition is the process of sampling signals that measure real world physical conditions and converting the resulting samples into digital numeric values that can be manipulated by a computer. Data acquisition systems, abbreviated by the acronyms DAS or DAQ, typically convert analog waveforms into digital values for processing. The components of data acquisition systems include:
Sensors, to convert physical parameters to electrical signals.
Signal conditioning circuitry, to convert sensor signals into a form that can be converted to digital values.
Analog-to-digital converters, to convert conditioned sensor signals to digital values.
Data acquisition applications are usually controlled by software programs developed using various general-purpose programming languages.
In summary, data acquisition is in itself a vast group of protocols, technologies, sensors, hardware and software.
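The chain described above (sensor → signal conditioning → analog-to-digital conversion → software) can be sketched in a few lines. The gain, reference voltage and resolution below are made-up assumptions, standing in for whatever a real DAQ front-end would use.

```python
# Hypothetical sketch of a DAQ chain: a small simulated sensor voltage
# is conditioned (amplified) and then quantized by an ideal ADC model.

def condition(voltage_v, gain=10.0, offset_v=0.0):
    """Signal conditioning: scale a small sensor signal into ADC range."""
    return voltage_v * gain + offset_v

def adc_convert(voltage_v, v_ref=5.0, bits=12):
    """Ideal ADC: map 0..v_ref volts onto 0..2^bits - 1 integer codes."""
    clamped = min(max(voltage_v, 0.0), v_ref)
    return round(clamped / v_ref * (2 ** bits - 1))

def acquire(raw_samples_v):
    """Run each raw sample through conditioning and conversion."""
    return [adc_convert(condition(v)) for v in raw_samples_v]

if __name__ == "__main__":
    # e.g. a thermocouple-like low-voltage signal
    print(acquire([0.05, 0.25, 0.50]))
```

A real system adds sampling clocks, anti-aliasing filters and driver software on top, but every layer corresponds to one of the list items above.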
Data storage is the recording (storing) of information (data) in a storage medium. DNA and RNA, handwriting, phonographic recording, magnetic tape, and optical discs are all examples of storage media. (from https://en.wikipedia.org/wiki/Data_storage)
Dataspaces are an abstraction in data management that aims to overcome some of the problems encountered in data integration systems. The aim is to reduce the effort required to set up a data integration system by relying on existing matching and mapping generation techniques, and to improve the system in a "pay-as-you-go" fashion as it is used. (from https://en.wikipedia.org/wiki/Dataspaces)
Cloud computing can be deployed as a private cloud, a public cloud or a hybrid cloud.
Digital Manufacturing Platforms can run on IaaS, PaaS or SaaS.
Consideration must be given to security measures in the cloud (e.g. Kubernetes and container security) and to identity and access management, as well as to a careful evaluation of the security measures offered by the respective cloud service providers.
In computer science, artificial intelligence (AI), sometimes called machine intelligence, is intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans and animals. Computer science defines AI research as the study of "intelligent agents": any device that perceives its environment and takes actions that maximize its chance of successfully achieving its goals. Colloquially, the term "artificial intelligence" is used to describe machines that mimic "cognitive" functions that humans associate with other human minds, such as "learning" and "problem solving". (from https://en.wikipedia.org/wiki/Artificial_intelligence)
Fuzzy logic is a form of many-valued logic in which the truth values of variables may be any real number between 0 and 1 inclusive. It is employed to handle the concept of partial truth, where the truth value may range between completely true and completely false (from https://en.wikipedia.org/wiki/Fuzzy_logic)
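A small worked example of the idea (illustrative only; the temperature ranges are arbitrary): a fuzzy predicate whose truth value can be any real number in [0, 1], combined with the common min-based conjunction and 1-minus negation.

```python
# Toy fuzzy-logic sketch (assumed ranges, not from the source):
# the predicate "temperature is hot" with a ramp membership function.

def hot(temp_c, cold_below=20.0, fully_hot_at=40.0):
    """Degree of truth of 'hot': 0 below 20 C, 1 above 40 C, linear between."""
    if temp_c <= cold_below:
        return 0.0
    if temp_c >= fully_hot_at:
        return 1.0
    return (temp_c - cold_below) / (fully_hot_at - cold_below)

def fuzzy_and(a, b):
    """A common choice of fuzzy conjunction: the minimum."""
    return min(a, b)

def fuzzy_not(a):
    """Fuzzy negation: complement of the truth value."""
    return 1.0 - a

if __name__ == "__main__":
    print(hot(30.0))   # partially true: 0.5
    # note: unlike classical logic, x AND NOT x need not be 0
    print(fuzzy_and(hot(30.0), fuzzy_not(hot(30.0))))
```

In a manufacturing controller, such partial truths are typically combined by rules ("if temperature is hot and load is high, reduce feed") and then defuzzified into a crisp actuator command.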
Artificial neural networks (ANN) or connectionist systems are computing systems vaguely inspired by the biological neural networks and astrocytes that constitute animal brains. The neural network itself is not an algorithm, but rather a framework for many different machine learning algorithms to work together and process complex data inputs. Such systems "learn" to perform tasks by considering examples, generally without being programmed with any task-specific rules. (from https://en.wikipedia.org/wiki/Artificial_neural_network)
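As a minimal concrete instance of such learning-from-examples (a sketch, not a production ANN): a single artificial neuron trained with the classic perceptron update rule to reproduce logical OR, with no task-specific rules coded in — only examples.

```python
# Toy perceptron sketch (illustrative assumption): one neuron,
# two inputs, step activation, trained by the perceptron rule.

def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of ((x1, x2), target) pairs with targets 0/1.
    Returns a trained prediction function."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            # perceptron rule: nudge weights toward reducing the error
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

if __name__ == "__main__":
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]  # OR
    neuron = train_perceptron(data)
    print([neuron(a, b) for (a, b), _ in data])  # [0, 1, 1, 1]
```

Modern networks stack many such units with differentiable activations and train them by backpropagation, but the learn-by-adjusting-weights principle is the same.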
In computer science and operations research, a genetic algorithm (GA) is a metaheuristic inspired by the process of natural selection that belongs to the larger class of evolutionary algorithms (EA). Genetic algorithms are commonly used to generate high-quality solutions to optimization and search problems by relying on bio-inspired operators such as mutation, crossover and selection. (https://en.wikipedia.org/wiki/Genetic_algorithm)
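The bio-inspired operators mentioned above — selection, crossover and mutation — can be sketched on the classic OneMax toy problem (maximize the number of 1-bits in a string). Population size, tournament size and rates below are arbitrary choices for illustration.

```python
# Minimal genetic algorithm sketch (illustrative assumption): evolve
# bit-strings toward the all-ones optimum using tournament selection,
# one-point crossover and bit-flip mutation.
import random

def evolve(length=20, pop_size=30, generations=60, seed=1):
    rng = random.Random(seed)  # seeded for reproducibility
    fitness = lambda ind: sum(ind)  # OneMax: count the 1-bits
    pop = [[rng.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            # tournament selection: best of 3 random individuals
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, length)       # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(length):              # bit-flip mutation
                if rng.random() < 1.0 / length:
                    child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(map(fitness, pop))  # best fitness found

if __name__ == "__main__":
    print(evolve())  # approaches the optimum of 20
```

For a real optimization or search problem, only the individual encoding and the fitness function change; the evolutionary loop stays the same.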
Simulation (often referred to as digital twins) is the imitation of the operation of a real-world process or system. The act of simulating something first requires that a model be developed; this model represents the key characteristics, behaviors and functions of the selected physical or abstract system or process. The model represents the system itself, whereas the simulation represents the operation of the system over time. (from https://en.wikipedia.org/wiki/Simulation)
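A minimal sketch of the model/simulation distinction (all numbers invented): the model is a workstation with a fixed cycle time and periodic part arrivals; the simulation runs that model over time, here revealing that the input buffer grows because parts arrive faster than the machine can process them.

```python
# Illustrative time-stepped simulation sketch (assumed parameters):
# parts arrive every 3 minutes, machining one part takes 4 minutes.

def simulate(minutes=60, arrival_every=3, cycle_time=4):
    """Run the workstation model one minute per step and report
    how many parts were started and how many are still queued."""
    buffer, busy_until, started = 0, 0, 0
    for t in range(minutes):
        if t % arrival_every == 0:
            buffer += 1                    # model: a new part arrives
        if t >= busy_until and buffer > 0:
            buffer -= 1                    # model: machine takes a part
            busy_until = t + cycle_time    # machine busy for one cycle
            started += 1
    return {"started": started, "queued": buffer}

if __name__ == "__main__":
    print(simulate())  # the queue grows: arrivals outpace the machine
```

A digital twin extends this idea by keeping the model synchronized with live data from the physical system instead of running it on assumed inputs.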
The Orion Context Broker Generic Enabler is the core and mandatory component of any “Powered by FIWARE” platform or solution. It enables managing context information in a highly decentralized and large-scale manner. It provides the FIWARE NGSIv2 API, a simple yet powerful RESTful API that supports updates, queries and subscriptions to changes in context information.
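As a rough illustration of the NGSIv2 data model (entity attributes expressed as value/type pairs), the sketch below builds an entity payload of the kind one would POST to Orion's /v2/entities endpoint. The machine entity and the broker address are hypothetical, and the HTTP call itself is omitted so the example stays self-contained.

```python
# Sketch only (not an official FIWARE client): constructing an
# NGSIv2 entity payload for the Orion Context Broker.
import json

def make_entity(entity_id, entity_type, **attrs):
    """NGSIv2 entities carry attributes as {"value": ..., "type": ...}."""
    entity = {"id": entity_id, "type": entity_type}
    for name, value in attrs.items():
        ngsi_type = "Number" if isinstance(value, (int, float)) else "Text"
        entity[name] = {"value": value, "type": ngsi_type}
    return entity

if __name__ == "__main__":
    # hypothetical machine entity for a shop-floor context
    machine = make_entity("urn:ngsi-ld:Machine:001", "Machine",
                          temperature=68.5, status="RUNNING")
    # would be POSTed to http://<broker-host>:1026/v2/entities
    print(json.dumps(machine, indent=2))
```

Consumers would then subscribe to changes on such entities (e.g. temperature crossing a threshold) rather than polling, which is the decentralization the broker provides.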
The Cygnus Generic Enabler brings the means for managing the history of context that is created as a stream of data which can be injected into multiple data sinks, including some popular databases like PostgreSQL, MySQL, MongoDB or AWS DynamoDB as well as BigData platforms like Hadoop, Storm, Spark or Flink.
The “servitization wave” of manufacturing has already spread across the advanced economies, and many leading high-capital-investment sectors (e.g. aerospace and automotive) already compete in international markets by offering their customers a composition of services for product operation (e.g. maintenance, reliability, upgrades) and end-of-life use (e.g. re-manufacturing, recycling, disposal). SMEs in particular are trying to compete in international markets with their niche solutions, adding innovative services to their value propositions. Such innovative business models are based on a dynamic network of companies, continuously moving and changing in order to support ever more complex compositions of services. In such a context, there is a strong need for distributed, adaptive and interoperable virtual enterprise environments supporting these ongoing processes. To this end, new tools must be provided for enabling and fostering the dynamic composition of enterprise networks. In particular, SMEs call for tools and instruments that follow them in their continuous re-shaping process, enabling collaboration and communication among the different actors of the product-service value chains. New IPR methods are also needed.
Rising transport costs, the need for higher efficiency and productivity, customer demand for greener products, greater instability of raw material and energy prices, and the shortening of production lead times will push for a more critical assessment of delocalisation strategies towards low-cost countries. Service-led personalised products will require a new paradigm for the re-industrialisation of western countries (Globalisation 2.0), moving the manufacturing of selected products back (re-shoring).
Finally, innovation should become a business model in itself and a continuously running business process (the factory innovation): increasing competitiveness through the design of a new product requires the development of a company strategy where product and process innovation is seen as a permanent, widely distributed, multi-level, socially oriented and user-centric activity. Collaboration among companies of different sectors to exploit multi-disciplinary cross-fertilisation is also envisaged. New tools, methodologies and approaches for user-experience intelligence (i.e. social networks, crowdsourcing, qualitative and quantitative social science methods to generate insights, models and demonstrations, etc.) need to be addressed and explored.
According to the new paradigm of sustainability, the importance of the user is increasing. The user is at the same time a customer, a citizen and a worker. The well-being of the user could therefore become a winning strategy for B2B as well as B2C companies. More detailed behavioural modelling can help the development of innovative solutions aiming at user comfort, safety, performance and style; this requires a new competitive focus for the development of these innovative solutions and new business models to support a quick and dynamic response to market changes.
As products are today virtually designed and tested before being engineered for production, new business models also need tools that support the company in designing and testing them before they are implemented through products, services and manufacturing processes. The complexity of these tools is higher than that of tools for product development, due to the need for holistic modelling of products and processes.
The European Factories of the Future are expected to provide global manufacturing competitiveness, but also to create a large amount of work opportunities for the European population. Future factory workers are therefore key resources for industrial competitiveness as well as important consumers. However, as previously stated, the changing demographics and high skill requirements faced by European industry pose new challenges. Workers with high knowledge and skills (“knowledge workers”) will be scarce resources. Research efforts within Horizon 2020 must address ways to increase the number of people available for, and interested in, manufacturing tasks. This includes the following important aspects of the human resources:
- New technology-based approaches to accommodate age-related limitations, through ICT and automation
- New technical, educational, and organisational ways to increase the attractiveness of factory work to the young potential workforce, the existing workforce, the potential immigrant workforce, and the older workforce
- New approaches to skill and competence development, as well as skill and knowledge management, to increase competitiveness and be part of the global knowledge society
- New ways to organise and compensate factory knowledge workers
- New factory human-centric work environments based on safety and comfort
- Ways to integrate future factory work in global and local societal agendas and social patterns
Authorisation is the process of allowing an entity (a human, system or device) to access information systems or facilities where information and processing capabilities are stored. More practically, in an industrial setting with Digital Manufacturing Platforms, an authorised person can get access to an operational machine in order to update it or investigate its contents. Unauthorised access could be someone who has been able to reach the network from the outside and performs actions that have not been authorised and cannot be justified.
Authentication is the process of verifying the claimed identity of an entity by means of a set of instruments. In the case of Digital Manufacturing Platforms these instruments would be a user name and password and, in addition, a second factor such as a physical token or a mobile phone that can authenticate the person accessing the platform. The physical token ties the person to something they have, the password to something they know.
A third A in the AAA architecture relates to Access. Once authorised and authenticated, access can be granted to the location, system, application and/or information. Access control levels can thus be set up on different layers. These can be physical (access to the country, the plant, the building, the room and the environment where the system is located) and logical (using authentication technologies). For Digital Manufacturing Platforms this means the systems could be accessible only on premise, in the factory, or for instance in the (private or public) cloud. As a result, different access mechanisms need to be considered, depending on the risk and the intended security levels and controls.
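The three A's can be sketched together in a toy access check. This is purely illustrative: the in-memory user store, the two hard-coded roles and the unsalted password hashing below are simplifications one would never use in production, where salted key-derivation functions and a real identity provider belong.

```python
# Toy AAA sketch (illustrative assumptions throughout): authenticate
# with two factors, then authorize the requested action by role.
import hashlib

USERS = {  # hypothetical user store
    "operator1": {
        "pw_hash": hashlib.sha256(b"s3cret").hexdigest(),  # toy: no salt!
        "token": "HW-TOKEN-42",   # enrolled second factor (something you have)
        "role": "operator",
    }
}
PERMISSIONS = {  # hypothetical role -> allowed actions mapping
    "operator": {"read_machine"},
    "engineer": {"read_machine", "update_machine"},
}

def authenticate(user, password, token):
    """Verify identity: password (something you know) + token (have)."""
    rec = USERS.get(user)
    if not rec:
        return False
    pw_ok = rec["pw_hash"] == hashlib.sha256(password.encode()).hexdigest()
    return pw_ok and rec["token"] == token

def authorize(user, action):
    """Check the authorization rule for the user's role."""
    rec = USERS.get(user)
    return bool(rec) and action in PERMISSIONS.get(rec["role"], set())

def access(user, password, token, action):
    """Grant access only when both checks pass."""
    return authenticate(user, password, token) and authorize(user, action)
```

The separation matters: a valid login (authentication) still does not entitle the operator to update the machine (authorization), mirroring the layered controls described above.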
APIs (including REST APIs) need to be carefully protected through mechanisms that limit access on the basis of identity and that authorise and authenticate through managed and controlled mechanisms. Usually certificates and IP addresses are used to restrict access to APIs, but a more granular approach is advisable from a Security Architecture perspective. Other integration mechanisms used in Digital Manufacturing Platforms are JSON-based interfaces (for their near-real-time capabilities) and message queue (MQ) bus architectures. The latter is considered less secure, since it provides a continuous stream of information being sent to a destination.
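A toy sketch of the layered API protection discussed above: an incoming request must pass an IP allowlist and present a valid bearer token before the handler runs. The networks and the token value are placeholder assumptions; a real deployment would validate signed, expiring tokens (e.g. JWTs) against an identity provider rather than a static set.

```python
# Illustrative sketch only: layered access control for an API request.
from ipaddress import ip_address, ip_network

# hypothetical allowlist: plant network and an admin subnet
ALLOWED_NETS = [ip_network("10.0.0.0/8"), ip_network("192.168.1.0/24")]
VALID_TOKENS = {"example-token-abc"}  # placeholder; normally issued/rotated

def is_request_allowed(client_ip, auth_header):
    """Grant access only if the caller's IP is allowlisted AND a known
    bearer token is presented - two independent layers of control."""
    ip_ok = any(ip_address(client_ip) in net for net in ALLOWED_NETS)
    header = auth_header or ""
    token_ok = header.startswith("Bearer ") and \
        header.split(" ", 1)[1] in VALID_TOKENS
    return ip_ok and token_ok
```

Requiring both layers means a leaked token is useless from outside the plant network, and a compromised internal host still needs valid credentials.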
A Security Architecture is a conceptual design that addresses the various aspects of security in a system and the resulting application, set of applications and components that make up the system. It is used to support the design, development, implementation and operation of such systems, which can include Digital Manufacturing Platforms. For Digital Manufacturing Platforms it addresses the necessities and potential risks identified from relevant scenarios or within a specific environment. It aims to present a comprehensive perspective on the various security concepts in the conceived OT and IT architecture, which includes the networks, the systems and equipment connected to these networks, the communication protocols and operating systems being used, and the application development and operational processes, and it recommends security measures in the form of security controls.

Having a Security Architecture also helps both the design and the integration process, supports the identification of incidents and security monitoring, speeds up discussions with partners towards a level playing field and best practices, and is generally reproducible. Digital Manufacturing Platforms tend to bridge operational systems with information technology, for example through analytics, data collection, distribution and visualization, which can lead to automated actions by these systems on the basis of unattended and unsupervised decisions and control implementations. To avoid physical harm, collateral damage and other safety or cybersecurity issues, a Security Architecture supporting the Digital Manufacturing Platform should allow developers and companies at least to consider the various aspects and challenges of security in an organized and comprehensible manner. Architectures can follow standards such as IEC 62443, ISO 27k or NIST 800.16, or any alternative scheme, but the chosen scheme needs to be complete with respect to the digital and operational platforms.
Federal Ministry of Economic Affairs and Energy - Alignment Report for Reference Architectural Model for Industrie 4.0 / Intelligent Manufacturing System Architecture. Sino-German Industrie 4.0 / Intelligent Manufacturing Standardisation Sub-Working Group.
The resources layer addresses standards regarding the distribution of all participating physical Industry 4.0 components. This includes equipment and machinery (typically at field and process level), control systems, applications and network infrastructure.
The communication layer includes standards regarding integration and communication protocols as well as related mechanisms that contribute to the interoperability of components. It also includes the provision of services for the control of integration processes.
The information layer focuses on standards that regulate information flows, information objects and data models being used by components, services and their functions to exchange information. A common semantic base is the key requirement to ensure consistent integration of different data and achieve interoperability between components at this layer.
The function layer includes standards that aggregate common functions derived from applications, systems and components. Remote access and horizontal integration take place within this layer to ensure the integrity of information and of conditions in the process, and the integration of the technical level.
The business layer focuses on standards that represent business objectives and the resulting overall processes regarding the exchanged information at the lower layers. The layer addresses regulatory and market perspectives, business models, business cases and products, business processes and regulations influencing business models.
This is a collection of Industrial CyberSecurity Standards and de-facto standards relevant for organizations designing, developing, selecting, installing and operating digital manufacturing platforms. The selection was made on the basis of expert advisory and selections by researchers in their assessment of relevant State of the Art. Next to the standards, readers should also consider the works ongoing in standardisation efforts.
ISO 27000 (or ISO27k) refers to a standard and a series of international standards (27001, 27002, ...) on information security management systems (ISMS), set up by ISO (the International Organization for Standardization). The standard was initiated around 2005 and has been revised multiple times (2016, 2018) with the support of representatives of the ISO member states, people with an information security background and profession.
The standard describes a norm by which organizations can organize themselves in order to properly manage information and information security. It sets out a series of policies and defines the setup of a management system to control and manage those policies. Today the standard is referred to by many compliance regulations and setups, such as the Network and Information Security Directive by the European Commission and the European Member States.
The ISO 27000 series has been divided into multiple sub-standards:
27001: the specification for an information security management system (ISMS), which replaced the old BS 7799-2 standard
27002: the 27000-series standard number of what was originally the ISO 17799 standard (itself formerly known as BS 7799-1)
27003: guidance for the implementation of an ISMS
27004: measurement and metrics
27005: information security risk management (methodology-independent)
27006: accreditation of organizations offering ISMS certification
27007: ISMS auditing
27032: cybersecurity
27033: network security
(* 27013: Manufacturing)
The objective of the 27001 standard itself is to "provide requirements for establishing, implementing, maintaining and continuously improving an Information Security Management System (ISMS)". Regarding its adoption, this should be a strategic decision. Further, "The design and implementation of an organization's information security management system is influenced by the organization's needs and objectives, security requirements, the organizational processes used and the size and structure of the organization". The original standard was more oriented towards planning and organizing, the later versions more towards measuring and evaluating. 27002 is about controls and control mechanisms.
Organizations can have their ISMSs or their ISO process certified, which can be needed for compliance reasons.
ISO/IEC 27019:2017 provides guidance based on ISO/IEC 27002:2013 applied to process control systems used by the energy utility industry for controlling and monitoring the production or generation, transmission, storage and distribution of electric power, gas, oil and heat, and for the control of associated supporting processes. Its scope includes:
central and distributed process control, monitoring and automation technology as well as information systems used for their operation, such as programming and parameterization devices;
digital controllers and automation components such as control and field devices or Programmable Logic Controllers (PLCs), including digital sensor and actuator elements;
all further supporting information systems used in the process control domain, e.g. for supplementary data visualization tasks and for controlling, monitoring, data archiving, historian logging, reporting and documentation purposes;
communication technology used in the process control domain, e.g. networks, telemetry, telecontrol applications and remote control technology;
Advanced Metering Infrastructure (AMI) components, e.g. smart meters;
measurement devices, e.g. for emission values;
digital protection and safety systems, e.g. protection relays, safety PLCs, emergency governor mechanisms;
energy management systems, e.g. of Distributed Energy Resources (DER) and electric charging infrastructures;
all software, firmware and applications installed on the above-mentioned systems, e.g. DMS (Distribution Management System) or OMS (Outage Management System) applications;
any premises housing the above-mentioned equipment and systems;
remote maintenance systems for above-mentioned systems.
ISO/IEC 27019:2017 does not apply to the process control domain of nuclear facilities; this domain is covered by IEC 62645. ISO/IEC 27019:2017 also includes a requirement to adapt the risk assessment and treatment processes described in ISO/IEC 27001:2013 to the energy utility industry sector-specific guidance provided in this document.
The ISA/IEC 62443 standard specifies security capabilities for (industrial) control system components. Developed by the ISA99 committee and adopted by the International Electrotechnical Commission (IEC), it provides a framework to address and mitigate security vulnerabilities in industrial automation and control systems (IACSs). It is based upon the input and knowledge of IACS security experts from across the globe, developing consensus standards applicable to all industry sectors and critical infrastructure. Central is the application of IACS security zones and conduits (isolation and segmentation), which were introduced in 62443-1-1.
ISA-62443-4-2, Security for Industrial Automation and Control Systems: Technical Security Requirements for IACS Components, provides the cybersecurity technical requirements for components that make up an IACS, specifically the embedded devices, network components, host components, and software applications.
Based on the IACS system security requirements of ISA/IEC 62443‑3-3, System Security Requirements and Security Levels, 4-2 specifies security capabilities that enable a component to mitigate threats for a given security level without the assistance of compensating countermeasures.
ISA/IEC 62443-4-1, Product Security Development Life-Cycle Requirements, specifies process requirements for the secure development of products used in an IACS and defines a secure development life cycle for developing and maintaining secure products. The life cycle includes security requirements definition, secure design, secure implementation (including coding guidelines), verification and validation, defect management, patch management, and product end of life.
ISA/IEC 62443-3-2, Security Risk Assessment, System Partitioning and Security Levels, is based on the understanding that IACS security is a matter of risk management. 3-2 defines a set of engineering measures to guide organizations through the process of assessing the risk of a particular IACS and of identifying and applying security countermeasures to reduce that risk to tolerable levels.
By aligning the identified target security level with the required security level capabilities, 3-3, System Security Requirements and Security Levels, takes the earlier 1-1 standard a step further. 2-3, Patch Management in the IACS Environment, addresses the installation of patches, also called software updates, software upgrades, firmware upgrades, service packs, hot fixes, basic input/output system updates, and other digital electronic program updates that resolve bugs and operability, reliability, and cybersecurity vulnerabilities. It covers many of the problems and industry concerns associated with IACS patch management for asset owners and IACS product suppliers, and it describes the effects poor patch management can have on the reliability and operability of an IACS.
These requirements establish a standard way of expressing the assurance requirements for Targets of Evaluation (TOEs). This part of ISO/IEC 15408 catalogues the set of assurance components, families and classes. It also defines evaluation criteria for Protection Profiles (PPs) and Security Targets (STs) and presents the predefined ISO/IEC 15408 scale for rating assurance for TOEs, the Evaluation Assurance Levels (EALs). The audience for this part of ISO/IEC 15408 includes consumers, developers, and evaluators of secure IT products. Developers, who respond to actual or perceived consumer security requirements in constructing a TOE, reference this part of ISO/IEC 15408 when interpreting statements of assurance requirements and determining assurance approaches of TOEs.
The CIS Controls™ are a prioritized set of actions that collectively form a defense-in-depth set of best practices that mitigate the most common attacks against systems and networks. The CIS Controls are developed by a community of IT experts who apply their first-hand experience as cyber defenders to create these globally accepted security best practices. The experts who develop the CIS Controls come from a wide range of sectors including retail, manufacturing, healthcare, education, government, defense, and others. So, while the CIS Controls address the general practices that most organizations should take to secure their systems, some operational environments may present unique requirements not addressed by the CIS Controls.
This document provides guidance on how to apply the security best practices found in the CIS Controls to ICS environments. For each top-level CIS Control, there is a brief discussion of how to interpret and apply the CIS Control in such environments, along with any unique considerations or differences from common IT environments. The applicability or not of specific Sub-Controls is addressed, and additional steps needed in ICS environments are explained.
This document provides guidance on how to secure Industrial Control Systems (ICS), including Supervisory Control and Data Acquisition (SCADA) systems, Distributed Control Systems (DCS), and other control system configurations such as Programmable Logic Controllers (PLC), while addressing their unique performance, reliability, and safety requirements. The document provides an overview of ICS and typical system topologies, identifies typical threats and vulnerabilities to these systems, and provides recommended security countermeasures to mitigate the associated risks.
ICS cybersecurity programs should always be part of broader ICS safety and reliability programs at both industrial sites and enterprise cybersecurity programs, because cybersecurity is essential to the safe and reliable operation of modern industrial processes. Threats to control systems can come from numerous sources, including hostile governments, terrorist groups, disgruntled employees, malicious intruders, complexities, accidents, and natural disasters as well as malicious or accidental actions by insiders. ICS security objectives typically follow the priority of availability and integrity, followed by confidentiality.
The Framework focuses on using business drivers to guide cybersecurity activities and considering cybersecurity risks as part of the organization’s risk management processes. The Framework consists of three parts: the Framework Core, the Implementation Tiers, and the Framework Profiles. The Framework Core is a set of cybersecurity activities, outcomes, and informative references that are common across sectors and critical infrastructure. Elements of the Core provide detailed guidance for developing individual organizational Profiles. Through use of Profiles, the Framework will help an organization to align and prioritize its cybersecurity activities with its business/mission requirements, risk tolerances, and resources. The Tiers provide a mechanism for organizations to view and understand the characteristics of their approach to managing cybersecurity risk, which will help in prioritizing and achieving cybersecurity objectives.
While this document was developed to improve cybersecurity risk management in critical infrastructure, the Framework can be used by organizations in any sector or community. The Framework enables organizations – regardless of size, degree of cybersecurity risk, or cybersecurity sophistication – to apply the principles and best practices of risk management to improving security and resilience.
The Framework provides a common organizing structure for multiple approaches to cybersecurity by assembling standards, guidelines, and practices that are working effectively today.
This document provides Cybersecurity Framework (CSF) Version 1.1 implementation details developed for the manufacturing environment. The "Manufacturing Profile" of the CSF can be used as a roadmap for reducing cybersecurity risk for manufacturers that is aligned with manufacturing sector goals and industry best practices. This Manufacturing Profile provides a voluntary, risk-based approach for managing cybersecurity activities and reducing cyber risk to manufacturing systems. The Manufacturing Profile is meant to enhance but not replace current cybersecurity standards and industry guidelines that the manufacturer is embracing.
Internet of Things (IoT) devices often lack device cybersecurity capabilities that their customers (organizations and individuals) can use to help mitigate their cybersecurity risks. Manufacturers can help their customers by making the IoT devices they produce more securable, by providing necessary cybersecurity functionality, and by providing customers with the cybersecurity-related information they need. This publication describes recommended activities related to cybersecurity that manufacturers should consider performing before their IoT devices are sold to customers. These foundational cybersecurity activities can help manufacturers lessen the cybersecurity-related efforts needed by customers, which in turn can reduce the prevalence and severity of IoT device compromises and the attacks performed using compromised devices.
While oriented primarily towards consumer devices, ETSI EN 303 645, a standard for cybersecurity in the Internet of Things, is relevant for manufacturing considerations. The standard establishes a security baseline for internet-connected consumer products and provides a basis for future IoT certification schemes. Based on the ETSI specification TS 103 645, EN 303 645 went through National Standards Organization comments and voting, engaging even more stakeholders in its development and ultimately strengthening the resulting standard. The EN is a result of collaboration and expertise from industry, academia and government.
ETSI EN 303 645 specifies 13 provisions for the security of Internet-connected consumer devices and their associated services. IoT products in scope include connected children’s toys and baby monitors, connected safety-relevant products such as smoke detectors and door locks, smart cameras, TVs and speakers, wearable health trackers, connected home automation and alarm systems, connected appliances (e.g. washing machines, fridges) and smart home assistants. The EN also includes 5 specific data protection provisions for consumer IoT.
Development of standards for cybersecurity and data protection covering all aspects of the evolving information society, including but not limited to:
- Management systems, frameworks, methodologies
- Data protection and privacy
- Services and products evaluation standards suitable for security assessment for large companies and small and medium enterprises (SMEs)
- Competence requirements for cybersecurity and data protection
- Security requirements, services, techniques and guidelines for ICT systems, services, networks and devices, including smart objects and distributed computing devices
Included in the scope is the identification and possible adoption of documents already published or under development by ISO/IEC JTC 1 and other SDOs and international bodies such as ISO, IEC, ITU-T, and industrial fora. Where not being developed by other SDOs, the scope also covers the development of cybersecurity and data protection CEN/CENELEC publications for safeguarding information, such as organizational frameworks, management systems, techniques, guidelines, and products and services, including those in support of the EU Digital Single Market.
Its scope is to contribute to, support and coordinate the preparation of international standards for systems and elements used for industrial process measurement, control and automation at CENELEC level, and to coordinate standardisation activities which affect the integration of components and functions into such systems, including safety and security aspects. This CENELEC standardisation work is carried out for equipment and systems in close coordination with IEC TC65 and its subcommittees, with the objective of avoiding any duplication of work while honouring standing agreements between CENELEC and IEC.
The Smart Applications REFerence (SAREF) ontology is a shared model of consensus that facilitates the matching of existing assets in the smart applications domain. SAREF provides building blocks that allow separation and recombination of different parts of the ontology depending on specific needs.
SAREF4INMA is an extension of SAREF created for the industry and manufacturing domain. SAREF4INMA was designed to align with related initiatives in the smart industry and manufacturing domain in terms of modelling and standardization, such as the Reference Architecture Model for Industry 4.0 (RAMI), which combines several standards used by the various national initiatives in Europe that support digitalization in manufacturing.
These initiatives include, but are not limited to, the Plattform Industrie 4.0 in Germany, the Smart Industry initiative in the Netherlands, Industria 4.0 in Italy, the Industrie du Futur initiative in France, and more.
It extends SAREF with 24 classes (in addition to a number of classes directly reused from the SAREF ontology and the SAREF4BLDG extension), 20 object properties (in addition to a number of object properties reused from the SAREF ontology and the SAREF4BLDG extension) and 11 data type properties. SAREF4INMA focuses on extending SAREF for the industry and manufacturing domain to solve the lack of interoperability between various types of production equipment that produce items in a factory and, once outside the factory, between different organizations in the value chain to uniquely track back the produced items to the corresponding production equipment, batches, material and precise time in which they were manufactured.
The Industrial Internet Security Framework (IISF) is a cross-industry-focused security framework comprising expert vision, experience and security best practices. It reflects thousands of hours of knowledge and experiences from security experts, collected, researched and evaluated for the benefit of all IIoT system deployments.
It builds on the ‘Industrial Internet of Things Reference Architecture’ (IIRA), that lays out the most important architecture components, how they fit together and how they influence each other. Each of these components must be made secure, as must the key system characteristics that bind them together into a trustworthy system.
It reviews security assessment for organizations, architectures and technologies. It outlines how to evaluate attacks as part of a risk analysis and highlights the many factors that should be considered, ranging from the endpoints and communications to management systems and the supply chains of the elements comprising the system. Different roles are identified that should be considered in conjunction with the key characteristics, including, owner/operator, system integrator/builder and equipment vendor. Each role offers different risk management perspectives that affect the decisions regarding security and privacy.
This is an industry-accepted way to document what security controls exist in IaaS, PaaS, and SaaS services, providing security control transparency. It provides a set of yes/no questions a cloud consumer and cloud auditor may wish to ask of a cloud provider to ascertain their compliance with the Cloud Controls Matrix (CCM).
It helps cloud service providers and their customers to gauge the security posture and determine if their cloud services are suitably secure. In addition to improving the clarity and accuracy, it also supports better auditability of the CCM controls.
The overall concept is the use of the Asset Administration Shell (AAS). A subject requests access to an object; in the context of an AAS, an object is typically a submodel, a property, or any other submodel element connected to the asset. The implemented access control mechanism of the AAS evaluates the access permission rules (2a), which include constraints that need to be fulfilled with respect to the subject attributes (2b), the object attributes, and the environment conditions (2d). The focus is on access control. An object in the context of ABAC typically corresponds to a submodel or to a submodel element; the object attributes are themselves modelled as submodel elements. Subject attributes need to be accessed either via an external policy information point, or they are defined as properties within a special submodel of the AAS. A typical subject attribute is the subject's role; in role-based access control, the role is the only subject attribute defined and no environment conditions are used. Optionally, environment conditions can be defined and expressed via formula constraints; to make this possible, the values needed should be defined as properties of, or references to data within, a submodel of the AAS.
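As a hedged illustration of the rule evaluation described above, an ABAC check over subject attributes, object identity and environment conditions could look like the following sketch. The `Rule` class and `is_access_granted` function are invented names for illustration, not part of the AAS specification.

```python
from dataclasses import dataclass, field

@dataclass
class Rule:
    object_id: str                    # submodel / submodel element the rule protects
    required_subject: dict            # subject attributes that must match (e.g. role)
    environment: dict = field(default_factory=dict)  # optional environment conditions

def is_access_granted(rules, object_id, subject_attrs, env=None):
    """Grant access if any rule for the object is satisfied by the
    subject attributes and the current environment conditions."""
    env = env or {}
    for rule in rules:
        if rule.object_id != object_id:
            continue
        subject_ok = all(subject_attrs.get(k) == v
                         for k, v in rule.required_subject.items())
        env_ok = all(env.get(k) == v for k, v in rule.environment.items())
        if subject_ok and env_ok:
            return True
    return False

rules = [
    # Role-based rule: only the "maintenance" role may read this element.
    Rule(object_id="submodel/TechnicalData/SerialNumber",
         required_subject={"role": "maintenance"}),
]

print(is_access_granted(rules, "submodel/TechnicalData/SerialNumber",
                        {"role": "maintenance"}))   # True: rule satisfied
print(is_access_granted(rules, "submodel/TechnicalData/SerialNumber",
                        {"role": "operator"}))      # False: no matching rule
```

In pure role-based access control, as noted above, `required_subject` would contain only the role and `environment` would stay empty.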
NAMUR, the "User Association of Automation Technology in Process Industries", is an international association of user companies (established in 1949) and represents their interests concerning automation technology. NAMUR numbers over 150 member companies. The achievement of added value through automation engineering is at the forefront in all NAMUR member company activities. NAMUR conducts a frank and fair dialogue with manufacturers.
NAMUR’s Automation Security working group 4.18 addresses issues including the following topics in the context of its experience exchange, its concept developments, formulation of requirements to be met by automation solutions and its involvement in national and international standardisation.
Relevant recommendations and worksheets
NA 163 Security Risk Assessment of SIS (Safety Instrumented Systems)
NA 169 Automation Security Management in the Process Industry. NA 169 describes the steps to systematically build a Cyber Security Management System (CSMS) for automation systems in the process industry in order to ensure the correct operation of the functional safety devices, to protect critical data and to ensure the availability and reliability of the plants.
The international standard IEC 61499, addressing the topic of function blocks for industrial process measurement and control systems, was initially published in 2005. The specification of IEC 61499 defines a generic model for distributed control systems and is based on the IEC 61131 standard. (see https://en.wikipedia.org/wiki/IEC_61499 and IEC 61499 - International Electrotechnical Commission)
MQTT (MQ Telemetry Transport) is an open OASIS and ISO standard (ISO/IEC PRF 20922) lightweight, publish-subscribe network protocol that transports messages between devices. (From https://en.wikipedia.org/wiki/MQTT)
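MQTT's publish-subscribe routing relies on hierarchical topic names matched against subscription filters with two standard wildcards: `+` matches exactly one level, `#` matches all remaining levels. A minimal sketch of that matching logic, independent of any broker implementation (the topic names below are invented examples):

```python
def topic_matches(filter_, topic):
    """MQTT topic-filter matching with the standard wildcards:
    '+' matches exactly one level, '#' matches all remaining levels."""
    flevels = filter_.split("/")
    tlevels = topic.split("/")
    for i, f in enumerate(flevels):
        if f == "#":
            return True                 # multi-level wildcard matches the rest
        if i >= len(tlevels):
            return False                # filter is longer than the topic
        if f != "+" and f != tlevels[i]:
            return False                # literal level mismatch
    return len(flevels) == len(tlevels)

print(topic_matches("factory/+/temperature", "factory/line1/temperature"))  # True
print(topic_matches("factory/#", "factory/line1/motor/rpm"))                # True
print(topic_matches("factory/+/temperature", "factory/line1/pressure"))     # False
```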
Universal Business Language (UBL) is an open library of standard electronic XML business documents for procurement and transportation such as purchase orders, invoices, transport logistics and waybills. UBL was developed by an OASIS Technical Committee with participation from a variety of industry data standards organizations. Version 2.1 was approved as an OASIS Standard in November 2013 and an ISO Standard (ISO/IEC 19845:2015) in December 2015
SensorThings API is an Open Geospatial Consortium (OGC) standard providing an open and unified framework to interconnect IoT sensing devices, data, and applications over the Web. It is an open standard addressing the syntactic interoperability and semantic interoperability of the Internet of Things. It complements the existing IoT networking protocols such CoAP, MQTT, HTTP, 6LowPAN. While the above-mentioned IoT networking protocols are addressing the ability for different IoT systems to exchange information, OGC SensorThings API is addressing the ability for different IoT systems to use and understand the exchanged information. As an OGC standard, SensorThings API also allows easy integration into existing Spatial Data Infrastructures or Geographic Information Systems.
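As an illustrative sketch of the semantic interoperability the standard targets, a SensorThings Observation entity links a sensed result and a timestamp to an existing Datastream. The ids and values below are invented examples, not taken from a real deployment:

```python
import json

# An Observation in the OGC SensorThings data model: the result of sensing,
# the time of the phenomenon, and a link to the Datastream it belongs to.
observation = {
    "phenomenonTime": "2024-01-15T10:30:00Z",  # when the value was observed
    "result": 21.5,                            # the sensed value
    "Datastream": {"@iot.id": 1},              # link to an existing Datastream
}

# Such a payload would be POSTed to a service endpoint such as
# <service-root>/v1.1/Observations (endpoint path is an assumption here).
body = json.dumps(observation)
print(body)
```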
The W3C Web Ontology Language (OWL) is a Semantic Web language designed to represent rich and complex knowledge about things, groups of things, and relations between things. (from https://www.w3.org/OWL/)
In general, compliance means conforming to a rule, such as a specification, policy, standard or law. Regulatory compliance describes the goal that organizations aspire to achieve in their efforts to ensure that they are aware of and take steps to comply with relevant laws, policies, and regulations. (from https://en.wikipedia.org/wiki/Regulatory_compliance)
Here the term “business models” is used in a wide sense, complementing the technological and organisation aspects of digital platforms.
One proven tool for analysing and shaping business models is the "Business Model Canvas". When trying to apply this tool to platforms, it appears that some elements (e.g. the "value proposition") do apply to platform-based business models and that tools such as the canvas can provide a first inspiration.
However, for digital platforms the traditional business model view in the narrow sense falls short of describing the business and relationship aspects of platforms. In particular, the strict "partner" and "customer" view has to be replaced by an ecosystem perspective. In addition, this ecosystem can be highly dynamic, which means that platforms can move into new user groups, change their features and exhibit the network effects typical of platforms. Another difference is the central role of data for platforms, meaning that data governance is one of the essential elements of the value proposition of platforms.
By bringing together actors from different sides, platforms are by definition shaped by their stakeholders. There are core stakeholders (target customers, core suppliers, value chain partners), but it should not be forgotten that there are also actors with an indirect or external interest in the activities on the platform (competitors, existing customers not addressed through the platform). A platform also defines the relationship with, and the channels to, the different user groups.
In order to be sustainable, the value proposition must be mirrored by a revenue stream, which is orchestrated by the platform. These revenue streams can be direct (pay-per-use, subscription, sales, etc.), but can also be indirect (increased product prices, increased market share).
Platform as a Service (PaaS), or application Platform as a Service (aPaaS), or platform-based service, is a category of cloud computing services that provides a platform allowing customers to develop, run, and manage applications without the complexity of building and maintaining the infrastructure typically associated with developing and launching an app. (from https://en.wikipedia.org/wiki/Platform_as_a_service)
Software as a service (SaaS /sæs/) is a software licensing and delivery model in which software is licensed on a subscription basis and is centrally hosted. It is sometimes referred to as "on-demand software". (from https://en.wikipedia.org/wiki/Software_as_a_service)
Pay-per-use or pay-per-duration-of-use implies that users are charged pro-rata of how much they used the service (in terms of consumed resources, computing power,... or in terms of the duration of the use of the service)
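A minimal sketch of such pro-rata charging follows; the metric names and unit rates are invented examples, not taken from any particular platform:

```python
# Price per unit for each usage metric (illustrative values).
RATES = {"cpu_hours": 0.05, "gb_stored": 0.02, "api_calls": 0.0001}

def invoice(usage):
    """Charge pro-rata for consumed resources: amount = sum(units * rate)."""
    return round(sum(units * RATES[metric] for metric, units in usage.items()), 2)

# A month of usage: 100 CPU-hours, 50 GB stored, 10,000 API calls.
print(invoice({"cpu_hours": 100, "gb_stored": 50, "api_calls": 10000}))  # 7.0
```

A pay-per-duration variant would simply meter elapsed time as the consumed unit instead of resources.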
At the core of all potential industrial use case scenarios of platforms are data. When formerly isolated data are shared, suddenly a new set of factors arises, both in terms of new external factors, but also in terms of business/microeconomic implications. Therefore, at the core of every digital platform must be a legally, organizationally and commercially viable concept for data sharing/trading/exchange.
When shaping this model, the following questions must be answered:
What is the legal arrangement for data "ownership"? Can users classify their data, and is a staggered approach possible (closed, traded or open data)? What legal means does the platform use to ensure the confidentiality of data (trade secrets, the Database Directive)?
Transparency: Can users monitor/control the sharing of data with third parties? Are there “expiration dates” for data use?
Is the legal setting a fixed standard ("general conditions") or a flexible, individual approach? Are model contracts available?
Are there sectorial regulatory requirements concerning data?
How far are portability and a change of platform possible?
Who is responsible in the case of breaches of confidentiality?
How is fairness/a level playing field between the platform and smaller players ensured?
Digital platforms will be successful if they provide a clear value proposition to the user groups involved. In general, digital platforms offer added value based upon three main mechanisms:
Reduction of transaction costs
Use of data integration for new services (mainly optimisation) and business models
Based upon these mechanisms, added value can be created from a variety of perspectives, such as the process perspective (what process or activity is optimised?) or the KPI perspective (what KPI is the focus of the optimisation?). This added value enables the financing of the digital processes through e.g. increased price margins, market shares or reduced costs.
Proprietary software is non-free computer software for which the software's publisher or another person retains intellectual property rights—usually copyright of the source code, but sometimes patent rights. (from https://en.wikipedia.org/wiki/Proprietary_software)
Open-source software (OSS) is a type of computer software whose source code is released under a license in which the copyright holder grants users the rights to study, change, and distribute the software to anyone and for any purpose. Open-source software may be developed in a collaborative public manner. According to scientists who studied it, open-source software is a prominent example of open collaboration. (from https://en.wikipedia.org/wiki/Open-source_software)
In the same way that software can be developed and commercialized using different business models according to the software ownership, digital platforms can be developed and commercialized using different business models according to the infrastructure ownership. This chapter identifies different forms of infrastructure ownership and their associated business models (such as renting or pay-per-use).
Physical and logical passwords should be considered from the overall taxonomy and as part of one of the Digital Pathways, as Physical and Logical Access provisioning. Physical passwords here are types of authentication technologies and can be voice commands, fingerprints, or simple presence (by means of an electronic token that an operator carries). Logical passwords here are PIN codes, passphrases, or even certificates or hash keys that support the specific levels of security. Both serve as access control mechanisms for security in this pathway.
Access control is a key component of security and cybersecurity for any system, be it a physical (gates, doors, equipment, ...) or logical (application, service, activity, ...) one.
Under this heading, the purpose is to clarify that access control should be mandatory for every system operated in a manufacturing environment. Access control levels can be very low, for example by granting everybody on the factory floor access to an application; the implicit assumption is then that only people on the factory floor can reach the system, a physical constraint that can be taken into account. From a risk perspective, however, this means that unaccompanied visitors or subcontractors without oversight could also get access to this system.
By considering access control as a fundamental security mechanism, based upon a risk approach, controls can be further built in, relating back to the types of users or moments of intervention. Least-access principles should be applied, so that access is only granted after deliberate consideration. For instance, a system can have a regular user (an operator), a floor manager or head of production (who can override a decision from an operator), a service engineer (maintenance) and an administrator.
These roles should allow different levels of access to the systems and can be related to the specific risks associated with them, and to the overall risk consideration. Physical passwords can be built into the application as additional means to identify the specific roles.
As an example, to enhance the security of an application in a manufacturing environment from Level 1 to Level 3, administrator access would be needed to operate a specific machine or function, instead of simply pushing a button to power up a specific machine. This can range from the trivial, such as a sawing machine that can only be used by an operator qualified to use it, up to ensuring oversight when a maintenance engineer updates the machine via a USB token so that no additional malware is left on the machine.
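The role hierarchy described above can be sketched as a simple permission table. The role and action names here are illustrative assumptions, not prescriptions:

```python
# Least-access principle: each role is granted only the actions it needs.
PERMISSIONS = {
    "operator":         {"start_machine", "stop_machine"},
    "floor_manager":    {"start_machine", "stop_machine", "override_decision"},
    "service_engineer": {"start_machine", "stop_machine", "run_diagnostics",
                         "install_update"},
    "administrator":    {"start_machine", "stop_machine", "override_decision",
                         "run_diagnostics", "install_update", "manage_users"},
}

def allowed(role, action):
    """Return True only if the action is explicitly granted to the role."""
    return action in PERMISSIONS.get(role, set())

print(allowed("operator", "start_machine"))   # True
print(allowed("operator", "install_update"))  # False: maintenance tasks excluded
```

In practice the role would be established through the physical or logical authentication means discussed above, rather than passed in as a plain string.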
Malware is a broad term describing a computer program (software) that was intentionally developed to cause damage to a computer system, mainly with the intention of financial gain - but increasingly also to cause business interruptions, to hold systems hostage, or simply to steal information.
Malware has existed for over two decades, written specifically to exploit vulnerabilities in computer systems for personal gain. Using it to break into someone else's system is a form of cybercrime. In most countries in the world, it is not a crime to develop malware - only to use it against someone else.
Malware exists in many different forms. What used to be viruses, generally sent via email in the past, have transformed into specifically engineered pieces of software for specific purposes - the most infamous one to date being Stuxnet. Security software and firewalls have been equipped to detect viruses and quarantine them before they even reach the destination mailbox. But through phishing attacks (emails with a malicious hyperlink - URL) or man-in-the-middle attacks (websites that have been compromised and redirect traffic), users are still being exposed to malware.
Malware can also enter by means of USB sticks or pieces of software that don't belong on an industrial control system or manufacturing system (games, apps, ...), which can sometimes contain malware or pieces of it.
Ransomware is a form of malware that typically starts encrypting data once it has been activated. To decrypt the data, a ransom has to be paid. Ransomware can be avoided by 1) frequently updating the underlying software to avoid exploitation of vulnerabilities, 2) isolating the industrial systems from office and other types of systems, and 3) restricting access to the systems by means of physical and logical limitations.
APTs (Advanced Persistent Threats) are usually a combination of multiple attacks and threats aimed at a specific target. APTs combine the detection of vulnerabilities with the deployment of malware and ransomware, and are typically coordinated by nation-state actors or organized crime.
Digital Manufacturing Platforms should be concerned about the abuse of their platforms by malicious users, and should prevent by all means available man-in-the-middle attacks or similar attacks in which redirects from the platform end up in the download of malware. By running Digital Manufacturing Platforms in the cloud, additional security measures can be put in place that specifically monitor the activities of specific containers for unexpected calls or actions. Manufacturing companies should furthermore pay attention to the continuous protection of endpoint devices and the active monitoring of network traffic, on top of the detection of malicious activities.
Monitoring and logging is the process of recording activities happening on an IT system, including OT systems operated via IT. It typically occurs at the network level, where packets sent over TCP/IP (the internet protocol suite) are captured at the edges, in the controlling entities (routers and gateways), or by an in-line device (such as a firewall, IDS or IPS). Network traffic records typically include the origin and destination IP address, the type of application, and the contents; some of the traffic may be encrypted.
Network traffic can be captured via a monitoring port on network devices, resulting in the recording of all events that have been configured to be logged. During the monitoring phase, this near-real-time data can be evaluated and analyzed. On the basis of the traffic, patterns can be detected that allow an understanding of how an application (such as ransomware) arrives inside the organization, or of how confidential data might leave the organization.
Logging also happens at the device level, making it possible to identify the activities taking place on the device (the types of applications being used and the identities of the people accessing it). This allows a user to be linked to a certain transaction, and improves the detection of data manipulation or data theft. With machine learning techniques, some behavioural actions on a network can be detected before the malicious act of theft or abuse takes place: on the basis of pattern recognition, actions and events typically used by criminals can be detected, indicating that an incident is taking place.
By utilising similar data from the outside, incidents happening in other locations, factories and companies can be recorded, and similar patterns (signatures) can be shared amongst trusted partners. This allows preventative rules to be loaded into intrusion prevention systems, which can then block IP addresses, users and applications.
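A minimal sketch of matching such shared signatures against log events; the signature format, field names and values here are invented examples, far simpler than real IDS rule languages:

```python
# Signatures shared among trusted partners: each names a field and the
# value that indicates known-malicious activity (illustrative values).
SIGNATURES = [
    {"name": "known-bad-ip",       "field": "src_ip",   "value": "203.0.113.42"},
    {"name": "ransomware-beacon",  "field": "dst_port", "value": 4444},
]

def match_signatures(event):
    """Return the names of all shared signatures that the event matches."""
    return [s["name"] for s in SIGNATURES if event.get(s["field"]) == s["value"]]

# A captured network event: origin, destination port, application type.
event = {"src_ip": "203.0.113.42", "dst_port": 443, "app": "https"}
print(match_signatures(event))  # ['known-bad-ip']
```

An intrusion prevention system would act on such matches, e.g. by dropping the connection or blocking the source address.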
Finally, monitoring and logging are important for forensics. Once an incident has happened, the recorded sessions make it possible to understand what exactly happened, to collect evidence, and to inform future preventive actions.
In Digital Manufacturing Platforms a logging facility should be enabled that records the manipulations and transactions that have happened inside the platform itself, as well as the access and identity of the persons who have been controlling the platform.
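Such a platform-side audit trail can be as simple as structured log entries recording who did what, to which resource, and when. The sketch below emits JSON lines; the field names and user/resource identifiers are illustrative, not a platform standard.

```python
import datetime
import json

def audit_record(user, action, resource):
    """Build one JSON-lines audit entry: identity, transaction, timestamp."""
    return json.dumps({
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user,        # authenticated identity performing the action
        "action": action,    # the manipulation or transaction
        "resource": resource # the platform object acted upon
    })

# Illustrative usage: one line per transaction, appended to an audit file.
line = audit_record("operator-17", "update_recipe", "line-3/mixer")
entry = json.loads(line)  # machine-readable, so forensics can query it later
```

Keeping entries machine-readable (one JSON object per line) is what makes the forensic use described above practical: logs can be filtered by user, action or resource after an incident.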
Penetration Testing (pentesting) is a term used by cybersecurity practitioners to describe the process of diligently assessing potential vulnerabilities in the information security infrastructure, including, in the case of manufacturing and industrial environments, the operational technology infrastructure. It typically uses a series of tools to automate the process, complemented by expert experience with known tricks and vulnerabilities. The goal of the pentester is to detect and report the weaknesses, not to exploit them. Pentesting is also referred to as ethical hacking, in the sense of not intentionally manipulating equipment or data, stealing data, or leaving exploitable software behind.
Pentesting is the ultimate means to demonstrate the capabilities of the security infrastructure, as well as the way to identify its shortcomings upfront. A pentesting report allows security managers to support their activities by indicating risks, threats and vulnerabilities, and by indicating the need for a risk management process. Companies with a higher level of maturity will organize a systematic approach, with pentesting taking place periodically or after specific changes to the infrastructure. This can also take the form of contests, for instance red teams (the attackers) playing against the defenders (the blue team), both utilizing their pentesting experience.
With Responsible Disclosure, organizations and individuals can call upon the community of ethical hackers (white hats) to help identify vulnerabilities, which are then reported, sometimes in return for a small bounty. Large hacking contests can be organized to test complete platforms. When vulnerabilities are found in technologies, including platforms which are being sold, they are published as CVEs after a grace period of about 3 to 6 months following the report.
For Digital Manufacturing Platforms, pentesting should also target the platform itself, through software testing and by testing the platform in its operational environment, as its use of web and internet technologies makes it susceptible to exploitation.
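Many pentesting tools build on simple primitives such as TCP port scanning. The sketch below is a minimal illustration of that primitive, not a pentesting tool: it checks which of a list of ports accept a connection, and it demos against a local listener only, since probing systems without permission is exactly what ethical hacking rules out.

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` that accept a TCP connection on `host`.
    connect_ex returns 0 on success instead of raising an exception."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Demo against our own local listener (port 0 lets the OS pick a free port).
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
listening_port = server.getsockname()[1]
found = scan_ports("127.0.0.1", [listening_port])
server.close()
```

Production scanners (nmap and the like) add service fingerprinting, timing control and evasion techniques on top of this basic connect check.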
Transmission data protection describes the security applied to the communication of data.
This can be Transport Layer Security (TLS) when two or more systems communicate directly with each other over the internet, securing the communication itself by means of encryption and decryption at either end.
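In Python, for example, TLS is applied by wrapping an ordinary TCP socket with a context from the `ssl` module; the default context already enforces certificate verification and hostname checking. The server name in the comment below is illustrative.

```python
import ssl

# The default client context enforces certificate verification and
# hostname checking out of the box.
ctx = ssl.create_default_context()

# A client would then wrap an ordinary TCP socket; everything sent through
# the wrapped socket is encrypted in transit (server name is illustrative):
#   with ctx.wrap_socket(sock, server_hostname="example.com") as tls_sock:
#       tls_sock.sendall(b"order data")
```

Sticking with the default context, rather than hand-tuning protocol versions and ciphers, is the usual recommendation for application code.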
Other means include VPN technologies, which typically establish an encryption layer between devices and applications running VPN-type services.
Public operators such as Internet Service Providers, Mobile Operators, ... in most cases use encryption technologies to protect data transmission over the public network when providing specific business-to-business services. In 3G, 4G and the upcoming 5G mobile data services, transmission data protection has been enabled.
However, operators and platform providers should assure themselves of which transmission data protection has been provided, or should require a security baseline for it. Additionally, digital platforms can start providing transmission security as part of the platform. This will be especially necessary when edge devices transmit data and cloud platforms receive it.
Transmission data protection should also be considered for machines and equipment on site or nearby. Many robot instructions and commands, for instance, are transmitted in clear text. Many technologies exist today to prevent this, even at high transmission speeds.
The transmission data itself should also be protected and prevented from leaking. The transmission data can also be used as a control protocol, checking the transmission for arrival and allowing it to be audited.
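One common way to let the receiver check a transmission for intact arrival is to attach a keyed hash (HMAC) to each payload, which also supports auditing. The sketch below is illustrative: the shared key and the robot-style command are made-up examples, and a real deployment would provision keys per device.

```python
import hashlib
import hmac

SHARED_KEY = b"demo-key"  # illustrative; provisioned securely in practice

def tag(payload: bytes) -> str:
    """HMAC-SHA256 tag the receiver recomputes to verify that the payload
    arrived intact and was produced by a holder of the shared key."""
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, received_tag: str) -> bool:
    # compare_digest avoids leaking information through timing differences
    return hmac.compare_digest(tag(payload), received_tag)

command = b"MOVE J1 45.0"      # e.g. a robot instruction sent over the wire
t = tag(command)               # sender attaches this tag to the message
```

Note that an HMAC provides integrity and authenticity, not confidentiality; a clear-text command remains readable and would still need encryption (e.g. TLS) to stay secret.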
Following a risk analysis, and upon the choice of a risk framework and the definition of security policies, a password policy can be derived.
The password policy is to be set up by organizations: both end-user organizations such as manufacturers, and digital platform and system providers.
Password policies should at least include:
- strong passwords or passphrases
- regular password updates by users
- advice to use multi-factor authentication (an additional authentication device)
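Such a policy can be enforced programmatically at password-change time. The checker below is a minimal sketch; the specific rules and the 12-character minimum are illustrative assumptions, not a standard.

```python
import re

def check_password(pw: str, min_length: int = 12) -> list:
    """Return a list of policy violations; an empty list means the
    password passes. Rules are an illustrative baseline."""
    problems = []
    if len(pw) < min_length:
        problems.append(f"shorter than {min_length} characters")
    if not re.search(r"[a-z]", pw):
        problems.append("no lowercase letter")
    if not re.search(r"[A-Z]", pw):
        problems.append("no uppercase letter")
    if not re.search(r"\d", pw):
        problems.append("no digit")
    return problems

# A long passphrase with mixed classes passes; a short word does not.
ok = check_password("Correct-Horse-7x")      # -> []
bad = check_password("short")                # -> several violations
```

Returning the full list of violations, rather than a bare pass/fail, lets the platform give users actionable feedback.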
Digital Platform providers should provide a mechanism for single sign-on or federated authentication, so that passwords are not stored in the platform itself; instead, tokens from third-party identity providers are accepted.
Physical Security refers to physical access control: borders, gates, identity verification, passport control, manned guard services, video surveillance, biometrics and related components. Physical security also considers physical attacks such as terrorist and criminal attacks, and fire and water hazards.
Multi-factor authentication describes the use of more than one token as proof of identity. As an example, when a user logs on to a digital platform, the basic means of authentication are a user name and password.
In addition to the password (single-factor authentication), the user can be asked for a physical token (RFID key, ID card, ...). This can also be a mobile phone, an authenticator app token, a SecurID or Digipass token, or a biometric element (fingerprint, facial recognition, ...).
In security terminology this relates to the concept of assuring someone's identity by something the user knows (a password) and something he/she has (a physical token). Additional layers can be built on this concept in order to further improve and strengthen the security levels.
When someone's identity is proven at the front gate on the basis of an ID card, driver's license or verifiable photo ID, this can be enhanced with a log entry recording that the person has reached the premises. With a personal RFID token, he will be able to access his office. Meanwhile, video surveillance cameras might have identified him in the building. Finally, when logging on to his system on the network, he can be asked for an authentication code from his company mobile phone.
These additional levels of authentication harden the security and can be continuously expanded, depending on the security levels required.
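The authenticator-app tokens mentioned above are typically time-based one-time passwords (TOTP, RFC 6238), built on HMAC-based one-time passwords (HOTP, RFC 4226). A minimal standard-library sketch, using the RFC's published test secret:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over the counter, dynamically truncated."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                         # low nibble picks the slice
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def totp(secret: bytes, at=None, step: int = 30) -> str:
    """RFC 6238 TOTP: HOTP with the counter derived from the clock,
    so codes change every `step` seconds."""
    t = time.time() if at is None else at
    return hotp(secret, int(t // step))
```

Both the server and the user's phone hold the same secret; the server accepts a login only if the submitted code matches the one it computes for the current time window. The values below are the RFC test vectors (counter 0 yields 755224; time 59 s falls in window 1, yielding 287082).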
Security training and awareness entails awareness creation, security information sessions and materials, education, educational programs, certification of people, and all related formats and programs designed to inform and support people in understanding cybersecurity.
Training & education
Security training programs need to be an integrated part of a security strategy and policy. Next to the definition of risks and the design of security policies describing how people should or should not get access to specific environments, the people operating these environments should be instructed properly.
Security training and education can be system- and operation-specific, but also needs to accompany the company- and plant-specific security guidelines.
Training and education should be a continuous activity, including repetition of elements of importance and strategic relevance.
Security education programs should be adapted to specific departments, or groups of people, depending on their levels of maturity, systems access and responsibilities.
Security education can consist of programs outside the organization, at dedicated educational institutions (private providers, high schools, universities, ...), or within the organization itself. Some companies organize a one-day educational course on cybersecurity, while others provide access to online courses.
These educational programs can be followed by assessments, and can lead to the provision of certificates of attendance or qualification.
Programs related to cybersecurity include CISSP (Certified Information Systems Security Professional), CISM (Certified Information Security Manager) and CISA (Certified Information Systems Auditor).
Other cybersecurity educational programs relate to specific components in the cybersecurity architecture, such as firewall, monitoring, or identity & access expert roles.
Organizations can provide educational programs from within (own developments or licensed from educational organizations), or can develop a cybersecurity program dedicated to a specific application or service they have developed.
Cybersecurity awareness programs are more informative than educational programs: typically less attention-demanding and less lengthy, aimed at a specific set of rules, or oriented towards a specific behavior rather than knowledge transfer.
The awareness program can show that the company is concerned about cybersecurity and draw employees' attention to how to handle incoming emails, how to watch out for suspicious behavior, how to detect that something is suspicious, and what NOT to do with it. It can illustrate the impact by means of a short movie, without going into detail on the whole architecture behind it.
Cyber incident response capability refers to the means of an organization to cope with a cyber incident. It is usually organized in a dedicated CSIRT (Computer Security Incident Response Team) or CERT (Computer Emergency Response Team) that has developed a procedure for dealing with incidents (leakages, break-ins, attacks, ...) detected in the organization, taking the necessary measures to mitigate, prevent and respond. This dedicated team should be empowered to take control, in order to prevent additional loss and to fight an attack as it happens. That means the team is required to have a good understanding of the infrastructure and the necessary means to deflect attacks, increase security, limit access and collect forensic evidence during an incident. The team should be in direct contact and interaction with the crisis management team. During normal operations it will support the organization's Security Operations Center (SOC) team, onsite or remote, in coping with day-to-day alarms, investigating their threat levels and managing the investigation of minor incidents.