
ISO 42001:2023 - A.6.2.2 AI System Requirements and Specification

This article provides guidance on implementing control A.6.2.2, AI System Requirements and Specification, of ISO 42001:2023.

ISO 42001 Control Description

The organisation shall establish and document requirements for the AI system, including functional requirements, performance criteria, and constraints arising from the intended use context, applicable regulations, and organisational policies.

Control Objective

To ensure that AI systems are developed or procured against a clearly defined, comprehensive set of requirements that reflect the intended purpose, applicable risk profile, and obligations of the organisation, thereby establishing a sound basis for all subsequent design, development, and evaluation activities.

Purpose

AI system requirements and specifications serve as the authoritative reference against which design decisions are made, development activities are guided, and conformance is assessed. Without a well-formed requirements specification, AI systems risk being built or procured in ways that diverge from organisational intent, fail to address applicable legal obligations, or inadequately account for the needs of affected stakeholders.

This control recognises that AI systems present unique requirements challenges compared to conventional software. Factors such as the stochastic nature of model outputs, dependency on training data quality, potential for emergent behaviours, and the need to specify acceptable performance across diverse population groups all demand that requirements elicitation and documentation be approached with particular rigour and deliberateness.

Establishing requirements before development or procurement commences reduces the likelihood of costly rework, supports the identification of foreseeable risks at an early stage, and creates a foundation for meaningful verification and validation activities throughout the AI system lifecycle.

Guidance on Implementation

Defining Functional Requirements

The organisation shall document the functional requirements for the AI system, specifying what the system is intended to do, the inputs it will process, and the outputs it is expected to produce. Functional requirements shall reflect the intended use case as defined in the system concept documentation and shall be sufficient to guide design and development decisions.

Functional requirements should address the scope of the system's capabilities, any explicit limitations on its intended function, and the conditions under which the system is expected to operate. Requirements should be stated in a manner that is testable, so that conformance can be assessed during verification activities.
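
By way of illustration only (ISO 42001 does not prescribe any format), the Python sketch below shows how a single testable functional requirement might be captured as structured data. The `FunctionalRequirement` fields and the email-classification example are hypothetical assumptions, not a template from the standard.

```python
from dataclasses import dataclass

@dataclass
class FunctionalRequirement:
    """Hypothetical structure for one testable functional requirement."""
    req_id: str                     # unique identifier, e.g. "FR-001"
    statement: str                  # what the system is intended to do
    inputs: list[str]               # inputs the system will process
    outputs: list[str]              # outputs the system is expected to produce
    operating_conditions: str       # conditions under which the system must operate
    acceptance_criteria: list[str]  # testable pass/fail criteria for verification

# Example entry for an assumed document-classification use case.
fr_001 = FunctionalRequirement(
    req_id="FR-001",
    statement="Classify inbound customer emails into one of five support categories.",
    inputs=["email subject (text)", "email body (text, English)"],
    outputs=["category label", "confidence score in [0, 1]"],
    operating_conditions="Emails up to 10,000 characters; English language only.",
    acceptance_criteria=[
        "Returns exactly one of the five defined category labels.",
        "Returns a confidence score for every classification.",
        "Rejects non-English input with an explicit out-of-scope response.",
    ],
)
```

Capturing each requirement with explicit acceptance criteria, as in this sketch, keeps the specification testable and gives verification activities a concrete pass/fail target.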

Specifying Performance Criteria

Performance criteria establish the measurable standards against which AI system behaviour will be evaluated. The organisation shall define performance criteria appropriate to the intended use, including relevant accuracy metrics, error tolerance thresholds, response time requirements, and any requirements for consistency of outputs across relevant population groups.

Where the AI system's use context involves safety, legal compliance, or significant impact on individuals, performance criteria shall be defined with particular care. Thresholds shall be grounded in the risk assessment conducted under A.6.1.2 and shall reflect the consequences of system failures or substandard performance.
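
The following minimal Python sketch illustrates one way such criteria might be checked during evaluation: an overall accuracy threshold combined with a consistency criterion limiting the accuracy gap across population groups. The threshold values, group labels, and function names are illustrative assumptions; actual thresholds should be derived from the organisation's own A.6.1.2 risk assessment.

```python
# Assumed threshold values for illustration only.
OVERALL_ACCURACY_THRESHOLD = 0.95  # minimum acceptable overall accuracy
MAX_GROUP_ACCURACY_GAP = 0.03      # maximum allowed accuracy gap between groups

def evaluate_performance(results_by_group: dict[str, tuple[int, int]]) -> dict:
    """results_by_group maps a group label to (correct_predictions, total_predictions)."""
    per_group = {
        group: correct / total
        for group, (correct, total) in results_by_group.items()
    }
    overall = (sum(c for c, _ in results_by_group.values())
               / sum(t for _, t in results_by_group.values()))
    gap = max(per_group.values()) - min(per_group.values())
    return {
        "overall_accuracy": overall,
        "per_group_accuracy": per_group,
        "meets_overall_threshold": overall >= OVERALL_ACCURACY_THRESHOLD,
        "meets_consistency_criterion": gap <= MAX_GROUP_ACCURACY_GAP,
    }

# Illustrative evaluation results for three assumed population groups.
report = evaluate_performance(
    {"group_a": (960, 1000), "group_b": (940, 1000), "group_c": (955, 1000)}
)
print(report)
```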

Incorporating Regulatory and Policy Requirements

Requirements shall incorporate obligations arising from applicable laws, regulations, and organisational policies. This includes data protection requirements that affect permissible inputs or processing activities, sector-specific regulations governing AI use, and internal policies established under the AI policy framework.

The organisation shall ensure that requirements traceability is maintained, enabling a clear connection between specific requirements and the regulatory or policy provisions from which they derive. This supports compliance demonstration and facilitates the identification of requirements changes when regulations or policies are updated.
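
As a sketch of what such traceability might look like in practice, the Python snippet below maps hypothetical requirement identifiers to the provisions they derive from, flags untraced requirements, and answers the reverse question of which requirements to revisit when a provision changes. All identifiers and provision references are invented for illustration.

```python
# Hypothetical traceability map: requirement ID -> source provisions.
traceability = {
    "FR-001": ["GDPR Art. 5(1)(c)"],       # data minimisation constraint on inputs
    "PR-003": ["Internal AI Policy 4.2"],  # performance threshold set by policy
    "CN-002": [],                          # constraint not yet traced to a source
}

def untraced(trace_map: dict[str, list[str]]) -> list[str]:
    """Return requirement IDs lacking a documented regulatory or policy source."""
    return [rid for rid, sources in trace_map.items() if not sources]

def affected_by(provision: str, trace_map: dict[str, list[str]]) -> list[str]:
    """Return requirement IDs to revisit when a given provision is updated."""
    return [rid for rid, sources in trace_map.items() if provision in sources]

print(untraced(traceability))                          # -> ['CN-002']
print(affected_by("GDPR Art. 5(1)(c)", traceability))  # -> ['FR-001']
```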

Addressing Constraints and Dependencies

AI system requirements shall document constraints that limit the design space, including constraints on data sources, computational resources, integration with existing systems, and acceptable third-party components. Dependencies on external data suppliers, application programming interfaces, or infrastructure services shall be identified and documented.

Where requirements interact with or depend upon other AI systems or conventional software components, these interdependencies shall be made explicit so that integration requirements can be appropriately managed.
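
A minimal sketch of a dependency register is shown below, assuming a simple structure with name, kind, constraint, and owner fields; the entries and field names are illustrative rather than mandated by the standard.

```python
from dataclasses import dataclass

@dataclass
class Dependency:
    """Hypothetical entry in an AI system dependency register."""
    name: str        # the external data supplier, API, or service
    kind: str        # e.g. "data_supplier", "api", "infrastructure", "ai_system"
    constraint: str  # the limit this dependency places on the design space
    owner: str       # the role responsible for managing the dependency

# Illustrative entries; names and constraints are invented for the example.
register = [
    Dependency("Vendor-labelled training dataset", "data_supplier",
               "Licence permits internal model training only", "Data team"),
    Dependency("Geocoding REST API", "api",
               "Rate-limited to 100 requests per second", "Platform team"),
    Dependency("Upstream fraud-scoring model", "ai_system",
               "Output schema must remain backward compatible", "ML team"),
]

for dep in register:
    print(f"{dep.name} ({dep.kind}): {dep.constraint}")
```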

Stakeholder Requirements and Affected Parties

The requirements specification shall reflect the needs of relevant stakeholders, including intended users, operators, and, where appropriate, the interests of individuals who may be affected by AI system outputs. Requirements arising from user needs shall be distinguished from technical constraints and regulatory obligations to preserve clarity.

The organisation shall establish a process for validating requirements with appropriate stakeholders before design activities commence, ensuring that the specification accurately represents the intended use and operational context.

Requirements Management and Change Control

Requirements shall be placed under version control and managed in accordance with the organisation's document control procedures. A change control process shall be applied to requirements throughout the AI system lifecycle, ensuring that changes are assessed for their impact on design, verification activities, risk assessments, and compliance obligations before being incorporated.

The requirements specification shall be reviewed and updated whenever material changes occur to the intended use context, the regulatory environment, or the results of risk assessment activities.
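
As an illustrative sketch only, the Python snippet below models a change-control record that cannot be incorporated until each impact area named above has been assessed and the change approved; all field names are assumptions, not a prescribed schema.

```python
from dataclasses import dataclass

@dataclass
class RequirementChange:
    """Hypothetical change-control record for a requirements specification."""
    change_id: str
    affected_requirements: list[str]  # requirement IDs touched by the change
    reason: str                       # e.g. regulation update, new risk finding
    design_impact: str                # assessed impact on system design
    verification_impact: str          # verification activities to re-plan or re-run
    risk_impact: str                  # whether the A.6.1.2 assessment needs review
    compliance_impact: str            # affected regulatory or policy obligations
    approved: bool = False            # set once the change is formally accepted

def can_incorporate(change: RequirementChange) -> bool:
    """A change may be incorporated only after every impact field has been
    assessed (is non-empty) and the change has been approved."""
    assessments = (change.design_impact, change.verification_impact,
                   change.risk_impact, change.compliance_impact)
    return change.approved and all(assessments)
```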

Related Controls

  • A.6.1.1 – AI System Impact Assessment: Impact assessment findings shall inform requirements, particularly with respect to constraints on acceptable outputs and performance expectations.
  • A.6.1.2 – AI Risk Assessment: Risk assessment results shall be reflected in performance criteria and constraints documented in the requirements specification.
  • A.6.2.3 – Data for Development and Testing of AI Systems: Requirements shall address data quality, provenance, and representativeness needs that will govern data sourcing activities.
  • A.6.2.6 – AI System Verification and Validation: Verification and validation activities shall be planned against the requirements established under this control.
  • A.6.2.8 – AI System Documentation: The requirements specification forms a core component of AI system documentation maintained throughout the lifecycle.