ISO 42001:2023 - A.7.6 AI System Change Management
This article provides guidance on implementing ISO 42001:2023 control A.7.6, AI System Change Management.
ISO 42001 Control Description
The organisation shall establish and operate a change management process for AI systems in operational use, ensuring that all changes to systems, data pipelines, models, and operational configurations are assessed, authorised, implemented, and documented in a controlled manner.
Control Objective
To ensure that changes to operational AI systems are managed in a way that preserves system integrity, maintains the validity of existing assessments and documentation, protects against the introduction of new risks, and preserves the organisation's capacity to account for the current state of its AI systems.
Purpose
AI systems are not static artefacts: they require ongoing maintenance, and the operational and regulatory environments in which they function evolve continuously. Changes may be required to address performance degradation, incorporate new data, respond to regulatory developments, address security vulnerabilities, or adapt the system to changes in the business processes it supports. Without a structured change management process, changes can be implemented in ways that introduce unintended risks, invalidate prior assessments, or leave the organisation unable to account for the current state or history of its systems.
AI system change management presents distinctive challenges compared with change management for conventional software. Changes to a machine learning model — including retraining on new data, modification of hyperparameters, or changes to data pre-processing — can alter system behaviour in ways that are not always predictable or easily characterised. A change that appears minor in technical terms may have significant implications for system performance, fairness, or the validity of existing documentation.
This control establishes the requirement for all changes to AI systems to pass through a disciplined review and authorisation process before implementation, with particular attention to the assessment of change impacts on risk profile, performance characteristics, and documentation validity.
Guidance on Implementation
Change Identification and Classification
The organisation shall establish a process for identifying and classifying proposed changes to AI systems. Change types to be covered include:
- model retraining, including changes to training data composition, training procedures, or hyperparameter configurations;
- changes to the data pipeline, including modifications to data sources or pre-processing logic;
- changes to system architecture or integration interfaces;
- operational configuration changes;
- updates to third-party components or dependencies; and
- changes to the operational use context or the populations the system serves.
Changes shall be classified according to their potential impact on system behaviour and risk profile, with classification determining the level of assessment and authorisation required.
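As a minimal sketch of the classification step, the change types and impact tiers below are illustrative examples only; the standard does not prescribe specific categories, and an organisation would tune the mapping to its own risk profile.

```python
from enum import Enum

class ChangeType(Enum):
    """Illustrative change types drawn from the guidance above."""
    MODEL_RETRAINING = "model_retraining"
    DATA_PIPELINE = "data_pipeline"
    ARCHITECTURE = "architecture"
    OPERATIONAL_CONFIG = "operational_config"
    THIRD_PARTY_UPDATE = "third_party_update"
    USE_CONTEXT = "use_context"

class ImpactClass(Enum):
    """Classification tiers that determine assessment and authorisation depth."""
    MINOR = "minor"              # standard review and sign-off
    MODERATE = "moderate"        # impact assessment plus targeted re-testing
    SIGNIFICANT = "significant"  # full re-validation and governance review

# Hypothetical default mapping; not prescribed by ISO 42001.
DEFAULT_CLASSIFICATION = {
    ChangeType.MODEL_RETRAINING: ImpactClass.SIGNIFICANT,
    ChangeType.DATA_PIPELINE: ImpactClass.MODERATE,
    ChangeType.ARCHITECTURE: ImpactClass.SIGNIFICANT,
    ChangeType.OPERATIONAL_CONFIG: ImpactClass.MINOR,
    ChangeType.THIRD_PARTY_UPDATE: ImpactClass.MODERATE,
    ChangeType.USE_CONTEXT: ImpactClass.SIGNIFICANT,
}

def classify_change(change_type: ChangeType) -> ImpactClass:
    """Return the default impact class for a proposed change type."""
    return DEFAULT_CLASSIFICATION[change_type]
```

In practice the classification would also consider change-specific factors (e.g. the size of the retraining data delta), not just the change type.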
Change Impact Assessment
All proposed changes shall be subject to an impact assessment before implementation. The assessment shall evaluate:
- the potential effects of the change on system performance and output characteristics;
- the continued validity of existing risk assessments and impact assessments;
- compliance with applicable regulatory and policy requirements;
- the need to update system documentation; and
- whether the change necessitates additional verification and validation activities.
The depth of the impact assessment shall be proportionate to the classification of the change. Changes assessed as having significant potential impact shall require more extensive analysis, including re-evaluation of fairness and robustness properties.
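One way to make the assessment record concrete is a structured data type capturing the evaluation points above. The field names and the extended-analysis rule are hypothetical, offered as a sketch rather than a required schema.

```python
from dataclasses import dataclass, field

@dataclass
class ImpactAssessment:
    """Illustrative record of a pre-implementation change impact assessment."""
    change_id: str
    performance_effects: str            # expected effects on performance/outputs
    risk_assessment_still_valid: bool   # does the existing risk assessment hold?
    regulatory_compliance_notes: str
    documentation_updates_needed: list[str] = field(default_factory=list)
    revalidation_required: bool = False

    def requires_extended_analysis(self, impact_class: str) -> bool:
        """Significant changes warrant re-evaluation of fairness and robustness."""
        return impact_class == "significant" or self.revalidation_required
```

A completed record of this kind would feed directly into the authorisation step, so the approver sees the assessed impact alongside the proposed change.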
Change Authorisation
Changes shall be subject to formal authorisation before implementation. Authorisation shall be provided by personnel with appropriate technical authority and accountability for the AI system. For changes assessed as having significant potential impact, authorisation shall require review by relevant governance functions, including risk management and compliance where applicable.
The organisation shall maintain records of all change authorisation decisions, including the rationale for approvals and any conditions attached to authorised changes.
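An authorisation record of the kind described above might look like the following sketch; the structure and the append-only log are illustrative assumptions, not a mandated format.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ChangeAuthorisation:
    """Illustrative record of a change authorisation decision."""
    change_id: str
    approved: bool
    authoriser: str                  # person with technical authority and accountability
    rationale: str                   # reasoning behind approval or rejection
    conditions: list[str] = field(default_factory=list)
    governance_review: bool = False  # set for significant-impact changes
    decided_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# Append-only decision log (in practice this would be durable, access-controlled storage).
AUTHORISATION_LOG: list[ChangeAuthorisation] = []

def record_decision(decision: ChangeAuthorisation) -> None:
    """Append the authorisation decision so rationale and conditions are retained."""
    AUTHORISATION_LOG.append(decision)
```

Keeping the rationale and any attached conditions in the record itself supports later audit of why a change was allowed to proceed.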
Testing and Validation of Changes
Changes shall be tested and validated before deployment to the operational environment. The scope of testing shall be commensurate with the assessed impact of the change. Changes to the model or data pipeline shall be subject to appropriate performance and fairness evaluation using current test datasets. Where a change alters system behaviour materially, a full or partial re-run of the validation process may be required.
Testing results shall be documented and shall form part of the evidence base for change authorisation.
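The performance-and-fairness gate described above can be sketched as a simple threshold check. The metric names (`accuracy`, `fairness_gap`) and threshold values are hypothetical; real deployments would use the metrics defined in their own validation plan.

```python
def passes_change_gate(current: dict, candidate: dict,
                       max_perf_drop: float = 0.01,
                       max_fairness_gap: float = 0.05) -> bool:
    """Gate a model change on illustrative performance and fairness thresholds.

    `current` and `candidate` are metric dicts evaluated on the same current
    test dataset, e.g. {"accuracy": 0.91, "fairness_gap": 0.03}.
    """
    # Reject changes that degrade performance beyond the allowed tolerance.
    perf_ok = candidate["accuracy"] >= current["accuracy"] - max_perf_drop
    # Reject changes whose fairness gap exceeds the absolute limit.
    fairness_ok = candidate["fairness_gap"] <= max_fairness_gap
    return perf_ok and fairness_ok
```

A gate result of this kind, together with the underlying metric values, would be recorded as part of the evidence base for the authorisation decision.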
Change Documentation and Traceability
All implemented changes shall be documented, with records maintained of the nature of the change, the impact assessment conducted, the authorisation obtained, the testing performed, and the date of implementation. Documentation shall be updated to reflect the post-change state of the system, ensuring that the organisation's records accurately represent the current AI system.
Version control shall be applied to model artefacts, code, and configuration files affected by the change, maintaining a complete history of the system's evolution.
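One lightweight way to support the traceability requirement is to fingerprint each change record so the post-change state of model, code, and configuration can be verified later. This is a sketch of that idea, not a prescribed mechanism.

```python
import hashlib
import json

def change_record_fingerprint(record: dict) -> str:
    """Produce a stable SHA-256 fingerprint of a change record.

    Serialising with sorted keys makes the fingerprint independent of
    key order, so the same record always yields the same digest.
    """
    canonical = json.dumps(record, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()
```

Storing the fingerprint alongside the version-control identifiers of the affected artefacts ties the documented change to the exact system state it produced.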
Emergency Changes
The organisation shall define procedures for emergency changes — changes required to address urgent issues such as critical security vulnerabilities or severe operational failures — where the normal change management timeline cannot be followed. Emergency change procedures shall preserve essential controls, including documentation and retrospective impact assessment, while enabling the organisation to act promptly when required.
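The emergency-change procedure above can be sketched as an expedited record that still schedules the mandatory retrospective assessment. The five-day retrospective window is an illustrative policy choice, not a requirement of the standard.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class EmergencyChange:
    """Illustrative record of an expedited (emergency) change."""
    change_id: str
    justification: str          # e.g. critical security vulnerability
    implemented_at: datetime
    retrospective_due: datetime # deadline for the retrospective impact assessment
    retrospective_done: bool = False

def open_emergency_change(change_id: str, justification: str,
                          retrospective_window_days: int = 5) -> EmergencyChange:
    """Record an emergency change and schedule its retrospective assessment.

    Documentation and retrospective assessment are preserved even though
    the normal pre-implementation review timeline is bypassed.
    """
    now = datetime.now(timezone.utc)
    return EmergencyChange(
        change_id=change_id,
        justification=justification,
        implemented_at=now,
        retrospective_due=now + timedelta(days=retrospective_window_days),
    )
```

Tracking `retrospective_done` explicitly gives governance functions a simple way to verify that no emergency change escapes its after-the-fact assessment.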
Related Controls
- A.7.5 – AI System Monitoring: Monitoring findings are a primary source of change triggers, and monitoring shall be used to evaluate the effects of changes following implementation.
- A.6.2.6 – AI System Verification and Validation: Changes with significant potential impact on system behaviour shall be subject to appropriate re-validation activities before deployment.
- A.6.2.8 – AI System Documentation: Change records and updated documentation shall be maintained as part of the comprehensive AI system documentation.
- A.6.1.2 – AI Risk Assessment: Significant changes shall trigger a review and update of the AI risk assessment to ensure it reflects the current state of the system.
- A.7.2 – Establishing Processes, Functions and Tools for AI Operation: Change management is an operational process and shall be integrated with the broader operational process framework.