ISO/IEC 42001:2023 - A.5.3 Documentation of AI System Impact Assessments
This article provides guidance on how to implement ISO/IEC 42001:2023 control A.5.3, Documentation of AI System Impact Assessments.
ISO/IEC 42001 Control Description
The organisation shall document the results of AI system impact assessments and retain results for a defined period.
Control Objective
To assess AI system impacts to individuals or groups of individuals, or both, and societies affected by the AI system throughout its life cycle.
Purpose
To create comprehensive records of impact assessments that support transparency, accountability, decision-making, and compliance. Documentation enables communication with stakeholders, informs risk management, provides evidence for audits, and supports continuous improvement of AI systems.
Guidance on Implementation
What to Document
The organisation should document the following (see ISO/IEC 42001 Annex B.5.3 and ISO/IEC 42005 Clause 6); a structured record sketch follows this list:
- AI system description and scope
- System purpose and functionality
- Technical characteristics
- Deployment context
- Lifecycle stage
- Specified intended uses
- Reasonably foreseeable misuse scenarios
- Use cases that are not intended
- Beneficial impacts for individuals, groups, and societies
- Potential harms or adverse impacts
- Severity and likelihood of impacts
- Relevant demographic groups to which the system applies
- Specific vulnerable groups (children, elderly, disabled persons, workers)
- Direct and indirect stakeholders
- System limitations and potential failures
- Impacts of failures
- Measures taken to mitigate failures
- Technical sophistication
- Decision-making opacity
- Interconnections with other systems
- Role of humans in relation to the system
- Human oversight capabilities, processes, and tools
- Mechanisms to avoid negative impacts
- Impact on employment
- Staff upskilling and training needs
- Workforce transformation
- Data resources used
- Data quality assessments
- Algorithm and model details
- Known biases
- Geographic areas and languages
- Environmental complexity and constraints
- Integration with existing systems
- Actions taken to maximise benefits
- Mitigation measures for harms
- Residual impacts after mitigation
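Organisations that track these records in internal tooling sometimes capture them as structured data. Below is a minimal sketch, assuming a Python-based record; the class and field names are illustrative and are not prescribed by ISO/IEC 42001 or ISO/IEC 42005.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ImpactAssessmentRecord:
    """Illustrative record of an AI system impact assessment (elements per the list above)."""
    system_name: str
    system_purpose: str                  # purpose, functionality, deployment context
    lifecycle_stage: str                 # e.g. "design", "verification", "operation"
    intended_uses: list[str] = field(default_factory=list)
    foreseeable_misuse: list[str] = field(default_factory=list)
    beneficial_impacts: list[str] = field(default_factory=list)
    potential_harms: list[str] = field(default_factory=list)   # note severity and likelihood per entry
    affected_groups: list[str] = field(default_factory=list)   # including vulnerable groups
    limitations_and_failures: list[str] = field(default_factory=list)
    human_oversight: str = ""
    mitigation_measures: list[str] = field(default_factory=list)
    residual_impacts: list[str] = field(default_factory=list)
    version: str = "1.0"
    assessed_on: date = field(default_factory=date.today)
```

A record like this can back both a standalone assessment document and the structured register discussed later in this article.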
Retention Requirements
Organisations should:
- Define retention period - Based on:
  - Organisational retention schedules
  - Legal requirements (e.g., EU AI Act record-keeping)
  - AI system lifecycle duration
  - Potential liability considerations
- Typical retention periods (a disposal-date sketch follows this list):
  - Duration of AI system operation plus a defined period (e.g., 5-10 years)
  - Regulatory requirements may specify minimum periods
  - Consider the statute of limitations for potential claims
- Update documentation - Reassess and update impact assessments when:
  - Significant system changes occur
  - New impacts are identified
  - Mitigation measures change
  - Regulatory requirements change
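As an illustration of how a defined retention period might be operationalised, the sketch below computes a disposal due date from a decommissioning date. The seven-year figure is an assumption, not a requirement of the standard.

```python
from datetime import date

# Assumed policy: keep records for the life of the AI system plus 7 years after
# decommissioning; substitute your organisational schedule and legal requirements.
RETENTION_YEARS_AFTER_DECOMMISSION = 7

def disposal_due_date(decommissioned_on: date) -> date:
    """Earliest date on which the impact assessment record may be disposed of."""
    target_year = decommissioned_on.year + RETENTION_YEARS_AFTER_DECOMMISSION
    try:
        return decommissioned_on.replace(year=target_year)
    except ValueError:                      # 29 February in a non-leap target year
        return decommissioned_on.replace(year=target_year, day=28)

# Example: a system decommissioned on 31 March 2026 is retained until at least 31 March 2033.
print(disposal_due_date(date(2026, 3, 31)))
```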
Documentation Format and Templates
Organisations can use any of the following formats (ISO/IEC 42005 provides templates):
- Standalone impact assessment document (ISO/IEC 42005 Annex E)
- Integrated with other assessments (DPIA, risk assessment)
- Structured database or system for multiple AI systems (see the register sketch below)
- Alignment guide linking to existing documentation (ISO/IEC 42005 Annex D)
Documentation should be:
- Comprehensive yet concise
- Structured for easy review
- Version-controlled
- Securely stored with appropriate access controls
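The register option mentioned above can be as simple as a small database table. The sketch below uses SQLite with hypothetical table and column names to show how version, approval, and retention metadata might be kept alongside a pointer to the full document; access controls and encryption of the underlying store sit outside this sketch.

```python
import sqlite3

# Illustrative register of impact assessment records across multiple AI systems.
# Table and column names are assumptions, not prescribed by ISO/IEC 42001 or 42005.
conn = sqlite3.connect("ai_impact_assessments.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS impact_assessments (
        id            INTEGER PRIMARY KEY,
        system_name   TEXT NOT NULL,
        version       TEXT NOT NULL,   -- supports version control of the record
        assessed_on   TEXT NOT NULL,   -- ISO 8601 date
        approved_by   TEXT,            -- sign-off (see Implementation Steps)
        document_path TEXT NOT NULL,   -- location of the full assessment document
        retain_until  TEXT             -- disposal due date per the retention schedule
    )
""")
conn.execute(
    "INSERT INTO impact_assessments "
    "(system_name, version, assessed_on, approved_by, document_path, retain_until) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    ("CV screening model", "1.2", "2025-06-01", "AI governance lead",
     "assessments/cv-screening-v1.2.pdf", "2033-03-31"),
)
conn.commit()
conn.close()
```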
Implementation Steps
- Select documentation approach - Choose template or format appropriate for organisation
- Document each assessment - Complete documentation during or immediately after impact assessment
- Review for completeness - Verify all required elements are documented (see the completeness-check sketch after this list)
- Obtain approvals - Secure necessary sign-offs
- Store securely - Maintain in controlled repository with access management
- Define retention schedule - Establish retention periods and disposal procedures
- Enable retrieval - Ensure documentation can be accessed when needed (audits, reviews, stakeholder requests)
- Maintain confidentiality - Protect sensitive information while enabling appropriate transparency
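A lightweight way to support the completeness review is an automated check of required elements before sign-off. The required-element list below is an assumption derived from the "What to Document" section, not an exhaustive ISO/IEC 42005 checklist.

```python
# Hypothetical minimum set of documented elements checked before approval.
REQUIRED_ELEMENTS = [
    "system_purpose", "lifecycle_stage", "intended_uses", "foreseeable_misuse",
    "beneficial_impacts", "potential_harms", "affected_groups",
    "mitigation_measures", "residual_impacts",
]

def missing_elements(record: dict) -> list[str]:
    """Return required elements that are absent or empty in an assessment record."""
    return [name for name in REQUIRED_ELEMENTS if not record.get(name)]

draft = {"system_purpose": "Rank job applications", "lifecycle_stage": "design"}
gaps = missing_elements(draft)
if gaps:
    print("Assessment incomplete; missing:", ", ".join(gaps))
```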
Use of Documentation
Impact assessment documentation should inform:
- Communication with users and stakeholders (reference ISO/IEC 42001 Clause 9.1 - monitoring, measurement, analysis and evaluation)
- Risk management decisions (link to Clause 6.1.2)
- System design and development choices
- Deployment go/no-go decisions
- Ongoing monitoring activities
- Regulatory compliance demonstrations
Key Considerations
Transparency vs. confidentiality: Balance the need for transparency with protection of confidential information (trade secrets, security details). Consider publishing summary versions for external stakeholders while maintaining detailed internal documentation.
Living documents: Impact assessment documentation should evolve as understanding of the AI system and its impacts develops. Version control is essential.
Accessibility: Ensure documentation is accessible to relevant stakeholders:
- Internal: developers, risk managers, compliance officers, auditors
- External: regulators, affected communities (where appropriate)
- Format appropriately for different audiences
Integration with other documentation: Link impact assessment documentation to the following (a cross-reference sketch follows this list):
- Risk assessment records
- System specifications
- Testing and validation reports
- Monitoring data
- Incident reports
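One way to make these links explicit is a small cross-reference block stored with the assessment record; the identifiers and paths below are hypothetical examples of how traceability could be recorded.

```python
# Hypothetical cross-references from an impact assessment record to related documentation.
related_documentation = {
    "risk_assessment_ids": ["RA-2025-014"],
    "system_specification": "specs/cv-screening/system-spec-v3.md",
    "test_and_validation_reports": ["TVR-0098", "TVR-0112"],
    "monitoring_dashboards": ["https://monitoring.example.org/cv-screening"],
    "incident_reports": [],  # populated if incidents occur during operation
}
```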
Regulatory requirements: Some jurisdictions mandate specific documentation for AI impact assessments. Ensure compliance with applicable requirements (e.g., EU AI Act conformity assessments).
Related Controls
Within ISO/IEC 42001:
- A.5.2 AI system impact assessment process
- A.5.4 Assessing impact on individuals and groups
- A.5.5 Assessing societal impacts
- Clause 7.5 Documented information
Related Standards:
- ISO/IEC 42005:2025 Clause 6 and Annexes D, E (detailed documentation guidance and templates)