7 Design
This version of the AQuA Book is a preliminary ALPHA draft. It is still in development, and we are working to ensure that it meets user needs.
The draft currently has no official status. It is a work in progress and is subject to further revision and reconfiguration (possibly substantial change) before it is finalised.
During the design stage the analyst creates an actionable analytical plan from the scope for the analysis agreed with the commissioner.
This chapter sets out recommended practices for designing the analysis, deciding on the associated assurance activities, and documenting and assuring the design. It also discusses the treatment of uncertainty in design and the design of multi-use and Artificial Intelligence (AI) models.
The development of the analytical plan should consider:
- methodology for producing results, including the treatment of uncertainty
- project management approach (for example agile[^1], waterfall[^2] or a combination of approaches)
- sourcing of inputs and assumptions
- data and file management
- change management and version control
- programming language and software
- code management, documentation and testing
- communication between stakeholders
- verification and validation procedures during the project lifetime
- documentation to be delivered
- process for updating the analytical plan
- process for ongoing review and maintenance of models, including reviewing inputs and calibrations and ensuring that software relied on continues to be supported and up to date
- ethics
- reporting
- the use and application of the analysis
Iteration of the plan between the commissioner and the analyst is normal and expected while the analytical design develops.
The recommended approach for developing analysis in code is to use a Reproducible Analytical Pipeline (RAP). RAPs shall:
- follow the practices set out in the Analysis Function Quality Assurance of Code for Analysis and Research manual
- meet the requirements of the RAP minimum viable product
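As a minimal sketch of what a RAP-style structure can look like in practice, analysis steps can be written as small, documented functions with automated tests alongside them. This assumes Python with pandas and pytest; the file, column and function names are illustrative assumptions, not part of the RAP guidance:

```python
# A minimal, hypothetical sketch of a RAP-style module: load, transform and test
# steps live together in version-controlled code so the analysis can be re-run.
import pandas as pd

def load_data(path: str) -> pd.DataFrame:
    """Read the raw input data from a CSV file."""
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Derive the analysis variable; each step is explicit and repeatable."""
    out = df.dropna(subset=["value"]).copy()
    out["value_indexed"] = out["value"] / out["value"].iloc[0] * 100
    return out

def test_transform_drops_missing_values():
    """A unit test discovered and run by pytest, supporting automated assurance."""
    raw = pd.DataFrame({"value": [2.0, None, 4.0]})
    assert transform(raw)["value"].notna().all()
```

Keeping the loading, transformation and testing steps together in code means the whole pipeline can be re-run and re-verified automatically whenever inputs or assumptions change.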
7.1 Roles and responsibilities in the design stage
7.1.1 The commissioner’s responsibilities
The commissioner should confirm that the analytical approach will satisfy their needs. To assist in this, the commissioner may review the analytical plan.
The commissioner’s expertise can be a useful resource for the analyst in the design stage. The commissioner might provide information regarding the input assumptions, data requirements and the most effective ways to present the outputs. All of these can inform the design.
7.1.2 The analyst’s responsibilities
The analyst should:
- develop the method and plan to address the commissioner’s needs
- establish assurance requirements
- develop a plan for proportionate verification and validation as described in the National Audit Office Framework to review models
- plan in sufficient time for the assurance activity
- document the analytical plan in a proportionate manner
- follow any organisational governance procedures for project design
7.1.3 The assurer’s responsibilities
The assurer should review the analytical plan to ensure that they will be able to conduct the required assurance activities. They may provide feedback on the analytical plan. The assurer should also plan sufficient time for the assurance activity.
7.1.4 The approver’s responsibilities
In smaller projects, the approver may not be heavily involved in the design stage. However, for business critical analysis, the approver may want to confirm that organisational governance procedures for design have been followed.
7.2 Assurance activities
When the design stage has been completed the assurer should be aware of the quality assurance tasks that will be required of them during the project lifetime and have assured the necessary elements of the analytical plan.
The assurance of the design stage should consider whether the analytical plan is likely to:
- address the commissioner's requirements (validation)
- deliver as intended (verification)
- meet the principles of RIGOUR by, for example, providing a well-structured, data-driven plan with a sound overall design
The assurance of the design stage may be carried out by the assurer. For more complex analysis, it is good practice to engage subject matter experts to provide independent assurance and to ensure that the accuracy and limitations of the chosen methods are understood, ideally with tests that baseline the methods against independent reference cases.
7.3 Documentation
The design process should be documented in a proportionate manner. A design document that records the analytical plan should be produced by the analyst and signed off by the commissioner. The design document may be reviewed by the assurer.
For modelling, an initial model map may be produced that describes data flows and transformations. This can be updated as the project progresses through the Analysis stage.
It is best practice to use formal version control to track changes in the design document.
7.4 Treatment of uncertainty in the design stage
During the design stage, analysts should systematically examine the planned analysis for possible sources and types of uncertainty. This maximises the chance of identifying all the sources of uncertainty that are large enough to breach the acceptable margin of error.
You can read more in Chapter 3 of the Uncertainty Toolkit for Analysts.
7.5 Black box models
Using black box models places greater weight on the design of the analysis and the assurance and validation of outputs by domain experts.
This guidance on AI assurance outlines considerations for the design of AI models, including risk assessment, impact assessment, bias audits and compliance audits.
In the design of AI and machine learning models, the analyst should:
- define the situation they wish to model and the prediction they wish to make
- assess the quality of data that could be used to make the prediction, including the data used in any pre-trained models
- carry out a literature review to identify appropriate modelling, validation and verification methods, and document the rationale for selecting their approach
- consider the appropriate data training, validation and testing strategy for the models; it is usual to design a model with a fraction of the data, validate it with a separate portion, and then test the final model with data that was not used in the design (see the first sketch after this list)
- consider the strategy for testing a pre-trained model, including appropriate validation methods such as calculating similarity to labelled images or other ground truths for generative AI
- consider developing automatic checks to identify whether the model is behaving unexpectedly; this is important if the model is likely to be used frequently to make regular decisions or is deployed into a production environment (see the second sketch after this list)
- consider the plan for maintenance and continuous review, including the thresholds or timeline to retrain the model and the resources required to support this - see the [maintenance and continuous review](https://best-practice-and-impact.github.io/aqua_book_revision/analytical_lifecycle.html#maintenance-and-continuous-review) section
- consider referring the model to their ethics committee, or a similar group depending on their organisation's internal governance structures - see the Data Ethics Framework
- consider setting up a peer or academic review process to test the methodologies and design decisions
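To illustrate the train, validation and test strategy described in the list above, the following is a minimal sketch assuming scikit-learn; the synthetic data, split proportions and choice of logistic regression are illustrative assumptions only:

```python
# A minimal sketch of a design/validate/test data split, assuming scikit-learn.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=0)

# Hold back 20% as a final test set that plays no part in design decisions.
X_rest, X_test, y_rest, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Split the remainder into training data and a validation set used while designing.
X_train, X_val, y_train, y_val = train_test_split(X_rest, y_rest, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("validation accuracy:", accuracy_score(y_val, model.predict(X_val)))

# The held-back test set is used once, after the design is fixed.
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```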
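Similarly, for the automatic checks on model behaviour mentioned above, a minimal hypothetical sketch follows; the specific checks and thresholds are assumptions and would need to be chosen for the model in question:

```python
# A minimal, hypothetical sketch of automatic output checks; the bounds and
# thresholds here are assumptions and must be set for the model in question.
import numpy as np

def check_predictions(preds: np.ndarray, lower: float = 0.0, upper: float = 1.0) -> list:
    """Return warnings if the model's outputs look unexpected."""
    warnings = []
    if np.isnan(preds).any():
        warnings.append("model produced missing predictions")
    if ((preds < lower) | (preds > upper)).any():
        warnings.append("predictions fall outside the expected range")
    if preds.std() < 1e-6:
        warnings.append("predictions are near-constant; the model may have degraded")
    return warnings

# Example: run the checks on each batch of scores before results are used.
issues = check_predictions(np.array([0.2, 0.7, 0.5]))
if issues:
    raise RuntimeError("; ".join(issues))
```

Checks like these can run on every batch of outputs in a production environment, flagging drift or degradation before results are used to make decisions.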
7.6 Multi-use models
Designing multi-use models should take into account the needs of all users of the analysis. An Analysis Steering Group may be an effective means for communication about the design with a range of user groups.
The design of multi-use models may entail a modular structure with different analysts and assurers responsible for different elements. The design of assurance activities should capture both the assurance of individual modules and their integration.
[^1]: Agile Software Development
[^2]: Waterfall planning