7  Design

Important

This version of the AQuA book is a preliminary ALPHA draft. It is still in development, and we are still working to ensure that it meets user needs.

The draft currently has no official status. It is a work in progress and is subject to further revision and reconfiguration (possibly substantial change) before it is finalised.

7.1 Introduction and overview

The design stage is where the Analyst translates the scope for the analysis agreed with the Commissioner into an actionable analytical plan. This chapter sets out recommended practices around designing the analysis and associated assurance activities, documenting the design and assuring the design. It also discusses considerations around the treatment of uncertainty in design, and design of multi-use and AI models.

7.1.1 The design stage

The development of the analytical plan should consider:

  • Methodology for producing results, including the treatment of uncertainty;
  • Project management approach (for example Agile, Waterfall or a combination of approaches);
  • Sourcing of inputs and assumptions;
  • Data and file management;
  • Change management and version control;
  • Programming language and/or software;
  • Code management, documentation and testing;
  • Communication between stakeholders;
  • Verification and validation procedures during the project lifetime;
  • Documentation to be delivered;
  • Process for updating the analytical plan;
  • Ethics;
  • Reporting;
  • Downstream application.

The use of Reproducible Analytical Pipelines (RAP) is encouraged as a means of effective project design.

Iteration between the Commissioner and the Analyst is normal and expected whilst the analytical design develops.

Reproducible analytical pipelines

The recommended approach for developing analysis in code is to use a Reproducible Analytical Pipeline (RAP). Reproducible Analytical Pipelines shall:

  • minimise manual steps;
  • be developed using open-source analytical software;
  • be kept under version control;
  • be peer reviewed;
  • include documentation and testing alongside the code.
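For illustration only (not a requirement of this guidance), a minimal pipeline in Python might separate data ingestion, transformation and output into small functions that can be rerun end to end and tested individually; the file names, column names and functions below are hypothetical.

```python
"""A minimal, illustrative pipeline; file names and columns are hypothetical."""
from pathlib import Path

import pandas as pd


def load_inputs(path: Path) -> pd.DataFrame:
    """Read the raw input data from an agreed, version-controlled location."""
    return pd.read_csv(path)


def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Apply the documented methodology (a placeholder aggregation here)."""
    return raw.groupby("region", as_index=False)["value"].sum()


def write_outputs(results: pd.DataFrame, path: Path) -> None:
    """Write the results so a rerun can be compared like for like."""
    results.to_csv(path, index=False)


def main() -> None:
    # A single entry point means the whole analysis reruns with one command.
    raw = load_inputs(Path("data/input.csv"))
    write_outputs(transform(raw), Path("outputs/results.csv"))


if __name__ == "__main__":
    main()
```

Because each step is a separate function, it can be unit tested and peer reviewed, and the whole pipeline rerun under version control when inputs or assumptions change.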

7.2 Roles and responsibilities in the design stage

7.2.1 The Commissioner’s responsibilities during the design stage

The Commissioner should confirm that the analytical approach will satisfy their needs. To assist in this, the Commissioner may review the analytical plan.

The Commissioner’s domain expertise can be a useful resource for the Analyst in the design stage. The Commissioner might provide information regarding the input assumptions, data requirements and the most effective ways to present the outputs, all of which can inform the design.

7.2.2 The Analyst’s responsibilities during the design stage

The Analyst should:

  • develop the method and plan to address the Commissioner’s needs
  • determine what the assurance requirements will be
  • plan in sufficient time for the assurance activity
  • document the analytical plan in a proportionate manner
  • follow any organisational governance procedures for project design.

7.2.3 The Assurer’s responsibilities during the design stage

The Assurer should review the analytical plan to ensure that they are able to conduct the required assurance activities. They may provide feedback on the analytical plan. The Assurer should plan sufficient time for the assurance activity.

7.2.4 The Approver’s responsibilities during the design stage

In smaller projects, the Approver may not be heavily involved in the design stage. However, for business critical analysis, the Approver may want to confirm that organisational governance procedures for design have been followed.

7.3 Assurance activities in the design stage

Designing the assurance activities for subsequent analytical stages is a key part of the design stage. The design of the assurance should take into account the principles of proportionality. Assurance of data quality should be considered in the analytical plan.

On completion of the design stage, the Assurer should be aware of the quality assurance tasks that will be required of them during the project lifetime and have assured the necessary elements of the analytical plan.

The assurance of the design stage should consider whether the analytical plan:

  • Delivers as intended;
  • Adequately addresses the complexities of the customer issue for this purpose;
  • Is well-structured for the purpose, data-driven, and reflects a robust overall design.

The assurance of the design stage may be carried out by the Assurer. For more complex analysis, it is good practice to engage subject matter experts to provide independent assurance and to ensure that the accuracy and limitations of the chosen methods are understood, ideally with tests that baseline the methods’ outputs against independent reference cases.
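As a sketch only (the function and figures below are hypothetical and not drawn from this guidance), a reference-case test can check that a method reproduces an independently worked answer within a stated tolerance.

```python
import math


def project_cost(units: int, unit_cost: float, overhead_rate: float) -> float:
    """Hypothetical stand-in for the project's own method."""
    return units * unit_cost * (1 + overhead_rate)


def test_against_independent_reference_case():
    # Reference value worked independently (for example by hand or in a
    # separate spreadsheet): 100 * 5.00 * 1.2 = 600.00.
    assert math.isclose(project_cost(100, 5.00, 0.2), 600.00, rel_tol=1e-9)
```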

7.4 Documentation in the design stage

The design process should be documented in a proportionate manner. A design document that records the analytical plan should be produced by the Analyst and signed off by the Commissioner. The design document may be reviewed by the Assurer.

It is best practice to use formal version control to track changes in the design document.

7.5 Treatment of uncertainty in the design stage

During the design stage, Analysts should examine the planned analysis systematically for possible sources and types of uncertainty, to maximise the chance of identifying all those large enough to breach the acceptable margin of error. This is discussed in Chapter 3 of the Uncertainty Toolkit for Analysts.
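As an illustrative sketch only (all figures are made up), a quick Monte Carlo exercise at the design stage can indicate whether uncertainty in an input assumption is large enough to matter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical input assumption: annual growth of 2%, with uncertainty
# expressed as a standard deviation of 0.5 percentage points.
growth = rng.normal(loc=0.02, scale=0.005, size=10_000)

baseline = 1_000_000                        # hypothetical starting value
projection = baseline * (1 + growth) ** 5   # five-year projection

low, high = np.percentile(projection, [5, 95])
print(f"90% interval: {low:,.0f} to {high:,.0f}")
```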

7.6 Black box models and the design stage

Using black box models places greater weight on the design of the analysis and the assurance and validation of outputs by domain experts.

This guidance outlines considerations for the design of AI models, including risk assessment, impact assessment, bias audits and compliance audits.

In the design of AI/ML models, the Analyst should:

  • define the situation they wish to model
  • define the prediction they wish to make
  • identify the data that could be used to make the prediction
  • consider how to separate the data for the design and testing of models - it is usual to design a model with a fraction of the data and then test it with the data that was not used in the design (see the sketch after this list)
  • identify potential modelling methods
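A minimal sketch of such a separation, assuming tabular data in a CSV file and the scikit-learn library; the file and column names are hypothetical.

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Hypothetical dataset: explanatory features plus the outcome to be predicted.
data = pd.read_csv("model_data.csv")
X = data.drop(columns=["outcome"])
y = data["outcome"]

# Hold back 20% of the records for testing; they play no part in designing
# (training) the model, so performance on them is a fair check.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)
```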

7.7 Multi-use models and the design stage

Designing multi-use models should take into account the needs of all users of the analysis. An Analysis Steering Group can be an effective means for communication about the design with a range of user groups.

The design of multi-use models may entail a modular structure with different Analysts and Assurers responsible for different elements. The design of assurance activities should capture both the assurance of individual modules and their integration.
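As a hypothetical sketch (module names and figures are illustrative only), a modular design lets each Assurer test their module in isolation, with a separate integration test covering how the modules combine.

```python
def demand_module(population: float, uptake_rate: float) -> float:
    """Estimate demand; developed and assured by one Analyst and Assurer."""
    return population * uptake_rate


def cost_module(demand: float, unit_cost: float) -> float:
    """Estimate cost; developed and assured by another Analyst and Assurer."""
    return demand * unit_cost


def test_demand_module():
    assert demand_module(1_000, 0.1) == 100


def test_cost_module():
    assert cost_module(100, 25.0) == 2_500.0


def test_integration():
    # Integration assurance: the modules combine as the overall design intends.
    assert cost_module(demand_module(1_000, 0.1), 25.0) == 2_500.0
```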