Quality Questions

Note

This guidance is an ALPHA draft. It is in development and we are still working to ensure that it meets user needs.

Please support the development of this guidance by sending us feedback, either by creating a GitHub Issue or by email.

To get the most out of the template, we strongly recommend that teams identify who will take the key quality assurance roles of Commissioner, Approver and Assurer and name the analytical team at the start of the analytical cycle. This is crucial because together these roles help to make sure that the analysis you do is fit for purpose.

I. Scoping

Quality Question Why do I need to know the answer to this?
Q1 What question does the analysis try to answer? A clear understanding of the analysis question is critical. It helps your team to scope out requirements, understand the strengths and limitations of the analysis and make sure it is fit for purpose.

If the question is not clear, you risk designing and delivering analysis which does not meet user needs.
Q2 Why do you need to answer this analysis question? Knowing why you need the analysis, what it is for and how it will be used will help you to understand the importance and impact of your work and how it supports decision making.

It will also help you to make sure the analysis is fit for purpose and correctly answers the question.
Q3 Which organisational priorities does this analysis address? Knowing how the work aligns with organisational priorities shows how it will fit with wider strategic objectives and why you should do the analysis now.

It informs the level of assurance needed to confirm the work is fit for purpose.
Q4 If you use a model, is it business critical? Identifying if the work is business critical determines the assurance needed to ensure it is fit for purpose.
Q5 Who needs the answer to the analysis question? Knowing what your outputs will be used for can ensure they meet user needs. A good understanding of uses is essential for making sure that your analysis is fit for purpose.
Q6 Who do you need to consult to make sure you meet the right user needs? Analysis must be well understood by relevant users, or else it risks scope creep and misspecification.

Identify relevant stakeholders and users, consult them before designing the analysis, and consider their views.

Consulting the right stakeholders helps you and your users agree on how to answer the question and what the output should look like. This means you can check the users' understanding of the process and its quality, and that the final output meets user needs.
Q7 How will you know you have answered the analysis question correctly? Being clear about the outputs required and acceptable uncertainty is essential for producing accurate and reliable outputs where users understand the work's limitations and uncertainty.

It is also essential when designing verification and validation activities to check the robustness of results under a range of plausible assumptions about methods and data.
Q8 What is the estimated time and resource required to answer the analysis question (in months and FTE)? Without a clear understanding of time and resources available, you may overcommit. It is important to push back against unrealistic demands if there is not enough time to effectively quality assure the analysis.

The users and commissioner should fully understand and accept increased risks to quality when time and resource pressures are unavoidable.
Q9 What is the impact if the analysis is not done now? Understanding why the work needs to happen now will help you to prioritise and use limited resources for the right activities.
Q10 What is the impact if the analysis is not done correctly? Understanding the possible legal, financial or reputational consequences if the analysis is not carried out correctly helps you to design proportionate assurance activities.

You should consider how these consequences line up with the risk appetite of your organisation when you design your mitigation.
Q11 Name the Commissioner, Approver and Assurer of this analysis? Clear accountability makes sure that important decisions are signed off by the right people. There should be a clear understanding of who is responsible for managing, producing and quality assuring the analysis in the team.
Q12 What tools and resources will you use in production? Are they the best for the job? Before starting, identify all the skills and resources needed to produce the final output in a sustainable and reproducible way and quality assure it at every step.

The platform used to host the analysis and the software used to build and run it should be appropriate, and the associated risks should be considered. For example, producing critical outputs in Excel does not comply with best practice and is unlikely to be robust or verifiable.
Q13 Do you have the right internal and external resources and capability to deliver the analysis? If the team lacks capability, resource, or time, this increases the risk that the analysis will not be fit for purpose or sufficiently assured.
Q14 What are the anticipated risks of the analysis?

Have you discussed these risks with customers and stakeholders?
You must identify potential risks and their impact well in advance to enable effective mitigation and quicker, confident decision-making.

It is important that users understand risks to ensure that their expectations and requirements are met. You should document how you have identified and are monitoring and mitigating risks.
Q15 Is there a contingency plan prepared if your mitigation plans fail? You should plan for risks that you cannot fully mitigate. Include risks that have a high impact on the analysis but a low probability of happening. Without well-designed contingencies, you put quality at risk.
Q16 Do the data and analysis comply with ethical requirements? Analysis must comply with ethics standards to ensure public confidence. You must consider the ethical implications of the analysis when you create the workflow and report your findings.
Q17 What relevant questions are outside the scope of the analysis? Limiting the scope of analysis shapes the quality of outputs and what can be done with them. By being clear about the limitations of the analysis we can mitigate or accept them. Limitations must be documented so everybody using the analysis is aware of them.
Q18 How will you peer review and assure the analysis? Internal audit and peer review are critical for monitoring and assuring that the analysis is performed appropriately and meets the required aims and objectives. The outcomes of peer review should be documented and relevant actions and recommendations should be prioritised, addressed, and taken forward.
Q19 Will external experts be involved in development and scrutiny of analysis? You should commission external specialists to peer review or audit the analysis in proportion to risks around use. They should be able to draw on expertise and experience across government and beyond to get feedback, exchange experience and suggest best practice to improve the analysis.
Quality Question Which Code practice(s) are most relevant here?
*Trustworthiness (T), Quality (Q), Value (V)
Q1 What question does the analysis try to answer? V8.2 Actively engage key users of your statistics, such as those in academia, business, civil society, the media and public bodies, to identify the most important questions the statistics need to answer. Report on the findings and your decisions in your annual statistical work programme
Q2 Why do you need to answer this analysis question? V8.2 Actively engage key users of your statistics, such as those in academia, business, civil society, the media and public bodies, to identify the most important questions the statistics need to answer. Report on the findings and your decisions in your annual statistical work programme
Q3 Which organisational priorities does this analysis address? T2.4 Seek the approval of the Chief Statistician/Head of Profession for Statistics for major revisions to statistics; new official statistics and official statistics in development; to cease the production of official statistics that are no longer viable or required; and when seeking a change to the accreditation of official statistics
Q4 If you use a model, is it business critical? Q6.1 Produce statistics to a suitable level of quality that means they meet their intended uses and are not misleading
Q5 Who needs the answer to the analysis question? V8.2 Actively engage key users of your statistics, such as those in academia, business, civil society, the media and public bodies, to identify the most important questions the statistics need to answer. Report on the findings and your decisions in your annual statistical work programme
Q6 Who do you need to consult to make sure you meet the right user needs? V8.2 Actively engage key users of your statistics, such as those in academia, business, civil society, the media and public bodies, to identify the most important questions the statistics need to answer. Report on the findings and your decisions in your annual statistical work programme

V8.3 Gain views from a range of users to inform decisions on your work programme, including when statistics are started, stopped or changed, being clear on where and why user needs can and cannot be met, such as addressing information gaps. Involve users in the ongoing development and testing of statistics
Q7 How will you know you have answered the analysis question correctly? Q6.9 Use a proportionate quality assurance approach across production and release processes. Validate statistics through comparison with other relevant statistics and data sources where possible
Q8 What is the estimated time and resource required to answer the analysis question (in months and FTE)? T2.7 Recruit suitably skilled staff and apply an appropriate competency framework. Have clear roles and responsibilities for these staff

T2.8 Provide sufficient resources and time to enable staff to develop skills, knowledge and competencies, including training on applying the Code, secure data handling and quality management

T3.5 Release on a timely basis, meeting the needs of users as far as possible and as soon as the statistics are ready, under the guidance of the Chief Statistician/Head of Profession for Statistics
Q9 What is the impact if the analysis is not done now? T2.4 Seek the approval of the Chief Statistician/Head of Profession for Statistics for major revisions to statistics; new official statistics and official statistics in development; to cease the production of official statistics that are no longer viable or required; and when seeking a change to the accreditation of official statistics
Q10 What is the impact if the analysis is not done correctly? Q5.5 Periodically review the effectiveness of your processes and quality management approach and be open about findings and planned improvements
Q11 Name the Commissioner, Approver and Assurer of this analysis? T2.7 Recruit suitably skilled staff and apply an appropriate competency framework. Have clear roles and responsibilities for these staff
Q12 What tools and resources will you use in production? Are they the best for the job? T2.8 Provide sufficient resources and time to enable staff to develop skills, knowledge and competencies, including training on applying the Code, secure data handling and quality management
Q13 Do you have the right internal and external resources and capability to deliver the analysis? T2.8 Provide sufficient resources and time to enable staff to develop skills, knowledge and competencies, including training on applying the Code, secure data handling and quality management
Q14 What are the anticipated risks of the analysis? Have you discussed these risks with customers and stakeholders? Q6.9 Use a proportionate quality assurance approach across production and release processes. Validate statistics through comparison with other relevant statistics and data sources where possible

V8.3 Gain views from a range of users to inform decisions on your work programme, including when statistics are started, stopped or changed, being clear on where and why user needs can and cannot be met, such as addressing information gaps. Involve users in the ongoing development and testing of statistics
Q15 Is there a contingency plan prepared if your mitigation plans fail? Q5.1 Promote and apply appropriate quality standards, taking account of how quality can change
Q16 Do the data and analysis comply with ethical requirements? T1.2 Handle data and statistics with honesty and integrity, in ways that serve the public good
Q17 What relevant questions are outside the scope of the analysis? Q7.1 Prominently communicate the quality of the statistics and the strengths and limitations that impact their use, reflecting the needs of different types of users

Q7.3 Explain the nature of data sources and why they were selected, anticipating possible areas of misunderstanding or misuse. Prominently communicate limitations in the underlying data and explain their impact on the statistics
Q18 How will you peer review and assure the analysis? Q5.4 Work collaboratively with data supply partners, other producers, topic experts and other partners to develop a common understanding of quality matters. Welcome and seek their input on ways to improve quality

Q6.9 Use a proportionate quality assurance approach across production and release processes. Validate statistics through comparison with other relevant statistics and data sources where possible
Q19 Will external experts be involved in development and scrutiny of analysis? Q6.8 Collaborate with experts, other analysts and statistics producers in the UK and internationally where appropriate and share best practice
Quality Question Which AQuA role(s) would normally answer this? Why are these AQuA roles involved?
Q1 What question does the analysis try to answer? Commissioner, Analyst The commissioner sets out the commission. They work with the analyst team to ensure that everyone has a common understanding of the problem.
Q2 Why do you need to answer this analysis question? Commissioner, Analyst, Assurer The analyst must document the purpose of the analysis and the levels of quality and certainty that are needed to meet user requirements. The commissioner and assurer make sure the analysis aligns with the stated purpose.
Q3 Which organisational priorities does this analysis address? Commissioner The commissioner makes sure that key aspects of the problem, scope and complexities, including programme constraints, are captured and clearly communicated. They also make sure there is proportionate governance in place to support the analysis and its role in the wider project or programme
Q4 If you use a model, is it business critical? Commissioner, Approver Business critical models must be managed appropriately so that the right specialists are responsible for developing, using and assuring them.

Decision-makers need sufficient assurance from an appropriate level in the organisation that the model is fit for purpose before using it to inform a decision. For business critical analysis and modelling, the commissioner should be satisfied with the seniority of the assurer.
Q5 Who needs the answer to the analysis question? Commissioner, Analyst, Assurer The commissioner makes sure that the right stakeholders have been identified so that the scope and boundaries of the analysis can be appropriately explored. The analyst team and assurer should also contribute.
Q6 Who do you need to consult to make sure you meet the right user needs? Analyst, Commissioner Analysts should explore the analysis requirements and scope with all relevant stakeholders to make sure a wide range of perspectives are sought. The commissioner should be aware and briefed.
Q7 How will you know you have answered the analysis question correctly? Commissioner, Analyst During the design and conduct of analysis, the commissioner should set out details like the level of precision, accuracy and uncertainty that are needed.
Q8 What is the estimated time and resource required to answer this analysis question (in months and FTE)? Commissioner, Approver, Analyst team During commissioning and scoping, the commissioner and analyst will need to make trade-offs between time, resources and quality. They should work together to agree and document the right balance across these constraints.
Q9 What is the impact if the analysis is not done now? Commissioner The commissioner makes sure that there is sufficient time and resource for the required level of assurance to be delivered and that they understand the risks when time and resource pressures are unavoidable.
Q10 What is the impact if the analysis is not done correctly? Commissioner, Analyst The commissioner makes sure the analyst team understands the context for the analysis question. This helps the analyst to understand and assess likely risks and determine the right analytical and quality assurance response.
Q11 Name the Commissioner, Approver and Assurer of this analysis? Commissioner During scoping of the analysis, the commissioner makes sure there is proportionate governance in place to support the analysis and its role in the wider project or programme.
Q12 What tools and resources will you use in production? Are they the best for the job? Commissioner The commissioner makes sure that there is enough time and resource for the required level of assurance to be delivered. They must be confident that they understand the risks when time and resource pressures are unavoidable.
Q13 Do you have the right internal and external resources and capability to deliver the analysis? Commissioner The commissioner makes sure that there is enough time and resource for the required level of assurance to be delivered. They must be confident that they understand the risks when time and resource pressures are unavoidable.
Q14 What are the anticipated risks of the analysis?

Have you discussed these risks with customers and stakeholders?
Assurer, Commissioner, Analyst The analytical assurer should challenge and test the understanding of the problem. The commissioner and analyst work with the assurer to make sure that all share a common understanding.
Q15 Is there a contingency plan prepared if your mitigation plans fail? Commissioner, Assurer The commissioner makes sure that there is enough time and resource for the required level of assurance to be delivered. They must be confident that they understand the risks when time and resource pressures are unavoidable.

If there is a need for urgent action, such as mitigation of unacceptable but uncertain risks, the commissioner may ask for further analysis. They might also commission extra evidence-gathering in parallel to inform the policy response when uncertainty is reduced. The assurer must advise the commissioner on whether sufficient analytical quality assurance has happened and inform them about any outstanding risks.
Q16 Do the data and analysis comply with ethical requirements? Analyst, Commissioner, Assurer The analyst should make sure that there is appropriate ethical approval for the analysis. The commissioner and assurer should be informed.
Q17 What relevant questions are outside the scope of the analysis? Commissioner, Analyst The commissioner and the analyst work together at the scoping stage to get a clear understanding of analytical requirements. During scoping, the commissioner makes sure that the right aspects of the problem, scope and complexities, including programme constraints, are captured and clearly communicated.
Q18 How will you peer review and assure the analysis? Assurer, Commissioner, Analyst The assurer makes sure quality assurance plans for the analysis are appropriate for the decision it supports. All analysis requires some level of quality assurance. Analyst and commissioner should be involved.
Q19 Will external experts be involved in development and scrutiny of analysis? Assurer, Commissioner, Analyst The assurer should challenge the proposed approach and check that it delivers as intended and meets customer needs. It is good practice to engage subject matter experts in this review. The analyst and commissioner should be involved.

II. Design

Quality Question Why do I need to know the answer to this?
Q20 Is there a simple, plain English description of what the analysis is for and what it does? Writing a plain English description of what the analysis is for and how it works means that everybody in the team, including new starters, and others with no subject or technical expertise can understand the purpose of the analysis and what it does.
Q21 Does the analysis have a logic flowchart which explains the end-to-end steps in the workflow? Setting out a clear summary of the analysis process in a diagram helps the team, users and customers to understand at a glance what the analysis does, where inputs come from, how they are processed and how it generates outputs.
Q22 When do you expect to start and finish each stage of analysis: data collection, processing, quality assurance, analysis and dissemination? Clearly setting out the time you need to perform each stage of analysis helps you to evaluate if the time allocated to each stage is right and plan mitigation if plans look too ambitious.
Q23 Does any part of the analysis rely on manual processing? Have you considered the costs and benefits of fully automating the process? Manual processes are inefficient and riskier than well-designed automated ones. Analysis with manual steps like copying and pasting data between files, manually updating cells in tables, or moving data between software packages is harder to assure and carries extra quality risks.
Q24 What happens if team members, reviewers or users find a mistake in the analysis?

Do you have a clear and efficient process for addressing issues and preventing them from happening again?
Mistakes happen in analysis. You should have a clear and efficient process for reporting, documenting and addressing errors. Being open and honest about problems and working together to solve them in a supportive way creates an atmosphere of openness in the team and is critical for upholding users’ trust in your output.

Teams should take reasonable steps to understand and document how and why errors came about, and mitigate the risk of them happening again. Your mitigation approach should be consistent with your department’s revision policies.
Q25 Have you assessed uncertainty? Consider how uncertainty impacts on all stages of the analysis. Think about quantifying and measuring uncertainty as early as possible. Identify and review sources of uncertainty regularly.

If you delay the assessment of uncertainty until late in the process, it is often difficult and costly to mitigate risks. In some cases, you may need to revise methods or alter commissioning decisions. If a potential source of uncertainty is overlooked, this can limit the usefulness and impact of the analysis.
Quality Question Which Code practice(s) are most relevant here?
*Trustworthiness (T), Quality (Q), Value (V)
Q20 Is there a simple, plain English description of what the analysis is for and what it does? V9.4 Explain how the statistics add value and serve the public good, to demonstrate and help users and potential users understand how they could inform decision making
Q21 Does the analysis have a logic flowchart which explains the end-to-end steps in the workflow? V10.2 Make sure statistics, data and related guidance are easily accessible. Provide other relevant information, such as metadata and coding where appropriate
Q22 When do you expect to start and finish each stage of analysis: data collection, processing, quality assurance, analysis and dissemination? Q6.1 Verify that the statistics are representative and of suitable quality and monitor relevant quality dimensions for both input data and the statistics, such as completeness and validity, accuracy and reliability, coherence and comparability, and timeliness. Quantify statistical error, including bias, and produce measures of confidence where possible

T3.5 Release on a timely basis, meeting the needs of users as far as possible and as soon as the statistics are ready, under the guidance of the Chief Statistician/Head of Profession for Statistics
Q23 Does any part of the analysis rely on manual processing? Have you considered the costs and benefits of fully automating the process? Q5.2 Provide a supportive environment to enable staff to propose improvements in ways of working and raise quality concerns

T2.8 Provide sufficient resources and time to enable staff to develop skills, knowledge and competencies, including training on applying the Code, secure data handling and quality management
Q24 What happens if team members, reviewers or users find a mistake in the analysis? Do you have a clear and efficient process for addressing issues and preventing them from happening again? Q5.2 Provide a supportive environment to enable staff to propose improvements in ways of working and raise quality concerns

T3.9 Release revisions and corrections of errors transparently and as soon as possible in line with the organisation’s published policy, being clear about the nature and scale of change
Q25 Have you assessed uncertainty? Q7.2 Report on the key quality dimensions, such as accuracy and timeliness, and, where possible, give estimates of error and confidence for the statistics. Summarise how uncertainty in the estimates may impact use by using qualifying words, numbers or graphics
Quality Question Which AQuA role(s) would normally answer this? Why are these AQuA roles involved?
Q20 Is there a simple, plain English description of what the analysis is for and what it does? Analyst, Assurer The analyst should produce sufficient design documentation. Best practice can include a description of the analysis, user requirements, design specification, functional specification, data dictionary, and test plan. The assurer should give feedback to make sure documentation is fit for purpose.
Q21 Does the analysis have a logic flowchart which explains the end-to-end steps in the workflow? Analyst, Assurer The analyst should produce sufficient design documentation. Best practice can include a description of the analysis, user requirements, design specification, functional specification, data dictionary, and test plan. The assurer should give feedback to make sure documentation is fit for purpose.
Q22 When do you expect to start and finish each stage of analysis: data collection, processing, quality assurance, analysis and dissemination? Commissioner, Analyst The analyst and commissioner should work together during scoping to set out and agree trade-offs between time, resources and quality and establish the optimal balance of these constraints.
Q23 Does any part of the analysis rely on manual processing? Have you considered the costs and benefits of fully automating the process? Commissioner, Analyst The analyst should use a risk-based approach to understand the areas of greatest potential error and focus assurance efforts on these areas. The analyst should brief the commissioner so they understand the impact of any reduction in the thoroughness of analytical quality assurance activities.
Q24 What happens if team members, reviewers or users find a mistake in the analysis?

Do you have a clear and efficient process for addressing issues and preventing them from happening again?
Analyst, Assurer The analyst should use a risk-based approach to highlight the areas of greatest potential error and focus assurance efforts on these areas.
Q25 Have you assessed uncertainty? Commissioner, Analyst Commissioners should expect and require information about uncertainty from analysts. They should challenge them when it is absent, inadequate or ambiguous.

Commissioners may have identified sources of uncertainty as part of their wider considerations and should share them with the analyst. If the commissioner can explain in advance the impact on decision-making of different degrees of uncertainty, this can help the analyst to design and carry out the analysis at a proportionate level.

III. Analysis

Quality Question Why do I need to know the answer to this?
Q26 How will the data in the analysis be processed before and during use? Processing the data inputs will impact methods and outputs. A clear understanding of how these processes affect the workflow is essential for understanding quality.
Q27 Is the data appropriate given the methods selected? A comprehensive understanding of data inputs is a prerequisite for meeting user needs.
Q28 What are the strengths and limitations of the data you use? Data requirements for analysis vary. Formats, coverage, time scales and granularity must all be appropriate for the research question.

A comprehensive understanding of data inputs is a prerequisite for meeting user needs. Without understanding the strengths and weaknesses of the data, it is impossible to make meaningful improvements to the analysis or the inputs to manage these limitations.
Q29 Is there a robust relationship between your team and data providers?

Do data providers understand how and why you use their data?
A good relationship with data suppliers helps to make sure that their data meet your requirements. Lack of communication can mean you are not aware of quality risks or changes in collection or processing steps that can affect your results.

You should communicate with your suppliers sufficiently to manage input quality. Data providers should have a good understanding of how and why you are using their data. This helps them to improve data quality and value and find and address gaps or issues that are relevant for your analysis.
Q30 Do you understand how data providers collect, process and quality assure the data you use? Never assume that datasets are of sufficient quality. Make sure that suppliers give you the metadata and other supporting information you need to assure the quality of the data. Validate the information provided by suppliers using your own checks and confirmation if appropriate.
Q31 Is there a formal agreement to set out data content, when and how you will get the data? If not, why not? A formal service level agreement with data providers makes sure that everybody understands what will be delivered, when and how. This is useful for setting out the division of responsibilities between data providers and your team for getting and sharing the data. It might specify formats, delivery, timescale, legal framework, accompanying metadata and quality checks.
Q32 Do you know what quality checks are carried out on the data before you receive them? Data suppliers should be able to show that their data is sufficiently assured to meet your needs. You should be able to demonstrate that the data meet your needs and that reported quality matches what you observe in practice. Simply having a quality report is not enough.
Q33 How will you work with your data provider when your data requirements change? Review your data requirements regularly to ensure they are still relevant and feasible. Changes to requirements should be communicated to data providers well in advance and agreed by all stakeholders. If there is a formal agreement, it may need to be revised as requirements change.
Q34 How do you know if your data provider changes their systems or processes in a way that could impact the data you receive or the analysis you produce? Data suppliers make changes to their definitions, methods and systems. This may not affect the quality of their data but can affect how you process the data and what you can infer from it. Tailor communication with data suppliers so it is sufficiently frequent, effective and ongoing to get timely information about changes.
Q35 How did you choose the methods for the analysis? How do you know the methods you use are appropriate? You should be able to explain why you chose your methods. For each method, document the underlying assumptions, why the method is suitable for answering the analysis question, why it is applicable to the type and distribution of data you are using, and how these decisions were signed off.
Q36 Have reasonable alternative methods been explored and rejected for good reasons? There is often more than one way to answer a question with data. When you have made choices about methods and approaches, explain how and why you considered and rejected other options. Unless there is evidence underpinning your choice, users cannot be sure that you have chosen the most suitable methods.
Q37 How do you know that your analysis is working correctly? You need to be sure that your analysis produces the outputs you think it should and the processes run as expected. If you cannot demonstrate that scripts and processes work correctly, you cannot confirm the quality of the results.
Q38 Can you describe the assumptions of your analysis, when they were made and who made them and signed them off? You must understand the assumptions your analysis makes. Assumptions set out how the analysis simplifies the world and mitigates uncertainty. If assumptions are inadequately set out or absent, important characteristics of the analysis and its inputs will be unclear, greatly increasing risk.

Without a comprehensive log of assumptions, an audit trail signed off by assumption owners, a version control record showing when assumptions were last updated, and evidence of internal and external validation, uncertainties may go unacknowledged and could drastically affect outputs.
Q39 How are assumptions validated and assured before you apply them? A clear understanding of how assumptions have been externally and internally validated and signed off gives us confidence that they are reasonable.
Q40 How do you measure and report uncertainty in your analysis? All analysis contains uncertainty. Quantifying and reporting uncertainty means we can inform users how precise reported values are and how much confidence they can have in the analysis. It will also help you to determine where the analysis can be improved.

There are many ways to quantify uncertainty in input data, assumptions, processes and outputs. Choose appropriate ones for your situation. For instance, uncertainties can be understood and quantified by comparing against similar or historical data, Monte Carlo simulation, break-even analysis or using expert judgement.
Q41 Have you considered the implications of relevant, unquantified uncertainties? A good understanding of the uncertainties in the analysis workflow is critical to ensure the analysis and its outputs are fit for purpose.
Q42 Can you explain the impact of your analysis on downstream processes? Are there any risks around these dependencies? Understanding how your analysis might be used helps ensure that the right quality and assurance levels are in place. It helps you assess the risks around the use of the analysis and whether there are other stakeholders or users you need to consult.
Q43 Is all or part of the analysis reliant on a single person? Single points of failure carry significant business risk. If only one person understands how to carry out all or part of the analysis or maintain the code then the process is extremely vulnerable.
Q44 Is it clear why important decisions about the analysis were made, who made them and when? All analysis involves decisions. A comprehensive record of the decisions made in specifying and conducting the analysis ensures a full audit trail of why decisions were made, who made them and signed them off.
Q45 If changes need to be made to code or datasets, is it easy to track who made the changes and when and why they were made? Good version control ensures a full understanding of when, why, and how changes were made to your analysis process. If it is hard to track changes, it will be hard to retrace steps when there is a problem, and it suggests you do not fully understand the process.
Q46 Would another analyst be able to reproduce your analysis output or continue the work without talking to you first? Your analysis must be well documented and repeatable so that somebody new can understand it, use it, and produce the same output with the same inputs. Poor documentation can lead to errors.
Q47 Do you use internal peer review to check scripts and code, documentation, implementation of methods, processes and outputs? You should independently review and validate the logical integrity of your analysis as well as the structure and functionality of the code against the research question.

A record of validation and verification activities undertaken, outstanding tasks and remedial actions helps to confirm that the correct analysis has been performed for the required purpose and the chosen approach minimises risk.
Q48 Is your code and analysis ever peer reviewed by someone outside your team or organisation? External peer review is one of the best ways to ensure that the analysis and code are well made and fit for purpose. Without it, teams can reinforce their own biases and may not notice there is anything wrong.
Q49 What is your assessment of the quality of your analytical outputs? Understanding and reporting on the quality of your analysis is critical to ensure fitness for purpose and maintain trust and reliability. This ensures that analysis can appropriately inform decision-making. Quality assessments are key information to share with users and a requirement of the Code of Practice for Statistics.
Q50 How do you assure yourselves that the analysis you do is correct? If you check that results fit with your expectations and you can explain discrepancies, this makes it easier to mitigate risk.

There are many ways to check if analysis is carried out correctly. For example, sensitivity analysis can help you understand which inputs have the greatest effect on the outputs. You can also compare figures from the analysis with similar data from other sources or from historical series.
Q51 Do the outputs of your analysis align with similar findings from elsewhere? If not, can you explain why? If you can, check that your outputs align with findings from previous runs of the analysis, alternate data sources, and comparable studies. This gives you confidence that the analysis works as expected. You should be able to explain any inconsistencies that you see.
Q52 If you find outliers or unusual trends in the data, what steps do you take to investigate them? It is crucial to check unusual trends and values in the data and understand why they are there. Not all outliers are the same. Some have a strong influence, some not at all. Some are valid and important data values. Others might be errors.

Investigate outliers and unusual patterns thoroughly and take reasonable steps to check their impact on your final output. If you choose to exclude unusual values, you should explain why this is acceptable.
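As an illustration of the Monte Carlo approach mentioned under Q40, the sketch below propagates uncertainty in two hypothetical input assumptions (a unit cost and a level of demand) through a simple cost model. The distributions, parameters and names here are invented for the example; in practice they would come from your documented, signed-off assumptions.

```python
import random
import statistics

def simulate_total_cost(n_draws: int = 10_000, seed: int = 42) -> list[float]:
    """Draw from assumed input distributions and propagate them
    through a simple model (total cost = unit cost * demand)."""
    rng = random.Random(seed)  # fixed seed so the run is repeatable
    totals = []
    for _ in range(n_draws):
        unit_cost = rng.gauss(mu=5.0, sigma=0.5)    # assumed cost per unit
        demand = rng.gauss(mu=1_000.0, sigma=100.0) # assumed units required
        totals.append(unit_cost * demand)
    return totals

draws = sorted(simulate_total_cost())
mean = statistics.mean(draws)
low = draws[int(0.025 * len(draws))]   # empirical 2.5th percentile
high = draws[int(0.975 * len(draws))]  # empirical 97.5th percentile
print(f"central estimate: {mean:,.0f}; 95% interval: {low:,.0f} to {high:,.0f}")
```

Reporting the interval alongside the central estimate, rather than a single figure, is what lets users judge how much confidence to place in the output.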
Quality Question Which Code practice(s) are most relevant here?
*Trustworthiness (T), Quality (Q), Value (V)
Q26 How will the data in the analysis be processed before and during use? Q7.4 Be clear about the methods used. Explain quality issues related to the methods, systems and processes, including the extent to which the statistics are representative and comparable across the UK and internationally. Describe potential bias and steps taken to address it
Q27 Is the data appropriate given the methods selected? Q7.3 Explain the nature of data sources and why they were selected, anticipating possible areas of misunderstanding or misuse. Prominently communicate limitations in the underlying data and explain their impact on the statistics
Q28 What are the strengths and limitations of the data you use? Q6.2 Use the most suitable data for what needs to be measured. Monitor for changes in the data sources and potential bias in the data. Explain any issues and their implications for use of the data in producing statistics

Q7.3 Explain the nature of data sources and why they were selected, anticipating possible areas of misunderstanding or misuse. Prominently communicate limitations in the underlying data and explain their impact on the statistics
Q29 Is there a robust relationship between your team and data providers? Do data providers understand how and why you use their data? Q6.4 Maintain constructive relationships with those involved in the data provision, statistics preparation and quality assurance processes. Be clear about your data supply and quality requirements and understand how these will be met. Where possible, provide feedback to data suppliers on your use of their data
Q30 Do you understand how data providers collect, process and quality assure the data you use? Q5.4 Work collaboratively with data supply partners, other producers, topic experts and other partners to develop a common understanding of quality matters. Welcome and seek their input on ways to improve quality

Q6.4 Maintain constructive relationships with those involved in the data provision, statistics preparation and quality assurance processes. Be clear about your data supply and quality requirements and understand how these will be met. Where possible, provide feedback to data suppliers on your use of their data
Q31 Is there a formal agreement to set out data content, when and how you will get the data? If not, why not? Q6.4 Maintain constructive relationships with those involved in the data provision, statistics preparation and quality assurance processes. Be clear about your data supply and quality requirements and understand how these will be met. Where possible, provide feedback to data suppliers on your use of their data
Q32 Do you know what quality checks are carried out on the data before you receive them? Q6.4 Maintain constructive relationships with those involved in the data provision, statistics preparation and quality assurance processes. Be clear about your data supply and quality requirements and understand how these will be met. Where possible, provide feedback to data suppliers on your use of their data
Q33 How will you work with your data provider when your data requirements change? Q6.4 Maintain constructive relationships with those involved in the data provision, statistics preparation and quality assurance processes. Be clear about your data supply and quality requirements and understand how these will be met. Where possible, provide feedback to data suppliers on your use of their data
Q34 How do you know if your data provider makes a change to their systems or processes which could impact the data you receive or the analysis you produce? Q5.4 Work collaboratively with data supply partners, other producers, topic experts and other partners to develop a common understanding of quality matters. Welcome and seek their input on ways to improve quality

Q6.4 Maintain constructive relationships with those involved in the data provision, statistics preparation and quality assurance processes. Be clear about your data supply and quality requirements and understand how these will be met. Where possible, provide feedback to data suppliers on your use of their data
Q35 How did you choose the methods for the analysis? How do you know the methods you use are appropriate? Q6.7 Base methods on national or international good practice, scientific principles or professional consensus. Identify potential bias and address limitations. Use recognised standards, classifications and definitions. Explain reasons for deviations from these standards and any related implications for use
Q36 Have reasonable alternative methods been explored and rejected for good reasons? Q6.7 Base methods on national or international good practice, scientific principles or professional consensus. Identify potential bias and address limitations. Use recognised standards, classifications and definitions. Explain reasons for deviations from these standards and any related implications for use
Q37 How do you know that your analysis is working correctly? Q6.1 Verify that the statistics are representative and of suitable quality and monitor relevant quality dimensions for both input data and the statistics, such as completeness and validity, accuracy and reliability, coherence and comparability, and timeliness. Quantify statistical error, including bias, and produce measures of confidence where possible
Q38 Can you describe the assumptions of your analysis, when they were made and who made them and signed them off? Q7.4 Be clear about the methods used. Explain quality issues related to the methods, systems and processes, including the extent to which the statistics are representative and comparable across the UK and internationally. Describe potential bias and steps taken to address it
Q39 How are assumptions validated and assured before you apply them? Q6.1 Regularly review strengths and limitations in the data and statistics, including the continued suitability of data sources and methods. Be open about your decisions and reasons for change

Q7.3 Explain the nature of data sources and why they were selected, anticipating possible areas of misunderstanding or misuse. Prominently communicate limitations in the underlying data and explain their impact on the statistics

Q7.4 Be clear about the methods used. Explain quality issues related to the methods, systems and processes, including the extent to which the statistics are representative and comparable across the UK and internationally. Describe potential bias and steps taken to address it
Q40 How do you measure and report uncertainty in your analysis? Q6.1 Verify that the statistics are representative and of suitable quality and monitor relevant quality dimensions for both input data and the statistics, such as completeness and validity, accuracy and reliability, coherence and comparability, and timeliness. Quantify statistical error, including bias, and produce measures of confidence where possible
Q41 Have you considered the implications of relevant, unquantified uncertainties? Q7.2 Report on the key quality dimensions, such as accuracy and timeliness, and, where possible, give estimates of error and confidence for the statistics. Summarise how uncertainty in the estimates may impact use by using qualifying words, numbers or graphics
Q42 Can you explain the impact of your analysis on downstream processes? Are there risks around these dependencies? Q7.1 Prominently communicate the quality of the statistics and the strengths and limitations that impact their use, reflecting the needs of different types of users
Q43 Is all or part of the analysis reliant on a single person? T2.7 Recruit suitably skilled staff and apply an appropriate competency framework. Have clear roles and responsibilities for these staff

T2.8 Provide sufficient resources and time to enable staff to develop skills, knowledge and competencies, including training on applying the Code, secure data handling and quality management
Q44 Is it clear why important decisions about the analysis were made, who made them and when? Q6.7 Base methods on national or international good practice, scientific principles or professional consensus. Identify potential bias and address limitations. Use recognised standards, classifications and definitions. Explain reasons for deviations from these standards and any related implications for use
Q45 If changes need to be made to code or datasets, is it easy to track who made the changes and when and why they were made? V10.5 Support the reuse of data and statistics, preventing barriers to use where possible. Ensure statistics are reproducible. Support data and statistics to be shared, accessed and linked, using common data standards with associated metadata
Q46 Would another analyst be able to reproduce your analysis output or continue the work (without talking to you first)? V10.5 Support the reuse of data and statistics, preventing barriers to use where possible. Ensure statistics are reproducible. Support data and statistics to be shared, accessed and linked, using common data standards with associated metadata
Q47 Do you use internal peer review to check scripts and code, documentation, implementation of methods, processes and outputs? Q5.5 Periodically review the effectiveness of your processes and quality management approach and be open about findings and planned improvements

T4.6 Hold regular reviews of the data management arrangements used and share best practice across the organisation to ensure data protection procedures remain effective. Keep pace with changing circumstances such as technological advances
Q48 Is your code and analysis ever peer reviewed by someone outside your team or organisation? Q5.5 Periodically review the effectiveness of your processes and quality management approach and be open about findings and planned improvements

T4.6 Hold regular reviews of the data management arrangements used and share best practice across the organisation to ensure data protection procedures remain effective. Keep pace with changing circumstances such as technological advances
Q49 What is your assessment of the quality of your analytical outputs? Q6.1 Verify that the statistics are representative and of suitable quality and monitor relevant quality dimensions for both input data and the statistics, such as completeness and validity, accuracy and reliability, coherence and comparability, and timeliness. Quantify statistical error, including bias, and produce measures of confidence where possible

Q6.9 Use a proportionate quality assurance approach across production and release processes. Validate statistics through comparison with other relevant statistics and data sources where possible
Q50 How do you assure yourself that the analysis you do is correct? Q6.1 Regularly review strengths and limitations in the data and statistics, including the continued suitability of data sources and methods. Be open about your decisions and reasons for change

Q6.1 Verify that the statistics are representative and of suitable quality and monitor relevant quality dimensions for both input data and the statistics, such as completeness and validity, accuracy and reliability, coherence and comparability, and timeliness. Quantify statistical error, including bias, and produce measures of confidence where possible
Q51 Do the outputs of your analysis align with similar findings from elsewhere? If not, can you explain why? Q6.9 Use a proportionate quality assurance approach across production and release processes. Validate statistics through comparison with other relevant statistics and data sources where possible
Q52 If you find outliers or unusual trends in the data, what steps do you take to investigate them? Q5.3 Promote the sharing of good practice and examples of effective quality management. Learn from both mistakes and good practice and conduct timely reviews of quality concerns

Q6.1 Verify that the statistics are representative and of suitable quality and monitor relevant quality dimensions for both input data and the statistics, such as completeness and validity, accuracy and reliability, coherence and comparability, and timeliness. Quantify statistical error, including bias, and produce measures of confidence where possible
Quality Question Which AQuA role(s) would normally answer this? Why are these AQuA roles involved?
Q26 How will the data used in the analysis be processed before and during use? Analyst, Assurer The analyst should collect and manage the data. They must understand data accuracy and uncertainties and capture, manage and understand assumptions. Assurer should check that data processing is sufficient to ensure fitness for purpose.
Q27 Is the data appropriate given the methods selected? Analyst, Assurer, Commissioner The analyst should understand data accuracy and uncertainties and capture, manage and understand assumptions made. The analyst should engage appropriate subject matter experts at the appropriate time. The commissioner may be a subject matter expert. The assurer should check that there is sufficient assurance around the choice of data.
Q28 What are the strengths and limitations of the data you use? Analyst, Assurer If applicable, analyst should undertake parametric analysis to understand the consequences of missing or uncertain data and assumptions. Assurer should make sure there is sufficient consideration of strengths and limitations of data.
Q29 Is there a robust relationship between your team and data providers? Do data providers understand how and why you use their data? Analyst, Assurer The assurer should expect to see evidence that there has been sufficient dialogue between analysts and the providers of data and other evidence sources.
Q30 Do you understand how data providers collect, process and quality assure the data you use? Analyst, Assurer The analyst should ensure data formats, units, and context are properly understood and handled. They should design and implement quality checks to validate data inputs as required. Assurer should verify that the right assurance is in place.
Q31 Is there a formal agreement to set out data content, when and how you will get the data? If not, why not? Commissioner, Analyst, Assurer The commissioner may need to provide the analyst with agreement to use specific data. The analyst should ensure data formats, units, and context are properly understood and handled. Assurer should verify that assurance is in place.
Q32 Do you know what quality checks are carried out on the data before you receive them? Analyst, Assurer The analyst should understand data accuracy and uncertainties and capture, manage and understand implicit assumptions made. The assurer should assess whether assurance is sufficient.
Q33 How will you work with your data providers when your data requirements change? Commissioner, Analyst, Assurer During the design and conduct of analysis, the commissioner may need to provide the analyst with information, agreement to use resources or confirmation of assumptions or approach. The analyst should understand data accuracy and uncertainties and capture, manage and understand implicit assumptions made. The assurer checks that assurance and mitigation are sufficient.
Q34 How do you know if your data provider changes their systems or processes in a way that could impact the data you receive or the analysis you produce? Commissioner, Analyst, Assurer During the design and conduct of analysis, the commissioner provides the analyst with the information they need for the analysis to proceed. This could include agreement to use datasets, setting out of key assumptions and signing off assumptions developed during the project. Analyst should understand data accuracy and uncertainties and capture, manage and understand assumptions made. The assurer checks that assurance and mitigation are sufficient.
Q35 How did you choose the methods for the analysis? How do you know the methods you use are appropriate? Analyst, Assurer During the design phase, the analyst will convert the commission into an analytical plan and will consider inputs, analytical methods and processes, and expected outputs. The assurer should check that the proposed design meets the commissioner’s requirements and is sufficiently assured.
Q36 Have reasonable alternative methods been explored and rejected for good reasons? Analyst, Assurer The analyst should review the analysis as a whole and consider carefully whether there are other, better ways in which it could be done. The assurer should check that the investigation of methods was sufficiently thorough and proportionate.
Q37 How do you know that your analysis is working correctly? Analyst, Assurer The analyst should validate that the analysis is set up to answer the specification of the commissioner. The assurer checks that assurance and mitigation are sufficient so that the analysis is fit for purpose.
Q38 Can you describe the assumptions of your analysis, when they were made and who made them and signed them off? Analyst, Assurer, Commissioner The analyst should capture, manage and understand explicit and implicit assumptions made. The assurer should assess whether these are sufficient. The commissioner should be made aware of key assumptions and confirm that they are happy that the assumptions are applied.
Q39 How are assumptions validated and assured before you apply them? Analyst, Assurer If applicable, analyst should undertake parametric analysis to understand the consequences of missing or uncertain assumptions. Assurer should check that validation and assurance of assumptions is sufficient.
Q40 How do you measure and report uncertainty in your analysis? Analyst, Commissioner, Assurer Analyst should determine and communicate the uncertainty associated with outputs so that the commissioner can make informed decisions. The range of possible outcomes and their relative likelihoods should be described. The assurer checks that measuring and reporting of uncertainty is sufficient to meet the needs of the commissioner.
Q41 Have you considered the implications of relevant, unquantified uncertainties? Analyst, Commissioner If uncertainties are too complex for analysts to quantify, even approximately, the analysts should say so in order that the commissioner can take this into account.
Q42 Can you explain the impact of your analysis on downstream processes? Are there risks around these dependencies? Analyst, Assurer Analyst should make sure that the implications of data dependencies or relationships to other analysis and methods are understood. Assurer should check that dependencies have been properly considered.
Q43 Is all or part of the analysis reliant on a single person? Assurer, Analyst, Commissioner Analysis should be peer reviewed at an appropriate and proportionate level by a competent person. Commissioner, analyst and assurer should all be involved in each stage of the analytical cycle.
Q44 Is it clear why important decisions about the analysis were made, who made them and when? Assurer, Analyst The assurer should make sure that a suitable audit trail is in place that clarifies the level of validation, scope, and risks associated with the analysis. Best practice includes the production of validation log books. Analyst should build this audit trail.
Q45 If changes need to be made to code or datasets, is it easy to track who made the changes and when and why they were made? Assurer, Analyst To make analytical audit easy, you should set up a version control system for the analysis as a whole and for code, supporting data and assumptions. Best practice includes the production of validation log books. The analyst should build this audit trail. The assurer should ensure that the audit trail clarifies the level of validation, scope, and risks associated with the analysis.
Q46 Would another analyst be able to reproduce your analysis output or continue the work without talking to you first? Analyst, Assurer Good quality analysis is reproducible. The analyst should check that the analytical process reflects the principles of RIGOUR (Repeatable, Independent, Grounded in reality, Objective, Uncertainty-managed, Robust).
Q47 Do you use internal peer review to check scripts and code, documentation, implementation of methods, processes and outputs? Analyst, Assurer The analyst should provide proportionate documentation that explains the verification and validation activities that the analysis is subjected to. Analysts must perform appropriate tests to check the analysis. They should commission other verification and validation as required. The assurer should confirm that planned validation and verification are sufficient.
Q48 Is your code and analysis ever peer reviewed by someone outside your team or organisation? Analyst, Assurer Analysts should work with the commissioner to set out the analysis question so that appropriate analysis is done. Some analysis may require external specialists, so analysts may have responsibilities as part of the procurement process. Analysts, including third parties providing analysis, should provide proportionate documentation describing the verification and validation activities undertaken and associated conclusions. The assurer advises the commissioner on whether appropriate analytical quality assurance has taken place.
Q49 What is your assessment of the quality of your analytical outputs? Commissioner, Assurer As part of the delivery phase, the commissioner should ensure there is an assessment of the level of analytical quality assurance of the analysis, noting where there have been trade-offs between time, resources and quality. The assurer advises the commissioner on whether appropriate analytical quality assurance has taken place.
Q50 How do you assure yourself that the analysis you do is correct? Commissioner, Analyst, Assurer The analyst must build in checks and processes to ensure that the analysis is correct. During the delivery phase, the commissioner should give feedback to assist in the correct interpretation of results and determine if the analysis has addressed the commission. The analyst should work with the assurer while doing the analysis so that they can comment on whether the analysis meets the needs of the commission and ensure best use of the results.
Q51 Do the outputs of your analysis align with similar findings from elsewhere? If not, can you explain why? Commissioner, Analyst When interpreting the results of a piece of analysis, the commissioner provides constructive challenge. They work with the analyst to explore whether further analysis is needed.
Q52 If you find outliers or unusual trends in the data, what steps do you take to investigate them? Analyst, Assurer If applicable, the analyst should undertake parametric analysis to understand the consequences of missing or uncertain data and assumptions. The analysis plan should include the treatment of unusual values and outliers. The assurer should check that outliers have been investigated and that their treatment is justified.

IV. Delivery

Quality Question Why do I need to know the answer to this?
Q53 Can you give a clear account of what can and cannot be inferred from the analysis? Often the aim of the final output is to inform decision-making. Outputs might include predictions that rest on many underlying assumptions. It is critical that you support your users to make appropriate use of outputs and understand what can and cannot be inferred. Without this, users may misinterpret findings, make inappropriate comparisons, use the analysis for unsuitable purposes and arrive at the wrong conclusions. For example, a non-expert user may wrongly interpret correlation as causation or use incomplete or disconnected data to make forecasts.
Q54 Have you assessed the limitations of the data and analysis and set out how they affect the quality and use of the outputs? You should describe why limitations related to data and methods exist, why they cannot be overcome using the chosen approach and their impact on the quality and interpretation of the output. Analysis is of very little value if limitations are not properly documented and explained.
Q55 Have you sense checked outputs with user groups and stakeholders? You should work with users, experts, and other relevant stakeholders to verify the credibility of outputs and sense check that they are useful.
Q56 Is uncertainty about data quality, assumptions and methodology clearly communicated to users? Outputs are never 100% accurate. Users need to understand how uncertainties related to data, assumptions and methodology feed into and through the analysis workflow and what this means for the use of the outputs. Results must clearly explain how uncertainty affects the findings from the analysis, or we risk misinterpretations and conclusions being overly reliant on imprecise results.
Q57 Are the implications of unquantified uncertainties communicated to users? You must support your users to understand relevant uncertainties which are not captured in the analysis. When you can, make reasonable judgements about the likely size and direction of unquantified uncertainty. Provide a qualitative description informing users about why the uncertainty cannot be quantified and its likely impact.
Q58 Is workflow documentation including technical guides and code repositories publicly available? Transparency about your analysis supports proper scrutiny and challenge, promotes public trust, and encourages re-use of the resources you develop.
Q59 Do the technical guide and documentation explain how to run the analysis to obtain valid outputs? A good technical guide helps everybody to understand what the analysis does and how it works. A well-written technical guide is essential for effective maintenance of the analysis. It helps users of the analysis to replicate the findings, get answers to methodology questions and build their trust in the output.
Q60 Have you fully documented the analysis code to comply with good practice? The technical guide is complemented by fully documented analysis code. Code documentation must comply with good practice so new users can understand and execute the code as easily and quickly as possible.
Q61 Are users able to feed back on the suitability of outputs? External critique makes analysis more robust. Users should be able to give feedback to your team to ensure that results meet their needs. User feedback and customer reviews inform you of issues and changes that you might need to make. They also act as evidence that users have been consulted.
Quality Question Which Code practice(s)* are most relevant here?
*Trustworthiness (T), Quality (Q), Value (V)
Q53 Can you give a clear account of what can and cannot be inferred from the analysis? Q7.3 Explain the nature of data sources and why they were selected, anticipating possible areas of misunderstanding or misuse. Prominently communicate limitations in the underlying data and explain their impact on the statistics
Q54 Have you assessed the limitations of the data and analysis and set out how they affect the quality and use of the outputs? Q6.1 Regularly review strengths and limitations in the data and statistics, including the continued suitability of data sources and methods. Be open about your decisions and reasons for change

Q6.2 Use the most suitable data for what needs to be measured. Monitor for changes in the data sources and potential bias in the data. Explain any issues and their implications for use of the data in producing statistics

Q7.1 Prominently communicate the quality of the statistics and the strengths and limitations that impact their use, reflecting the needs of different types of users
Q55 Have you sense checked outputs with user groups and stakeholders? V8.3 Gain views from a range of users to inform decisions on your work programme, including when statistics are started, stopped or changed, being clear on where and why user needs can and cannot be met, such as addressing information gaps. Involve users in the ongoing development and testing of statistics
Q56 Is uncertainty about data quality, assumptions and methodology clearly communicated to users? Q7.2 Report on the key quality dimensions, such as accuracy and timeliness, and, where possible, give estimates of error and confidence for the statistics. Summarise how uncertainty in the estimates may impact use by using qualifying words, numbers or graphics

Q7.4 Be clear about the methods used. Explain quality issues related to the methods, systems and processes, including the extent to which the statistics are representative and comparable across the UK and internationally. Describe potential bias and steps taken to address it
Q57 Are the implications of unquantified uncertainties communicated to users? Q7.2 Report on the key quality dimensions, such as accuracy and timeliness, and, where possible, give estimates of error and confidence for the statistics. Summarise how uncertainty in the estimates may impact use by using qualifying words, numbers or graphics

V9.2 Communicate the statistics in a way that helps users understand issues and support them to make appropriately informed decisions. Provide a clear description of the main messages with suitable data visualisations
Q58 Is workflow documentation including technical guides and code repositories publicly available? V10.2 Make sure statistics, data and related guidance are easily accessible. Provide other relevant information, such as metadata and coding where appropriate
Q59 Do the technical guide and documentation explain how to run the analysis to obtain valid outputs? V10.2 Make sure statistics, data and related guidance are easily accessible. Provide other relevant information, such as metadata and coding where appropriate

V10.5 Support the reuse of data and statistics, preventing barriers to use where possible. Ensure statistics are reproducible. Support data and statistics to be shared, accessed and linked, using common data standards with associated metadata
Q60 Have you fully documented the analysis code to comply with good practice? Q6.7 Base methods on national or international good practice, scientific principles or professional consensus. Identify potential bias and address limitations. Use recognised standards, classifications and definitions. Explain reasons for deviations from these standards and any related implications for use

V10.2 Make sure statistics, data and related guidance are easily accessible. Provide other relevant information, such as metadata and coding where appropriate
Q61 Are users able to feed back on the suitability of outputs? V8.1 Be accountable to users by providing the means for users to engage meaningfully in open and constructive ways, enabling questions to be asked and providing prompt responses

V8.3 Gain views from a range of users to inform decisions on your work programme, including when statistics are started, stopped or changed, being clear on where and why user needs can and cannot be met, such as addressing information gaps. Involve users in the ongoing development and testing of statistics
Quality Question Which AQuA role(s) would normally answer this? Why are these AQuA roles involved?
Q53 Can you give a clear account of what can and cannot be inferred from the analysis? Commissioner, Analyst During the delivery phase, the commissioner receives the results of the analysis and decides whether it meets their needs. The analyst provides sufficient information to support the commissioner to make an informed decision.
Q54 Have you assessed the limitations of the data and analysis and set out how they affect the quality and use of the outputs? Commissioner, Assurer, Analyst The commissioner must be confident in the quality of the outputs. They should understand the strengths, limitations and context of the analysis so that the results are correctly interpreted. Assurer sign-off provides confidence that analysis risks, limitations and major assumptions are understood by the users of the analysis. Analysts make sure that the commissioner and assurer have the evidence they need.
Q55 Have you sense checked outputs with user groups and stakeholders? Analyst, Assurer The analyst and assurer should enable and encourage peer review. Peer reviews provide useful critical challenge about the analytical approach, application of methods and interpretation of the analysis. Verification and peer review of work should be done by analysts who had no involvement in the work, so their views are independent.
Q56 Is uncertainty about data quality, assumptions and methodology clearly communicated to users? Analyst, Commissioner The analyst must determine and communicate the uncertainty associated with the analysis so the commissioner can make informed decisions. The commissioner should ensure that an assessment of uncertainty has been provided and that the implications of uncertainty are understood.
Q57 Are the implications of unquantified uncertainties communicated to users? Analyst, Commissioner If uncertainty is too complex to quantify, even approximately, the analysts should explain this so the commissioner can take it into account. In communicating analysis results to decision-makers and stakeholders, the commissioner should be open about the existence of deep uncertainties whose impact cannot be assessed, and explain how they are managed in the analysis.
Q58 Is workflow documentation including technical guides and code repositories publicly available? Analyst, Assurer The analyst must produce appropriate design documentation. Best practice includes maintaining a record of the analysis workflow in a technical report, including a concept of analysis, user requirements, design specification, functional specification, data dictionary, and test plan. Code should be properly documented.
Q59 Do the technical guide and documentation explain how to run the analysis to obtain valid outputs? Analyst, Assurer The analyst must produce appropriate documentation. Best practice includes maintaining a record of the work that has been done in a technical report, including a full description of the analysis, user requirements, design specification, functional specification, data dictionary, and test plan. The assurer makes sure that the documentation is fit for purpose.
Q60 Have you fully documented the analysis code to comply with good practice? Analyst, Assurer Analysts should develop and maintain analysis code in line with best practice. Code must comply with relevant policies and standards.
Q61 Are users able to feed back on the suitability of outputs? Analyst, Approver You can assess the usefulness of the analysis by getting feedback from users, stakeholders and other experts. Quality analysis should be free of prejudice or bias. The SRO and analysts should check that the analysis follows the principles of RIGOUR (Repeatable, Independent, Grounded in reality, Objective, Uncertainty-managed, Robust).