Quality Questions
Note
This guidance is an ALPHA draft. It is in development and we are still working to ensure that it meets user needs.
Please share feedback to help us improve the guidance by creating a GitHub Issue or emailing us.
To get the most out of the template, we strongly recommend that teams identify who will take the key quality assurance roles of Commissioner, Senior Responsible Owner and Analytical Assurer and name the analytical team at the start of the analytical cycle. This is crucial because together these roles help to make sure that the analysis you do is fit-for-purpose.
I. Scoping
Quality Question | Why do I need to know the answer to this? | |
---|---|---|
Q1 | What question does the analysis try to answer? | A clear understanding of the analysis question is critical. It helps your team to scope out requirements, understand the strengths and limitations of the analysis and make sure it is fit for purpose. If the question is not clear, you risk designing and delivering analysis which does not meet user needs. |
Q2 | Why do you need to answer this analysis question? | Knowing why you need the analysis, what it is for and how it will be used will help you to understand the importance and impact of your work and how it supports decision making. It will also help you to make sure the analysis is fit for purpose and correctly answers the question. |
Q3 | Which organisational priorities does this analysis address? | Knowing how the work aligns with organisational priorities shows how it will fit with wider strategic objectives and why you should do the analysis now. It informs the level of assurance needed to confirm the work is fit for purpose. |
Q4 | If you use a model, is it business critical? | Identifying if the work is business critical determines the assurance needed to ensure it is fit for purpose. |
Q5 | Who needs the answer to the analysis question? | Knowing what your outputs will be used for can ensure they meet user needs. A good understanding of uses is essential for making sure that your analysis is fit for purpose. |
Q6 | Who do you need to consult to make sure you meet the right user needs? | Analysis must be well understood by relevant users, or else it risks scope creep and misspecification. Identify relevant stakeholders and users, consult them before designing the analysis, and consider their views. Consulting the right stakeholders helps you and your users agree on how to answer the question and what the output should look like. This means you can check the users' understanding of the process and its quality, and that the final output meets user needs. |
Q7 | How will you know you have answered the analysis question correctly? | Being clear about the outputs required and acceptable uncertainty is essential for producing accurate and reliable outputs where users understand the work's limitations and uncertainty. It is also essential when designing verification and validation activities to check the robustness of results under a range of plausible assumptions about methods and data. |
Q8 | What is the estimated time and resource required to answer the analysis question (in months and FTE)? | Without a clear understanding of time and resources available, you may overcommit. It is important to push back against unrealistic demands if there is not enough time to effectively quality assure the analysis. The users and commissioner should fully understand and accept increased risks to quality when time and resource pressures are unavoidable. |
Q9 | What is the impact if the analysis is not done now? | Understanding why the work needs to happen now will help you to prioritise and use limited resources for the right activities. |
Q10 | What is the impact if the analysis is not done correctly? | Understanding the possible legal, financial or reputational consequences if the analysis is not carried out correctly helps you to design proportionate assurance activities. You should consider how these consequences line up with the risk appetite of your organisation when you design your mitigation. |
Q11 | Name the commissioner, senior responsible owner and analytical assurer of this analysis? | Clear accountability makes sure that important decisions are signed off by the right people. There should be a clear understanding of who is responsible for managing, producing and quality assuring the analysis in the team. |
Q12 | What tools and resources will you use in production? Are they the best for the job? | Before starting, identify all the skills and resources needed to produce the final output in a sustainable and reproducible way and quality assure it at every step. The platform used to host the analysis and the software used to build and run it should be appropriate, and the associated risks should be considered. For example, producing critical outputs in Excel does not comply with best practice and is unlikely to be robust or verifiable. |
Q13 | Do you have the right internal and external resources and capability to deliver the analysis? | If the team lacks capability, resource, or time, this increases the risk that the analysis will not be fit for purpose or sufficiently assured. |
Q14 | What are the anticipated risks of the analysis? Have you discussed these risks with customers and stakeholders? | You must identify potential risks and their impact well in advance to enable effective mitigation and quicker, confident decision-making. It is important that users understand risks to ensure that their expectations and requirements are met. You should document how you have identified and are monitoring and mitigating risks. |
Q15 | Is there a contingency plan prepared if your mitigation plans fail? | You should account for risks that you cannot mitigate, including risks that have a high impact on the analysis but a low probability of happening. Without well-designed contingencies, quality is put at risk. |
Q16 | Do the data and analysis comply with ethical requirements? | Analysis must comply with ethics standards to ensure public confidence. You must consider the ethical implications of the analysis when you create the workflow and report your findings. |
Q17 | What relevant questions are outside the scope of the analysis? | Limiting the scope of analysis shapes the quality of outputs and what can be done with them. By being clear about the limitations of the analysis we can mitigate or accept them. Limitations must be documented so everybody using the analysis is aware of them. |
Q18 | How will you peer review and assure the analysis? | Internal audit and peer review are critical for monitoring and assuring that the analysis is performed appropriately and meets the required aims and objectives. The outcomes of peer review should be documented and relevant actions and recommendations should be prioritised, addressed, and taken forward. |
Q19 | Will external experts be involved in development and scrutiny of analysis? | You should commission external specialists to peer review or audit the analysis in proportion to risks around use. They should be able to draw on expertise and experience across government and beyond to get feedback, exchange experience and suggest best practice to improve the analysis. |
Quality Question | Which Code pillar and practice are most relevant here? *Trustworthiness (T), Quality (Q), Value (V) | |
---|---|---|
Q1 | What question does the analysis try to answer? | V1.1 Statistics producers should maintain and refresh their understanding of the use and potential use of the statistics and data. They should consider the ways in which the statistics might be used and the nature of the decisions that are or could be informed by them. |
Q2 | Why do you need to answer this analysis question? | V1.1 Statistics producers should maintain and refresh their understanding of the use and potential use of the statistics and data. They should consider the ways in which the statistics might be used and the nature of the decisions that are or could be informed by them. |
Q3 | Which organisational priorities does this analysis address? | T2.1 The Chief Statistician/Head of Profession for Statistics should have sole authority for deciding on methods, standards and procedures, and on the content and timing of the release of regular and ad hoc official statistics. This should include: determining the need for new official statistics, ceasing the release of official statistics, and the development of experimental statistics. |
Q4 | If you use a model, is it business critical? | Q3.2 Quality assurance arrangements should be proportionate to the nature of the quality issues and the importance of the statistics in serving the public good. Statistics producers should be transparent about the quality assurance approach taken throughout the preparation of the statistics. The risk and impact of quality issues on statistics and data should be minimised to an acceptable level for the intended uses. |
Q5 | Who needs the answer to the analysis question? | V1 Users of statistics and data should be at the centre of statistical production; their needs should be understood, their views sought and acted upon, and their use of statistics supported. |
Q6 | Who do you need to consult to make sure you meet the right user needs? | V1 Users of statistics and data should be at the centre of statistical production; their needs should be understood, their views sought and acted upon, and their use of statistics supported. V1.2 Statistics producers should use appropriate ways to increase awareness of the statistics and data, communicate effectively with the widest possible audience, and support users and potential users in identifying relevant statistics to meet their needs. |
Q7 | How will you know you have answered the analysis question correctly? | Q3 Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely. |
Q8 | What is the estimated time and resource required to answer the analysis question (in months and FTE)? | T5 People producing statistics should be appropriately skilled, trained and supported in their roles and professional development. T4.3 Sufficient human, financial and technological resources should be provided to deliver statistical services that serve the public good. T3.5 Statistics and data should be released on a timely basis and at intervals that meet the needs of users as far as practicable. The statistics should be released as soon as they are considered ready, under the guidance of the Chief Statistician or Head of Profession for Statistics. |
Q9 | What is the impact if the analysis is not done now? | T2.1 The Chief Statistician/Head of Profession for Statistics should have sole authority for deciding on methods, standards and procedures, and on the content and timing of the release of regular and ad hoc official statistics. This should include: determining the need for new official statistics, ceasing the release of official statistics, and the development of experimental statistics. |
Q10 | What is the impact if the analysis is not done correctly? | Q3 Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely. |
Q11 | Name the commissioner, senior responsible owner and analytical assurer of the analysis? | T5.2 The roles and responsibilities of those involved in the production of statistics and data should be clearly defined with supporting guidance provided to help staff carry out their roles. |
Q12 | What tools and resources will you use in production? Are they the best for the job? | T5 People producing statistics should be appropriately skilled, trained and supported in their roles and professional development. T5.5 Staff should be provided with the time and resources required to develop their skills, knowledge and competencies. T4.3 Sufficient human, financial and technological resources should be provided to deliver statistical services that serve the public good. |
Q13 | Do you have the right internal and external resources and capability to deliver the analysis? | Q3 Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely. T4.3 Sufficient human, financial and technological resources should be provided to deliver statistical services that serve the public good. |
Q14 | What are the anticipated risks of the analysis? Have you discussed these risks with customers and stakeholders? | V1 Users of statistics and data should be at the centre of statistical production; their needs should be understood, their views sought and acted upon, and their use of statistics supported. V4.3 Users should be involved in the ongoing development of statistics and data, exploring and testing statistical innovations, so that the statistics remain relevant and useful. Q3.2 Quality assurance arrangements should be proportionate to the nature of the quality issues and the importance of the statistics in serving the public good. Statistics producers should be transparent about the quality assurance approach taken throughout the preparation of the statistics. The risk and impact of quality issues on statistics and data should be minimised to an acceptable level for the intended uses. |
Q15 | Is there a contingency plan prepared if your mitigation plans fail? | T4.5 Organisations should be open about their commitment to quality and make clear their approach to quality management. They should ensure that the organisational structure and tools are in place to manage quality effectively, and promote and adopt appropriate quality standards. |
Q16 | Do the data and analysis comply with ethical requirements? | T6 Organisations should look after people’s information securely and manage data in ways that are consistent with relevant legislation and serve the public good. T1.1 Everyone that works in organisations producing official statistics should handle and use statistics and data with honesty and integrity, guided by established principles of appropriate behaviour in public life. |
Q17 | What relevant questions are outside the scope of the analysis? | Q1.6 The causes of limitations in data sources should be identified and addressed where possible. Statistics producers should be open about the extent to which limitations can be overcome and the impact on the statistics. Q2.4 Relevant limitations arising from the methods and their application, including bias and uncertainty, should be identified and explained to users. An indication of their likely scale and the steps taken to reduce their impact on the statistics should be included in the explanation. |
Q18 | How will you peer review and assure the analysis? | Q3 Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely. T4.6 Independent measures, such as internal and external audit, peer review and National Statistics Quality Reviews, should be used to evaluate the effectiveness of statistical processes. Statistics producers should be open about identified areas for improvement. |
Q19 | Will external experts be involved in development and scrutiny of analysis? | Q3 Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely. T4.6 Independent measures, such as internal and external audit, peer review and National Statistics Quality Reviews, should be used to evaluate the effectiveness of statistical processes. Statistics producers should be open about identified areas for improvement. |
Quality Question | Which AQuA role(s) would normally answer this? | Why are these AQuA roles involved? | |
---|---|---|---|
Q1 | What question does the analysis try to answer? | Commissioner, analyst | The commissioner sets out the commission. They work with the analyst team to ensure that everyone has a common understanding of the problem. |
Q2 | Why do we need to answer this analysis question? | Commissioner, analyst, analytical assurer | The analyst must document the purpose of the analysis and the levels of quality and certainty that are needed to meet user requirements. The commissioner and analytical assurer make sure the analysis aligns with the stated purpose. |
Q3 | Which organisational priorities does this analysis address? | Commissioner | The commissioner makes sure that key aspects of the problem, scope and complexities, including programme constraints, are captured and clearly communicated. They also make sure there is proportionate governance in place to support the analysis and its role in the wider project or programme. |
Q4 | If you use a model, is it business critical? | Commissioner, senior responsible owner | Business critical models must be managed appropriately so that the right specialists are responsible for developing, using and assuring them. Decision-makers need sufficient assurance from an appropriate level in the organisation that the model is fit for purpose before using it to inform a decision. For business critical analysis and modelling, the commissioner should be satisfied with the seniority of the analytical assurer. |
Q5 | Who needs the answer to the analysis question? | Commissioner, analyst, analytical assurer | The commissioner makes sure that the right stakeholders have been identified so that the scope and boundaries of the analysis can be appropriately explored. The analyst team and analytical assurer should also contribute. |
Q6 | Who do you need to consult to make sure you meet the right user needs? | Analyst, commissioner | Analysts should explore the analysis requirements and scope with all relevant stakeholders to make sure a wide range of perspectives are sought. The commissioner should be aware and briefed. |
Q7 | How will you know you have answered the analysis question correctly? | Commissioner, analyst | During the design and conduct of analysis, the commissioner should set out details like the level of precision, accuracy and uncertainty that are needed. |
Q8 | What is the estimated time and resource required to answer this analysis question (in months and FTE)? | Commissioner, senior responsible owner, analyst team | During commissioning and scoping, the commissioner and analyst will need to make trade-offs between time, resources and quality. They should work together to agree and document the right balance across these constraints. |
Q9 | What is the impact if the analysis is not done now? | Commissioner | The commissioner makes sure that there is sufficient time and resource for the required level of assurance to be delivered and that they understand the risks when time and resource pressures are unavoidable. |
Q10 | What is the impact if the analysis is not done correctly? | Commissioner, analyst | The commissioner makes sure the analyst team understands the context for the analysis question. This helps the analyst to understand and assess likely risks and determine the right analytical and quality assurance response. |
Q11 | Name the commissioner, senior responsible owner and analytical assurer of this analysis? | Commissioner | During scoping of the analysis, the commissioner makes sure there is proportionate governance in place to support the analysis and its role in the wider project or programme. |
Q12 | What tools and resources will you use in production? Are they the best for the job? | Commissioner | The commissioner makes sure that there is enough time and resource for the required level of assurance to be delivered. They must be confident that they understand the risks when time and resource pressures are unavoidable. |
Q13 | Do you have the right internal and external resources and capability to deliver the analysis? | Commissioner | The commissioner makes sure that there is enough time and resource for the required level of assurance to be delivered. They must be confident that they understand the risks when time and resource pressures are unavoidable. |
Q14 | What are the anticipated risks of the analysis? Have you discussed these risks with customers and stakeholders? | Analytical assurer, commissioner, analyst | The analytical assurer should challenge and test the understanding of the problem. The commissioner and analyst work with the analytical assurer to make sure that all share a common understanding. |
Q15 | Is there a contingency plan prepared if your mitigation plans fail? | Commissioner, analytical assurer | The commissioner makes sure that there is enough time and resource for the required level of assurance to be delivered. They must be confident that they understand the risks when time and resource pressures are unavoidable. If there is a need for urgent action, such as mitigation of unacceptable but uncertain risks, the commissioner may ask for further analysis. They might also commission extra evidence-gathering in parallel to inform the policy response when uncertainty is reduced. The analytical assurer must advise the commissioner on whether sufficient analytical quality assurance has happened and inform them about any outstanding risks. |
Q16 | Do the data and analysis comply with ethical requirements? | Analyst, commissioner, analytical assurer | The analyst should make sure that there is appropriate ethical approval for the analysis. The commissioner and analytical assurer should be informed. |
Q17 | What relevant questions are outside the scope of the analysis? | Commissioner, analyst | The commissioner and the analyst work together at the scoping stage to get a clear understanding of analytical requirements. During scoping, the commissioner makes sure that the right aspects of the problem, scope and complexities, including programme constraints, are captured and clearly communicated. |
Q18 | How will you peer review and assure the analysis? | Analytical assurer, commissioner, analyst | The analytical assurer makes sure quality assurance plans for the analysis are appropriate for the decision it supports. All analysis requires some level of quality assurance. Analyst and commissioner should be involved. |
Q19 | Will external experts be involved in development and scrutiny of analysis? | Analytical assurer, commissioner, analyst | The analytical assurer should challenge the proposed approach. Check that it delivers as intended and meets customer needs. It is good practice to engage subject matter experts in this review. Analyst and commissioner should be involved. |
II. Design
Quality Question | Why do I need to know the answer to this? | |
---|---|---|
Q20 | Is there a simple, plain English description of what the analysis is for and what it does? | Writing a plain English description of what the analysis is for and how it works means that everybody in the team, including new starters, and others with no subject or technical expertise can understand the purpose of the analysis and what it does. |
Q21 | Does the analysis have a logic flowchart which explains the end-to-end steps in the workflow? | Setting out a clear summary of the analysis process in a diagram helps the team, users and customers to understand at a glance what the analysis does, where inputs come from, how they are processed and how it generates outputs. |
Q22 | When do you expect to start and finish each stage of analysis: data collection, processing, quality assurance, analysis and dissemination? | Clearly setting out the time you need to perform each stage of analysis helps you to evaluate if the time allocated to each stage is right and plan mitigation if plans look too ambitious. |
Q23 | Does any part of the analysis rely on manual processing? Have you considered the cost and benefits of fully automating the process? | Manual processes are inefficient and more risky than well-designed automated ones. Analysis with manual steps like copying and pasting data between files, manually updating cells in tables, or moving data between software packages is harder to assure and carries extra quality risks. A minimal automation sketch follows this table. |
Q24 | What happens if team members, reviewers or users find a mistake in the analysis? Do you have a clear and efficient process for addressing issues and preventing them from happening again? | Mistakes happen in analysis. You should have a clear and efficient process for reporting, documenting and addressing errors. Being open and honest about problems and working together to solve them in a supportive way creates an atmosphere of openness in the team and is critical for upholding users’ trust in your output. Teams should take reasonable steps to understand and document how and why errors came about, and mitigate the risk of them happening again. Your mitigation approach should be consistent with your department’s revision policies. |
Q25 | Have you assessed uncertainty? | Consider how uncertainty impacts on all stages of the analysis. Think about quantifying and measuring uncertainty as early as possible. Identify and review sources of uncertainty regularly. If you delay the assessment of uncertainty until late in the process, it is often difficult and costly to mitigate risks. In some cases, you may need to revise methods or alter commissioning decisions. If a potential source of uncertainty is overlooked, this can limit the usefulness and impact of the analysis. |
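The row for Q23 above recommends replacing manual steps, such as copying and pasting data between files, with automated ones. The sketch below is a minimal, illustrative example of that idea rather than part of the guidance: it assumes monthly extracts arrive as CSV files in a hypothetical `data/` folder and uses Python with pandas. Adapt the paths, checks and output to your own workflow.

```python
# Minimal sketch: replace a manual copy-and-paste step with a scripted one.
# Assumes CSV extracts sit in a "data/" folder (hypothetical name and layout).
from pathlib import Path

import pandas as pd


def combine_extracts(input_dir: str = "data", output_file: str = "combined.csv") -> pd.DataFrame:
    """Read every CSV extract, combine them, run a basic check and save the result."""
    files = sorted(Path(input_dir).glob("*.csv"))
    if not files:
        raise FileNotFoundError(f"No CSV extracts found in {input_dir!r}")

    frames = [pd.read_csv(f).assign(source_file=f.name) for f in files]
    combined = pd.concat(frames, ignore_index=True)

    # An automated check that is easy to forget when combining files by hand.
    if combined.duplicated().any():
        raise ValueError("Duplicate rows found after combining extracts")

    combined.to_csv(output_file, index=False)
    return combined


if __name__ == "__main__":
    combine_extracts()
```

Because the whole step is a script, it can be version controlled, peer reviewed and rerun identically, which also supports the traceability and reproducibility questions later in this guidance (Q45 and Q46).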
Quality Question | Which pillar and principle of the Code of Practice are most relevant here? *Trustworthiness (T), Quality (Q), Value (V) | |
---|---|---|
Q20 | Is there a simple description in plain English of what the analysis is for and what it does? | V3 Statistics and data should be presented clearly, explained meaningfully and provide authoritative insights that serve the public good. |
Q21 | Does the analysis have a logic flowchart which explains the end-to-end conceptual steps in the work flow? | V3 Statistics and data should be presented clearly, explained meaningfully and provide authoritative insights that serve the public good. |
Q22 | When do you expect to start and finish each stage of analysis: data collection, processing, quality assurance, analysis and dissemination? | T3.5 Statistics and data should be released on a timely basis and at intervals that meet the needs of users as far as practicable. The statistics should be released as soon as they are considered ready, under the guidance of the Chief Statistician/Head of Profession for Statistics. Q3.3 The quality of the statistics and data, including their accuracy and reliability, coherence and comparability, and timeliness and punctuality, should be monitored and reported regularly. |
Q23 | Does any part of the analysis rely on manual processing? Have you considered the cost and benefits of fully automating the process? | V4 Statistics producers should be creative and motivated to improve statistics and data, recognising the potential to harness technological advances for the development of all parts of the production and dissemination process. T4.3 Sufficient human, financial and technological resources should be provided to deliver statistical services that serve the public good. |
Q24 | What happens if team members, reviewers or users find a mistake in the analysis? Do you have a clear and efficient process for addressing issues and preventing them from happening again? | Q3.4 Scheduled revisions, or unscheduled corrections that result from errors, should be explained alongside the statistics, being clear on the scale, nature, cause and impact. V5 Statistics and data should be published in forms that enable their reuse. Producers should use existing data wherever possible and only ask for more where justified. |
Q25 | Have you assessed uncertainty? | Q2.4 Relevant limitations arising from the methods and their application, including bias and uncertainty, should be identified and explained to users. An indication of their likely scale and the steps taken to reduce their impact on the statistics should be included in the explanation. |
Quality Question | Which AQuA role(s) would normally answer this? | Why are these AQuA roles involved? | |
---|---|---|---|
Q20 | Is there a simple, plain English description of what the analysis is for and what it does? | Analyst, analytical assurer | The analyst should produce sufficient design documentation. Best practice can include a description of the analysis, user requirements, design specification, functional specification, data dictionary, and test plan. The analytical assurer should give feedback to make sure documentation is fit for purpose. |
Q21 | Does the analysis have a logic flowchart which explains the end-to-end steps in the workflow? | Analyst, analytical assurer | The analyst should produce sufficient design documentation. Best practice can include a description of the analysis, user requirements, design specification, functional specification, data dictionary, and test plan. The analytical assurer should give feedback to make sure documentation is fit for purpose. |
Q22 | When do you expect to start and finish each stage of analysis: data collection, processing, quality assurance, analysis and dissemination? | Commissioner, analyst | The analyst and commissioner should work together during scoping to set out and agree trade-offs between time, resources and quality and establish the optimal balance of these constraints. |
Q23 | Does any part of the analysis rely on manual processing? Have you considered the cost and benefits of fully automating the process? | Commissioner, analyst | The analyst should use a risk-based approach to understand the areas of greatest potential error and focus assurance efforts on these areas. The analyst should brief the commissioner so they understand the impact of any reduction in the thoroughness of analytical quality assurance activities. |
Q24 | What happens if team members, reviewers or users find a mistake in the analysis? Do you have a clear and efficient process for addressing issues and preventing them from happening again? | Analyst, analytical assurer | The analyst should use a risk-based approach to highlight the areas of greatest potential error and focus assurance efforts on these areas. |
Q25 | Have you assessed uncertainty? | Commissioner, analyst | Commissioners should expect and require information about uncertainty from analysts. They should challenge them when it is absent, inadequate or ambiguous. Commissioners may have identified sources of uncertainty as part of their wider considerations and should share them with the analyst. If the commissioner can explain in advance the impact on decision-making of different degrees of uncertainty, this can help the analyst to design and carry out the analysis at a proportionate level. |
III. Doing and Checking Analysis
Quality Question | Why do I need to know the answer to this? | |
---|---|---|
Q26 | How will the data in the analysis be processed before and during use? | Processing the data inputs will impact methods and outputs. A clear understanding of how these processes affect the workflow is essential for understanding quality. |
Q27 | Is the data appropriate given the methods selected? | A comprehensive understanding of data inputs is a prerequisite for meeting user needs. |
Q28 | What are the strengths and limitations of the data you use? | Data requirements for analysis vary. Formats, coverage, time scales and granularity must all be appropriate for the research question. A comprehensive understanding of data inputs is a prerequisite for meeting user needs. Without understanding the strengths and weaknesses of the data, it is impossible to make meaningful improvements to the analysis or the inputs to manage these limitations. |
Q29 | Is there a robust relationship between your team and data providers? Do data providers understand how and why you use their data? | A good relationship with data suppliers helps to make sure that their data meet your requirements. Lack of communication can mean you are not aware of quality risks or changes in collection or processing steps that can affect your results. You should communicate with your suppliers sufficiently to manage input quality. Data providers should have a good understanding of how and why you are using their data. This helps them to improve data quality and value and find and address gaps or issues that are relevant for your analysis. |
Q30 | Do you understand how data providers collect, process and quality assure the data you use? | Never assume that datasets are of sufficient quality. Make sure that suppliers give you the metadata and other supporting information you need to assure the quality of the data. Validate the information provided by suppliers using your own checks and confirmation if appropriate. |
Q31 | Is there a formal agreement to set out data content, when and how you will get the data? If not, why not? | A formal service level agreement with data providers makes sure that everybody understands what will be delivered, when and how. This is useful for setting out the division of responsibilities between data providers and your team for getting and sharing the data. It might specify formats, delivery, timescale, legal framework, accompanying metadata and quality checks. |
Q32 | Do you know what quality checks are carried out on the data before you receive them? | Data suppliers should be able to show that their data is sufficiently assured to meet your needs. You should be able to demonstrate that the data meet your needs and that reported quality matches what you observe in practice. Simply having a quality report is not enough. |
Q33 | How will you work with your data provider when your data requirements change? | Review your data requirements regularly to ensure they are still relevant and feasible. Changes to requirements should be communicated to data providers well in advance and agreed by all stakeholders. If there is a formal agreement, it may need to be revised as requirements change. |
Q34 | How do you know if your data provider changes their systems or processes in a way that could impact the data you receive or the analysis you produce? | Data suppliers make changes to their definitions, methods and systems. This may not affect the quality of their data but can affect how you process the data and what you can infer from it. Tailor communication with data suppliers so it is sufficiently frequent, effective and ongoing to get timely information about changes. |
Q35 | How did you choose the methods for the analysis? How do you know the methods you use are appropriate? | You should be able to explain why you chose your methods. For each method, document the underlying assumptions, why the method is suitable for answering the analysis question, why it is applicable to the type and distribution of data you are using, and how these decisions were signed off. |
Q36 | Have reasonable alternative methods been explored and rejected for good reasons? | There is often more than one way to answer a question with data. When you have made choices about methods and approaches, explain how and why you considered and rejected other options. Unless there is evidence underpinning your choice, users cannot be sure that you have chosen the most suitable methods. |
Q37 | How do you know that your analysis is working correctly? | You need to be sure that your analysis produces the outputs you think it should and the processes run as expected. If you cannot demonstrate that scripts and processes work correctly, you cannot confirm the quality of the results. A small testing sketch follows this table. |
Q38 | Can you describe the assumptions of your analysis, when they were made and who made them and signed them off? | You must understand the assumptions your analysis makes. Assumptions set out how the analysis simplifies the world and mitigates uncertainty. If assumptions are inadequately set out or absent, important characteristics of the analysis and its inputs will be unclear, greatly increasing risk. Without a comprehensive log of assumptions made by the analysis, an audit trail signed off by assumption owners, a version control log reflecting when assumptions were last updated, and evidence showing internal and external validation of assumptions, uncertainties may go unacknowledged and could drastically impact outputs. |
Q39 | How are assumptions validated and assured before you apply them? | A clear understanding of how assumptions have been externally and internally validated and signed off gives us confidence that they are reasonable. |
Q40 | How do you measure and report uncertainty in your analysis? | All analysis contains uncertainty. Quantifying and reporting uncertainty means we can inform users how precise reported values are and how much confidence they can have in the analysis. It will also help you to determine where the analysis can be improved. There are many ways to quantify uncertainty in input data, assumptions, processes and outputs. Choose appropriate ones for your situation. For instance, uncertainties can be understood and quantified by comparing against similar or historical data, Monte Carlo simulation, break-even analysis or using expert judgement. A Monte Carlo sketch follows this table. |
Q41 | Have you considered the implications of relevant, unquantified uncertainties? | A good understanding of the uncertainties in the analysis workflow is critical to ensure the analysis and its outputs are fit for purpose. |
Q42 | Can you explain the impact of your analysis on downstream processes? Are there any risks around these dependencies? | Understanding how your analysis might be used helps ensure that the right quality and assurance levels are in place. It also helps you assess the risks around the use of the analysis and whether there are other stakeholders or users you need to consult. |
Q43 | Is all or part of the analysis reliant on a single person? | Single points of failure carry significant business risk. If only one person understands how to carry out all or part of the analysis or maintain the code then the process is extremely vulnerable. |
Q44 | Is it clear why important decisions about the analysis were made, who made them and when? | All analysis involves decisions. A comprehensive record of the decisions made in specifying and conducting the analysis ensures a full audit trail of why decisions were made, who made them and signed them off. |
Q45 | If changes need to be made to code or datasets, is it easy to track who made the changes and when and why they were made? | Good version control ensures a full understanding of when, why, and how changes were made to your analysis process. If it is hard to track changes, it will be hard to retrace steps if there is a problem and means you do not fully understand the process. |
Q46 | Would another analyst be able to reproduce your analysis output or continue the work without talking to you first? | Your analysis must be well documented and repeatable so that somebody new can understand it, use it, and produce the same output with the same inputs. Poor documentation can lead to errors. |
Q47 | Do you use internal peer review to check scripts and code, documentation, implementation of methods, processes and outputs? | You should independently review and validate the logical integrity of your analysis as well as the structure and functionality of the code against the research question. A record of validation and verification activities undertaken, outstanding tasks and remedial actions helps to confirm that the correct analysis has been performed for the required purpose and the chosen approach minimises risk. |
Q48 | Is your code and analysis ever peer reviewed by someone outside your team or organisation? | External peer review is one of the best ways to ensure that the analysis and code are well made and fit for purpose. Without it, teams can reinforce their own biases and may not notice there is anything wrong. |
Q49 | What is your assessment of the quality of your analytical outputs? | Understanding and reporting on the quality of your analysis is critical to ensure fitness for purpose and maintain trust and reliability. This ensures that analysis can appropriately inform decision-making. Quality assessments are key information to share with users and a requirement of the Code of Practice for Statistics. |
Q50 | How do you assure yourselves that the analysis you do is correct? | If you check that results fit with your expectations and you can explain discrepancies, this makes it easier to mitigate risk. There are many ways to check if analysis is carried out correctly. For example, sensitivity analysis can help you understand which inputs have the greatest effect on the outputs. You can also compare figures from the analysis with similar data from other sources or from historical series. |
Q51 | Do the outputs of your analysis align with similar findings from elsewhere? If not, can you explain why? | If you can, check that your outputs align with findings from previous runs of the analysis, alternate data sources, and comparable studies. This gives you confidence that the analysis works as expected. You should be able to explain any inconsistencies that you see. |
Q52 | If you find outliers or unusual trends in the data, what steps do you take to investigate them? | It is crucial to check unusual trends and values in the data and understand why they are there. Not all outliers are the same. Some have a strong influence, some not at all. Some are valid and important data values. Others might be errors. Investigate outliers and unusual patterns thoroughly and take reasonable steps to check their impact on your final output. If you choose to exclude unusual values, you should explain why this is acceptable. An outlier-flagging sketch follows this table. |
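Q37 above asks how you know the analysis is working correctly. One common way to demonstrate this is automated testing of analysis code. The sketch below is illustrative only: `weighted_mean` is a hypothetical helper standing in for a real analysis step, and the checks are written for pytest.

```python
# Minimal sketch of automated checks for an analysis step (see Q37).
# `weighted_mean` is a hypothetical helper; replace it with your own functions.
import pytest


def weighted_mean(values: list[float], weights: list[float]) -> float:
    """Hypothetical analysis step: weighted mean of the input values."""
    if len(values) != len(weights) or not values:
        raise ValueError("values and weights must be the same non-zero length")
    total_weight = sum(weights)
    if total_weight == 0:
        raise ValueError("weights must not sum to zero")
    return sum(v * w for v, w in zip(values, weights)) / total_weight


def test_known_answer():
    # Hand-calculated case: (1*1 + 3*3) / (1 + 3) = 2.5
    assert weighted_mean([1, 3], [1, 3]) == pytest.approx(2.5)


def test_equal_weights_match_simple_mean():
    assert weighted_mean([2, 4, 6], [1, 1, 1]) == pytest.approx(4.0)


def test_invalid_inputs_are_rejected():
    with pytest.raises(ValueError):
        weighted_mean([1, 2], [1])
```

Running checks like these automatically each time the code changes gives reviewers evidence that scripts and processes still behave as expected.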
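Q40 above mentions Monte Carlo simulation as one way to quantify uncertainty. The following sketch shows the mechanics only; the cost model, input distributions and number of draws are illustrative assumptions and not figures from any real analysis.

```python
# Minimal Monte Carlo sketch for quantifying uncertainty in an output (see Q40).
# The cost model and input ranges below are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(seed=42)
n_draws = 10_000

# Uncertain inputs: in practice the ranges would be agreed with subject experts.
unit_cost = rng.normal(loc=100.0, scale=10.0, size=n_draws)               # cost per unit
demand = rng.triangular(left=800, mode=1_000, right=1_500, size=n_draws)  # units required

total_cost = unit_cost * demand  # the output being reported

low, central, high = np.percentile(total_cost, [5, 50, 95])
print(f"Total cost: central {central:,.0f} (90% interval {low:,.0f} to {high:,.0f})")
```

The same loop also supports a crude sensitivity check relevant to Q50: hold one input fixed at its central value, rerun the simulation, and compare how much the interval narrows.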
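Q52 above asks how unusual values are investigated. The sketch below shows one simple, illustrative way to flag candidates for review using the interquartile range rule; the example series and the 1.5 multiplier are assumptions, and flagged values should be investigated rather than dropped automatically.

```python
# Minimal sketch: flag unusual values for investigation (see Q52).
# Uses the interquartile range (IQR) rule; the data and threshold are illustrative.
import pandas as pd


def flag_outliers(series: pd.Series, k: float = 1.5) -> pd.Series:
    """Return a boolean mask marking values outside the IQR fences."""
    q1, q3 = series.quantile([0.25, 0.75])
    iqr = q3 - q1
    lower, upper = q1 - k * iqr, q3 + k * iqr
    return (series < lower) | (series > upper)


values = pd.Series([10, 12, 11, 13, 12, 95, 11, 12])  # 95 looks suspicious
print(values[flag_outliers(values)])  # review these before deciding to keep or exclude them
```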
Quality Question | Which pillar and principle of the Code of Practice are most relevant here? *Trustworthiness (T), Quality (Q), Value (V) | |
---|---|---|
Q26 | How will the data used in the analysis be processed before and during use? | Q1 Statistics should be based on the most appropriate data to meet intended uses. The impact of any data limitations for use should be assessed, minimised and explained. |
Q27 | Is the data appropriate given the methods selected? | Q1 Statistics should be based on the most appropriate data to meet intended uses. The impact of any data limitations for use should be assessed, minimised and explained. |
Q28 | What are the strengths and limitations of the data you use? | Q1 Statistics should be based on the most appropriate data to meet intended uses. The impact of any data limitations for use should be assessed, minimised and explained. Q1.5 Potential bias, uncertainty and possible distortive effects in the source data should be identified and the extent of any impact on the statistics should be clearly reported. |
Q29 | Is there a robust relationship between your team and data providers? Do data providers understand how and why you use their data? | Q1.2 Statistics producers should establish and maintain constructive relationships with those involved in the collection, recording, supply, linking and quality assurance of data, wherever possible. |
Q30 | Do you understand how data providers collect, process and quality assure the data you use? | Q1.1 Statistics should be based on data sources that are appropriate for the intended uses. The data sources should be based on definitions and concepts that are suitable approximations of what the statistics aim to measure, or that can be processed to become suitable for producing the statistics. Q1.6 The causes of limitations in data sources should be identified and addressed where possible. Statistics producers should be open about the extent to which limitations can be overcome and the impact on the statistics. |
Q31 | Is there a formal agreement to set out data content, when and how you will get the data? If not, why not? | Q1.3 A clear statement of data requirements should be shared with the organisations that provide that data, setting out decisions on timing, definitions and format of data supply, and explaining how and why the data will be used. |
Q32 | Do you know what quality checks are carried out on the data before you receive them? | Q1 Statistics should be based on the most appropriate data to meet intended uses. The impact of any data limitations for use should be assessed, minimised and explained. |
Q33 | How will you work with your data provider when your data requirements change? | Q1.2 Statistics producers should establish and maintain constructive relationships with those involved in the collection, recording, supply, linking and quality assurance of data, wherever possible. |
Q34 | How do you know if your data provider changes their systems or processes in a way that could impact the data you receive or the analysis you produce? | Q1.2 Statistics producers should establish and maintain constructive relationships with those involved in the collection, recording, supply, linking and quality assurance of data, wherever possible. Q1.7 The impact of changes in the circumstances and context of a data source on the statistics over time should be evaluated. Reasons for any lack of consistency and related implications for use should be clearly explained to users. |
Q35 | How did you choose the methods for the analysis? How do you know the methods you use are appropriate? | Q2 Producers of statistics and data should use the best available methods and recognised standards, and be open about their decisions. |
Q36 | Have reasonable alternative methods been explored and rejected for good reasons? | Q2 Producers of statistics and data should use the best available methods and recognised standards, and be open about their decisions. |
Q37 | How do you know that your analysis is working correctly? | Q3 Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely. |
Q38 | Can you describe the assumptions of your analysis, when they were made and who made them and signed them off? | Transparency means being clear and open about the choices you make and not holding back or being opaque about decisions. Q1.6 The causes of limitations in data sources should be identified and addressed where possible. Statistics producers should be open about the extent to which limitations can be overcome and the impact on the statistics. Q2.4 Relevant limitations arising from the methods and their application, including bias and uncertainty, should be identified and explained to users. An indication of their likely scale and the steps taken to reduce their impact on the statistics should be included in the explanation. |
Q39 | How are assumptions validated and assured before you apply them? | Q1.6 The causes of limitations in data sources should be identified and addressed where possible. Statistics producers should be open about the extent to which limitations can be overcome and the impact on the statistics. Q2.4 Relevant limitations arising from the methods and their application, including bias and uncertainty, should be identified and explained to users. An indication of their likely scale and the steps taken to reduce their impact on the statistics should be included in the explanation. Q3.5 Systematic and periodic reviews on the strengths and limitations in the data and methods should be undertaken. Statistics producers should be open in addressing the issues identified and be transparent about their decisions on whether to act. |
Q40 | How do you measure and report uncertainty in your analysis? | Q2.4 Relevant limitations arising from the methods and their application, including bias and uncertainty, should be identified and explained to users. An indication of their likely scale and the steps taken to reduce their impact on the statistics should be included in the explanation. |
Q41 | Have you considered the implications of relevant, unquantified uncertainties? | Q2.4 Relevant limitations arising from the methods and their application, including bias and uncertainty, should be identified and explained to users. An indication of their likely scale and the steps taken to reduce their impact on the statistics should be included in the explanation. Q3.3 The extent and nature of any uncertainty in the estimates should be clearly explained. |
Q42 | Can you explain the impact of your analysis on downstream processes? Are there any risks around these dependencies? | V1.1 Statistics producers should maintain and refresh their understanding of the use and potential use of the statistics and data. They should consider the ways in which the statistics might be used and the nature of the decisions that are or could be informed by them. |
Q43 | Is all or part of the analysis reliant on a single person? | V4 Statistics producers should be creative and motivated to improve statistics and data, recognising the potential to harness technological advances for the development of all parts of the production and dissemination process. T4.3 Sufficient human, financial and technological resources should be provided to deliver statistical services that serve the public good. |
Q44 | Is it clear why important decisions were made and who made them? | Q2 Producers of statistics and data should use the best available methods and recognised standards, and be open about their decisions. |
Q45 | If changes need to be made to code or datasets, is it easy to track who made the changes and when and why they were made? | V4 Statistics producers should be creative and motivated to improve statistics and data, recognising the potential to harness technological advances for the development of all parts of the production and dissemination process. |
Q46 | Would another analyst be able to reproduce your analysis output or continue the work without talking to you first? | V5 Statistics and data should be published in forms that enable their reuse. Producers should use existing data wherever possible and only ask for more where justified. |
Q47 | Do you use internal peer review to check scripts and code, documentation, implementation of methods, processes and outputs? | Q3 Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely. T4.6 Independent measures, such as internal and external audit, peer review and National Statistics Quality Reviews, should be used to evaluate the effectiveness of statistical processes. Statistics producers should be open about identified areas for improvement. |
Q48 | Is your code and analysis ever peer reviewed by someone outside your team or organisation? | Q3 Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely. T4.6 Independent measures, such as internal and external audit, peer review and National Statistics Quality Reviews, should be used to evaluate the effectiveness of statistical processes. Statistics producers should be open about identified areas for improvement. |
Q49 | What is your assessment of the quality of your analytical outputs? | Q3 Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely. Q3.2 Quality assurance arrangements should be proportionate to the nature of the quality issues and the importance of the statistics in serving the public good. Statistics producers should be transparent about the quality assurance approach taken throughout the preparation of the statistics. The risk and impact of quality issues on statistics and data should be minimised to an acceptable level for the intended uses. |
Q50 | How do you assure yourself that the analysis you do is correct? | Q3 Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely. Q3.2 Quality assurance arrangements should be proportionate to the nature of the quality issues and the importance of the statistics in serving the public good. Statistics producers should be transparent about the quality assurance approach taken throughout the preparation of the statistics. The risk and impact of quality issues on statistics and data should be minimised to an acceptable level for the intended uses. |
Q51 | Do the outputs of your analysis align with similar findings from elsewhere? If not, can you explain why? | Q3 Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely. |
Q52 | If you find outliers or unusual trends in the data, what steps do you take to investigate them? | Q3 Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely. Q3.3 The quality of the statistics and data, including their accuracy and reliability, coherence and comparability, and timeliness and punctuality, should be monitored and reported regularly. Statistics should be validated through comparison with other relevant statistics and data sources. The extent and nature of any uncertainty in the estimates should be clearly explained. |
Quality Question | Which AQuA role(s) would normally answer this? | Why are these AQuA roles involved? | |
---|---|---|---|
Q26 | How will the data used in the analysis be processed before and during use? | Analyst, analytical assurer | The analyst should collect and manage the data. They must understand data accuracy and uncertainties and capture, manage and understand assumptions. Analytical assurer should check that data processing is sufficient to ensure fitness for purpose. |
Q27 | Is the data appropriate given the methods selected? | Analyst, analytical assurer, commissioner | The analyst should understand data accuracy and uncertainties and capture, manage and understand assumptions made. The analyst should engage appropriate subject matter experts at the appropriate time. The commissioner may be a subject matter expert. The analytical assurer should check that there is sufficient assurance around the choice of data. |
Q28 | What are the strengths and limitations of the data you use? | Analyst, analytical assurer | If applicable, analyst should undertake parametric analysis to understand the consequences of missing or uncertain data and assumptions. Analytical assurer should make sure there is sufficient consideration of strengths and limitations of data. |
Q29 | Is there a robust relationship between your team and data providers? Do data providers understand how and why you use their data? | Analyst, analytical assurer | The analytical assurer should expect to see evidence that there has been sufficient dialogue between analysts and the providers of data and other evidence sources. |
Q30 | Do you understand how data providers collect, process and quality assure the data you use? | Analyst, analytical assurer | The analyst should ensure data formats, units, and context are properly understood and handled. They should design and implement quality checks to validate data inputs as required. Analytical assurer should verify that the right assurance is in place. |
Q31 | Is there a formal agreement to set out data content, when and how you will get the data? If not, why not? | Commissioner, analyst, analytical assurer | The commissioner may need to provide the analyst with agreement to use specific data. The analyst should ensure data formats, units, and context are properly understood and handled. Analytical assurer should verify that assurance is in place. |
Q32 | Do you know what quality checks are carried out on the data before you receive them? | Analyst, analytical assurer | The analyst should understand data accuracy and uncertainties and capture, manage and understand implicit assumptions made. The analytical assurer should assess whether assurance is sufficient. |
Q33 | How will you work with your data providers when your data requirements change? | Commissioner, analyst, analytical assurer | During the design and conduct of analysis, the commissioner may need to provide the analyst with information, agreement to use resources or confirmation of assumptions or approach. The analyst should understand data accuracy and uncertainties and capture, manage and understand implicit assumptions made. The analytical assurer checks that assurance and mitigation are sufficient. |
Q34 | How do you know if your data provider changes their systems or processes in a way that could impact the data you receive or the analysis you produce? | Commissioner, analyst, analytical assurer | During the design and conduct of analysis, the commissioner provides the analyst with the information they need for the analysis to proceed. This could include agreement to use datasets, setting out of key assumptions and signing off assumptions developed during the project. Analyst should understand data accuracy and uncertainties and capture, manage and understand assumptions made. The analytical assurer checks that assurance and mitigation are sufficient. |
Q35 | How did you choose the methods for the analysis? How do you know the methods you use are appropriate? | Analyst, analytical assurer | During the design phase, the analyst will convert the commission into an analytical plan and will consider inputs, analytical methods and processes, and expected outputs. The analytical assurer should check that the proposed design meets the commissioner’s requirements and is sufficiently assured. |
Q36 | Have reasonable alternative methods been explored and rejected for good reasons? | Analyst, analytical assurer | The analyst should review the analysis as a whole and consider carefully whether there are other, better ways in which it could be done. The analytical assurer should check that the investigation of methods was sufficiently thorough and proportionate. |
Q37 | How do you know that your analysis is working correctly? | Analyst, analytical assurer | The analyst should validate that the analysis is set up to answer the specification of the commissioner. The analytical assurer checks that assurance and mitigation are sufficient so the analysis is fit for purpose. |
Q38 | Can you describe the assumptions of your analysis, when they were made and who made them and signed them off? | Analyst, analytical assurer, commissioner | The analyst should capture, manage and understand explicit and implicit assumptions made. The analytical assurer should assess whether these are sufficient. The commissioner should be made aware of key assumptions and confirm that they are happy that the assumptions are applied. |
Q39 | How are assumptions validated and assured before you apply them? | Analyst, analytical assurer | If applicable, analyst should undertake parametric analysis to understand the consequences of missing or uncertain assumptions. Analytical assurer should check that validation and assurance of assumptions is sufficient. |
Q40 | How do you measure and report uncertainty in your analysis? | Analyst, commissioner, analytical assurer | Analyst should determine and communicate the uncertainty associated with outputs so that the commissioner can make informed decisions. The range of possible outcomes and their relative likelihoods should be described. The analytical assurer checks that measuring and reporting of uncertainty is sufficient to meet the needs of the commissioner. |
Q41 | Have you considered the implications of relevant, unquantified uncertainties? | Analyst, commissioner | If uncertainties are too complex for analysts to quantify, even approximately, the analysts should say so in order that the commissioner can take this into account. |
Q42 | Can you explain the impact of your analysis on downstream processes? Are there risks around these dependencies? | Analyst, analytical assurer | Analyst should make sure that the implications of data dependencies or relationships to other analysis and methods are understood. Analytical assurer should check that dependencies have been properly considered. |
Q43 | Is all or part of the analysis reliant on a single person? | Analytical assurer, analyst, commissioner | Analysis should be peer reviewed at an appropriate and proportionate level by a competent person. Commissioner, analyst and analytical assurer should all be involved in each stage of the analytical cycle. |
Q44 | Is it clear why important decisions about the analysis were made, who made them and when? | Analytical assurer, analyst | The analytical assurer should make sure that a suitable audit trail is in place that clarifies the level of validation, scope, and risks associated with the analysis. Best practice includes the production of validation log books. Analyst should build this audit trail. |
Q45 | If changes need to be made to code or datasets, is it easy to track who made the changes and when and why they were made? | Analytical assurer, analyst | To make analytical audit easy, you should set up a version control system for the analysis as a whole and for code, supporting data and assumptions. Best practice includes the production of validation log books. The analyst should build this audit trail. The analytical assurer should ensure that the audit trail clarifies the level of validation, scope, and risks associated with the analysis. |
Q46 | Would another analyst be able to reproduce your analysis output or continue the work without talking to you first? | Analyst, analytical assurer | Good quality analysis is reproducible. The analyst should check that the analytical process reflects the principles of RIGOUR (Repeatable, Independent, Grounded in reality, Objective, Uncertainty-managed, Robust). |
Q47 | Do you use internal peer review to check scripts and code, documentation, implementation of methods, processes and outputs? | Analyst, analytical assurer | The analyst should provide proportionate documentation that explains the verification and validation activities that the analysis is subjected to. Analysts must perform appropriate tests to check the analysis. They should commission other verification and validation as required. The analytical assurer should confirm that planned validation and verification are sufficient. |
Q48 | Is your code and analysis ever peer reviewed by someone outside your team or organisation? | Analyst, analytical assurer | Analysts should work with the commissioner to set out the analysis question so that appropriate analysis is done. Some analysis may require external specialists, so analysts may have responsibilities as part of the procurement process. Analysts, including third parties providing analysis, should provide proportionate documentation describing the verification and validation activities undertaken and associated conclusions. The analytical assurer advises the commissioner on whether appropriate analytical quality assurance has taken place. |
Q49 | What is your assessment of the quality of your analytical outputs? | Commissioner, analytical assurer | As part of the delivery phase, the commissioner should ensure there is an assessment of the level of analytical quality assurance of the analysis, noting where there have been trade-offs between time, resources and quality. The analytical assurer advises the commissioner on whether appropriate analytical quality assurance has taken place. |
Q50 | How do you assure yourself that the analysis you do is correct? | Commissioner, analyst, analytical assurer | The analyst must build in checks and processes to ensure that the analysis is correct. During the delivery phase, the commissioner should give feedback to assist in the correct interpretation of results and determine if the analysis has addressed the commission. The analyst should work with the analytical assurer throughout the analysis so that the assurer can comment on whether it meets the needs of the commission and the results are put to best use. |
Q51 | Do the outputs of your analysis align with similar findings from elsewhere? If not, can you explain why? | Commissioner, analyst | When interpreting the results of a piece of analysis, the commissioner provides constructive challenge. They work with the analyst to explore whether further analysis is needed. |
Q52 | If you find outliers or unusual trends in the data, what steps do you take to investigate them? | Analyst, analytical assurer | If applicable, the analyst should undertake parametric analysis to understand the consequences of missing or uncertain data and assumptions. The analysis plan should include treatment of unusual values and outliers. The analytical assurer should check that unusual values have been investigated and that their treatment is recorded; a minimal sketch of such a check follows this table. |
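The outlier check described in Q52 can usually be scripted so that it runs whenever the input data are refreshed and its results are captured in the audit trail (Q44 and Q45). The sketch below is illustrative only: it assumes a pandas DataFrame with a hypothetical numeric column called `value`, flags observations more than three median absolute deviations from the median, and writes the flagged rows to a file for review. Any comparable rule would serve, provided the rule and the values it flags are recorded.

```python
# Illustrative sketch only: a simple, repeatable check for unusual values in a
# numeric input column. The column name, threshold and file name are hypothetical.
import pandas as pd


def flag_outliers(df: pd.DataFrame, column: str = "value", threshold: float = 3.0) -> pd.DataFrame:
    """Return rows lying more than `threshold` median absolute deviations from the median."""
    series = df[column].dropna()
    median = series.median()
    mad = (series - median).abs().median()
    if mad == 0:
        # No spread to measure against; return an empty frame rather than divide by zero.
        return df.iloc[0:0]
    distance = (df[column] - median).abs() / mad
    return df[distance > threshold]


if __name__ == "__main__":
    data = pd.DataFrame({"value": [10.1, 9.8, 10.3, 10.0, 55.0, 9.9]})
    flagged = flag_outliers(data)
    # Record what was flagged so the decision on how to treat it is auditable.
    flagged.to_csv("outlier_review_log.csv", index=False)
    print(f"{len(flagged)} value(s) flagged for investigation")
```

Whatever rule is chosen, the point for assurance is that both the rule and the flagged values are visible in the audit trail, so the treatment of unusual values can be reviewed later.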
IV. Delivery
Quality Question | Why do I need to know the answer to this? | |
---|---|---|
Q53 | Can you give a clear account of what can and cannot be inferred from the analysis? | Often the aim of the final output is to inform decision-making. Outputs might include predictions that rest on many underlying assumptions. It is critical that you support your users to make appropriate use of outputs and understand what can and cannot be inferred. Without this, users may misinterpret findings, make inappropriate comparisons, use the analysis for unsuitable purposes and arrive at the wrong conclusions. For example, a non-expert user may wrongly interpret correlation as causation or use incomplete or disconnected data to make forecasts. |
Q54 | Have you assessed the limitations of the data and analysis and set out how they affect the quality and use of the outputs? | You should describe why limitations related to data and methods exist, why they cannot be overcome using the chosen approach and their impact on the quality and interpretation of the output. Analysis is of very little value if limitations aren’t properly documented and explained. |
Q55 | Have you sense checked outputs with user groups and stakeholders? | You should work with users, experts, and other relevant stakeholders to verify the credibility of outputs and sense check that they are useful. |
Q56 | Is uncertainty about data quality, assumptions and methodology clearly communicated to users? | Outputs are never 100% accurate. Users need to understand how uncertainties related to data, assumptions and methodology feed into and through the analysis workflow and what this means for the use of the outputs. Results must clearly explain how uncertainty affects the findings from the analysis, or we risk misinterpretation and conclusions that rely too heavily on imprecise results. A minimal sketch of reporting an uncertainty range follows this table. |
Q57 | Are the implications of unquantified uncertainties communicated to users? | You must support your users to understand relevant uncertainties which are not captured in the analysis. When you can, make reasonable judgements about the likely size and direction of unquantified uncertainty. Provide a qualitative description that tells users why the uncertainties cannot be quantified and what their likely impact is. |
Q58 | Is workflow documentation including technical guides and code repositories publicly available? | Transparency about your analysis supports proper scrutiny and challenge, promotes public trust, and encourages re-use of the resources you develop. |
Q59 | Does the technical guide and documentation explain how to run the analysis to obtain valid outputs? | A good technical guide helps everybody to understand what the analysis does and how it works. A well-written technical guide is essential for effective maintenance of the analysis. It helps users of the analysis to replicate the findings, get answers to methodology questions and build their trust in the output. |
Q60 | Have you fully documented the analysis code to comply with good practice? | The technical guide is complemented by fully documented analysis code. Code documentation must comply with good practice so new users can understand and execute the code as easily and quickly as possible. |
Q61 | Are users able to feed back on the suitability of outputs? | External critique makes analysis more robust. Users should be able to give feedback to your team to ensure that results meet their needs. User feedback and customer reviews inform you of issues and changes that you might need to make. They also act as evidence that users have been consulted. |
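Where quantification is possible (Q56), one way to describe the range of possible outcomes and their relative likelihoods is to propagate the uncertainty in key assumptions through the calculation and report an interval alongside the central estimate. The sketch below is a minimal illustration rather than a prescribed method: the cost model and the input distributions are hypothetical placeholders, and in practice they would be agreed with subject matter experts and recorded in the assumptions log.

```python
# Minimal illustration of reporting an uncertainty range alongside a central
# estimate. The model and the input distributions are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(seed=42)  # fixed seed so the output is reproducible
n_draws = 10_000

# Uncertain inputs: in practice the assumed distributions would be agreed with
# subject matter experts and recorded in the assumptions log.
unit_cost = rng.normal(loc=120.0, scale=15.0, size=n_draws)               # £ per unit
demand = rng.triangular(left=800, mode=1_000, right=1_400, size=n_draws)  # units

total_cost = unit_cost * demand

central = np.median(total_cost)
low, high = np.percentile(total_cost, [5, 95])
print(f"Estimated total cost: £{central:,.0f} (90% interval £{low:,.0f} to £{high:,.0f})")
```

Whatever technique is used, the delivered output should state the interval, explain what drives it and, in line with Q57, flag any material uncertainties the calculation cannot capture.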
Question | Which pillar and principle of the Code of Practice matter here? *Trustworthiness (T), Quality (Q), Value (V) | |
---|---|---|
Q53 | Can you give a clear account of what can and cannot be inferred from the analysis? | Q2.4 Relevant limitations arising from the methods and their application, including bias and uncertainty, should be identified and explained to users. An indication of their likely scale and the steps taken to reduce their impact on the statistics should be included in the explanation. |
Q54 | Have you assessed the limitations of the data and analysis and set out how they affect the quality and use of the outputs? | Q3 Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely. Q1.5 Potential bias, uncertainty and possible distortive effects in the source data should be identified and the extent of any impact on the statistics should be clearly reported. Q3.1 Statistics should be produced to a level of quality that meets users’ needs. The strengths and limitations of the statistics and data should be considered in relation to different uses, and clearly explained alongside the statistics. |
Q55 | Have you sense checked outputs with user groups and stakeholders? | Q3 Producers of statistics and data should explain clearly how they assure themselves that statistics and data are accurate, reliable, coherent and timely. |
Q56 | Is uncertainty about data quality, assumptions and methodology clearly communicated to users? | Q1.5 Potential bias, uncertainty and possible distortive effects in the source data should be identified and the extent of any impact on the statistics should be clearly reported. Q2.4 Relevant limitations arising from the methods and their application, including bias and uncertainty, should be identified and explained to users. An indication of their likely scale and the steps taken to reduce their impact on the statistics should be included in the explanation. |
Q57 | Are the implications of unquantified uncertainties communicated to users? | Q2.4 Relevant limitations arising from the methods and their application, including bias and uncertainty, should be identified and explained to users. An indication of their likely scale and the steps taken to reduce their impact on the statistics should be included in the explanation. Q3.3 The extent and nature of any uncertainty in the estimates should be clearly explained. |
Q58 | Is documentation including technical guides and code repositories publicly available? | V2 Statistics and data should be equally available to all, not given to some people before others. They should be published at a sufficient level of detail and remain publicly available. |
Q59 | Does the technical guide and documentation explain how to run the analysis to obtain valid outputs? | V2 Statistics and data should be equally available to all, not given to some people before others. They should be published at a sufficient level of detail and remain publicly available. V3 Statistics and data should be presented clearly, explained meaningfully and provide authoritative insights that serve the public good. |
Q60 | Have you fully documented the analysis code to comply with good practice? | T4.4 Good business practices should be maintained in the use of resources. Where appropriate, statistics producers should take opportunities to share resources and collaborate to achieve common goals and produce coherent statistics. Q2.1 Methods and processes should be based on national or international good practice, scientific principles, or established professional consensus. |
Q61 | Are users able to feed back on the suitability of outputs? | V1 Users of statistics and data should be at the centre of statistical production; their needs should be understood, their views sought and acted upon, and their use of statistics supported. V1.4 Statistics producers should engage publicly through a variety of means that are appropriate to the needs of different audiences and proportionate to the potential of the statistics to serve the public good. An open dialogue should be maintained using proactive formal and informal engagement to listen to the views of new and established contacts. Statistics producers should undertake public engagement collaboratively wherever possible, working in partnership with policy makers and other statistics producers to obtain the views of stakeholders. V1.5 The views received from users, potential users and other stakeholders should be addressed, where practicable. Statistics producers should consider whether to produce new statistics to meet identified information gaps. Feedback should be provided to them about how their needs can and cannot be met, being transparent about reasons for the decisions made and any constraints. |
Quality Question | Which AQuA role(s) would normally answer this? | Why are these AQuA roles involved? | |
---|---|---|---|
Q53 | Can you give a clear account of what can and cannot be inferred from the analysis? | Commissioner, analyst | During the delivery phase, the commissioner receives the results of the analysis and decides whether it meets their needs. The analyst provides sufficient information to support the commissioner to make an informed decision. |
Q54 | Have you assessed the limitations of the data and analysis and set out how they affect the quality and use of the outputs? | Commissioner, analytical assurer, analyst | The commissioner must be confident in the quality of the outputs. They should understand the strengths, limitations and context of the analysis so that the results are correctly interpreted. Analytical assurer sign-off provides confidence that analysis risks, limitations and major assumptions are understood by the users of the analysis. Analysts make sure that the commissioner and analytical assurer have the evidence they need. |
Q55 | Have you sense checked outputs with user groups and stakeholders? | Analyst, analytical assurer | The analyst and analytical assurer should enable and encourage peer review. Peer reviews provide useful critical challenge about the analytical approach, application of methods and interpretation of the analysis. Verification and peer review of work should be done by analysts who had no involvement in the work so their views are independent. |
Q56 | Is uncertainty about data quality, assumptions and methodology clearly communicated to users? | Analyst, commissioner | The analyst must determine and communicate the uncertainty associated with the analysis so the commissioner can make informed decisions. The commissioner should ensure that an assessment of uncertainty has been provided and that the implications of uncertainty are understood. |
Q57 | Are the implications of unquantified uncertainties communicated to users? | Analyst, commissioner | If uncertainty is too complex to quantify, even approximately, the analysts should explain this so the commissioner can take this into account. In communicating analysis results to decision-makers and stakeholders, the commissioner should be open about the existence of deep uncertainties whose impact cannot be assessed, and explain how they are managed in the analysis. |
Q58 | Is documentation including technical guides and code repositories publicly available? | Analyst, analytical assurer | The analyst must produce appropriate design documentation. Best practice includes maintaining a record of the analysis workflow in a technical report, including a concept of analysis, user requirements, design specification, functional specification, data dictionary, and test plan. Code should be properly documented. |
Q59 | Does the technical guide and documentation explain how to run the analysis to obtain valid outputs? | Analyst, analytical assurer | The analyst must produce appropriate documentation. Best practice includes maintaining a record of the work that has been done in a technical report, including a full description of the analysis, user requirements, design specification, functional specification, data dictionary, and test plan. The analytical assurer makes sure that the documentation is fit for purpose. |
Q60 | Have you fully documented the analysis code to comply with good practice? | Analyst, analytical assurer | Analysts should develop and maintain analysis code in line with best practice. Code must comply with relevant policies and standards. A minimal example of documented code follows this table. |
Q61 | Is there a clear feedback mechanism so users can report back on the suitability of outputs? | Analyst, senior responsible owner | You can assess the usefulness of the analysis by getting feedback from users, stakeholders and other experts. Quality analysis should be free of prejudice or bias. The SRO and analysts should check that the analysis follows the principles of RIGOUR (Repeatable, Independent, Grounded in reality, Objective, Uncertainty-managed, Robust). |
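As an illustration of the kind of documentation Q60 asks about, the hypothetical function below shows the minimum most teams would expect for analysis code: a docstring covering purpose, inputs, outputs and assumptions, plus inline comments where the logic is not self-evident. It is a sketch of good practice, not a mandated template.

```python
def adjust_for_inflation(amount_gbp: float, index_base: float, index_current: float) -> float:
    """Convert a cash amount to current prices using a price index.

    Args:
        amount_gbp: Amount in pounds at the base period's prices.
        index_base: Price index value for the base period (hypothetical input).
        index_current: Price index value for the current period.

    Returns:
        The amount expressed in current prices.

    Raises:
        ValueError: If either index value is not positive.

    Assumptions:
        The chosen price index is appropriate for the goods being measured;
        this should be recorded in the assumptions log.
    """
    if index_base <= 0 or index_current <= 0:
        raise ValueError("Price index values must be positive.")
    # Rebase the amount by the ratio of the two index values.
    return amount_gbp * (index_current / index_base)
```

Documentation at this level, kept alongside version-controlled code and data (Q45), makes it much easier for another analyst to reproduce or continue the work without speaking to the original author (Q46).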