Framework structure

This framework has been designed to help you make decisions about the quality of administrative data for statistics. It groups the quality assessment into two phases: input and output quality.

The input quality phase assesses the data you have coming in the door: how suitable is it for what you want to do with it?

The output quality phase assesses the quality of the statistics or analysis you have produced: how well does it meet your needs and those of your users?

Both phases use quality dimensions to help you prioritise and assess different aspects of fitness for use. Quality dimensions are characteristics of the data and output which may be important to you and your users. Very rarely will an output or data set be completely perfect. Identifying the dimensions which are important to you helps you to make decisions about how fit for purpose your data and output are. As the purposes of the data and the output are different (one is to be processed, the other published and shared), the dimensions differ slightly across the phases.

Each section contains:

We plan to add more detailed guidance around quality indicators and methods in a future iteration. If you have thoughts or preferences on how we include these, please email

Administrative data must be accessed securely and via legal gateways. Their use represents an opportunity for analysts; however, it is important to remember that the subjects of the data must be protected from misuse. This framework does not support you in making decisions about access to data, but this is something you need to consider. Your organisation will have data protection policies, such as these data protection guidelines from the ONS. If you have questions, you should contact your Data Protection Officer.

Some core principles

Existing quality guidance, such as the Quality Assurance of Administrative Data (QAAD) emphasises the idea of proportionality. This is something to bear in mind in any quality assessment, including the present framework. Your assessment of quality should be proportionate to the needs of your users and the resource you have available. This framework has been designed to be flexible, and your approach can be tailored in proportion to your needs. There are three main ways to do this:

Also bear in mind that, as society changes, the uses and profile of your statistics and the risks to quality may change, sometimes rapidly. Previous quality assessments should therefore be revisited in light of the current situation.

How do I answer the questions in this framework?

Throughout this framework we suggest questions you may want to think about when considering the quality of your data or output. The purpose of this framework is not to answer these questions for you, but to point you towards what you should be thinking about (and doing) in order to understand and judge whether your data or output is fit for use. Quality assessment is as much a thought process as anything else, and should involve curiosity and reflection about how what you are doing matches up to what your users need from you.

To guide you through this thought process, we set out a number of core questions to ask about the data or output. These are not exhaustive, but should provide a good starting point. There are often multiple approaches to answering or addressing these questions, and it may involve liaising with others within or outside your organisation (for example, data owners, data suppliers, data acquisition teams or quality teams). Again, this will partly depend on how much time and resource you have available, but some options are listed below to give you an idea:

What do I do with the answers?

The Code of Practice for Statistics not only requires us to consider the strengths and limitations of our statistics and data in relation to different users, but also says that these strengths and limitations should be clearly explained alongside the statistics (Practice Q3.1). This is supported by the linked guidance on communicating quality, uncertainty and change.

How you record and communicate the outcomes of applying this framework is largely up to you, and, you guessed it, should be based on your different users’ needs! One common approach is to layer the information in increasing levels of detail:

We plan to add some case study examples to this framework in the future, but in the meantime, some ideas of how to record the outcomes of applying this framework are presented below:

It is important to have open discussions with your users throughout the quality assessment process about their needs. No data set or output is perfect, but the most crucial element is transparent and clear communication with your users about the data you had, what you did with it, and what that means in terms of quality. This allows users of the statistic to make informed use of it and to place any decisions they make with it in the proper context.