Producer Handbook for the Assessment Process

1. Overview

The aim of this document is to provide producers with the information they need to prepare for, and understand, the Assessment process.

The Statistics and Registration Service Act 2007 created the UK Statistics Authority and empowered it to determine, and assess compliance with, a Code of Practice for official statistics.

The Code of Practice was published in January 2009, and the first round of Assessments followed.

As set out in the Code, the Authority’s Assessment team will systematically review the evidence from producers, users and other stakeholders against the Code of Practice. The findings will be included in a report providing a considered assessment of the strengths and weaknesses of the statistical activities being assessed, covering all aspects of the work leading to the statistical output/publication and its dissemination.

The report will clearly present the Assessment team’s conclusions on the degree to which the statistics comply with the Code of Practice; the nature of any improvements needed; and a recommendation to the Authority as to whether the statistics should be, or should continue to be, designated as National Statistics.

Code of Practice for Official Statistics (366.9 Kb PDF document).

The Assessment team currently comprises around 20 staff spread across three sites: Newport, London and Edinburgh. The team is available to answer questions relating to any part of the process and can be contacted by phone or email:

Phone: 0845 604 1857
Email: assessment@statistics.gsi.gov.uk

2. Aims of the Assessment Process

The rationale for the Assessment process is twofold:

  • to communicate the extent of compliance with the Code to Parliament and the public; and
  • to help the producers of official statistics to enhance the quality of the statistical service over time.

Furthermore, by promoting the coherent application of best statistical principles, methods and practices by producers of official statistics, the process aims to improve trust and confidence in the UK Statistical System.

3. Detail of the Assessment Process

As stated in the Code of Practice, managers in the producer body will normally be notified at least three months in advance of a planned Assessment by means of updates to the Programme of Assessment.

The timetable (203.8 Kb Word document) covering the work programme for assessing existing National Statistics from 2010 to 2012 is available on the Authority’s website. It has been produced in consultation with Heads of Profession, who have made suggestions for grouping outputs and for assessing their level of importance and concern.

The process map (203.4 Kb PDF document) outlines the Assessment process in terms of key stages. In summary, the stages are:

Initial contact between Assessment team and producers

The aim of the initial meeting is to agree the timetable for the Assessment, taking care to avoid any especially resource-intensive periods; to decide the scope of the Assessment by clarifying the list of outputs; and to outline the next steps in the process. Producers are also asked to provide any relevant background material to the Assessment team at, or soon after, this meeting. Relevant documents include:

  • Links to the most recent publication of each of the outputs in this Assessment, along with accompanying quality indicators, metadata and other supplementary reports.
  • Summary (maximum one side of A4) of the history and key characteristics of the statistical product, including any relevant (national or international) legislation or other obligations, and an estimate of the costs to your organisation of producing it.
  • A brief summary of the ways in which the public, Parliament or other users are likely to use the data, particularly including those uses which go beyond their original intended use. Include any Government targets that your statistics are used to measure progress against. This will form the basis of a summary in the Assessment report about the statistics and their use and utility, and will help to demonstrate their relevance and importance.
  • If readily available, details of website usage for the outputs concerned – e.g. number of downloads or web hits for the statistical outputs.
  • Organisation chart of the statistical output area, including governance arrangements such as steering groups.
  • List of users of your statistics, with contact details.

User consultation

The Assessment team will collect evidence to support its recommendations about Assessment and designation from a range of sources, one of which is the body of users of the statistical outputs. The aim is to assess the level of engagement the producers have with users and to highlight any areas of concern, not to produce user satisfaction statistics. To carry out this process, producers provide the team with an up-to-date list of user contact details so that users can be contacted and asked for their views. Details of the questions asked are available from the Monitoring and Assessment team.

To collect the views of data users, suppliers and other stakeholders, producers are asked to supply names, email addresses and telephone numbers for the following:

  • Main people within the statistical production process;
  • Data suppliers (or representatives of supplier communities if relevant);
  • Internal users (end or interim) of the statistical product, for example policy colleagues;
  • Other internal stakeholders (e.g. Private Office, Press Office);
  • External users, including user groups, fora, committees etc.; and
  • Other external commentators, including your main media contacts.

Producers are also asked to list the various uses made of the statistics by external and internal users. For supplier contacts, producers are asked to say briefly what role each has in the supply chain. For other stakeholders, a brief description is required of their role in the statistical production process.

Written Evidence for Assessment

This is the main evidence provided by producers to the Assessment team and consists of two separate documents.

The first is the organisational evidence (172.5 Kb Word document), which is completed and updated by the Head of Profession for the organisation and focuses on those sections of the Code which relate to organisational practices.

The second is the product evidence (172 Kb Word document), which is completed by the statisticians responsible for producing the statistics and focuses on the sections of the Code which relate to specific outputs. It takes the form of an open-ended questionnaire, with questions linked to the various elements of the Code of Practice.

The majority of the evidence document should be completed by attaching links to the relevant documents that will help the Assessment team understand the strengths and weaknesses of the set of statistics in question.

Producers are encouraged to provide examples of documents which demonstrate compliance with the Code rather than including links to all available documentation. Further guidance on documentation to include in the written evidence is included in Section 4 of this document.

Follow-up meeting between the Assessment team and producers

This meeting serves to clarify any issues which may be unclear to the Assessment team and ensures they understand the written material provided in relation to the Code. It also gives producers the opportunity to add information which may have been omitted from the written evidence. Following this meeting the Assessment team will analyse the evidence and complete the draft Assessment report.

Note: Where Heads of Profession (HoPs) have identified statistics which are of low importance and low concern, the aim is to streamline these Assessments in order to reduce the burden on both the producers and the Assessment team. In these cases the evidence document and follow-up meeting will be combined, so that all the evidence required for the Assessment is collated and provided in one face-to-face meeting with producers.

Draft report sent to producers for comment

The Assessment team will send the draft Assessment report to the producers so that they can comment on its factual accuracy. Producers will generally be given at least five working days to comment. The report is also sent to the National Statistician at the same time.

Regulation Committee and Authority Board

The report is presented to the Regulation Committee along with any written comments provided by the producer body or the National Statistician. The Regulation Committee then agrees any revisions, and the report is submitted to the full Authority Board, which will either approve it for publication or ask for further work. Unless there are any problems, reports are published on the first Thursday after approval by the Board.

4. Written Evidence for Assessment – Documentation Required

Organisational Evidence for Assessment

This is completed by the HoP of the organisation being assessed and covers almost half of the practices in the Code. Section 2 of the Output-related Evidence for Assessment asks for confirmation that producers of the statistics being assessed follow the organisation’s standard policies when producing the statistics.

Producers should therefore ensure they are familiar with the content of the Organisational Evidence for Assessment before completing the Output-related Evidence for Assessment.

Output-related Evidence for Assessment

The output-related evidence document is designed as a prompt for producers to bring to the Assessment team’s attention any documentation of current processes or examples of good practice. The evidence document is divided into two parts.

The first part asks you to provide a range of existing material about the outputs being assessed. The information required is split into four sections:

  • contextual documentation about the outputs;
  • compliance with organisational policies;
  • published documentation; and
  • ‘off the shelf’ documentation.

The second part, section 5, relates to additional evidence of Code-compliance. For most Assessments, we will ask for written evidence. For those statistics that are considered of low importance and low concern, we will collect this evidence during a meeting with you. We will tell you at the start of the Assessment how we propose to collect this information.

We ask you to indicate whether you consider that any of your products are not fully compliant with any requirements of the Code, and how you are addressing these issues. Please also use this section to flag up any relevant issues that are not explicitly covered elsewhere.

This section of the document provides producers with the opportunity to outline what they consider to be the strengths and weaknesses of the statistical releases (in relation to the statistical products being assessed). It may be helpful to structure any observations in terms of the six criteria against which we will assess the standard of statistical releases, which are:

  • Clear identification of the statistics that are being released;
  • Commentary that is helpful and presents the key messages;
  • Commentary that is objective, balanced and professionally sound;
  • Commentary that discusses the statistics in the context of their likely uses;
  • Readily available metadata about the statistics in the release; and
  • Statistical presentation that is professional and helpful.

In section 4, we ask producers to provide us with internal timetables for the preparation of the statistics. This is to ensure that the statistics are being released as soon as they are ready, avoiding the opportunity (or perception of opportunity) for the release to be withheld or delayed (Protocol 2, Practice 1 in the Code).

The written evidence document is not published, but the Assessment team will make it available on request. If any sections contain information that should not be made public for reasons of confidentiality, please make this clear in the document.

The bullet points below give a number of suggestions for the sorts of documents which should be available as evidence under specific principles/protocols within the Code of Practice.

  • Refer to the Code when providing your evidence. We only want to see documentation that is directly relevant to evaluating the strengths and weaknesses of your statistics against the Code;
  • We do not intend that Assessments require a lot of new written information. Please refer to relevant sections of existing documentation wherever possible. In particular:
    • refer to documents available on your website, and provide links;
    • refer to the set of background information (part A) when completing the main evidence section (part B), where relevant; and
    • append other existing documentation, and refer to relevant paragraphs or sections.
  • The statements of practice in the Code apply at different levels. Some relate to organisational policies and practices and will be covered by the Organisational Evidence for Assessment, while others are specific to individual statistical outputs and should be included in the output-specific written evidence for Assessment. In some cases, an individual practice will apply at different levels. Please consider the extent of application, and provide evidence accordingly;
  • If there is no evidence under a particular question, simply state “No evidence”. Not all practices apply to all statistics – in which case simply state “Not applicable” (but note that this is not the same as “No evidence”);
  • Under each Principle and Protocol we ask you to indicate whether you consider that there are any requirements of the Code that are not being fully complied with. Including something in this section is not automatically a negative point that will prevent designation. We welcome open acknowledgement of any shortcomings – preferably accompanied by information on proposed action. This is better than us presuming you are not aware of a problem. Please also use this section to flag up any issues that are not explicitly covered elsewhere;
  • If the Assessment covers a range of statistical products within your organisation and practices vary between them, please give enough detail on each for us to be able to determine the strengths, weaknesses and extent of compliance for each individual product;
  • You may embed links in the Word document, or append documents separately. If appending separately, please clearly title each document;
  • Do cross-refer within this document, where appropriate, to avoid repetition; and
  • If using technical terms or acronyms, please provide a short glossary.

Above all, if in any doubt about how to tackle any part of this document, please do contact the Assessment team, who will be pleased to help.

5. Guidance on Understanding the Code of Practice

The National Statistician has produced guidance on the Code of Practice which may be of use to producers. This guidance is available on the Statistics Authority website and comprises four documents:

  • Quality, Methods and Harmonisation
  • Use of Administrative or Management Information
  • Confidentiality of Official Statistics
  • Presentation and publication of Official Statistics

The output-related evidence for Assessment document has been structured around the various requirements of the Code, to ensure producers provide evidence for all the relevant sections. The role of the Assessment team is to evaluate this evidence in relation to the specific requirements of the Code. It may be useful for producers to consider the key principles within the Code when providing their evidence, to ensure that all elements are considered and omissions are avoided. See Annex A for a table which outlines, for each of the Code’s practices, the main sources of evidence the Assessment team will use to assess Code compliance.

Principle 1: Meeting user needs

Engagement with users might range from strategic engagement at the organisation level, through to detailed engagement with individual users. Users may also be in a range of sectors – within your organisation, wider government, private sector bodies, researchers and members of the public. Your response should reflect that diversity.

We are interested in any information that is already available that outlines what users think about the service they receive – in particular, whether the service meets their need, what they think about data quality, and the form and timing of reports.

Relevant documents to demonstrate the ways in which users’ needs are met could include:

  • A brief summary of the ways in which the public, Parliament or other users are likely to use the data, particularly including those uses which go beyond their original intended use. Include any Government targets that the statistics have been designed to measure;
  • Information about users’ experience of statistical services, data quality and the format and timing of reports; and
  • Examples of communication with users, e.g. minutes of meetings, notes on website to inform users of changes, consultations.

Principle 2: Impartiality and objectivity

Impartiality and objectivity can be achieved through an explicit revisions policy, prompt and clear explanations of errors and access to regular reports on the internet free of charge.

The questions in section 4 of the evidence document ask for web links and documents covering the different scenarios which may have occurred, including changes in methods, revisions, exemptions and errors. While most organisations should have a revisions policy and a charging policy, there may be no errors, exceptions or changes in methods to report in relation to these statistics. In this case simply state “None”. We will also refer to your statistical releases and the National Statistics Publication Hub.

Relevant documents to include in section 4 are:

  • Details of all approved exemptions and exceptions to the Code, and any breach reports; and
  • Details of any errors in statistical reports, and information about how these were dealt with.

Principle 3: Integrity

In this section we are looking for evidence that the production of the statistics is guided by public interest and free from personal or political interference. The evidence should show how any such pressures are dealt with and outline the controls in place to prevent inappropriate influences. We would also like to see how any complaints have been dealt with.

The Code of Practice requires organisations to promote a culture within which statistical experts in Government can comment publicly on statistical issues, including the misuse of official statistics. Any examples of this in relation to these statistics should be included.

Principle 4: Sound methods and assured quality

In terms of the methods used, we expect the evidence to cover:

  • (for outputs derived from administrative sources) a description of the data and their sources, including the means of collection, classification and transfer to the statistical producer, and the processes and analysis involved in producing statistical outputs from them;
  • (for surveys) a description of survey methods, including the sampling frame used, sample design and sample selection procedures, estimation procedures etc. This should also include examples of questionnaires and advance letters used, or the instructions that interviewers give to surveyed individuals, as these will be used to assess strengths and weaknesses against other principles within the Code; and
  • (for other analysis) a description of data sources and analytical methods.

This information may all be included in the statistical releases and/or a Technical Report, or similar. We will refer to descriptions of quality (including guidance on use) included in the releases and also to published quality guidelines. Please feel free to append any of these documents, and in the evidence refer to relevant paragraphs and chapters.

Principle 5: Confidentiality

In this section we ask for details of how confidential information is protected. This may include, for example, a confidentiality policy or pledge, IT security, restricted access to files and methods of disclosure control. We will also refer to the survey materials (for example, survey questionnaires) when evaluating the strengths and weaknesses of the confidentiality procedures. Please cross-refer if applicable.

Principle 6: Proportionate burden

For existing surveys, estimates of the cost of complying with the survey should be provided. For new data collections, we are looking for evidence that there is a definite need for the data and that existing sources have been considered before taking the decision to collect new data. The costs of proposed new data requirements (to data suppliers) should be analysed against the potential benefits.

Principle 7: Resources

We recognise that business plans and associated documentation may not identify the resources available at a fine level of detail. In order to assess compliance with this section of the Code, please report the most appropriate information available, giving any supplementary commentary necessary so that we can understand the resources available.

Principle 8: Frankness and accessibility

We would like to see how the statistical outputs being assessed are produced and publicised, how decisions are made about the content of the releases, and evidence that analysis and re-use of data is encouraged.

6. Designation of Ad Hoc Statistical Articles

The Assessment process works especially well for regular, ongoing statistical outputs, in whatever form. This section describes how the Authority deals with a range of special cases:

  • Unpublished outputs;
  • Secondary analysis of existing National Statistics/derivative volumes; and
  • One-off articles.

Unpublished outputs

There is an approved procedure of assessing as much as we can before publication; designation is then accompanied by a paragraph along the following lines:

“Some members of the Assessment team were granted pre-release access to [unpublished output] in draft format, for the purposes of assessing its content against the principles of the Code of Practice, and the conclusions in this section reflect the content of that draft. The Assessment team has judged the draft as meeting the requirements of the Code of Practice, subject to the caveats listed in section 2.4. Should there be significant changes to the final publication that would lead the Assessment team to different conclusions, the Assessment report may be withdrawn and revised as necessary.”

Secondary analysis of existing National Statistics/derivative volumes

Sometimes a supplementary volume, for example containing detailed statistics on a particular topic, is published alongside “headline” summary figures. Equally, such volumes may be published after the release of the headline statistics. In this case, we would regard the statistics as National Statistics in exactly the same way as they would have been had they been published at the same time as the headline statistics. Indeed, not wanting to delay the publication of the main headline figures reflects Protocol 2, Practice 1: release statistics as soon as they are ready.

In previous cases a paragraph along the following lines has been included in the release to outline this process:

“This new set of statistics has not been formally assessed for compliance with the Code of Practice for Official Statistics. However, the Statistics Authority has agreed that, in view of the fact that the statistics are the product of secondary analysis of existing National Statistics, they can also be designated as National Statistics. The producer body has confirmed that the new statistics are produced to the same standards as the existing ones.”

One-off articles

Ad hoc articles may comprise further detail, or re-presentation of existing statistics, as described above.

In the former case, we consider on a case-by-case basis whether an existing designation can be carried over to a new article. This may depend on the extent to which the new article is similar to the original statistics, and on how much time has elapsed between the publication of the original statistics and the proposed additional article.

In some cases, where we are broadly content that the Code is being complied with but need some additional evidence, we may be able to offer an abbreviated Assessment process. This may typically be the case where we want a producer to give a short account of the way they have evaluated the user need for the article, and engaged with users about its content.

One-off articles may also contain new analyses of existing data, or presentation of statistics based on multiple data sources. In such cases the article is more clearly a new product, and should be subject to a fresh Assessment if it meets at least one of the following criteria:

  • it has a significantly different user base from the existing statistics;
  • it presents a significantly different type of statistic; or
  • it presents statistics from a new source, or from a range of sources that have not previously been combined in this way.

In some cases “one-off” might mean a series of, say, two or three irregular articles. The same rules would apply to those. If the series became more regular, and became the main means of disseminating statistics, we would want to regard it as a new product and assess it according to the usual procedures.

Annex A: Sources of evidence used to assess Code compliance

The table below outlines, for each of the Code’s 74 practices, the main sources of evidence the Assessment team will use to assess Code compliance.

The list is not necessarily comprehensive, but could be helpful to producers when providing written evidence to the Assessment Team, and in carrying out their own self-assessment against the Code.

For each practice, it lists the sections in the organisational and output-level Written Evidence for Assessment (WEFA) from which evidence will mainly be drawn.

Principle and Practice — Main sources of evidence for assessment
(Section numbers refer to the Organisational WEFA and the Output-level WEFA; where a row gives two entries, the first relates to the Organisational WEFA and the second to the Output-level WEFA.)
Principle 1: Meeting user needs
Practice 1 4
Practice 2 2, 3
Practice 3 4
Practice 4 1
Practice 5 1, 2
Principle 2: Impartiality and objectivity
Practice 1 Covered by protocol 2
Practice 2 1
Practice 3 3b
Practice 4 4
Practice 5 1
Practice 6 1 2
Practice 7 4
Practice 8 3b
Practice 9 2 2
Principle 3: Integrity
Practice 1 3b
Practice 2 3a 4
Practice 3 3b
Practice 4 3b
Practice 5 3b
Practice 6 3b
Practice 7 3a
Principle 4: Sound methods and assured quality
Practice 1 3
Practice 2 3
Practice 3 4
Practice 4 1 2
Practice 5 4
Practice 6 3, 4
Practice 7 4
Principle 5: Confidentiality
Practice 1 1
Practice 2 2
Practice 3 4
Practice 4 1 2, 5
Practice 5 1
Practice 6 5
Principle 6: Proportionate burden
Practice 1 1
Practice 2 4
Practice 3 4
Practice 4 5
Practice 5 5
Principle 7: Resources
Practice 1 4
Practice 2 4
Practice 3 4
Practice 4 4 5
Practice 5 5
Practice 6 2 2
Practice 7 Covered by protocol 3
Principle 8: Frankness and accessibility
Practice 1 3
Practice 2 1
Practice 3 5
Practice 4 4
Practice 5 5
Practice 6 4
Practice 7 4
Protocol 1: User engagement
Practice 1 2, 4 4
Practice 2 4 5
Practice 3 4 5
Practice 4 4 3
Practice 5 4 5
Practice 6 4 5
Practice 7 4 5
Protocol 2: Release practices
Practice 1 4
Practice 2 1
Practice 3 3b 4
Practice 4 3b 4
Practice 5 3b
Practice 6 1
Practice 7 1, 2
Practice 8 1, 2, 3a 4
Practice 9 3a
Protocol 3: The use of administrative sources for statistical purposes
Practice 1 3b
Practice 2 5
Practice 3 4 5
Practice 4 5
Practice 5 2