UK Statistics Authority oral evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the work of the UK Statistics Authority

On Tuesday 23 May, Sir Robert Chote, Chair of the UK Statistics Authority, Sir Ian Diamond, National Statistician and Ed Humpherson, Director General for Regulation, gave evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the work of the UK Statistics Authority.

A transcript of the session has been published on the UK Parliament website.

Office for National Statistics written evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the Civil Service People Survey

Dear Mr Wragg,

I write in response to the Committee’s call for evidence on the Civil Service People Survey. We have focused our evidence on questions in the Terms of Reference regarding survey design, delivery and validity of results, from the perspective of our role as the Office for National Statistics in administering high-quality national surveys including the census.

Survey design


Staff surveys, and surveys in general, can adopt different strategies to protect respondents’ privacy. These can range from anonymising responses by removing any information that connects the survey to the respondents, to ensuring that analysis derived from the survey does not lead to disclosing identity.

Full anonymisation can limit analysis. For example, if different age groups have a different experience of working in an organisation, this would not be highlighted if the age field were removed to protect privacy. Therefore, best practice is to seek a compromise by using a range of measures to protect privacy. These include:

  • Deidentification removes fields that are highly likely to identify an individual, such as their name and address, and keeps fields, such as age, that do not on their own identify one person. Grouping answers limits direct identification further, for example using age ranges rather than dates of birth, or referring to branches rather than teams. All of these group small numbers of people together to limit the identification of any one person while maximising the benefit of the survey.
  • Use of identifiers rather than names provides additional protection. Only very few people would have the technical ability and access rights to link respondents to responses, and, if there were a breach, those responsible would be easily identified because they would leave a digital footprint.
  • Open rather than targeted invitation provides respondents with more control over their responses. An open invitation to the target population allows a respondent to input all their information without it connecting back to a database. A targeted invitation provides a respondent with a code that connects them, and only them, to the sampling frame, but this could come at the expense of privacy if other measures are not taken, and could affect responses and the response rate if people are concerned that they can be identified.
  • Segregation of duties enables a significant reduction in the number of people who have access to identifiable information. Analysts are not granted access to personal identifiers and users of analysis are only granted access to aggregated data.
  • Statistical Disclosure Control is a process by which analytical outputs are checked to ensure that they cannot lead to the reidentification of individuals. A number of methods can be used, including suppressing small numbers and swapping cells in a way that leaves the headline summary correct. We would not recommend completing cross-sectional analysis when there are low numbers in a category, as this might enable identification, especially when it is possible to link to other information in the public domain.
  • Summarising and controlling access to free text are important to ensure that respondents who provide information that could identify themselves or others are protected. This is particularly important when respondents use the survey as an opportunity to raise issues which require careful handling, such as safeguarding. It is best practice to have a safeguarding policy that provides clear guidance and oversight as to when privacy should be breached to protect individuals.

In addition, it is good practice to carry out privacy impact assessments and to make privacy notices and technical guides readily available.
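The suppression step described above can be sketched in a few lines. This is a minimal illustration only: the threshold of five and the table of counts are invented for the purpose of the example, and are not the rules or figures used for the People Survey.

```python
# Illustrative sketch of primary suppression of small cells, one of the
# statistical disclosure control methods described above. The threshold
# and counts are hypothetical.

THRESHOLD = 5  # suppress any cell counting fewer than five respondents

def suppress_small_cells(table, threshold=THRESHOLD):
    """Replace counts below the threshold with None so they can be
    published as suppressed (e.g. shown as '*') rather than as values
    that might identify individuals."""
    return {cell: (count if count >= threshold else None)
            for cell, count in table.items()}

# Hypothetical cross-tabulation: respondents by grade and age band.
counts = {("Grade 7", "under 30"): 3, ("Grade 7", "30 and over"): 41,
          ("SEO", "under 30"): 12, ("SEO", "30 and over"): 27}

safe = suppress_small_cells(counts)
# The cell with only 3 respondents is suppressed; the others are unchanged.
```

In practice, suppression of this kind is usually combined with secondary suppression and other checks, so that suppressed values cannot be recovered from row and column totals.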

Survey design and delivery

To ensure the best design and delivery of a survey, you may want to be aware of the following:

  • Continuity – staff surveys such as the People Survey are repeated every year so that changes can be tracked and compared over time. To achieve the objectives, the survey needs to be relatively stable, and changes carefully considered and implemented. When a question is discontinued or changed significantly, the time series is ‘broken’, and a new measure is tracked. This is sometimes necessary to ensure the survey remains relevant and useful.
  • Comparability – when a key requirement is the ability to compare performance across the civil service, a key feature of the survey must be consistency: the same survey, with the same questions must be used by all organisations.
  • Comprehension – questions should be pre-tested to ensure that they are being understood as intended and the wording is suitable and understood by all respondents. As far as possible, the survey should use harmonised standards that are available to government departments or reuse questions that are commonly used. These questions have been tested and the practice enables comparison with other data.
  • Scope – the topics covered by the survey are varied. To better understand them, questions in a ‘block’ touch on subsets of the topic. The survey designers must consider the length of the survey and the impact it may have on the quality of responses, on participation and on whether respondents follow through to the end and complete the survey. The usual recommendation for an online survey is that it should take around 20 minutes to complete.
  • Mode of collection – how responses are collected is determined by cost, the speed with which results are needed, participant preference and the influence modes have on the responses provided. For example, when people complete a survey online, which is the cheapest collection mode, they tend to complete it quickly and may be less reflective than in an interviewer-led survey, where the interaction between people allows answers to be explained and probed further.
  • Inclusivity – the survey ought to be inclusive by design, and this refers to the overarching study design as well as to the design of the questions themselves and the interfaces that respondents interact with. For example, the online survey should be designed to meet accessibility standards so that it does not limit participation through design. We should be inclusive in the questions we ask, ensuring that the available answer options collect data that represents the population being surveyed. Having multiple modes of collection available increases access to the survey and in turn increases representation in the data.

Who should be involved?

Developing and delivering a survey of the scale of the People Survey is a multidisciplinary task that requires the involvement of many professionals to ensure it delivers on analytical and business objectives.

  • Policy and Analysis users – it is essential to involve those who will be using the results to understand their requirements and to ensure that the data being collected answers their policy questions.
  • Methodologists and Data Architects – the data that underpins the analysis that responds to the policy questions needs to be designed and architected so that it meets data standards and methodological requirements. This step is crucial to ensure that the data collected is fit for purpose, can be used, reused and linked (for example to the data from previous years).
  • Survey designers – as with all surveys, it is crucial to involve questionnaire design experts in developing the questions and the survey so that it meets and balances user needs. As part of their professional input, survey designers will review whether questions are clear, appropriate, representative, inclusive and accessible by involving groups across the civil service and asking for their views. They will test questions to ensure that they meet the requirements. We would prioritise cognitive testing to check understanding and interpretation, to mitigate any potential quality issues in the data ahead of going live and so that results can be explained clearly following analysis.
  • Survey developers and user experience designers – whether data is collected online, by an interviewer or using a paper questionnaire, the survey flow and the user interface must be designed and tested to meet industry standards and to ensure that the survey is accessible to everyone. The survey can then be sent out for testing.
  • Procurement – whether the survey is commissioned internally or externally, the specification must be understood and agreed by all parties with subsequent changes governed appropriately. The successful bidder must be able to meet the required standards.
  • Supplier – at the appropriate stage it is essential to build a strong working relationship with the supplier and especially with the technical delivery team. The supplier will be a survey expert with a wealth of experience and should be able to deliver the specified requirements as well as advise on innovation.
  • Communication and dissemination teams – the survey must be promoted by central and local teams to encourage participation. In addition to advertising the survey, the communication can include descriptions of how data will be used, what the benefit of the survey will be and why it is worth taking part. As well as communicating the results, it is necessary to ensure methods and processes are transparent so that people know what to read into them, and importantly what not to read into them. For communication teams to support the survey they must be given all the relevant information from design to analysis.

Relevance of metrics

The information included in the People Survey should be based on the needs of data users and of the departments that will use it. As mentioned above, these can be ascertained through consultation with policy users. Comparison over time is always an important aspect of any regular survey, and we would recommend keeping question sets as comparable as possible from year to year, with changes, when needed, following a transparent methodological review. Finally, some terms used within questions may be understood differently in different departments – again, this could be improved through consultation.

Periodically the topics covered will change and be impacted by other issues. A good example is the need to monitor the experience of working in the civil service throughout and following the pandemic. When adding or changing a metric, it is important to communicate and explain the changes, especially at the reporting stage.

Some departments may also need to consider organisational changes and how they would like them reported against previous years.

Validity of results

Quality assurance

It would be difficult to quality assure the information provided via the People Survey. There are limited sources against which to cross-check the information, although exit interviews and/or any internal departmental staff surveys could be used. One approach could be a quality follow-up survey with a sample of respondents – similar to the approach we take with the census to quality assure that data.

Non-response bias impact

Non-response bias can have a significant impact. It can distort results, and this is linked with wider issues: people who do not respond typically have a reason not to engage, and those reasons are particularly interesting for those collecting the data but remain unseen.

It also means that the data will not be representative and, as a result, any policy changes might not address the real issues. Methodological solutions include weighting and imputation, and require comparing the population of respondents to the population of civil servants using the data that is available through HR departments. For example, if fewer people aged under 30 respond to the survey, the responses of those who have replied could be given a bigger weight. Any weighting strategy would need to be transparent and carefully considered, and should make explicit the assumption that people who have responded do indeed represent those who have not.
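The weighting idea above can be sketched as simple post-stratification by age group. The workforce and respondent counts below are invented for illustration; in practice the workforce figures would come from HR records, as described above.

```python
# Illustrative sketch of post-stratification weighting by age group.
# All figures are hypothetical.

workforce = {"under 30": 2000, "30 and over": 8000}    # e.g. from HR data
respondents = {"under 30": 400, "30 and over": 4000}   # survey returns

# Each respondent in a group is weighted so that the group's weighted
# total matches the size of that group in the workforce.
weights = {group: workforce[group] / respondents[group]
           for group in workforce}

# Under-30s responded at a lower rate (20% versus 50%), so each of their
# responses carries a larger weight (5.0 versus 2.0).
```

The weights rest on the assumption made explicit above: that respondents within each group are representative of non-respondents in the same group.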

Survey Delivery

Strengths and weaknesses

Strengths and weaknesses are mostly the result of trade-offs. For example, while the People Survey is relatively long, risking attrition, lower response rate and haste in completion, it does allow for more detailed analysis on many topics.

As discussed in this submission, using a consistent survey across the Civil Service enables efficiency, comparison between organisations, sharing of good practice and analysis over time, while limiting bespoke design on issues that may be of interest to specific departments.

As noted above, while the survey is not weighted, which enables quick access to the results, this does have an impact on how confident we can be that respondents represent the civil service as a whole. The survey is reported as percentages of respondents rather than percentages of the population, and users can break down the results further to compare responses from different groups. This is a pragmatic approach which is clearly communicated.

A mixture of both quantitative and qualitative data collection could improve the quality of the analysis and the usefulness of the survey. The People Survey is quantitative, with a few open questions capturing free text. There are other qualitative measures such as depth interviews and focus group discussions that can be used alongside the People Survey to enhance understanding of the results. These can either be in addition to, or instead of some of the questions in the survey.

Finally, the survey is accompanied by a tool that enables quick analysis and comparisons, disseminated to all participating organisations; this is a strength.

My colleague Sarah Henry, Director of Methodology at the ONS, looks forward to discussing this further with the Committee on 13 September. Please do let us know if you have any questions ahead of then.

Yours sincerely,

Professor Sir Ian Diamond

Office for National Statistics correspondence to the Public Administration and Constitutional Affairs Committee regarding UK wide census data

Dear William,

Thank you for your letter of 14 June 2022 regarding census response rates in Scotland and the implications for the integrity of UK-wide data. I will answer your specific questions in turn, but first I wanted to emphasise the close working relationship between all UK Census offices. We have offered and provided support to National Records of Scotland (NRS) including sharing designs, seconding staff, and are now working with them to develop methods to maximise the accuracy of their Census estimates.

The Office for National Statistics (ONS) also worked with NRS to establish an international steering group, which is providing the highest quality technical expertise, advice and challenge to NRS on census matters. This group is advising NRS to focus efforts toward a good census coverage survey, particularly in regions where responses were lowest, and strengthen the use of administrative data to supplement census data sources in their statistical production, including a clear steer to prioritise the early acquisition of new administrative data sources. This oversight will offer NRS the best possible opportunity to deliver a high-quality outcome for Scotland that will, in turn, contribute to high-quality UK statistics.

The ONS has committed an internationally renowned team of experts equipped with decades of demographic experience to work alongside NRS. Through a combination of this world-leading expertise, the ambitious use of supplementary administrative data, and sophisticated estimation methods, I remain confident that together we will deliver robust UK-wide estimates of the population.

What assessment has been made of the reasons underlying the low response rate in certain areas, and what steps were taken to avoid this occurrence? To what extent is the separate delivery of the Scottish Census considered to have impacted the response rate?

The Census in Scotland is a devolved matter; our assessment of the reasons underlying a low response rate in some areas and the associated mitigations has therefore been formed through our close working partnership with NRS.

The decision by Scottish Ministers to move Scotland’s Census to March 2022 was informed by NRS analysis of the potential impact of COVID-19 on the quality of an operation in March 2021. NRS adopted a diverse and inclusive approach to public awareness around census through media and physical advertising, follow-up reminders, field staff, and focused efforts in areas of low return; approaches comparable to those of the ONS and the Northern Ireland Statistics and Research Agency (NISRA). Evidence gathered in Scotland at the end of the collection phase reported that ‘too busy’ and ‘not aware of the Census or the need to complete it’ were the more common reasons given by householders who had yet to return.

As already mentioned, in light of lower than anticipated returns, the ONS and the Registrar General for Scotland established an international steering group of census and coverage experts. Despite these challenges in the collection phase, having considered the position and the planned next steps with the census in detail, the steering group have confirmed that there is a stable foundation from which to move
from census collection onto the next stage of the census operation, namely the census coverage survey and the incorporation of administrative data into estimates. It is the combination of census returns, coverage survey, administrative data, and estimation methodology that will deliver high quality census outputs for Scotland.

What are the implications of Scotland’s lower response rate for the quality and comparability of UK-wide population statistics?

By taking the actions outlined above, through a combination of census data, supplementary administrative data, and sophisticated estimation methods, we believe that it will be possible to deliver a high-quality outcome for Scotland that will, in turn, contribute to high-quality UK-wide population statistics.

What actions will be taken to quality-assure the Scottish Census data in order to reduce the impact of the lower response rate on the standards of UK-wide population statistics?

NRS remain committed to continuing to produce the best possible population estimates for Scotland, which will be used to produce UK-level population estimates. They currently produce annual official population figures for Scotland using data from a range of sources including the Census, registration data on births and deaths, migration estimates, and a wide range of other administrative data sources. NRS continue to improve these statistics through augmentation of existing and new administrative data sources, ensuring the focus is to deliver population estimates that are accessible and valuable for users.

NRS continue to work closely with partner organisations across the UK, including the ONS and NISRA, to ensure that UK population data and analysis is coherent, comparable and understandable for all users across the UK.

The Committee might also wish to note that the Office for Statistics Regulation (OSR) is currently assessing the Scottish Census, including how NRS are responding to the current situation and the methods and quality assurance they will put in place to provide the best quality data and statistics on the population of Scotland. The OSR has, in both its preliminary findings assessment report for Censuses in the UK, and in subsequent assessment reports for England, Wales and Northern Ireland, highlighted UK data and continues to engage with all three census offices in this regard. I am sure the Director General for Regulation, Ed Humpherson, will keep the Committee informed of its findings, which it plans to publish in November.

Yours sincerely,
Professor Sir Ian Diamond

UK Statistics Authority correspondence to the Public Administration and Constitutional Affairs Committee regarding Public Confidence in Official Statistics report

Dear William,

I am writing to draw your attention to the latest Public Confidence in Official Statistics report (2021), which has been produced by the National Centre for Social Research (NatCen) on behalf of the UK Statistics Authority. I am happy to share that the report finds that public confidence in official statistics remains high, and engagement with official statistics has increased since 2018.

Awareness of the Office for National Statistics (ONS) and the Authority has increased from 70% and 33% in 2018 to 75% and 48% in 2021 respectively. Furthermore, for the first time people were asked if they were aware of the Office for Statistics Regulation, with 41% saying that they were.

Notably, 96% of people able to express a view agreed that it is important for there to be a body such as the Authority to speak out against the misuse of statistics, and 94% agreed about the importance of there being a body to ensure that official statistics are produced without political interference.

Members might also be interested to note that a very high proportion of respondents trusted the ONS (89% of those able to express a view) and our statistics (87%). Of those able to express an opinion, trust in the ONS was highest of all institutions asked about, including the Government, the Bank of England, and the civil service as a whole. 82% of people able to express an opinion agreed that official statistics are generally accurate, up from 78% in 2018. Meanwhile 44% said they had used ONS COVID-19 statistics; they were more commonly used than any of the other statistics asked about with the exception of the census.

This report is very welcome, especially following our hard work to provide clear insights throughout the pandemic. We are proud that the public support our vision of statistics that serve the public good, which we will continue to deliver with honesty, and free from political interference.

A copy of the report will be annexed to this letter for the Committee’s information.

Yours sincerely,
Professor Sir Ian Diamond

UK Statistics Authority oral evidence to the Public Administration and Constitutional Affairs Committee’s pre-appointment hearing for Chair of the UK Statistics Authority

On Tuesday 29 March 2022 Sir Robert Chote, the Government’s preferred candidate for Chair of the UK Statistics Authority, gave evidence to the Public Administration and Constitutional Affairs Committee’s pre-appointment hearing for Chair of the UK Statistics Authority.

A transcript of the session has been published on the UK Parliament website.


Office for Statistics Regulation written evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on Coronavirus Act 2020: two years on

Dear Mr Wragg,

I write in response to the Committee’s call for evidence for the inquiry ‘Coronavirus Act
2020: two years on’; specifically, the transparency surrounding the use of data in decision making relating to the renewal of the Coronavirus Act 2020.

During the pandemic, the response of statistics producers to publish close to real-time data
was remarkable – for example, daily updates on COVID-19 cases and deaths. Producers
demonstrated the ability to consider user needs when balancing timeliness against quality,
and strong analytical collaboration resulted in valuable, high-quality statistics.

As noted in our written evidence to your inquiry on data transparency and accountability:
COVID-19, throughout the pandemic the Office for Statistics Regulation (OSR) has called
for greater transparency of data related to COVID-19. In May 2020 we set out our
expectations on the use of management information by government and other official
bodies, and followed up in November 2020 and July 2021.

As you know, unfortunately we had to intervene on multiple occasions regarding instances
of data being quoted publicly that were not in the public domain, which had been used to
inform decision-making. These included data quoted during press briefings, testing data, and data used to inform quarantine policy around red-list countries. These were key decisions and announcements made with evidence that the public, media, and Parliament should have been able to scrutinise immediately.

However, two years on, we can see that progress has been made to support the quick
publication of new statistics and data which inform decision-making. It is encouraging that,
when we have raised concerns with producers recently, often during challenging
circumstances such as the emergence of new variants, these have been resolved more
quickly. We discussed the overall improvements in our October 2021 report, ‘Improving
health and social care statistics: lessons learned’, where we noted how essential
transparency is for building public trust in statistics and retaining public confidence in
government decisions.

Transparency has been a big theme of our work. In October 2021 we published a blog
setting out our own priorities: build our evidence base of good examples of transparency,
continue to intervene when necessary, and work with external organisations and
governments to make the case for transparency. We are grateful for the Committee’s
support on these issues, and we will continue to keep you updated on this work.

I hope this is a useful summary for the Committee, and please do let me know if I can be of
further assistance.

Yours sincerely
Ed Humpherson
Director General for Regulation

Office for National Statistics correspondence to the Public Administration and Constitutional Affairs Committee regarding the UK Statistics (Amendment etc.) (EU Exit) Regulations 2021

Dear Mr Wragg,

As National Statistician and Chief Executive of the UK Statistics Authority (the Authority), I am
writing to inform the Public Administration and Constitutional Affairs Committee about the UK
Statistics (Amendment etc.) (EU Exit) Regulations 2021, which have today been laid before Parliament.

The statutory instrument (SI) is a continuation of The UK Statistics (Amendment etc.) (EU
Exit) Regulations 2019 (2019 SI), that came into force on implementation period (IP)
completion day. The 2019 SI revoked 293 pieces of retained EU law on statistics. The aim of
the SI laid before Parliament today is to continue this approach by revoking the remaining
retained EU law on statistics, including new legislation that has come into force since the
2019 SI was laid before Parliament. The SI revokes 137 additional pieces of retained EU law
on statistics.

In revoking retained EU law requiring the onward provision of data to Eurostat, the SI
removes substantive obligations on the UK to collect statistical data at certain times and in
certain ways. While the production of statistics will continue to be subject to domestic
regulation as UK official statistics, the UK will have greater flexibility in the production of these
statistics than under EU law.

HM Paymaster General, the Rt Hon. Michael Ellis QC MP, made the SI in the Cabinet Office’s
capacity as the Authority’s sponsoring department. The Welsh Minister for Finance and Local
Government, the Scottish Deputy First Minister and the Northern Irish Minister of Finance
have confirmed the Welsh and Scottish Governments’ and the Northern Ireland Executive’s
consent to this instrument respectively.

Please do let me know if you have any questions or if you wish to discuss further.

Yours sincerely,
Professor Sir Ian Diamond

UK Statistics Authority oral evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the work of the UK Statistics Authority

On 21 October 2021, Sir David Norgrove, Chair, UK Statistics Authority, Professor Sir Ian Diamond, National Statistician and Ed Humpherson, Director General for Regulation, gave evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the work of the UK Statistics Authority.

A transcript of the session has been published on the UK Parliament website.