UK Statistics Authority oral evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the work of the UK Statistics Authority

On Tuesday 1 July, Sir Robert Chote, Chair of the UK Statistics Authority, Emma Rourke, Acting National Statistician and Ed Humpherson, Director General for Regulation, gave evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the work of the UK Statistics Authority.

A transcript of the session has been published on the UK Parliament website.

UK Statistics Authority written evidence to the Public Administration and Constitutional Affairs Committee’s inquiry into the work of the UK Statistics Authority

Dear Simon

I am writing in response to the call for evidence for the Committee’s inquiry into the work of the UK Statistics Authority. We welcome our regular appearances before the Committee, not just as a channel of formal accountability to Parliament, but also as an important source of support, challenge and advice in ensuring that the official statistical system serves the public good as effectively as possible.

Last week, the Authority announced the resignation of Sir Ian Diamond as the UK’s National Statistician due to ongoing health issues. I am grateful to Sir Ian for his tireless energy and the passionate dedication he brought both to the role of National Statistician and to championing the vital role of statistics across society more broadly. Sir Ian oversaw many successes over his tenure during a remarkable period of economic and societal change, particularly during the pandemic. Emma Rourke, Deputy National Statistician for Health, Population and Methods, will be Acting National Statistician pending longer term arrangements being put in place. We will keep your Committee updated on these arrangements.

As you will be aware, it has been a challenging period for the official statistical system and for the Office for National Statistics (ONS) in particular. Most obviously, the long-term trend of declining response rates for household surveys accelerated following the Covid pandemic, making it more difficult and expensive to maintain the quality of key economic data on which policy and other decision makers rely – most notably those relating to the labour market. This has happened at a time when financial resources remain constrained and the barriers within government to the sharing, linking and exploitation of administrative data remain frustratingly high.

Colleagues across the official system have worked tirelessly to address these challenges and to exploit available opportunities, and the ONS and other statistical producers have continued to generate many high-quality outputs. But we need to ensure that the system is focused on addressing the challenges and difficulties, as well as being self-critical and open to learning and advice from outside when things can be done better. To that end, in addition to benefiting from the insights of the Lievesley Review in 2024, the UK Statistics Authority Board has supported the ONS in commissioning an unsparing internal ‘lessons learned’ exercise around the process of reforming its labour market statistics, in drawing on technical input from independent outside experts, and in engaging with and responding fully to the recommendations and requirements of the Office for Statistics Regulation (OSR).

Most recently, in April 2025, the Board and the Cabinet Office jointly commissioned Sir Robert Devereux to undertake a short but wide-ranging independent review of the performance and culture of the ONS, drawing on the experiences and insights of staff across the organisation as well as external stakeholders. As I write this letter, the review is still under way. But I am confident that it will provide important insights and recommendations to help ensure that the ONS can operate to its full potential, and we will be able to brief you on these at a later stage of your inquiry.

You set out four sets of questions in your Terms of Reference:

  1. How well served are policy-makers, researchers, businesses and citizens by the data that ONS produces and the services it provides?
  2. How is the UK’s data environment evolving, and what challenges and opportunities does this present to official statisticians and analysts? What does the development of a National Data Library mean for the ONS?
  3. How successful has the OSR been in identifying issues with official data, and making the case for improvements?
  4. How does the UKSA Board carry out its statutory functions, and how involved is it in the decisions taken by senior leaders at ONS and OSR?

To address these questions, I attach three submissions – one each from the Authority, the OSR and the ONS. As you will appreciate, the ONS is in a period of transition following the resignation of Sir Ian Diamond as National Statistician on 9 May 2025 and the ONS submission was being written as he departed.

We look forward to discussing the questions you have raised and any other issues with you and the Committee on 1 July.

Yours sincerely,

Sir Robert Chote

Chair, UK Statistics Authority

How does the UK Statistics Authority Board carry out its statutory functions, and how involved is it in the decisions taken by senior leaders at Office for National Statistics (ONS) and the Office for Statistics Regulation (OSR)?

Introduction

The UK Statistics Authority was established under the Statistics and Registration Service Act 2007 (‘the Act’) and formally assumed its powers under the Act on 1 April 2008. The Act gave the Authority the statutory objective of ‘promoting and safeguarding the production and publication of official statistics that serve the public good’. The public good includes:

  • informing the public about social and economic matters;
  • assisting in the development and evaluation of public policy; and
  • regulating quality and publicly challenging the misuse of statistics.

In practice the Authority fulfils these objectives directly through the Office for National Statistics (ONS; its executive arm and the largest single producer of official statistics in the UK) and the Office for Statistics Regulation (OSR; its assessment arm), and indirectly through its oversight of the Government Statistical Service (GSS; the statisticians working in UK and devolved government departments and public bodies, which produce most UK official statistics).

The governance of the Authority was examined by Professor Denise Lievesley in her independent Review of the UK Statistics Authority (‘Lievesley Review’) published in 2024. The Authority welcomed Professor Lievesley’s recognition that the Authority’s governance was working well and that the two executive arms (the ONS and OSR) are sufficiently operationally independent in practice. To increase public understanding of the de facto distinction between the arms of the Authority, OSR published a statement on the operational separation between the ONS and OSR in October 2024.

Membership of the Authority Board comprises the Chair, at least five non-executive members, and three executive members. Other members of the ONS and OSR executive staff and representatives of the GSS attend as required.

As set out in its standing orders, the Board is required usually to meet at least eight times a year; in both 2023 and 2024 it met ten times. It also holds ad hoc meetings to cover topics of interest in greater depth or when more timely input is needed. The Chair has regular separate bilateral meetings with the National Statistician and the head of OSR, and non-executive members meet with ONS and OSR staff as needed on topics of shared interest or expertise.

The Board delegates some of its functions to committees. Among them, the Regulation Committee oversees the work of OSR, inputting to and signing off its major reports and decisions. As OSR regulates the ONS as well as other statistical producers, to avoid conflicts of interest as far as possible, the Regulation Committee comprises non-executive members of the Board and OSR executive members, but no executive staff from the ONS.

The Authority’s engagement with the devolved administrations is guided by the Concordat on Statistics, an agreed framework for co-operation in relation to the production of statistics, for and within the UK, statistical standards and the statistics profession. High-level governance and oversight of cross-UK statistical work is provided by the Authority’s Inter-Administration Committee (IAC), chaired by the National Statistician with membership including the Chief Statisticians of the devolved administrations.

Statutory functions

The functions of the Authority under the 2007 Act include:

  • To monitor the production and publication of official statistics.
  • To develop and maintain definitions, methodologies, classifications and standards for official statistics.
  • To prepare, adopt and publish a Code of Practice for Statistics.
  • When requested by the producer, to assess and determine whether the Code has been complied with in relation to any official statistics, and if so to designate them as National Statistics, nowadays commonly referred to as ‘accredited official statistics’.
  • To determine whether the Code continues to be complied with by ‘accredited official statistics’, and if not to cancel the designation.
  • To produce and publish statistics.
  • To compile and maintain the retail prices index.
  • To provide statistical services and promote and assist statistical research.
  • To fulfil the former functions of the Registrar General for England and Wales as regards undertaking a census.

In common with other public and private sector organisations, the Act imposes a duty on the Board to produce a report after the end of each financial year: the Authority’s Annual Report and Accounts. This meets statutory obligations, providing transparency and accountability for the use of public resources. The most recent Report (for 2023/24) can be found on the Authority website.

The Board must exercise its functions efficiently and cost-effectively and seek to minimise the burdens it places on other persons.

The Board and decision making

As noted above, the Board comprises a combination of executive and non-executive members. The non-executive members are appointed by ministers on the recommendation of an appointment panel that typically includes the Authority Chair, an independent member, and a representative from the Cabinet Office as sponsor department. The aim is to ensure a range of skills and expertise, currently from academia, public service and the private sector. Individual members’ expertise currently encompasses economics, statistics, data collection, technology, risk and governance and communication.

In the last couple of years, it has been something of a struggle to maintain a full complement of non-executive members despite excellent candidates being available. The previous Government refused to renew members beyond their initial three-year term (contrary to good governance practice, which would have allowed at least one additional term), and it took a significant period to complete the appointment process for our three most recent arrivals, as the process was delayed by the general election. Despite periods without a full quota of non-executive members, Board meetings were quorate on virtually every occasion, and the members in post, who held relevant expertise, continued to be actively engaged with the work of the Authority.

Board meetings typically comprise regular reports from the Chair, National Statistician, head of OSR, head of communications and sub-committee chairs, plus papers on substantive items of current importance. Some of these are on a regular cycle (for example discussion of business plans and the annual report and accounts). Some are as needed. In the last couple of years, regular items have included labour market statistics, the Integrated Data Service and the future of population statistics (including the census). On occasion, regular issues of this sort are dealt with through discussion of the National Statistician’s report to avoid staff working in high pressure areas spending too much time preparing board papers rather than on their core activities.

Agenda papers may ask the Board to note particular developments, to offer advice or to make formal decisions – for example, to endorse a recommendation from the National Statistician on the future of population statistics or the need for a census. As detailed in Section 30 of the Act, the National Statistician is the Board’s principal adviser on the quality of official statistics, good practice in relation to official statistics and the comprehensiveness of official statistics. The Board must have regard to their advice on those matters. Decisions are normally made following discussion that leads to a shared view of the way forward. However, the Chair can request a formal vote, with a simple majority of those present deciding the matter. The Board has not yet voted formally.

The Board sets the broad direction of the Authority through agreement on a five-year strategy, currently Statistics for the Public Good 2020-2025 and due to be refreshed this year. The ONS and OSR business plans underpin delivery of the strategy, and the Board is engaged in the development of these, providing support, oversight, scrutiny and challenge ahead of approval.

The Board and its committees periodically review their own effectiveness and the effectiveness of their members. In line with good practice, we will be commissioning an externally led review of the Board in the coming year.

Subcommittees of the Board

Regulation Committee

The role of the Regulation Committee (formerly the ‘Assessment Committee’) is to help shape the regulation strategy of the Authority and to oversee the programme of assessment of sets of official statistics against the Code of Practice for Statistics, plus other work related to assessment and regulation, thereby contributing to achievement of the Authority’s strategic objectives.

In practice, this means overseeing the work of OSR in setting, promoting and judging compliance with the Code, and in intervening (via the Authority Chair, the head of OSR or other OSR staff) when ministers, senior public figures or statistical producers fall short of the Code or the associated principles of ‘intelligent transparency’. Committee members consider the final conclusions of assessment reports ahead of publication and also support the OSR in its wider activity aimed at supporting good practice.

The OSR’s recent work is described in detail in OSR’s submission. Recent areas of regulatory focus have included labour market statistics, specific economic statistics, the broader landscape of economic statistics, population statistics and the approach to gender identity in the last census. On a number of occasions, the Regulation Committee has de-accredited official statistics that no longer comply with the Code (sometimes at the producer’s request), accompanied by agreement on actions that, if fulfilled, would allow the statistics to be re-accredited as Code-compliant.

The Regulation Committee meets at least quarterly, with additional meetings convened as necessary. It comprises the Authority Chair, three non-executive Members of the Board and the head of OSR, with other OSR staff members attending as required. There are no executive staff members from the ONS or other statistical producers represented on the committee (except when invited as guests for specific discussions), consistent with the statutory requirement to separate statistical production and assessment.

The Audit and Risk Committee

Executive accountability for risk management resides with the National Statistician (as Accounting Officer), with executive oversight residing with the Executive Committee and its sub-committees. Chaired by a qualified Non-Executive Member, the Audit and Risk Assurance Committee (ARAC) supports the Board and the Accounting Officer in their responsibilities for risk management, control and governance by reviewing the comprehensiveness, reliability and integrity of the assurance available to them.

The Authority Risk and Assurance Framework provides a mechanism for the identification, analysis and management of risks across the Authority. It is aligned to The Orange Book – Management of Risk and reflects risk management best practice.

ARAC has responsibility for advising the Board on the effectiveness of governance, risk management and the system of internal control, informed by audits and advisory work by the internal audit team. It currently comprises the Non-Executive Chair, two non-executive Board members and two external independent members.

The Authority Board ensures that plans are in place for any risks outside of appetite. Updates are provided to each ARAC meeting on the evolving profile. ARAC scrutinises the management of the strategic risks to satisfy itself that major risks are identified and that mitigation strategies and appropriate levels of assurance are in place. It challenges and holds the Risk and Assurance team and Strategic Risk Owners to account.

Currently the Authority’s Strategic Risks relate to:

  • Independence, trustworthiness and impact
  • User needs
  • Delivery of strategic ambition
  • Quality management framework
  • Our Security
  • Our People
  • Our Communications
  • Quality economic statistics
  • Quality population statistics
  • Data access and usability
  • Technological resilience

Discussions by ARAC and the Authority Board have focused on the most significant risks to the successful delivery of the Authority strategy, including the interplay across the strategic risk profile: specifically statistical quality, technological resilience (with particular focus on legacy systems) and people and skills. The Statistics Quality risk has remained outside of appetite for a prolonged period; ARAC has provided a high level of oversight and consistently sought assurances on the timing for this risk to return within appetite. ARAC has also continued its focus on assuring legacy plans, which should support improving quality, and will play a key role in scrutinising all the strategic risks under the new risk profile, with particular focus on the quality of our economic statistics, legacy systems, people and meeting user needs.

Remuneration Committee

The Remuneration Committee agrees the pay and performance management framework for members of the Senior Civil Service employed by the Authority, within the parameters set by the Cabinet Office. It signs off performance and bonus decisions for staff at Deputy Director level and above.

How involved is the Board in the decisions taken by senior leaders at ONS and OSR?

As with most corporate or public sector organisations, a key role of the Board – and its non-executive members in particular – is to provide support and challenge to the Executive to help them deliver on the Board’s strategy and fulfil any statutory or regulatory duties. The Authority Board is unusual in that it has two executive arms – the production arm, the ONS, and the regulatory arm, OSR – and one of them regulates the other (along with the many other bodies producing official statistics). It also has a less direct responsibility for the GSS.

Within the ONS, the National Statistics Executive Group (NSEG) is the most senior executive committee, chaired by the National Statistician. Its role is to advise the National Statistician in the exercise of their functions as the Head of the GSS and Analysis Function and Chief Executive of the Authority. NSEG focuses on system-wide statistical and analytical matters, and this is reflected in the Group’s membership which includes two GSS Heads of Profession and colleagues from the devolved administrations. Meanwhile the Executive Committee (ExCo) focuses on all aspects of our business delivery within the ONS. Below NSEG and ExCo are a number of sub-committees that feed into discussions. For the OSR, the Regulation Committee helps to shape the regulation strategy of the Authority and oversee regulatory work. The Director General sits on the Regulation Committee, and he is supported by the OSR senior leadership team, who all attend the Committee.

The Board provides support and challenge through multiple channels, without seeking to perform the role of the executive or to micromanage. At every Board meeting there are update reports from the National Statistician and the Director General for Regulation, providing the opportunity for the two executive arms of the Authority to report on delivery against their respective Business Plans, covering areas of success as well as highlighting challenges. This allows the Board to engage, offer support, share expertise and provide challenge across the wider work of the executive arms, often covering work areas not discussed as substantive agenda items. When it is necessary to offer challenge, Board members are conscious of the need to be robust, forthright and persistent, while acting in a constructive and collegiate way and recognising any constraints the executive faces.

For support and challenge to operate effectively, it is important that the executive has an accurate picture of what is going on in the organisation and its major programmes and activities, and that this in turn is shared with the Board in a full and transparent way. As in almost every board, the non-executives periodically emphasise the importance of candour and transparency so that they do not receive an incomplete or unduly rose-tinted picture.

The Chairs of the Audit and Risk, Regulation and Remuneration Committees also provide updates at the Board on the work of their respective committees, in line with their delegated authority as set out in committee Terms of Reference. Through these updates the Board can share its views and support the work of the committees; for example, it highlighted areas of concern and interest relating to gender identity in the 2021 England and Wales Census as the Regulation Committee investigated the matter.

The Authority Board receives monthly management information which helps to monitor performance against key deliverables as outlined in the Business Plan.

The Board sets the risk appetite for the Authority’s strategic risk profile and has a specific item on the strategic risks every six months. Work areas, projects and programmes feeding into strategic risks are of course often covered in substantive agenda items. This is underpinned by the work delegated to the ARAC. The strategic risk profile captures the most significant risks to the successful delivery of the Authority’s strategy. The forward agenda for the Authority Board reflects the key challenges and aligns to the strategic risk profile. The Board forward agenda is produced by the Secretariat in accordance with any determination of the Board and in consultation with the Chair of the Authority, the National Statistician and the Strategy and Policy Deputy Director.

External assurance

The Authority is unusual in that it has an in-house regulatory arm (OSR) which, in addition to the work of the internal audit team, gives the Authority both internal and external scrutiny. Both forms of scrutiny are welcomed and encouraged by the Board.

External assurance is provided to the Authority Board and National Statistician in several ways. First and foremost, the Authority is an independent non-ministerial department that reports directly to the UK Parliament, the Scottish Parliament, the Welsh Parliament and the Northern Ireland Assembly. The work and interest of Parliamentarians and Committees, including the Public Administration and Constitutional Affairs Select Committee (PACAC), on the work of the Authority is an important source of support, challenge and advice in ensuring that the official statistical system serves the public good as effectively as possible.

In addition, the National Statistician has convened a set of advisory committees and panels to provide external independent advice on specific topic areas. They include:

  • Advisory Panels on Consumer Price Statistics (Stakeholder and Technical)
  • Data Ethics Advisory Committee
  • Committee for Advice on Standards for Economic Statistics
  • Expert User Advisory Committee
  • Inclusive Data Advisory Committee
  • Stakeholder Advisory Panel on Labour Market Statistics
  • Methodological Assurance Panel

Membership of these groups includes representation from academia, government departments, the devolved governments and research bodies.

External expertise is also sought on a bespoke basis and shared with the Board. One such example was the review undertaken by Professors Ray Chambers and James Brown as part of the ONS’s systematic assessment of its readiness to manage the transition from the Labour Force Survey (LFS) to the Transformed Labour Force Survey (TLFS). They looked specifically at survey design, response patterns and weighting methods.

The Board also encourages frank and honest engagement with key users of ONS statistics to ensure that their requirements and feedback can be reflected. For example, ensuring that the ONS engages fully and constructively with the Bank of England and HM Treasury regarding the transition from the LFS to the TLFS and that any significant concerns they have are shared with the Board. The Stakeholder Advisory Panel on Labour Market Statistics chaired by Professor Jonathan Portes, and including representatives from other government departments, academia, the Scottish and Welsh governments and Northern Ireland Statistics and Research Agency has been a key source of advice and assurance as work on this transition has proceeded.

Over the last year this work has been an area of significant focus for the Board. The sharp fall in household survey response rates, a significant challenge for the UK as for other national statistical institutes around the world, has affected the quality of data from the LFS. As part of the work to address these challenges, the ONS has been developing an online-first TLFS.

Throughout this process the Board has provided scrutiny, oversight, challenge and support as the work has progressed (as detailed in published Board papers). The non-executive members have also held extended sessions allowing them to better understand the range of issues. The Board has provided clear feedback about the risks of transferring from the LFS to the TLFS, mindful of stakeholder concerns and quality issues. At its meeting on 27 March, the Board considered the advice of the National Statistician on the TLFS, as well as assurances from technical advisers and the advisory committee, to reach collective agreement on the way forward.

The National Infrastructure and Service Transformation Authority (NISTA) – previously the Infrastructure and Projects Authority (IPA) – is the government’s centre of expertise for infrastructure and major projects. It regularly scrutinises any of our projects that fall under the Government Major Projects Portfolio (GMPP). These have included the ONS’s Integrated Data Service Programme and the Future of Population and Migration Statistics. Along with Treasury business case reviews, these have offered external assurance both to the executive and the Board, although it is not unknown for projects that have cleared these hurdles multiple times to end up with difficulties that would presumably have been even harder for the Board alone to surface.

UK Statistics Authority

May 2025


How successful has the OSR been in identifying issues with official data, and making the case for improvements?

Summary

The Office for Statistics Regulation (OSR) is charged with upholding the standards of official statistics across the UK. Through our wide-ranging regulatory work we identify issues and respond to stakeholder concerns about official data. We set requirements for improvements, as well as highlighting areas of best practice.

Our work has secured commitments from statistical producers that have led to positive improvements in many official statistics, although improvements are not always delivered as quickly as we would like. The importance of our role, and the independence and rigour that we bring to the task, have been noted by external reviews such as Professor Denise Lievesley’s independent review of the UK Statistics Authority and the PACAC report on Transforming the UK’s Evidence Base.

This submission sets out: our scope and approach to regulation; our key regulatory interventions and how we have identified issues and required improvements; and our views on the evolving statistical system. We have included clear examples of where our work has been vital in securing improvements or holding organisations to account against a backdrop of significant issues or concerns.

Many of the examples included in this submission relate to statistics produced by the Office for National Statistics (ONS). These represent some of our most recent and high-profile interventions. However, OSR’s regulatory activities span the full range of Crown and non-Crown producers of official statistics across the UK.

Introduction

OSR is the regulatory arm of the UK Statistics Authority and was established in November 2016 following the Bean Review. OSR fulfils the assessment and regulatory function set out in the Statistics and Registration Service Act (2007). We are independent from Government and are separate from producers of statistics, including the ONS.

The work of OSR is overseen by the Regulation Committee of the UKSA Board, which comprises non-executive members of the main UKSA Board, with the Director General for Regulation sitting as an executive member. There are no executive members of the ONS on this committee, to avoid any conflict of interest when OSR is examining the work of the ONS. The Chair of the Authority sits on the Regulation Committee and has stated, like his predecessor, that in the event of a dispute between OSR and the ONS, he and the Board would, by instinct, side with the regulator.

Professor Lievesley examined the operation of OSR and the Regulation Committee and concluded that: “Having reviewed the organisation thoroughly, this Review is satisfied that there is sufficient operational independence between ONS and OSR. The Review could find no tangible evidence to support assertions that the two organisations are too cosy or that a fundamental, unmanageable conflict of interest exists between the two that undermines the integrity or quality of the statistics produced by ONS, though it is important to pay attention to the perception of independent scrutiny.”

In line with the Statistics and Registration Service Act (2007) the principal roles of the OSR are to:

  • Set the statutory Code of Practice for Statistics
  • Assess compliance with the Code of Practice
  • Accredit official statistics that comply fully with the Code of Practice
  • Report any concerns on the quality and comprehensiveness of official statistics
  • Report any concerns on good practice in relation to official statistics

Our purpose is to ensure statistics serve the public good by regulating against the principles of Trustworthiness, Quality, and Value. As a regulator, we work through three delivery channels:

  • We uphold the trustworthiness, quality and value of statistics and data used as evidence
  • We protect the role of statistics in public debate
  • We develop a better understanding of the public good of statistics

Our 5-year plan sets out our vision and priorities for 2020-2025 and how we will contribute to fostering the Authority’s ambitions for the statistics system. Our annual business plan shares our focus for the current year.

Our regulatory approach

Regulatory tools

As the regulator for official statistics across the UK, we have a number of different tools that we use in order to identify issues with official statistics and make recommendations or requirements for improvements:

  • Assessments: Detailed reviews of an official statistics output that grant, reconfirm or remove the status of ‘accredited official statistics’ (referred to as ‘National Statistics’ in the Statistics and Registration Service Act 2007)
  • Compliance checks: Short, focused reviews, typically providing a high-level investigation of the official statistics
  • Reviews: Pieces of work examining issues across the statistics landscape or related sets of official statistics to provide strategic recommendations
  • Casework: Investigating complaints received about the production and use of statistics and reaching a judgement

Engagement with statistical producers

OSR is structured around eight topic domains, each of which is responsible for maintaining an overview of the statistics produced by relevant government departments and public bodies within that topic. This knowledge ensures that OSR remains up to date on existing and emerging issues and that our reviews and judgements are informed by a deep understanding of the topic.

The domains build strong regulatory relationships with the relevant statistics producers, which support better outcomes for the statistical system, through early and frank exchange of information and intelligence, and securing buy-in from the producers of statistics for the requirements and recommendations set by OSR.

One of these key relationships is with the civil service Heads of Profession for Statistics who sit within each statistical producer organisation. Heads of Profession for Statistics play a vital role in upholding the quality and standards of official statistics as set out in the Code. OSR works closely with the Heads of Profession across government to provide a mix of challenge, advice and support where appropriate.

Whilst we provide specific recommendations to producers as part of our reviews, in general we take a more holistic approach to regulation, providing support, advice and training in addition to our formal regulatory work. This approach ensures that our work with producers secures real change and improvement in statistics, rather than being a performative tick-box exercise. We expect producers to be open and honest about their statistics, both with us as the regulator and in the public domain. We stand firm on our regulatory decisions, but always ensure that they are fully informed by conversations with the producer so that they are proportionate and rooted in the facts.

We pride ourselves on this collaborative approach and consider that it leads to considerably better outcomes for the statistical system. This approach was endorsed by Professor Lievesley, whose review noted that many statistics producers commend the support and guidance from the OSR and that this constructive approach is having a positive impact on compliance with the Code.

Separation from the ONS

For regulation to be effective, it is important that external stakeholders have confidence in the arrangements ensuring OSR’s separation from the ONS. This separation is crucial because it is what enables OSR to make sound regulatory decisions about the ONS’s production of official statistics. These regulatory decisions should be made in the same way, using the same criteria and governance, as for any producer of official statistics.

In October 2024 we published a statement which transparently set out how the separation of OSR from the ONS is achieved in practice. The OSR has separate governance structures, strategy and business planning, reporting lines to the Chair of the Authority, and external communications. As noted above, Professor Lievesley examined the operation of the OSR and the Regulation Committee and found them to be robustly independent.

Regulatory work

Economic statistics

High quality economic statistics are a crucial underpinning for informed decision-making and the functioning of the UK economy. Over the last few years, a number of economic shocks have brought increased interest in, and scrutiny of, the economic statistics produced by the ONS. Over the last year in particular, there has been growing external criticism of the ONS.

OSR has proactively undertaken an extensive programme of regulatory work on economic statistics over the last five years. These reviews have been fundamental in synthesising stakeholder concerns and identifying issues with the quality of the core building blocks of economic statistics. We have set out clear requirements for improvements from the ONS, and we hold the ONS to account by monitoring its progress in delivering these improvements.

This section sets out regulatory work on the ONS’s economic statistics. It:

  • Describes our methodology (Spotlight on quality)
  • Summarises our work on price statistics
  • Summarises our work on labour market statistics

  • Sets out how our April 2025 review builds on the issues identified in our assessments over the preceding 5 years and highlights the urgent need for the ONS to address quality concerns

Spotlight on Quality Assessments

The UK’s departure from the EU ended the role of the European statistical office (Eurostat) in verifying the quality of UK statistics. In response, we enhanced our programme of reviews of economic statistics, including developing a Spotlight on Quality assessment framework to provide continued assurance on the quality of internationally comparable economic statistics.

This framework builds on our earlier regulatory reviews of the Living Costs and Food Survey (LCFS) and UK Business Demography Statistics. The framework sets out four key areas for evaluating the quality of statistics: whether the statistics are produced using suitable data sources; whether appropriate methods are used; whether quality assurance is transparent; and whether the statistics are sufficiently prioritised and resourced proportionately to their use. This framework has been vital in highlighting issues and areas for improvement in the economic statistics produced primarily by the ONS, and it has been used to underpin the requirements we have set for the ONS.

The Spotlight on Quality Assessment programme provides a detailed review of many of the data sources and components that feed into the production of GDP and the broader National Accounts. We have undertaken the following reviews:

  • Price Index of Private Rents (PIPR) – October 2024. The assessment found improvements in methods and user engagement but noted that further information needs to be published around methods and data quality. It recommended enhancing explanations of the methods and better communicating development plans.
  • Business Investment Statistics – October 2024. The review highlighted positive user feedback on the frequency and availability of the statistics but raised concerns about revisions and outdated production systems. It recommended analysing the impact of non-sampling errors, updating methods and quality information, and engaging a wider range of users.
  • Review of Economic Statistics Classifications – July 2024. The review recognised the importance of classifications for National Accounts but raised issues about capability and responsiveness to user needs. Recommendations included more openness about decision-making and faster publication of classification decisions.
  • UK Business Enterprise Research and Development (BERD) Statistics – July 2024. The assessment highlighted efforts to improve the BERD methodology and a move to electronic questionnaires. It recommended transparency about the questionnaire used and better communication with users on uncertainty, strengths and limitations.
  • Northern Ireland Business Expenditure on Research and Development Statistics – July 2024. OSR noted good alignment with UK standards but advised on improving documentation of methods and expanding user engagement. Recommendations focused on engaging with users to identify any needs for a potential back series and additional background information.
  • Profitability of UK Companies and Gross Operating Surplus of Private Non-Financial Corporations – January 2024. The assessment found that while the statistics are broadly reliable, there is limited documentation on the quality of different data sources. OSR recommended improving quality assurance, documenting quality information and wider user engagement.
  • Producer Price Inflation (PPI) – July 2023. The assessment found that while the ONS has made improvements to quality and international comparability of the PPIs, under-prioritisation of these statistics has negatively affected the quality. OSR defined several requirements to improve the statistics, including to modernise the inflexible legacy systems used to produce the statistics.

ONS’s Price Statistics

The CPIH (Consumer Prices Index including Owner Occupiers’ Housing Costs) is the ONS’s lead and most comprehensive measure of consumer price inflation. It includes the costs associated with owning, maintaining, and living in one’s own home, which is the most significant expense for many households. As such, it is key that the owner occupiers’ housing costs (OOH) element is captured accurately.

As highlighted in our Systemic Review on Housing and Planning Statistics in 2017, the previous method for producing private rental sector statistics had known limitations including being unable to provide estimates of private rent levels and change that were both comparable over time and available at low levels of geography.

To address these limitations, the ONS developed the Price Index of Private Rents (PIPR) which has now replaced the ONS’s Index of Private Housing Rental Prices (IPHRP) and Private Rental Market Statistics (PRMS) and is used for estimating the owner occupiers’ housing costs (OOH) element of CPIH.

PIPR was published for the first time in March 2024, following which, the ONS requested that we assess the statistics against the Code with a view to them becoming accredited official statistics. This process was undertaken in order to provide assurance to users and stakeholders on the quality and reliability of the estimates. We undertook our review at pace, publishing in October 2024.

The OSR review judged that “ONS’s new PIPR statistics generally appear to be meeting users’ needs more effectively than the previous private rents measures that these statistics have replaced.” However, we also concluded that “although the ONS has published supporting methods and quality documentation for PIPR, this does not currently amount to a sufficiently accessible and detailed account of PIPR methods to enable an adequate understanding of the approaches used, the ONS’s rationale for choosing them, and their relative strengths and limitations, for both technical and non-technical users.”

Ultimately, our review determined that the ONS will need to develop and publish the necessary materials; publish NI and full UK PIPR-based estimates; and facilitate an effective evaluation of the UK PIPR series with users before we will consider initiating a full assessment of whether these statistics merit accredited official statistics status.

The review set out five requirements that the ONS will need to address as it further develops the PIPR statistics and required that the ONS publish an action plan by January 2025 setting out how it will address these requirements, and report back to us publicly every three months on its progress. This process ensures that there is transparency for stakeholders and users and that the ONS is held to account.

In February 2025 we set out a forward work plan on assuring confidence in consumer and household price statistics. We have since initiated a review that will focus specifically on the ONS’s approach to transforming its consumer price statistics, in advance of a full re-assessment of the CPI and CPIH statistics. We will also begin a review of Household Cost Indices (HCIs) later this year.

Labour market statistics

We have focused our work around the challenges that the ONS has faced with its labour market statistics on three distinct themes:

  • Recognising the changing labour market
  • Declining response rates
  • Transforming labour market statistics

Responding to a changing labour market

Employment and jobs statistics are essential for understanding the patterns and dynamics of the UK labour market. They are used widely by a variety of stakeholders – for example, within the UK Government and by the Bank of England – to develop and monitor government policies, so it is important that they are accurate, high quality and clear in order to fully serve the public good.

Over the last few years, labour market statistics have faced a variety of challenges related predominantly to falling response rates. OSR has provided regulatory oversight of these issues and the ONS’s response, undertaking a significant volume of regulatory work on labour market statistics in the past few years as set out below.

In response to growing concerns in 2020 about the reliability of the Labour Force Survey (LFS), we assessed the UK employment and jobs statistics produced by the ONS.

The report emphasised the importance of the ONS adopting a flexible approach. It highlighted that the labour market and economy are constantly changing, and that the statistics describing the labour market must adapt to reflect those changes, including by embracing new data sources and navigating the impact of COVID-19. We concluded that “ONS needs to demonstrate drive and ambition to fill the data gaps and match the pace of change in the labour market, engaging effectively with users to ensure their needs are met.”

In our report we identified areas of good practice such as the labour market statistics team’s collaboration and engagement with a wide range of users and stakeholders. We also set out 12 requirements for the ONS, which were necessary in order to ensure that these statistics could continue to be designated as accredited official statistics (then referred to as National Statistics).

Response rate challenges

We have seen a long-term trend of declining response rates for the LFS, which became acute when the sample boost put in place to enable pandemic operations was removed in July 2023. Following this, in October 2023, the ONS suspended publication of its estimates of UK employment, unemployment and economic inactivity based on LFS data and announced that it would publish a new experimental series, using additional data sources, in its place. This short-notice change to the methods of a key series had a significant impact on user confidence.

In response, we immediately announced and initiated a rapid review of these experimental statistics which was published in November 2023. This review set out key requirements for the ONS on:

  • suitable data sources
  • sound methods and quality assurance
  • clarity of communication
  • managing quality

Reflecting the significant concerns about quality, the review resulted in the removal of accredited official statistics status from the LFS-based estimates. Following enhanced quality information provided by producers at our request, we also removed the accreditation from other outputs based on data from the Annual Population Survey, which draws on responses to waves 1 and 5 of the LFS plus a boost sample.

When the ONS reintroduced the LFS-based labour market statistics in February 2024, we carried out a short review. A key theme that emerged was the need for improved, clear and open communication from the ONS. The review set out requirements around: communication of plans and priorities; accessibility of updates and communications; explaining how the data should be used; communication of data quality issues and improvements; and the transition to the Transformed Labour Force Survey (TLFS). In August 2024, we carried out a follow-up review to check progress against these requirements.

ONS’s Transformation plans for the Labour Force Survey

We have also carried out regulatory work throughout the development of the Transformed Labour Force Survey (TLFS), which the ONS developed in parallel with its work on the LFS and which was intended to address many of the concerns and shortcomings of the LFS. This work has also faced significant delivery challenges.

We carried out our TLFS review in three phases with the aim of sharing early regulatory insights to help the ONS in ensuring the new survey meets the standards of the Code. The first phase (which started in April 2022) focused on the design and development work the ONS had planned before transitioning to the new survey approach.

We published our initial findings on the TLFS in November 2022 which set out a range of requirements, including on enhancing public confidence and maximising the public value of the TLFS; communicating impacts; and supporting public confidence in the transformation process. In July 2023, we published an updated letter and progress report following phase two of our review.

In February 2025, we reported the outcome of phase three of our review of the ONS’s LFS transformation. This report consolidated OSR’s work on both the LFS and TLFS, bringing together our judgements to date and providing updates on the remaining open recommendations and requirements.

Following recommendations set out by OSR, the ONS has widened its user engagement through the introduction of a stakeholder panel and expert data sharing groups, and has been publishing updates on the progress of, and plans for, its labour market transformation. In December 2024, the ONS also published an interim action plan based on the results of the ‘lessons learnt’ exercise it conducted in summer 2024, published the detail of an independent methodological review, and explained its plans in an accessible way. The ONS has since revised its plans for the TLFS. In response, we made further recommendations for the ONS to set out detailed plans for transitioning to the TLFS, and to set out plans for regular reporting on the progress of the interim action plan from its ‘lessons learnt’ exercise. We have asked the ONS to report on progress again by July 2025.

We continue to monitor the ONS’s work to improve the LFS closely and will maintain our engagement with the ONS and users to understand whether these changes have increased quality sufficiently to meet user needs.

Review of ONS economic statistics

In April 2025, we published our report based on our Systemic Review of ONS Economic Statistics. The report provided a synthesis of the concerns surrounding the ONS economic statistics that had emerged from our work over the last five years, and feedback from stakeholders.

The report was direct in recognising the need for urgency in addressing the declining stakeholder confidence in ONS’s economic statistics, concluding that:

  • The ONS must fully acknowledge and address declining data quality
  • Making progress with administrative data is difficult
  • Greater strategic clarity of purpose and transparency on prioritisation would help reassure external stakeholders

The review also set requirements that the ONS must address:

  • Restoring confidence, by producing a fully resourced plan to recover its social survey operation and reduce risk in its business survey operation.
  • Ensuring strategic transparency, by clearly setting out the core purpose of economic statistics and what can be achieved with available funding in its business plan, a strategic plan for economic statistics and a strategic plan for data sources.
  • Focusing on the quality of data inputs, by implementing a prioritised rolling programme of regular reviews of individual surveys and other data sources.

Population statistics

The UK’s population statistics are going through a period of profound challenge and change.

This section sets out how OSR’s work on population statistics has highlighted the effective work undertaken by the ONS, National Records of Scotland (NRS) and the Northern Ireland Statistics and Research Agency (NISRA) on the 2021 and 2022 Censuses, but also draws out the issues surrounding measuring gender identity and the opportunities and challenges arising from the use of administrative data.

Censuses in the UK

We have conducted assessments of the censuses produced by the ONS, NRS and NISRA. Our assessments have been conducted in three phases. In October 2019, we published our reports on Phase 1, focusing on the planning and consultation activities undertaken by the census offices across the UK. In November 2021, we published our Phase 2 Assessment reports, focusing on the strategies for developing and providing outputs for both the England & Wales Census 2021 and the Northern Ireland Census 2021. For Scotland Census 2022, our Phase 2 Assessment report was published in April 2023.

The phased approach is essential for the Census outputs because we have historically granted accredited official statistics status at the end of phase 2, prior to the publication of the outputs themselves. This reflects the national significance of these statistics and the importance of reassuring users of their quality at the time of publication, rather than retrospectively. It is therefore essential that the census offices meet the requirements set out in phases 1 and 2 before accredited official statistics status is granted.

For Scotland’s Census 2022, NRS faced unexpected challenges because the overall response rate was lower than anticipated (89.8% against a target of 94%). The media and users were concerned that statistics derived from the Census would not be fit for purpose because of the response rate. As a result, NRS took a number of steps, in collaboration with international census experts, to change how the final census estimates were calculated, using administrative data alongside the Census Coverage Survey and census responses in the estimation process. Drawing on OSR’s principles of intelligent transparency and our guidance on communicating uncertainty, NRS carefully considered its communication approach for the first outputs of Census 2022 data. We commended NRS’s dedication to meeting the needs of users and following the standards of the Code of Practice, including most recently by advocating this good practice in a published communicating uncertainty case study.

We published our phase 3 assessment report of the 2021 Census in Northern Ireland in February 2025. This final report confirmed that the 2021 Census statistics in Northern Ireland are produced in compliance with the Code. Our phase 3 assessment of the 2021 Census in England and Wales is ongoing with a projected summer 2025 publication date. Our phase 3 assessment of the Scotland Census will be undertaken in 2025/26.

Review of Gender Identity in the ONS 2021 England and Wales Census

Information on individuals’ gender identity was collected on a voluntary basis for the first time in the ONS 2021 England and Wales Census. As such, the data provided the first nationally available estimates for England and Wales of the size and characteristics of the trans population. In addition, the question developed for the Census represents the current Government Statistical Service (GSS) harmonised standard, still in development, for collecting data on gender identity.

Following the first release of census statistics on gender identity in England and Wales in January 2023, concerns were raised about the published estimates of the trans population. As additional census data were published, these concerns extended to the relationship between gender identity and proficiency in English. OSR also received concerns about the level of methodological information published.

OSR undertook a review of these statistics and published an interim report in October 2023 and a final report in September 2024. Learning from new evidence in Scotland’s Census, the ONS wrote to us on 5 September 2024 to request that the gender identity estimates from Census 2021 in England and Wales should no longer be accredited official statistics, and should instead be classified as official statistics in development. The ONS’s proposal was consistent with our report findings, and we accordingly removed the accreditation for these statistics. We also concluded that the issues were unique to the statistics on gender identity, and therefore all other outputs from the Census 2021 in England and Wales are unaffected and remained as accredited official statistics. Our work also found that the ONS had been somewhat closed and at times defensive, in responding to concerns raised by users.

Our final report shared our recommendations on the steps the ONS must take to help users of the census gender identity statistics understand their strengths and limitations and set out the development work we consider is required on the GSS gender identity harmonised standard.

The ONS wrote to us in December 2024 updating us on its progress towards meeting the recommendations. This included publishing a workplan for developing harmonised standards for sex and gender identity data collection and new Gender Identity Data Harmonisation interim guidance for statistics producers.

Following these publications, we updated our existing guidance on collecting and reporting data about sex and gender identity in official statistics in December 2024 to include these new publications.

On 26 March 2025, the ONS published a blog further updating on the actions it is taking to meet our recommendations. These actions included publishing additional guidance on the appropriate use of the gender identity estimates from Census 2021 in England and Wales and information on the uncertainty associated with them. We consider this to be an excellent research report, which includes example use cases at different levels of geography and population and addresses anomalies and implausibilities. We are confident this practical information will help users to better understand the uncertainty in the data and its implications for use.

The ONS is also making progress with developing harmonised standards for sex and gender identity data collection. We have asked that the ONS continues to keep OSR updated as it develops these harmonised standards.

Admin-based population estimates for England and Wales

Admin-based population estimates (ABPEs) have huge potential to provide more timely, detailed and potentially more accurate population data than traditional census-based methods. The ONS intends the ABPEs to become the official population estimates for England and Wales in 2025. Given the extensive use of population statistics, it is vital that this new methodology has appropriate oversight and scrutiny.

Our phased assessment of the ONS’s admin-based population estimates (ABPEs) for England and Wales aims to provide reassurance to users on the ONS’s new methods for producing population estimates in England and Wales.

We published our phase one assessment of these statistics in July 2024, which focused on reviewing quality. As part of this assessment, we commissioned an independent review from Professor Arkadiusz Wisniowski of the University of Manchester to inform our judgements around the suitability and quality assurance of the data and methods. Our assessment identified 11 requirements for the ONS to act on that will help to enhance the public value, quality and trustworthiness of these statistics. These requirements covered areas such as governance, data quality, methods, revisions, user engagement, and communication. The requirements included:

  • Requirement 1: To maintain public confidence in its population statistics, the ONS needs to understand the current dependencies between the ABPEs and the mid-year estimates (MYEs). Together with key stakeholders, such as the Welsh Government, the ONS should also develop and publish criteria to support its decision about when the ABPEs will replace the MYEs. The criteria should include statistical quality, operational readiness, planned evaluation and assurance processes and contingency plans, and be usefully applied to the ABPEs and MYEs.
  • Requirement 2: To ensure that there is sufficient oversight and leadership of the production of ABPEs in a way that is joined-up across the ONS, and support the ongoing development of ABPEs, the ONS should strengthen its governance structure. Work here should include establishing clearly defined decision-making responsibilities to manage any risks associated with funding, capability and prioritisation across the ABPEs production process.
  • Requirement 8: To instil confidence in the ABPEs and ensure that the DPM methods are sound and subject to sufficient independent and external challenge, the ONS should:
    • continue with its plans to create a sub-group of its Methodological Assurance Review Panel (MARP; the independent panel used by ONS to provide advice and assurance on methods used to produce official statistics).
    • create and implement an expert user group.
    • make it easier for users to find relevant MARP papers to support technical user understanding of the methods used in the DPM.

Since our report was published, the ONS has used our findings to help shape and steer its development work for the ABPEs. In October 2024 the ONS published an action plan for how it will develop population statistics, setting out that the work to address and build on the requirements and recommendations from the assessment will be iterative. Over the last six months, and in response to our findings, the ONS has developed and published a population and international migration statistics revisions policy, introduced quarterly updates to keep users informed of its plans (including its work on ABPEs), and increased its user engagement activities in a co-ordinated and transparent way. We continue to engage with the ONS as part of our follow-up phase to scrutinise its activities, and we will consider the next phase of our assessment in summer 2025.

Other regulatory work

Domain summaries

OSR is structured into eight topic teams, called ‘domains’. The two preceding sections have summarised some of the key work undertaken by our Economy, Business and Trade domain and our Population and Society domain respectively.

This section provides a short overview of our remaining six domains as well as recent regulatory work.

Children, Education and Skills: This domain oversees the regulation of data and statistics concerning all stages of education, from early years to university and beyond, including statistics on teachers and lecturers, learners, and looked-after children. We recently carried out an assessment of the Higher Education (HE) Graduate Outcomes Data and Statistics produced by Jisc under the Higher Education Statistics Agency (HESA) brand. We published our report in April 2024 and confirmed the accreditation of the statistics without requirements for improvement. We are providing support to the newly created Welsh Government Sponsored Body, Medr, which is responsible for a number of official statistics previously published by the Welsh Government in addition to outputs from the Higher Education Funding Council for Wales (HEFCW). In previous years we have also carried out assessments of the Achievement of Curriculum for Excellence Levels statistics produced by the Scottish Government, and the Key Stage 4 performance statistics for England produced by the Department for Education.

Crime and Security: This domain covers statistics on crime, policing, justice systems (family, civil and criminal) and national security. This domain has undertaken a number of significant and impactful reviews over the last few years, including: the fraud and computer misuse statistics for England and Wales published by the ONS; the quality of Criminal Court Statistics for England and Wales produced by the Ministry of Justice, based on data from HM Courts and Tribunals Service (HMCTS); and police recorded crime statistics published quarterly by the ONS, based on Home Office data collected from the 43 individual police forces in England and Wales and the British Transport Police. We have also carried out assessments of the Scottish Prison Population statistics produced by the Scottish Government and the Police Officer Uplift statistics produced by the Home Office.

Health and Social Care: This domain oversees the regulation of statistics concerning the health of the UK population and the health and social care services provided in England, Wales, Scotland and Northern Ireland. Our Health and Social Care domain played a vital role during, and following, the COVID-19 pandemic in rapidly reviewing the quality of statistics being used by Government and the public.

In addition to work on statistics relating to COVID-19, the domain has also assessed Accident and Emergency (A&E) Activity Statistics in Scotland and statistics about the workforce employed by adult social services departments in England.

Housing, Planning and Local Services: This domain oversees statistics on a range of topics, including: house building; household estimates and projections; homelessness and rough sleeping; housing need and demand; land stock, use and development; and local authority planning. It also covers information on local services such as fire and rescue services. This domain has undertaken a wide range of compliance checks in the last few years, including on Statistics on Council Tax in Wales, Social Housing Lettings in England produced by the Ministry of Housing, Communities and Local Government, and Valuation Office Agency Council Tax statistics. The domain has also assessed statistics on Statutory Homelessness in England produced by the then-named Department for Levelling Up, Housing and Communities (DLUHC).

Labour Market and Welfare: This domain includes statistics measuring different aspects of work and jobs, covering people’s employment, working patterns and the types of work they do, as well as any earnings and benefits they receive. Currently, the domain is primarily focused on the LFS and TLFS, as summarised in the ‘Economic statistics’ section above. Other examples of its work include an assessment of the Personal Independence Payment statistics produced by the Department for Work and Pensions.

Transport, Environment and Climate Change: This domain covers statistics on transport and transport infrastructure; food and farming; the natural environment; energy; fuel poverty; and climate change. The domain maintains strong stakeholder relationships with the wide range of producers active in this topic area. We have intervened on subjects that have attracted significant public interest, such as improving the transparency of the Welsh Government’s 20mph speed limit data. We have previously undertaken regulatory reviews of key biodiversity indicators, including our assessment of Defra’s butterfly statistics, as well as more systemic reviews, such as our review of the accessibility and coherence of UK climate change statistics.

Transparency on the use of data

One of our flagship campaigns relates to the way statistics and data are used in communication. The campaign is based on our principles of intelligent transparency. At its core, intelligent transparency requires that any statement involving statistics or data made by those in government be based on publicly available data, preferably the latest available official statistics.

There have been several high-profile endorsements of intelligent transparency including the report from the PACAC on Transforming the UK’s Evidence Base in May 2024, which commended our work on intelligent transparency and noted that “This Intelligent Transparency guidance has driven the publication of several datasets which would otherwise remain hidden to members of the public, and has been welcomed by many organisations who rely on good data”. The Royal Statistical Society (RSS) also supports the campaign and has integrated the principles of intelligent transparency into the RSS’s new Principles to support statisticians making trade-offs in pressurised situations.

Casework

Through our casework process we regularly receive complaints about the use of data and statistics, often relating to our principles of intelligent transparency. This process is vital to OSR’s role in upholding standards for the use of official statistics and wider data in public debate, holding individuals and organisations to account when needed. In the international landscape of official statistics, this function is relatively unusual, but we consider it an important part of underpinning confidence in statistics and data. This section provides some examples of our recent key interventions.

In October 2024, we were made aware of an unsupported statement made by the Prime Minister, Sir Keir Starmer, at the Labour Party Conference regarding immigration returns. At the time the Prime Minister made the claim, there were no Home Office data or statistics in the public domain for the relevant time period to support it. We worked with the Home Office, and this led to the publication of an ad hoc statistical release providing the underlying data relating to the statement.

In March 2025 we wrote to Peter Schofield, Permanent Secretary for the Department for Work and Pensions (DWP), regarding a statement in a press release on the number of people on Universal Credit health with no requirement to look for work. We judged that the statement that the number of people claiming disability elements of Universal Credit had “increased by 383%” presented an ‘entirely misleading’ picture to the public, as it did not recognise that the majority of this increase was due to the process of migrating people from legacy benefits over the last few years. When these people are accounted for, the actual increase in the number of people claiming disability elements of Universal Credit is around 50%. We requested that the press release be updated that week to remove the reference to the 383% figure and that the figure not be used going forward. DWP actioned the change to the press release shortly after, and the Permanent Secretary responded to us committing to involving lead statisticians and analysts at all stages of the process, with appropriate oversight from the department’s Head of Profession for Statistics.

In October 2022, we wrote to the Scottish Government in relation to concerns that had been raised with us about the NHS Inform dashboard, which showed the numbers of patients treated in the last quarter and their median wait times by clinical specialty. However, patients who had not yet been treated, some of whom may have been waiting a long time, were not included in these statistics. As such, we judged that the dashboard could mislead some patients about the length of time they might have to wait. Based on our recommendations, the Scottish Government implemented improvements to the way the figures were presented in late 2022. In October 2024, Public Health Scotland wrote to us outlining plans to overhaul the dashboard, resulting in a range of improvements to the presentation of the statistics and a better reflection of people’s actual experience of waiting for appointments and treatment.

During the lead up to the 2024 General Election, we published a statement on claims made by the Conservative Government about the UK’s plan to “increase defence spending to 2.5% of GDP by 2030 – an increase of £75 billion”. We determined that the figure of £75 billion did not provide a clear picture to the public as it assumed that annual spending on defence would remain flat in cash terms. If the calculation assumed that defence spending was held at the share of GDP originally planned for 2024-25 then the proposed cash ‘increase’ would drop from £75 billion to £25 billion. Our statement notes: “Cumulating spending increases (or cuts) over several years to derive a large cash figure for presentational purposes does not in general facilitate public understanding of the data in question – the longer the period you choose, the bigger the number you get.”

An evolving statistical system

The UK data and statistics environment is constantly evolving, presenting new challenges and opportunities for official statisticians and analysts. We consider many of these aspects through our reviews and much of our wider work.

The State of the Statistical System

The State of the Statistical System is an annual report produced by OSR which presents our view on the performance and challenges facing the UK’s statistical system.

The 2023/24 report, published in July 2024, emphasised the increasing strain on the system due to financial and resource pressures, and the need to prioritise core statistics to ensure they are adequately resourced and funded.

To address the issues in the report we set out a number of recommendations. These included that the GSS develop a strategic plan for household data and invest more in its approach to engagement, and that the statistics system shares knowledge and best practice on delivering transformation programmes.

Data sharing and linkage

In 2023 we published a review of data sharing and linkage across government, with 16 recommendations for the statistical system, as well as a follow-up report in July 2024 which assessed the progress that had been made.

Our 2023 report had positive impacts on several projects relating to data sharing and linkage. These impacts include influencing the strategic approach taken by the Department for Science, Innovation and Technology (DSIT) to reviewing cross-government data sharing policy; developments in the Data Marketplace led by the Central Digital and Data Office (now Government Digital Service); the implementation of Wave 2 of the Public Engagement in Data Research Initiative (PEDRI); and technical innovation by the ONS Data Science Campus in developing new privacy-enhancing technologies (PETs).

However, our 2024 follow-on report concluded that despite welcome pockets of innovation, there continues to be a failure to deliver on data sharing and linkage across government, with many persisting barriers to progress. Linking datasets for research, statistics and evaluation – both across government and among external researchers – is not yet the norm in the UK statistical system. To make this a reality, stronger commitments to prioritise data sharing and linkage are required. Such commitments further need to be endorsed and sustainably resourced by senior political and Civil Service leadership.

Our report also acknowledged specific process barriers to data access and linkage. Among these, we noted that concerns that data use cases are often too tightly defined to enable the use of data in policy development are particularly relevant to the success of the Integrated Data Service (IDS). OSR are working with members of the IDS team and the UKSA Research Accreditation Panel to consider programmatic access.

Conclusion

This submission summarises the effective work of OSR, and the tools we use to uphold compliance with the Code of Practice for Statistics. It shows how our work ensures accountability for the production of official statistics that comply with the principles of Trustworthiness, Quality and Value. This submission also provides clear examples of where our work has identified issues, set requirements, and secured change across a wide suite of statistical outputs and data practices, often against a backdrop of significant issues or concerns.

 

Ed Humpherson

Office for Statistics Regulation

May 2025

 

How well served are policy-makers, researchers, businesses and citizens, by the data that ONS produces and the services it provides?

Summary

The Office for National Statistics (ONS) is the UK’s national statistical institute and largest producer of accredited official statistics. It produces statistics, data and analysis to support a wide range of users including decision makers, researchers, businesses, and citizens. The ONS continually engages with its users to understand and meet their evolving needs and ensure that its outputs and services are of a high quality, accessible and relevant.

As the Committee is aware, the National Statistician, Professor Sir Ian Diamond, resigned earlier this month due to health issues. Emma Rourke, Deputy National Statistician for Health, Population and Methods, will be Acting National Statistician pending longer term arrangements being put in place.

The ONS continues to face challenges, including falling survey response rates, and is operating within a tight financial and human resources environment. We remain committed to continuous improvement of our methods and approaches. Alongside this, there is an ongoing independent review of the performance and culture of the ONS, led by Sir Robert Devereux, former Permanent Secretary at the Department for Work and Pensions, and the soon-to-be-completed Spending Review. In this context the ONS continues to review its priorities and will make changes to its work as required, to further strengthen our approach to continuous improvement and enable the organisation to deliver our core mission of providing statistics for the public good.

The Office for National Statistics

The ONS is the executive office of the UK Statistics Authority (the Authority). It delivers independent, high quality and relevant statistics and analysis. The wide range of economic and social statistics we produce includes the UK’s National Accounts (such as gross domestic product (GDP)), vital events statistics (such as births, marriages and deaths) and labour market statistics (such as employment, unemployment and earnings) amongst others. The ONS also designs and runs the census in England and Wales every 10 years.

Our statistics and analysis are crucial evidence for decision making and monitoring by central and local government, the health service, businesses, charities and communities across the UK. They also inform public debate.

The ONS responds to changing contexts and user demand for more flexible, tailored and granular data. We are transforming our approach to how we produce statistics across the economy, population and society. This includes advancing data linking across government to enable faster, evidence-based decisions, and grasping the opportunities and challenges of new technologies (such as artificial intelligence (AI), including large language models) to shape and support thriving analytical and statistical systems for the future.

Our priorities are driven by our statutory objective set out in the Statistics and Registration Service Act, the UK Statistics Authority strategy for the statistical system ‘Statistics for the Public Good’, and the relevance and impact of our work to users and the public, with a focus on where we are uniquely placed to deliver.

ONS Strategic Business Plan 2025-2026

We have been open about the challenges the ONS has faced in recent months and set out a renewed focus on our core statistics in the ONS Strategic Business Plan for April 2025-March 2026, published in April 2025. This highlights that delivering our suite of economic and population statistics remains our core function and reflects decisions we have made to prioritise resources. GDP, prices, labour market and population statistics take prominence in our outputs.

We remain focused on producing the highest quality statistics and are committed to continuous improvement of our methods and approaches. Alongside our core outputs, we are undertaking vital transformation work, including delivery of our labour market statistics and improving the quality, granularity and timeliness of our prices data. Acknowledging the complexities of the challenges and the vital importance of our statistics to users, we have strengthened our engagement with stakeholders and channels for external challenge, support and expertise to inform our approach.

Our four key strategic priorities, which will guide the day-to-day activity of the organisation, are:

  • An enhanced reputation for delivering trusted, relevant, independent statistics and analysis.
  • Top quality published statistics on prices, GDP (including trade and public sector finance), the labour market and population (including births, deaths and migration).
  • Supporting the Government’s missions and other users by maximising the use of our statistics and responding to evidence gaps where we are uniquely positioned to do so.
  • Greater linked data capabilities that result in faster, evidence-based decisions across government.

Given the tight financial and human resources backdrop and the need to prioritise our most critical statistics, difficult decisions, including to stop or reduce work, will need to be made in the coming period. While the prioritisation necessary to remain affordable will not satisfy demand from some users, we will continue to deliver impact by protecting our core deliverables.

Understanding user needs

The services that we provide depend on listening to and working in partnership with our users. We engage with a wide range of users and stakeholders to increase both their understanding of our work and our understanding of their evolving needs. We are committed to ensuring our statistics are accessible, inclusive and trustworthy, representing and serving everyone in society.

We engage with users through a range of methods including regular meetings, consultations, stakeholder surveys, events, tailored explainer webinars for specific audiences, focus groups, expert advisory panels and user research.

As well as engaging our users on our statistics and analysis, we regularly seek feedback on their levels of satisfaction. Our most recent feedback showed high levels of use of our core statistics on population and the economy along with agreement that we fulfil our mission to produce “High quality data and analysis to inform the UK, improve lives and build the future”, are trustworthy and that our statistics are relevant and of a high standard.

In listening to users, we are also able to better understand how the challenges we face affect them: for example, how falling survey response rates, particularly for labour market statistics, are affecting economic decision making. Users have also highlighted delays to some publications, the need for improvements to our website and a desire for more granular data across multiple topic areas. We fully recognise these points and have plans to address them.

In her Review of the Authority, Professor Denise Lievesley recommended that ‘It is time for the Board to move into a more visible, ambitious space, primarily through establishing a Triennial Statistical Assembly which will consult widely with statistics users and producers to understand the range of views regarding the priorities and data needs for the UK’. In response to this recommendation, we held the inaugural UK Statistics Assembly in January 2025. It was attended by over 550 participants from a wide variety of sectors and roles.

The Assembly was summarised in an independent report by Professor David Hand, the then Chair of the National Statistician’s Expert User Advisory Committee (NSEUAC), which highlighted four high-level priorities for the Authority and Government Statistical Service:

  • Re-invigorate sustained and effective user engagement
  • Ensure user needs for more granular statistics are met
  • Commit to, invest in, and take a leadership position in a significant scaling up in the use of administrative data, as well as improvement of its quality and coherence
  • Recognise the needs for UK-wide statistics and advocate for, and support, harmonised data where desirable.

We plan to build on the success of the Assembly through a refreshed user engagement strategy, taking these priorities into consideration as we do so. The Authority Chair, Sir Robert Chote, will also deliver a lecture in July setting out the progress of the statistical system and priorities, drawing on the insights of the Assembly and the Office for Statistics Regulation’s (OSR) annual State of the Statistical System report.

Meeting user needs

Through our programmes, transformation work and statistics, the ONS is working hard to deliver statistics and analysis that meet user needs. This has involved significant prioritisation. The following paragraphs provide examples of work we do to understand and meet users’ needs and to inform policy makers, researchers, businesses and citizens.

Dissemination of Statistics

The ONS’s outputs, in line with the rest of the Government Statistical Service (GSS), are regulated by OSR. Equality of access to official statistics is a fundamental principle of statistical good practice. We publish our statistical releases on the ONS website and users are also able to request extra information from the ONS and see the information others have requested.

We are committed to improving the user experience on the ONS website and over the past year have been addressing website performance and stability as well as wider improvements to address feedback from users. We have launched new website page previews, most recently on prices statistics, with new navigation, page designs and smarter content for users to provide feedback on.

Priority issues for decision makers

The ONS works closely with partners to provide responsive analysis that directly addresses policy priorities, including the missions introduced by the Government. These statistics, and many others across the organisation, provide vital insights for policy formulation across government.

The National Statistician also leads the GSS and Analysis Function. The ONS sits at the heart of the GSS and Analysis Function and works with the network of UK Civil Servants to provide the statistical evidence base, professional advice and analysis required by decision-makers to ensure policy and operations are evidenced and deliver value for money.

This collaborative approach to delivering statistical outputs, responding to analytical demand and continuously improving our statistics across the statistical system will continue to be an underpinning element of our plans in 2025/26.

The ONS’s Analytical Hub has a close partnership with the Joint Data and Analysis Centre (JDAC) in the Cabinet Office where we directly provide data and strategic analysis to support policy and decision making at the heart of government.

Delivering rapid insights in changing circumstances

We publish a range of statistics to provide timely indicators for users covering the effect of developing world events on the UK economy and society.

For example, the Opinions and Lifestyle Survey (OPN) collects information monthly on a variety of topics relating to people’s experience of daily life and events, including questions about what people feel are important issues, and their health and well-being. The content of the survey changes regularly to keep pace with users’ changing requirements.

The value of ONS surveys such as the OPN and the Business Insights and Conditions Survey (BICS) was prominently demonstrated during the Covid-19 pandemic. They were regularly updated and adapted to reflect changes in policy and in our understanding of the virus. The ONS consulted a wide range of other government departments on a regular basis as it developed questions.

Real-time indicators are also invaluable for enhancing and developing core statistics by providing timely insights that complement traditional data sources. For example, during the Covid-19 pandemic, the ONS used real-time data from sources like card spending and mobility indicators to quickly gauge economic activity and societal changes.

Providing safe research environments for accredited and approved researchers

We make de-identified record-level data available to accredited or approved researchers through the Secure Research Service (SRS) and the Integrated Data Service (IDS) to facilitate research projects for the public good. The SRS makes static snapshots of data available to researchers through a Windows desktop environment, while the IDS makes flexible views of indexed data available using the tools and scaling offered by Google Cloud. The SRS is one of the largest trusted research environments in the UK, with around 6,000 accredited researchers having potential access. Of these, around 1,500 are actively working on research projects at any given time.

Building on the success of the SRS, we are now migrating users with the highest value use cases to the IDS, enabling far greater flexibility and broader ranging analysis work programmes.

This will also enable far greater insight into our core national statistics, relating for example to trade, employment and growth, and the interactions between them at national, regional and local levels.

Delivering Local and Sub-national Insights

ONS Local is a dedicated analytical advisory service for local government and local decision makers, with teams based in each of the English regions and in Wales, Scotland and Northern Ireland. A core part of the team’s role is engaging with local users to understand data and information needs and gaps, in addition to producing bespoke analysis for local stakeholders. ONS Local and our subnational statistics together provide unique local information to support national, regional and local leaders’ understanding of topics through a place lens.

To this end, the ONS has developed a tool called Explore Local Statistics, that allows users to find, visualise, compare, and download local data in one place. This service brings together a wealth of data across various topics, including the economy, education, health, and wellbeing, making it easier for local leaders and decision-makers to access the information they need. Users can search for data by postcode, local authority, region, or parliamentary constituency, providing detailed insights into specific areas. The service also allows for comparisons between different areas or clusters with similar demographic or economic characteristics, enhancing the ability to make evidence-based decisions.

How is the UK’s data environment evolving, and what challenges and opportunities does this present official statisticians and analysts?

The UK’s data environment is evolving at pace. There are more data, insights, and opportunities, and the ONS is acting to realise the full value of data while maintaining high levels of trust and transparency.

Challenges remain in streamlining and simplifying approval processes for these new sources and in developing the skills and capability to make best use of them, while continuing to address the challenges of collecting information from traditional survey sources.

Data capability and skills within our organisation are key. As part of capability building and future proofing, the GSS is collaborating with the Royal Statistical Society (RSS) on a project about the future of the statistical profession, examining what the role of a statistician, including their skills and training, will look like in the future.

Below we expand on some of the current opportunities and challenges relating to the evolving data environment.

Survey data collection challenges

Changes in society and the pace of technology are having a direct impact on how people perceive their data and interact with surveys and government services.

ONS surveys, both social and business, are the primary data source for statistical producers in the UK, central to the nation’s most significant economic and social indicators, and despite the increasing use of administrative data, they remain vital.

The sharp fall in household survey response rates, both in the UK and internationally, is well known and is often linked to increased difficulty in accessing properties, greater caution about sharing information, and declining trust in government and public institutions. This can affect data quality, as evidenced most prominently by the impact of survey response rates on the Labour Force Survey (LFS). The ONS’s planned long-term solution to the challenges faced by the LFS remains the online-first Transformed Labour Force Survey (TLFS), more details of which can be found on the ONS website.

Although administrative data sources are already being integrated into economic outputs and remain our preferred data source, we recognise that surveys currently still play a vital role in collecting data from businesses. Direct data collection tools also become more important as AI is increasingly used, given the need for new, high-quality data.

Linked to this, but a separate challenge, are our legacy statistical production systems. As highlighted in our business plan, addressing our IT and coding legacy systems remains a key focus for 2025/26.

A legal requirement to complete ONS business surveys, coupled with predominantly online collection, has seen business survey response rates improve to pre-pandemic levels.

What the ONS is doing in relation to survey data collection

The ONS is increasingly integrating a wide variety of data sources to produce high-quality research and statistics and adapting survey methodologies to encourage greater coverage. We have built relationships across the public sector to implement regular flows of anonymised administrative data collected as part of the delivery of frontline services, such as tax, benefits, health and education. We have also built relationships with the private sector and other bodies, enabling secure access to anonymised big data, such as aggregated financial transactions and mobile network operator data, and scanner data from supermarket sales.

The ONS has a vision for a more efficient and effective social survey system, delivered through innovative use of administrative and alternative data across the end-to-end survey process. We are also exploring and embedding opportunities to use AI and other innovative digital technology across that process. This will significantly improve how we meet changing data user needs with a sustainable, robust, resilient and agile survey delivery system.

To enable urgent quality improvements to the social surveys feeding economic statistics, we received additional funding from HM Treasury in 2024/25. This was invested in increasing our interviewer numbers, implementing changes to bolster retention, and increasing incentives for survey respondents. This has helped to improve performance on key surveys, but we need to do more across our full survey portfolio. Our survey recovery strategy, including focused resources in this area, will help us continue to improve response rates in an increasingly challenging environment.

The ONS is already transforming how we collect data, moving to an integrated and modularised suite of surveys through our Business Survey Strategy. This will reduce the burden placed on businesses and improve engagement, thereby improving the quality of the data we collect. We are also expanding the use of AI and moving to cloud-based collection and production systems.

Administrative data

Many opportunities and challenges are linked to the growing scale and types of data that are available, in particular, the proliferation of administrative data within the public sector and the growth in alternative, big data.

The increasing availability and content of administrative data present the opportunity for the ONS to produce more frequent and timely statistics that sustain a better level of quality over time.

The quality of administrative data varies, with issues including data completeness, differences between concepts (including reporting periods), timeliness and frequency of data deliveries, consistency of data deliveries and the availability of metadata. Nonetheless, linking administrative data with survey data provides valuable insights into how they can be used to improve our understanding of survey bias, and to develop new outputs that make the most of the strengths of all sources.

The ONS has been investigating the use of primarily administrative data to produce annual population estimates, and a range of other types of estimates historically enabled only by the census. The ONS has worked with statistical offices in Scotland, Wales and Northern Ireland to consider the viability of taking similar approaches across the UK, with an agreement on this topic signed in November 2022.

This is an area of ongoing research and development, and the Authority is set to announce its recommendation on the future of the census in England and Wales in the summer.

The ONS has already moved to administrative measures of international migration due to unavoidable challenges with its traditional sources for these statistics. The ONS has confidence in the long-term strategy for migration estimates, and their future coherence with admin-based population estimates, and we are working with users to increase confidence as new methods mature.

A key enabler of this work is improving the sustainability of the supply of administrative data to the ONS, and improvements to their content and quality, working in partnership with data suppliers.

As we consider embedding new methods, and question the role a potential future census might play in the ONS’s long-term statistical design for population statistics, there is a need to balance ambitions for research and development alongside the requirement for a steady state of operational delivery in line with user needs, and resource constraints. This is a challenge for the whole of the UK, not just the ONS.

Data sharing across Government

Through extensive engagement across government departments, we are making some progress in acquiring new administrative data sources. However, this continues to be very time consuming, and each data sharing agreement can take months or years to agree.

Data owners are understandably risk averse, often resulting in complex agreements, with conditions varying significantly between sources. In her Review of the UK Statistics Authority, Professor Denise Lievesley highlighted that 'systemic and cultural barriers to responsible data sharing between government departments' hamper the Authority's efficacy, including the work of the Integrated Data Service (IDS).

We continue to work with other government departments to remove blockers and simplify approval processes, and we hope plans to develop a National Data Library will increase the focus on resolving these challenges.

Data linking

The ONS has developed a suite of core linkage indexes covering business, the population and addresses which enables datasets to be safely de-identified and linked at scale without the need to share personal data with analysts. This linkage enables full exploration of the utility of administrative data for statistics and supports addressing data gaps going forward.

Using this approach, we are aiming to increase standardisation of data production and usage, not only within the ONS but across the public sector.

With advances in AI, there is a further opportunity to use new technology to bring efficiencies to data processing and to improve data quality. However, this opportunity cannot be fully realised unless data have foundational qualities, such as high-quality metadata, and clear governance, ethical and security frameworks are in place. The ONS is making good progress in piloting the use of generative AI to speed up data processing and improve the user experience.

The National Data Library

The ONS continues to support the Government Digital Service (GDS) and Department for Science, Innovation and Technology (DSIT) on initiatives to improve data sharing, including the development of the Data Ownership Model for Government, identification of Essential Shared Data Assets and data discovery through the Data Marketplace.

Plans for a National Data Library (NDL), whilst still in the discovery phase, give a further opportunity to drive alignment across data sharing infrastructure services.

The ONS has shared key lessons relating to data sharing with DSIT as part of the NDL discovery, and we continue to help shape the longer-term solution. There will clearly be a role in signposting to the right service, supporting users to access the right platform for their needs, and addressing data sharing barriers across the ecosystem, particularly to support the government in delivering its missions.

Office for National Statistics

May 2025

UK Statistics Authority response to the Public Administration and Constitutional Affairs Committee’s report on Transforming the UK’s Evidence Base

Dear Mr Hoare,

Now that the Committee has been reconstituted, I write to provide a response from the UK Statistics Authority to your predecessors’ report on ‘Transforming the UK’s Evidence Base’, published shortly before the General Election was called. On behalf of the Authority, I would like to express my thanks to them and their supporting staff for launching this timely inquiry and for the report and recommendations.

This report comes at an important time for the official statistical system, with the Authority due to convene the first ‘Statistical Assembly’ and prepare its next strategy for the statistical system, due to be published in July 2025. In the coming months the Authority will also be making a recommendation to government on the future of population and migration statistics in England and Wales, based on advice from the National Statistician.

In sending this response, I would also like to highlight to the Committee our response to Professor Denise Lievesley's Independent Review of the Authority, which raised many of the same issues and themes as the Committee's report. Below, I broadly address the Authority's view of the key points from each section of the Committee's report and respond to recommendation 5 in more detail. Appended to this, you will also find the individual responses of Professor Sir Ian Diamond, National Statistician, and Ed Humpherson, Director General for Regulation, on behalf of the Office for National Statistics (ONS) and the Office for Statistics Regulation (OSR) respectively, addressing their specific recommendations.

The Authority welcomes the Committee’s engagement on the future of the UK’s statistical system and the opportunities presented by your recommendations. We will continue to keep you updated on our work and progress made towards the recommendations aimed at the Authority, the ONS and OSR respectively.

Yours sincerely,
Sir Robert Chote
Chair, UK Statistics Authority

Delivering evidence for the public good

Recommendation 5: It is time to democratise access to data and evidence. The UK Statistics Authority should establish a framework for identifying and prioritising demands for evidence. We recommend that it use a high-level Assembly (of the kind recently recommended by Professor Denise Lievesley) to draw together information from communities across the UK about their needs for evidence and the benefits new evidence would bring, alongside research on data gaps, and public understanding. We further recommend that the UK Statistics Authority submit its findings on the nation’s demands for evidence to Parliament on a triennial basis, for scrutiny by this Committee.

1. The Authority welcomes the Committee’s fifth recommendation, that both a framework and a high-level assembly be established to identify and prioritise demands for evidence, with its findings submitted to PACAC for scrutiny on a triennial basis. Work is already underway to meet these objectives. In April, the ONS published its Strategic Business Plan for 2024-25, setting out our approach to prioritisation in a constrained financial environment. The Plan makes a commitment to securing the stability and quality of our core statistical outputs across five priority areas. In taking on additional work, the ONS will seek to align its resources to activities and programmes where it is uniquely placed to deliver, and that have the highest impact on the strategic priorities.
2. As set out in the Authority’s response to the Lievesley Review, a UK Statistics Assembly will meet for the first time on 22 January 2025. It will bring together users and producers across sectors to discuss and give advice on the UK’s needs for statistics. The insights will be drawn together in a published advisory report, indicating potential priorities and data gaps for scrutiny by users and by your Committee. They will inform delivery planning for the ONS and other official statistical producers.
3. As well as identifying data gaps, the Assembly’s discussions will provide valuable insights on the quality of statistics, to contribute to shaping the OSR’s regulatory work programme. Following the first Assembly in January 2025, the Authority and stakeholders will review the frequency of future events, including the timing of future reporting to Parliament. As Professor Lievesley has pointed out, there is no precise template for an Assembly so the first will inevitably be an experiment from which we can learn.
4. Additionally, across the ONS, producers of statistics regularly engage with users of statistics across existing forums and advisory groups. For example, the ONS Local team (based physically in the nine regions of England as well as Scotland, Northern Ireland and Wales) acts as the front door for local government to access the ONS and the wider Government Statistical Service (GSS), supporting users to make the most of a wide range of data and analysis.
5. Furthermore, the ONS provides accessible digital content to help audiences find, understand, explore, and act upon its content. These include data visualisations and explorer tools, as well as explanatory articles and bite-sized videos to suit different audiences – available both on its own website and via external channels, including social media platforms. To extend the reach of statistics and data to audiences with whom the ONS traditionally has had less engagement, the ONS works with relevant organisations and citizen representative groups, to help disseminate its outputs as well as inform the design of communications.
6. The ONS has also initiated a Citizen Data project, with the aim of securely playing back to citizens the data held about them. This will enable the ONS to engage directly on a one-to-one basis, encouraging individuals to validate their personal data and helping to improve characteristics coverage and public trust in the use and storage of administrative data.
7. OSR continuously seeks to embed the principle of statistics for the public good in its regulatory approach. The Code of Practice for Statistics has clear expectations that official statistics support the needs of a wide range of users, alongside policy makers. OSR is in the process of refreshing the Code and will continue to articulate and strengthen its expectations on this principle. One area of emphasis in the refresh is greater user engagement.
8. OSR also conducts research to further understand how official statistics can serve the public good. In its recent think piece, OSR proposed that “official statistics serve the public good as public assets that provide insight, which allows them to be used widely for informing understanding and shaping action”. OSR is also undertaking complementary research into how individuals may use statistics in their personal lives. This research is used to strengthen its regulatory approach, and its advice and requirements on producers of official statistics.
9. As mentioned, the Authority’s current strategy, Statistics for the Public Good, launched in 2020 and will end in July 2025. We will look to engage with the Committee to ensure the next strategy reflects views from users, including Parliament.

Navigating new data sources and technologies

10. The Authority acknowledges the Committee's high-level findings that there has been an increase in data generated across the UK, with a need to bring together 'old' and 'new' data sources to make best use of it. The report raises valid concerns about the current provision and proficiency of cross-government data sharing and how, if not addressed, these shortcomings may hamper efforts such as our ambitious future of population and migration statistics (FPMS) programme, which seeks to capitalise fully on the transformational opportunities offered by administrative data sources.
11. As we outlined in our response to Professor Lievesley's review, we concur that a drive is needed from the centre of government to strengthen both the incentives and the ability for departments to share data with one another; the FPMS programme and the successful delivery of the Integrated Data Service (IDS) require a positive culture shift towards data sharing to become a reality. Additionally, we seek to be as transparent as possible about what data we are seeking and how we are using them. We therefore see the benefit in all the recommendations in this section of the report; they are explored in more detail in the ONS response.

Evidence in policymaking

12. Several recommendations in this section are made with a view to ensuring the Analysis Function (AF) has the resource and vision it requires to enact significant change and evaluate its future successes. As part of the privacy section of the report, the Committee also suggests that the AF explore options for improving transparency where personal data is used in official analyses. These recommendations are responded to at length in the ONS’s response, given the National Statistician’s role as Head of the AF.
13. I was pleased to note the Committee’s praise for the OSR’s fantastic work on Intelligent Transparency (IT) and their suggestion that its remit should be widened in scope and government communication professionals be trained in the IT principles. Previous work that the OSR has done in this space, and thoughts on these recommendations, can be seen in their response.

Privacy and ethics in an age of data

14. The Committee’s last recommendation that the Centre for Data Ethics and Innovation should continue to monitor public attitudes on the Government’s use of data is welcomed by the Authority.
15. The Authority pays tribute to the work carried out by the Responsible Technology Adoption Unit (RTAU) in the Department for Science, Innovation and Technology (previously the CDEI). We also monitor public attitudes towards the use of data and trust in institutions more widely, including through an insights paper we published in June 2023. This release summarises our findings on public attitudes, concerns and expectations on the use of data, and views on our use of administrative data in publishing statistics.
16. In the past, the Authority has regularly commissioned the National Centre for Social Research (NatCen) to assess independently the public's knowledge of, and trust in, official statistics, and how they are produced and used, through the Public Confidence in Official Statistics (PCOS) Survey. The most recent PCOS survey results were published by NatCen on 14 May 2024.
17. Additionally, in 2022 and 2023 the ONS was commissioned by the Cabinet Office to run the Organisation for Economic Co-operation and Development Survey on Drivers of Trust in Public Institutions on behalf of the UK Government. The most recent release, for 2023 was published on 1 March 2024. We therefore support this recommendation and would be happy to provide our knowledge and expertise to RTAU to assist with future work seeking to understand public attitudes towards data usage.

Sir Robert Chote, Chair
UK Statistics Authority

The Office for National Statistics (ONS) response addresses the Committee’s recommendations both directly aimed at the ONS, and those where other government departments have joint responsibility. This response focuses on data sharing and the future of population and migration statistics programme, which are inherently linked, and data ethics. It also provides a response to the series of recommendations specifically directed at the Analysis Function.

Data Sharing and the Future of Population and Migration Statistics

Recommendation 1: It is time for Government to do what it promised to do seven years ago, and to join up the UK’s evidence base. Given that the Cabinet Office’s existing initiatives for improving data sharing are self-evidently insufficient, it should in partnership with the Office for National Statistics develop a comprehensive new programme aimed at improving data-sharing for statistical and research purposes. The programme must clearly define deliverables and timelines, and must be owned by a senior responsible officer at an appropriately high level. In line with the recommendations of the Lievesley report, we also recommend that HM Treasury establish mechanisms so that the costs are not borne by individual Departments, but rather centrally. The Cabinet Office should prepare and publish an annual progress report on delivery against the programme.

18. The ONS is strongly supportive of efforts to enhance data sharing across Government. As the largest producer of official statistics, we are dependent on effective data sharing across the public sector and beyond to support higher-quality, more timely and more granular admin-based statistics. The ONS also plays a key role in supporting Government, the devolved administrations and wider academia in accessing data to support statistical research. As such, the ONS firmly supports the Committee's recommendation that a cross-government data sharing programme be established.
19. To date, we have worked closely with the Central Digital and Data Office (CDDO) and wider government departments to promote effective data sharing, and we played a leading role in supporting key initiatives to deliver on the commitments within the 2022-2025 Roadmap for Digital and Data. These initiatives include developing the Data Maturity Assessment for Government, identifying (and publishing) Essential Shared Data Assets, and developing common governance arrangements to support the sharing of data.
20. As the lead delivery partner for the Integrated Data Service, the ONS has also delivered a trusted research environment in the cloud. We are uniquely well placed to facilitate access to a growing library of linked data sources to support collaborative analysis, including to support the development and delivery of Government’s key missions. However, this will only be possible with continued and increased support from key data owners across government.
21. The ONS recognises both the progress that has been made and the substantive challenges that remain to cross-Government data sharing. Therefore, the ONS welcomes the creation of a new Digital Centre of Excellence within the Department for Science, Innovation and Technology. We look forward to working with DSIT to define and implement a programme of work that drives a step change in data sharing to enable statistical and research use cases within Government and beyond.

Recommendation 2: Separately, the Office for National Statistics should publish information on the datasets it is seeking on an annual basis, setting out its rationale for seeking those data, and details on the status of the request – all of which should be made available on the ONS website.

22. The ONS accepts the Committee’s recommendation that we publish information on, and rationale for, the datasets we are seeking on our website annually. This fits well with our strong desire to be transparent about the data we use to support statistical outputs. For example, we already publish a report on the datasets that we have acquired that contain personal identifiers which was most recently updated in July 2024. We are working to expand the coverage of this publication to cover a broader array of alternative and administrative data sources, irrespective of whether the dataset contains personal identifiers or not.
23. We have a broader transparency ambition, which will lead to further publications that provide information about how the data we acquire are processed and the relationship between these data ‘inputs’ and our broad portfolio of statistical outputs. Whilst all the necessary information is available on an individual basis, drawing together all of the elements needed to depict this will require a programme of work across the office, including the development of an enterprise data model. We intend to start with some illustrative examples to test the best ways of presenting what will be a very large amount of information. We will then expand from that point, recognising the need to be both informative and comprehensive.
24. We also acknowledge the importance of being transparent about the data that we have not yet acquired, both in illuminating progress on key data shares and in conveying a clear sense of how far we have come in delivering a viable administrative-data-based statistical system. Therefore, we agree to add data that are in the process of being acquired to our transparency reporting. We will develop the best format for these publications in conjunction with our various suppliers, so that we can appropriately convey the status of an acquisition.
25. We recognise that data sharing is a complex process. Various stages are required to mature sharing arrangements and deliver sustainable supplies of data and it is necessary to provide a sense of how mature our sharing arrangements are. We must also ensure that we adhere to commercial sensitivities in naming some suppliers.

Recommendation 4: This Committee’s view – particularly in light of challenges around data-access – is that officials have not yet demonstrated that they can deliver the evidence users need, without a decennial census. We therefore recommend that the Office for National Statistics undertake further work on proposals for the future of migration and population statistics.

26. Both the ONS and the Authority welcome the Committee’s recognition of the opportunities offered by administrative data sources. We also recognise the need to improve the culture of data-sharing across government if we are to maximise those opportunities. This is a challenge that was also highlighted by Professor Denise Lievesley’s review earlier this year, and which the Authority continues to work with partners across the public sector to resolve.
27. Data already held within the public sector mean that population and migration statistics can be more consistently accurate and produced more frequently and quickly. As a result, decision-makers have more, higher-quality information about local populations, their characteristics, where they live and the public services they need.
28. In line with the Committee’s recommendation, the ONS continues its work to develop and improve admin-based population estimates, using innovative new methods and a wider range of data sources, accounting for quality limitations in the data. We published updated estimates as official statistics in development in July, and aim for these to become the official mid-year population estimates in 2025.
29. The Authority expects to publish its recommendation on the future of population and migration statistics in England and Wales in the coming months. This recommendation will draw on extensive engagement with users of these statistics, including through the public consultation last year, and will include the Authority’s proposed approach to the future of the census in England and Wales.

The Analysis Function

Recommendation 10: We recommend that Government reaffirm its commitment to the analysis function, and that HM Treasury review options for its future funding. If Government truly wishes to improve its use of analysis and deliver better outcomes for the public, it clearly needs to fund that change.

30. The ONS remains committed to the Analysis Function (AF), and as such accepts the principle of this recommendation and is grateful for the Committee’s concern about its future funding model.
31. We believe that funding for a dedicated central AF team is essential to ensure that analysts across the Civil Service have the support they need to deliver better outcomes for the public by providing the best analysis to inform decision making.
32. Therefore, the AF Central Team (AFCT) will work with HM Treasury to assess the best option for future funding. As Head of the AF, I will work with Chief Analysts across government to ensure the profile of the Function continues to be raised.

Recommendation 11: In parallel, the National Statistician should review the analysis function’s scope and standard, with a view to defining an achievable set of next-steps, and clear plans for honest evaluations of the function’s success. This review and subsequent evaluations should be made publicly available, so that Parliament is in future better equipped to scrutinise both the Government’s use of evidence and the progress of the analysis function.

33. The ONS accepts the Committee’s recommendation to review the AF’s scope and standard with achievable next steps in mind and will take the points raised away for further consideration. Subject to gaining sufficient funding for the AFCT, the team will review the scope of the AF, which will be reflected in the updated AF Strategy for 2025-2028. The team will also review the AF standard, focusing on any changes needed to reflect the importance of transparency in analysis.
34. With regard to the evaluation of the AF, subject to funding, the AFCT will evaluate the impact of its work to support analysts across government. It will also undertake a light-touch assessment of the impact of analysis more widely, using existing evidence sources, such as the results of the assessment against the AF Standard. This will be complemented by the work of departmental Chief Analysts, who are responsible for evaluating the impact of their analysis and whether it is meeting the needs of their decision makers.
35. The AFCT will update the Committee on the findings of this evaluation via a letter from the National Statistician to the Chair. The AFCT anticipates that this work will be completed by Q2 2025/26. However, further reviews of the standard, and evaluation of the work of the Central Team, will be undertaken as part of our business as usual.

Recommendation 14: We recommend that, at a minimum, governments in future routinely publish the evidence and data underpinning their major policy announcements. Making this happen will not be a straightforward task, and we suggest that in the first instance leaders of the analysis and communications functions develop options to deliver this ambition, for the consideration of Ministers.

36. The ONS accepts the Committee's recommendation that options be developed for Ministers on the routine publication of evidence and data underpinning major policy announcements in the future. Transparency in the evidence and data underpinning policy decisions is an important matter for the AF and was discussed at an AF Board meeting last year. A variety of guidance already exists in this area, including the Code of Practice for Statistics, the Analysis Function Standard and OSR's guidance on Intelligent Transparency.

37. In line with the Committee's suggestion, the AFCT will work with the Communications Function, and other bodies such as the Policy Profession, to consider options for the routine publication of the evidence and data underpinning major policy announcements.

Recommendation 15: We recommend that the analysis function explore options for improving transparency around the use of personal data in official analyses, and that this work be made publicly available.

38. The ONS accepts the recommendation that the AF explore options for improving transparency around the use of personal data in official analyses. Subject to sufficient funding, the AFCT will investigate these options, working with relevant bodies that deal with the use of personal data, such as the CDDO and OSR. The AFCT will complete this work by Q4 2025/26.

Data Ethics

Recommendation 16: It is now time to consolidate the excellent exploratory work that has been done on data ethics, and to embed it more formally into the collection, analysis, and communication of evidence in the UK. We recommend that the Cabinet Office’s Central Digital and Data Office and the Office for National Statistics jointly review the varying data ethics frameworks available to analysts across the UK; considering opportunities for greater consistency, and possible accountability mechanisms, to encourage a wider adoption of data ethics across government.

39. The Authority has worked with the CDDO to develop a common understanding of data ethics in the public sector. Our conversations have resulted in agreement with the recommendation that existing frameworks across the UK be reviewed with the aim of encouraging wider adoption of data ethics across government.
40. Regular working-level meetings between the data ethics teams in both departments have been organised, with discussions including each team's learnings in the data ethics space. Together, we have discussed, amongst other topics, the Authority's data ethics self-assessment tool and the recent landscape review of the responsible use of data-driven technologies in the public sector, in which the Authority participated through an interview. The Authority's Centre for Applied Data Ethics (CADE) continues to monitor the impact of the tool, along with providing practical support and thought leadership in the application of data ethics by the research and statistical community.
41. As noted in the Government response, recommendations coming out of the landscape review suggested consolidation work on data and AI ethics guidance across government. We will work with CDDO and other partners across government on this exercise.
42. We also concur with the point raised in the Government response about flexibility and context. We agree that harmonisation is desirable in some instances and have discussed shared opportunities with CDDO. But our objective is to promote and safeguard the production and publication of official statistics, and specifically to provide guidance to researchers (from within and outside government) on the ethics of their research. Therefore, we agree that much of the guidance material we produce must remain distinct from general data ethics guidance produced for central government.

Professor Sir Ian Diamond, National Statistician
Office for National Statistics
September 2024

This response from the Office for Statistics Regulation (OSR) addresses the Committee's recommendations for OSR as well as the recommendations with joint responsibility across other government departments. Our response focuses on the development of a framework for reporting data gaps across the UK, and on provisions for improving intelligent transparency across government.

Data gap reporting framework

Recommendation 6: We recommend that the OSR support this activity by preparing regular and public reports on data gaps in the UK.

43. OSR accepts, in principle, the Committee's recommendation to prepare regular and public reports on data gaps in the UK.
44. Identifying and helping statisticians focus on addressing data gaps is an important aspect of our regulatory work. Our domain (topic area) model of regulation allows us to have strong relationships with statistics producers and organisations who publicly advocate for improved statistics. Our annual report, State of the Statistical System, which was last published on 17 July 2024, is our key publication bringing together the insight provided by our regulatory work. It also shares our views on the performance of the statistical system and the challenges facing it. In line with the Committee’s recommendation, we will have an enhanced focus on data gaps in future editions of this report.
45. Our recently published report, “Data Sharing and Linkage for the Public Good: Follow-Up Report”, highlights the importance of enabling greater data sharing and linkage, in a secure way, for research and statistics. This is also relevant to the government’s ability to respond to data gaps.
46. Additionally, OSR is refreshing the Code of Practice for Statistics to ensure that it remains relevant. We are strengthening the emphasis on involving users in decisions about what statistics are required – whether to start, stop or change official statistics. This includes being clear on where and why user needs can and cannot be met, such as addressing information gaps.
47. Noting recommendation 7, in our aforementioned “Data Sharing and Linkage for the Public Good” report, we call for consistent and sustainable funding streams for data sharing, access and linkage initiatives, and specifically for a centralised government funding structure to support data collaboration projects across government. This structure should prioritise a system-level, access-based approach to investment, as well as continue and expand initiatives such as the Shared Outcomes Fund.

Recommendation 9: We recommend that the Office for Statistics Regulation review and publish a report on the adequacy of UK-wide comparable data, by themes, before April 2025.

48. OSR accepts the Committee’s recommendation that we review and publish a report on the adequacy of UK-wide comparable data. Our State of the Statistical System report highlighted the need for producers to work in partnership across the UK and provided examples of where statistics producers are making improvements to UK comparable statistics and data. We will build on this work and publish an update in 2025 which shares our more detailed views on the adequacy of UK comparability.

Improving intelligent transparency across government

Recommendation 12: We commend the OSR for its work on Intelligent Transparency and recommend that it publish an annual report card on departments’ compliance with its guidance, so that Parliament and external bodies might support it in holding departments to account, and making the case for well-informed policy. Recognising that this important work expands the remit of the OSR beyond official statistics, and into the larger world of government analysis, we also suggest that at the next Spending Review, it works with HM Treasury to agree a sustainable funding model for this work, given the vital role it plays.

49. OSR accepts this recommendation and thanks the Committee for its commendation of our work on intelligent transparency (IT). We agree that this would expand the remit of OSR and would therefore need to be appropriately funded.
50. OSR has an intelligent transparency working group tasked to find ways to promote, and embed, the IT principles across government. We are beginning work to develop proposals for a monitoring and reporting approach for IT across departments. The project will consider incentives and formats for departmental reporting, as well as automated tools for OSR to monitor IT on a rolling basis. We will consider these proposals alongside discussions with HM Treasury on a sustainable funding model for this work.

Recommendation 13: We recommend that all government communications professionals are trained on the OSR’s Intelligent Transparency guidance, and that the Government Functional Standard for Communication be updated to make it clear that officials are expected to comply with that guidance.

51. OSR accepts the recommendation that government communications professionals should receive training on IT and would be happy to work in partnership with the Cabinet Office to achieve this.

52. Since the creation of the IT principles, OSR has continued to promote them across government, working with a range of organisations and professions, including the communications profession. We are very encouraged by the response and engagement from departments to date, and the commitment to supporting public confidence in government statistics and data through transparency. Our ambition is to continue to take these principles to new audiences, including ministerial private offices and Special Advisers.
53. We are also exploring ways to provide our IT materials on our website following feedback from sessions we have delivered previously. This includes developing new guidance on adhering to IT principles on social media.
54. As part of our work refreshing the Code, we are looking to articulate our standards relating to IT more clearly and better highlight how they relate to all those in government involved in communicating statistics, data and wider analysis.
55. Relatedly, we wish to express our support for recommendation 14 of the Committee’s report, that future governments should publish the evidence and data underpinning their major policy announcements. We already encourage IT around policy announcements and government decision making and will continue to do so. Our latest report on Analytical Leadership explores our position on this in more detail. Additionally, I included this issue in my letter to Permanent Secretaries at the start of this year’s general election campaign.

Ed Humpherson, Director General for Regulation
Office for Statistics Regulation
September 2024

UK Statistics Authority supplementary evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on Transforming the UK’s Evidence Base

Dear Mr Wragg,

Following the submission of the Office for National Statistics’ (ONS) written evidence to the Committee’s Transforming the UK’s Evidence Base inquiry on 31st August 2023, I then gave oral evidence to the Committee on 5th September 2023.

One of the topics that I am aware the Committee has been interested in, during the course of this inquiry, is analytical capability across government. I am pleased to be able to provide some additional evidence on this topic, as requested.

Analytical Skills

The Analysis Function (AF) is committed to building skills and knowledge across our community of 17,000 analysts, supporting effective career planning, and ensuring that we have skilled people in the right place at the right time. We have developed a suite of materials, designed to support analysts to navigate their careers across government analysis. These include the AF Career Framework, which features multidisciplinary role profiles and career pathways, as well as career stories showcasing the variety of entry points and available career progression routes.

The Analysis Function remains focussed on providing a learning and skills offer that meets the diverse needs of our community and adds value to the work being delivered by analytical professions and departmental colleagues.

Analysis Function Standard Assessment Framework

In my previous letter dated 5 October, I noted that Departmental Directors of Analysis were asked to undertake a self-assessment against the standard in 2023, for the first time. The assessment framework is designed so that organisations can assess how well they are applying all aspects of the Analysis Function Standard, which means the assessment covers issues beyond analytical capability, such as capacity, governance and structures.

The Cabinet Office mandates that all functions conduct such exercises. The assessments were carried out in Q4 of 2023/24 and we had responses from 21 organisations across government. As the assessment is meant to drive improvements within organisations, only high-level information was returned to the Analysis Function Central team, under the agreement that these responses would not be shared more widely. The returns showed a mixed picture of strengths and weaknesses across government. The summary information returned has been used to develop further actions to support organisations in meeting the Analysis Function Standard, for example, setting up sessions to share best practice on key areas of the Standard.

Analytical Capability Audit of Policy Professionals

As part of his review of the effectiveness of government functions in 2021, Lord Maude commissioned a review of the analytical capability of policy professionals. The AF worked closely with the Head of the Policy Profession, Tamara Finkelstein, to identify areas of strength in the analytical capability of policy professionals, as well as areas for development. This work further fostered positive working relationships between analysts and policy makers in government.

The report was completed in summer 2022 and has been well received. It is a key evidence base for the analytical skills development agenda across government, for both policy professionals and more widely across the whole Civil Service. This has led to a more robust analytical capability learning offer for all, ultimately ensuring that officials are more comfortable working with and analysing data when developing and delivering public services. The legacy of this work has been highlighted in core reform activities, such as the One Big Thing initiative in 2023, pushing the analytical capability agenda in government.

Cross Government Evaluation Capacity and Capability Survey

An Evaluation Capacity and Capability survey (ECCS) was conducted in Summer 2023. This was in response to a recommendation in the 2021 National Audit Office report Evaluating Government Spending to enhance the evaluation capacity and capability within government. The survey, conducted by the Analysis Function Central Team, aimed to assess government’s specialist evaluation capacity and capability and develop a plan to address any identified shortfalls.

The survey focused on key research questions regarding evaluation skills and experience, confidence in applying evaluation-related skills, understanding of evaluation concepts among non-evaluation practitioners, engagement between analysts and non-analysts, and areas for improvement. The results are currently being analysed and associated recommendations developed, in collaboration with the Evaluation Task Force. High-level results from the survey will be released through a blog.

We remain aware of, and draw on, the work of others who have influence in this space. This includes the Office for Statistics Regulation, from whom the Committee heard evidence on 6 February.

I hope that you find this additional information useful. Please do let us know if we can assist the Committee further on the topic of the Analysis Function, or with any of its other inquiries.

Yours sincerely,

Professor Sir Ian Diamond

Office for Statistics Regulation supplementary evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on Transforming the UK’s Evidence Base

Dear Mr Wragg,

Thank you very much for the opportunity to give evidence to your Committee as part of the Transforming the UK Evidence Base inquiry on 6 February. I enjoyed the session and I hope that you found my evidence useful.

I am writing to provide some supplementary evidence related to comparability of statistics across the UK.

During the session, I set out the expectations we have as the Office for Statistics Regulation for statistics producers on questions of comparability. I emphasised that where there are questions from users around how to compare the performance of public services across the UK, producers in the four nations should recognise and seek to meet that need.

Meeting that need is not straightforward. As I explained, the configuration of public services will probably be different, because of different policy and delivery choices that have been made by the different governments. This is consistent with the concept of devolution, but it does mean that administrative data may be collected and reported on different bases.

However, it is not in our view sufficient for producers to simply argue that statistics are not comparable. They should recognise the user demand, and explain how their statistics do, and do not, compare with statistics in other parts of the UK. And they should also undertake analysis to try to identify measures that do allow for comparison.

A very good example of this approach is provided by statisticians in the Welsh Government. Their Chief Statistician published two blogs on the comparability of health statistics, Comparing NHS performance statistics across the UK and Comparing NHS waiting list statistics across the UK. These blogs recognise the user demand and provide several insights to enable users to make comparisons of NHS performance.

In addition, the Welsh Government’s monthly NHS performance release also highlights what can, and cannot, be compared. For example, it shows that in November 2023, there were approximately 22 patient pathways open for every 100 people, while for England, the figure in November was 13 pathways for every 100 people. More generally, I would commend the Chief Statistician’s blogs as a good example of providing guidance and insight to users across a wide range of statistical issues.

During my evidence session I also mentioned the approach taken by NHS England to highlight the most comparable accident and emergency statistics. NHS England provide a Home Nations Comparison file for hospital accident and emergency activity each year.

More generally, the ONS is leading comparability work across a range of measures. In addition to work on health comparability, it has produced very good analysis of differences in fuel poverty measurement across the four nations.

I hope this additional evidence is useful. I would like to reiterate that these examples show statisticians recognising the core point – that there is a user demand for comparability and that they are taking steps to meet that demand.

Yours sincerely,

Ed Humpherson

Director General for Regulation

UK Statistics Authority supplementary evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on Transforming the UK’s Evidence Base

Dear Mr Wragg,

Following the submission of the Office for National Statistics’ (ONS) written evidence to the Committee’s Transforming the UK’s Evidence Base inquiry on 31st August 2023, I then gave evidence to the Committee on 5th September 2023. I am now able to provide some supplementary evidence, as requested, on several topics of interest.

The Integrated Data Service (IDS)

As you will be aware, the IDS is a cross-government project, for which the ONS is the lead delivery partner. The project is a key enabler of the National Data Strategy and seeks to securely enable coordinated access to a range of high-quality data assets built, linked and maintained for richer analysis. Please find below some further detail on the background of this project and the progress towards its delivery.

What is the scope of the IDS?

The scope of the IDS is to deliver a secure, scalable and modern data service, operating on a cloud-native platform and hosting a rich and diverse catalogue of indexed and linkable data, with the latest data science and generative AI capabilities. The service has been designed to better inform effective policy making.

The vision of the IDS is to address the lack of a central integration platform that can cater for the future needs of both data providers and analysts looking to use integrated data to develop cross-cutting analytical results. The IDS builds on the success of the Secure Research Service (SRS) and significantly reduces the time it takes to negotiate access to data and to provision data assets.

The IDS provides a secure environment that enables streamlined data sharing across government, improving the ways that data are made available via cloud-native technologies and modernising the way departments and their professionals operate. The IDS is the first of its kind in the UK and will set the precedent for how data are processed on a cloud-native platform.

When is it expected to be delivered?

The programme has been in development by the ONS over the last 18 months and is funded until March 2025 (under the current Spending Review). After this date, the IDS becomes a live running service.

What is the cost of the programme?

The programme secured funding from HM Treasury (HMT) until the end of the investment period (financial year 2024/25). The cost of the programme is estimated to be £228.7m, which covers development and running costs from 2020 to 2025. The programme continues to assess funding options beyond March 2025.

Who are the users likely to be?

The IDS is designed for use by accredited analysts, within government and the wider research community. The ambition for the IDS is to have every government analyst, roughly estimated at 14,000 individuals, capable of utilising the platform to better inform decisions for the public good.

What data do you expect to be available on the service?

There are currently 81 datasets available in the IDS from across government. This includes high-value data assets relating to levelling up, climate change and net zero. Additionally, work on health data assets is underway, with identified datasets being indexed by the Reference Data Management Framework (RDMF) – which enables multiple datasets to be linked and analysed, creating new comprehensive data assets – and published on the IDS so that analysts can link data according to their requirements.
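
For illustration only, the kind of linkage that a shared reference index enables can be sketched in a few lines of Python. The datasets, field names and reference keys below are invented for the example; they do not describe the RDMF’s actual implementation:

```python
# Toy join on a shared reference index, in the spirit of linking datasets
# via a common spine. All keys and fields here are invented.
def link_on_index(left: dict, right: dict) -> dict:
    """Combine records from two datasets that share a reference index key."""
    linked = {}
    for key in left.keys() & right.keys():  # only records present in both
        linked[key] = {**left[key], **right[key]}
    return linked

payroll = {"REF001": {"earnings": 2100}, "REF002": {"earnings": 1850}}
health  = {"REF001": {"gp_visits": 3},   "REF003": {"gp_visits": 1}}

print(link_on_index(payroll, health))
# {'REF001': {'earnings': 2100, 'gp_visits': 3}}
```

The point of the sketch is that once records carry a consistent index, new combined assets can be derived without the source datasets having been designed together.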

The programme intends to continue to work with data owners across government and the private sector to acquire more datasets in conjunction with the RDMF.  However, this is dependent on data owners signing up to data sharing agreements to make this data available.

In accordance with the Central Digital and Data Office’s roadmap for 2022-25, departments have agreed to share their essential shared data assets across government, including through the IDS. As a Trusted Research Environment, the IDS is well placed to facilitate and support this commitment.

However, discussions with government analysts have highlighted a range of concerns about how current incentives for departmental data sharing fit with the needs of ministerial-facing departments. There is also a wider financial risk regarding other departments’ ability to fund activity such as data cleansing, which may limit their ability to share data effectively. Although HMT set out the expectation that OGDs will support data sharing in all SR21 settlements, no specific funding was provided, which may limit activity in some cases. As part of the IDS Programme, the ONS is working with Chief Data Officers across government to minimise frictions around the sharing of data via the IDS. One of the pilots in development is looking at Data Ownership and Stewardship approaches to streamline the governance arrangements and make it quicker for departments to agree to share data via the IDS, and for analysts to subsequently access that data for a broad range of analysis for the public good. As always, I would welcome support from the Committee to share and promote the benefits of data sharing across government for the public good.

What safeguards will be in place to protect data?

The IDS is a trusted research environment, which means that it adheres to the Five Safes in accordance with the Digital Economy Act (DEA). The Five Safes of secure data are as follows:

  • Safe projects – Is this use of the data appropriate, lawful, and ethical?
  • Safe people – Can the users be trusted to use it in an appropriate manner?
  • Safe settings – Does the access facility limit unauthorised use?
  • Safe data – Is there a disclosure risk in the data?
  • Safe outputs – Are the statistical results non-disclosive?

These principles enable the safeguards and governance for the IDS to operate with sensitive data, which in turn ensures public confidence in the security and processing of data. Access to the IDS platform is granted via a secure gateway in line with data legislation; furthermore, the IDS applies strict policies around the cleaning, linkage, validation and control of data.

The IDS Programme is also working across the ONS to develop key governance policies that will enable safeguards and the appropriate use of data. The policy workstream is coordinated by the ONS’ Data Governance, Legislation and Policy and Security and Information Management teams. In developing safeguards, the programme employs the following principles:

  • Adapting successful policies within the ONS and across government analytical communities (e.g., GSS, GSR, GES) that can support the programme.
  • Working with the National Statistician’s Data Ethics Advisory Committee, which is underpinned by the UK Statistics Authority’s (UKSA) ethics framework for the use of data for statistical, research and analytical purposes, to identify and mitigate any potential ethical risks at project-level.
  • Access to all data is controlled through the concept of an analytical ‘project’, with supporting business and technical processes linked to user need.
  • An overarching programme Data Protection Impact Assessment (DPIA) is maintained to define key activities and associated data risks, with continued engagement with the Information Commissioner’s Office as the DPIA is updated and the programme develops.

The programme also adheres to the UK Statistics Authority/ONS Data Protection Policy (required by the Data Protection Act 2018 and the General Data Protection Regulation).

The ONS website

The Committee also asked for some insight into the current condition of the ONS website and any plans to change the site in the future. Below I have outlined our vision for dissemination, of which our website is an integral part, as well as some exploratory work we are undertaking to see how we could use AI technology to address some of the challenges with our existing website.

Our Vision for Dissemination

The ONS website supports the Statistics for the Public Good strategy by helping to build trust in evidence, enhance understanding of social, economic and environmental matters and improve the clarity and coherence of our communication. By helping people to be aware of the ONS and to find, understand and explore our data, statistics and analysis we are giving people the information they need to make decisions, and act, at a national, local and individual level.

Our vision for statistics dissemination goes beyond the website. We want people to have trust in our data and analysis. We know that our users want to find trusted ONS information wherever they look – whether that’s on the ONS website, on social media, in the media or through search engines. Our users want ONS answers to their questions and we are exploring a range of different approaches to serve this need, including providing answers to questions using Large Language Models (LLMs).

Our goal is for users to understand our data, statistics and analysis more quickly and easily, with the right contextual information to help people know how they can use them. We want our users to explore and tailor our information so they can find what is important to them – whether that is by creating their own datasets based on ONS data or through our expert curated view of key insights for the economy or society.

Our priorities for the website in recent years have been delivering the capability to support Census 2021 outputs and ensuring the reliability of the service for all our users, particularly given the additional demand for ONS data on the economy driven by changes in the cost of living. We are currently running a package of work to improve website performance to meet demand, and our next priority will be programmatic access to our data via application programming interfaces (APIs). This will improve the agility with which all users of our data, both internal and external, can consume and gain insights from the ONS website.

We have also focused on improved search both on the ONS website and through greater visibility of our data and insight in search engines and in the media.

This year we are also setting the future direction for how we create and manage our statistical content in a more efficient and structured way to enable business agility and flexibility for our users, aligned to their broad range of needs. This will set out a forward plan to transform ONS data and insight and will make the case for the additional funding needed to deliver on our ambitions.

StatsChat

Additionally, the ONS Data Science Campus is currently exploring how new tools and technology can help the organisation disseminate information more effectively. We have developed a new product, ‘StatsChat’, that uses LLMs to search and summarise text from across our website, presenting relevant sections of our web pages in response to users’ natural language questions.

We are aiming to make this available to a small selection of users for testing and fine-tuning, so that we can improve the relevance of the responses and provide assurance from a data ethics, data protection and security perspective.
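
For illustration, the retrieval step of a tool of this kind can be sketched as ranking page snippets against a question. This toy example uses simple bag-of-words cosine similarity and invented snippets; it does not reproduce StatsChat’s actual pipeline, which combines retrieval with LLM summarisation:

```python
import math
from collections import Counter

# Toy retrieval: rank snippets by bag-of-words cosine similarity to a question.
def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank(question: str, snippets: list[str]) -> list[str]:
    q = Counter(question.lower().split())
    return sorted(snippets, key=lambda s: cosine(q, Counter(s.lower().split())), reverse=True)

snippets = [
    "Inflation rose in the year to June driven by food prices",
    "Population estimates for local authorities in England and Wales",
]
print(rank("what happened to inflation", snippets)[0])
```

A production system would use learned embeddings rather than word counts, but the shape of the task – retrieve the most relevant passages, then present or summarise them – is the same.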

Stakeholder engagement

The ONS conducts a wide range of user and stakeholder insight, consultation and listening exercises. This engagement is essential as it provides us with actionable insights on users’ and stakeholders’ views on the strength of their relationship with the ONS, feedback on its outputs, and on how stakeholders access and use our statistics and analysis.

As part of this, the ONS’s Engagement Hub conducts annual stakeholder ‘deep dive’ research and an annual stakeholder satisfaction survey. I understand the Committee is interested in understanding more about these exercises and insights from recent examples.

The deep dive research is conducted through in-depth interviews with senior representatives from around 45 key stakeholder organisations. The stakeholder satisfaction survey is an online questionnaire aimed at a wider range of users from a variety of sectors and roles to provide broader insight. Deep dive participants include those from central and local government departments, devolved administrations, research institutes, think tanks, public bodies such as NHS England and the ICO, international partners, business representative bodies and charities. The stakeholder satisfaction survey reaches similar types of organisations, with a wider range of responses at senior manager, operational, public affairs, analyst, researcher, policy maker and economist levels.

Deep dive interviews took place in summer 2022 and the findings were positive. Many stakeholders said that the organisation had built on and maintained its reputation for independence, trustworthiness, quality and reliability. They also felt that the ONS had developed its reputation for being flexible, agile and responsive to changing needs. Additionally, the ONS was seen to be working more collaboratively with policymakers than it had in the past.

The stakeholder satisfaction survey was conducted in early 2023. It found respondents to be positive across key sentiment measures on trust, quality, and on the ONS producing statistics which are relevant to issues of the day. There were also positive views expressed about the ONS as an organisation with reliability, responsiveness, and willingness to help being cited. It was also noted that ONS staff were knowledgeable and helpful.

There were areas highlighted for improvement in both the stakeholder deep dive and satisfaction survey. These included how the ONS works with both devolved governments and heads of the statistical profession in government departments; improving the ease of finding the right people to speak to in the organisation; and more regular, strategic overviews of the ONS’s work (for stakeholders to be able to connect different topics better). Some participants referenced a need for further scrutiny to understand some data anomalies which had occurred in mid-2022.

These findings are shared throughout the ONS, including with the National Statistician’s Executive Group, and are used to inform planning and prioritisation. We have implemented measures to respond to the issues raised as part of a wider programme of ongoing external affairs improvements, which we continue to monitor with further research.

The ONS conducted a subsequent stakeholder deep dive in autumn 2023 and is currently analysing the findings. The latest ONS annual stakeholder satisfaction survey is currently live and will be open for responses until 22 January 2024.

Full business case on population and migration statistics improvements

As you are already aware, next year I will be making a recommendation to Government on the future of the population and migration statistics system in England and Wales. I understand that the Committee has requested some additional detail surrounding the financial aspects of this transformational work.

In the outline business case for the Future of Population and Migration Statistics programme, initial cost estimates of a potential census in 2031 range from £1.3 billion to £2 billion, with increases expected across all phases of such an operation.

The ONS is working to produce a full business case (FBC) for our proposals to improve our population and migration statistics. The FBC will be developed in the context of the forthcoming recommendation to UK Government, and the response from Government. At this stage, while the recommendation remains in development, it is difficult to provide an accurate updated estimate of cost.

The FBC is expected by HM Treasury in late 2024. We will be able to provide the Committee with further information on costs at a later date.

Migration statistics

As part of improving population statistics, we are also transforming international migration statistics. Our latest estimates, for the year to June 2023, are official statistics in development and are provisional. We revised our June 2022 and December 2022 estimates upwards due to a combination of more data and methodological improvements.

International migration estimates are produced using three key sources: Home Office border data linked to a person’s travel visa for non-EU nationals, which made up 82% of total immigration in 2023; tax and benefit data (known as RAPID) for EU nationals; and International Passenger Survey data for British nationals. We are most confident with Home Office border data and have an ambition to produce all migration statistics from these data in future.

We work very closely with the Home Office to procure and use border data linked with visa data to produce migration estimates. Free movement for British nationals, and for those EU and non-EU nationals who do not require visas, makes the current method a challenge for those groups. However, there is further data held by the Home Office, known as Advance Passenger Information, that would help with our research, particularly for British nationals. We have requested these data and would like to see the Home Office accelerate this request.

Census 2021 data confirmed our position that the administrative data we use for non-British nationals is robust and that the International Passenger Survey data does not measure actual migration patterns well due to people changing their intentions. Rather than rebasing once a decade, following a decennial census, to correct for any drift in our population estimates, we aim to produce statistics that do not ‘drift’ from the truth. Our Dynamic Population Model-based population statistics show how drift in both population and migration statistics can be mitigated. That does not remove the need to revise estimates as the data and methods mature.

Long-term international migration uses the UN definition of a migrant, that is, someone who changes their country of residence for 12 months or more. To produce timely estimates, we therefore have to make assumptions based on previous behaviour. As more time passes, we are able to update those assumptions with data on actual travel, and we therefore become more confident in our estimates over time. For example, our June 2022 estimates now have complete data showing whether a migrant has stayed or left for 12 months, so there is less uncertainty around those estimates compared with the provisional June 2023 estimates.
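
The interaction between confirmed and assumption-based counts can be illustrated with a toy calculation. The cohort figures and the 0.6 continuation rate below are invented for the example and are not actual migration parameters:

```python
# Illustrative only: arrivals whose 12-month observation window is complete are
# classified directly; more recent arrivals are counted using an assumed
# continuation rate derived from earlier cohorts, which is the source of the
# extra uncertainty in provisional estimates.
def estimate_long_term(records, assumed_rate=0.6):
    """records: (months_stayed_so_far, observation_window_complete) pairs."""
    confirmed = sum(1 for months, complete in records if complete and months >= 12)
    provisional = sum(assumed_rate for months, complete in records if not complete)
    return confirmed + provisional

cohort = [(14, True), (6, True), (13, True), (4, False), (2, False)]
print(estimate_long_term(cohort))  # 2 confirmed + 2 * 0.6 assumed = 3.2
```

As the window completes for the remaining arrivals, the assumed component shrinks to zero and the estimate converges on the observed outcome, which is why earlier periods carry less uncertainty.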

We have recently published experimental uncertainty measures for our administrative data based migration estimates for the first time. These show our users how our confidence increases once we have complete data that meet the required definition.

We also described the nature of provisional estimates that are subsequently revised, and the reasons behind these revisions. This was picked up and presented accurately in the media and was reflected in our conversations with core users. The Office for Statistics Regulation (OSR) recently published a review of their recommendations on migration statistics. The OSR considered that we sufficiently described uncertainty to our users, although we recognise that these measures are experimental and will continue to update our users as they develop.

I hope that you find this additional information useful. Please do let us know if we can assist the Committee further on any of the issues discussed in this letter, or with any of its other inquiries.

Yours sincerely,

Professor Sir Ian Diamond

Office for National Statistics correspondence to the Public Administration and Constitutional Affairs Committee on labour market statistics

Dear Mr Wragg,

I am writing to update the Committee on the Office for National Statistics’ work on the Labour Market.

As you will be aware, due to quality concerns, the ONS suspended publication of the Labour Force Survey (LFS) estimates element of the wider Labour Market release in October. Instead, to provide users with our best assessment of the labour market we produced indicative experimental estimates of the headline employment, unemployment and inactivity rates. These were produced using the most robust administrative data sources available to us. For employment, we used payroll data from HMRC’s Real Time Information system, applying the growth rates of that data to the LFS for April to June 2023. Likewise, we used Claimant Count data for unemployment.

Today we have published a development plan for the LFS. This will focus on work to increase the number and diversity of responses to the LFS, and on improved methods to better account for non-response and bias. We will also update the population figures used in the Labour Market estimates, which is another important improvement. With this work in train, we are aiming to reintroduce LFS estimates in the Labour Market release on 12 December.

In parallel, we will continue our work to transform this key survey. Alongside the LFS, we currently also have the transformed LFS in the field. This has a sample size three times that of the current LFS and an online-first mode of collection, supported by telephone and face-to-face interviewing, to help ensure a higher and more representative response. We are doing some final fine-tuning to the questionnaire and expect to transition fully to this new survey in March 2024.

I do hope that you find this update helpful but please do let me know if you have any other questions about this topic, or if we can be of assistance to the Committee on any other matter.

I am also copying this letter to Harriett Baldwin MP, Chair of the Treasury Committee and The Lord Bridges of Headley MBE, Chair of the Economic Affairs Committee as their specialists were recently briefed on this matter by members of my team.

Yours sincerely,

Professor Sir Ian Diamond

UK Statistics Authority follow-up written evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on Transforming the UK’s Evidence Base

Dear Mr Wragg,

When giving evidence to the Public Administration and Constitutional Affairs Committee on 5 September 2023, I promised to follow up on a couple of points with various members of the Committee.

GDP Revisions

Firstly, I agreed to let you know if I was aware of similar revisions happening in other comparable countries.

As I outlined in the Financial Times recently, the UK’s official economic statistics are rightly seen as among the world’s best. This includes the recent upgrade of our official estimates for economic growth in the pandemic years of 2020 and 2021. The latest Organisation for Economic Co-operation and Development (OECD) information shows that the UK is one of the first countries in the world to estimate the 2020 and 2021 coronavirus (COVID-19) pandemic period through the detailed Supply and Use framework. This standard economic framework enables us to confront our data at a much more granular level for products and industries. The OECD provides a database of real-time GDP vintages in its Main Economic Indicators, which takes data directly from national statistical institutes.

Each country follows its own revision policies and practices, according to its own needs, which can result in estimates being revised at a later date. The timing and impact of revisions will depend on data availability and magnitude, with large annual structural surveys being the data source needed to make detailed product and industry changes. These annual data sources come with lags on timeliness, often only becoming available two or three years later.

We have now seen revisions to GDP estimates published by other countries. As we previously announced, the 2021 GDP estimates for the UK were revised to 8.7 percent growth from our initial estimate of 7.6 percent growth, a revision of +1.1 percentage points. The Spanish Statistical Agency has now published 6.4 percent growth in GDP for 2021, compared with the previous estimate of 5.5 percent, a revision of +0.9 percentage points. The Netherlands has now published 6.2 percent growth for 2021, revised from an initial estimate of 4.9 percent, a revision of +1.3 percentage points. Italy has now published 8.3 percent growth for 2021, revised from an initial estimate of 7.0 percent, a revision of +1.3 percentage points. All are upward revisions of a similar magnitude to that observed in the UK. Conversely, the United States has now published 5.8 percent growth for 2021, compared with a previous estimate of 5.9 percent growth, a revision of -0.1 percentage points [ONS own calculations based on published US data from www.bea.gov]. This highlights that revisions can differ across countries.

Strengthening the Analysis Standard

Secondly, I promised to examine whether there is a case for strengthening the Analysis Standard. I am passionate about ensuring the robustness of the Analysis Standard and welcome the Committee taking an interest in its strength and its application across government.

The Analysis Function Standard, which was updated earlier this year, is part of a suite of management standards that promote consistent and coherent ways of working across government, and provides a stable basis for assurance, risk management and capability improvement.

In my letter to you of 18 September regarding the Committee’s report ‘Where Civil Servants work: Planning for the future of the Government’s estates’, I emphasised my work to promote transparency in government analysis through my role as Head of the Analysis Function. I am keen to take every opportunity to champion the Standard across government and will reiterate the importance of this area at October’s Heads of Function board meeting.

The Standard is very clear on expectations about transparency in the commissioning, production and publishing of analysis. It also has clear messaging about compliance with the Code of Practice for Statistics and other official guidance for the other analytical professions, including the Aqua, Green and Magenta Books.

It is my expectation that all departments closely follow the principles in these sets of guidance, and through the Analysis Function Standards Steering Group we monitor and scrutinise these documents to ensure their continued effectiveness.

For the first time this year, all Departmental Directors of Analysis undertook a self-assessment against the Standard, and in response we are starting a series of action groups to drive improvements, including in departments’ compliance with official guidance.

I will keep the Analysis Function Standard under close review and, where necessary, strengthen the messages in it.

Please do let us know if you have any other questions, and if we can help the Committee further on either of these topics or any of its other inquiries.

Yours sincerely,

Professor Sir Ian Diamond