UK Statistics Authority oral evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the work of the UK Statistics Authority

On Tuesday 1 July, Sir Robert Chote, Chair of the UK Statistics Authority, Emma Rourke, Acting National Statistician and Ed Humpherson, Director General for Regulation, gave evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the work of the UK Statistics Authority.

A transcript of the session has been published on the UK Parliament website.

UK Statistics Authority written evidence to the Public Administration and Constitutional Affairs Committee’s inquiry into the work of the UK Statistics Authority

Dear Simon

I am writing in response to the call for evidence for the Committee’s inquiry into the work of the UK Statistics Authority. We welcome our regular appearances before the Committee, not just as a channel of formal accountability to Parliament, but also as an important source of support, challenge and advice in ensuring that the official statistical system serves the public good as effectively as possible.

Last week, the Authority announced the resignation of Sir Ian Diamond as the UK’s National Statistician due to ongoing health issues. I am grateful to Sir Ian for his tireless energy and the passionate dedication he brought both to the role of National Statistician and to championing the vital role of statistics across society more broadly. Sir Ian oversaw many successes over his tenure during a remarkable period of economic and societal change, particularly during the pandemic. Emma Rourke, Deputy National Statistician for Health, Population and Methods, will be Acting National Statistician pending longer term arrangements being put in place. We will keep your Committee updated on these arrangements.

As you will be aware, it has been a challenging period for the official statistical system and for the Office for National Statistics (ONS) in particular. Most obviously, the long-term trend of declining response rates for household surveys accelerated following the Covid pandemic, making it more difficult and expensive to maintain the quality of key economic data on which policy and other decision makers rely – most notably those relating to the labour market. This has happened at a time when financial resources remain constrained and the barriers within government to the sharing, linking and exploitation of administrative data remain frustratingly high.

Colleagues across the official system have worked tirelessly to address these challenges and to exploit available opportunities, and the ONS and other statistical producers have continued to generate many high-quality outputs. But we need to ensure that the system is focused on addressing the challenges and difficulties, as well as being self-critical and open to learning and advice from outside when things can be done better. To that end, in addition to benefiting from the insights of the Lievesley Review in 2024, the UK Statistics Authority Board has supported the ONS in commissioning an unsparing internal ‘lessons learned’ exercise around the process of reforming its labour market statistics, in drawing on technical input from independent outside experts, and in engaging with and responding fully to the recommendations and requirements of the Office for Statistics Regulation (OSR).

Most recently, in April 2025, the Board and the Cabinet Office jointly commissioned Sir Robert Devereux to undertake a short but wide-ranging independent review of the performance and culture of the ONS, drawing on the experiences and insights of staff across the organisation as well as external stakeholders. As I write this letter, the review is still under way. But I am confident that it will provide important insights and recommendations to help ensure that the ONS can operate to its full potential, and we will be able to brief you on these at a later stage of your inquiry.

You set out four sets of questions in your Terms of Reference:

  1. How well served are policy-makers, researchers, businesses and citizens, by the data that ONS produces and the services it provides?
  2. How is the UK’s data environment evolving, and what challenges and opportunities does this present official statisticians and analysts? What does the development of a National Data Library mean for the ONS?
  3. How successful has the OSR been in identifying issues with official data, and making the case for improvements?
  4. How does the UKSA Board carry out its statutory functions, and how involved is it in the decisions taken by senior leaders at ONS and OSR?

To address these questions, I attach three submissions – one each from the Authority, the OSR and the ONS. As you will appreciate, the ONS is in a period of transition following the resignation of Sir Ian Diamond as National Statistician on 9 May 2025 and the ONS submission was being written as he departed.

We look forward to discussing the questions you have raised and any other issues with you and the Committee on 1 July.

Yours sincerely,

Sir Robert Chote

Chair, UK Statistics Authority

How does the UK Statistics Authority Board carry out its statutory functions, and how involved is it in the decisions taken by senior leaders at Office for National Statistics (ONS) and the Office for Statistics Regulation (OSR)?

Introduction

The UK Statistics Authority was established under the Statistics and Registration Service Act 2007 (‘the Act’) and formally assumed its powers under the Act on 1 April 2008. The Act gave the Authority the statutory objective of ‘promoting and safeguarding the production and publication of official statistics that serve the public good’. The public good includes:

  • informing the public about social and economic matters;
  • assisting in the development and evaluation of public policy; and
  • regulating quality and publicly challenging the misuse of statistics.

In practice the Authority fulfils these objectives directly through the Office for National Statistics (ONS; its executive arm and the largest single producer of official statistics in the UK) and the Office for Statistics Regulation (OSR; its assessment arm), and indirectly through its oversight of the Government Statistical Service (GSS; the statisticians working in UK and devolved government departments and public bodies, which produce most UK official statistics).

The governance of the Authority was examined by Professor Denise Lievesley in her independent Review of the UK Statistics Authority (‘Lievesley Review’) published in 2024. The Authority welcomed Professor Lievesley’s recognition that the Authority’s governance was working well and that the two executive arms (the ONS and OSR) are sufficiently operationally independent in practice. To increase public understanding of the de facto distinction between the arms of the Authority, OSR published a statement on the operational separation between the ONS and OSR in October 2024.

Membership of the Authority Board comprises the Chair, at least five non-executive members, and three executive members. Other members of the ONS and OSR executive staff and representatives of the GSS attend as required.

As detailed in the Board’s standing orders, the Board is required ‘usually to meet at least eight times a year’, but in both 2023 and 2024 it met ten times. It also holds ad hoc meetings to cover topics of interest in greater depth or when more timely input is needed. The Chair has regular separate bilateral meetings with the National Statistician and the head of OSR, and non-executive members meet with ONS and OSR staff as needed on topics of shared interest or expertise.

The Board delegates some of its functions to committees. Among them, the Regulation Committee oversees the work of OSR, inputting to and signing off its major reports and decisions. As OSR regulates the ONS as well as other statistical producers, to avoid conflicts of interest as far as possible, the Regulation Committee comprises non-executive members of the Board and OSR executive members, but no executive staff from the ONS.

The Authority’s engagement with the devolved administrations is guided by the Concordat on Statistics, an agreed framework for co-operation in relation to the production of statistics, for and within the UK, statistical standards and the statistics profession. High-level governance and oversight of cross-UK statistical work is provided by the Authority’s Inter-Administration Committee (IAC), chaired by the National Statistician with membership including the Chief Statisticians of the devolved administrations.

Statutory functions

The functions of the Authority under the 2007 Act include:

  • To monitor the production and publication of official statistics.
  • To develop and maintain definitions, methodologies, classifications and standards for official statistics.
  • To prepare, adopt and publish a Code of Practice for Statistics.
  • When requested by the producer, to assess and determine whether the Code has been complied with in relation to any official statistics, and if so to designate them as National Statistics, nowadays commonly referred to as ‘accredited official statistics’.
  • To determine whether the Code continues to be complied with by ‘accredited official statistics’, and if not to cancel the designation.
  • To produce and publish statistics.
  • To compile and maintain the Retail Prices Index.
  • To provide statistical services and to promote and assist statistical research.
  • To fulfil the former functions of the Registrar General for England and Wales as regards undertaking a census.

In common with other public and private sector organisations, the Board is required by the Act to produce a report after the end of each financial year: the Authority’s Annual Report and Accounts. This meets statutory obligations, providing transparency and accountability for the use of public resources. The most recent Report (for 2023/24) can be found on the Authority website.

The Board must exercise its functions efficiently and cost-effectively and seek to minimise the burdens it places on other persons.

The Board and decision making

As noted above, the Board comprises a combination of executive and non-executive members. The non-executive members are appointed by ministers on the recommendation of an appointment panel that typically includes the Authority Chair, an independent member, and a representative from the Cabinet Office as sponsor department. The aim is to ensure a range of skills and expertise, currently from academia, public service and the private sector. Individual members’ expertise currently encompasses economics, statistics, data collection, technology, risk and governance and communication.

In the last couple of years, it has been something of a struggle to maintain a full complement of non-executive members, despite excellent candidates being available. The previous Government refused to renew members beyond their initial three-year term (contrary to good governance practice, which would have allowed at least one additional term), and the appointment process for our three most recent arrivals took a significant period to complete because it was delayed by the general election. Despite periods without a full quota of non-executive members, Board meetings were quorate on virtually every occasion, and the members in post, who held relevant expertise, continued to be actively engaged with the work of the Authority.

Board meetings typically comprise regular reports from the Chair, National Statistician, head of OSR, head of communications and sub-committee chairs, plus papers on substantive items of current importance. Some of these are on a regular cycle (for example discussion of business plans and the annual report and accounts). Some are as needed. In the last couple of years, regular items have included labour market statistics, the Integrated Data Service and the future of population statistics (including the census). On occasion, regular issues of this sort are dealt with through discussion of the National Statistician’s report to avoid staff working in high pressure areas spending too much time preparing board papers rather than on their core activities.

Agenda papers may ask the Board to note particular developments, to offer advice or to make formal decisions – for example, to endorse a recommendation from the National Statistician on the future of population statistics or the need for a census. As detailed in Section 30 of the Act, the National Statistician is the Board’s principal adviser on the quality of official statistics, good practice in relation to official statistics and the comprehensiveness of official statistics. The Board must have regard to their advice on those matters. Decisions are normally made following discussion that leads to a shared view of the way forward. However, the Chair can request a formal vote, with a simple majority of those present deciding the matter. The Board has not yet voted formally.

The Board sets the broad direction of the Authority through agreement on a five-year strategy, currently Statistics for the Public Good 2020-2025 and due to be refreshed this year. The ONS and OSR business plans underpin delivery of the strategy, and the Board is engaged in the development of these, providing support, oversight, scrutiny and challenge ahead of approval.

The Board and its committees periodically review their own effectiveness and the effectiveness of their members. In line with good practice, we will be commissioning an externally led review of the Board in the coming year.

Subcommittees of the Board

Regulation Committee

The role of the Regulation Committee (formerly the ‘Assessment Committee’) is to help shape the regulation strategy of the Authority and to oversee the programme of assessment of sets of official statistics against the Code of Practice for Statistics, plus other work related to assessment and regulation, thereby contributing to achievement of the Authority’s strategic objectives.

In practice, this means overseeing the work of OSR in setting, promoting and judging compliance with the Code, and in intervening (via the Authority Chair, the head of OSR or other OSR staff) when ministers, senior public figures or statistical producers fall short of the Code or the associated principles of ‘intelligent transparency’. Committee members consider the final conclusions of assessment reports ahead of publication and also support the OSR in its wider activity aimed at supporting good practice.

The OSR’s recent work is described in detail in OSR’s submission. Recent areas of regulatory focus have included labour market statistics, specific economic statistics, the broader landscape of economic statistics, population statistics and the approach to gender identity in the last census. On a number of occasions, the Regulation Committee has de-accredited official statistics that no longer comply with the Code (sometimes at the producer’s request), accompanied by agreement on actions that, if fulfilled, would allow the statistics to be re-accredited as Code-compliant.

The Regulation Committee meets at least quarterly, with additional meetings convened as necessary. It comprises the Authority Chair, three non-executive Members of the Board and the head of OSR, with other OSR staff members attending as required. There are no executive staff members from the ONS or other statistical producers represented on the committee (except when invited as guests for specific discussions), consistent with the statutory requirement to separate statistical production and assessment.

The Audit and Risk Committee

Executive accountability for risk management resides with the National Statistician (as Accounting Officer), with executive oversight residing with the Executive Committee and its sub-committees. Chaired by a qualified Non-Executive Member, the Audit and Risk Assurance Committee (ARAC) supports the Board and the Accounting Officer in their responsibilities for risk management, control and governance by reviewing the comprehensiveness, reliability and integrity of the assurance available to them.

The Authority Risk and Assurance Framework provides a mechanism for the identification, analysis and management of risks across the Authority. It is aligned to The Orange Book: Management of Risk and reflects risk management best practice.

ARAC has responsibility for advising the Board on the effectiveness of governance, risk management and the system of internal control. This is also informed by audits and advisory work carried out by the internal audit team. The Committee currently comprises the Non-Executive Chair, two non-executive Board members and two external independent members.

The Authority Board ensures that plans are in place for any risks outside of appetite. Updates are provided to each ARAC meeting on the evolving profile. ARAC scrutinises the management of the strategic risks to satisfy itself that major risks are identified and that mitigation strategies and appropriate levels of assurance are in place. It challenges and holds the Risk and Assurance team and Strategic Risk Owners to account.

Currently the Authority’s Strategic Risks relate to:

  • Independence, trustworthiness and impact
  • User needs
  • Delivery of strategic ambition
  • Quality management framework
  • Our Security
  • Our People
  • Our Communications
  • Quality economic statistics
  • Quality population statistics
  • Data access and usability
  • Technological resilience

Discussions by ARAC and the Authority Board have focused on the most significant risks to the successful delivery of the Authority strategy, including the interplay across the strategic risk profile: specifically statistical quality, technological resilience (with particular focus on legacy systems) and people and skills. The Statistics Quality risk has remained outside of appetite for a prolonged period; ARAC has provided a high level of oversight and consistently sought assurances on when this risk will return within appetite. ARAC has also continued its focus on assuring legacy plans, which should support improving quality, and will play a key role in scrutinising all the strategic risks under the new risk profile, with particular focus on the quality of our economic statistics, legacy systems, people and meeting user needs.

Remuneration Committee

The Remuneration Committee agrees the pay and performance management framework for members of the Senior Civil Service employed by the Authority, within the parameters set by the Cabinet Office. It signs off performance and bonus decisions for staff at Deputy Director level and above.

How involved is the Board in the decisions taken by senior leaders at ONS and OSR?

As with most corporate or public sector organisations, a key role of the Board – and its non-executive members in particular – is to provide support and challenge to the Executive to help them deliver on the Board’s strategy and fulfil any statutory or regulatory duties. The Authority Board is unusual in that it has two executive arms – the production arm, the ONS, and the regulatory arm, OSR – and one of them regulates the other (along with the many other bodies producing official statistics). It also has a less direct responsibility for the GSS.

Within the ONS, the National Statistics Executive Group (NSEG) is the most senior executive committee, chaired by the National Statistician. Its role is to advise the National Statistician in the exercise of their functions as the Head of the GSS and Analysis Function and Chief Executive of the Authority. NSEG focuses on system-wide statistical and analytical matters, and this is reflected in the Group’s membership which includes two GSS Heads of Profession and colleagues from the devolved administrations. Meanwhile the Executive Committee (ExCo) focuses on all aspects of our business delivery within the ONS. Below NSEG and ExCo are a number of sub-committees that feed into discussions. For the OSR, the Regulation Committee helps to shape the regulation strategy of the Authority and oversee regulatory work. The Director General sits on the Regulation Committee, and he is supported by the OSR senior leadership team, who all attend the Committee.

Support and challenge from the Board is provided through multiple channels, without seeking to conduct the role of the executive or micromanage. At every Board meeting there are update reports from the National Statistician and the DG for Regulation, providing the opportunity for the two executive arms of the Authority to report on delivery against their respective Business Plans, highlighting areas of success as well as challenges. This allows the Board to engage, offer support, share expertise and offer challenge in the wider work of the executive arms, often covering work areas not discussed as substantive agenda items. When it is necessary to offer challenge, Board members are conscious of the need to be robust, forthright and persistent, while acting in a constructive and collegiate way and recognising any constraints the executive faces.

For support and challenge to operate effectively it is important that the executive has an accurate picture of what is going on in the organisation and its major programmes and activities, and that this in turn is shared with the Board in a full and transparent way. As in almost every Board, the non-executives periodically emphasise the importance of candour and transparency so that they do not receive an incomplete or unduly rose-tinted picture.

The Chairs of the Audit and Risk, Regulation and Remuneration Committees also provide updates to the Board on the work of their respective committees, in line with their delegated authority as set out in committee Terms of Reference. Through these updates the Board can share its views and support the work of the Committees: for example, it highlighted areas of concern and interest relating to gender identity in the 2021 England and Wales Census as the Regulation Committee investigated the matter.

The Authority Board receives monthly management information which helps to monitor performance against key deliverables as outlined in the Business Plan.

The Board sets the risk appetite for the Authority’s strategic risk profile and has a specific item on the strategic risks every six months. Work areas, projects and programmes feeding into strategic risks are of course often covered in substantive agenda items. This is underpinned by the work delegated to the ARAC. The strategic risk profile sets out the most significant risks to the successful delivery of the Authority’s strategy. The forward agenda for the Authority Board reflects the key challenges and aligns to the strategic risk profile. The Board forward agenda is produced by the Secretariat in accordance with any determination of the Board and in consultation with the Chair of the Authority, the National Statistician and the Strategy and Policy Deputy Director.

External assurance

The Authority is unusual in that it has an in-house regulatory arm (OSR), which in addition to the work of the internal audit team, gives the Authority both internal and external scrutiny. Both the external and internal scrutiny are welcomed and encouraged by the Board.

External assurance is provided to the Authority Board and National Statistician in several ways. First and foremost, the Authority is an independent non-ministerial department that reports directly to the UK Parliament, the Scottish Parliament, the Welsh Parliament and the Northern Ireland Assembly. The work and interest of Parliamentarians and Committees, including the Public Administration and Constitutional Affairs Select Committee (PACAC), in the work of the Authority are an important source of support, challenge and advice in ensuring that the official statistical system serves the public good as effectively as possible.

In addition, the National Statistician has convened a set of advisory committees and panels to provide external independent advice on specific topic areas. They include:

  • Advisory Panels on Consumer Price Statistics (Stakeholder and Technical)
  • Data Ethics Advisory Committee
  • Committee for Advice on Standards for Economic Statistics
  • Expert User Advisory Committee
  • Inclusive Data Advisory Committee
  • Stakeholder Advisory Panel on Labour Market Statistics
  • Methodological Assurance Panel

Membership of these groups includes representation from academia, government departments, the devolved governments and research bodies.

External expertise is also sought on a bespoke basis and shared with the Board. One such example was the review undertaken by Professors Ray Chambers and James Brown as part of the ONS’s systematic assessment of its readiness to manage the transition from the Labour Force Survey (LFS) to the Transformed Labour Force Survey (TLFS). They looked specifically at survey design, response patterns and weighting methods.

The Board also encourages frank and honest engagement with key users of ONS statistics to ensure that their requirements and feedback can be reflected. For example, ensuring that the ONS engages fully and constructively with the Bank of England and HM Treasury regarding the transition from the LFS to the TLFS and that any significant concerns they have are shared with the Board. The Stakeholder Advisory Panel on Labour Market Statistics chaired by Professor Jonathan Portes, and including representatives from other government departments, academia, the Scottish and Welsh governments and Northern Ireland Statistics and Research Agency has been a key source of advice and assurance as work on this transition has proceeded.

Over the last year this work has been an area of significant focus for the Board. The sharp fall in household survey response rates, a significant challenge for the UK as for other national statistical institutes around the world, has affected the quality of data from the LFS. As part of the work to address these challenges, the ONS has been developing an online-first TLFS.

Throughout this process the Board has provided scrutiny, oversight, challenge and support as the work has progressed (as detailed in published Board papers). The non-executive members have also held extended sessions to allow them to better understand the range of issues. The Board has provided clear feedback about the risks of transferring from the LFS to the TLFS, mindful of stakeholder concerns and quality issues. At its meeting on 27 March, the Board considered the advice of the National Statistician on the TLFS, as well as assurances from technical advisers and the advisory committee, to reach collective agreement on the way forward.

The National Infrastructure and Service Transformation Authority (NISTA) – previously the Infrastructure and Projects Authority (IPA) – is the government’s centre of expertise for infrastructure and major projects. It regularly scrutinises any of our projects that fall under the Government Major Projects Portfolio (GMPP). These have included the ONS’s Integrated Data Service Programme and the Future of Population and Migration Statistics. Along with Treasury business case reviews, these have offered external assurance both to the executive and the Board, although it is not unknown for projects that have cleared these hurdles multiple times to end up with difficulties that would presumably have been even harder for the Board alone to surface.

UK Statistics Authority

May 2025

 

How successful has the OSR been in identifying issues with official data, and making the case for improvements?

Summary

The Office for Statistics Regulation (OSR) is charged with upholding the standards of official statistics across the UK. Through our wide-ranging regulatory work we identify issues and respond to stakeholder concerns about official data. We set requirements for improvements, as well as highlighting areas of best practice.

Our work has secured commitments from statistical producers that have led to positive improvements in many official statistics, although improvements are not always delivered as quickly as we would like. The importance of our role, and the independence and rigour that we bring to the task, have been noted by external reviews such as Professor Denise Lievesley’s independent review of the UK Statistics Authority and the PACAC report on Transforming the UK’s Evidence Base.

This submission sets out: our scope and approach to regulation; our key regulatory interventions and how we have identified issues and required improvements; and our views on the evolving statistical system. We have included clear examples of where our work has been vital in securing improvements or holding organisations to account against a backdrop of significant issues or concerns.

Many of the examples included in this submission relate to statistics produced by the Office for National Statistics (ONS). These represent some of our most recent and highest-profile interventions. However, OSR’s regulatory activities span the range of Crown and non-Crown producers of official statistics across the UK.

Introduction

OSR is the regulatory arm of the UK Statistics Authority and was established in November 2016 following the Bean Review. OSR fulfils the assessment and regulatory function set out in the Statistics and Registration Service Act (2007). We are independent from Government and are separate from producers of statistics, including the ONS.

The work of OSR is overseen by the Regulation Committee of the UKSA Board, which comprises non-executive members of the main UKSA Board, with the Director General for Regulation sitting as an executive member. There are no executive members of the ONS on this committee, to avoid any conflict of interest when OSR is examining the work of the ONS. The Chair of the Authority sits on the Regulation Committee and has stated – like his predecessor – that in the event of a dispute between OSR and the ONS, he and the Board would, by instinct, side with the regulator.

Professor Lievesley examined the operation of OSR and the Regulation Committee and concluded that: “Having reviewed the organisation thoroughly, this Review is satisfied that there is sufficient operational independence between ONS and OSR. The Review could find no tangible evidence to support assertions that the two organisations are too cosy or that a fundamental, unmanageable conflict of interest exists between the two that undermines the integrity or quality of the statistics produced by ONS, though it is important to pay attention to the perception of independent scrutiny.”

In line with the Statistics and Registration Service Act (2007) the principal roles of the OSR are to:

  • Set the statutory Code of Practice for Statistics
  • Assess compliance with the Code of Practice
  • Accredit official statistics that comply fully with the Code of Practice
  • Report any concerns on the quality and comprehensiveness of official statistics
  • Report any concerns on good practice in relation to official statistics

Our purpose is to ensure statistics serve the public good by regulating against the principles of Trustworthiness, Quality, and Value. As a regulator, we work through three delivery channels:

  • We uphold the trustworthiness, quality and value of statistics and data used as evidence
  • We protect the role of statistics in public debate
  • We develop a better understanding of the public good of statistics

Our 5-year plan sets out our vision and priorities for 2020-2025 and how we will contribute to fostering the Authority’s ambitions for the statistics system. Our annual business plan shares our focus for the current year.

Our regulatory approach

Regulatory tools

As the regulator for official statistics across the UK, we have a number of different tools that we use in order to identify issues with official statistics and make recommendations or requirements for improvements:

  • Assessments: Detailed reviews of an official statistics output that grant, reconfirm or remove the status of ‘accredited official statistics’ (referred to as ‘National Statistics’ in the Statistics and Registration Service Act 2007)
  • Compliance checks: Short, focused reviews, typically providing a high-level investigation of the official statistics
  • Reviews: Pieces of work examining issues across the statistics landscape or related sets of official statistics to provide strategic recommendations
  • Casework: Complaints about the production and use of statistics, which we investigate and on which we reach a judgement

Engagement with statistical producers

OSR is structured around 8 topic domains, each of which is responsible for maintaining an overview of the statistics produced by relevant government departments and public bodies within that topic. This knowledge keeps OSR up to date on existing and emerging issues and ensures that our reviews and judgements are informed by a deep understanding of the topic.

The domains build strong regulatory relationships with the relevant statistics producers, which support better outcomes for the statistical system, through early and frank exchange of information and intelligence, and securing buy-in from the producers of statistics for the requirements and recommendations set by OSR.

One of these key relationships is with the civil service Heads of Profession for Statistics who sit within each statistical producer organisation. Heads of Profession for Statistics play a vital role in upholding the quality and standards of official statistics as set out in the Code. OSR works closely with the Heads of Profession across government to provide a mix of challenge, advice and support where appropriate.

Whilst we provide specific recommendations to producers as part of our reviews, in general we take a more holistic approach to regulation, providing support, advice and training in addition to our formal regulatory work. This approach ensures that our work with producers secures real change and improvement in statistics, rather than being a performative tick-box exercise. We set the expectation that producers are open and honest about their statistics, both with us as the regulator and in the public domain. We stand firm on our regulatory decisions but always ensure that they are fully informed by conversations with the producer, so that they are proportionate and rooted in the facts.

We pride ourselves on this collaborative approach and consider that it leads to considerably better outcomes for the statistical system. This approach was endorsed by Professor Lievesley, whose review noted that many statistics producers commend the support and guidance from the OSR and that this constructive approach is having a positive impact on compliance with the Code.

Separation from the ONS

For regulation to be effective, it is important that external stakeholders have confidence in the arrangements ensuring OSR’s separation from the ONS. This separation is crucial because it is what enables OSR to make sound regulatory decisions about the ONS’s production of official statistics. These regulatory decisions should be made in the same way, using the same criteria and governance, as for any producer of official statistics.

In October 2024 we published a statement which transparently set out how the separation of OSR from the ONS is achieved in practice. The OSR has separate governance structures, strategy and business planning, reporting lines to the Chair of the Authority, and external communications. As noted above, Professor Lievesley examined the operation of the OSR and the Regulation Committee and found them to be robustly independent.

Regulatory work

Economic statistics

High quality economic statistics are a crucial underpinning for informed decision-making and the functioning of the UK economy. Over the last few years, there have been a number of economic shocks which have brought increased interest in, and scrutiny of, economic statistics produced by the ONS. Over the last year in particular, there has been growing external criticism of ONS.

OSR has proactively undertaken an extensive programme of regulatory work on economic statistics over the last 5 years. These reviews have been fundamental in synthesising stakeholder concerns and identifying issues with the quality of the core building blocks of economic statistics. We have set out clear requirements for improvement and hold the ONS to account by monitoring its progress in delivering them.

This section sets out our regulatory work on the ONS’s economic statistics. It:

  • Describes our methodology (Spotlight on Quality)
  • Summarises our work on price statistics
  • Summarises our work on labour market statistics
  • Sets out how our April 2025 review builds on the issues identified in our assessments over the preceding 5 years and highlights the urgent need for the ONS to address quality concerns

Spotlight on Quality Assessments

The UK’s departure from the EU ended the role of the European statistical office (Eurostat) in verifying the quality of UK statistics. In response, we enhanced our work programme of reviews of economic statistics, including developing a Spotlight on Quality assessment framework to provide continued assurance on the quality of internationally comparable economic statistics.

This framework builds on our earlier regulatory reviews of the Living Costs and Food Survey (LCFS) and UK Business Demography Statistics. The framework sets out four key areas for evaluating the quality of statistics:

  • whether the statistics are produced using suitable data sources
  • whether appropriate methods are used
  • whether quality assurance is transparent
  • whether the statistics are sufficiently prioritised and resourced proportionately to their use

This framework has been vital in highlighting issues and areas for improvement in the economic statistics produced primarily by the ONS, and it has been used to underpin the requirements we have set for the ONS.

The Spotlight on Quality Assessment programme provides a detailed review of many of the data sources and components that feed into the production of GDP and the broader National Accounts. We have undertaken the following reviews:

  • Price Index of Private Rents (PIPR) – October 2024. The assessment found improvements in methods and user engagement but noted that further information needs to be published around methods and data quality. It recommended enhancing explanations of the methods and better communicating development plans.
  • Business Investment Statistics – October 2024. The review highlighted positive user feedback on the frequency and availability of the statistics but raised concerns about revisions and outdated production systems. It recommended analysing the impact of non-sampling errors, updating methods and quality information, and engaging a wider range of users.
  • Review of Economic Statistics Classifications – July 2024. The review recognised the importance of classifications for National Accounts but raised issues about capability and responsiveness to user needs. Recommendations included more openness about decision-making and faster publication of classification decisions.
  • UK Business Enterprise Research and Development (BERD) Statistics – July 2024. The assessment highlighted efforts to improve the BERD methodology and a move to electronic questionnaires. It recommended transparency about the questionnaire used and better communication with users on uncertainty, strengths and limitations.
  • Northern Ireland Business Expenditure on Research and Development Statistics – July 2024. OSR noted good alignment with UK standards but advised on improving documentation of methods and expanding user engagement. Recommendations focused on engaging with users to identify any needs for a potential back series and additional background information.
  • Profitability of UK Companies and Gross Operating Surplus of Private Non-Financial Corporations – January 2024. The assessment found that while the statistics are broadly reliable, there is limited documentation on the quality of different data sources. OSR recommended improving quality assurance, documenting quality information and wider user engagement.
  • Producer Price Inflation (PPI) – July 2023. The assessment found that while the ONS has made improvements to quality and international comparability of the PPIs, under-prioritisation of these statistics has negatively affected the quality. OSR defined several requirements to improve the statistics, including to modernise the inflexible legacy systems used to produce the statistics.

ONS’s Price Statistics

The CPIH (Consumer Prices Index including Owner Occupiers’ Housing Costs) is the ONS’s lead and most comprehensive measure of consumer price inflation. It includes the costs associated with owning, maintaining, and living in one’s own home, which is the most significant expense for many households. As such, it is key that the owner occupiers’ housing costs (OOH) element is captured accurately.

As highlighted in our Systemic Review on Housing and Planning Statistics in 2017, the previous method for producing private rental sector statistics had known limitations including being unable to provide estimates of private rent levels and change that were both comparable over time and available at low levels of geography.

To address these limitations, the ONS developed the Price Index of Private Rents (PIPR) which has now replaced the ONS’s Index of Private Housing Rental Prices (IPHRP) and Private Rental Market Statistics (PRMS) and is used for estimating the owner occupiers’ housing costs (OOH) element of CPIH.

PIPR was published for the first time in March 2024, following which, the ONS requested that we assess the statistics against the Code with a view to them becoming accredited official statistics. This process was undertaken in order to provide assurance to users and stakeholders on the quality and reliability of the estimates. We undertook our review at pace, publishing in October 2024.

The OSR review judged that “ONS’s new PIPR statistics generally appear to be meeting users’ needs more effectively than the previous private rents measures that these statistics have replaced.” However, we also concluded that “although the ONS has published supporting methods and quality documentation for PIPR, this does not currently amount to a sufficiently accessible and detailed account of PIPR methods to enable an adequate understanding of the approaches used, the ONS’s rationale for choosing them, and their relative strengths and limitations, for both technical and non-technical users.”

Ultimately, our review determined that the ONS will need to develop and publish the necessary materials; publish NI and full UK PIPR-based estimates; and facilitate an effective evaluation of the UK PIPR series with users before we will consider initiating a full assessment of whether these statistics merit accredited official statistics status.

The review set out five requirements that the ONS will need to address as it further develops the PIPR statistics and required that the ONS publish an action plan by January 2025 setting out how it will address these requirements, and report back to us publicly every three months on its progress. This process ensures that there is transparency for stakeholders and users and that the ONS is held to account.

In February 2025 we set out a forward work plan on assuring confidence in consumer and household price statistics. We have since initiated a review that will focus specifically on the ONS’s approach to transforming its consumer price statistics, in advance of a full re-assessment of CPI and CPIH statistics. We will also begin a review of Household Cost Indices (HCIs) later this year.

Labour market statistics

We have focused our work around the challenges that the ONS has faced with its labour market statistics on three distinct themes:

  • Responding to a changing labour market
  • Declining response rates
  • Transforming labour market statistics

Responding to a changing labour market

Employment and jobs statistics are essential for understanding the patterns and dynamics of the UK labour market. They are used widely by a variety of stakeholders – for example within UK Government and by the Bank of England – to develop and monitor government policies, so it is important that they are accurate, high quality and clear in order to fully serve the public good.

Over the last few years, labour market statistics have faced a variety of challenges related predominantly to falling response rates. OSR has provided regulatory oversight of these issues and the ONS’s response, undertaking a significant volume of regulatory work on labour market statistics in the past few years as set out below.

In response to growing concerns about the reliability of the Labour Force Survey in 2020, we assessed the UK employment and job statistics produced by the ONS.

The report emphasised the importance of the ONS adopting a flexible approach. It highlighted that the labour market and economy are constantly changing, and that the statistics describing the labour market must therefore adapt to reflect those changes. This includes embracing new data sources and navigating the impact of COVID-19. We concluded that “ONS needs to demonstrate drive and ambition to fill the data gaps and match the pace of change in the labour market, engaging effectively with users to ensure their needs are met.”

In our report we identified areas of good practice such as the labour market statistics team’s collaboration and engagement with a wide range of users and stakeholders. We also set out 12 requirements for the ONS, which were necessary in order to ensure that these statistics could continue to be designated as accredited official statistics (then referred to as National Statistics).

Response rate challenges

We have seen a long-term trend of response rate challenges facing the LFS, which became acute when the sample boost in place to enable pandemic operations was removed in July 2023. Following this, in October 2023, the ONS suspended publication of its estimates of UK employment, unemployment and economic inactivity based on LFS data and announced that it would publish a new experimental series using additional data sources in its place. This short notice change to methods of a key series had a significant impact on user confidence.

In response, we immediately announced and initiated a rapid review of these experimental statistics which was published in November 2023. This review set out key requirements for the ONS on:

  • suitable data sources
  • sound methods and quality assurance
  • clarity of communication
  • managing quality

As a reflection of the significant concerns about quality, the review resulted in the removal of the accredited official statistics status from LFS-based estimates. Following enhanced quality information provided by producers at our request, we also removed the accreditation from other outputs based on data from the Annual Population Survey, which draws on responses to waves 1 and 5 of the LFS plus a boost sample.

In response to the ONS reintroducing the LFS-based labour market statistics in February 2024, we carried out a short review. A key theme that emerged from this review was the need for improved, clear and open communication from the ONS. The review set out requirements around: communication of plans and priorities; accessibility of updates and communications; explaining how the data should be used; communication of data quality issues and improvements; and transition to the TLFS. In August 2024, we carried out a follow-up review to check the progress made against the requirements.

ONS’s Transformation plans for the Labour Force Survey

We have also carried out regulatory work throughout the development of the Transformed Labour Force Survey (TLFS). The ONS developed the TLFS in parallel with running the LFS, intending it to address many of the LFS’s concerns and shortcomings, but this work has also faced significant delivery challenges.

We carried out our TLFS review in three phases with the aim of sharing early regulatory insights to help the ONS in ensuring the new survey meets the standards of the Code. The first phase (which started in April 2022) focused on the design and development work the ONS had planned before transitioning to the new survey approach.

We published our initial findings on the TLFS in November 2022 which set out a range of requirements, including on enhancing public confidence and maximising the public value of the TLFS; communicating impacts; and supporting public confidence in the transformation process. In July 2023, we published an updated letter and progress report following phase two of our review.

In February 2025, we reported the outcome of phase three of our review of the ONS’s LFS transformation. This report consolidated OSR’s work on both the LFS and TLFS, bringing together our judgements to date and providing updates on the remaining open recommendations and requirements.

Following recommendations set out by OSR, the ONS has widened its user engagement with the introduction of the stakeholder panel and expert data sharing groups and has been publishing updates on the labour market transformation – progress and plans. In December 2024, the ONS also published an interim action plan based on the results of its ‘lessons learnt’ exercise conducted in summer 2024; published the detail of an independent methodological review; and explained its plans in an accessible way. The ONS has revised its plans for the TLFS and in response we made further recommendations for the ONS to set out detailed plans for transitioning to the TLFS, and to set out plans for regular reporting on the progress of the interim action plan from its ‘lessons learnt’ exercise. We have asked the ONS to report on progress again by July 2025.

We continue to closely monitor the ONS’s work to improve the LFS. We will maintain our engagement with the ONS and users to understand whether these changes have increased quality sufficiently to meet user needs.

Review of ONS economic statistics

In April 2025, we published our report based on our Systemic Review of ONS Economic Statistics. The report provided a synthesis of the concerns surrounding the ONS economic statistics that had emerged from our work over the last five years, and feedback from stakeholders.

The report was direct in recognising the need for urgency in addressing the declining stakeholder confidence in ONS’s economic statistics, concluding that:

  • The ONS must fully acknowledge and address declining data quality
  • Making progress with administrative data is difficult
  • Greater strategic clarity of purpose and transparency on prioritisation would help reassure external stakeholders

The review also set requirements that the ONS must address:

  • Restoring confidence, by producing a fully resourced plan to recover its social survey operation and reduce risk in its business survey operation.
  • Ensuring strategic transparency, by clearly setting out the core purpose of economic statistics and what can be achieved with available funding in its business plan, a strategic plan for economic statistics and a strategic plan for data sources.
  • Focusing on the quality of data inputs, by implementing a prioritised rolling programme of regular reviews of individual surveys and other data sources.

Population statistics

The UK’s population statistics are going through a period of profound challenge and change.

This section sets out how OSR’s work on population statistics has highlighted the effective work undertaken by the ONS, National Records of Scotland (NRS) and the Northern Ireland Statistics and Research Agency (NISRA) on the 2021 and 2022 Censuses respectively, but also brings out the issues surrounding measuring gender identity, and the opportunities and challenges arising from the use of administrative data.

Censuses in the UK

We have conducted assessments of the censuses produced by the ONS, NRS and NISRA. Our assessments have been conducted in three phases. In October 2019, we published our reports on Phase 1, focusing on the planning and consultation activities undertaken by the census offices across the UK. In November 2021, we published our Phase 2 Assessment reports, focusing on the strategies for developing and providing outputs for both the England & Wales Census 2021 and the Northern Ireland Census 2021. For Scotland Census 2022, our Phase 2 Assessment report was published in April 2023.

The phased approach is essential for census outputs because we have historically granted accredited official statistics status at the end of phase 2, prior to the publication of the outputs themselves. This reflects the national significance of these statistics and the importance of reassuring users of their quality at the time of publication, rather than retrospectively. It also means the census offices must meet the requirements set out in phases 1 and 2 before accredited official statistics status is granted.

For Scotland’s Census 2022, NRS faced unexpected challenges given that the overall response rate was lower than anticipated (89.8% compared to a target of 94%). The media and users were concerned that statistics derived from the Census would not be fit for purpose because of the response rate. As a result, NRS took a number of steps in collaboration with international census experts to change how the final census estimates were calculated, using administrative data alongside the Census Coverage Survey and census responses in the estimation process. Drawing on OSR’s principles of intelligent transparency and our guidance on communicating uncertainty, NRS carefully considered its communication approach for its first outputs of the Census 2022 data. We commended NRS’s dedication to meeting the needs of users and following the standards of the Code of Practice, including most recently by advocating this good practice in a published communicating uncertainty case study.

We published our phase 3 assessment report of the 2021 Census in Northern Ireland in February 2025. This final report confirmed that the 2021 Census statistics in Northern Ireland are produced in compliance with the Code. Our phase 3 assessment of the 2021 Census in England and Wales is ongoing with a projected summer 2025 publication date. Our phase 3 assessment of the Scotland Census will be undertaken in 2025/26.

Review of Gender Identity in the ONS 2021 England and Wales Census

Information on individuals’ gender identity was collected on a voluntary basis for the first time in the ONS 2021 England and Wales Census. As such, the data provided the first ever nationally available estimates for England and Wales of the size and characteristics of the trans population. In addition, the question developed for the Census represents the current Government Statistical Service (GSS) harmonised standard, in development, for collecting data on gender identity.

Following the first release of census statistics on gender identity in England and Wales in January 2023, concerns were raised about the published estimates of the trans population. As additional census data were published, these concerns extended to the relationship between gender identity and proficiency in English. OSR also received concerns about the level of methodological information published.

OSR undertook a review of these statistics and published an interim report in October 2023 and a final report in September 2024. Learning from new evidence in Scotland’s Census, the ONS wrote to us on 5 September 2024 to request that the gender identity estimates from Census 2021 in England and Wales should no longer be accredited official statistics, and should instead be classified as official statistics in development. The ONS’s proposal was consistent with our report findings, and we accordingly removed the accreditation for these statistics. We also concluded that the issues were unique to the statistics on gender identity; all other outputs from the Census 2021 in England and Wales are therefore unaffected and remain accredited official statistics. Our work also found that the ONS had been somewhat closed, and at times defensive, in responding to concerns raised by users.

Our final report shared our recommendations on the steps the ONS must take to help users of the census gender identity statistics understand their strengths and limitations and set out the development work we consider is required on the GSS gender identity harmonised standard.

The ONS wrote to us in December 2024 updating us on its progress towards meeting the recommendations. This included publishing a workplan for developing harmonised standards for sex and gender identity data collection and new Gender Identity Data Harmonisation interim guidance for statistics producers.

Following these publications, we updated our existing guidance on collecting and reporting data about sex and gender identity in official statistics in December 2024 to include these new publications.

On 26 March 2025, the ONS published a blog further updating on the actions it is taking to meet our recommendations. These actions included publishing additional guidance on the appropriate use of the gender identity estimates from Census 2021 in England and Wales and information on the uncertainty associated with them. We consider this to be an excellent research report: it includes example use cases at different levels of geography and population and addresses anomalies and implausibilities. We are confident this practical information will help users to better understand the uncertainty in the data and its implications for use.

The ONS is also making progress with developing harmonised standards for sex and gender identity data collection. We have asked that the ONS continues to keep OSR updated as it develops these harmonised standards.

Admin-based population estimates for England and Wales

Admin-based population estimates (ABPEs) have huge potential to provide more timely, detailed and potentially more accurate population data than traditional census-based methods. The ONS intends the ABPEs to become the official population estimates for England and Wales in 2025. Given the extensive use of population statistics, it is vital that this new methodology has appropriate oversight and scrutiny.

Our phased assessment of the ONS’s ABPEs for England and Wales aims to provide reassurance to users on the ONS’s new methods for producing population estimates.

We published our phase one assessment of these statistics in July 2024, which focused on reviewing quality. As part of this assessment, we commissioned an independent review from Professor Arkadiusz Wisniowski of the University of Manchester to inform our judgements around the suitability and quality assurance of the data and methods. Our assessment identified 11 requirements for the ONS to act on that will help to enhance the public value, quality and trustworthiness of these statistics. These requirements covered areas such as governance, data quality, methods, revisions, user engagement and communication. They included:

  • Requirement 1: To maintain public confidence in its population statistics, the ONS needs to understand the current dependencies between the ABPEs and MYEs. Together with key stakeholders, such as the Welsh Government, the ONS should also develop and publish criteria to support its decision about when the ABPEs will replace the MYEs. The criteria should include statistical quality, operational readiness, planned evaluation and assurance processes and contingency plans, and be usefully applied to the ABPEs and MYEs.
  • Requirement 2: To ensure that there is sufficient oversight and leadership of the production of ABPEs in a way that is joined-up across the ONS, and support the ongoing development of ABPEs, the ONS should strengthen its governance structure. Work here should include establishing clearly defined decision-making responsibilities to manage any risks associated with funding, capability and prioritisation across the ABPEs production process.
  • Requirement 8: To instil confidence in the ABPEs and ensure that the DPM methods are sound and subject to sufficient independent and external challenge, the ONS should:
    • continue with its plans to create a sub-group of its Methodological Assurance Review Panel (MARP; the independent panel used by ONS to provide advice and assurance on methods used to produce official statistics).
    • create and implement an expert user group.
    • make it easier for users to find relevant MARP papers to support technical user understanding of the methods used in the DPM.

Since our report was published, the ONS has used our findings to help shape and steer its development work for the ABPEs. In October 2024 the ONS published an action plan for how it will develop population statistics. This sets out that the work to address and build on the requirements and recommendations from the assessment will be iterative. Over the last six months, and in response to our findings, the ONS has developed and published a population and international migration statistics revisions policy, introduced quarterly updates to keep users informed of its plans (including its work on ABPEs) and increased its user engagement activities in a co-ordinated and transparent way. We continue to engage with the ONS as part of our follow-up phase to scrutinise its activities and will consider the next phase of our assessment in summer 2025.

Other regulatory work

Domain summaries

OSR is structured into eight topic teams, called ‘domains’. The two preceding sections have summarised some of the key work undertaken by our Economy, Business and Trade domain and our Population and Society domain respectively.

This section provides a short overview of our remaining six domains as well as recent regulatory work.

Children, Education and Skills: This domain oversees the regulation of data and statistics concerning all stages of education from early years to university and beyond, including statistics on teachers and lecturers, learners, and looked after children. We recently carried out an assessment of the Higher Education (HE) Graduate Outcomes Data and Statistics produced by Jisc under the Higher Education Statistics Agency (HESA) brand. We published our report in April 2024 and confirmed the accreditation of the statistics without requirements for improvement. We are providing support to the newly created Welsh Government Sponsored Body, Medr, which is responsible for a number of official statistics previously published by the Welsh Government, in addition to outputs from the Higher Education Funding Council for Wales (HEFCW). In previous years we have also carried out assessments on the Achievement of Curriculum for Excellence Levels statistics produced by the Scottish Government, and the Key Stage 4 performance statistics for England produced by the Department for Education.

Crime and Security: This domain covers statistics on crime, policing, justice systems (family, civil and criminal) and national security. This domain has undertaken a number of significant and impactful reviews over the last few years including: the Fraud and Computer Misuse statistics for England and Wales published by the ONS; the quality of Criminal Court Statistics for England and Wales produced by the Ministry of Justice and based on data from HM Courts and Tribunals Service (HMCTS); and police recorded crime statistics published quarterly by the ONS, based on Home Office data collected from the 43 individual police forces in England and Wales and the British Transport Police. We have also carried out assessments on the Scottish Prison Population statistics produced by the Scottish Government and Police Officer Uplift statistics produced by the Home Office.

Health and Social Care: This domain oversees the regulation of statistics concerning the health of the UK population and the health and social care services provided in England, Wales, Scotland and Northern Ireland. Our Health and Social Care domain played a vital role during, and following, the COVID-19 pandemic in rapidly reviewing the quality of statistics being used by Government and the public.

In addition to work on statistics relating to COVID-19, the domain has also assessed Accident and Emergency (A&E) Activity Statistics in Scotland and statistics about the workforce employed by adult social services departments in England.

Housing, Planning and Local Services: This domain oversees statistics on a range of topics, including: house building; household estimates and projections; homelessness and rough sleeping; housing need and demand; land stock, use and development; and local authority planning. It also covers information on local services such as fire and rescue services. This domain has undertaken a wide range of compliance checks in the last few years, including on statistics on Council Tax in Wales, Social Housing Lettings in England produced by the Ministry of Housing, Communities and Local Government, and Valuation Office Agency Council Tax statistics. The domain has also assessed statistics on Statutory Homelessness in England produced by the then-named Department for Levelling Up, Housing and Communities (DLUHC).

Labour Market and Welfare: The Labour Market and Welfare domain includes statistics measuring different aspects of work and jobs, covering people’s employment, working patterns and the types of work they do, as well as the earnings and benefits they receive. Currently, this domain is primarily focused on the LFS and TLFS, as summarised in the ‘Economic statistics’ section above. Other examples of its work include an assessment of the Personal Independence Payment statistics produced by the Department for Work and Pensions.

Transport, Environment and Climate Change: This domain covers statistics on transport and transport infrastructure; food and farming; the natural environment; energy; fuel poverty; and climate change. This domain maintains strong stakeholder relationships with the wide range of producers active in this topic area. We have intervened on subjects that have attracted much public interest, such as improving the transparency of the Welsh Government’s 20mph speed limit data. We have previously undertaken regulatory reviews of some key biodiversity indicators, including our assessment of Defra’s butterfly statistics, as well as more systemic reviews such as our review of the accessibility and coherence of UK climate change statistics.

Transparency on the use of data

One of our flagship campaigns relates to the way statistics and data are used in communication. The campaign is based on our principles of intelligent transparency. At its core, intelligent transparency requires that any statement involving statistics or data made by those in government must be based on publicly available data, preferably the latest official statistics where possible.

There have been several high-profile endorsements of intelligent transparency including the report from the PACAC on Transforming the UK’s Evidence Base in May 2024, which commended our work on intelligent transparency and noted that “This Intelligent Transparency guidance has driven the publication of several datasets which would otherwise remain hidden to members of the public, and has been welcomed by many organisations who rely on good data”. The Royal Statistical Society (RSS) also supports the campaign and has integrated the principles of intelligent transparency into the RSS’s new Principles to support statisticians making trade-offs in pressurised situations.

Casework

Through our casework process we regularly receive complaints about the use of data and statistics, often relating to our principles of intelligent transparency. This process is vital to OSR upholding standards in the use of official statistics, and of data more broadly, in public debate, holding individuals and organisations to account when needed. In the international landscape of official statistics, this function is relatively unusual, but we consider it an important part of underpinning confidence in statistics and data. This section provides some examples of our recent key interventions.

In October 2024, we were made aware of an unsupported statement made by the Prime Minister, Sir Keir Starmer, at the Labour Party Conference regarding immigration returns. At the time the Prime Minister made the claim, there were no Home Office data or statistics available in the public domain for the relevant time period to support this statement. We worked with the Home Office, and this led to the publication of an ad hoc statistical release providing the underlying data relating to the statement.

In March 2025 we wrote to Peter Schofield, Permanent Secretary for the Department for Work and Pensions (DWP), regarding a statement in a press release on the number of people on Universal Credit health with no requirement to look for work. We judged that the statement that the number of people claiming disability elements of Universal Credit had “increased by 383%” presented an ‘entirely misleading’ picture to the public, as it did not recognise that the majority of this increase is due to the process of migrating people from legacy benefits over the last few years. When these people are accounted for, the actual increase in the number of people claiming disability elements of Universal Credit is around 50%. We requested that the press release be updated that week to remove the reference to the 383% figure and that the figure not be used going forward. DWP actioned the change to the press release shortly after, and the Permanent Secretary responded to us committing to involving lead statisticians and analysts at all stages of the process, with appropriate oversight from their Head of Profession for Statistics.

In October 2022, we wrote to the Scottish Government in relation to concerns that had been raised with us about the NHS Inform dashboard. The dashboard showed the numbers of patients treated in the last quarter and their median wait times by clinical specialty. However, patients who had not yet been treated, some of whom may have been waiting a long time, were not included in these statistics. As such, we judged that the dashboard could potentially mislead some patients about the length of time they may have to wait. Based on our recommendations, the Scottish Government implemented improvements to the way that figures were presented in late 2022. In October 2024, Public Health Scotland wrote to us outlining their plans to overhaul the dashboard, which will result in a range of improvements to the presentation of the statistics and provide a better reflection of people’s actual experience of waiting for appointments and treatment.

During the lead up to the 2024 General Election, we published a statement on claims made by the Conservative Government about the UK’s plan to “increase defence spending to 2.5% of GDP by 2030 – an increase of £75 billion”. We determined that the figure of £75 billion did not provide a clear picture to the public as it assumed that annual spending on defence would remain flat in cash terms. If the calculation assumed that defence spending was held at the share of GDP originally planned for 2024-25 then the proposed cash ‘increase’ would drop from £75 billion to £25 billion. Our statement notes: “Cumulating spending increases (or cuts) over several years to derive a large cash figure for presentational purposes does not in general facilitate public understanding of the data in question – the longer the period you choose, the bigger the number you get.”

An evolving statistical system

The UK data and statistics environment is constantly evolving, presenting new challenges and opportunities for official statisticians and analysts. We consider many of these aspects through our reviews and through much of our wider work.

The State of the Statistical System

The State of the Statistical System is an annual report produced by OSR which presents our view on the performance and challenges facing the UK’s statistical system.

The 2023/24 report, published in July 2024, emphasised the increasing strain on the system due to financial and resource pressures, and the need to prioritise core statistics to ensure they are adequately resourced and funded.

To address the issues in the report we set out a number of recommendations. These included that the GSS develop a strategic plan for household data and invest more in its approach to engagement, and that the statistics system shares knowledge and best practice on delivering transformation programmes.

Data sharing and linkage

In 2023 we published a review of data sharing and linkage across government with 16 recommendations for the statistical system, as well as a follow-up report in July 2024 which assessed the progress that had been made.

Our 2023 report had positive impacts on several projects relating to data sharing and linkage. These impacts include influencing the strategic approach taken by the Department for Science, Innovation and Technology (DSIT) to reviewing cross-government data sharing policy; developments in the Data Marketplace led by the Central Digital and Data Office (now Government Digital Service); the implementation of Wave 2 of the Public Engagement in Data Research Initiative (PEDRI); and technical innovation by the ONS Data Science Campus in developing new privacy-enhancing technologies (PETs).

However, our 2024 follow-on report concluded that despite welcome pockets of innovation, there continues to be a failure to deliver on data sharing and linkage across government, with many persisting barriers to progress. Linking datasets for research, statistics and evaluation – both across government and among external researchers – is not yet the norm in the UK statistical system. To make this a reality, stronger commitments to prioritise data sharing and linkage are required. Such commitments further need to be endorsed and sustainably resourced by senior political and Civil Service leadership.

Our report also acknowledged specific process barriers to data access and linkage. Among these, we noted concerns that data use cases are often defined too tightly to enable the use of data in policy development; this is particularly relevant to the success of the Integrated Data Service (IDS). OSR is working with members of the IDS team and the UKSA Research Accreditation Panel to consider programmatic access.

Conclusion

This submission summarises the effective work of OSR, and the tools we use to uphold compliance with the Code of Practice for Statistics. It shows how our work ensures accountability for the production of official statistics that comply with the principles of Trustworthiness, Quality and Value. This submission also provides clear examples of where our work has identified issues, set requirements, and secured change across a wide suite of statistical outputs and data practices, often against a backdrop of significant issues or concerns.

 

Ed Humpherson

Office for Statistics Regulation

May 2025

 

How well served are policy-makers, researchers, businesses and citizens, by the data that ONS produces and the services it provides?

Summary

The Office for National Statistics (ONS) is the UK’s national statistical institute and largest producer of accredited official statistics. It produces statistics, data and analysis to support a wide range of users including decision makers, researchers, businesses, and citizens. The ONS continually engages with its users to understand and meet their evolving needs and ensure that its outputs and services are of a high quality, accessible and relevant.

As the Committee is aware, the National Statistician, Professor Sir Ian Diamond, resigned earlier this month due to health issues. Emma Rourke, Deputy National Statistician for Health, Population and Methods, will be Acting National Statistician pending longer term arrangements being put in place.

The ONS continues to face challenges, including falling survey response rates, and is operating within a tight financial and human resources environment. We remain committed to continuous improvement of our methods and approaches. Alongside this, there is an ongoing independent review of the performance and culture of the ONS, led by Sir Robert Devereux, former Permanent Secretary at the Department for Work and Pensions, as well as the soon-to-be-completed Spending Review. In this context the ONS continues to review its priorities and will make changes to its work as required, to further strengthen our approach to continuous improvement and enable the organisation to deliver our core mission of providing statistics for the public good.

The Office for National Statistics

The ONS is the executive office of the UK Statistics Authority (the Authority). It delivers independent, high quality and relevant statistics and analysis. The wide range of economic and social statistics we produce includes the UK’s National Accounts (such as gross domestic product (GDP)), vital events statistics (such as births, marriages and deaths) and labour market statistics (such as employment, unemployment and earnings) amongst others. The ONS also designs and runs the census in England and Wales every 10 years.

Our statistics and analysis are crucial evidence for decision making and monitoring by central and local government, the health service, businesses, charities and communities across the UK. They also inform public debate.

The ONS responds to changing contexts and user demand for more flexible, tailored and granular data. We are transforming our approach to how we produce statistics across the economy, population and society. This includes advancing data linking across government to enable faster, evidence-based decisions, and grasping the opportunities and challenges of new technologies (such as artificial intelligence (AI), including large language models) to shape and support thriving analytical and statistical systems for the future.

Our priorities are driven by our statutory objective set out in the Statistics and Registration Service Act, the UK Statistics Authority strategy for the statistical system ‘Statistics for the Public Good’, and the relevance and impact of our work to users and the public, with a focus on where we are uniquely placed to deliver.

ONS Strategic Business Plan 2025-2026

We have been open about the challenges the ONS has faced in recent months and set out a renewed focus on our core statistics in the ONS Strategic Business Plan for April 2025-March 2026, published in April 2025. This highlights that delivering our suite of economic and population statistics remains our core function and reflects decisions we have made to prioritise resources. GDP, prices, labour market and population statistics take prominence in our outputs.

We remain focused on producing the highest quality statistics and are committed to continuous improvement of our methods and approaches. Alongside our core outputs, we are undertaking vital transformation work, including delivery of our labour market statistics and improving the quality, granularity and timeliness of our prices data. Acknowledging the complexities of the challenges and the vital importance of our statistics to users, we have strengthened our engagement with stakeholders and channels for external challenge, support and expertise to inform our approach.

Our four key strategic priorities, which will guide the day-to-day activity of the organisation, are:

  • An enhanced reputation for delivering trusted, relevant, independent statistics and analysis.
  • Top quality published statistics on prices, GDP (including trade and public sector finance), the labour market and population (including births, deaths and migration).
  • Support for the Government’s missions and other users, maximising the use of our statistics and responding to evidence gaps where we are uniquely positioned to do so.
  • Greater linked data capabilities that result in faster, evidence-based decisions across government.

Given the tight financial and human resources backdrop and the need to prioritise our most critical statistics, difficult decisions, including to stop or reduce work, will need to be made in the coming period. While the prioritisation necessary to remain affordable will not satisfy demand from some users, we will continue to deliver impact by protecting our core deliverables.

Understanding user needs

The services that we provide depend on listening to, and working in partnership with, our users. We engage with a wide range of users and stakeholders to increase both their understanding of our work and our understanding of their evolving needs. We are committed to ensuring our statistics are accessible, inclusive and trustworthy, representing and serving everyone in society.

We engage with users through a range of methods including regular meetings, consultations, stakeholder surveys, events, tailored explainer webinars for specific audiences, focus groups, expert advisory panels and user research.

As well as engaging our users on our statistics and analysis, we regularly seek feedback on their levels of satisfaction. Our most recent feedback showed high levels of use of our core statistics on population and the economy along with agreement that we fulfil our mission to produce “High quality data and analysis to inform the UK, improve lives and build the future”, are trustworthy and that our statistics are relevant and of a high standard.

In listening to users, we are also able to better understand how the challenges we are facing affect them. For example, users have told us how falling survey response rates, particularly for labour market statistics, are affecting economic decision making. Users have also highlighted delays to some publications, the need for improvements to our website and a desire for more granular data across multiple topic areas. We fully recognise these points and have plans to address them.

In her Review of the Authority, Professor Denise Lievesley recommended that ‘It is time for the Board to move into a more visible, ambitious space, primarily through establishing a Triennial Statistical Assembly which will consult widely with statistics users and producers to understand the range of views regarding the priorities and data needs for the UK’. In response to this recommendation, we held the inaugural UK Statistics Assembly in January 2025. It was attended by over 550 people from a wide variety of sectors and roles.

The Assembly was summarised in an independent report produced by Professor David Hand, the then Chair of the National Statistician’s Expert User Advisory Committee (NSEUAC), and highlighted four high-level priorities for the Authority and Government Statistical Service:

  • Re-invigorate sustained and effective user engagement
  • Ensure user needs for more granular statistics are met
  • Commit to, invest in, and take a leadership position in a significant scaling up in the use of administrative data, as well as improvement of its quality and coherence
  • Recognise the needs for UK-wide statistics and advocate for, and support, harmonised data where desirable.

We plan to build on the success of the Assembly through a refreshed user engagement strategy, taking these priorities into consideration as we do so. The Authority Chair, Sir Robert Chote, will also deliver a lecture in July setting out the progress of the statistical system and priorities, drawing on the insights of the Assembly and the Office for Statistics Regulation’s (OSR) annual State of the Statistical System report.

Meeting user needs

Through our programmes, transformation work and statistics, the ONS is working hard to deliver statistics and analysis that meet user needs. This has involved significant prioritisation. The following paragraphs provide examples of work we do to understand and meet users’ needs and to inform policy makers, researchers, businesses and citizens.

Dissemination of Statistics

The ONS’s outputs, in line with the rest of the Government Statistical Service (GSS), are regulated by OSR. Equality of access to official statistics is a fundamental principle of statistical good practice. We publish our statistical releases on the ONS website and users are also able to request extra information from the ONS and see the information others have requested.

We are committed to improving the user experience on the ONS website and over the past year have been addressing website performance and stability, as well as making wider improvements in response to feedback from users. We have launched new website page previews, most recently on prices statistics, featuring new navigation, page designs and smarter content for users to provide feedback on.

Priority issues for decision makers

The ONS works closely with partners to provide responsive analysis that directly addresses policy priorities, including the missions introduced by the Government. These statistics, and many others across the organisation, provide vital insights for policy formulation across government.

The National Statistician also leads the GSS and Analysis Function. The ONS sits at the heart of the GSS and Analysis Function and works with the network of UK Civil Servants to provide the statistical evidence base, professional advice and analysis required by decision-makers to ensure policy and operations are evidenced and deliver value for money.

This collaborative approach to delivering statistical outputs, responding to analytical demand and continuously improving our statistics across the statistical system will continue to be an underpinning element of our plans in 2025/26.

The ONS’s Analytical Hub has a close partnership with the Joint Data and Analysis Centre (JDAC) in the Cabinet Office where we directly provide data and strategic analysis to support policy and decision making at the heart of government.

Delivering rapid insights in changing circumstances

We publish a range of statistics to provide timely indicators for users covering the effect of developing world events on the UK economy and society.

For example, the Opinions and Lifestyle Survey (OPN) collects information monthly on a variety of topics relating to people’s experience of daily life and events, including questions about what people feel are important issues, their health and their well-being. The survey’s content changes regularly to keep pace with users’ changing requirements.

The value of ONS surveys such as the OPN and the Business Insights and Conditions Survey (BICS) was prominently demonstrated during the Covid-19 pandemic. They were regularly updated and adapted to reflect changes in policy and our understanding of the virus. The ONS consulted a wide range of other government departments on a regular basis as it developed questions.

Real-time indicators are also invaluable for enhancing and developing core statistics by providing timely insights that complement traditional data sources. For example, during the Covid-19 pandemic, the ONS used real-time data from sources like card spending and mobility indicators to quickly gauge economic activity and societal changes.

Providing safe research environments for accredited and approved researchers

We make de-identified record-level data available to accredited or approved researchers through the Secure Research Service (SRS) and the Integrated Data Service (IDS) to facilitate work on research projects for the public good. The SRS makes static snapshots of data available to researchers using a Windows desktop environment, while the IDS makes flexible views of indexed data available using the tools and scaling available through Google Cloud. The SRS is one of the largest trusted research environments in the UK, with around 6,000 accredited researchers having potential access. Of these, around 1,500 are actively working on research projects at any given time.

Building on the success of the SRS, we are now migrating users with the highest value use cases to the IDS, enabling far greater flexibility and broader ranging analysis work programmes.

This will also enable far greater insight into our core national statistics relating, for example, to trade, employment and growth, and the interactions between them at national, regional and local levels.

Delivering Local and Sub-national Insights

ONS Local is a dedicated analytical advisory service for local government and local decision makers, with teams based in each of the English regions and in Wales, Scotland and Northern Ireland. A core part of the team’s role is engaging with local users to understand data and information needs and gaps, in addition to producing bespoke analysis for their local stakeholders. ONS Local and our subnational statistics together provide unique local information to support national, regional and local leaders’ understanding of topics through a place lens.

To this end, the ONS has developed a tool called Explore Local Statistics, that allows users to find, visualise, compare, and download local data in one place. This service brings together a wealth of data across various topics, including the economy, education, health, and wellbeing, making it easier for local leaders and decision-makers to access the information they need. Users can search for data by postcode, local authority, region, or parliamentary constituency, providing detailed insights into specific areas. The service also allows for comparisons between different areas or clusters with similar demographic or economic characteristics, enhancing the ability to make evidence-based decisions.

How is the UK’s data environment evolving, and what challenges and opportunities does this present official statisticians and analysts?

The UK’s data environment is evolving at pace. There are more data, insights, and opportunities, and the ONS is acting to realise the full value of data while maintaining high levels of trust and transparency.

Challenges remain in streamlining and simplifying approval processes for these new sources and in developing the skills and capability to make best use of them, whilst continuing to address the challenges of collecting information from traditional survey sources.

Data capability and skills within our organisation are key. As part of capability building and future proofing, the GSS is collaborating with the Royal Statistical Society (RSS) on a project about the future of the statistical profession, examining what the role of a statistician, including their skills and training, will look like in the future.

Below we expand on some of the current opportunities and challenges relating to the evolving data environment.

Survey data collection challenges

Changes in society and the pace of technology are having a direct impact on how people perceive their data and interact with surveys and government services.

ONS surveys, both social and business, are the primary data source for statistical producers in the UK and are central to the nation’s most significant economic and social indicators. Despite the increasing use of administrative data, they remain vital.

The sharp fall in household survey response rates, both in the UK and internationally, is well known, often linked to increased difficulty in accessing properties, increased caution about sharing information, and declining trust in government and public institutions. This can affect data quality and has been evidenced most prominently by the impacts of survey response rates on the Labour Force Survey (LFS). The ONS’s plan for a long-term solution to the challenges faced by the LFS remains the online-first Transformed Labour Force Survey (TLFS), more details about which can be found on the ONS website.

Although integration of administrative data sources, which remain our preferred data source, is already taking place to produce economic outputs, we recognise that surveys currently still play a vital role in collecting data from businesses. Direct data collection tools also become more important as AI use increases, given the need for new, high-quality data.

Linked to this, but a separate challenge, are our legacy statistical production systems. As highlighted in our business plan, addressing our IT and coding legacy systems remains a key focus for 2025/26.

A legal requirement to complete ONS business surveys, coupled with predominately online collection, has seen business survey response rates improve to pre-pandemic levels.

What the ONS is doing in relation to survey data collection

The ONS is increasingly integrating a wide variety of data sources to produce high-quality research and statistics and adapting survey methodologies to encourage greater coverage. We have built relationships across the public sector to implement regular flows of anonymised administrative data collected as part of the delivery of frontline services, such as tax, benefits, health and education. We have also built relationships with the private sector and other bodies, enabling secure access to anonymised big data, such as aggregated financial transactions and mobile network operator data, and scanner data from supermarket sales.

The ONS has a vision for a more efficient and effective social survey system delivered through innovative use of administrative and alternative data across the end-to-end survey process. We are also exploring and embedding opportunities to use AI and other innovative digital technology across this process. This will significantly improve how we meet changing data user needs with a sustainable, robust, resilient and agile survey delivery system.

To enable urgent quality improvements to the social surveys feeding economic statistics, we received additional funding from HM Treasury in 2024/25. This was invested in increasing our interviewer numbers, implementing changes to bolster retention, and increasing incentives for survey respondents. This has helped to improve performance on key surveys, but we need to do more to improve performance across our full survey portfolio. Our survey recovery strategy, including focused resources in this area, will help us continue to improve response rates in an increasingly challenging environment.

The ONS is already transforming how we collect data, moving to an integrated and modularised suite of surveys through our Business Survey Strategy. This will reduce the burden placed on businesses and improve engagement, thereby improving the quality of the data we collect. We are also expanding the use of AI and moving to cloud-based collection and production systems.

Administrative data

Many opportunities and challenges are linked to the growing scale and types of data that are available, in particular, the proliferation of administrative data within the public sector and the growth in alternative, big data.

The increasing availability and content of administrative data present the opportunity for the ONS to produce more frequent and timely statistics that sustain a better level of quality over time.

The quality of administrative data varies, with issues including data completeness, differences between concepts (including reporting periods), timeliness and frequency of data deliveries, consistency of data deliveries and the availability of metadata. Nonetheless, linking administrative data with survey data provides valuable insights into how they can be used to improve our understanding of survey bias, and to develop new outputs that make the most of the strengths of all sources.

The ONS has been investigating the use of primarily administrative data to produce annual population estimates, and a range of other types of estimates historically enabled only by the census. The ONS has worked with statistical offices in Scotland, Wales and Northern Ireland to consider the viability of taking similar approaches across the UK, with an agreement on this topic signed in November 2022.

This is an area of ongoing research and development, and the Authority is set to announce its recommendation on the future of the census in England and Wales in the summer.

The ONS has already moved to administrative measures of international migration due to unavoidable challenges with its traditional sources for these statistics. The ONS has confidence in the long-term strategy for migration estimates, and their future coherence with admin-based population estimates, and we are working with users to increase confidence as new methods mature.

A key enabler of this work is improving the sustainability of the supply of administrative data to the ONS, and improvements to their content and quality, working in partnership with data suppliers.

As we consider embedding new methods, and question the role a potential future census might play in the ONS’s long-term statistical design for population statistics, there is a need to balance ambitions for research and development alongside the requirement for a steady state of operational delivery in line with user needs, and resource constraints. This is a challenge for the whole of the UK, not just the ONS.

Data sharing across Government

Through extensive engagement across government departments, we are making some progress in acquiring new administrative data sources. However, this continues to be very time consuming, and each data sharing agreement can take months or years to agree.

Data owners are understandably risk averse, often resulting in complex agreements, with conditions varying significantly between sources. In her Review of the UK Statistics Authority, Professor Denise Lievesley highlighted that ‘systemic and cultural barriers to responsible data sharing between government departments’ hamper the Authority’s efficacy, including the work of the IDS.

We continue to work with other government departments to remove blockers and simplify approval processes, and we hope plans to develop a National Data Library will increase the focus on resolving these challenges.

Data linking

The ONS has developed a suite of core linkage indexes covering businesses, the population and addresses, which enables datasets to be safely de-identified and linked at scale without the need to share personal data with analysts. This linkage enables full exploration of the utility of administrative data for statistics and supports addressing data gaps going forward.

Using this approach, we are aiming to increase standardisation of data production and usage within not only the ONS but across the public sector.

With advances in AI, there is a further opportunity to use new technology to bring efficiencies to data processing and to improve data quality. However, this opportunity cannot be fully realised unless data have foundational quality attributes, such as high-quality metadata, and clear governance, ethical and security frameworks are in place. The ONS is making good progress in piloting the use of generative AI to speed up data processing and improve the user experience.

The National Data Library

The ONS continues to support the Government Digital Service (GDS) and Department for Science, Innovation and Technology (DSIT) on initiatives to improve data sharing, including the development of the Data Ownership Model for Government, identification of Essential Shared Data Assets and data discovery through the Data Marketplace.

Plans for a National Data Library (NDL), whilst still in the discovery phase, give a further opportunity to drive alignment across data sharing infrastructure services.

The ONS has shared key lessons relating to data sharing with DSIT as part of the NDL discovery, and we continue to help shape the longer-term solution. There will clearly be a role to signpost users to the right service, support them in accessing the right platform for their needs, and address data sharing barriers across the ecosystem, particularly to support the government in delivering its missions.

Office for National Statistics

May 2025

UK Statistics Authority response to the Public Administration and Constitutional Affairs Committee’s report on Transforming the UK’s Evidence Base

Dear Mr Hoare,

Now that the Committee has been reconstituted, I write to provide a response from the UK Statistics Authority to your predecessors’ report on ‘Transforming the UK’s Evidence Base’, published shortly before the General Election was called. On behalf of the Authority, I would like to express my thanks to them and their supporting staff for launching this timely inquiry and for the report and recommendations.

This report comes at an important time for the official statistical system, with the Authority due to convene the first ‘Statistical Assembly’ and prepare its next strategy for the statistical system, due to be published in July 2025. In the coming months the Authority will also be making a recommendation to government on the future of population and migration statistics in England and Wales, based on advice from the National Statistician.

In sending this response, I would also like to highlight to the Committee our response to Professor Denise Lievesley’s Independent Review of the Authority, which raised many of the same issues and themes as the Committee’s report. Below, I broadly address the Authority’s view of the key points from each section of the Committee’s report and respond to recommendation 5 in more detail. Appended to this, you will also find the individual responses of Professor Sir Ian Diamond, National Statistician, and Ed Humpherson, Director General for Regulation, on behalf of the Office for National Statistics (ONS) and Office for Statistics Regulation (OSR) respectively, addressing their specific recommendations.

The Authority welcomes the Committee’s engagement on the future of the UK’s statistical system and the opportunities presented by your recommendations. We will continue to keep you updated on our work and progress made towards the recommendations aimed at the Authority, the ONS and OSR respectively.

Yours sincerely,
Sir Robert Chote
Chair, UK Statistics Authority

Delivering evidence for the public good

Recommendation 5: It is time to democratise access to data and evidence. The UK Statistics Authority should establish a framework for identifying and prioritising demands for evidence. We recommend that it use a high-level Assembly (of the kind recently recommended by Professor Denise Lievesley) to draw together information from communities across the UK about their needs for evidence and the benefits new evidence would bring, alongside research on data gaps, and public understanding. We further recommend that the UK Statistics Authority submit its findings on the nation’s demands for evidence to Parliament on a triennial basis, for scrutiny by this Committee.

1. The Authority welcomes the Committee’s fifth recommendation, that both a framework and a high-level assembly be established to identify and prioritise demands for evidence, with its findings submitted to PACAC for scrutiny on a triennial basis. Work is already underway to meet these objectives. In April, the ONS published its Strategic Business Plan for 2024-25, setting out our approach to prioritisation in a constrained financial environment. The Plan makes a commitment to securing the stability and quality of our core statistical outputs across five priority areas. In taking on additional work, the ONS will seek to align its resources to activities and programmes where it is uniquely placed to deliver, and that have the highest impact on the strategic priorities.
2. As set out in the Authority’s response to the Lievesley Review, a UK Statistics Assembly will meet for the first time on 22 January 2025. It will bring together users and producers across sectors to discuss and give advice on the UK’s needs for statistics. The insights will be drawn together in a published advisory report, indicating potential priorities and data gaps for scrutiny by users and by your Committee. They will inform delivery planning for the ONS and other official statistical producers.
3. As well as identifying data gaps, the Assembly’s discussions will provide valuable insights on the quality of statistics, to contribute to shaping the OSR’s regulatory work programme. Following the first Assembly in January 2025, the Authority and stakeholders will review the frequency of future events, including the timing of future reporting to Parliament. As Professor Lievesley has pointed out, there is no precise template for an Assembly so the first will inevitably be an experiment from which we can learn.
4. Additionally, across the ONS, producers of statistics regularly engage with users of statistics across existing forums and advisory groups. For example, the ONS Local team (based physically in the nine regions of England as well as Scotland, Northern Ireland and Wales) acts as the front door for local government to access the ONS and the wider GSS, supporting users to make the most of a wide range of data and analysis.
5. Furthermore, the ONS provides accessible digital content to help audiences find, understand, explore, and act upon its content. These include data visualisations and explorer tools, as well as explanatory articles and bite-sized videos to suit different audiences – available both on its own website and via external channels, including social media platforms. To extend the reach of statistics and data to audiences with whom the ONS traditionally has had less engagement, the ONS works with relevant organisations and citizen representative groups, to help disseminate its outputs as well as inform the design of communications.
6. The ONS has also initiated a Citizen Data project with the aim of securely playing back to citizens the data held about them. This will enable the ONS to engage directly on a one-to-one basis, encouraging individuals to validate their personal data and helping to improve the coverage of characteristics data and public trust in the use and storage of administrative data.
7. OSR continuously seeks to embed the principle of statistics for the public good in its regulatory approach. The Code of Practice for Statistics has clear expectations that official statistics support the needs of a wide range of users, alongside policy makers. OSR is in the process of refreshing the Code and will continue to articulate and strengthen its expectations on this principle. One area of emphasis in the refresh is stronger user engagement.
8. OSR also conducts research to further understand how official statistics can serve the public good. In its recent think piece, OSR proposed that “official statistics serve the public good as public assets that provide insight, which allows them to be used widely for informing understanding and shaping action”. OSR is also undertaking complementary research into how individuals may use statistics in their personal lives. This research is used to strengthen its regulatory approach, and its advice and requirements on producers of official statistics.
9. As mentioned, the Authority’s current strategy, Statistics for the Public Good, launched in 2020 and will end in July 2025. We will look to engage with the Committee to ensure the next strategy reflects views from users, including Parliament.

Navigating new data sources and technologies

10. The Authority acknowledges the Committee’s high-level findings that there has been an increase in data generated across the UK, with a need to bring together ‘old’ and ‘new’ data sources to make best use of it. Valid concerns are raised about the current provision and proficiency of cross-government data sharing and how, if not addressed, this may hamper efforts such as our ambitious future of population and migration statistics programme, which is seeking fully to capitalise on the transformational opportunities offered by administrative data sources.
11. As we outlined in our response to Professor Lievesley’s review, we concur that a drive is needed from the centre of government to strengthen the incentives and ability for departments to share data with one another; the Future of Population and Migration Statistics (FPMS) programme and the successful delivery of the Integrated Data Service (IDS) both require a positive culture shift towards data sharing. Additionally, we seek to be as transparent as possible about what data we are seeking and how we are using them. We therefore see benefit in all the recommendations in this section of the report, and they are explored in more detail in the ONS response.

Evidence in policymaking

12. Several recommendations in this section are made with a view to ensuring the Analysis Function (AF) has the resource and vision it requires to enact significant change and evaluate its future successes. As part of the privacy section of the report, the Committee also suggests that the AF explore options for improving transparency where personal data is used in official analyses. These recommendations are responded to at length in the ONS’s response, given the National Statistician’s role as Head of the AF.
13. I was pleased to note the Committee’s praise for the OSR’s fantastic work on Intelligent Transparency (IT) and their suggestion that its remit should be widened in scope and government communication professionals be trained in the IT principles. Previous work that the OSR has done in this space, and thoughts on these recommendations, can be seen in their response.

Privacy and ethics in an age of data

14. The Committee’s last recommendation that the Centre for Data Ethics and Innovation should continue to monitor public attitudes on the Government’s use of data is welcomed by the Authority.
15. The Authority pays tribute to the work carried out by the Responsible Technology Adoption Unit (RTAU) in the Department for Science, Innovation and Technology (previously CDEI). We also monitor public attitudes towards the use of data and trust in institutions more widely, including an insights paper we published in June 2023. This release summarises our findings on public attitudes, concerns and expectations on the use of data and views on our use of administrative data in publishing statistics.
16. In the past, the Authority has regularly commissioned the National Centre for Social Research (NatCen) to assess independently the public’s knowledge of, and trust in, official statistics, and how they are produced and used, through the Public Confidence in Official Statistics (PCOS) Survey. The most recent PCOS survey results were published by NatCen on 14 May 2024.
17. Additionally, in 2022 and 2023 the ONS was commissioned by the Cabinet Office to run the Organisation for Economic Co-operation and Development Survey on Drivers of Trust in Public Institutions on behalf of the UK Government. The most recent release, for 2023, was published on 1 March 2024. We therefore support this recommendation and would be happy to provide our knowledge and expertise to RTAU to assist with future work seeking to understand public attitudes towards data usage.

Sir Robert Chote, Chair
UK Statistics Authority

The Office for National Statistics (ONS) response addresses the Committee’s recommendations both directly aimed at the ONS, and those where other government departments have joint responsibility. This response focuses on data sharing and the future of population and migration statistics programme, which are inherently linked, and data ethics. It also provides a response to the series of recommendations specifically directed at the Analysis Function.

Data Sharing and the Future of Population and Migration Statistics

Recommendation 1: It is time for Government to do what it promised to do seven years ago, and to join up the UK’s evidence base. Given that the Cabinet Office’s existing initiatives for improving data sharing are self-evidently insufficient, it should in partnership with the Office for National Statistics develop a comprehensive new programme aimed at improving data-sharing for statistical and research purposes. The programme must clearly define deliverables and timelines, and must be owned by a senior responsible officer at an appropriately high level. In line with the recommendations of the Lievesley report, we also recommend that HM Treasury establish mechanisms so that the costs are not borne by individual Departments, but rather centrally. The Cabinet Office should prepare and publish an annual progress report on delivery against the programme.

18. The ONS is strongly supportive of efforts to enhance data sharing across Government. As the largest producer of official statistics, we are dependent on effective data sharing across the public sector and beyond, to support higher-quality, more timely and granular admin-based statistics. The ONS also plays a key role in supporting Government, the devolved administrations and wider academia to access data to support statistical research. As such, the ONS firmly supports the Committee’s recommendation that a cross-government data sharing programme be established.
19. To date, we have worked closely with the Central Digital and Data Office (CDDO) and wider government departments to promote effective data sharing. We also played a leading role in supporting key initiatives to deliver upon the commitments within the 2022-2025 Roadmap for Digital and Data. We have supported a number of initiatives aimed at improving data sharing, including developing the Data Maturity Assessment for Government; the identification (and publication) of Essential Shared Data Assets and developing common governance arrangements to support sharing of data.
20. As the lead delivery partner for the Integrated Data Service, the ONS has also delivered a trusted research environment in the cloud. We are uniquely well placed to facilitate access to a growing library of linked data sources to support collaborative analysis, including to support the development and delivery of Government’s key missions. However, this will only be possible with continued and increased support from key data owners across government.
21. The ONS recognises both the progress that has been made and the substantive challenges that remain to cross-Government data sharing. Therefore, the ONS welcomes the creation of a new Digital Centre of Excellence within the Department for Science, Innovation and Technology. We look forward to working with DSIT to define and implement a programme of work that drives a step change in data sharing to enable statistical and research use cases within Government and beyond.

Recommendation 2: Separately, the Office for National Statistics should publish information on the datasets it is seeking on an annual basis, setting out its rationale for seeking those data, and details on the status of the request – all of which should be made available on the ONS website.

22. The ONS accepts the Committee’s recommendation that we publish information on, and rationale for, the datasets we are seeking on our website annually. This fits well with our strong desire to be transparent about the data we use to support statistical outputs. For example, we already publish a report on the datasets that we have acquired that contain personal identifiers which was most recently updated in July 2024. We are working to expand the coverage of this publication to cover a broader array of alternative and administrative data sources, irrespective of whether the dataset contains personal identifiers or not.
23. We have a broader transparency ambition, which will lead to further publications that provide information about how the data we acquire are processed and the relationship between these data ‘inputs’ and our broad portfolio of statistical outputs. Whilst all the necessary information is available on an individual basis, drawing together all of the elements needed to depict this will require a programme of work across the office, including the development of an enterprise data model. We intend to start with some illustrative examples to test the best ways of presenting what will be a very large amount of information. We will then expand from that point, recognising the need to be both informative and comprehensive.
24. We also acknowledge the importance of being transparent about the data that we have not yet acquired, both in illuminating the progress on key data shares and in conveying a clear sense of our progress in delivering a viable administrative data based statistical system. Therefore, we agree to add data that is in the process of being acquired to our transparency reporting. We will develop the best format for these publications in conjunction with our various suppliers, so that we can appropriately convey the status of an acquisition.
25. We recognise that data sharing is a complex process. Various stages are required to mature sharing arrangements and deliver sustainable supplies of data, and it is necessary to provide a sense of how mature our sharing arrangements are. We must also ensure that we respect commercial sensitivities when naming some suppliers.

Recommendation 4: This Committee’s view – particularly in light of challenges around data-access – is that officials have not yet demonstrated that they can deliver the evidence users need, without a decennial census. We therefore recommend that the Office for National Statistics undertake further work on proposals for the future of migration and population statistics.

26. Both the ONS and the Authority welcome the Committee’s recognition of the opportunities offered by administrative data sources. We also recognise the need to improve the culture of data-sharing across government if we are to maximise those opportunities. This is a challenge that was also highlighted by Professor Denise Lievesley’s review earlier this year, and which the Authority continues to work with partners across the public sector to resolve.
27. Data already held within the public sector mean that population and migration statistics can be consistently more accurate and produced more frequently and quickly. As a result, decision-makers have more, higher-quality information about local populations, their characteristics, where they live and the public services they need.
28. In line with the Committee’s recommendation, the ONS continues its work to develop and improve admin-based population estimates, using innovative new methods and a wider range of data sources, accounting for quality limitations in the data. We published updated estimates as official statistics in development in July, and aim for these to become the official mid-year population estimates in 2025.
29. The Authority expects to publish its recommendation on the future of population and migration statistics in England and Wales in the coming months. This recommendation will draw on extensive engagement with users of these statistics, including through the public consultation last year, and will include the Authority’s proposed approach to the future of the census in England and Wales.

The Analysis Function

Recommendation 10: We recommend that Government reaffirm its commitment to the analysis function, and that HM Treasury review options for its future funding. If Government truly wishes to improve its use of analysis and deliver better outcomes for the public, it clearly needs to fund that change.

30. The ONS remains committed to the Analysis Function (AF), and as such accepts the principle of this recommendation and is grateful for the Committee’s concern about its future funding model.
31. We believe that funding for a dedicated central AF team is essential to ensure that analysts across the Civil Service have the support they need to deliver better outcomes for the public by providing the best analysis to inform decision making.
32. Therefore, the AF Central Team (AFCT) will work with HM Treasury to assess the best option for future funding. As Head of the AF, I will work with Chief Analysts across government to ensure the profile of the Function continues to be raised.

Recommendation 11: In parallel, the National Statistician should review the analysis function’s scope and standard, with a view to defining an achievable set of next-steps, and clear plans for honest evaluations of the function’s success. This review and subsequent evaluations should be made publicly available, so that Parliament is in future better equipped to scrutinise both the Government’s use of evidence and the progress of the analysis function.

33. The ONS accepts the Committee’s recommendation to review the AF’s scope and standard with achievable next steps in mind and will take the points raised away for further consideration. Subject to gaining sufficient funding for the AFCT, the team will review the scope of the AF, which will be reflected in the updated AF Strategy for 2025-2028. The team will also review the AF standard, focusing on any changes needed to reflect the importance of transparency in analysis.
34. With regard to the evaluation of the AF, subject to funding, the AFCT will evaluate the impact of its work to support analysts across government. It will also undertake a light-touch assessment of the impact of analysis more widely, using existing evidence sources, such as the results of the assessment against the AF Standard. This will be complemented by the work of departmental Chief Analysts, who are responsible for evaluating the impact of their analysis and whether it is meeting the needs of their decision makers.
35. The AFCT will update the Committee on the findings of this evaluation, via a letter from the National Statistician to the Chair. The AFCT anticipates that this work will be completed by Q2 2025/26. However, further reviews of the standard, and evaluation of the work of the Central Team, will be undertaken as part of our business as usual.

Recommendation 14: We recommend that, at a minimum, governments in future routinely publish the evidence and data underpinning their major policy announcements. Making this happen will not be a straightforward task, and we suggest that in the first instance leaders of the analysis and communications functions develop options to deliver this ambition, for the consideration of Ministers.

36. The ONS accepts the Committee’s recommendation that options be developed for Ministers on the routine publication of evidence and data underpinning major policy announcements in the future. Transparency in evidence and data underpinning policy decisions is an important matter for the AF and was discussed at an AF Board meeting last year. There is a variety of guidance already in existence in this area, including the Code of Practice for Official Statistics, the Analysis Function Standard and OSR’s guidance on Intelligent Transparency.

37. In line with the Committee’s suggestion, the AFCT will work with the Communications Function, and other bodies such as the Policy Profession, to consider options for routinely publishing the evidence and data underpinning major policy announcements.

Recommendation 15: We recommend that the analysis function explore options for improving transparency around the use of personal data in official analyses, and that this work be made publicly available.

38. The ONS accepts the recommendation that the AF explore options for improving transparency around the use of personal data in official analyses. Subject to sufficient funding, the AFCT will investigate these options, working with relevant bodies that deal with the use of personal data, such as the CDDO and OSR. The AFCT will complete this work by Q4 2025/26.

Data Ethics

Recommendation 16: It is now time to consolidate the excellent exploratory work that has been done on data ethics, and to embed it more formally into the collection, analysis, and communication of evidence in the UK. We recommend that the Cabinet Office’s Central Digital and Data Office and the Office for National Statistics jointly review the varying data ethics frameworks available to analysts across the UK; considering opportunities for greater consistency, and possible accountability mechanisms, to encourage a wider adoption of data ethics across government.

39. The Authority has worked with the CDDO to develop a common understanding of data ethics in the public sector. Our conversations have resulted in agreement with the recommendation that existing frameworks across the UK be reviewed with the aim of encouraging wider adoption of data ethics across government.
40. Regular working-level meetings between the data ethics teams from both departments have been organised, and discussions have included sharing each team’s learnings in the data ethics space. Together, we have discussed the Authority’s data ethics self-assessment tool and the recent landscape review of the responsible use of data-driven technologies in the public sector, in which the Authority participated via an interview, amongst other topics. The Authority’s Centre for Applied Data Ethics (CADE) continues to monitor the impact of the tool along with providing practical support and thought leadership in the application of data ethics by the research and statistical community.
41. As noted in the Government response, recommendations coming out of the landscape review suggested consolidation work on data and AI ethics guidance across government. We will work with CDDO and other partners across government on this exercise.
42. We also concur with the point raised in the Government response about flexibility and context. We agree that harmonisation is desirable in some instances and have discussed shared opportunities with CDDO. But our objective is to promote and safeguard the production and publication of official statistics, and specifically to provide guidance to researchers (from within and outside of government) on the ethics of their research. Therefore, we agree that much of the guidance material we produce must remain distinct from general data ethics guidance produced for central government.

Professor Sir Ian Diamond, National Statistician
Office for National Statistics
September 2024

This response from the Office for Statistics Regulation (OSR) addresses the Committee’s recommendations for OSR as well as the recommendations with joint responsibility across other government departments. Our response has focused on the development of a framework for reporting data gaps across the UK, and on provisions for improving intelligent transparency across government.

Data gap reporting framework

Recommendation 6: We recommend that the OSR support this activity by preparing regular and public reports on data gaps in the UK.

43. OSR accepts the Committee’s recommendation to prepare regular and public reports on data gaps in the UK in principle.
44. Identifying and helping statisticians focus on addressing data gaps is an important aspect of our regulatory work. Our domain (topic area) model of regulation allows us to have strong relationships with statistics producers and organisations who publicly advocate for improved statistics. Our annual report, State of the Statistical System, which was last published on 17 July 2024, is our key publication bringing together the insight provided by our regulatory work. It also shares our views on the performance of the statistical system and the challenges facing it. In line with the Committee’s recommendation, we will have an enhanced focus on data gaps in future editions of this report.
45. Our recently published report, “Data Sharing and Linkage for the Public Good: Follow-Up Report”, highlights the importance of enabling greater data sharing and linkage, in a secure way, for research and statistics. This is also relevant to the government’s ability to respond to data gaps.
46. Additionally, OSR is refreshing the Code of Practice for Statistics to ensure that it remains relevant. We are strengthening the emphasis on involving users in decisions about what statistics are required – whether to start, stop or change official statistics. This includes being clear on where and why user needs can and cannot be met, such as addressing information gaps.
47. Noting recommendation 7, in our aforementioned “Data Sharing and Linkage for the Public Good” report, we call for consistent and sustainable funding streams for data sharing, access and linkage initiatives, and specifically for a centralised government funding structure to support data collaboration projects across government. This structure should prioritise a system-level, access-based approach to investment, as well as continue and expand initiatives such as the Shared Outcomes Fund.

Recommendation 9: We recommend that the Office for Statistics Regulation review and publish a report on the adequacy of UK-wide comparable data, by themes, before April 2025.

48. OSR accepts the Committee’s recommendation that we review and publish a report on the adequacy of UK-wide comparable data. Our State of the Statistical System report highlighted the need for producers to work in partnership across the UK and provided examples of where statistics producers are making improvements to UK comparable statistics and data. We will build on this work and publish an update in 2025 which shares our more detailed views on the adequacy of UK comparability.

Improving intelligent transparency across government

Recommendation 12: We commend the OSR for its work on Intelligent Transparency and recommend that it publish an annual report card on departments’ compliance with its guidance, so that Parliament and external bodies might support it in holding departments to account, and making the case for well-informed policy. Recognising that this important work expands the remit of the OSR beyond official statistics, and into the larger world of government analysis, we also suggest that at the next Spending Review, it works with HM Treasury to agree a sustainable funding model for this work, given the vital role it plays.

49. OSR accepts this recommendation and thanks the Committee for its commendation of our work on intelligent transparency (IT). We agree that this would expand the remit of OSR and would therefore need to be appropriately funded.
50. OSR has an intelligent transparency working group tasked to find ways to promote, and embed, the IT principles across government. We are beginning work to develop proposals for a monitoring and reporting approach for IT across departments. The project will consider incentives and formats for departmental reporting, as well as automated tools for OSR to monitor IT on a rolling basis. We will consider these proposals alongside discussions with HM Treasury on a sustainable funding model for this work.

Recommendation 13: We recommend that all government communications professionals are trained on the OSR’s Intelligent Transparency guidance, and that the Government Functional Standard for Communication be updated to make it clear that officials are expected to comply with that guidance.

51. OSR accepts the recommendation that government communications professionals should receive training on IT and would be happy to work in partnership with the Cabinet Office to achieve this.

52. Since the creation of the IT principles, OSR has continued to promote them across government, working with a range of organisations and professions, including the communications profession. We are very encouraged by the response and engagement from departments to date, and the commitment to supporting public confidence in government statistics and data through transparency. Our ambition is to continue to take these principles to new audiences, including ministerial private offices and Special Advisers.
53. We are also exploring ways to provide our IT materials on our website following feedback from sessions we have delivered previously. This includes developing new guidance on adhering to IT principles on social media.
54. As part of our work refreshing the Code, we are looking to articulate our standards relating to IT more clearly and better highlight how they relate to all those in government involved in communicating statistics, data and wider analysis.
55. Relatedly, we wish to express our support for recommendation 14 of the Committee’s report, that future governments should publish the evidence and data underpinning their major policy announcements. We already encourage IT around policy announcements and government decision making and will continue to do so. Our latest report on Analytical Leadership explores our position on this in more detail. Additionally, I included this issue in my letter to Permanent Secretaries at the start of this year’s general election campaign.

Ed Humpherson, Director General for Regulation
Office for Statistics Regulation
September 2024

UK Statistics Authority supplementary evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on Transforming the UK’s Evidence Base

Dear Mr Wragg,

Following the submission of the Office for National Statistics’ (ONS) written evidence to the Committee’s Transforming the UK’s Evidence Base inquiry on 31st August 2023, I then gave oral evidence to the Committee on 5th September 2023.

One of the topics that I am aware the Committee has been interested in, during the course of this inquiry, is analytical capability across government. I am pleased to be able to provide some additional evidence on this topic, as requested.

Analytical Skills

The Analysis Function (AF) is committed to building skills and knowledge across our community of 17,000 analysts, supporting effective career planning, and ensuring that we have skilled people in the right place at the right time. We have developed a suite of materials, designed to support analysts to navigate their careers across government analysis. These include the AF Career Framework, which features multidisciplinary role profiles and career pathways, as well as career stories showcasing the variety of entry points and available career progression routes.

The Analysis Function remains focussed on providing a learning and skills offer that meets the diverse needs of our community and adds value to the work being delivered by analytical professions and departmental colleagues.

Analysis Function Standard Assessment Framework

In my previous letter dated 5 October, I noted that Departmental Directors of Analysis were asked to undertake a self-assessment against the Standard in 2023, for the first time. The assessment uses a framework designed so that organisations can evaluate how well they are applying all aspects of the Analysis Function Standard. This means that the assessment covers issues beyond analytical capability, such as capacity, governance and structures.

The Cabinet Office mandates that all functions conduct such exercises. The assessments were carried out in Q4 of 2023/24 and we had responses from 21 organisations across government. As the assessment is meant to drive improvements within organisations, only high-level information was returned to the Analysis Function Central Team, under the agreement that these responses would not be shared more widely. The returns showed a mixed picture of strengths and weaknesses across government. The summary information returned has been used to develop further actions to support organisations in meeting the Analysis Function Standard, for example, setting up sessions to share best practice on key areas of the Standard.

Analytical Capability Audit of Policy Professionals

As part of his review of the effectiveness of government functions in 2021, Lord Maude commissioned a review of the analytical capability of policy professionals. The AF worked closely with the Head of the Policy Profession, Tamara Finkelstein, to identify areas of strength in the analytical capability of policy professionals, as well as development areas for improvement. This work further fostered positive working relationships between analysts and policy makers in government.

The report was completed in summer 2022 and has been well received. It is a key evidence base for the analytical skills development agenda across government, for both policy professionals and more widely across the whole Civil Service. This has led to a more robust analytical capability learning offer for all, ultimately ensuring that officials are more comfortable working with and analysing data when developing and delivering public services. The legacy of this work has been highlighted in core reform activities, such as the One Big Thing initiative in 2023, pushing the analytical capability agenda in government.

Cross Government Evaluation Capacity and Capability Survey

An Evaluation Capacity and Capability survey (ECCS) was conducted in Summer 2023. This was in response to a recommendation in the 2021 National Audit Office report Evaluating Government Spending to enhance the evaluation capacity and capability within government. The survey, conducted by the Analysis Function Central Team, aimed to assess government’s specialist evaluation capacity and capability and develop a plan to address any identified shortfalls.

The survey focused on key research questions regarding evaluation skills and experience, confidence in applying evaluation-related skills, understanding of evaluation concepts among non-evaluation practitioners, engagement between analysts and non-analysts, and areas for improvement. The results are currently being analysed and associated recommendations developed, in collaboration with the Evaluation Task Force. High-level results from the survey will be released through a blog.

We remain aware of, and draw on, the work of others who have influence in this space. This includes the Office for Statistics Regulation, from whom the Committee heard evidence on 6 February.

I hope that you find this additional information useful. Please do let us know if we can assist the Committee further on the topic of the Analysis Function, or with any of its other inquiries.

Yours sincerely,

Professor Sir Ian Diamond

UK Statistics Authority supplementary evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on Transforming the UK’s Evidence Base

Dear Mr Wragg,

Following the submission of the Office for National Statistics’ (ONS) written evidence to the Committee’s Transforming the UK’s Evidence Base inquiry on 31st August 2023, I gave oral evidence to the Committee on 5th September 2023. I am now able to provide some supplementary evidence, as requested, on several topics of interest.

The Integrated Data Service (IDS)

As you will be aware, the IDS is a cross-government project, for which the ONS is the lead delivery partner. The project is a key enabler of the National Data Strategy and seeks to securely enable coordinated access to a range of high-quality data assets built, linked and maintained for richer analysis. Please find below some further detail on the background of this project and the progress towards its delivery.

What is the scope of the IDS?

The scope of the IDS is to deliver a secure, scalable, modern data service which operates on a cloud-native platform, hosting a rich and diverse data catalogue of indexed and linkable data, together with the latest data science and generative AI capabilities. The service has been designed to better inform effective policy making.

The vision of the IDS is to address the lack of a central integration platform that can cater for the future needs of both data providers and analysts looking to utilise integrated data to develop cross-cutting analytical results. The IDS builds on the success of the Secure Research Service (SRS) and aims to significantly reduce the time it takes to negotiate access to data and to provision data assets.

The IDS provides a secure environment that enables streamlined data sharing across government, improving the ways that data are made available via cloud-native technologies and modernising the way departments and their professionals operate. The IDS is the first of its kind in the UK and will set the precedent for how data are processed on a cloud-native platform.

When is it expected to be delivered?

The programme has been in development by the ONS over the last 18 months and is funded until March 2025 (under the current Spending Review). After this date, the IDS becomes a live running service.

What is the cost of the programme?

The programme secured funding from HM Treasury (HMT) until the end of the investment period (financial year 2024/25). The cost of the programme is estimated to be £228.7m, which covers the development and running costs from 2020 to 2025. Furthermore, the programme continues to assess funding options beyond March 2025.

Who are the users likely to be?

The IDS is designed for use by accredited analysts, within government and the wider research community. The ambition for the IDS is to have every government analyst, roughly estimated at 14,000 individuals, capable of utilising the platform to better inform decisions for the public good.

What data do you expect to be available on the service?

There are currently 81 datasets available in the IDS from across government. These include high-value data assets on topics such as levelling up, and climate change and net zero. Additionally, work on health data assets is underway, with identified datasets being indexed by the Reference Data Management Framework (RDMF) – which enables multiple datasets to be linked and analysed, creating new comprehensive data assets – and published on the IDS so that analysts can link data according to their requirements.

The programme intends to continue to work with data owners across government and the private sector to acquire more datasets in conjunction with the RDMF. However, this is dependent on data owners signing up to data sharing agreements to make this data available.

In accordance with the Central Digital and Data Office’s roadmap for 2022-25, departments have agreed to share their essential shared data assets across government, including through IDS. This further enables the IDS as a Trusted Research Environment to facilitate and support this commitment.

However, discussions with government analysts have highlighted a range of concerns about how current incentives for departmental data sharing fit with the needs of ministerial-facing departments. There is also a wider financial risk regarding other departments’ ability to fund activity such as data cleansing, which may limit their ability to effectively share data. Although HMT set out the expectation that OGDs will support data sharing in all SR21 settlements, no specific funding was provided, which may limit activity in some cases. As part of the IDS Programme, ONS is working with Chief Data Officers across government to minimise frictions around the sharing of data via IDS. One of the pilots in development is looking at Data Ownership and Stewardship approaches to streamline the governance arrangements and make it quicker for departments to agree to share data via IDS, and for analysts to subsequently access that data for a broad range of analysis for the public good. As always, I would welcome support from the Committee to share and promote the benefits of data sharing across government for the public good.

What safeguards will be in place to protect data?

The IDS is a trusted research environment, which means it adheres to the 5 Safes in accordance with the Digital Economy Act (DEA). The 5 Safes of secure data are as follows:

  • Safe projects – Is this use of the data appropriate, lawful, and ethical?
  • Safe people – Can the users be trusted to use it in an appropriate manner?
  • Safe settings – Does the access facility limit unauthorised use?
  • Safe data – Is there a disclosure risk in the data?
  • Safe outputs – Are the statistical results non-disclosive?

These principles enable the safeguards and governance for the IDS to operate with sensitive data, which in turn ensures public confidence in the security and processing of data. Access to the IDS platform is granted via a secure gateway in line with data legislation; furthermore, the IDS applies strict policies around the cleaning, linkage, validation and control of data.

The IDS Programme is also working across the ONS to develop key governance arrangements through the creation of policies that will enable safeguards and the appropriate use of data. This policy workstream is coordinated by the ONS’s Data Governance, Legislation and Policy and Security and Information Management teams. In developing safeguards, the programme employs the following principles:

  • Adapting successful policies within the ONS and across government analytical communities (e.g., GSS, GSR, GES) that can support the programme.
  • Working with the National Statistician’s Data Ethics Advisory Committee, which is underpinned by the UK Statistics Authority’s (UKSA) ethics framework for the use of data for statistical, research and analytical purposes, to identify and mitigate any potential ethical risks at project-level.
  • Access to all data is controlled through the concept of an analytical ‘project’, with supporting business and technical processes linked to user need.
  • An overarching programme Data Protection Impact Assessment (DPIA) is maintained to define key activities and associated data risks, with continued engagement with the Information Commissioner’s Office as the DPIA is updated and the programme develops.

The programme also adheres to the UK Statistics Authority/ONS Data Protection Policy (required by the Data Protection Act 2018 and the General Data Protection Regulation).

The ONS website

The Committee also asked for some insight into the current condition of the ONS website and any plans to change the site in the future. Below I have outlined our vision for dissemination, of which our website is an integral part, as well as some exploratory work we are undertaking to see how we could use AI technology to address some of the challenges with our existing website.

Our Vision for Dissemination

The ONS website supports the Statistics for the Public Good strategy by helping to build trust in evidence, enhance understanding of social, economic and environmental matters and improve the clarity and coherence of our communication. By helping people to be aware of the ONS and to find, understand and explore our data, statistics and analysis we are giving people the information they need to make decisions, and act, at a national, local and individual level.

Our vision for statistics dissemination goes beyond the website. We want people to have trust in our data and analysis. We know that our users want to find trusted ONS information wherever they look – whether that’s on the ONS website, on social media, in the media or through search engines. Our users want ONS answers to their questions and we are exploring a range of different approaches to serve this need, including providing answers to questions using Large Language Models (LLMs).

Our goal is for users to understand our data, statistics and analysis more quickly and easily, with the right contextual information to help people know how they can use them. We want our users to explore and tailor our information so they can find what is important to them – whether that is by creating their own datasets based on ONS data or through our expert curated view of key insights for the economy or society.

Our priorities for the website in recent years have been delivering the capability to support Census 2021 outputs and ensuring the reliability of the service for all our users, particularly given the additional demand for ONS data on the economy driven by changes in the cost of living. We are currently running a package of work to improve website performance to meet demand, and our next priority will be programmatic access to our data via application programming interfaces (APIs). This will improve the agility with which all users of our data, both internal and external, can consume and gain insights from the ONS website.

We have also focused on improved search both on the ONS website and through greater visibility of our data and insight in search engines and in the media.

This year we are also setting the future direction for how we create and manage our statistical content in a more efficient and structured way to enable business agility and flexibility for our users, aligned to their broad range of needs. This will set out a forward plan to transform ONS data and insight and will make the case for the additional funding needed to deliver on our ambitions.

StatsChat

Additionally, the ONS Data Science Campus is currently exploring how new tools and technology can help the organisation disseminate information more effectively. We have developed a new product, ‘StatsChat’, that uses LLMs to search and summarise text from across our website and present relevant sections of our web pages in response to users’ natural language questions.

We are aiming to make this available to a small selection of users for testing and fine-tuning, so that we can improve the relevance of the responses and provide assurance from a data ethics, data protection and security perspective.

Stakeholder engagement

The ONS conducts a wide range of user and stakeholder insight, consultation and listening exercises. This engagement is essential as it provides us with actionable insights into users’ and stakeholders’ views on the strength of their relationship with the ONS, feedback on its outputs, and how stakeholders access and use our statistics and analysis.

As part of this, the ONS’s Engagement Hub conducts annual stakeholder ‘deep dive’ research and an annual stakeholder satisfaction survey. I understand the Committee is interested in understanding more about these exercises and insights from recent examples.

The deep dive research is conducted through in-depth interviews with senior representatives from around 45 key stakeholder organisations. The stakeholder satisfaction survey is an online questionnaire aimed at a wider range of users from a variety of sectors and roles to provide broader insight. Deep dive participants include those from central and local government departments, devolved administrations, research institutes, think tanks, public bodies such as NHS England and the ICO, international partners, business representative bodies and charities. The stakeholder satisfaction survey reaches similar types of organisations, with a wider range of responses at senior manager, operational, public affairs, analyst, researcher, policy maker and economist levels.

Deep dive interviews took place in summer 2022 and the findings were positive. Many stakeholders said that the organisation had built on and maintained its reputation for independence, trustworthiness, quality and reliability. They also felt that the ONS had developed its reputation for being flexible, agile and responsive to changing needs. Additionally, the ONS was seen to be working more collaboratively with policymakers than it had in the past.

The stakeholder satisfaction survey was conducted in early 2023. It found respondents to be positive across key sentiment measures on trust, quality, and on the ONS producing statistics which are relevant to issues of the day. There were also positive views expressed about the ONS as an organisation with reliability, responsiveness, and willingness to help being cited. It was also noted that ONS staff were knowledgeable and helpful.

There were areas highlighted for improvement in both the stakeholder deep dive and satisfaction survey. These included how the ONS works with both devolved governments and heads of the statistical profession in government departments; improving the ease of finding the right people to speak to in the organisation; and more regular, strategic overviews of the ONS’s work (for stakeholders to be able to connect different topics better). Some participants referenced a need for further scrutiny to understand some data anomalies which had occurred in mid-2022.

These findings are shared throughout the ONS, including with the National Statistician’s Executive Group, and are used to inform planning and prioritisation. We have implemented measures to respond to the issues raised as part of a wider programme of ongoing external affairs improvements, which we continue to monitor with further research.

The ONS conducted a subsequent stakeholder deep dive in autumn 2023 and is currently analysing the findings. The latest ONS annual stakeholder satisfaction survey is currently live and will be open for responses until 22 January 2024.

Full business case on population and migration statistics improvements

As you are already aware, next year I will be making a recommendation to Government on the future of the population and migration statistics system in England and Wales. I understand that the Committee has requested some additional detail surrounding the financial aspects of this transformational work.

In the outline business case for the Future of Population and Migration Statistics programme, initial cost estimates of a potential census in 2031 range from £1.3 billion to £2 billion, with increases expected across all phases of such an operation.

The ONS is working to produce a full business case (FBC) for our proposals to improve our population and migration statistics. The FBC will be developed in the context of the forthcoming recommendation to UK Government, and the response from Government. At this stage, while the recommendation remains in development, it is difficult to provide an accurate updated estimate of cost.

The FBC is expected by HM Treasury in late 2024. We will be able to provide the Committee with further information on costs at a later date.

Migration statistics

As part of improving population statistics we are also transforming international migration statistics. Our latest estimates, for the year to June 2023, are official statistics in development and are provisional. We revised our June 2022 and December 2022 estimates upwards due to a combination of more data and methodological improvements.

International migration estimates are produced using three key sources: Home Office border data linked to a person’s travel visa for non-EU nationals, which made up 82% of total immigration in 2023; tax and benefit data (known as RAPID) for EU nationals; and International Passenger Survey data for British nationals. We are most confident with Home Office border data and have an ambition to produce all migration statistics from these data in future.

We work very closely with the Home Office to procure and use border data linked with visa data to produce migration estimates. The availability of free movement for British nationals and some EU and non-EU nationals makes the current method a challenge for those who do not require visas. However, there are further data held by the Home Office, known as Advanced Passenger Information, that would help with our research, particularly for British nationals. We have requested these data and would like to see the Home Office accelerate its response to this request.

Census 2021 data confirmed our position that the administrative data we use for non-British nationals are robust and that International Passenger Survey data do not measure actual migration patterns well, due to people changing their intentions. Rather than rebasing once a decade, following a decennial census, to correct for any drift in our population estimates, we aim to produce statistics that do not ‘drift’ from the truth. Our Dynamic Population Model-based population statistics show how drift in both population and migration statistics can be mitigated. That does not remove the need to revise estimates as the data and methods mature.

Long-term international migration uses the UN definition of a migrant, that is, someone who changes their country of residence for 12 months or more. To produce timely estimates, we therefore have to make assumptions based on previous behaviour. As more time passes, we are able to update those assumptions with data on actual travel, and we therefore become more confident in our estimates over time. For example, our June 2022 estimates now have complete data to show whether a migrant has stayed or left for 12 months, and we therefore have less uncertainty around those estimates compared with the provisional June 2023 estimates.

We have recently published experimental uncertainty measures for our admin-data based migration estimates for the first time. These show our users how our confidence increases once we have complete data that meet the required definition.

We also described the nature of provisional estimates that are subsequently revised and the reasons behind these revisions. This was picked up and presented accurately in the media and reflected in conversations with our core users. The Office for Statistics Regulation (OSR) recently published a review of their recommendations on migration statistics. The OSR considered that we sufficiently described uncertainty to our users, although we recognise these measures are experimental and we will continue to update our users as they develop.

I hope that you find this additional information useful. Please do let us know if we can assist the Committee further on any of the issues discussed in this letter, or with any of its other inquiries.

Yours sincerely,

Professor Sir Ian Diamond

UK Statistics Authority follow-up written evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on Transforming the UK’s Evidence Base

Dear Mr Wragg,

When giving evidence to the Public Administration and Constitutional Affairs Committee on 5 September 2023, I promised to follow-up on a couple of points with various members of the Committee.

GDP Revisions

Firstly, I agreed to let you know if I was aware of similar revisions happening in other comparable countries.

As I outlined in the Financial Times recently, the UK’s official economic statistics are rightly seen as among the world’s best. This includes the recent upgrade of our official estimates for economic growth in the pandemic years of 2020 and 2021. The latest Organisation for Economic Co-operation and Development (OECD) information shows that the UK is one of the first countries in the world to estimate the 2020 and 2021 coronavirus (COVID-19) pandemic period through the detailed Supply and Use framework. This standard economic framework enables us to confront our data at a much more granular level for products and industries. The OECD provides a real-time GDP vintages database in its main economic indicators, which takes data directly from national statistics institutes.

Each country will follow different revision policies and practices, which can result in their estimates being revised at a later date, according to their own needs. The timing and impact of revision changes will depend on data availability and magnitude, with large annual structural surveys being the data source needed to make detailed product and industry changes. These annual data sources come with lags on timeliness, often being available up to 2 or 3 years later.

We have now seen revisions to GDP estimates published by other countries. As we previously announced, the 2021 GDP estimate for the UK was revised to 8.7 percent growth from our initial estimate of 7.6 percent, a revision of +1.1 percentage points. The Spanish statistical agency has now published 6.4 percent growth in GDP for 2021, compared with the previous estimate of 5.5 percent, a revision of +0.9 percentage points. The Netherlands has now published 6.2 percent growth for 2021, revised from an initial estimate of 4.9 percent, a revision of +1.3 percentage points. Italy has now published 8.3 percent growth for 2021, revised from an initial estimate of 7.0 percent, a +1.3 percentage point revision. All are upward revisions for 2021 of a similar magnitude to that observed in the UK. Conversely, the United States has now published 5.8 percent growth for 2021, compared with a previous estimate of 5.9 percent, a revision of -0.1 percentage points [ONS own calculations based on published US data from www.bea.gov]. This highlights that revisions can differ across countries.

Strengthening the Analysis Standard

Secondly, I promised to examine whether there is a case for strengthening the Analysis Standard. I am passionate about ensuring the robustness of the Analysis Standard and welcome the Committee taking an interest in its strength and its application across Government.

The Analysis Function Standard, which was updated earlier this year, is part of a suite of management standards that promote consistent and coherent ways of working across government, and provides a stable basis for assurance, risk management and capability improvement.

In my letter to you of 18 September regarding the Committee’s report, ‘Where Civil Servants work: Planning for the future of the Government’s estates’, I emphasised my work to promote transparency in Government Analysis through my role as Head of the Analysis Function. I am keen to take every opportunity to champion the Standard across government and will reiterate the importance of this area at October’s Heads of Function board meeting.

The Standard is very clear on expectations about transparency in the commissioning, production and publishing of analysis. It also has clear messaging about compliance with the Code of Practice for Statistics and other official guidance for the remaining analytical professions, including the Aqua, Green and Magenta Books.

It is my expectation that all departments closely follow the principles in these sets of guidance, and through the Analysis Function Standards Steering Group we monitor and scrutinise these documents to ensure their continued effectiveness.

For the first time this year, all Departmental Directors of Analysis undertook a self-assessment against the Standard, and in response we are starting a series of action groups to drive improvements, including in departments’ compliance with official guidance.

I will keep the Analysis Function Standard under close review and, where necessary, strengthen the messages in it.

Please do let us know if you have any other questions, and if we can help the Committee further on either of these topics or any of its other inquiries.

Yours sincerely,

Professor Sir Ian Diamond

UK Statistics Authority correspondence to the Treasury Select Committee on revisions within Blue Book 2023

Dear Ms Baldwin,

Thank you for your letter of 14 September 2023 regarding revisions within Blue Book 2023. To take your four points in turn:

  1. An overview of the main drivers of these revisions, and whether there were particular circumstances (including those arising from the pandemic) in 2020 and 2021 that made early estimates of GDP especially uncertain.

As I outlined in the Financial Times recently, the UK’s official economic statistics are rightly seen as among the world’s best. This includes the recent upgrade of our official estimates of economic growth in the pandemic years of 2020 and 2021.

It is certainly true that the large shifts in activity, and in many cases the means of delivering that activity, made it harder for all statistical agencies to measure economic activity during the pandemic. But it is equally true that the larger revisions we have seen to our 2020 and 2021 GDP estimates are proportionate to the much larger falls and recoveries in activity seen over these periods.

The main drivers of revision in our 2020 and 2021 GDP estimates come from these changes in activity. For example, the health service incurred increased costs (e.g. protective equipment and extra staff) to deliver a reduced amount of output during 2020, which increased the intermediate consumption and decreased the value added of the health sector. During 2021 these intermediate consumption costs continued to rise, but more slowly, while output volumes increased massively with the return of mainstream health activities such as elective surgery, as well as the COVID-19 vaccination programme, so value added then grew strongly.

Secondly, retailers and wholesalers also changed the way they operated, with specialist stores forced to close or limited to click and collect, and a much larger proportion of transactions completed online. This changed the retail and wholesale margins element in 2020, which then partially swung back the other way in 2021 as retailers, especially those selling clothing and textiles, saw a strong recovery.

The third driver of revisions was inventories data, where our annual, more complete data sources indicated that businesses undertook more stock building than previously thought at the start of the pandemic, when restrictions were quickly introduced. For more detail, please see our article of 1 September 2023.
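The first driver above rests on the standard national accounting identity that gross value added equals output less intermediate consumption. A minimal, purely illustrative sketch of that mechanism (the figures are invented to show the arithmetic, not ONS health-sector data):

```python
# Illustrative only: gross value added (GVA) = output - intermediate consumption.
# All figures are invented to show the mechanism, not actual health-sector data.

def gross_value_added(output: float, intermediate_consumption: float) -> float:
    """Value added is the output a sector produces less the inputs it buys in."""
    return output - intermediate_consumption

# Pre-pandemic: a stable ratio of intermediate consumption to output.
gva_2019 = gross_value_added(output=100.0, intermediate_consumption=40.0)

# 2020: output falls while input costs (protective equipment, extra staff) rise,
# so value added is squeezed from both sides.
gva_2020 = gross_value_added(output=90.0, intermediate_consumption=55.0)

# 2021: output rebounds strongly (elective surgery, vaccination programme)
# while input costs rise only slowly, so value added recovers.
gva_2021 = gross_value_added(output=120.0, intermediate_consumption=58.0)

print(gva_2019, gva_2020, gva_2021)
```

Early estimates assumed the historically stable intermediate consumption ratio, which is why revisions followed once annual data revealed how far that ratio had moved.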

  2. An explanation of what has been learnt from these revisions about what may have been wrong with the earlier estimates, and what improvements the ONS will implement from what it has learnt.

Our early monthly and quarterly estimates for GDP followed the standard ONS procedures using the available ONS data sources. The challenge was the sheer scale of fundamental change in the economy in such a short space of time. The ratio of intermediate consumption to final output is usually very stable, and as a result the ONS did not have any data sources capturing changes to this ratio for periods beyond the latest supply and use balanced year, which was 2018 when the pandemic started.

We have now sourced intermediate consumption data on a more timely basis for the health service with quarterly and annual data available within a month or two of the reference period. We are also investigating the use of administrative tax data (VAT) on purchases by businesses as a means of identifying changes in the intermediate consumption ratio more quickly across industry.

We have welcomed the recently announced review by the Office for Statistics Regulation, and look forward to their recommendations as one of the themes relates to “Potential improvements to early estimates of GDP enabled through enhanced access to data”.

  3. An outline of whether the ONS expects similarly large revisions to GDP data for 2022, in either direction, and, more broadly, whether the ONS sees revisions of this size as exceptional or typical.

The revisions profile of GDP estimates for 2022 and for the first half of 2023 was published on 29 September in the Quarterly National Accounts. There was little to no revision to previously published GDP from 2022 onwards: only one of the last six quarters was revised. The quarterly growth rate of GDP across all of 2022 was unrevised, while growth in 2023 Q1 was revised up 0.2 percentage points and 2023 Q2 was unrevised. With this release, we observed that revisions for that period are more typical of the pre-pandemic era.

As part of our continual improvement, we have already implemented the new health intermediate consumption data to reduce the potential for revision in this large sector of the economy. While other work looking at wider intermediate consumption continues, we have proactively reviewed areas such as rail transport and air transport to ensure that the intermediate consumption ratio of 2021 does not apply directly to 2022 as well, where we can see clear evidence of a recovery in those sectors. As part of the OSR review of GDP, ONS has committed to provide additional revision analysis of our GDP estimates in October 2023.

  4. Given the ONS notes that it has completed its revisions to GDP using a Supply and Use Table framework ahead of many other countries, what it expects may happen in comparator countries when they undertake their own similar analysis.

Each country follows different revision policies and practices, which can result in its estimates being revised at a later date, according to its own needs. The timing and impact of revisions will depend on data availability and magnitude, with large annual structural surveys being the data sources needed to make detailed product and industry changes. These annual data sources come with a lag, often only becoming available two or three years after the reference period.

We have now seen revisions to GDP estimates published by other countries. As we previously announced, the 2021 GDP estimate for the UK was revised to 8.7 percent growth from our initial estimate of 7.6 percent, a revision of +1.1 percentage points. The Spanish Statistical Agency has now published 6.4 percent growth in GDP for 2021, compared with the previous estimate of 5.5 percent, a revision of +0.9 percentage points. The Netherlands has now published 6.2 percent growth for 2021, revised from an initial estimate of 4.9 percent, a revision of +1.3 percentage points. Italy has now published 8.3 percent growth for 2021, revised from an initial estimate of 7.0 percent, a revision of +1.3 percentage points. All are upward revisions of a similar magnitude to that observed in the UK. Conversely, the United States has now published 5.8 percent growth for 2021, compared with a previous estimate of 5.9 percent, a revision of -0.1 percentage points [ONS own calculations based on published US data from www.bea.gov]. This highlights that revisions can differ across countries.
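The percentage-point revisions quoted above are simple differences between revised and initial annual growth estimates. A minimal sketch of the calculation, using the annual growth figures quoted in this letter:

```python
# Revision in percentage points = revised annual growth estimate - initial estimate.
# Figures (2021 GDP growth, per cent) as quoted in this letter.
estimates = {
    "United Kingdom": (7.6, 8.7),
    "Spain": (5.5, 6.4),
    "Netherlands": (4.9, 6.2),
    "Italy": (7.0, 8.3),
    "United States": (5.9, 5.8),
}

for country, (initial, revised) in estimates.items():
    revision_pp = round(revised - initial, 1)
    print(f"{country}: {initial}% -> {revised}%, revision {revision_pp:+} pp")
```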

Please do let me know if you have any further questions about this topic or if I can be of assistance to the Committee on any other matter.

I am copying this letter to Rt Hon Greg Clark MP, Chair of the Science, Innovation and Technology Committee, and William Wragg MP, Chair of the Public Administration and Constitutional Affairs Committee.

Yours sincerely,

Professor Sir Ian Diamond

UK Statistics Authority written evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on Transforming the UK’s evidence base

Dear William,

I write in response to the Committee’s call for evidence for its new inquiry, Transforming the UK’s Evidence Base. I very much welcome this inquiry with its focus on the future of data, statistics and analysis in government, as we in the UK Statistics Authority look to the future in a variety of ways.

As you will be aware, the Office for National Statistics (ONS) launched a consultation on the future of population and migration statistics in England and Wales in June. I enclose written evidence from the National Statistician, Sir Ian Diamond, within this submission, which highlights not only this consultation but also the progress of data sharing in government so far, how the Authority uses data ethically and protects users’ privacy, and how the ONS understands and responds to user needs.

Meanwhile, the Office for Statistics Regulation (OSR) published its review into data sharing and linkage for the public good in July. I also attach written evidence from the Director General for Regulation, Ed Humpherson, within this submission, in which the OSR discusses the findings of this report in more detail. In addition, the OSR will soon launch a review of the Code of Practice for Statistics, seeking feedback to ensure it remains relevant for today’s world of data and statistics production. We will provide more detail to the Committee on this soon.

Sir Ian Diamond, Ed Humpherson and I stand ready to engage with the Committee to expand on any of these points if helpful, and indeed will follow all oral evidence sessions of the inquiry with interest.

Yours sincerely,

Sir Robert Chote

Chair, UK Statistics Authority

Office for National Statistics response

Data and analysis in government 

How are official statistics and analysis currently produced? 

  1. Official statistics are defined as those produced by organisations named in the Statistics and Registration Service Act 2007 (the 2007 Act) or in the Official Statistics Order (SI 878 of 2023). The Code of Practice for Statistics (the Code) sets the standards that producers of official statistics should follow. The Office for Statistics Regulation (OSR) sets this statutory Code, assesses compliance with the Code, and awards the National Statistics designation to official statistics that comply with the highest standards of the Code.
  2. The majority of official statistics are produced by statisticians operating under the umbrella of the Government Statistical Service (GSS), working in the Office for National Statistics, UK government departments and agencies, or one of the three devolved administrations in Northern Ireland, Scotland and Wales. Every public body with a significant GSS presence has its own designated Head of Profession for Statistics. Each of the devolved administrations has its own Chief Statistician. The Concordat on Statistics sets out an agreed framework for statistical collaboration between the UK Statistics Authority, the UK Government, and the Northern Ireland, Scottish and Welsh Governments.
  3. The Analysis Function brings together the 16,000 analysts working across government. Its strategic aim is to integrate analysis into all facets of Government, building on the strengths of the professions. The Analysis Function supports analysis in Government through capability building, sharing good practice, championing innovation, and building a strong analytical community. The National Statistician is Head of the Analysis Function, as well as of the GSS. Each Government department has a Departmental Director of Analysis, who is responsible for analytical standards in their department. The network of Departmental Directors of Analysis forms the leadership of the Analysis Function, driving and delivering the functional aims.
  4. A range of analytical techniques and sources of evidence, used individually or in combination and including official statistics, can provide insights into key questions for the public and decision makers. Such analytical processes and products are supported by guidance such as the Green Book (appraisal of options), the Magenta Book (evaluation) and the Aqua Book (quality assurance), which set the highest standards for government analysis.
  5. Official statistics and analysis across government are currently produced in line with the Code and its three pillars: trustworthiness, quality, and value.

How successfully do Government Departments share data?

  1. Successful data sharing across government departments is critical to operating and transforming the statistical system and producing high quality, trustworthy, and valuable analyses. The importance of sharing, and linking, data and putting data at the heart of statistics is set out in our current consultation on the future of population and migration statistics in England and Wales.
  2. There are some good examples of effective data sharing across government. The COVID-19 pandemic illustrated the ability of government and public services to use and share data to help and protect people. When data are shared effectively, the speed at which analysis can be done means time-critical policy issues can be understood and addressed quickly. For example, we created the Public Health Data Asset (PHDA), a unique population-level dataset combining, at individual level, data from the 2011 Census, mortality data, primary care records, hospital records, vaccination data and Test and Trace data, which allowed us to link across these data sources to provide new insights.
  3. Cross-government networks and the Analysis Function have a critical role in the success of cross-departmental data sharing. For example, the Data Liaison Officer network, the National Situation Centre, the ONS and the Analysis Function recently collaborated to produce guidance on data sharing for crisis response. This built on the principles developed and utilised during the COVID-19 pandemic.
  4. However, there are challenges to building on this success and maintaining the momentum that built up during the pandemic. Data sharing between departments continues to carry asymmetric risk: the perceived risks – legal, operational or reputational – fall on the supplier or department sharing the data, while the benefits are diffuse across the system, or perceived to accrue to others. There are several common challenges to data sharing by suppliers, for example their level of risk appetite and differing interpretations of the law, their data preparation (and accordingly, data quality) and engineering capacity, and the governance within their own organisations.
  5. In terms of risk appetite, even when the environment is judged to be safe and secure by internal and external parties, there is still too much weight being placed on the risks of data sharing as opposed to the very real risk to the public of policy harm and loss of opportunity where valuable data is not being actively used and shared. The OSR notes this point in their report on data sharing and linking.
  6. Another challenge is that agreements to share data are often narrow: from one department to another for a specific purpose, for example a piece of analysis specific to a policy area or statistical output. It is often challenging to broaden these agreements, which limits the extent to which data can be reused or shared across multiple departmental boundaries. This creates inefficiency, where the value of data is not fully realised, and causes government departments to incur unnecessary and duplicative costs in implementing numerous bilateral arrangements, often with the same party.
  7. The level of data maturity across departments is also varied, which leads to a multitude of different approaches to, and interpretations of, agreeing data sharing, a wide range of people being involved in approving data ownership and stewardship, and a myriad of different templates for formalising agreements. This contributes to a complex and burdensome system, with long lead-in times to agree data shares. The issue is particularly acute when data are brought together and integrated from multiple departments, necessitating different governance processes to be engaged each time a change is required.
  8. The ONS provides an ‘Acquisition Service’, which proactively supports data suppliers and collaborates to put in place mechanisms which support the sharing of data, reducing the burden on the supplier as far as possible. For example, this could include seconding analysts into a department, drafting Memoranda of Understanding on behalf of suppliers, and agreeing to undertake significant improvement of the data to make it of high and usable quality.
  9. The ONS is the lead delivery partner on behalf of government to deliver a cross government major programme, the Integrated Data Service (IDS), a Trusted Research Environment (TRE) which seeks to build on the success of the Secure Research Service (SRS), to bring together ready-to-use data to enable faster and wider collaborative analysis for the public good. The IDS intends to transform the way that data users share and access government data. Firstly, the IDS is a fully cloud-native system which will further enable connectivity across a federated data environment, reducing the friction caused by sharing data multiple times. Secondly, it will provide the facility to fully exploit the opportunities for safe and secure access to data provided for in the Digital Economy Act 2017 (DEA). Thirdly, it will apply a common linkage approach to enable analysts to join data from different departments, repeatedly, to meet diverse analytical requirements.
  10. The ambition is that the IDS will help overcome some of the existing challenges, costs and delays to effective data sharing across government. Its success, however, will depend on the extent to which government departments can embrace a common approach to sharing, stewarding, linking and accessing data.

How do other nations collect and produce statistics? 

  1. The UK is connected with other National Statistical Organisations (NSOs) on both a multilateral and bilateral basis, to learn and to share best practice. The UK is represented at the highest levels in multilateral fora such as the Organisation for Economic Co-operation and Development (OECD) and the United Nations Economic Commission for Europe (UNECE) and participates in a number of working groups advancing the use of administrative data and new forms of data. For example, we recently presented our work on nowcasting to the UNECE Group of Experts on National Accounts which gathered interest from other countries such as Austria, Canada, Indonesia and the United States.
  2. The UK chairs the UNECE Expert Group on Modernising Statistical Legislation, ensuring that statistical legislation and frameworks are equipped to deal with a changing data ecosystem. This has led to further work on data ethics, social acceptability, and access to privately held data. We also sit on the UNECE Task Force on Data Stewardship, which aims to develop a common understanding of the concept of ‘data stewardship’ and define the role of NSOs as data stewards.
  3. We are regularly contacted by other NSOs to share our experiences of utilising data science techniques to produce high quality data in near real time to inform decision making. For example, the ONS recently hosted the German-led Future of Statistics Commission to share our experiences of utilising new data and new methodology to keep up with societal needs and respond efficiently to emerging crises. We have hosted colleagues from Statistics Finland to discuss the data landscape in the UK and colleagues from Statistics New Zealand to discuss challenges faced with data collection. We have also shared experiences on our real-time economic indicator suite, particularly on new sources of data such as card data.
  4. The ONS has been involved with the World Health Organisation’s Pandemic Preparedness toolkit project. This project calls upon the Authority’s pandemic response experience and expertise to develop a toolkit containing practical guidance, statistical methods, knowledge products, case studies and training materials for other NSOs, particularly for data sharing.
  5. As part of the International Census Forum, the ONS has had ongoing conversations with the US Census Bureau, Statistics Canada, Australian Bureau of Statistics, Statistics New Zealand and CSO Ireland about the use of administrative data for population statistics. All participants have benefited from these conversations over the last few years, covering subjects such as census collection planning and efficiencies, quality assurance, processing improvements and contingency plans.
  6. The UK is a member of the UN Statistics Division (UNSD) Collaborative on the Use of Administrative Data for Statistics. This group of countries and regional and international agencies was convened by the UNSD and the Global Partnership for Sustainable Development Data (GPSDD), with the aim of strengthening countries’ capacity to use administrative data sources for statistical purposes, including replicating census variables.
  7. The Collaborative provides a platform to share resources, best practices and experiences. This includes a self-assessment tool, a draft toolkit for quality assessment of administrative data sources, and an inventory of resources which contains recommendations and practical examples on the use of administrative data in different contexts.

The changing data landscape

Is the age of the survey, and the decennial Census, over?

  1. The ONS’s vision is to improve its statistics so that they can respond more effectively to society’s rapidly changing needs. The ONS is proposing to create a sustainable system for producing essential, up-to-date statistics about the population. To do this, the system would primarily use administrative data like tax, benefit, health and border data, complemented by survey data and a wider range of data sources. This could radically improve the statistics that the ONS produces each year and could replace the current reliance on, and need for, a census every ten years.
  2. Producing high-quality, timely population statistics is essential to ensure people get the services and support they need, both within their communities and nationwide. Population statistics provide evidence for policies and public services, as well as helping businesses and investors to deliver economic growth across the country. It is important that these statistics are up to date and reliable, so that they can accurately reflect the needs of everyone in society. Currently, the census provides the backbone of these statistics, offering a rich picture of our society at national and local levels every ten years. Every year, the ONS brings together census data with survey and administrative data to reflect changes in society. As a result of this approach of ‘rolling forward’ estimates year-on-year, statistics become less accurate over the ten years between censuses and local detail on important topics becomes increasingly out of date. After each census, the previous decade’s mid-year population estimates are “rebased” to ensure they are consistent with the baseline estimate from the new census. This makes the previous decade’s mid-year population estimates as accurate as possible.
  3. There has also been a well-documented global trend of declining response rates to surveys and censuses, which can impact representativeness of the data and, as a result, data quality. While Census 2021 in England and Wales enjoyed a high level of public engagement and response, this is an outlier in a wider trend of population censuses and social surveys across the world.
  4. Data collection is costly, and these costs can be elevated when the need arises to incentivise survey response (often monetarily), chase response many times (particularly in the case of the census) or adjust collection operations. In recent examples where response rates to censuses have been below target, mitigations have included extending the response deadline, boosting communication campaigns, and making greater use of administrative data to enable the production of robust estimates, all of which can add to cost.
  5. Building on recent advances in technology and statistical methods, and legislative facilitation of data-sharing across government for statistical purposes, the ONS has for several years been researching the use of administrative data as a primary source for meeting user needs for some statistics. For population statistics specifically, this work is responding to the Government’s ambition, set out in 2014, that censuses after 2021 “be conducted using other sources of data and [provide] more timely statistical information.”
  6. This research has shown that the ONS can produce population estimates with a more consistent level of detail and accuracy over time, and migration estimates based on observed travel patterns rather than respondents’ stated intentions, using administrative data to respond to the difficulties of estimating internal and international migration. The ONS has also developed methods for producing information about the population more often and more quickly. These methods will offer insights into our rapidly changing society as administrative data reach their full potential over the next decade.
  7. In June 2023, the ONS launched a consultation on its proposals for the future of population and migration statistics in England and Wales, responses to which will inform a recommendation to Government. The consultation’s proposals emphasise that surveys may continue to play an important role whilst ONS works with partners to widen and develop the range of administrative data sources that are collected and used. However, the ONS believes it has reached a point where a serious question can be asked about the role the census plays in its statistical system.
  8. If implemented, the proposed system would respond more effectively to society’s changing needs by giving users high-quality population statistics each year. It would also offer new and additional insights into the changes and movement of our population across different seasons or times of day. For many topics, it would provide much more local information not just once a decade but every year, exploring them in new detail and covering areas not recorded by the census, such as income. These are ambitious changes, and decisions on the next phase of this work will set the direction for the ONS’s work programme over the coming years, as the ONS continues to improve its population and migration statistics.
  9. It is worth noting that fast-paced, qualitative surveys have had, and will continue to have, a place in the statistical system. The Opinions and Lifestyle Survey and the Business Insights and Conditions Survey in particular illustrate the worth of flexible surveys, adapted from sources of rapid intelligence during the pandemic into useful tools for understanding current issues at pace, from the cost of living to the adoption of AI.
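The ‘rolling forward’ and ‘rebasing’ of mid-year population estimates described in point 2 above follow the standard cohort-component accounting identity. A minimal, purely illustrative sketch (all figures are invented, and the linear spreading of the intercensal error is one simple rebasing approach, not the ONS method):

```python
# Illustrative roll-forward of mid-year population estimates from a census
# baseline, then rebasing once a new census reveals the accumulated error.
# All figures are invented to show the mechanism.

def roll_forward(population: float, births: float, deaths: float, net_migration: float) -> float:
    """Cohort-component identity: next estimate = base + births - deaths + net migration."""
    return population + births - deaths + net_migration

# Roll forward from a census-year baseline for three years.
estimate = 1_000_000.0
components = [(12_000, 9_000, 4_000), (11_500, 9_200, 5_000), (11_000, 9_500, 3_000)]
rolled = []
for births, deaths, migration in components:
    estimate = roll_forward(estimate, births, deaths, migration)
    rolled.append(estimate)

# A new census finds more people than the rolled-forward series implies, so the
# intercensal estimates are rebased; here the error is spread linearly.
census_count = rolled[-1] + 6_000
error = census_count - rolled[-1]
rebased = [est + error * (i + 1) / len(rolled) for i, est in enumerate(rolled)]

print(rolled)
print(rebased)
```

The sketch shows why accuracy drifts between censuses: each year's estimate inherits any error in the previous year's, and only a new baseline reveals how far the series has wandered.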

What new sources of data are available to government statisticians and analysts? 

  1. The ONS has long been a heavy user of census and survey data and, more recently, government administrative data. Given the new global data landscape and the increasing digitisation of society and the economy, new massive, high-frequency data sources (‘big data’) are being generated, some proprietary and some openly available. These new data sources offer an opportunity for the ONS to be radical and ambitious in what data it uses – in line with the Authority’s strategy, Statistics for the Public Good – and how, to provide better, highly trusted information and statistics to the public, the private sector and the public sector (including government) at national and regional level, across a wide range of issues.
  2. A key part of the current consultation on the future of population and migration statistics seeks input on transforming population and migration statistics using alternative data, as opposed to using predominantly survey-based sources. The ONS has now published a suite of evidence demonstrating the opportunity to use alternative data sources to deliver more timely and granular statistics as well as provide value for money.
  3. To support statistical transformation and the strategy, the ONS works with hundreds of independent data suppliers including central, devolved and local government, and the private sector, to share data for public good research. The ONS’ most recent data transparency publication demonstrates the breadth of data sources containing personal data, and the broad opportunity to use novel data sources to support statistics.
  4. The ONS uses around 700 data sources, including health, tax, benefits, education and demographic information from other government departments and public bodies. We also work with a wide range of data from commercial providers including retail scanner data, where coverage is around 60-70% of the grocery market, alongside financial transaction data, domestic energy usage and others.
  5. A recent example of the ONS harnessing the power of new data sources is published data on UK Direct Debits, developed with our partners at Vocalink and Pay.UK. This new, fast and anonymised data source provides insight on consumer payments to most of the UK’s largest companies. It gives the ONS, for the first time, an opportunity to rapidly analyse price movements such as changes in the average Direct Debit amount for bills, subscriptions, loans, or mortgages as well as overall payment behaviour via failure to make these Direct Debit payments. This provides extremely useful and timely insights into the state of the UK economy and in time, could feed into wider national accounts estimates.
  6. In addition, new sources being acquired as part of the transformation of consumer statistics are already being incorporated into headline measures of the Consumer Prices Index (CPI). This has started with the incorporation of rail fares, enabling a far greater level of detail to be accessed within our published indices.
  7. Alongside the opportunities presented by novel data sources, there is also huge potential in the continued, broader integration of data. Integrated data assets, made up of multiple constituent data sources, provide the potential for greater depth and breadth of research, and for the creation of insight that is not possible from analysing data sources in isolation.
  8. A good example of the value of integrated data is the ONS’ development of a Public Health Data Asset (PHDA), which has enabled the ONS to produce novel analyses including Ethnic contrasts in COVID-19 mortality and other high-impact pandemic-related statistics. The ONS is now using the PHDA to explore more indirect impacts of COVID-19 and wider non-COVID research questions. These include the impact of health on labour market participation, by linking in Department for Work and Pensions (DWP) and HM Revenue and Customs (HMRC) data.
  9. The ONS has also made use of ‘open’ data to better inform the public, for example using Automatic Identification System (AIS) shipping data to help monitor trade (shipping) flows. This data science work feeds into the regular monthly ONS publication on Economic Activity and Social Change in the UK, Real Time Indicators. We are looking at other forms of open data that can be used to produce statistical outputs and products that deliver statistics for the public good.
  10. As well as using linked data, the ONS brings multiple types of data together and uses advanced data science methods to deliver statistics for the public good. The ONS collects travel to work data from the census every ten years, most recently in 2021. Travel to work matrices show the movement of people from their home location (origin) to their place of work (destination) at an aggregated level. Information on travel to work provides a basis for transport planning, for example whether new public transport routes or changes to existing routes are needed. It also allows measurement of the environmental impacts of commuting, such as traffic congestion and pollution, and of how these might change over time, for example because of a shift in commuting modes from car to bicycle. Census data alone only supports travel to work matrices for census years, at 10-year intervals, with no updates for the years in between. However, by combining Census 2011 travel to work data, Census 2021 population data, National Travel Survey data (collected by the Department for Transport (DfT)), National Trip Ends Model data (produced by DfT), and ONS geography products such as MSOA boundaries and Population Weighted Centroids, we can produce estimates of travel to work matrices at more regular intervals than once every 10 years.
  11. The approach to integrated data is being taken further as part of the IDS, with data ‘indexed by default’: common de-identified identifiers are applied consistently, enabling a common linkage approach. Data de-identified in this way can be grouped thematically to create Integrated Data Assets around the themes of health, levelling up and net zero. The value of this approach is that it retains the core value of the source data, while being supported by Privacy Enhancing Technology, and facilitates access to a much broader range of data, enabling analysts to substantially grow the value of their analysis.
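The ‘indexed by default’ idea described above can be illustrated with a minimal sketch. This is not the actual IDS implementation; the keyed-hashing scheme, key handling and identifier format are illustrative assumptions only. It shows how a consistent de-identified linkage key can be derived so that the same person receives the same token in every dataset, without the raw identifier being shared alongside the data.

```python
import hashlib
import hmac

# Illustrative only: in a real system this key would be held securely by a
# trusted indexing function and rotated according to policy.
SECRET_KEY = b"example-indexing-key"

def linkage_token(identifier: str) -> str:
    """Derive a de-identified linkage key from a personal identifier.

    Normalising first means trivial formatting differences do not
    break linkage; the keyed hash means the token cannot be reversed
    or recomputed without the secret key.
    """
    normalised = identifier.strip().lower()
    return hmac.new(SECRET_KEY, normalised.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# The same identifier yields the same token across datasets...
assert linkage_token("AB123456C") == linkage_token(" ab123456c ")
# ...while different identifiers yield different tokens.
assert linkage_token("AB123456C") != linkage_token("AB123456D")
```

Once every dataset carries the token rather than the raw identifier, records can be joined on the token alone, which is the sense in which a common linkage approach is enabled without exposing personal data.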

What are the strengths and weaknesses of new sources of data? 

  1. The use of new administrative and commercial data sources is transforming the way we produce statistics (a trend seen across the world). These sources provide timely, frequent and granular data about the population that is not possible through survey collection alone. Through the linkage of different data sources, we can provide coverage across the population down to local levels of geography. Innovations in this area include a new approach to produce more timely and frequent high-quality estimates of the size and composition of the population down to local level, and of how this changes due to international migration. However, administrative data are not collected for statistical purposes and, as a result, there are both strengths and weaknesses with such data sources.
  2. Relevance and conceptual fit: the content of administrative data is determined by the services they support. This includes the topics collected, but also the precise definitions of the items that are measured. While surveys can be designed so that the questions they ask capture the statistical concepts we want to measure as accurately as possible, this is rarely possible for administrative sources, especially well-established ones. In practice, that means we need to adjust the data so it fits with statistical definitions using additional data from elsewhere (such as a survey) or can only approximate those definitions if adjustment is not possible. An important area to help with this is collaboration across government, to improve the collection of data in administrative systems, particularly around protected characteristics.
  3. Coverage: the strength of many of these large data sources is their granularity, which gives us the ability to analyse data for small groups or for small areas. This is not always possible with surveys, particularly those with small sample sizes. However, the coverage of most of these datasets will be incomplete. Parts of the population will not interact with the administrative source and will therefore be missed from the dataset. In other cases, the data may not cover everybody or everything. The power, from an analytical point of view, comes from linking different datasets together to improve coverage and enable analysis at a local level. However, sometimes surveys are also needed to fill the coverage gaps. There can also be over-coverage in data sources, where individuals appear in the dataset who are not within the target population, for example emigrants and short-term residents who have recent administrative data activity but are either no longer resident or not resident for sufficient time to meet the definition for inclusion in our estimates.
  4. Linkage: new sources of data need to be integrated to improve coverage and allow analysis. Unique identifiers to enable this linkage are often absent, which can add to the complexity, time and cost of processing the data for analysis. The complexity of linkage without common unique identifiers means that it is never perfect. Moreover, the quality of linkage may vary across the population. Understanding and quantifying linkage quality is critical, as issues that arise (such as under-representation) will feed through into statistical analysis and may affect results if not properly mitigated.
  5. Timeliness, for both collection and delivery (although new data sources are often relatively more timely than survey data):
    • Collection lags: there is often a lag between an event occurring and the data for that event becoming available, for example moving address and then registering at a doctor’s surgery, or making a profit and then filing a tax return. The lag differs between datasets. Real-time analysis from these new data sources is therefore rarely possible at this point; there is always some lag.
    • Data delivery: delays in delivery can affect the timeliness of statistical and analytical outputs. Data takes time to be processed and delivered to the statisticians who analyse it. Delivery can be delayed, or data not shared in line with analytical requirements, owing to the nature of data sharing agreements. Within the ONS we are working with departments to build mature data transfer systems supported by robust Data Sharing Agreements (DSAs).
  6. Coherence, harmonised standards, and metadata: different operational policies lead to the associated administrative data being collected in different ways. One dataset may be at quite a high level, while others hold more detailed information; even datasets that appear to collect information on the same metric may not be comparable, or not wholly comparable. This makes analysis more difficult and can mean that data is not available at the relevant level of granularity. Metadata and detailed information about the data collected are also often lacking, making the data harder to use. We are using secondments into departments to better understand the data, build the metadata and so improve this situation.
  7. Stability and accuracy: with survey data we have control over the questions asked and their stability over time. Administrative or commercial data can change, for example in what is collected and how, because of changes to the operational process. This is both a strength and a weakness: a strength because it allows the data to adapt to changing requirements and needs; a weakness because it risks breaking series, which is not ideal for statistical analysis of trends. An example of this was the removal of universal Child Benefit, which caused a big drop in the coverage of children in DWP/HMRC data. To future-proof statistics, we need to make sure that we are not reliant on just one source of data. In addition, accuracy will often depend on whether a variable is critical to the administrative function; where it is not, quality drops, as the data may be missing (if voluntary) or may not undergo robust checks on collection.
  8. Design: some administrative data collection processes were designed decades ago and rely on legacy IT to be analysed. The design of questions or forms may therefore not fit new needs or requirements, and may not follow user-centred design principles, affecting the quality of the data collected. However, some of the major sources that we are currently using have undergone improvement in this area.
  9. Supplier restrictions: some suppliers place restrictions on the data that is shared, for example by applying techniques to data to enhance privacy (such as hashing, perturbing, aggregating data). This can damage the usefulness of the data and how it can be used within statistical outputs.
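Linkage in the absence of common unique identifiers, as described in paragraph 4 above, is commonly approached by scoring agreement between fields. The following is a minimal, illustrative sketch in the spirit of Fellegi-Sunter probabilistic linkage; the fields, weights and threshold are invented for illustration and are not a description of any production ONS method.

```python
from dataclasses import dataclass

@dataclass
class Record:
    name: str
    dob: str       # ISO date, e.g. "1980-04-02"
    postcode: str

# Illustrative agreement weights and cut-off: each field that agrees adds
# its weight, and a pair is declared a match if the total clears the
# threshold. Real systems estimate weights from the data and tune the
# threshold to balance false matches against missed matches.
WEIGHTS = {"name": 3.0, "dob": 4.0, "postcode": 2.0}
THRESHOLD = 5.0

def match_score(a: Record, b: Record) -> float:
    score = 0.0
    if a.name.lower() == b.name.lower():
        score += WEIGHTS["name"]
    if a.dob == b.dob:
        score += WEIGHTS["dob"]
    if a.postcode.replace(" ", "").upper() == b.postcode.replace(" ", "").upper():
        score += WEIGHTS["postcode"]
    return score

def is_match(a: Record, b: Record) -> bool:
    return match_score(a, b) >= THRESHOLD

a = Record("Jane Smith", "1980-04-02", "NP10 8XG")
b = Record("jane smith", "1980-04-02", "np108xg")   # same person, different formatting
c = Record("Jane Smith", "1975-11-20", "CF10 1EP")  # only the name agrees

assert is_match(a, b)        # 3 + 4 + 2 = 9, above threshold
assert not is_match(a, c)    # 3, below threshold
```

The sketch also illustrates why such linkage is never perfect: typos, missing fields or common names shift scores across the threshold, which is why linkage quality must be measured and its effects on downstream analysis understood.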

Protecting privacy and acting ethically 

Who seeks to protect the privacy of UK citizens in the production of statistics and analysis? How? 

  1. All producers of official statistics are legal entities and data controllers in their own right, and are therefore responsible for protecting the data they hold and use. Data protection legislation, including the UK GDPR and the Data Protection Act 2018, provides the statutory framework all producers of official statistics must adhere to, and makes specific reference to personal data processed for statistics and research purposes. The Information Commissioner’s Office (ICO) is the independent authority that ensures compliance with data protection legislation and upholds information rights in the public interest. As part of its role, the ICO provides advice and guidance, including, for example, a Data Sharing Code of Practice.
  2. As the executive office of the Authority, the ONS collects and processes information, both directly from individuals and from other organisations, and does so using a variety of methods. The Authority has the statutory objective to promote and safeguard the production of official statistics that serve the public good. Any personal data collected by the Authority can only ever be used to produce statistics or undertake statistical research.
  3. In addition to data protection legislation, personal information held by the Authority is further protected by the 2007 Act, which makes disclosure of personal information a criminal offence except in limited prescribed circumstances, for example where disclosure is required by law.
  4. The DEA provides the Authority with permissive and mandatory gateways to receive data from all public authorities, crown bodies and businesses. These data sharing powers can only be used for the statistical functions of the Authority and sharing can only take place if it is compliant with data protection legislation.
  5. The 2007 Act requires the Authority to produce and publish the Code, governing the production and publication of official statistics. One of the Code’s core principles concerns data governance and states that organisations should ensure that personal information is kept safe and secure.
  6. The ONS provides guidance, support, and training on matters across the GSS, including on data protection and privacy.

Data protection

  1. The Authority has a dedicated Data Protection Officer and teams and colleagues that manage data protection and legal compliance and the security of data. These teams provide advice and guidance across the organisation on data protection matters; deliver training sessions on the protection of data; and engage regularly with the ICO.
  2. The Authority takes a data protection by design approach when processing data for statistical purposes. Privacy and data protection issues are considered at the design phase of systems or projects. The Authority has published extensive material regarding privacy for members of the public including privacy information for those taking part in surveys and a Data Protection Policy. For new projects that involve the processing of personal data, colleagues are advised to complete Data Protection Impact Assessments, that enable the Authority to identify any risks of processing to data subjects and to mitigate those risks.

Statistical confidentiality

  1. The Authority collects a vast range of information from survey respondents, as well as administrative data such as registration information on births, deaths and other vital events. The ONS publishes statistics and outputs from this information, and statistical disclosure control methods are applied so that the confidentiality of data subjects, including individuals, households and corporate bodies, is protected. All statistical outputs are checked for disclosure risk, and disclosure control techniques are applied as required.
  2. The DEA facilitates the linking and sharing of de-identified data by public authorities for accredited research purposes to support valuable new research insights about UK society and the economy. The Authority is the statutory accrediting body for the accreditation of processors, researchers and their projects under the DEA.
  3. The Authority allows access to de-identified data within its trusted research environments. To ensure the security of the data and individual privacy, the Authority uses the Five Safes Framework:
    • Safe People: trained and accredited researchers trusted to use data appropriately.
    • Safe Projects: data that are only used for valuable, ethical research that delivers clear public benefits.
    • Safe Settings: settings in which access to data is only possible using our secure technology systems.
    • Safe Data: data that have been de-identified.
    • Safe Outputs: all research outputs that are checked to ensure they cannot identify data subjects.
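The disclosure checks described under statistical confidentiality can be illustrated with a minimal sketch of primary cell suppression, one common disclosure control technique. The threshold and table layout here are illustrative assumptions, not the rules applied to any particular ONS output; real outputs also use secondary suppression, rounding and other methods so that suppressed values cannot be inferred.

```python
# Illustrative threshold: counts below this are suppressed before release
# so that very small groups cannot be identified. Actual thresholds and
# rules vary by output and are set by disclosure control policy.
MIN_CELL_SIZE = 5

def suppress_small_cells(table: dict) -> dict:
    """Return a copy of a table of counts with small cells suppressed.

    Suppressed cells are replaced with None (published as, e.g., 'c'
    for confidential) rather than zero, so absence of data is not
    confused with a zero count.
    """
    return {cell: (count if count >= MIN_CELL_SIZE else None)
            for cell, count in table.items()}

counts = {"area_A": 120, "area_B": 3, "area_C": 47}
released = suppress_small_cells(counts)
assert released == {"area_A": 120, "area_B": None, "area_C": 47}
```

This is the kind of automated first pass that sits behind the “Safe Outputs” check above; human review then confirms that nothing disclosive survives in the released tables.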

Security controls

  1. The protection of data is a top priority for the Authority, and we implement and operate substantial security measures for our staff, data and services. This security focus ensures that the Authority operates and continues to develop secure options that meet its objectives for data use and maintains public trust in how we access, use, process, store, and make available data for statistics and research purposes.
  2. To ensure the confidentiality, integrity and availability of our data are protected at all times, we operate a security management framework which continuously evaluates the threat landscape and the security risks, and ensures that the appropriate controls are in place. This means we operate within corporate risk appetite, maintain a strong security posture, and comply with relevant legislation, Codes of Practice and industry best practice. It is underpinned by a robust secure-by-design approach, comprehensive protective monitoring, internal and external assurance, and the training of our staff.

What does it mean to use data ethically, in the context of statistics and analysis? 

  1. The Authority owns a set of six ethical principles relating to the use of data for research and statistics. These principles cover: public good, confidentiality & data security, methods & quality, legal compliance, public views & engagement, and transparency. The production, maintenance and review of these principles are conducted by the National Statistician’s Data Ethics Advisory Committee (NSDEC). The NSDEC was established to advise the National Statistician that the access, use and sharing of public data, for research and statistical purposes, is ethical and for the public good. The NSDEC consider project and policy proposals, which make use of innovative and novel data, from the ONS, the GSS and beyond, and advise the National Statistician on the ethical appropriateness of these. The NSDEC meet quarterly and have a key role in ensuring transparency around the access, use and sharing of data for statistical purposes.
  2. In 2021 the Centre for Applied Data Ethics (CADE) was established within the Authority. CADE provides practical support and thought leadership in the application of data ethics by the research and statistical community. The Centre provides a world-leading resource that addresses the current and emerging needs of user communities, collaborating with partners in the UK and internationally to develop user-friendly, practical guidance, training and advice in the effective use of data for the public good. In addition to providing the secretariat to the NSDEC, CADE mobilises the Authority’s six ethical principles via a self-assessment tool that is available to the entire research and statistics system. This tool supports researchers and analysts in identifying ethical concerns in their work and then engaging with CADE to ensure mitigations and solutions are in place. Since January 2021, the tool has supported nearly 900 pieces of ethical research and statistics, and its use is growing by hundreds of assessments a year.
  3. Complementing the independent advice and guidance of the NSDEC and the self-assessment ethics services of CADE, the Centre also produces several bespoke ethics guidance pieces each year. These guidance pieces are typically produced in collaboration with an area of the ONS or the wider statistical system and focus on key concerns, such as identifying and promoting public good, considering public views and engagement, and specific ethical considerations in inclusive data, machine learning and geospatial data. Finally, the CADE also offer bespoke ethics support with specific projects, workstreams and teams and tailor their services to one-off events and longitudinal engagement work. This includes our international development programme where we work to support the work of various other National Statistical Institutes.
  4. The focus of the CADE’s activities is to ensure that the Authority’s ethical principles are promoted and accessible and that tools to ensure the principles are put into practice are effective and easy to use. We achieve this through promoting CADE at internal and external events, providing secretariat to the independent NSDEC, operating and providing oversight of the CADE self-assessment tool, producing specific, collaborative guidance pieces and providing bespoke ethics advice and support. By engaging with the CADE, researchers and analysts can ensure ethical practice, in-line with the Authority’s ethical principles, in the production of research and statistics.

Are current processes and protections sufficient? 

  1. The Authority has well established processes and procedures in place to ensure the protection of the data of UK citizens. As the statistical and analytical landscape around data changes, as with during the COVID-19 pandemic, the Authority ensures that it remains up to date with any changes to privacy legislation, regulatory guidance or cross-government good practice that could impact data subjects. This ensures a robust statistical system that produces public-good statistics that are trusted by the public.
  2. The ONS security management framework incorporates and references appropriate recognised security standards and guidance from within Government (the Cabinet Office, the National Cyber Security Centre (NCSC) and the Centre for the Protection of National Infrastructure (CPNI)), as well as international standards and best practice from international security organisations, including ISO 27001, the US National Institute of Standards and Technology (NIST) and the Information Security Forum (ISF).
  3. From a data ethics perspective, there are dozens of organisations in the statistical system that publish an ethical framework and commit to using it. CADE goes beyond this by evidencing, transparently, the impact that engaging with CADE has on the production of research and statistics: numerically, through the number of projects that CADE and the NSDEC see each year, and in more detail, through the production of case studies, publicly available meeting minutes and audits of projects that have been signed off. Where researchers and analysts engage with CADE and its services, ethical practice can be assured and evidenced.

Understanding and responding to evolving user needs

Who should official data and analyses serve? 

  1. Our data, statistics and analysis serve the public through our statutory duty to “promote and safeguard the production and publication of official statistics that serve the public good” (as set out in the 2007 Act). Everyone is a user or potential user of our statistics and can use data to inform their decision making: from policy makers to enquiring citizens, including local businesses, charities and community groups.
  2. Within the ONS, we have established an Engagement Hub to enable us to coordinate our engagement with users, understand user needs, reach new audiences and evaluate our engagement.
  3. Users are at the heart of everything we do. When identifying priorities for analysis, we do so through:
    1. discussions with other government departments and the devolved administrations.
    2. local engagement: our new ONS Local service works with local analytical communities and wider civic society to support further analysis targeting local questions and local issues.
    3. citizen focus groups with members of the public, and the ONS Assembly with charities and bodies representing the interests of under-represented groups of the population.
    4. drawing on external advice, for example the National Statistician’s Expert User Advisory Committee (who advise on cross-cutting issues).
    5. regular engagement activities with businesses and third sector organisations.
  4. A good example of the ONS reflecting the needs of users is the COVID-19 Latest Insights Tool, developed so that members of the public could find reliable, easy to understand information about the COVID-19 pandemic in one place. We engaged in user testing at key stages to make sure it met user need, and it became the most widely read product in the history of the ONS website.
  5. We also undertake an annual stakeholder deep-dive research which explores stakeholder needs, and a stakeholder satisfaction survey which supports evaluating progress against our strategic objectives.
  6. According to the Public Confidence in Official Statistics (PCOS) 2021 report, a very high proportion of respondents trusted the ONS (89% of those able to express a view) and its statistics (87%). This was very encouraging to see. The survey also asked respondents about their level of trust in the ONS compared to other institutions in British public life. Of the institutions listed, the ONS had the highest levels of trust, similar to that of the Bank of England and the courts system.
  7. In terms of analysis, the Analysis Function strategy explains how we bring analysts across government together to deliver better outcomes for the public by providing the best analysis to inform decision making. The Function serves policy makers across Government and has regular conversations with stakeholders to ensure that our data, statistics and analyses are relevant to public policy priorities and delivered in a timely way.
  8. Our Analytical Hub aims to provide the capacity and capability to deliver rapid cross-cutting analysis that supports Government, civil society and the public in understanding the key questions of the day, responding flexibly and in a timely fashion to ongoing economic and public policy priorities.

How are demands for data changing?   

  1. Changes in society, technology and legislation mean that more data are available, in richer and more complex forms, than ever before, with the COVID-19 pandemic shifting users’ expectations towards receiving more insight more rapidly. The pandemic and its impact on our society and the economy have raised more complex questions, which means that growing demand for data is accompanied by growing demand for the expertise and support needed to use data that reflects the intersectional nature of policy enquiry. Our statistics need to be quick, relevant, trusted and reliable to withstand public prominence and scrutiny, respond to a rapidly changing environment, and inform critical policy decisions.
  2. The ONS aims to respond to the needs of the public, decision makers and society, including providing data and insight on the topics and priorities of the day. We have already evolved our approach in response to increasing demand for data, statistics and analysis to be:
    • More timely, through more rapid surveys, such as the Opinions and Lifestyle Survey, and the use of new data sources, like financial transaction and mobility data.
    • More local, with the production of more granular and hyper-local data, such as Gross Value Added (GVA) at lower super output area level, which allows users to build up bespoke geographies that matter to them, and greater support for local users and decision makers through our ONS Local service.
    • More inclusive, through making our data more accessible and reflective of our users, allowing people to see themselves in our statistics and analysis, for example through our shopping prices comparison tool; and
    • More relevant, both in terms of topics, for example looking beyond Gross Domestic Product (GDP) to consider multi-dimensional wellbeing alongside improved measures of economic performance, and in terms of how we disseminate our data and statistics, for example through application programming interfaces (APIs), to empower users to do their own analysis.
  3. We will continue to build on this progress as demands change, for example through the increasing availability and evolving possibilities for artificial intelligence (AI). We did this particularly well during the pandemic, and have since focused on new policy priorities, such as the rising cost of living, the changing nature of the labour market and the experiences of Ukrainian nationals arriving in the UK having been displaced through the conflict.
  4. As well as responding to emerging issues, we are making it easier for the public to find and consume insights on topics of interest by pulling together our many different datasets in the form of dashboards and data explorers. These include the Health Index, which brings together different datasets at local levels; subnational data explorers covering the economy, society, environment and more across local areas; the new UK measures of wellbeing, providing 60 indicators across 10 domains; and the latest data and insights on the cost of living. In addition, data collected through the 2021 Census is being made available through our flexible table builder, articles of interest and interactive maps.
  5. Demands for data are also changing among expert users, with the rise of big data and the need for more data linkage across Government. This is why we are investing in the IDS to provide a secure environment for trusted researchers to analyse new, granular and timely data sources for the public good.

How do users of official statistics and analysis wish to access data?  

  1. We have developed a deep understanding of our diverse user audiences and their unique needs and requirements. We have grouped website users into five persona groups:
    • Technical User: Someone who only wants data and will create their own datasets and customise their own geography boundaries. Data from the ONS are frequently used in conjunction with data from other government departments. They may be expert at what they do with statistics but can be less expert at looking for base data. There is not the urgency we see from the expert analyst. They do not tend to use written publications.
    • Expert Analyst: Someone who creates their own analysis from data. This user downloads spreadsheets into their own statistical models to create personal datasets.
      Access to the data for analysis is more important to them than its presentation.
    • Policy Influencer: Someone who uses data for benchmarking and comparison. For some policy influencers, this requires data and analysis at a regional or local level. They rely on official government statistics, trusted by decision makers, for their reports.
    • Information Forager: Someone who wants local data and keeps up to date with the latest economic and population trends to help them make practical, strategic business decisions. They often do not know exactly what to search for, until they come to it.
    • Inquiring Citizen: Infrequent visitors to our site who search for unbiased facts about topical issues. They want simply worded, visually engaging summaries, charts and infographics. Data can help them make informed decisions about pensions and investments. They engage on social media and browse with smartphones or tablets.
  2. We have found that citizen type users want ways to get data on their local area or to fact check data by using interactive tools, summaries, dashboards, visualisations and maps, whereas more data literate users are interested in the data itself and the associated metadata and methodology. Technically advanced analysts are also interested in being able to access data via APIs and for data to be easily used in tools such as Python and R. These technical users prefer not to have heavily formatted Excel spreadsheets with multiple tabs.
  3. We know many of our local users are keen to understand a place across many topics, rather than consult several publications and multiple datasets for a single figure. As such, we are developing our Explore Subnational Statistics service, which will allow users to select a geography and see metrics across a range of themes. ONS Local also helps local government users to bring together evidence across their area, alongside local intelligence, data and analysis, to create greater insight.
  4. Our search engine optimisation strategy recognises that not all users need to or want to come to our website and that Google and the major search engines often represent our data directly in their search results. This is particularly applicable to those with accessibility needs, since many of these ways of representing our data can be returned via voice search on a variety of platforms.
  5. Citizen users want data communicated in a way that is easy for them to digest and research has shown that there is a degree of education required about our key topics such as inflation and GDP. Users may also be interested in their local areas as much as national level data.
  6. Within the ONS Engagement hub, there are dedicated teams focussing on building relationships with different audience segments. The External Affairs team supports stakeholder engagement with key government stakeholders, business and industry groups, consumer bodies and think-tanks. The dedicated Outreach and Engagement team is focused on engaging with local authorities, and building sustainable relationships with community and faith groups, voluntary sector organisations and others representing the interests of those audiences traditionally less well represented in official statistics and government data.

How can we ensure that official data and analyses have impact? 

  1. Ensuring that our work focuses on the topics that matter most to people, and that it is disseminated in a way that is easy to understand, engaging and relevant to the audience, is key to achieving impact.
  2. For example, we worked with colleagues across government to establish new data collection on Ukrainian refugees and visa sponsors in the UK. This allowed us to provide invaluable insights on the experience of Ukrainians coming to the UK and the impact on service provision in local areas. Publishing this in both English and Ukrainian supported policy decisions around the humanitarian response and allowed those affected to read the findings in their own language.
  3. On the cost of living, we delivered a broad information, engagement and communications programme that included promoting cost of living insights and data products to a wide range of users; diversifying engagement with non-expert users (for example, 99 civil society and community groups attended a session on how they could benefit from our insights); and seeking user feedback to further improve our cost of living products and analysis. The impact includes a continued increase in the use of our insights tool, with the personal inflation calculator embedded into the Guardian and BBC websites and the shopping prices comparison tool reaching over 700,000 uses in its first week.
  4. In June we launched a public consultation on the future of population and migration statistics in England and Wales. We engaged extensively with stakeholders before the consultation launch through sector specific round tables. The consultation launch itself was widely promoted across stakeholders in all sectors and around 500 people attended launch events and webinars. Engagement will continue throughout the consultation period to maximise awareness, understanding and response.
  5. We regularly review the impact of our work, providing quarterly impact reports to the Strategic Outputs Committee (formerly the Analysis and Evaluation Committee), with deep dives on priority topics.
  6. The ONS has access to a number of metrics that can be used to assess the impact of our outputs, specifically reach and awareness to understand the importance and relevance of the insight and content. We also test engagement levels to understand how well our content performs to achieve cut-through and add value to a debate. In addition, we use our own surveys to test appetite for future topics and outputs.
    1. Reach / awareness – to test importance and relevance of topic and insights.
      • Web page views: unique sessions in which a page was viewed at least once, measured over seven days.
      • Social media impressions: number of views per post on social media platforms, measured over 48 hours.
      • Print, digital, broadcast: number of views / listens of ONS insight from outputs.
    2. Engagement – to test content cut-through, clarity of messages and the value added to a debate or discussion.
      • Time spent on web page: time users spent viewing a specified page or screen, taken after seven days.
      • Social media engagement: shares, favourites, replies and comments, URL click throughs, hashtag clicks, mention clicks and media views, taken after 48 hours.
      • Print, digital, broadcast: cut through of ONS comments / main points within coverage.
      • Online pop-up survey for targeted releases to test user satisfaction and help set continued improvements.
      • An annual stakeholder survey and in-depth interviews, targeted at government departments, charities, public institutions and businesses, test satisfaction with and use of statistics and analysis, and future needs.
  7. We bring these insights together, alongside granular stakeholder engagement, to understand the impact of our work at both the topic level and the level of individual outputs. This supports ongoing decision making about what we focus on and how we can best maximise the impact of our work.

User engagement

  1. User engagement is key to making an impact. Our User Engagement Strategy for Statistics promotes a theme-based approach to user engagement. This allows all users of government data and statistics to interact with the GSS by their area of interest or by cross-cutting theme. This approach also aims to support collaboration with producers of official statistics to develop work programmes, address data gaps and help improve GSS products and services.
  2. We have created the ONS Assembly to support regular dialogue and feedback on delivering inclusive data with charities and bodies representing the interests of underrepresented groups of the population. The Assembly aims to be:
    • A forum for the ONS to engage and have an open dialogue with charities and member bodies on a range of key topics.
    • A space to build trusted, long-lasting relationships between members and the ONS.
    • An opportunity for members to share insight, advice and feedback on behalf of their interests and audiences.
    • A space to exchange news and move collaboratively toward the future of data.
    • A route to help ensure vital themes such as inclusivity, accessibility and wellbeing are fully explored.
  3. Alongside working with users in local government, wider civil society and the public, we build and maintain strong relationships with key policy makers in central government. These relationships with both local and central policy makers allow the ONS to understand the challenges they face. We can then help build their understanding of our statistics and analysis, and the wider evidence base, enabling greater insight into the topics that matter to our users and maximising the ONS's impact on decisions that affect the whole country.

Communication

  1. The way we communicate our statistics has improved considerably in recent years, with a direct influence on our impact. Statisticians speaking directly to the public via television and radio helps the transparent communication of statistics, assisted by our amendment to publication times, which ensures parity of communication (from 26 March 2020, with the agreement of OSR, we moved release times for market-sensitive releases to 7:00am rather than 9:30am).
  2. This includes 226 broadcast media interviews undertaken by ONS spokespeople during 2022/23 financial year, generating an average of 2.5k pieces of quoted coverage in the broadcast/online media each month, as well as our solid presence on Twitter (354.2k followers), which achieves good comparable engagement and reach, with threads created to support outputs and to respond to specific trends on social media.
  3. The ONS’s ‘Statistically Speaking’ podcast takes a deep dive into hot data topics and explains what’s behind the numbers. Between April 2022 and March 2023, the podcast had more than 18k downloads, including our most popular episode, ‘The R Word: Decoding ‘recession’ and looking beyond GDP’, which achieved 1,891 downloads in its first 30 days. In total since the podcast started in January 2022, it has achieved almost 30k unique downloads.

Dissemination

  1. Our approach to dissemination plays a pivotal role in maximising the insight and impact of our data. A prime example is our award-winning Census dissemination portfolio: our Census maps offer users the ability to explore spatial patterns down to neighbourhood level, empowering planners and policymakers to target interventions precisely and enabling any user to better understand their community. Since then, we have developed interactive and highly localised content to encourage audiences to engage with the more granular data, producing data visualisation tools and innovative content so citizens can explore the data that is important to them. Users responded positively, describing the tools as "visually excellent" and "personalisable, visual and really well presented".
  2. To promote widespread reuse of our insights and thus amplify their reach and impact, we designed tools to encourage users to embed custom views in their websites and publications. The results have been remarkable, with Census maps accounting for an impressive 24% of total views on the ONS site, garnering around 30 million views since its launch in November 2022.
  3. We also released our custom area profiles product, recognising that user needs often extend beyond predefined geographic areas. Users can now draw specific areas of interest and generate tailored profiles with indicators and comparisons that match their unique use cases. The outputs are also exportable for use in websites and presentations, and the product has already reached tens of thousands of users, bridging gaps in specialised expertise.
  4. To cater to time-constrained users and unlock the potential of data hidden in spreadsheets, we introduced semi-automated localised reporting. With algorithms generating approximately 350 customised reports, one for each local authority, key trends in respective areas are efficiently explained, making insight more accessible and impactful. These reports are extensively accessed and widely referenced by local authorities and other local users.
  5. Additionally, we enhanced the reach of these reports by making their content crawlable by search engines. Snippets from these reports now appear directly in search results and voice-based queries, further bolstering the impact of our data and analyses.
  6. In tandem with our work on the Census, we have goals to transform the presentation of the ONS's day-to-day insights, with a particular focus on enhancing offerings for the general public. We have a digital content team comprising data visualisation specialists, data journalists and designers, focussed on collaborating with analytical teams to achieve this.
  7. This approach centres on addressing the most pertinent questions for our users, often focussed on creating more personalised and localised experiences. By empowering users to see themselves within our data, we establish a meaningful connection with our audience.
  8. Through these collaborations with analytical teams, we are reaching a much-expanded user base, with audiences engaging with our content 40% longer than with typical offerings. Our commitment to delivering impactful, accessible, data-driven insights ensures our offerings resonate with diverse audiences and leave a lasting impression.
  9. We gather a range of feedback on our digital products to develop and improve the usability of our content, including our interactive online content and tools. We also analyse how different user groups access our content, from users of data and statistics to those who want a deeper understanding of particular topics.
  10. There were 6.4m users of the ONS website in 2022/23. Most users (4.3m) use a mobile device to access our website, with 1.9m on desktop and 0.2m on tablets. Engagement levels remain highest with desktop users. There were nearly 26m pageviews on ONS.gov.uk over 2022/23. This only represents users that accept cookies. We estimate this to be approximately 30% of users.
  11. Peak demand on the website was driven by census releases, with 8x higher daily page views on 29 November for the ethnic group, national identity, language and religion census data than average, and nearly 6x higher daily page views for the demography and migration releases for census on 2 November. The census first release on 28 June saw roughly double the daily average for the year.
  12. The most popular topics on ONS.gov.uk across the year were census (3.6m pageviews), covid (3.6m pageviews) and inflation (3.2m pageviews).
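The semi-automated localised reporting described above (one generated report per local authority) can be illustrated with a minimal templating sketch in Python. The area names, indicator and wording below are invented for illustration; the real ONS pipeline is considerably more sophisticated.

```python
# Minimal sketch of semi-automated localised reporting: generating one short
# templated summary per area from a table of indicator values. Data and
# wording are illustrative assumptions, not real ONS outputs.

def trend_word(latest, previous):
    """Describe the direction of change between two periods."""
    if latest > previous:
        return "rose"
    if latest < previous:
        return "fell"
    return "was unchanged"

def build_report(area_name, indicator, previous, latest):
    """Render a one-sentence localised summary for a single area."""
    return (f"In {area_name}, {indicator} {trend_word(latest, previous)} "
            f"from {previous} to {latest} over the latest period.")

# Illustrative rows: (area, indicator, previous value, latest value).
rows = [
    ("Anytown", "median weekly pay", 580, 605),
    ("Otherport", "median weekly pay", 572, 572),
]

reports = [build_report(*row) for row in rows]
for report in reports:
    print(report)
```

Scaling the same template over a few hundred rows is what makes one report per local authority feasible without hand-writing each one, and keeping the generated text as plain sentences is what lets search engines crawl and surface it, as described above.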

Analysis

  1. Ensuring that analysis is good quality will also help ensure that it has impact. The Analysis Function Standard sets expectations for the planning and undertaking of analysis to support well-informed decision making. It provides clear direction and guidance for all users and producers of government analysis.
  2. The Analysis Function also shares best practice through the Analysis in Government awards, which include an impact award.
  3. Maximising impact across government and for the ONS means understanding the priorities of the day, both for citizens and for decision makers at the heart of local and central government, and flexing at pace as new priorities emerge. This often means the evidence base may be less robust or that data do not exist, but the ONS's Analytical Hub is constantly adapting to produce the best analysis at pace to support decision making. The ONS also scans the horizon to anticipate emerging issues.

How do we ensure that users, in the Civil Service, Parliament and beyond, have the skills they need to make effective use of data? 

  1. There are a range of initiatives aimed at improving the analytical skills of civil servants and beyond.

The Analysis Function

  1. The Analysis Function is a network for all civil servants working in government analysis and aims to help improve the capability of all analysts across government. The Analysis Function website hosts the dedicated analysis function curriculum webpages alongside a range of technical analytical learning for all, as well as a guidance hub providing access to key analysis guidance. The Function also hosts regular information sharing events and webinars.
  2. The Function works with the policy profession and other teams across government to ensure we are building a level of analytical capability specifically for non-analysts. The Analysis Function has also developed a learning pathway specifically for non-analysts, in line with wider government reform priorities.
  3. The Analysis Function conducted a review in 2022 of the analytical capability of policy officials. Since then, we have been working closely with the policy profession unit through a dedicated implementation working group to address the recommendations from the review. Progress has been made against several actions, including the launch of the analytical literacy course, the data masterclass and the policy-to-delivery pilot.

The Methodology Advisory Service

  1. The Methodology Advisory Service (MAS), based within the ONS, offers advice, guidance and support to the public sector, nationally and internationally, using teams of experts covering different areas of statistical and survey methodology. We offer an advisory service for:
    • methodological advice on production and analysis of data
    • development of surveys or outputs
    • feasibility studies
    • methodological research to answer complex problems
    • quality assurance of methods or outputs
    • cross-cutting reviews of processes and methods across a department’s statistical work
    • evaluation of competing sources
    • health checks before an OSR assessment
  2. The ONS’s methodologists and researchers receive their own methodological advice from the Methodological Assurance Review Panel (MARP), which provides external, independent assurance and guidance on the statistical methodology underpinning ONS statistical production and research.

The Data Science Campus

  1. The Data Science Campus is at the heart of leading-edge data science capacity building with public sector bodies in the UK and abroad. We equip analysts with the latest tools and techniques, giving them the capability to perform effectively in their roles. We also work in partnership with organisations to ensure they have the capacity to develop their own data science skills in the long-term.
  2. Our evolving range of programmes reflects our focus on using data to drive innovation for the public good, and provides analysts across the ONS, the UK public sector and international partners with a developmental framework to build capacity and enhance analytical capability:
    • Data Science Accelerator
    • Data Science Graduate Programme
    • Degree Data Science Apprenticeship
    • Masters in Data Analytics for Government
    • Cross-government and Public Sector Data Science Community

ONS Local

  1. The ONS Local service provides peer-to-peer forums and platforms for local, regional and national analytical communities to share best practice, and helps local users navigate the extensive subnational offer from the ONS, both what is already available and what is in development, as well as wider UK government data. For example, “ONS Local Presents” webinars allow ONS teams and analysts from local or central government to present analysis on a topic to a wide audience for feedback and challenge, or to showcase innovation in techniques or data that may be useful to others. We have also held the first in a series of “ONS Local How To” workshops, aimed at a similar audience and run jointly with the Data Science Campus, to help local government analysts create dashboards and use APIs.

ONS Outreach and Engagement

  1. Finally, the ONS Outreach and Engagement Team are piloting and developing a programme of online engagement activities to help improve data literacy among underrepresented groups, non-expert users and those less likely to engage with data. The sessions vary in topic across the range of statistical production and collection themes at the ONS and include a range of engagement formats. Topics and activities so far have included an introduction to the ONS and census webinars, Q&As on how to use census data, and show-and-tells, demonstrations or learn-ins on data tools such as the Cost of Living Insights Tool, Census Maps Tool and Build a Custom Data Set Tool.
  2. These sessions can be tailored to the audience, including civil service colleagues who may be less confident or engaged with data, and aim to improve awareness and understanding of the foundations of data use and production.

Professor Sir Ian Diamond, National Statistician

Office for National Statistics

August 2023

Office for Statistics Regulation response

Introduction

About us

  1. The Office for Statistics Regulation (OSR) is the independent regulatory arm of the UK Statistics Authority. In line with the Statistics and Registration Service Act (2007), our principal roles are to:
    • set the statutory Code of Practice for Statistics (the Code).
    • assess compliance with the Code to ensure statistics serve the public, in line with the pillars of Trustworthiness, Quality and Value. We do this through our regulatory work that includes assessments, systemic reviews, compliance checks and casework.
    • award the National Statistics designation to official statistics that comply fully with the Code.
    • report any concerns on the quality, good practice and comprehensiveness of official statistics.
  2. While our formal remit covers official statistics, we also encourage organisations to voluntarily apply the Code to demonstrate their commitment to trustworthy, high quality and valuable statistics. Our 5-year plan sets out our vision and priorities for 2020-2025 and how we will contribute to fostering the Authority’s ambitions for the statistics system. Our annual business plan shares our focus for the current year.

Data and analysis in Government

How successfully do Government Departments share data? 

  1. For the last five years, OSR has been monitoring and commenting on data sharing and linkage across government, producing reports to understand issues and identify opportunities to move the wider system forward. We are an advocate and a champion for data sharing and linkage, when this is done in a secure way that maintains public trust. It is our ambition that sharing and linking datasets, and using them for research and evaluation, will become the norm across the UK statistical system.
  2. Our latest data sharing and linkage report takes stock of data sharing and linkage across government. There has been some excellent progress in creating linked datasets and making them available for research, analysis and statistics.
    • The Office for National Statistics (ONS) recently published statistics on sociodemographic inequalities in suicides, which linked demographic and socioeconomic data about individuals from the 2011 Census with death registration data and, for the first time, was able to show estimates of suicide rates across a wide range of demographic groups. The ONS believes this analysis will support the development of more effective suicide prevention strategies.
    • Data First aims to unlock the potential of Ministry of Justice (MoJ) data by linking administrative datasets from across the justice system and enabling accredited researchers, from within government and academia, to access the data. Data First is also enhancing the linking of justice data with data from other government departments, such as the Department for Education (DfE), where linking data has unlocked a wealth of information for researchers about young people who interact with the criminal justice system.
    • BOLD, led by the MoJ, is a three-year cross-government data-linking programme which aims to improve the connectedness of government data in England and Wales. It was created to demonstrate how people with complex needs can be better supported by linking and improving the government data held on them in a safe and secure way.
  3. Our report highlights an emerging theme on the overall willingness to share and link data across government and public bodies. The benefits and value of doing so are widely recognised, with the COVID-19 pandemic helping to change mindsets and highlight opportunities that exist for greater collaboration and sharing.
  4. However, through speaking with stakeholders across the data sharing and linkage landscape during our review, we also found there is still uncertainty about how to share and link data in a legal and ethical way, and about public perception of data sharing and linkage. There is also a lack of clarity about data access processes and data availability and standards across government. Together, these factors can lead to a nervousness to share and link data, which can cause blockages or delays.
  5. The picture is not the same in every area of government. Some areas have moved faster than others and we have found that culture and people are key determinants of progress.
  6. In the report, we summarise and discuss our findings within four themes in the context of both barriers and opportunities:
    1. Public engagement and social licence: The importance of obtaining a social licence for data sharing and linkage and how public engagement can help build understanding of whether/how much social licence exists and how it could be strengthened. We also explore the role data security plays here.
    2. People: The risk appetite and leadership of key decision makers, and the skills and availability of staff.
    3. Processes: The non-technical processes that govern how data sharing and linkage happens across government.
    4. Technical: The technical specifics of datasets, as well as the infrastructure to support data sharing and linkage.
  7. Overall, data sharing and linkage in government stands at a crossroads. Great work has been done and there is the potential to build on this. However, there is also the possibility that, should current barriers not be resolved, progress will be lost.
  8. Our review makes 16 recommendations that, if realised, will enable government to confront ingrained challenges, and ultimately to move towards greater data sharing and linkage for the public good. Following the report, OSR will be following up with those organisations mentioned in our recommendations to monitor how they are being taken forward.

The changing data landscape

Is the age of the survey, and the decennial Census, over?

  1. Statistics producers are increasingly turning to alternative data sources in the production of official statistics, in light of challenges with survey data collection and increased recognition of the potential of alternative data sources. Administrative data (that is, data that are primarily collected for administrative or operational purposes) are increasingly used to produce official statistics across a range of topics including health, such as waiting times data; crime, such as police recorded crime data; and international migration, such as borders and immigration data. Challenges faced during the COVID-19 pandemic highlighted society’s need for timely statistics and further demonstrated the potential of administrative data.
  2. However, such methods are unlikely to be able to capture all aspects of our population and society and therefore surveys are likely to play an ongoing but changing role in the statistical system. For instance, many crimes are not reported to the police, and data quality for some crime types is poor, so users cannot rely exclusively on administrative datasets of police recorded crime. To get a full picture of crime, both police recorded crime and the Crime Survey for England and Wales will always need to be used alongside each other.
  3. Moreover, there is strong interest in opinion and perception data, such as the successful ONS Business Insights and Conditions Survey. Our Visibility, Vulnerability and Voice report on statistics on children and young people also demonstrated strong user demand for, and the importance of, data that include children’s voices about their experiences and see the child holistically. These insights would not be available through administrative sources.

What new sources of data are available to government statisticians and analysts? 

  1. We highlight in our State of the Statistical System 2022/23 report that the increasing availability of new data sources such as administrative data, management information and growing use of artificial intelligence should be seen as an opportunity for the statistical system.
  2. Administrative data are helping to provide new insights and improve the quality of statistics. For example, the Department for Work and Pensions (DWP) is exploring the integration of administrative data into the Family Resources Survey (FRS) and related outputs through its FRS Administrative Data Transformation Project.
  3. The ONS has developed experimental measures of inflation using new data sources, including scanner and web-scraped data, and has published experimental analysis using web-scraped data looking at the lowest-cost grocery items. Its Consumer Prices Development Plan details the new sources of data that can be used and the insights they can bring.
  4. Technology can also provide opportunities to collect data in different ways, such as DfE pupil attendance data that is automatically submitted from participating schools’ management systems and allows for more timely analysis of attendance in schools in England. This data collection won the RSS Campion Award for excellence in Official Statistics.

What are the strengths and weaknesses of new sources of data? 

  1. In the wider context of technological advances, statistics need to remain relevant, accurate and reliable, and new data sources support this ambition. However, with the use of these new and innovative data sources in the production of official statistics, producers need to manage risks around quality. Moreover, with more use of data science and statistical models in the production of official statistics it is crucial that producers ensure that any development of models is explainable and interpretable to meet the transparency requirements of the Code.
  2. To maximise the opportunities from new data sources, the role of the statistician has to evolve and keep pace with the increasing use of data science techniques. Our latest State of the Statistical System report highlights the difficulties producers have getting people with the right skills in post; these challenges are felt unevenly across the UK statistical system. There is a concerning risk that continued financial and resource pressures will hinder the future progress and evolution of the system as demand increases. A successful statistical system that is able to utilise new data sources depends on having a workforce that is sufficiently resourced and skilled to deliver.
  3. New data sources often provide insights in a more timely manner (in some instances near real time, such as England’s school attendance data) and provide better coverage (the web-scraped and supermarket prices data, for example, often include all transactions or prices). On the other hand, there is a risk that they do not measure what people want to measure, and there is no option to amend or edit the data or the questions being asked. Producers also have little control over coherence and comparability: there may be differences in how organisations record their data, as well as between datasets on a similar topic. Data could also be missing for some observations and variables, and the data could be biased by covering only certain groups of people or transactions.

Protecting privacy and acting ethically 

What does it mean to use data ethically, in the context of statistics and analysis? 

  1. As the regulator of official statistics in the UK, it is our role to uphold public confidence in statistics. In our view, an oft-neglected question of data ethics concerns not so much how data are collected and processed, but how the resulting statistics are used in public debate. As a result, we consider the question of whether a particular use of statistics is misleading to be an intrinsically ethical one.
  2. One of the areas we continue to develop our thinking on is the topic of misleadingness, publishing a think piece on misleadingness in May 2020 and following up on our initial thinking in May 2021. The latter focuses on feedback to the first think piece that it is important to distinguish between the production of statistics and the use of statistics, as well as identifying areas not covered in the original think piece, like the risk of incomplete evidence. Based on our findings, our thinking has evolved to be clearer on the circumstances in which it is relevant to consider misleadingness: “We are concerned when, on a question of significant public interest, the way statistics are used is likely to leave audiences believing something which the relevant statistical evidence would not support.”
  3. We are launching a review of the Code of Practice for Statistics in September. As part of it, we will be asking the question “what are the key ethical issues in the age of AI: how do we balance serving public good with the potential for individualised harms?”. The review will run until December, and we will be highlighting how people can engage and contribute, including a planned panel session on this topic.

Understanding and responding to evolving user needs

Who should official data and analyses serve? How do users of official statistics and analysis wish to access data? 

  1. OSR’s vision, based on our founding legislation, is that statistics serve the public good. In 2022 we worked in partnership with ADR UK to explore what the term ‘public good’ means to the public. We found that research and statistics should aim to address real-world needs, including those that may impact future generations and those that only impact a small number of people. There was also clear evidence that members of the public want to be involved in making decisions about whether public good is being served, through meaningful public engagement and full, transparent and easy access to the decision-making process of Data Access Committees (which evaluate applications from trained and accredited researchers for the use of de-identified data for research).
  2. In 2021, we published a report looking at Defining the Public Good in Applications to Access Public Data. The report highlights how researchers see their research as serving the public good or providing public benefits, and how this differed between organisations. For example, the most frequently mentioned public benefits in National Statistician’s Data Ethics Advisory Committee (NSDEC) applications were improving statistics and service delivery, whereas Research Accreditation Panel (RAP) applications mentioned policy decisions and societal benefit more.

How are demands for data changing?   

  1. There continues to be a significant shift in government and public demand for statistics and data, away from COVID-19 and towards other key issues. The statistical system has demonstrated its responsiveness to meet these data needs. However, as mentioned at paragraph 17, pressure on resources and finances poses a significant threat to the ability of government analysts to produce the insight that government and the wider population need to make well-informed decisions.
  2. Working in an efficient way will help address one part of this problem: it will help ensure maximum value is achieved with the resources that are available, which will in turn help others across government appreciate the benefit of having analysts at the table. Our blog, ‘Smart statistics: what can the Code tell us about working efficiently?’, highlights ways to support efficiency based on the Code.
  3. Users of statistics and data should always be at the centre of statistical production; their needs should be understood, their views sought and acted on, and their use of statistics supported. We encourage producers of statistics to have conversations with a wide range of users to identify where statistics can be discontinued, or reduced in frequency or detail, to save resources where appropriate. This can free up resource while helping producers fulfil their commitment to producing statistics of public value that meet user needs. Ofsted has recently done this to great effect.
  4. The UK statistical system should maintain the brilliantly responsive and proactive approach taken during the COVID-19 pandemic and look to do so in a sustainable way. Improvements to data infrastructure, processes and systems could all help. For example, the use of technology and data science principles, such as those set out in our 2021 RAP review, supports the more efficient and sustainable delivery of statistics. This review includes several case studies of producers using RAP principles to reduce manual effort and save time, alongside other benefits. The recent Analysis Function RAP strategy sets out the ambition to embed RAP across government, and the Analysis Function can offer RAP support through its online pages, its Analysis Standards and Pipelines Team and the cross-government RAP champion network.
  5. Statistics and data should be published in forms that enable their reuse, and opportunities for data sharing, data linkage, cross-analysis of sources, and the reuse of data should be acted on. The visualisations and insights generated by individuals outside the statistical system, using easily downloadable data from the COVID-19 dashboard, demonstrate the benefits of making data available for others to do their own analysis, which can add value without requiring additional resource from producers.
  6. Promoting data sharing and linkage, in a secure way, is one of OSR’s priorities and we are currently engaging with key stakeholders involved in data to gather examples of good practice, and to better understand the current barriers to sharing and linking. This will be used to champion successes, support positive change, and provide opportunities for learning to be shared.
  7. Ensuring overall success requires:
    • independent decision making and leadership, in particular Chief Statisticians and Heads of Profession for Statistics having authority to uphold and advocate the standards of the Code.
    • professional capability, again demonstrating the benefit of investing in training and skills, even when resources are scarce.

How can we ensure that official data and analyses have impact? 

  1. To have impact, official data and analysis need to serve the public good (by being high quality, trustworthy and valuable) and be well communicated.
  2. This is reflected in the three pillars of our Code: Quality sits between Trustworthiness, representing the confidence users can have in the people and organisations that produce data and statistics, and Value, ensuring that statistics support society’s needs for information. All three pillars are essential for achieving statistics that serve the public good. They each provide a particular lens on key areas of statistical practice that complement each other and help to ensure the data are being used as intended.
  3. Quality is not independent of Trustworthiness and Value. A producer cannot deliver high quality statistics without well-built and functioning systems and skilled staff. Nor can it produce statistics that are fit for their intended uses without first understanding those uses and the needs of users. This interface between quality, its institutional context and statistical purpose is also reflected in quality assurance frameworks (QAFs), including the European Statistical System’s QAF and the International Monetary Fund’s DQAF. The Code is consistent with these frameworks and with the UN Fundamental Principles of Official Statistics.
  4. We use assessments and compliance checks to judge compliance with the Code for individual sets of statistics or small groups of related statistics and data (for example, covering the same topics across the UK). Whether we use an assessment or compliance check will often be determined by balancing the value of investigating a specific issue (through a compliance check) versus the need to cover the full scope of the Code (through an assessment).
  5. There is no ‘typical’ assessment or compliance check – each project is scoped and designed to reflect its needs. An assessment is always used when a new National Statistics designation is concerned, and is also used for in-depth reviews of the highest profile, highest value statistics, especially where potentially critical issues have been identified.
  6. We have published guidance that can assist producers in their quality management. We published a guide to thinking about quality when producing statistics following our in-depth review of quality management in HMRC, and released a blog to accompany our uncertainty report. The blog highlights some important resources, most notably the Data Quality Hub guidance on presenting uncertainty. Our quality assurance of administrative data (QAAD) framework is a useful tool for reassuring users about the quality of data sources.
  7. To support statistics leaders in developing a strategic approach to applying the Code pillars and a quality culture, we have developed a maturity model, ‘Improving Practice’. It provides a business tool to evaluate the statistical organisation against the three Code pillars and helps producers identify the current level of practice achievement and their desired level, and to formulate an action plan to address the priority areas for improvement for the year ahead.
  8. We are also continuing to promote a Code culture that supports producers opening themselves to check and challenge as they embed Trustworthiness, Quality and Value, because in combination, the three pillars provide the most effective means to deliver relevant and robust statistics that the public can use with confidence when trying to shine a light on important issues in society.
  9. In our report on presenting uncertainty in the statistical system we found that presenting uncertainty in a meaningful, succinct way that delivers the key messages can be challenging for producers. We found that typically, uncertainty is better depicted and described in statistical bulletins and methodological documents than it is in data tables, data dashboards and downloadable datasets.
  10. We also found that there is a wide and increasing range of guidance and advice to help producers think about how to best present uncertainty. OSR will do more to promote and support good practice and consider what this means for our regulatory work. We will focus on the judgements that we make and the guidance we produce to help producers to improve the presentation of uncertainty.
  11. In our report, we concluded that showing uncertainty in estimates, for example through data visualisation, is essential in improving the interpretation of statistics and in bringing clarity to users about what the statistics can and cannot be used for. At the same time, however, we recognise that this is not always a straightforward task. With support from us and those at the centre of the Government Statistical Service (GSS), we encourage Heads of Profession for Statistics to review whether uncertainty is being assessed appropriately in their data sources, and to review how it is presented in all statistical outputs.
  12. We will continue to review the communication of uncertainty in our regulatory projects. We already have a good range of experience and effective guidance to help review uncertainty presented in statistical bulletins and methodology documents.

How do we ensure that users in the Civil Service, Parliament and beyond, have the skills they need to make effective use of data? 

Intelligent transparency

  1. Intelligent transparency is fundamental in supporting public trust in statistics. Our campaign and guidance aim to ensure an open and accessible approach to communicating numbers.
  2. In our blog, ‘What is intelligent transparency and how you can help?’, we explain that, at its heart, intelligent transparency is about proactively taking an open, clear and accessible approach to the release and use of data, statistics and wider analysis. We also recognise that, while we will continue to champion intelligent transparency and equal access to data, statistics and wider analysis, it is not something we can do on our own. Our expectations for transparency apply regardless of how data are categorised. For many who see numbers used by governments, the distinction between official statistics and other data, such as management information or research, may seem artificial. Therefore, any data which are quoted publicly, or in which there is significant public interest, should be released and communicated in a transparent way.
  3. We need users of data to continue to question where data come from and whether they are being used appropriately. We also need those based in departments and public bodies to champion intelligent transparency in their team, their department and their individual work; to build networks that promote our intelligent transparency guidance among colleagues and senior leaders; and to engage with users to understand what information they need, which in turn informs the case for publishing it.
  4. Parliamentarians also have a role to play in ensuring intelligent transparency in debate. This includes advocating for best practice around the use of statistics and calling out misuse of statistics where it occurs. Following the principles of intelligent transparency allows the topic discussed to remain the focus of conversation, rather than the provenance of the data.
  5. We have launched a communicating statistics programme that will in part look to understand how users want to access data and help support producers to communicate their data through those different means. This will include reviewing our existing guidance to understand what more we can do to support the use and range of communication methods while preventing and combatting misuse.

Statistical literacy

  1. In our regulatory work, when people talk to us about statistical literacy, it is often framed as something in which the public has a deficit. For example, ‘statistical literacy’ may be cited to us as a factor in a general discussion of why the public has a poor understanding of economic statistics. OSR commissioned a review of published research on this topic, and published an accompanying article, to investigate whether this was indeed the case.
  2. We found wide variability across the general public in the skills and abilities that are linked to statistical literacy. Our review highlights that a substantial proportion of the population display basic levels of foundational skills and statistical knowledge, and that skill level is influenced by demographic factors such as age, gender, education and socioeconomic status.
  3. Given this, we think that it is important that statistical literacy is not viewed as a deficit that needs to be fixed, but instead as something that is varied and dependent on the context of the statistics and factors that are important in that context. Therefore, rather than address deficits in skills or abilities, we recommend that producers of statistics focus on how best to publish and communicate statistics that can be understood by audiences with varying skill levels and abilities.
  4. Our review identified a number of areas where there is good evidence on how best to communicate statistics to non-specialist audiences:
    • Target audience: Our evidence endorses the widely recognised importance of understanding audiences. The evidence highlights that the best approach to communicating information (including data visualisations) can vary substantially depending on the characteristics of the audience for the statistics. Considering the target audience’s characteristics is, therefore, an important factor when designing communication materials.
    • Contextual information: Contextual information helps audiences to understand the significance of the statistics. Our evidence highlights the importance of providing narrative aids, and also that providing statistical context can help to establish trust in the statistics. Again, this supports and reflects existing notions of best practice.
    • Establishing trust: As well as providing context, we found evidence that highlighting the independent nature of the statistical body and, when needed, providing sufficient information so that the reasons for an unexpected result are understood, can increase trust in the statistics. This finding aligns with the Trustworthiness pillar of the Code.
    • Language: In the statistical system, statistics producers recognise that they should aim for simple, easy-to-understand language. We found evidence to endorse this recognition – in particular, that where technical language is used, its level should be dictated by the intended audience.
    • Format and framing of statistical information: We found evidence that different formats (e.g. probability, percentage or natural frequency) and different framings (e.g. positive or negative) in wording can lead to unintended bias or affect perceptions of the statistics, and both need to be considered. This finding is probably the one least widely recognised in current best practice in official statistics, and we consider it an area that would benefit from further thinking.
    • Communicating uncertainty: Communicating uncertainty is important and may need to be tailored to the information needs and interest levels of the audience. This topic is a particular focus area for OSR, and we discussed our report on communicating uncertainty at paragraph 39.

UK Statistics Authority oral evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the work of the UK Statistics Authority

On Tuesday 23 May, Sir Robert Chote, Chair of the UK Statistics Authority, Sir Ian Diamond, National Statistician and Ed Humpherson, Director General for Regulation, gave evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the work of the UK Statistics Authority.

A transcript has been published on the UK Parliament website.

UK Statistics Authority correspondence to the Public Administration and Constitutional Affairs Committee regarding Public Confidence in Official Statistics report

Dear William,

I am writing to draw your attention to the latest Public Confidence in Official Statistics report (2021), which has been produced by the National Centre for Social Research (NatCen) on behalf of the UK Statistics Authority. I am happy to share that the report finds that public confidence in official statistics remains high, and engagement with official statistics has increased since 2018.

Awareness of the Office for National Statistics (ONS) and the Authority has increased from 70% and 33% in 2018 to 75% and 48% in 2021 respectively. Furthermore, for the first time people were asked if they were aware of the Office for Statistics Regulation, with 41% saying that they were.

Notably, 96% of people able to express a view agreed that it is important for there to be a body such as the Authority to speak out against the misuse of statistics, and 94% agreed about the importance of there being a body to ensure that official statistics are produced without political interference.

Members might also be interested to note that a very high proportion of respondents trusted the ONS (89% of those able to express a view) and our statistics (87%). Of those able to express an opinion, trust in the ONS was the highest of all the institutions asked about, including the Government, the Bank of England, and the civil service as a whole. 82% of people able to express an opinion agreed that official statistics are generally accurate, up from 78% in 2018. Meanwhile, 44% said they had used ONS COVID-19 statistics; these were more commonly used than any of the other statistics asked about, with the exception of the census.

This report is very welcome, especially following our hard work to provide clear insights throughout the pandemic. We are proud that the public support our vision of statistics that serve the public good, which we will continue to deliver with honesty, and free from political interference.

A copy of the report will be annexed to this letter for the Committee’s information.

Yours sincerely,
Professor Sir Ian Diamond