Office for Statistics Regulation written evidence to the Modernisation Committee’s inquiry into the work of the committee

Dear Ms Powell,

As Director General of Regulation at the Office for Statistics Regulation (OSR), I am writing following the Modernisation Committee’s call for evidence on the work of the Committee.

The OSR is the regulatory arm of the UK Statistics Authority and plays a key role in protecting public confidence in the trustworthiness, quality and value of statistics produced and used by government. We consider statistics to be the lifeblood of democratic debate and that misuse of statistics results in an erosion of trust in Government.

We are responsible for setting the standards that official statistics must meet in the Code of Practice for Statistics. We also use our voice to stand up for statistics and represent the interests of the public by investigating concerns raised with us (referred to as casework) about the dissemination and use of statistics, reporting publicly where necessary.

We would like to bring two key areas to the attention of the Modernisation Committee that we consider would contribute to the strategic aim of driving up standards and restoring public trust.

Intelligent Transparency

Why this topic would benefit from the attention of the Modernisation Committee

Intelligent transparency ensures public understanding of, and confidence in, numbers used by governments. It involves proactively taking an open, clear, and accessible approach to the release and use of data and statistics so that they can be easily accessed, scrutinised and used appropriately.

Our principles for intelligent transparency are:

  • Equality of access – Data used by government in the public domain should be made available to all in an accessible and timely way. Our expectation is that figures used by MPs in a public forum should already be publicly available. This ensures that any claims made are evidenced, verifiable and able to be scrutinised.
  • Ease of understanding – Sources for figures should be cited and appropriate explanation of context, including strengths and limitations, communicated clearly alongside figures. Our expectation is that MPs use statistics in a fair and accurate way that supports understanding and is not misleading.
  • Independent decision making and leadership – Decisions about the publication of statistics and data, such as content and timing, should be independent of political influence and policy processes. Our expectation is that MPs will not interfere with the independent process of publishing statistics.

We have worked closely with the Heads of Profession for Statistics network across government departments to deliver training and raise awareness of intelligent transparency. We have had considerable success engaging with civil servants, including statisticians, analysts, communication professionals, policy teams and Permanent Secretaries. However, to date, we have had limited direct contact with MPs and Ministers. Given that the full success of intelligent transparency depends on awareness of it across government and parliament, we would encourage parliamentary committees and individual MPs to be aware of the principles of intelligent transparency in their work.

It is our view that the Modernisation Committee could play a key role in ensuring that the principles of intelligent transparency are fully embraced and embedded across parliament as the default approach for communicating statistics. This will ensure that statistics are used by MPs in a way that supports public trust.

Existing work relevant to this topic

We have a range of publicly available materials including guidance, FAQs and several blogs.

Our most recent blog comments on a claim made by the Prime Minister at the Labour Party Conference, specifically that there had been “a 23 per cent increase in returns of people who have no right to be here, compared with last summer.” This claim was based on unpublished Home Office data and resulted in us requesting that the Home Office publish an ad hoc release containing the underlying data in advance of the official statistics publication. Cases such as these can result in confusion over the source of the claim, negative media coverage and a disruption to the orderly release of official statistics.

Whilst much of the regulatory work we do is behind the scenes, we have written publicly on several cases relating to intelligent transparency. Key examples include:

  • A statement by OSR (June 2024) relating to the claim made by the Conservative Party that “a Labour government would mean £2,000 of tax rises per working household”. The statement concludes that without reading the full Conservative Party costing document, someone hearing the claim would have no way of knowing that this is an estimate summed together over four years.
  • A letter from Sir Robert Chote to Rt Hon Richard Holden on party spending claims (June 2024) relating to the claim by the Labour Party that the Conservative Party would “raise people’s mortgages by £4,800.” The letter states that: “When distilling these claims into a single number, there should be enough context to allow the average person to understand what it means and how significant it is. Omitting this information can damage trust in the data and the claims that these data inform.”

  • A letter from Ed Humpherson to Matthew Rycroft on the transparency of Home Office statistics (November 2022), which sets out our concerns regarding the use of unpublished Home Office data and statistics by Priti Patel (then Home Secretary), Rishi Sunak (then Prime Minister) and Robert Jenrick (then Minister for Immigration).

It is important to note that there have been several high-profile endorsements of intelligent transparency including:

  • A PACAC report on ‘Transforming the UK’s Evidence Base’ (May 2024) which commended our work on intelligent transparency and recommended that OSR publish an annual report card of departments’ compliance with this guidance so that “Parliament and external bodies might support OSR in holding departments to account and making the case for well-informed policy.” We are currently exploring options for what annual reporting could look like.
  • A private letter from Sir Robert Chote, Chair of the UK Statistics Authority, to new Secretaries of State on ‘Support for ensuring statistics serve the public good’ (October 2024). We have received positive replies to this letter from several Secretaries of State demonstrating their support for these principles.
  • A public letter from Sir Robert Chote, Chair of the UK Statistics Authority, to political party leaders ahead of the general election (June 2024).
  • A private letter from Alex Chisholm (then Civil Service Chief Operating Officer and Cabinet Office Permanent Secretary) to Permanent Secretaries on the ‘Transparency in use of statistics’ (April 2022).
  • The Full Fact Report 2023, which summarises our involvement in several key cases that relate to the principles of intelligent transparency and notes that: “In 2022 alone the Office for Statistics Regulation (OSR) had to write to Government departments at least ten times about the lack of transparency in their use of statistics.”
  • A PACAC report on ‘Government transparency and accountability during Covid 19: The data underpinning decisions’ (March 2021) which states that “statistics quoted by Ministers have not always been underpinned by published data, which goes against the UKSA Code of Practice. Publishing the underlying data is key to transparency and building trust. When the underlying data is not published, numbers may be used to make politicised points and members of the public, journalists and Parliamentarians have no way of verifying the information shared. This means constructive debate cannot happen. When Ministers or senior officials quote statistics, the underlying data must be published.”

Strengthening the Ministerial Code

Why this topic would benefit from the attention of the Modernisation Committee

The current version of the Ministerial Code states that: “Ministers need to be mindful of the UK Statistics Authority’s Code of Practice which defines good practice in relation to official statistics, observance of which is a statutory requirement on all organisations that produce National Statistics in accordance with the provisions of the Statistics and Registration Service Act 2007.”

In his October letter to Secretaries of State, Sir Robert Chote asked that “[Secretaries of State] consider going beyond the letter of the Ministerial Code, from merely being mindful to complying with the Code of Practice for Statistics.”

As the Ministerial Code provides the foundation for setting standards across government, we consider that strengthening the Code in relation to the Code of Practice for Statistics will protect against the misuse of statistics. Strengthening the Ministerial Code will also signal that Ministers are fully committed to upholding the standard of Trustworthiness, thus supporting public confidence.

Recently there have been several calls in favour of strengthening the Ministerial Code in relation to the Code of Practice for Statistics. It was our hope that this would be incorporated in the most recent update to the Ministerial Code, which took place in November; however, this was not the case.

Existing work relevant to this topic

Over the past few years, there have been several calls in favour of strengthening the Ministerial Code including:

  • The independent review of the UK Statistics Authority by Professor Denise Lievesley CBE (March 2024) which stated that “As the current and former Chairs of the UKSA and PACAC have noted, there is scope to strengthen the Ministerial Code to mandate adherence to the UKSA Code of Practice for Statistics. This Review concurs.” The review goes on to state that: “bolstering the Code in this way will send a clear signal to the country that Ministers are holding themselves to the highest account.”
  • A letter from the Royal Statistical Society to Secretaries of State (July 2024) which calls for Secretaries of State to “pledge to abide by the Code of Practice of Statistics – rather than merely being mindful of it as the current Ministerial Code requires.”
  • A letter from Full Fact to the Prime Minister (July 2024) which asked that the PM “Make the Ministerial Code statutory, and incorporate compulsory adherence to the Code of Practice for Statistics.”
  • Feedback in OSR’s review of the Code of Practice for Statistics in 2023 highlighted how helpful producers and users find OSR’s regulatory guidance on Intelligent Transparency and requested that it be incorporated into the Code. OSR is now consulting on a proposed third edition of the Code of Practice which includes Standards for Intelligent Transparency that those in public bodies should meet when using statistics to support statements in the public domain.
  • A PACAC report on ‘Government transparency and accountability during Covid 19: The data underpinning decisions’ (March 2021) which stated that: “The Ministerial Code needs to be strengthened so it is clear that Ministers are required to abide by the UKSA Code of Practice in their presentation of data. The UKSA Code includes the principle of trustworthiness that builds ‘confidence in the people and organisations that produce statistics and data’. Abiding by the UKSA Code of Practice is a statutory requirement for Government Departments. It is simply not enough to ask Ministers to be “mindful” of the UKSA code.”

I hope this evidence is useful to the Committee. Please let us know if you have any questions or if the OSR can support the Committee further in its inquiry. 

Yours sincerely,

Ed Humpherson

Director General for Regulation

Office for Statistics Regulation supplementary evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on Transforming the UK’s Evidence Base

Dear Mr Wragg,

Thank you very much for the opportunity to give evidence to your Committee as part of the Transforming the UK Evidence Base inquiry on 6 February. I enjoyed the session and I hope that you found my evidence useful.

I am writing to provide some supplementary evidence related to comparability of statistics across the UK.

During the session, I set out the expectations we have as the Office for Statistics Regulation for statistics producers on questions of comparability. I emphasised that where there are questions from users around how to compare the performance of public services across the UK, producers in the four nations should recognise and seek to meet that need.

Meeting that need is not straightforward. As I explained, the configuration of public services will probably be different, because of different policy and delivery choices that have been made by the different governments. This is consistent with the concept of devolution, but it does mean that administrative data may be collected and reported on different bases.

However, it is not in our view sufficient for producers to simply argue that statistics are not comparable. They should recognise the user demand, and explain how their statistics do, and do not, compare with statistics in other parts of the UK. And they should also undertake analysis to try to identify measures that do allow for comparison.

A very good example of this approach is provided by statisticians in the Welsh Government. Their Chief Statistician published two blogs on the comparability of health statistics, Comparing NHS performance statistics across the UK and Comparing NHS waiting list statistics across the UK. These blogs recognise the user demand and provide several insights to enable users to make comparisons of NHS performance.

In addition, the Welsh Government’s monthly NHS performance release also highlights what can, and cannot, be compared. For example, it shows that in November 2023, there were approximately 22 patient pathways open for every 100 people, while for England, the figure in November was 13 pathways for every 100 people. More generally, I would commend the Chief Statistician’s blogs as a good example of providing guidance and insight to users across a wide range of statistical issues.

During my evidence session I also mentioned the approach taken by NHS England to highlight the most comparable accident and emergency statistics. NHS England provides a Home Nations Comparison file for hospital accident and emergency activity each year.

More generally, the ONS is leading comparability work across a range of measures. In addition to work on health comparability, they have produced very good analysis of differences in fuel poverty measurement across the four nations.

I hope this additional evidence is useful. I would like to reiterate that these examples show statisticians recognising the core point – that there is a user demand for comparability and that they are taking steps to meet that demand.

Yours sincerely,

Ed Humpherson

Director General for Regulation

Office for Statistics Regulation written evidence to the Transport Committee’s inquiry on the future of transport data

Dear Mr Stewart,

The Office for Statistics Regulation (OSR) supports and encourages innovation and improvement in data and statistics. As the OSR Programme Lead for systemic reviews, I welcome the opportunity to respond to the call for evidence for the Transport Committee’s inquiry ‘Future of Transport Data’.

The OSR is the independent regulatory arm of the UK Statistics Authority. In line with the Statistics and Registration Service Act 2007, our principal roles are to:

  • Set the statutory Code of Practice for Statistics.
  • Assess compliance with the Code to ensure statistics serve the public, in line with the pillars of Trustworthiness, Quality and Value. We do this through our regulatory work that includes assessments, systemic reviews, compliance checks and casework.
  • Award the National Statistics designation to official statistics that comply fully with the Code.
  • Report any concerns on the quality, good practice and comprehensiveness of official statistics.

As part of our planned programme of systemic reviews, in February 2022 we published the UK-wide Review of Transport Accessibility Statistics. Accessible transport plays a key role in an equal society: the term describes a transport network that allows all users an equal opportunity to travel when they want, where they want, how they want, at a price they can afford. I would like to take this opportunity to share the key findings from this review, and our wider views on data sharing and data linkage across government, which may be of interest to the Committee for this inquiry.

Please let me know if you have any follow up questions or if OSR can support the Committee further in its inquiry.

Yours sincerely,

Gail Rankin

Office for Statistics Regulation written evidence, ‘Future of Transport Data’

Review of Transport Accessibility Statistics

Gaps in Transport Data

  1. We found that whilst many statistics on transport and transport use are well developed, existing official statistics are not fully answering the key questions of those with a specific interest in the accessibility of transport networks. These include:
    • Statistics on entire journeys: Current data are largely focused on measuring constituent parts of the transport experience, rather than entire journeys. As such, the connections between legs of journeys, which may pose significant challenges to disabled people, are not taken into account, and statistics producers were unable to quantify how many opportunities people have missed due to failures or barriers in the transport network.
    • Survey data from disabled people: Across a wide range of policy areas, including transport, disabled people are systematically excluded from statistics which are based on surveys. The reasons for this are varied. Some individuals live in establishments such as care homes that are not included in samples based on households and survey questionnaires may not have been adapted to enable completion by those with some disabilities.
    • Granular data: Users of data and statistics told us of their need for more geographical and demographic information. We heard that often sample sizes were too small to allow for local authority or regional breakdowns, particularly in being able to differentiate the experiences of those living in urban areas and those who live in rural areas.
  2. We looked at the three most commonly mentioned barriers to travel: affordability, safety and journey times, as well as at mode-specific data gaps and concerns about data granularity. Data about the average cost per journey, for example costs for commuters making the same journey by different modes of transport, were not published in England, Scotland, or Wales.
  3. Some organisations raised concerns about the lack of, and poor quality of, data available about physical abuse and hate crimes on public transport, particularly towards disabled people. We found that generally statistics were only available at a high level with limited detail, making the data difficult to analyse to form a coherent understanding of what was happening.
  4. We found opportunities to improve data about a variety of modes of transport. This included Community Transport in England, bus and coach travel in Northern Ireland, and the accessibility of railway stations across Great Britain. In addition, our research highlighted concerns that statistics about walking and wheeling and taxi services did not reflect the lived experiences of disabled people.
  5. Users also told us that a greater number of age breakdowns would be beneficial, for example to identify whether the experiences of young adults with disabilities vary from those of older adults with disabilities.

Bringing data and statistics together

  1. We found that both qualitative and quantitative data are needed to understand the experiences of those accessing transport. When qualitative and quantitative data are brought together, they can help to paint an insightful and engaging picture.
  2. Some statistics users are not aware of the extent of available data and statistics, suggesting that engagement with users could be improved and existing publications could be promoted more. We also found that once users had identified the relevant statistics, data or analysis, many publications provide only a snapshot of experiences, making it difficult to understand how these are changing over time.

Our UK wide recommendations

  1. We recommended that statistics should be developed which reflect the lived experience of disabled people to support a focus on removing barriers to access.
  2. All producers of transport statistics should aim to publish data and analysis that are already being collected or produced to improve transparency of ministerial statements and policy development, and to increase clarity and value from the findings.
  3. The Department for Transport (DfT) and the Office of Rail and Road should work together to publish and regularly update statistics about the accessibility of train stations across Great Britain, covering accessible infrastructure to support those with different types of disabilities (such as step free access for those with mobility impairments) and geographical breakdowns.
  4. The DfT should explore whether new or existing data, for example the English National Travel Survey, can be used to fill data gaps highlighted in the report, for example around community and coach travel.
  5. Transport Scotland and Transport for Wales should publish internal analysis on journey times and seek user engagement on what else is needed to support local understanding and policy development.
  6. The Office of Rail and Road should work with the DfT and the Rail Delivery Group to develop a publication about the use and impact of railcards, drawing on data from the Rail Delivery Group and other sources, such as the English National Travel Survey.
  7. All statistics producers should explore where further demographic breakdowns of survey data provide new insights into the experiences of different population groups and publish data where this could be of interest to users. For example, new urban-rural splits of national figures, and more age breakdowns, such as focussing on the experiences of younger adults.
  8. We are currently looking at what progress has been made towards achieving these recommendations, noting that some will require longer term changes to be secured. We would be happy to share updates on these with the Committee when they are published.

Data sharing and linkage across government

  1. The pandemic provided a strong impetus to share data for the public good. There has been some excellent progress in creating linked datasets and making them available for research, analysis, and statistics. However, despite the value of sharing and linking data being widely recognised, areas of challenge remain, as do uncertainties about the public’s attitude to, and confidence in, data sharing.
  2. OSR has been monitoring and commenting on data sharing and linkage across government for a number of years. Our most recent report on Data Sharing and Linkage for the Public Good was published on 12 July 2023 and takes stock of the current data sharing and linkage landscape across government, specifically exploring the barriers and opportunities to this.
  3. Of the 16 recommendations set out in the report, the recommendations we consider most relevant to this call for evidence are as follows.
    • Report recommendation 1: Social Licence. The government needs to be aware of the public’s views on data sharing and linkage, and to understand existing or emerging concerns. Public surveys such as the ‘Public attitudes to data and AI: Tracker survey’ by the Centre for Data, Ethics and Innovation (CDEI) provide valuable insight. They should be maintained and enhanced, for example to include data linking.
    • Report recommendation 3: The Five Safes Framework. Since the Five Safes Framework was developed twenty years ago, new technologies to share and link data have been introduced and data linkage of increased complexity is occurring. As the Five Safes Framework is so widely used across data access platforms, we recommend that the UK Statistics Authority review the framework to consider whether there are any elements or supporting material that could be usefully updated.
    • Report recommendation 4: Privacy Enhancing Technologies. To enable wider sharing of data in a secure way, government should continue to explore the potential for Privacy Enhancing Technologies (PETs) to be used to enhance security and protect privacy where data are personally identifiable. The Office for National Statistics (ONS) Data Science Campus is well placed to lead and coordinate this work.
    • Report recommendation 5: Data Literacy in Government. To gain the skills to create and support a data-aware culture, it is important for senior leaders to have awareness of and exposure to data issues. One way to raise awareness and exposure would be for senior leaders to ensure that they participate in the Data Masterclass delivered by the ONS Data Science Campus in partnership with the 10 Downing Street (No10) Data Science Team.
    • Report recommendation 7: Arbitration Process. To facilitate greater data sharing among organisations within government, a clear arbitration process, potentially involving ministers, should be developed for situations in which organisations cannot agree on whether data shares can or should occur. Developing such an arbitration process could be taken on by the Cabinet Office, commissioned by the Cabinet Secretary and delivered working with partners such as No10 and the ONS.
    • Report recommendation 10: Broader use cases for data. To support re-use of data where appropriate, those creating data sharing agreements should consider whether restricting data access to a specific use case is essential or whether researchers could be allowed to explore other beneficial use cases, aiming to broaden the use case where possible.


UK Statistics Authority oral evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the work of the UK Statistics Authority

On Tuesday 23 May, Sir Robert Chote, Chair of the UK Statistics Authority; Sir Ian Diamond, National Statistician; and Ed Humpherson, Director General for Regulation, gave evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the work of the UK Statistics Authority.

A transcript of the session has been published on the UK Parliament website.

Office for Statistics Regulation written evidence to the Levelling Up, Housing and Communities Committee’s inquiry into Funding for Levelling Up

Dear Mr Betts,

I write in response to the Levelling Up, Housing and Communities Committee’s inquiry into Funding for Levelling Up.

The Office for Statistics Regulation (OSR) is the independent regulatory arm of the UK Statistics Authority and provides independent regulation of all official statistics produced in the UK. We aim to enhance public confidence in the trustworthiness, quality and value of statistics produced by government through setting, and assessing compliance with, the Code of Practice for Statistics.

Like the Committee, we strongly advocate for the transparency of data that are in the public interest and required for key decisions underpinning policy making. Our expectations are outlined in our guidance on intelligent transparency. We would like to see greater clarity and transparency around the data that are required to understand progress against the UK Government’s levelling up policy commitments. This includes both data on funding for levelling up policies and the metrics that will be used to measure the outcomes. Where there are new or existing data of sufficient quality and at the level of granularity users require, we would like to see greater transparency and accessibility of those data.


We are aware that the Committee wrote to Neil O’Brien MP, then Parliamentary Under Secretary of State for Levelling Up, asking for further information about the various funding programmes for levelling up policies, and that his response, accompanied by a funding map dataset, was published on the Committee’s webpages. As far as we can tell, this has not been published elsewhere. We are aware that the Committee has concerns about the table that was provided, which is limited in terms of its utility, and about the guidance needed for a comprehensive interpretation of the figures. There will clearly be interest from the Committee and others in updated versions of this table as the levelling up agenda progresses and other funds are allocated.

We see that there is a case for developing a more publicly accessible version that could sit on gov.uk, rather than a future update coming out again as an annex to a committee letter. This separate publication could then include the necessary additional information required to interpret the data in a more informed manner and be developed to improve its utility for understanding the outcome of bids for different levelling up funding streams, as well as clarity on periodicity and sources of funding. This should be published in an accessible form, in line with the Code of Practice for Statistics.

We corresponded with the Department for Levelling Up, Housing and Communities (DLUHC) from August 2021 to November 2021, asking for greater transparency around data on the Levelling Up Fund and the related ‘prioritisation of places model’. This led to a positive outcome and a commitment from DLUHC to make the data that inform key decisions available. When the next round of levelling up funding allocations is announced, we expect to see published data that are supported by a methodology and links to the source data. This should allow users to recreate the funding model for the allocation of priority areas and enhance public confidence in the decisions that are being made.

DLUHC has set up the Office for Local Government (Oflog), which is tasked with bringing together data to understand local government performance in relation to value for money and with publishing its conclusions in an annual report. This information could be used to monitor local authorities’ performance against aspects of levelling up objectives based on the funding that they receive, and may add to the case for developing a published levelling up funding series. We also note that the Levelling Up White Paper outlines that “devolved governments are best placed to deliver certain services like health and education”; as a result, at some point there may be a need to publish transparent information on the performance of these services in relation to the funds allocated to them.


We understand that the Office for National Statistics (ONS) is receiving funding from DLUHC to develop subnational data and has started publishing its subnational data explorer. The data explorer provides helpful explanations and includes a ‘data dictionary’ that accompanies the supporting dataset. We think the ONS’s approach to publishing subnational data could serve as a good example for other departments looking to publish their own data relating to levelling up policies in a transparent and accessible way.

Please let me know if you have any questions or if I can support the Committee further in its inquiry.

Yours sincerely,

Ed Humpherson 

Director General for Regulation

Office for Statistics Regulation written evidence to the DCMS Sub-committee on Online Harms and Disinformation inquiry on misinformation and trusted voices

Dear Mr Knight, 

I write in response to the inquiry Misinformation and trusted voices, as conducted by the DCMS Sub-committee on Online Harms and Disinformation.  

Which organisations are the most trusted sources of information in the UK?  

The Office for Statistics Regulation is the independent regulatory arm of the UK Statistics Authority and provides independent regulation of all official statistics produced in the UK. It aims to enhance public confidence in the trustworthiness, quality and value of statistics produced by government through setting, and assessing compliance with, the Code of Practice for Statistics. In addition, one of our key roles is to use our voice to stand up for statistics and to represent the public, monitoring and reporting publicly where we have concerns about the dissemination and use of statistics and highlighting good practice. 

The Code of Practice for Statistics has three pillars: Trustworthiness, Quality and Value. The three pillars work together to provide the conditions to support public confidence in statistics, which relates directly to the question the Committee is asking. In particular, we distinguish trust – a belief on the part of individuals – from trustworthiness – a property of organisations. Trustworthiness is about providing evidence that the systems, processes and governance surrounding statistics are effective. However, we never consider trustworthiness in isolation. We consider all three pillars to determine whether statistics are fully compliant with the Code of Practice and can be designated as National Statistics. This designation demonstrates to users that they can have confidence in the relevant official statistics. 

One source that can give some insight into levels of trust in official statistics is the 2021 study of public confidence in official statistics. It found that, amongst people who responded, there was high confidence in the statistical system. While respondents did not necessarily know about the Authority or the OSR, there was strong support for our role, with 96% of respondents agreeing there should be an independent body to speak out against the misuse of statistics and 94% agreeing that such a body should ensure that statistics are produced free from political interference. Regarding the Office for National Statistics (ONS), the largest producer of official statistics in the UK, 87% of respondents reported that they trusted ONS statistics. The public value of statistics was also shown by the 92% of respondents who had used COVID-19 data and reported finding them useful. Although this is only one source, and we are careful not to place too much weight on a single survey result, we do consider that this provides some reassurance around public confidence in official statistics. 

In addition to official statistics producers, there is a wider ecosystem of statistics and data. Many of these other sources of statistics and data inform policy and public debate and it is important that they are used for the public good. We encourage producers outside of the official statistics producer community to apply the Code of Practice for Statistics on a voluntary basis. Our annual award for Statistical Excellence in Trustworthiness, Quality and Value recognises those who voluntarily apply the core pillars of the Code of Practice for Statistics. 

Is the provision of authoritative information responsive enough to meet the challenge of misinformation that is spread on social media? 

Our view is that the best way to combat misinformation is to ensure that information that is trustworthy, high quality and high value is made available to the public. In this way, the good information can drive out the bad. 

However, we recognise that it is hard to live up to this ideal. The experience of the pandemic is instructive. As we noted in our recent State of the Statistical System report, there are a variety of organisations and individuals commenting on the use of statistics by government. The COVID-19 pandemic in particular was associated with an increase in the role of citizens as ‘armchair epidemiologists’. We wrote a blog highlighting how open data enabled great work to be done to communicate data on COVID-19 publicly from outside the official statistics system, including on social media. This demonstrated the best of the changing statistical landscape, in which commentary around official statistics has increased. 

Since the pandemic there has continued to be an increased interest in and scrutiny of statistics. This is a positive for the statistics system but also brings risk. Much discussion of statistics takes place on social media with increased risks around misuse, misinterpretation and ‘echo chambers’. Official statistics producers need to be aware of these changes in the use of statistics. 

Areas that we highlight in our report that can help official statistics producers meet the challenge of misinformation that is spread on social media include: 

  • improving how uncertainty in statistics is communicated to bring effective insight; 
  • an increase in government statisticians challenging the inappropriate use of statistics and engaging directly with users to support understanding of statistics; and  
  • intelligent transparency around statistics, data and wider analysis. 

Intelligent transparency means proactively taking an open, clear and accessible approach to the release and use of data, statistics and wider analysis. As set out in our regulatory guidance on transparency, intelligent transparency is informed by three core principles: equality of access, enhancing understanding and analytical leadership. It is about more than just getting the data out there. Intelligent transparency is about thinking about transparency from the outset of policy development, getting data and statistics out at the right time to support thinking and decisions on an issue, supporting the wider public need for information and presenting the data and statistics in a way that aids understanding and prevents misinterpretation. 

In conclusion, a constant refrain of the OSR is that it is important to ensure that the bad data does not drive out the good. However, as long as producers have the right approach, based on trustworthiness, quality and value, good statistics can thrive. 

Please let me know if you have any questions or if I can support the Committee further in its inquiry. 

Yours sincerely, 

Ed Humpherson  

Director General for Regulation 

Office for Statistics Regulation written evidence to the Scottish Parliament’s Covid-19 Recovery Committee’s inquiry on pre-budget scrutiny

Dear Ms Brown, 

I am writing to make you and the Committee aware that on 30 August 2022 the Office for Statistics Regulation (OSR) published an update to our March 2021 review of the COVID-19 Infection Survey (CIS). The CIS measures how many people living in Scotland, Wales, Northern Ireland, and England test positive for a COVID-19 infection at a given point in time, regardless of whether they experience symptoms. In Scotland, the statistics contribute to ongoing surveillance of the coronavirus pandemic, along with other sources such as genomic sequencing to identify new variants, testing in health and social care settings, and wastewater surveillance.  

The CIS is therefore a key component of public health surveillance in Scotland. In line with its importance, we have maintained a close regulatory focus on how the survey is conducted and on how the results are calculated and presented. The background to our latest review is that in June 2022, the Office for National Statistics (ONS) announced changes to the survey, introducing a digital questionnaire and sending swab and blood sample kits by post. These changes reflected plans to maintain a scaled back version of the CIS set out by the UK Government in its Living with COVID-19 plan.  

In light of the ONS’s changes, we agreed with the ONS that we would undertake a further review of the statistics against the Code of Practice for Statistics. This update looks at whether, and to what extent, the statistics from this survey continue to serve the public good.  

Our review highlights the ongoing value of the CIS. Given the cessation of the REACT study and changes in testing regimes by governments across the UK, these statistics are now the most up-to-date, reliable source on COVID-19 infections. They contribute to scientific advice provided to governments, including the Scottish Government, informing decisions on the ongoing management of the pandemic. In Scotland, the statistics are reported on weekly by Public Health Scotland in its COVID-19 statistical report. Public Health Scotland states that the statistics are the “current best understanding of community population prevalence”. The statistics from the CIS contribute to the estimate of the reproduction (R) number for Scotland, also published in Public Health Scotland’s report. This provides an assessment of whether the pandemic is shrinking or growing. And there is a high level of public interest in the survey – people really value the statistics and many use them to make day-to-day decisions, including potentially serious decisions for those vulnerable to COVID-19.   

Our review makes several recommendations to the ONS regarding ongoing improvements to the statistics: 

  • The ONS should ensure that devolved administrations have appropriate input at the programme level. The ONS has built good working-level relationships with the devolved administrations, including statisticians in the Scottish Government. However, we consider that devolved administrations would benefit from increased engagement at a senior level, for example to ensure that they can input to decisions relating to changes to the survey.  
  • The ONS should continue to inform users about the impacts of the change in mode to digital data collection on the statistics. We found that for the statistics to remain as valuable as possible, it is important for many users, particularly those in the devolved administrations, that granular breakdowns are still available following changes to the survey mode. We are encouraged to see the ONS’s plans to understand and publish information about the change in mode. This includes information on any impact on the response rates and sample, and therefore the representativeness of the survey. The ONS recently published its initial findings on the effects of the change of mode, which offer a first insight into many of these aspects. 
  • The ONS should ensure it keeps users informed about development plans, even if these plans are tentative and subject to change. While we appreciate that the ONS is working in a fast-moving environment and that decisions about the survey may sit with other partners, we consider that it could have done more to keep users informed in a clear and timely way about planned or potential changes to the survey. It will be particularly important for the ONS to keep users informed about the future of the survey as the financial year ends.
  • The ONS should also consider how the CIS can be adapted to play a role in understanding public health in future. The coronavirus pandemic reinforced the need for statistics to inform society about public health. In our review of lessons learned for health and social care statistics during the pandemic we highlighted the need for statistics producers across the UK to continue to develop outputs which go beyond operational data in order to support a better understanding of public health.  

I know the Committee is currently holding evidence sessions for its pre-Budget scrutiny on the COVID-19 strategic framework and is looking specifically at surveillance measures. I hope this letter will help inform the Committee’s work on the subject. 

Please do let me know if you have any questions.  

Yours sincerely, 

Ed Humpherson
Director General for Regulation 

Office for Statistics Regulation written evidence to the Procedure Committee’s inquiry on correcting the record

Dear Ms Bradley,

I write in response to the Committee’s call for evidence for its inquiry Correcting the record.

The UK Statistics Authority and the Office for Statistics Regulation (OSR), as its regulatory arm, have a responsibility to ensure that official statistics serve the public good. We provide independent regulation of all official statistics produced in the UK, and aim to enhance public confidence in the trustworthiness, quality and value of statistics produced by government. We do this by setting the standards official statistics must meet in the Code of Practice for Statistics. We ensure that producers of official statistics uphold these standards by conducting assessments against the Code. Those which meet the standards are given National Statistics status, indicating that they meet the highest standards of trustworthiness, quality and value.

We also report publicly on system-wide issues and on the way that statistics are being used, celebrating when the standards are upheld and challenging publicly when they are not, and intervening when statistics are either misused publicly or quoted without sources being made clear. Our interventions policy explains how we make these judgements in a considered and proportionate way.

Key to our interventions is the ask that people practise intelligent transparency. Transparency and clarity support public confidence and trust in statistics and the organisations that produce them, and minimise the risk of misinterpretation of statistics and data. Transparency allows individuals to reach informed decisions and answer important questions, and provides a mechanism for holding governments to account. Statistics and data also underpin successful implementation of government policies, and individuals’ views on the effectiveness of policy decisions.

Intelligent transparency is informed by three principles:

  • Equality of access: Data quoted publicly, for example in parliament or the media, should be made available to all in a transparent way. This includes providing sources and appropriate explanation of context, including strengths and limitations.
  • Understanding: Analytical professions need to work together to provide data which enhances understanding of societal and economic matters, including the impacts of policy. Governments should consider data needs when developing policy and be transparent in sharing analytical and research plans and outputs with the public.
  • Leadership: Organisations need strong analytical leadership, within and beyond analytical professions. Decisions about the publication of statistics and data, such as content and timing, should be independent of political and policy processes. These decisions should be made by analytical leaders, who should also be given freedom to collaborate across organisational boundaries to support statistics that serve the public good. Their expertise and decision-making authority should be endorsed by Permanent Secretaries.

As tools for understanding public policy, statistics and data rightly belong at the heart of Parliamentary debate. They can be a powerful support to an argument. In the pressured environment of an oral debate, it is only natural that some of these references to statistics, though made in good faith, will be misremembered, unclear, or misattributed. In these circumstances, it is always welcome when MPs make the effort to furnish the record with clarifications or additional information about their sources. This not only ensures that the House is fully informed, but also meaningfully improves the quality of media reporting and subsequent public debate.

At other times an MP may quote statistics correctly but confuse data from a private source with that already in the public domain. In particular, Ministers (who under the Ministerial Code are required to be mindful of the Code of Practice for Statistics) have access to a wide range of published and unpublished information from their departments and should take care to rely on the former when making their statements. However, as set out in our guidance for the transparent release and use of statistics and data, when unpublished information is used unexpectedly, statistical officials in Government departments can play their role in setting the record straight by publishing the information as soon as possible in an accessible form, ideally on the same day. This can be done via an ad-hoc release, which need not be long, or technical. For example, the Department for Work and Pensions has a page dedicated to ad hoc statistical analyses.

Our aim, one that we would hope the Committee agrees with, would be to see intelligent transparency being the default for all statistics and data, including those used by Ministers and parliamentarians.

Please let me know if you have any questions.

Yours sincerely,

Ed Humpherson
Director General for Regulation

Office for Statistics Regulation correspondence with the Levelling Up, Housing and Communities Committee on transparency of data related to the Levelling Up Fund

Dear Mr Betts,

Accessibility of Levelling Up policy funding stream data

I write regarding your recent letter to Neil O’Brien MP, Parliamentary Under Secretary of State for Levelling Up, the Union and Constitution, at the Department for Levelling Up, Housing and Communities (DLUHC), requesting access to data on the various Levelling Up policy funding streams.

At the Office for Statistics Regulation (OSR), we share your support for the transparency of data that is in the public interest and required for key decisions underpinning policy making.

You may be aware that we corresponded with DLUHC from August 2021 to November 2021, asking for greater transparency around data on the Levelling Up Fund and the related ‘prioritisation of places model’.

OSR expects such data to be made publicly available, as outlined in our guidance on intelligent transparency. This guidance has been endorsed by the Civil Service Chief Operating Officer, Alex Chisholm.

Please let me know if there is anything you would like from us in support of your request, or if you would like a meeting to discuss further.

Yours sincerely,

Ed Humpherson