UK Statistics Authority oral evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the work of the UK Statistics Authority

On Tuesday 23 May, Sir Robert Chote, Chair of the UK Statistics Authority, Sir Ian Diamond, National Statistician and Ed Humpherson, Director General for Regulation, gave evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the work of the UK Statistics Authority.

A transcript has been published on the UK Parliament website.

UK Statistics Authority correspondence to the Public Administration and Constitutional Affairs Committee regarding Public Confidence in Official Statistics report

Dear William,

I am writing to draw your attention to the latest Public Confidence in Official Statistics report (2021), which has been produced by the National Centre for Social Research (NatCen) on behalf of the UK Statistics Authority. I am happy to share that the report finds that public confidence in official statistics remains high, and engagement with official statistics has increased since 2018.

Awareness of the Office for National Statistics (ONS) and the Authority has increased from 70% and 33% in 2018 to 75% and 48% in 2021 respectively. Furthermore, for the first time people were asked if they were aware of the Office for Statistics Regulation, with 41% saying that they were.

Notably, 96% of people able to express a view agreed that it is important for there to be a body such as the Authority to speak out against the misuse of statistics, and 94% agreed about the importance of there being a body to ensure that official statistics are produced without political interference.

Members might also be interested to note that a very high proportion of respondents trusted the ONS (89% of those able to express a view) and our statistics (87%). Of those able to express an opinion, trust in the ONS was highest of all institutions asked about, including the Government, the Bank of England, and the civil service as a whole. 82% of people able to express an opinion agreed that official statistics are generally accurate, up from 78% in 2018. Meanwhile 44% said they had used ONS COVID-19 statistics; they were more commonly used than any of the other statistics asked about with the exception of the census.

This report is very welcome, especially following our hard work to provide clear insights throughout the pandemic. We are proud that the public support our vision of statistics that serve the public good, which we will continue to deliver with honesty, and free from political interference.

A copy of the report will be annexed to this letter for the Committee’s information.

Yours sincerely,
Professor Sir Ian Diamond

UK Statistics Authority oral evidence to the Public Administration and Constitutional Affairs Committee’s pre-appointment hearing for Chair of the UK Statistics Authority

On Tuesday 29 March 2022, Sir Robert Chote, the Government’s preferred candidate for Chair of the UK Statistics Authority, gave evidence to the Public Administration and Constitutional Affairs Committee’s pre-appointment hearing for the role.

A transcript has been published on the UK Parliament website.

 

UK Statistics Authority oral evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the work of the UK Statistics Authority

On 21 October 2021, Sir David Norgrove, Chair, UK Statistics Authority, Professor Sir Ian Diamond, National Statistician and Ed Humpherson, Director General for Regulation, gave evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the work of the UK Statistics Authority.

A transcript has been published on the UK Parliament website.

UK Statistics Authority and Office for Statistics Regulation written evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on data transparency and accountability: COVID-19

Dear William,

Thank you for your letter of 2 February, in which you asked about the Government’s use of data, in relation to your inquiry Data transparency and accountability: COVID-19.

We discuss your specific questions in an Annex to this letter. Here I would like to reflect more generally on the use of data and statistics in the past year.

Overall I believe our statistical system has responded well to the stress and pressures of the pandemic. Ian Diamond’s separate letter to you describes an immense range of work that has been done to understand the pandemic itself, which has been fundamental to government decision making and public understanding. Alongside the work on the pandemic the Office for National Statistics (ONS) and statisticians across government have continued to produce remarkable data and analysis across the economy and society, work that is high quality and innovative. Preparations and contingency plans for the census in England and Wales are encouraging.

The legislative framework for our statistics, as set out in the Statistics and Registration Service Act 2007 together with the Digital Economy Act 2017, has also, I think, met the sternest test it has yet seen. The new data and statistics required by the pandemic have for the most part been compiled and published in accordance with the Code of Practice for Statistics, and statisticians have generally been able to access the new sources of data they need.

I pay warm tribute to all involved in this work, at a time of anxiety for them and their families, with all the disruption caused by the need to work from home, alongside the increased difficulty of their professional lives, with many surveys and other sources of data having to be changed or abandoned.

Within this generally positive picture not all has gone well, and there are lessons to be learned.

It has too often been a struggle to develop a coherent picture of the pandemic even within England as a single country. The Department of Health and Social Care (DHSC) currently plays a limited role in health statistics. Its resource has been strengthened following an ONS review undertaken at its request. But the statistical output of the disparate bodies involved in the provision of health is too often incoherent, to the extent for example that both the NHS and Public Health England produce statistics on vaccinations that are published separately.

This is an issue that has been highlighted by the Office for Statistics Regulation (OSR) in the past. It goes well beyond the concerns raised by the pandemic. We currently have no coherent statistical picture of health in England or of the provision of health services and social care.

There are similar issues in relation to the health data for the four nations. The adoption of different definitions complicates comparisons and makes it hard to draw the valuable lessons we could all learn from different ways of doing things.

I strongly support the proposal by the Royal Statistical Society for a thorough external review.

More immediately, it is hard to understand why the different nations have chosen to publish vaccination data in such different ways and at such different levels of detail. OSR is pursuing this.

Some people may be surprised by my mostly positive assessment of the handling of statistics and data over the past year. Their more negative view is likely to have been influenced by particular examples of poor practice, of which there have been too many.

  • The presentation of data at No 10 press briefings has improved, helped by the later involvement of ONS staff, but early presentations were not always clear or well founded, and more recently a rushed presentation has undermined confidence.
  • Ministers have sometimes quoted unpublished management information, and continue to do so, against the requirements of the Code of Practice for Statistics. Such use of unpublished data leads of course to accusations of cooking the books or cherry picking the data. It should not require my involvement or that of OSR to secure publication.
  • Perhaps most important is the damage to trust from the mishandling of testing data. The target of 100,000 tests per day was achieved by adding tests sent out to tests completed. As predicted there was huge double counting, and some 1.3 million tests were eventually removed from the figures in August. The controversy over testing data seems likely to continue to undermine the credibility of statistics and the use that politicians make of them.

The Annex describes a range of current issues in relation to the pandemic, including testing and vaccinations, as well as replying more directly to your letter.

There are perhaps two areas the Committee might like to consider in terms of future change.

The first is the central role of the Authority together with the National Statistician and OSR. The UK has a decentralised system of statistics where individual departments are responsible for their statistics and departmental statisticians report within their departments. This has strengths we should not lose. It ties statistics and statisticians closely into the policy making of their departments and any change should not weaken that tie. But the complexity of data and statistics in the current crisis has shown the need in these circumstances for a firmer central controlling mind. The National Statistician and the ONS have taken this role to a large extent, through expertise, position and personality rather than formal agreement.

OSR has also taken a more expansive role. For the future there may be a place for more formal arrangements.

Secondly, it is clear that political pressures have led to some of the weaknesses in the handling of COVID-19 statistics. It is to the credit of our politicians that they have created an organisation like the Authority that is permitted to criticise them, and in general politicians respond appropriately to our criticisms. But it might help if more issues were headed off before they arose. The Ministerial Code, for example, only asks Ministers to be ‘mindful’ of the Code of Practice. The requirement could be stronger.

In 2020, during the pandemic, the Authority published a new five-year strategy. It remains valid and we are pursuing it at pace. The ONS is leading the development of an Integrated Data Platform for Government as well as developing new and better statistics to help the country understand the economy and society, from trade to happiness and from crime to migration. Statistics to help the recovery are a particular focus. OSR is developing its work on statistical models – its review of exams algorithms will be published shortly – as well as on automation of statistics, data linkage, National Statistics designation, granularity and statistical literacy.

I look forward to keeping you and the Committee in touch with our progress.

 

Kind regards,

Sir David Norgrove

 

OFFICE FOR STATISTICS REGULATION ANNEX

1. Has the Government made enough progress on data since the start of the pandemic, and what gaps still remain?

Summary

Since the start of the pandemic Governments across the UK have maintained a flow of data which has been quite remarkable. New data collections have been established swiftly, existing collections have been amended or added to, and data sources have been linked together in new ways to provide additional insight. We have seen good examples from across the UK, including data on the virus itself and on the wider impacts of the pandemic.

However, in some areas Governments have been slow both to publish data and to ensure its fitness for purpose. For example, more consideration should have been given to the data available as testing and tracing was being set up. While UK governments have started publishing vaccinations data more promptly, and are continuing to develop the statistics on an ongoing basis, there remains much room for improvement in terms of the amount of information that is published and the breakdowns within the data. There are also gaps remaining in the data available to support longer term understanding of the implications of the pandemic.

It is clear that there is intensive work taking place to provide more comprehensive vaccinations data, both by each Government, and through cross-UK collaboration.

New data developed

There are many examples of outputs which have been developed quickly to provide new insights to help understand the pandemic and its implications. Some specific examples include:

  • The coronavirus (COVID-19) infection survey, carried out by the Office for National Statistics (ONS) in conjunction with partners. This is the largest and only representative survey in the world that follows participants longitudinally over a period of up to 16 months.
  • Statistics on the Coronavirus Job Retention Scheme (CJRS) and the Self-Employment Income Support Scheme (SEISS), published by HM Revenue and Customs (HMRC), and the ONS Business Impact of Coronavirus Survey (BICS).
  • The Ministry of Justice (MoJ) and Her Majesty’s Prison and Probation Service (HMPPS) have published official statistics providing data on COVID-19 in HM Prison and Probation Service in England and Wales.

We have also seen outputs which attempt to draw together data to make it more easily accessible. The most well-known of these is the coronavirus dashboard, which is widely used and constantly evolving. While the dashboard focuses on data related to COVID-19 infections, hospitalisations and deaths, the ONS has pulled together a broader range of data from across the UK government and devolved administrations to highlight the effects of the pandemic on the economy and society: Coronavirus (COVID-19): 2020 in charts. This output covers a broad range of data including transport use, school attendance and anxiety levels.

Coronavirus (COVID-19) data

We have seen improvements to the data since the start of the pandemic, particularly around cases and deaths, with clearer information on the different sources of data available and the value of each of these sources.

However, we continue to have concerns, including seeing examples of data being referenced publicly before they have been released in an orderly manner, which we have highlighted to your Committee. In some areas Governments have been slow both to publish data and to ensure its fitness for purpose. For example, the UK coronavirus dashboard is now a rich source of information with plans for the inclusion of further data. However, it took several months to become the comprehensive source it is now.

Our concerns around COVID-19 health data cover three broad areas: testing and tracing, hospitalisations, and vaccinations.

Our concerns with data on testing have been made public on a number of occasions. While there have been significant improvements to the England Test and Trace data, it is still disappointing that there is no clear view of the end-to-end effectiveness of the test and trace programme.

In December we published a statement on data on hospital capacity and occupancy, noting that the biggest limitation of the existing data is the inability to distinguish between patients who are in hospital because of COVID-19 infection and those who are in hospital for some other reason but also have a COVID-19 infection. These data should become available in time, but the delay limits understanding at a critical time.

On vaccinations data, UK governments have been quick to start publishing data and have learnt some of the lessons from test and trace (see section 4). However, there remains room for much improvement in terms of the amount of information that is published and the breakdowns within the data. We would like to see more granular breakdowns and more consistency between administrations across the UK. Our letters to producers published on 1 December 2020 and 20 January 2021 outline our expectations in relation to these statistics.

The improvements cover four broad aspects of vaccinations data. First, there should be more granular data on percentage take up, for example by age band, ethnic group and Joint Committee on Vaccination and Immunisation (JCVI) priority group.

Second, in terms of UK-level consistency, the data available for each administration varies:

  • Information on the percentages of individuals in each priority group that have been vaccinated to date is available for both Scotland and Wales
  • In England, breakdowns including ethnicity and some age bands are published by NHS England. However, it is disappointing that the publication of Public Health England’s COVID-19 vaccine monitoring reports, which might have been a vehicle for more granular information about the vaccination programme in England, has been halted with no explanation as to why it has stopped or when it may restart. We have called upon the statistics producers to be clear on the data currently available and when more data will be published.
  • Northern Ireland data are included in the UK figures. However, the more granular data are not released in Northern Ireland itself in an orderly manner. We have written separately to the Department of Health (Northern Ireland), requesting that the data are published in an orderly and transparent way.

Third, on vaccine type, Scotland is the only administration to routinely publish data on vaccination type (Pfizer-BioNTech or AstraZeneca). It is in the public interest to have this information for all parts of the UK, particularly in the context of the media coverage on vaccine efficacy and sustainability of supply.

Fourth, it would also be helpful to have better information on those who have been offered a vaccine and those who have taken it up. This would help with understanding why some people may not be taking up vaccines, for example whether the reason is refusal, difficulty of access, or not having received an invitation to have a vaccine.

In addition to the areas outlined above, there are gaps remaining in health related COVID-19 data. For example, very little data exists on ‘long COVID’, despite emerging evidence that a proportion of people can suffer from symptoms that last for weeks or months after the infection has gone. We understand that the ONS is starting to look at this area and welcome this effort to fill an important data gap.

There are also gaps in the understanding of and information on variants of coronavirus. This is important in understanding the implications, for example whether data already collected on issues such as hospitalisations, deaths and efficacy of vaccinations are still applicable.

It is a fundamental role of official statistics and data to be used to hold government to account, and the lack of granularity and timeliness seen with some of the data makes it hard to do this. We recognise that producers are working intensively to make improvements and are keen to support these efforts.

Wider impacts

It is inevitable in such a fast-moving environment that there will be gaps in the data. To date the priority has been to fill the gaps most needed to understand the immediate and most direct impacts of the pandemic but at some stage the focus will need to shift from what is happening today to looking at what data are needed to fully understand the longer term consequences of the pandemic.

The issues are broad ranging and cover diverse areas such as:

  • the impact of the pandemic on children and young people
  • the long-term effects on health services
  • the future health of the nation
  • the impacts on inequalities
  • the financial impacts of both the pandemic and the response to it

Answering society’s questions about the pandemic must be done using both existing and new data. There are some longstanding statistical outputs on aspects of society that are likely to have been affected by the pandemic – for example, statistics on trade and migration. These outputs should be used to present analysis of the pandemic’s impact on those specific issues. However, where existing data are collected through a survey, they may be affected by the difficulties in collecting face-to-face survey data over the past 12 months.

It is also important to consider sooner rather than later where new data may be required.

2.   Can you give a view on how well (or otherwise) the Government has communicated data on the spread of the virus and other metrics needed to inform its response to the pandemic?

During the pandemic we have highlighted two main areas of concern in the way governments have communicated data: accessibility and transparency.

Accessibility

Early in the pandemic there were many new sources of data being published by a range of organisations to support understanding of the pandemic. Many of these data were put out without narrative or sufficient explanation of limitations, and could be hard to find. This made it difficult for members of the public to navigate the data and understand the key messages, which in turn could undermine confidence in the data and in the decisions made on the basis of them.

There have been improvements. For example:

  • Data have been more effectively drawn together in dashboards and summary articles, such as the UK coronavirus dashboard and the continually evolving dashboards in Scotland, Wales and Northern Ireland
  • Publications have been developed to include better metadata and explanation of sources and limitations, and how the data link with other sources. An early example was the Department for Transport’s (DfT) statistics on transport usage to support understanding of the public response to the pandemic. We also saw the Department of Health and Social Care (DHSC) develop its weekly test and trace statistics to include information on future development plans, and in October an article was published comparing methods used in the COVID-19 Infection Survey and NHS Test and Trace.

However, it can still be hard to know what is available and where to find data on the range of issues people are interested in.

Transparency

Throughout the pandemic we have been calling for greater transparency of data related to COVID-19. Early in the pandemic we published a statement highlighting the importance of transparency. We also published a statement and blog on 5 November 2020. One of the prompts for this was the press conference on 31 October announcing the month-long lockdown for England. The slides presented at this conference were difficult to understand and contained errors, and the data underpinning the decision to lock down were not published for several days after the press conference.

We continue to see instances of data being quoted publicly that are not in the public domain. Most recently, for example, at the Downing Street briefing on 3 February, the Prime Minister said:

“…we have today passed the milestone of 10 million vaccinations in the United Kingdom including almost 90% of those aged 75 and over in England and every eligible person in a care home…”

At the time this statement was made these figures were not published. Breakdowns by age are not published for the UK as a whole. On 4 February NHS England included additional age breakdowns in its publication for data up to 31 January, including for the first time percentages for those aged 75 to 79 (82.6% having had a first dose) and 80 and over (88.1% having had a first dose). The 10 million first doses figure for the UK was reached on 2 February (published 3 February). Our view is that it is poor data practice to announce figures selectively from a dataset without publishing the underlying data.

Our recent report on statistical leadership highlights the importance of governments across the UK showing statistical leadership to ensure the right data and analysis exist, that they are used at the right time to inform decisions, and that they are communicated clearly and transparently in a way that supports confidence in the data and decisions made on the basis of them. It sets out recommendations to support governments in demonstrating statistical leadership from the most junior analysts producing statistics to the most senior Ministers quoting statistics in parliaments and the media.

We will continue to copy letters to PACAC when we write publicly on transparency issues.

3.   Can you give a view on how well the UK’s statistical infrastructure has held up to the challenge of the pandemic? Are there key systems or infrastructure issues that need to be addressed going forward?

By and large the statistical infrastructure has been extraordinarily resilient. It moved from a largely office-based operation to being almost completely remote in a very short space of time. There were also fast adjustments to allow surveys and data operations for major household, economic and business statistics to continue in some form. Furthermore, new important statistical surveys were launched in a few weeks that would have taken months of planning pre-pandemic, while existing surveys were adapted swiftly from face-to-face to online.

The statistical system has also responded quickly to the need to provide more timely data, for example to support daily briefings.

Producers have been more open to creative solutions and new data sources, for example web-scraped data and PAYE Real Time Information. There have also been greater instances of data sharing. For example, the ONS coronavirus insights dashboard seems to be working towards being a ‘one-stop shop’ for several different producers and the devolved administrations to collate data in a single place. This appears to be encouraging communication between producers and is improving each week.

There are key infrastructure opportunities now that can be exploited, and it is important to question which elements of the new approaches should remain and which should change back to how things used to be done. For example, when and how best should data collection return to face-to-face household surveys? Should legacy surveys like the Annual Business Survey continue, or should there be a move to new platforms or administrative data? How can the new data sources that have now come on stream be exploited even more? Is there a case for synthetic data to enhance existing data to help phase out large and expensive surveys? Can new survey platforms be used to answer short-term questions to help manage the impacts of the pandemic?

It is also important to learn lessons from mistakes, at whatever stage of the process they occurred. One high profile example involved an error with testing data which meant there were delays to some data being included in the daily figures compiled by Public Health England (PHE).

Errors like this reflect the underlying data and process infrastructure. OSR is currently exploring the use of reproducible analytical pipelines (RAP) in government. We are focusing on what enables departments to successfully implement RAP and what issues prevent producers from either implementing RAP fully or applying elements of it. This work will give further insight into infrastructure challenges and where improvements may be needed.

Much of the data that are published are also drawn from administrative sources. In terms of monitoring the pandemic this has presented specific challenges. Public health, social care and hospital administrative systems are not connected to one another, which makes it time consuming to collate the data and puts quality at risk. Updating the IT infrastructure and data governance to make it possible to share information in a timely way is vital.

The pandemic has again highlighted the importance of analysts being involved in the development of operational systems, to make sure they are set up in a way that can best support data and evaluation needs.

The pandemic has also highlighted some of the strengths and limitations of the UK statistical system. For example, analysts embedded in policy departments and devolved administrations have been able to respond quickly to emerging issues, but this has to be balanced against the complexity of having multiple organisations working on overlapping areas.

IT infrastructure as well as how statisticians organise themselves within and across organisations are covered further in our Statistical Leadership report.

4.   What should the Government learn from the issues with testing data that can be applied to vaccine data?

A key issue that has persisted throughout the pandemic has been the need for timely data to be published as quickly as possible. We saw in the early days of testing that the data were disorderly and confused. We were keen that this experience was not repeated with the roll-out of vaccines. So before the start of the vaccine roll-out, we wrote to producers of health-related statistics across the UK, outlining our expectations that they should work proactively to develop statistics on the programme.

Drawing on the lessons to be learnt from the development of test and trace statistics, we outlined the following key requirements for vaccination statistics:

  • Any publications of statistics should be clear on the extent to which they can provide insight into the efficacy of the vaccinations, or whether they are solely focused on operational aspects of programme delivery.
  • Data definitions must be clear at the start to ensure public confidence in the data is maintained.
  • Where statistics are reported in relation to any target, the definition of the target and any related statistics should be clear.
  • The statistics should be released in a transparent and orderly way, under the guidance of the Chief Statistician/Head of Profession for Statistics.
  • Thought needs to be given to timeliness, for example whether data should be daily or weekly, to meet the needs of a range of users.

Encouragingly, we have seen signs that producers have learnt from their previous experiences. For the most part they made initial vaccination data available quickly; at first the numbers were fairly crude, but they have continued to develop the data as time has gone on. We have also seen that whereas data were initially published on a weekly basis, producers very quickly moved to publishing daily figures, with more detailed breakdowns provided in the weekly updates. This in part reflects a greater acknowledgement of the need to publish the data so that they can be quoted in parliaments and the media without undermining confidence.

This is pleasing but the statistics remain in development and there is more to be done, as we have outlined in this annex and in our letter of 20 January. We also asked producers to publish development plans for these statistics and indicate if some data cannot be provided, as this will help users to understand the limitations of the data available.

Office for Statistics Regulation

February 2021

Related links:

ONS written evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on data transparency and accountability: COVID-19

UK Statistics Authority correspondence to the Public Administration and Constitutional Affairs Committee on the Authority’s new strategy

Dear William,

Thank you so much for your help with the launch of our new strategy ‘Statistics for the Public Good’. I am writing now to let you and your Committee know formally that we have published it today.

Statistics and data have never been as necessary for the public good as they have been in recent months; indeed, we delayed the release of our new strategy to focus all our attention on meeting the need for information about the COVID-19 pandemic. I am hugely proud of the work done by the Office for National Statistics (ONS), Office for Statistics Regulation (OSR) and across the Government Statistical Service (GSS). They have shown the strength and value of our statistical system.

But there is more to do. In many ways the work of the past few months has accelerated plans that were already in train, the use of joined-up administrative data in particular. Joined-up government can’t happen without joined-up data and statistics.

We have set out in our strategy how we aim to inform the UK, improve lives and build the future, through four core principles. We will be radical, ambitious, inclusive and sustainable in all that we do over the next five years.

A key part of each of these principles is better communication, and of course, our independence from Government and accountability to Parliament.

I hope there will be an opportunity to discuss this new strategy and more with you and your Committee later in the year.

Yours sincerely,
Sir David Norgrove

UK Statistics Authority correspondence to the Public Administration and Constitutional Affairs Committee regarding progress against recommendations (Governance of Statistics report)

Dear William,

I was glad we had the chance to speak last month. I said then that I would write and set out the further actions the UK Statistics Authority has taken in response to the Committee’s 2019 Governance of Statistics report. Our full response to the report made a number of commitments and this letter describes our progress against them.

The Committee made separate recommendations in relation to the Authority, Office for Statistics Regulation (OSR), the Office for National Statistics (ONS) and the Government Statistical Service (GSS), and this letter responds to them separately where possible.

UK Statistics Authority

Reporting to Parliament

(Recommendations 11,14,22,23,24)

As discussed in our response, the Authority is of course always happy to give evidence to PACAC, and I was glad to see the National Statistician and Director General for Regulation appear in front of the Committee earlier this month. We have also aimed to keep the Committee informed of our statistical interventions and other progress of note via correspondence. I will continue to write to you with regular updates on the work of the Authority, and am always happy to meet to discuss these.

OSR have kept the relevant departmental select committees abreast of statistical interventions since September 2019, and this is now built into ways of working. Both OSR and the ONS regularly give evidence to a wide range of select committees, most recently the Women and Equalities Committee; the Science and Technology Committee; and the Treasury Committee.

Transparency

(Recommendations 11,14,18,20,25,26,34)

An updated framework document, setting out these roles and responsibilities and those across the statistical system more broadly, will be published with our forthcoming strategy. The Board agreed at its March meeting to pause the launch of a new strategy due to the COVID-19 pandemic. The opportunities for new data and analysis were already tremendous, supported by the Digital Economy Act. The response of statisticians to the epidemic has accelerated change as I hope the Committee saw at its recent evidence session. We aim to publish the new strategy later in the year.

Regarding Authority Board minutes, we have ensured these are published in a timely way and reflect the key points of non-executive discussion. In addition, we have recruited two new non-executive directors, Richard Dobbs and Professor Sir David Spiegelhalter, to work a minimum of two days per month.

We committed to publishing the Regulation Committee’s minutes and future meeting dates, and updated the terms of reference. This work was completed and published on the Authority website in October 2019.

We have considered options for developing the Authority website, to emphasise and make clearer the separation that exists between the ONS and OSR, and these improvements are underway. To illustrate separation further, OSR launched their own Twitter account (@StatsRegulation) in November 2019.

We will continue to publish a separate Annual Report for OSR, distinct from the work of the ONS, and the publication of both is planned for July 2020. The OSR Annual Report will be an annex in the Authority Annual Report and Accounts as well as a separate document on the OSR website.

RPI

(Recommendation 30)

The Retail Prices Index (RPI), uniquely among ONS statistics, requires the Authority to seek consent from the Chancellor of the Exchequer to make certain methodological changes. As the Committee are aware, I wrote on behalf of the Board to the then Chancellor in March 2019 to recommend that the publication of the RPI should be stopped at a point in future and that, in the interim, the shortcomings of the RPI should be addressed by bringing the methods of the CPIH into it. In September the next Chancellor announced his intention to consult on whether to bring the methods of CPIH into RPI between 2025 and 2030, effectively aligning the measures. At the same time, the Authority said it would consult on the methods of making the change. This joint consultation was launched on 11 March alongside the Budget and has, in light of the pandemic, since been extended to remain open for responses until 21 August. In the meantime we continue to urge the Government and others to cease to use the RPI. As I said in our response to the Committee’s report, it would be wrong for the Government to continue to use a measure of inflation which it itself accepts is flawed, where it has the opportunity to change.

We will update the Committee again on the RPI following the conclusion of the joint consultation.

Office for Statistics Regulation

Protecting the role of statistics in public debate

(Recommendations 1,4,11)

OSR sent the Annual Review of Casework to the Committee in September 2019, and since then the casework function has been developed further, with the programme managed by the Deputy Director for Regulation. A specific blog on approach and casework during the pre-election period was also sent to the Committee in January 2020. This blog summarised a particularly active period for OSR, more so than in previous pre-election periods. In the 2019 election campaign, public interventions were made to clarify statistics on a range of topics including violent crime, homelessness, education funding and youth unemployment. During the current COVID-19 pandemic OSR has continued to protect the role of statistics in public debate by responding to a range of concerns around the publication of data on COVID-19 cases, deaths and testing across the four countries of the UK. Following these interventions, there have been improvements in the information provided by Government though there is still more to do.

OSR has also outlined its expectations on the use of management information by government and other official bodies, strongly advocating for information to be presented to the public in a way that promotes transparency and clarity. OSR will continue to take appropriate action directly and through the Committee if necessary where Ministers and others refer to unpublished information.

Longer-term developments

(Recommendations 13,14,15,16,22,38)

OSR resourcing has been considered at Regulation Committee in October 2019 and February 2020, and the options will be considered fully after the development of the Authority strategy. Meanwhile, responsibility for budget has been clarified in the new Regulation Committee Terms of Reference, and an MoU between OSR and the ONS has made the Director General for Regulation a Secondary Accounting Officer with responsibility for OSR.

Recommendations on separation of OSR regarding physical location were discussed at the Authority Board and the Regulation Committee, where the aim is to ensure OSR’s physical location is consistent with OSR’s status as being part of the Authority but separate in decision-making terms from the ONS and the National Statistician. These plans involve looking at each of OSR’s three physical sites and ensuring that the physical space is aligned with this overall status. Progress will resume as and when we return to office-based working, although our first priority here will of course be the health and safety of our staff.

OSR’s systemic review of statistical leadership was discussed at Regulation Committee in February. A series of reports are planned on different aspects of the review. A blog in March 2020 outlined the work on this to date.

Finally, the Regulation Committee is actively considering the co-option of external members in their latest round of recruitment.

As noted throughout this update, the COVID-19 pandemic has meant that long-term actions have had reduced priority while focus turns to shorter term needs. Further detail on resource and location issues will be provided in a future update.

Office for National Statistics and Government Statistical Service

User engagement and accessibility of outputs

(Recommendations 2,3,4,5,6,7)

The ONS committed to producing a stakeholder engagement strategy and implementation plan across the GSS. A cross-government steering group and working group were set up in December 2019. The ONS is working closely with Heads of Profession across the GSS, and digital and communications experts, to identify best practice, build capability in user engagement and to collaborate fully in the development of the engagement strategy. The engagement plan will be launched after the Authority strategy, of which good user engagement is a cornerstone, and will be sent to the Committee.

Statistical leadership

(Recommendations 32,34,38)

The office has considered thoroughly and implemented processes to ensure succession planning, talent management and workforce planning. These are a consistent priority and there have been regular reports to the National Statistician since he took up post in October 2019. To aid in leadership of the GSS, the ONS Head of Profession for Statistics (HoP) role has been made into a more senior post, at Director General level. Our new HoP, Iain Bell, has begun holding career development discussions with HoPs from across the GSS.

Data capability and innovation

(Recommendations 36,37)

Within strategy conversations, the Board are continuing to discuss the Authority and ONS’s stronger role across data ethics and data science. Meanwhile, the ONS have explored what new work the Data Science Campus could do with additional funding, which can be provided to the Committee if this would be useful. The Campus are continuing to strengthen and expand their work, including through partnerships with DFID and the new Joint Biosecurity Centre.

I look forward to meeting you in person, I hope before too long.

Yours sincerely,
Sir David Norgrove

UK Statistics Authority written evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the governance of statistics

Dear Bernard,

I write to offer the UK Statistics Authority response to the Public Administration and Constitutional Affairs Committee report on ‘Governance of official statistics: redefining the dual role of the UK Statistics Authority; and re-evaluating the Statistics and Registration Service Act 2007.’

The Authority thanks the Committee for this report and its recommendations. The following note lays out our response and appended to this are two further responses: one from Ed Humpherson, Director General for Regulation, on behalf of the Office for Statistics Regulation (OSR), and another from Jonathan Athow, interim National Statistician, on behalf of the Government Statistical Service (GSS) and Office for National Statistics (ONS).

The Committee’s report comes at an exciting time for the Authority, with the newly-appointed National Statistician, Professor Sir Ian Diamond, due to take office on 22 October. In parallel, the Authority is also developing its next strategy, to follow on from Better Statistics, Better Decisions (BSBD), and which we intend to launch next year. As we do so, Sir Ian, the Board and I will consider the conclusions and recommendations of the Committee very carefully.

As set out below, we will continue to keep the Committee updated on our work, and specifically on the progress made against these recommendations.

Yours ever,

David


Introduction

The Authority’s response considers the Committee’s recommendations around understanding demands for data and leadership on data; on separation of functions, reporting to Parliament, transparency and the role of the non-executive directors; and the latest update on the Retail Prices Index (RPI).

Understanding demands for data
(Recommendations 1,2,3,4,6,38)

As the Authority makes clear within its Code of Practice for Statistics, users of statistics and data should be at the centre of statistical production; their needs should be understood, their views sought and acted upon, and their use of statistics supported. During early discussions on the Authority’s forthcoming strategy, Board members have agreed that this principle will continue to be at the heart of our work over the coming five years. With that in mind, the Authority leads the way by planning an ambitious programme of work to improve the statistical system’s collective understanding of demands for statistics across the UK.

During the initial phase of this work, staff from across the Office for National Statistics (ONS), the Office for Statistics Regulation (OSR), and the Government Statistical Service (GSS) are identifying what sources of intelligence already exist about demands for data (including, as the Committee suggests, the Institute for Government’s report on data gaps). This work will be followed by a range of events with users of statistics including policymakers, Parliamentarians and expert users, as well as members of the public in a series of regional road-shows. As the appended note from the National Statistician illustrates, many gaps have been identified and filled following conversations with users, but there is still more to do.

This exercise is intended not only to provide a clearer picture for the statistical system of demands and current data gaps, but also to forge links with users of statistics across the UK. It is important to respond to the needs and diverging policy demands across the UK, keeping in mind that the Authority is accountable to all four legislatures. We look forward to working closely with the Committee as part of this work. Further detail on the Committee’s recommendations on sector-by-sector reports, and on ongoing user engagement, is contained within the appended notes from the OSR and the National Statistician respectively.

Leadership on data
(Recommendations 36,37)

The Authority agrees with the Committee’s recommendation that our responsibility for the statistical system necessitates a leading role across areas such as technology, data science, data sharing and linking and data ethics. Improved access to administrative data in particular remains key to delivering improvements for users of statistics, and we continue to drive work to make best use of the gateways established in the Digital Economy Act. As set out in further detail in the National Statistician’s response, the Authority’s ambition is to be a leader in the use of official statistics and government data both domestically and internationally, by developing world-class technical skills underpinned by a robust legal and ethical framework. The role of the ONS in the United Nations (UN) Global Working Group on Big Data for Official Statistics provides a clear example of the strength and potential in the UK statistical system.

Separation of functions
(Recommendations 13,14,15,16,17,18,26)

The Authority recognises the challenges created by its statutory responsibility to oversee both the production and regulation of official statistics but agrees with witnesses to the Committee that legislative change is unnecessary.

In recent years, the Authority has enhanced separation between its regulatory and production responsibilities by establishing a distinct executive office for regulation, the OSR. These arrangements provide the OSR with full decision-making autonomy from the ONS, but we accept that this separation can be made more visible to those outside the statistical system.

With that in mind, the Authority Board has considered the Committee’s recommendations at length, and has decided to introduce the following changes to its policies and practices:

• The Regulation Committee terms of reference will be updated to reflect the Committee’s responsibility for overseeing the budget of the OSR.
• Further, the Authority will explore arrangements whereby the Head of the OSR could be appointed as a Secondary Accounting Officer for the Department in respect of its regulatory responsibilities.
• The Authority will publish an updated framework document, setting out these roles and responsibilities and those across the statistical system more broadly, within its forthcoming strategy.
• The Authority is also exploring options for developing its website, to emphasise and make clearer the separation that exists between the ONS and the OSR.
• And we will continue to publish a separate Annual Report for the OSR distinct from the work of the ONS.

The Committee also recommended that the OSR consider changing location. As Mr Humpherson explains in his appended note, the OSR is currently considering the business case for such a move and will keep the Committee appraised of its conclusions.

Reporting to Parliament
(Recommendations 11,14,22,23,24)

The Authority is happy to continue and expand its engagement and reporting to Parliament and this Committee, and indeed is grateful to the Committee for its continued support of the Authority’s independence. The Authority is always content to appear in front of the Committee, including at an annual hearing following the publication of its annual report and accounts. The OSR will continue to publish an annual report which will contribute to this hearing, as discussed in the appended note from Mr Humpherson.

The Authority will give updates at these annual evidence sessions on progress in implementing the recommendations in this report. The Authority is also aware of the recommendation to report annually on progress in implementing other recommendations, of both external reviews and parliamentary select committees, and would be happy to do this via the annual hearing with this Committee and regular correspondence from the Chair. Regarding more frequent correspondence with the Committee, the Director General for Regulation and the Chair of the Authority will inform the Committee when making a significant intervention in the statistical system. As the appended note from Mr Humpherson explains, the OSR will increase its engagement with departmental select committees as another means to ensure Parliament is aware of data and statistical issues. This is in addition to the Chair’s established updates to this Committee, which will continue with regular frequency.

Transparency
(Recommendations 11,14,20,25,26,34)

The Authority is committed to transparency. In response to the Committee’s recommendations, we will ensure Authority Board minutes are published in a timely fashion following their approval, and that they capture the key points of non-executive discussion. In future we will also publish the Authority’s Regulation Committee minutes.

Following approval at the next Regulation Committee meeting in October, July’s minutes and agenda will be published on the Authority’s website. The Committee’s Terms of Reference will be updated accordingly. The Authority website will also be updated to include future meeting dates for the Regulation Committee, along with those for the Authority Board.

Non-Executive Directors
(Recommendations 20,25,34)

The Authority’s non-executive directors play a crucial role in the work of the Board and in the exercise of the Authority’s statutory duties. Non-executive directors also sit on the Authority’s sub-Committees and will continue to do so in future. In addition, in recent years the Authority’s non-executive directors have increasingly played a role outside the Boardroom, regularly attending sessions with staff to offer support and challenge on key programmes of work, including the 2021 Census, changes to the National Accounts, challenges in accessing administrative data and the work of the OSR. This is alongside planned external engagement activities: for example an event in September to engage with members of the Senedd in Cardiff, and we would be pleased to update the Committee on this too.

The Authority expects there to be a competition for new non-executive directors later this year, and in the course of this process, we can consider the case for increasing the number of working days for non-executive directors.

RPI
(Recommendation 30)

As the Committee are aware, the Authority published its response to the report ‘Measuring Inflation’ of the House of Lords Economic Affairs Committee on 4 September. This included the Authority’s proposals for the Retail Prices Index (RPI). As noted in our response, the Advisory Panel on Consumer Prices provided advice to the National Statistician on the composition of the RPI in light of the Economic Affairs Committee report. Taking account of that advice, the then National Statistician concluded that the current position was unsatisfactory and put options for the future of the RPI to the Authority Board on 26 February 2019.

After receiving this advice, the Chair of the Authority wrote on behalf of the Board to the previous Chancellor of the Exchequer on 4 March 2019 with the following recommendations:

• That the publication of the RPI be stopped at a point in future.
• That, in the interim, the shortcomings of the RPI be addressed by bringing the methods of the CPIH into it.

In his response to the Lords Economic Affairs Committee, the Chancellor announced his intention to consult on whether to bring the methods in CPIH into RPI between 2025 and 2030, effectively aligning the measures. The Authority will consult on the method of making this change. The role of the Authority is to promote and safeguard official statistics. We have been clear that the RPI is not a good measure, at times significantly overestimating inflation and at other times underestimating it, and have consistently urged all – in Government and the private sector – to stop using it. However, the RPI is unique as we need consent from the Chancellor to make certain changes, such as the one we have proposed. Although we regret that no change will occur before 2025, we welcome the Chancellor’s intention to consult on resolving current issues with the RPI. We continue to urge the Government and others to cease to use the RPI. It would be wrong for the Government to continue to use a measure of inflation which it itself accepts is flawed, where it has the opportunity to change. Looking ahead to the proposed consultations in January 2020, the Authority looks forward to both Committees’ continued engagement on this matter.

Office for Statistics Regulation response

Introduction

This response focuses on the aspects of the Committee’s report that relate to the Office for Statistics Regulation (OSR). Overall, the Committee has recognised the importance of OSR’s role, endorses a strong, separate identity for OSR as the Authority’s regulatory arm, and ultimately provides a basis for enhancing OSR’s role and identity. Since giving evidence to the Committee alongside Sir David Norgrove and John Pullinger, we have published “OSR’s vision: what we do and how we do it”, which resonates closely with many of the findings and recommendations outlined in the report. The essence of OSR’s vision is that statistics should serve the public good, and that in a world of abundant data, we want people to have confidence in statistics produced by the public sector. In order to realise this vision, our work focuses on three key themes:
• Upholding the trustworthiness, quality and value of statistics.
• Protecting the role of statistics in public debate.
• Developing and leading a better understanding of the public good of statistics in collaboration with others.

These themes will underpin OSR’s approach to delivering the recommendations outlined by the Committee. This note presents OSR’s response and details plans for driving improvements in these areas. It confirms OSR’s ambitions to enhance its regulatory role and identity. These ambitions can be achieved by enhancing work already underway; immediate implementation of some of the Committee’s recommendations; and evaluating options for more significant growth in OSR’s remit and resourcing.

Actions already underway

Many of the Committee’s recommendations speak to areas in which OSR is already operating and has an active role. Those relating to conducting user research, carrying out sector by sector reviews, identifying data gaps and reviewing quality information are of particular relevance to OSR’s current and planned regulatory activity. In this regard, this inquiry has provided a platform for the importance of OSR’s work, and for raising these important issues with the wider Authority which will support and strengthen OSR’s work across the Government Statistical Service (GSS).

Upholding the trustworthiness, quality and value of statistics
(Recommendations 1,6,7,37)

OSR will continue to uphold the trustworthiness, quality and value of statistics and drive improvements to statistics through its regulatory work programme of assessments and compliance checks. OSR’s current regulatory work programme has an emphasis on statistics that have the greatest public value. It considers the key policy debates and where there may be data gaps or a lack of coherence or insight in statistics which support these areas. Our work programme will continue to be informed by stakeholder discussions, as well as internal expertise, and by monitoring new areas for future inclusion. The ongoing delivery of OSR’s regulatory work programme will support several of the Committee’s recommendations.

The Code of Practice highlights the importance of the role of Chief Statistician/Head of Profession across producers of official statistics. It makes clear their role in upholding and advocating the standards of the Code, stating that they should strive to improve statistics and data for the public good, and challenge inappropriate use. The Code also emphasises the responsibility of organisations to consider the views of their statistical leadership in all matters relating to data and statistics. We have recently started a systemic review to understand the characteristics of strong statistical leadership in government. We will be working with Heads of Profession and others with a view to supporting and strengthening their contribution to upholding trustworthiness, quality and value. The review will look at factors that currently support effective statistical leadership and the development of future leadership.

OSR is seeking to expand the growing list of organisations that have made a public commitment to adopt and apply the pillars of the Code of Practice for Statistics voluntarily. The adoption of the Code of Practice’s pillars of trustworthiness, quality and value for data presented as evidence in public debate (e.g. management and performance information, social research) has the potential to offer significant benefits. It can raise standards of analysis and dissemination among organisations that use it and demonstrate transparency to users of these statistics and data used as evidence. Continuing to promote and grow this area of regulatory work will address the Committee’s recommendations and extend them to data and statistics from outside government.

Providing guidance and advice on data quality and appropriate use

OSR fully supports the Committee in highlighting the importance of producers providing guidance and advice to users on the strengths and limitations of statistics, and their appropriate use. OSR will continue to raise this with producers as part of regulatory work, and will monitor and challenge if progress is slow. Since high-profile concerns relating to police recorded crime were raised at the Public Administration Select Committee and the Home Affairs Select Committee, and following the Bean review in 2016, OSR’s assessments have had a stronger focus on quality and quality assurance. OSR will continue to:
• Review, develop and promote its guidance on the Quality Assurance of Administrative Data (QAAD), first published in 2015.
• Work with producers to further emphasise the need for understanding the nature of data sources, as well as the methods and processes for producing the statistics.
• Regulate in accordance with the pillar of Quality in the Code of Practice for Statistics.

Identifying and addressing data gaps

A prominent feature of OSR’s regulatory work is identifying gaps where public debate is not well-informed by statistics. OSR will continue to expand this area of research and regulatory activity. Two current areas in which OSR is investigating the information needs of users and other stakeholders are adult social care and policing, with other topics planned to commence in the next year. These reviews seek to improve the evidence base for public debate across the UK.

OSR will continue to learn from and build on previous successes in identifying data gaps and working with producers to address them. For example, following a series of interventions from OSR highlighting the need for a trustworthy source of statistics on school funding, the Department for Education has recently committed to publishing a comprehensive set of official statistics on this topic.

Through a new series of reports called ‘Insight’, OSR will deliver lessons and insight to a wider audience in an accessible way. Following OSR’s first report on Coherence, published shortly after the PACAC inquiry hearing, the next instalment will focus on identifying and addressing data gaps.

Improving data sharing

Data linkage has great potential to address data gaps and provide additional or new insights to users. Following OSR’s data linkage review which highlighted concerns around the barriers to using linked data effectively, OSR will monitor progress in data sharing and access and will continue to strongly advocate for the wider use of linked data under appropriate conditions.

OSR will continue to collaborate with key organisations in the data landscape, convening workshops involving the Information Commissioner’s Office, the Centre for Data Ethics and Innovation and the Royal Statistical Society. OSR will also commit to reviewing, developing and promoting its regulatory guidance for statistics producers on handling data in ways that are transparent and accountable. These will help establish a common understanding of issues such as data ethics, data ownership and data access, and provide reassurance to the public that their data are safe.

Protecting the role of statistics in public debate
(Recommendations 1,4,11)

Under the Statistics and Registration Service Act 2007, there is a statutory requirement to promote, monitor and safeguard the production and publication of official statistics that serve the public good. OSR’s casework function plays an important role in building public confidence in the production and use of official statistics, intervening when there are significant or persistent issues with how statistics are being used. OSR will commit to developing our casework function further, while noting that we are already proactively monitoring how statistics are used in public debate by initiating a sizeable proportion of cases internally, and are developing automated tools and dashboards to provide intelligence on significant uses.

In the interests of providing greater transparency around the casework process, the Authority and OSR published an Annual Review of Casework in 2018. Seeking to provide further insight to users, the next Annual Review of Casework will be published in September 2019; it builds on the quantitative information by including additional commentary and evidence of the impact of the Authority and OSR’s work. We will send a copy of this to the Committee separately. OSR and the Authority will continually review and develop the transparency and effectiveness of the casework function in line with the Committee’s recommendations.

Developing and leading a better understanding of the public good of statistics in collaboration with others
(Recommendations 1,4,6)

In line with the Committee’s recommendations, OSR will commit to delivering work in three key areas to develop a better understanding of the value of statistics in supporting key policy decisions made by public bodies, as well as decisions made by a much wider range of people in society, such as businesses, charities and community groups:
• Identifying and sharing details of research on the public impact of statistics.
• Developing a framework for judging whether statistics have been used in a misleading way.
• Investigating the understanding among producers, users and other stakeholders, of the role of National Statistics designation in conveying the value of individual sets of statistics.

We recognise that this is a gap in OSR’s past work and filling it will help demonstrate the value of statistics to society.

We note the concerns of the Committee in wanting to see the results of effective engagement in shaping improvements to the nature and content of official statistics. In this vein, OSR will continue to seek to better understand the views and experiences of users, potential users and other stakeholders through its systemic review programme. This programme provides an opportunity to convene mixed groups, involving producers, users and other stakeholders, to facilitate a common understanding of information needs and the challenges in meeting them. More importantly, these reviews provide a means for solutions to be identified and collaborations forged. OSR will continue to drive progress in this area, for example by making additional recommendations to producers on the long-term strategy for improving engagement – a need that is common across the statistical system. We are also in the early stages of a review of public and user engagement that will help inform the preparation of regulatory guidance to push for progress among producers.

Short-term actions
(Recommendations 11,16,17,18,22,23,26)

In addition to the work already underway, OSR will commit to delivering some of the Committee’s recommendations in the short term and will begin work to address them immediately. These largely focus on transparency, including publishing OSR’s annual report separately and reporting to and engaging with parliaments across the UK. We welcome the Committee’s call for greater transparency in OSR’s regulatory decision making and agree that this is a priority. In future, more information will be made available about Regulation Committee meetings, such as the schedule, agendas and minutes.

As part of OSR’s efforts to demonstrate a clearer separation from the rest of the Authority, we will launch our own Twitter account. This visible separation of public communication channels will reinforce OSR’s independence and contribute further to the strengthening of our own distinct voice.

This year OSR sought to more clearly illustrate independence from statistics production by publishing a report summarising our work in 2018-19 as an annex in the Authority’s Annual Report and Accounts. We will commit to separately publishing an Annual Review for 2019/20. This will be laid before Parliament as recommended, as well as other OSR reports that highlight specific concerns in statistical practice.

In addition, we note the Committee’s recommendation to ensure that Departmental Select Committees are apprised of OSR’s findings regarding statistical issues that are relevant to their areas of activity, and will write to the Committees to draw their attention to immediate matters of concern.

OSR will support the Authority as it compiles a framework document that clearly describes the roles and responsibilities of the various parts of the statistical system.

Longer-term developments
(Recommendations 13,14,15,16,22,38)

OSR is considering the means for further physical separation from the rest of the Authority and the potential regulatory benefits from an increase in resource. We will be ambitious in our thinking and will report back to the Committee in early 2020.

Premises and location

OSR is considering future office locations and will evaluate a range of options – based on an identification of the costs and benefits, risks and opportunities – from maintaining the current situation, through co-location with other government organisations, to completely separate premises at all chosen locations.

OSR will take into account the varying experiences of the three sites, which include: existing separation from statistical producers (Edinburgh); sharing a building with other areas of the
Authority including ONS, taking into account the different historical arrangements for provision of space in those offices (London and Newport); and location within a national capital city (London and Edinburgh) or otherwise (Newport).

Resources

Ambitious proposals will be developed that reflect a range of options for OSR with a doubling or tripling of resource. These will be developed with input from a range of stakeholders in the Authority and beyond. Lessons from previous business planning and engagement activities will be incorporated, as well as the approaches taken by other regulators. With a strong focus on how to better deliver OSR’s vision, a range of means to deliver independent, effective, cost-efficient, proactive and comprehensive regulation will be considered. These may include, for example: the expansion of OSR’s regulatory teams; extension of its research programme; and the potential development of data-driven technology tools to support regulatory decision-making. The impact of the options on location will also be considered.

Other actions

There are a number of other areas in which OSR will seek to further demonstrate its independence. To make the separation of OSR from the rest of the Authority clearer, we will also consider the feasibility of establishing a separate website for OSR, as well as the value of appointing one or more external members to the Regulation Committee (members who are not non-executive directors of the Authority) to provide further challenge on regulatory judgments.

Conclusion

OSR’s ambition is clear: we want to enhance public confidence in statistics and data, and we will do this by:
• Upholding the trustworthiness, quality and value of statistics.
• Protecting the role of statistics in public debate.
• Developing and leading a better understanding of the public good of statistics in collaboration with others.

We welcome the Committee’s report and see it as a tremendous opportunity to enhance our profile and strengthen our delivery as the regulator of official statistics. As mentioned throughout our response, we will continue to keep the Committee updated on our work.

Ed Humpherson, Director General for Regulation
Office for Statistics Regulation
September 2019

UK Statistics Authority response to the Lords Economic Affairs Committee report on their use of RPI inquiry

Dear Lord Forsyth,

Following the Committee’s recent report on Measuring Inflation, I write with the UK Statistics Authority’s response to your recommendations.

As your report made clear, the question faced by the Authority in 2012 was whether to make substantive changes to the construction of the Retail Prices Index (RPI). The decision made by the then National Statistician, one widely supported in the consultation at the time, was to leave the RPI unchanged. This decision gave rise in turn to the conclusion that the RPI should be treated as a legacy measure, with no future substantive changes to its construction and methods. That position was endorsed by an independent review of consumer prices led by Paul Johnson, which reported in 2015. In the period since, the Office for National Statistics (ONS) has developed alternative measures of inflation, and the Authority has urged users to move away from the RPI.

Nonetheless, the RPI continues in widespread use. This – along with new advice from ONS on the flaws of the RPI, new advice from the National Statistician’s Advisory Panels, and the urgings of your Committee – convinced the Board that further action was necessary. The then National Statistician put options for the future of the RPI to the UK Statistics Authority’s Board on 26 February 2019.

After receiving this advice, Sir David Norgrove, Chair of the UK Statistics Authority, wrote on behalf of the Board to the previous Chancellor of the Exchequer on 4 March 2019 with the following recommendations:

  • that the publication of the RPI be stopped at a point in the future; and
  • that, in the interim, the shortcomings of the RPI be addressed by bringing the methods of the CPIH into it.

Today the Chancellor has announced his intention to consult on whether to bring the methods in CPIH into RPI between 2025 and 2030, effectively aligning the measures.

The proposals made by the Authority address many of the recommendations made by the Committee in its report. More detailed responses to the other recommendations are set out in the attached Annex.
Yours sincerely,

Sir David Norgrove

Related Links:

ONS oral evidence – (2018)

UKSA oral evidence – (2018)

UKSA follow up written evidence – (2018)

 

Annex: Detailed Response to Specific Recommendations

  1. We heard evidence that the Carli formula, as used in the RPI, produces an upward bias. But expert opinion on the shortcomings of the RPI differs. (Paragraph 99)
  2. There is however broad agreement that the widening of the range of clothing for which prices were collected has produced price data which, when combined with the Carli formula, have led to a substantial increase in the annual rate of growth of RPI. (Paragraph 100)
  3. We are not in a position to reach a conclusion on the question of whether the Carli formula is problematic in areas other than clothing. Given the properties of the Carli formula that may lead to upward bias have long been evident, yet expert opinion still differs, it may be a perpetual debate. (Paragraph 101)

The Authority agrees that there is never likely to be unanimity on the issue of the elementary indices (e.g. Carli, Jevons or Dutot) used in inflation measurement. There is no single universally agreed set of criteria against which to judge them and there are specific examples where each index can be shown to produce either plausible or implausible results. A judgement therefore needs to be taken in the round.
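For readers less familiar with these formulae, the standard definitions of the three elementary indices can be stated in the notation below (the notation, though not the formulae themselves, is our own: for a sample of n items, with base-period prices and current-period prices):

```latex
% Elementary price indices over n items, with base-period prices p_i^0
% and current-period prices p_i^t.
\[
P_{\text{Carli}} = \frac{1}{n}\sum_{i=1}^{n}\frac{p_i^t}{p_i^0},
\qquad
P_{\text{Jevons}} = \prod_{i=1}^{n}\left(\frac{p_i^t}{p_i^0}\right)^{1/n},
\qquad
P_{\text{Dutot}} = \frac{\sum_{i=1}^{n} p_i^t}{\sum_{i=1}^{n} p_i^0}.
\]
```

By the inequality of arithmetic and geometric means, the Carli index always equals or exceeds the Jevons index over the same price relatives, and the Carli fails the time-reversal test (an index computed forwards and then backwards does not return to unity). These properties underlie the upward bias discussed in the Committee’s recommendations above.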

Our view is that the Carli is not generally a good index. A thorough exploration of the issues related to the Carli index was set out in both Chapter 10 of the independent review of consumer prices by Paul Johnson and the 2012 review of UK consumer price statistics conducted by Erwin Diewert, a leading authority on index numbers.

This view is supported by international practice and the National Statistician’s Technical Advisory Panel for Consumer Prices. Many technical manuals and academic papers also highlight the undesirable properties of the Carli index. Regulations on the production of the Harmonised Index of Consumer Prices go further and state that the Carli should not be used unless it can be demonstrated to behave in a similar way to the Jevons or Dutot.

We agree that the interaction between the Carli index and the collection of clothing prices created an increase in the rate of RPI inflation in 2010. It was this event that led ONS and the Authority to put in place a programme of work that led to the 2012 consultation on the future of RPI.

4. Given its widespread use, it is surprising that the UK Statistics Authority is treating RPI as a ‘legacy measure’. The programme of periodic methodological improvements should be resumed. (Paragraph 116)

5. We are unconvinced by the National Statistician’s suggestion that in publishing statistics that serve the public good, the interests of those who may be affected negatively by any change should be taken into account. It is not clear from section 7 of the Statistics and Registration Service Act 2007 that this is a relevant consideration for the statistical authorities to be taking into account when they are producing and publishing statistics. (Paragraph 117)

6. What is clear from section 7 is that the UK Statistics Authority has to promote and safeguard the quality of official statistics, which includes their impartiality, accuracy and relevance, and coherence with other statistics. In publishing an index which it admits is flawed but refuses to fix, the Authority could be accused of failing in its statutory duties. (Paragraph 118)

7. We believe section 7 requires the Authority to attempt to fix the issue with clothing prices. Section 21 may require the Authority to consult the Bank of England over the change and obtain the consent of the Chancellor of the Exchequer, however this provision cannot be cited as a reason for not requesting the change in the first place. (Paragraph 119)

8. If the Authority requests the change, the Chancellor of the Exchequer should consent to it. It is untenable for an official statistic that is used widely to continue to be published with flaws that are admitted openly. (Paragraph 120)

The announcements by the UK Statistics Authority and HM Treasury on 4 September deal with this substantive issue raised in these recommendations, and are summarised in the covering letter to this response.

9. While we accept the arguments that consumer price indices have different purposes, we do not believe this warrants the production of multiple indices for government use. Two different measures of inflation allow a government to engage in ‘inflation shopping’. (Paragraph 134)

10. The Government should address the imbalance in its use of consumer price indices. It risks undermining public confidence in economic statistics. It is encouraging to see that the present Government is taking some steps to address the imbalance, for example with the change to uprating business rates by CPI and recent discussions around rail fares. (Paragraph 135)

11. In future there should be one measure of general inflation that is used by the Government for all purposes. This would be simpler and easier for the public to understand. But the UK Statistics Authority should also continue to develop the Household Cost Indices, discussed below. (Paragraph 136)

We welcome the Committee’s recommendation that the Household Cost Indices should continue to be developed. On 28 June 2019, the National Statistician outlined the next steps in the development of these Indices.

12. We disagree with the UK Statistics Authority that RPI does not have the potential to become a good measure of inflation. With the improvements to RPI that we set out in the previous chapter, and a better method of capturing owner-occupier housing costs as discussed below, we believe RPI would be a viable candidate for the single general measure of inflation. (Paragraph 139)

13. We are not convinced by the use of rental equivalence in CPIH to impute owner-occupier housing costs. The UK Statistics Authority, together with its stakeholder and technical advisory panels and a consultation of a wide range of interested parties, should agree on the best method for capturing owner-occupier housing costs in a consumer price index. (Paragraph 153)

14. Once a method of capturing owner-occupier housing costs has been agreed, the UK Statistics Authority, after consulting the stakeholder and technical panels, should decide which index to recommend as the Government’s single general measure of inflation. The Government should have adopted the preferred candidate as its single general measure of inflation within five years. (Paragraph 154)

Owner occupiers’ housing (OOH) costs are one of the most challenging aspects of inflation to measure. There is also no single approach that will be correct in all circumstances, as the choice will depend on the purpose of the index and also practical issues around data availability. In light of this, ONS has spent the last 10 years developing and consulting on its approaches to owner occupiers’ housing costs.

The development of an OOH measure for CPI was first considered in 2009 by the Consumer Prices Advisory Committee (CPAC). The committee then spent the next three years investigating different approaches to measuring OOH costs. In September 2010 it narrowed down the options to two – net acquisitions and rental equivalence – which it evaluated in detail against the five dimensions of statistical quality defined by the European Statistical System. The Committee finally agreed on rental equivalence in April 2012, giving consideration to both conceptual appropriateness and how well the index could be calculated in practice.

A first consultation was launched in the summer of 2012, in which users were asked about rental equivalence. The responses were fairly evenly split between support for rental equivalence, net acquisitions and neither approach. The National Statistician chose rental equivalence reflecting the quality of the underlying data available and whether asset prices were appropriately treated. The process is described in more detail in Appendix A of the CPIH Compendium.

Paul Johnson’s review of consumer prices was published in January 2015. This looked again at CPAC’s recommendation to use the rental equivalence method. It concluded the underlying assumptions are reasonable in a UK context and that the measure is based on a large, detailed source of underlying data. Therefore, the Review recommended that ONS should continue to use the rental equivalence measure.

A further consultation was conducted on the findings of the Johnson Review. Responses to the review on CPIH and OOH were again mixed, highlighting that users are unlikely to come to an agreement on the most appropriate choice for measuring OOH costs.

The Office for Statistics Regulation’s 2016 re-assessment of CPIH as a National Statistic noted that ‘there is some disagreement among users about the concepts and methods…’ Work to address its recommendations resulted in a wide-ranging process of user engagement on CPIH and the publication of numerous supporting materials: the CPIH Compendium, which articulates the rationale for ONS’s choice of rental equivalence alongside the pros and cons of each approach; an ongoing published comparison of alternative OOH measures; and documentation on the various users and uses of our consumer price inflation statistics.

ONS have also looked at international practice and found widespread use of the rental equivalence measure. The approach taken by different countries is summarised in the CPIH Compendium. Of the 40 countries considered, and discounting the 15 that exclude OOH altogether, the most common approach is rental equivalence (12 countries). It is also worth noting that the method requires a reasonably large rental market to work, so many countries may be constrained in their choice by the availability of data. The countries that use rental equivalence include the United States, Germany, Norway and the Netherlands.

In light of the 10 years of development and consultation, ONS are not minded to undertake any further engagement with users and experts specifically on rental equivalence and owner-occupier housing costs. There is never likely to be agreement on a single approach. ONS views rental equivalence as the correct approach conceptually for an economic measure of inflation, and one where sufficient data is available to make it practical. Of course, they remain committed to ongoing monitoring and development of the CPIH and the Household Cost Indices.

15. Our recommendations will not however solve the issue of index or inflation shopping immediately. The Government will need to take action in the interim to address this. (Paragraph 155)

16. While the single general measure is being determined, the Government should switch to CPI for uprating purposes in all areas where it is not bound by contract to use RPI (except for the interest rate on student loans which, as we recommended in our Treating Students Fairly report, should be set at the ten year gilt rate thus reflecting the Government’s cost of borrowing). (Paragraph 156)

17. The Government should begin to issue CPI-linked gilts and stop issuing RPI-linked gilts. We heard evidence to suggest there was sufficient demand to make a viable market. (Paragraph 170)

18. Once the long-term single official measure of inflation has been agreed, gilts should begin to be issued that are linked to that index. The prospectuses for new issuances of index-linked gilts should be clear that the inflation index will change to the Government’s single general measure of inflation once it has been agreed. (Paragraph 171)

Recommendations (15) to (18) are primarily directed at HM Government and the Authority has nothing to say on those issues. We continue to urge the Government to cease to use the RPI for its own purposes where practical.

19. Once the single general measure of inflation has been introduced, the UK Statistics Authority and the Government should decide whether RPI should continue to be published in its existing form for the purposes of existing RPI-linked contracts, or whether a programme of adjustments should be made to the RPI so that it converges on the single general measure. (Paragraph 194)

20. To avoid disruption, we envisage any programme of convergence would take place gradually, over a sufficiently long time, and that the plan for that should be published at the outset. (Paragraph 195)

21. We note that the consent of the Chancellor of the Exchequer to changes to RPI that cause material detriment to index-linked gilts holders is no longer required after the last issuance to which that clause relates expires in 2030. (Paragraph 196)

We strongly agree that any changes to the RPI, or the cessation of its publication, need to be carefully planned. The Authority and ONS have been discussing the mechanics of any changes with the Government in the run-up to the 4 September announcement.

UK Statistics Authority oral evidence to the Public Administration and Constitutional Affairs Committee’s Governance of Statistics inquiry

On Tuesday 2 April 2019, Sir David Norgrove, Chair, UK Statistics Authority; John Pullinger, National Statistician, UK Statistics Authority; and Ed Humpherson, Director General for Regulation, UK Statistics Authority gave evidence to the Public Administration and Constitutional Affairs Committee as part of its Governance of Statistics inquiry.

A transcript of which has been published on the UK Parliament’s website.