Office for National Statistics correspondence to the Public Administration and Constitutional Affairs Committee regarding cases and deaths from COVID-19

Dear Mr Wragg,

Thank you for your letter of 14 April regarding both Department for Health and Social Care (DHSC) data on cases of, and deaths from, COVID-19, and the Office for National Statistics (ONS) mortality data. I wanted to note, too, that as the UK’s National Statistician and Head of the Government Statistical Service, I have a responsibility to ensure the presentation of all data and analysis across Government is useful, transparent about its caveats, and of a high quality.

When making international comparisons it is important to compare figures that have been collected in the same way, using the same definitions. At this stage of the pandemic, international comparisons are difficult due to the differences in data collection. Most countries will be collecting data from hospital settings, some will require a positive test for COVID-19, others will include a diagnosis, and some will include any deaths with any suspicion of COVID-19. While they are not as complete as the ONS figures, the DHSC data are the nearest to international comparators.

The ONS weekly deaths publication is the most comprehensive source of deaths that have occurred in the pandemic, as it is complete, includes all the categories mentioned previously and all settings (hospital, care homes and in the community). However, although the UK produces these figures more quickly than most, these data are not as timely as the DHSC daily published numbers, which
can be used as a broad comparator for deaths that have occurred since the ONS publication. Because the ONS figures rely on the registration of deaths, producing them more quickly would place a further burden on medical practitioners to register deaths on the day of death.

You also asked about policy decisions. Evidence is vital for good policy making, and for the evaluation of that policy. When providing that evidence base, we must ensure that accuracy and timeliness are communicated well. The ONS data provide a comprehensive and accurate measure of deaths according to the registration system. The DHSC daily deaths publication is less comprehensive but more timely, and as such it is an important lead indicator of the spread and severity of the pandemic. The ONS is currently working with Public Health England, the Care Quality Commission, the Devolved Administrations and leading academics (through the Scientific Pandemic Influenza Group on Modelling, SPI-M) to reconcile the timely data from DHSC with the accuracy and coverage of the ONS data, to provide an even better evidence base for policy making. This work will be delivered over the next few weeks, improving the quality of the forecasts given to the Scientific Advisory Group for Emergencies (SAGE) to help manage the crisis.

It is a strength of the UK statistical system that we can provide equality of access to data on which important decisions will rely, while highlighting that they serve different purposes, as we did in a statement published on our website on 31 March. The ONS publication is more comprehensive, but the daily DHSC numbers are the most timely available. They also serve the public good by providing insight into the pandemic in almost real time, even though we must note the limitations of these data.

Ed Humpherson, Director General for Regulation and Head of the Office for Statistics Regulation (OSR) will write separately on the approach of OSR to published information on COVID-19.

Please do let me know if I can be of any further assistance to yourself and the Committee.

Yours sincerely,
Professor Sir Ian Diamond


UK Statistics Authority written evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the governance of statistics

Dear Bernard,

I write to offer the UK Statistics Authority response to the Public Administration and Constitutional Affairs Committee report on ‘Governance of official statistics: redefining the dual role of the UK Statistics Authority; and re-evaluating the Statistics and Registration Service Act 2007.’

The Authority thanks the Committee for this report and its recommendations. The following note lays out our response and appended to this are two further responses: one from Ed Humpherson, Director General for Regulation, on behalf of the Office for Statistics Regulation (OSR), and another from Jonathan Athow, interim National Statistician, on behalf of the Government Statistical Service (GSS) and Office for National Statistics (ONS).

The Committee’s report comes at an exciting time for the Authority, with the newly-appointed National Statistician, Professor Sir Ian Diamond, due to take office on 22 October. In parallel, the Authority is also developing its next strategy, to follow on from Better Statistics, Better Decisions (BSBD), and which we intend to launch next year. As we do so, Sir Ian, the Board and I will consider the conclusions and recommendations of the Committee very carefully.

As set out below, we will continue to keep the Committee updated on our work, and specifically on the progress made against these recommendations.

Yours ever,




The Authority’s response considers the Committee’s recommendations around understanding demands for data and leadership on data; on separation of functions, reporting to Parliament, transparency and the role of the non-executive directors; and the latest update on the Retail Prices Index (RPI).

Understanding demands for data
(Recommendations 1,2,3,4,6,38)

As the Authority makes clear within its Code of Practice for Statistics, users of statistics and data should be at the centre of statistical production; their needs should be understood, their
views sought and acted upon, and their use of statistics supported. During early discussions on the Authority’s forthcoming strategy, Board members have agreed that this principle will continue to be at the heart of our work over the coming five years. With that in mind, the Authority is leading the way by planning an ambitious programme of work to improve the statistical system’s collective understanding of demands for statistics across the UK.

During the initial phase of this work, staff from across the Office for National Statistics (ONS), the Office for Statistics Regulation (OSR), and the Government Statistical Service (GSS) are identifying what sources of intelligence already exist about demands for data (including, as the Committee suggests, the Institute for Government’s report on data gaps). This work will be followed by a range of events with users of statistics including policymakers, Parliamentarians and expert users, as well as members of the public in a series of regional road-shows. As the appended note from the National Statistician illustrates, many gaps have been identified and filled following conversations with users, but there is still more to do.

This exercise is intended not only to provide a clearer picture for the statistical system of demands and current data gaps, but also to forge links with users of statistics across the UK. It is important to respond to the needs and diverging policy demands across the UK, keeping in mind that the Authority is accountable to all four legislatures. We look forward to working closely with the Committee as part of this work. Further detail on the Committee’s recommendations on sector-by-sector reports and ongoing user engagement is contained within the appended notes from the OSR and the National Statistician respectively.

Leadership on data
(Recommendations 36,37)

The Authority agrees with the Committee’s recommendation that our responsibility for the statistical system necessitates a leading role across areas such as technology, data science, data sharing and linking and data ethics. Improved access to administrative data in particular remains key to delivering improvements for users of statistics, and we continue to drive work to make best use of the gateways established in the Digital Economy Act. As set out in further detail in the National Statistician’s response, the Authority’s ambition is to be a leader in the use of official statistics and government data both domestically and internationally, by developing world-class technical skills underpinned by a robust legal and ethical framework. The role of the ONS in the United Nations (UN) Global Working Group on Big Data for Official Statistics provides a clear example of the strength and potential in the UK statistical system.

Separation of functions
(Recommendations 13,14,15,16,17,18,26)

The Authority recognises the challenges created by its statutory responsibility to oversee both the production and regulation of official statistics but agrees with witnesses to the Committee that legislative change is unnecessary.

In recent years, the Authority has enhanced separation between its regulatory and production responsibilities by establishing a distinct executive office for regulation, the OSR. These arrangements provide the OSR with full decision-making autonomy from the ONS, but we accept that this separation can be made more visible to those outside the statistical system.

With that in mind, the Authority Board has considered the Committee’s recommendations at length, and has decided to introduce the following changes to its policies and practices:

• The Regulation Committee terms of reference will be updated to reflect the Committee’s responsibility for overseeing the budget of the OSR.
• Further, the Authority will explore arrangements whereby the Head of the OSR could be appointed as a Secondary Accounting Officer for the Department in respect of its regulatory responsibilities.
• The Authority will publish an updated framework document, setting out these roles and responsibilities and those across the statistical system more broadly, within its forthcoming strategy.
• The Authority is also exploring options for developing its website, to emphasise and make clearer the separation that exists between the ONS and the OSR.
• And we will continue to publish a separate Annual Report for the OSR distinct from the work of the ONS.

The Committee also recommended that the OSR consider changing location. As Mr Humpherson explains in his appended note, the OSR is currently considering the business case for such a move and will keep the Committee appraised of its conclusions.

Reporting to Parliament
(Recommendations 11,14,22,23,24)

The Authority is happy to continue and expand its engagement and reporting to Parliament and this Committee, and is grateful to the Committee for its continued support of the Authority’s independence. The Authority is always content to appear in front of the Committee, including at an annual hearing following the publication of its annual report and accounts. The OSR will continue to publish an annual report, discussed in the appended note from Mr Humpherson, which will contribute to this hearing.

The Authority will provide updates at these annual evidence sessions on progress in implementing the recommendations in this report. The Authority is also aware of the recommendation to report annually on progress in implementing other recommendations, both of external reviews and of parliamentary select committees, and would be happy to do this via the annual hearing with this Committee and regular correspondence from the Chair. Regarding more frequent correspondence with the Committee, the Director General for Regulation and the Chair of the Authority will inform the Committee when making a significant intervention in the statistical system. As the appended note from Mr Humpherson explains, the OSR will increase its engagement with departmental select committees as another means to ensure Parliament is aware of data and statistical issues. This is in addition to the Chair’s established updates to this Committee, which will continue with regular frequency.

Transparency
(Recommendations 11,14,20,25,26,34)

The Authority is committed to transparency. In response to the Committee’s recommendations, we will ensure Authority Board minutes are published in a timely fashion following their approval, and that they capture the key points of non-executive discussion. In future we will also publish the Authority’s Regulation Committee minutes.

Following approval at the next Regulation Committee meeting in October, July’s minutes and agenda will be published on the Authority’s website. The Regulation Committee’s Terms of Reference will be updated accordingly. The Authority website will also be updated to include future meeting dates for the Regulation Committee, along with those for the Authority Board.

Non-Executive Directors
(Recommendations 20,25,34)

The Authority’s non-executive directors play a crucial role in the work of the Board and in the exercise of the Authority’s statutory duties. Non-executive directors also sit on the Authority’s
sub-Committees and will continue to do so in future. In addition, in recent years the Authority’s non-executive directors have increasingly played a role outside the Boardroom, regularly attending sessions with staff to offer support and challenge on key programmes of work, including the 2021 Census, changes to the National Accounts, challenges in accessing administrative data and the work of the OSR. This is alongside planned external engagement activities, such as an event in September with members of the Senedd in Cardiff; we would be pleased to update the Committee on this too.

The Authority expects there to be a competition for new non-executive directors later this year, and in the course of this process, we can consider the case for increasing the number
of working days for non-executive directors.

Retail Prices Index (RPI)
(Recommendation 30)

As the Committee are aware, the Authority published its response to the report ‘Measuring Inflation’ of the House of Lords Economic Affairs Committee on 4 September. This included the Authority’s proposals for the Retail Prices Index (RPI). As noted in our response, the Advisory Panel on Consumer Prices provided advice to the National Statistician on the composition of the RPI in light of the Economic Affairs Committee report. Taking account of that advice, the then National Statistician concluded that the current position was unsatisfactory and put options for the future of the RPI to the Authority Board on 26 February 2019.

After receiving this advice, the Chair of the Authority wrote on behalf of the Board to the previous Chancellor of the Exchequer on 4 March 2019 with the following recommendations:

• That the publication of the RPI be stopped at a point in future.
• In the interim, the shortcomings of the RPI should be addressed by bringing the methods of the CPIH into it.

In his response to the Lords Economic Affairs Committee, the Chancellor announced his intention to consult on whether to bring the methods of the CPIH into the RPI between 2025 and 2030, effectively aligning the measures. The Authority will consult on the method of making this change.

The role of the Authority is to promote and safeguard official statistics. We have been clear that the RPI is not a good measure, at times significantly overestimating inflation and at other times underestimating it, and have consistently urged all – in Government and the private sector – to stop using it. However, the RPI is unique in that we need consent from the Chancellor to make certain changes, such as the one we have proposed.

Although we regret that no change will occur before 2025, we welcome the Chancellor’s intention to consult on resolving the current issues with the RPI. We continue to urge the Government and others to cease using the RPI: it would be wrong for the Government to continue to use a measure of inflation which it itself accepts is flawed when it has the opportunity to change. Looking ahead to the proposed consultations in January 2020, the Authority looks forward to both Committees’ continued engagement on this matter.

Office for Statistics Regulation response


This response focuses on the aspects of the Committee’s report that relate to the Office for Statistics Regulation (OSR). Overall, the Committee has recognised the importance of OSR’s role, endorses a strong, separate identity for OSR as the Authority’s regulatory arm, and ultimately provides a basis for enhancing OSR’s role and identity. Since giving evidence to the Committee alongside Sir David Norgrove and John Pullinger, we have published “OSR’s vision: what we do and how we do it”, which resonates closely with many of the findings and recommendations outlined in the report. The essence of OSR’s vision is that statistics should serve the public good, and that in a world of abundant data, we want people to have confidence in statistics produced by the public sector. In order to realise this vision, our work focuses on three key themes:
• Upholding the trustworthiness, quality and value of statistics.
• Protecting the role of statistics in public debate.
• Developing and leading a better understanding of the public good of statistics in collaboration with others.

These themes will underpin OSR’s approach to delivering the recommendations outlined by the Committee. This note presents OSR’s response and details plans for driving improvements in these areas. It confirms OSR’s ambitions to enhance its regulatory role and identity. These ambitions can be achieved by enhancing work already underway; immediate implementation of some of the Committee’s recommendations; and evaluating options for more significant growth in OSR’s remit and resourcing.

Actions already underway

Many of the Committee’s recommendations speak to areas in which OSR is already operating and has an active role. Those relating to conducting user research, carrying out sector-by-sector reviews, identifying data gaps and reviewing quality information are of particular relevance to OSR’s current and planned regulatory activity. In this regard, this inquiry has provided a platform for highlighting the importance of OSR’s work, and for raising these important issues with the wider Authority, which will support and strengthen OSR’s work across the Government Statistical Service (GSS).

Upholding the trustworthiness, quality and value of statistics
(Recommendations 1,6,7,37)

OSR will continue to uphold the trustworthiness, quality and value of statistics and drive improvements to statistics through its regulatory work programme of assessments and compliance checks. OSR’s current regulatory work programme has an emphasis on statistics that have the greatest public value. It considers the key policy debates and where there may be data gaps or a lack of coherence or insight in statistics which support these areas. Our work programme will continue to be informed by stakeholder discussions, as well as internal expertise, and by monitoring new areas for future inclusion. The ongoing delivery of OSR’s regulatory work programme will support several of the Committee’s recommendations.

The Code of Practice highlights the importance of the role of Chief Statistician/Head of Profession across producers of official statistics. It makes clear their role in upholding and advocating the standards of the Code, stating that they should strive to improve statistics and data for the public good, and challenge inappropriate use. The Code also emphasises the responsibility of organisations to consider the views of their statistical leadership in all matters relating to data and statistics. We have recently started a systemic review to understand the characteristics of strong statistical leadership in government. We will be working with Heads of Profession and others with a view to supporting and strengthening their contribution to upholding trustworthiness, quality and value. The review will look at factors that currently support effective statistical leadership and the development of future leadership.

OSR is seeking to expand the growing list of organisations that have made a public commitment to adopt and apply the pillars of the Code of Practice for Statistics voluntarily. The adoption of the Code of Practice’s pillars of trustworthiness, quality and value for data presented as evidence in public debate (e.g. management and performance information, social research) has the potential to offer significant benefits. It can raise standards of analysis and dissemination among organisations that use it and demonstrate transparency to users of these statistics and data used as evidence. Continuing to promote and grow this area of regulatory work will address the Committee’s recommendations and extend them to data and statistics from outside government.

Providing guidance and advice on data quality and appropriate use

OSR fully supports the Committee in highlighting the importance of producers providing guidance and advice to users on the strengths and limitations of statistics, and their appropriate use. OSR will continue to raise this with producers as part of regulatory work, and will monitor and challenge if progress is slow. Since high-profile concerns relating to police recorded crime were raised at the Public Administration Select Committee and the Home Affairs Select Committee, and following the Bean review in 2016, OSR’s assessments have had a stronger focus on quality and quality assurance. OSR will continue to:
• Review, develop and promote its guidance on the Quality Assurance of Administrative Data (QAAD), first published in 2015.
• Work with producers to further emphasise the need to understand the nature of data sources, as well as the methods and processes for producing the statistics.
• Regulate in accordance with the pillar of Quality in the Code of Practice for Statistics.

Identifying and addressing data gaps

A prominent feature of OSR’s regulatory work is identifying gaps where public debate is not well-informed by statistics. OSR will continue to expand this area of research and regulatory activity. Two current areas in which OSR is investigating the information needs of users and other stakeholders are adult social care and policing, with other topics planned to commence in the next year. These reviews seek to improve the evidence base for public debate across the UK.

OSR will continue to learn from and build on previous successes in identifying data gaps and working with producers to address them. For example, following a series of interventions from OSR highlighting the need for a trustworthy source of statistics on school funding, the Department for Education has recently committed to publishing a comprehensive set of official statistics on this topic.

Through a new series of reports called ‘Insight’, OSR will deliver lessons and insight to a wider audience in an accessible way. Following OSR’s first report on Coherence, published shortly after the PACAC inquiry hearing, the next instalment will focus on identifying and addressing data gaps.

Improving data sharing

Data linkage has great potential to address data gaps and provide additional or new insights to users. Following OSR’s data linkage review which highlighted concerns around the barriers to using linked data effectively, OSR will monitor progress in data sharing and access and will continue to strongly advocate for the wider use of linked data under appropriate conditions.

OSR will continue to collaborate with key organisations in the data landscape, convening workshops involving the Information Commissioner’s Office, the Centre for Data Ethics and Innovation and the Royal Statistical Society. OSR will also commit to reviewing, developing and promoting its regulatory guidance for statistics producers on handling data in ways that are transparent and accountable. These will help establish a common understanding of issues such as data ethics, data ownership and data access, and provide reassurance to the public that their data are safe.

Protecting the role of statistics in public debate
(Recommendations 1,4,11)

Under the Statistics and Registration Service Act 2007, the Authority has a statutory duty to promote, monitor and safeguard the production and publication of official statistics that serve the public good. OSR’s casework function plays an important role in building public confidence in the production and use of official statistics, intervening when there are significant or persistent issues with how statistics are being used. OSR will commit to developing its casework function further, while noting that it already proactively monitors how statistics are used in public debate, initiating a sizeable proportion of cases internally, and is developing automated tools and dashboards to provide intelligence on significant uses.

In the interests of providing greater transparency around the casework process, the Authority and OSR published an Annual Review of Casework in 2018. Seeking to provide further insight to users, the next Annual Review of Casework will be published in September 2019 and will build on the quantitative information by including additional commentary and evidence of the impact of the Authority and OSR’s work. We will send a copy of this to the Committee separately. OSR and the Authority will continually review and develop the transparency and effectiveness of the casework function in line with the Committee’s recommendations.

Developing and leading a better understanding of the public good of statistics in collaboration with others
(Recommendations 1,4,6)

In line with the Committee’s recommendations, OSR will commit to delivering work in three key areas to develop a better understanding of how valuable statistics are in supporting key policy decisions made by public bodies, as well as by a much wider range of people in society, such as businesses, charities and community groups:
• Identifying and sharing details of research on the public impact of statistics.
• Developing a framework for judging whether statistics have been used in a misleading way.
• Investigating the understanding among producers, users and other stakeholders, of the role of National Statistics designation in conveying the value of individual sets of statistics.

We recognise that this is a gap in OSR’s past work and filling it will help demonstrate the value of statistics to society.

We note the concerns of the Committee in wanting to see the results of effective engagement in shaping improvements to the nature and content of official statistics. In this vein, OSR will continue to seek to better understand the views and experiences of users, potential users and other stakeholders through its systemic review programme. This programme provides an opportunity to convene mixed groups, involving producers, users and other stakeholders, to facilitate a common understanding of information needs and the challenges in meeting them. More importantly, it provides a means for solutions to be identified and collaborations forged. OSR will continue to drive progress in this area, for example by making additional recommendations to producers on the long-term strategy for improving engagement – a need that is common across the statistical system. We are also in the early stages of a review of public and user engagement that will help inform the preparation of regulatory guidance to push for progress among producers.

Short-term actions
(Recommendations 11,16,17,18,22,23,26)

In addition to the work already underway, OSR will commit to delivering some of the Committee’s recommendations in the short term and will begin work to address them immediately. These largely focus on transparency, including publishing OSR’s annual report separately and reporting to and engaging with parliaments across the UK. We welcome the Committee’s call for greater transparency in OSR’s regulatory decision making and agree that this is a priority. In future, more information will be made available about Regulation Committee meetings, such as the schedule, agendas and minutes.

As part of OSR’s efforts to demonstrate a clearer separation from the rest of the Authority, we will launch our own Twitter account. This visible separation of public communication channels will reinforce OSR’s independence and contribute further to the strengthening of our own distinct voice.

This year OSR sought to more clearly illustrate independence from statistics production by publishing a report summarising our work in 2018-19 as an annex in the Authority’s Annual Report and Accounts. We will commit to separately publishing an Annual Review for 2019/20. This will be laid before Parliament as recommended, as well as other OSR reports that highlight specific concerns in statistical practice.

In addition, we note the Committee’s recommendation to ensure that Departmental Select Committees are appraised of OSR’s findings regarding statistical issues that are relevant to their areas of activity, and will write to the Committees to draw their attention to immediate matters of concern.

OSR will support the Authority as it compiles a framework document that clearly describes the roles and responsibilities of the various parts of the statistical system.

Longer-term developments
(Recommendations 13,14,15,16,22,38)

OSR is considering the means for further physical separation from the rest of the Authority and the potential regulatory benefits from an increase in resource. We will be ambitious in our thinking and will report back to the Committee in early 2020.

Premises and location

OSR is considering future office locations and will evaluate a range of options, based on an identification of the costs and benefits, risks and opportunities, ranging from maintaining the current arrangements, through co-location with other government organisations, to completely separate premises at all chosen locations.

OSR will take into account the varying experiences of the three sites, which include: existing separation from statistical producers (Edinburgh); sharing a building with other areas of the
Authority including ONS, taking into account the different historical arrangements for provision of space in those offices (London and Newport); and location within a national capital city (London and Edinburgh) or otherwise (Newport).


Resourcing

Ambitious proposals will be developed that reflect a range of options for OSR with a doubling or tripling of resource. These will be developed with input from a range of stakeholders in the Authority and beyond. Lessons from previous business planning and engagement activities will be incorporated, as well as the approaches taken by other regulators. With a strong focus on how to better deliver OSR’s vision, a range of means to deliver independent, effective, cost-efficient, proactive and comprehensive regulation will be considered. These may include, for example: the expansion of OSR’s regulatory teams; extension of its research programme; and the potential development of data-driven technology tools to support regulatory decision-making. The impact of the options on location will also be considered.

Other actions

There are a number of other areas in which OSR will seek to further demonstrate its independence. To make the separation of OSR from the rest of the Authority clearer, OSR will also consider the feasibility of establishing a separate website, and the value of having one or more external members on the Regulation Committee (who are not non-executive directors of the Authority) to provide further challenge on regulatory judgments.


OSR’s ambition is clear: we want to enhance public confidence in statistics and data, and we will do this by:
• Upholding the trustworthiness, quality and value of statistics.
• Protecting the role of statistics in public debate.
• Developing and leading a better understanding of the public good of statistics in collaboration with others.

We welcome the Committee’s report and see it as a tremendous opportunity to enhance our profile and strengthen our delivery as the regulator of official statistics. As mentioned throughout our response, we will continue to keep the Committee updated on our work.

Ed Humpherson, Director General for Regulation
Office for Statistics Regulation
September 2019

UK Statistics Authority response to the Lords Economic Affairs Committee report on its inquiry into the use of RPI

Dear Lord Forsyth,

Following the Committee’s recent report on Measuring Inflation, I write with the UK Statistics Authority’s response to your recommendations.

As your report made clear, the question faced by the Authority in 2012 was whether to make substantive changes to the construction of the Retail Prices Index (RPI). The decision made by the then National Statistician, one widely supported in the consultation at the time, was to leave the RPI unchanged. This decision gave rise in turn to the conclusion that the RPI should be treated as a legacy measure, with no future substantive changes to its construction and methods. That position was endorsed by an independent review of consumer prices led by Paul Johnson, which reported in 2015. In the period since, the Office for National Statistics (ONS) has developed alternative measures of inflation, and the Authority has urged users to move away from the RPI.

Nonetheless, the RPI continues in widespread use. This – along with new advice from ONS on the flaws of the RPI, new advice from the National Statistician’s Advisory Panels, and the urgings of your Committee – convinced the Board that further action was necessary. The then National Statistician put options for the future of the RPI to the UK Statistics Authority’s Board on 26 February 2019.

After receiving this advice, Sir David Norgrove, Chair of the UK Statistics Authority, wrote on behalf of the Board to the previous Chancellor of the Exchequer on 4 March 2019 with the following recommendations:

  • that the publication of the RPI be stopped at a point in future; and
  • in the interim, the shortcomings of the RPI should be addressed by bringing the methods of the CPIH into it.

Today the Chancellor has announced his intention to consult on whether to bring the methods in CPIH into RPI between 2025 and 2030, effectively aligning the measures.

The proposals made by the Authority address many of the recommendations made by the Committee in its report. More detailed responses to the other recommendations are set out in the attached Annex.
Yours sincerely,

Sir David Norgrove

Related Links:

ONS oral evidence – (2018)

UKSA oral evidence – (2018)

UKSA follow up written evidence – (2018)


Annex: Detailed Response to Specific Recommendations

  1. We heard evidence that the Carli formula, as used in the RPI, produces an upward bias. But expert opinion on the shortcomings of the RPI differs. (Paragraph 99)
  2. There is however broad agreement that the widening of the range of clothing for which prices were collected has produced price data which, when combined with the Carli formula, have led to a substantial increase in the annual rate of growth of RPI. (Paragraph 100)
  3. We are not in a position to reach a conclusion on the question of whether the Carli formula is problematic in areas other than clothing. Given the properties of the Carli formula that may lead to upward bias have long been evident, yet expert opinion still differs, it may be a perpetual debate. (Paragraph 101)

The Authority agrees that there is never likely to be unanimity on the issue of the elementary indices (e.g. Carli, Jevons or Dutot) used in inflation measurement. There is no single universally agreed set of criteria against which to judge them and there are specific examples where each index can be shown to produce either plausible or implausible results. A judgement therefore needs to be taken in the round.
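For reference, the three elementary indices named above take the following standard forms for n matched items, with base-period prices p_i^0 and current-period prices p_i^1 (a textbook formulation added here for illustration, not drawn from the Authority's response):

```latex
% Elementary price indices over n matched items, where item i costs
% p_i^0 in the base period and p_i^1 in the current period.
P_{\mathrm{Carli}}  = \frac{1}{n}\sum_{i=1}^{n}\frac{p_i^{1}}{p_i^{0}},
\qquad
P_{\mathrm{Dutot}}  = \frac{\sum_{i=1}^{n} p_i^{1}}{\sum_{i=1}^{n} p_i^{0}},
\qquad
P_{\mathrm{Jevons}} = \prod_{i=1}^{n}\left(\frac{p_i^{1}}{p_i^{0}}\right)^{1/n}.
```

One widely cited shortcoming of the Carli is its failure of the time-reversal test: computing a Carli index forwards and then backwards gives a product of at least one (a consequence of the AM-HM inequality), with equality only when all price relatives are equal, so a chained Carli can drift upwards even when every price returns to its starting level. The Jevons and Dutot both satisfy time reversal.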

Our view is that the Carli is not generally a good index. A thorough exploration of the issues related to the Carli index was set out in both Chapter 10 of the independent review of consumer prices by Paul Johnson and the 2012 review of UK consumer price statistics conducted by Erwin Diewert, a leading authority on index numbers.

This view is supported by international practice and the National Statistician’s Technical Advisory Panel for Consumer Prices. Many technical manuals and academic papers also highlight the undesirable properties of the Carli index. Regulations on the production of the Harmonised Index of Consumer Prices go further and state that the Carli should not be used unless it can be demonstrated to behave in a similar way to the Jevons or Dutot.

We agree that the interaction between the Carli index and the collection of clothing prices created an increase in the rate of RPI inflation in 2010. It was this event that led ONS and the Authority to put in place a programme of work that led to the 2012 consultation on the future of RPI.

4. Given its widespread use, it is surprising that the UK Statistics Authority is treating RPI as a ‘legacy measure’. The programme of periodic methodological improvements should be resumed. (Paragraph 116)

5. We are unconvinced by the National Statistician’s suggestion that in publishing statistics that serve the public good, the interests of those who may be affected negatively by any change should be taken into account. It is not clear from section 7 of the Statistics and Registration Service Act 2007 that this is a relevant consideration for the statistical authorities to be taking into account when they are producing and publishing statistics. (Paragraph 117)

6. What is clear from section 7 is that the UK Statistics Authority has to promote and safeguard the quality of official statistics, which includes their impartiality, accuracy and relevance, and coherence with other statistics. In publishing an index which it admits is flawed but refuses to fix, the Authority could be accused of failing in its statutory duties. (Paragraph 118)

7. We believe section 7 requires the Authority to attempt to fix the issue with clothing prices. Section 21 may require the Authority to consult the Bank of England over the change and obtain the consent of the Chancellor of the Exchequer, however this provision cannot be cited as a reason for not requesting the change in the first place. (Paragraph 119)

8. If the Authority requests the change, the Chancellor of the Exchequer should consent to it. It is untenable for an official statistic, that is used widely, to continue to be published with flaws that are admitted openly. (Paragraph 120) 

The announcements by the UK Statistics Authority and HM Treasury on 4 September deal with this substantive issue raised in these recommendations, and are summarised in the covering letter to this response.

9. While we accept the arguments that consumer price indices have different purposes, we do not believe this warrants the production of multiple indices for government use. Two different measures of inflation allow a government to engage in ‘inflation shopping’. (Paragraph 134)

10. The Government should address the imbalance in its use of consumer price indices. It risks undermining public confidence in economic statistics. It is encouraging to see that the present Government is taking some steps to address the imbalance, for example with the change to uprating business rates by CPI and recent discussions around rail fares. (Paragraph 135)

11. In future there should be one measure of general inflation that is used by the Government for all purposes. This would be simpler and easier for the public to understand. But the UK Statistics Authority should also continue to develop the Household Cost Indices, discussed below. (Paragraph 136)

We welcome the Committee’s recommendation that the Household Cost Indices should continue to be developed. On 28 June 2019, the National Statistician outlined the next steps in the development of these Indices.

12. We disagree with the UK Statistics Authority that RPI does not have the potential to become a good measure of inflation. With the improvements to RPI that we set out in the previous chapter, and a better method of capturing owner-occupier housing costs as discussed below, we believe RPI would be a viable candidate for the single general measure of inflation. (Paragraph 139)

13. We are not convinced by the use of rental equivalence in CPIH to impute owner-occupier housing costs. The UK Statistics Authority, together with its stakeholder and technical advisory panels and a consultation of a wide range of interested parties, should agree on the best method for capturing owner-occupier housing costs in a consumer price index. (Paragraph 153)

14. Once a method of capturing owner-occupier housing costs has been agreed, the UK Statistics Authority, after consulting the stakeholder and technical panels, should decide which index to recommend as the Government’s single general measure of inflation. The Government should have adopted the preferred candidate as its single general measure of inflation within five years. (Paragraph 154)

Owner occupiers’ housing (OOH) costs are one of the most challenging aspects of inflation to measure. No single approach will be correct in all circumstances, as the choice depends on the purpose of the index and on practical issues around data availability. In light of this, ONS has spent the last 10 years developing and consulting on its approaches to owner occupiers’ housing costs.

The development of an OOH measure for CPI was first considered in 2009 by the Consumer Prices Advisory Committee (CPAC). The committee then spent the next three years investigating different approaches to measuring OOH costs. In September 2010 it narrowed down the options to two – net acquisitions and rental equivalence – which it evaluated in detail against the five dimensions of statistical quality defined by the European Statistical System. The Committee finally agreed on rental equivalence in April 2012, giving consideration to both conceptual appropriateness and how well the index could be calculated in practice.

A first consultation was launched in the summer of 2012, in which users were asked about rental equivalence. The responses were fairly evenly split between support for rental equivalence, net acquisitions and neither approach. The National Statistician chose rental equivalence, reflecting the quality of the underlying data available and the appropriate treatment of asset prices. The process is described in more detail in Appendix A of the CPIH Compendium.

Paul Johnson’s review of consumer prices was published in January 2015. This looked again at CPAC’s recommendation to use the rental equivalence method. It concluded the underlying assumptions are reasonable in a UK context and that the measure is based on a large, detailed source of underlying data. Therefore, the Review recommended that ONS should continue to use the rental equivalence measure.

A further consultation was conducted on the findings of the Johnson Review. Responses to the review on CPIH and OOH were again mixed, highlighting that users are unlikely to come to an agreement on the most appropriate choice for measuring OOH costs.

The Office for Statistics Regulation’s 2016 re-assessment of CPIH as a National Statistic noted that ‘there is some disagreement among users about the concepts and methods…’ Work to address these recommendations resulted in a wide-ranging process of user engagement on CPIH and the publication of numerous supporting materials: the CPIH Compendium, which articulates the rationale for ONS’s choice of rental equivalence alongside the pros and cons of each approach; an ongoing published comparison of alternative OOH measures; and documentation on the various users and uses of our consumer price inflation statistics.

ONS have also looked at international practice, where they found widespread use of the rental equivalence measure. The approach taken by different countries is summarised in the CPIH Compendium. Of the 40 countries considered, once the 15 that exclude OOH altogether are discounted, the most common approach is rental equivalence (12 countries). It is also worth noting that the method requires a reasonably large rental market to work, so many countries may be constrained in their choice by the availability of data. The countries that use rental equivalence include the United States, Germany, Norway and the Netherlands.

In light of the 10 years of development and consultation, ONS are not minded to undertake any further engagement with users and experts specifically on rental equivalence and owner-occupier housing costs. There is never likely to be agreement on a single approach. ONS views rental equivalence as the correct approach conceptually for an economic measure of inflation, and one where sufficient data is available to make it practical. Of course, they remain committed to ongoing monitoring and development of the CPIH and the Household Cost Indices.

15. Our recommendations will not however solve the issue of index or inflation shopping immediately. The Government will need to take action in the interim to address this. (Paragraph 155)

16. While the single general measure is being determined, the Government should switch to CPI for uprating purposes in all areas where it is not bound by contract to use RPI (except for the interest rate on student loans which, as we recommended in our Treating Students Fairly report, should be set at the ten year gilt rate thus reflecting the Government’s cost of borrowing). (Paragraph 156)

17. The Government should begin to issue CPI-linked gilts and stop issuing RPI-linked gilts. We heard evidence to suggest there was sufficient demand to make a viable market. (Paragraph 170)

18. Once the long-term single official measure of inflation has been agreed, gilts should begin to be issued that are linked to that index. The prospectuses for new issuances of index-linked gilts should be clear that the inflation index will change to the Government’s single general measure of inflation once it has been agreed. (Paragraph 171)

Recommendations (15) to (18) are primarily directed at HM Government and the Authority has nothing to say on those issues. We continue to urge the Government to cease to use the RPI for its own purposes where practical.

19. Once the single general measure of inflation has been introduced, the UK Statistics Authority and the Government should decide whether RPI should continue to be published in its existing form for the purposes of existing RPI-linked contracts, or whether a programme of adjustments should be made to the RPI so that it converges on the single general measure. (Paragraph 194)

20. To avoid disruption, we envisage any programme of convergence would take place gradually, over a sufficiently long time, and that the plan for that should be published at the outset. (Paragraph 195)

21. We note that the consent of the Chancellor of the Exchequer to changes to RPI that cause material detriment to index-linked gilts holders is no longer required after the last issuance to which that clause relates expires in 2030. (Paragraph 196)

We strongly agree that any changes to the RPI, or the cessation of its publication, need to be carefully planned. The Authority and ONS have been discussing the mechanics of any changes with the Government in the run-up to the 4 September announcement.

UK Statistics Authority oral evidence to the Public Administration and Constitutional Affairs Committee’s Governance of Statistics inquiry

On Tuesday 2 April 2019, Sir David Norgrove, Chair, UK Statistics Authority; John Pullinger, National Statistician, UK Statistics Authority; and Ed Humpherson, Director General for Regulation, UK Statistics Authority gave evidence to the Public Administration and Constitutional Affairs Committee as part of its Governance of Statistics inquiry.

A transcript has been published on the UK Parliament’s website.

UK Statistics Authority additional written evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on governance of statistics

Dear Bernard,
I am writing in response to the Committee’s request for further information on the seniority of statistical Heads of Profession in departments, with reference to question 4 of the terms of reference for the Governance of Statistics inquiry, which asks how the roles of Heads of Profession and Government Statistical Service (GSS) statisticians could be strengthened.

The scale and complexity of statistical functions within different departments vary, and accordingly so do the responsibilities of each Head of Profession. Not all departments are responsible for frequent publications of key indicators, or large-scale data collections, so it would not be appropriate to apply a one-size-fits-all approach. Nevertheless, every Head of Profession fulfils a critical role in the statistical system and there are common responsibilities that each Head of Profession is required to fulfil for their department. The GSS have a published statement of the role and responsibilities of the Head of Profession, updated in February last year.

The GSS provides guidance to departments on the appointment of Heads of Profession, which sets an expectation that Heads of Profession meet, as a minimum, the highest level of the GSS criteria for statistical ability and, in addition, would usually be expected to fulfil the generic Civil Service competencies at SCS level.

To assist the Committee with their inquiry, we have collated the grades of Heads of Profession* in the 18 major Whitehall departments, and the devolved administrations. However, as they are employed by their departments, this is not data we normally hold, and we are unable to provide any historical data for comparison.

I hope that this information is helpful to the Committee in its inquiry, and I look forward to contributing further to the Committee’s inquiry in due course.

John Pullinger

*Note that the responsibilities of the Head of Profession are sometimes shared across some combination of a Chief Statistician and a Head of Profession, in which case the grade of the senior statistician has been recorded.

UK Statistics Authority written evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on governance of statistics

Dear Bernard,

I am writing in response to the Committee’s call for evidence for its inquiry, Governance of Statistics. I welcome this inquiry, and the opportunity it provides to support the Committee’s wider work in helping to ‘create conditions where the public can have justified confidence in public services/government.’

The following submission explains how the UK’s statistical system is mobilising the power of data to help Britain make better decisions. It starts with a brief overview of specific developments, first in regulation and then in production, before describing a selection of the system-wide improvements that we have seen since the establishment of the UK Statistics Authority.

Ed Humpherson, John Pullinger and I look forward to speaking with the Committee to expand on these points on 19 March and indeed will follow all oral evidence sessions of the inquiry with interest.

Yours ever,


Regulating Official Statistics

In 2016, the Authority established a more visibly separate regulatory function as the Office for Statistics Regulation (OSR). OSR was built on the foundations established by its predecessor, the Monitoring and Assessment team, especially the process of assessment of official statistics. This assessment process remains the cornerstone of the regulation of statistics under OSR. OSR has supplemented this approach through transformations in three areas: the philosophy that underpins official statistics; regulatory tools and strategy; and the scope of regulation.

OSR published a revised Code of Practice in February 2018, following an extensive process of engagement and consultation. It sets public confidence in statistics as its overall aim. In an age of abundant data, it is essential that users of statistics can have confidence in them. The new Code describes three pillars that support public confidence: statistics must be the product of a sound and objective professional process, so an organisation’s processes, governance and systems must demonstrate trustworthiness; statistics must demonstrate appropriate quality for their intended use and not be materially misleading; and statistics must provide public value, delivering useful insights to their users.

OSR has also created a more flexible regulatory approach, including:
• A greater focus on how families and groups of statistics serve the public good, resulting in a more systemic perspective across the statistics system;
• Clearer explanation of the criteria that guide OSR’s and the Authority Chair’s public interventions on the use of statistics, and a greater willingness to make these interventions proactively as well as in response to complaints;
• Publication of a database of all extant National Statistics and a register of those National Statistics that have lost their National Statistics designation;
• Greater clarity over the range of regulatory tools at OSR’s disposal and how they are used;
• Setting out the need for greater linkage of data through the September 2018 report Joining up data for better statistics;
• More clearly separate website content, along with a more visible brand and logo.

The Authority’s regulatory regime has always focused on the ways in which statistics and data have informed public debate. As a result, the Authority has, throughout its history, been willing to comment on uses of data and analysis that are not formally designated as official statistics by the organisation publishing the statistics.

In a data rich age, OSR has built on this long-standing position. OSR has recognised the risk that the public may be presented with data from a variety of official sources – analysis, forecasts, official statistics, and management information – and that it is not helpful to confine the regulatory regime to a subset of this information. The public should be confident in all the information presented to help inform public understanding and debate.

So OSR has advocated proportionate adoption of the Code’s pillars and principles by Government and others for a wide range of analytical outputs, while continuing to expect full compliance with the Code for publications that are official statistics. This voluntary adoption approach has the potential to lead to a much more consistent set of principles covering a very wide range of analytical publications that seek to inform public debate.

Producing Official Statistics

In October 2014 the Authority launched a new strategy for official statistics, Better Statistics, Better Decisions. This provided a framework for the Authority to meet emerging demands and changes, focusing on the way we work with colleagues, the statistical community and stakeholders.

The National Statistician has led producers of official statistics, across the Office for National Statistics (ONS) and the Government Statistical Service (GSS), in delivering the ambitions set out in Better Statistics, Better Decisions. His focus has been on three priorities for change: economic statistics; contributing to public policy; and building data capability. In each area, we can look back at where we have succeeded, where we are working to deliver fundamental change, and where we would like to do more.

Economic Statistics

Professor Sir Charles Bean’s review set out a compelling vision for UK economic statistics and his recommendations have been enormously helpful as we transform as an organisation. We have made good progress against many of them, in particular establishing an Economic Statistics Centre of Excellence (ESCoE) to provide research that considers the challenges of measuring the modern economy. We have made improvements to statistics on productivity, construction and trade, including improved flow of funds figures, more timely and comprehensive productivity estimates, and the inclusion of VAT data in estimates of GDP. The 2019 Blue Book round of improvements to the figures also marks a significant step forward in this work.

The establishment of our Data Science Campus illustrates our development of the long-term capacity of our workforce. Alongside this, ONS have also expanded our economic capability by increasing the number of economists in our workforce, particularly by creating a ‘London presence’ economics team to improve stakeholder engagement with key users. But responding to the Bean Review is only a first step in our transformation plans. We are aware of the increased demand for faster, more fine-grained and relevant statistics arising from Britain’s exit from the European Union and other policy imperatives, and we want to improve the timeliness of our outputs, linking data where possible to provide our users with better, more innovative statistics.

Data Capability

Crucial to our transformation over this period has been the Digital Economy Act, which received Royal Assent in April 2017. This has meant that we can unlock new data sources and deliver more timely and granular data to decision makers. It has also meant we can create a permissive gateway through which accredited researchers can access de-identified data for public good research: the Secure Research Service. Of course, these improvements mean that, more than ever before, we must make using data safely, securely and for the public good one of our highest priorities.

We are aware that we have a role to play in discussing the public use of data, and recently published a revised framework for data handling and security. In addition, we lead within the
cross-government Data Architecture Community, bringing data leaders across government together in discussions about how to best structure and share data for the public good, and joining other cross-departmental groups in driving the data agenda for government.

In March 2017 we opened the Data Science Campus in Newport to explore new data sources and techniques, including visualising the urban forest, analysing port and shipping operations using big data, and using mobile phone data to understand commuter patterns. The impact of these projects has led to the Campus being asked to lead on an audit of public sector data science capability.

We are also making improvements in the skills of our staff: establishing a Learning Academy at ONS, recruiting more apprentices and transforming our digital and technology estate. We have improved the desktop and mobile technology available to all staff and are moving off long-running and high-risk legacy processing systems. At the cross-government level, the National Statistician chairs the Analysis Function Board which is a federative collaboration between a number of analytical professions who deliver research, evidence and advice to a consistent, professional standard.
Staff engagement has improved, with the latest scores in the Civil Service People Survey the highest so far. The scores for the OSR are amongst the very highest in the Civil Service. Within ONS, leadership and managing change has been the most improved category and there are strong scores for being a great place to work and for diversity and inclusion. However, staff engagement in ONS is still well below the levels we aspire to reach and there is an active programme of work to make further progress.

Population and Public Policy

On 14 December 2018 the Government published the White Paper Help Shape Our Future: The 2021 Census of Population and Housing in England and Wales, laying out the Authority’s proposals for the conduct and content of the 2021 Census. This details our plans for a primarily online census in 2021. The next major milestone will be the census dress rehearsal in October 2019.

We are also looking beyond 2021, consulting with users on how administrative data can come to the fore of the statistical system, to ensure we are ready to make recommendations to Government in 2023 on the future of the census. The latest report on progress towards an administrative data census was published in July 2018. Alongside this, as part of the Census and Data Collection Transformation Programme, we have modernised the IT underpinning our data collections, with approximately 360,000 businesses now supplying returns online. In addition, we have modernised the IT supporting our social survey field force.

Across the GSS there has been a transformation in the service provided, with a high demand for statisticians, data scientists and other quantitative professionals to support public policy and service delivery in departments.

On our key public policy statistics, we have launched cross-GSS groups to further improve the evidence base on migration, crime, housing and income, building on work already done on student migration, violent crime, abuse, cyber-crime and affordability of rental properties, all of which will make increasing use of administrative statistics. For example, the latest update on migration and population statistics was published in January 2019. The establishment of these cross-cutting groups recognises that for most users, the split of statistics across many producers often gets in the way of clear understanding, and by joining up we can provide a better overview of these core issues. This is also true of our analysis of key public policy issues, with publications on the ageing society and being 18 in 2018 setting out a much clearer picture of today’s society. All of these have been supported with improved stakeholder engagement through Population and Public Policy Forums and Select Committee appearances.

Better Statistics, Better Decisions: How the system works in practice

The Statistics and Registration Service Act 2007 established a strong, integrated system for building trust in statistics. Trustworthiness comes, however, from how the system works in practice.

The Better Statistics, Better Decisions strategy reflects many of the themes raised by the Public Administration and Constitutional Affairs Committee in its previous inquiries on statistics. It flagged a growing demand for evidence; implications of our changing society, economy and governance for official statistics; that changes in technology meant more data was available than ever before; and that a better, more accessible online experience was expected by the public.

Together they have helped the UK statistical system drive improvements in a range of areas.

In 2016, ONS launched a new website, transforming the way in which decision-makers access official data. In July 2017, the National Statistician abolished pre-release access to ONS outputs, ensuring equality of access to official statistics published by ONS. In October 2017 the Director General of OSR published guidelines for intervention in public debate, and reviewed activity to date to provide assurance of impartiality, followed by a revised Code of Practice in February 2018.

With increased use of data comes increased responsibility. Ethics, independence and impartiality are critical to building and maintaining trust in official statistics. The National Statistician’s Data Ethics Advisory Committee is establishing principles and caselaw to guide the UK statistical system, ensuring that the access, use and sharing of public data, for research and statistical purposes, is ethical and for the public good.

And of course, perhaps most importantly, the data available to UK decision-makers are improving. The examples that follow demonstrate how the Authority has intervened to strengthen official statistics, and detail specific improvements in both the context of statistics regulation and producers of official statistics.

Crime statistics

In January 2014, after the Committee’s inquiry into police recorded crime (PRC) statistics highlighted concerns about the quality of PRC data, the Authority published an assessment of statistics on crime in England and Wales, which removed the National Statistics designation from statistics based on recorded crime data.

In the period since, ONS has been working hard to improve trust in crime statistics. In the quarterly crime publication, ONS now includes commentary on police force inspection reports conducted by Her Majesty’s Inspectorate of Constabulary and Fire & Rescue Services. This provides transparency alongside the statistics on the improvements being made and how these vary by police force.

The estimates produced from the Homicide Index were re-accredited as National Statistics in December 2016, following work done by ONS in collaboration with the Home Office to publish evidence about the quality assurance processes applied and to demonstrate the quality of this data source.

In addition, ONS is working to give the best possible overview of crime from all available sources. They expanded the survey definition of crime by adding new questions on fraud and computer misuse in October 2015, filling a knowledge gap about crimes that the general population experiences. These were confirmed as National Statistics by an OSR assessment in March 2018. ONS also added a new survey question on abuse experienced as a child, the results of which were published in September 2017.

In November 2017, OSR raised concerns about the Crime in England and Wales: year ending June 2017 statistical bulletin. ONS was encouraged to think about how to present and explain the different sources of crime statistics, and as a result, ONS introduced new commentary on which crime types are well-reported and accurately recorded, and crime types which are affected by changing recording practices.

ONS are working to provide a clear narrative across the Criminal Justice system by collaborating with the Home Office, Ministry of Justice (MoJ), the Crown Prosecution Service and support services. So far this has included publications on domestic abuse and sexual offences, to provide insight into these serious issues which will assist all users wanting to achieve better outcomes for victims. In October 2018, OSR published a blog recognising the steady improvement in the way ONS reports what is happening to crime, describing the last two publications as the clearest yet.

Migration statistics

The need for better data on international migration has been regularly noted by the Authority and the Committee for some years. A changing policy context, alongside the data sharing
powers provided to statisticians under the Digital Economy Act 2017, has offered a welltimed opportunity to reflect on the best way to deliver a population and migration statistics system in future.
OSR has responded to user concerns about these statistics, challenging ONS to develop a clear strategy of improvement. In its reports throughout 2016, OSR highlighted concerns about the differences between official migration statistics (International Passenger Survey-based migration statistics) and other sources, especially National Insurance numbers registered to non-UK individuals, and encouraged ONS to bring forward the publication of analysis to explain these differences. It also published a review of the quality of student migration estimates that led to the removal of the National Statistics designation from these statistics. In 2017 OSR initiated a systemic review of how far migration statistics met user needs and concluded that ONS’s own plans provided sufficient assurance that there was a clear improvement strategy in place. OSR continues to monitor implementation of this strategy, and in October 2018 published a compliance review of how ONS had managed and communicated a shift from paper to tablet collection of data from travellers to and from the UK.

Working in partnership across the GSS, ONS are progressing a programme of work to put administrative data at the core of evidence on international migration for the UK and on the England and Wales population by 2020. ONS have long acknowledged that the existing approach for measuring international migration – the International Passenger Survey (IPS) – has been stretched beyond its original purpose and that all available sources must be considered to fully understand international migration in future.

Through their transformation work, ONS have already delivered a range of new research and insights into international migration using these administrative data sources. Work into international student migration, using Home Office Exit Checks data, provided new evidence on the actual long-term departure behaviour of non-EU students at the end of their study. The Migration Statistics Quarterly Reports have also been enhanced by including the best assessment of international migration, based on a wider range of sources including Department for Work and Pensions (DWP) and Home Office administrative data. OSR will advise the Authority Board on how far these developments both improve and change the nature of the main migration statistics estimates.

ONS published their latest research into how they can use administrative data to build their future system for producing population and migration statistics in January 2019. This shows the progress made towards a new approach for producing population stocks and flows using administrative data, by bringing more sources together to fill gaps in coverage.

ONS have linked immigration, education, health and income records, exploring how to use these sources to determine the usually resident population of England and Wales and immigration flows to the UK. This includes developing data-driven rules, based on registrations and ‘signs of activity’ that can be identified from each data source. This research needs to go further to maximise the benefits of administrative data and to link across the fuller range of data sources available to ONS, to continue to build an integrated system for measuring population and migration. Key to this work is close collaboration across the GSS to address key evidence gaps identified by statistical users. ONS plan to publish their next update on this programme of work in spring 2019.

Productivity statistics

Productivity statistics in the UK are going through a period of fundamental change, mainly due to three ‘puzzles’ characterising the UK’s productivity performance. These are: a long-standing gap between UK labour productivity and that of other advanced economies; considerable dispersion of labour productivity across firms, although economic theory suggests this should narrow over time; and lastly a sharp slowdown in the growth rate of UK labour productivity since the 2008 economic downturn, with no significant recovery since.

In its Economic Statistics and Analysis Strategy, ONS committed to improving the quality and scope of productivity statistics, to deliver a world-class set of statistics to support users attempting to address these “productivity puzzles”. Good progress has since been made: ONS now publish much more comprehensive productivity metrics, for 80 industries (up from 24 in 2015), and a new ‘flash’ estimate of labour productivity, 45 days after the end of the reference quarter, which is half the time it used to take to produce. ONS is the only National Statistical Institute to publish experimental quarterly estimates of multi-factor productivity, making these among the timeliest in the world.

ONS are also engaging with users to produce a series of microdata analysis pieces looking at the relationships between productivity and, respectively, trade, ICT and management practices, and an interactive benchmarking tool for businesses to compare themselves with averages for their industry and size band. In March 2019 ONS will host a productivity forum to discuss developments and future priorities with users.

ONS continue to focus their efforts on the production of a wider range of statistics, analysis and research to shed light on the puzzles posed by the UK’s recent productivity performance. They are also considering the productivity gap (international comparisons of productivity) and measurement challenges, working collaboratively with the OECD.

Health statistics

The UK’s health statistics landscape is complex, with a wide range of producers. NHS Digital, Public Health England (PHE), NHS England, ONS and the Care Quality Commission (CQC) publish the bulk of official statistics in England, and many other organisations publish statistics on health and care across the UK. In November 2018, 156 of the 855 National Statistics across the United Kingdom related to health and care. This reflects the devolved nature of health policy, with England, Scotland, Wales and Northern Ireland reporting statistics separately; the availability of data from the NHS; and the public interest in the performance and outcomes of the health system. The volume of statistical outputs alone does not guarantee the public interest is well served, and the Authority recognises that users should be presented with a coherent, comparable and insightful picture based on the statistics.

In July 2015, OSR’s predecessor published three assessments of statistics on patient outcomes in England: the Patient Experience Survey; the Patient Reported Outcomes Measures; and the Standardised Hospital Mortality Indices. These assessments were followed in October 2015 by an assessment of the overall Patient Outcomes Framework. These assessments fulfilled a recommendation made by the Francis Review into the problems at the Mid Staffordshire Trust in 2013.

Each assessment made some specific points relating to the statistics. But a general pattern also emerged: that there was a lack of user engagement; little insight was provided to help users interpret and use the outcomes information; and the statistics appeared to be designed for expert users within the health and care system, and did not pay sufficient attention to broader public use.

In light of these systemic concerns, OSR convened a round table in February 2016 of senior leaders from the NHS in England, which concluded that the value added by the system for health and care statistics was less than the sum of its parts, because “health statisticians often focus on servicing their immediate policy and operational users, and only within the NHS, with insufficient effort devoted to working collaboratively to address the important issues of coherent and accessible statistics to support public understanding and accountability … statisticians appear to be tentative about engaging with a broader user community.” They found that the health and care landscape is data-rich but information-poor, and that a piecemeal production approach was reflected in the fact that health and care statistics can be accessed via a range of websites and portals, but there is no single source to guide researchers or the public to the most appropriate statistics to meet their needs.

Since then, OSR has undertaken strategic interventions to lead improvements, for example, intervening publicly in October 2016 to encourage the Department of Health to provide greater clarity on health funding, particularly on sources, time periods and what is being measured. This led to significant improvements to the way HM Treasury presents health spending in the Public Expenditure Statistical Analyses. OSR also intervened on four occasions between 2017 and 2018 on the collection, presentation and use of accident and emergency statistics in Scotland, England, and in comparisons of accident and emergency performance between England and Wales. These interventions led to the withdrawal of new guidance to Trusts in England; a recalculation of performance against the four-hour waiting time in England; and revised quality assurance processes and presentation of statistics in Scotland.

ONS, with OSR’s support, has established the English Health Statistics Steering Group (EHSSG) to help ensure coherence across health statistics producers. An interactive “Health statistics landscape” has been produced to enable users of health statistics to get what they need and understand the complete coverage of published health statistics by topic. Moreover, in March 2018, OSR concluded that it should pass on responsibility for convening producers of health statistics to ONS, since this convening is more properly a production rather than a regulatory activity. OSR will continue to drive improvement using a range of regulatory interventions.

ONS health statistics have continued to progress and develop. The production of regular statistics on causes of death, including deaths related to drugs, alcohol and suicide, is built on the processing and coding of life events data, particularly death records. In addition, ONS have published statistical series on healthy life expectancy and mortality rates which have been critical in uncovering health inequality by deprivation and the levelling off of mortality rates after 100 years of consistent improvement. ONS are now collaborating across government, particularly with PHE and the Department of Health and Social Care (DHSC), to establish the causes behind these patterns.

More recently the Digital Economy Act has provided ONS with the potential to extend health analysis to new sources of data, and improved data linkage techniques and technology to produce more detailed analysis. Examples include publications on student suicides, a new statistical series on the deaths of homeless people, and ONS collaboration with DHSC using census data to analyse the family, housing and labour market participation of those with common mental health disorders and their outcomes after treatment. These outputs have resulted in significant public interest and swift government action.

The Authority is aware that there is still work to be done, and that there are many evidence gaps across the health landscape. A fundamental gap is the lack of consistent, coherent data on social care, which is a focus for the Authority in 2019. There are currently around 150 local authority providers of care, many private sector providers and a huge, hidden, informal care sector. Care and caring will be a key policy challenge for many years to come, and ONS are working with academics and the charity sector to try to plug this gap. Another evidence gap is around disability, and ONS have established a cross-government group to understand and address this problem.

Trade statistics

The demand for improved and more detailed UK trade statistics has increased significantly since the EU referendum. ONS note that policy-makers, economists and other users need more data to help provide a better understanding of the UK’s trading relationship with the rest of the world: specifically, more detail on the goods and services being traded, the sections of the UK economy engaged in this trade, and greater geographic detail of where trade is taking place.

The vital transformation of UK trade statistics is now delivering a breadth and range of trade data, commentary and insights. For example, collaboration with HM Revenue and Customs (HMRC), combined with use of more powerful technology, has led to much more granular trade in goods data by country and commodity, consistent with the wider UK National Accounts and Balance of Payments. Following a doubling of the quarterly International Trade in Services survey sample size and optimisation of sampling by geographic region, ONS have also expanded publication of trade in services detail to quarterly (from annual) and increased the detail by country and service type. These developments, together with new experimental estimates of trade in goods by industry, have increased the number of published trade series from around 1,000 to over 100,000 in the autumn of 2018.

On the Rotterdam effect (an issue in which this Committee has expressed interest previously), ONS followed the initial analysis and estimation of its effects with further guidance and notes in their trade releases, advising users of this effect on a regular basis. Initial discussions with users made it clear that their priorities were increased granularity and frequency of trade statistics. More recently, however, these users’ next set of priorities has come to include a better understanding of the Rotterdam effect and its possible impacts. ONS is a member of the Department for International Trade steering group considering research in this area, and ESCoE is leading this analysis.

ONS have delivered wide-reaching analysis of UK trade asymmetries, providing context and explanation for these. The first analysis focused on the US and the Republic of Ireland, followed by further work expanding international collaboration and analyses to include Germany, France, the Netherlands, Luxembourg and Belgium. These analyses have highlighted cases where other countries are moving to revised international standards at a slower pace than the UK.

Moreover, ONS have provided innovative new tools for users to access and analyse trade data, publishing interactive maps that show 234 countries’ trading relationships with the UK, broken down by 125 types of goods and updated monthly, and further maps showing trade in services by country and type of service.

There is still a keen need for a greater understanding of trade at the local geographic level: how parts of Wales, Scotland, Northern Ireland and the regions of England are trading with the rest of the world. Recent workshops held by the Centre for Subnational Analysis within ONS highlighted an even greater need for this information among local enterprise partnerships (LEPs) and combined authorities in designing their strategic economic plans and local industrial strategies. In October 2018, alongside other new trade releases, ONS published updated estimates of regional exports of services.

ONS are now researching methods and investigating new data sources to continue expansion of trade in services statistics, and planning to publish experimental estimates of trade in services by industry in the first half of this year. Development work on exports of services at the regional level will continue, alongside further analysis of trade in services asymmetries, trade in value added, digital trade, modes of supply of services and further methodological reviews of data processing.

Education statistics

Statistics about children, education and schools are among the most prominently used in public debate and can also be an important influence on the choices made by families about schooling. It is essential that these statistics are presented and used in line with the Code of Practice. Education statistics are mainly produced by the Department for Education (DfE) and the respective education departments in Wales, Scotland and Northern Ireland. The Higher Education Statistics Agency, ONS, Ofsted and the Student Loans Company also publish related figures that contribute to the evidence base.

OSR made a series of interventions involving DfE statistics during 2017 and 2018. These included formal letters published on the Authority’s website, and informal discussions between OSR and the Department’s Head of Profession for Statistics during the summer of 2018. These informal discussions were based on OSR’s monitoring of statements made by the Department which drew on official statistics or used statistical evidence. As part of these discussions, OSR queried some aspects of statistical presentation with DfE.

In October 2018, the Authority Chair wrote to the Secretary of State for Education with serious concerns about DfE’s presentation and use of statistics. The Director General for Regulation wrote at the same time to the Permanent Secretary and the Head of Profession for Statistics.

Following this intervention, both the Secretary of State and Permanent Secretary publicly reaffirmed their commitment to the principles of the Code of Practice, and DfE put in place revised procedures for the development and agreement of communication messages using statistics. OSR continues to monitor the use of statistics by DfE and to emphasise the need for appropriate communication of statistics to the Department.

This work on education statistics in England demonstrates that OSR actively monitors and provides feedback to producers of statistics; and takes reasoned judgements to consolidate cases, rather than deal with them one by one, where they judge that there is a pattern of misuse of statistics.

Homelessness and hunger statistics

Better Statistics, Better Decisions notes that ONS will provide a firm evidence base for sound decisions, and so reacting to key policy demands effectively is a priority. Examples of policy areas that require better data include homelessness and hunger. The Government set a commitment to halve rough sleeping by 2022 and eradicate it by 2027. While estimates of the numbers of homeless people are available, along with an annual estimate of rough sleepers, there is currently little systematic data available to help inform policies to prevent homelessness and monitor and evaluate initiatives to relieve homelessness. There is even less data available on rough sleeping.

The regulatory function assessed homelessness statistics for England published by the Ministry for Housing, Communities and Local Government (MHCLG) in 2015. It concluded that the presentation of the statistics, as three separate statistical reports with no coherent narrative to draw the statistics together, or to place them in context, diminished their value. MHCLG have since put in place a programme of change that is leading to more comprehensive homelessness statistics. OSR has removed the National Statistics designation from the existing homelessness statistics.
In view of the increased policy focus on homelessness, the GSS is undertaking work to understand wider data and evidence, which is often generated by charities and third sector organisations who work directly with homeless people and those who sleep rough. The aim of this work is to establish how such data could build better intelligence for local government, helping them identify risk factors and trends specific to their local populations. In addition, ONS also developed a new method to estimate the number of deaths of homeless people and published the first report in December 2018.

Furthermore, the data asset Homeless Case Level Information Classification (H-CLIC) has recently been established in England and has the potential to be linked with wider Government data sets, such as benefit receipt, income, health, crime and justice, family formation and break up, as well as characteristics such as age, ethnicity, education and nationality. Collaborative work across ONS, MHCLG and potentially DWP, MoJ and DHSC to anonymise and link this data, in order to build an understanding of underlying factors that lead to homelessness and rough sleeping, could help lead to identifying effective interventions at the local level.

There is also a growing demand on the Government to improve UK statistics on food security and hunger. Emma Lewell-Buck MP’s Food Insecurity Bill calls for ONS to address her concerns in relation to the lack of statistical data on the issue, alongside the specific requirement to measure food insecurity as part of the Sustainable Development Goals (SDGs). ONS have conducted a comprehensive review of relevant existing data from both official and non-official sources and are looking at options to fill any data gaps. Following meetings with Emma Lewell-Buck MP and an appearance at the Environmental Audit Committee, the next step is chairing a round-table on food insecurity and the measurement of SDG Goal 2 with all key third sector and government stakeholders in late February 2019, to fully understand all the data requirements before determining next steps.

Inflation statistics

The importance of, and interest in, inflation statistics is recognised by the Authority. This interest has recently increased due to the Lords Economic Affairs Committee (EAC) inquiry
considering the use of the Retail Prices Index (RPI), but it has been a key statistical issue that the Authority and this Committee have considered for many years.

A range of inflation measures are required to meet current and emerging user needs, but it is also important to ensure that the statistics present a clear and coherent picture. To this end, ONS set out three ‘use cases’, relating these to the measures that are currently published and those that are under development. Taken together, these present the ONS approach to measuring changes in the prices and costs faced by consumers and households: a comprehensive measure of inflation based on economic principles, the Consumer Prices Index including owner occupiers’ housing costs (CPIH), which has also been the ONS headline measure of inflation since March 2017; a set of measures to reflect the change in costs as experienced by households, the Household Costs Indices (HCIs); and a measure that is required under section 21 of the Statistics and Registration Service Act, the RPI.

The Statistics and Registration Service Act requires that the Authority publish the RPI each month. In practice, ONS, as the Authority’s executive office, compiles, maintains and publishes the RPI. The current treatment of the RPI has its origins in the 2012 consultation on the future of the RPI. Having analysed the responses to that consultation, the National Statistician’s decision at the time was to leave the RPI unchanged.

The approach that arose from this consultation was to develop alternative, statistically robust measures and encourage their use, which was also one of the conclusions of the independent Johnson Review of consumer price statistics in 2015. The CPIH and its close relation, the CPI, are designed to capture the economic approach to the changing prices of goods and services, and HCIs are intended to measure households’ experiences of changing costs. The CPIH regained its National Statistics status in July 2017, and the first experimental HCIs were published in December 2017. Implicit in this approach was that use of the RPI would decline as alternative measures were adopted.

The Authority has long stated that the RPI is not a good measure of inflation, and that we do not encourage its use. The OSR’s predecessor also removed its National Statistics designation in 2013 because the methods used to produce the index were not consistent with recognised best practices. However, the Authority notes the EAC’s recommendations and agrees further steps are needed. We are currently considering a number of options and will keep this Committee, and the EAC, informed.
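The best-known of the methodological weaknesses referred to above is the ‘formula effect’: at the elementary aggregate level the RPI makes extensive use of the Carli index (an arithmetic mean of price relatives), whereas the CPI and CPIH use the Jevons index (a geometric mean). The following minimal sketch, using entirely hypothetical price relatives rather than any real ONS data, illustrates why the Carli formula tends to record higher inflation than the Jevons formula from the same prices (a consequence of the arithmetic mean–geometric mean inequality):

```python
import math

def carli(relatives):
    """Arithmetic mean of price relatives (the RPI-style elementary index)."""
    return sum(relatives) / len(relatives)

def jevons(relatives):
    """Geometric mean of price relatives (the CPI/CPIH-style elementary index)."""
    return math.prod(relatives) ** (1 / len(relatives))

# Hypothetical price relatives for one item across four outlets:
# one price doubled, one halved, two were unchanged.
relatives = [2.0, 0.5, 1.0, 1.0]

print(f"Carli  (RPI-style): {carli(relatives):.4f}")   # 1.1250 (12.5% inflation)
print(f"Jevons (CPI-style): {jevons(relatives):.4f}")  # 1.0000 (no inflation)
```

In this deliberately simple example the price changes cancel out under the geometric mean but not under the arithmetic mean, so the Carli index records inflation where the Jevons index records none; in practice the gap between the two formulae contributes a persistent wedge between RPI and CPI inflation rates.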

UK Statistics Authority follow-up written evidence to the Lords Economic Affairs Committee’s inquiry on the use of RPI

Dear Lord Forsyth,

I am writing with the additional information the Committee requested when we gave evidence on 12 June. This is attached at Annex A.

I would also like to take this opportunity to reaffirm the position of the UK Statistics Authority that the Retail Prices Index (RPI) is not a good measure of inflation, does not have the potential to become one, and we strongly discourage its use. Its continued publication is a result of the legislation and the way it is built into a range of contracts.

We continue to encourage users to move away from the RPI to better measures, while recognising that there is never likely to be a single measure of inflation that captures all individual experiences of price changes or meets all user needs. The Consumer Prices Index including owner-occupiers’ housing costs (CPIH) and the Consumer Prices Index (CPI) are both National Statistics, and the Office for National Statistics (ONS) are developing new Household Costs Indices with a particular focus on the experience of different household types. They have set out for users the different characteristics of these different families of indices.

We welcome the statement by the Chancellor on 25 April that the direction of travel is away from the RPI to the CPIH. The Governor of the Bank of England has also made the case against the use of RPI.

We have seen the use of RPI decline over time. Nonetheless, like the Committee we see continuing uses of the RPI that are difficult to justify. I have for example said publicly that I am concerned by its use for student loans and rail fares.

As we discussed, the issues around the use of RPI are complex, often reflecting decisions and contracts made many years ago. Changes will need to be carefully planned and coordinated. The UK Statistics Authority and ONS look forward to playing their parts in making the changes that are needed.

Yours sincerely

Sir David Norgrove
