Office for Statistics Regulation written evidence to the Lords Public Services Committee’s inquiry into ‘Levelling up’ and public services

Dear Lady Armstrong,

‘Levelling up’ and public services inquiry

Thank you for giving the Office for Statistics Regulation (OSR) the opportunity to comment on the following questions:

  • Are there gaps in regional/local data sets (particularly for health, education, justice and wellbeing outcomes) which may impact upon the Government’s ability to target investment, and for Government success to be measured?
  • How can we ensure transparency and access to the data sets that the Government is using, to ensure Parliament can scrutinise decisions as to where money is prioritised, and the effectiveness of such investments?

OSR’s Strategic Business Plan sets out our vision and priorities for 2020-2025 and how we will contribute to fostering the UK Statistics Authority’s ambitions for the statistics system, as set out in the Authority Strategy. One of OSR’s ambitions is that, by 2025, the statistical system will provide a much richer picture of the UK’s changing economy and society. Statistics should not simply focus on the average, but instead provide disaggregated and granular insight into how different communities, places and people are doing. As regulator of government statistics, we will assess whether statistics provide regional, local and disaggregated pictures of society. We will challenge producers to produce more granular statistics, about ethnicity for example, and to respond to changing aspects of social identity and economic activity. Our systemic reviews will uncover areas where the needs of a wide range of users are not being met, and challenge producers to address these needs and gaps. Where producers do not address these issues, we will continue to highlight our concerns and criticisms publicly.

There are a number of gaps in data – both in terms of personal characteristics and low levels of geography – which our previous work has highlighted; we also consider a lack of timeliness or comparability across the UK to be characteristic of data gaps. I wrote a blog on data gaps to launch OSR’s work on demystifying them, building on our growing understanding of how different statistical producers have addressed such gaps. We are still building up our case examples of what works, but one striking early conclusion is the importance of collaboration between different producers. We have also been pleased to see, over recent years, departments innovating in data collection and processing methods to fill gaps.

This letter addresses your questions about health, education, justice and wellbeing data in more detail, but gaps in data are a widespread problem. During the past year, for example, we have: reviewed statistics and found data gaps in areas such as the Office for National Statistics’ (ONS) internet access and use statistics; seen that the ONS has the potential to do more analysis on alternative methods using the microdata provided by the Valuation Office Agency for the Consumer Prices Index including owner occupiers’ housing costs (CPIH); and, despite positive changes, noted that data gaps remain in housing and planning statistics.

Data gaps in public health, health care and social care

As our report on Adult Social Care statistics noted, “While there is rightly a focus on delivery, a scarcity of funding has led to under investment in data and analysis, making it harder for individuals and organisations to make informed decisions. This needs to be addressed” across both social care and health care to ensure the system is sustainable for the management of future public health crises. During 2020, we have called for better statistics to understand and take action to manage the COVID-19 pandemic. The immense effort required to re-work existing data collections, or develop new ones, has shown that the administrative data systems used to collect information about public health are not suitable for the timely production of official statistics. The development of the UK COVID-19 data dashboard is a welcome innovation, but it would have been good to see data of sufficient granularity to meet the needs of the public sooner.

Our report about social care statistics in England highlighted a form of levelling up – that is, the work required to bring social care statistics to the levels of granularity and comprehensiveness of hospital care statistics. Additionally, the lack of joined-up data between health and social care has been a major data gap with serious consequences for many people during the pandemic. As well as improvements to social care, the opportunity afforded by the NHS White Paper in England could also enable the levelling up, in this sense, of primary health care data and mental health care data, as our report on adult mental health statistics in England outlined. Investment in IT infrastructure and staff skills will be needed to accurately capture improvements in the outcomes of care provided by services such as these, which take place outside hospital. We will continue to push for better social care data, working with technology leaders such as NHSX.

Data gaps in education

Our report on the public value of post-16 education and skills in England noted “that better information about applicants to university would help shed light on social mobility. We also found that there are information gaps surrounding the further education workforce and workforce skills, which make planning to meet future demand difficult.” Across the UK, we found that “Significant gaps exist in statistics and data on individual student circumstances, in particular, about whether students are care leavers or have care experience. This information is self-reported and the data quality can be poor.” We continue to work with statistics producers to improve the information available.

Data gaps in poverty

Poverty remains a significant issue for the UK and has the potential to become more important still as we adjust to life following COVID-19, which is why we launched a systemic review of the coherence of poverty statistics in Autumn 2020. The volume of official data is difficult to navigate and does not reflect the changing nature of poverty or the unavoidable costs faced by low-income households, to the extent that stakeholders have developed new metrics outside the world of official statistics, including the Joseph Rowntree Foundation’s destitution study and the Social Metrics Commission’s framework for measuring poverty. Our report on poverty statistics will be published shortly, and we can share it with the Committee.

Data gaps in rough sleeping

We worked with statistics producers to encourage the development of new and improved data and statistics on rough sleeping, especially during the pandemic. We reported this in a blog, which noted that “robust statistical evidence needed to answer key questions about the experiences of UK rough sleepers since the start of the pandemic is still lacking. New management information on the numbers of rough sleepers, and those at risk of rough sleeping, who have been provided with emergency accommodation since the start of lockdown is now being collected by UK councils. However, this management information is not always recorded consistently, and in many cases remains unpublished.”

Closing the data gaps

There are some examples of good work being undertaken to close data gaps, helping to provide a high-quality evidence base to inform levelling up.

The ONS publishes information about societal and personal well-being in the UK, looking beyond what the economy produces to areas such as health, relationships, education and skills, what we do, where we live, our finances and the environment. The data are available at local authority level and address the widespread concern that standard measures of the economy do not reflect the underlying welfare and well-being of the population.

The ONS also publishes data for the environmental accounts, including access to green space in Great Britain in 2020 at a local geographic level, together with Natural England survey data on garden access in England broken down by personal characteristics such as age and ethnicity. Again, this work addresses a concern that focusing on traditional measures of the economy does not capture the environmental consequences of economic activity.

There are endeavours to address data gaps in a range of other policy areas:

  • Health: We were pleased to see that the ONS is developing a Health Index for England, which should allow for benchmarking the progress of local authorities. The Health Index is an Experimental Statistic to measure a broad definition of health, in a way that can be tracked over time and compared between different areas. The domains include healthy people, healthy places and healthy lives.
  • Deprivation: The Ministry of Housing, Communities and Local Government (MHCLG) publishes the English Indices of Multiple Deprivation. OSR reviewed the outputs and noted that MHCLG has worked with the University of Sheffield and mySociety to develop a new mapping tool which allows users to visualise the statistics at different geographical scales, including geographies for which the statistics have not previously been presented: Westminster Parliamentary Constituencies and Travel to Work Areas.
  • Justice: In collaboration with Administrative Data Research UK, the Ministry of Justice is undertaking a data linkage project called Data First. OSR’s review, The Public Value of Justice Statistics, highlighted the need for statistics that move from counting people as they interact with specific parts of the justice system to telling stories about the journeys people take. Data First will anonymously link data from across the family, civil and criminal courts in England and Wales, enabling research on how the justice system is used and enhancing the evidence base to understand what works to help tackle social and justice policy issues.

Transparency

OSR’s work is about statistics serving the public good, and one of our key tenets is to insist on transparency of data. At times, particularly during the pandemic, the use of data has not been supported by transparent information provided in a timely manner. As a result, there is potential to confuse the public and undermine confidence in the statistics. It is important that data are shared in a way that promotes transparency and clarity: they should be published in a clear and accessible form, with appropriate explanations of context and sources, and made available to all at the time the information is referenced publicly.

I hope this is useful to the Committee, and please do let me know if there is anything further I can do to assist with this inquiry or others.

 

Yours sincerely

Ed Humpherson

Director General for Regulation

UK Statistics Authority and Office for Statistics Regulation written evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on data transparency and accountability: COVID-19

Dear William,

Thank you for your letter of 2 February, in which you asked about the Government’s use of data, in relation to your inquiry Data transparency and accountability: COVID-19.

We discuss your specific questions in an Annex to this letter. Here I would like to reflect more generally about the use of data and statistics in the past year.

Overall I believe our statistical system has responded well to the stress and pressures of the pandemic. Ian Diamond’s separate letter to you describes an immense range of work that has been done to understand the pandemic itself, which has been fundamental to government decision making and public understanding. Alongside the work on the pandemic the Office for National Statistics (ONS) and statisticians across government have continued to produce remarkable data and analysis across the economy and society, work that is high quality and innovative. Preparations and contingency plans for the census in England and Wales are encouraging.

The legislative framework for our statistics, as set out in the Statistics and Registration Service Act 2007 together with the Digital Economy Act 2017, has also, I think, met the sternest test it has yet seen. The new data and statistics required by the pandemic have for the most part been compiled and published in accordance with the Code of Practice for Statistics, and statisticians have generally been able to access the new sources of data they need.

I pay warm tribute to all involved in this work, carried out at a time of anxiety for them and their families, with all the disruption caused by the need to work from home and the increased difficulty of their professional lives as many surveys and other sources of data had to be changed or abandoned.

Within this generally positive picture not all has gone well, and there are lessons to be learned.

It has too often been a struggle to develop a coherent picture of the pandemic even within England as a single country. The Department of Health and Social Care (DHSC) currently plays a limited role in health statistics. Its resource has been strengthened following an ONS review undertaken at its request. But the statistical outputs of the disparate bodies involved in the provision of health care are too often fragmented, to the extent, for example, that both the NHS and Public Health England produce statistics on vaccinations that are published separately.

This is an issue that has been highlighted by the Office for Statistics Regulation (OSR) in the past. It goes well beyond the concerns raised by the pandemic. We currently have no coherent statistical picture of health in England or of the provision of health services and social care.

There are similar issues in relation to the health data for the four nations. The adoption of different definitions complicates comparisons and makes it hard to draw the valuable lessons we could all learn from different ways of doing things.

I strongly support the proposal by the Royal Statistical Society for a thorough external review.

More immediately, it is hard to understand why the different nations have chosen to publish vaccination data in such different ways and at such different levels of detail. OSR is pursuing this.

Some people may be surprised by my mostly positive assessment of the handling of statistics and data over the past year. Their more negative view is likely to have been influenced by a number of – too many – particular examples of poor practice.

  • The presentation of data at No 10 press briefings has improved, helped by the later involvement of ONS staff, but early presentations were not always clear or well founded, and more recently a rushed presentation has undermined confidence.
  • Ministers have sometimes quoted unpublished management information, and continue to do so, against the requirements of the Code of Practice for Statistics. Such use of unpublished data leads, of course, to accusations of cooking the books or cherry-picking the data. It should not require my involvement or that of OSR to secure publication.
  • Perhaps most important is the damage to trust from the mishandling of testing data. The target of 100,000 tests per day was achieved by adding tests sent out to tests completed. As predicted, there was huge double counting, to the extent that some 1.3 million tests were eventually removed from the figures in August. The controversy over testing data seems likely to continue to undermine the credibility of statistics and the use that politicians make of them.

The Annex describes a range of current issues in relation to the pandemic, including testing and vaccinations, as well as replying more directly to your letter.

There are perhaps two areas the Committee might like to consider in terms of future change.

The first is the central role of the Authority together with the National Statistician and OSR. The UK has a decentralised system of statistics where individual departments are responsible for their statistics and departmental statisticians report within their departments. This has strengths we should not lose. It ties statistics and statisticians closely into the policy making of their departments and any change should not weaken that tie. But the complexity of data and statistics in the current crisis has shown the need in these circumstances for a firmer central controlling mind. The National Statistician and the ONS have taken this role to a large extent, through expertise, position and personality rather than formal agreement.

OSR has also taken a more expansive role. For the future there may be a place for more formal arrangements.

Secondly, it is clear that political pressures have led to some of the weaknesses in the handling of Covid statistics. It is to the credit of our politicians that they have created an organisation like the Authority that is permitted to criticise them, and in general politicians respond appropriately to our criticisms. But it might help if more issues were headed off before they arose. The Ministerial Code, for example, only asks Ministers to be ‘mindful’ of the Code of Practice. The requirement could be stronger.

In 2020, during the pandemic, the Authority published a new five-year strategy. It remains valid and we are pursuing it at pace. The ONS is leading the development of an Integrated Data Platform for Government as well as developing new and better statistics to help the country understand the economy and society, from trade to happiness and from crime to migration. Statistics to help the recovery are a particular focus. OSR is developing its work on statistical models – its review of exams algorithms will be published shortly – as well as on automation of statistics, data linkage, National Statistics designation, granularity and statistical literacy.

I look forward to keeping you and the Committee in touch with our progress.

 

Kind regards,

Sir David Norgrove

 

OFFICE FOR STATISTICS REGULATION ANNEX

1. Has the Government made enough progress on data since the start of the pandemic, and what gaps still remain?

Summary

Since the start of the pandemic Governments across the UK have maintained a flow of data which has been quite remarkable. New data collections have been established swiftly, existing collections have been amended or added to, and data sources have been linked together in new ways to provide additional insight. We have seen good examples from across the UK, including data on the virus itself and on the wider impacts of the pandemic.

However, in some areas Governments have been slow both to publish data and to ensure its fitness for purpose. For example, more consideration should have been given to the data available as testing and tracing was being set up. While UK governments have started publishing vaccinations data more promptly, and are continuing to develop the statistics on an ongoing basis, there remains much room for improvement in terms of the amount of information that is published and the breakdowns within the data. There are also gaps remaining in the data available to support longer term understanding of the implications of the pandemic.

It is clear that there is intensive work taking place to provide more comprehensive vaccinations data, both by each Government, and through cross-UK collaboration.

New data developed

There are many examples of outputs which have been developed quickly to provide new insights to help understand the pandemic and its implications. Some specific examples include:

  • The coronavirus (COVID-19) infection survey, carried out by the Office for National Statistics (ONS) in conjunction with partners. This is the world’s largest survey of its kind and the only representative survey that follows participants longitudinally, over a period of up to 16 months.
  • Statistics on the Coronavirus Job Retention Scheme (CJRS), the Self-Employment Income Support Scheme (SEISS) published by HM Revenue and Customs (HMRC), and the ONS Business Impact of Coronavirus Survey (BICS).
  • The Ministry of Justice (MoJ) and Her Majesty’s Prison and Probation Service (HMPPS) have published official statistics providing data on COVID-19 in HM Prison and Probation Service in England and Wales.

We have also seen outputs which attempt to draw together data to make it more easily accessible. The most well-known of these is the coronavirus dashboard, which is widely used and constantly evolving. While the dashboard focuses on data related to COVID-19 infections, hospitalisations and deaths, the ONS has pulled together a broader range of data from across the UK government and devolved administrations to highlight the effects of the pandemic on the economy and society: Coronavirus (COVID-19): 2020 in charts. This output covers a broad range of data including transport use, school attendance and anxiety levels.

Coronavirus (COVID-19) data

We have seen improvements to the data since the start of the pandemic, particularly around cases and deaths, with clearer information on the different sources of data available and the value of each of these sources.

However, we continue to have concerns, including seeing examples of data being referenced publicly before they have been released in an orderly manner, which we have highlighted to your Committee. In some areas Governments have been slow both to publish data and to ensure its fitness for purpose. For example, the UK coronavirus dashboard is now a rich source of information with plans for the inclusion of further data. However, it took several months to become the comprehensive source it is now.

Our concerns around COVID-19 health data cover three broad areas: testing and tracing, hospitalisations, and vaccinations.

Our concerns with data on testing have been made public on a number of occasions (1 & 2). While there have been significant improvements to the England Test and Trace data, it is still disappointing that there is no clear view of the end-to-end effectiveness of the test and trace programme.

In December we published a statement on data on hospital capacity and occupancy, noting that the biggest limitation of the existing data is the inability to distinguish between someone who is in hospital because of COVID-19 infection and someone who is in hospital for some other reason but also has a COVID-19 infection. These data should become available in time, but the delay limits understanding at a critical moment.

On vaccinations data, UK governments have been quick to start publishing data and have learnt some of the lessons from test and trace (see section 4). However, there remains room for much improvement in terms of the amount of information that is published and the breakdowns within the data. We would like to see more granular breakdowns and more consistency between administrations across the UK. Our letters to producers published on 1 December 2020 and 20 January 2021 outline our expectations in relation to these statistics.

The improvements cover four broad aspects of vaccinations data. First, there should be more granular data on percentage take-up, for example by age band, ethnic group and Joint Committee on Vaccination and Immunisation (JCVI) priority group.

Second, in terms of UK-level consistency, the data available for each administration vary:

  • Information on the percentages of individuals in each priority group that have been vaccinated to date is available for both Scotland and Wales.
  • In England, breakdowns including ethnicity and some age bands are published by NHS England. However, it is disappointing that the publication of Public Health England’s COVID-19 vaccine monitoring reports, which might have been a vehicle for more granular information about the vaccination programme in England, has been halted with no explanation as to why it has stopped or when it may restart. We have called upon the statistics producers to be clear on the data currently available and when more data will be published.
  • Northern Ireland data are included in the UK figures. However, the more granular data are not released in Northern Ireland itself in an orderly manner. We have written separately to the Department of Health (Northern Ireland), requesting that the data are published in an orderly and transparent way.

Third, Scotland is the only administration to routinely publish data on vaccine type (Pfizer-BioNTech or AstraZeneca). It is in the public interest to have this information for all parts of the UK, particularly in the context of the media coverage on vaccine efficacy and sustainability of supply.

Fourth, it would also be helpful to have better information on those who have been offered a vaccine and those who have taken it up. This would help with understanding why some people may not be taking up vaccines, for example whether this is due to refusal, access difficulties, or because they have not actually received an invitation to have a vaccine.

In addition to the areas outlined above, there are gaps remaining in health related COVID-19 data. For example, very little data exists on ‘long COVID’, despite emerging evidence that a proportion of people can suffer from symptoms that last for weeks or months after the infection has gone. We understand that the ONS is starting to look at this area and welcome this effort to fill an important data gap.

There are also gaps in the understanding of and information on variants of coronavirus. This is important in understanding the implications, for example whether data already collected on issues such as hospitalisations, deaths and efficacy of vaccinations are still applicable.

A fundamental role of official statistics and data is to hold government to account, and the lack of granularity and timeliness seen with some of the data makes this hard to do. We recognise that producers are working intensively to make improvements and we are keen to support these efforts.

Wider impacts

It is inevitable in such a fast-moving environment that there will be gaps in the data. To date, the priority has been to fill the gaps most needed to understand the immediate and most direct impacts of the pandemic, but at some stage the focus will need to shift from what is happening today to what data are needed to fully understand the longer-term consequences of the pandemic.

The issues are broad ranging and cover diverse areas such as:

  • the impact of the pandemic on children and young people
  • the long-term effects on health services
  • the future health of the nation
  • the impacts on inequalities
  • the financial impacts of both the pandemic and the response to it

Answering society’s questions about the pandemic must be done using both existing and new data. There are some longstanding statistical outputs on aspects of society that are likely to have been affected by the pandemic – for example, statistics on trade and migration. These outputs should be used to present analysis of the pandemic’s impact on those specific issues. However, where existing data are collected through a survey, they may be affected by the difficulties of collecting face-to-face survey data over the past 12 months.

It is also important to consider sooner rather than later where new data may be required.

2. Can you give a view on how well (or otherwise) the Government has communicated data on the spread of the virus and other metrics needed to inform its response to the pandemic?

During the pandemic we have highlighted two main areas of concern in the way governments have communicated data: accessibility and transparency.

Accessibility

Early in the pandemic there were many new sources of data being published by a range of organisations to support understanding of the pandemic. Much of these data were put out without narrative or sufficient explanation of limitations and could be hard to find. This made it difficult for members of the public to navigate the data and understand the key messages, which in turn could undermine confidence in the data and in the decisions made on the basis of them.

There have been improvements. For example:

  • Data have been more effectively drawn together in dashboards and summary articles, such as the UK coronavirus dashboard and the continually evolving dashboards in Scotland, Wales and Northern Ireland
  • Publications have been developed to include better metadata, explanation of sources and limitations, and how the data link with other sources. An early example was the Department for Transport’s (DfT) statistics on transport usage to support understanding of the public response to the pandemic. We also saw the Department of Health and Social Care (DHSC) develop its weekly test and trace statistics to include information on future development plans, and in October an article was published comparing methods used in the COVID-19 Infection Survey and NHS Test and Trace.

However, it can still be hard to know what is available and where to find data on the range of issues people are interested in.

Transparency

Throughout the pandemic we have been calling for greater transparency of data related to COVID-19. Early in the pandemic we published a statement highlighting the importance of transparency. We also published a statement and blog on 5 November 2020. One of the prompts for this was the press conference on 31 October announcing the month-long lockdown for England. The slides presented at this conference were difficult to understand and contained errors, and the data underpinning the decision to lock down were not published for several days after the press conference.

We continue to see instances of data being quoted publicly that are not in the public domain. Most recently, for example, at the Downing Street briefing on 3 February, the Prime Minister said:

“…we have today passed the milestone of 10 million vaccinations in the United Kingdom including almost 90% of those aged 75 and over in England and every eligible person in a care home…”

At the time this statement was made these figures had not been published. Breakdowns by age are not published for the UK as a whole. On 4 February NHS England included additional age breakdowns in its publication for data up to 31 January, including the percentage for those aged 75-79 for the first time (82.6% having had a first dose) and for those aged 80 and over (88.1% having had a first dose). The 10 million first doses figure for the UK was reached on 2 February (published 3 February). Our view is that it is poor data practice to announce figures selectively from a dataset without publishing the underlying data.

Our recent report on statistical leadership highlights the importance of governments across the UK showing statistical leadership to ensure the right data and analysis exist, that they are used at the right time to inform decisions, and that they are communicated clearly and transparently in a way that supports confidence in the data and decisions made on the basis of them. It sets out recommendations to support governments in demonstrating statistical leadership from the most junior analysts producing statistics to the most senior Ministers quoting statistics in parliaments and the media.

We will continue to copy letters to PACAC when we write publicly on transparency issues.

3. Can you give a view on how well the UK’s statistical infrastructure has held up to the challenge of the pandemic? Are there key systems or infrastructure issues that need to be addressed going forward?

By and large the statistical infrastructure has been extraordinarily resilient. It moved from a largely office-based operation to being almost completely remote in a very short space of time. There were also fast adjustments to allow surveys and data operations for major household, economic and business statistics to continue in some form. Furthermore, important new statistical surveys were launched in a few weeks that would have taken months of planning pre-pandemic, while existing surveys were adapted swiftly from face-to-face to online.

The statistical system has also responded quickly to the need to provide more timely data, for example to support daily briefings.

Producers have been more open to creative solutions and new data sources, for example web-scraped data and PAYE Real Time Information. There have also been greater instances of data sharing. For example, the ONS coronavirus insights dashboard seems to be working towards being a ‘one-stop shop’ for several different producers and the devolved administrations to collate data in a single place. This appears to be encouraging communication between producers and is improving each week.

There are key infrastructure opportunities that can now be exploited, and it is important to question which elements of the new approaches should remain and which should revert to how things used to be done. For example, when and how should data collection best return to face-to-face household surveys? Should legacy surveys like the Annual Business Survey continue, or should there be a move to new platforms or administrative data? How can the new data sources that have now come on stream be exploited even further? Is there a case for synthetic data to enhance existing data and help phase out large and expensive surveys? Can new survey platforms be used to answer short-term questions to help manage the impacts of the pandemic?

It is also important to learn lessons from mistakes that have occurred, at whatever stage of the process they occurred. One high profile example involved an error with testing data which meant there were delays to some data being included in the daily figures compiled by Public Health England (PHE).

Errors like this reflect the underlying data and process infrastructure. OSR is currently exploring the use of reproducible analytical pipelines (RAP) in government. We are focusing on what enables departments to successfully implement RAP and what issues prevent producers from either implementing RAP fully or applying elements of it. This work will give further insight into infrastructure challenges and where improvements may be needed.
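To illustrate the general idea (the file names, column names and checks below are our own hypothetical examples, not a description of any department’s actual pipeline), a reproducible analytical pipeline replaces manual, spreadsheet-based steps with version-controlled code, so that the same input always produces the same published output and every step can be reviewed, tested and re-run. A minimal sketch in Python might look like this:

    # Minimal illustrative sketch of a reproducible analytical pipeline (RAP).
    # All file and column names are hypothetical examples, not real data.
    import pandas as pd

    RAW_DATA = "daily_tests_raw.csv"        # hypothetical input extract
    OUTPUT = "daily_tests_published.csv"    # hypothetical publication table

    def load(path: str) -> pd.DataFrame:
        # Every run starts from the same source file.
        return pd.read_csv(path, parse_dates=["date"])

    def validate(df: pd.DataFrame) -> pd.DataFrame:
        # Automated checks replace manual inspection of spreadsheets.
        assert df["tests_completed"].ge(0).all(), "negative counts in input"
        assert not df.duplicated(subset="date").any(), "duplicate dates in input"
        return df

    def transform(df: pd.DataFrame) -> pd.DataFrame:
        # A single, scripted definition of the published measure.
        df["seven_day_average"] = df["tests_completed"].rolling(7).mean()
        return df

    if __name__ == "__main__":
        # The same code run on the same input always yields the same output,
        # and the whole process is auditable under version control.
        transform(validate(load(RAW_DATA))).to_csv(OUTPUT, index=False)

Because the pipeline is code rather than a sequence of manual edits, an error of the kind described above can be traced, fixed in one place and the published figures regenerated.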

Much of the data that are published are also drawn from administrative sources. In terms of monitoring the pandemic this has presented specific challenges. Public health, social care and hospital administrative systems are not connected to one another, which makes it time consuming to collate the data and puts quality at risk. Updating the IT infrastructure and data governance to make it possible to share information in a timely way is vital.

The pandemic has again highlighted the importance of analysts being involved in the development of operational systems, to make sure they are set up in a way that can best support data and evaluation needs.

The pandemic has also highlighted some of the strengths and limitations of the UK statistical system. For example, analysts embedded in policy departments and devolved administrations have been able to respond quickly to emerging issues, but this has been balanced against the complexity of having multiple organisations working on overlapping areas.

IT infrastructure, as well as how statisticians organise themselves within and across organisations, is covered further in our Statistical Leadership report.

4. What should the Government learn from the issues with testing data that can be applied to vaccine data?

A key issue that has persisted throughout the pandemic has been the need for timely data to be published as quickly as possible. We saw in the early days of testing that the data were disorderly and confused. We were keen that this experience was not repeated with the roll-out of vaccines. So before the start of the vaccine roll-out, we wrote to producers of health-related statistics across the UK, outlining our expectations that they should work proactively to develop statistics on the programme.

Drawing on the lessons to be learnt from the development of test and trace statistics, we outlined the following key requirements for vaccination statistics:

  • Any publications of statistics should be clear on the extent to which they can provide insight into the efficacy of the vaccinations, or whether they are solely focused on operational aspects of programme delivery.
  • Data definitions must be clear at the start to ensure public confidence in the data is maintained.
  • Where statistics are reported in relation to any target, the definition of the target and any related statistics should be clear.
  • The statistics should be released in a transparent and orderly way, under the guidance of the Chief Statistician/Head of Profession for Statistics.
  • Thought needs to be given to timeliness (for example, whether data should be daily or weekly) to meet the needs of a range of users.

Encouragingly, we have seen signs that producers have learnt from their previous experiences. For the most part they made initial vaccination data available quickly; at first the numbers were fairly crude, but they have continued to develop the data as time has gone on. We have also seen that, whereas data were initially published on a weekly basis, producers very quickly moved to publishing daily figures, with more detailed breakdowns provided in the weekly updates. This in part reflects a greater acknowledgement of the need to publish the data so that they can be quoted in parliaments and the media without undermining confidence.

This is pleasing but the statistics remain in development and there is more to be done, as we have outlined in this annex and in our letter of 20 January. We also asked producers to publish development plans for these statistics and indicate if some data cannot be provided, as this will help users to understand the limitations of the data available.

Office for Statistics Regulation

February 2021

Related links:

ONS written evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on data transparency and accountability: COVID-19

Office for Statistics Regulation written evidence to the Health, Social Care and Sport Committee’s inquiry on the impact of the Covid-19 outbreak, and its management, on health and social care in Wales

Dear Dr Lloyd,

I write in response to the Health, Social Care and Sport Committee’s call for evidence for the inquiry considering the impact of the Covid-19 outbreak, and its management, on health and social care in Wales.

The Office for Statistics Regulation (OSR) is the independent regulatory arm of the UK Statistics Authority. We provide independent regulation of all official statistics produced in the UK, including those produced in the devolved nations and the NHS. Our regulatory work is underpinned by the Statistics and Registration Service Act 2007.

We set the standards official statistics must meet through the statutory Code of Practice for Statistics. We ensure that producers of official statistics uphold these standards by conducting assessments against the Code. Those which meet the standards are given National Statistics status, indicating that they meet the highest standards of trustworthiness, quality, and value. We also report publicly on system-wide issues and on the way statistics are being used, celebrating when the standards are upheld and challenging publicly when they are not.

Statistics published by public sector bodies should be produced in a trustworthy way, be of high quality, and provide value by informing answers to society’s important questions. As the regulator of official statistics in the UK, our view is that good quality data is the bedrock for developing statistics that serve the public good.

Reliable data and evidence are fundamental to underpin policy and decisions relating to the delivery of health and social care in Wales. Our interest in relation to this inquiry mainly revolves around how the pandemic has impacted on the provision of health and social care data, and how this in turn will affect services both now and in the future.

This submission outlines some of the challenges faced in Wales in responding to the Covid-19 pandemic that we have identified through our own work:

  • Data to support decisions related to the Covid-19 pandemic;
  • Impact of the pandemic on health;
  • Impact of the pandemic on social care.

I look forward to seeing the conclusions of your inquiry. Please do not hesitate to contact me if I can be of any further assistance.

Yours sincerely,

Ed Humpherson

 

Office for Statistics Regulation Written Evidence: The impact of the COVID-19 outbreak, and its management, on health and social care in Wales

Data to support decisions related to the COVID-19 pandemic

    • It is important that data related to both the COVID-19 pandemic and its management is available so that effective decisions can be made on how to manage the impact of the virus and lead Wales out of the pandemic.
    • Throughout the pandemic, OSR has continued to monitor the development of data relating to both the pandemic and its impact. We have intervened with departments across the UK where we have identified a need for improvements to the data and statistics they have produced.
    • In November, we published a statement outlining the importance of sharing data in a way that promotes transparency and clarity. This outlined the importance of data that governments quote in statements and briefings being accessible to all and available in a timely manner.
    • Until recently, Welsh Government did not publish the slides and links to data sources to accompany their coronavirus briefings. Following our intervention, they have now started to make this information available on their website. This is helpful in enabling those who are interested to understand the context and reasoning behind the Welsh Government’s decisions.
    • Both Welsh Government and Public Health Wales have responded at speed and have developed new data collections and rapid COVID-19 surveillance dashboards to enable an understanding of the pandemic in Wales. In addition, Public Health Wales have published a paper examining risk factors for outbreaks of COVID-19 in care homes following hospital discharge.
    • Research of this nature is crucial to help understand the impact of practices early on in the pandemic and to inform future decisions. It is important to understand the strengths and limitations of such research. However, it is also important to be clear that much of the evidence is in its infancy and will continue to gain strength and weight as further evidence emerges. Whilst we welcome work of this nature to advance our understanding of the pandemic and its impacts, we also urge caution when basing decisions around future practice on early findings.
    • As the first vaccine to protect against COVID-19 is rolled out, hopes are rising that there is finally a way out of the pandemic. It will be important that statistics are developed on the rollout of vaccination programmes and that these are used to inform and manage such programmes. We have detailed our expectations in a letter to producers of health-related statistics across the UK, published on 1 December. Continued statistics on numbers of people infected with COVID-19 will, in part, help us to understand the efficacy of the vaccination programme.

Impact of the pandemic on health

    • The early impacts of the pandemic in relation to such factors as delivery of cancer screening, mental health services and elective operations have been well documented. The longer-term impacts will be wide ranging and will only become fully apparent over the course of several years. These impacts will range across both physical and mental health and will also be influenced by factors such as wider social and economic determinants of health and behavioural risk factors, such as smoking, diet and alcohol consumption.
    • It will be important that comprehensive data are available to allow monitoring of this wide range of factors, both so that the broader impact of the pandemic is fully understood, and so that effective policy and practice can be put in place to address the negative impacts. Public Health England have developed a monitoring tool to assess the wider impacts of COVID-19 on health [7]. We recommend that a similar approach is adopted in Wales.

Impact of the pandemic on social care

    • In 2018-19, we carried out a review of adult social care statistics across the UK [8]. Given the devolved nature of adult social care, the review looked at statistical issues in each of the four countries separately. Our research highlighted that adult social care has not been measured or managed as closely as healthcare, and a lack of funding has led to under-investment in data, analysis and resourcing. Furthermore, the introduction of a new data collection system in recent years had led to variable levels of data quality within and across local authorities.
    • In 2019, the Welsh Government produced a new Social Services Activity publication [9]. The information within this release covered a range of areas related to local authority social services, and the statistics had a range of users including ministers within the Welsh Government, local authorities, the Care Inspectorate Wales and Healthcare Inspectorate Wales, and charities.
    • In March 2020, Welsh Government data collections, research activity and outputs were reviewed in light of the coronavirus (COVID-19) pandemic. The data collections which this statistical release is based on were suspended for 2019-20, meaning that there will be no updated publication for this reporting year.
    • Although it is currently planned that data collections will resume for the 2020-21 reporting year, it is likely that the local authorities which provide the data will continue to be under significant pressure, meaning that the quality of the data they submit may be jeopardised. There also remains the risk that publication of key statistics will be delayed or suspended further for as long as the pandemic continues.

Office for Statistics Regulation follow-up written evidence to Scottish Parliament’s Health and Sport Committee’s inquiry on the future of social care delivery in Scotland

Dear Lewis,

SOCIAL CARE INQUIRY – FOLLOW-UP QUESTIONS

Following the written evidence we submitted to your Committee in February, I am pleased to offer our further views on the follow-up questions you have put to us.

1. Measuring individuals’ outcomes, and outcomes associated with the integration of health and social care in Scotland. How can outcomes evaluation and measurement be implemented so that it is statistically sound and useful?

All public bodies involved in the production of official statistics should adhere to the principles set out in the Code of Practice for Statistics (the Code). Compliance with the Code ensures that statistics are of public value, are of high quality and are produced by departments and public bodies that can be trusted.

We note that the 31 Integration Authorities are not official statistics producers, and therefore there is no statutory requirement for them to comply with the Code. Scottish Government and Public Health Scotland (PHS) are official statistics producers. This means that, whilst the data sources for social care statistics are provided by bodies and organisations such as health and social care partnerships, local authorities and third sector organisations, statistics based on this data should comply with the Code.

In our February 2020 report, Adult Social Care Statistics in Scotland, we highlighted that, at the time of writing, fundamental gaps existed in social care statistics in Scotland, which meant that they were not providing the range and depth of information needed to fully serve the public good. For example, a lack of information about outcomes for people who use social care was one of the most common frustrations we heard while conducting our research for the report. Users of social care statistics told us that too much emphasis is placed on counting system outputs, such as the number of hours of care delivered, the costs of services and the numbers of staff, and that they would instead like to know more about the outcomes achieved for the people using those services.

For outcome evaluation and measurement data to be published as official statistics, they will need to be developed by an official statistics producer in line with the Code. In our view, this will likely require some form of standardised data collection across the 31 Integration Authorities. Recent examples of such standardised collections include those developed for the Carers Census and the new PHS adult social care data collection system.

In our report we made the following recommendation: ‘All social care statistics producers need to work together, in consultation with health and social care partnerships and statistics users, to identify and prioritise actions to address social care data gaps – including by making better use of existing data – and meet users’ information needs.’

In addition to the development of social care statistics, we recognise there may be a need for organisations delivering social care services to develop operational management information on service outcomes to inform commissioning and delivery. As these organisations are not official statistics producers, this type of information is outside our formal regulatory remit. Notwithstanding this, in 2018 we introduced the ability for organisations that are not official statistics producers to sign up to voluntary application of the Code. This option is available to any producer of data, statistics and analysis which are not official statistics, whether inside government or beyond, to help them produce analytical outputs that are high quality, useful for supporting decisions, and well respected. A commitment to the Code pillars of Trustworthiness, Quality and Value offers the opportunity for an organisation to:

  • Compare its processes, methods and outputs against the recognised standards that the Code requires of official statistics.
  • Demonstrate to the public its commitment to trustworthiness, quality and public value.

Outcomes evaluation and measurement is not a challenge unique to Scotland. In our January 2020 report, Adult Social Care Statistics in England, we outlined that reliable and comprehensive evidence is vital for evaluating delivery and informing policy decisions which can lead to improved outcomes and support individual choice.

There are two aspects to this challenge.

  • Definitions: In the health context, there is a well-developed approach to comparing different interventions based on their impact on the life of the patient – known as Quality-Adjusted Life Years (QALYs); a worked illustration follows this list. There is not yet a comprehensive framework for thinking about and defining outcomes for social care interventions.
  • Cost effectiveness: In our England report, we highlighted that unlike health, where the effectiveness of interventions is a priority research area, in social care there is very little understanding of the most cost-effective intervention and what the impact of each intervention is. We strongly encouraged the implementation of joined up data across health and social care in England to understand how the two systems interact and what drives the best outcomes.
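As a worked illustration of the QALY approach referred to above (this is the standard definition used in health economics, not a formula taken from our report), QALYs weight the length of life gained by the quality of life in which it is lived:

    \text{QALYs} = \sum_i t_i \times u_i

where t_i is the time spent in health state i (in years) and u_i is the utility weight attached to that state, with 1 representing full health and 0 representing death. For example, an intervention that gives a patient two additional years at a utility weight of 0.5 yields one QALY. No equivalent, widely agreed weighting yet exists for social care outcomes, which is part of the definitional gap described above.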

2. Professor David Bell, University of Stirling, highlighted in his evidence to the Committee the lack of data collection in Scotland in comparison with other parts of the UK. Currently, Scottish researchers rely on English statistics used for projecting demand. Do you have a view on this, particularly in relation to the policy divergence between the two health and care systems?

With regard to the specific issue that Scottish researchers currently rely on English statistics for projecting demand, we do not have sufficient evidence to make a judgement.

However, in our February 2020 report, Adult Social Care Statistics in Scotland, we noted that many researchers are keen to make more use of health and social care data. We welcomed the creation of Research Data Scotland, which we hope will help address the data access issues that researchers currently face when seeking health, social care and other data, and support greater joining up of these data. In addition, in our report we recommended that PHS and Scottish Government should convene a social care data user summit in 2020 to help inform Research Data Scotland’s development and PHS’s plans for making more use of linked health and social care data.

Unfortunately, because of the restrictions imposed in response to the pandemic, this summit has not yet been convened. We are in regular discussion with both PHS and the Scottish Government and are keen for this recommendation to be realised.

3. What approach(es) to data collection do you think need to be considered and what data do you feel is required?

As part of our research for our February 2020 report, Adult Social Care Statistics in Scotland, we spoke to organisations who are data providers for the current social care statistics. We highlighted the following important issues that require careful consideration as part of any new or amended official statistics data collection system.

  • Resourcing issues beset all aspects of social care data collection and statistics production. This includes the availability of staff to collect data and return it, the need for investment to improve its quality, the need for entirely new forms of data to be collected to better meet user needs, and the availability and capability of staff to use the data themselves to inform service development locally. We recognise that the resource implications associated with building new data systems are far greater than those associated with improving existing statistics. The drivers and funding to do this will also be largely beyond the reach of statistics producers alone.
  • Building data collection systems that deliver value to staff and users of social care is also difficult to do without imposing unreasonable administrative burdens. A significant amount of social care activity takes place beyond the scope of public sector service settings and the majority of the social care workforce (around 70%) is employed by private and third-sector providers. This makes the task of building routine data collection systems significantly harder.

It is our view that Scotland is not alone in facing these challenges. Our work in this sector as a UK wide regulator has identified similar difficulties with collecting data in disparate settings about human experiences (as opposed to flows of money or service provision).

Whilst we cannot be specific about what data are required in this case, we do expect that, in line with the Code, users of statistics and data should be at the centre of statistical production. Understanding user needs and seeking the views of users is important and should be used to direct what data are required. Official statistics producers should establish an ongoing dialogue with users to ensure that statistics continue to meet changing user needs and demand.

The COVID-19 pandemic has emphasised the importance of responding to user need and has brought attention to existing gaps in adult social care statistics.

I hope this is useful to the Committee.

Yours sincerely
Ed Humpherson
Director General for Regulation

Office for National Statistics and Office for Statistics Regulation oral evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on data transparency and accountability: COVID-19

On 22 September 2020 Professor Sir Ian Diamond, National Statistician and Ed Humpherson, Director General for Regulation, gave evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on data transparency and accountability: COVID-19.

A transcript has been published on the UK Parliament website.

Office for Statistics Regulation written evidence to the Public Accounts Committee’s inquiry into Digital Transformation in the NHS

Dear Meg,

I write in response to the Public Accounts Committee’s call for evidence for the inquiry considering Digital Transformation in the NHS.

The Office for Statistics Regulation (OSR) is the independent regulatory arm of the UK Statistics Authority. We provide independent regulation of all official statistics produced in the UK, including those produced in the devolved nations and the NHS. Our regulatory work is underpinned by the Statistics and Registration Service Act 2007.

We set the standards official statistics must meet through the statutory Code of Practice for Statistics. We ensure that producers of official statistics uphold these standards by conducting assessments against the Code. Those which meet the standards are given National Statistics status, indicating that they meet the highest standards of trustworthiness, quality, and value. We also report publicly on system-wide issues and on the way statistics are being used, celebrating when the standards are upheld and challenging publicly when they are not.

Statistics published by public sector bodies should be produced in a trustworthy way, be of high quality, and provide value by informing answers to society’s important questions. As the regulator of official statistics in the UK, our view is that good quality, granular electronic data is the bedrock for developing statistics that serve the public good. We support and share the ambition of the Department of Health and Social Care (DHSC) to digitally transform the NHS to enable more effective data sharing between health and social care organisations, but we recognise the scale and complexity of the task. Through our own work, we understand the challenges faced by DHSC, and in our view there are several foundational issues that need to be addressed in order to realise this ambition.

This submission outlines some of the challenges the NHS faces in its digital transformation, which we recognise from our own work:

  • Data harmonisation
  • Data linkage
  • Digital skills of the workforce
  • Fragmentation between health care and social care in England

I look forward to seeing the conclusions of your inquiry. Please do not hesitate to contact me if I can be of any further assistance.

Yours sincerely,

Ed

 

PUBLIC ACCOUNTS COMMITTEE INQUIRY – DIGITAL TRANSFORMATION IN THE NHS

 

Joined-up data and data linkage

 

  1. From our analysis, and from the understanding of health and social care statistics users’ needs that we have gained through our regulatory work, we know there is a huge appetite for joined-up statistics that paint a more complete picture of people’s journeys through the different parts of the health and social care system.
  2. To optimise the benefits that information technology can bring in painting that picture, those responsible for health and social care data should find more efficient and effective ways to legally share data across organisational boundaries, not just for operational or client/patient care reasons, but for research and statistical purposes too. In 2017, we launched a review of the UK statistics system’s ability to provide greater insight to users via data linkage. Our timing was in part prompted by a desire to take stock of the landscape before the new data sharing provisions for statistics and research, introduced by the Digital Economy Act (2017) (DEA), were implemented. Our desired outcome is for data linkage to be widely used to answer society’s important questions in a timely manner.
  3. Our latest update to the Joining Up Data for Better Statistics[1] review, published in 2019, found that although data linkage should be a vital part of the official statistics landscape, for the most part it is not, and value is being squandered as a result. There are some powerful examples of data linkage being used in government to provide insights and drive policy change, but these are the exception, and we are concerned that the time and effort required to create linked data resources can discourage others seeking to do similar work. An illustrative sketch of what record-level linkage can involve is included after this list.
  4. In improving its digital infrastructure, the Department of Health and Social Care (DHSC) needs to ensure that NHS systems facilitate quick and efficient data sharing and linkage.
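
The following is an illustrative sketch only, using invented dataset names, field names and identifiers rather than any real NHS or DHSC data standard, of what deterministic record-level linkage on a shared pseudonymised identifier can look like. In practice, such linkage would take place in a secure environment under the safeguards provided by the DEA, and probabilistic methods may be needed where no common identifier exists.

    # Illustrative only: invented, pseudonymised extracts standing in for health
    # and social care records; all names and identifiers here are hypothetical.
    import pandas as pd

    hospital_episodes = pd.DataFrame({
        "pseudo_id": ["A01", "A02", "A03"],  # pseudonymised person identifier
        "discharge_date": ["2020-03-01", "2020-03-05", "2020-03-09"],
        "discharge_destination": ["care home", "own home", "care home"],
    })

    care_assessments = pd.DataFrame({
        "pseudo_id": ["A01", "A03", "A04"],
        "assessment_date": ["2020-03-04", "2020-03-12", "2020-02-20"],
        "care_package": ["residential", "domiciliary", "residential"],
    })

    # Deterministic linkage on the shared pseudonymised identifier; an outer join
    # keeps people who appear in only one system, which is itself informative
    # when studying journeys between hospital discharge and social care.
    linked = hospital_episodes.merge(care_assessments, on="pseudo_id", how="outer")
    print(linked)

Even this toy example illustrates why linkage depends on the points made above: a shared identifier, lawful gateways to share it, and comparable definitions on both sides of the join.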

 

Digital skills of the workforce

 

  1. The culture of healthcare and social care settings, which primarily rely on face-to-face techniques to examine and care for people, means that staff will need to be involved in the design and development of new software, and be trained in its implementation.
  2. The charity doteveryone carried out research to understand the current impact of technology in the social care system and its potential to shape the future. As part of their findings, published in the report Better Care in the Age of Automation[2], they recognise that people need skilled help and flexibility for technology to work for them, and that a culture of suspicion and fear inhibits people from taking advantage of new innovations. As well as recommending long-term investment in better data to support a more sustainable and fair system, they recommend the establishment of a Royal College of Carers. This new organisation would provide the resource and professionalism to supplement the existing skills of carers and support the use of any new technologies in the sector.
  3. In improving the digital infrastructure of the NHS, DHSC needs to ensure that users of any new technology are suitably involved in its design and trained in its implementation across the NHS.

 

Imbalance between health care and social care

 

  1. The existing fragmentation between health care and social care in England exacerbates the challenge of digitalisation in the NHS. Adult social care is a large and important area which requires strong evidence to support effective policy development, delivery of care and personal choice. Our recent review of adult social care statistics in England[3] found that this sector is poorly served by data. A scarcity of funding has led to under-investment in data and analysis, making it harder for individuals and organisations to make informed decisions. The lack of investment, resourcing and collaboration has led to an imbalance in the quality and value of the statistics when compared with those in the health care sector.
  2. Our review highlighted three main areas for attention:
  • Better leadership and collaboration across the many different organisations involved in publishing official statistics on social care, enabling joint working across boundaries between government departments, local authorities, and public and private sector providers.
  • Gaps in available data: most of the information available comes from local authorities with responsibilities for adult social services and does not cover private household expenditure, privately funded care or the value of unpaid care, which limits knowledge of individuals’ care journeys and outcomes.
  • Improving existing official statistics through accessibility, coherence, quality, timeliness, and granularity of the data to provide insight and allow existing data to better meet user needs.
  3. In line with the introduction of new technologies to assist healthcare[4], we want to see progress made with proposed infrastructure that will support the integration of health and social care data, so that there is a better understanding of the interaction between health and care and an individual’s experience. We welcome plans set out in the Government’s vision for digital, data and technology in health and care[5]. We hope the establishment of NHSX, a body to ‘progress digital transformation of the NHS’, will allow government to deliver on this ambition while considering data needs. There is also potential for the Office for National Statistics (ONS) to support the sector through innovative approaches to data analysis, such as data linking, and use of provisions in the DEA.
  4. The separation of health and social care data is apparent not only in official statistics and public policy, but also in other ways. The digital capability of the social care sector to embed any technological solutions effectively has also been questioned, as highlighted in the NAO report ‘Digital Transformation in the NHS’: ‘practices had more mature arrangements in place for sharing electronic patient records with other healthcare providers in their area than they had with social care providers’. This is also reflected amongst adult social care providers, with the Care Quality Commission reporting challenges faced by adult social care providers in adopting digital technology. Further research by doteveryone, Better Evidence for Better Care[6], also suggests that there is not enough evidence available for commissioners and providers to implement new technology into social care services effectively.
  5. To improve the digitalisation of the NHS, DHSC needs to fully understand the barriers and challenges faced in the social care sector. We will continue to work with a range of organisations to make the case for improvements to social care statistics.

 

Data harmonisation

 

  1. In our view, the most significant long-term solution to improving the coverage and quality of health and social care statistics is the transformation of social care data collection and analysis, bringing them on to a par with hospital data. NHSX was set up in 2019 and, encouragingly, steps are being taken by some regions of England to pilot a single health and social care record. A single patient record would enable end-to-end analysis of the patient journey and experience of services across the NHS and more widely. It would improve care for people, particularly those with multiple long-term conditions in the care of separate specialist teams, and the use of data and technology to achieve this is key.
  2. Patient-level data that straddles organisational boundaries (such as health/social care or hospital/care home) also needs agreed definitions and standards, and should harmonise more widely with other data collections, such as the Census. This is the basis upon which good quality statistics can be developed; an illustrative sketch of the kind of mapping such harmonisation involves follows this list.
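
As an illustrative sketch only, and not a description of any existing NHS or Census data standard, the example below shows the kind of mapping step that agreed definitions make possible: values recorded differently by different providers are translated into a common category list before data are combined or linked. All local codes and standard categories here are invented.

    # Illustrative only: the local values and agreed categories are invented to
    # show the idea of harmonising local recording practice to a shared standard.
    LOCAL_TO_STANDARD = {
        "res home": "residential_care",
        "residential": "residential_care",
        "nursing": "nursing_home",
        "dom care": "domiciliary_care",
        "home care": "domiciliary_care",
    }

    def harmonise_setting(local_value: str) -> str:
        """Map a locally recorded care-setting description to the agreed category."""
        key = local_value.strip().lower()
        return LOCAL_TO_STANDARD.get(key, "unknown")

    # Two providers recording the same setting in different ways both map to one
    # harmonised category, so their records can be compared and aggregated.
    print(harmonise_setting("Res Home"))   # residential_care
    print(harmonise_setting("home care"))  # domiciliary_care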

OFFICE FOR STATISTICS REGULATION, SEPTEMBER 2020

 

[1] Joining Up Data for Better Statistics – 2019 update

[2] doteveryone – Better Care in the Age of Automation, September 2019

[3] Office for Statistics Regulation – Report on Adult Social Care statistics in England, January 2020

[4] Government Technology Innovation Strategy, June 2019

[5] The future of healthcare: our vision for digital, data and technology in health and care, October 2018

[6] doteveryone – Better Evidence for Better Care, 2019

Office for Statistics Regulation written evidence to the Health and Social Care Committee’s inquiry on Social Care: Funding and Workforce

Dear Mr Hunt,

I write in response to the Health and Social Care Committee’s call for evidence for the inquiry considering Social Care: Funding and Workforce.

The Office for Statistics Regulation (OSR) is the independent regulatory arm of the UK Statistics Authority.

We provide independent regulation of all official statistics produced in the UK, including those in Devolved Nations and the NHS. Our regulatory work is underpinned by the Statistics and Registration Service Act 2007.

We set the standards official statistics must meet through the statutory Code of Practice for Statistics. We ensure that producers of official statistics uphold these standards by conducting assessments against the Code. Those which meet the standards are given National Statistics status, indicating that they meet the highest standards of trustworthiness, quality, and value. We also report publicly on system-wide issues and on the way statistics are being used, celebrating when the standards are upheld and challenging publicly when they are not.

In January 2020, the OSR published findings from an in-depth review of Adult Social Care statistics in England. We are using this report as the basis for our submission to the Committee, the findings of which have never been more relevant as society adjusts to the rapid changes resulting from the ongoing coronavirus (COVID-19) pandemic.

There are gaps in the data and information that might tell us about the real cost of providing social care and ensuring good outcomes for people who need social care. Our review finds that this important sector of public policy is very poorly served by data. Social care has not been measured or managed as visibly as hospital care. The gaps in data and analysis make it harder for individuals and organisations to make informed decisions.

We want to see improvements to the existing statistics, as well as more fundamental changes. This will require a cross-government commitment to improvements. We strongly encourage the implementation of joined up data across health and social care to understand how the two systems interact, and what drives the best outcomes.

Our review highlighted three main areas for attention:

• Better leadership and collaboration across the many different organisations involved in publishing official statistics on social care, enabling joint working across boundaries between government departments, local authorities, and public and private sector providers.
• Addressing gaps in available data: most of the information available comes from local authorities with responsibilities for adult social services and does not cover private household expenditure, privately funded care, or the value of unpaid care, meaning the total cost of social care provision remains unknown.
• Improving existing official statistics through accessibility, coherence, quality, timeliness, and granularity of the data to provide insight and allow existing data to better meet user needs.

We have said for some time that there is no parity of measurement between the health and social care sectors. The COVID-19 pandemic has had a significant impact on care homes and has clearly shown that the approach to measurement in the social care sector has been lacking. In response to the pandemic, there is now more data available on social care; this should continue after the pandemic ends.

The Committee may also be interested in our other work in response to COVID-19. This includes rapid regulatory reviews of new outputs from the Government Statistical Service, and statements advocating improvements to the presentation and availability of data on COVID-19.

We will continue to work with a range of organisations to make the case for improvements to social care statistics in England and more widely across the UK. We hope to raise the profile of these issues through this submission.

I look forward to seeing the conclusions of your inquiry. Please do not hesitate to contact me if I can be of any further assistance.

Yours sincerely,
Ed Humpherson
Director General for Regulation

 

ANNEX

Overview

1. This submission is based on the findings from our review of Adult Social Care Statistics in England published in January 2020.

2. Adult social care is a large and important area which requires strong evidence to support effective policy development, delivery of care and personal choice. Better data infrastructure and outputs which address the gaps in existing data are essential for individuals and organisations to make informed decisions.

3. Improved data matters in solving problems, supporting efficiency, and maximising outcomes. It is also important in informing decisions made by individuals about the care they receive or provide for themselves and their families. Collaboration across traditional boundaries, and across the public and private sectors, is necessary to deliver a coherent and complete picture of adult social care.

Better leadership and collaboration

4. There needs to be a strong voice to champion statistics that meet a range of user needs and strong leadership to implement the required changes. Many different organisations are involved in publishing official statistics on social care. Making improvements will require collaboration across government departments; local authorities; and between public and private sector providers.

5. In line with the introduction of new technologies to assist healthcare, we want to see progress made with proposed infrastructure that will support the integration of health and social care data so that there is a better understanding of the interaction between health and care and an individual’s experience. We welcome plans set out in the government’s vision for digital, data and technology in health and care. We hope the establishment of NHSX, a body to ‘progress digital transformation of the NHS’, will allow government to deliver on this ambition while considering data needs. There is also potential for the Office for National Statistics (ONS) to support the sector through innovative approaches to data analysis, such as data linking, and use of provisions in the Digital Economy Act.

Addressing data gaps

6. There are significant gaps in what adult social care data currently measures:

• Delivery of social care outside statutory control: Statistics on social care activity are primarily sourced from data provided by Councils with Adult Social Services Responsibilities (CASSRs). The established assessment criteria mean that many individuals privately funding care or receiving informal care have little or no contact with a local authority. CASSRs can therefore only measure part of the picture. These limited data have to act as a proxy for the whole social care sector. The information on unmet need and future demand is also limited.
• Funding outside statutory control: There are gaps in understanding of the scale of household expenditure on privately funded care and the value of unpaid care. There is no official estimate of the value of unpaid care provided by family and friends, but the unofficial estimates that do exist range from £100bn to £132bn per year, far exceeding public expenditure reported by HM Treasury*, giving a sense of the unacknowledged value of this support.
• Individual experiences and quality of care: There is little information on pathways and transitions between health care and social care – new infrastructure is required to effectively address this. There is also little information on the quality of care and outcomes for those who experience social care.

7. The gaps identified are significant and need to be addressed in order to support effective delivery and facilitate improved outcomes for those who experience social care. There is public and policy interest in knowing about social care activity and spend wherever it happens, whether in a person’s own home, a residential home or a nursing home. The traditional route of relying on data collected by local authorities to compile official statistics is not enough.

Improving existing official statistics

8. Looking across existing statistics on adult social care, we found some good examples of insightful analysis. However, there were many instances where improvements were necessary. Improvements should be made to the existing official statistics in the following areas:

• Accessibility
• Coherence
• Quality
• Timeliness
• Granularity of the data

9. Changes in these areas could improve insight and allow the existing data to better meet user needs. We welcome the ONS proposals for a portal to signpost users to existing social care statistics, and want to see all producers of social care statistics take up the recommendations we have set out in letters to the relevant Head of Profession for Statistics following our detailed assessment of individual official statistics outputs as part of this review.

10. We will continue to work with a range of organisations to make the case for improvements to social care statistics. We hope to raise the profile of the issues highlighted in this report and work towards parity of esteem between health and social care statistics.

11. Improved statistics can support policy makers who are developing proposals to reform delivery of adult social care, as well as individuals who will be able to hold government to account and make better informed decisions about the issues impacting their lives and their families.

Data and statistics on COVID-19 impacts on the care sector

12. Statistics on COVID-19 in the care sector – including care home outbreaks, the number of suspected COVID-19 cases in care homes, and registered deaths in care homes involving COVID-19 – are currently released through a variety of different reports including daily and weekly surveillance reports and within weekly registered death releases. These statistics start to provide a picture of the impacts on those receiving care and help decision makers to understand and manage COVID-19 within care settings. However, further analyses are needed to provide context and facilitate a better understanding of key areas for concern.

13. To further improve these statistics, we suggest producers continue collaborating to present a coherent picture of the impact of COVID-19 on those in care settings across the UK. For example, the ONS is collaborating with the Care Quality Commission in England and the Care Inspectorate Wales to publish early estimates of COVID-19 related deaths in care homes. We welcome these new data and efforts and recognise that producers are seeking to develop statistics provision in this area.

14. Producers also need to explain the wider context of COVID-19 and the large number of deaths for those in care settings. There is a need for information to contextualise the data and statistics on deaths in the sector as well as to support management of COVID-19.

15. Alongside this, producers need to understand and assess the impact of any changes in the circumstances and context of data sources, and any implications for use should be clearly explained. Within the varied landscape of statistics and data on those in care settings, producers should make the definitions within their outputs clear to users. For example, clearly identifying statistics as deaths involving COVID-19, deaths due to COVID-19, or deaths of those with a positive test result.

16. Producers should work closely with relevant parties, such as the Care Quality Commission, to understand and investigate any changes in the recording of COVID-19 on death certificates which may impact on the accuracy of the data on deaths in the care sector.

17. There is a need for producers to provide or enable regional comparisons, and UK-wide comparisons where possible, with guidance and contextual information to support the interpretation of the statistics. Guidance should be provided on whether the data from the different countries of the UK can be compared, to help users understand and interpret the statistics. The similarities and differences between the country-level data should be clearly explained, particularly any differences in care provision, in the characteristics of the population receiving care, and in data collection methods that could affect the ability to make comparisons.

18. The OSR has also published a full statement on data and statistics around the impact of COVID-19 on the care sector.

OFFICE FOR STATISTICS REGULATION, JUNE 2020

 

* Latest figures from HM Treasury show that public expenditure on personal social services in England (table 10.1 of that report) amounted to £24.5 billion in 2017/18, and this does not include the significant private expenditure on social care.

Office for National Statistics and Office for Statistics Regulation oral evidence to the Public Administration and Constitutional Affairs Committee’s inquiry regarding COVID-19 data

On Wednesday 13 May 2020, Professor Sir Ian Diamond, UK National Statistician, and Ed Humpherson, Director General for Regulation, UK Statistics Authority, gave oral evidence to the Public Administration and Constitutional Affairs Committee’s inquiry regarding COVID-19 data.

A transcript of the session has been published on the UK Parliament’s website.

Related Links:

Professor Ian Diamond to William Wragg MP, Chair PACAC regarding COVID-19 (April 2020)
Ed Humpherson to William Wragg MP, Chair PACAC regarding COVID-19 (April 2020)

Office for Statistics Regulation written evidence to the Digital, Culture, Media and Sport Sub-committee on online harms and disinformation’s inquiry of the same name

Dear Chair,

I write in response to the Digital, Culture, Media and Sport Sub-committee on Online Harms and Disinformation’s call for evidence.

The Office for Statistics Regulation (OSR) provides independent regulation of all statistics produced by the UK Government, Devolved Nations and by all related public bodies. The OSR is the independent regulatory arm of the UK Statistics Authority (the Authority), which was established by the Statistics and Registration Service Act 2007 (the SRSA).

We set the standards producers of official statistics must meet through the statutory Code of Practice for Statistics. We assess compliance with this Code, and designate statistics as National Statistics. There are three pillars of the Code:

• Trustworthiness: trusted people, systems and processes
• Quality: robust data, method and statistics
• Value: statistics that serve the public good

Our role is to ensure that statistics serve the public good. In a world in which data and information are abundant, people can feel bombarded by information. We focus on the government as a provider of information and statistics, disseminating a reliable, impartial evidence base.

While we also have an important role in challenging instances of statistical misuse (misinformation), most of our regulatory work focuses on what it means to inform society. We help the public to identify the statistics that meet the highest standards of trustworthiness, quality and value and we challenge producers to fill data gaps to better inform society.

This is a very difficult time for everyone as the UK adjusts to rapid changes in society and the economy. Organisations that produce official statistics are showing flexibility and adapting what they collect and publish to respond to this new environment. The pace at which these organisations have set up new data collection and dissemination processes has been unprecedented and enables timely updates on the number of COVID-19 cases and deaths, as well as the economic and societal impacts of the pandemic.

In response to COVID-19 we have developed a package of measures including guidance on factors that producers should consider when making changes to data collection and statistics. We have carried out short regulatory reviews of new COVID-19 questions added to the Office for National Statistics (ONS) Opinions and Lifestyle survey, and of new experimental faster indicators constructed from rapid response surveys, novel data sources and experimental methods.

In accordance with our interventions policy we have responded to concerns about the publication of data on COVID-19 cases and deaths, and have called on the Department for Work and Pensions to ensure management information on Universal Credit used in daily briefings is published and accessible to the public. We have undertaken a review of all the data releases on COVID-19 cases and deaths – at a UK level and for each country within the UK – to help understanding of the available sources and to highlight strengths and areas for improvement.

Following our interventions regarding data on COVID-19 cases and deaths, there have been improvements in the information provided by government. In particular, there is now much greater clarity that the daily deaths data is incomplete and does not include deaths in all settings. It is a leading indicator, however, with the weekly figures from the ONS (and National Records of Scotland and the Northern Ireland Statistics and Research Agency) providing a more complete picture of deaths associated with COVID-19.

However, in order to maintain public confidence in these crucial statistics, we are encouraging producers to continue to clarify the nature and extent of the uncertainty around the UK estimates of deaths associated with COVID-19, and what the figures do and do not include. We also continue to state our expectation that any management information used as part of daily public briefings is published and accessible to the public.

In summary, while combatting misinformation is crucial, it is also essential that the public receives information from government that is trustworthy, high quality and valuable – and enabling that outcome is at the heart of OSR’s mission.

I hope the Committee finds this evidence to be helpful. Please do not hesitate to contact me if I can be of any further assistance.

Yours sincerely
Ed Humpherson
Director General for Regulation

Office for Statistics Regulation correspondence to the Public Administration and Constitutional Affairs Committee regarding cases and deaths from COVID-19

Dear Mr Wragg,

As Director General for Regulation at the UK Statistics Authority and Head of the Office for Statistics Regulation (OSR), I write in response to your letter of 14 April to the National Statistician, covering the regulatory perspective on COVID-19 statistics and data. This letter summarises OSR’s review of data and statistics on COVID-19 cases and mortality, including trustworthiness, quality and value. It also considers broader regulatory points about the use of management information, and statistics on adult social care.

This is a very difficult time for everyone as the UK adjusts to rapid changes in society and the economy. OSR commends the flexibility and level of responsiveness shown by organisations that produce official statistics in adapting to this new environment and will continue to support further improvements to statistics and data on COVID-19.

Reviewing statistics and data

OSR has undertaken a review of all the data releases on COVID-19 cases and deaths – at a UK level and for each country within the UK – to help understanding of the available sources and to highlight strengths and our view on areas for improvement. The relative strengths and limitations were considered within the context of the three pillars of the Code of Practice for Statistics that you mentioned, referred to as TQV:

• Trustworthiness: governance, including people, systems and processes
• Quality: robust data, method and statistics
• Value: statistics that answer people’s key questions

The document we published yesterday outlines the findings from our review. It acknowledges that there is value in having timely data, such as the daily surveillance data covering the UK that is published by DHSC less than 24 hours after the data reporting period. This output provides an important leading indicator of the trend in COVID-19 testing, cases and deaths. However, this timeliness involves trade-offs. First, there is a trade-off with completeness: for example, the setting in which a death occurred is not always published. Because the data from England only captures deaths in hospitals and not deaths in the wider community, these UK daily outputs struggle to meet the needs of all users and require continuous innovation to include information about where the death took place. We understand that the ONS is working with the Care Quality Commission to publish further data on deaths of care home residents. Secondly, although we have seen notable improvements in the metadata that accompany the daily data for each nation, the nature and extent of the uncertainty around the UK estimates of deaths associated with COVID-19 could be clearer. Finally, we are concerned about the accessibility of the data from all four nations and have asked the Government Statistical Service to consider enabling users to navigate all COVID-19 related outputs from a central hub. None of the daily data releases are designated as National Statistics.

In contrast, the weekly mortality statistics published by the ONS for England and Wales, and by the National Records of Scotland and the Northern Ireland Statistics and Research Agency, provide a more complete measure of the number of people whose deaths are associated with COVID-19, but these statistics are released with a greater time lag and are not designed to measure the spread of a pandemic in close to real-time. Weekly death registration reports focusing on COVID-19 use a variety of data recorded on the death certificate, such as setting of death, sex, age band and underlying diseases that may have contributed to a death. Overall, the weekly mortality statistics largely fulfil the Code’s expectations on trustworthiness, quality and value. Unlike the daily data, the weekly mortality statistics published for England and Wales, Scotland and Northern Ireland are all designated as National Statistics.

Broader regulatory points

There are two further regulatory points I wanted to draw to your attention. First, there is a wide range of information being used by Government to inform its understanding of the impact of COVID-19 on the economy and society. Where that management information is used as part of daily public briefings, it should be published and accessible to the wider public. Second, you note that concerns have been expressed about the limited information on deaths in care homes compared to deaths in hospitals. These concerns echo the findings of OSR’s review of statistics on social care for England, published in January 2020: we concluded that, compared with the data-rich health system, social care is served by inadequate statistics, and we called for greater parity of measurement between the two.

Our document published yesterday includes more detail on our work on COVID-19 statistics and data, and I hope the Committee finds this letter to be helpful.

Yours sincerely
Ed Humpherson

Related Links:

Professor Ian Diamond to William Wragg MP, Chair PACAC (April 2020)
ONS and OSR Oral evidence to PACAC regarding COVID-19 (May 2020)