Office for National Statistics oral evidence to the Science and Technology Committee’s inquiry on Research and Development Statistics

On Wednesday 7 December, Mike Keoghan, Deputy National Statistician for Economic, Social and Environmental Statistics, Office for National Statistics, and Darren Morgan, Director for Economic Statistics Production and Analysis, Office for National Statistics, gave evidence to the Science and Technology Committee for its inquiry, ‘Research and Development Statistics’.

A transcript of the session has been published on the UK Parliament website.

Office for Statistics Regulation written evidence to the Levelling Up, Housing and Communities Committee’s inquiry into Funding for Levelling Up

Dear Mr Betts,

I write in response to the Levelling Up, Housing and Communities Committee’s inquiry into Funding for Levelling Up.

The Office for Statistics Regulation (OSR) is the independent regulatory arm of the UK Statistics Authority and provides independent regulation of all official statistics produced in the UK. We aim to enhance public confidence in the trustworthiness, quality and value of statistics produced by government through setting, and assessing compliance with, the Code of Practice for Statistics.

Like the Committee, we strongly advocate for the transparency of data that are in the public interest and required for key decisions underpinning policy making. Our expectations are outlined in our guidance on intelligent transparency. We would like to see greater clarity and transparency around the data that are required to understand progress against the UK Government’s levelling up policy commitments. This includes both data on funding for levelling up policies and the metrics that will be used to measure the outcomes. Where there are new or existing data of sufficient quality and at the level of granularity users require, we would like to see greater transparency and accessibility of those data.

 

We are aware that the Committee wrote to Neil O’Brien MP, then Parliamentary Under Secretary of State for Levelling Up, asking for further information about the various funding programmes for levelling up policies; his response, accompanied by a funding map dataset, was published on the Committee’s webpages. As far as we can tell, this has not been published elsewhere. We are aware that the Committee has concerns about the table that was provided, which is limited in its utility, and about the guidance needed for a comprehensive interpretation of the figures. There will clearly be interest from the Committee and others in updated versions of this table as the levelling up agenda progresses and other funds are allocated.

We see a case for developing a more publicly accessible version that could sit on gov.uk, rather than a future update again appearing as an annex to a committee letter. This separate publication could then include the additional information required to interpret the data in a more informed manner, and could be developed to improve its utility for understanding the outcome of bids for different levelling up funding streams, as well as to provide clarity on the periodicity and sources of funding. It should be published in an accessible form, in line with the Code of Practice for Statistics.

We corresponded with the Department for Levelling Up, Housing and Communities (DLUHC) from August 2021 to November 2021, asking for greater transparency around data on the Levelling Up Fund and the related ‘prioritisation of places model’. This led to a positive outcome and a commitment from DLUHC to make the data that inform key decisions available. When the next round of levelling up funding allocations is announced, we expect to see published data supported by a methodology and links to the source data. This should allow users to recreate the funding model for the allocation of priority areas and enhance public confidence in the decisions that are being made.

DLUHC has set up the Office for Local Government (Oflog), which is tasked with bringing together data to understand local government performance in relation to value for money and publishing its conclusions in an annual report. This information could be used to monitor local authorities’ performance against aspects of levelling up objectives based on the funding that they receive, and may add to the case for developing a published levelling up funding series. We also note that the Levelling Up White Paper states that “devolved governments are best placed to deliver certain services like health and education”; as a result, at some point there may be a need to publish transparent information on the performance of these services in relation to the funds allocated to them.

 

We understand that the Office for National Statistics (ONS) is receiving funding from DLUHC to develop subnational data and has started publishing its subnational data explorer. The data explorer provides helpful explanations and includes a ‘data dictionary’ that accompanies the supporting dataset. We think the ONS’s approach to publishing subnational data could serve as a good example for other departments looking to publish their own data relating to levelling up policies in a transparent and accessible way.

Please let me know if you have any questions or if I can support the Committee further in its inquiry.

Yours sincerely,

Ed Humpherson 

Director General for Regulation

Office for National Statistics correspondence with the Public Accounts Committee on use of evaluation and modelling in government

Dear Dame Meg,

I wanted to write in my role as Head of the Government Analysis Function regarding your Committee’s report and the subsequent Government response on Use of evaluation and modelling in government. For recommendation 5b, we gave a target implementation date of summer 2022. This was in error: it should have said summer 2023. We have also revised this target implementation date in the latest progress update due to your Committee.

Please do let me know if you have any questions.

Yours sincerely,
Professor Sir Ian Diamond

Office for National Statistics written evidence to the DCMS Sub-committee on Online Harms and Disinformation’s inquiry on misinformation and trusted voices

Dear Mr Knight, 

I write in response to the Digital, Culture, Media and Sport Sub-committee on Online Harms and Disinformation’s call for evidence for its inquiry, ‘misinformation and trusted voices’.  

As the Committee may be aware, the Office for National Statistics (ONS) is the UK’s National Statistical Institute and largest producer of official statistics. We place an enormous value on being a trusted source of information. We work tirelessly to ensure that our engagement with the public is not only trusted, but actively combats misinformation by making our communications clear, and providing statistics with context.  

The Office for Statistics Regulation complements our work. It assesses official statistics against the Code of Practice for Statistics, assigning them National Statistics status if they meet the requirements. This makes the public aware of which statistics can be trusted.  

This submission goes into further detail on these points. I hope this is useful, and please do let me know if we can provide further evidence or discuss directly with the Committee.  

Yours sincerely, 

Sam Beckett 

Second Permanent Secretary and Deputy Chief Executive of the UK Statistics Authority

Office for National Statistics written evidence ‘misinformation and trusted voices’, September 2022  

Summary 

  1. It is of the utmost importance that the Office for National Statistics (ONS) is regarded as a trusted voice in the UK. The growing use of data in public debate in recent years has emphasised the need for official statistics and analysis that can be relied upon. As the UK’s National Statistical Institute, our role is to provide trusted and accurate data.  
  2. According to the Public Confidence in Official Statistics (PCOS) 2021 report, the ONS had high levels of trust from respondents (89%), rising to 97% among frequent users of our statistics. The public trust us with their data (90%) and trust our statistical outputs (87%). Compared with other institutions in public life, including Government and the media, we have the highest levels of trust.  
  3. With this trust comes a great responsibility: first, to ensure that we are meeting the data needs of the public, identifying and responding to any data gaps rapidly; and second, that our statistics are communicated well, to reduce the risk of misinterpretation.  
  4. On the latter point, we do this by proactively engaging with the public directly and through the media to improve the clarity and messaging of analysis, building our own trusted social media presence, and using innovative engagement methods to increase the public’s use of our statistics, as highlighted during the census and recently with our Personal Inflation Calculator. 
  5. Misinformation can still occasionally occur. We actively monitor media channels and respond rapidly to provide clarity where it is needed, for example during the coronavirus (COVID-19) pandemic regarding the number of COVID-19 deaths. We also have strong relationships with media outlets to ensure corrections happen swiftly.  

Trusted data sources and institutions 

  1. As the UK’s National Statistical Institute, our outputs are regularly assessed by the Office for Statistics Regulation (OSR) against the Code of Practice for Statistics. If fully compliant, they are accredited as National Statistics, with the quality mark on all associated releases on our website to reassure users that they can be trusted.  

Public Confidence in Official Statistics Survey

  1. Since 2004, the UK Statistics Authority has commissioned research from the National Centre for Social Research (NatCen) on levels of trust in, and awareness and use of, official statistics in Britain. The latest results of this research were published in April 2022, in the Public Confidence in Official Statistics (PCOS) 2021 report. 
  2. Respondents reported high levels of trust (89%) in the ONS. Trust is high regardless of whether people were previously aware of the ONS or not. However, those who have used official statistics are more likely to trust the ONS than those who have not used them, with 84% of non-users saying they trusted the ONS compared with 97% of frequent users of ONS statistics.  
  3. In 2021, 87% of people surveyed said they trusted statistics produced by the ONS. 90% also agreed that they trusted the ONS with the data they provided, and that it would be kept confidential.  
  4. PCOS also asked respondents about their level of trust in the ONS compared to other institutions in British public life. Of the institutions listed on the survey, the ONS has the highest levels of trust, similar to that of the Bank of England and the courts system. Figure 1 shows a comparison of levels of trust in different institutions, as reported in 2018 and 2021.  

Figure 1: Proportion of people that trust different institutions in British public life 

Source: Public Confidence in Official Statistics 2021, National Centre for Social Research 

Responding to data gaps 

  1. In line with the Authority’s five-year strategy ‘Statistics for the Public Good’ (launched in 2020), we are radical and ambitious in providing analysis in a timely way. For example, during the pandemic, we set up and adapted surveys at pace to inform policy decisions and the public: the Business Impacts of Coronavirus Survey (BICS) (now known as the ‘Business Insights and Conditions Survey’), the Opinions and Lifestyle Survey (OPN) and the COVID-19 Infection Survey (CIS). These assessed the impacts on the economy, businesses, society and the UK’s health. More recently, we have set up new surveys to assess over-50s in the labour market, and the experiences of Ukrainian nationals arriving in the UK.  
  2. Our agility to respond to emerging demands for evidence means we go some way in avoiding speculation, and therefore potential misinformation, on an issue.  

Building trust and tackling misinformation 

Proactive Communication – Media and Public Engagement  

  1. To avoid misinterpretation, each statistical publication from the ONS is expressed in clear, concise language with summarised findings. Contextual background and commentary are provided when they support wider public understanding of the data and its significance. In recent years, we have focused our efforts on ensuring key findings are reported accurately and lead the coverage, with an emphasis on the use of trusted ONS spokespeople. 
  2. We build relations with media producers, editors, lead reporters and subject matter experts in targeted media outlets to encourage clear reporting of our statistics and analysis. This also provides a direct channel back for media to confirm details for immediate deadlines.  
  3. The ONS identifies areas where insight could be misrepresented or misunderstood and mitigates this through our presentation, for example by creating reusable and shareable social media posts and content. 
  4. The direct contact details of the responsible statisticians are provided with each release so that any member of the public can contact them directly for guidance. Since the start of the COVID-19 pandemic, the ONS has answered more than 12,000 different queries from the media. 
  5. While the majority of audiences will engage with ONS content through media channels, we also raise the ONS profile directly with public audiences. Our main social media presence is the @ONS Twitter account, with 343,000 followers, which achieves good engagement and reach compared with similar accounts, with threads created to support outputs and to respond to specific trends on social media.  
  6. We have created a network of statisticians who engage in dialogue on particular issues, themes and releases on social channels, providing clarity where discussions take place. In recent months, we have also trialled material and commentary on LinkedIn to better understand opportunities to reach business audiences.  
  7. We seek creative opportunities to increase the public’s engagement in ONS statistics and analysis as the trusted source, including personalised tools and data visualisations. For example, the ONS Personal Inflation Calculator, a collaboration between the ONS and the BBC, enables individuals to see how increases in the cost of living have affected them. This has resulted in ONS data being more accessible, as well as extending our reach to new audiences. For the first results of the 2021 Census of England and Wales, we developed interactive articles and a game to encourage individuals to actively engage with our data and to discover what the results meant for the population of their local area. 

Reactive Communication – Challenging Disinformation 

  1. In response to the challenge of disinformation and misinformation on social media, we set up an online monitoring and reporting capability. This proved particularly effective during the 2021 Census, when we monitored social channels to identify misinformation and disinformation and worked directly with social media companies to remove content and accounts.  
  2. Where inaccuracies or misrepresentations of ONS statistics are spotted in the media, we seek to challenge them immediately wherever possible. News organisations are typically highly cooperative in amending their online articles or publishing corrections of ONS statistics and analysis. There have been no significant instances of a ‘mainstream’, regulated UK news organisation refusing to engage with us when factual inaccuracy has been drawn to its attention.  
  3. We are frequently consulted by both the Office for Statistics Regulation and fact-checking organisations outside of government when others use our statistics. Where false and misleading impressions of ONS statistics arise, we will rebut them: for example, in January 2022 we published a media statement and article explaining why some claims regarding the number of COVID-19 deaths were highly misleading. The rebuttal was itself reported in the news media and attracted wide engagement on social channels, illustrating the impact we can have.  

 

Office for National Statistics written evidence to the Work and Pensions Committee’s inquiry on plans for jobs and employment support

Dear Sir Stephen,

I write in response to the Work and Pensions Committee’s call for evidence for its inquiry, ‘Plan for Jobs and Employment Support’.

This inquiry is of particular interest and relevance to the Office for National Statistics (ONS) as we are responsible for producing employment and labour market statistics and analysis for the UK.

Within this submission, we have provided analysis on the impact of the pandemic on the labour market, looking specifically at young people, people with disabilities and people who have migrated to the UK.

We have included the latest data on earnings and on vacancies, noting the sectors in which vacancies are concentrated. Finally, we have also provided analysis of inactivity in the labour market, with a focus on the inactivity of people aged over 50 and on inactivity as a result of long COVID.

I hope this submission is useful for your inquiry. Please do not hesitate to let me know if we can provide anything further.

Yours sincerely,

Mike Keoghan

Deputy National Statistician for Economic, Social and Environmental Statistics

Impact of the pandemic on the labour market

  1. Following a fall in employment at the start of the coronavirus (COVID-19) pandemic, the UK labour market has become increasingly tight, with the employment rate now close to its pre-pandemic level and nearly half a million more vacancies than pre-pandemic, albeit falling slightly from recent record highs.
  2. The unemployment rate is one of the lowest we have seen in the last fifty years; however, inactivity remains higher than before the pandemic. The driving labour supply trend has been the increase in the number of economically inactive people since the start of the COVID-19 pandemic, particularly among those aged 50 and over.
  3. During the first year of the pandemic, increases in inactivity were largely driven by younger workers entering education, but more recent increases are driven by those aged 50 to 64, with over 60% of the increase in economic inactivity during the pandemic attributable to this age group. Another defining trait of the labour market in the initial phase of the pandemic was the decrease in the number of self-employed workers, which was partly driven by workers flowing out of self-employment and into employee status while doing the same job (being ‘reclassified’). Though levels of ‘reclassification’ have since stabilised at more normal levels, we have yet to see much of a reversal back towards self-employment.
  4. The ONS Business Insights and Conditions Survey (BICS) shows that in August 2022, 15% of businesses were experiencing a shortage of workers, although that proportion was over 40% among businesses in the ‘Accommodation and food service activities’ and the ‘Human health and social work activities’ industries.

Young people

  1. Early in the pandemic, the unemployment rate of young people (aged 16 to 24) increased the most compared with other age groups, before beginning to decline from Quarter 2 (April to June) 2021 onwards. The unemployment rate among young people is now lower than in December to February 2020, before the pandemic began.
  2. However, the Labour Force Survey (LFS) suggests that the reduction in unemployment among young people reflects movements into inactivity rather than employment. Comparing the latest period (May to July 2022) with the pre-pandemic period (December to February 2020), there were 86,000 fewer people aged 16 to 24 in employment, and 141,000 fewer in unemployment. In contrast, the number of those economically inactive increased by 178,000 over that period.
  • The rate of inactivity among people aged 16 to 17 increased by 2.1 percentage points, to 70.8%, as of May to July 2022 compared with the pre-pandemic period (December to February 2020). The rate of inactivity among people aged 18 to 24 also increased, by 2.6 percentage points over the same period, to 31.4%.
  • The increase of 178,000 people economically inactive among those aged 16 to 24, compared with pre-pandemic levels, makes up 27.8% of the total UK rise in inactivity levels (which was 642,000, as of May to July 2022).
  • The increase was split almost evenly between men and women aged 16 to 24. Of the total 178,000 increase in the number of people economically inactive, 52% are men and 48% are women.
  3. Median pay among those aged 18 to 24 increased by 12.4% between February 2020 and August 2022, less than for older age groups; median pay across all age groups increased by 13.7%.

People with disabilities

  1. Comparing April to June 2022 with the same period three years earlier, the LFS shows a 0.4 percentage point increase in the employment rate both for disabled people who meet the Government Statistical Service (GSS) harmonised standard definition of disability (rising to 53%) and for those who report having a health problem but do not meet the GSS standard definition of disability (rising to 81.9%).
  2. Likewise, there has been a 0.5 percentage point decrease in the unemployment rate (as a percentage of the economically active) and a 0.2 percentage point decrease in the economic inactivity rate among those who meet the GSS standard definition of disability (falling to 6.7% and 43.1% respectively). Among those who report having a health problem but do not meet the GSS standard definition of disability, there was a 0.3 percentage point fall in the unemployment rate and a 0.2 percentage point fall in the economic inactivity rate (falling to 3.1% and 15.5% respectively).
  3. Pay for disabled employees remains behind that of non-disabled employees. The disability pay gap, the gap between median pay for disabled employees and non-disabled employees, was 13.8% in 2021. This gap has widened slightly since 2014, when disabled employees earned 11.7% less than non-disabled employees.

Migrants

  1. The fall in employment seen since 2016 has been largely driven by UK nationals, as shown in the Changing Trends and Recent Shortages in the Labour Market publication. In the 12 months to September 2020, the number of EU workers had increased by 119,000 when compared with the same period in 2016. However, between the year ending September 2020 and the year ending September 2021 there was a fall of 91,000 EU workers, suggesting a possible pandemic effect.
  2. Payrolled employment counts from HMRC showed the same signal of a fall in EU workers; indeed, the magnitude was higher on this measure, though over a longer comparison period. Between June 2019 and June 2021, payrolled employments held by EU nationals fell by 6% (171,100). This was offset by payrolled employments held by non-EU nationals, which increased by 9% (186,300) over the same period. There is considerable variation at industry level, meaning changes in the makeup of migration could be affecting some industries more than others.
  3. In the same period, the largest decline in total payrolled employments was seen in Accommodation and food services; this was driven by a 25% (98,400) fall in payrolled employments of EU nationals during the two years up to June 2021. There were also large falls in EU employments in Agriculture, forestry and fishing and in Arts, entertainment and recreation. These sectors also saw falls in non-EU employments. Indeed, the three sectors that saw the largest growth in EU and non-EU employments were the same: Construction, Transportation and storage, and Health and social work.
  4. Looking at why businesses are experiencing vacancies, the Business Insights and Impact on the UK Economy Survey shows that a year ago (23 August to 5 September 2021) a quarter of businesses experiencing recruitment difficulties cited reduced numbers of EU applicants. This has gradually declined to 12% (as of June 2022) as EU migrants have returned to the workforce.
  5. The Migration Observatory (based at Oxford University) published research on 15 August called “How is the End of Free Movement Affecting the Low-wage Labour Force in the UK?”. Their analysis looked at many of the same data sources, reaching similar conclusions to ONS research and emphasising that the industries that have driven the increase in non-EU citizen employment are not the same ones that drove the decrease in EU citizen employment. They concluded that there is some evidence that EU Exit has contributed to shortages in the UK labour market, although it is by no means the only driver of recruitment difficulties, and some other countries have experienced similar problems without a major change in immigration policy. ONS analysis on specific industries where employment has reduced (including HGV drivers) showed that there has been a reduction in the number of EU nationals, but the impact of this was smaller than the reduction in UK nationals.
  6. The latest estimates of international migration levels produced by the ONS are experimental and provisional. They are based on administrative data and supported by statistical modelling. There is a degree of uncertainty around them that we are unable to quantify at this time.
  7. These latest statistics are produced using a new method that relies less on International Passenger Survey data, which we have long acknowledged has been stretched beyond its original purpose, and makes greater use of administrative data and statistical modelling.
  8. Observed migration activity, from early insights of provisional census results, provides some confidence that these estimates derived from administrative data sources are more accurate than those derived using previous methods. Additionally, using these data and methods produces estimates that are more comparable with the latest Home Office statistics on the operation of the immigration system.

Vacancies

  1. In June to August 2022 there were an estimated 1.266 million job vacancies in the UK, after a fall of 34,000 (2.6%) compared with the previous quarter. Despite the falls seen in recent months, the number of vacancies remains 470,000 above the level seen in January to March 2020.
  2. All industries are above their pre-pandemic levels, with the largest increases seen in Accommodation and food service activities and in Human health and social work activities, both of which have increased by 83,000 since January to March 2020.
  3. As at the UK level, however, many industries have begun to see falls in the most recent data, with 12 of the 18 industry sections seeing falls in June to August 2022 when compared with the previous quarter. The largest fall was seen in Information and communication, which fell by 11,000 (14.0%).
  4. Between 2001, when comparable vacancy statistics were first produced, and the start of 2022, our data had never shown more vacancies than unemployed people. Of the five 3-month periods between January to March 2022 and May to July 2022, the number of vacancies was higher than the number of unemployed people on four occasions, highlighting a historically tight labour market.

Sectoral/Industry Trends

  1. According to our Business Insights and Conditions Survey (BICS), the accommodation and food service activities industry continues to report the largest percentage of businesses experiencing worker shortages. Businesses in this industry are also the most likely to report recruitment difficulties.
  2. Estimates from BICS show that in mid-August 2022, 15% of businesses reported that they were experiencing a shortage of workers, up from a low of 13% reported in mid-January 2022. However, for businesses with 250 or more employees, the percentage that reported worker shortages in mid-August 2022 was 42%.
  3. In mid-August, the accommodation and food service activities industry continued to report the largest percentage of businesses experiencing worker shortages, at 42%.
  4. In the accommodation and food service industry, 61% of businesses reporting worker shortages said that ‘employees working increased hours’ in mid-August, while 31% ‘had to recruit temporary workers’ or were ‘unable to meet demands’. In construction, 75% of businesses reporting shortages of workers reported that they were ‘unable to meet demands’ in mid-August, while 27% reported ‘employees working increased hours’.
  5. Since March 2022, when the question was first introduced, more than 1 in 10 businesses have reported experiencing difficulties in recruiting employees, rising to more than 4 in 10 among larger firms (with 50 or more employees). Accommodation and food service activities (39%), Human health and social work activities (35%), and Real estate activities (24%) had the highest percentages of businesses reporting difficulties in recruiting employees in July 2022.
  6. In June 2022, across all businesses that had not permanently stopped trading and had difficulty recruiting employees, 47% reported difficulty recruiting skilled, manual or technical employees, followed by 36% reporting difficulty recruiting semi-skilled or unskilled employees. A lack of qualified applicants for the roles on offer and a low number of applications for the roles on offer were the most common reasons given (each cited by approximately half of businesses) for difficulties in recruiting employees.
  7. Since early June 2022, the arts, entertainment and recreation industry and the information and communication industry have continued to report among the smallest percentages of businesses experiencing worker shortages, with 4% of businesses in each industry reporting worker shortages in mid-August.

Inactivity in the labour market

  1. The rise in economic inactivity since the start of the pandemic has been the main driver of the reduction in the UK’s labour supply.
  2. Overall, total employment levels were 327,000 below pre-pandemic levels as of May to July 2022. There were 642,000 more people aged 16 to 64 years inactive, with three-fifths of those (60%, or 386,000) aged 50 to 64 years.
  3. The share of the economically inactive population who want a job has also been declining. Of those aged 16 to 64 years and inactive, 19.2% wanted a job as of May to July 2022, compared with 22.1% before the start of the pandemic (December to February 2020).
  4. Long-term sickness is currently the main reason for inactivity among those aged 16 to 64 (27.3%), followed by studying (26.3%), looking after family or home (19.1%) and retirement (13.3%). However, increases in long-term sickness predate the pandemic, having started in 2018.

Over-50s focus

  1. There are 521,000 more people aged 16 to 64 years who are economically inactive, with almost two-thirds of those (64%, or 334,000) aged 50 to 64 years.
  2. In the period 8 to 13 February 2022, the ONS conducted the Over 50s Lifestyle Study, looking at adults aged 50 to 70 years in Great Britain who had left work or lost their job during the COVID-19 pandemic (from March 2020 onwards) and had not returned to work at the time of the survey. This included asking why they left and whether or not they intended to return.
  3. The most frequent reasons given for leaving work were to retire (47%); because of the COVID-19 pandemic (15%); because of illness or disability (13%) and because they did not want to work anymore (11%).
  4. Nearly 6 in 10 said they would not consider going back to paid work in the future. While the majority of these (79%) said nothing would encourage them back to work, 10% said they would be encouraged to return if they were able to work from home, 9% said they would be encouraged by flexible working hours, and 4% said they would be encouraged if the job fitted around caring responsibilities.
  5. Nearly 4 in 10 said they would consider returning to paid work in the future, with over half (54%) saying it would be for the social company or for a job they would enjoy, around half (52%) saying they would be encouraged by the money, and under half (45%) saying they would be encouraged to return for a job that suited their skills and experience.
  6. The ONS plans to publish new insights later this month into the reasons and factors for older workers leaving the labour market, and what may encourage them to return. The ONS is happy to send a copy of the new insights to the Committee once published.

Long COVID

  1. Data suggests that some of the increased inactivity could be due to long COVID. In July 2022, 1.8 million people reported suffering from long COVID in the UK, with 369,000 “limited a lot” by their symptoms.
  2. The impact of long COVID is felt unequally. As a proportion of the UK population, the prevalence of self-reported long COVID was greatest in people aged 35 to 69 years, females, people living in more deprived areas, those working in social care, those aged 16 years or over who were not students or retired and who were not in or looking for paid work, and those with another activity-limiting health condition or disability.
  3. Research on the prevalence of long COVID shows that the employment status with the highest prevalence across the UK is the ‘inactive and not looking for work’ group; with 6.43% of that population estimated to be living with self-reported long COVID of any duration. This compares with 3.81% among those employed, and 3.41% among those unemployed.
  4. The ONS is currently working on an analysis of the association between SARS-CoV-2 infection / long COVID and changes in employment status (working or not working) using data from the Coronavirus Infection Survey. However, it is not possible to identify long-term sickness absence from the survey data. We anticipate being able to publish results from this work later this year and will share these with the Committee.

Earnings

  1. After falling at the start of the pandemic, partly as a result of furlough, growth in total earnings has remained strong in recent months, with particularly high bonus payments, especially in March 2022. However, even with bonuses included, pay has struggled to keep pace with inflation. Excluding bonus payments, regular pay growth, though also strong (if more subdued than total pay), has seen record falls in real terms, that is, when adjusted for inflation (an illustrative real-terms calculation follows this list).
  2. Growth in average total pay (including bonuses) was 5.5% and growth in regular pay (excluding bonuses) was 5.2% among employees in May to July 2022.
  3. Total and regular pay fell in real terms (adjusted for inflation) on the year in May to July 2022, by 2.6% for total pay and 2.8% for regular pay; this is slightly smaller than the record fall we saw last month (3.0%), but still remains among the largest falls since comparable records began in 2001.
  4. Average regular pay growth for the private sector was 6.0% in May to July 2022, and 2.0% for the public sector; outside of the height of the pandemic period, this is the largest difference we have seen between the private sector and public sector.
  5. The wholesaling, retailing, hotels and restaurants sector saw the largest regular pay growth rate, at 7.0%, followed by the finance and business services sector at 5.9% and the construction sector at 5.4%.
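
As an illustration of how the real-terms figures above relate to nominal pay growth, real growth compounds nominal growth against inflation rather than simply subtracting one rate from the other. This is a minimal sketch with a hypothetical inflation figure, not the precise CPIH-based calculation the ONS publishes:

```python
# Illustrative only: converting nominal pay growth to real pay growth.
# The inflation figure is hypothetical, chosen for illustration.
nominal_growth = 0.052  # e.g. 5.2% regular pay growth (May to July 2022)
inflation = 0.082       # hypothetical inflation over the same period

# Real growth compounds the two rates rather than subtracting them
real_growth = (1 + nominal_growth) / (1 + inflation) - 1
print(f"Real pay growth: {real_growth:.1%}")  # approximately -2.8%
```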

 

Office for Statistics Regulation written evidence to the DCMS Sub-committee on Online Harms and Disinformation inquiry on misinformation and trusted voices

Dear Mr Knight, 

I write in response to the inquiry Misinformation and trusted voices, as conducted by the DCMS Sub-committee on Online Harms and Disinformation.  

Which organisations are the most trusted sources of information in the UK?  

The Office for Statistics Regulation is the independent regulatory arm of the UK Statistics Authority and provides independent regulation of all official statistics produced in the UK. It aims to enhance public confidence in the trustworthiness, quality and value of statistics produced by government through setting, and assessing compliance with, the Code of Practice for Statistics. In addition, one of our key roles is to use our voice to stand up for statistics and to represent the public, monitoring and reporting publicly where we have concerns about the dissemination and use of statistics and highlighting good practice. 

The Code of Practice for Statistics has three pillars: Trustworthiness, Quality and Value. The three pillars work together to provide the conditions to support public confidence in statistics, which relates directly to the question the Committee is asking. In particular, we distinguish trust – a belief on the part of individuals – from trustworthiness – a property of organisations. Trustworthiness is about providing evidence that the systems, processes and governance surrounding statistics are effective. However, we never consider trustworthiness in isolation. We consider all three pillars to determine whether statistics are fully compliant with the Code of Practice and can be designated as National Statistics. This designation demonstrates to users that they can have confidence in the relevant official statistics. 

One source that can give some insight into levels of trust in official statistics is the 2021 study of public confidence in official statistics. It found that, amongst people who responded, there was high confidence in the statistical system. While respondents did not necessarily know about the Authority or the OSR, there was strong support for our role, with 96% of respondents agreeing there should be an independent body to speak out against the misuse of statistics and 94% agreeing that such a body should ensure that statistics are produced free from political interference. Regarding the Office for National Statistics (ONS), the largest producer of official statistics in the UK, 87% of respondents reported that they trusted ONS statistics. The public value of statistics has also been shown by the 92% of respondents who had used COVID-19 data and reported finding them useful. Although this is only one source, and we are careful not to place too much weight on a single survey result, we do consider that this provides some reassurance around public confidence in official statistics. 

In addition to official statistics producers, there is a wider ecosystem of statistics and data. Many of these other sources of statistics and data inform policy and public debate and it is important that they are used for the public good. We encourage producers outside of the official statistics producer community to apply the Code of Practice for Statistics on a voluntary basis. Our annual award for Statistical Excellence in Trustworthiness, Quality and Value recognises those who voluntarily apply the core pillars of the Code of Practice for Statistics. 

Is the provision of authoritative information responsive enough to meet the challenge of misinformation that is spread on social media? 

Our view is that the best way to combat misinformation is to ensure that information that is trustworthy, high quality and high value is made available to the public. In this way, the good information can drive out the bad. 

However, we recognise that it is hard to live up to this ideal. The experience of the pandemic is instructive. As we noted in our recent State of the Statistical System report, there are a variety of organisations and individuals commenting on the use of statistics by government. The COVID-19 pandemic in particular was associated with an increase in the role of citizens as ‘armchair epidemiologists’. We wrote a blog highlighting how open data enabled great work to be done to communicate data on COVID-19 publicly from outside the official statistics system, including on social media. This demonstrated, at its best, the changing statistical landscape of increased commentary around official statistics. 

Since the pandemic there has continued to be an increased interest in and scrutiny of statistics. This is a positive for the statistics system but also brings risk. Much discussion of statistics takes place on social media with increased risks around misuse, misinterpretation and ‘echo chambers’. Official statistics producers need to be aware of these changes in the use of statistics. 

Areas that we highlight in our report that can help official statistics producers meet the challenge of misinformation that is spread on social media include: 

  • improving how uncertainty in statistics is communicated to bring effective insight; 
  • an increase in government statisticians challenging the inappropriate use of statistics and engaging directly with users to support understanding of statistics; and  
  • intelligent transparency around statistics, data and wider analysis. 

 

Intelligent transparency means proactively taking an open, clear and accessible approach to the release and use of data, statistics and wider analysis. As set out in our regulatory guidance on transparency, intelligent transparency is informed by three core principles: equality of access, enhancing understanding and analytical leadership. It is about more than just getting the data out there. Intelligent transparency is about thinking about transparency from the outset of policy development, getting data and statistics out at the right time to support thinking and decisions on an issue, supporting the wider public need for information and presenting the data and statistics in a way that aids understanding and prevents misinterpretation. 

In conclusion, a constant refrain of the OSR is that it is important to ensure that the bad data does not drive out the good. However, as long as producers have the right approach, based on trustworthiness, quality and value, good statistics can thrive. 

Please let me know if you have any questions or if I can support the Committee further in its inquiry. 

Yours sincerely  

Ed Humpherson  

Director General for Regulation 

Office for National Statistics written evidence to the Public Administration and Constitutional Affairs Committee’s inquiry on the Civil Service People Survey

Dear Mr Wragg,

I write in response to the Committee’s call for evidence on the Civil Service People Survey. We have focused our evidence on questions in the Terms of Reference regarding survey design, delivery and validity of results, from the perspective of our role as the Office for National Statistics in administering high-quality national surveys including the census.

Survey design

Anonymity

Staff surveys, and surveys in general, can adopt different strategies to protect respondents’ privacy. These can range from anonymising responses by removing any information that connects the survey to the respondents, to ensuring that analysis derived from the survey does not lead to disclosing identity.

Full anonymisation can limit analysis. For example, if different age groups have a different experience of working in an organisation, this would not be highlighted if the age field was removed to protect privacy. Therefore, best practice is to seek a compromise by using a range of measures to protect privacy. These include:

  • Deidentification removes fields that are highly likely to identify an individual, such as their name and address, and keeps fields, such as age, that do not directly relate to one person. Grouping answers limits direct identification further, for example using age ranges rather than dates of birth, or referring to branches rather than teams. All these approaches group small numbers of people together to limit the identification of one person while maximising the benefit of the survey.
  • Use of identifiers rather than names provides additional protection. Only very few people would have the technical ability and access rights to link respondents to responses, and if there were a breach, the person responsible would be easily identified because they would leave a digital footprint.
  • Open rather than targeted invitation provides respondents with more control over their responses. An open invitation to the target population allows a respondent to input all their information without it connecting back to a database. A targeted invitation provides a respondent with a code that connects them, and only them, to the sampling frame, but could come at the expense of privacy if other measures are not taken, and could affect responses and the response rate if people are concerned that they can be identified.
  • Segregation of duties enables a significant reduction in the number of people who have access to identifiable information. Analysts are not granted access to personal identifiers and users of analysis are only granted access to aggregated data.
  • Statistical Disclosure Control is a process by which analytical outputs are checked to ensure that they cannot lead to the reidentification of individuals. There are a number of methods that can be used, including suppressing small numbers and swapping cells in ways that keep the headline summary correct (a minimal illustration follows this list). We would not recommend completing cross-sectional analysis when there are low numbers in a category, as this might enable identification, especially when it is possible to link to other information in the public domain.
  • Summarising and controlling access to free text are important to ensuring that respondents who provide information that can be used to identify themselves or others are protected. This is particularly important when respondents use the survey as an opportunity to raise issues which require careful handling such as safeguarding. It is best practice to have a safeguarding policy that provides clear guidance and oversight as to when privacy should be breached to protect individuals.
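
As a minimal illustration of the small-number suppression idea described above (a sketch with invented figures, not the People Survey’s actual disclosure control method), the following snippet masks any cell in a breakdown table that falls below a chosen threshold:

```python
import pandas as pd

SUPPRESSION_THRESHOLD = 5  # hypothetical minimum publishable cell size

def suppress_small_cells(table: pd.DataFrame) -> pd.DataFrame:
    """Mask counts below the threshold so that very small groups of
    respondents cannot be singled out in published outputs."""
    return table.mask(table < SUPPRESSION_THRESHOLD)

# Invented example: response counts by age band and answer option
responses = pd.DataFrame({
    "age_band": ["under 30", "under 30", "30-49", "30-49", "50+", "50+"],
    "answer":   ["agree", "disagree", "agree", "disagree", "agree", "disagree"],
    "count":    [120, 3, 210, 45, 98, 2],
})
table = responses.pivot(index="age_band", columns="answer", values="count")
print(suppress_small_cells(table))  # the two cells below 5 become NaN
```

In practice, suppressed cells would be marked in the published table (for example with ‘[c]’), and headline totals would be computed before suppression so that summary figures remain correct.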

In addition, it is good practice to carry out privacy impact assessments and to make privacy notices and technical guides readily available.

Survey design and delivery

To ensure the best design and delivery of a survey, the following considerations are important:

  • Continuity – staff surveys such as the People Survey are repeated every year so that changes can be tracked and compared over time. To achieve the objectives, the survey needs to be relatively stable, and changes carefully considered and implemented. When a question is discontinued or changed significantly, the time series is ‘broken’, and a new measure is tracked. This is sometimes necessary to ensure the survey remains relevant and useful.
  • Comparability – when a key requirement is the ability to compare performance across the civil service, a key feature of the survey must be consistency: the same survey, with the same questions must be used by all organisations.
  • Comprehension – questions should be pre-tested to ensure that they are understood as intended and that the wording is suitable for all respondents. As far as possible, the survey should use harmonised standards that are available to government departments, or reuse questions that are commonly used; these questions have been tested, and the practice enables comparison with other data.
  • Scope – the topics covered by the survey are varied. To better understand them, questions in a ‘block’ touch on subsets of the topic. The survey designers must consider the length of the survey and the impact that may have on the quality of the responses, participation and respondents following through to the end, thus completing the survey. The usual recommendation for an online survey is that it should be completed in around 20 minutes.
  • Mode of collection – how responses are collected is determined by cost, the speed with which results are needed, participant preference and the influence that modes have on the responses provided. For example, when people complete a survey online, which is the cheapest collection mode, they tend to complete it quickly and may be less reflective compared with an interviewer-led survey, where the interaction between people allows answers to be explained and probed further.
  • Inclusive – the survey ought to be inclusive by design; this refers both to the overarching study design and to the design of the questions themselves and the interfaces that respondents interact with. For example, the online survey should be designed to meet accessibility standards so that it does not limit participation through design. We should be inclusive in the questions we ask, ensuring that the available answer options collect data that represents the population being surveyed. Having multiple modes of collection available increases access to the survey and in turn increases representation in the data.

Who should be involved?

Developing and delivering a survey of the scale of the People Survey is a multidisciplinary task that requires the involvement of many professionals to ensure it delivers on analytical and business objectives.

  • Policy and analysis users – it is essential to involve those who will be using the results, to understand their requirements and to ensure that the data being collected answers their policy questions.
  • Methodologists and Data Architects – the data that underpins the analysis that responds to the policy questions needs to be designed and architected so that it meets data standards and methodological requirements. This step is crucial to ensure that the data collected is fit for purpose, can be used, reused and linked (for example to the data from previous years).
  • Survey designers – as with all surveys, it is crucial to involve questionnaire design experts in the development of the questions and the survey, to ensure it will meet and balance user needs. As part of their professional input, survey designers will review whether questions are clear, appropriate, representative, inclusive and accessible, involving groups across the civil service and asking for their views. They will test questions to ensure that they meet the requirements. We would prioritise cognitive testing to check understanding and interpretation, to mitigate any potential quality issues in the data ahead of going live and so that results can be explained clearly following analysis.
  • Survey developers and user experience designers – whether data is collected online, by an interviewer or using a paper questionnaire, the survey flow and the user interface must be designed and tested to meet industry standards and to ensure that the survey is accessible to everyone. The survey can be sent to the Digital Accessibility Centre (https://digitalaccessibilitycentre.org/) for testing.
  • Procurement – whether the survey is commissioned internally or externally, the specification must be understood and agreed by all parties with subsequent changes governed appropriately. The successful bidder must be able to meet the required standards.
  • Supplier – at the appropriate stage it is essential to build a strong working relationship with the supplier and especially with the technical delivery team. The supplier will be a survey expert with a wealth of experience and should be able to deliver the specified requirements as well as advise on innovation.
  • Communication and dissemination teams – the survey must be promoted by central and local teams to encourage participation. In addition to advertising the survey, the communication can include descriptions of how data will be used, what the benefit of the survey will be and why it is worth taking part. As well as communicating the results, it is necessary to ensure methods and processes are transparent so that people know what to read into them, and importantly what not to read into them. For communication teams to support the survey they must be given all the relevant information from design to analysis.

Relevance of metrics

The information included in the People Survey should be based on the needs of data users and the departments that will use it. As mentioned above, these can be ascertained through consultation with policy users. Comparison over time is always an important aspect of any regular survey, and we would recommend keeping question sets as comparable as possible from year to year, with changes, when needed, following a transparent methodological review. Finally, some terms used within questions may be interpreted differently from department to department; again, this could be improved through consultation.

Periodically the topics covered will change and be impacted by other issues. A good example is the need to monitor the experience of working in the civil service throughout and following the pandemic. When adding or changing a metric, it is important to communicate and explain the changes, especially at the reporting stage.

Some departments may also need to consider organisational changes and how they would like them reported against previous years.

Validity of results

Quality assurance

It would be difficult to quality assure the information provided via the People Survey. There are limited sources against which to cross-check the information, though exit interviews and/or internal departmental staff surveys could be used. One approach could be a quality follow-up survey with a sample of respondents, similar to what we do with the census to quality assure that data.

Non-response bias impact

Non-response bias can have a huge impact. It can distort results, and this is linked with wider issues: typically, people who do not respond have a reason not to engage, and those reasons are particularly interesting for those collecting the data, yet remain unseen.

It also means that the data will not be representative and, as a result, any policy changes might not address the real issues. Methodological solutions include weighting and imputation, and require comparing the population of respondents with the population of civil servants using the data that is available through HR departments. For example, if fewer people aged under 30 respond to the survey, the responses of those who have replied could be given a bigger weight (a minimal sketch of this reweighting follows). Any weighting strategy would need to be transparent and carefully considered, testing the assumption that the people who have responded do indeed represent those who have not.
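
As a minimal sketch of the reweighting idea just described (with invented numbers and age bands, not actual People Survey data or methodology), the following snippet upweights an under-represented age group so that the weighted results reflect the known workforce composition:

```python
import pandas as pd

# Invented workforce composition, e.g. taken from HR records
population_share = pd.Series({"under 30": 0.30, "30-49": 0.45, "50+": 0.25})

# Invented respondent data: under-30s are under-represented (15% vs 30%)
respondents = pd.DataFrame({
    "age_band": ["under 30"] * 15 + ["30-49"] * 50 + ["50+"] * 35,
    "engaged":  [1] * 10 + [0] * 5 + [1] * 30 + [0] * 20 + [1] * 20 + [0] * 15,
})

# Post-stratification weight: population share divided by sample share
sample_share = respondents["age_band"].value_counts(normalize=True)
weights = respondents["age_band"].map(population_share / sample_share)

unweighted = respondents["engaged"].mean()
weighted = (respondents["engaged"] * weights).sum() / weights.sum()
print(f"Unweighted engagement rate: {unweighted:.1%}")  # 60.0%
print(f"Weighted engagement rate:   {weighted:.1%}")    # 61.3%
```

Here each under-30 respondent counts double (0.30 / 0.15 = 2.0), shifting the estimate towards what the full workforce might have reported, on the assumption that respondents within each age band resemble the non-respondents in that band.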

Survey Delivery

Strengths and weaknesses

Strengths and weaknesses are mostly the result of trade-offs. For example, while the People Survey is relatively long, risking attrition, lower response rate and haste in completion, it does allow for more detailed analysis on many topics.

As discussed in this submission, using a consistent survey across the Civil Service enables efficiency, comparison between organisations, sharing of good practice and analysis over time, while limiting bespoke design on issues that may be of interest to specific departments.

As noted above, while leaving the survey unweighted enables quick access to the results, it does have an impact on how confident we can be that respondents represent the civil service as a whole. The survey is reported as percentages of respondents rather than percentages of the population, and users can break down the results further to compare responses from different groups. This is a pragmatic approach which is clearly communicated.

A mixture of both quantitative and qualitative data collection could improve the quality of the analysis and the usefulness of the survey. The People Survey is quantitative, with a few open questions capturing free text. There are other qualitative measures such as depth interviews and focus group discussions that can be used alongside the People Survey to enhance understanding of the results. These can either be in addition to, or instead of some of the questions in the survey.

Finally, the survey is accompanied by a tool that enables quick analysis and comparisons, disseminated to all participating organisations; this is a strength.

My colleague Sarah Henry, Director of Methodology at the ONS, looks forward to discussing this further with the Committee on 13 September. Please do let us know if you have any questions ahead of then.

Yours sincerely,

Professor Sir Ian Diamond

Office for Statistics Regulation written evidence to the Procedure Committee’s inquiry on correcting the record

Dear Ms Bradley,

I write in response to the Committee’s call for evidence for its inquiry Correcting the record.

The UK Statistics Authority and the Office for Statistics Regulation (OSR), as its regulatory arm, have a responsibility to ensure that official statistics meet the public good. We provide independent regulation of all official statistics produced in the UK, and aim to enhance public confidence in the trustworthiness, quality and value of statistics produced by government. We do this by setting the standards official statistics must meet in the Code of Practice for Statistics. We ensure that producers of official statistics uphold these standards by conducting assessments against the Code. Those which meet the standards are given National Statistics status, indicating that they meet the highest standards of trustworthiness, quality and value.

We also report publicly on system-wide issues and on the way that statistics are being used, celebrating when the standards are upheld and challenging publicly when they are not, and intervening when statistics are either misused publicly or quoted without sources being made clear. Our interventions policy explains how we make these judgements in a considered and proportionate way.

Key to our interventions is the expectation that people practise intelligent transparency. Transparency and clarity support public confidence and trust in statistics and the organisations that produce them, and minimise the risk of misinterpretation of statistics and data. Transparency allows individuals to reach informed decisions and answer important questions, and provides a mechanism for holding governments to account. Statistics and data also underpin the successful implementation of government policies, and shape individuals’ views on the effectiveness of policy decisions.

Intelligent transparency is informed by three principles:

  • Equality of access: Data quoted publicly, for example in parliament or the media, should be made available to all in a transparent way. This includes providing sources and appropriate explanation of context, including strengths and limitations.
  • Understanding: Analytical professions need to work together to provide data which enhances understanding of societal and economic matters, including the impacts of policy. Governments should consider data needs when developing policy and be transparent in sharing analytical and research plans and outputs with the public.
  • Leadership: Organisations need strong analytical leadership, within and beyond analytical professions. Decisions about the publication of statistics and data, such as content and timing, should be independent of political and policy processes. These decisions should be made by analytical leaders, who should also be given freedom to collaborate across organisational boundaries to support statistics that serve the public good. Their expertise and decision-making authority should be endorsed by Permanent Secretaries.

As tools for understanding public policy, statistics and data rightly belong at the heart of Parliamentary debate. They can be a powerful support to an argument. In the pressured environment of an oral debate, it is only natural that some of these references to statistics, though made in good faith, will be misremembered, unclear, or misattributed. In these circumstances, it is always welcome when MPs make the effort to furnish the record with clarifications or additional information about their sources. This not only ensures that the House is fully informed, but also meaningfully improves the quality of media reporting and subsequent public debate.

At other times an MP may quote statistics correctly but confuse data from a private source with that already in the public domain. In particular, Ministers (who under the Ministerial Code are required to be mindful of the Code of Practice for Statistics) have access to a wide range of published and unpublished information from their departments and should take care to rely on the former when making their statements. However, as set out in our guidance for the transparent release and use of statistics and data, when unpublished information is used unexpectedly, statistical officials in Government departments can play their role in setting the record straight by publishing the information as soon as possible in an accessible form, ideally on the same day. This can be done via an ad hoc release, which need not be long or technical. For example, the Department for Work and Pensions has a page dedicated to ad hoc statistical analyses.

Our aim, one that we would hope the Committee agrees with, would be to see intelligent transparency being the default for all statistics and data, including those used by Ministers and parliamentarians.

Please let me know if you have any questions.

Yours sincerely

Ed Humpherson
Director General for Regulation