Economic statistics and ONS culture
Economic statistics guide critical decisions by Government, by independent institutions like the Bank of England and the Office for Budget Responsibility, and by private sector companies and individuals.
But measuring the modern economy is difficult. The economy itself changes, new data sources emerge, new technology becomes available, and international standards evolve. Like many other organisations, ONS is also wrestling with issues of recruitment, retention, and the replacement of legacy technology. And while users want the best statistics, some are unhappy when revisions are made to past series when a better statistic is developed.
ONS has a hard job managing all these challenges. And some work in recent years, not least on consumer prices, R&D, and population statistics, has illustrated ONS’s capacity to engage with new data sources, develop new methods, and build modern systems.
That said, it is my view that most of the well-publicised problems with core economic statistics are the consequence of ONS’s own performance. And that performance is affected by certain cultural issues. Three inter-related issues stand out:
- There has been a commendable interest in both new approaches to statistics (including the use of administrative data), and ensuring the relevance of ONS activity to wider political debate. Unfortunately, this has had the (unintended) effect of de-prioritising the less exciting, but nonetheless crucial, task of delivering core economic statistics of sufficient quality to guide decision making. There was an opportunity cost in ONS securing additional funds for flagship programmes, like the Integrated Data Service, and using those funds to deploy scarce human expertise. That opportunity cost took two forms: the constraint on organisational bandwidth, and the restricted funding for core economic production teams. To give just one concrete example, many people reported that ONS should by now be further forward on the development of the new Statistical Business Register, both to extend it to smaller companies and to reclassify those companies in preparation for the UN’s new system of national accounts.
- This interest in the “new” might have been better managed if ONS had a stronger system of planning and budgeting. Sadly, too many people I met described a divergence between what they were asked to do, and the resources provided to do so. Of course, all budgeting rounds are difficult: demands always exceed supply initially. ONS is to be commended for its efforts to develop so-called “horizontal” planning, which sought to assess the cost of all the inputs to, say, GDP or to price statistics, and use these costs to prioritise. Every production area also has a “quality improvement plan”. Yet in the most recent annual round, while such information was fed into the planning process, the end results looked to many in ONS closer to “equal shares of misery”. Very few people can point to substantive reductions in lower priority work. Instead, the four leaders of the principal teams within ONS were invited to make their own decisions on how best to balance their final budgets, despite the fact that each core economic statistic requires collaboration across all four teams.
- This in turn reflects what many reported as a reluctance, at senior levels, to hear and act on difficult news. The organisation had established values: to be radical, ambitious, inclusive, and sustainable. And the experience during the pandemic well illustrates how such values can drive extraordinary behaviour and results. Unfortunately, the lesson learned seems to have been “all things are possible”, without the nuance that this is typically only true when there is an overwhelming emergency, which understandably de-prioritises many other activities. Several people suggested that the list of values was missing “realistic”.
Taken together, these factors left ONS open to a variety of risks, some of which have then crystallised in recent statistical errors.
The interplay of prioritisation and funding decisions, which left production teams stretched yet still reliant on legacy technology, lies behind both of the following failures:
- the problem with trade statistics reflected known concerns about the efficacy of the underlying software configuration; but these concerns, though reported as a quality risk, were not actioned before an error occurred;
- the problem with the Producer Price Index reflected a mismatch between the methods that had been agreed and the code which implemented those methods; the use of an older, less readily documented coding language contributed to the problem not being spotted before the error occurred.
Prioritisation and funding decisions have also affected survey data collection. To avoid repeating the underspend on ONS’s overall budget for 2021-22, a decision was made to allow some parts of ONS to spend at a rate higher than was consistent with their 2022-23 budget allocations. In practice, though, it appears that the favoured areas were all too successful in spending at that higher rate, to the extent that ONS had to make significant in-year reductions to allocations. Those reductions were applied across the organisation, including to social surveys, even though that had not been an “overprogrammed” area. The resulting constraint on the survey field force, enduring into the next year, directly led to the decision to remove the so-called “COVID boost” to the Labour Force Survey. Almost as soon as that boost was removed, the Labour Force Survey failed, with the suspension of the series in October 2023.
A number of other issues, including some in early June, arose from incorrect data being supplied to ONS. Late, and updated, data from one business affected earnings statistics; and errors in data received from the Department for Transport and His Majesty’s Revenue and Customs affected price and trade statistics respectively. ONS choices about priorities and funding constrained the capacity of data collection and statistical production teams, delaying both the identification and the resolution of these issues.
All of these examples help illustrate why – for me – the failures of economic statistics which have occurred are not best thought of as isolated issues, but rather as the almost inevitable consequence of the choices made (and not made) at the top of ONS, over several years. This includes choices about what to prioritise in seeking funds in the 2021 Spending Review.
And those choices in turn reflect a reluctance on the part of some to take at face value the warnings which have been raised, apparently preferring instead to categorise those making the warnings as lacking in accountability. This categorisation seems to me to be without foundation, and it has undoubtedly made life difficult for many senior people working at ONS who are concerned about the quality of population and economic statistics. I am not surprised that so many experienced senior leaders have chosen to leave.
This section has summarised some of my key findings about ONS performance – especially on economic statistics – and culture, since the two are so clearly related. Before explaining how I believe ONS can turn this round, let me first address the other elements of the Terms of Reference.