2. Critical data gaps - Lack of validation sources
A major drawback of discontinuing the census is the difficulty of validating the accuracy and coverage of the linked administrative data. A clear recent example of the need for validation is the case of ONS population estimates for Coventry, where the Office for Statistics Regulation found that the estimates appeared to be inconsistent with, and potentially higher than, local evidence in some smaller cities with large student populations. In essence, ONS overestimated the size of Coventry’s population by 10%, with major but unwarranted implications for housing requirements. This was eventually confirmed by validation against the 2021 Census. While it might be said that better use of administrative data could have prevented the problem by producing better, more regular estimates in the first place, there is a risk that other issues may arise in the proposed new system – issues that could be much harder to resolve without a census for validation. ONS could therefore face more frequent challenges, and thereby reputational damage, if an authoritative system of validation is not in place.
2(a): Magnitude of undercounts
Of particular interest to NSIDAC is uncertainty around the magnitude of the undercounts with respect to people with particular protected characteristics or members of at-risk populations. The Consultation Document (PDF, 1334KB) notes that “the ONS recognizes the importance of independent sources to provide information on the quality and coverage of available data, and this requirement will continue to exist under the proposals set out in this document”. This is especially important if our concerns about lack of trust in administrative data-collecting authorities are warranted (see further below). The risk is that the least trusting, who may also be among the most marginalised, will be undercounted and will become invisible if there is no independent source against which to compare the administrative data.
We are particularly concerned about the implications for inclusivity of data, as it appears that the magnitude of the task confronting ONS is not yet well understood. That is to say, we do not yet know which characteristics administrative datasets most seriously undercount or simply do not count at all. How one designs one’s benchmarking survey, or other validation techniques, must surely depend on the nature and magnitude of the shortfalls that need to be addressed. For example, in order to obtain reliable estimates for some smaller groups, one might need to oversample. However, area-based oversampling may work for one characteristic (such as ethnic group) but not for another characteristic (such as gender reassignment) for which a completely different oversampling strategy might be needed. We cannot therefore be confident that a good independent source for checking the administrative dataset will be up and running by the time the next regular census would be due.
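To make concrete why the choice of oversampling strategy matters, the following minimal sketch (with invented numbers, not ONS methodology or data) illustrates the underlying statistical point: an unweighted estimate from a survey that oversamples a small group is biased towards that group, while applying design weights recovers the population-level figure. The subgroup size and outcome rates here are purely hypothetical.

```python
import random

random.seed(1)

# Hypothetical population: 5% belong to a small subgroup of interest,
# and the outcome rate differs sharply between the two groups.
N = 100_000
population = []
for _ in range(N):
    small = random.random() < 0.05
    y = 1 if random.random() < (0.60 if small else 0.20) else 0
    population.append((small, y))

small_units = [p for p in population if p[0]]
large_units = [p for p in population if not p[0]]

# Oversample: draw 500 units from each stratum, even though the small
# group is only ~5% of the population. Each sampled unit carries a
# design weight equal to (stratum size / stratum sample size).
n_per_stratum = 500
sample = [(p, len(small_units) / n_per_stratum)
          for p in random.sample(small_units, n_per_stratum)]
sample += [(p, len(large_units) / n_per_stratum)
           for p in random.sample(large_units, n_per_stratum)]

# The unweighted sample mean is pulled towards the oversampled small
# group; the design-weighted mean recovers the population rate.
unweighted = sum(p[1] for p, _ in sample) / len(sample)
weighted = sum(w * p[1] for p, w in sample) / sum(w for _, w in sample)
true_rate = sum(y for _, y in population) / N
```

The same logic shows why one oversampling design does not transfer across characteristics: area-based oversampling works only when the group of interest is geographically clustered, which is why a characteristic such as gender reassignment would need a different strategy.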
2(b): Optimism bias
The Consultation Document states that “Ongoing research and development over the coming decade will determine the best independent sources for this, and this is likely to include an intermittent ‘benchmarking’ Survey”. Our concern here is the open-ended nature of the development process with no certainty that a ‘good enough’ and cost-effective solution will be found before a final decision has to be taken on the 2031 Census.
Declining response rates to major surveys such as the Labour Force Survey (LFS) must raise doubts about the validity of any benchmarking survey that lacks a statutory basis. We fear that the ONS may be suffering from what the National Audit Office terms (PDF, 266KB) ‘optimism bias’. As the NAO states: “Our own work shows how over-optimistic plans for delivery or savings, such as the efficiency targets set for hospitals, can be followed by either failure to deliver, lower service quality, or a need for later funding injections. At the root of this problem lies not only poor data on costs and performance, but also inconsistent challenge, both within departments and by the centre of government.” The importance of ONS embracing challenge was also emphasised by the Office for Statistics Regulation (OSR) report on Coventry described above.
Alternative methods have also been suggested to us by ONS, but until the scale and nature of the task has been identified, and the methods described in more detail, it is impossible to evaluate the proposals. The ONS cannot therefore, in our view, be confident at present of successful validation of the proposed system on an appropriate timescale and within budget.