Response from Sir Robert Chote to Sandesh Gulhane MSP – minimum unit pricing

Dear Dr Gulhane,

Thank you for your letter of 3 July regarding publications about the impact of minimum unit pricing (MUP) for alcohol in Scotland.

You raised concerns about the communication of Public Health Scotland’s (PHS) evaluation of the MUP policy, which concluded with a final report, published in June, that synthesised evidence from a number of studies. You also raised concerns about a Scottish Government press release welcoming that report and an earlier health impact study published in March by authors from PHS, the University of Glasgow and the University of Queensland. We have also looked at an ‘at a glance’ document produced by PHS.

Our remit covers the production and use of official statistics and does not extend to research or policy evaluation. As such, we have not conducted a full investigation of the content or methodology of the PHS reports. Instead, we have focused on how statistical evidence has been communicated and we consider that the findings in the final PHS report are communicated clearly and impartially.

Communication of the PHS evaluation report (published June 2023)

The original version of the Scottish Government press release stated that:

“In their final report of a series, researchers said that ‘robust, independent evaluation’ and the best available, wide-ranging evidence drawing on 40 independent research publications, showed that the MUP has been effective in its main goal of reducing alcohol harm with the reduction in deaths and hospital admissions specific to the timing of MUP implementation”.

This wording might suggest to many readers that most or all of the studies referred to examined the health impact of MUP. But the evaluation report explains that of the 40 papers included, only eight provided evidence on alcohol-related health outcomes. The remaining 32 examined other potential effects of the policy such as on alcohol consumption, social outcomes, compliance by retailers and product prices. Of the eight papers which studied health outcomes, one looked at deaths and hospitalisations and found a beneficial quantitative impact on these outcomes. Based on the other seven papers, the report concluded that there was “no consistent evidence that MUP impacted on other alcohol-related health outcomes such as ambulance callouts, emergency department attendances and prescribing of medication for alcohol dependence”.

Communication of the PHS/Glasgow/Queensland study (published March 2023)

The Scottish Government press release and the PHS ‘at a glance’ document both referred to the results of the PHS/Glasgow/Queensland study. However, information about the level of uncertainty associated with the reduction in hospitalisations and deaths was not included in either output, despite being emphasised in the study. For example, the figures are estimates based on statistical modelling and the reduction in hospital admissions was not found to be statistically significant.

Summarising technical data, especially for a public audience, is challenging. Press releases, factsheets, tweets and other communications require condensed information, but it still serves users best to include caveats about the uncertainty or limitations of statistical evidence. In this case, caveats did not carry through from the final PHS report to the press release and ‘at a glance’ document.

The Office for Statistics Regulation has discussed these issues, and its broader guidance on communicating uncertainty, with PHS and the Scottish Government. It is good to see that, as a result, PHS has updated its at-a-glance summary and the Scottish Government has updated its press release to ensure that the uncertainty around the estimates is more clearly communicated. I am also pleased to report that both have committed to improving the communication of uncertainty in future outputs.

 

Yours sincerely,

Sir Robert Chote
Chair

 

Related links

Letter from Sandesh Gulhane MSP to Sir Robert Chote – minimum unit pricing

Letter from Sandesh Gulhane MSP to Sir Robert Chote – minimum unit pricing

Dear Sir Robert,

I am writing to request a review of the Public Health Scotland report ‘Evaluating the impact of minimum unit pricing for alcohol in Scotland: A synthesis of the evidence’ and the associated publicity and ministerial statements.

It purports to be “the final report from the PHS evaluation of minimum unit pricing for alcohol in Scotland”.

It is likely to be used in Scottish Government decision making on whether to continue with MUP and whether to raise the minimum unit price of alcohol.

However, I am concerned the report and associated publicity and ministerial statements significantly overstate the health impact of MUP, and under-represent the significant uncertainty in the wider body of research and among the scientific community.

I will outline my concerns in turn.

The press release

On 27 June 2023 the Scottish Government distributed the following press release.

It states that the conclusion that MUP “has saved lives, reduced hospital admissions and had a positive impact on health” was drawn from “‘robust, independent evaluation’ and the best-available, wide-ranging evidence drawing on 40 independent research publications”.

It also states: “This follows a study published in March by PHS and University of Glasgow showing MUP reduced alcohol consumption by 3%, deaths directly caused by alcohol consumption by 13.4% and hospital admissions by 4.1%, compared to what would have happened if MUP had not been in place.”

However, this conclusion is not drawn from 40 publications: 32 of the publications referenced in the “final report” are silent on health impacts and focus on other issues such as consumption.

Of the eight publications that do address health impacts, seven of them are inconclusive.

Only one study concluded MUP had reduced deaths — the PHS and University of Glasgow study mentioned in the press release.

This study was led by Grant MA Wyper, public health adviser to PHS.

This “final report” does not “follow the PHS and University of Glasgow study”. It merely restates its findings.

Furthermore, it was not “independent”. It was commissioned by PHS and led by a PHS adviser.

The PHS/Glasgow University report was itself a retread of a report that appeared in The Lancet the previous day.

These reports are presented as two distinct studies in the latest PHS “final report” as Wyper et al (2023a) and Wyper et al (2023b).

This “final report” does not build on the Wyper study. The seven other studies addressing health impacts are inconclusive.

The conclusion that MUP has reduced deaths was not robust, nor drawn from wide-ranging independent evidence. It was drawn from a single PHS report.

Also, the report itself states that the 4.1 per cent reduction in hospitalisations was not statistically significant.

Therefore, the Scottish Government cannot definitively say “MUP reduced…hospital admissions by 4.1%”.

The Lancet study

The assertion that there were 13.4% fewer deaths “compared with what would have been observed in the absence of MUP legislation” understates the uncertainty inherent in statistical modelling.

No statistical model can say definitively what “would have” happened, as The Lancet study acknowledges in its methodology.

Indeed, “would have” becomes “might have” in the discussion section of the study.

“Study outcomes were assessed using a controlled interrupted time series study design, allowing us to determine the difference between outcomes we observed and our best representation of what might have happened under the counterfactual situation that MUP legislation was not enacted in Scotland.”

It concludes that deaths rose faster in England in the absence of MUP and that MUP therefore “averted…an average of 156 deaths” each year in Scotland.

An additional 156 deaths a year would be a significant acceleration of the trend seen in the preceding 20 years, which was generally downward in the first decade and plateaued in the low thousands before the pandemic.

An acceleration of this size was not witnessed in any of the regions of England with a population and demographic profile similar to Scotland’s, for example North West England, which saw a post-pandemic increase of a similar magnitude to Scotland despite the cheaper alcohol.

Criticisms of the Lancet study

Dr Adam Jacobs, Senior Director, Biostatistical Sciences at Premier Research, challenged the methodology behind the 13.4 per cent reduction.

He said: “It is plausible that the MUP policy would bring down deaths and hospitalisations due to alcohol consumption, but I don’t think this paper shows it convincingly.”

Prof Kevin McConway, Emeritus Professor of Applied Statistics, The Open University, rightly took issue with the “causal interpretation” in The Lancet study.

He said: “This is an observational study, and no matter how well other factors are controlled for, it can never prove conclusively that the changes observed in deaths were due to the minimum unit pricing policy. In my view there hasn’t been enough caution given around assuming this relationship is causal…”

“We can’t say that MUP definitely led to a 13.4% reduction in deaths, though that does clearly remain an important possibility…”

“While it’s possible that the deaths or hospitalisations would have decreased enough to be detectable in the follow-up period here of 32 months after MUP, it’s also possible…that they aren’t clearly detectable on that time scale, though (if they really exist) the effect should show up, and indeed be much larger, later. And given what the time lag specifications look like in the Holmes paper, in another 7 or 8 years the reductions in deaths would be immense, implausibly immense indeed, given the size of the estimate after just over 2.5 years.

“Or it’s possible that what is being picked up in the new study is an effect of a change in alcohol consumption that occurred considerably earlier than MUP, so couldn’t have been caused directly by MUP…”

“So overall, in my view, there remains some doubt about whether MUP definitely caused the alcohol consumption change and therefore whether it is responsible for reductions in deaths.”

The “final report” at a glance

The ‘at a glance’ conclusion states:

“Overall, the evidence supports that MUP has had a positive impact on health outcomes, including alcohol-related health inequalities.”

However, the finding for health above states:

“MUP reduced deaths directly caused by alcohol consumption by 13.4% and hospital admissions by 4.1%.

“Reductions were greatest for men and those living in the most deprived areas of Scotland.

“There is no consistent evidence of impact, positive or negative, on other health outcomes.”

This is not “overall” evidence. It’s a single study.

The conclusion should have stated:

“One study supports that MUP has had a positive impact on deaths and there is no consistent evidence of impact, positive or negative, on other health outcomes.”

The “final report” briefing

The briefing concludes:

“Taken together, the evidence supports that MUP has had a positive impact on health outcomes.”

Taken together, the evidence does not support this. A single questionable study estimated the reduction in deaths. The rest of the evidence was inconclusive.

The “final report”

Section 3.3 on page 33 confirms that evidence relating to alcohol-related health outcomes was drawn from eight papers, not the 40 papers that the press release suggests.

It confirms that the 13.4 per cent death reduction figure was drawn solely from the Wyper paper (Item 25 in the bibliography) and that the 4.1 per cent reduction in hospitalisations in Wyper was “non-significant”.

It cites Wyper at length but gives short shrift to the other inconclusive papers.

It states:

“The five other papers that contributed relevant quantitative evidence found no evidence of impacts in alcohol-related health outcomes, either positive or negative: there appears to have been no effect at a population level on alcohol-related ambulance callouts, (Manca 2022) prescriptions for treatment of alcohol dependence (Manca 2023) emergency department attendance (So 2021) or the level of alcohol dependence or self-reported health status in drinkers recruited through alcohol treatment services in Scotland, relative to England. (Holmes 2022).”

Ministerial statements

On June 27, Humza Yousaf, Scotland’s First Minister, tweeted:

“When @ScotGov proposed Minimum Unit Pricing over a decade ago, it was a pioneering approach to tackling alcohol harm and some had their doubts.

“Increasing evidence is now vindicating our approach. It’s saving over 150 lives a year.”

Analysis: There is no increasing evidence. There is one consistently rehashed and questionable PHS paper and about half a dozen inconclusive papers.

On June 27, the SNP tweeted that MUP has led to “a major reduction in alcohol related deaths”.

Analysis: Alcohol-related deaths have risen since MUP was imposed. The SNP omitted the crucial caveat that the reduction was based on a hypothetical model.

Please investigate the matters raised in this correspondence and advise.

 

Yours faithfully,

Dr Sandesh Gulhane
MSP, Glasgow Region
Shadow Cabinet Secretary for Health and Social Care
Scottish Conservative and Unionist Party
NHS GP

 

Related links

Response from Sir Robert Chote to Sandesh Gulhane MSP – minimum unit pricing

Response from Sir Robert Chote to Gillian Martin MSP – A&E comparisons

Dear Ms Martin,

Thank you for your letter of 7 March regarding statements made by the Prime Minister in a BBC programme that aired on 8 January. In this programme, the Prime Minister made reference to the relative performance of the NHS in the different countries of the UK. He said:

“…you mention A&E waiting times; in Wales and Scotland for example, they’re operating at worse levels than they are in England.”

You asked us to investigate the accuracy of these claims and the appropriateness of the measure chosen for comparison.

The monthly official statistics for November 2022 (the latest publicly available at the time of the interview) presented a mixed picture of Accident and Emergency (A&E) performance, as shown in this table.

For waits under four hours, NHS Scotland had better performance for major A&E departments (known in England as Type 1 emergency departments) whereas NHS England was performing marginally better when all types of emergency unit were included [1]. On both measures, emergency units in all three administrations are falling short of the official targets for waiting times.

Comparisons of A&E performance across the UK must be made with an appreciation of the limitations in comparability between countries. While the Prime Minister used the most readily available measure, which is published regularly across all three countries, it is possible to make other comparisons. The publication you mention by NHS Digital [2] (as was) says that the nearest like-for-like comparison between administrations can be found in waiting times for Type 1 and major emergency departments, rather than all types of emergency units, to account for the different models of service provision across the four nations. Even so, policy differences in emergency health provision mean that these units will not be treating exactly the same type of patients, so when making these numerically more robust comparisons, users should still be aware that they are not exact. We have made the officials responsible for briefing the Prime Minister aware of this case, to ensure that they understand the trade-offs between coverage and comparability involved.

I have written previously to Dr Sandesh Gulhane MSP about A&E waiting time statistics [3], and I noted the importance of being able to accurately compare performance between UK administrations. In April, NHS England published a supplementary file of data on 12-hour waiting times for Type 1 departments [4], as well as the usual data on 4-hour waiting times, to facilitate comparisons across a broader range of measures. I welcome this development.

Yours sincerely,

Sir Robert Chote
Chair

 

Percentage of attendances that met the 4-hour target for A&E waiting times by GB administration, November 2022

Country     Type 1 / Major emergency units   All types of emergency unit
England     54.5%                            68.9%
Scotland    64.1%                            67.5%
Wales       57.8%                            67.3%

 


 

FOOTNOTES

[1] In England, this means Types 1, 2 and 3 A&E departments. Type 1 departments provide a consultant-led 24-hour service with full resuscitation facilities and designated accommodation for A&E patients. Type 2 departments are consultant-led A&E services for a single specialty, such as emergency ophthalmology or dental clinics. Type 3 departments (or Urgent Treatment Centres) are led by a doctor or nurse and treat minor injuries and illnesses without an appointment.

[2] Hospital Accident and Emergency Activity, 2021-22; Home Nations Comparative Analysis (Spreadsheet), NHS Digital, 15 September 2022

[3] Letter to Dr Sandesh Gulhane MSP on waiting time statistics, UK Statistics Authority, 16 December 2022

[4] Emergency Care Data Set (ECDS) Data, February 2023 Statistical Commentary, NHS England, April 2023

 

Related links

Letter from Gillian Martin MSP to UK Statistics Authority – A&E comparisons

Response from Sir Robert Chote to Pam Duncan-Glancy MSP – statistics on the cost of living

Dear Ms Duncan-Glancy,

 

Thank you for your letter of 24 January regarding Scottish Government figures on the cost of living. I directed the Office for Statistics Regulation to examine your concerns on this issue, and its substantive reply from Director General Ed Humpherson is enclosed.

 

Yours sincerely,

Sir Robert Chote
Chair of the UK Statistics Authority

 

Related links:

Letter from Pam Duncan-Glancy MSP to Ed Humpherson: Misleading figures on Cost of Living Measures

Response from Ed Humpherson to Pam Duncan-Glancy MSP: statistics on the cost of living

Letter from Gillian Martin MSP to UK Statistics Authority – A&E comparisons

Dear Sir/Madam,

I write regarding claims made by Prime Minister Rishi Sunak in a BBC interview, as referenced in the story linked: PM lies directly about A&E Waiting times in England and Scotland – Talking-up Scotland

I am concerned that Mr Sunak selected statistics that are not a like-for-like comparison and that, in using them, he misled viewers and therefore the country on a serious public health matter.

As a consequence, I am requesting that the UK Statistics Authority investigate whether the claims made by Mr Sunak comparing A&E statistics in England and Scotland were accurate.

I draw your attention to the following: NHS England’s agency NHS Digital produces an annual comparison of the four nations’ A&E performance – Hospital Accident & Emergency Activity 2021-22 – NDRS.

The file on the ‘home nations’ comparison notes that it

“…provides a comparison of the number of unplanned A&E attendances and percentage of attendances spending i) 4 hour or less and ii) over 12 hours in type 1/Major A&E departments from arrival to discharge, admission or transfer for each of the four Home Nations (England, Scotland, Wales and Northern Ireland). Given the different models of service provision across the four nations the comparison is restricted to Type 1 or Major A&E departments within each nation…

“Whilst this allows a nearer like for like comparison it does not take account of the differing casemix of patients that present at type 1 services which will be influenced by the provision and accessibility of alternative types of A&E.”

The Royal College of Emergency Medicine observed that in NHS England

“December’s performance figures are truly shocking, more than 50% of all patients facing waits over four-hours and nearly 55,000 patients facing 12-hour waits from the decision to admit. 12-hour waits from decision to admit obfuscate the truth and are only the tip of the iceberg, we know the reality is far worse.”

How far does Emergency Department performance need to fall for political leaders to take meaningful action? – RCEM

The NHS England A&E attendances and emergency admissions data were published for December 2022. For ‘all’ sites, performance was 65% in December (68.9% in November), but Type 1 performance was 49.6% in December (54.5% in November). From this it is clear that the leading body, the RCEM, regards Type 1 (Major/Core) as the A&E measure that should be the focus.

As NHS Digital is clear that, for comparability, Type 1/Core/Major sites should be used, and as the RCEM is clear that Type 1/Core/Major is the measure which shows A&E performance most clearly, it is far from apparent why the PM would claim England’s A&Es were performing better than Scotland’s, given that for over seven years Scotland’s Core A&Es have performed better than England’s Type 1 A&Es.

I hope this helps inform your consideration and I await your reply.

Yours faithfully,
Gillian Martin MSP

 

Related links:

Response from Sir Robert Chote to Gillian Martin MSP – A&E comparisons

Letter from Pam Duncan-Glancy MSP to Ed Humpherson

Dear Ed,

RE: Misleading figures on Cost of Living Measures

I am concerned that a figure that is being repeatedly used by the Scottish Government, which claims that it has allocated £3 billion for Cost of Living Measures, is misleading.

The Scottish Government has used this figure on a number of occasions including in a statement by the Cabinet Secretary for Social Justice, Local Government and Housing on 12th July 2022, a statement by the First Minister of Scotland on 8th August 2022, in the First Minister’s Speech on the 2022-23 Programme for Government in September 2022, and during a Speech by the First Minister at the Poverty Alliance 30th Anniversary Conference on 25th November 2022.

The Scottish Parliament’s Independent Information Centre has set out in this blog the policies which it believes the Scottish Government has used, including a range of measures and their associated funding which pre-date not only the current Cost of Living Crisis, but also the SNP’s term of government.

The correct use of statistics and data is vital to ensure public confidence.

It is imperative that the public have faith in the accuracy and truthfulness of statistics that are cited by Government ministers. As such, I would be grateful if you could investigate and provide guidance on the matter.

 

Yours sincerely,

Pam Duncan-Glancy MSP
Member of the Scottish Parliament for Glasgow Region (Scottish Labour Party)
Shadow Cabinet Secretary for Social Justice and Social Security

 

Related links

Response from Sir Robert Chote to Pam Duncan-Glancy MSP

Response from Ed Humpherson to Pam Duncan-Glancy MSP

Response from Sir Robert Chote to Dr Sandesh Gulhane MSP – Waiting time statistics

Dear Dr Gulhane,

Thank you for your letter of 9 November asking us to investigate concerns about statistics on Accident and Emergency (A&E) waiting times in Scotland. The Scottish Government has a target that 95 per cent of people attending A&E should be seen within four hours.

Your letter refers to an article in the Scotsman, which points out that an estimated 2,000 patients who present at the Acute Assessment Unit (AAU) of Glasgow’s Queen Elizabeth University Hospital each month are excluded from Public Health Scotland’s (PHS) monthly waiting times statistics. The author suspects there may be inconsistencies in data collection here because patients presenting at an apparently similar Assessment Unit – the Medical Assessment Unit (MAU) at the Western General Hospital in Edinburgh – are included.

The monthly A&E statistics (and the Government target) cover all types of A&E site, including Emergency Departments, Minor Injury Units, and smaller community casualty sites. Virtual attendances and activity taking place in trolleyed areas of assessment units, which are often located alongside A&E departments, should also be included. Patients admitted to staffed beds in an Assessment Unit (rather than spending time on trolleys or chairs) are considered Emergency Admissions rather than A&E attendances and should be included in separate Inpatient and Day Case Statistics to which the four hour A&E access standard does not apply.

A&E statistics for the Western General Hospital include activity for both the Minor Injuries Unit and trolleyed areas of the MAU. For the Queen Elizabeth University Hospital, patients coming into trolleyed areas of the AAU via the Emergency Department should be included in the statistics. We understand from PHS that, due to limitations of the current data collection, activity in trolleyed areas of assessment units cannot be differentiated in the data. Therefore, statistics for the AAU are not reported separately but should be included as part of the overall A&E activity reported for the hospital. PHS has acknowledged this issue and is undertaking further work to assess whether or not all relevant activity in this assessment unit is being included in A&E submissions it receives.

In addition to the monthly A&E statistics, Public Health Scotland also publishes weekly waiting times statistics, which are often used by the media to report and compare hospital performance against the Scottish Government target. However, these are confined to Emergency Departments, which PHS defines as “large hospital departments which typically provide a consultant-led, 24-hour service with full resuscitation facilities and designated accommodation for the reception of emergency patients”.

The way in which services and facilities are defined in the monthly and weekly statistics is clearly very important to understand from a user perspective, especially in the context of the Government target. Background information and a glossary are available online, but the recent confusion suggests that they should be made more accessible and transparent for users. The Office for Statistics Regulation has suggested this to PHS. It has also asked PHS to communicate more clearly any caveats regarding data collection issues across various sites.

Separately, you raised a concern that initiatives adopted by individual hospitals may result in inconsistencies in the statistics. This recent Herald article claims that NHS Tayside fares well in the statistics due to a ‘continuous flow’ model at Ninewells Hospital, which means that some patients who would be waiting on trolleys to be seen in A&E elsewhere wait instead on trolleys in the acute medical receiving unit – where they do not count towards the waiting times estimates. It is not for us to say how hospitals should manage their emergency admissions policies, but the Office for Statistics Regulation has urged PHS to make it clear where this is likely to create difficulties in comparing waiting time statistics across hospitals and boards.

Finally, it is important to ensure that PHS guidance on data collection and classification is applied consistently across health boards. PHS has advised us that it is reviewing this guidance due to the increasing emergence of new clinical pathways to A&E. The Office for Statistics Regulation will continue to engage with PHS as it does so and as it responds to our feedback on the presentation of its statistics.

The issues raised in this case around difficulties making comparisons on NHS data in Scotland are indicative of a longstanding broader challenge in getting comparable data on healthcare provision across the UK, between nations and within them. It is important that users of statistics are able to compare the performance of the NHS across the UK on issues such as the waiting times for emergency care, and I encourage statistical producers to take this into account as they develop their statistics.

Yours sincerely,

Sir Robert Chote

 

Related links

Dr Sandesh Gulhane MSP to Sir Robert Chote – Waiting time statistics

Response from Sir Robert Chote to Alex Cole-Hamilton MSP – Scottish renewable energy statistics

Dear Mr Cole-Hamilton,

Thank you for your letters of 14 November and 28 November regarding statistics on renewable energy in Scotland. You asked us to consider the claim that Scotland has 25 per cent of Europe’s renewable energy potential and cited several examples. Upon reviewing these, we identified that the precise claim made is that Scotland has 25 per cent of Europe’s potential offshore wind resource and it is this claim that we have examined.

This claim is based on external research reports rather than official statistics. This is outside our formal remit, but we have investigated these issues because, as a general principle, we consider that high profile numerical statements should be supported by sound evidence and clearly identified sources.

The claim originated in a 2010 publication by the Scottish Government, drawing on estimates that Scotland has an offshore wind potential of 25GW and Europe one of 102GW. However, these figures are derived from separate studies that are both more than 20 years old and not directly comparable:

  • The estimate of Scotland’s offshore wind potential[1] included all resource at least 5km from the shoreline in waters up to 30m deep, and assumed a turbine density of 8 MW per square km. It did not consider technical, navigational, or environmental issues that may affect installation of turbines.
  • The estimate of Europe’s offshore wind potential only included waters up to 20m deep, and assumed a turbine density of 6 MW per square km. It included only 10 per cent of the resource 0-10km from the shoreline, 50 per cent of the resource 10-30km from the shoreline and none beyond 30km. According to the report, these figures were based on a “very conservative approach” to come up with the likely “exploitable resource”. The figure is also based on just 11 countries from the then European Community and excludes countries such as Norway, Sweden and Finland, which have large offshore wind potential.

In summary, the calculation for Europe’s offshore wind potential was much more restrictive than that for Scotland. So, when the figures are used together, they give an inflated picture of Scotland’s potential relative to the rest of Europe.

We understand that Scottish Government and Ministers are already aware that this 25 per cent figure is inaccurate. On 15 November, the Minister for Green Skills, Circular Economy and Biodiversity, Lorna Slater (Scottish Greens), acknowledged in Holyrood that the figure was “outdated”, but not that it was poorly constructed.

It is good practice for elected representatives to correct their use of official statistics. My office is engaging with the Scottish National Party about its ongoing use of the claim and with the offices of those who have recently used it to emphasise the importance of using quantitative evidence appropriately. The Office for Statistics Regulation is also engaging with colleagues in Scottish Government to understand what more can be done to avoid further use of this claim and to obtain a more accurate and up to date figure for Scotland’s offshore wind potential in comparison to Europe.

Yours sincerely,

Sir Robert Chote

[1] Scotland’s Renewable Resource 2001 [no longer available in full online], Garrad Hassan, 7 December 2001

 

Related links

Alex Cole-Hamilton MSP to Sir Robert Chote – Scottish renewable energy statistics

Alex Cole-Hamilton MSP to Sir Robert Chote – Further letter on Scottish renewable energy statistics

Alex Cole-Hamilton MSP to Sir Robert Chote – Further letter on Scottish renewable energy statistics

Dear Sir Robert,

Further to my letter of 14th November, I am writing once again with regard to the serial misuse and mismanagement of a recently debunked energy statistic by a succession of Scottish National Party and Scottish Green Party ministers and politicians.

On Sunday 27th November, a senior member of SNP staff posted an image of leaflets being delivered to households despite their containing the debunked figure that Scotland has 25% of Europe’s offshore wind potential. The tweet can be found here.

The leaflets did not appear to be Kirkcaldy-specific, so they could be delivered across Scotland. This raises the question of how many of these leaflets have been printed and continue to be delivered despite it being known that they contain false information.

Last Friday it was also brought to my attention that this statistic was being used on different leaflets in St Andrews. The leaflet can be found here:

The continued promotion of false information through these fresh deliveries to households is obviously concerning. Can I therefore ask if the UK Statistics Authority intends to issue any advice to the Scottish National Party? How should households which have received this information be informed that it is not in fact true?

Even after the statistic had been debunked, on Tuesday 15th November, SNP MP Ronnie Cowan said in the House of Commons that he would “stand by” the claim that Scotland has 25% of Europe’s offshore wind potential. On the same day, in the Scottish Parliament, Green Minister Lorna Slater refused to confirm that the claim had always been bogus, instead saying that “it is out of date”.

I fully support the expansion of Scotland’s renewable sector and I desperately want to see Scotland fulfil our renewable potential. Nevertheless, the strong case for that is not helped when the Scottish Government and SNP use figures which leave them open to the charge of misleading and misrepresenting.

I would appreciate input from the UK Statistics Authority as to how this can be corrected and avoided in future.

Yours sincerely,

Alex Cole-Hamilton MSP

 

Related links

Alex Cole-Hamilton MSP to Sir Robert Chote – Scottish renewable energy statistics

Response from Sir Robert Chote to Alex Cole-Hamilton MSP – Scottish renewable energy statistics

Alex Cole-Hamilton MSP to Sir Robert Chote – Scottish renewable energy statistics

Dear Sir Robert,

I am writing with regard to the serial misuse and mismanagement of a recently debunked renewable energy statistic by a succession of Scottish National Party and Scottish Green Party ministers and politicians and in Scottish Government documents.

The claim has been that Scotland has 25% of Europe’s renewable energy potential. However, research by the think tank These Islands has demonstrated that this statistic was based on a bogus analysis of a mixture of reports dating all the way back to 1993, when the technology was in its infancy, and using a definition of Europe that excluded renewable powerhouses like Sweden, Norway and Finland. It wasn’t the case that it was accurate in 2010, as a Scottish Government spokesperson claimed last week – it was never accurate.

Freedom of information requests have also revealed that civil servants have been privately warning against its use for at least two years, saying that it has “never…been properly sourced” and that the figures had been recycled “robotically without really checking them”.

Nevertheless, this is a claim that has been made by First Minister Nicola Sturgeon, the Westminster leader of the Scottish National Party, Ian Blackford, successive Scottish Government environment secretaries, former SNP First Minister Alex Salmond and Deputy First Minister John Swinney, amongst others. It featured multiple times in the recent SNP-led debate on independence in the House of Commons a fortnight ago.

This matters because this Scottish Government has put this claim at the heart of debates on Scotland’s energy security, on independence and on meeting our climate targets, including it in its National Strategy for Economic Transformation as recently as March 2022.

I have suggested publicly that the Scottish Government should make a statement to Parliament acknowledging that this figure was not accurate and committing not to repeat the claim further. However, that still leaves unresolved the fact that this bogus figure has littered the public and parliamentary records for years. I would appreciate guidance from the UK Statistics Authority as to what good practice the Government should undertake to clear up the public and parliamentary records.

Should, for example, the government be expected to provide a true figure to inform future debates on renewables generation? Should notes be affixed to the official parliamentary report, acknowledging that this fact is untrue? Should documents such as the National Strategy for Economic Transformation be amended and the corrections made clear?

I fully support the expansion of Scotland’s renewable sector and I desperately want to see Scotland fulfil our renewable potential. But the strong case for that isn’t helped when the figures used by the Scottish Government leave it open to the charge of misleading and misrepresenting. I would appreciate input from the UK Statistics Authority as to how this might be corrected and avoided in future.

Yours sincerely,

Alex Cole-Hamilton MSP

 

Related links

Alex Cole-Hamilton MSP to Sir Robert Chote – Further letter on Scottish renewable energy statistics

Response from Sir Robert Chote to Alex Cole-Hamilton MSP – Scottish renewable energy statistics