Methods

Approach to sampling and recruitment

Different options were considered for how best to approach digitally excluded groups. In February 2021, the Office for National Statistics (ONS) commissioned a research agency to recruit a sample of people at risk of digital exclusion. A professional, independent agency was used to meet the tight timeframe within which the work had to be completed.

Sub-groups were devised based on the Government Digital Inclusion Strategy (2014) requirements for a person to be digitally included.

Table 1: Sub-groups of digital exclusion and the demographics of those impacted.

Type of digital exclusion | Definition | Likely demographics of people who are impacted
Digitally impoverished | People lacking access to devices, or sharing devices, and lacking internet data and credit | Those experiencing other forms of poverty, receiving income support, rough sleepers, or asylum seekers
Lacking digital skills | People lacking the digital skills required to use devices and the internet independently | The elderly and the disabled; may overlap with those experiencing digital poverty
Lacking digital infrastructure | People under-served by current digital infrastructure in their area | Very rural communities who lack access to broadband

An additional set of sampling criteria was used by the research agency to help identify those at risk of digital exclusion:

  • digitally impoverished participants:
    • people who reported no access to digital devices, such as a laptop, smartphone, or desktop in their home
    • people who had a phone that was not a smartphone or that did not have internet connectivity
  • participants lacking digital skills:
    • people without internet access
    • people not using the internet
  • participants identified as lacking digital infrastructure:
    • people living in rural areas without reliable broadband internet connection to their homes
    • people who frequently need to rely on mobile data for hotspotting
    • people who said they have regular drop-out of service or no service at all

This sample covered each of the four nations of the UK. Where possible, the recruiter aimed for an equal distribution of age and sex.


Achieved sample

The aim was to obtain an achieved sample of 60 people, with 20 participants in each of the three digital exclusion categories. To allow for non-response, a 20% oversample was applied, giving an issued sample of 72 addresses.
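
As a purely illustrative sketch (not ONS tooling), the sample-size arithmetic described above can be reproduced as follows; the figures are those reported in this section and the variable names are hypothetical:

```python
# Illustrative sketch of the sample-size arithmetic reported above.
target_sample = 60                    # 20 participants in each of 3 categories
groups = 3
per_group = target_sample // groups   # 20 per digital exclusion category
oversample_rate = 0.20                # allowance for non-response

addresses_issued = round(target_sample * (1 + oversample_rate))
print(per_group, addresses_issued)    # prints: 20 72
```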

The recruitment agency was responsible for providing a sample file, including names and addresses of participants in the three groups at greater risk of digital exclusion, so that ONS could send out engagement materials and track responses.

The paper-based consultation achieved 59 responses:

  • 18 responses (30%) came from those who were digitally impoverished, 20 responses (34%) from people who lacked digital skills, and 21 responses (36%) from people who lacked digital infrastructure
  • 31 responses (52%) came from people living in England, 11 responses (19%) from people living in Scotland, 9 responses (15%) from people living in Wales, and 8 responses (14%) from people living in Northern Ireland
  • 40 responses (68%) were from women and 19 responses (32%) were from men
  • responses were gathered from participants aged 23 to 91 years, with most (76%) aged 45 years and over

Design and materials

The paper-based consultation approach was inspired by a postcard consultation used in New Zealand to gather the views of children and young people. The “Postcard to the Prime Minister” was a way for children, young people, and adults to express their ideas directly to the Prime Minister. The postcard was available online and in hard copy at face-to-face engagements. A similar approach was used by ONS in the Measuring National Wellbeing consultation (the National Debate) in 2010 and 2011.

Cognitive testing was undertaken in February 2021 to ensure the research questions and associated materials were accessible, easy to understand, and relevant to members of the public. Cognitive testing is a form of interviewing that looks closely at a participant’s understanding of questions and how they decide what answers to give.

Participants were asked if they felt comfortable and confident in answering questions about data inclusivity, accessibility, and trust via a postcard or leaflet. They were also asked about the accessibility of the postcard, envelopes, and information page, and whether they would prefer to respond via a leaflet instead. The sample consisted of 12 participants, including older people, people with neurodiversity or learning disabilities, people with low digital skills, people with limited English language skills, and people from minority ethnic groups.

The initial list of questions tested for each area fell under three themes:

  • inclusion:
    • how can we make sure everyone is included in information about life in the UK?
    • what would help you feel included in information about life in the UK?
  • trust:
    • how do you feel about sharing information about yourself for research?
    • what might stop you from sharing information about yourself for research?
    • can you think of any reasons why you wouldn’t want to share information about yourself for research?
  • accessibility:
    • what would make it easier for people to understand facts and figures?
    • where do you usually see facts and figures about the UK?

All proposed questions were tested for readability and achieved an A or B grade, the standard readability level for materials aimed at the general public. The proposed questions were also assessed against a question evaluation framework, which tested for clarity, focus, openness, suitability, and objectivity.

Participants involved in the cognitive testing expressed preferences for questions in each section and suggested alternative wordings. Most also said that they would prefer to respond via a leaflet rather than a postcard, and suggested ways to improve the leaflet design. A leaflet was designed in response to this feedback, and additional revisions were made based on the findings, including:

  • development of a new inclusion themed question
  • revision of the question order to minimise participant burden and encourage responses
  • selection of the trust and accessibility themed questions

The final engagement material for the paper-based consultation provided background to the research and a brief outline of the benefits of having facts and figures to improve people’s lives. The material also asked participants three questions on the themes of accessibility, inclusion, and trust in sharing personal data for research:

  • accessibility – where do you usually see facts and figures about people in the UK?
  • inclusion – what is important for decision makers to know about you, your life and your views?
  • trust – can you think of any reasons why you wouldn’t share information about yourself for research?

A pre-paid return envelope was included to reduce participant burden in returning the completed leaflets. A £25 voucher was offered to participants as a “Thank You” for their time and contribution to the consultation.


Approach to analysis

Responses from the paper-based consultation were scanned, transcribed and coded for analysis. The theme-based analysis involved developing inductive codes (or “labels”) for responses to each question. To develop the coding framework and ensure reliability and consistency, discussions between coders informed the overarching themes and sub-themes. All transcripts were analysed using the agreed framework.
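
Purely as an illustrative sketch, and not a description of the tools the coders actually used, the final step of applying an agreed coding frame and tallying overarching themes could look something like this (the response identifiers and code labels below are hypothetical):

```python
from collections import Counter

# Hypothetical coded transcripts: each response carries manually assigned
# "theme:sub-theme" labels agreed between coders.
coded_responses = {
    "response_01": ["accessibility:plain_language", "trust:data_security"],
    "response_02": ["inclusion:local_services", "accessibility:plain_language"],
}

# Tally the overarching themes across all responses.
theme_counts = Counter(
    label.split(":")[0]
    for labels in coded_responses.values()
    for label in labels
)
print(theme_counts)  # e.g. Counter({'accessibility': 2, 'trust': 1, 'inclusion': 1})
```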

The findings have been reported under the three themes of accessibility, inclusion, and trust. As responses were handwritten, they varied in length and detail. Open-ended questions were chosen to encourage participants to write freely about their own views and experiences.
