How do we get a better understanding of the scale of the work that Digital Champions are doing?
We know they’re out there doing good work, and we’ve been telling everybody about it. We believe that a well-supported Digital Champions programme not only provides the essential assistance and confidence that people need, but also helps your project evidence that impact through both qualitative reporting and quantitative recording.
We know that our Digital Champions are helping hundreds of people with the digital skills they need. But we wanted to get a better picture of the scale of that support.
We recently wrote about how our approach to evaluating digital inclusion is not just about the numbers, with the help of England cricketing hero Ben Stokes. Ben was busy this week so we got someone else in to help with our next piece, which is — let’s be honest — all about the numbers. Do we contradict ourselves? Very well then.
Here’s Muhammad al-Khwarizmi of Baghdad (c. 780 – c. 850)
This is a piece about our annual “activity snapshot” process, and how we use it to make estimates about how many people are being helped by Digital Champions each year. We have come up with a suitably sophisticated algorithm to help us derive accurate estimates of annual impact from the snapshot.
Our thanks to al-Khwarizmi, who first popularised algebra with his Compendious Book on Calculation by Completion and Balancing. The word algorithm comes from the Latinised version of his name. Muhammad – we don’t think we could do it without you!
The Citizens Online data team doing “compendious” snapshot estimates.
Taking a snapshot
Citizens Online’s approach to digital inclusion, Switch, is built around the provision of assistance with digital skills provided by a mixture of Professional, Volunteer and Embedded Digital Champions (DCs). DCs help people (End Learners, or ELs) understand the benefits of using the internet, and can show them how to do simple things online.
- Professional DCs are dedicated outreach workers who are recruited by an organisation or partnership, or by us at Citizens Online, to work solely as a Digital Champion.
- Volunteer DCs are recruited and trained by an organisation or partnership to support digital inclusion work, but are unpaid.
- Embedded DCs work in a specific role (such as at a Jobcentre Plus, a Citizens Advice branch, or in an HR or training department) but integrate Digital Champion work into that role; we emphasise the valuable part they play.
Each year, in April, we ask all the DCs working with our projects to complete a simple form during one particular week, noting down the number of people they have helped and what kinds of activity or skill they helped with. This can be anything from logging on to the wifi at a venue, to setting up an email account, to creating a CV – among many other things.
This process captures extra detail that is not usually collected by the DCs.
We have now run three Activity Snapshot weeks:
- 12–16 September 2016, completed by 39 DCs
- 16–22 April 2018, completed by 54 DCs
- 15–21 April 2019, completed by 23 DCs
Rationale for the snapshot process
Citizens Online works with Digital Unite to support people to become Digital Champions through the Digital Champions Network (DCN), which provides learning, tools and a friendly community.
We encourage DCs to utilise the DCN’s session- and tally-based Activity Records, but we know from anecdotal reports that the DCN doesn’t capture the full range of DC activity.
Without further analysis, however, we have little idea what proportion of activity is captured by the DCN. Whether for reporting against Key Performance Indicator targets agreed with funders, or in order to evaluate and improve internally, we want to know how many people are benefiting from the support of the Digital Champions we have employed, trained or recruited to the DCN.
What we found
Although there are some differences between the three snapshots – for example, the 2016 survey covered four projects, the 2019 survey only one – and the estimates were calculated in slightly different ways each time, we can draw some conclusions.
- We believe the respondents helped at least 47,000 people over the three years.[mfn]This figure is the sum of annualised estimates associated with each of the snapshots, and does not account for people helped by DCs who continued to help people in years they did not complete snapshots. We also have no data from Plymouth and the Highlands for 2018 and 2019, nor data for Gwynedd for 2019.[/mfn]
- Across the 116 responses, DCs helped an average of 10 people per week each.
- Embedded DCs each helped on average three times as many people as Volunteer DCs.
- DCs recorded as few as 0 to as many as 69 ELs helped in a week, including up to 24 in a single day.[mfn]Recorded by one Embedded and one Volunteer DC in Highland in 2016, a Brighton & Hove Embedded DC in 2016, and a Brighton & Hove Embedded DC and Professional DC in Plymouth in 2016, respectively.[/mfn]
- We estimate that – including active DCs that did not fill in the snapshot – around 68,000 people were helped across the four projects.[mfn]As with the earlier estimate, this figure is based on the sum of the annual estimate for each of the three years, not the full programme duration and project extent.[/mfn]
- The snapshots provide us with more information about the work DCs do. In 2019, for instance, “Foundation”[mfn]The Essential Digital Skills Framework details the Foundation skills necessary for people to use devices.[/mfn] tasks were 52% of all activities recorded.
We think the activity recorded in the snapshot represents between 14% and 20% of potential DC activity (where ‘potential’ means: if all trained DCs were as active as those participating in the snapshot).[mfn]We don’t expect all trained DCs to undertake activity at the same rate as the most active DCs. This estimate is therefore a reflection of capacity: what might be possible if all were as active as the most.[/mfn]
Our methodology and algorithm
Here’s the challenge: we need to take the numbers reported in the snapshot week sample, and calculate what this means for all our DCs across the year.
We use a standardised methodology for each snapshot, based on a small set of assumptions.[mfn]Citizens Online has previously analysed the snapshots from 2016 and 2018 using slightly different methodology, and the numbers presented here supersede those earlier estimates.[/mfn]
End Learners helped per DC per week, all vs embedded vs volunteer:
| Year | ELs per DC (all) | ELs per DC (embedded) | ELs per DC (volunteer) |
| --- | --- | --- | --- |
| 2016 | 15.7 | 16.6 | 4.2 |
| 2018 | 8.4 | 8.8 | 4.7 |
| 2019 | 3.0 | 3.8 | 1.9 |
| Total | 9.8 | 10.7 | 3.4 |
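As a rough consistency check, the ‘Total’ row for all DCs appears to match a weighted average of the three yearly figures, weighted by the number of snapshot respondents each year; that weighting is our assumption for illustration, not a published formula. A minimal sketch in Python:

```python
# Rough consistency check (illustrative): the "Total" ELs-per-DC (all) figure
# looks like a respondent-weighted average of the yearly figures.
respondents = {2016: 39, 2018: 54, 2019: 23}         # DCs completing each snapshot
els_per_dc_all = {2016: 15.7, 2018: 8.4, 2019: 3.0}  # ELs per DC per week (all)

weighted_sum = sum(respondents[y] * els_per_dc_all[y] for y in respondents)
total_respondents = sum(respondents.values())

print(round(weighted_sum / total_respondents, 1))  # 9.8, matching the "Total" row
```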
Calculations of annual numbers of End Learners receiving support from DCs:
|  | 2016 | 2018 | 2019 |
| --- | --- | --- | --- |
| Number of projects in snapshot | 4 | 2 | 1 |
| DCs in snapshot | 39 | 54 | 23 |
| Total trained DCs in snapshot projects | 266 | 295 | 235 |
| Annualised estimate of ELs helped by snapshot respondents | 24,520 | 18,040 | 4,080 |
| Annual ELs estimate based on trained DCs | 35,957 | 25,342 | 6,844 |
First, we make an annualised estimate of ELs helped by DCs who did take part in the snapshot, which simply multiplies up the snapshot totals to a 40-week year (for the 2019 snapshot, we also adjusted the estimates to take account of the reported relative quietness of the sample week).
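As a minimal sketch, that first step might look like the following in Python. The 40-week year comes from the description above; the quiet-week multiplier for 2019 is not published, so the 1.5 used here is purely illustrative, and because the tables round their figures the output will not exactly reproduce them.

```python
WEEKS_PER_YEAR = 40  # working weeks assumed when scaling a snapshot week to a year

def annualise_snapshot_week(els_in_snapshot_week: float,
                            quiet_week_factor: float = 1.0) -> float:
    """Scale the End Learners helped in one snapshot week to an annual estimate.

    quiet_week_factor > 1 compensates for a snapshot week that DCs reported
    as quieter than usual (as in 2019); the exact value used is not published,
    so any figure passed here is illustrative.
    """
    return els_in_snapshot_week * quiet_week_factor * WEEKS_PER_YEAR

# Illustrative example: 23 DCs averaging 3.0 ELs in the 2019 snapshot week.
print(annualise_snapshot_week(23 * 3.0, quiet_week_factor=1.5))  # ~4,140
```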
For all other DCs, i.e. those who did not take part in the snapshot, the process is slightly more complex:
- Having subtracted the number of DCs who completed the snapshot, we multiply the number of each type of DC by the average number of ELs that type of DC helped in the snapshot. For example, with 61 non-snapshot Volunteer DCs in 2019, if each helped 1.9 ELs/week that would be 117 ELs.
- For all these ‘non-snapshot’ DCs, we then reduce the level of estimated activity to one-tenth of the total of those who completed the snapshot – on the basis that snapshot respondents are generally the most active DCs. Now in our example, 117 ELs is reduced to an estimate of just 12.
- Finally, we again apply the factor of 40 (weeks per year) to make annual estimates. In our example, the 12 ELs first become 18 to adjust for the quietness of the 2019 week, which multiplied by 40 gives 720. (The sketch below walks through these steps.)
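Putting the three steps together, a minimal sketch of the non-snapshot estimate might look like this. The one-tenth activity factor and the 40-week year are taken from the text; the 1.5 quiet-week multiplier is inferred from the 12-to-18 adjustment in the worked example, and because the article rounds at each intermediate step, the unrounded calculation below lands a little under the 720 quoted above.

```python
WEEKS_PER_YEAR = 40           # working weeks assumed in the annualisation
NON_SNAPSHOT_ACTIVITY = 0.1   # non-respondents assumed roughly 1/10th as active

def annual_els_non_snapshot(n_dcs: int,
                            els_per_dc_week: float,
                            quiet_week_factor: float = 1.0) -> float:
    """Annual End Learner estimate for DCs who did not complete the snapshot."""
    weekly = n_dcs * els_per_dc_week   # step 1: scale the per-DC average by DC count
    weekly *= NON_SNAPSHOT_ACTIVITY    # step 2: reduce to one-tenth activity
    weekly *= quiet_week_factor        # adjust for a reportedly quiet snapshot week
    return weekly * WEEKS_PER_YEAR     # step 3: scale to a 40-week year

# Worked example from the text: 61 non-snapshot Volunteer DCs in 2019 at
# 1.9 ELs per DC per week, with the quiet-week factor taken as 1.5.
print(round(annual_els_non_snapshot(61, 1.9, quiet_week_factor=1.5)))  # ~695
```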
It’s a little complex! But we stand by our calculations.
It’s not easy to build up a more general picture from small snapshot samples, but having carefully considered all the factors and the rich information we have received, we think it’s a sound piece of work.
The work that DCs are doing, supported by our project managers and by the resources and software provided by the DCN, is making a great difference to thousands of people. Now with a bit of help from Muhammad the Excel-lent (spreadsheet joke there), we have an even better idea of how many.
Full notes and explanations can be found in our Activity Snapshot Analysis document (pdf).