1. Main changes

  • Repeated changes to schooling policies during the coronavirus (COVID-19) pandemic and the need to measure education output as consistently as possible have required us to keep innovating to ensure measurement keeps up with developments in schools.

  • Following several policy regimes, we have reviewed and aligned our measurement approaches to provide consistent accounting for remote learning during 2020; this has reduced our estimate of the extent to which remote learning was an effective substitute for in-person teaching at the start of the pandemic, relative to the autumn term of 2020.

  • As a consequence of these changes, the fall in education output between Quarter 1 (Jan to Mar) 2020 and Quarter 2 (Apr to June) 2020 has increased, indicating that the impact of the first lockdown on education output was larger than initial estimates suggested.

  • Following the alignment of our methods, education output for Quarter 2 2020 in volume terms is now estimated to have fallen 36.7% and gross domestic product (GDP) 19.5%, a downwards revision of 13.6 and 0.5 percentage points respectively from the previous estimate.

  • We have also adapted our measurement for the further school closures and change in policy regime in the first few months of 2021, to enable comparisons on a consistent basis.

  • Under our consistent approach, remote learning has generally become a more effective substitute for in-person teaching as the pandemic has progressed, contributing to education output.


2. Overview of measuring education output and our initial response from March to July 2020

Background

As set out in our previous article on Coronavirus and the impact on measures of UK government education output, the coronavirus (COVID-19) pandemic has had a profound impact on schools, teachers and students in the UK. In the year since the pandemic arrived in the UK, and with some variation in timings in the different parts of the country, schools have experienced broadly four policy regimes.

Between March and July 2020

Schools across the UK were largely closed to in-person attendance. Vulnerable children and those of key workers could continue to choose to attend in person. For the remainder, provision shifted online enabling them to learn remotely. The in-person attendance rate in England fell very sharply, to between 1% and 10%.

Between September and December 2020

Schools across the UK reopened for general in-person attendance. During this period, students were required to stay at home and learn remotely only if someone in their school "bubble" tested positive for COVID-19. The attendance rate in England was lower than normal, but much higher than during the March to July 2020 period.

From January to early March 2021

Schools were again largely closed to in-person attendance. As before, specific groups could continue to attend in-person, but for the majority, provision again shifted online to enable them to learn remotely.

At the time of writing (March 2021), schools are returning to a mixed mode of teaching provision. In-person learning is returning for the majority, and remote learning is expected to return only for targeted cohorts where the virus is detected.

These policy changes have made measuring the volume of education output consistently during the pandemic very challenging. During the first wave of the pandemic, we made a number of changes to our "usual" approach to measurement (set out in our previous article) to ensure that we captured changes in schooling provision. At that time, we indicated that the implementation of our approach would be continuously reviewed and that our initial estimates were likely to be revised as more information became available.

This article sets out how we have measured the volume of education output over this period of rapid policy change. In so doing, it sets out the data that we have collected at each stage of the process. It also explains how we have reviewed the implementation of our approach and the changes we have made to ensure our approach has been coherent throughout.

Pre-pandemic approach to measuring education output

The National Accounts provide a conceptual framework in which to record and reconcile economic flows, and they provide a lens through which the economic impact of the coronavirus (COVID-19) pandemic might be measured. In general, transactions in goods and services in the marketplace are well served by this framework. Aggregate values of production and expenditure are usually well-aligned in these markets, as is the value of the income generated by these activities. However, it has long been recognised that non-market output is a measurement challenge. It has also been significantly impacted by the coronavirus pandemic.

Non-market output comprises the production of goods and services by the government or non-profit institutions serving households (NPISH), either supplied for free or at prices that are not economically significant. In the UK, this includes most of healthcare and education provision as well as other public services such as the courts and criminal justice systems, public order and defence, social protection, and other administrative activities.

Consistent with international guidance set out in the European System of Accounts 2010 (ESA 2010), we use separate approaches to estimate the current price value of education output and the volume of education output. The former is a component of nominal gross domestic product (GDP), while the latter contributes to the widely quoted chained volume measure (CVM) estimates. Both approaches reflect the absence of a market-based price mechanism for education services, which prevents the application of methods of valuation used in other settings.

In current price terms, we measure education output using a "sum of costs" approach: adding together the intermediate consumption, labour costs and depreciation of fixed assets associated with these activities. This covers the value of goods and services consumed in the production process as well as the costs of the factors of production. These data are generally available for central government through the Online System for Central Accounting and Reporting (OSCAR) and from local government data collections, although estimates for depreciation are calculated through our perpetual inventory model. This approach has been used consistently throughout the pandemic to inform our estimates of GDP at current prices, or in nominal terms.
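As an illustration, the "sum of costs" calculation amounts to a simple aggregation of the three cost components; the figures below are hypothetical and purely for illustration:

```python
# Illustrative "sum of costs" calculation for nominal education output.
# All figures are hypothetical (£ million) and for illustration only.
costs = {
    "intermediate_consumption": 250.0,  # goods and services used up in production
    "labour_costs": 700.0,              # compensation of teachers and other staff
    "depreciation": 50.0,               # consumption of fixed capital (e.g. buildings)
}

nominal_output = sum(costs.values())
print(f"Nominal education output: £{nominal_output:.0f}m")  # £1000m
```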

In volume terms, the measurement of education output is based on cost-weighted activity indices. This involves gathering data on changes in the number of students in different educational settings (themselves intended as proxies for the number of hours of teaching provision) and weighting them together according to their relative unit costs of production. Increases in the number of students in a relatively high (low) weight activity consequently increase measured education output by a relatively large (small) amount.
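A minimal sketch of a cost-weighted activity index is shown below; the settings, student numbers and unit costs are hypothetical (the real calculation covers eight educational settings across England and the devolved administrations):

```python
# Illustrative cost-weighted activity index for the volume of education output.
# Student numbers (thousands of FTE students) in two periods, and relative unit
# costs (£ thousand per student), are hypothetical.
students_base = {"primary": 100.0, "secondary": 80.0}
students_current = {"primary": 102.0, "secondary": 78.0}
unit_cost = {"primary": 5.0, "secondary": 7.0}

def cost_weighted_output(students, unit_cost):
    """Sum of student numbers in each setting, weighted by relative unit cost."""
    return sum(students[s] * unit_cost[s] for s in students)

# Growth in the volume of output: changes in high-weight (secondary) settings
# move the index by more than equal changes in low-weight (primary) settings.
growth = cost_weighted_output(students_current, unit_cost) / cost_weighted_output(students_base, unit_cost)
print(f"Volume index (base period = 1): {growth:.4f}")
```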

Under ESA 2010 - the international statistical standards by which the UK National Accounts are compiled - we are required to estimate the volume of non-market education provided excluding changes in the quality of that provision. This means that our estimates of education output are based on information about school provision by teachers - encompassing the range of services which teachers provide, including education and childcare services - rather than the quality of learning students receive. Our estimates of the volume of education output are therefore designed to capture the quantity of teacher-pupil hours provided, and do not respond to changes in teaching methods. We publish estimates of quality-adjusted education output - consistent with the wider System of National Accounts (SNA) 2008 - separately.

To implement this approach, we gather information on the number of students in eight different educational settings - ranging from nursery schools to secondary schools, special schools to teacher training courses - from England and the devolved administrations. To measure school activity, which is the largest part of education output, we use a wide range of data sources including annual estimates based on school censuses to inform our estimates.

Methodology: our response to the measurement challenges for March to July 2020

In the first wave of the pandemic, as schools across the UK were largely closed to in-person attendance, we modified our approach to measurement to count both in-person attendances and remote learners. A measure of education output which excluded the services provided to remote learners would ignore a large portion of teaching activity and likely underestimate the "true" value of education output. Each in-school attendance continued to count as one full-time equivalent (FTE).

To include remote learners, we recognised there was a spectrum of options, bounded by two extremes:

  • assuming that remote and in-person learning were equivalent (formally, setting one remote learning student and one in-person attendance in school both equal to one FTE)

  • excluding remote learning entirely (setting each remote learner equal to zero FTE)

Neither of these assumptions reflects the reality of the education delivered to remote learners. Instead, we sought data that would enable us to estimate a "discount" for remote learning. This discount would capture the instruction that was being provided by teachers, despite the change in the location of that instruction. It would ensure that remote provision would contribute to education output, but likely at a lower rate than in-person provision.

To that end, we used data on two features of the education system during the pandemic.

Changes in hours worked: Teachers working fewer hours than "normal" during school closures (Table 1) were considered indicative of lower input provided to remote learning students compared with "normal". We assumed that this fall in input was indicative of reduced output for remote learners. This assumption was reasonable as in-school attendance was very low (approximately 1% in April) and teachers were largely teaching in one mode (either to remote learners or to students in school).

Dependence on parents and guardians: The fraction of learning dependent on parents and guardians should count towards household production, rather than as education output, as set out in international guidance.

Using a novel sampling frame, we gathered data on these two quantities from teachers, via Teacher Tapp1, and used them to inform an FTE discount that we applied to all remote learners. As set out in our previous article, this discount was larger for primary school students - where the fall in teacher hours worked was larger, and where the role of parents was considered more important in instruction - than for secondary school students. These trends were very similar throughout the first lockdown (Table 1) and resulted in a fairly stable, though gradually rising, FTE for remote learners.
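A stylised sketch of how such a discount might combine the two adjustments is shown below; the functional form and all figures are illustrative assumptions, not our production methodology:

```python
# Illustrative sketch of the first-lockdown remote learning FTE discount.
# ASSUMPTION: the discount is modelled here as the fall in teacher hours worked
# (relative to "normal") multiplied by the share of instruction NOT dependent on
# parents and guardians (parental instruction counts as household production).
def remote_fte(hours_worked, hours_normal, parental_share):
    """FTE weight for one remote learner: scale by relative teacher input,
    then remove the fraction attributable to parents and guardians."""
    return (hours_worked / hours_normal) * (1.0 - parental_share)

# Hypothetical first-lockdown inputs: primary teachers working 60% of normal
# hours with 40% of remote instruction dependent on parents; secondary teachers
# working 75% of normal hours with 25% parental dependence.
primary = remote_fte(hours_worked=24.0, hours_normal=40.0, parental_share=0.40)
secondary = remote_fte(hours_worked=30.0, hours_normal=40.0, parental_share=0.25)
print(f"Primary remote FTE: {primary:.4f}, secondary: {secondary:.4f}")
```

Consistent with the pattern described above, the larger fall in hours and the greater parental role produce a deeper discount for primary than for secondary students.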

Notes for Overview of measuring education output and our initial response from March to July 2020:

  1. A survey app run by Educational Intelligence Limited. This company was established by a team with experience from the academic sector, schooling and education journalism. Teacher Tapp maintains a sample frame of teachers across England in a range of different educational settings. Their smartphone app poses questions to teachers on a daily basis. The results are weighted using the English School Workforce Census (on the basis of the sex, age and leadership status of teachers and on the region and setting of the school) and the resulting intelligence is used to inform policy debates.

3. Overview of measurement challenge and response for September to December 2020

Following the school summer break1, schools re-opened for general in-person attendance. While the precise timings varied across the UK, this facilitated a significant increase in the attendance rate (Figure 1). Students were required to return to “remote learning” only when the virus was detected in their “bubble”.

The higher in-person attendance resulted in a large rebound in education output, as each of the reported in-person attendances counted as one full-time equivalent (FTE). The increase in attendance, from between 2% and 10% in June to between 84% and 91% in September, came close to the normal attendance rate of around 95%.

As some schools were now maintaining provision through two modes - in-person and online (under a new legal duty to provide education for children unable to attend because of coronavirus) - teacher labour input increased between June and September. In September, reported teacher hours were around 49 and 47 in primary and secondary schools respectively, up considerably on the level during the first lockdown, and on our reported "normal" hours measure (Figure 2 and Table 2).

These developments presented a challenge to the implementation of our approach for remote learners. Under full school closures, it was reasonable to assume that changes in teacher hours reflected changes in the amount of input received by remote learning students. However, the observed longer working hours in September 2020 - partly a consequence of needing to teach both in-person and remote learning students - no longer provided a good proxy for the amount of input received by remote learners.

Methodology: our response to the measurement challenges for September to December 2020

Although we continued to collect data on teacher hours worked, these circumstances demanded a further refinement to the implementation of our approach for remote learners. We maintained our question about the importance of parents and guardians to remote instruction, and added an additional question designed to identify whether remote learners were receiving the same quantity of instruction as those attending in-person. Specifically, we used "Teacher Tapp" to ask teachers:

Thinking of students learning remotely today, to what extent did they have learning materials enabling them to cover the same content as pupils who were learning in school?

Our interest in the quantity of instruction - rather than its quality or impact (Section 2) - informed the design of this question. Here, we focused on the "materials provided to students", rather than on "how much learning students have done".

The results of this survey indicated that while the level of teacher labour input had risen between June and September 2020, the content provided to remote learners could only cover a proportion of material covered in the classroom (Table 2). As confirmed in subsequent waves of collection, this position was more acute in primary schools than in secondary schools: the materials provided to remote learning primary (secondary) school students enabled them to cover around 56% (72%) of the material covered by in-person attendees over this period.

As the reported increase in average teacher hours worked from September to December 2020 partly reflected the demands of teaching students in school and those learning remotely at the same time, we changed our discount for remote learners to avoid this increase distorting our remote learning assumptions. For the small number of students who were learning remotely over this period, we replaced our inference about the labour input provided by teachers with the information provided by our supplementary question on materials. We continued to apply our adjustment based on the importance of parental instruction.
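A stylised sketch of the autumn-term discount under this refinement is shown below. The 56% and 72% materials coverage figures come from the survey results above; the parental-dependence shares and the multiplicative form are illustrative assumptions:

```python
# Illustrative autumn-term remote FTE, based on materials coverage rather than
# teacher hours worked. ASSUMPTION: coverage and the parental adjustment are
# combined multiplicatively; parental shares are hypothetical.
def remote_fte_autumn(materials_coverage, parental_share):
    """FTE weight for one remote learner in the autumn term: the share of
    classroom content covered by remote materials, less the fraction of
    instruction attributed to parents and guardians."""
    return materials_coverage * (1.0 - parental_share)

primary = remote_fte_autumn(0.56, parental_share=0.30)    # 56% coverage (survey)
secondary = remote_fte_autumn(0.72, parental_share=0.15)  # 72% coverage (survey)
print(f"Primary remote FTE: {primary:.3f}, secondary: {secondary:.3f}")
```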

Given the small number of remote learners once schools reopened in September, and that the two methods could not be accurately compared for the same period with the data available, we judged that there was insufficient information to align the two methods at this point. Information on the alignment of the two methods is covered in Section 5: A consistent approach: aligning policy and measurement regimes.


4. Overview of measurement challenge and response for early 2021

Measurement challenges

The return to a full closure of schools in England at the start of January 2021 presented a further challenge to the implementation of our approach. Specifically, the results from our “materials” question – intended to capture the proportion of “normal” content provided to remote learners – suggested that remote and in-class attendees received very similar materials in January and February 2021. This indicated that schools may have taken the step of using the remote learning content for both remote and “in-person” learners during the early months of 2021. To investigate this, a new question on whether in-class learners were taught using the same material as remote learners during January and February 2021 was asked retrospectively in March 2021. The responses confirmed our prior assumption: only a small proportion of in-class learners were taught using different materials to remote learners.

These results indicated to us that estimates of the full-time equivalent (FTE) factor applied to remote learners for January might be inflated, not because the coverage of the remote learning materials provided had increased, but because the content covered in-class had fallen considerably. This effect was particularly marked in primary schools: between December and January, the proportion of in-class content covered by the remote learning materials provided increased from around 55% to around 85%, and it remained high in February.

To understand this development better, we posed a further question to teachers in February 2021: asking them the extent to which the remote learning materials provided in February allowed students to pursue the same content as at this stage in a "normal" year:

Thinking of students learning remotely today, to what extent did they have learning materials enabling them to cover the same content as pupils at this point in a normal pre-pandemic year?

The results of the new question suggested that the learning materials provided during the latest full school closure (from January 2021) were insufficient to fully represent pre-pandemic normal conditions. This result is consistent with the apparent improvement in the coverage of remote learning materials in early 2021 being driven less by a change in their content over this period, and more by a change in the content covered by in-person learners (Figure 4). Our early estimates indicate that once schools reopened in March, the difference between the results of "comparing to in class" and "comparing to normal" questions reduced considerably.

Methodology: our response to the measurement challenges for early 2021

As discussed, the renewed school closures in the third wave of the coronavirus (COVID-19) pandemic (January 2021) coincided with a marked rise in the share of classroom learning accounted for by remote learning materials. While it remains to be seen how this series behaves once data for a post-school closure period - such as March 2021 - are captured, an FTE adjustment for remote learners based on these data would likely overstate the value of remote education relative to "normal".

To that end, we modified our FTE calculation for remote learners to use the comparison to "a normal year", rather than to what is happening "in the classroom". This change was implemented prior to the publication of estimates of education output from January onwards, so there are no revisions to published estimates for this period. It reflects the change in policy regime and the ongoing lower level of provision to remote learners than normal, enabling comparisons on a consistent basis. By necessity, the adjustments for both January and February 2021 were based on information gathered in February 2021.
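The switch of comparator can be sketched as a simple regime rule; the function and all values below are hypothetical illustrations of the logic described, not our production calculation:

```python
# Illustrative sketch of switching the remote FTE comparator by policy regime:
# during full school closures, compare remote materials to "a normal year";
# otherwise, compare them to what is happening "in the classroom".
# ASSUMPTION: coverage figures and the parental adjustment are hypothetical.
def remote_fte_2021(coverage_vs_normal, coverage_vs_class,
                    schools_fully_closed, parental_share):
    coverage = coverage_vs_normal if schools_fully_closed else coverage_vs_class
    return coverage * (1.0 - parental_share)

# During the January 2021 closures, the "vs classroom" series is inflated
# because in-class content itself fell, so the "vs normal" series is used.
jan = remote_fte_2021(coverage_vs_normal=0.70, coverage_vs_class=0.85,
                      schools_fully_closed=True, parental_share=0.25)
print(f"January 2021 remote FTE (vs normal): {jan:.3f}")
```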

Despite the apparent change in the quantity of instruction in classrooms, we have continued to count in-person attendances at the same FTE rate for this period. This approach reflects our underlying concept of interest: the number of teacher-student hours provided. This in turn requires us to assume that changes in the quantity of instruction in school are compensated for by changes in the quantity of other services provided, retaining the same number of underlying teacher-pupil hours.

Our practical implementation of this conceptual approach has involved a number of assumptions. Some of these are a matter of expediency and will be reviewed in the coming months. It is likely that these estimates will be subject to larger revisions than normal.


5. A consistent approach: aligning policy and measurement regimes

The succession of school policy regimes and the need to measure education output as consistently as possible over the course of the coronavirus (COVID-19) pandemic have resulted in three approaches to measurement. These are summarised in Table 3, which sets out how the “discount” for remote learning was based on different information during each phase of the lockdown. In the first phase, the full-time equivalent (FTE) discount was based on information about changes in teacher labour input. In the second phase, it was based on the extent to which the materials provided to remote learners approximated the classroom experience. In the third phase, it was based on the extent to which remote learning materials covered the same content as at this stage of a “normal” school year.

The effect of these different policy and measurement regimes on the underlying remote learning FTE assumption is shown in Figure 5; the share of students to whom it was being applied is shown in Table 3. Switching between using teacher labour input in the first lockdown and using information about remote learning materials in the autumn term can be seen to have introduced a discontinuity in our remote learning FTE estimate. Adapting our measurement for the further school closures and change in policy regime in the first few months of 2021 has enabled comparisons on a consistent basis.

We have continued to review the implementation of our approach to the measurement of education over the course of the pandemic. As a result, we are now making some changes to align the implementation of our approach between April to June 2020, and September to December 2020, to ensure that it is as consistent as possible over this period.

Implementing our response

To align the measurement regimes between the first (March to July 2020) and second (September to December 2020) wave estimates, we asked teachers:

Thinking of students learning remotely during the first full school closures, to what extent did they have learning materials enabling them to cover the same content as pupils who were learning in school?

The results of the retrospective question (Figure 6) suggest that the education provided to remote learners during the first full school closures (March to July 2020) was comparable to, or lower than, that provided when students returned to school in September 2020. Therefore, the large downward step change in the proportion of normal input provided between June and September 2020 is likely to be a result of the difference in the implementation of our approach, rather than an underlying change in the education services provided to remote learners.

This retrospective question offers a comparable data point for the first and second waves and therefore provides a means of linking the two measurement regimes. To do this, we accept the monthly profile implied by changes in teacher labour input, but benchmark it to the results of our retrospective materials question. This approach introduces some potential for mismeasurement arising from the retrospective nature of the question, but it is more likely to yield comparable estimates than using two different measurement bases.
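The benchmarking step can be sketched as rescaling the hours-based monthly profile so that its period average matches the level implied by the retrospective materials question; all figures below are hypothetical:

```python
# Illustrative benchmarking of the first-lockdown monthly remote FTE profile
# to the retrospective materials question. The hours-based monthly discounts
# and the retrospective benchmark level are hypothetical.
hours_based = {"Apr": 0.36, "May": 0.40, "Jun": 0.44}  # profile from teacher hours
retrospective_benchmark = 0.50                          # level from materials question

# Rescale so the period average matches the benchmark, while preserving the
# month-to-month movements implied by teacher labour input.
period_average = sum(hours_based.values()) / len(hours_based)
scale = retrospective_benchmark / period_average
aligned = {month: value * scale for month, value in hours_based.items()}
print({month: round(value, 3) for month, value in aligned.items()})
```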

Impact of changes

The impact of these changes is a much smoother discount rate for each remote learning FTE over the course of the pandemic (Table 4 and Figure 7). At a detailed level, this means that we are reducing the extent to which remote learning was an effective substitute for in-person teaching at the start of the pandemic. Adapting the implementation of our approach for the further school closures and changes in policy regimes in the first few months of 2021 has enabled comparisons on a consistent basis. The latter of these adjustments, in particular, will be subject to review as more information is gathered.

Making these changes to the implementation of our approach has an important impact on the aggregate volume of education output (Figure 8). The largest impact to our education output as a result of these changes is for April to June 2020, but it also has an impact on the July to September 2020 period.

The impact on the July to September 2020 period arises from our treatment of school holidays. Our estimates of education output are designed in a manner which "looks through" the school holidays, as set out in our blog School's Out: measuring education output in the summer of the pandemic. Therefore, the reduction in education output at the end of the summer term has a smaller impact on education output from July to September 2020. There have been no changes to September to December 2020 arising directly from these methods changes1.

As a consequence of these changes, the fall in education output between January to March 2020 and April to June 2020 has increased, indicating that the impact of the first lockdown on education output was larger than initial estimates suggested. The growth in education output between April to June 2020 and July to September 2020 has also increased, suggesting a stronger recovery from the first lockdown than initial estimates suggested. Following the alignment of our methods, education output for Quarter 2 2020 in volume terms is now estimated to have fallen 36.7% and gross domestic product (GDP) 19.5%, a downwards revision of 13.6 and 0.5 percentage points respectively from the previous estimate.

Other measurement issues

In the coming months, we expect to maintain our data collection of hours worked, comparing to "in class" and comparing to "normal", to enable us to continue to respond flexibly. As schools across the UK reopen, we expect to return to informing our remote learning assumptions using comparisons to "in class" rather than "to normal". However, we will keep this approach under review.

We also anticipate that by maintaining several data collections, the implementation of our measurement regime will retain a higher level of consistency and be more robust to future school closures that follow existing policy regimes.

In the future, changes in delivery that increase the quantity of instruction - by reducing the length of holidays or increasing the length of the school day - will feed directly into the level of education output. We are monitoring how these proposals are developing to try to ensure their timely inclusion in our estimates of education output.

Notes for: A consistent approach: aligning policy and measurement regimes

  1. We are considering how to implement this revised approach in Quarter 1 (Jan to Mar) 2020, during which schools were only closed for a short period and prior to the collection of daily attendance data. We will bring forward the revisions implied by this aligned approach as soon as feasible, likely in the September 2021 Quarterly National Accounts release.

Contact details for this Article

Philip Wales
public.sector.outputs@ons.gov.uk
Telephone: +44 (0)1633 651609