Table of contents
- Main changes
- Overview of measuring education output and our initial response from March to July 2020
- Overview of measurement challenge and response for September to December 2020
- Overview of measurement challenge and response for early 2021
- A consistent approach: aligning policy and measurement regimes
- Related links
1. Main changes
Repeated changes to schooling policies during the coronavirus (COVID-19) pandemic and the need to measure education output as consistently as possible have required us to keep innovating to ensure measurement keeps up with developments in schools.
Following several policy regimes, we have reviewed and aligned our measurement approaches to provide consistent accounting for remote learning during 2020; this has reduced the estimated extent to which remote learning was an effective substitute for in-person teaching at the start of the pandemic, relative to the autumn term of 2020.
As a consequence of these changes, the fall in education output between Quarter 1 (Jan to Mar) 2020 and Quarter 2 (Apr to June) 2020 has increased, indicating that the impact of the first lockdown on education output was larger than initial estimates suggested.
Following the alignment of our methods, education output for Quarter 2 2020 in volume terms is now estimated to have fallen 36.7% and gross domestic product (GDP) 19.5%, a downwards revision of 13.6 and 0.5 percentage points respectively from the previous estimate.
We have also adapted our measurement for the further school closures and change in policy regime in the first few months of 2021, to enable comparisons on a consistent basis.
Under our consistent approach, the effectiveness of remote learning as a substitute for in-person teaching has generally improved throughout the pandemic, contributing to education output.
2. Overview of measuring education output and our initial response from March to July 2020
Background
As set out in our previous article on Coronavirus and the impact on measures of UK government education output, the coronavirus (COVID-19) pandemic has had a profound impact on schools, teachers and students in the UK. In the year since the pandemic arrived in the UK, and with some variation in timings in the different parts of the country, schools have experienced broadly four policy regimes.
Between March and July 2020
Schools across the UK were largely closed to in-person attendance. Vulnerable children and those of key workers could continue to choose to attend in person. For the remainder, provision shifted online enabling them to learn remotely. The in-person attendance rate in England fell very sharply, to between 1% and 10%.
Between September and December 2020
Schools across the UK reopened for general in-person attendance. During this period, students were required to stay at home and learn remotely only if someone in their school "bubble" tested positive for COVID-19. The attendance rate in England was lower than normal, but much higher than during the March to July 2020 period.
From January to early March 2021
Schools were again largely closed to in-person attendance. As before, specific groups could continue to attend in-person, but for the majority, provision again shifted online to enable them to learn remotely.
At the time of writing (March 2021), schools are returning to a mixed mode of teaching provision. In-person learning is returning for the majority, and remote learning is expected to return only for targeted cohorts where the virus is detected.
These policy changes have made measuring the volume of education output consistently during the pandemic very challenging. During the first wave of the pandemic, we made a number of changes to our "usual" approach to measurement (set out in our previous article) to ensure that we captured changes in schooling provision. At that time, we indicated that the implementation of our approach would be continuously reviewed and that our initial estimates were likely to be revised as more information became available.
This article sets out how we have measured the volume of education output over this period of rapid policy change. In so doing, it sets out the data that we have collected at each stage of the process. It also explains how we have reviewed the implementation of our approach and the changes we have made to ensure our approach has been coherent throughout.
Pre-pandemic approach to measuring education output
The National Accounts provide a conceptual framework in which to record and reconcile economic flows, and they provide a lens through which the economic impact of the coronavirus (COVID-19) pandemic might be measured. In general, transactions in goods and services in the marketplace are well served by this framework. Aggregate values of production and expenditure are usually well-aligned in these markets, as is the value of the income generated by these activities. However, it has long been recognised that non-market output is a measurement challenge. It has also been significantly impacted by the coronavirus pandemic.
Non-market output comprises the production of goods and services by the government or non-profit institutions serving households (NPISH), either supplied for free or at prices that are not economically significant. In the UK, this includes most of healthcare and education provision as well as other public services such as the courts and criminal justice systems, public order and defence, social protection, and other administrative activities.
Consistent with international guidance set out in the European System of Accounts 2010 (ESA 2010), we use separate approaches to estimate the current price value of education output and the volume of education output. The former is a component of nominal gross domestic product (GDP), while the latter contributes to the widely quoted chained volume measure (CVM) estimates. Both approaches reflect the absence of a market-based price mechanism for education services, which prevents the application of methods of valuation used in other settings.
In current price terms, we measure education output using a "sum of costs" approach: adding together the intermediate consumption, labour costs and depreciation of fixed assets associated with these activities. This covers the value of goods and services consumed in the production process as well as the costs of the factors of production. These data are generally available for central government through the Online System for Central Accounting and Reporting (OSCAR) and from local government data collections, although estimates for depreciation are calculated through our perpetual inventory model. This approach has been used consistently throughout the pandemic to inform our estimates of GDP at current prices, or in nominal terms.
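As a purely illustrative example of the sum of costs calculation, the sketch below adds the three cost categories named above; the figures are hypothetical rather than actual OSCAR or local government data.

```python
# Illustrative sum-of-costs calculation for current price education output.
# All figures are hypothetical; in practice the inputs come from OSCAR,
# local government collections and the perpetual inventory model.
education_costs = {
    "intermediate_consumption": 10.0,    # goods and services used up in production (£bn)
    "compensation_of_employees": 40.0,   # labour costs (£bn)
    "consumption_of_fixed_capital": 5.0, # depreciation of fixed assets (£bn)
}

current_price_output = sum(education_costs.values())
print(f"Current price education output: £{current_price_output:.1f}bn")
```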
In volume terms, the measurement of education output is based on cost-weighted activity indices. This involves gathering data on changes in the number of students in different educational settings (themselves intended as proxies for the number of hours of teaching provision) and weighting them together according to their relative unit costs of production. Increases in the number of students in a relatively high (low) weight activity consequently increase measured education output by a relatively large (small) amount.
Under ESA 2010 - the international statistical standards by which the UK National Accounts are compiled - we are required to estimate the volume of non-market education provided excluding changes in the quality of that provision. This means that our estimates of education output are based on information about school provision by teachers - encompassing the range of services which teachers provide, including education and childcare services - rather than the quality of learning students receive. Our estimates of the volume of education output are therefore designed to capture the quantity of teacher pupil hours provided, and do not respond to changes in teaching methods. We publish estimates of quality-adjusted education output - consistent with the wider System of National Accounts (SNA) 2008 - separately.
To implement this approach, we gather information on the number of students in eight different educational settings - ranging from nursery schools to secondary schools, special schools to teacher training courses - from England and the devolved administrations. To measure school activity, which is the largest part of education output, we use a wide range of data sources including annual estimates based on school censuses to inform our estimates.
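To make the cost-weighted activity index concrete, the sketch below uses two hypothetical settings; the student numbers and unit costs are illustrative and do not reproduce the actual eight-setting calculation.

```python
# Minimal sketch of a cost-weighted activity index (illustrative values only).
# Each setting's student numbers (a proxy for teaching hours provided) are
# weighted together using that setting's relative unit cost of production.
base_students = {"primary": 4_700_000, "secondary": 3_400_000}  # hypothetical, base period
new_students = {"primary": 4_750_000, "secondary": 3_380_000}   # hypothetical, latest period
unit_costs = {"primary": 5_000, "secondary": 6_200}             # hypothetical £ per student

base_value = sum(base_students[s] * unit_costs[s] for s in base_students)
new_value = sum(new_students[s] * unit_costs[s] for s in base_students)

volume_index = 100 * new_value / base_value
print(f"Cost-weighted activity index: {volume_index:.1f} (base period = 100)")
```

Because the weights are unit costs rather than market prices, a rise in student numbers in a high-cost setting raises the index by more than the same rise in a low-cost setting, as described above.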
Methodology: our response to the measurement challenges for March to July 2020
In the first wave of the pandemic, as schools across the UK were largely closed to in-person attendance, we modified our approach to measurement to count both in-person attendances and remote learners. A measure of education output which excluded the services provided to remote learners would ignore a large portion of teaching activity and likely underestimate the "true" value of education output. Each in-school attendance continued to count as one full-time equivalent (FTE).
To include remote learners, we recognised there was a spectrum of options, ranging from:
- assuming that remote and in-person learning were equivalent (formally, setting one remote learning student and one in-person attendance in school both equal to one FTE)
- excluding remote learning entirely (setting each remote learner equal to zero FTE)
Neither of these assumptions reflects the reality of the education delivered to remote learners. Instead, we sought data that would enable us to estimate a "discount" for remote learning. This discount would capture the instruction that was being provided by teachers, despite the change in the location of that instruction. It would ensure that remote provision would contribute to education output, but likely at a lower rate than in-person provision.
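As a rough illustration of how such a discount would enter the activity measure, the sketch below uses hypothetical shares: in-person attendances count as one FTE each, absences as zero and remote learners at a discounted FTE somewhere between the two.

```python
# Illustrative aggregation of student FTEs within a single setting.
# The shares and the remote learning discount are hypothetical.
in_person_share = 0.05     # proportion of students attending school in person
remote_share = 0.90        # proportion learning remotely
absent_share = 0.05        # absences (including sickness) count as zero FTE
remote_fte_discount = 0.5  # hypothetical FTE factor applied to each remote learner

average_fte_per_student = (in_person_share * 1.0
                           + remote_share * remote_fte_discount
                           + absent_share * 0.0)
print(f"Average FTE per enrolled student: {average_fte_per_student:.2f}")
```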
To that end, we used data on two features of the education system during the pandemic.
Changes in hours worked: Teachers working fewer hours than "normal" during school closures (Table 1) were considered indicative of lower input provided to remote learning students compared with "normal". We assumed that this fall in input was indicative of reduced output for remote learners. This assumption was reasonable as in-school attendance was very low (approximately 1% in April) and teachers were largely teaching in one mode (either to remote learners or to students in school).
Dependence on parents and guardians: The fraction of learning dependent on parents and guardians should count towards household production, rather than as education output, as set out in international guidance.
Using a novel sampling frame, we gathered data on these two quantities from teachers, via Teacher Tapp (Note 1), and used them to inform an FTE discount that we applied to all remote learners. As set out in our previous article, this discount was larger for primary school students - where the fall in teacher hours worked was larger, and where the role of parents was considered more important in instruction - than for secondary schools. These trends were very similar throughout the first lockdown (Table 1) and resulted in a fairly stable, if rising, FTE for remote learners.
| | Primary: April | Primary: May | Primary: June | Secondary: April | Secondary: May | Secondary: June |
|---|---|---|---|---|---|---|
| Hours worked | 26.3 | 29.9 | 36.6 | 27.3 | 29.9 | 32.5 |
| Difference to normal | -11.0 | -8.0 | -5.2 | -9.0 | -7.4 | -6.0 |
| % of normal input | 71% | 79% | 88% | 75% | 80% | 84% |
| Parental involvement | 40% | 40% | 34% | 9% | 9% | 8% |
| FTE | 43% | 47% | 58% | 68% | 73% | 77% |
Table 1: Changes in teacher hours and the importance of parental instruction
Notes for Overview of measuring education output and our initial response from March to July 2020:
- A survey app run by Educational Intelligence Limited. This company was established by a team with experience from the academic sector, schooling and education journalism. Teacher Tapp maintains a sample frame of teachers across England in a range of different educational settings. Their smartphone app poses questions to teachers on a daily basis. The results are weighted using the English School Workforce Census (on the basis of the sex, age and leadership status of teachers and on the region and setting of the school) and the resulting intelligence is used to inform policy debates.
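As an illustration of how these two adjustments could be combined, the following minimal sketch assumes that the remote learning FTE factor is the product of the share of "normal" teacher input and the share of instruction not dependent on parents, which is consistent with the FTE figures published in Table 1; the function name and structure are illustrative only.

```python
# Illustrative first-lockdown FTE discount for remote learners, combining the
# share of "normal" teacher hours worked with the share of instruction that did
# not depend on parents or guardians (input values taken from Table 1).
def remote_fte(input_share: float, parental_share: float) -> float:
    """FTE factor for a remote learner, assuming the two adjustments multiply."""
    return input_share * (1 - parental_share)

# April 2020, primary: 71% of normal input, 40% parental involvement.
print(f"Primary, April 2020:   {remote_fte(0.71, 0.40):.2f}")  # ~0.43, matching Table 1
# April 2020, secondary: 75% of normal input, 9% parental involvement.
print(f"Secondary, April 2020: {remote_fte(0.75, 0.09):.2f}")  # ~0.68, matching Table 1
```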
3. Overview of measurement challenge and response for September to December 2020
Following the school summer break, schools re-opened for general in-person attendance. While the precise timings varied across the UK, this facilitated a significant increase in the attendance rate (Figure 1). Students were required to return to “remote learning” only when the virus was detected in their “bubble”.
Figure 1: As schools reopened in September 2020, attendance increased from the very low level seen in June 2020 to a near-normal level
England, Northern Ireland, Scotland and Wales, April to June, September to December 2020
Source: Office for National Statistics
The higher in-person attendance resulted in a large rebound in education output, as each of the reported in-person attendances counted as one full-time equivalent (FTE). The increase in attendance, from between 2% and 10% in June to between 84% and 91% in September, brought it close to the normal attendance rate of around 95%.
As some schools were now maintaining provision through two modes - in-person and online (under a new legal duty to provide education for children unable to attend because of coronavirus) - teacher labour input increased between June and September. In September, reported teacher hours were around 49 and 47 in primary and secondary schools respectively, up considerably on the level during the first lockdown, and on our reported "normal" hours measure (Figure 2 and Table 2).
Figure 2: Average teaching hours increased from June to September as schools reopened
England, April to June, and September to December 2020
Source: Teacher Tapp
These developments presented a challenge to the implementation of our approach for remote learners. Under full school closures, it was reasonable to assume that changes in teacher hours reflected changes in the amount of input received by remote learning students. However, the observed longer working hours in September 2020 - partly a consequence of needing to teach both in-person and remote learning students - no longer provided a good proxy for the amount of input received by remote learners.
Methodology: our response to the measurement challenges for September to December 2020
Although we continued to collect data on teacher hours worked, these circumstances demanded a further refinement to the implementation of our approach for remote learners. We maintained our question about the importance of parents and guardians to remote instruction, and added an additional question designed to identify whether remote learners were receiving the same quantity of instruction as those attending in-person. Specifically, we used "Teacher Tapp" to ask teachers:
Thinking of students learning remotely today, to what extent did they have learning materials enabling them to cover the same content as pupils who were learning in school?
Our interest in the quantity of instruction - rather than its quality or impact (Section 2) - informed the design of this question. Here, we focused on the "materials provided to students", rather than "how much learning students have done".
The results of this survey indicated that while the level of teacher labour input had risen between June and September 2020, the content provided to remote learners could only cover a proportion of material covered in the classroom (Table 2). As confirmed in subsequent waves of collection, this position was more acute in primary schools than in secondary schools: the materials provided to remote learning primary (secondary) school students enabled them to cover around 56% (72%) of the material covered by in-person attendees over this period.
Primary schools

| | September | October | November | December |
|---|---|---|---|---|
| Hours worked | 48.5 | 46.1 | 47.9 | 47.3 |
| Difference to average normal hours worked between April and June 2020 | 9.5 | 7.0 | 8.9 | 8.3 |
| % of normal input (comparing to 'in class') | 52% | 58% | 58% | 56% |
| Parental involvement | 37% | 34% | 40% | 34% |
| FTE | 33% | 38% | 35% | 37% |

Secondary schools

| | September | October | November | December |
|---|---|---|---|---|
| Hours worked | 47.4 | 45.5 | 46.9 | 46.5 |
| Difference to average normal hours worked between April and June 2020 | 10.1 | 8.1 | 9.5 | 9.1 |
| % of normal input (comparing to 'in class') | 68% | 72% | 74% | 73% |
| Parental involvement | 9% | 8% | 7% | 8% |
| FTE | 62% | 66% | 68% | 67% |
Table 2: Changes in teacher hours, the extent of materials provided to remote learning students and the importance of parental instruction
As the reported increase in average teacher hours worked from September to December 2020 partly reflected the demands of teaching students in school and those learning remotely, we changed our discount for remote learners to avoid capturing this increase in our remote learning assumptions. For the small number of students who were learning remotely over this period, we replaced our inference about the labour input provided by teachers with the information provided by our supplementary question on materials. We continued to apply our adjustment based on the importance of parental instruction.
Given the small number of remote learners once schools reopened in September, and the fact that the two methods could not be accurately compared for the same period with the data available, we decided there was insufficient information to align the two methods at this point. Information on the alignment of the two methods is covered in Section 5: A consistent approach: aligning policy and measurement regimes.
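If the autumn term calculation takes the same multiplicative form as in the first lockdown, with the share of in-class content covered by remote learning materials replacing the teacher hours ratio, the discount can be sketched as follows; this is an illustration consistent with the FTE figures in Table 2, not the production code.

```python
# Illustrative autumn term 2020 FTE discount: the share of in-class content
# covered by remote learning materials replaces the teacher hours ratio, while
# the parental involvement adjustment is retained (values taken from Table 2).
def remote_fte_autumn(materials_vs_in_class: float, parental_share: float) -> float:
    return materials_vs_in_class * (1 - parental_share)

# September 2020, primary: materials covered ~52% of in-class content, 37% parental involvement.
print(f"Primary, September 2020:   {remote_fte_autumn(0.52, 0.37):.2f}")  # ~0.33
# September 2020, secondary: materials covered ~68%, 9% parental involvement.
print(f"Secondary, September 2020: {remote_fte_autumn(0.68, 0.09):.2f}")  # ~0.62
```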
4. Overview of measurement challenge and response for early 2021
Measurement challenges
The return to a full closure of schools in England at the start of January 2021 presented a further challenge to the implementation of our approach. Specifically, the results from our “materials” question – intended to capture the proportion of “normal” content provided to remote learners – suggested that remote and in-class attendees received very similar materials in January and February 2021. This indicated that schools may have used the remote learning content for both remote and “in-person” learners during the early months of 2021. To investigate this, a new question on whether in-class learners were taught using the same material as remote learners during January and February 2021 was asked retrospectively in March 2021. The responses confirmed this interpretation, indicating that only a small proportion of in-class learners were taught using different materials to remote learners.
Figure 3: There was a noticeable increase in the proportion of in-class content covered by remote learning materials in January, remaining high in February 2021
England, September 2020 to February 2021
Source: Teacher Tapp
Notes:
- Between September and December 2020 schools across the UK reopened for general in-person attendance.
- From January to early March 2021, schools were again largely closed to in-person attendance.
These results indicated to us that estimates of the full-time equivalent (FTE) factor applied to remote learners for January might be inflated, not because the coverage of the remote learning materials provided had increased, but because the content covered in-class had fallen considerably. This effect was particularly marked in primary schools: between December and January, the proportion of in-class content covered by the remote learning materials provided increased from around 55% to around 85%, and it remained high in February.
To understand this development better, we posed a further question to teachers in February 2021, asking the extent to which the remote learning materials provided in February allowed students to cover the same content as at this stage in a "normal" year:
Thinking of students learning remotely today, to what extent did they have learning materials enabling them to cover the same content as pupils at this point in a normal pre-pandemic year?
The results of the new question suggested that the learning materials provided during the latest full school closure (from January 2021) were insufficient to fully represent pre-pandemic normal conditions. This result is consistent with the apparent improvement in the coverage of remote learning materials in early 2021 being driven less by a change in their content over this period, and more by a change in the content covered by in-person learners (Figure 4). Our early estimates indicate that once schools reopened in March, the difference between the results of "comparing to in class" and "comparing to normal" questions reduced considerably.
Figure 4: The proportion of content covered by remote learning materials was lower compared to normal pre-pandemic conditions than to in-class
England, February 2021
Source: Teacher Tapp
Methodology: our response to the measurement challenges for early 2021
As discussed, the renewed school closures in the third wave of the coronavirus (COVID-19) pandemic (January 2021) coincided with a marked rise in the share of classroom learning accounted for by remote learning materials. While it remains to be seen how this series behaves once data for a post-school closure period - such as March 2021 - are captured, an FTE adjustment for remote learners based on these data would likely overstate the value of remote education relative to "normal".
To that end, we modified our FTE calculation for remote learners to use the comparison to "a normal year", rather than to what is happening "in the classroom". This change was implemented prior to the publication of estimates of education output from January onwards, so there are no revisions to published estimates for this period. It reflects the change in policy regime and the ongoing lower level of provision to remote learners than normal, enabling comparisons on a consistent basis. By necessity, the adjustment for both January and February 2021 was based on information gathered in February 2021.
Despite the apparent change in the quantity of instruction in classrooms, we have continued to count in-person attendances at the same FTE rate for this period. This approach reflects our underlying concept of interest: the number of teacher-pupil hours provided. This in turn requires us to assume that changes in the quantity of instruction in school are compensated for by changes in the quantity of other services provided, retaining the same number of underlying teacher-pupil hours.
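A sketch of the early 2021 version of this calculation, under the assumption that it again multiplies the materials share (now measured against a "normal" year) by the share of instruction not dependent on parents, is given below; the input values are the ranges reported in Table 3, and the function is illustrative only.

```python
# Illustrative early 2021 FTE discount: the comparison to a "normal" pre-pandemic
# year replaces the comparison to what is happening in class (ranges from Table 3:
# primary 72 to 73% of normal content, secondary 77 to 79%).
def remote_fte_2021(materials_vs_normal: float, parental_share: float) -> float:
    return materials_vs_normal * (1 - parental_share)

print(f"Primary, early 2021:   {remote_fte_2021(0.72, 0.32):.2f}")  # ~0.49, within 48 to 50%
print(f"Secondary, early 2021: {remote_fte_2021(0.78, 0.04):.2f}")  # ~0.75, within 74 to 76%
```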
Our practical implementation of this conceptual approach has involved a number of assumptions. Some of these are a matter of expediency and will be reviewed in the coming months. It is likely that these estimates will be subject to larger revisions than normal.
5. A consistent approach: aligning policy and measurement regimes
The succession of school policy regimes and the need to measure education output as consistently as possible over the course of the coronavirus (COVID-19) pandemic have resulted in three approaches to measurement. These are summarised in Table 3, which sets out how the “discount” for remote learning was based on different information during each period. In the first phase, the full-time equivalent (FTE) discount was based on information about changes in teacher labour input. In the second phase it was based on the extent to which the materials provided to remote learners approximated the classroom experience. In the third phase it was based on the extent to which remote learning materials covered the same content as at this stage of a “normal” school year.
| | 'Normal times' | April to June | September to December | January to February |
|---|---|---|---|---|
| In person: attendance | Included (95%) | Included (1 to 10%) | Included (52 to 94%) | Included (3 to 19%) |
| In person: full-time equivalency factor | 1 | 1 | 1 | 1 |
| Absence (incl. sick) | Captured (5%) | Captured (5%) | Captured (5%) | Captured (5%) |
| Absence: full-time equivalency factor | 0 | 0 | 0 | 0 |
| Remote learners | Not applicable | Included (85 to 94%) | Included (1 to 43%) | Included (76 to 92%) |
| Remote learners: full-time equivalency factor* | | Primary (43 to 58%), Secondary (68 to 77%) | Primary (33 to 38%), Secondary (62 to 68%) | Primary (48 to 50%), Secondary (74 to 76%) |
| % of normal input | | Average teacher labour input relative to pre-pandemic: Primary (70 to 78%), Secondary (75 to 85%) | Teacher judgement on extent to which remote learning = in class: Primary (52 to 58%), Secondary (68 to 74%) | Teacher judgement on extent to which remote learning = normal: Primary (72 to 73%), Secondary (77 to 79%) |
| Proportion of instruction dependent on parents not teachers | | Estimated teacher judgement of parental contribution: Primary (34 to 40%), Secondary (8 to 9%) | Estimated teacher judgement of parental contribution: Primary (34 to 40%), Secondary (7 to 9%) | Estimated teacher judgement of parental contribution: Primary (31 to 33%), Secondary (4%) |
Table 3: Components of our FTE calculation under different measuring approaches by period

The effect of these different policy and measurement regimes on the underlying remote learning FTE assumption is shown in Figure 5; the share of students to whom it was being applied is shown in Table 3. Switching between using teacher labour input in the first lockdown and using information about remote learning materials in the autumn term introduced a discontinuity in our remote learning FTE estimate. Adapting our measurement for the further school closures and change in policy regime in the first few months of 2021 has enabled comparisons on a consistent basis.
Figure 5: Full-time equivalency assumptions for remote learners under the three different measurement regimes
England, April to June 2020, September 2020 to February 2021
Source: Teacher Tapp
Notes:
- Teacher hours worked; comparing to children 'in class'; comparing to 'normal'.
We have continued to review the implementation of our approach to the measurement of education over the course of the pandemic. As a result, we are now making some changes to align the implementation of our approach between April to June 2020 and September to December 2020, to ensure that it is as consistent as possible over this period.
Implementing our response
To align the measurement regimes between the first (March to July 2020) and second (September to December 2020) wave estimates, we asked teachers:
Thinking of students learning remotely during the first full school closures, to what extent did they have learning materials enabling them to cover the same content as pupils who were learning in school?
The results of the retrospective question (Figure 6) suggest that the education provided to remote learners during the first full school closures (March to July 2020) was comparable to, or lower than, that provided when students returned to school in September 2020. Therefore, the large downward step change in the proportion of normal input provided between June and September 2020 is likely to be a result of the difference in the implementation of our approach rather than an underlying change in the education services provided to remote learners.
Figure 6: Education for remote learners in the first full school closures was comparable to or lower than when students returned to schools in September 2020
England, April to December 2020
Source: Teacher Tapp
Notes:
- England summer break, retrospective question, comparing to children in class.
This retrospective question offers a comparable data point for the first and second waves and therefore provides a means of linking the two measurement regimes. To do this, we accept the monthly profile of changes arising from changes in teacher labour input, but benchmark this to the results of our retrospective materials question. This approach introduces some potential for mismeasurement arising from the retrospective nature of the question, but it is more likely to produce comparable estimates than using two different measurement bases.
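A minimal sketch of this linking step is below. It assumes the benchmark is applied as a single multiplicative factor that rescales the hours-based monthly profile so that its average matches a level implied by the retrospective materials question; both the benchmark value and the exact benchmarking method are assumptions for illustration, not a description of the production calculation.

```python
# Illustrative linking of the two measurement regimes for the first lockdown:
# keep the monthly profile implied by teacher hours, but rescale its level to a
# benchmark taken from the retrospective materials question. Figures are illustrative.
hours_based_input_share = {"April": 0.71, "May": 0.79, "June": 0.88}  # Table 1, primary schools
retrospective_benchmark = 0.55  # assumed level implied by the retrospective materials question

# Apply one multiplicative factor so the average of the profile matches the benchmark.
profile_average = sum(hours_based_input_share.values()) / len(hours_based_input_share)
scaling_factor = retrospective_benchmark / profile_average

aligned_input_share = {
    month: round(share * scaling_factor, 2)
    for month, share in hours_based_input_share.items()
}
print(aligned_input_share)  # {'April': 0.49, 'May': 0.55, 'June': 0.61}
```

With these assumed inputs the rescaled shares happen to fall within the revised primary range shown in Table 4 (49 to 61%); the actual benchmark level, and the precise way the factor is applied, may differ.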
Impact of changes
The impact of these changes is a much smoother discount rate for each remote learning FTE over the course of the pandemic (Table 4 and Figure 7). At a detailed level, this means that we are reducing the estimated extent to which remote learning was an effective substitute for in-person teaching at the start of the pandemic. Adapting the implementation of our approach for the further school closures and changes in policy regimes in the first few months of 2021 has enabled comparisons on a consistent basis. The latter of these adjustments, in particular, will be subject to review as more information is gathered.
| | 'Normal times' | April to June | September to December | January to February |
|---|---|---|---|---|
| In person: attendance | Included (95%) | Included (1 to 10%) | Included (52 to 94%) | Included (3 to 19%) |
| In person: full-time equivalency factor | 1 | 1 | 1 | 1 |
| Absence (incl. sick) | Captured (5%) | Captured (5%) | Captured (5%) | Captured (5%) |
| Absence: full-time equivalency factor | 0 | 0 | 0 | 0 |
| Remote learners | Not applicable | Included (85 to 94%) | Included (1 to 43%) | Included (76 to 92%) |
| Remote learners: full-time equivalency factor* | | Primary (29 to 40%), Secondary (56 to 64%) | Primary (33 to 38%), Secondary (62 to 68%) | Primary (48 to 50%), Secondary (74 to 76%) |
| % of normal input | | Estimated teacher judgement on extent to which remote learning = in class: Primary (49 to 61%), Secondary (62 to 70%) | Teacher judgement on extent to which remote learning = in class: Primary (52 to 58%), Secondary (68 to 74%) | Teacher judgement on extent to which remote learning = normal: Primary (72 to 73%), Secondary (77 to 79%) |
| Proportion of instruction dependent on parents not teachers | | Estimated teacher judgement of parental contribution: Primary (34 to 40%), Secondary (8 to 9%) | Estimated teacher judgement of parental contribution: Primary (34 to 40%), Secondary (7 to 9%) | Estimated teacher judgement of parental contribution: Primary (31 to 33%), Secondary (4%) |
Table 4: Components of our FTE calculation under consistent approach by period
Figure 7: Full-time equivalency assumption for remote learners is now smoother over the course of the COVID-19 pandemic.
England, April to June 2020, September 2020 to February 2021
Source: Teacher Tapp
Making these changes to the implementation of our approach has an important impact on the aggregate volume of education output (Figure 8). The largest impact on our education output as a result of these changes is for April to June 2020, but there is also an impact on the July to September 2020 period.
The impact on the July to September 2020 period arises from our treatment of school holidays. Our estimates of education output are designed in a manner which "looks through" the school holidays, as set out in our blog School's Out: measuring education output in the summer of the pandemic. Therefore, the reduction in education output at the end of the summer term has a smaller impact on education output from July to September 2020. There have been no changes to September to December 2020 arising directly from these methods changes (Note 1).
As a consequence of these changes, the fall in education output between January to March 2020 and April to June 2020 has increased, indicating that the impact of the first lockdown on education output was larger than initial estimates suggested. The growth in education output between April to June 2020 and July to September 2020 has also increased, suggesting a stronger recovery from the first lockdown than initial estimates suggested. Following the alignment of our methods, education output for Quarter 2 2020 in volume terms is now estimated to have fallen 36.7% and gross domestic product (GDP) 19.5%, a downwards revision of 13.6 and 0.5 percentage points respectively from the previous estimate.
Figure 8: The largest effect on our education output as a result of these changes is for Quarter 2 2020
Education final consumption expenditure, chained volume measure, seasonally adjusted, UK, Quarter 1 2019 to Quarter 4 2020
Source: Office for National Statistics – GDP estimates
Other measurement issues
In the coming months, we expect to maintain our data collection of hours worked, comparing to "in class" and comparing to "normal", to enable us to continue to respond flexibly. As schools across the UK reopen, we expect to return to informing our remote learning assumptions using comparisons to "in class" rather than "to normal". However, we will keep this approach under review.
We also anticipate that by maintaining several data collections, the implementation of our measurement regime will retain a higher level of consistency and be more robust to future school closures that follow existing policy regimes.
In the future, changes in delivery that increase the quantity of instruction - by reducing the length of holidays or increasing the length of the school day - will feed directly into the level of education output. We are monitoring how these proposals are developing to try to ensure their timely inclusion in our estimates of education output.
Notes for: A consistent approach: aligning policy and measurement regimes
- We are considering how to implement this revised approach in Quarter 1 (Jan to Mar) 2020, during which schools were only closed for a short period and prior to the collection of daily attendance data. We will bring forward the revisions implied by this aligned approach as soon as feasible, likely in the September 2021 Quarterly National Accounts release.