1. Disclaimer
This Research Output does not contain official statistics relating to household finances. Rather, this is published as an output from research which seeks to determine the feasibility of conducting further work towards an online survey of household finances.
2. Introduction
As part of the Office for National Statistics and UK Statistics Authority Business Plan for April 2019 to March 2022 (PDF, 745.73KB), the Census and Data Collection Transformation Programme (CDCTP) is leading an ambitious programme of work to put administrative data at the centre of the population, migration, and household statistical systems.
This programme of research is underpinned by the Digital Economy Act 2017, which allows us to access data directly from administrative and commercial sources for research and statistical purposes, for the public good.
In addition to administrative data, there will also remain a need for some residual survey data collection, which the Office for National Statistics (ONS) intends to be “digital by default”. In the context of household surveys, this means providing an online self-completion mode to respondents (as well as face-to-face and telephone collection). An online mode will enable respondents to provide data at their own convenience, reduce respondent burden, and reduce operational costs.
As part of the CDCTP, the Social Survey Transformation (SST) Division has been looking at the end-to-end survey designs for the current household survey portfolio. This includes the Labour Force Survey (LFS), as well as the portfolio of household finance surveys we operate.
There are currently three face-to-face finance surveys conducted by the ONS: the Living Costs and Food Survey (LCF), the Survey of Living Conditions (SLC) and the Wealth and Assets Survey (WAS). These are described in Section 5.
The main focus of this research was on establishing uptake and response for an online survey of household finances. Household finance surveys are particularly complex, and collecting this type of data online will be challenging. Many people consider their finances to be a sensitive and personal subject, so before considering how to design questions for online self-completion, some initial research to establish the feasibility of collecting this type of data online is required. The findings from the research will inform the approach to any future development of an online mode for a survey of household finances.
This research uses two main metrics to assess the engagement of households with the online survey. These are:
engagement rate – the proportion of households that at the very least entered login details for the online survey
response rate – the proportion of households that completed the survey
3. Main points
Just over one in five surveyed households (20.7%) engaged with the online survey, and almost one in six (16.5%) completed the survey; these findings indicate that people are willing to engage when asked to participate in an online survey of household finances.
Respondents who were given advance notice of the financial topics included on the survey, and the documentation needed to complete it, were less likely than others to engage with the survey or to complete it; the engagement and completion rates were 16.9% and 14.0% respectively, for those given advance notice, compared with 24.5% and 19.0% for those who were not.
Respondents who were given advance notice of the financial content and documentation needed to complete the survey provided better quality data; for example, more of these respondents were able to answer the financial questions and when they did answer these, they provided more precise figures.
4. Purpose of the research
Research already conducted by the Office for National Statistics (ONS) into the feasibility of collecting labour market data online suggests a response rate of almost 30% is achievable. This response level has resulted from an extensive programme of ongoing research and design. This programme has been running for over three years and has involved multiple tests of the best way to encourage participation in an online Labour Market Survey (LMS).
Data collected on finance surveys are arguably more complex and sensitive, and it is not possible to extrapolate from the findings of the LMS research that householders will complete finance surveys online. The small-scale test reported here aimed to provide some initial evidence to address this evidence gap.
Primary research questions
This report will assess the following research questions:
What proportion of selected households will engage with the survey (by logging into it), and respond to the survey (by completing it)?
Are households more or less likely to engage and to respond if they are given advance notice of the financial topics that the survey will cover, and of the reference documentation they will need to complete the survey?
To answer these questions, the sample for the survey was split evenly into two test groups:
A Generic test group who received advance survey materials inviting them to take part in an online survey of household finances.
A Tailored test group who received similar advance materials; they were also told which financial topics would be covered in the survey and provided with a list of documents that they should reference when completing the survey.
Secondary research questions
Testing two different types of advance materials also allowed consideration of several secondary research questions. Although these are not directly related to the primary purpose of the research, they can help inform any future work and research in this area.
This research aims to answer these secondary research questions:
Do survey materials have an impact on data quality?
Are respondents able to complete the survey more quickly if they are in the Tailored test group?
Are respondents in the Tailored group more or less likely to drop out of the survey when completing the financial sections?
Is the profile of households that respond to this survey similar to households that respond to similar Office for National Statistics (ONS) surveys?
Are respondents more likely to provide permission to contact them again in the future if they are in the Tailored test group?
In relation to data quality, five specific metrics were considered and analysed for this report. These were:
the proportion of households that say they consulted documentation when completing the survey
levels of rounding for financial variables
levels of proxy responses, where an individual completes the survey, but their responses are provided by another household member
levels of item missingness, where a respondent either refuses to answer or skips past a question
levels of unit missingness, where an individual within a responding household refuses to answer the individual questionnaire
5. Existing Office for National Statistics (ONS) household financial surveys
Living Costs and Food Survey (LCF)
The LCF is a cross-sectional survey that collects information from approximately 5,000 households each year on spending patterns and the cost of living that reflect household budgets. It is conducted across the whole of the UK and is the most significant UK survey on household spending.
The LCF provides essential information for important social and economic measures, such as:
household expenditure patterns for the Consumer Prices Index and gross domestic product (GDP) figures
detailed information on food consumption and nutrition for the Department for Environment, Food and Rural Affairs (Defra)
It is also an important source of economic and social data for a range of government and other research agencies. The results are essential for understanding society, and planning to meet its needs.
Survey of Living Conditions (SLC)
The SLC is a longitudinal survey that collects information from approximately 12,000 households each year, covering household resources, housing, labour, education, pensions, and health. It helps policymakers understand how different sections of society are managing, so that they can see the impact of policy changes and measure the state of the economy. The UK Government and EU use the results to monitor and inform policies aimed at combating poverty and social exclusion.
The SLC is used to meet an EU requirement for longitudinal Statistics on Income and Living Conditions (EU-SILC). Information collected from the SLC is sent to Eurostat (the statistical office of the EU) for this purpose.
Wealth and Assets Survey (WAS)
The WAS is a longitudinal survey that collects information on the financial well-being of households and individuals in terms of their assets, savings, debt, and planning for retirement. The survey also examines attitudes and attributes related to these. The survey provides users with the ability to measure wealth changes over time.
The data provide a greater understanding of the levels and distribution of wealth and debt across the population of Great Britain in terms of pensions, property, financial and physical assets, as well as indebtedness.
6. Data collection
For this test, data collection activities were contracted out to IPSOS-Mori, which was responsible for:
programming the online survey to Office for National Statistics (ONS) specifications
generating unique access codes for each sampled address
hosting the online survey
delivering the data to us
producing the paradata report
We delivered all other aspects of the survey:
sample design and selection
development of survey materials and incentives
development of survey questions, including layout and routing
data storage
data analysis
Sampling
To ensure data were collected from a wide range of respondents, a systematic random sample of 16,320 addresses in England and Scotland was selected; the sample was restricted to England and Scotland for operational reasons. The sample was drawn from AddressBase, an Ordnance Survey and GeoPlace product comprising local authority, Royal Mail and Council Tax data, available to the ONS under the Public Sector Mapping Agreement. This product will have future use in sampling ONS address-level surveys such as the census or social surveys. Currently, the Postcode Address File (PAF), a list of all addresses to which Royal Mail deliver mail, is used as the sampling frame for social surveys.
The sample was equally split into two treatment groups, the Tailored group and Generic group, with each group containing 8,160 addresses. The Tailored group were given advance notice of the financial topics that the survey would cover, and a list of documentation that would be needed to help with completion.
In contrast, the Generic group were given less detailed advance information.
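The report does not detail the sampling algorithm beyond describing a systematic random sample split evenly between the two treatment groups. The Python sketch below illustrates one common way such a sample could be drawn and allocated; the address frame, sampling interval and random seeds are illustrative assumptions, not the ONS implementation.

```python
import random

def systematic_sample(frame, sample_size, seed=2019):
    """Draw a systematic random sample: take a random start point within the
    first sampling interval, then every k-th address (k = frame size / sample size).
    Illustrative only -- not the ONS AddressBase sampling implementation."""
    rng = random.Random(seed)
    k = len(frame) / sample_size            # sampling interval
    start = rng.uniform(0, k)               # random start within the first interval
    picks = [int(start + i * k) for i in range(sample_size)]
    return [frame[p] for p in picks]

def split_into_treatment_groups(sample, seed=2019):
    """Randomly allocate half the sampled addresses to the Tailored group and
    half to the Generic group (8,160 each for a sample of 16,320)."""
    rng = random.Random(seed)
    shuffled = sample[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"Tailored": shuffled[:half], "Generic": shuffled[half:]}

# Hypothetical usage with a dummy frame of address identifiers
frame = [f"ADDR{i:07d}" for i in range(1_000_000)]
sample = systematic_sample(frame, 16_320)
groups = split_into_treatment_groups(sample)
print(len(groups["Tailored"]), len(groups["Generic"]))  # 8160 8160
```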
Test incentive
The invitation letter pack also contained an unconditional incentive for each sampled address: a tote bag (a reusable canvas bag with a colour-printed graphic representing statistics produced by the ONS on one side). This type of incentive is unique to the Labour Market Survey (LMS) and this household financial survey (HFS) and forms part of the test; it is not currently in use for the Labour Force Survey (LFS) or any other social survey.
Collection period
Sampled households were told that the survey would be open from 28 June 2019 to 19 July 2019. The survey was kept open until 29 July 2019, to give respondents the opportunity to submit late responses.
Respondent materials
Different advance materials were developed for each treatment group. These were designed in a user-centred way by the Research and Design team within the ONS Social Survey Transformation (SST) Division. The format was based upon well-established readability and accessibility principles from previous studies.
Sampled addresses received:
a pre-notification letter nine days before the survey launch date
an invitation letter pack two days before the test launch date, which included a survey invitation letter, a tote bag, and a thank you slip
a reminder letter pack, sent to non-responding households 14 days after the test launch date
The optimum time to send a reminder letter is seven days after the test launch; however, printing schedules did not allow for this.
Copies of the survey materials are available by emailing HFS.Transformation@ons.gov.uk.
Access and security
In the invitation letter, households were provided with the web address to an ONS landing page. This helped make it clear that the survey was a genuine ONS survey. When this page was visited, the respondent clicked a button which redirected them to a webpage on the IPSOS-Mori website. Respondents were then asked to enter their unique household access code to securely log in to the online survey.
When a respondent first logged in, they were asked to record the names and sexes of all members of the household. Additional questions then confirmed that all of the recorded people used the address as their main residence, and established how the household members knew and/or were related to each other.
One household member was then asked to complete the household section of the survey. If the household logged out and then re-entered the survey, the responses that had previously been recorded were locked down and only the most recently opened survey page would be visible. This maintained data security as it prevented another person or household accidentally viewing the previously entered information if they were to pick up correspondence with the unique access code.
Once the household information was completed, the individuals in the household could select their name from the list of household members to complete their own individual section. Responses were locked down in the same way if an individual exited the survey. This prevented other household members from inadvertently accessing sensitive information entered by other household members.
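The report describes this lock-down behaviour but not how it was implemented. The following is a purely conceptual sketch of the idea that, on re-entry, only the most recently opened page is served and earlier responses stay hidden; all class and method names are hypothetical and this is not the IPSOS-Mori implementation.

```python
class SurveySession:
    """Conceptual illustration of the lock-down behaviour described above:
    once a page has been answered, re-entering the survey with the household
    access code exposes only the most recently opened page, never the earlier
    responses. Hypothetical names only."""

    def __init__(self, access_code, pages):
        self.access_code = access_code
        self.pages = pages                 # ordered list of page identifiers
        self.answers = {}                  # page -> stored response (never re-displayed)
        self.current_index = 0             # most recently opened page

    def submit_page(self, page, response):
        self.answers[page] = response      # stored server-side only
        self.current_index = min(self.current_index + 1, len(self.pages) - 1)

    def resume(self):
        """Return only the most recently opened page; earlier answers stay locked."""
        return self.pages[self.current_index]

# Hypothetical usage
session = SurveySession("ABC123", ["household_grid", "accommodation", "council_tax"])
session.submit_page("household_grid", {"people": 2})
print(session.resume())   # 'accommodation' -- previously entered data is not shown
```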
Survey questionnaire
The main purpose of this test was to establish engagement levels of households when asked to participate in an online survey of household finances. The survey content was designed to ensure that it covered the topics specified in the survey materials. The survey questions were not optimised to provide estimates comparable with those collected on existing surveys.
There is a considerable amount of content in the existing ONS surveys of household finances that overlaps with concepts collected as part of the experimental mixed-mode Labour Market Survey (LMS). LMS questions were used for this survey where such overlaps existed, as the LMS content has already been extensively tested to ensure it is optimised for online collection.
The remaining content underwent an at-desk expert review. Survey questions were developed based on well-established redesign principles and Government Digital Service (GDS) principles and service standards. These include:
being optimised for readability using desktop computers and mobile devices, particularly with regards to font size and navigation buttons
designing questions without additional instructions; for online surveys, no interviewer is present to clarify definitions
employing as few checks on survey responses as possible
limiting the use of progress bars
designing the content so that it does not use tables or grids
Question block | Household or individual level | Design | Description |
---|---|---|---|
Household structure | Household | Re-used LMS content | Recorded which individuals live in the household and how these individuals were related to each other. |
Accommodation and housing | Household | Mixture of reused LMS content and redesigned content | Collected information on housing tenure, mortgage and rent payments, Council Tax and utilities. |
Socio-demographics and well-being | Individual | Re-used LMS content | Recorded information on age, marital status, nationality, ethnicity, religion, wellbeing and health. |
Employment and economic status | Individual | Re-used LMS content | Recorded information about whether the respondent worked, and their job and employment details. |
Income from employment | Individual | Redesigned content | Recorded income from employment and self-employment. |
Education | Individual | Redesigned content | Recorded information about any education currently taking place and educational qualifications. |
Income from pensions | Individual | Redesigned content | Recorded income from pensions, including State Pension and personal pensions. |
Income from benefits | Individual | Redesigned content | Recorded income from state benefits. |
Savings | Individual | Redesigned content | Recorded information on the amounts held in savings accounts. |
Table 1: List of question blocks collected on the survey
The full questionnaire is available on request by emailing HFS.Transformation@ons.gov.uk.
7. Analysis
Approach
The primary focus of this research was to analyse levels of engagement and response, and to assess whether these were influenced by the content of the advance survey materials. We therefore did not apply any weighting to the responses in the analysis dataset, nor did we undertake any data cleaning or editing.
Where differences have been described in the write-up of this research, they have been tested for statistical significance and found to be significant at the 5% significance level.
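The report does not state which test was applied. A two-proportion z-test is one standard choice for comparing rates between two groups of this size; the sketch below applies it to the engagement rates, with counts back-calculated from the published percentages and an assumed eligible group size of about 7,744 addresses per group, so the figures are approximate illustrations only.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Two-sided two-proportion z-test; returns (z statistic, p-value)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Approximate counts implied by engagement rates of 24.5% (Generic) and
# 16.9% (Tailored) on an assumed 7,744 eligible addresses per group
z, p = two_proportion_z_test(1897, 7744, 1309, 7744)
print(f"z = {z:.2f}, p = {p:.4f}")   # p is well below the 0.05 threshold
```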
Will households engage with an online survey of household finances?
The primary purpose of this test was to establish whether households would engage when asked to participate in an online survey of household finances. Two main metrics were analysed:
engagement – the household engaged with the online survey by logging into it
survey completion – the household completed the survey
The test also aimed to establish whether households would be more or less likely to engage and complete the survey if they were told in advance the financial topics the survey would cover. This involved households being made aware of the documentation that would be needed to complete these aspects of the survey.
Figure 1: The Generic group had higher engagement and completion rates than the Tailored group
Engagement rate for whole sample, Generic and Tailored group
Source: Office for National Statistics
Figure 1 shows that the engagement rate across the whole sample was 20.7% and the completion rate was 16.5%.
Households in the Generic test group were more likely to both engage with and complete the survey. At 24.5%, the engagement rate for the Generic group was 7.6 percentage points higher than the 16.9% seen in the Tailored group. Likewise, the 19.0% completion rate seen in the Generic group was 5.0 percentage points higher than the 14.0% seen in the Tailored group.
As there was no face-to-face follow-up for this survey, it was not possible to establish which sampled addresses were ineligible. Work on previous studies indicates 5.1% of addresses selected for household surveys are found to be ineligible. This 5.1% ineligibility rate was accounted for when calculating the engagement and completion rates for this survey.
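The exact adjustment is not spelled out; a straightforward interpretation, assumed in the sketch below, is to divide the number of engaging (or completing) households by the estimated number of eligible addresses (sampled addresses reduced by the 5.1% ineligibility rate). The household counts shown are hypothetical values chosen to roughly reproduce the published rates.

```python
def adjusted_rate(households, sampled_addresses, ineligibility_rate=0.051):
    """Rate per estimated eligible address: the sampled-address count is
    deflated by the assumed ineligibility rate before being used as the
    denominator. Assumed formula -- not taken from the report."""
    eligible = sampled_addresses * (1 - ineligibility_rate)
    return households / eligible

# Hypothetical counts for a sample of 16,320 addresses
sampled = 16_320
engaged = 3_206      # hypothetical number of households that logged in
completed = 2_556    # hypothetical number of households that finished

print(f"engagement rate: {adjusted_rate(engaged, sampled):.1%}")    # ~20.7%
print(f"completion rate: {adjusted_rate(completed, sampled):.1%}")  # ~16.5%
```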
The achieved engagement and completion levels are high enough to indicate that respondents will engage when asked to participate in an online survey of household finances. The results show that higher engagement and overall completion levels can be expected when generic survey materials are used rather than tailored materials. The sensitive nature of financial data is likely to be a factor in the differences observed between the test groups.
Do survey materials have an impact on data quality?
Although this was not a statistical test of data quality, it was still possible to establish some quality metrics on the data. This included:
the proportion of households that said they consulted documentation when completing the survey
levels of rounding for financial variables
levels of proxy responses, where an individual completes the survey but their responses are provided by another household member
levels of item missingness, where a respondent either refuses to answer or skips past a question
levels of unit missingness, where an individual within a responding household refuses to answer the individual questionnaire
What proportion of households consult documentation when completing the survey?
Financial data in a household setting are notoriously difficult to collect accurately. Difficulties include:
recall, where respondents have difficulty remembering exact amounts for the values being requested
definition, where respondents do not understand what it is they are being asked to provide
The use of reference documentation can help address these difficulties. For example, a respondent can use their payslip to provide accurate amounts of gross and net pay for a given time period. Higher levels of document consultation indicate higher levels of data quality.
Figure 2: The Tailored group were more likely to say they referred to documentation than the Generic group
Proportion of respondents that checked documents by treatment group
Source: Office for National Statistics
Figure 2 shows that 57.3% of the Tailored test group referred to documentation during the questionnaire, compared with 35.7% of those in the Generic group, a difference of 21.6 percentage points.
Figure 3: The Tailored group were more likely to refer to both paper and electronic documents than the Generic group
Proportion of respondents who refer to paper or electronic documents by treatment group
Source: Office for National Statistics
Figure 3 shows that 36.6% of households in the Tailored group referred to paper documentation, and 36.7% referred to electronic documentation. This compares with 22.7% of households in the Generic group consulting paper documentation and 21.1% referring to electronic documentation. Those in the Tailored group were also more likely to have referred to both paper and electronic documentation during the survey.
These differences reflect the large difference in any reference to documentation observed between the groups and indicate, for this measure of quality, that households in the Tailored test group performed better than those in the Generic group.
Levels of rounding for financial variables
When asked to provide a value for a financial variable, respondents could enter exact amounts (to the nearest penny) or could round to whatever level they deemed appropriate. Lower levels of rounding provide more precise values and indicate higher levels of data quality.
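The report does not describe how the level of rounding was derived from each reported amount. One simple heuristic, sketched below, is to classify each value by the coarsest denomination it is an exact multiple of; the function and category labels are assumptions for illustration.

```python
def rounding_level(amount_pounds):
    """Classify a reported amount by the coarsest level it appears rounded to.
    Illustrative heuristic only: an exact multiple of £100 is treated as
    rounded to the nearest £100, and so on down to the nearest penny."""
    pence = round(amount_pounds * 100)
    if pence % 10_000 == 0:
        return "nearest £100"
    if pence % 1_000 == 0:
        return "nearest £10"
    if pence % 100 == 0:
        return "nearest £1"
    return "exact (to the penny)"

for value in (1200.00, 350.00, 87.00, 63.47):
    print(value, "->", rounding_level(value))
```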
Figure 4: The Tailored group were more likely to provide precise values for their gas and electricity bills than the Generic group
Proportion of respondents who provided rounded answers to their gas and electricity bills, by treatment group
Source: Office for National Statistics
Figure 4 shows that 10.5% of households in the Tailored group provided exact answers, compared with 5.7% of those in the Generic group. In the Tailored group, 67.0% provided the answer to the nearest pound or better, compared with 58.2% in the Generic group.
This pattern is representative of that observed for other household financial variables, such as Council Tax and mortgage payments. The Tailored group consistently provided answers at finer levels of rounding than the Generic group. The background tables provide further detail for each household variable.
Figure 5: The Tailored group were more likely to provide precise values for their employee pay than the Generic group
Proportion of respondents who provided rounded answers to their employee pay, by treatment group
Source: Office for National Statistics
Figure 5 shows that 12.7% of those in the Tailored group provided a precise response to employee pay, and 33.1% rounded to the nearest pound or better. This compares with 8.0% of the Generic group providing precise answers and 28.1% providing their response to the nearest pound or better.
Figure 6: The Tailored group were more likely to provide values for their self-employed income to the nearest £100 or better than the Generic group
Proportion of respondents who provided rounded answers to their self-employed income, by treatment group
Source: Office for National Statistics
Figure 6 shows that fewer than 1 in 10 respondents provided values for their self-employed pay to the nearest £10. More of those in the Tailored group provided their self-employed pay to the nearest £100 or better: 27.3% compared with 15.8% for the Generic group.
Similar patterns were generally seen across the benefit and pension variables. These results indicate that those in the Tailored group were more likely to provide financial values to better levels of precision than those in the Generic group, indicating that they provided higher quality data.
Levels of proxy responses
A survey has been completed by proxy when the individual questionnaire has been completed, but the responses for that section have been supplied by another member of the household. Studies have shown that proxy responses are less accurate than personal responses. This can certainly be the case for financial data, where individuals within a household may not have a full picture of the finances of their fellow household members. Higher proxy rates are associated with poorer levels of data quality.
Just under one in five of those who answered the individual section had their answers provided by proxy – broadly consistent across both test groups. This level of proxy response is similar to that seen in other surveys (Figures 7 and 8).
Figure 7: The likelihood that an individual interview would be completed by proxy was not affected by test group
Proportion of individual responses completed by proxy
Source: Office for National Statistics
Figure 8: This survey delivered similar proxy rates to comparable surveys
Proportion of proxy responses for household financial survey (HFS), Labour Market Survey (LMS) and Living Costs and Food Survey (LCF)
Source: Office for National Statistics
Item non-response
Item non-response is where a respondent does not enter a value for a specific survey question. High levels of item non-response are associated with poorer data quality, as these missing data values will need to be estimated using statistical techniques.
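As an illustration of the metric behind Figure 9, the sketch below flags any record with at least one missing value across a set of financial items; the field names and the use of pandas are assumptions rather than the actual processing pipeline.

```python
import pandas as pd

# Hypothetical extract of response data; None marks a skipped or refused item
records = pd.DataFrame({
    "gross_pay":      [2100.0, None,   1850.0, 2400.0],
    "net_pay":        [1650.0, 1500.0, None,   1900.0],
    "benefit_income": [0.0,    120.0,  0.0,    None],
})

financial_fields = ["gross_pay", "net_pay", "benefit_income"]
has_missing = records[financial_fields].isna().any(axis=1)   # True if any item is missing

print(f"{has_missing.mean():.1%} of records have at least one missing financial value")
```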
Figure 9: The Generic group were less likely to provide complete responses to survey sections than the Tailored group
Proportion of records which had some missing values
Source: Office for National Statistics
Figure 9 shows that respondents in the Generic test group were more likely than those in the Tailored group to provide incomplete information across each of the financial sections of the questionnaire (household bills, employee pay, self-employed pay, pension income and benefit income).
Levels of completion were broadly the same for non-financial sections, such as demographics and unemployment.
These findings suggest that receiving the tailored set of advance survey materials led to better rates of item completion across the financial sections of the survey.
It should be noted that the high incompletion rates in the employee pay section were a result of respondents being asked to provide income both before and after tax. The vast majority of respondents (93.7%) were able to supply information for one of these, but many could not provide both pieces of information. Although these questions were not fully optimised for online collection, it is recommended that any future research into online collection of employee pay data attempts to address this.
Unit non-response
Unit non-response occurs when an individual within a responding household refuses or fails to complete any part of their individual survey. High unit non-response is associated with poorer data quality, as the values for these individuals need to be estimated using statistical techniques.
Figure 10: The Tailored group had lower rates of unit non-response than the Generic group
Proportion of unit non-response by test group
Source: Office for National Statistics
Figure 10 shows that the rate of unit non-response was lower for individuals in the Tailored test group (7.2%) than it was for individuals in the Generic test group (10.7%).
Conclusions drawn from assessing measures of data quality between the test groups
The primary analysis showed that the Generic test group was associated with higher engagement and completion levels for the survey. However, the measures of data quality indicate that the Tailored test group provided higher quality data.
Most of the quality measures assessed as part of this research suggested that the Tailored group provided better quality information, and none of the measures suggested this group provided poorer quality information.
The Tailored group performed better in terms of:
reference to financial documentation when completing the survey
providing financial values to more precise levels
levels of item non-response for financial variables
levels of unit non-response
It is recommended that more detailed research into the trade-off between response levels and data quality should take place to determine which type of survey materials is associated with better overall data quality.
Are respondents able to complete the survey more quickly if they are in the Tailored test group?
Longer surveys impose a larger burden on survey respondents. This in turn can lead to lower response rates. It is also important, in the interests of openness and transparency, that respondents are adequately informed of the likely time burden. There are two main questions to answer:
Were the observed lengths of time taken to complete the questionnaire in line with the researchers’ expectations?
Did completion times differ between the Tailored and Generic test groups?
Our researchers estimated that the survey would take approximately 20 to 30 minutes for the whole household to complete.
Figure 11: The median time to complete the survey was around 25 minutes
Median time to complete whole questionnaire, by test group
Source: Office for National Statistics
Figure 11 shows that the median time taken to complete the survey was around 25 minutes (25 minutes and 4 seconds). This was consistent between the Generic and Tailored test groups and was also in line with the information provided to households in the survey materials.
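Completion times of this kind would typically be derived from the paradata timestamps. The minimal sketch below shows one way a median household completion time could be computed; the timestamp values and layout are hypothetical.

```python
from datetime import datetime
from statistics import median

# Hypothetical paradata: first login and final submission per household
paradata = [
    ("2019-06-28 10:02:11", "2019-06-28 10:26:40"),
    ("2019-06-29 19:15:03", "2019-06-29 19:41:55"),
    ("2019-07-01 08:30:00", "2019-07-01 08:52:12"),
]

fmt = "%Y-%m-%d %H:%M:%S"
durations_min = [
    (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds() / 60
    for start, end in paradata
]

print(f"median completion time: {median(durations_min):.1f} minutes")
```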
Figure 12: Over 60% of households completed the survey in less than 30 minutes
Length of time for households to complete the survey, by treatment group
Source: Office for National Statistics
In all, 60.4% of households completed the survey in less than 30 minutes. This was broadly similar across both treatment groups.
The findings for individuals were consistent with those for the full questionnaire. The median time taken for individuals to complete the questionnaire was seven minutes and 12 seconds, and was consistent between the Generic and Tailored test groups.
These findings suggest that researchers’ estimates of the time burden placed on respondents were broadly accurate: the median time taken for households to complete the survey was near the middle of the time range stated in the advance materials, and six out of ten households were able to complete the survey within the upper limit stated in the materials. The materials did not have an impact on the time taken to complete the survey at either household or individual level.
Are there certain sections of the questionnaire where respondents drop out of the survey?
The researchers wanted to look at whether those in the Tailored test group were more likely to drop out of the survey than those in the Generic group, and in particular whether being forewarned of the financial topics covered in the survey affected the likelihood that someone would drop out. Higher drop-out rates are associated with poorer quality data.
Figure 13: The Generic group were more likely to drop out of the survey than the Tailored group
Proportion of households that dropped out of the survey before completing
Source: Office for National Statistics
Figure 13 shows that 22.5% of households in the Generic group dropped out of the survey compared with 17.4% of those in the Tailored group.
It should be noted that although those in the Generic group were more likely to drop out of the survey once they had started it, this group still achieved a higher overall completion rate than the Tailored group (19.0% and 14.0% respectively).
Is the profile of households that respond to this survey similar to households that respond to similar ONS surveys?
Demographics – comparisons with other surveys
A useful metric when assessing the feasibility of an online-first approach to the collection of household financial data is the type of people who engage with the survey, and how this differs from other similar ONS surveys and from wider population estimates. It is also possible to see whether there were any considerable differences between the two test groups.
Estimates used in this section are drawn from respondents to the Living Costs and Food Survey (LCF) 2015 to 2018, the online Labour Market Survey test (LMS Test 3) and population estimates from the Annual Population Survey (APS).
Figure 14: This survey had similar gender profiles for respondents compared with other similar surveys
Proportion of respondents by gender for household financial survey (HFS), Labour Market Survey (LMS), Living Costs and Food Survey (LCF) and population estimates
Source: Office for National Statistics
Figure 14 shows that this survey delivered a similar gender profile when compared with other similar surveys, such as the experimental mixed-mode Labour Market Survey (LMS) and the existing Living Costs and Food Survey (LCF).
Figure 15: Older individuals were overrepresented in this survey
Proportion of respondents by age for household financial survey (HFS), Labour Market Survey (LMS), Living Costs and Food Survey (LCF) and population estimates
Source: Office for National Statistics
Figure 15 shows that the age profile of those in the responding households was similar between this survey and other similar surveys such as the experimental mixed-mode Labour Market Survey. However, for this survey the under-representation of the 16 to 24 years age group, and over-representation of the 65 years and over age group, were amplified.
Figure 16: Consistent with other ONS surveys, non-white respondents are slightly underrepresented
Proportion of respondents by ethnicity for household financial survey (HFS), Labour Market Survey (LMS) and Living Costs and Food Survey (LCF)
Source: Office for National Statistics
When broken down by ethnicity, the profile of respondents to this survey was similar to that seen in other similar surveys.
Conclusions
The profile of the responding households to this survey was in line with similar surveys, such as the experimental mixed-mode Labour Market Survey and the Living Costs and Food Survey. The patterns observed when looking at age, sex, and ethnicity were replicated across other demographics. More detail on this can be found in the associated background tables.
Recontact permissions
Some elements of a fully redesigned survey of household finances would need to be collected repeatedly over set time periods. To achieve this, it is essential that responding households give permission to be recontacted and provide contact details. It is therefore of interest to see whether the advance survey materials had an impact on recontact permission.
Figure 17: Significantly more Labour Market Survey respondents gave recontact details than for this survey
Proportion of respondents that gave recontact details for household financial survey (HFS), Labour Market Survey (LMS)
Source: Office for National Statistics
Figure 18: The Tailored group were more likely to provide permission to recontact than the Generic group
Proportion of households that provided an email address, phone number, both, or no contact details by treatment group
Source: Office for National Statistics
Three out of four responding households provided permission to recontact, with those in the Tailored test group more likely to provide permission than those in the Generic group (77.7% and 72.9% respectively). This has implications for longitudinal surveys, where survey respondents participate multiple times over an extended period to track changes over time. Although the Generic group achieved higher engagement and response rates, in the long term the increased permission to recontact rate may provide better longitudinal data.
8. Conclusions and recommendations
The primary aim of this research was to establish whether households would engage with an invitation to participate in an online survey of household finances. The engagement rate for this survey was 20.7%, with 16.5% of households successfully completing the survey. This suggests that households will engage when asked to participate in such a survey.
The research also aimed to establish whether households would be less likely to respond if they were given advance notice of the topics that the survey would cover. Overall, 16.9% of households who were given this advance notice engaged with the survey and 14.0% completed it. This compares with 24.5% and 19.0% respectively for those who were not given advance notice.
It is important to note that those households who were given advance notice of the topics covered in the survey performed better against most of the data quality measures. The trade-off between achieving high engagement and response levels, and obtaining high quality data from those who do respond, should be explored further.
This initial research has yielded some interesting and encouraging findings and demonstrates that people are willing to engage with online finance surveys. How householders are approached, and the information they are given, appear to be important both in encouraging people to provide this type of data online and for the quality of the data they provide. Further research will build on this evidence base and help further establish the feasibility of collecting data of this nature online.