ACSPRI Conferences, ACSPRI Social Science Methodology Conference 2018

Transitioning from CATI to Online

Benjamin Phillips, Andrew C Ward

Building: Holme Building
Room: Cullen Room
Date: 2018-12-13 11:00 AM – 12:30 PM
Last modified: 2018-10-17

Abstract


The Social Research Centre maintains Life in Australia™, Australia’s only probability-recruited online panel. (Life in Australia™ also includes offline respondents, who are interviewed by telephone.) Various surveys, including the Lowy Institute Poll and the Scanlon Foundation/Monash University Social Cohesion Survey, are transitioning from telephone surveys using random digit dialling and other sample sources to being fielded on Life in Australia™, predominantly as web surveys. In this paper, we address the considerations associated with such survey transitions.
Retaining trend data is an important consideration when transitioning methods for repeated cross-sectional surveys. Given the likelihood of non-ignorable mode effects, past estimates must be back-cast: adjusted to the counterfactual scenario in which prior surveys had been fielded with the new methods. We make use of approaches developed by Statistics Netherlands to back-cast estimates based on a period (typically one wave) in which the surveys are administered in parallel using the old and new methods.
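The logic of a parallel-run adjustment can be illustrated with a minimal sketch. This is a simplified additive correction with hypothetical figures, not the full time-series modelling developed by Statistics Netherlands; the function names and numbers are illustrative only.

```python
# Sketch of back-casting via a parallel-run mode-effect adjustment.
# Hypothetical data; a simple additive correction for illustration.

def mode_effect(parallel_old: float, parallel_new: float) -> float:
    """Estimated mode effect: difference between the new- and
    old-method estimates from the wave fielded in parallel."""
    return parallel_new - parallel_old

def back_cast(past_estimates: list[float], effect: float) -> list[float]:
    """Adjust past (old-method) estimates to the counterfactual in
    which they had been collected with the new methods."""
    return [est + effect for est in past_estimates]

# Example: the parallel wave yields 52% under CATI and 47% online,
# so the estimated mode effect is -5 percentage points.
effect = mode_effect(parallel_old=52.0, parallel_new=47.0)
trend = back_cast([50.0, 51.5, 52.0], effect)
print(trend)  # [45.0, 46.5, 47.0]
```

In practice the adjustment would be estimated with sampling variance taken into account, and possibly separately by subgroup, but the parallel wave plays the same role: it identifies the mode effect that the historical series is then corrected for.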
Depending on the survey topic, social desirability may be a significant driver of mode effects. Respondents in interviewer-administered surveys are prone to report answers that are more socially acceptable than their (unobserved) true answers. Acquiescence response bias—the tendency of respondents to agree with statements regardless of their content—is another potential source of differences between interviewer- and self-administered surveys. We will show that both have the potential to affect survey estimates.
Question order effects may also play a role. Self-administered surveys tend to exhibit response primacy, the tendency for respondents to select options presented at the beginning of the list of response options. By contrast, interviewer-administered surveys tend to exhibit response recency, the tendency for respondents to select options presented at the end of the list of response options. Our approach has been to rotate item scales in Life in Australia™ in order to avoid systematic biases from order effects. For most question types, order effects are rare and minor.
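The rotation described above can be sketched as a simple per-respondent randomisation. This is a hypothetical illustration (the function and scale labels are not drawn from the paper): presenting the scale in forward or reversed order at random means primacy and recency effects tend to cancel across the sample.

```python
import random

# Sketch of response-option scale rotation (hypothetical):
# each respondent sees the scale in forward or reversed order,
# chosen at random, so order effects balance out in aggregate.

def rotate_scale(options: list[str], rng: random.Random) -> list[str]:
    """Return the response scale in original or reversed order,
    with equal probability."""
    return list(options) if rng.random() < 0.5 else list(reversed(options))

scale = ["Strongly agree", "Agree", "Disagree", "Strongly disagree"]
presented = rotate_scale(scale, random.Random())
```

Because the rotation is recorded per respondent, analysts can also test directly whether estimates differ by presentation order.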
Finally, switching from a repeated cross-sectional design to a panel may give rise to panel conditioning effects: changes in respondent attitudes and behaviour associated with panel membership. We describe tests for panel conditioning that we have implemented in Life in Australia™.