ACSPRI Conferences, ACSPRI Social Science Methodology Conference 2010

Pitfalls of data collection: a practical guide to overcoming methodological issues associated with data collection for a PhD Project based primarily on a survey design

Sallie Gardner

Building: Holme Building
Room: Sutherland Room
Date: 2010-12-03 11:00 AM – 12:30 PM

Abstract

For researchers, the first challenge is to choose a method suited to their goals. A mixed-methods approach adds richness to survey responses, complementing the major foci of quantitative methods. Email technology expands the data collection options available to researchers; however, it can bring unexpected pitfalls.

The current study reports on the progress of research into student-teachers' psychological distress and their coping strategies. The initial plan was to collect data using surveys delivered both face-to-face and by email. This paper reports some of the challenges faced by the novice researcher, particularly those related to expectations and attrition.

Data were derived from seven sub-scales, comprising existing quantitative instruments and new scales developed for this project. Quantitative data were collected using the commercially available Depression, Anxiety and Stress Scale (DASS), the Ten Item Personality Inventory (TIPI) and the Pearlin-Schooler Mastery Scale (PSMS). New scales were developed incorporating items from the Brief COPE and the Alcohol Use Disorders Identification Test (AUDIT).

Qualitative data were obtained from open-ended questions and from focus groups held during field meetings. Four meetings were conducted during the Internship, with field notes recording student-teachers' feedback.

The research plan was to collect survey data on campus and by email. Challenges arose from the outset. The first was that almost half of the student-teachers enrolled in the undergraduate degree failed to attend an on-campus lecture. As this was the lecture at which the research project was presented, surveys distributed and email addresses obtained, data collection schedules had to be hastily rearranged to reach the missing student-teachers.

Unexpected decisions to permit student-teachers to undertake their Internships interstate and, in some cases, internationally, followed by the reduction of the ten-week Internship program to six weeks, posed unique challenges for the project. The restriction of students' email access once the project was already underway presented a further challenge.

Data analyses also presented challenges. Exploratory factor analyses were used to assess the validity and reliability of the non-standardised scales. Components analysis reduced the new, non-standardised scales of 22, 12 and 10 items to five, two and three components respectively.

The final data collection and analysis challenge, one more commonly encountered in research, was attrition over time. Of the 145 student-teachers originally enrolled, only 115 remained post-Internship, and of these only approximately 30% responded to the final survey. Having outlined the pitfalls as experienced in this research into student-teacher well-being, some practical approaches for the future are presented.
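As an illustration of the components analysis step described above, the following is a minimal sketch in Python (a language choice assumed here; the abstract does not name the software used). It assumes responses to the 22-item coping scale are held in a pandas DataFrame with hypothetical column names, and retains components by the common Kaiser criterion (eigenvalues greater than 1).

    import numpy as np
    import pandas as pd
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Hypothetical data: 115 respondents answering 22 coping items
    # on a 1-5 Likert scale (random placeholders, not study data).
    rng = np.random.default_rng(0)
    items = pd.DataFrame(rng.integers(1, 6, size=(115, 22)),
                         columns=[f"cope_{i}" for i in range(1, 23)])

    # Standardise items so the analysis is based on the correlation matrix.
    z = StandardScaler().fit_transform(items)

    pca = PCA().fit(z)
    eigenvalues = pca.explained_variance_

    # Kaiser criterion: retain components with eigenvalue > 1.
    n_retained = int(np.sum(eigenvalues > 1))
    print("Components retained:", n_retained)
    print("Variance explained:", pca.explained_variance_ratio_[:n_retained].round(3))

In practice, the retained solution would also be checked against a scree plot and the component loadings, and followed by reliability analysis (for example, Cronbach's alpha for each component) to support the validity and reliability claims made for the new scales.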