Written by Emilio J. C. Lobato, Corinne Zimmerman, and Thomas Critchfield from the Department of Psychology at Illinois State University
Our SoTL project was funded by the office of the Cross Chair in SoTL at ISU to examine ISU students’ involvement in SoTL scholarship or creative work. This topic seemed particularly relevant to our psychology curriculum, as the number of majors pursuing out-of-class research experience has increased greatly over the last several years. Currently, three of the five capstone options for psychology majors involve hands-on research experience.
Our project investigated whether psychology students who take advantage of research opportunities available at ISU develop a more sophisticated understanding of (a) science in general and (b) psychology as a science in particular. We examined which “myths of science” students endorse at different levels of the curriculum, and whether endorsement of these myths changes as students progress through the psychology curriculum.
To investigate these questions, we administered an online survey to our psychology majors. In order to make sure we sampled students across all four years of our curriculum, we used two recruitment methods. First, we recruited via the psychology department online research participation system (which targets students currently enrolled in courses that offer extra credit for research participation). Second, we used our undergraduate majors listserv (which targets all students, many of whom may not be enrolled in courses offering credit for research participation). We administered this survey twice, once in early fall 2014 and again in late spring 2015.
Across both data collection phases, we administered measures intended to examine student endorsement of myths of science, student perceptions about the nature of science, and student perceptions about psychology as a science. We also asked students about their research experiences with faculty.
Our Findings, So Far
We analyzed the data from fall 2014 and presented our findings at the 2015 University-Wide Teaching & Learning Symposium. Briefly, we found that upperclass students (i.e., juniors and seniors) who had at least one hands-on research experience were more likely to perceive psychology as a science relative to students at any stage of the curriculum without research experience. Additionally, upperclass students with research experience were less likely to endorse myths of science related to whether or not science and scientific methods provide absolute knowledge or proof. A PDF of our poster presentation is available at the symposium website.
Reflections and Issues
Throughout the research process, we encountered several problems. Our goal is to share some of these issues for the benefit of other SoTL researchers. Before we outline these issues, it is important to note that the members of our grant team are all from Psychology, and have decades of cumulative experience working with human participants.
- Participant Buy-In. Our study was advertised as the “Psychology Majors’ Survey.” Rather than being a standard psychology study, in which college students are taken to represent the general population, our study told participants “we are interested in you because you are a psychology major.” Our students signed up to participate, but many failed to take the study seriously. In psychological research, researchers often include items that serve as either manipulation-check or attention-check questions. In our study, participants either failed to complete the survey or failed attention-check questions such as “As an attention check, please select Strongly Disagree for this item.” These questions help us filter out participants who sped through the study, answered randomly, or chose the same response for every single question. Ultimately, we collected 350 response sets across both phases, but only 264 (75%) were usable (142 from fall and 122 from spring). Going forward, researchers using human subjects should be mindful that participants may be taking part only for the compensation offered, such as extra credit, and may fail to take participation seriously. They may answer randomly without paying attention to the questions you are asking or to the instructions you have provided. Without including ways to filter out these kinds of “bad” data, researchers risk drawing incorrect conclusions.
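The screening logic described above can be sketched in a few lines. This is a hypothetical illustration, not our actual analysis code: the item names, response labels, and the exact screens (incompleteness, a failed attention check, and straight-lining) are assumptions standing in for whatever a given survey uses.

```python
# Hypothetical data-quality screen for survey response sets.
# Item names and response labels below are illustrative, not from the actual survey.

ATTENTION_ITEM = "attn_1"             # e.g., "please select Strongly Disagree"
EXPECTED_ANSWER = "Strongly Disagree"

def is_usable(response: dict) -> bool:
    """Return True if a response set passes basic quality screens."""
    answers = [v for k, v in response.items() if k != ATTENTION_ITEM]
    # Screen 1: incomplete response sets are dropped.
    if any(a is None for a in answers):
        return False
    # Screen 2: failed attention-check item.
    if response.get(ATTENTION_ITEM) != EXPECTED_ANSWER:
        return False
    # Screen 3: straight-lining (identical response to every item).
    if len(set(answers)) == 1:
        return False
    return True

responses = [
    {"q1": "Agree", "q2": "Disagree", ATTENTION_ITEM: "Strongly Disagree"},  # usable
    {"q1": "Agree", "q2": "Agree", ATTENTION_ITEM: "Agree"},                 # failed check
    {"q1": "Agree", "q2": None, ATTENTION_ITEM: "Strongly Disagree"},        # incomplete
]
usable = [r for r in responses if is_usable(r)]
print(len(usable))  # 1
```

Running screens like these before analysis is what turned our 350 collected response sets into 264 usable ones.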
- Longitudinal Data. Our study design included two phases of data collection because we were interested in whether experiences over the course of an academic year resulted in learning, development, or change. An additional problem we encountered was that only 28 participants completed the survey at both data collection phases. Given that one of our research aims was to investigate change in the perception of science and endorsement of myths of science, such a small sample size limits the conclusions we can draw. A cross-sectional analysis (as we conducted with our fall data) is a methodologically valid approach to such questions, but longitudinal designs that require participants to complete multiple phases of data collection are another compelling source of data. Researchers interested in designing such studies should be aware that participant attrition is a major obstacle. Researchers should design their studies with this in mind and aim to collect far more data at each collection phase in anticipation of participant attrition at subsequent phases.
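The oversampling advice above amounts to simple arithmetic: divide the paired sample you want by the fraction of participants you expect to return. A minimal sketch, with the 20% retention rate loosely implied by our own numbers (28 returning participants out of 142 usable fall responses) used purely for illustration:

```python
import math

def initial_sample_needed(target_paired_n: int, retention_rate: float) -> int:
    """Phase-1 recruits needed so that, after attrition, the expected
    number completing both phases reaches the target."""
    return math.ceil(target_paired_n / retention_rate)

# Illustrative: with ~20% retention (roughly 28/142 in our fall data),
# a paired sample of 100 would require about 500 phase-1 participants.
print(initial_sample_needed(100, 0.20))  # 500
```

The point is not the exact rate, which will vary by population and incentive structure, but that the required phase-1 sample grows quickly as retention drops.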