The SoTL Advocate

Supporting efforts to make public the reflection and study of teaching and learning at Illinois State University and beyond…


How Do Students Believe They Learn a Discipline?: An Example from Sociology

Written by Kathleen McKinney, Cross Endowed Chair in SoTL, Illinois State University

Recently, I was reminiscing about my years as a SoTL Carnegie Scholar at the Carnegie Foundation in CA…the wonderful staff I worked with, the great colleagues I met and exchanged support with, the interesting SoTL projects everyone was doing, all that I learned about SoTL, the friendships that have lasted over the years…AND my own SoTL research conducted for this experience. My SoTL research question was: ‘How do sociology majors think they best learn the discipline?’

In my SoTL project on this question, I used a multi-method approach. I had a relatively small number of sociology majors from whom to obtain student voices on learning the discipline. Thus, I gathered data using three small-scale, qualitative studies: a group interview, analysis of content in learning logs, and individual face-to-face interviews. I then looked for common themes across the results of these studies. I had two sets of findings. First, students reported five types of “connections” (their word) that were most important to their learning in sociology: connections to others such as peers and faculty, connections among related ideas or skills, connections to their lives, connections across courses in the curriculum, and connections to the discipline itself. Second, I found that students were at different points on three overlapping pathways of learning: level of success in the major, use of surface versus deep approaches, and degree of novice versus expert learning. These findings certainly had potential practical implications for how we taught our majors (and others), our relationships with students, the learning opportunities we offered students, and curricular integration.

All this reminiscing led me to wonder whether anyone in another discipline or in my own, more recently, has conducted research on a similar question. And, if so, what SoTL methodologies and measures were used? What was found? What were the implications? If you have conducted or know of such work, I hope you will share it by commenting on this post.

Publications from this project on how students believe they best learn sociology:

McKinney, K. 2007. “The Student Voice: Sociology Majors Tell us About Learning Sociology.” Teaching Sociology 35:112-124.

McKinney, K. 2005. “Sociology Senior Majors’ Perceptions on Learning Sociology.” Teaching Sociology 33: 371-379.

McKinney, K. 2005, Fall/Winter. “Reflections on Learning Sociology: Analysis of Learning Log Entries.” MountainRise.

McKinney, K. 2004. “How Sociology Majors Learn Sociology: Successful Learners Tell Their Story.” Journal of Scholarship of Teaching and Learning 4: 15-24.


ISU SoTL News and Opportunities on Facebook and Twitter

We wanted to let you know that in addition to managing content on The SoTL Advocate blog, the Cross Chair in the Scholarship of Teaching and Learning at Illinois State University is now sharing SoTL news and opportunities on Facebook and Twitter.

Follow us on Twitter at: @ISU_SoTL

Like us on Facebook at:

We look forward to seeing you on social media. Please share with others who are interested in SoTL news and opportunities!


Explaining or Discussing Your SoTL Results

Written by Kathleen McKinney, Cross Endowed Chair in SoTL, Illinois State University

In our SoTL work, it is critical not just to report but to understand, explain, and discuss our data or results: what we have learned about our students’ learning, and the conditions and processes involved. We need to know not just that some teaching strategy or intervention or technology or experience or assignment or… helped students learn but, as (or more) importantly, why, how, and when. It is best, of course, if we have planned ahead and have been able to gather data or evidence about the ‘why,’ the ‘how,’ and the ‘when.’ I have written elsewhere about several strategies for doing this in SoTL work (McKinney, K. 2012. “Increasing the Impact of SoTL: Two Sometimes Neglected Opportunities.” International Journal of the Scholarship of Teaching and Learning 6 (1)). The five strategies are listed below; some of you may know of or use others.

  • Obtain student voices via qualitative reflection about processes and conditions using learning reflection essays, interviews, or focus groups.
  • Incorporate an observational component to the SoTL research where observers code process variables (e.g., increased peer discussion, practicing authentic tasks, modeling successful behaviors) they see taking place.
  • Draw on theories of learning that suggest possible conditions and processes a priori, then plan ahead and include measures of such variables/factors in the study.
  • Use longitudinal designs with multiple methods and/or measures taken at key points in time during the intervention and the learning process to increase the likelihood of obtaining data pointing to intervening processes and conditions.
  • Move from isolated SoTL projects to undertaking a series of related SoTL studies where each additional study attempts to ferret out specific conditions or qualify past results.

Unfortunately, many times we don’t have the data on the ‘why,’ ‘how,’ or ‘when.’ We don’t have direct evidence of the conditions and processes. But we still want to try to understand, to explain, and to discuss what we found as well as possible implications. In this case, we can use the literature on learning and what it says about best practices, impactful experiences, effective learning behaviors, etc. We can consider the factors, evidence, and results of our own SoTL project, and discuss whether processes consistently found to enhance learning in the literature on learning may have been at work in our SoTL study and our students’ outcomes.

For example, I have highlighted below five processes (there are certainly more!) that have been shown, empirically, to be positively related to learning in two to five of the sources listed at the end of this post (there are other sources on learning as well!). You can consider whether any of these processes help to explain the learning-related results you find in your SoTL project.

  1. Connecting learning to the students’ prior knowledge or preexisting understandings.
  2. Involving appropriate practice, time on task, study time alone, significant reading and writing.
  3. Offering some student voice, student control/choice of learning, assignments, or activities.
  4. Encouraging and supporting intrinsic motivation for learning and learning tasks.
  5. Providing opportunities and support for student self-reflection, metacognition, monitoring own learning.

Ambrose, Susan A., et al. 2010. How Learning Works: Seven Research-Based Principles for Smart Teaching. San Francisco: Jossey-Bass.

Arum, Richard and Josipa Roksa. 2011. Academically Adrift: Limited Learning on College Campuses. Chicago: University of Chicago Press.

Bain, Ken. 2012. What the Best College Students Do. Cambridge, MA: Harvard University Press.

Mashek, Debra and Elizabeth Yost Hammer (eds.). 2011. Empirical Research in Teaching and Learning: Contributions from Social Psychology. Malden, MA: Wiley-Blackwell.

National Research Council. 1999. How People Learn: Bridging Research and Practice. Washington, DC: National Academy Press.

Zull, James E. 2002. The Art of Changing the Brain: Enriching the Practice of Teaching by Exploring the Biology of Learning. Sterling, VA: Stylus Publishing.


Volume 3 (2015) of Gauisus is published!

Written by Kathleen McKinney, Cross Chair in SoTL at Illinois State University

Gauisus is the internal, blind peer-reviewed scholarship of teaching and learning (SoTL) publication at Illinois State University (ISU). At ISU we define the scholarship of teaching and learning as the “systematic reflection/study on teaching and learning [of our ISU students] made public.” The first volume of Gauisus appeared in 2009 in print and PDF form and contained 13 traditional scholarly articles or notes. The second and subsequent volumes are multimedia publications and appear online each late spring. Each contains several representations of SoTL work. Representations may be scholarly papers or notes, online posters, videos, wikis or blogs, and so on.

The purposes of Gauisus are the following: 1) to provide instructors writing about their teaching and learning a local but peer-reviewed outlet to share what they and their students have done and learned, and 2) to offer other instructors and students an accessible publication through which to obtain a sense of, and learn from, some of the scholarly teaching and SoTL projects conducted by their colleagues on our campus. Gauisus means glad, gladly, or joyful in Latin, as in the Illinois State University motto/logo, “Gladly we learn and teach.” Reviewers are volunteers from ISU, and sometimes beyond, who must apply and are selected based on their experience with SoTL and reviewing scholarly work.

Volume 3, 2015 contains the following SoTL representations:

Cross-Curricular Learning in Communication Sciences and Disorders: Leaving the Silos Behind

Jennifer Friberg • Department of Communication Sciences and Disorders

Heidi Harbers • Department of Communication Sciences and Disorders

This project describes student perceptions of learning following the completion of a cross-curricular end-of-semester project in communication sciences and disorders. Results indicated that students increased knowledge in several key areas, particularly in the integration of material from two separate graduate courses.

Unwilling or Unable? Measuring Anti-Asian Implicit Biases of Pre-Service Teachers in Order to Impact Teacher Effectiveness

Nicholas D. Hartlep • Educational Administration and Foundations

This exploratory scholarship of teaching and learning (SoTL) study analyzes undergraduate students’ performances on an Implicit Association Test (IAT). Its purpose was to measure the levels of implicit bias that two groups of pre-service teachers held toward Asian Americans. Two sections of students in a Social Foundations of Education course took an Asian IAT at the beginning and end of a six-week summer session. The research examined what students attributed their IAT results to. Family, friends, and upbringing (environmental and external factors) were salient attributions across both sections of the Social Foundations of Education course, as determined by pre- and post-IAT writing assignments. Students’ justifications did not change during the six-week course, which may indicate that students believe their biases simply are what they are, and that they do not feel troubled by the possibility that they harbor anti-Asian biases.

How Volunteer Service Projects Enhance Learning and Classroom Community: A Longitudinal Study

Phyllis McCluskey-Titus • Educational Administration and Foundations

Wendy Troxel • Educational Administration and Foundations

Jodi Hallsten Lyczak • School of Communication

Erin Thomas • Student Affairs

Brandon Hensley • Educational Administration and Foundations

This article shares the results of a longitudinal study conducted to assess student learning up to six years after participation in a volunteer service project by undergraduate and graduate students enrolled in a first-year seminar class and a first-year master’s degree course about college students. Data were collected using an open-ended survey and analyzed for relevant themes. Results are presented with implications for teaching and learning using collaborative volunteer service as a methodology.

Developing The Impact of Relevance and Teacher Immediacy on Cognitive and Affective Learning

John F. Hooker • School of Communication

This study was an experimental investigation into the impact of lecture topic relevancy and teacher immediacy on students’ cognitive and affective learning. Students at Illinois State University (ISU) were recruited from multiple sections of Communication 110, a course required of all first-year students. Therefore, a variety of majors were included. The results revealed that students learned more with a highly relevant lecture topic. Students also learned more with a highly immediate instructor. There was no interaction between immediacy and relevance, contradicting previous research that suggested an overlap between the two. Pedagogical implications of the findings are discussed.

Instructional Podcasts to Support Thesis Writing: Student and Committee Member Perceptions

Julie Raeder Schumacher • Department of Family and Consumer Sciences

Podcasts support autonomous learning; however, the literature is limited on using podcasts to educate students on technical processes separate from course content, such as writing a thesis manuscript. The purpose of this study was to explore the effectiveness of instructional podcasts on thesis writing for Master’s students in our department. The study revealed that students’ writing preparation and confidence were significantly increased after listening to a series of instructional podcasts. Responses from committee members showed positive trends in students’ writing. This study demonstrated that podcasts provide one possible means of communicating departmental expectations for the thesis writing process.


Walk the Talk SoTL Contest Summary from 2nd Winning Team

Written by Maria Moore (COM), Cheri Simonds (COM), Lance Lippert (COM), Kevin Meyer (COM), Megan Koch (COM), and Derek Story (Director, HR Systems) at Illinois State University

On Wednesday, April 29, 2015, a reception was held to honor two award-winning teams from our “Walk the Talk” contest, which recognized the best team or academic unit that applied SoTL research results/literature beyond the individual classroom to solve a problem, achieve a goal, or exploit an opportunity, resulting in improved teaching or enhanced student learning at Illinois State University.

A summary of the first award-winning project was featured in a recent blog post. Today’s blog highlights the work done by faculty from the School of Communication to create a new teaching evaluation instrument. This project, titled Creating a New Teaching Evaluation Instrument for the School of Communication, is summarized below, along with an EXCELLENT list of SoTL literature that readers interested in teaching evaluation can use, as appropriate:

What problem(s), goal(s), or opportunity(ies) did your team seek to solve or exploit?

A teaching evaluation instrument is a critical strategic resource for ensuring quality teaching within a division. It affects practical issues such as promotion and raises, and it is fundamental to assessing teaching quality.

We identified that our existing SoC instrument should be reviewed and updated. The assignment was given to our Teaching Effectiveness Committee, which identified key problems with the old instrument and met bi-monthly to revise it. Members of the committee conducted a pilot test in their own courses at the end of the Fall 2013 semester, using both the old and new instruments with students. Based on the successful pilot test, the committee recommended the change to the full faculty body in March 2014. The faculty voted unanimously to accept the revision, and the new instrument was implemented across all courses at the end of Spring 2014.

What strategies did your team devise to apply best practices from SoTL work to your problem/goal/opportunity?

We have many SoTL scholars in the SoC and it is a common practice to use SoTL literature as we identify problems or opportunities and devise resultant tactical strategies to solve or exploit them. This opportunity was no exception. At our first meeting we recognized the need to examine current literature and best practices to have external support and validation as a foundation for our recommendations.

We conducted a literature review on the current thinking about teaching evaluations as well as effective teaching practices. In addition to the literature reviewed, we also investigated teaching evaluation instruments from comparable and aspirational programs externally and internally.

What SoTL research (your own, colleagues, or from the literature) did you use to support your strategies?

We purposefully sought and reviewed literature from multiple disciplines, not just communication. We also sought current, as well as seminal scholarship on effective teaching practices. We included research from our own SoC scholars, too.

What were the outcomes and how were they assessed or measured?

Last year, one of our committee members ran a factor analysis and scale reliabilities for all the SoC Spring 2014 data collected with the new teacher evaluation instrument. Results indicate that the new instrument performs very well in reliability tests, factor analysis, and predictive capability using regression procedures. It also allows us to condense aggregate reports of our evaluations into four categories or factors.

This analysis resulted in a spreadsheet template distributed to all faculty so they can also calculate their own aggregate factor scores for each subsequent semester. Faculty going for tenure or promotion use this spreadsheet as an element of their materials. Our SFSC also uses this spreadsheet to work with faculty deemed deficient as part of their performance enhancement strategic plan.
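As a rough illustration of the kind of analysis described above, the sketch below computes Cronbach's alpha and fits an exploratory factor analysis on simulated evaluation responses. Everything here is invented for illustration (the two factors, item loadings, and data are assumptions); it is not the SoC's actual instrument, data, or code.

```python
# Minimal sketch: scale reliability (Cronbach's alpha) and an exploratory
# factor analysis on simulated teaching-evaluation responses.
import numpy as np
from sklearn.decomposition import FactorAnalysis

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal-consistency reliability for an (n_students, n_items) array."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
n_students = 200
# Simulate two hypothetical latent factors (say, "clarity" and "rapport"),
# with four items loading on each.
latent = rng.normal(size=(n_students, 2))
loadings = np.array([
    [0.8, 0.1], [0.7, 0.2], [0.9, 0.0], [0.8, 0.1],   # clarity items
    [0.1, 0.8], [0.2, 0.7], [0.0, 0.9], [0.1, 0.8],   # rapport items
])
responses = latent @ loadings.T + rng.normal(scale=0.4, size=(n_students, 8))

alpha = cronbach_alpha(responses)
fa = FactorAnalysis(n_components=2, random_state=0).fit(responses)
print(f"Cronbach's alpha: {alpha:.2f}")
print("Estimated loadings shape:", fa.components_.shape)  # (2 factors, 8 items)
```

The estimated loadings recovered by the factor analysis are what let item-level scores be condensed into a small number of aggregate factor scores, analogous to the four-category reports and the spreadsheet template the committee distributed.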

Please briefly reflect on the impact of this experience upon your team; in particular consider the specific role of the SoTL literature on your outcomes or consequences.

We accepted the responsibility to revise our evaluation instrument with great humility and a sense of tremendous responsibility. Simply put, we were entrusted with our colleagues’ future, as the evaluation instrument is one of the most important elements used to assess a teacher’s success or lack thereof. A foundation of SoTL literature (as we began and then navigated this responsibility) was both empowering and liberating. Empowerment occurred through the knowledge we gained about best practices as well as profoundly important SoTL research of both the student and the teacher perspectives of the evaluation process.

What are your team’s future plans for this particular project or initiative?

This is the first full academic year when all teachers in all roles in the SoC will use the new instrument. Our committee has the responsibility to continually assess its effectiveness for meeting our goals of the continual improvement of our teachers and our teaching. We are also experimenting with ways to mesh old data and new data from the old instrument and the new instrument for the multi-year reporting of data required for tenure and promotion applications.

What are your plans to make this work public?

We intend to submit a panel discussion about this topic for consideration at the next Central States Communication Association Meeting. We wanted to have this year’s aggregate data to discuss in addition to the project itself, so submission will occur in October 2015.

Literature Reviewed for Project

American Association of University Professors. (1990). Statement on Teaching Evaluations. Retrieved from

Benton, S. L., & Cashin, W. E. (2012). Student ratings of teaching: A summary of research and literature (IDEA Paper No. 50). Manhattan, KS: The IDEA Center.

Boysen, G. A. (2008). Revenge and student evaluations of teaching. Teaching of Psychology, 35(3), 218-222.

Calkins, S., & Micari, M. (2010). Less-than-perfect judges: Evaluating student evaluations. Thought & Action, 7.

Center for Research on Learning and Teaching University of Michigan. (n.d.). Gender and Student Evaluations: An Annotated Bibliography. Retrieved from

Chen, W., & Chen, W. (2010). Surprises learned from course evaluations. Research in Higher Education Journal, 9, 1-9.

Comadena, M., Hunt, S., & Simonds, C. (2007). The Effects of Teacher Clarity, Nonverbal Immediacy, and Caring on Student Motivation, Affective and Cognitive Learning: A Research Note. Communication Research Reports, 24(3), 241-248.

Cornell University Evaluation and Recognition of Teachers Handbook (n.d.). Retrieved from

Dodeen, H. (2013). College students’ evaluation of effective teaching: Developing an instrument and assessing its psychometric properties. Research in Higher Education Journal, 21, 1-12.

DuCette, J., & Kenney, J. (1982). Do grading standards affect student evaluations of teaching? Some new evidence on an old question. Journal of Educational Psychology, 74(3), 308.

Feeley, H. T. (2002). Evidence of halo effects in student evaluations of communication instruction. Communication Education, 51(3), 225-236.

Frick, T. W., Chadha, R., Watson, C., Wang, Y., & Green, P. (2008, March). Theory-based course evaluation: Implications for improving student success in postsecondary education. In American Educational Research Association conference, New York.

Hudson, J. C. (1989). Expected Grades Correlate with Evaluation of Teaching. Journalism Educator, 44(2), 38-44.

Kim, C., Damewood, E., & Hodge, N. (2000). Professor attitude: Its effect on teaching evaluations. Journal of Management Education, 24(4), 458-473.

Kozey, S. R., & Feeley, H. T. (2009). Comparing Current and Former Student Evaluations of Course and Instructor Quality. Communication Research Reports, 26(2), 158-166.

Lewis, K. G. (2001). Making sense of student written comments. New Directions for Teaching and Learning, 2001(87), 25-32.

Marsh, H. W., & Roche, L. A. (1997). Making students’ evaluations of teaching effectiveness effective: The critical issues of validity, bias, and utility. American Psychologist, 52(11), 1187.

Martin, E. (1984). Power and authority in the classroom: Sexist stereotypes in teaching evaluations. Signs, 482-492.

McCroskey, J. C. (1994). Assessment of affect toward communication and affect toward instruction in communication. In S. Morreale & M. Brooks (Eds.), 1994 SCA summer conference proceedings and prepared remarks: Assessing college student competence in speech communication. Annandale, VA: Speech Communication Association.

Onwuegbuzie, A. J., Witcher, A. E., Collins, K. M., Filer, J. D., Wiedmaier, C. D., & Moore, C. W. (2007). Students’ perceptions of characteristics of effective college teachers: A validity study of a teaching evaluation form using a mixed-methods analysis. American Educational Research Journal, 44(1), 113-160.

Ory, J. C. (2001). Faculty thoughts and concerns about student ratings. New Directions for Teaching and Learning, 2001(87), 3-15.

Schrodt, P., Witt, P. L., Myers, S. A., Turman, P. D., Barton, M. H., & Jernberg, K. A. (2008). Learner empowerment and teacher evaluations as functions of teacher power use in the college classroom. Communication Education, 57(2), 180-200.

Sojka, J., Gupta, A. K., & Deeter-Schmelz, D. R. (2002). Student and faculty perceptions of student evaluations of teaching: A study of similarities and differences. College Teaching, 50(2), 44-49.

Wilson, R. C. (1986). Improving faculty teaching: Effective use of student evaluations and consultants. The Journal of Higher Education, 196-211.

Wines, W. A., & Lau, T. J. (2006). Observations on the folly of using student evaluations of college teaching for faculty evaluation, pay, and retention decisions and its implications for academic freedom. Wm. & Mary J. Women & L., 13, 167.

Wode, J., & Keiser, J. (2011). Online course evaluation literature review and findings. A report from Academic Affairs, Columbia College, Chicago.