The SoTL Advocate

Supporting efforts to make public the reflection and study of teaching and learning at Illinois State University and beyond…



Think Alouds: SoTL Methods Series #3

Written by Sarah M. Ginsberg, Ed.D., Professor of Speech-Language Pathology at Eastern Michigan University (sginsberg@emich.edu)

Editor’s Note: This blog was originally posted on the SoTL Advocate on October 12, 2015 and is reprinted in its entirety now for its excellent fit in the current methods series, which features methods for SoTL that are “new and different” to many. — JCF

A common thread that runs through various cross-disciplinary SoTL research is the concerted effort made to understand what the accomplished professional is thinking when she solves a work problem so that we can use that knowledge as teachers to better prepare future professionals. That problem might be how a mathematician completes a technical calculation, or in clinical fields, it might be how the clinician arrives at a diagnosis. The value for all of us in understanding what our accomplished colleagues do in their heads when faced with a technical problem is that in identifying how the pros do it, we can uncover insights into how we should be teaching our students to think and to problem solve. This type of understanding relies on a process of collecting data while the person is actively engaged in solving a problem out loud. This type of study is often referred to as a think aloud (TA) and can yield important information to inform evidence-based educational practices.

The TA method is a validated method of learning about cognitive processes by having participants verbalize their thinking in a metacognitive manner (Ericsson & Simon, 1993; Wineburg, 1991). TAs were popularized by Wineburg (1991) in his ground-breaking study that examined how academic historians and students differed in processing information while reading historical texts. Since then, TAs have been used to study how novice thinking compares to experienced thinking in a wide variety of disciplines, including the health sciences, mathematics, and political science (Banning, 2008; Bernstein, 2010; Forsberg, Ziegert, Hult, & Fors, 2013; Wainwright & McGinnis, 2009). These types of studies are often referred to as “expert-novice” studies (Bernstein, 2010).

The process of data collection using a TA approach is quite simple and requires minimal technology and cost. Typically:

  1. Study participants are presented with the problem to be solved by the researcher and asked to solve it aloud.
  2. Specific directions are provided to participants. Prompts (e.g., “tell me how you would solve this” or “describe how you would approach this problem”) are used to elicit responses and gather additional information if a participant falls silent or struggles with the process.
  3. Participant responses are recorded for subsequent transcription and analysis.
  4. Once the TA is transcribed, the most challenging part of the process becomes the subsequent data analysis. Consistent with qualitative methodology, verbalizations may be read as a whole to determine initial emerging codes and impressions about the thought process (Creswell, 2002; Denzin & Lincoln, 2012). Using an inductive approach to identifying specific thought processes or strategies allows the researcher to move forward to developing secondary, axial coding. Themes emerge as the iterative process expands to include all participants and commonalities and differences can be appreciated.
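For researchers comfortable with a little scripting, the tallying stage of this analysis can be automated once transcript segments have been hand-coded. The Python sketch below is a minimal illustration, not part of the original study; the participant labels and code names are invented. It counts how often each code appears per participant so that commonalities and differences across participants can be compared.

```python
from collections import Counter

# Hypothetical hand-coded transcript segments as (participant, code) pairs.
# In a real TA study these would come from your line-by-line coding.
coded_segments = [
    ("expert_1", "hypothesis_testing"),
    ("expert_1", "pattern_recognition"),
    ("expert_1", "hypothesis_testing"),
    ("novice_1", "rule_recall"),
    ("novice_1", "hypothesis_testing"),
    ("novice_1", "rule_recall"),
]

def code_frequencies(segments):
    """Tally how often each code appears for each participant."""
    tallies = {}
    for participant, code in segments:
        tallies.setdefault(participant, Counter())[code] += 1
    return tallies

tallies = code_frequencies(coded_segments)
print(tallies["expert_1"]["hypothesis_testing"])  # 2
```

A table of such frequencies makes it easy to see, for example, that the hypothetical expert leans on hypothesis testing while the novice relies on rule recall.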

Having recently completed a study comparing the diagnostic problem-solving of experienced speech-language pathology (SLP) clinicians with that of SLP graduate students, I found that the most challenging aspect of analyzing the data was determining the level of thinking to focus on. I used studies in comparable clinical professions, such as nursing, medicine, and physical therapy, to identify frameworks that might be useful to me. In determining the focus of my study, I chose to concentrate on the heuristics (thinking strategies) of my participants, to understand differences in approaches to problem solving and to create a framework that fostered comparisons to previous literature, potentially increasing the value of my findings.

For more details on the think aloud method and some outstanding examples of its use in a variety of fields, see the items included in the following references. It should be noted that a number of authors also advocate for the use of TA as a teaching method. For those unfamiliar with qualitative research methodology, several references are included here as well.

References for Additional Information on Think Alouds:

Banning, M. (2008). The think aloud approach as an educational tool to develop and assess clinical reasoning in undergraduate students. Nurse Education Today, 28(1), 8-14. doi: 10.1016/j.nedt.2007.02.001

Bernstein, J. L. (2010). Using “think-alouds” to understand variations in political thinking. Journal of Political Science Education, 6(1), 49-69. doi: 10.1080/15512160903467695

Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data. Cambridge, MA: MIT Press.

Forsberg, E., Ziegert, K., Hult, H., & Fors, U. (2013). Clinical reasoning in nursing, a think-aloud study using virtual patients: A base for innovative assessment. Nurse Education Today. http://dx.doi.org/10.1016/j.nedt.2013.07.010

Wainwright, S. F., & McGinnis, P.Q. (2009). Factors that influence the clinical decision-making of rehabilitation professionals in long-term care settings. Journal of Allied Health, 38(3), 143-51.

Wineburg, S. S. (1991). On the reading of historical texts: Notes on the breach between school and academy. American Educational Research Journal, 28(3), 495-519.




Collaborative Auto-ethnography: SoTL Methods Series #2

Written by: Catherine McConnell (University of Brighton), Elizabeth Marquis (McMaster University), and Lucy Mercer-Mapstone (University of Queensland) — note complete author-supplied affiliations and contact information at the end of this blog post. 

When we (Catherine, Beth, and Lucy) met at the International Summer Institute on Students as Partners in 2016 we quickly discovered that, as practitioners of student-staff partnerships, we had many shared experiences. Our discussions were fruitful in terms of giving us a sense of belonging but we felt they warranted deeper exploration. So, we embarked on a process of delving into our own experiences in the hopes of learning in the process and sharing that learning with fellow practitioners and researchers. As we worked to find a way to effectively and systematically study ourselves, we arrived at the idea of using collaborative auto-ethnography as a methodological approach.


Above: Participants at the 2016 McMaster Summer Institute on Students as Partners in Teaching and Learning, where we first developed the idea for our CAE project.


Collaborative auto-ethnography (CAE) is a research method that involves ‘a group of researchers pooling their stories to find some commonalities and differences […] to discover the meanings of the stories in relation to their sociocultural contexts’ (Chang et al., 2013, p. 17). As such, it provided a perfect, if initially somewhat unfamiliar, way to collectively explore our individual experiences in a scholarly fashion.

We have put together this blog post to explain our developing understanding of the method and process of CAE, and how we, as a group of three researchers, have used it in our recent SoTL enquiry into ‘student-staff partnership in higher education.’

CAE builds upon ‘auto-ethnography,’ which is a method that uses a researcher’s personal experience to ‘describe and critique cultural beliefs, practices, and experiences’ (Adams et al., 2015, p. 1). Auto-ethnography is a deeply personal and reflective process, usually taking place in a researcher’s own context, whether that be about practice or in a certain situation. Adding the dimension of multiple ethnographies (i.e., more than one voice) presents the possibility that the method can be simultaneously collaborative, autobiographical, and ethnographic.

For our shared research, we took up the CAE method through structured reflective writing, designing a writing activity to provide a framework that would guide our individual reflections. We set ourselves a word limit of 750 words that we would write and share with each other in a private online space. This activity was modelled on a set of reflective prompts, informed by Johns’ model of structured reflection (Johns, 2000), which had been adapted by colleagues at the University of Brighton, UK (2011) and which we subsequently re-appropriated. Specifically, we:

  1. Developed a framework of prompts/questions for reflection
  2. Wrote individual reflections guided by the framework
  3. Shared and read each other’s reflections
  4. Conducted iterative thematic analysis to discover key themes
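For those who like an audit trail alongside the interpretive work, the ‘commonalities and differences’ step can be checked mechanically once each reflection has been coded. The Python sketch below is our editors’ illustration only (the author names stand in for the three researchers, and the code labels are invented): it uses simple set operations to separate codes shared by all authors from codes unique to one.

```python
# Hypothetical codes assigned to each author's reflection during
# thematic analysis; real codes would emerge from iterative reading.
codes_by_author = {
    "catherine": {"belonging", "vulnerability", "role_shift"},
    "beth": {"belonging", "vulnerability", "institutional_context"},
    "lucy": {"belonging", "role_shift", "vulnerability"},
}

# Codes shared by every author: candidate cross-cutting themes.
shared = set.intersection(*codes_by_author.values())

# Codes appearing in only one author's reflection: points of difference.
unique = {
    author: codes - set.union(*(c for a, c in codes_by_author.items() if a != author))
    for author, codes in codes_by_author.items()
}

print(sorted(shared))  # ['belonging', 'vulnerability']
```

The shared set is only a starting point; the meaning of each theme still has to be worked out in discussion, as the CAE literature emphasizes.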

Our developed framework was a useful facilitative tool, prompting us to give an open, honest account of our personal and professional experiences and affirming Johns’ interest in ‘making explicit the knowledge we use in our practice’ (Jasper, 2013, p. 86). Posing questions that follow Johns’ format of phased cue questions (preparatory, descriptive, reflective, anticipatory, and insight phases; Jasper, 2013, p. 37), we focused our SoTL enquiry on identity construction, navigation, and enactment in the context of student-staff partnerships.

The framework below provides an illustration of the types of questions used to prompt our personal reflections about identity in the context of student-staff partnerships. This could be easily adapted to other SoTL topics –  especially those that reflect on practice.

Framework for reflective questioning

Description of experience
  • Phenomenon: describe in detail your partnership practice, or a specific partnership experience that seems especially noteworthy, without interpretation or judgement
  • Context: what were the significant background factors to this experience? Why did it take place, and what was its purpose?

Reflection
  • What were you trying to achieve?
  • Why did you behave as you did?
  • What were the consequences of your actions for yourself and others?
  • How did you feel about the experience when it was happening?
  • To what extent did your actions realize your understanding of partnership?
  • What identity(s) were you consciously aware of at the time?
  • What identity(s) do you believe were at play during this interaction in hindsight?
  • If multiple, which identity was most salient? How were they interacting?

Analysis: Influencing factors
  • What factors influenced your decision-making? Some potential options to consider: prior experiences, societal expectations/ideologies/assumptions, context
  • How was your salient identity affecting your actions?
  • How was the interaction between identities affecting your actions?
  • How was the presence of this identity(s) influencing your perceptions of those with whom you were interacting?

Analysis: Alternatives
  • What other choices did you have?
  • What could be the consequences of these choices?

Learning & Action
  • How do you feel about this experience now?
  • Has this experience changed your way of understanding yourself?
  • Did your salient identity change? If so, how and why?
  • In hindsight, how has this interaction/event affected your ongoing identity in partnership?
  • What new questions, challenges or issues has it raised?
  • Given the chance, what would you do differently next time?
  • How will you follow up on this experience in order to put your learning into practice?

Once we had written and shared our individual reflections, we found it useful to read each other’s work and write a short ‘meta-reflection’ on the writing process. This enabled us to appraise the CAE method straight after the reflective process but before we began any analysis of the transcripts. Some of our observations included:

  • We felt a sense of belonging and solidarity with one another, along with a communal ownership of the enquiry
  • Writing and sharing reflections prompted a personal realisation for each of us and provided grounds for transformation. This process was not without some discomfort, though, perhaps because sharing personal reflections with colleagues involves a degree of vulnerability
  • Sometimes the content of the reflections themselves was also unsettling or challenging, causing discomfort in relation to an aspect of one’s own practice
  • We experienced a heightened consciousness of our own values and beliefs relating to practice, the influence these have in partnership situations, and in our expectations of others
  • We also noted a sense of excitement when reading each other’s writing, inspired by the experience of sharing personal insights and aspects of our own identity that are usually implicit

While CAE proved a useful method for meaningfully exploring our research questions about our own experiences, then, it was also an exciting and sometimes uncomfortable process that supported reflective thinking and potential development of our practice as teachers and learners.

Project Information

We are three SoTL practitioners working in the area of student-staff partnership in HE across three western countries, in differing roles, and in significantly different institutional contexts. Catherine McConnell is a Senior Lecturer in a Learning and Teaching centre, focusing her work and doctoral research on student-staff partnership, at the University of Brighton in the UK. Beth Marquis is Associate Director (Research) at the central teaching and learning institute at McMaster University in Canada. Her disciplinary training is in film and cultural studies. Lucy Mercer-Mapstone is a PhD candidate and research co-fellow at the University of Queensland, Australia. She currently leads the collaborative design of a university-wide Students as Partners program that aims to embed a culture of partnership at the institution.

You can find out more about our study: Breaking Tradition Through Partnership: Navigating Identities and Dissonance in Student-Staff Partnerships in the EuroSoTL Conference Proceedings, p296.

Catherine McConnell (a, *), Elizabeth Marquis (b), Lucy Mercer-Mapstone (c)

a Centre for Learning and Teaching, University of Brighton, Sussex, England. C.McConnell@brighton.ac.uk

b Arts & Science Program and MacPherson Institute, McMaster University. Hamilton, ON, Canada. beth.marquis@mcmaster.ca

c Institute for Teaching and Learning Innovation, University of Queensland, St Lucia, Brisbane, 4072, Australia, l.mercermapstone@uq.edu.au, orcid.org/0000-0001-7441-6568

*Corresponding author

 

Blog References

Adams, T. E., Holman Jones, S., & Ellis, C. (2015). Autoethnography. Oxford: Oxford University Press.

Chang, H., Ngunjiri, F., & Hernandez, K.-A. C. (2013). Collaborative autoethnography. London: Routledge.

Jasper, M. (2013) Beginning Reflective Practice. (2nd edition) Hampshire: Cengage Learning.

Johns, C. (2000) Becoming a Reflective Practitioner. Oxford: Blackwell Scientific Publications.

University of Brighton. (2011). Critical Incident Analyses. Brighton: University of Brighton. 



Theoretical Pattern-Matching in SoTL: SoTL Methods Series #1

This blog serves as the beginning of a four-week focus on unique research methods for SoTL work. Enjoy, and please feel free to write to our guest bloggers with any feedback or questions! -Jen Friberg, blog editor

Written by: Bill Anderson, Associate Professor of Family and Consumer Sciences at Illinois State University (jander2@ilstu.edu)

I recently completed a SoTL project where I was hoping to create vicarious, but meaningful, applications of classroom learning, in this case, foundational theories of the human development field. In an attempt to accomplish this, I utilized interrupted case studies (ICS), a progressive disclosure of information viewed as problem-based learning over time. Over an eight week period following a pre-test application, students viewed a longitudinal series of interviews as an ICS. This series followed several participants from the time they were seven years old in 1964, revisiting them every seven years until age 56 in 2013. During the process, and using the assumptions, concepts, and language of assigned developmental theorists, students described and applied relevant theoretical positions to anticipate growth and change as this collection of real lives progressed. Their work was submitted in weekly reflective essays. At the end of the eight-week assignment, post-test results indicated that the method was quite successful but told me nothing further. The post-test increase could simply be the result of memorizing the material. Therefore, pattern-matching was applied to further examine those results.

Pattern-matching is a lesser-known but dependable procedure for theory testing with case studies. It is regularly recommended for reconciling mixed methods and data sources in case study research, and for boosting the rigor of the study. The overarching goal is to explain relationships between key points (in this case, the pre/post results) by comparing an identified theoretical pattern with an observed pattern.

The previously mentioned weekly student essays served as the observed pattern. These included descriptions of their assigned interview participants, appraisals of their most recent developmental predictions for this person, and their expectations for the next seven years. The essays were coded line-by-line to determine the degree of matching to a predetermined theoretical pattern. In this case, Bloom’s Revised Taxonomy (Anderson & Krathwohl, 2001) provided the coding categories: 1 – Remembering, 2 – Understanding, 3 – Applying, 4 – Analyzing, 5 – Evaluating, and 6 – Creating. Average use of these levels could show a general progression from simple remembering (e.g., defining, telling, listing) to application (identifying, selecting, organizing) to creating (imagining, elaborating, solving). Once the essays were coded, interrater reliability was assessed using the intra-class correlation coefficient function of SPSS v. 20 to produce a kappa score, with .80 deemed reliable.
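Readers without SPSS can compute a comparable agreement statistic directly. The sketch below (ours, not the study’s; the two raters’ Bloom-level codes are invented for illustration) computes Cohen’s kappa, a common chance-corrected agreement measure for two raters on categorical codes. Note that this is a simpler statistic than the intra-class correlation mentioned above, though it serves the same screening purpose.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters on categorical codes."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of exact agreement.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical Bloom levels (1-6) assigned to ten essay lines by two raters.
rater_1 = [3, 3, 4, 5, 5, 6, 2, 3, 4, 5]
rater_2 = [3, 3, 4, 5, 4, 6, 2, 3, 4, 5]
print(round(cohens_kappa(rater_1, rater_2), 2))  # 0.87
```

Here the invented raters disagree on one line out of ten, yielding a kappa above the .80 threshold the post treats as reliable.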

Results, the observed pattern, allowed me to see a progression toward more complex reasoning in the assignment as the class went on and students gathered more information and became more comfortable with theory application. Briefly stated, the first essays indicated an average response at Bloom’s applying level. Students were identifying and correctly applying concrete elements of the theories and making tentative, but informed, inferences. However, by the final essay the average response level was solidly at the evaluating level. There, students were appraising the flexibility of the theories being applied along with the documentary participant they were following. It became more common to see students suggest multiple possibilities in their writing, prioritize these, and determine the most informed interpretation. Consequently, pattern-matching indicated an established theoretical progression in reflective thinking from pre- to post-test.

Still, very few specific examples of best practice exist for pattern-matching (Almutairi et al., 2014), and applications in SoTL (and education, in general) are rare. However, there are a number of available theories that could be considered as an identified pattern. For instance, I am currently using William Perry’s (1999) scheme of intellectual development during the college years as a pattern basis in order to better understand contemporary students’ willingness, or unwillingness, to discuss racism in the classroom. Perry’s scheme is noticeably related to Bloom’s work, though somewhat better suited to assessing student readiness to learn. Lastly, there are several other established variations of pattern-matching. If you are interested, a good place to begin would be Robert K. Yin’s (2009) Case study research: Design and methods.

References

Almutairi, A.F., Gardner, G.E., & McCarthy, A. (2014). Practical guidance for the use of pattern-matching technique in case-study research: A case presentation. Nursing and Health Sciences, 16, 239-244. doi: 10.1111/nhs.12096.

Anderson, L. W., & Krathwohl, D. R. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. London: Longman.

Perry, W. (1999). Forms of ethical and intellectual development in the college years: A scheme. San Francisco: Jossey-Bass.

Yin, R. K. (2009). Case study research: Design and methods (4th ed.). Thousand Oaks, CA: Sage.

 

 

 



Making the Case for SoTL Self-Advocacy in Academic Job Searches

Written by Jennifer Friberg, Cross Endowed Chair in SoTL and Associate Professor, Communication Sciences and Disorders at Illinois State University

Two close friends are currently on the job market. Presently employed as associate professors (or close to that rank!), each has chosen to seek novel adventures elsewhere in academia. Thus, they are in the process of sifting through position descriptions and polishing their teaching and research statements to submit in the coming weeks, following a course familiar to many in higher education.

Each of my friends has been a successful disciplinary researcher as well as a productive SoTL scholar, though both represent disciplines that do not consistently value SoTL. As they contemplate phone interviews and campus visits in the near future, they have wondered aloud about how they might contextualize their SoTL work in a way that positions them well in their job searches. While it’s disappointing to me that they have to consider this issue (SoTL should be uniformly valued!), I recognize it’s likely very necessary and, in fact, is smart preparation for their respective job searches.

With that in mind, the contextualization my friends seek as a framework for their SoTL work could actually be a form of SoTL self-advocacy, which I’d define as anything a person does to describe the value of their SoTL work to relevant stakeholders. SoTL self-advocacy might look different across contexts, but in the milieu of a job search, there are definitely steps my friends could take to share the appeal and impact of their SoTL work. Specifically, I would advise each (and would tell others!) that they might do the following to engage in SoTL self-advocacy:

  • Closely examine the mission/vision statements for any institution of interest. Mine these statements for alignment with past and current SoTL projects. Prepare an explanation for how your own SoTL work meshes with or could advance the mission/vision of the institution of interest.
  • Read the strategic plan carefully, noting where your SoTL work matches with current initiatives/efforts being undertaken by your institution of interest. Seek out mentions of student involvement in research, evidence-based instruction, or broad definitions of scholarship. Use these to frame your SoTL work and plans for the future in the interview process.
  • Reflect on the impact of your SoTL work. Many classroom-based SoTL studies lead to changes in curriculum, teaching methods, course design, etc. Be prepared to discuss how SoTL has impacted your teaching or your students’ learning. Have there been impacts to your current department (e.g., curricular changes) that resulted from your SoTL work? These would be important to describe.
  • Closely examine the vitas (hopefully archived online!) of faculty in the department/school/unit you might be looking to join. Determine if any individuals are SoTL-active or could serve as collaborators in the future. Be prepared to talk about how you would engage colleagues in institution-relevant collaborative projects.
  • Visit websites for the teaching and learning center, provost’s office, and/or any other important campus units affiliated with your institution of interest. Look for prompts that support SoTL and precedents for SoTL engagement. Identify individuals on campus who might be in the position to discuss SoTL opportunities at the institution and (if appropriate) attempt to have them included in your on campus interview (or express interest in meeting them if they are not in your interview).

Other ideas for SoTL self-advocacy on the job search? Please post below in the comments section!



How Are We SoTL-ing?

Written by: Jennifer Friberg, Cross Endowed Chair in SoTL and Associate Professor of Communication Sciences and Disorders at Illinois State University 

In the run-up to ISSoTL 2017 last week in Calgary, Alberta, Canada, it might have been easy to miss that the latest issue of Teaching and Learning Inquiry (TLI), the journal of the International Society of the Scholarship of Teaching and Learning, has just been published. I had the opportunity to read several articles in this issue prior to traveling to the conference and was particularly interested in one article, Survey of Research Approaches Utilised in the Scholarship of Teaching and Learning Publications, which was co-authored by Aysha Divan (U. of Leeds), Lynn Ludwig (U. of Wisconsin-Stevens Point), Kelly Matthews (U. of Queensland), Phillip Motley (Elon U.), and Ana Tomljenovic-Berube (McMaster U.).

Why the interest? As a SoTL faculty/student developer, I am forever asked if there is a “preferred” method for engaging in SoTL. I have always addressed this topic from an anecdotal perspective, simply telling novice SoTL scholars that qualitative, quantitative, and/or mixed methods are all equally appropriate for SoTL, depending on the “fit” of the method to the study aims/design. With this paper, a bit more clarity was offered as a result of systematic study of three years of published SoTL journal articles.

Honestly, I imagined that there were far more qualitative methods employed in SoTL research than quantitative; however, I was incorrect. Overall, 223 articles from the following journals were studied: International Journal for the Scholarship of Teaching and Learning, the International Journal for Teaching and Learning in Higher Education, and the Journal of the Scholarship of Teaching and Learning. Across these articles, there was an almost even balance of quantitative, qualitative, and mixed methods research (see graphic below).

[Graphic: near-even breakdown of quantitative, qualitative, and mixed methods across the 223 articles studied]

Of even greater surprise to me were the following findings:

  • 84% of papers utilized a single data source (primarily students), which leaves the need for triangulation of data open for consideration in future project planning.
  • Data from mixed methods studies were often poorly integrated, with only 30% of studies fully integrating qualitative and quantitative data in the discussion of findings.
  • 65% of studies relied on a single “snapshot” of data (data collected at one time only), which suggests the value of, and need for, collecting longitudinal data to study student learning over time.

At ISSoTL last week, Gary Poole delivered a plenary address reminding us all that as professionals interested in SoTL, we have a choice to facilitate or hinder as we collaborate and mentor. As a professional developer for faculty and students interested in SoTL, I intend to share this information as a facilitative effort to grow SoTL at ISU (and beyond), helping future SoTL scholars to be mindful of trends, needs, and considerations in SoTL publishing. Specifically, I will urge SoTL researchers to:

  1. Seek out a “Goldilocks” fit to connect their research questions to the type of data they collect. Why? This allows a researcher to determine whether the research question(s) being posed are best answerable with qualitative, quantitative, or mixed methods approaches. A good fit is critical for a study to make sense to interested stakeholders.
  2. Ensure that data come from as many direct data sources as are necessary to form a strong foundation for any discussion of results/implications.
  3. Use indirect data sources primarily as support/triangulation for data collected from direct sources.
  4. Think carefully and critically about how data from a study is discussed. If the design selected has a mixed methods approach to data collection, then all aspects of data should be explored in an integrated manner to identify trends and accurately interpret and report data across the board.
  5. Consider whether data collected at multiple data points might be more appropriate for a study than a “one-time” data collection effort in order to best answer the research question(s) being posed.

 

Blog References:

Divan, A., Ludwig, L. O., Matthews, K. E., Motley, P. M., & Tomljenovic-Berube, A. M. (2017). Survey of research approaches utilised in the scholarship of teaching and learning publications. Teaching and Learning Inquiry, 5(2).



Musings on SoTL Peer Mentorship

Written by Jennifer Friberg, Cross Endowed Chair in SoTL and Associate Professor of Communication Sciences & Disorders at Illinois State University

Recently, I worked with faculty at Bradley University to develop a framework for and guidance in SoTL peer mentoring. Bradley is working diligently to increase engagement in SoTL and has adopted a “grow their own” approach to this effort, selecting faculty who have been SoTL-productive to mentor other faculty members interested in becoming SoTL scholars. The process of preparing for this undertaking led me to (over time) merge my anecdotal experiences as a SoTL mentor with evidence about peer mentoring (in and out of SoTL). I’ve tried to organize some of these reflections below:

  • In preparing my session, I looked toward existing research on peer mentorship in SoTL, finding little. One study I did find was from Hubball, Clark, and Poole (2010), who analyzed ten years of data on SoTL mentoring to identify three critical practices of SoTL mentors: modeling of SoTL productivity, facilitation of mentees’ SoTL research, and engagement in SoTL networking with other SoTL scholars. In terms of my SoTL mentee/mentor experiences, I think the last practice, that of connecting mentees with other SoTL scholars, is critical and often neglected. Introducing novice SoTL scholars to the “commons” of SoTL has the potential to sustain interest, broaden perspectives, and increase engagement in the SoTL movement as a whole.
  • Oftentimes, when I do “intro” workshops to explain SoTL to new students and faculty, there is a perception that SoTL research is very different from disciplinary research. I always explain that while it can be, it really isn’t in many ways! Similarly, I have found that faculty who have extensive disciplinary experience mentoring students and peers struggle to understand that SoTL mentorship really isn’t all that different. The same practices applied to a differently-focused research project can be very successful in helping a novice SoTL researcher gain confidence in conducting research on teaching and learning.
  • Zellers, Howard, and Barcic (2008) found that benefits to mentees engaged in mentorship programs included assimilation to campus culture, higher career satisfaction, higher rate of promotion, and increased motivation to mentor others. While this work was not focused on SoTL, I can easily see how the same tenets might apply to research on teaching and learning, as well. In terms of SoTL research, I’d add that benefits could include opportunities for assimilation to SoTL culture at and beyond the single institutional level as well as the chance to work with mentors and faculty across varied fields of study in a way that isn’t always customary in disciplinary research.
  • Clutterbuck and Lane (2016, p. xvi) state, “to some extent the definition of mentoring does not matter greatly, if those in the role of mentor and mentee have a clear and mutual understanding of what is expected of them and what they should, in turn, expect of their mentoring partner.” This is so true! The most successful peer mentoring relationships I’ve witnessed have strong foundations in clear and regular communication of expectations, progress, bottlenecks, etc.
  • I’ve encountered two types of SoTL peer mentorship frameworks: formal (a set framework for participation and, often, assignment of mentor/mentee pairs) and informal (relationships that develop by happenstance due to opportunity and shared interests). I feel that there are likely benefits to each. Formal mentorship programs are more likely to have stronger administrative support and integration of the program within a strategy for professional development, both characteristics of successful mentoring programs (Hanover Research, 2014). Conversely, informal peer mentoring frameworks allow for voluntary participation, participant involvement in the mentor/mentee pairing process, and the ability for participants to co-develop goals, expectations, and desired outcomes of the mentorship pairing, each also a component of successful mentoring programs (Hanover Research, 2014). So, which is better and why? This might be a very interesting area for future study, as currently, we just don’t know.
  • What makes a successful peer mentor? Awareness of adult learning principles, teaching strategies, and techniques; understanding and acknowledgement of differences in orientation and stage of development between themselves and their mentees; and the ability to plan, observe, and facilitate discussion (Knippelmeyer & Torraco, 2007). It would seem that many folks engaged in SoTL, then, would make excellent peer mentors, as these characteristics are as endemic to SoTL as they are to mentorship!

Blog References:

Clutterbuck, D., & Lane, G. (2016). The situational mentor: An international review of competences and capabilities in mentoring. London: Routledge.

Hanover Research. (2014). Faculty mentoring models and effective practices. Author.

Hubball, H., Clarke, A., & Poole, G. (2010). Ten-year reflections on mentoring SoTL research in a research-intensive university. International Journal for Academic Development, 15(2), 117-129.

Knippelmeyer, S. A. & Torraco, R. J. (2007). Mentoring as a developmental tool for higher education. University of Nebraska-Lincoln teaching center publication.

Zellers, D. F., Howard, V. M., & Barcic, M. A. (2008). Faculty mentoring programs: Reenvisioning rather than reinventing the wheel. Review of Educational Research, 78(3), 552-588.

 



College Rankings, Student Learning, and SoTL: An Unlikely Trio?

Written by Jennifer Friberg, Cross Endowed Chair in the Scholarship of Teaching and Learning, and Associate Professor of Communication Sciences and Disorders at Illinois State University

Last Friday (9/8/2017), the Chronicle of Higher Education published an interesting article written by Richard M. Freeland titled “Stop Looking at Rankings. Use Academe’s Own Measures Instead.” Ostensibly, this piece discusses the role and utility of college rankings such as those published annually by U.S. News and World Report. Freeland explains that there are some measures reported in these rankings that are “legitimate indicators of academic quality,” such as “graduation rates, faculty qualifications, and investment in academic programs.” He goes on to say that other rankings (the federal government’s College Scorecard, extant Integrated Postsecondary Education System data, and the Voluntary System of Accountability) have added important data to the conversation of what makes a college “good” in a world where it’s hard to determine institutional quality. He is undeniably correct. However, for years I have felt as though we have been missing the boat with our current reporting of college rankings, as we seem to in no way account for student learning as part of these metrics. Due in large part to my background in teaching and learning (and my status as the parent of a high school junior seeking a future university home!), this is very frustrating to me, for I want to know more about student learning outcomes at institutions than about many other data points. This is a huge void and something I’ve considered an opportunity for SoTL for a long, long time.

Freeland writes of a “deep resistance within academe to publishing data about what students learn,” providing a historical overview of various standardized measures of intellectual achievement that have been proposed, and rejected, as universal measurements of student learning. And, so, the void in the reporting of student learning as an important data point in college rankings remains. Freeland remarks:

While many colleges have developed programs to assess student learning (often because of accreditation requirements), few systematically collect and even fewer publish quantitative data that allow readers to compare student intellectual achievement across institutional lines. Until this gap is filled, higher education’s systems of accountability will continue to be data-rich but information-poor with respect to the quality of actual learning. The public will be left to rely on commercial rankings as indicators of institutional quality.

Based on all of this, my overarching question is this: as advocates for the scholarship of teaching and learning, the very research that CAN help provide information about student learning in higher education to the public, is it important that we promote SoTL as a potentially valuable contributor to the college rankings discussion?

I’ve struggled with this question all weekend. Here’s where my thinking is at this point:

  • I believe that SoTL does belong in the college ranking discussion. Student learning is our wheelhouse. We need a seat at this table to advocate for and honor outcomes of a diverse and rich field of scholarship on student learning. SoTLists cannot allow student learning to be assessed via a standardized test or any other “one look” measure of performance and expect to tell the whole story. Scholars of teaching and learning recognize this well and can advocate accordingly.
  • I don’t believe that student learning should be ranked. I can see dangers in any discussion of which schools have better learning outcomes than others. Student learning varies by context and discipline, creating a number of limitations on any “best learning outcome” data that could be reported. There is no universal curriculum for higher education. As such, any ranking system of student learning would lack reliability and validity.
  • Years and years of research on teaching and learning have led to the identification of high-impact practices for undergraduate education, and more such information is shared regularly in cross-disciplinary and discipline-specific publications.

So, then, perhaps what we are looking to capture in college rankings shouldn’t specifically focus on student learning outcomes. Every institution of higher education is looking to support student learning, but we must acknowledge that this is accomplished in a manner that honors contextual differences as well as institutional missions and ideologies. With this in mind, any comparison of student learning across institutions may well be akin to comparing apples to oranges.

That said, you CAN measure the use of evidence-based approaches in higher education (e.g., undergraduate research, service learning/community-based learning, internships, and first-year seminars and experiences; Kuh, 2008) as an indicator of instructional quality. SoTL scholars add to the evidence base for teaching and learning daily. We have solid evidence that certain instructional methods work, and data could be collected to reflect how often these pedagogies are used in college classrooms. That might be a metric of interest to various stakeholders; I’m confident there are others, as well. I’m still stewing on this and am curious what others are thinking. This is an important discussion, and one that we, as SoTL advocates, should be a part of. To that end, feel free to comment below or continue the discussion on social media (@ISU_SoTL).

Blog References:

Freeland, R. M. (2017, September 8). Stop looking at rankings. Use academe’s own measures instead. Chronicle of Higher Education. Downloaded from: http://www.chronicle.com/article/Stop-Looking-at-Rankings-Use/241140?cid=at&utm_source=at&utm_medium=en&elqTrackId=4eadfa107c984352bd8664bf86cba24d&elq=869cd22487394fe88b84eb2d7904a1d2&elqaid=15516&elqat=1&elqCampaignId=6640#comments-anchor

Kuh, G. D. (2008). High-impact educational practices: What they are, who has access to them, and why they matter. Washington, DC: AAC&U.

McKinney, K. (n.d.). A sampling of what we know about learning from scholarship of teaching and learning and educational research. Downloaded from: http://sotl.illinoisstate.edu/downloads/materials/A%20Sampling%20of%20What%20We%20Know.pdf