The SoTL Advocate

Supporting efforts to make public the reflection and study of teaching and learning at Illinois State University and beyond…



The SLaM Model of Applying SoTL In and Beyond One Classroom

Written by Kathleen McKinney, Illinois State University, Emeritus and Jennifer Friberg, Illinois State University 

In this blog post we share a model for the application of scholarship of teaching and learning findings in and beyond the individual classroom. The model, named SLaM, is detailed in the Introduction chapter of our edited book, Applying the Scholarship of Teaching and Learning Beyond the Individual Classroom (Indiana University Press, 2019, in press). The focus of that volume is on SoTL and its application beyond one classroom, but the SLaM model addresses application at any level. We define SoTL using both our institutional definition, "the systematic reflection/study of teaching and learning made public," and key characteristics: it is practitioner-led action research/reflection, usually about the instructor-researchers' own students and/or students in their discipline, and most often conducted at the local level. We understand application as the use of SoTL research findings and implications to design, change, intervene, make decisions, etc., primarily in institutions and disciplines, to enhance teaching and student learning.

The SLaM model is an outgrowth of our early discussions of application at various levels (e.g., Friberg & McKinney, 2015, 2016; McKinney 2003, 2007, 2012).[1] We then organized and built on those ideas, as we wrote for and edited our latest book, to create the SLaM model. The model uses three questions to conceptualize, categorize, and understand the use of SoTL results/knowledge in applications to teaching and learning. We briefly note these questions here; a more detailed discussion, diagram, and examples of the model can be found in the Introduction to our edited book (see endnote 1 below for the citation for the model).

  1. What is the source of the SoTL that is applied? The “S” in our SLaM framework is connected to identifying the source(s) of SoTL findings being applied. SoTL research results that are applied at various levels may be from the teacher’s original scholarship of teaching and learning studies, SoTL work by colleagues, the synthesis of presented or published SoTL research in the discipline/institution/larger SoTL field, or some combination of these sources of SoTL results and implications.
  2. At what level(s) are the data/results/implications applied? There are numerous levels (the “L” in our framework) at which SoTL findings and implications could be applied to positively impact teaching and learning. These levels include the individual classroom, course/module, program, department, college, co-curricular, institutional, disciplinary, multi-institutional, and multi-disciplinary levels.
  3. What mechanisms or processes are used (or could be used) to apply the SoTL data/results/implications to new areas or contexts at various levels? The “M” in our SLaM framework represents the many mechanisms that exist or could be created that can be used as processes for novel applications of SoTL findings. A few examples include assessment, quality assurance, course/program design or redesign, accreditation, budget development, strategic planning, faculty/staff development, interdisciplinary initiatives, and graduate student training.

In our forthcoming edited book, eleven examples of the application of SoTL are described: two in our Introduction and nine in the contributed chapters. We briefly summarize three of these examples and their fit with our model here. First, in their chapter "Reflexivity in the Field: Applying Lessons Learned from a Collaborative Scholarship of Teaching and Learning Study Exploring the Use of Reflexive Photography in Field Education," Brent Oliver, Darlene Chalmers, and Mary Goitom of Mount Royal University in Canada use findings and implications from face-to-face interviews with students from multiple institutions (source). They apply what they learned at the course, program, and department levels using curricular reform, program review, and accreditation (mechanisms). They are planning additional applications in a new interdisciplinary fellowship program and via faculty development programs.

Another example comes from Belgium. In the chapter "'Feedback First Year': A Critical Review of the Strengths and Shortcomings of a Collective Pedagogical Project," Dominique Verpoorten, Laurent Leduc, Audrey Mohr, Eléonore Marichal, Dominique Duchâteau, and Pascal Detroz describe their sources of SoTL findings: SoTL literature on feedback practices as well as original data from interviews with faculty members participating in SoTL staff development programs, observations and diaries of advisers, minutes of meetings, and descriptive templates of project outcomes. Levels of application included individual courses, faculties/departments (groups of courses; programs), and the institution. The mechanisms they used for application were specific course-redesign tasks (designing feedback activities by faculty participants), a variety of course interventions, and sharing results in departments via meetings and plenaries.

Finally, contributors Claire Vallotton, Gina A. Cook, Rachel Chazan-Cohen, Kalli B. Decker, Nicole Gardner-Neblett, Christine Lippard, and Tamesha Harewood share their SoTL applications in "The Collaborative for Understanding the Pedagogy of Infant/toddler Development: A Cross-University, Interdisciplinary Effort to Transform a Field through SoTL." Their project used implications from past SoTL literature, reflection, and original SoTL studies on multiple campuses (sources) at the course, program, department, and disciplinary levels. The application mechanism was a cross-institutional, collaborative group of scholars (CUPID) whose participants shared resources, conducted research, and disseminated work via conferences, workshops, publications, and meetings.

We hope readers of this blog post will take a look at the details of the SLaM model and the interesting projects and applications from around the globe presented in the edited volume. We welcome feedback on the model and hope others will find it useful in their SoTL research and applications.

Blog References

Friberg, Jennifer C., and Kathleen McKinney. 2016. “Creating Opportunities for Institutional and Disciplinary SoTL Advocacy and Growth.” Presentation. SoTL Commons Conference, Savannah, GA, USA.

Friberg, Jennifer C., and Kathleen McKinney. 2015. “Strengthening SoTL at the Institutional and Disciplinary Levels.” Poster presentation. EuroSoTL, Cork, Ireland.

McKinney, Kathleen. 2012. “Making a Difference: Applying SoTL to Enhance Learning.” The Journal of the Scholarship of Teaching and Learning 12(1): 1-7.

McKinney, Kathleen. 2007. Enhancing Learning through the Scholarship of Teaching and Learning: The Challenges and Joys of Juggling. San Francisco: Jossey-Bass.

McKinney, Kathleen. 2003. “Applying the Scholarship of Teaching and Learning: How Can We Do Better?” The Teaching Professor August-September:1,5,8.

 

[1] As discussed in the Introduction to our edited book, the SLaM model overlaps slightly with the 4M model (Poole and Simmons, 2013; Wuetherick and Yu, 2016). Our initial presentations and writings on the SLaM model, however, predate the 4M model, and the two models are distinct in various ways.

 




Taking a Scholarly Approach to the New Academic Term

Written by: Jennifer Friberg, Cross Endowed Chair in SoTL and Associate Professor of Communication Sciences and Disorders at Illinois State University

Many of us are anticipating (or maybe already experiencing!) a new academic term. My fellow Redbirds have one more week before we are back in the classrooms of Illinois State University. Recent conversations with colleagues have revolved around course design/prep and general thoughts about the upcoming semester. I'm guessing this is the case at most colleges and universities.

For me, the weeks before a new term are always times of reflection and consideration. I ask myself questions like: What worked last time I taught this class? What didn't work? How can I engage more students in a way that makes sense for my course and my course design? Again, I'm guessing that I'm not alone in pondering these topics. And, while we can choose to answer these questions via SoTL inquiry, that isn't always possible for a number of reasons (resources, competing priorities, etc.). Thankfully, there is ample research on teaching and learning that we can apply to help answer these questions — we just have to access it!

The following resources each describe the evidence base for common beginning-of-term questions: How do I construct a syllabus? How will my students best learn? What is the advantage of various grouping strategies for my students? What are "best" practices for the first day of class? Happy reading and have a great term!

The Center for Teaching at Vanderbilt University constructed a very useful webpage to highlight important, evidence-based considerations for syllabus construction, addressing questions such as:

  • What are the most important elements of a learner-centered course syllabus?
  • What is the relationship between syllabus construction and course design?
  • How can the tone of the syllabus impact learners?
  • What other resources are available to support faculty in constructing “good” syllabi?

Indiana University of Pennsylvania has gathered a reference list of "what to do on the first day of class," drawing on research and evidence from several disciplines (e.g., sociology, psychology, calculus, English).

Kathleen McKinney collated a sampling of things we know about learning from SoTL research, outlining findings from seminal texts in teaching and learning from the last decade.

Rick Reis from Tomorrow's Teaching and Learning offers suggestions — grounded in evidence — for establishing collaborative groups of students and, in so doing, outlines the pros and cons of random, instructor-generated, self-selected, and mixed groups.

 

Public domain photo downloaded from: https://pixabay.com/en/teach-word-scrabble-letters-wooden-1820041/



Application of SoTL: Sharing Results with Students

Written by Susan Hildebrandt, Associate Professor of Applied Linguistics/Spanish, Department of Languages, Literatures, & Cultures at Illinois State University 

“Understanding World Language edTPA,” a two-hour workshop I presented at the annual meeting of the Illinois Council on the Teaching of Foreign Languages (ICTFL) in Tinley Park, focused on the content-specific student teacher performance assessment purported to measure beginning teacher readiness. edTPA became consequential for every individual seeking teacher licensure in the state of Illinois in September 2015. Student learning was central to this workshop as it explored how ISU world language teacher candidates performed on edTPA. This systematic study of ISU student learning is timely for world language teacher education programs throughout the state. By examining and sharing my students' performance on the standardized edTPA, I enabled a state-wide audience to learn from their triumphs and challenges. The workshop also served as an opportunity for a variety of audiences to get a wider view of edTPA, its origin, and its use.

The intended audience for this presentation was world language teacher education coordinators or world language pedagogy instructors and faculty, but a much more diverse audience attended the session. Five of the nine attendees were world language teacher candidates from across the state who were taking pedagogy classes this semester and intended to student teach during the spring of 2017. The purpose of this workshop was originally to help world language teacher education programs get their students ready for edTPA. Instead, I got to go straight to the intended audience. I was able to communicate my students' edTPA outcomes to teacher candidates directly and to offer practical suggestions about how to be more successful at demonstrating effective K-12 teaching practices. I was able to point out the areas in which my candidates were successful and those in which they struggled. I was able to share resources that were of particular value to my teacher candidates here at ISU.

The workshop deconstructed edTPA with an exploratory quantitative study in which I examined the edTPA scores of world language teacher candidates (N = 34) and compared them to the known cut scores for states in which edTPA is a requirement for licensure. Results indicated that participants performed best in the planning section of the assessment and were most challenged by the assessment section. All participants earned scores above the current minimal cut score for Washington state, and all but two would pass in New York state. The workshop also highlighted ways of approaching the three required tasks, along with logistical guidance for videotaping and writing the extensive commentaries for each task.

The teacher candidate attendees expressed great interest in these results, as more than one intended to teach in another state. As a result of their interest, I decided to bring my findings back to my own class. I had intended to talk with them about edTPA that day, but I hadn't intended to give them a research presentation. And yet, I did. And I think they enjoyed it. It's not often they get to peek behind the curtain of a teacher education program and see how we use data to improve practice. I hope, too, that my teacher candidates can use this experience to learn to analyze their own classroom-based data, one of the skills assessed in edTPA.



SoTL Think Tank: Fostering Cross-Program Collaboration Within a Discipline

Written by Jerry K. Hoepner, Associate Professor (hoepnejk@uwec.edu) and Abby Hemmerich, Assistant Professor (hemmeral@uwec.edu) at the University of Wisconsin – Eau Claire

The American Speech-Language-Hearing Association (ASHA) Academic Affairs Board (AAB) released a report on the role of undergraduate education in Communication Sciences and Disorders (CSD) in June 2015. Among the concerns addressed by this report is the need to align curriculum and pedagogy across programs. A lack of consistency across programs constrains the portability of a CSD degree to other undergraduate and graduate programs, as well as its generalizability to related educational and healthcare professions.

While this is a discipline-specific example, it is a challenge that faces many disciplines. In today's educational context, students increasingly seek flexibility in how they assemble their education and in the programs that deliver it. This blog post addresses one program's attempt to foster collaboration across institutions operating in the same state university system.

The University of Wisconsin Systems SoTL Think Tank sought to initiate a consortium of faculty from six state programs in CSD. The program was initiated in the spring of 2015 through a UW Systems conference development grant by the Office of Professional Instruction and Development (OPID). The initial intent of the consortium was to share information about current teaching strategies, develop a network of faculty interested in incorporating SoTL research in their programs, encourage sharing of resources and content expertise, foster research and teaching collaboration between programs, increase SoTL and pedagogical knowledge across system programs, and conjointly develop plans for future collaboration (see figure 1).

Figure 1. Purpose and goals of the UW Systems SoTL Think Tank.


Prior to the one-day seminar, attendees responded to a Qualtrics survey about their previous and current perspectives on and experiences with SoTL and pedagogy. Most respondents indicated that collaboration happened within their own departments on their own campuses, but less often across campus or with similar programs on other campuses (see figure 2). Most attendees felt their home departments valued discussions of SoTL and encouraged research in this area, but implementation of teaching observations was less common (see figure 3).

Figure 2. Pre-conference collaboration data.


Figure 3. Perceived value of SoTL at home institution.


A moderator from the host university’s Center for Excellence in Teaching and Learning guided discussions following the framework below:

  • Meet and greet. An informal discussion paired with refreshments allowed the attendees to get to know one another prior to deeper discussions of pedagogy.
  • Discussion of selected readings from a disciplinary SoTL text (Ginsberg, Friberg, & Visconti, 2012). Initial discussions of the text allowed attendees to share pedagogical philosophies and connect academic and clinical teaching. Attendees worked within small groups to share experiences and insights related to instruction.
  • What is SoTL and where are people at the outset? Reflecting on previous experiences with teaching and learning, through the lens of readings from the Ginsberg et al. text, attendees identified the aspects of SoTL that matched their current understanding and where they hoped to be. As figure 4 shows, attendees' conceptualization of the intersection between teaching and SoTL migrated throughout the day from a focus on teacher-learner interactions and pedagogical content knowledge towards evidence-based education and SoTL.

Figure 4. Attendee conceptualizations of the teaching, pedagogy, and SoTL continuum.


  • SoTL and pedagogy in the discipline. Discussions of the role of SoTL in the discipline, implementation of evidence-based pedagogies, and signature pedagogies within the discipline took place as attendee conceptualization evolved.
  • Action Plans. Following a framework designed by the hosts of the think tank, we worked to assemble dreams (i.e., what would you do if time, money, and other resources were not a limiting factor), goals (e.g., what specific steps will you take next), and potential collaboration surrounding research and teaching interests and needs (see figure 5). Each attendee defined a plan for implementing SoTL at some level into his/her teaching or research for the following academic year.

Figure 5. Action plan (left) and examples of lessons to share (right).


  • Brag N' Steal. Attendees each brought an innovative lesson to share and discuss. As they presented their lesson plans, discussion followed on how others might draw upon those principles for lessons in their own content areas (e.g., how a lesson for an adult neurogenic disorders class could be modified for a child language development course). Examples are shown in figure 5 above.

Several projects and plans for follow-up were initiated. These included:

  1. A survivor speaker series exchange which has already hosted its first speaker
  2. A faculty speaker-exchange
  3. A presentation at our disciplinary annual conference in November 2015
  4. A presentation at the UW Systems conference in April 2016
  5. A plan to meet again the following spring, hosted by another program within the system

The program was intended to foster inter-program pedagogical and research collaborations. The conference included one full-day interaction designed to review a framework for SoTL research and pedagogical enhancement in CSD. Faculty with expertise in similar content areas were able to connect for future collaboration on teaching resources as well as research. Further, commonalities across program curricula provided a basis for initiating discussions of inter-program curricular consistency and compatibility. This could enable students to move seamlessly between system programs (e.g., undergraduate to undergraduate program, undergraduate to graduate school).

Implications & Potential Extensions

This program attempted to initiate a collaboration of disciplinary programs across a system. While not all universities are part of a state system, as are the programs we have described, most programs will have state and regional affiliates in their discipline with whom they may wish to collaborate. As you can see, this is not a process that is quick to implement. Our work thus far represents merely a few steps towards our ultimate goals of producing portable curricula, shared standards, cross-program collaboration, and shared expertise. Achieving those lofty goals begins with those initial connections and conversations.

Blog References:

American Speech-Language-Hearing Association Academic Affairs Board. (2015). The role of undergraduate education in communication sciences and disorders: Final report. Retrieved from http://www.asha.org/uploadedFiles/AAB-Report-Role-Undergrad-Ed-CSD.pdf

Ginsberg, S., Friberg, J., & Visconti, C. (2012). Scholarship of teaching and learning in speech-language pathology and audiology: Evidence-based education. San Diego: Plural Publishing.

 



Application of SoTL: Strategies to Encourage Metacognition in the Classroom

Written by Jen Friberg, SoTL Scholar-Mentor, Illinois State University


Recently, I have been doing a good deal of reading about various evidence-based strategies to teach for metacognitive understanding in my graduate and undergraduate courses, knowing that when students are explicitly “thinking about their thinking,” they have the capacity to learn more, extend learning beyond the classroom, and integrate information across contexts more easily.

In 2012, Kimberly Tanner published a paper titled Promoting Student Metacognition. In this paper, Tanner reviews strategies for the explicit teaching of metacognitive strategies to build a culture of "thinking about thinking" within her biology classes. She posits that thinking like a professional requires students to be metacognitive, making teaching about thinking arguably as important as teaching specific course content.

In terms of specific strategies, Tanner provides "sample self-questions to promote student metacognition about learning" (p. 115). She organizes these into three categories (planning, monitoring, and evaluation) across four specific contexts (class session, active learning task/homework assignment, quiz/exam, and overall course) for a total of 51 specific questions that students can ask themselves to evaluate their learning processes. These include:

  • What resources do I need to complete the task at hand? How will I make sure I have them?
  • What do I most want to learn in this course?
  • Can I distinguish important information from details? If not, how will I figure this out?
  • To what extent am I taking advantage of all the learning supports available to me?
  • Which of my confusions have I clarified? How was I able to get them clarified?
  • How did the ideas of today’s class session relate to previous class sessions?
  • What have I learned about how I learn in this course that I could use in my future courses? In my career?

The utility of providing these questions to students for their use is undeniable, but we cannot be certain that students will take the opportunity to become more metacognitive on their own. Tanner advocates for sharing these questions with students AND embedding them into existing assignments and learning opportunities to build a habit of reflection, which can lead to more routine thinking about learning.

It is with a more explicit intention to directly encourage metacognitive thought that Ron Ritchhart, Mark Church, and Karin Morrison wrote the 2011 text Making Thinking Visible: How to Promote Engagement, Understanding and Independence for All Learners. Ritchhart, Church, and Morrison advocate for having students think about their own thinking through the implementation of thinking routines that make thinking more "visible" as part of the learning process. They suggest different levels of routines to drive different sorts of thinking: introducing/exploring ideas, synthesizing/organizing ideas, and digging deeper into ideas. All in all, 21 different strategies for classroom use are described, including:

  • See-Think-Wonder (p. 55): A strategy for introducing and exploring information that has students asking themselves three questions when observing a new object/artifact — What do you see? What do you think is going on? What does it make you wonder?
  • Connect-Extend-Challenge (p. 132): A strategy for encouraging synthesis and organization of ideas that asks students to consider what they have just read/seen/heard, then ask themselves: How are the ideas and information presented connected to what you already knew? What new ideas did you get that extended or broadened your thinking in new directions? What challenges or puzzles have come up in your mind from the ideas and information presented?
  • Circle of Viewpoints (p. 171): A strategy for digging deeper into ideas that requires students to consider different perspectives that could be present in or affected by a given topic by considering the following: 1) I am thinking of [name of the event/issue] from ______ point of view, 2) I think [describe topic from your viewpoint] because _______, and 3) A question/concern I have from this viewpoint is ________ .

Do you think these strategies could be useful in your teaching context? When working with your students in academic, clinical, or outside the classroom/clinic situations, what strategies are you using to encourage metacognition? Have you studied these strategies in any way? We’d love to hear more in the comments below!

Blog References:

Ritchhart, R., Church, M., & Morrison, K. (2011). Making thinking visible: How to promote engagement, understanding, and independence for all learners. San Francisco: Jossey-Bass.

Tanner, K. D. (2012). Promoting student metacognition. CBE – Life Sciences Education, 11, 113-120.



SoTL Applied: Taking a Metacognitive Approach to Teaching and Learning

Written by Jen Friberg, SoTL Scholar-Mentor at Illinois State University

Recently, I came across a blog post written by Ed Nuhfer titled Developing Metacognitive Literacy through Role Play: Edward de Bono's Six Thinking Hats. I will admit to being intrigued, having not heard of this approach before. Nuhfer's post focused on de Bono's "Six Thinking Hats" as a foundation for training students to learn via perspective taking. He describes six "hats" that students can wear and suggests ways course instructors can implement the use of these "hats" to:

  • urge students to present factual evidence about a given course topic
  • advocate for the use/implementation/acceptance of the topic being discussed
  • challenge the use/implementation/acceptance of the topic being discussed
  • express emotion to share positive, negative and/or neutral feelings about a specific course topic
  • question assumptions and/or challenge peers to think differently about a given course topic
  • reflect and increase awareness on a given course topic

Each of these approaches asks students to demonstrate their understanding of course content in a different manner, channeling their metacognitive (i.e., "thinking about thinking") learning processes. Nuhfer argues that by engaging in this form of perspective taking, students become more self-aware as learners and can deepen their learning of specific course content. SoTL researchers have advocated for such metacognitive approaches to learning for decades, indicating that:

Integration of metacognitive instruction with discipline-based learning can enhance student achievement and develop in students the ability to learn independently (Donovan, Bransford, & Pellegrino, 1999, p. 17).

While the Six Hats method for teaching via metacognition is one approach course instructors can adopt, many sources (e.g., Ambrose et al., 2010; Lovett, 2008) agree that metacognitive learning can be achieved in many different learning contexts when students engage in the following activities:

  1. demonstrate the ability to assess the demands of a learning task
  2. evaluate their knowledge and skills for the task at hand
  3. plan an appropriate approach to undertake the learning task
  4. self-monitor their learning progress throughout the task
  5. make adjustments to their approach to learning as they work towards task completion

How are you emphasizing metacognitive learning in your classrooms? What processes are you using? How are you mediating these processes to allow students to understand your expectations and practice the above-mentioned metacognitive skills? We'd love to hear about your experiences — so please comment below!

Blog Resources:

Ambrose, S. A., Bridges, M. W., Lovett, M. C., DiPietro, M., & Norman, M. K. (2010). How learning works: Seven research-based principles for smart teaching. San Francisco: Jossey-Bass.

Donovan, M. S., Bransford, J. D., & Pellegrino, J. W. (Eds). (1999). How people learn: Bridging research and practice. Washington DC: National Academy Press.

Lovett, M. C. (2008, January). Teaching metacognition [featured presentation at Educause conference]. Retrieved from: http://www.educause.edu/eli/events/eli-annual-meeting/2008/teaching-metacognition

Nuhfer, E. (2015). Developing Metacognitive Literacy through Role Play: Edward de Bono's Six Thinking Hats [blog post]. Retrieved from: http://www.improvewithmetacognition.com/developing-metacognitive-literacy-through-role-play-edward-de-bonos-six-thinking-hats/



Walk the Talk SoTL Contest Summary from 2nd Winning Team

Written by Maria Moore (COM), Cheri Simonds (COM), Lance Lippert (COM), Kevin Meyer (COM), Megan Koch (COM), and Derek Story (Director, HR Systems) at Illinois State University

On Wednesday, April 29, 2015, a reception was held to honor two award-winning teams from our "Walk the Talk" contest, which recognized the best team or academic unit that applied SoTL research results/literature beyond the individual classroom to solve a problem, achieve a goal, or exploit an opportunity resulting in improved teaching or enhanced student learning at Illinois State University.

A summary of the first award-winning project was featured in a recent blog post. Today's post highlights the work done by faculty from the School of Communication to create a new teaching evaluation instrument. This project, titled Creating a New Teaching Evaluation Instrument for the School of Communication, is summarized below, along with an EXCELLENT list of SoTL literature that readers interested in teaching evaluation can use, as appropriate:

What problem(s), goal(s) or opportunity(s) did your team seek to solve or exploit?

A teaching evaluation instrument is a critical strategic resource for ensuring quality teaching within a division. It affects practical issues such as promotions and raises, and it is fundamental to assessing teaching quality.

We identified that our existing SoC instrument should be reviewed and updated. The assignment was given to our Teaching Effectiveness Committee, which identified key problems with the old instrument and met bi-monthly to revise it. Members of the committee conducted a pilot test in their own courses at the end of the Fall 2013 semester, using both the old and new instruments with students. The work of the committee, based on the successful pilot test, resulted in a recommendation for change to the full faculty body in March 2014. The faculty voted unanimously to accept the revision, and the new instrument was implemented across all courses at the end of Spring 2014.

What strategies did your team devise to apply best practices from SoTL work to your problem/goal/opportunity?

We have many SoTL scholars in the SoC and it is a common practice to use SoTL literature as we identify problems or opportunities and devise resultant tactical strategies to solve or exploit them. This opportunity was no exception. At our first meeting we recognized the need to examine current literature and best practices to have external support and validation as a foundation for our recommendations.

We conducted a literature review on the current thinking about teaching evaluations as well as effective teaching practices. In addition to the literature reviewed, we also investigated teaching evaluation instruments from comparable and aspirational programs externally and internally.

What SoTL research (your own, colleagues, or from the literature) did you use to support your strategies?

We purposefully sought and reviewed literature from multiple disciplines, not just communication. We also sought current, as well as seminal scholarship on effective teaching practices. We included research from our own SoC scholars, too.

What were the outcomes and how were they assessed or measured?

Last year, one of our committee members ran a factor analysis and computed scale reliabilities for all of the SoC Spring 2014 data collected with the new teacher evaluation instrument. Results indicate that the new evaluation instrument performs very well in reliability tests, factor analysis, and predictive capability using regression procedures. It also allows us to condense aggregate reports of our evaluations into four categories, or factors.

This analysis resulted in a spreadsheet template distributed to all faculty so they can also calculate their own aggregate factor scores for each subsequent semester. Faculty going for tenure or promotion use this spreadsheet as an element of their materials. Our SFSC also uses this spreadsheet to work with faculty deemed deficient as part of their performance enhancement strategic plan.
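For readers curious about what this kind of analysis can look like in practice, below is a minimal, hypothetical sketch in Python of computing a scale reliability (Cronbach's alpha) and a four-factor solution for a set of evaluation items. The file name, column structure, and use of pandas/scikit-learn are illustrative assumptions only; this is a generic sketch, not the committee's actual procedure.

import pandas as pd
from sklearn.decomposition import FactorAnalysis

def cronbach_alpha(items: pd.DataFrame) -> float:
    # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the total score)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical data file: one row per student response, one column per evaluation item.
responses = pd.read_csv("spring_2014_evaluation_items.csv")

# Overall internal consistency of the full item set.
print("Cronbach's alpha:", round(cronbach_alpha(responses), 3))

# A four-factor solution, mirroring the four aggregate categories mentioned above.
fa = FactorAnalysis(n_components=4, random_state=0).fit(responses)
loadings = pd.DataFrame(fa.components_.T,
                        index=responses.columns,
                        columns=["Factor 1", "Factor 2", "Factor 3", "Factor 4"])
print(loadings.round(2))

In a workflow like the one described above, the items loading on each factor could then be averaged to produce the kind of aggregate factor scores distributed in the spreadsheet template.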

Please briefly reflect on the impact of this experience upon your team; in particular consider the specific role of the SoTL literature on your outcomes or consequences.

We accepted the task of revising our evaluation instrument with great humility and a sense of tremendous responsibility. Simply put, we were entrusted with our colleagues' futures, as the evaluation instrument is one of the most important elements used to assess a teacher's success or lack thereof. A foundation of SoTL literature (as we began and then navigated this responsibility) was both empowering and liberating. Empowerment came from the knowledge we gained about best practices, as well as from profoundly important SoTL research on both the student and the teacher perspectives of the evaluation process.

What are your team’s future plans for this particular project or initiative?

This is the first full academic year in which all teachers in all roles in the SoC will use the new instrument. Our committee has the responsibility to continually assess its effectiveness in meeting our goals of continually improving our teachers and our teaching. We are also experimenting with ways to mesh data from the old and new instruments for the multi-year reporting of data required for tenure and promotion applications.

What are your plans to make this work public?

We intend to submit a panel discussion about this topic for consideration at the next Central States Communication Association Meeting. We wanted to have this year’s aggregate data to discuss in addition to the project itself, so submission will occur in October 2015.

Literature Reviewed for Project

American Association of University Professors. (1990). Statement on Teaching Evaluations. Retrieved from http://www.aaup.org/report/statement-on-evaluation

Benton, S. L., & Cashin, W. E. (2012). Student ratings of teaching: A summary of research and literature (IDEA Paper No. 50).

Boysen, G. A. (2008). Revenge and student evaluations of teaching. Teaching of Psychology, 35(3), 218-222.

Calkins, S., & Micari, M. (2010). Less-than-perfect judges: Evaluating student evaluations. Thought & Action, 7.

Center for Research on Learning and Teaching University of Michigan. (n.d.). Gender and Student Evaluations: An Annotated Bibliography. Retrieved from http://www.crlt.umich.edu/sites/default/files/resource_files/gsebibliography.pdf

Chen, W., & Chen, W. (2010). Surprises learned from course evaluations. Research in Higher Education Journal, 9, 1-9.

Comadena, M., Hunt, S., & Simonds, C. (2007). The effects of teacher clarity, nonverbal immediacy, and caring on student motivation, affective and cognitive learning: A research note. Communication Research Reports, 24(3), 241-248.

Cornell University Evaluation and Recognition of Teachers Handbook (n.d.). Retrieved from http://moodle.technion.ac.il/pluginfile.php/443177/mod_resource/content/1/Teaching%20Evaluation%20Handbook.pdf

Dodeen, H. (2013). College students' evaluation of effective teaching: Developing an instrument and assessing its psychometric properties. Research in Higher Education Journal, 21, 1-12.

DuCette, J., & Kenney, J. (1982). Do grading standards affect student evaluations of teaching? Some new evidence on an old question. Journal of Educational Psychology, 74(3), 308.

Feeley, H. T. (2002). Evidence of halo effects in student evaluations of communication instruction. Communication Education, 51(3), 225-236.

Frick, T. W., Chadha, R., Watson, C., Wang, Y., & Green, P. (2008, March). Theory-based course evaluation: Implications for improving student success in postsecondary education. Paper presented at the American Educational Research Association conference, New York.

Hudson, J. C. (1989). Expected grades correlate with evaluation of teaching. Journalism Educator, 44(2), 38-44.

Kim, C., Damewood, E., & Hodge, N. (2000). Professor attitude: Its effect on teaching evaluations. Journal of Management Education, 24(4), 458-473.

Kozey, S. R., & Feeley, H. T. (2009). Comparing current and former student evaluations of course and instructor quality. Communication Research Reports, 26(2), 158-166.

Lewis, K. G. (2001). Making sense of student written comments. New Directions for Teaching and Learning, 2001(87), 25-32.

Marsh, H. W., & Roche, L. A. (1997). Making students' evaluations of teaching effectiveness effective: The critical issues of validity, bias, and utility. American Psychologist, 52(11), 1187.

Martin, E. (1984). Power and authority in the classroom: Sexist stereotypes in teaching evaluations. Signs, 482-492.

McCroskey, J. C. (1994). Assessment of affect toward communication and affect toward instruction in communication. In S. Morreale, & M. Brooks (Eds.),1994 SCA summer conference proceedings and prepared remarks: Assessing college student competence in speech communication. Annandale, VA: Speech Communication Association.

Onwuegbuzie, A. J., Witcher, A. E., Collins, K. M., Filer, J. D., Wiedmaier, C. D., & Moore, C. W. (2007). Students' perceptions of characteristics of effective college teachers: A validity study of a teaching evaluation form using a mixed-methods analysis. American Educational Research Journal, 44(1), 113-160.

Ory, J. C. (2001). Faculty thoughts and concerns about student ratings. New Directions for Teaching and Learning, 2001(87), 3-15.

Schrodt, P., Witt, P. L., Myers, S. A., Turman, P. D., Barton, M. H., & Jernberg, K. A. (2008). Learner empowerment and teacher evaluations as functions of teacher power use in the college classroom. Communication Education, 57(2), 180-200.

Sojka, J., Gupta, A. K., & Deeter-Schmelz, D. R. (2002). Student and faculty perceptions of student evaluations of teaching: A study of similarities and differences. College Teaching, 50(2), 44-49.

Wilson, R. C. (1986). Improving faculty teaching: Effective use of student evaluations and consultants. The Journal of Higher Education, 196-211.

Wines, W. A., & Lau, T. J. (2006). Observations on the folly of using student evaluations of college teaching for faculty evaluation, pay, and retention decisions and its implications for academic freedom. Wm. & Mary J. Women & L., 13, 167.

Wode, J., & Keiser, J. (2011). Online course evaluation literature review and findings. A report from Academic Affairs, Columbia College, Chicago.