Texas Tech University

Student Course Feedback

New for Fall 2024

As we strive to build upon a culture of excellence in teaching and learning at Texas Tech, the Teaching, Learning, and Professional Development Center, the Teaching Academy, the Office of the Provost, and many other organizations, councils, committees, departments, and individuals have partnered to consider how we might define, reflect upon, and evaluate teaching at Texas Tech University. This effort is called the Teaching Evaluation Initiative.

For the past few years, one subset of this initiative has focused on course evaluations at Texas Tech. The following individuals served as members of the Student Course Evaluations committee: Michael Serra (Co-Chair), Suzanne Tapp (Co-Chair), Raegan Higgins, Angela Lumpkin, Kerk Kee, Jason Headrick, Hayden Holmes, Sandra Huston, Sarah Wagnor, Kerri Ford, Jason Rinaldo, Lauren Gollahon, Toby Brooks, Jackson Huffman, Molly Jacobs, and Addison Sparks.

In Fall 2024, Texas Tech University will begin its new course evaluation process, now called Student Course Feedback.


How can I learn more about the work of this subcommittee and their process?

To see a timeline of the subcommittee’s work and more information about the process, please visit the Teaching Evaluation Initiative and click on the Student Course Feedback section.

Why did you rename the course/instructor evaluation to Student Course Feedback?

The committee chose to rename this process to more accurately reflect the role of students. We value input from students and their perceptions of the teaching and learning experience. This feedback can be informative to instructors, but it should not serve as the sole measure of evaluation. Instead, we value the three-voice model that considers information from self, peers, and students.

Why change the questions?

This committee examined course evaluations at Texas Tech from an evidence-based perspective to consider what the research suggests about the validity of this measurement, the potential biases, and the information we can gather about student perceptions of their learning.

Our goals were 1) to provide better information for instructors and administrators (more detailed and more focused on instruction and course design) and 2) to increase student understanding, interest, and trust in the process – and hopefully increase their participation as well.

What were the evaluation questions for face-to-face courses?

*Remember that distance courses previously had a different set of questions. This will change: all courses will now be evaluated with a consistent form.

The course objectives were specified and followed by the instructor.

Overall, the instructor was an effective teacher.

Overall, the course was a valuable learning experience.

What are the new questions?

All rating questions use a Likert scale (same response scale as the old version: "Strongly Agree" to "Strongly Disagree").

Questions about the Instructor

  • The instructor presented material in a way that helped me engage in the course
  • The instructor helped me to understand the relevance of the course content
  • The instructor tailored instructions and lessons to incorporate a variety of perspectives
  • *Overall, the instructor’s teaching methods helped me learn the course content

Questions about the Course

  • The course assignments facilitated my learning
  • The course was organized
  • There were opportunities for me to be successful in the course
  • *Overall, this course helped me learn required concepts or skills

Overall Questions (free-response / open-ended)

  • In terms of your learning, what were the MOST EFFECTIVE aspects of this course and why?
  • In terms of your learning, what were the LEAST EFFECTIVE aspects of this course and why?

*Questions marked with an asterisk are aligned with questions on the previous student evaluation survey for comparison with previous years.

How have students been involved in this change?

We have partnered with the Student Government Association (SGA). SGA Senate Resolution 59.164 supports the Texas Tech Definition of Teaching Excellence and the Student Course Feedback process. SGA will help us promote both the new questions and awareness of the opportunity to give confidential feedback. Look for marketing materials like this around campus as one example of how we will talk to students.

Will the new questions be added to Digital Measures?

Yes! Placeholders for the new questions have already been added to Digital Measures and will be applied on your Activities/Scheduled Teaching page. Data for both the new questions and the previous questions will be available with a notation about the change in question format. This information will also be applied to other reports, such as your Annual Faculty Report.

The Student Course Feedback period seems long! When does it open and close?

The typical feedback period for the long fall and spring semesters is the two weeks prior to the last day of classes. Intersessions, condensed courses, and sections that end before the last official day of class follow a different schedule. Instructors receive an email approximately four weeks before the end of class, leaving time to make adjustments to the sections being evaluated before they go live to students.

What can I do to encourage the best input from students?

Designate time in class for students to complete the Student Course Feedback. Did you know that a QR code is available for your course that will take students directly to the online evaluation form for your section? Just log in to SmartEvals and click “QR code” for a given class on your EvalCenter. Try inserting that QR code on a PowerPoint slide so that your students can easily scan it and get started.

If you dedicate class time to allow for completion of the Student Course Feedback, best practice is that you leave the room and a colleague, staff member, or graduate student is present.

Tell your students that you value their honest (both positive and negative) and constructive feedback and that you use student feedback to improve your courses. What aspects of the course and/or instruction helped them learn? What factors might be changed to help future students learn more effectively? You might even share an example of how you have adjusted your course due to student feedback or something you learned from the information the student shared. 

Describe to the students the kind of feedback you find most useful. Generally, specific feedback with examples is more valuable than broad statements. Remind students that evaluations are confidential and that you will not be able to see any of their evaluations until all TTU grades have been posted. You may also remind students that while you are the primary audience for their feedback, others, including your administrators, may also read their evaluations.

What are we telling students about Student Course Feedback?

The committee prepared updated text that will be sent to students to encourage them to complete their Student Course Feedback. Here’s one example:

Why participate in Student Course Feedback?

It Helps You

Filling out a course feedback form allows you to reflect on the progress you made in the course over the semester. What have you learned? How far have you come? What parts of the class appealed to you? Which elements did not? How will the things you learned in this course help you to understand concepts or skills in future classes?
  
It Helps Your Fellow Red Raiders – Present and Future 

Your input helps to improve Texas Tech courses in the future. It’s a tangible way you can show your commitment to Texas Tech.
  
It Helps Your Instructors

Thoughtful course feedback responses help instructors hear students’ perspectives about what is working and what could use improvement. The more detailed you are in your written course evaluations (about lectures, readings, assignments, and exams) and the more information you give your faculty member, the better! It helps to explain what impacted your learning in detail. Texas Tech instructors want to improve their courses across semesters; providing detailed, constructive feedback can help them to do that.
  
Student Feedback is Confidential

Your instructor will not know who did or did not submit feedback. All numerical ratings and written comments are de-identified (meaning that there are NO student names attached) when the instructor receives them. However, if you write your name in the written comments or provide very specific details, it may be possible for an instructor to identify which student provided the feedback. Also, instructors can only access their course feedback results during the next semester, after grades have been posted. Consequently, your feedback cannot influence your grades.

Will a longer survey mean a lower response rate?

The response rate in the original pilot groups for SmartEvals was approximately 67-72%. The response rate for the traditional pencil-and-paper evaluation process was 73-78%. The current rate with SmartEvals is approximately 50% and appears to be influenced by how much the instructor of record encourages students to participate. We conducted two pilot programs with the new questions, and our response rate did not change significantly. However, students wrote about twice as many words overall in response to the two free-response questions.

How did you identify the new questions and do they make a difference?

The committee reviewed survey questions from many different institutions and spent hours considering word choices and question order. Feedback from department chairs, school directors, and area coordinators was particularly helpful in this process.

The committee also conducted two pilots with several versions of the new questions in Spring 2023 and Fall 2023. Analyses of the results were conducted by Dr. Jaehoon Lee, Woonyoung Song, and Xiunan Shi of the College of Education. There were two primary findings:

  • The order of the questions matters: placing the summative questions at the end of each section (rather than at the beginning) results in more variation in responses across all questions and indicates that students are taking more time to read the questions.
     
  • We specifically selected questions (from the larger pool of candidate questions) with the lowest redundancy (collinearity) across questions and the highest information value. “Information” refers to how well responses to an item discriminate between instructors at various levels of the latent trait; here, the trait represents the effectiveness of the instructor/course.
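The “information” criterion described above is a standard concept from item response theory. The committee’s actual model is not published here, but as a minimal illustrative sketch (assuming a two-parameter logistic model, with hypothetical discrimination `a` and difficulty `b` parameters not taken from the committee’s analyses), an item’s Fisher information at a given latent-trait level can be computed as follows:

```python
import math

def item_information(theta, a, b):
    """Fisher information of a 2PL item at latent-trait level theta.

    a: discrimination parameter, b: difficulty parameter.
    I(theta) = a^2 * P * (1 - P), where P is the probability of
    endorsing the item (e.g., agreeing) at trait level theta.
    """
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

# A more discriminating item (larger a) carries more information
# near its difficulty level b, so it better separates instructors
# whose effectiveness lies in that region of the trait.
print(item_information(0.0, a=0.8, b=0.0))
print(item_information(0.0, a=1.6, b=0.0))
```

Under this sketch, selecting items with high information and low collinearity means favoring questions that sharply distinguish levels of instructor/course effectiveness while overlapping as little as possible with one another.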

Free-response data analysis conducted by Dr. Michael Serra using Linguistic Inquiry and Word Count (LIWC) showed several important takeaways:

  • Overall, the students wrote about twice as many words (total) with the two proposed free-response questions vs. the open-ended question on the old version.
  • The students’ language was rated by LIWC as more authentic and more genuine with the new questions.  
  • The new questions seem to encourage more analytical thinking (more cognitive/thinking terms) and less emotional wording.  
  • There are fewer gendered words with the new questions which could lead to some reduction in gender-based biases.  

I have received feedback on course/instructor evaluations in the past that seemed biased or gendered. Are there any resources to help me review the feedback?

The Texas Tech Teaching Academy has volunteered to review your Student Course Feedback upon request. The purpose of the review is to extract useful and constructive comments that might be used to improve teaching in the future, while providing space to disregard unconstructive feedback and address inappropriate comments. The hope is that this process can reduce stress on faculty as they receive student evaluations and foster meaningful conversations among faculty peers on how to work productively with student feedback to improve teaching. The Teaching Academy is a neutral resource and the process is confidential. Questions for your conversation might include the following:

  • What overall trends can be seen in the feedback this group of students has provided?
  • What praise have the students offered? How might this feedback be used to affirm and build on your strengths?
  • What do you see as the most helpful feedback students have provided? How might this feedback be used to improve aspects of your teaching?
  • Are there unhelpful comments here? To what might these be attributed?
  • Are there comments that are inappropriate? How might you (Teaching Academy member) support the instructor in disregarding these, or how can you serve as an ally, if the comments are particularly hateful or discriminatory?

Two other resources that might be of interest are short white papers: one helps instructors interpret their student course feedback, and the other helps administrators consider feedback from students more equitably.

If you are interested in participating in this process, please contact Suzanne Tapp, Executive Council, Teaching Academy, or Mitzi Ziegner, Teaching Academy member.


If we can provide more information or add to this FAQ, please feel free to contact Suzanne Tapp, Associate Vice Provost for Teaching and Learning. If you have questions about the SmartEvals software, please contact Kerri Ford, Program Manager, Institutional Research.

Teaching, Learning, & Professional Development Center

  • Address

    University Library Building, Room 136, Mail Stop 2044, Lubbock, TX 79409-2004
  • Phone

    806.742.0133
  • Email

    tlpdc@ttu.edu