
Strategies for Conducting Student Feedback Surveys


Student feedback surveys are valuable tools for collecting open and honest feedback in an online course. These surveys, often called course evaluations, can improve course quality, help course facilitators develop their teaching, and identify areas that need improvement.

This article provides course facilitators with a guide to creating, administering, and interpreting evaluations given during the semester. Conducting midsemester evaluations often signals to students that the course facilitators are committed to teaching and want to see their students succeed (Brown, 2008). In addition, soliciting midsemester student feedback allows course facilitators to better help their current students and address any needs throughout the remaining weeks of the semester.

Method of Surveying Students

Because response rates of surveys e-mailed to students tend to be low, it’s essential to place surveys directly within the online course. Midsemester evaluations should be administered halfway through the course. For example, if a course is eight weeks long, midsemester evaluations should be provided during Week 4. Instructors should allow students “in-class” time to complete these surveys. In an online course, that means giving students time to complete the survey that they would normally spend on course lectures or synchronous sessions.

To ensure completion of these surveys, instructors can often use course settings to restrict access to the midsemester week until students complete the survey. However, if this isn’t possible, course facilitators should direct students’ attention to the survey and emphasize that their feedback supports continual course quality and improvement.

A Note on Anonymity

Surveys should remain anonymous to promote and protect students’ open and honest responses. The learning management system can track whether students have completed their surveys; however, attaching a name to a survey may skew responses for some students.

Formatting Questions for Student Surveys

Student feedback survey questions should be designed so that institutions can use the responses to identify specific opportunities for course improvement and enhancement. These surveys should include both qualitative and quantitative feedback to provide summative and formative data for further analysis and interpretation. Questions should highlight instructor clarity, course workload and content, course navigation, instructor feedback and accessibility, and instructor knowledge of subject matter.

Likert Scale Questions

A Likert scale is a common data collection tool in online education surveys. It is a five-point (or seven-point) scale that allows individuals to express how much they agree or disagree with a statement, typically offering a range of responses from “strongly disagree” to “strongly agree” with a neutral midpoint (e.g., 1 = “strongly disagree,” 2 = “disagree,” 3 = “neutral,” 4 = “agree,” and 5 = “strongly agree”).

Likert scale questionnaires may include statements or questions that give course facilitators granular quantitative feedback, helping them identify areas that need improvement in addition to highlighting areas of success.

Not Applicable Response Options

Likert scales may also include a “no answer” or “not applicable” (“N/A”) option for students. However, institutions should provide this option only if that answer could be warranted. For example, questions regarding instructor assistance would warrant an “N/A” option because not all students may have needed the instructor’s assistance throughout the course. In contrast, questions concerning a course assignment may not necessarily require an “N/A” option.

The course facilitator should clarify the difference between neutral responses and the N/A option (Norward, 1991). “Neutral” should be defined as an impartial attitude toward a statement or question; however, “not applicable” or no answer means students cannot give an answer because the statement does not apply to their situation. To clarify this in a Likert scale, the “N/A” response option should appear at the far right end of the scale (e.g., Strongly Disagree, Disagree, Neutral, Agree, Strongly Agree, N/A).

If a survey includes a “not applicable” or “no answer” response option, a written response should be required for those questions (Krosnick et al., 2002). Requiring an explanation makes students less likely to select “N/A” reflexively and encourages them to spend more time answering the question. In addition, it gives course facilitators more confidence that “N/A” responses are not simply the result of student inattention.
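When tabulating results, the distinction between “neutral” and “N/A” matters: “N/A” responses should be excluded from averages rather than scored as the neutral midpoint, which would bias the results. A minimal sketch of this scoring rule in Python, using hypothetical response data:

```python
# Hypothetical midsemester Likert responses for one survey question.
# 1–5 = Strongly Disagree .. Strongly Agree; "NA" = not applicable.
responses = [5, 4, "NA", 3, 4, 5, "NA", 2]

# Average only the numeric answers: "N/A" is treated as missing data,
# not as the neutral midpoint (3), which would skew the mean.
numeric = [r for r in responses if isinstance(r, int)]
mean_score = sum(numeric) / len(numeric)

# Report the "N/A" rate separately; a high rate may mean the
# question did not apply to most students.
na_rate = responses.count("NA") / len(responses)

print(f"Mean rating: {mean_score:.2f}")  # average of the 6 numeric answers
print(f"N/A rate: {na_rate:.0%}")
```

This is only an illustration of the scoring convention; survey platforms typically export responses in their own formats, and the variable names here are invented for the example.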

Examples of Likert Scale Questions

Midsemester evaluation Likert scale questions should focus on how the course is going and what improvements the instructor can make. Questions should highlight the first half of the course in addition to ongoing aspects of the course. For example, questions may cover the course introduction and syllabus and also discuss instructor response rates to e-mails and feedback. These questions may be in past or present tense and should be organized into subcategories (e.g., instructor, formatting, navigation, and content).

Examples of Likert scale questions in a midsemester survey include:

  1. Course lectures are clear and organized.
  • Strongly Disagree
  • Somewhat Disagree
  • Neutral
  • Somewhat Agree
  • Strongly Agree
  2. The course lectures and content prepare me for course assessments.
  • Strongly Disagree
  • Somewhat Disagree
  • Neutral
  • Somewhat Agree
  • Strongly Agree
  3. I feel comfortable asking the instructor questions when I am unsure about assignment details.
  • Strongly Disagree
  • Somewhat Disagree
  • Neutral
  • Somewhat Agree
  • Strongly Agree
  • No answer / Not applicable

Open-Ended Questions

Student feedback surveys should also include open-ended questions to allow students to provide formative feedback. In a study conducted by van Wyk and McLean (2007), facilitators preferred student comments to Likert scores because student responses provided specific direction for improvement.

Student feedback surveys should include a minimum of one open-ended question per survey section. For example, an open-ended question regarding course instruction should follow the Likert scale questions regarding course instruction. In addition, open-ended questions that encompass the class in its entirety should appear at the end of the survey.

An example of an open-ended question following a Likert scale question:

  1. The instructor has adequate knowledge and experience to teach this class.
  • Strongly Disagree
  • Somewhat Disagree
  • Neutral
  • Somewhat Agree
  • Strongly Agree
  • No answer / Not applicable
  2. Please explain why you gave that rating.

Examples of open-ended questions concluding the student feedback survey include:

  1. Please list and describe any suggestions for improvement of this course.
  2. Please list and describe any suggestions for online interface improvement.
  3. Please list and describe any positive aspects of taking this course online.

Other Question Types

To better understand student responses, surveys may include questions about students’ age, sex, and years in education. Other demographic questions may be included; however, be careful not to overload students with questions, or they will be more likely to rush through their responses and thus provide inaccurate results.

For smaller class sizes (e.g., fewer than five students), an in-class, open discussion can be more effective than surveys to solicit student feedback, given the difficulty in maintaining anonymity. In addition, the results of surveys with few participants are less valid, thus making it difficult to generalize for future students.

Evaluating Student Feedback Surveys

Institutions can use midsemester surveys at the programmatic level to inform tenure review, faculty promotions, course duplication, and program expansion. They are also useful at the course level, providing course facilitators with suggestions for improving course instruction, development, content, and navigation. By asking students for both quantitative and qualitative feedback, course facilitators and program directors learn not only what to improve but also how to improve it.

Student Feedback Follow-Up

Course facilitators should review midsemester feedback as soon as possible to give them enough time to make changes prior to the course’s final exam or project. In addition, course facilitators should follow up with students regarding the results to clarify any misunderstandings, provide students with an overview of any immediate changes to the course, explain why certain changes cannot be made during the current term, and finally, thank students for their open and honest feedback.

Conclusion

Student feedback surveys provide institutions and instructors with powerful information they can use to improve student learning and thus strengthen the institution as a whole. By optimizing the question types and encouraging students to participate appropriately, instructors can leverage these opportunities for feedback to benefit everyone involved in the online course.

Posted December 14, 2017
Author Samantha Bir
Categories Feedback