
40+ course survey questions that actually measure learning (2026)

Johannes

CEO & Co-Founder

10 Minutes

April 15th, 2026

Most course surveys are built to rate the instructor. Research has repeatedly shown that instructor ratings are a weak proxy for actual learning, and a widely cited meta-analysis by Uttl, White, and Gonzalez (2017) found that Student Evaluations of Teaching correlate with student learning at levels close to zero. Good course surveys do something different: they measure what students experienced, what they learned, and what specific elements of the course worked or did not.

This guide gives you 40+ course survey questions for online, in-person, and hybrid courses, a mid-course plus end-of-course cadence, and a free template. It applies to university courses, corporate online courses, cohort-based learning programs, bootcamps, and continuing education.

What you will find in this guide:

  • Why course surveys matter, and why the traditional format falls short
  • Mid-course formative surveys vs end-of-course summative surveys
  • The research on instructor rating validity
  • 40+ course survey questions grouped by theme
  • Best practices for anonymity, timing, and analysis
  • Common mistakes that waste course feedback
  • Free Formbricks course survey template

Why course surveys matter

Courses are long, expensive investments of student and instructor time. A well-designed feedback program does three things:

  • Finds fixable problems mid-course. A student struggling with content in week 3 can be helped in week 4, but only if someone knows about it.
  • Improves the next cohort. End-of-course feedback reveals what worked and what did not. The next cohort benefits from every signal the previous cohort provided.
  • Protects learning outcomes. Course design drift is real. Without regular feedback, courses slowly stop matching what students need.

The traditional "rate the instructor on a 1-5 scale" questionnaire does almost none of this. Good course surveys are built differently.


Mid-course vs end-of-course surveys

Two timings, two purposes, two different survey designs.

Mid-course (formative) surveys. Run at roughly the halfway point of the course. Short (3 to 5 questions). Focused on what is working, what is not, and what should change before the course ends. The value is that the current cohort benefits from the changes, not just the next one.

End-of-course (summative) surveys. Run within a week of course completion. Longer (10 to 15 questions). Focused on overall assessment, specific content ratings, and what should change for the next cohort. Informs course redesign and instructor development.

Both are valuable. Running only the end-of-course survey means the current cohort never benefits from their own feedback. Running only mid-course means the program loses the deeper retrospective that comes after course completion.


The problem with traditional student evaluations

Student Evaluations of Teaching (SETs) have been the dominant course survey format in higher education for decades. Academic research has steadily accumulated concerns about their validity.

Key research findings:

  • Weak correlation with learning. Uttl, White, and Gonzalez's 2017 meta-analysis, published in Studies in Educational Evaluation, found SET ratings correlate with student learning outcomes at levels close to zero when earlier study biases are controlled for.
  • Systematic bias. Multiple studies have found that SET scores are influenced by instructor gender, race, physical appearance, and subject area, independent of teaching quality.
  • Course difficulty effect. Students in easier courses consistently rate instructors higher. This biases scores against instructors teaching rigorous material.
  • Grade-related bias. Students who expect higher grades rate instructors more positively, which creates a feedback loop against rigorous grading.

What to do instead. Shift questions from "rate the instructor" to "describe your experience." Ask about specific content, specific pedagogy, and specific learning moments. Let the patterns across responses speak instead of relying on a single 1 to 5 rating.

The questions below follow this principle.


40+ course survey questions

Each question is grouped by theme and tagged with type and priority (Essential, Recommended, Nice-to-have).

Course content and learning (questions 1-8)

1. Did the course cover what you expected based on the description?

  • Type: Likert (1-5) | Essential
  • Expectation-reality fit.

2. How clear were the learning objectives at the start of the course?

  • Type: Likert (1-5) | Essential

3. How well did the course content match the stated learning objectives?

  • Type: Likert (1-5) | Essential

4. How confident do you feel applying what you learned?

  • Type: Likert (1-5) | Essential
  • Self-efficacy is a better learning proxy than satisfaction.

5. What are the three most important things you learned in this course?

  • Type: Open-ended | Essential
  • Free-recall learning check.

6. How useful was the content for your goals?

  • Type: Likert (1-5) | Essential

7. Which topic or module was the most valuable?

  • Type: Open-ended | Recommended

8. Which topic or module was the least valuable?

  • Type: Open-ended | Recommended

Course structure and pace (questions 9-14)

9. How appropriate was the pace of the course?

  • Type: Multiple choice | Essential
  • Too slow / About right / Too fast.

10. How appropriate was the length of the course?

  • Type: Multiple choice | Essential
  • Too short / About right / Too long.

11. How was the balance between lecture, discussion, and exercises?

  • Type: Multiple choice | Recommended
  • Too much lecture / Balanced / Too much discussion or exercises.

12. Were the topics covered in a logical order?

  • Type: Likert (1-5) | Recommended

13. How well did earlier topics prepare you for later topics?

  • Type: Likert (1-5) | Recommended

14. Was the workload manageable given your other commitments?

  • Type: Likert (1-5) | Essential

Instruction and facilitation (questions 15-21)

Focus on experience, not rating.

15. How clearly were new concepts explained?

  • Type: Likert (1-5) | Essential

16. When you got stuck, were you able to get help?

  • Type: Likert (1-5) | Essential
  • Support quality, not instructor rating.

17. How well did the instructor answer questions?

  • Type: Likert (1-5) | Recommended

18. How approachable did the instructor feel when you needed help?

  • Type: Likert (1-5) | Recommended

19. Did you feel respected and heard during the course?

  • Type: Likert (1-5) | Essential
  • Inclusion proxy.

20. Did you feel comfortable asking questions or sharing ideas?

  • Type: Likert (1-5) | Essential
  • Psychological safety.

21. What did the instructor do especially well?

  • Type: Open-ended | Recommended

Course materials and technology (questions 22-27)

22. How useful were the assigned readings or materials?

  • Type: Likert (1-5) | Essential

23. How well did the course platform (LMS, video, exercises) work?

  • Type: Likert (1-5) | Essential

24. Did you experience any technical issues?

  • Type: Binary (Yes/No) with follow-up | Essential

25. How useful were the examples and case studies?

  • Type: Likert (1-5) | Recommended

26. Were the slides, handouts, or notes helpful?

  • Type: Likert (1-5) | Recommended

27. What materials should we add, remove, or improve?

  • Type: Open-ended | Recommended

Assessment and feedback (questions 28-33)

28. Were the assignments and assessments aligned with the course content?

  • Type: Likert (1-5) | Essential

29. How fair did you find the grading (if applicable)?

  • Type: Likert (1-5) | Recommended

30. Did you receive useful feedback on your work?

  • Type: Likert (1-5) | Essential

31. Were assessments challenging enough to be meaningful?

  • Type: Likert (1-5) | Recommended

32. Did the assessments help you learn the material better?

  • Type: Likert (1-5) | Essential

33. What one change to the assessments would make them more valuable?

  • Type: Open-ended | Recommended

Open-ended and overall (questions 34-42)

34. How likely are you to recommend this course to a friend or colleague? (0-10)

  • Type: Rating (0-10) | Essential
  • Course NPS.
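
Question 34 uses the standard Net Promoter Score calculation: respondents scoring 9-10 are promoters, 0-6 are detractors, and NPS is the percentage of promoters minus the percentage of detractors. A minimal Python sketch, with invented sample responses:

```python
# Standard NPS formula applied to 0-10 responses from question 34.
# Promoters score 9-10, detractors 0-6; passives (7-8) are ignored.
def course_nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical responses from one cohort.
responses = [10, 9, 8, 7, 6, 9, 10, 3, 8, 9]
print(course_nps(responses))  # prints 30
```

A score above 0 means promoters outnumber detractors; comparing the number across cohorts is more informative than any single value.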

35. Overall, how would you rate the value you got from this course?

  • Type: Likert (1-5) | Essential

36. What did you like most about this course?

  • Type: Open-ended | Essential

37. What would you change about this course?

  • Type: Open-ended | Essential

38. What is one thing that should stay exactly the same?

  • Type: Open-ended | Recommended

39. What is one thing that should change before the next cohort?

  • Type: Open-ended | Essential

40. Would you take another course from this instructor or program?

  • Type: Likert (1-5) | Recommended

41. What other courses would you like to see offered?

  • Type: Open-ended | Nice-to-have

42. Is there anything else you would like to share?

  • Type: Open-ended | Essential
  • Catch-all.

Mid-course (formative) subset

For a 3 to 5 question mid-course survey, use questions 5, 9, 14, 16, and 37. These surface the most actionable mid-course feedback in the smallest footprint.


Best practices

Anonymous. Essential for honest feedback in a power-imbalanced relationship.

Short. 10 to 15 questions for end-of-course, 3 to 5 for mid-course.

Focus on experience. Ask what students experienced and learned, not how they rate the instructor.

Close the loop. Share specific changes with the next cohort. "Based on last cohort's feedback, we have shortened module 3 and added more exercises to module 5."

Segment. Analyze responses by section, format (online vs in-person), and cohort. Patterns vary.
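
As a sketch of why segmenting matters, here is a minimal Python example (with invented responses) where a healthy-looking overall average hides a platform problem that only affects online students:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical Likert responses to question 23 (platform quality),
# tagged with course format. Field names are illustrative.
responses = [
    {"format": "online", "q23_platform": 2},
    {"format": "online", "q23_platform": 3},
    {"format": "in-person", "q23_platform": 5},
    {"format": "in-person", "q23_platform": 4},
]

# Group scores by format before averaging.
by_format = defaultdict(list)
for r in responses:
    by_format[r["format"]].append(r["q23_platform"])

# The overall mean is 3.5, which looks acceptable; the segmented view
# shows online students rating the platform far lower.
for fmt, scores in sorted(by_format.items()):
    print(f"{fmt}: {mean(scores):.1f}")  # in-person: 4.5, online: 2.5
```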

Run mid-course surveys. They benefit the current cohort, not just the next.

Pair with performance data. Survey results plus assessment performance data together tell a more complete story than either alone.
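
One lightweight way to pair the two data sources is to correlate per-cohort survey averages with per-cohort assessment averages. A sketch with invented numbers, computing Pearson's r from scratch:

```python
from math import sqrt

# Pearson correlation coefficient, computed directly from its definition.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-cohort pairs: mean self-efficacy rating (question 4)
# and mean assessment score for the same cohort.
confidence = [3.1, 3.8, 4.2, 4.5]
assessment = [61, 70, 78, 84]
print(round(pearson(confidence, assessment), 2))  # prints 0.99
```

A strong correlation suggests the survey's self-efficacy question is tracking real learning; a weak one is a prompt to investigate which signal is off.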

Respect the limits of the data. A single cohort of responses is directional. Multiple cohorts over time produce reliable patterns.

For the corporate training equivalent, see our training survey questions guide.


Common mistakes

Relying on instructor ratings. Weak correlation with learning. Focus on experience and content.

Long surveys. Response rates collapse past 15 questions.

Only surveying at the end. Mid-course feedback is where the highest-leverage fixes live.

Treating one cohort as definitive. Patterns across cohorts matter more than one round of responses.

Not acting on feedback. Students notice when their input goes nowhere; two cohorts of ignored feedback will collapse response rates.

Ignoring open-ended feedback. Open-ended answers are where the specific improvement ideas live.

Treating all students as one population. Segment by section, format, and cohort.


Free course survey template

Formbricks is an open-source experience management platform with free course survey templates you can deploy in minutes.

Why Formbricks for course surveys:

  • Open source and self-hostable. Student feedback stays on your infrastructure. Important for educational institutions with student privacy obligations.
  • Anonymous by design. Anonymity enforced at the platform level.
  • Mid-course and end-of-course templates. Both timings covered out of the box.
  • Flexible distribution. Email, in-LMS embed, link, or QR code.
  • No engineering lift. Instructors and course designers can launch without developer support.
  • Free tier. No credit card required.

How to get started:

  1. Sign up at formbricks.com
  2. Pick the content quality evaluation template or start from scratch
  3. Customize for your content and format
  4. Schedule the mid-course and end-of-course delivery
  5. Review results and ship improvements before the next cohort

Start your course survey with Formbricks →

For related feedback frameworks, see our training survey questions, post webinar survey questions, post event survey questions, survey questions examples, and the docs feedback best practice for collecting inline feedback on course materials and documentation.

