Event Feedback
Why is it useful?
Captures attendee impressions while the event is still fresh so you can improve logistics, content, and speaker quality for the next one.
How to get started:
Send within 24 hours of the event ending. Keep it to 14 questions or fewer and include one NPS question.
Event feedback survey template: questions to improve every event you run
The best time to collect event feedback is within 24 hours of the event ending. Memories fade fast, and attendees who were on the fence about the experience become increasingly unlikely to respond the longer you wait.
This template covers what to ask for in-person events, virtual events, and hybrid formats, with specific question types and a framework for turning feedback into better future events.
Event feedback survey questions
Overall experience
- How would you rate the event overall? | Rating scale (1-5) | Required
- Did the event meet your expectations? | Single choice (Exceeded / Met / Partially met / Did not meet) | Required
- How relevant was the event content to your interests or goals? | Rating scale (1-5) | Required
Content and speakers
- How would you rate the quality of the speakers or presenters? | Rating scale (1-5) | Required
- Which session or topic did you find most valuable? | Open text (short) | Optional
- Which session or topic did you find least valuable? | Open text (short) | Optional
- Was the depth of content appropriate for your experience level? | Single choice (Too basic / Just right / Too advanced) | Required
Logistics and organization
- How well-organized was the event? | Rating scale (1-5) | Required
- How would you rate the venue (or virtual platform)? | Rating scale (1-5) | Required
- Was the event length appropriate? | Single choice (Too short / Just right / Too long) | Required
Networking (if applicable)
- How useful were the networking opportunities? | Rating scale (1-5) | Optional
Future intent
- How likely are you to attend a future event from us? | Scale (0-10, NPS-style) | Required
- What topics would you like to see covered at future events? | Open text | Optional
- What is one thing we could improve for next time? | Open text | Optional
Adapting the template for virtual events
Virtual events have different pain points than in-person ones. Add or swap these questions for online events:
- How stable was the video/audio quality? | Rating scale (1-5) | Required
- How easy was the virtual platform to use? | Rating scale (1-5) | Required
- Did you experience any technical issues? | Single choice (Yes / No) | Required
- How engaging was the virtual format? | Rating scale (1-5) | Required
Drop the venue and networking questions if they do not apply. Replace them with platform usability questions. Virtual event feedback often reveals that technical friction (bad audio, confusing UI, lag) matters more than content quality.
When and how to send the survey
Within 24 hours. This is non-negotiable. Response rates and recall quality both drop dramatically after the first day.
Email is the standard channel. Send a follow-up email with the survey link, a thank-you for attending, and a clear note that the survey takes two to three minutes.
For in-person events, consider on-site collection. A QR code displayed during the closing session, linking to a short three-question survey, captures feedback while attendees are still in the room.
For recurring events (webinars, meetups), automate the send. Set up a trigger that sends the survey automatically after each event ends. This ensures consistent data collection without manual effort.
For more on delivery channels, see our guide on survey distribution methods.
Analyzing event feedback
Score the basics first
Average the ratings for overall experience, content quality, and logistics. These three numbers give you a quick health check.
| Area | Score 4.0+ | Score 3.0-3.9 | Score below 3.0 |
|---|---|---|---|
| Overall experience | Event is working well | Room for improvement | Major issues to address |
| Content/speakers | Strong programming | Review speaker selection | Rethink content strategy |
| Logistics/organization | Smooth execution | Fix specific friction points | Operational problems |
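The health check above is simple enough to script. A minimal sketch, assuming each response is a dict of 1-5 ratings (the field names here are invented; map them to your survey tool's export):

```python
from statistics import mean

# Example responses; keys are assumed names for the three required
# rating areas from the template.
responses = [
    {"overall": 4, "content": 5, "logistics": 3},
    {"overall": 3, "content": 4, "logistics": 2},
    {"overall": 5, "content": 4, "logistics": 4},
]

def health_check(responses, areas=("overall", "content", "logistics")):
    """Average each area and bucket it using the thresholds in the table."""
    report = {}
    for area in areas:
        score = mean(r[area] for r in responses)
        if score >= 4.0:
            verdict = "working well"
        elif score >= 3.0:
            verdict = "room for improvement"
        else:
            verdict = "major issues"
        report[area] = (round(score, 1), verdict)
    return report
```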
Compare across events
If you run recurring events, track scores over time. Improving from 3.5 to 4.1 over three events is meaningful progress. Declining scores signal a problem before attendance numbers drop.
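The template's NPS-style question (0-10) is worth tracking across events too. The standard scoring is the percentage of promoters (9-10) minus the percentage of detractors (0-6):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))
```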
Read the open-ended responses
The quantitative scores tell you what is working and what is not. The open-ended responses (questions 5, 6, 13, 14) tell you why. Group similar responses into themes and prioritize the themes that appear most frequently.
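Theme grouping can start as a rough keyword count before you do a careful manual read. The themes and keywords below are invented examples; build yours from a first pass over the actual responses.

```python
from collections import Counter

# Assumed example themes; each answer counts at most once per theme.
THEMES = {
    "audio/technical": ("audio", "sound", "lag", "glitch"),
    "interactivity": ("interactive", "q&a", "hands-on", "workshop"),
    "pacing": ("rushed", "too long", "slow", "pacing"),
}

def theme_counts(answers):
    """Count how many open-ended answers touch each theme."""
    counts = Counter()
    for answer in answers:
        text = answer.lower()
        for theme, keywords in THEMES.items():
            if any(k in text for k in keywords):
                counts[theme] += 1
    return counts
```

Sort the resulting counts to see which themes recur most, then prioritize those.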
Common mistakes
Surveying too late. Sending the survey a week after the event is too late. Aim for same-day or next-day.
Making it too long. Event feedback surveys should take two to three minutes, which means 8 to 14 questions maximum. Attendees will not spend 10 minutes on feedback for a two-hour webinar.
Asking vague questions. "How was the event?" is too broad. Break it into specific aspects: content, speakers, logistics, networking. Specific questions yield specific, actionable feedback.
Not acting on the feedback. If attendees consistently ask for more interactive sessions and you keep running lecture-format events, you are telling them their feedback does not matter. Implement at least one visible change per event cycle based on feedback.