How satisfied are you with the results you're getting? Product Survey Question

Measure user sentiment toward the outcomes your product delivers and catch satisfaction gaps before they lead to churn.

How satisfied are you with the results you're getting?
Scale: 1 (Very dissatisfied) to 5 (Very satisfied)

Question type: Rating scale 1-5

Primary metric: CSAT (Customer Satisfaction Score)
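
CSAT is typically reported as the percentage of respondents who choose one of the top two ratings, i.e. 4 or 5 on this 1-5 scale. A minimal calculation sketch in TypeScript, assuming responses arrive as plain numeric ratings:

    // Compute CSAT from 1-5 ratings: the share of respondents who
    // answered 4 ("Satisfied") or 5 ("Very satisfied"), as a percentage.
    function csatScore(ratings: number[]): number {
      if (ratings.length === 0) return 0;
      const satisfied = ratings.filter((r) => r >= 4).length;
      return Math.round((satisfied / ratings.length) * 100);
    }

    // Example: 7 of 10 respondents chose 4 or 5, so CSAT is 70.
    console.log(csatScore([5, 4, 3, 5, 2, 4, 4, 1, 5, 4])); // 70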

Answer scale variations

Answer scale styles, worded from 1 (lowest) to 5 (highest):

Typical choice: Very dissatisfied, Dissatisfied, Neutral, Satisfied, Very satisfied
Results-focused: Not getting results, Below expectations, Meeting expectations, Good results, Excellent results
Outcome-oriented: Very poor outcomes, Poor outcomes, Acceptable outcomes, Good outcomes, Outstanding outcomes
Performance-based: Severely underperforming, Underperforming, Performing adequately, Performing well, Exceeding expectations

Follow-Up Questions

Asking "How satisfied are you with the results you're getting?" captures overall sentiment, but the real insights come from understanding what's driving that satisfaction score. These follow-up questions help you identify specific improvement opportunities and what's working well.

Ask what results users expected compared to what they're actually getting. This open-ended question reveals the gap between user expectations and actual outcomes, helping you understand whether satisfaction issues stem from unmet goals or unclear value propositions.

Ask which outcomes matter most to them. Understanding what users prioritize helps you focus improvements on the dimensions that actually drive satisfaction, rather than optimizing metrics that don't matter to your audience.

Ask what would make the results more valuable. This forward-looking question uncovers enhancement opportunities directly from users experiencing the results, giving you actionable insights beyond just satisfaction scores.

When to Use This Question

SaaS Products: Send after 30 days of active use via in-app modal during a natural workflow pause, because users have enough experience to judge value but are still in the critical retention window where you can address concerns before churn.

E-commerce: Trigger 14 days post-delivery through email with order summary, since customers have had time to use the product and can assess if it met expectations, catching satisfaction issues while you can still recover the relationship with targeted offers.

Mobile Apps: Deploy after users complete 3-5 core actions using an interstitial screen at session end, as this frequency indicates genuine engagement and the end-of-session timing captures their experience while fresh without interrupting active use.

Web Apps: Launch following major feature usage or project completion via a slide-in banner on the success screen, because users can evaluate results immediately after seeing outcomes, and the success moment makes them more receptive to sharing feedback.

Digital Products: Send 60-90 days after purchase for courses or content libraries through email with usage statistics included, since users need substantial time to consume content and measure actual results, while the usage data provides context that makes the satisfaction question more meaningful and actionable.
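
The timing rules above boil down to a simple eligibility check before the survey is shown. A minimal sketch, assuming hypothetical fields such as daysSinceFirstActiveUse and coreActionsCompleted that your own product analytics would need to supply:

    // Decide whether a user should see the satisfaction survey, following
    // the timing guidelines above. Field names are illustrative only.
    interface UserContext {
      product: "saas" | "ecommerce" | "mobile" | "web" | "digital";
      daysSinceFirstActiveUse?: number;    // SaaS: days of active use
      daysSinceDelivery?: number;          // E-commerce: days since delivery
      coreActionsCompleted?: number;       // Mobile: engagement signal
      justCompletedMajorFeature?: boolean; // Web: success-state moment
      daysSincePurchase?: number;          // Digital: consumption time
    }

    function isSurveyEligible(user: UserContext): boolean {
      switch (user.product) {
        case "saas":
          return (user.daysSinceFirstActiveUse ?? 0) >= 30;
        case "ecommerce":
          return (user.daysSinceDelivery ?? 0) >= 14;
        case "mobile":
          return (user.coreActionsCompleted ?? 0) >= 3;
        case "web":
          return user.justCompletedMajorFeature === true;
        case "digital":
          return (user.daysSincePurchase ?? 0) >= 60;
      }
    }

    // Example: a SaaS user 35 days into active use is eligible.
    console.log(isSurveyEligible({ product: "saas", daysSinceFirstActiveUse: 35 })); // true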
