Product Survey Question: How satisfied are you with the product's durability?

Understand how customers perceive your product's long-term quality and reliability, helping you prioritize improvements that directly impact repeat purchases and brand reputation.

How satisfied are you with the product's durability?
Scale: Very dissatisfied (1) to Very satisfied (5)

Question type

Rating scale 1-5

Primary metric

CSAT (Customer Satisfaction Score)
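
CSAT from a 1-5 scale is typically reported as the share of respondents who pick the top two ratings (4 or 5). A minimal sketch of that calculation in TypeScript (the function name and sample data are illustrative, not part of any specific tool):

```typescript
// Responses on the 1-5 durability satisfaction scale.
type Rating = 1 | 2 | 3 | 4 | 5;

// CSAT is commonly the percentage of respondents who chose
// 4 ("Satisfied") or 5 ("Very satisfied").
function csatScore(responses: Rating[]): number {
  if (responses.length === 0) return 0;
  const satisfied = responses.filter((r) => r >= 4).length;
  return (satisfied / responses.length) * 100;
}

// Example: 7 of 10 respondents picked 4 or 5, so CSAT is 70%.
const sample: Rating[] = [5, 4, 3, 5, 2, 4, 4, 1, 5, 4];
console.log(`CSAT: ${csatScore(sample).toFixed(1)}%`);
```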

Answer scale variations

Typical choice: Very dissatisfied, Dissatisfied, Neutral, Satisfied, Very satisfied
Quality-focused: Very poor quality, Poor quality, Acceptable quality, Good quality, Excellent quality
Expectation-based: Falls far short, Below expectations, Meets expectations, Exceeds expectations, Far exceeds expectations
Longevity-focused: Won't last at all, Won't last long, Average lifespan, Long-lasting, Extremely durable

Follow-Up Questions

Follow-up questions help you understand not just satisfaction levels, but the specific factors driving durability perceptions. These targeted prompts transform a single satisfaction score into actionable insights about product quality, usage patterns, and improvement priorities.

An open-ended follow-up captures concrete examples of durability strengths and weaknesses, giving you direct user language to inform product improvements and quality assurance priorities.

Identifying specific failure points helps you prioritize engineering resources and adjust quality testing protocols where they'll have the most impact.

Usage duration context reveals whether durability concerns emerge early or develop over time, helping you distinguish between manufacturing defects and long-term wear patterns.
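
As a sketch of how these follow-ups are often wired up in practice (the interface, thresholds, and wording below are hypothetical, not a documented feedback.tools API), the durability rating can decide which open-ended prompt to show:

```typescript
// Hypothetical shape of a single durability answer.
interface SurveyAnswer {
  rating: 1 | 2 | 3 | 4 | 5;
  comment?: string;
}

// Branch the follow-up prompt on the score. Thresholds and wording
// are illustrative; adapt them to your own survey design.
function followUpPrompt(answer: SurveyAnswer): string {
  if (answer.rating <= 2) {
    // Dissatisfied: ask for the specific failure point.
    return "What specifically broke down or wore out sooner than you expected?";
  }
  if (answer.rating === 3) {
    // Neutral: probe usage duration for context.
    return "How long have you been using the product, and how has it held up?";
  }
  // Satisfied: capture what is working so it can be preserved.
  return "What about the product's durability has impressed you most?";
}
```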

When to Use This Question

SaaS Products: Survey users after 90 days of active use via in-app modal triggered by feature usage milestones, because this timing captures real experience with platform stability and technical reliability rather than initial impressions.

E-commerce: Send 30 days post-purchase through automated email with product images to verify physical goods quality, as this window allows customers to assess build quality through regular use while maintaining strong purchase recollection.

Mobile Apps: Trigger after users complete their 20th session using contextual in-app prompt on app close, since frequent users have tested the app across different conditions and can evaluate performance consistency and technical robustness.

Web Apps: Deploy quarterly to power users (50+ logins) through dashboard banner with quick-dismiss option, because sustained usage reveals long-term reliability issues like performance degradation that new users cannot assess.

Digital Products: Survey 60 days after initial download via email with usage statistics summary, as this timeframe lets users evaluate whether digital assets maintain quality across different projects and use cases without technical degradation.
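
One way to express the timing rules above in code is a simple eligibility check run before the survey is shown. This is a minimal sketch with hypothetical field names and thresholds taken from the guidance above, not a specific SDK:

```typescript
// Hypothetical user activity data used to decide when to trigger the survey.
interface UserActivity {
  productType: "saas" | "ecommerce" | "mobile" | "web" | "digital";
  daysSinceStart: number; // days since signup, purchase, or download
  sessionCount: number;   // completed sessions (mobile apps)
  loginCount: number;     // total logins (web apps)
}

// Timing rules mirror the recommendations above; all thresholds are illustrative.
function shouldShowDurabilitySurvey(u: UserActivity): boolean {
  switch (u.productType) {
    case "saas":
      return u.daysSinceStart >= 90;  // after 90 days of active use
    case "ecommerce":
      return u.daysSinceStart >= 30;  // 30 days post-purchase
    case "mobile":
      return u.sessionCount >= 20;    // after the 20th session
    case "web":
      return u.loginCount >= 50;      // power users with 50+ logins
    case "digital":
      return u.daysSinceStart >= 60;  // 60 days after initial download
    default:
      return false;                   // unreachable for the union above
  }
}
```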
