
Understand Low Engagement

Why is it useful?

This survey helps product managers identify reasons for low user engagement. It provides insights into user challenges and barriers to adoption. This information can be used to improve the product and enhance user engagement.

How to get started:

Once you have set up the Formbricks Widget, you have two ways to pre-segment your user base: based on events and based on attributes. Soon, you will also be able to import cohorts from PostHog with just a few clicks.


Every product has them: users who created an account, used the product a few times, and then went quiet. They did not cancel. They did not complain. They just stopped showing up. These users represent both a retention problem and a research opportunity.

A low engagement survey reaches out to inactive or underactive users to understand why they disengaged. The reasons are often fixable: they got stuck during setup, they did not find the feature they needed, or life simply got in the way. Without asking, you are left guessing.

When to deploy a low engagement survey

After a defined period of inactivity. Set a threshold that makes sense for your product's usage pattern. For daily-use tools, two weeks of inactivity is a signal. For monthly reporting tools, 60 days might be more appropriate. The threshold should reflect when absence becomes unusual for your product.

When usage drops below a meaningful level. Some users log in but do not do anything substantive. If a user who was creating three surveys per week has not created one in a month, that is low engagement even though they are technically "active."

Before the account is at risk of churning. Low engagement is a leading indicator of churn. Survey users while they still have an account and some connection to the product. Once they cancel, the window for actionable feedback narrows significantly.

After failed onboarding. Users who signed up but never completed setup or reached their first value milestone are the earliest form of low engagement. Survey them within the first week to understand what stopped them.
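The triggers above boil down to one eligibility check. The sketch below shows one way to express it; the thresholds, field names, and `should_survey` helper are illustrative, not Formbricks defaults.

```python
from datetime import datetime, timedelta
from typing import Optional


def should_survey(last_active: datetime,
                  weekly_actions: float,
                  baseline_weekly_actions: float,
                  inactivity_days: int = 14,
                  drop_ratio: float = 0.25,
                  now: Optional[datetime] = None) -> bool:
    """Flag a user for the low engagement survey.

    Fires when the user has been fully inactive past the threshold, or
    when activity has fallen well below their own baseline -- the user
    who is "technically active" but no longer doing anything substantive.
    """
    now = now or datetime.utcnow()
    inactive = now - last_active > timedelta(days=inactivity_days)
    dropped = (baseline_weekly_actions > 0
               and weekly_actions < baseline_weekly_actions * drop_ratio)
    return inactive or dropped
```

Tune `inactivity_days` to your product's cadence: 14 for daily-use tools, 60 or more for monthly reporting tools, and a much shorter window for the failed-onboarding cohort.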

Low engagement survey questions

  1. We noticed you have not been using [product] recently. What is the main reason? | Too busy / Found an alternative / Could not figure it out / Does not meet my needs / Other | Required
  2. What were you originally hoping to accomplish with [product]? | Open text | Required
  3. Did you encounter any specific problems or frustrations? | Open text | Optional
  4. What would bring you back to using [product] regularly? | Open text | Required
  5. How likely are you to use [product] again in the next 30 days? | Very likely / Somewhat likely / Unlikely / Not at all | Optional
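For downstream analysis it can help to mirror the template as structured data. This is an illustrative representation, not the actual Formbricks question schema.

```python
# Illustrative mirror of the five-question template; not the Formbricks schema.
QUESTIONS = [
    {"id": "reason", "type": "choice", "required": True,
     "choices": ["Too busy", "Found an alternative", "Could not figure it out",
                 "Does not meet my needs", "Other"]},
    {"id": "original_goal", "type": "open_text", "required": True},
    {"id": "frustrations", "type": "open_text", "required": False},
    {"id": "what_would_bring_back", "type": "open_text", "required": True},
    {"id": "return_likelihood", "type": "choice", "required": False,
     "choices": ["Very likely", "Somewhat likely", "Unlikely", "Not at all"]},
]
```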

Question one provides the high-level categorization. "Too busy" and "found an alternative" require different responses than "could not figure it out." Question two reveals whether the user's original goal was something your product actually supports. Question four is the most actionable: it tells you exactly what the user needs to re-engage.

Understanding the reasons behind low engagement

"Too busy" is often an excuse. When users say they were too busy, dig deeper. If your product were critical to their workflow, they would make time. "Too busy" usually means the product was not important enough to prioritize. The real question is why it did not become essential.

"Could not figure it out" is an onboarding problem. These users wanted to use your product but hit a wall. This is the most fixable category. Better onboarding, clearer documentation, or proactive check-ins can recover these users.

"Does not meet my needs" needs segmentation. Some users signed up with expectations your product was never designed to meet. Others have needs you could serve but currently do not. Separating these two groups prevents you from chasing features that do not align with your product direction.

"Found an alternative" is competitive intelligence. Ask what they switched to. The answers reveal your competitive landscape from the user's perspective, which is the only perspective that matters.

Re-engagement strategies based on survey data

For stuck users: fix the onboarding. If a significant portion of low-engagement users got stuck, improve the specific step where they stopped. Add a guided walkthrough, a video tutorial, or a proactive support check-in at that point.

For users who found alternatives: identify the gap. What does the alternative offer that you do not? Sometimes it is a specific feature. Sometimes it is price. Sometimes it is that the alternative was already embedded in their workflow. Each requires a different response.

For busy users: reduce time to value. If users cannot justify the time investment, make the product deliver value faster. Templates, presets, and automation that reduce setup time directly address the "too busy" objection.

For users whose needs changed: update your targeting. If users consistently sign up expecting something you do not offer, your marketing is attracting the wrong audience. Adjust your messaging to set accurate expectations.
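The four strategies above can be encoded as a lookup from question one's categories to an intervention. The category labels mirror the survey template; the intervention names and `next_step` helper are illustrative.

```python
# Map question-one answers to the re-engagement strategies described above.
INTERVENTIONS = {
    "Could not figure it out": "fix_onboarding",        # walkthrough, check-in
    "Found an alternative": "analyze_competitive_gap",  # feature, price, workflow fit
    "Too busy": "reduce_time_to_value",                 # templates, presets, automation
    "Does not meet my needs": "review_targeting",       # reset marketing expectations
}


def next_step(reason: str) -> str:
    # "Other" and unrecognized reasons fall back to manual review.
    return INTERVENTIONS.get(reason, "manual_review")
```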

Measuring re-engagement success

Track return rates by survey cohort. After surveying low-engagement users, monitor how many return to active usage within 30 and 60 days. Compare return rates across different disengagement reasons to see which interventions work.

Monitor time to re-engagement. Users who respond "very likely" to return should be tracked. If they do not return within their stated timeframe, a gentle follow-up with specific help can bridge the gap.

Compare with control group. If possible, compare re-engagement rates between users who received the survey and those who did not. The survey itself is an intervention: it reminds users that the product exists and that you care about their experience.
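The cohort and control-group comparison is simple arithmetic once the return flags are pulled from product analytics. A minimal sketch, with an assumed `returned` flag per user:

```python
def return_rate(users):
    """Share of users who came back to active usage within the window.

    `users` is a list of dicts with a boolean 'returned' flag, e.g.
    pulled from analytics 30 or 60 days after the survey went out.
    """
    if not users:
        return 0.0
    return sum(u["returned"] for u in users) / len(users)


def lift_over_control(surveyed, control):
    """Difference in return rate between surveyed users and a holdout.

    A positive lift suggests the survey itself acted as an intervention.
    """
    return return_rate(surveyed) - return_rate(control)
```

Run the same comparison per disengagement reason to see which interventions actually move the return rate.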

Common mistakes

Waiting too long to survey. The longer a user has been inactive, the less likely they are to respond or re-engage. Survey early, when the product is still somewhat top of mind.

Being too aggressive with follow-ups. One survey is a check-in. Three follow-up emails feel like desperation. Send the survey once. If they do not respond, wait at least 30 days before any additional outreach.

Not acting on "could not figure it out" responses. These are the easiest wins in your entire feedback pipeline. A user who wanted to use your product and got stuck is a user you can recover with targeted help. Prioritize personal outreach to this group.

Treating all low engagement the same. A user who was active for six months before going quiet is fundamentally different from a user who signed up and never engaged. Segment your survey responses by prior usage level to understand different disengagement patterns.

Ignoring the data in aggregate. If 30% of low-engagement users say they could not figure out setup, that is not 30 individual problems. That is a systemic onboarding issue that needs a systemic fix.
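The last two mistakes, treating all low engagement the same and missing aggregate patterns, are both avoidable with a little grouping. The field names and the 30-day cutoff below are illustrative.

```python
from collections import Counter


def reason_shares(responses):
    """Aggregate reason counts into shares to surface systemic issues."""
    counts = Counter(r["reason"] for r in responses)
    total = sum(counts.values())
    return {reason: count / total for reason, count in counts.items()}


def by_prior_usage(responses, active_days_threshold=30):
    """Split responses into never-engaged vs previously-active users."""
    segments = {"never_engaged": [], "previously_active": []}
    for r in responses:
        key = ("previously_active" if r["active_days"] >= active_days_threshold
               else "never_engaged")
        segments[key].append(r)
    return segments
```

If `reason_shares` shows 30% of users could not figure out setup, that is one systemic onboarding fix, not thirty individual support tickets.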

Set up this survey in Formbricks

Formbricks supports targeting users based on inactivity periods using custom events and user attributes. Set up an automation that triggers the low engagement survey when a user has not performed a key action within your defined threshold.

The template includes the categorized reason question, open-text follow-ups, and a re-engagement likelihood scale. Responses are tied to the user's profile, so you can cross-reference disengagement reasons with their original sign-up data, plan type, and usage history.

You can also set up automated re-engagement workflows based on survey responses. For example, users who select "could not figure it out" can automatically receive a personalized onboarding email with links to relevant help resources.
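The routing behind such a workflow might look like the sketch below, e.g. running behind a webhook that receives survey responses. The payload shape, template names, and `route_response` helper are assumptions for illustration, not the Formbricks API.

```python
def route_response(response: dict) -> dict:
    """Decide the automated follow-up for one survey response.

    `response` is an illustrative payload carrying the user's answers;
    the action and template names are placeholders.
    """
    reason = response.get("reason")
    if reason == "Could not figure it out":
        # Highest-value recovery path: targeted onboarding help.
        return {"action": "send_email", "template": "onboarding_help"}
    if reason == "Found an alternative":
        # Surface competitive intelligence to the product team.
        return {"action": "notify_team", "channel": "competitive_intel"}
    if response.get("return_likelihood") == "Very likely":
        # Follow up if they have not returned within their stated timeframe.
        return {"action": "schedule_followup", "delay_days": 30}
    return {"action": "log_only"}
```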
