What signals from product usage predict churn 90 days out?
4 signals: (1) logins declining >30% month-over-month, (2) feature adoption narrow (using <3 of 10 modules), (3) power user count down, (4) support tickets shift from how-to to product complaints. Any two = high churn risk. CSM must intervene within 14 days or accept the loss.
Churn Prediction Signals
Signal #1: Login velocity decline (most predictive)
- Track: "Logins per unique user per month"
- Red flag: Down >30% month-over-month for 2 consecutive months
- Example: Customer averaged 50 logins/month in Jan–Feb. March drops to 35. April drops to 24. High churn risk.
- Why it predicts churn: Users who stop logging in have mentally churned; by renewal time the decision is already made, and the renewal conversation is a formality before they make it official
- Intervention window: Within 2 weeks of the first >30% decline; CSM must diagnose the cause
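The red flag above can be sketched in code. This is a minimal illustration in plain Python, not a reference implementation; the function name and the list-of-monthly-averages input shape are assumptions. The 30% threshold is treated as inclusive so the worked example (50 → 35 → 24) flags:

```python
def login_velocity_flag(monthly_logins_per_user, threshold=0.30):
    """Flag high churn risk when logins-per-user drop by `threshold`
    or more month-over-month for 2 consecutive months.
    `monthly_logins_per_user` is a chronological list of monthly averages."""
    consecutive = 0
    for prev, curr in zip(monthly_logins_per_user, monthly_logins_per_user[1:]):
        if prev > 0 and (prev - curr) / prev >= threshold:
            consecutive += 1
            if consecutive >= 2:
                return True
        else:
            consecutive = 0  # a single bad month resets the streak
    return False
```

For example, `login_velocity_flag([50, 50, 35, 24])` flags (30% then ~31% drops), while `[50, 48, 47, 46]` does not.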
Signal #2: Feature breadth narrowing (adoption cliff)
- Track: "Number of distinct features/modules used per month"
- Baseline: Month 1 usage is highest (honeymoon phase); track "features used" steadily from Month 2–12
- Red flag: Dropped from 7 features in Month 3 to 3 features in Month 9
- Why it predicts churn: Customers who narrow usage are typically failing to solve the core problem with the product; they're scaling back before they cancel
- Intervention: "I notice you're using [core feature] and [secondary feature]. What happened to [feature X]? Can I help you get that back?"
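A sketch of the breadth check, using the "dropped 2+ modules" threshold from the early-warning table below. The baseline here is peak breadth from Month 2 onward (Month 1 is excluded as the honeymoon phase, per the text); the function name and input shape are assumptions:

```python
def feature_breadth_flag(features_by_month, drop=2):
    """Flag when distinct modules used per month has fallen by `drop`
    or more from the post-honeymoon peak.
    `features_by_month` is a chronological list of monthly counts."""
    if len(features_by_month) < 3:
        return False  # not enough history past Month 1 to baseline
    baseline = max(features_by_month[1:])  # peak breadth after Month 1
    current = features_by_month[-1]
    return baseline - current >= drop
```

The text's example (7 features in Month 3 narrowing to 3 by Month 9) would flag under this check.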
Signal #3: Power user attrition
- Track: "Logins per top 3 users" (your evangelists)
- Red flag: Your top 3 users drop usage >50% in a single month
- Why it predicts churn: Power users are adoption champions. If they stop using it, the team follows
- Scenario: Your champion uses the product 20 times/month. Suddenly, 2 times/month. They've moved on.
- Intervention window: Within 3 days of detecting this; it signals org change or disengagement
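The champion scenario above (20 logins/month collapsing to 2) can be detected like this. This sketch assumes the red flag fires when *any* of the top 3 users drops >50%, which is one reading of the rule; the function name and dict-of-user-counts shape are assumptions:

```python
def power_user_flag(prev_logins, curr_logins, top_n=3, drop=0.50):
    """Flag when any of the top-N users (ranked by last month's logins)
    drops usage by more than `drop` in a single month.
    Both args are dicts mapping user -> monthly login count."""
    top = sorted(prev_logins, key=prev_logins.get, reverse=True)[:top_n]
    for user in top:
        prev = prev_logins[user]
        curr = curr_logins.get(user, 0)  # missing user = zero logins
        if prev > 0 and (prev - curr) / prev > drop:
            return True
    return False
```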
Signal #4: Support ticket shift (sentiment change)
- Track: Support tickets by category — "How-to" vs. "Bug report" vs. "Feature request" vs. "Complaint"
- Red flag: Shift from "How do I...?" to "This doesn't work" or "Why is this so hard?"
- Why it predicts churn: Early support = learning. Late support = frustration. Shift = adoption failure
- Example: Jan–May = 70% how-to questions. June–Aug = 50% complaints, 20% how-to. Warning sign.
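One way to operationalize the sentiment shift: flag when the complaint share of tickets is rising period-over-period *and* complaints now outnumber how-to questions. That conjunction is an assumed interpretation of the red flag, and the function name and category keys are illustrative:

```python
def ticket_shift_flag(early, late):
    """Flag a shift from learning ('how-to') to frustration ('complaint').
    `early` and `late` are dicts mapping ticket category -> count
    for two comparison periods (e.g. Jan-May vs June-Aug)."""
    def share(counts, cat):
        total = sum(counts.values())
        return counts.get(cat, 0) / total if total else 0.0
    rising = share(late, "complaint") > share(early, "complaint")
    overtaken = share(late, "complaint") > share(late, "how-to")
    return rising and overtaken
```

The text's example (70% how-to early, then 50% complaints / 20% how-to) flags under this check.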
Early warning system (build in CRM/BI):
| Week | Metric | Threshold | CSM Action |
|---|---|---|---|
| Weekly | Login count | Down >20% vs prior week | Monitor; no action yet |
| Monthly | Login velocity | Down >30% MoM | CSM schedules check-in |
| Monthly | Feature breadth | Dropped 2+ modules | CSM diagnoses which features abandoned |
| Monthly | Power user logins | Down >50% MoM | Escalate to manager; call power user |
| Ongoing | Support sentiment | Shift to complaints | CSM joins next support ticket |
Secondary signals (supporting, not primary):
- Session duration declining — Users log in but spend less time
- Reporting requests down — They've stopped monitoring the metric your tool tracks
- API calls dropping — Integration is now dormant (data stopped flowing)
- Help/support escalations up — User frustration is rising
- Department engagement shift — Power user's department was affected (layoff, reorganization)
Churn prediction accuracy:
- 1 signal present: 30% churn risk (false positive risk is high; don't over-react)
- 2 signals present: 65% churn risk (high confidence; CSM must intervene)
- 3+ signals present: 85%+ churn risk (account is likely lost; focus on negotiating a smooth offboarding or managed transition to the replacement vendor)
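The risk tiers above map directly to a scoring function. A minimal sketch: the inputs are the four boolean detectors, and the 0.0 baseline for zero signals is an assumption, not from the text:

```python
def churn_risk(signals):
    """Map the count of active signals (booleans from the four detectors)
    to the risk tiers in the text: 1 -> 30%, 2 -> 65%, 3+ -> 85%+."""
    count = sum(bool(s) for s in signals)
    if count >= 3:
        return 0.85, "likely lost; plan smooth offboarding"
    if count == 2:
        return 0.65, "high confidence; intervene within 14 days"
    if count == 1:
        return 0.30, "monitor; high false-positive risk"
    return 0.0, "healthy"  # zero-signal baseline (assumed, not from the text)
```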
Intervention playbook (upon detection of 2+ signals):
Day 1–3: CSM diagnosis call
- "I noticed your team's usage patterns changed. What's going on?"
- Listen for: Org change, product gap, budget pressure, adoption challenge
- Ask: "Are we still solving the problem you hired us for?"
Day 4–7: Root cause proposal
- If it's adoption: "Let's re-train your team. I'll embed for 2 weeks."
- If it's product gap: "We've shipped [feature] since you onboarded; want to revisit?"
- If it's org change: "Your new [role] may have different priorities. Let's align on what matters."
- If it's budget: "Seems like resource constraints; can we right-size your plan?"
Day 8–14: Commitment
- Customer commits to a reset (extended pilot, 90-day focus, etc.)
- CSM monitors logins weekly; target is stabilization by Day 30
- If logins don't stabilize by Day 30, accept the churn and prep for smooth transition
When signals DON'T predict churn (false positives):
- Seasonal business (e.g., retailer low usage off-season) → Compare to same month last year
- Known one-time event (e.g., rep on vacation, dept freeze on new tool training) → Monitor when normal resumes
- Planned downsizing (e.g., team reduction) → Expected; ask customer if renewal still makes sense
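The seasonality check in the first bullet can be wired into the login detector: only treat a month-over-month decline as a signal when usage is also down versus the same month last year. A sketch, with the same assumed >30% threshold as Signal #1:

```python
def seasonal_adjusted_decline(this_month, prev_month, same_month_last_year,
                              threshold=0.30):
    """Return True only when logins are down more than `threshold` both
    month-over-month AND year-over-year, so a seasonal dip (low usage
    every off-season) does not fire a false positive."""
    def pct_drop(prev, curr):
        return (prev - curr) / prev if prev > 0 else 0.0
    mom = pct_drop(prev_month, this_month)
    yoy = pct_drop(same_month_last_year, this_month)
    return mom > threshold and yoy > threshold
```

For a retailer whose off-season always runs low (e.g. 30 logins now vs 32 the same month last year), the MoM drop alone would not flag.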
SaaStr research: Churn prediction models with these 4 signals have 70%+ accuracy at 90 days out, assuming CSM can act on them by day 45.
TAGS: churn-prediction, product-usage, early-warning, retention, customer-success