You are an Engineering Manager at ClickUp responsible for the team that owns ClickUp Tasks, task views, and related workflow performance. Over the last two quarters, leadership has said the team is "shipping a lot," but customer-facing outcomes are mixed: feature output increased while enterprise account complaints about reliability and workflow friction also rose.
For Q2, your team planned 42 roadmap items and completed 39. Story points completed increased from 520 in Q1 to 690 in Q2 (+33%). However, Sev-1/Sev-2 incidents rose from 4 to 9, escaped defects per 1,000 WAU rose from 1.8 to 3.1, and p95 task load latency in ClickUp Web worsened from 1.2s to 1.8s. Adoption of two major launches was uneven: the new Table View bulk actions feature reached 46% of eligible weekly active workspaces by week 6, while the AI task summary panel reached only 11%. Team cycle time improved from 9.4 days to 7.1 days, but 30-day retention for newly activated workspaces using the shipped features was flat at 71%.
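Before answering, it helps to sanity-check the quarter-over-quarter deltas implied by the numbers above. A minimal sketch using only the figures quoted in the scenario (no new data):

```python
# Q1 -> Q2 figures taken directly from the scenario text.
q1 = {"story_points": 520, "sev12_incidents": 4, "defects_per_1k_wau": 1.8, "p95_load_s": 1.2}
q2 = {"story_points": 690, "sev12_incidents": 9, "defects_per_1k_wau": 3.1, "p95_load_s": 1.8}

def pct_change(before, after):
    """Percentage change from before to after."""
    return (after - before) / before * 100

# Roadmap completion: 39 of 42 planned items shipped.
completion_rate = 39 / 42

for key in q1:
    print(f"{key}: {pct_change(q1[key], q2[key]):+.1f}%")
print(f"completion_rate: {completion_rate:.1%}")
# story_points rose ~+32.7% while Sev-1/Sev-2 incidents rose +125.0%,
# escaped defects +72.2%, and p95 latency +50.0% -- output up, quality down.
```

The point the deltas make: every throughput metric improved while every quality and reliability metric regressed faster, which is the tension the VP's question is probing.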
The VP of Engineering asks: "How should we measure whether this team is actually delivering successfully, not just shipping more?"
Available tables:
jira_issues: issue_id, team_id, story_points, created_at, started_at, completed_at, issue_type, priority
deployments: deploy_id, service_name, deploy_time, rollback_flag, change_failure_flag
incident_log: incident_id, severity, start_time, end_time, impacted_surface
feature_events: workspace_id, user_id, feature_name, event_name, timestamp
workspace_health: workspace_id, plan_type, weekly_active_users, retention_30d, expansion_revenue
performance_logs: surface, endpoint, p50_ms, p95_ms, error_rate
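Given these schemas, a success scorecard would pair delivery signals (cycle time, deploy frequency) with quality and outcome signals (change failure rate, feature adoption, retention). A minimal stdlib sketch of two such metrics; the sample rows and the eligibility set are hypothetical stand-ins, and in practice this would be SQL over the warehouse tables named above:

```python
# Hypothetical sample rows shaped like the deployments and feature_events tables.
deployments = [
    {"deploy_id": 1, "service_name": "tasks", "change_failure_flag": False},
    {"deploy_id": 2, "service_name": "tasks", "change_failure_flag": True},
    {"deploy_id": 3, "service_name": "views", "change_failure_flag": False},
    {"deploy_id": 4, "service_name": "views", "change_failure_flag": False},
]
feature_events = [
    {"workspace_id": "w1", "feature_name": "table_bulk_actions", "event_name": "used"},
    {"workspace_id": "w2", "feature_name": "table_bulk_actions", "event_name": "used"},
    {"workspace_id": "w3", "feature_name": "ai_task_summary", "event_name": "used"},
]
eligible_workspaces = {"w1", "w2", "w3", "w4"}  # hypothetical eligibility set

def change_failure_rate(rows):
    """Share of deploys flagged as change failures (one of the DORA metrics)."""
    return sum(r["change_failure_flag"] for r in rows) / len(rows)

def adoption_rate(events, feature, eligible):
    """Distinct adopting workspaces over eligible workspaces."""
    adopters = {e["workspace_id"] for e in events if e["feature_name"] == feature}
    return len(adopters & eligible) / len(eligible)

print(f"change failure rate: {change_failure_rate(deployments):.0%}")
print(f"bulk actions adoption: "
      f"{adoption_rate(feature_events, 'table_bulk_actions', eligible_workspaces):.0%}")
```

The design choice worth noting: adoption is measured against eligible workspaces, not all workspaces, which is how the 46% and 11% figures in the scenario are defined.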