Query your engineering metrics directly from Cursor. Get instant answers about code velocity, PR cycle time, review quality, AI code adoption, and 40+ other metrics without leaving your editor.
Four tools connected to your Weave workspace:
| Tool | Description |
|---|---|
| `get_metric_overview` | Aggregated metrics with trends, time series, and benchmarks |
| `get_metric_drill_down` | Detailed records behind any metric — individual PRs, tasks, reviews |
| `get_accounts` | Team member names and IDs for filtering |
| `get_teams` | Team names and IDs for filtering |
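As an illustration, a question about last month's cycle time might translate into a `get_metric_overview` call shaped roughly like the sketch below. The exact argument names (`metric`, `start_date`, `group_by`, and so on) are assumptions for illustration — the agent constructs these calls for you, so you never write them by hand:

```json
{
  "tool": "get_metric_overview",
  "arguments": {
    "metric": "pr_cycle_time",
    "start_date": "2025-01-01",
    "end_date": "2025-01-31",
    "group_by": "team"
  }
}
```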
Pre-configured guidance that teaches the AI how to use Weave tools effectively — correct metric names, date formats, and best practices.
A structured workflow for answering complex analytics questions: resolving entities, fetching the right metrics, and presenting actionable insights.
A purpose-built agent persona that thinks like an engineering leader — it leads with insights, contextualizes trends, and suggests next steps.
- Install the plugin from the Cursor Marketplace
- Add your API key in Cursor Settings:
  - Open Cursor Settings (`Cmd+Shift+J`)
  - Go to Tools & MCP
  - Find the `weave` server and click the edit (pencil) icon
  - Replace `YOUR_WEAVE_API_KEY` with your actual key (e.g., `wkey_abc123...`)
  - Restart Cursor for the change to take effect
- Start asking questions in Cursor chat:
- "What's our PR cycle time this month?"
- "Compare code output across teams for Q1"
- "How much AI-written code are we shipping?"
- "Who has the longest review turnaround?"
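If you prefer editing the configuration file directly, the `weave` entry in Cursor's MCP settings typically looks something like the fragment below. The environment variable name shown is an assumption for illustration — keep whatever key name your generated config uses, and swap the `YOUR_WEAVE_API_KEY` placeholder for your real key:

```json
{
  "mcpServers": {
    "weave": {
      "env": {
        "WEAVE_API_KEY": "YOUR_WEAVE_API_KEY"
      }
    }
  }
}
```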
| Question | What happens |
|---|---|
| "How is the team doing?" | Fetches code output, PRs, and cycle time for the last 30 days with trend comparison |
| "Show me AI code adoption over time" | Returns weekly ai_code_percentage time series |
| "Which repos have the slowest cycle time?" | Drills down into pr_cycle_time grouped by repository |
| "Compare our review quality to benchmarks" | Fetches code_review_quality with organization benchmarks |
Code velocity
- `code_output` — Weighted productivity score
- `code_output_per_engineer` — Per-person output
- `code_loc` — Lines of code changed
- `prs` — Pull requests merged
- `prs_per_engineer` — PRs per person
- `pr_cycle_time` — Open to merge duration
- `pr_merge_time` — Approval to merge duration
- `pr_deploy_lead_time` — Merge to deploy duration
Code review
- `code_reviews` — Reviews performed
- `code_review_turnaround` — Time to first review
- `code_review_quality` — Review thoroughness score
- `review_cycles` — Review rounds before merge
- `pr_review_rate` — % of PRs reviewed
- `comment_resolution` — Comment resolution details
- `comment_resolution_rate` — Resolution rate
Task delivery
- `tasks` — Tasks completed
- `task_count` — Total task count
- `points` — Story points delivered
- `points_per_engineer` — Points per person
- `task_lead_time` — Creation to completion
- `task_delivery` — Delivery rate
- `bug_tasks` — Bug tasks
AI code
- `ai_code_loc` — AI-assisted lines of code
- `ai_code_percentage` — % of code written with AI
- `ai_output_percentage` — AI share of code output
- `ai_efficiency_index` — Composite score (volume, usage, cost, churn)
- `output_per_ai_dollar` — Output per AI dollar
- `tool_costs` — AI tool costs
Quality
- `bugs_introduced` — Bugs introduced
- `bug_ratio` — Bug-to-feature ratio
- `revert_prs` — Reverted PRs
- `code_turnover` — Code churn
- `innovation_ratio` — Feature vs maintenance ratio
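To see how a metric name from the lists above flows into a drill-down, here is a sketch of the `get_metric_drill_down` call the agent might issue for a "slowest repos" question. The argument names (`group_by`, `sort`, `limit`) are illustrative assumptions, not a documented request schema:

```json
{
  "tool": "get_metric_drill_down",
  "arguments": {
    "metric": "pr_cycle_time",
    "group_by": "repository",
    "sort": "desc",
    "limit": 10
  }
}
```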
- A Weave account with connected data sources (GitHub, GitLab, Linear, Jira, etc.)
- An API key from your Weave settings