Six Dimensions of Performance Radar
Assess team performance across six key dimensions — Quality, Responsiveness, Predictability, Productivity, Flow, and Value — with an interactive radar chart.
Most teams measure one thing — velocity, story points, whatever — and then wonder why everything feels off. The problem isn't the metric. It's that you're only looking at one dimension. This radar plots your team across six: Quality, Responsiveness, Predictability, Productivity, Flow, and Value. You rate each one, and the shape that emerges tells you more about your team's health than any single number ever could. Lopsided hexagons don't lie. I've used this in retros, quarterly reviews, even during re-orgs, and it consistently surfaces blind spots teams didn't know they had.
How to Use the Performance Radar
Name Your Assessment
Give your assessment a meaningful name such as the sprint number, quarter, or team name. This allows you to compare assessments over time and track improvement trends.
Rate Each Dimension
Score each of the six dimensions from 1 (low) to 10 (high) based on your team's current performance. Use data where available and team consensus where metrics are subjective.
Analyze the Radar Chart
Review the generated radar chart to see your team's performance shape. A balanced hexagon indicates well-rounded performance. Dips reveal areas needing attention. Compare multiple assessments to track progress.
Complete Guide to Six Dimensions of Performance
Worked Examples
Example: Identifying an Imbalanced Team
Given: A team scores Quality: 9, Responsiveness: 3, Predictability: 7, Productivity: 8, Flow: 6, Value: 5.
Step 1: Plotted the scores and immediately saw the dip — Responsiveness at 3 was creating a visible dent in an otherwise solid hexagon.
Step 2: Asked the team what was going on. Turns out incoming requests just piled up in a shared Slack channel with no triage process. Things got lost. People got frustrated. Nobody owned it.
Step 3: They created a dead-simple triage rotation: one person per sprint handles incoming requests within 4 hours. Not glamorous, but it worked.
Result: Responsiveness jumped from 3 to 6 over two sprints. Nothing else degraded — which was the team's biggest fear. Sometimes the fix is embarrassingly simple.
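The dip-spotting in Step 1 can be done by eye, but it also reduces to a tiny bit of arithmetic. Here is a minimal sketch using the scores from this example; the rule of flagging anything more than 2 points below the mean is an illustrative assumption, not something the tool prescribes.

```python
# Scores from the worked example above.
scores = {
    "Quality": 9, "Responsiveness": 3, "Predictability": 7,
    "Productivity": 8, "Flow": 6, "Value": 5,
}

# Flag any dimension sitting well below the team's average.
# The 2-point threshold is an assumption for illustration.
mean = sum(scores.values()) / len(scores)
dips = [d for d, s in scores.items() if s < mean - 2]

print(f"Mean score: {mean:.1f}")  # Mean score: 6.3
print(f"Dips: {dips}")            # Dips: ['Responsiveness']
```

Responsiveness is the only dimension far enough below the mean to register, which matches what the dent in the hexagon showed at a glance.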
Example: Tracking Quarterly Improvement
Given: Three quarterly assessments — Q1 overall score: 4.5, Q2 overall score: 5.8, Q3 overall score: 6.7.
Step 1: Overlaid all three radars. The hexagon was clearly growing — satisfying to see after months of effort.
Step 2: Flow improved the most, from 3 to 7. The team attributed it directly to the WIP limits they introduced in Q2. Hard to argue with a 4-point jump.
Step 3: But Value stayed flat at 5 across all three quarters. Nobody had deliberately worked on it — there was no customer feedback loop in place.
Result: The team made Value their Q4 focus. They set up monthly user interviews and started tracking feature adoption rates. Sometimes you don't improve what you don't measure — even when you're measuring five other things.
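The "overall score" in this example is assumed here to be the plain mean of the six dimension scores. The per-dimension values below are illustrative, chosen to be consistent with the narrative (Flow rising from 3 to 7, Value flat at 5); only the overall scores and those two facts come from the example itself.

```python
# Illustrative dimension scores per quarter, in the order:
# Quality, Responsiveness, Predictability, Productivity, Flow, Value.
# Assumed values; only the resulting overall scores match the example.
quarters = {
    "Q1": [5, 4, 5, 5, 3, 5],
    "Q2": [6, 6, 6, 6, 6, 5],
    "Q3": [7, 7, 7, 7, 7, 5],
}

for quarter, dims in quarters.items():
    overall = sum(dims) / len(dims)
    print(quarter, round(overall, 1))  # Q1 4.5, Q2 5.8, Q3 6.7
```

Note how the last value in each list never moves: a flat dimension is invisible in the overall average when everything else is improving, which is exactly why the radar overlay caught it and a single trend line would not have.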
Practical Use Cases
Sprint Retrospective Assessment
“We ran the radar at the end of Sprint 14 and compared it to Sprint 12. Productivity went up by two points — great — but Predictability dropped. Turns out the team was shipping more by skipping estimation entirely. The radar caught a trade-off nobody noticed until the shape told the story.”
Quarterly Maturity Review
“One engineering director I know overlays four sprints' worth of radars every quarter and presents the trend to leadership. It's become their go-to format for showing whether coaching investments are paying off. Way more convincing than a slide deck full of velocity charts that nobody trusts.”
Cross-Team Benchmarking
“Two teams at the same company had wildly different shapes. Team A was a spike on Predictability, Team B was a spike on Quality. Rather than declare a winner, the VP set up a knowledge exchange — each team taught the other their secret sauce. Both radars improved the following quarter. Not equally, but noticeably.”
Frequently Asked Questions
What are the six dimensions of performance?
Quality (defects, craftsmanship), Responsiveness (how fast you react to change), Predictability (do you deliver what you promised?), Productivity (output volume — careful with this one), Flow (how smoothly work moves through your system), and Value (are customers actually getting something useful?). The last one is the hardest to score but arguably the most important.
How should I score each dimension?
Use data when you have it — defect counts for Quality, forecast accuracy for Predictability, that sort of thing. For the squishier dimensions, just poll the team. Have everyone write a number on a sticky note, reveal simultaneously, and discuss the outliers. Takes ten minutes and avoids the anchoring problem where the first person to speak sets everyone's score.
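The simultaneous-reveal step above can be summarized in a few lines: collect one number per person, then surface the outliers for discussion rather than averaging them away. This is a sketch under stated assumptions; the votes are hypothetical, and the rule of discussing anything 2 or more points from the median is illustrative.

```python
# One score per team member for a single dimension (hypothetical votes).
votes = [6, 7, 6, 3, 7]

# Use the median as the reference point so one extreme vote
# doesn't drag it around, then flag votes far from it.
# The 2-point discussion threshold is an assumption.
median = sorted(votes)[len(votes) // 2]
outliers = [v for v in votes if abs(v - median) >= 2]

print("Median:", median)     # Median: 6
print("Discuss:", outliers)  # Discuss: [3]
```

The point of the exercise is that the 3 gets talked about, not silently averaged in: the person who voted low often knows something the rest of the room doesn't.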
How often should I run this assessment?
Every sprint works well for most teams. Some do it biweekly. The important thing is consistency — sporadic assessments tell you nothing about trends. If you're only going to do it quarterly, at least overlay multiple data points so you can see direction.
What does a balanced radar shape mean?
It means your team is roughly even across dimensions. But — and this trips people up — balanced doesn't automatically mean good. A perfectly round hexagon where every dimension is a 3 is balanced and terrible. You want balanced AND high. The shape tells you about distribution; the size tells you about performance.
Can I compare assessments over time?
Yes. Save each assessment with a name like 'Sprint 10' or 'Q2 2024' and the tool overlays them on the same chart. It's incredibly satisfying to watch the hexagon grow over a few months — and sobering when a dimension you thought you fixed starts shrinking again.
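Spotting a dimension that "starts shrinking again" comes down to comparing two saved assessments dimension by dimension. A minimal sketch of that comparison, with hypothetical assessment names and scores (this is not the tool's actual storage format):

```python
# Two saved assessments, keyed by the names they were saved under.
# Names, dimensions, and scores here are all illustrative.
sprint_10 = {"Quality": 6, "Responsiveness": 5, "Flow": 4}
q2_2024   = {"Quality": 7, "Responsiveness": 4, "Flow": 6}

# Any dimension whose newer score is lower has regressed.
shrinking = {
    d: (sprint_10[d], q2_2024[d])
    for d in sprint_10
    if q2_2024[d] < sprint_10[d]
}

print(shrinking)  # {'Responsiveness': (5, 4)}
```

Overlaying the radars shows the same thing visually: the newer polygon pulls inward along the regressed axis even while it grows everywhere else.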
Is my data private and secure?
Totally. Everything happens in your browser. No server, no database, no tracking. Your scores stay on your device.
Is this tool free?
Yes. Free, no sign-up, no strings attached.
Related Tools
Team Health Radar Generator
Visualize team health with a radar/spider chart. Score dimensions, track trends over time, and export as PNG or JSON.
Kano Model Analysis
Classify product features into Must-be, Performance, Attractive, and Indifferent categories.
Backlog Arrival Rate Forecaster
Forecast future backlog arrival rates using seasonal decomposition with linear trend analysis.
Recommended Books on Team Performance & Metrics

The Five Dysfunctions of a Team
Patrick Lencioni

The Balanced Scorecard
Robert S. Kaplan

Team Topologies
Matthew Skelton
As an Amazon Associate we earn from qualifying purchases.
Recommended Products for Team Assessment & Facilitation

Magnetic Whiteboard with Stand Double Sided 40x28 inch Portable
Viz-Pro

The Five Dysfunctions Facilitator's Guide Set
Wiley

PATboard Scrum Board Basic Toolset Magnetic
PATboard