/spec-retro

Run a structured retrospective on a completed spec. Analyzes metrics from the feature's lifecycle — implementation sessions, debugging cycles, spec changes, commit history — and facilitates a conversation about what went well and what to improve next time.

Usage

/spec-retro [spec-name]

Arguments

Argument     Required   Description
spec-name    No         Name of the spec to retrospect on. Auto-detected if only one spec exists.

What It Does

  1. Locates the spec and reads all lifecycle artifacts: tasks.md, progress.md, acceptance.md, release.md, and git history.

  2. Collects metrics automatically:
     • Number of implementation sessions and tasks requiring multiple attempts
     • Tasks added mid-implementation (scope creep indicator)
     • Ratio of feature commits to fix/debug commits
     • First-pass acceptance rate and which criteria failed initially
     • Number of times requirements.md or design.md were modified after initial creation

  3. Identifies patterns before the conversation begins:
     • Positive signals: high first-pass acceptance, few debugging cycles, no spec modifications during implementation
     • Friction signals: tasks requiring many debugging cycles, reviewer rejections, spec changes mid-implementation, integration failures, scope additions

  4. Facilitates a three-round discussion:
     • Round 1 — What went well (presents positive signals, asks for agreement or additions)
     • Round 2 — What caused friction (presents friction signals with data, asks for other pain points)
     • Round 3 — Improvements (suggests specific actions, asks which to prioritize)

  5. Generates retro.md in the spec directory with:
     • Metrics table
     • What went well (data-backed)
     • What caused friction (with root cause analysis)
     • Action items with priority and scope
     • User notes from the discussion

  6. Summarizes key takeaways — top 3 things to keep doing, top 3 improvements for the next spec.
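
The commit-ratio metric in step 2 can be approximated from the subject lines of `git log --oneline`. A minimal sketch of that idea; the keyword heuristics below are illustrative assumptions, not the command's actual classification rules:

```python
import re

# Illustrative heuristic: commits whose subject starts with one of these
# words count as fix/debug work. The real command may classify differently.
FIX_PATTERN = re.compile(r"^(fix|debug|hotfix|revert)\b", re.IGNORECASE)

def commit_ratio(subjects):
    """Return (feature_count, fix_count) for a list of commit subject lines."""
    fixes = sum(1 for s in subjects if FIX_PATTERN.match(s))
    return len(subjects) - fixes, fixes

# Example subjects, as they might appear in `git log --oneline` output.
subjects = [
    "feat: add login endpoint",
    "fix: handle empty password",
    "feat: add session refresh",
    "debug: log token expiry",
]
features, fixes = commit_ratio(subjects)
print(f"{features} feature commits, {fixes} fix/debug commits")
```

A feature-heavy ratio suggests smooth implementation; a fix-heavy ratio is one of the friction signals surfaced in step 3.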

Example

/spec-retro user-authentication
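
An illustrative excerpt of the generated retro.md (the exact headings and layout are up to the command; all values here are placeholders):

```markdown
# Retro: user-authentication

## Metrics

| Metric                         | Value |
| ------------------------------ | ----- |
| Implementation sessions        | 3     |
| Tasks added mid-implementation | 2     |
| Feature vs. fix/debug commits  | 12:5  |
| First-pass acceptance          | 7/9   |

## What went well
...

## What caused friction
...

## Action items
- [ ] Add integration test tasks for API endpoints in the tasks template
```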

Tips

  • Run this while the feature is still fresh. Do not wait weeks after deployment.
  • The automated analysis is a starting point. Your own observations matter more than the metrics.
  • Action items should be specific, not vague. "Improve testing" is not an action item. "Add integration test tasks for API endpoints in the tasks template" is.
  • Keep retrospectives focused on process, not individuals.
  • If this is the first retro, there is no baseline to compare against — that is fine. It establishes one for future comparisons.
  • Over time, retrospectives across multiple specs reveal systemic patterns that are hard to see on a single feature.

Tip

Review the previous retro's action items before starting a new spec. The value of retrospectives compounds over time only if the improvements are actually applied.
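
One way to make that review routine is a small helper that lists still-open action items. A hypothetical sketch, assuming retro.md files live under a specs/ directory and action items are recorded as markdown checkboxes:

```python
from pathlib import Path

def open_action_items(specs_dir="specs"):
    """Collect unchecked '- [ ]' items from every retro.md under specs_dir."""
    items = []
    specs = Path(specs_dir)
    if not specs.is_dir():
        return items  # nothing to scan yet
    for retro in specs.glob("*/retro.md"):
        for line in retro.read_text().splitlines():
            if line.lstrip().startswith("- [ ]"):
                # Record (spec name, action item text without the checkbox).
                items.append((retro.parent.name, line.strip()[6:]))
    return items

for spec, item in open_action_items():
    print(f"{spec}: {item}")
```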

See Also

  • /spec-accept — Acceptance testing results feed into the retro's first-pass rate metric
  • /spec-verify — Post-deployment issues can be captured in the retro
  • /spec — Start the next spec with lessons learned applied