# Revalidate Story - Verify Checkboxes Against Codebase Reality
The workflow execution engine is governed by: {project-root}/_bmad/core/tasks/workflow.xml
You MUST have already loaded and processed: {installed_path}/workflow.yaml
Verify that the story_file parameter was provided
If story_file is missing: HALT
Read COMPLETE story file: {{story_file}}
Parse sections: Acceptance Criteria, Tasks/Subtasks, Definition of Done, Dev Agent Record
Extract story_key from filename (e.g., "2-7-image-file-handling")
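A minimal sketch of the story_key extraction, assuming the key is simply the story filename without its extension (the path used here is hypothetical):

```python
from pathlib import Path

# Assumed convention: story_key is the filename minus its extension,
# e.g. "2-7-image-file-handling.md" -> "2-7-image-file-handling".
def extract_story_key(story_file: str) -> str:
    return Path(story_file).stem

print(extract_story_key("stories/2-7-image-file-handling.md"))
```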
Create backup of current checkbox state:
Count currently checked items:
- ac_checked_before = count of [x] in Acceptance Criteria
- tasks_checked_before = count of [x] in Tasks/Subtasks
- dod_checked_before = count of [x] in Definition of Done
- total_checked_before = sum of above
Use Edit tool to replace all [x] with [ ] in Acceptance Criteria section
Use Edit tool to replace all [x] with [ ] in Tasks/Subtasks section
Use Edit tool to replace all [x] with [ ] in Definition of Done section
Save story file with all boxes unchecked
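A hedged sketch of the count-and-reset step above, assuming GitHub-style markdown checkboxes (`[x]` / `[ ]`) and `##`-level section headings; the section names come from the story format described here, everything else is illustrative:

```python
import re

SECTIONS = ["Acceptance Criteria", "Tasks/Subtasks", "Definition of Done"]

def section_body(markdown: str, heading: str) -> str:
    # Everything from the "## <heading>" line to the next same-level heading.
    match = re.search(rf"^##\s+{re.escape(heading)}\s*\n(.*?)(?=^##\s|\Z)",
                      markdown, re.MULTILINE | re.DOTALL)
    return match.group(1) if match else ""

def count_checked(markdown: str) -> dict:
    # ac_checked_before / tasks_checked_before / dod_checked_before, keyed by section.
    return {name: len(re.findall(r"\[x\]", section_body(markdown, name), re.IGNORECASE))
            for name in SECTIONS}

def uncheck_sections(markdown: str) -> str:
    # Reset [x] -> [ ] inside the three sections only; other sections are untouched.
    for name in SECTIONS:
        body = section_body(markdown, name)
        if body:
            markdown = markdown.replace(
                body, re.sub(r"\[x\]", "[ ]", body, flags=re.IGNORECASE), 1)
    return markdown
```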
Extract all AC items from Acceptance Criteria section
For each AC item:
Extract AC description and identify artifacts:
- File mentions (e.g., "UserProfile component")
- Function names (e.g., "updateUser function")
- Features (e.g., "dark mode toggle")
- Test requirements (e.g., "unit tests covering edge cases")
Use Glob to find relevant files:
- If AC mentions specific file: glob for that file
- If AC mentions component: glob for **/*ComponentName*
- If AC mentions feature: glob for files in related directories
Use Grep to search for symbols/functions/features
Read found files to verify:
- NOT a stub (check for "TODO", "Not implemented", "throw new Error")
- Has actual implementation (not just empty function)
- Tests exist (search for *.test.* or *.spec.* files)
- Tests pass (if --fill-gaps mode, run tests)
If a real implementation and its tests are found: verification_status = VERIFIED
Check box [x] in story file for this AC
Record evidence: "✅ VERIFIED: {{files_found}}, tests: {{test_files}}"
If an implementation exists but is incomplete or untested: verification_status = PARTIAL
Check box [~] in story file for this AC
Record gap: "🔶 PARTIAL: {{what_exists}}, missing: {{what_is_missing}}"
Add to gaps_list with details
If no implementation is found: verification_status = MISSING
Leave box unchecked [ ] in story file
Record gap: "❌ MISSING: No implementation found for {{ac_description}}"
Add to gaps_list with details
Save story file after each AC verification
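A hedged sketch of the per-AC verification above (glob for artifacts, reject stubs, look for tests). The stub markers and test-file patterns are the ones listed in this step; the helper name and glob patterns are illustrative assumptions, not a confirmed API:

```python
import glob
import os
import re

STUB_MARKERS = ("TODO", "Not implemented", "throw new Error")

def verify_artifact(name: str) -> str:
    """Return VERIFIED, PARTIAL, or MISSING for one named artifact (illustrative)."""
    hits = [p for p in glob.glob(f"**/*{name}*", recursive=True) if os.path.isfile(p)]
    tests = [p for p in hits if re.search(r"\.(test|spec)\.", p)]
    sources = [p for p in hits if p not in tests]
    if not sources:
        return "MISSING"                      # no implementation found
    stubbed = any(
        marker in open(path, encoding="utf-8", errors="ignore").read()
        for path in sources for marker in STUB_MARKERS
    )
    return "PARTIAL" if stubbed or not tests else "VERIFIED"
```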
Extract all Task items from Tasks/Subtasks section
For each Task item (same verification logic as ACs):
Parse task description for artifacts
Search codebase with Glob/Grep
Read and verify (check for stubs, tests)
Determine status: VERIFIED | PARTIAL | MISSING
Update checkbox: [x] | [~] | [ ]
Record evidence or gap
Save story file
Extract all DoD items from Definition of Done section
For each DoD item:
Parse DoD requirement:
- "Type check passes" → Run type checker
- "Unit tests 90%+ coverage" → Run coverage report
- "Linting clean" → Run linter
- "Build succeeds" → Run build
- "All tests pass" → Run test suite
Execute verification for this DoD item
If the check passes: Check box [x]
Record: "✅ VERIFIED: {{verification_result}}"
If the check fails or only partially passes: Leave box unchecked [ ] or mark partial [~]
Record gap if applicable
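One possible mapping of the DoD phrases above onto shell commands, assuming a Node/TypeScript project with conventional npm scripts; the exact script names are assumptions and will differ per repository:

```python
import subprocess

# Assumed commands for a Node/TypeScript project; adjust to the repository's scripts.
DOD_CHECKS = {
    "Type check passes":        ["npx", "tsc", "--noEmit"],
    "Linting clean":            ["npm", "run", "lint"],
    "Build succeeds":           ["npm", "run", "build"],
    "All tests pass":           ["npm", "test"],
    # Coverage threshold enforcement is assumed to live in the test runner config.
    "Unit tests 90%+ coverage": ["npm", "test", "--", "--coverage"],
}

def run_dod_check(item: str) -> bool:
    command = DOD_CHECKS.get(item)
    if command is None:
        return False        # unknown DoD item: leave for manual review
    return subprocess.run(command).returncode == 0
```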
Calculate overall completion:
total_verified = ac_verified + tasks_verified + dod_verified
total_partial = ac_partial + tasks_partial + dod_partial
total_missing = ac_missing + tasks_missing + dod_missing
total_items = ac_total + tasks_total + dod_total
verified_pct = (total_verified / total_items) × 100
completion_pct = ((total_verified + total_partial) / total_items) × 100
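For a worked example of these two formulas, take hypothetical counts of 12 verified, 3 partial, and 2 missing items:

```python
total_verified, total_partial, total_missing = 12, 3, 2
total_items = total_verified + total_partial + total_missing          # 17

verified_pct   = total_verified / total_items * 100                   # ~70.6%
completion_pct = (total_verified + total_partial) / total_items * 100 # ~88.2%
```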
Write detailed report to: {sprint_artifacts}/revalidation-{{story_key}}-{{timestamp}}.md
Include: verification results, gaps list, evidence for each item, recommendations
If running in verify-only mode (--fill-gaps not set): Exit workflow
If --fill-gaps mode but gaps_list is empty: Exit workflow
HALT after exiting in either case
If --fill-gaps mode and gaps were found: Continue to Step 8 (Gap Filling)
For each gap in gaps_list:
Ask the user (unless auto-fill is active): Fill this gap?
**Item:** {{item_description}}
**Type:** {{item_type}} ({{section}})
**Missing:** {{what_is_missing}}
[Y] Yes - Implement this item
[A] Auto-fill - Implement this and all remaining gaps without asking
[S] Skip - Leave this gap unfilled
[H] Halt - Stop gap filling
Your choice:
If [A] Auto-fill: Set require_confirmation = false (implement this and all remaining gaps without asking)
If [S] Skip: Continue to the next gap
If [H] Halt: Exit the gap filling loop and jump to Step 9 (Summary)
If [Y] Yes (or auto-fill is active): Load story context (Technical Requirements, Architecture Compliance, Dev Notes)
Implement missing item following story specifications
Write tests if required
Run tests to verify implementation
Verify linting/type checking passes
Check box [x] for this item in story file
Update File List with new/modified files
Add to Dev Agent Record: "Gap filled: {{item_description}}"
Stage files for this gap
Commit: "fix({{story_key}}): fill gap - {{item_description}}"
If implementation or verification fails: Leave box unchecked
Record failure in gaps_list
Add to failed_gaps
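A control-flow sketch of the gap-filling loop above, covering the [Y]/[A]/[S]/[H] choices, the require_confirmation flag, and the failure branch; `ask_user` and `implement_gap` are placeholders for the agent's actual actions, not real functions:

```python
def ask_user(gap) -> str:
    # Placeholder for the interactive [Y]/[A]/[S]/[H] prompt described above.
    return input(f"Fill this gap? {gap} [Y/A/S/H]: ").strip().upper() or "S"

def implement_gap(gap) -> None:
    # Placeholder for: implement, write tests, run tests, lint, update story, commit.
    raise NotImplementedError(gap)

def fill_gaps(gaps_list):
    require_confirmation = True
    filled_gaps, failed_gaps = [], []
    for gap in gaps_list:
        choice = ask_user(gap) if require_confirmation else "Y"
        if choice == "H":
            break                             # stop gap filling, go to Step 9 (Summary)
        if choice == "S":
            continue                          # leave this gap unfilled
        if choice == "A":
            require_confirmation = False      # auto-fill this and all remaining gaps
        try:
            implement_gap(gap)
            filled_gaps.append(gap)
        except Exception as error:
            failed_gaps.append((gap, error))  # box stays unchecked, failure recorded
    return filled_gaps, failed_gaps
```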
After all gaps processed:
For each filled gap:
Re-run verification for that item
Ensure still VERIFIED after all changes
Calculate final completion:
final_verified = count of [x] across all sections
final_partial = count of [~] across all sections
final_missing = count of [ ] across all sections
final_pct = (final_verified / total_items) × 100
Stage all changed files
Commit: "fix({{story_key}}): fill {{gaps_filled}} gaps from revalidation"
Load {sprint_status} file
Update the entry for {{story_key}} with current progress:
Format: {{story_key}}: {{current_status}} # Revalidated: {{final_verified}}/{{total_items}} ({{final_pct}}%) verified
Save sprint-status.yaml
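A small helper sketch of the entry-line format from the step above, using a hypothetical story key and status (the status vocabulary itself is not defined in this workflow):

```python
def format_sprint_entry(story_key, current_status, final_verified, total_items):
    final_pct = round(final_verified / total_items * 100)
    return (f"{story_key}: {current_status} "
            f"# Revalidated: {final_verified}/{total_items} ({final_pct}%) verified")

print(format_sprint_entry("2-7-image-file-handling", "in-progress", 15, 17))
# -> 2-7-image-file-handling: in-progress # Revalidated: 15/17 (88%) verified
```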
Add to Dev Agent Record in story file:
## Revalidation Record ({{timestamp}})
**Revalidation Mode:** {{#if fill_gaps}}Verify & Fill{{else}}Verify Only{{/if}}
**Results:**
- Verified: {{final_verified}}/{{total_items}} ({{final_pct}}%)
- Gaps Found: {{total_missing}}
- Gaps Filled: {{gaps_filled}}
**Evidence:**
{{#each verification_evidence}}
- {{item}}: {{evidence}}
{{/each}}
{{#if gaps_filled > 0}}
**Gaps Filled:**
{{#each filled_gaps}}
- {{item}}: {{what_was_implemented}}
{{/each}}
{{/if}}
{{#if failed_gaps.length > 0}}
**Failed to Fill:**
{{#each failed_gaps}}
- {{item}}: {{error}}
{{/each}}
{{/if}}
Save story file