This skill should be used when the user asks to "generate documentation", "document this project", "create docs", "write documentation", "update documentation", "document all APIs", "generate onboarding docs", "create developer docs", or needs comprehensive codebase documentation. Orchestrates parallel AI agents to analyze code and produce documentation files.
Add this skill:

```shell
npx mdskills install bgauryy/octocode-documentation-writer
```

A well-orchestrated 6-phase documentation pipeline with parallel agent execution and a research-driven approach.
---
name: octocode-documentation-writer
description: This skill should be used when the user asks to "generate documentation", "document this project", "create docs", "write documentation", "update documentation", "document all APIs", "generate onboarding docs", "create developer docs", or needs comprehensive codebase documentation. Orchestrates parallel AI agents to analyze code and produce documentation files.
---

# Repository Documentation Generator

**Production-ready 6-phase pipeline with intelligent orchestration, research-first validation, and conflict-free file ownership.**

<what>
This command orchestrates specialized AI agents in 6 phases to analyze your code repository and generate comprehensive documentation:
</what>

<steps>
  <phase_1>
    **Discovery+Analysis** (Phase 1)
    Agent: Opus
    Parallel: 4 parallel agents
    What: Analyze language, architecture, flows, and APIs
    Input: Repository path
    Output: `analysis.json`
  </phase_1>

  <phase_2>
    **Engineer Questions** (Phase 2)
    Agent: Opus
    What: Generates comprehensive questions based on the analysis
    Input: `analysis.json`
    Output: `questions.json`
  </phase_2>

  <phase_3>
    **Research Agent** (Phase 3) 🔍
    Agent: Sonnet
    Parallel: Dynamic (based on question volume)
    What: Deep-dive code forensics to ANSWER the questions with evidence
    Input: `questions.json`
    Output: `research.json`
  </phase_3>

  <phase_4>
    **Orchestrator** (Phase 4)
    Agent: Opus
    What: Groups questions by file target and assigns exclusive file ownership to writers
    Input: `questions.json` + `research.json`
    Output: `work-assignments.json` (file-based assignments for parallel writers)
  </phase_4>

  <phase_5>
    **Documentation Writers** (Phase 5)
    Agent: Sonnet
    Parallel: 1-8 parallel agents (dynamic based on workload)
    What: Synthesize research and write comprehensive documentation with exclusive file ownership
    Input: `analysis.json` + `questions.json` + `research.json` + `work-assignments.json`
    Output: `documentation/*.md` (16 core docs, 5 required + supplementary files)
  </phase_5>

  <phase_6>
    **QA Validator** (Phase 6)
    Agent: Sonnet
    What: Validates documentation quality using LSP-powered verification
    Input: `documentation/*.md` + `analysis.json` + `questions.json`
    Output: `qa-results.json` + `QA-SUMMARY.md`
  </phase_6>
</steps>

<subagents>
Spawn explore subagents (opus/sonnet/haiku) to explore code with MCP tools (localSearchCode, lspGotoDefinition, lspCallHierarchy, lspFindReferences)
</subagents>

**Documentation Flow:** analysis.json → questions.json → **research.json** → work-assignments.json → documentation (conflict-free!)

---

## ⚠️ CRITICAL: Parallel Agent Execution

<parallel_execution_critical importance="maximum">

**STOP. READ THIS TWICE.**

### 1. THE RULE
**You MUST spawn parallel agents in a SINGLE message with multiple Task tool calls.**

### 2. FORBIDDEN BEHAVIOR
**FORBIDDEN:** Calling `Task` sequentially (one per response).
**REASON:** Sequential calls defeat parallelism and slow down execution by 4x-8x.

### 3. REQUIRED CONFIRMATION
Before launching any parallel phase (1, 3, 5), you **MUST** verify:
- [ ] All Task calls are prepared for a SINGLE response
- [ ] No dependencies exist between these parallel agents
- [ ] Each agent has exclusive scope (no file conflicts)

<correct_pattern title="✅ CORRECT: Single response launches all agents concurrently">
```
// In ONE assistant message, include ALL Task tool invocations:
Task(description="Discovery 1A-language", subagent_type="general-purpose", prompt="...", model="opus")
Task(description="Discovery 1B-components", subagent_type="general-purpose", prompt="...", model="opus")
Task(description="Discovery 1C-dependencies", subagent_type="general-purpose", prompt="...", model="opus")
Task(description="Discovery 1D-flows", subagent_type="general-purpose", prompt="...", model="opus")
// → All 4 execute SIMULTANEOUSLY
```
</correct_pattern>

<wrong_pattern title="❌ WRONG: Sequential calls lose parallelism">
```
// DON'T DO THIS - Each waits for previous to complete
Message 1: Task(description="Discovery 1A") → wait for result
Message 2: Task(description="Discovery 1B") → wait for result
Message 3: Task(description="Discovery 1C") → wait for result
Message 4: Task(description="Discovery 1D") → wait for result
// ❌ 4x slower!
// No parallelism achieved
```
</wrong_pattern>

</parallel_execution_critical>

---

## Execution Flow Diagram

```mermaid
flowchart TB
    Start([/octocode-documentation-writer PATH]) --> Validate[Pre-Flight Validation]
    Validate --> Init[Initialize Workspace]

    Init --> P1[Phase 1: Discovery+Analysis]

    subgraph P1_Parallel["🔄 RUN IN PARALLEL (4 agents)"]
        P1A[Agent 1A:<br/>Language & Manifests]
        P1B[Agent 1B:<br/>Components]
        P1C[Agent 1C:<br/>Dependencies]
        P1D[Agent 1D:<br/>Flows & APIs]
    end

    P1 --> P1_Parallel
    P1_Parallel --> P1Agg[Aggregation:<br/>Merge into analysis.json]
    P1Agg --> P1Done[✅ analysis.json created]

    P1Done -->|Reads analysis.json| P2[Phase 2: Engineer Questions<br/>Single Agent - Opus]
    P2 --> P2Done[✅ questions.json created]

    P2Done -->|Reads questions.json| P3[Phase 3: Research 🔍<br/>Parallel Agents - Sonnet]

    subgraph P3_Parallel["🔄 RUN IN PARALLEL"]
        P3A[Researcher 1]
        P3B[Researcher 2]
        P3C[Researcher 3]
    end

    P3 --> P3_Parallel
    P3_Parallel --> P3Agg[Aggregation:<br/>Merge into research.json]
    P3Agg --> P3Done[✅ research.json created<br/>Evidence-backed answers]

    P3Done -->|Reads questions + research| P4[Phase 4: Orchestrator<br/>Single Agent - Opus]
    P4 --> P4Group[Group questions<br/>by file target]
    P4 --> P4Assign[Assign file ownership<br/>to writers]
    P4Assign --> P4Done[✅ work-assignments.json]

    P4Done --> P5[Phase 5: Documentation Writers]
    P5 --> P5Input[📥 Input:<br/>work-assignments.json<br/>+ research.json]
    P5Input --> P5Dist[Each writer gets<br/>exclusive file ownership]

    subgraph P5_Parallel["🔄 RUN IN PARALLEL (1-8 agents)"]
        P5W1[Writer 1]
        P5W2[Writer 2]
        P5W3[Writer 3]
        P5W4[Writer 4]
    end

    P5Dist --> P5_Parallel
    P5_Parallel --> P5Verify[Verify Structure]
    P5Verify --> P5Done[✅ documentation/*.md created]

    P5Done --> P6[Phase 6: QA Validator<br/>Single Agent - Sonnet]
    P6 --> P6Done[✅ qa-results.json +<br/>QA-SUMMARY.md]

    P6Done --> Complete([✅ Documentation Complete])

    style P1_Parallel fill:#e1f5ff
    style P3_Parallel fill:#e1f5ff
    style P5_Parallel fill:#ffe1f5
    style P4 fill:#fff3cd
    style Complete fill:#28a745,color:#fff
```

### Parallel Execution Rules

<execution_rules>
  <phase name="1-discovery" type="parallel" critical="true" spawn="single_message">
    <gate>
      **STOP.** Verify parallel spawn requirements.
      **REQUIRED:** Spawn 4 agents in ONE message.
      **FORBIDDEN:** Sequential Task calls.
    </gate>
    <agent_count>4</agent_count>
    <description>Discovery and Analysis</description>
    <spawn_instruction>⚠️ Launch ALL 4 Task calls in ONE response</spawn_instruction>
    <rules>
      <rule>All 4 agents start simultaneously via single-message spawn</rule>
      <rule>Wait for ALL 4 to complete before aggregation</rule>
      <rule>Must aggregate 4 partial JSONs into analysis.json</rule>
    </rules>
  </phase>

  <phase name="2-questions" type="single" critical="true" spawn="sequential">
    <agent_count>1</agent_count>
    <description>Engineer Questions Generation</description>
    <spawn_instruction>Single agent, wait for completion</spawn_instruction>
  </phase>

  <phase name="3-research" type="parallel" critical="true" spawn="single_message">
    <gate>
      **STOP.** Verify parallel spawn requirements.
      **REQUIRED:** Spawn N researchers in ONE message.
      **FORBIDDEN:** Sequential Task calls.
    </gate>
    <agent_count_logic>
      <case condition="questions < 10">1 agent</case>
      <case condition="questions >= 10">Ceil(questions / 15)</case>
    </agent_count_logic>
    <description>Evidence Gathering</description>
    <spawn_instruction>⚠️ Launch ALL researcher Task calls in ONE response</spawn_instruction>
    <rules>
      <rule>Split questions into batches BEFORE spawning</rule>
      <rule>All researchers start simultaneously</rule>
      <rule>Aggregate findings into research.json</rule>
    </rules>
  </phase>

  <phase name="4-orchestrator" type="single" critical="true" spawn="sequential">
    <agent_count>1</agent_count>
    <description>Orchestration and Assignment</description>
    <spawn_instruction>Single agent, wait for completion</spawn_instruction>
    <rules>
      <rule>Assign EXCLUSIVE file ownership to writers</rule>
      <rule>Distribute research findings to relevant writers</rule>
    </rules>
  </phase>

  <phase name="5-writers" type="dynamic_parallel" critical="false" spawn="single_message">
    <gate>
      **STOP.** Verify parallel spawn requirements.
      **REQUIRED:** Spawn all writers in ONE message.
      **FORBIDDEN:** Sequential Task calls.
    </gate>
    <agent_count_logic>
      <case condition="questions < 20">1 agent</case>
      <case condition="questions 20-99">2-4 agents</case>
      <case condition="questions >= 100">4-8 agents</case>
    </agent_count_logic>
    <spawn_instruction>⚠️ Launch ALL writer Task calls in ONE response</spawn_instruction>
    <rules>
      <rule>Each writer owns EXCLUSIVE files - no conflicts possible</rule>
      <rule>All writers start simultaneously via single-message spawn</rule>
      <rule>Use provided research.json as primary source</rule>
    </rules>
  </phase>

  <phase name="6-qa" type="single" critical="false" spawn="sequential">
    <agent_count>1</agent_count>
    <description>Quality Validation</description>
    <spawn_instruction>Single agent, wait for completion</spawn_instruction>
  </phase>
</execution_rules>

## Pre-Flight Checks

<pre_flight_gate>
**HALT. Complete these requirements before proceeding:**

### Required Checks
1. **Verify Path Existence**
   - **IF** `repository_path` missing → **THEN** ERROR & EXIT
2. **Verify Directory Status**
   - **IF** not a directory → **THEN** ERROR & EXIT
3. **Source Code Check**
   - **IF** < 3 source files → **THEN** WARN & Ask User (Exit if no)
4. **Build Directory Check**
   - **IF** contains `node_modules` or `dist` → **THEN** ERROR & EXIT
5. **Size Estimation**
   - **IF** > 200k LOC → **THEN** WARN & Ask User (Exit if no)

**FORBIDDEN until gate passes:**
- Any agent spawning
- Workspace initialization
</pre_flight_gate>

<instruction>
Before starting, validate the repository path and check for edge cases.

1. **Verify Path Existence**
   - Ensure `repository_path` exists.
   - If not, raise an ERROR: "Repository path does not exist: " + path and EXIT.

2. **Verify Directory Status**
   - Confirm `repository_path` is a directory.
   - If not, raise an ERROR: "Path is not a directory: " + path and EXIT.

3. **Source Code Check**
   - Count files ending in `.ts`, `.js`, `.py`, `.go`, or `.rs`.
   - Exclude directories: `node_modules`, `.git`, `dist`, `build`.
   - If fewer than 3 source files are found:
     - WARN: "Very few source files detected ({count}). This may not be a code repository."
     - Ask user: "Continue anyway? [y/N]"
     - If not confirmed, EXIT.

4. **Build Directory Check**
   - Ensure the path does not contain `node_modules`, `dist`, or `build`.
   - If it does, raise an ERROR: "Repository path appears to be a build directory. Please specify the project root." and EXIT.

5. **Size Estimation**
   - Estimate the repository size.
   - If larger than 200,000 LOC:
     - WARN: "Large repository detected (~{size} LOC)."
     - Ask user: "Continue anyway? [y/N]"
     - If not confirmed, EXIT.
</instruction>
## Initialize Workspace

<init_gate>
**STOP. Verify state before initialization.**

### Required Actions
1. **Define Directories** (`CONTEXT_DIR`, `DOC_DIR`)
2. **Handle Existing State**
   - **IF** `state.json` exists → **THEN** Prompt User to Resume
   - **IF** User says NO → **THEN** Reset state
3. **Create Directories**
4. **Initialize New State** (if not resuming)

**FORBIDDEN:**
- Starting Phase 1 before state is initialized.
</init_gate>

<instruction>
### Workspace Initialization
Before starting the pipeline, set up the working environment and handle any existing state.

1. **Define Directories**
   - Context Directory (`CONTEXT_DIR`): `${REPOSITORY_PATH}/.context`
   - Documentation Directory (`DOC_DIR`): `${REPOSITORY_PATH}/documentation`

2. **Handle Existing State**
   - Check if `${CONTEXT_DIR}/state.json` exists.
   - If it exists and the phase is NOT "complete" or "failed":
     - **Prompt User**: "Found existing documentation generation in progress (phase: [PHASE]). Resume from last checkpoint? [Y/n]"
     - **If User Confirms (Yes)**:
       - Set `RESUME_MODE = true`
       - Set `START_PHASE` from the saved state.
     - **If User Declines (No)**:
       - **WARN**: "Restarting from beginning. Previous progress will be overwritten."
       - Set `RESUME_MODE = false`
       - Set `START_PHASE = "initialized"`
   - If `state.json` does not exist or the previous run finished/failed, start fresh (`RESUME_MODE = false`).

3. **Create Directories**
   - Ensure `CONTEXT_DIR` exists (create if missing).
   - Ensure `DOC_DIR` exists (create if missing).

4. **Initialize New State** (If NOT Resuming)
   - Create a new `state.json` using the schema defined in `schemas/state-schema.json`.
</instruction>

## Progress Tracker

Display real-time progress:

```
📊 Documentation Generation Progress v3.1
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Repository: {REPOSITORY_PATH}
Mode: {RESUME_MODE ? "Resume" : "New"}

{if RESUME_MODE}
Resuming from: {START_PHASE}
{end}

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
```

## Agent Pipeline Execution

### Phase 1: Discovery+Analysis Agent

<phase_1_gate>
**GATE: START Phase 1**
**REQUIRED:** Spawn 4 agents in **ONE** message.
**FORBIDDEN:** Sequential calls.
</phase_1_gate>

**Agent Spec**: `references/agent-discovery-analysis.md`
**Task Config**: `schemas/discovery-tasks.json`

| Property | Value |
|----------|-------|
| Parallel Agents | 4 (1a-language, 1b-components, 1c-dependencies, 1d-flows-apis) |
| Critical | Yes |
| Output | `.context/analysis.json` |

> See `references/agent-discovery-analysis.md` → **Orchestrator Execution Logic** section for full implementation.

### Phase 2: Engineer Questions Agent

**Agent Spec**: `references/agent-engineer-questions.md`

| Property | Value |
|----------|-------|
| Agent Type | Single (Opus) |
| Critical | Yes |
| Input | `.context/analysis.json` |
| Output | `.context/questions.json` |

> See `references/agent-engineer-questions.md` → **Orchestrator Execution Logic** section for full implementation.

### Phase 3: Research Agent 🔍

<phase_3_gate>
**GATE: START Phase 3**
**REQUIRED:** Spawn N agents in **ONE** message.
**FORBIDDEN:** Sequential calls.
</phase_3_gate>

**Agent Spec**: `references/agent-researcher.md`

| Property | Value |
|----------|-------|
| Agent Type | Parallel (Sonnet) |
| Critical | Yes |
| Input | `.context/questions.json` |
| Output | `.context/research.json` |

> See `references/agent-researcher.md` → **Orchestrator Execution Logic** section for full implementation.
### Phase 4: Orchestrator Agent

**Agent Spec**: `references/agent-orchestrator.md`

| Property | Value |
|----------|-------|
| Agent Type | Single (Opus) |
| Critical | Yes |
| Input | `.context/analysis.json`, `.context/questions.json`, `.context/research.json` |
| Output | `.context/work-assignments.json` |

> See `references/agent-orchestrator.md` → **Orchestrator Execution Logic** section for full implementation.

### Phase 5: Documentation Writers

<phase_5_gate>
**GATE: START Phase 5**
**REQUIRED:** Spawn all writers in **ONE** message.
**FORBIDDEN:** Sequential calls.
</phase_5_gate>

**Agent Spec**: `references/agent-documentation-writer.md`

| Property | Value |
|----------|-------|
| Agent Type | Parallel (1-8 Sonnet writers) |
| Primary Writer | Writer 1 (Critical) |
| Non-Primary | Partial failure allowed |
| Retry Logic | Up to 2 retries per failed writer |
| Input | `.context/analysis.json`, `.context/research.json`, `.context/work-assignments.json` |
| Output | `documentation/*.md` (16 core, 5 required + supplementary) |
| File Ownership | Exclusive (no conflicts) |

#### Writer Scaling Strategy

| Strategy | Agent Count | When Used |
|----------|-------------|-----------|
| `sequential` | 1 | < 20 questions |
| `parallel-core` | 2-4 | 20-99 questions |
| `parallel-all` | 4-8 | >= 100 questions |

> See `references/agent-documentation-writer.md` → **Orchestrator Execution Logic** section for full implementation.

### Phase 6: QA Validator

**Agent Spec**: `references/agent-qa-validator.md`

| Property | Value |
|----------|-------|
| Agent Type | Single (Sonnet) |
| Critical | No (failure produces warning) |
| Input | `.context/analysis.json`, `.context/questions.json`, `documentation/*.md` |
| Output | `.context/qa-results.json`, `documentation/QA-SUMMARY.md` |
| Score Range | 0-100 |
| Quality Ratings | `excellent` (≥90), `good` (≥75), `fair` (≥60), `needs-improvement` (<60) |

> See `references/agent-qa-validator.md` → **Orchestrator Execution Logic** section for full implementation.

## Completion

```javascript
update_state({
  phase: "complete",
  completed_at: new Date().toISOString(),
  current_agent: null
})

DISPLAY: "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
DISPLAY: "✅ Documentation Complete!"
DISPLAY: ""
DISPLAY: "📁 Location: {DOC_DIR}/"
DISPLAY: "📋 QA Report: {DOC_DIR}/QA-SUMMARY.md"
DISPLAY: ""

if (parsed_qa && parsed_qa.overall_score):
  DISPLAY: "Quality Score: {parsed_qa.overall_score}/100 ({parsed_qa.quality_rating})"

  if (parsed_qa.overall_score >= 90):
    DISPLAY: "Status: Excellent ✅ - Ready for release"
  else if (parsed_qa.overall_score >= 75):
    DISPLAY: "Status: Good ✅ - Minor improvements recommended"
  else if (parsed_qa.overall_score >= 60):
    DISPLAY: "Status: Fair ⚠️ - Address gaps before release"
  else:
    DISPLAY: "Status: Needs Work ⚠️ - Major improvements required"

  if (parsed_qa.gaps && parsed_qa.gaps.length > 0):
    DISPLAY: ""
    DISPLAY: "Next Steps:"
    for (i = 0; i < Math.min(3, parsed_qa.gaps.length); i++):
      gap = parsed_qa.gaps[i]
      DISPLAY: "  {i+1}. {gap.fix}"

DISPLAY: ""
DISPLAY: "📊 Documentation Coverage:"
DISPLAY: "  {parsed_questions.summary.total_questions} questions researched"
DISPLAY: "  {parsed_qa.question_coverage.answered} questions answered in docs"
DISPLAY: ""
DISPLAY: "View documentation: {DOC_DIR}/index.md"
DISPLAY: "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"

EXIT code 0
```

## Error Recovery

If any agent fails critically:

```javascript
function handle_critical_failure(phase, error):
  DISPLAY: "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
  DISPLAY: "❌ Documentation Generation Failed"
  DISPLAY: ""
  DISPLAY: "Phase: {phase}"
  DISPLAY: "Error: {error.message}"
  DISPLAY: ""

  if (error.recoverable):
    DISPLAY: "This error is recoverable. Run /octocode-documentation-writer again to resume."
    DISPLAY: "State saved in: {CONTEXT_DIR}/state.json"
  else:
    DISPLAY: "This error is not recoverable. Please check the error and try again."
    DISPLAY: "You may need to fix the issue before retrying."

  DISPLAY: ""
  DISPLAY: "Logs: {CONTEXT_DIR}/state.json"
  DISPLAY: "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"

  EXIT code 1
```

## Helper Functions

> **IMPORTANT: State Synchronization**
> Only the main orchestrator process should update `state.json`. Individual parallel agents
> (Discovery 1A-1D, Researchers, Writers) must NOT directly modify `state.json` to avoid
> race conditions. Parallel agents should only write to their designated partial result files
> in `partials/<phase>/<task_id>.json`. The orchestrator aggregates these results and updates
> `state.json` after all parallel agents complete.

```javascript
// NOTE: This function should ONLY be called by the main orchestrator process,
// never by parallel sub-agents. Parallel agents use save_partial_result() instead.
function update_state(updates):
  current_state = Read(CONTEXT_DIR + "/state.json")
  parsed = JSON.parse(current_state)

  for key, value in updates:
    parsed[key] = value

  Write(CONTEXT_DIR + "/state.json", JSON.stringify(parsed, null, 2))

function estimate_repo_size(path):
  // Quick estimate: count source files
  files = count_files(path, ["*.ts", "*.js", "*.py", "*.go", "*.rs", "*.java"], excludeDir=["node_modules", ".git", "dist", "build"])
  // Assume ~200 LOC per file average
  return files * 200

function count_files(path, patterns, excludeDir):
  // Use localFindFiles MCP tool (mcp__octocode__localFindFiles)
  // Return count of matching files
```
## Retry & Data Preservation Logic

**CRITICAL**: Never lose partial work. All agents support retry with state preservation.

```javascript
const RETRY_CONFIG = {
  discovery_analysis: { max_attempts: 3, backoff_ms: 2000 },
  engineer_questions: { max_attempts: 3, backoff_ms: 2000 },
  research: { max_attempts: 3, backoff_ms: 3000 },
  orchestrator: { max_attempts: 3, backoff_ms: 2000 },
  documentation: { max_attempts: 3, backoff_ms: 5000 }, // per writer
  qa: { max_attempts: 2, backoff_ms: 1000 }
}

// === RETRY WRAPPER FOR ALL AGENTS ===
function retry_agent(phase_name, agent_fn, options = {}):
  config = RETRY_CONFIG[phase_name]
  state = get_retry_state(phase_name)

  while (state.attempts < config.max_attempts):
    state.attempts++
    update_retry_state(phase_name, state)

    DISPLAY: `⏳ ${phase_name} attempt ${state.attempts}/${config.max_attempts}`

    try:
      result = agent_fn(options)

      // Success - clear retry state
      clear_retry_state(phase_name)
      return { success: true, result }

    catch (error):
      state.last_error = error.message
      update_retry_state(phase_name, state)

      DISPLAY: `⚠️ ${phase_name} failed: ${error.message}`

      if (state.attempts < config.max_attempts):
        DISPLAY: `   Retrying in ${config.backoff_ms}ms...`
        sleep(config.backoff_ms * state.attempts) // Linear backoff: delay grows with attempt count
      else:
        DISPLAY: `❌ ${phase_name} exhausted all ${config.max_attempts} attempts`
        return { success: false, error, attempts: state.attempts }

  return { success: false, error: state.last_error, attempts: state.attempts }

// === PARALLEL AGENT RETRY (for Discovery, Research, Writers) ===
function retry_parallel_agents(phase_name, agent_tasks, options = {}):
  config = RETRY_CONFIG[phase_name]
  results = {}
  failed_tasks = []

  // First attempt - run all in parallel
  parallel_results = Task_Parallel(agent_tasks)

  for (task_id, result) in parallel_results:
    if (result.success):
      results[task_id] = result
      save_partial_result(phase_name, task_id, result)
    else:
      failed_tasks.push({ id: task_id, task: agent_tasks[task_id], attempts: 1 })

  // Retry failed tasks individually
  for failed in failed_tasks:
    while (failed.attempts < config.max_attempts):
      failed.attempts++
      DISPLAY: `⏳ Retrying ${phase_name}/${failed.id} (attempt ${failed.attempts}/${config.max_attempts})`

      try:
        result = Task(failed.task)
        if (result.success):
          results[failed.id] = result
          save_partial_result(phase_name, failed.id, result)
          break
      catch (error):
        DISPLAY: `⚠️ ${phase_name}/${failed.id} failed: ${error.message}`
        if (failed.attempts < config.max_attempts):
          sleep(config.backoff_ms * failed.attempts)

    if (failed.attempts >= config.max_attempts && !results[failed.id]):
      DISPLAY: `❌ ${phase_name}/${failed.id} failed after ${config.max_attempts} attempts`
      // Load any partial result saved during attempts
      results[failed.id] = load_partial_result(phase_name, failed.id) || { success: false, partial: true }

  return results

// === PARTIAL RESULT PRESERVATION ===
// Uses atomic writes to prevent corruption from concurrent access
function save_partial_result(phase_name, task_id, result):
  partial_dir = CONTEXT_DIR + "/partials/" + phase_name
  mkdir_p(partial_dir)

  target_path = partial_dir + "/" + task_id + ".json"
  temp_path = partial_dir + "/" + task_id + ".json.tmp." + random_uuid()

  // Atomic write: write to temp file, then rename (rename is atomic on POSIX)
  Write(temp_path, JSON.stringify(result))
  rename(temp_path, target_path) // Atomic operation

function load_partial_result(phase_name, task_id):
  path = CONTEXT_DIR + "/partials/" + phase_name + "/" + task_id + ".json"
  if (exists(path)):
    return JSON.parse(Read(path))
  return null

function load_all_partial_results(phase_name):
  partial_dir = CONTEXT_DIR + "/partials/" + phase_name
  if (!exists(partial_dir)):
    return {}
  files = list_files(partial_dir, "*.json")
  results = {}
  for file in files:
    task_id = file.replace(".json", "")
    results[task_id] = JSON.parse(Read(partial_dir + "/" + file))
  return results

// === RETRY STATE MANAGEMENT ===
function get_retry_state(phase_name):
  state = Read(CONTEXT_DIR + "/state.json")
  parsed = JSON.parse(state)
  return parsed.retry_state?.[phase_name] || { attempts: 0 }

function update_retry_state(phase_name, retry_state):
  current = JSON.parse(Read(CONTEXT_DIR + "/state.json"))
  update_state({
    retry_state: {
      ...current.retry_state,
      [phase_name]: retry_state
    }
  })

function clear_retry_state(phase_name):
  state = JSON.parse(Read(CONTEXT_DIR + "/state.json"))
  if (state.retry_state):
    delete state.retry_state[phase_name]
  Write(CONTEXT_DIR + "/state.json", JSON.stringify(state, null, 2))
```

### Phase-Specific Retry Behavior

| Phase | Retry Strategy | Partial Data Preserved |
|-------|----------------|------------------------|
| **Discovery** | Retry failed sub-agents (1A-1D) individually | `partials/discovery/*.json` |
| **Questions** | Retry entire phase | Previous `questions.json` kept until success |
| **Research** | Retry failed batches only | `partials/research/batch-*.json` |
| **Orchestrator** | Retry entire phase | Previous `work-assignments.json` kept |
| **Writers** | Retry failed writers only | `partials/writers/writer-*.json` + completed files |
| **QA** | Retry once, then warn | `partials/qa/partial-results.json` |

### Critical Data Protection Rules

```javascript
// RULE 1: Never overwrite successful output until new output is validated
function safe_write_output(path, content):
  backup_path = path + ".backup"
  if (exists(path)):
    copy(path, backup_path)

  try:
    Write(path, content)
    validate_json(path) // Ensure valid JSON
    delete(backup_path) // Only delete backup after validation
  catch (error):
    // Restore from backup
    if (exists(backup_path)):
      copy(backup_path, path)
    throw error

// RULE 2: Aggregate partial results even on failure
// Uses file locking to prevent race conditions during aggregation
function aggregate_with_partials(phase_name, new_results):
  lock_file = CONTEXT_DIR + "/partials/" + phase_name + "/.aggregate.lock"

  // Acquire exclusive lock before aggregation
  lock_fd = acquire_file_lock(lock_file, timeout_ms=5000)
  if (!lock_fd):
    throw new Error("Failed to acquire lock for aggregation: " + phase_name)

  try:
    existing = load_all_partial_results(phase_name)
    merged = { ...existing, ...new_results }
    return merged
  finally:
    release_file_lock(lock_fd)
    delete(lock_file)

// RULE 3: Resume-aware execution
function should_skip_task(phase_name, task_id):
  partial = load_partial_result(phase_name, task_id)
  return partial?.success === true
```

---
## Key Features

<key_features>

| # | Feature | Description |
|---|---------|-------------|
| 1 | **True Parallel Execution** | Phases 1, 3, 5 spawn ALL agents in ONE message for concurrent execution |
| 2 | **Single-Message Spawn** | ⚠️ Critical: Multiple Task calls in one response = true parallelism |
| 3 | **Evidence-Based** | Research agent proves answers with code traces before writing |
| 4 | **Engineer-Driven Questions** | Phase 2 generates comprehensive questions |
| 5 | **Conflict-Free Writing** | Orchestrator assigns exclusive file ownership per writer |
| 6 | **LSP-Powered** | Intelligent verification with semantic analysis |
| 7 | **State Recovery** | Resume from any phase if interrupted |
| 8 | **Unified Toolset** | All agents use octocode local + LSP tools |
| 9 | **Dynamic Scaling** | Agent count scales based on question volume |

</key_features>

<efficiency_summary>
### Efficiency Maximization

```
Phase 1: 4 agents × parallel = ~4x faster than sequential
Phase 3: N agents × parallel = ~Nx faster than sequential
Phase 5: M agents × parallel = ~Mx faster than sequential

Total speedup: Significant when spawn="single_message" is followed
```

**Remember**: `spawn="single_message"` phases MUST have all Task calls in ONE response.
</efficiency_summary>

---
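Several numeric rules in this pipeline reduce to small pure functions: the Phase 3 research fan-out (1 agent under 10 questions, else Ceil(questions / 15)), the Phase 5 writer scaling table, and the Phase 6 quality-rating thresholds. A sketch — the function names are illustrative, not part of the skill:

```javascript
// Illustrative pure-function sketches of the pipeline's scaling rules.

// Phase 3: fewer than 10 questions → 1 researcher, else Ceil(questions / 15)
function researcherCount(totalQuestions) {
  return totalQuestions < 10 ? 1 : Math.ceil(totalQuestions / 15);
}

// Phase 5: writer scaling strategy table
function writerStrategy(totalQuestions) {
  if (totalQuestions < 20) return "sequential";     // 1 writer
  if (totalQuestions < 100) return "parallel-core"; // 2-4 writers
  return "parallel-all";                            // 4-8 writers
}

// Phase 6: quality rating thresholds
function qualityRating(score) {
  if (score >= 90) return "excellent";
  if (score >= 75) return "good";
  if (score >= 60) return "fair";
  return "needs-improvement";
}
```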