Compare commits

6 Commits: 5294285ded ... 7a7618db7d

| Author | SHA1 | Date |
|---|---|---|
| | 7a7618db7d | |
| | 6ff74ba662 | |
| | 1ad1f91e38 | |
| | 350688df67 | |
| | 9924dc6344 | |
| | db7b497eeb | |
@@ -55,7 +55,8 @@ Load {planning_artifacts}/epics.md and review:
 2. **Requirements Grouping**: Group related FRs that deliver cohesive user outcomes
 3. **Incremental Delivery**: Each epic should deliver value independently
 4. **Logical Flow**: Natural progression from user's perspective
-5. **🔗 Dependency-Free Within Epic**: Stories within an epic must NOT depend on future stories
+5. **Dependency-Free Within Epic**: Stories within an epic must NOT depend on future stories
+6. **Implementation Efficiency**: Consider consolidating epics that all modify the same core files into fewer epics

 **⚠️ CRITICAL PRINCIPLE:**

 Organize by USER VALUE, not technical layers:
@@ -74,6 +75,18 @@ Organize by USER VALUE, not technical layers:
 - Epic 3: Frontend Components (creates reusable components) - **No user value**
 - Epic 4: Deployment Pipeline (CI/CD setup) - **No user value**

+**❌ WRONG Epic Examples (File Churn on Same Component):**
+
+- Epic 1: File Upload (modifies model, controller, web form, web API)
+- Epic 2: File Status (modifies model, controller, web form, web API)
+- Epic 3: File Access permissions (modifies model, controller, web form, web API)
+- All three epics touch the same files — consolidate into one epic with ordered stories
+
+**✅ CORRECT Alternative:**
+
+- Epic 1: File Management Enhancement (upload, status, permissions as stories within one epic)
+- Rationale: Single component, fully pre-designed, no feedback loop between epics
+
 **🔗 DEPENDENCY RULES:**

 - Each epic must deliver COMPLETE functionality for its domain
@@ -82,21 +95,38 @@ Organize by USER VALUE, not technical layers:

 ### 3. Design Epic Structure Collaboratively

-**Step A: Identify User Value Themes**
+**Step A: Assess Context and Identify Themes**

+First, assess how much of the solution design is already validated (Architecture, UX, Test Design).
+When the outcome is certain and direction changes between epics are unlikely, prefer fewer but larger epics.
+Split into multiple epics when there is a genuine risk boundary or when early feedback could change direction
+of following epics.
+
+Then, identify user value themes:
+
 - Look for natural groupings in the FRs
 - Identify user journeys or workflows
 - Consider user types and their goals

 **Step B: Propose Epic Structure**
-For each proposed epic:
+
+For each proposed epic (considering whether epics share the same core files):

 1. **Epic Title**: User-centric, value-focused
 2. **User Outcome**: What users can accomplish after this epic
 3. **FR Coverage**: Which FR numbers this epic addresses
 4. **Implementation Notes**: Any technical or UX considerations

-**Step C: Create the epics_list**
+**Step C: Review for File Overlap**

+Assess whether multiple proposed epics repeatedly target the same core files. If overlap is significant:
+
+- Distinguish meaningful overlap (same component end-to-end) from incidental sharing
+- Ask whether to consolidate into one epic with ordered stories
+- If confirmed, merge the epic FRs into a single epic, preserving dependency flow: each story must still fit within
+  a single dev agent's context
+
+**Step D: Create the epics_list**

 Format the epics_list as:
@@ -90,6 +90,12 @@ Review the complete epic and story breakdown to ensure EVERY FR is covered:
 - Dependencies flow naturally
 - Foundation stories only setup what's needed
 - No big upfront technical work
+- **File Churn Check:** Do multiple epics repeatedly modify the same core files?
+  - Assess whether the overlap pattern suggests unnecessary churn or is incidental
+  - If overlap is significant: Validate that splitting provides genuine value (risk mitigation, feedback loops, context size limits)
+  - If no justification for the split: Recommend consolidation into fewer epics
+  - ❌ WRONG: Multiple epics each modify the same core files with no feedback loop between them
+  - ✅ RIGHT: Epics target distinct files/components, OR consolidation was explicitly considered and rejected with rationale

 ### 5. Dependency Validation (CRITICAL)
@@ -29,6 +29,11 @@ class CommunityModuleManager {
   // Shared across all instances; the manifest writer often uses a fresh instance.
   static _resolutions = new Map();
+
+  // moduleCode → ResolvedModule (from PluginResolver) when the cloned repo ships
+  // a `.claude-plugin/marketplace.json`. Lets community installs reuse the same
+  // skill-level install pipeline as custom-source installs (installFromResolution).
+  static _pluginResolutions = new Map();

   constructor() {
     this._client = new RegistryClient();
     this._cachedIndex = null;
@@ -40,6 +45,11 @@ class CommunityModuleManager {
     return CommunityModuleManager._resolutions.get(moduleCode) || null;
   }

+  /** Get the marketplace.json-derived plugin resolution for a community module, if any. */
+  getPluginResolution(moduleCode) {
+    return CommunityModuleManager._pluginResolutions.get(moduleCode) || null;
+  }
+
   // ─── Data Loading ──────────────────────────────────────────────────────────

   /**
@@ -371,6 +381,18 @@ class CommunityModuleManager {
       planSource: planEntry.source,
     });

+    // If the repo ships a marketplace.json, route through PluginResolver so the
+    // skill-level install pipeline (installFromResolution) handles the copy.
+    // Repos without marketplace.json fall through to the legacy findModuleSource
+    // path unchanged.
+    await this._tryResolveMarketplacePlugin(moduleCacheDir, moduleInfo, {
+      channel: planEntry.channel,
+      version: recordedVersion,
+      sha: installedSha,
+      approvedTag,
+      approvedSha,
+    });
+
     // Install dependencies if needed
     const packageJsonPath = path.join(moduleCacheDir, 'package.json');
     if ((needsDependencyInstall || wasNewClone) && (await fs.pathExists(packageJsonPath))) {
@@ -392,6 +414,204 @@ class CommunityModuleManager {
     return moduleCacheDir;
   }

+  // ─── Marketplace.json Resolution ──────────────────────────────────────────
+
+  /**
+   * Detect `.claude-plugin/marketplace.json` in a cloned community repo and
+   * route through PluginResolver. When successful, caches the resolution so
+   * OfficialModulesManager.install() can route the copy through
+   * installFromResolution() — the same path used by custom-source installs.
+   *
+   * Silent no-op when marketplace.json is absent or the resolver returns no
+   * matches; the legacy findModuleSource path then handles the install.
+   *
+   * @param {string} repoPath - Absolute path to the cloned repo
+   * @param {Object} moduleInfo - Normalized community module info
+   * @param {Object} resolution - Resolution metadata from cloneModule
+   * @param {string} resolution.channel - Channel ('stable' | 'next' | 'pinned')
+   * @param {string} resolution.version - Recorded version string
+   * @param {string} resolution.sha - Resolved git SHA
+   * @param {string|null} resolution.approvedTag - Registry approved tag
+   * @param {string|null} resolution.approvedSha - Registry approved SHA
+   */
+  async _tryResolveMarketplacePlugin(repoPath, moduleInfo, resolution) {
+    const marketplacePath = path.join(repoPath, '.claude-plugin', 'marketplace.json');
+    if (!(await fs.pathExists(marketplacePath))) return;
+
+    let marketplaceData;
+    try {
+      marketplaceData = JSON.parse(await fs.readFile(marketplacePath, 'utf8'));
+    } catch {
+      // Malformed marketplace.json — fall through to legacy path.
+      return;
+    }
+
+    const plugins = Array.isArray(marketplaceData?.plugins) ? marketplaceData.plugins : [];
+    if (plugins.length === 0) return;
+
+    const selection = this._selectPluginForModule(plugins, moduleInfo);
+    if (!selection) {
+      await this._safeWarn(
+        `Community module '${moduleInfo.code}' ships marketplace.json but no plugin entry matches the registry code. ` +
+          `Falling back to legacy install path.`,
+      );
+      return;
+    }
+
+    if (selection.source === 'single-fallback') {
+      // Single-entry marketplace.json whose plugin name doesn't match the registry
+      // code or the module_definition hint. Most likely correct, but worth surfacing
+      // in case marketplace.json is misconfigured and we'd install the wrong plugin.
+      await this._safeWarn(
+        `Community module '${moduleInfo.code}' picked the only plugin in marketplace.json ('${selection.plugin?.name}') ` +
+          `because no name or module_definition match was found. Verify marketplace.json if the install looks wrong.`,
+      );
+    }
+
+    const { PluginResolver } = require('./plugin-resolver');
+    const resolver = new PluginResolver();
+    let resolved;
+    try {
+      resolved = await resolver.resolve(repoPath, selection.plugin);
+    } catch (error) {
+      // PluginResolver threw (malformed plugin entry, missing files, etc.).
+      // Honor the silent-fallthrough contract — warn and let the legacy
+      // findModuleSource path handle the install.
+      await this._safeWarn(
+        `PluginResolver failed for community module '${moduleInfo.code}': ${error.message}. ` + `Falling back to legacy install path.`,
+      );
+      return;
+    }
+    if (!resolved || resolved.length === 0) return;
+
+    // The registry registers a single code per module. If the resolver returns
+    // multiple modules (Strategy 4: multiple standalone skills), accept only
+    // the entry whose code matches the registry. Other entries are ignored —
+    // they belong to plugins not registered in the community catalog.
+    const matched = resolved.find((mod) => mod.code === moduleInfo.code) || (resolved.length === 1 ? resolved[0] : null);
+    if (!matched) return;
+
+    // Shallow-clone before stamping provenance — the resolver may cache or reuse
+    // its return objects, and we don't want install-specific fields leaking back.
+    const stamped = {
+      ...matched,
+      code: moduleInfo.code,
+      repoUrl: moduleInfo.url,
+      cloneRef: resolution.channel === 'pinned' ? resolution.version : resolution.approvedTag || null,
+      cloneSha: resolution.sha,
+      communitySource: true,
+      communityChannel: resolution.channel,
+      communityVersion: resolution.version,
+      registryApprovedTag: resolution.approvedTag,
+      registryApprovedSha: resolution.approvedSha,
+    };
+
+    CommunityModuleManager._pluginResolutions.set(moduleInfo.code, stamped);
+  }
+
+  /**
+   * Lazy fallback: resolve marketplace.json straight from the on-disk cache
+   * when `_pluginResolutions` is empty (e.g. callers that reach `install()`
+   * without `cloneModule` having populated the cache earlier in this process).
+   *
+   * Reuses an existing channel resolution if present; otherwise synthesizes a
+   * minimal stable-channel stub from the registry entry + the cached repo's
+   * current HEAD. Returns the cached plugin resolution if one is produced,
+   * otherwise null (caller falls back to the legacy path).
+   *
+   * @param {string} moduleCode
+   * @returns {Promise<Object|null>}
+   */
+  async resolveFromCache(moduleCode) {
+    const existing = this.getPluginResolution(moduleCode);
+    if (existing) return existing;
+
+    const cacheRepoDir = path.join(this.getCacheDir(), moduleCode);
+    const marketplacePath = path.join(cacheRepoDir, '.claude-plugin', 'marketplace.json');
+    if (!(await fs.pathExists(marketplacePath))) return null;
+
+    let moduleInfo;
+    try {
+      moduleInfo = await this.getModuleByCode(moduleCode);
+    } catch {
+      return null;
+    }
+    if (!moduleInfo) return null;
+
+    let channelResolution = this.getResolution(moduleCode);
+    if (!channelResolution) {
+      let sha = '';
+      try {
+        sha = execSync('git rev-parse HEAD', { cwd: cacheRepoDir, stdio: 'pipe' }).toString().trim();
+      } catch {
+        // Not a git repo or unreadable — give up and let the legacy path run.
+        return null;
+      }
+      channelResolution = {
+        channel: 'stable',
+        version: moduleInfo.approvedTag || sha.slice(0, 7),
+        sha,
+        registryApprovedTag: moduleInfo.approvedTag || null,
+        registryApprovedSha: moduleInfo.approvedSha || null,
+      };
+    }
+
+    await this._tryResolveMarketplacePlugin(cacheRepoDir, moduleInfo, {
+      channel: channelResolution.channel,
+      version: channelResolution.version,
+      sha: channelResolution.sha,
+      approvedTag: channelResolution.registryApprovedTag,
+      approvedSha: channelResolution.registryApprovedSha,
+    });
+
+    return this.getPluginResolution(moduleCode);
+  }
+
+  /**
+   * Best-effort warning emitter. `prompts.log.warn` may be undefined in some
+   * harnesses and may return a rejected promise — swallow both cases so a
+   * fallthrough warning can never crash the install.
+   */
+  async _safeWarn(message) {
+    try {
+      const result = prompts.log?.warn?.(message);
+      if (result && typeof result.then === 'function') await result;
+    } catch {
+      /* ignore */
+    }
+  }
+
+  /**
+   * Pick which plugin entry from marketplace.json represents this community module.
+   * Precedence:
+   *   1. Exact match on `plugin.name === moduleInfo.code`
+   *   2. Trailing directory of `module_definition` matches `plugin.name`
+   *   3. Single plugin in marketplace.json — accepted with a warning so a
+   *      mismatched-but-uniquely-named plugin doesn't install silently.
+   * Otherwise null (caller falls back to legacy path).
+   *
+   * @returns {{plugin: Object, source: 'name'|'hint'|'single-fallback'}|null}
+   */
+  _selectPluginForModule(plugins, moduleInfo) {
+    const byCode = plugins.find((p) => p && p.name === moduleInfo.code);
+    if (byCode) return { plugin: byCode, source: 'name' };
+
+    if (moduleInfo.moduleDefinition) {
+      // module_definition like "src/skills/suno-setup/assets/module.yaml" →
+      // hint segment "suno-setup". Match that against plugin names.
+      const segments = moduleInfo.moduleDefinition.split('/').filter(Boolean);
+      const setupIdx = segments.findIndex((s) => s.endsWith('-setup'));
+      if (setupIdx !== -1) {
+        const hint = segments[setupIdx];
+        const byHint = plugins.find((p) => p && p.name === hint);
+        if (byHint) return { plugin: byHint, source: 'hint' };
+      }
+    }
+
+    if (plugins.length === 1) return { plugin: plugins[0], source: 'single-fallback' };
+    return null;
+  }
+
   // ─── Source Finding ───────────────────────────────────────────────────────

   /**
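The selection precedence that `_selectPluginForModule` implements in the hunk above can be sketched as a standalone function. This is an illustrative extraction, not the class method itself; the `moduleInfo` shape (`code`, `moduleDefinition`) follows the diff.

```javascript
// Sketch of the marketplace.json plugin-selection precedence from the diff:
// 1) exact name match, 2) *-setup segment hint from module_definition,
// 3) lone entry accepted as 'single-fallback', else null.
function selectPluginForModule(plugins, moduleInfo) {
  const byCode = plugins.find((p) => p && p.name === moduleInfo.code);
  if (byCode) return { plugin: byCode, source: 'name' };

  if (moduleInfo.moduleDefinition) {
    // "src/skills/suno-setup/assets/module.yaml" → hint segment "suno-setup"
    const segments = moduleInfo.moduleDefinition.split('/').filter(Boolean);
    const setupIdx = segments.findIndex((s) => s.endsWith('-setup'));
    if (setupIdx !== -1) {
      const byHint = plugins.find((p) => p && p.name === segments[setupIdx]);
      if (byHint) return { plugin: byHint, source: 'hint' };
    }
  }

  if (plugins.length === 1) return { plugin: plugins[0], source: 'single-fallback' };
  return null;
}
```

Because the `'single-fallback'` branch can pick a mismatched plugin, the real code pairs it with a warning rather than failing outright.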
@@ -269,6 +269,21 @@ class OfficialModules {
       return this.installFromResolution(resolved, bmadDir, fileTrackingCallback, options);
     }

+    // Community modules whose cloned repo ships marketplace.json get the same
+    // skill-level install treatment as custom-source installs. If the in-process
+    // cache wasn't populated (e.g. caller skipped the pre-clone phase), fall
+    // back to resolving directly from `~/.bmad/cache/community-modules/<name>/`
+    // so we don't silently regress to the legacy half-install path.
+    const { CommunityModuleManager } = require('./community-manager');
+    const communityMgr = new CommunityModuleManager();
+    let communityResolved = communityMgr.getPluginResolution(moduleName);
+    if (!communityResolved) {
+      communityResolved = await communityMgr.resolveFromCache(moduleName);
+    }
+    if (communityResolved) {
+      return this.installFromResolution(communityResolved, bmadDir, fileTrackingCallback, options);
+    }
+
     const sourcePath = await this.findModuleSource(moduleName, {
       silent: options.silent,
       channelOptions: options.channelOptions,
@@ -360,21 +375,27 @@ class OfficialModules {
       await this.createModuleDirectories(resolved.code, bmadDir, options);
     }

-    // Update manifest. For custom modules, derive channel from the git ref:
-    // cloneRef present → pinned at that ref
-    // cloneRef absent → next (main HEAD)
-    // local path → no channel concept
+    // Update manifest. For community installs we honor the channel resolved by
+    // CommunityModuleManager (stable/next/pinned) and propagate the registry's
+    // approved tag/sha. For custom-source installs we derive channel from the
+    // cloneRef (present → pinned, absent → next; local paths have no channel).
     const { Manifest } = require('../core/manifest');
     const manifestObj = new Manifest();

     const hasGitClone = !!resolved.repoUrl;
+    const isCommunity = resolved.communitySource === true;
     const manifestEntry = {
-      version: resolved.cloneRef || (hasGitClone ? 'main' : resolved.version || null),
-      source: 'custom',
+      version: resolved.communityVersion || resolved.cloneRef || (hasGitClone ? 'main' : resolved.version || null),
+      source: isCommunity ? 'community' : 'custom',
       npmPackage: null,
       repoUrl: resolved.repoUrl || null,
     };
-    if (hasGitClone) {
+    if (isCommunity) {
+      if (resolved.communityChannel) manifestEntry.channel = resolved.communityChannel;
+      if (resolved.cloneSha) manifestEntry.sha = resolved.cloneSha;
+      if (resolved.registryApprovedTag) manifestEntry.registryApprovedTag = resolved.registryApprovedTag;
+      if (resolved.registryApprovedSha) manifestEntry.registryApprovedSha = resolved.registryApprovedSha;
+    } else if (hasGitClone) {
       manifestEntry.channel = resolved.cloneRef ? 'pinned' : 'next';
       if (resolved.cloneSha) manifestEntry.sha = resolved.cloneSha;
       if (resolved.rawInput) manifestEntry.rawSource = resolved.rawInput;
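The version/source precedence that this manifest change introduces can be sketched as a small pure function. Field names mirror the diff (`communityVersion`, `cloneRef`, `repoUrl`, `communitySource`); the helper itself is hypothetical, extracted here only to make the fallback chain testable.

```javascript
// Sketch of the manifest version/source precedence from the hunk above:
// community version first, then cloned ref, then 'main' for git clones,
// then the on-disk package version; source flips to 'community' when the
// resolution was stamped by CommunityModuleManager.
function manifestVersionAndSource(resolved) {
  const hasGitClone = !!resolved.repoUrl;
  const isCommunity = resolved.communitySource === true;
  return {
    version: resolved.communityVersion || resolved.cloneRef || (hasGitClone ? 'main' : resolved.version || null),
    source: isCommunity ? 'community' : 'custom',
  };
}
```

The same precedence expression is reused for the `versionInfo` summary in the next hunk, which is why both sides of the diff keep it textually identical.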
@@ -386,10 +407,13 @@ class OfficialModules {
       success: true,
       module: resolved.code,
       path: targetPath,
-      // Match the manifestEntry.version expression above so downstream summary
-      // lines show the cloned ref (tag or 'main') instead of the on-disk
-      // package.json version for git-backed custom installs.
-      versionInfo: { version: resolved.cloneRef || (hasGitClone ? 'main' : resolved.version || '') },
+      // Mirror the manifestEntry.version precedence above so downstream summary
+      // lines show the same string we just wrote to disk (community installs
+      // use the registry-approved tag via `communityVersion`; custom git-backed
+      // installs show the cloned ref or 'main').
+      versionInfo: {
+        version: resolved.communityVersion || resolved.cloneRef || (hasGitClone ? 'main' : resolved.version || ''),
+      },
     };
   }
@@ -1,5 +1,6 @@
 const path = require('node:path');
 const os = require('node:os');
+const yaml = require('yaml');
 const fs = require('./fs-native');

 /**
@@ -86,8 +87,11 @@ function getExternalModuleCachePath(moduleName, ...segments) {
  * Built-in modules (core, bmm) live under <src>. External official modules are
  * cloned into ~/.bmad/cache/external-modules/<name>/ with varying internal
  * layouts (some at src/module.yaml, some at skills/module.yaml, some nested).
- * Local custom-source modules are not cached; their path is read from the
- * CustomModuleManager resolution cache set during the same install run.
+ * Url-source custom modules are cloned into ~/.bmad/cache/custom-modules/<host>/<owner>/<repo>/
+ * and are resolved by walking the cache and matching `code` or `name` from the
+ * discovered module.yaml. Local custom-source modules are not cached; their
+ * path is read from the CustomModuleManager resolution cache set during the
+ * same install run.
  * This mirrors the candidate-path search in
  * ExternalModuleManager.findExternalModuleSource but performs no git/network
  * work, which keeps it safe to call during manifest writing.
@@ -99,11 +103,14 @@ async function resolveInstalledModuleYaml(moduleName) {
   const builtIn = path.join(getModulePath(moduleName), 'module.yaml');
   if (await fs.pathExists(builtIn)) return builtIn;

-  // Search a resolved root directory using the same candidate-path pattern.
-  async function searchRoot(root) {
+  // Collect every module.yaml under a root using the standard candidate paths.
+  // Url-source repos can host multiple plugins (discovery mode), so we need all
+  // matches, not just the first. Returned in priority order.
+  async function searchRootAll(root) {
+    const results = [];
     for (const dir of ['skills', 'src']) {
       const direct = path.join(root, dir, 'module.yaml');
-      if (await fs.pathExists(direct)) return direct;
+      if (await fs.pathExists(direct)) results.push(direct);

       const dirPath = path.join(root, dir);
       if (await fs.pathExists(dirPath)) {
@@ -111,22 +118,35 @@ async function resolveInstalledModuleYaml(moduleName) {
         for (const entry of entries) {
           if (!entry.isDirectory()) continue;
           const nested = path.join(dirPath, entry.name, 'module.yaml');
-          if (await fs.pathExists(nested)) return nested;
+          if (await fs.pathExists(nested)) results.push(nested);
         }
       }
     }

-    // BMB standard: {setup-skill}/assets/module.yaml (setup skill is any *-setup directory)
-    const rootEntries = await fs.readdir(root, { withFileTypes: true });
-    for (const entry of rootEntries) {
-      if (!entry.isDirectory() || !entry.name.endsWith('-setup')) continue;
-      const setupAssets = path.join(root, entry.name, 'assets', 'module.yaml');
-      if (await fs.pathExists(setupAssets)) return setupAssets;
+    // BMB standard: {setup-skill}/assets/module.yaml (setup skill is any *-setup directory).
+    // Check at the repo root, and also under src/skills/ and skills/ since
+    // marketplace plugins commonly nest skills under src/skills/<name>/.
+    const setupSearchRoots = [root, path.join(root, 'src', 'skills'), path.join(root, 'skills')];
+    for (const setupRoot of setupSearchRoots) {
+      if (!(await fs.pathExists(setupRoot))) continue;
+      const entries = await fs.readdir(setupRoot, { withFileTypes: true });
+      for (const entry of entries) {
+        if (!entry.isDirectory() || !entry.name.endsWith('-setup')) continue;
+        const setupAssets = path.join(setupRoot, entry.name, 'assets', 'module.yaml');
+        if (await fs.pathExists(setupAssets)) results.push(setupAssets);
+      }
     }

     const atRoot = path.join(root, 'module.yaml');
-    if (await fs.pathExists(atRoot)) return atRoot;
-    return null;
+    if (await fs.pathExists(atRoot)) results.push(atRoot);
+    return results;
+  }
+
+  // Backwards-compatible single-result variant for the existing external-cache
+  // and resolution-cache fallbacks (one module per root by construction).
+  async function searchRoot(root) {
+    const all = await searchRootAll(root);
+    return all.length > 0 ? all[0] : null;
   }

   const cacheRoot = getExternalModuleCachePath(moduleName);
@@ -135,6 +155,16 @@ async function resolveInstalledModuleYaml(moduleName) {
     if (found) return found;
   }

+  // Community modules are cloned to ~/.bmad/cache/community-modules/<name>/
+  // (parallel to the external-modules cache used above). Search there too so
+  // collectAgentsFromModuleYaml and writeCentralConfig can locate community
+  // module.yaml files regardless of how nested the layout is.
+  const communityCacheRoot = path.join(os.homedir(), '.bmad', 'cache', 'community-modules', moduleName);
+  if (await fs.pathExists(communityCacheRoot)) {
+    const found = await searchRoot(communityCacheRoot);
+    if (found) return found;
+  }
+
   // Fallback: local custom-source modules store their source path in the
   // CustomModuleManager resolution cache populated during the same install run.
   // Match by code OR name since callers may use either form.
@ -150,6 +180,37 @@ async function resolveInstalledModuleYaml(moduleName) {
|
||||||
// Resolution cache unavailable — continue
|
// Resolution cache unavailable — continue
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Fallback: url-source custom modules cloned to ~/.bmad/cache/custom-modules/.
|
||||||
|
// Walk every cached repo, enumerate ALL module.yaml files via searchRootAll
|
||||||
|
// (a single repo can host multiple plugins in discovery mode), and match by
|
||||||
|
// the yaml's `code` or `name` field. This works on re-install runs where
|
||||||
|
// _resolutionCache is empty and covers both discovery-mode (with marketplace.json)
|
||||||
|
// and direct-mode modules, since we identify repo roots by .bmad-source.json
|
||||||
|
// (written by cloneRepo) or .claude-plugin/ rather than by marketplace.json.
|
||||||
|
try {
|
||||||
|
const customCacheDir = path.join(os.homedir(), '.bmad', 'cache', 'custom-modules');
|
||||||
|
if (await fs.pathExists(customCacheDir)) {
|
||||||
|
const { CustomModuleManager } = require('./modules/custom-module-manager');
|
||||||
|
const customMgr = new CustomModuleManager();
|
||||||
|
const repoRoots = await customMgr._findCacheRepoRoots(customCacheDir);
|
||||||
|
for (const { repoPath } of repoRoots) {
|
||||||
|
const candidates = await searchRootAll(repoPath);
|
||||||
|
for (const candidate of candidates) {
|
||||||
|
try {
|
||||||
|
const parsed = yaml.parse(await fs.readFile(candidate, 'utf8'));
|
||||||
|
if (parsed && (parsed.code === moduleName || parsed.name === moduleName)) {
|
||||||
|
return candidate;
|
||||||
|
}
|
||||||
|
} catch {
|
||||||
|
// Malformed yaml — skip
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} catch {
|
||||||
|
// Custom-modules cache walk failed — continue
|
||||||
|
}
|
||||||
|
|
||||||
return null;
|
return null;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
|
||||||
|
|
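The fallback walk above — enumerate every module.yaml under a cache root, then match the yaml's top-level `code` or `name` against the requested module — can be sketched in Python. This is a minimal stand-in, not the project's code: `find_module_yaml` is a hypothetical name, and a naive key/value line scan substitutes for a real YAML parser.

```python
from pathlib import Path


def find_module_yaml(cache_root: str, module_name: str):
    """Return the first module.yaml under cache_root whose `code` or
    `name` field equals module_name, or None (sketch only)."""
    for candidate in sorted(Path(cache_root).rglob("module.yaml")):
        fields = {}
        try:
            for line in candidate.read_text(encoding="utf-8").splitlines():
                # Naive top-level "key: value" scan — stand-in for a YAML parser.
                if ":" in line and not line.startswith((" ", "\t", "#")):
                    key, _, value = line.partition(":")
                    fields[key.strip()] = value.strip().strip("'\"")
        except OSError:
            continue  # unreadable file — skip, like the try/catch in the JS
        if fields.get("code") == module_name or fields.get("name") == module_name:
            return candidate
    return None
```

Like the JS version, this accepts either identifier form, so a lookup by `"bmb"` and a lookup by the human-readable module name both resolve to the same file.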
@ -0,0 +1,316 @@
#!/usr/bin/env python3
# /// script
# requires-python = ">=3.9"
# dependencies = []
# ///
"""Remove legacy module directories from _bmad/ after config migration.

After merge-config.py and merge-help-csv.py have migrated config data and
deleted individual legacy files, this script removes the now-redundant
directory trees. These directories contain skill files that are already
installed at .claude/skills/ (or equivalent) — only the config files at
_bmad/ root need to persist.

When --skills-dir is provided, the script verifies that every skill found
in the legacy directories exists at the installed location before removing
anything. Directories without skills (like _config/) are removed directly.

Exit codes: 0=success (including nothing to remove), 1=validation error, 2=runtime error
"""

import argparse
import json
import logging
import shutil
import sys
from pathlib import Path

logger = logging.getLogger(__name__)


def parse_args():
    parser = argparse.ArgumentParser(
        description="Remove legacy module directories from _bmad/ after config migration."
    )
    parser.add_argument(
        "--bmad-dir",
        required=True,
        help="Path to the _bmad/ directory",
    )
    parser.add_argument(
        "--module-code",
        required=True,
        help="Module code being cleaned up (e.g. 'bmb')",
    )
    parser.add_argument(
        "--also-remove",
        action="append",
        default=[],
        help="Additional directory names under _bmad/ to remove (repeatable)",
    )
    parser.add_argument(
        "--skills-dir",
        help="Path to .claude/skills/ — enables safety verification that skills "
        "are installed before removing legacy copies",
    )
    parser.add_argument(
        "--verbose",
        action="store_true",
        help="Print detailed progress to stderr",
    )
    return parser.parse_args()


def find_skill_dirs(base_path: str) -> list:
    """Find installable skill directories under base_path.

    Only considers SKILL.md files at recognized installable positions:
    - Direct children: base_path/{name}/SKILL.md (legacy flat layout)
    - Skills subfolder: base_path/skills/{name}/SKILL.md (current layout)

    SKILL.md files nested deeper (e.g. in tasks/, assets/, or within a
    skill's own subdirectories) are not installable skills and are skipped.

    NOTE: These discovery rules are intentionally stricter than the installer's
    recursive collectSkills() behavior. The installer is permissive — it walks
    the entire tree to find all SKILL.md files for installation. Cleanup must
    be conservative: we only match the two canonical installable layouts so we
    never accidentally validate a SKILL.md buried in tasks/, assets/, or other
    non-installable subdirectories as proof that a skill is present.

    Returns:
        List of skill directory names (e.g. ['bmad-agent-builder', 'bmad-builder-setup'])
    """
    skills = []
    root = Path(base_path)
    if not root.exists():
        return skills

    # Direct child: {name}/SKILL.md
    for skill_md in root.glob("*/SKILL.md"):
        skills.append(skill_md.parent.name)

    # Skills subfolder: skills/{name}/SKILL.md
    skills_root = root / "skills"
    if skills_root.exists():
        for skill_md in skills_root.glob("*/SKILL.md"):
            skills.append(skill_md.parent.name)

    return sorted(set(skills))


def verify_skills_installed(
    bmad_dir: str, dirs_to_check: list, skills_dir: str, verbose: bool = False
) -> list:
    """Verify that skills in legacy directories exist at the installed location.

    Scans each directory in dirs_to_check for skill folders (containing SKILL.md),
    then checks that a matching directory exists under skills_dir. Directories
    that contain no skills (like _config/) are silently skipped.

    Returns:
        List of verified skill names.

    Raises SystemExit(1) if any skills are missing from skills_dir.
    """
    all_verified = []
    missing = []

    for dirname in dirs_to_check:
        legacy_path = Path(bmad_dir) / dirname
        if not legacy_path.exists():
            continue

        skill_names = find_skill_dirs(str(legacy_path))
        if not skill_names:
            if verbose:
                print(
                    f"No skills found in {dirname}/ — skipping verification",
                    file=sys.stderr,
                )
            continue

        for skill_name in skill_names:
            installed_path = Path(skills_dir) / skill_name
            if installed_path.is_dir():
                all_verified.append(skill_name)
                if verbose:
                    print(
                        f"Verified: {skill_name} exists at {installed_path}",
                        file=sys.stderr,
                    )
            else:
                missing.append(skill_name)
                if verbose:
                    print(
                        f"MISSING: {skill_name} not found at {installed_path}",
                        file=sys.stderr,
                    )

    if missing:
        error_result = {
            "status": "error",
            "error": "Skills not found at installed location",
            "missing_skills": missing,
            "skills_dir": str(Path(skills_dir).resolve()),
        }
        print(json.dumps(error_result, indent=2))
        sys.exit(1)

    return sorted(set(all_verified))


def count_files(path: Path) -> int:
    """Count all files recursively in a directory."""
    count = 0
    for item in path.rglob("*"):
        if item.is_file():
            count += 1
    return count


def cleanup_directories(
    bmad_dir: str, dirs_to_remove: list, verbose: bool = False
) -> tuple:
    """Remove specified directories under bmad_dir.

    Preserves config.yaml files if present (needed by bmad-init at runtime).

    Returns:
        (removed, not_found, total_files_removed) tuple
    """
    removed = []
    not_found = []
    total_files = 0

    for dirname in dirs_to_remove:
        target = Path(bmad_dir) / dirname
        if not target.exists():
            not_found.append(dirname)
            if verbose:
                print(f"Not found (skipping): {target}", file=sys.stderr)
            continue

        if not target.is_dir():
            if verbose:
                print(f"Not a directory (skipping): {target}", file=sys.stderr)
            not_found.append(dirname)
            continue

        # Validate directory name to prevent path traversal
        if ".." in dirname or "/" in dirname or "\\" in dirname:
            error_result = {
                "status": "error",
                "error": f"Invalid directory name (path traversal rejected): {dirname}",
                "directories_removed": removed,
                "directories_failed": dirname,
            }
            print(json.dumps(error_result, indent=2))
            sys.exit(2)

        # Preserve config.yaml if present (bmad-init needs per-module configs)
        config_path = target / "config.yaml"
        config_backup = None
        if config_path.exists():
            config_backup = config_path.read_bytes()
            if verbose:
                print(f"Preserving config.yaml in {dirname}/", file=sys.stderr)

        file_count = count_files(target)
        if config_backup is not None:
            file_count -= 1  # Don't count the preserved file
        if verbose:
            print(
                f"Removing {target} ({file_count} files)",
                file=sys.stderr,
            )

        try:
            shutil.rmtree(target)

            # Restore preserved config.yaml
            if config_backup is not None:
                target.mkdir(parents=True, exist_ok=True)
                config_path.write_bytes(config_backup)
                if verbose:
                    print(
                        f"Restored config.yaml in {dirname}/",
                        file=sys.stderr,
                    )
        except OSError as e:
            logger.error("Failed during cleanup of %s: %s", target, e)
            error_result = {
                "status": "error",
                "error": f"Failed to remove {target}: {e}",
                "directories_removed": removed,
                "directories_failed": dirname,
            }
            print(json.dumps(error_result, indent=2))
            sys.exit(2)

        removed.append(dirname)
        total_files += file_count

    return removed, not_found, total_files


def main():
    args = parse_args()

    bmad_dir = args.bmad_dir
    module_code = args.module_code

    # Build the list of directories to remove
    dirs_to_remove = [module_code, "core"] + args.also_remove
    # Deduplicate while preserving order
    seen = set()
    unique_dirs = []
    for d in dirs_to_remove:
        if d not in seen:
            seen.add(d)
            unique_dirs.append(d)
    dirs_to_remove = unique_dirs

    if args.verbose:
        print(f"Directories to remove: {dirs_to_remove}", file=sys.stderr)

    # Safety check: verify skills are installed before removing
    verified_skills = None
    if args.skills_dir:
        if args.verbose:
            print(
                f"Verifying skills installed at {args.skills_dir}",
                file=sys.stderr,
            )
        verified_skills = verify_skills_installed(
            bmad_dir, dirs_to_remove, args.skills_dir, args.verbose
        )

    # Remove directories
    removed, not_found, total_files = cleanup_directories(
        bmad_dir, dirs_to_remove, args.verbose
    )

    # Build result
    result = {
        "status": "success",
        "bmad_dir": str(Path(bmad_dir).resolve()),
        "directories_removed": removed,
        "directories_not_found": not_found,
        "files_removed_count": total_files,
    }

    if args.skills_dir:
        result["safety_checks"] = {
            "skills_verified": True,
            "skills_dir": str(Path(args.skills_dir).resolve()),
            "verified_skills": verified_skills,
        }
    else:
        result["safety_checks"] = None

    print(json.dumps(result, indent=2))


if __name__ == "__main__":
    main()
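The two discovery rules in find_skill_dirs (direct child vs. skills/ subfolder, with deeper SKILL.md files ignored) can be exercised with a small self-contained check. This sketch inlines the same two glob patterns rather than importing the script above, and the directory names are made up for the demo:

```python
import tempfile
from pathlib import Path


def find_skill_dirs(base_path: str) -> list:
    # Same two canonical layouts as the script above: {name}/SKILL.md
    # and skills/{name}/SKILL.md; anything nested deeper is skipped.
    root = Path(base_path)
    skills = []
    if root.exists():
        skills += [p.parent.name for p in root.glob("*/SKILL.md")]
        skills += [p.parent.name for p in root.glob("skills/*/SKILL.md")]
    return sorted(set(skills))


with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    # Installable: direct child and skills/ subfolder
    (root / "alpha").mkdir()
    (root / "alpha" / "SKILL.md").write_text("a")
    (root / "skills" / "beta").mkdir(parents=True)
    (root / "skills" / "beta" / "SKILL.md").write_text("b")
    # Not installable: SKILL.md buried under tasks/ must be ignored
    (root / "alpha" / "tasks" / "gamma").mkdir(parents=True)
    (root / "alpha" / "tasks" / "gamma" / "SKILL.md").write_text("c")
    print(find_skill_dirs(tmp))  # ['alpha', 'beta']
```

The buried `tasks/gamma/SKILL.md` never appears in the result, which is exactly why cleanup can trust these names as evidence that a skill is (or is not) installed.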
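The explicit seen/unique_dirs loop in main() has a common one-line equivalent worth noting (the input list here is illustrative):

```python
# dict preserves insertion order (guaranteed since Python 3.7), so
# dict.fromkeys performs the same order-preserving dedup as the
# seen/unique_dirs loop in main() above.
dirs_to_remove = ["bmb", "core", "bmb", "_config", "core"]  # illustrative input
deduped = list(dict.fromkeys(dirs_to_remove))
print(deduped)  # ['bmb', 'core', '_config']
```

Either form keeps the first occurrence of each name, so the module's own directory stays ahead of "core" and any --also-remove extras.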