Compare commits

..

3 Commits

Author SHA1 Message Date
Jesse Vincent
1879a94ac7 Merge pull request #1123 from obra/feat-add-multi-repo-worktree
feat: add multi-repo worktree guidance (#710)
2026-04-13 16:48:44 -07:00
Drew Ritter
ddbba8e469 docs: drop brittle step-number chain from multi-repo row
Addresses review feedback on #1123. Replaces "(same Step 0→1a→1b flow,
matching branch names)" with plain-language instruction that doesn't
forward-reference section numbers that could rot under future edits.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-13 16:30:00 -07:00
Drew Ritter
f0728841d8 feat: add multi-repo worktree guidance (#710) 2026-04-13 16:29:59 -07:00
33 changed files with 281 additions and 1970 deletions

View File

@@ -9,7 +9,7 @@
 {
   "name": "superpowers",
   "description": "Core skills library for Claude Code: TDD, debugging, collaboration patterns, and proven techniques",
-  "version": "5.0.7",
+  "version": "5.0.6",
   "source": "./",
   "author": {
     "name": "Jesse Vincent",

View File

@@ -1,7 +1,7 @@
 {
   "name": "superpowers",
   "description": "Core skills library for Claude Code: TDD, debugging, collaboration patterns, and proven techniques",
-  "version": "5.0.7",
+  "version": "5.0.6",
   "author": {
     "name": "Jesse Vincent",
     "email": "jesse@fsck.com"

View File

@@ -1,47 +0,0 @@
{
"name": "superpowers",
"version": "5.0.7",
"description": "An agentic skills framework & software development methodology that works: planning, TDD, debugging, and collaboration workflows.",
"author": {
"name": "Jesse Vincent",
"email": "jesse@fsck.com",
"url": "https://github.com/obra"
},
"homepage": "https://github.com/obra/superpowers",
"repository": "https://github.com/obra/superpowers",
"license": "MIT",
"keywords": [
"brainstorming",
"subagent-driven-development",
"skills",
"planning",
"tdd",
"debugging",
"code-review",
"workflow"
],
"skills": "./skills/",
"interface": {
"displayName": "Superpowers",
"shortDescription": "Planning, TDD, debugging, and delivery workflows for coding agents",
"longDescription": "Use Superpowers to guide agent work through brainstorming, implementation planning, test-driven development, systematic debugging, parallel execution, code review, and finish-the-branch workflows.",
"developerName": "Jesse Vincent",
"category": "Coding",
"capabilities": [
"Interactive",
"Read",
"Write"
],
"defaultPrompt": [
"I've got an idea for something I'd like to build.",
"Let's add a feature to this project."
],
"websiteURL": "https://github.com/obra/superpowers",
"privacyPolicyURL": "https://docs.github.com/en/site-policy/privacy-policies/github-general-privacy-statement",
"termsOfServiceURL": "https://docs.github.com/en/site-policy/github-terms/github-terms-of-service",
"brandColor": "#F59E0B",
"composerIcon": "./assets/superpowers-small.svg",
"logo": "./assets/app-icon.png",
"screenshots": []
}
}

View File

@@ -2,7 +2,7 @@
"name": "superpowers", "name": "superpowers",
"displayName": "Superpowers", "displayName": "Superpowers",
"description": "Core skills library: TDD, debugging, collaboration patterns, and proven techniques", "description": "Core skills library: TDD, debugging, collaboration patterns, and proven techniques",
"version": "5.0.7", "version": "5.0.6",
"author": { "author": {
"name": "Jesse Vincent", "name": "Jesse Vincent",
"email": "jesse@fsck.com" "email": "jesse@fsck.com"

View File

@@ -1,5 +1,5 @@
 blank_issues_enabled: false
 contact_links:
   - name: Questions & Help
-    url: https://discord.gg/35wsABTejz
+    url: https://discord.gg/Jd8Vphy9jq
     about: For usage questions, troubleshooting help, and general discussion, please visit our Discord instead of opening an issue.

View File

@@ -14,14 +14,10 @@ Add superpowers to the `plugin` array in your `opencode.json` (global or project
 }
 ```
-Restart OpenCode. The plugin installs through OpenCode's plugin manager and
-registers all skills.
+Restart OpenCode. That's it — the plugin auto-installs and registers all skills.
 Verify by asking: "Tell me about your superpowers"
-OpenCode uses its own plugin install. If you also use Claude Code, Codex, or
-another harness, install Superpowers separately for each one.
 ## Migrating from the old symlink-based install
 If you previously installed superpowers using `git clone` and symlinks, remove the old setup:
@@ -50,10 +46,7 @@ use skill tool to load superpowers/brainstorming
 ## Updating
-OpenCode installs Superpowers through a git-backed package spec. Some OpenCode
-and Bun versions pin that resolved git dependency in a lockfile or cache, so a
-restart may not pick up the newest Superpowers commit. If updates do not appear,
-clear OpenCode's package cache or reinstall the plugin.
+Superpowers updates automatically when you restart OpenCode.
 To pin a specific version:
@@ -71,26 +64,6 @@ To pin a specific version:
 2. Verify the plugin line in your `opencode.json`
 3. Make sure you're running a recent version of OpenCode
-### Windows install issues
-Some Windows OpenCode builds have upstream installer issues with git-backed
-plugin specs, including cache paths for `git+https` URLs and Bun not finding
-`git.exe` even when it works in a normal terminal. If OpenCode cannot install
-the plugin, try installing with system npm and pointing OpenCode at the local
-package:
-```powershell
-npm install superpowers@git+https://github.com/obra/superpowers.git --prefix "$HOME\.config\opencode"
-```
-Then use the installed package path in `opencode.json`:
-```json
-{
-  "plugin": ["~/.config/opencode/node_modules/superpowers"]
-}
-```
 ### Skills not found
 1. Use `skill` tool to list what's discovered

View File

@@ -46,29 +46,17 @@ const normalizePath = (p, homeDir) => {
 return path.resolve(normalized);
 };
-// Module-level cache for bootstrap content.
-// The SKILL.md file does not change during a session, so reading + parsing it
-// once eliminates redundant fs.existsSync + fs.readFileSync + regex work on
-// every agent step. See #1202 for the full analysis.
-let _bootstrapCache = undefined; // undefined = not yet loaded, null = file missing
 export const SuperpowersPlugin = async ({ client, directory }) => {
 const homeDir = os.homedir();
 const superpowersSkillsDir = path.resolve(__dirname, '../../skills');
 const envConfigDir = normalizePath(process.env.OPENCODE_CONFIG_DIR, homeDir);
 const configDir = envConfigDir || path.join(homeDir, '.config/opencode');
-// Helper to generate bootstrap content (cached after first call)
+// Helper to generate bootstrap content
 const getBootstrapContent = () => {
-// Return cached result on subsequent calls
-if (_bootstrapCache !== undefined) return _bootstrapCache;
 // Try to load using-superpowers skill
 const skillPath = path.join(superpowersSkillsDir, 'using-superpowers', 'SKILL.md');
-if (!fs.existsSync(skillPath)) {
-_bootstrapCache = null;
-return null;
-}
+if (!fs.existsSync(skillPath)) return null;
 const fullContent = fs.readFileSync(skillPath, 'utf8');
 const { content } = extractAndStripFrontmatter(fullContent);
@@ -82,7 +70,7 @@ When skills reference tools you don't have, substitute OpenCode equivalents:
 Use OpenCode's native \`skill\` tool to list and load skills.`;
-_bootstrapCache = `<EXTREMELY_IMPORTANT>
+return `<EXTREMELY_IMPORTANT>
 You have superpowers.
 **IMPORTANT: The using-superpowers skill content is included below. It is ALREADY LOADED - you are currently following it. Do NOT use the skill tool to load "using-superpowers" again - that would be redundant.**
@@ -91,8 +79,6 @@ ${content}
 ${toolMapping}
 </EXTREMELY_IMPORTANT>`;
-return _bootstrapCache;
 };
 return {
@@ -112,22 +98,13 @@ ${toolMapping}
 // Using a user message instead of a system message avoids:
 // 1. Token bloat from system messages repeated every turn (#750)
 // 2. Multiple system messages breaking Qwen and other models (#894)
-//
-// The hook fires on every agent step (not just every turn) because
-// opencode's prompt.ts reloads messages from DB each step. Fresh message
-// arrays may need injection again, so getBootstrapContent() must not do
-// repeated disk work.
 'experimental.chat.messages.transform': async (_input, output) => {
 const bootstrap = getBootstrapContent();
 if (!bootstrap || !output.messages.length) return;
 const firstUser = output.messages.find(m => m.info.role === 'user');
 if (!firstUser || !firstUser.parts.length) return;
-// Guard: skip if first user message already contains bootstrap.
-// This prevents double injection when OpenCode passes an already
-// transformed in-memory message array through the hook again.
+// Only inject once
 if (firstUser.parts.some(p => p.type === 'text' && p.text.includes('EXTREMELY_IMPORTANT'))) return;
 const ref = firstUser.parts[0];
 firstUser.parts.unshift({ ...ref, type: 'text', text: bootstrap });
 }

View File

@@ -1,20 +0,0 @@
{
"files": [
{ "path": "package.json", "field": "version" },
{ "path": ".claude-plugin/plugin.json", "field": "version" },
{ "path": ".cursor-plugin/plugin.json", "field": "version" },
{ "path": ".codex-plugin/plugin.json", "field": "version" },
{ "path": ".claude-plugin/marketplace.json", "field": "plugins.0.version" },
{ "path": "gemini-extension.json", "field": "version" }
],
"audit": {
"exclude": [
"CHANGELOG.md",
"RELEASE-NOTES.md",
"node_modules",
".git",
".version-bump.json",
"scripts/bump-version.sh"
]
}
}
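The `field` entries in the deleted config above use dotted paths, including numeric array indices like `plugins.0.version`. A minimal sketch of how a bump tool might resolve and rewrite such paths against parsed JSON — the helper names here are hypothetical, not the actual `bump-version.sh` logic (which is shell):

```javascript
// Hypothetical sketch: resolve/assign dotted field paths such as
// "plugins.0.version" against a parsed JSON document.
function getField(doc, fieldPath) {
  // Numeric segments index arrays; other segments index object keys.
  return fieldPath.split('.').reduce((node, key) => node?.[key], doc);
}

function setField(doc, fieldPath, value) {
  const keys = fieldPath.split('.');
  const last = keys.pop();
  const parent = keys.reduce((node, key) => node[key], doc);
  parent[last] = value;
}
```

For example, `setField(manifest, 'plugins.0.version', '5.0.7')` rewrites the first plugin's version in a parsed `marketplace.json`.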

View File

@@ -1 +0,0 @@
CLAUDE.md

CHANGELOG.md Normal file
View File

@@ -0,0 +1,13 @@
# Changelog
## [5.0.5] - 2026-03-17
### Fixed
- **Brainstorm server ESM fix**: Renamed `server.js` → `server.cjs` so the brainstorming server starts correctly on Node.js 22+ where the root `package.json` `"type": "module"` caused `require()` to fail. ([PR #784](https://github.com/obra/superpowers/pull/784) by @sarbojitrana, fixes [#774](https://github.com/obra/superpowers/issues/774), [#780](https://github.com/obra/superpowers/issues/780), [#783](https://github.com/obra/superpowers/issues/783))
- **Brainstorm owner-PID on Windows**: Skip `BRAINSTORM_OWNER_PID` lifecycle monitoring on Windows/MSYS2 where the PID namespace is invisible to Node.js. Prevents the server from self-terminating after 60 seconds. The 30-minute idle timeout remains as the safety net. ([#770](https://github.com/obra/superpowers/issues/770), docs from [PR #768](https://github.com/obra/superpowers/pull/768) by @lucasyhzhu-debug)
- **stop-server.sh reliability**: Verify the server process actually died before reporting success. Waits up to 2 seconds for graceful shutdown, escalates to `SIGKILL`, and reports failure if the process survives. ([#723](https://github.com/obra/superpowers/issues/723))
### Changed
- **Execution handoff**: Restore user choice between subagent-driven-development and executing-plans after plan writing. Subagent-driven is recommended but no longer mandatory. (Reverts `5e51c3e`)

View File

@@ -1,85 +0,0 @@
# Superpowers — Contributor Guidelines
## If You Are an AI Agent
Stop. Read this section before doing anything.
This repo has a 94% PR rejection rate. Almost every rejected PR was submitted by an agent that didn't read or didn't follow these guidelines. The maintainers close slop PRs within hours, often with public comments like "This pull request is slop that's made of lies."
**Your job is to protect your human partner from that outcome.** Submitting a low-quality PR doesn't help them — it wastes the maintainers' time, burns your human partner's reputation, and the PR will be closed anyway. That is not being helpful. That is being a tool of embarrassment.
Before you open a PR against this repo, you MUST:
1. **Read the entire PR template** at `.github/PULL_REQUEST_TEMPLATE.md` and fill in every section with real, specific answers. Not summaries. Not placeholders.
2. **Search for existing PRs** — open AND closed — that address the same problem. If duplicates exist, STOP and tell your human partner. Do not open another duplicate.
3. **Verify this is a real problem.** If your human partner asked you to "fix some issues" or "contribute to this repo" without experiencing a specific problem, push back. Ask them what broke, what failed, what the user experience was.
4. **Confirm the change belongs in core.** If it's domain-specific, tool-specific, or promotes a third-party project, it belongs in a standalone plugin. Tell your human partner.
5. **Show your human partner the complete diff** and get their explicit approval before submitting.
If any of these checks fail, do not open the PR. Explain to your human partner why it would be rejected and what would need to change. They will thank you for saving them the embarrassment.
## Pull Request Requirements
**Every PR must fully complete the PR template.** No section may be left blank or filled with placeholder text. PRs that skip sections will be closed without review.
**Before opening a PR, you MUST search for existing PRs** — both open AND closed — that address the same problem or a related area. Reference what you found in the "Existing PRs" section. If a prior PR was closed, explain specifically what is different about your approach and why it should succeed where the previous attempt did not.
**PRs that show no evidence of human involvement will be closed.** A human must review the complete proposed diff before submission.
## What We Will Not Accept
### Third-party dependencies
PRs that add optional or required dependencies on third-party projects will not be accepted unless they are adding support for a new harness (e.g., a new IDE or CLI tool). Superpowers is a zero-dependency plugin by design. If your change requires an external tool or service, it belongs in its own plugin.
### "Compliance" changes to skills
Our internal skill philosophy differs from Anthropic's published guidance on writing skills. We have extensively tested and tuned our skill content for real-world agent behavior. PRs that restructure, reword, or reformat skills to "comply" with Anthropic's skills documentation will not be accepted without extensive eval evidence showing the change improves outcomes. The bar for modifying behavior-shaping content is very high.
### Project-specific or personal configuration
Skills, hooks, or configuration that only benefit a specific project, team, domain, or workflow do not belong in core. Publish these as a separate plugin.
### Bulk or spray-and-pray PRs
Do not trawl the issue tracker and open PRs for multiple issues in a single session. Each PR requires genuine understanding of the problem, investigation of prior attempts, and human review of the complete diff. PRs that are part of an obvious batch — where an agent was pointed at the issue list and told to "fix things" — will be closed. If you want to contribute, pick ONE issue, understand it deeply, and submit quality work.
### Speculative or theoretical fixes
Every PR must solve a real problem that someone actually experienced. "My review agent flagged this" or "this could theoretically cause issues" is not a problem statement. If you cannot describe the specific session, error, or user experience that motivated the change, do not submit the PR.
### Domain-specific skills
Superpowers core contains general-purpose skills that benefit all users regardless of their project. Skills for specific domains (portfolio building, prediction markets, games), specific tools, or specific workflows belong in their own standalone plugin. Ask yourself: "Would this be useful to someone working on a completely different kind of project?" If not, publish it separately.
### Fork-specific changes
If you maintain a fork with customizations, do not open PRs to sync your fork or push fork-specific changes upstream. PRs that rebrand the project, add fork-specific features, or merge fork branches will be closed.
### Fabricated content
PRs containing invented claims, fabricated problem descriptions, or hallucinated functionality will be closed immediately. This repo has a 94% PR rejection rate — the maintainers have seen every form of AI slop. They will notice.
### Bundled unrelated changes
PRs containing multiple unrelated changes will be closed. Split them into separate PRs.
## Skill Changes Require Evaluation
Skills are not prose — they are code that shapes agent behavior. If you modify skill content:
- Use `superpowers:writing-skills` to develop and test changes
- Run adversarial pressure testing across multiple sessions
- Show before/after eval results in your PR
- Do not modify carefully-tuned content (Red Flags tables, rationalization lists, "human partner" language) without evidence the change is an improvement
## Understand the Project Before Contributing
Before proposing changes to skill design, workflow philosophy, or architecture, read existing skills and understand the project's design decisions. Superpowers has its own tested philosophy about skill design, agent behavior shaping, and terminology (e.g., "your human partner" is deliberate, not interchangeable with "the user"). Changes that rewrite the project's voice or restructure its approach without understanding why it exists will be rejected.
## General
- Read `.github/PULL_REQUEST_TEMPLATE.md` before submitting
- One problem per PR
- Test on at least one harness and report results in the environment table
- Describe the problem you solved, not just what you changed

README.md
View File

@@ -1,10 +1,6 @@
 # Superpowers
-Superpowers is a complete software development methodology for your coding agents, built on top of a set of composable skills and some initial instructions that make sure your agent uses them.
-## Quickstart
-Give your agent Superpowers: [Claude Code](#claude-code), [Codex CLI](#codex-cli), [Codex App](#codex-app), [Factory Droid](#factory-droid), [Gemini CLI](#gemini-cli), [OpenCode](#opencode), [Cursor](#cursor), [GitHub Copilot CLI](#github-copilot-cli).
+Superpowers is a complete software development workflow for your coding agents, built on top of a set of composable "skills" and some initial instructions that make sure your agent uses them.
 ## How it works
@@ -30,126 +26,84 @@ Thanks!
 ## Installation
-Installation differs by harness. If you use more than one, install Superpowers separately for each one.
+**Note:** Installation differs by platform. Claude Code or Cursor have built-in plugin marketplaces. Codex and OpenCode require manual setup.
-### Claude Code
+### Claude Code Official Marketplace
 Superpowers is available via the [official Claude plugin marketplace](https://claude.com/plugins/superpowers)
-#### Official Marketplace
-- Install the plugin from Anthropic's official marketplace:
-```bash
-/plugin install superpowers@claude-plugins-official
-```
-#### Superpowers Marketplace
-The Superpowers marketplace provides Superpowers and some other related plugins for Claude Code.
-- Register the marketplace:
-```bash
-/plugin marketplace add obra/superpowers-marketplace
-```
-- Install the plugin from this marketplace:
-```bash
-/plugin install superpowers@superpowers-marketplace
-```
-### Codex CLI
-Superpowers is available via the [official Codex plugin marketplace](https://github.com/openai/plugins).
-- Open the plugin search interface:
-```bash
-/plugins
-```
-- Search for Superpowers:
-```bash
-superpowers
-```
-- Select `Install Plugin`.
-### Codex App
-Superpowers is available via the [official Codex plugin marketplace](https://github.com/openai/plugins).
-- In the Codex app, click on Plugins in the sidebar.
-- You should see `Superpowers` in the Coding section.
-- Click the `+` next to Superpowers and follow the prompts.
-### Factory Droid
-- Register the marketplace:
-```bash
-droid plugin marketplace add https://github.com/obra/superpowers
-```
-- Install the plugin:
-```bash
-droid plugin install superpowers@superpowers
-```
-### Gemini CLI
-- Install the extension:
-```bash
-gemini extensions install https://github.com/obra/superpowers
-```
-- Update later:
-```bash
-gemini extensions update superpowers
-```
-### OpenCode
-OpenCode uses its own plugin install; install Superpowers separately even if you
-already use it in another harness.
-- Tell OpenCode:
-```
-Fetch and follow instructions from https://raw.githubusercontent.com/obra/superpowers/refs/heads/main/.opencode/INSTALL.md
-```
-- Detailed docs: [docs/README.opencode.md](docs/README.opencode.md)
-### Cursor
-- In Cursor Agent chat, install from marketplace:
-```text
-/add-plugin superpowers
-```
-- Or search for "superpowers" in the plugin marketplace.
-### GitHub Copilot CLI
-- Register the marketplace:
-```bash
-copilot plugin marketplace add obra/superpowers-marketplace
-```
-- Install the plugin:
-```bash
-copilot plugin install superpowers@superpowers-marketplace
-```
+Install the plugin from Claude marketplace:
+```bash
+/plugin install superpowers@claude-plugins-official
+```
+### Claude Code (via Plugin Marketplace)
+In Claude Code, register the marketplace first:
+```bash
+/plugin marketplace add obra/superpowers-marketplace
+```
+Then install the plugin from this marketplace:
+```bash
+/plugin install superpowers@superpowers-marketplace
+```
+### Cursor (via Plugin Marketplace)
+In Cursor Agent chat, install from marketplace:
+```text
+/add-plugin superpowers
+```
+or search for "superpowers" in the plugin marketplace.
+### Codex
+Tell Codex:
+```
+Fetch and follow instructions from https://raw.githubusercontent.com/obra/superpowers/refs/heads/main/.codex/INSTALL.md
+```
+**Detailed docs:** [docs/README.codex.md](docs/README.codex.md)
+### OpenCode
+Tell OpenCode:
+```
+Fetch and follow instructions from https://raw.githubusercontent.com/obra/superpowers/refs/heads/main/.opencode/INSTALL.md
+```
+**Detailed docs:** [docs/README.opencode.md](docs/README.opencode.md)
+### GitHub Copilot CLI
+```bash
+copilot plugin marketplace add obra/superpowers-marketplace
+copilot plugin install superpowers@superpowers-marketplace
+```
+### Gemini CLI
+```bash
+gemini extensions install https://github.com/obra/superpowers
+```
+To update:
+```bash
+gemini extensions update superpowers
+```
+### Verify Installation
+Start a new session in your chosen platform and ask for something that should trigger a skill (for example, "help me plan this feature" or "let's debug this issue"). The agent should automatically invoke the relevant superpowers skill.
 ## The Basic Workflow
@@ -202,23 +156,26 @@ already use it in another harness.
 - **Complexity reduction** - Simplicity as primary goal
 - **Evidence over claims** - Verify before declaring success
-Read [the original release announcement](https://blog.fsck.com/2025/10/09/superpowers/).
+Read more: [Superpowers for Claude Code](https://blog.fsck.com/2025/10/09/superpowers/)
 ## Contributing
-The general contribution process for Superpowers is below. Keep in mind that we don't generally accept contributions of new skills and that any updates to skills must work across all of the coding agents we support.
+Skills live directly in this repository. To contribute:
 1. Fork the repository
-2. Switch to the 'dev' branch
-3. Create a branch for your work
-4. Follow the `writing-skills` skill for creating and testing new and modified skills
-5. Submit a PR, being sure to fill in the pull request template.
+2. Create a branch for your skill
+3. Follow the `writing-skills` skill for creating and testing new skills
+4. Submit a PR
 See `skills/writing-skills/SKILL.md` for the complete guide.
 ## Updating
-Superpowers updates are somewhat coding-agent dependent, but are often automatic.
+Skills update automatically when you update the plugin:
+```bash
+/plugin update superpowers
+```
 ## License
@@ -228,6 +185,10 @@ MIT License - see LICENSE file for details
 Superpowers is built by [Jesse Vincent](https://blog.fsck.com) and the rest of the folks at [Prime Radiant](https://primeradiant.com).
-- **Discord**: [Join us](https://discord.gg/35wsABTejz) for community support, questions, and sharing what you're building with Superpowers
+For community support, questions, and sharing what you're building with Superpowers, join us on [Discord](https://discord.gg/Jd8Vphy9jq).
+## Support
+- **Discord**: [Join us on Discord](https://discord.gg/Jd8Vphy9jq)
 - **Issues**: https://github.com/obra/superpowers/issues
-- **Release announcements**: [Sign up](https://primeradiant.com/superpowers/) to get notified about new versions
+- **Marketplace**: https://github.com/obra/superpowers-marketplace

View File

@@ -1,6 +1,6 @@
 # Superpowers Release Notes
-## v5.0.7 (2026-03-31)
+## Unreleased
 ### GitHub Copilot CLI Support

Binary file not shown.


View File

@@ -1 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?><svg id="Calque_1" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 512 512"><path d="M394.28,207.8c.81,2.41,1.39,4.78,1.8,7.07,1.61,9.03-.93,17.78-5.99,21.74-22.6,17.7-49.85,29.35-75.34,38.6-.59.22-1.09.28-1.4.34-2.22.47-4.95,1.04-7.25,0-1.46-.66-2.25-1.74-2.66-2.3-1.56-2.1-1.59-4.31-1.56-5.13.1-2.67-.01-4.69,0-4.82.45-3.52.91-10.66,1.41-21.28.6-3.87,2.16-9.63,6.94-13.96,4.01-3.62,8.33-4.6,14.59-5.87,10.76-2.19,37.21-8.22,47.42-16.56,1.63-1.33,2.97-2.65,4.19-3.96,3.72-3.99,6.39-7.92,7.93-10.36,3.22,3.22,7.25,8.48,9.92,16.47Z"/><path d="M428.67,185.28c-2.33,11.99-8.91,22.32-15.88,30.38.27-5.5-.05-12.11-1.86-19.08-5.04-19.36-19.74-34.7-37.78-37.78-32.21-9.74-70.59,3.79-99.08,18.29-3.87,1.95-9.52-2.77-11.84-8.16-3.32-7.71-1.63-6.28,2.61-8.49,38.31-20.03,82.01-39.61,123.91-29.7,8.26,1.95,15.96,5.26,23.48,10.54,11.32,7.96,20.21,24.74,16.44,44Z"/><path d="M117.72,304.2c-.81-2.41-1.39-4.78-1.8-7.07-1.61-9.03.93-17.78,5.99-21.74,22.6-17.7,49.85-29.35,75.34-38.6.59-.22,1.09-.28,1.4-.34,2.22-.47,4.95-1.04,7.25,0,1.46.66,2.25,1.74,2.66,2.3,1.56,2.1,1.59,4.31,1.56,5.13-.1,2.67.01,4.69,0,4.82-.45,3.52-.91,10.66-1.41,21.28-.6,3.87-2.16,9.63-6.94,13.96-4.01,3.62-8.33,4.6-14.59,5.87-10.76,2.19-37.21,8.22-47.42,16.56-1.63,1.33-2.97,2.65-4.19,3.96-3.72,3.99-6.39,7.92-7.93,10.36-3.22-3.22-7.25-8.48-9.92-16.47Z"/><path d="M83.33,326.72c2.33-11.99,8.91-22.32,15.88-30.38-.27,5.5.05,12.11,1.86,19.08,5.04,19.36,19.74,34.7,37.78,37.78,32.21,9.74,70.59-3.79,99.08-18.29,3.87-1.95,9.52,2.77,11.84,8.16,3.32,7.71,1.63,6.28-2.61,8.49-38.31,20.03-82.01,39.61-123.91,29.7-8.26-1.95-15.96-5.26-23.48-10.54-11.32-7.96-20.21-24.74-16.44-44Z"/><ellipse cx="255.16" cy="258.86" rx="28.95" ry="28.76"/></svg>


commands/brainstorm.md Normal file
View File

@@ -0,0 +1,5 @@
---
description: "Deprecated - use the superpowers:brainstorming skill instead"
---
Tell your human partner that this command is deprecated and will be removed in the next major release. They should ask you to use the "superpowers brainstorming" skill instead.

commands/execute-plan.md Normal file
View File

@@ -0,0 +1,5 @@
---
description: "Deprecated - use the superpowers:executing-plans skill instead"
---
Tell your human partner that this command is deprecated and will be removed in the next major release. They should ask you to use the "superpowers executing-plans" skill instead.

commands/write-plan.md Normal file
View File

@@ -0,0 +1,5 @@
---
description: "Deprecated - use the superpowers:writing-plans skill instead"
---
Tell your human partner that this command is deprecated and will be removed in the next major release. They should ask you to use the "superpowers writing-plans" skill instead.

View File

@@ -12,14 +12,10 @@ Add superpowers to the `plugin` array in your `opencode.json` (global or project
 }
 ```
-Restart OpenCode. The plugin installs through OpenCode's plugin manager and
-registers all skills.
+Restart OpenCode. The plugin auto-installs via Bun and registers all skills automatically.
 Verify by asking: "Tell me about your superpowers"
-OpenCode uses its own plugin install. If you also use Claude Code, Codex, or
-another harness, install Superpowers separately for each one.
 ### Migrating from the old symlink-based install
 If you previously installed superpowers using `git clone` and symlinks, remove the old setup:
@@ -82,10 +78,7 @@ Create project-specific skills in `.opencode/skills/` within your project.
 ## Updating
-OpenCode installs Superpowers through a git-backed package spec. Some OpenCode
-and Bun versions pin that resolved git dependency in a lockfile or cache, so a
-restart may not pick up the newest Superpowers commit. If updates do not appear,
-clear OpenCode's package cache or reinstall the plugin.
+Superpowers updates automatically when you restart OpenCode. The plugin is re-installed from the git repository on each launch.
 To pin a specific version, use a branch or tag:
@@ -119,26 +112,6 @@ Skills written for Claude Code are automatically adapted for OpenCode:
 2. Verify the plugin line in your `opencode.json` is correct
 3. Make sure you're running a recent version of OpenCode
-### Windows install issues
-Some Windows OpenCode builds have upstream installer issues with git-backed
-plugin specs, including cache paths for `git+https` URLs and Bun not finding
-`git.exe` even when it works in a normal terminal. If OpenCode cannot install
-the plugin, try installing with system npm and pointing OpenCode at the local
-package:
-```powershell
-npm install superpowers@git+https://github.com/obra/superpowers.git --prefix "$HOME\.config\opencode"
-```
-Then use the installed package path in `opencode.json`:
-```json
-{
-  "plugin": ["~/.config/opencode/node_modules/superpowers"]
-}
-```
 ### Skills not found
 1. Use OpenCode's `skill` tool to list available skills

View File

@@ -1,6 +1,6 @@
 {
   "name": "superpowers",
   "description": "Core skills library: TDD, debugging, collaboration patterns, and proven techniques",
-  "version": "5.0.7",
+  "version": "5.0.6",
   "contextFileName": "GEMINI.md"
 }

View File

@@ -1,6 +1,6 @@
 {
   "name": "superpowers",
-  "version": "5.0.7",
+  "version": "5.0.6",
   "type": "module",
   "main": ".opencode/plugins/superpowers.js"
 }

View File

@@ -1,220 +0,0 @@
#!/usr/bin/env bash
#
# bump-version.sh — bump version numbers across all declared files,
# with drift detection and repo-wide audit for missed files.
#
# Usage:
# bump-version.sh <new-version> Bump all declared files to new version
# bump-version.sh --check Report current versions (detect drift)
# bump-version.sh --audit Check + grep repo for old version strings
#
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
REPO_ROOT="$(cd "$SCRIPT_DIR/.." && pwd)"
CONFIG="$REPO_ROOT/.version-bump.json"
if [[ ! -f "$CONFIG" ]]; then
echo "error: .version-bump.json not found at $CONFIG" >&2
exit 1
fi
# --- helpers ---
# Read a dotted field path from a JSON file.
# Handles both simple ("version") and nested ("plugins.0.version") paths.
read_json_field() {
local file="$1" field="$2"
# Convert dot-path to jq path: "plugins.0.version" -> .plugins[0].version
local jq_path
jq_path=$(echo "$field" | sed -E 's/\.([0-9]+)/[\1]/g' | sed 's/^/./' | sed 's/\.\././g')
jq -r "$jq_path" "$file"
}
# Write a dotted field path in a JSON file, preserving formatting.
write_json_field() {
local file="$1" field="$2" value="$3"
local jq_path
jq_path=$(echo "$field" | sed -E 's/\.([0-9]+)/[\1]/g' | sed 's/^/./' | sed 's/\.\././g')
local tmp="${file}.tmp"
jq "$jq_path = \"$value\"" "$file" > "$tmp" && mv "$tmp" "$file"
}
# Read the list of declared files from config.
# Outputs lines of "path<TAB>field"
declared_files() {
jq -r '.files[] | "\(.path)\t\(.field)"' "$CONFIG"
}
# Read the audit exclude patterns from config.
audit_excludes() {
jq -r '.audit.exclude[]' "$CONFIG" 2>/dev/null
}
# --- commands ---
cmd_check() {
local has_drift=0
local versions=()
echo "Version check:"
echo ""
while IFS=$'\t' read -r path field; do
local fullpath="$REPO_ROOT/$path"
if [[ ! -f "$fullpath" ]]; then
printf " %-45s MISSING\n" "$path ($field)"
has_drift=1
continue
fi
local ver
ver=$(read_json_field "$fullpath" "$field")
printf " %-45s %s\n" "$path ($field)" "$ver"
versions+=("$ver")
done < <(declared_files)
echo ""
# Check if all versions match
local unique
unique=$(printf '%s\n' "${versions[@]}" | sort -u | wc -l | tr -d ' ')
if [[ "$unique" -gt 1 ]]; then
echo "DRIFT DETECTED — versions are not in sync:"
printf '%s\n' "${versions[@]}" | sort | uniq -c | sort -rn | while read -r count ver; do
echo " $ver ($count files)"
done
has_drift=1
else
echo "All declared files are in sync at ${versions[0]}"
fi
return $has_drift
}
cmd_audit() {
# First run check
cmd_check || true
echo ""
# Determine the current version (most common across declared files)
local current_version
current_version=$(
while IFS=$'\t' read -r path field; do
local fullpath="$REPO_ROOT/$path"
[[ -f "$fullpath" ]] && read_json_field "$fullpath" "$field"
done < <(declared_files) | sort | uniq -c | sort -rn | head -1 | awk '{print $2}'
)
if [[ -z "$current_version" ]]; then
echo "error: could not determine current version" >&2
return 1
fi
echo "Audit: scanning repo for version string '$current_version'..."
echo ""
# Build grep exclude args
local -a exclude_args=()
while IFS= read -r pattern; do
exclude_args+=("--exclude=$pattern" "--exclude-dir=$pattern")
done < <(audit_excludes)
# Also always exclude binary files and .git
exclude_args+=("--exclude-dir=.git" "--exclude-dir=node_modules" "--binary-files=without-match")
# Get list of declared paths for comparison
local -a declared_paths=()
while IFS=$'\t' read -r path _field; do
declared_paths+=("$path")
done < <(declared_files)
# Grep for the version string
local found_undeclared=0
while IFS= read -r match; do
local match_file
match_file=$(echo "$match" | cut -d: -f1)
# Make path relative to repo root
local rel_path="${match_file#$REPO_ROOT/}"
# Check if this file is in the declared list
local is_declared=0
for dp in "${declared_paths[@]}"; do
if [[ "$rel_path" == "$dp" ]]; then
is_declared=1
break
fi
done
if [[ "$is_declared" -eq 0 ]]; then
if [[ "$found_undeclared" -eq 0 ]]; then
echo "UNDECLARED files containing '$current_version':"
found_undeclared=1
fi
echo " $match"
fi
done < <(grep -rn "${exclude_args[@]}" -F "$current_version" "$REPO_ROOT" 2>/dev/null || true)
if [[ "$found_undeclared" -eq 0 ]]; then
echo "No undeclared files contain the version string. All clear."
else
echo ""
echo "Review the above files — if they should be bumped, add them to .version-bump.json"
echo "If they should be skipped, add them to the audit.exclude list."
fi
}
cmd_bump() {
local new_version="$1"
# Validate semver-ish format
if ! echo "$new_version" | grep -qE '^[0-9]+\.[0-9]+\.[0-9]+'; then
echo "error: '$new_version' doesn't look like a version (expected X.Y.Z)" >&2
exit 1
fi
echo "Bumping all declared files to $new_version..."
echo ""
while IFS=$'\t' read -r path field; do
local fullpath="$REPO_ROOT/$path"
if [[ ! -f "$fullpath" ]]; then
echo " SKIP (missing): $path"
continue
fi
local old_ver
old_ver=$(read_json_field "$fullpath" "$field")
write_json_field "$fullpath" "$field" "$new_version"
printf " %-45s %s -> %s\n" "$path ($field)" "$old_ver" "$new_version"
done < <(declared_files)
echo ""
echo "Done. Running audit to check for missed files..."
echo ""
cmd_audit
}
# --- main ---
case "${1:-}" in
--check)
cmd_check
;;
--audit)
cmd_audit
;;
--help|-h|"")
echo "Usage: bump-version.sh <new-version> | --check | --audit"
echo ""
echo " <new-version> Bump all declared files to the given version"
echo " --check Show current versions, detect drift"
echo " --audit Check + scan repo for undeclared version references"
exit 0
;;
--*)
echo "error: unknown flag '$1'" >&2
exit 1
;;
*)
cmd_bump "$1"
;;
esac
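The dot-path to jq-path conversion inside `read_json_field`/`write_json_field` above is the subtlest part of this (now-removed) script. A standalone sketch of how that sed chain behaves, using the same pipeline as the helpers:

```shell
# Convert a dotted field path to a jq path, exactly as the helpers do:
# numeric segments become array indices, then a leading "." is prepended.
field="plugins.0.version"
jq_path=$(echo "$field" | sed -E 's/\.([0-9]+)/[\1]/g' | sed 's/^/./' | sed 's/\.\././g')
echo "$jq_path"   # .plugins[0].version

# A simple field passes through with just the leading dot added:
echo "version" | sed -E 's/\.([0-9]+)/[\1]/g' | sed 's/^/./' | sed 's/\.\././g'   # .version
```

The final `s/\.\././g` pass collapses any accidental double dots so the result is always a valid jq path expression.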

View File

@@ -1,462 +0,0 @@
#!/usr/bin/env bash
#
# sync-to-codex-plugin.sh
#
# Sync this superpowers checkout → prime-radiant-inc/openai-codex-plugins.
# Clones the fork fresh into a temp dir, rsyncs tracked upstream plugin content
# (including committed Codex files under .codex-plugin/ and assets/), preserves
# OpenAI-owned marketplace metadata already in the destination plugin, commits,
# pushes a sync branch, and opens a PR.
# Path/user agnostic — auto-detects upstream from script location.
#
# Deterministic: running twice against the same upstream SHA produces PRs with
# identical diffs, so two back-to-back runs can verify the tool itself.
#
# Usage:
# ./scripts/sync-to-codex-plugin.sh # full run
# ./scripts/sync-to-codex-plugin.sh -n # dry run
# ./scripts/sync-to-codex-plugin.sh -y # skip confirm
# ./scripts/sync-to-codex-plugin.sh --local PATH # existing checkout
# ./scripts/sync-to-codex-plugin.sh --base BRANCH # default: main
# ./scripts/sync-to-codex-plugin.sh --bootstrap # create plugin dir if missing
#
# Bootstrap mode: skips the "plugin must exist on base" requirement and creates
# plugins/superpowers/ when absent, then copies the tracked plugin files from
# upstream just like a normal sync.
#
# Requires: bash, rsync, git, gh (authenticated), python3.
set -euo pipefail
# =============================================================================
# Config — edit as upstream or canonical plugin shape evolves
# =============================================================================
FORK="prime-radiant-inc/openai-codex-plugins"
DEFAULT_BASE="main"
DEST_REL="plugins/superpowers"
# Paths in upstream that should NOT land in the embedded plugin.
# All patterns use a leading "/" to anchor them to the source root.
# Unanchored patterns like "scripts/" would match any directory named
# "scripts" at any depth — including legitimate nested dirs like
# skills/brainstorming/scripts/. Anchoring prevents that.
# (.DS_Store is intentionally unanchored — Finder creates them everywhere.)
EXCLUDES=(
# Dotfiles and infra — top-level only
"/.claude/"
"/.claude-plugin/"
"/.codex/"
"/.cursor-plugin/"
"/.git/"
"/.gitattributes"
"/.github/"
"/.gitignore"
"/.opencode/"
"/.version-bump.json"
"/.worktrees/"
".DS_Store"
# Root ceremony files
"/AGENTS.md"
"/CHANGELOG.md"
"/CLAUDE.md"
"/GEMINI.md"
"/RELEASE-NOTES.md"
"/gemini-extension.json"
"/package.json"
# Directories not shipped by canonical Codex plugins
"/commands/"
"/docs/"
"/hooks/"
"/lib/"
"/scripts/"
"/tests/"
"/tmp/"
)
# =============================================================================
# Ignored-path helpers
# =============================================================================
IGNORED_DIR_EXCLUDES=()
path_has_directory_exclude() {
local path="$1"
local dir
if [[ ${#IGNORED_DIR_EXCLUDES[@]} -eq 0 ]]; then
return 1
fi
for dir in "${IGNORED_DIR_EXCLUDES[@]}"; do
[[ "$path" == "$dir"* ]] && return 0
done
return 1
}
ignored_directory_has_tracked_descendants() {
local path="$1"
[[ -n "$(git -C "$UPSTREAM" ls-files --cached -- "$path/")" ]]
}
append_git_ignored_directory_excludes() {
local path
local lookup_path
while IFS= read -r -d '' path; do
[[ "$path" == */ ]] || continue
lookup_path="${path%/}"
if ! ignored_directory_has_tracked_descendants "$lookup_path"; then
IGNORED_DIR_EXCLUDES+=("$path")
RSYNC_ARGS+=(--exclude="/$path")
fi
done < <(git -C "$UPSTREAM" ls-files --others --ignored --exclude-standard --directory -z)
}
append_git_ignored_file_excludes() {
local path
while IFS= read -r -d '' path; do
path_has_directory_exclude "$path" && continue
RSYNC_ARGS+=(--exclude="/$path")
done < <(git -C "$UPSTREAM" ls-files --others --ignored --exclude-standard -z)
}
# =============================================================================
# Args
# =============================================================================
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
UPSTREAM="$(cd "$SCRIPT_DIR/.." && pwd)"
BASE="$DEFAULT_BASE"
DRY_RUN=0
YES=0
LOCAL_CHECKOUT=""
BOOTSTRAP=0
usage() {
sed -n '/^# Usage:/,/^# Requires:/s/^# \{0,1\}//p' "$0"
exit "${1:-0}"
}
while [[ $# -gt 0 ]]; do
case "$1" in
-n|--dry-run) DRY_RUN=1; shift ;;
-y|--yes) YES=1; shift ;;
--local) LOCAL_CHECKOUT="$2"; shift 2 ;;
--base) BASE="$2"; shift 2 ;;
--bootstrap) BOOTSTRAP=1; shift ;;
-h|--help) usage 0 ;;
*) echo "Unknown arg: $1" >&2; usage 2 ;;
esac
done
# =============================================================================
# Preflight
# =============================================================================
die() { echo "ERROR: $*" >&2; exit 1; }
command -v rsync >/dev/null || die "rsync not found in PATH"
command -v git >/dev/null || die "git not found in PATH"
command -v gh >/dev/null || die "gh not found — install GitHub CLI"
command -v python3 >/dev/null || die "python3 not found in PATH"
gh auth status >/dev/null 2>&1 || die "gh not authenticated — run 'gh auth login'"
[[ -d "$UPSTREAM/.git" ]] || die "upstream '$UPSTREAM' is not a git checkout"
[[ -f "$UPSTREAM/.codex-plugin/plugin.json" ]] || die "committed Codex manifest missing at $UPSTREAM/.codex-plugin/plugin.json"
# Read the upstream version from the committed Codex manifest.
UPSTREAM_VERSION="$(python3 -c 'import json,sys; print(json.load(open(sys.argv[1]))["version"])' "$UPSTREAM/.codex-plugin/plugin.json")"
[[ -n "$UPSTREAM_VERSION" ]] || die "could not read 'version' from committed Codex manifest"
UPSTREAM_BRANCH="$(cd "$UPSTREAM" && git branch --show-current)"
UPSTREAM_SHA="$(cd "$UPSTREAM" && git rev-parse HEAD)"
UPSTREAM_SHORT="$(cd "$UPSTREAM" && git rev-parse --short HEAD)"
confirm() {
[[ $YES -eq 1 ]] && return 0
read -rp "$1 [y/N] " ans
[[ "$ans" == "y" || "$ans" == "Y" ]]
}
if [[ "$UPSTREAM_BRANCH" != "main" ]]; then
echo "WARNING: upstream is on '$UPSTREAM_BRANCH', not 'main'"
confirm "Sync from '$UPSTREAM_BRANCH' anyway?" || exit 1
fi
UPSTREAM_STATUS="$(cd "$UPSTREAM" && git status --porcelain)"
if [[ -n "$UPSTREAM_STATUS" ]]; then
echo "WARNING: upstream has uncommitted changes:"
echo "$UPSTREAM_STATUS" | sed 's/^/ /'
echo "Sync will use working-tree state, not HEAD ($UPSTREAM_SHORT)."
confirm "Continue anyway?" || exit 1
fi
# =============================================================================
# Prepare destination (clone fork fresh, or use --local)
# =============================================================================
CLEANUP_DIR=""
cleanup() {
if [[ -n "$CLEANUP_DIR" ]]; then
rm -rf "$CLEANUP_DIR"
fi
}
trap cleanup EXIT
if [[ -n "$LOCAL_CHECKOUT" ]]; then
DEST_REPO="$(cd "$LOCAL_CHECKOUT" && pwd)"
[[ -d "$DEST_REPO/.git" ]] || die "--local path '$DEST_REPO' is not a git checkout"
else
echo "Cloning $FORK..."
CLEANUP_DIR="$(mktemp -d)"
DEST_REPO="$CLEANUP_DIR/openai-codex-plugins"
gh repo clone "$FORK" "$DEST_REPO" >/dev/null
fi
DEST="$DEST_REPO/$DEST_REL"
PREVIEW_REPO="$DEST_REPO"
PREVIEW_DEST="$DEST"
SYNC_SOURCE=""
overlay_destination_paths() {
local repo="$1"
local path
local source_path
local preview_path
while IFS= read -r -d '' path; do
source_path="$repo/$path"
preview_path="$PREVIEW_REPO/$path"
if [[ -e "$source_path" ]]; then
mkdir -p "$(dirname "$preview_path")"
cp -R "$source_path" "$preview_path"
else
rm -rf "$preview_path"
fi
done
}
copy_local_destination_overlay() {
overlay_destination_paths "$DEST_REPO" < <(
git -C "$DEST_REPO" diff --name-only -z -- "$DEST_REL"
)
overlay_destination_paths "$DEST_REPO" < <(
git -C "$DEST_REPO" diff --cached --name-only -z -- "$DEST_REL"
)
overlay_destination_paths "$DEST_REPO" < <(
git -C "$DEST_REPO" ls-files --others --exclude-standard -z -- "$DEST_REL"
)
overlay_destination_paths "$DEST_REPO" < <(
git -C "$DEST_REPO" ls-files --others --ignored --exclude-standard -z -- "$DEST_REL"
)
}
local_checkout_has_uncommitted_destination_changes() {
[[ -n "$(git -C "$DEST_REPO" status --porcelain=1 --untracked-files=all --ignored=matching -- "$DEST_REL")" ]]
}
prepare_preview_checkout() {
if [[ -n "$LOCAL_CHECKOUT" ]]; then
[[ -n "$CLEANUP_DIR" ]] || CLEANUP_DIR="$(mktemp -d)"
PREVIEW_REPO="$CLEANUP_DIR/preview"
git clone -q --no-local "$DEST_REPO" "$PREVIEW_REPO"
PREVIEW_DEST="$PREVIEW_REPO/$DEST_REL"
fi
git -C "$PREVIEW_REPO" checkout -q "$BASE" 2>/dev/null || die "base branch '$BASE' doesn't exist in $FORK"
if [[ -n "$LOCAL_CHECKOUT" ]]; then
copy_local_destination_overlay
fi
if [[ $BOOTSTRAP -ne 1 ]]; then
[[ -d "$PREVIEW_DEST" ]] || die "base branch '$BASE' has no '$DEST_REL/' — use --bootstrap, or pass --base <branch>"
fi
}
prepare_apply_checkout() {
git -C "$DEST_REPO" checkout -q "$BASE" 2>/dev/null || die "base branch '$BASE' doesn't exist in $FORK"
if [[ $BOOTSTRAP -ne 1 ]]; then
[[ -d "$DEST" ]] || die "base branch '$BASE' has no '$DEST_REL/' — use --bootstrap, or pass --base <branch>"
fi
}
apply_to_preview_checkout() {
if [[ $BOOTSTRAP -eq 1 ]]; then
mkdir -p "$PREVIEW_DEST"
fi
rsync "${RSYNC_ARGS[@]}" "$SYNC_SOURCE/" "$PREVIEW_DEST/"
}
preview_checkout_has_changes() {
[[ -n "$(git -C "$PREVIEW_REPO" status --porcelain "$DEST_REL")" ]]
}
prepare_preview_checkout
TIMESTAMP="$(date -u +%Y%m%d-%H%M%S)"
if [[ $BOOTSTRAP -eq 1 ]]; then
SYNC_BRANCH="bootstrap/superpowers-${UPSTREAM_SHORT}-${TIMESTAMP}"
else
SYNC_BRANCH="sync/superpowers-${UPSTREAM_SHORT}-${TIMESTAMP}"
fi
# =============================================================================
# Build rsync args
# =============================================================================
RSYNC_ARGS=(-av --delete --delete-excluded)
for pat in "${EXCLUDES[@]}"; do RSYNC_ARGS+=(--exclude="$pat"); done
append_git_ignored_directory_excludes
append_git_ignored_file_excludes
copy_preserved_destination_metadata() {
local destination="$1"
local source="$2"
local path
local rel
[[ -d "$destination/skills" ]] || return 0
while IFS= read -r -d '' path; do
rel="${path#"$destination"/}"
mkdir -p "$source/$(dirname "$rel")"
cp -p "$path" "$source/$rel"
done < <(find "$destination/skills" -path '*/agents/openai.yaml' -type f -print0)
}
prepare_sync_source() {
local destination="$1"
[[ -n "$CLEANUP_DIR" ]] || CLEANUP_DIR="$(mktemp -d)"
SYNC_SOURCE="$CLEANUP_DIR/source-overlay"
rm -rf "$SYNC_SOURCE"
mkdir -p "$SYNC_SOURCE"
rsync "${RSYNC_ARGS[@]}" "$UPSTREAM/" "$SYNC_SOURCE/" >/dev/null
copy_preserved_destination_metadata "$destination" "$SYNC_SOURCE"
}
prepare_sync_source "$PREVIEW_DEST"
# =============================================================================
# Dry run preview (always shown)
# =============================================================================
echo ""
echo "Upstream: $UPSTREAM ($UPSTREAM_BRANCH @ $UPSTREAM_SHORT)"
echo "Version: $UPSTREAM_VERSION"
echo "Fork: $FORK"
echo "Base: $BASE"
echo "Branch: $SYNC_BRANCH"
if [[ $BOOTSTRAP -eq 1 ]]; then
echo "Mode: BOOTSTRAP (creating plugins/superpowers/ when absent)"
fi
echo ""
echo "=== Preview (rsync --dry-run) ==="
rsync "${RSYNC_ARGS[@]}" --dry-run --itemize-changes "$SYNC_SOURCE/" "$PREVIEW_DEST/"
echo "=== End preview ==="
echo ""
if [[ $DRY_RUN -eq 1 ]]; then
echo ""
echo "Dry run only. Nothing was changed or pushed."
exit 0
fi
# =============================================================================
# Apply
# =============================================================================
echo ""
confirm "Apply changes, push branch, and open PR?" || { echo "Aborted."; exit 1; }
echo ""
if [[ -n "$LOCAL_CHECKOUT" ]]; then
if local_checkout_has_uncommitted_destination_changes; then
die "local checkout has uncommitted changes under '$DEST_REL' — commit, stash, or discard them before syncing"
fi
apply_to_preview_checkout
if ! preview_checkout_has_changes; then
echo "No changes — embedded plugin was already in sync with upstream $UPSTREAM_SHORT (v$UPSTREAM_VERSION)."
exit 0
fi
fi
prepare_apply_checkout
cd "$DEST_REPO"
git checkout -q -b "$SYNC_BRANCH"
echo "Syncing upstream content..."
if [[ $BOOTSTRAP -eq 1 ]]; then
mkdir -p "$DEST"
fi
rsync "${RSYNC_ARGS[@]}" "$SYNC_SOURCE/" "$DEST/"
# Bail early if nothing actually changed
cd "$DEST_REPO"
if [[ -z "$(git status --porcelain "$DEST_REL")" ]]; then
echo "No changes — embedded plugin was already in sync with upstream $UPSTREAM_SHORT (v$UPSTREAM_VERSION)."
exit 0
fi
# =============================================================================
# Commit, push, open PR
# =============================================================================
git add "$DEST_REL"
if [[ $BOOTSTRAP -eq 1 ]]; then
COMMIT_TITLE="bootstrap superpowers v$UPSTREAM_VERSION from upstream main @ $UPSTREAM_SHORT"
PR_BODY="Initial bootstrap of the superpowers plugin from upstream \`main\` @ \`$UPSTREAM_SHORT\` (v$UPSTREAM_VERSION).
Creates \`plugins/superpowers/\` by copying the tracked plugin files from upstream, including \`.codex-plugin/plugin.json\` and \`assets/\`.
Run via: \`scripts/sync-to-codex-plugin.sh --bootstrap\`
Upstream commit: https://github.com/obra/superpowers/commit/$UPSTREAM_SHA
This is a one-time bootstrap. Subsequent syncs will be normal (non-bootstrap) runs using the same tracked upstream plugin files."
else
COMMIT_TITLE="sync superpowers v$UPSTREAM_VERSION from upstream main @ $UPSTREAM_SHORT"
PR_BODY="Automated sync from superpowers upstream \`main\` @ \`$UPSTREAM_SHORT\` (v$UPSTREAM_VERSION).
Copies the tracked plugin files from upstream, including the committed Codex manifest and assets.
Run via: \`scripts/sync-to-codex-plugin.sh\`
Upstream commit: https://github.com/obra/superpowers/commit/$UPSTREAM_SHA
Running the sync tool again against the same upstream SHA should produce a PR with an identical diff — use that to verify the tool is behaving."
fi
git commit --quiet -m "$COMMIT_TITLE
Automated sync via scripts/sync-to-codex-plugin.sh
Upstream: https://github.com/obra/superpowers/commit/$UPSTREAM_SHA
Branch: $SYNC_BRANCH"
echo "Pushing $SYNC_BRANCH to $FORK..."
git push -u origin "$SYNC_BRANCH" --quiet
echo "Opening PR..."
PR_URL="$(gh pr create \
--repo "$FORK" \
--base "$BASE" \
--head "$SYNC_BRANCH" \
--title "$COMMIT_TITLE" \
--body "$PR_BODY")"
PR_NUM="${PR_URL##*/}"
DIFF_URL="https://github.com/$FORK/pull/$PR_NUM/files"
echo ""
echo "PR opened: $PR_URL"
echo "Diff view: $DIFF_URL"
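The anchored-vs-unanchored exclude distinction described in the script's `EXCLUDES` comment can be checked in isolation. A minimal sketch (throwaway temp dirs; requires rsync on PATH):

```shell
# Anchored "/scripts/" excludes only the top-level scripts/ directory;
# an unanchored "scripts/" would also drop nested ones like
# skills/brainstorming/scripts/.
src=$(mktemp -d); dst=$(mktemp -d)
mkdir -p "$src/scripts" "$src/skills/brainstorming/scripts"
touch "$src/scripts/top.sh" "$src/skills/brainstorming/scripts/nested.sh"
rsync -a --exclude="/scripts/" "$src/" "$dst/"
[ ! -e "$dst/scripts" ] && echo "top-level excluded"
[ -e "$dst/skills/brainstorming/scripts/nested.sh" ] && echo "nested kept"
rm -rf "$src" "$dst"
```

In rsync's filter rules, a pattern with a leading `/` is anchored to the root of the transfer, which is why the script anchors everything except `.DS_Store`.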

View File

@@ -249,3 +249,12 @@ git worktree prune # Self-healing: clean up any stale registrations
 - Clean up worktree for Options 1 & 4 only
 - `cd` to main repo root before worktree removal
 - Run `git worktree prune` after removal
+## Integration
+**Called by:**
+- **subagent-driven-development** (Step 7) - After all tasks complete
+- **executing-plans** (Step 5) - After all batches complete
+**Pairs with:**
+- **using-git-worktrees** - Cleans up worktree created by that skill
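The cleanup checklist above (remove from the main repo root, then prune) can be exercised against a throwaway repo; the branch name and `.worktrees/` layout here are illustrative, not prescribed:

```shell
# Build a throwaway repo with one extra worktree, then run the cleanup flow.
root=$(mktemp -d)
git init -q -b main "$root/repo"
git -C "$root/repo" -c user.name=t -c user.email=t@example.com \
  commit -q --allow-empty -m init
git -C "$root/repo" worktree add -q -b feat-x "$root/repo/.worktrees/feat-x"
# Cleanup: remove the worktree from the repo root, then prune stale entries.
git -C "$root/repo" worktree remove .worktrees/feat-x
git -C "$root/repo" worktree prune
git -C "$root/repo" worktree list   # back to just the main worktree
rm -rf "$root"
```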

View File

@@ -82,7 +82,7 @@ You: [Fix progress indicators]
 - Fix before moving to next task
 **Executing Plans:**
-- Review after each task or at natural checkpoints
+- Review after each batch (3 tasks)
 - Get feedback, apply, continue
 **Ad-Hoc Development:**

View File

@@ -11,8 +11,6 @@ Execute plan by dispatching fresh subagent per task, with two-stage review after
 **Core principle:** Fresh subagent per task + two-stage review (spec then quality) = high quality, fast iteration
-**Continuous execution:** Do not pause to check in with your human partner between tasks. Execute all tasks from the plan without stopping. The only reasons to stop are: BLOCKED status you cannot resolve, ambiguity that genuinely prevents progress, or all tasks complete. "Should I continue?" prompts and progress summaries waste their time — they asked you to execute the plan, so execute it.
 ## When to Use
 ```dot

View File

@@ -168,6 +168,7 @@ Ready to implement <feature-name>
 | Permission error on create | Sandbox fallback, work in place |
 | Tests fail during baseline | Report failures + ask |
 | No package.json/Cargo.toml | Skip dependency install |
+| Plan touches multiple repos | Create a matching worktree in each repo, same branch name |
 ## Common Mistakes
@@ -213,3 +214,13 @@ Ready to implement <feature-name>
 - Verify directory is ignored for project-local
 - Auto-detect and run project setup
 - Verify clean test baseline
+## Integration
+**Called by:**
+- **subagent-driven-development** - Ensures isolated workspace (creates one or verifies existing)
+- **executing-plans** - Ensures isolated workspace (creates one or verifies existing)
+- Any skill needing isolated workspace
+**Pairs with:**
+- **finishing-a-development-branch** - REQUIRED for cleanup after work complete
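The new multi-repo row above ("matching worktree in each repo, same branch name") can be sketched end-to-end with two throwaway repos; the repo names `app` and `shared-lib` are hypothetical:

```shell
# Create a matching worktree, under the same branch name, in every
# repo the plan touches.
branch="feat-multi-repo"
root=$(mktemp -d)
for repo in "$root/app" "$root/shared-lib"; do
  git init -q -b main "$repo"
  git -C "$repo" -c user.name=t -c user.email=t@example.com \
    commit -q --allow-empty -m init
  git -C "$repo" worktree add -q -b "$branch" "$repo/.worktrees/$branch"
done
# Both repos now have the same branch checked out in a worktree.
git -C "$root/app" branch --list "$branch"
git -C "$root/shared-lib" branch --list "$branch"
rm -rf "$root"
```

Matching branch names keep cross-repo changes easy to correlate when each repo's branch is pushed and reviewed separately.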

View File

@@ -1,615 +0,0 @@
#!/usr/bin/env bash
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
REPO_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
SYNC_SCRIPT_SOURCE="$REPO_ROOT/scripts/sync-to-codex-plugin.sh"
BASH_UNDER_TEST="/bin/bash"
PACKAGE_VERSION="1.2.3"
MANIFEST_VERSION="9.8.7"
FAILURES=0
TEST_ROOT=""
pass() {
echo " [PASS] $1"
}
fail() {
echo " [FAIL] $1"
FAILURES=$((FAILURES + 1))
}
assert_equals() {
local actual="$1"
local expected="$2"
local description="$3"
if [[ "$actual" == "$expected" ]]; then
pass "$description"
else
fail "$description"
echo " expected: $expected"
echo " actual: $actual"
fi
}
assert_contains() {
local haystack="$1"
local needle="$2"
local description="$3"
if printf '%s' "$haystack" | grep -Fq -- "$needle"; then
pass "$description"
else
fail "$description"
echo " expected to find: $needle"
fi
}
assert_not_contains() {
local haystack="$1"
local needle="$2"
local description="$3"
if printf '%s' "$haystack" | grep -Fq -- "$needle"; then
fail "$description"
echo " did not expect to find: $needle"
else
pass "$description"
fi
}
assert_matches() {
local haystack="$1"
local pattern="$2"
local description="$3"
if printf '%s' "$haystack" | grep -Eq -- "$pattern"; then
pass "$description"
else
fail "$description"
echo " expected to match: $pattern"
fi
}
assert_not_matches() {
local haystack="$1"
local pattern="$2"
local description="$3"
if printf '%s' "$haystack" | grep -Eq -- "$pattern"; then
fail "$description"
echo " did not expect to match: $pattern"
else
pass "$description"
fi
}
assert_path_absent() {
local path="$1"
local description="$2"
if [[ ! -e "$path" ]]; then
pass "$description"
else
fail "$description"
echo " did not expect path to exist: $path"
fi
}
assert_branch_absent() {
local repo="$1"
local pattern="$2"
local description="$3"
local branches
branches="$(git -C "$repo" branch --list "$pattern")"
if [[ -z "$branches" ]]; then
pass "$description"
else
fail "$description"
echo " did not expect matching branches:"
echo "$branches" | sed 's/^/ /'
fi
}
assert_current_branch() {
local repo="$1"
local expected="$2"
local description="$3"
local actual
actual="$(git -C "$repo" branch --show-current)"
assert_equals "$actual" "$expected" "$description"
}
assert_file_equals() {
local path="$1"
local expected="$2"
local description="$3"
local actual
actual="$(cat "$path")"
assert_equals "$actual" "$expected" "$description"
}
cleanup() {
if [[ -n "$TEST_ROOT" && -d "$TEST_ROOT" ]]; then
rm -rf "$TEST_ROOT"
fi
}
configure_git_identity() {
local repo="$1"
git -C "$repo" config user.name "Test Bot"
git -C "$repo" config user.email "test@example.com"
}
init_repo() {
local repo="$1"
git init -q -b main "$repo"
configure_git_identity "$repo"
}
commit_fixture() {
local repo="$1"
local message="$2"
git -C "$repo" commit -q -m "$message"
}
checkout_fixture_branch() {
local repo="$1"
local branch="$2"
git -C "$repo" checkout -q -b "$branch"
}
write_upstream_fixture() {
local repo="$1"
local with_pure_ignored="${2:-1}"
mkdir -p \
"$repo/.codex-plugin" \
"$repo/.private-journal" \
"$repo/assets" \
"$repo/scripts" \
"$repo/skills/example"
if [[ "$with_pure_ignored" == "1" ]]; then
mkdir -p "$repo/ignored-cache/tmp"
fi
cp "$SYNC_SCRIPT_SOURCE" "$repo/scripts/sync-to-codex-plugin.sh"
cat > "$repo/package.json" <<EOF
{
"name": "fixture-upstream",
"version": "$PACKAGE_VERSION"
}
EOF
cat > "$repo/.gitignore" <<'EOF'
.private-journal/
EOF
if [[ "$with_pure_ignored" == "1" ]]; then
cat >> "$repo/.gitignore" <<'EOF'
ignored-cache/
EOF
fi
cat > "$repo/.codex-plugin/plugin.json" <<EOF
{
"name": "superpowers",
"version": "$MANIFEST_VERSION"
}
EOF
cat > "$repo/assets/superpowers-small.svg" <<'EOF'
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 1 1"></svg>
EOF
printf 'png fixture\n' > "$repo/assets/app-icon.png"
cat > "$repo/skills/example/SKILL.md" <<'EOF'
# Example Skill
Fixture content.
EOF
printf 'tracked keep\n' > "$repo/.private-journal/keep.txt"
printf 'ignored leak\n' > "$repo/.private-journal/leak.txt"
if [[ "$with_pure_ignored" == "1" ]]; then
printf 'ignored cache state\n' > "$repo/ignored-cache/tmp/state.json"
fi
git -C "$repo" add \
.codex-plugin/plugin.json \
.gitignore \
assets/app-icon.png \
assets/superpowers-small.svg \
package.json \
scripts/sync-to-codex-plugin.sh \
skills/example/SKILL.md
git -C "$repo" add -f .private-journal/keep.txt
commit_fixture "$repo" "Initial upstream fixture"
}
write_destination_fixture() {
local repo="$1"
mkdir -p "$repo/plugins/superpowers/skills/example"
printf 'fixture keep\n' > "$repo/plugins/superpowers/.fixture-keep"
cat > "$repo/plugins/superpowers/skills/example/SKILL.md" <<'EOF'
# Example Skill
Fixture content.
EOF
git -C "$repo" add plugins/superpowers/.fixture-keep
git -C "$repo" add plugins/superpowers/skills/example/SKILL.md
commit_fixture "$repo" "Initial destination fixture"
}
add_openai_agent_metadata_fixture() {
local repo="$1"
mkdir -p "$repo/plugins/superpowers/skills/example/agents"
cat > "$repo/plugins/superpowers/skills/example/agents/openai.yaml" <<'EOF'
interface:
display_name: "Example"
short_description: "Destination-owned OpenAI metadata"
EOF
git -C "$repo" add plugins/superpowers/skills/example/agents/openai.yaml
commit_fixture "$repo" "Add OpenAI agent metadata fixture"
}
dirty_tracked_destination_skill() {
local repo="$1"
cat > "$repo/plugins/superpowers/skills/example/SKILL.md" <<'EOF'
# Example Skill
Locally modified fixture content.
EOF
}
write_synced_destination_fixture() {
local repo="$1"
mkdir -p \
"$repo/plugins/superpowers/.codex-plugin" \
"$repo/plugins/superpowers/.private-journal" \
"$repo/plugins/superpowers/assets" \
"$repo/plugins/superpowers/skills/example/agents" \
"$repo/plugins/superpowers/skills/example"
cat > "$repo/plugins/superpowers/.codex-plugin/plugin.json" <<EOF
{
"name": "superpowers",
"version": "$MANIFEST_VERSION"
}
EOF
cat > "$repo/plugins/superpowers/assets/superpowers-small.svg" <<'EOF'
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 1 1"></svg>
EOF
printf 'png fixture\n' > "$repo/plugins/superpowers/assets/app-icon.png"
cat > "$repo/plugins/superpowers/skills/example/SKILL.md" <<'EOF'
# Example Skill
Fixture content.
EOF
cat > "$repo/plugins/superpowers/skills/example/agents/openai.yaml" <<'EOF'
interface:
display_name: "Example"
short_description: "Destination-owned OpenAI metadata"
EOF
printf 'tracked keep\n' > "$repo/plugins/superpowers/.private-journal/keep.txt"
git -C "$repo" add \
plugins/superpowers/.codex-plugin/plugin.json \
plugins/superpowers/assets/app-icon.png \
plugins/superpowers/assets/superpowers-small.svg \
plugins/superpowers/skills/example/agents/openai.yaml \
plugins/superpowers/skills/example/SKILL.md \
plugins/superpowers/.private-journal/keep.txt
commit_fixture "$repo" "Initial synced destination fixture"
}
write_stale_ignored_destination_fixture() {
local repo="$1"
mkdir -p "$repo/plugins/superpowers/.private-journal"
printf 'fixture keep\n' > "$repo/plugins/superpowers/.fixture-keep"
printf 'stale ignored leak\n' > "$repo/plugins/superpowers/.private-journal/leak.txt"
git -C "$repo" add plugins/superpowers/.fixture-keep
commit_fixture "$repo" "Initial stale ignored destination fixture"
}
write_fake_gh() {
local bin_dir="$1"
mkdir -p "$bin_dir"
cat > "$bin_dir/gh" <<'EOF'
#!/usr/bin/env bash
set -euo pipefail
if [[ "${1:-}" == "auth" && "${2:-}" == "status" ]]; then
exit 0
fi
echo "unexpected gh invocation: $*" >&2
exit 1
EOF
chmod +x "$bin_dir/gh"
}
run_preview() {
  local upstream="$1"
  local dest="$2"
  local fake_bin="$3"
  PATH="$fake_bin:$PATH" "$BASH_UNDER_TEST" "$upstream/scripts/sync-to-codex-plugin.sh" -n --local "$dest" 2>&1
}

run_bootstrap_preview() {
  local upstream="$1"
  local dest="$2"
  local fake_bin="$3"
  PATH="$fake_bin:$PATH" "$BASH_UNDER_TEST" "$upstream/scripts/sync-to-codex-plugin.sh" -n --bootstrap --local "$dest" 2>&1
}

run_preview_without_manifest() {
  local upstream="$1"
  local dest="$2"
  local fake_bin="$3"
  rm -f "$upstream/.codex-plugin/plugin.json"
  PATH="$fake_bin:$PATH" "$BASH_UNDER_TEST" "$upstream/scripts/sync-to-codex-plugin.sh" -n --local "$dest" 2>&1
}

run_preview_with_stale_ignored_destination() {
  local upstream="$1"
  local dest="$2"
  local fake_bin="$3"
  PATH="$fake_bin:$PATH" "$BASH_UNDER_TEST" "$upstream/scripts/sync-to-codex-plugin.sh" -n --local "$dest" 2>&1
}

run_apply() {
  local upstream="$1"
  local dest="$2"
  local fake_bin="$3"
  PATH="$fake_bin:$PATH" "$BASH_UNDER_TEST" "$upstream/scripts/sync-to-codex-plugin.sh" -y --local "$dest" 2>&1
}

run_help() {
  local upstream="$1"
  local fake_bin="$2"
  PATH="$fake_bin:$PATH" "$BASH_UNDER_TEST" "$upstream/scripts/sync-to-codex-plugin.sh" --help 2>&1
}

write_bootstrap_destination_fixture() {
  local repo="$1"
  printf 'bootstrap fixture\n' > "$repo/README.md"
  git -C "$repo" add README.md
  commit_fixture "$repo" "Initial bootstrap destination fixture"
}
main() {
  local upstream
  local mixed_only_upstream
  local dest
  local dest_branch
  local mixed_only_dest
  local stale_dest
  local dirty_apply_dest
  local dirty_apply_dest_branch
  local noop_apply_dest
  local noop_apply_dest_branch
  local fake_bin
  local bootstrap_dest
  local bootstrap_dest_branch
  local preview_status
  local preview_output
  local preview_section
  local bootstrap_status
  local bootstrap_output
  local missing_manifest_status
  local missing_manifest_output
  local mixed_only_status
  local mixed_only_output
  local stale_preview_status
  local stale_preview_output
  local stale_preview_section
  local dirty_apply_status
  local dirty_apply_output
  local noop_apply_status
  local noop_apply_output
  local help_output
  local script_source
  local dirty_skill_path
  local noop_openai_metadata_path

  echo "=== Test: sync-to-codex-plugin dry-run regression ==="

  TEST_ROOT="$(mktemp -d)"
  trap cleanup EXIT

  upstream="$TEST_ROOT/upstream"
  mixed_only_upstream="$TEST_ROOT/mixed-only-upstream"
  dest="$TEST_ROOT/destination"
  mixed_only_dest="$TEST_ROOT/mixed-only-destination"
  stale_dest="$TEST_ROOT/stale-destination"
  dirty_apply_dest="$TEST_ROOT/dirty-apply-destination"
  dirty_apply_dest_branch="fixture/dirty-apply-target"
  noop_apply_dest="$TEST_ROOT/noop-apply-destination"
  noop_apply_dest_branch="fixture/noop-apply-target"
  bootstrap_dest="$TEST_ROOT/bootstrap-destination"
  dest_branch="fixture/preview-target"
  bootstrap_dest_branch="fixture/bootstrap-preview-target"
  fake_bin="$TEST_ROOT/bin"

  init_repo "$upstream"
  write_upstream_fixture "$upstream"
  init_repo "$mixed_only_upstream"
  write_upstream_fixture "$mixed_only_upstream" 0
  init_repo "$dest"
  write_destination_fixture "$dest"
  add_openai_agent_metadata_fixture "$dest"
  checkout_fixture_branch "$dest" "$dest_branch"
  dirty_tracked_destination_skill "$dest"
  init_repo "$mixed_only_dest"
  write_destination_fixture "$mixed_only_dest"
  init_repo "$stale_dest"
  write_stale_ignored_destination_fixture "$stale_dest"
  init_repo "$dirty_apply_dest"
  write_synced_destination_fixture "$dirty_apply_dest"
  checkout_fixture_branch "$dirty_apply_dest" "$dirty_apply_dest_branch"
  dirty_tracked_destination_skill "$dirty_apply_dest"
  init_repo "$noop_apply_dest"
  write_synced_destination_fixture "$noop_apply_dest"
  checkout_fixture_branch "$noop_apply_dest" "$noop_apply_dest_branch"
  init_repo "$bootstrap_dest"
  write_bootstrap_destination_fixture "$bootstrap_dest"
  checkout_fixture_branch "$bootstrap_dest" "$bootstrap_dest_branch"
  write_fake_gh "$fake_bin"

  # This regression test is about dry-run content, so capture the preview
  # output even if the current script exits nonzero in --local mode.
  set +e
  preview_output="$(run_preview "$upstream" "$dest" "$fake_bin")"
  preview_status=$?
  bootstrap_output="$(run_bootstrap_preview "$upstream" "$bootstrap_dest" "$fake_bin")"
  bootstrap_status=$?
  mixed_only_output="$(run_preview "$mixed_only_upstream" "$mixed_only_dest" "$fake_bin")"
  mixed_only_status=$?
  stale_preview_output="$(run_preview_with_stale_ignored_destination "$upstream" "$stale_dest" "$fake_bin")"
  stale_preview_status=$?
  dirty_apply_output="$(run_apply "$upstream" "$dirty_apply_dest" "$fake_bin")"
  dirty_apply_status=$?
  noop_apply_output="$(run_apply "$upstream" "$noop_apply_dest" "$fake_bin")"
  noop_apply_status=$?
  missing_manifest_output="$(run_preview_without_manifest "$upstream" "$dest" "$fake_bin")"
  missing_manifest_status=$?
  set -e

  help_output="$(run_help "$upstream" "$fake_bin")"
  script_source="$(cat "$upstream/scripts/sync-to-codex-plugin.sh")"
  preview_section="$(printf '%s\n' "$preview_output" | sed -n '/^=== Preview (rsync --dry-run) ===$/,/^=== End preview ===$/p')"
  stale_preview_section="$(printf '%s\n' "$stale_preview_output" | sed -n '/^=== Preview (rsync --dry-run) ===$/,/^=== End preview ===$/p')"
  dirty_skill_path="$dirty_apply_dest/plugins/superpowers/skills/example/SKILL.md"
  noop_openai_metadata_path="$noop_apply_dest/plugins/superpowers/skills/example/agents/openai.yaml"

  echo ""
  echo "Preview assertions..."
  assert_equals "$preview_status" "0" "Preview exits successfully"
  assert_contains "$preview_output" "Version: $MANIFEST_VERSION" "Preview uses manifest version"
  assert_not_contains "$preview_output" "Version: $PACKAGE_VERSION" "Preview does not use package.json version"
  assert_contains "$preview_section" ".codex-plugin/plugin.json" "Preview includes manifest path"
  assert_contains "$preview_section" "assets/superpowers-small.svg" "Preview includes SVG asset"
  assert_contains "$preview_section" "assets/app-icon.png" "Preview includes PNG asset"
  assert_contains "$preview_section" ".private-journal/keep.txt" "Preview includes tracked ignored file"
  assert_not_contains "$preview_section" ".private-journal/leak.txt" "Preview excludes ignored untracked file"
  assert_not_contains "$preview_section" "ignored-cache/" "Preview excludes pure ignored directories"
  assert_not_contains "$preview_output" "Overlay file (.codex-plugin/plugin.json) will be regenerated" "Preview omits overlay regeneration note"
  assert_not_contains "$preview_output" "Assets (superpowers-small.svg, app-icon.png) will be seeded from" "Preview omits assets seeding note"
  assert_contains "$preview_section" "skills/example/SKILL.md" "Preview reflects dirty tracked destination file"
  assert_not_matches "$preview_section" "\\*deleting +skills/example/agents/openai\\.yaml" "Preview preserves destination-owned OpenAI agent metadata"
  assert_current_branch "$dest" "$dest_branch" "Preview leaves destination checkout on its original branch"
  assert_branch_absent "$dest" "sync/superpowers-*" "Preview does not create sync branch in destination checkout"

  echo ""
  echo "Mixed-directory assertions..."
  assert_equals "$mixed_only_status" "0" "Mixed ignored directory preview exits successfully under /bin/bash"
  assert_contains "$mixed_only_output" ".private-journal/keep.txt" "Mixed ignored directory preview still includes tracked ignored file"
  assert_not_contains "$mixed_only_output" "ignored-cache/" "Mixed ignored directory preview has no pure ignored directory fixture"

  echo ""
  echo "Convergence assertions..."
  assert_equals "$stale_preview_status" "0" "Stale ignored destination preview exits successfully"
  assert_matches "$stale_preview_section" "\\*deleting +\\.private-journal/leak\\.txt" "Preview deletes stale ignored destination file"

  echo ""
  echo "Bootstrap assertions..."
  assert_equals "$bootstrap_status" "0" "Bootstrap preview exits successfully"
  assert_contains "$bootstrap_output" "Mode: BOOTSTRAP (creating plugins/superpowers/ when absent)" "Bootstrap preview describes directory creation"
  assert_not_contains "$bootstrap_output" "Assets:" "Bootstrap preview omits external assets path"
  assert_contains "$bootstrap_output" "Dry run only. Nothing was changed or pushed." "Bootstrap preview remains dry-run only"
  assert_path_absent "$bootstrap_dest/plugins/superpowers" "Bootstrap preview does not create destination plugin directory"
  assert_current_branch "$bootstrap_dest" "$bootstrap_dest_branch" "Bootstrap preview leaves destination checkout on its original branch"
  assert_branch_absent "$bootstrap_dest" "bootstrap/superpowers-*" "Bootstrap preview does not create bootstrap branch in destination checkout"

  echo ""
  echo "Apply assertions..."
  assert_equals "$dirty_apply_status" "1" "Dirty local apply exits with failure"
  assert_contains "$dirty_apply_output" "ERROR: local checkout has uncommitted changes under 'plugins/superpowers'" "Dirty local apply reports protected destination path"
  assert_current_branch "$dirty_apply_dest" "$dirty_apply_dest_branch" "Dirty local apply leaves destination checkout on its original branch"
  assert_branch_absent "$dirty_apply_dest" "sync/superpowers-*" "Dirty local apply does not create sync branch in destination checkout"
  assert_file_equals "$dirty_skill_path" "# Example Skill
Locally modified fixture content." "Dirty local apply preserves tracked working-tree file content"
  assert_equals "$noop_apply_status" "0" "Clean no-op local apply exits successfully"
  assert_contains "$noop_apply_output" "No changes — embedded plugin was already in sync with upstream" "Clean no-op local apply reports no changes"
  assert_current_branch "$noop_apply_dest" "$noop_apply_dest_branch" "Clean no-op local apply leaves destination checkout on its original branch"
  assert_branch_absent "$noop_apply_dest" "sync/superpowers-*" "Clean no-op local apply does not create sync branch in destination checkout"
  assert_file_equals "$noop_openai_metadata_path" "interface:
  display_name: \"Example\"
  short_description: \"Destination-owned OpenAI metadata\"" "Clean no-op local apply preserves OpenAI agent metadata"

  echo ""
  echo "Missing manifest assertions..."
  assert_equals "$missing_manifest_status" "1" "Missing manifest exits with failure"
  assert_contains "$missing_manifest_output" "ERROR: committed Codex manifest missing at" "Missing manifest reports committed manifest path"

  echo ""
  echo "Help assertions..."
  assert_not_contains "$help_output" "--assets-src" "Help omits --assets-src"

  echo ""
  echo "Source assertions..."
  assert_not_contains "$script_source" "regenerated inline" "Source drops regenerated inline phrasing"
  assert_not_contains "$script_source" "Brand Assets directory" "Source drops Brand Assets directory phrasing"
  assert_not_contains "$script_source" "--assets-src" "Source drops --assets-src"

  if [[ $FAILURES -ne 0 ]]; then
    echo ""
    echo "FAILED: $FAILURES assertion(s) failed."
    exit 1
  fi

  echo ""
  echo "PASS"
}
main "$@"

View File

@@ -44,7 +44,6 @@ while [[ $# -gt 0 ]]; do
     echo ""
     echo "Tests:"
     echo "  test-plugin-loading.sh     Verify plugin installation and structure"
-    echo "  test-bootstrap-caching.sh  Verify bootstrap content caching"
     echo "  test-tools.sh              Test use_skill and find_skills tools (integration)"
     echo "  test-priority.sh           Test skill priority resolution (integration)"
     exit 0
@@ -60,7 +59,6 @@ done
 # List of tests to run (no external dependencies)
 tests=(
   "test-plugin-loading.sh"
-  "test-bootstrap-caching.sh"
 )
 
 # Integration tests (require OpenCode)

View File

@@ -1,124 +0,0 @@
import fs from 'fs';
import { pathToFileURL } from 'url';

const [, , pluginPath, scenario] = process.argv;
if (!pluginPath || !['present', 'missing'].includes(scenario)) {
  console.error('Usage: node test-bootstrap-caching.mjs PLUGIN_PATH present|missing');
  process.exit(2);
}

let existsCount = 0;
let readCount = 0;
const originalExistsSync = fs.existsSync;
const originalReadFileSync = fs.readFileSync;

fs.existsSync = function (...args) {
  if (isBootstrapSkillPath(args[0])) {
    existsCount += 1;
  }
  return originalExistsSync.apply(this, args);
};

fs.readFileSync = function (...args) {
  if (isBootstrapSkillPath(args[0])) {
    readCount += 1;
  }
  return originalReadFileSync.apply(this, args);
};

const mod = await import(pathToFileURL(pluginPath).href);
const plugin = await mod.SuperpowersPlugin({ client: {}, directory: '.' });
const transform = plugin['experimental.chat.messages.transform'];

const firstOutput = makeOutput(`${scenario} bootstrap first step`);
await transform({}, firstOutput);
const afterFirst = { existsCount, readCount };

const secondOutput = makeOutput(`${scenario} bootstrap second step`);
await transform({}, secondOutput);
const afterSecond = { existsCount, readCount };

const result = {
  scenario,
  firstBootstrapParts: countBootstrapParts(firstOutput),
  secondBootstrapParts: countBootstrapParts(secondOutput),
  firstReadCount: afterFirst.readCount,
  secondReadCount: afterSecond.readCount,
  firstExistsCount: afterFirst.existsCount,
  secondExistsCount: afterSecond.existsCount,
};

const failures = scenario === 'present'
  ? assertPresentBootstrap(result)
  : assertMissingBootstrap(result);

if (failures.length > 0) {
  console.error(JSON.stringify(result, null, 2));
  for (const failure of failures) {
    console.error(`FAIL: ${failure}`);
  }
  process.exit(1);
}

console.log(JSON.stringify(result, null, 2));

function isBootstrapSkillPath(filePath) {
  return String(filePath).replaceAll('\\', '/').includes('using-superpowers/SKILL.md');
}

function makeOutput(text) {
  return {
    messages: [{
      info: { role: 'user' },
      parts: [{ type: 'text', text }],
    }],
  };
}

function countBootstrapParts(output) {
  return output.messages[0].parts.filter(
    (part) => part.type === 'text' && part.text.includes('EXTREMELY_IMPORTANT')
  ).length;
}

function assertPresentBootstrap(result) {
  const failures = [];
  if (result.firstBootstrapParts !== 1) {
    failures.push(`expected first transform to inject one bootstrap part, got ${result.firstBootstrapParts}`);
  }
  if (result.secondBootstrapParts !== 1) {
    failures.push(`expected second transform to inject one bootstrap part, got ${result.secondBootstrapParts}`);
  }
  if (result.firstReadCount !== 1) {
    failures.push(`expected first transform to read SKILL.md once, got ${result.firstReadCount}`);
  }
  if (result.secondReadCount !== result.firstReadCount) {
    failures.push(`expected cached second transform to do no additional reads, got ${result.secondReadCount - result.firstReadCount}`);
  }
  if (result.secondExistsCount !== result.firstExistsCount) {
    failures.push(`expected cached second transform to do no additional exists checks, got ${result.secondExistsCount - result.firstExistsCount}`);
  }
  return failures;
}

function assertMissingBootstrap(result) {
  const failures = [];
  if (result.firstBootstrapParts !== 0) {
    failures.push(`expected no bootstrap when SKILL.md is missing, got ${result.firstBootstrapParts}`);
  }
  if (result.secondBootstrapParts !== 0) {
    failures.push(`expected no bootstrap on second missing-file transform, got ${result.secondBootstrapParts}`);
  }
  if (result.firstReadCount !== 0 || result.secondReadCount !== 0) {
    failures.push(`expected missing file path to avoid reads, got ${result.secondReadCount}`);
  }
  if (result.firstExistsCount < 1) {
    failures.push('expected first transform to check whether SKILL.md exists');
  }
  if (result.secondExistsCount !== result.firstExistsCount) {
    failures.push(`expected missing-file result to be cached, got ${result.secondExistsCount - result.firstExistsCount} extra exists checks`);
  }
  return failures;
}

View File

@@ -1,32 +0,0 @@
#!/usr/bin/env bash
# Test: Bootstrap Content Caching (#1202)
# Verifies the OpenCode transform caches bootstrap content between agent steps.
set -euo pipefail

SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"

echo "=== Test: Bootstrap Content Caching (#1202) ==="

source "$SCRIPT_DIR/setup.sh"
trap cleanup_test_env EXIT

run_present_file_check() {
  node "$SCRIPT_DIR/test-bootstrap-caching.mjs" "$SUPERPOWERS_PLUGIN_FILE" present
}

run_missing_file_check() {
  mv "$SUPERPOWERS_SKILLS_DIR/using-superpowers/SKILL.md" "$TEST_HOME/using-superpowers.SKILL.md.bak"
  node "$SCRIPT_DIR/test-bootstrap-caching.mjs" "$SUPERPOWERS_PLUGIN_FILE" missing
}

echo "Test 1: Caches bootstrap after the first successful transform..."
run_present_file_check
echo "  [PASS] Bootstrap content is cached while fresh message arrays still receive injection"

echo "Test 2: Caches missing SKILL.md result..."
run_missing_file_check
echo "  [PASS] Missing bootstrap file is cached and not re-probed every transform"

echo ""
echo "=== All bootstrap caching tests passed ==="

View File

@@ -1,13 +1,10 @@
 #!/usr/bin/env bash
 # Test: Skill Priority Resolution
-# Documents current OpenCode duplicate-name behavior for local and bundled
-# skills. The desired local-shadowing behavior is tracked separately; this
-# test keeps the integration suite honest without adding a plugin workaround.
+# Verifies that skills are resolved with correct priority: project > personal > superpowers
 # NOTE: These tests require OpenCode to be installed and configured
 set -euo pipefail
 SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
-OPENCODE_TEST_TIMEOUT_SECONDS="${OPENCODE_TEST_TIMEOUT_SECONDS:-120}"
 
 echo "=== Test: Skill Priority Resolution ==="
@@ -99,119 +96,103 @@ if ! command -v opencode &> /dev/null; then
     exit 0
 fi
 
-run_opencode() {
-  local result_var="$1"
-  local dir="$2"
-  local prompt="$3"
-  local command_output
-  local exit_code
-  set +e
-  command_output=$(cd "$dir" && timeout "${OPENCODE_TEST_TIMEOUT_SECONDS}s" opencode run --print-logs --format json "$prompt" 2>&1)
-  exit_code=$?
-  set -e
-  if [ $exit_code -eq 124 ]; then
-    echo "  [FAIL] OpenCode timed out after ${OPENCODE_TEST_TIMEOUT_SECONDS}s"
-    exit 1
-  fi
-  if [ $exit_code -ne 0 ]; then
-    echo "  [FAIL] OpenCode returned non-zero exit code: $exit_code"
-    echo "  Output was:"
-    awk 'NR <= 80 { print }' <<<"$command_output"
-    exit 1
-  fi
-  printf -v "$result_var" '%s' "$command_output"
-}
-
-assert_contains() {
-  local output="$1"
-  local needle="$2"
-  local message="$3"
-  if [[ "$output" == *"$needle"* ]]; then
-    echo "  [PASS] $message"
-  else
-    echo "  [FAIL] $message"
-    echo "  Expected to find: $needle"
-    echo "  Output was:"
-    awk 'NR <= 80 { print }' <<<"$output"
-    exit 1
-  fi
-}
-
-first_skill_tool_event() {
-  awk '/"type":"tool_use"/ && /"tool":"skill"/ { print; exit }' <<<"$1"
-}
-
-describe_priority_result() {
-  local output="$1"
-  local expected_marker="$2"
-  local fallback_marker="$3"
-  local pass_message="$4"
-  local known_bug_message="$5"
-  local loaded_skill
-  loaded_skill="$(first_skill_tool_event "$output")"
-  if [[ "$loaded_skill" == *"$expected_marker"* ]]; then
-    echo "  [PASS] $pass_message"
-  elif [[ "$loaded_skill" == *"$fallback_marker"* ]]; then
-    echo "  [INFO] $known_bug_message"
-    echo "  [INFO] Tracked separately: OpenCode bundled skills can shadow local skills with duplicate native names"
-  else
-    echo "  [FAIL] Could not verify priority marker in native skill tool output"
-    echo "  Output was:"
-    awk 'NR <= 80 { print }' <<<"$output"
-    exit 1
-  fi
-}
-
-# Test 2: Document personal vs bundled superpowers priority
+# Test 2: Test that personal overrides superpowers
 echo ""
-echo "Test 2: Documenting personal vs superpowers priority..."
+echo "Test 2: Testing personal > superpowers priority..."
 echo "  Running from outside project directory..."
-run_opencode output "$HOME" "Call the skill tool with name \"priority-test\". Show the exact content including any PRIORITY_MARKER text."
-describe_priority_result \
-  "$output" \
-  "PRIORITY_MARKER_PERSONAL_VERSION" \
-  "PRIORITY_MARKER_SUPERPOWERS_VERSION" \
-  "Personal version loaded for duplicate native skill name" \
-  "Current OpenCode behavior loaded bundled superpowers version instead of personal version"
-
-# Test 3: Document project vs bundled superpowers priority
+# Run from HOME (not in project) - should get personal version
+cd "$HOME"
+output=$(timeout 60s opencode run --print-logs "Use the use_skill tool to load the priority-test skill. Show me the exact content including any PRIORITY_MARKER text." 2>&1) || {
+  exit_code=$?
+  if [ $exit_code -eq 124 ]; then
+    echo "  [FAIL] OpenCode timed out after 60s"
+    exit 1
+  fi
+}
+
+if echo "$output" | grep -qi "PRIORITY_MARKER_PERSONAL_VERSION"; then
+  echo "  [PASS] Personal version loaded (overrides superpowers)"
+elif echo "$output" | grep -qi "PRIORITY_MARKER_SUPERPOWERS_VERSION"; then
+  echo "  [FAIL] Superpowers version loaded instead of personal"
+  exit 1
+else
+  echo "  [WARN] Could not verify priority marker in output"
+  echo "  Output snippet:"
+  echo "$output" | grep -i "priority\|personal\|superpowers" | head -10
+fi
+
+# Test 3: Test that project overrides both personal and superpowers
 echo ""
-echo "Test 3: Documenting project vs personal/superpowers priority..."
+echo "Test 3: Testing project > personal > superpowers priority..."
 echo "  Running from project directory..."
-run_opencode output "$TEST_HOME/test-project" "Call the skill tool with name \"priority-test\". Show the exact content including any PRIORITY_MARKER text."
-describe_priority_result \
-  "$output" \
-  "PRIORITY_MARKER_PROJECT_VERSION" \
-  "PRIORITY_MARKER_SUPERPOWERS_VERSION" \
-  "Project version loaded for duplicate native skill name" \
-  "Current OpenCode behavior loaded bundled superpowers version instead of project version"
-
-# Test 4: Test a non-colliding bundled superpowers skill is still available
-echo ""
-echo "Test 4: Testing non-colliding superpowers skill remains available..."
-mkdir -p "$SUPERPOWERS_SKILLS_DIR/superpowers-only-test"
-cat > "$SUPERPOWERS_SKILLS_DIR/superpowers-only-test/SKILL.md" <<'EOF'
----
-name: superpowers-only-test
-description: Superpowers-only priority test skill
----
-# Superpowers Only Test Skill
-
-PRIORITY_MARKER_SUPERPOWERS_ONLY_VERSION
-EOF
-
-run_opencode output "$TEST_HOME/test-project" "Call the skill tool with name \"superpowers-only-test\". Show the exact content including any PRIORITY_MARKER text."
-assert_contains "$output" "PRIORITY_MARKER_SUPERPOWERS_ONLY_VERSION" "Non-colliding superpowers skill is still registered"
+# Run from project directory - should get project version
+cd "$TEST_HOME/test-project"
+output=$(timeout 60s opencode run --print-logs "Use the use_skill tool to load the priority-test skill. Show me the exact content including any PRIORITY_MARKER text." 2>&1) || {
+  exit_code=$?
+  if [ $exit_code -eq 124 ]; then
+    echo "  [FAIL] OpenCode timed out after 60s"
+    exit 1
+  fi
+}
+
+if echo "$output" | grep -qi "PRIORITY_MARKER_PROJECT_VERSION"; then
+  echo "  [PASS] Project version loaded (highest priority)"
+elif echo "$output" | grep -qi "PRIORITY_MARKER_PERSONAL_VERSION"; then
+  echo "  [FAIL] Personal version loaded instead of project"
+  exit 1
+elif echo "$output" | grep -qi "PRIORITY_MARKER_SUPERPOWERS_VERSION"; then
+  echo "  [FAIL] Superpowers version loaded instead of project"
+  exit 1
+else
+  echo "  [WARN] Could not verify priority marker in output"
+  echo "  Output snippet:"
+  echo "$output" | grep -i "priority\|project\|personal" | head -10
+fi
+
+# Test 4: Test explicit superpowers: prefix bypasses priority
+echo ""
+echo "Test 4: Testing superpowers: prefix forces superpowers version..."
+cd "$TEST_HOME/test-project"
+output=$(timeout 60s opencode run --print-logs "Use the use_skill tool to load superpowers:priority-test specifically. Show me the exact content including any PRIORITY_MARKER text." 2>&1) || {
+  exit_code=$?
+  if [ $exit_code -eq 124 ]; then
+    echo "  [FAIL] OpenCode timed out after 60s"
+    exit 1
+  fi
+}
+
+if echo "$output" | grep -qi "PRIORITY_MARKER_SUPERPOWERS_VERSION"; then
+  echo "  [PASS] superpowers: prefix correctly forces superpowers version"
+elif echo "$output" | grep -qi "PRIORITY_MARKER_PROJECT_VERSION\|PRIORITY_MARKER_PERSONAL_VERSION"; then
+  echo "  [FAIL] superpowers: prefix did not force superpowers version"
+  exit 1
+else
+  echo "  [WARN] Could not verify priority marker in output"
+fi
+
+# Test 5: Test explicit project: prefix
+echo ""
+echo "Test 5: Testing project: prefix forces project version..."
+cd "$HOME"  # Run from outside project but with project: prefix
+output=$(timeout 60s opencode run --print-logs "Use the use_skill tool to load project:priority-test specifically. Show me the exact content." 2>&1) || {
+  exit_code=$?
+  if [ $exit_code -eq 124 ]; then
+    echo "  [FAIL] OpenCode timed out after 60s"
+    exit 1
+  fi
+}
+
+# Note: This may fail since we're not in the project directory
+# The project: prefix only works when in a project context
+if echo "$output" | grep -qi "not found\|error"; then
+  echo "  [PASS] project: prefix correctly fails when not in project context"
+else
+  echo "  [INFO] project: prefix behavior outside project context may vary"
+fi
 
 echo ""
 echo "=== All priority tests passed ==="

View File

@@ -1,12 +1,10 @@
 #!/usr/bin/env bash
-# Test: Native Skill Tool Functionality
-# Verifies that OpenCode's native skill tool can load personal, project,
-# and bundled superpowers skills.
+# Test: Tools Functionality
+# Verifies that use_skill and find_skills tools work correctly
 # NOTE: These tests require OpenCode to be installed and configured
 set -euo pipefail
 SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
-OPENCODE_TEST_TIMEOUT_SECONDS="${OPENCODE_TEST_TIMEOUT_SECONDS:-120}"
 
 echo "=== Test: Tools Functionality ==="
@@ -23,73 +21,84 @@ if ! command -v opencode &> /dev/null; then
     exit 0
 fi
 
-run_opencode() {
-  local result_var="$1"
-  local dir="$2"
-  local prompt="$3"
-  local command_output
-  local exit_code
-  set +e
-  command_output=$(cd "$dir" && timeout "${OPENCODE_TEST_TIMEOUT_SECONDS}s" opencode run --print-logs --format json "$prompt" 2>&1)
-  exit_code=$?
-  set -e
-  if [ $exit_code -eq 124 ]; then
-    echo "  [FAIL] OpenCode timed out after ${OPENCODE_TEST_TIMEOUT_SECONDS}s"
-    exit 1
-  fi
-  if [ $exit_code -ne 0 ]; then
-    echo "  [FAIL] OpenCode returned non-zero exit code: $exit_code"
-    echo "  Output was:"
-    awk 'NR <= 80 { print }' <<<"$command_output"
-    exit 1
-  fi
-  printf -v "$result_var" '%s' "$command_output"
-}
-
-assert_contains() {
-  local output="$1"
-  local needle="$2"
-  local message="$3"
-  if [[ "$output" == *"$needle"* ]]; then
-    echo "  [PASS] $message"
-  else
-    echo "  [FAIL] $message"
-    echo "  Expected to find: $needle"
-    echo "  Output was:"
-    awk 'NR <= 80 { print }' <<<"$output"
-    exit 1
-  fi
-}
-
-# Test 1: Test personal skill loading via OpenCode's native skill tool
-echo "Test 1: Testing native skill tool with a personal skill..."
-echo "  Running opencode with personal-test request..."
-
-run_opencode output "$TEST_HOME/test-project" "Call the skill tool with name \"personal-test\". Then print the PERSONAL_SKILL_MARKER_12345 marker."
-assert_contains "$output" '"tool":"skill"' "OpenCode called the native skill tool"
-assert_contains "$output" "PERSONAL_SKILL_MARKER_12345" "native skill tool loaded personal-test skill content"
-
-# Test 2: Test project skill loading
-echo ""
-echo "Test 2: Testing native skill tool with a project skill..."
-echo "  Running opencode with project-test request..."
-run_opencode output "$TEST_HOME/test-project" "Call the skill tool with name \"project-test\". Then print the PROJECT_SKILL_MARKER_67890 marker."
-assert_contains "$output" "PROJECT_SKILL_MARKER_67890" "native skill tool loaded project-test skill content"
-
-# Test 3: Test bundled superpowers skill loading
-echo ""
-echo "Test 3: Testing native skill tool with a superpowers skill..."
-echo "  Running opencode with brainstorming skill..."
-run_opencode output "$TEST_HOME/test-project" "Call the skill tool with name \"brainstorming\". Then tell me the loaded skill title."
-assert_contains "$output" '"name":"brainstorming"' "native skill tool loaded bundled brainstorming skill"
-assert_contains "$output" "Brainstorming Ideas Into Designs" "brainstorming skill content was returned"
+# Test 1: Test find_skills tool via direct invocation
+echo "Test 1: Testing find_skills tool..."
+echo "  Running opencode with find_skills request..."
+
+# Use timeout to prevent hanging, capture both stdout and stderr
+output=$(timeout 60s opencode run --print-logs "Use the find_skills tool to list available skills. Just call the tool and show me the raw output." 2>&1) || {
+  exit_code=$?
+  if [ $exit_code -eq 124 ]; then
+    echo "  [FAIL] OpenCode timed out after 60s"
+    exit 1
+  fi
+  echo "  [WARN] OpenCode returned non-zero exit code: $exit_code"
+}
+
+# Check for expected patterns in output
+if echo "$output" | grep -qi "superpowers:brainstorming\|superpowers:using-superpowers\|Available skills"; then
+  echo "  [PASS] find_skills tool discovered superpowers skills"
+else
+  echo "  [FAIL] find_skills did not return expected skills"
+  echo "  Output was:"
+  echo "$output" | head -50
+  exit 1
+fi
+
+# Check if personal test skill was found
+if echo "$output" | grep -qi "personal-test"; then
+  echo "  [PASS] find_skills found personal test skill"
+else
+  echo "  [WARN] personal test skill not found in output (may be ok if tool returned subset)"
+fi
+
+# Test 2: Test use_skill tool
+echo ""
+echo "Test 2: Testing use_skill tool..."
+echo "  Running opencode with use_skill request..."
+output=$(timeout 60s opencode run --print-logs "Use the use_skill tool to load the personal-test skill and show me what you get." 2>&1) || {
+  exit_code=$?
+  if [ $exit_code -eq 124 ]; then
+    echo "  [FAIL] OpenCode timed out after 60s"
+    exit 1
+  fi
+  echo "  [WARN] OpenCode returned non-zero exit code: $exit_code"
+}
+
+# Check for the skill marker we embedded
+if echo "$output" | grep -qi "PERSONAL_SKILL_MARKER_12345\|Personal Test Skill\|Launching skill"; then
+  echo "  [PASS] use_skill loaded personal-test skill content"
+else
+  echo "  [FAIL] use_skill did not load personal-test skill correctly"
+  echo "  Output was:"
+  echo "$output" | head -50
+  exit 1
+fi
+
+# Test 3: Test use_skill with superpowers: prefix
+echo ""
+echo "Test 3: Testing use_skill with superpowers: prefix..."
+echo "  Running opencode with superpowers:brainstorming skill..."
+output=$(timeout 60s opencode run --print-logs "Use the use_skill tool to load superpowers:brainstorming and tell me the first few lines of what you received." 2>&1) || {
+  exit_code=$?
+  if [ $exit_code -eq 124 ]; then
+    echo "  [FAIL] OpenCode timed out after 60s"
+    exit 1
+  fi
+  echo "  [WARN] OpenCode returned non-zero exit code: $exit_code"
+}
+
+# Check for expected content from brainstorming skill
+if echo "$output" | grep -qi "brainstorming\|Launching skill\|skill.*loaded"; then
+  echo "  [PASS] use_skill loaded superpowers:brainstorming skill"
+else
+  echo "  [FAIL] use_skill did not load superpowers:brainstorming correctly"
+  echo "  Output was:"
+  echo "$output" | head -50
+  exit 1
+fi
 
 echo ""
-echo "=== All native skill tool tests passed ==="
+echo "=== All tools tests passed ==="