diff --git a/.claude/commands/release.md b/.claude/commands/release.md
index c83378a..6025ac9 100644
--- a/.claude/commands/release.md
+++ b/.claude/commands/release.md
@@ -1,37 +1,49 @@
 ---
 allowed-tools: Bash(git:*), Bash(gh:*)
-argument-hint: major | minor | patch
+argument-hint: [optional - will prompt interactively]
 description: Create and publish a new release
 ---
 
 ## Context
 
-- Current version: !`git describe --tags --abbrev=0 2>/dev/null || echo "no tags yet"`
+- Latest stable: !`git tag -l 'v[0-9]*.[0-9]*.[0-9]*' --sort=-v:refname | grep -v '-' | head -1 || echo "none"`
+- Latest beta: !`git tag -l 'v*-beta*' --sort=-v:refname | head -1 || echo "none"`
 - Current branch: !`git branch --show-current`
 - Git status: !`git status --short`
 - Commits since last tag: !`git log $(git describe --tags --abbrev=0 2>/dev/null)..HEAD --oneline 2>/dev/null || git log --oneline -5`
 
 ## Task
 
-Create a new **$1** release (major | minor | patch).
+Create a new release.
 
 ### Steps
 
-1. **Validate** argument is one of: major, minor, patch. If missing or invalid, ask.
-
-2. **Check prerequisites**:
-   - On `main` branch
+1. **Show current state** to the user:
+   - Latest stable version
+   - Latest beta version (if any)
+   - What versions each release type would create
+
+2. **Ask user what release to create** using AskUserQuestion:
+   - Calculate and show concrete version numbers for each option
+   - Example options (adjust based on current state):
+     - "v0.2.0 (minor)" — requires `main` branch
+     - "v0.1.2 (patch)" — requires `main` branch
+     - "v1.0.0 (major)" — requires `main` branch
+     - "v0.2.0-beta.1 (new beta)" — requires `dev` branch
+     - "v0.2.0-beta.2 (next beta)" — only if beta exists, requires `dev` branch
+   - If argument was provided (e.g., `/release patch`), skip the question and use it
+
+3. **Check prerequisites**:
    - Working directory is clean
   - Tests pass: `go test ./...`
-
-3. **Calculate new version** using semver:
-   - major: bump X in vX.Y.Z, reset Y and Z to 0
-   - minor: bump Y, reset Z to 0
-   - patch: bump Z
+   - **IMPORTANT - Branch rules:**
+     - **Stable releases (major/minor/patch)**: MUST be on `main` branch
+     - **Beta releases**: MUST be on `dev` branch
+     - If on wrong branch, stop and tell the user to switch branches first
 
 4. **Create and push tag**:
    ```bash
-   VERSION=vX.Y.Z
+   VERSION=vX.Y.Z  # or vX.Y.Z-beta.N for beta
    git tag "$VERSION"
    git push origin "$VERSION"
    ```
@@ -41,13 +53,19 @@ Create a new **$1** release (major | minor | patch).
    gh run watch --workflow release.yml --interval 10
    ```
 
-6. **Verify release**:
+6. **Mark as prerelease** (beta only):
+   ```bash
+   gh release edit "$VERSION" --prerelease
+   ```
+   This ensures the beta won't be picked up by auto-update (which uses `/releases/latest`).
+
+7. **Verify release**:
    ```bash
    gh release view "$VERSION"
    gh release view "$VERSION" --json assets --jq '.assets[].name'
    ```
 
-7. **Add release notes**:
+8. **Add release notes**:
    - Read the commit history since the last release
    - Group changes by type (Features, Fixes, Improvements, etc.)
    - Write a concise summary highlighting the most important changes
@@ -72,21 +90,40 @@ Create a new **$1** release (major | minor | patch).
    )"
    ```
 
-8. **Verify Homebrew tap updated**:
+9. **Verify Homebrew tap updated** (stable releases only - skip for beta):
    ```bash
    gh api "repos/zippoxer/homebrew-tap/contents/Formula/subtask.rb?ref=main" --jq .content \
      | base64 --decode \
      | rg "version|url|sha256" -n
    ```
 
-9. **Test Homebrew install** (without installing):
-   ```bash
-   brew fetch --force zippoxer/tap/subtask
-   ```
-   This downloads the tarball and verifies the checksum matches the formula.
+10. **Test Homebrew install** (stable releases only - skip for beta):
+    ```bash
+    brew fetch --force zippoxer/tap/subtask
+    ```
+    This downloads the tarball and verifies the checksum matches the formula. Note: Do NOT update the local installation. The user tests with local builds (`go install ./cmd/subtask`), not Homebrew.
+
+### Beta Release Notes
+
+For beta releases, keep notes concise and focused on what to test:
+```bash
+gh release edit "$VERSION" --prerelease --notes "$(cat <<'EOF'
+## Beta Release
+
+This is a **prerelease** for testing. Not recommended for production.
+
+### Changes
+- Change 1
+- Change 2
+
+### Testing
+Please report issues at https://github.com/zippoxer/subtask/issues
+EOF
+)"
+```
+
 ### Troubleshooting
 
 If release workflow fails:
@@ -94,7 +131,7 @@ If release workflow fails:
 gh run list --workflow release.yml --limit 5
 gh run view --log-failed
 ```
-Fix the issue on main, then create a NEW tag (don't reuse).
+Fix the issue, then create a NEW tag (don't reuse).
 
 To undo a bad release:
 ```bash
diff --git a/.gitignore b/.gitignore
index c4edba9..16ef7b8 100644
--- a/.gitignore
+++ b/.gitignore
@@ -21,3 +21,4 @@ coverage.out
 # Personal notes
 ISSUES.md
 TODO.md
+/docs/issues
diff --git a/CLAUDE.md b/CLAUDE.md
index a357473..95debbe 100644
--- a/CLAUDE.md
+++ b/CLAUDE.md
@@ -133,7 +133,8 @@ Task status is what users care about. Worker status is operational detail. Works
 
 | Command | Description |
 |---------|-------------|
-| `subtask init` | One-time setup: project config, workspace limit |
+| `subtask install` | One-time global install + configuration wizard |
+| `subtask config` | Edit user defaults or project overrides |
 | `subtask draft ` | Create a task without running it |
 | `subtask send ` | Send a message (starts or resumes task) |
 | `subtask stage ` | Advance workflow stage |
@@ -152,18 +153,20 @@ Task status is what users care about. Worker status is operational detail. Works
 
 ## Storage
 
 ```
-.subtask/                  # per-project, gitignored
-├── config.json            # harness, model, workspace limit
-├── tasks//                # task folder (portable, syncable)
-│   ├── TASK.md            # description + schema version in frontmatter
-│   ├── PLAN.md            # optional plan
-│   ├── PROGRESS.json      # worker progress tracking
-│   └── history.jsonl      # source of truth: messages + lifecycle events
-├── internal//             # runtime only (not portable)
-│   └── state.json         # session ID, workspace path
-└── index.db               # SQLite cache (rebuilt from history)
-
-~/.subtask/workspaces/     # global worktree pool (created on demand)
+~/.subtask/
+├── config.json            # global defaults (from install/config)
+├── workspaces/--/         # worktrees (created on demand)
+└── projects//             # per-project runtime state (machine-local)
+    ├── internal/          # session IDs, workspace assignments, locks
+    └── index.db           # SQLite cache (rebuildable)
+
+/.subtask/
+├── config.json            # optional per-project overrides
+└── tasks//                # task folder (portable, syncable)
+    ├── TASK.md            # description + schema version in frontmatter
+    ├── PLAN.md            # optional plan
+    ├── PROGRESS.json      # worker progress tracking
+    └── history.jsonl      # source of truth: messages + lifecycle events
 ```
 ### Portability Contract
diff --git a/README.md b/README.md
index de7af10..ceedea6 100644
--- a/README.md
+++ b/README.md
@@ -49,11 +49,12 @@ Run `subtask` in your terminal to see everything:
 
-## Install
-> [!NOTE]
+## Setup
+
+> [!NOTE] 
 > Subtask is in early development. Upcoming releases will simplify installation, solve known bugs, and improve Claude's proficiency.
 
-### Get the CLI
+### Install the CLI
 
 #### Mac/Linux
 
@@ -88,7 +89,16 @@ go install github.com/zippoxer/subtask/cmd/subtask@latest
 
-### Install the Claude Code Skill
+### Install the Skill
+
+Tell Claude Code:
+```md
+Set up Subtask with `subtask install --guide`.
+```
+Claude will install the Subtask skill at `~/.claude/skills` and ask you whether subagents should run Claude, Codex, or OpenCode.
+
+&#13;
+Or install manually…
 
 ```bash
 subtask install
@@ -96,18 +106,16 @@ subtask install
 # Tip: Uninstall later with `subtask uninstall`.
 ```
 
-> *This asks whether to install to user-scope (`~/.claude/skills`) or project-scope.*
->
-> *You can skip installing the plugin, it isn't working yet.*
-
-Restart Claude Code.
+&#13;
 
-### Setup Subtask in your Repo
+### Install the Plugin (Optional)
 
-```bash
-cd your-repo
-subtask init
+In Claude Code:
+```
+/plugin marketplace add zippoxer/subtask
+/plugin install subtask@subtask
 ```
+This reminds Claude to use the Subtask skill when it invokes the CLI.
 
 ## Use
@@ -133,7 +141,7 @@ subtask update
 - I use Claude Code to lead the development (i talk, it creates tasks and tracks everything)
 - I use Codex for subagents (just preference, Claude Code works too)
 - ~60 tasks merged in the past week
-- Proof: https://github.com/user-attachments/assets/6c71e34f-b3c6-4372-ac25-dd3eea15932e
+- [Proof](https://github.com/user-attachments/assets/6c71e34f-b3c6-4372-ac25-dd3eea15932e)
 
 ## License
diff --git a/cmd/subtask/ask.go b/cmd/subtask/ask.go
index 644d253..81fa2cc 100644
--- a/cmd/subtask/ask.go
+++ b/cmd/subtask/ask.go
@@ -38,7 +38,7 @@ func (c *AskCmd) Run() error {
 	// Load config for harness
 	cfg, err := workspace.LoadConfig()
 	if err != nil {
-		return fmt.Errorf("subtask not initialized\n\nRun: subtask init")
+		return err
 	}
 	if err := workspace.ValidateReasoningFlag(cfg.Harness, c.Reasoning); err != nil {
 		return err
diff --git a/cmd/subtask/auto_update.go b/cmd/subtask/auto_update.go
index 047662d..1e6dd37 100644
--- a/cmd/subtask/auto_update.go
+++ b/cmd/subtask/auto_update.go
@@ -2,8 +2,11 @@ package main
 
 import (
 	"os"
+	"path/filepath"
 
+	"github.com/zippoxer/subtask/internal/homedir"
 	"github.com/zippoxer/subtask/pkg/install"
+	"github.com/zippoxer/subtask/pkg/task"
 )
 
 func runAutoUpdate() {
@@ -11,31 +14,24 @@ func runAutoUpdate() {
 		return
 	}
 
-	userBase, _, err := baseDirForScope(install.ScopeUser)
-	if err != nil || userBase == "" {
-		return
-	}
-	projectBase, _, err := baseDirForScope(install.ScopeProject)
-	if err != nil || projectBase == "" {
-		return
+	homeDir, err := homedir.Dir()
+	if err == nil && homeDir != "" {
+		res, err := install.AutoUpdateIfInstalled(homeDir)
+		if err == nil && res.UpdatedSkill {
+			printSuccess("Updated skill to latest version")
+		}
 	}
 
-	userRes, err := install.AutoUpdateIfInstalled(install.ScopeUser, userBase)
-	if err != nil {
+	repoRoot, err := task.GitRootAbs()
+	if err != nil || repoRoot == "" {
 		return
 	}
-	projectRes, err := install.AutoUpdateIfInstalled(install.ScopeProject, projectBase)
+
+	st, err := install.GetSkillStatusFor(repoRoot)
 	if err != nil {
 		return
 	}
-
-	skillUpdated := userRes.UpdatedSkill || projectRes.UpdatedSkill
-	pluginUpdated := userRes.UpdatedPlugin || projectRes.UpdatedPlugin
-
-	if skillUpdated {
-		printSuccess("Updated skill to latest version")
-	}
-	if pluginUpdated {
-		printSuccess("Updated plugin to latest version")
+	if st.Installed && !st.UpToDate {
+		printWarning("Project skill at " + filepath.Join(".claude", "skills", "subtask", "SKILL.md") + " is outdated. Run `subtask install --scope project` to update.")
 	}
 }
diff --git a/cmd/subtask/auto_update_test.go b/cmd/subtask/auto_update_test.go
new file mode 100644
index 0000000..40f052e
--- /dev/null
+++ b/cmd/subtask/auto_update_test.go
@@ -0,0 +1,124 @@
+package main
+
+import (
+	"os"
+	"path/filepath"
+	"testing"
+
+	"github.com/stretchr/testify/require"
+
+	"github.com/zippoxer/subtask/pkg/install"
+)
+
+func TestRunAutoUpdate_ProjectSkillOutdated_Warns(t *testing.T) {
+	withOutputMode(t, false)
+
+	home := t.TempDir()
+	t.Setenv("HOME", home)
+	t.Setenv(autoUpdateEnvVar, "")
+
+	repo := t.TempDir()
+	gitCmd(t, repo, "init")
+
+	prev, err := os.Getwd()
+	require.NoError(t, err)
+	require.NoError(t, os.Chdir(repo))
+	t.Cleanup(func() { _ = os.Chdir(prev) })
+
+	projectSkill := filepath.Join(repo, ".claude", "skills", "subtask", "SKILL.md")
+	require.NoError(t, os.MkdirAll(filepath.Dir(projectSkill), 0o755))
+	require.NoError(t, os.WriteFile(projectSkill, []byte("outdated"), 0o644))
+
+	stdout, stderr, err := captureStdoutStderr(t, func() error {
+		runAutoUpdate()
+		return nil
+	})
+	require.NoError(t, err)
+	require.Empty(t, stdout)
+	require.Equal(t, "warning: Project skill at "+filepath.Join(".claude", "skills", "subtask", "SKILL.md")+" is outdated. Run `subtask install --scope project` to update.\n", stderr)
+}
+
+func TestRunAutoUpdate_ProjectSkillUpToDate_Silent(t *testing.T) {
+	withOutputMode(t, false)
+
+	home := t.TempDir()
+	t.Setenv("HOME", home)
+	t.Setenv(autoUpdateEnvVar, "")
+
+	repo := t.TempDir()
+	gitCmd(t, repo, "init")
+
+	prev, err := os.Getwd()
+	require.NoError(t, err)
+	require.NoError(t, os.Chdir(repo))
+	t.Cleanup(func() { _ = os.Chdir(prev) })
+
+	projectSkill := filepath.Join(repo, ".claude", "skills", "subtask", "SKILL.md")
+	require.NoError(t, os.MkdirAll(filepath.Dir(projectSkill), 0o755))
+	require.NoError(t, os.WriteFile(projectSkill, install.Embedded(), 0o644))
+
+	stdout, stderr, err := captureStdoutStderr(t, func() error {
+		runAutoUpdate()
+		return nil
+	})
+	require.NoError(t, err)
+	require.Empty(t, stdout)
+	require.Empty(t, stderr)
+}
+
+func TestRunAutoUpdate_NoGitRepo_Silent(t *testing.T) {
+	withOutputMode(t, false)
+
+	home := t.TempDir()
+	t.Setenv("HOME", home)
+	t.Setenv(autoUpdateEnvVar, "")
+
+	dir := t.TempDir()
+
+	prev, err := os.Getwd()
+	require.NoError(t, err)
+	require.NoError(t, os.Chdir(dir))
+	t.Cleanup(func() { _ = os.Chdir(prev) })
+
+	// Even if a project-scope path exists here, project scope only applies inside a git repo.
+	projectSkill := filepath.Join(dir, ".claude", "skills", "subtask", "SKILL.md")
+	require.NoError(t, os.MkdirAll(filepath.Dir(projectSkill), 0o755))
+	require.NoError(t, os.WriteFile(projectSkill, []byte("outdated"), 0o644))
+
+	stdout, stderr, err := captureStdoutStderr(t, func() error {
+		runAutoUpdate()
+		return nil
+	})
+	require.NoError(t, err)
+	require.Empty(t, stdout)
+	require.Empty(t, stderr)
+}
+
+func TestRunAutoUpdate_AutoUpdateDisabled_SkipsChecks(t *testing.T) {
+	withOutputMode(t, false)
+
+	home := t.TempDir()
+	t.Setenv("HOME", home)
+	t.Setenv(autoUpdateEnvVar, "1")
+
+	repo := t.TempDir()
+	gitCmd(t, repo, "init")
+
+	prev, err := os.Getwd()
+	require.NoError(t, err)
+	require.NoError(t, os.Chdir(repo))
+	t.Cleanup(func() { _ = os.Chdir(prev) })
+
+	projectSkill := filepath.Join(repo, ".claude", "skills", "subtask", "SKILL.md")
+	require.NoError(t, os.MkdirAll(filepath.Dir(projectSkill), 0o755))
+	require.NoError(t, os.WriteFile(projectSkill, []byte("outdated"), 0o644))
+
+	stdout, stderr, err := captureStdoutStderr(t, func() error {
+		runAutoUpdate()
+		return nil
+	})
+	require.NoError(t, err)
+	require.Empty(t, stdout)
+	require.Empty(t, stderr)
+}
diff --git a/cmd/subtask/base_branch_tracking_golden_test.go b/cmd/subtask/base_branch_tracking_golden_test.go
index 7059658..0eb9a09 100644
--- a/cmd/subtask/base_branch_tracking_golden_test.go
+++ b/cmd/subtask/base_branch_tracking_golden_test.go
@@ -30,36 +30,6 @@ func commitEmpty(t *testing.T, dir, message string) {
 	gitCmd(t, dir, "commit", "--allow-empty", "-m", message)
 }
 
-func TestGolden_List_CommitsBehind(t *testing.T) {
-	env := testutil.NewTestEnv(t, 0)
-	withFixedNow(t, time.Date(2026, 1, 1, 12, 0, 0, 0, time.UTC))
-
-	baseCommit := gitCmdOutput(t, env.RootDir, "rev-parse", "HEAD")
-	commitEmpty(t, env.RootDir, "one")
-	commitEmpty(t, env.RootDir, "two")
-
-	taskName := "list/behind"
-	env.CreateTask(taskName, "Behind task", "main", "Description")
-	env.CreateTaskState(taskName, &task.State{
-		Workspace: "",
-	})
-	env.CreateTaskHistory(taskName, []history.Event{
-		{Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main", "base_commit": baseCommit})},
-		{Type: "stage.changed", Data: mustJSON(map[string]any{"from": "", "to": "implement"})},
-	})
-
-	for _, pretty := range []bool{false, true} {
-		t.Run(modeName(pretty), func(t *testing.T) {
-			withOutputMode(t, pretty)
-
-			stdout, stderr, err := captureStdoutStderr(t, (&ListCmd{}).Run)
-			require.NoError(t, err)
-			require.Empty(t, stderr)
-			testutil.AssertGoldenOutput(t, "testdata/list/commits_behind", stdout)
-		})
-	}
-}
-
 func TestGolden_Show_Conflicts(t *testing.T) {
 	env := testutil.NewTestEnv(t, 0)
 	withFixedNow(t, time.Date(2026, 1, 1, 12, 0, 0, 0, time.UTC))
@@ -152,31 +122,3 @@ func TestGolden_Send_PrintsConflicts(t *testing.T) {
 		})
 	}
 }
-
-func TestIntegration_BaseCommitTracking_DraftThenMainAdvances(t *testing.T) {
-	env := testutil.NewTestEnv(t, 0)
-	withOutputMode(t, false)
-
-	taskName := "integration/base-commit"
-	_, _, err := captureStdoutStderr(t, (&DraftCmd{
-		Task:        taskName,
-		Description: "Test description",
-		Base:        "main",
-		Title:       "Integration test",
-	}).Run)
-	require.NoError(t, err)
-
-	tail, err := history.Tail(taskName)
-	require.NoError(t, err)
-	require.NotEmpty(t, tail.BaseCommit)
-
-	commitEmpty(t, env.RootDir, "one")
-	commitEmpty(t, env.RootDir, "two")
-
-	stdout, _, err := captureStdoutStderr(t, (&ListCmd{}).Run)
-	require.NoError(t, err)
-	require.Contains(t, stdout, "(2 behind)")
-
-	stdout, _, err = captureStdoutStderr(t, (&ShowCmd{Task: taskName}).Run)
-	require.NoError(t, err)
-}
diff --git a/cmd/subtask/close.go b/cmd/subtask/close.go
index 97048bb..0e2940d 100644
--- a/cmd/subtask/close.go
+++ b/cmd/subtask/close.go
@@ -14,6 +14,10 @@ type CloseCmd struct {
 
 // Run executes the close command.
 func (c *CloseCmd) Run() error {
+	if _, err := preflightProject(); err != nil {
+		return err
+	}
+
 	res, err := ops.CloseTask(c.Task, c.Abandon, cliOpsLogger{})
 	if err != nil {
 		return err
diff --git a/cmd/subtask/config.go b/cmd/subtask/config.go
new file mode 100644
index 0000000..3fee20f
--- /dev/null
+++ b/cmd/subtask/config.go
@@ -0,0 +1,133 @@
+package main
+
+import (
+	"encoding/json"
+	"fmt"
+	"os"
+	"path/filepath"
+	"strings"
+
+	"github.com/charmbracelet/huh"
+	"github.com/zippoxer/subtask/pkg/task"
+	"github.com/zippoxer/subtask/pkg/workspace"
+)
+
+// ConfigCmd implements 'subtask config'.
+type ConfigCmd struct {
+	User          bool   `help:"Edit user config (~/.subtask/config.json)"`
+	Project       bool   `help:"Edit project config (/.subtask/config.json)"`
+	NoPrompt      bool   `help:"Non-interactive; use defaults"`
+	Harness       string `help:"Worker harness: 'codex', 'claude', or 'opencode'" placeholder:"HARNESS"`
+	Model         string `help:"Default model for workers" placeholder:"MODEL"`
+	Reasoning     string `help:"Reasoning level for Codex: 'low', 'medium', 'high', 'xhigh'" placeholder:"LEVEL"`
+	MaxWorkspaces int    `help:"Max parallel git worktrees per repo (default 20)" placeholder:"N"`
+}
+
+func (c *ConfigCmd) Run() error {
+	if c.User && c.Project {
+		return fmt.Errorf("--user and --project are mutually exclusive")
+	}
+
+	scope := "user"
+	if c.Project {
+		scope = "project"
+	} else if !c.User && !c.NoPrompt {
+		// Interactive scope prompt.
+		form := huh.NewForm(huh.NewGroup(
+			huh.NewSelect[string]().
+				Title("Config scope").
+				Options(
+					huh.NewOption("User (global defaults)", "user"),
+					huh.NewOption("Project (this repo only)", "project"),
+				).
+				Value(&scope),
+		))
+		if err := form.Run(); err != nil {
+			return err
+		}
+	}
+
+	var path string
+	var repoRoot string
+	switch scope {
+	case "user":
+		path = task.ConfigPath()
+	case "project":
+		var err error
+		repoRoot, err = preflightProjectOnly() // requires git; also runs layout migration.
+		if err != nil {
+			return err
+		}
+		path = filepath.Join(repoRoot, ".subtask", "config.json")
+	default:
+		return fmt.Errorf("invalid scope %q", scope)
+	}
+
+	existing := readConfigFileOrNil(path)
+	cfg, wrote, err := runConfigWizard(configWizardParams{
+		WritePath:     path,
+		RepoRoot:      repoRoot,
+		Existing:      existing,
+		NoPrompt:      c.NoPrompt,
+		Harness:       c.Harness,
+		Model:         c.Model,
+		Reasoning:     c.Reasoning,
+		MaxWorkspaces: c.MaxWorkspaces,
+	})
+	if err != nil {
+		return err
+	}
+	if !wrote || cfg == nil {
+		return nil
+	}
+
+	// Best-effort: ignore portable subtask data in git repos.
+	if scope == "project" && repoRoot != "" {
+		_ = ensureGitignore(repoRoot)
+	}
+
+	fmt.Println()
+	fmt.Println(successStyle.Render(" ✓ Config saved"))
+	printConfigDetails(cfg, scope, path)
+	fmt.Println()
+	return nil
+}
+
+func readConfigFileOrNil(path string) *workspace.Config {
+	b, err := os.ReadFile(path)
+	if err != nil {
+		return nil
+	}
+	var cfg workspace.Config
+	if err := json.Unmarshal(b, &cfg); err != nil {
+		// Leave validation/reporting to workspace.LoadConfig() for runtime commands.
+		return nil
+	}
+	if cfg.Options == nil {
+		cfg.Options = make(map[string]any)
+	}
+	return &cfg
+}
+
+func stringsTrimSpace(v any) string {
+	s, ok := v.(string)
+	if !ok {
+		return ""
+	}
+	return strings.TrimSpace(s)
+}
+
+// printConfigDetails prints the config settings in a consistent format.
+func printConfigDetails(cfg *workspace.Config, scope, path string) {
+	// Title case the scope.
+	scopeTitle := strings.ToUpper(scope[:1]) + scope[1:]
+	fmt.Printf(" %s %s %s\n", subtleStyle.Render("Scope:"), scopeTitle, subtleStyle.Render("("+abbreviatePath(path)+")"))
+	fmt.Printf(" %s %s\n", subtleStyle.Render("Harness:"), cfg.Harness)
+	if m := stringsTrimSpace(cfg.Options["model"]); m != "" {
+		fmt.Printf(" %s %s\n", subtleStyle.Render("Model:"), m)
+	}
+	if r := stringsTrimSpace(cfg.Options["reasoning"]); r != "" {
+		fmt.Printf(" %s %s\n", subtleStyle.Render("Reasoning:"), r)
+	}
+	fmt.Printf(" %s %d\n", subtleStyle.Render("Max workspaces:"), cfg.MaxWorkspaces)
+}
diff --git a/cmd/subtask/config_cmd_test.go b/cmd/subtask/config_cmd_test.go
new file mode 100644
index 0000000..67df123
--- /dev/null
+++ b/cmd/subtask/config_cmd_test.go
@@ -0,0 +1,62 @@
+package main
+
+import (
+	"encoding/json"
+	"errors"
+	"os"
+	"path/filepath"
+	"testing"
+
+	"github.com/stretchr/testify/require"
+
+	"github.com/zippoxer/subtask/pkg/subtaskerr"
+	"github.com/zippoxer/subtask/pkg/task"
+	"github.com/zippoxer/subtask/pkg/testutil"
+	"github.com/zippoxer/subtask/pkg/workspace"
+)
+
+func TestConfigCmd_UserScope_NoPrompt_WritesGlobalConfig(t *testing.T) {
+	t.Setenv("SUBTASK_DIR", t.TempDir())
+
+	// Ensure at least one harness is "available".
+	binDir := filepath.Join(t.TempDir(), "bin")
+	require.NoError(t, os.MkdirAll(binDir, 0o755))
+	_ = writeFakeCLI(t, binDir, "codex")
+	t.Setenv("PATH", binDir+string(os.PathListSeparator)+os.Getenv("PATH"))
+
+	require.NoError(t, (&ConfigCmd{User: true, NoPrompt: true}).Run())
+
+	b, err := os.ReadFile(task.ConfigPath())
+	require.NoError(t, err)
+
+	var cfg workspace.Config
+	require.NoError(t, json.Unmarshal(b, &cfg))
+	require.NotEmpty(t, cfg.Harness)
+}
+
+func TestConfigCmd_ProjectScope_RequiresGitRepo(t *testing.T) {
+	t.Setenv("SUBTASK_DIR", t.TempDir())
+	prev, _ := os.Getwd()
+	cwd := t.TempDir()
+	require.NoError(t, os.Chdir(cwd))
+	t.Cleanup(func() { _ = os.Chdir(prev) })
+
+	err := (&ConfigCmd{Project: true, NoPrompt: true}).Run()
+	require.True(t, errors.Is(err, subtaskerr.ErrNotGitRepo))
+}
+
+func TestConfigCmd_ProjectScope_NoPrompt_WritesProjectOverride(t *testing.T) {
+	t.Setenv("SUBTASK_DIR", t.TempDir())
+
+	// Ensure at least one harness is "available".
+	binDir := filepath.Join(t.TempDir(), "bin")
+	require.NoError(t, os.MkdirAll(binDir, 0o755))
+	_ = writeFakeCLI(t, binDir, "codex")
+	t.Setenv("PATH", binDir+string(os.PathListSeparator)+os.Getenv("PATH"))
+
+	env := testutil.NewTestEnv(t, 0)
+
+	require.NoError(t, (&ConfigCmd{Project: true, NoPrompt: true}).Run())
+	require.FileExists(t, filepath.Join(env.RootDir, ".subtask", "config.json"))
+}
diff --git a/cmd/subtask/config_wizard.go b/cmd/subtask/config_wizard.go
new file mode 100644
index 0000000..34ac20c
--- /dev/null
+++ b/cmd/subtask/config_wizard.go
@@ -0,0 +1,404 @@
+package main
+
+import (
+	"fmt"
+	"strings"
+
+	"github.com/charmbracelet/bubbles/key"
+	"github.com/charmbracelet/huh"
+	"github.com/zippoxer/subtask/pkg/harness"
+	"github.com/zippoxer/subtask/pkg/workspace"
+)
+
+type configWizardParams struct {
+	WritePath string
+	RepoRoot  string // optional; used only for display/help text
+	Existing  *workspace.Config
+	NoPrompt  bool
+	// Flag overrides (take precedence over defaults and existing config).
+	Harness       string
+	Model         string
+	Reasoning     string
+	MaxWorkspaces int
+}
+
+type configFlags struct {
+	Harness       string
+	Model         string
+	Reasoning     string
+	MaxWorkspaces int
+}
+
+type configValues struct {
+	Harness       string
+	Model         string
+	Reasoning     string
+	MaxWorkspaces int
+}
+
+// resolveConfigValues merges defaults + existing config + CLI flags into a resolved set of values.
+// It is a pure function (no IO) and does not check harness availability on the machine.
+func resolveConfigValues(existing *workspace.Config, flags configFlags) configValues {
+	values := configValues{
+		Harness:       "codex",
+		MaxWorkspaces: workspace.DefaultMaxWorkspaces,
+	}
+
+	if existing != nil {
+		if strings.TrimSpace(existing.Harness) != "" {
+			values.Harness = strings.TrimSpace(existing.Harness)
+		}
+		if existing.MaxWorkspaces > 0 {
+			values.MaxWorkspaces = existing.MaxWorkspaces
+		}
+		if existing.Options != nil {
+			if m := stringsTrimSpace(existing.Options["model"]); m != "" {
+				values.Model = m
+			}
+			if r := stringsTrimSpace(existing.Options["reasoning"]); r != "" {
+				values.Reasoning = r
+			}
+		}
+	}
+
+	// Harness override resets dependent values to harness-appropriate defaults (but still allows
+	// explicit flags to override after).
+	if strings.TrimSpace(flags.Harness) != "" {
+		values.Harness = strings.TrimSpace(flags.Harness)
+		values.Model = ""
+		values.Reasoning = ""
+	}
+	if strings.TrimSpace(flags.Model) != "" {
+		values.Model = strings.TrimSpace(flags.Model)
+	}
+	if strings.TrimSpace(flags.Reasoning) != "" {
+		values.Reasoning = strings.TrimSpace(flags.Reasoning)
+	}
+	if flags.MaxWorkspaces > 0 {
+		values.MaxWorkspaces = flags.MaxWorkspaces
+	}
+
+	// Harness-specific defaults (only when unset).
+	switch strings.TrimSpace(values.Harness) {
+	case "", "codex":
+		values.Harness = "codex"
+		if strings.TrimSpace(values.Model) == "" {
+			values.Model = "gpt-5.2"
+		}
+		if strings.TrimSpace(values.Reasoning) == "" {
+			values.Reasoning = "high"
+		}
+	case "claude":
+		if strings.TrimSpace(values.Model) == "" {
+			values.Model = "opus"
+		}
+		// If reasoning came from defaults/existing and the user didn't explicitly set it as a flag,
+		// drop it for non-codex harnesses (keeps config files clean and matches prior behavior).
+		if strings.TrimSpace(flags.Reasoning) == "" {
+			values.Reasoning = ""
+		}
+	case "opencode":
+		if strings.TrimSpace(flags.Reasoning) == "" {
+			values.Reasoning = ""
+		}
+	}
+
+	return values
+}
+
+// validateConfigValues validates resolved values without performing any IO.
+func validateConfigValues(values configValues) error {
+	harnessName := strings.TrimSpace(values.Harness)
+	switch harnessName {
+	case "codex", "claude", "opencode":
+		// ok
+	default:
+		return fmt.Errorf("invalid harness %q\n\nAllowed: codex, claude, opencode", harnessName)
+	}
+
+	if values.MaxWorkspaces < 0 {
+		return fmt.Errorf("max workspaces must be >= 0, got %d", values.MaxWorkspaces)
+	}
+
+	return workspace.ValidateReasoningFlag(harnessName, strings.TrimSpace(values.Reasoning))
+}
+
+// buildConfig creates a workspace.Config from resolved values.
+func buildConfig(values configValues) *workspace.Config {
+	cfg := &workspace.Config{
+		Harness:       strings.TrimSpace(values.Harness),
+		MaxWorkspaces: values.MaxWorkspaces,
+	}
+	if cfg.MaxWorkspaces <= 0 {
+		cfg.MaxWorkspaces = workspace.DefaultMaxWorkspaces
+	}
+
+	model := strings.TrimSpace(values.Model)
+	reasoning := strings.TrimSpace(values.Reasoning)
+	if model != "" || reasoning != "" {
+		cfg.Options = make(map[string]any)
+		if model != "" {
+			cfg.Options["model"] = model
+		}
+		if reasoning != "" {
+			cfg.Options["reasoning"] = reasoning
+		}
+	}
+
+	return cfg
+}
+
+func validateHarnessAvailable(harnessName string) error {
+	if harness.CanResolveCLI(harnessName) {
+		return nil
+	}
+	switch harnessName {
+	case "codex":
+		return fmt.Errorf("codex CLI not found\n\nInstall it from: https://github.com/openai/codex")
+	case "claude":
+		return fmt.Errorf("claude CLI not found\n\nInstall it from: https://claude.com/claude-code")
+	default:
+		return fmt.Errorf("opencode CLI not found\n\nInstall it from: https://github.com/anomalyco/opencode")
+	}
+}
+
+func runConfigWizard(p configWizardParams) (*workspace.Config, bool, error) {
+	if strings.TrimSpace(p.WritePath) == "" {
+		return nil, false, fmt.Errorf("config write path is required")
+	}
+
+	// Check which harnesses are available.
+	codexAvailable := isCommandAvailable("codex")
+	claudeAvailable := isCommandAvailable("claude")
+	opencodeAvailable := isCommandAvailable("opencode")
+	if !codexAvailable && !claudeAvailable && !opencodeAvailable {
+		return nil, false, fmt.Errorf("no worker harness available\n\nInstall one of:\n - Codex CLI: https://github.com/openai/codex\n - Claude Code CLI: https://claude.com/claude-code\n - OpenCode CLI: https://github.com/anomalyco/opencode")
+	}
+
+	flags := configFlags{
+		Harness:       p.Harness,
+		Model:         p.Model,
+		Reasoning:     p.Reasoning,
+		MaxWorkspaces: p.MaxWorkspaces,
+	}
+	values := resolveConfigValues(p.Existing, flags)
+
+	// If the user didn't explicitly request a harness and the resolved harness isn't available,
+	// fall back to the first available harness and reset dependent defaults.
+	if strings.TrimSpace(flags.Harness) == "" && !isCommandAvailable(values.Harness) {
+		fallbackHarness := "opencode"
+		switch {
+		case codexAvailable:
+			fallbackHarness = "codex"
+		case claudeAvailable:
+			fallbackHarness = "claude"
+		}
+		values = resolveConfigValues(nil, configFlags{
+			Harness:       fallbackHarness,
+			Model:         flags.Model,
+			Reasoning:     flags.Reasoning,
+			MaxWorkspaces: values.MaxWorkspaces,
+		})
+	}
+
+	if p.NoPrompt {
+		if err := validateConfigValues(values); err != nil {
+			return nil, false, err
+		}
+		if err := validateHarnessAvailable(strings.TrimSpace(values.Harness)); err != nil {
+			return nil, false, err
+		}
+		cfg := buildConfig(values)
+		if err := cfg.SaveTo(p.WritePath); err != nil {
+			return nil, false, fmt.Errorf("failed to save config: %w", err)
+		}
+		_ = harness.CanResolveCLI(cfg.Harness) // warm discovery
+		return cfg, true, nil
+	}
+
+	// Use resolved defaults to prefill the wizard.
+	h := values.Harness
+	model := values.Model
+	reasoning := values.Reasoning
+	numWorkspaces := values.MaxWorkspaces
+
+	// Interactive wizard (same flow as prior init).
+	firstStep := 0
+	available := 0
+	if codexAvailable {
+		available++
+	}
+	if claudeAvailable {
+		available++
+	}
+	if opencodeAvailable {
+		available++
+	}
+	if available <= 1 {
+		firstStep = 1 // skip harness selection
+	}
+
+	step := firstStep
+	for {
+		// Clear screen and show header + previous answers.
+		fmt.Print("\033[H\033[2J")
+		fmt.Println()
+		fmt.Println(" " + successStyle.Bold(true).Render("Subtask Config"))
+		fmt.Println(subtleStyle.Render(" Configure parallel workers"))
+		fmt.Println()
+
+		if step > 0 && firstStep == 0 {
+			fmt.Printf(" Harness: %s\n", h)
+		}
+		if step > 1 && model != "" {
+			fmt.Printf(" Model: %s\n", model)
+		}
+		if step > 2 && h == "codex" {
+			fmt.Printf(" Reasoning: %s\n", reasoning)
+		}
+		if step > firstStep {
+			fmt.Println()
+		}
+
+		var form *huh.Form
+		switch step {
+		case 0:
+			var opts []huh.Option[string]
+			if codexAvailable {
+				opts = append(opts, huh.NewOption("Codex (recommended)", "codex"))
+			}
+			if claudeAvailable {
+				opts = append(opts, huh.NewOption("Claude Code", "claude"))
+			}
+			if opencodeAvailable {
+				opts = append(opts, huh.NewOption("OpenCode", "opencode"))
+			}
+			form = huh.NewForm(huh.NewGroup(
+				huh.NewSelect[string]().
+					Title("Worker").
+					Description("Which CLI runs your tasks behind the scenes").
+					Options(opts...).
+					Value(&h),
+			))
+
+		case 1:
+			if h == "codex" {
+				opts := []huh.Option[string]{
+					huh.NewOption("gpt-5.2 (recommended)", "gpt-5.2"),
+					huh.NewOption("gpt-5.2-codex", "gpt-5.2-codex"),
+				}
+				form = huh.NewForm(huh.NewGroup(
+					huh.NewSelect[string]().
+						Title("Model").
+						Description("Default for workers. Change anytime with: subtask config").
+						Options(opts...).
+						Value(&model),
+				))
+			} else if h == "claude" {
+				opts := []huh.Option[string]{
+					huh.NewOption("Opus (recommended)", "opus"),
+					huh.NewOption("Sonnet", "sonnet"),
+				}
+				form = huh.NewForm(huh.NewGroup(
+					huh.NewSelect[string]().
+						Title("Model").
+						Description("Default for workers. Change anytime with: subtask config").
+						Options(opts...).
+						Value(&model),
+				))
+			} else {
+				form = huh.NewForm(huh.NewGroup(
+					huh.NewInput().
+						Title("Model (optional)").
+						Description("Default for workers. Leave blank for OpenCode defaults. Change anytime with: subtask config").
+						Placeholder("provider/model").
+						Value(&model),
+				))
+			}
+
+		case 2:
+			if h != "codex" {
+				step++
+				continue
+			}
+			form = huh.NewForm(huh.NewGroup(
+				huh.NewSelect[string]().
+					Title("Reasoning").
+					Description("Default for workers. Change anytime with: subtask config").
+					Options(
+						huh.NewOption("Extra High", "xhigh"),
+						huh.NewOption("High (recommended)", "high"),
+						huh.NewOption("Medium", "medium"),
+						huh.NewOption("Low", "low"),
+					).
+					Value(&reasoning),
+			))
+		}
+
+		if step > 2 {
+			break
+		}
+
+		km := huh.NewDefaultKeyMap()
+		km.Quit = key.NewBinding(key.WithKeys("esc", "ctrl+c"), key.WithHelp("esc", "back"))
+		km.Select.Filter = key.NewBinding(key.WithDisabled())
+		form = form.WithKeyMap(km).WithTheme(huh.ThemeCharm()).WithShowHelp(true)
+
+		err := form.Run()
+		if err == huh.ErrUserAborted {
+			if step == firstStep {
+				return nil, false, fmt.Errorf("config cancelled")
+			}
+			step--
+			if step == 2 && h != "codex" {
+				step--
+			}
+			continue
+		}
+		if err != nil {
+			return nil, false, err
+		}
+
+		// Reset dependent values when harness changes.
+		if step == 0 {
+			switch h {
+			case "codex":
+				model = "gpt-5.2"
+				reasoning = "high"
+			case "claude":
+				model = "opus"
+				reasoning = ""
+			default:
+				model = ""
+				reasoning = ""
+			}
+		}
+
+		step++
+	}
+
+	values = configValues{
+		Harness:       h,
+		Model:         model,
+		Reasoning:     reasoning,
+		MaxWorkspaces: numWorkspaces,
+	}
+
+	// Final validation - ensure selections are valid and harness is available.
+	if err := validateConfigValues(values); err != nil {
+		return nil, false, err
+	}
+	if err := validateHarnessAvailable(strings.TrimSpace(values.Harness)); err != nil {
+		return nil, false, err
+	}
+
+	cfg := buildConfig(values)
+	if err := cfg.SaveTo(p.WritePath); err != nil {
+		return nil, false, fmt.Errorf("failed to save config: %w", err)
+	}
+
+	// Warm harness discovery for better UX on first run.
+	_ = harness.CanResolveCLI(cfg.Harness)
+
+	return cfg, true, nil
+}
diff --git a/cmd/subtask/config_wizard_pure_test.go b/cmd/subtask/config_wizard_pure_test.go
new file mode 100644
index 0000000..452fb89
--- /dev/null
+++ b/cmd/subtask/config_wizard_pure_test.go
@@ -0,0 +1,90 @@
+package main
+
+import (
+	"testing"
+
+	"github.com/stretchr/testify/require"
+
+	"github.com/zippoxer/subtask/pkg/workspace"
+)
+
+func TestResolveConfigValues_Defaults(t *testing.T) {
+	values := resolveConfigValues(nil, configFlags{})
+	require.Equal(t, "codex", values.Harness)
+	require.Equal(t, "gpt-5.2", values.Model)
+	require.Equal(t, "high", values.Reasoning)
+	require.Equal(t, workspace.DefaultMaxWorkspaces, values.MaxWorkspaces)
+}
+
+func TestResolveConfigValues_ExistingClaude_DefaultsModel_DropsReasoning(t *testing.T) {
+	existing := &workspace.Config{
+		Harness:       "claude",
+		MaxWorkspaces: 7,
+		Options: map[string]any{
+			"reasoning": "high",
+		},
+	}
+	values := resolveConfigValues(existing, configFlags{})
+	require.Equal(t, "claude", values.Harness)
+	require.Equal(t, "opus", values.Model)
+	require.Empty(t, values.Reasoning)
+	require.Equal(t, 7, values.MaxWorkspaces)
+}
+
+func TestResolveConfigValues_FlagsHarnessOverride_ResetsDependentDefaults(t *testing.T) {
+	existing := &workspace.Config{
+		Harness: "codex",
+		Options: map[string]any{
+			"model":     "gpt-5.2-codex",
+			"reasoning": "xhigh",
+		},
+	}
+	values := resolveConfigValues(existing, configFlags{Harness: "claude"})
+	require.Equal(t, "claude", values.Harness)
+	require.Equal(t, "opus", values.Model)
require.Empty(t, values.Reasoning) +} + +func TestResolveConfigValues_FlagsOverrideModelAndReasoning(t *testing.T) { + values := resolveConfigValues(nil, configFlags{ + Harness: "codex", + Model: "gpt-5.2-codex", + Reasoning: "medium", + }) + require.Equal(t, "codex", values.Harness) + require.Equal(t, "gpt-5.2-codex", values.Model) + require.Equal(t, "medium", values.Reasoning) +} + +func TestValidateConfigValues_InvalidHarness(t *testing.T) { + err := validateConfigValues(configValues{Harness: "nope"}) + require.ErrorContains(t, err, "invalid harness") +} + +func TestValidateConfigValues_ReasoningCodexOnly(t *testing.T) { + err := validateConfigValues(configValues{Harness: "claude", Reasoning: "high"}) + require.ErrorContains(t, err, "codex-only") +} + +func TestValidateConfigValues_MaxWorkspacesNegative(t *testing.T) { + err := validateConfigValues(configValues{Harness: "codex", MaxWorkspaces: -1}) + require.ErrorContains(t, err, "max workspaces must be >= 0") +} + +func TestBuildConfig_UsesDefaultsAndOmitsEmptyOptions(t *testing.T) { + cfg := buildConfig(configValues{Harness: "codex", MaxWorkspaces: 0}) + require.Equal(t, "codex", cfg.Harness) + require.Equal(t, workspace.DefaultMaxWorkspaces, cfg.MaxWorkspaces) + require.Nil(t, cfg.Options) +} + +func TestBuildConfig_SetsOptions(t *testing.T) { + cfg := buildConfig(configValues{ + Harness: "codex", + Model: "gpt-5.2-codex", + Reasoning: "high", + }) + require.Equal(t, "codex", cfg.Harness) + require.Equal(t, "gpt-5.2-codex", cfg.Options["model"]) + require.Equal(t, "high", cfg.Options["reasoning"]) +} diff --git a/cmd/subtask/diff.go b/cmd/subtask/diff.go index e5bd6fc..84ea06c 100644 --- a/cmd/subtask/diff.go +++ b/cmd/subtask/diff.go @@ -3,6 +3,7 @@ package main import ( "fmt" "os" + "strings" "github.com/zippoxer/subtask/pkg/git" "github.com/zippoxer/subtask/pkg/task" @@ -18,6 +19,10 @@ type DiffCmd struct { // Run executes the diff command. 
 func (c *DiffCmd) Run() error {
+	if _, err := preflightProject(); err != nil {
+		return err
+	}
+
 	if err := migrate.EnsureSchema(c.Task); err != nil {
 		return err
 	}
@@ -33,16 +38,41 @@ func (c *DiffCmd) Run() error {
 	}
 	tail, _ := history.Tail(c.Task)
 
-	// Merged tasks: show the squash commit diff if available.
+	// Merged tasks with deleted branches:
+	// - For squash merges, show the squash commit diff.
+	// - For detected/no-op merges, show the recorded PR-style diff (base_commit..branch_head) if possible.
 	if tail.TaskStatus == task.TaskStatusMerged && !git.BranchExists(".", c.Task) {
-		sha := tail.LastMergedCommit
-		if sha == "" {
-			return fmt.Errorf("diff unavailable: task %s is merged and has no branch (missing merge commit)\n\nSend to reopen:\n subtask send %s \"\"", c.Task, c.Task)
+		mergedCommit := strings.TrimSpace(tail.LastMergedCommit)
+		mergedMethod := strings.TrimSpace(tail.LastMergedMethod)
+
+		// Legacy: older history didn't record the merge method; assume a squash commit when one is present.
+		if mergedCommit != "" && (mergedMethod == "" || mergedMethod == "squash") {
+			if c.Stat {
+				return git.RunWithStderrFilter(".", git.FilterLineEndingWarnings, "show", "--stat", "--format=", mergedCommit)
+			}
+			return git.RunWithStderrFilter(".", git.FilterLineEndingWarnings, "show", mergedCommit)
 		}
-		if c.Stat {
-			return git.RunWithStderrFilter(".", git.FilterLineEndingWarnings, "show", "--stat", "--format=", sha)
+
+		baseCommit := strings.TrimSpace(tail.LastMergedBaseCommit)
+		branchHead := strings.TrimSpace(tail.LastMergedBranchHead)
+		if baseCommit != "" && branchHead != "" {
+			args := []string{"diff"}
+			if c.Stat {
+				args = append(args, "--stat")
+			}
+			args = append(args, baseCommit+".."+branchHead)
+			return git.RunWithStderrFilter(".", git.FilterLineEndingWarnings, args...)
+		}
+
+		// Fallback: if we do have a commit, show it; otherwise report the diff as unavailable.
+ if mergedCommit != "" { + if c.Stat { + return git.RunWithStderrFilter(".", git.FilterLineEndingWarnings, "show", "--stat", "--format=", mergedCommit) + } + return git.RunWithStderrFilter(".", git.FilterLineEndingWarnings, "show", mergedCommit) } - return git.RunWithStderrFilter(".", git.FilterLineEndingWarnings, "show", sha) + + return fmt.Errorf("diff unavailable: task %s is merged and has no branch\n\nSend to reopen:\n subtask send %s \"\"", c.Task, c.Task) } // Prefer diffing from the task workspace when available (includes uncommitted changes). diff --git a/cmd/subtask/draft.go b/cmd/subtask/draft.go index 96eb85a..dd157e9 100644 --- a/cmd/subtask/draft.go +++ b/cmd/subtask/draft.go @@ -13,6 +13,7 @@ import ( "github.com/zippoxer/subtask/pkg/render" "github.com/zippoxer/subtask/pkg/task" "github.com/zippoxer/subtask/pkg/task/history" + "github.com/zippoxer/subtask/pkg/task/migrate/gitredesign" "github.com/zippoxer/subtask/pkg/workflow" "github.com/zippoxer/subtask/pkg/workspace" ) @@ -42,6 +43,11 @@ func (c *DraftCmd) Run() error { "Provide description as argument or via stdin (heredoc/pipe)") } + // Requirements: git + global config (config may be migrated on first access). + if _, err := preflightProject(); err != nil { + return err + } + // Check if task already exists if _, err := task.Load(c.Task); err == nil { return fmt.Errorf("task %q already exists", c.Task) @@ -73,7 +79,7 @@ func (c *DraftCmd) Run() error { FollowUp: c.FollowUp, Model: c.Model, Reasoning: c.Reasoning, - Schema: 1, + Schema: gitredesign.TaskSchemaVersion, } if err := t.Save(); err != nil { diff --git a/cmd/subtask/gitignore.go b/cmd/subtask/gitignore.go new file mode 100644 index 0000000..d31b661 --- /dev/null +++ b/cmd/subtask/gitignore.go @@ -0,0 +1,39 @@ +package main + +import ( + "os" + "path/filepath" + + "github.com/zippoxer/subtask/pkg/git" +) + +// ensureGitignore adds /.subtask/ to .gitignore if not already ignored. 
+func ensureGitignore(repoRoot string) error { + // Use git check-ignore to see if already ignored (handles all gitignore semantics). + subtaskDir := filepath.Join(repoRoot, ".subtask") + if err := git.RunQuiet(repoRoot, "check-ignore", "-q", subtaskDir); err == nil { + return nil // Already ignored. + } + + // Append to .gitignore. + gitignorePath := filepath.Join(repoRoot, ".gitignore") + pattern := "/.subtask/" + + // Read existing content to check if we need a leading newline. + content, _ := os.ReadFile(gitignorePath) + + f, err := os.OpenFile(gitignorePath, os.O_APPEND|os.O_CREATE|os.O_WRONLY, 0o644) + if err != nil { + return err + } + defer f.Close() + + if len(content) > 0 && content[len(content)-1] != '\n' { + if _, err := f.WriteString("\n"); err != nil { + return err + } + } + _, err = f.WriteString(pattern + "\n") + return err +} + diff --git a/cmd/subtask/golden_cli_test.go b/cmd/subtask/golden_cli_test.go index d8a98fa..cdcbf08 100644 --- a/cmd/subtask/golden_cli_test.go +++ b/cmd/subtask/golden_cli_test.go @@ -16,6 +16,7 @@ import ( "github.com/zippoxer/subtask/pkg/render" "github.com/zippoxer/subtask/pkg/task" "github.com/zippoxer/subtask/pkg/task/history" + "github.com/zippoxer/subtask/pkg/task/migrate/gitredesign" "github.com/zippoxer/subtask/pkg/testutil" "github.com/zippoxer/subtask/pkg/workflow" ) @@ -267,6 +268,8 @@ func TestGolden_List_SingleTask(t *testing.T) { {"step":"Update snapshots","done":false} ]`) overwriteWorkspaceReadme(t, env.Workspaces[0], "# Test Repo\nline one\nline two\n") + gitCmd(t, env.Workspaces[0], "add", "README.md") + gitCmd(t, env.Workspaces[0], "commit", "-m", "Update README") for _, pretty := range []bool{false, true} { t.Run(modeName(pretty), func(t *testing.T) { @@ -315,6 +318,8 @@ func TestGolden_List_MultiStatus(t *testing.T) { {"step":"Fix","done":false} ]`) overwriteWorkspaceReadme(t, env.Workspaces[0], "# Test Repo\nworking change\n") + gitCmd(t, env.Workspaces[0], "add", "README.md") + gitCmd(t, 
env.Workspaces[0], "commit", "-m", "Worker change") // c/replied (with context) env.CreateTask("c/replied", "Replied task", "main", "Replied description") @@ -342,10 +347,13 @@ func TestGolden_List_MultiStatus(t *testing.T) { {"step":"Review","done":false} ]`) overwriteWorkspaceReadme(t, env.Workspaces[1], "one\ntwo\nthree\n") + gitCmd(t, env.Workspaces[1], "add", "README.md") + gitCmd(t, env.Workspaces[1], "commit", "-m", "Replied changes") // d/error env.CreateTask("d/error", "Error task", "main", "Error description") env.CreateTaskState("d/error", &task.State{ + Workspace: env.Workspaces[3], LastError: "something went wrong", }) env.CreateTaskHistory("d/error", []history.Event{ @@ -387,7 +395,7 @@ func TestGolden_Show_Draft(t *testing.T) { Title: "Draft task", BaseBranch: "main", Description: "Draft description", - Schema: 1, + Schema: gitredesign.TaskSchemaVersion, }).Save()) require.NoError(t, workflow.CopyToTask("default", taskName)) require.NoError(t, history.WriteAll(taskName, mustHistoryOpen(t, "main"))) @@ -412,6 +420,8 @@ func TestGolden_Show_RepliedWithProgressAndDiff(t *testing.T) { // Use a stable, short workspace path so pretty output box widths are deterministic. gitCmd(t, env.RootDir, "worktree", "add", "--detach", "ws1") overwriteWorkspaceReadme(t, "ws1", "# Test Repo\none\ntwo\nthree\nfour\n") + gitCmd(t, "ws1", "add", "README.md") + gitCmd(t, "ws1", "commit", "-m", "Replied changes") taskName := "show/replied" env.CreateTask(taskName, "Replied task", "main", "Replied description") diff --git a/cmd/subtask/harness_available.go b/cmd/subtask/harness_available.go new file mode 100644 index 0000000..c9aca6c --- /dev/null +++ b/cmd/subtask/harness_available.go @@ -0,0 +1,9 @@ +package main + +import "github.com/zippoxer/subtask/pkg/harness" + +// isCommandAvailable checks if a command is likely runnable on this machine. 
+func isCommandAvailable(name string) bool { + return harness.CanResolveCLI(name) +} + diff --git a/cmd/subtask/init.go b/cmd/subtask/init.go deleted file mode 100644 index 357b068..0000000 --- a/cmd/subtask/init.go +++ /dev/null @@ -1,394 +0,0 @@ -package main - -import ( - "encoding/json" - "fmt" - "os" - "path/filepath" - - "github.com/charmbracelet/bubbles/key" - "github.com/charmbracelet/huh" - "github.com/charmbracelet/lipgloss" - "github.com/zippoxer/subtask/pkg/git" - "github.com/zippoxer/subtask/pkg/harness" - "github.com/zippoxer/subtask/pkg/workspace" -) - -// InitCmd implements 'subtask init'. -type InitCmd struct { - Workspaces int `short:"n" default:"20" help:"Maximum number of workspaces (created on demand)"` - Harness string `default:"codex" help:"Worker harness (codex|claude|opencode)"` - Force bool `short:"f" help:"Force re-init, overwriting existing config"` -} - -var ( - successStyle = lipgloss.NewStyle(). - Foreground(lipgloss.Color("10")) - - subtleStyle = lipgloss.NewStyle(). - Foreground(lipgloss.Color("247")) -) - -// Run executes the init command. -func (c *InitCmd) Run() error { - // Get current directory as project root - cwd, err := os.Getwd() - if err != nil { - return err - } - - // For init, always check/create in cwd, don't search ancestors - localSubtaskDir := filepath.Join(cwd, ".subtask") - localConfigPath := filepath.Join(localSubtaskDir, "config.json") - - // Check if already initialized in this directory - if _, err := os.Stat(localConfigPath); err == nil && !c.Force { - return fmt.Errorf("already initialized\n\nConfig exists: %s\nUse --force to reinitialize", localConfigPath) - } - - insideWorkTree, err := git.Output(cwd, "rev-parse", "--is-inside-work-tree") - if err != nil || insideWorkTree != "true" { - return fmt.Errorf("Not a git repository. 
Run 'git init' first or cd to an existing repo.") - } - - // Check which harnesses are available - codexAvailable := isCommandAvailable("codex") - claudeAvailable := isCommandAvailable("claude") - opencodeAvailable := isCommandAvailable("opencode") - - if !codexAvailable && !claudeAvailable && !opencodeAvailable { - return fmt.Errorf("no worker harness available\n\nInstall one of:\n - Codex CLI: https://github.com/openai/codex\n - Claude Code CLI: https://claude.com/claude-code\n - OpenCode CLI: https://github.com/anomalyco/opencode") - } - - // With --force, confirm before overwriting config. - if c.Force { - if _, err := os.Stat(localConfigPath); err == nil { - fmt.Println() - fmt.Println(lipgloss.NewStyle().Foreground(lipgloss.Color("11")).Render(" ⚠ Warning")) - fmt.Println(" • existing config will be overwritten") - fmt.Println() - - var confirm bool - confirmForm := huh.NewForm(huh.NewGroup( - huh.NewConfirm(). - Title("Continue with --force?"). - Description("This cannot be undone"). 
- Value(&confirm), - )).WithTheme(huh.ThemeCharm()) - - if err := confirmForm.Run(); err != nil || !confirm { - return fmt.Errorf("cancelled") - } - } - } - - // Form values - numWorkspaces := c.Workspaces - - // Validate harness is a supported value - validHarnesses := map[string]bool{"codex": true, "claude": true, "opencode": true} - if !validHarnesses[c.Harness] { - return fmt.Errorf("invalid harness %q\n\nSupported harnesses: codex, claude, opencode", c.Harness) - } - - harness := c.Harness - // Fall back if requested harness isn't available - if !isCommandAvailable(harness) { - if codexAvailable { - harness = "codex" - } else if claudeAvailable { - harness = "claude" - } else { - harness = "opencode" - } - } - model := "gpt-5.2" - if harness == "claude" { - model = "claude-opus-4-5-20251101" - } - if harness == "opencode" { - model = "" - } - reasoning := "xhigh" - if harness != "codex" { - reasoning = "" - } - - // Determine steps based on what's available - // Steps: 0=harness (if multiple), 1=model, 2=reasoning (if codex), 3=workspaces - firstStep := 0 - available := 0 - if codexAvailable { - available++ - } - if claudeAvailable { - available++ - } - if opencodeAvailable { - available++ - } - if available <= 1 { - firstStep = 1 // skip harness selection - } - - step := firstStep - for { - // Clear screen and show header + previous answers - fmt.Print("\033[H\033[2J") - fmt.Println() - fmt.Println(" " + lipgloss.NewStyle().Bold(true).Render("Subtask Setup")) - fmt.Println(subtleStyle.Render(" Configure parallel workers for your project")) - fmt.Println() - - // Show answered questions above current one - if step > 0 && firstStep == 0 { - fmt.Printf(" Harness: %s\n", harness) - } - if step > 1 && model != "" { - fmt.Printf(" Model: %s\n", model) - } - if step > 2 && harness == "codex" { - fmt.Printf(" Reasoning: %s\n", reasoning) - } - if step > firstStep { - fmt.Println() - } - - // Determine current question - var form *huh.Form - switch step { - case 0: // 
Harness - var opts []huh.Option[string] - if codexAvailable { - opts = append(opts, huh.NewOption("Codex (recommended)", "codex")) - } - if claudeAvailable { - opts = append(opts, huh.NewOption("Claude Code", "claude")) - } - if opencodeAvailable { - opts = append(opts, huh.NewOption("OpenCode", "opencode")) - } - form = huh.NewForm(huh.NewGroup( - huh.NewSelect[string](). - Title("Worker"). - Description("Which CLI runs your tasks behind the scenes"). - Options(opts...). - Value(&harness), - )) - - case 1: // Model (options depend on harness) - if harness == "codex" { - opts := []huh.Option[string]{ - huh.NewOption("gpt-5.2 (recommended)", "gpt-5.2"), - huh.NewOption("gpt-5.2-codex", "gpt-5.2-codex"), - } - form = huh.NewForm(huh.NewGroup( - huh.NewSelect[string](). - Title("Model"). - Options(opts...). - Value(&model), - )) - } else if harness == "claude" { - opts := []huh.Option[string]{ - huh.NewOption("Claude Opus (recommended)", "claude-opus-4-5-20251101"), - huh.NewOption("Claude Sonnet", "claude-sonnet-4-20250514"), - } - form = huh.NewForm(huh.NewGroup( - huh.NewSelect[string](). - Title("Model"). - Options(opts...). - Value(&model), - )) - } else { - form = huh.NewForm(huh.NewGroup( - huh.NewInput(). - Title("Model (optional)"). - Description("Leave blank to use OpenCode defaults; use provider/model to override."). - Placeholder("provider/model"). - Value(&model), - )) - } - - case 2: // Reasoning (Codex only) - if harness != "codex" { - step++ - continue - } - form = huh.NewForm(huh.NewGroup( - huh.NewSelect[string](). - Title("Reasoning"). - Options( - huh.NewOption("Extra High (recommended)", "xhigh"), - huh.NewOption("High", "high"), - huh.NewOption("Medium", "medium"), - huh.NewOption("Low", "low"), - ). - Value(&reasoning), - )) - - case 3: // Workspaces - form = huh.NewForm(huh.NewGroup( - huh.NewSelect[int](). - Title("Max workspaces"). 
- Options( - huh.NewOption("5", 5), - huh.NewOption("10", 10), - huh.NewOption("20 (recommended)", 20), - huh.NewOption("50", 50), - ). - Value(&numWorkspaces), - )) - } - - if step > 3 { - break - } - - // Configure form - esc/ctrl+c trigger abort, we catch it to go back (or cancel on first) - km := huh.NewDefaultKeyMap() - km.Quit = key.NewBinding(key.WithKeys("esc", "ctrl+c"), key.WithHelp("esc", "back")) - km.Select.Filter = key.NewBinding(key.WithDisabled()) // disable "/" filter - form = form.WithKeyMap(km).WithTheme(huh.ThemeCharm()).WithShowHelp(true) - - err := form.Run() - if err == huh.ErrUserAborted { - if step == firstStep { - return fmt.Errorf("setup cancelled") - } - // Go back - step-- - if step == 2 && harness != "codex" { - step-- // skip reasoning when going back for claude - } - continue - } - if err != nil { - break // non-interactive, use defaults - } - - // Reset dependent values when harness changes - if step == 0 { - if harness == "codex" { - model = "gpt-5.2" - reasoning = "xhigh" - } else if harness == "claude" { - model = "claude-opus-4-5-20251101" - reasoning = "" - } else { - model = "" - reasoning = "" - } - } - - step++ - } - - // Final validation - ensure selected harness is available - if harness == "codex" && !codexAvailable { - return fmt.Errorf("codex CLI not found\n\nInstall it from: https://github.com/openai/codex") - } - if harness == "claude" && !claudeAvailable { - return fmt.Errorf("claude CLI not found\n\nInstall it from: https://claude.com/claude-code") - } - if harness == "opencode" && !opencodeAvailable { - return fmt.Errorf("opencode CLI not found\n\nInstall it from: https://github.com/anomalyco/opencode") - } - - // Create config (worktrees are created on demand). 
- cfg := &workspace.Config{ - Harness: harness, - MaxWorkspaces: numWorkspaces, - Options: make(map[string]any), - } - - // Add harness-specific options - if model != "" { - cfg.Options["model"] = model - } - if reasoning != "" { - cfg.Options["reasoning"] = reasoning - } - - // Save config to local directory (not ancestor) - if err := os.MkdirAll(localSubtaskDir, 0755); err != nil { - return fmt.Errorf("failed to create .subtask directory: %w", err) - } - if cfg.MaxWorkspaces <= 0 { - cfg.MaxWorkspaces = workspace.DefaultMaxWorkspaces - } - data, err := json.MarshalIndent(cfg, "", " ") - if err != nil { - return fmt.Errorf("failed to marshal config: %w", err) - } - if err := os.WriteFile(localConfigPath, data, 0644); err != nil { - return fmt.Errorf("failed to save config: %w", err) - } - - // Add .subtask to .gitignore if not already present - gitignoreAdded := false - if err := ensureGitignore(cwd); err != nil { - printWarning(fmt.Sprintf("failed to update .gitignore: %v", err)) - } else { - gitignoreAdded = true - } - - // Summary - fmt.Println() - fmt.Println(successStyle.Render(" ✓ Setup complete")) - fmt.Println() - fmt.Printf(" %s %s\n", subtleStyle.Render("Harness:"), harness) - if model != "" { - fmt.Printf(" %s %s\n", subtleStyle.Render("Model:"), model) - } - if reasoning != "" { - fmt.Printf(" %s %s\n", subtleStyle.Render("Reasoning:"), reasoning) - } - fmt.Printf(" %s %d\n", subtleStyle.Render("Max workspaces:"), numWorkspaces) - fmt.Printf(" %s %s\n", subtleStyle.Render("Config:"), localConfigPath) - if gitignoreAdded { - fmt.Printf(" %s added to .gitignore\n", subtleStyle.Render("/.subtask/")) - } - fmt.Println() - - return nil -} - -// isCommandAvailable checks if a command is likely runnable on this machine. -func isCommandAvailable(name string) bool { - return harness.CanResolveCLI(name) -} - -// ensureGitignore adds /.subtask/ to .gitignore if not already ignored. 
-func ensureGitignore(repoRoot string) error { - // Use git check-ignore to see if already ignored (handles all gitignore semantics) - subtaskDir := filepath.Join(repoRoot, ".subtask") - if err := git.RunQuiet(repoRoot, "check-ignore", "-q", subtaskDir); err == nil { - return nil // Already ignored - } - - // Append to .gitignore - gitignorePath := filepath.Join(repoRoot, ".gitignore") - pattern := "/.subtask/" - - // Read existing content to check if we need a leading newline - content, _ := os.ReadFile(gitignorePath) - - f, err := os.OpenFile(gitignorePath, os.O_APPEND|os.O_CREATE|os.O_WRONLY, 0644) - if err != nil { - return err - } - defer f.Close() - - // Add newline before if file exists and doesn't end with newline - if len(content) > 0 && content[len(content)-1] != '\n' { - if _, err := f.WriteString("\n"); err != nil { - return err - } - } - - if _, err := f.WriteString(pattern + "\n"); err != nil { - return err - } - - return nil -} diff --git a/cmd/subtask/init_git_repo_test.go b/cmd/subtask/init_git_repo_test.go deleted file mode 100644 index 8064703..0000000 --- a/cmd/subtask/init_git_repo_test.go +++ /dev/null @@ -1,261 +0,0 @@ -package main - -import ( - "encoding/json" - "os" - "os/exec" - "path/filepath" - "runtime" - "testing" - - "github.com/stretchr/testify/require" - "github.com/zippoxer/subtask/pkg/workspace" -) - -func writeFakeCLI(t *testing.T, dir string, name string) string { - t.Helper() - - if runtime.GOOS == "windows" { - path := filepath.Join(dir, name+".bat") - require.NoError(t, os.WriteFile(path, []byte("@echo off\r\nexit /B 0\r\n"), 0o644)) - return path - } - - path := filepath.Join(dir, name) - require.NoError(t, os.WriteFile(path, []byte("#!/bin/sh\nexit 0\n"), 0o755)) - return path -} - -func TestInit_FailsOutsideGitRepo(t *testing.T) { - tmpDir := t.TempDir() - - origCwd, err := os.Getwd() - require.NoError(t, err) - require.NoError(t, os.Chdir(tmpDir)) - t.Cleanup(func() { _ = os.Chdir(origCwd) }) - - // Ensure at least one 
harness is "available" regardless of the test machine. - binDir := filepath.Join(tmpDir, "bin") - require.NoError(t, os.MkdirAll(binDir, 0o755)) - _ = writeFakeCLI(t, binDir, "codex") - - origPath := os.Getenv("PATH") - t.Setenv("PATH", binDir+string(os.PathListSeparator)+origPath) - - // Avoid hanging on interactive prompts in the current implementation. - origStdin := os.Stdin - devNull, err := os.Open(os.DevNull) - require.NoError(t, err) - os.Stdin = devNull - t.Cleanup(func() { - os.Stdin = origStdin - _ = devNull.Close() - }) - - _, _, runErr := captureStdoutStderr(t, (&InitCmd{}).Run) - require.EqualError(t, runErr, "Not a git repository. Run 'git init' first or cd to an existing repo.") -} - -// initGitRepo initializes a git repo in dir with an initial commit. -func initGitRepo(t *testing.T, dir string) { - t.Helper() - run := func(args ...string) { - cmd := exec.Command("git", args...) - cmd.Dir = dir - if err := cmd.Run(); err != nil { - t.Fatalf("git %v failed: %v", args, err) - } - } - run("init") - run("config", "user.email", "test@test.com") - run("config", "user.name", "Test User") - readme := filepath.Join(dir, "README.md") - require.NoError(t, os.WriteFile(readme, []byte("# Test\n"), 0o644)) - run("add", ".") - run("commit", "-m", "Initial commit") -} - -// setupInitTest creates an isolated test environment for init tests. -// Returns the temp dir and a cleanup function. 
-func setupInitTest(t *testing.T, harnesses ...string) string { - t.Helper() - - tmpDir := t.TempDir() - initGitRepo(t, tmpDir) - - // Create fake CLI binaries for specified harnesses - binDir := filepath.Join(tmpDir, "bin") - require.NoError(t, os.MkdirAll(binDir, 0o755)) - for _, h := range harnesses { - writeFakeCLI(t, binDir, h) - } - - origCwd, err := os.Getwd() - require.NoError(t, err) - require.NoError(t, os.Chdir(tmpDir)) - - origPath := os.Getenv("PATH") - t.Setenv("PATH", binDir+string(os.PathListSeparator)+origPath) - - // Avoid hanging on interactive prompts - origStdin := os.Stdin - devNull, err := os.Open(os.DevNull) - require.NoError(t, err) - os.Stdin = devNull - - t.Cleanup(func() { - os.Stdin = origStdin - _ = devNull.Close() - _ = os.Chdir(origCwd) - }) - - return tmpDir -} - -// TestInit_RespectsHarnessFlag verifies that --harness flag sets the correct harness in config. -func TestInit_RespectsHarnessFlag(t *testing.T) { - tmpDir := setupInitTest(t, "codex", "claude") - - // Run init with --harness claude - cmd := &InitCmd{Harness: "claude", Workspaces: 10} - _, _, err := captureStdoutStderr(t, cmd.Run) - require.NoError(t, err) - - // Verify config was created with claude harness - configPath := filepath.Join(tmpDir, ".subtask", "config.json") - data, err := os.ReadFile(configPath) - require.NoError(t, err) - - var cfg workspace.Config - require.NoError(t, json.Unmarshal(data, &cfg)) - require.Equal(t, "claude", cfg.Harness) - require.Equal(t, 10, cfg.MaxWorkspaces) -} - -// TestInit_CreatesConfigInCwd verifies that init creates config in cwd, -// not in an ancestor directory that might have .subtask. 
-func TestInit_CreatesConfigInCwd(t *testing.T) { - // Create a parent directory with .subtask (simulating an existing project) - parentDir := t.TempDir() - initGitRepo(t, parentDir) - parentSubtask := filepath.Join(parentDir, ".subtask") - require.NoError(t, os.MkdirAll(parentSubtask, 0o755)) - require.NoError(t, os.WriteFile( - filepath.Join(parentSubtask, "config.json"), - []byte(`{"harness":"codex","max_workspaces":20}`), - 0o644, - )) - - // Create a child directory (new project) - also needs to be a git repo - childDir := filepath.Join(parentDir, "subproject") - require.NoError(t, os.MkdirAll(childDir, 0o755)) - initGitRepo(t, childDir) - - // Set up test environment in child dir - binDir := filepath.Join(childDir, "bin") - require.NoError(t, os.MkdirAll(binDir, 0o755)) - writeFakeCLI(t, binDir, "codex") - - origCwd, _ := os.Getwd() - require.NoError(t, os.Chdir(childDir)) - - origPath := os.Getenv("PATH") - t.Setenv("PATH", binDir+string(os.PathListSeparator)+origPath) - - origStdin := os.Stdin - devNull, _ := os.Open(os.DevNull) - os.Stdin = devNull - t.Cleanup(func() { - os.Stdin = origStdin - _ = devNull.Close() - _ = os.Chdir(origCwd) - }) - - // Run init in child directory - cmd := &InitCmd{Harness: "codex", Workspaces: 5} - _, _, err := captureStdoutStderr(t, cmd.Run) - require.NoError(t, err) - - // Verify config was created in child, not parent - childConfig := filepath.Join(childDir, ".subtask", "config.json") - require.FileExists(t, childConfig) - - data, err := os.ReadFile(childConfig) - require.NoError(t, err) - var cfg workspace.Config - require.NoError(t, json.Unmarshal(data, &cfg)) - require.Equal(t, 5, cfg.MaxWorkspaces) // Our value, not parent's 20 - - // Parent config should be unchanged - parentData, err := os.ReadFile(filepath.Join(parentSubtask, "config.json")) - require.NoError(t, err) - var parentCfg workspace.Config - require.NoError(t, json.Unmarshal(parentData, &parentCfg)) - require.Equal(t, 20, parentCfg.MaxWorkspaces) // Still 
20 -} - -// TestInit_RejectsInvalidHarness verifies that --harness with an unsupported -// value is rejected, even if that command exists on PATH. -func TestInit_RejectsInvalidHarness(t *testing.T) { - // Set up with a fake "invalid" CLI that exists on PATH - tmpDir := setupInitTest(t, "codex", "invalid-harness") - - // Run init with invalid harness - cmd := &InitCmd{Harness: "invalid-harness", Workspaces: 10} - _, _, err := captureStdoutStderr(t, cmd.Run) - require.Error(t, err) - require.Contains(t, err.Error(), "invalid harness") - require.Contains(t, err.Error(), "Supported harnesses: codex, claude, opencode") - - // Verify no config was created - configPath := filepath.Join(tmpDir, ".subtask", "config.json") - require.NoFileExists(t, configPath) -} - -// TestInit_DoesNotUseGlobalSubtask verifies that init doesn't detect -// ~/.subtask (global dir without config.json) as an existing project. -func TestInit_DoesNotUseGlobalSubtask(t *testing.T) { - // Create a fake home with global .subtask (no config.json) - fakeHome := t.TempDir() - globalSubtask := filepath.Join(fakeHome, ".subtask") - require.NoError(t, os.MkdirAll(filepath.Join(globalSubtask, "workspaces"), 0o755)) - // Intentionally NO config.json - - // Create a project directory under fake home - projectDir := filepath.Join(fakeHome, "code", "myproject") - require.NoError(t, os.MkdirAll(projectDir, 0o755)) - initGitRepo(t, projectDir) - - // Set up test environment - binDir := filepath.Join(projectDir, "bin") - require.NoError(t, os.MkdirAll(binDir, 0o755)) - writeFakeCLI(t, binDir, "codex") - - origCwd, _ := os.Getwd() - require.NoError(t, os.Chdir(projectDir)) - - origPath := os.Getenv("PATH") - t.Setenv("PATH", binDir+string(os.PathListSeparator)+origPath) - - origStdin := os.Stdin - devNull, _ := os.Open(os.DevNull) - os.Stdin = devNull - t.Cleanup(func() { - os.Stdin = origStdin - _ = devNull.Close() - _ = os.Chdir(origCwd) - }) - - // Run init - should succeed (not see global .subtask as existing 
project) - cmd := &InitCmd{Harness: "codex", Workspaces: 10} - _, _, err := captureStdoutStderr(t, cmd.Run) - require.NoError(t, err) - - // Verify config was created in project dir - projectConfig := filepath.Join(projectDir, ".subtask", "config.json") - require.FileExists(t, projectConfig) - - // Global dir should still not have config.json - globalConfig := filepath.Join(globalSubtask, "config.json") - require.NoFileExists(t, globalConfig) -} diff --git a/cmd/subtask/install.go b/cmd/subtask/install.go index 960cb85..2f39364 100644 --- a/cmd/subtask/install.go +++ b/cmd/subtask/install.go @@ -2,188 +2,244 @@ package main import ( "fmt" + "os" + "text/template" "github.com/charmbracelet/bubbles/key" "github.com/charmbracelet/huh" + "github.com/zippoxer/subtask/internal/homedir" "github.com/zippoxer/subtask/pkg/install" + "github.com/zippoxer/subtask/pkg/task" ) // InstallCmd implements 'subtask install'. type InstallCmd struct { - Skill bool `help:"Install only the skill"` - Plugin bool `help:"Install only the plugin"` - Scope string `default:"user" enum:"user,project" help:"Installation scope"` - NoPrompt bool `help:"Non-interactive; use defaults"` + Guide bool `help:"Print setup guidance and exit"` + NoPrompt bool `help:"Non-interactive; use defaults"` + Scope string `help:"Skill scope: 'user' or 'project'" placeholder:"SCOPE"` + Harness string `help:"Worker harness: 'codex', 'claude', or 'opencode'" placeholder:"HARNESS"` + Model string `help:"Default model for workers" placeholder:"MODEL"` + Reasoning string `help:"Reasoning level for Codex: 'low', 'medium', 'high', 'xhigh'" placeholder:"LEVEL"` + MaxWorkspaces int `help:"Max parallel git worktrees per repo (default 20)" placeholder:"N"` } func (c *InstallCmd) Run() error { - scope, err := parseInstallScope(c.Scope) + if c.Guide { + printSetupGuide() + return nil + } + + homeDir, err := homedir.Dir() if err != nil { return err } - installSkill := c.Skill - installPlugin := c.Plugin - if !c.Skill && !c.Plugin 
{ - installSkill = true - installPlugin = true + once, err := install.RunLegacyClaudePluginMigrationOnce(homeDir) + if err != nil { + return err + } + if once.Ran && once.Migration.SkippedSettingsMalformed { + printWarning(fmt.Sprintf("Skipped legacy settings cleanup (malformed JSON at %s)", abbreviatePath(once.Migration.SettingsPath))) + } + if once.Ran && (once.Migration.RemovedLegacyPluginDir || once.Migration.RemovedLegacySettingsKey) { + printSuccess("Removed legacy Claude plugin install artifacts") } - doInit := false - if !c.NoPrompt && !c.Skill && !c.Plugin { - installSkill = true - installPlugin = true - scope = install.ScopeUser - if c.Scope != "" { - if s, err := parseInstallScope(c.Scope); err == nil { - scope = s - } - } - - baseDir, inGit, err := baseDirForScope(scope) - if err != nil { - return err - } - - // Enter alternate screen buffer (preserves terminal history) - fmt.Print("\033[?1049h") - - step := 0 - for { - // Clear screen and show progress - fmt.Print("\033[H\033[2J") - fmt.Println() - fmt.Println(" Install Subtask skill and Claude plugin") - fmt.Println() - if step > 0 { - fmt.Printf(" Skill: %s\n", yesNo(installSkill)) - } - if step > 1 { - fmt.Printf(" Plugin: %s\n", yesNo(installPlugin)) - } - if step > 2 { - fmt.Printf(" Scope: %s\n", scope) - } - if step > 3 && inGit { - fmt.Printf(" Init: %s\n", yesNo(doInit)) - } - if step > 0 { - fmt.Println() - } - - var form *huh.Form - switch step { - case 0: - form = huh.NewForm(huh.NewGroup( - huh.NewConfirm(). - Title("Install skill?"). - Value(&installSkill), - )) - case 1: - form = huh.NewForm(huh.NewGroup( - huh.NewConfirm(). - Title("Install plugin?"). - Value(&installPlugin), - )) - case 2: - form = huh.NewForm(huh.NewGroup( - huh.NewSelect[install.Scope](). - Title("Scope"). - Options( - huh.NewOption("User (recommended)", install.ScopeUser), - huh.NewOption("Project", install.ScopeProject), - ). 
- Value(&scope), - )) - case 3: - if !inGit { - step++ - continue - } - form = huh.NewForm(huh.NewGroup( - huh.NewConfirm(). - Title("Initialize subtask for this repo?"). - Description("Creates .subtask/config.json with defaults"). - Value(&doInit), - )) - default: - goto done - } - - km := huh.NewDefaultKeyMap() - km.Quit = key.NewBinding(key.WithKeys("esc", "ctrl+c"), key.WithHelp("esc", "back")) - km.Select.Filter = key.NewBinding(key.WithDisabled()) - form = form.WithKeyMap(km).WithTheme(huh.ThemeCharm()).WithShowHelp(true) - - if err := form.Run(); err == huh.ErrUserAborted { - if step == 0 { - fmt.Print("\033[?1049l") // exit alternate buffer - return fmt.Errorf("install cancelled") - } - step-- - continue - } else if err != nil { - // Non-interactive; keep defaults and continue without prompting. - break - } - - // Recompute "inGit" if scope changes to project. - if step == 2 { - baseDir, inGit, err = baseDirForScope(scope) - if err != nil { - fmt.Print("\033[?1049l") // exit alternate buffer - return err - } - _ = baseDir + // Determine scope - from flag, interactive, or default. + // Project scope only makes sense inside a git repository. + inGitRepo := isInGitRepo() + scope := c.Scope + if scope != "" && scope != "user" && scope != "project" { + return fmt.Errorf("--scope must be 'user' or 'project', got %q", scope) + } + if scope == "project" && !inGitRepo { + return fmt.Errorf("--scope=project requires being in a git repository") + } + if scope == "" { + if c.NoPrompt || !inGitRepo { + scope = "user" + } else { + var err error + scope, err = runScopeWizard() + if err != nil { + return err } - - step++ } - done: - // Exit alternate screen buffer - fmt.Print("\033[?1049l") } - baseDir, inGit, err := baseDirForScope(scope) + // Install skill to appropriate location. 
+ var skillPath string + var updated bool + if scope == "project" { + repoRoot := task.ProjectRoot() + skillPath, updated, err = install.InstallToProject(repoRoot) + } else { + skillPath, updated, err = install.InstallTo(homeDir) + } if err != nil { return err } + if updated { + printSuccess(fmt.Sprintf("Installed skill to %s", abbreviatePath(skillPath))) + } else { + printSuccess(fmt.Sprintf("Skill already up to date at %s", abbreviatePath(skillPath))) + } - if doInit && scope == install.ScopeProject && inGit { - if err := initSubtaskDefaults(baseDir); err != nil { + // If not configured yet, run the config wizard and write ~/.subtask/config.json. + if _, err := os.Stat(task.ConfigPath()); os.IsNotExist(err) { + cfg, _, err := runConfigWizard(configWizardParams{ + WritePath: task.ConfigPath(), + Existing: readConfigFileOrNil(task.ConfigPath()), + NoPrompt: c.NoPrompt, + Harness: c.Harness, + Model: c.Model, + Reasoning: c.Reasoning, + MaxWorkspaces: c.MaxWorkspaces, + }) + if err != nil { return err } - printSuccess("Initialized subtask for this repo") + if cfg != nil { + printSuccess("Configured subtask") + printConfigDetails(cfg, "user", task.ConfigPath()) + } + } else if !updated { + // Skill was already up to date and config exists - let user know how to reconfigure. + fmt.Println() + fmt.Println("Subtask is already installed. 
To change configuration:") + fmt.Println(" subtask config # edit global defaults") + fmt.Println(" subtask config --project # edit project overrides") } - res, err := install.InstallAll(install.InstallRequest{ - Scope: scope, - BaseDir: baseDir, - Skill: installSkill, - Plugin: installPlugin, - }) - if err != nil { - return err + return nil +} + +func printSetupGuide() { + type guideData struct { + InGitRepo bool + CodexAvailable bool + ClaudeAvailable bool + OpencodeAvailable bool + AnyHarnessAvailable bool + MultipleHarnesses bool } - if installSkill { - msg := fmt.Sprintf("Installed skill to %s", abbreviatePath(res.SkillPath)) - if !res.UpdatedSkill { - msg = fmt.Sprintf("Skill already up to date at %s", abbreviatePath(res.SkillPath)) - } - printSuccess(msg) + data := guideData{ + InGitRepo: isInGitRepo(), + CodexAvailable: isCommandAvailable("codex"), + ClaudeAvailable: isCommandAvailable("claude"), + OpencodeAvailable: isCommandAvailable("opencode"), + } + count := 0 + if data.CodexAvailable { + count++ + } + if data.ClaudeAvailable { + count++ + } + if data.OpencodeAvailable { + count++ + } + data.AnyHarnessAvailable = count > 0 + data.MultipleHarnesses = count > 1 + + const tpl = `# Setup Subtask + +**You (Claude Code) are the lead.** Subtask lets you create tasks, spawn subagents, track progress, review their work, and request changes. Each task runs in its own git worktree so they can work in parallel safely. The user doesn't run subtask commands — you do. 
+
+## Environment
+
+{{if .InGitRepo}}✓ In a git repository{{else}}⚠ Not in a git repository (you'll need one later to create tasks){{end}}
+
+**Available worker harnesses:**
+{{if .CodexAvailable}}- ✓ Codex CLI (recommended)
+{{else}}- ✗ Codex CLI not found — install from https://github.com/openai/codex
+{{end}}{{if .ClaudeAvailable}}- ✓ Claude Code CLI
+{{else}}- ✗ Claude Code CLI not found — install from https://claude.ai/download
+{{end}}{{if .OpencodeAvailable}}- ✓ OpenCode CLI
+{{else}}- ✗ OpenCode CLI not found — install from https://github.com/sst/opencode
+{{end}}
+{{if not .AnyHarnessAvailable}}**No harness available.** Install at least one (Codex recommended) before proceeding.
+{{end}}
+## Install
+{{if .AnyHarnessAvailable}}
+{{if .MultipleHarnesses}}Ask the user which harness they'd like to use for workers. Then run:
+
+` + "```bash" + `
+subtask install --no-prompt --harness <name>
+` + "```" + `
+{{else}}Run the install command:
+
+` + "```bash" + `
+subtask install --no-prompt
+` + "```" + `
+{{end}}
+The install:
+1. Installs the Subtask skill to ~/.claude/skills/subtask/
+2. Writes config to ~/.subtask/config.json (with sensible defaults for model, etc.)
+
+The user can change harness, model, or other settings later with ` + "`subtask config`" + `.
+{{else}}
+First install a worker harness, then run:
+
+` + "```bash" + `
+subtask install --no-prompt
+` + "```" + `
+{{end}}
+## Ready
+{{if not .InGitRepo}}
+**Before creating tasks:** You're not in a git repository. If this looks like a project directory, offer to run ` + "`git init`" + `. Otherwise, ask the user where their project is.
+{{end}}After install, load the Subtask skill with ` + "`/subtask`" + ` to get the full workflow instructions.
+ +Then suggest example tasks adapted to the project, like: +- "Fix the login bug with Subtask" +- "Run these 3 features in parallel" +- "Plan and implement the new API endpoint with Subtask" + +Once you start your first task, let the user know they can run ` + "`subtask`" + ` in a separate terminal to watch progress in the TUI.` + + t := template.Must(template.New("guide").Parse(tpl)) + if err := t.Execute(os.Stdout, data); err != nil { + fmt.Fprintf(os.Stderr, "template error: %v\n", err) } +} - if installPlugin { - msg := fmt.Sprintf("Installed plugin to %s", abbreviatePath(res.PluginDir)) - if !res.UpdatedPlugin { - msg = fmt.Sprintf("Plugin already up to date at %s", abbreviatePath(res.PluginDir)) - } - printSuccess(msg) - if res.Settings.Rewrote && res.Settings.BackupTo != "" { - printWarning(fmt.Sprintf("Rewrote malformed settings.json (backup at %s)", abbreviatePath(res.Settings.BackupTo))) +func isInGitRepo() bool { + root, err := task.GitRootAbs() + return err == nil && root != "" +} + +func runScopeWizard() (string, error) { + scope := "user" + + // Clear screen and show header. + fmt.Print("\033[H\033[2J") + fmt.Println() + fmt.Println(" " + successStyle.Bold(true).Render("Install Claude Code Skill")) + fmt.Println(subtleStyle.Render(" The skill teaches Claude Code the subtask commands and workflow")) + fmt.Println() + + form := huh.NewForm(huh.NewGroup( + huh.NewSelect[string](). + Title("Where to install the Claude Skill?"). + Options( + huh.NewOption("Globally (recommended)", "user"), + huh.NewOption("This project only", "project"), + ). 
+ Value(&scope), + )) + + km := huh.NewDefaultKeyMap() + km.Quit = key.NewBinding(key.WithKeys("esc", "ctrl+c"), key.WithHelp("esc", "cancel")) + km.Select.Filter = key.NewBinding(key.WithDisabled()) + form = form.WithKeyMap(km).WithTheme(huh.ThemeCharm()).WithShowHelp(true) + + if err := form.Run(); err != nil { + if err == huh.ErrUserAborted { + return "", fmt.Errorf("install cancelled") } + return "", err } - return nil + return scope, nil } diff --git a/cmd/subtask/install_cmd_test.go b/cmd/subtask/install_cmd_test.go index a526413..ddb552b 100644 --- a/cmd/subtask/install_cmd_test.go +++ b/cmd/subtask/install_cmd_test.go @@ -2,6 +2,7 @@ package main import ( "os" + "path/filepath" "testing" "github.com/stretchr/testify/require" @@ -13,12 +14,20 @@ func TestInstallStatusUninstall_UserScope_NoPrompt(t *testing.T) { home := t.TempDir() t.Setenv("HOME", home) t.Setenv("USERPROFILE", home) + t.Setenv("SUBTASK_DIR", filepath.Join(home, ".subtask")) cwd := t.TempDir() prev, _ := os.Getwd() require.NoError(t, os.Chdir(cwd)) t.Cleanup(func() { _ = os.Chdir(prev) }) + // Ensure at least one harness is "available" so `subtask install --no-prompt` + // can write a usable ~/.subtask/config.json. 
+ binDir := filepath.Join(cwd, "bin") + require.NoError(t, os.MkdirAll(binDir, 0o755)) + _ = writeFakeCLI(t, binDir, "codex") + t.Setenv("PATH", binDir+string(os.PathListSeparator)+os.Getenv("PATH")) + withOutputMode(t, false) render.Pretty = false @@ -26,7 +35,7 @@ func TestInstallStatusUninstall_UserScope_NoPrompt(t *testing.T) { require.NoError(t, err) require.Empty(t, stderr) require.Contains(t, stdout, "Skill installed: no") - require.Contains(t, stdout, "Plugin installed: no") + require.NotContains(t, stdout, "Plugin installed") _, stderr, err = captureStdoutStderr(t, (&InstallCmd{NoPrompt: true}).Run) require.NoError(t, err) @@ -36,8 +45,7 @@ func TestInstallStatusUninstall_UserScope_NoPrompt(t *testing.T) { require.NoError(t, err) require.Empty(t, stderr) require.Contains(t, stdout, "Skill installed: yes") - require.Contains(t, stdout, "Plugin installed: yes") - require.Contains(t, stdout, "Plugin enabled: yes") + require.NotContains(t, stdout, "Plugin installed") _, stderr, err = captureStdoutStderr(t, (&UninstallCmd{}).Run) require.NoError(t, err) @@ -47,5 +55,5 @@ func TestInstallStatusUninstall_UserScope_NoPrompt(t *testing.T) { require.NoError(t, err) require.Empty(t, stderr) require.Contains(t, stdout, "Skill installed: no") - require.Contains(t, stdout, "Plugin installed: no") + require.NotContains(t, stdout, "Plugin installed") } diff --git a/cmd/subtask/install_helpers.go b/cmd/subtask/install_helpers.go deleted file mode 100644 index c7f088c..0000000 --- a/cmd/subtask/install_helpers.go +++ /dev/null @@ -1,132 +0,0 @@ -package main - -import ( - "fmt" - "os" - - "github.com/zippoxer/subtask/internal/homedir" - "github.com/zippoxer/subtask/pkg/git" - "github.com/zippoxer/subtask/pkg/harness" - "github.com/zippoxer/subtask/pkg/install" - "github.com/zippoxer/subtask/pkg/task" - "github.com/zippoxer/subtask/pkg/workspace" -) - -func parseInstallScope(s string) (install.Scope, error) { - switch s { - case "", "user": - return install.ScopeUser, nil - 
case "project": - return install.ScopeProject, nil - default: - return "", fmt.Errorf("invalid scope %q (expected user|project)", s) - } -} - -func projectRootFromCwd() (root string, inGit bool, err error) { - cwd, err := os.Getwd() - if err != nil { - return "", false, err - } - - insideWorkTree, err := git.Output(cwd, "rev-parse", "--is-inside-work-tree") - if err == nil && insideWorkTree == "true" { - top, err := git.Output(cwd, "rev-parse", "--show-toplevel") - if err == nil && top != "" { - return top, true, nil - } - return cwd, true, nil - } - - return cwd, false, nil -} - -func baseDirForScope(scope install.Scope) (baseDir string, inGit bool, err error) { - switch scope { - case install.ScopeUser: - homeDir, err := homedir.Dir() - if err != nil { - return "", false, err - } - return homeDir, false, nil - case install.ScopeProject: - return projectRootFromCwd() - default: - return "", false, fmt.Errorf("invalid scope %q", scope) - } -} - -func yesNo(b bool) string { - if b { - return "yes" - } - return "no" -} - -func initSubtaskDefaults(repoRoot string) error { - if repoRoot == "" { - return fmt.Errorf("invalid repo root") - } - - prev, _ := os.Getwd() - if err := os.Chdir(repoRoot); err != nil { - return err - } - defer func() { _ = os.Chdir(prev) }() - - if _, err := os.Stat(task.ConfigPath()); err == nil { - return nil - } - - codexAvailable := isCommandAvailable("codex") - claudeAvailable := isCommandAvailable("claude") - opencodeAvailable := isCommandAvailable("opencode") - if !codexAvailable && !claudeAvailable && !opencodeAvailable { - return fmt.Errorf("no worker harness available\n\nInstall one of:\n - Codex CLI: https://github.com/openai/codex\n - Claude Code CLI: https://claude.com/claude-code\n - OpenCode CLI: https://github.com/anomalyco/opencode") - } - - h := "codex" - if !codexAvailable { - if claudeAvailable { - h = "claude" - } else { - h = "opencode" - } - } - - model := "gpt-5.2" - reasoning := "xhigh" - if h == "claude" { - model = 
"claude-opus-4-5-20251101" - reasoning = "" - } - if h == "opencode" { - model = "" - reasoning = "" - } - - cfg := &workspace.Config{ - Harness: h, - MaxWorkspaces: workspace.DefaultMaxWorkspaces, - Options: make(map[string]any), - } - if model != "" { - cfg.Options["model"] = model - } - if reasoning != "" { - cfg.Options["reasoning"] = reasoning - } - - if err := cfg.Save(); err != nil { - return fmt.Errorf("failed to save config: %w", err) - } - - if err := ensureGitignore(repoRoot); err != nil { - // best effort - } - - // Warm harness discovery for better UX on first run. - _ = harness.CanResolveCLI(cfg.Harness) - - return nil -} diff --git a/cmd/subtask/interrupt.go b/cmd/subtask/interrupt.go index 7f26531..af7bbfb 100644 --- a/cmd/subtask/interrupt.go +++ b/cmd/subtask/interrupt.go @@ -17,6 +17,10 @@ type InterruptCmd struct { var interruptSignalFn = sendInterruptSignal func (c *InterruptCmd) Run() error { + if _, err := preflightProject(); err != nil { + return err + } + // Ensure schema/history exist (one-time) and task exists. 
if err := migrate.EnsureSchema(c.Task); err != nil { return err diff --git a/cmd/subtask/interrupt_test.go b/cmd/subtask/interrupt_test.go index 0b87d47..551d0ab 100644 --- a/cmd/subtask/interrupt_test.go +++ b/cmd/subtask/interrupt_test.go @@ -10,6 +10,7 @@ import ( "github.com/zippoxer/subtask/pkg/task" "github.com/zippoxer/subtask/pkg/task/history" + "github.com/zippoxer/subtask/pkg/task/migrate/gitredesign" "github.com/zippoxer/subtask/pkg/testutil" ) @@ -21,7 +22,7 @@ func TestInterrupt_NotRunning(t *testing.T) { Title: "Not working", BaseBranch: "main", Description: "desc", - Schema: 1, + Schema: gitredesign.TaskSchemaVersion, }).Save()) err := (&InterruptCmd{Task: envTask}).Run() @@ -38,7 +39,7 @@ func TestInterrupt_AppendsHistoryAndSignals(t *testing.T) { Title: "Working", BaseBranch: "main", Description: "desc", - Schema: 1, + Schema: gitredesign.TaskSchemaVersion, }).Save()) runIDData, _ := json.Marshal(map[string]any{"run_id": "run123"}) @@ -89,7 +90,7 @@ func TestInterrupt_StaleSupervisorClearsState(t *testing.T) { Title: "Stale", BaseBranch: "main", Description: "desc", - Schema: 1, + Schema: gitredesign.TaskSchemaVersion, }).Save()) const definitelyDeadPID = 2147483647 diff --git a/cmd/subtask/list.go b/cmd/subtask/list.go index b6ffcfd..87cf95d 100644 --- a/cmd/subtask/list.go +++ b/cmd/subtask/list.go @@ -3,8 +3,9 @@ package main import ( "context" "fmt" + "os" - "github.com/zippoxer/subtask/pkg/task/gather" + "github.com/zippoxer/subtask/pkg/task/store" ) // ListCmd implements 'subtask list'. @@ -14,6 +15,9 @@ type ListCmd struct { // Run executes the list command. 
func (c *ListCmd) Run() error { + if _, err := preflightProject(); err != nil { + return err + } out, err := c.render() if err != nil { return err @@ -23,35 +27,42 @@ func (c *ListCmd) Run() error { } func (c *ListCmd) render() (string, error) { - data, err := gather.List(context.Background(), gather.ListOptions{All: c.All}) + st := store.New() + data, err := st.List(context.Background(), store.ListOptions{All: c.All}) if err != nil { return "", err } - if len(data.Items) == 0 && len(data.Workspaces) == 0 { + for _, e := range data.Errors { + if e.Err == nil { + continue + } + fmt.Fprintf(os.Stderr, "task %s: %v\n", e.Name, e.Err) + } + + if len(data.Tasks) == 0 && len(data.Workspaces) == 0 { return "No tasks.\n", nil } - tasks := make([]TaskInfo, 0, len(data.Items)) - for _, it := range data.Items { + tasks := make([]TaskInfo, 0, len(data.Tasks)) + for _, it := range data.Tasks { info := TaskInfo{ - Name: it.Name, - Title: it.Title, - FollowUp: it.FollowUp, - BaseBranch: it.BaseBranch, - TaskStatus: it.TaskStatus, - WorkerStatus: it.WorkerStatus, - Stage: it.Stage, - Workspace: it.Workspace, - StartedAt: it.StartedAt, - LastActive: it.LastActive, - ToolCalls: it.ToolCalls, - LinesAdded: it.LinesAdded, - LinesRemoved: it.LinesRemoved, - CommitsBehind: it.CommitsBehind, - LastRunMS: it.LastRunDurationMS, - LastError: it.LastError, - IntegratedReason: it.IntegratedReason, + Name: it.Name, + Title: it.Title, + FollowUp: it.FollowUp, + BaseBranch: it.BaseBranch, + TaskStatus: it.TaskStatus, + WorkerStatus: it.WorkerStatus, + Stage: it.Stage, + Workspace: it.Workspace, + StartedAt: it.StartedAt, + LastActive: it.LastActive, + ToolCalls: it.ToolCalls, + LinesAdded: it.Changes.Added, + LinesRemoved: it.Changes.Removed, + ChangesStatus: string(it.Changes.Status), + LastRunMS: it.LastRunDurationMS, + LastError: it.LastError, } if it.ProgressTotal > 0 { info.Progress = fmt.Sprintf("%d/%d", it.ProgressDone, it.ProgressTotal) diff --git a/cmd/subtask/log.go 
b/cmd/subtask/log.go index b21b4f2..f675266 100644 --- a/cmd/subtask/log.go +++ b/cmd/subtask/log.go @@ -23,6 +23,10 @@ func (c *LogCmd) Run() error { return fmt.Errorf("--events and --messages are mutually exclusive") } + if _, err := preflightProject(); err != nil { + return err + } + if err := migrate.EnsureSchema(c.Task); err != nil { return err } @@ -108,14 +112,37 @@ func formatHistoryEvent(ev history.Event) string { if d.From != "" || d.To != "" { desc = fmt.Sprintf("stage changed: %s → %s", d.From, d.To) } + case "task.commit": + var d struct { + SHA string `json:"sha"` + Subject string `json:"subject"` + } + _ = json.Unmarshal(ev.Data, &d) + if d.SHA != "" || d.Subject != "" { + desc = fmt.Sprintf("commit %s %q", shortSHA(d.SHA), strings.TrimSpace(d.Subject)) + } case "task.merged": var d struct { Commit string `json:"commit"` Into string `json:"into"` + Via string `json:"via"` + Method string `json:"method"` } _ = json.Unmarshal(ev.Data, &d) - if d.Commit != "" || d.Into != "" { - desc = fmt.Sprintf("merged %s into %s", shortSHA(d.Commit), d.Into) + commit := strings.TrimSpace(d.Commit) + into := strings.TrimSpace(d.Into) + via := strings.TrimSpace(d.Via) + method := strings.TrimSpace(d.Method) + if commit != "" && into != "" { + desc = fmt.Sprintf("merged %s into %s", shortSHA(commit), into) + } else if into != "" { + // No-op / detected merges may not have a merge commit SHA. Avoid implying one exists. + desc = "marked merged into " + into + if method != "" { + desc += " (" + method + ")" + } else if via != "" { + desc += " (" + via + ")" + } } case "task.closed": var d struct { diff --git a/cmd/subtask/logs.go b/cmd/subtask/logs.go index ade67d1..97c015f 100644 --- a/cmd/subtask/logs.go +++ b/cmd/subtask/logs.go @@ -35,6 +35,10 @@ type harnessLogBackend struct { // Run executes the logs command. 
func (c *LogsCmd) Run() error { + if _, err := preflightProject(); err != nil { + return err + } + backends := []harnessLogBackend{ {name: "codex", parser: &logs.CodexParser{}, locator: &logs.CodexParser{}, parseLine: parseSingleLineCodex}, {name: "claude", parser: &logs.ClaudeParser{}, locator: &logs.ClaudeParser{}, parseLine: parseSingleLineClaude}, diff --git a/cmd/subtask/main.go b/cmd/subtask/main.go index 2189c99..48affc2 100644 --- a/cmd/subtask/main.go +++ b/cmd/subtask/main.go @@ -3,6 +3,7 @@ package main import ( "fmt" "os" + "strings" "github.com/alecthomas/kong" ) @@ -16,10 +17,10 @@ var ( type CLI struct { Version kong.VersionFlag `help:"Print version information and quit"` - Init InitCmd `cmd:"" help:"Initialize subtask for this project"` - Install InstallCmd `cmd:"" help:"Install Subtask skill + plugin (Claude Code)"` - Uninstall UninstallCmd `cmd:"" help:"Uninstall Subtask skill + plugin (Claude Code)"` - Status StatusCmd `cmd:"" help:"Show installation status (skill + plugin)"` + Install InstallCmd `cmd:"" help:"Install Subtask skill (Claude Code) and configure defaults"` + Config ConfigCmd `cmd:"" help:"Edit configuration (user defaults or project overrides)"` + Uninstall UninstallCmd `cmd:"" help:"Uninstall Subtask skill (Claude Code)"` + Status StatusCmd `cmd:"" help:"Show installation status (skill)"` Ask AskCmd `cmd:"" help:"Ask a question (no task, runs in cwd)"` Draft DraftCmd `cmd:"" help:"Create a task without running"` Send SendCmd `cmd:"" help:"Send a message to a task"` @@ -31,7 +32,7 @@ type CLI struct { Close CloseCmd `cmd:"" help:"Close a task and free workspace"` Merge MergeCmd `cmd:"" help:"Merge task into base branch (marks as merged)"` Workspace WorkspaceCmd `cmd:"" help:"Print workspace path for a task"` - Review ReviewCmd `cmd:"" help:"Get review of task changes"` + Review ReviewCmd `cmd:"" help:"Get an AI code review"` Trace LogsCmd `cmd:"" help:"Debug worker runs (tool calls, errors)"` Logs LogsCmd `cmd:"" help:"Alias for 
trace" hidden:""` Interrupt InterruptCmd `cmd:"" aliases:"stop" help:"Gracefully stop a working worker for a task"` @@ -39,8 +40,10 @@ type CLI struct { } func main() { - runAutoUpdate() - startBinaryAutoUpdate() + if !shouldSkipStartupSideEffects(os.Args) { + runAutoUpdate() + startBinaryAutoUpdate() + } if len(os.Args) == 1 { if err := runTUIWithInitCheck(); err != nil { @@ -64,3 +67,15 @@ func main() { err := ctx.Run() ctx.FatalIfErrorf(err) } + +func shouldSkipStartupSideEffects(args []string) bool { + if len(args) < 3 || args[1] != "install" { + return false + } + for _, a := range args[2:] { + if a == "--guide" || strings.HasPrefix(a, "--guide=") { + return true + } + } + return false +} diff --git a/cmd/subtask/merge.go b/cmd/subtask/merge.go index 7b6eafc..fb5f93f 100644 --- a/cmd/subtask/merge.go +++ b/cmd/subtask/merge.go @@ -1,10 +1,8 @@ package main import ( - "context" "fmt" - taskindex "github.com/zippoxer/subtask/pkg/task/index" "github.com/zippoxer/subtask/pkg/task/ops" ) @@ -16,26 +14,14 @@ type MergeCmd struct { // Run executes the merge command. func (c *MergeCmd) Run() error { + if _, err := preflightProject(); err != nil { + return err + } + res, err := ops.MergeTask(c.Task, c.Message, cliOpsLogger{}) if err != nil { return err } - // Best-effort: refresh integration snapshot so list doesn't need a repair pass - // after a subtask-driven merge advances the base branch. 
- if idx, err := taskindex.OpenDefault(); err == nil { - defer idx.Close() - if err := idx.Refresh(context.Background(), taskindex.RefreshPolicy{ - Git: taskindex.GitPolicy{ - Mode: taskindex.GitTasks, - Tasks: []string{c.Task}, - IncludeIntegration: true, - }, - }); err != nil { - printWarning(fmt.Sprintf("failed to refresh git integration cache: %v", err)) - } - } else { - printWarning(fmt.Sprintf("failed to open index for git integration cache refresh: %v", err)) - } if res.AlreadyClosed { if res.AlreadyMerged { fmt.Printf("Task %s is already merged.\n", c.Task) diff --git a/cmd/subtask/output.go b/cmd/subtask/output.go index 832941c..a0829d0 100644 --- a/cmd/subtask/output.go +++ b/cmd/subtask/output.go @@ -57,11 +57,9 @@ type TaskInfo struct { Progress string // "X/Y" from PROGRESS.json LinesAdded int // Git diff stats LinesRemoved int - CommitsBehind int // Commits base branch has that the task ref doesn't + ChangesStatus string // "", "applied", "missing" LastRunMS int LastError string - - IntegratedReason string } // PrintTaskList prints a formatted table of tasks. 
@@ -77,7 +75,7 @@ func RenderTaskList(tasks []TaskInfo, workspaces []workspace.Entry) string { // Build rows var rows []render.TaskRow for _, t := range tasks { - status := userStatusTextWithIntegration(t.TaskStatus, t.WorkerStatus, t.StartedAt, t.LastRunMS, t.LastError, t.IntegratedReason) + status := userStatusText(t.TaskStatus, t.WorkerStatus, t.StartedAt, t.LastRunMS, t.LastError) stage := t.Stage if stage == "" { @@ -110,7 +108,7 @@ func RenderTaskList(tasks []TaskInfo, workspaces []workspace.Entry) string { Progress: progress, LinesAdded: t.LinesAdded, LinesRemoved: t.LinesRemoved, - CommitsBehind: t.CommitsBehind, + ChangesStatus: t.ChangesStatus, LastActive: lastActivity, Title: title, }) @@ -174,14 +172,6 @@ func userStatusText(ts task.TaskStatus, ws task.WorkerStatus, startedAt time.Tim } } -func userStatusTextWithIntegration(ts task.TaskStatus, ws task.WorkerStatus, startedAt time.Time, lastRunMS int, lastError string, integratedReason string) string { - // Don't show "merged" if worker is actively running - if ws != task.WorkerStatusRunning && strings.TrimSpace(integratedReason) != "" && ts != task.TaskStatusMerged { - return "✓ merged" - } - return userStatusText(ts, ws, startedAt, lastRunMS, lastError) -} - // formatTimeAgo formats a time as "Xm ago" or "Xs ago". 
func formatTimeAgo(t time.Time) string { d := nowFunc().Sub(t) diff --git a/cmd/subtask/preflight.go b/cmd/subtask/preflight.go new file mode 100644 index 0000000..3e6bc3c --- /dev/null +++ b/cmd/subtask/preflight.go @@ -0,0 +1,45 @@ +package main + +import ( + "github.com/zippoxer/subtask/pkg/task" + taskmigrate "github.com/zippoxer/subtask/pkg/task/migrate" + "github.com/zippoxer/subtask/pkg/task/migrate/gitredesign" + "github.com/zippoxer/subtask/pkg/workspace" +) + +type preflightProjectResult struct { + RepoRoot string + Config *workspace.Config +} + +func preflightProject() (*preflightProjectResult, error) { + repoRoot, err := task.GitRootAbs() + if err != nil { + return nil, err + } + if err := taskmigrate.EnsureLayout(repoRoot); err != nil { + return nil, err + } + if err := gitredesign.Ensure(repoRoot); err != nil { + return nil, err + } + cfg, err := workspace.LoadConfig() + if err != nil { + return nil, err + } + return &preflightProjectResult{RepoRoot: repoRoot, Config: cfg}, nil +} + +func preflightProjectOnly() (string, error) { + repoRoot, err := task.GitRootAbs() + if err != nil { + return "", err + } + if err := taskmigrate.EnsureLayout(repoRoot); err != nil { + return "", err + } + if err := gitredesign.Ensure(repoRoot); err != nil { + return "", err + } + return repoRoot, nil +} diff --git a/cmd/subtask/review.go b/cmd/subtask/review.go index 7e18f32..09d9068 100644 --- a/cmd/subtask/review.go +++ b/cmd/subtask/review.go @@ -13,9 +13,10 @@ import ( // ReviewCmd implements 'subtask review'. 
type ReviewCmd struct {
 	// Target selection (mutually exclusive)
-	Task        string `help:"Review task changes against its base branch"`
+	Task        string `help:"Review changes in a task workspace against that task's base branch"`
+	Base        string `help:"Review changes on the current branch against BRANCH (PR-style diff via merge-base; BRANCH must be a valid git ref)"`
 	Uncommitted bool   `help:"Review uncommitted changes (staged, unstaged, untracked)"`
-	Commit      string `help:"Review changes introduced by a specific commit"`
+	Commit      string `help:"Review changes introduced by a specific commit SHA"`
 
 	// Optional instructions
 	Prompt string `arg:"" optional:"" help:"Additional review instructions (or use stdin)"`
@@ -38,20 +39,23 @@ func (c *ReviewCmd) WithHarness(h harness.Harness) *ReviewCmd {
 func (c *ReviewCmd) Run() error {
 	// Validate mutually exclusive flags
 	count := 0
-	if c.Task != "" {
+	if strings.TrimSpace(c.Task) != "" {
+		count++
+	}
+	if strings.TrimSpace(c.Base) != "" {
 		count++
 	}
 	if c.Uncommitted {
 		count++
 	}
-	if c.Commit != "" {
+	if strings.TrimSpace(c.Commit) != "" {
 		count++
 	}
 	if count > 1 {
-		return fmt.Errorf("--task, --uncommitted, and --commit are mutually exclusive")
+		return fmt.Errorf("--task, --base, --uncommitted, and --commit are mutually exclusive")
 	}
 	if count == 0 {
-		return fmt.Errorf("specify one of: --task <name>, --uncommitted, or --commit <sha>")
+		return fmt.Errorf("specify one of: --task <name>, --base <branch>, --uncommitted, or --commit <sha>")
 	}
 
 	// Read instructions from arg or stdin
@@ -60,11 +64,13 @@ func (c *ReviewCmd) Run() error {
 		instructions = readStdinIfAvailable()
 	}
 
-	// Load config
-	cfg, err := workspace.LoadConfig()
+	// Requirements: git + global config (config may be migrated on first access). 
+ res, err := preflightProject() if err != nil { return err } + cfg := res.Config + if err := workspace.ValidateReasoningFlag(cfg.Harness, c.Reasoning); err != nil { return err } @@ -74,24 +80,32 @@ func (c *ReviewCmd) Run() error { var target harness.ReviewTarget switch { - case c.Task != "": + case strings.TrimSpace(c.Task) != "": + taskName := strings.TrimSpace(c.Task) // Load task (for base branch) - t, err := task.Load(c.Task) + t, err := task.Load(taskName) if err != nil { - return fmt.Errorf("failed to load task %q: %w", c.Task, err) + return fmt.Errorf("failed to load task %q: %w", taskName, err) } // Load state (for workspace) - state, err := task.LoadState(c.Task) + state, err := task.LoadState(taskName) if err != nil { return err } if state == nil || state.Workspace == "" { - return fmt.Errorf("task %q has no workspace\n\nRun the task first:\n subtask send %s \"...\"", c.Task, c.Task) + return fmt.Errorf("task %q has no workspace\n\nRun the task first:\n subtask send %s \"...\"", taskName, taskName) } cwd = state.Workspace - target = harness.ReviewTarget{BaseBranch: t.BaseBranch} + target = harness.ReviewTarget{TaskName: taskName, BaseBranch: t.BaseBranch} + + case strings.TrimSpace(c.Base) != "": + cwd, err = os.Getwd() + if err != nil { + return fmt.Errorf("failed to get working directory: %w", err) + } + target = harness.ReviewTarget{BaseBranch: strings.TrimSpace(c.Base)} case c.Uncommitted: cwd, err = os.Getwd() @@ -100,12 +114,12 @@ func (c *ReviewCmd) Run() error { } target = harness.ReviewTarget{Uncommitted: true} - case c.Commit != "": + case strings.TrimSpace(c.Commit) != "": cwd, err = os.Getwd() if err != nil { return fmt.Errorf("failed to get working directory: %w", err) } - target = harness.ReviewTarget{Commit: c.Commit} + target = harness.ReviewTarget{Commit: strings.TrimSpace(c.Commit)} } // Run review diff --git a/cmd/subtask/review_test.go b/cmd/subtask/review_test.go index 33dc112..49966b6 100644 --- a/cmd/subtask/review_test.go +++ 
b/cmd/subtask/review_test.go @@ -81,6 +81,24 @@ func TestReviewCmd_Uncommitted(t *testing.T) { assert.True(t, call.Target.Uncommitted) } +func TestReviewCmd_BaseBranch(t *testing.T) { + _ = testutil.NewTestEnv(t, 0) + + reviewMock := harness.NewMockHarness().WithReviewResult("No issues") + + stdout, stderr, err := captureStdoutStderr(t, (&ReviewCmd{ + Base: " main ", + }).WithHarness(reviewMock).Run) + + require.NoError(t, err) + require.Empty(t, stderr) + assert.Contains(t, stdout, "No issues") + + require.Len(t, reviewMock.ReviewCalls, 1) + call := reviewMock.ReviewCalls[0] + assert.Equal(t, "main", call.Target.BaseBranch) +} + func TestReviewCmd_Commit(t *testing.T) { _ = testutil.NewTestEnv(t, 0) @@ -103,7 +121,7 @@ func TestReviewCmd_MutuallyExclusive(t *testing.T) { _ = testutil.NewTestEnv(t, 0) _, _, err := captureStdoutStderr(t, (&ReviewCmd{ - Task: "some-task", + Base: "main", Uncommitted: true, }).Run) diff --git a/cmd/subtask/send.go b/cmd/subtask/send.go index 2d3d6c4..3b80481 100644 --- a/cmd/subtask/send.go +++ b/cmd/subtask/send.go @@ -8,6 +8,7 @@ import ( "os" "os/signal" "path/filepath" + "strconv" "strings" "sync/atomic" "syscall" @@ -18,7 +19,6 @@ import ( "github.com/zippoxer/subtask/pkg/logging" "github.com/zippoxer/subtask/pkg/task" "github.com/zippoxer/subtask/pkg/task/history" - taskindex "github.com/zippoxer/subtask/pkg/task/index" "github.com/zippoxer/subtask/pkg/task/migrate" "github.com/zippoxer/subtask/pkg/workspace" ) @@ -52,6 +52,13 @@ func (c *SendCmd) Run() error { return fmt.Errorf("prompt is required\n\nProvide a prompt as argument or via stdin (heredoc/pipe)") } + // Requirements: git + global config (config may be migrated on first access). + res, err := preflightProject() + if err != nil { + return err + } + cfg := res.Config + // Ensure schema/history exist (one-time). 
if err := migrate.EnsureSchema(c.Task); err != nil { return err @@ -63,10 +70,6 @@ func (c *SendCmd) Run() error { c.Task, c.Task) } - cfg, err := workspace.LoadConfig() - if err != nil { - return err - } if err := workspace.ValidateReasoningFlag(cfg.Harness, c.Reasoning); err != nil { return err } @@ -105,6 +108,86 @@ func (c *SendCmd) Run() error { return err } + var runToolCalls atomic.Int64 + + // Start time is stored atomically so the SIGINT handler can read it safely. We update + // it later once the worker is about to run (excluding workspace prep time). + var startedUnixNano atomic.Int64 + startedUnixNano.Store(time.Now().UTC().UnixNano()) + + // Setup signal handling early so an interrupt during workspace prep doesn't leave a + // stuck SupervisorPID claim. + sigChan := make(chan os.Signal, 1) + sigStop := make(chan struct{}) + signal.Notify(sigChan, syscall.SIGINT, syscall.SIGTERM) + go func() { + var sig os.Signal + select { + case sig = <-sigChan: + case <-sigStop: + return + } + finished := time.Now().UTC() + started := time.Unix(0, startedUnixNano.Load()).UTC() + durationMS := int(finished.Sub(started).Milliseconds()) + if durationMS < 0 { + durationMS = 0 + } + errMsg := "interrupted" + + owned := false + _ = task.WithLock(c.Task, func() error { + st, _ := task.LoadState(c.Task) + if st == nil { + return nil + } + if st.SupervisorPID != os.Getpid() { + return nil + } + + owned = true + _ = history.AppendLocked(c.Task, history.Event{ + Type: "worker.interrupt", + Data: mustJSON(map[string]any{ + "action": "received", + "run_id": runID, + "signal": sig.String(), + "supervisor_pid": os.Getpid(), + "supervisor_pgid": task.SelfProcessGroupID(), + }), + TS: finished, + }) + _ = history.AppendLocked(c.Task, history.Event{ + Type: "worker.finished", + Data: mustJSON(map[string]any{ + "run_id": runID, + "duration_ms": durationMS, + "outcome": "error", + "error": errMsg, + "error_message": errMsg, + "tool_calls": int(runToolCalls.Load()), + }), + TS: finished, 
+ }) + + st.SupervisorPID = 0 + st.SupervisorPGID = 0 + st.StartedAt = time.Time{} + st.LastError = errMsg + return st.Save(c.Task) + }) + + if owned { + logging.Error("harness", fmt.Sprintf("task=%s %s error: %s", c.Task, cfg.Harness, errMsg)) + logging.Info("worker", fmt.Sprintf("task=%s finished outcome=error duration=%s", c.Task, finished.Sub(started).Round(time.Second))) + } + os.Exit(1) + }() + defer func() { + close(sigStop) + signal.Stop(sigChan) + }() + wsPath, prevWorkspace, continueFrom, repoStatus, err := c.prepareWorkspaceAndState(cfg, h, t, tail, prompt, runID) if err != nil { return err @@ -126,52 +209,8 @@ func (c *SendCmd) Run() error { // Build prompt. fullPrompt := harness.BuildPrompt(t, wsPath, false, prompt, repoStatus) - var runToolCalls atomic.Int64 - started := time.Now().UTC() - - // Setup signal handling. - sigChan := make(chan os.Signal, 1) - signal.Notify(sigChan, syscall.SIGINT, syscall.SIGTERM) - go func() { - sig := <-sigChan - errMsg := "interrupted" - _ = task.WithLock(c.Task, func() error { - st, _ := task.LoadState(c.Task) - if st == nil { - st = &task.State{} - } - st.SupervisorPID = 0 - st.SupervisorPGID = 0 - st.StartedAt = time.Time{} - st.LastError = errMsg - return st.Save(c.Task) - }) - _ = history.Append(c.Task, history.Event{ - Type: "worker.interrupt", - Data: mustJSON(map[string]any{ - "action": "received", - "run_id": runID, - "signal": sig.String(), - "supervisor_pid": os.Getpid(), - "supervisor_pgid": task.SelfProcessGroupID(), - }), - }) - _ = history.Append(c.Task, history.Event{ - Type: "worker.finished", - Data: mustJSON(map[string]any{ - "run_id": runID, - "duration_ms": int(time.Since(started).Milliseconds()), - "outcome": "error", - "error": errMsg, - "error_message": errMsg, - "tool_calls": int(runToolCalls.Load()), - }), - }) - logging.Error("harness", fmt.Sprintf("task=%s %s error: %s", c.Task, cfg.Harness, errMsg)) - logging.Info("worker", fmt.Sprintf("task=%s finished outcome=error duration=%s", c.Task, 
time.Since(started).Round(time.Second))) - os.Exit(1) - }() - defer signal.Stop(sigChan) + // Reset start time for the worker run (exclude workspace preparation). + startedUnixNano.Store(time.Now().UTC().UnixNano()) // Snapshot shared files before execution (exclude history.jsonl). sharedBefore := SnapshotTaskFiles(c.Task) @@ -216,6 +255,7 @@ func (c *SendCmd) Run() error { result, runErr := h.Run(context.Background(), wsPath, fullPrompt, continueFrom, callbacks) finished := time.Now().UTC() + started := time.Unix(0, startedUnixNano.Load()).UTC() durationMS := int(finished.Sub(started).Milliseconds()) reply := "" @@ -267,6 +307,20 @@ func (c *SendCmd) Run() error { }) } + _ = history.Append(c.Task, history.Event{ + Type: "worker.finished", + Data: mustJSON(map[string]any{ + "run_id": runID, + "duration_ms": durationMS, + "tool_calls": int(runToolCalls.Load()), + "outcome": "error", + "error": errMsg, + "error_message": errMsg, + }), + TS: finished, + }) + + // Clear running fields after history is written, before printing/returning. _ = task.WithLock(c.Task, func() error { st, _ := task.LoadState(c.Task) if st == nil { @@ -281,39 +335,17 @@ func (c *SendCmd) Run() error { } return st.Save(c.Task) }) - _ = history.Append(c.Task, history.Event{ - Type: "worker.finished", - Data: mustJSON(map[string]any{ - "run_id": runID, - "duration_ms": durationMS, - "tool_calls": int(runToolCalls.Load()), - "outcome": "error", - "error": errMsg, - "error_message": errMsg, - }), - TS: finished, - }) + logging.Error("harness", fmt.Sprintf("task=%s %s error: %s", c.Task, cfg.Harness, errMsg)) logging.Info("worker", fmt.Sprintf("task=%s finished outcome=error duration=%s", c.Task, finished.Sub(started).Round(time.Second))) return runErr } - // Success: append worker message + finish event, clear running fields in state. 
- _ = task.WithLock(c.Task, func() error { - st, _ := task.LoadState(c.Task) - if st == nil { - st = &task.State{} - } - st.SupervisorPID = 0 - st.SupervisorPGID = 0 - st.StartedAt = time.Time{} - st.LastError = "" - if nextSessionID != "" { - st.SessionID = nextSessionID - st.Harness = cfg.Harness - } - return st.Save(c.Task) - }) + // Snapshot shared files after execution and find changes. + sharedAfter := SnapshotTaskFiles(c.Task) + changedFiles := ChangedTaskFiles(sharedBefore, sharedAfter) + + // Success: append worker message + finish event. _ = history.Append(c.Task, history.Event{ Type: "message", Role: "worker", @@ -330,27 +362,25 @@ func (c *SendCmd) Run() error { }), TS: finished, }) - logging.Info("worker", fmt.Sprintf("task=%s finished outcome=replied duration=%s", c.Task, finished.Sub(started).Round(time.Second))) - // Snapshot shared files after execution and find changes. - sharedAfter := SnapshotTaskFiles(c.Task) - changedFiles := ChangedTaskFiles(sharedBefore, sharedAfter) - - // Refresh git snapshot/integration cache for this task so list/TUI stay fast. - if idx, err := taskindex.OpenDefault(); err == nil { - defer idx.Close() - if err := idx.Refresh(context.Background(), taskindex.RefreshPolicy{ - Git: taskindex.GitPolicy{ - Mode: taskindex.GitTasks, - Tasks: []string{c.Task}, - IncludeIntegration: true, - }, - }); err != nil && !c.Quiet { - printWarning(fmt.Sprintf("failed to refresh git integration cache: %v", err)) + // Clear running fields after history is written, before printing output. 
+ _ = task.WithLock(c.Task, func() error { + st, _ := task.LoadState(c.Task) + if st == nil { + st = &task.State{} } - } else if !c.Quiet { - printWarning(fmt.Sprintf("failed to open index for git integration cache refresh: %v", err)) - } + st.SupervisorPID = 0 + st.SupervisorPGID = 0 + st.StartedAt = time.Time{} + st.LastError = "" + if nextSessionID != "" { + st.SessionID = nextSessionID + st.Harness = cfg.Harness + } + return st.Save(c.Task) + }) + + logging.Info("worker", fmt.Sprintf("task=%s finished outcome=replied duration=%s", c.Task, finished.Sub(started).Round(time.Second))) if c.Quiet { if reply != "" { @@ -366,7 +396,7 @@ func (c *SendCmd) Run() error { return nil } -func (c *SendCmd) prepareWorkspaceAndState(cfg *workspace.Config, h harness.Harness, t *task.Task, tail history.TailInfo, prompt, runID string) (wsPath, prevWorkspace, continueFrom string, repoStatus *harness.RepoStatus, _ error) { +func (c *SendCmd) prepareWorkspaceAndState(cfg *workspace.Config, h harness.Harness, t *task.Task, tail history.TailInfo, prompt, runID string) (wsPath, prevWorkspace, continueFrom string, repoStatus *harness.RepoStatus, err error) { now := time.Now().UTC() var st *task.State @@ -401,9 +431,61 @@ func (c *SendCmd) prepareWorkspaceAndState(cfg *workspace.Config, h harness.Harn // Hard guard: don't allow two concurrent sends on the same machine. if st != nil && st.SupervisorPID != 0 && !st.IsStale() { - return "", "", "", nil, fmt.Errorf("task %s is still working\n\nWait for it to finish, or check:\n subtask list", c.Task) + return "", "", "", nil, fmt.Errorf("task %s is still working\n\nYou'll be notified when done, then you can send more context.\nTo correct a worker going the wrong direction:\n subtask interrupt %s && subtask send %s \"...\"", c.Task, c.Task, c.Task) } + // Test-only: deterministic barrier to coordinate concurrent send attempts. 
+ if err := maybeWaitSendBarrier(); err != nil { + return "", "", "", nil, err + } + + claimedPID := os.Getpid() + claimed := false + defer func() { + if !claimed || err == nil { + return + } + errMsg := strings.TrimSpace(err.Error()) + if errMsg == "" { + errMsg = "send failed" + } + _ = task.WithLock(c.Task, func() error { + locked, _ := task.LoadState(c.Task) + if locked == nil { + return nil + } + if locked.SupervisorPID != claimedPID { + return nil + } + locked.SupervisorPID = 0 + locked.SupervisorPGID = 0 + locked.StartedAt = time.Time{} + locked.LastError = errMsg + return locked.Save(c.Task) + }) + }() + + // Claim the task early (before git worktree operations) to prevent a race where two sends + // concurrently try to check out the same branch in different worktrees. + if err := task.WithLock(c.Task, func() error { + locked, _ := task.LoadState(c.Task) + if locked == nil { + locked = &task.State{} + } + if locked.SupervisorPID != 0 && !locked.IsStale() { + return fmt.Errorf("task %s is still working\n\nYou'll be notified when done, then you can send more context.\nTo correct a worker going the wrong direction:\n subtask interrupt %s && subtask send %s \"...\"", c.Task, c.Task, c.Task) + } + locked.SupervisorPID = claimedPID + locked.SupervisorPGID = task.SelfProcessGroupID() + locked.StartedAt = now + locked.LastError = "" + locked.Harness = cfg.Harness + return locked.Save(c.Task) + }); err != nil { + return "", "", "", nil, err + } + claimed = true + // Reuse workspace when available. 
if st != nil && st.Workspace != "" { if info, err := os.Stat(st.Workspace); err == nil && info.IsDir() { @@ -439,10 +521,10 @@ func (c *SendCmd) prepareWorkspaceAndState(cfg *workspace.Config, h harness.Harn if tail.TaskStatus != task.TaskStatusMerged { baseRef = strings.TrimSpace(tail.BaseCommit) } - if err := git.SetupBranch(wsPath, t, baseRef); err != nil { + if err := git.SetupBranch(wsPath, t.Name, t.BaseBranch, baseRef); err != nil { // If the recorded base commit is missing (e.g., rewritten history), fall back to base branch HEAD. if baseRef != "" { - if err2 := git.SetupBranch(wsPath, t, ""); err2 == nil { + if err2 := git.SetupBranch(wsPath, t.Name, t.BaseBranch, ""); err2 == nil { baseRef = "" } else { return "", "", "", nil, fmt.Errorf("git setup failed: %w", err) @@ -474,14 +556,9 @@ func (c *SendCmd) prepareWorkspaceAndState(cfg *workspace.Config, h harness.Harn // Local-first: compare against the local base branch only. target := t.BaseBranch if git.BranchExists(wsPath, target) { - behind, err := git.CommitsBehind(wsPath, "HEAD", target) - if err == nil && behind > 0 { - repoStatus = &harness.RepoStatus{CommitsBehind: behind} - - conflicts, err := git.MergeConflictFiles(wsPath, target, "HEAD") - if err == nil && len(conflicts) > 0 { - repoStatus.ConflictFiles = conflicts - } + conflicts, err := git.MergeConflictFiles(wsPath, target, "HEAD") + if err == nil && len(conflicts) > 0 { + repoStatus = &harness.RepoStatus{ConflictFiles: conflicts} } } } @@ -497,11 +574,14 @@ func (c *SendCmd) prepareWorkspaceAndState(cfg *workspace.Config, h harness.Harn } // Set running state and append start events. 
- err := task.WithLock(c.Task, func() error { + err = task.WithLock(c.Task, func() error { locked, _ := task.LoadState(c.Task) if locked == nil { locked = &task.State{} } + if locked.SupervisorPID != 0 && !locked.IsStale() && locked.SupervisorPID != os.Getpid() { + return fmt.Errorf("task %s is still working\n\nYou'll be notified when done, then you can send more context.\nTo correct a worker going the wrong direction:\n subtask interrupt %s && subtask send %s \"...\"", c.Task, c.Task, c.Task) + } prevWorkspace = locked.Workspace locked.Workspace = wsPath @@ -578,6 +658,53 @@ func (c *SendCmd) prepareWorkspaceAndState(cfg *workspace.Config, h harness.Harn return wsPath, prevWorkspace, continueFrom, repoStatus, nil } +const ( + testSendBarrierDirEnv = "SUBTASK_TEST_SEND_BARRIER_DIR" + testSendBarrierNEnv = "SUBTASK_TEST_SEND_BARRIER_N" + testSendBarrierTimeoutMSEnv = "SUBTASK_TEST_SEND_BARRIER_TIMEOUT_MS" +) + +func maybeWaitSendBarrier() error { + dir := strings.TrimSpace(os.Getenv(testSendBarrierDirEnv)) + if dir == "" { + return nil + } + + n := 2 + if s := strings.TrimSpace(os.Getenv(testSendBarrierNEnv)); s != "" { + if v, err := strconv.Atoi(s); err == nil && v > 0 { + n = v + } + } + timeout := 5 * time.Second + if s := strings.TrimSpace(os.Getenv(testSendBarrierTimeoutMSEnv)); s != "" { + if v, err := strconv.Atoi(s); err == nil && v > 0 { + timeout = time.Duration(v) * time.Millisecond + } + } + + if err := os.MkdirAll(dir, 0o755); err != nil { + return err + } + + // Signal arrival. 
+ p := filepath.Join(dir, fmt.Sprintf("%d", os.Getpid())) + if f, err := os.OpenFile(p, os.O_CREATE|os.O_EXCL|os.O_WRONLY, 0o644); err == nil { + _, _ = f.WriteString("ok\n") + _ = f.Close() + } + + deadline := time.Now().Add(timeout) + for time.Now().Before(deadline) { + ents, err := os.ReadDir(dir) + if err == nil && len(ents) >= n { + return nil + } + time.Sleep(10 * time.Millisecond) + } + return fmt.Errorf("send barrier timed out waiting for %d participants (%s)", n, dir) +} + func (c *SendCmd) info(msg string) { if c.Quiet { return diff --git a/cmd/subtask/show.go b/cmd/subtask/show.go index 43e24ab..7f69cb9 100644 --- a/cmd/subtask/show.go +++ b/cmd/subtask/show.go @@ -9,8 +9,8 @@ import ( "github.com/zippoxer/subtask/pkg/render" "github.com/zippoxer/subtask/pkg/task" - "github.com/zippoxer/subtask/pkg/task/gather" "github.com/zippoxer/subtask/pkg/task/history" + "github.com/zippoxer/subtask/pkg/task/store" ) // ShowCmd implements 'subtask show'. @@ -22,6 +22,9 @@ type ShowCmd struct { // Run executes the show command. 
func (c *ShowCmd) Run() error { + if _, err := preflightProject(); err != nil { + return err + } if c.JSON { if c.Watch { return fmt.Errorf("--watch cannot be used with --json") @@ -47,7 +50,8 @@ func (c *ShowCmd) Run() error { } func (c *ShowCmd) render() (string, error) { - detail, err := gather.Detail(context.Background(), c.Task) + st := store.New() + detail, err := st.Get(context.Background(), c.Task, store.GetOptions{}) if err != nil { return "", err } @@ -61,6 +65,7 @@ func (c *ShowCmd) render() (string, error) { Title: t.Title, Branch: t.Name, BaseBranch: t.BaseBranch, + BaseCommit: detail.BaseCommit, } card.Model = detail.Model card.Reasoning = detail.Reasoning @@ -69,9 +74,9 @@ func (c *ShowCmd) render() (string, error) { if state != nil { lastError = state.LastError } - card.TaskStatus = userStatusTextWithIntegration(detail.TaskStatus, detail.WorkerStatus, time.Time{}, detail.LastRunMS, lastError, detail.IntegratedReason) + card.TaskStatus = userStatusText(detail.TaskStatus, detail.WorkerStatus, time.Time{}, detail.LastRunMS, lastError) if state != nil && detail.WorkerStatus == task.WorkerStatusRunning && !state.StartedAt.IsZero() { - card.TaskStatus = userStatusTextWithIntegration(detail.TaskStatus, detail.WorkerStatus, state.StartedAt, detail.LastRunMS, lastError, detail.IntegratedReason) + card.TaskStatus = userStatusText(detail.TaskStatus, detail.WorkerStatus, state.StartedAt, detail.LastRunMS, lastError) } if state != nil { @@ -81,9 +86,17 @@ func (c *ShowCmd) render() (string, error) { } } - card.LinesAdded = detail.LinesAdded - card.LinesRemoved = detail.LinesRemoved - card.CommitsBehind = detail.CommitsBehind + card.LinesAdded = detail.Changes.Added + card.LinesRemoved = detail.Changes.Removed + card.ChangesStatus = string(detail.Changes.Status) + if detail.Changes.Err != nil && detail.Changes.Status != store.ChangesStatusMissing { + card.ChangesError = detail.Changes.Err.Error() + } + card.CommitCount = detail.Commits.Count + if 
detail.Commits.Err != nil { + card.CommitError = detail.Commits.Err.Error() + } + card.ShowCommits = detail.TaskStatus == task.TaskStatusOpen card.ConflictFiles = detail.ConflictFiles // Workflow and stage if present. @@ -118,32 +131,33 @@ type showJSONProgressStep struct { } type showJSON struct { - Name string `json:"name"` - Title string `json:"title,omitempty"` - Branch string `json:"branch,omitempty"` - BaseBranch string `json:"base_branch,omitempty"` - Model string `json:"model,omitempty"` - Reasoning string `json:"reasoning,omitempty"` - Status string `json:"status,omitempty"` - WorkerStatus string `json:"worker_status,omitempty"` - Error string `json:"error,omitempty"` - Workspace string `json:"workspace,omitempty"` - Workflow string `json:"workflow,omitempty"` - Stage string `json:"stage,omitempty"` - TaskDir string `json:"task_dir,omitempty"` - Files []string `json:"files,omitempty"` - ProgressSteps []showJSONProgressStep `json:"progress_steps,omitempty"` - LinesAdded int `json:"lines_added,omitempty"` - LinesRemoved int `json:"lines_removed,omitempty"` - CommitsBehind int `json:"commits_behind,omitempty"` - ConflictFiles []string `json:"conflict_files,omitempty"` - IntegratedReason string `json:"integrated_reason,omitempty"` - HistoryPath string `json:"history_path,omitempty"` - LastWorkerReply string `json:"last_worker_reply,omitempty"` + Name string `json:"name"` + Title string `json:"title,omitempty"` + Branch string `json:"branch,omitempty"` + BaseBranch string `json:"base_branch,omitempty"` + BaseCommit string `json:"base_commit,omitempty"` + Model string `json:"model,omitempty"` + Reasoning string `json:"reasoning,omitempty"` + Status string `json:"status,omitempty"` + WorkerStatus string `json:"worker_status,omitempty"` + Error string `json:"error,omitempty"` + Workspace string `json:"workspace,omitempty"` + Workflow string `json:"workflow,omitempty"` + Stage string `json:"stage,omitempty"` + TaskDir string `json:"task_dir,omitempty"` + Files 
[]string `json:"files,omitempty"` + ProgressSteps []showJSONProgressStep `json:"progress_steps,omitempty"` + LinesAdded int `json:"lines_added,omitempty"` + LinesRemoved int `json:"lines_removed,omitempty"` + CommitCount int `json:"commit_count,omitempty"` + ConflictFiles []string `json:"conflict_files,omitempty"` + HistoryPath string `json:"history_path,omitempty"` + LastWorkerReply string `json:"last_worker_reply,omitempty"` } func (c *ShowCmd) renderJSON() (string, error) { - detail, err := gather.Detail(context.Background(), c.Task) + st := store.New() + detail, err := st.Get(context.Background(), c.Task, store.GetOptions{}) if err != nil { return "", err } @@ -152,21 +166,21 @@ func (c *ShowCmd) renderJSON() (string, error) { state := detail.State out := showJSON{ - Name: t.Name, - Title: t.Title, - Branch: t.Name, - BaseBranch: t.BaseBranch, - Model: detail.Model, - Reasoning: detail.Reasoning, - HistoryPath: task.HistoryPath(c.Task), - LastWorkerReply: lastWorkerReply(c.Task), - TaskDir: task.Dir(c.Task), - Files: detail.TaskFiles, - LinesAdded: detail.LinesAdded, - LinesRemoved: detail.LinesRemoved, - CommitsBehind: detail.CommitsBehind, - ConflictFiles: detail.ConflictFiles, - IntegratedReason: detail.IntegratedReason, + Name: t.Name, + Title: t.Title, + Branch: t.Name, + BaseBranch: t.BaseBranch, + BaseCommit: detail.BaseCommit, + Model: detail.Model, + Reasoning: detail.Reasoning, + HistoryPath: task.HistoryPath(c.Task), + LastWorkerReply: lastWorkerReply(c.Task), + TaskDir: task.Dir(c.Task), + Files: detail.TaskFiles, + LinesAdded: detail.Changes.Added, + LinesRemoved: detail.Changes.Removed, + CommitCount: detail.Commits.Count, + ConflictFiles: detail.ConflictFiles, } out.Status = string(detail.TaskStatus) diff --git a/cmd/subtask/stage.go b/cmd/subtask/stage.go index 2dcce38..c3128a3 100644 --- a/cmd/subtask/stage.go +++ b/cmd/subtask/stage.go @@ -21,6 +21,10 @@ type StageCmd struct { // Run executes the stage command. 
func (c *StageCmd) Run() error { + if _, err := preflightProject(); err != nil { + return err + } + if err := migrate.EnsureSchema(c.Task); err != nil { return err } diff --git a/cmd/subtask/status.go b/cmd/subtask/status.go index 3e2d13b..a14e9cc 100644 --- a/cmd/subtask/status.go +++ b/cmd/subtask/status.go @@ -9,84 +9,33 @@ import ( type StatusCmd struct{} func (c *StatusCmd) Run() error { - userBase, _, err := baseDirForScope(install.ScopeUser) + st, err := install.GetSkillStatus() if err != nil { return err } - projectBase, _, err := baseDirForScope(install.ScopeProject) - if err != nil { - return err - } - - user, err := install.GetScopeStatus(install.ScopeUser, userBase) - if err != nil { - return err - } - project, err := install.GetScopeStatus(install.ScopeProject, projectBase) - if err != nil { - return err - } - - printScopeStatus("User", user) - printScopeStatus("Project", project) - return nil -} - -func printScopeStatus(title string, st install.ScopeStatus) { - printSection(title) skillInstalled := "no" skillUpToDate := "-" skillSHA := "-" - if st.Skill.Installed { + if st.Installed { skillInstalled = "yes" - skillUpToDate = yesNo(st.Skill.UpToDate) - if st.Skill.InstalledSHA256 != "" { - skillSHA = shortHash(st.Skill.InstalledSHA256) - } - } - - pluginInstalled := "no" - pluginUpToDate := "-" - pluginSHA := "-" - if st.Plugin.Installed { - pluginInstalled = "yes" - pluginUpToDate = yesNo(st.Plugin.UpToDate) - if st.Plugin.InstalledSHA256 != "" { - pluginSHA = shortHash(st.Plugin.InstalledSHA256) + skillUpToDate = yesNo(st.UpToDate) + if st.InstalledSHA256 != "" { + skillSHA = shortHash(st.InstalledSHA256) } } - settingsExists := "no" - settingsEnabled := "-" - settingsErr := "-" - if st.Settings.Exists { - settingsExists = "yes" - settingsEnabled = yesNo(st.Settings.PluginEnabled) - } - if st.Settings.Error != "" { - settingsErr = st.Settings.Error - } - kv := &render.KeyValueList{ Pairs: []render.KV{ - {Key: "Skill path", Value: 
abbreviatePath(st.Skill.Path)}, + {Key: "Skill path", Value: abbreviatePath(st.Path)}, {Key: "Skill installed", Value: skillInstalled}, {Key: "Skill up-to-date", Value: skillUpToDate}, - {Key: "Skill embedded SHA256", Value: shortHash(st.Skill.EmbeddedSHA256)}, + {Key: "Skill embedded SHA256", Value: shortHash(st.EmbeddedSHA256)}, {Key: "Skill installed SHA256", Value: skillSHA}, - {Key: "Plugin dir", Value: abbreviatePath(st.Plugin.Dir)}, - {Key: "Plugin installed", Value: pluginInstalled}, - {Key: "Plugin up-to-date", Value: pluginUpToDate}, - {Key: "Plugin embedded SHA256", Value: shortHash(st.Plugin.EmbeddedSHA256)}, - {Key: "Plugin installed SHA256", Value: pluginSHA}, - {Key: "Settings path", Value: abbreviatePath(st.Settings.Path)}, - {Key: "Settings exists", Value: settingsExists}, - {Key: "Plugin enabled", Value: settingsEnabled}, - {Key: "Settings error", Value: settingsErr}, }, } kv.Print() + return nil } func shortHash(s string) string { @@ -95,3 +44,10 @@ func shortHash(s string) string { } return s[:12] } + +func yesNo(b bool) string { + if b { + return "yes" + } + return "no" +} diff --git a/cmd/subtask/styles.go b/cmd/subtask/styles.go new file mode 100644 index 0000000..b3361c5 --- /dev/null +++ b/cmd/subtask/styles.go @@ -0,0 +1,12 @@ +package main + +import "github.com/charmbracelet/lipgloss" + +var ( + successStyle = lipgloss.NewStyle(). + Foreground(lipgloss.Color("10")) + + subtleStyle = lipgloss.NewStyle(). 
+ Foreground(lipgloss.Color("247")) +) + diff --git a/cmd/subtask/test_helpers_test.go b/cmd/subtask/test_helpers_test.go new file mode 100644 index 0000000..fe8b4db --- /dev/null +++ b/cmd/subtask/test_helpers_test.go @@ -0,0 +1,25 @@ +package main + +import ( + "os" + "path/filepath" + "runtime" + "testing" + + "github.com/stretchr/testify/require" +) + +func writeFakeCLI(t *testing.T, dir string, name string) string { + t.Helper() + + if runtime.GOOS == "windows" { + path := filepath.Join(dir, name+".bat") + require.NoError(t, os.WriteFile(path, []byte("@echo off\r\nexit /B 0\r\n"), 0o644)) + return path + } + + path := filepath.Join(dir, name) + require.NoError(t, os.WriteFile(path, []byte("#!/bin/sh\nexit 0\n"), 0o755)) + return path +} + diff --git a/cmd/subtask/testdata/list/commits_behind.ansi b/cmd/subtask/testdata/list/commits_behind.ansi deleted file mode 100644 index d16c20a..0000000 --- a/cmd/subtask/testdata/list/commits_behind.ansi +++ /dev/null @@ -1,8 +0,0 @@ - - ╭─────────────────────────────────────────────────────────────────────╮ - │ TASK  STATUS STAGE  PROGRESS CHANGES  LAST ACTIVE │ - │ ─────────────────────────────────────────────────────────────────── │ - │ list/behind draft implement - - (2 behind) - │ - │ └ Behind task │ - ╰─────────────────────────────────────────────────────────────────────╯ - diff --git a/cmd/subtask/testdata/list/commits_behind.txt b/cmd/subtask/testdata/list/commits_behind.txt deleted file mode 100644 index e9a9d49..0000000 --- a/cmd/subtask/testdata/list/commits_behind.txt +++ /dev/null @@ -1,2 +0,0 @@ -TASK STATUS STAGE PROGRESS CHANGES LAST ACTIVE TITLE -list/behind draft implement - - (2 behind) - Behind task diff --git a/cmd/subtask/testdata/list/multi_status.ansi b/cmd/subtask/testdata/list/multi_status.ansi index 8fcad6d..6a6dfc5 100644 --- a/cmd/subtask/testdata/list/multi_status.ansi +++ b/cmd/subtask/testdata/list/multi_status.ansi @@ -14,8 +14,7 @@ │ d/error error review - - - │ │ └ Error task │ │ │ - │ 
e/closed ✓ merged ready - +2 -0 - │ + │ e/closed ✓ merged ready - - - │ │ └ Closed task │ ╰─────────────────────────────────────────────────────────────────────╯ - (1 workspace(s) available) diff --git a/cmd/subtask/testdata/list/multi_status.txt b/cmd/subtask/testdata/list/multi_status.txt index 50b12f8..71a71d3 100644 --- a/cmd/subtask/testdata/list/multi_status.txt +++ b/cmd/subtask/testdata/list/multi_status.txt @@ -3,5 +3,4 @@ b/working working (2m) implement 0/2 +1 -0 30s ago Working task a/draft draft implement - - - Draft task c/replied replied review 2/3 +3 -1 2h ago Replied task (← ctx/base) d/error error review - - - Error task -e/closed ✓ merged ready - +2 -0 - Closed task - (1 workspace(s) available) +e/closed ✓ merged ready - - - Closed task diff --git a/cmd/subtask/testdata/show/behind_conflicts.ansi b/cmd/subtask/testdata/show/behind_conflicts.ansi index fc86989..ee43efd 100644 --- a/cmd/subtask/testdata/show/behind_conflicts.ansi +++ b/cmd/subtask/testdata/show/behind_conflicts.ansi @@ -5,9 +5,11 @@ │ │ │ Status replied (1s) │ │ Branch show/behind-conflicts (based on main) │ + │ Base 6f3fdff538d6f9254acbcd16db946e375f877eb5 │ │ Model gpt-5.2 │ │ Workspace ws1 │ │ Changes +2 -2 │ + │ Commits 1 │ │ Conflicts a.txt, b.txt │ │ │ │ Directory .subtask/tasks/show--behind-conflicts (contains TASK.md, history.jsonl) │ diff --git a/cmd/subtask/testdata/show/behind_conflicts.txt b/cmd/subtask/testdata/show/behind_conflicts.txt index 2f60b65..46ea21a 100644 --- a/cmd/subtask/testdata/show/behind_conflicts.txt +++ b/cmd/subtask/testdata/show/behind_conflicts.txt @@ -1,10 +1,12 @@ Task: show/behind-conflicts Title: Conflict task Branch: show/behind-conflicts (based on main) +Base commit: 6f3fdff538d6f9254acbcd16db946e375f877eb5 Model: gpt-5.2 Workspace: ws1 Status: replied (1s) Changes: +2 -2 +Commits: 1 Conflicts: a.txt, b.txt Directory: .subtask/tasks/show--behind-conflicts (contains TASK.md, history.jsonl) diff --git a/cmd/subtask/testdata/show/draft.ansi 
b/cmd/subtask/testdata/show/draft.ansi index e564a76..94ec706 100644 --- a/cmd/subtask/testdata/show/draft.ansi +++ b/cmd/subtask/testdata/show/draft.ansi @@ -5,8 +5,10 @@ │ │ │ Status draft │ │ Branch show/draft (based on main) │ + │ Base a6de1c645da0682a92254a0a454e642f5500a7b9 │ │ Model gpt-5.2 │ │ Changes - │ + │ Commits 0 │ │ Stage doing → review → ready │ │ │ │ Directory .subtask/tasks/show--draft (contains TASK.md, WORKFLOW.yaml, history.jsonl) │ diff --git a/cmd/subtask/testdata/show/draft.txt b/cmd/subtask/testdata/show/draft.txt index d11e9ca..955d6cd 100644 --- a/cmd/subtask/testdata/show/draft.txt +++ b/cmd/subtask/testdata/show/draft.txt @@ -1,9 +1,11 @@ Task: show/draft Title: Draft task Branch: show/draft (based on main) +Base commit: a6de1c645da0682a92254a0a454e642f5500a7b9 Model: gpt-5.2 Status: draft Changes: - +Commits: 0 Workflow: default Stage: doing → review → ready diff --git a/cmd/subtask/testdata/show/model_reasoning.ansi b/cmd/subtask/testdata/show/model_reasoning.ansi index cf50153..67ad2a5 100644 --- a/cmd/subtask/testdata/show/model_reasoning.ansi +++ b/cmd/subtask/testdata/show/model_reasoning.ansi @@ -5,8 +5,10 @@ │ │ │ Status draft │ │ Branch show/model-reasoning (based on main) │ + │ Base a6de1c645da0682a92254a0a454e642f5500a7b9 │ │ Model gpt-5.2 (high) │ │ Changes - │ + │ Commits 0 │ │ │ │ Directory .subtask/tasks/show--model-reasoning (contains TASK.md, history.jsonl) │ ╰───────────────────────────────────────────────────────────────────────────────────╯ diff --git a/cmd/subtask/testdata/show/model_reasoning.txt b/cmd/subtask/testdata/show/model_reasoning.txt index 6f40c3c..37adb53 100644 --- a/cmd/subtask/testdata/show/model_reasoning.txt +++ b/cmd/subtask/testdata/show/model_reasoning.txt @@ -1,8 +1,10 @@ Task: show/model-reasoning Title: Model reasoning task Branch: show/model-reasoning (based on main) +Base commit: a6de1c645da0682a92254a0a454e642f5500a7b9 Model: gpt-5.2 (high) Status: draft Changes: - +Commits: 0 Directory: 
.subtask/tasks/show--model-reasoning (contains TASK.md, history.jsonl) diff --git a/cmd/subtask/testdata/show/replied.ansi b/cmd/subtask/testdata/show/replied.ansi index cea9b29..289b8f4 100644 --- a/cmd/subtask/testdata/show/replied.ansi +++ b/cmd/subtask/testdata/show/replied.ansi @@ -5,9 +5,11 @@ │ │ │ Status replied │ │ Branch show/replied (based on main) │ + │ Base a6de1c645da0682a92254a0a454e642f5500a7b9 │ │ Model gpt-5.2 │ │ Workspace ws1 │ │ Changes +4 -0 │ + │ Commits 1 │ │ Stage doing → review → ready │ │ │ │ Progress │ diff --git a/cmd/subtask/testdata/show/replied.txt b/cmd/subtask/testdata/show/replied.txt index 2b9937e..fdd02d0 100644 --- a/cmd/subtask/testdata/show/replied.txt +++ b/cmd/subtask/testdata/show/replied.txt @@ -1,10 +1,12 @@ Task: show/replied Title: Replied task Branch: show/replied (based on main) +Base commit: a6de1c645da0682a92254a0a454e642f5500a7b9 Model: gpt-5.2 Workspace: ws1 Status: replied Changes: +4 -0 +Commits: 1 Workflow: default Stage: doing → (review) → ready diff --git a/cmd/subtask/tui.go b/cmd/subtask/tui.go index 2aa7247..c8dec04 100644 --- a/cmd/subtask/tui.go +++ b/cmd/subtask/tui.go @@ -1,38 +1,16 @@ package main import ( - "bufio" "fmt" "os" - "strings" - "github.com/mattn/go-isatty" "github.com/zippoxer/subtask/pkg/logging" - "github.com/zippoxer/subtask/pkg/task" subtasktui "github.com/zippoxer/subtask/pkg/tui" ) func runTUIWithInitCheck() error { - if _, err := os.Stat(task.ConfigPath()); err != nil { - if !os.IsNotExist(err) { - return err - } - - if !isatty.IsTerminal(os.Stdin.Fd()) || !isatty.IsTerminal(os.Stdout.Fd()) { - return fmt.Errorf("subtask not initialized\n\nRun: subtask init") - } - - fmt.Print("Subtask is not initialized. Run 'subtask init' now? 
(y/n): ") - reader := bufio.NewReader(os.Stdin) - line, _ := reader.ReadString('\n') - line = strings.TrimSpace(strings.ToLower(line)) - if line != "y" && line != "yes" { - return fmt.Errorf("subtask not initialized\n\nRun: subtask init") - } - - if err := (&InitCmd{Workspaces: 20}).Run(); err != nil { - return err - } + if _, err := preflightProject(); err != nil { + return err } if logging.DebugEnabled() { diff --git a/cmd/subtask/uninstall.go b/cmd/subtask/uninstall.go index 8891403..3fb2f2d 100644 --- a/cmd/subtask/uninstall.go +++ b/cmd/subtask/uninstall.go @@ -3,53 +3,25 @@ package main import ( "fmt" + "github.com/zippoxer/subtask/internal/homedir" "github.com/zippoxer/subtask/pkg/install" ) // UninstallCmd implements 'subtask uninstall'. type UninstallCmd struct { - Skill bool `help:"Uninstall only the skill"` - Plugin bool `help:"Uninstall only the plugin"` - Scope string `default:"user" enum:"user,project" help:"Installation scope"` } func (c *UninstallCmd) Run() error { - scope, err := parseInstallScope(c.Scope) + homeDir, err := homedir.Dir() if err != nil { return err } - removeSkill := c.Skill - removePlugin := c.Plugin - if !c.Skill && !c.Plugin { - removeSkill = true - removePlugin = true - } - - baseDir, _, err := baseDirForScope(scope) + path, err := install.UninstallFrom(homeDir) if err != nil { return err } - res, err := install.UninstallAll(install.UninstallRequest{ - Scope: scope, - BaseDir: baseDir, - Skill: removeSkill, - Plugin: removePlugin, - }) - if err != nil { - return err - } - - if removeSkill { - printSuccess(fmt.Sprintf("Removed skill from %s", abbreviatePath(res.SkillPath))) - } - if removePlugin { - printSuccess(fmt.Sprintf("Removed plugin from %s", abbreviatePath(res.PluginDir))) - if res.Settings.Rewrote && res.Settings.BackupTo != "" { - printWarning(fmt.Sprintf("Rewrote malformed settings.json (backup at %s)", abbreviatePath(res.Settings.BackupTo))) - } - } - + printSuccess(fmt.Sprintf("Removed skill from %s", 
abbreviatePath(path))) return nil } diff --git a/cmd/subtask/workspace.go b/cmd/subtask/workspace.go index f3dc65b..5f45a54 100644 --- a/cmd/subtask/workspace.go +++ b/cmd/subtask/workspace.go @@ -13,6 +13,10 @@ type WorkspaceCmd struct { // Run executes the workspace command. func (c *WorkspaceCmd) Run() error { + if _, err := preflightProject(); err != nil { + return err + } + ws, err := workspace.ForTask(c.Task) if err != nil { return err diff --git a/docs/TUI.md b/docs/TUI.md index a145f66..89f7663 100644 --- a/docs/TUI.md +++ b/docs/TUI.md @@ -275,7 +275,7 @@ subtask list # Non-interactive output (for scripts) subtask show X # Non-interactive task details ``` -**First run**: If subtask is not initialized, prompts to run `subtask init`. +**First run**: If subtask is not configured, it prints an error and exits. Run `subtask install` first. The bare `subtask` command (no subcommand) launches the TUI. Existing commands work for scripting. @@ -314,7 +314,7 @@ tui/ ### Data Flow -1. **Init**: Check if initialized, load tasks, start refresh ticker +1. **Init**: Validate git + config, load tasks, start refresh ticker 2. **Tick**: Every 2s, fetch fresh data via existing task/git packages 3. **Update**: Handle keys, mouse, tick messages 4. **View**: Render based on current view and state diff --git a/docs/issues/codex-network-error-on-success.md b/docs/issues/codex-network-error-on-success.md new file mode 100644 index 0000000..577ad3e --- /dev/null +++ b/docs/issues/codex-network-error-on-success.md @@ -0,0 +1,24 @@ +# Codex harness reports errors even when Codex succeeds + +## Symptom + +When using the `codex` harness, Subtask can report a final network error (and mark the run as failed) even though the worker produced a valid final reply. 
+ +## Likely cause (in Subtask code) + +`pkg/harness/codex.go` latches *any* streamed JSONL `"error"` event into `Result.Error`, and `runCodexCommand` returns an error at the end of the run whenever `Result.Error` is non-empty — even if: + +- the Codex process exits with code 0, and +- the `-o` output file contains a valid final assistant message. + +Key spots: + +- `processCodexJSONLLine`: `case "error": result.Error = event.Message` +- `runCodexCommand`: `// If we got an error event, return it even if exit code was 0` + +This matches the hypothesis that a transient/recovered error (e.g. an internally retried network failure) can poison the final result. + +## Why it looks like a “session end” error + +`cmd/subtask/send.go` treats any non-nil harness error as a failed worker run regardless of whether a reply was returned, so a latched transient `"error"` event becomes the final outcome for the run. + diff --git a/docs/issues/history-write-before-delivery.md b/docs/issues/history-write-before-delivery.md new file mode 100644 index 0000000..439055a --- /dev/null +++ b/docs/issues/history-write-before-delivery.md @@ -0,0 +1,47 @@ +# History writes before harness acknowledges delivery + +## Problem + +When sending a message to a worker, we write to `history.jsonl` BEFORE the harness confirms prompt delivery. + +Current flow in `cmd/subtask/send.go`: +1. `prepareWorkspaceAndState()` writes `lead.message` + `worker.started` to history +2. `h.Run()` sends to harness + +If the harness fails to deliver (network error before prompt is received), history shows "message sent" but the worker never got it. 
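For illustration, a hypothetical `history.jsonl` tail (event names from the flow above; field shapes are assumed, not the real schema). Both events are present even though the prompt was never delivered, because they were written before `h.Run()`:

```jsonl
{"type":"lead.message","text":"please add retries"}
{"type":"worker.started"}
```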
+ +## Consequences + +- **History is misleading**: Shows messages that were never delivered +- **Duplicate messages on retry**: User might re-send, not knowing original was recorded +- **Session continuation ambiguity**: Depends on harness implementation whether it resends + +## Current behavior rationale + +History records "what lead intended to send" rather than "what was acknowledged." This is useful for: +- Audit/debugging (see what was attempted) +- Crash recovery (know what was in flight) + +## Alternative: Delivery-confirmed semantics + +To guarantee history only contains delivered messages: + +1. Call harness first +2. Wait for `PromptDelivered` confirmation (e.g., `thread.started` event) +3. Only then write to history +4. Handle partial failures (harness started but didn't complete) + +## Considerations + +- **Complexity**: Partial failure handling is tricky (started but crashed mid-run) +- **Ordering**: Would need to buffer the message until delivery confirmed +- **Atomicity**: What if we confirm delivery, then crash before writing history? + +## Recommendation + +Keep current behavior but: +1. Document that history records intent, not confirmed delivery +2. Consider adding a `delivered: true/false` field to `lead.message` events +3. Set `delivered: true` after harness confirms (could be done post-hoc) + +This preserves the audit trail while adding delivery status visibility. diff --git a/docs/plans/design--external-merge-detection.md b/docs/plans/design--external-merge-detection.md deleted file mode 100644 index 0ef99da..0000000 --- a/docs/plans/design--external-merge-detection.md +++ /dev/null @@ -1,213 +0,0 @@ -# External merge detection (simplified, reliable) - -## Problem - -Subtask’s durable task status (`open|closed|merged`) is derived from `history.jsonl` events. Today, tasks become `merged` only when `subtask merge` appends a `task.merged` event. 
- -If the task branch is integrated into the base branch externally (manual `git merge`, GitHub PR, etc.), Subtask still reports the task as `open` or `closed`, even though the branch’s changes are already in the base branch. - -## Goals - -- Detect that a task branch’s *net* changes are already present in the base branch (“integrated”), even if the merge happened outside Subtask. -- Avoid long SQLite locks (do git work outside DB transactions; keep writes small). -- Recompute only when inputs change (branch/base moved). -- Prefer *guarantees* over heuristics; keep the implementation small. - -## Non-goals (first iteration) - -- Perfect detection after the local task branch is deleted (possible but expensive; needs patch-id/commit search heuristics). -- Identifying *which* commit(s) on the base branch correspond to the task for squash merges (no single canonical commit). -- Network calls / GitHub API integration. - -## GitHub reality check (what “merged” means there) - -GitHub does **not** infer “merged” purely from git history; it stores PR state as metadata: - -- A PR is “merged” when GitHub performed a merge action and recorded it (e.g., REST `GET /repos/{owner}/{repo}/pulls/{pull_number}` exposes `merged_at`, `merged`, and `merge_commit_sha`). -- GitHub *can* recognize **indirect merges** (commits become reachable on the base branch) and show a PR as merged in some cases, but this relies on commit ancestry (it won’t detect “same patch via squash” as a merge of that PR). - -Implication for Subtask: -- If we want “reliable” without an API, we should lean on git primitives that provide guarantees about **reachability** and/or **no-op merges**, not try to mimic GitHub’s PR metadata. 
- -References: -- GitHub REST API: “Get a pull request” and “Check if a pull request has been merged”: https://docs.github.com/en/rest/pulls/pulls#get-a-pull-request and https://docs.github.com/en/rest/pulls/pulls#check-if-a-pull-request-has-been-merged -- GitHub docs: “Indirect merges” in “About pull request merges”: https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/incorporating-changes-from-a-pull-request/about-pull-request-merges#indirect-merges - -## Current implementation (relevant code) - -- Durable status: `task/history/history.go` (`Tail()` walks `task.opened|task.closed|task.merged`). -- Merge flow: `task/ops/merge.go` appends `task.merged` with a squash commit SHA and frees workspace. -- Git “integration” check exists already: - - `git/git.go:IsIntegrated(dir, branch, target) IntegrationReason` (currently a multi-step ladder). - - Cached in SQLite as `git_integrated_reason` via `task/index/gitcache.go`. -- `EffectiveTarget()` already prefers `origin/` when it’s ahead to catch PR merges before the user pulls. - -This design recommends simplifying integration detection down to two primitives (reachability + no-op merge) and updating the index cache logic accordingly. - -## What does “integrated” mean? - -For Subtask’s purposes: - -> A task is “integrated” if merging the task branch into the base branch would produce **no tree changes** (i.e., a no-op merge). - -This matches user intent (“the task’s changes are already in main”), and it’s robust across merge strategies. - -## Git primitives: what’s guaranteed vs “best effort” - -To keep this reliable and simple, use **two** git checks only: - -1. **Reachability (guarantee for history-preserving merges)** - `git merge-base --is-ancestor ` - If true, the exact branch tip commit is in the target’s history (fast-forward or merge commit). This is the same core primitive GitHub relies on for “indirect merges”. - -2. 
**No-op merge (guarantee for content integration)** - Compute the tree that would result from merging, and compare it to the target tree: - `git merge-tree --write-tree ` vs `git rev-parse ^{tree}` - If equal (and no conflicts), merging would introduce no content changes. This covers squash merges, cherry-picks, and “applied elsewhere” cases without guesswork about commit identities. - -Cost: -- Reachability is very cheap. -- `merge-tree --write-tree` is the expensive step, but we only do it when reachability is false *and* inputs changed. - -## Proposed approach (simplified) - -### What we show to users - -Define a *computed* status (from the index) that can be shown as `✓ merged`: - -- If durable history says `merged` → show merged. -- Else if git check says **integrated** → show merged (with reason, e.g. `ancestor` or `merge_adds_nothing`). -- Else show the durable status (`open`/`closed`). - -This achieves “shows as merged” without silently mutating durable history. - -### Durable status stays explicit - -For **open** tasks, don’t auto-append `task.merged` on detection. This avoids surprising side effects (e.g., `subtask diff` expecting a merge commit SHA). - -For **closed** tasks, once integration is proven, it’s reasonable to promote `closed → merged` by appending a `task.merged` event. This matches user intent (“it ended up merged”) and avoids the confusing state where a task is permanently “closed (not merged)” even though its content is in `main`. - -- Summary: - - Open tasks: integration affects display only (`✓ merged`), no durable mutation. - - Closed tasks: integration proof appends `task.merged` (durable). - -## Caching & invalidation under frequent base advances - -The base branch advances frequently (subtask merges + user commits). Benchmarks show that re-evaluating `merge-tree` for *every* task on every base advance (or on every `list`) is too slow at 50–100 tasks. 
- -So the design shifts to: - -- Keep `list` fast by default (no per-task integration recompute). -- Do O(1) work after `send` (only one task changes). -- Do O(1) work after `merge` (only that task is merged by definition). -- Detect *external* git ref changes reliably and, when they occur, run an explicit “repair” pass before displaying output (can be slower, but only when refs changed outside Subtask). - -### What we cache - -**Per task (SQLite `tasks` table):** - -- `git_last_branch_head TEXT`: last known commit SHA for the task branch (even if the branch ref is later deleted). -- `git_patch_id TEXT`: patch-id for the task diff vs its recorded `base_commit` (used for debugging/optional optimizations). -- `git_integrated_reason TEXT`: empty/NULL = unknown or not integrated, non-empty enum when integration is proven (`ancestor` or `merge_tree_noop`). -- `git_integrated_branch_head TEXT`: branch head used to prove integration. -- `git_integrated_target_head TEXT`: target head used to prove integration. -- `git_integrated_checked_at_ns INTEGER`: timestamp. - -**Repository meta (new `index_meta` table, single-row):** - -- `git_refs_snapshot_json TEXT`: JSON map of relevant refs to SHAs. -- `git_refs_snapshot_hash TEXT`: hash of the snapshot (fast compare). -- `git_refs_snapshot_at_ns INTEGER`: timestamp. - -The snapshot covers: -- `refs/heads/` for known tasks (if it exists). -- `refs/heads/` and `refs/remotes/origin/` for base branches used by tasks. - -### How we detect “cache might be wrong” - -On every `list`/TUI refresh (and `show`), compute a current refs snapshot using **one git command**: - -- `git for-each-ref --format=%(refname)%00%(objectname) ` - -If the hash matches `git_refs_snapshot_hash`, we know refs have not changed since Subtask last updated the index → cached integration state is valid. - -If it differs, some refs changed outside Subtask (external merge, manual rebase, etc.) 
→ run the repair pass before displaying output (so we don’t show stale state). - -### Effective target choice - -Use `git.EffectiveTarget(repoDir, baseBranch)` (prefer `origin/` when it’s ahead *if it exists locally*). Note: Subtask can’t detect GitHub merges until the local repo has fetched updated refs. - -## When we compute/recompute integration - -### Normal path (no external git ref changes) - -- After a worker finishes (`send` completes and the branch head changes): recompute integration **for that one task only**, update `git_last_branch_head`, and update the refs snapshot. -- After `subtask merge`: task is already known merged (durable `task.merged` event); update refs snapshot. No integration scan needed. -- `list`/TUI: read cached data; only pay the ~single-command refs snapshot check. - -### Repair path (refs changed outside Subtask) - -When `list`/`show` detects a snapshot mismatch: - -1. Update the snapshot in the index meta row. -2. Recompute integration for tasks that are not already durable-merged: - - First try reachability: `git merge-base --is-ancestor `. - - If false, try no-op merge: `git merge-tree --write-tree ` and compare to `^{tree}`. -3. Write updated `git_integrated_reason` (and heads) back to SQLite, then render. - -This can take noticeable time at 50–100 tasks, but it only happens when git state changed outside Subtask (the case where correctness matters and some waiting is acceptable). - -## Explicit durable transition (not implemented) - -Subtask does not automatically convert “detected as integrated” into a durable `task.merged` event. This keeps Subtask from silently mutating history based on git heuristics. - -## Edge cases - -- **GitHub PR state**: without API integration, we won’t match GitHub’s “merged” metadata; we’re detecting repository integration. -- **Branch deleted**: reachability and no-op merge can still be checked if we cached `git_last_branch_head` and the commit object still exists locally. 
-- **Squash merges**: detected via the no-op merge check. - -## Benchmarks (real numbers) - -Environment: -- Apple M4 Pro, macOS kernel `25.1.0`, `git version 2.51.0`. -- Synthetic repos in `/tmp` with disjoint task file sets (no merge conflicts), 50 commits of base churn, and task branches with 2 commits touching 10 files each. - -### Ref snapshot cost (100 task branches) - -- `git for-each-ref refs/heads/task/`: median ~5.3ms (p95 ~6.9ms) -- `git show-ref --heads`: median ~5.2ms - -This is cheap enough to run on every `list`/TUI refresh to detect external git changes. - -### Integration checks cost - -Single task (1 branch): -- `merge-base --is-ancestor`: median ~4.2ms -- `merge-tree --write-tree` + tree compare: median ~9.0ms -- Combined: median ~12.9ms - -Batch (N tasks), measured as “loop N times spawning git each time” (same cost model as Subtask’s helpers): -- N=10: `is-ancestor` ~70ms; `merge-tree` ~82–84ms -- N=50: `is-ancestor` ~300–430ms; `merge-tree` ~330–460ms -- N=100: `is-ancestor` ~540–985ms; `merge-tree` ~635–1031ms - -Conclusion: doing `merge-tree` (or even `is-ancestor`) for *all tasks* on every base advance or on every `list` does not meet the “never wait on list” requirement at 50–100 tasks. This is why the design uses a fast ref snapshot check and only runs a full repair pass when refs changed outside Subtask. - -## Rough implementation plan - -1. **Schema**: bump `task/index/schema.go` to v5. - - Add per-task columns listed above (`git_last_branch_head`, `git_integrated_*`, optionally `git_patch_id`). - - Add `index_meta` table for `git_refs_snapshot_*`. -2. **Ref snapshot**: implement snapshot build/compare using `git for-each-ref` and a stable hash. -3. **Mutation hooks**: - - On `send` completion: update `git_last_branch_head` for the task; update refs snapshot. - - On `merge`: update refs snapshot (and durable status already handled by history). -4. **Repair pass**: - - Triggered only when snapshot mismatch is detected on `list`/`show`. 
- - Runs integration checks for tasks not durable-merged, writes results to SQLite, then renders output. -5. **Presentation**: list/TUI/show display “merged” when either durable status is `merged` or `git_integrated_reason` is non-empty. -6. **Tests**: e2e coverage for: - - history-preserving merge (ancestor) detection, - - squash merge detection (merge-tree), - - No manual command required; detection affects display only. diff --git a/docs/plans/git-exploration.md b/docs/plans/git-exploration.md new file mode 100644 index 0000000..a4a4206 --- /dev/null +++ b/docs/plans/git-exploration.md @@ -0,0 +1,183 @@ +# Merge Detection: Scenarios & Trade-offs + +## The Dream + +**For someone who doesn't know git:** +> "I created a task. The worker did the work. I reviewed it. I merged it. Done." + +They never think about branches, commits, or merge strategies. They see: +- **Open** → work in progress +- **Merged** → work is in the codebase +- **Closed** → abandoned, didn't use it + +**For someone who knows git:** +> "Subtask tracks my tasks. Workers do the work in isolated branches. I can merge however I want - via subtask, GitHub PR, manual git. Subtask figures it out." + +They have freedom to use their preferred workflow. Subtask stays out of the way but keeps track. + +--- + +### The Ideal Experience + +1. **"Merged" means one thing:** The task's changes are in the main codebase now. + - Doesn't matter *how* it got there + - No "integrated", "detected", "indirect" - just "merged" + +2. **Stats are permanent:** Once merged, you can always see what the task contributed (`+50 -10`). Like a GitHub PR - the stats don't disappear. + +3. **No wrong states:** You never see a task as "open" when the work is already in main. You never see "merged" when it isn't. + +4. **Simple lifecycle:** + ``` + open → merged (work shipped) + open → closed (abandoned) + ``` + That's it. No weird transitions. 
+
+---
+
+### What This Means Practically
+
+| User Action | What Subtask Shows |
+|-------------|-------------------|
+| `subtask merge` | merged |
+| Merge via GitHub PR | merged |
+| `git merge` manually | merged |
+| `git merge --squash` | merged |
+| Cherry-pick the changes | merged |
+| Close without merging | closed |
+
+**The user never has to tell subtask "hey, I merged this externally."** Subtask just knows.
+
+---
+
+## Technical Exploration
+
+This section explores how to achieve the dream above.
+
+### Ancestor Detection (what GitHub uses)
+
+```bash
+git merge-base --is-ancestor <branch> <base>
+```
+
+Returns true if the task branch tip commit is reachable from the base branch.
+
+**Catches:**
+- Merge commits (`git merge branch`)
+- Fast-forward merges
+- Rebase + push (task branch rebased onto base, then base fast-forwarded)
+- `subtask merge` (which does squash + rebase + fast-forward)
+
+**Misses:**
+- Squash merge (`git merge --squash`) - creates new commit, branch tip not reachable
+- Cherry-pick - creates new commit(s), branch tip not reachable
+
+### Content Detection (catches squash/cherry-pick)
+
+```bash
+git merge-tree --write-tree <base> <branch>
+# Compare result tree to the <base> tree
+```
+
+Returns true if merging would produce no changes (content already in base).
+
+**Catches everything above, plus:**
+- Squash merge
+- Cherry-pick
+
+**Cost:** ~9ms per task vs ~4ms for ancestor-only.
+
+---
+
+### Scenario Comparison: Ancestor-Only vs Content Detection
+
+| # | Scenario | Ancestor-Only (Status / Changes) | With Content Detection (Status / Changes) |
+|---|----------|----------------------------------|--------------------------------------------|
+| A | **`subtask merge`** | `merged` / frozen ✓ | `merged` / frozen ✓ |
+| B | **GitHub PR merge** | `merged` / frozen ✓ | `merged` / frozen ✓ |
+| C | **Manual `git merge`** | `merged` / frozen ✓ | `merged` / frozen ✓ |
+| D | **Fast-forward merge** | `merged` / frozen ✓ | `merged` / frozen ✓ |
+| E | **Rebase + FF** | `merged` / frozen ✓ | `merged` / frozen ✓ |
+| F | **Squash merge** | `open` / `+0 -0` | `merged` / frozen |
+| G | **Cherry-pick** | `open` / `+0 -0` | `merged` / frozen |
+| H | **Close without merging** | `closed` / frozen ✓ | `closed` / frozen ✓ |
+| I | **Close, then branch merged** | `merged` / frozen ✓ | `merged` / frozen ✓ |
+| J | **Close, then cherry-picked** | `closed` / frozen | `merged` / frozen |
+
+### Analysis
+
+**Scenarios A-E, H-I**: Both approaches behave identically. These cover the vast majority of workflows.
+
+**Scenarios F-G (squash/cherry-pick while open)**:
+
+| Aspect | Ancestor-Only | Content Detection |
+|--------|---------------|-------------------|
+| Status shown | `open` | `merged` |
+| Changes shown | `+0 -0` (live) | frozen at detection |
+| User signal | "Nothing left to merge" | "Merged" |
+| False positives? | No | Possible (independent identical fix) |
+| Matches GitHub? | Yes | No (GitHub stays Open) |
+| Code complexity | Lower | Higher (+ Git version fallback) |
+
+**Scenario J (close, then cherry-picked)**:
+
+| Aspect | Ancestor-Only | Content Detection |
+|--------|---------------|-------------------|
+| Status shown | `closed` | `merged` |
+| User intent | Preserved ("I closed it") | Overridden |
+| Matches GitHub? | No (GitHub stays Closed) | No (GitHub stays Closed) |
+
+### The Trade-off
+
+| Aspect | Ancestor-Only | Content Detection |
+|--------|---------------|-------------------|
+| **Correctness** | Never wrong | Risk of false positives |
+| **Predictability** | High (matches GitHub) | Lower (magic detection) |
+| **User intent** | Respected | Sometimes overridden |
+| **Code complexity** | ~100 LOC | ~200+ LOC + fallback |
+| **Git version** | Any | 2.38+ or fallback needed |
+| **UX for squash** | `+0 -0` signals "done" | `merged` explicit |
+
+### Open Questions
+
+1. **Is `+0 -0` a good enough signal?** Users see "open" but zero changes - is that confusing or clear?
+
+2. **Do we want to match GitHub?** GitHub's approach is battle-tested and users understand it.
+
+3. **Is "too clever" detection risky?** Auto-marking merged when the user didn't merge could surprise users.
+
+---
+
+### What Users See
+
+| Status | Changes Field | Notes |
+|--------|---------------|-------|
+| `open` | Live diff (`+50 -10`) | Updates as worker makes changes |
+| `merged` | Frozen at merge (`+50 -10`) | Preserved forever |
+| `closed` | Frozen at close (`+50 -10`) | Preserved forever |
+
+---
+
+## Implementation Cost
+
+| Approach | Lines of Code | Catches |
+|----------|---------------|---------|
+| Current implementation | ~1800 | All merge types |
+| Proposed rewrite | ~150-200 | All merge types |
+
+The current implementation has ~1600 lines of accidental complexity (snapshots, repair passes, multiple strategies, promotion logic). The core detection is simple.
+
+---
+
+## Recommendation
+
+To achieve "the dream":
+
+1. **Use ancestor + content detection** - catches all merge styles
+2. **Auto-write `task.merged` to history** - freezes stats, marks task done
+3. **Closed can become merged** - matches GitHub, more forgiving
+
+This means rewriting the current ~1800 line implementation as ~150-200 lines.
diff --git a/docs/plans/git-redesign-requirements.md b/docs/plans/git-redesign-requirements.md new file mode 100644 index 0000000..d577e59 --- /dev/null +++ b/docs/plans/git-redesign-requirements.md @@ -0,0 +1,59 @@ +# Task Freshness: Requirements + +## Goals (what we're optimizing for) + +1. **Never show wrong data** - prefer waiting over lying +2. **Feel instant** - list/show/detail should be fast +3. **Keep complexity in one place** - single unified access layer + +## Assumptions (normal operation, not edge cases) + +1. **1-10 tasks are actively worked on at any time** - workers commit regularly, this is normal +2. **Workers do arbitrary git operations** - commit, rebase, reset, amend, cherry-pick, etc. +3. **Local-first** - subtask doesn't run `git fetch`, only user actions change remote refs + +## Requirements + +1. **Universal correctness guarantee** - every access to task data gets correct data or an explicit error, never stale/wrong/placeholder values. This applies to: + - TUI (list, detail) + - CLI (`subtask list`, `subtask show`, etc.) + - Internal code (any package that reads task data) + +2. **Single access layer** - all readers go through `pkg/task/store`, which enforces the correctness guarantee. Business logic never thinks about freshness/caching. + +3. **Input-based invalidation** - no TTL. Cache validity = input equality. Store what inputs were used to compute each cached value; on access, compare current inputs to stored inputs; same = valid, different = recompute. + +4. **No "unknown" UI states** - wait for correct data, never show placeholders. If data is computable, compute it (wait if needed). If data is genuinely not computable (e.g., commit deleted), return an explicit truthful error, not "unknown". + +5. **Parallel recompute** - when inputs change for multiple tasks, recompute in parallel (bounded pool ~8) to keep it fast. + +6. **Durable "merged" status** - only from history events (`task.merged`), never inferred from integration cache. 
+ +7. **Immutable merged tasks** - store diff stats in `task.merged` history event at merge time, never recompute. + +8. **Immutable closed tasks** - store diff stats in `task.closed` history event at close time, never recompute. This ensures that when base branch advances, only open tasks (1-10 typically) need recomputation, not the potentially large number of closed tasks. + +9. **Backwards compatibility & seamless migration** - we MUST NOT break existing user setups (users are currently on v0.1.1). This applies to everything in this plan. Specifically: + - Users with existing tasks (merged/closed/open) must continue to work correctly after upgrade. + - A **one-time migration** should run automatically and seamlessly when the new binary first runs. + - Migration must use **proper locking** to prevent corruption if multiple subtask processes run in parallel. + - Migration should **backfill any missing data** (e.g., `base_commit` for existing tasks) to leave old tasks in an ideal state - the goal is to make old tasks indistinguishable from new ones. + - **Migration must be isolated** - all migration logic belongs in a dedicated migration package, NOT spread around the codebase. + - **Avoid backwards-compat branches in main code** - prefer backfilling/migrating old data to the new schema so the main codebase doesn't need conditionals or special cases for old vs new tasks. The migration does the work once so the rest of the code stays clean. + +10. **Thorough e2e tests** - comprehensive end-to-end tests with actual golden fixtures covering various cases and edge cases. We need confidence that the implementation is correct and doesn't regress. + +11. **External merge detection (ancestor-only)** - when a task branch tip becomes reachable from the base branch (via merge commit, fast-forward, or rebase), Subtask should detect this and write `task.merged` to history. This must: + - Be **local-first** (no implicit `git fetch`; detection is based on refs present locally). 
+ - Write `task.merged` event with frozen stats (same as `subtask merge`). + - Use ancestor detection only (`git merge-base --is-ancestor`) - no content detection. + - Be correct or explicit error (e.g., branch deleted / base missing), never a placeholder. + - **Not detect** squash merges or cherry-picks (matches GitHub behavior). For these, users see `open` with historical stats preserved. + +## Non-requirements (things we're OK with) + +1. **Waiting briefly for correct data** - if recompute is needed, we wait. Fast in practice because only 1-10 tasks change at a time and we recompute in parallel. + +2. **Significant refactor** - we want ideal, correct, simple, and fast. Not a quick patch. + +3. **One-off migration** - if needed to achieve backwards compatibility correctly, a well-designed migration is acceptable. diff --git a/docs/plans/git-redesign.md b/docs/plans/git-redesign.md new file mode 100644 index 0000000..cb1d6eb --- /dev/null +++ b/docs/plans/git-redesign.md @@ -0,0 +1,282 @@ +# Git Redesign: Historical Diffs + Ancestor Merge Detection + +## 1. Overview / Goals + +Subtask’s “task freshness” problems came from tying task status + change stats to moving git state (base branch advancing, branches being rewritten/deleted) and from having multiple ad-hoc readers spread across CLI/TUI/ops. + +This redesign makes the task view: + +- **Correct**: never show stale/wrong data; return explicit, typed errors when something is not computable. +- **Fast**: `list`/TUI should feel instant; recompute only for tasks whose inputs changed; parallelize recompute for 1–10 active tasks. +- **Simple**: one access layer (`pkg/task/store`) owns all caching + invalidation + locking. + +### “Dream” UX + +For non-git users: +- `subtask list` shows a small set of stable, meaningful fields: status + historical change stats (like GitHub PR “Files changed”), plus explicit error text when something is broken. 
+- Tasks show as merged automatically if Subtask can **prove** the task branch tip landed in base (ancestor check), without requiring the user to understand Git internals. + +For git experts: +- `subtask show` provides detail inputs (base commit, branch head, base head), commit count, and commit timeline entries. +- Explicit errors mention the ref/commit that’s missing and how to fix it (fetch/restore branch/etc.). + +## 2. Key Decisions + +1) **Historical diff (not live)** (GitHub-style) +- `Changes` is a stable historical metric of “what the task contributed”. +- It does **not** rebase against a moving base head and does **not** collapse to `+0 -0` when base later contains the changes. + +2) **Ancestor-only merge detection** +- Detect “merged” only when `branch_head` is an ancestor of the current base head. +- No “content detection” (no squash/cherry-pick detection). This avoids false positives and matches GitHub’s “indirect merge” behavior. + +3) **Auto-write `task.merged` on detection** +- When ancestor detection triggers for an open task, Subtask appends a durable `task.merged` event (with frozen stats). +- After that, the task is durably merged; no continued re-detection is required. + +4) **Closed stays closed** +- Closed tasks are immutable. They never auto-promote to merged later. + +5) **Log commits to history (PR-style timeline)** +- When Subtask observes new commits on an open task branch, it appends `task.commit` events so the timeline reflects actual git activity. + +6) **Show commit count (detail only)** +- Detail view shows `Commits: N` where `N` is commits on the task branch since `base_commit`. +- This replaces “commits behind base”; we do not show “behind”. + +## 3. Data Model + +### 3.1 `base_commit`: meaning + capture + +`base_commit` is the immutable “starting point” for the task’s historical diff and commit count: + +- **Definition**: the base branch HEAD at task creation time. 
+- **Capture time**: when the task is created/opened (e.g., `subtask draft`/`subtask send` that creates the branch). +- **Persistence**: stored durably (see below) so the task’s “Files changed” equivalent stays stable across base advances. + +### 3.2 How `Changes` are computed + +For open tasks (and for frozen merged/closed tasks, once recorded): + +``` +Changes = diffstat(base_commit..branch_head) +``` + +- Inputs: `base_commit`, `branch_head` +- Output: `added`, `removed` (optionally `files_changed`) +- Errors: + - `ErrBranchMissing` / `ErrBranchDeleted` + - `ErrBaseMissing` (if we cannot resolve `base_commit` due to repo corruption) + - `ErrCommitMissing` (if `base_commit` or `branch_head` object no longer exists locally) + +### 3.3 SQLite schema (columns needed) + +SQLite remains a local cache/projection. It stores *cached derived values with their inputs* so the store can be fast and correct without TTL. + +Minimum per-task columns (conceptual): + +- Identity + file-backed: + - `name TEXT PRIMARY KEY` + - file sigs / updated timestamps (existing mechanism) + +- Task git anchors: + - `base_branch TEXT` (e.g. `main`) + - `base_commit TEXT` (immutable once set) + +- Heads (current inputs as last observed by Subtask): + - `branch_head TEXT` (current `refs/heads/` tip if exists) + - `base_head TEXT` (current `refs/heads/` tip as observed; used for merge detection) + +- Cached historical metrics (for open tasks): + - `changes_added INTEGER` + - `changes_removed INTEGER` + - `changes_base_commit TEXT` (input) + - `changes_branch_head TEXT` (input) + - `commit_count INTEGER` + - `commit_count_base_commit TEXT` (input) + - `commit_count_branch_head TEXT` (input) + +- Commit logging bookkeeping: + - `commit_log_last_head TEXT` (last `branch_head` we scanned/logged) + - optionally a lightweight dedupe aid: + - `commit_log_seen_patch_ids_json TEXT` (bounded; for rebase/amend dedupe), OR keep SHA-only and accept duplicates. 
+ +This design intentionally avoids repo-wide snapshots, “commits behind”, and “integration reason” columns. + +### 3.4 History event schemas + +History (`history.jsonl`) is the portable source of truth. + +#### `task.commit` + +Appended when Subtask *observes* new commits on an open task branch. + +```json +{ + "type": "task.commit", + "sha": "abc123...", + "subject": "Add tests for X", + "author_name": "Codex", + "author_email": "codex@example.com", + "authored_at": 1730000000, + "seen_at": 1730000100 +} +``` + +Notes: +- We log what git says (author/subject/time). UI can render “worker committed …” based on author identity or simply “commit …”. +- Rebases/amends may produce new SHAs; see edge cases. + +#### `task.merged` (via detected) + +Appended when ancestor detection proves the task branch tip is in base. + +```json +{ + "type": "task.merged", + "via": "detected", + "method": "ancestor", + "base_branch": "main", + "base_commit": "deadbeef...", + "branch_head": "abc123...", + "base_head": "fedcba...", + "changes_added": 50, + "changes_removed": 10, + "commit_count": 5, + "detected_at": 1730000200 +} +``` + +Notes: +- This freezes the historical stats at the moment we mark merged. +- This is durable status (requirement: “merged only from history”). + +## 4. Components + +### 4.1 `pkg/task/store` + +The single access layer for all readers (CLI, TUI, internal ops). + +Responsibilities: +- Load file-backed task state (TASK.md/history/state/progress) with strict validation. +- Gather git inputs cheaply (refs for all tasks + base branches). 
+- Ensure returned views are correct for current inputs:
+  - use cached values only when input-equal
+  - recompute invalid tasks in parallel (bounded pool)
+- Perform any required writes under lock:
+  - append `task.commit` events
+  - append `task.merged` event when detection triggers
+  - update SQLite cache rows
+
+API shape:
+
+```go
+type Store interface {
+	List(ctx context.Context, opts ListOptions) (ListResult, error)
+	Get(ctx context.Context, name string, opts GetOptions) (TaskView, error)
+}
+```
+
+`ListResult` includes partial errors (`[]TaskLoadError`), and the CLI must print them.
+
+### 4.2 `pkg/task/index`
+
+SQLite cache/projection.
+
+Responsibilities:
+- Store per-task cached derived values with their inputs.
+- Store bookkeeping for commit logging scans.
+- Keep transactions short; git work happens outside DB transactions.
+
+### 4.3 `pkg/git`
+
+Minimal primitives:
+- `ListRefs(refs []string) map[string]sha` (uses `git for-each-ref`).
+- `MergeBaseIsAncestor(ancestor, descendant) bool` (uses `git merge-base --is-ancestor`).
+- `DiffStat(baseCommit, branchHead) (added, removed, filesChanged, error)`
+- `RevListCount(baseCommit, branchHead) (int, error)`
+- `ListCommits(baseCommit, branchHead) ([]CommitMeta, error)` for commit logging
+- `CommitMeta(sha)` (subject/author/times) if needed independently
+
+## 5. Flows
+
+### 5.1 `subtask list` / TUI list
+
+1) `store.List` enumerates tasks and validates task files.
+2) Store gathers current refs (single git call).
+3) For each task:
+   - If task is durably merged/closed: read frozen stats from history (and optionally cache in SQLite for speed).
+   - If open:
+     - ensure `base_commit` is known (see migration).
+     - compute or reuse cached `Changes` and (optionally) minimal fields needed for list.
+4) Store returns `ListResult{Tasks, Errors}`; CLI prints errors.
+
+### 5.2 `subtask show <task>`
+
+1) `store.Get` loads the task and ensures all detail fields are correct for current inputs.
+2) Computes:
+   - `Changes` (historical diff)
+   - `Commits: N` (rev-list count)
+   - commit timeline entries (from history)
+3) May append `task.commit` / `task.merged` if required (under lock) before returning.
+
+### 5.3 When worker commits
+
+There is no polling requirement. On any subsequent store access that observes a changed `branch_head`:
+
+1) Store lists commits between `commit_log_last_head..branch_head` (or `base_commit..branch_head` on first run).
+2) Appends `task.commit` events for newly observed commits.
+3) Updates `commit_log_last_head` in SQLite.
+
+### 5.4 When external merge is detected (ancestor-only)
+
+On store access for an open task:
+
+1) Resolve `branch_head` and `base_head`.
+2) If `merge-base --is-ancestor branch_head base_head`:
+   - compute frozen stats (`diff(base_commit..branch_head)`, commit count)
+   - append `task.merged` (`via=detected`)
+   - free workspace if policy requires (optional; out of scope here)
+3) Subsequent reads treat the task as durably merged (history).
+
+### 5.5 `subtask merge` vs detected merge
+
+- `subtask merge` continues to perform Subtask’s standard merge flow and appends `task.merged` with `via="subtask"` (and its merge metadata if applicable).
+- Detected merge appends `task.merged` with `via="detected"` and `method="ancestor"`.
+- Both produce the same durable status + frozen stats behavior for UX.
+
+## 6. Edge Cases
+
+- **Branch deleted**
+  - Open task: `Changes.Err = ErrBranchDeleted` (if we know it existed) or `ErrBranchMissing`.
+  - Detected merges cannot be proven without `branch_head` unless we have a cached commit SHA and it still exists locally.
+
+- **Rebase/amend (commit logging)**
+  - SHA-only logging: rebases create new SHAs; history shows both. Correct, but noisy.
+  - Optional patch-id dedupe reduces noise without requiring complex "rewrite" events.
+
+- **Git version compatibility**
+  - Ancestor-only detection uses `git merge-base --is-ancestor` (supported by very old Git releases; verified on Git 2.34.x).
+ - No `merge-tree --write-tree` usage in this design. + +## 7. What We’re NOT Doing + +- No content detection (no squash/cherry-pick “merged” detection). +- No “commits behind base”. +- No promotion of closed tasks to merged. + +## 8. Migration + +From current `dev`: + +- Remove the existing complex “integration” subsystem and any UI display paths that can show “merged” without a `task.merged` event. +- Introduce `base_commit`: + - New tasks record it at creation/open time (durably). + - Existing tasks: + - If `base_commit` is missing, initialize once on first access as the then-current base head (or merge-base if that is the closest available anchor), persist it, and treat it as the task’s historical anchor going forward. + - Document that legacy tasks may not perfectly match “true original base” if created before this feature. +- Update list/show rendering to use historical `Changes` and detail commit count. +- Add/extend e2e tests: + - external merge via merge commit / fast-forward (ancestor) triggers `task.merged via=detected` + - commit logging appends `task.commit` + - rebase/amend behavior (either allowed-noisy or patch-id dedupe) diff --git a/pkg/e2e/applied_missing_test.go b/pkg/e2e/applied_missing_test.go new file mode 100644 index 0000000..6b4c39d --- /dev/null +++ b/pkg/e2e/applied_missing_test.go @@ -0,0 +1,172 @@ +package e2e + +import ( + "encoding/json" + "os" + "os/exec" + "path/filepath" + "strings" + "testing" + + "github.com/stretchr/testify/require" + + "github.com/zippoxer/subtask/pkg/task" + "github.com/zippoxer/subtask/pkg/task/history" + "github.com/zippoxer/subtask/pkg/testutil" +) + +func TestAppliedContentDetection_ShowsAppliedAndMergeNoOps(t *testing.T) { + run := func(t *testing.T, force string) { + env := testutil.NewTestEnv(t, 1) + + taskName := "test/applied" + env.CreateTask(taskName, "Applied task", "main", "Applied") + baseCommit := strings.TrimSpace(gitCmd(t, env.RootDir, "rev-parse", "HEAD")) + env.CreateTaskHistory(taskName, 
[]history.Event{ + {Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main", "base_commit": baseCommit})}, + {Type: "stage.changed", Data: mustJSON(map[string]any{"from": "", "to": "implement"})}, + {Type: "worker.finished", Data: mustJSON(map[string]any{"run_id": "r1", "duration_ms": 0, "tool_calls": 0, "outcome": "replied"})}, + }) + + // Create task state (simulating a task that has been run) + state := &task.State{ + Workspace: env.Workspaces[0], + } + env.CreateTaskState(taskName, state) + + // Create workspace with task branch and a commit. + ws := env.Workspaces[0] + gitCmd(t, ws, "checkout", "-b", taskName) + + featureFile := filepath.Join(ws, "feature.txt") + require.NoError(t, os.WriteFile(featureFile, []byte("line 1\n"), 0644)) + gitCmd(t, ws, "add", "feature.txt") + gitCmd(t, ws, "commit", "-m", "Add feature") + + // Simulate a squash merge (or independently-applied change) into main with a different commit. + mainFile := filepath.Join(env.RootDir, "feature.txt") + require.NoError(t, os.WriteFile(mainFile, []byte("line 1\n"), 0644)) + gitCmd(t, env.RootDir, "add", "feature.txt") + gitCmd(t, env.RootDir, "commit", "-m", "Apply feature via squash") + + subtaskBin := buildSubtask(t) + + // List should show "applied (+A -R)" for plain output. + cmd := exec.Command(subtaskBin, "list") + cmd.Dir = env.RootDir + cmd.Env = append(os.Environ(), "SUBTASK_MERGE_SIM_FORCE="+force) + out, err := cmd.CombinedOutput() + require.NoError(t, err, "list should succeed: %s", out) + require.Contains(t, string(out), "applied (+1 -0)") + + // Show should explain that content is already in base. + cmd = exec.Command(subtaskBin, "show", taskName) + cmd.Dir = env.RootDir + cmd.Env = append(os.Environ(), "SUBTASK_MERGE_SIM_FORCE="+force) + out, err = cmd.CombinedOutput() + require.NoError(t, err, "show should succeed: %s", out) + require.Contains(t, string(out), "Already in base branch. 
Run `subtask merge` to mark as merged.") + + // Merge should no-op (main already contains the change) but still mark the task as merged. + mainBefore := strings.TrimSpace(gitCmd(t, env.RootDir, "rev-parse", "main")) + + cmd = exec.Command(subtaskBin, "merge", taskName, "-m", "Merge applied task") + cmd.Dir = env.RootDir + cmd.Env = append(os.Environ(), "SUBTASK_MERGE_SIM_FORCE="+force) + out, err = cmd.CombinedOutput() + require.NoError(t, err, "merge should succeed: %s", out) + + mainAfter := strings.TrimSpace(gitCmd(t, env.RootDir, "rev-parse", "main")) + require.Equal(t, mainBefore, mainAfter, "merge should not create a new main commit when content is already in base") + + tail, err := history.Tail(taskName) + require.NoError(t, err) + require.Equal(t, task.TaskStatusMerged, tail.TaskStatus) + + events, err := history.Read(taskName, history.ReadOptions{EventsOnly: true}) + require.NoError(t, err) + var mergedEv history.Event + for i := len(events) - 1; i >= 0; i-- { + if events[i].Type == "task.merged" { + mergedEv = events[i] + break + } + } + require.Equal(t, "task.merged", mergedEv.Type) + var mergedData struct { + Via string `json:"via"` + Method string `json:"method"` + ChangesAdded int `json:"changes_added"` + ChangesRemoved int `json:"changes_removed"` + CommitCount int `json:"commit_count"` + } + require.NoError(t, json.Unmarshal(mergedEv.Data, &mergedData)) + require.Equal(t, "subtask", mergedData.Via) + require.True(t, mergedData.Method == "merge_adds_nothing" || mergedData.Method == "trees_match", "unexpected method: %q", mergedData.Method) + require.Equal(t, 1, mergedData.ChangesAdded) + require.Equal(t, 0, mergedData.ChangesRemoved) + require.Equal(t, 1, mergedData.CommitCount) + } + + t.Run("merge-tree", func(t *testing.T) { + if !mergeTreeWriteTreeSupported() { + t.Skip("git merge-tree --write-tree not supported") + } + run(t, "merge-tree") + }) + t.Run("index", func(t *testing.T) { + run(t, "index") + }) +} + +func mergeTreeWriteTreeSupported() 
bool { + cmd := exec.Command("git", "merge-tree", "-h") + out, _ := cmd.CombinedOutput() // -h exits non-zero + s := string(out) + return strings.Contains(s, "--write-tree") && strings.Contains(s, "--merge-base") && strings.Contains(s, "--name-only") +} + +func TestMissingBranch_ShowsMissingInListAndShow(t *testing.T) { + env := testutil.NewTestEnv(t, 1) + + taskName := "test/missing" + env.CreateTask(taskName, "Missing branch task", "main", "Missing") + baseCommit := strings.TrimSpace(gitCmd(t, env.RootDir, "rev-parse", "HEAD")) + env.CreateTaskHistory(taskName, []history.Event{ + {Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main", "base_commit": baseCommit})}, + {Type: "stage.changed", Data: mustJSON(map[string]any{"from": "", "to": "implement"})}, + {Type: "worker.finished", Data: mustJSON(map[string]any{"run_id": "r1", "duration_ms": 0, "tool_calls": 0, "outcome": "replied"})}, + }) + + // Create a branch (not checked out in a worktree) and then delete it to simulate an external delete. 
+ gitCmd(t, env.RootDir, "checkout", "-b", taskName) + require.NoError(t, os.WriteFile(filepath.Join(env.RootDir, "missing.txt"), []byte("x\n"), 0644)) + gitCmd(t, env.RootDir, "add", "missing.txt") + gitCmd(t, env.RootDir, "commit", "-m", "Add missing file") + gitCmd(t, env.RootDir, "checkout", "main") + gitCmd(t, env.RootDir, "branch", "-D", taskName) + + subtaskBin := buildSubtask(t) + + cmd := exec.Command(subtaskBin, "list") + cmd.Dir = env.RootDir + out, err := cmd.CombinedOutput() + require.NoError(t, err, "list should succeed: %s", out) + + found := false + for _, line := range strings.Split(string(out), "\n") { + if strings.HasPrefix(strings.TrimSpace(line), taskName+" ") { + found = true + require.Contains(t, line, "missing") + break + } + } + require.True(t, found, "expected list output to include task %q", taskName) + + cmd = exec.Command(subtaskBin, "show", taskName) + cmd.Dir = env.RootDir + out, err = cmd.CombinedOutput() + require.NoError(t, err, "show should succeed: %s", out) + require.Contains(t, string(out), "Changes: missing") + require.Contains(t, string(out), "Branch was deleted or commit objects are missing.") +} diff --git a/pkg/e2e/close_command_test.go b/pkg/e2e/close_command_test.go new file mode 100644 index 0000000..a48d6dc --- /dev/null +++ b/pkg/e2e/close_command_test.go @@ -0,0 +1,81 @@ +package e2e + +import ( + "encoding/json" + "os" + "os/exec" + "path/filepath" + "strings" + "testing" + + "github.com/stretchr/testify/assert" + "github.com/stretchr/testify/require" + + "github.com/zippoxer/subtask/pkg/task" + "github.com/zippoxer/subtask/pkg/task/history" + "github.com/zippoxer/subtask/pkg/testutil" +) + +func TestCloseCommand_FreezesStats(t *testing.T) { + env := testutil.NewTestEnv(t, 1) + + taskName := "test/close" + env.CreateTask(taskName, "Test close command", "main", "Test close") + + baseCommit := strings.TrimSpace(gitCmd(t, env.RootDir, "rev-parse", "HEAD")) + env.CreateTaskHistory(taskName, []history.Event{ + {Type: 
"task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main", "base_commit": baseCommit})}, + {Type: "stage.changed", Data: mustJSON(map[string]any{"from": "", "to": "implement"})}, + }) + + env.CreateTaskState(taskName, &task.State{Workspace: env.Workspaces[0]}) + + ws := env.Workspaces[0] + gitCmd(t, ws, "checkout", "-b", taskName) + f := filepath.Join(ws, "feature.txt") + require.NoError(t, os.WriteFile(f, []byte("line 1\nline 2\n"), 0o644)) + gitCmd(t, ws, "add", "feature.txt") + gitCmd(t, ws, "commit", "-m", "Add feature") + branchHead := strings.TrimSpace(gitCmd(t, ws, "rev-parse", "HEAD")) + + subtaskBin := buildSubtask(t) + cmd := exec.Command(subtaskBin, "close", taskName) + cmd.Dir = env.RootDir + out, err := cmd.CombinedOutput() + require.NoError(t, err, "close should succeed: %s", out) + + evs, err := history.Read(taskName, history.ReadOptions{EventsOnly: true}) + require.NoError(t, err) + var closedEv history.Event + for i := len(evs) - 1; i >= 0; i-- { + if evs[i].Type == "task.closed" { + closedEv = evs[i] + break + } + } + require.Equal(t, "task.closed", closedEv.Type) + var closedData struct { + Reason string `json:"reason"` + BaseBranch string `json:"base_branch"` + BaseCommit string `json:"base_commit"` + BranchHead string `json:"branch_head"` + ChangesAdded int `json:"changes_added"` + ChangesRemoved int `json:"changes_removed"` + CommitCount int `json:"commit_count"` + FrozenError string `json:"frozen_error"` + } + require.NoError(t, json.Unmarshal(closedEv.Data, &closedData)) + assert.Equal(t, "close", closedData.Reason) + assert.Equal(t, "main", closedData.BaseBranch) + assert.Equal(t, baseCommit, closedData.BaseCommit) + assert.Equal(t, branchHead, closedData.BranchHead) + assert.Equal(t, 2, closedData.ChangesAdded) + assert.Equal(t, 0, closedData.ChangesRemoved) + assert.Equal(t, 1, closedData.CommitCount) + assert.Empty(t, strings.TrimSpace(closedData.FrozenError)) + + state, err := task.LoadState(taskName) + 
require.NoError(t, err) + require.NotNil(t, state) + assert.Empty(t, state.Workspace) +} diff --git a/pkg/e2e/commit_logging_test.go b/pkg/e2e/commit_logging_test.go new file mode 100644 index 0000000..b1eddb4 --- /dev/null +++ b/pkg/e2e/commit_logging_test.go @@ -0,0 +1,76 @@ +package e2e + +import ( + "encoding/json" + "os" + "os/exec" + "path/filepath" + "strings" + "testing" + "time" + + "github.com/stretchr/testify/assert" + "github.com/stretchr/testify/require" + + "github.com/zippoxer/subtask/pkg/task" + "github.com/zippoxer/subtask/pkg/task/history" + "github.com/zippoxer/subtask/pkg/testutil" +) + +func TestCommitLogging_WritesTaskCommitEvents(t *testing.T) { + env := testutil.NewTestEnv(t, 1) + + taskName := "test/commit-log" + env.CreateTask(taskName, "Commit logging", "main", "Log commits to history") + + baseCommit := strings.TrimSpace(gitCmd(t, env.RootDir, "rev-parse", "HEAD")) + env.CreateTaskHistory(taskName, []history.Event{ + {Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main", "base_commit": baseCommit})}, + {Type: "stage.changed", Data: mustJSON(map[string]any{"from": "", "to": "implement"})}, + {Type: "worker.finished", TS: time.Now().UTC(), Data: mustJSON(map[string]any{"run_id": "r1", "duration_ms": 0, "tool_calls": 0, "outcome": "replied"})}, + }) + env.CreateTaskState(taskName, &task.State{Workspace: env.Workspaces[0]}) + + ws := env.Workspaces[0] + gitCmd(t, ws, "checkout", "-b", taskName) + f := filepath.Join(ws, "feature.txt") + require.NoError(t, os.WriteFile(f, []byte("hello\n"), 0o644)) + gitCmd(t, ws, "add", "feature.txt") + gitCmd(t, ws, "commit", "-m", "Add feature") + commitSHA := strings.TrimSpace(gitCmd(t, ws, "rev-parse", "HEAD")) + + subtaskBin := buildSubtask(t) + cmd := exec.Command(subtaskBin, "show", taskName) + cmd.Dir = env.RootDir + out, err := cmd.CombinedOutput() + require.NoError(t, err, "show should succeed: %s", out) + + evs, err := history.Read(taskName, 
history.ReadOptions{EventsOnly: true}) + require.NoError(t, err) + + found := false + for _, ev := range evs { + if ev.Type != "task.commit" { + continue + } + var d struct { + SHA string `json:"sha"` + Subject string `json:"subject"` + AuthorName string `json:"author_name"` + AuthorEmail string `json:"author_email"` + AuthoredAt int64 `json:"authored_at"` + SeenAt int64 `json:"seen_at"` + } + require.NoError(t, json.Unmarshal(ev.Data, &d)) + if strings.TrimSpace(d.SHA) != commitSHA { + continue + } + found = true + assert.Equal(t, "Add feature", d.Subject) + assert.Equal(t, "Test User", d.AuthorName) + assert.Equal(t, "test@test.com", d.AuthorEmail) + assert.Greater(t, d.AuthoredAt, int64(0)) + assert.Greater(t, d.SeenAt, int64(0)) + } + require.True(t, found, "expected task.commit event for %s", commitSHA) +} diff --git a/pkg/e2e/external_merge_detection_test.go b/pkg/e2e/external_merge_detection_test.go index da0162b..b843e81 100644 --- a/pkg/e2e/external_merge_detection_test.go +++ b/pkg/e2e/external_merge_detection_test.go @@ -1,14 +1,13 @@ package e2e import ( - "database/sql" + "encoding/json" "os" "os/exec" "path/filepath" "strings" "testing" - - _ "modernc.org/sqlite" + "time" "github.com/stretchr/testify/assert" "github.com/stretchr/testify/require" @@ -18,39 +17,19 @@ import ( "github.com/zippoxer/subtask/pkg/testutil" ) -func refsSnapshotHash(t *testing.T, root string) string { - t.Helper() - - dbPath := filepath.Join(root, ".subtask", "index.db") - db, err := sql.Open("sqlite", dbPath) - require.NoError(t, err) - t.Cleanup(func() { _ = db.Close() }) - - var hash sql.NullString - require.NoError(t, db.QueryRow(`SELECT git_refs_snapshot_hash FROM index_meta WHERE id = 1;`).Scan(&hash)) - if !hash.Valid { - return "" - } - return strings.TrimSpace(hash.String) -} - -func runSubtaskCLI(t *testing.T, binPath, root string, args ...string) string { - t.Helper() - cmd := exec.Command(binPath, args...) 
- cmd.Dir = root - out, err := cmd.CombinedOutput() - require.NoError(t, err, "subtask %v failed: %s", args, out) - return string(out) -} - -func TestExternalMergeDetection_ListShowsMerged_AncestorAndSnapshotInvalidation(t *testing.T) { +func TestExternalMergeDetection_WritesTaskMerged(t *testing.T) { env := testutil.NewTestEnv(t, 1) - taskName := "ext/ancestor" - env.CreateTask(taskName, "External ancestor merge", "main", "test") + taskName := "test/external-merge" + env.CreateTask(taskName, "External merge detection", "main", "Detect external merges") + + baseCommit := strings.TrimSpace(gitCmd(t, env.RootDir, "rev-parse", "HEAD")) env.CreateTaskHistory(taskName, []history.Event{ - {Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}, + {Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main", "base_commit": baseCommit})}, + {Type: "stage.changed", Data: mustJSON(map[string]any{"from": "", "to": "implement"})}, + {Type: "worker.finished", TS: time.Now().UTC(), Data: mustJSON(map[string]any{"run_id": "r1", "duration_ms": 0, "tool_calls": 0, "outcome": "replied"})}, }) + env.CreateTaskState(taskName, &task.State{Workspace: env.Workspaces[0]}) ws := env.Workspaces[0] gitCmd(t, ws, "checkout", "-b", taskName) @@ -58,165 +37,55 @@ func TestExternalMergeDetection_ListShowsMerged_AncestorAndSnapshotInvalidation( require.NoError(t, os.WriteFile(f, []byte("hello\n"), 0o644)) gitCmd(t, ws, "add", "feature.txt") gitCmd(t, ws, "commit", "-m", "Add feature") + branchHead := strings.TrimSpace(gitCmd(t, ws, "rev-parse", "HEAD")) - bin := buildSubtask(t) - - // First list persists a snapshot (no repair pass). - out1 := runSubtaskCLI(t, bin, env.RootDir, "list") - assert.Contains(t, out1, taskName) - hash1 := refsSnapshotHash(t, env.RootDir) - require.NotEmpty(t, hash1) - - // External (history-preserving) merge. 
- gitCmd(t, env.RootDir, "checkout", "main") - gitCmd(t, env.RootDir, "merge", "--no-ff", taskName, "-m", "Merge "+taskName) - - out2 := runSubtaskCLI(t, bin, env.RootDir, "list") - assert.Contains(t, out2, taskName) - assert.Contains(t, out2, "✓ merged") - - hash2 := refsSnapshotHash(t, env.RootDir) - require.NotEmpty(t, hash2) - require.NotEqual(t, hash1, hash2) - - // No changes: snapshot hash stays stable. - out3 := runSubtaskCLI(t, bin, env.RootDir, "list") - assert.Contains(t, out3, taskName) - assert.Contains(t, out3, "✓ merged") - hash3 := refsSnapshotHash(t, env.RootDir) - require.Equal(t, hash2, hash3) -} - -func TestExternalMergeDetection_ListShowsMerged_SquashMerge(t *testing.T) { - env := testutil.NewTestEnv(t, 1) - taskName := "ext/squash" - - env.CreateTask(taskName, "External squash merge", "main", "test") - env.CreateTaskHistory(taskName, []history.Event{ - {Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}, - }) - - ws := env.Workspaces[0] - gitCmd(t, ws, "checkout", "-b", taskName) - f := filepath.Join(ws, "squash.txt") - require.NoError(t, os.WriteFile(f, []byte("squashed\n"), 0o644)) - gitCmd(t, ws, "add", "squash.txt") - gitCmd(t, ws, "commit", "-m", "Add squashed file") - - bin := buildSubtask(t) - - // Prime snapshot. - _ = runSubtaskCLI(t, bin, env.RootDir, "list") - require.NotEmpty(t, refsSnapshotHash(t, env.RootDir)) - - // External squash merge. 
- gitCmd(t, env.RootDir, "checkout", "main") - gitCmd(t, env.RootDir, "merge", "--squash", taskName) - gitCmd(t, env.RootDir, "commit", "-m", "Squash "+taskName) - - out := runSubtaskCLI(t, bin, env.RootDir, "list") - assert.Contains(t, out, taskName) - assert.Contains(t, out, "✓ merged") -} - -func TestExternalMergeDetection_Revocability_BranchAdvancesClearsMerged(t *testing.T) { - env := testutil.NewTestEnv(t, 1) - taskName := "ext/revocable" - - env.CreateTask(taskName, "Revocable", "main", "test") - env.CreateTaskHistory(taskName, []history.Event{ - {Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}, - }) - - ws := env.Workspaces[0] - gitCmd(t, ws, "checkout", "-b", taskName) - f := filepath.Join(ws, "rev.txt") - require.NoError(t, os.WriteFile(f, []byte("v1\n"), 0o644)) - gitCmd(t, ws, "add", "rev.txt") - gitCmd(t, ws, "commit", "-m", "v1") - - // External merge. - gitCmd(t, env.RootDir, "checkout", "main") - gitCmd(t, env.RootDir, "merge", "--no-ff", taskName, "-m", "Merge "+taskName) - - bin := buildSubtask(t) - out1 := runSubtaskCLI(t, bin, env.RootDir, "list") - assert.Contains(t, out1, taskName) - assert.Contains(t, out1, "✓ merged") - - // Branch advances after integration. 
- gitCmd(t, ws, "checkout", taskName) - require.NoError(t, os.WriteFile(f, []byte("v2\n"), 0o644)) - gitCmd(t, ws, "add", "rev.txt") - gitCmd(t, ws, "commit", "-m", "v2") - - out2 := runSubtaskCLI(t, bin, env.RootDir, "list") - assert.Contains(t, out2, taskName) - assert.NotContains(t, out2, "✓ merged") -} - -func TestExternalMergeDetection_ClosedTaskAutoPromotesToMerged(t *testing.T) { - env := testutil.NewTestEnv(t, 1) - taskName := "ext/closed-promote" - - env.CreateTask(taskName, "Closed promote", "main", "test") - env.CreateTaskHistory(taskName, []history.Event{ - {Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}, - {Type: "task.closed", Data: mustJSON(map[string]any{"reason": "close"})}, - }) - - ws := env.Workspaces[0] - gitCmd(t, ws, "checkout", "-b", taskName) - f := filepath.Join(ws, "closed.txt") - require.NoError(t, os.WriteFile(f, []byte("c\n"), 0o644)) - gitCmd(t, ws, "add", "closed.txt") - gitCmd(t, ws, "commit", "-m", "c") - - // External merge into main. + // Merge branch into base outside of subtask (fast-forward). 
gitCmd(t, env.RootDir, "checkout", "main") - gitCmd(t, env.RootDir, "merge", "--no-ff", taskName, "-m", "Merge "+taskName) + gitCmd(t, env.RootDir, "merge", "--ff-only", taskName) + baseHead := strings.TrimSpace(gitCmd(t, env.RootDir, "rev-parse", "main")) + require.Equal(t, branchHead, baseHead, "fast-forward merge should move main to branch tip") - bin := buildSubtask(t) - out := runSubtaskCLI(t, bin, env.RootDir, "list") - assert.Contains(t, out, taskName) - assert.Contains(t, out, "✓ merged") + subtaskBin := buildSubtask(t) + cmd := exec.Command(subtaskBin, "show", taskName) + cmd.Dir = env.RootDir + out, err := cmd.CombinedOutput() + require.NoError(t, err, "show should succeed: %s", out) tail, err := history.Tail(taskName) require.NoError(t, err) assert.Equal(t, task.TaskStatusMerged, tail.TaskStatus) - assert.NotEmpty(t, tail.LastMergedCommit) -} - -func TestExternalMergeDetection_BaseForcePushed_RemovesMergedDetection(t *testing.T) { - env := testutil.NewTestEnv(t, 1) - taskName := "ext/force-push" - - env.CreateTask(taskName, "Force push", "main", "test") - env.CreateTaskHistory(taskName, []history.Event{ - {Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}, - }) - - ws := env.Workspaces[0] - gitCmd(t, ws, "checkout", "-b", taskName) - f := filepath.Join(ws, "fp.txt") - require.NoError(t, os.WriteFile(f, []byte("fp\n"), 0o644)) - gitCmd(t, ws, "add", "fp.txt") - gitCmd(t, ws, "commit", "-m", "fp") - - // External merge. - mainBefore := strings.TrimSpace(gitCmd(t, env.RootDir, "rev-parse", "main")) - gitCmd(t, env.RootDir, "checkout", "main") - gitCmd(t, env.RootDir, "merge", "--no-ff", taskName, "-m", "Merge "+taskName) - - bin := buildSubtask(t) - out1 := runSubtaskCLI(t, bin, env.RootDir, "list") - assert.Contains(t, out1, taskName) - assert.Contains(t, out1, "✓ merged") - // Force-push style rewrite: drop the merge commit. 
-	gitCmd(t, env.RootDir, "reset", "--hard", mainBefore)
-
-	out2 := runSubtaskCLI(t, bin, env.RootDir, "list")
-	assert.Contains(t, out2, taskName)
-	assert.NotContains(t, out2, "✓ merged")
+	evs, err := history.Read(taskName, history.ReadOptions{EventsOnly: true})
+	require.NoError(t, err)
+	var mergedEv history.Event
+	for i := len(evs) - 1; i >= 0; i-- {
+		if evs[i].Type == "task.merged" {
+			mergedEv = evs[i]
+			break
+		}
+	}
+	require.Equal(t, "task.merged", mergedEv.Type)
+	var mergedData struct {
+		Via            string `json:"via"`
+		Method         string `json:"method"`
+		BaseBranch     string `json:"base_branch"`
+		BaseCommit     string `json:"base_commit"`
+		BranchHead     string `json:"branch_head"`
+		BaseHead       string `json:"base_head"`
+		ChangesAdded   int    `json:"changes_added"`
+		ChangesRemoved int    `json:"changes_removed"`
+		CommitCount    int    `json:"commit_count"`
+		FrozenError    string `json:"frozen_error"`
+	}
+	require.NoError(t, json.Unmarshal(mergedEv.Data, &mergedData))
+	assert.Equal(t, "detected", mergedData.Via)
+	assert.Equal(t, "ancestor", mergedData.Method)
+	assert.Equal(t, "main", mergedData.BaseBranch)
+	assert.Equal(t, baseCommit, mergedData.BaseCommit)
+	assert.Equal(t, branchHead, mergedData.BranchHead)
+	assert.Equal(t, baseHead, mergedData.BaseHead)
+	assert.Equal(t, 1, mergedData.ChangesAdded)
+	assert.Equal(t, 0, mergedData.ChangesRemoved)
+	assert.Equal(t, 1, mergedData.CommitCount)
+	assert.Empty(t, strings.TrimSpace(mergedData.FrozenError))
 }
diff --git a/pkg/e2e/git_helpers_test.go b/pkg/e2e/git_helpers_test.go
new file mode 100644
index 0000000..cd3458f
--- /dev/null
+++ b/pkg/e2e/git_helpers_test.go
@@ -0,0 +1,19 @@
+package e2e
+
+import (
+	"os/exec"
+	"testing"
+)
+
+// gitCmd runs a git command in dir, failing the test on error, and
+// returns the combined output.
+func gitCmd(t *testing.T, dir string, args ...string) string {
+	t.Helper()
+	cmd := exec.Command("git", args...)
+	cmd.Dir = dir
+	out, err := cmd.CombinedOutput()
+	if err != nil {
+		t.Fatalf("git %v failed: %v\n%s", args, err, out)
+	}
+	return string(out)
+}
+
diff --git a/pkg/e2e/gitredesign_migration_test.go b/pkg/e2e/gitredesign_migration_test.go
new file mode 100644
index 0000000..94cc155
--- /dev/null
+++ b/pkg/e2e/gitredesign_migration_test.go
@@ -0,0 +1,126 @@
+package e2e
+
+import (
+	"encoding/json"
+	"os"
+	"os/exec"
+	"path/filepath"
+	"strings"
+	"testing"
+
+	"github.com/stretchr/testify/require"
+
+	"github.com/zippoxer/subtask/pkg/task"
+	"github.com/zippoxer/subtask/pkg/task/history"
+	"github.com/zippoxer/subtask/pkg/testutil"
+)
+
+func TestGitRedesignMigration_BackfillsBaseCommit(t *testing.T) {
+	env := testutil.NewTestEnv(t, 1)
+
+	taskName := "legacy/nobasecommit"
+	require.NoError(t, (&task.Task{
+		Name:        taskName,
+		Title:       "Legacy task missing base_commit",
+		BaseBranch:  "main",
+		Description: "Legacy",
+		Schema:      1,
+	}).Save())
+	env.CreateTaskHistory(taskName, []history.Event{
+		{Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})},
+		{Type: "stage.changed", Data: mustJSON(map[string]any{"from": "", "to": "implement"})},
+	})
+
+	// Create the task branch so the migration can infer merge-base.
+	ws := env.Workspaces[0]
+	gitCmd(t, ws, "checkout", "-b", taskName)
+
+	wantBaseCommit := strings.TrimSpace(gitCmd(t, env.RootDir, "rev-parse", "main"))
+
+	subtaskBin := buildSubtask(t)
+	cmd := exec.Command(subtaskBin, "list")
+	cmd.Dir = env.RootDir
+	out, err := cmd.CombinedOutput()
+	require.NoError(t, err, "list should succeed: %s", out)
+
+	evs, err := history.Read(taskName, history.ReadOptions{EventsOnly: true})
+	require.NoError(t, err)
+
+	var opened history.Event
+	for i := len(evs) - 1; i >= 0; i-- {
+		if evs[i].Type == "task.opened" {
+			opened = evs[i]
+			break
+		}
+	}
+	require.Equal(t, "task.opened", opened.Type)
+
+	var d struct {
+		BaseBranch string `json:"base_branch"`
+		BaseCommit string `json:"base_commit"`
+		BaseRef    string `json:"base_ref"`
+	}
+	require.NoError(t, json.Unmarshal(opened.Data, &d))
+	require.Equal(t, "main", d.BaseBranch)
+	require.Equal(t, "main", d.BaseRef)
+	require.Equal(t, wantBaseCommit, d.BaseCommit)
+}
+
+func TestGitRedesignMigration_LegacyTaskShowsApplied(t *testing.T) {
+	env := testutil.NewTestEnv(t, 1)
+
+	taskName := "legacy/applied"
+	require.NoError(t, (&task.Task{
+		Name:        taskName,
+		Title:       "Legacy task applied",
+		BaseBranch:  "main",
+		Description: "Legacy applied",
+		Schema:      1,
+	}).Save())
+	env.CreateTaskHistory(taskName, []history.Event{
+		{Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}, // legacy: no base_commit
+		{Type: "stage.changed", Data: mustJSON(map[string]any{"from": "", "to": "implement"})},
+		{Type: "worker.finished", Data: mustJSON(map[string]any{"run_id": "r1", "duration_ms": 0, "tool_calls": 0, "outcome": "replied"})},
+	})
+	env.CreateTaskState(taskName, &task.State{Workspace: env.Workspaces[0]})
+
+	// Create branch + commit in workspace.
+ ws := env.Workspaces[0] + gitCmd(t, ws, "checkout", "-b", taskName) + f := filepath.Join(ws, "feature.txt") + require.NoError(t, os.WriteFile(f, []byte("hello\n"), 0o644)) + gitCmd(t, ws, "add", "feature.txt") + gitCmd(t, ws, "commit", "-m", "Add feature") + + // Apply the same change to main via a different commit (squash-like). + mainFile := filepath.Join(env.RootDir, "feature.txt") + require.NoError(t, os.WriteFile(mainFile, []byte("hello\n"), 0o644)) + gitCmd(t, env.RootDir, "add", "feature.txt") + gitCmd(t, env.RootDir, "commit", "-m", "Apply feature") + + subtaskBin := buildSubtask(t) + cmd := exec.Command(subtaskBin, "list") + cmd.Dir = env.RootDir + out, err := cmd.CombinedOutput() + require.NoError(t, err, "list should succeed: %s", out) + require.Contains(t, string(out), "applied (+1 -0)") + + // Ensure the opened event was backfilled with base_commit. + evs, err := history.Read(taskName, history.ReadOptions{EventsOnly: true}) + require.NoError(t, err) + + var opened history.Event + for i := len(evs) - 1; i >= 0; i-- { + if evs[i].Type == "task.opened" { + opened = evs[i] + break + } + } + require.Equal(t, "task.opened", opened.Type) + + var d struct { + BaseCommit string `json:"base_commit"` + } + require.NoError(t, json.Unmarshal(opened.Data, &d)) + require.NotEmpty(t, strings.TrimSpace(d.BaseCommit)) +} diff --git a/pkg/e2e/install_cli_test.go b/pkg/e2e/install_cli_test.go index 7bcad06..bab35bd 100644 --- a/pkg/e2e/install_cli_test.go +++ b/pkg/e2e/install_cli_test.go @@ -6,22 +6,27 @@ import ( "os/exec" "path/filepath" "runtime" + "strings" "testing" "github.com/stretchr/testify/require" "github.com/zippoxer/subtask/pkg/install" + "github.com/zippoxer/subtask/pkg/workspace" ) -func TestInstall_UserScope_CreatesSkillPluginAndObjectSettings(t *testing.T) { +func TestInstall_UserScope_InstallsSkill_AndIsIdempotent(t *testing.T) { bin := buildSubtask(t) home := t.TempDir() + t.Setenv("HOME", home) + t.Setenv("USERPROFILE", home) // windows + 
t.Setenv("SUBTASK_DIR", filepath.Join(home, ".subtask")) + addStubCommandToPATH(t, "codex") cwd := t.TempDir() out := runSubtask(t, bin, cwd, home, "install", "--no-prompt") require.Contains(t, out, "Installed skill") - require.Contains(t, out, "Installed plugin") // Skill path. skillPath := filepath.Join(home, ".claude", "skills", "subtask", "SKILL.md") @@ -29,49 +34,67 @@ func TestInstall_UserScope_CreatesSkillPluginAndObjectSettings(t *testing.T) { require.NoError(t, err) require.Equal(t, install.Embedded(), gotSkill) - // Plugin files + exec bit. - pluginDir := filepath.Join(home, ".claude", "plugins", "subtask") - require.FileExists(t, filepath.Join(pluginDir, ".claude-plugin", "plugin.json")) - require.FileExists(t, filepath.Join(pluginDir, "hooks", "hooks.json")) - scriptPath := filepath.Join(pluginDir, "scripts", "skill-reminder.sh") - info, err := os.Stat(scriptPath) - require.NoError(t, err) - if runtime.GOOS != "windows" { - require.NotZero(t, info.Mode().Perm()&0o111, "should be executable on Unix") - } - - // Settings.json: enabledPlugins must be object. - settingsPath := filepath.Join(home, ".claude", "settings.json") - var settings map[string]any - require.NoError(t, readJSON(settingsPath, &settings)) - enabled, ok := settings["enabledPlugins"].(map[string]any) - require.True(t, ok, "enabledPlugins should be an object") - require.Equal(t, true, enabled["subtask"]) - // Idempotent: second install shouldn't break settings or content. 
out2 := runSubtask(t, bin, cwd, home, "install", "--no-prompt") require.Contains(t, out2, "Skill already up to date") - require.Contains(t, out2, "Plugin already up to date") - require.NoError(t, readJSON(settingsPath, &settings)) - enabled, ok = settings["enabledPlugins"].(map[string]any) - require.True(t, ok, "enabledPlugins should be an object") - require.Equal(t, true, enabled["subtask"]) } -func TestInstall_Settings_ObjectFormatPreserved(t *testing.T) { +func TestInstall_Migration_NoLegacyArtifacts_NoWritesToSettings(t *testing.T) { + bin := buildSubtask(t) + home := t.TempDir() + t.Setenv("HOME", home) + t.Setenv("USERPROFILE", home) // windows + t.Setenv("SUBTASK_DIR", filepath.Join(home, ".subtask")) + addStubCommandToPATH(t, "codex") + cwd := t.TempDir() + + out := runSubtask(t, bin, cwd, home, "install", "--no-prompt") + require.Contains(t, out, "Installed skill") + + // Migration must not create these. + _, err := os.Stat(filepath.Join(home, ".claude", "settings.json")) + require.ErrorIs(t, err, os.ErrNotExist) + _, err = os.Stat(filepath.Join(home, ".claude", "plugins", "subtask")) + require.ErrorIs(t, err, os.ErrNotExist) +} + +func TestInstall_Migration_RemovesLegacyPluginDir(t *testing.T) { + bin := buildSubtask(t) + home := t.TempDir() + t.Setenv("HOME", home) + t.Setenv("USERPROFILE", home) // windows + t.Setenv("SUBTASK_DIR", filepath.Join(home, ".subtask")) + addStubCommandToPATH(t, "codex") + cwd := t.TempDir() + + legacyDir := filepath.Join(home, ".claude", "plugins", "subtask") + require.NoError(t, os.MkdirAll(legacyDir, 0o755)) + require.NoError(t, os.WriteFile(filepath.Join(legacyDir, "sentinel"), []byte("x"), 0o644)) + + _ = runSubtask(t, bin, cwd, home, "install", "--no-prompt") + + _, err := os.Stat(legacyDir) + require.ErrorIs(t, err, os.ErrNotExist) +} + +func TestInstall_Migration_RemovesLegacySettingsKeyOnly(t *testing.T) { bin := buildSubtask(t) home := t.TempDir() + t.Setenv("HOME", home) + t.Setenv("USERPROFILE", home) // windows + 
subtaskDir := filepath.Join(home, ".subtask") + t.Setenv("SUBTASK_DIR", subtaskDir) + addStubCommandToPATH(t, "codex") cwd := t.TempDir() + require.NoError(t, os.MkdirAll(subtaskDir, 0o755)) + require.NoError(t, os.WriteFile(filepath.Join(subtaskDir, "config.json"), []byte(`{"harness":"codex","max_workspaces":1}`+"\n"), 0o644)) + settingsPath := filepath.Join(home, ".claude", "settings.json") require.NoError(t, os.MkdirAll(filepath.Dir(settingsPath), 0o755)) - require.NoError(t, os.WriteFile(settingsPath, []byte(`{ - "enabledPlugins": { "other": true }, - "keep": { "nested": 123 } -} -`), 0o644)) + require.NoError(t, os.WriteFile(settingsPath, []byte(`{"enabledPlugins":{"subtask":true,"other":true},"keep":123}`+"\n"), 0o644)) - _ = runSubtask(t, bin, cwd, home, "install", "--no-prompt", "--plugin") + _ = runSubtask(t, bin, cwd, home, "install", "--no-prompt") var settings map[string]any require.NoError(t, readJSON(settingsPath, &settings)) @@ -79,101 +102,366 @@ func TestInstall_Settings_ObjectFormatPreserved(t *testing.T) { enabled, ok := settings["enabledPlugins"].(map[string]any) require.True(t, ok, "enabledPlugins should remain an object") require.Equal(t, true, enabled["other"]) - require.Equal(t, true, enabled["subtask"]) - - keep, ok := settings["keep"].(map[string]any) - require.True(t, ok) - require.Equal(t, float64(123), keep["nested"]) + require.Nil(t, enabled["subtask"]) + require.Equal(t, float64(123), settings["keep"]) } -func TestInstall_Settings_ArrayFormatConvertedToObject(t *testing.T) { +func TestInstall_Migration_DoesNotRemoveMarketplaceKey(t *testing.T) { bin := buildSubtask(t) home := t.TempDir() + t.Setenv("HOME", home) + t.Setenv("USERPROFILE", home) // windows + subtaskDir := filepath.Join(home, ".subtask") + t.Setenv("SUBTASK_DIR", subtaskDir) + addStubCommandToPATH(t, "codex") cwd := t.TempDir() + require.NoError(t, os.MkdirAll(subtaskDir, 0o755)) + require.NoError(t, os.WriteFile(filepath.Join(subtaskDir, "config.json"), 
[]byte(`{"harness":"codex","max_workspaces":1}`+"\n"), 0o644)) + settingsPath := filepath.Join(home, ".claude", "settings.json") require.NoError(t, os.MkdirAll(filepath.Dir(settingsPath), 0o755)) - require.NoError(t, os.WriteFile(settingsPath, []byte(`{"enabledPlugins":["other"]}`+"\n"), 0o644)) + require.NoError(t, os.WriteFile(settingsPath, []byte(`{"enabledPlugins":{"subtask@subtask":true}}`+"\n"), 0o644)) - _ = runSubtask(t, bin, cwd, home, "install", "--no-prompt", "--plugin") + _ = runSubtask(t, bin, cwd, home, "install", "--no-prompt") var settings map[string]any require.NoError(t, readJSON(settingsPath, &settings)) enabled, ok := settings["enabledPlugins"].(map[string]any) - require.True(t, ok, "enabledPlugins should be converted to an object") - require.Equal(t, true, enabled["other"]) - require.Equal(t, true, enabled["subtask"]) + require.True(t, ok) + require.Equal(t, true, enabled["subtask@subtask"]) } -func TestInstall_Settings_MalformedJSON_BackupsAndCreatesFreshObject(t *testing.T) { +func TestInstall_Migration_PreservesComplexSettings(t *testing.T) { bin := buildSubtask(t) home := t.TempDir() + t.Setenv("HOME", home) + t.Setenv("USERPROFILE", home) // windows + subtaskDir := filepath.Join(home, ".subtask") + t.Setenv("SUBTASK_DIR", subtaskDir) + addStubCommandToPATH(t, "codex") cwd := t.TempDir() + const settingsJSON = `{ + "$schema": "https://json.schemastore.org/claude-code-settings.json", + "alwaysThinkingEnabled": true, + "enabledPlugins": { + "rust-analyzer-lsp@claude-plugins-official": true, + "gopls-lsp@claude-plugins-official": true, + "dev-browser@dev-browser-marketplace": true, + "subtask": true + }, + "env": { + "BASH_MAX_TIMEOUT_MS": "7200000" + }, + "hooks": { + "SessionStart": [ + { + "hooks": [ + { + "command": "echo 'hello'", + "type": "command" + } + ], + "matcher": "compact" + } + ] + }, + "statusLine": { + "command": "~/.claude/statusline.sh", + "type": "command" + } +} +` + settingsPath := filepath.Join(home, ".claude", 
"settings.json") require.NoError(t, os.MkdirAll(filepath.Dir(settingsPath), 0o755)) - require.NoError(t, os.WriteFile(settingsPath, []byte("{not json"), 0o644)) + require.NoError(t, os.WriteFile(settingsPath, []byte(settingsJSON), 0o644)) - out := runSubtask(t, bin, cwd, home, "install", "--no-prompt", "--plugin") - require.Contains(t, out, "Rewrote malformed settings.json") + require.NoError(t, os.MkdirAll(subtaskDir, 0o755)) + require.NoError(t, os.WriteFile(filepath.Join(subtaskDir, "config.json"), []byte(`{"harness":"codex","max_workspaces":1}`+"\n"), 0o644)) - // Backup should exist (exact suffix may include timestamp). - matches, err := filepath.Glob(settingsPath + ".bak*") - require.NoError(t, err) - require.NotEmpty(t, matches) + _ = runSubtask(t, bin, cwd, home, "install", "--no-prompt") - var settings map[string]any - require.NoError(t, readJSON(settingsPath, &settings)) - enabled, ok := settings["enabledPlugins"].(map[string]any) - require.True(t, ok, "enabledPlugins should be an object") - require.Equal(t, true, enabled["subtask"]) + var got map[string]any + require.NoError(t, readJSON(settingsPath, &got)) + + enabled, ok := got["enabledPlugins"].(map[string]any) + require.True(t, ok) + require.Nil(t, enabled["subtask"]) + + var expected map[string]any + require.NoError(t, json.Unmarshal([]byte(settingsJSON), &expected)) + expectedEnabled, ok := expected["enabledPlugins"].(map[string]any) + require.True(t, ok) + delete(expectedEnabled, "subtask") + expected["enabledPlugins"] = expectedEnabled + + require.Equal(t, expected, got) } -func TestUninstall_RemovesPluginFromEnabledPlugins(t *testing.T) { +func TestInstall_Migration_RunOnce_SkipsOnSecondInstall(t *testing.T) { bin := buildSubtask(t) home := t.TempDir() + t.Setenv("HOME", home) + t.Setenv("USERPROFILE", home) // windows + subtaskDir := filepath.Join(home, ".subtask") + t.Setenv("SUBTASK_DIR", subtaskDir) + addStubCommandToPATH(t, "codex") cwd := t.TempDir() - _ = runSubtask(t, bin, cwd, home, 
"install", "--no-prompt") + require.NoError(t, os.MkdirAll(subtaskDir, 0o755)) + require.NoError(t, os.WriteFile(filepath.Join(subtaskDir, "config.json"), []byte(`{"harness":"codex","max_workspaces":1}`+"\n"), 0o644)) settingsPath := filepath.Join(home, ".claude", "settings.json") + require.NoError(t, os.MkdirAll(filepath.Dir(settingsPath), 0o755)) + require.NoError(t, os.WriteFile(settingsPath, []byte(`{"enabledPlugins":{"subtask":true,"other":true}}`+"\n"), 0o644)) + + _ = runSubtask(t, bin, cwd, home, "install", "--no-prompt") + var settings map[string]any require.NoError(t, readJSON(settingsPath, &settings)) + enabled, ok := settings["enabledPlugins"].(map[string]any) + require.True(t, ok) + require.Nil(t, enabled["subtask"]) + require.Equal(t, true, enabled["other"]) + + markerPath := filepath.Join(subtaskDir, "migrations", "legacy-claude-plugin-v1.done") + require.FileExists(t, markerPath) - enabled := settings["enabledPlugins"].(map[string]any) - enabled["other"] = true + // Reintroduce the legacy key; second install should not run migration again. 
+ enabled["subtask"] = true settings["enabledPlugins"] = enabled b, err := json.MarshalIndent(settings, "", " ") require.NoError(t, err) require.NoError(t, os.WriteFile(settingsPath, append(b, '\n'), 0o644)) - _ = runSubtask(t, bin, cwd, home, "uninstall", "--plugin") + _ = runSubtask(t, bin, cwd, home, "install", "--no-prompt") + + require.NoError(t, readJSON(settingsPath, &settings)) + enabled, ok = settings["enabledPlugins"].(map[string]any) + require.True(t, ok) + require.Equal(t, true, enabled["subtask"]) + require.Equal(t, true, enabled["other"]) +} + +func TestInstall_Migration_BothDirAndSettings(t *testing.T) { + bin := buildSubtask(t) + home := t.TempDir() + t.Setenv("HOME", home) + t.Setenv("USERPROFILE", home) // windows + t.Setenv("SUBTASK_DIR", filepath.Join(home, ".subtask")) + addStubCommandToPATH(t, "codex") + cwd := t.TempDir() + + legacyDir := filepath.Join(home, ".claude", "plugins", "subtask") + require.NoError(t, os.MkdirAll(legacyDir, 0o755)) + require.NoError(t, os.WriteFile(filepath.Join(legacyDir, "sentinel"), []byte("x"), 0o644)) + + settingsPath := filepath.Join(home, ".claude", "settings.json") + require.NoError(t, os.MkdirAll(filepath.Dir(settingsPath), 0o755)) + require.NoError(t, os.WriteFile(settingsPath, []byte(`{"enabledPlugins":{"subtask":true,"other":true}}`+"\n"), 0o644)) + + _ = runSubtask(t, bin, cwd, home, "install", "--no-prompt") + + _, err := os.Stat(legacyDir) + require.ErrorIs(t, err, os.ErrNotExist) + var settings map[string]any require.NoError(t, readJSON(settingsPath, &settings)) enabled, ok := settings["enabledPlugins"].(map[string]any) - require.True(t, ok, "enabledPlugins should be an object") + require.True(t, ok) require.Nil(t, enabled["subtask"]) require.Equal(t, true, enabled["other"]) } -func TestInstall_ProjectScope_UsesRepoRoot(t *testing.T) { +func TestInstall_Migration_MalformedSettingsJSON_SkipsAndWarns(t *testing.T) { + bin := buildSubtask(t) + home := t.TempDir() + t.Setenv("HOME", home) + 
t.Setenv("USERPROFILE", home) // windows + t.Setenv("SUBTASK_DIR", filepath.Join(home, ".subtask")) + addStubCommandToPATH(t, "codex") + cwd := t.TempDir() + + legacyDir := filepath.Join(home, ".claude", "plugins", "subtask") + require.NoError(t, os.MkdirAll(legacyDir, 0o755)) + require.NoError(t, os.WriteFile(filepath.Join(legacyDir, "sentinel"), []byte("x"), 0o644)) + + settingsPath := filepath.Join(home, ".claude", "settings.json") + require.NoError(t, os.MkdirAll(filepath.Dir(settingsPath), 0o755)) + require.NoError(t, os.WriteFile(settingsPath, []byte("{not json"), 0o644)) + + out := runSubtask(t, bin, cwd, home, "install", "--no-prompt") + require.Contains(t, out, "Skipped legacy settings cleanup") + + // Plugin dir removed even if settings.json was malformed. + _, err := os.Stat(legacyDir) + require.ErrorIs(t, err, os.ErrNotExist) + + // settings.json is untouched. + data, err := os.ReadFile(settingsPath) + require.NoError(t, err) + require.Equal(t, "{not json", string(data)) +} + +func TestInstall_Guide_DoesNotWriteAnything(t *testing.T) { bin := buildSubtask(t) + home := t.TempDir() + t.Setenv("HOME", home) + t.Setenv("USERPROFILE", home) // windows + t.Setenv("SUBTASK_DIR", filepath.Join(home, ".subtask")) + cwd := t.TempDir() + + out := runSubtask(t, bin, cwd, home, "install", "--guide") + require.Contains(t, out, "# Setup Subtask") + require.Contains(t, out, "Not in a git repository") + + // Debug + entries, _ := os.ReadDir(home) + t.Logf("home dir contents: %v", entries) + if st := filepath.Join(home, ".subtask"); fileExists(st) { + sub, _ := os.ReadDir(st) + t.Logf(".subtask contents: %v", sub) + } + t.Logf("SUBTASK_DEBUG in test env: %s", os.Getenv("SUBTASK_DEBUG")) + + _, err := os.Stat(filepath.Join(home, ".claude")) + require.ErrorIs(t, err, os.ErrNotExist) + _, err = os.Stat(filepath.Join(home, ".subtask")) + require.ErrorIs(t, err, os.ErrNotExist) +} + +func fileExists(path string) bool { + _, err := os.Stat(path) + return err == nil +} +func 
TestInstall_Guide_InGitRepo_MultipleHarnesses_ShowsHarnessChoice(t *testing.T) { + bin := buildSubtask(t) home := t.TempDir() + t.Setenv("HOME", home) + t.Setenv("USERPROFILE", home) // windows + t.Setenv("SUBTASK_DIR", filepath.Join(home, ".subtask")) + + addStubCommandToPATH(t, "codex") + addStubCommandToPATH(t, "claude") + + repo := t.TempDir() + initGitRepo(t, repo) + + out := runSubtask(t, bin, repo, home, "install", "--guide") + require.Contains(t, out, "In a git repository") + require.Contains(t, out, "Ask the user which harness") + require.Contains(t, out, "subtask install --no-prompt --harness ") + + _, err := os.Stat(filepath.Join(home, ".subtask")) + require.ErrorIs(t, err, os.ErrNotExist) + _, err = os.Stat(filepath.Join(home, ".claude")) + require.ErrorIs(t, err, os.ErrNotExist) +} + +func TestInstall_NoPrompt_Flags_WriteConfig(t *testing.T) { + bin := buildSubtask(t) + + home := t.TempDir() + t.Setenv("HOME", home) + t.Setenv("USERPROFILE", home) // windows + t.Setenv("SUBTASK_DIR", filepath.Join(home, ".subtask")) + addStubCommandToPATH(t, "claude") + + cwd := t.TempDir() + out := runSubtask(t, bin, cwd, home, "install", "--no-prompt", "--harness", "claude", "--model", "claude-sonnet-4-20250514", "--max-workspaces", "7") + require.Contains(t, out, "Configured subtask") + + var cfg workspace.Config + require.NoError(t, readJSON(filepath.Join(home, ".subtask", "config.json"), &cfg)) + require.Equal(t, "claude", cfg.Harness) + require.Equal(t, 7, cfg.MaxWorkspaces) + require.NotNil(t, cfg.Options) + require.Equal(t, "claude-sonnet-4-20250514", cfg.Options["model"]) + _, hasReasoning := cfg.Options["reasoning"] + require.False(t, hasReasoning) +} + +func TestInstall_NoPrompt_ReasoningRequiresCodex(t *testing.T) { + bin := buildSubtask(t) + + home := t.TempDir() + t.Setenv("HOME", home) + t.Setenv("USERPROFILE", home) // windows + t.Setenv("SUBTASK_DIR", filepath.Join(home, ".subtask")) + addStubCommandToPATH(t, "claude") + + cwd := t.TempDir() + out, err 
:= runSubtaskWithHomeEnv(t, bin, cwd, home, "install", "--no-prompt", "--harness", "claude", "--reasoning", "high") + require.Error(t, err) + require.Contains(t, out, "reasoning is codex-only") +} + +func TestInstall_NoPrompt_InvalidHarnessRejected(t *testing.T) { + bin := buildSubtask(t) + + home := t.TempDir() + t.Setenv("HOME", home) + t.Setenv("USERPROFILE", home) // windows + t.Setenv("SUBTASK_DIR", filepath.Join(home, ".subtask")) + addStubCommandToPATH(t, "codex") + + cwd := t.TempDir() + out, err := runSubtaskWithHomeEnv(t, bin, cwd, home, "install", "--no-prompt", "--harness", "nope") + require.Error(t, err) + require.Contains(t, out, "invalid harness") +} + +func TestInstall_ProjectScope_InstallsSkillToRepoOnly(t *testing.T) { + bin := buildSubtask(t) + + home := t.TempDir() + t.Setenv("HOME", home) + t.Setenv("USERPROFILE", home) // windows + t.Setenv("SUBTASK_DIR", filepath.Join(home, ".subtask")) + addStubCommandToPATH(t, "codex") + repo := t.TempDir() initGitRepo(t, repo) out := runSubtask(t, bin, repo, home, "install", "--no-prompt", "--scope", "project") require.Contains(t, out, "Installed skill") - require.Contains(t, out, "Installed plugin") - require.FileExists(t, filepath.Join(repo, ".claude", "skills", "subtask", "SKILL.md")) - require.FileExists(t, filepath.Join(repo, ".claude", "plugins", "subtask", ".claude-plugin", "plugin.json")) + // Skill path should be project-scoped. + projectSkillPath := filepath.Join(repo, ".claude", "skills", "subtask", "SKILL.md") + gotSkill, err := os.ReadFile(projectSkillPath) + require.NoError(t, err) + require.Equal(t, install.Embedded(), gotSkill) + + // User-scope path should not be touched. 
+ _, err = os.Stat(filepath.Join(home, ".claude", "skills", "subtask", "SKILL.md")) + require.ErrorIs(t, err, os.ErrNotExist) +} + +func TestInstall_ProjectScope_RequiresGitRepo(t *testing.T) { + bin := buildSubtask(t) + + home := t.TempDir() + t.Setenv("HOME", home) + t.Setenv("USERPROFILE", home) // windows + t.Setenv("SUBTASK_DIR", filepath.Join(home, ".subtask")) + addStubCommandToPATH(t, "codex") + + cwd := t.TempDir() + out, err := runSubtaskWithHomeEnv(t, bin, cwd, home, "install", "--no-prompt", "--scope", "project") + require.Error(t, err) + require.Contains(t, out, "--scope=project requires being in a git repository") } func TestAutoUpdate_RepairsDriftOnlyWhenInstalled(t *testing.T) { bin := buildSubtask(t) home := t.TempDir() + t.Setenv("HOME", home) + t.Setenv("USERPROFILE", home) // windows + t.Setenv("SUBTASK_DIR", filepath.Join(home, ".subtask")) + addStubCommandToPATH(t, "codex") cwd := t.TempDir() // Not installed: running any command should not create files. @@ -185,12 +473,9 @@ func TestAutoUpdate_RepairsDriftOnlyWhenInstalled(t *testing.T) { _ = runSubtask(t, bin, cwd, home, "install", "--no-prompt") skillPath := filepath.Join(home, ".claude", "skills", "subtask", "SKILL.md") require.NoError(t, os.WriteFile(skillPath, []byte("different"), 0o644)) - pluginHookPath := filepath.Join(home, ".claude", "plugins", "subtask", "hooks", "hooks.json") - require.NoError(t, os.WriteFile(pluginHookPath, []byte(`{}`), 0o644)) out := runSubtask(t, bin, cwd, home, "status") require.Contains(t, out, "Updated skill to latest version") - require.Contains(t, out, "Updated plugin to latest version") gotSkill, err := os.ReadFile(skillPath) require.NoError(t, err) @@ -198,6 +483,13 @@ func TestAutoUpdate_RepairsDriftOnlyWhenInstalled(t *testing.T) { } func runSubtask(t *testing.T, bin string, dir string, home string, args ...string) string { + t.Helper() + out, err := runSubtaskWithHomeEnv(t, bin, dir, home, args...) 
+ require.NoError(t, err, "%s", out) + return out +} + +func runSubtaskWithHomeEnv(t *testing.T, bin string, dir string, home string, args ...string) (string, error) { t.Helper() cmd := exec.Command(bin, args...) cmd.Dir = dir @@ -209,6 +501,11 @@ func runSubtask(t *testing.T, bin string, dir string, home string, args ...strin if len(kv) >= 12 && kv[:12] == "USERPROFILE=" { continue } + // Filter out debug env var so tests run with predictable logging behavior. + if strings.HasPrefix(kv, "SUBTASK_DEBUG=") { + t.Logf("filtering out SUBTASK_DEBUG: %q", kv) + continue + } env = append(env, kv) } env = append(env, @@ -217,8 +514,7 @@ func runSubtask(t *testing.T, bin string, dir string, home string, args ...strin ) cmd.Env = env out, err := cmd.CombinedOutput() - require.NoError(t, err, "%s", out) - return string(out) + return string(out), err } func readJSON(path string, v any) error { diff --git a/pkg/e2e/integration_test.go b/pkg/e2e/integration_test.go deleted file mode 100644 index 3094c33..0000000 --- a/pkg/e2e/integration_test.go +++ /dev/null @@ -1,156 +0,0 @@ -package e2e - -import ( - "os" - "os/exec" - "path/filepath" - "testing" - "time" - - "github.com/stretchr/testify/assert" - - "github.com/zippoxer/subtask/pkg/git" - "github.com/zippoxer/subtask/pkg/task" - "github.com/zippoxer/subtask/pkg/testutil" -) - -// gitCmd runs a git command in the given directory. -func gitCmd(t *testing.T, dir string, args ...string) string { - t.Helper() - cmd := exec.Command("git", args...) - cmd.Dir = dir - out, err := cmd.CombinedOutput() - if err != nil { - t.Fatalf("git %v failed: %v\n%s", args, err, out) - } - return string(out) -} - -// TestIntegrationDetection_ManualMerge verifies that a task closed after -// manual merge (git merge) shows as merged. 
-func TestIntegrationDetection_ManualMerge(t *testing.T) {
-	env := testutil.NewTestEnv(t, 1)
-
-	// Create a task
-	env.CreateTask("test/manual", "Manual merge test", "main", "Test manual merge detection")
-
-	// Set up state with workspace
-	state := &task.State{
-		Workspace:     env.Workspaces[0],
-		SupervisorPID: os.Getpid(),
-		StartedAt:     time.Now(),
-	}
-	env.CreateTaskState("test/manual", state)
-
-	// Create the task branch in workspace and make changes
-	gitCmd(t, env.Workspaces[0], "checkout", "-b", "test/manual")
-	featureFile := filepath.Join(env.Workspaces[0], "feature.txt")
-	os.WriteFile(featureFile, []byte("feature content"), 0644)
-	gitCmd(t, env.Workspaces[0], "add", "feature.txt")
-	gitCmd(t, env.Workspaces[0], "commit", "-m", "Add feature")
-
-	// Manually merge task branch into main (in root repo)
-	gitCmd(t, env.RootDir, "merge", "test/manual", "-m", "Merge test/manual")
-
-	// Verify integration detection finds the branch as merged
-	target := git.EffectiveTarget(env.Workspaces[0], "main")
-	reason := git.IsIntegrated(env.Workspaces[0], "test/manual", target)
-	assert.NotEmpty(t, reason, "manual merge should be detected as integrated")
-}
-
-// TestIntegrationDetection_SquashMerge verifies that a task closed after
-// squash merge (different history) shows as merged.
-func TestIntegrationDetection_SquashMerge(t *testing.T) {
-	env := testutil.NewTestEnv(t, 1)
-
-	// Create a task
-	env.CreateTask("test/squash", "Squash merge test", "main", "Test squash merge detection")
-
-	// Set up state with workspace
-	state := &task.State{
-		Workspace:     env.Workspaces[0],
-		SupervisorPID: os.Getpid(),
-		StartedAt:     time.Now(),
-	}
-	env.CreateTaskState("test/squash", state)
-
-	// Create the task branch in workspace and make changes
-	gitCmd(t, env.Workspaces[0], "checkout", "-b", "test/squash")
-	featureFile := filepath.Join(env.Workspaces[0], "feature.txt")
-	os.WriteFile(featureFile, []byte("feature content"), 0644)
-	gitCmd(t, env.Workspaces[0], "add", "feature.txt")
-	gitCmd(t, env.Workspaces[0], "commit", "-m", "Add feature")
-
-	// Squash merge task branch into main (creates different history)
-	gitCmd(t, env.RootDir, "merge", "--squash", "test/squash")
-	gitCmd(t, env.RootDir, "commit", "-m", "Squash merge test/squash")
-
-	// Verify integration detection finds the branch as merged
-	// Should be TreesMatch or MergeAddsNothing
-	target := git.EffectiveTarget(env.Workspaces[0], "main")
-	reason := git.IsIntegrated(env.Workspaces[0], "test/squash", target)
-	assert.NotEmpty(t, reason, "squash merge should be detected as integrated")
-}
-
-// TestIntegrationDetection_NotMerged verifies that a task closed without
-// merging (abandoned) does NOT show as merged.
-func TestIntegrationDetection_NotMerged(t *testing.T) {
-	env := testutil.NewTestEnv(t, 1)
-
-	// Create a task
-	env.CreateTask("test/abandon", "Abandoned test", "main", "Test abandoned detection")
-
-	// Set up state with workspace
-	state := &task.State{
-		Workspace:     env.Workspaces[0],
-		SupervisorPID: os.Getpid(),
-		StartedAt:     time.Now(),
-	}
-	env.CreateTaskState("test/abandon", state)
-
-	// Create the task branch in workspace and make changes
-	gitCmd(t, env.Workspaces[0], "checkout", "-b", "test/abandon")
-	featureFile := filepath.Join(env.Workspaces[0], "feature.txt")
-	os.WriteFile(featureFile, []byte("feature content"), 0644)
-	gitCmd(t, env.Workspaces[0], "add", "feature.txt")
-	gitCmd(t, env.Workspaces[0], "commit", "-m", "Add feature")
-
-	// Verify integration detection does NOT find the branch as merged
-	target := git.EffectiveTarget(env.Workspaces[0], "main")
-	reason := git.IsIntegrated(env.Workspaces[0], "test/abandon", target)
-	assert.Empty(t, reason, "abandoned task should NOT be detected as integrated")
-}
-
-// TestIntegrationDetection_CherryPick verifies that a task whose changes
-// were cherry-picked (different commits, same content) shows as merged.
-func TestIntegrationDetection_CherryPick(t *testing.T) { - env := testutil.NewTestEnv(t, 1) - - // Create a task - env.CreateTask("test/cherry", "Cherry-pick test", "main", "Test cherry-pick detection") - - // Set up state with workspace - state := &task.State{ - Workspace: env.Workspaces[0], - SupervisorPID: os.Getpid(), - StartedAt: time.Now(), - } - env.CreateTaskState("test/cherry", state) - - // Create the task branch in workspace and make changes - gitCmd(t, env.Workspaces[0], "checkout", "-b", "test/cherry") - featureFile := filepath.Join(env.Workspaces[0], "feature.txt") - os.WriteFile(featureFile, []byte("feature content"), 0644) - gitCmd(t, env.Workspaces[0], "add", "feature.txt") - gitCmd(t, env.Workspaces[0], "commit", "-m", "Add feature") - - // Cherry-pick the changes to main (different commit SHA, same content) - // First, get the commit SHA - gitCmd(t, env.RootDir, "cherry-pick", "test/cherry") - - // Verify integration detection finds the branch as merged - // Could be SameCommit (fast-forward) or TreesMatch depending on scenario - target := git.EffectiveTarget(env.Workspaces[0], "main") - reason := git.IsIntegrated(env.Workspaces[0], "test/cherry", target) - assert.NotEmpty(t, reason, "cherry-picked changes should be detected as integrated") -} diff --git a/pkg/e2e/interrupt_cli_test.go b/pkg/e2e/interrupt_cli_test.go index 5a784d7..800aa0d 100644 --- a/pkg/e2e/interrupt_cli_test.go +++ b/pkg/e2e/interrupt_cli_test.go @@ -20,6 +20,7 @@ func TestInterruptCLI_StopsRunningSend(t *testing.T) { if testing.Short() { t.Skip("skipping interrupt CLI test in short mode") } + t.Setenv("SUBTASK_DIR", t.TempDir()) if runtime.GOOS == "windows" { t.Skip("skipping interrupt CLI test on Windows") } @@ -38,29 +39,44 @@ func TestInterruptCLI_StopsRunningSend(t *testing.T) { require.NoError(t, err, "draft failed: %s", out) // Start send in background. 
-	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
+	ctx, cancel := context.WithTimeout(context.Background(), 45*time.Second)
 	t.Cleanup(cancel)
 
 	longPrompt := mockPrompt("Do something long-running") + "\n/MockRunCommand sleep 30"
 	sendCmd := exec.CommandContext(ctx, binPath, "send", taskName, longPrompt)
 	sendCmd.Dir = root
+	sendOutPath := filepath.Join(t.TempDir(), "send.out")
+	sendOutFile, err := os.Create(sendOutPath)
+	require.NoError(t, err)
+	t.Cleanup(func() { _ = sendOutFile.Close() })
+	sendCmd.Stdout = sendOutFile
+	sendCmd.Stderr = sendOutFile
 	require.NoError(t, sendCmd.Start())
 
-	// Wait until state shows the task is running.
-	statePath := filepath.Join(root, ".subtask", "internal", task.EscapeName(taskName), "state.json")
-	var runningState task.State
-	require.NoError(t, waitForState(t, statePath, func(s task.State) bool {
-		runningState = s
-		return s.SupervisorPID != 0 && !s.StartedAt.IsZero()
-	}))
-	require.NotZero(t, runningState.SupervisorPGID)
+	escaped := task.EscapeName(taskName)
+	statePathCandidates := []string{
+		filepath.Join(task.ProjectsDir(), task.EscapePath(root), "internal", escaped, "state.json"),
+		filepath.Join(root, ".subtask", "internal", escaped, "state.json"),
+	}
 
 	// Interrupt the task.
-	interruptCmd := exec.Command(binPath, "interrupt", taskName)
-	interruptCmd.Dir = root
-	out, err = interruptCmd.CombinedOutput()
-	require.NoError(t, err, "interrupt failed: %s", out)
-	require.Contains(t, string(out), "Sent SIGINT")
+	deadline := time.Now().Add(10 * time.Second)
+	var interruptOut []byte
+	for time.Now().Before(deadline) {
+		interruptCmd := exec.Command(binPath, "interrupt", taskName)
+		interruptCmd.Dir = root
+		out, err := interruptCmd.CombinedOutput()
+		if err == nil && strings.Contains(string(out), "Sent SIGINT") {
+			interruptOut = out
+			break
+		}
+		if strings.Contains(string(out), "is not working") {
+			time.Sleep(50 * time.Millisecond)
+			continue
+		}
+		t.Fatalf("interrupt failed unexpectedly: %v\n%s\n\nsend output:\n%s\n\nstate debug:\n%s", err, out, mustReadFile(t, sendOutPath), debugStateFiles(t, taskName, root, statePathCandidates))
+	}
+	require.NotEmpty(t, interruptOut, "interrupt never succeeded\n\nsend output:\n%s\n\nstate debug:\n%s", mustReadFile(t, sendOutPath), debugStateFiles(t, taskName, root, statePathCandidates))
 
 	// Send should exit with an error code due to signal handler os.Exit(1).
 	done := make(chan error, 1)
@@ -74,10 +90,10 @@ func TestInterruptCLI_StopsRunningSend(t *testing.T) {
 
 	// State should be cleared and contain an interruption error.
 	var cleared task.State
-	require.NoError(t, waitForState(t, statePath, func(s task.State) bool {
+	require.NoError(t, waitForAnyState(t, taskName, statePathCandidates, func(s task.State) bool {
 		cleared = s
 		return s.SupervisorPID == 0
-	}))
+	}), "send output:\n%s\n\nstate debug:\n%s", mustReadFile(t, sendOutPath), debugStateFiles(t, taskName, root, statePathCandidates))
 	require.Zero(t, cleared.SupervisorPGID)
 	require.Contains(t, strings.ToLower(cleared.LastError), "interrupted")
 
@@ -98,15 +114,22 @@ func TestInterruptCLI_StopsRunningSend(t *testing.T) {
 	}), "expected worker.finished error")
 }
 
-func waitForState(t *testing.T, statePath string, pred func(task.State) bool) error {
+func waitForAnyState(t *testing.T, taskName string, statePaths []string, pred func(task.State) bool) error {
 	t.Helper()
-	deadline := time.Now().Add(3 * time.Second)
+	deadline := time.Now().Add(30 * time.Second)
+	escaped := task.EscapeName(taskName)
 	for time.Now().Before(deadline) {
-		b, err := os.ReadFile(statePath)
-		if err == nil {
-			var s task.State
-			if json.Unmarshal(b, &s) == nil && pred(s) {
-				return nil
+		candidates := append([]string{}, statePaths...)
+		if matches, _ := filepath.Glob(filepath.Join(task.ProjectsDir(), "*", "internal", escaped, "state.json")); len(matches) > 0 {
+			candidates = append(candidates, matches...)
+		}
+		for _, statePath := range candidates {
+			b, err := os.ReadFile(statePath)
+			if err == nil {
+				var s task.State
+				if json.Unmarshal(b, &s) == nil && pred(s) {
+					return nil
+				}
 			}
 		}
 		time.Sleep(25 * time.Millisecond)
@@ -133,6 +156,61 @@ func readHistoryEvents(t *testing.T, historyPath string) []map[string]any {
 	return out
 }
 
+func mustReadFile(t *testing.T, path string) string {
+	t.Helper()
+	b, err := os.ReadFile(path)
+	require.NoError(t, err)
+	return string(b)
+}
+
+func debugStateFiles(t *testing.T, taskName, root string, candidates []string) string {
+	t.Helper()
+	var b strings.Builder
+
+	b.WriteString("candidates:\n")
+	for _, p := range candidates {
+		b.WriteString(" - ")
+		b.WriteString(p)
+		b.WriteString("\n")
+	}
+
+	escaped := task.EscapeName(taskName)
+	glob := filepath.Join(task.ProjectsDir(), "*", "internal", escaped, "state.json")
+	matches, _ := filepath.Glob(glob)
+	b.WriteString("glob:\n ")
+	b.WriteString(glob)
+	b.WriteString("\n")
+	for _, p := range matches {
+		b.WriteString(" - ")
+		b.WriteString(p)
+		b.WriteString("\n")
+	}
+
+	b.WriteString("repo-local:\n")
+	repoLocal := filepath.Join(root, ".subtask", "internal", escaped, "state.json")
+	b.WriteString(" - ")
+	b.WriteString(repoLocal)
+	b.WriteString("\n")
+
+	seen := map[string]bool{}
+	for _, p := range append(append([]string{}, candidates...), append(matches, repoLocal)...) {
+		if seen[p] {
+			continue
+		}
+		seen[p] = true
+		data, err := os.ReadFile(p)
+		if err != nil {
+			continue
+		}
+		b.WriteString("\n--- ")
+		b.WriteString(p)
+		b.WriteString(" ---\n")
+		b.Write(data)
+		b.WriteString("\n")
+	}
+	return b.String()
+}
+
 func hasHistoryEvent(events []map[string]any, typ string, pred func(data map[string]any) bool) bool {
 	for _, ev := range events {
 		if ev["type"] != typ {
diff --git a/pkg/e2e/merge_test.go b/pkg/e2e/merge_test.go
index bfab381..1f46a64 100644
--- a/pkg/e2e/merge_test.go
+++ b/pkg/e2e/merge_test.go
@@ -1,6 +1,7 @@
 package e2e
 
 import (
+	"encoding/json"
 	"os"
 	"os/exec"
 	"path/filepath"
@@ -86,6 +87,34 @@ func TestMergeCommand_NoOriginRemote(t *testing.T) {
 	assert.Equal(t, task.TaskStatusMerged, tail.TaskStatus)
 	assert.NotEmpty(t, tail.LastMergedCommit)
 
+	mergedEvents, err := history.Read(taskName, history.ReadOptions{EventsOnly: true})
+	require.NoError(t, err)
+	var mergedEv history.Event
+	for i := len(mergedEvents) - 1; i >= 0; i-- {
+		if mergedEvents[i].Type == "task.merged" {
+			mergedEv = mergedEvents[i]
+			break
+		}
+	}
+	require.Equal(t, "task.merged", mergedEv.Type)
+	var mergedData struct {
+		Via            string `json:"via"`
+		BaseCommit     string `json:"base_commit"`
+		BranchHead     string `json:"branch_head"`
+		ChangesAdded   int    `json:"changes_added"`
+		ChangesRemoved int    `json:"changes_removed"`
+		CommitCount    int    `json:"commit_count"`
+		FrozenError    string `json:"frozen_error"`
+	}
+	require.NoError(t, json.Unmarshal(mergedEv.Data, &mergedData))
+	assert.Equal(t, "subtask", mergedData.Via)
+	assert.NotEmpty(t, mergedData.BaseCommit)
+	assert.NotEmpty(t, mergedData.BranchHead)
+	assert.Equal(t, 2, mergedData.ChangesAdded)
+	assert.Equal(t, 0, mergedData.ChangesRemoved)
+	assert.Equal(t, 2, mergedData.CommitCount)
+	assert.Empty(t, strings.TrimSpace(mergedData.FrozenError))
+
 	finalState, err := task.LoadState(taskName)
 	require.NoError(t, err)
 	require.NotNil(t, finalState)
@@ -100,6 +129,53 @@ func TestMergeCommand_NoOriginRemote(t *testing.T) {
 	assert.Equal(t, "line 1\nline 2\n", normalized)
 }
 
+// TestMergeCommand_NoOpAlreadyInBase_DiffShowsTaskChanges verifies that when a task's content is already in the base branch
+// (e.g. via squash merge / cherry-pick), `subtask merge` finalizes the task without creating a new commit,
+// and `subtask diff` still shows the task's original contribution rather than an arbitrary base tip commit.
+func TestMergeCommand_NoOpAlreadyInBase_DiffShowsTaskChanges(t *testing.T) {
+	env := testutil.NewTestEnv(t, 1)
+
+	taskName := "test/noop-merge"
+	env.CreateTask(taskName, "Test no-op merge diff", "main", "Test no-op merge diff")
+	env.CreateTaskHistory(taskName, []history.Event{{Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}})
+
+	// Create task state (simulating a task that has been run)
+	state := &task.State{Workspace: env.Workspaces[0]}
+	env.CreateTaskState(taskName, state)
+
+	// Create workspace with task branch and a commit.
+	ws := env.Workspaces[0]
+	gitCmd(t, ws, "checkout", "-b", taskName)
+	appliedFile := filepath.Join(ws, "applied.txt")
+	require.NoError(t, os.WriteFile(appliedFile, []byte("hello\n"), 0o644))
+	gitCmd(t, ws, "add", "applied.txt")
+	gitCmd(t, ws, "commit", "-m", "Add applied file")
+
+	// Simulate an external squash merge into main (creates a different commit on main).
+	gitCmd(t, env.RootDir, "checkout", "main")
+	gitCmd(t, env.RootDir, "merge", "--squash", taskName)
+	gitCmd(t, env.RootDir, "commit", "-m", "Squash merge applied")
+
+	subtaskBin := buildSubtask(t)
+
+	// Finalize via `subtask merge` (should take the "already in base" no-op path and delete the branch).
+	cmd := exec.Command(subtaskBin, "merge", taskName, "-m", "Finalize no-op merge")
+	cmd.Dir = env.RootDir
+	out, err := cmd.CombinedOutput()
+	require.NoError(t, err, "merge should succeed: %s", out)
+
+	branches := gitCmd(t, env.RootDir, "branch", "--list", taskName)
+	assert.Equal(t, "", strings.TrimSpace(branches), "task branch should be deleted")
+
+	// `subtask diff` should show the task's original change, not an unrelated base tip commit.
+	cmd = exec.Command(subtaskBin, "diff", taskName)
+	cmd.Dir = env.RootDir
+	diffOut, err := cmd.CombinedOutput()
+	require.NoError(t, err, "diff should succeed: %s", diffOut)
+	assert.Contains(t, string(diffOut), "applied.txt")
+	assert.Contains(t, string(diffOut), "+hello")
+}
+
 func TestMergeCommand_LocalMainAheadOfOrigin(t *testing.T) {
 	env := testutil.NewTestEnv(t, 1)
 
@@ -161,6 +237,30 @@ func TestMergeCommand_LocalMainAheadOfOrigin(t *testing.T) {
 	tail, err := history.Tail(taskName)
 	require.NoError(t, err)
 	assert.Equal(t, task.TaskStatusMerged, tail.TaskStatus)
+
+	mergedEvents, err := history.Read(taskName, history.ReadOptions{EventsOnly: true})
+	require.NoError(t, err)
+	var mergedEv history.Event
+	for i := len(mergedEvents) - 1; i >= 0; i-- {
+		if mergedEvents[i].Type == "task.merged" {
+			mergedEv = mergedEvents[i]
+			break
+		}
+	}
+	require.Equal(t, "task.merged", mergedEv.Type)
+	var mergedData struct {
+		Via            string `json:"via"`
+		ChangesAdded   int    `json:"changes_added"`
+		ChangesRemoved int    `json:"changes_removed"`
+		CommitCount    int    `json:"commit_count"`
+		FrozenError    string `json:"frozen_error"`
+	}
+	require.NoError(t, json.Unmarshal(mergedEv.Data, &mergedData))
+	assert.Equal(t, "subtask", mergedData.Via)
+	assert.Equal(t, 2, mergedData.ChangesAdded)
+	assert.Equal(t, 0, mergedData.ChangesRemoved)
+	assert.Equal(t, 2, mergedData.CommitCount)
+	assert.Empty(t, strings.TrimSpace(mergedData.FrozenError))
 }
 
 // TestMergeWithConflicts verifies that merge handles conflicts gracefully
diff --git a/pkg/e2e/mergesim_concurrency_test.go b/pkg/e2e/mergesim_concurrency_test.go
new file mode 100644
index 0000000..3799114
--- /dev/null
+++ b/pkg/e2e/mergesim_concurrency_test.go
@@ -0,0 +1,78 @@
+package e2e
+
+import (
+	"fmt"
+	"os"
+	"os/exec"
+	"path/filepath"
+	"strings"
+	"sync"
+	"testing"
+
+	"github.com/stretchr/testify/require"
+
+	"github.com/zippoxer/subtask/pkg/task"
+	"github.com/zippoxer/subtask/pkg/task/history"
+	"github.com/zippoxer/subtask/pkg/testutil"
+)
+
+func TestAppliedContentDetection_IndexFallback_ConcurrentList(t *testing.T) {
+	env := testutil.NewTestEnv(t, 1)
+
+	taskName := "test/applied-concurrent"
+	env.CreateTask(taskName, "Applied task (concurrent)", "main", "Applied")
+	baseCommit := strings.TrimSpace(gitCmd(t, env.RootDir, "rev-parse", "HEAD"))
+	env.CreateTaskHistory(taskName, []history.Event{
+		{Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main", "base_commit": baseCommit})},
+		{Type: "stage.changed", Data: mustJSON(map[string]any{"from": "", "to": "implement"})},
+		{Type: "worker.finished", Data: mustJSON(map[string]any{"run_id": "r1", "duration_ms": 0, "tool_calls": 0, "outcome": "replied"})},
+	})
+
+	state := &task.State{Workspace: env.Workspaces[0]}
+	env.CreateTaskState(taskName, state)
+
+	// Create workspace with task branch and a commit.
+	ws := env.Workspaces[0]
+	gitCmd(t, ws, "checkout", "-b", taskName)
+	require.NoError(t, os.WriteFile(filepath.Join(ws, "feature.txt"), []byte("line 1\n"), 0o644))
+	gitCmd(t, ws, "add", "feature.txt")
+	gitCmd(t, ws, "commit", "-m", "Add feature")
+
+	// Simulate a squash-merge (or independently-applied change) into main with a different commit.
+	require.NoError(t, os.WriteFile(filepath.Join(env.RootDir, "feature.txt"), []byte("line 1\n"), 0o644))
+	gitCmd(t, env.RootDir, "add", "feature.txt")
+	gitCmd(t, env.RootDir, "commit", "-m", "Apply feature via squash")
+
+	subtaskBin := buildSubtask(t)
+
+	const n = 6
+	var wg sync.WaitGroup
+	errs := make(chan error, n)
+
+	for i := 0; i < n; i++ {
+		wg.Add(1)
+		go func() {
+			defer wg.Done()
+			cmd := exec.Command(subtaskBin, "list")
+			cmd.Dir = env.RootDir
+			cmd.Env = append(os.Environ(), "SUBTASK_MERGE_SIM_FORCE=index")
+			out, err := cmd.CombinedOutput()
+			if err != nil {
+				errs <- fmt.Errorf("list failed: %w: %s", err, out)
+				return
+			}
+			if !strings.Contains(string(out), "applied (+1 -0)") {
+				errs <- fmt.Errorf("expected applied in list output, got:\n%s", out)
+				return
+			}
+			errs <- nil
+		}()
+	}
+
+	wg.Wait()
+	close(errs)
+
+	for err := range errs {
+		require.NoError(t, err)
+	}
+}
diff --git a/pkg/e2e/parallel_test.go b/pkg/e2e/parallel_test.go
index 82811d0..4a8159f 100644
--- a/pkg/e2e/parallel_test.go
+++ b/pkg/e2e/parallel_test.go
@@ -24,6 +24,7 @@ func TestParallelCLI(t *testing.T) {
 	if testing.Short() {
 		t.Skip("skipping parallel CLI test in short mode")
 	}
+	t.Setenv("SUBTASK_DIR", t.TempDir())
 
 	// Build the binary first
 	binPath := buildSubtask(t)
@@ -99,6 +100,7 @@ func TestParallelCLI_AllWorkspacesOccupied(t *testing.T) {
 	if testing.Short() {
 		t.Skip("skipping parallel CLI test in short mode")
 	}
+	t.Setenv("SUBTASK_DIR", t.TempDir())
 
 	binPath := buildSubtask(t)
 	mockWorkerPath := mockWorkerPathForSubtask(binPath)
@@ -379,7 +381,7 @@ func mockPrompt(base string) string {
 
 func loadStateFromDir(root, taskName string) (*task.State, error) {
 	escaped := task.EscapeName(taskName)
-	path := filepath.Join(root, ".subtask", "internal", escaped, "state.json")
+	path := filepath.Join(task.ProjectsDir(), task.EscapePath(root), "internal", escaped, "state.json")
 	data, err := os.ReadFile(path)
 	if err != nil {
 		return nil, err
@@ -393,7 +395,7 @@ func loadStateFromDir(root, taskName string) (*task.State, error) {
 
 func loadProgressFromDir(root, taskName string) (*task.Progress, error) {
 	escaped := task.EscapeName(taskName)
-	path := filepath.Join(root, ".subtask", "internal", escaped, "progress.json")
+	path := filepath.Join(task.ProjectsDir(), task.EscapePath(root), "internal", escaped, "progress.json")
 	data, err := os.ReadFile(path)
 	if err != nil {
 		return nil, err
diff --git a/pkg/e2e/send_cli_test.go b/pkg/e2e/send_cli_test.go
index 2ea62fe..e1f9e98 100644
--- a/pkg/e2e/send_cli_test.go
+++ b/pkg/e2e/send_cli_test.go
@@ -19,6 +19,7 @@ func TestSendCLI_BasicFlowAndWorkingGuard(t *testing.T) {
 	if testing.Short() {
 		t.Skip("skipping send CLI test in short mode")
 	}
+	t.Setenv("SUBTASK_DIR", t.TempDir())
 
 	binPath := buildSubtask(t)
 	mockWorkerPath := mockWorkerPathForSubtask(binPath)
@@ -67,7 +68,7 @@ func TestSendCLI_BasicFlowAndWorkingGuard(t *testing.T) {
 	assert.GreaterOrEqual(t, progress2.ToolCalls, progress.ToolCalls+3)
 
 	// Force "working" (non-stale) and verify send errors
-	statePath := filepath.Join(root, ".subtask", "internal", task.EscapeName(taskName), "state.json")
+	statePath := filepath.Join(task.ProjectsDir(), task.EscapePath(root), "internal", task.EscapeName(taskName), "state.json")
 	data, err := os.ReadFile(statePath)
 	require.NoError(t, err)
 	var s task.State
diff --git a/pkg/e2e/send_concurrency_test.go b/pkg/e2e/send_concurrency_test.go
new file mode 100644
index 0000000..d1e941f
--- /dev/null
+++ b/pkg/e2e/send_concurrency_test.go
@@ -0,0 +1,120 @@
+package e2e
+
+import (
+	"os"
+	"os/exec"
+	"path/filepath"
+	"runtime"
+	"strings"
+	"testing"
+
+	"github.com/stretchr/testify/require"
+)
+
+func sleepCommandForPlatform(seconds int) string {
+	if seconds <= 0 {
+		seconds = 1
+	}
+	if runtime.GOOS == "windows" {
+		// ping counts include the first immediate reply.
+		return "ping -n " + itoa(seconds+1) + " 127.0.0.1 >NUL"
+	}
+	return "sleep " + itoa(seconds)
+}
+
+func itoa(n int) string {
+	if n == 0 {
+		return "0"
+	}
+	neg := n < 0
+	if neg {
+		n = -n
+	}
+	var b [32]byte
+	i := len(b)
+	for n > 0 {
+		i--
+		b[i] = byte('0' + (n % 10))
+		n /= 10
+	}
+	if neg {
+		i--
+		b[i] = '-'
+	}
+	return string(b[i:])
+}
+
+func TestSendCLI_ConcurrentSends_RaceWindowStillWorkingGuard(t *testing.T) {
+	if testing.Short() {
+		t.Skip("skipping concurrent send CLI test in short mode")
+	}
+	t.Setenv("SUBTASK_DIR", t.TempDir())
+
+	binPath := buildSubtask(t)
+	mockWorkerPath := mockWorkerPathForSubtask(binPath)
+	root := setupParallelTestRepo(t, 2, mockWorkerPath)
+
+	taskName := "send/concurrent"
+
+	// Draft task.
+	draftCmd := exec.Command(binPath, "draft", taskName, "Test task description",
+		"--base-branch", "main", "--title", "Concurrent send test")
+	draftCmd.Dir = root
+	out, err := draftCmd.CombinedOutput()
+	require.NoError(t, err, "draft failed: %s", out)
+
+	// Use a deterministic barrier inside `subtask send` so both processes reach the point
+	// after the unlocked state check but before either sets SupervisorPID under lock.
+	barrierDir := filepath.Join(t.TempDir(), "send-barrier")
+	envBarrier := []string{
+		"SUBTASK_TEST_SEND_BARRIER_DIR=" + barrierDir,
+		"SUBTASK_TEST_SEND_BARRIER_N=2",
+		"SUBTASK_TEST_SEND_BARRIER_TIMEOUT_MS=20000",
+	}
+
+	longPrompt := mockPrompt("Do something slowly") + "\n/MockRunCommand " + sleepCommandForPlatform(2)
+
+	type result struct {
+		out []byte
+		err error
+	}
+	ch := make(chan result, 2)
+
+	runSend := func() {
+		cmd := exec.Command(binPath, "send", taskName, longPrompt)
+		cmd.Dir = root
+		cmd.Env = append(os.Environ(), envBarrier...)
+		out, err := cmd.CombinedOutput()
+		ch <- result{out: out, err: err}
+	}
+
+	go runSend()
+	go runSend()
+
+	r1 := <-ch
+	r2 := <-ch
+
+	// Exactly one should succeed.
+	if r1.err == nil && r2.err == nil {
+		t.Fatalf("expected one send to fail, but both succeeded:\n---1---\n%s\n---2---\n%s", string(r1.out), string(r2.out))
+	}
+	if r1.err != nil && r2.err != nil {
+		t.Fatalf("expected one send to succeed, but both failed:\n---1---\n%s\n---2---\n%s", string(r1.out), string(r2.out))
+	}
+
+	// The failing send must fail cleanly with the guard message.
+	failOut := ""
+	if r1.err != nil {
+		failOut = string(r1.out)
+	} else {
+		failOut = string(r2.out)
+	}
+	if !strings.Contains(failOut, "still working") {
+		t.Fatalf("expected failing send to contain 'still working', got:\n%s", failOut)
+	}
+
+	// Barrier should have had both participants.
+	ents, readErr := os.ReadDir(barrierDir)
+	require.NoError(t, readErr)
+	require.GreaterOrEqual(t, len(ents), 2)
+}
diff --git a/pkg/e2e/setup_ux_test.go b/pkg/e2e/setup_ux_test.go
new file mode 100644
index 0000000..605187e
--- /dev/null
+++ b/pkg/e2e/setup_ux_test.go
@@ -0,0 +1,467 @@
+package e2e
+
+import (
+	"database/sql"
+	"encoding/json"
+	"fmt"
+	"io"
+	"os"
+	"os/exec"
+	"path/filepath"
+	"runtime"
+	"strings"
+	"testing"
+
+	_ "modernc.org/sqlite"
+
+	"github.com/stretchr/testify/require"
+
+	"github.com/zippoxer/subtask/pkg/task"
+	taskindex "github.com/zippoxer/subtask/pkg/task/index"
+	"github.com/zippoxer/subtask/pkg/workspace"
+)
+
+func TestSetupUX(t *testing.T) {
+	bin := buildSubtask(t)
+
+	t.Run("NotConfigured_ListFails", func(t *testing.T) {
+		t.Setenv("SUBTASK_DIR", t.TempDir())
+		root := t.TempDir()
+		run(t, root, "git", "init", "-b", "main")
+		run(t, root, "git", "config", "user.email", "test@test.com")
+		run(t, root, "git", "config", "user.name", "Test User")
+
+		out, err := runSubtaskWithErr(t, bin, root, "list")
+		require.Error(t, err)
+		require.Contains(t, out, "subtask: not configured — run 'subtask install' first")
+	})
+
+	t.Run("NotGitRepo_ListFails", func(t *testing.T) {
+		t.Setenv("SUBTASK_DIR", t.TempDir())
+		dir := t.TempDir()
+
+		out, err := runSubtaskWithErr(t, bin, dir, "list")
+		require.Error(t, err)
+		require.Contains(t, out, "subtask: not a git repository — subtask requires git")
+	})
+
+	t.Run("LegacyMigration_PromotesConfigAndRuntime", func(t *testing.T) {
+		subtaskDir := t.TempDir()
+		t.Setenv("SUBTASK_DIR", subtaskDir)
+
+		root := t.TempDir()
+		run(t, root, "git", "init", "-b", "main")
+		run(t, root, "git", "config", "user.email", "test@test.com")
+		run(t, root, "git", "config", "user.name", "Test User")
+		_ = os.WriteFile(filepath.Join(root, ".gitignore"), []byte(".subtask/\n"), 0o644)
+		_ = os.WriteFile(filepath.Join(root, "README.md"), []byte("# test\n"), 0o644)
+		run(t, root, "git", "add", ".")
+		run(t, root, "git", "commit", "-m", "init")
+
+		// Golden legacy repo-local layout produced by old Subtask CLI.
+		fixture := filepath.Join("..", "task", "migrate", "testdata", "legacy", "basic")
+		require.NoError(t, copyDir(fixture, filepath.Join(root, ".subtask")))
+
+		out, err := runSubtaskWithErr(t, bin, root, "list")
+		require.NoError(t, err, out)
+
+		// Global config should be created.
+		require.FileExists(t, task.ConfigPath())
+
+		// Runtime state should exist in ~/.subtask/projects//...
+		projectDir := filepath.Join(task.ProjectsDir(), task.EscapePath(root))
+		require.FileExists(t, filepath.Join(projectDir, "internal", "legacy--basic", "state.json"))
+		require.FileExists(t, filepath.Join(projectDir, "index.db"))
+
+		// Legacy runtime state should be removed from the repo.
+		_, err = os.Stat(filepath.Join(root, ".subtask", "internal"))
+		require.True(t, os.IsNotExist(err))
+		_, err = os.Stat(filepath.Join(root, ".subtask", "index.db"))
+		require.True(t, os.IsNotExist(err))
+
+		// Portable data stays in the repo.
+		require.FileExists(t, filepath.Join(root, ".subtask", "tasks", "legacy--basic", "TASK.md"))
+		require.FileExists(t, filepath.Join(root, ".subtask", "config.json"))
+
+		// Index should be usable even if legacy file was corrupt (rebuilt is OK).
+		idx, err := taskindex.Open(filepath.Join(projectDir, "index.db"))
+		require.NoError(t, err)
+		require.NoError(t, idx.Close())
+		db, err := sql.Open("sqlite", filepath.Join(projectDir, "index.db"))
+		require.NoError(t, err)
+		t.Cleanup(func() { _ = db.Close() })
+		var hash sql.NullString
+		require.NoError(t, db.QueryRow(`SELECT git_refs_snapshot_hash FROM index_meta WHERE id = 1;`).Scan(&hash))
+	})
+
+	t.Run("SubdirUsage_ListWorks", func(t *testing.T) {
+		subtaskDir := t.TempDir()
+		t.Setenv("SUBTASK_DIR", subtaskDir)
+
+		root := t.TempDir()
+		run(t, root, "git", "init", "-b", "main")
+		run(t, root, "git", "config", "user.email", "test@test.com")
+		run(t, root, "git", "config", "user.name", "Test User")
+		_ = os.WriteFile(filepath.Join(root, ".gitignore"), []byte(".subtask/\n"), 0o644)
+
+		// Global config present.
+		cfg := &workspace.Config{Harness: "mock", MaxWorkspaces: 3}
+		cfgData, _ := json.MarshalIndent(cfg, "", " ")
+		require.NoError(t, os.MkdirAll(filepath.Dir(task.ConfigPath()), 0o755))
+		require.NoError(t, os.WriteFile(task.ConfigPath(), cfgData, 0o644))
+
+		subdir := filepath.Join(root, "src", "foo")
+		require.NoError(t, os.MkdirAll(subdir, 0o755))
+
+		out, err := runSubtaskWithErr(t, bin, subdir, "list")
+		require.NoError(t, err, out)
+	})
+
+	t.Run("InvalidGlobalConfig_ListErrorsHelpful", func(t *testing.T) {
+		t.Setenv("SUBTASK_DIR", t.TempDir())
+
+		root := t.TempDir()
+		run(t, root, "git", "init", "-b", "main")
+		run(t, root, "git", "config", "user.email", "test@test.com")
+		run(t, root, "git", "config", "user.name", "Test User")
+
+		require.NoError(t, os.MkdirAll(filepath.Dir(task.ConfigPath()), 0o755))
+		require.NoError(t, os.WriteFile(task.ConfigPath(), []byte("{\n"), 0o644))
+
+		out, err := runSubtaskWithErr(t, bin, root, "list")
+		require.Error(t, err)
+		require.Contains(t, out, "subtask: invalid config")
+		require.Contains(t, out, "subtask config --user")
+	})
+
+	t.Run("InvalidProjectConfig_ListErrorsHelpful", func(t *testing.T) {
+		t.Setenv("SUBTASK_DIR", t.TempDir())
+
+		root := t.TempDir()
+		run(t, root, "git", "init", "-b", "main")
+		run(t, root, "git", "config", "user.email", "test@test.com")
+		run(t, root, "git", "config", "user.name", "Test User")
+
+		// Valid global config.
+		cfg := &workspace.Config{Harness: "mock", MaxWorkspaces: 3}
+		cfgData, _ := json.MarshalIndent(cfg, "", " ")
+		require.NoError(t, os.MkdirAll(filepath.Dir(task.ConfigPath()), 0o755))
+		require.NoError(t, os.WriteFile(task.ConfigPath(), cfgData, 0o644))
+
+		// Invalid project override.
+		require.NoError(t, os.MkdirAll(filepath.Join(root, ".subtask"), 0o755))
+		require.NoError(t, os.WriteFile(filepath.Join(root, ".subtask", "config.json"), []byte("{\n"), 0o644))
+
+		out, err := runSubtaskWithErr(t, bin, root, "list")
+		require.Error(t, err)
+		require.Contains(t, out, "invalid project config")
+		require.Contains(t, out, "subtask config --project")
+	})
+
+	t.Run("FreshInstall_NoInit_DraftAndListWork", func(t *testing.T) {
+		subtaskDir := t.TempDir()
+		t.Setenv("SUBTASK_DIR", subtaskDir)
+
+		home := t.TempDir()
+		t.Setenv("HOME", home)
+		t.Setenv("USERPROFILE", home) // windows
+
+		addStubCommandToPATH(t, "codex")
+
+		// Install + configure (writes global config).
+		cwd := t.TempDir()
+		out, err := runSubtaskWithErr(t, bin, cwd, "install", "--no-prompt")
+		require.NoError(t, err, out)
+		require.FileExists(t, task.ConfigPath())
+
+		// New repo: draft/list should work without any init ceremony.
+		repo := t.TempDir()
+		initGitRepo(t, repo)
+
+		taskName := "setup/test"
+		out, err = runSubtaskWithErr(t, bin, repo, "draft", taskName, "desc", "--base-branch", "main", "--title", "Setup UX")
+		require.NoError(t, err, out)
+		require.FileExists(t, filepath.Join(repo, ".subtask", "tasks", task.EscapeName(taskName), "TASK.md"))
+
+		out, err = runSubtaskWithErr(t, bin, repo, "list")
+		require.NoError(t, err, out)
+		require.Contains(t, out, taskName)
+	})
+
+	t.Run("ConfigProject_NoPrompt_CreatesOverrideFile", func(t *testing.T) {
+		subtaskDir := t.TempDir()
+		t.Setenv("SUBTASK_DIR", subtaskDir)
+
+		home := t.TempDir()
+		t.Setenv("HOME", home)
+		t.Setenv("USERPROFILE", home) // windows
+
+		addStubCommandToPATH(t, "codex")
+
+		// Global config present.
+		cfg := &workspace.Config{Harness: "builtin-mock", MaxWorkspaces: 3}
+		cfgData, _ := json.MarshalIndent(cfg, "", " ")
+		require.NoError(t, os.MkdirAll(filepath.Dir(task.ConfigPath()), 0o755))
+		require.NoError(t, os.WriteFile(task.ConfigPath(), cfgData, 0o644))
+
+		repo := t.TempDir()
+		initGitRepo(t, repo)
+
+		out, err := runSubtaskWithErr(t, bin, repo, "config", "--project", "--no-prompt")
+		require.NoError(t, err, out)
+		require.FileExists(t, filepath.Join(repo, ".subtask", "config.json"))
+	})
+
+	t.Run("ConfigProject_NoPrompt_FlagsOverride", func(t *testing.T) {
+		subtaskDir := t.TempDir()
+		t.Setenv("SUBTASK_DIR", subtaskDir)
+		t.Setenv("SUBTASK_DEBUG", "")
+
+		home := t.TempDir()
+		t.Setenv("HOME", home)
+		t.Setenv("USERPROFILE", home) // windows
+
+		addStubCommandToPATH(t, "codex")
+
+		// Global config present, but should not influence flag-driven project config.
+		cfg := &workspace.Config{Harness: "builtin-mock", MaxWorkspaces: 3}
+		cfgData, _ := json.MarshalIndent(cfg, "", " ")
+		require.NoError(t, os.MkdirAll(filepath.Dir(task.ConfigPath()), 0o755))
+		require.NoError(t, os.WriteFile(task.ConfigPath(), cfgData, 0o644))
+
+		repo := t.TempDir()
+		initGitRepo(t, repo)
+
+		out, err := runSubtaskWithErr(t, bin, repo,
+			"config", "--project", "--no-prompt",
+			"--harness", "codex",
+			"--model", "gpt-5.2-codex",
+			"--reasoning", "medium",
+			"--max-workspaces", "9",
+		)
+		require.NoError(t, err, out)
+
+		var got workspace.Config
+		require.NoError(t, readJSON(filepath.Join(repo, ".subtask", "config.json"), &got))
+		require.Equal(t, "codex", got.Harness)
+		require.Equal(t, 9, got.MaxWorkspaces)
+		require.Equal(t, "gpt-5.2-codex", got.Options["model"])
+		require.Equal(t, "medium", got.Options["reasoning"])
+	})
+
+	t.Run("Migration_NoClobber_WhenDestinationExists", func(t *testing.T) {
+		subtaskDir := t.TempDir()
+		t.Setenv("SUBTASK_DIR", subtaskDir)
+
+		home := t.TempDir()
+		t.Setenv("HOME", home)
+		t.Setenv("USERPROFILE", home) // windows
+
+		// Global config present so migration won't promote legacy repo config.
+		cfg := &workspace.Config{Harness: "builtin-mock", MaxWorkspaces: 3}
+		cfgData, _ := json.MarshalIndent(cfg, "", " ")
+		require.NoError(t, os.MkdirAll(filepath.Dir(task.ConfigPath()), 0o755))
+		require.NoError(t, os.WriteFile(task.ConfigPath(), cfgData, 0o644))
+
+		repo := t.TempDir()
+		initGitRepo(t, repo)
+
+		// Seed repo with legacy runtime layout.
+		fixture := filepath.Join("..", "task", "migrate", "testdata", "legacy", "basic")
+		require.NoError(t, copyDir(fixture, filepath.Join(repo, ".subtask")))
+
+		projectDir := filepath.Join(task.ProjectsDir(), task.EscapePath(repo))
+		destInternalDir := filepath.Join(projectDir, "internal", "legacy--basic")
+		require.NoError(t, os.MkdirAll(destInternalDir, 0o755))
+		destStatePath := filepath.Join(destInternalDir, "state.json")
+		require.NoError(t, os.WriteFile(destStatePath, []byte(`{"session_id":"dest"}`+"\n"), 0o644))
+
+		// Create destination index.db (valid) and tag it with a sentinel table.
+		destIndex := filepath.Join(projectDir, "index.db")
+		require.NoError(t, copyFile(filepath.Join(repo, ".subtask", "index.db"), destIndex))
+		db, err := sql.Open("sqlite", destIndex)
+		require.NoError(t, err)
+		_, _ = db.Exec(`CREATE TABLE IF NOT EXISTS sentinel (value TEXT);`)
+		_, _ = db.Exec(`DELETE FROM sentinel;`)
+		_, _ = db.Exec(`INSERT INTO sentinel(value) VALUES ('dest');`)
+		require.NoError(t, db.Close())
+
+		// Corrupt the legacy index and state; migration must not overwrite destination.
+		require.NoError(t, os.WriteFile(filepath.Join(repo, ".subtask", "index.db"), []byte("legacy-corrupt"), 0o644))
+		require.NoError(t, os.WriteFile(filepath.Join(repo, ".subtask", "internal", "legacy--basic", "state.json"), []byte(`{"session_id":"legacy"}`+"\n"), 0o644))
+
+		out, err := runSubtaskWithErr(t, bin, repo, "list")
+		require.NoError(t, err, out)
+
+		// Destination state should be unchanged (no clobber).
+		gotState, err := os.ReadFile(destStatePath)
+		require.NoError(t, err)
+		require.Contains(t, string(gotState), `"session_id":"dest"`)
+
+		// Destination index should not have been replaced/rebuilt (sentinel preserved).
+		db, err = sql.Open("sqlite", destIndex)
+		require.NoError(t, err)
+		var v string
+		require.NoError(t, db.QueryRow(`SELECT value FROM sentinel LIMIT 1;`).Scan(&v))
+		require.Equal(t, "dest", v)
+		require.NoError(t, db.Close())
+
+		// Legacy runtime should be removed from repo after migration.
+		_, err = os.Stat(filepath.Join(repo, ".subtask", "internal"))
+		require.True(t, os.IsNotExist(err))
+		_, err = os.Stat(filepath.Join(repo, ".subtask", "index.db"))
+		require.True(t, os.IsNotExist(err))
+	})
+
+	t.Run("ConfigScope_ProjectOverrideWins_AndOtherRepoUsesGlobal", func(t *testing.T) {
+		subtaskDir := t.TempDir()
+		t.Setenv("SUBTASK_DIR", subtaskDir)
+
+		home := t.TempDir()
+		t.Setenv("HOME", home)
+		t.Setenv("USERPROFILE", home) // windows
+
+		// Repo A has a project override that switches to the external mock harness.
+		repoA := t.TempDir()
+		initGitRepo(t, repoA)
+		require.NoError(t, os.MkdirAll(filepath.Join(repoA, ".subtask"), 0o755))
+		workerPath := mockWorkerPathForSubtask(bin)
+		require.FileExists(t, workerPath)
+
+		projectCfg := &workspace.Config{
+			Harness: "mock",
+			Options: map[string]any{"cli": workerPath},
+		}
+		b, _ := json.MarshalIndent(projectCfg, "", " ")
+		require.NoError(t, os.WriteFile(filepath.Join(repoA, ".subtask", "config.json"), b, 0o644))
+
+		// Global defaults: builtin mock (in-process).
+		globalCfg := &workspace.Config{
+			Harness:       "builtin-mock",
+			MaxWorkspaces: 3,
+			Options:       map[string]any{"tool_calls": 0},
+		}
+		gb, _ := json.MarshalIndent(globalCfg, "", " ")
+		require.NoError(t, os.MkdirAll(filepath.Dir(task.ConfigPath()), 0o755))
+		require.NoError(t, os.WriteFile(task.ConfigPath(), gb, 0o644))
+
+		out, err := runSubtaskWithErr(t, bin, repoA, "ask", "hi")
+		require.NoError(t, err, out)
+		require.Contains(t, out, "Mock completed (no commands).")
+
+		// Repo B should use global defaults (no project override).
+		repoB := t.TempDir()
+		initGitRepo(t, repoB)
+
+		out, err = runSubtaskWithErr(t, bin, repoB, "ask", "hi")
+		require.NoError(t, err, out)
+		require.Contains(t, out, "Mock response for:")
+	})
+
+	t.Run("Worktree_AutoResolvesAnchor", func(t *testing.T) {
+		subtaskDir := t.TempDir()
+		t.Setenv("SUBTASK_DIR", subtaskDir)
+
+		home := t.TempDir()
+		t.Setenv("HOME", home)
+		t.Setenv("USERPROFILE", home) // windows
+
+		// Global config present.
+		cfg := &workspace.Config{Harness: "builtin-mock", MaxWorkspaces: 3}
+		cfgData, _ := json.MarshalIndent(cfg, "", " ")
+		require.NoError(t, os.MkdirAll(filepath.Dir(task.ConfigPath()), 0o755))
+		require.NoError(t, os.WriteFile(task.ConfigPath(), cfgData, 0o644))
+
+		anchor := t.TempDir()
+		initGitRepo(t, anchor)
+		require.NoError(t, os.MkdirAll(filepath.Join(anchor, ".subtask", "tasks"), 0o755)) // helps anchor selection
+
+		escaped := task.EscapePath(anchor)
+		wsPath := filepath.Join(task.WorkspacesDir(), fmt.Sprintf("%s--%d", escaped, 1))
+		require.NoError(t, os.MkdirAll(filepath.Dir(wsPath), 0o755))
+		run(t, anchor, "git", "worktree", "add", "--detach", wsPath)
+
+		out, err := runSubtaskWithErr(t, bin, wsPath, "list")
+		require.NoError(t, err, out)
+
+		// Runtime folder should be for the anchor, not the workspace root.
+		require.DirExists(t, filepath.Join(task.ProjectsDir(), task.EscapePath(anchor)))
+		_, err = os.Stat(filepath.Join(task.ProjectsDir(), task.EscapePath(wsPath)))
+		require.True(t, os.IsNotExist(err))
+	})
+}
+
+func runSubtaskWithErr(t *testing.T, binPath, dir string, args ...string) (string, error) {
+	t.Helper()
+	cmd := exec.Command(binPath, args...)
+ cmd.Dir = dir + out, err := cmd.CombinedOutput() + return strings.TrimSpace(string(out)), err +} + +func copyDir(src, dst string) error { + entries, err := os.ReadDir(src) + if err != nil { + return err + } + if err := os.MkdirAll(dst, 0o755); err != nil { + return err + } + for _, e := range entries { + srcPath := filepath.Join(src, e.Name()) + dstPath := filepath.Join(dst, e.Name()) + if e.IsDir() { + if err := copyDir(srcPath, dstPath); err != nil { + return err + } + continue + } + if err := copyFile(srcPath, dstPath); err != nil { + return err + } + } + return nil +} + +func copyFile(src, dst string) error { + st, err := os.Stat(src) + if err != nil { + return err + } + if err := os.MkdirAll(filepath.Dir(dst), 0o755); err != nil { + return err + } + in, err := os.Open(src) + if err != nil { + return err + } + defer in.Close() + out, err := os.OpenFile(dst, os.O_CREATE|os.O_WRONLY|os.O_TRUNC, st.Mode()) + if err != nil { + return err + } + _, copyErr := io.Copy(out, in) + closeErr := out.Close() + if copyErr != nil { + return copyErr + } + return closeErr +} + +func addStubCommandToPATH(t *testing.T, name string) { + t.Helper() + + binDir := t.TempDir() + var path string + var content []byte + + if runtime.GOOS == "windows" { + path = filepath.Join(binDir, name+".bat") + content = []byte("@echo off\r\nexit /B 0\r\n") + } else { + path = filepath.Join(binDir, name) + content = []byte("#!/bin/sh\nexit 0\n") + } + require.NoError(t, os.WriteFile(path, content, 0o755)) + t.Setenv("PATH", binDir+string(os.PathListSeparator)+os.Getenv("PATH")) +} diff --git a/pkg/git/diff.go b/pkg/git/diff.go index b74c16e..b35e0c3 100644 --- a/pkg/git/diff.go +++ b/pkg/git/diff.go @@ -125,6 +125,24 @@ func DiffNumstatRange(dir, baseRef, branchRef string) ([]DiffFileStat, error) { return parseNumstat(out), nil } +// DiffStatRange returns summed added/removed lines for base..branch. +// +// This is committed-history only (does not include uncommitted workspace changes). 
+func DiffStatRange(dir, baseRef, branchRef string) (added, removed int, err error) { + stats, err := DiffNumstatRange(dir, baseRef, branchRef) + if err != nil { + return 0, 0, err + } + for _, s := range stats { + if s.Binary { + continue + } + added += s.Added + removed += s.Removed + } + return added, removed, nil +} + // DiffFile returns the unified diff for a single file path compared to baseRef. func DiffFile(dir, baseRef, path string) (string, error) { return Output(dir, "diff", baseRef, "--", path) diff --git a/pkg/git/error.go b/pkg/git/error.go new file mode 100644 index 0000000..ab5ffe4 --- /dev/null +++ b/pkg/git/error.go @@ -0,0 +1,30 @@ +package git + +import ( + "strings" +) + +// Error is a structured git execution error that avoids leaking raw "exit status N" strings. +type Error struct { + Dir string + Args []string + Stderr string + Cause error +} + +func (e *Error) Error() string { + args := strings.Join(e.Args, " ") + if strings.TrimSpace(e.Stderr) != "" { + return "git " + args + ": " + strings.TrimSpace(e.Stderr) + } + return "git " + args + " failed" +} + +func (e *Error) Unwrap() error { return e.Cause } + +func isNotGitRepoOutput(s string) bool { + s = strings.ToLower(s) + return strings.Contains(s, "not a git repository") || + strings.Contains(s, "not a git repo") +} + diff --git a/pkg/git/git.go b/pkg/git/git.go index 277f52e..d8738ae 100644 --- a/pkg/git/git.go +++ b/pkg/git/git.go @@ -2,6 +2,7 @@ package git import ( "bytes" + "errors" "fmt" "os" "os/exec" @@ -12,7 +13,7 @@ import ( "time" "github.com/zippoxer/subtask/pkg/logging" - "github.com/zippoxer/subtask/pkg/task" + "github.com/zippoxer/subtask/pkg/subtaskerr" ) // Run runs a git command in the specified directory. 
@@ -22,7 +23,10 @@ func Run(dir string, args ...string) error { cmd.Stdout = os.Stdout cmd.Stderr = os.Stderr if !logging.DebugEnabled() { - return cmd.Run() + if err := cmd.Run(); err != nil { + return &Error{Dir: dir, Args: args, Cause: err} + } + return nil } start := time.Now() err := cmd.Run() @@ -32,7 +36,7 @@ func Run(dir string, args ...string) error { gitCmdBatcher.flushNow() logging.Debug("git", fmt.Sprintf("%s (%s)", strings.Join(args, " "), d.Round(time.Millisecond))) logging.Error("git", fmt.Sprintf("%s error: %s (%s)", strings.Join(args, " "), err.Error(), d.Round(time.Millisecond))) - return err + return &Error{Dir: dir, Args: args, Cause: err} } logGitCommandTiming(args, d) @@ -70,7 +74,10 @@ func RunWithStderrFilter(dir string, stderrFilter func(string) string, args ...s err = cmd.Run() } if stderr.Len() == 0 { - return err + if err != nil { + return &Error{Dir: dir, Args: args, Cause: err} + } + return nil } out := stderr.String() @@ -80,7 +87,13 @@ func RunWithStderrFilter(dir string, stderrFilter func(string) string, args ...s if out != "" { _, _ = os.Stderr.WriteString(out) } - return err + if err != nil { + if isNotGitRepoOutput(out) { + return subtaskerr.ErrNotGitRepo + } + return &Error{Dir: dir, Args: args, Stderr: out, Cause: err} + } + return nil } // FilterLineEndingWarnings removes common git line-ending conversion warnings. @@ -110,13 +123,29 @@ func FilterLineEndingWarnings(stderr string) string { func RunQuiet(dir string, args ...string) error { cmd := exec.Command("git", args...) 
cmd.Dir = dir - if !logging.DebugEnabled() { - return cmd.Run() + var stderr bytes.Buffer + cmd.Stderr = &stderr + + var ( + err error + d time.Duration + ) + if logging.DebugEnabled() { + start := time.Now() + err = cmd.Run() + d = time.Since(start) + logGitCommandTiming(args, d) + } else { + err = cmd.Run() } - start := time.Now() - err := cmd.Run() - logGitCommandTiming(args, time.Since(start)) - return err + if err != nil { + out := stderr.String() + if isNotGitRepoOutput(out) { + return subtaskerr.ErrNotGitRepo + } + return &Error{Dir: dir, Args: args, Stderr: out, Cause: err} + } + return nil } // RunSilent runs a git command, capturing output and only showing it on error. @@ -144,7 +173,12 @@ func RunSilent(dir string, args ...string) error { } if err != nil { // Show the output only when there's an error - os.Stderr.Write(out) + _, _ = os.Stderr.Write(out) + + if isNotGitRepoOutput(string(out)) { + return subtaskerr.ErrNotGitRepo + } + return &Error{Dir: dir, Args: args, Stderr: string(out), Cause: err} } return err } @@ -153,24 +187,46 @@ func RunSilent(dir string, args ...string) error { func Output(dir string, args ...string) (string, error) { cmd := exec.Command("git", args...) 
cmd.Dir = dir - if !logging.DebugEnabled() { - out, err := cmd.Output() - if err != nil { - logging.Error("git", fmt.Sprintf("%s error: %s", strings.Join(args, " "), err.Error())) - } - return strings.TrimSpace(string(out)), err + + var stdout bytes.Buffer + var stderr bytes.Buffer + cmd.Stdout = &stdout + cmd.Stderr = &stderr + + var ( + err error + d time.Duration + ) + if logging.DebugEnabled() { + start := time.Now() + err = cmd.Run() + d = time.Since(start) + } else { + err = cmd.Run() } - start := time.Now() - out, err := cmd.Output() - d := time.Since(start) + + outStr := strings.TrimSpace(stdout.String()) + errStr := stderr.String() + if err != nil { - gitCmdBatcher.flushNow() - logging.Debug("git", fmt.Sprintf("%s (%s)", strings.Join(args, " "), d.Round(time.Millisecond))) - logging.Error("git", fmt.Sprintf("%s error: %s (%s)", strings.Join(args, " "), err.Error(), d.Round(time.Millisecond))) - } else { + // Check for "not a git repo" first - this is an expected condition, not an error worth logging. + if isNotGitRepoOutput(errStr) { + return "", subtaskerr.ErrNotGitRepo + } + if logging.DebugEnabled() { + gitCmdBatcher.flushNow() + logging.Debug("git", fmt.Sprintf("%s (%s)", strings.Join(args, " "), d.Round(time.Millisecond))) + logging.Error("git", fmt.Sprintf("%s error: %s (%s)", strings.Join(args, " "), strings.TrimSpace(errStr), d.Round(time.Millisecond))) + } else { + logging.Error("git", fmt.Sprintf("%s error: %s", strings.Join(args, " "), strings.TrimSpace(errStr))) + } + return "", &Error{Dir: dir, Args: args, Stderr: errStr, Cause: err} + } + + if logging.DebugEnabled() { logGitCommandTiming(args, d) } - return strings.TrimSpace(string(out)), err + return outStr, nil } // CommitExists returns whether rev resolves to a commit object. @@ -247,15 +303,15 @@ func Switch(dir, branch, startPoint string) error { } // SetupBranch sets up the git branch for a task (local-first). 
-func SetupBranch(dir string, t *task.Task, baseCommit string) error {
+func SetupBranch(dir string, taskBranch string, baseBranch string, baseCommit string) error {
 	// Prefer a pinned base commit when available (stable diffs, staleness detection).
 	if baseCommit != "" {
-		if err := Switch(dir, t.Name, baseCommit); err == nil {
+		if err := Switch(dir, taskBranch, baseCommit); err == nil {
 			return nil
 		}
 	}
 
-	return Switch(dir, t.Name, t.BaseBranch)
+	return Switch(dir, taskBranch, baseBranch)
 }
 
 // IsClean checks if the working directory is clean.
@@ -333,6 +389,75 @@ func DiffStat(dir, baseRef string) (added, removed int, err error) {
 	return added, removed, nil
 }
 
+// RevListCount returns how many commits are reachable from headRef but not baseCommit.
+// Equivalent to: git rev-list --count baseCommit..headRef
+func RevListCount(dir, baseCommit, headRef string) (int, error) {
+	out, err := Output(dir, "rev-list", "--count", baseCommit+".."+headRef)
+	if err != nil {
+		return 0, err
+	}
+	if out == "" {
+		return 0, nil
+	}
+	n, err := strconv.Atoi(strings.TrimSpace(out))
+	if err != nil {
+		return 0, err
+	}
+	return n, nil
+}
+
+type CommitMeta struct {
+	SHA         string
+	Subject     string
+	AuthorName  string
+	AuthorEmail string
+	AuthoredAt  int64 // unix seconds
+}
+
+// ListCommitsRange returns commits reachable from to but not from from (from..to),
+// ordered from oldest to newest.
+func ListCommitsRange(dir, from, to string) ([]CommitMeta, error) { + from = strings.TrimSpace(from) + to = strings.TrimSpace(to) + if from == "" || to == "" { + return nil, fmt.Errorf("commit range requires from and to") + } + + const fieldSep = "\x1f" + format := "%H" + fieldSep + "%an" + fieldSep + "%ae" + fieldSep + "%at" + fieldSep + "%s" + out, err := Output(dir, "log", "--reverse", "--format="+format, from+".."+to) + if err != nil { + return nil, err + } + + out = strings.TrimSpace(out) + if out == "" { + return nil, nil + } + + lines := strings.Split(out, "\n") + commits := make([]CommitMeta, 0, len(lines)) + for _, line := range lines { + line = strings.TrimSpace(line) + if line == "" { + continue + } + parts := strings.Split(line, fieldSep) + if len(parts) < 5 { + continue + } + authoredAt, _ := strconv.ParseInt(strings.TrimSpace(parts[3]), 10, 64) + commits = append(commits, CommitMeta{ + SHA: strings.TrimSpace(parts[0]), + AuthorName: strings.TrimSpace(parts[1]), + AuthorEmail: strings.TrimSpace(parts[2]), + AuthoredAt: authoredAt, + Subject: strings.TrimSpace(parts[4]), + }) + } + return commits, nil +} + // CommitsBehind returns how many commits targetRef is ahead of baseCommit. // Equivalent to: git rev-list --count .. func CommitsBehind(dir, baseCommit, targetRef string) (int, error) { @@ -416,6 +541,20 @@ func MergeBase(dir, ref1, ref2 string) (string, error) { return Output(dir, "merge-base", ref1, ref2) } +// MergeBaseForkPoint returns the fork-point merge-base between upstream and commit. +// +// This is useful for finding a stable "PR base" even if the branch tip is already +// reachable from upstream (e.g., fast-forward merged), as long as upstream's reflog +// still contains the previous base tip. 
+func MergeBaseForkPoint(dir, upstream, commit string) (string, error) { + upstream = strings.TrimSpace(upstream) + commit = strings.TrimSpace(commit) + if upstream == "" || commit == "" { + return "", fmt.Errorf("upstream and commit are required") + } + return Output(dir, "merge-base", "--fork-point", upstream, commit) +} + // MergeConflictFiles returns the list of files that would conflict when merging headRef into targetRef. // // This is a non-mutating check intended for preflight/status displays. It uses `git merge-tree` to @@ -427,38 +566,14 @@ func MergeConflictFiles(dir, targetRef, headRef string) ([]string, error) { return nil, fmt.Errorf("targetRef and headRef are required") } - mb, err := MergeBase(dir, targetRef, headRef) + res, err := simulateMerge(dir, targetRef, headRef) if err != nil { return nil, err } - mb = strings.TrimSpace(mb) - if mb == "" { - return nil, fmt.Errorf("failed to resolve merge-base between %s and %s", targetRef, headRef) - } - - // `git merge-tree --write-tree` returns exit status 1 on conflicts. In conflict cases it prints: - // - // - // - // - // Auto-merging ... - // CONFLICT ... - cmd := exec.Command("git", "merge-tree", "--write-tree", "--name-only", "--merge-base", mb, targetRef, headRef) - cmd.Dir = dir - out, runErr := cmd.CombinedOutput() - if runErr == nil { + if len(res.ConflictFiles) == 0 { return nil, nil } - - s := string(out) - files := mergeTreeNameOnlyConflictFiles(s) - if len(files) == 0 && strings.Contains(s, "CONFLICT") { - files = extractMergeConflictFiles(s) - } - if len(files) == 0 { - return nil, fmt.Errorf("git merge-tree failed: %w", runErr) - } - return files, nil + return res.ConflictFiles, nil } func mergeTreeNameOnlyConflictFiles(output string) []string { @@ -571,13 +686,115 @@ func extractConflictLines(output string) string { return strings.Join(lines, "\n") } -// LocalPush fast-forwards targetBranch to current HEAD via local push. 
-// Uses receive.denyCurrentBranch=updateInstead to allow pushing to a checked-out branch. -// This works even if targetBranch is checked out in the main worktree. +func worktreePathsForBranch(dir, branch string) ([]string, error) { + branch = strings.TrimSpace(branch) + if branch == "" { + return nil, nil + } + + out, err := Output(dir, "worktree", "list", "--porcelain") + if err != nil { + return nil, err + } + out = strings.TrimSpace(out) + if out == "" { + return nil, nil + } + + want := "refs/heads/" + branch + + var ( + currentPath string + currentBranch string + paths []string + ) + flush := func() { + if currentPath != "" && currentBranch == want { + paths = append(paths, currentPath) + } + currentPath = "" + currentBranch = "" + } + + for _, line := range strings.Split(out, "\n") { + line = strings.TrimSpace(line) + if line == "" { + flush() + continue + } + + if strings.HasPrefix(line, "worktree ") { + // New record (flush previous, then start). + flush() + currentPath = strings.TrimSpace(strings.TrimPrefix(line, "worktree ")) + continue + } + if strings.HasPrefix(line, "branch ") { + currentBranch = strings.TrimSpace(strings.TrimPrefix(line, "branch ")) + continue + } + } + flush() + + return paths, nil +} + +// LocalPush fast-forwards targetBranch to current HEAD. +// +// If targetBranch is checked out in another worktree (e.g. the user's main worktree), +// update it using a fast-forward merge so local uncommitted changes are preserved when +// they don't overlap (git-like behavior). +// +// If targetBranch is not checked out anywhere, update only the ref. 
func LocalPush(dir, targetBranch string) error { - return RunSilent(dir, "push", - "--receive-pack=git -c receive.denyCurrentBranch=updateInstead receive-pack", - ".", "HEAD:"+targetBranch) + targetBranch = strings.TrimSpace(targetBranch) + if targetBranch == "" { + return fmt.Errorf("targetBranch is required") + } + + head, err := Output(dir, "rev-parse", "HEAD") + if err != nil { + return err + } + head = strings.TrimSpace(head) + if head == "" { + return fmt.Errorf("failed to resolve HEAD") + } + + old, err := Output(dir, "rev-parse", targetBranch) + if err != nil { + return err + } + old = strings.TrimSpace(old) + if old == "" { + return fmt.Errorf("failed to resolve %s", targetBranch) + } + + ok, err := IsAncestor(dir, old, head) + if err != nil { + return err + } + if !ok { + return fmt.Errorf("cannot fast-forward %s to %s (not a descendant)", targetBranch, head) + } + + paths, err := worktreePathsForBranch(dir, targetBranch) + if err != nil { + return err + } + if len(paths) > 1 { + return fmt.Errorf("cannot fast-forward %s: branch is checked out in multiple worktrees (%s)", targetBranch, strings.Join(paths, ", ")) + } + if len(paths) == 1 { + wt := paths[0] + if err := RunSilent(wt, "merge", "--ff-only", head); err != nil { + return fmt.Errorf("failed to fast-forward %s in %s: %w", targetBranch, wt, err) + } + return nil + } + + // Ref-only update (not checked out anywhere). Use the expected old SHA to avoid races. + return RunSilent(dir, "update-ref", "-m", "subtask merge", "refs/heads/"+targetBranch, head, old) } // IntegrationReason describes why a branch is considered integrated into target. @@ -634,6 +851,30 @@ func BranchExists(dir, branch string) bool { return err == nil } +// IsAncestor reports whether ancestor is reachable from descendant. 
+// +// This wraps `git merge-base --is-ancestor`: +// - returns (true, nil) if ancestor is an ancestor of descendant +// - returns (false, nil) if ancestor is NOT an ancestor of descendant +// - returns (false, err) for other git errors (missing commits, not a repo, etc.) +func IsAncestor(dir, ancestor, descendant string) (bool, error) { + err := RunQuiet(dir, "merge-base", "--is-ancestor", ancestor, descendant) + if err == nil { + return true, nil + } + + // Exit code 1 = not ancestor. + var gitErr *Error + if errors.As(err, &gitErr) { + var exitErr *exec.ExitError + if errors.As(gitErr.Cause, &exitErr) && exitErr.ExitCode() == 1 { + return false, nil + } + } + + return false, err +} + // isSameCommit checks if two refs point to the same commit. func isSameCommit(dir, ref1, ref2 string) bool { out, err := Output(dir, "rev-parse", ref1, ref2) @@ -683,9 +924,8 @@ func treesMatch(dir, ref1, ref2 string) bool { // mergeAddsChanges checks if merging branch into target would add any changes. // Uses git merge-tree to simulate the merge without actually performing it. func mergeAddsChanges(dir, branch, target string) bool { - // git merge-tree --write-tree returns the tree SHA of what the merge would produce - mergeTree, err := Output(dir, "merge-tree", "--write-tree", target, branch) - if err != nil { + res, err := simulateMerge(dir, target, branch) + if err != nil || len(res.ConflictFiles) > 0 || strings.TrimSpace(res.MergedTree) == "" { return true // Assume has changes on error (including conflicts) } @@ -696,7 +936,7 @@ func mergeAddsChanges(dir, branch, target string) bool { } // If merge result equals target tree, merging adds nothing - return strings.TrimSpace(mergeTree) != strings.TrimSpace(targetTree) + return strings.TrimSpace(res.MergedTree) != strings.TrimSpace(targetTree) } // EffectiveTarget returns target or origin/target if origin is ahead. 
diff --git a/pkg/git/integration_test.go b/pkg/git/integration_test.go index 620c076..2155d59 100644 --- a/pkg/git/integration_test.go +++ b/pkg/git/integration_test.go @@ -132,37 +132,51 @@ func TestIsIntegrated_TreesMatch(t *testing.T) { } func TestIsIntegrated_MergeAddsNothing(t *testing.T) { - dir := testRepo(t) - - // Create a file on master - os.WriteFile(filepath.Join(dir, "file.txt"), []byte("content"), 0644) - gitCmd(t, dir, "add", "file.txt") - gitCmd(t, dir, "commit", "-m", "Add file") - - // Create feature branch and add a second file - gitCmd(t, dir, "checkout", "-b", "feature") - os.WriteFile(filepath.Join(dir, "feature.txt"), []byte("feature content"), 0644) - gitCmd(t, dir, "add", "feature.txt") - gitCmd(t, dir, "commit", "-m", "Add feature file") - - // Squash-merge feature into master (creates different history) - gitCmd(t, dir, "checkout", "master") - gitCmd(t, dir, "merge", "--squash", "feature") - gitCmd(t, dir, "commit", "-m", "Squash merge feature") - - // Add an extra file on master so trees don't match - // This prevents TreesMatch from triggering before MergeAddsNothing - os.WriteFile(filepath.Join(dir, "extra.txt"), []byte("extra"), 0644) - gitCmd(t, dir, "add", "extra.txt") - gitCmd(t, dir, "commit", "-m", "Add extra file on master") - - // Now: master has file.txt, feature.txt, extra.txt - // Feature has file.txt, feature.txt - // Trees don't match, but merging feature adds nothing - reason := IsIntegrated(dir, "feature", "master") - if reason != IntegratedMergeAddsNothing { - t.Errorf("expected IntegratedMergeAddsNothing, got %q", reason) + run := func(t *testing.T, force string) { + t.Setenv(mergeSimForceEnvVar, force) + + dir := testRepo(t) + + // Create a file on master + os.WriteFile(filepath.Join(dir, "file.txt"), []byte("content"), 0644) + gitCmd(t, dir, "add", "file.txt") + gitCmd(t, dir, "commit", "-m", "Add file") + + // Create feature branch and add a second file + gitCmd(t, dir, "checkout", "-b", "feature") + 
os.WriteFile(filepath.Join(dir, "feature.txt"), []byte("feature content"), 0644) + gitCmd(t, dir, "add", "feature.txt") + gitCmd(t, dir, "commit", "-m", "Add feature file") + + // Squash-merge feature into master (creates different history) + gitCmd(t, dir, "checkout", "master") + gitCmd(t, dir, "merge", "--squash", "feature") + gitCmd(t, dir, "commit", "-m", "Squash merge feature") + + // Add an extra file on master so trees don't match + // This prevents TreesMatch from triggering before MergeAddsNothing + os.WriteFile(filepath.Join(dir, "extra.txt"), []byte("extra"), 0644) + gitCmd(t, dir, "add", "extra.txt") + gitCmd(t, dir, "commit", "-m", "Add extra file on master") + + // Now: master has file.txt, feature.txt, extra.txt + // Feature has file.txt, feature.txt + // Trees don't match, but merging feature adds nothing + reason := IsIntegrated(dir, "feature", "master") + if reason != IntegratedMergeAddsNothing { + t.Errorf("expected IntegratedMergeAddsNothing, got %q", reason) + } } + + t.Run("merge-tree", func(t *testing.T) { + if !mergeTreeWriteTreeSupported() { + t.Skip("git merge-tree --write-tree not supported") + } + run(t, "merge-tree") + }) + t.Run("index", func(t *testing.T) { + run(t, "index") + }) } func TestIsIntegrated_NotIntegrated(t *testing.T) { diff --git a/pkg/git/localpush_test.go b/pkg/git/localpush_test.go new file mode 100644 index 0000000..9cdd57d --- /dev/null +++ b/pkg/git/localpush_test.go @@ -0,0 +1,107 @@ +package git + +import ( + "os" + "path/filepath" + "strings" + "testing" +) + +func TestLocalPush_AllowsFastForwardWithUncommittedNonOverlappingChanges(t *testing.T) { + repo := testRepo(t) + + // Create a base file on master in the main worktree. + if err := os.WriteFile(filepath.Join(repo, "base.txt"), []byte("base\n"), 0o644); err != nil { + t.Fatal(err) + } + gitCmd(t, repo, "add", "base.txt") + gitCmd(t, repo, "commit", "-m", "base") + + // Create a worktree for a feature branch and advance it. 
+ featureWT := filepath.Join(t.TempDir(), "feature-wt") + gitCmd(t, repo, "worktree", "add", "-b", "feature", featureWT) + if err := os.WriteFile(filepath.Join(featureWT, "feature.txt"), []byte("feature\n"), 0o644); err != nil { + t.Fatal(err) + } + gitCmd(t, featureWT, "add", "feature.txt") + gitCmd(t, featureWT, "commit", "-m", "feature") + + // Dirty the main worktree with a non-overlapping change. + if err := os.WriteFile(filepath.Join(repo, "dirty.txt"), []byte("dirty\n"), 0o644); err != nil { + t.Fatal(err) + } + gitCmd(t, repo, "add", "dirty.txt") + if err := os.WriteFile(filepath.Join(repo, "dirty.txt"), []byte("dirty-modified\n"), 0o644); err != nil { + t.Fatal(err) + } + + oldMaster := strings.TrimSpace(gitCmd(t, repo, "rev-parse", "master")) + head := strings.TrimSpace(gitCmd(t, featureWT, "rev-parse", "HEAD")) + + // LocalPush should fast-forward master, even though the main worktree is dirty. + if err := LocalPush(featureWT, "master"); err != nil { + t.Fatalf("LocalPush returned error: %v", err) + } + + newMaster := strings.TrimSpace(gitCmd(t, repo, "rev-parse", "master")) + if newMaster != head { + t.Fatalf("expected master to fast-forward to %s, got %s (old %s)", head, newMaster, oldMaster) + } + + // The uncommitted change remains. + st, err := Output(repo, "status", "--porcelain") + if err != nil { + t.Fatal(err) + } + if !strings.Contains(st, "dirty.txt") { + t.Fatalf("expected dirty.txt to remain dirty, status:\n%s", st) + } +} + +func TestLocalPush_FailsWhenUncommittedChangesWouldBeOverwritten(t *testing.T) { + repo := testRepo(t) + + // Create a base file on master in the main worktree. + if err := os.WriteFile(filepath.Join(repo, "file.txt"), []byte("base\n"), 0o644); err != nil { + t.Fatal(err) + } + gitCmd(t, repo, "add", "file.txt") + gitCmd(t, repo, "commit", "-m", "base") + + // Create a worktree for a feature branch and modify file.txt. 
+ featureWT := filepath.Join(t.TempDir(), "feature-wt") + gitCmd(t, repo, "worktree", "add", "-b", "feature", featureWT) + if err := os.WriteFile(filepath.Join(featureWT, "file.txt"), []byte("feature\n"), 0o644); err != nil { + t.Fatal(err) + } + gitCmd(t, featureWT, "add", "file.txt") + gitCmd(t, featureWT, "commit", "-m", "feature edit") + + oldMaster := strings.TrimSpace(gitCmd(t, repo, "rev-parse", "master")) + head := strings.TrimSpace(gitCmd(t, featureWT, "rev-parse", "HEAD")) + + // Dirty the main worktree in a way that would be overwritten by the fast-forward. + if err := os.WriteFile(filepath.Join(repo, "file.txt"), []byte("dirty local\n"), 0o644); err != nil { + t.Fatal(err) + } + + err := LocalPush(featureWT, "master") + if err == nil { + t.Fatalf("expected LocalPush to fail due to overlapping uncommitted changes") + } + + // Branch ref should not move. + newMaster := strings.TrimSpace(gitCmd(t, repo, "rev-parse", "master")) + if newMaster != oldMaster { + t.Fatalf("expected master to remain at %s, got %s (head %s)", oldMaster, newMaster, head) + } + + // Local change remains. + got, readErr := os.ReadFile(filepath.Join(repo, "file.txt")) + if readErr != nil { + t.Fatal(readErr) + } + if string(got) != "dirty local\n" { + t.Fatalf("expected dirty working tree content to remain, got %q", string(got)) + } +} diff --git a/pkg/git/merge_conflicts_test.go b/pkg/git/merge_conflicts_test.go index 37ba82e..7a694c9 100644 --- a/pkg/git/merge_conflicts_test.go +++ b/pkg/git/merge_conflicts_test.go @@ -2,69 +2,186 @@ package git import ( "os" + "os/exec" "path/filepath" "strings" "testing" ) func TestMergeConflictFiles_NoConflicts(t *testing.T) { - dir := testRepo(t) + run := func(t *testing.T, force string) { + t.Setenv(mergeSimForceEnvVar, force) - // Base file on master. 
- if err := os.WriteFile(filepath.Join(dir, "base.txt"), []byte("base\n"), 0o644); err != nil { - t.Fatal(err) - } - gitCmd(t, dir, "add", "base.txt") - gitCmd(t, dir, "commit", "-m", "add base") + dir := testRepo(t) - // Feature changes a different file. - gitCmd(t, dir, "checkout", "-b", "feature") - if err := os.WriteFile(filepath.Join(dir, "feature.txt"), []byte("feature\n"), 0o644); err != nil { - t.Fatal(err) - } - gitCmd(t, dir, "add", "feature.txt") - gitCmd(t, dir, "commit", "-m", "feature change") + // Base file on master. + if err := os.WriteFile(filepath.Join(dir, "base.txt"), []byte("base\n"), 0o644); err != nil { + t.Fatal(err) + } + gitCmd(t, dir, "add", "base.txt") + gitCmd(t, dir, "commit", "-m", "add base") - conflicts, err := MergeConflictFiles(dir, "master", "feature") - if err != nil { - t.Fatalf("MergeConflictFiles returned error: %v", err) - } - if len(conflicts) != 0 { - t.Fatalf("expected no conflicts, got %v", conflicts) + // Feature changes a different file. + gitCmd(t, dir, "checkout", "-b", "feature") + if err := os.WriteFile(filepath.Join(dir, "feature.txt"), []byte("feature\n"), 0o644); err != nil { + t.Fatal(err) + } + gitCmd(t, dir, "add", "feature.txt") + gitCmd(t, dir, "commit", "-m", "feature change") + + conflicts, err := MergeConflictFiles(dir, "master", "feature") + if err != nil { + t.Fatalf("MergeConflictFiles returned error: %v", err) + } + if len(conflicts) != 0 { + t.Fatalf("expected no conflicts, got %v", conflicts) + } } + + t.Run("merge-tree", func(t *testing.T) { + if !mergeTreeWriteTreeSupported() { + t.Skip("git merge-tree --write-tree not supported") + } + run(t, "merge-tree") + }) + t.Run("index", func(t *testing.T) { + run(t, "index") + }) } func TestMergeConflictFiles_WithConflicts(t *testing.T) { + run := func(t *testing.T, force string) { + t.Setenv(mergeSimForceEnvVar, force) + + dir := testRepo(t) + + // Create a file on master. 
+ if err := os.WriteFile(filepath.Join(dir, "file.txt"), []byte("base\n"), 0o644); err != nil { + t.Fatal(err) + } + gitCmd(t, dir, "add", "file.txt") + gitCmd(t, dir, "commit", "-m", "base file") + + // Feature edits file.txt. + gitCmd(t, dir, "checkout", "-b", "feature") + if err := os.WriteFile(filepath.Join(dir, "file.txt"), []byte("feature\n"), 0o644); err != nil { + t.Fatal(err) + } + gitCmd(t, dir, "add", "file.txt") + gitCmd(t, dir, "commit", "-m", "feature edit") + + // Master edits file.txt differently. + gitCmd(t, dir, "checkout", "master") + if err := os.WriteFile(filepath.Join(dir, "file.txt"), []byte("master\n"), 0o644); err != nil { + t.Fatal(err) + } + gitCmd(t, dir, "add", "file.txt") + gitCmd(t, dir, "commit", "-m", "master edit") + + conflicts, err := MergeConflictFiles(dir, "master", "feature") + if err != nil { + t.Fatalf("MergeConflictFiles returned error: %v", err) + } + if strings.Join(conflicts, ",") != "file.txt" { + t.Fatalf("expected [file.txt], got %v", conflicts) + } + } + + t.Run("merge-tree", func(t *testing.T) { + if !mergeTreeWriteTreeSupported() { + t.Skip("git merge-tree --write-tree not supported") + } + run(t, "merge-tree") + }) + t.Run("index", func(t *testing.T) { + run(t, "index") + }) +} + +func TestMergeConflictFiles_NoFalseConflictsOnDeletedFile_WithMultipleMergeBases(t *testing.T) { + if !mergeTreeWriteTreeSupported() { + t.Skip("git merge-tree --write-tree not supported") + } + t.Setenv(mergeSimForceEnvVar, "merge-tree") + dir := testRepo(t) - // Create a file on master. - if err := os.WriteFile(filepath.Join(dir, "file.txt"), []byte("base\n"), 0o644); err != nil { + // Base file on master. + if err := os.WriteFile(filepath.Join(dir, "a.txt"), []byte("0\n"), 0o644); err != nil { t.Fatal(err) } - gitCmd(t, dir, "add", "file.txt") - gitCmd(t, dir, "commit", "-m", "base file") + gitCmd(t, dir, "add", "a.txt") + gitCmd(t, dir, "commit", "-m", "add a") - // Feature edits file.txt. 
- gitCmd(t, dir, "checkout", "-b", "feature") - if err := os.WriteFile(filepath.Join(dir, "file.txt"), []byte("feature\n"), 0o644); err != nil { + // Branch A: modify a.txt. + gitCmd(t, dir, "checkout", "-b", "A") + if err := os.WriteFile(filepath.Join(dir, "a.txt"), []byte("A1\n"), 0o644); err != nil { t.Fatal(err) } - gitCmd(t, dir, "add", "file.txt") - gitCmd(t, dir, "commit", "-m", "feature edit") + gitCmd(t, dir, "add", "a.txt") + gitCmd(t, dir, "commit", "-m", "A1") + a1 := strings.TrimSpace(gitCmd(t, dir, "rev-parse", "HEAD")) - // Master edits file.txt differently. + // Branch B: touch a different file. gitCmd(t, dir, "checkout", "master") - if err := os.WriteFile(filepath.Join(dir, "file.txt"), []byte("master\n"), 0o644); err != nil { + gitCmd(t, dir, "checkout", "-b", "B") + if err := os.WriteFile(filepath.Join(dir, "b.txt"), []byte("B1\n"), 0o644); err != nil { t.Fatal(err) } - gitCmd(t, dir, "add", "file.txt") - gitCmd(t, dir, "commit", "-m", "master edit") + gitCmd(t, dir, "add", "b.txt") + gitCmd(t, dir, "commit", "-m", "B1") + b1 := strings.TrimSpace(gitCmd(t, dir, "rev-parse", "HEAD")) + + // Create a criss-cross merge with multiple merge bases: + // - A merges B (B1 is now a merge base) + // - B merges A1 by commit hash (A1 is now a merge base) + gitCmd(t, dir, "checkout", "A") + gitCmd(t, dir, "merge", "--no-ff", "B", "-m", "merge B into A") + gitCmd(t, dir, "checkout", "B") + gitCmd(t, dir, "merge", "--no-ff", a1, "-m", "merge A1 into B") + + // A deletes a.txt (simulates "worker deleted file"). 
+ gitCmd(t, dir, "checkout", "A") + gitCmd(t, dir, "rm", "a.txt") + gitCmd(t, dir, "commit", "-m", "A delete a") + + all := strings.Fields(strings.ReplaceAll(strings.TrimSpace(gitCmd(t, dir, "merge-base", "--all", "A", "B")), "\n", " ")) + if len(all) < 2 { + t.Fatalf("expected multiple merge bases, got %v", all) + } - conflicts, err := MergeConflictFiles(dir, "master", "feature") + // `git merge-tree --write-tree A B` uses recursive merge-base selection and is clean. + { + cmd := exec.Command("git", "merge-tree", "--write-tree", "--name-only", "A", "B") + cmd.Dir = dir + if out, err := cmd.CombinedOutput(); err != nil { + t.Fatalf("git merge-tree --write-tree A B unexpectedly conflicted: %v\n%s", err, out) + } + } + + // But forcing a single merge-base (the one returned by `git merge-base A B`) can create a false conflict. + mb := strings.TrimSpace(gitCmd(t, dir, "merge-base", "A", "B")) + if mb != b1 { + t.Fatalf("expected git merge-base A B to pick B1=%s, got %s", b1, mb) + } + { + cmd := exec.Command("git", "merge-tree", "--write-tree", "--name-only", "--merge-base", mb, "A", "B") + cmd.Dir = dir + out, err := cmd.CombinedOutput() + if err == nil { + t.Fatalf("expected forced merge-base merge-tree to conflict, but it succeeded:\n%s", out) + } + if !strings.Contains(string(out), "a.txt") { + t.Fatalf("expected forced merge-base merge-tree to mention a.txt, got:\n%s", out) + } + } + + // Subtask should follow git's default merge-base selection and report no conflicts. 
+ conflicts, err := MergeConflictFiles(dir, "A", "B") if err != nil { t.Fatalf("MergeConflictFiles returned error: %v", err) } - if strings.Join(conflicts, ",") != "file.txt" { - t.Fatalf("expected [file.txt], got %v", conflicts) + if len(conflicts) != 0 { + t.Fatalf("expected no conflicts, got %v", conflicts) } } diff --git a/pkg/git/mergesim.go b/pkg/git/mergesim.go new file mode 100644 index 0000000..28ec614 --- /dev/null +++ b/pkg/git/mergesim.go @@ -0,0 +1,306 @@ +package git + +import ( + "bytes" + "fmt" + "os" + "os/exec" + "path/filepath" + "sort" + "strings" + "sync" + "time" + + "github.com/zippoxer/subtask/pkg/logging" +) + +type mergeSimMethod string + +const ( + mergeSimMethodMergeTree mergeSimMethod = "merge-tree" + mergeSimMethodIndex mergeSimMethod = "index" +) + +const ( + mergeSimForceEnvVar = "SUBTASK_MERGE_SIM_FORCE" // "auto" (default), "merge-tree", or "index" +) + +type mergeSimResult struct { + Method mergeSimMethod + MergeBase string + MergedTree string + ConflictFiles []string +} + +func simulateMerge(dir, targetRef, headRef string) (mergeSimResult, error) { + targetRef = strings.TrimSpace(targetRef) + headRef = strings.TrimSpace(headRef) + if targetRef == "" || headRef == "" { + return mergeSimResult{}, fmt.Errorf("targetRef and headRef are required") + } + + method, err := selectMergeSimMethod() + if err != nil { + return mergeSimResult{}, err + } + + switch method { + case mergeSimMethodMergeTree: + return simulateMergeMergeTree(dir, targetRef, headRef) + case mergeSimMethodIndex: + mb, err := MergeBase(dir, targetRef, headRef) + if err != nil { + return mergeSimResult{}, err + } + mb = strings.TrimSpace(mb) + if mb == "" { + return mergeSimResult{}, fmt.Errorf("failed to resolve merge-base between %s and %s", targetRef, headRef) + } + return simulateMergeTempIndex(dir, mb, targetRef, headRef) + default: + return mergeSimResult{}, fmt.Errorf("unknown merge simulation method %q", method) + } +} + +func selectMergeSimMethod() 
(mergeSimMethod, error) { + force := strings.ToLower(strings.TrimSpace(os.Getenv(mergeSimForceEnvVar))) + switch force { + case "", "auto": + // auto + case "merge-tree", "mergetree": + if !mergeTreeWriteTreeSupported() { + return "", fmt.Errorf("merge-tree simulation forced but git does not support merge-tree --write-tree") + } + return mergeSimMethodMergeTree, nil + case "index", "temp-index", "tempindex": + return mergeSimMethodIndex, nil + default: + // Unknown values fall back to auto (avoid breaking existing users). + } + + if mergeTreeWriteTreeSupported() { + return mergeSimMethodMergeTree, nil + } + return mergeSimMethodIndex, nil +} + +var ( + mergeTreeWriteTreeOnce sync.Once + mergeTreeWriteTreeOK bool +) + +func mergeTreeWriteTreeSupported() bool { + mergeTreeWriteTreeOnce.Do(func() { + cmd := exec.Command("git", "merge-tree", "-h") + out, _ := cmd.CombinedOutput() // exit status is non-zero for -h + s := string(out) + + // --write-tree is the key feature (introduced in git 2.38). + // We also require --name-only since we rely on it for conflict file extraction. 
+ mergeTreeWriteTreeOK = strings.Contains(s, "--write-tree") && strings.Contains(s, "--name-only") + }) + return mergeTreeWriteTreeOK +} + +func simulateMergeMergeTree(dir, targetRef, headRef string) (mergeSimResult, error) { + cmd := exec.Command("git", "merge-tree", "--write-tree", "--name-only", targetRef, headRef) + cmd.Dir = dir + start := time.Now() + out, runErr := cmd.CombinedOutput() + logGitCommandTiming(cmd.Args[1:], time.Since(start)) + s := string(out) + + firstLine := "" + if i := strings.IndexByte(s, '\n'); i >= 0 { + firstLine = strings.TrimSpace(s[:i]) + } else { + firstLine = strings.TrimSpace(s) + } + + if runErr == nil { + if firstLine == "" { + return mergeSimResult{}, fmt.Errorf("git merge-tree returned empty output") + } + return mergeSimResult{ + Method: mergeSimMethodMergeTree, + MergedTree: firstLine, + }, nil + } + + files := mergeTreeNameOnlyConflictFiles(s) + if len(files) == 0 && strings.Contains(s, "CONFLICT") { + files = extractMergeConflictFiles(s) + } + if len(files) == 0 { + if logging.DebugEnabled() { + exit := -1 + if cmd.ProcessState != nil { + exit = cmd.ProcessState.ExitCode() + } + logging.Debug("git", fmt.Sprintf("merge-tree failed exit=%d output=%q", exit, strings.TrimSpace(s))) + } + return mergeSimResult{}, fmt.Errorf("git merge-tree failed: %w", runErr) + } + if logging.DebugEnabled() { + exit := -1 + if cmd.ProcessState != nil { + exit = cmd.ProcessState.ExitCode() + } + logging.Debug("git", fmt.Sprintf("merge-tree conflicts exit=%d files=%s", exit, strings.Join(files, ","))) + } + return mergeSimResult{ + Method: mergeSimMethodMergeTree, + MergedTree: firstLine, + ConflictFiles: files, + }, nil +} + +const mergeSimTmpDirPrefix = "subtask-mergesim-" + +var ( + mergeSimCleanupOnce sync.Once +) + +func simulateMergeTempIndex(dir, mb, targetRef, headRef string) (mergeSimResult, error) { + mergeSimCleanupOnce.Do(cleanupStaleMergeSimTmpDirs) + + tmpDir, err := os.MkdirTemp("", mergeSimTmpDirPrefix) + if err != nil { + 
return mergeSimResult{}, err + } + defer func() { + _ = os.RemoveAll(tmpDir) + }() + + // Important: do not pre-create the index file. If it exists but is empty, + // git will error with "index file smaller than expected". + idx := filepath.Join(tmpDir, "index") + + env := append(os.Environ(), "GIT_INDEX_FILE="+idx) + + if _, err := gitCombinedOutputWithEnv(dir, env, "read-tree", "-m", "-i", mb, targetRef, headRef); err != nil { + return mergeSimResult{}, err + } + + ls, err := gitCombinedOutputWithEnv(dir, env, "ls-files", "-u") + if err != nil { + return mergeSimResult{}, err + } + conflicts := parseUnmergedFiles(ls) + if len(conflicts) > 0 { + if logging.DebugEnabled() { + logging.Debug("git", fmt.Sprintf("merge-sim index conflicts files=%s", strings.Join(conflicts, ","))) + } + return mergeSimResult{ + Method: mergeSimMethodIndex, + MergeBase: mb, + ConflictFiles: conflicts, + }, nil + } + + tree, err := gitOutputWithEnv(dir, env, "write-tree") + if err != nil { + return mergeSimResult{}, err + } + tree = strings.TrimSpace(tree) + if tree == "" { + return mergeSimResult{}, fmt.Errorf("git write-tree returned empty output") + } + + return mergeSimResult{ + Method: mergeSimMethodIndex, + MergeBase: mb, + MergedTree: tree, + }, nil +} + +func gitOutputWithEnv(dir string, env []string, args ...string) (string, error) { + cmd := exec.Command("git", args...) + cmd.Dir = dir + if env != nil { + cmd.Env = env + } + var stdout bytes.Buffer + var stderr bytes.Buffer + cmd.Stdout = &stdout + cmd.Stderr = &stderr + if err := cmd.Run(); err != nil { + return "", &Error{Dir: dir, Args: args, Stderr: stderr.String(), Cause: err} + } + return strings.TrimSpace(stdout.String()), nil +} + +func gitCombinedOutputWithEnv(dir string, env []string, args ...string) (string, error) { + cmd := exec.Command("git", args...) 
+ cmd.Dir = dir + if env != nil { + cmd.Env = env + } + out, err := cmd.CombinedOutput() + if err != nil { + return "", &Error{Dir: dir, Args: args, Stderr: string(out), Cause: err} + } + return string(out), nil +} + +func parseUnmergedFiles(output string) []string { + if strings.TrimSpace(output) == "" { + return nil + } + seen := make(map[string]struct{}) + for _, line := range strings.Split(output, "\n") { + line = strings.TrimSpace(line) + if line == "" { + continue + } + // Format: "<mode> <object> <stage>\t<path>" + if _, file, ok := strings.Cut(line, "\t"); ok { + file = strings.TrimSpace(file) + if file != "" { + seen[file] = struct{}{} + } + } + } + if len(seen) == 0 { + return nil + } + files := make([]string, 0, len(seen)) + for f := range seen { + files = append(files, f) + } + sort.Strings(files) + return files +} + +func cleanupStaleMergeSimTmpDirs() { + tmp := os.TempDir() + entries, err := os.ReadDir(tmp) + if err != nil { + return + } + + // Best-effort: remove stale temp dirs from crashed processes. + // Keep the TTL comfortably above any single merge simulation duration to + // avoid deleting dirs that are currently in use by other processes. + const ttl = 10 * time.Minute + cutoff := time.Now().Add(-ttl) + + for _, e := range entries { + if !e.IsDir() { + continue + } + name := e.Name() + if !strings.HasPrefix(name, mergeSimTmpDirPrefix) { + continue + } + path := filepath.Join(tmp, name) + info, err := e.Info() + if err != nil { + continue + } + if info.ModTime().After(cutoff) { + continue + } + _ = os.RemoveAll(path) + } +} diff --git a/pkg/git/mergesim_test.go b/pkg/git/mergesim_test.go new file mode 100644 index 0000000..9076249 --- /dev/null +++ b/pkg/git/mergesim_test.go @@ -0,0 +1,74 @@ +package git + +import ( + "os" + "path/filepath" + "sync" + "testing" + "time" +) + +func TestCleanupStaleMergeSimTmpDirs_RemovesOldDirs(t *testing.T) { + // Create a fake stale directory in os.TempDir().
+ tmp := os.TempDir() + stale, err := os.MkdirTemp(tmp, mergeSimTmpDirPrefix) + if err != nil { + t.Fatal(err) + } + + // Make it look old enough to be cleaned. + old := time.Now().Add(-1 * time.Hour) + if err := os.Chtimes(stale, old, old); err != nil { + t.Fatal(err) + } + + cleanupStaleMergeSimTmpDirs() + + if _, err := os.Stat(stale); err == nil { + t.Fatalf("expected stale mergesim dir to be removed: %s", stale) + } +} + +func TestSimulateMerge_Concurrent_Index(t *testing.T) { + t.Setenv(mergeSimForceEnvVar, "index") + + dir := testRepo(t) + + // Create a simple non-conflicting change on feature. + gitCmd(t, dir, "checkout", "-b", "feature") + if err := os.WriteFile(filepath.Join(dir, "feature.txt"), []byte("x\n"), 0o644); err != nil { + t.Fatal(err) + } + gitCmd(t, dir, "add", "feature.txt") + gitCmd(t, dir, "commit", "-m", "feature") + + // Create a different change on master. + gitCmd(t, dir, "checkout", "master") + if err := os.WriteFile(filepath.Join(dir, "master.txt"), []byte("y\n"), 0o644); err != nil { + t.Fatal(err) + } + gitCmd(t, dir, "add", "master.txt") + gitCmd(t, dir, "commit", "-m", "master") + + const n = 8 + var wg sync.WaitGroup + errs := make(chan error, n) + + for i := 0; i < n; i++ { + wg.Add(1) + go func() { + defer wg.Done() + _, err := simulateMerge(dir, "master", "feature") + errs <- err + }() + } + + wg.Wait() + close(errs) + + for err := range errs { + if err != nil { + t.Fatalf("simulateMerge returned error: %v", err) + } + } +} diff --git a/pkg/harness/codex.go b/pkg/harness/codex.go index bb7b3ce..31ab750 100644 --- a/pkg/harness/codex.go +++ b/pkg/harness/codex.go @@ -80,10 +80,18 @@ func processCodexJSONLLine(line []byte, result *Result, cb Callbacks) { case "error": result.Error = event.Message + case "turn.completed": + // Codex may emit transient "error" events (e.g. brief network failures) + // even when the overall turn succeeds. If the turn completed, treat any + // prior stream error as recovered. 
+ result.Error = "" + result.TurnFailed = false + case "turn.failed": if event.Error != nil { result.Error = event.Error.Message } + result.TurnFailed = true } } @@ -260,9 +268,13 @@ func (c *CodexHarness) runCodexCommand(ctx context.Context, cwd string, flags, p } } - // If we got an error event, return it even if exit code was 0 - if result.Error != "" { - return result, fmt.Errorf("codex error: %s", result.Error) + successReply := strings.TrimSpace(result.Reply) != "" + + // Codex can emit transient "error" events during a successful run (e.g. it retries + // internally). If we have a successful exit code and a valid final reply, treat any + // remaining stream error as recovered. + if result.Error != "" && !result.TurnFailed && cmdErr == nil && successReply { + result.Error = "" } // If command failed and we don't have a specific error, use generic message @@ -274,10 +286,15 @@ func (c *CodexHarness) runCodexCommand(ctx context.Context, cwd string, flags, p return result, fmt.Errorf("codex failed: %w", cmdErr) } + // If we got an error event and we don't have a success signal, return it. + if result.Error != "" { + return result, fmt.Errorf("codex error: %s", result.Error) + } + // Defensive: avoid treating "success with empty reply" as a successful run. // When this happens, it usually indicates a CLI/harness mismatch (e.g., output file not // written, JSON stream parsing interrupted, etc.). - if strings.TrimSpace(result.Reply) == "" { + if !successReply { var parts []string parts = append(parts, "codex produced empty reply") if tmpPath != "" { @@ -300,10 +317,9 @@ func (c *CodexHarness) runCodexCommand(ctx context.Context, cwd string, flags, p return result, nil } -// Review runs codex exec review using the shared command infrastructure. 
-func (c *CodexHarness) Review(cwd string, target ReviewTarget, instructions string) (string, error) { +func (c *CodexHarness) buildReviewCommandArgs(cwd string, target ReviewTarget, instructions string) (flags []string, positionals []string) { // exec-level flags come before the "review" subcommand - flags := []string{"exec", "--json", "--dangerously-bypass-approvals-and-sandbox"} + flags = []string{"exec", "--json", "--dangerously-bypass-approvals-and-sandbox"} if c.Model != "" { flags = append(flags, "-m", c.Model) @@ -312,25 +328,21 @@ func (c *CodexHarness) Review(cwd string, target ReviewTarget, instructions stri flags = append(flags, "-c", "model_reasoning_effort="+c.Reasoning) } - // "review" subcommand and its flags/positionals + // "review" subcommand. flags = append(flags, "review") - switch { - case target.Uncommitted: - flags = append(flags, "--uncommitted") - case target.BaseBranch != "": - flags = append(flags, "--base", target.BaseBranch) - case target.Commit != "": - flags = append(flags, "--commit", target.Commit) - default: - flags = append(flags, "--uncommitted") - } + // Codex's CLI rejects combining mode flags (e.g. --uncommitted/--base/--commit) + // with a positional PROMPT. To support optional user instructions, Subtask + // always builds the full prompt and passes only the positional argument. + prompt := buildReviewPrompt(cwd, target, instructions) + positionals = []string{prompt} - // Instructions are the positional prompt for review - var positionals []string - if instructions != "" { - positionals = []string{instructions} - } + return flags, positionals +} + +// Review runs codex exec review using the shared command infrastructure. 
+func (c *CodexHarness) Review(cwd string, target ReviewTarget, instructions string) (string, error) { + flags, positionals := c.buildReviewCommandArgs(cwd, target, instructions) result, err := c.runCodexCommand(context.Background(), cwd, flags, positionals, Callbacks{}, false) if err != nil { diff --git a/pkg/harness/codex_review_args_test.go b/pkg/harness/codex_review_args_test.go new file mode 100644 index 0000000..d0ec41b --- /dev/null +++ b/pkg/harness/codex_review_args_test.go @@ -0,0 +1,33 @@ +package harness + +import "testing" + +func TestCodexHarness_buildReviewCommandArgs_DoesNotPassModeFlags(t *testing.T) { + c := &CodexHarness{Model: "gpt-test", Reasoning: "high"} + + cases := []ReviewTarget{ + {Uncommitted: true}, + {BaseBranch: "dev"}, + {Commit: "abc123"}, + {TaskName: "fix/bug", BaseBranch: "dev"}, + } + + for _, target := range cases { + flags, positionals := c.buildReviewCommandArgs("/tmp", target, "Focus") + + for _, forbidden := range []string{"--uncommitted", "--base", "--commit"} { + for _, f := range flags { + if f == forbidden { + t.Fatalf("flags unexpectedly contain %q: %v", forbidden, flags) + } + } + } + + if len(positionals) != 1 { + t.Fatalf("expected exactly 1 positional prompt, got %d: %v", len(positionals), positionals) + } + if positionals[0] == "" { + t.Fatalf("expected non-empty prompt") + } + } +} diff --git a/pkg/harness/codex_run_test.go b/pkg/harness/codex_run_test.go index 5b26a1a..3197392 100644 --- a/pkg/harness/codex_run_test.go +++ b/pkg/harness/codex_run_test.go @@ -58,3 +58,81 @@ exit 0 require.Equal(t, "hello", res.Reply) }) } + +func TestCodexHarnessRun_TransientErrorEventDoesNotFailSuccessfulRun(t *testing.T) { + if runtime.GOOS == "windows" { + t.Skip("shell script helper is unix-only") + } + + tmp := t.TempDir() + fakeCodex := filepath.Join(tmp, "codex-fake") + require.NoError(t, os.WriteFile(fakeCodex, []byte(`#!/bin/sh +out="" +while [ $# -gt 0 ]; do + case "$1" in + -o|--output-last-message) + out="$2" + shift 
2 + ;; + *) + shift + ;; + esac +done + +printf '{"type":"thread.started","thread_id":"sess-1"}\n' +printf '{"type":"error","message":"transient network error"}\n' +printf '{"type":"turn.completed"}\n' + +if [ -n "$out" ]; then + printf 'ok' > "$out" +fi +exit 0 +`), 0o700)) + + h := &CodexHarness{cli: cliSpec{Exec: fakeCodex}} + res, err := h.Run(context.Background(), tmp, "prompt", "", Callbacks{}) + require.NoError(t, err) + require.NotNil(t, res) + require.Equal(t, "ok", res.Reply) + require.Empty(t, res.Error) + require.False(t, res.TurnFailed) +} + +func TestCodexHarnessRun_TurnFailedStillFailsEvenWithReply(t *testing.T) { + if runtime.GOOS == "windows" { + t.Skip("shell script helper is unix-only") + } + + tmp := t.TempDir() + fakeCodex := filepath.Join(tmp, "codex-fake") + require.NoError(t, os.WriteFile(fakeCodex, []byte(`#!/bin/sh +out="" +while [ $# -gt 0 ]; do + case "$1" in + -o|--output-last-message) + out="$2" + shift 2 + ;; + *) + shift + ;; + esac +done + +printf '{"type":"thread.started","thread_id":"sess-1"}\n' +printf '{"type":"turn.failed","error":{"message":"hard failure"}}\n' + +if [ -n "$out" ]; then + printf 'partial output' > "$out" +fi +exit 0 +`), 0o700)) + + h := &CodexHarness{cli: cliSpec{Exec: fakeCodex}} + res, err := h.Run(context.Background(), tmp, "prompt", "", Callbacks{}) + require.Error(t, err) + require.NotNil(t, res) + require.Contains(t, err.Error(), "hard failure") + require.True(t, res.TurnFailed) +} diff --git a/pkg/harness/harness.go b/pkg/harness/harness.go index c251a8c..1fde837 100644 --- a/pkg/harness/harness.go +++ b/pkg/harness/harness.go @@ -17,6 +17,7 @@ type Result struct { PromptDelivered bool // True if session started (thread.started seen) AgentReplied bool // True if agent sent a message Error string // Non-empty if execution failed + TurnFailed bool // True if a turn.failed event was observed (Codex) } // Callbacks for harness events. 
@@ -27,10 +28,14 @@ type Callbacks struct { // ReviewTarget specifies what to review. type ReviewTarget struct { - // Exactly one of these should be set: + // Exactly one review *mode* should be set: Uncommitted bool // Review staged, unstaged, and untracked changes BaseBranch string // Review changes against this base branch Commit string // Review changes introduced by this commit SHA + + // Optional metadata for prompt construction. + // When set, the review prompt should mention the task name (Subtask-only mode). + TaskName string } // Harness is the interface for worker backends. @@ -168,9 +173,8 @@ func getStringOpt(opts map[string]any, key string) string { return "" } -// buildReviewPrompt constructs a review prompt for harnesses that don't have -// built-in review target support (Claude, OpenCode). -// Mirrors the prompt format from Codex's review_prompts.rs. +// buildReviewPrompt constructs a review prompt for code review. +// For uncommitted/base/commit it mirrors Codex's codex-rs/core/src/review_prompts.rs strings. func buildReviewPrompt(cwd string, target ReviewTarget, instructions string) string { var parts []string @@ -178,33 +182,34 @@ func buildReviewPrompt(cwd string, target ReviewTarget, instructions string) str case target.Uncommitted: parts = append(parts, "Review the current code changes (staged, unstaged, and untracked files) and provide prioritized findings.") + case target.TaskName != "" && target.BaseBranch != "": + mergeBase, err := git.MergeBase(cwd, "HEAD", target.BaseBranch) + if err == nil && mergeBase != "" { + parts = append(parts, fmt.Sprintf( + "Review the code changes for subtask task '%s' against the base branch '%s'. The merge base commit for this comparison is %s. Run `git diff %s` to inspect the changes relative to %s. 
Provide prioritized, actionable findings.", + target.TaskName, target.BaseBranch, mergeBase, mergeBase, target.BaseBranch)) + } else { + parts = append(parts, fmt.Sprintf( + "Review the code changes for subtask task '%s' against the base branch '%s'. Start by finding the merge diff between the current branch and %s's upstream e.g. (`git merge-base HEAD \"$(git rev-parse --abbrev-ref \"%s@{upstream}\")\"`), then run `git diff` against that SHA to see what changes we would merge into the %s branch. Provide prioritized, actionable findings.", + target.TaskName, target.BaseBranch, target.BaseBranch, target.BaseBranch, target.BaseBranch)) + } + case target.BaseBranch != "": - // Try to get merge base for more accurate diff mergeBase, err := git.MergeBase(cwd, "HEAD", target.BaseBranch) if err == nil && mergeBase != "" { parts = append(parts, fmt.Sprintf( - "Review the code changes against the base branch '%s'. "+ - "The merge base commit for this comparison is %s. "+ - "Run `git diff %s` to inspect the changes relative to %s. "+ - "Provide prioritized, actionable findings.", + "Review the code changes against the base branch '%s'. The merge base commit for this comparison is %s. Run `git diff %s` to inspect the changes relative to %s. Provide prioritized, actionable findings.", target.BaseBranch, mergeBase, mergeBase, target.BaseBranch)) } else { - // Fallback: let the reviewer figure out the merge base parts = append(parts, fmt.Sprintf( - "Review the code changes against the base branch '%s'. "+ - "Start by finding the merge diff between the current branch and %s "+ - "(e.g., `git merge-base HEAD %s`), then run `git diff` against that SHA "+ - "to see what changes we would merge into the %s branch. "+ - "Provide prioritized, actionable findings.", + "Review the code changes against the base branch '%s'. Start by finding the merge diff between the current branch and %s's upstream e.g. 
(`git merge-base HEAD \"$(git rev-parse --abbrev-ref \"%s@{upstream}\")\"`), then run `git diff` against that SHA to see what changes we would merge into the %s branch. Provide prioritized, actionable findings.", target.BaseBranch, target.BaseBranch, target.BaseBranch, target.BaseBranch)) } case target.Commit != "": parts = append(parts, fmt.Sprintf( - "Review the code changes introduced by commit %s. "+ - "Run `git show %s` to inspect the changes. "+ - "Provide prioritized, actionable findings.", - target.Commit, target.Commit)) + "Review the code changes introduced by commit %s. Provide prioritized, actionable findings.", + target.Commit)) default: // Fallback to uncommitted diff --git a/pkg/harness/prompt.go b/pkg/harness/prompt.go index 144ba3f..de22cc2 100644 --- a/pkg/harness/prompt.go +++ b/pkg/harness/prompt.go @@ -11,7 +11,6 @@ import ( ) type RepoStatus struct { - CommitsBehind int ConflictFiles []string } @@ -21,14 +20,6 @@ func FormatRepoStatusWarning(baseBranch string, status *RepoStatus) string { } var lines []string - if status.CommitsBehind > 0 { - commitWord := "commits" - if status.CommitsBehind == 1 { - commitWord = "commit" - } - lines = append(lines, fmt.Sprintf("Note: %s is %d %s ahead of this task.", baseBranch, status.CommitsBehind, commitWord)) - } - if len(status.ConflictFiles) > 0 { lines = append(lines, fmt.Sprintf( "Note: This branch conflicts with %s in: %s. 
Consider running `git merge %s` to resolve.", diff --git a/pkg/harness/repo_status_test.go b/pkg/harness/repo_status_test.go index 22822e8..2e544c6 100644 --- a/pkg/harness/repo_status_test.go +++ b/pkg/harness/repo_status_test.go @@ -10,15 +10,13 @@ func TestFormatRepoStatusWarning(t *testing.T) { require.Equal(t, "", FormatRepoStatusWarning("main", nil)) require.Equal(t, - "Note: main is 3 commits ahead of this task.", - FormatRepoStatusWarning("main", &RepoStatus{CommitsBehind: 3}), + "", + FormatRepoStatusWarning("main", &RepoStatus{}), ) require.Equal(t, - "Note: main is 1 commit ahead of this task.\n"+ - "Note: This branch conflicts with main in: a.txt, b.txt. Consider running `git merge main` to resolve.", + "Note: This branch conflicts with main in: a.txt, b.txt. Consider running `git merge main` to resolve.", FormatRepoStatusWarning("main", &RepoStatus{ - CommitsBehind: 1, ConflictFiles: []string{"a.txt", "b.txt"}, }), ) diff --git a/pkg/harness/review_prompt_test.go b/pkg/harness/review_prompt_test.go index 7888ecb..10c6357 100644 --- a/pkg/harness/review_prompt_test.go +++ b/pkg/harness/review_prompt_test.go @@ -1,29 +1,77 @@ package harness import ( + "os/exec" "strings" "testing" "github.com/stretchr/testify/assert" ) +func gitOut(t *testing.T, dir string, args ...string) string { + t.Helper() + cmd := exec.Command("git", args...) + cmd.Dir = dir + out, err := cmd.CombinedOutput() + if err != nil { + t.Fatalf("git %v failed: %v\n%s", args, err, out) + } + return strings.TrimSpace(string(out)) +} + +func setupRepoWithDevBase(t *testing.T) (dir string, baseSHA string) { + t.Helper() + + t.Setenv("GIT_AUTHOR_DATE", "2026-01-01T00:00:00Z") + t.Setenv("GIT_COMMITTER_DATE", "2026-01-01T00:00:00Z") + + dir = t.TempDir() + + run := func(args ...string) { + t.Helper() + cmd := exec.Command("git", args...) 
+ cmd.Dir = dir + if out, err := cmd.CombinedOutput(); err != nil { + t.Fatalf("setup git %v failed: %v\n%s", args, err, out) + } + } + + // Some environments configure init.defaultBranch=main; force master to keep this stable. + { + cmd := exec.Command("git", "init", "-b", "master") + cmd.Dir = dir + if out, err := cmd.CombinedOutput(); err != nil { + _ = out + run("init") + } + } + run("config", "user.email", "test@test.com") + run("config", "user.name", "Test") + run("commit", "--allow-empty", "-m", "Initial commit") + + baseSHA = gitOut(t, dir, "rev-parse", "HEAD") + run("branch", "dev", baseSHA) + + // Add a commit on master so merge-base(master, dev) is deterministic (= baseSHA). + run("commit", "--allow-empty", "-m", "Work on master") + + return dir, baseSHA +} + func TestBuildReviewPrompt_Uncommitted(t *testing.T) { prompt := buildReviewPrompt("", ReviewTarget{Uncommitted: true}, "") - assert.Contains(t, prompt, "Review the current code changes") - assert.Contains(t, prompt, "staged, unstaged, and untracked") + assert.Equal(t, "Review the current code changes (staged, unstaged, and untracked files) and provide prioritized findings.", prompt) } func TestBuildReviewPrompt_UncommittedWithInstructions(t *testing.T) { prompt := buildReviewPrompt("", ReviewTarget{Uncommitted: true}, "Focus on security vulnerabilities") - assert.Contains(t, prompt, "Review the current code changes") - assert.Contains(t, prompt, "Focus on security vulnerabilities") + assert.Equal(t, "Review the current code changes (staged, unstaged, and untracked files) and provide prioritized findings.\n\nFocus on security vulnerabilities", prompt) } func TestBuildReviewPrompt_BaseBranch_NoGitRepo(t *testing.T) { // When git.MergeBase fails (e.g., no git repo), we get the fallback prompt prompt := buildReviewPrompt("/nonexistent/path", ReviewTarget{BaseBranch: "main"}, "") - assert.Contains(t, prompt, "Review the code changes against the base branch 'main'") - assert.Contains(t, prompt, "git 
merge-base") + assert.Equal(t, "Review the code changes against the base branch 'main'. Start by finding the merge diff between the current branch and main's upstream e.g. (`git merge-base HEAD \"$(git rev-parse --abbrev-ref \"main@{upstream}\")\"`), then run `git diff` against that SHA to see what changes we would merge into the main branch. Provide prioritized, actionable findings.", prompt) } func TestBuildReviewPrompt_BaseBranchWithInstructions(t *testing.T) { @@ -32,25 +80,40 @@ func TestBuildReviewPrompt_BaseBranchWithInstructions(t *testing.T) { // Should have both base branch info and custom instructions assert.Contains(t, prompt, "develop") assert.Contains(t, prompt, "Check for race conditions") + assert.Contains(t, prompt, "git rev-parse --abbrev-ref") // Instructions should be separated from base prompt parts := strings.Split(prompt, "\n\n") assert.Len(t, parts, 2) } +func TestBuildReviewPrompt_BaseBranch_WithMergeBase(t *testing.T) { + dir, baseSHA := setupRepoWithDevBase(t) + + prompt := buildReviewPrompt(dir, ReviewTarget{BaseBranch: "dev"}, "") + expected := "Review the code changes against the base branch 'dev'. The merge base commit for this comparison is " + baseSHA + ". Run `git diff " + baseSHA + "` to inspect the changes relative to dev. Provide prioritized, actionable findings." + assert.Equal(t, expected, prompt) +} + func TestBuildReviewPrompt_Commit(t *testing.T) { prompt := buildReviewPrompt("", ReviewTarget{Commit: "abc1234"}, "") - assert.Contains(t, prompt, "Review the code changes introduced by commit abc1234") - assert.Contains(t, prompt, "git show abc1234") + assert.Equal(t, "Review the code changes introduced by commit abc1234. 
Provide prioritized, actionable findings.", prompt) } func TestBuildReviewPrompt_CommitWithInstructions(t *testing.T) { prompt := buildReviewPrompt("", ReviewTarget{Commit: "def5678"}, "Check security") - assert.Contains(t, prompt, "def5678") - assert.Contains(t, prompt, "Check security") + assert.Equal(t, "Review the code changes introduced by commit def5678. Provide prioritized, actionable findings.\n\nCheck security", prompt) +} + +func TestBuildReviewPrompt_Task_WithMergeBase(t *testing.T) { + dir, baseSHA := setupRepoWithDevBase(t) + + prompt := buildReviewPrompt(dir, ReviewTarget{TaskName: "fix/bug", BaseBranch: "dev"}, "") + expected := "Review the code changes for subtask task 'fix/bug' against the base branch 'dev'. The merge base commit for this comparison is " + baseSHA + ". Run `git diff " + baseSHA + "` to inspect the changes relative to dev. Provide prioritized, actionable findings." + assert.Equal(t, expected, prompt) } func TestBuildReviewPrompt_EmptyTarget_DefaultsToUncommitted(t *testing.T) { prompt := buildReviewPrompt("", ReviewTarget{}, "") - assert.Contains(t, prompt, "Review the current code changes") + assert.Equal(t, "Review the current code changes (staged, unstaged, and untracked files) and provide prioritized findings.", prompt) } diff --git a/pkg/install/SKILL.md b/pkg/install/SKILL.md index 8994e00..ade466b 100644 --- a/pkg/install/SKILL.md +++ b/pkg/install/SKILL.md @@ -11,6 +11,8 @@ Each worker runs in an isolated git worktree. They can't conflict with each othe The user tells you what they need. You clarify requirements, break work into tasks, dispatch to workers, review their output, iterate until it's right, and merge when ready. +Prefer to delegate exploration, research, and planning to workers as part of their tasks. Workers have the time and space to dig deep, whereas you should preserve context to lead. Only go into details yourself when the user explicitly requests it, or when the situation calls for it. + ## Mindset 1.
**Understand before delegating** — ask questions, clarify requirements. Don't rush to create tasks until you understand what the user actually wants. @@ -41,7 +43,7 @@ The user tells you what they need. You clarify requirements, break work into tas ## Flow ```bash -# 1. Draft (task description is shared with worker) +# 1. Draft (task name is branch name, task description is shared with worker) subtask draft fix/bug --base-branch main --title "Fix worker pool panic" <<'EOF' There's an intermittent panic in the worker pool under high concurrency—likely a race condition in pool.go. Reproduce, find root cause, fix, and add tests. @@ -52,7 +54,7 @@ subtask send fix/bug "Go ahead." # 3. When worker finishes, review and iterate subtask stage fix/bug review -# Review with `subtask diff fix/bug`, or read the files at `cd $(subtask workspace fix/bug)`. +# Review with `subtask diff --stat fix/bug`, or read the files at `cd $(subtask workspace fix/bug)`. # 4. Request changes if needed subtask send fix/bug <<'EOF' diff --git a/pkg/install/autoupdate.go b/pkg/install/autoupdate.go index 7b85462..aca6844 100644 --- a/pkg/install/autoupdate.go +++ b/pkg/install/autoupdate.go @@ -2,28 +2,19 @@ package install // AutoUpdateResult captures which installed components were updated to match embedded assets. 
type AutoUpdateResult struct { - UpdatedSkill bool - UpdatedPlugin bool + UpdatedSkill bool } -func AutoUpdateIfInstalled(scope Scope, baseDir string) (AutoUpdateResult, error) { +func AutoUpdateIfInstalled(baseDir string) (AutoUpdateResult, error) { var res AutoUpdateResult - if isSkillInstalled(scope, baseDir) { - _, updated, err := syncSkillTo(scope, baseDir) + if isSkillInstalled(baseDir) { + _, updated, err := syncSkillTo(baseDir) if err != nil { return AutoUpdateResult{}, err } res.UpdatedSkill = updated } - if isPluginInstalled(scope, baseDir) { - _, updated, err := InstallPluginTo(scope, baseDir) - if err != nil { - return AutoUpdateResult{}, err - } - res.UpdatedPlugin = updated - } - return res, nil } diff --git a/pkg/install/autoupdate_test.go b/pkg/install/autoupdate_test.go index bdd202c..d5bc52e 100644 --- a/pkg/install/autoupdate_test.go +++ b/pkg/install/autoupdate_test.go @@ -2,7 +2,6 @@ package install import ( "os" - "path/filepath" "testing" "github.com/stretchr/testify/require" @@ -11,39 +10,28 @@ import ( func TestAutoUpdateIfInstalled_DoesNotCreateWhenMissing(t *testing.T) { base := t.TempDir() - res, err := AutoUpdateIfInstalled(ScopeUser, base) + res, err := AutoUpdateIfInstalled(base) require.NoError(t, err) require.False(t, res.UpdatedSkill) - require.False(t, res.UpdatedPlugin) - _, err = os.Stat(SkillPath(ScopeUser, base)) - require.ErrorIs(t, err, os.ErrNotExist) - _, err = os.Stat(pluginMarkerPath(ScopeUser, base)) + _, err = os.Stat(SkillPath(base)) require.ErrorIs(t, err, os.ErrNotExist) } func TestAutoUpdateIfInstalled_RepairsDrift(t *testing.T) { base := t.TempDir() - _, err := InstallTo(ScopeUser, base) - require.NoError(t, err) - _, _, err = InstallPluginTo(ScopeUser, base) + _, _, err := InstallTo(base) require.NoError(t, err) - // Drift both. 
- require.NoError(t, os.WriteFile(SkillPath(ScopeUser, base), []byte("different"), 0o644)) - require.NoError(t, os.WriteFile(filepath.Join(PluginDir(ScopeUser, base), "hooks", "hooks.json"), []byte(`{}`), 0o644)) + // Drift. + require.NoError(t, os.WriteFile(SkillPath(base), []byte("different"), 0o644)) - res, err := AutoUpdateIfInstalled(ScopeUser, base) + res, err := AutoUpdateIfInstalled(base) require.NoError(t, err) require.True(t, res.UpdatedSkill) - require.True(t, res.UpdatedPlugin) - got, err := os.ReadFile(SkillPath(ScopeUser, base)) + got, err := os.ReadFile(SkillPath(base)) require.NoError(t, err) require.Equal(t, Embedded(), got) - - pst, err := GetPluginStatusFor(ScopeUser, base) - require.NoError(t, err) - require.True(t, pst.UpToDate) } diff --git a/pkg/install/install.go b/pkg/install/install.go index dbd49ff..031e75e 100644 --- a/pkg/install/install.go +++ b/pkg/install/install.go @@ -15,13 +15,6 @@ import ( //go:embed SKILL.md var embeddedSkill []byte -type Scope string - -const ( - ScopeUser Scope = "user" - ScopeProject Scope = "project" -) - // SkillStatus describes the installation state of the embedded skill. type SkillStatus struct { Path string @@ -36,11 +29,8 @@ func Embedded() []byte { return bytes.Clone(embeddedSkill) } -// SkillPath returns the Claude Code skill path for the given base directory. -// For user scope, baseDir should be the user's home directory. -// For project scope, baseDir should be the project root directory. -func SkillPath(scope Scope, baseDir string) string { - _ = scope // for symmetry with other install targets +// SkillPath returns the Claude Code skill path for the given base directory (usually the user's home directory). +func SkillPath(baseDir string) string { if baseDir == "" { return "" } @@ -48,18 +38,31 @@ func SkillPath(scope Scope, baseDir string) string { } // Install writes the embedded skill to the Claude Code skill location (user scope). 
-func Install() (string, error) { +func Install() (string, bool, error) { homeDir, err := homedir.Dir() if err != nil { - return "", err + return "", false, err } - return InstallTo(ScopeUser, homeDir) + return InstallTo(homeDir) +} + +// InstallTo writes the embedded skill to the Claude Code skill location under baseDir (user scope). +func InstallTo(baseDir string) (string, bool, error) { + return syncSkillTo(baseDir) } -// InstallTo writes the embedded skill to the Claude Code skill location under baseDir. -func InstallTo(scope Scope, baseDir string) (string, error) { - path, _, err := syncSkillTo(scope, baseDir) - return path, err +// InstallToProject writes the embedded skill to the project-scoped Claude Code skill location. +// projectRoot should be the git root of the project. +func InstallToProject(projectRoot string) (string, bool, error) { + return syncSkillToProject(projectRoot) +} + +// ProjectSkillPath returns the Claude Code skill path for project scope. +func ProjectSkillPath(projectRoot string) string { + if projectRoot == "" { + return "" + } + return filepath.Join(projectRoot, ".claude", "skills", "subtask", "SKILL.md") } // Uninstall removes the skill from the Claude Code skill location (user scope). @@ -68,12 +71,12 @@ func Uninstall() (string, error) { if err != nil { return "", err } - return UninstallFrom(ScopeUser, homeDir) + return UninstallFrom(homeDir) } // UninstallFrom removes the skill from the Claude Code skill location under baseDir. 
-func UninstallFrom(scope Scope, baseDir string) (string, error) { - path := SkillPath(scope, baseDir) +func UninstallFrom(baseDir string) (string, error) { + path := SkillPath(baseDir) if path == "" { return "", errors.New("invalid base directory") } @@ -89,12 +92,12 @@ func GetSkillStatus() (SkillStatus, error) { if err != nil { return SkillStatus{}, err } - return GetSkillStatusFor(ScopeUser, homeDir) + return GetSkillStatusFor(homeDir) } -// GetSkillStatusFor returns status for baseDir/scope without consulting environment. -func GetSkillStatusFor(scope Scope, baseDir string) (SkillStatus, error) { - path := SkillPath(scope, baseDir) +// GetSkillStatusFor returns status for baseDir without consulting environment. +func GetSkillStatusFor(baseDir string) (SkillStatus, error) { + path := SkillPath(baseDir) if path == "" { return SkillStatus{}, errors.New("invalid base directory") } @@ -126,8 +129,8 @@ func sha256Hex(b []byte) string { return hex.EncodeToString(sum[:]) } -func syncSkillTo(scope Scope, baseDir string) (string, bool, error) { - path := SkillPath(scope, baseDir) +func syncSkillTo(baseDir string) (string, bool, error) { + path := SkillPath(baseDir) if path == "" { return "", false, errors.New("invalid base directory") } @@ -146,11 +149,31 @@ func syncSkillTo(scope Scope, baseDir string) (string, bool, error) { return path, true, nil } -func isSkillInstalled(scope Scope, baseDir string) bool { - path := SkillPath(scope, baseDir) +func isSkillInstalled(baseDir string) bool { + path := SkillPath(baseDir) if path == "" { return false } _, err := os.Stat(path) return err == nil } + +func syncSkillToProject(projectRoot string) (string, bool, error) { + path := ProjectSkillPath(projectRoot) + if path == "" { + return "", false, errors.New("invalid project root") + } + + if err := os.MkdirAll(filepath.Dir(path), 0o755); err != nil { + return "", false, err + } + + if existing, err := os.ReadFile(path); err == nil && bytes.Equal(existing, embeddedSkill) { + 
return path, false, nil + } + + if err := os.WriteFile(path, embeddedSkill, 0o644); err != nil { + return "", false, err + } + return path, true, nil +} diff --git a/pkg/install/install_test.go b/pkg/install/install_test.go index 6cb656a..26e6e3a 100644 --- a/pkg/install/install_test.go +++ b/pkg/install/install_test.go @@ -11,8 +11,9 @@ import ( func TestInstallTo_WritesEmbeddedSkill(t *testing.T) { home := t.TempDir() - path, err := InstallTo(ScopeUser, home) + path, updated, err := InstallTo(home) require.NoError(t, err) + require.True(t, updated) require.Equal(t, filepath.Join(home, ".claude", "skills", "subtask", "SKILL.md"), path) got, err := os.ReadFile(path) @@ -23,13 +24,13 @@ func TestInstallTo_WritesEmbeddedSkill(t *testing.T) { func TestUninstallFrom_RemovesSkillFile(t *testing.T) { home := t.TempDir() - path, err := InstallTo(ScopeUser, home) + path, _, err := InstallTo(home) require.NoError(t, err) _, err = os.Stat(path) require.NoError(t, err) - removedPath, err := UninstallFrom(ScopeUser, home) + removedPath, err := UninstallFrom(home) require.NoError(t, err) require.Equal(t, path, removedPath) @@ -40,7 +41,7 @@ func TestUninstallFrom_RemovesSkillFile(t *testing.T) { func TestGetSkillStatusFor(t *testing.T) { home := t.TempDir() - st, err := GetSkillStatusFor(ScopeUser, home) + st, err := GetSkillStatusFor(home) require.NoError(t, err) require.False(t, st.Installed) require.False(t, st.UpToDate) @@ -48,10 +49,10 @@ func TestGetSkillStatusFor(t *testing.T) { require.Len(t, st.EmbeddedSHA256, 64) require.Empty(t, st.InstalledSHA256) - _, err = InstallTo(ScopeUser, home) + _, _, err = InstallTo(home) require.NoError(t, err) - st, err = GetSkillStatusFor(ScopeUser, home) + st, err = GetSkillStatusFor(home) require.NoError(t, err) require.True(t, st.Installed) require.True(t, st.UpToDate) @@ -60,7 +61,7 @@ func TestGetSkillStatusFor(t *testing.T) { // Drift the installed skill. 
require.NoError(t, os.WriteFile(st.Path, []byte("different"), 0o644)) - st, err = GetSkillStatusFor(ScopeUser, home) + st, err = GetSkillStatusFor(home) require.NoError(t, err) require.True(t, st.Installed) require.False(t, st.UpToDate) diff --git a/pkg/install/migrate_legacy_claude_plugin.go b/pkg/install/migrate_legacy_claude_plugin.go new file mode 100644 index 0000000..75f0e21 --- /dev/null +++ b/pkg/install/migrate_legacy_claude_plugin.go @@ -0,0 +1,189 @@ +package install + +import ( + "encoding/json" + "os" + "path/filepath" + + "github.com/zippoxer/subtask/pkg/task" +) + +type LegacyClaudePluginMigrationResult struct { + RemovedLegacyPluginDir bool + RemovedLegacySettingsKey bool + SkippedSettingsMalformed bool + + PluginDir string + SettingsPath string +} + +type LegacyClaudePluginMigrationOnceResult struct { + Ran bool + Migration LegacyClaudePluginMigrationResult + MarkerPath string +} + +// MigrateLegacyClaudePluginInstall cleans up artifacts created by the old (broken) plugin installer. +// +// Conservative behavior: +// - Always best-effort delete ~/.claude/plugins/subtask (does not error if missing). +// - Only edits settings.json if it exists and contains enabledPlugins as an object containing {"subtask": true}. +// - If settings.json is malformed JSON, it is left untouched and SkippedSettingsMalformed is set. +func MigrateLegacyClaudePluginInstall(homeDir string) (LegacyClaudePluginMigrationResult, error) { + res := LegacyClaudePluginMigrationResult{ + PluginDir: filepath.Join(homeDir, ".claude", "plugins", "subtask"), + SettingsPath: filepath.Join(homeDir, ".claude", "settings.json"), + } + + // Best-effort delete legacy plugin dir. + if homeDir != "" { + if _, err := os.Stat(res.PluginDir); err == nil { + if err := os.RemoveAll(res.PluginDir); err == nil { + res.RemovedLegacyPluginDir = true + } + } else if os.IsNotExist(err) { + // noop + } else { + // Unexpected stat error; ignore and proceed. 
+ } + } + + // Settings: do not create, do not touch unless we can safely remove the legacy key. + if homeDir == "" { + return res, nil + } + if _, err := os.Stat(res.SettingsPath); err != nil { + if os.IsNotExist(err) { + return res, nil + } + // If we can't stat settings, skip. + return res, nil + } + + data, err := os.ReadFile(res.SettingsPath) + if err != nil { + return res, nil + } + + var m map[string]any + if err := json.Unmarshal(data, &m); err != nil { + res.SkippedSettingsMalformed = true + return res, nil + } + + plugins, ok := m["enabledPlugins"].(map[string]any) + if !ok || plugins == nil { + return res, nil + } + + legacyVal, ok := plugins["subtask"].(bool) + if !ok || !legacyVal { + return res, nil + } + + delete(plugins, "subtask") + m["enabledPlugins"] = plugins + + info, err := os.Stat(res.SettingsPath) + if err != nil { + return res, nil + } + + b, err := json.MarshalIndent(m, "", " ") + if err != nil { + return res, nil + } + b = append(b, '\n') + + if err := writeFileAtomic(res.SettingsPath, b, info.Mode().Perm()); err != nil { + return res, nil + } + + res.RemovedLegacySettingsKey = true + return res, nil +} + +func RunLegacyClaudePluginMigrationOnce(homeDir string) (LegacyClaudePluginMigrationOnceResult, error) { + res := LegacyClaudePluginMigrationOnceResult{ + MarkerPath: filepath.Join(task.GlobalDir(), "migrations", "legacy-claude-plugin-v1.done"), + } + + if homeDir == "" { + return res, nil + } + + if err := os.MkdirAll(filepath.Dir(res.MarkerPath), 0o755); err != nil { + return LegacyClaudePluginMigrationOnceResult{}, err + } + + f, err := os.OpenFile(res.MarkerPath, os.O_CREATE|os.O_EXCL|os.O_WRONLY, 0o644) + if err != nil { + if os.IsExist(err) { + return res, nil + } + return LegacyClaudePluginMigrationOnceResult{}, err + } + _ = f.Close() + + legacyPluginDir := filepath.Join(homeDir, ".claude", "plugins", "subtask") + shouldRun := fileExists(legacyPluginDir) || fileExists(task.ConfigPath()) + if !shouldRun { + return res, nil + } + 
+ mig, err := MigrateLegacyClaudePluginInstall(homeDir) + if err != nil { + return LegacyClaudePluginMigrationOnceResult{}, err + } + res.Ran = true + res.Migration = mig + return res, nil +} + +func fileExists(path string) bool { + if path == "" { + return false + } + _, err := os.Stat(path) + return err == nil +} + +func writeFileAtomic(path string, data []byte, perm os.FileMode) error { + dir := filepath.Dir(path) + tmp, err := os.CreateTemp(dir, filepath.Base(path)+".tmp-*") + if err != nil { + return err + } + tmpPath := tmp.Name() + ok := false + defer func() { + _ = tmp.Close() + if !ok { + _ = os.Remove(tmpPath) + } + }() + + if err := tmp.Chmod(perm); err != nil { + return err + } + if _, err := tmp.Write(data); err != nil { + return err + } + if err := tmp.Sync(); err != nil { + return err + } + if err := tmp.Close(); err != nil { + return err + } + + if err := os.Rename(tmpPath, path); err != nil { + // Windows rename does not overwrite; fall back to remove+rename. + _ = os.Remove(path) + if err2 := os.Rename(tmpPath, path); err2 != nil { + return err + } + } + + ok = true + return nil +} diff --git a/pkg/install/ops.go b/pkg/install/ops.go deleted file mode 100644 index 09b2ba9..0000000 --- a/pkg/install/ops.go +++ /dev/null @@ -1,98 +0,0 @@ -package install - -import "errors" - -type InstallRequest struct { - Scope Scope - BaseDir string - Skill bool - Plugin bool -} - -type InstallResult struct { - SkillPath string - PluginDir string - - UpdatedSkill bool - UpdatedPlugin bool - - Settings SettingsChange -} - -func InstallAll(req InstallRequest) (InstallResult, error) { - if req.BaseDir == "" { - return InstallResult{}, errors.New("invalid base directory") - } - - res := InstallResult{} - if req.Skill { - path, updated, err := syncSkillTo(req.Scope, req.BaseDir) - if err != nil { - return InstallResult{}, err - } - res.SkillPath = path - res.UpdatedSkill = updated - } - - if req.Plugin { - dir, updated, err := InstallPluginTo(req.Scope, req.BaseDir) - 
if err != nil { - return InstallResult{}, err - } - res.PluginDir = dir - res.UpdatedPlugin = updated - - ch, err := EnsurePluginEnabled(req.Scope, req.BaseDir) - if err != nil { - return InstallResult{}, err - } - res.Settings = ch - } - - return res, nil -} - -type UninstallRequest struct { - Scope Scope - BaseDir string - Skill bool - Plugin bool -} - -type UninstallResult struct { - SkillPath string - PluginDir string - - Settings SettingsChange -} - -func UninstallAll(req UninstallRequest) (UninstallResult, error) { - if req.BaseDir == "" { - return UninstallResult{}, errors.New("invalid base directory") - } - - res := UninstallResult{} - if req.Skill { - path, err := UninstallFrom(req.Scope, req.BaseDir) - if err != nil { - return UninstallResult{}, err - } - res.SkillPath = path - } - - if req.Plugin { - dir, err := UninstallPluginFrom(req.Scope, req.BaseDir) - if err != nil { - return UninstallResult{}, err - } - res.PluginDir = dir - - ch, err := RemovePluginEnabled(req.Scope, req.BaseDir) - if err != nil { - return UninstallResult{}, err - } - res.Settings = ch - } - - return res, nil -} diff --git a/pkg/install/plugin.go b/pkg/install/plugin.go deleted file mode 100644 index 73dab5b..0000000 --- a/pkg/install/plugin.go +++ /dev/null @@ -1,140 +0,0 @@ -package install - -import ( - "bytes" - "crypto/sha256" - "encoding/hex" - "errors" - "os" - "path/filepath" -) - -const claudePluginName = "subtask" - -// PluginStatus describes the installation state of the embedded plugin. 
-type PluginStatus struct { - Dir string - Installed bool - UpToDate bool - EmbeddedSHA256 string - InstalledSHA256 string -} - -func PluginDir(scope Scope, baseDir string) string { - _ = scope // for symmetry - if baseDir == "" { - return "" - } - return filepath.Join(baseDir, ".claude", "plugins", claudePluginName) -} - -func pluginMarkerPath(scope Scope, baseDir string) string { - dir := PluginDir(scope, baseDir) - if dir == "" { - return "" - } - return filepath.Join(dir, ".claude-plugin", "plugin.json") -} - -func isPluginInstalled(scope Scope, baseDir string) bool { - marker := pluginMarkerPath(scope, baseDir) - if marker == "" { - return false - } - _, err := os.Stat(marker) - return err == nil -} - -func InstallPluginTo(scope Scope, baseDir string) (string, bool, error) { - dir := PluginDir(scope, baseDir) - if dir == "" { - return "", false, errors.New("invalid base directory") - } - - manifest, _, err := embeddedPluginManifest() - if err != nil { - return "", false, err - } - - updated := false - for _, f := range manifest { - dst := filepath.Join(dir, f.RelPath) - if existing, err := os.ReadFile(dst); err == nil && bytes.Equal(existing, f.Data) { - continue - } - if err := os.MkdirAll(filepath.Dir(dst), 0o755); err != nil { - return "", false, err - } - if err := os.WriteFile(dst, f.Data, f.Perm); err != nil { - return "", false, err - } - updated = true - } - - return dir, updated, nil -} - -func UninstallPluginFrom(scope Scope, baseDir string) (string, error) { - dir := PluginDir(scope, baseDir) - if dir == "" { - return "", errors.New("invalid base directory") - } - if err := os.RemoveAll(dir); err != nil { - return "", err - } - return dir, nil -} - -func GetPluginStatusFor(scope Scope, baseDir string) (PluginStatus, error) { - dir := PluginDir(scope, baseDir) - if dir == "" { - return PluginStatus{}, errors.New("invalid base directory") - } - - manifest, embeddedSHA, err := embeddedPluginManifest() - if err != nil { - return PluginStatus{}, err - } 
- - st := PluginStatus{ - Dir: dir, - Installed: false, - UpToDate: false, - EmbeddedSHA256: embeddedSHA, - } - - if !isPluginInstalled(scope, baseDir) { - return st, nil - } - - st.Installed = true - - allPresent := true - allMatch := true - h := sha256.New() - - for _, f := range manifest { - p := filepath.Join(dir, f.RelPath) - b, err := os.ReadFile(p) - if err != nil { - allPresent = false - allMatch = false - break - } - - _, _ = h.Write([]byte(f.RelPath)) - _, _ = h.Write([]byte{0}) - _, _ = h.Write(b) - _, _ = h.Write([]byte{0}) - - if !bytes.Equal(b, f.Data) { - allMatch = false - } - } - - if allPresent { - st.InstalledSHA256 = hex.EncodeToString(h.Sum(nil)) - } - st.UpToDate = allMatch - return st, nil -} diff --git a/pkg/install/plugin_manifest.go b/pkg/install/plugin_manifest.go deleted file mode 100644 index c9d832d..0000000 --- a/pkg/install/plugin_manifest.go +++ /dev/null @@ -1,79 +0,0 @@ -package install - -import ( - "crypto/sha256" - "encoding/hex" - "io/fs" - "os" - "sort" - "strings" - "sync" - - "github.com/zippoxer/subtask/plugin" -) - -type embeddedPluginFile struct { - RelPath string - Data []byte - Perm os.FileMode -} - -var ( - pluginManifestOnce sync.Once - pluginManifest []embeddedPluginFile - pluginManifestSHA string - pluginManifestErr error -) - -func embeddedPluginManifest() ([]embeddedPluginFile, string, error) { - pluginManifestOnce.Do(func() { - var files []embeddedPluginFile - err := fs.WalkDir(plugin.FS, ".", func(p string, d fs.DirEntry, err error) error { - if err != nil { - return err - } - if d.IsDir() { - return nil - } - - data, err := plugin.FS.ReadFile(p) - if err != nil { - return err - } - - rel := p - perm := os.FileMode(0o644) - if strings.HasPrefix(rel, "scripts/") { - perm = 0o755 - } - - files = append(files, embeddedPluginFile{ - RelPath: rel, - Data: data, - Perm: perm, - }) - return nil - }) - if err != nil { - pluginManifestErr = err - return - } - - sort.Slice(files, func(i, j int) bool { - return 
files[i].RelPath < files[j].RelPath - }) - - h := sha256.New() - for _, f := range files { - _, _ = h.Write([]byte(f.RelPath)) - _, _ = h.Write([]byte{0}) - _, _ = h.Write(f.Data) - _, _ = h.Write([]byte{0}) - } - - pluginManifest = files - pluginManifestSHA = hex.EncodeToString(h.Sum(nil)) - }) - - return pluginManifest, pluginManifestSHA, pluginManifestErr -} diff --git a/pkg/install/plugin_test.go b/pkg/install/plugin_test.go deleted file mode 100644 index 9ae42fe..0000000 --- a/pkg/install/plugin_test.go +++ /dev/null @@ -1,61 +0,0 @@ -package install - -import ( - "os" - "path/filepath" - "runtime" - "testing" - - "github.com/stretchr/testify/require" -) - -func TestInstallPluginTo_WritesEmbeddedFiles(t *testing.T) { - base := t.TempDir() - - dir, updated, err := InstallPluginTo(ScopeUser, base) - require.NoError(t, err) - require.True(t, updated) - - require.FileExists(t, filepath.Join(dir, ".claude-plugin", "plugin.json")) - require.FileExists(t, filepath.Join(dir, "hooks", "hooks.json")) - require.FileExists(t, filepath.Join(dir, "scripts", "skill-reminder.sh")) - - info, err := os.Stat(filepath.Join(dir, "scripts", "skill-reminder.sh")) - require.NoError(t, err) - if runtime.GOOS != "windows" { - require.NotZero(t, info.Mode().Perm()&0o111, "should be executable on Unix") - } - - _, updated, err = InstallPluginTo(ScopeUser, base) - require.NoError(t, err) - require.False(t, updated) -} - -func TestGetPluginStatusFor(t *testing.T) { - base := t.TempDir() - - st, err := GetPluginStatusFor(ScopeUser, base) - require.NoError(t, err) - require.False(t, st.Installed) - require.False(t, st.UpToDate) - require.NotEmpty(t, st.Dir) - require.Len(t, st.EmbeddedSHA256, 64) - require.Empty(t, st.InstalledSHA256) - - _, _, err = InstallPluginTo(ScopeUser, base) - require.NoError(t, err) - - st, err = GetPluginStatusFor(ScopeUser, base) - require.NoError(t, err) - require.True(t, st.Installed) - require.True(t, st.UpToDate) - require.Len(t, st.InstalledSHA256, 64) - - // 
Drift. - require.NoError(t, os.WriteFile(filepath.Join(st.Dir, "hooks", "hooks.json"), []byte(`{}`), 0o644)) - - st, err = GetPluginStatusFor(ScopeUser, base) - require.NoError(t, err) - require.True(t, st.Installed) - require.False(t, st.UpToDate) -} diff --git a/pkg/install/settings.go b/pkg/install/settings.go deleted file mode 100644 index 257fe5b..0000000 --- a/pkg/install/settings.go +++ /dev/null @@ -1,227 +0,0 @@ -package install - -import ( - "encoding/json" - "errors" - "os" - "path/filepath" - "time" -) - -// SettingsStatus describes plugin registration status for Claude Code. -type SettingsStatus struct { - Path string - Exists bool - PluginEnabled bool - Error string -} - -func SettingsPath(scope Scope, baseDir string) string { - _ = scope // for symmetry - if baseDir == "" { - return "" - } - return filepath.Join(baseDir, ".claude", "settings.json") -} - -func GetSettingsStatusFor(scope Scope, baseDir string) SettingsStatus { - path := SettingsPath(scope, baseDir) - st := SettingsStatus{Path: path} - if path == "" { - st.Error = "invalid base directory" - return st - } - - data, err := os.ReadFile(path) - if err != nil { - if os.IsNotExist(err) { - return st - } - st.Error = err.Error() - return st - } - - st.Exists = true - var m map[string]any - if err := json.Unmarshal(data, &m); err != nil { - st.Error = "malformed JSON" - return st - } - - plugins := getEnabledPluginsMap(m) - if enabled, ok := plugins[claudePluginName].(bool); ok && enabled { - st.PluginEnabled = true - } - return st -} - -type SettingsChange struct { - Path string - Changed bool - Rewrote bool - BackupTo string -} - -func EnsurePluginEnabled(scope Scope, baseDir string) (SettingsChange, error) { - path := SettingsPath(scope, baseDir) - if path == "" { - return SettingsChange{}, errors.New("invalid base directory") - } - if err := os.MkdirAll(filepath.Dir(path), 0o755); err != nil { - return SettingsChange{}, err - } - - m, rewrote, backupTo, err := readSettingsFile(path) - if 
err != nil { - return SettingsChange{}, err - } - - plugins := getEnabledPluginsMap(m) - - // Check if already enabled - if enabled, ok := plugins[claudePluginName].(bool); ok && enabled { - return SettingsChange{Path: path, Changed: false, Rewrote: rewrote, BackupTo: backupTo}, nil - } - - // Enable the plugin - plugins[claudePluginName] = true - m["enabledPlugins"] = plugins - - if err := writeSettingsFile(path, m); err != nil { - return SettingsChange{}, err - } - - return SettingsChange{Path: path, Changed: true, Rewrote: rewrote, BackupTo: backupTo}, nil -} - -func RemovePluginEnabled(scope Scope, baseDir string) (SettingsChange, error) { - path := SettingsPath(scope, baseDir) - if path == "" { - return SettingsChange{}, errors.New("invalid base directory") - } - - // Don't create file if it doesn't exist - if _, err := os.Stat(path); os.IsNotExist(err) { - return SettingsChange{Path: path, Changed: false}, nil - } - - m, rewrote, backupTo, err := readSettingsFile(path) - if err != nil { - return SettingsChange{}, err - } - - plugins := getEnabledPluginsMap(m) - - // Check if not present - if _, ok := plugins[claudePluginName]; !ok { - // If we had to rewrite due to malformed JSON, write out a fresh, valid settings file. - if rewrote { - m["enabledPlugins"] = plugins - if err := writeSettingsFile(path, m); err != nil { - return SettingsChange{}, err - } - } - return SettingsChange{Path: path, Changed: false, Rewrote: rewrote, BackupTo: backupTo}, nil - } - - // Remove the plugin - delete(plugins, claudePluginName) - m["enabledPlugins"] = plugins - - if err := writeSettingsFile(path, m); err != nil { - return SettingsChange{}, err - } - - return SettingsChange{Path: path, Changed: true, Rewrote: rewrote, BackupTo: backupTo}, nil -} - -// readSettingsFile reads and parses settings.json, returning empty map if missing. 
-func readSettingsFile(path string) (m map[string]any, rewrote bool, backupTo string, err error) { - data, err := os.ReadFile(path) - if err != nil { - if os.IsNotExist(err) { - return make(map[string]any), false, "", nil - } - return nil, false, "", err - } - - if err := json.Unmarshal(data, &m); err != nil { - // Malformed JSON - backup and start fresh - backupTo, err2 := backupFile(path) - if err2 != nil { - return nil, false, "", err2 - } - return make(map[string]any), true, backupTo, nil - } - - if m == nil { - m = make(map[string]any) - } - return m, false, "", nil -} - -// writeSettingsFile writes settings map to path with pretty formatting. -func writeSettingsFile(path string, m map[string]any) error { - b, err := json.MarshalIndent(m, "", " ") - if err != nil { - return err - } - return os.WriteFile(path, append(b, '\n'), 0o644) -} - -// getEnabledPluginsMap extracts or creates the enabledPlugins map. -// Claude Code expects enabledPlugins to be an object: {"plugin-name": true, ...} -func getEnabledPluginsMap(m map[string]any) map[string]any { - if m == nil { - return make(map[string]any) - } - - v := m["enabledPlugins"] - if v == nil { - return make(map[string]any) - } - - // Already correct format - if plugins, ok := v.(map[string]any); ok { - return plugins - } - // Also accept map[string]bool if produced by other tooling. 
- if plugins, ok := v.(map[string]bool); ok { - out := make(map[string]any, len(plugins)) - for k, b := range plugins { - out[k] = b - } - return out - } - - // Convert from array format (legacy/incorrect) to object format - if arr, ok := v.([]any); ok { - plugins := make(map[string]any) - for _, item := range arr { - if name, ok := item.(string); ok && name != "" { - plugins[name] = true - } - } - return plugins - } - - // Unknown format, return empty - return make(map[string]any) -} -func backupFile(path string) (string, error) { - if _, err := os.Stat(path); err != nil { - if os.IsNotExist(err) { - return "", nil - } - return "", err - } - - backupTo := path + ".bak" - if _, err := os.Stat(backupTo); err == nil { - backupTo = path + ".bak-" + time.Now().UTC().Format("20060102T150405Z") - } - if err := os.Rename(path, backupTo); err != nil { - return "", err - } - return backupTo, nil -} diff --git a/pkg/install/settings_test.go b/pkg/install/settings_test.go deleted file mode 100644 index fb6e34b..0000000 --- a/pkg/install/settings_test.go +++ /dev/null @@ -1,131 +0,0 @@ -package install - -import ( - "encoding/json" - "os" - "path/filepath" - "testing" - - "github.com/stretchr/testify/require" -) - -func TestEnsurePluginEnabled_CreatesAndIsIdempotent(t *testing.T) { - base := t.TempDir() - - ch, err := EnsurePluginEnabled(ScopeUser, base) - require.NoError(t, err) - require.True(t, ch.Changed) - - settingsPath := filepath.Join(base, ".claude", "settings.json") - data, err := os.ReadFile(settingsPath) - require.NoError(t, err) - - var m map[string]any - require.NoError(t, json.Unmarshal(data, &m)) - plugins, ok := m["enabledPlugins"].(map[string]any) - require.True(t, ok, "enabledPlugins should be an object") - require.Equal(t, true, plugins[claudePluginName]) - - ch2, err := EnsurePluginEnabled(ScopeUser, base) - require.NoError(t, err) - require.False(t, ch2.Changed) -} - -func TestRemovePluginEnabled_DoesNotCreateMissingFile(t *testing.T) { - base := 
t.TempDir() - - ch, err := RemovePluginEnabled(ScopeUser, base) - require.NoError(t, err) - require.False(t, ch.Changed) - - _, err = os.Stat(filepath.Join(base, ".claude", "settings.json")) - require.ErrorIs(t, err, os.ErrNotExist) -} - -func TestEnsurePluginEnabled_MalformedJSON_BackupsAndRewrites(t *testing.T) { - base := t.TempDir() - settingsPath := filepath.Join(base, ".claude", "settings.json") - require.NoError(t, os.MkdirAll(filepath.Dir(settingsPath), 0o755)) - require.NoError(t, os.WriteFile(settingsPath, []byte("{not json"), 0o644)) - - ch, err := EnsurePluginEnabled(ScopeUser, base) - require.NoError(t, err) - require.True(t, ch.Rewrote) - require.NotEmpty(t, ch.BackupTo) - require.FileExists(t, ch.BackupTo) - - data, err := os.ReadFile(settingsPath) - require.NoError(t, err) - var m map[string]any - require.NoError(t, json.Unmarshal(data, &m)) - plugins, ok := m["enabledPlugins"].(map[string]any) - require.True(t, ok, "enabledPlugins should be an object") - require.Equal(t, true, plugins[claudePluginName]) -} - -func TestEnsurePluginEnabled_PreservesExistingSettings(t *testing.T) { - base := t.TempDir() - settingsPath := filepath.Join(base, ".claude", "settings.json") - require.NoError(t, os.MkdirAll(filepath.Dir(settingsPath), 0o755)) - - // Write existing settings with object format - existing := map[string]any{ - "someOtherSetting": true, - "enabledPlugins": map[string]any{ - "other-plugin": true, - }, - } - data, _ := json.MarshalIndent(existing, "", " ") - require.NoError(t, os.WriteFile(settingsPath, append(data, '\n'), 0o644)) - - ch, err := EnsurePluginEnabled(ScopeUser, base) - require.NoError(t, err) - require.True(t, ch.Changed) - - // Read back and verify - data, err = os.ReadFile(settingsPath) - require.NoError(t, err) - var m map[string]any - require.NoError(t, json.Unmarshal(data, &m)) - - // Other settings preserved - require.Equal(t, true, m["someOtherSetting"]) - - // Both plugins present - plugins, ok := 
m["enabledPlugins"].(map[string]any) - require.True(t, ok, "enabledPlugins should be an object") - require.Equal(t, true, plugins["other-plugin"]) - require.Equal(t, true, plugins[claudePluginName]) -} - -func TestRemovePluginEnabled_PreservesOtherPlugins(t *testing.T) { - base := t.TempDir() - settingsPath := filepath.Join(base, ".claude", "settings.json") - require.NoError(t, os.MkdirAll(filepath.Dir(settingsPath), 0o755)) - - // Write existing settings - existing := map[string]any{ - "enabledPlugins": map[string]any{ - "other-plugin": true, - claudePluginName: true, - }, - } - data, _ := json.MarshalIndent(existing, "", " ") - require.NoError(t, os.WriteFile(settingsPath, append(data, '\n'), 0o644)) - - ch, err := RemovePluginEnabled(ScopeUser, base) - require.NoError(t, err) - require.True(t, ch.Changed) - - // Read back and verify - data, err = os.ReadFile(settingsPath) - require.NoError(t, err) - var m map[string]any - require.NoError(t, json.Unmarshal(data, &m)) - - plugins, ok := m["enabledPlugins"].(map[string]any) - require.True(t, ok, "enabledPlugins should be an object") - require.Equal(t, true, plugins["other-plugin"]) - _, hasSubtask := plugins[claudePluginName] - require.False(t, hasSubtask) -} diff --git a/pkg/install/status.go b/pkg/install/status.go deleted file mode 100644 index 01cc7b0..0000000 --- a/pkg/install/status.go +++ /dev/null @@ -1,35 +0,0 @@ -package install - -import "errors" - -// ScopeStatus describes installation state for a specific scope. 
-type ScopeStatus struct { - Scope Scope - BaseDir string - Skill SkillStatus - Plugin PluginStatus - Settings SettingsStatus -} - -func GetScopeStatus(scope Scope, baseDir string) (ScopeStatus, error) { - if baseDir == "" { - return ScopeStatus{}, errors.New("invalid base directory") - } - - skill, err := GetSkillStatusFor(scope, baseDir) - if err != nil { - return ScopeStatus{}, err - } - plugin, err := GetPluginStatusFor(scope, baseDir) - if err != nil { - return ScopeStatus{}, err - } - - return ScopeStatus{ - Scope: scope, - BaseDir: baseDir, - Skill: skill, - Plugin: plugin, - Settings: GetSettingsStatusFor(scope, baseDir), - }, nil -} diff --git a/pkg/render/box.go b/pkg/render/box.go index 7c73a56..901a465 100644 --- a/pkg/render/box.go +++ b/pkg/render/box.go @@ -64,6 +64,7 @@ type TaskCard struct { Error string Branch string BaseBranch string + BaseCommit string Model string Reasoning string Workspace string @@ -75,7 +76,11 @@ type TaskCard struct { Files []string LinesAdded int // Git diff stats LinesRemoved int - CommitsBehind int + ChangesStatus string // "", "applied", "missing" + ChangesError string + CommitCount int + CommitError string + ShowCommits bool ConflictFiles []string } @@ -86,6 +91,9 @@ func (c *TaskCard) RenderPlain() string { fmt.Fprintf(&buf, "Task: %s\n", c.Name) fmt.Fprintf(&buf, "Title: %s\n", c.Title) fmt.Fprintf(&buf, "Branch: %s (based on %s)\n", c.Branch, c.BaseBranch) + if c.BaseCommit != "" { + fmt.Fprintf(&buf, "Base commit: %s\n", c.BaseCommit) + } if c.Model != "" { if c.Reasoning != "" { fmt.Fprintf(&buf, "Model: %s (%s)\n", c.Model, c.Reasoning) @@ -102,9 +110,32 @@ func (c *TaskCard) RenderPlain() string { fmt.Fprintf(&buf, "Error: %s\n", c.Error) } - // Git changes + // Git changes + commit count if c.TaskStatus != "" { - fmt.Fprintf(&buf, "Changes: %s\n", formatChanges(c.LinesAdded, c.LinesRemoved)) + switch strings.TrimSpace(c.ChangesStatus) { + case "missing": + fmt.Fprintf(&buf, "Changes: missing\n") + indent := 
strings.Repeat(" ", len("Changes: ")) + fmt.Fprintf(&buf, "%sBranch was deleted or commit objects are missing.\n", indent) + fmt.Fprintf(&buf, "%sRun `subtask close` to close, or restore the branch and retry.\n", indent) + default: + if c.ChangesError != "" { + fmt.Fprintf(&buf, "Changes: %s\n", c.ChangesError) + } else { + fmt.Fprintf(&buf, "Changes: %s\n", formatChanges(c.LinesAdded, c.LinesRemoved)) + if strings.TrimSpace(c.ChangesStatus) == "applied" { + indent := strings.Repeat(" ", len("Changes: ")) + fmt.Fprintf(&buf, "%sAlready in base branch. Run `subtask merge` to mark as merged.\n", indent) + } + } + } + if c.ShowCommits { + if c.CommitError != "" { + fmt.Fprintf(&buf, "Commits: %s\n", c.CommitError) + } else { + fmt.Fprintf(&buf, "Commits: %d\n", c.CommitCount) + } + } } if len(c.ConflictFiles) > 0 { @@ -178,6 +209,9 @@ func (c *TaskCard) RenderPretty() string { // Branch branchInfo := fmt.Sprintf("%s %s", c.Branch, styleDim.Render("(based on "+c.BaseBranch+")")) lines = append(lines, fmt.Sprintf("%s %s", styleBold.Render("Branch"), branchInfo)) + if strings.TrimSpace(c.BaseCommit) != "" { + lines = append(lines, fmt.Sprintf("%s %s", styleBold.Render("Base"), styleDim.Render(strings.TrimSpace(c.BaseCommit)))) + } // Model (and reasoning) if c.Model != "" { @@ -193,9 +227,30 @@ func (c *TaskCard) RenderPretty() string { lines = append(lines, fmt.Sprintf("%s %s", styleBold.Render("Workspace"), c.Workspace)) } - // Changes (git diff stats) + // Changes + commits if c.TaskStatus != "" { - lines = append(lines, fmt.Sprintf("%s %s", styleBold.Render("Changes"), formatChangesColored(c.LinesAdded, c.LinesRemoved))) + switch strings.TrimSpace(c.ChangesStatus) { + case "missing": + lines = append(lines, fmt.Sprintf("%s %s", styleBold.Render("Changes"), styleDim.Render("missing"))) + lines = append(lines, fmt.Sprintf("%s %s", styleDim.Render(""), styleDim.Render("Branch was deleted or commit objects are missing."))) + lines = append(lines, fmt.Sprintf("%s %s", 
styleDim.Render(""), styleDim.Render("Run `subtask close` to close, or restore the branch and retry."))) + default: + if c.ChangesError != "" { + lines = append(lines, fmt.Sprintf("%s %s", styleBold.Render("Changes"), styleError.Render(c.ChangesError))) + } else { + lines = append(lines, fmt.Sprintf("%s %s", styleBold.Render("Changes"), formatChangesColored(c.LinesAdded, c.LinesRemoved))) + if strings.TrimSpace(c.ChangesStatus) == "applied" { + lines = append(lines, fmt.Sprintf("%s %s", styleDim.Render(""), styleDim.Render("Already in base branch. Run `subtask merge` to mark as merged."))) + } + } + } + if c.ShowCommits { + if c.CommitError != "" { + lines = append(lines, fmt.Sprintf("%s %s", styleBold.Render("Commits"), styleError.Render(c.CommitError))) + } else { + lines = append(lines, fmt.Sprintf("%s %d", styleBold.Render("Commits"), c.CommitCount)) + } + } } if len(c.ConflictFiles) > 0 { diff --git a/pkg/render/tasklist.go b/pkg/render/tasklist.go index 1d71687..c059327 100644 --- a/pkg/render/tasklist.go +++ b/pkg/render/tasklist.go @@ -17,7 +17,7 @@ type TaskRow struct { Title string LinesAdded int // Git diff stats LinesRemoved int - CommitsBehind int + ChangesStatus string // "", "applied", "missing" } // TaskListTable renders a list of tasks. 
@@ -161,6 +161,19 @@ func formatChangesColored(added, removed int) string { // For closed+merged tasks: shows "✓ merged" in purple // For other tasks: shows normal changes func formatChangesForTask(task TaskRow, colored bool) string { + switch strings.TrimSpace(task.ChangesStatus) { + case "missing": + if colored { + return styleDim.Render("missing") + } + return "missing" + case "applied": + if colored { + return styleDim.Render("applied") + } + return fmt.Sprintf("applied (+%d -%d)", task.LinesAdded, task.LinesRemoved) + } + // Normal changes var changes string if colored { @@ -169,14 +182,6 @@ func formatChangesForTask(task TaskRow, colored bool) string { changes = formatChanges(task.LinesAdded, task.LinesRemoved) } - if task.CommitsBehind > 0 { - behind := fmt.Sprintf("(%d behind)", task.CommitsBehind) - if colored { - behind = styleDim.Render(behind) - } - changes += " " + behind - } - return changes } diff --git a/pkg/subtaskerr/errors.go b/pkg/subtaskerr/errors.go new file mode 100644 index 0000000..c90da08 --- /dev/null +++ b/pkg/subtaskerr/errors.go @@ -0,0 +1,15 @@ +package subtaskerr + +import "errors" + +var ( + // ErrNotConfigured is returned when ~/.subtask/config.json is missing and no automatic migration applies. + ErrNotConfigured = errors.New("subtask: not configured — run 'subtask install' first") + // ErrNotGitRepo is returned when a command requires git but the cwd is not inside a git repository. + ErrNotGitRepo = errors.New("subtask: not a git repository — subtask requires git") + + // ErrNoAnchorFromWorkspace is returned when running from a worker workspace and Subtask cannot + // determine the main repo anchor worktree. 
+ ErrNoAnchorFromWorkspace = errors.New("subtask: cannot determine project root from within a worker workspace") +) + diff --git a/pkg/task/gather/detail.go b/pkg/task/gather/detail.go index f9c0c50..5afea94 100644 --- a/pkg/task/gather/detail.go +++ b/pkg/task/gather/detail.go @@ -31,10 +31,7 @@ type TaskDetail struct { LinesAdded int LinesRemoved int - CommitsBehind int ConflictFiles []string - - IntegratedReason string } func Detail(ctx context.Context, taskName string) (TaskDetail, error) { @@ -46,10 +43,9 @@ func Detail(ctx context.Context, taskName string) (TaskDetail, error) { if err := idx.Refresh(ctx, index.RefreshPolicy{ Git: index.GitPolicy{ - Mode: index.GitTasks, - Tasks: []string{taskName}, - IncludeConflicts: true, - IncludeIntegration: true, + Mode: index.GitTasks, + Tasks: []string{taskName}, + IncludeConflicts: true, }, }); err != nil { return TaskDetail{}, err @@ -82,14 +78,12 @@ func Detail(ctx context.Context, taskName string) (TaskDetail, error) { LastHistory: rec.LastHistory.UnixNano(), LastRunMS: rec.LastRunDurationMS, } - d.IntegratedReason = rec.IntegratedReason if cfg != nil && cfg.Harness == "codex" { d.Reasoning = workspace.ResolveReasoning(cfg, t, "") } d.LinesAdded = rec.LinesAdded d.LinesRemoved = rec.LinesRemoved - d.CommitsBehind = rec.CommitsBehind if rec.ConflictFilesJSON != "" { var conflicts []string diff --git a/pkg/task/gather/list.go b/pkg/task/gather/list.go index d2a00af..d755708 100644 --- a/pkg/task/gather/list.go +++ b/pkg/task/gather/list.go @@ -79,12 +79,11 @@ func List(ctx context.Context, opts ListOptions) (TaskListData, error) { } if debug { - logging.Debug("refresh", "index.Refresh() opts={GitOpenOnly, IncludeIntegration}") + logging.Debug("refresh", "index.Refresh() opts={GitOpenOnly}") } if err := idx.Refresh(ctx, index.RefreshPolicy{ Git: index.GitPolicy{ - Mode: index.GitOpenOnly, - IncludeIntegration: true, + Mode: index.GitOpenOnly, }, }); err != nil { logging.Error("refresh", "index.Refresh error: 
"+err.Error()) diff --git a/pkg/task/history/history.go b/pkg/task/history/history.go index d0e0603..8e2622a 100644 --- a/pkg/task/history/history.go +++ b/pkg/task/history/history.go @@ -72,49 +72,54 @@ func AppendLocked(taskName string, ev Event) error { func WriteAll(taskName string, events []Event) error { return task.WithLock(taskName, func() error { - if err := os.MkdirAll(task.Dir(taskName), 0o755); err != nil { - return err - } - path := task.HistoryPath(taskName) - tmp := path + ".tmp" + return WriteAllLocked(taskName, events) + }) +} - f, err := os.OpenFile(tmp, os.O_CREATE|os.O_TRUNC|os.O_WRONLY, 0o644) - if err != nil { - return err - } - buf := bufio.NewWriterSize(f, 128*1024) - for _, ev := range events { - if ev.TS.IsZero() { - ev.TS = time.Now().UTC() - } - b, err := json.Marshal(ev) - if err != nil { - f.Close() - _ = os.Remove(tmp) - return err - } - if _, err := buf.Write(append(b, '\n')); err != nil { - f.Close() - _ = os.Remove(tmp) - return err - } +// WriteAllLocked rewrites history.jsonl atomically. The caller must hold the task lock. 
+func WriteAllLocked(taskName string, events []Event) error { + if err := os.MkdirAll(task.Dir(taskName), 0o755); err != nil { + return err + } + path := task.HistoryPath(taskName) + tmp := path + ".tmp" + + f, err := os.OpenFile(tmp, os.O_CREATE|os.O_TRUNC|os.O_WRONLY, 0o644) + if err != nil { + return err + } + buf := bufio.NewWriterSize(f, 128*1024) + for _, ev := range events { + if ev.TS.IsZero() { + ev.TS = time.Now().UTC() } - if err := buf.Flush(); err != nil { + b, err := json.Marshal(ev) + if err != nil { f.Close() _ = os.Remove(tmp) return err } - if err := f.Sync(); err != nil { + if _, err := buf.Write(append(b, '\n')); err != nil { f.Close() _ = os.Remove(tmp) return err } - if err := f.Close(); err != nil { - _ = os.Remove(tmp) - return err - } - return os.Rename(tmp, path) - }) + } + if err := buf.Flush(); err != nil { + f.Close() + _ = os.Remove(tmp) + return err + } + if err := f.Sync(); err != nil { + f.Close() + _ = os.Remove(tmp) + return err + } + if err := f.Close(); err != nil { + _ = os.Remove(tmp) + return err + } + return os.Rename(tmp, path) } type ReadOptions struct { @@ -168,12 +173,21 @@ func Read(taskName string, opts ReadOptions) ([]Event, error) { } type TailInfo struct { - LastTS time.Time - TaskStatus task.TaskStatus - Stage string - LastMergedCommit string - BaseBranch string - BaseCommit string + LastTS time.Time + TaskStatus task.TaskStatus + Stage string + LastMergedCommit string + LastMergedMethod string + LastMergedBaseCommit string + LastMergedBranchHead string + LastMergedLinesAdded int + LastMergedLinesRemoved int + LastMergedFrozenError string + LastClosedLinesAdded int + LastClosedLinesRemoved int + LastClosedFrozenError string + BaseBranch string + BaseCommit string LastRunDurationMS int LastRunToolCalls int @@ -248,6 +262,8 @@ func TailPath(path string) (TailInfo, error) { var info TailInfo var taskStatusSet bool + var mergedStatsSet bool + var closedStatsSet bool // Track run completion by run_id for "running since" 
detection. finishedByRun := make(map[string]struct{}) @@ -288,18 +304,51 @@ func TailPath(path string) (TailInfo, error) { info.TaskStatus = task.TaskStatusClosed taskStatusSet = true } + if !closedStatsSet { + var d struct { + ChangesAdded int `json:"changes_added"` + ChangesRemoved int `json:"changes_removed"` + FrozenError string `json:"frozen_error"` + } + _ = json.Unmarshal(ev.Data, &d) + info.LastClosedLinesAdded = d.ChangesAdded + info.LastClosedLinesRemoved = d.ChangesRemoved + info.LastClosedFrozenError = strings.TrimSpace(d.FrozenError) + closedStatsSet = true + } case "task.merged": if !taskStatusSet { info.TaskStatus = task.TaskStatusMerged taskStatusSet = true } + var d struct { + Commit string `json:"commit"` + Method string `json:"method"` + BaseCommit string `json:"base_commit"` + BranchHead string `json:"branch_head"` + ChangesAdded int `json:"changes_added"` + ChangesRemoved int `json:"changes_removed"` + FrozenError string `json:"frozen_error"` + } + _ = json.Unmarshal(ev.Data, &d) if info.LastMergedCommit == "" { - var d struct { - Commit string `json:"commit"` - } - _ = json.Unmarshal(ev.Data, &d) info.LastMergedCommit = strings.TrimSpace(d.Commit) } + if info.LastMergedMethod == "" { + info.LastMergedMethod = strings.TrimSpace(d.Method) + } + if info.LastMergedBaseCommit == "" { + info.LastMergedBaseCommit = strings.TrimSpace(d.BaseCommit) + } + if info.LastMergedBranchHead == "" { + info.LastMergedBranchHead = strings.TrimSpace(d.BranchHead) + } + if !mergedStatsSet { + info.LastMergedLinesAdded = d.ChangesAdded + info.LastMergedLinesRemoved = d.ChangesRemoved + info.LastMergedFrozenError = strings.TrimSpace(d.FrozenError) + mergedStatsSet = true + } case "task.opened": if !taskStatusSet { info.TaskStatus = task.TaskStatusOpen diff --git a/pkg/task/index/benchmarks_test.go b/pkg/task/index/benchmarks_test.go index 55cc404..eec0fd6 100644 --- a/pkg/task/index/benchmarks_test.go +++ b/pkg/task/index/benchmarks_test.go @@ -12,6 +12,7 @@ import ( 
"github.com/zippoxer/subtask/pkg/task" "github.com/zippoxer/subtask/pkg/task/history" + "github.com/zippoxer/subtask/pkg/task/migrate/gitredesign" ) func BenchmarkIndex_Refresh_NoChanges_100Tasks(b *testing.B) { @@ -19,7 +20,7 @@ func BenchmarkIndex_Refresh_NoChanges_100Tasks(b *testing.B) { for i := 0; i < 100; i++ { name := fmt.Sprintf("bench/%03d", i) - requireNoError(b, (&task.Task{Name: name, Title: "t", BaseBranch: "main", Description: "d", Schema: 1}).Save()) + requireNoError(b, (&task.Task{Name: name, Title: "t", BaseBranch: "main", Description: "d", Schema: gitredesign.TaskSchemaVersion}).Save()) requireNoError(b, history.WriteAll(name, []history.Event{ {TS: time.Now().UTC(), Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}, {TS: time.Now().UTC(), Type: "stage.changed", Data: mustJSON(map[string]any{"from": "", "to": "implement"})}, @@ -44,7 +45,7 @@ func BenchmarkIndex_List_NoChanges_100Tasks(b *testing.B) { for i := 0; i < 100; i++ { name := fmt.Sprintf("bench/%03d", i) - requireNoError(b, (&task.Task{Name: name, Title: "t", BaseBranch: "main", Description: "d", Schema: 1}).Save()) + requireNoError(b, (&task.Task{Name: name, Title: "t", BaseBranch: "main", Description: "d", Schema: gitredesign.TaskSchemaVersion}).Save()) requireNoError(b, history.WriteAll(name, []history.Event{ {TS: time.Now().UTC(), Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}, {TS: time.Now().UTC(), Type: "stage.changed", Data: mustJSON(map[string]any{"from": "", "to": "implement"})}, @@ -70,7 +71,7 @@ func BenchmarkIndex_Detail_Cached(b *testing.B) { setupTempProject(b) name := "bench/detail" - requireNoError(b, (&task.Task{Name: name, Title: "t", BaseBranch: "main", Description: "d", Schema: 1}).Save()) + requireNoError(b, (&task.Task{Name: name, Title: "t", BaseBranch: "main", Description: "d", Schema: gitredesign.TaskSchemaVersion}).Save()) requireNoError(b, history.WriteAll(name, 
[]history.Event{ {TS: time.Now().UTC(), Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}, {TS: time.Now().UTC(), Type: "stage.changed", Data: mustJSON(map[string]any{"from": "", "to": "implement"})}, diff --git a/pkg/task/index/commits_behind_test.go b/pkg/task/index/commits_behind_test.go deleted file mode 100644 index f12f3d5..0000000 --- a/pkg/task/index/commits_behind_test.go +++ /dev/null @@ -1,113 +0,0 @@ -package index_test - -import ( - "context" - "strings" - "testing" - "time" - - "github.com/stretchr/testify/require" - - "github.com/zippoxer/subtask/pkg/task" - "github.com/zippoxer/subtask/pkg/task/history" - taskindex "github.com/zippoxer/subtask/pkg/task/index" - "github.com/zippoxer/subtask/pkg/testutil" -) - -func commitEmpty(t *testing.T, dir, msg string) { - t.Helper() - gitOut(t, dir, "commit", "--allow-empty", "-m", msg) -} - -func TestIndex_CommitsBehind_UsesTaskRef_WhenBranchExists(t *testing.T) { - env := testutil.NewTestEnv(t, 0) - ctx := context.Background() - now := time.Date(2026, 1, 1, 12, 0, 0, 0, time.UTC) - - // Draft-time base commit. - baseCommit := gitOut(t, env.RootDir, "rev-parse", "HEAD") - - // Base branch advances. - commitEmpty(t, env.RootDir, "main-1") - commitEmpty(t, env.RootDir, "main-2") - - // Task branch created at the old base commit, then rebased onto current main. - taskName := "behind/rebased" - gitOut(t, env.RootDir, "switch", "-c", taskName, baseCommit) - commitEmpty(t, env.RootDir, "task-1") - gitOut(t, env.RootDir, "rebase", "main") - gitOut(t, env.RootDir, "switch", "main") - - // Sanity: main advanced relative to the pinned base commit. 
- require.NotEqual(t, baseCommit, gitOut(t, env.RootDir, "rev-parse", "main")) - require.Greater(t, mustAtoi(t, gitOut(t, env.RootDir, "rev-list", "--count", baseCommit+"..main")), 0) - - env.CreateTask(taskName, "Rebased task", "main", "Description") - env.CreateTaskState(taskName, &task.State{Workspace: ""}) - env.CreateTaskHistory(taskName, []history.Event{ - {TS: now, Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main", "base_commit": baseCommit})}, - {TS: now, Type: "stage.changed", Data: mustJSON(map[string]any{"from": "", "to": "implement"})}, - }) - - idx, err := taskindex.OpenDefault() - require.NoError(t, err) - t.Cleanup(func() { _ = idx.Close() }) - - require.NoError(t, idx.Refresh(ctx, taskindex.RefreshPolicy{ - Git: taskindex.GitPolicy{ - Mode: taskindex.GitTasks, - Tasks: []string{taskName}, - }, - })) - - rec, ok, err := idx.Get(ctx, taskName) - require.NoError(t, err) - require.True(t, ok) - require.Equal(t, 0, rec.CommitsBehind) -} - -func TestIndex_CommitsBehind_FallsBackToBaseCommit_WhenBranchMissing(t *testing.T) { - env := testutil.NewTestEnv(t, 0) - ctx := context.Background() - now := time.Date(2026, 1, 1, 12, 0, 0, 0, time.UTC) - - baseCommit := gitOut(t, env.RootDir, "rev-parse", "HEAD") - commitEmpty(t, env.RootDir, "main-1") - commitEmpty(t, env.RootDir, "main-2") - - taskName := "behind/draft-only" - env.CreateTask(taskName, "Draft-only task", "main", "Description") - env.CreateTaskState(taskName, &task.State{Workspace: ""}) - env.CreateTaskHistory(taskName, []history.Event{ - {TS: now, Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main", "base_commit": baseCommit})}, - {TS: now, Type: "stage.changed", Data: mustJSON(map[string]any{"from": "", "to": "implement"})}, - }) - - idx, err := taskindex.OpenDefault() - require.NoError(t, err) - t.Cleanup(func() { _ = idx.Close() }) - - require.NoError(t, idx.Refresh(ctx, taskindex.RefreshPolicy{ - Git: 
taskindex.GitPolicy{ - Mode: taskindex.GitTasks, - Tasks: []string{taskName}, - }, - })) - - rec, ok, err := idx.Get(ctx, taskName) - require.NoError(t, err) - require.True(t, ok) - require.Equal(t, 2, rec.CommitsBehind) -} - -func mustAtoi(t *testing.T, s string) int { - t.Helper() - n := 0 - for _, ch := range strings.TrimSpace(s) { - if ch < '0' || ch > '9' { - t.Fatalf("not an int: %q", s) - } - n = n*10 + int(ch-'0') - } - return n -} diff --git a/pkg/task/index/detected_merge_sort_test.go b/pkg/task/index/detected_merge_sort_test.go deleted file mode 100644 index 4e8af00..0000000 --- a/pkg/task/index/detected_merge_sort_test.go +++ /dev/null @@ -1,56 +0,0 @@ -package index_test - -import ( - "context" - "testing" - "time" - - "github.com/stretchr/testify/require" - - "github.com/zippoxer/subtask/pkg/task" - "github.com/zippoxer/subtask/pkg/task/history" - taskindex "github.com/zippoxer/subtask/pkg/task/index" - "github.com/zippoxer/subtask/pkg/testutil" -) - -func TestIndex_ListAll_DetectedMergeDoesNotAffectSortOrder(t *testing.T) { - env := testutil.NewTestEnv(t, 0) - ctx := context.Background() - - oldClosedAt := time.Date(2020, 1, 2, 12, 0, 0, 0, time.UTC) - newMergedAt := time.Date(2025, 1, 2, 12, 0, 0, 0, time.UTC) - detectedMergedAt := time.Date(2026, 1, 2, 12, 0, 0, 0, time.UTC) - - oldName := "sort/old" - newName := "sort/new" - - env.CreateTask(oldName, "Old task", "main", "desc") - env.CreateTaskHistory(oldName, []history.Event{ - {TS: oldClosedAt.Add(-1 * time.Hour), Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}, - {TS: oldClosedAt, Type: "task.closed"}, - {TS: detectedMergedAt, Type: "task.merged", Data: mustJSON(map[string]any{"via": "detected"})}, - }) - - env.CreateTask(newName, "New task", "main", "desc") - env.CreateTaskHistory(newName, []history.Event{ - {TS: newMergedAt.Add(-1 * time.Hour), Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}, - {TS: 
newMergedAt, Type: "task.merged", Data: mustJSON(map[string]any{"commit": "abc"})}, - }) - - idx, err := taskindex.OpenDefault() - require.NoError(t, err) - t.Cleanup(func() { _ = idx.Close() }) - - require.NoError(t, idx.Refresh(ctx, taskindex.RefreshPolicy{ - Git: taskindex.GitPolicy{Mode: taskindex.GitNone}, - })) - - items, err := idx.ListAll(ctx) - require.NoError(t, err) - require.Len(t, items, 2) - - require.Equal(t, newName, items[0].Name) - require.Equal(t, oldName, items[1].Name) - require.True(t, items[1].LastHistory.Equal(oldClosedAt)) - require.Equal(t, task.TaskStatusMerged, items[1].TaskStatus) -} diff --git a/pkg/task/index/git_redesign_cache.go b/pkg/task/index/git_redesign_cache.go new file mode 100644 index 0000000..23271c6 --- /dev/null +++ b/pkg/task/index/git_redesign_cache.go @@ -0,0 +1,129 @@ +package index + +import ( + "context" + "fmt" + "strings" +) + +func (i *Index) UpdateRefHeads(ctx context.Context, name string, branchHead string, baseHead string) error { + if ctx == nil { + ctx = context.Background() + } + branchHead = strings.TrimSpace(branchHead) + baseHead = strings.TrimSpace(baseHead) + + _, err := i.db.ExecContext(ctx, ` +UPDATE tasks +SET + branch_head = ?, + base_head = ? +WHERE name = ?;`, + nullableString(branchHead), + nullableString(baseHead), + name, + ) + if err != nil { + return fmt.Errorf("index update ref heads: %w", err) + } + return nil +} + +func (i *Index) UpdateChangesCache(ctx context.Context, name string, baseCommit string, branchHead string, added int, removed int) error { + if ctx == nil { + ctx = context.Background() + } + baseCommit = strings.TrimSpace(baseCommit) + branchHead = strings.TrimSpace(branchHead) + + _, err := i.db.ExecContext(ctx, ` +UPDATE tasks +SET + changes_added = ?, + changes_removed = ?, + changes_base_commit = ?, + changes_branch_head = ? 
+WHERE name = ?;`, + added, + removed, + nullableString(baseCommit), + nullableString(branchHead), + name, + ) + if err != nil { + return fmt.Errorf("index update changes cache: %w", err) + } + return nil +} + +func (i *Index) UpdateCommitCountCache(ctx context.Context, name string, baseCommit string, branchHead string, count int) error { + if ctx == nil { + ctx = context.Background() + } + baseCommit = strings.TrimSpace(baseCommit) + branchHead = strings.TrimSpace(branchHead) + + _, err := i.db.ExecContext(ctx, ` +UPDATE tasks +SET + commit_count = ?, + commit_count_base_commit = ?, + commit_count_branch_head = ? +WHERE name = ?;`, + count, + nullableString(baseCommit), + nullableString(branchHead), + name, + ) + if err != nil { + return fmt.Errorf("index update commit count cache: %w", err) + } + return nil +} + +func (i *Index) UpdateCommitLogLastHead(ctx context.Context, name string, branchHead string) error { + if ctx == nil { + ctx = context.Background() + } + branchHead = strings.TrimSpace(branchHead) + + _, err := i.db.ExecContext(ctx, ` +UPDATE tasks +SET commit_log_last_head = ? +WHERE name = ?;`, + nullableString(branchHead), + name, + ) + if err != nil { + return fmt.Errorf("index update commit log last head: %w", err) + } + return nil +} + +func (i *Index) UpdateIntegrationCache(ctx context.Context, name string, branchHead string, targetHead string, reason string) error { + if ctx == nil { + ctx = context.Background() + } + branchHead = strings.TrimSpace(branchHead) + targetHead = strings.TrimSpace(targetHead) + reason = strings.TrimSpace(reason) + + _, err := i.db.ExecContext(ctx, ` +UPDATE tasks +SET + git_integrated_reason = ?, + git_integrated_branch_head = ?, + git_integrated_target_head = ?, + git_integrated_checked_at_ns = ? 
+WHERE name = ?;`, + nullableString(reason), + nullableString(branchHead), + nullableString(targetHead), + i.now().UnixNano(), + name, + ) + if err != nil { + return fmt.Errorf("index update integration cache: %w", err) + } + return nil +} diff --git a/pkg/task/index/gitcache.go b/pkg/task/index/gitcache.go index 2d5c23b..d7d290f 100644 --- a/pkg/task/index/gitcache.go +++ b/pkg/task/index/gitcache.go @@ -33,8 +33,7 @@ type GitPolicy struct { // Tasks is used when Mode == GitTasks. Tasks []string - IncludeConflicts bool - IncludeIntegration bool + IncludeConflicts bool } const defaultGitTTL = 30 * time.Second @@ -46,8 +45,8 @@ func (i *Index) refreshGit(ctx context.Context, p GitPolicy) error { debug := logging.DebugEnabled() if debug { - logging.Debug("git-cache", fmt.Sprintf("refreshGit start mode=%s ttl=%s includeConflicts=%t includeIntegration=%t tasks=%d", - gitModeString(p.Mode), p.TTL, p.IncludeConflicts, p.IncludeIntegration, len(p.Tasks))) + logging.Debug("git-cache", fmt.Sprintf("refreshGit start mode=%s ttl=%s includeConflicts=%t tasks=%d", + gitModeString(p.Mode), p.TTL, p.IncludeConflicts, len(p.Tasks))) } ttl := p.TTL @@ -77,7 +76,6 @@ func (i *Index) refreshGit(ctx context.Context, p GitPolicy) error { linesAdded *int linesRemoved *int - commitsBehind *int conflictFilesJSON *string @@ -178,30 +176,6 @@ func (i *Index) refreshGit(ctx context.Context, p GitPolicy) error { firstErr = err } } - - if targetRef != "" { - // "Behind" means "how many commits the base branch has that the task ref doesn't". - // - // Prefer comparing base branch vs the task branch (correct after rebases/merges), - // and fall back to the pinned base_commit for draft-only tasks where the branch - // doesn't exist yet. 
- baseRef := "" - if git.BranchExists(repoDir, c.name) { - baseRef = c.name - } else { - baseRef = c.baseCommit - } - - baseRef = strings.TrimSpace(baseRef) - if baseRef != "" { - behind, err := git.CommitsBehind(repoDir, baseRef, targetRef) - if err == nil { - r.commitsBehind = &behind - } else if firstErr == nil { - firstErr = err - } - } - } } if needsConflicts { @@ -252,7 +226,6 @@ func (i *Index) refreshGit(ctx context.Context, p GitPolicy) error { UPDATE tasks SET git_lines_added = CASE WHEN ? THEN ? ELSE git_lines_added END, git_lines_removed = CASE WHEN ? THEN ? ELSE git_lines_removed END, - git_commits_behind = CASE WHEN ? THEN ? ELSE git_commits_behind END, git_base_ref = CASE WHEN ? THEN ? ELSE git_base_ref END, git_target_ref = CASE WHEN ? THEN ? ELSE git_target_ref END, git_computed_at_ns = CASE WHEN ? THEN ? ELSE git_computed_at_ns END, @@ -274,8 +247,6 @@ WHERE name = ?; boolToInt(r.updateBase), nullableInt(r.linesRemoved), boolToInt(r.updateBase), - nullableInt(r.commitsBehind), - boolToInt(r.updateBase), nullableStringPtr(r.baseRef), boolToInt(r.updateBase), nullableStringPtr(r.targetRef), @@ -295,17 +266,10 @@ WHERE name = ?; return fmt.Errorf("index git refresh: commit: %w", err) } } - - var intStart time.Time - if debug { - intStart = time.Now() - } - err = i.refreshIntegration(ctx, p) if debug { - logging.Debug("git-cache", fmt.Sprintf("refreshIntegration (%s)", time.Since(intStart).Round(time.Millisecond))) logging.Debug("git-cache", "refreshGit done") } - return err + return nil } type gitCandidate struct { diff --git a/pkg/task/index/index.go b/pkg/task/index/index.go index 0e7640f..7608e7d 100644 --- a/pkg/task/index/index.go +++ b/pkg/task/index/index.go @@ -7,8 +7,10 @@ import ( "fmt" "os" "path/filepath" + "strings" "time" + "github.com/zippoxer/subtask/internal/filelock" "github.com/zippoxer/subtask/pkg/task" _ "modernc.org/sqlite" @@ -27,7 +29,7 @@ type Index struct { // OpenDefault opens (or creates) the index database at 
.subtask/index.db. func OpenDefault() (*Index, error) { - return Open(filepath.Join(task.ProjectDir(), "index.db")) + return Open(task.IndexPath()) } // Open opens (or creates) the index database at path. @@ -39,6 +41,21 @@ func Open(path string) (*Index, error) { return nil, fmt.Errorf("create index dir: %w", err) } + // Cross-process guardrail: avoid concurrent migrations/pragma races when multiple + // `subtask` processes start at the same time (e.g. parallel `subtask list`). + lockFile, err := os.OpenFile(path+".lock", os.O_CREATE|os.O_RDWR, 0o644) + if err != nil { + return nil, fmt.Errorf("open index lock: %w", err) + } + if err := filelock.LockExclusive(lockFile); err != nil { + _ = lockFile.Close() + return nil, fmt.Errorf("lock index: %w", err) + } + defer func() { + _ = filelock.Unlock(lockFile) + _ = lockFile.Close() + }() + db, err := sql.Open("sqlite", path) if err != nil { return nil, fmt.Errorf("open sqlite: %w", err) @@ -67,21 +84,21 @@ func Open(path string) (*Index, error) { } func (i *Index) init(ctx context.Context) error { - if err := i.db.PingContext(ctx); err != nil { + if err := pingWithRetry(ctx, i.db); err != nil { return fmt.Errorf("ping index db: %w", err) } // Pragmas: best-effort for speed + concurrency. 
- if _, err := i.db.ExecContext(ctx, "PRAGMA journal_mode=WAL;"); err != nil { + if err := execPragmaWithRetry(ctx, i.db, "PRAGMA journal_mode=WAL;"); err != nil { return fmt.Errorf("pragma journal_mode: %w", err) } - if _, err := i.db.ExecContext(ctx, "PRAGMA synchronous=NORMAL;"); err != nil { + if err := execPragmaWithRetry(ctx, i.db, "PRAGMA synchronous=NORMAL;"); err != nil { return fmt.Errorf("pragma synchronous: %w", err) } - if _, err := i.db.ExecContext(ctx, fmt.Sprintf("PRAGMA busy_timeout=%d;", defaultBusyTimeout.Milliseconds())); err != nil { + if err := execPragmaWithRetry(ctx, i.db, fmt.Sprintf("PRAGMA busy_timeout=%d;", defaultBusyTimeout.Milliseconds())); err != nil { return fmt.Errorf("pragma busy_timeout: %w", err) } - if _, err := i.db.ExecContext(ctx, "PRAGMA foreign_keys=ON;"); err != nil { + if err := execPragmaWithRetry(ctx, i.db, "PRAGMA foreign_keys=ON;"); err != nil { return fmt.Errorf("pragma foreign_keys: %w", err) } @@ -92,6 +109,46 @@ func (i *Index) init(ctx context.Context) error { return nil } +func pingWithRetry(ctx context.Context, db *sql.DB) error { + for { + err := db.PingContext(ctx) + if err == nil { + return nil + } + if !isSQLiteBusy(err) { + return err + } + if ctx.Err() != nil { + return err + } + time.Sleep(25 * time.Millisecond) + } +} + +func execPragmaWithRetry(ctx context.Context, db *sql.DB, query string) error { + for { + _, err := db.ExecContext(ctx, query) + if err == nil { + return nil + } + if !isSQLiteBusy(err) { + return err + } + if ctx.Err() != nil { + return err + } + time.Sleep(25 * time.Millisecond) + } +} + +func isSQLiteBusy(err error) bool { + if err == nil { + return false + } + s := err.Error() + return strings.Contains(s, "SQLITE_BUSY") || strings.Contains(s, "database is locked") +} + // Close closes the underlying database connection. 
 func (i *Index) Close() error {
 	if i == nil || i.db == nil {
diff --git a/pkg/task/index/index_test.go b/pkg/task/index/index_test.go
index 84de0a1..c00e919 100644
--- a/pkg/task/index/index_test.go
+++ b/pkg/task/index/index_test.go
@@ -16,6 +16,7 @@ import (
 
 	"github.com/zippoxer/subtask/pkg/task"
 	"github.com/zippoxer/subtask/pkg/task/history"
+	"github.com/zippoxer/subtask/pkg/task/migrate/gitredesign"
 	"github.com/zippoxer/subtask/pkg/testutil"
 )
 
@@ -87,7 +88,7 @@ func TestIndex_Invalidation_TASKmd(t *testing.T) {
 		Title:       "Old title",
 		BaseBranch:  "main",
 		Description: "desc",
-		Schema:      1,
+		Schema:      gitredesign.TaskSchemaVersion,
 	}).Save())
 	require.NoError(t, history.WriteAll(name, []history.Event{{Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}}))
 
@@ -199,12 +200,12 @@ func TestIndex_CorruptDB_Rebuilds(t *testing.T) {
 		Title:       "Task",
 		BaseBranch:  "main",
 		Description: "desc",
-		Schema:      1,
+		Schema:      gitredesign.TaskSchemaVersion,
 	}).Save())
 	require.NoError(t, history.WriteAll(name, []history.Event{{Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}}))
 
 	// Write a corrupt "db".
-	require.NoError(t, os.WriteFile(filepath.Join(task.ProjectDir(), "index.db"), []byte("not a sqlite db"), 0o644))
+	require.NoError(t, os.WriteFile(task.IndexPath(), []byte("not a sqlite db"), 0o644))
 
 	idx, err := taskindex.OpenDefault()
 	require.NoError(t, err)
@@ -217,7 +218,7 @@ func TestIndex_CorruptDB_Rebuilds(t *testing.T) {
 	require.True(t, ok)
 
 	// Ensure the corrupt file was moved out of the way.
-	matches, err := filepath.Glob(filepath.Join(task.ProjectDir(), "index.db.corrupt-*"))
+	matches, err := filepath.Glob(task.IndexPath() + ".corrupt-*")
 	require.NoError(t, err)
 	require.NotEmpty(t, matches)
 }
diff --git a/pkg/task/index/integration.go b/pkg/task/index/integration.go
deleted file mode 100644
index 3036388..0000000
--- a/pkg/task/index/integration.go
+++ /dev/null
@@ -1,639 +0,0 @@
-package index
-
-import (
-	"context"
-	"crypto/sha256"
-	"database/sql"
-	"encoding/hex"
-	"encoding/json"
-	"fmt"
-	"sort"
-	"strings"
-	"time"
-
-	"github.com/zippoxer/subtask/pkg/git"
-	"github.com/zippoxer/subtask/pkg/logging"
-	"github.com/zippoxer/subtask/pkg/task"
-	"github.com/zippoxer/subtask/pkg/task/history"
-)
-
-type integrationTask struct {
-	name       string
-	baseBranch string
-	taskStatus task.TaskStatus
-
-	lastBranchHead string
-	integrated     string
-
-	integratedBranchHead string
-}
-
-type integrationUpdate struct {
-	name string
-
-	setLastHead bool
-	lastHead    sql.NullString
-
-	setIntegrated bool
-	reason        sql.NullString
-	branchHead    sql.NullString
-	targetHead    sql.NullString
-	checkedAtNS   sql.NullInt64
-}
-
-func (i *Index) refreshIntegration(ctx context.Context, p GitPolicy) error {
-	if !p.IncludeIntegration {
-		return nil
-	}
-	if ctx == nil {
-		ctx = context.Background()
-	}
-
-	debug := logging.DebugEnabled()
-	var start time.Time
-	if debug {
-		start = time.Now()
-		logging.Debug("integration", fmt.Sprintf("start mode=%s tasks=%d", gitModeString(p.Mode), len(p.Tasks)))
-	}
-
-	// Load tasks (from DB) and prior snapshot.
-	var step time.Time
-	if debug {
-		step = time.Now()
-	}
-	tasks, err := i.integrationTasks(ctx)
-	if err != nil {
-		return err
-	}
-	if debug {
-		logging.Debug("integration", fmt.Sprintf("integrationTasks n=%d (%s)", len(tasks), time.Since(step).Round(time.Millisecond)))
-		step = time.Now()
-	}
-	prevSnap, err := i.loadRefsSnapshot(ctx)
-	if err != nil {
-		return err
-	}
-	if debug {
-		logging.Debug("integration", fmt.Sprintf("loadRefsSnapshot ok hasSnapshot=%t (%s)", strings.TrimSpace(prevSnap.Hash) != "", time.Since(step).Round(time.Millisecond)))
-	}
-
-	// Build a repo-wide view of refs (single git call), then compute a stable snapshot
-	// for the refs we care about.
-	if debug {
-		step = time.Now()
-	}
-	allRefs, err := git.ListRefs(".", "refs/heads", "refs/remotes/origin")
-	if err != nil {
-		return err
-	}
-	nextSnap, desiredRefs := buildRefsSnapshot(tasks, allRefs)
-	if debug {
-		logging.Debug("integration", fmt.Sprintf("git.ListRefs refs=%d desiredRefs=%d (%s)", len(allRefs), len(desiredRefs), time.Since(step).Round(time.Millisecond)))
-	}
-
-	// Decide whether to run a repair pass.
-	forceTasks := p.Mode == GitTasks && len(p.Tasks) > 0
-	noSnapshot := prevSnap.Hash == ""
-	snapshotMismatch := prevSnap.Hash != "" && prevSnap.Hash != nextSnap.Hash
-
-	if !forceTasks && !snapshotMismatch && !noSnapshot {
-		// Snapshot matches and we're not being asked to recompute a specific task.
-		// Keep list/show fast.
-		if debug {
-			logging.Debug("integration", fmt.Sprintf("skip snapshot match (%s)", time.Since(start).Round(time.Millisecond)))
-		}
-		return nil
-	}
-
-	repairPass := noSnapshot && !forceTasks
-	if debug {
-		logging.Debug("integration", fmt.Sprintf("snapshot noSnapshot=%t mismatch=%t forceTasks=%t", noSnapshot, snapshotMismatch, forceTasks))
-	}
-	if forceTasks && snapshotMismatch {
-		// If we're being asked to refresh a specific task, we still must not "paper over"
-		// unrelated external ref changes by blindly updating the snapshot.
-		//
-		// If the snapshot changed only due to the requested task refs (and its base refs),
-		// proceed with a targeted refresh. Otherwise, fall back to a repair pass.
-		prevRefs, ok := parseRefsSnapshotJSON(prevSnap.JSON)
-		if !ok {
-			repairPass = true
-		} else {
-			allowedRefs := allowedRefsForForcedTasks(tasks, p.Tasks)
-			for _, ref := range desiredRefs {
-				if _, ok := allowedRefs[ref]; ok {
-					continue
-				}
-				if strings.TrimSpace(prevRefs[ref]) != strings.TrimSpace(nextSnap.Refs[ref]) {
-					repairPass = true
-					break
-				}
-			}
-		}
-	}
-
-	var targetTasks []integrationTask
-	if repairPass {
-		// Repair pass: recompute integration for all tasks that are not durable-merged.
-		//
-		// This is the "manual changes happened" path (external merges, force-pushes, etc.).
-		// It can be slower, but it guarantees correctness (including clearing stale
-		// integration results).
-		for _, t := range tasks {
-			if t.taskStatus == task.TaskStatusMerged {
-				continue
-			}
-			targetTasks = append(targetTasks, t)
-		}
-	} else if forceTasks {
-		allow := make(map[string]struct{}, len(p.Tasks))
-		for _, n := range p.Tasks {
-			allow[n] = struct{}{}
-		}
-		for _, t := range tasks {
-			if _, ok := allow[t.name]; ok {
-				targetTasks = append(targetTasks, t)
-			}
-		}
-	} else if snapshotMismatch {
-		prevRefs, ok := parseRefsSnapshotJSON(prevSnap.JSON)
-		if !ok {
-			repairPass = true
-			for _, t := range tasks {
-				if t.taskStatus == task.TaskStatusMerged {
-					continue
-				}
-				targetTasks = append(targetTasks, t)
-			}
-		} else {
-			// Snapshot changed: only recompute tasks whose relevant refs changed.
-			//
-			// - task branch head changed -> recompute that task.
-			// - base branch (local or origin) changed -> recompute all tasks using that base.
-			changedTaskNames := make(map[string]struct{})
-			changedBaseBranches := make(map[string]struct{})
-
-			for _, ref := range desiredRefs {
-				prev := strings.TrimSpace(prevRefs[ref])
-				next := strings.TrimSpace(nextSnap.Refs[ref])
-				if prev == next {
-					continue
-				}
-
-				if strings.HasPrefix(ref, "refs/heads/") {
-					name := strings.TrimPrefix(ref, "refs/heads/")
-					if strings.TrimSpace(name) != "" {
-						changedTaskNames[name] = struct{}{}
-						changedBaseBranches[name] = struct{}{}
-					}
-					continue
-				}
-				if strings.HasPrefix(ref, "refs/remotes/origin/") {
-					base := strings.TrimPrefix(ref, "refs/remotes/origin/")
-					if strings.TrimSpace(base) != "" {
-						changedBaseBranches[base] = struct{}{}
-					}
-					continue
-				}
-			}
-
-			seen := make(map[string]struct{})
-			for _, t := range tasks {
-				if t.taskStatus == task.TaskStatusMerged {
-					continue
-				}
-
-				_, taskChanged := changedTaskNames[t.name]
-				base := strings.TrimSpace(t.baseBranch)
-				_, baseChanged := changedBaseBranches[base]
-
-				if !taskChanged && !baseChanged {
-					continue
-				}
-				if _, ok := seen[t.name]; ok {
-					continue
-				}
-				seen[t.name] = struct{}{}
-				targetTasks = append(targetTasks, t)
-			}
-
-			// The snapshot may include refs for tasks that no longer exist in the DB
-			// (e.g. after deleting a task). In that case there can be no target tasks.
-			// Still persist the updated snapshot so list/show stays fast.
-			if len(targetTasks) == 0 {
-				if debug {
-					logging.Debug("integration", fmt.Sprintf("snapshot updated (no target tasks) (%s)", time.Since(start).Round(time.Millisecond)))
-				}
-				return i.persistSnapshotOnly(ctx, nextSnap)
-			}
-		}
-	}
-	if debug {
-		reason := "targeted"
-		if repairPass {
-			reason = "repair-pass"
-		} else if forceTasks {
-			reason = "force-tasks"
-		} else if snapshotMismatch {
-			reason = "snapshot-diff"
-		}
-		logging.Debug("integration", fmt.Sprintf("targetTasks n=%d reason=%s", len(targetTasks), reason))
-	}
-
-	// Group by base branch to amortize target tree lookups.
-	type targetInfo struct {
-		refName string
-		headSHA string
-		treeSHA string
-	}
-	targetByBase := make(map[string]targetInfo)
-	baseBranches := make([]string, 0, len(targetTasks))
-	seenBase := make(map[string]struct{})
-	for _, t := range targetTasks {
-		b := strings.TrimSpace(t.baseBranch)
-		if b == "" {
-			continue
-		}
-		if _, ok := seenBase[b]; ok {
-			continue
-		}
-		seenBase[b] = struct{}{}
-		baseBranches = append(baseBranches, b)
-	}
-	sort.Strings(baseBranches)
-
-	if debug {
-		step = time.Now()
-	}
-	for _, base := range baseBranches {
-		ref := git.EffectiveTarget(".", base)
-		refName := refToFullRefName(ref)
-		head := strings.TrimSpace(nextSnap.Refs[refName])
-		if head == "" {
-			// Fallback: try resolving directly.
-			h, err := git.Output(".", "rev-parse", ref)
-			if err != nil {
-				continue
-			}
-			head = strings.TrimSpace(h)
-		}
-		tree, err := git.Output(".", "rev-parse", ref+"^{tree}")
-		if err != nil {
-			continue
-		}
-		targetByBase[base] = targetInfo{refName: refName, headSHA: head, treeSHA: strings.TrimSpace(tree)}
-	}
-	if debug {
-		logging.Debug("integration", fmt.Sprintf("baseBranches n=%d (%s)", len(baseBranches), time.Since(step).Round(time.Millisecond)))
-	}
-
-	nowNS := i.now().UnixNano()
-	now := i.now().UTC()
-	updates := make([]integrationUpdate, 0, len(targetTasks))
-	var closedToMerged []integrationUpdate
-
-	// Precompute current branch heads for all tasks from desired refs.
-	for _, t := range targetTasks {
-		branchRef := "refs/heads/" + t.name
-		head := strings.TrimSpace(nextSnap.Refs[branchRef])
-		if head == "" && strings.TrimSpace(t.lastBranchHead) != "" {
-			head = strings.TrimSpace(t.lastBranchHead)
-		}
-
-		u := integrationUpdate{name: t.name}
-		if strings.TrimSpace(nextSnap.Refs[branchRef]) != "" {
-			u.setLastHead = true
-			u.lastHead = sql.NullString{String: head, Valid: head != ""}
-		}
-
-		if head == "" {
-			// No known head: do not modify integration status.
-			updates = append(updates, u)
-			continue
-		}
-
-		ti, ok := targetByBase[strings.TrimSpace(t.baseBranch)]
-		if !ok || ti.headSHA == "" || ti.treeSHA == "" {
-			// No known base head/tree: do not modify integration status.
-			updates = append(updates, u)
-			continue
-		}
-
-		// 1) Ancestor check (guarantee for history-preserving merges).
-		if git.RunQuiet(".", "merge-base", "--is-ancestor", head, ti.headSHA) == nil {
-			u.setIntegrated = true
-			u.reason = sql.NullString{String: string(git.IntegratedAncestor), Valid: true}
-			u.branchHead = sql.NullString{String: head, Valid: true}
-			u.targetHead = sql.NullString{String: ti.headSHA, Valid: true}
-			u.checkedAtNS = sql.NullInt64{Int64: nowNS, Valid: true}
-			if t.taskStatus == task.TaskStatusClosed {
-				closedToMerged = append(closedToMerged, u)
-			}
-			updates = append(updates, u)
-			continue
-		}
-
-		// 2) No-op merge check (guarantee for content integration).
-		mergeTree, err := git.Output(".", "merge-tree", "--write-tree", ti.headSHA, head)
-		if err == nil && strings.TrimSpace(mergeTree) == ti.treeSHA {
-			u.setIntegrated = true
-			u.reason = sql.NullString{String: string(git.IntegratedMergeAddsNothing), Valid: true}
-			u.branchHead = sql.NullString{String: head, Valid: true}
-			u.targetHead = sql.NullString{String: ti.headSHA, Valid: true}
-			u.checkedAtNS = sql.NullInt64{Int64: nowNS, Valid: true}
-			if t.taskStatus == task.TaskStatusClosed {
-				closedToMerged = append(closedToMerged, u)
-			}
-		} else if err == nil && strings.TrimSpace(t.integrated) != "" {
-			// We have enough info to decide "not integrated". Clear any stale cached integration.
-			u.setIntegrated = true
-			u.reason = sql.NullString{Valid: false}
-			u.branchHead = sql.NullString{Valid: false}
-			u.targetHead = sql.NullString{Valid: false}
-			u.checkedAtNS = sql.NullInt64{Valid: false}
-		}
-
-		updates = append(updates, u)
-	}
-
-	// Persist updates + snapshot in one short transaction.
-	tx, err := i.db.BeginTx(ctx, nil)
-	if err != nil {
-		return fmt.Errorf("index integration refresh: begin tx: %w", err)
-	}
-	defer tx.Rollback()
-
-	if err := upsertIntegrationUpdates(ctx, tx, updates); err != nil {
-		return err
-	}
-	if err := saveRefsSnapshot(ctx, tx, nextSnap.Hash, nextSnap.JSON, i.now()); err != nil {
-		return err
-	}
-
-	if err := tx.Commit(); err != nil {
-		return fmt.Errorf("index integration refresh: commit: %w", err)
-	}
-
-	if err := i.promoteClosedTasksToMerged(closedToMerged, now); err != nil {
-		return err
-	}
-
-	if debug {
-		logging.Debug("integration", fmt.Sprintf("done updates=%d (%s)", len(updates), time.Since(start).Round(time.Millisecond)))
-	}
-	return nil
-}
-
-func (i *Index) promoteClosedTasksToMerged(updates []integrationUpdate, now time.Time) error {
-	if len(updates) == 0 {
-		return nil
-	}
-
-	for _, u := range updates {
-		if !u.setIntegrated || !u.reason.Valid || !u.targetHead.Valid {
-			continue
-		}
-
-		taskName := strings.TrimSpace(u.name)
-		if taskName == "" {
-			continue
-		}
-
-		locked, err := task.TryWithLock(taskName, func() error {
-			tail, err := history.Tail(taskName)
-			if err != nil {
-				return err
-			}
-			if tail.TaskStatus != task.TaskStatusClosed {
-				return nil
-			}
-
-			data, _ := json.Marshal(map[string]any{
-				"commit":            strings.TrimSpace(u.targetHead.String),
-				"into":              strings.TrimSpace(tail.BaseBranch),
-				"branch":            taskName,
-				"via":               "detected",
-				"integrated_reason": strings.TrimSpace(u.reason.String),
-				"branch_head":       strings.TrimSpace(u.branchHead.String),
-				"target_head":       strings.TrimSpace(u.targetHead.String),
-			})
-			_ = history.AppendLocked(taskName, history.Event{
-				Type: "task.merged",
-				Data: data,
-				TS:   now,
-			})
-			return nil
-		})
-		if err != nil {
-			return err
-		}
-		if !locked {
-			// Best-effort: task is busy; we'll retry on the next refresh.
-			continue
-		}
-	}
-
-	return nil
-}
-
-func (i *Index) persistSnapshotOnly(ctx context.Context, snap computedSnapshot) error {
-	tx, err := i.db.BeginTx(ctx, nil)
-	if err != nil {
-		return fmt.Errorf("index snapshot: begin tx: %w", err)
-	}
-	defer tx.Rollback()
-	if err := saveRefsSnapshot(ctx, tx, snap.Hash, snap.JSON, i.now()); err != nil {
-		return err
-	}
-	if err := tx.Commit(); err != nil {
-		return fmt.Errorf("index snapshot: commit: %w", err)
-	}
-	return nil
-}
-
-func upsertIntegrationUpdates(ctx context.Context, tx *sql.Tx, updates []integrationUpdate) error {
-	if len(updates) == 0 {
-		return nil
-	}
-	stmt, err := tx.PrepareContext(ctx, `
-UPDATE tasks SET
-	git_last_branch_head = CASE WHEN ? THEN ? ELSE git_last_branch_head END,
-	git_integrated_reason = CASE WHEN ? THEN ? ELSE git_integrated_reason END,
-	git_integrated_branch_head = CASE WHEN ? THEN ? ELSE git_integrated_branch_head END,
-	git_integrated_target_head = CASE WHEN ? THEN ? ELSE git_integrated_target_head END,
-	git_integrated_checked_at_ns = CASE WHEN ? THEN ? ELSE git_integrated_checked_at_ns END
-WHERE name = ?;`)
-	if err != nil {
-		return fmt.Errorf("index integration refresh: prepare update: %w", err)
-	}
-	defer stmt.Close()
-
-	for _, u := range updates {
-		if _, err := stmt.ExecContext(ctx,
-			boolToInt(u.setLastHead),
-			nullableNullString(u.lastHead),
-			boolToInt(u.setIntegrated),
-			nullableNullString(u.reason),
-			boolToInt(u.setIntegrated),
-			nullableNullString(u.branchHead),
-			boolToInt(u.setIntegrated),
-			nullableNullString(u.targetHead),
-			boolToInt(u.setIntegrated),
-			nullableNullInt64(u.checkedAtNS),
-			u.name,
-		); err != nil {
-			return fmt.Errorf("index integration refresh: update %q: %w", u.name, err)
-		}
-	}
-	return nil
-}
-
-func nullableNullString(s sql.NullString) any {
-	if !s.Valid {
-		return nil
-	}
-	return s.String
-}
-
-func nullableNullInt64(n sql.NullInt64) any {
-	if !n.Valid {
-		return nil
-	}
-	return n.Int64
-}
-
-func (i *Index) integrationTasks(ctx context.Context) ([]integrationTask, error) {
-	rows, err := i.db.QueryContext(ctx, `
-SELECT name, base_branch, task_status, git_last_branch_head, git_integrated_reason, git_integrated_branch_head
-FROM tasks;`)
-	if err != nil {
-		return nil, fmt.Errorf("index integration refresh: query tasks: %w", err)
-	}
-	defer rows.Close()
-
-	var out []integrationTask
-	for rows.Next() {
-		var (
-			t              integrationTask
-			ts             string
-			lastHead       sql.NullString
-			integrated     sql.NullString
-			integratedHead sql.NullString
-		)
-		if err := rows.Scan(&t.name, &t.baseBranch, &ts, &lastHead, &integrated, &integratedHead); err != nil {
-			return nil, fmt.Errorf("index integration refresh: scan task: %w", err)
-		}
-		t.taskStatus = task.TaskStatus(ts)
-		if lastHead.Valid {
-			t.lastBranchHead = lastHead.String
-		}
-		if integrated.Valid {
-			t.integrated = integrated.String
-		}
-		if integratedHead.Valid {
-			t.integratedBranchHead = integratedHead.String
-		}
-		out = append(out, t)
-	}
-	if err := rows.Err(); err != nil {
-		return nil, fmt.Errorf("index integration refresh: iterate tasks: %w", err)
-	}
-	return out, nil
-}
-
-type computedSnapshot struct {
-	Hash string
-	JSON string
-	Refs map[string]string
-}
-
-func buildRefsSnapshot(tasks []integrationTask, allRefs map[string]string) (computedSnapshot, []string) {
-	desired := make(map[string]struct{})
-	for _, t := range tasks {
-		desired["refs/heads/"+t.name] = struct{}{}
-		b := strings.TrimSpace(t.baseBranch)
-		if b == "" {
-			continue
-		}
-		desired["refs/heads/"+b] = struct{}{}
-		desired["refs/remotes/origin/"+b] = struct{}{}
-	}
-
-	refs := make(map[string]string, len(desired))
-	desiredList := make([]string, 0, len(desired))
-	for r := range desired {
-		desiredList = append(desiredList, r)
-	}
-	sort.Strings(desiredList)
-
-	var b strings.Builder
-	for _, r := range desiredList {
-		sha := strings.TrimSpace(allRefs[r])
-		refs[r] = sha
-		b.WriteString(r)
-		b.WriteByte('\x00')
-		b.WriteString(sha)
-		b.WriteByte('\n')
-	}
-	sum := sha256.Sum256([]byte(b.String()))
-	hash := hex.EncodeToString(sum[:])
-
-	js, _ := json.Marshal(refs)
-
-	return computedSnapshot{
-		Hash: hash,
-		JSON: string(js),
-		Refs: refs,
-	}, desiredList
-}
-
-func refToFullRefName(ref string) string {
-	ref = strings.TrimSpace(ref)
-	if strings.HasPrefix(ref, "origin/") {
-		return "refs/remotes/" + ref
-	}
-	if strings.HasPrefix(ref, "refs/") {
-		return ref
-	}
-	return "refs/heads/" + ref
-}
-
-func parseRefsSnapshotJSON(js string) (map[string]string, bool) {
-	js = strings.TrimSpace(js)
-	if js == "" {
-		return map[string]string{}, true
-	}
-	var m map[string]string
-	if err := json.Unmarshal([]byte(js), &m); err != nil {
-		return nil, false
-	}
-	return m, true
-}
-
-func allowedRefsForForcedTasks(allTasks []integrationTask, forcedTaskNames []string) map[string]struct{} {
-	allowed := make(map[string]struct{})
-
-	forced := make(map[string]struct{}, len(forcedTaskNames))
-	for _, n := range forcedTaskNames {
-		n = strings.TrimSpace(n)
-		if n == "" {
-			continue
-		}
-		forced[n] = struct{}{}
-		allowed["refs/heads/"+n] = struct{}{}
-	}
-
-	for _, t := range allTasks {
-		if _, ok := forced[t.name]; !ok {
-			continue
-		}
-		b := strings.TrimSpace(t.baseBranch)
-		if b == "" {
-			continue
-		}
-		allowed["refs/heads/"+b] = struct{}{}
-		allowed["refs/remotes/origin/"+b] = struct{}{}
-	}
-
-	return allowed
-}
diff --git a/pkg/task/index/integration_refresh_test.go b/pkg/task/index/integration_refresh_test.go
deleted file mode 100644
index 0eb5889..0000000
--- a/pkg/task/index/integration_refresh_test.go
+++ /dev/null
@@ -1,317 +0,0 @@
-package index_test
-
-import (
-	"context"
-	"database/sql"
-	"fmt"
-	"os"
-	"os/exec"
-	"path/filepath"
-	"strings"
-	"testing"
-
-	"github.com/stretchr/testify/require"
-
-	"github.com/zippoxer/subtask/pkg/git"
-	"github.com/zippoxer/subtask/pkg/task/history"
-	"github.com/zippoxer/subtask/pkg/testutil"
-
-	taskindex "github.com/zippoxer/subtask/pkg/task/index"
-)
-
-func gitOut(t *testing.T, dir string, args ...string) string {
-	t.Helper()
-	cmd := exec.Command("git", args...)
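The deleted `buildRefsSnapshot` above hashes a sorted ref-to-SHA listing so any ref movement changes a single comparable value. The core idea can be sketched in isolation; `snapshotHash` below is a hypothetical reduction (the real function also returns the desired-ref list and the JSON form):

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"sort"
	"strings"
)

// snapshotHash serializes refs in sorted order with NUL/newline separators
// (neither byte can appear in a ref name or SHA), then hashes the result.
// Identical ref states yield identical hashes regardless of map iteration
// order, so a single string comparison detects "anything moved".
func snapshotHash(refs map[string]string) string {
	names := make([]string, 0, len(refs))
	for r := range refs {
		names = append(names, r)
	}
	sort.Strings(names)

	var b strings.Builder
	for _, r := range names {
		b.WriteString(r)
		b.WriteByte('\x00')
		b.WriteString(strings.TrimSpace(refs[r]))
		b.WriteByte('\n')
	}
	sum := sha256.Sum256([]byte(b.String()))
	return hex.EncodeToString(sum[:])
}

func main() {
	a := map[string]string{
		"refs/heads/main":   "1111",
		"refs/heads/task-x": "2222",
	}
	b := map[string]string{
		"refs/heads/task-x": "2222",
		"refs/heads/main":   "1111",
	}
	fmt.Println(snapshotHash(a) == snapshotHash(b)) // true
	b["refs/heads/task-x"] = "3333"
	fmt.Println(snapshotHash(a) == snapshotHash(b)) // false
}
```

The separator choice is what makes the encoding unambiguous: plain concatenation would let different ref/SHA pairs collide into the same byte stream.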
-	cmd.Dir = dir
-	out, err := cmd.CombinedOutput()
-	require.NoError(t, err, "git %v failed: %s", args, out)
-	return strings.TrimSpace(string(out))
-}
-
-func withGitSubcommandSpy(t *testing.T, fn func()) map[string]int {
-	t.Helper()
-
-	realGit, err := exec.LookPath("git")
-	require.NoError(t, err)
-
-	tmp := t.TempDir()
-	logPath := filepath.Join(tmp, "git-subcommands.log")
-	wrapperPath := filepath.Join(tmp, "git")
-
-	script := fmt.Sprintf(`#!/bin/sh
-cmd="$1"
-shift
-case "$cmd" in
-  merge-base|merge-tree)
-    echo "$cmd" >> %q
-    ;;
-esac
-exec %q "$cmd" "$@"
-`, logPath, realGit)
-	require.NoError(t, os.WriteFile(wrapperPath, []byte(script), 0o755))
-
-	oldPath := os.Getenv("PATH")
-	require.NoError(t, os.Setenv("PATH", tmp+string(os.PathListSeparator)+oldPath))
-	defer func() { _ = os.Setenv("PATH", oldPath) }()
-
-	fn()
-
-	data, err := os.ReadFile(logPath)
-	if err != nil && !os.IsNotExist(err) {
-		require.NoError(t, err)
-	}
-
-	out := make(map[string]int)
-	for _, line := range strings.Split(string(data), "\n") {
-		line = strings.TrimSpace(line)
-		if line == "" {
-			continue
-		}
-		out[line]++
-	}
-	return out
-}
-
-func TestIndex_IntegrationRefresh_Ancestor(t *testing.T) {
-	env := testutil.NewTestEnv(t, 1)
-	ctx := context.Background()
-
-	name := "idx/ancestor"
-	env.CreateTask(name, "Ancestor", "main", "desc")
-	env.CreateTaskHistory(name, []history.Event{{Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}})
-
-	ws := env.Workspaces[0]
-	gitOut(t, ws, "checkout", "-b", name)
-	require.NoError(t, os.WriteFile(filepath.Join(ws, "a.txt"), []byte("a\n"), 0o644))
-	gitOut(t, ws, "add", "a.txt")
-	gitOut(t, ws, "commit", "-m", "a")
-
-	idx, err := taskindex.OpenDefault()
-	require.NoError(t, err)
-	t.Cleanup(func() { _ = idx.Close() })
-
-	// Prime snapshot.
-	require.NoError(t, idx.Refresh(ctx, taskindex.RefreshPolicy{
-		Git: taskindex.GitPolicy{Mode: taskindex.GitOpenOnly, IncludeIntegration: true},
-	}))
-
-	// External merge (history-preserving).
-	gitOut(t, env.RootDir, "checkout", "main")
-	gitOut(t, env.RootDir, "merge", "--no-ff", name, "-m", "Merge "+name)
-
-	require.NoError(t, idx.Refresh(ctx, taskindex.RefreshPolicy{
-		Git: taskindex.GitPolicy{Mode: taskindex.GitOpenOnly, IncludeIntegration: true},
-	}))
-
-	rec, ok, err := idx.Get(ctx, name)
-	require.NoError(t, err)
-	require.True(t, ok)
-	require.Equal(t, string(git.IntegratedAncestor), strings.TrimSpace(rec.IntegratedReason))
-}
-
-func TestIndex_IntegrationRefresh_Squash(t *testing.T) {
-	env := testutil.NewTestEnv(t, 1)
-	ctx := context.Background()
-
-	name := "idx/squash"
-	env.CreateTask(name, "Squash", "main", "desc")
-	env.CreateTaskHistory(name, []history.Event{{Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}})
-
-	ws := env.Workspaces[0]
-	gitOut(t, ws, "checkout", "-b", name)
-	require.NoError(t, os.WriteFile(filepath.Join(ws, "s.txt"), []byte("s\n"), 0o644))
-	gitOut(t, ws, "add", "s.txt")
-	gitOut(t, ws, "commit", "-m", "s")
-
-	idx, err := taskindex.OpenDefault()
-	require.NoError(t, err)
-	t.Cleanup(func() { _ = idx.Close() })
-
-	// Prime snapshot.
-	require.NoError(t, idx.Refresh(ctx, taskindex.RefreshPolicy{
-		Git: taskindex.GitPolicy{Mode: taskindex.GitOpenOnly, IncludeIntegration: true},
-	}))
-
-	// External squash merge.
-	gitOut(t, env.RootDir, "checkout", "main")
-	gitOut(t, env.RootDir, "merge", "--squash", name)
-	gitOut(t, env.RootDir, "commit", "-m", "Squash "+name)
-
-	require.NoError(t, idx.Refresh(ctx, taskindex.RefreshPolicy{
-		Git: taskindex.GitPolicy{Mode: taskindex.GitOpenOnly, IncludeIntegration: true},
-	}))
-
-	rec, ok, err := idx.Get(ctx, name)
-	require.NoError(t, err)
-	require.True(t, ok)
-	require.Equal(t, string(git.IntegratedMergeAddsNothing), strings.TrimSpace(rec.IntegratedReason))
-}
-
-func TestIndex_IntegrationRefresh_SnapshotMismatch_RecomputesChangedTaskOnly(t *testing.T) {
-	env := testutil.NewTestEnv(t, 1)
-	ctx := context.Background()
-
-	a := "idx/changed-a"
-	b := "idx/changed-b"
-	env.CreateTask(a, "A", "main", "desc")
-	env.CreateTask(b, "B", "main", "desc")
-	env.CreateTaskHistory(a, []history.Event{{Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}})
-	env.CreateTaskHistory(b, []history.Event{{Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}})
-
-	ws := env.Workspaces[0]
-
-	// Create and merge A.
-	gitOut(t, ws, "checkout", "-b", a)
-	require.NoError(t, os.WriteFile(filepath.Join(ws, "a.txt"), []byte("a\n"), 0o644))
-	gitOut(t, ws, "add", "a.txt")
-	gitOut(t, ws, "commit", "-m", "a1")
-	gitOut(t, env.RootDir, "checkout", "main")
-	gitOut(t, env.RootDir, "merge", "--no-ff", a, "-m", "Merge "+a)
-
-	// Create and merge B.
-	gitOut(t, ws, "checkout", "--detach")
-	gitOut(t, ws, "checkout", "-b", b)
-	require.NoError(t, os.WriteFile(filepath.Join(ws, "b.txt"), []byte("b\n"), 0o644))
-	gitOut(t, ws, "add", "b.txt")
-	gitOut(t, ws, "commit", "-m", "b1")
-	gitOut(t, env.RootDir, "checkout", "main")
-	gitOut(t, env.RootDir, "merge", "--no-ff", b, "-m", "Merge "+b)
-
-	idx, err := taskindex.OpenDefault()
-	require.NoError(t, err)
-	t.Cleanup(func() { _ = idx.Close() })
-
-	// Prime snapshot.
-	require.NoError(t, idx.Refresh(ctx, taskindex.RefreshPolicy{
-		Git: taskindex.GitPolicy{Mode: taskindex.GitOpenOnly, IncludeIntegration: true},
-	}))
-
-	// Move only A (not integrated anymore).
-	gitOut(t, ws, "checkout", a)
-	require.NoError(t, os.WriteFile(filepath.Join(ws, "a.txt"), []byte("a2\n"), 0o644))
-	gitOut(t, ws, "add", "a.txt")
-	gitOut(t, ws, "commit", "-m", "a2")
-
-	counts := withGitSubcommandSpy(t, func() {
-		require.NoError(t, idx.Refresh(ctx, taskindex.RefreshPolicy{
-			Git: taskindex.GitPolicy{Mode: taskindex.GitOpenOnly, IncludeIntegration: true},
-		}))
-	})
-
-	// Only A should be recomputed (B is unaffected), so we expect a single per-task
-	// ancestor check and a single merge-tree check for the non-ancestor case.
-	require.Equal(t, 1, counts["merge-base"])
-	require.Equal(t, 1, counts["merge-tree"])
-}
-
-func TestIndex_IntegrationForceTasks_DoesNotHideUnrelatedRefChanges(t *testing.T) {
-	env := testutil.NewTestEnv(t, 1)
-	ctx := context.Background()
-
-	a := "idx/force-a"
-	b := "idx/force-b"
-	env.CreateTask(a, "A", "main", "desc")
-	env.CreateTask(b, "B", "main", "desc")
-	env.CreateTaskHistory(a, []history.Event{{Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}})
-	env.CreateTaskHistory(b, []history.Event{{Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}})
-
-	ws := env.Workspaces[0]
-	gitOut(t, ws, "checkout", "-b", a)
-	require.NoError(t, os.WriteFile(filepath.Join(ws, "a.txt"), []byte("a\n"), 0o644))
-	gitOut(t, ws, "add", "a.txt")
-	gitOut(t, ws, "commit", "-m", "a")
-
-	gitOut(t, ws, "checkout", "--detach")
-	gitOut(t, ws, "checkout", "-b", b)
-	require.NoError(t, os.WriteFile(filepath.Join(ws, "b.txt"), []byte("b\n"), 0o644))
-	gitOut(t, ws, "add", "b.txt")
-	gitOut(t, ws, "commit", "-m", "b1")
-
-	idx, err := taskindex.OpenDefault()
-	require.NoError(t, err)
-	t.Cleanup(func() { _ = idx.Close() })
-
-	// Prime snapshot.
-	require.NoError(t, idx.Refresh(ctx, taskindex.RefreshPolicy{
-		Git: taskindex.GitPolicy{Mode: taskindex.GitOpenOnly, IncludeIntegration: true},
-	}))
-
-	// External change on b: new commit.
-	require.NoError(t, os.WriteFile(filepath.Join(ws, "b.txt"), []byte("b2\n"), 0o644))
-	gitOut(t, ws, "add", "b.txt")
-	gitOut(t, ws, "commit", "-m", "b2")
-	bHead := gitOut(t, env.RootDir, "rev-parse", b)
-
-	// Force refresh for a only; must not overwrite snapshot without accounting for b.
-	require.NoError(t, idx.Refresh(ctx, taskindex.RefreshPolicy{
-		Git: taskindex.GitPolicy{
-			Mode:               taskindex.GitTasks,
-			Tasks:              []string{a},
-			IncludeIntegration: true,
-		},
-	}))
-
-	db, err := sql.Open("sqlite", filepath.Join(env.RootDir, ".subtask", "index.db"))
-	require.NoError(t, err)
-	t.Cleanup(func() { _ = db.Close() })
-
-	var got sql.NullString
-	require.NoError(t, db.QueryRow(`SELECT git_last_branch_head FROM tasks WHERE name = ?;`, b).Scan(&got))
-	require.True(t, got.Valid)
-	require.Equal(t, bHead, strings.TrimSpace(got.String))
-}
-
-func TestIndex_IntegrationForceTasks_ClearsStaleWhenBranchMoves(t *testing.T) {
-	env := testutil.NewTestEnv(t, 1)
-	ctx := context.Background()
-
-	name := "idx/clear-stale"
-	env.CreateTask(name, "Clear stale", "main", "desc")
-	env.CreateTaskHistory(name, []history.Event{{Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}})
-
-	ws := env.Workspaces[0]
-	gitOut(t, ws, "checkout", "-b", name)
-	require.NoError(t, os.WriteFile(filepath.Join(ws, "x.txt"), []byte("x\n"), 0o644))
-	gitOut(t, ws, "add", "x.txt")
-	gitOut(t, ws, "commit", "-m", "x")
-
-	idx, err := taskindex.OpenDefault()
-	require.NoError(t, err)
-	t.Cleanup(func() { _ = idx.Close() })
-
-	// Prime snapshot.
-	require.NoError(t, idx.Refresh(ctx, taskindex.RefreshPolicy{
-		Git: taskindex.GitPolicy{Mode: taskindex.GitOpenOnly, IncludeIntegration: true},
-	}))
-
-	// External merge (history-preserving) so ancestor check is true.
-	gitOut(t, env.RootDir, "checkout", "main")
-	gitOut(t, env.RootDir, "merge", "--no-ff", name, "-m", "Merge "+name)
-
-	require.NoError(t, idx.Refresh(ctx, taskindex.RefreshPolicy{
-		Git: taskindex.GitPolicy{Mode: taskindex.GitOpenOnly, IncludeIntegration: true},
-	}))
-	rec, ok, err := idx.Get(ctx, name)
-	require.NoError(t, err)
-	require.True(t, ok)
-	require.NotEmpty(t, strings.TrimSpace(rec.IntegratedReason))
-
-	// Branch moves after being integrated: new commit not in main.
-	gitOut(t, ws, "checkout", name)
-	require.NoError(t, os.WriteFile(filepath.Join(ws, "x.txt"), []byte("x2\n"), 0o644))
-	gitOut(t, ws, "add", "x.txt")
-	gitOut(t, ws, "commit", "-m", "x2")
-
-	// Force refresh for this task (send-path behavior) must clear stale integration.
-	require.NoError(t, idx.Refresh(ctx, taskindex.RefreshPolicy{
-		Git: taskindex.GitPolicy{
-			Mode:               taskindex.GitTasks,
-			Tasks:              []string{name},
-			IncludeIntegration: true,
-		},
-	}))
-	rec, ok, err = idx.Get(ctx, name)
-	require.NoError(t, err)
-	require.True(t, ok)
-	require.Empty(t, strings.TrimSpace(rec.IntegratedReason))
-}
diff --git a/pkg/task/index/meta.go b/pkg/task/index/meta.go
deleted file mode 100644
index c6e0502..0000000
--- a/pkg/task/index/meta.go
+++ /dev/null
@@ -1,51 +0,0 @@
-package index
-
-import (
-	"context"
-	"database/sql"
-	"fmt"
-	"time"
-)
-
-type refsSnapshot struct {
-	Hash string
-	JSON string
-	AtNS int64
-}
-
-func (i *Index) loadRefsSnapshot(ctx context.Context) (refsSnapshot, error) {
-	var snap refsSnapshot
-	var (
-		hash sql.NullString
-		js   sql.NullString
-		at   sql.NullInt64
-	)
-	err := i.db.QueryRowContext(ctx, `SELECT git_refs_snapshot_hash, git_refs_snapshot_json, git_refs_snapshot_at_ns FROM index_meta WHERE id = 1;`).
-		Scan(&hash, &js, &at)
-	if err != nil {
-		return refsSnapshot{}, fmt.Errorf("load index meta: %w", err)
-	}
-	if hash.Valid {
-		snap.Hash = hash.String
-	}
-	if js.Valid {
-		snap.JSON = js.String
-	}
-	if at.Valid {
-		snap.AtNS = at.Int64
-	}
-	return snap, nil
-}
-
-func saveRefsSnapshot(ctx context.Context, tx *sql.Tx, hash, js string, now time.Time) error {
-	_, err := tx.ExecContext(ctx, `
-UPDATE index_meta SET
-	git_refs_snapshot_hash = ?,
-	git_refs_snapshot_json = ?,
-	git_refs_snapshot_at_ns = ?
-WHERE id = 1;`, nullableString(hash), nullableString(js), now.UnixNano())
-	if err != nil {
-		return fmt.Errorf("save index meta: %w", err)
-	}
-	return nil
-}
diff --git a/pkg/task/index/query.go b/pkg/task/index/query.go
index 5a2200b..2bb9765 100644
--- a/pkg/task/index/query.go
+++ b/pkg/task/index/query.go
@@ -15,6 +15,7 @@ type ListItem struct {
 	Title      string
 	FollowUp   string
 	BaseBranch string
+	BaseCommit string
 
 	TaskStatus   task.TaskStatus
 	WorkerStatus task.WorkerStatus
@@ -33,17 +34,16 @@ type ListItem struct {
 	ProgressDone  int
 	ProgressTotal int
 
-	LinesAdded    int
-	LinesRemoved  int
-	CommitsBehind int
-
-	IntegratedReason string
+	LinesAdded   int
+	LinesRemoved int
 }
 
 // Record is the cached file-backed data for a single task.
 type Record struct {
 	Task *task.Task
 
+	BaseCommit string
+
 	TaskStatus   task.TaskStatus
 	WorkerStatus task.WorkerStatus
 	Stage        string
@@ -60,9 +60,28 @@ type Record struct {
 	LinesAdded        int
 	LinesRemoved      int
-	CommitsBehind     int
 	ConflictFilesJSON string
-	IntegratedReason  string
+
+	// Integration (content detection) cache keyed by (base_head, branch_head).
+	IntegratedReason      string
+	IntegratedBranchHead  string
+	IntegratedTargetHead  string
+	IntegratedCheckedAtNS int64
+
+	// Git redesign cache fields (input-based invalidation).
+ BranchHead string + BaseHead string + + ChangesAdded int + ChangesRemoved int + ChangesBaseCommit string + ChangesBranchHead string + + CommitCount int + CommitCountBaseCommit string + CommitCountBranchHead string + + CommitLogLastHead string } func (i *Index) ListAll(ctx context.Context) ([]ListItem, error) { @@ -71,15 +90,14 @@ func (i *Index) ListAll(ctx context.Context) ([]ListItem, error) { } const q = ` SELECT - name, title, follow_up, base_branch, + name, title, follow_up, base_branch, base_commit, task_status, worker_status, stage, workspace, started_at_ns, last_error, last_history_ns, last_active_ns, tool_calls, last_run_duration_ms, progress_done, progress_total, - git_lines_added, git_lines_removed, git_commits_behind, - git_integrated_reason + git_lines_added, git_lines_removed FROM tasks ORDER BY last_history_ns DESC, name ASC; ` @@ -92,15 +110,14 @@ func (i *Index) ListOpen(ctx context.Context) ([]ListItem, error) { } const q = ` SELECT - name, title, follow_up, base_branch, + name, title, follow_up, base_branch, base_commit, task_status, worker_status, stage, workspace, started_at_ns, last_error, last_history_ns, last_active_ns, tool_calls, last_run_duration_ms, progress_done, progress_total, - git_lines_added, git_lines_removed, git_commits_behind, - git_integrated_reason + git_lines_added, git_lines_removed FROM tasks WHERE task_status != 'closed' ORDER BY last_history_ns DESC, name ASC; @@ -114,15 +131,14 @@ func (i *Index) ListClosed(ctx context.Context) ([]ListItem, error) { } const q = ` SELECT - name, title, follow_up, base_branch, + name, title, follow_up, base_branch, base_commit, task_status, worker_status, stage, workspace, started_at_ns, last_error, last_history_ns, last_active_ns, tool_calls, last_run_duration_ms, progress_done, progress_total, - git_lines_added, git_lines_removed, git_commits_behind, - git_integrated_reason + git_lines_added, git_lines_removed FROM tasks WHERE task_status = 'closed' ORDER BY last_history_ns DESC, name 
ASC; @@ -140,30 +156,27 @@ func (i *Index) queryList(ctx context.Context, q string) ([]ListItem, error) { var out []ListItem for rows.Next() { var ( - name, title, followUp, baseBranch string - taskStatus, workerStatus, stage string - workspace string - startedAtNS int64 - lastError sql.NullString - lastHistoryNS int64 - lastActiveNS int64 - toolCalls int - lastRunDurationMS int - progressDone, progressTotal int - linesAdded, linesRemoved sql.NullInt64 - commitsBehind sql.NullInt64 - integratedReason sql.NullString + name, title, followUp, baseBranch, baseCommit string + taskStatus, workerStatus, stage string + workspace string + startedAtNS int64 + lastError sql.NullString + lastHistoryNS int64 + lastActiveNS int64 + toolCalls int + lastRunDurationMS int + progressDone, progressTotal int + linesAdded, linesRemoved sql.NullInt64 ) if err := rows.Scan( - &name, &title, &followUp, &baseBranch, + &name, &title, &followUp, &baseBranch, &baseCommit, &taskStatus, &workerStatus, &stage, &workspace, &startedAtNS, &lastError, &lastHistoryNS, &lastActiveNS, &toolCalls, &lastRunDurationMS, &progressDone, &progressTotal, - &linesAdded, &linesRemoved, &commitsBehind, - &integratedReason, + &linesAdded, &linesRemoved, ); err != nil { return nil, fmt.Errorf("index list: scan: %w", err) } @@ -173,6 +186,7 @@ func (i *Index) queryList(ctx context.Context, q string) ([]ListItem, error) { Title: title, FollowUp: followUp, BaseBranch: baseBranch, + BaseCommit: baseCommit, TaskStatus: task.TaskStatus(taskStatus), WorkerStatus: task.ParseWorkerStatus(workerStatus), Stage: stage, @@ -186,14 +200,10 @@ func (i *Index) queryList(ctx context.Context, q string) ([]ListItem, error) { ProgressTotal: progressTotal, LinesAdded: intOrZero(linesAdded), LinesRemoved: intOrZero(linesRemoved), - CommitsBehind: intOrZero(commitsBehind), } if lastError.Valid { item.LastError = lastError.String } - if integratedReason.Valid { - item.IntegratedReason = integratedReason.String - } out = append(out, item) 
} @@ -209,49 +219,64 @@ func (i *Index) Get(ctx context.Context, taskName string) (Record, bool, error) } const q = ` SELECT - name, title, base_branch, follow_up, model, reasoning, description, + name, title, base_branch, base_commit, follow_up, model, reasoning, description, task_schema, task_status, worker_status, stage, workspace, started_at_ns, supervisor_pid, last_error, last_history_ns, tool_calls, last_active_ns, last_run_duration_ms, progress_done, progress_total, - git_lines_added, git_lines_removed, git_commits_behind, + git_lines_added, git_lines_removed, git_conflict_files_json, - git_integrated_reason + git_integrated_reason, git_integrated_branch_head, git_integrated_target_head, git_integrated_checked_at_ns, + branch_head, base_head, + changes_added, changes_removed, changes_base_commit, changes_branch_head, + commit_count, commit_count_base_commit, commit_count_branch_head, + commit_log_last_head FROM tasks WHERE name = ?; ` var ( - name, title, baseBranch, followUp, model, reasoning, description string - taskSchema int - taskStatus, workerStatus, stage string - workspace string - startedAtNS int64 - supervisorPID int - lastError sql.NullString - lastHistoryNS int64 - toolCalls int - lastActiveNS int64 - lastRunDurationMS int - progressDone, progressTotal int - linesAdded, linesRemoved, commitsBehind sql.NullInt64 - conflictFilesJSON sql.NullString - integratedReason sql.NullString + name, title, baseBranch, baseCommit, followUp, model, reasoning, description string + taskSchema int + taskStatus, workerStatus, stage string + workspace string + startedAtNS int64 + supervisorPID int + lastError sql.NullString + lastHistoryNS int64 + toolCalls int + lastActiveNS int64 + lastRunDurationMS int + progressDone, progressTotal int + linesAdded, linesRemoved sql.NullInt64 + conflictFilesJSON sql.NullString + integratedReason, integratedBranchHead, integratedTargetHead sql.NullString + integratedCheckedAtNS sql.NullInt64 + branchHead, baseHead sql.NullString 
+ changesAdded, changesRemoved sql.NullInt64 + changesBaseCommit, changesBranchHead sql.NullString + commitCount sql.NullInt64 + commitCountBaseCommit, commitCountBranchHead sql.NullString + commitLogLastHead sql.NullString ) err := i.db.QueryRowContext(ctx, q, taskName).Scan( - &name, &title, &baseBranch, &followUp, &model, &reasoning, &description, + &name, &title, &baseBranch, &baseCommit, &followUp, &model, &reasoning, &description, &taskSchema, &taskStatus, &workerStatus, &stage, &workspace, &startedAtNS, &supervisorPID, &lastError, &lastHistoryNS, &toolCalls, &lastActiveNS, &lastRunDurationMS, &progressDone, &progressTotal, - &linesAdded, &linesRemoved, &commitsBehind, + &linesAdded, &linesRemoved, &conflictFilesJSON, - &integratedReason, + &integratedReason, &integratedBranchHead, &integratedTargetHead, &integratedCheckedAtNS, + &branchHead, &baseHead, + &changesAdded, &changesRemoved, &changesBaseCommit, &changesBranchHead, + &commitCount, &commitCountBaseCommit, &commitCountBranchHead, + &commitLogLastHead, ) if err != nil { if err == sql.ErrNoRows { @@ -271,6 +296,7 @@ WHERE name = ?; Schema: taskSchema, Description: description, }, + BaseCommit: baseCommit, TaskStatus: task.TaskStatus(taskStatus), WorkerStatus: task.ParseWorkerStatus(workerStatus), Stage: stage, @@ -280,7 +306,6 @@ WHERE name = ?; LastRunDurationMS: lastRunDurationMS, LinesAdded: intOrZero(linesAdded), LinesRemoved: intOrZero(linesRemoved), - CommitsBehind: intOrZero(commitsBehind), } st := &task.State{ @@ -305,6 +330,41 @@ WHERE name = ?; if integratedReason.Valid { rec.IntegratedReason = integratedReason.String } + if integratedBranchHead.Valid { + rec.IntegratedBranchHead = integratedBranchHead.String + } + if integratedTargetHead.Valid { + rec.IntegratedTargetHead = integratedTargetHead.String + } + if integratedCheckedAtNS.Valid { + rec.IntegratedCheckedAtNS = integratedCheckedAtNS.Int64 + } + if branchHead.Valid { + rec.BranchHead = branchHead.String + } + if baseHead.Valid { + 
rec.BaseHead = baseHead.String + } + + rec.ChangesAdded = intOrZero(changesAdded) + rec.ChangesRemoved = intOrZero(changesRemoved) + if changesBaseCommit.Valid { + rec.ChangesBaseCommit = changesBaseCommit.String + } + if changesBranchHead.Valid { + rec.ChangesBranchHead = changesBranchHead.String + } + + rec.CommitCount = intOrZero(commitCount) + if commitCountBaseCommit.Valid { + rec.CommitCountBaseCommit = commitCountBaseCommit.String + } + if commitCountBranchHead.Valid { + rec.CommitCountBranchHead = commitCountBranchHead.String + } + if commitLogLastHead.Valid { + rec.CommitLogLastHead = commitLogLastHead.String + } return rec, true, nil } diff --git a/pkg/task/index/refresh.go b/pkg/task/index/refresh.go index c09786e..d59f9ea 100644 --- a/pkg/task/index/refresh.go +++ b/pkg/task/index/refresh.go @@ -41,8 +41,8 @@ func (i *Index) Refresh(ctx context.Context, policy RefreshPolicy) error { var step time.Time if debug { start = time.Now() - logging.Debug("refresh", fmt.Sprintf("index.Refresh start git={mode:%s includeIntegration:%t includeConflicts:%t ttl:%s tasks:%d}", - gitModeString(policy.Git.Mode), policy.Git.IncludeIntegration, policy.Git.IncludeConflicts, policy.Git.TTL, len(policy.Git.Tasks))) + logging.Debug("refresh", fmt.Sprintf("index.Refresh start git={mode:%s includeConflicts:%t ttl:%s tasks:%d}", + gitModeString(policy.Git.Mode), policy.Git.IncludeConflicts, policy.Git.TTL, len(policy.Git.Tasks))) } if debug { @@ -278,6 +278,10 @@ type taskRow struct { progressDone int progressTotal int + // Frozen stats (history.jsonl) for merged/closed tasks. 
+ linesAdded *int + linesRemoved *int + // Derived statusRank int filesSig string @@ -323,6 +327,18 @@ func buildRowFromDisk(taskName, filesSig string) (taskRow, bool, error) { row.lastHistoryNS = tail.LastTS.UnixNano() row.lastRunDuration = tail.LastRunDurationMS row.baseCommit = tail.BaseCommit + switch tail.TaskStatus { + case task.TaskStatusMerged: + a := tail.LastMergedLinesAdded + r := tail.LastMergedLinesRemoved + row.linesAdded = &a + row.linesRemoved = &r + case task.TaskStatusClosed: + a := tail.LastClosedLinesAdded + r := tail.LastClosedLinesRemoved + row.linesAdded = &a + row.linesRemoved = &r + } // Runtime state. state, err := task.LoadState(taskName) @@ -553,7 +569,8 @@ INSERT INTO tasks ( last_run_duration_ms, progress_done, progress_total, status_rank, - files_sig + files_sig, + git_lines_added, git_lines_removed ) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, @@ -562,7 +579,8 @@ INSERT INTO tasks ( ?, ?, ?, ?, - ? + ?, + ?, ? ) ON CONFLICT(name) DO UPDATE SET title=excluded.title, @@ -587,7 +605,9 @@ ON CONFLICT(name) DO UPDATE SET progress_done=excluded.progress_done, progress_total=excluded.progress_total, status_rank=excluded.status_rank, - files_sig=excluded.files_sig; + files_sig=excluded.files_sig, + git_lines_added=excluded.git_lines_added, + git_lines_removed=excluded.git_lines_removed; ` stmt, err := tx.PrepareContext(ctx, q) @@ -608,6 +628,8 @@ ON CONFLICT(name) DO UPDATE SET row.progressDone, row.progressTotal, row.statusRank, row.filesSig, + nullableInt(row.linesAdded), + nullableInt(row.linesRemoved), ); err != nil { return fmt.Errorf("index refresh: upsert %q: %w", row.name, err) } diff --git a/pkg/task/index/schema.go b/pkg/task/index/schema.go index dcd4813..fb98550 100644 --- a/pkg/task/index/schema.go +++ b/pkg/task/index/schema.go @@ -7,7 +7,7 @@ import ( "strings" ) -const schemaVersion = 6 +const schemaVersion = 7 func migrateSchema(ctx context.Context, db *sql.DB) error { var v int @@ -69,6 +69,13 @@ func 
migrateSchema(ctx context.Context, db *sql.DB) error { v = 6 } + if v == 6 { + if err := migrateToV7(ctx, tx); err != nil { + return err + } + v = 7 + } + if _, err := tx.ExecContext(ctx, fmt.Sprintf("PRAGMA user_version=%d;", v)); err != nil { return fmt.Errorf("set index schema version: %w", err) } @@ -253,3 +260,34 @@ func migrateToV6(ctx context.Context, tx *sql.Tx) error { } return nil } + +func migrateToV7(ctx context.Context, tx *sql.Tx) error { + // Git redesign: store historical diffs + commit counts with input-based invalidation, + // plus basic ref heads for debugging. + stmts := []string{ + `ALTER TABLE tasks ADD COLUMN branch_head TEXT;`, + `ALTER TABLE tasks ADD COLUMN base_head TEXT;`, + + `ALTER TABLE tasks ADD COLUMN changes_added INTEGER;`, + `ALTER TABLE tasks ADD COLUMN changes_removed INTEGER;`, + `ALTER TABLE tasks ADD COLUMN changes_base_commit TEXT;`, + `ALTER TABLE tasks ADD COLUMN changes_branch_head TEXT;`, + + `ALTER TABLE tasks ADD COLUMN commit_count INTEGER;`, + `ALTER TABLE tasks ADD COLUMN commit_count_base_commit TEXT;`, + `ALTER TABLE tasks ADD COLUMN commit_count_branch_head TEXT;`, + + `ALTER TABLE tasks ADD COLUMN commit_log_last_head TEXT;`, + } + + for _, stmt := range stmts { + if _, err := tx.ExecContext(ctx, stmt); err != nil { + // ALTER TABLE is not idempotent; ignore duplicate column errors. 
+ if strings.Contains(err.Error(), "duplicate column name") { + continue + } + return fmt.Errorf("migrate v7: %w", err) + } + } + return nil +} diff --git a/pkg/task/index/schema_v6_migration_test.go b/pkg/task/index/schema_v6_migration_test.go index de29c25..b05e623 100644 --- a/pkg/task/index/schema_v6_migration_test.go +++ b/pkg/task/index/schema_v6_migration_test.go @@ -3,6 +3,7 @@ package index import ( "context" "database/sql" + "os" "path/filepath" "testing" @@ -10,13 +11,15 @@ import ( _ "modernc.org/sqlite" + "github.com/zippoxer/subtask/pkg/task" "github.com/zippoxer/subtask/pkg/testutil" ) func TestMigrateToV6_RunningToWorking(t *testing.T) { - env := testutil.NewTestEnv(t, 0) - dbPath := filepath.Join(env.RootDir, ".subtask", "index.db") + _ = testutil.NewTestEnv(t, 0) + dbPath := task.IndexPath() + require.NoError(t, os.MkdirAll(filepath.Dir(dbPath), 0o755)) db, err := sql.Open("sqlite", dbPath) require.NoError(t, err) diff --git a/pkg/task/lock_windows.go b/pkg/task/lock_windows.go index 751023a..137c2d4 100644 --- a/pkg/task/lock_windows.go +++ b/pkg/task/lock_windows.go @@ -26,7 +26,7 @@ func WithLock(taskName string, fn func() error) error { return err } if err := filelock.LockExclusive(f); err != nil { - f.Close() + _ = f.Close() return err } defer func() { diff --git a/pkg/task/migrate/gitredesign/gitredesign.go b/pkg/task/migrate/gitredesign/gitredesign.go new file mode 100644 index 0000000..c205155 --- /dev/null +++ b/pkg/task/migrate/gitredesign/gitredesign.go @@ -0,0 +1,469 @@ +package gitredesign + +import ( + "encoding/json" + "fmt" + "os" + "path/filepath" + "strings" + "time" + + "github.com/zippoxer/subtask/internal/filelock" + "github.com/zippoxer/subtask/pkg/git" + "github.com/zippoxer/subtask/pkg/logging" + "github.com/zippoxer/subtask/pkg/task" + "github.com/zippoxer/subtask/pkg/task/history" + taskmigrate "github.com/zippoxer/subtask/pkg/task/migrate" +) + +// TaskSchemaVersion is the task schema version that indicates the git 
redesign migration +// has been applied (best-effort) and can be skipped on subsequent runs. +// +// v0.1.1 tasks commonly have schema=1 (schema1 history.jsonl). This migration upgrades +// them to schema=2 by backfilling missing git redesign fields. +const TaskSchemaVersion = 2 + +const repoDoneMarkerName = "gitredesign-v1.done" + +// Ensure performs a best-effort, idempotent migration to support the git redesign: +// - Backfills missing base_commit in the most recent task.opened event. +// - Backfills frozen change stats in task.merged / task.closed events when missing. +// +// It is safe to call multiple times; if tasks are already migrated it becomes a no-op. +func Ensure(repoDir string) error { + repoDir = strings.TrimSpace(repoDir) + if repoDir == "" { + return nil + } + repoDir = canonicalRepoDir(repoDir) + + paths := repoMigrationPaths(repoDir) + if markerExists(paths.doneMarkerPath) { + return nil + } + + // Best-effort: if we can lock, do the scan/migration once per repo and persist a marker. + // If we cannot lock or create runtime state, fall back to the legacy behavior (scan tasks + // every time) rather than failing the CLI. + if err := os.MkdirAll(paths.projectDir, 0o755); err == nil { + lockFile, err := os.OpenFile(paths.lockPath, os.O_CREATE|os.O_RDWR, 0o644) + if err == nil { + defer func() { _ = lockFile.Close() }() + if err := filelock.LockExclusive(lockFile); err == nil { + defer func() { _ = filelock.Unlock(lockFile) }() + + if markerExists(paths.doneMarkerPath) { + return nil + } + + hadErrors, err := migrateAllTasks(repoDir) + if err != nil { + return err + } + if !hadErrors { + if err := writeDoneMarker(paths.doneMarkerPath); err != nil { + logging.Error("migrate", fmt.Sprintf("gitredesign write marker err=%v", err)) + } + } + return nil + } + } + } + + // Fallback path: legacy behavior without persistent marker/locking. 
+ _, err := migrateAllTasks(repoDir) + return err +} + +type repoPaths struct { + projectDir string + lockPath string + doneMarkerPath string +} + +func repoMigrationPaths(repoDir string) repoPaths { + projectDir := filepath.Join(task.ProjectsDir(), task.EscapePath(repoDir)) + return repoPaths{ + projectDir: projectDir, + lockPath: filepath.Join(projectDir, "migrate.lock"), + doneMarkerPath: filepath.Join(projectDir, "migrations", repoDoneMarkerName), + } +} + +func canonicalRepoDir(repoDir string) string { + repoDir = filepath.Clean(strings.TrimSpace(repoDir)) + if repoDir == "" { + return "" + } + if abs, err := filepath.Abs(repoDir); err == nil { + repoDir = abs + } + return repoDir +} + +func markerExists(path string) bool { + st, err := os.Stat(path) + return err == nil && !st.IsDir() +} + +func writeDoneMarker(path string) error { + if err := os.MkdirAll(filepath.Dir(path), 0o755); err != nil { + return err + } + f, err := os.OpenFile(path, os.O_CREATE|os.O_EXCL|os.O_WRONLY, 0o644) + if err != nil { + if os.IsExist(err) { + return nil + } + return err + } + defer func() { _ = f.Close() }() + + _, _ = f.WriteString(time.Now().UTC().Format(time.RFC3339Nano) + "\n") + return nil +} + +func migrateAllTasks(repoDir string) (bool, error) { + taskNames, err := task.List() + if err != nil { + return true, err + } + if len(taskNames) == 0 { + return false, nil + } + + hadErrors := false + for _, name := range taskNames { + // Fast path: schema already indicates the redesign migration has been applied. + // This avoids per-task locks and full history parses on every CLI command. + t, err := task.Load(name) + if err == nil && t != nil && t.Schema >= TaskSchemaVersion { + continue + } + + // Ensure schema/history exist (locks internally). 
+ if err := taskmigrate.EnsureSchema(name); err != nil { + logging.Error("migrate", fmt.Sprintf("gitredesign ensure schema task=%s err=%v", name, err)) + hadErrors = true + continue + } + if err := migrateTask(repoDir, name); err != nil { + logging.Error("migrate", fmt.Sprintf("gitredesign task=%s err=%v", name, err)) + hadErrors = true + continue + } + + // Mark as migrated so subsequent runs can skip this task entirely. + if err := bumpTaskSchema(name, TaskSchemaVersion); err != nil { + logging.Error("migrate", fmt.Sprintf("gitredesign bump schema task=%s err=%v", name, err)) + hadErrors = true + } + } + + return hadErrors, nil +} + +func bumpTaskSchema(taskName string, version int) error { + return task.WithLock(taskName, func() error { + t, err := task.Load(taskName) + if err != nil || t == nil { + return nil + } + if t.Schema >= version { + return nil + } + t.Schema = version + return t.Save() + }) +} + +func migrateTask(repoDir, taskName string) error { + t, err := task.Load(taskName) + if err != nil { + return nil + } + + return task.WithLock(taskName, func() error { + events, err := history.Read(taskName, history.ReadOptions{}) + if err != nil { + return err + } + if len(events) == 0 { + return nil + } + + dirty := false + + openedIdx := lastIndexOfType(events, "task.opened") + openedData := map[string]any{} + if openedIdx >= 0 { + _ = json.Unmarshal(events[openedIdx].Data, &openedData) + + if strings.TrimSpace(getString(openedData, "base_commit")) == "" { + baseBranch := strings.TrimSpace(getString(openedData, "base_branch")) + if baseBranch == "" { + baseBranch = strings.TrimSpace(t.BaseBranch) + } + baseCommit := inferBaseCommit(repoDir, taskName, baseBranch) + if baseCommit != "" { + openedData["base_commit"] = baseCommit + openedData["base_ref"] = baseBranch + if b, err := json.Marshal(openedData); err == nil { + events[openedIdx].Data = b + dirty = true + } + } + } + } + + // Best-effort: backfill frozen stats for merged tasks when missing. 
+ mergedIdx := lastIndexOfType(events, "task.merged") + if mergedIdx >= 0 { + data := map[string]any{} + _ = json.Unmarshal(events[mergedIdx].Data, &data) + if _, ok := data["changes_added"]; !ok { + commit := strings.TrimSpace(getString(data, "commit")) + added, removed, frozenErr := inferFrozenStatsForMerge(repoDir, commit) + if frozenErr != "" { + data["frozen_error"] = frozenErr + } else { + data["changes_added"] = added + data["changes_removed"] = removed + } + if b, err := json.Marshal(data); err == nil { + events[mergedIdx].Data = b + dirty = true + } + } + } + + // Best-effort: backfill frozen stats for closed tasks when missing. + closedIdx := lastIndexOfType(events, "task.closed") + if closedIdx >= 0 { + data := map[string]any{} + _ = json.Unmarshal(events[closedIdx].Data, &data) + if _, ok := data["changes_added"]; !ok { + baseCommit := strings.TrimSpace(getString(openedData, "base_commit")) + if baseCommit == "" { + baseBranch := strings.TrimSpace(t.BaseBranch) + if baseBranch == "" { + baseBranch = strings.TrimSpace(getString(openedData, "base_branch")) + } + mb := inferBaseCommit(repoDir, taskName, baseBranch) + if mb != "" { + baseCommit = mb + } + } + + branchHead := "" + if git.BranchExists(repoDir, taskName) { + if out, err := git.Output(repoDir, "rev-parse", taskName); err == nil { + branchHead = strings.TrimSpace(out) + } + } + added, removed, commitCount, frozenErr := inferFrozenStatsForClose(repoDir, baseCommit, branchHead) + data["base_branch"] = strings.TrimSpace(t.BaseBranch) + data["base_commit"] = baseCommit + data["branch_head"] = branchHead + if frozenErr != "" { + data["frozen_error"] = frozenErr + } else { + data["changes_added"] = added + data["changes_removed"] = removed + data["commit_count"] = commitCount + } + if b, err := json.Marshal(data); err == nil { + events[closedIdx].Data = b + dirty = true + } + } + } + + if !dirty { + return nil + } + + // Legacy histories sometimes have zero timestamps for terminal events. 
If we let + // WriteAllLocked normalize them, they'll become "now" and break recency ordering. + // + // Best-effort: derive a stable timestamp from git metadata (preferred) or nearby + // history events (fallback). + backfillZeroTimestampsBestEffort(repoDir, taskName, events) + + return history.WriteAllLocked(taskName, events) + }) +} + +func lastIndexOfType(events []history.Event, typ string) int { + for i := len(events) - 1; i >= 0; i-- { + if events[i].Type == typ { + return i + } + } + return -1 +} + +func getString(m map[string]any, key string) string { + v, ok := m[key] + if !ok || v == nil { + return "" + } + s, _ := v.(string) + return s +} + +func backfillZeroTimestampsBestEffort(repoDir, taskName string, events []history.Event) { + if repoDir == "" || taskName == "" || len(events) == 0 { + return + } + + for i := range events { + if !events[i].TS.IsZero() { + continue + } + + switch events[i].Type { + case "task.merged": + var d struct { + Commit string `json:"commit"` + } + _ = json.Unmarshal(events[i].Data, &d) + if ts, ok := commitDateUTC(repoDir, strings.TrimSpace(d.Commit)); ok { + events[i].TS = ts + continue + } + events[i].TS = fallbackEventTimestamp(events, i) + case "task.closed": + // Prefer an explicit branch_head (present in schema2) or fall back to the + // current task branch head when possible. 
+ var d struct { + BranchHead string `json:"branch_head"` + } + _ = json.Unmarshal(events[i].Data, &d) + head := strings.TrimSpace(d.BranchHead) + if head == "" && git.BranchExists(repoDir, taskName) { + if out, err := git.Output(repoDir, "rev-parse", taskName); err == nil { + head = strings.TrimSpace(out) + } + } + if ts, ok := commitDateUTC(repoDir, head); ok { + events[i].TS = ts + continue + } + events[i].TS = fallbackEventTimestamp(events, i) + default: + events[i].TS = fallbackEventTimestamp(events, i) + } + } +} + +func commitDateUTC(repoDir, commit string) (time.Time, bool) { + commit = strings.TrimSpace(commit) + if repoDir == "" || commit == "" { + return time.Time{}, false + } + if !git.CommitExists(repoDir, commit) { + return time.Time{}, false + } + out, err := git.Output(repoDir, "show", "-s", "--format=%cI", commit) + if err != nil { + return time.Time{}, false + } + ts, err := time.Parse(time.RFC3339Nano, strings.TrimSpace(out)) + if err != nil { + return time.Time{}, false + } + return ts.UTC(), true +} + +func fallbackEventTimestamp(events []history.Event, idx int) time.Time { + // Prefer to keep this event after earlier ones in the file. + for i := idx - 1; i >= 0; i-- { + if !events[i].TS.IsZero() { + return events[i].TS.Add(time.Nanosecond) + } + } + for i := idx + 1; i < len(events); i++ { + if !events[i].TS.IsZero() { + return events[i].TS.Add(-time.Nanosecond) + } + } + return time.Unix(0, 0).UTC() +} + +func inferBaseCommit(repoDir, taskName, baseBranch string) string { + taskName = strings.TrimSpace(taskName) + baseBranch = strings.TrimSpace(baseBranch) + if taskName == "" || baseBranch == "" { + return "" + } + + // Prefer merge-base when the branch exists (this matches "based on base HEAD at creation time"). 
+ if git.BranchExists(repoDir, taskName) && git.BranchExists(repoDir, baseBranch) { + if mb, err := git.Output(repoDir, "merge-base", taskName, baseBranch); err == nil { + return strings.TrimSpace(mb) + } + } + + // Draft-only tasks may have no branch yet; fall back to base branch HEAD. + if git.BranchExists(repoDir, baseBranch) { + if head, err := git.Output(repoDir, "rev-parse", baseBranch); err == nil { + return strings.TrimSpace(head) + } + } + + return "" +} + +func inferFrozenStatsForMerge(repoDir, mergedCommit string) (int, int, string) { + mergedCommit = strings.TrimSpace(mergedCommit) + if mergedCommit == "" { + return 0, 0, "cannot compute frozen stats (missing merge commit)" + } + if !git.CommitExists(repoDir, mergedCommit) { + return 0, 0, fmt.Sprintf("cannot compute frozen stats (missing merge commit %s)", mergedCommit) + } + parents, err := git.Output(repoDir, "show", "-s", "--format=%P", mergedCommit) + if err != nil { + return 0, 0, fmt.Sprintf("cannot compute frozen stats (failed to read parents): %v", err) + } + parent := "" + for _, p := range strings.Fields(parents) { + parent = strings.TrimSpace(p) + break + } + if parent == "" { + return 0, 0, "cannot compute frozen stats (no parent commit)" + } + if !git.CommitExists(repoDir, parent) { + return 0, 0, fmt.Sprintf("cannot compute frozen stats (missing parent commit %s)", parent) + } + added, removed, err := git.DiffStatRange(repoDir, parent, mergedCommit) + if err != nil { + return 0, 0, fmt.Sprintf("cannot compute frozen stats: %v", err) + } + return added, removed, "" +} + +func inferFrozenStatsForClose(repoDir, baseCommit, branchHead string) (int, int, int, string) { + baseCommit = strings.TrimSpace(baseCommit) + branchHead = strings.TrimSpace(branchHead) + if baseCommit == "" || branchHead == "" { + return 0, 0, 0, fmt.Sprintf("cannot compute frozen stats (missing base_commit=%t branch_head=%t)", baseCommit == "", branchHead == "") + } + if !git.CommitExists(repoDir, baseCommit) { + return 
0, 0, 0, fmt.Sprintf("cannot compute frozen stats (missing base_commit %s)", baseCommit) + } + if !git.CommitExists(repoDir, branchHead) { + return 0, 0, 0, fmt.Sprintf("cannot compute frozen stats (missing branch_head %s)", branchHead) + } + added, removed, err := git.DiffStatRange(repoDir, baseCommit, branchHead) + if err != nil { + return 0, 0, 0, fmt.Sprintf("cannot compute frozen stats: %v", err) + } + commitCount, err := git.RevListCount(repoDir, baseCommit, branchHead) + if err != nil { + return 0, 0, 0, fmt.Sprintf("cannot compute commit_count: %v", err) + } + return added, removed, commitCount, "" +} diff --git a/pkg/task/migrate/gitredesign/gitredesign_e2e_test.go b/pkg/task/migrate/gitredesign/gitredesign_e2e_test.go new file mode 100644 index 0000000..122b2ae --- /dev/null +++ b/pkg/task/migrate/gitredesign/gitredesign_e2e_test.go @@ -0,0 +1,246 @@ +package gitredesign_test + +import ( + "archive/tar" + "compress/gzip" + "encoding/json" + "io" + "os" + "os/exec" + "path/filepath" + "runtime" + "strings" + "testing" + "time" + + "github.com/stretchr/testify/require" + + "github.com/zippoxer/subtask/pkg/task" + taskhistory "github.com/zippoxer/subtask/pkg/task/history" + taskmigrate "github.com/zippoxer/subtask/pkg/task/migrate" + "github.com/zippoxer/subtask/pkg/task/migrate/gitredesign" +) + +func TestEnsure_V011Fixtures_E2E(t *testing.T) { + t.Setenv("SUBTASK_DIR", t.TempDir()) + + fixturesDir := testdataDir(t, "v0.1.1") + bundlePath := filepath.Join(fixturesDir, "repo.bundle") + subtaskTar := filepath.Join(fixturesDir, "subtask-dir.tar.gz") + + root := t.TempDir() + repoDir := filepath.Join(root, "repo") + + require.NoError(t, os.MkdirAll(repoDir, 0o755)) + gitRun(t, repoDir, "init") + // Fetch all fixture branches into a non-checked-out namespace, then create local branches. 
+ gitRun(t, repoDir, "fetch", bundlePath, "refs/heads/*:refs/remotes/bundle/*") + gitRun(t, repoDir, "checkout", "-B", "main", "refs/remotes/bundle/main") + gitRun(t, repoDir, "branch", "legacy/merged", "refs/remotes/bundle/legacy/merged") + gitRun(t, repoDir, "branch", "legacy/closed-keep", "refs/remotes/bundle/legacy/closed-keep") + untarGz(t, subtaskTar, repoDir) + + origCwd, err := os.Getwd() + require.NoError(t, err) + require.NoError(t, os.Chdir(repoDir)) + t.Cleanup(func() { _ = os.Chdir(origCwd) }) + + require.NoError(t, taskmigrate.EnsureLayout(repoDir)) + require.NoError(t, gitredesign.Ensure(repoDir)) + + markerPath := filepath.Join(task.ProjectsDir(), task.EscapePath(repoDir), "migrations", "gitredesign-v1.done") + require.FileExists(t, markerPath) + + taskNames, err := task.List() + require.NoError(t, err) + require.NotEmpty(t, taskNames) + + // Validate every task generally (schema + base_commit/base_ref on opened). + for _, name := range taskNames { + t.Run("task="+strings.ReplaceAll(name, "/", "_"), func(t *testing.T) { + loaded, err := task.Load(name) + require.NoError(t, err) + require.Equal(t, gitredesign.TaskSchemaVersion, loaded.Schema) + + events, err := taskhistory.Read(name, taskhistory.ReadOptions{}) + require.NoError(t, err) + require.NotEmpty(t, events) + + opened := lastEventOfType(t, events, "task.opened") + var openedData map[string]any + require.NoError(t, json.Unmarshal(opened.Data, &openedData)) + + baseCommit, _ := openedData["base_commit"].(string) + baseRef, _ := openedData["base_ref"].(string) + require.Regexp(t, "^[0-9a-f]{40}$", strings.TrimSpace(baseCommit)) + require.NotEmpty(t, strings.TrimSpace(baseRef)) + gitRun(t, repoDir, "cat-file", "-e", strings.TrimSpace(baseCommit)+"^{commit}") + + for _, evType := range []string{"task.merged", "task.closed"} { + ev := lastEventOfTypeOrNil(events, evType) + if ev == nil { + continue + } + require.False(t, ev.TS.IsZero()) + + var data map[string]any + require.NoError(t, 
json.Unmarshal(ev.Data, &data)) + _, hasAdded := data["changes_added"] + _, hasErr := data["frozen_error"] + require.True(t, hasAdded || hasErr) + } + }) + } + + // Specific timestamp backfill assertions for the v0.1.1 fixture set. + assertMergedTimestampFromCommitDate(t, "legacy/merged", "877f967876cb188d569d8135874d65b4e7c7238a") + assertClosedTimestampFromBranchHeadCommitDate(t, "legacy/closed-keep", "55298ad2cbb4c5c9477189b9049555582cc35bb0") + + // With the marker present, Ensure should not depend on task scanning at all. + tasksDir := filepath.Join(repoDir, ".subtask", "tasks") + bak := tasksDir + ".bak" + require.NoError(t, os.Rename(tasksDir, bak)) + require.NoError(t, os.WriteFile(tasksDir, []byte("not a dir"), 0o644)) + require.NoError(t, gitredesign.Ensure(repoDir)) + require.NoError(t, os.Remove(tasksDir)) + require.NoError(t, os.Rename(bak, tasksDir)) + + // Without the marker, Ensure should still be idempotent for already-migrated tasks. + before := readAllHistories(t, repoDir) + require.NoError(t, os.Remove(markerPath)) + require.NoError(t, gitredesign.Ensure(repoDir)) + after := readAllHistories(t, repoDir) + require.Equal(t, before, after) + require.FileExists(t, markerPath) +} + +func assertMergedTimestampFromCommitDate(t *testing.T, taskName, commit string) { + t.Helper() + events, err := taskhistory.Read(taskName, taskhistory.ReadOptions{}) + require.NoError(t, err) + merged := lastEventOfType(t, events, "task.merged") + + expected := gitCommitDateE2E(t, task.ProjectRoot(), commit) + require.True(t, merged.TS.Equal(expected)) +} + +func assertClosedTimestampFromBranchHeadCommitDate(t *testing.T, taskName, headCommit string) { + t.Helper() + events, err := taskhistory.Read(taskName, taskhistory.ReadOptions{}) + require.NoError(t, err) + closed := lastEventOfType(t, events, "task.closed") + + expected := gitCommitDateE2E(t, task.ProjectRoot(), headCommit) + require.True(t, closed.TS.Equal(expected)) +} + +func readAllHistories(t *testing.T, 
repoDir string) map[string]string { + t.Helper() + out := map[string]string{} + + taskNames, err := task.List() + require.NoError(t, err) + for _, name := range taskNames { + b, err := os.ReadFile(filepath.Join(repoDir, ".subtask", "tasks", task.EscapeName(name), "history.jsonl")) + require.NoError(t, err) + out[name] = string(b) + } + return out +} + +func lastEventOfType(t *testing.T, events []taskhistory.Event, typ string) taskhistory.Event { + t.Helper() + for i := len(events) - 1; i >= 0; i-- { + if events[i].Type == typ { + return events[i] + } + } + t.Fatalf("missing event type %q", typ) + return taskhistory.Event{} +} + +func lastEventOfTypeOrNil(events []taskhistory.Event, typ string) *taskhistory.Event { + for i := len(events) - 1; i >= 0; i-- { + if events[i].Type == typ { + ev := events[i] + return &ev + } + } + return nil +} + +func gitCommitDateE2E(t *testing.T, repoDir, commit string) time.Time { + t.Helper() + out := gitRun(t, repoDir, "show", "-s", "--format=%cI", commit) + ts, err := time.Parse(time.RFC3339Nano, strings.TrimSpace(out)) + require.NoError(t, err) + return ts.UTC() +} + +func gitRun(t *testing.T, dir string, args ...string) string { + t.Helper() + cmd := exec.Command("git", args...) 
+ if dir != "" { + cmd.Dir = dir + } + b, err := cmd.CombinedOutput() + require.NoError(t, err, "git %s failed: %s", strings.Join(args, " "), string(b)) + return strings.TrimSpace(string(b)) +} + +func testdataDir(t *testing.T, subdir string) string { + t.Helper() + _, thisFile, _, ok := runtime.Caller(0) + require.True(t, ok) + return filepath.Join(filepath.Dir(thisFile), "testdata", subdir) +} + +func untarGz(t *testing.T, src, dst string) { + t.Helper() + + f, err := os.Open(src) + require.NoError(t, err) + defer func() { _ = f.Close() }() + + zr, err := gzip.NewReader(f) + require.NoError(t, err) + defer func() { _ = zr.Close() }() + + tr := tar.NewReader(zr) + for { + hdr, err := tr.Next() + if err != nil { + if err == io.EOF { + break + } + require.NoError(t, err) + } + if hdr == nil { + break + } + + name := filepath.Clean(hdr.Name) + if name == "." || name == "" { + continue + } + if strings.HasPrefix(name, ".."+string(os.PathSeparator)) || name == ".." || filepath.IsAbs(name) { + t.Fatalf("invalid tar entry: %q", hdr.Name) + } + + outPath := filepath.Join(dst, name) + switch hdr.Typeflag { + case tar.TypeDir: + require.NoError(t, os.MkdirAll(outPath, 0o755)) + case tar.TypeReg: + require.NoError(t, os.MkdirAll(filepath.Dir(outPath), 0o755)) + w, err := os.OpenFile(outPath, os.O_CREATE|os.O_TRUNC|os.O_WRONLY, os.FileMode(hdr.Mode)&0o777) + require.NoError(t, err) + _, copyErr := io.Copy(w, tr) + closeErr := w.Close() + require.NoError(t, copyErr) + require.NoError(t, closeErr) + default: + // Ignore symlinks/other types (fixtures should not need them). 
+ } + } +} diff --git a/pkg/task/migrate/gitredesign/gitredesign_test.go b/pkg/task/migrate/gitredesign/gitredesign_test.go new file mode 100644 index 0000000..513f708 --- /dev/null +++ b/pkg/task/migrate/gitredesign/gitredesign_test.go @@ -0,0 +1,269 @@ +package gitredesign_test + +import ( + "encoding/json" + "os" + "os/exec" + "path/filepath" + "strings" + "testing" + "time" + + "github.com/stretchr/testify/require" + + "github.com/zippoxer/subtask/pkg/task" + "github.com/zippoxer/subtask/pkg/task/history" + "github.com/zippoxer/subtask/pkg/task/migrate/gitredesign" + "github.com/zippoxer/subtask/pkg/testutil" +) + +func TestEnsure_SkipsTasksAtCurrentSchemaWithoutReadingHistory(t *testing.T) { + env := testutil.NewTestEnv(t, 0) + repoDir := env.RootDir + + taskName := "migrate/skip" + env.CreateTask(taskName, "Skip", "main", "desc") // schema=gitredesign.TaskSchemaVersion + + // If Ensure tried to read history.jsonl, it would hit a permission error. + historyPath := task.HistoryPath(taskName) + require.NoError(t, os.WriteFile(historyPath, []byte("x\n"), 0o000)) + t.Cleanup(func() { _ = os.Chmod(historyPath, 0o644) }) + + require.NoError(t, gitredesign.Ensure(repoDir)) +} + +func TestEnsure_WritesRepoMarkerAfterSuccessfulRun(t *testing.T) { + env := testutil.NewTestEnv(t, 0) + repoDir := env.RootDir + + taskName := "migrate/marker" + env.CreateTask(taskName, "Marker", "main", "desc") // schema=gitredesign.TaskSchemaVersion + + markerPath := filepath.Join(task.ProjectsDir(), task.EscapePath(repoDir), "migrations", "gitredesign-v1.done") + _, err := os.Stat(markerPath) + require.Error(t, err) + + require.NoError(t, gitredesign.Ensure(repoDir)) + require.FileExists(t, markerPath) +} + +func TestEnsure_SkipsAllWorkWhenRepoMarkerExists(t *testing.T) { + env := testutil.NewTestEnv(t, 0) + repoDir := env.RootDir + + taskName := "migrate/marker-skip" + env.CreateTask(taskName, "Marker skip", "main", "desc") + + markerPath := filepath.Join(task.ProjectsDir(), 
task.EscapePath(repoDir), "migrations", "gitredesign-v1.done") + require.NoError(t, os.MkdirAll(filepath.Dir(markerPath), 0o755)) + require.NoError(t, os.WriteFile(markerPath, []byte("ok\n"), 0o644)) + + // Break task.List() by making ".subtask/tasks" a file. Ensure should still succeed + // because it should exit before scanning tasks when the marker exists. + tasksDir := filepath.Join(repoDir, ".subtask", "tasks") + bak := tasksDir + ".bak" + require.NoError(t, os.Rename(tasksDir, bak)) + require.NoError(t, os.WriteFile(tasksDir, []byte("not a dir"), 0o644)) + t.Cleanup(func() { + _ = os.Remove(tasksDir) + _ = os.Rename(bak, tasksDir) + }) + + require.NoError(t, gitredesign.Ensure(repoDir)) +} + +func TestEnsure_BackfillsAndBumpsSchema_Idempotent(t *testing.T) { + env := testutil.NewTestEnv(t, 0) + repoDir := env.RootDir + + taskName := "migrate/backfill" + require.NoError(t, (&task.Task{ + Name: taskName, + Title: "Backfill", + BaseBranch: "main", + Description: "desc", + Schema: 1, // v0.1.1 / schema1 + }).Save()) + + // Create a task branch so inferBaseCommit can use merge-base. + gitCmd(t, repoDir, "checkout", "-b", taskName, "main") + require.NoError(t, os.WriteFile(filepath.Join(repoDir, "task.txt"), []byte("task\n"), 0o644)) + gitCmd(t, repoDir, "add", "task.txt") + gitCmd(t, repoDir, "commit", "-m", "task commit") + gitCmd(t, repoDir, "checkout", "main") + + // Create an arbitrary commit to use as the "merged commit" in legacy history. + require.NoError(t, os.WriteFile(filepath.Join(repoDir, "merged.txt"), []byte("merged\n"), 0o644)) + gitCmd(t, repoDir, "add", "merged.txt") + gitCmd(t, repoDir, "commit", "-m", "merged commit") + mergedCommit := strings.TrimSpace(gitCmd(t, repoDir, "rev-parse", "HEAD")) + + // Legacy-ish history: opened missing base_commit, merged missing frozen stats. 
+ env.CreateTaskHistory(taskName, []history.Event{ + {TS: time.Now().UTC(), Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}, + {TS: time.Now().UTC(), Type: "stage.changed", Data: mustJSON(map[string]any{"from": "", "to": "ready"})}, + {TS: time.Now().UTC(), Type: "task.merged", Data: mustJSON(map[string]any{"commit": mergedCommit, "into": "main"})}, + }) + + before, err := os.ReadFile(task.HistoryPath(taskName)) + require.NoError(t, err) + + require.NoError(t, gitredesign.Ensure(repoDir)) + + after, err := os.ReadFile(task.HistoryPath(taskName)) + require.NoError(t, err) + require.NotEqual(t, string(before), string(after)) + + // Schema bumped. + loaded, err := task.Load(taskName) + require.NoError(t, err) + require.Equal(t, gitredesign.TaskSchemaVersion, loaded.Schema) + + // Backfilled base_commit + base_ref. + events, err := history.Read(taskName, history.ReadOptions{}) + require.NoError(t, err) + var openedData map[string]any + require.NoError(t, json.Unmarshal(events[0].Data, &openedData)) + baseCommit, _ := openedData["base_commit"].(string) + baseRef, _ := openedData["base_ref"].(string) + require.NotEmpty(t, strings.TrimSpace(baseCommit)) + require.Equal(t, "main", strings.TrimSpace(baseRef)) + + // Backfilled merged frozen stats (or recorded a frozen_error). + var mergedData map[string]any + require.NoError(t, json.Unmarshal(events[len(events)-1].Data, &mergedData)) + _, hasAdded := mergedData["changes_added"] + _, hasErr := mergedData["frozen_error"] + require.True(t, hasAdded || hasErr) + + // Idempotent: Ensure again should skip (schema already bumped) and not rewrite history. 
+ before2, err := os.ReadFile(task.HistoryPath(taskName)) + require.NoError(t, err) + require.NoError(t, gitredesign.Ensure(repoDir)) + after2, err := os.ReadFile(task.HistoryPath(taskName)) + require.NoError(t, err) + require.Equal(t, string(before2), string(after2)) +} + +func TestEnsure_ZeroTimestampMergedUsesCommitDate(t *testing.T) { + env := testutil.NewTestEnv(t, 0) + repoDir := env.RootDir + + taskName := "migrate/zero-ts-merged" + require.NoError(t, (&task.Task{ + Name: taskName, + Title: "Zero TS merged", + BaseBranch: "main", + Description: "desc", + Schema: 1, + }).Save()) + + // Create a commit to reference from the legacy merged event. + require.NoError(t, os.WriteFile(filepath.Join(repoDir, "merged.txt"), []byte("merged\n"), 0o644)) + gitCmd(t, repoDir, "add", "merged.txt") + gitCmd(t, repoDir, "commit", "-m", "merged commit") + mergedCommit := strings.TrimSpace(gitCmd(t, repoDir, "rev-parse", "HEAD")) + + expected := gitCommitDate(t, repoDir, mergedCommit) + + // Write a legacy-ish history where task.merged has a zero timestamp. 
+ writeRawHistory(t, taskName, []history.Event{ + {TS: time.Date(2025, 12, 31, 23, 59, 0, 0, time.UTC), Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}, + {TS: time.Time{}, Type: "task.merged", Data: mustJSON(map[string]any{"commit": mergedCommit, "into": "main"})}, + }) + + require.NoError(t, gitredesign.Ensure(repoDir)) + + events, err := history.Read(taskName, history.ReadOptions{}) + require.NoError(t, err) + + mergedIdx := -1 + for i := range events { + if events[i].Type == "task.merged" { + mergedIdx = i + } + } + require.GreaterOrEqual(t, mergedIdx, 0) + require.True(t, events[mergedIdx].TS.Equal(expected)) +} + +func TestEnsure_ZeroTimestampClosedUsesBranchHeadCommitDate(t *testing.T) { + env := testutil.NewTestEnv(t, 0) + repoDir := env.RootDir + + taskName := "migrate/zero-ts-closed" + require.NoError(t, (&task.Task{ + Name: taskName, + Title: "Zero TS closed", + BaseBranch: "main", + Description: "desc", + Schema: 1, + }).Save()) + + // Create a task branch so the migration can backfill branch_head. + gitCmd(t, repoDir, "checkout", "-b", taskName, "main") + require.NoError(t, os.WriteFile(filepath.Join(repoDir, "task.txt"), []byte("task\n"), 0o644)) + gitCmd(t, repoDir, "add", "task.txt") + gitCmd(t, repoDir, "commit", "-m", "task commit") + branchHead := strings.TrimSpace(gitCmd(t, repoDir, "rev-parse", "HEAD")) + gitCmd(t, repoDir, "checkout", "main") + + expected := gitCommitDate(t, repoDir, branchHead) + + // Write a legacy-ish history where task.closed has a zero timestamp and missing frozen stats. 
+ writeRawHistory(t, taskName, []history.Event{ + {TS: time.Date(2025, 12, 31, 23, 59, 0, 0, time.UTC), Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main"})}, + {TS: time.Time{}, Type: "task.closed", Data: mustJSON(map[string]any{"reason": "close"})}, + }) + + require.NoError(t, gitredesign.Ensure(repoDir)) + + events, err := history.Read(taskName, history.ReadOptions{}) + require.NoError(t, err) + + closedIdx := -1 + for i := range events { + if events[i].Type == "task.closed" { + closedIdx = i + } + } + require.GreaterOrEqual(t, closedIdx, 0) + require.True(t, events[closedIdx].TS.Equal(expected)) +} + +func mustJSON(v any) json.RawMessage { + b, _ := json.Marshal(v) + return b +} + +func writeRawHistory(t *testing.T, taskName string, events []history.Event) { + t.Helper() + var b strings.Builder + for _, ev := range events { + line, err := json.Marshal(ev) + require.NoError(t, err) + b.Write(line) + b.WriteByte('\n') + } + require.NoError(t, os.MkdirAll(task.Dir(taskName), 0o755)) + require.NoError(t, os.WriteFile(task.HistoryPath(taskName), []byte(b.String()), 0o644)) +} + +func gitCommitDate(t *testing.T, dir, commit string) time.Time { + t.Helper() + out := gitCmd(t, dir, "show", "-s", "--format=%cI", commit) + ts, err := time.Parse(time.RFC3339Nano, strings.TrimSpace(out)) + require.NoError(t, err) + return ts.UTC() +} + +func gitCmd(t *testing.T, dir string, args ...string) string { + t.Helper() + cmd := exec.Command("git", args...) 
+ cmd.Dir = dir + out, err := cmd.CombinedOutput() + if err != nil { + t.Fatalf("git %s failed: %v\n%s", strings.Join(args, " "), err, string(out)) + } + return strings.TrimSpace(string(out)) +} diff --git a/pkg/task/migrate/gitredesign/testdata/v0.1.1/README.md b/pkg/task/migrate/gitredesign/testdata/v0.1.1/README.md new file mode 100644 index 0000000..6363599 --- /dev/null +++ b/pkg/task/migrate/gitredesign/testdata/v0.1.1/README.md @@ -0,0 +1,40 @@ +# v0.1.1 fixtures (gitredesign migration) + +These fixtures are a small “real-ish” dataset created with the `v0.1.1` CLI (tag `v0.1.1`) and then minimally tweaked to exercise migration edge cases. + +## Contents + +- `repo.bundle` — a git bundle containing: + - `main` with 2 commits (`Initial commit`, plus one commit representing a merged task result) + - `legacy/merged` branch (exists) + - `legacy/closed-keep` branch (exists) + - `legacy/closed-delete` branch (deleted) +- `subtask-dir.tar.gz` — a tarball of the repo-local `.subtask/` directory produced by `subtask v0.1.1`: + - `.subtask/tasks/*` (TASK.md + history.jsonl) + - `.subtask/internal/*/op.lock` + +## Scenarios included + +- `legacy/draftonly` + - Draft-only task (no terminal event) +- `legacy/merged` + - `task.opened` missing `base_commit`/`base_ref` (forces backfill) + - `task.merged` exists but has **zero timestamp** and no frozen stats (forces timestamp + stats backfill) +- `legacy/closed-keep` + - `task.closed` exists but has **zero timestamp** and no frozen stats (forces timestamp + stats backfill) + - Task branch exists (can backfill `branch_head`) +- `legacy/closed-delete` + - `task.closed` exists with non-zero timestamp and no frozen stats + - Task branch deleted (exercises “branch missing” best-effort behavior) + +## How these fixtures were created (high level) + +1. Create a new git repo and commit `README.md` on `main`. +2. 
Build `subtask` from tag `v0.1.1` and run `subtask draft ...` for the four tasks above (this generates `.subtask/tasks/*` and `.subtask/internal/*/op.lock`). +3. Create a few git branches/commits to support migration inference. +4. Edit some `history.jsonl` lines to remove `base_commit` and to introduce zero timestamps / missing frozen stats. +5. Export: + - `git bundle create repo.bundle --all` + - `tar czf subtask-dir.tar.gz .subtask` + +The e2e test `pkg/task/migrate/gitredesign/gitredesign_e2e_test.go` imports these files into a temp dir and runs `EnsureLayout` + `gitredesign.Ensure()` against them. diff --git a/pkg/task/migrate/gitredesign/testdata/v0.1.1/repo.bundle b/pkg/task/migrate/gitredesign/testdata/v0.1.1/repo.bundle new file mode 100644 index 0000000..751b930 Binary files /dev/null and b/pkg/task/migrate/gitredesign/testdata/v0.1.1/repo.bundle differ diff --git a/pkg/task/migrate/gitredesign/testdata/v0.1.1/subtask-dir.tar.gz b/pkg/task/migrate/gitredesign/testdata/v0.1.1/subtask-dir.tar.gz new file mode 100644 index 0000000..25dddbe Binary files /dev/null and b/pkg/task/migrate/gitredesign/testdata/v0.1.1/subtask-dir.tar.gz differ diff --git a/pkg/task/migrate/layout.go b/pkg/task/migrate/layout.go new file mode 100644 index 0000000..61ad479 --- /dev/null +++ b/pkg/task/migrate/layout.go @@ -0,0 +1,252 @@ +package migrate + +import ( + "fmt" + "io" + "os" + "path/filepath" + "strings" + "sync" + "time" + + "github.com/zippoxer/subtask/internal/filelock" + "github.com/zippoxer/subtask/pkg/task" +) + +var layoutOnce struct { + mu sync.Mutex + done map[string]struct{} +} + +// EnsureLayout performs best-effort, safe migration from legacy repo-local runtime/config +// into the new global layout. It is intended to be called once on process startup +// (before other domain code runs). +// +// It is idempotent and safe to call multiple times. 
+func EnsureLayout(repoRoot string) error {
+ repoRoot = filepath.Clean(repoRoot)
+ if repoRoot == "" || repoRoot == "." {
+ return nil
+ }
+ if abs, err := filepath.Abs(repoRoot); err == nil {
+ repoRoot = abs
+ }
+
+ layoutOnce.mu.Lock()
+ if layoutOnce.done == nil {
+ layoutOnce.done = make(map[string]struct{})
+ }
+ if _, ok := layoutOnce.done[repoRoot]; ok {
+ layoutOnce.mu.Unlock()
+ return nil
+ }
+ layoutOnce.done[repoRoot] = struct{}{}
+ layoutOnce.mu.Unlock()
+
+ destProject := filepath.Join(task.ProjectsDir(), task.EscapePath(repoRoot))
+ if err := os.MkdirAll(destProject, 0o755); err != nil {
+ return fmt.Errorf("subtask: failed to prepare runtime dir at %s: %w", destProject, err)
+ }
+
+ // Serialize migrations per repo to avoid cross-process races.
+ lockPath := filepath.Join(destProject, "migrate.lock")
+ lockFile, err := os.OpenFile(lockPath, os.O_CREATE|os.O_RDWR, 0o644)
+ if err != nil {
+ return fmt.Errorf("subtask: failed to open migrate lock at %s: %w", lockPath, err)
+ }
+ defer func() { _ = lockFile.Close() }()
+ if err := filelock.LockExclusive(lockFile); err != nil {
+ return fmt.Errorf("subtask: failed to lock migrate lock at %s: %w", lockPath, err)
+ }
+ defer func() { _ = filelock.Unlock(lockFile) }()
+
+ // 1) Promote legacy project config to global defaults (if global missing).
+ if err := promoteConfig(repoRoot); err != nil {
+ return err
+ }
+
+ // 2) Migrate legacy runtime state into ~/.subtask/projects/<escaped repo path>/.
+ if err := migrateRuntime(repoRoot, destProject); err != nil { + return err + } + + return nil +} + +func promoteConfig(repoRoot string) error { + userCfg := task.ConfigPath() + if fileExists(userCfg) { + return nil + } + legacyProjectCfg := filepath.Join(repoRoot, ".subtask", "config.json") + if !fileExists(legacyProjectCfg) { + return nil + } + if err := os.MkdirAll(filepath.Dir(userCfg), 0o755); err != nil { + return fmt.Errorf("subtask: failed to create global config dir: %w", err) + } + if err := copyFileAtomic(legacyProjectCfg, userCfg); err != nil { + return fmt.Errorf("subtask: failed to migrate legacy config %s -> %s: %w", legacyProjectCfg, userCfg, err) + } + return nil +} + +func migrateRuntime(repoRoot, destProject string) error { + legacySubtask := filepath.Join(repoRoot, ".subtask") + legacyInternal := filepath.Join(legacySubtask, "internal") + legacyIndex := filepath.Join(legacySubtask, "index.db") + + destInternal := filepath.Join(destProject, "internal") + destIndex := filepath.Join(destProject, "index.db") + + if err := os.MkdirAll(destInternal, 0o755); err != nil { + return fmt.Errorf("subtask: failed to create runtime internal dir: %w", err) + } + + // Internal: merge contents (never overwrite). + if dirExists(legacyInternal) { + if err := mergeDirNoClobber(legacyInternal, destInternal); err != nil { + return fmt.Errorf("subtask: failed to migrate legacy internal dir %s -> %s: %w", legacyInternal, destInternal, err) + } + } + + // Index db: copy if missing (index is rebuildable, so do not try to merge/overwrite). + if fileExists(legacyIndex) && !fileExists(destIndex) { + if err := copyFileAtomic(legacyIndex, destIndex); err != nil { + return fmt.Errorf("subtask: failed to migrate legacy index %s -> %s: %w", legacyIndex, destIndex, err) + } + // Best-effort sqlite sidecars. 
+ _ = copyFileAtomic(legacyIndex+"-wal", destIndex+"-wal") + _ = copyFileAtomic(legacyIndex+"-shm", destIndex+"-shm") + } + + // Cleanup: legacy runtime state no longer belongs in the repo. + if err := cleanupLegacyRuntime(repoRoot); err != nil { + return err + } + + return nil +} + +func cleanupLegacyRuntime(repoRoot string) error { + repoRoot = filepath.Clean(repoRoot) + legacySubtask := filepath.Join(repoRoot, ".subtask") + legacyInternal := filepath.Join(legacySubtask, "internal") + legacyIndex := filepath.Join(legacySubtask, "index.db") + + // Safety: never remove paths that are not within the repo root. + if rel, err := filepath.Rel(repoRoot, legacyInternal); err == nil { + if rel != "." && !strings.HasPrefix(rel, ".."+string(os.PathSeparator)) && rel != ".." { + if dirExists(legacyInternal) { + if err := os.RemoveAll(legacyInternal); err != nil { + return fmt.Errorf("subtask: migrated legacy runtime but failed to remove %s: %w", legacyInternal, err) + } + } + } + } + + if rel, err := filepath.Rel(repoRoot, legacyIndex); err == nil { + if rel != "." && !strings.HasPrefix(rel, ".."+string(os.PathSeparator)) && rel != ".." { + // Remove the main db and best-effort sqlite sidecars. 
+ if err := removeFileIfExists(legacyIndex); err != nil { + return fmt.Errorf("subtask: migrated legacy runtime but failed to remove %s: %w", legacyIndex, err) + } + if err := removeFileIfExists(legacyIndex + "-wal"); err != nil { + return fmt.Errorf("subtask: migrated legacy runtime but failed to remove %s-wal: %w", legacyIndex, err) + } + if err := removeFileIfExists(legacyIndex + "-shm"); err != nil { + return fmt.Errorf("subtask: migrated legacy runtime but failed to remove %s-shm: %w", legacyIndex, err) + } + } + } + + return nil +} + +func removeFileIfExists(path string) error { + if err := os.Remove(path); err != nil { + if os.IsNotExist(err) { + return nil + } + return err + } + return nil +} + +func mergeDirNoClobber(src, dst string) error { + entries, err := os.ReadDir(src) + if err != nil { + return err + } + if err := os.MkdirAll(dst, 0o755); err != nil { + return err + } + for _, e := range entries { + srcPath := filepath.Join(src, e.Name()) + dstPath := filepath.Join(dst, e.Name()) + if e.IsDir() { + if err := mergeDirNoClobber(srcPath, dstPath); err != nil { + return err + } + continue + } + if fileExists(dstPath) { + continue + } + if err := copyFileAtomic(srcPath, dstPath); err != nil { + return err + } + } + return nil +} + +func copyFileAtomic(src, dst string) error { + if !fileExists(src) { + return nil + } + if fileExists(dst) { + return nil + } + in, err := os.Open(src) + if err != nil { + return err + } + defer in.Close() + + if err := os.MkdirAll(filepath.Dir(dst), 0o755); err != nil { + return err + } + + tmp := fmt.Sprintf("%s.tmp-%d", dst, time.Now().UnixNano()) + out, err := os.OpenFile(tmp, os.O_WRONLY|os.O_CREATE|os.O_TRUNC, 0o644) + if err != nil { + return err + } + + _, copyErr := io.Copy(out, in) + syncErr := out.Sync() + closeErr := out.Close() + if copyErr != nil { + _ = os.Remove(tmp) + return copyErr + } + if syncErr != nil { + _ = os.Remove(tmp) + return syncErr + } + if closeErr != nil { + _ = os.Remove(tmp) + return closeErr 
+ }
+ return os.Rename(tmp, dst)
+}
+
+func dirExists(path string) bool {
+ st, err := os.Stat(path)
+ return err == nil && st.IsDir()
+}
+
+func fileExists(path string) bool {
+ st, err := os.Stat(path)
+ return err == nil && !st.IsDir()
+}
diff --git a/pkg/task/migrate/layout_fixture_test.go b/pkg/task/migrate/layout_fixture_test.go
new file mode 100644
index 0000000..d5672f1
--- /dev/null
+++ b/pkg/task/migrate/layout_fixture_test.go
@@ -0,0 +1,141 @@
+package migrate
+
+import (
+ "io"
+ "os"
+ "path/filepath"
+ "testing"
+
+ "github.com/stretchr/testify/require"
+
+ "github.com/zippoxer/subtask/pkg/task"
+)
+
+func TestEnsureLayout_LegacyFixtureBasic(t *testing.T) {
+ t.Setenv("SUBTASK_DIR", t.TempDir())
+
+ repoRoot := t.TempDir()
+ require.NoError(t, copyDir(filepath.Join("testdata", "legacy", "basic"), filepath.Join(repoRoot, ".subtask")))
+
+ require.NoError(t, EnsureLayout(repoRoot))
+
+ // Promoted global config exists.
+ require.FileExists(t, task.ConfigPath())
+
+ // Runtime state moved to ~/.subtask/projects/<escaped repo path>/...
+ projectDir := filepath.Join(task.ProjectsDir(), task.EscapePath(repoRoot))
+ require.FileExists(t, filepath.Join(projectDir, "internal", "legacy--basic", "state.json"))
+ require.FileExists(t, filepath.Join(projectDir, "internal", "legacy--basic", "progress.json"))
+ require.FileExists(t, filepath.Join(projectDir, "index.db"))
+
+ // Repo cleanup: legacy runtime state removed.
+ _, err := os.Stat(filepath.Join(repoRoot, ".subtask", "internal"))
+ require.True(t, os.IsNotExist(err))
+ _, err = os.Stat(filepath.Join(repoRoot, ".subtask", "index.db"))
+ require.True(t, os.IsNotExist(err))
+
+ // Portable data stays.
+ require.FileExists(t, filepath.Join(repoRoot, ".subtask", "tasks", "legacy--basic", "TASK.md"))
+ require.FileExists(t, filepath.Join(repoRoot, ".subtask", "config.json"))
+
+ // Idempotent.
+ require.NoError(t, EnsureLayout(repoRoot)) +} + +func TestEnsureLayout_LegacyFixtureDraftOnly_NoIndex(t *testing.T) { + t.Setenv("SUBTASK_DIR", t.TempDir()) + + repoRoot := t.TempDir() + require.NoError(t, copyDir(filepath.Join("testdata", "legacy", "draft-only"), filepath.Join(repoRoot, ".subtask"))) + + require.NoError(t, EnsureLayout(repoRoot)) + + projectDir := filepath.Join(task.ProjectsDir(), task.EscapePath(repoRoot)) + require.FileExists(t, filepath.Join(projectDir, "internal", "legacy--draftonly", "op.lock")) + + // No legacy index => no runtime index created by layout migration. + _, err := os.Stat(filepath.Join(projectDir, "index.db")) + require.True(t, os.IsNotExist(err)) + + // Repo cleanup. + _, err = os.Stat(filepath.Join(repoRoot, ".subtask", "internal")) + require.True(t, os.IsNotExist(err)) + _, err = os.Stat(filepath.Join(repoRoot, ".subtask", "index.db")) + require.True(t, os.IsNotExist(err)) +} + +func TestEnsureLayout_LegacyFixtureMulti(t *testing.T) { + t.Setenv("SUBTASK_DIR", t.TempDir()) + + repoRoot := t.TempDir() + require.NoError(t, copyDir(filepath.Join("testdata", "legacy", "multi"), filepath.Join(repoRoot, ".subtask"))) + + require.NoError(t, EnsureLayout(repoRoot)) + + projectDir := filepath.Join(task.ProjectsDir(), task.EscapePath(repoRoot)) + require.FileExists(t, filepath.Join(projectDir, "internal", "legacy--open", "state.json")) + require.FileExists(t, filepath.Join(projectDir, "internal", "legacy--open", "progress.json")) + require.FileExists(t, filepath.Join(projectDir, "internal", "legacy--closed", "state.json")) + require.FileExists(t, filepath.Join(projectDir, "internal", "legacy--merged", "state.json")) + require.FileExists(t, filepath.Join(projectDir, "index.db")) + + // Repo cleanup. 
+ _, err := os.Stat(filepath.Join(repoRoot, ".subtask", "internal")) + require.True(t, os.IsNotExist(err)) + _, err = os.Stat(filepath.Join(repoRoot, ".subtask", "index.db")) + require.True(t, os.IsNotExist(err)) + + // Portable history remains. + require.FileExists(t, filepath.Join(repoRoot, ".subtask", "tasks", "legacy--merged", "history.jsonl")) + require.FileExists(t, filepath.Join(repoRoot, ".subtask", "tasks", "legacy--closed", "history.jsonl")) + require.FileExists(t, filepath.Join(repoRoot, ".subtask", "tasks", "legacy--open", "history.jsonl")) +} + +func copyDir(src, dst string) error { + entries, err := os.ReadDir(src) + if err != nil { + return err + } + if err := os.MkdirAll(dst, 0o755); err != nil { + return err + } + for _, e := range entries { + srcPath := filepath.Join(src, e.Name()) + dstPath := filepath.Join(dst, e.Name()) + if e.IsDir() { + if err := copyDir(srcPath, dstPath); err != nil { + return err + } + continue + } + if err := copyFile(srcPath, dstPath); err != nil { + return err + } + } + return nil +} + +func copyFile(src, dst string) error { + st, err := os.Stat(src) + if err != nil { + return err + } + if err := os.MkdirAll(filepath.Dir(dst), 0o755); err != nil { + return err + } + in, err := os.Open(src) + if err != nil { + return err + } + defer in.Close() + out, err := os.OpenFile(dst, os.O_CREATE|os.O_WRONLY|os.O_TRUNC, st.Mode()) + if err != nil { + return err + } + _, copyErr := io.Copy(out, in) + closeErr := out.Close() + if copyErr != nil { + return copyErr + } + return closeErr +} diff --git a/pkg/task/migrate/layout_thorough_test.go b/pkg/task/migrate/layout_thorough_test.go new file mode 100644 index 0000000..1c3cca8 --- /dev/null +++ b/pkg/task/migrate/layout_thorough_test.go @@ -0,0 +1,303 @@ +package migrate + +import ( + "bytes" + "fmt" + "os" + "os/exec" + "path/filepath" + "strings" + "testing" + "time" + + "github.com/stretchr/testify/require" + + "github.com/zippoxer/subtask/internal/filelock" + 
"github.com/zippoxer/subtask/pkg/task" +) + +func TestEnsureLayout_HappyPath_CopiesSidecarsAndDeletesLegacy(t *testing.T) { + t.Setenv("SUBTASK_DIR", t.TempDir()) + + repoRoot := t.TempDir() + require.NoError(t, copyDir(filepath.Join("testdata", "legacy", "basic"), filepath.Join(repoRoot, ".subtask"))) + + // Simulate sqlite sidecars. + require.NoError(t, os.WriteFile(filepath.Join(repoRoot, ".subtask", "index.db-wal"), []byte("wal"), 0o644)) + require.NoError(t, os.WriteFile(filepath.Join(repoRoot, ".subtask", "index.db-shm"), []byte("shm"), 0o644)) + + require.NoError(t, EnsureLayout(repoRoot)) + + projectDir := filepath.Join(task.ProjectsDir(), task.EscapePath(repoRoot)) + require.FileExists(t, filepath.Join(projectDir, "index.db")) + require.FileExists(t, filepath.Join(projectDir, "index.db-wal")) + require.FileExists(t, filepath.Join(projectDir, "index.db-shm")) + + // Legacy runtime state removed from repo. + require.NoDirExists(t, filepath.Join(repoRoot, ".subtask", "internal")) + require.NoFileExists(t, filepath.Join(repoRoot, ".subtask", "index.db")) + require.NoFileExists(t, filepath.Join(repoRoot, ".subtask", "index.db-wal")) + require.NoFileExists(t, filepath.Join(repoRoot, ".subtask", "index.db-shm")) +} + +func TestEnsureLayout_IdempotentAcrossProcesses_SecondRunNoOp(t *testing.T) { + t.Setenv("SUBTASK_DIR", t.TempDir()) + + repoRoot := t.TempDir() + require.NoError(t, copyDir(filepath.Join("testdata", "legacy", "basic"), filepath.Join(repoRoot, ".subtask"))) + + require.NoError(t, runEnsureLayoutSubprocess(t, repoRoot)) + + projectDir := filepath.Join(task.ProjectsDir(), task.EscapePath(repoRoot)) + statePath := filepath.Join(projectDir, "internal", "legacy--basic", "state.json") + before := readFileString(t, statePath) + + require.NoError(t, runEnsureLayoutSubprocess(t, repoRoot)) + + after := readFileString(t, statePath) + require.Equal(t, before, after) +} + +func TestEnsureLayout_PartialDestExists_MergeNoClobber(t *testing.T) { + 
t.Setenv("SUBTASK_DIR", t.TempDir()) + + repoRoot := t.TempDir() + require.NoError(t, copyDir(filepath.Join("testdata", "legacy", "basic"), filepath.Join(repoRoot, ".subtask"))) + + projectDir := filepath.Join(task.ProjectsDir(), task.EscapePath(repoRoot)) + destInternal := filepath.Join(projectDir, "internal") + require.NoError(t, os.MkdirAll(destInternal, 0o755)) + + // Pre-create a file that should not be overwritten by mergeDirNoClobber. + preexistingState := filepath.Join(destInternal, "legacy--basic", "state.json") + require.NoError(t, os.MkdirAll(filepath.Dir(preexistingState), 0o755)) + require.NoError(t, os.WriteFile(preexistingState, []byte(`{"preexisting":true}`), 0o644)) + + // Pre-create index.db so migration doesn't overwrite it. + destIndex := filepath.Join(projectDir, "index.db") + require.NoError(t, os.WriteFile(destIndex, []byte("dest-index"), 0o644)) + + require.NoError(t, EnsureLayout(repoRoot)) + + require.Equal(t, `{"preexisting":true}`, strings.TrimSpace(readFileString(t, preexistingState))) + require.Equal(t, "dest-index", strings.TrimSpace(readFileString(t, destIndex))) + + // Files that didn't exist should be copied. + require.FileExists(t, filepath.Join(destInternal, "legacy--basic", "progress.json")) + require.FileExists(t, filepath.Join(destInternal, "legacy--basic", "op.lock")) + + // Legacy sources removed. 
+ require.NoDirExists(t, filepath.Join(repoRoot, ".subtask", "internal")) + require.NoFileExists(t, filepath.Join(repoRoot, ".subtask", "index.db")) +} + +func TestEnsureLayout_ConcurrentMigrations_SerializedByLock(t *testing.T) { + t.Setenv("SUBTASK_DIR", t.TempDir()) + + repoRoot := t.TempDir() + require.NoError(t, copyDir(filepath.Join("testdata", "legacy", "basic"), filepath.Join(repoRoot, ".subtask"))) + + projectDir := filepath.Join(task.ProjectsDir(), task.EscapePath(repoRoot)) + require.NoError(t, os.MkdirAll(projectDir, 0o755)) + + lockPath := filepath.Join(projectDir, "migrate.lock") + lockFile, err := os.OpenFile(lockPath, os.O_CREATE|os.O_RDWR, 0o644) + require.NoError(t, err) + t.Cleanup(func() { _ = lockFile.Close() }) + require.NoError(t, filelock.LockExclusive(lockFile)) + + cmd1, out1 := startEnsureLayoutSubprocess(t, repoRoot) + cmd2, out2 := startEnsureLayoutSubprocess(t, repoRoot) + t.Cleanup(func() { + _ = cmd1.Process.Kill() + _ = cmd2.Process.Kill() + }) + + done1 := waitCmdAsync(cmd1) + done2 := waitCmdAsync(cmd2) + + select { + case err := <-done1: + t.Fatalf("process 1 finished unexpectedly while lock held: %v\n%s", err, out1.String()) + case <-time.After(500 * time.Millisecond): + } + select { + case err := <-done2: + t.Fatalf("process 2 finished unexpectedly while lock held: %v\n%s", err, out2.String()) + case <-time.After(10 * time.Millisecond): + } + + require.NoError(t, filelock.Unlock(lockFile)) + + require.NoError(t, waitWithTimeout(done1, 5*time.Second), out1.String()) + require.NoError(t, waitWithTimeout(done2, 5*time.Second), out2.String()) + + // Migration succeeded and cleaned up legacy runtime. 
+	require.NoDirExists(t, filepath.Join(repoRoot, ".subtask", "internal"))
+	require.NoFileExists(t, filepath.Join(repoRoot, ".subtask", "index.db"))
+	require.FileExists(t, filepath.Join(projectDir, "internal", "legacy--basic", "state.json"))
+	require.FileExists(t, filepath.Join(projectDir, "index.db"))
+}
+
+func TestEnsureLayout_CrashAfterCopyBeforeDelete_RerunCleansUp(t *testing.T) {
+	t.Setenv("SUBTASK_DIR", t.TempDir())
+
+	repoRoot := t.TempDir()
+	require.NoError(t, copyDir(filepath.Join("testdata", "legacy", "basic"), filepath.Join(repoRoot, ".subtask")))
+
+	projectDir := filepath.Join(task.ProjectsDir(), task.EscapePath(repoRoot))
+	legacyInternal := filepath.Join(repoRoot, ".subtask", "internal")
+	legacyIndex := filepath.Join(repoRoot, ".subtask", "index.db")
+	destInternal := filepath.Join(projectDir, "internal")
+	destIndex := filepath.Join(projectDir, "index.db")
+
+	require.NoError(t, os.MkdirAll(destInternal, 0o755))
+	require.NoError(t, mergeDirNoClobber(legacyInternal, destInternal))
+	require.NoError(t, copyFileAtomic(legacyIndex, destIndex))
+
+	// "Crash": legacy sources still exist, but dest already has copied data.
+	require.DirExists(t, legacyInternal)
+	require.FileExists(t, legacyIndex)
+	require.FileExists(t, filepath.Join(destInternal, "legacy--basic", "state.json"))
+	require.FileExists(t, destIndex)
+
+	require.NoError(t, EnsureLayout(repoRoot))
+
+	require.NoDirExists(t, legacyInternal)
+	require.NoFileExists(t, legacyIndex)
+}
+
+func TestEnsureLayout_NoLegacyRuntime_NoOp(t *testing.T) {
+	t.Setenv("SUBTASK_DIR", t.TempDir())
+
+	repoRoot := t.TempDir()
+	require.NoError(t, EnsureLayout(repoRoot))
+
+	// Does not create repo-local runtime state.
+	require.NoDirExists(t, filepath.Join(repoRoot, ".subtask", "internal"))
+	require.NoFileExists(t, filepath.Join(repoRoot, ".subtask", "index.db"))
+
+	// Does not promote config when legacy config is absent.
+	require.NoFileExists(t, task.ConfigPath())
+}
+
+func TestEnsureLayout_DestUnwritable_ReturnsErrorAndLeavesLegacyUntouched(t *testing.T) {
+	t.Setenv("SUBTASK_DIR", t.TempDir())
+
+	projectsDir := filepath.Join(task.GlobalDir(), "projects")
+	require.NoError(t, os.MkdirAll(projectsDir, 0o755))
+	require.NoError(t, os.Chmod(projectsDir, 0o555))
+	t.Cleanup(func() { _ = os.Chmod(projectsDir, 0o755) })
+
+	repoRoot := t.TempDir()
+	legacyInternal := filepath.Join(repoRoot, ".subtask", "internal", "some-task")
+	require.NoError(t, os.MkdirAll(legacyInternal, 0o755))
+	require.NoError(t, os.WriteFile(filepath.Join(legacyInternal, "state.json"), []byte("state"), 0o644))
+	require.NoError(t, os.WriteFile(filepath.Join(repoRoot, ".subtask", "index.db"), []byte("db"), 0o644))
+
+	err := EnsureLayout(repoRoot)
+	require.Error(t, err)
+
+	// Legacy files remain on error.
+	require.DirExists(t, filepath.Join(repoRoot, ".subtask", "internal"))
+	require.FileExists(t, filepath.Join(repoRoot, ".subtask", "index.db"))
+	require.Equal(t, "state", strings.TrimSpace(readFileString(t, filepath.Join(legacyInternal, "state.json"))))
+}
+
+func TestEnsureLayout_LargeInternalFolder_CopiesAll(t *testing.T) {
+	t.Setenv("SUBTASK_DIR", t.TempDir())
+
+	repoRoot := t.TempDir()
+	legacyInternal := filepath.Join(repoRoot, ".subtask", "internal")
+	require.NoError(t, os.MkdirAll(legacyInternal, 0o755))
+	require.NoError(t, os.WriteFile(filepath.Join(repoRoot, ".subtask", "index.db"), []byte("db"), 0o644))
+
+	const taskDirs = 100
+	const filesPerTask = 3
+	expected := make(map[string]string, taskDirs*filesPerTask)
+	for i := 0; i < taskDirs; i++ {
+		dir := filepath.Join(legacyInternal, fmt.Sprintf("task-%03d", i))
+		require.NoError(t, os.MkdirAll(dir, 0o755))
+		for j := 0; j < filesPerTask; j++ {
+			rel := filepath.Join(fmt.Sprintf("task-%03d", i), fmt.Sprintf("file-%d.txt", j))
+			content := fmt.Sprintf("task=%d file=%d", i, j)
+			require.NoError(t, os.WriteFile(filepath.Join(legacyInternal, rel), []byte(content), 0o644))
+			expected[rel] = content
+		}
+	}
+
+	require.NoError(t, EnsureLayout(repoRoot))
+
+	projectDir := filepath.Join(task.ProjectsDir(), task.EscapePath(repoRoot))
+	destInternal := filepath.Join(projectDir, "internal")
+	for rel, content := range expected {
+		dstPath := filepath.Join(destInternal, rel)
+		require.FileExists(t, dstPath)
+		require.Equal(t, content, strings.TrimSpace(readFileString(t, dstPath)))
+	}
+
+	require.NoDirExists(t, legacyInternal)
+	require.NoFileExists(t, filepath.Join(repoRoot, ".subtask", "index.db"))
+}
+
+func TestHelperProcessEnsureLayout(t *testing.T) {
+	if os.Getenv("SUBTASK_HELPER_PROCESS") != "1" {
+		return
+	}
+	repoRoot := strings.TrimSpace(os.Getenv("SUBTASK_HELPER_REPO_ROOT"))
+	if repoRoot == "" {
+		_, _ = fmt.Fprintln(os.Stderr, "missing SUBTASK_HELPER_REPO_ROOT")
+		os.Exit(2)
+	}
+	if err := EnsureLayout(repoRoot); err != nil {
+		_, _ = fmt.Fprintln(os.Stderr, err.Error())
+		os.Exit(1)
+	}
+	os.Exit(0)
+}
+
+func runEnsureLayoutSubprocess(t *testing.T, repoRoot string) error {
+	t.Helper()
+	cmd, buf := startEnsureLayoutSubprocess(t, repoRoot)
+	if err := cmd.Wait(); err != nil {
+		return fmt.Errorf("EnsureLayout subprocess failed: %w\n%s", err, buf.String())
+	}
+	return nil
+}
+
+func startEnsureLayoutSubprocess(t *testing.T, repoRoot string) (*exec.Cmd, *bytes.Buffer) {
+	t.Helper()
+
+	cmd := exec.Command(os.Args[0], "-test.run=^TestHelperProcessEnsureLayout$", "-test.v")
+	cmd.Env = append(os.Environ(),
+		"SUBTASK_HELPER_PROCESS=1",
+		"SUBTASK_HELPER_REPO_ROOT="+repoRoot,
+	)
+	var buf bytes.Buffer
+	cmd.Stdout = &buf
+	cmd.Stderr = &buf
+	require.NoError(t, cmd.Start())
+	return cmd, &buf
+}
+
+func waitCmdAsync(cmd *exec.Cmd) <-chan error {
+	ch := make(chan error, 1)
+	go func() { ch <- cmd.Wait() }()
+	return ch
+}
+
+func waitWithTimeout(ch <-chan error, timeout time.Duration) error {
+	select {
+	case err := <-ch:
+		return err
+	case <-time.After(timeout):
+		return fmt.Errorf("timeout after %s", timeout)
+	}
+}
+
+func readFileString(t *testing.T, path string) string {
+	t.Helper()
+	b, err := os.ReadFile(path)
+	require.NoError(t, err)
+	return string(b)
+}
diff --git a/pkg/task/migrate/testdata/legacy/basic/config.json b/pkg/task/migrate/testdata/legacy/basic/config.json
new file mode 100644
index 0000000..03e4a7a
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/basic/config.json
@@ -0,0 +1,7 @@
+{
+  "harness": "mock",
+  "max_workspaces": 3,
+  "options": {
+    "tool_calls": 2
+  }
+}
diff --git a/pkg/task/migrate/testdata/legacy/basic/index.db b/pkg/task/migrate/testdata/legacy/basic/index.db
new file mode 100644
index 0000000..e70ffd5
Binary files /dev/null and b/pkg/task/migrate/testdata/legacy/basic/index.db differ
diff --git a/pkg/task/migrate/testdata/legacy/basic/internal/legacy--basic/op.lock b/pkg/task/migrate/testdata/legacy/basic/internal/legacy--basic/op.lock
new file mode 100644
index 0000000..e69de29
diff --git a/pkg/task/migrate/testdata/legacy/basic/internal/legacy--basic/progress.json b/pkg/task/migrate/testdata/legacy/basic/internal/legacy--basic/progress.json
new file mode 100644
index 0000000..3cb0a76
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/basic/internal/legacy--basic/progress.json
@@ -0,0 +1,4 @@
+{
+  "tool_calls": 2,
+  "last_activity": "2026-01-20T23:45:08.680693+02:00"
+}
\ No newline at end of file
diff --git a/pkg/task/migrate/testdata/legacy/basic/internal/legacy--basic/state.json b/pkg/task/migrate/testdata/legacy/basic/internal/legacy--basic/state.json
new file mode 100644
index 0000000..c8747fc
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/basic/internal/legacy--basic/state.json
@@ -0,0 +1,6 @@
+{
+  "workspace": "/tmp/subtask-legacy-fixtures-25896/home/.subtask/workspaces/-private-tmp-subtask-legacy-fixtures-25896-basic-repo--1",
+  "session_id": "mock-session-1768945508658093000",
+  "harness": "mock",
+  "started_at": "0001-01-01T00:00:00Z"
+}
\ No newline at end of file
diff --git a/pkg/task/migrate/testdata/legacy/basic/internal/workspace.lock b/pkg/task/migrate/testdata/legacy/basic/internal/workspace.lock
new file mode 100644
index 0000000..e69de29
diff --git a/pkg/task/migrate/testdata/legacy/basic/tasks/legacy--basic/TASK.md b/pkg/task/migrate/testdata/legacy/basic/tasks/legacy--basic/TASK.md
new file mode 100644
index 0000000..4869c6d
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/basic/tasks/legacy--basic/TASK.md
@@ -0,0 +1,7 @@
+---
+title: Basic
+base-branch: main
+schema: 1
+---
+
+basic
diff --git a/pkg/task/migrate/testdata/legacy/basic/tasks/legacy--basic/WORKFLOW.yaml b/pkg/task/migrate/testdata/legacy/basic/tasks/legacy--basic/WORKFLOW.yaml
new file mode 100644
index 0000000..8dd7c97
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/basic/tasks/legacy--basic/WORKFLOW.yaml
@@ -0,0 +1,32 @@
+name: default
+description: Simple task execution
+
+instructions:
+  worker: |
+    When applicable, create PROGRESS.json to track your steps:
+    ```json
+    [
+      {"step": "First step", "done": false},
+      {"step": "Second step", "done": false}
+    ]
+    ```
+    Update `done: true` as you complete each step, or revise as needed. Commit when done.
+
+stages:
+  - name: doing
+    instructions: |
+      Worker is implementing.
+
+      Check progress: `subtask show `
+      When ready for lead review: `subtask stage review` to get instructions
+
+  - name: review
+    instructions: |
+      Review code with `subtask diff --stat ` and `subtask diff `.
+
+      Request changes: `subtask send "Please..."`
+      When ready for human review: `subtask stage ready`
+
+  - name: ready
+    instructions: |
+      Task is ready. Notify human and ask them if they want to merge or create a PR.
diff --git a/pkg/task/migrate/testdata/legacy/basic/tasks/legacy--basic/history.jsonl b/pkg/task/migrate/testdata/legacy/basic/tasks/legacy--basic/history.jsonl
new file mode 100644
index 0000000..475acad
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/basic/tasks/legacy--basic/history.jsonl
@@ -0,0 +1,7 @@
+{"ts":"2026-01-20T21:45:08.55108Z","type":"task.opened","data":{"base_branch":"main","base_commit":"214cf94d51884866abbd4f05d93abef85723b304","base_ref":"main","branch":"legacy/basic","follow_up":"","model":"","reason":"draft","reasoning":"","title":"Basic","workflow":"default"}}
+{"ts":"2026-01-20T21:45:08.55108Z","type":"stage.changed","data":{"from":"","to":"doing"}}
+{"ts":"2026-01-20T21:45:08.586222Z","type":"message","role":"lead","content":"do"}
+{"ts":"2026-01-20T21:45:08.586222Z","type":"worker.started","data":{"prompt_bytes":2,"run_id":"7fbf0721d67cffd2"}}
+{"ts":"2026-01-20T21:45:08.661799Z","type":"worker.session","data":{"action":"started","harness":"mock","session_id":"mock-session-1768945508658093000"}}
+{"ts":"2026-01-20T21:45:08.695544Z","type":"message","role":"worker","content":"Mock response for: # Task\nName: legacy/basic\nTitle: Basic\nBranch: leg"}
+{"ts":"2026-01-20T21:45:08.695544Z","type":"worker.finished","data":{"duration_ms":38,"outcome":"replied","run_id":"7fbf0721d67cffd2","tool_calls":2}}
diff --git a/pkg/task/migrate/testdata/legacy/draft-only/config.json b/pkg/task/migrate/testdata/legacy/draft-only/config.json
new file mode 100644
index 0000000..03e4a7a
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/draft-only/config.json
@@ -0,0 +1,7 @@
+{
+  "harness": "mock",
+  "max_workspaces": 3,
+  "options": {
+    "tool_calls": 2
+  }
+}
diff --git a/pkg/task/migrate/testdata/legacy/draft-only/internal/legacy--draftonly/op.lock b/pkg/task/migrate/testdata/legacy/draft-only/internal/legacy--draftonly/op.lock
new file mode 100644
index 0000000..e69de29
diff --git a/pkg/task/migrate/testdata/legacy/draft-only/tasks/legacy--draftonly/TASK.md b/pkg/task/migrate/testdata/legacy/draft-only/tasks/legacy--draftonly/TASK.md
new file mode 100644
index 0000000..b14c84b
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/draft-only/tasks/legacy--draftonly/TASK.md
@@ -0,0 +1,7 @@
+---
+title: DraftOnly
+base-branch: main
+schema: 1
+---
+
+draft only
diff --git a/pkg/task/migrate/testdata/legacy/draft-only/tasks/legacy--draftonly/WORKFLOW.yaml b/pkg/task/migrate/testdata/legacy/draft-only/tasks/legacy--draftonly/WORKFLOW.yaml
new file mode 100644
index 0000000..8dd7c97
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/draft-only/tasks/legacy--draftonly/WORKFLOW.yaml
@@ -0,0 +1,32 @@
+name: default
+description: Simple task execution
+
+instructions:
+  worker: |
+    When applicable, create PROGRESS.json to track your steps:
+    ```json
+    [
+      {"step": "First step", "done": false},
+      {"step": "Second step", "done": false}
+    ]
+    ```
+    Update `done: true` as you complete each step, or revise as needed. Commit when done.
+
+stages:
+  - name: doing
+    instructions: |
+      Worker is implementing.
+
+      Check progress: `subtask show `
+      When ready for lead review: `subtask stage review` to get instructions
+
+  - name: review
+    instructions: |
+      Review code with `subtask diff --stat ` and `subtask diff `.
+
+      Request changes: `subtask send "Please..."`
+      When ready for human review: `subtask stage ready`
+
+  - name: ready
+    instructions: |
+      Task is ready. Notify human and ask them if they want to merge or create a PR.
diff --git a/pkg/task/migrate/testdata/legacy/draft-only/tasks/legacy--draftonly/history.jsonl b/pkg/task/migrate/testdata/legacy/draft-only/tasks/legacy--draftonly/history.jsonl
new file mode 100644
index 0000000..3ca9df0
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/draft-only/tasks/legacy--draftonly/history.jsonl
@@ -0,0 +1,2 @@
+{"ts":"2026-01-20T21:45:30.350386Z","type":"task.opened","data":{"base_branch":"main","base_commit":"0159214c7a57f9aa92aec1f703b48f6bb29b4911","base_ref":"main","branch":"legacy/draftonly","follow_up":"","model":"","reason":"draft","reasoning":"","title":"DraftOnly","workflow":"default"}}
+{"ts":"2026-01-20T21:45:30.350386Z","type":"stage.changed","data":{"from":"","to":"doing"}}
diff --git a/pkg/task/migrate/testdata/legacy/multi/config.json b/pkg/task/migrate/testdata/legacy/multi/config.json
new file mode 100644
index 0000000..03e4a7a
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/config.json
@@ -0,0 +1,7 @@
+{
+  "harness": "mock",
+  "max_workspaces": 3,
+  "options": {
+    "tool_calls": 2
+  }
+}
diff --git a/pkg/task/migrate/testdata/legacy/multi/index.db b/pkg/task/migrate/testdata/legacy/multi/index.db
new file mode 100644
index 0000000..0c8d755
Binary files /dev/null and b/pkg/task/migrate/testdata/legacy/multi/index.db differ
diff --git a/pkg/task/migrate/testdata/legacy/multi/internal/legacy--closed/op.lock b/pkg/task/migrate/testdata/legacy/multi/internal/legacy--closed/op.lock
new file mode 100644
index 0000000..e69de29
diff --git a/pkg/task/migrate/testdata/legacy/multi/internal/legacy--closed/progress.json b/pkg/task/migrate/testdata/legacy/multi/internal/legacy--closed/progress.json
new file mode 100644
index 0000000..cc6c66d
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/internal/legacy--closed/progress.json
@@ -0,0 +1,4 @@
+{
+  "tool_calls": 2,
+  "last_activity": "2026-01-20T23:45:09.411741+02:00"
+}
\ No newline at end of file
diff --git a/pkg/task/migrate/testdata/legacy/multi/internal/legacy--closed/state.json b/pkg/task/migrate/testdata/legacy/multi/internal/legacy--closed/state.json
new file mode 100644
index 0000000..a5a0750
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/internal/legacy--closed/state.json
@@ -0,0 +1,3 @@
+{
+  "started_at": "0001-01-01T00:00:00Z"
+}
\ No newline at end of file
diff --git a/pkg/task/migrate/testdata/legacy/multi/internal/legacy--draftonly/op.lock b/pkg/task/migrate/testdata/legacy/multi/internal/legacy--draftonly/op.lock
new file mode 100644
index 0000000..e69de29
diff --git a/pkg/task/migrate/testdata/legacy/multi/internal/legacy--merged/op.lock b/pkg/task/migrate/testdata/legacy/multi/internal/legacy--merged/op.lock
new file mode 100644
index 0000000..e69de29
diff --git a/pkg/task/migrate/testdata/legacy/multi/internal/legacy--merged/progress.json b/pkg/task/migrate/testdata/legacy/multi/internal/legacy--merged/progress.json
new file mode 100644
index 0000000..03d01eb
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/internal/legacy--merged/progress.json
@@ -0,0 +1,4 @@
+{
+  "tool_calls": 2,
+  "last_activity": "2026-01-20T23:45:09.717929+02:00"
+}
\ No newline at end of file
diff --git a/pkg/task/migrate/testdata/legacy/multi/internal/legacy--merged/state.json b/pkg/task/migrate/testdata/legacy/multi/internal/legacy--merged/state.json
new file mode 100644
index 0000000..a5a0750
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/internal/legacy--merged/state.json
@@ -0,0 +1,3 @@
+{
+  "started_at": "0001-01-01T00:00:00Z"
+}
\ No newline at end of file
diff --git a/pkg/task/migrate/testdata/legacy/multi/internal/legacy--open/op.lock b/pkg/task/migrate/testdata/legacy/multi/internal/legacy--open/op.lock
new file mode 100644
index 0000000..e69de29
diff --git a/pkg/task/migrate/testdata/legacy/multi/internal/legacy--open/progress.json b/pkg/task/migrate/testdata/legacy/multi/internal/legacy--open/progress.json
new file mode 100644
index 0000000..87f7fbc
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/internal/legacy--open/progress.json
@@ -0,0 +1,4 @@
+{
+  "tool_calls": 2,
+  "last_activity": "2026-01-20T23:45:09.119232+02:00"
+}
\ No newline at end of file
diff --git a/pkg/task/migrate/testdata/legacy/multi/internal/legacy--open/state.json b/pkg/task/migrate/testdata/legacy/multi/internal/legacy--open/state.json
new file mode 100644
index 0000000..58c6562
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/internal/legacy--open/state.json
@@ -0,0 +1,6 @@
+{
+  "workspace": "/tmp/subtask-legacy-fixtures-25896/home/.subtask/workspaces/-private-tmp-subtask-legacy-fixtures-25896-multi-repo--1",
+  "session_id": "mock-session-1768945509097128000",
+  "harness": "mock",
+  "started_at": "0001-01-01T00:00:00Z"
+}
\ No newline at end of file
diff --git a/pkg/task/migrate/testdata/legacy/multi/internal/workspace.lock b/pkg/task/migrate/testdata/legacy/multi/internal/workspace.lock
new file mode 100644
index 0000000..e69de29
diff --git a/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--closed/TASK.md b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--closed/TASK.md
new file mode 100644
index 0000000..939de1d
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--closed/TASK.md
@@ -0,0 +1,7 @@
+---
+title: Closed
+base-branch: main
+schema: 1
+---
+
+closed
diff --git a/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--closed/WORKFLOW.yaml b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--closed/WORKFLOW.yaml
new file mode 100644
index 0000000..8dd7c97
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--closed/WORKFLOW.yaml
@@ -0,0 +1,32 @@
+name: default
+description: Simple task execution
+
+instructions:
+  worker: |
+    When applicable, create PROGRESS.json to track your steps:
+    ```json
+    [
+      {"step": "First step", "done": false},
+      {"step": "Second step", "done": false}
+    ]
+    ```
+    Update `done: true` as you complete each step, or revise as needed. Commit when done.
+
+stages:
+  - name: doing
+    instructions: |
+      Worker is implementing.
+
+      Check progress: `subtask show `
+      When ready for lead review: `subtask stage review` to get instructions
+
+  - name: review
+    instructions: |
+      Review code with `subtask diff --stat ` and `subtask diff `.
+
+      Request changes: `subtask send "Please..."`
+      When ready for human review: `subtask stage ready`
+
+  - name: ready
+    instructions: |
+      Task is ready. Notify human and ask them if they want to merge or create a PR.
diff --git a/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--closed/history.jsonl b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--closed/history.jsonl
new file mode 100644
index 0000000..642e5bc
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--closed/history.jsonl
@@ -0,0 +1,9 @@
+{"ts":"2026-01-20T21:45:09.290766Z","type":"task.opened","data":{"base_branch":"main","base_commit":"3c15fd321445639fc6e6a611a7a58ac49e741c1d","base_ref":"main","branch":"legacy/closed","follow_up":"","model":"","reason":"draft","reasoning":"","title":"Closed","workflow":"default"}}
+{"ts":"2026-01-20T21:45:09.290766Z","type":"stage.changed","data":{"from":"","to":"doing"}}
+{"ts":"2026-01-20T21:45:09.320839Z","type":"message","role":"lead","content":"do"}
+{"ts":"2026-01-20T21:45:09.320839Z","type":"worker.started","data":{"prompt_bytes":2,"run_id":"eeeaf94875438136"}}
+{"ts":"2026-01-20T21:45:09.390918Z","type":"worker.session","data":{"action":"started","harness":"mock","session_id":"mock-session-1768945509386221000"}}
+{"ts":"2026-01-20T21:45:09.426051Z","type":"message","role":"worker","content":"Mock response for: # Task\nName: legacy/closed\nTitle: Closed\nBranch: l"}
+{"ts":"2026-01-20T21:45:09.426051Z","type":"worker.finished","data":{"duration_ms":40,"outcome":"replied","run_id":"eeeaf94875438136","tool_calls":2}}
+{"ts":"2026-01-20T21:45:09.585256Z","type":"task.closed","data":{"reason":"close"}}
+{"ts":"2026-01-20T21:45:09.801387Z","type":"task.merged","data":{"branch":"legacy/closed","branch_head":"3c15fd321445639fc6e6a611a7a58ac49e741c1d","commit":"3c15fd321445639fc6e6a611a7a58ac49e741c1d","integrated_reason":"ancestor","into":"main","target_head":"3c15fd321445639fc6e6a611a7a58ac49e741c1d","via":"detected"}}
diff --git a/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--draftonly/TASK.md b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--draftonly/TASK.md
new file mode 100644
index 0000000..b14c84b
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--draftonly/TASK.md
@@ -0,0 +1,7 @@
+---
+title: DraftOnly
+base-branch: main
+schema: 1
+---
+
+draft only
diff --git a/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--draftonly/WORKFLOW.yaml b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--draftonly/WORKFLOW.yaml
new file mode 100644
index 0000000..8dd7c97
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--draftonly/WORKFLOW.yaml
@@ -0,0 +1,32 @@
+name: default
+description: Simple task execution
+
+instructions:
+  worker: |
+    When applicable, create PROGRESS.json to track your steps:
+    ```json
+    [
+      {"step": "First step", "done": false},
+      {"step": "Second step", "done": false}
+    ]
+    ```
+    Update `done: true` as you complete each step, or revise as needed. Commit when done.
+
+stages:
+  - name: doing
+    instructions: |
+      Worker is implementing.
+
+      Check progress: `subtask show `
+      When ready for lead review: `subtask stage review` to get instructions
+
+  - name: review
+    instructions: |
+      Review code with `subtask diff --stat ` and `subtask diff `.
+
+      Request changes: `subtask send "Please..."`
+      When ready for human review: `subtask stage ready`
+
+  - name: ready
+    instructions: |
+      Task is ready. Notify human and ask them if they want to merge or create a PR.
diff --git a/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--draftonly/history.jsonl b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--draftonly/history.jsonl
new file mode 100644
index 0000000..adfe49b
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--draftonly/history.jsonl
@@ -0,0 +1,2 @@
+{"ts":"2026-01-20T21:45:08.952776Z","type":"task.opened","data":{"base_branch":"main","base_commit":"3c15fd321445639fc6e6a611a7a58ac49e741c1d","base_ref":"main","branch":"legacy/draftonly","follow_up":"","model":"","reason":"draft","reasoning":"","title":"DraftOnly","workflow":"default"}}
+{"ts":"2026-01-20T21:45:08.952776Z","type":"stage.changed","data":{"from":"","to":"doing"}}
diff --git a/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--merged/TASK.md b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--merged/TASK.md
new file mode 100644
index 0000000..b2fa5f9
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--merged/TASK.md
@@ -0,0 +1,7 @@
+---
+title: Merged
+base-branch: main
+schema: 1
+---
+
+merged
diff --git a/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--merged/WORKFLOW.yaml b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--merged/WORKFLOW.yaml
new file mode 100644
index 0000000..8dd7c97
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--merged/WORKFLOW.yaml
@@ -0,0 +1,32 @@
+name: default
+description: Simple task execution
+
+instructions:
+  worker: |
+    When applicable, create PROGRESS.json to track your steps:
+    ```json
+    [
+      {"step": "First step", "done": false},
+      {"step": "Second step", "done": false}
+    ]
+    ```
+    Update `done: true` as you complete each step, or revise as needed. Commit when done.
+
+stages:
+  - name: doing
+    instructions: |
+      Worker is implementing.
+
+      Check progress: `subtask show `
+      When ready for lead review: `subtask stage review` to get instructions
+
+  - name: review
+    instructions: |
+      Review code with `subtask diff --stat ` and `subtask diff `.
+
+      Request changes: `subtask send "Please..."`
+      When ready for human review: `subtask stage ready`
+
+  - name: ready
+    instructions: |
+      Task is ready. Notify human and ask them if they want to merge or create a PR.
diff --git a/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--merged/history.jsonl b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--merged/history.jsonl
new file mode 100644
index 0000000..083bb6b
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--merged/history.jsonl
@@ -0,0 +1,8 @@
+{"ts":"2026-01-20T21:45:09.626343Z","type":"task.opened","data":{"base_branch":"main","base_commit":"3c15fd321445639fc6e6a611a7a58ac49e741c1d","base_ref":"main","branch":"legacy/merged","follow_up":"","model":"","reason":"draft","reasoning":"","title":"Merged","workflow":"default"}}
+{"ts":"2026-01-20T21:45:09.626343Z","type":"stage.changed","data":{"from":"","to":"doing"}}
+{"ts":"2026-01-20T21:45:09.656736Z","type":"message","role":"lead","content":"do"}
+{"ts":"2026-01-20T21:45:09.656736Z","type":"worker.started","data":{"prompt_bytes":2,"run_id":"e3d7ddbb1cf43995"}}
+{"ts":"2026-01-20T21:45:09.696983Z","type":"worker.session","data":{"action":"started","harness":"mock","session_id":"mock-session-1768945509692126000"}}
+{"ts":"2026-01-20T21:45:09.733979Z","type":"message","role":"worker","content":"Mock response for: # Task\nName: legacy/merged\nTitle: Merged\nBranch: l"}
+{"ts":"2026-01-20T21:45:09.733979Z","type":"worker.finished","data":{"duration_ms":42,"outcome":"replied","run_id":"e3d7ddbb1cf43995","tool_calls":2}}
+{"ts":"2026-01-20T21:45:10.103147Z","type":"task.merged","data":{"branch":"legacy/merged","commit":"14e19e497de53c557bfd40197dd50344bbcada9c","into":"main","merge_base":"3c15fd321445639fc6e6a611a7a58ac49e741c1d","trailers":{"Subtask-Task":"legacy/merged"}}}
diff --git a/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--open/TASK.md b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--open/TASK.md
new file mode 100644
index 0000000..556d0b5
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--open/TASK.md
@@ -0,0 +1,7 @@
+---
+title: Open
+base-branch: main
+schema: 1
+---
+
+open
diff --git a/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--open/WORKFLOW.yaml b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--open/WORKFLOW.yaml
new file mode 100644
index 0000000..8dd7c97
--- /dev/null
+++ b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--open/WORKFLOW.yaml
@@ -0,0 +1,32 @@
+name: default
+description: Simple task execution
+
+instructions:
+  worker: |
+    When applicable, create PROGRESS.json to track your steps:
+    ```json
+    [
+      {"step": "First step", "done": false},
+      {"step": "Second step", "done": false}
+    ]
+    ```
+    Update `done: true` as you complete each step, or revise as needed. Commit when done.
+
+stages:
+  - name: doing
+    instructions: |
+      Worker is implementing.
+
+      Check progress: `subtask show `
+      When ready for lead review: `subtask stage review` to get instructions
+
+  - name: review
+    instructions: |
+      Review code with `subtask diff --stat ` and `subtask diff `.
+
+      Request changes: `subtask send "Please..."`
+      When ready for human review: `subtask stage ready`
+
+  - name: ready
+    instructions: |
+      Task is ready. Notify human and ask them if they want to merge or create a PR.
diff --git a/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--open/history.jsonl b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--open/history.jsonl new file mode 100644 index 0000000..3e247b0 --- /dev/null +++ b/pkg/task/migrate/testdata/legacy/multi/tasks/legacy--open/history.jsonl @@ -0,0 +1,7 @@ +{"ts":"2026-01-20T21:45:08.996778Z","type":"task.opened","data":{"base_branch":"main","base_commit":"3c15fd321445639fc6e6a611a7a58ac49e741c1d","base_ref":"main","branch":"legacy/open","follow_up":"","model":"","reason":"draft","reasoning":"","title":"Open","workflow":"default"}} +{"ts":"2026-01-20T21:45:08.996778Z","type":"stage.changed","data":{"from":"","to":"doing"}} +{"ts":"2026-01-20T21:45:09.032008Z","type":"message","role":"lead","content":"do"} +{"ts":"2026-01-20T21:45:09.032008Z","type":"worker.started","data":{"prompt_bytes":2,"run_id":"7414dad635f72e69"}} +{"ts":"2026-01-20T21:45:09.100878Z","type":"worker.session","data":{"action":"started","harness":"mock","session_id":"mock-session-1768945509097128000"}} +{"ts":"2026-01-20T21:45:09.133955Z","type":"message","role":"worker","content":"Mock response for: # Task\nName: legacy/open\nTitle: Open\nBranch: legac"} +{"ts":"2026-01-20T21:45:09.133955Z","type":"worker.finished","data":{"duration_ms":36,"outcome":"replied","run_id":"7414dad635f72e69","tool_calls":2}} diff --git a/pkg/task/migrate/testdata/legacy/no-index/config.json b/pkg/task/migrate/testdata/legacy/no-index/config.json new file mode 100644 index 0000000..03e4a7a --- /dev/null +++ b/pkg/task/migrate/testdata/legacy/no-index/config.json @@ -0,0 +1,7 @@ +{ + "harness": "mock", + "max_workspaces": 3, + "options": { + "tool_calls": 2 + } +} diff --git a/pkg/task/migrate/testdata/legacy/no-index/index.db b/pkg/task/migrate/testdata/legacy/no-index/index.db new file mode 100644 index 0000000..3610313 Binary files /dev/null and b/pkg/task/migrate/testdata/legacy/no-index/index.db differ diff --git 
a/pkg/task/migrate/testdata/legacy/no-index/internal/legacy--noindex/op.lock b/pkg/task/migrate/testdata/legacy/no-index/internal/legacy--noindex/op.lock new file mode 100644 index 0000000..e69de29 diff --git a/pkg/task/migrate/testdata/legacy/no-index/internal/legacy--noindex/progress.json b/pkg/task/migrate/testdata/legacy/no-index/internal/legacy--noindex/progress.json new file mode 100644 index 0000000..7b05ad5 --- /dev/null +++ b/pkg/task/migrate/testdata/legacy/no-index/internal/legacy--noindex/progress.json @@ -0,0 +1,4 @@ +{ + "tool_calls": 2, + "last_activity": "2026-01-20T23:45:10.468723+02:00" +} \ No newline at end of file diff --git a/pkg/task/migrate/testdata/legacy/no-index/internal/legacy--noindex/state.json b/pkg/task/migrate/testdata/legacy/no-index/internal/legacy--noindex/state.json new file mode 100644 index 0000000..fa599f8 --- /dev/null +++ b/pkg/task/migrate/testdata/legacy/no-index/internal/legacy--noindex/state.json @@ -0,0 +1,6 @@ +{ + "workspace": "/tmp/subtask-legacy-fixtures-25896/home/.subtask/workspaces/-private-tmp-subtask-legacy-fixtures-25896-no-index-repo--1", + "session_id": "mock-session-1768945510442825000", + "harness": "mock", + "started_at": "0001-01-01T00:00:00Z" +} \ No newline at end of file diff --git a/pkg/task/migrate/testdata/legacy/no-index/internal/workspace.lock b/pkg/task/migrate/testdata/legacy/no-index/internal/workspace.lock new file mode 100644 index 0000000..e69de29 diff --git a/pkg/task/migrate/testdata/legacy/no-index/tasks/legacy--noindex/TASK.md b/pkg/task/migrate/testdata/legacy/no-index/tasks/legacy--noindex/TASK.md new file mode 100644 index 0000000..02cf0cc --- /dev/null +++ b/pkg/task/migrate/testdata/legacy/no-index/tasks/legacy--noindex/TASK.md @@ -0,0 +1,7 @@ +--- +title: NoIndex +base-branch: main +schema: 1 +--- + +no index diff --git a/pkg/task/migrate/testdata/legacy/no-index/tasks/legacy--noindex/WORKFLOW.yaml b/pkg/task/migrate/testdata/legacy/no-index/tasks/legacy--noindex/WORKFLOW.yaml 
new file mode 100644 index 0000000..8dd7c97 --- /dev/null +++ b/pkg/task/migrate/testdata/legacy/no-index/tasks/legacy--noindex/WORKFLOW.yaml @@ -0,0 +1,32 @@ +name: default +description: Simple task execution + +instructions: + worker: | + When applicable, create PROGRESS.json to track your steps: + ```json + [ + {"step": "First step", "done": false}, + {"step": "Second step", "done": false} + ] + ``` + Update `done: true` as you complete each step, or revise as needed. Commit when done. + +stages: + - name: doing + instructions: | + Worker is implementing. + + Check progress: `subtask show <task>` + When ready for lead review: `subtask stage review` to get instructions + + - name: review + instructions: | + Review code with `subtask diff --stat <task>` and `subtask diff <task>`. + + Request changes: `subtask send "Please..."` + When ready for human review: `subtask stage ready` + + - name: ready + instructions: | + Task is ready. Notify human and ask them if they want to merge or create a PR. diff --git a/pkg/task/migrate/testdata/legacy/no-index/tasks/legacy--noindex/history.jsonl b/pkg/task/migrate/testdata/legacy/no-index/tasks/legacy--noindex/history.jsonl new file mode 100644 index 0000000..0136d00 --- /dev/null +++ b/pkg/task/migrate/testdata/legacy/no-index/tasks/legacy--noindex/history.jsonl @@ -0,0 +1,7 @@ +{"ts":"2026-01-20T21:45:10.335238Z","type":"task.opened","data":{"base_branch":"main","base_commit":"f6ef258404c595f5ddff6acae5d380777d973845","base_ref":"main","branch":"legacy/noindex","follow_up":"","model":"","reason":"draft","reasoning":"","title":"NoIndex","workflow":"default"}} +{"ts":"2026-01-20T21:45:10.335238Z","type":"stage.changed","data":{"from":"","to":"doing"}} +{"ts":"2026-01-20T21:45:10.374236Z","type":"message","role":"lead","content":"do"} +{"ts":"2026-01-20T21:45:10.374236Z","type":"worker.started","data":{"prompt_bytes":2,"run_id":"34e934f1f7389f05"}}
+{"ts":"2026-01-20T21:45:10.447918Z","type":"worker.session","data":{"action":"started","harness":"mock","session_id":"mock-session-1768945510442825000"}} +{"ts":"2026-01-20T21:45:10.486278Z","type":"message","role":"worker","content":"Mock response for: # Task\nName: legacy/noindex\nTitle: NoIndex\nBranch:"} +{"ts":"2026-01-20T21:45:10.486278Z","type":"worker.finished","data":{"duration_ms":43,"outcome":"replied","run_id":"34e934f1f7389f05","tool_calls":2}} diff --git a/pkg/task/ops/close.go b/pkg/task/ops/close.go index 2b84736..96a7317 100644 --- a/pkg/task/ops/close.go +++ b/pkg/task/ops/close.go @@ -27,6 +27,7 @@ func CloseTask(taskName string, abandon bool, logger Logger) (CloseResult, error } t, _ := task.Load(taskName) // best-effort (allows closing synced tasks without full metadata) + repoDir := task.ProjectRoot() state, err := task.LoadState(taskName) if err != nil { @@ -80,12 +81,97 @@ func CloseTask(taskName string, abandon bool, logger Logger) (CloseResult, error deleteEmptyTaskBranchBestEffort(logger, state.Workspace, taskName, t.BaseBranch) } + baseBranch := "" + if t != nil { + baseBranch = strings.TrimSpace(t.BaseBranch) + } + if baseBranch == "" { + baseBranch = strings.TrimSpace(tail.BaseBranch) + } + baseCommit := strings.TrimSpace(tail.BaseCommit) + + branchHead := "" + // Prefer the workspace's HEAD if it exists (covers cases where the branch name ref is missing). + if strings.TrimSpace(state.Workspace) != "" { + if out, err := git.Output(state.Workspace, "rev-parse", "HEAD"); err == nil { + branchHead = strings.TrimSpace(out) + } + } + if branchHead == "" && git.BranchExists(repoDir, taskName) { + if out, err := git.Output(repoDir, "rev-parse", taskName); err == nil { + branchHead = strings.TrimSpace(out) + } + } + statsDir := repoDir + if strings.TrimSpace(state.Workspace) != "" { + statsDir = state.Workspace + } + + // Compute frozen stats relative to a PR-style base for this branch state. 
+ // Use merge-base(base, head) when possible, so rebases don't inflate stats. + if strings.TrimSpace(baseBranch) != "" && strings.TrimSpace(branchHead) != "" && git.CommitExists(statsDir, branchHead) { + if mb, err := git.MergeBase(statsDir, baseBranch, branchHead); err == nil && strings.TrimSpace(mb) != "" { + baseCommit = strings.TrimSpace(mb) + } + } + + // Back-compat: older tasks may not have base_commit; fall back to merge-base when possible. + if baseCommit == "" && baseBranch != "" && (branchHead != "" || git.BranchExists(repoDir, taskName)) { + mbDir := repoDir + mbBranch := taskName + if strings.TrimSpace(state.Workspace) != "" { + mbDir = state.Workspace + mbBranch = "HEAD" + } + if mb, err := git.Output(mbDir, "merge-base", mbBranch, baseBranch); err == nil { + baseCommit = strings.TrimSpace(mb) + } + } + + added := 0 + removed := 0 + commitCount := 0 + frozenErr := "" + if baseCommit == "" || branchHead == "" { + frozenErr = fmt.Sprintf("cannot compute frozen stats (missing base_commit=%t branch_head=%t)", baseCommit == "", branchHead == "") + } else if !git.CommitExists(statsDir, baseCommit) { + frozenErr = fmt.Sprintf("cannot compute frozen stats (missing base_commit %s)", baseCommit) + } else if !git.CommitExists(statsDir, branchHead) { + frozenErr = fmt.Sprintf("cannot compute frozen stats (missing branch_head %s)", branchHead) + } else { + if a, r, err := git.DiffStatRange(statsDir, baseCommit, branchHead); err == nil { + added = a + removed = r + } else { + frozenErr = fmt.Sprintf("cannot compute frozen stats: %v", err) + } + if frozenErr == "" { + if n, err := git.RevListCount(statsDir, baseCommit, branchHead); err == nil { + commitCount = n + } else { + frozenErr = fmt.Sprintf("cannot compute commit_count: %v", err) + } + } + } + // Append history event. 
reason := "close" if abandon { reason = "abandon" } - data, _ := json.Marshal(map[string]any{"reason": reason}) + closedData := map[string]any{ + "reason": reason, + "base_branch": baseBranch, + "base_commit": baseCommit, + "branch_head": branchHead, + "changes_added": added, + "changes_removed": removed, + "commit_count": commitCount, + } + if frozenErr != "" { + closedData["frozen_error"] = frozenErr + } + data, _ := json.Marshal(closedData) _ = history.AppendLocked(taskName, history.Event{Type: "task.closed", Data: data, TS: time.Now().UTC()}) // Clear runtime state. diff --git a/pkg/task/ops/merge.go b/pkg/task/ops/merge.go index 6c2fed3..adb78f2 100644 --- a/pkg/task/ops/merge.go +++ b/pkg/task/ops/merge.go @@ -85,6 +85,116 @@ func mergeTaskUnlocked(taskName, message string, logger Logger) (MergeResult, er return MergeResult{}, fmt.Errorf("failed to find merge base with %s: %w", t.BaseBranch, err) } + branchHead := "" + if out, err := git.Output(ws, "rev-parse", "HEAD"); err == nil { + branchHead = strings.TrimSpace(out) + } + baseHead := "" + if out, err := git.Output(ws, "rev-parse", t.BaseBranch); err == nil { + baseHead = strings.TrimSpace(out) + } + + // Compute frozen stats relative to a PR-style base for this branch state. + // - Default: merge-base(base, head) + // - If the branch tip is already reachable from base (ancestor), merge-base == head; in that case + // try fork-point (uses base reflog) and fall back to the task's stored base_commit. 
+ baseCommit := strings.TrimSpace(mergeBase) + if baseCommit == branchHead && branchHead != "" { + if fp, err := git.MergeBaseForkPoint(ws, t.BaseBranch, branchHead); err == nil && strings.TrimSpace(fp) != "" && git.CommitExists(ws, strings.TrimSpace(fp)) { + baseCommit = strings.TrimSpace(fp) + } else if strings.TrimSpace(tail.BaseCommit) != "" && git.CommitExists(ws, strings.TrimSpace(tail.BaseCommit)) { + baseCommit = strings.TrimSpace(tail.BaseCommit) + } + } + + // If the task branch's content is already in base (e.g. squash merge, cherry-pick, or ancestor), + // treat `subtask merge` as a no-op finalize: record task.merged and free the workspace. + integrated := git.IsIntegrated(ws, "HEAD", t.BaseBranch) + added := 0 + removed := 0 + commitCount := 0 + frozenErr := "" + if baseCommit == "" || branchHead == "" { + frozenErr = fmt.Sprintf("cannot compute frozen stats (missing base_commit=%t branch_head=%t)", baseCommit == "", branchHead == "") + } else if !git.CommitExists(ws, baseCommit) { + frozenErr = fmt.Sprintf("cannot compute frozen stats (missing base_commit %s)", baseCommit) + } else if !git.CommitExists(ws, branchHead) { + frozenErr = fmt.Sprintf("cannot compute frozen stats (missing branch_head %s)", branchHead) + } else { + if a, r, err := git.DiffStatRange(ws, baseCommit, branchHead); err == nil { + added = a + removed = r + } else { + frozenErr = fmt.Sprintf("cannot compute frozen stats: %v", err) + } + if frozenErr == "" { + if n, err := git.RevListCount(ws, baseCommit, branchHead); err == nil { + commitCount = n + } else { + frozenErr = fmt.Sprintf("cannot compute commit_count: %v", err) + } + } + } + + if integrated != "" { + logInfo(logger, fmt.Sprintf("Already in %s (%s). Marking task as merged...", t.BaseBranch, integrated)) + + // Detach HEAD to free the workspace. 
+ taskBranch, _ := git.CurrentBranch(ws) + if err := git.RunSilent(ws, "checkout", "--detach", "HEAD"); err != nil { + logWarning(logger, fmt.Sprintf("failed to detach HEAD: %v", err)) + } + + // Delete task branch (cleanup). + if taskBranch != "" && taskBranch != t.BaseBranch { + if err := git.RunSilent(ws, "branch", "-D", taskBranch); err != nil { + logWarning(logger, fmt.Sprintf("failed to delete branch %s: %v", taskBranch, err)) + } + } + + mergedData := map[string]any{ + // Back-compat + // No-op finalize: we didn't create a merge commit, so avoid pretending we did. + // `subtask diff` can use base_commit..branch_head when the branch is deleted. + "commit": "", + "into": t.BaseBranch, + "branch": taskName, + + // Redesign fields + "via": "subtask", + "method": string(integrated), + "base_branch": t.BaseBranch, + "base_commit": baseCommit, + "branch_head": branchHead, + "base_head": baseHead, + "target_head": baseHead, + "changes_added": added, + "changes_removed": removed, + "commit_count": commitCount, + "detected_at": time.Now().UTC().Unix(), + } + if frozenErr != "" { + mergedData["frozen_error"] = frozenErr + } + data, _ := json.Marshal(mergedData) + _ = history.AppendLocked(taskName, history.Event{Type: "task.merged", Data: data}) + + // Clear runtime state. + state.Workspace = "" + state.SessionID = "" + state.Harness = "" + state.SupervisorPID = 0 + state.SupervisorPGID = 0 + state.StartedAt = time.Time{} + state.LastError = "" + if err := state.Save(taskName); err != nil { + return MergeResult{}, err + } + + logSuccess(logger, fmt.Sprintf("Merged %s into %s. Workspace freed.", taskName, t.BaseBranch)) + return MergeResult{}, nil + } + // Preflight: detect conflicts the same way `git merge ` would fail. // // This avoids rewriting the task branch (squash) only to discover conflicts during integration. @@ -168,15 +278,27 @@ func mergeTaskUnlocked(taskName, message string, logger Logger) (MergeResult, er } // Append history event and clear runtime state. 
- data, _ := json.Marshal(map[string]any{ - "commit": strings.TrimSpace(mergedCommit), - "into": t.BaseBranch, - "branch": taskName, - "merge_base": mergeBase, + mergedData := map[string]any{ + "commit": strings.TrimSpace(mergedCommit), + "into": t.BaseBranch, + "branch": taskName, + "merge_base": mergeBase, + "via": "subtask", + "method": "squash", + "base_branch": t.BaseBranch, + "base_commit": baseCommit, + "branch_head": branchHead, + "changes_added": added, + "changes_removed": removed, + "commit_count": commitCount, "trailers": map[string]string{ "Subtask-Task": taskName, }, - }) + } + if frozenErr != "" { + mergedData["frozen_error"] = frozenErr + } + data, _ := json.Marshal(mergedData) _ = history.AppendLocked(taskName, history.Event{Type: "task.merged", Data: data}) state.Workspace = "" diff --git a/pkg/task/paths.go b/pkg/task/paths.go index a7ed34f..a7dca76 100644 --- a/pkg/task/paths.go +++ b/pkg/task/paths.go @@ -1,65 +1,232 @@ package task import ( + "fmt" "os" "path/filepath" "strings" "sync" "github.com/zippoxer/subtask/internal/homedir" + "github.com/zippoxer/subtask/pkg/git" + "github.com/zippoxer/subtask/pkg/subtaskerr" ) var projectDirCache struct { mu sync.Mutex computed bool cwd string - abs string + rootAbs string ok bool + err error } // GlobalDir returns ~/.subtask. func GlobalDir() string { + if d := strings.TrimSpace(os.Getenv("SUBTASK_DIR")); d != "" { + return filepath.Clean(d) + } home, _ := homedir.Dir() return filepath.Join(home, ".subtask") } -// ProjectDir returns .subtask in current dir. +// ConfigPath returns ~/.subtask/config.json (global defaults). +func ConfigPath() string { + return filepath.Join(GlobalDir(), "config.json") +} + +// ProjectsDir returns ~/.subtask/projects. +func ProjectsDir() string { + return filepath.Join(GlobalDir(), "projects") +} + +// WorkspacesDir returns ~/.subtask/workspaces. 
+func WorkspacesDir() string { + return filepath.Join(GlobalDir(), "workspaces") +} + +// GitRootAbs returns the git project root (worktree-aware). +func GitRootAbs() (string, error) { + cwd, err := os.Getwd() + if err != nil { + return "", err + } + cwd = canonicalPath(cwd) + + // Fast path: cache per-cwd. + projectDirCache.mu.Lock() + if projectDirCache.computed && projectDirCache.cwd == cwd { + root := projectDirCache.rootAbs + ok := projectDirCache.ok + cachedErr := projectDirCache.err + projectDirCache.mu.Unlock() + if !ok || root == "" { + if cachedErr != nil { + return "", cachedErr + } + return "", subtaskerr.ErrNotGitRepo + } + return root, nil + } + projectDirCache.mu.Unlock() + + top, err := git.Output(cwd, "rev-parse", "--show-toplevel") + if err != nil { + projectDirCache.mu.Lock() + projectDirCache.computed = true + projectDirCache.cwd = cwd + projectDirCache.rootAbs = "" + projectDirCache.ok = false + projectDirCache.err = err + projectDirCache.mu.Unlock() + return "", err + } + if abs, err := filepath.Abs(top); err == nil { + top = abs + } + top = canonicalPath(top) + + // If we're inside a Subtask-managed workspace, resolve the anchor worktree. + wsRoot := canonicalPath(WorkspacesDir()) + if isWithinDir(top, wsRoot) { + anchor, err := resolveAnchorFromWorktreeList(top, wsRoot) + if err != nil { + // Cache failure for this cwd, but don't poison other dirs. 
+ projectDirCache.mu.Lock() + projectDirCache.computed = true + projectDirCache.cwd = cwd + projectDirCache.rootAbs = "" + projectDirCache.ok = false + projectDirCache.err = err + projectDirCache.mu.Unlock() + return "", err + } + top = canonicalPath(anchor) + } + + projectDirCache.mu.Lock() + projectDirCache.computed = true + projectDirCache.cwd = cwd + projectDirCache.rootAbs = top + projectDirCache.ok = true + projectDirCache.err = nil + projectDirCache.mu.Unlock() + return top, nil +} + +func resolveAnchorFromWorktreeList(worktreeTop string, workspacesRoot string) (string, error) { + out, err := git.Output(worktreeTop, "worktree", "list", "--porcelain") + if err != nil { + return "", err + } + var candidates []string + for _, line := range strings.Split(out, "\n") { + line = strings.TrimSpace(line) + if line == "" { + continue + } + const prefix = "worktree " + if strings.HasPrefix(line, prefix) { + p := strings.TrimSpace(strings.TrimPrefix(line, prefix)) + if p == "" { + continue + } + if abs, err := filepath.Abs(p); err == nil { + p = abs + } + p = canonicalPath(p) + if isWithinDir(p, workspacesRoot) { + continue + } + candidates = append(candidates, p) + } + } + if len(candidates) == 0 { + return "", fmt.Errorf("%w\n\nDetected workspace root: %s\nTip: cd to your main repo and re-run.", subtaskerr.ErrNoAnchorFromWorkspace, worktreeTop) + } + + // Prefer an anchor that already has subtask data (reduces ambiguity). + for _, c := range candidates { + if dirExists(filepath.Join(c, ".subtask", "tasks")) || fileExists(filepath.Join(c, ".subtask", "config.json")) { + return c, nil + } + } + return candidates[0], nil +} + +func isWithinDir(child, parent string) bool { + child = canonicalPath(child) + parent = canonicalPath(parent) + if child == "" || parent == "" { + return false + } + rel, err := filepath.Rel(parent, child) + if err != nil { + return false + } + rel = filepath.Clean(rel) + if rel == "." { + return true + } + return rel != ".." 
&& !strings.HasPrefix(rel, ".."+string(os.PathSeparator)) +} + +func dirExists(path string) bool { + st, err := os.Stat(path) + return err == nil && st.IsDir() +} + +func fileExists(path string) bool { + st, err := os.Stat(path) + return err == nil && !st.IsDir() +} + +// ProjectDir returns .subtask relative to cwd (anchored at git root). func ProjectDir() string { cwd, err := os.Getwd() if err != nil { return ".subtask" } cwd = canonicalPath(cwd) - if abs, ok := projectDirAbsFrom(cwd); ok { - rel, err := filepath.Rel(cwd, abs) - if err == nil && rel != "" { - return rel - } - // Fall back to absolute path if Rel fails for some reason. - return abs + root, err := GitRootAbs() + if err != nil || root == "" { + return ".subtask" + } + abs := filepath.Join(root, ".subtask") + rel, err := filepath.Rel(cwd, abs) + if err == nil && rel != "" { + return rel } - // No project found; preserve prior behavior (e.g. subtask init creates .subtask in cwd). - return ".subtask" + return abs } // ProjectDirAbs returns the absolute path to the project's .subtask directory. -// If no .subtask directory exists in the cwd or any parent, it returns "<cwd>/.subtask". +// If not in git, it returns "<cwd>/.subtask". func ProjectDirAbs() string { cwd, err := os.Getwd() if err != nil { return ".subtask" } cwd = canonicalPath(cwd) - if abs, ok := projectDirAbsFrom(cwd); ok { - return abs + root, err := GitRootAbs() + if err != nil || root == "" { + return filepath.Join(cwd, ".subtask") } - return filepath.Join(cwd, ".subtask") + return filepath.Join(root, ".subtask") } -// ProjectRoot returns the absolute path to the project root (the parent of .subtask). -// If no .subtask directory exists in the cwd or any parent, it returns the current working directory. +// ProjectRoot returns the absolute path to the git project root. +// If not in git, it returns the current working directory.
func ProjectRoot() string { - return filepath.Dir(ProjectDirAbs()) + cwd, err := os.Getwd() + if err != nil { + return "." + } + root, err := GitRootAbs() + if err != nil || root == "" { + return canonicalPath(cwd) + } + return root } // TasksDir returns .subtask/tasks. @@ -67,14 +234,34 @@ func TasksDir() string { return filepath.Join(ProjectDir(), "tasks") } -// InternalDir returns .subtask/internal. +// ProjectConfigPath returns the optional project override config path (<repo>/.subtask/config.json), +// expressed relative to cwd when possible. +func ProjectConfigPath() string { + return filepath.Join(ProjectDir(), "config.json") +} + +// InternalDir returns the runtime internal directory for this repo: +// ~/.subtask/projects/<repo>/internal +// +// If not in git, falls back to <cwd>/.subtask/internal (legacy behavior). func InternalDir() string { + root, err := GitRootAbs() + if err == nil && root != "" { + return filepath.Join(runtimeProjectDirAbs(root), "internal") + } return filepath.Join(ProjectDir(), "internal") } -// ConfigPath returns .subtask/config.json. -func ConfigPath() string { - return filepath.Join(ProjectDir(), "config.json") +// IndexPath returns the default index db path for this repo: +// ~/.subtask/projects/<repo>/index.db +// +// If not in git, falls back to .subtask/index.db (legacy behavior). +func IndexPath() string { + root, err := GitRootAbs() + if err == nil && root != "" { + return filepath.Join(runtimeProjectDirAbs(root), "index.db") + } + return filepath.Join(ProjectDir(), "index.db") } // EscapeName converts "fix/epoch-boundary" to "fix--epoch-boundary". @@ -107,11 +294,6 @@ func HistoryPath(name string) string { return filepath.Join(Dir(name), "history.jsonl") } -// WorkspacesDir returns ~/.subtask/workspaces. -func WorkspacesDir() string { - return filepath.Join(GlobalDir(), "workspaces") -} - // EscapePath converts a path to a safe directory name. // It resolves symlinks first to ensure consistency across different cwd resolutions.
func EscapePath(p string) string { @@ -134,38 +316,6 @@ func EscapePath(p string) string { return p } -func findProjectDirAbs(startDir string) (string, bool) { - dir := startDir - for { - candidate := filepath.Join(dir, ".subtask") - // Check for config.json to distinguish project .subtask from global ~/.subtask - configPath := filepath.Join(candidate, "config.json") - if _, err := os.Stat(configPath); err == nil { - return candidate, true - } - - parent := filepath.Dir(dir) - if parent == dir { - break - } - dir = parent - } - return "", false -} - -func projectDirAbsFrom(cwd string) (string, bool) { - projectDirCache.mu.Lock() - defer projectDirCache.mu.Unlock() - - if projectDirCache.computed && projectDirCache.cwd == cwd { - return projectDirCache.abs, projectDirCache.ok - } - - abs, ok := findProjectDirAbs(cwd) - abs = canonicalPath(abs) - projectDirCache.computed = true - projectDirCache.cwd = cwd - projectDirCache.abs = abs - projectDirCache.ok = ok - return abs, ok +func runtimeProjectDirAbs(repoRoot string) string { + return filepath.Join(ProjectsDir(), EscapePath(repoRoot)) } diff --git a/pkg/task/paths_subdir_test.go b/pkg/task/paths_subdir_test.go index 1868af9..3736016 100644 --- a/pkg/task/paths_subdir_test.go +++ b/pkg/task/paths_subdir_test.go @@ -2,18 +2,17 @@ package task import ( "os" + "os/exec" "path/filepath" "testing" "github.com/stretchr/testify/require" + "github.com/zippoxer/subtask/pkg/subtaskerr" ) -func TestProjectDir_WalksUpFromSubdir(t *testing.T) { +func TestProjectDir_AnchorsAtGitRoot_FromSubdir(t *testing.T) { root := t.TempDir() - require.NoError(t, os.MkdirAll(filepath.Join(root, ".subtask", "tasks"), 0o755)) - require.NoError(t, os.MkdirAll(filepath.Join(root, ".subtask", "internal"), 0o755)) - // config.json is required to identify a project .subtask directory - require.NoError(t, os.WriteFile(filepath.Join(root, ".subtask", "config.json"), []byte(`{}`), 0o644)) + initGitRepo(t, root) subdir := filepath.Join(root, "src", "pkg") 
require.NoError(t, os.MkdirAll(subdir, 0o755)) @@ -22,8 +21,9 @@ func TestProjectDir_WalksUpFromSubdir(t *testing.T) { require.NoError(t, os.Chdir(subdir)) t.Cleanup(func() { _ = os.Chdir(orig) }) + resetProjectCache() + require.Equal(t, filepath.Join("..", "..", ".subtask"), ProjectDir()) - require.Equal(t, filepath.Join("..", "..", ".subtask", "config.json"), ConfigPath()) expectedRoot, err := filepath.EvalSymlinks(root) require.NoError(t, err) @@ -31,75 +31,37 @@ func TestProjectDir_WalksUpFromSubdir(t *testing.T) { require.Equal(t, filepath.Join(expectedRoot, ".subtask"), ProjectDirAbs()) } -// TestProjectDir_IgnoresGlobalSubtaskDir verifies that a .subtask directory -// without config.json (like the global ~/.subtask for workspaces) is not -// mistaken for a project directory. -func TestProjectDir_IgnoresGlobalSubtaskDir(t *testing.T) { - // Create a fake "home" with .subtask but NO config.json (like global dir) - fakeHome := t.TempDir() - globalSubtask := filepath.Join(fakeHome, ".subtask") - require.NoError(t, os.MkdirAll(filepath.Join(globalSubtask, "workspaces"), 0o755)) - require.NoError(t, os.MkdirAll(filepath.Join(globalSubtask, "logs"), 0o755)) - // Intentionally NO config.json - - // Create a project directory under fake home - projectDir := filepath.Join(fakeHome, "code", "myproject") - require.NoError(t, os.MkdirAll(projectDir, 0o755)) +func TestGitRootAbs_NotGitRepo(t *testing.T) { + dir := t.TempDir() orig, _ := os.Getwd() - require.NoError(t, os.Chdir(projectDir)) + require.NoError(t, os.Chdir(dir)) t.Cleanup(func() { _ = os.Chdir(orig) }) - // Clear the cache since we changed directories - projectDirCache.mu.Lock() - projectDirCache.computed = false - projectDirCache.mu.Unlock() - - // Should NOT find the parent .subtask (no config.json) - // Should return local .subtask as fallback - require.Equal(t, ".subtask", ProjectDir()) - require.Equal(t, filepath.Join(".subtask", "config.json"), ConfigPath()) - - // For ProjectDirAbs, resolve symlinks on 
the projectDir part (macOS /var -> /private/var) - // The .subtask part doesn't exist, so we resolve the parent and append - resolvedProjectDir, err := filepath.EvalSymlinks(projectDir) - require.NoError(t, err) - require.Equal(t, filepath.Join(resolvedProjectDir, ".subtask"), ProjectDirAbs()) + resetProjectCache() + _, err := GitRootAbs() + require.ErrorIs(t, err, subtaskerr.ErrNotGitRepo) } -// TestProjectDir_FindsProjectNotGlobal verifies that when both a project -// .subtask (with config.json) and a global-like .subtask (without config.json) -// exist in the path, only the project one is found. -func TestProjectDir_FindsProjectNotGlobal(t *testing.T) { - // Create hierarchy: /tmp/home/.subtask (no config) > /tmp/home/code/project/.subtask (with config) - fakeHome := t.TempDir() - - // Global-like .subtask at "home" level - no config.json - globalSubtask := filepath.Join(fakeHome, ".subtask") - require.NoError(t, os.MkdirAll(filepath.Join(globalSubtask, "workspaces"), 0o755)) - - // Project .subtask with config.json - projectRoot := filepath.Join(fakeHome, "code", "project") - projectSubtask := filepath.Join(projectRoot, ".subtask") - require.NoError(t, os.MkdirAll(projectSubtask, 0o755)) - require.NoError(t, os.WriteFile(filepath.Join(projectSubtask, "config.json"), []byte(`{}`), 0o644)) - - // Working in a subdir of the project - workDir := filepath.Join(projectRoot, "src", "pkg") - require.NoError(t, os.MkdirAll(workDir, 0o755)) - - orig, _ := os.Getwd() - require.NoError(t, os.Chdir(workDir)) - t.Cleanup(func() { _ = os.Chdir(orig) }) - - // Clear the cache +func resetProjectCache() { projectDirCache.mu.Lock() projectDirCache.computed = false + projectDirCache.cwd = "" + projectDirCache.rootAbs = "" + projectDirCache.ok = false + projectDirCache.err = nil projectDirCache.mu.Unlock() +} - // Should find project .subtask, not the global-like one - expectedProjectRoot, err := filepath.EvalSymlinks(projectRoot) - require.NoError(t, err) - require.Equal(t, 
expectedProjectRoot, ProjectRoot()) - require.Equal(t, filepath.Join(expectedProjectRoot, ".subtask"), ProjectDirAbs()) +func initGitRepo(t *testing.T, dir string) { + t.Helper() + run(t, dir, "git", "init") +} + +func run(t *testing.T, dir string, name string, args ...string) { + t.Helper() + cmd := exec.Command(name, args...) + cmd.Dir = dir + out, err := cmd.CombinedOutput() + require.NoError(t, err, "%s", out) } diff --git a/pkg/task/progress.go b/pkg/task/progress.go index 126b587..299eec7 100644 --- a/pkg/task/progress.go +++ b/pkg/task/progress.go @@ -58,7 +58,8 @@ func (p *Progress) Save(taskName string) error { // LoadProgress reads progress from .subtask/internal//progress.json. func LoadProgress(taskName string) (*Progress, error) { - data, err := os.ReadFile(progressPath(taskName)) + path := progressPath(taskName) + data, err := os.ReadFile(path) if err != nil { if os.IsNotExist(err) { return nil, nil diff --git a/pkg/task/state.go b/pkg/task/state.go index bbb1c10..b72863d 100644 --- a/pkg/task/state.go +++ b/pkg/task/state.go @@ -38,6 +38,18 @@ func (s *State) Save(taskName string) error { return err } + return writeBytesAtomic(path, data) +} + +func writeJSONAtomic(path string, v any) error { + data, err := json.MarshalIndent(v, "", " ") + if err != nil { + return err + } + return writeBytesAtomic(path, data) +} + +func writeBytesAtomic(path string, data []byte) error { // Write to temp file and rename for atomicity tmpPath := path + ".tmp" f, err := os.OpenFile(tmpPath, os.O_WRONLY|os.O_CREATE|os.O_TRUNC, 0644) @@ -47,19 +59,19 @@ func (s *State) Save(taskName string) error { if _, err := f.Write(data); err != nil { f.Close() - os.Remove(tmpPath) + _ = os.Remove(tmpPath) return err } // Sync to disk before releasing any locks if err := f.Sync(); err != nil { f.Close() - os.Remove(tmpPath) + _ = os.Remove(tmpPath) return err } if err := f.Close(); err != nil { - os.Remove(tmpPath) + _ = os.Remove(tmpPath) return err } @@ -74,7 +86,8 @@ func 
LoadState(taskName string) (*State, error) { if debug { start = time.Now() } - data, err := os.ReadFile(StatePath(taskName)) + path := StatePath(taskName) + data, err := os.ReadFile(path) if err != nil { if os.IsNotExist(err) { return nil, nil diff --git a/pkg/task/store/errors.go b/pkg/task/store/errors.go new file mode 100644 index 0000000..ea304a7 --- /dev/null +++ b/pkg/task/store/errors.go @@ -0,0 +1,13 @@ +package store + +import "errors" + +var ( + ErrBranchDeleted = errors.New("branch deleted") + ErrBranchMissing = errors.New("branch missing") + ErrBaseMissing = errors.New("base missing") + ErrCommitMissing = errors.New("commit missing") + ErrMergeBaseMissing = errors.New("merge-base missing") + ErrGitNotRepo = errors.New("not a git repository") +) + diff --git a/pkg/task/store/store.go b/pkg/task/store/store.go new file mode 100644 index 0000000..152e18c --- /dev/null +++ b/pkg/task/store/store.go @@ -0,0 +1,803 @@ +package store + +import ( + "context" + "encoding/json" + "fmt" + "os" + "strings" + "sync" + "time" + + "github.com/zippoxer/subtask/pkg/git" + "github.com/zippoxer/subtask/pkg/task" + "github.com/zippoxer/subtask/pkg/task/history" + "github.com/zippoxer/subtask/pkg/task/index" + "github.com/zippoxer/subtask/pkg/workflow" + "github.com/zippoxer/subtask/pkg/workspace" +) + +type store struct{} + +func New() Store { + return &store{} +} + +const defaultListTargetCount = 10 + +func (s *store) List(ctx context.Context, opts ListOptions) (ListResult, error) { + if ctx == nil { + ctx = context.Background() + } + + workspaces, err := workspace.ListWorkspaces() + if err != nil { + return ListResult{}, err + } + + taskNames, err := task.List() + if err != nil { + return ListResult{}, err + } + if len(taskNames) == 0 { + return ListResult{ + Tasks: nil, + Errors: nil, + Workspaces: workspaces, + AvailableWorkspaces: countAvailableWorkspaces(nil, workspaces), + }, nil + } + + idx, err := index.OpenDefault() + if err != nil { + return ListResult{}, err + 
} + defer idx.Close() + + if err := idx.Refresh(ctx, index.RefreshPolicy{Git: index.GitPolicy{Mode: index.GitNone}}); err != nil { + return ListResult{}, err + } + + targetCount := opts.TargetCount + if targetCount <= 0 { + targetCount = defaultListTargetCount + } + + var items []index.ListItem + if opts.All { + ls, err := idx.ListAll(ctx) + if err != nil { + return ListResult{}, err + } + items = append(items, ls...) + } else { + open, err := idx.ListOpen(ctx) + if err != nil { + return ListResult{}, err + } + closed, err := idx.ListClosed(ctx) + if err != nil { + return ListResult{}, err + } + + items = append(items, open...) + + remaining := targetCount - len(open) + if remaining > 0 { + if remaining > len(closed) { + remaining = len(closed) + } + items = append(items, closed[:remaining]...) + } + } + + available := countAvailableWorkspaces(items, workspaces) + + repoDir := task.ProjectRoot() + refs, err := git.ListRefs(repoDir, "refs/heads") + if err != nil { + return ListResult{}, err + } + + type changeResult struct { + name string + change Changes + merged bool + err error + } + + sem := make(chan struct{}, 8) + var wg sync.WaitGroup + results := make(chan changeResult, len(items)) + + for _, it := range items { + if it.TaskStatus != task.TaskStatusOpen { + continue + } + // Draft tasks may not have a branch yet; allow fast path without errors. + if task.NormalizeWorkerStatus(it.WorkerStatus) == task.WorkerStatusNotStarted { + continue + } + + wg.Add(1) + sem <- struct{}{} + go func(it index.ListItem) { + defer func() { + <-sem + wg.Done() + }() + + ch, computeErr := computeHistoricalChanges(ctx, idx, repoDir, refs, it.Name, it.BaseBranch, it.BaseCommit, it.Workspace) + merged := false + if computeErr == nil && ch.Err == nil && task.NormalizeWorkerStatus(it.WorkerStatus) != task.WorkerStatusRunning { + // Never auto-merge "empty" tasks at creation time. 
+ branchHead := resolveHead(refs, it.Name) + baseHead := resolveHead(refs, it.BaseBranch) + if branchHead != "" && baseHead != "" && strings.TrimSpace(it.BaseCommit) != "" && branchHead != strings.TrimSpace(it.BaseCommit) { + isAncestor, err := git.IsAncestor(repoDir, branchHead, baseHead) + if err != nil { + results <- changeResult{name: it.Name, change: ch, err: err} + return + } + if isAncestor { + commitCount, err := git.RevListCount(repoDir, strings.TrimSpace(it.BaseCommit), branchHead) + if err != nil { + results <- changeResult{name: it.Name, change: ch, err: err} + return + } + appended, err := appendDetectedMergeEvent(ctx, repoDir, it.Name, it.BaseBranch, strings.TrimSpace(it.BaseCommit), branchHead, baseHead, ch, commitCount) + if err != nil { + results <- changeResult{name: it.Name, change: ch, err: err} + return + } + merged = appended + } + } + } + + // Content detection: if merging the task branch into base would add nothing, the work is already in base. + // This is an informational UX signal (does not change durable task status). 
+			if !merged && computeErr == nil && ch.Err == nil && task.NormalizeWorkerStatus(it.WorkerStatus) != task.WorkerStatusRunning {
+				branchHead := resolveHead(refs, it.Name)
+				baseHead := resolveHead(refs, it.BaseBranch)
+				if branchHead != "" && baseHead != "" && git.CommitExists(repoDir, branchHead) && git.CommitExists(repoDir, baseHead) {
+					reason, _ := integrationReason(ctx, idx, repoDir, it.Name, branchHead, baseHead)
+					if reason != "" && reason != git.IntegratedAncestor && reason != git.IntegratedSameCommit {
+						mb, err := git.MergeBase(repoDir, branchHead, baseHead)
+						if err == nil && strings.TrimSpace(mb) != "" {
+							out, err := git.Output(repoDir, "diff", "--name-only", strings.TrimSpace(mb)+".."+branchHead)
+							if err == nil && strings.TrimSpace(out) != "" {
+								ch.Status = ChangesStatusApplied
+							}
+						}
+					}
+				}
+			}
+
+			results <- changeResult{name: it.Name, change: ch, merged: merged, err: computeErr}
+		}(it)
+	}
+
+	wg.Wait()
+	close(results)
+
+	changeByName := make(map[string]Changes, len(items))
+	mergedByName := make(map[string]bool, len(items))
+	var errs []TaskLoadError
+	for r := range results {
+		changeByName[r.name] = r.change
+		if r.merged {
+			mergedByName[r.name] = true
+		}
+		if r.err != nil {
+			errs = append(errs, TaskLoadError{Name: r.name, Err: r.err})
+		}
+	}
+
+	out := ListResult{
+		Tasks:               make([]TaskListItem, 0, len(items)),
+		Errors:              errs,
+		Workspaces:          workspaces,
+		AvailableWorkspaces: available,
+	}
+
+	for _, it := range items {
+		taskItem := TaskListItem{
+			Name:              it.Name,
+			Title:             it.Title,
+			FollowUp:          it.FollowUp,
+			BaseBranch:        it.BaseBranch,
+			BaseCommit:        it.BaseCommit,
+			TaskStatus:        it.TaskStatus,
+			WorkerStatus:      it.WorkerStatus,
+			Stage:             it.Stage,
+			Workspace:         it.Workspace,
+			StartedAt:         it.StartedAt,
+			LastActive:        it.LastActive,
+			ToolCalls:         it.ToolCalls,
+			ProgressDone:      it.ProgressDone,
+			ProgressTotal:     it.ProgressTotal,
+			LastRunDurationMS: it.LastRunDurationMS,
+			LastError:         it.LastError,
+			Changes: Changes{
+				Added: it.LinesAdded,
+				Removed: it.LinesRemoved,
+			},
+		}
+
+		if mergedByName[it.Name] {
+			taskItem.TaskStatus = task.TaskStatusMerged
+		}
+
+		if it.TaskStatus == task.TaskStatusOpen {
+			if task.NormalizeWorkerStatus(it.WorkerStatus) == task.WorkerStatusNotStarted {
+				// Keep draft tasks fast and clean: no branch required, no errors.
+				taskItem.Changes = Changes{}
+			} else if ch, ok := changeByName[it.Name]; ok {
+				taskItem.Changes = ch
+			}
+		}
+
+		out.Tasks = append(out.Tasks, taskItem)
+	}
+
+	return out, nil
+}
+
+func (s *store) Get(ctx context.Context, name string, _ GetOptions) (TaskView, error) {
+	if ctx == nil {
+		ctx = context.Background()
+	}
+
+	idx, err := index.OpenDefault()
+	if err != nil {
+		return TaskView{}, err
+	}
+	defer idx.Close()
+
+	if err := idx.Refresh(ctx, index.RefreshPolicy{Git: index.GitPolicy{Mode: index.GitNone}}); err != nil {
+		return TaskView{}, err
+	}
+
+	rec, ok, err := idx.Get(ctx, name)
+	if err != nil {
+		return TaskView{}, err
+	}
+	if !ok || rec.Task == nil {
+		// Preserve historical errors for missing/invalid tasks.
+		_, err := task.Load(name)
+		return TaskView{}, err
+	}
+
+	t := rec.Task
+	state := rec.State
+	meta := rec.ProgressMeta
+	cfg, _ := workspace.LoadConfig() // best-effort (allows working in partial setups)
+
+	view := TaskView{
+		Task:          t,
+		BaseCommit:    rec.BaseCommit,
+		State:         state,
+		ProgressMeta:  meta,
+		ProgressSteps: task.LoadProgressSteps(name),
+		Model:         workspace.ResolveModel(cfg, t, ""),
+		TaskStatus:    rec.TaskStatus,
+		WorkerStatus:  rec.WorkerStatus,
+		Stage:         rec.Stage,
+		LastHistoryNS: rec.LastHistory.UnixNano(),
+		LastRunMS:     rec.LastRunDurationMS,
+	}
+	if cfg != nil && cfg.Harness == "codex" {
+		view.Reasoning = workspace.ResolveReasoning(cfg, t, "")
+	}
+
+	// Workflow for this task, if any.
+	if wf, err := workflow.LoadFromTask(name); err == nil {
+		view.Workflow = wf
+	}
+
+	// Task folder files.
+	taskDir := task.Dir(name)
+	entries, err := os.ReadDir(taskDir)
+	if err == nil {
+		for _, e := range entries {
+			if !e.IsDir() {
+				view.TaskFiles = append(view.TaskFiles, e.Name())
+			}
+		}
+	}
+
+	repoDir := task.ProjectRoot()
+	refs, err := git.ListRefs(repoDir, "refs/heads")
+	if err != nil {
+		return TaskView{}, err
+	}
+
+	// Historical changes + commit count (detail-only) for open tasks.
+	if rec.TaskStatus == task.TaskStatusOpen {
+		ws := task.NormalizeWorkerStatus(rec.WorkerStatus)
+		if ws == task.WorkerStatusNotStarted {
+			view.Changes = Changes{}
+			view.Commits = Commits{}
+		} else {
+			workspacePath := ""
+			if state != nil {
+				workspacePath = state.Workspace
+			}
+			view.Changes, _ = computeHistoricalChanges(ctx, idx, repoDir, refs, name, t.BaseBranch, rec.BaseCommit, workspacePath)
+			view.Commits, _ = computeCommitCount(ctx, idx, repoDir, refs, name, t.BaseBranch, rec.BaseCommit, workspacePath)
+
+			// External merge detection (ancestor-only): if branch tip is in base, record a durable merge event.
+			branchHead := resolveHead(refs, name)
+			baseHead := resolveHead(refs, t.BaseBranch)
+			if ws != task.WorkerStatusRunning && branchHead != "" && baseHead != "" && strings.TrimSpace(rec.BaseCommit) != "" && branchHead != strings.TrimSpace(rec.BaseCommit) {
+				isAncestor, err := git.IsAncestor(repoDir, branchHead, baseHead)
+				if err == nil && isAncestor && view.Changes.Err == nil && view.Commits.Err == nil {
+					appended, err := appendDetectedMergeEvent(ctx, repoDir, name, t.BaseBranch, strings.TrimSpace(rec.BaseCommit), branchHead, baseHead, view.Changes, view.Commits.Count)
+					if err == nil && appended {
+						view.TaskStatus = task.TaskStatusMerged
+					}
+				}
+			}
+
+			// Content detection: show "applied" when branch content is already in base but history is not merged.
+			if view.TaskStatus == task.TaskStatusOpen && ws != task.WorkerStatusRunning && branchHead != "" && baseHead != "" && git.CommitExists(repoDir, branchHead) && git.CommitExists(repoDir, baseHead) {
+				reason := git.IntegrationReason("")
+				if strings.TrimSpace(rec.IntegratedBranchHead) == branchHead && strings.TrimSpace(rec.IntegratedTargetHead) == baseHead {
+					reason = git.IntegrationReason(strings.TrimSpace(rec.IntegratedReason))
+				} else {
+					reason = git.IsIntegrated(repoDir, branchHead, baseHead)
+					_ = idx.UpdateIntegrationCache(ctx, name, branchHead, baseHead, string(reason))
+				}
+
+				if reason != "" && reason != git.IntegratedAncestor && reason != git.IntegratedSameCommit {
+					mb, err := git.MergeBase(repoDir, branchHead, baseHead)
+					if err == nil && strings.TrimSpace(mb) != "" {
+						out, err := git.Output(repoDir, "diff", "--name-only", strings.TrimSpace(mb)+".."+branchHead)
+						if err == nil && strings.TrimSpace(out) != "" {
+							view.Changes.Status = ChangesStatusApplied
+						}
+					}
+				}
+			}
+		}
+	} else {
+		// Back-compat: keep the existing cached counts until frozen stats land.
+		view.Changes = Changes{Added: rec.LinesAdded, Removed: rec.LinesRemoved}
+	}
+
+	// Conflicts: best-effort, computed on demand.
+	if rec.TaskStatus == task.TaskStatusOpen && state != nil && state.Workspace != "" && strings.TrimSpace(t.BaseBranch) != "" {
+		conflicts, err := git.MergeConflictFiles(state.Workspace, t.BaseBranch, "HEAD")
+		if err == nil && len(conflicts) > 0 {
+			view.ConflictFiles = conflicts
+		}
+	} else if rec.ConflictFilesJSON != "" {
+		// Back-compat for any cached conflict list.
+		var conflicts []string
+		if err := json.Unmarshal([]byte(rec.ConflictFilesJSON), &conflicts); err == nil && len(conflicts) > 0 {
+			view.ConflictFiles = conflicts
+		}
+	}
+
+	return view, nil
+}
+
+func appendDetectedMergeEvent(ctx context.Context, repoDir string, taskName string, baseBranch string, baseCommit string, branchHead string, baseHead string, ch Changes, commitCount int) (bool, error) {
+	if ctx == nil {
+		ctx = context.Background()
+	}
+	taskName = strings.TrimSpace(taskName)
+	baseBranch = strings.TrimSpace(baseBranch)
+	baseCommit = strings.TrimSpace(baseCommit)
+	branchHead = strings.TrimSpace(branchHead)
+	baseHead = strings.TrimSpace(baseHead)
+
+	if taskName == "" || baseBranch == "" || baseCommit == "" || branchHead == "" || baseHead == "" {
+		return false, nil
+	}
+
+	// Safety rail: never auto-merge empty branches at creation time.
+	if branchHead == baseCommit {
+		return false, nil
+	}
+
+	appended := false
+	err := task.WithLock(taskName, func() error {
+		tail, err := history.Tail(taskName)
+		if err != nil {
+			return err
+		}
+		if tail.TaskStatus != task.TaskStatusOpen {
+			return nil
+		}
+
+		// Re-check heads while locked to avoid racing concurrent writes.
+		currentBranchHead, err := git.Output(repoDir, "rev-parse", taskName)
+		if err != nil {
+			return err
+		}
+		currentBranchHead = strings.TrimSpace(currentBranchHead)
+		if currentBranchHead == "" || currentBranchHead != branchHead {
+			return nil
+		}
+
+		currentBaseHead, err := git.Output(repoDir, "rev-parse", baseBranch)
+		if err != nil {
+			return err
+		}
+		currentBaseHead = strings.TrimSpace(currentBaseHead)
+		if currentBaseHead == "" {
+			return nil
+		}
+
+		isAncestor, err := git.IsAncestor(repoDir, currentBranchHead, currentBaseHead)
+		if err != nil {
+			return err
+		}
+		if !isAncestor {
+			return nil
+		}
+
+		data, err := json.Marshal(map[string]any{
+			// Back-compat
+			"commit": strings.TrimSpace(currentBranchHead),
+			"into":   baseBranch,
+			"branch": taskName,
+
+			// Redesign fields
+			"via":             "detected",
+			"method":          "ancestor",
+			"base_branch":     baseBranch,
+			"base_commit":     baseCommit,
+			"branch_head":     strings.TrimSpace(currentBranchHead),
+			"base_head":       strings.TrimSpace(currentBaseHead),
+			"target_head":     strings.TrimSpace(currentBaseHead),
+			"changes_added":   ch.Added,
+			"changes_removed": ch.Removed,
+			"commit_count":    commitCount,
+			"detected_at":     time.Now().UTC().Unix(),
+		})
+		if err != nil {
+			return err
+		}
+
+		if err := history.AppendLocked(taskName, history.Event{Type: "task.merged", Data: data}); err != nil {
+			return err
+		}
+		appended = true
+		return nil
+	})
+	return appended, err
+}
+
+func appendCommitEvents(ctx context.Context, idx *index.Index, repoDir string, taskName string, baseCommit string, branchHead string, lastLoggedHead string) error {
+	if ctx == nil {
+		ctx = context.Background()
+	}
+	taskName = strings.TrimSpace(taskName)
+	baseCommit = strings.TrimSpace(baseCommit)
+	branchHead = strings.TrimSpace(branchHead)
+	lastLoggedHead = strings.TrimSpace(lastLoggedHead)
+
+	if taskName == "" || baseCommit == "" || branchHead == "" {
+		return nil
+	}
+	if lastLoggedHead == branchHead {
+		return nil
+	}
+
+	// For empty branches (no commits since base_commit), we still want to mark the head as scanned
+	// so we don't re-run commit logging on every list/show call.
+	var commits []git.CommitMeta
+	if branchHead != baseCommit {
+		from := baseCommit
+		if lastLoggedHead != "" {
+			if isAncestor, err := git.IsAncestor(repoDir, lastLoggedHead, branchHead); err == nil && isAncestor {
+				from = lastLoggedHead
+			}
+		}
+		var err error
+		commits, err = git.ListCommitsRange(repoDir, from, branchHead)
+		if err != nil {
+			return err
+		}
+	}
+
+	seenAt := time.Now().UTC().Unix()
+	return task.WithLock(taskName, func() error {
+		tail, err := history.Tail(taskName)
+		if err != nil {
+			return err
+		}
+		if tail.TaskStatus != task.TaskStatusOpen {
+			return nil
+		}
+
+		// Ensure the branch head is still the same before we write history.
+		currentBranchHead, err := git.Output(repoDir, "rev-parse", taskName)
+		if err != nil {
+			return err
+		}
+		currentBranchHead = strings.TrimSpace(currentBranchHead)
+		if currentBranchHead == "" || currentBranchHead != branchHead {
+			return nil
+		}
+
+		// Dedupe by SHA from existing history.
+		evs, err := history.Read(taskName, history.ReadOptions{EventsOnly: true})
+		if err != nil {
+			return err
+		}
+		seen := make(map[string]struct{}, len(evs))
+		for _, ev := range evs {
+			if ev.Type != "task.commit" {
+				continue
+			}
+			var d struct {
+				SHA string `json:"sha"`
+			}
+			if err := json.Unmarshal(ev.Data, &d); err != nil {
+				continue
+			}
+			sha := strings.TrimSpace(d.SHA)
+			if sha != "" {
+				seen[sha] = struct{}{}
+			}
+		}
+
+		for _, c := range commits {
+			sha := strings.TrimSpace(c.SHA)
+			if sha == "" {
+				continue
+			}
+			if _, ok := seen[sha]; ok {
+				continue
+			}
+
+			data, err := json.Marshal(map[string]any{
+				"sha":          sha,
+				"subject":      c.Subject,
+				"author_name":  c.AuthorName,
+				"author_email": c.AuthorEmail,
+				"authored_at":  c.AuthoredAt,
+				"seen_at":      seenAt,
+			})
+			if err != nil {
+				return err
+			}
+			if err := history.AppendLocked(taskName, history.Event{Type: "task.commit", Data: data}); err != nil {
+				return err
+			}
+			seen[sha] = struct{}{}
+		}
+
+		return idx.UpdateCommitLogLastHead(ctx, taskName, branchHead)
+	})
+}
+
+func countAvailableWorkspaces(items []index.ListItem, workspaces []workspace.Entry) int {
+	used := make(map[string]bool, len(items))
+	for _, it := range items {
+		if it.Workspace != "" {
+			used[it.Workspace] = true
+		}
+	}
+	available := 0
+	for _, ws := range workspaces {
+		if !used[ws.Path] {
+			available++
+		}
+	}
+	return available
+}
+
+func resolveHead(refs map[string]string, name string) string {
+	name = strings.TrimSpace(name)
+	if name == "" {
+		return ""
+	}
+	return strings.TrimSpace(refs["refs/heads/"+name])
+}
+
+func resolveWorkspaceHead(workspacePath string) string {
+	workspacePath = strings.TrimSpace(workspacePath)
+	if workspacePath == "" {
+		return ""
+	}
+	if st, err := os.Stat(workspacePath); err != nil || !st.IsDir() {
+		return ""
+	}
+	head, err := git.Output(workspacePath, "rev-parse", "HEAD")
+	if err != nil {
+		return ""
+	}
+	return strings.TrimSpace(head)
+}
+
+func integrationReason(ctx context.Context, idx *index.Index, repoDir string, taskName string, branchHead string, baseHead string) (git.IntegrationReason, error) {
+	if ctx == nil {
+		ctx = context.Background()
+	}
+	taskName = strings.TrimSpace(taskName)
+	branchHead = strings.TrimSpace(branchHead)
+	baseHead = strings.TrimSpace(baseHead)
+	if taskName == "" || branchHead == "" || baseHead == "" {
+		return "", nil
+	}
+
+	if idx != nil {
+		rec, ok, err := idx.Get(ctx, taskName)
+		if err == nil && ok {
+			if strings.TrimSpace(rec.IntegratedBranchHead) == branchHead && strings.TrimSpace(rec.IntegratedTargetHead) == baseHead {
+				return git.IntegrationReason(strings.TrimSpace(rec.IntegratedReason)), nil
+			}
+		}
+	}
+
+	reason := git.IsIntegrated(repoDir, branchHead, baseHead)
+	if idx != nil {
+		_ = idx.UpdateIntegrationCache(ctx, taskName, branchHead, baseHead, string(reason))
+	}
+	return reason, nil
+}
+
+func computeHistoricalChanges(ctx context.Context, idx *index.Index, repoDir string, refs map[string]string, taskName string, baseBranch string, baseCommit string, workspacePath string) (Changes, error) {
+	if ctx == nil {
+		ctx = context.Background()
+	}
+	taskName = strings.TrimSpace(taskName)
+	baseBranch = strings.TrimSpace(baseBranch)
+	taskBaseCommit := strings.TrimSpace(baseCommit)
+
+	branchRefHead := resolveHead(refs, taskName)
+	branchHead := branchRefHead
+	if branchHead == "" {
+		branchHead = resolveWorkspaceHead(workspacePath)
+	}
+	baseHead := resolveHead(refs, baseBranch)
+
+	rec, ok, err := idx.Get(ctx, taskName)
+	if err != nil {
+		return Changes{Err: err}, err
+	}
+	if ok {
+		// Best-effort: keep these around for debugging.
+		if strings.TrimSpace(rec.BranchHead) != branchHead || strings.TrimSpace(rec.BaseHead) != baseHead {
+			_ = idx.UpdateRefHeads(ctx, taskName, branchHead, baseHead)
+		}
+	}
+
+	if branchHead == "" {
+		miss := fmt.Errorf("%w: %s", ErrBranchMissing, taskName)
+		if ok && strings.TrimSpace(rec.BranchHead) != "" {
+			miss = fmt.Errorf("%w: %s", ErrBranchDeleted, taskName)
+		}
+		return Changes{Status: ChangesStatusMissing, Err: miss}, nil
+	}
+	if !git.CommitExists(repoDir, branchHead) {
+		miss := fmt.Errorf("%w: branch_head %s", ErrCommitMissing, branchHead)
+		return Changes{Status: ChangesStatusMissing, Err: miss}, nil
+	}
+
+	mergeBase := ""
+	if strings.TrimSpace(baseBranch) != "" {
+		if mb, err := git.MergeBase(repoDir, branchHead, baseBranch); err == nil {
+			mergeBase = strings.TrimSpace(mb)
+		}
+	}
+	if mergeBase == "" {
+		err := fmt.Errorf("cannot compute merge-base for %s (base_branch=%q)", taskName, baseBranch)
+		return Changes{Err: err}, err
+	}
+	if !git.CommitExists(repoDir, mergeBase) {
+		miss := fmt.Errorf("%w: merge_base %s", ErrCommitMissing, mergeBase)
+		return Changes{Status: ChangesStatusMissing, Err: miss}, nil
+	}
+
+	// Diff base: use merge-base for open tasks (GitHub PR semantics), but if the branch tip is already
+	// reachable from base (fast-forward merged), merge-base collapses to the branch tip and stats become 0/0.
+	// In that case, try fork-point (uses base reflog) and fall back to the task's stored base_commit.
+	diffBase := mergeBase
+	if strings.TrimSpace(diffBase) != "" && strings.TrimSpace(branchHead) != "" && strings.TrimSpace(diffBase) == strings.TrimSpace(branchHead) {
+		if fp, err := git.MergeBaseForkPoint(repoDir, baseBranch, branchHead); err == nil {
+			fp = strings.TrimSpace(fp)
+			if fp != "" && fp != branchHead && git.CommitExists(repoDir, fp) {
+				diffBase = fp
+			}
+		}
+		if strings.TrimSpace(diffBase) == strings.TrimSpace(branchHead) && taskBaseCommit != "" && taskBaseCommit != branchHead && git.CommitExists(repoDir, taskBaseCommit) {
+			diffBase = taskBaseCommit
+		}
+	}
+
+	// Best-effort: log commit events when the branch head advances.
+	// This is a side-effect, but computeHistoricalChanges is called on every store read for open tasks.
+	if ok && branchRefHead != "" {
+		_ = appendCommitEvents(ctx, idx, repoDir, taskName, diffBase, branchRefHead, rec.CommitLogLastHead)
+	}
+
+	// Cache hit.
+	if ok && rec.ChangesBaseCommit == diffBase && rec.ChangesBranchHead == branchHead {
+		return Changes{Added: rec.ChangesAdded, Removed: rec.ChangesRemoved}, nil
+	}
+
+	added, removed, err := git.DiffStatRange(repoDir, diffBase, branchHead)
+	if err != nil {
+		return Changes{Err: err}, err
+	}
+	_ = idx.UpdateChangesCache(ctx, taskName, diffBase, branchHead, added, removed)
+	return Changes{Added: added, Removed: removed}, nil
+}
+
+func computeCommitCount(ctx context.Context, idx *index.Index, repoDir string, refs map[string]string, taskName string, baseBranch string, baseCommit string, workspacePath string) (Commits, error) {
+	if ctx == nil {
+		ctx = context.Background()
+	}
+	taskName = strings.TrimSpace(taskName)
+	baseBranch = strings.TrimSpace(baseBranch)
+	taskBaseCommit := strings.TrimSpace(baseCommit)
+
+	branchHead := resolveHead(refs, taskName)
+	if branchHead == "" {
+		branchHead = resolveWorkspaceHead(workspacePath)
+	}
+	baseHead := resolveHead(refs, baseBranch)
+
+	rec, ok, err := idx.Get(ctx, taskName)
+	if err != nil {
+		return Commits{Err: err}, err
+	}
+	if ok {
+		if strings.TrimSpace(rec.BranchHead) != branchHead || strings.TrimSpace(rec.BaseHead) != baseHead {
+			_ = idx.UpdateRefHeads(ctx, taskName, branchHead, baseHead)
+		}
+	}
+
+	if branchHead == "" {
+		miss := fmt.Errorf("%w: %s", ErrBranchMissing, taskName)
+		if ok && strings.TrimSpace(rec.BranchHead) != "" {
+			miss = fmt.Errorf("%w: %s", ErrBranchDeleted, taskName)
+		}
+		return Commits{Err: miss}, miss
+	}
+	if !git.CommitExists(repoDir, branchHead) {
+		err := fmt.Errorf("%w: branch_head %s", ErrCommitMissing, branchHead)
+		return Commits{Err: err}, err
+	}
+
+	mergeBase := ""
+	if strings.TrimSpace(baseBranch) != "" {
+		if mb, err := git.MergeBase(repoDir, branchHead, baseBranch); err == nil {
+			mergeBase = strings.TrimSpace(mb)
+		}
+	}
+	if mergeBase == "" {
+		err := fmt.Errorf("cannot compute merge-base for %s (base_branch=%q)", taskName, baseBranch)
+		return Commits{Err: err}, err
+	}
+	if !git.CommitExists(repoDir, mergeBase) {
+		err := fmt.Errorf("%w: merge_base %s", ErrCommitMissing, mergeBase)
+		return Commits{Err: err}, err
+	}
+
+	diffBase := mergeBase
+	if strings.TrimSpace(diffBase) != "" && strings.TrimSpace(branchHead) != "" && strings.TrimSpace(diffBase) == strings.TrimSpace(branchHead) {
+		if fp, err := git.MergeBaseForkPoint(repoDir, baseBranch, branchHead); err == nil {
+			fp = strings.TrimSpace(fp)
+			if fp != "" && fp != branchHead && git.CommitExists(repoDir, fp) {
+				diffBase = fp
+			}
+		}
+		if strings.TrimSpace(diffBase) == strings.TrimSpace(branchHead) && taskBaseCommit != "" && taskBaseCommit != branchHead && git.CommitExists(repoDir, taskBaseCommit) {
+			diffBase = taskBaseCommit
+		}
+	}
+
+	if ok && rec.CommitCountBaseCommit == diffBase && rec.CommitCountBranchHead == branchHead {
+		return Commits{Count: rec.CommitCount}, nil
+	}
+
+	n, err := git.RevListCount(repoDir, diffBase, branchHead)
+	if err != nil {
+		return Commits{Err: err}, err
+	}
+	_ = idx.UpdateCommitCountCache(ctx, taskName, diffBase, branchHead, n)
+	return Commits{Count: n}, nil
+}
diff --git a/pkg/task/store/store_test.go b/pkg/task/store/store_test.go
new file mode 100644
index 0000000..556189f
--- /dev/null
+++ b/pkg/task/store/store_test.go
@@ -0,0 +1,159 @@
+package store_test
+
+import (
+	"context"
+	"encoding/json"
+	"os"
+	"os/exec"
+	"path/filepath"
+	"strings"
+	"testing"
+	"time"
+
+	"github.com/stretchr/testify/require"
+
+	"github.com/zippoxer/subtask/pkg/task/history"
+	"github.com/zippoxer/subtask/pkg/task/store"
+	"github.com/zippoxer/subtask/pkg/testutil"
+)
+
+func TestStoreGet_OpenTask_PrStyleChangesAndCommitCount(t *testing.T) {
+	env := testutil.NewTestEnv(t, 0)
+	repoDir := env.RootDir
+
+	taskName := "fix/prstyle"
+	baseCommit := gitCmd(t, repoDir, "rev-parse", "HEAD")
+
+	// Task branch commit.
+	gitCmd(t, repoDir, "checkout", "-b", taskName, "main")
+	require.NoError(t, os.WriteFile(filepath.Join(repoDir, "task.txt"), []byte("task\n"), 0o644))
+	gitCmd(t, repoDir, "add", "task.txt")
+	gitCmd(t, repoDir, "commit", "-m", "task commit")
+
+	// Base branch advances independently.
+	gitCmd(t, repoDir, "checkout", "main")
+	require.NoError(t, os.WriteFile(filepath.Join(repoDir, "README.md"), []byte("# Test Repo\nbase\n"), 0o644))
+	gitCmd(t, repoDir, "add", "README.md")
+	gitCmd(t, repoDir, "commit", "-m", "base commit")
+
+	env.CreateTask(taskName, "PR-style", "main", "desc")
+	env.CreateTaskHistory(taskName, repliedHistory("main", baseCommit))
+
+	s := store.New()
+	view, err := s.Get(context.Background(), taskName, store.GetOptions{})
+	require.NoError(t, err)
+	require.Equal(t, 1, view.Commits.Count)
+	require.GreaterOrEqual(t, view.Changes.Added, 1)
+	require.Equal(t, 0, view.Changes.Removed)
+	require.Empty(t, view.Changes.Status)
+}
+
+func TestStoreList_OpenTask_AppliedWhenContentInBase(t *testing.T) {
+	env := testutil.NewTestEnv(t, 0)
+	repoDir := env.RootDir
+
+	taskName := "fix/applied"
+	baseCommit := gitCmd(t, repoDir, "rev-parse", "HEAD")
+
+	// Commit on task branch.
+	gitCmd(t, repoDir, "checkout", "-b", taskName, "main")
+	require.NoError(t, os.WriteFile(filepath.Join(repoDir, "task.txt"), []byte("task\n"), 0o644))
+	gitCmd(t, repoDir, "add", "task.txt")
+	gitCmd(t, repoDir, "commit", "-m", "task commit")
+
+	// Apply the same change to main via a different commit (squash-like).
+	gitCmd(t, repoDir, "checkout", "main")
+	require.NoError(t, os.WriteFile(filepath.Join(repoDir, "task.txt"), []byte("task\n"), 0o644))
+	gitCmd(t, repoDir, "add", "task.txt")
+	gitCmd(t, repoDir, "commit", "-m", "apply task")
+
+	env.CreateTask(taskName, "Applied", "main", "desc")
+	env.CreateTaskHistory(taskName, repliedHistory("main", baseCommit))
+
+	s := store.New()
+	res, err := s.List(context.Background(), store.ListOptions{All: true})
+	require.NoError(t, err)
+
+	var got *store.TaskListItem
+	for i := range res.Tasks {
+		if res.Tasks[i].Name == taskName {
+			got = &res.Tasks[i]
+			break
+		}
+	}
+	require.NotNil(t, got)
+	require.Equal(t, store.ChangesStatusApplied, got.Changes.Status)
+	require.Equal(t, "open", string(got.TaskStatus))
+}
+
+func TestStoreList_OpenTask_MissingBranchMarkedMissing(t *testing.T) {
+	env := testutil.NewTestEnv(t, 0)
+	repoDir := env.RootDir
+
+	taskName := "fix/missing"
+	baseCommit := gitCmd(t, repoDir, "rev-parse", "HEAD")
+
+	// No branch exists for this task, but history indicates it previously ran.
+	env.CreateTask(taskName, "Missing", "main", "desc")
+	env.CreateTaskHistory(taskName, repliedHistory("main", baseCommit))
+
+	s := store.New()
+	res, err := s.List(context.Background(), store.ListOptions{All: true})
+	require.NoError(t, err)
+
+	var got *store.TaskListItem
+	for i := range res.Tasks {
+		if res.Tasks[i].Name == taskName {
+			got = &res.Tasks[i]
+			break
+		}
+	}
+	require.NotNil(t, got)
+	require.Equal(t, store.ChangesStatusMissing, got.Changes.Status)
+	require.Error(t, got.Changes.Err)
+}
+
+func TestStoreGet_MergedTask_ShowsFrozenStats(t *testing.T) {
+	env := testutil.NewTestEnv(t, 0)
+	repoDir := env.RootDir
+
+	taskName := "fix/merged"
+	baseCommit := gitCmd(t, repoDir, "rev-parse", "HEAD")
+
+	env.CreateTask(taskName, "Merged", "main", "desc")
+	env.CreateTaskHistory(taskName, []history.Event{
+		{TS: time.Now().UTC(), Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": "main", "base_ref": "main", "base_commit": baseCommit})},
+		{TS: time.Now().UTC(), Type: "task.merged", Data: mustJSON(map[string]any{"via": "subtask", "method": "squash", "into": "main", "branch": taskName, "commit": "deadbeef", "changes_added": 10, "changes_removed": 5})},
+	})
+
+	s := store.New()
+	view, err := s.Get(context.Background(), taskName, store.GetOptions{})
+	require.NoError(t, err)
+	require.Equal(t, "merged", string(view.TaskStatus))
+	require.Equal(t, 10, view.Changes.Added)
+	require.Equal(t, 5, view.Changes.Removed)
+}
+
+func repliedHistory(baseBranch, baseCommit string) []history.Event {
+	return []history.Event{
+		{TS: time.Now().UTC(), Type: "task.opened", Data: mustJSON(map[string]any{"reason": "draft", "base_branch": baseBranch, "base_ref": baseBranch, "base_commit": baseCommit})},
+		{TS: time.Now().UTC(), Type: "stage.changed", Data: mustJSON(map[string]any{"from": "", "to": "implement"})},
+		{TS: time.Now().UTC(), Type: "worker.finished", Data: mustJSON(map[string]any{"run_id": "r1", "duration_ms": 0, "tool_calls": 0, "outcome": "replied"})},
+	}
+}
+
+func mustJSON(v any) json.RawMessage {
+	b, _ := json.Marshal(v)
+	return b
+}
+
+func gitCmd(t *testing.T, dir string, args ...string) string {
+	t.Helper()
+	cmd := exec.Command("git", args...)
+	cmd.Dir = dir
+	out, err := cmd.CombinedOutput()
+	if err != nil {
+		t.Fatalf("git %s failed: %v\n%s", strings.Join(args, " "), err, string(out))
+	}
+	return strings.TrimSpace(string(out))
+}
diff --git a/pkg/task/store/types.go b/pkg/task/store/types.go
new file mode 100644
index 0000000..f596248
--- /dev/null
+++ b/pkg/task/store/types.go
@@ -0,0 +1,104 @@
+package store
+
+import (
+	"context"
+	"time"
+
+	"github.com/zippoxer/subtask/pkg/task"
+	"github.com/zippoxer/subtask/pkg/workflow"
+	"github.com/zippoxer/subtask/pkg/workspace"
+)
+
+type Store interface {
+	List(ctx context.Context, opts ListOptions) (ListResult, error)
+	Get(ctx context.Context, name string, opts GetOptions) (TaskView, error)
+}
+
+type ListOptions struct {
+	All bool
+	// TargetCount only applies when All is false. If zero, the store uses a default.
+	TargetCount int
+}
+
+type GetOptions struct{}
+
+type ListResult struct {
+	Tasks               []TaskListItem
+	Errors              []TaskLoadError
+	Workspaces          []workspace.Entry
+	AvailableWorkspaces int
+}
+
+type TaskLoadError struct {
+	Name string
+	Err  error
+}
+
+type ChangesStatus string
+
+const (
+	ChangesStatusApplied ChangesStatus = "applied"
+	ChangesStatusMissing ChangesStatus = "missing"
+)
+
+type Changes struct {
+	Added   int
+	Removed int
+	Status  ChangesStatus
+	Err     error
+}
+
+type Commits struct {
+	Count int
+	Err   error
+}
+
+type TaskListItem struct {
+	Name         string
+	Title        string
+	FollowUp     string
+	BaseBranch   string
+	BaseCommit   string
+	TaskStatus   task.TaskStatus
+	WorkerStatus task.WorkerStatus
+	Stage        string
+
+	Workspace  string
+	StartedAt  time.Time
+	LastActive time.Time
+	ToolCalls  int
+
+	ProgressDone  int
+	ProgressTotal int
+
+	LastRunDurationMS int
+	LastError         string
+
+	Changes Changes
+}
+
+type TaskView struct {
+	Task         *task.Task
+	BaseCommit   string
+	State        *task.State
+	ProgressMeta *task.Progress
+	Workflow     *workflow.Workflow
+
+	TaskStatus   task.TaskStatus
+	WorkerStatus task.WorkerStatus
+	Stage        string
+
+	LastHistoryNS int64
+	LastRunMS     int
+
+	Model     string
+	Reasoning string
+
+	ProgressSteps []task.ProgressStep
+	TaskFiles     []string
+
+	Changes Changes
+	Commits Commits
+
+	ConflictFiles []string
+}
diff --git a/pkg/testutil/testutil.go b/pkg/testutil/testutil.go
index 61a7616..01217d1 100644
--- a/pkg/testutil/testutil.go
+++ b/pkg/testutil/testutil.go
@@ -12,6 +12,7 @@ import (
 	"github.com/zippoxer/subtask/pkg/task"
 	"github.com/zippoxer/subtask/pkg/task/history"
+	"github.com/zippoxer/subtask/pkg/task/migrate/gitredesign"
 	"github.com/zippoxer/subtask/pkg/workspace"
 )
@@ -27,19 +28,45 @@ type TestEnv struct {
 func NewTestEnv(t *testing.T, numWorkspaces int) *TestEnv {
 	t.Helper()
 
-	// Create temp root
-	root, err := os.MkdirTemp("", "subtask-test-*")
-	if err != nil {
-		t.Fatalf("failed to create temp dir: %v", err)
-	}
+	origSubtaskDir, hadSubtaskDir := os.LookupEnv("SUBTASK_DIR")
+	requireSetEnv(t, "SUBTASK_DIR", t.TempDir())
+	t.Cleanup(func() {
+		if hadSubtaskDir {
+			_ = os.Setenv("SUBTASK_DIR", origSubtaskDir)
+		} else {
+			_ = os.Unsetenv("SUBTASK_DIR")
+		}
+	})
+
+	// Make git commit SHAs deterministic for golden tests by pinning author/committer
+	// timestamps. Tests that care about time should use history events (nowFunc), not
+	// git commit metadata.
+	origAuthorDate, hadAuthorDate := os.LookupEnv("GIT_AUTHOR_DATE")
+	origCommitterDate, hadCommitterDate := os.LookupEnv("GIT_COMMITTER_DATE")
+	requireSetEnv(t, "GIT_AUTHOR_DATE", "2026-01-01T00:00:00Z")
+	requireSetEnv(t, "GIT_COMMITTER_DATE", "2026-01-01T00:00:00Z")
+	t.Cleanup(func() {
+		if hadAuthorDate {
+			_ = os.Setenv("GIT_AUTHOR_DATE", origAuthorDate)
+		} else {
+			_ = os.Unsetenv("GIT_AUTHOR_DATE")
+		}
+		if hadCommitterDate {
+			_ = os.Setenv("GIT_COMMITTER_DATE", origCommitterDate)
+		} else {
+			_ = os.Unsetenv("GIT_COMMITTER_DATE")
+		}
+	})
+
+	// Create temp root (git repo)
+	root := t.TempDir()
 
 	// Initialize as git repo
 	initGitRepo(t, root)
 
-	// Create .subtask directory structure
+	// Create portable task dir (repo-local only)
 	subtaskDir := filepath.Join(root, ".subtask")
-	os.MkdirAll(filepath.Join(subtaskDir, "tasks"), 0755)
-	os.MkdirAll(filepath.Join(subtaskDir, "internal"), 0755)
+	_ = os.MkdirAll(filepath.Join(subtaskDir, "tasks"), 0o755)
 
 	// Create workspaces (git worktrees) using the standard naming convention
 	// so ListWorkspaces() can discover them
@@ -62,9 +89,10 @@ func NewTestEnv(t *testing.T, numWorkspaces int) *TestEnv {
 			"model": "gpt-5.2",
 		},
 	}
-	cfgPath := filepath.Join(subtaskDir, "config.json")
+	cfgPath := task.ConfigPath()
 	cfgData, _ := json.MarshalIndent(cfg, "", "  ")
-	os.WriteFile(cfgPath, cfgData, 0644)
+	_ = os.MkdirAll(filepath.Dir(cfgPath), 0o755)
+	_ = os.WriteFile(cfgPath, cfgData, 0o644)
 
 	// Save original cwd and change to test root
 	origCwd, _ := os.Getwd()
@@ -81,20 +109,18 @@ func NewTestEnv(t *testing.T, numWorkspaces int) *TestEnv {
 	t.Cleanup(func() {
 		os.Chdir(origCwd)
-
-		// Remove all worktrees for this repo from the global workspaces directory
-		// (tests may create additional worktrees lazily).
-		escapedPath := task.EscapePath(root)
-		pattern := filepath.Join(task.WorkspacesDir(), escapedPath+"--*")
-		matches, _ := filepath.Glob(pattern)
-		for _, ws := range matches {
-			os.RemoveAll(ws)
-		}
-		os.RemoveAll(root)
 	})
 
 	return env
 }
 
+func requireSetEnv(t *testing.T, k, v string) {
+	t.Helper()
+	if err := os.Setenv(k, v); err != nil {
+		t.Fatalf("setenv %s: %v", k, err)
+	}
+}
+
 // CreateTask creates a task with TASK.md.
 func (e *TestEnv) CreateTask(name, title, base, description string) *task.Task {
 	e.T.Helper()
@@ -103,7 +129,7 @@ func (e *TestEnv) CreateTask(name, title, base, description string) *task.Task {
 		Title:       title,
 		BaseBranch:  base,
 		Description: description,
-		Schema:      1,
+		Schema:      gitredesign.TaskSchemaVersion,
 	}
 	if err := t.Save(); err != nil {
 		e.T.Fatalf("failed to save task: %v", err)
diff --git a/pkg/tui/model.go b/pkg/tui/model.go
index 651e83a..a43eb90 100644
--- a/pkg/tui/model.go
+++ b/pkg/tui/model.go
@@ -16,8 +16,8 @@ import (
 	"github.com/zippoxer/subtask/pkg/git"
 	"github.com/zippoxer/subtask/pkg/logging"
 	"github.com/zippoxer/subtask/pkg/task"
-	"github.com/zippoxer/subtask/pkg/task/gather"
 	"github.com/zippoxer/subtask/pkg/task/history"
+	"github.com/zippoxer/subtask/pkg/task/store"
 )
 
 type viewMode int
@@ -69,7 +69,7 @@ type toastState struct {
 func (t toastState) active() bool { return t.kind != toastNone && t.text != "" }
 
 type listLoadedMsg struct {
-	data gather.TaskListData
+	data store.ListResult
 	err  error
 }
@@ -81,7 +81,7 @@ var spinnerFrames = []string{"⠋", "⠙", "⠹", "⠸", "⠼", "⠴", "⠦", "⠧"}
 
 type detailLoadedMsg struct {
 	taskName string
-	detail   gather.TaskDetail
+	detail   store.TaskView
 	err      error
 }
@@ -129,8 +129,8 @@ type model struct {
 	width  int
 	height int
 
-	tasks               []gather.TaskListItem
-	filteredTasks       []gather.TaskListItem // filtered view when searching
+	tasks               []store.TaskListItem
+	filteredTasks       []store.TaskListItem // filtered view when searching
 	availableWorkspaces int
 	selected            int
 	offset              int
@@ -157,7 +157,7 @@ type model struct {
 	// detail data (refreshed on demand)
 	detailTaskName string
-	detail         gather.TaskDetail
+	detail         store.TaskView
 	detailErr      error
 
 	// viewports (one per tab; diff uses split-pane viewport)
@@ -360,9 +360,9 @@ func (m model) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
 			return m, nil
 		}
 		if logging.DebugEnabled() {
-			logging.Debug("tui", fmt.Sprintf("data arrived items=%d (+%s)", len(msg.data.Items), sinceStartup().Round(time.Millisecond)))
+			logging.Debug("tui", fmt.Sprintf("data arrived items=%d (+%s)", len(msg.data.Tasks), sinceStartup().Round(time.Millisecond)))
 		}
-		m.tasks = msg.data.Items
+		m.tasks = msg.data.Tasks
 		m.availableWorkspaces = msg.data.AvailableWorkspaces
 
 		// Re-filter if search is active (without resetting selection)
@@ -905,12 +905,13 @@ func (m model) View() string {
 func fetchListCmd() tea.Cmd {
 	return func() tea.Msg {
 		done := logging.DebugTimer("refresh", "start")
-		data, err := gather.List(context.Background(), gather.ListOptions{All: true})
+		st := store.New()
+		data, err := st.List(context.Background(), store.ListOptions{All: true})
 		if err != nil {
-			logging.Error("refresh", "gather.List error: "+err.Error())
+			logging.Error("refresh", "store.List error: "+err.Error())
 		}
 		if logging.DebugEnabled() {
-			done(fmt.Sprintf("done items=%d", len(data.Items)))
+			done(fmt.Sprintf("done items=%d", len(data.Tasks)))
 		} else {
 			done("")
 		}
@@ -950,7 +951,7 @@ func (m model) refreshSelected() tea.Cmd {
 	return tea.Batch(cmds...)
 }

-func clampSelection(tasks []gather.TaskListItem, idx int, preferredName string) int {
+func clampSelection(tasks []store.TaskListItem, idx int, preferredName string) int {
 	if len(tasks) == 0 {
 		return 0
 	}
@@ -971,7 +972,7 @@ func clampSelection(tasks []gather.TaskListItem, idx int, preferredName string)
 }

 // visibleTasks returns the filtered tasks if search is active, otherwise all tasks.
-func (m model) visibleTasks() []gather.TaskListItem {
+func (m model) visibleTasks() []store.TaskListItem {
 	if m.searchActive && m.searchInput.Value() != "" {
 		return m.filteredTasks
 	}
@@ -1009,7 +1010,8 @@ func (m *model) refilterTasks() {

 func fetchDetailCmd(taskName string) tea.Cmd {
 	return func() tea.Msg {
-		d, err := gather.Detail(context.Background(), taskName)
+		st := store.New()
+		d, err := st.Get(context.Background(), taskName, store.GetOptions{})
 		return detailLoadedMsg{taskName: taskName, detail: d, err: err}
 	}
 }
@@ -1161,7 +1163,7 @@ func fetchConversationCmd(taskName string) tea.Cmd {
 	}
 }

-func fetchDiffFilesCmd(taskName string, detail gather.TaskDetail) tea.Cmd {
+func fetchDiffFilesCmd(taskName string, detail store.TaskView) tea.Cmd {
 	return func() tea.Msg {
 		ctx, err := computeDiffCtx(taskName, detail)
 		if err != nil {
@@ -1193,7 +1195,7 @@ func fetchDiffFilesCmd(taskName string, detail gather.TaskDetail) tea.Cmd {
 	}
 }

-func computeDiffCtx(taskName string, detail gather.TaskDetail) (diffCtx, error) {
+func computeDiffCtx(taskName string, detail store.TaskView) (diffCtx, error) {
 	if detail.Task == nil {
 		return diffCtx{}, fmt.Errorf("diff unavailable")
 	}
diff --git a/pkg/tui/model_helpers.go b/pkg/tui/model_helpers.go
index d68c8be..a12ce27 100644
--- a/pkg/tui/model_helpers.go
+++ b/pkg/tui/model_helpers.go
@@ -9,6 +9,7 @@ import (
 	"github.com/charmbracelet/lipgloss"
 	"github.com/charmbracelet/x/ansi"
 	"github.com/zippoxer/subtask/pkg/task"
+	"github.com/zippoxer/subtask/pkg/task/store"
 )

 const (
@@ -150,13 +151,27 @@ func (m *model) updateOverviewContent() {
 	}
 		detailsLines = append(detailsLines, styleDim.Render(padRight("Model", labelWidth))+modelInfo)
 	}
-	if m.detail.LinesAdded > 0 || m.detail.LinesRemoved > 0 {
-		changesInfo := styleSuccess.Render(fmt.Sprintf("+%d", m.detail.LinesAdded)) +
-			" " + styleError.Render(fmt.Sprintf("-%d", m.detail.LinesRemoved))
+	switch m.detail.Changes.Status {
+	case store.ChangesStatusMissing:
+		detailsLines = append(detailsLines, styleDim.Render(padRight("Changes", labelWidth))+styleDim.Render("missing"))
+		detailsLines = append(detailsLines, styleDim.Render(padRight("", labelWidth))+"Branch deleted or commit objects missing.")
+	case store.ChangesStatusApplied:
+		changesInfo := styleSuccess.Render(fmt.Sprintf("+%d", m.detail.Changes.Added)) +
+			" " + styleError.Render(fmt.Sprintf("-%d", m.detail.Changes.Removed))
 		detailsLines = append(detailsLines, styleDim.Render(padRight("Changes", labelWidth))+changesInfo)
-		if m.detail.CommitsBehind > 0 {
-			behindMsg := styleStatusReplied.Render(fmt.Sprintf("%d commits behind", m.detail.CommitsBehind))
-			detailsLines = append(detailsLines, strings.Repeat(" ", labelWidth)+behindMsg)
+		detailsLines = append(detailsLines, styleDim.Render(padRight("", labelWidth))+"Already in base branch. Merge to mark as merged.")
+	default:
+		if m.detail.Changes.Added > 0 || m.detail.Changes.Removed > 0 {
+			changesInfo := styleSuccess.Render(fmt.Sprintf("+%d", m.detail.Changes.Added)) +
+				" " + styleError.Render(fmt.Sprintf("-%d", m.detail.Changes.Removed))
+			detailsLines = append(detailsLines, styleDim.Render(padRight("Changes", labelWidth))+changesInfo)
+		}
+	}
+	if m.detail.TaskStatus == task.TaskStatusOpen {
+		if m.detail.Commits.Err != nil {
+			detailsLines = append(detailsLines, styleDim.Render(padRight("Commits", labelWidth))+styleStatusError.Render(m.detail.Commits.Err.Error()))
+		} else {
+			detailsLines = append(detailsLines, styleDim.Render(padRight("Commits", labelWidth))+fmt.Sprintf("%d", m.detail.Commits.Count))
 		}
 	}
 	if m.detail.ProgressMeta != nil {
diff --git a/pkg/tui/model_helpers_overview_test.go b/pkg/tui/model_helpers_overview_test.go
index 5fd15b8..e1144d0 100644
--- a/pkg/tui/model_helpers_overview_test.go
+++ b/pkg/tui/model_helpers_overview_test.go
@@ -7,7 +7,7 @@ import (
 	"github.com/charmbracelet/bubbles/viewport"
 	"github.com/charmbracelet/x/ansi"
 	"github.com/zippoxer/subtask/pkg/task"
-	"github.com/zippoxer/subtask/pkg/task/gather"
+	"github.com/zippoxer/subtask/pkg/task/store"
 	"github.com/zippoxer/subtask/pkg/workflow"
 )

@@ -28,7 +28,7 @@ func TestUpdateOverviewContent_RendersProgressAndWorkflow(t *testing.T) {
 	m := newModel()
 	m.vpOverview = viewport.New(80, 30)

-	m.detail = gather.TaskDetail{
+	m.detail = store.TaskView{
 		Task: &task.Task{
 			Name:  "fix/overview",
 			Title: "Overview Task",
diff --git a/pkg/tui/view_detail.go b/pkg/tui/view_detail.go
index c872683..30ceb7d 100644
--- a/pkg/tui/view_detail.go
+++ b/pkg/tui/view_detail.go
@@ -94,7 +94,7 @@ func renderDetailHeader(m model, leftPad string, contentWidth int) string {
 		startedAt = m.detail.State.StartedAt
 		lastError = m.detail.State.LastError
 	}
-	statusPill := statusPillStyled(m.detail.TaskStatus, m.detail.WorkerStatus, m.detail.IntegratedReason, startedAt, m.detail.LastRunMS, lastError, m.spinnerFrame)
+	statusPill := statusPillStyled(m.detail.TaskStatus, m.detail.WorkerStatus, startedAt, m.detail.LastRunMS, lastError, m.spinnerFrame)

 	// Build left side: name + title
 	titleStyle := lipgloss.NewStyle().Foreground(lipgloss.AdaptiveColor{Light: "240", Dark: "250"})
@@ -217,8 +217,8 @@ func addPadding(content, leftPad string) string {
 	return strings.Join(lines, "\n")
 }

-func statusPillStyled(taskStatus task.TaskStatus, workerStatus task.WorkerStatus, integratedReason string, startedAt time.Time, lastRunMS int, lastError string, spinnerFrame int) string {
-	return unifiedStatusTextStyled(taskStatus, workerStatus, integratedReason, startedAt, lastRunMS, lastError, spinnerFrame)
+func statusPillStyled(taskStatus task.TaskStatus, workerStatus task.WorkerStatus, startedAt time.Time, lastRunMS int, lastError string, spinnerFrame int) string {
+	return unifiedStatusTextStyled(taskStatus, workerStatus, startedAt, lastRunMS, lastError, spinnerFrame)
 }

 // formatDurationShort returns a short human-readable duration like "5m" or "2h".
diff --git a/pkg/tui/view_list.go b/pkg/tui/view_list.go
index 422fe97..c364cca 100644
--- a/pkg/tui/view_list.go
+++ b/pkg/tui/view_list.go
@@ -8,7 +8,7 @@ import (
 	"github.com/charmbracelet/lipgloss"
 	zone "github.com/lrstanley/bubblezone"
 	"github.com/zippoxer/subtask/pkg/task"
-	"github.com/zippoxer/subtask/pkg/task/gather"
+	"github.com/zippoxer/subtask/pkg/task/store"
 )

 // Selection indicator character
@@ -249,15 +249,15 @@ func renderSearchBoxWithBg(m model, maxWidth int, bg lipgloss.TerminalColor) str
 }

 // stageText returns stage or empty for closed tasks.
-func stageText(t gather.TaskListItem) string {
+func stageText(t store.TaskListItem) string {
 	return t.Stage
 }

 // listRowDataLeft returns plain text data for left-column width calculation.
-func listRowDataLeft(t gather.TaskListItem) []string {
+func listRowDataLeft(t store.TaskListItem) []string {
 	return []string{
 		t.Name,
-		unifiedStatusTextPlain(t.TaskStatus, t.WorkerStatus, t.IntegratedReason, t.StartedAt, t.LastRunDurationMS, t.LastError),
+		unifiedStatusTextPlain(t.TaskStatus, t.WorkerStatus, t.StartedAt, t.LastRunDurationMS, t.LastError),
 		stageText(t),
 		changesTextPlain(t),
 	}
@@ -265,11 +265,11 @@ func listRowDataLeft(t gather.TaskListItem) []string {

 // buildTaskRow builds a complete row with stretched layout.
 // PROGRESS column stretches to fill space, LAST ACTIVE is right-aligned.
-func buildTaskRow(t gather.TaskListItem, widths []int, totalWidth int, spinnerFrame int) string {
+func buildTaskRow(t store.TaskListItem, widths []int, totalWidth int, spinnerFrame int) string {
 	// Build left columns (TASK through CHANGES)
 	leftCells := []string{
 		padRight(t.Name, widths[0]),
-		padRightDisplay(unifiedStatusTextStyled(t.TaskStatus, t.WorkerStatus, t.IntegratedReason, t.StartedAt, t.LastRunDurationMS, t.LastError, spinnerFrame), widths[1]),
+		padRightDisplay(unifiedStatusTextStyled(t.TaskStatus, t.WorkerStatus, t.StartedAt, t.LastRunDurationMS, t.LastError, spinnerFrame), widths[1]),
 		padRight(stageText(t), widths[2]),
 		padRightDisplay(changesTextStyled(t), widths[3]),
 	}
@@ -294,11 +294,11 @@ func buildTaskRow(t gather.TaskListItem, widths []int, totalWidth int, spinnerFr
 }

 // buildTaskRowSelected builds a row for selected task with blue+bold task name.
-func buildTaskRowSelected(t gather.TaskListItem, widths []int, totalWidth int, spinnerFrame int) string {
+func buildTaskRowSelected(t store.TaskListItem, widths []int, totalWidth int, spinnerFrame int) string {
 	// Build left columns - task name is blue+bold, rest normal
 	leftCells := []string{
 		styleSelectedTaskName.Render(padRight(t.Name, widths[0])),
-		padRightDisplay(unifiedStatusTextStyled(t.TaskStatus, t.WorkerStatus, t.IntegratedReason, t.StartedAt, t.LastRunDurationMS, t.LastError, spinnerFrame), widths[1]),
+		padRightDisplay(unifiedStatusTextStyled(t.TaskStatus, t.WorkerStatus, t.StartedAt, t.LastRunDurationMS, t.LastError, spinnerFrame), widths[1]),
 		padRight(stageText(t), widths[2]),
 		padRightDisplay(changesTextStyled(t), widths[3]),
 	}
@@ -322,11 +322,7 @@ func buildTaskRowSelected(t gather.TaskListItem, widths []int, totalWidth int, s
 	return leftPart + " " + progressPart + strings.Repeat(" ", gap) + styleDim.Render(lastActivePart)
 }

-func unifiedStatusTextPlain(ts task.TaskStatus, ws task.WorkerStatus, integratedReason string, startedAt time.Time, lastRunMS int, lastError string) string {
-	// Don't show "merged" if worker is actively running
-	if ws != task.WorkerStatusRunning && strings.TrimSpace(integratedReason) != "" {
-		return "✓ merged"
-	}
+func unifiedStatusTextPlain(ts task.TaskStatus, ws task.WorkerStatus, startedAt time.Time, lastRunMS int, lastError string) string {
 	switch task.UserStatusFor(ts, ws) {
 	case task.UserStatusMerged:
 		return "✓ merged"
@@ -360,11 +356,7 @@ func unifiedStatusTextPlain(ts task.TaskStatus, ws task.WorkerStatus, integrated
 	}
 }

-func unifiedStatusTextStyled(ts task.TaskStatus, ws task.WorkerStatus, integratedReason string, startedAt time.Time, lastRunMS int, lastError string, spinnerFrame int) string {
-	// Don't show "merged" if worker is actively running
-	if ws != task.WorkerStatusRunning && strings.TrimSpace(integratedReason) != "" {
-		return styleStatusMerged.Render("✓ merged")
-	}
+func unifiedStatusTextStyled(ts task.TaskStatus, ws task.WorkerStatus, startedAt time.Time, lastRunMS int, lastError string, spinnerFrame int) string {
 	switch task.UserStatusFor(ts, ws) {
 	case task.UserStatusMerged:
 		return styleStatusMerged.Render("✓ merged")
@@ -431,22 +423,34 @@ func progressBar(done, total int) string {
 	return filledStyle.Render(filled) + emptyStyle.Render(empty)
 }

-func changesTextPlain(t gather.TaskListItem) string {
-	if t.LinesAdded == 0 && t.LinesRemoved == 0 {
+func changesTextPlain(t store.TaskListItem) string {
+	switch t.Changes.Status {
+	case store.ChangesStatusMissing:
+		return "missing"
+	case store.ChangesStatusApplied:
+		return "applied"
+	}
+	if t.Changes.Added == 0 && t.Changes.Removed == 0 {
 		return ""
 	}
-	return fmt.Sprintf("+%d -%d", t.LinesAdded, t.LinesRemoved)
+	return fmt.Sprintf("+%d -%d", t.Changes.Added, t.Changes.Removed)
 }

-func changesTextStyled(t gather.TaskListItem) string {
-	if t.LinesAdded == 0 && t.LinesRemoved == 0 {
+func changesTextStyled(t store.TaskListItem) string {
+	switch t.Changes.Status {
+	case store.ChangesStatusMissing:
+		return styleDim.Render("missing")
+	case store.ChangesStatusApplied:
+		return styleDim.Render("applied")
+	}
+	if t.Changes.Added == 0 && t.Changes.Removed == 0 {
 		return ""
 	}
-	return styleSuccess.Render(fmt.Sprintf("+%d", t.LinesAdded)) + " " +
-		styleError.Render(fmt.Sprintf("-%d", t.LinesRemoved))
+	return styleSuccess.Render(fmt.Sprintf("+%d", t.Changes.Added)) + " " +
+		styleError.Render(fmt.Sprintf("-%d", t.Changes.Removed))
 }

-func lastActiveText(t gather.TaskListItem) string {
+func lastActiveText(t store.TaskListItem) string {
 	if t.LastActive.IsZero() {
 		return ""
 	}
diff --git a/pkg/tui/view_list_helpers_test.go b/pkg/tui/view_list_helpers_test.go
index 43a5b8d..d546577 100644
--- a/pkg/tui/view_list_helpers_test.go
+++ b/pkg/tui/view_list_helpers_test.go
@@ -6,14 +6,14 @@ import (

 	"github.com/charmbracelet/x/ansi"
 	"github.com/zippoxer/subtask/pkg/task"
-	"github.com/zippoxer/subtask/pkg/task/gather"
+	"github.com/zippoxer/subtask/pkg/task/store"
 )

 func TestStageText_HidesStageForClosedTasks(t *testing.T) {
-	if got := stageText(gather.TaskListItem{TaskStatus: task.TaskStatusClosed, Stage: "review"}); got != "review" {
+	if got := stageText(store.TaskListItem{TaskStatus: task.TaskStatusClosed, Stage: "review"}); got != "review" {
 		t.Fatalf("stageText(closed)=%q want %q", got, "review")
 	}
-	if got := stageText(gather.TaskListItem{TaskStatus: task.TaskStatusOpen, Stage: "review"}); got != "review" {
+	if got := stageText(store.TaskListItem{TaskStatus: task.TaskStatusOpen, Stage: "review"}); got != "review" {
 		t.Fatalf("stageText(open)=%q want %q", got, "review")
 	}
 }
diff --git a/pkg/workspace/config.go b/pkg/workspace/config.go
index 5c969c3..bec1fb0 100644
--- a/pkg/workspace/config.go
+++ b/pkg/workspace/config.go
@@ -9,6 +9,7 @@ import (
 	"strconv"
 	"strings"

+	"github.com/zippoxer/subtask/pkg/subtaskerr"
 	"github.com/zippoxer/subtask/pkg/task"
 )

@@ -28,32 +29,38 @@ type Entry struct {
 	ID int // e.g., 1
 }

-// LoadConfig loads the project config from .subtask/config.json.
+// LoadConfig loads the effective config (global defaults + optional project overrides).
 func LoadConfig() (*Config, error) {
-	data, err := os.ReadFile(task.ConfigPath())
+	userPath := task.ConfigPath()
+	user, userExists, err := loadConfigFile(userPath)
 	if err != nil {
-		if os.IsNotExist(err) {
-			return nil, fmt.Errorf("not initialized\n\nRun 'subtask init' first")
-		}
-		return nil, err
+		return nil, fmt.Errorf("subtask: invalid config at %s\n\nFix it with:\n  subtask config --user", userPath)
 	}
-	var cfg Config
-	if err := json.Unmarshal(data, &cfg); err != nil {
-		return nil, fmt.Errorf("invalid config: %w", err)
+	// Best-effort project override discovery (requires git; ignored if not in git).
+	var project *Config
+	var projectPath string
+	if root, err := task.GitRootAbs(); err == nil && strings.TrimSpace(root) != "" {
+		projectPath = filepath.Join(root, ".subtask", "config.json")
+		project, _, err = loadConfigFile(projectPath)
+		if err != nil {
+			return nil, fmt.Errorf("subtask: invalid project config at %s\n\nFix it with:\n  subtask config --project", projectPath)
+		}
 	}
-	if cfg.MaxWorkspaces <= 0 {
-		cfg.MaxWorkspaces = DefaultMaxWorkspaces
+
+	if !userExists || user == nil {
+		return nil, subtaskerr.ErrNotConfigured
 	}
-	return &cfg, nil
-}

-// Save writes the config to .subtask/config.json.
-func (c *Config) Save() error {
-	if err := os.MkdirAll(task.ProjectDir(), 0755); err != nil {
-		return err
+	effective := mergeConfig(user, project)
+	if effective.MaxWorkspaces <= 0 {
+		effective.MaxWorkspaces = DefaultMaxWorkspaces
 	}
+	return effective, nil
+}

+// SaveTo writes the config to a specific path.
+func (c *Config) SaveTo(path string) error {
 	if c.MaxWorkspaces <= 0 {
 		c.MaxWorkspaces = DefaultMaxWorkspaces
 	}
@@ -62,7 +69,15 @@ func (c *Config) Save() error {
 	if err != nil {
 		return err
 	}
-	return os.WriteFile(task.ConfigPath(), data, 0644)
+	if err := os.MkdirAll(filepath.Dir(path), 0o755); err != nil {
+		return err
+	}
+	return os.WriteFile(path, data, 0o644)
+}
+
+// Save writes the config to the global defaults path (~/.subtask/config.json).
+func (c *Config) Save() error {
+	return c.SaveTo(task.ConfigPath())
 }

 // ListWorkspaces discovers workspaces for the current project by globbing.
@@ -98,3 +113,47 @@ func ListWorkspaces() ([]Entry, error) {

 	return entries, nil
 }
+
+func mergeConfig(user, project *Config) *Config {
+	out := &Config{
+		Harness:       strings.TrimSpace(user.Harness),
+		MaxWorkspaces: user.MaxWorkspaces,
+		Options:       make(map[string]any),
+	}
+	for k, v := range user.Options {
+		out.Options[k] = v
+	}
+	if project == nil {
+		return out
+	}
+
+	if strings.TrimSpace(project.Harness) != "" {
+		out.Harness = strings.TrimSpace(project.Harness)
+	}
+	if project.MaxWorkspaces > 0 {
+		out.MaxWorkspaces = project.MaxWorkspaces
+	}
+	for k, v := range project.Options {
+		out.Options[k] = v
+	}
+	return out
+}
+
+func loadConfigFile(path string) (*Config, bool, error) {
+	data, err := os.ReadFile(path)
+	if err != nil {
+		if os.IsNotExist(err) {
+			return nil, false, nil
+		}
+		return nil, false, err
+	}
+
+	var cfg Config
+	if err := json.Unmarshal(data, &cfg); err != nil {
+		return nil, true, err
+	}
+	if cfg.Options == nil {
+		cfg.Options = make(map[string]any)
+	}
+	return &cfg, true, nil
+}
diff --git a/pkg/workspace/model_override.go b/pkg/workspace/model_override.go
index e4c42ee..9ef8c11 100644
--- a/pkg/workspace/model_override.go
+++ b/pkg/workspace/model_override.go
@@ -28,7 +28,7 @@ func ValidateReasoningFlag(harnessName, reasoning string) error {
 		return nil
 	}
 	if strings.TrimSpace(harnessName) != "codex" {
-		return fmt.Errorf("reasoning is codex-only\n\nRemove --reasoning, or switch your project harness to codex in %s", task.ConfigPath())
+		return fmt.Errorf("reasoning is codex-only\n\nRemove --reasoning, or switch your harness to codex with:\n  subtask config --user\nor (repo-only):\n  subtask config --project")
 	}
 	return ValidateReasoningLevel(reasoning)
 }
diff --git a/plugin/commands/setup.md b/plugin/commands/setup.md
deleted file mode 100644
index 1400760..0000000
--- a/plugin/commands/setup.md
+++ /dev/null
@@ -1,49 +0,0 @@
----
-description: Initialize Subtask for this repository
----
-
-# Setup Subtask
-
-Initialize Subtask for the current repository.
-
-## Check available harnesses
-Check if `git` is installed and if we're inside a Git repository. If not, let user know that Subtask requires a Git repository and stop.
-
-## Check available harnesses
-
-```bash
-codex --version
-claude --version
-```
-
-**Important:** The "worker harness" is the AI that will execute tasks in parallel workspaces - NOT you (Claude Code). You are the lead; the harness is your worker.
-
-## Ask the user which harness to use
-
-| Harness | Command | Notes |
-|---------|---------|-------|
-| **Codex CLI** | `codex` | Recommended - more reliable at autonomous multi-step tasks |
-| **Claude Code CLI** | `claude` | Good alternative if Codex isn't installed |
-
-If only Claude Code is available, use it. If both are available, ask user and recommend Codex.
-
-## Initialize
-
-```bash
-subtask init --harness -n 20
-```
-
-This creates `.subtask/config.json`. Workspaces are created on demand at `~/.subtask/workspaces/`.
-
-## Done
-
-Tell the user:
-
-> Subtask is ready!
->
-> Example usage:
-> - "fix the login bug with Subtask"
-> - "run these 3 features in parallel"
-> - "plan and implement the new API endpoint with Subtask"
->
-> I'll draft tasks, dispatch workers in isolated workspaces and let you know when they're done.
diff --git a/plugin/embed.go b/plugin/embed.go
deleted file mode 100644
index 0e25e9e..0000000
--- a/plugin/embed.go
+++ /dev/null
@@ -1,11 +0,0 @@
-package plugin
-
-import "embed"
-
-// FS contains the embedded Claude plugin files.
-//
-// Install logic should typically use fs.WalkDir(FS, ".") and copy files into a target plugin directory,
-// preserving the relative paths.
-//
-//go:embed .claude-plugin/plugin.json commands/* hooks/hooks.json scripts/*
-var FS embed.FS