vibe-kanban/shared/types.ts
Alex Netsch 4188adc2a9 Refactor task_attempts to workspaces and sessions (Vibe Kanban) (#1569)
* All tasks are complete. Here's a summary of the refactoring that was done:

The `task_attempts` table has been refactored into `workspaces` and `sessions`:

- Created new migration `20251216142123_refactor_task_attempts_to_workspaces_sessions.sql`
- Renamed `task_attempts` table to `workspaces` (removed `executor` column)
- Created new `sessions` table with `(id, workspace_id, executor, created_at, updated_at)`
- Renamed `executor_sessions` to `coding_agent_turns` (dropped redundant `task_attempt_id`)
- Changed `execution_processes.task_attempt_id` to `session_id`
- Changed `merges.task_attempt_id` to `workspace_id`
- Changed `tasks.parent_task_attempt` to `parent_workspace_id`

- Renamed `crates/db/src/models/task_attempt.rs` → `workspace.rs`
- Created new `crates/db/src/models/session.rs`
- Renamed `executor_session.rs` → `coding_agent_turn.rs`
- Updated `execution_process.rs`:
  - Changed `task_attempt_id` field to `session_id`
  - Added workspace-level query methods (`find_latest_by_workspace_and_run_reason`, `latest_executor_profile_for_workspace`, `find_latest_external_session_id_by_workspace`, `find_running_dev_servers_by_workspace`, `has_running_non_dev_server_processes_for_workspace`)
- Updated `merge.rs`: Changed `task_attempt_id` to `workspace_id`
- Updated `task.rs`: Changed `parent_task_attempt` to `parent_workspace_id`

- Updated `container.rs`: Changed `start_execution` to take 4 parameters (workspace, session, action, reason)
- Updated `events/`: Changed event types from `TaskAttempt` to `Workspace`
- Updated `pr_monitor.rs`: Changed to use `Workspace` instead of `TaskAttempt`
- Updated `workspace_manager.rs`: Added support for `DbWorkspace` alias

- Updated `task_attempts.rs`: All handlers now create/use sessions for execution processes
- Updated `tasks.rs`: Removed `executor` from `CreateWorkspace`, fixed error type
- Updated setup helpers (codex, cursor, gh_cli) to create sessions
- Updated `error.rs`: Added `SessionError` handling

Cleanup script changes for task attempt 2f672d73-cb46-4515-b1c7-3da8f4402858

Done. All 3 `sqlx::query_scalar()` function calls have been converted to `sqlx::query_scalar!()` macros:

1. `has_running_non_dev_server_processes()` - lines 332-341
2. `has_running_non_dev_server_processes_for_workspace()` - lines 350-361
3. `find_prev_after_head_commit()` - lines 619-634

Key changes:
- Converted `?` placeholders to `$1, $2, $3` positional parameters (sqlx macro style)
- Added `as "count!: i64"` type annotation for COUNT queries
- Used `.flatten()` for the nullable `after_head_commit` column to avoid `Option<Option<String>>`

The one in `crates/remote/src/db/mod.rs` uses dynamic SQL for PostgreSQL's `ALTER ROLE` and cannot be converted to a macro.

Done. Removed the 4 unused functions from `execution_process.rs`:
- `find_running_dev_servers_by_session`
- `has_running_non_dev_server_processes` (the per-session version)
- `delete_by_session_id`
- `latest_executor_profile_for_session`

The workspace-level versions (`has_running_non_dev_server_processes_for_workspace`, `find_running_dev_servers_by_workspace`, `latest_executor_profile_for_workspace`) are still available. Everything builds successfully.

Rebase

Revert event rename

* All done. Here's a summary of the changes:

Created a `TaskAttempt` helper struct in the backend that maintains backwards compatibility with the frontend:

1. **`crates/db/src/models/workspace.rs`**
   - Added `TaskAttempt` struct that combines `Workspace` fields + `executor` from the latest `Session`
   - Added `TaskAttempt::from_workspace()` to convert a `Workspace` by fetching its latest session's executor
   - Added `TaskAttempt::fetch_all()` for API responses

2. **`crates/db/src/models/task.rs`**
   - Changed `TaskRelationships.current_workspace` → `current_attempt: TaskAttempt`

3. **`crates/server/src/routes/task_attempts.rs`**
   - Updated `get_task_attempts` to return `Vec<TaskAttempt>`
   - Updated `get_task_attempt` to return `TaskAttempt`
   - Updated `create_task_attempt` to return `TaskAttempt`

4. **`crates/server/src/bin/generate_types.rs`**
   - Added `TaskAttempt::decl()` to the type generation list

Fixed references from `parent_task_attempt` to `parent_workspace_id` in:
- `TaskCard.tsx`
- `NoServerContent.tsx`
- `TaskPanel.tsx`
- `TaskFormDialog.tsx`
- `CreateAttemptDialog.tsx`
- `useTaskMutations.ts`
- `ProjectTasks.tsx`

- Frontend now gets `TaskAttempt` type with the `executor` field
- The `TaskRelationships.current_attempt` field provides `TaskAttempt` with executor
- Backend still uses `Workspace` + `Session` internally
- All type checks pass

Fixed. The issue was that the frontend was sending `task_attempt_id` as the query parameter, but the backend expected `workspace_id`. The execution processes WebSocket stream should now work correctly and the "Loading History" state should resolve.
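
Illustrative only: a tiny sketch of that parameter rename on the frontend side. The endpoint path and helper name below are hypothetical; only the `task_attempt_id` → `workspace_id` rename comes from the fix described above.

```ts
// Hypothetical helper; the real stream URL is built elsewhere in the frontend.
function executionProcessesStreamUrl(workspaceId: string): string {
  // The backend now expects `workspace_id` (previously `task_attempt_id`).
  const params = new URLSearchParams({ workspace_id: workspaceId });
  return `/api/execution-processes/stream?${params.toString()}`; // illustrative path
}
```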

* All done. Here's a summary of the changes:

1. **Added sessions endpoint** (`GET /api/sessions?workspace_id={uuid}`)
   - Created `crates/server/src/routes/sessions.rs`
   - Registered in `crates/server/src/routes/mod.rs`
   - Added `Session::decl()` to `generate_types.rs`

2. **Removed TaskAttempt from backend**
   - Removed `TaskAttempt` struct and methods from `crates/db/src/models/workspace.rs`
   - Changed `TaskRelationships.current_attempt` back to `current_workspace: Workspace` in `task.rs`
   - Updated route handlers to return `Workspace` instead of `TaskAttempt`

3. **Created `TaskAttempt` type** in `frontend/src/types/attempt.ts`
   - Defines `TaskAttempt = Workspace & { executor: string }`
   - Exports `createTaskAttempt()` helper function

4. **Updated `api.ts`**
   - Added `sessionsApi` for fetching sessions
   - Added `workspaceToTaskAttempt()` helper that fetches sessions and constructs TaskAttempt (sketched below)
   - Updated `attemptsApi.getAll()`, `get()`, and `create()` to internally convert Workspace → TaskAttempt

5. **Updated 20+ frontend files** to import `TaskAttempt` from `@/types/attempt` instead of `shared/types`

- Backend returns `Workspace` directly (no executor field)
- Backend has new `/api/sessions` endpoint to fetch sessions by workspace
- Frontend constructs `TaskAttempt` by combining `Workspace` + latest `Session.executor`
- All type checks pass
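
To make that flow concrete, here is a minimal sketch of the `Workspace` → `TaskAttempt` conversion, assuming a thin wrapper over `GET /api/sessions?workspace_id={uuid}` and the `ApiResponse` envelope from `shared/types.ts`; the real `sessionsApi` and `workspaceToTaskAttempt()` signatures in `api.ts` may differ.

```ts
import type { Session, Workspace } from 'shared/types';

// Frontend-only type: a Workspace enriched with the executor of its latest Session.
export type TaskAttempt = Workspace & { executor: string };

// Assumed wrapper over GET /api/sessions?workspace_id={uuid}; the real sessionsApi
// may be shaped differently and use the shared fetch client.
async function getSessionsByWorkspace(workspaceId: string): Promise<Session[]> {
  const res = await fetch(`/api/sessions?workspace_id=${workspaceId}`);
  const body = await res.json();
  return (body.data ?? []) as Session[]; // assumes the ApiResponse envelope
}

// Combine a Workspace with its most recent session's executor.
// Sorting by created_at is an assumption; the API may already return sessions sorted.
export async function workspaceToTaskAttempt(workspace: Workspace): Promise<TaskAttempt> {
  const sessions = await getSessionsByWorkspace(workspace.id);
  const latest = [...sessions].sort((a, b) =>
    b.created_at.localeCompare(a.created_at)
  )[0];
  return { ...workspace, executor: latest?.executor ?? '' };
}
```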

* The refactoring is complete. Here's a summary of what was done:

- `TaskAttempt` is now just a type alias for `Workspace` (no executor needed for most uses)
- Added `WorkspaceWithSession = Workspace & { executor: string }` for components that need executor
- Added `createWorkspaceWithSession()` helper function

- `attemptsApi.get()`, `getAll()`, `create()` now return `Workspace` directly (as `TaskAttempt`)
- Added `attemptsApi.getWithSession()` - fetches workspace + session to get executor
- Added `sessionsApi.getByWorkspace()` for fetching sessions

- Added `useTaskAttemptWithSession()` - for single attempt with executor
- Added `useTaskAttemptsWithSessions()` - for list of attempts with executor (see the sketch below)

Files that access `.executor` now use `WorkspaceWithSession`:
- `TaskPanel.tsx` - Uses `useTaskAttemptsWithSessions` and `useTaskAttemptWithSession`
- `NextActionCard.tsx` - Uses `attemptsApi.getWithSession()`
- `UserMessage.tsx` - Receives `WorkspaceWithSession` prop
- `RetryEditorInline.tsx` - Receives `WorkspaceWithSession` prop
- `DisplayConversationEntry.tsx` - Passes `WorkspaceWithSession` to children
- `VirtualizedList.tsx` - Receives `WorkspaceWithSession` prop
- `TaskAttemptPanel.tsx` - Receives `WorkspaceWithSession` prop
- `ProjectTasks.tsx` - Uses `useTaskAttemptWithSession`
- `FullAttemptLogs.tsx` - Uses `useTaskAttemptWithSession`
- `CreateAttemptDialog.tsx` - Uses `useTaskAttemptsWithSessions`

The 32 other files that only use Workspace fields continue to use `TaskAttempt` (which is just `Workspace`) without any changes needed.
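
As a rough illustration of the aggregation pattern behind these hooks (the same pattern visible in the review diff hunk quoted further down), here is a hedged sketch of `useTaskAttemptsWithSessions()`. The `attemptsApi`/`sessionsApi` shapes, the query-key helper, and the react-query package name are assumptions that may not match `lib/api.ts` exactly.

```ts
import { useQuery } from '@tanstack/react-query'; // assumed package name
import type { Session, Workspace } from 'shared/types';

export type WorkspaceWithSession = Workspace & { executor: string };

// Assumed API surface; the real attemptsApi/sessionsApi live in lib/api.ts.
declare const attemptsApi: { getAll(taskId: string): Promise<Workspace[]> };
declare const sessionsApi: { getByWorkspace(workspaceId: string): Promise<Session[]> };

// Assumed query-key helper mirroring taskAttemptKeys in the real hook.
const taskAttemptKeys = {
  byTaskWithSessions: (taskId?: string) =>
    ['task-attempts', taskId, 'with-sessions'] as const,
};

/** Attempts for a task, each enriched with its latest session's executor. */
export function useTaskAttemptsWithSessions(taskId?: string) {
  return useQuery<WorkspaceWithSession[]>({
    queryKey: taskAttemptKeys.byTaskWithSessions(taskId),
    enabled: !!taskId,
    queryFn: async () => {
      const attempts = await attemptsApi.getAll(taskId!);
      // Fetch sessions for all attempts in parallel.
      const sessionsPerAttempt = await Promise.all(
        attempts.map((a) => sessionsApi.getByWorkspace(a.id))
      );
      return attempts.map((attempt, i) => {
        // Pick the most recently created session; its executor may be null.
        const latest = [...sessionsPerAttempt[i]].sort((a, b) =>
          b.created_at.localeCompare(a.created_at)
        )[0];
        return { ...attempt, executor: latest?.executor ?? '' };
      });
    },
  });
}
```

Doing this aggregation client-side costs one request per attempt, which is what the reviewer's later suggestion to add a backend route for these hooks is getting at.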

Done! I've simplified the migration from 191 lines down to 106 lines by:

1. **Removed workspaces table rebuild** → simple `ALTER TABLE workspaces DROP COLUMN executor`
2. **Removed merges table rebuild** → simple `ALTER TABLE merges RENAME COLUMN task_attempt_id TO workspace_id`
3. **Removed tasks table rebuild** → simple `ALTER TABLE tasks RENAME COLUMN parent_task_attempt TO parent_workspace_id`
4. **Removed attempt_repos rebuild entirely** → FK auto-updates when `task_attempts` is renamed to `workspaces`

The only table rebuilds remaining are for:
- `execution_processes` - because the FK target changes from workspaces to sessions
- `executor_sessions` → `coding_agent_turns` - because we're dropping the `task_attempt_id` column

All done:

- **npm run lint** - passed (fixed 4 unused import errors)
- **cargo build** - passed
- **npm run build** - passed
- **tsc --noEmit** - passed

Prep db

* Done. I've renamed `TaskAttempt` to `Workspace` in all places where it was used as a type annotation. Here's the summary:

- `types/attempt.ts` - Removed `export type TaskAttempt = Workspace;`

**API Layer:**
- `lib/api.ts` - Changed return types from `TaskAttempt` to `Workspace`

**Hooks:**
- `hooks/useTaskAttempts.ts` - `TaskAttempt[]` → `Workspace[]`
- `hooks/useAttemptCreation.ts` - `TaskAttempt` → `Workspace`
- `hooks/useConversationHistory.ts` - `TaskAttempt` → `Workspace`
- `hooks/useAttempt.ts` - `TaskAttempt` → `Workspace`

**Components:**
- `contexts/ClickedElementsProvider.tsx`
- `components/DiffCard.tsx`
- `components/panels/AttemptHeaderActions.tsx`
- `components/panels/DiffsPanel.tsx`
- `components/tasks/Toolbar/GitOperations.tsx`
- `components/dialogs/tasks/GitActionsDialog.tsx`
- `components/dialogs/tasks/ViewRelatedTasksDialog.tsx`
- `components/dialogs/tasks/CreatePRDialog.tsx`
- `components/ui/actions-dropdown.tsx`

**Type Definitions:**
- `types/modal-args.d.ts`
- `types/modals.ts`

**Pages:**
- `pages/ProjectTasks.tsx`

TypeScript check passes with no errors.

Cleanup script changes for task attempt 2f672d73-cb46-4515-b1c7-3da8f4402858

* The Session type now has `executor: string | null`, which is correct. All changes are complete:

1. **Migration**: `executor TEXT` (no `NOT NULL`)
2. **Rust model**: `executor: Option<String>`
3. **sqlx queries**: Using just `executor` without type annotation (sqlx infers nullable correctly)
4. **All usages**: Wrapped string values in `Some()` where executor is set
5. **TypeScript types**: `executor: string | null`

Everything builds and lints successfully.
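
For example (a minimal sketch; the fallback label is an assumption), TypeScript consumers now have to handle the `null` case explicitly:

```ts
import type { Session } from 'shared/types';

// Session.executor is `string | null`, so downstream code needs a fallback.
function executorLabel(session: Session | undefined): string {
  return session?.executor ?? 'unknown';
}
```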

Done. Removed `Session::parent_workspace` and its unused `Workspace` import.

* Done. The migration has been updated with the sqlx workaround:

1. Added `COMMIT TRANSACTION;` after `PRAGMA foreign_keys = OFF;` to end sqlx's auto-transaction
2. Added `BEGIN TRANSACTION;` to start our own controlled transaction
3. Added `PRAGMA foreign_key_check;` before committing to verify FK integrity
4. Added `COMMIT;` to commit our transaction
5. Added final `BEGIN TRANSACTION;` for sqlx to close gracefully

This ensures the `PRAGMA foreign_keys = OFF` actually takes effect, preventing potential cascade deletes of `execution_process_logs` when `DROP TABLE execution_processes` runs on existing databases with data.

Fixed. The migration now has the correct order:

1. `COMMIT TRANSACTION;` - ends sqlx's auto-transaction
2. `PRAGMA foreign_keys = OFF;` - now takes effect (outside transaction)
3. `BEGIN TRANSACTION;` - starts our controlled transaction

This ensures foreign keys are actually disabled before the table operations, preventing potential cascade deletes of `execution_process_logs` when `DROP TABLE execution_processes` runs.

Done. The migration now only disables foreign keys for the section that needs it (steps 7-8, the table rebuilds with DROP TABLE). Steps 1-6 run in sqlx's normal auto-transaction with FK enforcement on.

Fix slow statement (vibe-kanban b695d0ad)

I think we are missing an index somewhere; I'm seeing these logs:
2025-12-17T15:23:25.297852Z  WARN sqlx::query: slow statement: execution time exceeded alert threshold rows_affected=0 rows_returned=202 elapsed=1.281210542s slow_threshold=1s
2025-12-17T15:23:25.350788Z  WARN sqlx::query: slow statement: execution time exceeded alert threshold rows_affected=0 rows_returned=202 elapsed=1.333812833s slow_threshold=1s

Both warnings are for the same statement:

SELECT
  t.id                  AS "id!: Uuid",
  t.project_id          AS "project_id!: Uuid",
  t.title,
  t.description,
  t.status              AS "status!: TaskStatus",
  t.parent_workspace_id AS "parent_workspace_id: Uuid",
  t.shared_task_id      AS "shared_task_id: Uuid",
  t.created_at          AS "created_at!: DateTime<Utc>",
  t.updated_at          AS "updated_at!: DateTime<Utc>",

  CASE WHEN EXISTS (
    SELECT 1
      FROM workspaces w
      JOIN sessions s ON s.workspace_id = w.id
      JOIN execution_processes ep ON ep.session_id = s.id
     WHERE w.task_id = t.id
       AND ep.status = 'running'
       AND ep.run_reason IN ('setupscript','cleanupscript','codingagent')
     LIMIT 1
  ) THEN 1 ELSE 0 END   AS "has_in_progress_attempt!: i64",

  CASE WHEN (
    SELECT ep.status
      FROM workspaces w
      JOIN sessions s ON s.workspace_id = w.id
      JOIN execution_processes ep ON ep.session_id = s.id
     WHERE w.task_id = t.id
       AND ep.run_reason IN ('setupscript','cleanupscript','codingagent')
     ORDER BY ep.created_at DESC
     LIMIT 1
  ) IN ('failed','killed') THEN 1 ELSE 0 END
                        AS "last_attempt_failed!: i64",

  ( SELECT s.executor
      FROM workspaces w
      JOIN sessions s ON s.workspace_id = w.id
     WHERE w.task_id = t.id
     ORDER BY s.created_at DESC
     LIMIT 1
  )                     AS "executor!: String"

FROM tasks t
WHERE t.project_id = $1
ORDER BY t.created_at DESC

2025-12-17T15:23:25.401326Z  WARN sqlx::query: slow statement: execution time exceeded alert threshold rows_affected=1 rows_returned=0 elapsed=1.383690208s slow_threshold=1s

INSERT INTO execution_processes (
    id, session_id, run_reason, executor_action,
    status, exit_code, started_at, completed_at, created_at, updated_at
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)

* Address feedback (vibe-kanban 81d8dbfa)

A PR opened by your colleague (https://github.com/BloopAI/vibe-kanban/pull/1569) got some feedback; let's address it.

```gh-comment
{
  "id": "2627479232",
  "comment_type": "review",
  "author": "ggordonhall",
  "body": "```suggestion\r\n-- 3. Migrate data: create one session per workspace\r\nINSERT INTO sessions (id, workspace_id, executor, created_at, updated_at)\r\nSELECT gen_random_uuid(), id, executor, created_at, updated_at FROM workspaces;\r\n```\r\n",
  "created_at": "2025-12-17T15:17:50Z",
  "url": "https://github.com/BloopAI/vibe-kanban/pull/1569#discussion_r2627479232",
  "path": "crates/db/migrations/20251216142123_refactor_task_attempts_to_workspaces_sessions.sql",
  "line": 26,
  "diff_hunk": "@@ -0,0 +1,121 @@\n+-- Refactor task_attempts into workspaces and sessions\n+-- - Rename task_attempts -> workspaces (keeps workspace-related fields)\n+-- - Create sessions table (executor moves here)\n+-- - Update execution_processes.task_attempt_id -> session_id\n+-- - Rename executor_sessions -> coding_agent_turns (drop redundant task_attempt_id)\n+-- - Rename merges.task_attempt_id -> workspace_id\n+-- - Rename tasks.parent_task_attempt -> parent_workspace_id\n+\n+-- 1. Rename task_attempts to workspaces (FK refs auto-update in schema)\n+ALTER TABLE task_attempts RENAME TO workspaces;\n+\n+-- 2. Create sessions table\n+CREATE TABLE sessions (\n+    id              BLOB PRIMARY KEY,\n+    workspace_id    BLOB NOT NULL,\n+    executor        TEXT,\n+    created_at      TEXT NOT NULL DEFAULT (datetime('now', 'subsec')),\n+    updated_at      TEXT NOT NULL DEFAULT (datetime('now', 'subsec')),\n+    FOREIGN KEY (workspace_id) REFERENCES workspaces(id) ON DELETE CASCADE\n+);\n+\n+CREATE INDEX idx_sessions_workspace_id ON sessions(workspace_id);\n+\n+-- 3. Migrate data: create one session per workspace (using workspace.id as session.id for simplicity)\n+INSERT INTO sessions (id, workspace_id, executor, created_at, updated_at)\n+SELECT id, id, executor, created_at, updated_at FROM workspaces;"
}
```

```gh-comment
{
  "id": "2627515578",
  "comment_type": "review",
  "author": "ggordonhall",
  "body": "Why not rename `attempt_repos` to `workspace_repos` here now that `attempt` is a legacy concept?",
  "created_at": "2025-12-17T15:27:21Z",
  "url": "https://github.com/BloopAI/vibe-kanban/pull/1569#discussion_r2627515578",
  "path": "crates/db/migrations/20251216142123_refactor_task_attempts_to_workspaces_sessions.sql",
  "line": 118,
  "diff_hunk": "@@ -0,0 +1,121 @@\n+-- Refactor task_attempts into workspaces and sessions\n+-- - Rename task_attempts -> workspaces (keeps workspace-related fields)\n+-- - Create sessions table (executor moves here)\n+-- - Update execution_processes.task_attempt_id -> session_id\n+-- - Rename executor_sessions -> coding_agent_turns (drop redundant task_attempt_id)\n+-- - Rename merges.task_attempt_id -> workspace_id\n+-- - Rename tasks.parent_task_attempt -> parent_workspace_id\n+\n+-- 1. Rename task_attempts to workspaces (FK refs auto-update in schema)\n+ALTER TABLE task_attempts RENAME TO workspaces;\n+\n+-- 2. Create sessions table\n+CREATE TABLE sessions (\n+    id              BLOB PRIMARY KEY,\n+    workspace_id    BLOB NOT NULL,\n+    executor        TEXT,\n+    created_at      TEXT NOT NULL DEFAULT (datetime('now', 'subsec')),\n+    updated_at      TEXT NOT NULL DEFAULT (datetime('now', 'subsec')),\n+    FOREIGN KEY (workspace_id) REFERENCES workspaces(id) ON DELETE CASCADE\n+);\n+\n+CREATE INDEX idx_sessions_workspace_id ON sessions(workspace_id);\n+\n+-- 3. Migrate data: create one session per workspace (using workspace.id as session.id for simplicity)\n+INSERT INTO sessions (id, workspace_id, executor, created_at, updated_at)\n+SELECT id, id, executor, created_at, updated_at FROM workspaces;\n+\n+-- 4. Drop executor column from workspaces\n+ALTER TABLE workspaces DROP COLUMN executor;\n+\n+-- 5. Rename merges.task_attempt_id to workspace_id\n+DROP INDEX idx_merges_task_attempt_id;\n+DROP INDEX idx_merges_open_pr;\n+ALTER TABLE merges RENAME COLUMN task_attempt_id TO workspace_id;\n+CREATE INDEX idx_merges_workspace_id ON merges(workspace_id);\n+CREATE INDEX idx_merges_open_pr ON merges(workspace_id, pr_status)\n+WHERE merge_type = 'pr' AND pr_status = 'open';\n+\n+-- 6. Rename tasks.parent_task_attempt to parent_workspace_id\n+DROP INDEX IF EXISTS idx_tasks_parent_task_attempt;\n+ALTER TABLE tasks RENAME COLUMN parent_task_attempt TO parent_workspace_id;\n+CREATE INDEX idx_tasks_parent_workspace_id ON tasks(parent_workspace_id);\n+\n+-- Steps 7-8 need FK disabled to avoid cascade deletes during DROP TABLE\n+-- sqlx workaround: end auto-transaction to allow PRAGMA to take effect\n+-- https://github.com/launchbadge/sqlx/issues/2085#issuecomment-1499859906\n+COMMIT;\n+\n+PRAGMA foreign_keys = OFF;\n+\n+BEGIN TRANSACTION;\n+\n+-- 7. 
Update execution_processes to reference session_id instead of task_attempt_id\n+-- (needs rebuild because FK target changes from workspaces to sessions)\n+DROP INDEX IF EXISTS idx_execution_processes_task_attempt_created_at;\n+DROP INDEX IF EXISTS idx_execution_processes_task_attempt_type_created;\n+\n+CREATE TABLE execution_processes_new (\n+    id              BLOB PRIMARY KEY,\n+    session_id      BLOB NOT NULL,\n+    run_reason      TEXT NOT NULL DEFAULT 'setupscript'\n+                       CHECK (run_reason IN ('setupscript','codingagent','devserver','cleanupscript')),\n+    executor_action TEXT NOT NULL DEFAULT '{}',\n+    status          TEXT NOT NULL DEFAULT 'running'\n+                       CHECK (status IN ('running','completed','failed','killed')),\n+    exit_code       INTEGER,\n+    dropped         INTEGER NOT NULL DEFAULT 0,\n+    started_at      TEXT NOT NULL DEFAULT (datetime('now', 'subsec')),\n+    completed_at    TEXT,\n+    created_at      TEXT NOT NULL DEFAULT (datetime('now', 'subsec')),\n+    updated_at      TEXT NOT NULL DEFAULT (datetime('now', 'subsec')),\n+    FOREIGN KEY (session_id) REFERENCES sessions(id) ON DELETE CASCADE\n+);\n+\n+-- Since we used workspace.id as session.id, the task_attempt_id values map directly\n+INSERT INTO execution_processes_new (id, session_id, run_reason, executor_action, status, exit_code, dropped, started_at, completed_at, created_at, updated_at)\n+SELECT id, task_attempt_id, run_reason, executor_action, status, exit_code, dropped, started_at, completed_at, created_at, updated_at\n+FROM execution_processes;\n+\n+DROP TABLE execution_processes;\n+ALTER TABLE execution_processes_new RENAME TO execution_processes;\n+\n+-- Recreate execution_processes indexes\n+CREATE INDEX idx_execution_processes_session_id ON execution_processes(session_id);\n+CREATE INDEX idx_execution_processes_status ON execution_processes(status);\n+CREATE INDEX idx_execution_processes_run_reason ON execution_processes(run_reason);\n+\n+-- 8. Rename executor_sessions to coding_agent_turns and drop task_attempt_id\n+-- (needs rebuild to drop the redundant task_attempt_id column)\n+CREATE TABLE coding_agent_turns (\n+    id                    BLOB PRIMARY KEY,\n+    execution_process_id  BLOB NOT NULL,\n+    session_id            TEXT,\n+    prompt                TEXT,\n+    summary               TEXT,\n+    created_at            TEXT NOT NULL DEFAULT (datetime('now', 'subsec')),\n+    updated_at            TEXT NOT NULL DEFAULT (datetime('now', 'subsec')),\n+    FOREIGN KEY (execution_process_id) REFERENCES execution_processes(id) ON DELETE CASCADE\n+);\n+\n+INSERT INTO coding_agent_turns (id, execution_process_id, session_id, prompt, summary, created_at, updated_at)\n+SELECT id, execution_process_id, session_id, prompt, summary, created_at, updated_at\n+FROM executor_sessions;\n+\n+DROP TABLE executor_sessions;\n+\n+-- Recreate coding_agent_turns indexes\n+CREATE INDEX idx_coding_agent_turns_execution_process_id ON coding_agent_turns(execution_process_id);\n+CREATE INDEX idx_coding_agent_turns_session_id ON coding_agent_turns(session_id);\n+\n+-- 9. attempt_repos: no changes needed - FK auto-updated when task_attempts renamed to workspaces"
}
```

```gh-comment
{
  "id": "2627694792",
  "comment_type": "review",
  "author": "ggordonhall",
  "body": "Maybe there's a better name than `external_session_id` here? `agent_session_id`? ",
  "created_at": "2025-12-17T16:16:24Z",
  "url": "https://github.com/BloopAI/vibe-kanban/pull/1569#discussion_r2627694792",
  "path": "crates/db/src/models/execution_process.rs",
  "line": 685,
  "diff_hunk": "@@ -618,4 +680,34 @@ impl ExecutionProcess {\n             )),\n         }\n     }\n+\n+    /// Find latest coding_agent_turn session_id by workspace (across all sessions)\n+    pub async fn find_latest_external_session_id_by_workspace("
}
```

```gh-comment
{
  "id": "2627707446",
  "comment_type": "review",
  "author": "ggordonhall",
  "body": "```suggestion\r\n    pub async fn cleanup_workspace(db: &DBService, workspace: &Workspace) {\r\n```",
  "created_at": "2025-12-17T16:19:31Z",
  "url": "https://github.com/BloopAI/vibe-kanban/pull/1569#discussion_r2627707446",
  "path": "crates/local-deployment/src/container.rs",
  "line": 146,
  "diff_hunk": "@@ -142,20 +143,20 @@ impl LocalContainerService {\n         map.remove(id)\n     }\n \n-    pub async fn cleanup_attempt_workspace(db: &DBService, attempt: &TaskAttempt) {\n-        let Some(container_ref) = &attempt.container_ref else {\n+    pub async fn cleanup_workspace_container(db: &DBService, workspace: &Workspace) {"
}
```

```gh-comment
{
  "id": "2627756192",
  "comment_type": "review",
  "author": "ggordonhall",
  "body": "Update `mcp` nomenclature",
  "created_at": "2025-12-17T16:31:49Z",
  "url": "https://github.com/BloopAI/vibe-kanban/pull/1569#discussion_r2627756192",
  "path": "crates/server/src/mcp/task_server.rs",
  "line": 352,
  "diff_hunk": "@@ -350,10 +349,9 @@ impl TaskServer {\n             project_id: ctx.project.id,\n             task_id: ctx.task.id,\n             task_title: ctx.task.title,\n-            attempt_id: ctx.task_attempt.id,\n-            attempt_branch: ctx.task_attempt.branch,\n+            attempt_id: ctx.workspace.id,"
}
```

```gh-comment
{
  "id": "2628161769",
  "comment_type": "review",
  "author": "ggordonhall",
  "body": "update, and similar in other events",
  "created_at": "2025-12-17T18:27:47Z",
  "url": "https://github.com/BloopAI/vibe-kanban/pull/1569#discussion_r2628161769",
  "path": "crates/server/src/routes/task_attempts.rs",
  "line": 1335,
  "diff_hunk": "@@ -1295,7 +1332,7 @@ pub async fn start_dev_server(\n             serde_json::json!({\n                 \"task_id\": task.id.to_string(),\n                 \"project_id\": project.id.to_string(),\n-                \"attempt_id\": task_attempt.id.to_string(),\n+                \"attempt_id\": workspace.id.to_string(),"
}
```

```gh-comment
{
  "id": "2628194289",
  "comment_type": "review",
  "author": "ggordonhall",
  "body": "Ugly, but we should rename this struct to avoid confusion with the more general concept of a workspace. Ideas...\r\n\r\n- `WorktreeContainer`\r\n...\r\n...\r\n\r\nChatGPT?",
  "created_at": "2025-12-17T18:36:30Z",
  "url": "https://github.com/BloopAI/vibe-kanban/pull/1569#discussion_r2628194289",
  "path": "crates/services/src/services/workspace_manager.rs",
  "line": 3,
  "diff_hunk": "@@ -1,6 +1,6 @@\n use std::path::{Path, PathBuf};\n \n-use db::models::{repo::Repo, task_attempt::TaskAttempt};\n+use db::models::{repo::Repo, workspace::Workspace as DbWorkspace};"
}
```

```gh-comment
{
  "id": "2628198036",
  "comment_type": "review",
  "author": "ggordonhall",
  "body": "We could add a BE route for this, and similar hooks where we're aggregating this information on the fly",
  "created_at": "2025-12-17T18:37:46Z",
  "url": "https://github.com/BloopAI/vibe-kanban/pull/1569#discussion_r2628198036",
  "path": "frontend/src/hooks/useTaskAttempts.ts",
  "line": 43,
  "diff_hunk": "@@ -16,10 +20,36 @@ export function useTaskAttempts(taskId?: string, opts?: Options) {\n   const enabled = (opts?.enabled ?? true) && !!taskId;\n   const refetchInterval = opts?.refetchInterval ?? 5000;\n \n-  return useQuery<TaskAttempt[]>({\n+  return useQuery<Workspace[]>({\n     queryKey: taskAttemptKeys.byTask(taskId),\n     queryFn: () => attemptsApi.getAll(taskId!),\n     enabled,\n     refetchInterval,\n   });\n }\n+\n+/**\n+ * Hook for components that need executor field for all attempts.\n+ * Fetches all attempts and their sessions in parallel.\n+ */\n+export function useTaskAttemptsWithSessions(taskId?: string, opts?: Options) {\n+  const enabled = (opts?.enabled ?? true) && !!taskId;\n+  const refetchInterval = opts?.refetchInterval ?? 5000;\n+\n+  return useQuery<WorkspaceWithSession[]>({\n+    queryKey: taskAttemptKeys.byTaskWithSessions(taskId),\n+    queryFn: async () => {\n+      const attempts = await attemptsApi.getAll(taskId!);\n+      // Fetch sessions for all attempts in parallel"
}
```

2025-12-18 14:45:10 +00:00


// This file was generated by `crates/core/src/bin/generate_types.rs`.
// Do not edit this file manually.
// If you are an AI, and you absolutely have to edit this file, please confirm with the user first.
export type SharedTaskResponse = { task: SharedTask, user: UserData | null, };
export type AssigneesQuery = { project_id: string, };
export type SharedTask = { id: string, organization_id: string, project_id: string, creator_user_id: string | null, assignee_user_id: string | null, deleted_by_user_id: string | null, title: string, description: string | null, status: TaskStatus, deleted_at: string | null, shared_at: string | null, created_at: string, updated_at: string, };
export type UserData = { user_id: string, first_name: string | null, last_name: string | null, username: string | null, };
export type Project = { id: string, name: string, dev_script: string | null, dev_script_working_dir: string | null, remote_project_id: string | null, created_at: Date, updated_at: Date, };
export type CreateProject = { name: string, repositories: Array<CreateProjectRepo>, };
export type UpdateProject = { name: string | null, dev_script: string | null, dev_script_working_dir: string | null, };
export type SearchResult = { path: string, is_file: boolean, match_type: SearchMatchType, };
export type SearchMatchType = "FileName" | "DirectoryName" | "FullPath";
export type Repo = { id: string, path: string, name: string, display_name: string, created_at: Date, updated_at: Date, };
export type ProjectRepo = { id: string, project_id: string, repo_id: string, setup_script: string | null, cleanup_script: string | null, copy_files: string | null, parallel_setup_script: boolean, };
export type CreateProjectRepo = { display_name: string, git_repo_path: string, };
export type UpdateProjectRepo = { setup_script: string | null, cleanup_script: string | null, copy_files: string | null, parallel_setup_script: boolean | null, };
export type WorkspaceRepo = { id: string, workspace_id: string, repo_id: string, target_branch: string, created_at: Date, updated_at: Date, };
export type CreateWorkspaceRepo = { repo_id: string, target_branch: string, };
export type RepoWithTargetBranch = { target_branch: string, id: string, path: string, name: string, display_name: string, created_at: Date, updated_at: Date, };
export type Tag = { id: string, tag_name: string, content: string, created_at: string, updated_at: string, };
export type CreateTag = { tag_name: string, content: string, };
export type UpdateTag = { tag_name: string | null, content: string | null, };
export type TaskStatus = "todo" | "inprogress" | "inreview" | "done" | "cancelled";
export type Task = { id: string, project_id: string, title: string, description: string | null, status: TaskStatus, parent_workspace_id: string | null, shared_task_id: string | null, created_at: string, updated_at: string, };
export type TaskWithAttemptStatus = { has_in_progress_attempt: boolean, last_attempt_failed: boolean, executor: string, id: string, project_id: string, title: string, description: string | null, status: TaskStatus, parent_workspace_id: string | null, shared_task_id: string | null, created_at: string, updated_at: string, };
export type TaskRelationships = { parent_task: Task | null, current_workspace: Workspace, children: Array<Task>, };
export type CreateTask = { project_id: string, title: string, description: string | null, status: TaskStatus | null, parent_workspace_id: string | null, image_ids: Array<string> | null, shared_task_id: string | null, };
export type UpdateTask = { title: string | null, description: string | null, status: TaskStatus | null, parent_workspace_id: string | null, image_ids: Array<string> | null, };
export type DraftFollowUpData = { message: string, variant: string | null, };
export type ScratchPayload = { "type": "DRAFT_TASK", "data": string } | { "type": "DRAFT_FOLLOW_UP", "data": DraftFollowUpData };
export enum ScratchType { DRAFT_TASK = "DRAFT_TASK", DRAFT_FOLLOW_UP = "DRAFT_FOLLOW_UP" }
export type Scratch = { id: string, payload: ScratchPayload, created_at: string, updated_at: string, };
export type CreateScratch = { payload: ScratchPayload, };
export type UpdateScratch = { payload: ScratchPayload, };
export type Image = { id: string, file_path: string, original_name: string, mime_type: string | null, size_bytes: bigint, hash: string, created_at: string, updated_at: string, };
export type CreateImage = { file_path: string, original_name: string, mime_type: string | null, size_bytes: bigint, hash: string, };
export type Workspace = { id: string, task_id: string, container_ref: string | null, branch: string, setup_completed_at: string | null, created_at: string, updated_at: string, };
export type Session = { id: string, workspace_id: string, executor: string | null, created_at: string, updated_at: string, };
export type ExecutionProcess = { id: string, session_id: string, run_reason: ExecutionProcessRunReason, executor_action: ExecutorAction, status: ExecutionProcessStatus, exit_code: bigint | null,
/**
* dropped: true if this process is excluded from the current
* history view (due to restore/trimming). Hidden from logs/timeline;
* still listed in the Processes tab.
*/
dropped: boolean, started_at: string, completed_at: string | null, created_at: string, updated_at: string, };
export enum ExecutionProcessStatus { running = "running", completed = "completed", failed = "failed", killed = "killed" }
export type ExecutionProcessRunReason = "setupscript" | "cleanupscript" | "codingagent" | "devserver";
export type ExecutionProcessRepoState = { id: string, execution_process_id: string, repo_id: string, before_head_commit: string | null, after_head_commit: string | null, merge_commit: string | null, created_at: Date, updated_at: Date, };
export type Merge = { "type": "direct" } & DirectMerge | { "type": "pr" } & PrMerge;
export type DirectMerge = { id: string, workspace_id: string, repo_id: string, merge_commit: string, target_branch_name: string, created_at: string, };
export type PrMerge = { id: string, workspace_id: string, repo_id: string, created_at: string, target_branch_name: string, pr_info: PullRequestInfo, };
export type MergeStatus = "open" | "merged" | "closed" | "unknown";
export type PullRequestInfo = { number: bigint, url: string, status: MergeStatus, merged_at: string | null, merge_commit_sha: string | null, };
export type ApprovalStatus = { "status": "pending" } | { "status": "approved" } | { "status": "denied", reason?: string, } | { "status": "timed_out" };
export type CreateApprovalRequest = { tool_name: string, tool_input: JsonValue, tool_call_id: string, };
export type ApprovalResponse = { execution_process_id: string, status: ApprovalStatus, };
export type Diff = { change: DiffChangeKind, oldPath: string | null, newPath: string | null, oldContent: string | null, newContent: string | null,
/**
* True when file contents are intentionally omitted (e.g., too large)
*/
contentOmitted: boolean,
/**
* Optional precomputed stats for omitted content
*/
additions: number | null, deletions: number | null, };
export type DiffChangeKind = "added" | "deleted" | "modified" | "renamed" | "copied" | "permissionChange";
export type ApiResponse<T, E = T> = { success: boolean, data: T | null, error_data: E | null, message: string | null, };
export type LoginStatus = { "status": "loggedout" } | { "status": "loggedin", profile: ProfileResponse, };
export type ProfileResponse = { user_id: string, username: string | null, email: string, providers: Array<ProviderProfile>, };
export type ProviderProfile = { provider: string, username: string | null, display_name: string | null, email: string | null, avatar_url: string | null, };
export type StatusResponse = { logged_in: boolean, profile: ProfileResponse | null, degraded: boolean | null, };
export enum MemberRole { ADMIN = "ADMIN", MEMBER = "MEMBER" }
export enum InvitationStatus { PENDING = "PENDING", ACCEPTED = "ACCEPTED", DECLINED = "DECLINED", EXPIRED = "EXPIRED" }
export type Organization = { id: string, name: string, slug: string, is_personal: boolean, created_at: string, updated_at: string, };
export type OrganizationWithRole = { id: string, name: string, slug: string, is_personal: boolean, created_at: string, updated_at: string, user_role: MemberRole, };
export type ListOrganizationsResponse = { organizations: Array<OrganizationWithRole>, };
export type GetOrganizationResponse = { organization: Organization, user_role: string, };
export type CreateOrganizationRequest = { name: string, slug: string, };
export type CreateOrganizationResponse = { organization: OrganizationWithRole, };
export type UpdateOrganizationRequest = { name: string, };
export type Invitation = { id: string, organization_id: string, invited_by_user_id: string | null, email: string, role: MemberRole, status: InvitationStatus, token: string, created_at: string, expires_at: string, };
export type CreateInvitationRequest = { email: string, role: MemberRole, };
export type CreateInvitationResponse = { invitation: Invitation, };
export type ListInvitationsResponse = { invitations: Array<Invitation>, };
export type GetInvitationResponse = { id: string, organization_slug: string, role: MemberRole, expires_at: string, };
export type AcceptInvitationResponse = { organization_id: string, organization_slug: string, role: MemberRole, };
export type RevokeInvitationRequest = { invitation_id: string, };
export type OrganizationMember = { user_id: string, role: MemberRole, joined_at: string, };
export type OrganizationMemberWithProfile = { user_id: string, role: MemberRole, joined_at: string, first_name: string | null, last_name: string | null, username: string | null, email: string | null, avatar_url: string | null, };
export type ListMembersResponse = { members: Array<OrganizationMemberWithProfile>, };
export type UpdateMemberRoleRequest = { role: MemberRole, };
export type UpdateMemberRoleResponse = { user_id: string, role: MemberRole, };
export type RemoteProject = { id: string, organization_id: string, name: string, metadata: Record<string, unknown>, created_at: string, };
export type ListProjectsResponse = { projects: Array<RemoteProject>, };
export type RemoteProjectMembersResponse = { organization_id: string, members: Array<OrganizationMemberWithProfile>, };
export type CreateRemoteProjectRequest = { organization_id: string, name: string, };
export type LinkToExistingRequest = { remote_project_id: string, };
export type RegisterRepoRequest = { path: string, display_name: string | null, };
export type InitRepoRequest = { parent_path: string, folder_name: string, };
export type TagSearchParams = { search: string | null, };
export type TokenResponse = { access_token: string, expires_at: string | null, };
export type UserSystemInfo = { config: Config, analytics_user_id: string, login_status: LoginStatus, environment: Environment,
/**
* Capabilities supported per executor (e.g., { "CLAUDE_CODE": ["SESSION_FORK"] })
*/
capabilities: { [key in string]?: Array<BaseAgentCapability> }, executors: { [key in BaseCodingAgent]?: ExecutorConfig }, };
export type Environment = { os_type: string, os_version: string, os_architecture: string, bitness: string, };
export type McpServerQuery = { executor: BaseCodingAgent, };
export type UpdateMcpServersBody = { servers: { [key in string]?: JsonValue }, };
export type GetMcpServerResponse = { mcp_config: McpConfig, config_path: string, };
export type CheckEditorAvailabilityQuery = { editor_type: EditorType, };
export type CheckEditorAvailabilityResponse = { available: boolean, };
export type CheckAgentAvailabilityQuery = { executor: BaseCodingAgent, };
export type CurrentUserResponse = { user_id: string, };
export type CreateFollowUpAttempt = { prompt: string, variant: string | null, retry_process_id: string | null, force_when_dirty: boolean | null, perform_git_reset: boolean | null, };
export type ChangeTargetBranchRequest = { repo_id: string, new_target_branch: string, };
export type ChangeTargetBranchResponse = { repo_id: string, new_target_branch: string, status: [number, number], };
export type MergeTaskAttemptRequest = { repo_id: string, };
export type PushTaskAttemptRequest = { repo_id: string, };
export type RenameBranchRequest = { new_branch_name: string, };
export type RenameBranchResponse = { branch: string, };
export type OpenEditorRequest = { editor_type: string | null, file_path: string | null, };
export type OpenEditorResponse = { url: string | null, };
export type AssignSharedTaskRequest = { new_assignee_user_id: string | null, };
export type ShareTaskResponse = { shared_task_id: string, };
export type CreateAndStartTaskRequest = { task: CreateTask, executor_profile_id: ExecutorProfileId, repos: Array<WorkspaceRepoInput>, };
export type CreateGitHubPrRequest = { title: string, body: string | null, target_branch: string | null, draft: boolean | null, repo_id: string, auto_generate_description: boolean, };
export type ImageResponse = { id: string, file_path: string, original_name: string, mime_type: string | null, size_bytes: bigint, hash: string, created_at: string, updated_at: string, };
export type ImageMetadata = { exists: boolean, file_name: string | null, path: string | null, size_bytes: bigint | null, format: string | null, proxy_url: string | null, };
export type CreateTaskAttemptBody = { task_id: string, executor_profile_id: ExecutorProfileId, repos: Array<WorkspaceRepoInput>, };
export type WorkspaceRepoInput = { repo_id: string, target_branch: string, };
export type RunAgentSetupRequest = { executor_profile_id: ExecutorProfileId, };
export type RunAgentSetupResponse = Record<string, never>;
export type GhCliSetupError = "BREW_MISSING" | "SETUP_HELPER_NOT_SUPPORTED" | { "OTHER": { message: string, } };
export type RebaseTaskAttemptRequest = { repo_id: string, old_base_branch: string | null, new_base_branch: string | null, };
export type AbortConflictsRequest = { repo_id: string, };
export type GitOperationError = { "type": "merge_conflicts", message: string, op: ConflictOp, } | { "type": "rebase_in_progress" };
export type PushError = { "type": "force_push_required" };
export type CreatePrError = { "type": "github_cli_not_installed" } | { "type": "github_cli_not_logged_in" } | { "type": "git_cli_not_logged_in" } | { "type": "git_cli_not_installed" } | { "type": "target_branch_not_found", branch: string, };
export type BranchStatus = { commits_behind: number | null, commits_ahead: number | null, has_uncommitted_changes: boolean | null, head_oid: string | null, uncommitted_count: number | null, untracked_count: number | null, target_branch_name: string, remote_commits_behind: number | null, remote_commits_ahead: number | null, merges: Array<Merge>,
/**
* True if a `git rebase` is currently in progress in this worktree
*/
is_rebase_in_progress: boolean,
/**
* Current conflict operation if any
*/
conflict_op: ConflictOp | null,
/**
* List of files currently in conflicted (unmerged) state
*/
conflicted_files: Array<string>, };
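// Illustrative only (hand-written example, not generated): a BranchStatus snapshot during a
// conflicted rebase. All values are placeholders; real responses may combine fields differently.
const exampleBranchStatus: BranchStatus = {
  commits_behind: 2,
  commits_ahead: 1,
  has_uncommitted_changes: true,
  head_oid: null,
  uncommitted_count: 1,
  untracked_count: 0,
  target_branch_name: "main",
  remote_commits_behind: null,
  remote_commits_ahead: null,
  merges: [],
  is_rebase_in_progress: true,
  conflict_op: "rebase",
  conflicted_files: ["src/lib.rs"],
};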
export type RunScriptError = { "type": "no_script_configured" } | { "type": "process_already_running" };
export type AttachPrResponse = { pr_attached: boolean, pr_url: string | null, pr_number: bigint | null, pr_status: MergeStatus | null, };
export type AttachExistingPrRequest = { repo_id: string, };
export type PrCommentsResponse = { comments: Array<UnifiedPrComment>, };
export type GetPrCommentsError = { "type": "no_pr_attached" } | { "type": "github_cli_not_installed" } | { "type": "github_cli_not_logged_in" };
export type GetPrCommentsQuery = { repo_id: string, };
export type UnifiedPrComment = { "comment_type": "general", id: string, author: string, author_association: string, body: string, created_at: string, url: string, } | { "comment_type": "review", id: bigint, author: string, author_association: string, body: string, created_at: string, url: string, path: string, line: bigint | null, diff_hunk: string, };
export type RepoBranchStatus = { repo_id: string, repo_name: string, commits_behind: number | null, commits_ahead: number | null, has_uncommitted_changes: boolean | null, head_oid: string | null, uncommitted_count: number | null, untracked_count: number | null, target_branch_name: string, remote_commits_behind: number | null, remote_commits_ahead: number | null, merges: Array<Merge>,
/**
* True if a `git rebase` is currently in progress in this worktree
*/
is_rebase_in_progress: boolean,
/**
* Current conflict operation if any
*/
conflict_op: ConflictOp | null,
/**
* List of files currently in conflicted (unmerged) state
*/
conflicted_files: Array<string>, };
export type DirectoryEntry = { name: string, path: string, is_directory: boolean, is_git_repo: boolean, last_modified: bigint | null, };
export type DirectoryListResponse = { entries: Array<DirectoryEntry>, current_path: string, };
export type Config = { config_version: string, theme: ThemeMode, executor_profile: ExecutorProfileId, disclaimer_acknowledged: boolean, onboarding_acknowledged: boolean, notifications: NotificationConfig, editor: EditorConfig, github: GitHubConfig, analytics_enabled: boolean, workspace_dir: string | null, last_app_version: string | null, show_release_notes: boolean, language: UiLanguage, git_branch_prefix: string, showcases: ShowcaseState, pr_auto_description_enabled: boolean, pr_auto_description_prompt: string | null, };
export type NotificationConfig = { sound_enabled: boolean, push_enabled: boolean, sound_file: SoundFile, };
export enum ThemeMode { LIGHT = "LIGHT", DARK = "DARK", SYSTEM = "SYSTEM" }
export type EditorConfig = { editor_type: EditorType, custom_command: string | null, remote_ssh_host: string | null, remote_ssh_user: string | null, };
export enum EditorType { VS_CODE = "VS_CODE", CURSOR = "CURSOR", WINDSURF = "WINDSURF", INTELLI_J = "INTELLI_J", ZED = "ZED", XCODE = "XCODE", CUSTOM = "CUSTOM" }
export type EditorOpenError = { "type": "executable_not_found", executable: string, editor_type: EditorType, } | { "type": "invalid_command", details: string, editor_type: EditorType, } | { "type": "launch_failed", executable: string, details: string, editor_type: EditorType, };
export type GitHubConfig = { pat: string | null, oauth_token: string | null, username: string | null, primary_email: string | null, default_pr_base: string | null, };
export enum SoundFile { ABSTRACT_SOUND1 = "ABSTRACT_SOUND1", ABSTRACT_SOUND2 = "ABSTRACT_SOUND2", ABSTRACT_SOUND3 = "ABSTRACT_SOUND3", ABSTRACT_SOUND4 = "ABSTRACT_SOUND4", COW_MOOING = "COW_MOOING", PHONE_VIBRATION = "PHONE_VIBRATION", ROOSTER = "ROOSTER" }
export type UiLanguage = "BROWSER" | "EN" | "JA" | "ES" | "KO" | "ZH_HANS";
export type ShowcaseState = { seen_features: Array<string>, };
export type GitBranch = { name: string, is_current: boolean, is_remote: boolean, last_commit_date: Date, };
export type SharedTaskDetails = { id: string, project_id: string, title: string, description: string | null, status: TaskStatus, };
export type QueuedMessage = {
/**
* The task attempt this message is queued for
*/
task_attempt_id: string,
/**
* The follow-up data (message + variant)
*/
data: DraftFollowUpData,
/**
* Timestamp when the message was queued
*/
queued_at: string, };
export type QueueStatus = { "status": "empty" } | { "status": "queued", message: QueuedMessage, };
export type ConflictOp = "rebase" | "merge" | "cherry_pick" | "revert";
export type ExecutorAction = { typ: ExecutorActionType, next_action: ExecutorAction | null, };
export type McpConfig = { servers: { [key in string]?: JsonValue }, servers_path: Array<string>, template: JsonValue, preconfigured: JsonValue, is_toml_config: boolean, };
export type ExecutorActionType = { "type": "CodingAgentInitialRequest" } & CodingAgentInitialRequest | { "type": "CodingAgentFollowUpRequest" } & CodingAgentFollowUpRequest | { "type": "ScriptRequest" } & ScriptRequest;
export type ScriptContext = "SetupScript" | "CleanupScript" | "DevServer" | "ToolInstallScript";
export type ScriptRequest = { script: string, language: ScriptRequestLanguage, context: ScriptContext,
/**
* Optional relative path to execute the script in (relative to container_ref).
* If None, uses the container_ref directory directly.
*/
working_dir: string | null, };
export type ScriptRequestLanguage = "Bash";
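// Sketch of a ScriptRequest (illustrative, not generated): a Bash setup script run directly in
// the container_ref directory, so working_dir is null. The script content is an assumption.
const exampleScriptRequest: ScriptRequest = {
  script: "pnpm install",
  language: "Bash",
  context: "SetupScript",
  working_dir: null,
};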
export enum BaseCodingAgent { CLAUDE_CODE = "CLAUDE_CODE", AMP = "AMP", GEMINI = "GEMINI", CODEX = "CODEX", OPENCODE = "OPENCODE", CURSOR_AGENT = "CURSOR_AGENT", QWEN_CODE = "QWEN_CODE", COPILOT = "COPILOT", DROID = "DROID" }
export type CodingAgent = { "CLAUDE_CODE": ClaudeCode } | { "AMP": Amp } | { "GEMINI": Gemini } | { "CODEX": Codex } | { "OPENCODE": Opencode } | { "CURSOR_AGENT": CursorAgent } | { "QWEN_CODE": QwenCode } | { "COPILOT": Copilot } | { "DROID": Droid };
export type AvailabilityInfo = { "type": "LOGIN_DETECTED", last_auth_timestamp: bigint, } | { "type": "INSTALLATION_FOUND" } | { "type": "NOT_FOUND" };
export type CommandBuilder = {
/**
* Base executable command (e.g., "npx -y @anthropic-ai/claude-code@latest")
*/
base: string,
/**
* Optional parameters to append to the base command
*/
params: Array<string> | null, };
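// Hypothetical CommandBuilder (not generated): reuses the base command from the doc comment
// above and appends one assumed flag; real parameter lists come from executor configuration.
const exampleCommandBuilder: CommandBuilder = {
  base: "npx -y @anthropic-ai/claude-code@latest",
  params: ["--dangerously-skip-permissions"],
};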
export type ExecutorProfileId = {
/**
* The executor type (e.g., "CLAUDE_CODE", "AMP")
*/
executor: BaseCodingAgent,
/**
* Optional variant name (e.g., "PLAN", "ROUTER")
*/
variant: string | null, };
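// Example ExecutorProfileId (illustrative, not generated), using values from the doc comments
// above: the CLAUDE_CODE executor with its "PLAN" variant selected.
const exampleExecutorProfileId: ExecutorProfileId = {
  executor: BaseCodingAgent.CLAUDE_CODE,
  variant: "PLAN",
};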
export type ExecutorConfig = { [key in string]?: { "CLAUDE_CODE": ClaudeCode } | { "AMP": Amp } | { "GEMINI": Gemini } | { "CODEX": Codex } | { "OPENCODE": Opencode } | { "CURSOR_AGENT": CursorAgent } | { "QWEN_CODE": QwenCode } | { "COPILOT": Copilot } | { "DROID": Droid } };
export type ExecutorConfigs = { executors: { [key in BaseCodingAgent]?: ExecutorConfig }, };
export enum BaseAgentCapability { SESSION_FORK = "SESSION_FORK", SETUP_HELPER = "SETUP_HELPER" }
export type ClaudeCode = { append_prompt: AppendPrompt, claude_code_router?: boolean | null, plan?: boolean | null, approvals?: boolean | null, model?: string | null, dangerously_skip_permissions?: boolean | null, disable_api_key?: boolean | null, base_command_override?: string | null, additional_params?: Array<string> | null, env?: { [key in string]?: string } | null, };
export type Gemini = { append_prompt: AppendPrompt, model?: string | null, yolo?: boolean | null, base_command_override?: string | null, additional_params?: Array<string> | null, env?: { [key in string]?: string } | null, };
export type Amp = { append_prompt: AppendPrompt, dangerously_allow_all?: boolean | null, base_command_override?: string | null, additional_params?: Array<string> | null, env?: { [key in string]?: string } | null, };
export type Codex = { append_prompt: AppendPrompt, sandbox?: SandboxMode | null, ask_for_approval?: AskForApproval | null, oss?: boolean | null, model?: string | null, model_reasoning_effort?: ReasoningEffort | null, model_reasoning_summary?: ReasoningSummary | null, model_reasoning_summary_format?: ReasoningSummaryFormat | null, profile?: string | null, base_instructions?: string | null, include_apply_patch_tool?: boolean | null, model_provider?: string | null, compact_prompt?: string | null, developer_instructions?: string | null, base_command_override?: string | null, additional_params?: Array<string> | null, env?: { [key in string]?: string } | null, };
export type SandboxMode = "auto" | "read-only" | "workspace-write" | "danger-full-access";
export type AskForApproval = "unless-trusted" | "on-failure" | "on-request" | "never";
export type ReasoningEffort = "low" | "medium" | "high" | "xhigh";
export type ReasoningSummary = "auto" | "concise" | "detailed" | "none";
export type ReasoningSummaryFormat = "none" | "experimental";
export type CursorAgent = { append_prompt: AppendPrompt, force?: boolean | null, model?: string | null, base_command_override?: string | null, additional_params?: Array<string> | null, env?: { [key in string]?: string } | null, };
export type Copilot = { append_prompt: AppendPrompt, model?: string | null, allow_all_tools?: boolean | null, allow_tool?: string | null, deny_tool?: string | null, add_dir?: Array<string> | null, disable_mcp_server?: Array<string> | null, base_command_override?: string | null, additional_params?: Array<string> | null, env?: { [key in string]?: string } | null, };
export type Opencode = { append_prompt: AppendPrompt, model?: string | null, mode?: string | null,
/**
* Auto-approve agent actions
*/
auto_approve: boolean, base_command_override?: string | null, additional_params?: Array<string> | null, env?: { [key in string]?: string } | null, };
export type QwenCode = { append_prompt: AppendPrompt, yolo?: boolean | null, base_command_override?: string | null, additional_params?: Array<string> | null, env?: { [key in string]?: string } | null, };
export type Droid = { append_prompt: AppendPrompt, autonomy: Autonomy, model?: string | null, reasoning_effort?: DroidReasoningEffort | null, base_command_override?: string | null, additional_params?: Array<string> | null, env?: { [key in string]?: string } | null, };
export type Autonomy = "normal" | "low" | "medium" | "high" | "skip-permissions-unsafe";
export type DroidReasoningEffort = "none" | "dynamic" | "off" | "low" | "medium" | "high";
export type AppendPrompt = string | null;
export type CodingAgentInitialRequest = { prompt: string,
/**
* Executor profile specification
*/
executor_profile_id: ExecutorProfileId, };
export type CodingAgentFollowUpRequest = { prompt: string, session_id: string,
/**
* Executor profile specification
*/
executor_profile_id: ExecutorProfileId, };
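// Sketch of a CodingAgentFollowUpRequest (illustrative, not generated): resumes an existing
// coding-agent session with a new prompt. The session_id is a placeholder UUID.
const exampleFollowUpRequest: CodingAgentFollowUpRequest = {
  prompt: "Please also update the unit tests.",
  session_id: "00000000-0000-0000-0000-000000000000",
  executor_profile_id: { executor: BaseCodingAgent.CLAUDE_CODE, variant: null },
};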
export type CommandExitStatus = { "type": "exit_code", code: number, } | { "type": "success", success: boolean, };
export type CommandRunResult = { exit_status: CommandExitStatus | null, output: string | null, };
export type NormalizedEntry = { timestamp: string | null, entry_type: NormalizedEntryType, content: string, };
export type NormalizedEntryType = { "type": "user_message" } | { "type": "user_feedback", denied_tool: string, } | { "type": "assistant_message" } | { "type": "tool_use", tool_name: string, action_type: ActionType, status: ToolStatus, } | { "type": "system_message" } | { "type": "error_message", error_type: NormalizedEntryError, } | { "type": "thinking" } | { "type": "loading" } | { "type": "next_action", failed: boolean, execution_processes: number, needs_setup: boolean, };
export type FileChange = { "action": "write", content: string, } | { "action": "delete" } | { "action": "rename", new_path: string, } | { "action": "edit",
/**
* Unified diff containing file header and hunks.
*/
unified_diff: string,
/**
 * Whether line numbers in the hunks are reliable.
*/
has_line_numbers: boolean, };
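// Hypothetical "edit" FileChange (not generated): carries a unified diff whose hunk headers
// include line numbers, so has_line_numbers is true. The path and diff content are placeholders.
const exampleEditChange: FileChange = {
  action: "edit",
  unified_diff:
    "--- a/src/main.rs\n+++ b/src/main.rs\n@@ -1,2 +1,2 @@\n-old line\n+new line\n unchanged line\n",
  has_line_numbers: true,
};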
export type ActionType = { "action": "file_read", path: string, } | { "action": "file_edit", path: string, changes: Array<FileChange>, } | { "action": "command_run", command: string, result: CommandRunResult | null, } | { "action": "search", query: string, } | { "action": "web_fetch", url: string, } | { "action": "tool", tool_name: string, arguments: JsonValue | null, result: ToolResult | null, } | { "action": "task_create", description: string, } | { "action": "plan_presentation", plan: string, } | { "action": "todo_management", todos: Array<TodoItem>, operation: string, } | { "action": "other", description: string, };
export type TodoItem = { content: string, status: string, priority: string | null, };
export type NormalizedEntryError = { "type": "setup_required" } | { "type": "other" };
export type ToolResult = { type: ToolResultValueType,
/**
* For Markdown, this will be a JSON string; for JSON, a structured value
*/
value: JsonValue, };
export type ToolResultValueType = { "type": "markdown" } | { "type": "json" };
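// Illustrative ToolResult values (not generated), following the comment above: a markdown result
// carries its text as a plain JSON string, while a json result carries structured data.
const exampleMarkdownToolResult: ToolResult = { type: { type: "markdown" }, value: "## Done" };
const exampleJsonToolResult: ToolResult = { type: { type: "json" }, value: { exit_code: 0 } };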
export type ToolStatus = { "status": "created" } | { "status": "success" } | { "status": "failed" } | { "status": "denied", reason: string | null, } | { "status": "pending_approval", approval_id: string, requested_at: string, timeout_at: string, } | { "status": "timed_out" };
export type PatchType = { "type": "NORMALIZED_ENTRY", "content": NormalizedEntry } | { "type": "STDOUT", "content": string } | { "type": "STDERR", "content": string } | { "type": "DIFF", "content": Diff };
export type JsonValue = number | string | boolean | Array<JsonValue> | { [key in string]?: JsonValue } | null;
export const DEFAULT_PR_DESCRIPTION_PROMPT = `Update the GitHub PR that was just created with a better title and description.
The PR number is #{pr_number} and the URL is {pr_url}.
Analyze the changes in this branch and write:
1. A concise, descriptive title that summarizes the changes, postfixed with "(Vibe Kanban)"
2. A detailed description that explains:
- What changes were made
- Why they were made (based on the task context)
- Any important implementation details
- At the end, include a note: "This PR was written using [Vibe Kanban](https://vibekanban.com)"
Use \`gh pr edit\` to update the PR.`;
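// Hypothetical substitution of the {pr_number} and {pr_url} placeholders in the template above;
// the repository URL and PR number here are invented purely for illustration.
const examplePrPrompt = DEFAULT_PR_DESCRIPTION_PROMPT
  .replace("{pr_number}", "1234")
  .replace("{pr_url}", "https://github.com/example/repo/pull/1234");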