Add agent working directory setting to run coding agents from repo directory (Vibe Kanban) (#1617)

* Done! The migration has been updated to copy `dev_script_working_dir` values to `agent_working_dir` for existing projects. The change:

**`vibe-kanban/crates/db/migrations/20251219000000_add_agent_working_dir_to_projects.sql`**:
```sql
-- Add column with empty default first
ALTER TABLE projects ADD COLUMN agent_working_dir TEXT DEFAULT '';

-- Copy existing dev_script_working_dir values to agent_working_dir
UPDATE projects SET agent_working_dir = dev_script_working_dir WHERE dev_script_working_dir IS NOT NULL AND dev_script_working_dir != '';
```

This ensures that when the migration runs, existing projects will have their `agent_working_dir` automatically populated from their existing `dev_script_working_dir` setting.

* Done! The migration has been updated to only copy `dev_script_working_dir` to `agent_working_dir` for single-repo projects. Multi-repo projects will keep `agent_working_dir` empty (None).

The key change in the migration:
```sql
UPDATE projects SET agent_working_dir = dev_script_working_dir
WHERE dev_script_working_dir IS NOT NULL
  AND dev_script_working_dir != ''
  AND (SELECT COUNT(*) FROM project_repos WHERE project_repos.project_id = projects.id) = 1;
```

The subquery `(SELECT COUNT(*) FROM project_repos WHERE project_repos.project_id = projects.id) = 1` ensures only projects with exactly one repository get the default value copied.

* Done! All tasks completed:

1. **Resolved the rebase conflict** in `task_attempts.rs` - removed the duplicate `follow_up` function since it was moved to `sessions/mod.rs` on main

2. **Updated `sessions/mod.rs`** with the `working_dir` changes - added the code to get `agent_working_dir` from project settings and pass it to both `CodingAgentFollowUpRequest` and `CodingAgentInitialRequest`

3. **Ran `npm run prepare-db`** - successfully prepared the database and updated sqlx cache

4. **Verified compilation** - all crates compile successfully

The migration now only copies `dev_script_working_dir` to `agent_working_dir` for single-repo projects (projects with exactly 1 repo in `project_repos` table). Multi-repo projects will keep `agent_working_dir` empty.

* Done. Removed all 6 comments as requested.

* Done! The implementation is now complete:

1. **Database migration** - Copies `dev_script_working_dir` to `agent_working_dir` for single-repo projects at migration time
2. **Runtime backfill** - `backfill_repo_names()` in `container.rs` sets `agent_working_dir` for existing single-repo projects where it's still empty
3. **Project creation** - New single-repo projects automatically get `agent_working_dir` set to the repo name

All three scenarios are now covered for setting `agent_working_dir` on single-repo projects.
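The rule shared by all three paths can be sketched as a pure function. This is an illustrative sketch only; the function name and signature are assumptions, not the actual code in `container.rs`:

```rust
/// Decide whether to backfill `agent_working_dir` for a project.
/// Sketch of the rule described above: backfill only when the field is
/// still empty, the project has exactly one repo, and a non-empty
/// `dev_script_working_dir` exists to copy from.
fn backfill_agent_working_dir(
    repo_count: usize,
    dev_script_working_dir: Option<&str>,
    current_agent_working_dir: Option<&str>,
) -> Option<String> {
    let still_empty = current_agent_working_dir.map_or(true, str::is_empty);
    match (still_empty, repo_count, dev_script_working_dir) {
        (true, 1, Some(dir)) if !dir.is_empty() => Some(dir.to_string()),
        _ => None, // multi-repo, already set, or nothing to copy
    }
}
```

The same predicate is what the migration expresses in SQL and what the runtime backfill and project-creation paths apply in Rust.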

* Done! The frontend UI for `agent_working_dir` has been added:

1. Added `agent_working_dir: string` to `ProjectFormState` interface
2. Added `agent_working_dir` to `projectToFormState` function
3. Added `agent_working_dir` to the `handleSave` payload
4. Added the input field after the dev server working directory input

* All tasks are complete. The implementation is finished:

**Summary of changes:**

1. **Project model**: Renamed `agent_working_dir` to `default_agent_working_dir` - this is now a default value for new workspaces

2. **Workspace model**: Added `agent_working_dir` field - this captures the project's default at workspace creation time

3. **Workspace creation**: Updated `create_task_attempt` to fetch the project's `default_agent_working_dir` and pass it to the workspace

4. **Execution sites**: Updated all 4 locations to use `workspace.agent_working_dir` instead of `project.default_agent_working_dir`:
   - `container.rs` (start_workspace)
   - `container.rs` (start_queued_follow_up)
   - `sessions/mod.rs` (follow_up)
   - `task_attempts/pr.rs` (trigger_pr_description_follow_up)

5. **Frontend**: Updated `ProjectSettings.tsx` to use `default_agent_working_dir`

6. **Migrations**: Two migrations added - one to rename the project column and one to add the workspace column
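The default-vs-snapshot split above can be sketched as follows. Types and names are illustrative stand-ins; the real models live in `crates/db` and carry many more fields:

```rust
// Hypothetical minimal models showing the snapshot semantics.
struct Project {
    default_agent_working_dir: Option<String>,
}

struct Workspace {
    agent_working_dir: Option<String>,
}

/// At workspace creation, copy the project's current default. The
/// workspace keeps this snapshot even if the project default is later
/// edited, so in-flight attempts are unaffected by settings changes.
fn create_workspace(project: &Project) -> Workspace {
    Workspace {
        agent_working_dir: project.default_agent_working_dir.clone(),
    }
}
```

This is why the execution sites read `workspace.agent_working_dir` rather than re-fetching the project setting at run time.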

* Cleanup script changes for workspace a87cad74-ed7c-402b-b3f3-413a939a13a9

* Done. The migrations have been squashed into a single migration file `20251219000000_add_agent_working_dir_to_projects.sql` which now handles both the project and workspace table changes. The separate workspace migration file has been deleted and `npm run prepare-db` completes successfully.

* Done. The i18n check passes:
- No new literal strings introduced
- No duplicate keys found in JSON files
- Translation keys are consistent across locales (en, es, ja, ko, zh-Hans)

* All checks pass:
- `npm run check` (frontend TypeScript and backend Rust)
- `npm run prepare-db` (migrations and sqlx query cache)

* `npm run lint` passes now.

* Done. I've added the logic to `crates/services/src/services/project.rs` in the `add_repository()` function. When a second repository is added to a project (repo count becomes exactly 2), the `default_agent_working_dir` field is automatically cleared by setting it to an empty string.

* Cleanup script changes for workspace a87cad74-ed7c-402b-b3f3-413a939a13a9

* Both checks pass. The fix now correctly:
1. Counts repos **before** adding the new one
2. Only clears `default_agent_working_dir` if the previous count was exactly 1 (meaning we're going from 1 to 2 repos)
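The ordering can be sketched in memory as below. Names are illustrative; the real service code counts rows in `project_repos` and clears the column via a dedicated update:

```rust
// Hypothetical in-memory stand-in for the add_repository() logic.
struct Project {
    repos: Vec<String>,
    default_agent_working_dir: Option<String>,
}

fn add_repository(project: &mut Project, repo: String) {
    // Count BEFORE inserting so the 1 -> 2 transition is detected from
    // the previous count.
    let count_before = project.repos.len();
    project.repos.push(repo);
    if count_before == 1 {
        // Going multi-repo: a single default working dir no longer
        // applies, so clear it (the real code writes '' to the column).
        project.default_agent_working_dir = Some(String::new());
    }
}
```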

* Done. The fix now:

1. Added a dedicated `Project::clear_default_agent_working_dir()` method in `crates/db/src/models/project.rs` that only updates the `default_agent_working_dir` field without touching other fields
2. Updated `add_repository()` in `crates/services/src/services/project.rs` to use this new method when the repo count goes from 1 to 2

This commit is contained in:
Alex Netsch
2025-12-19 17:58:01 +00:00
committed by GitHub
parent 165e832830
commit 2f5cef2dff
36 changed files with 478 additions and 152 deletions

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
"query": "SELECT id AS \"id!: Uuid\",\n task_id AS \"task_id!: Uuid\",\n container_ref,\n branch,\n setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n created_at AS \"created_at!: DateTime<Utc>\",\n updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM workspaces\n ORDER BY created_at DESC",
"query": "SELECT id AS \"id!: Uuid\",\n task_id AS \"task_id!: Uuid\",\n container_ref,\n branch,\n agent_working_dir,\n setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n created_at AS \"created_at!: DateTime<Utc>\",\n updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM workspaces\n ORDER BY created_at DESC",
"describe": {
"columns": [
{
@@ -24,18 +24,23 @@
"type_info": "Text"
},
{
"name": "setup_completed_at: DateTime<Utc>",
"name": "agent_working_dir",
"ordinal": 4,
"type_info": "Text"
},
{
"name": "setup_completed_at: DateTime<Utc>",
"ordinal": 5,
"type_info": "Datetime"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 5,
"ordinal": 6,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
"ordinal": 6,
"ordinal": 7,
"type_info": "Text"
}
],
@@ -48,9 +53,10 @@
true,
false,
true,
true,
false,
false
]
},
"hash": "b045caab5ba5e6744287ff350ad00ec93afc16aca48475c45292419722a617a6"
"hash": "0e51827057df2d9e78304623e44487109740a841bdab334bb8b43db60215ecb1"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
"query": "SELECT id as \"id!: Uuid\",\n name,\n dev_script,\n dev_script_working_dir,\n remote_project_id as \"remote_project_id: Uuid\",\n created_at as \"created_at!: DateTime<Utc>\",\n updated_at as \"updated_at!: DateTime<Utc>\"\n FROM projects\n WHERE remote_project_id = $1\n LIMIT 1",
"query": "SELECT id as \"id!: Uuid\",\n name,\n dev_script,\n dev_script_working_dir,\n default_agent_working_dir,\n remote_project_id as \"remote_project_id: Uuid\",\n created_at as \"created_at!: DateTime<Utc>\",\n updated_at as \"updated_at!: DateTime<Utc>\"\n FROM projects\n WHERE rowid = $1",
"describe": {
"columns": [
{
@@ -24,18 +24,23 @@
"type_info": "Text"
},
{
"name": "remote_project_id: Uuid",
"name": "default_agent_working_dir",
"ordinal": 4,
"type_info": "Text"
},
{
"name": "remote_project_id: Uuid",
"ordinal": 5,
"type_info": "Blob"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 5,
"ordinal": 6,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
"ordinal": 6,
"ordinal": 7,
"type_info": "Text"
}
],
@@ -48,9 +53,10 @@
true,
true,
true,
true,
false,
false
]
},
"hash": "a25e23ffeeeec845811692df45efafe76b2ca89a19c59b41492d17fd52169356"
"hash": "17789c934ca49a04a85505f1a9c861a575ee8fa2e8b6c9e9f4fc177c09caa53d"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
"query": "SELECT id AS \"id!: Uuid\",\n task_id AS \"task_id!: Uuid\",\n container_ref,\n branch,\n setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n created_at AS \"created_at!: DateTime<Utc>\",\n updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM workspaces\n WHERE id = $1",
"query": "SELECT id AS \"id!: Uuid\",\n task_id AS \"task_id!: Uuid\",\n container_ref,\n branch,\n agent_working_dir,\n setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n created_at AS \"created_at!: DateTime<Utc>\",\n updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM workspaces\n WHERE rowid = $1",
"describe": {
"columns": [
{
@@ -24,18 +24,23 @@
"type_info": "Text"
},
{
"name": "setup_completed_at: DateTime<Utc>",
"name": "agent_working_dir",
"ordinal": 4,
"type_info": "Text"
},
{
"name": "setup_completed_at: DateTime<Utc>",
"ordinal": 5,
"type_info": "Datetime"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 5,
"ordinal": 6,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
"ordinal": 6,
"ordinal": 7,
"type_info": "Text"
}
],
@@ -48,9 +53,10 @@
true,
false,
true,
true,
false,
false
]
},
"hash": "4e2fc922881272de086b1cc4b413d0babc2a9dcb548c6cca51262e88d9c76f0f"
"hash": "1d84ba9ca8016f7b15d84acb3442c3b84ff344f666c31bb29223b6aaa440091f"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
"query": "SELECT id as \"id!: Uuid\",\n name,\n dev_script,\n dev_script_working_dir,\n remote_project_id as \"remote_project_id: Uuid\",\n created_at as \"created_at!: DateTime<Utc>\",\n updated_at as \"updated_at!: DateTime<Utc>\"\n FROM projects\n WHERE rowid = $1",
"query": "SELECT id as \"id!: Uuid\",\n name,\n dev_script,\n dev_script_working_dir,\n default_agent_working_dir,\n remote_project_id as \"remote_project_id: Uuid\",\n created_at as \"created_at!: DateTime<Utc>\",\n updated_at as \"updated_at!: DateTime<Utc>\"\n FROM projects\n WHERE id = $1",
"describe": {
"columns": [
{
@@ -24,18 +24,23 @@
"type_info": "Text"
},
{
"name": "remote_project_id: Uuid",
"name": "default_agent_working_dir",
"ordinal": 4,
"type_info": "Text"
},
{
"name": "remote_project_id: Uuid",
"ordinal": 5,
"type_info": "Blob"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 5,
"ordinal": 6,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
"ordinal": 6,
"ordinal": 7,
"type_info": "Text"
}
],
@@ -48,9 +53,10 @@
true,
true,
true,
true,
false,
false
]
},
"hash": "d6218ce758d0ef58edc775de68a28f9be72d02217ef43f0b79494b63380ea9a8"
"hash": "1eae64d51ea1d81c7239fc3640bb15bb59a51333602e7050bec3f5e8fc7fc782"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
"query": "\n SELECT p.id as \"id!: Uuid\", p.name, p.dev_script, p.dev_script_working_dir,\n p.remote_project_id as \"remote_project_id: Uuid\",\n p.created_at as \"created_at!: DateTime<Utc>\", p.updated_at as \"updated_at!: DateTime<Utc>\"\n FROM projects p\n WHERE p.id IN (\n SELECT DISTINCT t.project_id\n FROM tasks t\n INNER JOIN workspaces w ON w.task_id = t.id\n ORDER BY w.updated_at DESC\n )\n LIMIT $1\n ",
"query": "\n SELECT p.id as \"id!: Uuid\", p.name, p.dev_script, p.dev_script_working_dir,\n p.default_agent_working_dir,\n p.remote_project_id as \"remote_project_id: Uuid\",\n p.created_at as \"created_at!: DateTime<Utc>\", p.updated_at as \"updated_at!: DateTime<Utc>\"\n FROM projects p\n WHERE p.id IN (\n SELECT DISTINCT t.project_id\n FROM tasks t\n INNER JOIN workspaces w ON w.task_id = t.id\n ORDER BY w.updated_at DESC\n )\n LIMIT $1\n ",
"describe": {
"columns": [
{
@@ -24,18 +24,23 @@
"type_info": "Text"
},
{
"name": "remote_project_id: Uuid",
"name": "default_agent_working_dir",
"ordinal": 4,
"type_info": "Text"
},
{
"name": "remote_project_id: Uuid",
"ordinal": 5,
"type_info": "Blob"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 5,
"ordinal": 6,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
"ordinal": 6,
"ordinal": 7,
"type_info": "Text"
}
],
@@ -48,9 +53,10 @@
true,
true,
true,
true,
false,
false
]
},
"hash": "eddc93385355d830abf8e98cb46d6648bb8830f023647be575cb7766609f9857"
"hash": "368d9be9608fea5002627625bf8abd3e9073e06bb72fc75f11c2af13f1d43a10"
}

@@ -1,56 +0,0 @@
{
"db_name": "SQLite",
"query": "\n SELECT\n w.id as \"id!: Uuid\",\n w.task_id as \"task_id!: Uuid\",\n w.container_ref,\n w.branch as \"branch!\",\n w.setup_completed_at as \"setup_completed_at: DateTime<Utc>\",\n w.created_at as \"created_at!: DateTime<Utc>\",\n w.updated_at as \"updated_at!: DateTime<Utc>\"\n FROM workspaces w\n LEFT JOIN sessions s ON w.id = s.workspace_id\n LEFT JOIN execution_processes ep ON s.id = ep.session_id AND ep.completed_at IS NOT NULL\n WHERE w.container_ref IS NOT NULL\n AND w.id NOT IN (\n SELECT DISTINCT s2.workspace_id\n FROM sessions s2\n JOIN execution_processes ep2 ON s2.id = ep2.session_id\n WHERE ep2.completed_at IS NULL\n )\n GROUP BY w.id, w.container_ref, w.updated_at\n HAVING datetime('now', '-72 hours') > datetime(\n MAX(\n CASE\n WHEN ep.completed_at IS NOT NULL THEN ep.completed_at\n ELSE w.updated_at\n END\n )\n )\n ORDER BY MAX(\n CASE\n WHEN ep.completed_at IS NOT NULL THEN ep.completed_at\n ELSE w.updated_at\n END\n ) ASC\n ",
"describe": {
"columns": [
{
"name": "id!: Uuid",
"ordinal": 0,
"type_info": "Blob"
},
{
"name": "task_id!: Uuid",
"ordinal": 1,
"type_info": "Blob"
},
{
"name": "container_ref",
"ordinal": 2,
"type_info": "Text"
},
{
"name": "branch!",
"ordinal": 3,
"type_info": "Text"
},
{
"name": "setup_completed_at: DateTime<Utc>",
"ordinal": 4,
"type_info": "Datetime"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 5,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
"ordinal": 6,
"type_info": "Text"
}
],
"parameters": {
"Right": 0
},
"nullable": [
true,
true,
true,
true,
true,
true,
true
]
},
"hash": "3960255f06f9753bc116d1aa1d65e0c5bf402d52130057fbaa2c9da7f0173e64"
}

@@ -0,0 +1,62 @@
{
"db_name": "SQLite",
"query": "\n SELECT\n w.id as \"id!: Uuid\",\n w.task_id as \"task_id!: Uuid\",\n w.container_ref,\n w.branch as \"branch!\",\n w.agent_working_dir,\n w.setup_completed_at as \"setup_completed_at: DateTime<Utc>\",\n w.created_at as \"created_at!: DateTime<Utc>\",\n w.updated_at as \"updated_at!: DateTime<Utc>\"\n FROM workspaces w\n LEFT JOIN sessions s ON w.id = s.workspace_id\n LEFT JOIN execution_processes ep ON s.id = ep.session_id AND ep.completed_at IS NOT NULL\n WHERE w.container_ref IS NOT NULL\n AND w.id NOT IN (\n SELECT DISTINCT s2.workspace_id\n FROM sessions s2\n JOIN execution_processes ep2 ON s2.id = ep2.session_id\n WHERE ep2.completed_at IS NULL\n )\n GROUP BY w.id, w.container_ref, w.updated_at\n HAVING datetime('now', '-72 hours') > datetime(\n MAX(\n CASE\n WHEN ep.completed_at IS NOT NULL THEN ep.completed_at\n ELSE w.updated_at\n END\n )\n )\n ORDER BY MAX(\n CASE\n WHEN ep.completed_at IS NOT NULL THEN ep.completed_at\n ELSE w.updated_at\n END\n ) ASC\n ",
"describe": {
"columns": [
{
"name": "id!: Uuid",
"ordinal": 0,
"type_info": "Blob"
},
{
"name": "task_id!: Uuid",
"ordinal": 1,
"type_info": "Blob"
},
{
"name": "container_ref",
"ordinal": 2,
"type_info": "Text"
},
{
"name": "branch!",
"ordinal": 3,
"type_info": "Text"
},
{
"name": "agent_working_dir",
"ordinal": 4,
"type_info": "Text"
},
{
"name": "setup_completed_at: DateTime<Utc>",
"ordinal": 5,
"type_info": "Datetime"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 6,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
"ordinal": 7,
"type_info": "Text"
}
],
"parameters": {
"Right": 0
},
"nullable": [
true,
true,
true,
true,
true,
true,
true,
true
]
},
"hash": "3d4cd3c4749b2b39e1d18f3a8e7fd38f5b5a165c796616a904acd820190df731"
}

@@ -0,0 +1,12 @@
{
"db_name": "SQLite",
"query": "UPDATE projects\n SET default_agent_working_dir = ''\n WHERE id = $1",
"describe": {
"columns": [],
"parameters": {
"Right": 1
},
"nullable": []
},
"hash": "43c10bbafd851c362132fd0657eaef5e16eb58d847540b1d998a741996967643"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
"query": "SELECT id AS \"id!: Uuid\",\n task_id AS \"task_id!: Uuid\",\n container_ref,\n branch,\n setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n created_at AS \"created_at!: DateTime<Utc>\",\n updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM workspaces\n WHERE task_id = $1\n ORDER BY created_at DESC",
"query": "SELECT id AS \"id!: Uuid\",\n task_id AS \"task_id!: Uuid\",\n container_ref,\n branch,\n agent_working_dir,\n setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n created_at AS \"created_at!: DateTime<Utc>\",\n updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM workspaces\n WHERE task_id = $1\n ORDER BY created_at DESC",
"describe": {
"columns": [
{
@@ -24,18 +24,23 @@
"type_info": "Text"
},
{
"name": "setup_completed_at: DateTime<Utc>",
"name": "agent_working_dir",
"ordinal": 4,
"type_info": "Text"
},
{
"name": "setup_completed_at: DateTime<Utc>",
"ordinal": 5,
"type_info": "Datetime"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 5,
"ordinal": 6,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
"ordinal": 6,
"ordinal": 7,
"type_info": "Text"
}
],
@@ -48,9 +53,10 @@
true,
false,
true,
true,
false,
false
]
},
"hash": "d9bd2bef480a5eb98aec33263a9356e97d8f1a77d613fbc3ffbb6eaaff6979c8"
"hash": "4f2044112bb23e40b229804cfa535a6b631d95ff7b5596706711f76a50516b40"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
"query": "UPDATE projects\n SET name = $2, dev_script = $3, dev_script_working_dir = $4\n WHERE id = $1\n RETURNING id as \"id!: Uuid\",\n name,\n dev_script,\n dev_script_working_dir,\n remote_project_id as \"remote_project_id: Uuid\",\n created_at as \"created_at!: DateTime<Utc>\",\n updated_at as \"updated_at!: DateTime<Utc>\"",
"query": "UPDATE projects\n SET name = $2, dev_script = $3, dev_script_working_dir = $4, default_agent_working_dir = $5\n WHERE id = $1\n RETURNING id as \"id!: Uuid\",\n name,\n dev_script,\n dev_script_working_dir,\n default_agent_working_dir,\n remote_project_id as \"remote_project_id: Uuid\",\n created_at as \"created_at!: DateTime<Utc>\",\n updated_at as \"updated_at!: DateTime<Utc>\"",
"describe": {
"columns": [
{
@@ -24,23 +24,28 @@
"type_info": "Text"
},
{
"name": "remote_project_id: Uuid",
"name": "default_agent_working_dir",
"ordinal": 4,
"type_info": "Text"
},
{
"name": "remote_project_id: Uuid",
"ordinal": 5,
"type_info": "Blob"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 5,
"ordinal": 6,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
"ordinal": 6,
"ordinal": 7,
"type_info": "Text"
}
],
"parameters": {
"Right": 4
"Right": 5
},
"nullable": [
true,
@@ -48,9 +53,10 @@
true,
true,
true,
true,
false,
false
]
},
"hash": "ad85226d4ebe3d4eb3df8d2d6087146929ec6ea7345395fda9498bae9507c152"
"hash": "697001fc14562702ea84061e74bdcc6b9fbef679b8366ab46c1f629a633e9919"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
"query": "SELECT w.id AS \"id!: Uuid\",\n w.task_id AS \"task_id!: Uuid\",\n w.container_ref,\n w.branch,\n w.setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n w.created_at AS \"created_at!: DateTime<Utc>\",\n w.updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM workspaces w\n JOIN tasks t ON w.task_id = t.id\n JOIN projects p ON t.project_id = p.id\n WHERE w.id = $1 AND t.id = $2 AND p.id = $3",
"query": "SELECT w.id AS \"id!: Uuid\",\n w.task_id AS \"task_id!: Uuid\",\n w.container_ref,\n w.branch,\n w.agent_working_dir,\n w.setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n w.created_at AS \"created_at!: DateTime<Utc>\",\n w.updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM workspaces w\n JOIN tasks t ON w.task_id = t.id\n JOIN projects p ON t.project_id = p.id\n WHERE w.id = $1 AND t.id = $2 AND p.id = $3",
"describe": {
"columns": [
{
@@ -24,18 +24,23 @@
"type_info": "Text"
},
{
"name": "setup_completed_at: DateTime<Utc>",
"name": "agent_working_dir",
"ordinal": 4,
"type_info": "Text"
},
{
"name": "setup_completed_at: DateTime<Utc>",
"ordinal": 5,
"type_info": "Datetime"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 5,
"ordinal": 6,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
"ordinal": 6,
"ordinal": 7,
"type_info": "Text"
}
],
@@ -48,9 +53,10 @@
true,
false,
true,
true,
false,
false
]
},
"hash": "043aa47b7fc12fe0e1851b470623bb35fd3683691cfe3a12df3d32dfd501eeb5"
"hash": "7c6ce796263a1b9c1a9f14fc4a565063fb627263efe4303ef7f165b0b3e6b21e"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
"query": "SELECT id as \"id!: Uuid\",\n name,\n dev_script,\n dev_script_working_dir,\n remote_project_id as \"remote_project_id: Uuid\",\n created_at as \"created_at!: DateTime<Utc>\",\n updated_at as \"updated_at!: DateTime<Utc>\"\n FROM projects\n WHERE id = $1",
"query": "SELECT id as \"id!: Uuid\",\n name,\n dev_script,\n dev_script_working_dir,\n default_agent_working_dir,\n remote_project_id as \"remote_project_id: Uuid\",\n created_at as \"created_at!: DateTime<Utc>\",\n updated_at as \"updated_at!: DateTime<Utc>\"\n FROM projects\n WHERE remote_project_id = $1\n LIMIT 1",
"describe": {
"columns": [
{
@@ -24,18 +24,23 @@
"type_info": "Text"
},
{
"name": "remote_project_id: Uuid",
"name": "default_agent_working_dir",
"ordinal": 4,
"type_info": "Text"
},
{
"name": "remote_project_id: Uuid",
"ordinal": 5,
"type_info": "Blob"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 5,
"ordinal": 6,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
"ordinal": 6,
"ordinal": 7,
"type_info": "Text"
}
],
@@ -48,9 +53,10 @@
true,
true,
true,
true,
false,
false
]
},
"hash": "e3ad69990a8d5e62a61d824db16a04d8893df37bc0980441d01f6e4e416702fa"
"hash": "88c15c5c9f2c8edeef4d48c5fcad3607fbc3f7d1402d379d19ad3f4ae2dd0f7d"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
"query": "SELECT id as \"id!: Uuid\",\n name,\n dev_script,\n dev_script_working_dir,\n remote_project_id as \"remote_project_id: Uuid\",\n created_at as \"created_at!: DateTime<Utc>\",\n updated_at as \"updated_at!: DateTime<Utc>\"\n FROM projects\n ORDER BY created_at DESC",
"query": "SELECT id as \"id!: Uuid\",\n name,\n dev_script,\n dev_script_working_dir,\n default_agent_working_dir,\n remote_project_id as \"remote_project_id: Uuid\",\n created_at as \"created_at!: DateTime<Utc>\",\n updated_at as \"updated_at!: DateTime<Utc>\"\n FROM projects\n ORDER BY created_at DESC",
"describe": {
"columns": [
{
@@ -24,18 +24,23 @@
"type_info": "Text"
},
{
"name": "remote_project_id: Uuid",
"name": "default_agent_working_dir",
"ordinal": 4,
"type_info": "Text"
},
{
"name": "remote_project_id: Uuid",
"ordinal": 5,
"type_info": "Blob"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 5,
"ordinal": 6,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
"ordinal": 6,
"ordinal": 7,
"type_info": "Text"
}
],
@@ -48,9 +53,10 @@
true,
true,
true,
true,
false,
false
]
},
"hash": "089ab7af5f065fd097e7e233347badf8467d4ad702ea33772fba8e0622f861b7"
"hash": "9cdc6d55c24ff020a6599f59560c7f0d3feacfe64136bb1338bb4c447022f69b"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
"query": "INSERT INTO workspaces (id, task_id, container_ref, branch, setup_completed_at)\n VALUES ($1, $2, $3, $4, $5)\n RETURNING id as \"id!: Uuid\", task_id as \"task_id!: Uuid\", container_ref, branch, setup_completed_at as \"setup_completed_at: DateTime<Utc>\", created_at as \"created_at!: DateTime<Utc>\", updated_at as \"updated_at!: DateTime<Utc>\"",
"query": "INSERT INTO workspaces (id, task_id, container_ref, branch, agent_working_dir, setup_completed_at)\n VALUES ($1, $2, $3, $4, $5, $6)\n RETURNING id as \"id!: Uuid\", task_id as \"task_id!: Uuid\", container_ref, branch, agent_working_dir, setup_completed_at as \"setup_completed_at: DateTime<Utc>\", created_at as \"created_at!: DateTime<Utc>\", updated_at as \"updated_at!: DateTime<Utc>\"",
"describe": {
"columns": [
{
@@ -24,23 +24,28 @@
"type_info": "Text"
},
{
"name": "setup_completed_at: DateTime<Utc>",
"name": "agent_working_dir",
"ordinal": 4,
"type_info": "Text"
},
{
"name": "setup_completed_at: DateTime<Utc>",
"ordinal": 5,
"type_info": "Datetime"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 5,
"ordinal": 6,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
"ordinal": 6,
"ordinal": 7,
"type_info": "Text"
}
],
"parameters": {
"Right": 5
"Right": 6
},
"nullable": [
true,
@@ -48,9 +53,10 @@
true,
false,
true,
true,
false,
false
]
},
"hash": "1ea1ac1a3d569e40347f512dc6f6c1422d0cf40933240b933ffc81bea1c5f9cc"
"hash": "a672d2f33fca9a19ae68c29e5d3a14c8b050389fb55b80b3788901bc79fdde1b"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
"query": "SELECT id AS \"id!: Uuid\",\n task_id AS \"task_id!: Uuid\",\n container_ref,\n branch,\n setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n created_at AS \"created_at!: DateTime<Utc>\",\n updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM workspaces\n WHERE rowid = $1",
"query": "SELECT id AS \"id!: Uuid\",\n task_id AS \"task_id!: Uuid\",\n container_ref,\n branch,\n agent_working_dir,\n setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n created_at AS \"created_at!: DateTime<Utc>\",\n updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM workspaces\n WHERE id = $1",
"describe": {
"columns": [
{
@@ -24,18 +24,23 @@
"type_info": "Text"
},
{
"name": "setup_completed_at: DateTime<Utc>",
"name": "agent_working_dir",
"ordinal": 4,
"type_info": "Text"
},
{
"name": "setup_completed_at: DateTime<Utc>",
"ordinal": 5,
"type_info": "Datetime"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 5,
"ordinal": 6,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
"ordinal": 6,
"ordinal": 7,
"type_info": "Text"
}
],
@@ -48,9 +53,10 @@
true,
false,
true,
true,
false,
false
]
},
"hash": "b7d7e3767c0994a28ebab0b98cdce39e32504a78ccd87ba4704a4e3770bf6149"
"hash": "b96fe730f54d9bea01ff6306bed580d7ad1f12d379d434898a77303d32a32efc"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
"query": "INSERT INTO projects (\n id,\n name\n ) VALUES (\n $1, $2\n )\n RETURNING id as \"id!: Uuid\",\n name,\n dev_script,\n dev_script_working_dir,\n remote_project_id as \"remote_project_id: Uuid\",\n created_at as \"created_at!: DateTime<Utc>\",\n updated_at as \"updated_at!: DateTime<Utc>\"",
"query": "INSERT INTO projects (\n id,\n name\n ) VALUES (\n $1, $2\n )\n RETURNING id as \"id!: Uuid\",\n name,\n dev_script,\n dev_script_working_dir,\n default_agent_working_dir,\n remote_project_id as \"remote_project_id: Uuid\",\n created_at as \"created_at!: DateTime<Utc>\",\n updated_at as \"updated_at!: DateTime<Utc>\"",
"describe": {
"columns": [
{
@@ -24,18 +24,23 @@
"type_info": "Text"
},
{
"name": "remote_project_id: Uuid",
"name": "default_agent_working_dir",
"ordinal": 4,
"type_info": "Text"
},
{
"name": "remote_project_id: Uuid",
"ordinal": 5,
"type_info": "Blob"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 5,
"ordinal": 6,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
"ordinal": 6,
"ordinal": 7,
"type_info": "Text"
}
],
@@ -47,10 +52,11 @@
false,
true,
false,
false,
true,
false,
false
]
},
"hash": "85fd1ae6ff6ab171c0ca452709697e659e75def14212c7cf69fd80bb85b0f7db"
"hash": "ff1ddf3ed4bed81adf8e1d3b8e3c3c8f0fcb9dc51511250f3c32501d8d60f0f0"
}

@@ -0,0 +1,12 @@
-- Add column with empty default first (named default_ because it's the default for new workspaces)
ALTER TABLE projects ADD COLUMN default_agent_working_dir TEXT DEFAULT '';
-- Copy existing dev_script_working_dir values to default_agent_working_dir
-- ONLY for single-repo projects (multi-repo projects should default to None/empty)
UPDATE projects SET default_agent_working_dir = dev_script_working_dir
WHERE dev_script_working_dir IS NOT NULL
AND dev_script_working_dir != ''
AND (SELECT COUNT(*) FROM project_repos WHERE project_repos.project_id = projects.id) = 1;
-- Add agent_working_dir to workspaces (snapshot of project's default at workspace creation)
ALTER TABLE workspaces ADD COLUMN agent_working_dir TEXT DEFAULT '';

@@ -23,6 +23,7 @@ pub struct Project {
pub name: String,
pub dev_script: Option<String>,
pub dev_script_working_dir: Option<String>,
pub default_agent_working_dir: Option<String>,
pub remote_project_id: Option<Uuid>,
#[ts(type = "Date")]
pub created_at: DateTime<Utc>,
@@ -41,6 +42,7 @@ pub struct UpdateProject {
pub name: Option<String>,
pub dev_script: Option<String>,
pub dev_script_working_dir: Option<String>,
pub default_agent_working_dir: Option<String>,
}
#[derive(Debug, Serialize, TS)]
@@ -71,6 +73,7 @@ impl Project {
name,
dev_script,
dev_script_working_dir,
default_agent_working_dir,
remote_project_id as "remote_project_id: Uuid",
created_at as "created_at!: DateTime<Utc>",
updated_at as "updated_at!: DateTime<Utc>"
@@ -87,6 +90,7 @@ impl Project {
Project,
r#"
SELECT p.id as "id!: Uuid", p.name, p.dev_script, p.dev_script_working_dir,
p.default_agent_working_dir,
p.remote_project_id as "remote_project_id: Uuid",
p.created_at as "created_at!: DateTime<Utc>", p.updated_at as "updated_at!: DateTime<Utc>"
FROM projects p
@@ -111,6 +115,7 @@ impl Project {
name,
dev_script,
dev_script_working_dir,
default_agent_working_dir,
remote_project_id as "remote_project_id: Uuid",
created_at as "created_at!: DateTime<Utc>",
updated_at as "updated_at!: DateTime<Utc>"
@@ -129,6 +134,7 @@ impl Project {
name,
dev_script,
dev_script_working_dir,
default_agent_working_dir,
remote_project_id as "remote_project_id: Uuid",
created_at as "created_at!: DateTime<Utc>",
updated_at as "updated_at!: DateTime<Utc>"
@@ -150,6 +156,7 @@ impl Project {
name,
dev_script,
dev_script_working_dir,
default_agent_working_dir,
remote_project_id as "remote_project_id: Uuid",
created_at as "created_at!: DateTime<Utc>",
updated_at as "updated_at!: DateTime<Utc>"
@@ -179,6 +186,7 @@ impl Project {
name,
dev_script,
dev_script_working_dir,
default_agent_working_dir,
remote_project_id as "remote_project_id: Uuid",
created_at as "created_at!: DateTime<Utc>",
updated_at as "updated_at!: DateTime<Utc>""#,
@@ -201,16 +209,18 @@ impl Project {
let name = payload.name.clone().unwrap_or(existing.name);
let dev_script = payload.dev_script.clone();
let dev_script_working_dir = payload.dev_script_working_dir.clone();
let default_agent_working_dir = payload.default_agent_working_dir.clone();
sqlx::query_as!(
Project,
r#"UPDATE projects
SET name = $2, dev_script = $3, dev_script_working_dir = $4
SET name = $2, dev_script = $3, dev_script_working_dir = $4, default_agent_working_dir = $5
WHERE id = $1
RETURNING id as "id!: Uuid",
name,
dev_script,
dev_script_working_dir,
default_agent_working_dir,
remote_project_id as "remote_project_id: Uuid",
created_at as "created_at!: DateTime<Utc>",
updated_at as "updated_at!: DateTime<Utc>""#,
@@ -218,11 +228,27 @@ impl Project {
name,
dev_script,
dev_script_working_dir,
default_agent_working_dir,
)
.fetch_one(pool)
.await
}
pub async fn clear_default_agent_working_dir(
pool: &SqlitePool,
id: Uuid,
) -> Result<(), sqlx::Error> {
sqlx::query!(
r#"UPDATE projects
SET default_agent_working_dir = ''
WHERE id = $1"#,
id
)
.execute(pool)
.await?;
Ok(())
}
pub async fn set_remote_project_id(
pool: &SqlitePool,
id: Uuid,

@@ -50,6 +50,7 @@ pub struct Workspace {
pub task_id: Uuid,
pub container_ref: Option<String>,
pub branch: String,
pub agent_working_dir: Option<String>,
pub setup_completed_at: Option<DateTime<Utc>>,
pub created_at: DateTime<Utc>,
pub updated_at: DateTime<Utc>,
@@ -89,6 +90,7 @@ pub struct WorkspaceContext {
#[derive(Debug, Deserialize, TS)]
pub struct CreateWorkspace {
pub branch: String,
pub agent_working_dir: Option<String>,
}
impl Workspace {
@@ -108,6 +110,7 @@ impl Workspace {
task_id AS "task_id!: Uuid",
container_ref,
branch,
agent_working_dir,
setup_completed_at AS "setup_completed_at: DateTime<Utc>",
created_at AS "created_at!: DateTime<Utc>",
updated_at AS "updated_at!: DateTime<Utc>"
@@ -125,6 +128,7 @@ impl Workspace {
task_id AS "task_id!: Uuid",
container_ref,
branch,
agent_working_dir,
setup_completed_at AS "setup_completed_at: DateTime<Utc>",
created_at AS "created_at!: DateTime<Utc>",
updated_at AS "updated_at!: DateTime<Utc>"
@@ -152,6 +156,7 @@ impl Workspace {
w.task_id AS "task_id!: Uuid",
w.container_ref,
w.branch,
w.agent_working_dir,
w.setup_completed_at AS "setup_completed_at: DateTime<Utc>",
w.created_at AS "created_at!: DateTime<Utc>",
w.updated_at AS "updated_at!: DateTime<Utc>"
@@ -225,6 +230,7 @@ impl Workspace {
task_id AS "task_id!: Uuid",
container_ref,
branch,
agent_working_dir,
setup_completed_at AS "setup_completed_at: DateTime<Utc>",
created_at AS "created_at!: DateTime<Utc>",
updated_at AS "updated_at!: DateTime<Utc>"
@@ -243,6 +249,7 @@ impl Workspace {
task_id AS "task_id!: Uuid",
container_ref,
branch,
agent_working_dir,
setup_completed_at AS "setup_completed_at: DateTime<Utc>",
created_at AS "created_at!: DateTime<Utc>",
updated_at AS "updated_at!: DateTime<Utc>"
@@ -280,6 +287,7 @@ impl Workspace {
w.task_id as "task_id!: Uuid",
w.container_ref,
w.branch as "branch!",
w.agent_working_dir,
w.setup_completed_at as "setup_completed_at: DateTime<Utc>",
w.created_at as "created_at!: DateTime<Utc>",
w.updated_at as "updated_at!: DateTime<Utc>"
@@ -322,13 +330,14 @@ impl Workspace {
) -> Result<Self, WorkspaceError> {
Ok(sqlx::query_as!(
Workspace,
r#"INSERT INTO workspaces (id, task_id, container_ref, branch, setup_completed_at)
VALUES ($1, $2, $3, $4, $5)
RETURNING id as "id!: Uuid", task_id as "task_id!: Uuid", container_ref, branch, setup_completed_at as "setup_completed_at: DateTime<Utc>", created_at as "created_at!: DateTime<Utc>", updated_at as "updated_at!: DateTime<Utc>""#,
r#"INSERT INTO workspaces (id, task_id, container_ref, branch, agent_working_dir, setup_completed_at)
VALUES ($1, $2, $3, $4, $5, $6)
RETURNING id as "id!: Uuid", task_id as "task_id!: Uuid", container_ref, branch, agent_working_dir, setup_completed_at as "setup_completed_at: DateTime<Utc>", created_at as "created_at!: DateTime<Utc>", updated_at as "updated_at!: DateTime<Utc>""#,
id,
task_id,
Option::<String>::None,
data.branch,
data.agent_working_dir,
Option::<DateTime<Utc>>::None
)
.fetch_one(pool)

View File

@@ -20,6 +20,10 @@ pub struct CodingAgentFollowUpRequest {
#[serde(alias = "profile_variant_label")]
// Backwards compatibility with ProfileVariantIds, esp. those stored in the DB under ExecutorAction
pub executor_profile_id: ExecutorProfileId,
/// Optional relative path to execute the agent in (relative to container_ref).
/// If None, uses the container_ref directory directly.
#[serde(default)]
pub working_dir: Option<String>,
}
impl CodingAgentFollowUpRequest {
@@ -41,6 +45,11 @@ impl Executable for CodingAgentFollowUpRequest {
approvals: Arc<dyn ExecutorApprovalService>,
env: &ExecutionEnv,
) -> Result<SpawnedChild, ExecutorError> {
let effective_dir = match &self.working_dir {
Some(rel_path) => current_dir.join(rel_path),
None => current_dir.to_path_buf(),
};
let executor_profile_id = self.get_executor_profile_id();
let mut agent = ExecutorConfigs::get_cached()
.get_coding_agent(&executor_profile_id)
@@ -51,7 +60,7 @@ impl Executable for CodingAgentFollowUpRequest {
agent.use_approvals(approvals.clone());
agent
.spawn_follow_up(current_dir, &self.prompt, &self.session_id, env)
.spawn_follow_up(&effective_dir, &self.prompt, &self.session_id, env)
.await
}
}
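The `effective_dir` resolution added here (and mirrored in `CodingAgentInitialRequest`) can be sketched in isolation. The free function below is ours for illustration, not part of the diff; it shows the intended semantics: a `Some` value is joined onto the workspace directory, `None` falls back to the workspace directory itself.

```rust
use std::path::{Path, PathBuf};

// Hypothetical helper mirroring the diff's match on `self.working_dir`:
// resolve the agent's effective directory from an optional repo-relative path.
fn effective_dir(current_dir: &Path, working_dir: Option<&str>) -> PathBuf {
    match working_dir {
        // Relative path is appended below the workspace root (container_ref).
        Some(rel_path) => current_dir.join(rel_path),
        // No override: run from the workspace root directly.
        None => current_dir.to_path_buf(),
    }
}

fn main() {
    let root = Path::new("/workspaces/attempt-1");
    assert_eq!(
        effective_dir(root, Some("my-repo")),
        Path::new("/workspaces/attempt-1/my-repo")
    );
    assert_eq!(effective_dir(root, None), root);
}
```

Note that `Path::join` replaces the base entirely if handed an absolute path, so callers are expected to store repo-relative values like `my-repo`, as the doc comment on the field states.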

View File

@@ -19,6 +19,10 @@ pub struct CodingAgentInitialRequest {
#[serde(alias = "profile_variant_label")]
// Backwards compatibility with ProfileVariantIds, esp. those stored in the DB under ExecutorAction
pub executor_profile_id: ExecutorProfileId,
/// Optional relative path to execute the agent in (relative to container_ref).
/// If None, uses the container_ref directory directly.
#[serde(default)]
pub working_dir: Option<String>,
}
impl CodingAgentInitialRequest {
@@ -35,6 +39,12 @@ impl Executable for CodingAgentInitialRequest {
approvals: Arc<dyn ExecutorApprovalService>,
env: &ExecutionEnv,
) -> Result<SpawnedChild, ExecutorError> {
// Use working_dir if specified, otherwise use current_dir
let effective_dir = match &self.working_dir {
Some(rel_path) => current_dir.join(rel_path),
None => current_dir.to_path_buf(),
};
let executor_profile_id = self.executor_profile_id.clone();
let mut agent = ExecutorConfigs::get_cached()
.get_coding_agent(&executor_profile_id)
@@ -44,6 +54,6 @@ impl Executable for CodingAgentInitialRequest {
agent.use_approvals(approvals.clone());
agent.spawn(current_dir, &self.prompt, env).await
agent.spawn(&effective_dir, &self.prompt, env).await
}
}

View File

@@ -821,16 +821,25 @@ impl LocalContainerService {
ProjectRepo::find_by_project_id_with_names(&self.db.pool, ctx.project.id).await?;
let cleanup_action = self.cleanup_actions_for_repos(&project_repos);
let working_dir = ctx
.workspace
.agent_working_dir
.as_ref()
.filter(|dir| !dir.is_empty())
.cloned();
let action_type = if let Some(agent_session_id) = latest_agent_session_id {
ExecutorActionType::CodingAgentFollowUpRequest(CodingAgentFollowUpRequest {
prompt: queued_data.message.clone(),
session_id: agent_session_id,
executor_profile_id: executor_profile_id.clone(),
working_dir: working_dir.clone(),
})
} else {
ExecutorActionType::CodingAgentInitialRequest(CodingAgentInitialRequest {
prompt: queued_data.message.clone(),
executor_profile_id: executor_profile_id.clone(),
working_dir,
})
};
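The `as_ref().filter(...).cloned()` chain above is repeated at each call site to normalize the stored value: because the migration defaults the column to `''`, an empty string in the DB must behave like `NULL`. A standalone sketch (the function name is ours):

```rust
// Sketch of the normalization used at each call site: treat an empty
// string stored in the database the same as a missing value (None).
fn normalize_working_dir(agent_working_dir: Option<&String>) -> Option<String> {
    agent_working_dir
        .filter(|dir| !dir.is_empty()) // drop the `''` sentinel
        .cloned()                      // Option<&String> -> Option<String>
}

fn main() {
    let dir = String::from("my-repo");
    assert_eq!(normalize_working_dir(Some(&dir)), Some("my-repo".to_string()));
    assert_eq!(normalize_working_dir(Some(&String::new())), None);
    assert_eq!(normalize_working_dir(None), None);
}
```

Centralizing this in one helper (e.g. a method on `Workspace`) would avoid the four copies of the chain in this PR, but the inline version keeps each call site self-contained.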

View File

@@ -181,17 +181,25 @@ pub async fn follow_up(
.container()
.cleanup_actions_for_repos(&project_repos);
let working_dir = workspace
.agent_working_dir
.as_ref()
.filter(|dir| !dir.is_empty())
.cloned();
let action_type = if let Some(agent_session_id) = latest_agent_session_id {
ExecutorActionType::CodingAgentFollowUpRequest(CodingAgentFollowUpRequest {
prompt: prompt.clone(),
session_id: agent_session_id,
executor_profile_id: executor_profile_id.clone(),
working_dir: working_dir.clone(),
})
} else {
ExecutorActionType::CodingAgentInitialRequest(
executors::actions::coding_agent_initial::CodingAgentInitialRequest {
prompt,
executor_profile_id: executor_profile_id.clone(),
working_dir,
},
)
};

View File

@@ -142,6 +142,17 @@ pub async fn create_task_attempt(
.await?
.ok_or(SqlxError::RowNotFound)?;
let project = task
.parent_project(pool)
.await?
.ok_or(SqlxError::RowNotFound)?;
let agent_working_dir = project
.default_agent_working_dir
.as_ref()
.filter(|dir| !dir.is_empty())
.cloned();
let attempt_id = Uuid::new_v4();
let git_branch_name = deployment
.container()
@@ -152,6 +163,7 @@ pub async fn create_task_attempt(
pool,
&CreateWorkspace {
branch: git_branch_name.clone(),
agent_working_dir,
},
attempt_id,
payload.task_id,

View File

@@ -146,17 +146,25 @@ async fn trigger_pr_description_follow_up(
)
.await?;
let working_dir = workspace
.agent_working_dir
.as_ref()
.filter(|dir| !dir.is_empty())
.cloned();
// Build the action type (follow-up if session exists, otherwise initial)
let action_type = if let Some(agent_session_id) = latest_agent_session_id {
ExecutorActionType::CodingAgentFollowUpRequest(CodingAgentFollowUpRequest {
prompt,
session_id: agent_session_id,
executor_profile_id: executor_profile_id.clone(),
working_dir: working_dir.clone(),
})
} else {
ExecutorActionType::CodingAgentInitialRequest(CodingAgentInitialRequest {
prompt,
executor_profile_id: executor_profile_id.clone(),
working_dir,
})
};

View File

@@ -14,6 +14,7 @@ use axum::{
};
use db::models::{
image::TaskImage,
project::{Project, ProjectError},
repo::Repo,
task::{CreateTask, Task, TaskWithAttemptStatus, UpdateTask},
workspace::{CreateWorkspace, Workspace},
@@ -177,16 +178,27 @@ pub async fn create_task_and_start(
)
.await;
let project = Project::find_by_id(pool, task.project_id)
.await?
.ok_or(ProjectError::ProjectNotFound)?;
let attempt_id = Uuid::new_v4();
let git_branch_name = deployment
.container()
.git_branch_from_workspace(&attempt_id, &task.title)
.await;
let agent_working_dir = project
.default_agent_working_dir
.as_ref()
.filter(|dir: &&String| !dir.is_empty())
.cloned();
let workspace = Workspace::create(
pool,
&CreateWorkspace {
branch: git_branch_name,
agent_working_dir,
},
attempt_id,
task.id,

View File

@@ -351,7 +351,7 @@ pub trait ContainerService {
}
/// Backfill repo names that were migrated with a sentinel placeholder.
/// Also backfills dev_script_working_dir for single-repo projects.
/// Also backfills dev_script_working_dir and default_agent_working_dir for single-repo projects.
async fn backfill_repo_names(&self) -> Result<(), ContainerError> {
let pool = &self.db().pool;
let repos = Repo::list_needing_name_fix(pool).await?;
@@ -372,13 +372,14 @@ pub trait ContainerService {
Repo::update_name(pool, repo.id, &name, &name).await?;
// Also update dev_script_working_dir for single-repo projects
// Also update dev_script_working_dir and default_agent_working_dir for single-repo projects
let project_repos = ProjectRepo::find_by_repo_id(pool, repo.id).await?;
for pr in project_repos {
let all_repos = ProjectRepo::find_by_project_id(pool, pr.project_id).await?;
if all_repos.len() == 1
&& let Some(project) = Project::find_by_id(pool, pr.project_id).await?
&& project
{
let needs_dev_script_working_dir = project
.dev_script
.as_ref()
.map(|s| !s.is_empty())
@@ -387,21 +388,38 @@ pub trait ContainerService {
.dev_script_working_dir
.as_ref()
.map(|s| s.is_empty())
.unwrap_or(true)
{
.unwrap_or(true);
let needs_default_agent_working_dir = project
.default_agent_working_dir
.as_ref()
.map(|s| s.is_empty())
.unwrap_or(true);
if needs_dev_script_working_dir || needs_default_agent_working_dir {
Project::update(
pool,
pr.project_id,
&UpdateProject {
name: Some(project.name.clone()),
dev_script: project.dev_script.clone(),
dev_script_working_dir: Some(name.clone()),
dev_script_working_dir: if needs_dev_script_working_dir {
Some(name.clone())
} else {
project.dev_script_working_dir.clone()
},
default_agent_working_dir: if needs_default_agent_working_dir {
Some(name.clone())
} else {
project.default_agent_working_dir.clone()
},
},
)
.await?;
}
}
}
}
Ok(())
}
@@ -909,10 +927,17 @@ pub trait ContainerService {
let cleanup_action = self.cleanup_actions_for_repos(&project_repos);
let working_dir = workspace
.agent_working_dir
.as_ref()
.filter(|dir| !dir.is_empty())
.cloned();
let coding_action = ExecutorAction::new(
ExecutorActionType::CodingAgentInitialRequest(CodingAgentInitialRequest {
prompt,
executor_profile_id: executor_profile_id.clone(),
working_dir,
}),
cleanup_action.map(Box::new),
);

View File

@@ -110,11 +110,31 @@ impl ProjectService {
.await
.map_err(|e| ProjectServiceError::Project(ProjectError::CreateFailed(e.to_string())))?;
for repo in normalized_repos {
let mut created_repo: Option<Repo> = None;
for repo in &normalized_repos {
let repo_entity =
Repo::find_or_create(pool, Path::new(&repo.git_repo_path), &repo.display_name)
.await?;
ProjectRepo::create(pool, project.id, repo_entity.id).await?;
if created_repo.is_none() {
created_repo = Some(repo_entity);
}
}
if normalized_repos.len() == 1
&& let Some(repo) = created_repo
{
Project::update(
pool,
project.id,
&UpdateProject {
name: None,
dev_script: None,
dev_script_working_dir: None,
default_agent_working_dir: Some(repo.name),
},
)
.await?;
}
Ok(project)
@@ -185,6 +205,11 @@ impl ProjectService {
let path = repo_service.normalize_path(&payload.git_repo_path)?;
repo_service.validate_git_repo_path(&path)?;
// Count repos before adding
let repo_count_before = ProjectRepo::find_by_project_id(pool, project_id)
.await?
.len();
let repository = ProjectRepo::add_repo_to_project(
pool,
project_id,
@@ -202,6 +227,11 @@ impl ProjectService {
_ => ProjectServiceError::RepositoryNotFound,
})?;
// If project just went from 1 to 2 repos, clear default_agent_working_dir
if repo_count_before == 1 {
Project::clear_default_agent_working_dir(pool, project_id).await?;
}
tracing::info!(
"Added repository {} to project {} (path: {})",
repository.id,

View File

@@ -86,6 +86,7 @@ export function NoServerContent({
name: null,
dev_script: script,
dev_script_working_dir: project.dev_script_working_dir ?? null,
default_agent_working_dir: project.default_agent_working_dir ?? null,
},
},
{

View File

@@ -349,6 +349,11 @@
"placeholder": "e.g., my-repo",
"helper": "The directory to run the dev server script from, relative to the workspace root. Leave empty to run from the workspace root."
},
"agentWorkingDir": {
"label": "Agent Working Directory",
"placeholder": "e.g., my-repo",
"helper": "Default directory for new workspaces to run the coding agent from, relative to the workspace root. This value is captured when a workspace is created and won't affect existing workspaces. For single-repo projects, this defaults to the repo name. Leave empty to run from the workspace root."
},
"cleanup": {
"label": "Cleanup Script",
"helper": "This script runs from within the worktree after coding agent execution, only if changes were made. Use it for quality assurance tasks like running linters, formatters, tests, or other validation steps. If no changes are made, this script is skipped."

View File

@@ -349,6 +349,11 @@
"placeholder": "ej., mi-repo",
"helper": "El directorio desde el cual ejecutar el script del servidor de desarrollo, relativo a la raíz del workspace. Déjalo vacío para ejecutar desde la raíz del workspace."
},
"agentWorkingDir": {
"label": "Directorio de Trabajo del Agente",
"placeholder": "ej., mi-repo",
"helper": "Directorio predeterminado para nuevos workspaces donde ejecutar el agente de codificación, relativo a la raíz del workspace. Este valor se captura cuando se crea un workspace y no afectará a los workspaces existentes. Para proyectos de un solo repositorio, esto se establece por defecto al nombre del repositorio. Déjalo vacío para ejecutar desde la raíz del workspace."
},
"cleanup": {
"label": "Script de Limpieza",
"helper": "Este script se ejecuta desde dentro del worktree después de la ejecución del agente de codificación, solo si se realizaron cambios. Úsalo para tareas de garantía de calidad como ejecutar linters, formateadores, pruebas u otros pasos de validación. Si no se realizan cambios, se omite este script."

View File

@@ -349,6 +349,11 @@
"placeholder": "例：my-repo",
"helper": "開発サーバースクリプトを実行するディレクトリ。ワークスペースルートからの相対パス。空欄にするとワークスペースルートから実行します。"
},
"agentWorkingDir": {
"label": "エージェント作業ディレクトリ",
"placeholder": "例：my-repo",
"helper": "新しいワークスペースでコーディングエージェントを実行するデフォルトディレクトリ。ワークスペースルートからの相対パス。この値はワークスペース作成時に保存され、既存のワークスペースには影響しません。単一リポジトリプロジェクトの場合、リポジトリ名がデフォルトになります。空欄にするとワークスペースルートから実行します。"
},
"cleanup": {
"label": "クリーンアップスクリプト",
"helper": "このスクリプトはワークツリー内から、コーディングエージェントの実行後に実行されます(変更が行われた場合のみ)。リンター、フォーマッター、テスト、またはその他の検証ステップの実行など、品質保証タスクに使用してください。変更がない場合、このスクリプトはスキップされます。"

View File

@@ -349,6 +349,11 @@
"placeholder": "예: my-repo",
"helper": "개발 서버 스크립트를 실행할 디렉토리로, 워크스페이스 루트 기준 상대 경로입니다. 비워두면 워크스페이스 루트에서 실행됩니다."
},
"agentWorkingDir": {
"label": "에이전트 작업 디렉토리",
"placeholder": "예: my-repo",
"helper": "새 워크스페이스에서 코딩 에이전트를 실행할 기본 디렉토리로, 워크스페이스 루트 기준 상대 경로입니다. 이 값은 워크스페이스 생성 시 저장되며 기존 워크스페이스에는 영향을 주지 않습니다. 단일 저장소 프로젝트의 경우 저장소 이름이 기본값입니다. 비워두면 워크스페이스 루트에서 실행됩니다."
},
"cleanup": {
"label": "정리 스크립트",
"helper": "이 스크립트는 워크트리 내부에서 코딩 에이전트 실행 후에 실행됩니다(변경 사항이 있는 경우에만). 린터, 포맷터, 테스트 또는 기타 검증 단계 실행과 같은 품질 보증 작업에 사용하세요. 변경 사항이 없으면 이 스크립트를 건너뜁니다."

View File

@@ -349,6 +349,11 @@
"placeholder": "例如：my-repo",
"helper": "运行开发服务器脚本的目录,相对于工作区根目录。留空则从工作区根目录运行。"
},
"agentWorkingDir": {
"label": "代理工作目录",
"placeholder": "例如：my-repo",
"helper": "新工作区运行编码代理的默认目录,相对于工作区根目录。此值在创建工作区时保存,不会影响现有工作区。对于单仓库项目,默认为仓库名称。留空则从工作区根目录运行。"
},
"cleanup": {
"label": "清理脚本",
"helper": "此脚本从工作树内部运行,在编码代理执行后执行(仅在进行了更改时)。用于质量保证任务,如运行 linter、格式化程序、测试或其他验证步骤。如果没有进行更改则跳过此脚本。"

View File

@@ -37,6 +37,7 @@ interface ProjectFormState {
name: string;
dev_script: string;
dev_script_working_dir: string;
default_agent_working_dir: string;
}
interface RepoScriptsFormState {
@@ -51,6 +52,7 @@ function projectToFormState(project: Project): ProjectFormState {
name: project.name,
dev_script: project.dev_script ?? '',
dev_script_working_dir: project.dev_script_working_dir ?? '',
default_agent_working_dir: project.default_agent_working_dir ?? '',
};
}
@@ -393,6 +395,8 @@ export function ProjectSettings() {
name: draft.name.trim(),
dev_script: draft.dev_script.trim() || null,
dev_script_working_dir: draft.dev_script_working_dir.trim() || null,
default_agent_working_dir:
draft.default_agent_working_dir.trim() || null,
};
updateProject.mutate({
@@ -606,6 +610,26 @@ export function ProjectSettings() {
</p>
</div>
<div className="space-y-2">
<Label htmlFor="agent-working-dir">
{t('settings.projects.scripts.agentWorkingDir.label')}
</Label>
<Input
id="agent-working-dir"
value={draft.default_agent_working_dir}
onChange={(e) =>
updateDraft({ default_agent_working_dir: e.target.value })
}
placeholder={t(
'settings.projects.scripts.agentWorkingDir.placeholder'
)}
className="font-mono"
/>
<p className="text-sm text-muted-foreground">
{t('settings.projects.scripts.agentWorkingDir.helper')}
</p>
</div>
{/* Save Button */}
<div className="flex items-center justify-between pt-4 border-t">
{hasUnsavedProjectChanges ? (

View File

@@ -12,11 +12,11 @@ export type SharedTask = { id: string, organization_id: string, project_id: stri
export type UserData = { user_id: string, first_name: string | null, last_name: string | null, username: string | null, };
export type Project = { id: string, name: string, dev_script: string | null, dev_script_working_dir: string | null, remote_project_id: string | null, created_at: Date, updated_at: Date, };
export type Project = { id: string, name: string, dev_script: string | null, dev_script_working_dir: string | null, default_agent_working_dir: string | null, remote_project_id: string | null, created_at: Date, updated_at: Date, };
export type CreateProject = { name: string, repositories: Array<CreateProjectRepo>, };
export type UpdateProject = { name: string | null, dev_script: string | null, dev_script_working_dir: string | null, };
export type UpdateProject = { name: string | null, dev_script: string | null, dev_script_working_dir: string | null, default_agent_working_dir: string | null, };
export type SearchResult = { path: string, is_file: boolean, match_type: SearchMatchType, };
@@ -70,7 +70,7 @@ export type Image = { id: string, file_path: string, original_name: string, mime
export type CreateImage = { file_path: string, original_name: string, mime_type: string | null, size_bytes: bigint, hash: string, };
export type Workspace = { id: string, task_id: string, container_ref: string | null, branch: string, setup_completed_at: string | null, created_at: string, updated_at: string, };
export type Workspace = { id: string, task_id: string, container_ref: string | null, branch: string, agent_working_dir: string | null, setup_completed_at: string | null, created_at: string, updated_at: string, };
export type Session = { id: string, workspace_id: string, executor: string | null, created_at: string, updated_at: string, };
@@ -437,13 +437,23 @@ export type CodingAgentInitialRequest = { prompt: string,
/**
* Executor profile specification
*/
executor_profile_id: ExecutorProfileId, };
executor_profile_id: ExecutorProfileId,
/**
* Optional relative path to execute the agent in (relative to container_ref).
* If None, uses the container_ref directory directly.
*/
working_dir: string | null, };
export type CodingAgentFollowUpRequest = { prompt: string, session_id: string,
/**
* Executor profile specification
*/
executor_profile_id: ExecutorProfileId, };
executor_profile_id: ExecutorProfileId,
/**
* Optional relative path to execute the agent in (relative to container_ref).
* If None, uses the container_ref directory directly.
*/
working_dir: string | null, };
export type CommandExitStatus = { "type": "exit_code", code: number, } | { "type": "success", success: boolean, };