Allow multiple merges (#510)

* Allow multiple merges for a single task attempt

Merge more than once (vibe-kanban 618829fc)

When creating a PR, new changes can be pushed after creation.
We need merge to work the same way: when changes have been made after the first merge, a second merge should work.

Commit changes from coding agent for task attempt 548ff450-df77-47b2-a5ba-c88d0aa4a334

Remove pinned todo list (vibe-kanban cc66cda2)

Make a minimal change to remove the pinned todo list from the frontend

* Create merges table; remove task_attempt.merge_commit

Add merge model, replace ta.merge_commit with m.merge_commit

Fix frontend

* Move PR to merges table

* Refactor GitHub repository info retrieval to return structured data

* Fix frontend

* Reset task branch after PR merge

Add branch status handling to TaskDetailsProvider and related components

fmt

Test (vibe-kanban 1bf1a80f)

add test.txt

Show merged diff when no worktree present

Refresh branch status after PR creation

Show rebase when behind

Refactor container service to check if the container is clean before showing merged diff; remove unused BranchStatus import

Test (vibe-kanban a3c1b297)

add test.txt

Refactor branch status handling: rename BranchStatusResponse to BranchStatus and update related types and usages

Test (vibe-kanban) (#540)

* Remove test.txt

* Test (vibe-kanban aade357e)

add test.txt

* test.txt removed.

* Fix diff when merged and new commits have been made

* Remove logging (vibe-kanban) (#541)

* Test (vibe-kanban aade357e)

add test.txt

* Remove the "Fetching branch status" logging statement from `crates/server/src/routes/task_attempts.rs:568-571`.

* Clear previous errors on successful PR creation, push, merge, and rebase actions

* Show branch in worktree dirty error message

* Add success indicators for push and merge actions in CurrentAttempt

* Refactor status display logic in CurrentAttempt for improved readability and maintainability

* Add target_branch_name to merge models and queries for direct and PR merges

* Enhance merge status display logic in CurrentAttempt for better clarity on direct merges

* Remove unnecessary condition check in attempt data fetching interval

* Clippy

* Add index for task_attempt_id in merges table to improve query performance

* Pass PR creation error

* Disable buttons (vibe-kanban 240346bf)

Instead of hiding the merge/PR buttons when they're not available, we should disable them. frontend/src/components/tasks/Toolbar/CurrentAttempt.tsx
This commit is contained in:
Alex Netsch
2025-08-21 16:00:35 +01:00
committed by GitHub
parent 061b461397
commit ed594a3d80
34 changed files with 1348 additions and 810 deletions

@@ -1,12 +0,0 @@
{
"db_name": "SQLite",
"query": "UPDATE task_attempts SET merge_commit = $1, updated_at = datetime('now') WHERE id = $2",
"describe": {
"columns": [],
"parameters": {
"Right": 2
},
"nullable": []
},
"hash": "03f2b02ba6dc5ea2b3cf6b1004caea0ad6bcc10ebd63f441d321a389f026e263"
}

@@ -0,0 +1,80 @@
{
"db_name": "SQLite",
"query": "INSERT INTO merges (\n id, task_attempt_id, merge_type, pr_number, pr_url, pr_status, created_at, target_branch_name\n ) VALUES ($1, $2, 'pr', $3, $4, 'open', $5, $6)\n RETURNING \n id as \"id!: Uuid\",\n task_attempt_id as \"task_attempt_id!: Uuid\",\n merge_type as \"merge_type!: MergeType\",\n merge_commit,\n pr_number,\n pr_url,\n pr_status as \"pr_status?: MergeStatus\",\n pr_merged_at as \"pr_merged_at?: DateTime<Utc>\",\n pr_merge_commit_sha,\n created_at as \"created_at!: DateTime<Utc>\",\n target_branch_name as \"target_branch_name!: String\"\n ",
"describe": {
"columns": [
{
"name": "id!: Uuid",
"ordinal": 0,
"type_info": "Blob"
},
{
"name": "task_attempt_id!: Uuid",
"ordinal": 1,
"type_info": "Blob"
},
{
"name": "merge_type!: MergeType",
"ordinal": 2,
"type_info": "Text"
},
{
"name": "merge_commit",
"ordinal": 3,
"type_info": "Text"
},
{
"name": "pr_number",
"ordinal": 4,
"type_info": "Integer"
},
{
"name": "pr_url",
"ordinal": 5,
"type_info": "Text"
},
{
"name": "pr_status?: MergeStatus",
"ordinal": 6,
"type_info": "Text"
},
{
"name": "pr_merged_at?: DateTime<Utc>",
"ordinal": 7,
"type_info": "Text"
},
{
"name": "pr_merge_commit_sha",
"ordinal": 8,
"type_info": "Text"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 9,
"type_info": "Text"
},
{
"name": "target_branch_name!: String",
"ordinal": 10,
"type_info": "Text"
}
],
"parameters": {
"Right": 6
},
"nullable": [
true,
false,
false,
true,
true,
true,
true,
true,
true,
false,
false
]
},
"hash": "09510a7e5927bd5000f6e9e027d4bf1edf6246f1feb575917ed0aff0e6e0f5a1"
}

@@ -0,0 +1,80 @@
{
"db_name": "SQLite",
"query": "SELECT \n id as \"id!: Uuid\",\n task_attempt_id as \"task_attempt_id!: Uuid\",\n merge_type as \"merge_type!: MergeType\",\n merge_commit,\n pr_number,\n pr_url,\n pr_status as \"pr_status?: MergeStatus\",\n pr_merged_at as \"pr_merged_at?: DateTime<Utc>\",\n pr_merge_commit_sha,\n target_branch_name as \"target_branch_name!: String\",\n created_at as \"created_at!: DateTime<Utc>\"\n FROM merges \n WHERE task_attempt_id = $1\n ORDER BY created_at DESC",
"describe": {
"columns": [
{
"name": "id!: Uuid",
"ordinal": 0,
"type_info": "Blob"
},
{
"name": "task_attempt_id!: Uuid",
"ordinal": 1,
"type_info": "Blob"
},
{
"name": "merge_type!: MergeType",
"ordinal": 2,
"type_info": "Text"
},
{
"name": "merge_commit",
"ordinal": 3,
"type_info": "Text"
},
{
"name": "pr_number",
"ordinal": 4,
"type_info": "Integer"
},
{
"name": "pr_url",
"ordinal": 5,
"type_info": "Text"
},
{
"name": "pr_status?: MergeStatus",
"ordinal": 6,
"type_info": "Text"
},
{
"name": "pr_merged_at?: DateTime<Utc>",
"ordinal": 7,
"type_info": "Text"
},
{
"name": "pr_merge_commit_sha",
"ordinal": 8,
"type_info": "Text"
},
{
"name": "target_branch_name!: String",
"ordinal": 9,
"type_info": "Text"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 10,
"type_info": "Text"
}
],
"parameters": {
"Right": 1
},
"nullable": [
true,
false,
false,
true,
true,
true,
true,
true,
true,
false,
false
]
},
"hash": "1395fe4c3041a4d05e5c3caa068471c8790a67890d6a0566f44bd4e134679095"
}

@@ -0,0 +1,12 @@
{
"db_name": "SQLite",
"query": "UPDATE merges \n SET pr_status = $1, \n pr_merge_commit_sha = $2,\n pr_merged_at = $3\n WHERE id = $4",
"describe": {
"columns": [],
"parameters": {
"Right": 4
},
"nullable": []
},
"hash": "19fcd51ab5368347045ccb0eb39f0bf5dc321c057d01b55151b6ca67f163fc9b"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
-"query": "INSERT INTO task_attempts (id, task_id, container_ref, branch, base_branch, merge_commit, profile, pr_url, pr_number, pr_status, pr_merged_at, worktree_deleted, setup_completed_at)\n VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13)\n RETURNING id as \"id!: Uuid\", task_id as \"task_id!: Uuid\", container_ref, branch, base_branch, merge_commit, profile as \"profile!\", pr_url, pr_number, pr_status, pr_merged_at as \"pr_merged_at: DateTime<Utc>\", worktree_deleted as \"worktree_deleted!: bool\", setup_completed_at as \"setup_completed_at: DateTime<Utc>\", created_at as \"created_at!: DateTime<Utc>\", updated_at as \"updated_at!: DateTime<Utc>\"",
+"query": "INSERT INTO task_attempts (id, task_id, container_ref, branch, base_branch, profile, worktree_deleted, setup_completed_at)\n VALUES ($1, $2, $3, $4, $5, $6, $7, $8)\n RETURNING id as \"id!: Uuid\", task_id as \"task_id!: Uuid\", container_ref, branch, base_branch, profile as \"profile!\", worktree_deleted as \"worktree_deleted!: bool\", setup_completed_at as \"setup_completed_at: DateTime<Utc>\", created_at as \"created_at!: DateTime<Utc>\", updated_at as \"updated_at!: DateTime<Utc>\"",
"describe": {
"columns": [
{
@@ -29,58 +29,33 @@
"type_info": "Text"
},
{
-"name": "merge_commit",
+"name": "profile!",
"ordinal": 5,
"type_info": "Text"
},
-{
-"name": "profile!",
-"ordinal": 6,
-"type_info": "Text"
-},
-{
-"name": "pr_url",
-"ordinal": 7,
-"type_info": "Text"
-},
-{
-"name": "pr_number",
-"ordinal": 8,
-"type_info": "Integer"
-},
-{
-"name": "pr_status",
-"ordinal": 9,
-"type_info": "Text"
-},
-{
-"name": "pr_merged_at: DateTime<Utc>",
-"ordinal": 10,
-"type_info": "Datetime"
-},
{
"name": "worktree_deleted!: bool",
-"ordinal": 11,
+"ordinal": 6,
"type_info": "Bool"
},
{
"name": "setup_completed_at: DateTime<Utc>",
-"ordinal": 12,
+"ordinal": 7,
"type_info": "Datetime"
},
{
"name": "created_at!: DateTime<Utc>",
-"ordinal": 13,
+"ordinal": 8,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
-"ordinal": 14,
+"ordinal": 9,
"type_info": "Text"
}
],
"parameters": {
-"Right": 13
+"Right": 8
},
"nullable": [
true,
@@ -89,16 +64,11 @@
true,
false,
true,
-true,
-true,
-true,
-true,
-true,
false,
true,
false,
false
]
},
-"hash": "741e831f4509958c508da36e955e381905672b39fc121170418015e2512184a2"
+"hash": "25d8df97101afa1a3a6e7c609d741237f24871bd270d37f50c37807cdece1104"
}

@@ -0,0 +1,80 @@
{
"db_name": "SQLite",
"query": "INSERT INTO merges (\n id, task_attempt_id, merge_type, merge_commit, created_at, target_branch_name\n ) VALUES ($1, $2, 'direct', $3, $4, $5)\n RETURNING \n id as \"id!: Uuid\",\n task_attempt_id as \"task_attempt_id!: Uuid\",\n merge_type as \"merge_type!: MergeType\",\n merge_commit,\n pr_number,\n pr_url,\n pr_status as \"pr_status?: MergeStatus\",\n pr_merged_at as \"pr_merged_at?: DateTime<Utc>\",\n pr_merge_commit_sha,\n created_at as \"created_at!: DateTime<Utc>\",\n target_branch_name as \"target_branch_name!: String\"\n ",
"describe": {
"columns": [
{
"name": "id!: Uuid",
"ordinal": 0,
"type_info": "Blob"
},
{
"name": "task_attempt_id!: Uuid",
"ordinal": 1,
"type_info": "Blob"
},
{
"name": "merge_type!: MergeType",
"ordinal": 2,
"type_info": "Text"
},
{
"name": "merge_commit",
"ordinal": 3,
"type_info": "Text"
},
{
"name": "pr_number",
"ordinal": 4,
"type_info": "Integer"
},
{
"name": "pr_url",
"ordinal": 5,
"type_info": "Text"
},
{
"name": "pr_status?: MergeStatus",
"ordinal": 6,
"type_info": "Text"
},
{
"name": "pr_merged_at?: DateTime<Utc>",
"ordinal": 7,
"type_info": "Text"
},
{
"name": "pr_merge_commit_sha",
"ordinal": 8,
"type_info": "Text"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 9,
"type_info": "Text"
},
{
"name": "target_branch_name!: String",
"ordinal": 10,
"type_info": "Text"
}
],
"parameters": {
"Right": 5
},
"nullable": [
true,
false,
false,
true,
true,
true,
true,
true,
true,
false,
false
]
},
"hash": "32c9dae46df6480ce1ca07f72b8659e60d9159afcc03a4bb5213f7a2bae537d8"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
-"query": "SELECT id AS \"id!: Uuid\",\n task_id AS \"task_id!: Uuid\",\n container_ref,\n branch,\n base_branch,\n merge_commit,\n profile AS \"profile!\",\n pr_url,\n pr_number,\n pr_status,\n pr_merged_at AS \"pr_merged_at: DateTime<Utc>\",\n worktree_deleted AS \"worktree_deleted!: bool\",\n setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n created_at AS \"created_at!: DateTime<Utc>\",\n updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM task_attempts\n ORDER BY created_at DESC",
+"query": "SELECT id AS \"id!: Uuid\",\n task_id AS \"task_id!: Uuid\",\n container_ref,\n branch,\n base_branch,\n profile AS \"profile!\",\n worktree_deleted AS \"worktree_deleted!: bool\",\n setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n created_at AS \"created_at!: DateTime<Utc>\",\n updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM task_attempts\n ORDER BY created_at DESC",
"describe": {
"columns": [
{
@@ -29,53 +29,28 @@
"type_info": "Text"
},
{
-"name": "merge_commit",
+"name": "profile!",
"ordinal": 5,
"type_info": "Text"
},
-{
-"name": "profile!",
-"ordinal": 6,
-"type_info": "Text"
-},
-{
-"name": "pr_url",
-"ordinal": 7,
-"type_info": "Text"
-},
-{
-"name": "pr_number",
-"ordinal": 8,
-"type_info": "Integer"
-},
-{
-"name": "pr_status",
-"ordinal": 9,
-"type_info": "Text"
-},
-{
-"name": "pr_merged_at: DateTime<Utc>",
-"ordinal": 10,
-"type_info": "Datetime"
-},
{
"name": "worktree_deleted!: bool",
-"ordinal": 11,
+"ordinal": 6,
"type_info": "Bool"
},
{
"name": "setup_completed_at: DateTime<Utc>",
-"ordinal": 12,
+"ordinal": 7,
"type_info": "Datetime"
},
{
"name": "created_at!: DateTime<Utc>",
-"ordinal": 13,
+"ordinal": 8,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
-"ordinal": 14,
+"ordinal": 9,
"type_info": "Text"
}
],
@@ -89,16 +64,11 @@
true,
false,
true,
-true,
-true,
-true,
-true,
-true,
false,
true,
false,
false
]
},
-"hash": "8a1b8a47f4405a3e4a5bc41db8ec40af31f748587b61b7821f7e326ce9e23a75"
+"hash": "7e7f701c7e56081684128df131135cad9e4d5633f5f1d95ed9186379fcb099b4"
}

@@ -1,12 +0,0 @@
{
"db_name": "SQLite",
"query": "UPDATE task_attempts SET pr_url = $1, pr_number = $2, pr_status = $3, updated_at = datetime('now') WHERE id = $4",
"describe": {
"columns": [],
"parameters": {
"Right": 4
},
"nullable": []
},
"hash": "86d03eb70eef39c59296416867f2ee66c9f7cd8b7f961fbda2f89fc0a1c442c2"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
-"query": "SELECT ta.id AS \"id!: Uuid\",\n ta.task_id AS \"task_id!: Uuid\",\n ta.container_ref,\n ta.branch,\n ta.base_branch,\n ta.merge_commit,\n ta.profile AS \"profile!\",\n ta.pr_url,\n ta.pr_number,\n ta.pr_status,\n ta.pr_merged_at AS \"pr_merged_at: DateTime<Utc>\",\n ta.worktree_deleted AS \"worktree_deleted!: bool\",\n ta.setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n ta.created_at AS \"created_at!: DateTime<Utc>\",\n ta.updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM task_attempts ta\n JOIN tasks t ON ta.task_id = t.id\n JOIN projects p ON t.project_id = p.id\n WHERE ta.id = $1 AND t.id = $2 AND p.id = $3",
+"query": "SELECT ta.id AS \"id!: Uuid\",\n ta.task_id AS \"task_id!: Uuid\",\n ta.container_ref,\n ta.branch,\n ta.base_branch,\n ta.profile AS \"profile!\",\n ta.worktree_deleted AS \"worktree_deleted!: bool\",\n ta.setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n ta.created_at AS \"created_at!: DateTime<Utc>\",\n ta.updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM task_attempts ta\n JOIN tasks t ON ta.task_id = t.id\n JOIN projects p ON t.project_id = p.id\n WHERE ta.id = $1 AND t.id = $2 AND p.id = $3",
"describe": {
"columns": [
{
@@ -29,53 +29,28 @@
"type_info": "Text"
},
{
-"name": "merge_commit",
+"name": "profile!",
"ordinal": 5,
"type_info": "Text"
},
-{
-"name": "profile!",
-"ordinal": 6,
-"type_info": "Text"
-},
-{
-"name": "pr_url",
-"ordinal": 7,
-"type_info": "Text"
-},
-{
-"name": "pr_number",
-"ordinal": 8,
-"type_info": "Integer"
-},
-{
-"name": "pr_status",
-"ordinal": 9,
-"type_info": "Text"
-},
-{
-"name": "pr_merged_at: DateTime<Utc>",
-"ordinal": 10,
-"type_info": "Datetime"
-},
{
"name": "worktree_deleted!: bool",
-"ordinal": 11,
+"ordinal": 6,
"type_info": "Bool"
},
{
"name": "setup_completed_at: DateTime<Utc>",
-"ordinal": 12,
+"ordinal": 7,
"type_info": "Datetime"
},
{
"name": "created_at!: DateTime<Utc>",
-"ordinal": 13,
+"ordinal": 8,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
-"ordinal": 14,
+"ordinal": 9,
"type_info": "Text"
}
],
@@ -89,16 +64,11 @@
true,
false,
true,
-true,
-true,
-true,
-true,
-true,
false,
true,
false,
false
]
},
-"hash": "703270b5172b81852470a886a72d9f749cbef0078e786582d23eb18e8cf11119"
+"hash": "8def2c6c696ac747df23dea77c23e135110c85a8d10cf0df096ffe7e7cd201c4"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
-"query": "SELECT\n t.id AS \"id!: Uuid\",\n t.project_id AS \"project_id!: Uuid\",\n t.title,\n t.description,\n t.status AS \"status!: TaskStatus\",\n t.parent_task_attempt AS \"parent_task_attempt: Uuid\",\n t.created_at AS \"created_at!: DateTime<Utc>\",\n t.updated_at AS \"updated_at!: DateTime<Utc>\",\n\n CASE WHEN EXISTS (\n SELECT 1\n FROM task_attempts ta\n JOIN execution_processes ep\n ON ep.task_attempt_id = ta.id\n WHERE ta.task_id = t.id\n AND ep.status = 'running'\n AND ep.run_reason IN ('setupscript','cleanupscript','codingagent')\n LIMIT 1\n ) THEN 1 ELSE 0 END AS \"has_in_progress_attempt!: i64\",\n\n CASE WHEN EXISTS (\n SELECT 1\n FROM task_attempts ta\n WHERE ta.task_id = t.id\n AND ta.merge_commit IS NOT NULL\n LIMIT 1\n ) THEN 1 ELSE 0 END AS \"has_merged_attempt!: i64\",\n\n CASE WHEN (\n SELECT ep.status\n FROM task_attempts ta\n JOIN execution_processes ep\n ON ep.task_attempt_id = ta.id\n WHERE ta.task_id = t.id\n AND ep.run_reason IN ('setupscript','cleanupscript','codingagent')\n ORDER BY ep.created_at DESC\n LIMIT 1\n ) IN ('failed','killed') THEN 1 ELSE 0 END\n AS \"last_attempt_failed!: i64\",\n\n ( SELECT ta.profile\n FROM task_attempts ta\n WHERE ta.task_id = t.id\n ORDER BY ta.created_at DESC\n LIMIT 1\n ) AS \"profile!: String\"\n\nFROM tasks t\nWHERE t.project_id = $1\nORDER BY t.created_at DESC",
+"query": "SELECT\n t.id AS \"id!: Uuid\",\n t.project_id AS \"project_id!: Uuid\",\n t.title,\n t.description,\n t.status AS \"status!: TaskStatus\",\n t.parent_task_attempt AS \"parent_task_attempt: Uuid\",\n t.created_at AS \"created_at!: DateTime<Utc>\",\n t.updated_at AS \"updated_at!: DateTime<Utc>\",\n\n CASE WHEN EXISTS (\n SELECT 1\n FROM task_attempts ta\n JOIN execution_processes ep\n ON ep.task_attempt_id = ta.id\n WHERE ta.task_id = t.id\n AND ep.status = 'running'\n AND ep.run_reason IN ('setupscript','cleanupscript','codingagent')\n LIMIT 1\n ) THEN 1 ELSE 0 END AS \"has_in_progress_attempt!: i64\",\n \n CASE WHEN (\n SELECT ep.status\n FROM task_attempts ta\n JOIN execution_processes ep\n ON ep.task_attempt_id = ta.id\n WHERE ta.task_id = t.id\n AND ep.run_reason IN ('setupscript','cleanupscript','codingagent')\n ORDER BY ep.created_at DESC\n LIMIT 1\n ) IN ('failed','killed') THEN 1 ELSE 0 END\n AS \"last_attempt_failed!: i64\",\n\n ( SELECT ta.profile\n FROM task_attempts ta\n WHERE ta.task_id = t.id\n ORDER BY ta.created_at DESC\n LIMIT 1\n ) AS \"profile!: String\"\n\nFROM tasks t\nWHERE t.project_id = $1\nORDER BY t.created_at DESC",
"describe": {
"columns": [
{
@@ -49,18 +49,13 @@
"type_info": "Integer"
},
{
-"name": "has_merged_attempt!: i64",
+"name": "last_attempt_failed!: i64",
"ordinal": 9,
"type_info": "Integer"
},
-{
-"name": "last_attempt_failed!: i64",
-"ordinal": 10,
-"type_info": "Integer"
-},
{
"name": "profile!: String",
-"ordinal": 11,
+"ordinal": 10,
"type_info": "Text"
}
],
@@ -78,9 +73,8 @@
false,
false,
false,
-false,
true
]
},
-"hash": "f338f880ec72989bcaabe3ae3e843fe1faabc1f990f2c91ceb30b76b0fe43153"
+"hash": "8f848d77f2464b4010475de13aacf8157663b139b363da000ca0c94fbcac378e"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
-"query": "SELECT id AS \"id!: Uuid\",\n task_id AS \"task_id!: Uuid\",\n container_ref,\n branch,\n base_branch,\n merge_commit,\n profile AS \"profile!\",\n pr_url,\n pr_number,\n pr_status,\n pr_merged_at AS \"pr_merged_at: DateTime<Utc>\",\n worktree_deleted AS \"worktree_deleted!: bool\",\n setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n created_at AS \"created_at!: DateTime<Utc>\",\n updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM task_attempts\n WHERE task_id = $1\n ORDER BY created_at DESC",
+"query": "SELECT id AS \"id!: Uuid\",\n task_id AS \"task_id!: Uuid\",\n container_ref,\n branch,\n base_branch,\n profile AS \"profile!\",\n worktree_deleted AS \"worktree_deleted!: bool\",\n setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n created_at AS \"created_at!: DateTime<Utc>\",\n updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM task_attempts\n WHERE task_id = $1\n ORDER BY created_at DESC",
"describe": {
"columns": [
{
@@ -29,53 +29,28 @@
"type_info": "Text"
},
{
-"name": "merge_commit",
+"name": "profile!",
"ordinal": 5,
"type_info": "Text"
},
-{
-"name": "profile!",
-"ordinal": 6,
-"type_info": "Text"
-},
-{
-"name": "pr_url",
-"ordinal": 7,
-"type_info": "Text"
-},
-{
-"name": "pr_number",
-"ordinal": 8,
-"type_info": "Integer"
-},
-{
-"name": "pr_status",
-"ordinal": 9,
-"type_info": "Text"
-},
-{
-"name": "pr_merged_at: DateTime<Utc>",
-"ordinal": 10,
-"type_info": "Datetime"
-},
{
"name": "worktree_deleted!: bool",
-"ordinal": 11,
+"ordinal": 6,
"type_info": "Bool"
},
{
"name": "setup_completed_at: DateTime<Utc>",
-"ordinal": 12,
+"ordinal": 7,
"type_info": "Datetime"
},
{
"name": "created_at!: DateTime<Utc>",
-"ordinal": 13,
+"ordinal": 8,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
-"ordinal": 14,
+"ordinal": 9,
"type_info": "Text"
}
],
@@ -89,16 +64,11 @@
true,
false,
true,
-true,
-true,
-true,
-true,
-true,
false,
true,
false,
false
]
},
-"hash": "70474b20d1a3affa13c80926f954ca2007faa6977508bb6372a867bdc56c4830"
+"hash": "9298f2ee7230893e28a8defecabb17c7b9e08d355d654b846f0e6a56189c10b6"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
-"query": "SELECT id AS \"id!: Uuid\",\n task_id AS \"task_id!: Uuid\",\n container_ref,\n branch,\n merge_commit,\n base_branch,\n profile AS \"profile!\",\n pr_url,\n pr_number,\n pr_status,\n pr_merged_at AS \"pr_merged_at: DateTime<Utc>\",\n worktree_deleted AS \"worktree_deleted!: bool\",\n setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n created_at AS \"created_at!: DateTime<Utc>\",\n updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM task_attempts\n WHERE id = $1",
+"query": "SELECT id AS \"id!: Uuid\",\n task_id AS \"task_id!: Uuid\",\n container_ref,\n branch,\n base_branch,\n profile AS \"profile!\",\n worktree_deleted AS \"worktree_deleted!: bool\",\n setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n created_at AS \"created_at!: DateTime<Utc>\",\n updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM task_attempts\n WHERE rowid = $1",
"describe": {
"columns": [
{
@@ -24,58 +24,33 @@
"type_info": "Text"
},
{
-"name": "merge_commit",
+"name": "base_branch",
"ordinal": 4,
"type_info": "Text"
},
{
-"name": "base_branch",
+"name": "profile!",
"ordinal": 5,
"type_info": "Text"
},
-{
-"name": "profile!",
-"ordinal": 6,
-"type_info": "Text"
-},
-{
-"name": "pr_url",
-"ordinal": 7,
-"type_info": "Text"
-},
-{
-"name": "pr_number",
-"ordinal": 8,
-"type_info": "Integer"
-},
-{
-"name": "pr_status",
-"ordinal": 9,
-"type_info": "Text"
-},
-{
-"name": "pr_merged_at: DateTime<Utc>",
-"ordinal": 10,
-"type_info": "Datetime"
-},
{
"name": "worktree_deleted!: bool",
-"ordinal": 11,
+"ordinal": 6,
"type_info": "Bool"
},
{
"name": "setup_completed_at: DateTime<Utc>",
-"ordinal": 12,
+"ordinal": 7,
"type_info": "Datetime"
},
{
"name": "created_at!: DateTime<Utc>",
-"ordinal": 13,
+"ordinal": 8,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
-"ordinal": 14,
+"ordinal": 9,
"type_info": "Text"
}
],
@@ -87,18 +62,13 @@
false,
true,
true,
-true,
false,
true,
-true,
-true,
-true,
-true,
false,
true,
false,
false
]
},
-"hash": "2494dbc96dfeed84122a142ca4b4ce5166875560295794e09ffff754861fd765"
+"hash": "94535d0c0e4eac82202f5420b62781ab774616c6d6c5ffd58b5b344c75104a0a"
}

@@ -1,38 +0,0 @@
{
"db_name": "SQLite",
"query": "SELECT \n ta.id as \"attempt_id!: Uuid\",\n ta.task_id as \"task_id!: Uuid\",\n ta.pr_number as \"pr_number!: i64\",\n ta.pr_url as \"pr_url!: String\"\n FROM task_attempts ta\n WHERE ta.pr_status = 'open' AND ta.pr_number IS NOT NULL",
"describe": {
"columns": [
{
"name": "attempt_id!: Uuid",
"ordinal": 0,
"type_info": "Blob"
},
{
"name": "task_id!: Uuid",
"ordinal": 1,
"type_info": "Blob"
},
{
"name": "pr_number!: i64",
"ordinal": 2,
"type_info": "Integer"
},
{
"name": "pr_url!: String",
"ordinal": 3,
"type_info": "Text"
}
],
"parameters": {
"Right": 0
},
"nullable": [
true,
false,
true,
true
]
},
"hash": "c1e5b46545fcef759610463d9bf902b25f18cd83d2ca8616bf3ec1c84728bf6f"
}

@@ -0,0 +1,80 @@
{
"db_name": "SQLite",
"query": "SELECT \n id as \"id!: Uuid\",\n task_attempt_id as \"task_attempt_id!: Uuid\",\n merge_type as \"merge_type!: MergeType\",\n merge_commit,\n pr_number,\n pr_url,\n pr_status as \"pr_status?: MergeStatus\",\n pr_merged_at as \"pr_merged_at?: DateTime<Utc>\",\n pr_merge_commit_sha,\n created_at as \"created_at!: DateTime<Utc>\",\n target_branch_name as \"target_branch_name!: String\"\n FROM merges \n WHERE merge_type = 'pr' AND pr_status = 'open'\n ORDER BY created_at DESC",
"describe": {
"columns": [
{
"name": "id!: Uuid",
"ordinal": 0,
"type_info": "Blob"
},
{
"name": "task_attempt_id!: Uuid",
"ordinal": 1,
"type_info": "Blob"
},
{
"name": "merge_type!: MergeType",
"ordinal": 2,
"type_info": "Text"
},
{
"name": "merge_commit",
"ordinal": 3,
"type_info": "Text"
},
{
"name": "pr_number",
"ordinal": 4,
"type_info": "Integer"
},
{
"name": "pr_url",
"ordinal": 5,
"type_info": "Text"
},
{
"name": "pr_status?: MergeStatus",
"ordinal": 6,
"type_info": "Text"
},
{
"name": "pr_merged_at?: DateTime<Utc>",
"ordinal": 7,
"type_info": "Text"
},
{
"name": "pr_merge_commit_sha",
"ordinal": 8,
"type_info": "Text"
},
{
"name": "created_at!: DateTime<Utc>",
"ordinal": 9,
"type_info": "Text"
},
{
"name": "target_branch_name!: String",
"ordinal": 10,
"type_info": "Text"
}
],
"parameters": {
"Right": 0
},
"nullable": [
true,
false,
false,
true,
true,
true,
false,
true,
true,
false,
false
]
},
"hash": "e45aa1e2282cc62522f66049de7d1d1c47e926000fac7a5c5f28237fdb65a0bb"
}

@@ -1,6 +1,6 @@
{
"db_name": "SQLite",
-"query": "SELECT id AS \"id!: Uuid\",\n task_id AS \"task_id!: Uuid\",\n container_ref,\n branch,\n merge_commit,\n base_branch,\n profile AS \"profile!\",\n pr_url,\n pr_number,\n pr_status,\n pr_merged_at AS \"pr_merged_at: DateTime<Utc>\",\n worktree_deleted AS \"worktree_deleted!: bool\",\n setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n created_at AS \"created_at!: DateTime<Utc>\",\n updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM task_attempts\n WHERE rowid = $1",
+"query": "SELECT id AS \"id!: Uuid\",\n task_id AS \"task_id!: Uuid\",\n container_ref,\n branch,\n base_branch,\n profile AS \"profile!\",\n worktree_deleted AS \"worktree_deleted!: bool\",\n setup_completed_at AS \"setup_completed_at: DateTime<Utc>\",\n created_at AS \"created_at!: DateTime<Utc>\",\n updated_at AS \"updated_at!: DateTime<Utc>\"\n FROM task_attempts\n WHERE id = $1",
"describe": {
"columns": [
{
@@ -24,58 +24,33 @@
"type_info": "Text"
},
{
-"name": "merge_commit",
+"name": "base_branch",
"ordinal": 4,
"type_info": "Text"
},
{
-"name": "base_branch",
+"name": "profile!",
"ordinal": 5,
"type_info": "Text"
},
-{
-"name": "profile!",
-"ordinal": 6,
-"type_info": "Text"
-},
-{
-"name": "pr_url",
-"ordinal": 7,
-"type_info": "Text"
-},
-{
-"name": "pr_number",
-"ordinal": 8,
-"type_info": "Integer"
-},
-{
-"name": "pr_status",
-"ordinal": 9,
-"type_info": "Text"
-},
-{
-"name": "pr_merged_at: DateTime<Utc>",
-"ordinal": 10,
-"type_info": "Datetime"
-},
{
"name": "worktree_deleted!: bool",
-"ordinal": 11,
+"ordinal": 6,
"type_info": "Bool"
},
{
"name": "setup_completed_at: DateTime<Utc>",
-"ordinal": 12,
+"ordinal": 7,
"type_info": "Datetime"
},
{
"name": "created_at!: DateTime<Utc>",
-"ordinal": 13,
+"ordinal": 8,
"type_info": "Text"
},
{
"name": "updated_at!: DateTime<Utc>",
-"ordinal": 14,
+"ordinal": 9,
"type_info": "Text"
}
],
@@ -87,18 +62,13 @@
false,
true,
true,
-true,
false,
true,
-true,
-true,
-true,
-true,
false,
true,
false,
false
]
},
-"hash": "5f44ebd79693cfe8f0eab52c1a41533bb78d340771a3ac178f7745852785c843"
+"hash": "fed05aaa5ad03cc0e9c5f261b48ec194c4a7a2dd05975f60d2c58107b958b8a7"
}

@@ -0,0 +1,78 @@
-- Create enhanced merges table with type-specific columns
CREATE TABLE merges (
id BLOB PRIMARY KEY,
task_attempt_id BLOB NOT NULL,
merge_type TEXT NOT NULL CHECK (merge_type IN ('direct', 'pr')),
-- Direct merge fields (NULL for PR merges)
merge_commit TEXT,
-- PR merge fields (NULL for direct merges)
pr_number INTEGER,
pr_url TEXT,
pr_status TEXT CHECK (pr_status IN ('open', 'merged', 'closed')),
pr_merged_at TEXT,
pr_merge_commit_sha TEXT,
created_at TEXT NOT NULL DEFAULT (datetime('now', 'subsec')),
target_branch_name TEXT NOT NULL,
-- Data integrity constraints
CHECK (
(merge_type = 'direct' AND merge_commit IS NOT NULL
AND pr_number IS NULL AND pr_url IS NULL)
OR
(merge_type = 'pr' AND pr_number IS NOT NULL AND pr_url IS NOT NULL
AND pr_status IS NOT NULL AND merge_commit IS NULL)
),
FOREIGN KEY (task_attempt_id) REFERENCES task_attempts(id) ON DELETE CASCADE
);
-- Create general index for all task_attempt_id queries
CREATE INDEX idx_merges_task_attempt_id ON merges(task_attempt_id);
-- Create index for finding open PRs quickly
CREATE INDEX idx_merges_open_pr ON merges(task_attempt_id, pr_status)
WHERE merge_type = 'pr' AND pr_status = 'open';
-- Migrate existing merge_commit data to new table as direct merges
INSERT INTO merges (id, task_attempt_id, merge_type, merge_commit, created_at, target_branch_name)
SELECT
randomblob(16),
id,
'direct',
merge_commit,
updated_at,
base_branch
FROM task_attempts
WHERE merge_commit IS NOT NULL;
-- Migrate existing PR data from task_attempts to merges
INSERT INTO merges (id, task_attempt_id, merge_type, pr_number, pr_url, pr_status, pr_merged_at, pr_merge_commit_sha, created_at, target_branch_name)
SELECT
randomblob(16),
id,
'pr',
pr_number,
pr_url,
CASE
WHEN pr_status = 'merged' THEN 'merged'
WHEN pr_status = 'closed' THEN 'closed'
ELSE 'open'
END,
pr_merged_at,
NULL, -- We don't have merge_commit for PRs in task_attempts
COALESCE(pr_merged_at, updated_at),
base_branch
FROM task_attempts
WHERE pr_number IS NOT NULL;
-- Drop merge_commit column from task_attempts
ALTER TABLE task_attempts DROP COLUMN merge_commit;
-- Drop PR columns from task_attempts
ALTER TABLE task_attempts DROP COLUMN pr_url;
ALTER TABLE task_attempts DROP COLUMN pr_number;
ALTER TABLE task_attempts DROP COLUMN pr_status;
ALTER TABLE task_attempts DROP COLUMN pr_merged_at;
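The CHECK constraint above makes the two row shapes mutually exclusive: a `direct` row must carry a merge commit and no PR fields, and a `pr` row must carry PR fields and no direct merge commit. As a rough illustration only (SQLite enforces the real rule; this function is not part of the migration), the constraint can be expressed as a pure Rust predicate:

```rust
// Mirrors the merges-table CHECK constraint as a pure predicate.
// Illustrative sketch; the database is the source of truth.
fn check_merge_row(
    merge_type: &str,
    merge_commit: Option<&str>,
    pr_number: Option<i64>,
    pr_url: Option<&str>,
    pr_status: Option<&str>,
) -> bool {
    match merge_type {
        // Direct merges carry a commit and no PR fields.
        "direct" => merge_commit.is_some() && pr_number.is_none() && pr_url.is_none(),
        // PR merges carry PR fields and no direct merge commit.
        "pr" => {
            pr_number.is_some() && pr_url.is_some() && pr_status.is_some() && merge_commit.is_none()
        }
        // Rejected by the merge_type CHECK before this constraint is evaluated.
        _ => false,
    }
}
```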

View File

@@ -0,0 +1,299 @@
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};
use sqlx::{FromRow, SqlitePool, Type};
use ts_rs::TS;
use uuid::Uuid;
#[derive(Debug, Clone, Serialize, Deserialize, TS, Type)]
#[sqlx(type_name = "merge_status", rename_all = "snake_case")]
#[serde(rename_all = "snake_case")]
pub enum MergeStatus {
Open,
Merged,
Closed,
Unknown,
}
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
#[serde(tag = "type", rename_all = "snake_case")]
pub enum Merge {
Direct(DirectMerge),
Pr(PrMerge),
}
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
pub struct DirectMerge {
pub id: Uuid,
pub task_attempt_id: Uuid,
pub merge_commit: String,
pub target_branch_name: String,
pub created_at: DateTime<Utc>,
}
/// PR merge - represents a pull request merge
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
pub struct PrMerge {
pub id: Uuid,
pub task_attempt_id: Uuid,
pub created_at: DateTime<Utc>,
pub target_branch_name: String,
pub pr_info: PullRequestInfo,
}
#[derive(Debug, Clone, Serialize, Deserialize, TS)]
pub struct PullRequestInfo {
pub number: i64,
pub url: String,
pub status: MergeStatus,
pub merged_at: Option<chrono::DateTime<chrono::Utc>>,
pub merge_commit_sha: Option<String>,
}
#[derive(Debug, Clone, Serialize, Deserialize, Type)]
#[sqlx(type_name = "TEXT", rename_all = "snake_case")]
pub enum MergeType {
Direct,
Pr,
}
#[derive(FromRow)]
struct MergeRow {
id: Uuid,
task_attempt_id: Uuid,
merge_type: MergeType,
merge_commit: Option<String>,
target_branch_name: String,
pr_number: Option<i64>,
pr_url: Option<String>,
pr_status: Option<MergeStatus>,
pr_merged_at: Option<DateTime<Utc>>,
pr_merge_commit_sha: Option<String>,
created_at: DateTime<Utc>,
}
impl Merge {
pub fn merge_commit(&self) -> Option<String> {
match self {
Merge::Direct(direct) => Some(direct.merge_commit.clone()),
Merge::Pr(pr) => pr.pr_info.merge_commit_sha.clone(),
}
}
/// Create a direct merge record
pub async fn create_direct(
pool: &SqlitePool,
task_attempt_id: Uuid,
target_branch_name: &str,
merge_commit: &str,
) -> Result<DirectMerge, sqlx::Error> {
let id = Uuid::new_v4();
let now = Utc::now();
sqlx::query_as!(
MergeRow,
r#"INSERT INTO merges (
id, task_attempt_id, merge_type, merge_commit, created_at, target_branch_name
) VALUES ($1, $2, 'direct', $3, $4, $5)
RETURNING
id as "id!: Uuid",
task_attempt_id as "task_attempt_id!: Uuid",
merge_type as "merge_type!: MergeType",
merge_commit,
pr_number,
pr_url,
pr_status as "pr_status?: MergeStatus",
pr_merged_at as "pr_merged_at?: DateTime<Utc>",
pr_merge_commit_sha,
created_at as "created_at!: DateTime<Utc>",
target_branch_name as "target_branch_name!: String"
"#,
id,
task_attempt_id,
merge_commit,
now,
target_branch_name
)
.fetch_one(pool)
.await
.map(Into::into)
}
/// Create a new PR record (when PR is opened)
pub async fn create_pr(
pool: &SqlitePool,
task_attempt_id: Uuid,
target_branch_name: &str,
pr_number: i64,
pr_url: &str,
) -> Result<PrMerge, sqlx::Error> {
let id = Uuid::new_v4();
let now = Utc::now();
sqlx::query_as!(
MergeRow,
r#"INSERT INTO merges (
id, task_attempt_id, merge_type, pr_number, pr_url, pr_status, created_at, target_branch_name
) VALUES ($1, $2, 'pr', $3, $4, 'open', $5, $6)
RETURNING
id as "id!: Uuid",
task_attempt_id as "task_attempt_id!: Uuid",
merge_type as "merge_type!: MergeType",
merge_commit,
pr_number,
pr_url,
pr_status as "pr_status?: MergeStatus",
pr_merged_at as "pr_merged_at?: DateTime<Utc>",
pr_merge_commit_sha,
created_at as "created_at!: DateTime<Utc>",
target_branch_name as "target_branch_name!: String"
"#,
id,
task_attempt_id,
pr_number,
pr_url,
now,
target_branch_name
)
.fetch_one(pool)
.await
.map(Into::into)
}
/// Get all open PRs for monitoring
pub async fn get_open_prs(pool: &SqlitePool) -> Result<Vec<PrMerge>, sqlx::Error> {
let rows = sqlx::query_as!(
MergeRow,
r#"SELECT
id as "id!: Uuid",
task_attempt_id as "task_attempt_id!: Uuid",
merge_type as "merge_type!: MergeType",
merge_commit,
pr_number,
pr_url,
pr_status as "pr_status?: MergeStatus",
pr_merged_at as "pr_merged_at?: DateTime<Utc>",
pr_merge_commit_sha,
created_at as "created_at!: DateTime<Utc>",
target_branch_name as "target_branch_name!: String"
FROM merges
WHERE merge_type = 'pr' AND pr_status = 'open'
ORDER BY created_at DESC"#,
)
.fetch_all(pool)
.await?;
Ok(rows.into_iter().map(Into::into).collect())
}
/// Update PR status for a task attempt
pub async fn update_status(
pool: &SqlitePool,
merge_id: Uuid,
pr_status: MergeStatus,
merge_commit_sha: Option<String>,
) -> Result<(), sqlx::Error> {
let merged_at = if matches!(pr_status, MergeStatus::Merged) {
Some(Utc::now())
} else {
None
};
sqlx::query!(
r#"UPDATE merges
SET pr_status = $1,
pr_merge_commit_sha = $2,
pr_merged_at = $3
WHERE id = $4"#,
pr_status,
merge_commit_sha,
merged_at,
merge_id
)
.execute(pool)
.await?;
Ok(())
}
/// Find all merges for a task attempt (returns both direct and PR merges)
pub async fn find_by_task_attempt_id(
pool: &SqlitePool,
task_attempt_id: Uuid,
) -> Result<Vec<Self>, sqlx::Error> {
// Get raw data from database
let rows = sqlx::query_as!(
MergeRow,
r#"SELECT
id as "id!: Uuid",
task_attempt_id as "task_attempt_id!: Uuid",
merge_type as "merge_type!: MergeType",
merge_commit,
pr_number,
pr_url,
pr_status as "pr_status?: MergeStatus",
pr_merged_at as "pr_merged_at?: DateTime<Utc>",
pr_merge_commit_sha,
target_branch_name as "target_branch_name!: String",
created_at as "created_at!: DateTime<Utc>"
FROM merges
WHERE task_attempt_id = $1
ORDER BY created_at DESC"#,
task_attempt_id
)
.fetch_all(pool)
.await?;
// Convert to appropriate types based on merge_type
Ok(rows.into_iter().map(Into::into).collect())
}
/// Find the most recent merge for a task attempt
pub async fn find_latest_by_task_attempt_id(
pool: &SqlitePool,
task_attempt_id: Uuid,
) -> Result<Option<Self>, sqlx::Error> {
Self::find_by_task_attempt_id(pool, task_attempt_id)
.await
.map(|mut merges| merges.pop())
}
}
// Conversion implementations
impl From<MergeRow> for DirectMerge {
fn from(row: MergeRow) -> Self {
DirectMerge {
id: row.id,
task_attempt_id: row.task_attempt_id,
merge_commit: row
.merge_commit
.expect("direct merge must have merge_commit"),
target_branch_name: row.target_branch_name,
created_at: row.created_at,
}
}
}
impl From<MergeRow> for PrMerge {
fn from(row: MergeRow) -> Self {
PrMerge {
id: row.id,
task_attempt_id: row.task_attempt_id,
target_branch_name: row.target_branch_name,
pr_info: PullRequestInfo {
number: row.pr_number.expect("pr merge must have pr_number"),
url: row.pr_url.expect("pr merge must have pr_url"),
status: row.pr_status.expect("pr merge must have status"),
merged_at: row.pr_merged_at,
merge_commit_sha: row.pr_merge_commit_sha,
},
created_at: row.created_at,
}
}
}
impl From<MergeRow> for Merge {
fn from(row: MergeRow) -> Self {
match row.merge_type {
MergeType::Direct => Merge::Direct(DirectMerge::from(row)),
MergeType::Pr => Merge::Pr(PrMerge::from(row)),
}
}
}
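Downstream diff logic keys off `Merge::merge_commit()`: a direct merge always carries a commit, while a PR merge only does once GitHub reports a merge commit SHA. A stripped-down sketch of that dispatch, with the sqlx/serde/ts-rs derives and most fields omitted (these simplified types are stand-ins, not the real model):

```rust
// Simplified stand-ins for the model types above (derives and DB fields omitted).
struct DirectMerge {
    merge_commit: String,
}

struct PullRequestInfo {
    merge_commit_sha: Option<String>,
}

struct PrMerge {
    pr_info: PullRequestInfo,
}

enum Merge {
    Direct(DirectMerge),
    Pr(PrMerge),
}

impl Merge {
    // Direct merges always have a commit; PR merges only after GitHub reports one.
    fn merge_commit(&self) -> Option<String> {
        match self {
            Merge::Direct(d) => Some(d.merge_commit.clone()),
            Merge::Pr(pr) => pr.pr_info.merge_commit_sha.clone(),
        }
    }
}
```

An open PR therefore yields `None`, which is why the diff-streaming code below falls back to the live worktree diff until the PR is actually merged.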

View File

@@ -2,6 +2,7 @@ pub mod execution_process;
 pub mod execution_process_logs;
 pub mod executor_session;
 pub mod image;
+pub mod merge;
 pub mod project;
 pub mod task;
 pub mod task_attempt;

View File

@@ -101,15 +101,7 @@ impl Task {
           AND ep.run_reason IN ('setupscript','cleanupscript','codingagent')
           LIMIT 1
         ) THEN 1 ELSE 0 END AS "has_in_progress_attempt!: i64",
-        CASE WHEN EXISTS (
-          SELECT 1
-          FROM task_attempts ta
-          WHERE ta.task_id = t.id
-            AND ta.merge_commit IS NOT NULL
-          LIMIT 1
-        ) THEN 1 ELSE 0 END AS "has_merged_attempt!: i64",
         CASE WHEN (
           SELECT ep.status
           FROM task_attempts ta
@@ -149,7 +141,7 @@ ORDER BY t.created_at DESC"#,
             created_at: rec.created_at,
             updated_at: rec.updated_at,
             has_in_progress_attempt: rec.has_in_progress_attempt != 0,
-            has_merged_attempt: rec.has_merged_attempt != 0,
+            has_merged_attempt: false, // TODO use merges table
             last_attempt_failed: rec.last_attempt_failed != 0,
             profile: rec.profile,
         })

View File

@@ -7,40 +7,6 @@ use uuid::Uuid;
 use super::{project::Project, task::Task};
-#[derive(Debug)]
-pub struct PrInfo {
-    pub attempt_id: Uuid,
-    pub task_id: Uuid,
-    pub pr_number: i64,
-    pub repo_owner: String,
-    pub repo_name: String,
-}
-impl PrInfo {
-    pub fn from_task_attempt_data(
-        attempt_id: Uuid,
-        task_id: Uuid,
-        pr_number: i64,
-        pr_url: &str,
-    ) -> Result<Self, sqlx::Error> {
-        let re = regex::Regex::new(r"github\.com/(?P<owner>[^/]+)/(?P<repo>[^/]+)").unwrap();
-        let caps = re
-            .captures(pr_url)
-            .ok_or_else(|| sqlx::Error::ColumnNotFound("Invalid URL format".into()))?;
-        let owner = caps.name("owner").unwrap().as_str().to_string();
-        let repo_name = caps.name("repo").unwrap().as_str().to_string();
-        Ok(Self {
-            attempt_id,
-            task_id,
-            pr_number,
-            repo_owner: owner,
-            repo_name,
-        })
-    }
-}
 #[derive(Debug, Error)]
 pub enum TaskAttemptError {
     #[error(transparent)]
@@ -74,13 +40,8 @@ pub struct TaskAttempt {
     pub container_ref: Option<String>, // Path to a worktree (local), or cloud container id
     pub branch: Option<String>,        // Git branch name for this task attempt
     pub base_branch: String,           // Base branch this attempt is based on
-    pub merge_commit: Option<String>,
     pub profile: String, // Name of the base coding agent to use ("AMP", "CLAUDE_CODE",
                          // "GEMINI", etc.)
-    pub pr_url: Option<String>,    // GitHub PR URL
-    pub pr_number: Option<i64>,    // GitHub PR number
-    pub pr_status: Option<String>, // open, closed, merged
-    pub pr_merged_at: Option<DateTime<Utc>>, // When PR was merged
     pub worktree_deleted: bool, // Flag indicating if worktree has been cleaned up
     pub setup_completed_at: Option<DateTime<Utc>>, // When setup script was last completed
     pub created_at: DateTime<Utc>,
@@ -141,12 +102,7 @@ impl TaskAttempt {
                 container_ref,
                 branch,
                 base_branch,
-                merge_commit,
                 profile AS "profile!",
-                pr_url,
-                pr_number,
-                pr_status,
-                pr_merged_at AS "pr_merged_at: DateTime<Utc>",
                 worktree_deleted AS "worktree_deleted!: bool",
                 setup_completed_at AS "setup_completed_at: DateTime<Utc>",
                 created_at AS "created_at!: DateTime<Utc>",
@@ -166,12 +122,7 @@ impl TaskAttempt {
                 container_ref,
                 branch,
                 base_branch,
-                merge_commit,
                 profile AS "profile!",
-                pr_url,
-                pr_number,
-                pr_status,
-                pr_merged_at AS "pr_merged_at: DateTime<Utc>",
                 worktree_deleted AS "worktree_deleted!: bool",
                 setup_completed_at AS "setup_completed_at: DateTime<Utc>",
                 created_at AS "created_at!: DateTime<Utc>",
@@ -202,12 +153,7 @@ impl TaskAttempt {
                 ta.container_ref,
                 ta.branch,
                 ta.base_branch,
-                ta.merge_commit,
                 ta.profile AS "profile!",
-                ta.pr_url,
-                ta.pr_number,
-                ta.pr_status,
-                ta.pr_merged_at AS "pr_merged_at: DateTime<Utc>",
                 ta.worktree_deleted AS "worktree_deleted!: bool",
                 ta.setup_completed_at AS "setup_completed_at: DateTime<Utc>",
                 ta.created_at AS "created_at!: DateTime<Utc>",
@@ -296,13 +242,8 @@ impl TaskAttempt {
                 task_id AS "task_id!: Uuid",
                 container_ref,
                 branch,
-                merge_commit,
                 base_branch,
                 profile AS "profile!",
-                pr_url,
-                pr_number,
-                pr_status,
-                pr_merged_at AS "pr_merged_at: DateTime<Utc>",
                 worktree_deleted AS "worktree_deleted!: bool",
                 setup_completed_at AS "setup_completed_at: DateTime<Utc>",
                 created_at AS "created_at!: DateTime<Utc>",
@@ -322,13 +263,8 @@ impl TaskAttempt {
                 task_id AS "task_id!: Uuid",
                 container_ref,
                 branch,
-                merge_commit,
                 base_branch,
                 profile AS "profile!",
-                pr_url,
-                pr_number,
-                pr_status,
-                pr_merged_at AS "pr_merged_at: DateTime<Utc>",
                 worktree_deleted AS "worktree_deleted!: bool",
                 setup_completed_at AS "setup_completed_at: DateTime<Utc>",
                 created_at AS "created_at!: DateTime<Utc>",
@@ -341,36 +277,6 @@ impl TaskAttempt {
         .await
     }
-    // pub async fn find_by_task_id(
-    //     pool: &SqlitePool,
-    //     task_id: Uuid,
-    // ) -> Result<Vec<Self>, sqlx::Error> {
-    //     sqlx::query_as!(
-    //         TaskAttempt,
-    //         r#"SELECT id AS "id!: Uuid",
-    //             task_id AS "task_id!: Uuid",
-    //             worktree_path,
-    //             branch,
-    //             base_branch,
-    //             merge_commit,
-    //             executor,
-    //             pr_url,
-    //             pr_number,
-    //             pr_status,
-    //             pr_merged_at AS "pr_merged_at: DateTime<Utc>",
-    //             worktree_deleted AS "worktree_deleted!: bool",
-    //             setup_completed_at AS "setup_completed_at: DateTime<Utc>",
-    //             created_at AS "created_at!: DateTime<Utc>",
-    //             updated_at AS "updated_at!: DateTime<Utc>"
-    //         FROM task_attempts
-    //         WHERE task_id = $1
-    //         ORDER BY created_at DESC"#,
-    //         task_id
-    //     )
-    //     .fetch_all(pool)
-    //     .await
-    // }
     /// Find task attempts by task_id with project git repo path for cleanup operations
     pub async fn find_by_task_id_with_project(
         pool: &SqlitePool,
@@ -481,20 +387,15 @@ impl TaskAttempt {
         // Insert the record into the database
         Ok(sqlx::query_as!(
             TaskAttempt,
-            r#"INSERT INTO task_attempts (id, task_id, container_ref, branch, base_branch, merge_commit, profile, pr_url, pr_number, pr_status, pr_merged_at, worktree_deleted, setup_completed_at)
-               VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13)
-               RETURNING id as "id!: Uuid", task_id as "task_id!: Uuid", container_ref, branch, base_branch, merge_commit, profile as "profile!", pr_url, pr_number, pr_status, pr_merged_at as "pr_merged_at: DateTime<Utc>", worktree_deleted as "worktree_deleted!: bool", setup_completed_at as "setup_completed_at: DateTime<Utc>", created_at as "created_at!: DateTime<Utc>", updated_at as "updated_at!: DateTime<Utc>""#,
+            r#"INSERT INTO task_attempts (id, task_id, container_ref, branch, base_branch, profile, worktree_deleted, setup_completed_at)
+               VALUES ($1, $2, $3, $4, $5, $6, $7, $8)
+               RETURNING id as "id!: Uuid", task_id as "task_id!: Uuid", container_ref, branch, base_branch, profile as "profile!", worktree_deleted as "worktree_deleted!: bool", setup_completed_at as "setup_completed_at: DateTime<Utc>", created_at as "created_at!: DateTime<Utc>", updated_at as "updated_at!: DateTime<Utc>""#,
             attempt_id,
             task_id,
             Option::<String>::None, // Container isn't known yet
             Option::<String>::None, // branch name isn't known yet
             data.base_branch,
-            Option::<String>::None, // merge_commit is always None during creation
             data.profile,
-            Option::<String>::None, // pr_url is None during creation
-            Option::<i64>::None,    // pr_number is None during creation
-            Option::<String>::None, // pr_status is None during creation
-            Option::<DateTime<Utc>>::None, // pr_merged_at is None during creation
             false, // worktree_deleted is false during creation
             Option::<DateTime<Utc>>::None // setup_completed_at is None during creation
         )
@@ -502,23 +403,6 @@ impl TaskAttempt {
         .await?)
     }
-    /// Update the task attempt with the merge commit ID
-    pub async fn update_merge_commit(
-        pool: &SqlitePool,
-        attempt_id: Uuid,
-        merge_commit_id: &str,
-    ) -> Result<(), TaskAttemptError> {
-        sqlx::query!(
-            "UPDATE task_attempts SET merge_commit = $1, updated_at = datetime('now') WHERE id = $2",
-            merge_commit_id,
-            attempt_id
-        )
-        .execute(pool)
-        .await?;
-        Ok(())
-    }
     pub async fn update_base_branch(
         pool: &SqlitePool,
         attempt_id: Uuid,
@@ -535,27 +419,6 @@ impl TaskAttempt {
         Ok(())
     }
-    /// Update PR status for a task attempt
-    pub async fn update_pr_status(
-        pool: &SqlitePool,
-        attempt_id: Uuid,
-        pr_url: String,
-        pr_number: i64,
-        pr_status: String,
-    ) -> Result<(), sqlx::Error> {
-        sqlx::query!(
-            "UPDATE task_attempts SET pr_url = $1, pr_number = $2, pr_status = $3, updated_at = datetime('now') WHERE id = $4",
-            pr_url,
-            pr_number,
-            pr_status,
-            attempt_id
-        )
-        .execute(pool)
-        .await?;
-        Ok(())
-    }
     pub async fn resolve_container_ref(
         pool: &SqlitePool,
         container_ref: &str,
@@ -575,24 +438,4 @@ impl TaskAttempt {
         Ok((result.attempt_id, result.task_id, result.project_id))
     }
-    pub async fn get_open_prs(pool: &SqlitePool) -> Result<Vec<PrInfo>, sqlx::Error> {
-        let rows = sqlx::query!(
-            r#"SELECT
-                ta.id as "attempt_id!: Uuid",
-                ta.task_id as "task_id!: Uuid",
-                ta.pr_number as "pr_number!: i64",
-                ta.pr_url as "pr_url!: String"
-            FROM task_attempts ta
-            WHERE ta.pr_status = 'open' AND ta.pr_number IS NOT NULL"#
-        )
-        .fetch_all(pool)
-        .await?;
-        Ok(rows
-            .into_iter()
-            .filter_map(|r| {
-                PrInfo::from_task_attempt_data(r.attempt_id, r.task_id, r.pr_number, &r.pr_url).ok()
-            })
-            .collect())
-    }
 }
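The removed `PrInfo::from_task_attempt_data` pulled the repo owner and name out of the PR URL with a regex; that responsibility now lives in the structured `get_github_repo_info` call. For reference, the same extraction can be done with std-only string splitting (`parse_github_repo` is a hypothetical name for illustration, not code from this PR):

```rust
// Std-only sketch of the owner/repo extraction the removed regex performed.
fn parse_github_repo(pr_url: &str) -> Option<(String, String)> {
    // Find the "github.com/" marker, then take the next two path segments.
    let rest = pr_url.split("github.com/").nth(1)?;
    let mut parts = rest.split('/');
    let owner = parts.next().filter(|s| !s.is_empty())?.to_string();
    let repo = parts.next().filter(|s| !s.is_empty())?.to_string();
    Some((owner, repo))
}
```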

View File

@@ -18,6 +18,7 @@ use db::{
         ExecutionContext, ExecutionProcess, ExecutionProcessRunReason, ExecutionProcessStatus,
     },
     executor_session::ExecutorSession,
+    merge::Merge,
     project::Project,
     task::{Task, TaskStatus},
     task_attempt::TaskAttempt,
@@ -813,6 +814,20 @@ impl ContainerService for LocalContainerService {
         Ok(container_ref.to_string())
     }
+    async fn is_container_clean(&self, task_attempt: &TaskAttempt) -> Result<bool, ContainerError> {
+        if let Some(container_ref) = &task_attempt.container_ref {
+            // If container_ref is set, check if the worktree exists
+            let path = PathBuf::from(container_ref);
+            if path.exists() {
+                self.git().is_worktree_clean(&path).map_err(|e| e.into())
+            } else {
+                return Ok(true); // No worktree means it's clean
+            }
+        } else {
+            return Ok(true); // No container_ref means no worktree, so it's clean
+        }
+    }
     async fn start_execution_inner(
         &self,
         task_attempt: &TaskAttempt,
@@ -904,16 +919,9 @@ impl ContainerService for LocalContainerService {
         task_attempt: &TaskAttempt,
     ) -> Result<futures::stream::BoxStream<'static, Result<Event, std::io::Error>>, ContainerError>
     {
-        let container_ref = self.ensure_container_exists(task_attempt).await?;
-        let worktree_path = PathBuf::from(container_ref);
         let project_repo_path = self.get_project_repo_path(task_attempt).await?;
-        // Handle merged attempts (static diff)
-        if let Some(merge_commit_id) = &task_attempt.merge_commit {
-            return self.create_merged_diff_stream(&project_repo_path, merge_commit_id);
-        }
+        let latest_merge =
+            Merge::find_latest_by_task_attempt_id(&self.db.pool, task_attempt.id).await?;
         let task_branch = task_attempt
             .branch
             .clone()
@@ -922,6 +930,29 @@ impl ContainerService for LocalContainerService {
             task_attempt.id
         )))?;
+        let is_ahead = if let Ok((ahead, _)) = self.git().get_local_branch_status(
+            &project_repo_path,
+            &task_branch,
+            &task_attempt.base_branch,
+        ) {
+            ahead > 0
+        } else {
+            false
+        };
+        // Show merged diff when no new work is on the branch or container
+        if let Some(merge) = &latest_merge
+            && let Some(commit) = merge.merge_commit()
+            && self.is_container_clean(task_attempt).await?
+            && !is_ahead
+        {
+            return self.create_merged_diff_stream(&project_repo_path, &commit);
+        }
+        // worktree is needed for non-merged diffs
+        let container_ref = self.ensure_container_exists(task_attempt).await?;
+        let worktree_path = PathBuf::from(container_ref);
         // Handle ongoing attempts (live streaming diff)
         self.create_live_diff_stream(
             &project_repo_path,
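Condensed, the new gating serves the static merged diff only when there is a merge commit, the worktree is clean, and the branch has no commits ahead of its base. A pure-function sketch of that predicate (the inputs stand in for the service's async lookups; this is not the actual method):

```rust
// Decide whether to serve the static merged diff instead of a live worktree diff.
// Inputs are simplified stand-ins for the service's async lookups.
fn show_merged_diff(
    latest_merge_commit: Option<&str>, // commit from the latest merge, if any
    container_clean: bool,             // no uncommitted changes in the worktree
    commits_ahead: usize,              // branch commits not yet in the base branch
) -> bool {
    latest_merge_commit.is_some() && container_clean && commits_ahead == 0
}
```

Any new commit or uncommitted change after a merge flips the predicate to false, which is what lets a second merge show a fresh live diff.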

View File

@@ -54,7 +54,6 @@ fn generate_types_content() -> String {
         server::routes::auth::DevicePollStatus::decl(),
         server::routes::auth::CheckTokenResponse::decl(),
         services::services::git::GitBranch::decl(),
-        services::services::git::BranchStatus::decl(),
         utils::diff::Diff::decl(),
         utils::diff::FileDiffDetails::decl(),
         services::services::github_service::RepositoryInfo::decl(),
@@ -73,10 +72,16 @@ fn generate_types_content() -> String {
         executors::actions::coding_agent_follow_up::CodingAgentFollowUpRequest::decl(),
         server::routes::task_attempts::CreateTaskAttemptBody::decl(),
         server::routes::task_attempts::RebaseTaskAttemptRequest::decl(),
+        server::routes::task_attempts::BranchStatus::decl(),
         db::models::task_attempt::TaskAttempt::decl(),
         db::models::execution_process::ExecutionProcess::decl(),
         db::models::execution_process::ExecutionProcessStatus::decl(),
         db::models::execution_process::ExecutionProcessRunReason::decl(),
+        db::models::merge::Merge::decl(),
+        db::models::merge::DirectMerge::decl(),
+        db::models::merge::PrMerge::decl(),
+        db::models::merge::MergeStatus::decl(),
+        db::models::merge::PullRequestInfo::decl(),
         services::services::events::EventPatch::decl(),
         services::services::events::EventPatchInner::decl(),
         services::services::events::RecordTypes::decl(),

View File

@@ -12,6 +12,8 @@ use axum::{
 use db::models::{
     execution_process::{ExecutionProcess, ExecutionProcessRunReason},
     image::TaskImage,
+    merge::{Merge, MergeStatus, PrMerge, PullRequestInfo},
+    project::{Project, ProjectError},
     task::{Task, TaskStatus},
     task_attempt::{CreateTaskAttempt, TaskAttempt, TaskAttemptError},
 };
@@ -25,11 +27,11 @@ use executors::{
     profile::{ProfileConfigs, ProfileVariantLabel},
 };
 use futures_util::TryStreamExt;
-use local_deployment::container;
 use serde::{Deserialize, Serialize};
 use services::services::{
     container::ContainerService,
-    git::BranchStatus,
-    github_service::{CreatePrRequest, GitHubRepoInfo, GitHubService, GitHubServiceError},
+    github_service::{CreatePrRequest, GitHubService, GitHubServiceError},
     image::ImageService,
 };
 use sqlx::Error as SqlxError;
@@ -324,7 +326,13 @@
         &commit_message,
     )?;
-    TaskAttempt::update_merge_commit(pool, task_attempt.id, &merge_commit_id).await?;
+    Merge::create_direct(
+        pool,
+        task_attempt.id,
+        &ctx.task_attempt.base_branch,
+        &merge_commit_id,
+    )
+    .await?;
     Task::update_status(pool, ctx.task.id, TaskStatus::Done).await?;
     deployment
@@ -358,15 +366,11 @@ pub async fn push_task_attempt_branch(
         .parent_task(pool)
         .await?
         .ok_or(ApiError::TaskAttempt(TaskAttemptError::TaskNotFound))?;
-    let ctx = TaskAttempt::load_context(pool, task_attempt.id, task.id, task.project_id).await?;
-    let container_ref = deployment
-        .container()
-        .ensure_container_exists(&task_attempt)
-        .await?;
-    let worktree_path = std::path::Path::new(&container_ref);
-    let branch_name = ctx.task_attempt.branch.as_ref().ok_or_else(|| {
+    let project = Project::find_by_id(pool, task.project_id)
+        .await?
+        .ok_or(ApiError::Project(ProjectError::ProjectNotFound))?;
+    let branch_name = task_attempt.branch.as_ref().ok_or_else(|| {
         ApiError::TaskAttempt(TaskAttemptError::ValidationError(
             "No branch found for task attempt".to_string(),
         ))
@@ -374,7 +378,7 @@
     deployment
         .git()
-        .push_to_github(worktree_path, branch_name, &github_token)?;
+        .push_to_github(&project.git_repo_path, branch_name, &github_token)?;
     Ok(ResponseJson(ApiResponse::success(())))
 }
@@ -417,32 +421,27 @@ pub async fn create_github_pr(
         .parent_task(pool)
         .await?
         .ok_or(ApiError::TaskAttempt(TaskAttemptError::TaskNotFound))?;
-    let ctx = TaskAttempt::load_context(pool, task_attempt.id, task.id, task.project_id).await?;
-    // Ensure worktree exists (recreate if needed for cold task support)
-    let container_ref = deployment
-        .container()
-        .ensure_container_exists(&task_attempt)
-        .await?;
-    let worktree_path = std::path::Path::new(&container_ref);
+    let project = Project::find_by_id(pool, task.project_id)
+        .await?
+        .ok_or(ApiError::Project(ProjectError::ProjectNotFound))?;
     // Use GitService to get the remote URL, then create GitHubRepoInfo
-    let (owner, repo_name) = deployment
+    let repo_info = deployment
         .git()
-        .get_github_repo_info(&ctx.project.git_repo_path)?;
-    let repo_info = GitHubRepoInfo { owner, repo_name };
+        .get_github_repo_info(&project.git_repo_path)?;
     // Get branch name from task attempt
-    let branch_name = ctx.task_attempt.branch.as_ref().ok_or_else(|| {
+    let branch_name = task_attempt.branch.as_ref().ok_or_else(|| {
         ApiError::TaskAttempt(TaskAttemptError::ValidationError(
             "No branch found for task attempt".to_string(),
         ))
     })?;
     // Push the branch to GitHub first
-    if let Err(e) = deployment
-        .git()
-        .push_to_github(worktree_path, branch_name, &github_token)
+    if let Err(e) =
+        deployment
+            .git()
+            .push_to_github(&project.git_repo_path, branch_name, &github_token)
     {
         tracing::error!("Failed to push branch to GitHub: {}", e);
         let gh_e = GitHubServiceError::from(e);
@@ -450,7 +449,7 @@ pub async fn create_github_pr(
             return Ok(ResponseJson(ApiResponse::error_with_data(gh_e)));
         } else {
             return Ok(ResponseJson(ApiResponse::error(
-                "Failed to push branch to GitHub",
+                format!("Failed to push branch to GitHub: {}", gh_e).as_str(),
             )));
         }
     }
@@ -465,12 +464,12 @@ pub async fn create_github_pr(
     match github_service.create_pr(&repo_info, &pr_request).await {
         Ok(pr_info) => {
             // Update the task attempt with PR information
-            if let Err(e) = TaskAttempt::update_pr_status(
+            if let Err(e) = Merge::create_pr(
                 pool,
                 task_attempt.id,
-                pr_info.url.clone(),
+                &base_branch,
                 pr_info.number,
-                pr_info.status.clone(),
+                &pr_info.url,
             )
             .await
             {
@@ -481,8 +480,8 @@ pub async fn create_github_pr(
                 .track_if_analytics_allowed(
                     "github_pr_created",
                     serde_json::json!({
-                        "task_id": ctx.task.id.to_string(),
-                        "project_id": ctx.project.id.to_string(),
+                        "task_id": task.id.to_string(),
+                        "project_id": project.id.to_string(),
                         "attempt_id": task_attempt.id.to_string(),
                     }),
                 )
@@ -499,7 +498,9 @@ pub async fn create_github_pr(
             if e.is_api_data() {
                 Ok(ResponseJson(ApiResponse::error_with_data(e)))
             } else {
-                Ok(ResponseJson(ApiResponse::error("Failed to create PR")))
+                Ok(ResponseJson(ApiResponse::error(
+                    format!("Failed to create PR: {}", e).as_str(),
+                )))
             }
         }
     }
@@ -563,6 +564,17 @@ pub async fn open_task_attempt_in_editor(
     }
 }
+#[derive(Debug, Clone, Serialize, Deserialize, TS)]
+pub struct BranchStatus {
+    pub commits_behind: Option<usize>,
+    pub commits_ahead: Option<usize>,
+    pub has_uncommitted_changes: Option<bool>,
+    pub base_branch_name: String,
+    pub remote_commits_behind: Option<usize>,
+    pub remote_commits_ahead: Option<usize>,
+    pub merges: Vec<Merge>,
+}
 pub async fn get_task_attempt_branch_status(
     Extension(task_attempt): Extension<TaskAttempt>,
     State(deployment): State<DeploymentImpl>,
@@ -574,30 +586,60 @@ pub async fn get_task_attempt_branch_status(
         .await?
         .ok_or(ApiError::TaskAttempt(TaskAttemptError::TaskNotFound))?;
     let ctx = TaskAttempt::load_context(pool, task_attempt.id, task.id, task.project_id).await?;
-    let github_config = deployment.config().read().await.github.clone();
-    let branch_status = deployment
-        .git()
-        .get_branch_status(
-            &ctx.project.git_repo_path,
-            ctx.task_attempt.branch.as_ref().ok_or_else(|| {
-                ApiError::TaskAttempt(TaskAttemptError::ValidationError(
-                    "No branch found for task attempt".to_string(),
-                ))
-            })?,
-            &ctx.task_attempt.base_branch,
-            ctx.task_attempt.merge_commit.is_some(),
-            github_config.token(),
-        )
-        .map_err(|e| {
-            tracing::error!(
-                "Failed to get branch status for task attempt {}: {}",
-                task_attempt.id,
-                e
-            );
-            ApiError::GitService(e)
-        })?;
+    let has_uncommitted_changes = deployment
+        .container()
+        .is_container_clean(&task_attempt)
+        .await
+        .ok()
+        .map(|is_clean| !is_clean);
+    let task_branch = task_attempt
+        .branch
+        .ok_or(ApiError::TaskAttempt(TaskAttemptError::ValidationError(
+            "No branch found for task attempt".to_string(),
+        )))?;
+
+    let (commits_ahead, commits_behind) = deployment.git().get_local_branch_status(
+        &ctx.project.git_repo_path,
+        &task_branch,
+        &task_attempt.base_branch,
+    )?;
+
+    // Fetch merges for this task attempt and add to branch status
+    let merges = Merge::find_by_task_attempt_id(pool, task_attempt.id).await?;
+    let mut branch_status = BranchStatus {
+        commits_ahead: Some(commits_ahead),
+        commits_behind: Some(commits_behind),
+        has_uncommitted_changes,
+        remote_commits_ahead: None,
+        remote_commits_behind: None,
+        merges,
+        base_branch_name: task_attempt.base_branch.clone(),
+    };
+
+    // check remote status if the attempt has an open PR
+    if branch_status.merges.first().is_some_and(|m| {
+        matches!(
+            m,
+            Merge::Pr(PrMerge {
+                pr_info: PullRequestInfo {
+                    status: MergeStatus::Open,
+                    ..
+                },
+                ..
+            })
+        )
+    }) {
+        let github_config = deployment.config().read().await.github.clone();
+        let token = github_config
+            .token()
+            .ok_or(ApiError::GitHubService(GitHubServiceError::TokenInvalid))?;
+        let (remote_commits_ahead, remote_commits_behind) = deployment
+            .git()
+            .get_remote_branch_status(&ctx.project.git_repo_path, &task_branch, token)?;
+        branch_status.remote_commits_ahead = Some(remote_commits_ahead);
+        branch_status.remote_commits_behind = Some(remote_commits_behind);
+    }
+
     Ok(ResponseJson(ApiResponse::success(branch_status)))
 }
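The ahead/behind counts surfaced by this endpoint come from `git2`'s `graph_ahead_behind`, which compares the commits reachable from the task branch tip against those reachable from the base branch tip. A minimal sketch of that semantics over a toy linear history (the `ahead_behind` helper and commit names here are illustrative only, not part of the PR):

```rust
use std::collections::HashSet;

// ahead  = commits reachable from the task branch but not from base
// behind = commits reachable from base but not from the task branch
fn ahead_behind(branch: &[&str], base: &[&str]) -> (usize, usize) {
    let base_set: HashSet<&str> = base.iter().copied().collect();
    let branch_set: HashSet<&str> = branch.iter().copied().collect();
    let ahead = branch.iter().filter(|c| !base_set.contains(*c)).count();
    let behind = base.iter().filter(|c| !branch_set.contains(*c)).count();
    (ahead, behind)
}

fn main() {
    // Task branch added t1, t2 on top of m1; base moved on by m2.
    assert_eq!(ahead_behind(&["m1", "t1", "t2"], &["m1", "m2"]), (2, 1));
    assert_eq!(ahead_behind(&["m1"], &["m1"]), (0, 0));
}
```

After a merge, the handler above recomputes these counts against the squash commit, which is why a second merge becomes possible once new commits land.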

View File

@@ -110,6 +110,7 @@ pub trait ContainerService {
         &self,
         task_attempt: &TaskAttempt,
     ) -> Result<ContainerRef, ContainerError>;
+    async fn is_container_clean(&self, task_attempt: &TaskAttempt) -> Result<bool, ContainerError>;
     async fn start_execution_inner(
         &self,

View File

@@ -3,16 +3,17 @@ use std::{collections::HashMap, path::Path};
 use chrono::{DateTime, Utc};
 use git2::{
     BranchType, CherrypickOptions, Delta, DiffFindOptions, DiffOptions, Error as GitError,
-    FetchOptions, Repository, Sort, Status, StatusOptions, build::CheckoutBuilder,
+    FetchOptions, Repository, Sort, build::CheckoutBuilder,
 };
 use regex;
-use serde::{Deserialize, Serialize};
+use serde::Serialize;
 use thiserror::Error;
 use ts_rs::TS;
 use utils::diff::{Diff, FileDiffDetails};

 // Import for file ranking functionality
 use super::file_ranker::FileStat;
+use crate::services::github_service::GitHubRepoInfo;

 #[derive(Debug, Error)]
 pub enum GitServiceError {
@@ -28,8 +29,8 @@ pub enum GitServiceError {
     MergeConflicts(String),
     #[error("Invalid path: {0}")]
     InvalidPath(String),
-    #[error("Worktree has uncommitted changes: {0}")]
-    WorktreeDirty(String),
+    #[error("{0} has uncommitted changes: {1}")]
+    WorktreeDirty(String, String),
     #[error("Invalid file paths: {0}")]
     InvalidFilePaths(String),
     #[error("No GitHub token available.")]
@@ -55,19 +56,6 @@ pub struct HeadInfo {
     pub oid: String,
 }

-#[derive(Debug, Clone, Serialize, Deserialize, TS)]
-pub struct BranchStatus {
-    pub commits_behind: Option<usize>,
-    pub commits_ahead: Option<usize>,
-    pub up_to_date: Option<bool>,
-    pub merged: bool,
-    pub has_uncommitted_changes: bool,
-    pub base_branch_name: String,
-    pub remote_commits_behind: Option<usize>,
-    pub remote_commits_ahead: Option<usize>,
-    pub remote_up_to_date: Option<bool>,
-}

 /// Target for diff generation
 pub enum DiffTarget<'p> {
     /// Work-in-progress branch checked out in this worktree
@@ -501,10 +489,12 @@ impl GitService {
         commit_message: &str,
     ) -> Result<String, GitServiceError> {
         // Open the worktree repository
-        let worktree_repo = Repository::open(worktree_path)?;
+        let worktree_repo = self.open_repo(worktree_path)?;
+        let main_repo = self.open_repo(repo_path)?;

         // Check if worktree is dirty before proceeding
         self.check_worktree_clean(&worktree_repo)?;
+        self.check_worktree_clean(&main_repo)?;

         // Verify the task branch exists in the worktree
         let task_branch = worktree_repo
@@ -533,8 +523,17 @@ impl GitService {
             base_branch_name,
         )?;

+        // Reset the task branch to point to the squash commit
+        // This allows follow-up work to continue from the merged state without conflicts
+        let task_refname = format!("refs/heads/{branch_name}");
+        main_repo.reference(
+            &task_refname,
+            squash_commit_id,
+            true,
+            "Reset task branch after merge in main repo",
+        )?;
+
         // Fix: Update main repo's HEAD if it's pointing to the base branch
-        let main_repo = self.open_repo(repo_path)?;
         let refname = format!("refs/heads/{base_branch_name}");
         if let Ok(main_head) = main_repo.head()
@@ -551,14 +550,36 @@ impl GitService {
         Ok(squash_commit_id.to_string())
     }

-    pub fn get_branch_status(
+    pub fn get_local_branch_status(
         &self,
         repo_path: &Path,
         branch_name: &str,
         base_branch_name: &str,
-        is_merged: bool,
-        github_token: Option<String>,
-    ) -> Result<BranchStatus, GitServiceError> {
+    ) -> Result<(usize, usize), GitServiceError> {
+        let repo = Repository::open(repo_path)?;
+        let branch_ref = repo
+            // try "refs/heads/<name>" first, then raw name
+            .find_reference(&format!("refs/heads/{branch_name}"))
+            .or_else(|_| repo.find_reference(branch_name))?;
+        let branch_oid = branch_ref.target().unwrap();
+
+        // Calculate ahead/behind counts using the stored base branch
+        let base_oid = repo
+            .find_branch(base_branch_name, BranchType::Local)?
+            .get()
+            .target()
+            .ok_or(GitServiceError::BranchNotFound(format!(
+                "refs/heads/{base_branch_name}"
+            )))?;
+        let (a, b) = repo.graph_ahead_behind(branch_oid, base_oid)?;
+        Ok((a, b))
+    }
+
+    pub fn get_remote_branch_status(
+        &self,
+        repo_path: &Path,
+        branch_name: &str,
+        github_token: String,
+    ) -> Result<(usize, usize), GitServiceError> {
         let repo = Repository::open(repo_path)?;
         let branch_ref = repo
@@ -566,55 +587,25 @@ impl GitService {
             .find_reference(&format!("refs/heads/{branch_name}"))
             .or_else(|_| repo.find_reference(branch_name))?;
         let branch_oid = branch_ref.target().unwrap();

         // Check for unpushed commits by comparing with origin/branch_name
-        let (remote_commits_ahead, remote_commits_behind, remote_up_to_date) = if let Some(token) =
-            github_token
-            && self.fetch_from_remote(&repo, &token).is_ok()
-            && let Ok(remote_ref) =
-                repo.find_reference(&format!("refs/remotes/origin/{branch_name}"))
-            && let Some(remote_oid) = remote_ref.target()
-        {
-            let (a, b) = repo.graph_ahead_behind(branch_oid, remote_oid)?;
-            (Some(a), Some(b), Some(a == 0 && b == 0))
-        } else {
-            (None, None, None)
-        };
-
-        // Calculate ahead/behind counts using the stored base branch
-        let (commits_ahead, commits_behind, up_to_date) = if let Ok(base_branch) =
-            repo.find_branch(base_branch_name, BranchType::Local)
-            && let Some(base_oid) = base_branch.get().target()
-        {
-            let (a, b) = repo.graph_ahead_behind(branch_oid, base_oid)?;
-            (Some(a), Some(b), Some(a == 0 && b == 0))
-        } else {
-            // Base branch doesn't exist, assume no relationship
-            (None, None, None)
-        };
-
-        let mut status_opts = StatusOptions::new();
-        status_opts
-            .include_untracked(true)
-            .recurse_untracked_dirs(true)
-            .include_ignored(false);
-        let has_uncommitted_changes = repo
-            .statuses(Some(&mut status_opts))?
-            .iter()
-            .any(|e| e.status() != Status::CURRENT);
-
-        Ok(BranchStatus {
-            commits_behind,
-            commits_ahead,
-            up_to_date,
-            merged: is_merged,
-            has_uncommitted_changes,
-            base_branch_name: base_branch_name.to_string(),
-            remote_commits_behind,
-            remote_commits_ahead,
-            remote_up_to_date,
-        })
+        self.fetch_from_remote(&repo, &github_token)?;
+        let remote_oid = repo
+            .find_reference(&format!("refs/remotes/origin/{branch_name}"))?
+            .target()
+            .ok_or(GitServiceError::BranchNotFound(format!(
+                "origin/{branch_name}"
+            )))?;
+        let (a, b) = repo.graph_ahead_behind(branch_oid, remote_oid)?;
+        Ok((a, b))
+    }
+
+    pub fn is_worktree_clean(&self, worktree_path: &Path) -> Result<bool, GitServiceError> {
+        let repo = self.open_repo(worktree_path)?;
+        match self.check_worktree_clean(&repo) {
+            Ok(()) => Ok(true),
+            Err(GitServiceError::WorktreeDirty(_, _)) => Ok(false),
+            Err(e) => Err(e),
+        }
     }

     /// Check if the worktree is clean (no uncommitted changes to tracked files)
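The new `is_worktree_clean` helper converts exactly one error variant into `Ok(false)` while propagating everything else. A stripped-down sketch of that pattern with a stand-in error type (`DemoError` and `check_clean` are hypothetical; the real code matches `GitServiceError::WorktreeDirty`):

```rust
#[derive(Debug)]
enum DemoError {
    WorktreeDirty(String, String), // branch name, dirty file list
    Io(String),                    // any unrelated failure
}

fn check_clean(dirty_files: &[&str]) -> Result<(), DemoError> {
    if dirty_files.is_empty() {
        Ok(())
    } else {
        Err(DemoError::WorktreeDirty(
            "task-branch".to_string(),
            dirty_files.join(", "),
        ))
    }
}

// A dirty worktree becomes Ok(false); unrelated errors still propagate as Err.
fn is_clean(dirty_files: &[&str]) -> Result<bool, DemoError> {
    match check_clean(dirty_files) {
        Ok(()) => Ok(true),
        Err(DemoError::WorktreeDirty(_, _)) => Ok(false),
        Err(e) => Err(e),
    }
}

fn main() {
    assert!(matches!(is_clean(&[]), Ok(true)));
    assert!(matches!(is_clean(&["src/main.rs"]), Ok(false)));
}
```

This keeps "dirty" as an ordinary boolean answer for the container service while real I/O or repository errors remain fallible.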
@@ -648,7 +639,15 @@ impl GitService {
         }

         if !dirty_files.is_empty() {
-            return Err(GitServiceError::WorktreeDirty(dirty_files.join(", ")));
+            let branch_name = repo
+                .head()
+                .ok()
+                .and_then(|h| h.shorthand().map(|s| s.to_string()))
+                .unwrap_or_else(|| "unknown branch".to_string());
+            return Err(GitServiceError::WorktreeDirty(
+                branch_name,
+                dirty_files.join(", "),
+            ));
         }
     }
@@ -1008,7 +1007,7 @@ impl GitService {
     pub fn get_github_repo_info(
         &self,
         repo_path: &Path,
-    ) -> Result<(String, String), GitServiceError> {
+    ) -> Result<GitHubRepoInfo, GitServiceError> {
         let repo = self.open_repo(repo_path)?;
         let remote = repo.find_remote("origin").map_err(|_| {
             GitServiceError::InvalidRepository("No 'origin' remote found".to_string())
@@ -1025,7 +1024,7 @@ impl GitService {
         if let Some(captures) = github_regex.captures(url) {
             let owner = captures.get(1).unwrap().as_str().to_string();
             let repo_name = captures.get(2).unwrap().as_str().to_string();
-            Ok((owner, repo_name))
+            Ok(GitHubRepoInfo { owner, repo_name })
         } else {
             Err(GitServiceError::InvalidRepository(format!(
                 "Not a GitHub repository: {url}"
@@ -1041,6 +1040,7 @@ impl GitService {
         github_token: &str,
     ) -> Result<(), GitServiceError> {
         let repo = Repository::open(worktree_path)?;
+        self.check_worktree_clean(&repo)?;

         // Get the remote
         let remote = repo.find_remote("origin")?;

View File

@@ -1,6 +1,7 @@
 use std::time::Duration;

 use backon::{ExponentialBuilder, Retryable};
+use db::models::merge::{MergeStatus, PullRequestInfo};
 use octocrab::{Octocrab, OctocrabBuilder};
 use serde::{Deserialize, Serialize};
 use thiserror::Error;
@@ -97,6 +98,19 @@ pub struct GitHubRepoInfo {
     pub owner: String,
     pub repo_name: String,
 }

+impl GitHubRepoInfo {
+    pub fn from_pr_url(pr_url: &str) -> Result<Self, sqlx::Error> {
+        let re = regex::Regex::new(r"github\.com/(?P<owner>[^/]+)/(?P<repo>[^/]+)").unwrap();
+        let caps = re
+            .captures(pr_url)
+            .ok_or_else(|| sqlx::Error::ColumnNotFound("Invalid URL format".into()))?;
+        let owner = caps.name("owner").unwrap().as_str().to_string();
+        let repo_name = caps.name("repo").unwrap().as_str().to_string();
+        Ok(Self { owner, repo_name })
+    }
+}
+
 #[derive(Debug, Clone)]
 pub struct CreatePrRequest {
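`from_pr_url` above uses the `regex` crate; the same owner/repo extraction can be sketched with plain string splitting. This `parse_repo` helper is illustrative only (not part of the PR) and assumes the URL contains a `github.com/<owner>/<repo>` segment, as the regex does:

```rust
fn parse_repo(pr_url: &str) -> Option<(String, String)> {
    // Everything after "github.com/" should start with "<owner>/<repo>/..."
    let rest = pr_url.split("github.com/").nth(1)?;
    let mut parts = rest.split('/');
    let owner = parts.next()?.to_string();
    let repo_name = parts.next()?.to_string();
    if owner.is_empty() || repo_name.is_empty() {
        return None;
    }
    Some((owner, repo_name))
}

fn main() {
    let parsed = parse_repo("https://github.com/acme/widgets/pull/42");
    assert_eq!(parsed, Some(("acme".to_string(), "widgets".to_string())));
    assert_eq!(parse_repo("https://example.com/x"), None);
}
```

Deriving the repository from the stored PR URL is what lets the merges table drop the separate `repo_owner`/`repo_name` columns the PR monitor used before.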
@@ -106,16 +120,6 @@ pub struct CreatePrRequest {
     pub base_branch: String,
 }

-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct PullRequestInfo {
-    pub number: i64,
-    pub url: String,
-    pub status: String,
-    pub merged: bool,
-    pub merged_at: Option<chrono::DateTime<chrono::Utc>>,
-    pub merge_commit_sha: Option<String>,
-}

 #[derive(Debug, Clone, Serialize, Deserialize, TS)]
 pub struct RepositoryInfo {
     pub id: i64,
@@ -163,7 +167,10 @@ impl GitHubService {
             .with_max_times(3)
             .with_jitter(),
     )
-    .when(|e| !matches!(e, GitHubServiceError::TokenInvalid))
+    .when(|e| {
+        !matches!(e, GitHubServiceError::TokenInvalid)
+            && !matches!(e, GitHubServiceError::Branch(_))
+    })
     .notify(|err: &GitHubServiceError, dur: Duration| {
         tracing::warn!(
             "GitHub API call failed, retrying after {:.2}s: {}",
@@ -255,8 +262,7 @@ impl GitHubService {
         let pr_info = PullRequestInfo {
             number: pr.number as i64,
             url: pr.html_url.map(|url| url.to_string()).unwrap_or_default(),
-            status: "open".to_string(),
-            merged: false,
+            status: MergeStatus::Open,
             merged_at: None,
             merge_commit_sha: None,
         };
@@ -309,23 +315,22 @@ impl GitHubService {
         })?;

         let status = match pr.state {
-            Some(octocrab::models::IssueState::Open) => "open",
+            Some(octocrab::models::IssueState::Open) => MergeStatus::Open,
             Some(octocrab::models::IssueState::Closed) => {
                 if pr.merged_at.is_some() {
-                    "merged"
+                    MergeStatus::Merged
                 } else {
-                    "closed"
+                    MergeStatus::Closed
                 }
             }
-            None => "unknown",
-            Some(_) => "unknown", // Handle any other states
+            None => MergeStatus::Unknown,
+            Some(_) => MergeStatus::Unknown,
         };

         let pr_info = PullRequestInfo {
             number: pr.number as i64,
             url: pr.html_url.map(|url| url.to_string()).unwrap_or_default(),
-            status: status.to_string(),
-            merged: pr.merged_at.is_some(),
+            status,
             merged_at: pr.merged_at.map(|dt| dt.naive_utc().and_utc()),
             merge_commit_sha: pr.merge_commit_sha.clone(),
         };
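The status mapping above is easy to isolate into a pure function for testing. A sketch with simplified stand-ins for the octocrab and db types (`IssueState` here has only two variants; the real enum is non-exhaustive, and the PR maps any other state to `Unknown`):

```rust
#[derive(Debug, Clone, PartialEq)]
enum MergeStatus {
    Open,
    Merged,
    Closed,
    Unknown,
}

#[derive(Debug, Clone)]
enum IssueState {
    Open,
    Closed,
}

// A closed PR only counts as merged when GitHub reports a merge timestamp.
fn map_status(state: Option<IssueState>, merged_at: Option<&str>) -> MergeStatus {
    match state {
        Some(IssueState::Open) => MergeStatus::Open,
        Some(IssueState::Closed) if merged_at.is_some() => MergeStatus::Merged,
        Some(IssueState::Closed) => MergeStatus::Closed,
        None => MergeStatus::Unknown,
    }
}

fn main() {
    assert_eq!(
        map_status(Some(IssueState::Closed), Some("2024-01-01T00:00:00Z")),
        MergeStatus::Merged
    );
    assert_eq!(map_status(Some(IssueState::Closed), None), MergeStatus::Closed);
    assert_eq!(map_status(None, None), MergeStatus::Unknown);
}
```

Replacing the old stringly-typed `status` plus separate `merged` flag with one enum removes the possibility of inconsistent pairs such as `status: "closed", merged: true`.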

View File

@@ -3,8 +3,9 @@ use std::{sync::Arc, time::Duration};
 use db::{
     DBService,
     models::{
+        merge::{Merge, MergeStatus, PrMerge},
         task::{Task, TaskStatus},
-        task_attempt::{PrInfo, TaskAttempt, TaskAttemptError},
+        task_attempt::{TaskAttempt, TaskAttemptError},
     },
 };
 use sqlx::error::Error as SqlxError;
@@ -66,7 +67,7 @@ impl PrMonitorService {
     /// Check all open PRs for updates with the provided GitHub token
     async fn check_all_open_prs(&self) -> Result<(), PrMonitorError> {
-        let open_prs = TaskAttempt::get_open_prs(&self.db.pool).await?;
+        let open_prs = Merge::get_open_prs(&self.db.pool).await?;

         if open_prs.is_empty() {
             debug!("No open PRs to check");
@@ -75,65 +76,56 @@ impl PrMonitorService {
         info!("Checking {} open PRs", open_prs.len());

-        for pr_info in open_prs {
-            if let Err(e) = self.check_pr_status(&pr_info).await {
+        for pr_merge in open_prs {
+            if let Err(e) = self.check_pr_status(&pr_merge).await {
                 error!(
                     "Error checking PR #{} for attempt {}: {}",
-                    pr_info.pr_number, pr_info.attempt_id, e
+                    pr_merge.pr_info.number, pr_merge.task_attempt_id, e
                 );
             }
         }

         Ok(())
     }

     /// Check the status of a specific PR
-    async fn check_pr_status(&self, pr_info: &PrInfo) -> Result<(), PrMonitorError> {
+    async fn check_pr_status(&self, pr_merge: &PrMerge) -> Result<(), PrMonitorError> {
         let github_config = self.config.read().await.github.clone();
         let github_token = github_config.token().ok_or(PrMonitorError::NoGitHubToken)?;

         let github_service = GitHubService::new(&github_token)?;
-        let repo_info = GitHubRepoInfo {
-            owner: pr_info.repo_owner.clone(),
-            repo_name: pr_info.repo_name.clone(),
-        };
+        let repo_info = GitHubRepoInfo::from_pr_url(&pr_merge.pr_info.url)?;

         let pr_status = github_service
-            .update_pr_status(&repo_info, pr_info.pr_number)
+            .update_pr_status(&repo_info, pr_merge.pr_info.number)
             .await?;

         debug!(
-            "PR #{} status: {} (was open)",
-            pr_info.pr_number, pr_status.status
+            "PR #{} status: {:?} (was open)",
+            pr_merge.pr_info.number, pr_status.status
         );

         // Update the PR status in the database
-        if pr_status.status != "open" {
-            // Extract merge commit SHA if the PR was merged
-            TaskAttempt::update_pr_status(
+        if !matches!(&pr_status.status, MergeStatus::Open) {
+            // Update merge status with the latest information from GitHub
+            Merge::update_status(
                 &self.db.pool,
-                pr_info.attempt_id,
-                pr_status.url,
-                pr_status.number,
-                pr_status.status,
+                pr_merge.id,
+                pr_status.status.clone(),
+                pr_status.merge_commit_sha,
             )
             .await?;

             // If the PR was merged, update the task status to done
-            if pr_status.merged {
+            if matches!(&pr_status.status, MergeStatus::Merged)
+                && let Some(task_attempt) =
+                    TaskAttempt::find_by_id(&self.db.pool, pr_merge.task_attempt_id).await?
+            {
                 info!(
                     "PR #{} was merged, updating task {} to done",
-                    pr_info.pr_number, pr_info.task_id
+                    pr_merge.pr_info.number, task_attempt.task_id
                 );
-                let merge_commit_sha = pr_status.merge_commit_sha.as_deref().unwrap_or("unknown");
-                Task::update_status(&self.db.pool, pr_info.task_id, TaskStatus::Done).await?;
-                TaskAttempt::update_merge_commit(
-                    &self.db.pool,
-                    pr_info.attempt_id,
-                    merge_commit_sha,
-                )
-                .await?;
+                Task::update_status(&self.db.pool, task_attempt.task_id, TaskStatus::Done).await?;
             }
         }

View File

@@ -46,7 +46,8 @@
         "react-window": "^1.8.11",
         "rfc6902": "^5.1.2",
         "tailwind-merge": "^2.2.0",
-        "tailwindcss-animate": "^1.0.7"
+        "tailwindcss-animate": "^1.0.7",
+        "zustand": "^4.5.4"
       },
       "devDependencies": {
         "@types/react": "^18.2.43",
@@ -8035,6 +8036,34 @@
         "url": "https://github.com/sponsors/sindresorhus"
       }
     },
+    "node_modules/zustand": {
+      "version": "4.5.7",
+      "resolved": "https://registry.npmjs.org/zustand/-/zustand-4.5.7.tgz",
+      "integrity": "sha512-CHOUy7mu3lbD6o6LJLfllpjkzhHXSBlX8B9+qPddUsIfeF5S/UZ5q0kmCsnRqT1UHFQZchNFDDzMbQsuesHWlw==",
+      "license": "MIT",
+      "dependencies": {
+        "use-sync-external-store": "^1.2.2"
+      },
+      "engines": {
+        "node": ">=12.7.0"
+      },
+      "peerDependencies": {
+        "@types/react": ">=16.8",
+        "immer": ">=9.0.6",
+        "react": ">=16.8"
+      },
+      "peerDependenciesMeta": {
+        "@types/react": {
+          "optional": true
+        },
+        "immer": {
+          "optional": true
+        },
+        "react": {
+          "optional": true
+        }
+      }
+    },
     "node_modules/zwitch": {
       "version": "2.0.4",
       "resolved": "https://registry.npmjs.org/zwitch/-/zwitch-2.0.4.tgz",

View File

@@ -13,6 +13,7 @@ import type {
   EditorType,
   TaskAttempt,
   TaskWithAttemptStatus,
+  BranchStatus,
 } from 'shared/types';
 import { attemptsApi, executionProcessesApi } from '@/lib/api.ts';
 import {
@@ -52,6 +53,7 @@ const TaskDetailsProvider: FC<{
     processes: [],
     runningProcessDetails: {},
   });
+  const [branchStatus, setBranchStatus] = useState<BranchStatus | null>(null);

   const handleOpenInEditor = useCallback(
     async (editorType?: EditorType) => {
@@ -111,6 +113,15 @@ const TaskDetailsProvider: FC<{
           return newData;
         });
       }
+
+      // Also fetch branch status as part of attempt data
+      try {
+        const branchResult = await attemptsApi.getBranchStatus(attemptId);
+        setBranchStatus(branchResult);
+      } catch (err) {
+        console.error('Failed to fetch branch status:', err);
+        setBranchStatus(null);
+      }
     } catch (err) {
       console.error('Failed to fetch attempt data:', err);
     }
@@ -165,8 +176,6 @@ const TaskDetailsProvider: FC<{
   }, [attemptData.processes, selectedAttempt?.profile, profiles]);

   useEffect(() => {
-    if (!isAttemptRunning || !task) return;
-
     const interval = setInterval(() => {
       if (selectedAttempt) {
         fetchAttemptData(selectedAttempt.id);
@@ -176,6 +185,26 @@ const TaskDetailsProvider: FC<{
     return () => clearInterval(interval);
   }, [isAttemptRunning, task, selectedAttempt, fetchAttemptData]);

+  // Fetch branch status when selected attempt changes
+  useEffect(() => {
+    if (!selectedAttempt) {
+      setBranchStatus(null);
+      return;
+    }
+    const fetchBranchStatus = async () => {
+      try {
+        const result = await attemptsApi.getBranchStatus(selectedAttempt.id);
+        setBranchStatus(result);
+      } catch (err) {
+        console.error('Failed to fetch branch status:', err);
+        setBranchStatus(null);
+      }
+    };
+    fetchBranchStatus();
+  }, [selectedAttempt]);
+
   const value = useMemo(
     () => ({
       task,
@@ -218,8 +247,16 @@ const TaskDetailsProvider: FC<{
       fetchAttemptData,
       isAttemptRunning,
       defaultFollowUpVariant,
+      branchStatus,
+      setBranchStatus,
     }),
-    [attemptData, fetchAttemptData, isAttemptRunning, defaultFollowUpVariant]
+    [
+      attemptData,
+      fetchAttemptData,
+      isAttemptRunning,
+      defaultFollowUpVariant,
+      branchStatus,
+    ]
   );

   return (

View File

@@ -3,6 +3,7 @@ import type {
   EditorType,
   TaskAttempt,
   TaskWithAttemptStatus,
+  BranchStatus,
 } from 'shared/types';

 import { AttemptData } from '@/lib/types.ts';
@@ -33,6 +34,8 @@ interface TaskAttemptDataContextValue {
   fetchAttemptData: (attemptId: string) => Promise<void> | void;
   isAttemptRunning: boolean;
   defaultFollowUpVariant: string | null;
+  branchStatus: BranchStatus | null;
+  setBranchStatus: Dispatch<SetStateAction<BranchStatus | null>>;
 }

 export const TaskAttemptDataContext =

View File

@@ -37,6 +37,7 @@ export function TaskFollowUpSection() {
     fetchAttemptData,
     isAttemptRunning,
     defaultFollowUpVariant,
+    branchStatus,
   } = useContext(TaskAttemptDataContext);

   const { profiles } = useUserSystem();
@@ -66,12 +67,24 @@ export function TaskFollowUpSection() {
     ) {
       return false;
     }
+
+    // Check if PR is merged - if so, block follow-ups
+    if (branchStatus?.merges) {
+      const mergedPR = branchStatus.merges.find(
+        (m) => m.type === 'pr' && m.pr_info.status === 'merged'
+      );
+      if (mergedPR) {
+        return false;
+      }
+    }
+
     return true;
   }, [
     selectedAttempt,
     attemptData.processes,
     isAttemptRunning,
     isSendingFollowUp,
+    branchStatus?.merges,
   ]);

   const currentProfile = useMemo(() => {
     if (!selectedProfile || !profiles) return null;

View File

@@ -19,6 +19,7 @@ import {
 } from '@/components/ui/select';
 import { useCallback, useContext, useEffect, useState } from 'react';
 import {
+  TaskAttemptDataContext,
   TaskDetailsContext,
   TaskSelectedAttemptContext,
 } from '@/components/context/taskDetailsContext.ts';
@@ -46,6 +47,7 @@ function CreatePrDialog({
 }: Props) {
   const { projectId, task } = useContext(TaskDetailsContext);
   const { selectedAttempt } = useContext(TaskSelectedAttemptContext);
+  const { fetchAttemptData } = useContext(TaskAttemptDataContext);
   const [prTitle, setPrTitle] = useState('');
   const [prBody, setPrBody] = useState('');
   const [prBaseBranch, setPrBaseBranch] = useState(
@@ -82,12 +84,14 @@ function CreatePrDialog({
     });

     if (result.success) {
+      setError(null); // Clear any previous errors on success
       window.open(result.data, '_blank');
+      setShowCreatePRDialog(false);

       // Reset form
       setPrTitle('');
       setPrBody('');
       setPrBaseBranch(selectedAttempt?.base_branch || 'main');
+      // Refresh branch status to show the new PR
+      fetchAttemptData(selectedAttempt.id);
     } else {
       if (result.error) {
         setShowCreatePRDialog(false);
@@ -112,7 +116,7 @@ function CreatePrDialog({
         setError('Failed to create GitHub PR');
       }
     }
-    setShowCreatePRDialog(false);
     setCreatingPR(false);
   }, [
     projectId,

View File

@@ -3,7 +3,6 @@ import {
   GitBranch as GitBranchIcon,
   GitPullRequest,
   History,
-  Upload,
   Play,
   Plus,
   RefreshCw,
@@ -44,7 +43,7 @@ import {
   useState,
 } from 'react';
 import type { ExecutionProcess } from 'shared/types';
-import type { BranchStatus, GitBranch, TaskAttempt } from 'shared/types';
+import type { GitBranch, TaskAttempt } from 'shared/types';
 import {
   TaskAttemptDataContext,
   TaskAttemptStoppingContext,
@@ -103,9 +102,8 @@ function CurrentAttempt({
     useContext(TaskDetailsContext);
   const { config } = useConfig();
   const { isStopping, setIsStopping } = useContext(TaskAttemptStoppingContext);
-  const { attemptData, fetchAttemptData, isAttemptRunning } = useContext(
-    TaskAttemptDataContext
-  );
+  const { attemptData, fetchAttemptData, isAttemptRunning, branchStatus } =
+    useContext(TaskAttemptDataContext);
   const { jumpToProcess } = useProcessSelection();

   const [isStartingDevServer, setIsStartingDevServer] = useState(false);
@@ -115,12 +113,12 @@ function CurrentAttempt({
   const [devServerDetails, setDevServerDetails] =
     useState<ExecutionProcess | null>(null);
   const [isHoveringDevServer, setIsHoveringDevServer] = useState(false);
-  const [branchStatus, setBranchStatus] = useState<BranchStatus | null>(null);
-  const [branchStatusLoading, setBranchStatusLoading] = useState(false);
   const [showRebaseDialog, setShowRebaseDialog] = useState(false);
   const [selectedRebaseBranch, setSelectedRebaseBranch] = useState<string>('');
   const [showStopConfirmation, setShowStopConfirmation] = useState(false);
   const [copied, setCopied] = useState(false);
+  const [mergeSuccess, setMergeSuccess] = useState(false);
+  const [pushSuccess, setPushSuccess] = useState(false);

   const processedDevServerLogs = useMemo(() => {
     if (!devServerDetails) return 'No output yet...';
@@ -263,7 +261,10 @@ function CurrentAttempt({
     try {
       setPushing(true);
       await attemptsApi.push(selectedAttempt.id);
-      fetchBranchStatus();
+      setError(null); // Clear any previous errors on success
+      setPushSuccess(true);
+      setTimeout(() => setPushSuccess(false), 2000);
+      fetchAttemptData(selectedAttempt.id);
     } catch (error: any) {
       console.error('Failed to push changes:', error);
       setError(error.message || 'Failed to push changes');
@@ -272,38 +273,16 @@ function CurrentAttempt({
     }
   };

-  const fetchBranchStatus = useCallback(async () => {
-    if (!selectedAttempt?.id) return;
-    try {
-      setBranchStatusLoading(true);
-      const result = await attemptsApi.getBranchStatus(selectedAttempt.id);
-      setBranchStatus((prev) => {
-        if (JSON.stringify(prev) === JSON.stringify(result)) return prev;
-        return result;
-      });
-    } catch (err) {
-      setError('Failed to load branch status');
-    } finally {
-      setBranchStatusLoading(false);
-    }
-  }, [projectId, selectedAttempt?.id, selectedAttempt?.task_id, setError]);
-
-  // Fetch branch status when selected attempt changes
-  useEffect(() => {
-    if (selectedAttempt) {
-      fetchBranchStatus();
-    }
-  }, [selectedAttempt, fetchBranchStatus]);
   const performMerge = async () => {
     if (!projectId || !selectedAttempt?.id || !selectedAttempt?.task_id) return;
     try {
       setMerging(true);
       await attemptsApi.merge(selectedAttempt.id);
-      // Refetch branch status to show updated state
-      fetchBranchStatus();
+      setError(null); // Clear any previous errors on success
+      setMergeSuccess(true);
+      setTimeout(() => setMergeSuccess(false), 2000);
+      fetchAttemptData(selectedAttempt.id);
     } catch (error) {
       console.error('Failed to merge changes:', error);
       // @ts-expect-error it is type ApiError
@@ -319,8 +298,8 @@ function CurrentAttempt({
     try {
       setRebasing(true);
       await attemptsApi.rebase(selectedAttempt.id, { new_base_branch: null });
-      // Refresh branch status after rebase
-      fetchBranchStatus();
+      setError(null); // Clear any previous errors on success
+      fetchAttemptData(selectedAttempt.id);
     } catch (err) {
       setError(err instanceof Error ? err.message : 'Failed to rebase branch');
     } finally {
@@ -336,8 +315,8 @@ function CurrentAttempt({
       await attemptsApi.rebase(selectedAttempt.id, {
         new_base_branch: newBaseBranch,
       });
-      // Refresh branch status after rebase
-      fetchBranchStatus();
+      setError(null); // Clear any previous errors on success
+      fetchAttemptData(selectedAttempt.id);
       setShowRebaseDialog(false);
     } catch (err) {
       setError(err instanceof Error ? err.message : 'Failed to rebase branch');
@@ -360,9 +339,9 @@ function CurrentAttempt({
   const handlePRButtonClick = async () => {
     if (!projectId || !selectedAttempt?.id || !selectedAttempt?.task_id) return;

-    // If PR already exists, view it in a new tab
-    if (selectedAttempt.pr_url) {
-      window.open(selectedAttempt.pr_url, '_blank');
+    // If PR already exists, push to it
+    if (mergeInfo.hasOpenPR) {
+      await handlePushClick();
       return;
     }
@@ -387,6 +366,42 @@ function CurrentAttempt({
     return getEditorDisplayName(config.editor.editor_type);
   }, [config?.editor?.editor_type]);
+  // Memoize merge status information to avoid repeated calculations
+  const mergeInfo = useMemo(() => {
+    if (!branchStatus?.merges)
+      return {
+        hasOpenPR: false,
+        openPR: null,
+        hasMergedPR: false,
+        mergedPR: null,
+        hasMerged: false,
+        latestMerge: null,
+      };
+
+    const openPR = branchStatus.merges.find(
+      (m) => m.type === 'pr' && m.pr_info.status === 'open'
+    );
+    const mergedPR = branchStatus.merges.find(
+      (m) => m.type === 'pr' && m.pr_info.status === 'merged'
+    );
+    const merges = branchStatus.merges.filter(
+      (m) =>
+        m.type === 'direct' ||
+        (m.type === 'pr' && m.pr_info.status === 'merged')
+    );
+
+    return {
+      hasOpenPR: !!openPR,
+      openPR,
+      hasMergedPR: !!mergedPR,
+      mergedPR,
+      hasMerged: merges.length > 0,
+      latestMerge: branchStatus.merges[0] || null, // Most recent merge
+    };
+  }, [branchStatus?.merges]);
   const handleCopyWorktreePath = useCallback(async () => {
     try {
       await navigator.clipboard.writeText(selectedAttempt.container_ref || '');
@@ -397,6 +412,71 @@ function CurrentAttempt({
     }
   }, [selectedAttempt.container_ref]);
+  // Get status information for display
+  const getStatusInfo = useCallback(() => {
+    if (mergeInfo.hasMergedPR && mergeInfo.mergedPR?.type === 'pr') {
+      const prMerge = mergeInfo.mergedPR;
+      return {
+        dotColor: 'bg-green-500',
+        textColor: 'text-green-700',
+        text: `PR #${prMerge.pr_info.number} merged`,
+        isClickable: true,
+        onClick: () => window.open(prMerge.pr_info.url, '_blank'),
+      };
+    }
+
+    if (
+      mergeInfo.hasMerged &&
+      mergeInfo.latestMerge?.type === 'direct' &&
+      (branchStatus?.commits_ahead ?? 0) === 0
+    ) {
+      return {
+        dotColor: 'bg-green-500',
+        textColor: 'text-green-700',
+        text: `Merged`,
+        isClickable: false,
+      };
+    }
+
+    if (mergeInfo.hasOpenPR && mergeInfo.openPR?.type === 'pr') {
+      const prMerge = mergeInfo.openPR;
+      return {
+        dotColor: 'bg-blue-500',
+        textColor: 'text-blue-700',
+        text: `PR #${prMerge.pr_info.number}`,
+        isClickable: true,
+        onClick: () => window.open(prMerge.pr_info.url, '_blank'),
+      };
+    }
+
+    if ((branchStatus?.commits_behind ?? 0) > 0) {
+      return {
+        dotColor: 'bg-orange-500',
+        textColor: 'text-orange-700',
+        text: `Rebase needed${branchStatus?.has_uncommitted_changes ? ' (dirty)' : ''}`,
+        isClickable: false,
+      };
+    }
+
+    if ((branchStatus?.commits_ahead ?? 0) > 0) {
+      return {
+        dotColor: 'bg-yellow-500',
+        textColor: 'text-yellow-700',
+        text:
+          branchStatus?.commits_ahead === 1
+            ? `1 commit ahead${branchStatus?.has_uncommitted_changes ? ' (dirty)' : ''}`
+            : `${branchStatus?.commits_ahead} commits ahead${branchStatus?.has_uncommitted_changes ? ' (dirty)' : ''}`,
+        isClickable: false,
+      };
+    }
+
+    return {
+      dotColor: 'bg-gray-500',
+      textColor: 'text-gray-700',
+      text: `Up to date${branchStatus?.has_uncommitted_changes ? ' (dirty)' : ''}`,
+      isClickable: false,
+    };
+  }, [mergeInfo, branchStatus]);
   return (
     <div className="space-y-2">
       <div className="flex gap-6 items-start">
@@ -429,9 +509,7 @@ function CurrentAttempt({
                 variant="ghost"
                 size="xs"
                 onClick={handleRebaseDialogOpen}
-                disabled={
-                  rebasing || branchStatusLoading || isAttemptRunning
-                }
+                disabled={rebasing || isAttemptRunning}
                 className="h-4 w-4 p-0 hover:bg-muted"
               >
                 <Settings className="h-3 w-3" />
@@ -456,24 +534,30 @@ function CurrentAttempt({
               Status
             </div>
             <div className="flex items-center gap-1.5">
-              {selectedAttempt.merge_commit ? (
-                <div className="flex items-center gap-1.5 overflow-hidden">
-                  <div className="h-2 w-2 bg-green-500 rounded-full" />
-                  <span className="text-sm font-medium text-green-700 truncate">
-                    Merged
-                  </span>
-                  <span className="text-xs font-mono text-muted-foreground truncate">
-                    ({selectedAttempt.merge_commit.slice(0, 8)})
-                  </span>
-                </div>
-              ) : (
-                <div className="flex items-center gap-1.5 overflow-hidden">
-                  <div className="h-2 w-2 bg-yellow-500 rounded-full" />
-                  <span className="text-sm font-medium text-yellow-700">
-                    Not merged
-                  </span>
-                </div>
-              )}
+              {(() => {
+                const statusInfo = getStatusInfo();
+                return (
+                  <div className="flex items-center gap-1.5">
+                    <div
+                      className={`h-2 w-2 ${statusInfo.dotColor} rounded-full`}
+                    />
+                    {statusInfo.isClickable ? (
+                      <button
+                        onClick={statusInfo.onClick}
+                        className={`text-sm font-medium ${statusInfo.textColor} hover:underline cursor-pointer`}
+                      >
+                        {statusInfo.text}
+                      </button>
+                    ) : (
+                      <span
+                        className={`text-sm font-medium ${statusInfo.textColor}`}
+                      >
+                        {statusInfo.text}
+                      </span>
+                    )}
+                  </div>
+                );
+              })()}
             </div>
           </div>
         </div>
@@ -494,7 +578,7 @@ function CurrentAttempt({
           </Button>
         </div>
         <div
-          className={`text-xs font-mono px-2 py-1 rounded cursor-pointer transition-all duration-300 flex items-center gap-2 ${
+          className={`text-xs font-mono px-2 py-1 rounded break-all cursor-pointer transition-all duration-300 flex items-center gap-2 ${
             copied
               ? 'bg-green-100 text-green-800 border border-green-300'
               : 'text-muted-foreground bg-muted hover:bg-muted/80'
@@ -600,88 +684,73 @@ function CurrentAttempt({
       <div className="flex items-center gap-2 flex-wrap">
         {/* Git Operations */}
-        {selectedAttempt && branchStatus && (
+        {selectedAttempt && branchStatus && !mergeInfo.hasMergedPR && (
           <>
-            {(branchStatus.commits_behind ?? 0) > 0 &&
-              !branchStatus.merged && (
-                <Button
-                  onClick={handleRebaseClick}
-                  disabled={
-                    rebasing || branchStatusLoading || isAttemptRunning
-                  }
-                  variant="outline"
-                  size="xs"
-                  className="border-orange-300 text-orange-700 hover:bg-orange-50 gap-1"
-                >
-                  <RefreshCw
-                    className={`h-3 w-3 ${rebasing ? 'animate-spin' : ''}`}
-                  />
-                  {rebasing ? 'Rebasing...' : `Rebase`}
-                </Button>
-              )}
-            {
-              // Normal merge and PR buttons for regular tasks
-              !branchStatus.merged && (
-                <>
-                  <Button
-                    onClick={handlePRButtonClick}
-                    disabled={
-                      creatingPR ||
-                      Boolean((branchStatus.commits_behind ?? 0) > 0) ||
-                      isAttemptRunning
-                    }
-                    variant="outline"
-                    size="xs"
-                    className="border-blue-300 text-blue-700 hover:bg-blue-50 gap-1"
-                  >
-                    <GitPullRequest className="h-3 w-3" />
-                    {selectedAttempt.pr_url
-                      ? 'View PR'
-                      : creatingPR
-                        ? 'Creating...'
-                        : 'Create PR'}
-                  </Button>
-                  <Button
-                    onClick={
-                      selectedAttempt.pr_status === 'open'
-                        ? handlePushClick
-                        : handleMergeClick
-                    }
-                    disabled={
-                      selectedAttempt.pr_status === 'open'
-                        ? pushing ||
-                          isAttemptRunning ||
-                          (branchStatus.remote_up_to_date ?? true)
-                        : merging ||
-                          Boolean((branchStatus.commits_behind ?? 0) > 0) ||
-                          isAttemptRunning
-                    }
-                    size="xs"
-                    className="bg-green-600 hover:bg-green-700 disabled:bg-gray-400 gap-1"
-                  >
-                    {selectedAttempt.pr_status === 'open' ? (
-                      <>
-                        <Upload className="h-3 w-3" />
-                        {pushing
-                          ? 'Pushing...'
-                          : branchStatus.remote_commits_behind === null
-                            ? 'Disconnected'
-                            : branchStatus.remote_commits_behind === 0
-                              ? 'Push to remote'
-                              : branchStatus.remote_commits_behind === 1
-                                ? 'Push 1 commit'
-                                : `Push ${branchStatus.remote_commits_behind} commits`}
-                      </>
-                    ) : (
-                      <>
-                        <GitBranchIcon className="h-3 w-3" />
-                        {merging ? 'Merging...' : 'Merge'}
-                      </>
-                    )}
-                  </Button>
-                </>
-              )
-            }
+            {(branchStatus.commits_behind ?? 0) > 0 && (
+              <Button
+                onClick={handleRebaseClick}
+                disabled={rebasing || isAttemptRunning}
+                variant="outline"
+                size="xs"
+                className="border-orange-300 text-orange-700 hover:bg-orange-50 gap-1"
+              >
+                <RefreshCw
+                  className={`h-3 w-3 ${rebasing ? 'animate-spin' : ''}`}
+                />
+                {rebasing ? 'Rebasing...' : `Rebase`}
+              </Button>
+            )}
+            <>
+              <Button
+                onClick={handlePRButtonClick}
+                disabled={
+                  creatingPR ||
+                  pushing ||
+                  Boolean((branchStatus.commits_behind ?? 0) > 0) ||
+                  isAttemptRunning ||
+                  (mergeInfo.hasOpenPR &&
+                    branchStatus.remote_commits_ahead === 0) ||
+                  ((branchStatus.commits_ahead ?? 0) === 0 &&
+                    !pushSuccess &&
+                    !mergeSuccess)
+                }
+                variant="outline"
+                size="xs"
+                className="border-blue-300 text-blue-700 hover:bg-blue-50 gap-1 min-w-[120px]"
+              >
+                <GitPullRequest className="h-3 w-3" />
+                {mergeInfo.hasOpenPR
+                  ? pushSuccess
+                    ? 'Pushed!'
+                    : pushing
+                      ? 'Pushing...'
+                      : branchStatus.remote_commits_ahead === 0
+                        ? 'Push to PR'
+                        : branchStatus.remote_commits_ahead === 1
+                          ? 'Push 1 commit'
+                          : `Push ${branchStatus.remote_commits_ahead || 0} commits`
+                  : creatingPR
+                    ? 'Creating...'
+                    : 'Create PR'}
+              </Button>
+              <Button
+                onClick={handleMergeClick}
+                disabled={
+                  mergeInfo.hasOpenPR ||
+                  merging ||
+                  Boolean((branchStatus.commits_behind ?? 0) > 0) ||
+                  isAttemptRunning ||
+                  ((branchStatus.commits_ahead ?? 0) === 0 &&
+                    !pushSuccess &&
+                    !mergeSuccess)
+                }
+                size="xs"
+                className="bg-green-600 hover:bg-green-700 disabled:bg-gray-400 gap-1 min-w-[120px]"
+              >
+                <GitBranchIcon className="h-3 w-3" />
+                {mergeSuccess ? 'Merged!' : merging ? 'Merging...' : 'Merge'}
+              </Button>
+            </>
           </>
         )}
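To make the button logic above easier to follow, here is a standalone sketch of how `mergeInfo` reduces an attempt's merge history to a few flags. The types are trimmed copies of the generated `Merge` union (only the fields this sketch needs), and `deriveMergeInfo` is a hypothetical helper name, not part of the codebase; the array is assumed newest-first, matching the `merges[0]` access in the component.

```typescript
// Trimmed shapes of the generated types (assumption: reduced to the
// fields used here).
type PullRequestInfo = {
  number: number;
  url: string;
  status: 'open' | 'merged' | 'closed' | 'unknown';
};
type Merge =
  | { type: 'direct'; merge_commit: string }
  | { type: 'pr'; pr_info: PullRequestInfo };

// Hypothetical helper mirroring the useMemo body in CurrentAttempt.
function deriveMergeInfo(merges: Merge[]) {
  const openPR = merges.find(
    (m) => m.type === 'pr' && m.pr_info.status === 'open'
  );
  const mergedPR = merges.find(
    (m) => m.type === 'pr' && m.pr_info.status === 'merged'
  );
  // A merge "landed" if it was direct, or a PR that actually merged.
  const landed = merges.filter(
    (m) => m.type === 'direct' || (m.type === 'pr' && m.pr_info.status === 'merged')
  );
  return {
    hasOpenPR: !!openPR,
    hasMergedPR: !!mergedPR,
    hasMerged: landed.length > 0,
    latestMerge: merges[0] ?? null, // list is assumed newest-first
  };
}
```

With multiple merges allowed, an attempt can have both an older direct merge and a newer open PR at once, which is why the flags are computed over the whole history instead of a single `merge_commit` column.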


@@ -94,8 +94,6 @@ export enum CheckTokenResponse { VALID = "VALID", INVALID = "INVALID" }
 export type GitBranch = { name: string, is_current: boolean, is_remote: boolean, last_commit_date: Date, };
-export type BranchStatus = { commits_behind: number | null, commits_ahead: number | null, up_to_date: boolean | null, merged: boolean, has_uncommitted_changes: boolean, base_branch_name: string, remote_commits_behind: number | null, remote_commits_ahead: number | null, remote_up_to_date: boolean | null, };
 export type Diff = { oldFile: FileDiffDetails | null, newFile: FileDiffDetails | null, hunks: Array<string>, };
 export type FileDiffDetails = { fileName: string | null, content: string | null, };
@@ -160,7 +158,9 @@ export type CreateTaskAttemptBody = { task_id: string, profile_variant_label: Pr
 export type RebaseTaskAttemptRequest = { new_base_branch: string | null, };
-export type TaskAttempt = { id: string, task_id: string, container_ref: string | null, branch: string | null, base_branch: string, merge_commit: string | null, profile: string, pr_url: string | null, pr_number: bigint | null, pr_status: string | null, pr_merged_at: string | null, worktree_deleted: boolean, setup_completed_at: string | null, created_at: string, updated_at: string, };
+export type BranchStatus = { commits_behind: number | null, commits_ahead: number | null, has_uncommitted_changes: boolean | null, base_branch_name: string, remote_commits_behind: number | null, remote_commits_ahead: number | null, merges: Array<Merge>, };
+export type TaskAttempt = { id: string, task_id: string, container_ref: string | null, branch: string | null, base_branch: string, profile: string, worktree_deleted: boolean, setup_completed_at: string | null, created_at: string, updated_at: string, };
 export type ExecutionProcess = { id: string, task_attempt_id: string, run_reason: ExecutionProcessRunReason, executor_action: ExecutorAction, status: ExecutionProcessStatus, exit_code: bigint | null, started_at: string, completed_at: string | null, created_at: string, updated_at: string, };
@@ -168,6 +168,16 @@ export type ExecutionProcessStatus = "running" | "completed" | "failed" | "kille
 export type ExecutionProcessRunReason = "setupscript" | "cleanupscript" | "codingagent" | "devserver";
+export type Merge = { "type": "direct" } & DirectMerge | { "type": "pr" } & PrMerge;
+export type DirectMerge = { id: string, task_attempt_id: string, merge_commit: string, target_branch_name: string, created_at: string, };
+export type PrMerge = { id: string, task_attempt_id: string, created_at: string, target_branch_name: string, pr_info: PullRequestInfo, };
+export type MergeStatus = "open" | "merged" | "closed" | "unknown";
+export type PullRequestInfo = { number: bigint, url: string, status: MergeStatus, merged_at: string | null, merge_commit_sha: string | null, };
 export type EventPatch = { op: string, path: string, value: EventPatchInner, };
 export type EventPatchInner = { db_op: string, record: RecordTypes, };
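The new `Merge` export is a union tagged on `"type"`, so consumers can narrow it with a plain `switch` and only see `merge_commit` or `pr_info` where those fields exist. A minimal sketch (types are trimmed copies of the exports above with `PullRequestInfo.number` as `bigint`; `describeMerge` is an illustrative helper, not part of the codebase):

```typescript
// Trimmed copies of the generated types (assumption: only the fields
// needed for display).
type MergeStatus = 'open' | 'merged' | 'closed' | 'unknown';
type PullRequestInfo = { number: bigint; url: string; status: MergeStatus };
type Merge =
  | { type: 'direct'; merge_commit: string; target_branch_name: string }
  | { type: 'pr'; target_branch_name: string; pr_info: PullRequestInfo };

// Hypothetical display helper: each case narrows the union, so
// merge_commit and pr_info are only accessible in their own branch.
function describeMerge(m: Merge): string {
  switch (m.type) {
    case 'direct':
      return `merged into ${m.target_branch_name} at ${m.merge_commit.slice(0, 8)}`;
    case 'pr':
      return `PR #${m.pr_info.number} (${m.pr_info.status}) -> ${m.target_branch_name}`;
  }
}
```

Because both cases are covered, TypeScript infers `string` (not `string | undefined`) for the return type, which is a useful check that no `Merge` variant is forgotten when new kinds are added.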