vibe-kanban/crates/server/src/routes/projects.rs
Solomon a763a0eae9 Migrate task sharing to ElectricSQL + Tanstack DB (#1379)
* WIP - Migrate task sharing to ElectricSQL + Tanstack DB

* WIP auth proxy

* Simplify electric host

* Electric token

Only set in cloud. Acts like a DB password.

* Add org membership validation

* fix Electric auth param

* trigger dev deployment

* Validate where clause

* Simplify check macro

* Cleanup

* Reduce Electric Postgres privileges

Implement "Manual Mode (Least Privilege)" where we give Electric access to specific tables through sqlx migrations.
https://electric-sql.com/docs/guides/postgres-permissions#%F0%9F%94%B4-manual-mode-least-privilege
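
In manual mode, the application owns the Postgres objects Electric reads from: a dedicated role with only the privileges Electric needs, and a publication limited to the shared tables. A minimal sqlx-style migration might look like the following; the role, password handling, and table names here are illustrative assumptions, not the actual migration from this PR.

```sql
-- Illustrative least-privilege setup; names are assumptions.
CREATE ROLE electric WITH REPLICATION LOGIN PASSWORD 'change-me';

-- Grant read access only to the tables Electric is allowed to sync.
GRANT SELECT ON TABLE tasks, task_assignments TO electric;

-- Electric consumes changes through a publication scoped to those tables.
CREATE PUBLICATION electric_publication FOR TABLE tasks, task_assignments;
```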

* Fetch task assignee user name

* Create a local task to link with shared task assigned to current user

* chore: code cleanup

* chore: code cleanup

* chore: unify task status serialization format

use lowercase format

* lint fix

* chore: remove backend ws client

* chore: remove unused deps

* Disable editing shared tasks when user is logged out

Migrate UserSystemProvider to TanStack Query because a browser caching bug prevented login state from updating without a page reload.

* Auto-unlink non-existent shared tasks

* Invalidate useLiveQuery cache on sign-in change

Also display local shared tasks when user is signed out

* set VITE_VK_SHARED_API_BASE in CI

* rebase cleanup

* re-order migration

* increase node build memory in CI

* Setup CORS properly

* Prevent linking non-existent shared tasks

* Fix login dialog in background (#1413)

* Unlink already linked projects when linking again (vibe-kanban) (#1414)

* The fix is complete. Here's a summary of the change:

## Summary

Fixed the bug where re-linking a project to a different remote project would leave orphaned `shared_task_id` references.

**File modified:** `crates/server/src/routes/projects.rs:167-208`

**Change:** The `apply_remote_project_link` function now:
1. Fetches the existing project to check if it's already linked
2. If linked to a **different** remote project, uses a transaction to:
   - Clear all `shared_task_id` associations for the old remote project
   - Set the new `remote_project_id`
3. If no existing link, sets the new link directly
4. If linking to the **same** remote project, does nothing (already linked)

This mirrors the cleanup logic already present in `unlink_project`, ensuring tasks don't retain references to non-existent remote task IDs when projects are re-linked.
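
The three cases above can be sketched with a simplified in-memory model. These `LocalProject`/`LocalTask` stand-ins are illustrative only (the real models are sqlx-backed, and the real handler runs the clear-and-relink case inside a transaction); note also that a later commit in this series replaces the silent re-link with a conflict error.

```rust
// Simplified in-memory stand-ins for the sqlx-backed models; illustrative only.
#[derive(Debug, PartialEq)]
struct LocalProject {
    id: u32,
    remote_project_id: Option<u32>,
}

#[derive(Debug, PartialEq)]
struct LocalTask {
    project_id: u32,
    shared_task_id: Option<u32>,
}

/// Mirrors the three re-linking cases: same remote -> no-op,
/// different remote -> clear stale shared-task links then re-link,
/// no existing link -> link directly.
fn apply_remote_project_link(
    project: &mut LocalProject,
    tasks: &mut [LocalTask],
    new_remote_id: u32,
) {
    match project.remote_project_id {
        Some(old) if old == new_remote_id => {} // already linked: nothing to do
        Some(_) => {
            // In the real handler this happens inside one transaction.
            for task in tasks.iter_mut().filter(|t| t.project_id == project.id) {
                task.shared_task_id = None; // drop references to the old remote's tasks
            }
            project.remote_project_id = Some(new_remote_id);
        }
        None => project.remote_project_id = Some(new_remote_id), // fresh link
    }
}
```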

* You're right, let me think more carefully about the transaction.

Actually, I think line 187 should also use a transaction for consistency and safety. Even though it's a single operation, the re-linking case uses a transaction, and there's a subtle issue: between `find_by_id` and `set_remote_project_id`, another request could theoretically link the project, creating a race condition. Using a transaction would provide row-level locking.

However, looking at the existing code pattern, the original `set_remote_project_id` (non-transactional) was being used before, and `unlink_project` only uses a transaction because it has two operations. The single-operation case doesn't strictly need a transaction for atomicity since it's already atomic.

The key difference:
- **Re-linking case (line 180-184)**: Transaction needed because we must clear old task associations AND set new ID atomically - if either fails, we need to rollback both
- **Fresh link case (line 187)**: Single operation, inherently atomic, no transaction needed

The code is correct as-is. Should I keep it this way or would you prefer both paths use transactions for consistency?

* Done. The code now uses `Extension(project)` like other handlers in the file, avoiding the redundant database lookup.

* Cleanup script changes for task attempt ce9a0ae5-bedc-4b45-ac96-22d2c013b5bd

* Done. Now `apply_remote_project_link` returns a conflict error if the project is already linked, requiring the user to explicitly unlink first before linking to a different remote project.

* Both pass. Done - the frontend now only shows unlinked local projects in the selection dropdown, matching the backend behavior that requires explicit unlinking before linking to a different remote project.

* prevent modification of shared task offline

* reset oauth modal on login/logout events

* darken success alert font colour (#1416)

---------

Co-authored-by: Alex Netsch <alex@bloop.ai>
Co-authored-by: Louis Knight-Webb <louis@bloop.ai>
Co-authored-by: Gabriel Gordon-Hall <gabriel@bloop.ai>
2025-12-03 13:11:00 +00:00

685 lines
22 KiB
Rust

use std::path::Path as StdPath;

use axum::{
    Extension, Json, Router,
    extract::{Path, Query, State},
    http::StatusCode,
    middleware::from_fn_with_state,
    response::Json as ResponseJson,
    routing::{get, post},
};
use db::models::{
    project::{CreateProject, Project, ProjectError, SearchMatchType, SearchResult, UpdateProject},
    task::Task,
};
use deployment::Deployment;
use ignore::WalkBuilder;
use serde::Deserialize;
use services::services::{
    file_ranker::FileRanker,
    file_search_cache::{CacheError, SearchMode, SearchQuery},
    git::GitBranch,
    remote_client::CreateRemoteProjectPayload,
};
use ts_rs::TS;
use utils::{
    api::projects::{RemoteProject, RemoteProjectMembersResponse},
    path::expand_tilde,
    response::ApiResponse,
};
use uuid::Uuid;

use crate::{DeploymentImpl, error::ApiError, middleware::load_project_middleware};
#[derive(Deserialize, TS)]
pub struct LinkToExistingRequest {
    pub remote_project_id: Uuid,
}

#[derive(Deserialize, TS)]
pub struct CreateRemoteProjectRequest {
    pub organization_id: Uuid,
    pub name: String,
}
pub async fn get_projects(
    State(deployment): State<DeploymentImpl>,
) -> Result<ResponseJson<ApiResponse<Vec<Project>>>, ApiError> {
    let projects = Project::find_all(&deployment.db().pool).await?;
    Ok(ResponseJson(ApiResponse::success(projects)))
}

pub async fn get_project(
    Extension(project): Extension<Project>,
) -> Result<ResponseJson<ApiResponse<Project>>, ApiError> {
    Ok(ResponseJson(ApiResponse::success(project)))
}

pub async fn get_project_branches(
    Extension(project): Extension<Project>,
    State(deployment): State<DeploymentImpl>,
) -> Result<ResponseJson<ApiResponse<Vec<GitBranch>>>, ApiError> {
    let branches = deployment.git().get_all_branches(&project.git_repo_path)?;
    Ok(ResponseJson(ApiResponse::success(branches)))
}

pub async fn link_project_to_existing_remote(
    Extension(project): Extension<Project>,
    State(deployment): State<DeploymentImpl>,
    Json(payload): Json<LinkToExistingRequest>,
) -> Result<ResponseJson<ApiResponse<Project>>, ApiError> {
    let client = deployment.remote_client()?;
    let remote_project = client.get_project(payload.remote_project_id).await?;
    let updated_project = apply_remote_project_link(&deployment, project, remote_project).await?;
    Ok(ResponseJson(ApiResponse::success(updated_project)))
}
pub async fn create_and_link_remote_project(
    Extension(project): Extension<Project>,
    State(deployment): State<DeploymentImpl>,
    Json(payload): Json<CreateRemoteProjectRequest>,
) -> Result<ResponseJson<ApiResponse<Project>>, ApiError> {
    // `repo_name` is already trimmed, so no second trim is needed below.
    let repo_name = payload.name.trim().to_string();
    if repo_name.is_empty() {
        return Err(ApiError::Conflict(
            "Remote project name cannot be empty.".to_string(),
        ));
    }
    let client = deployment.remote_client()?;
    let remote_project = client
        .create_project(&CreateRemoteProjectPayload {
            organization_id: payload.organization_id,
            name: repo_name,
            metadata: None,
        })
        .await?;
    let updated_project = apply_remote_project_link(&deployment, project, remote_project).await?;
    Ok(ResponseJson(ApiResponse::success(updated_project)))
}
pub async fn unlink_project(
    Extension(project): Extension<Project>,
    State(deployment): State<DeploymentImpl>,
) -> Result<ResponseJson<ApiResponse<Project>>, ApiError> {
    let pool = &deployment.db().pool;
    if let Some(remote_project_id) = project.remote_project_id {
        let mut tx = pool.begin().await?;
        Task::clear_shared_task_ids_for_remote_project(&mut *tx, remote_project_id).await?;
        Project::set_remote_project_id_tx(&mut *tx, project.id, None).await?;
        tx.commit().await?;
    }
    let updated_project = Project::find_by_id(pool, project.id)
        .await?
        .ok_or(ProjectError::ProjectNotFound)?;
    Ok(ResponseJson(ApiResponse::success(updated_project)))
}
pub async fn get_remote_project_by_id(
    State(deployment): State<DeploymentImpl>,
    Path(remote_project_id): Path<Uuid>,
) -> Result<ResponseJson<ApiResponse<RemoteProject>>, ApiError> {
    let client = deployment.remote_client()?;
    let remote_project = client.get_project(remote_project_id).await?;
    Ok(ResponseJson(ApiResponse::success(remote_project)))
}

pub async fn get_project_remote_members(
    State(deployment): State<DeploymentImpl>,
    Extension(project): Extension<Project>,
) -> Result<ResponseJson<ApiResponse<RemoteProjectMembersResponse>>, ApiError> {
    let remote_project_id = project.remote_project_id.ok_or_else(|| {
        ApiError::Conflict("Project is not linked to a remote project".to_string())
    })?;
    let client = deployment.remote_client()?;
    let remote_project = client.get_project(remote_project_id).await?;
    let members = client
        .list_members(remote_project.organization_id)
        .await?
        .members;
    Ok(ResponseJson(ApiResponse::success(
        RemoteProjectMembersResponse {
            organization_id: remote_project.organization_id,
            members,
        },
    )))
}
async fn apply_remote_project_link(
    deployment: &DeploymentImpl,
    project: Project,
    remote_project: RemoteProject,
) -> Result<Project, ApiError> {
    let pool = &deployment.db().pool;
    if project.remote_project_id.is_some() {
        return Err(ApiError::Conflict(
            "Project is already linked to a remote project. Unlink it first.".to_string(),
        ));
    }
    Project::set_remote_project_id(pool, project.id, Some(remote_project.id)).await?;
    let updated_project = Project::find_by_id(pool, project.id)
        .await?
        .ok_or(ProjectError::ProjectNotFound)?;
    deployment
        .track_if_analytics_allowed(
            "project_linked_to_remote",
            serde_json::json!({
                "project_id": project.id.to_string(),
            }),
        )
        .await;
    Ok(updated_project)
}
pub async fn create_project(
    State(deployment): State<DeploymentImpl>,
    Json(payload): Json<CreateProject>,
) -> Result<ResponseJson<ApiResponse<Project>>, ApiError> {
    let id = Uuid::new_v4();
    let CreateProject {
        name,
        git_repo_path,
        setup_script,
        dev_script,
        cleanup_script,
        copy_files,
        use_existing_repo,
    } = payload;
    tracing::debug!("Creating project '{}'", name);

    // Validate and setup git repository
    let path = std::path::absolute(expand_tilde(&git_repo_path))?;

    // Check if git repo path is already used by another project
    match Project::find_by_git_repo_path(&deployment.db().pool, path.to_string_lossy().as_ref())
        .await
    {
        Ok(Some(_)) => {
            return Ok(ResponseJson(ApiResponse::error(
                "A project with this git repository path already exists",
            )));
        }
        Ok(None) => {
            // Path is available, continue
        }
        Err(e) => {
            return Err(ProjectError::GitRepoCheckFailed(e.to_string()).into());
        }
    }

    if use_existing_repo {
        // For existing repos, validate that the path exists and is a git repository
        if !path.exists() {
            return Ok(ResponseJson(ApiResponse::error(
                "The specified path does not exist",
            )));
        }
        if !path.is_dir() {
            return Ok(ResponseJson(ApiResponse::error(
                "The specified path is not a directory",
            )));
        }
        if !path.join(".git").exists() {
            return Ok(ResponseJson(ApiResponse::error(
                "The specified directory is not a git repository",
            )));
        }
        // Ensure existing repo has a main branch if it's empty
        if let Err(e) = deployment.git().ensure_main_branch_exists(&path) {
            tracing::error!("Failed to ensure main branch exists: {}", e);
            return Ok(ResponseJson(ApiResponse::error(&format!(
                "Failed to ensure main branch exists: {}",
                e
            ))));
        }
    } else {
        // For new repos, create directory and initialize git
        // Create directory if it doesn't exist
        if !path.exists()
            && let Err(e) = std::fs::create_dir_all(&path)
        {
            tracing::error!("Failed to create directory: {}", e);
            return Ok(ResponseJson(ApiResponse::error(&format!(
                "Failed to create directory: {}",
                e
            ))));
        }
        // Check if it's already a git repo, if not initialize it
        if !path.join(".git").exists()
            && let Err(e) = deployment.git().initialize_repo_with_main_branch(&path)
        {
            tracing::error!("Failed to initialize git repository: {}", e);
            return Ok(ResponseJson(ApiResponse::error(&format!(
                "Failed to initialize git repository: {}",
                e
            ))));
        }
    }

    match Project::create(
        &deployment.db().pool,
        &CreateProject {
            name,
            git_repo_path: path.to_string_lossy().to_string(),
            use_existing_repo,
            setup_script,
            dev_script,
            cleanup_script,
            copy_files,
        },
        id,
    )
    .await
    {
        Ok(project) => {
            // Track project creation event
            deployment
                .track_if_analytics_allowed(
                    "project_created",
                    serde_json::json!({
                        "project_id": project.id.to_string(),
                        "use_existing_repo": use_existing_repo,
                        "has_setup_script": project.setup_script.is_some(),
                        "has_dev_script": project.dev_script.is_some(),
                        "trigger": "manual",
                    }),
                )
                .await;
            Ok(ResponseJson(ApiResponse::success(project)))
        }
        Err(e) => Err(ProjectError::CreateFailed(e.to_string()).into()),
    }
}
pub async fn update_project(
    Extension(existing_project): Extension<Project>,
    State(deployment): State<DeploymentImpl>,
    Json(payload): Json<UpdateProject>,
) -> Result<ResponseJson<ApiResponse<Project>>, StatusCode> {
    // Destructure payload to handle field updates.
    // This allows us to treat `None` from the payload as an explicit `null` to clear a field,
    // as the frontend currently sends all fields on update.
    let UpdateProject {
        name,
        git_repo_path,
        setup_script,
        dev_script,
        cleanup_script,
        copy_files,
    } = payload;

    // If git_repo_path is being changed, check if the new path is already used by another project
    let git_repo_path = if let Some(new_git_repo_path) = git_repo_path.map(|s| expand_tilde(&s))
        && new_git_repo_path != existing_project.git_repo_path
    {
        match Project::find_by_git_repo_path_excluding_id(
            &deployment.db().pool,
            new_git_repo_path.to_string_lossy().as_ref(),
            existing_project.id,
        )
        .await
        {
            Ok(Some(_)) => {
                return Ok(ResponseJson(ApiResponse::error(
                    "A project with this git repository path already exists",
                )));
            }
            Ok(None) => new_git_repo_path,
            Err(e) => {
                tracing::error!("Failed to check for existing git repo path: {}", e);
                return Err(StatusCode::INTERNAL_SERVER_ERROR);
            }
        }
    } else {
        existing_project.git_repo_path
    };

    match Project::update(
        &deployment.db().pool,
        existing_project.id,
        name.unwrap_or(existing_project.name),
        git_repo_path.to_string_lossy().to_string(),
        setup_script,
        dev_script,
        cleanup_script,
        copy_files,
    )
    .await
    {
        Ok(project) => Ok(ResponseJson(ApiResponse::success(project))),
        Err(e) => {
            tracing::error!("Failed to update project: {}", e);
            Err(StatusCode::INTERNAL_SERVER_ERROR)
        }
    }
}
pub async fn delete_project(
    Extension(project): Extension<Project>,
    State(deployment): State<DeploymentImpl>,
) -> Result<ResponseJson<ApiResponse<()>>, StatusCode> {
    match Project::delete(&deployment.db().pool, project.id).await {
        Ok(rows_affected) => {
            if rows_affected == 0 {
                Err(StatusCode::NOT_FOUND)
            } else {
                deployment
                    .track_if_analytics_allowed(
                        "project_deleted",
                        serde_json::json!({
                            "project_id": project.id.to_string(),
                        }),
                    )
                    .await;
                Ok(ResponseJson(ApiResponse::success(())))
            }
        }
        Err(e) => {
            tracing::error!("Failed to delete project: {}", e);
            Err(StatusCode::INTERNAL_SERVER_ERROR)
        }
    }
}
#[derive(serde::Deserialize)]
pub struct OpenEditorRequest {
    editor_type: Option<String>,
}

#[derive(Debug, serde::Serialize, ts_rs::TS)]
pub struct OpenEditorResponse {
    pub url: Option<String>,
}
pub async fn open_project_in_editor(
    Extension(project): Extension<Project>,
    State(deployment): State<DeploymentImpl>,
    Json(payload): Json<Option<OpenEditorRequest>>,
) -> Result<ResponseJson<ApiResponse<OpenEditorResponse>>, ApiError> {
    let path = project.git_repo_path;
    let editor_config = {
        let config = deployment.config().read().await;
        let editor_type_str = payload.as_ref().and_then(|req| req.editor_type.as_deref());
        config.editor.with_override(editor_type_str)
    };
    match editor_config.open_file(&path).await {
        Ok(url) => {
            tracing::info!(
                "Opened editor for project {} at path: {}{}",
                project.id,
                path.to_string_lossy(),
                if url.is_some() { " (remote mode)" } else { "" }
            );
            deployment
                .track_if_analytics_allowed(
                    "project_editor_opened",
                    serde_json::json!({
                        "project_id": project.id.to_string(),
                        "editor_type": payload.as_ref().and_then(|req| req.editor_type.as_ref()),
                        "remote_mode": url.is_some(),
                    }),
                )
                .await;
            Ok(ResponseJson(ApiResponse::success(OpenEditorResponse {
                url,
            })))
        }
        Err(e) => {
            tracing::error!("Failed to open editor for project {}: {:?}", project.id, e);
            Err(ApiError::EditorOpen(e))
        }
    }
}
pub async fn search_project_files(
    State(deployment): State<DeploymentImpl>,
    Extension(project): Extension<Project>,
    Query(search_query): Query<SearchQuery>,
) -> Result<ResponseJson<ApiResponse<Vec<SearchResult>>>, StatusCode> {
    let query = search_query.q.trim();
    let mode = search_query.mode;
    if query.is_empty() {
        return Ok(ResponseJson(ApiResponse::error(
            "Query parameter 'q' is required and cannot be empty",
        )));
    }
    let repo_path = &project.git_repo_path;
    let file_search_cache = deployment.file_search_cache();

    // Try cache first
    match file_search_cache
        .search(repo_path, query, mode.clone())
        .await
    {
        Ok(results) => {
            tracing::debug!(
                "Cache hit for repo {:?}, query: {}, mode: {:?}",
                repo_path,
                query,
                mode
            );
            Ok(ResponseJson(ApiResponse::success(results)))
        }
        Err(CacheError::Miss) => {
            // Cache miss - fall back to filesystem search
            tracing::debug!(
                "Cache miss for repo {:?}, query: {}, mode: {:?}",
                repo_path,
                query,
                mode
            );
            match search_files_in_repo(&project.git_repo_path.to_string_lossy(), query, mode).await
            {
                Ok(results) => Ok(ResponseJson(ApiResponse::success(results))),
                Err(e) => {
                    tracing::error!("Failed to search files: {}", e);
                    Err(StatusCode::INTERNAL_SERVER_ERROR)
                }
            }
        }
        Err(CacheError::BuildError(e)) => {
            tracing::error!("Cache build error for repo {:?}: {}", repo_path, e);
            // Fall back to filesystem search
            match search_files_in_repo(&project.git_repo_path.to_string_lossy(), query, mode).await
            {
                Ok(results) => Ok(ResponseJson(ApiResponse::success(results))),
                Err(e) => {
                    tracing::error!("Failed to search files: {}", e);
                    Err(StatusCode::INTERNAL_SERVER_ERROR)
                }
            }
        }
    }
}
async fn search_files_in_repo(
    repo_path: &str,
    query: &str,
    mode: SearchMode,
) -> Result<Vec<SearchResult>, Box<dyn std::error::Error + Send + Sync>> {
    let repo_path = StdPath::new(repo_path);
    if !repo_path.exists() {
        return Err("Repository path does not exist".into());
    }
    let mut results = Vec::new();
    let query_lower = query.to_lowercase();

    // Configure walker based on mode
    let walker = match mode {
        SearchMode::Settings => {
            // Settings mode: Include ignored files but exclude performance killers
            WalkBuilder::new(repo_path)
                .git_ignore(false) // Include ignored files like .env
                .git_global(false)
                .git_exclude(false)
                .hidden(false)
                .filter_entry(|entry| {
                    let name = entry.file_name().to_string_lossy();
                    // Always exclude .git directories and performance killers
                    name != ".git"
                        && name != "node_modules"
                        && name != "target"
                        && name != "dist"
                        && name != "build"
                })
                .build()
        }
        SearchMode::TaskForm => {
            // Task form mode: Respect gitignore (cleaner results)
            WalkBuilder::new(repo_path)
                .git_ignore(true) // Respect .gitignore
                .git_global(true) // Respect global .gitignore
                .git_exclude(true) // Respect .git/info/exclude
                .hidden(false) // Still show hidden files like .env (if not gitignored)
                .filter_entry(|entry| {
                    let name = entry.file_name().to_string_lossy();
                    name != ".git"
                })
                .build()
        }
    };

    for result in walker {
        let entry = result?;
        let path = entry.path();
        // Skip the root directory itself
        if path == repo_path {
            continue;
        }
        let relative_path = path.strip_prefix(repo_path)?;
        let relative_path_str = relative_path.to_string_lossy().to_lowercase();
        let file_name = path
            .file_name()
            .map(|name| name.to_string_lossy().to_lowercase())
            .unwrap_or_default();

        // Check for matches
        if file_name.contains(&query_lower) {
            results.push(SearchResult {
                path: relative_path.to_string_lossy().to_string(),
                is_file: path.is_file(),
                match_type: SearchMatchType::FileName,
            });
        } else if relative_path_str.contains(&query_lower) {
            // Check if it's a directory name match or full path match
            let match_type = if path
                .parent()
                .and_then(|p| p.file_name())
                .map(|name| name.to_string_lossy().to_lowercase())
                .unwrap_or_default()
                .contains(&query_lower)
            {
                SearchMatchType::DirectoryName
            } else {
                SearchMatchType::FullPath
            };
            results.push(SearchResult {
                path: relative_path.to_string_lossy().to_string(),
                is_file: path.is_file(),
                match_type,
            });
        }
    }

    // Apply git history-based ranking
    let file_ranker = FileRanker::new();
    match file_ranker.get_stats(repo_path).await {
        Ok(stats) => {
            // Re-rank results using git history
            file_ranker.rerank(&mut results, &stats);
        }
        Err(e) => {
            tracing::warn!(
                "Failed to get git stats for ranking, using basic sort: {}",
                e
            );
            // Fallback to basic priority sorting
            results.sort_by(|a, b| {
                let priority = |match_type: &SearchMatchType| match match_type {
                    SearchMatchType::FileName => 0,
                    SearchMatchType::DirectoryName => 1,
                    SearchMatchType::FullPath => 2,
                };
                priority(&a.match_type)
                    .cmp(&priority(&b.match_type))
                    .then_with(|| a.path.cmp(&b.path))
            });
        }
    }

    // Limit to top 10 results
    results.truncate(10);
    Ok(results)
}
pub fn router(deployment: &DeploymentImpl) -> Router<DeploymentImpl> {
    let project_id_router = Router::new()
        .route(
            "/",
            get(get_project).put(update_project).delete(delete_project),
        )
        .route("/remote/members", get(get_project_remote_members))
        .route("/branches", get(get_project_branches))
        .route("/search", get(search_project_files))
        .route("/open-editor", post(open_project_in_editor))
        .route(
            "/link",
            post(link_project_to_existing_remote).delete(unlink_project),
        )
        .route("/link/create", post(create_and_link_remote_project))
        .layer(from_fn_with_state(
            deployment.clone(),
            load_project_middleware,
        ));
    let projects_router = Router::new()
        .route("/", get(get_projects).post(create_project))
        .nest("/{id}", project_id_router);
    Router::new().nest("/projects", projects_router).route(
        "/remote-projects/{remote_project_id}",
        get(get_remote_project_by_id),
    )
}