Compare commits

..

26 Commits

Author | SHA1 | Message | Date
shiyu | 37e13dabe0 | chore: update version to v1.2.9 | 2025-09-22 19:43:50 +08:00
shiyu | 9d6c63aff4 | feat(FileExplorer): support moving and copying multiple entries in context menu and modals | 2025-09-22 19:32:45 +08:00
shiyu | 81095f11df | feat(TextEditor): lazy load Monaco and Markdown editors with suspense fallback | 2025-09-22 19:11:45 +08:00
shiyu | 7d35c10d71 | feat(task-queue): implement task queue management with settings and UI integration | 2025-09-22 19:08:14 +08:00
shiyu | 17ebb8d4f4 | feat(FileExplorer): add move and copy functionality with task queuing | 2025-09-22 18:15:05 +08:00
shiyu | 330e8fd72b | feat(offline-downloads): implement offline download | 2025-09-22 12:03:39 +08:00
shiyu | 11c717e61d | chore: update version to v1.2.8 | 2025-09-20 21:02:06 +08:00
shiyu | 45d63febb9 | fix(ui): fix bug on processor page | 2025-09-20 14:16:06 +08:00
shiyu | 5a29c579dc | chore: update version to v1.2.7 | 2025-09-19 20:20:20 +08:00
shiyu | b530b16c53 | feat(GridView): add RGBA color conversion function and update background style | 2025-09-19 20:02:54 +08:00
shiyu | 7da49191aa | feat(processors): add processor management | 2025-09-19 18:58:54 +08:00
shiyu | fbeb673126 | feat(vector_db): Implement Vector Database Service with multiple providers | 2025-09-19 13:45:48 +08:00
shiyu | 0a06f4d02c | feat: add webdav support to nginx configuration | 2025-09-18 11:26:05 +08:00
shiyu | f02c29492b | chore: update version to v1.2.6 | 2025-09-17 14:27:15 +08:00
shiyu | 1a79e87887 | feat: add AI embedding dimension configuration | 2025-09-17 14:26:53 +08:00
shiyu | 626ff727b3 | feat: update contributing guidelines and add Chinese translation | 2025-09-16 18:43:08 +08:00
shiyu | 117a94d793 | feat(docker): create data directories with appropriate permissions | 2025-09-16 18:32:38 +08:00
shiyu | c39bea67a4 | chore: update version to v1.2.5 | 2025-09-16 11:31:52 +08:00
shiyu | 2cbfb29260 | feat(i18n): add 'Processor' and 'Share' translations for English and Chinese | 2025-09-16 11:31:23 +08:00
shiyu | 155f3a144d | feat(ui): add path selector modal | 2025-09-15 14:14:10 +08:00
shiyu | 208a52589f | feat: update theme context to support dynamic locale switching | 2025-09-14 16:35:12 +08:00
shiyu | 0732b611a9 | feat: add expired share cleanup functionality | 2025-09-14 16:27:46 +08:00
shiyu | 7b25e6d3b6 | chore: update version to v1.2.4 | 2025-09-14 13:40:10 +08:00
shiyu | 04441d0bc4 | feat: add username field to profile modal | 2025-09-14 13:20:45 +08:00
shiyu | 917b542dab | feat: add user profile management | 2025-09-14 12:54:49 +08:00
shiyu | e43b68beda | feat: ensure data/db directory exists during app startup | 2025-09-13 16:35:54 +08:00
74 changed files with 5190 additions and 477 deletions

View File

@@ -1,76 +1,76 @@
<div align="right">
<b>English</b> | <a href="./CONTRIBUTING_zh.md">简体中文</a>
</div>
# Contributing to Foxel
We appreciate every minute you spend helping Foxel improve. This guide explains the contribution workflow so you can get started quickly.
## Table of Contents
- [How to Contribute](#how-to-contribute)
- [🐛 Report Bugs](#-report-bugs)
- [✨ Suggest Features](#-suggest-features)
- [🛠️ Contribute Code](#-contribute-code)
- [Development Environment](#development-environment)
- [Prerequisites](#prerequisites)
- [Backend (FastAPI)](#backend-fastapi)
- [Frontend (React + Vite)](#frontend-react--vite)
- [Contribution Guidelines](#contribution-guidelines)
- [Storage Adapters](#storage-adapters)
- [Frontend Apps](#frontend-apps)
- [Submission Rules](#submission-rules)
- [Git Branching](#git-branching)
- [Commit Message Format](#commit-message-format)
- [Pull Request Flow](#pull-request-flow)
---
## How to Contribute
### 🐛 Report Bugs
If you discover a bug, open a ticket via [GitHub Issues](https://github.com/DrizzleTime/Foxel/issues) and include:
- **A clear title** that summarises the problem.
- **Reproduction steps** with enough detail to trigger the bug.
- **Expected vs actual behaviour** to highlight the gap.
- **Environment details** such as operating system, browser version, and the Foxel build you used.
### ✨ Suggest Features
To propose a new capability or an improvement, create an Issue and choose the "Feature Request" template. Document:
- **Problem statement**: what pain point will the feature solve?
- **Proposed solution**: how you expect the feature to work.
- **Supporting material**: screenshots, references, or related links if helpful.
### 🛠️ Contribute Code
Follow the development setup below before opening a pull request. Keep changes focused and small so they are easier to review.
## Development Environment
### Prerequisites
Install the following tooling first:
- **Git** for version control
- **Python** 3.13 or newer
- **Bun** for frontend package management and scripts
### Backend (FastAPI)
1. **Clone the repository**
```bash
git clone https://github.com/DrizzleTime/foxel.git
cd Foxel
```
2. **Create and activate a virtual environment**
`uv` is recommended for performance and reproducibility:
```bash
uv venv
@@ -78,91 +78,85 @@
# On Windows: .venv\Scripts\activate
```
3. **Install dependencies**
```bash
uv sync
```
4. **Prepare local resources**
- Create the data directory:
```bash
mkdir -p data/db
```
Ensure the application user can read and write to `data/db`.
- Create an `.env` file in the project root and provide the required secrets. Replace the sample values with your own random strings:
```dotenv
SECRET_KEY=EnsRhL9NFPxgFVc+7t96/y70DIOR+9SpntcIqQa90TU=
TEMP_LINK_SECRET_KEY=EnsRhL9NFPxgFVc+7t96/y70DIOR+9SpntcIqQa90TU=
```
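The sample keys above are placeholders; any sufficiently random 32-byte value works. One way to generate your own (a sketch using only the Python standard library):

```python
import base64
import secrets

# Generate a random 32-byte key, base64-encoded like the sample values above.
key = base64.b64encode(secrets.token_bytes(32)).decode("ascii")
print(key)
```

Run it twice to get two independent values for `SECRET_KEY` and `TEMP_LINK_SECRET_KEY`.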
5. **Start the development server**
```bash
uvicorn main:app --reload --host 0.0.0.0 --port 8000
```
The API is available at `http://localhost:8000`, and the interactive docs live at `http://localhost:8000/docs`.
### Frontend (React + Vite)
1. **Enter the frontend directory**
```bash
cd web
```
2. **Install dependencies**
```bash
bun install
```
3. **Run the dev server**
```bash
bun run dev
```
The Vite dev server runs at `http://localhost:5173` and proxies `/api` requests to the backend.
## Contribution Guidelines
### Storage Adapters
Storage adapters integrate new storage providers (for example S3, FTP, or Alist).
1. Create a new module under [`services/adapters/`](services/adapters/) (for example `my_new_adapter.py`).
2. Implement a class that inherits from [`services.adapters.base.BaseAdapter`](services/adapters/base.py) and provide concrete implementations for the abstract methods such as `list_dir`, `get_meta`, `upload`, and `download`.
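A minimal sketch of the adapter shape. It uses a stand-in base class so the snippet runs outside the repo; the real abstract interface, full method list, and exact signatures are defined in `services/adapters/base.py`, which you should treat as the source of truth:

```python
import asyncio
from abc import ABC, abstractmethod

# Stand-in for services.adapters.base.BaseAdapter -- only two of the
# abstract methods are shown here, and the signatures are assumptions.
class BaseAdapter(ABC):
    @abstractmethod
    async def list_dir(self, path: str) -> list[str]: ...

    @abstractmethod
    async def get_meta(self, path: str) -> dict: ...

# Hypothetical adapter backed by an in-memory dict, just to show the shape.
class MyNewAdapter(BaseAdapter):
    def __init__(self) -> None:
        self._files = {"/docs/readme.txt": {"size": 12, "is_dir": False}}

    async def list_dir(self, path: str) -> list[str]:
        prefix = path.rstrip("/") + "/"
        return [p for p in self._files if p.startswith(prefix)]

    async def get_meta(self, path: str) -> dict:
        return self._files[path]

if __name__ == "__main__":
    adapter = MyNewAdapter()
    print(asyncio.run(adapter.list_dir("/docs")))  # ['/docs/readme.txt']
```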
### Frontend Apps
Frontend apps enable in-browser previews or editors for specific file types.
1. Add a new folder in [`web/src/apps/`](web/src/apps/) for your app and expose a React component.
2. Implement the `FoxelApp` interface defined in [`web/src/apps/types.ts`](web/src/apps/types.ts).
3. Register the app in [`web/src/apps/registry.ts`](web/src/apps/registry.ts) and declare the MIME types or extensions it supports.
## Submission Rules
### Git Branching
Start your work from the latest `main` branch and push feature changes on a dedicated branch.
### Commit Message Format
We follow the [Conventional Commits](https://www.conventionalcommits.org/) specification to drive release tooling.
```
<type>(<scope>): <subject>
@@ -172,27 +166,27 @@ Commit Message 格式如下:
<footer>
```
- **type**: e.g. `feat`, `fix`, `docs`, `style`, `refactor`, `test`, `chore`.
- **scope** (optional): the area impacted by the change, such as `adapter`, `ui`, or `api`.
- **subject**: a concise summary written in the imperative mood.
**Examples:**
```
feat(adapter): add support for Alist storage
```
```
fix(ui): correct display issue in file list view
```
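As a rough illustration, the header line can be sanity-checked with a regular expression. The accepted types mirror the list above; the lower-case subject rule is an assumption inferred from the examples, not a stated project rule:

```python
import re

# Illustrative check of the <type>(<scope>): <subject> header line.
HEADER = re.compile(
    r"^(feat|fix|docs|style|refactor|test|chore)"  # type
    r"(\([a-z0-9_-]+\))?"                          # optional scope
    r": [a-z]"                                     # lower-case, imperative subject
)

print(bool(HEADER.match("feat(adapter): add support for Alist storage")))  # True
print(bool(HEADER.match("fix(ui): Correct display issue")))                # False: capitalised subject
```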
### Pull Request Flow
1. Fork the repository and clone it locally.
2. Create and switch to your feature branch.
3. Implement the change and run relevant checks.
4. Push the branch to your fork.
5. Open a pull request against `main` in the Foxel repository.
6. Explain the change set, its motivation, and reference related Issues in the PR description.
Maintainers will review your pull request as soon as possible. Thank you for your patience and your contribution!

CONTRIBUTING_zh.md (new file, 202 lines)
View File

@@ -0,0 +1,202 @@
<div align="right">
<a href="./CONTRIBUTING.md">English</a> | <b>简体中文</b>
</div>
# Contributing to Foxel
🎉 首先,非常感谢您愿意花时间为 Foxel 做出贡献!
我们热烈欢迎各种形式的贡献。无论是报告 Bug、提出新功能建议、完善文档还是直接提交代码,都将对项目产生积极的影响。
本指南将帮助您顺利地参与到项目中来。
## 目录
- [如何贡献](#如何贡献)
- [🐛 报告 Bug](#-报告-bug)
- [✨ 提交功能建议](#-提交功能建议)
- [🛠️ 贡献代码](#-贡献代码)
- [开发环境搭建](#开发环境搭建)
- [依赖准备](#依赖准备)
- [后端 (FastAPI)](#后端-fastapi)
- [前端 (React + Vite)](#前端-react--vite)
- [代码贡献指南](#代码贡献指南)
- [贡献存储适配器 (Adapter)](#贡献存储适配器-adapter)
- [贡献前端应用 (App)](#贡献前端应用-app)
- [提交规范](#提交规范)
- [Git 分支管理](#git-分支管理)
- [Commit Message 格式](#commit-message-格式)
- [Pull Request 流程](#pull-request-流程)
---
## 如何贡献
### 🐛 报告 Bug
如果您在使用的过程中发现了 Bug,请通过 [GitHub Issues](https://github.com/DrizzleTime/Foxel/issues) 来报告。请在报告中提供以下信息:
- **清晰的标题**:简明扼要地描述问题。
- **复现步骤**:详细说明如何一步步重现该 Bug。
- **期望行为** vs **实际行为**:描述您预期的结果和实际发生的情况。
- **环境信息**:例如操作系统、浏览器版本、Foxel 版本等。
### ✨ 提交功能建议
我们欢迎任何关于新功能或改进的建议。请通过 [GitHub Issues](https://github.com/DrizzleTime/Foxel/issues) 创建一个 "Feature Request",并详细阐述您的想法:
- **问题描述**:说明该功能要解决什么问题。
- **方案设想**:描述您希望该功能如何工作。
- **相关信息**:提供任何有助于理解您想法的截图、链接或参考。
### 🛠️ 贡献代码
如果您希望直接贡献代码,请参考下面的开发和提交流程。
## 开发环境搭建
### 依赖准备
- **Git**: 用于版本控制。
- **Python**: >= 3.13
- **Bun**: 用于前端包管理和脚本运行。
### 后端 (FastAPI)
后端 API 服务基于 Python 和 FastAPI 构建。
1. **克隆仓库**
```bash
git clone https://github.com/DrizzleTime/foxel.git
cd Foxel
```
2. **创建并激活 Python 虚拟环境**
我们推荐使用 `uv` 来管理虚拟环境,以获得最佳性能。
```bash
uv venv
source .venv/bin/activate
# On Windows: .venv\Scripts\activate
```
3. **安装依赖**
```bash
uv sync
```
4. **初始化环境**
在启动服务前,请进行以下准备:
- **创建数据目录**:
在项目根目录执行 `mkdir -p data/db`。这将创建用于存放数据库等文件的目录。
> [!IMPORTANT]
> 请确保应用拥有对 `data/db` 目录的读写权限。
- **创建 `.env` 配置文件**:
在项目根目录创建名为 `.env` 的文件,并填入以下内容。这些密钥用于保障应用安全,您可以按需修改。
```dotenv
SECRET_KEY=EnsRhL9NFPxgFVc+7t96/y70DIOR+9SpntcIqQa90TU=
TEMP_LINK_SECRET_KEY=EnsRhL9NFPxgFVc+7t96/y70DIOR+9SpntcIqQa90TU=
```
5. **启动开发服务器**
```bash
uvicorn main:app --reload --host 0.0.0.0 --port 8000
```
API 服务将在 `http://localhost:8000` 上运行,您可以通过 `http://localhost:8000/docs` 访问自动生成的 API 文档。
### 前端 (React + Vite)
前端应用使用 React, Vite, 和 TypeScript 构建。
1. **进入前端目录**
```bash
cd web
```
2. **安装依赖**
```bash
bun install
```
3. **启动开发服务器**
```bash
bun run dev
```
前端开发服务器将在 `http://localhost:5173` 运行。它已经配置了代理,会自动将 `/api` 请求转发到后端服务。
## 代码贡献指南
### 贡献存储适配器 (Adapter)
存储适配器是 Foxel 的核心扩展点,用于接入不同的存储后端 (如 S3, FTP, Alist 等)。
1. **创建适配器文件**: 在 [`services/adapters/`](services/adapters/) 目录下,创建一个新文件,例如 `my_new_adapter.py`。
2. **实现适配器类**:
- 创建一个类,继承自 [`services.adapters.base.BaseAdapter`](services/adapters/base.py)。
- 实现 `BaseAdapter` 中定义的所有抽象方法,如 `list_dir`, `get_meta`, `upload`, `download` 等。请仔细阅读基类中的文档注释以理解每个方法的作用和参数。
### 贡献前端应用 (App)
前端应用允许用户在浏览器中直接预览或编辑特定类型的文件。
1. **创建应用组件**: 在 [`web/src/apps/`](web/src/apps/) 目录下,为您的应用创建一个新的文件夹,并在其中创建 React 组件。
2. **定义应用类型**: 您的应用需要实现 [`web/src/apps/types.ts`](web/src/apps/types.ts) 中定义的 `FoxelApp` 接口。
3. **注册应用**: 在 [`web/src/apps/registry.ts`](web/src/apps/registry.ts) 中,导入您的应用组件,并将其添加到 `APP_REGISTRY`。在注册时,您需要指定该应用可以处理的文件类型(通过 MIME Type 或文件扩展名)。
## 提交规范
### Git 分支管理
- 从最新的 `main` 分支创建您的特性分支。
### Commit Message 格式
我们遵循 [Conventional Commits](https://www.conventionalcommits.org/) 规范。这有助于自动化生成更新日志和版本管理。
Commit Message 格式如下:
```
<type>(<scope>): <subject>
<BLANK LINE>
<body>
<BLANK LINE>
<footer>
```
- **type**: `feat`, `fix`, `docs`, `style`, `refactor`, `test`, `chore` 等。
- **scope**: (可选) 本次提交影响的范围,例如 `adapter`, `ui`, `api`。
- **subject**: 简明扼要的描述。
**示例:**
```
feat(adapter): Add support for Alist storage
```
```
fix(ui): Correct display issue in file list view
```
### Pull Request 流程
1. Fork 仓库并克隆到本地。
2. 创建并切换到您的特性分支。
3. 完成代码编写和测试。
4. 将您的分支推送到您的 Fork 仓库。
5. 在 Foxel 主仓库创建一个 Pull Request,目标分支为 `main`。
6. 在 PR 描述中清晰地说明您的更改内容、目的和任何相关的 Issue 编号。
项目维护者会尽快审查您的 PR。感谢您的耐心和贡献!

View File

@@ -27,6 +27,9 @@ COPY . .
COPY nginx.conf /etc/nginx/nginx.conf
RUN mkdir -p data/db data/mount && \
chmod 777 data/db data/mount
EXPOSE 80
COPY entrypoint.sh /entrypoint.sh

View File

@@ -73,7 +73,7 @@ chmod 777 data/db data/mount
We welcome contributions from the community! Whether it's submitting bugs, suggesting new features, or contributing code directly.
Before you start, please read our [`CONTRIBUTING.md`](CONTRIBUTING.md) file, which explains the development environment and submission process. A Simplified Chinese translation is available in [`CONTRIBUTING_zh.md`](CONTRIBUTING_zh.md).
## 🌐 Community

View File

@@ -74,7 +74,7 @@ chmod 777 data/db data/mount
我们非常欢迎来自社区的贡献!无论是提交 Bug、建议新功能还是直接贡献代码。
在开始之前,请先阅读我们的 [`CONTRIBUTING_zh.md`](CONTRIBUTING_zh.md) 文件,它会指导你如何设置开发环境以及提交流程。
## 🌐 社区

View File

@@ -1,6 +1,6 @@
from fastapi import FastAPI
from .routes import adapters, virtual_fs, auth, config, processors, tasks, logs, share, backup, search, vector_db, offline_downloads
from .routes import webdav
from .routes import plugins
@@ -20,3 +20,4 @@ def include_routers(app: FastAPI):
app.include_router(vector_db.router)
app.include_router(plugins.router)
app.include_router(webdav.router)
app.include_router(offline_downloads.router)

View File

@@ -1,5 +1,6 @@
from typing import Annotated
from fastapi import APIRouter, HTTPException, Depends, Form
import hashlib
from fastapi.security import OAuth2PasswordRequestForm
from services.auth import (
authenticate_user_db,
@@ -7,10 +8,14 @@ from services.auth import (
ACCESS_TOKEN_EXPIRE_MINUTES,
register_user,
Token,
get_current_active_user,
User,
)
from pydantic import BaseModel
from datetime import timedelta
from api.response import success
from models.database import UserAccount
from services.auth import verify_password, get_password_hash
router = APIRouter(prefix="/api/auth", tags=["auth"])
@@ -21,6 +26,7 @@ class RegisterRequest(BaseModel):
email: str | None = None
full_name: str | None = None
@router.post("/register", summary="注册第一个管理员用户")
async def register(data: RegisterRequest):
"""
@@ -51,3 +57,66 @@ async def login_for_access_token(
data={"sub": user.username}, expires_delta=access_token_expires
)
return Token(access_token=access_token, token_type="bearer")
@router.get("/me", summary="获取当前登录用户信息")
async def get_me(current_user: Annotated[User, Depends(get_current_active_user)]):
"""
返回当前登录用户的基本信息,并附带 gravatar 头像链接。
"""
email = (current_user.email or "").strip().lower()
md5_hash = hashlib.md5(email.encode("utf-8")).hexdigest()
gravatar_url = f"https://www.gravatar.com/avatar/{md5_hash}?s=64&d=identicon"
return success({
"id": current_user.id,
"username": current_user.username,
"email": current_user.email,
"full_name": current_user.full_name,
"gravatar_url": gravatar_url,
})
class UpdateMeRequest(BaseModel):
email: str | None = None
full_name: str | None = None
old_password: str | None = None
new_password: str | None = None
@router.put("/me", summary="更新当前登录用户信息")
async def update_me(
payload: UpdateMeRequest,
current_user: Annotated[User, Depends(get_current_active_user)],
):
db_user = await UserAccount.get_or_none(id=current_user.id)
if not db_user:
raise HTTPException(status_code=404, detail="用户不存在")
if payload.email is not None:
exists = await UserAccount.filter(email=payload.email).exclude(id=db_user.id).exists()
if exists:
raise HTTPException(status_code=400, detail="邮箱已被占用")
db_user.email = payload.email
if payload.full_name is not None:
db_user.full_name = payload.full_name
if payload.new_password:
if not payload.old_password:
raise HTTPException(status_code=400, detail="请提供原密码")
if not verify_password(payload.old_password, db_user.hashed_password):
raise HTTPException(status_code=400, detail="原密码错误")
db_user.hashed_password = get_password_hash(payload.new_password)
await db_user.save()
email = (db_user.email or "").strip().lower()
md5_hash = hashlib.md5(email.encode("utf-8")).hexdigest()
gravatar_url = f"https://cn.cravatar.com/avatar/{md5_hash}?s=64&d=identicon"
return success({
"id": db_user.id,
"username": db_user.username,
"email": db_user.email,
"full_name": db_user.full_name,
"gravatar_url": gravatar_url,
})

View File

@@ -1,10 +1,11 @@
import httpx
import time
from fastapi import APIRouter, Depends, Form, HTTPException
from typing import Annotated
from services.config import ConfigCenter, VERSION
from services.auth import get_current_active_user, User, has_users
from api.response import success
from services.vector_db import VectorDBService
router = APIRouter(prefix="/api/config", tags=["config"])
@@ -23,8 +24,27 @@ async def set_config(
key: str = Form(...),
value: str = Form(...)
):
original_value = await ConfigCenter.get(key)
value_to_save = value
if key == "AI_EMBED_DIM":
try:
parsed_value = int(value)
except (TypeError, ValueError):
raise HTTPException(status_code=400, detail="AI_EMBED_DIM must be an integer")
if parsed_value <= 0:
raise HTTPException(status_code=400, detail="AI_EMBED_DIM must be greater than zero")
value_to_save = str(parsed_value)
await ConfigCenter.set(key, value_to_save)
if key == "AI_EMBED_DIM" and str(original_value) != value_to_save:
try:
service = VectorDBService()
await service.clear_all_data()
except Exception as exc:
raise HTTPException(status_code=500, detail=f"Failed to clear vector database: {exc}")
return success({"key": key, "value": value_to_save})
@router.get("/all")

View File

@@ -0,0 +1,79 @@
from typing import Annotated
from fastapi import APIRouter, Depends, HTTPException
from api.response import success
from schemas.offline_downloads import OfflineDownloadCreate
from services.auth import User, get_current_active_user
from services.logging import LogService
from services.task_queue import task_queue_service, TaskProgress
from services.virtual_fs import path_is_directory
router = APIRouter(
prefix="/api/offline-downloads",
tags=["OfflineDownloads"],
)
@router.post("/")
async def create_offline_download(
payload: OfflineDownloadCreate,
current_user: Annotated[User, Depends(get_current_active_user)],
):
dest_dir = payload.dest_dir
try:
is_dir = await path_is_directory(dest_dir)
except HTTPException:
is_dir = False
if not is_dir:
raise HTTPException(400, detail="Destination directory not found")
task = await task_queue_service.add_task(
"offline_http_download",
{
"url": str(payload.url),
"dest_dir": dest_dir,
"filename": payload.filename,
},
)
await task_queue_service.update_progress(
task.id,
TaskProgress(
stage="queued",
percent=0.0,
bytes_total=None,
bytes_done=0,
detail="Waiting to start",
),
)
await LogService.action(
"route:offline_downloads",
f"Offline download task created {task.id}",
details={"url": str(payload.url), "dest_dir": dest_dir, "filename": payload.filename},
user_id=current_user.id if hasattr(current_user, "id") else None,
)
return success({"task_id": task.id})
@router.get("/")
async def list_offline_downloads(
current_user: Annotated[User, Depends(get_current_active_user)],
):
tasks = [t for t in task_queue_service.get_all_tasks() if t.name == "offline_http_download"]
data = [t.dict() for t in tasks]
return success(data)
@router.get("/{task_id}")
async def get_offline_download(
task_id: str,
current_user: Annotated[User, Depends(get_current_active_user)],
):
task = task_queue_service.get_task(task_id)
if not task or task.name != "offline_http_download":
raise HTTPException(status_code=404, detail="Task not found")
return success(task.dict())

View File

@@ -1,10 +1,17 @@
from fastapi import APIRouter, Depends, Body
from pathlib import Path
from fastapi import APIRouter, Depends, Body, HTTPException
from fastapi.concurrency import run_in_threadpool
from typing import Annotated
from services.processors.registry import get_config_schemas
from services.processors.registry import (
get_config_schemas,
get_module_path,
reload_processors,
)
from services.task_queue import task_queue_service
from services.auth import get_current_active_user, User
from api.response import success
from pydantic import BaseModel
from services.virtual_fs import path_is_directory
router = APIRouter(prefix="/api/processors", tags=["processors"])
@@ -22,6 +29,7 @@ async def list_processors(
"supported_exts": meta.get("supported_exts", []),
"config_schema": meta["config_schema"],
"produces_file": meta.get("produces_file", False),
"module_path": meta.get("module_path"),
})
return success(out)
@@ -34,12 +42,20 @@ class ProcessRequest(BaseModel):
overwrite: bool = False
class UpdateSourceRequest(BaseModel):
source: str
@router.post("/process")
async def process_file_with_processor(
current_user: Annotated[User, Depends(get_current_active_user)],
req: ProcessRequest = Body(...)
):
is_dir = await path_is_directory(req.path)
if is_dir and not req.overwrite:
raise HTTPException(400, detail="Directory processing requires overwrite")
save_to = None if is_dir else (req.path if req.overwrite else req.save_to)
task = await task_queue_service.add_task(
"process_file",
{
@@ -47,6 +63,54 @@ async def process_file_with_processor(
"processor_type": req.processor_type,
"config": req.config,
"save_to": save_to,
"overwrite": req.overwrite,
},
)
return success({"task_id": task.id})
@router.get("/source/{processor_type}")
async def get_processor_source(
processor_type: str,
current_user: Annotated[User, Depends(get_current_active_user)],
):
module_path = get_module_path(processor_type)
if not module_path:
raise HTTPException(404, detail="Processor not found")
path_obj = Path(module_path)
if not path_obj.exists():
raise HTTPException(404, detail="Processor source not found")
try:
content = await run_in_threadpool(path_obj.read_text, encoding='utf-8')
except Exception as exc:
raise HTTPException(500, detail=f"Failed to read source: {exc}")
return success({"source": content, "module_path": str(path_obj)})
@router.put("/source/{processor_type}")
async def update_processor_source(
processor_type: str,
req: UpdateSourceRequest,
current_user: Annotated[User, Depends(get_current_active_user)],
):
module_path = get_module_path(processor_type)
if not module_path:
raise HTTPException(404, detail="Processor not found")
path_obj = Path(module_path)
if not path_obj.exists():
raise HTTPException(404, detail="Processor source not found")
try:
await run_in_threadpool(path_obj.write_text, req.source, encoding='utf-8')
except Exception as exc:
raise HTTPException(500, detail=f"Failed to write source: {exc}")
return success(True)
@router.post("/reload")
async def reload_processor_modules(
current_user: Annotated[User, Depends(get_current_active_user)],
):
errors = reload_processors()
if errors:
raise HTTPException(500, detail="; ".join(errors))
return success(True)

View File

@@ -9,7 +9,7 @@ router = APIRouter(prefix="/api/search", tags=["search"])
async def search_files_by_vector(q: str, top_k: int):
embedding = await get_text_embedding(q)
vector_db = VectorDBService()
results = await vector_db.search_vectors("vector_collection", embedding, top_k)
items = [
SearchResultItem(id=res["id"], path=res["entity"]["path"], score=res["distance"])
for res in results[0]
@@ -18,7 +18,7 @@ async def search_files_by_vector(q: str, top_k: int):
async def search_files_by_name(q: str, top_k: int):
vector_db = VectorDBService()
results = await vector_db.search_by_path("vector_collection", q, top_k)
items = [
SearchResultItem(id=idx, path=res["entity"]["path"], score=res["distance"])
for idx, res in enumerate(results[0])
@@ -38,4 +38,4 @@ async def search_files(
elif mode == "filename":
return await search_files_by_name(q, top_k)
else:
return {"items": [], "query": q, "error": "Invalid search mode"}

View File

@@ -83,6 +83,18 @@ async def get_my_shares(current_user: User = Depends(get_current_active_user)):
return [ShareInfo.from_orm(s) for s in shares]
@router.delete("/expired")
async def delete_expired_shares(
current_user: User = Depends(get_current_active_user),
):
"""
删除当前用户的所有已过期分享。
"""
user_account = await UserAccount.get(id=current_user.id)
deleted_count = await share_service.delete_expired_shares(user=user_account)
return success({"deleted_count": deleted_count})
@router.delete("/{share_id}")
async def delete_share(
share_id: int,

View File

@@ -2,11 +2,17 @@ from fastapi import APIRouter, Depends, HTTPException
from typing import Annotated
from models.database import AutomationTask
from schemas.tasks import AutomationTaskCreate, AutomationTaskUpdate
from schemas.tasks import (
AutomationTaskCreate,
AutomationTaskUpdate,
TaskQueueSettings,
TaskQueueSettingsResponse,
)
from api.response import success
from services.auth import get_current_active_user, User
from services.logging import LogService
from services.task_queue import task_queue_service
from services.config import ConfigCenter
router = APIRouter(
prefix="/api/tasks",
@@ -24,6 +30,37 @@ async def get_task_queue_status(
return success([task.dict() for task in tasks])
@router.get("/queue/settings")
async def get_task_queue_settings(
current_user: Annotated[User, Depends(get_current_active_user)],
):
payload = TaskQueueSettingsResponse(
concurrency=task_queue_service.get_concurrency(),
active_workers=task_queue_service.get_active_worker_count(),
)
return success(payload.model_dump())
@router.post("/queue/settings")
async def update_task_queue_settings(
settings: TaskQueueSettings,
current_user: Annotated[User, Depends(get_current_active_user)],
):
await task_queue_service.set_concurrency(settings.concurrency)
await ConfigCenter.set("TASK_QUEUE_CONCURRENCY", str(task_queue_service.get_concurrency()))
await LogService.action(
"route:tasks",
"Updated task queue settings",
details={"concurrency": settings.concurrency},
user_id=getattr(current_user, "id", None),
)
payload = TaskQueueSettingsResponse(
concurrency=task_queue_service.get_concurrency(),
active_workers=task_queue_service.get_active_worker_count(),
)
return success(payload.model_dump())
@router.get("/queue/{task_id}")
async def get_task_status(
task_id: str,

View File

@@ -1,19 +1,100 @@
from typing import Any, Dict
from fastapi import APIRouter, Depends, HTTPException
from pydantic import BaseModel, Field
from services.auth import get_current_active_user
from models.database import UserAccount
from services.vector_db import VectorDBService
from services.vector_db import (
VectorDBService,
VectorDBConfigManager,
list_providers,
get_provider_entry,
)
from services.vector_db.providers import get_provider_class
from api.response import success
router = APIRouter(prefix="/api/vector-db", tags=["vector-db"])
class VectorDBConfigPayload(BaseModel):
type: str = Field(..., description="向量数据库提供者类型")
config: Dict[str, Any] = Field(default_factory=dict, description="提供者配置参数")
@router.post("/clear-all", summary="清空向量数据库")
async def clear_vector_db(user: UserAccount = Depends(get_current_active_user)):
if user.username != 'admin':
raise HTTPException(status_code=403, detail="仅管理员可操作")
try:
service = VectorDBService()
await service.clear_all_data()
return success(msg="向量数据库已清空")
except Exception as e:
raise HTTPException(status_code=500, detail=str(e))
@router.get("/stats", summary="获取向量数据库统计")
async def get_vector_db_stats(user: UserAccount = Depends(get_current_active_user)):
if user.username != 'admin':
raise HTTPException(status_code=403, detail="仅管理员可操作")
try:
service = VectorDBService()
data = await service.get_all_stats()
return success(data=data)
except Exception as e:
raise HTTPException(status_code=500, detail=str(e))
@router.get("/providers", summary="列出可用向量数据库提供者")
async def list_vector_providers(user: UserAccount = Depends(get_current_active_user)):
if user.username != 'admin':
raise HTTPException(status_code=403, detail="仅管理员可操作")
return success(list_providers())
@router.get("/config", summary="获取当前向量数据库配置")
async def get_vector_db_config(user: UserAccount = Depends(get_current_active_user)):
if user.username != 'admin':
raise HTTPException(status_code=403, detail="仅管理员可操作")
service = VectorDBService()
data = await service.current_provider()
return success(data)
@router.post("/config", summary="更新向量数据库配置")
async def update_vector_db_config(payload: VectorDBConfigPayload, user: UserAccount = Depends(get_current_active_user)):
if user.username != 'admin':
raise HTTPException(status_code=403, detail="仅管理员可操作")
entry = get_provider_entry(payload.type)
if not entry:
raise HTTPException(status_code=400, detail=f"未知的向量数据库类型: {payload.type}")
if not entry.get("enabled", True):
raise HTTPException(status_code=400, detail="该向量数据库类型暂不可用")
provider_cls = get_provider_class(payload.type)
if not provider_cls:
raise HTTPException(status_code=400, detail=f"未找到类型 {payload.type} 对应的实现")
# Try to establish a connection first to make sure the config is valid
test_provider = provider_cls(payload.config)
try:
await test_provider.initialize()
except Exception as exc:
raise HTTPException(status_code=400, detail=str(exc))
finally:
client = getattr(test_provider, "client", None)
close_fn = getattr(client, "close", None)
if callable(close_fn):
try:
close_fn()
except Exception:
pass
await VectorDBConfigManager.save_config(payload.type, payload.config)
service = VectorDBService()
await service.reload()
config_data = await service.current_provider()
stats = await service.get_all_stats()
return success({"config": config_data, "stats": stats})
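The config endpoint above only persists a new provider after a successful trial `initialize()`. A minimal sketch of that validate-then-commit pattern, with a hypothetical `FakeProvider` standing in for the real registry classes:

```python
import asyncio

class FakeProvider:
    """Hypothetical stand-in for a provider class from the registry."""
    def __init__(self, config):
        self.config = config

    async def initialize(self):
        # Real providers open a client connection here and raise on failure.
        if not self.config.get("url"):
            raise ValueError("missing url")

async def update_config(store: dict, config: dict) -> bool:
    # Trial-initialize first; persist only when the connection test succeeds.
    probe = FakeProvider(config)
    try:
        await probe.initialize()
    except Exception:
        return False
    store["VECTOR_DB_CONFIG"] = config
    return True

store: dict = {}
ok = asyncio.run(update_config(store, {"url": "http://localhost:19530"}))
bad = asyncio.run(update_config(store, {}))
```

The invalid config is rejected before anything is written, which is why the endpoint can safely `reload()` afterwards.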

View File

@@ -219,31 +219,41 @@ async def api_mkdir(
@router.post("/move")
async def api_move(
current_user: Annotated[User, Depends(get_current_active_user)],
body: MoveRequest,
overwrite: bool = Query(False, description="是否允许覆盖已存在目标"),
):
src = body.src if body.src.startswith('/') else '/' + body.src
dst = body.dst if body.dst.startswith('/') else '/' + body.dst
debug_info = await move_path(src, dst, overwrite=overwrite, return_debug=True, allow_cross=True)
queued = bool(debug_info.get("queued"))
response = {
"moved": not queued,
"queued": queued,
"src": src,
"dst": dst,
"overwrite": overwrite,
}
if queued:
response["task_id"] = debug_info.get("task_id")
response["task_name"] = debug_info.get("task_name")
return success(response)
@router.post("/rename")
async def api_rename(
current_user: Annotated[User, Depends(get_current_active_user)],
body: MoveRequest,
overwrite: bool = Query(False, description="是否允许覆盖已存在目标")
):
src = body.src if body.src.startswith('/') else '/' + body.src
dst = body.dst if body.dst.startswith('/') else '/' + body.dst
from services.virtual_fs import rename_path
await rename_path(src, dst, overwrite=overwrite, return_debug=False)
return success({
"renamed": True,
"src": src,
"dst": dst,
"overwrite": overwrite,
})
@@ -252,19 +262,23 @@ async def api_copy(
current_user: Annotated[User, Depends(get_current_active_user)],
body: MoveRequest,
overwrite: bool = Query(False, description="是否覆盖已存在目标"),
):
from services.virtual_fs import copy_path
src = body.src if body.src.startswith('/') else '/' + body.src
dst = body.dst if body.dst.startswith('/') else '/' + body.dst
debug_info = await copy_path(src, dst, overwrite=overwrite, return_debug=True, allow_cross=True)
queued = bool(debug_info.get("queued"))
response = {
"copied": not queued,
"queued": queued,
"src": src,
"dst": dst,
"overwrite": overwrite,
}
if queued:
response["task_id"] = debug_info.get("task_id")
response["task_name"] = debug_info.get("task_name")
return success(response)
@router.post("/upload/{full_path:path}")

View File

@@ -1,3 +1,4 @@
import os
from services.config import VERSION, ConfigCenter
from services.adapters.registry import runtime_registry
from fastapi.middleware.cors import CORSMiddleware
@@ -15,6 +16,7 @@ load_dotenv()
@asynccontextmanager
async def lifespan(app: FastAPI):
os.makedirs("data/db", exist_ok=True)
await init_db()
await runtime_registry.refresh()
await ConfigCenter.set("APP_VERSION", VERSION)
@@ -29,7 +31,7 @@ async def lifespan(app: FastAPI):
def create_app() -> FastAPI:
app = FastAPI(
title="Foxel",
description="AList-like virtual storage aggregator",
description="A highly extensible private cloud storage solution for individuals and teams",
lifespan=lifespan,
)
include_routers(app)

View File

@@ -28,7 +28,7 @@ http {
listen 80;
server_name _;
location ~ ^/(api|webdav|docs|openapi\.json$) {
proxy_pass http://127.0.0.1:8000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;

View File

@@ -64,6 +64,7 @@ dependencies = [
"python-multipart==0.0.20",
"pytz==2025.2",
"pyyaml==6.0.2",
"qdrant-client==1.15.1",
"rawpy==0.25.1",
"rich==14.1.0",
"rich-toolkit==0.15.0",

View File

@@ -0,0 +1,7 @@
from pydantic import BaseModel, HttpUrl, Field
class OfflineDownloadCreate(BaseModel):
url: HttpUrl
dest_dir: str = Field(..., min_length=1)
filename: str = Field(..., min_length=1)

View File

@@ -1,4 +1,4 @@
from pydantic import BaseModel, Field
from typing import Optional, Dict, Any
@@ -29,3 +29,11 @@ class AutomationTaskRead(AutomationTaskBase):
class Config:
from_attributes = True
class TaskQueueSettings(BaseModel):
concurrency: int = Field(..., ge=1, description="Desired number of concurrent task workers")
class TaskQueueSettingsResponse(TaskQueueSettings):
active_workers: int = Field(..., ge=0, description="Currently running worker count")

View File

@@ -4,7 +4,7 @@ from typing import Any, Optional, Dict
from dotenv import load_dotenv
from models.database import Configuration
load_dotenv(dotenv_path=".env")
VERSION = "v1.2.9"
class ConfigCenter:
_cache: Dict[str, Any] = {}

View File

@@ -0,0 +1,199 @@
import os
import time
from pathlib import Path
from typing import AsyncIterator
import aiofiles
import aiohttp
from fastapi import HTTPException
from services.logging import LogService
from services.task_queue import Task, task_queue_service, TaskProgress
from services.virtual_fs import write_file_stream, stat_file
TEMP_ROOT = Path("data/tmp/offline_downloads")
def _normalize_path(path: str) -> str:
if not path:
return "/"
if not path.startswith("/"):
path = "/" + path
if len(path) > 1 and path.endswith("/"):
path = path.rstrip("/")
return path or "/"
async def _path_exists(full_path: str) -> bool:
try:
await stat_file(full_path)
return True
except FileNotFoundError:
return False
except HTTPException as exc:
if exc.status_code == 404:
return False
raise
def _split_filename(filename: str) -> tuple[str, str]:
if not filename:
return "", ""
if filename.startswith('.') and filename.count('.') == 1:
return filename, ""
if '.' not in filename:
return filename, ""
stem, ext = filename.rsplit('.', 1)
return stem, f".{ext}"
async def _allocate_destination(dest_dir: str, filename: str) -> tuple[str, str]:
dest_dir = _normalize_path(dest_dir)
stem, suffix = _split_filename(filename)
candidate = filename
if dest_dir == "/":
base = ""
else:
base = dest_dir
attempt = 0
while await _path_exists(f"{base}/{candidate}" if base else f"/{candidate}"):
attempt += 1
if stem:
candidate = f"{stem} ({attempt}){suffix}"
else:
candidate = f"file ({attempt}){suffix}" if suffix else f"file ({attempt})"
if base:
full_path = f"{base}/{candidate}"
else:
full_path = f"/{candidate}"
return full_path, candidate
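The helper pair above implements `name (N).ext` de-duplication against the virtual filesystem. The same logic can be sketched synchronously against an in-memory set of existing names (function names here are illustrative, not the module's own):

```python
def split_filename(filename: str) -> tuple[str, str]:
    # Dotfiles like ".env" keep the leading dot as their stem.
    if filename.startswith('.') and filename.count('.') == 1:
        return filename, ""
    if '.' not in filename:
        return filename, ""
    stem, ext = filename.rsplit('.', 1)
    return stem, f".{ext}"

def allocate_name(existing: set[str], filename: str) -> str:
    # Append " (N)" before the extension until the candidate is free.
    stem, suffix = split_filename(filename)
    candidate, attempt = filename, 0
    while candidate in existing:
        attempt += 1
        candidate = f"{stem} ({attempt}){suffix}" if stem else f"file ({attempt}){suffix}"
    return candidate
```

So a second download of `a.txt` lands as `a (1).txt`, a third as `a (2).txt`, and dotfiles degrade gracefully to `.env (1)`.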
async def _iter_file(path: Path, chunk_size: int, report_cb) -> AsyncIterator[bytes]:
async with aiofiles.open(path, "rb") as f:
while True:
chunk = await f.read(chunk_size)
if not chunk:
break
await report_cb(len(chunk))
yield chunk
async def run_http_download(task: Task):
params = task.task_info
url = params.get("url")
dest_dir = params.get("dest_dir")
filename = params.get("filename")
if not url or not dest_dir or not filename:
raise ValueError("Missing required parameters for offline download")
TEMP_ROOT.mkdir(parents=True, exist_ok=True)
temp_dir = TEMP_ROOT / task.id
temp_dir.mkdir(parents=True, exist_ok=True)
temp_file = temp_dir / "payload"
bytes_total: int | None = None
bytes_done = 0
last_update = time.monotonic()
await task_queue_service.update_progress(
task.id,
TaskProgress(
stage="downloading",
percent=0.0,
bytes_total=None,
bytes_done=0,
detail="HTTP downloading",
),
)
async def report_download(delta: int, total: int | None):
nonlocal bytes_done, bytes_total, last_update
if total is not None:
bytes_total = total
bytes_done += delta
now = time.monotonic()
if delta and now - last_update < 0.5:
return
last_update = now
percent = None
total_for_display = bytes_total if bytes_total is not None else None
if bytes_total:
percent = min(100.0, round(bytes_done / bytes_total * 100, 2))
await task_queue_service.update_progress(
task.id,
TaskProgress(
stage="downloading",
percent=percent,
bytes_total=total_for_display,
bytes_done=bytes_done,
detail="HTTP downloading",
),
)
timeout = aiohttp.ClientTimeout(total=None, connect=30)
async with aiohttp.ClientSession(timeout=timeout) as session:
async with session.get(url) as resp:
if resp.status != 200:
raise ValueError(f"HTTP {resp.status} for {url}")
content_length = resp.headers.get("Content-Length")
total_size = int(content_length) if content_length else None
bytes_done = 0
async with aiofiles.open(temp_file, "wb") as f:
async for chunk in resp.content.iter_chunked(512 * 1024):
if not chunk:
continue
await f.write(chunk)
await report_download(len(chunk), total_size)
# ensure final update
await report_download(0, total_size)
file_size = os.path.getsize(temp_file)
bytes_done_transfer = 0
async def report_transfer(delta: int):
nonlocal bytes_done_transfer
bytes_done_transfer += delta
percent = min(100.0, round(bytes_done_transfer / file_size * 100, 2)) if file_size else None
await task_queue_service.update_progress(
task.id,
TaskProgress(
stage="transferring",
percent=percent,
bytes_total=file_size or None,
bytes_done=bytes_done_transfer,
detail="Saving to storage",
),
)
async def chunk_iter() -> AsyncIterator[bytes]:
async for chunk in _iter_file(temp_file, 512 * 1024, report_transfer):
yield chunk
final_path, resolved_name = await _allocate_destination(dest_dir, filename)
await task_queue_service.update_progress(
task.id,
TaskProgress(stage="transferring", percent=0.0, bytes_total=file_size or None, bytes_done=0, detail="Saving to storage"),
)
await write_file_stream(final_path, chunk_iter())
await task_queue_service.update_progress(
task.id,
TaskProgress(stage="completed", percent=100.0, bytes_total=file_size or None, bytes_done=file_size, detail="Completed"),
)
await task_queue_service.update_meta(task.id, {"final_path": final_path, "filename": resolved_name})
try:
os.remove(temp_file)
temp_dir.rmdir()
except Exception:
await LogService.info("offline_download", f"Temp cleanup failed for task {task.id}")
return final_path
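`report_download` above rate-limits progress writes to roughly one every 0.5 s, and the trailing `report_download(0, total_size)` call forces a final flush because a zero delta bypasses the throttle. A synchronous sketch of that throttle (the class name is illustrative):

```python
import time

class ProgressThrottle:
    """Illustrative throttle: emit at most one update per interval.

    A zero-byte delta forces a flush, mirroring the final
    report_download(0, total) call above.
    """
    def __init__(self, interval: float = 0.5):
        self.interval = interval
        self.last: float | None = None
        self.done = 0
        self.updates: list[int] = []

    def report(self, delta: int) -> None:
        self.done += delta
        now = time.monotonic()
        if delta and self.last is not None and now - self.last < self.interval:
            return  # too soon since the last emitted update
        self.last = now
        self.updates.append(self.done)

t = ProgressThrottle(interval=60.0)
t.report(100)  # first update is always emitted
t.report(50)   # suppressed: within the interval
t.report(0)    # zero delta forces the final flush
```

Only two updates are recorded, yet the byte counter never loses data, since suppressed calls still accumulate into `done`.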

View File

@@ -1,33 +1,53 @@
import pkgutil
import inspect
from importlib import import_module
from typing import Dict, Callable
import pkgutil
from importlib import import_module, reload
from pathlib import Path
from types import ModuleType
from typing import Callable, Dict, Optional
from .base import BaseProcessor
ProcessorFactory = Callable[[], BaseProcessor]
TYPE_MAP: Dict[str, ProcessorFactory] = {}
CONFIG_SCHEMAS: Dict[str, dict] = {}
MODULE_MAP: Dict[str, ModuleType] = {}
LAST_DISCOVERY_ERRORS: list[str] = []
def discover_processors(force_reload: bool = False) -> list[str]:
"""Discover available processor modules and cache their metadata."""
import services.processors  # deferred import to avoid a circular dependency
processors_pkg = services.processors
TYPE_MAP.clear()
CONFIG_SCHEMAS.clear()
MODULE_MAP.clear()
global LAST_DISCOVERY_ERRORS
LAST_DISCOVERY_ERRORS = []
for modinfo in pkgutil.iter_modules(processors_pkg.__path__):
if modinfo.name.startswith("_"):
continue
full_name = f"{processors_pkg.__name__}.{modinfo.name}"
try:
module = import_module(full_name)
if force_reload:
module = reload(module)
except Exception as exc:
LAST_DISCOVERY_ERRORS.append(f"Failed to import {full_name}: {exc}")
continue
processor_type = getattr(module, "PROCESSOR_TYPE", None)
processor_name = getattr(module, "PROCESSOR_NAME", None)
supported_exts = getattr(module, "SUPPORTED_EXTS", None)
schema = getattr(module, "CONFIG_SCHEMA", None)
factory = getattr(module, "PROCESSOR_FACTORY", None)
if not processor_type:
continue
if factory is None:
for attr in module.__dict__.values():
if inspect.isclass(attr) and attr.__name__.endswith("Processor"):
@@ -35,31 +55,85 @@ def discover_processors():
return lambda: cls()
factory = _mk()
break
if not callable(factory):
LAST_DISCOVERY_ERRORS.append(f"Processor {full_name} missing factory")
continue
try:
sample = factory()
except Exception as exc:
LAST_DISCOVERY_ERRORS.append(f"Failed to instantiate processor {processor_type}: {exc}")
continue
TYPE_MAP[processor_type] = factory
MODULE_MAP[processor_type] = module
produces_file = getattr(module, "produces_file", None)
if produces_file is None and hasattr(sample, "produces_file"):
produces_file = getattr(sample, "produces_file")
module_file = getattr(module, "__file__", None)
module_path: Optional[str] = None
if module_file:
try:
module_path = str(Path(module_file).resolve())
except Exception:
module_path = module_file
if isinstance(supported_exts, list):
normalized_exts = [str(ext) for ext in supported_exts]
elif supported_exts:
normalized_exts = [str(supported_exts)]
else:
normalized_exts = []
if not normalized_exts and hasattr(sample, "supported_exts"):
sample_exts = getattr(sample, "supported_exts") or []
if isinstance(sample_exts, list):
normalized_exts = [str(ext) for ext in sample_exts]
if isinstance(schema, list):
CONFIG_SCHEMAS[processor_type] = {
"type": processor_type,
"name": processor_name or processor_type,
"supported_exts": normalized_exts,
"config_schema": schema,
"produces_file": produces_file if produces_file is not None else False,
"module_path": module_path,
}
return LAST_DISCOVERY_ERRORS
def get_config_schemas() -> Dict[str, dict]:
return CONFIG_SCHEMAS
def get_config_schema(processor_type: str):
return CONFIG_SCHEMAS.get(processor_type)
def get(processor_type: str) -> BaseProcessor | None:
factory = TYPE_MAP.get(processor_type)
if factory:
return factory()
return None
def get_module_path(processor_type: str) -> Optional[str]:
meta = CONFIG_SCHEMAS.get(processor_type)
if not meta:
return None
return meta.get("module_path")
def get_last_discovery_errors() -> list[str]:
return LAST_DISCOVERY_ERRORS
def reload_processors() -> list[str]:
return discover_processors(force_reload=True)
discover_processors()
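Discovery relies on `pkgutil.iter_modules` over the package's `__path__`, skipping private modules. The same scan can be demonstrated against any stdlib package, here `json`:

```python
import pkgutil
import json  # any stdlib package serves as a demo target

# Scan a package's __path__ for submodules, skipping private ones,
# the same way discover_processors walks services.processors.
discovered = sorted(
    modinfo.name
    for modinfo in pkgutil.iter_modules(json.__path__)
    if not modinfo.name.startswith("_")
)
```

Each discovered name is then imported with `import_module(f"{pkg.__name__}.{name}")`, which is where the registry's per-module error capture comes in.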

View File

@@ -2,8 +2,9 @@ from typing import Dict, Any
from fastapi.responses import Response
import base64
from services.ai import describe_image_base64, get_text_embedding
from services.vector_db import VectorDBService, DEFAULT_VECTOR_DIMENSION
from services.logging import LogService
from services.config import ConfigCenter
class VectorIndexProcessor:
@@ -33,7 +34,7 @@ class VectorIndexProcessor:
vector_db = VectorDBService()
collection_name = "vector_collection"
if action == "destroy":
await vector_db.delete_vector(collection_name, path)
await LogService.info(
"processor:vector_index",
f"Destroyed {index_type} index for {path}",
@@ -42,8 +43,8 @@ class VectorIndexProcessor:
return Response(content=f"文件 {path}{index_type} 索引已销毁", media_type="text/plain")
if index_type == 'simple':
await vector_db.ensure_collection(collection_name, vector=False)
await vector_db.upsert_vector(collection_name, {'path': path})
await LogService.info(
"processor:vector_index",
f"Created simple index for {path}",
@@ -71,8 +72,16 @@ class VectorIndexProcessor:
if embedding is None:
return Response(content="不支持的文件类型", status_code=400)
raw_dim = await ConfigCenter.get('AI_EMBED_DIM', DEFAULT_VECTOR_DIMENSION)
try:
vector_dim = int(raw_dim)
except (TypeError, ValueError):
vector_dim = DEFAULT_VECTOR_DIMENSION
if vector_dim <= 0:
vector_dim = DEFAULT_VECTOR_DIMENSION
await vector_db.ensure_collection(collection_name, vector=True, dim=vector_dim)
await vector_db.upsert_vector(
collection_name, {'path': path, 'embedding': embedding})
await LogService.info(

View File

@@ -90,6 +90,16 @@ class ShareService:
raise HTTPException(status_code=404, detail="分享链接不存在")
await share.delete()
@staticmethod
async def delete_expired_shares(user: UserAccount) -> int:
"""
Delete all of the current user's expired share links and return the number deleted.
Condition: expires_at is set and is less than or equal to the current time (UTC).
"""
now = datetime.now(timezone.utc)
deleted_count = await ShareLink.filter(user=user, expires_at__lte=now).delete()
return deleted_count
@staticmethod
async def get_shared_item_details(share: ShareLink, sub_path: str = ""):
"""
@@ -122,4 +132,4 @@ class ShareService:
raise e
share_service = ShareService()

View File

@@ -13,6 +13,14 @@ class TaskStatus(str, Enum):
FAILED = "failed"
class TaskProgress(BaseModel):
stage: str | None = None
percent: float | None = None
bytes_total: int | None = None
bytes_done: int | None = None
detail: str | None = None
class Task(BaseModel):
id: str = Field(default_factory=lambda: uuid.uuid4().hex)
name: str
@@ -20,13 +28,20 @@ class Task(BaseModel):
result: Any = None
error: str | None = None
task_info: Dict[str, Any] = {}
progress: TaskProgress | None = None
meta: Dict[str, Any] | None = None
_SENTINEL = object()
class TaskQueueService:
def __init__(self):
self._queue: asyncio.Queue[Task | object] = asyncio.Queue()
self._tasks: Dict[str, Task] = {}
self._worker_tasks: list[asyncio.Task] = []
self._concurrency: int = 1
self._worker_seq: int = 0
async def add_task(self, name: str, task_info: Dict[str, Any]) -> Task:
task = Task(name=name, task_info=task_info)
@@ -41,6 +56,21 @@ class TaskQueueService:
def get_all_tasks(self) -> list[Task]:
return list(self._tasks.values())
async def update_progress(self, task_id: str, progress: TaskProgress | Dict[str, Any]):
task = self._tasks.get(task_id)
if not task:
return
if isinstance(progress, TaskProgress):
task.progress = progress
else:
task.progress = TaskProgress(**progress)
async def update_meta(self, task_id: str, meta: Dict[str, Any]):
task = self._tasks.get(task_id)
if not task:
return
task.meta = (task.meta or {}) | meta
async def _execute_task(self, task: Task):
from services.virtual_fs import process_file
@@ -54,10 +84,11 @@ class TaskQueueService:
path=params["path"],
processor_type=params["processor_type"],
config=params["config"],
save_to=params.get("save_to"),
overwrite=params.get("overwrite", False),
)
task.result = result
elif task.name == "automation_task" or self._is_processor_task(task.name):
from models.database import AutomationTask
from services.processors.registry import get as get_processor
from services.virtual_fs import read_file, write_file
@@ -66,9 +97,21 @@ class TaskQueueService:
auto_task = await AutomationTask.get(id=params["task_id"])
path = params["path"]
processor_type = auto_task.processor_type if task.name == "automation_task" else task.name
processor = get_processor(processor_type)
if not processor:
raise ValueError(f"Processor {processor_type} not found for task {auto_task.id}")
if processor_type != auto_task.processor_type:
await LogService.warning(
"task_queue",
"Processor type mismatch; falling back to stored type",
{"task_id": auto_task.id, "expected": auto_task.processor_type, "got": processor_type},
)
processor_type = auto_task.processor_type
processor = get_processor(processor_type)
if not processor:
raise ValueError(f"Processor {processor_type} not found for task {auto_task.id}")
file_content = await read_file(path)
result = await processor.process(file_content, path, auto_task.processor_config)
@@ -77,6 +120,16 @@ class TaskQueueService:
if save_to and getattr(processor, "produces_file", False):
await write_file(save_to, result)
task.result = "Automation task completed"
elif task.name == "offline_http_download":
from services.offline_download import run_http_download
result_path = await run_http_download(task)
task.result = {"path": result_path}
elif task.name == "cross_mount_transfer":
from services.virtual_fs import run_cross_mount_transfer_task
result = await run_cross_mount_transfer_task(task)
task.result = result
else:
raise ValueError(f"Unknown task name: {task.name}")
@@ -88,35 +141,88 @@ class TaskQueueService:
task.error = str(e)
await LogService.error("task_queue", f"Task {task.name} ({task.id}) failed: {e}", {"task_id": task.id, "name": task.name})
def _cleanup_workers(self):
self._worker_tasks = [task for task in self._worker_tasks if not task.done()]
def _is_processor_task(self, task_name: str) -> bool:
try:
from services.processors.registry import get as get_processor
return get_processor(task_name) is not None
except Exception:
return False
async def _ensure_worker_count(self):
self._cleanup_workers()
current = len(self._worker_tasks)
if current < self._concurrency:
for _ in range(self._concurrency - current):
self._worker_seq += 1
worker_id = self._worker_seq
worker_task = asyncio.create_task(self._worker_loop(worker_id))
self._worker_tasks.append(worker_task)
await LogService.info("task_queue", "Task workers adjusted", {"active_workers": len(self._worker_tasks), "target": self._concurrency})
elif current > self._concurrency:
for _ in range(current - self._concurrency):
await self._queue.put(_SENTINEL)
await LogService.info("task_queue", "Task workers scaling down", {"active_workers": len(self._worker_tasks), "target": self._concurrency})
async def _worker_loop(self, worker_id: int):
current_task = asyncio.current_task()
await LogService.info("task_queue", f"Worker {worker_id} started")
try:
while True:
job = await self._queue.get()
if job is _SENTINEL:
self._queue.task_done()
break
try:
await self._execute_task(job)
except Exception as e:
await LogService.error(
"task_queue",
f"Error executing task {job.id}: {e}",
{"task_id": job.id, "name": job.name},
)
finally:
self._queue.task_done()
finally:
if current_task in self._worker_tasks:
self._worker_tasks.remove(current_task) # type: ignore[arg-type]
await LogService.info("task_queue", f"Worker {worker_id} stopped")
async def start_worker(self, concurrency: int | None = None):
if concurrency is None:
from services.config import ConfigCenter
stored_value = await ConfigCenter.get("TASK_QUEUE_CONCURRENCY", self._concurrency)
try:
concurrency = int(stored_value)
except (TypeError, ValueError):
concurrency = self._concurrency
await self.set_concurrency(concurrency)
async def set_concurrency(self, value: int):
value = max(1, int(value))
if value != self._concurrency:
self._concurrency = value
await self._ensure_worker_count()
async def stop_worker(self):
self._cleanup_workers()
for _ in range(len(self._worker_tasks)):
await self._queue.put(_SENTINEL)
if self._worker_tasks:
await asyncio.gather(*self._worker_tasks, return_exceptions=True)
self._worker_tasks.clear()
await LogService.info("task_queue", "Task workers have been stopped.")
def get_concurrency(self) -> int:
return self._concurrency
def get_active_worker_count(self) -> int:
self._cleanup_workers()
return len(self._worker_tasks)
task_queue_service = TaskQueueService()
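The rewritten queue stops or shrinks its worker pool by enqueuing one sentinel per surplus worker instead of cancelling tasks, so jobs already in the queue still finish first. A minimal runnable sketch of that pattern:

```python
import asyncio

_SENTINEL = object()

async def worker(queue: asyncio.Queue, results: list) -> None:
    # Pull jobs until a sentinel arrives, then exit cleanly.
    while True:
        job = await queue.get()
        if job is _SENTINEL:
            queue.task_done()
            break
        results.append(job * 2)
        queue.task_done()

async def main() -> list:
    queue: asyncio.Queue = asyncio.Queue()
    results: list = []
    workers = [asyncio.create_task(worker(queue, results)) for _ in range(3)]
    for job in range(5):
        await queue.put(job)
    # Scale down to zero: one sentinel per worker, queued after the backlog,
    # so every pending job completes before the workers stop.
    for _ in range(len(workers)):
        await queue.put(_SENTINEL)
    await asyncio.gather(*workers)
    return sorted(results)

out = asyncio.run(main())
```

Because the queue is FIFO, the sentinels are only reached after the backlog drains, which is exactly why `stop_worker` can `gather` the workers without losing tasks.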

View File

@@ -25,11 +25,11 @@ class TaskService:
async def execute(self, task: AutomationTask, path: str):
await task_queue_service.add_task(
task.processor_type,
{
"task_id": task.id,
"path": path,
},
)
task_service = TaskService()

View File

@@ -1,83 +0,0 @@
from pymilvus import CollectionSchema, DataType, FieldSchema, MilvusClient
class VectorDBService:
_instance = None
def __new__(cls, *args, **kwargs):
if not cls._instance:
cls._instance = super(VectorDBService, cls).__new__(cls)
return cls._instance
def __init__(self):
if not hasattr(self, 'client'):
self.client = MilvusClient("data/db/milvus.db")
def ensure_collection(self, collection_name, vector: bool = True):
if self.client.has_collection(collection_name):
return
if vector:
fields = [
FieldSchema(name="path", dtype=DataType.VARCHAR,
max_length=512, is_primary=True, auto_id=False),
FieldSchema(name="embedding",
dtype=DataType.FLOAT_VECTOR, dim=4096)
]
schema = CollectionSchema(
fields, description="Image vector collection")
self.client.create_collection(collection_name, schema=schema)
index_params = MilvusClient.prepare_index_params()
index_params.add_index(
field_name="embedding",
index_type="IVF_FLAT",
index_name="vector_index",
metric_type="COSINE",
params={
"nlist": 64,
}
)
self.client.create_index(
collection_name,
index_params=index_params
)
else:
fields = [
FieldSchema(name="path", dtype=DataType.VARCHAR,
max_length=512, is_primary=True, auto_id=False),
]
schema = CollectionSchema(fields, description="Simple file index")
self.client.create_collection(collection_name, schema=schema)
def upsert_vector(self, collection_name, data):
self.client.upsert(collection_name, data)
def delete_vector(self, collection_name, path: str):
self.client.delete(collection_name, ids=[path])
def search_vectors(self, collection_name, query_embedding, top_k=5):
search_params = {"metric_type": "COSINE"}
results = self.client.search(
collection_name,
data=[query_embedding],
anns_field="embedding",
search_params=search_params,
limit=top_k,
output_fields=["path"]
)
print(results)
return results
def search_by_path(self, collection_name, query_path, top_k=20):
results = self.client.query(
collection_name,
filter=f"path like '%{query_path}%'",
limit=top_k,
output_fields=["path"]
)
return [[{'id': r['path'], 'distance': 1.0, 'entity': {'path': r['path']}} for r in results]]
def clear_all_data(self):
"""Clear the contents of every collection."""
collections = self.client.list_collections()
for collection_name in collections:
self.client.drop_collection(collection_name)

View File

@@ -0,0 +1,11 @@
from .service import VectorDBService, DEFAULT_VECTOR_DIMENSION
from .providers import list_providers, get_provider_entry
from .config_manager import VectorDBConfigManager
__all__ = [
"VectorDBService",
"DEFAULT_VECTOR_DIMENSION",
"list_providers",
"get_provider_entry",
"VectorDBConfigManager",
]

View File

@@ -0,0 +1,43 @@
from __future__ import annotations
import json
from typing import Any, Dict, Tuple
from services.config import ConfigCenter
class VectorDBConfigManager:
TYPE_KEY = "VECTOR_DB_TYPE"
CONFIG_KEY = "VECTOR_DB_CONFIG"
DEFAULT_TYPE = "milvus_lite"
@classmethod
async def load_config(cls) -> Tuple[str, Dict[str, Any]]:
raw_type = await ConfigCenter.get(cls.TYPE_KEY, cls.DEFAULT_TYPE)
provider_type = str(raw_type or cls.DEFAULT_TYPE)
raw_config = await ConfigCenter.get(cls.CONFIG_KEY)
config_dict: Dict[str, Any] = {}
if isinstance(raw_config, str) and raw_config:
try:
config_dict = json.loads(raw_config)
except json.JSONDecodeError:
config_dict = {}
elif isinstance(raw_config, dict):
config_dict = raw_config
return provider_type, config_dict
@classmethod
async def save_config(cls, provider_type: str, config: Dict[str, Any]) -> None:
await ConfigCenter.set(cls.TYPE_KEY, provider_type)
await ConfigCenter.set(cls.CONFIG_KEY, json.dumps(config or {}))
@classmethod
async def get_type(cls) -> str:
provider_type, _ = await cls.load_config()
return provider_type
@classmethod
async def get_config(cls) -> Dict[str, Any]:
_, config = await cls.load_config()
return config
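`VectorDBConfigManager` round-trips the provider config through `ConfigCenter` as a JSON string, falling back to defaults on missing or corrupt data. A self-contained sketch with a hypothetical in-memory `KVStore` in place of `ConfigCenter`:

```python
import json

class KVStore:
    """Hypothetical in-memory stand-in for ConfigCenter (string values only)."""
    def __init__(self):
        self._data: dict[str, str] = {}

    def set(self, key: str, value: str) -> None:
        self._data[key] = value

    def get(self, key: str, default=None):
        return self._data.get(key, default)

def save_config(store: KVStore, provider_type: str, config: dict) -> None:
    store.set("VECTOR_DB_TYPE", provider_type)
    store.set("VECTOR_DB_CONFIG", json.dumps(config or {}))

def load_config(store: KVStore) -> tuple[str, dict]:
    provider_type = str(store.get("VECTOR_DB_TYPE", "milvus_lite"))
    raw = store.get("VECTOR_DB_CONFIG")
    try:
        config = json.loads(raw) if raw else {}
    except json.JSONDecodeError:
        config = {}  # tolerate corrupt stored JSON, as the manager does
    return provider_type, config

store = KVStore()
save_config(store, "qdrant", {"url": "http://localhost:6333"})
ptype, cfg = load_config(store)
```

An empty store loads as `("milvus_lite", {})`, matching the manager's `DEFAULT_TYPE` fallback.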

View File

@@ -0,0 +1,56 @@
from __future__ import annotations
from typing import Dict, List, Type
from .base import BaseVectorProvider
from .milvus_lite import MilvusLiteProvider
from .milvus_server import MilvusServerProvider
from .qdrant import QdrantProvider
_PROVIDER_REGISTRY: Dict[str, Dict[str, object]] = {
MilvusLiteProvider.type: {
"class": MilvusLiteProvider,
"label": MilvusLiteProvider.label,
"description": MilvusLiteProvider.description,
"enabled": MilvusLiteProvider.enabled,
"config_schema": MilvusLiteProvider.config_schema,
},
MilvusServerProvider.type: {
"class": MilvusServerProvider,
"label": MilvusServerProvider.label,
"description": MilvusServerProvider.description,
"enabled": MilvusServerProvider.enabled,
"config_schema": MilvusServerProvider.config_schema,
},
QdrantProvider.type: {
"class": QdrantProvider,
"label": QdrantProvider.label,
"description": QdrantProvider.description,
"enabled": QdrantProvider.enabled,
"config_schema": QdrantProvider.config_schema,
},
}
def list_providers() -> List[Dict[str, object]]:
return [
{
"type": type_key,
"label": meta["label"],
"description": meta.get("description"),
"enabled": meta.get("enabled", True),
"config_schema": meta.get("config_schema", []),
}
for type_key, meta in _PROVIDER_REGISTRY.items()
]
def get_provider_entry(provider_type: str) -> Dict[str, object] | None:
return _PROVIDER_REGISTRY.get(provider_type)
def get_provider_class(provider_type: str) -> Type[BaseVectorProvider] | None:
entry = get_provider_entry(provider_type)
if not entry:
return None
return entry.get("class") # type: ignore[return-value]

View File

@@ -0,0 +1,41 @@
from __future__ import annotations
from typing import Any, Dict, List
class BaseVectorProvider:
"""Base class for vector database providers; every concrete implementation must inherit from it."""
type: str = ""
label: str = ""
description: str | None = None
enabled: bool = True
config_schema: List[Dict[str, Any]] = []
def __init__(self, config: Dict[str, Any] | None = None):
self.config = config or {}
async def initialize(self) -> None:
"""Run initialization logic, e.g. establishing a connection."""
raise NotImplementedError
def ensure_collection(self, collection_name: str, vector: bool, dim: int) -> None:
raise NotImplementedError
def upsert_vector(self, collection_name: str, data: Dict[str, Any]) -> None:
raise NotImplementedError
def delete_vector(self, collection_name: str, path: str) -> None:
raise NotImplementedError
def search_vectors(self, collection_name: str, query_embedding, top_k: int):
raise NotImplementedError
def search_by_path(self, collection_name: str, query_path: str, top_k: int):
raise NotImplementedError
def get_all_stats(self) -> Dict[str, Any]:
raise NotImplementedError
def clear_all_data(self) -> None:
raise NotImplementedError

View File

@@ -0,0 +1,196 @@
from __future__ import annotations
from pathlib import Path
from typing import Any, Dict, List, Optional
from pymilvus import CollectionSchema, DataType, FieldSchema, MilvusClient
from .base import BaseVectorProvider
class MilvusLiteProvider(BaseVectorProvider):
type = "milvus_lite"
label = "Milvus Lite"
description = "Embedded Milvus Lite (local file storage)."
enabled = True
config_schema: List[Dict[str, Any]] = [
{
"key": "db_path",
"label": "Database file path",
"type": "text",
"default": "data/db/milvus.db",
"required": False,
}
]
def __init__(self, config: Dict[str, Any] | None = None):
super().__init__(config)
self.db_path = Path(self.config.get("db_path") or "data/db/milvus.db")
self.client: MilvusClient | None = None
async def initialize(self) -> None:
try:
self.client = MilvusClient(str(self.db_path))
except Exception as exc: # pragma: no cover - depends on local environment
raise RuntimeError(f"Failed to open Milvus Lite at {self.db_path}: {exc}") from exc
def _get_client(self) -> MilvusClient:
if not self.client:
raise RuntimeError("Milvus Lite client is not initialized")
return self.client
@staticmethod
def _to_int(value: Any) -> int:
try:
return int(value)
except (TypeError, ValueError):
return 0
def ensure_collection(self, collection_name: str, vector: bool, dim: int) -> None:
client = self._get_client()
if client.has_collection(collection_name):
return
if vector:
vector_dim = dim if isinstance(dim, int) and dim > 0 else 0
if vector_dim <= 0:
vector_dim = 4096
fields = [
FieldSchema(name="path", dtype=DataType.VARCHAR, max_length=512, is_primary=True, auto_id=False),
FieldSchema(name="embedding", dtype=DataType.FLOAT_VECTOR, dim=vector_dim),
]
schema = CollectionSchema(fields, description="Image vector collection")
client.create_collection(collection_name, schema=schema)
index_params = MilvusClient.prepare_index_params()
index_params.add_index(
field_name="embedding",
index_type="IVF_FLAT",
index_name="vector_index",
metric_type="COSINE",
params={"nlist": 64},
)
client.create_index(collection_name, index_params=index_params)
else:
fields = [
FieldSchema(name="path", dtype=DataType.VARCHAR, max_length=512, is_primary=True, auto_id=False),
]
schema = CollectionSchema(fields, description="Simple file index")
client.create_collection(collection_name, schema=schema)
def upsert_vector(self, collection_name: str, data: Dict[str, Any]) -> None:
self._get_client().upsert(collection_name, data)
def delete_vector(self, collection_name: str, path: str) -> None:
self._get_client().delete(collection_name, ids=[path])
def search_vectors(self, collection_name: str, query_embedding, top_k: int):
search_params = {"metric_type": "COSINE"}
return self._get_client().search(
collection_name,
data=[query_embedding],
anns_field="embedding",
search_params=search_params,
limit=top_k,
output_fields=["path"],
)
def search_by_path(self, collection_name: str, query_path: str, top_k: int):
filter_expr = f"path like '%{query_path}%'" if query_path else "path like '%%'"
results = self._get_client().query(
collection_name,
filter=filter_expr,
limit=top_k,
output_fields=["path"],
)
return [[{"id": r["path"], "distance": 1.0, "entity": {"path": r["path"]}} for r in results]]
def get_all_stats(self) -> Dict[str, Any]:
client = self._get_client()
try:
collection_names = client.list_collections()
except Exception as exc:
raise RuntimeError(f"Failed to list collections: {exc}") from exc
collections: List[Dict[str, Any]] = []
total_vectors = 0
total_estimated_memory = 0
for name in collection_names:
try:
stats = client.get_collection_stats(name) or {}
except Exception:
stats = {}
row_count = self._to_int(stats.get("row_count"))
total_vectors += row_count
dimension: Optional[int] = None
is_vector_collection = False
try:
description = client.describe_collection(name)
except Exception:
description = None
if description:
for field in description.get("fields", []):
if field.get("type") == DataType.FLOAT_VECTOR:
params = field.get("params") or {}
dimension = self._to_int(params.get("dim")) or 4096
is_vector_collection = True
break
estimated_memory = 0
if is_vector_collection and dimension:
estimated_memory = row_count * dimension * 4
total_estimated_memory += estimated_memory
indexes: List[Dict[str, Any]] = []
try:
index_names = client.list_indexes(name) or []
except Exception:
index_names = []
for index_name in index_names:
try:
detail = client.describe_index(name, index_name) or {}
except Exception:
detail = {}
indexes.append(
{
"index_name": index_name,
"index_type": detail.get("index_type"),
"metric_type": detail.get("metric_type"),
"indexed_rows": self._to_int(detail.get("indexed_rows")),
"pending_index_rows": self._to_int(detail.get("pending_index_rows")),
"state": detail.get("state"),
}
)
collections.append(
{
"name": name,
"row_count": row_count,
"dimension": dimension if is_vector_collection else None,
"estimated_memory_bytes": estimated_memory,
"is_vector_collection": is_vector_collection,
"indexes": indexes,
}
)
db_file_size = None
try:
if self.db_path.exists():
db_file_size = self.db_path.stat().st_size
except OSError:
db_file_size = None
return {
"collections": collections,
"collection_count": len(collections),
"total_vectors": total_vectors,
"estimated_total_memory_bytes": total_estimated_memory,
"db_file_size_bytes": db_file_size,
}
def clear_all_data(self) -> None:
client = self._get_client()
for collection_name in client.list_collections():
client.drop_collection(collection_name)
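The `estimated_memory` figure in `get_all_stats` assumes vectors are stored as float32, i.e. 4 bytes per dimension. As a standalone sketch of that calculation:

```python
# Sketch of the memory estimate used in get_all_stats above:
# row_count vectors, each `dimension` float32 components of 4 bytes.
def estimated_memory_bytes(row_count: int, dimension: int) -> int:
    return row_count * dimension * 4

size = estimated_memory_bytes(10_000, 4096)  # 10k vectors at the default dim
```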

View File

@@ -0,0 +1,197 @@
from __future__ import annotations
from typing import Any, Dict, List, Optional
from pymilvus import CollectionSchema, DataType, FieldSchema, MilvusClient
from .base import BaseVectorProvider
class MilvusServerProvider(BaseVectorProvider):
type = "milvus_server"
label = "Milvus Server"
description = "Remote Milvus instance accessed via URI."
enabled = True
config_schema: List[Dict[str, Any]] = [
{
"key": "uri",
"label": "Server URI",
"type": "text",
"required": True,
"placeholder": "http://localhost:19530",
},
{
"key": "token",
"label": "Token",
"type": "password",
"required": False,
"placeholder": "user:password",
},
]
def __init__(self, config: Dict[str, Any] | None = None):
super().__init__(config)
self.client: MilvusClient | None = None
async def initialize(self) -> None:
uri = self.config.get("uri")
if not uri:
raise RuntimeError("Milvus Server URI is required")
try:
self.client = MilvusClient(uri=uri, token=self.config.get("token"))
except Exception as exc: # pragma: no cover - depends on remote availability
raise RuntimeError(f"Failed to connect to Milvus Server {uri}: {exc}") from exc
def _get_client(self) -> MilvusClient:
if not self.client:
raise RuntimeError("Milvus Server client is not initialized")
return self.client
@staticmethod
def _to_int(value: Any) -> int:
try:
return int(value)
except (TypeError, ValueError):
return 0
def ensure_collection(self, collection_name: str, vector: bool, dim: int) -> None:
client = self._get_client()
if client.has_collection(collection_name):
return
if vector:
vector_dim = dim if isinstance(dim, int) and dim > 0 else 0
if vector_dim <= 0:
vector_dim = 4096
fields = [
FieldSchema(name="path", dtype=DataType.VARCHAR, max_length=512, is_primary=True, auto_id=False),
FieldSchema(name="embedding", dtype=DataType.FLOAT_VECTOR, dim=vector_dim),
]
schema = CollectionSchema(fields, description="Image vector collection")
client.create_collection(collection_name, schema=schema)
index_params = MilvusClient.prepare_index_params()
index_params.add_index(
field_name="embedding",
index_type="IVF_FLAT",
index_name="vector_index",
metric_type="COSINE",
params={"nlist": 64},
)
client.create_index(collection_name, index_params=index_params)
else:
fields = [
FieldSchema(name="path", dtype=DataType.VARCHAR, max_length=512, is_primary=True, auto_id=False),
]
schema = CollectionSchema(fields, description="Simple file index")
client.create_collection(collection_name, schema=schema)
def upsert_vector(self, collection_name: str, data: Dict[str, Any]) -> None:
self._get_client().upsert(collection_name, data)
def delete_vector(self, collection_name: str, path: str) -> None:
self._get_client().delete(collection_name, ids=[path])
def search_vectors(self, collection_name: str, query_embedding, top_k: int):
search_params = {"metric_type": "COSINE"}
return self._get_client().search(
collection_name,
data=[query_embedding],
anns_field="embedding",
search_params=search_params,
limit=top_k,
output_fields=["path"],
)
def search_by_path(self, collection_name: str, query_path: str, top_k: int):
filter_expr = f"path like '%{query_path}%'" if query_path else "path like '%%'"
results = self._get_client().query(
collection_name,
filter=filter_expr,
limit=top_k,
output_fields=["path"],
)
return [[{"id": r["path"], "distance": 1.0, "entity": {"path": r["path"]}} for r in results]]
def get_all_stats(self) -> Dict[str, Any]:
client = self._get_client()
try:
collection_names = client.list_collections()
except Exception as exc:
raise RuntimeError(f"Failed to list collections: {exc}") from exc
collections: List[Dict[str, Any]] = []
total_vectors = 0
total_estimated_memory = 0
for name in collection_names:
try:
stats = client.get_collection_stats(name) or {}
except Exception:
stats = {}
row_count = self._to_int(stats.get("row_count"))
total_vectors += row_count
dimension: Optional[int] = None
is_vector_collection = False
try:
description = client.describe_collection(name)
except Exception:
description = None
if description:
for field in description.get("fields", []):
if field.get("type") == DataType.FLOAT_VECTOR:
params = field.get("params") or {}
dimension = self._to_int(params.get("dim")) or 4096
is_vector_collection = True
break
estimated_memory = 0
if is_vector_collection and dimension:
estimated_memory = row_count * dimension * 4
total_estimated_memory += estimated_memory
indexes: List[Dict[str, Any]] = []
try:
index_names = client.list_indexes(name) or []
except Exception:
index_names = []
for index_name in index_names:
try:
detail = client.describe_index(name, index_name) or {}
except Exception:
detail = {}
indexes.append(
{
"index_name": index_name,
"index_type": detail.get("index_type"),
"metric_type": detail.get("metric_type"),
"indexed_rows": self._to_int(detail.get("indexed_rows")),
"pending_index_rows": self._to_int(detail.get("pending_index_rows")),
"state": detail.get("state"),
}
)
collections.append(
{
"name": name,
"row_count": row_count,
"dimension": dimension if is_vector_collection else None,
"estimated_memory_bytes": estimated_memory,
"is_vector_collection": is_vector_collection,
"indexes": indexes,
}
)
return {
"collections": collections,
"collection_count": len(collections),
"total_vectors": total_vectors,
"estimated_total_memory_bytes": total_estimated_memory,
"db_file_size_bytes": None,
}
def clear_all_data(self) -> None:
client = self._get_client()
for collection_name in client.list_collections():
client.drop_collection(collection_name)

View File

@@ -0,0 +1,237 @@
from __future__ import annotations
from typing import Any, Dict, List, Optional, Sequence
from uuid import NAMESPACE_URL, uuid5
from qdrant_client import QdrantClient
from qdrant_client.http import models as qmodels
from .base import BaseVectorProvider
class QdrantProvider(BaseVectorProvider):
type = "qdrant"
label = "Qdrant"
description = "Qdrant vector database (HTTP API)."
enabled = True
config_schema: List[Dict[str, Any]] = [
{
"key": "url",
"label": "Server URL",
"type": "text",
"required": True,
"placeholder": "http://localhost:6333",
},
{
"key": "api_key",
"label": "API Key",
"type": "password",
"required": False,
},
]
def __init__(self, config: Dict[str, Any] | None = None):
super().__init__(config)
self.client: Optional[QdrantClient] = None
async def initialize(self) -> None:
url = (self.config.get("url") or "").strip()
if not url:
raise RuntimeError("Qdrant URL is required")
api_key = self.config.get("api_key") or None
try:
client = QdrantClient(url=url, api_key=api_key)
# basic connectivity check
client.get_collections()
self.client = client
except Exception as exc: # pragma: no cover - depends on external service
raise RuntimeError(f"Failed to connect to Qdrant at {url}: {exc}") from exc
def _get_client(self) -> QdrantClient:
if not self.client:
raise RuntimeError("Qdrant client is not initialized")
return self.client
@staticmethod
def _vector_params(vector: bool, dim: int) -> qmodels.VectorParams:
size = dim if vector and isinstance(dim, int) and dim > 0 else 1
return qmodels.VectorParams(size=size, distance=qmodels.Distance.COSINE)
def ensure_collection(self, collection_name: str, vector: bool, dim: int) -> None:
client = self._get_client()
try:
if client.collection_exists(collection_name):
return
except Exception as exc: # pragma: no cover - depends on external service
raise RuntimeError(f"Failed to check Qdrant collection '{collection_name}': {exc}") from exc
vectors_config = self._vector_params(vector, dim)
try:
client.create_collection(collection_name=collection_name, vectors_config=vectors_config)
except Exception as exc: # pragma: no cover
if "already exists" in str(exc).lower():
return
raise RuntimeError(f"Failed to create Qdrant collection '{collection_name}': {exc}") from exc
@staticmethod
def _point_id(path: str) -> str:
return str(uuid5(NAMESPACE_URL, path))
def _prepare_point(self, data: Dict[str, Any]) -> qmodels.PointStruct:
path = data.get("path")
if not path:
raise ValueError("Qdrant upsert requires 'path' in data")
embedding = data.get("embedding")
if embedding is None:
vector = [0.0]
else:
vector = [float(x) for x in embedding]
payload = {"path": path}
return qmodels.PointStruct(id=self._point_id(path), vector=vector, payload=payload)
def upsert_vector(self, collection_name: str, data: Dict[str, Any]) -> None:
client = self._get_client()
point = self._prepare_point(data)
client.upsert(collection_name=collection_name, wait=True, points=[point])
def delete_vector(self, collection_name: str, path: str) -> None:
client = self._get_client()
selector = qmodels.PointIdsList(points=[self._point_id(path)])
client.delete(collection_name=collection_name, points_selector=selector, wait=True)
def _format_search_results(self, points: Sequence[qmodels.ScoredPoint]):
return [
{
"id": point.id,
"distance": point.score,
"entity": {"path": (point.payload or {}).get("path")},
}
for point in points
]
def search_vectors(self, collection_name: str, query_embedding, top_k: int):
client = self._get_client()
vector = [float(x) for x in query_embedding]
points = client.search(
collection_name=collection_name,
query_vector=vector,
limit=top_k,
with_payload=True,
)
return [self._format_search_results(points)]
def search_by_path(self, collection_name: str, query_path: str, top_k: int):
client = self._get_client()
results: List[Dict[str, Any]] = []
offset: Optional[str | int] = None
remaining = max(top_k, 1)
while len(results) < top_k:
batch_size = min(max(remaining * 2, 10), 200)
records, next_offset = client.scroll(
collection_name=collection_name,
limit=batch_size,
offset=offset,
with_payload=True,
)
if not records:
break
for record in records:
path = (record.payload or {}).get("path")
if query_path and path:
if query_path not in path:
continue
results.append({"id": record.id, "distance": 1.0, "entity": {"path": path}})
if len(results) >= top_k:
break
if next_offset is None or len(results) >= top_k:
break
offset = next_offset
remaining = top_k - len(results)
return [results]
def _extract_vector_config(self, vectors) -> Optional[qmodels.VectorParams]:
if isinstance(vectors, qmodels.VectorParams):
return vectors
if isinstance(vectors, dict):
for value in vectors.values():
if isinstance(value, qmodels.VectorParams):
return value
return None
def get_all_stats(self) -> Dict[str, Any]:
client = self._get_client()
try:
response = client.get_collections()
except Exception as exc: # pragma: no cover
raise RuntimeError(f"Failed to list Qdrant collections: {exc}") from exc
collections: List[Dict[str, Any]] = []
total_vectors = 0
total_estimated_memory = 0
for description in response.collections or []:
name = description.name
try:
info = client.get_collection(name)
except Exception:
continue
row_count = int(info.points_count or 0)
total_vectors += row_count
vector_params = self._extract_vector_config(info.config.params.vectors if info.config and info.config.params else None)
dimension = int(vector_params.size) if vector_params and vector_params.size else None
estimated_memory = row_count * dimension * 4 if dimension else 0
total_estimated_memory += estimated_memory
distance = str(vector_params.distance) if vector_params and vector_params.distance else None
indexed_rows = int(info.indexed_vectors_count or 0)
pending_rows = max(row_count - indexed_rows, 0)
collections.append(
{
"name": name,
"row_count": row_count,
"dimension": dimension,
"estimated_memory_bytes": estimated_memory,
"is_vector_collection": dimension is not None and dimension > 1,
"indexes": [
{
"index_name": "hnsw",
"index_type": "HNSW",
"metric_type": distance,
"indexed_rows": indexed_rows,
"pending_index_rows": pending_rows,
"state": info.status,
}
],
}
)
return {
"collections": collections,
"collection_count": len(collections),
"total_vectors": total_vectors,
"estimated_total_memory_bytes": total_estimated_memory,
"db_file_size_bytes": None,
}
def clear_all_data(self) -> None:
client = self._get_client()
try:
response = client.get_collections()
except Exception as exc: # pragma: no cover
raise RuntimeError(f"Failed to list Qdrant collections: {exc}") from exc
for description in response.collections or []:
try:
client.delete_collection(description.name)
except Exception:
continue
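Qdrant point ids must be UUIDs or unsigned integers, which is why the provider above derives a deterministic UUID from the file path via `uuid5`. The same path always maps to the same point id, so upsert and delete by path stay consistent:

```python
# Deterministic point id derivation, as used by QdrantProvider._point_id above.
from uuid import NAMESPACE_URL, uuid5

def point_id(path: str) -> str:
    return str(uuid5(NAMESPACE_URL, path))
```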

View File

@@ -0,0 +1,99 @@
from __future__ import annotations
import asyncio
from typing import Any, Dict, Optional
from .config_manager import VectorDBConfigManager
from .providers import get_provider_class, get_provider_entry
from .providers.base import BaseVectorProvider
DEFAULT_VECTOR_DIMENSION = 4096
class VectorDBService:
_instance: "VectorDBService" | None = None
def __new__(cls, *args, **kwargs):
if cls._instance is None:
cls._instance = super().__new__(cls)
return cls._instance
def __init__(self):
if not hasattr(self, "_provider"):
self._provider: Optional[BaseVectorProvider] = None
self._provider_type: Optional[str] = None
self._provider_config: Dict[str, Any] | None = None
self._lock = asyncio.Lock()
async def _ensure_provider(self) -> BaseVectorProvider:
if self._provider is None:
await self.reload()
assert self._provider is not None # for type checker
return self._provider
async def reload(self) -> BaseVectorProvider:
async with self._lock:
provider_type, provider_config = await VectorDBConfigManager.load_config()
normalized_config = dict(provider_config or {})
if (
self._provider
and self._provider_type == provider_type
and self._provider_config == normalized_config
):
return self._provider
entry = get_provider_entry(provider_type)
if not entry:
raise RuntimeError(f"Unknown vector database provider: {provider_type}")
if not entry.get("enabled", True):
raise RuntimeError(f"Vector database provider '{provider_type}' is disabled")
provider_cls = get_provider_class(provider_type)
if not provider_cls:
raise RuntimeError(f"Provider class not found for '{provider_type}'")
provider = provider_cls(provider_config)
await provider.initialize()
self._provider = provider
self._provider_type = provider_type
self._provider_config = normalized_config
return provider
async def ensure_collection(self, collection_name: str, vector: bool = True, dim: int = DEFAULT_VECTOR_DIMENSION) -> None:
provider = await self._ensure_provider()
provider.ensure_collection(collection_name, vector, dim)
async def upsert_vector(self, collection_name: str, data: Dict[str, Any]) -> None:
provider = await self._ensure_provider()
provider.upsert_vector(collection_name, data)
async def delete_vector(self, collection_name: str, path: str) -> None:
provider = await self._ensure_provider()
provider.delete_vector(collection_name, path)
async def search_vectors(self, collection_name: str, query_embedding, top_k: int = 5):
provider = await self._ensure_provider()
return provider.search_vectors(collection_name, query_embedding, top_k)
async def search_by_path(self, collection_name: str, query_path: str, top_k: int = 20):
provider = await self._ensure_provider()
return provider.search_by_path(collection_name, query_path, top_k)
async def get_all_stats(self) -> Dict[str, Any]:
provider = await self._ensure_provider()
return provider.get_all_stats()
async def clear_all_data(self) -> None:
provider = await self._ensure_provider()
provider.clear_all_data()
async def current_provider(self) -> Dict[str, Any]:
provider_type, provider_config = await VectorDBConfigManager.load_config()
entry = get_provider_entry(provider_type) or {}
return {
"type": provider_type,
"config": provider_config,
"label": entry.get("label"),
"enabled": entry.get("enabled", True),
}
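`VectorDBService` is a process-wide singleton: `__new__` caches the instance and `__init__` guards against re-initialization with a `hasattr` check. A minimal sketch of that pattern in isolation:

```python
# Minimal sketch of the singleton pattern VectorDBService uses above.
class SingletonService:
    _instance = None

    def __new__(cls, *args, **kwargs):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

    def __init__(self):
        # Guard so repeated construction does not reset state.
        if not hasattr(self, "_ready"):
            self._ready = True
            self.calls = 0

a = SingletonService()
a.calls += 1
b = SingletonService()  # same object; state from `a` survives
```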

View File

@@ -1,4 +1,6 @@
from typing import Dict, Tuple, Any, Union, AsyncIterator
from __future__ import annotations
from typing import Dict, Tuple, Any, Union, AsyncIterator, List, TYPE_CHECKING
from fastapi import HTTPException
import mimetypes
from fastapi.responses import Response
@@ -6,6 +8,9 @@ import time
import hmac
import hashlib
import base64
from pathlib import Path
import shutil
import aiofiles
from models import StorageAdapter
from .adapters.registry import runtime_registry
@@ -17,6 +22,36 @@ from services.logging import LogService
from services.config import ConfigCenter
CROSS_TRANSFER_TEMP_ROOT = Path("data/tmp/cross_transfer")
if TYPE_CHECKING:
from services.task_queue import Task
def _build_absolute_path(mount_path: str, rel_path: str) -> str:
rel_norm = rel_path.lstrip('/')
mount_norm = mount_path.rstrip('/')
if not mount_norm:
return '/' + rel_norm if rel_norm else '/'
return f"{mount_norm}/{rel_norm}" if rel_norm else mount_norm
def _join_rel(base: str, name: str) -> str:
if not base:
return name.lstrip('/')
if not name:
return base
return f"{base.rstrip('/')}/{name.lstrip('/')}"
def _parent_rel(rel: str) -> str:
if not rel:
return ''
if '/' not in rel:
return ''
return rel.rsplit('/', 1)[0]
async def resolve_adapter_by_path(path: str) -> Tuple[StorageAdapter, str]:
norm = path if path.startswith('/') else '/' + path
adapters = await StorageAdapter.filter(enabled=True)
@@ -59,6 +94,24 @@ async def _ensure_method(adapter: Any, method: str):
return func
async def path_is_directory(path: str) -> bool:
"""Return whether the given path is a directory."""
adapter_instance, _, root, rel = await resolve_adapter_and_rel(path)
rel = rel.rstrip('/')
if rel == '':
return True
stat_func = getattr(adapter_instance, "stat_file", None)
if not callable(stat_func):
raise HTTPException(501, detail="Adapter does not implement stat_file")
try:
info = await stat_func(root, rel)
except FileNotFoundError:
raise HTTPException(404, detail="Path not found")
if isinstance(info, dict):
return bool(info.get("is_dir"))
return False
async def list_virtual_dir(path: str, page_num: int = 1, page_size: int = 50, sort_by: str = "name", sort_order: str = "asc") -> Dict:
norm = (path if path.startswith('/') else '/' + path).rstrip('/') or '/'
adapters = await StorageAdapter.filter(enabled=True)
@@ -221,22 +274,40 @@ async def delete_path(path: str):
await LogService.action("virtual_fs", f"Deleted {path}", details={"path": path})
async def move_path(src: str, dst: str, overwrite: bool = False, return_debug: bool = True):
async def move_path(
src: str,
dst: str,
overwrite: bool = False,
return_debug: bool = True,
allow_cross: bool = False,
):
adapter_s, adapter_model_s, root_s, rel_s = await resolve_adapter_and_rel(src)
adapter_d, adapter_model_d, root_d, rel_d = await resolve_adapter_and_rel(dst)
debug_info = {
"src": src, "dst": dst,
"rel_s": rel_s, "rel_d": rel_d,
"root_s": root_s, "root_d": root_d,
"overwrite": overwrite
"overwrite": overwrite,
"operation": "move",
"queued": False,
}
if adapter_model_s.id != adapter_model_d.id:
raise HTTPException(400, detail="Cross-adapter move not supported")
if not rel_s:
raise HTTPException(400, detail="Cannot move or rename mount root")
if not rel_d:
raise HTTPException(400, detail="Invalid destination")
if adapter_model_s.id != adapter_model_d.id:
if not allow_cross:
raise HTTPException(400, detail="Cross-adapter move not supported")
queue_info = await _enqueue_cross_mount_transfer(
operation="move",
src=src,
dst=dst,
overwrite=overwrite,
)
debug_info.update(queue_info)
return debug_info if return_debug else None
exists_func = getattr(adapter_s, "exists", None)
stat_func = getattr(adapter_s, "stat_path", None)
delete_func = await _ensure_method(adapter_s, "delete")
@@ -415,22 +486,40 @@ async def stat_file(path: str):
return await stat_func(root, rel)
async def copy_path(src: str, dst: str, overwrite: bool = False, return_debug: bool = True):
async def copy_path(
src: str,
dst: str,
overwrite: bool = False,
return_debug: bool = True,
allow_cross: bool = False,
):
adapter_s, adapter_model_s, root_s, rel_s = await resolve_adapter_and_rel(src)
adapter_d, adapter_model_d, root_d, rel_d = await resolve_adapter_and_rel(dst)
debug_info = {
"src": src, "dst": dst,
"rel_s": rel_s, "rel_d": rel_d,
"root_s": root_s, "root_d": root_d,
"overwrite": overwrite
"overwrite": overwrite,
"operation": "copy",
"queued": False,
}
if adapter_model_s.id != adapter_model_d.id:
raise HTTPException(400, detail="Cross-adapter copy not supported")
if not rel_s:
raise HTTPException(400, detail="Cannot copy mount root")
if not rel_d:
raise HTTPException(400, detail="Invalid destination")
if adapter_model_s.id != adapter_model_d.id:
if not allow_cross:
raise HTTPException(400, detail="Cross-adapter copy not supported")
queue_info = await _enqueue_cross_mount_transfer(
operation="copy",
src=src,
dst=dst,
overwrite=overwrite,
)
debug_info.update(queue_info)
return debug_info if return_debug else None
exists_func = getattr(adapter_s, "exists", None)
stat_func = getattr(adapter_s, "stat_path", None)
delete_func = getattr(adapter_s, "delete", None)
@@ -476,28 +565,424 @@ async def copy_path(src: str, dst: str, overwrite: bool = False, return_debug: b
return debug_info if return_debug else None
async def process_file(path: str, processor_type: str, config: dict, save_to: str = None):
"""
Process a file with the given processor and optionally save the result to a new path.
:param path: source file path
:param processor_type: processor type
:param config: processor configuration
:param save_to: destination path (optional); omit to only return the processed result
:return: processed file content or the save result
"""
data = await read_file(path)
async def _enqueue_cross_mount_transfer(operation: str, src: str, dst: str, overwrite: bool) -> Dict[str, Any]:
if operation not in {"move", "copy"}:
raise HTTPException(400, detail="Unsupported transfer operation")
adapter_s, adapter_model_s, _, _ = await resolve_adapter_and_rel(src)
adapter_d, adapter_model_d, root_d, rel_d = await resolve_adapter_and_rel(dst)
if adapter_model_s.id == adapter_model_d.id:
raise HTTPException(400, detail="Cross-adapter transfer requested but adapters are identical")
dst_exists = False
exists_func = getattr(adapter_d, "exists", None)
if callable(exists_func):
dst_exists = await exists_func(root_d, rel_d)
else:
try:
await stat_file(dst)
dst_exists = True
except FileNotFoundError:
dst_exists = False
except HTTPException as exc:
if exc.status_code == 404:
dst_exists = False
else:
raise
if dst_exists and not overwrite:
raise HTTPException(409, detail="Destination already exists")
payload = {
"operation": operation,
"src": src,
"dst": dst,
"overwrite": overwrite,
}
from services.task_queue import task_queue_service
task = await task_queue_service.add_task("cross_mount_transfer", payload)
return {
"queued": True,
"task_id": task.id,
"task_name": "cross_mount_transfer",
"dst_exists": dst_exists,
"cross_adapter": True,
}
async def run_cross_mount_transfer_task(task: "Task") -> Dict[str, Any]:
from services.task_queue import task_queue_service
params = task.task_info or {}
operation = params.get("operation")
src = params.get("src")
dst = params.get("dst")
overwrite = bool(params.get("overwrite", False))
if operation not in {"move", "copy"}:
raise ValueError(f"Unsupported cross mount operation: {operation}")
if not src or not dst:
raise ValueError("Missing src or dst for cross mount transfer")
adapter_s, adapter_model_s, root_s, rel_s = await resolve_adapter_and_rel(src)
adapter_d, adapter_model_d, root_d, rel_d = await resolve_adapter_and_rel(dst)
await task_queue_service.update_meta(task.id, {
"operation": operation,
"src": src,
"dst": dst,
})
if adapter_model_s.id == adapter_model_d.id:
if operation == "move":
await move_path(src, dst, overwrite=overwrite, return_debug=False, allow_cross=False)
else:
await copy_path(src, dst, overwrite=overwrite, return_debug=False, allow_cross=False)
return {
"mode": "direct",
"operation": operation,
"src": src,
"dst": dst,
"files": 0,
"bytes": 0,
}
if not rel_s:
raise ValueError("Cannot transfer mount root")
if not rel_d:
raise ValueError("Invalid destination")
dst_exists = False
exists_func = getattr(adapter_d, "exists", None)
if callable(exists_func):
dst_exists = await exists_func(root_d, rel_d)
else:
try:
await stat_file(dst)
dst_exists = True
except FileNotFoundError:
dst_exists = False
except HTTPException as exc:
if exc.status_code != 404:
raise
if dst_exists and not overwrite:
raise ValueError("Destination already exists")
if dst_exists and overwrite:
await delete_path(dst)
try:
src_stat = await stat_file(src)
except HTTPException as exc:
if exc.status_code == 404:
raise FileNotFoundError(src) from exc
raise
src_is_dir = bool(src_stat.get("is_dir"))
files_to_transfer: List[Dict[str, Any]] = []
dirs_to_create: List[str] = []
await task_queue_service.update_progress(task.id, {
"stage": "preparing",
"percent": 0.0,
"detail": "Collecting source entries",
})
if src_is_dir:
if rel_d:
dirs_to_create.append(rel_d)
list_dir = await _ensure_method(adapter_s, "list_dir")
stack: List[Tuple[str, str, str]] = [(rel_s, rel_d, '')]
page_size = 200
while stack:
current_rel, current_dst_rel, current_relative = stack.pop()
page = 1
while True:
entries, total = await list_dir(root_s, current_rel, page, page_size, "name", "asc")
if not entries and (total or 0) == 0:
break
for entry in entries:
name = entry.get("name")
if not name:
continue
child_rel = _join_rel(current_rel, name)
child_dst_rel = _join_rel(current_dst_rel, name)
child_relative = _join_rel(current_relative, name)
if entry.get("is_dir"):
dirs_to_create.append(child_dst_rel)
stack.append((child_rel, child_dst_rel, child_relative))
else:
files_to_transfer.append({
"src_rel": child_rel,
"dst_rel": child_dst_rel,
"relative_rel": child_relative or name,
"size": entry.get("size"),
"name": name,
})
if total is None or page * page_size >= (total or 0):
break
page += 1
else:
relative_rel = rel_s or (src_stat.get("name") or "file")
files_to_transfer.append({
"src_rel": rel_s,
"dst_rel": rel_d,
"relative_rel": relative_rel,
"size": src_stat.get("size"),
"name": src_stat.get("name") or rel_s.split('/')[-1],
})
parent_dir = _parent_rel(rel_d)
if parent_dir:
dirs_to_create.append(parent_dir)
CROSS_TRANSFER_TEMP_ROOT.mkdir(parents=True, exist_ok=True)
temp_dir = CROSS_TRANSFER_TEMP_ROOT / task.id
temp_dir.mkdir(parents=True, exist_ok=True)
bytes_downloaded = 0
total_dynamic_bytes = sum((f["size"] or 0) for f in files_to_transfer)
try:
for job in files_to_transfer:
src_abs = _build_absolute_path(adapter_model_s.path, job["src_rel"])
data = await read_file(src_abs)
temp_path = temp_dir / job["relative_rel"]
temp_path.parent.mkdir(parents=True, exist_ok=True)
async with aiofiles.open(temp_path, "wb") as f:
await f.write(data)
actual_size = len(data)
job["temp_path"] = temp_path
prev_size = job.get("size") or 0
if prev_size <= 0:
total_dynamic_bytes += actual_size
job_size = actual_size
else:
job_size = prev_size
job["size"] = job_size
bytes_downloaded += actual_size
percent = None
total_for_percent = total_dynamic_bytes if total_dynamic_bytes else bytes_downloaded
if total_for_percent:
percent = min(100.0, round(bytes_downloaded / total_for_percent * 100, 2))
await task_queue_service.update_progress(task.id, {
"stage": "downloading",
"percent": percent,
"bytes_done": bytes_downloaded,
"bytes_total": total_dynamic_bytes or None,
"detail": f"Downloaded {job['name']}",
})
mkdir_func = await _ensure_method(adapter_d, "mkdir")
ensured_dirs: set[str] = set()
async def ensure_dir(rel_path: str):
if not rel_path or rel_path in ensured_dirs:
return
parent = _parent_rel(rel_path)
if parent:
await ensure_dir(parent)
try:
await mkdir_func(root_d, rel_path)
except FileExistsError:
pass
except HTTPException as exc:
if exc.status_code not in {409, 400}:
raise
except Exception:
# Assume directory already exists
pass
ensured_dirs.add(rel_path)
for dir_rel in sorted({d for d in dirs_to_create if d}, key=lambda x: x.count('/')):
await ensure_dir(dir_rel)
uploaded_bytes = 0
total_bytes = sum((f["size"] or 0) for f in files_to_transfer)
async def iter_temp_file(path: Path, chunk_size: int = 512 * 1024):
async with aiofiles.open(path, "rb") as f:
while True:
chunk = await f.read(chunk_size)
if not chunk:
break
yield chunk
for job in files_to_transfer:
parent_dir = _parent_rel(job["dst_rel"])
if parent_dir:
await ensure_dir(parent_dir)
dst_abs = _build_absolute_path(adapter_model_d.path, job["dst_rel"])
temp_path: Path = job["temp_path"]
await write_file_stream(dst_abs, iter_temp_file(temp_path), overwrite=overwrite)
uploaded_bytes += job["size"] or 0
percent = None
if total_bytes:
percent = min(100.0, round(uploaded_bytes / total_bytes * 100, 2))
await task_queue_service.update_progress(task.id, {
"stage": "uploading",
"percent": percent,
"bytes_done": uploaded_bytes,
"bytes_total": total_bytes or None,
"detail": f"Uploaded {job['name']}",
})
if operation == "move":
await delete_path(src)
await task_queue_service.update_progress(task.id, {
"stage": "completed",
"percent": 100.0,
"bytes_done": total_bytes,
"bytes_total": total_bytes,
"detail": "Completed",
})
await task_queue_service.update_meta(task.id, {
"files": len(files_to_transfer),
"directories": len({d for d in dirs_to_create if d}),
"bytes": total_bytes,
"operation": operation,
})
await LogService.action(
"virtual_fs",
f"Cross-adapter {operation} from {src} to {dst}",
details={
"src": src,
"dst": dst,
"operation": operation,
"files": len(files_to_transfer),
"bytes": total_bytes,
},
)
return {
"mode": "cross",
"operation": operation,
"src": src,
"dst": dst,
"files": len(files_to_transfer),
"bytes": total_bytes,
}
finally:
try:
if temp_dir.exists():
shutil.rmtree(temp_dir)
except Exception:
await LogService.info(
"virtual_fs",
"Failed to cleanup cross transfer temp dir",
details={"task_id": task.id, "temp_dir": str(temp_dir)},
)
async def process_file(
path: str,
processor_type: str,
config: dict,
save_to: str | None = None,
overwrite: bool = False,
) -> Any:
"""处理指定路径(文件或目录)。目录会递归处理其下所有文件。"""
processor = get_processor(processor_type)
if not processor:
raise HTTPException(
400, detail=f"Processor {processor_type} not found")
actual_is_dir = await path_is_directory(path)
supported_exts = getattr(processor, "supported_exts", None) or []
allowed_exts = {
str(ext).lower().lstrip('.')
for ext in supported_exts
if isinstance(ext, str)
}
def matches_extension(rel_path: str) -> bool:
if not allowed_exts:
return True
if '.' not in rel_path:
return '' in allowed_exts
ext = rel_path.rsplit('.', 1)[-1].lower()
return ext in allowed_exts or f'.{ext}' in allowed_exts
def coerce_result_bytes(result: Any) -> bytes:
if isinstance(result, Response):
result = result.body
if isinstance(result, (bytes, bytearray)):
return bytes(result)
if isinstance(result, str):
return result.encode('utf-8')
raise HTTPException(500, detail="Processor must return bytes/Response when produces_file=True")
def build_absolute_path(mount_path: str, rel_path: str) -> str:
rel_norm = rel_path.lstrip('/')
mount_norm = mount_path.rstrip('/')
if not mount_norm:
return '/' + rel_norm if rel_norm else '/'
return f"{mount_norm}/{rel_norm}" if rel_norm else mount_norm
if actual_is_dir:
if save_to:
raise HTTPException(400, detail="Directory processing does not support custom save_to path")
if not overwrite:
raise HTTPException(400, detail="Directory processing requires overwrite")
adapter_instance, adapter_model, root, rel = await resolve_adapter_and_rel(path)
rel = rel.rstrip('/')
list_dir = await _ensure_method(adapter_instance, "list_dir")
processed_count = 0
stack: List[str] = [rel]
page_size = 200
while stack:
current = stack.pop()
page = 1
while True:
entries, total = await list_dir(root, current, page, page_size, "name", "asc")
if not entries and (total or 0) == 0:
break
for entry in entries:
name = entry.get("name")
if not name:
continue
child_rel = f"{current}/{name}" if current else name
if entry.get("is_dir"):
stack.append(child_rel)
continue
if not matches_extension(child_rel):
continue
absolute_path = build_absolute_path(adapter_model.path, child_rel)
data = await read_file(absolute_path)
result = await processor.process(data, absolute_path, config)
if getattr(processor, "produces_file", False):
result_bytes = coerce_result_bytes(result)
await write_file(absolute_path, result_bytes)
processed_count += 1
if total is None or page * page_size >= total:
break
page += 1
return {"processed_files": processed_count}
# Single-file processing
data = await read_file(path)
result = await processor.process(data, path, config)
target_path = save_to
if overwrite and not target_path:
target_path = path
if target_path and getattr(processor, "produces_file", False):
result_bytes = coerce_result_bytes(result)
await write_file(target_path, result_bytes)
return {"saved_to": target_path}
return result
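The coercion step normalizes whatever a processor returns into raw bytes before writing. The same rule, restated as a dependency-free sketch (duck-typing a `Response`-like `.body` attribute instead of importing a web framework):

```python
def to_bytes(result) -> bytes:
    # Response-like objects carry the payload on .body (assumption for this sketch).
    body = getattr(result, "body", None)
    if body is not None:
        result = body
    if isinstance(result, (bytes, bytearray)):
        return bytes(result)
    if isinstance(result, str):
        return result.encode("utf-8")
    raise TypeError("processor must return bytes, str, or a Response-like object")
```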

uv.lock

@@ -415,6 +415,7 @@ dependencies = [
{ name = "python-multipart" },
{ name = "pytz" },
{ name = "pyyaml" },
{ name = "qdrant-client" },
{ name = "rawpy" },
{ name = "rich" },
{ name = "rich-toolkit" },
@@ -505,6 +506,7 @@ requires-dist = [
{ name = "python-multipart", specifier = "==0.0.20" },
{ name = "pytz", specifier = "==2025.2" },
{ name = "pyyaml", specifier = "==6.0.2" },
{ name = "qdrant-client", specifier = "==1.15.1" },
{ name = "rawpy", specifier = "==0.25.1" },
{ name = "rich", specifier = "==14.1.0" },
{ name = "rich-toolkit", specifier = "==0.15.0" },
@@ -604,6 +606,28 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload-time = "2025-04-24T03:35:24.344Z" },
]
[[package]]
name = "h2"
version = "4.3.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "hpack" },
{ name = "hyperframe" },
]
sdist = { url = "https://files.pythonhosted.org/packages/1d/17/afa56379f94ad0fe8defd37d6eb3f89a25404ffc71d4d848893d270325fc/h2-4.3.0.tar.gz", hash = "sha256:6c59efe4323fa18b47a632221a1888bd7fde6249819beda254aeca909f221bf1", size = 2152026, upload-time = "2025-08-23T18:12:19.778Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/69/b2/119f6e6dcbd96f9069ce9a2665e0146588dc9f88f29549711853645e736a/h2-4.3.0-py3-none-any.whl", hash = "sha256:c438f029a25f7945c69e0ccf0fb951dc3f73a5f6412981daee861431b70e2bdd", size = 61779, upload-time = "2025-08-23T18:12:17.779Z" },
]
[[package]]
name = "hpack"
version = "4.1.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/2c/48/71de9ed269fdae9c8057e5a4c0aa7402e8bb16f2c6e90b3aa53327b113f8/hpack-4.1.0.tar.gz", hash = "sha256:ec5eca154f7056aa06f196a557655c5b009b382873ac8d1e66e79e87535f1dca", size = 51276, upload-time = "2025-01-22T21:44:58.347Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/07/c6/80c95b1b2b94682a72cbdbfb85b81ae2daffa4291fbfa1b1464502ede10d/hpack-4.1.0-py3-none-any.whl", hash = "sha256:157ac792668d995c657d93111f46b4535ed114f0c9c8d672271bbec7eae1b496", size = 34357, upload-time = "2025-01-22T21:44:56.92Z" },
]
[[package]]
name = "httpcore"
version = "1.0.9"
@@ -647,6 +671,20 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" },
]
[package.optional-dependencies]
http2 = [
{ name = "h2" },
]
[[package]]
name = "hyperframe"
version = "6.1.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/02/e7/94f8232d4a74cc99514c13a9f995811485a6903d48e5d952771ef6322e30/hyperframe-6.1.0.tar.gz", hash = "sha256:f630908a00854a7adeabd6382b43923a4c4cd4b821fcb527e6ab9e15382a3b08", size = 26566, upload-time = "2025-01-22T21:41:49.302Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/48/30/47d0bf6072f7252e6521f3447ccfa40b421b6824517f82854703d0f5a98b/hyperframe-6.1.0-py3-none-any.whl", hash = "sha256:b03380493a519fce58ea5af42e4a42317bf9bd425596f7a0835ffce80f1a42e5", size = 13007, upload-time = "2025-01-22T21:41:47.295Z" },
]
[[package]]
name = "idna"
version = "3.10"
@@ -950,6 +988,18 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/89/c7/5572fa4a3f45740eaab6ae86fcdf7195b55beac1371ac8c619d880cfe948/pillow-11.3.0-cp314-cp314t-win_arm64.whl", hash = "sha256:79ea0d14d3ebad43ec77ad5272e6ff9bba5b679ef73375ea760261207fa8e0aa", size = 2512835, upload-time = "2025-07-01T09:15:50.399Z" },
]
[[package]]
name = "portalocker"
version = "3.2.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pywin32", marker = "sys_platform == 'win32'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/5e/77/65b857a69ed876e1951e88aaba60f5ce6120c33703f7cb61a3c894b8c1b6/portalocker-3.2.0.tar.gz", hash = "sha256:1f3002956a54a8c3730586c5c77bf18fae4149e07eaf1c29fc3faf4d5a3f89ac", size = 95644, upload-time = "2025-06-14T13:20:40.03Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/4b/a6/38c8e2f318bf67d338f4d629e93b0b4b9af331f455f0390ea8ce4a099b26/portalocker-3.2.0-py3-none-any.whl", hash = "sha256:3cdc5f565312224bc570c49337bd21428bba0ef363bbcf58b9ef4a9f11779968", size = 22424, upload-time = "2025-06-14T13:20:38.083Z" },
]
[[package]]
name = "propcache"
version = "0.3.2"
@@ -1161,6 +1211,19 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/81/c4/34e93fe5f5429d7570ec1fa436f1986fb1f00c3e0f43a589fe2bbcd22c3f/pytz-2025.2-py2.py3-none-any.whl", hash = "sha256:5ddf76296dd8c44c26eb8f4b6f35488f3ccbf6fbbd7adee0b7262d43f0ec2f00", size = 509225, upload-time = "2025-03-25T02:24:58.468Z" },
]
[[package]]
name = "pywin32"
version = "311"
source = { registry = "https://pypi.org/simple" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a5/be/3fd5de0979fcb3994bfee0d65ed8ca9506a8a1260651b86174f6a86f52b3/pywin32-311-cp313-cp313-win32.whl", hash = "sha256:f95ba5a847cba10dd8c4d8fefa9f2a6cf283b8b88ed6178fa8a6c1ab16054d0d", size = 8705700, upload-time = "2025-07-14T20:13:26.471Z" },
{ url = "https://files.pythonhosted.org/packages/e3/28/e0a1909523c6890208295a29e05c2adb2126364e289826c0a8bc7297bd5c/pywin32-311-cp313-cp313-win_amd64.whl", hash = "sha256:718a38f7e5b058e76aee1c56ddd06908116d35147e133427e59a3983f703a20d", size = 9494700, upload-time = "2025-07-14T20:13:28.243Z" },
{ url = "https://files.pythonhosted.org/packages/04/bf/90339ac0f55726dce7d794e6d79a18a91265bdf3aa70b6b9ca52f35e022a/pywin32-311-cp313-cp313-win_arm64.whl", hash = "sha256:7b4075d959648406202d92a2310cb990fea19b535c7f4a78d3f5e10b926eeb8a", size = 8709318, upload-time = "2025-07-14T20:13:30.348Z" },
{ url = "https://files.pythonhosted.org/packages/c9/31/097f2e132c4f16d99a22bfb777e0fd88bd8e1c634304e102f313af69ace5/pywin32-311-cp314-cp314-win32.whl", hash = "sha256:b7a2c10b93f8986666d0c803ee19b5990885872a7de910fc460f9b0c2fbf92ee", size = 8840714, upload-time = "2025-07-14T20:13:32.449Z" },
{ url = "https://files.pythonhosted.org/packages/90/4b/07c77d8ba0e01349358082713400435347df8426208171ce297da32c313d/pywin32-311-cp314-cp314-win_amd64.whl", hash = "sha256:3aca44c046bd2ed8c90de9cb8427f581c479e594e99b5c0bb19b29c10fd6cb87", size = 9656800, upload-time = "2025-07-14T20:13:34.312Z" },
{ url = "https://files.pythonhosted.org/packages/c0/d2/21af5c535501a7233e734b8af901574572da66fcc254cb35d0609c9080dd/pywin32-311-cp314-cp314-win_arm64.whl", hash = "sha256:a508e2d9025764a8270f93111a970e1d0fbfc33f4153b388bb649b7eec4f9b42", size = 8932540, upload-time = "2025-07-14T20:13:36.379Z" },
]
[[package]]
name = "pyyaml"
version = "6.0.2"
@@ -1178,6 +1241,24 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446, upload-time = "2024-08-06T20:33:04.33Z" },
]
[[package]]
name = "qdrant-client"
version = "1.15.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "grpcio" },
{ name = "httpx", extra = ["http2"] },
{ name = "numpy" },
{ name = "portalocker" },
{ name = "protobuf" },
{ name = "pydantic" },
{ name = "urllib3" },
]
sdist = { url = "https://files.pythonhosted.org/packages/79/8b/76c7d325e11d97cb8eb5e261c3759e9ed6664735afbf32fdded5b580690c/qdrant_client-1.15.1.tar.gz", hash = "sha256:631f1f3caebfad0fd0c1fba98f41be81d9962b7bf3ca653bed3b727c0e0cbe0e", size = 295297, upload-time = "2025-07-31T19:35:19.627Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ef/33/d8df6a2b214ffbe4138db9a1efe3248f67dc3c671f82308bea1582ecbbb7/qdrant_client-1.15.1-py3-none-any.whl", hash = "sha256:2b975099b378382f6ca1cfb43f0d59e541be6e16a5892f282a4b8de7eff5cb63", size = 337331, upload-time = "2025-07-31T19:35:17.539Z" },
]
[[package]]
name = "rawpy"
version = "0.25.1"


@@ -5,12 +5,10 @@ import { status as getStatus } from './api/config.ts';
import type { SystemStatus } from './api/config.ts';
import { SystemContext } from './contexts/SystemContext.tsx';
import { ThemeProvider } from './contexts/ThemeContext.tsx';
import { Spin, ConfigProvider } from 'antd';
import { Spin } from 'antd';
import { Routes, Route, Navigate } from 'react-router';
import SetupPage from './pages/SetupPage.tsx';
import { I18nProvider, useI18n } from './i18n';
import zhCN from 'antd/locale/zh_CN';
import enUS from 'antd/locale/en_US';
import { I18nProvider } from './i18n';
function AppInner() {
const [status, setStatus] = useState<SystemStatus | null>(null);
@@ -39,26 +37,21 @@ function AppInner() {
);
}
const { lang } = useI18n();
const locale = lang === 'zh' ? zhCN : enUS;
return (
<ConfigProvider locale={locale}>
<SystemContext.Provider value={status}>
<AuthProvider>
<ThemeProvider>
{!status.is_initialized ? (
<Routes>
<Route path="/setup" element={<SetupPage />} />
<Route path="*" element={<Navigate to="/setup" replace />} />
</Routes>
) : (
<AppRouter />
)}
</ThemeProvider>
</AuthProvider>
</SystemContext.Provider>
</ConfigProvider>
<SystemContext.Provider value={status}>
<AuthProvider>
<ThemeProvider>
{!status.is_initialized ? (
<Routes>
<Route path="/setup" element={<SetupPage />} />
<Route path="*" element={<Navigate to="/setup" replace />} />
</Routes>
) : (
<AppRouter />
)}
</ThemeProvider>
</AuthProvider>
</SystemContext.Provider>
);
}


@@ -17,6 +17,21 @@ export interface AuthResponse {
token_type: string;
}
export interface MeResponse {
id: number;
username: string;
email?: string | null;
full_name?: string | null;
gravatar_url: string;
}
export interface UpdateMePayload {
email?: string | null;
full_name?: string | null;
old_password?: string;
new_password?: string;
}
export const authApi = {
register: async (username: string, password: string, email?: string, full_name?: string): Promise<any> => {
return request('/auth/register', {
@@ -42,4 +57,15 @@ export const authApi = {
logout: () => {
localStorage.removeItem('token');
},
me: async () => {
return await request<MeResponse>('/auth/me', {
method: 'GET',
});
},
updateMe: async (payload: UpdateMePayload) => {
return await request<MeResponse>('/auth/me', {
method: 'PUT',
json: payload,
});
},
};


@@ -73,4 +73,5 @@ async function request<T = any>(url: string, options: RequestOptions = {}): Prom
export { vfsApi, type VfsEntry, type DirListing } from './vfs';
export { adaptersApi, type AdapterItem, type AdapterTypeField, type AdapterTypeMeta } from './adapters';
export { shareApi, type ShareInfo, type ShareInfoWithPassword } from './share';
export { offlineDownloadsApi, type OfflineDownloadTask, type OfflineDownloadCreate, type TaskProgress } from './offlineDownloads';
export default request;


@@ -0,0 +1,35 @@
import request from './client';
export interface TaskProgress {
stage?: string | null;
percent?: number | null;
bytes_total?: number | null;
bytes_done?: number | null;
detail?: string | null;
}
export interface OfflineDownloadTask {
id: string;
name: string;
status: 'pending' | 'running' | 'success' | 'failed';
result?: any;
error?: string | null;
task_info: Record<string, any>;
progress?: TaskProgress | null;
meta?: Record<string, any> | null;
}
export interface OfflineDownloadCreate {
url: string;
dest_dir: string;
filename: string;
}
export const offlineDownloadsApi = {
create: (payload: OfflineDownloadCreate) => request<{ task_id: string }>('/offline-downloads/', {
method: 'POST',
json: payload,
}),
list: () => request<OfflineDownloadTask[]>('/offline-downloads/'),
detail: (taskId: string) => request<OfflineDownloadTask>(`/offline-downloads/${taskId}`),
};


@@ -15,7 +15,8 @@ export interface ProcessorTypeMeta {
name: string;
supported_exts: string[];
config_schema: ProcessorTypeField[];
produces_file:boolean;
produces_file: boolean;
module_path?: string | null;
}
export const processorsApi = {
@@ -29,11 +30,21 @@ export const processorsApi = {
save_to?: string;
overwrite?: boolean;
}) =>
request<any>('/processors/process', {
request<{ task_id: string }>('/processors/process', {
method: 'POST',
json: params,
}),
getSource: (type: string) =>
request<{ source: string; module_path: string }>('/processors/source/' + encodeURIComponent(type), {
method: 'GET',
}),
updateSource: (type: string, source: string) =>
request<boolean>('/processors/source/' + encodeURIComponent(type), {
method: 'PUT',
json: { source },
}),
reload: () =>
request<boolean>('/processors/reload', {
method: 'POST',
}),
};


@@ -23,10 +23,15 @@ export interface ShareCreatePayload {
password?: string;
}
export interface ClearExpiredResult {
deleted_count: number;
}
export const shareApi = {
create: (payload: ShareCreatePayload) => request<ShareInfoWithPassword>('/shares', { method: 'POST', json: payload }),
list: () => request<ShareInfo[]>('/shares'),
remove: (shareId: number) => request<void>(`/shares/${shareId}`, { method: 'DELETE' }),
clearExpired: () => request<ClearExpiredResult>(`/shares/expired`, { method: 'DELETE' }),
get: (token: string) => request<ShareInfo>(`/s/${token}`),
verifyPassword: (token: string, password: string) => request<void>(`/s/${token}/verify`, { method: 'POST', json: { password } }),
listDir: (token: string, path: string = '/', password?: string) => {
@@ -40,4 +45,4 @@ export const shareApi = {
const url = `${API_BASE_URL}/s/${token}/download?path=${encodeURIComponent(path)}`;
return password ? `${url}&password=${encodeURIComponent(password)}` : url;
},
};
};


@@ -1,4 +1,5 @@
import request from './client';
import type { TaskProgress } from './offlineDownloads';
export interface AutomationTask {
id: number;
@@ -21,6 +22,17 @@ export interface QueuedTask {
result?: any;
error?: string;
task_info: Record<string, any>;
progress?: TaskProgress | null;
meta?: Record<string, any> | null;
}
export interface TaskQueueSettings {
concurrency: number;
active_workers: number;
}
export interface TaskQueueSettingsUpdate {
concurrency: number;
}
export const tasksApi = {
@@ -29,4 +41,6 @@ export const tasksApi = {
update: (id: number, payload: AutomationTaskUpdate) => request<AutomationTask>(`/tasks/${id}`, { method: 'PUT', json: payload }),
remove: (id: number) => request<void>(`/tasks/${id}`, { method: 'DELETE' }),
getQueue: () => request<QueuedTask[]>('/tasks/queue'),
};
getQueueSettings: () => request<TaskQueueSettings>('/tasks/queue/settings'),
updateQueueSettings: (payload: TaskQueueSettingsUpdate) => request<TaskQueueSettings>('/tasks/queue/settings', { method: 'POST', json: payload }),
};


@@ -1,5 +1,65 @@
import client from './client';
export interface VectorDBIndexInfo {
index_name: string;
index_type?: string;
metric_type?: string;
indexed_rows: number;
pending_index_rows: number;
state?: string;
}
export interface VectorDBCollectionStats {
name: string;
row_count: number;
dimension: number | null;
estimated_memory_bytes: number;
is_vector_collection: boolean;
indexes: VectorDBIndexInfo[];
}
export interface VectorDBStats {
collections: VectorDBCollectionStats[];
collection_count: number;
total_vectors: number;
estimated_total_memory_bytes: number;
db_file_size_bytes: number | null;
}
export interface VectorDBProviderField {
key: string;
label: string;
type: 'text' | 'password';
required?: boolean;
default?: string;
placeholder?: string;
}
export interface VectorDBProviderMeta {
type: string;
label: string;
description?: string;
enabled: boolean;
config_schema: VectorDBProviderField[];
}
export interface VectorDBCurrentConfig {
type: string;
config: Record<string, string>;
label?: string;
enabled?: boolean;
}
export interface UpdateVectorDBConfigResponse {
config: VectorDBCurrentConfig;
stats: VectorDBStats;
}
export const vectorDBApi = {
getProviders: () => client<VectorDBProviderMeta[]>('/vector-db/providers', { method: 'GET' }),
getConfig: () => client<VectorDBCurrentConfig>('/vector-db/config', { method: 'GET' }),
getStats: () => client<VectorDBStats>('/vector-db/stats', { method: 'GET' }),
updateConfig: (payload: { type: string; config: Record<string, string> }) =>
client<UpdateVectorDBConfigResponse>('/vector-db/config', { method: 'POST', json: payload }),
clearAll: () => client('/vector-db/clear-all', { method: 'POST' }),
};
};


@@ -50,7 +50,18 @@ export const vfsApi = {
},
mkdir: (path: string) => request('/fs/mkdir', { method: 'POST', json: { path } }),
deletePath: (path: string) => request(`/fs/${encodeURI(path.replace(/^\/+/, ''))}`, { method: 'DELETE' }),
move: (src: string, dst: string) => request('/fs/move', { method: 'POST', json: { src, dst } }),
move: (src: string, dst: string, options?: { overwrite?: boolean }) => {
const params = new URLSearchParams();
if (options?.overwrite !== undefined) params.set('overwrite', String(options.overwrite));
const query = params.toString();
return request(`/fs/move${query ? `?${query}` : ''}`, { method: 'POST', json: { src, dst } });
},
copy: (src: string, dst: string, options?: { overwrite?: boolean }) => {
const params = new URLSearchParams();
if (options?.overwrite !== undefined) params.set('overwrite', String(options.overwrite));
const query = params.toString();
return request(`/fs/copy${query ? `?${query}` : ''}`, { method: 'POST', json: { src, dst } });
},
rename: (src: string, dst: string) => request('/fs/rename', { method: 'POST', json: { src, dst } }),
thumb: (path: string, w=256, h=256, fit='cover') =>
request<ArrayBuffer>(`/fs/thumb/${encodeURI(path.replace(/^\/+/, ''))}?w=${w}&h=${h}&fit=${fit}`),


@@ -1,11 +1,12 @@
import React, { useState, useEffect, useCallback, useRef, useMemo } from 'react';
import React, { useState, useEffect, useCallback, useRef, useMemo, Suspense } from 'react';
import { Layout, Spin, Button, Space, message } from 'antd';
import MDEditor from '@uiw/react-md-editor';
import Editor from '@monaco-editor/react';
import type { AppComponentProps } from '../types';
import { vfsApi } from '../../api/vfs';
import request from '../../api/client';
const MonacoEditor = React.lazy(() => import('@monaco-editor/react'));
const MarkdownEditor = React.lazy(() => import('@uiw/react-md-editor'));
const { Header, Content } = Layout;
export const TextEditorApp: React.FC<AppComponentProps> = ({ filePath, entry, onRequestClose }) => {
@@ -143,26 +144,30 @@ export const TextEditorApp: React.FC<AppComponentProps> = ({ filePath, entry, on
</div>
) : (
isMarkdown ? (
<MDEditor
value={content}
onChange={(val) => setContent(val || '')}
height="100%"
preview={truncated ? 'preview' : 'live'}
/>
<Suspense fallback={<Spin style={{ marginTop: 24 }} />}>
<MarkdownEditor
value={content}
onChange={(val) => setContent(val || '')}
height="100%"
preview={truncated ? 'preview' : 'live'}
/>
</Suspense>
) : (
<Editor
value={content}
onChange={(val) => setContent(val || '')}
height="100%"
language={monacoLanguage}
options={{
readOnly: truncated,
minimap: { enabled: false },
scrollBeyondLastLine: false,
wordWrap: 'on',
fontSize: 13,
}}
/>
<Suspense fallback={<Spin style={{ marginTop: 24 }} />}>
<MonacoEditor
value={content}
onChange={(val) => setContent(val || '')}
height="100%"
language={monacoLanguage}
options={{
readOnly: truncated,
minimap: { enabled: false },
scrollBeyondLastLine: false,
wordWrap: 'on',
fontSize: 13,
}}
/>
</Suspense>
)
)}
</Content>


@@ -0,0 +1,143 @@
import { memo, useEffect, useMemo, useState } from 'react';
import { Modal, Button, List, Typography, Space, Input, message } from 'antd';
import { FolderOutlined, ArrowUpOutlined } from '@ant-design/icons';
import { useI18n } from '../i18n';
import { vfsApi, type VfsEntry } from '../api/client';
import { getFileIcon } from '../pages/FileExplorerPage/components/FileIcons';
export type PathSelectorMode = 'directory' | 'file' | 'any';
interface PathSelectorModalProps {
open: boolean;
mode?: PathSelectorMode;
initialPath?: string;
onOk: (path: string) => void;
onCancel: () => void;
}
function normalizePath(p: string): string {
if (!p) return '/';
const s = ('/' + p).replace(/\/+/g, '/');
return s.replace(/\\/g, '/').replace(/\/+$/, '') || '/';
}
function joinPath(dir: string, name: string): string {
const base = normalizePath(dir);
if (base === '/') return `/${name}`;
return `${base}/${name}`.replace(/\/+/g, '/');
}
const PathSelectorModal = memo(function PathSelectorModal({ open, mode = 'directory', initialPath = '/', onOk, onCancel }: PathSelectorModalProps) {
const { t } = useI18n();
const [path, setPath] = useState<string>(normalizePath(initialPath));
const [entries, setEntries] = useState<VfsEntry[]>([]);
const [loading, setLoading] = useState(false);
const [selected, setSelected] = useState<string | null>(null); // selected file name within current folder
const title = useMemo(() => {
if (mode === 'file') return t('Select File');
if (mode === 'any') return t('Select Path');
return t('Select Folder');
}, [mode, t]);
const load = async (p: string) => {
setLoading(true);
try {
const listing = await vfsApi.list(p, 1, 500, 'name', 'asc');
setEntries(listing.entries);
setPath(listing.path || p);
setSelected(null);
} catch (e: any) {
message.error(e.message || t('Load failed'));
} finally {
setLoading(false);
}
};
useEffect(() => {
if (open) {
load(normalizePath(initialPath));
}
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [open, initialPath]);
const canOk = useMemo(() => {
if (mode === 'file') return !!selected;
return true;
}, [mode, selected]);
const handleOk = () => {
if (mode === 'directory') {
onOk(normalizePath(path));
return;
}
if (mode === 'file') {
if (!selected) {
message.warning(t('Please select a file'));
return;
}
onOk(joinPath(path, selected));
return;
}
// any
if (selected) onOk(joinPath(path, selected));
else onOk(normalizePath(path));
};
const goUp = () => {
const cur = normalizePath(path);
if (cur === '/') return;
const parent = cur.replace(/\/+$/, '').split('/').slice(0, -1).join('/') || '/';
load(parent);
};
return (
<Modal
title={title}
open={open}
onCancel={onCancel}
onOk={handleOk}
okButtonProps={{ disabled: !canOk }}
width={720}
>
<Space style={{ width: '100%', marginBottom: 12 }} align="center">
<Typography.Text type="secondary">{t('Current')}</Typography.Text>
<Input value={path} readOnly />
<Button onClick={goUp} icon={<ArrowUpOutlined />} disabled={path === '/'}>{t('Up')}</Button>
{mode !== 'file' && (
<Button type="primary" onClick={() => onOk(normalizePath(path))}>{t('Select Current Folder')}</Button>
)}
</Space>
<List
bordered
loading={loading}
dataSource={entries}
style={{ maxHeight: 420, overflow: 'auto' }}
renderItem={(item) => {
const isSelected = selected === item.name && !item.is_dir;
return (
<List.Item
onClick={() => {
if (item.is_dir) {
load(joinPath(path, item.name));
} else {
setSelected((prev) => (prev === item.name ? null : item.name));
}
}}
style={{ cursor: 'pointer', background: isSelected ? 'rgba(22,119,255,0.08)' : undefined }}
>
<Space>
{item.is_dir ? <FolderOutlined /> : getFileIcon(item.name)}
<Typography.Text strong={item.is_dir}>{item.name}</Typography.Text>
</Space>
</List.Item>
);
}}
/>
</Modal>
);
});
export default PathSelectorModal;
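The `normalizePath`/`joinPath` helpers above are meant to force a leading slash, collapse duplicate separators, and strip any trailing slash. Their intended behavior, mirrored in Python for clarity (a sketch, not part of the codebase):

```python
import re

def normalize_path(p: str) -> str:
    # Force a leading slash, collapse duplicate slashes, strip trailing ones.
    if not p:
        return "/"
    s = re.sub(r"/+", "/", "/" + p.replace("\\", "/"))
    return s.rstrip("/") or "/"

def join_path(directory: str, name: str) -> str:
    base = normalize_path(directory)
    return f"/{name}" if base == "/" else f"{base}/{name}"
```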


@@ -0,0 +1,122 @@
import { memo, useEffect, useState } from 'react';
import { Modal, Form, Input, message, Collapse } from 'antd';
import { useAuth } from '../contexts/AuthContext';
import { authApi } from '../api/auth';
import { useI18n } from '../i18n';
export interface ProfileModalProps {
open: boolean;
onClose: () => void;
}
const ProfileModal = memo(function ProfileModal({ open, onClose }: ProfileModalProps) {
const { user, refreshUser } = useAuth();
const [form] = Form.useForm();
const [loading, setLoading] = useState(false);
const { t } = useI18n();
useEffect(() => {
if (open && user) {
form.setFieldsValue({
username: user.username || '',
full_name: user.full_name || '',
email: user.email || '',
old_password: '',
new_password: '',
});
} else if (!open) {
form.resetFields();
}
}, [open, user, form]);
const handleOk = async () => {
try {
const values = await form.validateFields();
const payload: any = {};
if (values.full_name !== (user?.full_name || '')) payload.full_name = values.full_name || null;
if (values.email !== (user?.email || '')) payload.email = values.email || null;
if (values.old_password || values.new_password) {
payload.old_password = values.old_password || '';
payload.new_password = values.new_password || '';
}
setLoading(true);
await authApi.updateMe(payload);
await refreshUser();
message.success(t('Saved successfully'));
onClose();
} catch (e) {
if (e instanceof Error) {
message.error(e.message || t('Save failed'));
}
} finally {
setLoading(false);
}
};
return (
<Modal
title={t('Profile')}
open={open}
onCancel={onClose}
onOk={handleOk}
confirmLoading={loading}
okText={t('Save')}
cancelText={t('Cancel')}
forceRender
>
<Form form={form} layout="vertical">
<Form.Item name="username" label={t('Username')}>
<Input disabled />
</Form.Item>
<Form.Item name="full_name" label={t('Full Name')}>
<Input placeholder={t('Full Name')} />
</Form.Item>
<Form.Item name="email" label={t('Email')}>
<Input placeholder={t('Email')} type="email" />
</Form.Item>
<Collapse
size="small"
items={[{
key: 'pwd',
label: t('Change Password'),
children: (
<>
<Form.Item
name="old_password"
label={t('Old Password')}
dependencies={["new_password"]}
rules={[{
validator: async (_, value) => {
const newPwd = form.getFieldValue('new_password');
if ((value && !newPwd) || (!value && newPwd)) {
throw new Error(t('Please fill both old and new password'));
}
}
}]}
>
<Input.Password placeholder={t('Old Password')} autoComplete="current-password" />
</Form.Item>
<Form.Item
name="new_password"
label={t('New Password')}
dependencies={["old_password"]}
rules={[{
validator: async (_, value) => {
const oldPwd = form.getFieldValue('old_password');
if ((value && !oldPwd) || (!value && oldPwd)) {
throw new Error(t('Please fill both old and new password'));
}
}
}]}
>
<Input.Password placeholder={t('New Password')} autoComplete="new-password" />
</Form.Item>
</>
)
}]} />
</Form>
</Modal>
);
});
export default ProfileModal;


@@ -1,5 +1,5 @@
import React, { createContext, useContext, useState, useEffect } from 'react';
import { authApi } from '../api/auth';
import { authApi, type MeResponse } from '../api/auth';
interface AuthContextType {
token: string | null;
@@ -7,12 +7,15 @@ interface AuthContextType {
login: (username: string, password: string) => Promise<void>;
logout: () => void;
register: (username: string, password: string, email?: string, full_name?: string) => Promise<void>;
user: MeResponse | null;
refreshUser: () => Promise<void>;
}
const AuthContext = createContext<AuthContextType>({} as any);
export function AuthProvider({ children }: { children: React.ReactNode }) {
const [token, setToken] = useState<string | null>(() => localStorage.getItem('token'));
const [user, setUser] = useState<MeResponse | null>(null);
const isAuthenticated = !!token;
useEffect(() => {
@@ -22,20 +25,38 @@ export function AuthProvider({ children }: { children: React.ReactNode }) {
const login = async (username: string, password: string) => {
const res = await authApi.login({ username, password });
if (res)
if (res) {
setToken(res.access_token);
try { await refreshUser(); } catch (_) {}
}
};
const logout = () => {
setToken(null);
setUser(null);
};
const register = async (username: string, password: string, email?: string, full_name?: string) => {
await authApi.register(username, password, email, full_name);
};
const refreshUser = async () => {
if (!localStorage.getItem('token')) { setUser(null); return; }
const me = await authApi.me();
setUser(me);
};
useEffect(() => {
if (token) {
refreshUser().catch(() => setUser(null));
} else {
setUser(null);
}
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [token]);
return (
<AuthContext.Provider value={{ token, isAuthenticated, login, logout, register }}>
<AuthContext.Provider value={{ token, isAuthenticated, login, logout, register, user, refreshUser }}>
{children}
</AuthContext.Provider>
);

View File

@@ -1,10 +1,12 @@
import React, { createContext, useContext, useEffect, useMemo, useRef, useState } from 'react';
import { ConfigProvider, theme as antdTheme } from 'antd';
import zhCN from 'antd/locale/zh_CN';
import enUS from 'antd/locale/en_US';
import type { ThemeConfig } from 'antd/es/config-provider/context';
import { getAllConfig } from '../api/config';
import { useAuth } from './AuthContext';
import baseTheme from '../theme';
import { useI18n } from '../i18n';
type ThemeMode = 'light' | 'dark' | 'system';
@@ -101,6 +103,7 @@ function buildThemeConfig(state: ThemeState, systemDark: boolean): ThemeConfig {
export function ThemeProvider({ children }: { children: React.ReactNode }) {
const { isAuthenticated } = useAuth();
const { lang } = useI18n();
const systemDark = useSystemDarkPreferred();
const [state, setState] = useState<ThemeState>({ mode: 'light' });
const styleTagRef = useRef<HTMLStyleElement | null>(null);
@@ -163,6 +166,7 @@ export function ThemeProvider({ children }: { children: React.ReactNode }) {
const themeConfig = useMemo(() => buildThemeConfig(state, systemDark), [state, systemDark]);
const resolvedMode: ThemeMode = useMemo(() => (state.mode === 'system' ? (systemDark ? 'dark' : 'light') : state.mode), [state.mode, systemDark]);
const locale = useMemo(() => (lang === 'zh' ? zhCN : enUS), [lang]);
const ctxValue = useMemo<ThemeContextType>(() => ({
refreshTheme,
@@ -173,7 +177,7 @@ export function ThemeProvider({ children }: { children: React.ReactNode }) {
return (
<Ctx.Provider value={ctxValue}>
<ConfigProvider theme={{ ...themeConfig, cssVar: true }} locale={zhCN}>
<ConfigProvider theme={{ ...themeConfig, cssVar: true }} locale={locale}>
{children}
</ConfigProvider>
</Ctx.Provider>

View File

@@ -40,3 +40,29 @@ body { font-family: system-ui,-apple-system,BlinkMacSystemFont,'Segoe UI',Roboto
.fx-grid-item .thumb .badge { position:absolute; top:6px; left:6px; background: var(--ant-color-primary, #111); color:#fff; font-size:10px; padding:2px 4px; border-radius:6px; line-height:1; letter-spacing:.5px; }
.fx-grid-item .name { font-weight:600; font-size:13px; }
.ellipsis { overflow:hidden; white-space:nowrap; text-overflow:ellipsis; }
.processors-tabs {
flex: 1;
display: flex;
flex-direction: column;
min-height: 0;
padding: 5px;
}
.processors-tabs .ant-tabs-content-holder,
.processors-tabs .ant-tabs-content {
flex: 1;
height: 100%;
min-height: 0;
display: flex;
flex-direction: column;
}
.processors-tabs .ant-tabs-tabpane {
flex: 1;
height: 100%;
min-height: 0;
display: none;
flex-direction: column;
}
.processors-tabs .ant-tabs-tabpane-active {
display: flex;
}

View File

@@ -17,9 +17,17 @@ export const en = {
'Search files / tags / types': 'Search files / tags / types',
'Log Out': 'Log Out',
'Admin': 'Admin',
'Profile': 'Profile',
'Account Settings': 'Account Settings',
'Language': 'Language',
'Chinese': '中文',
'English': 'English',
'Full Name': 'Full Name',
'Email': 'Email',
'Change Password': 'Change Password',
'Old Password': 'Old Password',
'New Password': 'New Password',
'Please fill both old and new password': 'Please fill both old and new password',
// Auth / Login
'Welcome Back': 'Welcome Back',
@@ -45,6 +53,9 @@ export const en = {
'Cancel failed': 'Cancel failed',
'Load failed': 'Load failed',
'Are you sure to cancel share?': 'Are you sure to cancel share?',
'Clear expired shares': 'Clear expired shares',
'Confirm clear expired shares?': 'Confirm clear expired shares?',
'Cleared {count} expired shares': 'Cleared {count} expired shares',
'Share Name': 'Share Name',
'Share Content': 'Share Content',
@@ -72,6 +83,27 @@ export const en = {
// Offline download
'No offline download tasks': 'No offline download tasks',
'Create Offline Download': 'Create Offline Download',
'Offline Download Tasks': 'Offline Download Tasks',
'URL': 'URL',
'Please input URL': 'Please input URL',
'Destination Folder': 'Destination Folder',
'Select destination': 'Select destination',
'Filename': 'Filename',
'Please input filename': 'Please input filename',
'Start Download': 'Start Download',
'Stage': 'Stage',
'Progress': 'Progress',
'Bytes': 'Bytes',
'Save Path': 'Save Path',
'Queued': 'Queued',
'Downloading': 'Downloading',
'Transferring': 'Transferring',
'Completed': 'Completed',
'Pending': 'Pending',
'Running': 'Running',
'Success': 'Success',
'Failed': 'Failed',
// Header/File Explorer
'Home': 'Home',
'File Manager': 'File Manager',
@@ -83,12 +115,23 @@ export const en = {
'Grid': 'Grid',
'List': 'List',
'Mount Point': 'Mount Point',
'Move': 'Move',
'Move to': 'Move to',
'Copy to': 'Copy to',
'Destination path': 'Destination path',
'Move task queued': 'Move task queued',
'Move completed': 'Move completed',
'Copy task queued': 'Copy task queued',
'Copy completed': 'Copy completed',
'Please input destination path': 'Please input destination path',
// Context menu
'Upload File': 'Upload File',
'Open': 'Open',
'Open With': 'Open With',
'Default': 'Default',
'Processor': 'Processor',
'Share': 'Share',
'Rename': 'Rename',
'Delete': 'Delete',
'Details': 'Details',
@@ -136,6 +179,24 @@ export const en = {
'Copy Markdown': 'Copy Markdown',
'Close': 'Close',
// Task queue
'Task Queue': 'Task Queue',
'Last updated at {time}': 'Last updated at {time}',
'Total Tasks': 'Total Tasks',
'Running Tasks': 'Running Tasks',
'Waiting Tasks': 'Waiting Tasks',
'Failed Tasks': 'Failed Tasks',
'Active Workers': 'Active Workers',
'Task Type': 'Task Type',
'Search by name or ID': 'Search by name or ID',
'Filter by status': 'Filter by status',
'Queue Concurrency': 'Queue Concurrency',
'Settings saved': 'Settings saved',
'Expand': 'Expand',
'Adjust worker concurrency immediately': 'Adjust worker concurrency immediately',
'Auto': 'Auto',
'Manual': 'Manual',
// File detail
'Camera Make': 'Camera Make',
'Camera Model': 'Camera Model',
@@ -148,7 +209,6 @@ export const en = {
'Width': 'Width',
'Height': 'Height',
'No common EXIF info': 'No common EXIF info',
'Bytes': 'Bytes',
'File Properties': 'File Properties',
'Loading file info...': 'Loading file info...',
'Basic Info': 'Basic Info',
@@ -189,9 +249,38 @@ export const en = {
'AI Settings': 'AI Settings',
'Vision Model': 'Vision Model',
'Embedding Model': 'Embedding Model',
'Embedding Dimension': 'Embedding Dimension',
'Vector Database': 'Vector Database',
'Vector Database Settings': 'Vector Database Settings',
'Current Statistics': 'Current Statistics',
'Collections': 'Collections',
'Vectors': 'Vectors',
'Database Size': 'Database Size',
'Estimated Memory': 'Estimated Memory',
'No collections': 'No collections',
'Dimension': 'Dimension',
'Non-vector collection': 'Non-vector collection',
'Estimated memory': 'Estimated memory',
'Indexes': 'Indexes',
'Unnamed index': 'Unnamed index',
'Indexed rows': 'Indexed rows',
'Pending rows': 'Pending rows',
'Estimated memory is calculated as vectors x dimension x 4 bytes (float32).': 'Estimated memory is calculated as vectors x dimension x 4 bytes (float32).',
'Database Provider': 'Database Provider',
'Please select a provider': 'Please select a provider',
'Coming soon': 'Coming soon',
'This provider is not available yet': 'This provider is not available yet',
'Database file path': 'Database file path',
'Server URI': 'Server URI',
'Token': 'Token',
'Server URL': 'Server URL',
'API Key': 'API Key',
'Embedded Milvus Lite (local file storage).': 'Embedded Milvus Lite (local file storage).',
'Remote Milvus instance accessed via URI.': 'Remote Milvus instance accessed via URI.',
'Qdrant vector database (HTTP API).': 'Qdrant vector database (HTTP API).',
'Database Type': 'Database Type',
'Confirm embedding dimension change': 'Confirm embedding dimension change',
'Changing the embedding dimension will clear the vector database automatically. You will need to rebuild indexes afterwards. Continue?': 'Changing the embedding dimension will clear the vector database automatically. You will need to rebuild indexes afterwards. Continue?',
'Confirm clear vector database?': 'Confirm clear vector database?',
'This will delete all collections irreversibly.': 'This will delete all collections irreversibly.',
'Confirm Clear': 'Confirm Clear',
@@ -236,7 +325,6 @@ export const en = {
// Tasks
'Automation Tasks': 'Automation Tasks',
'Running Tasks': 'Running Tasks',
'Create Task': 'Create Task',
'Edit Task': 'Edit Task',
'Create Automation Task': 'Create Automation Task',
@@ -300,6 +388,45 @@ export const en = {
// Processor flow
'Processing finished': 'Processing finished',
'Processing failed': 'Processing failed',
'Processors': 'Processors',
'Processor List': 'Processor List',
'Reload': 'Reload',
'Run Processor': 'Run Processor',
'Target Path': 'Target Path',
'Please select a path': 'Please select a path',
'Select Directory': 'Select Directory',
'Overwrite original': 'Overwrite original',
'Save To': 'Save To',
'Optional output path': 'Optional output path',
'Run': 'Run',
'Select a processor': 'Select a processor',
'No module path': 'No module path',
'Source saved': 'Source saved',
'Processors reloaded': 'Processors reloaded',
'Unsaved changes': 'Unsaved changes',
'Switching processor will discard unsaved changes. Continue?': 'Switching processor will discard unsaved changes. Continue?',
'Task submitted': 'Task submitted',
'Supported Extensions': 'Supported Extensions',
'All': 'All',
'Produces File': 'Produces File',
'Yes': 'Yes',
'No': 'No',
'Please select a processor': 'Please select a processor',
'Select a path': 'Select a path',
'Source Editor': 'Source Editor',
'Module Path': 'Module Path',
'Directory processing always overwrites original files': 'Directory processing always overwrites original files',
'No data': 'No data',
// Path selector
'Select File': 'Select File',
'Select Path': 'Select Path',
'Select Folder': 'Select Folder',
'Select': 'Select',
'Current': 'Current',
'Up': 'Up',
'Select Current Folder': 'Select Current Folder',
'Please select a file': 'Please select a file',
// Plugins page
'Installed successfully': 'Installed successfully',
@@ -348,8 +475,6 @@ export const en = {
'Create admin account': 'Create admin account',
'This is the first account with full permissions': 'This is the first account with full permissions',
'Username': 'Username',
'Full Name': 'Full Name',
'Email': 'Email',
'Please input a valid email!': 'Please input a valid email!',
'Confirm Password': 'Confirm Password',
'Please confirm your password!': 'Please confirm your password!',

View File

@@ -8,9 +8,31 @@ export const zh = {
'All Files': '全部文件',
'Manage': '管理',
'System': '系统',
'Automation': '自动',
'Automation': '自动任务',
'My Shares': '我的分享',
'Offline Downloads': '离线下载',
'No offline download tasks': '暂无离线下载任务',
'Create Offline Download': '创建离线下载任务',
'Offline Download Tasks': '离线下载任务列表',
'URL': '下载地址',
'Please input URL': '请输入下载地址',
'Destination Folder': '保存目录',
'Select destination': '请选择保存目录',
'Filename': '文件名',
'Please input filename': '请输入文件名',
'Start Download': '开始下载',
'Stage': '阶段',
'Progress': '进度',
'Bytes': '已传输',
'Save Path': '保存路径',
'Queued': '排队中',
'Downloading': '下载中',
'Transferring': '转存中',
'Completed': '已完成',
'Pending': '等待',
'Running': '进行中',
'Success': '成功',
'Failed': '失败',
'Adapters': '存储挂载',
'Plugins': '应用中心',
'System Settings': '系统设置',
@@ -21,9 +43,17 @@ export const zh = {
'Search files / tags / types': '搜索文件 / 标签 / 类型',
'Log Out': '退出登录',
'Admin': '管理员',
'Profile': '个人资料',
'Account Settings': '账户设置',
'Language': '语言',
'Chinese': '中文',
'English': '英文',
'English': 'English',
'Full Name': '昵称',
'Email': '邮箱',
'Change Password': '修改密码',
'Old Password': '原密码',
'New Password': '新密码',
'Please fill both old and new password': '请同时填写原密码和新密码',
// Auth / Login
'Welcome Back': '欢迎回来',
@@ -49,6 +79,9 @@ export const zh = {
'Cancel failed': '取消失败',
'Load failed': '加载失败',
'Are you sure to cancel share?': '确认取消分享?',
'Clear expired shares': '清空过期分享',
'Confirm clear expired shares?': '确认清空过期分享?',
'Cleared {count} expired shares': '已清理 {count} 个过期分享',
'Share Name': '分享名称',
'Share Content': '分享内容',
'Created At': '创建时间',
@@ -84,12 +117,23 @@ export const zh = {
'Grid': '网格',
'List': '列表',
'Mount Point': '挂载点',
'Move': '移动',
'Move to': '移动到',
'Copy to': '复制到',
'Destination path': '目标路径',
'Move task queued': '移动任务已排队',
'Move completed': '移动完成',
'Copy task queued': '复制任务已排队',
'Copy completed': '复制完成',
'Please input destination path': '请输入目标路径',
// Context menu
'Upload File': '上传文件',
'Open': '打开',
'Open With': '打开方式',
'Default': '默认',
'Processor': '处理器',
'Share': '分享',
'Rename': '重命名',
'Delete': '删除',
'Details': '详情',
@@ -137,6 +181,22 @@ export const zh = {
'Copy Markdown': '复制 Markdown',
'Close': '关闭',
// Task queue
'Task Queue': '任务队列',
'Last updated at {time}': '上次刷新时间 {time}',
'Total Tasks': '任务总数',
'Waiting Tasks': '等待中的任务',
'Failed Tasks': '失败的任务',
'Active Workers': '活跃 Worker 数',
'Task Type': '任务类型',
'Search by name or ID': '按名称或 ID 搜索',
'Filter by status': '按状态筛选',
'Queue Concurrency': '队列并发数',
'Settings saved': '设置已保存',
'Expand': '展开',
'Adjust worker concurrency immediately': '立即调整任务并发数',
'Auto': '自动',
'Manual': '手动',
// File detail
'Camera Make': '设备品牌',
'Camera Model': '设备型号',
@@ -149,7 +209,6 @@ export const zh = {
'Width': '宽度',
'Height': '高度',
'No common EXIF info': '无常见EXIF信息',
'Bytes': '字节',
'File Properties': '文件属性',
'Loading file info...': '加载文件信息...',
'Basic Info': '基本信息',
@@ -191,9 +250,38 @@ export const zh = {
'AI Settings': 'AI设置',
'Vision Model': '视觉模型',
'Embedding Model': '嵌入模型',
'Embedding Dimension': '向量维度',
'Vector Database': '向量数据库',
'Vector Database Settings': '向量数据库设置',
'Current Statistics': '当前统计',
'Collections': '集合',
'Vectors': '向量',
'Database Size': '数据库大小',
'Estimated Memory': '估算内存',
'No collections': '暂无集合',
'Dimension': '维度',
'Non-vector collection': '非向量集合',
'Estimated memory': '估算内存',
'Indexes': '索引',
'Unnamed index': '未命名索引',
'Indexed rows': '已索引行数',
'Pending rows': '待索引行数',
'Estimated memory is calculated as vectors x dimension x 4 bytes (float32).': '估算内存 = 向量数量 x 维度 x 4 字节(float32)。',
'Database Provider': '数据库提供者',
'Please select a provider': '请选择提供者',
'Coming soon': '敬请期待',
'This provider is not available yet': '该提供者暂不可用',
'Database file path': '数据库文件路径',
'Server URI': '服务器 URI',
'Token': '令牌',
'Server URL': '服务器地址',
'API Key': 'API Key',
'Embedded Milvus Lite (local file storage).': '嵌入式 Milvus Lite(本地文件存储)。',
'Remote Milvus instance accessed via URI.': '通过 URI 访问的远程 Milvus 实例。',
'Qdrant vector database (HTTP API).': 'Qdrant 向量数据库(HTTP API)。',
'Database Type': '数据库类型',
'Confirm embedding dimension change': '确认修改向量维度',
'Changing the embedding dimension will clear the vector database automatically. You will need to rebuild indexes afterwards. Continue?': '修改向量维度会自动清空向量数据库,之后需要重建索引,是否继续?',
'Confirm clear vector database?': '确认清空向量数据库?',
'This will delete all collections irreversibly.': '此操作将删除所有集合中的所有数据,且不可逆。',
'Confirm Clear': '确认清空',
@@ -302,6 +390,45 @@ export const zh = {
// Processor flow
'Processing finished': '处理完成',
'Processing failed': '处理失败',
'Processors': '处理器',
'Processor List': '处理器列表',
'Reload': '重载',
'Run Processor': '运行处理器',
'Target Path': '目标路径',
'Please select a path': '请选择路径',
'Select Directory': '选择目录',
'Overwrite original': '覆盖原文件',
'Save To': '保存到',
'Optional output path': '可选输出路径',
'Run': '运行',
'Select a processor': '选择处理器',
'No module path': '未检测到模块路径',
'Source saved': '源码已保存',
'Processors reloaded': '处理器已重载',
'Unsaved changes': '存在未保存的修改',
'Switching processor will discard unsaved changes. Continue?': '切换处理器会丢失未保存的修改,确认继续?',
'Task submitted': '任务已提交',
'Supported Extensions': '支持的扩展名',
'All': '全部',
'Produces File': '生成文件',
'Yes': '是',
'No': '否',
'Please select a processor': '请选择处理器',
'Select a path': '请选择路径',
'Source Editor': '源码编辑',
'Module Path': '模块路径',
'Directory processing always overwrites original files': '选择目录时会强制覆盖原文件',
'No data': '暂无数据',
// Path selector
'Select File': '选择文件',
'Select Path': '选择路径',
'Select Folder': '选择目录',
'Select': '选择',
'Current': '当前',
'Up': '上一级',
'Select Current Folder': '选择当前目录',
'Please select a file': '请选择一个文件',
// Plugins page
'Installed successfully': '安装成功',
@@ -350,8 +477,6 @@ export const zh = {
'Create admin account': '创建管理员账户',
'This is the first account with full permissions': '这是系统的第一个账户,将拥有最高权限。',
'Username': '用户名',
'Full Name': '昵称',
'Email': '邮箱',
'Please input a valid email!': '请输入有效的邮箱地址!',
'Confirm Password': '确认密码',
'Please confirm your password!': '请确认您的密码!',

View File

@@ -1,11 +1,13 @@
import { Layout, Button, Dropdown, theme, Flex } from 'antd';
import { SearchOutlined, UserOutlined, MenuUnfoldOutlined, LogoutOutlined } from '@ant-design/icons';
import { Layout, Button, Dropdown, theme, Flex, Avatar, Typography } from 'antd';
import { SearchOutlined, MenuUnfoldOutlined, LogoutOutlined, UserOutlined } from '@ant-design/icons';
import { memo, useState } from 'react';
import SearchDialog from './SearchDialog.tsx';
import { authApi } from '../api/auth.ts';
import { useNavigate } from 'react-router';
import { useI18n } from '../i18n';
import LanguageSwitcher from '../components/LanguageSwitcher';
import { useAuth } from '../contexts/AuthContext';
import ProfileModal from '../components/ProfileModal';
const { Header } = Layout;
@@ -19,12 +21,16 @@ const TopHeader = memo(function TopHeader({ collapsed, onToggle }: TopHeaderProp
const [searchOpen, setSearchOpen] = useState(false);
const navigate = useNavigate();
const { t } = useI18n();
const { user } = useAuth();
const [profileOpen, setProfileOpen] = useState(false);
const handleLogout = () => {
authApi.logout();
navigate('/login', { replace: true });
};
const openProfile = () => setProfileOpen(true);
return (
<Header style={{ background: token.colorBgContainer, borderBottom: `1px solid ${token.colorBorderSecondary}`, display: 'flex', alignItems: 'center', gap: 16, backdropFilter: 'saturate(180%) blur(8px)' }}>
{collapsed && (
@@ -48,12 +54,23 @@ const TopHeader = memo(function TopHeader({ collapsed, onToggle }: TopHeaderProp
<Dropdown
menu={{
items: [
{ key: 'profile', label: t('Profile'), icon: <UserOutlined />, onClick: openProfile },
{ key: 'logout', label: t('Log Out'), icon: <LogoutOutlined />, onClick: handleLogout }
]
}}
>
<Button icon={<UserOutlined />}>{t('Admin')}</Button>
<Button type="text" style={{ paddingInline: 8, height: 40 }}>
<Flex align="center" gap={8}>
<Avatar size={28} src={user?.gravatar_url}>
{(user?.full_name || user?.username || 'A').charAt(0).toUpperCase()}
</Avatar>
<Typography.Text style={{ maxWidth: 160 }} ellipsis>
{user?.full_name || user?.username || t('Admin')}
</Typography.Text>
</Flex>
</Button>
</Dropdown>
<ProfileModal open={profileOpen} onClose={() => setProfileOpen(false)} />
</Flex>
</Header>
);

View File

@@ -9,6 +9,8 @@ import {
BugOutlined,
DatabaseOutlined,
AppstoreOutlined,
CodeOutlined,
ClockCircleOutlined,
} from '@ant-design/icons';
import type { ReactNode } from 'react';
@@ -27,7 +29,9 @@ export const navGroups: NavGroup[] = [
key: 'manage',
title: 'Manage',
children: [
{ key: 'processors', icon: React.createElement(CodeOutlined), label: 'Processors' },
{ key: 'tasks', icon: React.createElement(RobotOutlined), label: 'Automation' },
{ key: 'task-queue', icon: React.createElement(ClockCircleOutlined), label: 'Task Queue' },
{ key: 'share', icon: React.createElement(ShareAltOutlined), label: 'My Shares' },
{ key: 'offline', icon: React.createElement(CloudDownloadOutlined), label: 'Offline Downloads' },
{ key: 'adapters', icon: React.createElement(ApiOutlined), label: 'Adapters' },

View File

@@ -223,7 +223,7 @@ const AdaptersPage = memo(function AdaptersPage() {
width={480}
open={open}
onClose={() => { setOpen(false); setEditing(null); }}
destroyOnClose
destroyOnHidden
extra={
<Space>
<Button onClick={() => { setOpen(false); setEditing(null); }}>{t('Cancel')}</Button>

View File

@@ -1,4 +1,4 @@
import { memo, useEffect, useRef, useState } from 'react';
import { memo, useCallback, useEffect, useRef, useState } from 'react';
import { useParams } from 'react-router';
import { theme, Pagination } from 'antd';
import { useFileExplorer } from './hooks/useFileExplorer';
@@ -22,6 +22,7 @@ import UploadModal from './components/Modals/UploadModal';
import { ShareModal } from './components/Modals/ShareModal';
import { DirectLinkModal } from './components/Modals/DirectLinkModal';
import { FileDetailModal } from './components/FileDetailModal';
import { MoveCopyModal } from './components/Modals/MoveCopyModal';
import type { ViewMode } from './types';
import { vfsApi, type VfsEntry } from '../../api/client';
@@ -35,7 +36,7 @@ const FileExplorerPage = memo(function FileExplorerPage() {
// --- Hooks ---
const { path, entries, loading, pagination, processorTypes, sortBy, sortOrder, load, navigateTo, goUp, handlePaginationChange, refresh, handleSortChange } = useFileExplorer(navKey);
const { selectedEntries, handleSelect, handleSelectRange, clearSelection, setSelectedEntries } = useFileSelection();
const { doCreateDir, doDelete, doRename, doDownload, doShare, doGetDirectLink } = useFileActions({ path, refresh, clearSelection, onShare: (entries) => setSharingEntries(entries), onGetDirectLink: (entry) => setDirectLinkEntry(entry) });
const { doCreateDir, doDelete, doRename, doDownload, doShare, doGetDirectLink, doMove, doCopy } = useFileActions({ path, refresh, clearSelection, onShare: (entries) => setSharingEntries(entries), onGetDirectLink: (entry) => setDirectLinkEntry(entry) });
const { openFileWithDefaultApp, confirmOpenWithApp } = useAppWindows();
const { ctxMenu, blankCtxMenu, openContextMenu, openBlankContextMenu, closeContextMenus } = useContextMenu();
const uploader = useUploader(path, refresh);
@@ -51,6 +52,8 @@ const FileExplorerPage = memo(function FileExplorerPage() {
const [directLinkEntry, setDirectLinkEntry] = useState<VfsEntry | null>(null);
const [detailData, setDetailData] = useState<any>(null);
const [detailLoading, setDetailLoading] = useState(false);
const [movingEntries, setMovingEntries] = useState<VfsEntry[]>([]);
const [copyingEntries, setCopyingEntries] = useState<VfsEntry[]>([]);
// --- Effects ---
useEffect(() => {
@@ -82,6 +85,21 @@ const FileExplorerPage = memo(function FileExplorerPage() {
}
};
const buildDefaultDestination = useCallback((targetEntries: VfsEntry[]) => {
if (!targetEntries || targetEntries.length === 0) return '';
if (targetEntries.length > 1) {
return path || '/';
}
const entry = targetEntries[0];
const base = path === '/' ? '' : path;
const segments = [base, entry.name].filter(Boolean);
const joined = segments.join('/');
if (!joined) {
return '/';
}
return joined.startsWith('/') ? joined : `/${joined}`;
}, [path]);
const handleDragEnter = (e: React.DragEvent) => {
e.preventDefault();
e.stopPropagation();
@@ -189,6 +207,30 @@ const FileExplorerPage = memo(function FileExplorerPage() {
<CreateDirModal open={creatingDir} onOk={(name) => { doCreateDir(name); setCreatingDir(false); }} onCancel={() => setCreatingDir(false)} />
<RenameModal entry={renaming} onOk={(entry, newName) => { doRename(entry, newName); setRenaming(null); }} onCancel={() => setRenaming(null)} />
<FileDetailModal entry={detailEntry} loading={detailLoading} data={detailData} onClose={() => setDetailEntry(null)} />
<MoveCopyModal
mode="move"
entries={movingEntries}
open={movingEntries.length > 0}
defaultPath={buildDefaultDestination(movingEntries)}
onOk={async (destination) => {
if (movingEntries.length > 0) {
await doMove(movingEntries, destination);
}
}}
onCancel={() => setMovingEntries([])}
/>
<MoveCopyModal
mode="copy"
entries={copyingEntries}
open={copyingEntries.length > 0}
defaultPath={buildDefaultDestination(copyingEntries)}
onOk={async (destination) => {
if (copyingEntries.length > 0) {
await doCopy(copyingEntries, destination);
}
}}
onCancel={() => setCopyingEntries([])}
/>
{sharingEntries.length > 0 && (
<ShareModal
path={path}
@@ -244,6 +286,8 @@ const FileExplorerPage = memo(function FileExplorerPage() {
onCreateDir={() => setCreatingDir(true)}
onShare={doShare}
onGetDirectLink={doGetDirectLink}
onMove={(entriesToMove) => setMovingEntries(entriesToMove)}
onCopy={(entriesToCopy) => setCopyingEntries(entriesToCopy)}
/>
)}
<UploadModal
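The `buildDefaultDestination` callback in this diff pre-fills the move/copy dialog: a single selection gets the entry's own full path (so a same-folder rename-style move is one edit away), while a multi-selection gets the current directory. Copied out of the diff above as a standalone function:

```typescript
type Entry = { name: string };

// Single entry: current path joined with the entry name; multiple
// entries: the current directory itself; empty selection: ''.
function buildDefaultDestination(path: string, targetEntries: Entry[]): string {
  if (!targetEntries || targetEntries.length === 0) return '';
  if (targetEntries.length > 1) return path || '/';
  const base = path === '/' ? '' : path;
  const joined = [base, targetEntries[0].name].filter(Boolean).join('/');
  if (!joined) return '/';
  return joined.startsWith('/') ? joined : `/${joined}`;
}
```

The `filter(Boolean)` step is what keeps a root path (`'/'` → base `''`) from producing a double slash.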

View File

@@ -5,7 +5,8 @@ import { getAppsForEntry, getDefaultAppForEntry } from '../../../apps/registry';
import { useI18n } from '../../../i18n';
import {
FolderFilled, AppstoreOutlined, AppstoreAddOutlined, DownloadOutlined,
EditOutlined, DeleteOutlined, InfoCircleOutlined, UploadOutlined, PlusOutlined, ShareAltOutlined, LinkOutlined
EditOutlined, DeleteOutlined, InfoCircleOutlined, UploadOutlined, PlusOutlined,
ShareAltOutlined, LinkOutlined, CopyOutlined, SwapOutlined
} from '@ant-design/icons';
interface ContextMenuProps {
@@ -27,6 +28,8 @@ interface ContextMenuProps {
onCreateDir: () => void;
onShare: (entries: VfsEntry[]) => void;
onGetDirectLink: (entry: VfsEntry) => void;
onMove: (entries: VfsEntry[]) => void;
onCopy: (entries: VfsEntry[]) => void;
}
export const ContextMenu: React.FC<ContextMenuProps> = (props) => {
@@ -110,6 +113,20 @@ export const ContextMenu: React.FC<ContextMenuProps> = (props) => {
disabled: targetEntries.length !== 1 || targetEntries[0].type === 'mount',
onClick: () => actions.onRename(targetEntries[0]),
},
{
key: 'move',
label: t('Move'),
icon: <SwapOutlined />,
disabled: targetEntries.length === 0 || targetEntries.some(entry => entry.type === 'mount'),
onClick: () => actions.onMove(targetEntries),
},
{
key: 'copy',
label: t('Copy'),
icon: <CopyOutlined />,
disabled: targetEntries.length === 0 || targetEntries.some(entry => entry.type === 'mount'),
onClick: () => actions.onCopy(targetEntries),
},
{
key: 'delete',
label: t('Delete'),

View File

@@ -49,6 +49,18 @@ export const GridView: React.FC<Props> = ({ entries, thumbs, selectedEntries, lo
const toHex = (v: number) => v.toString(16).padStart(2, '0');
return `#${toHex(r)}${toHex(g)}${toHex(b)}`;
};
const toRgba = (hex: string, alpha: number) => {
const s = hex.replace('#', '');
const normalized = s.length === 3 ? s.split('').map(c => c + c).join('') : s;
const num = parseInt(normalized, 16);
if (Number.isNaN(num) || normalized.length !== 6) {
return `rgba(22, 119, 255, ${alpha})`;
}
const r = (num >> 16) & 255;
const g = (num >> 8) & 255;
const b = num & 255;
return `rgba(${r}, ${g}, ${b}, ${alpha})`;
};
const containerRef = useRef<HTMLDivElement | null>(null);
const itemRefs = useRef<Record<string, HTMLDivElement | null>>({});
const startRef = useRef<{ x: number, y: number } | null>(null);
@@ -168,7 +180,7 @@ export const GridView: React.FC<Props> = ({ entries, thumbs, selectedEntries, lo
width: rect.width,
height: rect.height,
border: '1px dashed var(--ant-color-border, rgba(0,0,0,0.4))',
background: 'var(--ant-color-primary-bg, rgba(0, 120, 212, 0.08))',
background: toRgba(String(token.colorPrimary || '#1677ff'), 0.16),
zIndex: 999
}}
/>
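The `toRgba` helper added in this diff converts a 3- or 6-digit hex color into an `rgba()` string so the selection rectangle can use the theme's primary color at a fixed opacity, falling back to antd's default blue (`#1677ff`) on malformed input. Extracted as a standalone function:

```typescript
function toRgba(hex: string, alpha: number): string {
  const s = hex.replace('#', '');
  // Expand shorthand '#16f' to '1166ff' before parsing.
  const normalized = s.length === 3 ? s.split('').map(c => c + c).join('') : s;
  const num = parseInt(normalized, 16);
  if (Number.isNaN(num) || normalized.length !== 6) {
    // Fall back to antd's default primary blue.
    return `rgba(22, 119, 255, ${alpha})`;
  }
  return `rgba(${(num >> 16) & 255}, ${(num >> 8) & 255}, ${num & 255}, ${alpha})`;
}
```

Note that the length check runs after `parseInt`, which is fine: a 6-character string that fails to parse yields `NaN` and still hits the fallback.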

View File

@@ -29,7 +29,7 @@ export const CreateDirModal: React.FC<CreateDirModalProps> = ({ open, onOk, onCa
onOk={handleOk}
onCancel={onCancel}
okButtonProps={{ disabled: !name.trim() }}
destroyOnClose
destroyOnHidden
>
<Input
placeholder={t('Folder Name')}

View File

@@ -0,0 +1,120 @@
import { useEffect, useMemo, useState } from 'react';
import { Button, Modal, Input, message, Space } from 'antd';
import { useI18n } from '../../../../i18n';
import type { VfsEntry } from '../../../../api/client';
import PathSelectorModal from '../../../../components/PathSelectorModal';
interface MoveCopyModalProps {
mode: 'move' | 'copy';
entries: VfsEntry[];
open: boolean;
defaultPath: string;
onOk: (destination: string) => Promise<void> | void;
onCancel: () => void;
}
export function MoveCopyModal({ mode, entries, open, defaultPath, onOk, onCancel }: MoveCopyModalProps) {
const { t } = useI18n();
const [value, setValue] = useState(defaultPath);
const [loading, setLoading] = useState(false);
const [selectorOpen, setSelectorOpen] = useState(false);
const entryName = useMemo(() => entries.length === 1 ? entries[0]?.name ?? '' : '', [entries]);
useEffect(() => {
if (open) {
setValue(defaultPath);
} else {
setValue('');
}
setLoading(false);
setSelectorOpen(false);
}, [open, defaultPath]);
const handleOk = async () => {
const trimmed = value.trim();
if (!trimmed) {
message.warning(t('Please input destination path'));
return;
}
setLoading(true);
try {
await onOk(trimmed);
onCancel();
} catch (e) {
// the caller already shows the error message; just keep the dialog open
} finally {
setLoading(false);
}
};
const title = mode === 'move' ? t('Move to') : t('Copy to');
const okText = mode === 'move' ? t('Move') : t('Copy');
const normalizeSelectedPath = (dirPath: string) => {
const collapse = (p: string) => p.replace(/\/+/g, '/');
const ensureLeading = (p: string) => (p.startsWith('/') ? p : `/${p}`);
const normalizeDir = (() => {
if (!dirPath || dirPath === '/') return '/';
const replaced = dirPath.replace(/\\/g, '/');
const trimmed = replaced.endsWith('/') ? replaced.replace(/\/+$/, '') : replaced;
return ensureLeading(collapse(trimmed));
})();
if (!entryName) {
return normalizeDir || '/';
}
const base = normalizeDir === '/' ? '' : normalizeDir;
const combined = `${base}/${entryName}`;
return ensureLeading(collapse(combined));
};
const handleBrowse = (selectedPath: string) => {
const finalPath = normalizeSelectedPath(selectedPath);
setValue(finalPath);
setSelectorOpen(false);
};
const selectorInitialPath = useMemo(() => {
if (!value) return '/';
const normalized = value.replace(/\\/g, '/');
if (!normalized || normalized === '/') return '/';
const trimmed = normalized.endsWith('/') ? normalized.replace(/\/+$/, '') : normalized;
const parts = trimmed.split('/');
if (parts.length <= 1) return '/';
parts.pop();
const parent = parts.join('/') || '/';
return parent.startsWith('/') ? parent : `/${parent}`;
}, [value]);
return (
<Modal
title={title}
open={open && entries.length > 0}
onOk={handleOk}
onCancel={onCancel}
confirmLoading={loading}
okText={okText}
destroyOnHidden
>
<Space.Compact style={{ width: '100%', marginBottom: 12 }}>
<Input
autoFocus
value={value}
placeholder={t('Destination path')}
onChange={(e) => setValue(e.target.value)}
onPressEnter={handleOk}
/>
<Button onClick={() => setSelectorOpen(true)}>{t('Select destination')}</Button>
</Space.Compact>
<PathSelectorModal
open={selectorOpen}
mode="directory"
initialPath={selectorInitialPath}
onOk={handleBrowse}
onCancel={() => setSelectorOpen(false)}
/>
</Modal>
);
}
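The modal's `normalizeSelectedPath` turns whatever the directory picker returns into a clean destination: backslashes become slashes, duplicate and trailing slashes collapse, a leading slash is guaranteed, and for a single-entry operation the entry's name is appended. Rewritten out of the component as a pure function (`entryName` is empty for multi-selection):

```typescript
function normalizeSelectedPath(dirPath: string, entryName: string): string {
  const collapse = (p: string) => p.replace(/\/+/g, '/');
  const ensureLeading = (p: string) => (p.startsWith('/') ? p : `/${p}`);
  let dir = '/';
  if (dirPath && dirPath !== '/') {
    const replaced = dirPath.replace(/\\/g, '/');   // Windows-style separators
    dir = ensureLeading(collapse(replaced.replace(/\/+$/, '')));
  }
  if (!entryName) return dir;                        // multi-selection: directory only
  const base = dir === '/' ? '' : dir;
  return ensureLeading(collapse(`${base}/${entryName}`));
}
```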

View File

@@ -58,7 +58,7 @@ export const ProcessorModal: React.FC<ProcessorModalProps> = (props) => {
onCancel={onCancel}
onOk={onOk}
confirmLoading={loading}
destroyOnClose
destroyOnHidden
>
<Form form={form} layout="vertical" onValuesChange={handleFormValuesChange}>
<Form.Item name="processor_type" label={t('Processor')} required>


@@ -32,7 +32,7 @@ export const RenameModal: React.FC<RenameModalProps> = ({ entry, onOk, onCancel
onOk={handleOk}
onCancel={onCancel}
okButtonProps={{ disabled: !name.trim() || name.trim() === entry?.name }}
-destroyOnClose
+destroyOnHidden
>
<Input
placeholder={t('New Name')}


@@ -13,6 +13,17 @@ interface FileActionsParams {
export function useFileActions({ path, refresh, clearSelection, onShare, onGetDirectLink }: FileActionsParams) {
const { t } = useI18n();
const normalizeFullPath = useCallback((name: string) => {
const base = path === '/' ? '' : path;
return `${base}/${name}`.replace(/\/{2,}/g, '/');
}, [path]);
const normalizeDestination = useCallback((dest: string) => {
const trimmed = dest.trim();
if (!trimmed) return '';
const normalized = trimmed.startsWith('/') ? trimmed : `/${trimmed}`;
return normalized.replace(/\/{2,}/g, '/');
}, []);
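For reference, the destination normalization can be exercised in isolation (a sketch of the callback above, without the `useCallback` wrapper):

```typescript
// Trim, force a leading slash, and collapse duplicate slashes.
const normalizeDestination = (dest: string): string => {
  const trimmed = dest.trim();
  if (!trimmed) return '';
  const normalized = trimmed.startsWith('/') ? trimmed : `/${trimmed}`;
  return normalized.replace(/\/{2,}/g, '/');
};
```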
const doCreateDir = useCallback(async (name: string) => {
if (!name.trim()) {
message.warning(t('Please input name'));
@@ -57,6 +68,92 @@ export function useFileActions({ path, refresh, clearSelection, onShare, onGetDi
}
}, [path, refresh]);
const buildEntryDestination = useCallback((base: string, name: string) => {
const normalizedBase = base.replace(/\/+$/, '') || '/';
const prefix = normalizedBase === '/' ? '' : normalizedBase;
const combined = `${prefix}/${name}`.replace(/\/{2,}/g, '/');
return combined.startsWith('/') ? combined : `/${combined}`;
}, []);
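`buildEntryDestination` joins a base directory and an entry name into an absolute path; a self-contained sketch of the same logic:

```typescript
// Join a directory and an entry name, normalizing slashes.
const buildEntryDestination = (base: string, name: string): string => {
  const normalizedBase = base.replace(/\/+$/, '') || '/';
  const prefix = normalizedBase === '/' ? '' : normalizedBase;
  const combined = `${prefix}/${name}`.replace(/\/{2,}/g, '/');
  return combined.startsWith('/') ? combined : `/${combined}`;
};
```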
const doMove = useCallback(async (entriesToMove: VfsEntry[], destination: string, overwrite: boolean = false) => {
if (!entriesToMove || entriesToMove.length === 0) return;
const normalized = normalizeDestination(destination);
if (!normalized) {
message.warning(t('Please input destination path'));
return;
}
const multiple = entriesToMove.length > 1;
const targetDir = multiple ? (normalized === '/' ? '/' : normalized.replace(/\/+$/, '') || '/') : normalized;
let completedCount = 0;
let queuedCount = 0;
for (const entry of entriesToMove) {
const src = normalizeFullPath(entry.name);
const dst = multiple ? buildEntryDestination(targetDir, entry.name) : normalized;
try {
const result = await vfsApi.move(src, dst, { overwrite });
if (result?.queued) {
queuedCount += 1;
} else {
completedCount += 1;
}
} catch (e: any) {
message.error(e.message);
throw e;
}
}
if (completedCount > 0) {
message.success(t('Move completed'));
}
if (queuedCount > 0) {
message.info(t('Move task queued'));
}
clearSelection();
refresh();
}, [normalizeDestination, normalizeFullPath, t, buildEntryDestination, clearSelection, refresh]);
const doCopy = useCallback(async (entriesToCopy: VfsEntry[], destination: string, overwrite: boolean = false) => {
if (!entriesToCopy || entriesToCopy.length === 0) return;
const normalized = normalizeDestination(destination);
if (!normalized) {
message.warning(t('Please input destination path'));
return;
}
const multiple = entriesToCopy.length > 1;
const targetDir = multiple ? (normalized === '/' ? '/' : normalized.replace(/\/+$/, '') || '/') : normalized;
let completedCount = 0;
let queuedCount = 0;
for (const entry of entriesToCopy) {
const src = normalizeFullPath(entry.name);
const dst = multiple ? buildEntryDestination(targetDir, entry.name) : normalized;
try {
const result = await vfsApi.copy(src, dst, { overwrite });
if (result?.queued) {
queuedCount += 1;
} else {
completedCount += 1;
}
} catch (e: any) {
message.error(e.message);
throw e;
}
}
if (completedCount > 0) {
message.success(t('Copy completed'));
}
if (queuedCount > 0) {
message.info(t('Copy task queued'));
}
refresh();
}, [normalizeDestination, normalizeFullPath, t, buildEntryDestination, refresh]);
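The queued-vs-completed tally shared by `doMove` and `doCopy` can be factored into a pure helper. This is a sketch, assuming (as the code above does) that `vfsApi.move`/`vfsApi.copy` resolve to an object whose optional `queued` flag marks deferral to the task queue:

```typescript
// Result shape assumed from vfsApi.move / vfsApi.copy.
type OpResult = { queued?: boolean };

// Count how many operations finished immediately vs. were queued.
function tallyResults(results: OpResult[]): { completed: number; queued: number } {
  let completed = 0;
  let queued = 0;
  for (const result of results) {
    if (result?.queued) queued += 1;
    else completed += 1;
  }
  return { completed, queued };
}
```

The component then shows a success message only when `completed > 0` and an info message only when `queued > 0`, so mixed batches surface both.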
const doDownload = useCallback(async (entry: VfsEntry) => {
if (entry.is_dir) {
message.warning(t('Downloading folders is not supported'));
@@ -101,5 +198,7 @@ export function useFileActions({ path, refresh, clearSelection, onShare, onGetDi
doDownload,
doShare,
doGetDirectLink,
doMove,
doCopy,
};
}


@@ -49,8 +49,8 @@ export function useProcessor({ path, processorTypes, refresh }: ProcessorParams)
overwrite: overwrite ? true : undefined,
};
-await processorsApi.process(params);
-message.success(t('Processing finished'));
+const resp = await processorsApi.process(params);
+message.success(`${t('Task submitted')}: ${resp.task_id}`);
setModal({ entry: null, visible: false });
if (overwrite || savingPath) refresh();
} catch (e: any) {


@@ -1,8 +1,234 @@
-import { Empty } from 'antd';
import { memo, useCallback, useEffect, useMemo, useState } from 'react';
import { Button, Form, Input, Modal, message, Table, Tag, Typography, Space } from 'antd';
import type { TableColumnsType } from 'antd';
import { FolderOpenOutlined } from '@ant-design/icons';
import PathSelectorModal from '../components/PathSelectorModal';
import { offlineDownloadsApi, type OfflineDownloadTask } from '../api/client';
import { useI18n } from '../i18n';
import PageCard from '../components/PageCard';
-export default function OfflineDownloadPage() {
-const { t } = useI18n();
-return <Empty description={t('No offline download tasks')} />;
interface TableRow extends OfflineDownloadTask {
key: string;
}
function formatBytes(bytes?: number | null): string {
if (bytes === undefined || bytes === null) return '--';
if (bytes === 0) return '0 B';
const units = ['B', 'KB', 'MB', 'GB', 'TB'];
const idx = Math.min(Math.floor(Math.log(bytes) / Math.log(1024)), units.length - 1);
const val = bytes / Math.pow(1024, idx);
return `${val.toFixed(idx === 0 ? 0 : 1)} ${units[idx]}`;
}
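Exercised in isolation, this helper produces values like the following (the sketch reproduces the function so it is self-contained):

```typescript
// Format a byte count with a log-based unit index; integers for bytes,
// one decimal place for KB and above.
function formatBytes(bytes?: number | null): string {
  if (bytes === undefined || bytes === null) return '--';
  if (bytes === 0) return '0 B';
  const units = ['B', 'KB', 'MB', 'GB', 'TB'];
  const idx = Math.min(Math.floor(Math.log(bytes) / Math.log(1024)), units.length - 1);
  const val = bytes / Math.pow(1024, idx);
  return `${val.toFixed(idx === 0 ? 0 : 1)} ${units[idx]}`;
}
```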
function stageLabel(t: (key: string) => string, stage?: string | null): string {
if (!stage) return '--';
const map: Record<string, string> = {
queued: t('Queued'),
downloading: t('Downloading'),
transferring: t('Transferring'),
completed: t('Completed'),
};
return map[stage] ?? stage;
}
function statusTag(status: OfflineDownloadTask['status'], t: (key: string) => string) {
switch (status) {
case 'success':
return <Tag color="green">{t('Success')}</Tag>;
case 'failed':
return <Tag color="red">{t('Failed')}</Tag>;
case 'running':
return <Tag color="blue">{t('Running')}</Tag>;
default:
return <Tag color="default">{t('Pending')}</Tag>;
}
}
const OfflineDownloadPage = memo(function OfflineDownloadPage() {
const { t } = useI18n();
const [form] = Form.useForm();
const [messageApi, contextHolder] = message.useMessage();
const [tasks, setTasks] = useState<TableRow[]>([]);
const [loading, setLoading] = useState(false);
const [submitting, setSubmitting] = useState(false);
const [pathModalOpen, setPathModalOpen] = useState(false);
const [createModalOpen, setCreateModalOpen] = useState(false);
const loadTasks = useCallback(async (withSpinner = false) => {
if (withSpinner) setLoading(true);
try {
const list = await offlineDownloadsApi.list();
setTasks(list.map(item => ({ ...item, key: item.id })));
} catch (err: any) {
messageApi.error(err?.message || t('Load failed'));
} finally {
if (withSpinner) setLoading(false);
}
}, [messageApi, t]);
useEffect(() => {
loadTasks(true);
const timer = window.setInterval(() => {
loadTasks().catch(() => {});
}, 3000);
return () => window.clearInterval(timer);
}, [loadTasks]);
const handleSelectFolder = useCallback((path: string) => {
form.setFieldsValue({ dest_dir: path });
setPathModalOpen(false);
}, [form]);
const handleSubmit = useCallback(async () => {
try {
const values = await form.validateFields();
setSubmitting(true);
const resp = await offlineDownloadsApi.create(values);
messageApi.success(`${t('Task submitted')}: ${resp.task_id}`);
form.resetFields();
await loadTasks(true);
setCreateModalOpen(false);
} catch (err: any) {
if (err?.errorFields) return;
messageApi.error(err?.message || t('Operation failed'));
} finally {
setSubmitting(false);
}
}, [form, loadTasks, messageApi, t]);
const columns: TableColumnsType<TableRow> = useMemo(() => [
{
title: t('Filename'),
dataIndex: ['meta', 'filename'],
render: (_: any, record) => record.meta?.filename || record.task_info?.filename || '--',
},
{
title: t('Stage'),
dataIndex: ['progress', 'stage'],
render: (_: any, record) => stageLabel(t, record.progress?.stage),
},
{
title: t('Progress'),
dataIndex: ['progress', 'percent'],
render: (_: any, record) => {
const percent = record.progress?.percent;
return percent !== undefined && percent !== null ? `${percent.toFixed(1)}%` : '--';
},
},
{
title: t('Bytes'),
render: (_: any, record) => {
const done = record.progress?.bytes_done;
const total = record.progress?.bytes_total;
if (done === undefined && total === undefined) return '--';
if (total) {
return `${formatBytes(done)} / ${formatBytes(total)}`;
}
return formatBytes(done);
},
},
{
title: t('Status'),
dataIndex: 'status',
render: (status: TableRow['status'], record) => (
<Space>
{statusTag(status, t)}
{status === 'failed' && record.error ? <Typography.Text type="danger">{record.error}</Typography.Text> : null}
</Space>
),
},
{
title: t('Save Path'),
dataIndex: ['meta', 'final_path'],
render: (value: string | undefined, record) => value || record.task_info?.dest_dir || '--',
},
], [t]);
return (
<>
{contextHolder}
<PageCard
title={t('Offline Downloads')}
extra={
<Button type="primary" onClick={() => { form.resetFields(); setCreateModalOpen(true); }}>
{t('Create Offline Download')}
</Button>
}
>
<Table
columns={columns}
dataSource={tasks}
loading={loading}
pagination={false}
locale={{ emptyText: t('No offline download tasks') }}
rowKey="id"
style={{ marginBottom: 0 }}
/>
</PageCard>
<Modal
title={t('Create Offline Download')}
open={createModalOpen}
onCancel={() => { setCreateModalOpen(false); form.resetFields(); }}
onOk={handleSubmit}
okText={t('Start Download')}
okButtonProps={{ loading: submitting }}
>
<Form form={form} layout="vertical">
<Form.Item name="url" label={t('URL')} rules={[{ required: true, message: t('Please input URL') }]}>
<Input placeholder="https://example.com/file" />
</Form.Item>
{(() => {
const errors = form.getFieldError('dest_dir');
const hasError = errors.length > 0;
return (
<Form.Item
label={t('Destination Folder')}
required
validateStatus={hasError ? 'error' : undefined}
help={hasError ? errors[0] : undefined}
>
<Input.Group compact style={{ display: 'flex' }}>
<Form.Item
name="dest_dir"
noStyle
rules={[{ required: true, message: t('Select destination') }]}
>
<Input
readOnly
placeholder={t('Select destination')}
style={{ width: 'calc(100% - 120px)', cursor: 'pointer' }}
onClick={() => setPathModalOpen(true)}
/>
</Form.Item>
<Button
type="default"
icon={<FolderOpenOutlined />}
onClick={() => setPathModalOpen(true)}
style={{ width: 120 }}
>
{t('Select Folder')}
</Button>
</Input.Group>
</Form.Item>
);
})()}
<Form.Item name="filename" label={t('Filename')} rules={[{ required: true, message: t('Please input filename') }]}>
<Input placeholder="example.zip" />
</Form.Item>
</Form>
</Modal>
<PathSelectorModal
open={pathModalOpen}
mode="directory"
initialPath={form.getFieldValue('dest_dir') || '/'}
onOk={handleSelectFolder}
onCancel={() => setPathModalOpen(false)}
/>
</>
);
});
export default OfflineDownloadPage;


@@ -0,0 +1,501 @@
import { memo, useCallback, useEffect, useMemo, useState } from 'react';
import {
Button,
Card,
Empty,
Flex,
Form,
Input,
message,
Modal,
Space,
Spin,
Switch,
Tabs,
Tag,
Typography,
theme,
} from 'antd';
import Editor from '@monaco-editor/react';
import { ProcessorConfigForm } from '../components/ProcessorConfigForm';
import PathSelectorModal, { type PathSelectorMode } from '../components/PathSelectorModal';
import { processorsApi, type ProcessorTypeMeta } from '../api/processors';
import { useI18n } from '../i18n';
const { Text } = Typography;
type TabKey = 'editor' | 'runner';
const ProcessorsPage = memo(function ProcessorsPage() {
const { t } = useI18n();
const { token } = theme.useToken();
const [messageApi, contextHolder] = message.useMessage();
const [processors, setProcessors] = useState<ProcessorTypeMeta[]>([]);
const [loadingList, setLoadingList] = useState(false);
const [selectedType, setSelectedType] = useState<string>('');
const [source, setSource] = useState('');
const [initialSource, setInitialSource] = useState('');
const [modulePath, setModulePath] = useState('');
const [sourceLoading, setSourceLoading] = useState(false);
const [savingSource, setSavingSource] = useState(false);
const [reloading, setReloading] = useState(false);
const [form] = Form.useForm();
const [running, setRunning] = useState(false);
const [isDirectory, setIsDirectory] = useState(false);
const [pathModalOpen, setPathModalOpen] = useState(false);
const [pathModalMode, setPathModalMode] = useState<PathSelectorMode>('file');
const [pathModalField, setPathModalField] = useState<'path' | 'save_to'>('path');
const [activeTab, setActiveTab] = useState<TabKey>('editor');
const isDirty = source !== initialSource;
const selectedProcessorMeta = useMemo(
() => processors.find(p => p.type === selectedType),
[processors, selectedType]
);
const loadList = useCallback(async () => {
setLoadingList(true);
try {
const list = await processorsApi.list();
setProcessors(list);
} catch (err: any) {
messageApi.error(err?.message || t('Load failed'));
} finally {
setLoadingList(false);
}
}, [messageApi, t]);
useEffect(() => {
loadList();
}, [loadList]);
useEffect(() => {
if (!processors.length) {
setSelectedType('');
return;
}
if (!selectedType) {
setSelectedType(processors[0].type);
} else if (!processors.some(p => p.type === selectedType)) {
setSelectedType(processors[0].type);
}
}, [processors, selectedType]);
useEffect(() => {
if (!selectedType) {
setSource('');
setInitialSource('');
setModulePath('');
return;
}
const controller = new AbortController();
setSource('');
setInitialSource('');
setModulePath('');
setSourceLoading(true);
processorsApi.getSource(selectedType)
.then(resp => {
if (controller.signal.aborted) return;
setSource(resp.source ?? '');
setInitialSource(resp.source ?? '');
setModulePath(resp.module_path ?? '');
})
.catch((err: any) => {
if (controller.signal.aborted) return;
messageApi.error(err?.message || t('Load failed'));
setSource('');
setInitialSource('');
setModulePath('');
})
.finally(() => {
if (!controller.signal.aborted) {
setSourceLoading(false);
}
});
return () => controller.abort();
}, [messageApi, selectedType, t]);
useEffect(() => {
if (!selectedProcessorMeta) {
form.resetFields();
setIsDirectory(false);
return;
}
form.resetFields();
const defaults: Record<string, any> = {};
selectedProcessorMeta.config_schema?.forEach(field => {
if (field.default !== undefined) {
defaults[field.key] = field.default;
}
});
form.setFieldsValue({
path: '',
overwrite: !!selectedProcessorMeta.produces_file,
save_to: undefined,
config: defaults,
});
setIsDirectory(false);
}, [selectedProcessorMeta, form]);
const overwriteValue = Form.useWatch('overwrite', form) ?? false;
useEffect(() => {
if (overwriteValue) {
form.setFieldsValue({ save_to: undefined });
}
}, [overwriteValue, form]);
useEffect(() => {
if (isDirectory) {
form.setFieldsValue({ overwrite: true, save_to: undefined });
}
}, [isDirectory, form]);
const handleSelectProcessor = useCallback((type: string) => {
if (type === selectedType) return;
if (isDirty) {
Modal.confirm({
title: t('Unsaved changes'),
content: t('Switching processor will discard unsaved changes. Continue?'),
okText: t('Confirm'),
cancelText: t('Cancel'),
onOk: () => {
setSelectedType(type);
setActiveTab('editor');
},
});
} else {
setSelectedType(type);
setActiveTab('editor');
}
}, [isDirty, selectedType, t]);
const handleSaveSource = useCallback(async () => {
if (!selectedType) return;
try {
setSavingSource(true);
await processorsApi.updateSource(selectedType, source);
setInitialSource(source);
messageApi.success(t('Source saved'));
} catch (err: any) {
messageApi.error(err?.message || t('Operation failed'));
} finally {
setSavingSource(false);
}
}, [messageApi, selectedType, source, t]);
const handleReloadProcessors = useCallback(async () => {
try {
setReloading(true);
await processorsApi.reload();
messageApi.success(t('Processors reloaded'));
await loadList();
} catch (err: any) {
messageApi.error(err?.message || t('Operation failed'));
} finally {
setReloading(false);
}
}, [loadList, messageApi, t]);
const openPathSelector = useCallback((field: 'path' | 'save_to', mode: PathSelectorMode) => {
setPathModalField(field);
setPathModalMode(mode);
setPathModalOpen(true);
}, []);
const handlePathSelected = useCallback((selectedPath: string) => {
if (pathModalField === 'path') {
form.setFieldsValue({ path: selectedPath });
setIsDirectory(pathModalMode === 'directory');
} else {
form.setFieldsValue({ save_to: selectedPath });
}
setPathModalOpen(false);
}, [form, pathModalField, pathModalMode]);
const handleRun = useCallback(async () => {
if (!selectedType) {
messageApi.warning(t('Please select a processor'));
return;
}
try {
const values = await form.validateFields();
const schema = selectedProcessorMeta?.config_schema || [];
const finalConfig: Record<string, any> = {};
schema.forEach(field => {
const value = values.config?.[field.key];
if (value === undefined) {
finalConfig[field.key] = field.default;
} else {
finalConfig[field.key] = value;
}
});
setRunning(true);
const payload: any = {
path: values.path,
processor_type: selectedType,
config: finalConfig,
overwrite: !!values.overwrite,
};
if (values.save_to && !values.overwrite) {
payload.save_to = values.save_to;
}
const resp = await processorsApi.process(payload);
messageApi.success(`${t('Task submitted')}: ${resp.task_id}`);
} catch (err: any) {
if (err?.errorFields) {
return;
}
messageApi.error(err?.message || t('Operation failed'));
} finally {
setRunning(false);
}
}, [form, messageApi, selectedProcessorMeta, selectedType, t]);
const selectedConfigPath = pathModalField === 'path'
? (selectedType ? form.getFieldValue('path') : undefined) || '/'
: (selectedType ? form.getFieldValue('save_to') : undefined) || '/';
const renderProcessorList = () => {
if (loadingList) {
return (
<Flex align="center" justify="center" style={{ height: '100%' }}>
<Spin />
</Flex>
);
}
if (!processors.length) {
return (
<Flex align="center" justify="center" style={{ height: '100%' }}>
<Empty description={t('No data')} />
</Flex>
);
}
return (
<div style={{ padding: 8, overflowY: 'auto', height: '100%' }}>
{processors.map(item => {
const selected = item.type === selectedType;
const onClick = () => handleSelectProcessor(item.type);
return (
<div
key={item.type}
onClick={onClick}
style={{
border: `1px solid ${selected ? token.colorPrimary : token.colorBorderSecondary}`,
background: selected ? token.colorPrimaryBg : token.colorBgContainer,
borderRadius: 10,
padding: 12,
marginBottom: 8,
cursor: 'pointer',
transition: 'all 0.2s ease',
}}
>
<Flex justify="space-between" align="center">
<Space size={8} align="center">
<span
style={{
width: 8,
height: 8,
borderRadius: '50%',
background: selected ? token.colorPrimary : token.colorBorderSecondary,
display: 'inline-block',
}}
/>
<Text strong>{item.name}</Text>
</Space>
<Tag color={selected ? token.colorPrimary : token.colorBorderSecondary}>{item.type}</Tag>
</Flex>
<Space direction="vertical" size={6} style={{ marginTop: 8 }}>
<div>
<Text type="secondary" style={{ marginRight: 8 }}>{t('Supported Extensions')}:</Text>
{item.supported_exts?.length ? (
<Space wrap size={[4, 4]}>
{item.supported_exts.map(ext => (
<Tag key={ext}>{ext}</Tag>
))}
</Space>
) : (
<Tag>{t('All')}</Tag>
)}
</div>
<Text type="secondary">
{t('Produces File')}: {item.produces_file ? t('Yes') : t('No')}
</Text>
</Space>
</div>
);
})}
</div>
);
};
const tabs = [
{
key: 'editor',
label: t('Source Editor'),
children: selectedType ? (
<div style={{ height: '100%', display: 'flex', flexDirection: 'column' }}>
<div style={{ padding: '8px 12px', borderBottom: `1px solid ${token.colorBorderSecondary}` }}>
{modulePath ? (
<Space size={8}>
<Text type="secondary">{t('Module Path')}:</Text>
<Text code>{modulePath}</Text>
</Space>
) : (
<Text type="secondary">{t('No module path')}</Text>
)}
</div>
<div style={{ flex: 1, minHeight: 0 }}>
{sourceLoading ? (
<Flex align="center" justify="center" style={{ height: '100%' }}>
<Spin />
</Flex>
) : (
<Editor
language="python"
value={source}
onChange={(val) => setSource(val ?? '')}
height="100%"
options={{
automaticLayout: true,
minimap: { enabled: false },
fontSize: 13,
scrollBeyondLastLine: false,
}}
/>
)}
</div>
</div>
) : (
<Empty style={{ marginTop: 64 }} description={t('Select a processor')} />
),
},
{
key: 'runner',
label: t('Run Processor'),
forceRender: true,
children: (
<Form form={form} layout="vertical" disabled={!selectedType} style={{ padding: '12px 0' }}>
{selectedType ? (
<>
{isDirectory && (
<Text type="secondary" style={{ display: 'block', marginBottom: 12 }}>
{t('Directory processing always overwrites original files')}
</Text>
)}
<Form.Item
label={t('Target Path')}
required
>
<Flex gap={8} align="center">
<div style={{ flex: 1 }}>
<Form.Item
name="path"
rules={[{ required: true, message: t('Please select a path') }]}
noStyle
>
<Input placeholder={t('Select a path')} />
</Form.Item>
</div>
<Button onClick={() => openPathSelector('path', 'file')}>{t('Select File')}</Button>
<Button onClick={() => openPathSelector('path', 'directory')}>{t('Select Directory')}</Button>
</Flex>
</Form.Item>
<Form.Item
name="overwrite"
label={t('Overwrite original')}
valuePropName="checked"
>
<Switch disabled={isDirectory} />
</Form.Item>
{selectedProcessorMeta?.produces_file && !overwriteValue && (
<Form.Item label={t('Save To')}>
<Flex gap={8} align="center">
<div style={{ flex: 1 }}>
<Form.Item name="save_to" noStyle>
<Input placeholder={t('Optional output path')} />
</Form.Item>
</div>
<Button onClick={() => openPathSelector('save_to', 'any')}>{t('Select')}</Button>
</Flex>
</Form.Item>
)}
<ProcessorConfigForm
processorMeta={selectedProcessorMeta}
form={form}
configPath={['config']}
/>
<Form.Item>
<Button type="primary" onClick={handleRun} loading={running} disabled={!selectedType}>
{t('Run')}
</Button>
</Form.Item>
</>
) : (
<Empty style={{ marginTop: 64 }} description={t('Select a processor')} />
)}
</Form>
),
},
];
return (
<>
{contextHolder}
<Flex gap={16} style={{ height: 'calc(100vh - 88px)' }}>
<Card
style={{ flex: '0 0 320px', minWidth: 280, display: 'flex', flexDirection: 'column' }}
title={t('Processor List')}
extra={
<Space size={8}>
<Button size="small" onClick={loadList} loading={loadingList}>{t('Refresh')}</Button>
<Button size="small" onClick={handleReloadProcessors} loading={reloading}>{t('Reload')}</Button>
</Space>
}
styles={{ body: { padding: 0, flex: 1, display: 'flex' } }}
>
{renderProcessorList()}
</Card>
<Card
style={{ flex: 1, minWidth: 0, display: 'flex', flexDirection: 'column' }}
title={selectedProcessorMeta ? `${selectedProcessorMeta.name} (${selectedProcessorMeta.type})` : t('Select a processor')}
extra={
<Space size={8}>
<Button size="small" onClick={handleSaveSource} loading={savingSource} disabled={!selectedType || !isDirty}>
{t('Save')}
</Button>
<Button size="small" onClick={handleReloadProcessors} loading={reloading} disabled={!selectedType}>
{t('Reload')}
</Button>
</Space>
}
styles={{ body: { padding: 0, flex: 1, display: 'flex', flexDirection: 'column' } }}
>
<Tabs
activeKey={activeTab}
onChange={key => setActiveTab(key as TabKey)}
items={tabs as any}
className="processors-tabs"
tabBarGutter={32}
/>
</Card>
</Flex>
<PathSelectorModal
open={pathModalOpen}
mode={pathModalMode}
initialPath={selectedConfigPath}
onOk={handlePathSelected}
onCancel={() => setPathModalOpen(false)}
/>
</>
);
});
export default ProcessorsPage;


@@ -44,6 +44,16 @@ const SharePage = memo(function SharePage() {
}
};
const handleClearExpired = async () => {
try {
const res = await shareApi.clearExpired();
message.success(t('Cleared {count} expired shares', { count: String(res.deleted_count) }));
fetchList();
} catch (e: any) {
message.error(e.message || t('Clear failed'));
}
};
const columns = [
{
title: t('Share Name'),
@@ -100,7 +110,14 @@ const SharePage = memo(function SharePage() {
return (
<PageCard
title={t('My Shares')}
-extra={<Button onClick={fetchList} loading={loading}>{t('Refresh')}</Button>}
extra={
<Space>
<Button onClick={fetchList} loading={loading}>{t('Refresh')}</Button>
<Popconfirm title={t('Confirm clear expired shares?')} onConfirm={handleClearExpired}>
<Button danger>{t('Clear expired shares')}</Button>
</Popconfirm>
</Space>
}
>
<Table
rowKey="id"


@@ -1,8 +1,8 @@
-import { Form, Input, Button, message, Tabs, Space, Card, Select, Modal, Radio, InputNumber } from 'antd';
-import { useEffect, useState } from 'react';
+import { Form, Input, Button, message, Tabs, Space, Card, Select, Modal, Radio, InputNumber, Spin, Empty, Alert } from 'antd';
+import { useEffect, useState, useCallback } from 'react';
import PageCard from '../../components/PageCard';
import { getAllConfig, setConfig } from '../../api/config';
-import { vectorDBApi } from '../../api/vectorDB';
+import { vectorDBApi, type VectorDBStats, type VectorDBProviderMeta, type VectorDBCurrentConfig } from '../../api/vectorDB';
import { AppstoreOutlined, RobotOutlined, DatabaseOutlined, SkinOutlined } from '@ant-design/icons';
import { useTheme } from '../../contexts/ThemeContext';
import '../../styles/settings-tabs.css';
@@ -21,13 +21,30 @@ const VISION_CONFIG_KEYS = [
{ key: 'AI_VISION_API_KEY', label: 'Vision API Key' },
];
const DEFAULT_EMBED_DIMENSION = 4096;
const EMBED_DIM_KEY = 'AI_EMBED_DIM';
const EMBED_CONFIG_KEYS = [
{ key: 'AI_EMBED_API_URL', label: 'Embedding API URL' },
{ key: 'AI_EMBED_MODEL', label: 'Embedding Model', default: 'Qwen/Qwen3-Embedding-8B' },
{ key: 'AI_EMBED_API_KEY', label: 'Embedding API Key' },
];
-const ALL_AI_KEYS = [...VISION_CONFIG_KEYS, ...EMBED_CONFIG_KEYS];
+const ALL_AI_KEYS = [...VISION_CONFIG_KEYS, ...EMBED_CONFIG_KEYS, { key: EMBED_DIM_KEY, default: DEFAULT_EMBED_DIMENSION }];
const formatBytes = (bytes?: number | null) => {
if (bytes === null || bytes === undefined) return '-';
if (bytes === 0) return '0 B';
const units = ['B', 'KB', 'MB', 'GB', 'TB'];
let value = bytes;
let unitIndex = 0;
while (value >= 1024 && unitIndex < units.length - 1) {
value /= 1024;
unitIndex += 1;
}
const precision = value >= 10 || unitIndex === 0 ? 0 : 1;
return `${value.toFixed(precision)} ${units[unitIndex]}`;
};
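This second `formatBytes` variant scales with a loop and drops the decimal once the value reaches 10 (or stays in bytes). In isolation it behaves like this (sketch reproduces the function for self-containment):

```typescript
// Loop-based byte formatter; precision 0 for bytes or values >= 10,
// otherwise one decimal place.
const formatBytes = (bytes?: number | null): string => {
  if (bytes === null || bytes === undefined) return '-';
  if (bytes === 0) return '0 B';
  const units = ['B', 'KB', 'MB', 'GB', 'TB'];
  let value = bytes;
  let unitIndex = 0;
  while (value >= 1024 && unitIndex < units.length - 1) {
    value /= 1024;
    unitIndex += 1;
  }
  const precision = value >= 10 || unitIndex === 0 ? 0 : 1;
  return `${value.toFixed(precision)} ${units[unitIndex]}`;
};
```

Note this differs slightly from the offline-download page's `formatBytes`, which always keeps one decimal above the byte range.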
// Theme related config keys
const THEME_KEYS = {
@@ -39,9 +56,19 @@ const THEME_KEYS = {
};
export default function SystemSettingsPage() {
const [vectorConfigForm] = Form.useForm();
const [loading, setLoading] = useState(false);
const [config, setConfigState] = useState<Record<string, string> | null>(null);
const [activeTab, setActiveTab] = useState('appearance');
const [vectorStats, setVectorStats] = useState<VectorDBStats | null>(null);
const [vectorStatsLoading, setVectorStatsLoading] = useState(false);
const [vectorStatsError, setVectorStatsError] = useState<string | null>(null);
const [vectorProviders, setVectorProviders] = useState<VectorDBProviderMeta[]>([]);
const [vectorConfig, setVectorConfig] = useState<VectorDBCurrentConfig | null>(null);
const [vectorConfigLoading, setVectorConfigLoading] = useState(false);
const [vectorConfigSaving, setVectorConfigSaving] = useState(false);
const [vectorMetaError, setVectorMetaError] = useState<string | null>(null);
const [selectedProviderType, setSelectedProviderType] = useState<string | null>(null);
const { refreshTheme, previewTheme } = useTheme();
const { t } = useI18n();
@@ -49,6 +76,72 @@ export default function SystemSettingsPage() {
getAllConfig().then((data) => setConfigState(data as Record<string, string>));
}, []);
const fetchVectorStats = useCallback(async () => {
setVectorStatsLoading(true);
setVectorStatsError(null);
try {
const data = await vectorDBApi.getStats();
setVectorStats(data);
} catch (e: any) {
const msg = e?.message || t('Load failed');
setVectorStatsError(msg);
message.error(msg);
} finally {
setVectorStatsLoading(false);
}
}, [t]);
const buildProviderConfigValues = useCallback((provider: VectorDBProviderMeta | undefined, existing?: Record<string, string>) => {
if (!provider) return {};
const values: Record<string, string> = {};
const schema = provider.config_schema || [];
schema.forEach((field) => {
const current = existing && existing[field.key] !== undefined && existing[field.key] !== null
? String(existing[field.key])
: undefined;
if (current !== undefined) {
values[field.key] = current;
} else if (field.default !== undefined && field.default !== null) {
values[field.key] = String(field.default);
} else {
values[field.key] = '';
}
});
return values;
}, []);
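The precedence here is: existing saved value, then schema default, then empty string. A standalone sketch, with a hypothetical `Field` type mirroring the `config_schema` entries:

```typescript
// Minimal shape assumed for provider config_schema entries.
type Field = { key: string; default?: string | number | null };

// Merge saved values over schema defaults, stringifying everything.
function buildProviderConfigValues(
  schema: Field[],
  existing?: Record<string, string>,
): Record<string, string> {
  const values: Record<string, string> = {};
  schema.forEach((field) => {
    const current =
      existing && existing[field.key] !== undefined && existing[field.key] !== null
        ? String(existing[field.key])
        : undefined;
    if (current !== undefined) values[field.key] = current;
    else if (field.default !== undefined && field.default !== null) values[field.key] = String(field.default);
    else values[field.key] = '';
  });
  return values;
}
```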
const fetchVectorMeta = useCallback(async () => {
setVectorConfigLoading(true);
setVectorMetaError(null);
try {
const [providers, current] = await Promise.all([
vectorDBApi.getProviders(),
vectorDBApi.getConfig(),
]);
setVectorProviders(providers);
setVectorConfig(current);
const enabled = providers.filter((item) => item.enabled);
let nextType: string | null = current?.type ?? null;
if (nextType && !providers.some((item) => item.type === nextType)) {
nextType = null;
}
if (!nextType) {
nextType = enabled[0]?.type ?? providers[0]?.type ?? null;
}
setSelectedProviderType(nextType);
const provider = providers.find((item) => item.type === nextType);
const configValues = buildProviderConfigValues(provider, nextType === current?.type ? current?.config : undefined);
vectorConfigForm.setFieldsValue({ type: nextType || undefined, config: configValues });
} catch (e: any) {
const msg = e?.message || t('Load failed');
setVectorMetaError(msg);
message.error(msg);
} finally {
setVectorConfigLoading(false);
}
}, [buildProviderConfigValues, message, t, vectorConfigForm]);
const handleSave = async (values: any) => {
setLoading(true);
try {
@@ -67,6 +160,40 @@ export default function SystemSettingsPage() {
setLoading(false);
};
const handleProviderChange = useCallback((value: string) => {
setSelectedProviderType(value);
const provider = vectorProviders.find((item) => item.type === value);
const existing = value === vectorConfig?.type ? vectorConfig?.config : undefined;
const configValues = buildProviderConfigValues(provider, existing);
vectorConfigForm.setFieldsValue({ type: value, config: configValues });
}, [vectorProviders, vectorConfig, buildProviderConfigValues, vectorConfigForm]);
const handleVectorConfigSave = useCallback(async (values: { type: string; config?: Record<string, string> }) => {
if (!values?.type) {
return;
}
setVectorConfigSaving(true);
try {
const configPayload = Object.fromEntries(
Object.entries(values.config || {}).filter(([, val]) => val !== undefined && val !== null && String(val).trim() !== '')
.map(([key, val]) => [key, String(val)])
);
const response = await vectorDBApi.updateConfig({ type: values.type, config: configPayload });
setVectorConfig(response.config);
setVectorStats(response.stats);
setVectorStatsError(null);
setSelectedProviderType(response.config.type);
const provider = vectorProviders.find((item) => item.type === response.config.type);
const mergedValues = buildProviderConfigValues(provider, response.config.config);
vectorConfigForm.setFieldsValue({ type: response.config.type, config: mergedValues });
message.success(t('Saved successfully'));
} catch (e: any) {
message.error(e?.message || t('Save failed'));
} finally {
setVectorConfigSaving(false);
}
}, [buildProviderConfigValues, message, t, vectorConfigForm, vectorProviders]);
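The payload filter above drops empty, whitespace-only, and nullish values and stringifies the rest before sending; standalone:

```typescript
// Keep only meaningful config entries, coerced to strings.
const toPayload = (config: Record<string, unknown>): Record<string, string> =>
  Object.fromEntries(
    Object.entries(config)
      .filter(([, val]) => val !== undefined && val !== null && String(val).trim() !== '')
      .map(([key, val]) => [key, String(val)]),
  );
```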
// When leaving the Appearance tab, restore the backend-persisted config (discard any unsaved preview)
useEffect(() => {
if (activeTab !== 'appearance') {
@@ -74,6 +201,27 @@ export default function SystemSettingsPage() {
}
}, [activeTab]);
useEffect(() => {
if (activeTab === 'vector-db') {
if (!vectorProviders.length && !vectorConfigLoading) {
fetchVectorMeta();
}
if (!vectorStats && !vectorStatsLoading) {
fetchVectorStats();
}
}
}, [
activeTab,
fetchVectorMeta,
fetchVectorStats,
vectorProviders.length,
vectorConfigLoading,
vectorStats,
vectorStatsLoading,
]);
const selectedProvider = vectorProviders.find((item) => item.type === selectedProviderType || (!selectedProviderType && item.enabled));
if (!config) {
return <PageCard title={t('System Settings')}><div>{t('Loading...')}</div></PageCard>;
}
@@ -213,9 +361,27 @@ export default function SystemSettingsPage() {
<Form
layout="vertical"
initialValues={{
...Object.fromEntries(ALL_AI_KEYS.map(({ key, default: def }) => [key, config[key] ?? def ?? ''])),
...Object.fromEntries(ALL_AI_KEYS.map(({ key, default: def }) => [key, key === EMBED_DIM_KEY
? Number(config[key] ?? def ?? DEFAULT_EMBED_DIMENSION)
: config[key] ?? def ?? ''])),
}}
onFinish={async (vals) => {
const currentDim = Number(config[EMBED_DIM_KEY] ?? DEFAULT_EMBED_DIMENSION);
const nextDim = Number(vals[EMBED_DIM_KEY] ?? DEFAULT_EMBED_DIMENSION);
if (currentDim !== nextDim) {
Modal.confirm({
title: t('Confirm embedding dimension change'),
content: t('Changing the embedding dimension will clear the vector database automatically. You will need to rebuild indexes afterwards. Continue?'),
okText: t('Confirm'),
cancelText: t('Cancel'),
onOk: async () => {
await handleSave(vals);
},
});
return;
}
await handleSave(vals);
}}
onFinish={handleSave}
style={{ marginTop: 24 }}
key={JSON.stringify(config)}
>
@@ -232,6 +398,9 @@ export default function SystemSettingsPage() {
<Input size="large" />
</Form.Item>
))}
<Form.Item name={EMBED_DIM_KEY} label={t('Embedding Dimension')}>
<InputNumber min={1} max={32768} style={{ width: '100%' }} />
</Form.Item>
</Card>
<Form.Item style={{ marginTop: 24 }}>
<Button type="primary" htmlType="submit" loading={loading} block>
@@ -251,41 +420,187 @@ export default function SystemSettingsPage() {
),
children: (
<Card title={t('Vector Database Settings')} style={{ marginTop: 24 }}>
<Form layout="vertical">
<Form.Item label={t('Database Type')}>
<Select
size="large"
value={'Milvus Lite'}
disabled
options={[{ value: 'Milvus Lite', label: 'Milvus Lite' }]}
/>
</Form.Item>
<Form.Item>
<Button
danger
block
onClick={() => {
Modal.confirm({
title: t('Confirm clear vector database?'),
content: t('This will delete all collections irreversibly.'),
okText: t('Confirm Clear'),
okType: 'danger',
cancelText: t('Cancel'),
onOk: async () => {
try {
await vectorDBApi.clearAll();
message.success(t('Vector database cleared'));
} catch (e: any) {
message.error(e.message || t('Clear failed'));
}
},
});
}}
<Space direction="vertical" size={24} style={{ width: '100%' }}>
<Space direction="vertical" size={16} style={{ width: '100%' }}>
<div style={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center', flexWrap: 'wrap', gap: 12 }}>
<strong>{t('Current Statistics')}</strong>
<Button onClick={() => { fetchVectorMeta(); fetchVectorStats(); }} loading={vectorStatsLoading || vectorConfigLoading} disabled={(vectorStatsLoading || vectorConfigLoading) && !vectorStats}>
{t('Refresh')}
</Button>
</div>
{vectorMetaError ? (
<Alert type="error" showIcon message={vectorMetaError} />
) : null}
{vectorStatsLoading && !vectorStats ? (
<Spin />
) : vectorStats ? (
<Space direction="vertical" size={16} style={{ width: '100%' }}>
<div style={{ display: 'flex', flexWrap: 'wrap', gap: 24 }}>
<div>
<div style={{ color: '#888' }}>{t('Collections')}</div>
<div style={{ fontSize: 20, fontWeight: 600 }}>{vectorStats.collection_count}</div>
</div>
<div>
<div style={{ color: '#888' }}>{t('Vectors')}</div>
<div style={{ fontSize: 20, fontWeight: 600 }}>{vectorStats.total_vectors}</div>
</div>
<div>
<div style={{ color: '#888' }}>{t('Database Size')}</div>
<div style={{ fontSize: 20, fontWeight: 600 }}>{formatBytes(vectorStats.db_file_size_bytes)}</div>
</div>
<div>
<div style={{ color: '#888' }}>{t('Estimated Memory')}</div>
<div style={{ fontSize: 20, fontWeight: 600 }}>{formatBytes(vectorStats.estimated_total_memory_bytes)}</div>
</div>
</div>
{vectorStats.collections.length ? (
<Space direction="vertical" style={{ width: '100%' }} size={16}>
{vectorStats.collections.map((collection) => (
<div key={collection.name} style={{ border: '1px solid #f0f0f0', borderRadius: 8, padding: 16 }}>
<Space direction="vertical" size={12} style={{ width: '100%' }}>
<div style={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center', flexWrap: 'wrap', gap: 12 }}>
<strong>{collection.name}</strong>
<span style={{ color: '#888' }}>
{collection.is_vector_collection && collection.dimension
? `${t('Dimension')}: ${collection.dimension}`
: t('Non-vector collection')}
</span>
</div>
<div>{t('Vectors')}: {collection.row_count}</div>
{collection.is_vector_collection ? (
<div>{t('Estimated memory')}: {formatBytes(collection.estimated_memory_bytes)}</div>
) : null}
{collection.indexes.length ? (
<Space direction="vertical" size={4} style={{ width: '100%' }}>
<span>{t('Indexes')}:</span>
<ul style={{ paddingLeft: 20, margin: 0 }}>
{collection.indexes.map((index) => (
<li key={`${collection.name}-${index.index_name || 'default'}`}>
<span>{index.index_name || t('Unnamed index')}</span>
<span>{' · '}{index.index_type || '-'}</span>
<span>{' · '}{index.metric_type || '-'}</span>
<span>{' · '}{t('Indexed rows')}: {index.indexed_rows}</span>
<span>{' · '}{t('Pending rows')}: {index.pending_index_rows}</span>
<span>{' · '}{t('Status')}: {index.state || '-'}</span>
</li>
))}
</ul>
</Space>
) : null}
</Space>
</div>
))}
</Space>
) : (
<Empty description={t('No collections')} />
)}
<div style={{ color: '#888' }}>
{t('Estimated memory is calculated as vectors x dimension x 4 bytes (float32).')}
</div>
</Space>
) : vectorStatsError ? (
<div style={{ color: '#ff4d4f' }}>{vectorStatsError}</div>
) : (
<Empty description={t('No collections')} />
)}
</Space>
{vectorConfigLoading && !vectorProviders.length ? (
<Spin />
) : (
<Form
layout="vertical"
form={vectorConfigForm}
onFinish={handleVectorConfigSave}
initialValues={{ type: selectedProviderType || undefined, config: {} }}
>
{t('Clear Vector DB')}
</Button>
</Form.Item>
</Form>
<Form.Item
name="type"
label={t('Database Provider')}
rules={[{ required: true, message: t('Please select a provider') }]}
>
<Select
size="large"
options={vectorProviders.map((provider) => ({
value: provider.type,
label: provider.enabled ? provider.label : `${provider.label} (${t('Coming soon')})`,
disabled: !provider.enabled,
}))}
onChange={handleProviderChange}
loading={vectorConfigLoading && !vectorProviders.length}
/>
</Form.Item>
{selectedProvider?.description ? (
<Alert
type="info"
showIcon
message={t(selectedProvider.description)}
style={{ marginBottom: 16 }}
/>
) : null}
{selectedProvider?.config_schema?.map((field) => (
<Form.Item
key={field.key}
name={['config', field.key]}
label={t(field.label)}
rules={field.required ? [{ required: true, message: t('Please input {label}', { label: t(field.label) }) }] : []}
>
{field.type === 'password' ? (
<Input.Password size="large" placeholder={field.placeholder ? t(field.placeholder) : undefined} />
) : (
<Input size="large" placeholder={field.placeholder ? t(field.placeholder) : undefined} />
)}
</Form.Item>
))}
{selectedProvider && !selectedProvider.enabled ? (
<Alert
type="warning"
showIcon
message={t('This provider is not available yet')}
style={{ marginBottom: 16 }}
/>
) : null}
<Form.Item>
<Space direction="vertical" style={{ width: '100%' }}>
<Button
type="primary"
htmlType="submit"
loading={vectorConfigSaving}
block
disabled={!selectedProvider?.enabled}
>
{t('Save')}
</Button>
<Button
danger
htmlType="button"
block
onClick={() => {
Modal.confirm({
title: t('Confirm clear vector database?'),
content: t('This will delete all collections irreversibly.'),
okText: t('Confirm Clear'),
okType: 'danger',
cancelText: t('Cancel'),
onOk: async () => {
try {
await vectorDBApi.clearAll();
message.success(t('Vector database cleared'));
await fetchVectorStats();
await fetchVectorMeta();
} catch (e: any) {
message.error(e.message || t('Clear failed'));
}
},
});
}}
>
{t('Clear Vector DB')}
</Button>
</Space>
</Form.Item>
</Form>
)}
</Space>
</Card>
),
},
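The save handler above normalizes the form's config map before calling `vectorDBApi.updateConfig`: entries whose values are `undefined`, `null`, or whitespace-only are dropped, and surviving values are coerced to strings. A standalone sketch of that normalization (the function name `normalizeConfigPayload` is illustrative, not part of the codebase):

```typescript
// Drop empty/blank entries and coerce remaining values to strings,
// mirroring the payload construction in handleVectorConfigSave.
function normalizeConfigPayload(config: Record<string, unknown> = {}): Record<string, string> {
  return Object.fromEntries(
    Object.entries(config)
      .filter(([, val]) => val !== undefined && val !== null && String(val).trim() !== '')
      .map(([key, val]) => [key, String(val)])
  );
}
```

This keeps the wire payload uniformly `Record<string, string>`, so optional fields left blank in the form are simply omitted rather than sent as empty strings.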


@@ -0,0 +1,297 @@
import { memo, useCallback, useEffect, useMemo, useState } from 'react';
import { Button, Card, Col, Form, Input, InputNumber, message, Progress, Row, Segmented, Select, Space, Table, Tag, Tooltip, Typography } from 'antd';
import type { ColumnsType } from 'antd/es/table';
import PageCard from '../components/PageCard';
import { tasksApi, type QueuedTask, type TaskQueueSettings } from '../api/tasks';
import { useI18n } from '../i18n';
import { SearchOutlined } from '@ant-design/icons';
type QueueStatus = QueuedTask['status'];
const AUTO_REFRESH_INTERVAL = 5000;
const TaskQueuePage = memo(function TaskQueuePage() {
const { t } = useI18n();
const [messageApi, contextHolder] = message.useMessage();
const [tasks, setTasks] = useState<QueuedTask[]>([]);
const [loading, setLoading] = useState(false);
const [keyword, setKeyword] = useState('');
const [statusFilter, setStatusFilter] = useState<QueueStatus[]>(['pending', 'running']);
const [autoRefresh, setAutoRefresh] = useState<'on' | 'off'>('on');
const [settings, setSettings] = useState<TaskQueueSettings>({ concurrency: 1, active_workers: 0 });
const [concurrencyDraft, setConcurrencyDraft] = useState<number>(1);
const [settingsLoading, setSettingsLoading] = useState(false);
const [lastUpdated, setLastUpdated] = useState<number | null>(null);
const statusOptions = useMemo(() => ([
{ label: t('Pending'), value: 'pending' },
{ label: t('Running'), value: 'running' },
{ label: t('Success'), value: 'success' },
{ label: t('Failed'), value: 'failed' },
]), [t]);
const loadTasks = useCallback(async (withSpinner = false) => {
if (withSpinner) setLoading(true);
try {
const data = await tasksApi.getQueue();
setTasks(data);
setLastUpdated(Date.now());
} catch (err) {
const msg = err instanceof Error && err.message ? err.message : t('Load failed');
messageApi.error(msg);
} finally {
if (withSpinner) setLoading(false);
}
}, [messageApi, t]);
const loadSettings = useCallback(async () => {
try {
const data = await tasksApi.getQueueSettings();
setSettings(data);
setConcurrencyDraft(data.concurrency);
} catch (err) {
const msg = err instanceof Error && err.message ? err.message : t('Load failed');
messageApi.error(msg);
}
}, [messageApi, t]);
useEffect(() => {
loadTasks(true).catch(() => {});
loadSettings().catch(() => {});
}, [loadTasks, loadSettings]);
useEffect(() => {
if (autoRefresh === 'off') return () => {};
const timer = window.setInterval(() => {
loadTasks().catch(() => {});
}, AUTO_REFRESH_INTERVAL);
return () => window.clearInterval(timer);
}, [autoRefresh, loadTasks]);
const metrics = useMemo(() => {
const counts = {
total: tasks.length,
pending: 0,
running: 0,
success: 0,
failed: 0,
};
tasks.forEach(task => {
counts[task.status] += 1;
});
return counts;
}, [tasks]);
const filteredTasks = useMemo(() => {
const normalizedKeyword = keyword.trim().toLowerCase();
return tasks.filter(task => {
if (statusFilter.length > 0 && !statusFilter.includes(task.status)) return false;
if (!normalizedKeyword) return true;
const haystack = [
task.id,
task.name,
JSON.stringify(task.task_info ?? {}),
task.meta ? JSON.stringify(task.meta) : '',
task.error ?? '',
].join(' ').toLowerCase();
return haystack.includes(normalizedKeyword);
});
}, [keyword, statusFilter, tasks]);
const progressLabel = useCallback((task: QueuedTask) => {
const percent = task.progress?.percent;
const stage = task.progress?.stage;
if (percent !== undefined && percent !== null) {
const percentText = `${percent.toFixed(1)}%`;
return stage ? `${percentText} · ${stage}` : percentText;
}
return stage ?? '--';
}, []);
const columns: ColumnsType<QueuedTask> = useMemo(() => ([
{
title: 'ID',
dataIndex: 'id',
width: 160,
render: (id: string) => (
<Tooltip title={id}>
<Tag color="default" style={{ cursor: 'pointer', margin: 0 }}>
<Typography.Text style={{ fontSize: 12 }} copyable={{ text: id }}>
{id.slice(0, 8)}
</Typography.Text>
</Tag>
</Tooltip>
),
},
{
title: t('Task Type'),
dataIndex: 'name',
sorter: (a, b) => a.name.localeCompare(b.name),
width: 160,
},
{
title: t('Status'),
dataIndex: 'status',
width: 120,
render: (status: QueueStatus) => {
const colorMap: Record<QueueStatus, string> = {
pending: 'default',
running: 'processing',
success: 'success',
failed: 'error',
};
const labelMap: Record<QueueStatus, string> = {
pending: t('Pending'),
running: t('Running'),
success: t('Success'),
failed: t('Failed'),
};
return <Tag color={colorMap[status]}>{labelMap[status]}</Tag>;
},
},
{
title: t('Progress'),
render: (_: unknown, record) => (
record.progress?.percent !== undefined && record.progress.percent !== null
? <Progress percent={Number(record.progress.percent.toFixed(1))} size="small" status={record.status === 'failed' ? 'exception' : record.status === 'success' ? 'success' : 'active'} />
: <Typography.Text type="secondary" style={{ fontSize: 12 }}>{progressLabel(record)}</Typography.Text>
),
},
{
title: t('Details'),
dataIndex: 'task_info',
render: (_: unknown, record) => {
const info = record.task_info ? JSON.stringify(record.task_info) : '--';
return (
<Typography.Paragraph
ellipsis={{ rows: 2, expandable: true, symbol: t('Expand') }}
style={{ marginBottom: 0, fontFamily: 'Menlo, Consolas, monospace', fontSize: 12 }}
>
{info}
</Typography.Paragraph>
);
},
},
{
title: t('Error'),
dataIndex: 'error',
render: (error: string | undefined) => error ? <Typography.Text type="danger">{error}</Typography.Text> : '--',
},
]), [progressLabel, t]);
const handleSaveSettings = useCallback(async () => {
setSettingsLoading(true);
try {
const next = await tasksApi.updateQueueSettings({ concurrency: concurrencyDraft });
setSettings(next);
setConcurrencyDraft(next.concurrency);
messageApi.success(t('Settings saved'));
await loadTasks();
} catch (err) {
const msg = err instanceof Error && err.message ? err.message : t('Operation failed');
messageApi.error(msg);
} finally {
setSettingsLoading(false);
}
}, [concurrencyDraft, loadTasks, messageApi, t]);
const lastUpdatedText = useMemo(() => {
if (!lastUpdated) return '--';
return new Date(lastUpdated).toLocaleTimeString();
}, [lastUpdated]);
return (
<>
{contextHolder}
<PageCard
title={t('Task Queue')}
extra={
<Space size={16} wrap>
<Typography.Text type="secondary">{t('Last updated at {time}', { time: lastUpdatedText })}</Typography.Text>
<Segmented
options={[{ label: t('Auto'), value: 'on' }, { label: t('Manual'), value: 'off' }]}
value={autoRefresh}
onChange={(value) => setAutoRefresh(value as 'on' | 'off')}
/>
<Button onClick={() => loadTasks(true)} loading={loading} type="default" disabled={autoRefresh === 'on'}>
{t('Refresh')}
</Button>
</Space>
}
>
<Row gutter={[16, 16]} style={{ marginBottom: 24 }}>
{[{
label: t('Total Tasks'), value: metrics.total, color: '#1677ff'
}, {
label: t('Running Tasks'), value: metrics.running, color: '#52c41a'
}, {
label: t('Waiting Tasks'), value: metrics.pending, color: '#faad14'
}, {
label: t('Failed Tasks'), value: metrics.failed, color: '#ff4d4f'
}, {
label: t('Active Workers'), value: settings.active_workers, color: '#722ed1'
}].map((item) => (
<Col key={item.label} xs={24} sm={12} md={8} lg={6} xl={4}>
            <Card size="small" styles={{ body: { padding: '12px 16px' } }}>
<Typography.Text type="secondary">{item.label}</Typography.Text>
<Typography.Title level={4} style={{ margin: '4px 0 0', color: item.color }}>{item.value}</Typography.Title>
</Card>
</Col>
))}
</Row>
<Form layout="inline" style={{ marginBottom: 16, gap: 16, flexWrap: 'wrap' }}>
<Form.Item>
<Input
allowClear
placeholder={t('Search by name or ID')}
prefix={<SearchOutlined style={{ color: 'var(--ant-color-text-tertiary)' }} />}
onChange={(e) => setKeyword(e.target.value)}
style={{ width: 260 }}
/>
</Form.Item>
<Form.Item>
<Select
mode="multiple"
allowClear
placeholder={t('Filter by status')}
value={statusFilter}
maxTagCount={2}
onChange={(value) => setStatusFilter(value as QueueStatus[])}
options={statusOptions}
style={{ minWidth: 220 }}
/>
</Form.Item>
<Form.Item>
<Tooltip title={t('Adjust worker concurrency immediately')}>
<Space size={8}>
<Typography.Text>{t('Queue Concurrency')}:</Typography.Text>
<InputNumber
min={1}
controls={false}
style={{ width: 80 }}
value={concurrencyDraft}
onChange={(value) => setConcurrencyDraft(value ?? 1)}
/>
<Button type="primary" onClick={handleSaveSettings} loading={settingsLoading}>
{t('Save')}
</Button>
</Space>
</Tooltip>
</Form.Item>
</Form>
<Table
rowKey="id"
dataSource={filteredTasks}
columns={columns}
loading={loading}
pagination={{ pageSize: 10 }}
style={{ marginBottom: 0 }}
/>
</PageCard>
</>
);
});
export default TaskQueuePage;
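TaskQueuePage filters entirely client-side: an empty status filter matches every task, and the keyword is matched case-insensitively against a joined haystack of task fields. A trimmed-down, standalone sketch of that logic (`filterTasks` and `TaskLike` are illustrative names; the page's haystack additionally includes the serialized `task_info` and `meta` objects):

```typescript
type QueueStatus = 'pending' | 'running' | 'success' | 'failed';

interface TaskLike {
  id: string;
  name: string;
  status: QueueStatus;
  error?: string;
}

// Empty statusFilter => match all statuses; empty keyword => match all tasks.
function filterTasks(tasks: TaskLike[], statusFilter: QueueStatus[], keyword: string): TaskLike[] {
  const needle = keyword.trim().toLowerCase();
  return tasks.filter(task => {
    if (statusFilter.length > 0 && !statusFilter.includes(task.status)) return false;
    if (!needle) return true;
    const haystack = [task.id, task.name, task.error ?? ''].join(' ').toLowerCase();
    return haystack.includes(needle);
  });
}
```

Because both inputs are plain values, the page can memoize the result with `useMemo` keyed on `[keyword, statusFilter, tasks]`, recomputing only when one of them changes.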


@@ -1,10 +1,11 @@
import { memo, useState, useEffect, useCallback } from 'react';
import { Table, Button, Space, Drawer, Form, Input, Switch, message, Typography, Popconfirm, Select, Modal, Tag } from 'antd';
import { Table, Button, Space, Drawer, Form, Input, Switch, message, Typography, Popconfirm, Select } from 'antd';
import PageCard from '../components/PageCard';
import { tasksApi, type AutomationTask, type QueuedTask } from '../api/tasks';
import { tasksApi, type AutomationTask } from '../api/tasks';
import { processorsApi, type ProcessorTypeMeta } from '../api/processors';
import { ProcessorConfigForm } from '../components/ProcessorConfigForm';
import { useI18n } from '../i18n';
import PathSelectorModal from '../components/PathSelectorModal';
const TasksPage = memo(function TasksPage() {
const [loading, setLoading] = useState(false);
@@ -13,10 +14,8 @@ const TasksPage = memo(function TasksPage() {
const [editing, setEditing] = useState<AutomationTask | null>(null);
const [form] = Form.useForm();
const [availableProcessors, setAvailableProcessors] = useState<ProcessorTypeMeta[]>([]);
const [queueModalOpen, setQueueModalOpen] = useState(false);
const [queuedTasks, setQueuedTasks] = useState<QueuedTask[]>([]);
const [queueLoading, setQueueLoading] = useState(false);
const { t } = useI18n();
const [pathPickerOpen, setPathPickerOpen] = useState(false);
const fetchList = useCallback(async () => {
setLoading(true);
@@ -91,23 +90,6 @@ const TasksPage = memo(function TasksPage() {
}
};
const fetchQueue = async () => {
setQueueLoading(true);
try {
const tasks = await tasksApi.getQueue();
setQueuedTasks(tasks);
} catch (e: any) {
      message.error(e.message || 'Failed to load queue');
} finally {
setQueueLoading(false);
}
};
const openQueueModal = () => {
setQueueModalOpen(true);
fetchQueue();
};
const toggleEnabled = async (rec: AutomationTask, enabled: boolean) => {
setEditing(rec);
setLoading(true);
@@ -151,6 +133,7 @@ const TasksPage = memo(function TasksPage() {
const selectedProcessor = Form.useWatch('processor_type', form);
const currentProcessorMeta = availableProcessors.find(p => p.type === selectedProcessor);
const watchedPathPattern = Form.useWatch('path_pattern', form);
return (
@@ -159,7 +142,6 @@ const TasksPage = memo(function TasksPage() {
extra={
<Space>
<Button onClick={fetchList} loading={loading}>{t('Refresh')}</Button>
<Button onClick={openQueueModal}>{t('Running Tasks')}</Button>
<Button type="primary" onClick={openCreate}>{t('Create Task')}</Button>
</Space>
}
@@ -177,7 +159,7 @@ const TasksPage = memo(function TasksPage() {
width={480}
open={open}
onClose={() => { setOpen(false); setEditing(null); }}
destroyOnClose
destroyOnHidden
extra={
<Space>
<Button onClick={() => { setOpen(false); setEditing(null); }}>{t('Cancel')}</Button>
@@ -197,7 +179,10 @@ const TasksPage = memo(function TasksPage() {
</Form.Item>
<Typography.Title level={5} style={{ marginTop: 8, fontSize: 14 }}>{t('Matching Rules')}</Typography.Title>
<Form.Item name="path_pattern" label={t('Path Prefix (optional)')}>
<Input placeholder="/images/screenshots" />
<Input
placeholder="/images/screenshots"
addonAfter={<Button size="small" onClick={() => setPathPickerOpen(true)}>{t('Select')}</Button>}
/>
</Form.Item>
<Form.Item name="filename_regex" label={t('Filename Regex (optional)')}>
<Input placeholder=".*\.png$" />
@@ -219,40 +204,13 @@ const TasksPage = memo(function TasksPage() {
/>
</Form>
</Drawer>
<Modal
title={t('Current Task Queue')}
open={queueModalOpen}
onCancel={() => setQueueModalOpen(false)}
width={800}
footer={[
<Button key="refresh" onClick={fetchQueue} loading={queueLoading}>{t('Refresh')}</Button>,
<Button key="close" onClick={() => setQueueModalOpen(false)}>{t('Close')}</Button>
]}
>
<Table
size="small"
rowKey="id"
dataSource={queuedTasks}
loading={queueLoading}
pagination={false}
columns={[
{ title: 'ID', dataIndex: 'id', width: 120, render: (id) => <Typography.Text style={{ fontSize: 12 }} copyable={{ text: id }}>{id.slice(0, 8)}</Typography.Text> },
{ title: t('Task Name'), dataIndex: 'name' },
{ title: t('Params'), dataIndex: 'task_info', render: (info) => <Typography.Text type="secondary" style={{ fontSize: 12 }}>{JSON.stringify(info)}</Typography.Text> },
{
title: t('Status'), dataIndex: 'status', width: 100, render: (status: QueuedTask['status']) => {
const colorMap = {
pending: 'default',
running: 'processing',
success: 'success',
failed: 'error'
};
return <Tag color={colorMap[status]}>{status}</Tag>;
}
},
]}
/>
</Modal>
<PathSelectorModal
open={pathPickerOpen}
mode="directory"
initialPath={watchedPathPattern || '/'}
onCancel={() => setPathPickerOpen(false)}
onOk={(p) => { form.setFieldsValue({ path_pattern: p }); setPathPickerOpen(false); }}
/>
</PageCard>
);
});
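The drawer above collects two matching rules per automation task: a path prefix (`path_pattern`, now fillable via the path selector) and an optional filename regex (`filename_regex`). This diff does not show how the rules are evaluated; a hypothetical sketch, assuming prefix matching on the full path and regex matching on the basename (`matchesTask` and `MatchRules` are illustrative names, and the real matching presumably happens server-side):

```typescript
interface MatchRules {
  path_pattern?: string;   // path prefix, e.g. "/images/screenshots"
  filename_regex?: string; // e.g. ".*\\.png$"
}

// A file matches when it passes every rule that is set; unset rules match anything.
function matchesTask(filePath: string, rules: MatchRules): boolean {
  if (rules.path_pattern && !filePath.startsWith(rules.path_pattern)) return false;
  if (rules.filename_regex) {
    const name = filePath.split('/').pop() ?? '';
    return new RegExp(rules.filename_regex).test(name);
  }
  return true;
}
```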


@@ -7,6 +7,8 @@ import FileExplorerPage from '../pages/FileExplorerPage/FileExplorerPage.tsx';
import AdaptersPage from '../pages/AdaptersPage.tsx';
import SharePage from '../pages/SharePage.tsx';
import TasksPage from '../pages/TasksPage.tsx';
import TaskQueuePage from '../pages/TaskQueuePage.tsx';
import ProcessorsPage from '../pages/ProcessorsPage.tsx';
import OfflineDownloadPage from '../pages/OfflineDownloadPage.tsx';
import SystemSettingsPage from '../pages/SystemSettingsPage/SystemSettingsPage.tsx';
import LogsPage from '../pages/LogsPage.tsx';
@@ -37,6 +39,8 @@ const ShellBody = memo(function ShellBody() {
{navKey === 'files' && <FileExplorerPage />}
{navKey === 'share' && <SharePage />}
{navKey === 'tasks' && <TasksPage />}
{navKey === 'task-queue' && <TaskQueuePage />}
{navKey === 'processors' && <ProcessorsPage />}
{navKey === 'offline' && <OfflineDownloadPage />}
{navKey === 'plugins' && <PluginsPage />}
{navKey === 'settings' && <SystemSettingsPage />}