Mirror of https://github.com/DrizzleTime/Foxel.git, synced 2026-05-08 09:13:23 +08:00
Compare commits (16 commits):

- 11c717e61d
- 45d63febb9
- 5a29c579dc
- b530b16c53
- 7da49191aa
- fbeb673126
- 0a06f4d02c
- f02c29492b
- 1a79e87887
- 626ff727b3
- 117a94d793
- c39bea67a4
- 2cbfb29260
- 155f3a144d
- 208a52589f
- 0732b611a9
CONTRIBUTING.md (186 lines)

@@ -1,76 +1,76 @@
<div align="right">
<b>English</b> | <a href="./CONTRIBUTING_zh.md">简体中文</a>
</div>

# Contributing to Foxel

We appreciate every minute you spend helping Foxel improve. This guide explains the contribution workflow so you can get started quickly.

## Table of Contents

- [How to Contribute](#how-to-contribute)
  - [🐛 Report Bugs](#-report-bugs)
  - [✨ Suggest Features](#-suggest-features)
  - [🛠️ Contribute Code](#️-contribute-code)
- [Development Environment](#development-environment)
  - [Prerequisites](#prerequisites)
  - [Backend (FastAPI)](#backend-fastapi)
  - [Frontend (React + Vite)](#frontend-react--vite)
- [Contribution Guidelines](#contribution-guidelines)
  - [Storage Adapters](#storage-adapters)
  - [Frontend Apps](#frontend-apps)
- [Submission Rules](#submission-rules)
  - [Git Branching](#git-branching)
  - [Commit Message Format](#commit-message-format)
  - [Pull Request Flow](#pull-request-flow)

---
## How to Contribute

### 🐛 Report Bugs

If you discover a bug, open a ticket via [GitHub Issues](https://github.com/DrizzleTime/Foxel/issues) and include:

- **A clear title** that summarises the problem.
- **Reproduction steps** with enough detail to trigger the bug.
- **Expected vs actual behaviour** to highlight the gap.
- **Environment details** such as operating system, browser version, and the Foxel build you used.

### ✨ Suggest Features

To propose a new capability or an improvement, create an Issue and choose the "Feature Request" template. Document:

- **Problem statement** – what pain point will the feature solve?
- **Proposed solution** – how you expect it to work.
- **Supporting material** – screenshots, references, or related links if helpful.

### 🛠️ Contribute Code

Follow the development setup below before opening a pull request. Keep changes focused and small so they are easier to review.

## Development Environment

### Prerequisites

Install the following tooling first:

- **Git** for version control
- **Python** 3.13 or newer
- **Bun** for frontend package management and scripts

### Backend (FastAPI)
1. **Clone the repository**

   ```bash
   git clone https://github.com/DrizzleTime/Foxel.git
   cd Foxel
   ```

2. **Create and activate a virtual environment**

   `uv` is recommended for performance and reproducibility:

   ```bash
   uv venv
   source .venv/bin/activate
   # On Windows: .venv\Scripts\activate
   ```

3. **Install dependencies**

   ```bash
   uv sync
   ```
4. **Prepare local resources**

   - Create the data directory:

     ```bash
     mkdir -p data/db
     ```

     > [!IMPORTANT]
     > Ensure the application user can read and write to `data/db`.

   - Create an `.env` file in the project root and provide the required secrets. Replace the sample values with your own random strings:

     ```dotenv
     SECRET_KEY=EnsRhL9NFPxgFVc+7t96/y70DIOR+9SpntcIqQa90TU=
     TEMP_LINK_SECRET_KEY=EnsRhL9NFPxgFVc+7t96/y70DIOR+9SpntcIqQa90TU=
     ```
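The sample keys above should not be reused as-is. A quick way to mint fresh values with the Python interpreter you already installed for the backend (this is an illustrative helper, not part of Foxel; the only assumption is that the keys just need to be unpredictable strings):

```python
# Generate random base64-encoded secrets suitable for SECRET_KEY and
# TEMP_LINK_SECRET_KEY in .env.
import base64
import secrets


def make_secret(num_bytes: int = 32) -> str:
    """Return a base64 string carrying num_bytes of random entropy."""
    return base64.b64encode(secrets.token_bytes(num_bytes)).decode("ascii")


if __name__ == "__main__":
    print(f"SECRET_KEY={make_secret()}")
    print(f"TEMP_LINK_SECRET_KEY={make_secret()}")
```

Paste the two printed lines into `.env` in place of the samples.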
5. **Start the development server**

   ```bash
   uvicorn main:app --reload --host 0.0.0.0 --port 8000
   ```

   The API is available at `http://localhost:8000`, and the interactive docs live at `http://localhost:8000/docs`.

### Frontend (React + Vite)

1. **Enter the frontend directory**

   ```bash
   cd web
   ```

2. **Install dependencies**

   ```bash
   bun install
   ```

3. **Run the dev server**

   ```bash
   bun run dev
   ```

   The Vite dev server runs at `http://localhost:5173` and proxies `/api` requests to the backend.
## Contribution Guidelines

### Storage Adapters

Storage adapters integrate new storage providers (for example S3, FTP, or Alist).

1. Create a new module under [`services/adapters/`](services/adapters/) (for example `my_new_adapter.py`).
2. Implement a class that inherits from [`services.adapters.base.BaseAdapter`](services/adapters/base.py) and provide concrete implementations for the abstract methods such as `list_dir`, `get_meta`, `upload`, and `download`. Read the docstrings on the base class carefully to understand each method's role and parameters.
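The adapter surface can be pictured with a toy in-memory implementation. Everything below is illustrative: the real `BaseAdapter` lives in `services/adapters/base.py`, and the signatures here are invented, so treat this as a sketch of the shape rather than the actual interface.

```python
# Hypothetical sketch of a storage adapter. The stand-in base class below
# mimics services.adapters.base.BaseAdapter; only the method names come
# from the guide, the signatures are invented for illustration.
from abc import ABC, abstractmethod


class BaseAdapter(ABC):  # stand-in for services.adapters.base.BaseAdapter
    @abstractmethod
    def list_dir(self, path: str) -> list[str]: ...

    @abstractmethod
    def upload(self, path: str, data: bytes) -> None: ...

    @abstractmethod
    def download(self, path: str) -> bytes: ...


class MemoryAdapter(BaseAdapter):
    """Toy adapter that stores files in a dict instead of a real backend."""

    def __init__(self) -> None:
        self._files: dict[str, bytes] = {}

    def list_dir(self, path: str) -> list[str]:
        prefix = path.rstrip("/") + "/"
        return [p for p in self._files if p.startswith(prefix)]

    def upload(self, path: str, data: bytes) -> None:
        self._files[path] = data

    def download(self, path: str) -> bytes:
        return self._files[path]
```

A real adapter would also implement `get_meta` and the remaining abstract methods against the provider's SDK.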
### Frontend Apps

Frontend apps enable in-browser previews or editors for specific file types.

1. Add a new folder in [`web/src/apps/`](web/src/apps/) for your app and expose a React component.
2. Implement the `FoxelApp` interface defined in [`web/src/apps/types.ts`](web/src/apps/types.ts).
3. Register the app in [`web/src/apps/registry.ts`](web/src/apps/registry.ts) by importing your component and adding it to `APP_REGISTRY`, declaring the MIME types or extensions it supports.
## Submission Rules

### Git Branching

Start your work from the latest `main` branch and push feature changes on a dedicated branch.

### Commit Message Format

We follow the [Conventional Commits](https://www.conventionalcommits.org/) specification to drive release tooling.

```
<type>(<scope>): <subject>
<BLANK LINE>
<body>
<BLANK LINE>
<footer>
```

- **type**: e.g. `feat`, `fix`, `docs`, `style`, `refactor`, `test`, `chore`.
- **scope** (optional): the area impacted by the change, such as `adapter`, `ui`, or `api`.
- **subject**: a concise summary written in the imperative mood.

**Examples:**

```
feat(adapter): add support for Alist storage
```

```
fix(ui): correct display issue in file list view
```
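A minimal check for this header format can be scripted. The regex below is an illustrative approximation of the Conventional Commits header (no `!` breaking-change marker, no multi-word scopes), not the full specification:

```python
# Rough validator for "<type>(<scope>): <subject>" commit headers.
# The type list mirrors the guide; the pattern is a simplification of
# the Conventional Commits spec.
import re

TYPES = ("feat", "fix", "docs", "style", "refactor", "test", "chore")
HEADER_RE = re.compile(rf"^({'|'.join(TYPES)})(\([a-z0-9-]+\))?: \S.*$")


def is_valid_header(line: str) -> bool:
    """Return True when the first commit line matches the convention."""
    return HEADER_RE.match(line) is not None
```

A script like this can run from a `commit-msg` Git hook to reject malformed messages early.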
### Pull Request Flow

1. Fork the repository and clone it locally.
2. Create and switch to your feature branch.
3. Implement the change and run relevant checks.
4. Push the branch to your fork.
5. Open a pull request against `main` in the Foxel repository.
6. Explain the change set, its motivation, and reference related Issues in the PR description.

Maintainers will review your pull request as soon as possible. Thank you for your patience and your contribution!
CONTRIBUTING_zh.md (new file, 202 lines)

@@ -0,0 +1,202 @@
<div align="right">
<a href="./CONTRIBUTING.md">English</a> | <b>简体中文</b>
</div>

# Contributing to Foxel

🎉 首先,非常感谢您愿意花时间为 Foxel 做出贡献!

我们热烈欢迎各种形式的贡献。无论是报告 Bug、提出新功能建议、完善文档,还是直接提交代码,都将对项目产生积极的影响。

本指南将帮助您顺利地参与到项目中来。

## 目录

- [如何贡献](#如何贡献)
  - [🐛 报告 Bug](#-报告-bug)
  - [✨ 提交功能建议](#-提交功能建议)
  - [🛠️ 贡献代码](#️-贡献代码)
- [开发环境搭建](#开发环境搭建)
  - [依赖准备](#依赖准备)
  - [后端 (FastAPI)](#后端-fastapi)
  - [前端 (React + Vite)](#前端-react--vite)
- [代码贡献指南](#代码贡献指南)
  - [贡献存储适配器 (Adapter)](#贡献存储适配器-adapter)
  - [贡献前端应用 (App)](#贡献前端应用-app)
- [提交规范](#提交规范)
  - [Git 分支管理](#git-分支管理)
  - [Commit Message 格式](#commit-message-格式)
  - [Pull Request 流程](#pull-request-流程)

---

## 如何贡献

### 🐛 报告 Bug

如果您在使用的过程中发现了 Bug,请通过 [GitHub Issues](https://github.com/DrizzleTime/Foxel/issues) 来报告。请在报告中提供以下信息:

- **清晰的标题**:简明扼要地描述问题。
- **复现步骤**:详细说明如何一步步重现该 Bug。
- **期望行为** vs **实际行为**:描述您预期的结果和实际发生的情况。
- **环境信息**:例如操作系统、浏览器版本、Foxel 版本等。

### ✨ 提交功能建议

我们欢迎任何关于新功能或改进的建议。请通过 [GitHub Issues](https://github.com/DrizzleTime/Foxel/issues) 创建一个 "Feature Request",并详细阐述您的想法:

- **问题描述**:说明该功能要解决什么问题。
- **方案设想**:描述您希望该功能如何工作。
- **相关信息**:提供任何有助于理解您想法的截图、链接或参考。

### 🛠️ 贡献代码

如果您希望直接贡献代码,请参考下面的开发和提交流程。

## 开发环境搭建

### 依赖准备

- **Git**: 用于版本控制。
- **Python**: >= 3.13
- **Bun**: 用于前端包管理和脚本运行。

### 后端 (FastAPI)

后端 API 服务基于 Python 和 FastAPI 构建。

1. **克隆仓库**

   ```bash
   git clone https://github.com/DrizzleTime/Foxel.git
   cd Foxel
   ```

2. **创建并激活 Python 虚拟环境**

   我们推荐使用 `uv` 来管理虚拟环境,以获得最佳性能。

   ```bash
   uv venv
   source .venv/bin/activate
   # On Windows: .venv\Scripts\activate
   ```

3. **安装依赖**

   ```bash
   uv sync
   ```

4. **初始化环境**

   在启动服务前,请进行以下准备:

   - **创建数据目录**:
     在项目根目录执行 `mkdir -p data/db`。这将创建用于存放数据库等文件的目录。

     > [!IMPORTANT]
     > 请确保应用拥有对 `data/db` 目录的读写权限。

   - **创建 `.env` 配置文件**:
     在项目根目录创建名为 `.env` 的文件,并填入以下内容。这些密钥用于保障应用安全,您可以按需修改。

     ```dotenv
     SECRET_KEY=EnsRhL9NFPxgFVc+7t96/y70DIOR+9SpntcIqQa90TU=
     TEMP_LINK_SECRET_KEY=EnsRhL9NFPxgFVc+7t96/y70DIOR+9SpntcIqQa90TU=
     ```

5. **启动开发服务器**

   ```bash
   uvicorn main:app --reload --host 0.0.0.0 --port 8000
   ```

   API 服务将在 `http://localhost:8000` 上运行,您可以通过 `http://localhost:8000/docs` 访问自动生成的 API 文档。

### 前端 (React + Vite)

前端应用使用 React、Vite 和 TypeScript 构建。

1. **进入前端目录**

   ```bash
   cd web
   ```

2. **安装依赖**

   ```bash
   bun install
   ```

3. **启动开发服务器**

   ```bash
   bun run dev
   ```

   前端开发服务器将在 `http://localhost:5173` 运行。它已经配置了代理,会自动将 `/api` 请求转发到后端服务。

## 代码贡献指南

### 贡献存储适配器 (Adapter)

存储适配器是 Foxel 的核心扩展点,用于接入不同的存储后端 (如 S3, FTP, Alist 等)。

1. **创建适配器文件**: 在 [`services/adapters/`](services/adapters/) 目录下,创建一个新文件,例如 `my_new_adapter.py`。
2. **实现适配器类**:
   - 创建一个类,继承自 [`services.adapters.base.BaseAdapter`](services/adapters/base.py)。
   - 实现 `BaseAdapter` 中定义的所有抽象方法,如 `list_dir`, `get_meta`, `upload`, `download` 等。请仔细阅读基类中的文档注释以理解每个方法的作用和参数。

### 贡献前端应用 (App)

前端应用允许用户在浏览器中直接预览或编辑特定类型的文件。

1. **创建应用组件**: 在 [`web/src/apps/`](web/src/apps/) 目录下,为您的应用创建一个新的文件夹,并在其中创建 React 组件。
2. **定义应用类型**: 您的应用需要实现 [`web/src/apps/types.ts`](web/src/apps/types.ts) 中定义的 `FoxelApp` 接口。
3. **注册应用**: 在 [`web/src/apps/registry.ts`](web/src/apps/registry.ts) 中,导入您的应用组件,并将其添加到 `APP_REGISTRY`。在注册时,您需要指定该应用可以处理的文件类型(通过 MIME Type 或文件扩展名)。

## 提交规范

### Git 分支管理

- 从最新的 `main` 分支创建您的特性分支。

### Commit Message 格式

我们遵循 [Conventional Commits](https://www.conventionalcommits.org/) 规范。这有助于自动化生成更新日志和版本管理。

Commit Message 格式如下:

```
<type>(<scope>): <subject>
<BLANK LINE>
<body>
<BLANK LINE>
<footer>
```

- **type**: `feat`, `fix`, `docs`, `style`, `refactor`, `test`, `chore` 等。
- **scope**: (可选) 本次提交影响的范围,例如 `adapter`, `ui`, `api`。
- **subject**: 简明扼要的描述。

**示例:**

```
feat(adapter): add support for Alist storage
```

```
fix(ui): correct display issue in file list view
```

### Pull Request 流程

1. Fork 仓库并克隆到本地。
2. 创建并切换到您的特性分支。
3. 完成代码编写和测试。
4. 将您的分支推送到您的 Fork 仓库。
5. 在 Foxel 主仓库创建一个 Pull Request,目标分支为 `main`。
6. 在 PR 描述中清晰地说明您的更改内容、目的和任何相关的 Issue 编号。

项目维护者会尽快审查您的 PR。感谢您的耐心和贡献!
@@ -27,6 +27,9 @@ COPY . .

```dockerfile
COPY nginx.conf /etc/nginx/nginx.conf

RUN mkdir -p data/db data/mount && \
    chmod 777 data/db data/mount

EXPOSE 80

COPY entrypoint.sh /entrypoint.sh
```
@@ -73,7 +73,7 @@ chmod 777 data/db data/mount

We welcome contributions from the community, whether that means reporting bugs, suggesting new features, or contributing code directly.

Before you start, please read our [`CONTRIBUTING.md`](CONTRIBUTING.md) file, which explains the development environment and submission process. A Simplified Chinese translation is available in [`CONTRIBUTING_zh.md`](CONTRIBUTING_zh.md).

## 🌐 Community
@@ -74,7 +74,7 @@ chmod 777 data/db data/mount

我们非常欢迎来自社区的贡献!无论是提交 Bug、建议新功能还是直接贡献代码。

在开始之前,请先阅读我们的 [`CONTRIBUTING_zh.md`](CONTRIBUTING_zh.md) 文件,它会指导你如何设置开发环境以及提交流程。

## 🌐 社区
@@ -1,10 +1,11 @@

```python
import httpx
import time
from fastapi import APIRouter, Depends, Form, HTTPException
from typing import Annotated
from services.config import ConfigCenter, VERSION
from services.auth import get_current_active_user, User, has_users
from api.response import success
from services.vector_db import VectorDBService

router = APIRouter(prefix="/api/config", tags=["config"])
```

@@ -23,8 +24,27 @@ async def set_config(

```python
    key: str = Form(...),
    value: str = Form(...)
):
    original_value = await ConfigCenter.get(key)
    value_to_save = value
    if key == "AI_EMBED_DIM":
        try:
            parsed_value = int(value)
        except (TypeError, ValueError):
            raise HTTPException(status_code=400, detail="AI_EMBED_DIM must be an integer")
        if parsed_value <= 0:
            raise HTTPException(status_code=400, detail="AI_EMBED_DIM must be greater than zero")
        value_to_save = str(parsed_value)

    await ConfigCenter.set(key, value_to_save)

    if key == "AI_EMBED_DIM" and str(original_value) != value_to_save:
        try:
            service = VectorDBService()
            await service.clear_all_data()
        except Exception as exc:
            raise HTTPException(status_code=500, detail=f"Failed to clear vector database: {exc}")

    return success({"key": key, "value": value_to_save})
```
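The `AI_EMBED_DIM` guard above can be exercised without FastAPI. Below is a standalone restatement of the same rule; the function name is invented for illustration, and `ValueError` stands in for the HTTP 400 the endpoint raises:

```python
# Standalone version of the AI_EMBED_DIM check from the endpoint above.
def normalize_embed_dim(value: str) -> str:
    """Return the canonical string form of a positive integer dimension,
    raising ValueError for anything else (mapped to HTTP 400 in the API)."""
    try:
        parsed = int(value)
    except (TypeError, ValueError):
        raise ValueError("AI_EMBED_DIM must be an integer")
    if parsed <= 0:
        raise ValueError("AI_EMBED_DIM must be greater than zero")
    return str(parsed)
```

Canonicalising to `str(parsed)` is what lets the endpoint compare old and new values reliably before deciding to wipe the vector store.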
```python
@router.get("/all")
```
@@ -1,10 +1,17 @@

```python
from pathlib import Path
from fastapi import APIRouter, Depends, Body, HTTPException
from fastapi.concurrency import run_in_threadpool
from typing import Annotated
from services.processors.registry import (
    get_config_schemas,
    get_module_path,
    reload_processors,
)
from services.task_queue import task_queue_service
from services.auth import get_current_active_user, User
from api.response import success
from pydantic import BaseModel
from services.virtual_fs import path_is_directory

router = APIRouter(prefix="/api/processors", tags=["processors"])
```

@@ -22,6 +29,7 @@ async def list_processors(

```python
        "supported_exts": meta.get("supported_exts", []),
        "config_schema": meta["config_schema"],
        "produces_file": meta.get("produces_file", False),
        "module_path": meta.get("module_path"),
    })
    return success(out)
```

@@ -34,12 +42,20 @@ class ProcessRequest(BaseModel):

```python
    overwrite: bool = False


class UpdateSourceRequest(BaseModel):
    source: str


@router.post("/process")
async def process_file_with_processor(
    current_user: Annotated[User, Depends(get_current_active_user)],
    req: ProcessRequest = Body(...)
):
    is_dir = await path_is_directory(req.path)
    if is_dir and not req.overwrite:
        raise HTTPException(400, detail="Directory processing requires overwrite")

    save_to = None if is_dir else (req.path if req.overwrite else req.save_to)
    task = await task_queue_service.add_task(
        "process_file",
        {
```

@@ -47,6 +63,54 @@

```python
            "processor_type": req.processor_type,
            "config": req.config,
            "save_to": save_to,
            "overwrite": req.overwrite,
        },
    )
    return success({"task_id": task.id})


@router.get("/source/{processor_type}")
async def get_processor_source(
    processor_type: str,
    current_user: Annotated[User, Depends(get_current_active_user)],
):
    module_path = get_module_path(processor_type)
    if not module_path:
        raise HTTPException(404, detail="Processor not found")
    path_obj = Path(module_path)
    if not path_obj.exists():
        raise HTTPException(404, detail="Processor source not found")
    try:
        content = await run_in_threadpool(path_obj.read_text, encoding='utf-8')
    except Exception as exc:
        raise HTTPException(500, detail=f"Failed to read source: {exc}")
    return success({"source": content, "module_path": str(path_obj)})


@router.put("/source/{processor_type}")
async def update_processor_source(
    processor_type: str,
    req: UpdateSourceRequest,
    current_user: Annotated[User, Depends(get_current_active_user)],
):
    module_path = get_module_path(processor_type)
    if not module_path:
        raise HTTPException(404, detail="Processor not found")
    path_obj = Path(module_path)
    if not path_obj.exists():
        raise HTTPException(404, detail="Processor source not found")
    try:
        await run_in_threadpool(path_obj.write_text, req.source, encoding='utf-8')
    except Exception as exc:
        raise HTTPException(500, detail=f"Failed to write source: {exc}")
    return success(True)


@router.post("/reload")
async def reload_processor_modules(
    current_user: Annotated[User, Depends(get_current_active_user)],
):
    errors = reload_processors()
    if errors:
        raise HTTPException(500, detail="; ".join(errors))
    return success(True)
```
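The `/reload` endpoint above delegates to the registry's module discovery. Its mechanism (walk a package with `pkgutil`, import each module, collect metadata, record failures instead of crashing) can be sketched independently of Foxel. The package and attribute names below are invented for the demonstration; the real registry is `services/processors/registry.py`:

```python
# Illustrative sketch of pkgutil-based plugin discovery, modelled on the
# registry shown later in this diff. "demo_processors" is a throwaway
# package created on disk just for this example.
import importlib
import pkgutil
import sys
import tempfile
from pathlib import Path


def discover(pkg_name: str) -> tuple[dict, list[str]]:
    """Import every module in a package; collect PROCESSOR_TYPE entries."""
    pkg = importlib.import_module(pkg_name)
    found, errors = {}, []
    for modinfo in pkgutil.iter_modules(pkg.__path__):
        if modinfo.name.startswith("_"):
            continue
        full_name = f"{pkg_name}.{modinfo.name}"
        try:
            module = importlib.import_module(full_name)
        except Exception as exc:
            errors.append(f"Failed to import {full_name}: {exc}")
            continue
        ptype = getattr(module, "PROCESSOR_TYPE", None)
        if ptype:
            found[ptype] = module
    return found, errors


# Build a throwaway package on disk to demonstrate.
root = Path(tempfile.mkdtemp())
pkg_dir = root / "demo_processors"
pkg_dir.mkdir()
(pkg_dir / "__init__.py").write_text("")
(pkg_dir / "thumbs.py").write_text('PROCESSOR_TYPE = "thumbnail"\n')
(pkg_dir / "broken.py").write_text("raise RuntimeError('boom')\n")
sys.path.insert(0, str(root))

found, errors = discover("demo_processors")
```

A broken plugin module is reported in `errors` rather than aborting discovery, which is what lets the real `/reload` endpoint return a 500 listing every failure at once.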
@@ -9,7 +9,7 @@ router = APIRouter(prefix="/api/search", tags=["search"])

```python
async def search_files_by_vector(q: str, top_k: int):
    embedding = await get_text_embedding(q)
    vector_db = VectorDBService()
    results = await vector_db.search_vectors("vector_collection", embedding, top_k)
    items = [
        SearchResultItem(id=res["id"], path=res["entity"]["path"], score=res["distance"])
        for res in results[0]
```

@@ -18,7 +18,7 @@ async def search_files_by_vector(q: str, top_k: int):

```python
async def search_files_by_name(q: str, top_k: int):
    vector_db = VectorDBService()
    results = await vector_db.search_by_path("vector_collection", q, top_k)
    items = [
        SearchResultItem(id=idx, path=res["entity"]["path"], score=res["distance"])
        for idx, res in enumerate(results[0])
```

@@ -38,4 +38,4 @@ async def search_files(

```python
    elif mode == "filename":
        return await search_files_by_name(q, top_k)
    else:
        return {"items": [], "query": q, "error": "Invalid search mode"}
```
@@ -83,6 +83,18 @@ async def get_my_shares(current_user: User = Depends(get_current_active_user)):

```python
    return [ShareInfo.from_orm(s) for s in shares]


@router.delete("/expired")
async def delete_expired_shares(
    current_user: User = Depends(get_current_active_user),
):
    """
    删除当前用户的所有已过期分享。
    """
    user_account = await UserAccount.get(id=current_user.id)
    deleted_count = await share_service.delete_expired_shares(user=user_account)
    return success({"deleted_count": deleted_count})


@router.delete("/{share_id}")
async def delete_share(
    share_id: int,
```
@@ -1,19 +1,100 @@

```python
from typing import Any, Dict

from fastapi import APIRouter, Depends, HTTPException
from pydantic import BaseModel, Field

from services.auth import get_current_active_user
from models.database import UserAccount
from services.vector_db import (
    VectorDBService,
    VectorDBConfigManager,
    list_providers,
    get_provider_entry,
)
from services.vector_db.providers import get_provider_class
from api.response import success

router = APIRouter(prefix="/api/vector-db", tags=["vector-db"])


class VectorDBConfigPayload(BaseModel):
    type: str = Field(..., description="向量数据库提供者类型")
    config: Dict[str, Any] = Field(default_factory=dict, description="提供者配置参数")


@router.post("/clear-all", summary="清空向量数据库")
async def clear_vector_db(user: UserAccount = Depends(get_current_active_user)):
    if user.username != 'admin':
        raise HTTPException(status_code=403, detail="仅管理员可操作")
    try:
        service = VectorDBService()
        await service.clear_all_data()
        return success(msg="向量数据库已清空")
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))


@router.get("/stats", summary="获取向量数据库统计")
async def get_vector_db_stats(user: UserAccount = Depends(get_current_active_user)):
    if user.username != 'admin':
        raise HTTPException(status_code=403, detail="仅管理员可操作")
    try:
        service = VectorDBService()
        data = await service.get_all_stats()
        return success(data=data)
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))


@router.get("/providers", summary="列出可用向量数据库提供者")
async def list_vector_providers(user: UserAccount = Depends(get_current_active_user)):
    if user.username != 'admin':
        raise HTTPException(status_code=403, detail="仅管理员可操作")
    return success(list_providers())


@router.get("/config", summary="获取当前向量数据库配置")
async def get_vector_db_config(user: UserAccount = Depends(get_current_active_user)):
    if user.username != 'admin':
        raise HTTPException(status_code=403, detail="仅管理员可操作")
    service = VectorDBService()
    data = await service.current_provider()
    return success(data)


@router.post("/config", summary="更新向量数据库配置")
async def update_vector_db_config(payload: VectorDBConfigPayload, user: UserAccount = Depends(get_current_active_user)):
    if user.username != 'admin':
        raise HTTPException(status_code=403, detail="仅管理员可操作")

    entry = get_provider_entry(payload.type)
    if not entry:
        raise HTTPException(status_code=400, detail=f"未知的向量数据库类型: {payload.type}")
    if not entry.get("enabled", True):
        raise HTTPException(status_code=400, detail="该向量数据库类型暂不可用")

    provider_cls = get_provider_class(payload.type)
    if not provider_cls:
        raise HTTPException(status_code=400, detail=f"未找到类型 {payload.type} 对应的实现")

    # 先尝试建立连接,确保配置有效
    test_provider = provider_cls(payload.config)
    try:
        await test_provider.initialize()
    except Exception as exc:
        raise HTTPException(status_code=400, detail=str(exc))
    finally:
        client = getattr(test_provider, "client", None)
        close_fn = getattr(client, "close", None)
        if callable(close_fn):
            try:
                close_fn()
            except Exception:
                pass

    await VectorDBConfigManager.save_config(payload.type, payload.config)
    service = VectorDBService()
    await service.reload()
    config_data = await service.current_provider()
    stats = await service.get_all_stats()
    return success({"config": config_data, "stats": stats})
```
@@ -28,7 +28,7 @@ http {

```nginx
    listen 80;
    server_name _;

    location ~ ^/(api|webdav|docs|openapi\.json$) {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
```
@@ -64,6 +64,7 @@ dependencies = [

```toml
    "python-multipart==0.0.20",
    "pytz==2025.2",
    "pyyaml==6.0.2",
    "qdrant-client==1.15.1",
    "rawpy==0.25.1",
    "rich==14.1.0",
    "rich-toolkit==0.15.0",
```
@@ -4,7 +4,7 @@ from typing import Any, Optional, Dict

```python
from dotenv import load_dotenv
from models.database import Configuration

load_dotenv(dotenv_path=".env")
VERSION = "v1.2.8"


class ConfigCenter:
    _cache: Dict[str, Any] = {}
```
@@ -1,33 +1,53 @@

```python
import inspect
import pkgutil
from importlib import import_module, reload
from pathlib import Path
from types import ModuleType
from typing import Callable, Dict, Optional

from .base import BaseProcessor

ProcessorFactory = Callable[[], BaseProcessor]
TYPE_MAP: Dict[str, ProcessorFactory] = {}
CONFIG_SCHEMAS: Dict[str, dict] = {}
MODULE_MAP: Dict[str, ModuleType] = {}
LAST_DISCOVERY_ERRORS: list[str] = []


def discover_processors(force_reload: bool = False) -> list[str]:
    """Discover available processor modules and cache their metadata."""
    import services.processors  # deferred import to avoid a circular dependency
    processors_pkg = services.processors
    TYPE_MAP.clear()
    CONFIG_SCHEMAS.clear()
    MODULE_MAP.clear()

    global LAST_DISCOVERY_ERRORS
    LAST_DISCOVERY_ERRORS = []

    for modinfo in pkgutil.iter_modules(processors_pkg.__path__):
        if modinfo.name.startswith("_"):
            continue

        full_name = f"{processors_pkg.__name__}.{modinfo.name}"
        try:
            module = import_module(full_name)
            if force_reload:
                module = reload(module)
        except Exception as exc:
            LAST_DISCOVERY_ERRORS.append(f"Failed to import {full_name}: {exc}")
            continue

        processor_type = getattr(module, "PROCESSOR_TYPE", None)
        processor_name = getattr(module, "PROCESSOR_NAME", None)
        supported_exts = getattr(module, "SUPPORTED_EXTS", None)
        schema = getattr(module, "CONFIG_SCHEMA", None)
        factory = getattr(module, "PROCESSOR_FACTORY", None)

        if not processor_type:
            continue

        if factory is None:
            for attr in module.__dict__.values():
                if inspect.isclass(attr) and attr.__name__.endswith("Processor"):
```

@@ -35,31 +55,85 @@ def discover_processors():

```python
                        return lambda: cls()
                    factory = _mk()
                    break

        if not callable(factory):
            LAST_DISCOVERY_ERRORS.append(f"Processor {full_name} missing factory")
            continue

        try:
            sample = factory()
        except Exception as exc:
            LAST_DISCOVERY_ERRORS.append(f"Failed to instantiate processor {processor_type}: {exc}")
            continue

        TYPE_MAP[processor_type] = factory
        MODULE_MAP[processor_type] = module

        produces_file = getattr(module, "produces_file", None)
        if produces_file is None and hasattr(sample, "produces_file"):
            produces_file = getattr(sample, "produces_file")

        module_file = getattr(module, "__file__", None)
        module_path: Optional[str] = None
        if module_file:
            try:
                module_path = str(Path(module_file).resolve())
            except Exception:
                module_path = module_file

        if isinstance(supported_exts, list):
            normalized_exts = [str(ext) for ext in supported_exts]
        elif supported_exts:
            normalized_exts = [str(supported_exts)]
        else:
            normalized_exts = []

        if not normalized_exts and hasattr(sample, "supported_exts"):
            sample_exts = getattr(sample, "supported_exts") or []
            if isinstance(sample_exts, list):
                normalized_exts = [str(ext) for ext in sample_exts]

        if isinstance(schema, list):
            CONFIG_SCHEMAS[processor_type] = {
                "type": processor_type,
                "name": processor_name or processor_type,
                "supported_exts": normalized_exts,
                "config_schema": schema,
                "produces_file": produces_file if produces_file is not None else False
```
|
||||
"produces_file": produces_file if produces_file is not None else False,
|
||||
"module_path": module_path,
|
||||
}
|
||||
|
||||
return LAST_DISCOVERY_ERRORS
|
||||
|
||||
|
||||
def get_config_schemas() -> Dict[str, dict]:
|
||||
return CONFIG_SCHEMAS
|
||||
|
||||
|
||||
def get_config_schema(processor_type: str):
|
||||
return CONFIG_SCHEMAS.get(processor_type)
|
||||
|
||||
|
||||
def get(processor_type: str) -> BaseProcessor:
|
||||
factory = TYPE_MAP.get(processor_type)
|
||||
if factory:
|
||||
return factory()
|
||||
return None
|
||||
|
||||
|
||||
def get_module_path(processor_type: str) -> Optional[str]:
|
||||
meta = CONFIG_SCHEMAS.get(processor_type)
|
||||
if not meta:
|
||||
return None
|
||||
return meta.get("module_path")
|
||||
|
||||
|
||||
def get_last_discovery_errors() -> list[str]:
|
||||
return LAST_DISCOVERY_ERRORS
|
||||
|
||||
|
||||
def reload_processors() -> list[str]:
|
||||
return discover_processors(force_reload=True)
|
||||
|
||||
|
||||
discover_processors()
|
||||
|
||||
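The registry above finds processors by iterating the package with `pkgutil.iter_modules`, importing each module, and reading module-level metadata attributes. A minimal sketch of that discovery pattern, using a throwaway package created on disk (all names here are illustrative, not Foxel's):

```python
import pkgutil
import sys
import tempfile
from importlib import import_module
from pathlib import Path

# Build a throwaway package with one plugin module.
root = Path(tempfile.mkdtemp())
pkg = root / "demo_plugins"
pkg.mkdir()
(pkg / "__init__.py").write_text("")
(pkg / "upper.py").write_text('PROCESSOR_TYPE = "upper"\n')

sys.path.insert(0, str(root))
demo = import_module("demo_plugins")

# Same shape as TYPE_MAP discovery: scan submodules, keep those with metadata.
type_map = {}
for modinfo in pkgutil.iter_modules(demo.__path__):
    module = import_module(f"demo_plugins.{modinfo.name}")
    ptype = getattr(module, "PROCESSOR_TYPE", None)
    if ptype:
        type_map[ptype] = module

print(sorted(type_map))  # ['upper']
```

Modules without a `PROCESSOR_TYPE` attribute are simply skipped, which is what lets the real registry accumulate per-module errors without aborting discovery.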
@@ -2,8 +2,9 @@ from typing import Dict, Any
from fastapi.responses import Response
import base64
from services.ai import describe_image_base64, get_text_embedding
from services.vector_db import VectorDBService
from services.vector_db import VectorDBService, DEFAULT_VECTOR_DIMENSION
from services.logging import LogService
from services.config import ConfigCenter


class VectorIndexProcessor:
@@ -33,7 +34,7 @@ class VectorIndexProcessor:
        vector_db = VectorDBService()
        collection_name = "vector_collection"
        if action == "destroy":
            vector_db.delete_vector(collection_name, path)
            await vector_db.delete_vector(collection_name, path)
            await LogService.info(
                "processor:vector_index",
                f"Destroyed {index_type} index for {path}",
@@ -42,8 +43,8 @@ class VectorIndexProcessor:
            return Response(content=f"The {index_type} index for file {path} has been destroyed", media_type="text/plain")

        if index_type == 'simple':
            vector_db.ensure_collection(collection_name, vector=False)
            vector_db.upsert_vector(collection_name, {'path': path})
            await vector_db.ensure_collection(collection_name, vector=False)
            await vector_db.upsert_vector(collection_name, {'path': path})
            await LogService.info(
                "processor:vector_index",
                f"Created simple index for {path}",
@@ -71,8 +72,16 @@ class VectorIndexProcessor:
        if embedding is None:
            return Response(content="Unsupported file type", status_code=400)

        vector_db.ensure_collection(collection_name, vector=True)
        vector_db.upsert_vector(
        raw_dim = await ConfigCenter.get('AI_EMBED_DIM', DEFAULT_VECTOR_DIMENSION)
        try:
            vector_dim = int(raw_dim)
        except (TypeError, ValueError):
            vector_dim = DEFAULT_VECTOR_DIMENSION
        if vector_dim <= 0:
            vector_dim = DEFAULT_VECTOR_DIMENSION

        await vector_db.ensure_collection(collection_name, vector=True, dim=vector_dim)
        await vector_db.upsert_vector(
            collection_name, {'path': path, 'embedding': embedding})

        await LogService.info(
@@ -90,6 +90,16 @@ class ShareService:
            raise HTTPException(status_code=404, detail="Share link not found")
        await share.delete()

    @staticmethod
    async def delete_expired_shares(user: UserAccount) -> int:
        """
        Delete all of the current user's expired share links and return the number deleted.
        Condition: expires_at is non-null and less than or equal to the current time (UTC).
        """
        now = datetime.now(timezone.utc)
        deleted_count = await ShareLink.filter(user=user, expires_at__lte=now).delete()
        return deleted_count

    @staticmethod
    async def get_shared_item_details(share: ShareLink, sub_path: str = ""):
        """
@@ -122,4 +132,4 @@ class ShareService:
            raise e


share_service = ShareService()
share_service = ShareService()
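`delete_expired_shares` keys off a timezone-aware `expires_at <= now` comparison against UTC. A minimal in-memory sketch of the same condition, with plain dicts standing in for ShareLink rows (illustrative only):

```python
from datetime import datetime, timedelta, timezone

now = datetime.now(timezone.utc)
shares = [
    {"id": 1, "expires_at": now - timedelta(hours=1)},  # expired
    {"id": 2, "expires_at": now + timedelta(hours=1)},  # still valid
    {"id": 3, "expires_at": None},                      # never expires
]

# Mirrors the ORM filter expires_at__lte=now: non-null AND <= now.
expired = [s for s in shares if s["expires_at"] is not None and s["expires_at"] <= now]
print([s["id"] for s in expired])  # [1]
```

Keeping both sides timezone-aware avoids the naive/aware comparison `TypeError` that `datetime.now()` without `timezone.utc` would trigger.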
@@ -54,7 +54,8 @@ class TaskQueueService:
                path=params["path"],
                processor_type=params["processor_type"],
                config=params["config"],
                save_to=params["save_to"]
                save_to=params.get("save_to"),
                overwrite=params.get("overwrite", False),
            )
            task.result = result
        elif task.name == "automation_task":
@@ -119,4 +120,4 @@ class TaskQueueService:
    await LogService.info("task_queue", "Task worker has been stopped.")


task_queue_service = TaskQueueService()
task_queue_service = TaskQueueService()
@@ -1,83 +0,0 @@
from pymilvus import CollectionSchema, DataType, FieldSchema, MilvusClient


class VectorDBService:
    _instance = None

    def __new__(cls, *args, **kwargs):
        if not cls._instance:
            cls._instance = super(VectorDBService, cls).__new__(cls)
        return cls._instance

    def __init__(self):
        if not hasattr(self, 'client'):
            self.client = MilvusClient("data/db/milvus.db")

    def ensure_collection(self, collection_name, vector: bool = True):
        if self.client.has_collection(collection_name):
            return
        if vector:
            fields = [
                FieldSchema(name="path", dtype=DataType.VARCHAR,
                            max_length=512, is_primary=True, auto_id=False),
                FieldSchema(name="embedding",
                            dtype=DataType.FLOAT_VECTOR, dim=4096)
            ]
            schema = CollectionSchema(
                fields, description="Image vector collection")
            self.client.create_collection(collection_name, schema=schema)
            index_params = MilvusClient.prepare_index_params()
            index_params.add_index(
                field_name="embedding",
                index_type="IVF_FLAT",
                index_name="vector_index",
                metric_type="COSINE",
                params={
                    "nlist": 64,
                }
            )
            self.client.create_index(
                collection_name,
                index_params=index_params
            )
        else:
            fields = [
                FieldSchema(name="path", dtype=DataType.VARCHAR,
                            max_length=512, is_primary=True, auto_id=False),
            ]
            schema = CollectionSchema(fields, description="Simple file index")
            self.client.create_collection(collection_name, schema=schema)

    def upsert_vector(self, collection_name, data):
        self.client.upsert(collection_name, data)

    def delete_vector(self, collection_name, path: str):
        self.client.delete(collection_name, ids=[path])

    def search_vectors(self, collection_name, query_embedding, top_k=5):
        search_params = {"metric_type": "COSINE"}
        results = self.client.search(
            collection_name,
            data=[query_embedding],
            anns_field="embedding",
            search_params=search_params,
            limit=top_k,
            output_fields=["path"]
        )
        print(results)
        return results

    def search_by_path(self, collection_name, query_path, top_k=20):
        results = self.client.query(
            collection_name,
            filter=f"path like '%{query_path}%'",
            limit=top_k,
            output_fields=["path"]
        )
        return [[{'id': r['path'], 'distance': 1.0, 'entity': {'path': r['path']}} for r in results]]

    def clear_all_data(self):
        """Drop every collection, clearing all stored data."""
        collections = self.client.list_collections()
        for collection_name in collections:
            self.client.drop_collection(collection_name)
11  services/vector_db/__init__.py  Normal file
@@ -0,0 +1,11 @@
from .service import VectorDBService, DEFAULT_VECTOR_DIMENSION
from .providers import list_providers, get_provider_entry
from .config_manager import VectorDBConfigManager

__all__ = [
    "VectorDBService",
    "DEFAULT_VECTOR_DIMENSION",
    "list_providers",
    "get_provider_entry",
    "VectorDBConfigManager",
]
43  services/vector_db/config_manager.py  Normal file
@@ -0,0 +1,43 @@
from __future__ import annotations

import json
from typing import Any, Dict, Tuple

from services.config import ConfigCenter


class VectorDBConfigManager:
    TYPE_KEY = "VECTOR_DB_TYPE"
    CONFIG_KEY = "VECTOR_DB_CONFIG"
    DEFAULT_TYPE = "milvus_lite"

    @classmethod
    async def load_config(cls) -> Tuple[str, Dict[str, Any]]:
        raw_type = await ConfigCenter.get(cls.TYPE_KEY, cls.DEFAULT_TYPE)
        provider_type = str(raw_type or cls.DEFAULT_TYPE)

        raw_config = await ConfigCenter.get(cls.CONFIG_KEY)
        config_dict: Dict[str, Any] = {}
        if isinstance(raw_config, str) and raw_config:
            try:
                config_dict = json.loads(raw_config)
            except json.JSONDecodeError:
                config_dict = {}
        elif isinstance(raw_config, dict):
            config_dict = raw_config
        return provider_type, config_dict

    @classmethod
    async def save_config(cls, provider_type: str, config: Dict[str, Any]) -> None:
        await ConfigCenter.set(cls.TYPE_KEY, provider_type)
        await ConfigCenter.set(cls.CONFIG_KEY, json.dumps(config or {}))

    @classmethod
    async def get_type(cls) -> str:
        provider_type, _ = await cls.load_config()
        return provider_type

    @classmethod
    async def get_config(cls) -> Dict[str, Any]:
        _, config = await cls.load_config()
        return config
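`load_config` tolerates the stored value being a JSON string, an already-parsed dict, or malformed text, falling back to an empty config. That parsing branch in isolation (a standalone sketch with no ConfigCenter dependency):

```python
import json
from typing import Any, Dict


def parse_config(raw_config: Any) -> Dict[str, Any]:
    # Same shape as VectorDBConfigManager.load_config's parsing branch.
    if isinstance(raw_config, str) and raw_config:
        try:
            return json.loads(raw_config)
        except json.JSONDecodeError:
            return {}
    if isinstance(raw_config, dict):
        return raw_config
    return {}


print(parse_config('{"db_path": "data/db/milvus.db"}'))  # {'db_path': 'data/db/milvus.db'}
print(parse_config("not json"))                          # {}
print(parse_config(None))                                # {}
```

Catching only `json.JSONDecodeError` (rather than bare `Exception`) keeps genuinely unexpected failures visible while still degrading gracefully on corrupt stored values.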
56  services/vector_db/providers/__init__.py  Normal file
@@ -0,0 +1,56 @@
from __future__ import annotations

from typing import Dict, List, Type

from .base import BaseVectorProvider
from .milvus_lite import MilvusLiteProvider
from .milvus_server import MilvusServerProvider
from .qdrant import QdrantProvider

_PROVIDER_REGISTRY: Dict[str, Dict[str, object]] = {
    MilvusLiteProvider.type: {
        "class": MilvusLiteProvider,
        "label": MilvusLiteProvider.label,
        "description": MilvusLiteProvider.description,
        "enabled": MilvusLiteProvider.enabled,
        "config_schema": MilvusLiteProvider.config_schema,
    },
    MilvusServerProvider.type: {
        "class": MilvusServerProvider,
        "label": MilvusServerProvider.label,
        "description": MilvusServerProvider.description,
        "enabled": MilvusServerProvider.enabled,
        "config_schema": MilvusServerProvider.config_schema,
    },
    QdrantProvider.type: {
        "class": QdrantProvider,
        "label": QdrantProvider.label,
        "description": QdrantProvider.description,
        "enabled": QdrantProvider.enabled,
        "config_schema": QdrantProvider.config_schema,
    },
}


def list_providers() -> List[Dict[str, object]]:
    return [
        {
            "type": type_key,
            "label": meta["label"],
            "description": meta.get("description"),
            "enabled": meta.get("enabled", True),
            "config_schema": meta.get("config_schema", []),
        }
        for type_key, meta in _PROVIDER_REGISTRY.items()
    ]


def get_provider_entry(provider_type: str) -> Dict[str, object] | None:
    return _PROVIDER_REGISTRY.get(provider_type)


def get_provider_class(provider_type: str) -> Type[BaseVectorProvider] | None:
    entry = get_provider_entry(provider_type)
    if not entry:
        return None
    return entry.get("class")  # type: ignore[return-value]
41  services/vector_db/providers/base.py  Normal file
@@ -0,0 +1,41 @@
from __future__ import annotations

from typing import Any, Dict, List


class BaseVectorProvider:
    """Base class for vector database providers; every concrete implementation must inherit from it."""

    type: str = ""
    label: str = ""
    description: str | None = None
    enabled: bool = True
    config_schema: List[Dict[str, Any]] = []

    def __init__(self, config: Dict[str, Any] | None = None):
        self.config = config or {}

    async def initialize(self) -> None:
        """Run initialization logic, e.g. establishing a connection."""
        raise NotImplementedError

    def ensure_collection(self, collection_name: str, vector: bool, dim: int) -> None:
        raise NotImplementedError

    def upsert_vector(self, collection_name: str, data: Dict[str, Any]) -> None:
        raise NotImplementedError

    def delete_vector(self, collection_name: str, path: str) -> None:
        raise NotImplementedError

    def search_vectors(self, collection_name: str, query_embedding, top_k: int):
        raise NotImplementedError

    def search_by_path(self, collection_name: str, query_path: str, top_k: int):
        raise NotImplementedError

    def get_all_stats(self) -> Dict[str, Any]:
        raise NotImplementedError

    def clear_all_data(self) -> None:
        raise NotImplementedError
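`BaseVectorProvider` is a plain-Python interface: each method raises `NotImplementedError` until a subclass overrides it. A minimal illustrative subclass implementing just the upsert/delete pair against an in-memory dict (not one of Foxel's real providers):

```python
from typing import Any, Dict, Optional


class MemoryProvider:  # stands in for a BaseVectorProvider subclass
    type = "memory"
    label = "In-memory"

    def __init__(self, config: Optional[Dict[str, Any]] = None):
        self.config = config or {}
        # collection name -> {path -> record}
        self._store: Dict[str, Dict[str, Any]] = {}

    def upsert_vector(self, collection_name: str, data: Dict[str, Any]) -> None:
        self._store.setdefault(collection_name, {})[data["path"]] = data

    def delete_vector(self, collection_name: str, path: str) -> None:
        self._store.get(collection_name, {}).pop(path, None)


p = MemoryProvider()
p.upsert_vector("c", {"path": "/a.jpg", "embedding": [0.1]})
p.delete_vector("c", "/a.jpg")
print(p._store)  # {'c': {}}
```

Because the path doubles as the primary key, upserting the same path twice overwrites rather than duplicates, matching the behavior the Milvus and Qdrant providers get from their primary-key and point-ID schemes.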
196  services/vector_db/providers/milvus_lite.py  Normal file
@@ -0,0 +1,196 @@
from __future__ import annotations

from pathlib import Path
from typing import Any, Dict, List, Optional

from pymilvus import CollectionSchema, DataType, FieldSchema, MilvusClient

from .base import BaseVectorProvider


class MilvusLiteProvider(BaseVectorProvider):
    type = "milvus_lite"
    label = "Milvus Lite"
    description = "Embedded Milvus Lite (local file storage)."
    enabled = True
    config_schema: List[Dict[str, Any]] = [
        {
            "key": "db_path",
            "label": "Database file path",
            "type": "text",
            "default": "data/db/milvus.db",
            "required": False,
        }
    ]

    def __init__(self, config: Dict[str, Any] | None = None):
        super().__init__(config)
        self.db_path = Path(self.config.get("db_path") or "data/db/milvus.db")
        self.client: MilvusClient | None = None

    async def initialize(self) -> None:
        try:
            self.client = MilvusClient(str(self.db_path))
        except Exception as exc:  # pragma: no cover - depends on local environment
            raise RuntimeError(f"Failed to open Milvus Lite at {self.db_path}: {exc}") from exc

    def _get_client(self) -> MilvusClient:
        if not self.client:
            raise RuntimeError("Milvus Lite client is not initialized")
        return self.client

    @staticmethod
    def _to_int(value: Any) -> int:
        try:
            return int(value)
        except (TypeError, ValueError):
            return 0

    def ensure_collection(self, collection_name: str, vector: bool, dim: int) -> None:
        client = self._get_client()
        if client.has_collection(collection_name):
            return
        if vector:
            vector_dim = dim if isinstance(dim, int) and dim > 0 else 0
            if vector_dim <= 0:
                vector_dim = 4096
            fields = [
                FieldSchema(name="path", dtype=DataType.VARCHAR, max_length=512, is_primary=True, auto_id=False),
                FieldSchema(name="embedding", dtype=DataType.FLOAT_VECTOR, dim=vector_dim),
            ]
            schema = CollectionSchema(fields, description="Image vector collection")
            client.create_collection(collection_name, schema=schema)
            index_params = MilvusClient.prepare_index_params()
            index_params.add_index(
                field_name="embedding",
                index_type="IVF_FLAT",
                index_name="vector_index",
                metric_type="COSINE",
                params={"nlist": 64},
            )
            client.create_index(collection_name, index_params=index_params)
        else:
            fields = [
                FieldSchema(name="path", dtype=DataType.VARCHAR, max_length=512, is_primary=True, auto_id=False),
            ]
            schema = CollectionSchema(fields, description="Simple file index")
            client.create_collection(collection_name, schema=schema)

    def upsert_vector(self, collection_name: str, data: Dict[str, Any]) -> None:
        self._get_client().upsert(collection_name, data)

    def delete_vector(self, collection_name: str, path: str) -> None:
        self._get_client().delete(collection_name, ids=[path])

    def search_vectors(self, collection_name: str, query_embedding, top_k: int):
        search_params = {"metric_type": "COSINE"}
        return self._get_client().search(
            collection_name,
            data=[query_embedding],
            anns_field="embedding",
            search_params=search_params,
            limit=top_k,
            output_fields=["path"],
        )

    def search_by_path(self, collection_name: str, query_path: str, top_k: int):
        filter_expr = f"path like '%{query_path}%'" if query_path else "path like '%%'"
        results = self._get_client().query(
            collection_name,
            filter=filter_expr,
            limit=top_k,
            output_fields=["path"],
        )
        return [[{"id": r["path"], "distance": 1.0, "entity": {"path": r["path"]}} for r in results]]

    def get_all_stats(self) -> Dict[str, Any]:
        client = self._get_client()
        try:
            collection_names = client.list_collections()
        except Exception as exc:
            raise RuntimeError(f"Failed to list collections: {exc}") from exc

        collections: List[Dict[str, Any]] = []
        total_vectors = 0
        total_estimated_memory = 0

        for name in collection_names:
            try:
                stats = client.get_collection_stats(name) or {}
            except Exception:
                stats = {}
            row_count = self._to_int(stats.get("row_count"))
            total_vectors += row_count

            dimension: Optional[int] = None
            is_vector_collection = False
            try:
                description = client.describe_collection(name)
            except Exception:
                description = None

            if description:
                for field in description.get("fields", []):
                    if field.get("type") == DataType.FLOAT_VECTOR:
                        params = field.get("params") or {}
                        dimension = self._to_int(params.get("dim")) or 4096
                        is_vector_collection = True
                        break

            estimated_memory = 0
            if is_vector_collection and dimension:
                estimated_memory = row_count * dimension * 4
                total_estimated_memory += estimated_memory

            indexes: List[Dict[str, Any]] = []
            try:
                index_names = client.list_indexes(name) or []
            except Exception:
                index_names = []

            for index_name in index_names:
                try:
                    detail = client.describe_index(name, index_name) or {}
                except Exception:
                    detail = {}
                indexes.append(
                    {
                        "index_name": index_name,
                        "index_type": detail.get("index_type"),
                        "metric_type": detail.get("metric_type"),
                        "indexed_rows": self._to_int(detail.get("indexed_rows")),
                        "pending_index_rows": self._to_int(detail.get("pending_index_rows")),
                        "state": detail.get("state"),
                    }
                )

            collections.append(
                {
                    "name": name,
                    "row_count": row_count,
                    "dimension": dimension if is_vector_collection else None,
                    "estimated_memory_bytes": estimated_memory,
                    "is_vector_collection": is_vector_collection,
                    "indexes": indexes,
                }
            )

        db_file_size = None
        try:
            if self.db_path.exists():
                db_file_size = self.db_path.stat().st_size
        except OSError:
            db_file_size = None

        return {
            "collections": collections,
            "collection_count": len(collections),
            "total_vectors": total_vectors,
            "estimated_total_memory_bytes": total_estimated_memory,
            "db_file_size_bytes": db_file_size,
        }

    def clear_all_data(self) -> None:
        client = self._get_client()
        for collection_name in client.list_collections():
            client.drop_collection(collection_name)
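`get_all_stats` estimates a collection's memory footprint as `row_count * dimension * 4`, i.e. four bytes per `FLOAT_VECTOR` component (float32). Worked standalone:

```python
def estimated_memory_bytes(row_count: int, dimension: int) -> int:
    # 4 bytes per FLOAT_VECTOR component (float32), as in get_all_stats.
    return row_count * dimension * 4


# 10,000 vectors at the 4096-dim fallback:
print(estimated_memory_bytes(10_000, 4096))  # 163840000 (about 156 MiB)
```

This counts raw vector payload only; index structures and the `path` keys add overhead on top, so it is a lower bound rather than an exact figure.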
197  services/vector_db/providers/milvus_server.py  Normal file
@@ -0,0 +1,197 @@
from __future__ import annotations

from typing import Any, Dict, List, Optional

from pymilvus import CollectionSchema, DataType, FieldSchema, MilvusClient

from .base import BaseVectorProvider


class MilvusServerProvider(BaseVectorProvider):
    type = "milvus_server"
    label = "Milvus Server"
    description = "Remote Milvus instance accessed via URI."
    enabled = True
    config_schema: List[Dict[str, Any]] = [
        {
            "key": "uri",
            "label": "Server URI",
            "type": "text",
            "required": True,
            "placeholder": "http://localhost:19530",
        },
        {
            "key": "token",
            "label": "Token",
            "type": "password",
            "required": False,
            "placeholder": "user:password",
        },
    ]

    def __init__(self, config: Dict[str, Any] | None = None):
        super().__init__(config)
        self.client: MilvusClient | None = None

    async def initialize(self) -> None:
        uri = self.config.get("uri")
        if not uri:
            raise RuntimeError("Milvus Server URI is required")
        try:
            self.client = MilvusClient(uri=uri, token=self.config.get("token"))
        except Exception as exc:  # pragma: no cover - depends on remote availability
            raise RuntimeError(f"Failed to connect to Milvus Server {uri}: {exc}") from exc

    def _get_client(self) -> MilvusClient:
        if not self.client:
            raise RuntimeError("Milvus Server client is not initialized")
        return self.client

    @staticmethod
    def _to_int(value: Any) -> int:
        try:
            return int(value)
        except (TypeError, ValueError):
            return 0

    def ensure_collection(self, collection_name: str, vector: bool, dim: int) -> None:
        client = self._get_client()
        if client.has_collection(collection_name):
            return
        if vector:
            vector_dim = dim if isinstance(dim, int) and dim > 0 else 0
            if vector_dim <= 0:
                vector_dim = 4096
            fields = [
                FieldSchema(name="path", dtype=DataType.VARCHAR, max_length=512, is_primary=True, auto_id=False),
                FieldSchema(name="embedding", dtype=DataType.FLOAT_VECTOR, dim=vector_dim),
            ]
            schema = CollectionSchema(fields, description="Image vector collection")
            client.create_collection(collection_name, schema=schema)
            index_params = MilvusClient.prepare_index_params()
            index_params.add_index(
                field_name="embedding",
                index_type="IVF_FLAT",
                index_name="vector_index",
                metric_type="COSINE",
                params={"nlist": 64},
            )
            client.create_index(collection_name, index_params=index_params)
        else:
            fields = [
                FieldSchema(name="path", dtype=DataType.VARCHAR, max_length=512, is_primary=True, auto_id=False),
            ]
            schema = CollectionSchema(fields, description="Simple file index")
            client.create_collection(collection_name, schema=schema)

    def upsert_vector(self, collection_name: str, data: Dict[str, Any]) -> None:
        self._get_client().upsert(collection_name, data)

    def delete_vector(self, collection_name: str, path: str) -> None:
        self._get_client().delete(collection_name, ids=[path])

    def search_vectors(self, collection_name: str, query_embedding, top_k: int):
        search_params = {"metric_type": "COSINE"}
        return self._get_client().search(
            collection_name,
            data=[query_embedding],
            anns_field="embedding",
            search_params=search_params,
            limit=top_k,
            output_fields=["path"],
        )

    def search_by_path(self, collection_name: str, query_path: str, top_k: int):
        filter_expr = f"path like '%{query_path}%'" if query_path else "path like '%%'"
        results = self._get_client().query(
            collection_name,
            filter=filter_expr,
            limit=top_k,
            output_fields=["path"],
        )
        return [[{"id": r["path"], "distance": 1.0, "entity": {"path": r["path"]}} for r in results]]

    def get_all_stats(self) -> Dict[str, Any]:
        client = self._get_client()
        try:
            collection_names = client.list_collections()
        except Exception as exc:
            raise RuntimeError(f"Failed to list collections: {exc}") from exc

        collections: List[Dict[str, Any]] = []
        total_vectors = 0
        total_estimated_memory = 0

        for name in collection_names:
            try:
                stats = client.get_collection_stats(name) or {}
            except Exception:
                stats = {}
            row_count = self._to_int(stats.get("row_count"))
            total_vectors += row_count

            dimension: Optional[int] = None
            is_vector_collection = False
            try:
                description = client.describe_collection(name)
            except Exception:
                description = None

            if description:
                for field in description.get("fields", []):
                    if field.get("type") == DataType.FLOAT_VECTOR:
                        params = field.get("params") or {}
                        dimension = self._to_int(params.get("dim")) or 4096
                        is_vector_collection = True
                        break

            estimated_memory = 0
            if is_vector_collection and dimension:
                estimated_memory = row_count * dimension * 4
                total_estimated_memory += estimated_memory

            indexes: List[Dict[str, Any]] = []
            try:
                index_names = client.list_indexes(name) or []
            except Exception:
                index_names = []

            for index_name in index_names:
                try:
                    detail = client.describe_index(name, index_name) or {}
                except Exception:
                    detail = {}
                indexes.append(
                    {
                        "index_name": index_name,
                        "index_type": detail.get("index_type"),
                        "metric_type": detail.get("metric_type"),
                        "indexed_rows": self._to_int(detail.get("indexed_rows")),
                        "pending_index_rows": self._to_int(detail.get("pending_index_rows")),
                        "state": detail.get("state"),
                    }
                )

            collections.append(
                {
                    "name": name,
                    "row_count": row_count,
                    "dimension": dimension if is_vector_collection else None,
                    "estimated_memory_bytes": estimated_memory,
                    "is_vector_collection": is_vector_collection,
                    "indexes": indexes,
                }
            )

        return {
            "collections": collections,
            "collection_count": len(collections),
            "total_vectors": total_vectors,
            "estimated_total_memory_bytes": total_estimated_memory,
            "db_file_size_bytes": None,
        }

    def clear_all_data(self) -> None:
        client = self._get_client()
        for collection_name in client.list_collections():
            client.drop_collection(collection_name)
237
services/vector_db/providers/qdrant.py
Normal file
237
services/vector_db/providers/qdrant.py
Normal file
@@ -0,0 +1,237 @@
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import Any, Dict, List, Optional, Sequence
|
||||
from uuid import NAMESPACE_URL, uuid5
|
||||
|
||||
from qdrant_client import QdrantClient
|
||||
from qdrant_client.http import models as qmodels
|
||||
|
||||
from .base import BaseVectorProvider
|
||||
|
||||
|
||||
class QdrantProvider(BaseVectorProvider):
|
||||
type = "qdrant"
|
||||
label = "Qdrant"
|
||||
description = "Qdrant vector database (HTTP API)."
|
||||
enabled = True
|
||||
config_schema: List[Dict[str, Any]] = [
|
||||
{
|
||||
"key": "url",
|
||||
"label": "Server URL",
|
||||
"type": "text",
|
||||
"required": True,
|
||||
"placeholder": "http://localhost:6333",
|
||||
},
|
||||
{
|
||||
"key": "api_key",
|
||||
"label": "API Key",
|
||||
"type": "password",
|
||||
"required": False,
|
||||
},
|
||||
]
|
||||
|
||||
def __init__(self, config: Dict[str, Any] | None = None):
|
||||
super().__init__(config)
|
||||
self.client: Optional[QdrantClient] = None
|
||||
|
||||
async def initialize(self) -> None:
|
||||
url = (self.config.get("url") or "").strip()
|
||||
if not url:
|
||||
raise RuntimeError("Qdrant URL is required")
|
||||
|
||||
api_key = (self.config.get("api_key") or None) or None
|
||||
try:
|
||||
client = QdrantClient(url=url, api_key=api_key)
|
||||
# 简单连通性校验
|
||||
client.get_collections()
|
||||
self.client = client
|
||||
except Exception as exc: # pragma: no cover - 依赖外部服务
|
||||
raise RuntimeError(f"Failed to connect to Qdrant at {url}: {exc}") from exc
|
||||
|
||||
def _get_client(self) -> QdrantClient:
|
||||
if not self.client:
|
||||
raise RuntimeError("Qdrant client is not initialized")
|
||||
return self.client
|
||||
|
||||
@staticmethod
|
||||
def _vector_params(vector: bool, dim: int) -> qmodels.VectorParams:
|
||||
size = dim if vector and isinstance(dim, int) and dim > 0 else 1
|
||||
return qmodels.VectorParams(size=size, distance=qmodels.Distance.COSINE)
|
||||
|
||||
def ensure_collection(self, collection_name: str, vector: bool, dim: int) -> None:
|
||||
client = self._get_client()
|
||||
try:
|
||||
if client.collection_exists(collection_name):
|
||||
return
|
||||
except Exception as exc: # pragma: no cover - 依赖外部服务
|
||||
raise RuntimeError(f"Failed to check Qdrant collection '{collection_name}': {exc}") from exc
|
||||
|
||||
vectors_config = self._vector_params(vector, dim)
|
||||
try:
|
||||
client.create_collection(collection_name=collection_name, vectors_config=vectors_config)
|
||||
except Exception as exc: # pragma: no cover
|
||||
if "already exists" in str(exc).lower():
|
||||
return
|
||||
raise RuntimeError(f"Failed to create Qdrant collection '{collection_name}': {exc}") from exc
|
||||
|
||||
@staticmethod
|
||||
def _point_id(path: str) -> str:
|
||||
return str(uuid5(NAMESPACE_URL, path))
|
||||
|
||||
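`_point_id` derives a deterministic UUID from the file path, so re-indexing the same file always targets the same Qdrant point and an upsert replaces it instead of creating a duplicate. A minimal standalone sketch of that behavior (the `point_id` helper name here is hypothetical):

```python
from uuid import NAMESPACE_URL, uuid5

# Deterministic point IDs: the same path always maps to the same UUID,
# so upserting a re-indexed file replaces its existing point rather than
# appending a new one.
def point_id(path: str) -> str:
    return str(uuid5(NAMESPACE_URL, path))

a = point_id("/photos/cat.jpg")
b = point_id("/photos/cat.jpg")
c = point_id("/photos/dog.jpg")
print(a == b, a == c)  # True False
```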
    def _prepare_point(self, data: Dict[str, Any]) -> qmodels.PointStruct:
        path = data.get("path")
        if not path:
            raise ValueError("Qdrant upsert requires 'path' in data")

        embedding = data.get("embedding")
        if embedding is None:
            vector = [0.0]
        else:
            vector = [float(x) for x in embedding]

        payload = {"path": path}
        return qmodels.PointStruct(id=self._point_id(path), vector=vector, payload=payload)

    def upsert_vector(self, collection_name: str, data: Dict[str, Any]) -> None:
        client = self._get_client()
        point = self._prepare_point(data)
        client.upsert(collection_name=collection_name, wait=True, points=[point])

    def delete_vector(self, collection_name: str, path: str) -> None:
        client = self._get_client()
        selector = qmodels.PointIdsList(points=[self._point_id(path)])
        client.delete(collection_name=collection_name, points_selector=selector, wait=True)

    def _format_search_results(self, points: Sequence[qmodels.ScoredPoint]):
        return [
            {
                "id": point.id,
                "distance": point.score,
                "entity": {"path": (point.payload or {}).get("path")},
            }
            for point in points
        ]

    def search_vectors(self, collection_name: str, query_embedding, top_k: int):
        client = self._get_client()
        vector = [float(x) for x in query_embedding]
        points = client.search(
            collection_name=collection_name,
            query_vector=vector,
            limit=top_k,
            with_payload=True,
        )
        return [self._format_search_results(points)]

    def search_by_path(self, collection_name: str, query_path: str, top_k: int):
        client = self._get_client()
        results: List[Dict[str, Any]] = []
        offset: Optional[str | int] = None
        remaining = max(top_k, 1)

        while len(results) < top_k:
            batch_size = min(max(remaining * 2, 10), 200)
            records, next_offset = client.scroll(
                collection_name=collection_name,
                limit=batch_size,
                offset=offset,
                with_payload=True,
            )
            if not records:
                break

            for record in records:
                path = (record.payload or {}).get("path")
                if query_path and path:
                    if query_path not in path:
                        continue
                results.append({"id": record.id, "distance": 1.0, "entity": {"path": path}})
                if len(results) >= top_k:
                    break

            if next_offset is None or len(results) >= top_k:
                break
            offset = next_offset
            remaining = top_k - len(results)

        return [results]
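`search_by_path` pages through the collection with `scroll` and over-fetches relative to the number of matches still needed, because substring filtering discards some rows; the per-call batch size is clamped to between 10 and 200. The clamp in isolation:

```python
def scroll_batch_size(remaining: int) -> int:
    # Fetch roughly twice what is still needed (the path filter discards
    # non-matching rows), but never fewer than 10 or more than 200 records
    # per scroll call.
    return min(max(remaining * 2, 10), 200)

print(scroll_batch_size(1), scroll_batch_size(50), scroll_batch_size(500))  # 10 100 200
```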
    def _extract_vector_config(self, vectors) -> Optional[qmodels.VectorParams]:
        if isinstance(vectors, qmodels.VectorParams):
            return vectors
        if isinstance(vectors, dict):
            for value in vectors.values():
                if isinstance(value, qmodels.VectorParams):
                    return value
        return None

    def get_all_stats(self) -> Dict[str, Any]:
        client = self._get_client()
        try:
            response = client.get_collections()
        except Exception as exc:  # pragma: no cover
            raise RuntimeError(f"Failed to list Qdrant collections: {exc}") from exc

        collections: List[Dict[str, Any]] = []
        total_vectors = 0
        total_estimated_memory = 0

        for description in response.collections or []:
            name = description.name
            try:
                info = client.get_collection(name)
            except Exception:
                continue

            row_count = int(info.points_count or 0)
            total_vectors += row_count

            vector_params = self._extract_vector_config(info.config.params.vectors if info.config and info.config.params else None)
            dimension = int(vector_params.size) if vector_params and vector_params.size else None
            estimated_memory = row_count * dimension * 4 if dimension else 0
            total_estimated_memory += estimated_memory
            distance = str(vector_params.distance) if vector_params and vector_params.distance else None

            indexed_rows = int(info.indexed_vectors_count or 0)
            pending_rows = max(row_count - indexed_rows, 0)

            collections.append(
                {
                    "name": name,
                    "row_count": row_count,
                    "dimension": dimension,
                    "estimated_memory_bytes": estimated_memory,
                    "is_vector_collection": dimension is not None and dimension > 1,
                    "indexes": [
                        {
                            "index_name": "hnsw",
                            "index_type": "HNSW",
                            "metric_type": distance,
                            "indexed_rows": indexed_rows,
                            "pending_index_rows": pending_rows,
                            "state": info.status,
                        }
                    ],
                }
            )

        return {
            "collections": collections,
            "collection_count": len(collections),
            "total_vectors": total_vectors,
            "estimated_total_memory_bytes": total_estimated_memory,
            "db_file_size_bytes": None,
        }

    def clear_all_data(self) -> None:
        client = self._get_client()
        try:
            response = client.get_collections()
        except Exception as exc:  # pragma: no cover
            raise RuntimeError(f"Failed to list Qdrant collections: {exc}") from exc

        for description in response.collections or []:
            try:
                client.delete_collection(description.name)
            except Exception:
                continue
services/vector_db/service.py (new file, 99 lines)
@@ -0,0 +1,99 @@
from __future__ import annotations

import asyncio
from typing import Any, Dict, Optional

from .config_manager import VectorDBConfigManager
from .providers import get_provider_class, get_provider_entry
from .providers.base import BaseVectorProvider

DEFAULT_VECTOR_DIMENSION = 4096


class VectorDBService:
    _instance: "VectorDBService" | None = None

    def __new__(cls, *args, **kwargs):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

    def __init__(self):
        if not hasattr(self, "_provider"):
            self._provider: Optional[BaseVectorProvider] = None
            self._provider_type: Optional[str] = None
            self._provider_config: Dict[str, Any] | None = None
            self._lock = asyncio.Lock()

    async def _ensure_provider(self) -> BaseVectorProvider:
        if self._provider is None:
            await self.reload()
        assert self._provider is not None  # for type checker
        return self._provider

    async def reload(self) -> BaseVectorProvider:
        async with self._lock:
            provider_type, provider_config = await VectorDBConfigManager.load_config()
            normalized_config = dict(provider_config or {})
            if (
                self._provider
                and self._provider_type == provider_type
                and self._provider_config == normalized_config
            ):
                return self._provider

            entry = get_provider_entry(provider_type)
            if not entry:
                raise RuntimeError(f"Unknown vector database provider: {provider_type}")
            if not entry.get("enabled", True):
                raise RuntimeError(f"Vector database provider '{provider_type}' is disabled")

            provider_cls = get_provider_class(provider_type)
            if not provider_cls:
                raise RuntimeError(f"Provider class not found for '{provider_type}'")

            provider = provider_cls(provider_config)
            await provider.initialize()

            self._provider = provider
            self._provider_type = provider_type
            self._provider_config = normalized_config
            return provider

    async def ensure_collection(self, collection_name: str, vector: bool = True, dim: int = DEFAULT_VECTOR_DIMENSION) -> None:
        provider = await self._ensure_provider()
        provider.ensure_collection(collection_name, vector, dim)

    async def upsert_vector(self, collection_name: str, data: Dict[str, Any]) -> None:
        provider = await self._ensure_provider()
        provider.upsert_vector(collection_name, data)

    async def delete_vector(self, collection_name: str, path: str) -> None:
        provider = await self._ensure_provider()
        provider.delete_vector(collection_name, path)

    async def search_vectors(self, collection_name: str, query_embedding, top_k: int = 5):
        provider = await self._ensure_provider()
        return provider.search_vectors(collection_name, query_embedding, top_k)

    async def search_by_path(self, collection_name: str, query_path: str, top_k: int = 20):
        provider = await self._ensure_provider()
        return provider.search_by_path(collection_name, query_path, top_k)

    async def get_all_stats(self) -> Dict[str, Any]:
        provider = await self._ensure_provider()
        return provider.get_all_stats()

    async def clear_all_data(self) -> None:
        provider = await self._ensure_provider()
        provider.clear_all_data()

    async def current_provider(self) -> Dict[str, Any]:
        provider_type, provider_config = await VectorDBConfigManager.load_config()
        entry = get_provider_entry(provider_type) or {}
        return {
            "type": provider_type,
            "config": provider_config,
            "label": entry.get("label"),
            "enabled": entry.get("enabled", True),
        }
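`VectorDBService` is a `__new__`-based singleton whose `__init__` guards one-time setup with `hasattr`, since `__init__` still runs on every construction. A minimal standalone sketch of that pattern (the `SingletonService` class name is hypothetical):

```python
import asyncio


class SingletonService:
    _instance = None

    def __new__(cls, *args, **kwargs):
        # All constructions return the one shared instance.
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

    def __init__(self):
        # __init__ runs on every SingletonService() call, so one-time
        # setup must be guarded or state would be reset each time.
        if not hasattr(self, "_lock"):
            self._lock = asyncio.Lock()


a, b = SingletonService(), SingletonService()
print(a is b, a._lock is b._lock)  # True True
```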
@@ -1,4 +1,4 @@
-from typing import Dict, Tuple, Any, Union, AsyncIterator
+from typing import Dict, Tuple, Any, Union, AsyncIterator, List
 from fastapi import HTTPException
 import mimetypes
 from fastapi.responses import Response
@@ -59,6 +59,24 @@ async def _ensure_method(adapter: Any, method: str):
     return func


+async def path_is_directory(path: str) -> bool:
+    """Return True if the given path refers to a directory."""
+    adapter_instance, _, root, rel = await resolve_adapter_and_rel(path)
+    rel = rel.rstrip('/')
+    if rel == '':
+        return True
+    stat_func = getattr(adapter_instance, "stat_file", None)
+    if not callable(stat_func):
+        raise HTTPException(501, detail="Adapter does not implement stat_file")
+    try:
+        info = await stat_func(root, rel)
+    except FileNotFoundError:
+        raise HTTPException(404, detail="Path not found")
+    if isinstance(info, dict):
+        return bool(info.get("is_dir"))
+    return False
+
+
 async def list_virtual_dir(path: str, page_num: int = 1, page_size: int = 50, sort_by: str = "name", sort_order: str = "asc") -> Dict:
     norm = (path if path.startswith('/') else '/' + path).rstrip('/') or '/'
     adapters = await StorageAdapter.filter(enabled=True)
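`list_virtual_dir` normalizes incoming paths in one expression: ensure a leading slash, strip trailing slashes, and fall back to `/` for the root. That expression in isolation:

```python
def normalize(path: str) -> str:
    # Ensure a leading slash, strip trailing slashes, and map "" or "/" to "/".
    return (path if path.startswith('/') else '/' + path).rstrip('/') or '/'

print(normalize("photos/"), normalize("/a/b///"), normalize(""))  # /photos /a/b /
```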
@@ -476,28 +494,110 @@ async def copy_path(src: str, dst: str, overwrite: bool = False, return_debug: b
     return debug_info if return_debug else None


-async def process_file(path: str, processor_type: str, config: dict, save_to: str = None):
-    """
-    Process a file with the given processor and optionally save the result to a new path.
-    :param path: source file path
-    :param processor_type: processor type
-    :param config: processor configuration
-    :param save_to: target path (optional); if omitted, only the result is returned
-    :return: the processed file content or the save result
-    """
-    data = await read_file(path)
+async def process_file(
+    path: str,
+    processor_type: str,
+    config: dict,
+    save_to: str | None = None,
+    overwrite: bool = False,
+) -> Any:
+    """Process the given path (file or directory). Directories are processed recursively."""
+
     processor = get_processor(processor_type)
     if not processor:
-        raise HTTPException(
-            400, detail=f"Processor {processor_type} not found")
-    result = await processor.process(data, path, config)
-    if save_to and getattr(processor, "produces_file", False):
-        if isinstance(result, Response):
-            result_bytes = result.body
-        else:
-            result_bytes = result
-        await write_file(save_to, result_bytes)
-        return {"saved_to": save_to}
-    return result
+        raise HTTPException(400, detail=f"Processor {processor_type} not found")
+
+    actual_is_dir = await path_is_directory(path)
+
+    supported_exts = getattr(processor, "supported_exts", None) or []
+    allowed_exts = {
+        str(ext).lower().lstrip('.')
+        for ext in supported_exts
+        if isinstance(ext, str)
+    }
+
+    def matches_extension(rel_path: str) -> bool:
+        if not allowed_exts:
+            return True
+        if '.' not in rel_path:
+            return '' in allowed_exts
+        ext = rel_path.rsplit('.', 1)[-1].lower()
+        return ext in allowed_exts or f'.{ext}' in allowed_exts
+
+    def coerce_result_bytes(result: Any) -> bytes:
+        if isinstance(result, Response):
+            return result.body
+        if isinstance(result, (bytes, bytearray)):
+            return bytes(result)
+        if isinstance(result, str):
+            return result.encode('utf-8')
+        raise HTTPException(500, detail="Processor must return bytes/Response when produces_file=True")
+
+    def build_absolute_path(mount_path: str, rel_path: str) -> str:
+        rel_norm = rel_path.lstrip('/')
+        mount_norm = mount_path.rstrip('/')
+        if not mount_norm:
+            return '/' + rel_norm if rel_norm else '/'
+        return f"{mount_norm}/{rel_norm}" if rel_norm else mount_norm
+
+    if actual_is_dir:
+        if save_to:
+            raise HTTPException(400, detail="Directory processing does not support custom save_to path")
+        if not overwrite:
+            raise HTTPException(400, detail="Directory processing requires overwrite")
+
+        adapter_instance, adapter_model, root, rel = await resolve_adapter_and_rel(path)
+        rel = rel.rstrip('/')
+        list_dir = await _ensure_method(adapter_instance, "list_dir")
+        processed_count = 0
+        stack: List[str] = [rel]
+        page_size = 200
+
+        while stack:
+            current = stack.pop()
+            page = 1
+            while True:
+                entries, total = await list_dir(root, current, page, page_size, "name", "asc")
+                if not entries and (total or 0) == 0:
+                    break
+
+                for entry in entries:
+                    name = entry.get("name")
+                    if not name:
+                        continue
+                    child_rel = f"{current}/{name}" if current else name
+                    if entry.get("is_dir"):
+                        stack.append(child_rel)
+                        continue
+                    if not matches_extension(child_rel):
+                        continue
+                    absolute_path = build_absolute_path(adapter_model.path, child_rel)
+                    data = await read_file(absolute_path)
+                    result = await processor.process(data, absolute_path, config)
+                    if getattr(processor, "produces_file", False):
+                        result_bytes = coerce_result_bytes(result)
+                        await write_file(absolute_path, result_bytes)
+                    processed_count += 1
+
+                if total is None or page * page_size >= total:
+                    break
+                page += 1
+
+        return {"processed_files": processed_count}
+
+    # Single-file processing
+    data = await read_file(path)
+    result = await processor.process(data, path, config)
+
+    target_path = save_to
+    if overwrite and not target_path:
+        target_path = path
+
+    if target_path and getattr(processor, "produces_file", False):
+        result_bytes = coerce_result_bytes(result)
+        await write_file(target_path, result_bytes)
+        return {"saved_to": target_path}
+
+    return result
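The extension filter in `process_file` normalizes the processor's `supported_exts` (lowercase, leading dots stripped) and then matches a relative path's final suffix; an empty declaration means "process everything", and extensionless files match only an explicit `''` entry. A standalone sketch of that logic (the `make_matcher` wrapper is hypothetical):

```python
def make_matcher(supported_exts):
    # Normalize declared extensions: lowercase, leading dot stripped.
    allowed = {str(e).lower().lstrip('.') for e in supported_exts if isinstance(e, str)}

    def matches(rel_path: str) -> bool:
        if not allowed:
            return True           # no declared extensions -> process everything
        if '.' not in rel_path:
            return '' in allowed  # extensionless files need an explicit '' entry
        ext = rel_path.rsplit('.', 1)[-1].lower()
        return ext in allowed or f'.{ext}' in allowed

    return matches


m = make_matcher(['.JPG', 'png'])
print(m('a/b/photo.jpg'), m('notes.txt'), m('README'))  # True False False
```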
uv.lock (generated, 81 lines changed)
@@ -415,6 +415,7 @@ dependencies = [
     { name = "python-multipart" },
     { name = "pytz" },
     { name = "pyyaml" },
+    { name = "qdrant-client" },
     { name = "rawpy" },
     { name = "rich" },
     { name = "rich-toolkit" },
@@ -505,6 +506,7 @@ requires-dist = [
     { name = "python-multipart", specifier = "==0.0.20" },
     { name = "pytz", specifier = "==2025.2" },
     { name = "pyyaml", specifier = "==6.0.2" },
+    { name = "qdrant-client", specifier = "==1.15.1" },
     { name = "rawpy", specifier = "==0.25.1" },
     { name = "rich", specifier = "==14.1.0" },
     { name = "rich-toolkit", specifier = "==0.15.0" },
@@ -604,6 +606,28 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload-time = "2025-04-24T03:35:24.344Z" },
 ]

+[[package]]
+name = "h2"
+version = "4.3.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "hpack" },
+    { name = "hyperframe" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/1d/17/afa56379f94ad0fe8defd37d6eb3f89a25404ffc71d4d848893d270325fc/h2-4.3.0.tar.gz", hash = "sha256:6c59efe4323fa18b47a632221a1888bd7fde6249819beda254aeca909f221bf1", size = 2152026, upload-time = "2025-08-23T18:12:19.778Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/69/b2/119f6e6dcbd96f9069ce9a2665e0146588dc9f88f29549711853645e736a/h2-4.3.0-py3-none-any.whl", hash = "sha256:c438f029a25f7945c69e0ccf0fb951dc3f73a5f6412981daee861431b70e2bdd", size = 61779, upload-time = "2025-08-23T18:12:17.779Z" },
+]
+
+[[package]]
+name = "hpack"
+version = "4.1.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/2c/48/71de9ed269fdae9c8057e5a4c0aa7402e8bb16f2c6e90b3aa53327b113f8/hpack-4.1.0.tar.gz", hash = "sha256:ec5eca154f7056aa06f196a557655c5b009b382873ac8d1e66e79e87535f1dca", size = 51276, upload-time = "2025-01-22T21:44:58.347Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/07/c6/80c95b1b2b94682a72cbdbfb85b81ae2daffa4291fbfa1b1464502ede10d/hpack-4.1.0-py3-none-any.whl", hash = "sha256:157ac792668d995c657d93111f46b4535ed114f0c9c8d672271bbec7eae1b496", size = 34357, upload-time = "2025-01-22T21:44:56.92Z" },
+]
+
 [[package]]
 name = "httpcore"
 version = "1.0.9"
@@ -647,6 +671,20 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" },
 ]

+[package.optional-dependencies]
+http2 = [
+    { name = "h2" },
+]
+
+[[package]]
+name = "hyperframe"
+version = "6.1.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/02/e7/94f8232d4a74cc99514c13a9f995811485a6903d48e5d952771ef6322e30/hyperframe-6.1.0.tar.gz", hash = "sha256:f630908a00854a7adeabd6382b43923a4c4cd4b821fcb527e6ab9e15382a3b08", size = 26566, upload-time = "2025-01-22T21:41:49.302Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/48/30/47d0bf6072f7252e6521f3447ccfa40b421b6824517f82854703d0f5a98b/hyperframe-6.1.0-py3-none-any.whl", hash = "sha256:b03380493a519fce58ea5af42e4a42317bf9bd425596f7a0835ffce80f1a42e5", size = 13007, upload-time = "2025-01-22T21:41:47.295Z" },
+]
+
 [[package]]
 name = "idna"
 version = "3.10"
@@ -950,6 +988,18 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/89/c7/5572fa4a3f45740eaab6ae86fcdf7195b55beac1371ac8c619d880cfe948/pillow-11.3.0-cp314-cp314t-win_arm64.whl", hash = "sha256:79ea0d14d3ebad43ec77ad5272e6ff9bba5b679ef73375ea760261207fa8e0aa", size = 2512835, upload-time = "2025-07-01T09:15:50.399Z" },
 ]

+[[package]]
+name = "portalocker"
+version = "3.2.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "pywin32", marker = "sys_platform == 'win32'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/5e/77/65b857a69ed876e1951e88aaba60f5ce6120c33703f7cb61a3c894b8c1b6/portalocker-3.2.0.tar.gz", hash = "sha256:1f3002956a54a8c3730586c5c77bf18fae4149e07eaf1c29fc3faf4d5a3f89ac", size = 95644, upload-time = "2025-06-14T13:20:40.03Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/4b/a6/38c8e2f318bf67d338f4d629e93b0b4b9af331f455f0390ea8ce4a099b26/portalocker-3.2.0-py3-none-any.whl", hash = "sha256:3cdc5f565312224bc570c49337bd21428bba0ef363bbcf58b9ef4a9f11779968", size = 22424, upload-time = "2025-06-14T13:20:38.083Z" },
+]
+
 [[package]]
 name = "propcache"
 version = "0.3.2"
@@ -1161,6 +1211,19 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/81/c4/34e93fe5f5429d7570ec1fa436f1986fb1f00c3e0f43a589fe2bbcd22c3f/pytz-2025.2-py2.py3-none-any.whl", hash = "sha256:5ddf76296dd8c44c26eb8f4b6f35488f3ccbf6fbbd7adee0b7262d43f0ec2f00", size = 509225, upload-time = "2025-03-25T02:24:58.468Z" },
 ]

+[[package]]
+name = "pywin32"
+version = "311"
+source = { registry = "https://pypi.org/simple" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/a5/be/3fd5de0979fcb3994bfee0d65ed8ca9506a8a1260651b86174f6a86f52b3/pywin32-311-cp313-cp313-win32.whl", hash = "sha256:f95ba5a847cba10dd8c4d8fefa9f2a6cf283b8b88ed6178fa8a6c1ab16054d0d", size = 8705700, upload-time = "2025-07-14T20:13:26.471Z" },
+    { url = "https://files.pythonhosted.org/packages/e3/28/e0a1909523c6890208295a29e05c2adb2126364e289826c0a8bc7297bd5c/pywin32-311-cp313-cp313-win_amd64.whl", hash = "sha256:718a38f7e5b058e76aee1c56ddd06908116d35147e133427e59a3983f703a20d", size = 9494700, upload-time = "2025-07-14T20:13:28.243Z" },
+    { url = "https://files.pythonhosted.org/packages/04/bf/90339ac0f55726dce7d794e6d79a18a91265bdf3aa70b6b9ca52f35e022a/pywin32-311-cp313-cp313-win_arm64.whl", hash = "sha256:7b4075d959648406202d92a2310cb990fea19b535c7f4a78d3f5e10b926eeb8a", size = 8709318, upload-time = "2025-07-14T20:13:30.348Z" },
+    { url = "https://files.pythonhosted.org/packages/c9/31/097f2e132c4f16d99a22bfb777e0fd88bd8e1c634304e102f313af69ace5/pywin32-311-cp314-cp314-win32.whl", hash = "sha256:b7a2c10b93f8986666d0c803ee19b5990885872a7de910fc460f9b0c2fbf92ee", size = 8840714, upload-time = "2025-07-14T20:13:32.449Z" },
+    { url = "https://files.pythonhosted.org/packages/90/4b/07c77d8ba0e01349358082713400435347df8426208171ce297da32c313d/pywin32-311-cp314-cp314-win_amd64.whl", hash = "sha256:3aca44c046bd2ed8c90de9cb8427f581c479e594e99b5c0bb19b29c10fd6cb87", size = 9656800, upload-time = "2025-07-14T20:13:34.312Z" },
+    { url = "https://files.pythonhosted.org/packages/c0/d2/21af5c535501a7233e734b8af901574572da66fcc254cb35d0609c9080dd/pywin32-311-cp314-cp314-win_arm64.whl", hash = "sha256:a508e2d9025764a8270f93111a970e1d0fbfc33f4153b388bb649b7eec4f9b42", size = 8932540, upload-time = "2025-07-14T20:13:36.379Z" },
+]
+
 [[package]]
 name = "pyyaml"
 version = "6.0.2"
@@ -1178,6 +1241,24 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446, upload-time = "2024-08-06T20:33:04.33Z" },
 ]

+[[package]]
+name = "qdrant-client"
+version = "1.15.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "grpcio" },
+    { name = "httpx", extra = ["http2"] },
+    { name = "numpy" },
+    { name = "portalocker" },
+    { name = "protobuf" },
+    { name = "pydantic" },
+    { name = "urllib3" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/79/8b/76c7d325e11d97cb8eb5e261c3759e9ed6664735afbf32fdded5b580690c/qdrant_client-1.15.1.tar.gz", hash = "sha256:631f1f3caebfad0fd0c1fba98f41be81d9962b7bf3ca653bed3b727c0e0cbe0e", size = 295297, upload-time = "2025-07-31T19:35:19.627Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/ef/33/d8df6a2b214ffbe4138db9a1efe3248f67dc3c671f82308bea1582ecbbb7/qdrant_client-1.15.1-py3-none-any.whl", hash = "sha256:2b975099b378382f6ca1cfb43f0d59e541be6e16a5892f282a4b8de7eff5cb63", size = 337331, upload-time = "2025-07-31T19:35:17.539Z" },
+]
+
 [[package]]
 name = "rawpy"
 version = "0.25.1"
@@ -5,12 +5,10 @@ import { status as getStatus } from './api/config.ts';
 import type { SystemStatus } from './api/config.ts';
 import { SystemContext } from './contexts/SystemContext.tsx';
 import { ThemeProvider } from './contexts/ThemeContext.tsx';
-import { Spin, ConfigProvider } from 'antd';
+import { Spin } from 'antd';
 import { Routes, Route, Navigate } from 'react-router';
 import SetupPage from './pages/SetupPage.tsx';
-import { I18nProvider, useI18n } from './i18n';
-import zhCN from 'antd/locale/zh_CN';
-import enUS from 'antd/locale/en_US';
+import { I18nProvider } from './i18n';

 function AppInner() {
   const [status, setStatus] = useState<SystemStatus | null>(null);
@@ -39,26 +37,21 @@ function AppInner() {
     );
   }

-  const { lang } = useI18n();
-  const locale = lang === 'zh' ? zhCN : enUS;
-
   return (
-    <ConfigProvider locale={locale}>
-      <SystemContext.Provider value={status}>
-        <AuthProvider>
-          <ThemeProvider>
-            {!status.is_initialized ? (
-              <Routes>
-                <Route path="/setup" element={<SetupPage />} />
-                <Route path="*" element={<Navigate to="/setup" replace />} />
-              </Routes>
-            ) : (
-              <AppRouter />
-            )}
-          </ThemeProvider>
-        </AuthProvider>
-      </SystemContext.Provider>
-    </ConfigProvider>
+    <SystemContext.Provider value={status}>
+      <AuthProvider>
+        <ThemeProvider>
+          {!status.is_initialized ? (
+            <Routes>
+              <Route path="/setup" element={<SetupPage />} />
+              <Route path="*" element={<Navigate to="/setup" replace />} />
+            </Routes>
+          ) : (
+            <AppRouter />
+          )}
+        </ThemeProvider>
+      </AuthProvider>
+    </SystemContext.Provider>
   );
 }
@@ -15,7 +15,8 @@ export interface ProcessorTypeMeta {
   name: string;
   supported_exts: string[];
   config_schema: ProcessorTypeField[];
-  produces_file:boolean;
+  produces_file: boolean;
+  module_path?: string | null;
 }

 export const processorsApi = {
@@ -29,11 +30,21 @@ export const processorsApi = {
     save_to?: string;
     overwrite?: boolean;
   }) =>
-    request<any>('/processors/process', {
+    request<{ task_id: string }>('/processors/process', {
       method: 'POST',
-      body: JSON.stringify(params),
-      headers: {
-        'Content-Type': 'application/json'
-      }
+      json: params,
     }),
+  getSource: (type: string) =>
+    request<{ source: string; module_path: string }>('/processors/source/' + encodeURIComponent(type), {
+      method: 'GET',
+    }),
+  updateSource: (type: string, source: string) =>
+    request<boolean>('/processors/source/' + encodeURIComponent(type), {
+      method: 'PUT',
+      json: { source },
+    }),
+  reload: () =>
+    request<boolean>('/processors/reload', {
+      method: 'POST',
+    }),
 };
@@ -23,10 +23,15 @@ export interface ShareCreatePayload {
   password?: string;
 }

+export interface ClearExpiredResult {
+  deleted_count: number;
+}
+
 export const shareApi = {
   create: (payload: ShareCreatePayload) => request<ShareInfoWithPassword>('/shares', { method: 'POST', json: payload }),
   list: () => request<ShareInfo[]>('/shares'),
   remove: (shareId: number) => request<void>(`/shares/${shareId}`, { method: 'DELETE' }),
+  clearExpired: () => request<ClearExpiredResult>(`/shares/expired`, { method: 'DELETE' }),
   get: (token: string) => request<ShareInfo>(`/s/${token}`),
   verifyPassword: (token: string, password: string) => request<void>(`/s/${token}/verify`, { method: 'POST', json: { password } }),
   listDir: (token: string, path: string = '/', password?: string) => {
@@ -40,4 +45,4 @@ export const shareApi = {
     const url = `${API_BASE_URL}/s/${token}/download?path=${encodeURIComponent(path)}`;
     return password ? `${url}&password=${encodeURIComponent(password)}` : url;
   },
 };
@@ -1,5 +1,65 @@
import client from './client';

export interface VectorDBIndexInfo {
  index_name: string;
  index_type?: string;
  metric_type?: string;
  indexed_rows: number;
  pending_index_rows: number;
  state?: string;
}

export interface VectorDBCollectionStats {
  name: string;
  row_count: number;
  dimension: number | null;
  estimated_memory_bytes: number;
  is_vector_collection: boolean;
  indexes: VectorDBIndexInfo[];
}

export interface VectorDBStats {
  collections: VectorDBCollectionStats[];
  collection_count: number;
  total_vectors: number;
  estimated_total_memory_bytes: number;
  db_file_size_bytes: number | null;
}

export interface VectorDBProviderField {
  key: string;
  label: string;
  type: 'text' | 'password';
  required?: boolean;
  default?: string;
  placeholder?: string;
}

export interface VectorDBProviderMeta {
  type: string;
  label: string;
  description?: string;
  enabled: boolean;
  config_schema: VectorDBProviderField[];
}

export interface VectorDBCurrentConfig {
  type: string;
  config: Record<string, string>;
  label?: string;
  enabled?: boolean;
}

export interface UpdateVectorDBConfigResponse {
  config: VectorDBCurrentConfig;
  stats: VectorDBStats;
}

export const vectorDBApi = {
  getProviders: () => client<VectorDBProviderMeta[]>('/vector-db/providers', { method: 'GET' }),
  getConfig: () => client<VectorDBCurrentConfig>('/vector-db/config', { method: 'GET' }),
  getStats: () => client<VectorDBStats>('/vector-db/stats', { method: 'GET' }),
  updateConfig: (payload: { type: string; config: Record<string, string> }) =>
    client<UpdateVectorDBConfigResponse>('/vector-db/config', { method: 'POST', json: payload }),
  clearAll: () => client('/vector-db/clear-all', { method: 'POST' }),
};
web/src/components/PathSelectorModal.tsx (new file, 143 lines)
@@ -0,0 +1,143 @@
import { memo, useEffect, useMemo, useState } from 'react';
import { Modal, Button, List, Typography, Space, Input, message } from 'antd';
import { FolderOutlined, ArrowUpOutlined } from '@ant-design/icons';
import { useI18n } from '../i18n';
import { vfsApi, type VfsEntry } from '../api/client';
import { getFileIcon } from '../pages/FileExplorerPage/components/FileIcons';

export type PathSelectorMode = 'directory' | 'file' | 'any';

interface PathSelectorModalProps {
  open: boolean;
  mode?: PathSelectorMode;
  initialPath?: string;
  onOk: (path: string) => void;
  onCancel: () => void;
}

function normalizePath(p: string): string {
  if (!p) return '/';
  // The global flag matters: without it only the first run of slashes
  // would be collapsed.
  const s = ('/' + p).replace(/\/+/g, '/');
  return s.replace(/\\/g, '/').replace(/\/+$/, '') || '/';
}

function joinPath(dir: string, name: string): string {
  const base = normalizePath(dir);
  if (base === '/') return `/${name}`;
  return `${base}/${name}`.replace(/\/+/g, '/');
}
const PathSelectorModal = memo(function PathSelectorModal({ open, mode = 'directory', initialPath = '/', onOk, onCancel }: PathSelectorModalProps) {
  const { t } = useI18n();
  const [path, setPath] = useState<string>(normalizePath(initialPath));
  const [entries, setEntries] = useState<VfsEntry[]>([]);
  const [loading, setLoading] = useState(false);
  const [selected, setSelected] = useState<string | null>(null); // selected file name within current folder

  const title = useMemo(() => {
    if (mode === 'file') return t('Select File');
    if (mode === 'any') return t('Select Path');
    return t('Select Folder');
  }, [mode, t]);

  const load = async (p: string) => {
    setLoading(true);
    try {
      const listing = await vfsApi.list(p, 1, 500, 'name', 'asc');
      setEntries(listing.entries);
      setPath(listing.path || p);
      setSelected(null);
    } catch (e: any) {
      message.error(e.message || t('Load failed'));
    } finally {
      setLoading(false);
    }
  };

  useEffect(() => {
    if (open) {
      load(normalizePath(initialPath));
    }
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, [open, initialPath]);

  const canOk = useMemo(() => {
    if (mode === 'file') return !!selected;
    return true;
  }, [mode, selected]);

  const handleOk = () => {
    if (mode === 'directory') {
      onOk(normalizePath(path));
      return;
    }
    if (mode === 'file') {
      if (!selected) {
        message.warning(t('Please select a file'));
        return;
      }
      onOk(joinPath(path, selected));
      return;
    }
    // any
    if (selected) onOk(joinPath(path, selected));
    else onOk(normalizePath(path));
  };

  const goUp = () => {
    const cur = normalizePath(path);
    if (cur === '/') return;
    const parent = cur.replace(/\/+$/, '').split('/').slice(0, -1).join('/') || '/';
    load(parent);
  };

  return (
    <Modal
      title={title}
      open={open}
      onCancel={onCancel}
      onOk={handleOk}
      okButtonProps={{ disabled: !canOk }}
      width={720}
    >
      <Space style={{ width: '100%', marginBottom: 12 }} align="center">
        <Typography.Text type="secondary">{t('Current')}</Typography.Text>
        <Input value={path} readOnly />
        <Button onClick={goUp} icon={<ArrowUpOutlined />} disabled={path === '/'}>{t('Up')}</Button>
        {mode !== 'file' && (
          <Button type="primary" onClick={() => onOk(normalizePath(path))}>{t('Select Current Folder')}</Button>
        )}
      </Space>

      <List
        bordered
        loading={loading}
        dataSource={entries}
        style={{ maxHeight: 420, overflow: 'auto' }}
        renderItem={(item) => {
          const isSelected = selected === item.name && !item.is_dir;
          return (
            <List.Item
              onClick={() => {
                if (item.is_dir) {
                  load(joinPath(path, item.name));
                } else {
                  setSelected((prev) => (prev === item.name ? null : item.name));
                }
              }}
              style={{ cursor: 'pointer', background: isSelected ? 'rgba(22,119,255,0.08)' : undefined }}
            >
              <Space>
                {item.is_dir ? <FolderOutlined /> : getFileIcon(item.name)}
                <Typography.Text strong={item.is_dir}>{item.name}</Typography.Text>
              </Space>
            </List.Item>
          );
        }}
      />
    </Modal>
  );
});

export default PathSelectorModal;
@@ -62,6 +62,7 @@ const ProfileModal = memo(function ProfileModal({ open, onClose }: ProfileModalP
      confirmLoading={loading}
      okText={t('Save')}
      cancelText={t('Cancel')}
      forceRender
    >
      <Form form={form} layout="vertical">
        <Form.Item name="username" label={t('Username')}>
@@ -1,10 +1,12 @@
import React, { createContext, useContext, useEffect, useMemo, useRef, useState } from 'react';
import { ConfigProvider, theme as antdTheme } from 'antd';
import zhCN from 'antd/locale/zh_CN';
import enUS from 'antd/locale/en_US';
import type { ThemeConfig } from 'antd/es/config-provider/context';
import { getAllConfig } from '../api/config';
import { useAuth } from './AuthContext';
import baseTheme from '../theme';
import { useI18n } from '../i18n';

type ThemeMode = 'light' | 'dark' | 'system';

@@ -101,6 +103,7 @@ function buildThemeConfig(state: ThemeState, systemDark: boolean): ThemeConfig {

export function ThemeProvider({ children }: { children: React.ReactNode }) {
  const { isAuthenticated } = useAuth();
  const { lang } = useI18n();
  const systemDark = useSystemDarkPreferred();
  const [state, setState] = useState<ThemeState>({ mode: 'light' });
  const styleTagRef = useRef<HTMLStyleElement | null>(null);

@@ -163,6 +166,7 @@ export function ThemeProvider({ children }: { children: React.ReactNode }) {

  const themeConfig = useMemo(() => buildThemeConfig(state, systemDark), [state, systemDark]);
  const resolvedMode: ThemeMode = useMemo(() => (state.mode === 'system' ? (systemDark ? 'dark' : 'light') : state.mode), [state.mode, systemDark]);
  const locale = useMemo(() => (lang === 'zh' ? zhCN : enUS), [lang]);

  const ctxValue = useMemo<ThemeContextType>(() => ({
    refreshTheme,

@@ -173,7 +177,7 @@

  return (
    <Ctx.Provider value={ctxValue}>
-      <ConfigProvider theme={{ ...themeConfig, cssVar: true }} locale={zhCN}>
+      <ConfigProvider theme={{ ...themeConfig, cssVar: true }} locale={locale}>
        {children}
      </ConfigProvider>
    </Ctx.Provider>
@@ -40,3 +40,29 @@ body { font-family: system-ui,-apple-system,BlinkMacSystemFont,'Segoe UI',Roboto
.fx-grid-item .thumb .badge { position:absolute; top:6px; left:6px; background: var(--ant-color-primary, #111); color:#fff; font-size:10px; padding:2px 4px; border-radius:6px; line-height:1; letter-spacing:.5px; }
.fx-grid-item .name { font-weight:600; font-size:13px; }
.ellipsis { overflow:hidden; white-space:nowrap; text-overflow:ellipsis; }

.processors-tabs {
  flex: 1;
  display: flex;
  flex-direction: column;
  min-height: 0;
  padding: 5px;
}
.processors-tabs .ant-tabs-content-holder,
.processors-tabs .ant-tabs-content {
  flex: 1;
  height: 100%;
  min-height: 0;
  display: flex;
  flex-direction: column;
}
.processors-tabs .ant-tabs-tabpane {
  flex: 1;
  height: 100%;
  min-height: 0;
  display: none;
  flex-direction: column;
}
.processors-tabs .ant-tabs-tabpane-active {
  display: flex;
}
@@ -53,6 +53,9 @@ export const en = {
  'Cancel failed': 'Cancel failed',
  'Load failed': 'Load failed',
  'Are you sure to cancel share?': 'Are you sure to cancel share?',
  'Clear expired shares': 'Clear expired shares',
  'Confirm clear expired shares?': 'Confirm clear expired shares?',
  'Cleared {count} expired shares': 'Cleared {count} expired shares',

  'Share Name': 'Share Name',
  'Share Content': 'Share Content',

@@ -97,6 +100,8 @@
  'Open': 'Open',
  'Open With': 'Open With',
  'Default': 'Default',
  'Processor': 'Processor',
  'Share': 'Share',
  'Rename': 'Rename',
  'Delete': 'Delete',
  'Details': 'Details',

@@ -197,9 +202,38 @@
  'AI Settings': 'AI Settings',
  'Vision Model': 'Vision Model',
  'Embedding Model': 'Embedding Model',
  'Embedding Dimension': 'Embedding Dimension',
  'Vector Database': 'Vector Database',
  'Vector Database Settings': 'Vector Database Settings',
  'Current Statistics': 'Current Statistics',
  'Collections': 'Collections',
  'Vectors': 'Vectors',
  'Database Size': 'Database Size',
  'Estimated Memory': 'Estimated Memory',
  'No collections': 'No collections',
  'Dimension': 'Dimension',
  'Non-vector collection': 'Non-vector collection',
  'Estimated memory': 'Estimated memory',
  'Indexes': 'Indexes',
  'Unnamed index': 'Unnamed index',
  'Indexed rows': 'Indexed rows',
  'Pending rows': 'Pending rows',
  'Estimated memory is calculated as vectors x dimension x 4 bytes (float32).': 'Estimated memory is calculated as vectors x dimension x 4 bytes (float32).',
  'Database Provider': 'Database Provider',
  'Please select a provider': 'Please select a provider',
  'Coming soon': 'Coming soon',
  'This provider is not available yet': 'This provider is not available yet',
  'Database file path': 'Database file path',
  'Server URI': 'Server URI',
  'Token': 'Token',
  'Server URL': 'Server URL',
  'API Key': 'API Key',
  'Embedded Milvus Lite (local file storage).': 'Embedded Milvus Lite (local file storage).',
  'Remote Milvus instance accessed via URI.': 'Remote Milvus instance accessed via URI.',
  'Qdrant vector database (HTTP API).': 'Qdrant vector database (HTTP API).',
  'Database Type': 'Database Type',
  'Confirm embedding dimension change': 'Confirm embedding dimension change',
  'Changing the embedding dimension will clear the vector database automatically. You will need to rebuild indexes afterwards. Continue?': 'Changing the embedding dimension will clear the vector database automatically. You will need to rebuild indexes afterwards. Continue?',
  'Confirm clear vector database?': 'Confirm clear vector database?',
  'This will delete all collections irreversibly.': 'This will delete all collections irreversibly.',
  'Confirm Clear': 'Confirm Clear',

@@ -308,6 +342,45 @@
  // Processor flow
  'Processing finished': 'Processing finished',
  'Processing failed': 'Processing failed',
  'Processors': 'Processors',
  'Processor List': 'Processor List',
  'Reload': 'Reload',
  'Run Processor': 'Run Processor',
  'Target Path': 'Target Path',
  'Please select a path': 'Please select a path',
  'Select Directory': 'Select Directory',
  'Overwrite original': 'Overwrite original',
  'Save To': 'Save To',
  'Optional output path': 'Optional output path',
  'Run': 'Run',
  'Select a processor': 'Select a processor',
  'No module path': 'No module path',
  'Source saved': 'Source saved',
  'Processors reloaded': 'Processors reloaded',
  'Unsaved changes': 'Unsaved changes',
  'Switching processor will discard unsaved changes. Continue?': 'Switching processor will discard unsaved changes. Continue?',
  'Task submitted': 'Task submitted',
  'Supported Extensions': 'Supported Extensions',
  'All': 'All',
  'Produces File': 'Produces File',
  'Yes': 'Yes',
  'No': 'No',
  'Please select a processor': 'Please select a processor',
  'Select a path': 'Select a path',
  'Source Editor': 'Source Editor',
  'Module Path': 'Module Path',
  'Directory processing always overwrites original files': 'Directory processing always overwrites original files',
  'No data': 'No data',

  // Path selector
  'Select File': 'Select File',
  'Select Path': 'Select Path',
  'Select Folder': 'Select Folder',
  'Select': 'Select',
  'Current': 'Current',
  'Up': 'Up',
  'Select Current Folder': 'Select Current Folder',
  'Please select a file': 'Please select a file',

  // Plugins page
  'Installed successfully': 'Installed successfully',
@@ -8,7 +8,7 @@ export const zh = {
  'All Files': '全部文件',
  'Manage': '管理',
  'System': '系统',
-  'Automation': '自动化',
+  'Automation': '自动任务',
  'My Shares': '我的分享',
  'Offline Downloads': '离线下载',
  'Adapters': '存储挂载',

@@ -25,7 +25,7 @@
  'Account Settings': '账户设置',
  'Language': '语言',
  'Chinese': '中文',
-  'English': '英文',
+  'English': 'English',
  'Full Name': '昵称',
  'Email': '邮箱',
  'Change Password': '修改密码',

@@ -57,6 +57,9 @@
  'Cancel failed': '取消失败',
  'Load failed': '加载失败',
  'Are you sure to cancel share?': '确认取消分享?',
  'Clear expired shares': '清空过期分享',
  'Confirm clear expired shares?': '确认清空过期分享?',
  'Cleared {count} expired shares': '已清理 {count} 个过期分享',
  'Share Name': '分享名称',
  'Share Content': '分享内容',
  'Created At': '创建时间',

@@ -98,6 +101,8 @@
  'Open': '打开',
  'Open With': '打开方式',
  'Default': '默认',
  'Processor': '处理器',
  'Share': '分享',
  'Rename': '重命名',
  'Delete': '删除',
  'Details': '详情',

@@ -199,9 +204,38 @@
  'AI Settings': 'AI设置',
  'Vision Model': '视觉模型',
  'Embedding Model': '嵌入模型',
  'Embedding Dimension': '向量维度',
  'Vector Database': '向量数据库',
  'Vector Database Settings': '向量数据库设置',
  'Current Statistics': '当前统计',
  'Collections': '集合',
  'Vectors': '向量',
  'Database Size': '数据库大小',
  'Estimated Memory': '估算内存',
  'No collections': '暂无集合',
  'Dimension': '维度',
  'Non-vector collection': '非向量集合',
  'Estimated memory': '估算内存',
  'Indexes': '索引',
  'Unnamed index': '未命名索引',
  'Indexed rows': '已索引行数',
  'Pending rows': '待索引行数',
  'Estimated memory is calculated as vectors x dimension x 4 bytes (float32).': '估算内存 = 向量数量 x 维度 x 4 字节(float32)。',
  'Database Provider': '数据库提供者',
  'Please select a provider': '请选择提供者',
  'Coming soon': '敬请期待',
  'This provider is not available yet': '该提供者暂不可用',
  'Database file path': '数据库文件路径',
  'Server URI': '服务器 URI',
  'Token': '令牌',
  'Server URL': '服务器地址',
  'API Key': 'API Key',
  'Embedded Milvus Lite (local file storage).': '嵌入式 Milvus Lite,本地文件存储。',
  'Remote Milvus instance accessed via URI.': '通过 URI 访问的远程 Milvus 实例。',
  'Qdrant vector database (HTTP API).': 'Qdrant 向量数据库(HTTP API)。',
  'Database Type': '数据库类型',
  'Confirm embedding dimension change': '确认修改向量维度',
  'Changing the embedding dimension will clear the vector database automatically. You will need to rebuild indexes afterwards. Continue?': '修改向量维度会自动清空向量数据库,之后需要重建索引,是否继续?',
  'Confirm clear vector database?': '确认清空向量数据库?',
  'This will delete all collections irreversibly.': '此操作将删除所有集合中的所有数据,且不可逆。',
  'Confirm Clear': '确认清空',

@@ -310,6 +344,45 @@
  // Processor flow
  'Processing finished': '处理完成',
  'Processing failed': '处理失败',
  'Processors': '处理器',
  'Processor List': '处理器列表',
  'Reload': '重载',
  'Run Processor': '运行处理器',
  'Target Path': '目标路径',
  'Please select a path': '请选择路径',
  'Select Directory': '选择目录',
  'Overwrite original': '覆盖原文件',
  'Save To': '保存到',
  'Optional output path': '可选输出路径',
  'Run': '运行',
  'Select a processor': '选择处理器',
  'No module path': '未检测到模块路径',
  'Source saved': '源码已保存',
  'Processors reloaded': '处理器已重载',
  'Unsaved changes': '存在未保存的修改',
  'Switching processor will discard unsaved changes. Continue?': '切换处理器会丢失未保存的修改,确认继续?',
  'Task submitted': '任务已提交',
  'Supported Extensions': '支持的扩展名',
  'All': '全部',
  'Produces File': '生成文件',
  'Yes': '是',
  'No': '否',
  'Please select a processor': '请选择处理器',
  'Select a path': '请选择路径',
  'Source Editor': '源码编辑',
  'Module Path': '模块路径',
  'Directory processing always overwrites original files': '选择目录时会强制覆盖原文件',
  'No data': '暂无数据',

  // Path selector
  'Select File': '选择文件',
  'Select Path': '选择路径',
  'Select Folder': '选择目录',
  'Select': '选择',
  'Current': '当前',
  'Up': '上一级',
  'Select Current Folder': '选择当前目录',
  'Please select a file': '请选择一个文件',

  // Plugins page
  'Installed successfully': '安装成功',
@@ -9,6 +9,7 @@ import {
  BugOutlined,
  DatabaseOutlined,
  AppstoreOutlined,
  CodeOutlined,
} from '@ant-design/icons';
import type { ReactNode } from 'react';

@@ -27,6 +28,7 @@ export const navGroups: NavGroup[] = [
    key: 'manage',
    title: 'Manage',
    children: [
      { key: 'processors', icon: React.createElement(CodeOutlined), label: 'Processors' },
      { key: 'tasks', icon: React.createElement(RobotOutlined), label: 'Automation' },
      { key: 'share', icon: React.createElement(ShareAltOutlined), label: 'My Shares' },
      { key: 'offline', icon: React.createElement(CloudDownloadOutlined), label: 'Offline Downloads' },
@@ -223,7 +223,7 @@ const AdaptersPage = memo(function AdaptersPage() {
        width={480}
        open={open}
        onClose={() => { setOpen(false); setEditing(null); }}
-        destroyOnClose
+        destroyOnHidden
        extra={
          <Space>
            <Button onClick={() => { setOpen(false); setEditing(null); }}>{t('Cancel')}</Button>
@@ -49,6 +49,18 @@ export const GridView: React.FC<Props> = ({ entries, thumbs, selectedEntries, lo
    const toHex = (v: number) => v.toString(16).padStart(2, '0');
    return `#${toHex(r)}${toHex(g)}${toHex(b)}`;
  };
  const toRgba = (hex: string, alpha: number) => {
    const s = hex.replace('#', '');
    const normalized = s.length === 3 ? s.split('').map(c => c + c).join('') : s;
    const num = parseInt(normalized, 16);
    if (Number.isNaN(num) || normalized.length !== 6) {
      return `rgba(22, 119, 255, ${alpha})`;
    }
    const r = (num >> 16) & 255;
    const g = (num >> 8) & 255;
    const b = num & 255;
    return `rgba(${r}, ${g}, ${b}, ${alpha})`;
  };
  const containerRef = useRef<HTMLDivElement | null>(null);
  const itemRefs = useRef<Record<string, HTMLDivElement | null>>({});
  const startRef = useRef<{ x: number, y: number } | null>(null);

@@ -168,7 +180,7 @@ export const GridView: React.FC<Props> = ({ entries, thumbs, selectedEntries, lo
            width: rect.width,
            height: rect.height,
            border: '1px dashed var(--ant-color-border, rgba(0,0,0,0.4))',
-            background: 'var(--ant-color-primary-bg, rgba(0, 120, 212, 0.08))',
+            background: toRgba(String(token.colorPrimary || '#1677ff'), 0.16),
            zIndex: 999
          }}
        />
@@ -29,7 +29,7 @@ export const CreateDirModal: React.FC<CreateDirModalProps> = ({ open, onOk, onCa
      onOk={handleOk}
      onCancel={onCancel}
      okButtonProps={{ disabled: !name.trim() }}
-      destroyOnClose
+      destroyOnHidden
    >
      <Input
        placeholder={t('Folder Name')}

@@ -58,7 +58,7 @@ export const ProcessorModal: React.FC<ProcessorModalProps> = (props) => {
      onCancel={onCancel}
      onOk={onOk}
      confirmLoading={loading}
-      destroyOnClose
+      destroyOnHidden
    >
      <Form form={form} layout="vertical" onValuesChange={handleFormValuesChange}>
        <Form.Item name="processor_type" label={t('Processor')} required>

@@ -32,7 +32,7 @@ export const RenameModal: React.FC<RenameModalProps> = ({ entry, onOk, onCancel
      onOk={handleOk}
      onCancel={onCancel}
      okButtonProps={{ disabled: !name.trim() || name.trim() === entry?.name }}
-      destroyOnClose
+      destroyOnHidden
    >
      <Input
        placeholder={t('New Name')}
@@ -49,8 +49,8 @@ export function useProcessor({ path, processorTypes, refresh }: ProcessorParams)
        overwrite: overwrite ? true : undefined,
      };

-      await processorsApi.process(params);
-      message.success(t('Processing finished'));
+      const resp = await processorsApi.process(params);
+      message.success(`${t('Task submitted')}: ${resp.task_id}`);
      setModal({ entry: null, visible: false });
      if (overwrite || savingPath) refresh();
    } catch (e: any) {
web/src/pages/ProcessorsPage.tsx (new file, 501 lines)
@@ -0,0 +1,501 @@
import { memo, useCallback, useEffect, useMemo, useState } from 'react';
import {
  Button,
  Card,
  Empty,
  Flex,
  Form,
  Input,
  message,
  Modal,
  Space,
  Spin,
  Switch,
  Tabs,
  Tag,
  Typography,
  theme,
} from 'antd';
import Editor from '@monaco-editor/react';
import { ProcessorConfigForm } from '../components/ProcessorConfigForm';
import PathSelectorModal, { type PathSelectorMode } from '../components/PathSelectorModal';
import { processorsApi, type ProcessorTypeMeta } from '../api/processors';
import { useI18n } from '../i18n';

const { Text } = Typography;

type TabKey = 'editor' | 'runner';

const ProcessorsPage = memo(function ProcessorsPage() {
  const { t } = useI18n();
  const { token } = theme.useToken();
  const [messageApi, contextHolder] = message.useMessage();
  const [processors, setProcessors] = useState<ProcessorTypeMeta[]>([]);
  const [loadingList, setLoadingList] = useState(false);
  const [selectedType, setSelectedType] = useState<string>('');
  const [source, setSource] = useState('');
  const [initialSource, setInitialSource] = useState('');
  const [modulePath, setModulePath] = useState('');
  const [sourceLoading, setSourceLoading] = useState(false);
  const [savingSource, setSavingSource] = useState(false);
  const [reloading, setReloading] = useState(false);
  const [form] = Form.useForm();
  const [running, setRunning] = useState(false);
  const [isDirectory, setIsDirectory] = useState(false);
  const [pathModalOpen, setPathModalOpen] = useState(false);
  const [pathModalMode, setPathModalMode] = useState<PathSelectorMode>('file');
  const [pathModalField, setPathModalField] = useState<'path' | 'save_to'>('path');
  const [activeTab, setActiveTab] = useState<TabKey>('editor');

  const isDirty = source !== initialSource;

  const selectedProcessorMeta = useMemo(
    () => processors.find(p => p.type === selectedType),
    [processors, selectedType]
  );

  const loadList = useCallback(async () => {
    setLoadingList(true);
    try {
      const list = await processorsApi.list();
      setProcessors(list);
    } catch (err: any) {
      messageApi.error(err?.message || t('Load failed'));
    } finally {
      setLoadingList(false);
    }
  }, [messageApi, t]);
  useEffect(() => {
    loadList();
  }, [loadList]);

  useEffect(() => {
    if (!processors.length) {
      setSelectedType('');
      return;
    }
    if (!selectedType) {
      setSelectedType(processors[0].type);
    } else if (!processors.some(p => p.type === selectedType)) {
      setSelectedType(processors[0].type);
    }
  }, [processors, selectedType]);

  useEffect(() => {
    if (!selectedType) {
      setSource('');
      setInitialSource('');
      setModulePath('');
      return;
    }
    const controller = new AbortController();
    setSource('');
    setInitialSource('');
    setModulePath('');
    setSourceLoading(true);
    processorsApi.getSource(selectedType)
      .then(resp => {
        if (controller.signal.aborted) return;
        setSource(resp.source ?? '');
        setInitialSource(resp.source ?? '');
        setModulePath(resp.module_path ?? '');
      })
      .catch((err: any) => {
        if (controller.signal.aborted) return;
        messageApi.error(err?.message || t('Load failed'));
        setSource('');
        setInitialSource('');
        setModulePath('');
      })
      .finally(() => {
        if (!controller.signal.aborted) {
          setSourceLoading(false);
        }
      });
    return () => controller.abort();
  }, [messageApi, selectedType, t]);
  useEffect(() => {
    if (!selectedProcessorMeta) {
      form.resetFields();
      setIsDirectory(false);
      return;
    }
    form.resetFields();
    const defaults: Record<string, any> = {};
    selectedProcessorMeta.config_schema?.forEach(field => {
      if (field.default !== undefined) {
        defaults[field.key] = field.default;
      }
    });
    form.setFieldsValue({
      path: '',
      overwrite: !!selectedProcessorMeta.produces_file,
      save_to: undefined,
      config: defaults,
    });
    setIsDirectory(false);
  }, [selectedProcessorMeta, form]);

  const overwriteValue = Form.useWatch('overwrite', form) ?? false;

  useEffect(() => {
    if (overwriteValue) {
      form.setFieldsValue({ save_to: undefined });
    }
  }, [overwriteValue, form]);

  useEffect(() => {
    if (isDirectory) {
      form.setFieldsValue({ overwrite: true, save_to: undefined });
    }
  }, [isDirectory, form]);

  const handleSelectProcessor = useCallback((type: string) => {
    if (type === selectedType) return;
    if (isDirty) {
      Modal.confirm({
        title: t('Unsaved changes'),
        content: t('Switching processor will discard unsaved changes. Continue?'),
        okText: t('Confirm'),
        cancelText: t('Cancel'),
        onOk: () => {
          setSelectedType(type);
          setActiveTab('editor');
        },
      });
    } else {
      setSelectedType(type);
      setActiveTab('editor');
    }
  }, [isDirty, selectedType, t]);

  const handleSaveSource = useCallback(async () => {
    if (!selectedType) return;
    try {
      setSavingSource(true);
      await processorsApi.updateSource(selectedType, source);
      setInitialSource(source);
      messageApi.success(t('Source saved'));
    } catch (err: any) {
      messageApi.error(err?.message || t('Operation failed'));
    } finally {
      setSavingSource(false);
    }
  }, [messageApi, selectedType, source, t]);

  const handleReloadProcessors = useCallback(async () => {
    try {
      setReloading(true);
      await processorsApi.reload();
      messageApi.success(t('Processors reloaded'));
      await loadList();
    } catch (err: any) {
      messageApi.error(err?.message || t('Operation failed'));
    } finally {
      setReloading(false);
    }
  }, [loadList, messageApi, t]);

  const openPathSelector = useCallback((field: 'path' | 'save_to', mode: PathSelectorMode) => {
    setPathModalField(field);
    setPathModalMode(mode);
    setPathModalOpen(true);
  }, []);

  const handlePathSelected = useCallback((selectedPath: string) => {
    if (pathModalField === 'path') {
      form.setFieldsValue({ path: selectedPath });
      setIsDirectory(pathModalMode === 'directory');
    } else {
      form.setFieldsValue({ save_to: selectedPath });
    }
    setPathModalOpen(false);
  }, [form, pathModalField, pathModalMode]);
  const handleRun = useCallback(async () => {
    if (!selectedType) {
      messageApi.warning(t('Please select a processor'));
      return;
    }
    try {
      const values = await form.validateFields();
      const schema = selectedProcessorMeta?.config_schema || [];
      const finalConfig: Record<string, any> = {};
      schema.forEach(field => {
        const value = values.config?.[field.key];
        if (value === undefined) {
          finalConfig[field.key] = field.default;
        } else {
          finalConfig[field.key] = value;
        }
      });
      setRunning(true);
      const payload: any = {
        path: values.path,
        processor_type: selectedType,
        config: finalConfig,
        overwrite: !!values.overwrite,
      };
      if (values.save_to && !values.overwrite) {
        payload.save_to = values.save_to;
      }
      const resp = await processorsApi.process(payload);
      messageApi.success(`${t('Task submitted')}: ${resp.task_id}`);
    } catch (err: any) {
      if (err?.errorFields) {
        return;
      }
      messageApi.error(err?.message || t('Operation failed'));
    } finally {
      setRunning(false);
    }
  }, [form, messageApi, selectedProcessorMeta, selectedType, t]);

  const selectedConfigPath = pathModalField === 'path'
    ? (selectedType ? form.getFieldValue('path') : undefined) || '/'
    : (selectedType ? form.getFieldValue('save_to') : undefined) || '/';
  const renderProcessorList = () => {
    if (loadingList) {
      return (
        <Flex align="center" justify="center" style={{ height: '100%' }}>
          <Spin />
        </Flex>
      );
    }
    if (!processors.length) {
      return (
        <Flex align="center" justify="center" style={{ height: '100%' }}>
          <Empty description={t('No data')} />
        </Flex>
      );
    }
    return (
      <div style={{ padding: 8, overflowY: 'auto', height: '100%' }}>
        {processors.map(item => {
          const selected = item.type === selectedType;
          const onClick = () => handleSelectProcessor(item.type);
          return (
            <div
              key={item.type}
              onClick={onClick}
              style={{
                border: `1px solid ${selected ? token.colorPrimary : token.colorBorderSecondary}`,
                background: selected ? token.colorPrimaryBg : token.colorBgContainer,
                borderRadius: 10,
                padding: 12,
                marginBottom: 8,
                cursor: 'pointer',
                transition: 'all 0.2s ease',
              }}
            >
              <Flex justify="space-between" align="center">
                <Space size={8} align="center">
                  <span
                    style={{
                      width: 8,
                      height: 8,
                      borderRadius: '50%',
                      background: selected ? token.colorPrimary : token.colorBorderSecondary,
                      display: 'inline-block',
                    }}
                  />
                  <Text strong>{item.name}</Text>
                </Space>
                <Tag color={selected ? token.colorPrimary : token.colorBorderSecondary}>{item.type}</Tag>
              </Flex>
              <Space direction="vertical" size={6} style={{ marginTop: 8 }}>
                <div>
                  <Text type="secondary" style={{ marginRight: 8 }}>{t('Supported Extensions')}:</Text>
                  {item.supported_exts?.length ? (
                    <Space wrap size={[4, 4]}>
                      {item.supported_exts.map(ext => (
                        <Tag key={ext}>{ext}</Tag>
                      ))}
                    </Space>
                  ) : (
                    <Tag>{t('All')}</Tag>
                  )}
                </div>
                <Text type="secondary">
                  {t('Produces File')}: {item.produces_file ? t('Yes') : t('No')}
                </Text>
              </Space>
            </div>
          );
        })}
      </div>
    );
  };
  const tabs = [
    {
      key: 'editor',
      label: t('Source Editor'),
      children: selectedType ? (
        <div style={{ height: '100%', display: 'flex', flexDirection: 'column' }}>
          <div style={{ padding: '8px 12px', borderBottom: `1px solid ${token.colorBorderSecondary}` }}>
            {modulePath ? (
              <Space size={8}>
                <Text type="secondary">{t('Module Path')}:</Text>
                <Text code>{modulePath}</Text>
              </Space>
            ) : (
              <Text type="secondary">{t('No module path')}</Text>
            )}
          </div>
          <div style={{ flex: 1, minHeight: 0 }}>
            {sourceLoading ? (
              <Flex align="center" justify="center" style={{ height: '100%' }}>
                <Spin />
              </Flex>
            ) : (
              <Editor
                language="python"
                value={source}
                onChange={(val) => setSource(val ?? '')}
                height="100%"
                options={{
                  automaticLayout: true,
                  minimap: { enabled: false },
                  fontSize: 13,
                  scrollBeyondLastLine: false,
                }}
              />
            )}
          </div>
        </div>
      ) : (
        <Empty style={{ marginTop: 64 }} description={t('Select a processor')} />
      ),
    },
    {
      key: 'runner',
      label: t('Run Processor'),
      forceRender: true,
      children: (
        <Form form={form} layout="vertical" disabled={!selectedType} style={{ padding: '12px 0' }}>
          {selectedType ? (
            <>
              {isDirectory && (
                <Text type="secondary" style={{ display: 'block', marginBottom: 12 }}>
                  {t('Directory processing always overwrites original files')}
                </Text>
              )}
              <Form.Item
                label={t('Target Path')}
                required
              >
                <Flex gap={8} align="center">
                  <div style={{ flex: 1 }}>
                    <Form.Item
                      name="path"
                      rules={[{ required: true, message: t('Please select a path') }]}
                      noStyle
                    >
                      <Input placeholder={t('Select a path')} />
                    </Form.Item>
                  </div>
                  <Button onClick={() => openPathSelector('path', 'file')}>{t('Select File')}</Button>
                  <Button onClick={() => openPathSelector('path', 'directory')}>{t('Select Directory')}</Button>
                </Flex>
              </Form.Item>

              <Form.Item
                name="overwrite"
                label={t('Overwrite original')}
|
||||
valuePropName="checked"
|
||||
>
|
||||
<Switch disabled={isDirectory} />
|
||||
</Form.Item>
|
||||
|
||||
{selectedProcessorMeta?.produces_file && !overwriteValue && (
|
||||
<Form.Item label={t('Save To')}>
|
||||
<Flex gap={8} align="center">
|
||||
<div style={{ flex: 1 }}>
|
||||
<Form.Item name="save_to" noStyle>
|
||||
<Input placeholder={t('Optional output path')} />
|
||||
</Form.Item>
|
||||
</div>
|
||||
<Button onClick={() => openPathSelector('save_to', 'any')}>{t('Select')}</Button>
|
||||
</Flex>
|
||||
</Form.Item>
|
||||
)}
|
||||
|
||||
<ProcessorConfigForm
|
||||
processorMeta={selectedProcessorMeta}
|
||||
form={form}
|
||||
configPath={['config']}
|
||||
/>
|
||||
|
||||
<Form.Item>
|
||||
<Button type="primary" onClick={handleRun} loading={running} disabled={!selectedType}>
|
||||
{t('Run')}
|
||||
</Button>
|
||||
</Form.Item>
|
||||
</>
|
||||
) : (
|
||||
<Empty style={{ marginTop: 64 }} description={t('Select a processor')} />
|
||||
)}
|
||||
</Form>
|
||||
),
|
||||
},
|
||||
];
|
||||
|
||||
return (
|
||||
<>
|
||||
{contextHolder}
|
||||
<Flex gap={16} style={{ height: 'calc(100vh - 88px)' }}>
|
||||
<Card
|
||||
style={{ flex: '0 0 320px', minWidth: 280, display: 'flex', flexDirection: 'column' }}
|
||||
title={t('Processor List')}
|
||||
extra={
|
||||
<Space size={8}>
|
||||
<Button size="small" onClick={loadList} loading={loadingList}>{t('Refresh')}</Button>
|
||||
<Button size="small" onClick={handleReloadProcessors} loading={reloading}>{t('Reload')}</Button>
|
||||
</Space>
|
||||
}
|
||||
styles={{ body: { padding: 0, flex: 1, display: 'flex' } }}
|
||||
>
|
||||
{renderProcessorList()}
|
||||
</Card>
|
||||
|
||||
<Card
|
||||
style={{ flex: 1, minWidth: 0, display: 'flex', flexDirection: 'column' }}
|
||||
title={selectedProcessorMeta ? `${selectedProcessorMeta.name} (${selectedProcessorMeta.type})` : t('Select a processor')}
|
||||
extra={
|
||||
<Space size={8}>
|
||||
<Button size="small" onClick={handleSaveSource} loading={savingSource} disabled={!selectedType || !isDirty}>
|
||||
{t('Save')}
|
||||
</Button>
|
||||
<Button size="small" onClick={handleReloadProcessors} loading={reloading} disabled={!selectedType}>
|
||||
{t('Reload')}
|
||||
</Button>
|
||||
</Space>
|
||||
}
|
||||
styles={{ body: { padding: 0, flex: 1, display: 'flex', flexDirection: 'column' } }}
|
||||
>
|
||||
<Tabs
|
||||
activeKey={activeTab}
|
||||
onChange={key => setActiveTab(key as TabKey)}
|
||||
items={tabs as any}
|
||||
className="processors-tabs"
|
||||
tabBarGutter={32}
|
||||
/>
|
||||
</Card>
|
||||
</Flex>
|
||||
|
||||
<PathSelectorModal
|
||||
open={pathModalOpen}
|
||||
mode={pathModalMode}
|
||||
initialPath={selectedConfigPath}
|
||||
onOk={handlePathSelected}
|
||||
onCancel={() => setPathModalOpen(false)}
|
||||
/>
|
||||
</>
|
||||
);
|
||||
});
|
||||
|
||||
export default ProcessorsPage;
|
||||
@@ -44,6 +44,16 @@ const SharePage = memo(function SharePage() {
    }
  };

  const handleClearExpired = async () => {
    try {
      const res = await shareApi.clearExpired();
      message.success(t('Cleared {count} expired shares', { count: String(res.deleted_count) }));
      fetchList();
    } catch (e: any) {
      message.error(e.message || t('Clear failed'));
    }
  };

  const columns = [
    {
      title: t('Share Name'),
@@ -100,7 +110,14 @@ const SharePage = memo(function SharePage() {
  return (
    <PageCard
      title={t('My Shares')}
      extra={<Button onClick={fetchList} loading={loading}>{t('Refresh')}</Button>}
      extra={
        <Space>
          <Button onClick={fetchList} loading={loading}>{t('Refresh')}</Button>
          <Popconfirm title={t('Confirm clear expired shares?')} onConfirm={handleClearExpired}>
            <Button danger>{t('Clear expired shares')}</Button>
          </Popconfirm>
        </Space>
      }
    >
      <Table
        rowKey="id"
@@ -1,8 +1,8 @@
import { Form, Input, Button, message, Tabs, Space, Card, Select, Modal, Radio, InputNumber } from 'antd';
import { useEffect, useState } from 'react';
import { Form, Input, Button, message, Tabs, Space, Card, Select, Modal, Radio, InputNumber, Spin, Empty, Alert } from 'antd';
import { useEffect, useState, useCallback } from 'react';
import PageCard from '../../components/PageCard';
import { getAllConfig, setConfig } from '../../api/config';
import { vectorDBApi } from '../../api/vectorDB';
import { vectorDBApi, type VectorDBStats, type VectorDBProviderMeta, type VectorDBCurrentConfig } from '../../api/vectorDB';
import { AppstoreOutlined, RobotOutlined, DatabaseOutlined, SkinOutlined } from '@ant-design/icons';
import { useTheme } from '../../contexts/ThemeContext';
import '../../styles/settings-tabs.css';
@@ -21,13 +21,30 @@ const VISION_CONFIG_KEYS = [
  { key: 'AI_VISION_API_KEY', label: 'Vision API Key' },
];

const DEFAULT_EMBED_DIMENSION = 4096;
const EMBED_DIM_KEY = 'AI_EMBED_DIM';

const EMBED_CONFIG_KEYS = [
  { key: 'AI_EMBED_API_URL', label: 'Embedding API URL' },
  { key: 'AI_EMBED_MODEL', label: 'Embedding Model', default: 'Qwen/Qwen3-Embedding-8B' },
  { key: 'AI_EMBED_API_KEY', label: 'Embedding API Key' },
];

const ALL_AI_KEYS = [...VISION_CONFIG_KEYS, ...EMBED_CONFIG_KEYS];
const ALL_AI_KEYS = [...VISION_CONFIG_KEYS, ...EMBED_CONFIG_KEYS, { key: EMBED_DIM_KEY, default: DEFAULT_EMBED_DIMENSION }];

const formatBytes = (bytes?: number | null) => {
  if (bytes === null || bytes === undefined) return '-';
  if (bytes === 0) return '0 B';
  const units = ['B', 'KB', 'MB', 'GB', 'TB'];
  let value = bytes;
  let unitIndex = 0;
  while (value >= 1024 && unitIndex < units.length - 1) {
    value /= 1024;
    unitIndex += 1;
  }
  const precision = value >= 10 || unitIndex === 0 ? 0 : 1;
  return `${value.toFixed(precision)} ${units[unitIndex]}`;
};

// Theme related config keys
const THEME_KEYS = {
@@ -39,9 +56,19 @@ const THEME_KEYS = {
};

export default function SystemSettingsPage() {
  const [vectorConfigForm] = Form.useForm();
  const [loading, setLoading] = useState(false);
  const [config, setConfigState] = useState<Record<string, string> | null>(null);
  const [activeTab, setActiveTab] = useState('appearance');
  const [vectorStats, setVectorStats] = useState<VectorDBStats | null>(null);
  const [vectorStatsLoading, setVectorStatsLoading] = useState(false);
  const [vectorStatsError, setVectorStatsError] = useState<string | null>(null);
  const [vectorProviders, setVectorProviders] = useState<VectorDBProviderMeta[]>([]);
  const [vectorConfig, setVectorConfig] = useState<VectorDBCurrentConfig | null>(null);
  const [vectorConfigLoading, setVectorConfigLoading] = useState(false);
  const [vectorConfigSaving, setVectorConfigSaving] = useState(false);
  const [vectorMetaError, setVectorMetaError] = useState<string | null>(null);
  const [selectedProviderType, setSelectedProviderType] = useState<string | null>(null);
  const { refreshTheme, previewTheme } = useTheme();
  const { t } = useI18n();

@@ -49,6 +76,72 @@ export default function SystemSettingsPage() {
    getAllConfig().then((data) => setConfigState(data as Record<string, string>));
  }, []);

  const fetchVectorStats = useCallback(async () => {
    setVectorStatsLoading(true);
    setVectorStatsError(null);
    try {
      const data = await vectorDBApi.getStats();
      setVectorStats(data);
    } catch (e: any) {
      const msg = e?.message || t('Load failed');
      setVectorStatsError(msg);
      message.error(msg);
    } finally {
      setVectorStatsLoading(false);
    }
  }, [t]);

  const buildProviderConfigValues = useCallback((provider: VectorDBProviderMeta | undefined, existing?: Record<string, string>) => {
    if (!provider) return {};
    const values: Record<string, string> = {};
    const schema = provider.config_schema || [];
    schema.forEach((field) => {
      const current = existing && existing[field.key] !== undefined && existing[field.key] !== null
        ? String(existing[field.key])
        : undefined;
      if (current !== undefined) {
        values[field.key] = current;
      } else if (field.default !== undefined && field.default !== null) {
        values[field.key] = String(field.default);
      } else {
        values[field.key] = '';
      }
    });
    return values;
  }, []);

  const fetchVectorMeta = useCallback(async () => {
    setVectorConfigLoading(true);
    setVectorMetaError(null);
    try {
      const [providers, current] = await Promise.all([
        vectorDBApi.getProviders(),
        vectorDBApi.getConfig(),
      ]);
      setVectorProviders(providers);
      setVectorConfig(current);

      const enabled = providers.filter((item) => item.enabled);
      let nextType: string | null = current?.type ?? null;
      if (nextType && !providers.some((item) => item.type === nextType)) {
        nextType = null;
      }
      if (!nextType) {
        nextType = enabled[0]?.type ?? providers[0]?.type ?? null;
      }
      setSelectedProviderType(nextType);
      const provider = providers.find((item) => item.type === nextType);
      const configValues = buildProviderConfigValues(provider, nextType === current?.type ? current?.config : undefined);
      vectorConfigForm.setFieldsValue({ type: nextType || undefined, config: configValues });
    } catch (e: any) {
      const msg = e?.message || t('Load failed');
      setVectorMetaError(msg);
      message.error(msg);
    } finally {
      setVectorConfigLoading(false);
    }
  }, [buildProviderConfigValues, message, t, vectorConfigForm]);

  const handleSave = async (values: any) => {
    setLoading(true);
    try {
@@ -67,6 +160,40 @@ export default function SystemSettingsPage() {
    setLoading(false);
  };

  const handleProviderChange = useCallback((value: string) => {
    setSelectedProviderType(value);
    const provider = vectorProviders.find((item) => item.type === value);
    const existing = value === vectorConfig?.type ? vectorConfig?.config : undefined;
    const configValues = buildProviderConfigValues(provider, existing);
    vectorConfigForm.setFieldsValue({ type: value, config: configValues });
  }, [vectorProviders, vectorConfig, buildProviderConfigValues, vectorConfigForm]);

  const handleVectorConfigSave = useCallback(async (values: { type: string; config?: Record<string, string> }) => {
    if (!values?.type) {
      return;
    }
    setVectorConfigSaving(true);
    try {
      const configPayload = Object.fromEntries(
        Object.entries(values.config || {}).filter(([, val]) => val !== undefined && val !== null && String(val).trim() !== '')
          .map(([key, val]) => [key, String(val)])
      );
      const response = await vectorDBApi.updateConfig({ type: values.type, config: configPayload });
      setVectorConfig(response.config);
      setVectorStats(response.stats);
      setVectorStatsError(null);
      setSelectedProviderType(response.config.type);
      const provider = vectorProviders.find((item) => item.type === response.config.type);
      const mergedValues = buildProviderConfigValues(provider, response.config.config);
      vectorConfigForm.setFieldsValue({ type: response.config.type, config: mergedValues });
      message.success(t('Saved successfully'));
    } catch (e: any) {
      message.error(e?.message || t('Save failed'));
    } finally {
      setVectorConfigSaving(false);
    }
  }, [buildProviderConfigValues, message, t, vectorConfigForm, vectorProviders]);

  // When leaving the Appearance tab, restore the persisted backend config (discard any unsaved preview)
  useEffect(() => {
    if (activeTab !== 'appearance') {
@@ -74,6 +201,27 @@ export default function SystemSettingsPage() {
    }
  }, [activeTab]);

  useEffect(() => {
    if (activeTab === 'vector-db') {
      if (!vectorProviders.length && !vectorConfigLoading) {
        fetchVectorMeta();
      }
      if (!vectorStats && !vectorStatsLoading) {
        fetchVectorStats();
      }
    }
  }, [
    activeTab,
    fetchVectorMeta,
    fetchVectorStats,
    vectorProviders.length,
    vectorConfigLoading,
    vectorStats,
    vectorStatsLoading,
  ]);

  const selectedProvider = vectorProviders.find((item) => item.type === selectedProviderType || (!selectedProviderType && item.enabled));

  if (!config) {
    return <PageCard title={t('System Settings')}><div>{t('Loading...')}</div></PageCard>;
  }
@@ -213,9 +361,27 @@ export default function SystemSettingsPage() {
          <Form
            layout="vertical"
            initialValues={{
              ...Object.fromEntries(ALL_AI_KEYS.map(({ key, default: def }) => [key, config[key] ?? def ?? ''])),
              ...Object.fromEntries(ALL_AI_KEYS.map(({ key, default: def }) => [key, key === EMBED_DIM_KEY
                ? Number(config[key] ?? def ?? DEFAULT_EMBED_DIMENSION)
                : config[key] ?? def ?? ''])),
            }}
            onFinish={async (vals) => {
              const currentDim = Number(config[EMBED_DIM_KEY] ?? DEFAULT_EMBED_DIMENSION);
              const nextDim = Number(vals[EMBED_DIM_KEY] ?? DEFAULT_EMBED_DIMENSION);
              if (currentDim !== nextDim) {
                Modal.confirm({
                  title: t('Confirm embedding dimension change'),
                  content: t('Changing the embedding dimension will clear the vector database automatically. You will need to rebuild indexes afterwards. Continue?'),
                  okText: t('Confirm'),
                  cancelText: t('Cancel'),
                  onOk: async () => {
                    await handleSave(vals);
                  },
                });
                return;
              }
              await handleSave(vals);
            }}
            onFinish={handleSave}
            style={{ marginTop: 24 }}
            key={JSON.stringify(config)}
          >
@@ -232,6 +398,9 @@ export default function SystemSettingsPage() {
                <Input size="large" />
              </Form.Item>
            ))}
            <Form.Item name={EMBED_DIM_KEY} label={t('Embedding Dimension')}>
              <InputNumber min={1} max={32768} style={{ width: '100%' }} />
            </Form.Item>
          </Card>
          <Form.Item style={{ marginTop: 24 }}>
            <Button type="primary" htmlType="submit" loading={loading} block>
@@ -251,41 +420,187 @@ export default function SystemSettingsPage() {
      ),
      children: (
        <Card title={t('Vector Database Settings')} style={{ marginTop: 24 }}>
          <Form layout="vertical">
            <Form.Item label={t('Database Type')}>
              <Select
                size="large"
                value={'Milvus Lite'}
                disabled
                options={[{ value: 'Milvus Lite', label: 'Milvus Lite' }]}
              />
            </Form.Item>
            <Form.Item>
              <Button
                danger
                block
                onClick={() => {
                  Modal.confirm({
                    title: t('Confirm clear vector database?'),
                    content: t('This will delete all collections irreversibly.'),
                    okText: t('Confirm Clear'),
                    okType: 'danger',
                    cancelText: t('Cancel'),
                    onOk: async () => {
                      try {
                        await vectorDBApi.clearAll();
                        message.success(t('Vector database cleared'));
                      } catch (e: any) {
                        message.error(e.message || t('Clear failed'));
                      }
                    },
                  });
                }}
          <Space direction="vertical" size={24} style={{ width: '100%' }}>
            <Space direction="vertical" size={16} style={{ width: '100%' }}>
              <div style={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center', flexWrap: 'wrap', gap: 12 }}>
                <strong>{t('Current Statistics')}</strong>
                <Button onClick={() => { fetchVectorMeta(); fetchVectorStats(); }} loading={vectorStatsLoading || vectorConfigLoading} disabled={(vectorStatsLoading || vectorConfigLoading) && !vectorStats}>
                  {t('Refresh')}
                </Button>
              </div>
              {vectorMetaError ? (
                <Alert type="error" showIcon message={vectorMetaError} />
              ) : null}
              {vectorStatsLoading && !vectorStats ? (
                <Spin />
              ) : vectorStats ? (
                <Space direction="vertical" size={16} style={{ width: '100%' }}>
                  <div style={{ display: 'flex', flexWrap: 'wrap', gap: 24 }}>
                    <div>
                      <div style={{ color: '#888' }}>{t('Collections')}</div>
                      <div style={{ fontSize: 20, fontWeight: 600 }}>{vectorStats.collection_count}</div>
                    </div>
                    <div>
                      <div style={{ color: '#888' }}>{t('Vectors')}</div>
                      <div style={{ fontSize: 20, fontWeight: 600 }}>{vectorStats.total_vectors}</div>
                    </div>
                    <div>
                      <div style={{ color: '#888' }}>{t('Database Size')}</div>
                      <div style={{ fontSize: 20, fontWeight: 600 }}>{formatBytes(vectorStats.db_file_size_bytes)}</div>
                    </div>
                    <div>
                      <div style={{ color: '#888' }}>{t('Estimated Memory')}</div>
                      <div style={{ fontSize: 20, fontWeight: 600 }}>{formatBytes(vectorStats.estimated_total_memory_bytes)}</div>
                    </div>
                  </div>
                  {vectorStats.collections.length ? (
                    <Space direction="vertical" style={{ width: '100%' }} size={16}>
                      {vectorStats.collections.map((collection) => (
                        <div key={collection.name} style={{ border: '1px solid #f0f0f0', borderRadius: 8, padding: 16 }}>
                          <Space direction="vertical" size={12} style={{ width: '100%' }}>
                            <div style={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center', flexWrap: 'wrap', gap: 12 }}>
                              <strong>{collection.name}</strong>
                              <span style={{ color: '#888' }}>
                                {collection.is_vector_collection && collection.dimension
                                  ? `${t('Dimension')}: ${collection.dimension}`
                                  : t('Non-vector collection')}
                              </span>
                            </div>
                            <div>{t('Vectors')}: {collection.row_count}</div>
                            {collection.is_vector_collection ? (
                              <div>{t('Estimated memory')}: {formatBytes(collection.estimated_memory_bytes)}</div>
                            ) : null}
                            {collection.indexes.length ? (
                              <Space direction="vertical" size={4} style={{ width: '100%' }}>
                                <span>{t('Indexes')}:</span>
                                <ul style={{ paddingLeft: 20, margin: 0 }}>
                                  {collection.indexes.map((index) => (
                                    <li key={`${collection.name}-${index.index_name || 'default'}`}>
                                      <span>{index.index_name || t('Unnamed index')}</span>
                                      <span>{' · '}{index.index_type || '-'}</span>
                                      <span>{' · '}{index.metric_type || '-'}</span>
                                      <span>{' · '}{t('Indexed rows')}: {index.indexed_rows}</span>
                                      <span>{' · '}{t('Pending rows')}: {index.pending_index_rows}</span>
                                      <span>{' · '}{t('Status')}: {index.state || '-'}</span>
                                    </li>
                                  ))}
                                </ul>
                              </Space>
                            ) : null}
                          </Space>
                        </div>
                      ))}
                    </Space>
                  ) : (
                    <Empty description={t('No collections')} />
                  )}
                  <div style={{ color: '#888' }}>
                    {t('Estimated memory is calculated as vectors x dimension x 4 bytes (float32).')}
                  </div>
                </Space>
              ) : vectorStatsError ? (
                <div style={{ color: '#ff4d4f' }}>{vectorStatsError}</div>
              ) : (
                <Empty description={t('No collections')} />
              )}
            </Space>
            {vectorConfigLoading && !vectorProviders.length ? (
              <Spin />
            ) : (
              <Form
                layout="vertical"
                form={vectorConfigForm}
                onFinish={handleVectorConfigSave}
                initialValues={{ type: selectedProviderType || undefined, config: {} }}
              >
                {t('Clear Vector DB')}
              </Button>
            </Form.Item>
          </Form>
                <Form.Item
                  name="type"
                  label={t('Database Provider')}
                  rules={[{ required: true, message: t('Please select a provider') }]}
                >
                  <Select
                    size="large"
                    options={vectorProviders.map((provider) => ({
                      value: provider.type,
                      label: provider.enabled ? provider.label : `${provider.label} (${t('Coming soon')})`,
                      disabled: !provider.enabled,
                    }))}
                    onChange={handleProviderChange}
                    loading={vectorConfigLoading && !vectorProviders.length}
                  />
                </Form.Item>
                {selectedProvider?.description ? (
                  <Alert
                    type="info"
                    showIcon
                    message={t(selectedProvider.description)}
                    style={{ marginBottom: 16 }}
                  />
                ) : null}
                {selectedProvider?.config_schema?.map((field) => (
                  <Form.Item
                    key={field.key}
                    name={['config', field.key]}
                    label={t(field.label)}
                    rules={field.required ? [{ required: true, message: t('Please input {label}', { label: t(field.label) }) }] : []}
                  >
                    {field.type === 'password' ? (
                      <Input.Password size="large" placeholder={field.placeholder ? t(field.placeholder) : undefined} />
                    ) : (
                      <Input size="large" placeholder={field.placeholder ? t(field.placeholder) : undefined} />
                    )}
                  </Form.Item>
                ))}
                {selectedProvider && !selectedProvider.enabled ? (
                  <Alert
                    type="warning"
                    showIcon
                    message={t('This provider is not available yet')}
                    style={{ marginBottom: 16 }}
                  />
                ) : null}
                <Form.Item>
                  <Space direction="vertical" style={{ width: '100%' }}>
                    <Button
                      type="primary"
                      htmlType="submit"
                      loading={vectorConfigSaving}
                      block
                      disabled={!selectedProvider?.enabled}
                    >
                      {t('Save')}
                    </Button>
                    <Button
                      danger
                      htmlType="button"
                      block
                      onClick={() => {
                        Modal.confirm({
                          title: t('Confirm clear vector database?'),
                          content: t('This will delete all collections irreversibly.'),
                          okText: t('Confirm Clear'),
                          okType: 'danger',
                          cancelText: t('Cancel'),
                          onOk: async () => {
                            try {
                              await vectorDBApi.clearAll();
                              message.success(t('Vector database cleared'));
                              await fetchVectorStats();
                              await fetchVectorMeta();
                            } catch (e: any) {
                              message.error(e.message || t('Clear failed'));
                            }
                          },
                        });
                      }}
                    >
                      {t('Clear Vector DB')}
                    </Button>
                  </Space>
                </Form.Item>
              </Form>
            )}
          </Space>
        </Card>
      ),
    },
@@ -5,6 +5,7 @@ import { tasksApi, type AutomationTask, type QueuedTask } from '../api/tasks';
import { processorsApi, type ProcessorTypeMeta } from '../api/processors';
import { ProcessorConfigForm } from '../components/ProcessorConfigForm';
import { useI18n } from '../i18n';
import PathSelectorModal from '../components/PathSelectorModal';

const TasksPage = memo(function TasksPage() {
  const [loading, setLoading] = useState(false);
@@ -17,6 +18,7 @@ const TasksPage = memo(function TasksPage() {
  const [queuedTasks, setQueuedTasks] = useState<QueuedTask[]>([]);
  const [queueLoading, setQueueLoading] = useState(false);
  const { t } = useI18n();
  const [pathPickerOpen, setPathPickerOpen] = useState(false);

  const fetchList = useCallback(async () => {
    setLoading(true);
@@ -151,6 +153,7 @@ const TasksPage = memo(function TasksPage() {

  const selectedProcessor = Form.useWatch('processor_type', form);
  const currentProcessorMeta = availableProcessors.find(p => p.type === selectedProcessor);
  const watchedPathPattern = Form.useWatch('path_pattern', form);


  return (
@@ -177,7 +180,7 @@ const TasksPage = memo(function TasksPage() {
        width={480}
        open={open}
        onClose={() => { setOpen(false); setEditing(null); }}
        destroyOnClose
        destroyOnHidden
        extra={
          <Space>
            <Button onClick={() => { setOpen(false); setEditing(null); }}>{t('Cancel')}</Button>
@@ -197,7 +200,10 @@ const TasksPage = memo(function TasksPage() {
          </Form.Item>
          <Typography.Title level={5} style={{ marginTop: 8, fontSize: 14 }}>{t('Matching Rules')}</Typography.Title>
          <Form.Item name="path_pattern" label={t('Path Prefix (optional)')}>
            <Input placeholder="/images/screenshots" />
            <Input
              placeholder="/images/screenshots"
              addonAfter={<Button size="small" onClick={() => setPathPickerOpen(true)}>{t('Select')}</Button>}
            />
          </Form.Item>
          <Form.Item name="filename_regex" label={t('Filename Regex (optional)')}>
            <Input placeholder=".*\.png$" />
@@ -219,6 +225,13 @@ const TasksPage = memo(function TasksPage() {
          />
        </Form>
      </Drawer>
      <PathSelectorModal
        open={pathPickerOpen}
        mode="directory"
        initialPath={watchedPathPattern || '/'}
        onCancel={() => setPathPickerOpen(false)}
        onOk={(p) => { form.setFieldsValue({ path_pattern: p }); setPathPickerOpen(false); }}
      />
      <Modal
        title={t('Current Task Queue')}
        open={queueModalOpen}

@@ -7,6 +7,7 @@ import FileExplorerPage from '../pages/FileExplorerPage/FileExplorerPage.tsx';
import AdaptersPage from '../pages/AdaptersPage.tsx';
import SharePage from '../pages/SharePage.tsx';
import TasksPage from '../pages/TasksPage.tsx';
import ProcessorsPage from '../pages/ProcessorsPage.tsx';
import OfflineDownloadPage from '../pages/OfflineDownloadPage.tsx';
import SystemSettingsPage from '../pages/SystemSettingsPage/SystemSettingsPage.tsx';
import LogsPage from '../pages/LogsPage.tsx';
@@ -37,6 +38,7 @@ const ShellBody = memo(function ShellBody() {
      {navKey === 'files' && <FileExplorerPage />}
      {navKey === 'share' && <SharePage />}
      {navKey === 'tasks' && <TasksPage />}
      {navKey === 'processors' && <ProcessorsPage />}
      {navKey === 'offline' && <OfflineDownloadPage />}
      {navKey === 'plugins' && <PluginsPage />}
      {navKey === 'settings' && <SystemSettingsPage />}