Mirror of https://github.com/DrizzleTime/Foxel.git (synced 2026-05-10 06:22:44 +08:00)

Compare commits (21 commits)
Commits: 1679b03d3a, ab6562fc79, 87770176b6, e7cf8dbdb8, e7eafdee97, 051b49d3f6, b059b0eb44, 59ad2cb622, 6b2ada0b42, a727e77341, 4638356a45, e51344b43e, b7685db0e8, 4e16de973c, 4dd0a4b1d6, 5703825c31, 24255744df, 31d97b2968, 35abd080be, 2fa93a1eeb, ff7eb13187
README.md (44 changed lines)

@@ -8,9 +8,10 @@

**A highly extensible private cloud storage solution for individuals and teams, featuring AI-powered semantic search.**






[![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/DrizzleTime/Foxel)

---

@@ -39,36 +40,37 @@

Using Docker Compose is the recommended way to start Foxel.

1. **Create Data Directories**

   Create a `data` folder for persistent data:

   ```bash
   mkdir -p data/db
   mkdir -p data/mount
   chmod 777 data/db data/mount
   ```

2. **Download Docker Compose File**

   ```bash
   curl -L -O https://github.com/DrizzleTime/Foxel/raw/main/compose.yaml
   ```

   After downloading, it is **strongly recommended** to modify the environment variables in the `compose.yaml` file to ensure security:

   - Modify `SECRET_KEY` and `TEMP_LINK_SECRET_KEY`: replace the default keys with randomly generated strong keys.

3. **Start the Services**

   ```bash
   docker-compose up -d
   ```

4. **Access the Application**

   Once the services are running, open the page in your browser.

   > On the first launch, please follow the setup guide to initialize the administrator account.

## 🤝 How to Contribute
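The key advice above can be done in one step; a minimal sketch, assuming `openssl` is available on the host (variable names mirror the `compose.yaml` settings):

```shell
# Generate two independent 256-bit keys as hex strings.
SECRET_KEY=$(openssl rand -hex 32)
TEMP_LINK_SECRET_KEY=$(openssl rand -hex 32)
echo "SECRET_KEY=$SECRET_KEY"
echo "TEMP_LINK_SECRET_KEY=$TEMP_LINK_SECRET_KEY"
```

Paste the two values into the `environment:` section of `compose.yaml` before the first `docker-compose up`.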
README_zh.md (45 changed lines)

@@ -8,15 +8,15 @@

**A highly extensible private cloud storage solution for individuals and teams, with AI semantic search support.**






---
<blockquote>
<em><strong>The ocean of data is boundless, let the eye of insight guide the voyage, yet its intricate connections lie deep, not fully discernible from the surface.</strong></em>
</blockquote>
<img src="https://foxel.cc/image/ad-min-zh.png" alt="UI Screenshot">
</div>

@@ -40,36 +40,37 @@

Using Docker Compose is the recommended way to start Foxel.

1. **Create Data Directories**

   Create a `data` folder for persistent data:

   ```bash
   mkdir -p data/db
   mkdir -p data/mount
   chmod 777 data/db data/mount
   ```

2. **Download Docker Compose File**

   ```bash
   curl -L -O https://github.com/DrizzleTime/Foxel/raw/main/compose.yaml
   ```

   After downloading, it is **strongly recommended** to modify the environment variables in the `compose.yaml` file to ensure security:

   - Modify `SECRET_KEY` and `TEMP_LINK_SECRET_KEY`: replace the default keys with randomly generated strong keys.

3. **Start the Services**

   ```bash
   docker-compose up -d
   ```

4. **Access the Application**

   Once the services are running, open the page in your browser.

   > On the first launch, please follow the setup guide to initialize the administrator account.

## 🤝 How to Contribute
@@ -11,16 +11,17 @@ from domain.processors import api as processors

```python
from domain.share import api as share
from domain.tasks import api as tasks
from domain.ai import api as ai
from domain.agent import api as agent
from domain.virtual_fs import api as virtual_fs
from domain.virtual_fs.mapping import s3_api, webdav_api
from domain.virtual_fs.search import search_api
from domain.audit import api as audit  # was: from domain.audit import router as audit


def include_routers(app: FastAPI):
    app.include_router(adapters.router)
    app.include_router(search_api.router)
    app.include_router(virtual_fs.router)
    app.include_router(auth.router)
    app.include_router(config.router)
    app.include_router(processors.router)
```

@@ -30,9 +31,10 @@ def include_routers(app: FastAPI):

```python
    app.include_router(backup.router)
    app.include_router(ai.router_vector_db)
    app.include_router(ai.router_ai)
    app.include_router(agent.router)
    app.include_router(plugins.router)
    app.include_router(webdav_api.router)
    app.include_router(s3_api.router)
    app.include_router(offline_downloads.router)
    app.include_router(email.router)
    app.include_router(audit.router)  # was: app.include_router(audit)
```
@@ -1,6 +1,6 @@

```python
from tortoise import Tortoise

from domain.adapters import runtime_registry  # was: from domain.adapters.registry import runtime_registry

TORTOISE_ORM = {
    "connections": {"default": "sqlite://data/db/db.sqlite3"},
```
domain/__init__.py (new file, 7 lines)

@@ -0,0 +1,7 @@

```python
"""
domain: the business-domain layer.

Convention: cross-package code imports public APIs only from each subpackage's `__init__.py`.
"""

__all__: list[str] = []
```
@@ -1 +1,24 @@

```python
from .providers import BaseAdapter
from .registry import (
    RuntimeRegistry,
    discover_adapters,
    get_config_schema,
    get_config_schemas,
    normalize_adapter_type,
    runtime_registry,
)
from .service import AdapterService
from .types import AdapterCreate, AdapterOut

__all__ = [
    "BaseAdapter",
    "RuntimeRegistry",
    "discover_adapters",
    "get_config_schema",
    "get_config_schemas",
    "normalize_adapter_type",
    "runtime_registry",
    "AdapterService",
    "AdapterCreate",
    "AdapterOut",
]
```
@@ -4,10 +4,9 @@ from fastapi import APIRouter, Depends, Request

```python
from api.response import success
from domain.audit import AuditAction, audit
from domain.auth import User, get_current_active_user  # was: separate imports from domain.auth.service and domain.auth.types
from .service import AdapterService  # was: from domain.adapters.service import AdapterService
from .types import AdapterCreate  # was: from domain.adapters.types import AdapterCreate

router = APIRouter(prefix="/api/adapters", tags=["adapters"])
```
@@ -1,11 +1,26 @@

```python
from typing import List, Dict, Tuple, AsyncIterator
import asyncio
import base64
import io
import os
import struct
from models import StorageAdapter
from telethon import TelegramClient
from telethon.crypto import AuthKey
from telethon.sessions import StringSession
from telethon.tl import types
import socks

_SESSION_LOCKS: Dict[str, asyncio.Lock] = {}


def _get_session_lock(session_string: str) -> asyncio.Lock:
    lock = _SESSION_LOCKS.get(session_string)
    if lock is None:
        lock = asyncio.Lock()
        _SESSION_LOCKS[session_string] = lock
    return lock


# Adapter type identifier
ADAPTER_TYPE = "telegram"
```

@@ -54,9 +69,93 @@ class TelegramAdapter:

```python
        if not all([self.api_id, self.api_hash, self.session_string, self.chat_id]):
            raise ValueError("The Telegram adapter requires api_id, api_hash, session_string and chat_id")

    @staticmethod
    def _parse_legacy_session_string(value: str) -> StringSession:
        """
        Compatibility with the legacy session_string format:
        - version (1-byte char) + base64(data)
        - data: dc_id (1B) + ip_len (2B) + ip (ASCII, ip_len bytes) + port (2B) + auth_key (256B)
        """
        s = (value or "").strip()
        if not s:
            raise ValueError("session_string is empty")

        body = s[1:] if s.startswith("1") else s
        raw = base64.urlsafe_b64decode(body)
        if len(raw) < 1 + 2 + 2 + 256:
            raise ValueError("legacy session data is too short")

        dc_id = raw[0]
        ip_len = struct.unpack(">H", raw[1:3])[0]
        expected_len = 1 + 2 + ip_len + 2 + 256
        if len(raw) != expected_len:
            raise ValueError("legacy session data length mismatch")

        ip_start = 3
        ip_end = ip_start + ip_len
        ip = raw[ip_start:ip_end].decode("utf-8")
        port = struct.unpack(">H", raw[ip_end : ip_end + 2])[0]
        key = raw[ip_end + 2 : ip_end + 2 + 256]

        sess = StringSession()
        sess.set_dc(dc_id, ip, port)
        sess.auth_key = AuthKey(key)
        return sess

    @staticmethod
    def _pick_photo_thumb(thumbs: list | None):
        if not thumbs:
            return None

        cached = []
        others = []
        for t in thumbs:
            if isinstance(t, (types.PhotoCachedSize, types.PhotoStrippedSize)):
                cached.append(t)
            elif isinstance(t, (types.PhotoSize, types.PhotoSizeProgressive)):
                if not isinstance(t, types.PhotoSizeEmpty):
                    others.append(t)

        if cached:
            cached.sort(key=lambda x: len(getattr(x, "bytes", b"") or b""))
            return cached[-1]

        if others:
            def _sz(x):
                if isinstance(x, types.PhotoSizeProgressive):
                    return max(x.sizes or [0])
                return int(getattr(x, "size", 0) or 0)

            others.sort(key=_sz)
            return others[-1]

        return None

    def _build_session(self) -> StringSession:
        s = (self.session_string or "").strip()
        if not s:
            raise ValueError("Telegram adapter session_string is empty")

        try:
            return StringSession(s)
        except Exception:
            pass

        # Some tools strip the version prefix; try restoring it once
        if not s.startswith("1"):
            try:
                return StringSession("1" + s)
            except Exception:
                pass

        try:
            return self._parse_legacy_session_string(s)
        except Exception as exc:
            raise ValueError("Invalid Telegram session_string; regenerate it with a Telethon StringSession") from exc

    def _get_client(self) -> TelegramClient:
        """Create a new TelegramClient instance."""
        # was: TelegramClient(StringSession(self.session_string), self.api_id, self.api_hash, proxy=self.proxy)
        return TelegramClient(self._build_session(), self.api_id, self.api_hash, proxy=self.proxy)

    def get_effective_root(self, sub_path: str | None) -> str:
        return ""
```
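The legacy layout that `_parse_legacy_session_string` documents (a 1-byte version char, then base64 of dc_id(1B) + ip_len(2B) + ip + port(2B) + auth_key(256B)) can be exercised with a standalone round trip; `pack_legacy_session`/`unpack_legacy_session` below are illustrative helpers, not Foxel code:

```python
import base64
import struct

def pack_legacy_session(dc_id: int, ip: str, port: int, auth_key: bytes) -> str:
    # Mirror of the documented layout: dc_id(1B) + ip_len(2B, big-endian) + ip(ASCII) + port(2B) + auth_key(256B).
    ip_bytes = ip.encode("ascii")
    raw = bytes([dc_id]) + struct.pack(">H", len(ip_bytes)) + ip_bytes + struct.pack(">H", port) + auth_key
    return "1" + base64.urlsafe_b64encode(raw).decode("ascii")

def unpack_legacy_session(value: str):
    # Inverse of pack_legacy_session, using the same offsets as the adapter's parser.
    body = value[1:] if value.startswith("1") else value
    raw = base64.urlsafe_b64decode(body)
    dc_id = raw[0]
    ip_len = struct.unpack(">H", raw[1:3])[0]
    ip = raw[3:3 + ip_len].decode("ascii")
    port = struct.unpack(">H", raw[3 + ip_len:5 + ip_len])[0]
    key = raw[5 + ip_len:5 + ip_len + 256]
    return dc_id, ip, port, key

session = pack_legacy_session(2, "149.154.167.51", 443, b"\x01" * 256)
```

The real parser feeds these fields into `StringSession.set_dc` and `AuthKey`; this sketch only validates the byte layout.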
@@ -198,6 +297,41 @@

```python
    async def mkdir(self, root: str, rel: str):
        raise NotImplementedError("The Telegram adapter does not support creating directories.")

    async def get_thumbnail(self, root: str, rel: str, size: str = "medium"):
        try:
            message_id_str, _ = rel.split('_', 1)
            message_id = int(message_id_str)
        except (ValueError, IndexError):
            return None

        client = self._get_client()
        try:
            await client.connect()
            message = await client.get_messages(self.chat_id, ids=message_id)
            if not message:
                return None

            doc = message.document or message.video
            thumbs = None
            if doc and getattr(doc, "thumbs", None):
                thumbs = list(doc.thumbs or [])
            elif message.photo and getattr(message.photo, "sizes", None):
                thumbs = list(message.photo.sizes or [])

            thumb = self._pick_photo_thumb(thumbs)
            if not thumb:
                return None

            result = await client.download_media(message, bytes, thumb=thumb)
            if isinstance(result, (bytes, bytearray)):
                return bytes(result)
            return None
        except Exception:
            return None
        finally:
            if client.is_connected():
                await client.disconnect()

    async def delete(self, root: str, rel: str):
        """Delete a file (i.e. a single message)."""
        try:
```
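`_pick_photo_thumb`'s selection policy (largest cached/stripped thumbnail first, otherwise the largest regular or progressive size) can be sketched without Telethon; the dataclasses below are stand-ins for the Telethon size types, not the real ones:

```python
from dataclasses import dataclass, field

@dataclass
class CachedSize:           # stand-in for PhotoCachedSize / PhotoStrippedSize
    bytes: bytes = b""

@dataclass
class RegularSize:          # stand-in for PhotoSize
    size: int = 0

@dataclass
class ProgressiveSize:      # stand-in for PhotoSizeProgressive
    sizes: list = field(default_factory=list)

def pick_thumb(thumbs):
    # Cached sizes carry the payload inline, so they win outright; pick the biggest one.
    cached = [t for t in thumbs if isinstance(t, CachedSize)]
    if cached:
        return max(cached, key=lambda t: len(t.bytes))
    # Otherwise pick the largest regular/progressive size by byte count.
    def sz(t):
        return max(t.sizes or [0]) if isinstance(t, ProgressiveSize) else t.size
    others = [t for t in thumbs if isinstance(t, (RegularSize, ProgressiveSize))]
    return max(others, key=sz) if others else None

best = pick_thumb([RegularSize(100), ProgressiveSize([50, 300]), RegularSize(200)])
```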
@@ -236,6 +370,8 @@ class TelegramAdapter:

```python
            raise HTTPException(status_code=400, detail=f"Invalid file path format: {rel}")

        client = self._get_client()
        lock = _get_session_lock(self.session_string)
        await lock.acquire()

        try:
            await client.connect()
```
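The `_get_session_lock` call above serializes transfers that share one Telegram session. A minimal standalone sketch of the per-key asyncio lock pattern:

```python
import asyncio

_LOCKS: dict[str, asyncio.Lock] = {}

def get_lock(key: str) -> asyncio.Lock:
    # One lock per key; coroutines holding the same key are serialized.
    lock = _LOCKS.get(key)
    if lock is None:
        lock = asyncio.Lock()
        _LOCKS[key] = lock
    return lock

async def main() -> list[int]:
    order: list[int] = []

    async def task(n: int):
        async with get_lock("session-A"):
            order.append(n)
            await asyncio.sleep(0)   # yield while holding the lock
            order.append(n)

    await asyncio.gather(task(1), task(2))
    return order

result = asyncio.run(main())  # each task's two entries stay adjacent: [1, 1, 2, 2]
```

Without the shared lock the two appends of each task could interleave; the lock guarantees one transfer finishes with the session before the next starts.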
@@ -273,7 +409,6 @@

```python
        headers = {
            "Accept-Ranges": "bytes",
            "Content-Type": mime_type,
            "Content-Length": str(file_size),
        }

        if range_header:
```
@@ -285,7 +420,6 @@

```python
                if start >= file_size or end >= file_size or start > end:
                    raise HTTPException(status_code=416, detail="Requested Range Not Satisfiable")
                status = 206
                headers["Content-Length"] = str(end - start + 1)
                headers["Content-Range"] = f"bytes {start}-{end}/{file_size}"
            except ValueError:
                raise HTTPException(status_code=400, detail="Invalid Range header")
```
@@ -304,18 +438,28 @@

```python
                    if downloaded >= limit:
                        break
                finally:
                    # was: disconnect only; now the session lock is always released as well
                    try:
                        if client.is_connected():
                            await client.disconnect()
                    finally:
                        lock.release()

            return StreamingResponse(iterator(), status_code=status, headers=headers)

        except HTTPException:
            if client.is_connected():
                await client.disconnect()
            lock.release()
            raise
        except FileNotFoundError as e:
            if client.is_connected():
                await client.disconnect()
            lock.release()
            raise HTTPException(status_code=404, detail=str(e))
        except Exception as e:
            if client.is_connected():
                await client.disconnect()
            lock.release()
            raise HTTPException(status_code=500, detail=f"Streaming failed: {str(e)}")

    async def stat_file(self, root: str, rel: str):
```
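The 206/416 logic above assumes the `Range` header was already split into `start` and `end`. A sketch of that parsing step (a hypothetical helper, simplified: no multi-range or suffix-range support):

```python
def parse_range(range_header: str, file_size: int) -> tuple[int, int]:
    """Parse 'bytes=start-end' and validate it against file_size; raises ValueError on bad input."""
    unit, _, spec = range_header.partition("=")
    if unit != "bytes" or "," in spec:
        raise ValueError("unsupported Range header")
    start_s, _, end_s = spec.partition("-")
    start = int(start_s) if start_s else 0
    end = int(end_s) if end_s else file_size - 1
    # Same validation as the adapter: out-of-bounds or inverted ranges are rejected (416).
    if start >= file_size or end >= file_size or start > end:
        raise ValueError("Requested Range Not Satisfiable")
    return start, end

start, end = parse_range("bytes=0-1023", 4096)
headers = {
    "Content-Length": str(end - start + 1),
    "Content-Range": f"bytes {start}-{end}/4096",
}
```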
@@ -4,7 +4,7 @@ from importlib import import_module

```python
from typing import Callable, Dict

from models import StorageAdapter
from .providers.base import BaseAdapter  # was: from domain.adapters.providers.base import BaseAdapter

AdapterFactory = Callable[[StorageAdapter], BaseAdapter]
```
@@ -21,7 +21,7 @@ def normalize_adapter_type(value: str | None) -> str | None:

```python
def discover_adapters():
    """Scan the domain.adapters.providers package and auto-register adapter types, factories, and config schemas."""
    from . import providers as adapters_pkg  # was: from domain.adapters import providers as adapters_pkg

    TYPE_MAP.clear()
    CONFIG_SCHEMAS.clear()
```
@@ -2,13 +2,13 @@ from typing import Optional

```python
from fastapi import HTTPException

from domain.auth import User  # was: from domain.auth.types import User
from .registry import (  # was: from domain.adapters.registry import (
    get_config_schemas,
    normalize_adapter_type,
    runtime_registry,
)
from .types import AdapterCreate, AdapterOut  # was: from domain.adapters.types import AdapterCreate, AdapterOut
from models import StorageAdapter
```
domain/agent/__init__.py (new file, 9 lines)

@@ -0,0 +1,9 @@

```python
from .service import AgentService
from .types import AgentChatContext, AgentChatRequest, PendingToolCall

__all__ = [
    "AgentService",
    "AgentChatContext",
    "AgentChatRequest",
    "PendingToolCall",
]
```
domain/agent/api.py (new file, 38 lines)

@@ -0,0 +1,38 @@

```python
from typing import Annotated

from fastapi import APIRouter, Depends, Request
from fastapi.responses import StreamingResponse

from api.response import success
from domain.audit import AuditAction, audit
from domain.auth import User, get_current_active_user
from .service import AgentService
from .types import AgentChatRequest


router = APIRouter(prefix="/api/agent", tags=["agent"])


@router.post("/chat")
@audit(action=AuditAction.CREATE, description="Agent chat", body_fields=["auto_execute"])
async def chat(
    request: Request,
    payload: AgentChatRequest,
    current_user: Annotated[User, Depends(get_current_active_user)],
):
    data = await AgentService.chat(payload, current_user)
    return success(data)


@router.post("/chat/stream")
@audit(action=AuditAction.CREATE, description="Agent chat (SSE)", body_fields=["auto_execute"])
async def chat_stream(
    request: Request,
    payload: AgentChatRequest,
    current_user: Annotated[User, Depends(get_current_active_user)],
):
    return StreamingResponse(
        AgentService.chat_stream(payload, current_user),
        media_type="text/event-stream",
        headers={"Cache-Control": "no-cache"},
    )
```
domain/agent/service.py (new file, 470 lines)

@@ -0,0 +1,470 @@

```python
import asyncio
import json
import uuid
from typing import Any, Dict, List, Optional, Tuple

import httpx
from fastapi import HTTPException

from domain.ai import AIProviderService, MissingModelError, chat_completion, chat_completion_stream
from domain.auth import User
from .tools import get_tool, openai_tools, tool_result_to_content
from .types import AgentChatRequest, PendingToolCall


def _normalize_path(p: Optional[str]) -> Optional[str]:
    if not p:
        return None
    s = str(p).strip()
    if not s:
        return None
    s = s.replace("\\", "/")
    if not s.startswith("/"):
        s = "/" + s
    s = s.rstrip("/") or "/"
    return s


def _build_system_prompt(current_path: Optional[str]) -> str:
    lines = [
        "You are Foxel's AI assistant.",
        "Through tools you can query, read, write, move, copy, and delete files/directories, and run processors.",
        "",
        "Available tools:",
        "- vfs_list_dir: browse a directory (returns entries + pagination).",
        "- vfs_stat: inspect a file/directory.",
        "- vfs_read_text: read a text file's content (binary not supported).",
        "- vfs_search: search files (vector/filename).",
        "- vfs_write_text: write a text file's content (overwrite).",
        "- vfs_mkdir: create a directory.",
        "- vfs_delete: delete a file or directory.",
        "- vfs_move: move a path.",
        "- vfs_copy: copy a path.",
        "- vfs_rename: rename a path.",
        "- processors_list: list available processors (with type/name/config_schema/produces_file/supports_directory).",
        "- processors_run: run a processor on a file or directory (returns task_id or task_ids).",
        "",
        "Rules:",
        "1) Read operations (vfs_list_dir/vfs_stat/vfs_read_text/vfs_search) may be called directly.",
        "2) Write/modify/delete operations (vfs_write_text/vfs_mkdir/vfs_delete/vfs_move/vfs_copy/vfs_rename/processors_run) require user confirmation by default; execute directly only when auto-execute is enabled.",
        "3) If the user gives no explicit path, ask first; if a 'current file-manager directory' is provided, use it to expand relative descriptions into absolute paths (starting with /).",
        "4) To modify file content: read first (vfs_read_text) → present the changes → write after confirmation (vfs_write_text).",
        "5) After processors_run returns a task id, explain that the task was submitted and progress can be checked in the task queue.",
        "6) Keep answers concise and in Chinese.",
    ]
    if current_path:
        lines.append("")
        lines.append(f"Current file-manager directory: {current_path}")
    return "\n".join(lines)


def _ensure_tool_call_ids(message: Dict[str, Any]) -> Dict[str, Any]:
    tool_calls = message.get("tool_calls")
    if not isinstance(tool_calls, list):
        return message

    changed = False
    for idx, call in enumerate(tool_calls):
        if not isinstance(call, dict):
            continue
        call_id = call.get("id")
        if isinstance(call_id, str) and call_id.strip():
            continue
        call["id"] = f"call_{idx}"
        changed = True

    if changed:
        message["tool_calls"] = tool_calls
    return message


def _extract_pending(tool_call: Dict[str, Any], requires_confirmation: bool) -> PendingToolCall:
    call_id = str(tool_call.get("id") or "")
    fn = tool_call.get("function") or {}
    name = str((fn.get("name") if isinstance(fn, dict) else None) or "")
    raw_args = fn.get("arguments") if isinstance(fn, dict) else None
    arguments: Dict[str, Any] = {}
    if isinstance(raw_args, str) and raw_args.strip():
        try:
            parsed = json.loads(raw_args)
            if isinstance(parsed, dict):
                arguments = parsed
        except json.JSONDecodeError:
            arguments = {}
    return PendingToolCall(
        id=call_id,
        name=name,
        arguments=arguments,
        requires_confirmation=requires_confirmation,
    )


def _find_last_assistant_tool_calls(messages: List[Dict[str, Any]]) -> Tuple[int, Dict[str, Any]]:
    for idx in range(len(messages) - 1, -1, -1):
        msg = messages[idx]
        if not isinstance(msg, dict):
            continue
        if msg.get("role") != "assistant":
            continue
        tool_calls = msg.get("tool_calls")
        if isinstance(tool_calls, list) and tool_calls:
            return idx, msg
    raise HTTPException(status_code=400, detail="There are no pending operations to confirm")


def _existing_tool_result_ids(messages: List[Dict[str, Any]]) -> set[str]:
    ids: set[str] = set()
    for msg in messages:
        if not isinstance(msg, dict):
            continue
        if msg.get("role") != "tool":
            continue
        tool_call_id = msg.get("tool_call_id")
        if isinstance(tool_call_id, str) and tool_call_id.strip():
            ids.add(tool_call_id)
    return ids


async def _choose_chat_ability() -> str:
    tools_model = await AIProviderService.get_default_model("tools")
    return "tools" if tools_model else "chat"


def _sse(event: str, data: Any) -> bytes:
    payload = json.dumps(data, ensure_ascii=False, separators=(",", ":"))
    return f"event: {event}\ndata: {payload}\n\n".encode("utf-8")


def _format_exc(exc: BaseException) -> str:
    text = str(exc)
    return text if text else exc.__class__.__name__


class AgentService:
    @classmethod
    async def chat(cls, req: AgentChatRequest, user: Optional[User]) -> Dict[str, Any]:
        history: List[Dict[str, Any]] = list(req.messages or [])
        current_path = _normalize_path(req.context.current_path if req.context else None)

        system_prompt = _build_system_prompt(current_path)
        internal_messages: List[Dict[str, Any]] = [{"role": "system", "content": system_prompt}] + history

        new_messages: List[Dict[str, Any]] = []
        pending: List[PendingToolCall] = []

        approved_ids = {i for i in (req.approved_tool_call_ids or []) if isinstance(i, str) and i.strip()}
        rejected_ids = {i for i in (req.rejected_tool_call_ids or []) if isinstance(i, str) and i.strip()}

        if approved_ids or rejected_ids:
            _, last_call_msg = _find_last_assistant_tool_calls(internal_messages)
            last_call_msg = _ensure_tool_call_ids(last_call_msg)
            tool_calls = last_call_msg.get("tool_calls") or []
            call_map: Dict[str, Dict[str, Any]] = {
                str(c.get("id")): c
                for c in tool_calls
                if isinstance(c, dict) and isinstance(c.get("id"), str)
            }

            existing_ids = _existing_tool_result_ids(internal_messages)
            for call_id in approved_ids | rejected_ids:
                if call_id in existing_ids:
                    continue
                tool_call = call_map.get(call_id)
                if not tool_call:
                    continue
                fn = tool_call.get("function") or {}
                name = fn.get("name") if isinstance(fn, dict) else None
                args_raw = fn.get("arguments") if isinstance(fn, dict) else None
                args: Dict[str, Any] = {}
                if isinstance(args_raw, str) and args_raw.strip():
                    try:
                        parsed = json.loads(args_raw)
                        if isinstance(parsed, dict):
                            args = parsed
                    except json.JSONDecodeError:
                        args = {}

                spec = get_tool(str(name or ""))
                if call_id in rejected_ids:
                    content = tool_result_to_content({"canceled": True, "reason": "user_rejected"})
                    tool_msg = {"role": "tool", "tool_call_id": call_id, "content": content}
                    internal_messages.append(tool_msg)
                    new_messages.append(tool_msg)
                    continue

                if not spec:
                    content = tool_result_to_content({"error": f"unknown_tool: {name}"})
                    tool_msg = {"role": "tool", "tool_call_id": call_id, "content": content}
                    internal_messages.append(tool_msg)
                    new_messages.append(tool_msg)
                    continue

                try:
                    result = await spec.handler(args)
                    content = tool_result_to_content(result)
                except Exception as exc:  # noqa: BLE001
                    content = tool_result_to_content({"error": str(exc)})
                tool_msg = {"role": "tool", "tool_call_id": call_id, "content": content}
                internal_messages.append(tool_msg)
                new_messages.append(tool_msg)

        tools_schema = openai_tools()
        ability = await _choose_chat_ability()
        max_loops = 4

        for _ in range(max_loops):
            try:
                assistant = await chat_completion(
                    internal_messages,
                    ability=ability,
                    tools=tools_schema,
                    tool_choice="auto",
                    timeout=60.0,
                )
            except MissingModelError as exc:
                raise HTTPException(status_code=400, detail=str(exc)) from exc
            except httpx.HTTPStatusError as exc:
                raise HTTPException(status_code=502, detail=f"Chat request failed: {exc}") from exc
            except httpx.RequestError as exc:
                raise HTTPException(status_code=502, detail=f"Chat request error: {exc}") from exc

            assistant = _ensure_tool_call_ids(assistant)
            internal_messages.append(assistant)
            new_messages.append(assistant)

            tool_calls = assistant.get("tool_calls")
            if not isinstance(tool_calls, list) or not tool_calls:
                break

            pending = []
            for call in tool_calls:
                if not isinstance(call, dict):
                    continue
                call_id = str(call.get("id") or "")
                fn = call.get("function") or {}
                name = fn.get("name") if isinstance(fn, dict) else None
                args_raw = fn.get("arguments") if isinstance(fn, dict) else None
                args: Dict[str, Any] = {}
                if isinstance(args_raw, str) and args_raw.strip():
                    try:
                        parsed = json.loads(args_raw)
                        if isinstance(parsed, dict):
                            args = parsed
                    except json.JSONDecodeError:
                        args = {}

                spec = get_tool(str(name or ""))
                if not spec:
                    content = tool_result_to_content({"error": f"unknown_tool: {name}"})
                    tool_msg = {"role": "tool", "tool_call_id": call_id, "content": content}
                    internal_messages.append(tool_msg)
                    new_messages.append(tool_msg)
                    continue

                if spec.requires_confirmation and not req.auto_execute:
                    pending.append(_extract_pending(call, True))
                    continue

                try:
                    result = await spec.handler(args)
                    content = tool_result_to_content(result)
                except Exception as exc:  # noqa: BLE001
                    content = tool_result_to_content({"error": str(exc)})
                tool_msg = {"role": "tool", "tool_call_id": call_id, "content": content}
                internal_messages.append(tool_msg)
                new_messages.append(tool_msg)

            if pending:
                break

        payload: Dict[str, Any] = {"messages": new_messages}
        if pending:
            payload["pending_tool_calls"] = [p.model_dump() for p in pending]
        return payload

    @classmethod
    async def chat_stream(cls, req: AgentChatRequest, user: Optional[User]):
        history: List[Dict[str, Any]] = list(req.messages or [])
        current_path = _normalize_path(req.context.current_path if req.context else None)

        system_prompt = _build_system_prompt(current_path)
        internal_messages: List[Dict[str, Any]] = [{"role": "system", "content": system_prompt}] + history

        new_messages: List[Dict[str, Any]] = []
        pending: List[PendingToolCall] = []

        approved_ids = {i for i in (req.approved_tool_call_ids or []) if isinstance(i, str) and i.strip()}
        rejected_ids = {i for i in (req.rejected_tool_call_ids or []) if isinstance(i, str) and i.strip()}

        try:
            if approved_ids or rejected_ids:
                _, last_call_msg = _find_last_assistant_tool_calls(internal_messages)
                last_call_msg = _ensure_tool_call_ids(last_call_msg)
                tool_calls = last_call_msg.get("tool_calls") or []
                call_map: Dict[str, Dict[str, Any]] = {
                    str(c.get("id")): c
                    for c in tool_calls
                    if isinstance(c, dict) and isinstance(c.get("id"), str)
                }

                existing_ids = _existing_tool_result_ids(internal_messages)
                for call_id in approved_ids | rejected_ids:
                    if call_id in existing_ids:
                        continue
                    tool_call = call_map.get(call_id)
                    if not tool_call:
                        continue
                    fn = tool_call.get("function") or {}
                    name = fn.get("name") if isinstance(fn, dict) else None
                    args_raw = fn.get("arguments") if isinstance(fn, dict) else None
                    args: Dict[str, Any] = {}
                    if isinstance(args_raw, str) and args_raw.strip():
                        try:
                            parsed = json.loads(args_raw)
                            if isinstance(parsed, dict):
                                args = parsed
                        except json.JSONDecodeError:
                            args = {}

                    spec = get_tool(str(name or ""))
                    if call_id in rejected_ids:
                        content = tool_result_to_content({"canceled": True, "reason": "user_rejected"})
                        tool_msg = {"role": "tool", "tool_call_id": call_id, "content": content}
                        internal_messages.append(tool_msg)
                        new_messages.append(tool_msg)
                        yield _sse("tool_end", {"tool_call_id": call_id, "name": str(name or ""), "message": tool_msg})
                        continue

                    if not spec:
                        content = tool_result_to_content({"error": f"unknown_tool: {name}"})
                        tool_msg = {"role": "tool", "tool_call_id": call_id, "content": content}
                        internal_messages.append(tool_msg)
                        new_messages.append(tool_msg)
                        yield _sse("tool_end", {"tool_call_id": call_id, "name": str(name or ""), "message": tool_msg})
                        continue

                    yield _sse("tool_start", {"tool_call_id": call_id, "name": spec.name})
                    try:
                        result = await spec.handler(args)
                        content = tool_result_to_content(result)
                    except Exception as exc:  # noqa: BLE001
                        content = tool_result_to_content({"error": str(exc)})
                    tool_msg = {"role": "tool", "tool_call_id": call_id, "content": content}
                    internal_messages.append(tool_msg)
                    new_messages.append(tool_msg)
                    yield _sse("tool_end", {"tool_call_id": call_id, "name": spec.name, "message": tool_msg})

            tools_schema = openai_tools()
            ability = await _choose_chat_ability()
            max_loops = 4

            for _ in range(max_loops):
                assistant_event_id = uuid.uuid4().hex
                yield _sse("assistant_start", {"id": assistant_event_id})

                assistant_message: Dict[str, Any] | None = None
                try:
                    async for event in chat_completion_stream(
                        internal_messages,
                        ability=ability,
                        tools=tools_schema,
                        tool_choice="auto",
                        timeout=60.0,
                    ):
                        if event.get("type") == "delta":
                            delta = event.get("delta")
                            if isinstance(delta, str) and delta:
                                yield _sse("assistant_delta", {"id": assistant_event_id, "delta": delta})
                        elif event.get("type") == "message":
                            msg = event.get("message")
                            if isinstance(msg, dict):
                                assistant_message = msg
                except MissingModelError as exc:
                    raise HTTPException(status_code=400, detail=_format_exc(exc)) from exc
                except httpx.HTTPStatusError as exc:
                    raise HTTPException(status_code=502, detail=f"Chat request failed: {_format_exc(exc)}") from exc
                except httpx.RequestError as exc:
                    raise HTTPException(status_code=502, detail=f"Chat request error: {_format_exc(exc)}") from exc

                if not assistant_message:
                    assistant_message = {"role": "assistant", "content": ""}

                assistant_message = _ensure_tool_call_ids(assistant_message)
                internal_messages.append(assistant_message)
                new_messages.append(assistant_message)
                yield _sse("assistant_end", {"id": assistant_event_id, "message": assistant_message})

                tool_calls = assistant_message.get("tool_calls")
                if not isinstance(tool_calls, list) or not tool_calls:
                    break

                pending = []
                for call in tool_calls:
                    if not isinstance(call, dict):
                        continue
                    call_id = str(call.get("id") or "")
                    fn = call.get("function") or {}
                    name = fn.get("name") if isinstance(fn, dict) else None
```
|
||||
args_raw = fn.get("arguments") if isinstance(fn, dict) else None
|
||||
args: Dict[str, Any] = {}
|
||||
if isinstance(args_raw, str) and args_raw.strip():
|
||||
try:
|
||||
parsed = json.loads(args_raw)
|
||||
if isinstance(parsed, dict):
|
||||
args = parsed
|
||||
except json.JSONDecodeError:
|
||||
args = {}
|
||||
|
||||
spec = get_tool(str(name or ""))
|
||||
if not spec:
|
||||
content = tool_result_to_content({"error": f"unknown_tool: {name}"})
|
||||
tool_msg = {"role": "tool", "tool_call_id": call_id, "content": content}
|
||||
internal_messages.append(tool_msg)
|
||||
new_messages.append(tool_msg)
|
||||
yield _sse("tool_end", {"tool_call_id": call_id, "name": str(name or ""), "message": tool_msg})
|
||||
continue
|
||||
|
||||
if spec.requires_confirmation and not req.auto_execute:
|
||||
pending.append(_extract_pending(call, True))
|
||||
continue
|
||||
|
||||
yield _sse("tool_start", {"tool_call_id": call_id, "name": spec.name})
|
||||
try:
|
||||
result = await spec.handler(args)
|
||||
content = tool_result_to_content(result)
|
||||
except Exception as exc: # noqa: BLE001
|
||||
content = tool_result_to_content({"error": str(exc)})
|
||||
tool_msg = {"role": "tool", "tool_call_id": call_id, "content": content}
|
||||
internal_messages.append(tool_msg)
|
||||
new_messages.append(tool_msg)
|
||||
yield _sse("tool_end", {"tool_call_id": call_id, "name": spec.name, "message": tool_msg})
|
||||
|
||||
if pending:
|
||||
yield _sse("pending", {"pending_tool_calls": [p.model_dump() for p in pending]})
|
||||
break
|
||||
|
||||
payload: Dict[str, Any] = {"messages": new_messages}
|
||||
if pending:
|
||||
payload["pending_tool_calls"] = [p.model_dump() for p in pending]
|
||||
yield _sse("done", payload)
|
||||
|
||||
except asyncio.CancelledError:
|
||||
return
|
||||
except HTTPException as exc:
|
||||
detail = exc.detail
|
||||
content = detail if isinstance(detail, str) else str(detail)
|
||||
if not content.strip():
|
||||
content = f"请求失败({exc.status_code})"
|
||||
new_messages.append({"role": "assistant", "content": content})
|
||||
payload: Dict[str, Any] = {"messages": new_messages}
|
||||
if pending:
|
||||
payload["pending_tool_calls"] = [p.model_dump() for p in pending]
|
||||
yield _sse("done", payload)
|
||||
return
|
||||
except Exception as exc: # noqa: BLE001
|
||||
new_messages.append({"role": "assistant", "content": f"服务端异常: {_format_exc(exc)}"})
|
||||
payload: Dict[str, Any] = {"messages": new_messages}
|
||||
if pending:
|
||||
payload["pending_tool_calls"] = [p.model_dump() for p in pending]
|
||||
yield _sse("done", payload)
|
||||
return
|
||||
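The `_sse` helper used throughout this handler is defined elsewhere in the module. A minimal standalone sketch of an SSE frame formatter that is consistent with the `yield _sse(...)` calls above (this is an assumption about its shape; Foxel's actual implementation may differ):

```python
import json


def _sse(event: str, data: dict) -> str:
    # Assumed shape: one Server-Sent Events frame per call, with the payload
    # serialized as a single JSON "data:" line.
    return f"event: {event}\ndata: {json.dumps(data, ensure_ascii=False)}\n\n"


frame = _sse("tool_start", {"tool_call_id": "call_1", "name": "vfs_stat"})
print(frame)
```

Framed this way, each yielded string is a complete `event:`/`data:` block terminated by a blank line, which is what an `EventSource`-style client expects.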
412 domain/agent/tools.py Normal file
@@ -0,0 +1,412 @@
import json
from dataclasses import dataclass
from typing import Any, Awaitable, Callable, Dict, List, Optional

from domain.processors import ProcessDirectoryRequest, ProcessRequest, ProcessorService
from domain.virtual_fs import VirtualFSService
from domain.virtual_fs.search import VirtualFSSearchService


@dataclass(frozen=True)
class ToolSpec:
    name: str
    description: str
    parameters: Dict[str, Any]
    requires_confirmation: bool
    handler: Callable[[Dict[str, Any]], Awaitable[Any]]


async def _processors_list(_: Dict[str, Any]) -> Dict[str, Any]:
    return {"processors": ProcessorService.list_processors()}


async def _processors_run(args: Dict[str, Any]) -> Dict[str, Any]:
    path = str(args.get("path") or "")
    processor_type = str(args.get("processor_type") or "")
    config = args.get("config")
    if not isinstance(config, dict):
        config = {}

    save_to = args.get("save_to")
    save_to = str(save_to) if isinstance(save_to, str) and save_to.strip() else None

    max_depth = args.get("max_depth")
    max_depth_value: Optional[int] = None
    if max_depth is not None:
        try:
            max_depth_value = int(max_depth)
        except (TypeError, ValueError):
            max_depth_value = None

    suffix = args.get("suffix")
    suffix_value = str(suffix) if isinstance(suffix, str) and suffix.strip() else None

    overwrite_value = args.get("overwrite")
    overwrite = bool(overwrite_value) if overwrite_value is not None else None

    is_dir = await VirtualFSService.path_is_directory(path)
    if is_dir and (max_depth_value is not None or suffix_value is not None):
        req = ProcessDirectoryRequest(
            path=path,
            processor_type=processor_type,
            config=config,
            overwrite=True if overwrite is None else overwrite,
            max_depth=max_depth_value,
            suffix=suffix_value,
        )
        result = await ProcessorService.process_directory(req)
        return {"mode": "directory", **result}

    req = ProcessRequest(
        path=path,
        processor_type=processor_type,
        config=config,
        save_to=save_to,
        overwrite=False if overwrite is None else overwrite,
    )
    result = await ProcessorService.process_file(req)
    return {"mode": "file", **result}


def _normalize_vfs_path(value: Any) -> str:
    s = str(value or "").strip().replace("\\", "/")
    if not s:
        return ""
    if not s.startswith("/"):
        s = "/" + s
    s = s.rstrip("/") or "/"
    return s

def _require_vfs_path(value: Any, field: str) -> str:
    path = _normalize_vfs_path(value)
    if not path:
        raise ValueError(f"missing_{field}")
    return path


async def _vfs_list_dir(args: Dict[str, Any]) -> Dict[str, Any]:
    path = _normalize_vfs_path(args.get("path") or "/") or "/"
    page = int(args.get("page") or 1)
    page_size = int(args.get("page_size") or 50)
    sort_by = str(args.get("sort_by") or "name")
    sort_order = str(args.get("sort_order") or "asc")
    return await VirtualFSService.list_directory(path, page, page_size, sort_by, sort_order)


async def _vfs_stat(args: Dict[str, Any]) -> Any:
    path = _require_vfs_path(args.get("path"), "path")
    return await VirtualFSService.stat(path)


async def _vfs_read_text(args: Dict[str, Any]) -> Dict[str, Any]:
    path = _require_vfs_path(args.get("path"), "path")
    encoding = str(args.get("encoding") or "utf-8")
    max_chars = int(args.get("max_chars") or 8000)

    data = await VirtualFSService.read_file(path)
    if isinstance(data, (bytes, bytearray)):
        try:
            text = bytes(data).decode(encoding)
        except UnicodeDecodeError:
            return {"error": "binary_or_invalid_text", "path": path}
    elif isinstance(data, str):
        text = data
    else:
        text = str(data)

    original_len = len(text)
    truncated = original_len > max_chars
    if truncated:
        text = text[:max_chars]
    return {
        "path": path,
        "encoding": encoding,
        "content": text,
        "truncated": truncated,
        "length": original_len,
    }


async def _vfs_write_text(args: Dict[str, Any]) -> Dict[str, Any]:
    path = _require_vfs_path(args.get("path"), "path")
    if path == "/":
        raise ValueError("invalid_path")
    encoding = str(args.get("encoding") or "utf-8")
    content = str(args.get("content") or "")
    data = content.encode(encoding)
    await VirtualFSService.write_file(path, data)
    return {"written": True, "path": path, "encoding": encoding, "bytes": len(data)}


async def _vfs_mkdir(args: Dict[str, Any]) -> Dict[str, Any]:
    path = _require_vfs_path(args.get("path"), "path")
    return await VirtualFSService.mkdir(path)


async def _vfs_delete(args: Dict[str, Any]) -> Dict[str, Any]:
    path = _require_vfs_path(args.get("path"), "path")
    return await VirtualFSService.delete(path)


async def _vfs_move(args: Dict[str, Any]) -> Dict[str, Any]:
    src = _require_vfs_path(args.get("src"), "src")
    dst = _require_vfs_path(args.get("dst"), "dst")
    if src == "/" or dst == "/":
        raise ValueError("invalid_path")
    overwrite = bool(args.get("overwrite") or False)
    return await VirtualFSService.move(src, dst, overwrite)


async def _vfs_copy(args: Dict[str, Any]) -> Dict[str, Any]:
    src = _require_vfs_path(args.get("src"), "src")
    dst = _require_vfs_path(args.get("dst"), "dst")
    if src == "/" or dst == "/":
        raise ValueError("invalid_path")
    overwrite = bool(args.get("overwrite") or False)
    return await VirtualFSService.copy(src, dst, overwrite)


async def _vfs_rename(args: Dict[str, Any]) -> Dict[str, Any]:
    src = _require_vfs_path(args.get("src"), "src")
    dst = _require_vfs_path(args.get("dst"), "dst")
    if src == "/" or dst == "/":
        raise ValueError("invalid_path")
    overwrite = bool(args.get("overwrite") or False)
    return await VirtualFSService.rename(src, dst, overwrite)


async def _vfs_search(args: Dict[str, Any]) -> Dict[str, Any]:
    q = str(args.get("q") or "").strip()
    if not q:
        raise ValueError("missing_q")
    mode = str(args.get("mode") or "vector")
    top_k = int(args.get("top_k") or 10)
    page = int(args.get("page") or 1)
    page_size = int(args.get("page_size") or 10)
    return await VirtualFSSearchService.search(q, top_k, mode, page, page_size)


TOOLS: Dict[str, ToolSpec] = {
    "processors_list": ToolSpec(
        name="processors_list",
        description="获取可用处理器列表(type/name/config_schema 等)。",
        parameters={
            "type": "object",
            "properties": {},
            "additionalProperties": False,
        },
        requires_confirmation=False,
        handler=_processors_list,
    ),
    "processors_run": ToolSpec(
        name="processors_run",
        description=(
            "运行处理器处理文件或目录。"
            " 对目录可选 max_depth/suffix;对文件可选 overwrite/save_to。"
            " 返回任务 id(去任务队列查看进度)。"
        ),
        parameters={
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "文件或目录路径(绝对路径,如 /foo/bar)"},
                "processor_type": {"type": "string", "description": "处理器类型(例如 image_watermark)"},
                "config": {"type": "object", "description": "处理器配置,按 processors_list 返回的 config_schema 填写"},
                "overwrite": {"type": "boolean", "description": "是否覆盖原文件/目录内文件"},
                "save_to": {"type": "string", "description": "保存到指定路径(仅文件模式,且 overwrite=false 时使用)"},
                "max_depth": {"type": "integer", "description": "目录遍历深度(仅目录模式)"},
                "suffix": {"type": "string", "description": "目录批处理时的输出后缀(仅 produces_file 且 overwrite=false)"},
            },
            "required": ["path", "processor_type"],
        },
        requires_confirmation=True,
        handler=_processors_run,
    ),
    "vfs_list_dir": ToolSpec(
        name="vfs_list_dir",
        description="浏览目录(列出 entries + pagination)。",
        parameters={
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "目录路径(绝对路径,如 /foo/bar)"},
                "page": {"type": "integer", "description": "页码(从 1 开始)"},
                "page_size": {"type": "integer", "description": "每页条数"},
                "sort_by": {"type": "string", "description": "排序字段:name/size/mtime"},
                "sort_order": {"type": "string", "description": "排序顺序:asc/desc"},
            },
            "required": ["path"],
            "additionalProperties": False,
        },
        requires_confirmation=False,
        handler=_vfs_list_dir,
    ),
    "vfs_stat": ToolSpec(
        name="vfs_stat",
        description="查看文件/目录信息(size/mtime/is_dir/has_thumbnail/vector_index 等)。",
        parameters={
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "路径(绝对路径,如 /foo/bar.txt)"},
            },
            "required": ["path"],
            "additionalProperties": False,
        },
        requires_confirmation=False,
        handler=_vfs_stat,
    ),
    "vfs_read_text": ToolSpec(
        name="vfs_read_text",
        description="读取文本文件内容(解码失败视为二进制,返回 error)。",
        parameters={
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "文件路径(绝对路径,如 /foo/bar.md)"},
                "encoding": {"type": "string", "description": "文本编码(默认 utf-8)"},
                "max_chars": {"type": "integer", "description": "最多返回的字符数(默认 8000)"},
            },
            "required": ["path"],
            "additionalProperties": False,
        },
        requires_confirmation=False,
        handler=_vfs_read_text,
    ),
    "vfs_write_text": ToolSpec(
        name="vfs_write_text",
        description="写入文本文件内容(会覆盖目标文件)。",
        parameters={
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "文件路径(绝对路径,如 /foo/bar.md)"},
                "content": {"type": "string", "description": "要写入的文本内容"},
                "encoding": {"type": "string", "description": "文本编码(默认 utf-8)"},
            },
            "required": ["path", "content"],
            "additionalProperties": False,
        },
        requires_confirmation=True,
        handler=_vfs_write_text,
    ),
    "vfs_mkdir": ToolSpec(
        name="vfs_mkdir",
        description="创建目录。",
        parameters={
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "目录路径(绝对路径,如 /foo/bar)"},
            },
            "required": ["path"],
            "additionalProperties": False,
        },
        requires_confirmation=True,
        handler=_vfs_mkdir,
    ),
    "vfs_delete": ToolSpec(
        name="vfs_delete",
        description="删除文件或目录(由底层适配器决定是否递归)。",
        parameters={
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "路径(绝对路径,如 /foo/bar 或 /foo/bar.txt)"},
            },
            "required": ["path"],
            "additionalProperties": False,
        },
        requires_confirmation=True,
        handler=_vfs_delete,
    ),
    "vfs_move": ToolSpec(
        name="vfs_move",
        description="移动路径(可能进入任务队列)。",
        parameters={
            "type": "object",
            "properties": {
                "src": {"type": "string", "description": "源路径(绝对路径)"},
                "dst": {"type": "string", "description": "目标路径(绝对路径)"},
                "overwrite": {"type": "boolean", "description": "是否允许覆盖已存在目标(默认 false)"},
            },
            "required": ["src", "dst"],
            "additionalProperties": False,
        },
        requires_confirmation=True,
        handler=_vfs_move,
    ),
    "vfs_copy": ToolSpec(
        name="vfs_copy",
        description="复制路径(可能进入任务队列)。",
        parameters={
            "type": "object",
            "properties": {
                "src": {"type": "string", "description": "源路径(绝对路径)"},
                "dst": {"type": "string", "description": "目标路径(绝对路径)"},
                "overwrite": {"type": "boolean", "description": "是否覆盖已存在目标(默认 false)"},
            },
            "required": ["src", "dst"],
            "additionalProperties": False,
        },
        requires_confirmation=True,
        handler=_vfs_copy,
    ),
    "vfs_rename": ToolSpec(
        name="vfs_rename",
        description="重命名路径(本质是同目录 move)。",
        parameters={
            "type": "object",
            "properties": {
                "src": {"type": "string", "description": "源路径(绝对路径)"},
                "dst": {"type": "string", "description": "目标路径(绝对路径)"},
                "overwrite": {"type": "boolean", "description": "是否允许覆盖已存在目标(默认 false)"},
            },
            "required": ["src", "dst"],
            "additionalProperties": False,
        },
        requires_confirmation=True,
        handler=_vfs_rename,
    ),
    "vfs_search": ToolSpec(
        name="vfs_search",
        description="搜索文件(mode=vector 或 filename)。",
        parameters={
            "type": "object",
            "properties": {
                "q": {"type": "string", "description": "搜索关键词"},
                "mode": {"type": "string", "description": "搜索模式:vector/filename(默认 vector)"},
                "top_k": {"type": "integer", "description": "返回数量(vector 模式使用,默认 10)"},
                "page": {"type": "integer", "description": "页码(filename 模式使用,默认 1)"},
                "page_size": {"type": "integer", "description": "分页大小(filename 模式使用,默认 10)"},
            },
            "required": ["q"],
            "additionalProperties": False,
        },
        requires_confirmation=False,
        handler=_vfs_search,
    ),
}


def get_tool(name: str) -> Optional[ToolSpec]:
    return TOOLS.get(name)


def openai_tools() -> List[Dict[str, Any]]:
    out: List[Dict[str, Any]] = []
    for spec in TOOLS.values():
        out.append({
            "type": "function",
            "function": {
                "name": spec.name,
                "description": spec.description,
                "parameters": spec.parameters,
            },
        })
    return out


def tool_result_to_content(result: Any) -> str:
    if result is None:
        return ""
    if isinstance(result, str):
        return result
    try:
        return json.dumps(result, ensure_ascii=False)
    except TypeError:
        return json.dumps({"result": str(result)}, ensure_ascii=False)
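`tool_result_to_content` above is likewise dependency-free, so a verbatim copy demonstrates all four serialization branches (empty string for `None`, string passthrough, JSON dump, and the `str()` fallback for values `json.dumps` cannot handle):

```python
import json
from typing import Any


def tool_result_to_content(result: Any) -> str:
    # Verbatim copy of the helper from domain/agent/tools.py above.
    if result is None:
        return ""
    if isinstance(result, str):
        return result
    try:
        return json.dumps(result, ensure_ascii=False)
    except TypeError:
        return json.dumps({"result": str(result)}, ensure_ascii=False)


print(tool_result_to_content(None))          # empty string
print(tool_result_to_content("plain"))       # returned unchanged
print(tool_result_to_content({"ok": True}))  # JSON-encoded dict
print(tool_result_to_content({1, 2}))        # set is not JSON-serializable: str() fallback
```

The fallback branch is what keeps a tool returning an exotic object from crashing the whole SSE stream.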
23 domain/agent/types.py Normal file
@@ -0,0 +1,23 @@
from typing import Any, Dict, List, Optional

from pydantic import BaseModel, Field


class AgentChatContext(BaseModel):
    current_path: Optional[str] = None


class AgentChatRequest(BaseModel):
    messages: List[Dict[str, Any]] = Field(default_factory=list)
    auto_execute: bool = False
    approved_tool_call_ids: List[str] = Field(default_factory=list)
    rejected_tool_call_ids: List[str] = Field(default_factory=list)
    context: Optional[AgentChatContext] = None


class PendingToolCall(BaseModel):
    id: str
    name: str
    arguments: Dict[str, Any] = Field(default_factory=dict)
    requires_confirmation: bool = True
@@ -1,28 +1,61 @@
from .api import router_ai, router_vector_db
from .inference import (
    MissingModelError,
    chat_completion,
    chat_completion_stream,
    describe_image_base64,
    get_text_embedding,
    provider_service,
    rerank_texts,
)
from .service import (
    AIProviderService,
    FILE_COLLECTION_NAME,
    VECTOR_COLLECTION_NAME,
    DEFAULT_VECTOR_DIMENSION,
    VectorDBConfigManager,
    VectorDBService,
    DEFAULT_VECTOR_DIMENSION,
    ABILITIES,
    normalize_capabilities,
)
from .types import (
    ABILITIES,
    AIDefaultsUpdate,
    AIModelCreate,
    AIModelUpdate,
    AIProviderCreate,
    AIProviderUpdate,
    VectorDBConfigPayload,
    normalize_capabilities,
)
from .vector_providers import (
    BaseVectorProvider,
    MilvusLiteProvider,
    MilvusServerProvider,
    QdrantProvider,
    get_provider_class,
    get_provider_entry,
    list_providers,
)

__all__ = [
    "router_ai",
    "router_vector_db",
    "MissingModelError",
    "chat_completion",
    "chat_completion_stream",
    "describe_image_base64",
    "get_text_embedding",
    "provider_service",
    "rerank_texts",
    "AIProviderService",
    "VectorDBService",
    "VectorDBConfigManager",
    "DEFAULT_VECTOR_DIMENSION",
    "VECTOR_COLLECTION_NAME",
    "FILE_COLLECTION_NAME",
    "BaseVectorProvider",
    "MilvusLiteProvider",
    "MilvusServerProvider",
    "QdrantProvider",
    "list_providers",
    "get_provider_entry",
    "get_provider_class",
    "ABILITIES",
    "normalize_capabilities",
    "AIDefaultsUpdate",
@@ -5,8 +5,9 @@ from fastapi import APIRouter, Depends, HTTPException, Path, Request

 from api.response import success
 from domain.audit import AuditAction, audit
-from domain.ai.service import AIProviderService, VectorDBConfigManager, VectorDBService
-from domain.ai.types import (
+from domain.auth import User, get_current_active_user
+from .service import AIProviderService, VectorDBConfigManager, VectorDBService
+from .types import (
     AIDefaultsUpdate,
     AIModelCreate,
     AIModelUpdate,
@@ -14,9 +15,7 @@ from domain.ai.types import (
     AIProviderUpdate,
     VectorDBConfigPayload,
 )
-from domain.ai.vector_providers import get_provider_class, get_provider_entry, list_providers
-from domain.auth.service import get_current_active_user
-from domain.auth.types import User
+from .vector_providers import get_provider_class, get_provider_entry, list_providers

 router_ai = APIRouter(prefix="/api/ai", tags=["ai"])
 router_vector_db = APIRouter(prefix="/api/vector-db", tags=["vector-db"])
File diff suppressed because it is too large
@@ -7,7 +7,7 @@ import httpx
 from tortoise.exceptions import DoesNotExist
 from tortoise.transactions import in_transaction

-from domain.config.service import ConfigService
+from domain.config import ConfigService
 from models.database import AIDefaultModel, AIModel, AIProvider

 from .types import ABILITIES, normalize_capabilities
@@ -19,6 +19,8 @@ from .vector_providers import (
 )

 DEFAULT_VECTOR_DIMENSION = 4096
+VECTOR_COLLECTION_NAME = "vector_collection"
+FILE_COLLECTION_NAME = "file_collection"

 OPENAI_EMBEDDING_DIMS = {
     "text-embedding-3-large": 3072,
@@ -138,7 +140,7 @@ def serialize_provider(provider: AIProvider) -> Dict[str, Any]:
         "provider_type": provider.provider_type,
         "api_format": provider.api_format,
         "base_url": provider.base_url,
-        "api_key": provider.api_key,
+        "has_api_key": bool(provider.api_key),
         "logo_url": provider.logo_url,
         "extra_config": provider.extra_config or {},
         "created_at": provider.created_at,
@@ -30,8 +30,8 @@ class AIProviderBase(BaseModel):
     @classmethod
     def normalize_format(cls, value: str) -> str:
         fmt = value.lower()
-        if fmt not in {"openai", "gemini"}:
-            raise ValueError("api_format must be 'openai' or 'gemini'")
+        if fmt not in {"openai", "gemini", "anthropic", "ollama"}:
+            raise ValueError("api_format must be 'openai', 'gemini', 'anthropic', or 'ollama'")
         return fmt

@@ -54,8 +54,8 @@ class AIProviderUpdate(BaseModel):
         if value is None:
             return value
         fmt = value.lower()
-        if fmt not in {"openai", "gemini"}:
-            raise ValueError("api_format must be 'openai' or 'gemini'")
+        if fmt not in {"openai", "gemini", "anthropic", "ollama"}:
+            raise ValueError("api_format must be 'openai', 'gemini', 'anthropic', or 'ollama'")
         return fmt

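The widened `api_format` validator can be exercised standalone. This is a sketch as a plain function (in the repo it is a Pydantic classmethod on `AIProviderBase` and `AIProviderUpdate`):

```python
def normalize_format(value: str) -> str:
    # Sketch of the validator after this change; in Foxel it runs as a
    # Pydantic field validator, not a module-level function.
    fmt = value.lower()
    if fmt not in {"openai", "gemini", "anthropic", "ollama"}:
        raise ValueError("api_format must be 'openai', 'gemini', 'anthropic', or 'ollama'")
    return fmt


print(normalize_format("Anthropic"))  # lowercased before membership check
```

Lowercasing before the membership test means clients may send any casing, while the stored value stays canonical.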
@@ -1,5 +1,4 @@
-from domain.audit.decorator import audit
-from domain.audit.types import AuditAction
-from domain.audit.api import router
+from .decorator import audit
+from .types import AuditAction

-__all__ = ["audit", "AuditAction", "router"]
+__all__ = ["audit", "AuditAction"]
@@ -4,10 +4,9 @@ from typing import Annotated, Optional
 from fastapi import APIRouter, Depends, HTTPException, Query

 from api import response
-from domain.audit.service import AuditService
-from domain.audit.types import AuditAction
-from domain.auth.service import get_current_active_user
-from domain.auth.types import User
+from domain.auth import User, get_current_active_user
+from .service import AuditService
+from .types import AuditAction

 CurrentUser = Annotated[User, Depends(get_current_active_user)]
@@ -7,11 +7,11 @@ import jwt
 from fastapi import Request
 from jwt.exceptions import InvalidTokenError

-from domain.audit.service import AuditService
-from domain.audit.types import AuditAction
-from domain.auth.service import ALGORITHM
-from domain.config.service import ConfigService
+from domain.auth import ALGORITHM
+from domain.config import ConfigService
 from models.database import UserAccount
+from .service import AuditService
+from .types import AuditAction


 def _extract_request(bound_args: Mapping[str, Any]) -> Request | None:
@@ -2,7 +2,7 @@ from typing import Any, Dict, Optional

 from models.database import AuditLog

-from domain.audit.types import AuditAction
+from .types import AuditAction


 class AuditService:
49 domain/auth/__init__.py Normal file
@@ -0,0 +1,49 @@
from .service import (
    ALGORITHM,
    AuthService,
    authenticate_user_db,
    create_access_token,
    get_current_active_user,
    get_current_user,
    get_password_hash,
    has_users,
    register_user,
    request_password_reset,
    reset_password_with_token,
    verify_password,
    verify_password_reset_token,
)
from .types import (
    PasswordResetConfirm,
    PasswordResetRequest,
    RegisterRequest,
    Token,
    TokenData,
    UpdateMeRequest,
    User,
    UserInDB,
)

__all__ = [
    "ALGORITHM",
    "AuthService",
    "authenticate_user_db",
    "create_access_token",
    "get_current_active_user",
    "get_current_user",
    "get_password_hash",
    "has_users",
    "register_user",
    "request_password_reset",
    "reset_password_with_token",
    "verify_password",
    "verify_password_reset_token",
    "PasswordResetConfirm",
    "PasswordResetRequest",
    "RegisterRequest",
    "Token",
    "TokenData",
    "UpdateMeRequest",
    "User",
    "UserInDB",
]
@@ -5,8 +5,8 @@ from fastapi.security import OAuth2PasswordRequestForm

 from api.response import success
 from domain.audit import AuditAction, audit
-from domain.auth.service import AuthService, get_current_active_user
-from domain.auth.types import (
+from .service import AuthService, get_current_active_user
+from .types import (
     PasswordResetConfirm,
     PasswordResetRequest,
     RegisterRequest,
@@ -11,7 +11,9 @@ from fastapi import Depends, HTTPException, status
 from fastapi.security import OAuth2PasswordBearer, OAuth2PasswordRequestForm
 from jwt.exceptions import InvalidTokenError

-from domain.auth.types import (
+from domain.config import ConfigService
+from models.database import UserAccount
+from .types import (
     PasswordResetConfirm,
     PasswordResetRequest,
     RegisterRequest,
@@ -21,8 +23,6 @@ from domain.auth.types import (
     User,
     UserInDB,
 )
-from models.database import UserAccount
-from domain.config.service import ConfigService

 ALGORITHM = "HS256"
 ACCESS_TOKEN_EXPIRE_MINUTES = 60 * 24 * 365
@@ -324,7 +324,7 @@ class AuthService:

     @classmethod
     async def _send_password_reset_email(cls, user: UserAccount, token: str) -> None:
-        from domain.email.service import EmailService
+        from domain.email import EmailService

         app_domain = await ConfigService.get("APP_DOMAIN", None)
         base_url = (app_domain or "http://localhost:5173").rstrip("/")
@@ -1 +1,7 @@
 from .service import BackupService
+from .types import BackupData
+
+__all__ = [
+    "BackupService",
+    "BackupData",
+]
@@ -4,8 +4,8 @@ from fastapi import APIRouter, Depends, File, Request, UploadFile
 from fastapi.responses import JSONResponse

 from domain.audit import AuditAction, audit
-from domain.auth.service import get_current_active_user
-from domain.backup.service import BackupService
+from domain.auth import get_current_active_user
+from .service import BackupService

 router = APIRouter(
     prefix="/api/backup",
@@ -4,8 +4,8 @@ from datetime import datetime
 from fastapi import HTTPException
 from tortoise.transactions import in_transaction

-from domain.backup.types import BackupData
-from domain.config.service import VERSION
+from domain.config import VERSION
+from .types import BackupData
 from models.database import (
     AIDefaultModel,
     AIModel,
10 domain/config/__init__.py Normal file
@@ -0,0 +1,10 @@
from .service import ConfigService, VERSION
from .types import ConfigItem, LatestVersionInfo, SystemStatus

__all__ = [
    "ConfigService",
    "VERSION",
    "ConfigItem",
    "LatestVersionInfo",
    "SystemStatus",
]
@@ -4,10 +4,9 @@ from fastapi import APIRouter, Depends, Form, Request

 from api.response import success
 from domain.audit import AuditAction, audit
-from domain.auth.service import get_current_active_user
-from domain.auth.types import User
-from domain.config.service import ConfigService
-from domain.config.types import ConfigItem
+from domain.auth import User, get_current_active_user
+from .service import ConfigService
+from .types import ConfigItem

 router = APIRouter(prefix="/api/config", tags=["config"])
@@ -5,12 +5,12 @@ from typing import Any, Dict, Optional
 import httpx
 from dotenv import load_dotenv

-from domain.config.types import LatestVersionInfo, SystemStatus
+from .types import LatestVersionInfo, SystemStatus
 from models.database import Configuration, UserAccount

 load_dotenv(dotenv_path=".env")

-VERSION = "v1.5.4"
+VERSION = "v1.7.1"


 class ConfigService:
domain/email/__init__.py (new file, +20)
@@ -0,0 +1,20 @@
from .service import EmailService, EmailTemplateRenderer
from .types import (
    EmailConfig,
    EmailSecurity,
    EmailSendPayload,
    EmailTemplatePreviewPayload,
    EmailTemplateUpdate,
    EmailTestRequest,
)

__all__ = [
    "EmailService",
    "EmailTemplateRenderer",
    "EmailConfig",
    "EmailSecurity",
    "EmailSendPayload",
    "EmailTemplatePreviewPayload",
    "EmailTemplateUpdate",
    "EmailTestRequest",
]

@@ -2,10 +2,9 @@ from fastapi import APIRouter, Depends, HTTPException, Request

 from api.response import success
 from domain.audit import AuditAction, audit
-from domain.auth.service import get_current_active_user
-from domain.auth.types import User
-from domain.email.service import EmailService, EmailTemplateRenderer
-from domain.email.types import (
+from domain.auth import User, get_current_active_user
+from .service import EmailService, EmailTemplateRenderer
+from .types import (
     EmailTemplatePreviewPayload,
     EmailTemplateUpdate,
     EmailTestRequest,

@@ -7,8 +7,8 @@ from pathlib import Path
 from string import Template
 from typing import Any, Dict, List, Optional

-from domain.config.service import ConfigService
-from domain.email.types import EmailConfig, EmailSecurity, EmailSendPayload
+from domain.config import ConfigService
+from .types import EmailConfig, EmailSecurity, EmailSendPayload


 class EmailTemplateRenderer:

@@ -104,7 +104,7 @@ class EmailService:
         template: str,
         context: Optional[Dict[str, Any]] = None,
     ):
-        from domain.tasks.task_queue import TaskProgress, task_queue_service
+        from domain.tasks import TaskProgress, task_queue_service

         payload = EmailSendPayload(
             recipients=recipients,

@@ -126,7 +126,7 @@ class EmailService:

     @classmethod
     async def send_from_task(cls, task_id: str, data: Dict[str, Any]):
-        from domain.tasks.task_queue import TaskProgress, task_queue_service
+        from domain.tasks import TaskProgress, task_queue_service

         payload = EmailSendPayload(**data)
domain/offline_downloads/__init__.py (new file, +7)
@@ -0,0 +1,7 @@
from .service import OfflineDownloadService
from .types import OfflineDownloadCreate

__all__ = [
    "OfflineDownloadService",
    "OfflineDownloadCreate",
]

@@ -4,10 +4,9 @@ from fastapi import APIRouter, Depends, Request

 from api.response import success
 from domain.audit import AuditAction, audit
-from domain.auth.service import get_current_active_user
-from domain.auth.types import User
-from domain.offline_downloads.service import OfflineDownloadService
-from domain.offline_downloads.types import OfflineDownloadCreate
+from domain.auth import User, get_current_active_user
+from .service import OfflineDownloadService
+from .types import OfflineDownloadCreate

 CurrentUser = Annotated[User, Depends(get_current_active_user)]


@@ -7,11 +7,10 @@ import aiofiles
 import aiohttp
 from fastapi import Depends, HTTPException

-from domain.auth.service import get_current_active_user
-from domain.auth.types import User
-from domain.offline_downloads.types import OfflineDownloadCreate
-from domain.virtual_fs.service import VirtualFSService
-from domain.tasks.task_queue import Task, TaskProgress, task_queue_service
+from domain.auth import User, get_current_active_user
+from domain.tasks import Task, TaskProgress, task_queue_service
+from domain.virtual_fs import VirtualFSService
+from .types import OfflineDownloadCreate


 class OfflineDownloadService:
@@ -1 +1,17 @@
+"""
+Foxel plugin system.
+
+Provides installation, management, and runtime loading of .foxpkg plugin packages.
+"""
+
+from .loader import PluginLoadError, PluginLoader
+from .service import PluginService
+from .startup import init_plugins, load_installed_plugins
+
+__all__ = [
+    "PluginLoader",
+    "PluginLoadError",
+    "PluginService",
+    "init_plugins",
+    "load_installed_plugins",
+]
@@ -1,76 +1,111 @@
 """
 Plugin management API routes.
 """

 from typing import List

-from fastapi import APIRouter, Body, Request
+from fastapi import APIRouter, File, Request, UploadFile
 from fastapi.responses import FileResponse

 from domain.audit import AuditAction, audit
-from domain.plugins.service import PluginService
-from domain.plugins.routes import video_player as video_player_routes
-from domain.plugins.types import PluginCreate, PluginManifestUpdate, PluginOut
+from .service import PluginService
+from .types import (
+    PluginInstallResult,
+    PluginOut,
+)

 router = APIRouter(prefix="/api/plugins", tags=["plugins"])
-router.include_router(video_player_routes.router)


-@router.post("", response_model=PluginOut)
-@audit(
-    action=AuditAction.CREATE,
-    description="创建插件",
-    body_fields=["url", "enabled"],
-)
-async def create_plugin(request: Request, payload: PluginCreate):
-    return await PluginService.create(payload)
+# ========== Install ==========


+@router.post("/install", response_model=PluginInstallResult)
+@audit(action=AuditAction.CREATE, description="安装插件包")
+async def install_plugin(request: Request, file: UploadFile = File(...)):
+    """
+    Install a .foxpkg plugin package.
+
+    Upload a .foxpkg file to install it.
+    """
+    content = await file.read()
+    return await PluginService.install_package(content, file.filename or "plugin.foxpkg")


+# ========== Plugin list and details ==========


 @router.get("", response_model=List[PluginOut])
 @audit(action=AuditAction.READ, description="获取插件列表")
 async def list_plugins(request: Request):
     """List installed plugins."""
     return await PluginService.list_plugins()


-@router.delete("/{plugin_id}")
-@audit(action=AuditAction.DELETE, description="删除插件")
-async def delete_plugin(request: Request, plugin_id: int):
-    await PluginService.delete(plugin_id)
+@router.get("/{key_or_id}", response_model=PluginOut)
+@audit(action=AuditAction.READ, description="获取插件详情")
+async def get_plugin(request: Request, key_or_id: str):
+    """Get details for a single plugin."""
+    return await PluginService.get_plugin(key_or_id)


+# ========== Plugin management ==========


+@router.delete("/{key_or_id}")
+@audit(action=AuditAction.DELETE, description="卸载插件")
+async def delete_plugin(request: Request, key_or_id: str):
+    """Uninstall a plugin."""
+    await PluginService.delete(key_or_id)
     return {"code": 0, "msg": "ok"}


-@router.put("/{plugin_id}", response_model=PluginOut)
-@audit(
-    action=AuditAction.UPDATE,
-    description="更新插件",
-    body_fields=["url", "enabled"],
-)
-async def update_plugin(request: Request, plugin_id: int, payload: PluginCreate):
-    return await PluginService.update(plugin_id, payload)
+# ========== Plugin assets ==========


-@router.post("/{plugin_id}/metadata", response_model=PluginOut)
-@audit(
-    action=AuditAction.UPDATE,
-    description="更新插件 manifest",
-    body_fields=[
-        "key",
-        "name",
-        "version",
-        "open_app",
-        "supported_exts",
-        "default_bounds",
-        "default_maximized",
-        "icon",
-        "description",
-        "author",
-        "website",
-        "github",
-    ],
-)
-async def update_manifest(
-    request: Request, plugin_id: int, manifest: PluginManifestUpdate = Body(...)
-):
-    return await PluginService.update_manifest(plugin_id, manifest)
+@router.get("/{key_or_id}/bundle.js")
+async def get_bundle(request: Request, key_or_id: str):
+    """Serve the plugin's frontend bundle."""
+    path = await PluginService.get_bundle_path(key_or_id)
+    v = (request.query_params.get("v") or "").strip()
+    cache_control = "public, max-age=31536000, immutable" if v else "no-cache"
+    return FileResponse(
+        path,
+        media_type="application/javascript",
+        headers={"Cache-Control": cache_control},
+    )


-@router.get("/{plugin_id}/bundle.js")
-async def get_bundle(request: Request, plugin_id: int):
-    path = await PluginService.get_bundle_path(plugin_id)
-    return FileResponse(path, media_type="application/javascript", headers={"Cache-Control": "no-store"})
+@router.get("/{key}/assets/{asset_path:path}")
+async def get_asset(request: Request, key: str, asset_path: str):
+    """Serve a plugin's static assets."""
+    path = await PluginService.get_asset_path(key, asset_path)
+
+    # Pick the MIME type from the file extension
+    ext = path.suffix.lower()
+    media_types = {
+        ".js": "application/javascript",
+        ".css": "text/css",
+        ".json": "application/json",
+        ".svg": "image/svg+xml",
+        ".png": "image/png",
+        ".jpg": "image/jpeg",
+        ".jpeg": "image/jpeg",
+        ".gif": "image/gif",
+        ".webp": "image/webp",
+        ".ico": "image/x-icon",
+        ".woff": "font/woff",
+        ".woff2": "font/woff2",
+        ".ttf": "font/ttf",
+        ".eot": "application/vnd.ms-fontobject",
+        ".html": "text/html",
+        ".txt": "text/plain",
+        ".md": "text/markdown",
+    }
+    media_type = media_types.get(ext, "application/octet-stream")
+
+    return FileResponse(
+        path,
+        media_type=media_type,
+        headers={"Cache-Control": "public, max-age=3600"},
+    )
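The new `/install` endpoint above accepts an uploaded `.foxpkg`, which the loader treats as a ZIP archive containing a `manifest.json`. As a hedged illustration (the manifest fields mirror the validation rules in this diff; `build_foxpkg` is a hypothetical helper, not part of Foxel), such a package could be assembled in memory like this:

```python
import io
import json
import zipfile

def build_foxpkg(key: str, name: str, bundle_js: str) -> bytes:
    """Assemble a minimal .foxpkg: a ZIP with manifest.json and a frontend bundle."""
    manifest = {
        "key": key,  # namespaced key, e.g. com.example.plugin
        "name": name,
        "version": "0.1.0",
        "frontend": {"entry": "frontend/index.js"},
    }
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("manifest.json", json.dumps(manifest))
        zf.writestr("frontend/index.js", bundle_js)
    return buf.getvalue()

pkg = build_foxpkg("com.example.demo", "Demo", "export default {};")
# These bytes would then be uploaded as the multipart file field of POST /api/plugins/install.
with zipfile.ZipFile(io.BytesIO(pkg)) as zf:
    print(sorted(zf.namelist()))  # → ['frontend/index.js', 'manifest.json']
```
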
domain/plugins/loader.py (new file, +449)
@@ -0,0 +1,449 @@
"""
Plugin loader module.

Responsible for:
1. Unpacking and validating .foxpkg packages
2. Deploying plugin files
3. Dynamically loading backend routes
4. Dynamically registering processors
"""

import io
import json
import shutil
import sys
import zipfile
from importlib.util import module_from_spec, spec_from_file_location
from pathlib import Path
from types import ModuleType
from typing import Any, Dict, List, Optional, Tuple

from fastapi import APIRouter

from .types import (
    ManifestProcessorConfig,
    ManifestRouteConfig,
    PluginManifest,
)


class PluginLoadError(Exception):
    """Plugin load error."""

    pass


class PluginLoader:
    """Plugin loader."""

    PLUGINS_ROOT = Path("data/plugins")

    # Cache of loaded plugin modules
    _loaded_modules: Dict[str, ModuleType] = {}
    # Tracks mounted routers
    _mounted_routers: Dict[str, List[APIRouter]] = {}

    @classmethod
    def get_plugin_dir(cls, plugin_key: str) -> Path:
        """Return the plugin directory."""
        return cls.PLUGINS_ROOT / plugin_key

    @classmethod
    def get_manifest_path(cls, plugin_key: str) -> Path:
        """Return the path of the plugin's manifest.json."""
        return cls.get_plugin_dir(plugin_key) / "manifest.json"

    @classmethod
    def get_frontend_bundle_path(cls, plugin_key: str, entry: Optional[str] = None) -> Path:
        """Return the frontend bundle path."""
        plugin_dir = cls.get_plugin_dir(plugin_key)
        if entry:
            return plugin_dir / entry
        # Default location
        return plugin_dir / "frontend" / "index.js"

    @classmethod
    def get_asset_path(cls, plugin_key: str, asset_path: str) -> Path:
        """Return a static asset path."""
        return cls.get_plugin_dir(plugin_key) / asset_path

    # ========== Unpacking and validation ==========

    @classmethod
    def validate_manifest(cls, manifest_data: Dict[str, Any]) -> Tuple[bool, List[str]]:
        """Validate manifest data."""
        errors: List[str] = []

        # Required-field checks
        if not manifest_data.get("key"):
            errors.append("manifest 缺少必需字段: key")
        if not manifest_data.get("name"):
            errors.append("manifest 缺少必需字段: name")

        # Key format check (Java-style namespace)
        key = manifest_data.get("key", "")
        if key:
            import re

            # Format: com.example.plugin (at least two parts; each part starts with a
            # lowercase letter and may contain lowercase letters and digits)
            if not re.match(r"^[a-z][a-z0-9]*(\.[a-z][a-z0-9]*)+$", key):
                errors.append(
                    "key 格式无效:必须使用命名空间格式(如 com.example.plugin),"
                    "每个部分以小写字母开头,只能包含小写字母和数字,至少两级"
                )

        # Version format check (basic)
        version = manifest_data.get("version", "")
        if version and not isinstance(version, str):
            errors.append("version 必须是字符串")

        # Validate the frontend section
        frontend = manifest_data.get("frontend")
        if frontend and isinstance(frontend, dict):
            if frontend.get("entry") and not isinstance(frontend["entry"], str):
                errors.append("frontend.entry 必须是字符串")
            if frontend.get("styles") is not None:
                if not isinstance(frontend["styles"], list) or not all(
                    isinstance(x, str) for x in frontend["styles"]
                ):
                    errors.append("frontend.styles 必须是字符串数组")
            supported_exts = frontend.get("supportedExts") or frontend.get("supported_exts")
            if supported_exts and not isinstance(supported_exts, list):
                errors.append("frontend.supportedExts 必须是数组")
            use_system_window = frontend.get("useSystemWindow") or frontend.get("use_system_window")
            if use_system_window is not None and not isinstance(use_system_window, bool):
                errors.append("frontend.useSystemWindow 必须是布尔值")

        # Validate the backend section
        backend = manifest_data.get("backend")
        if backend and isinstance(backend, dict):
            routes = backend.get("routes", [])
            if routes:
                for i, route in enumerate(routes):
                    if not route.get("module"):
                        errors.append(f"backend.routes[{i}] 缺少 module")
                    if not route.get("prefix"):
                        errors.append(f"backend.routes[{i}] 缺少 prefix")

            processors = backend.get("processors", [])
            if processors:
                for i, proc in enumerate(processors):
                    if not proc.get("module"):
                        errors.append(f"backend.processors[{i}] 缺少 module")
                    if not proc.get("type"):
                        errors.append(f"backend.processors[{i}] 缺少 type")

        return len(errors) == 0, errors

    @classmethod
    def unpack_foxpkg(
        cls, file_content: bytes, target_key: Optional[str] = None
    ) -> Tuple[PluginManifest, Path]:
        """
        Unpack a .foxpkg file.

        Args:
            file_content: contents of the .foxpkg file
            target_key: optional plugin key to install under (overrides the key in the manifest)

        Returns:
            A (manifest, plugin_dir) tuple.

        Raises:
            PluginLoadError: unpacking or validation failed
        """
        try:
            with zipfile.ZipFile(io.BytesIO(file_content)) as zf:
                # Read manifest.json
                try:
                    manifest_bytes = zf.read("manifest.json")
                except KeyError:
                    raise PluginLoadError("插件包缺少 manifest.json")

                try:
                    manifest_data = json.loads(manifest_bytes.decode("utf-8"))
                except json.JSONDecodeError as e:
                    raise PluginLoadError(f"manifest.json 解析失败: {e}")

                # Validate the manifest
                valid, errors = cls.validate_manifest(manifest_data)
                if not valid:
                    raise PluginLoadError(f"manifest 验证失败: {'; '.join(errors)}")

                # Parse the manifest
                try:
                    manifest = PluginManifest.model_validate(manifest_data)
                except Exception as e:
                    raise PluginLoadError(f"manifest 解析失败: {e}")

                # Determine the plugin key
                plugin_key = target_key or manifest.key

                # Validate the files inside the package
                cls._validate_package_files(zf, manifest)

                # Deploy files
                target_dir = cls.PLUGINS_ROOT / plugin_key
                if target_dir.exists():
                    # Back up the old version
                    backup_dir = cls.PLUGINS_ROOT / f"{plugin_key}.backup"
                    if backup_dir.exists():
                        shutil.rmtree(backup_dir)
                    shutil.move(str(target_dir), str(backup_dir))

                target_dir.mkdir(parents=True, exist_ok=True)

                try:
                    zf.extractall(target_dir)
                except Exception as e:
                    # Restore the backup
                    if (cls.PLUGINS_ROOT / f"{plugin_key}.backup").exists():
                        shutil.rmtree(target_dir, ignore_errors=True)
                        shutil.move(str(cls.PLUGINS_ROOT / f"{plugin_key}.backup"), str(target_dir))
                    raise PluginLoadError(f"文件解压失败: {e}")

                # Clean up the backup
                backup_dir = cls.PLUGINS_ROOT / f"{plugin_key}.backup"
                if backup_dir.exists():
                    shutil.rmtree(backup_dir, ignore_errors=True)

                return manifest, target_dir

        except zipfile.BadZipFile:
            raise PluginLoadError("无效的插件包格式(非 ZIP 文件)")

    @classmethod
    def _validate_package_files(cls, zf: zipfile.ZipFile, manifest: PluginManifest) -> None:
        """Verify the package contains every referenced file."""
        file_list = zf.namelist()

        # Check the frontend entry
        if manifest.frontend and manifest.frontend.entry:
            if manifest.frontend.entry not in file_list:
                raise PluginLoadError(f"前端入口文件不存在: {manifest.frontend.entry}")

        # Check the backend modules
        if manifest.backend:
            if manifest.backend.routes:
                for route in manifest.backend.routes:
                    if route.module not in file_list:
                        raise PluginLoadError(f"路由模块不存在: {route.module}")

            if manifest.backend.processors:
                for proc in manifest.backend.processors:
                    if proc.module not in file_list:
                        raise PluginLoadError(f"处理器模块不存在: {proc.module}")

    # ========== Dynamic route loading ==========

    @classmethod
    def load_route_module(cls, plugin_key: str, route_config: ManifestRouteConfig) -> APIRouter:
        """
        Dynamically load a plugin route module.

        Args:
            plugin_key: plugin identifier
            route_config: route configuration

        Returns:
            The loaded APIRouter.
        """
        module_path = cls.get_plugin_dir(plugin_key) / route_config.module

        if not module_path.exists():
            raise PluginLoadError(f"路由模块不存在: {module_path}")

        module_name = f"foxel_plugin_{plugin_key}_route_{module_path.stem}"

        try:
            spec = spec_from_file_location(module_name, module_path)
            if spec is None or spec.loader is None:
                raise PluginLoadError(f"无法加载路由模块: {module_path}")

            module = module_from_spec(spec)
            sys.modules[module_name] = module
            spec.loader.exec_module(module)

            # Cache the module
            cls._loaded_modules[f"{plugin_key}:route:{route_config.module}"] = module

            # Fetch the router
            router = getattr(module, "router", None)
            if router is None:
                raise PluginLoadError(f"路由模块缺少 'router' 对象: {module_path}")

            if not isinstance(router, APIRouter):
                raise PluginLoadError(f"'router' 不是有效的 APIRouter 实例: {module_path}")

            # Wrap the router to add the prefix
            wrapper = APIRouter(prefix=route_config.prefix, tags=route_config.tags or [])
            wrapper.include_router(router)

            return wrapper

        except PluginLoadError:
            raise
        except Exception as e:
            raise PluginLoadError(f"加载路由模块失败 [{module_path}]: {e}")

    @classmethod
    def load_all_routes(cls, plugin_key: str, manifest: PluginManifest) -> List[APIRouter]:
        """Load all routes of a plugin."""
        routers: List[APIRouter] = []

        if not manifest.backend or not manifest.backend.routes:
            return routers

        for route_config in manifest.backend.routes:
            router = cls.load_route_module(plugin_key, route_config)
            routers.append(router)

        cls._mounted_routers[plugin_key] = routers
        return routers

    # ========== Dynamic processor registration ==========

    @classmethod
    def load_processor_module(
        cls, plugin_key: str, processor_config: ManifestProcessorConfig
    ) -> None:
        """
        Dynamically load and register a processor module.

        Args:
            plugin_key: plugin identifier
            processor_config: processor configuration
        """
        module_path = cls.get_plugin_dir(plugin_key) / processor_config.module

        if not module_path.exists():
            raise PluginLoadError(f"处理器模块不存在: {module_path}")

        module_name = f"foxel_plugin_{plugin_key}_processor_{module_path.stem}"

        try:
            spec = spec_from_file_location(module_name, module_path)
            if spec is None or spec.loader is None:
                raise PluginLoadError(f"无法加载处理器模块: {module_path}")

            module = module_from_spec(spec)
            sys.modules[module_name] = module
            spec.loader.exec_module(module)

            # Cache the module
            cls._loaded_modules[f"{plugin_key}:processor:{processor_config.module}"] = module

            # Fetch the processor factory
            factory = getattr(module, "PROCESSOR_FACTORY", None)
            if factory is None:
                raise PluginLoadError(f"处理器模块缺少 'PROCESSOR_FACTORY': {module_path}")

            # Fetch the config schema
            config_schema = getattr(module, "CONFIG_SCHEMA", [])
            processor_name = getattr(module, "PROCESSOR_NAME", processor_config.name or processor_config.type)
            supported_exts = getattr(module, "SUPPORTED_EXTS", [])

            # Register in the processor registry
            from domain.processors import CONFIG_SCHEMAS, TYPE_MAP

            processor_type = processor_config.type
            TYPE_MAP[processor_type] = factory

            # Instantiate once to read attributes
            try:
                sample = factory()
                produces_file = getattr(sample, "produces_file", False)
                supports_directory = getattr(sample, "supports_directory", False)
            except Exception:
                produces_file = False
                supports_directory = False

            CONFIG_SCHEMAS[processor_type] = {
                "type": processor_type,
                "name": processor_name,
                "supported_exts": supported_exts,
                "config_schema": config_schema,
                "produces_file": produces_file,
                "supports_directory": supports_directory,
                "plugin": plugin_key,  # marks the plugin of origin
                "module_path": str(module_path),
            }

        except PluginLoadError:
            raise
        except Exception as e:
            raise PluginLoadError(f"加载处理器模块失败 [{module_path}]: {e}")

    @classmethod
    def load_all_processors(cls, plugin_key: str, manifest: PluginManifest) -> List[str]:
        """Load all of a plugin's processors and return the list of processor types."""
        processor_types: List[str] = []

        if not manifest.backend or not manifest.backend.processors:
            return processor_types

        for proc_config in manifest.backend.processors:
            cls.load_processor_module(plugin_key, proc_config)
            processor_types.append(proc_config.type)

        return processor_types

    # ========== Unloading ==========

    @classmethod
    def unload_plugin(cls, plugin_key: str, manifest: Optional[PluginManifest] = None) -> None:
        """
        Unload a plugin's backend components.

        Args:
            plugin_key: plugin identifier
            manifest: optional manifest used to determine which components to unload
        """
        # Unregister processors
        if manifest and manifest.backend and manifest.backend.processors:
            from domain.processors import CONFIG_SCHEMAS, TYPE_MAP

            for proc_config in manifest.backend.processors:
                proc_type = proc_config.type
                if proc_type in TYPE_MAP:
                    del TYPE_MAP[proc_type]
                if proc_type in CONFIG_SCHEMAS:
                    del CONFIG_SCHEMAS[proc_type]

        # Drop cached modules
        keys_to_remove = [k for k in cls._loaded_modules if k.startswith(f"{plugin_key}:")]
        for key in keys_to_remove:
            module = cls._loaded_modules.pop(key, None)
            if module and module.__name__ in sys.modules:
                del sys.modules[module.__name__]

        # Clear router tracking (note: FastAPI cannot remove routes dynamically; a restart is required)
        cls._mounted_routers.pop(plugin_key, None)

    @classmethod
    def delete_plugin_files(cls, plugin_key: str) -> None:
        """Delete the plugin's files."""
        plugin_dir = cls.get_plugin_dir(plugin_key)
        if plugin_dir.exists():
            shutil.rmtree(plugin_dir)

        # Also delete any backup
        backup_dir = cls.PLUGINS_ROOT / f"{plugin_key}.backup"
        if backup_dir.exists():
            shutil.rmtree(backup_dir)

    # ========== Reading the manifest ==========

    @classmethod
    def read_manifest(cls, plugin_key: str) -> Optional[PluginManifest]:
        """Read a plugin manifest from the filesystem."""
        manifest_path = cls.get_manifest_path(plugin_key)
        if not manifest_path.exists():
            return None

        try:
            with open(manifest_path, "r", encoding="utf-8") as f:
                data = json.load(f)
            return PluginManifest.model_validate(data)
        except Exception:
            return None
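The loader's `validate_manifest` above enforces a Java-style namespace for plugin keys with a single regular expression. A quick standalone check of that rule (the pattern is copied verbatim from the diff):

```python
import re

# Same pattern as validate_manifest: at least two dot-separated parts,
# each starting with a lowercase letter, lowercase letters and digits only.
KEY_RE = re.compile(r"^[a-z][a-z0-9]*(\.[a-z][a-z0-9]*)+$")

# Accepted: namespaced, lowercase keys
for key in ["com.example.plugin", "foxel.videoplayer2"]:
    assert KEY_RE.match(key)

# Rejected: single part, uppercase, empty part, part starting with a digit
for key in ["plugin", "Com.Example", "com..x", "com.3d"]:
    assert not KEY_RE.match(key)

print("key format checks passed")
```
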
@@ -1,2 +0,0 @@
-"""Plugin-specific server-side route collection."""
-
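`load_route_module` in the loader above imports a plugin's route file by filesystem path via `importlib`, registers it in `sys.modules`, and then pulls attributes off the resulting module. The core mechanism, reduced to a standalone sketch (the module name and `VALUE` attribute here are illustrative only):

```python
import sys
import tempfile
from importlib.util import module_from_spec, spec_from_file_location
from pathlib import Path

# Write a tiny module to disk, then load it by file path the way the loader does.
src = Path(tempfile.mkdtemp()) / "routes.py"
src.write_text("VALUE = 42\n", encoding="utf-8")

spec = spec_from_file_location("foxel_plugin_demo_route_routes", src)
module = module_from_spec(spec)
sys.modules[spec.name] = module   # register before executing, as the loader does
spec.loader.exec_module(module)

print(module.VALUE)  # → 42
```
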
@@ -1,142 +0,0 @@
import json
from datetime import UTC, datetime
from pathlib import Path
from typing import Any, Dict, List, Optional

from fastapi import APIRouter, Depends, HTTPException, Query

from api.response import success
from domain.auth.service import get_current_active_user


router = APIRouter(
    prefix="/video-player",
    tags=["plugins"],
    dependencies=[Depends(get_current_active_user)],
)

DATA_ROOT = Path("data/.video")


def _read_json(path: Path) -> Dict[str, Any]:
    return json.loads(path.read_text(encoding="utf-8"))


def _file_mtime_iso(path: Path) -> str:
    try:
        ts = path.stat().st_mtime
    except FileNotFoundError:
        return ""
    return datetime.fromtimestamp(ts, tz=UTC).isoformat()


def _extract_title(payload: Dict[str, Any]) -> str:
    detail = (payload.get("tmdb") or {}).get("detail") or {}
    if payload.get("type") == "tv":
        return str(detail.get("name") or detail.get("original_name") or "")
    return str(detail.get("title") or detail.get("original_title") or "")


def _extract_year(payload: Dict[str, Any]) -> Optional[str]:
    detail = (payload.get("tmdb") or {}).get("detail") or {}
    value = detail.get("first_air_date") if payload.get("type") == "tv" else detail.get("release_date")
    if not value or not isinstance(value, str):
        return None
    return value[:4] if len(value) >= 4 else value


def _extract_genres(payload: Dict[str, Any]) -> List[str]:
    detail = (payload.get("tmdb") or {}).get("detail") or {}
    genres = detail.get("genres") or []
    out: List[str] = []
    if isinstance(genres, list):
        for g in genres:
            if isinstance(g, dict) and g.get("name"):
                out.append(str(g["name"]))
    return out


def _summarize(item_id: str, payload: Dict[str, Any], mtime_iso: str) -> Dict[str, Any]:
    detail = (payload.get("tmdb") or {}).get("detail") or {}
    media_type = payload.get("type") or "unknown"
    episodes = payload.get("episodes") or []
    seasons = {e.get("season") for e in episodes if isinstance(e, dict) and e.get("season") is not None}

    return {
        "id": item_id,
        "type": media_type,
        "title": _extract_title(payload),
        "year": _extract_year(payload),
        "overview": detail.get("overview"),
        "poster_path": detail.get("poster_path"),
        "backdrop_path": detail.get("backdrop_path"),
        "genres": _extract_genres(payload),
        "tmdb_id": (payload.get("tmdb") or {}).get("id"),
        "source_path": payload.get("source_path"),
        "scraped_at": payload.get("scraped_at"),
        "updated_at": mtime_iso,
        "episodes_count": len(episodes) if isinstance(episodes, list) else 0,
        "seasons_count": len(seasons),
        "vote_average": detail.get("vote_average"),
        "vote_count": detail.get("vote_count"),
    }


def _iter_library_files() -> List[tuple[str, Path]]:
    files: List[tuple[str, Path]] = []
    for sub in ("tv", "movie"):
        folder = DATA_ROOT / sub
        if not folder.exists():
            continue
        for p in folder.glob("*.json"):
            if not p.is_file():
                continue
            files.append((sub, p))
    return files


@router.get("/library")
async def list_library(
    q: str | None = Query(None, description="搜索关键字(标题/简介)"),
    media_type: str | None = Query(None, alias="type", description="tv 或 movie"),
):
    items: List[Dict[str, Any]] = []
    keyword = (q or "").strip().lower()
    type_filter = (media_type or "").strip().lower()
    if type_filter and type_filter not in {"tv", "movie"}:
        raise HTTPException(status_code=400, detail="type must be tv or movie")

    for _sub, path in _iter_library_files():
        item_id = path.stem
        try:
            payload = _read_json(path)
        except Exception:
            continue
        if type_filter and str(payload.get("type") or "").lower() != type_filter:
            continue
        summary = _summarize(item_id, payload, _file_mtime_iso(path))
        if keyword:
            haystack = f"{summary.get('title') or ''} {summary.get('overview') or ''}".lower()
            if keyword not in haystack:
                continue
        items.append(summary)

    items.sort(key=lambda x: x.get("updated_at") or "", reverse=True)
    return success(items)


@router.get("/library/{item_id}")
async def get_library_item(item_id: str):
    candidates = [
        DATA_ROOT / "tv" / f"{item_id}.json",
        DATA_ROOT / "movie" / f"{item_id}.json",
    ]
    path = next((p for p in candidates if p.exists()), None)
    if not path:
        raise HTTPException(status_code=404, detail="Item not found")

    payload = _read_json(path)
    payload["id"] = item_id
    payload["updated_at"] = _file_mtime_iso(path)
    return success(payload)
@@ -1,138 +1,273 @@
|
||||
"""
|
||||
插件服务模块
|
||||
|
||||
负责插件的安装、卸载等管理操作
|
||||
"""
|
||||
|
||||
import contextlib
|
||||
import re
|
||||
import logging
|
||||
import shutil
|
||||
from pathlib import Path
|
||||
from typing import List, Optional, Union
|
||||
|
||||
import aiofiles
|
||||
import httpx
|
||||
from fastapi import HTTPException
|
||||
|
||||
from domain.plugins.types import PluginCreate, PluginManifestUpdate, PluginOut
|
||||
from .loader import PluginLoadError, PluginLoader
|
||||
from .types import (
|
||||
PluginInstallResult,
|
||||
PluginManifest,
|
||||
PluginOut,
|
||||
)
|
||||
from models.database import Plugin
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||


class PluginService:
    """Plugin service"""

    _plugins_root = Path("data/plugins")

    @classmethod
    def _folder_name(cls, rec: Plugin) -> str:
        if rec.key:
            safe = re.sub(r"[^A-Za-z0-9_.-]", "_", rec.key)
            return safe or str(rec.id)
        return str(rec.id)
    # ========== Utilities ==========

    @classmethod
    def _bundle_dir_from_rec(cls, rec: Plugin) -> Path:
        return cls._plugins_root / cls._folder_name(rec) / "current"
    def _get_plugin_dir(cls, plugin_key: str) -> Path:
        """Get the plugin directory"""
        return cls._plugins_root / plugin_key

    @classmethod
    def _bundle_path_from_rec(cls, rec: Plugin) -> Path:
        return cls._bundle_dir_from_rec(rec) / "index.js"
    def _get_bundle_path(cls, rec: Plugin) -> Path:
        """Get the frontend bundle path"""
        plugin_dir = cls._get_plugin_dir(rec.key)
        # Read from the manifest
        if rec.manifest:
            frontend = rec.manifest.get("frontend", {})
            entry = frontend.get("entry")
            if entry:
                return plugin_dir / entry
        # Default location
        return plugin_dir / "frontend" / "index.js"

    @classmethod
    async def _download_bundle(cls, rec: Plugin, url: str) -> None:
        dest_dir = cls._bundle_dir_from_rec(rec)
        dest_dir.mkdir(parents=True, exist_ok=True)
        dest_path = cls._bundle_path_from_rec(rec)
        tmp_path = dest_path.with_suffix(".tmp")
        try:
            async with httpx.AsyncClient(timeout=30.0, follow_redirects=True) as client:
                async with client.stream("GET", url) as resp:
                    resp.raise_for_status()
                    async with aiofiles.open(tmp_path, "wb") as f:
                        async for chunk in resp.aiter_bytes(chunk_size=65536):
                            if not chunk:
                                continue
                            await f.write(chunk)
            tmp_path.replace(dest_path)
        except Exception:
            with contextlib.suppress(Exception):
                if tmp_path.exists():
                    tmp_path.unlink()
            raise
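`_download_bundle` above streams the bundle to a `.tmp` file and then `replace()`s it over `index.js`, so a failed or interrupted download never leaves a truncated bundle behind. A minimal stdlib-only sketch of the same write-then-rename pattern (synchronous, with hypothetical paths):

```python
import tempfile
from pathlib import Path

def atomic_write(dest: Path, data: bytes) -> None:
    """Write data next to dest, then atomically rename it into place."""
    tmp = dest.with_suffix(".tmp")
    try:
        tmp.write_bytes(data)
        tmp.replace(dest)  # os.replace: atomic within one filesystem
    except Exception:
        tmp.unlink(missing_ok=True)  # drop the partial temp file
        raise

root = Path(tempfile.mkdtemp())          # hypothetical plugin dir
bundle = root / "current" / "index.js"
bundle.parent.mkdir(parents=True, exist_ok=True)
atomic_write(bundle, b"console.log('ok');")
print(bundle.read_text())
```

Readers either see the old bundle or the complete new one, never a partial write.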

    @classmethod
    async def _ensure_bundle(cls, plugin_id: int) -> Path:
        rec = await cls._get_or_404(plugin_id)
        bundle_path = cls._bundle_path_from_rec(rec)
        if bundle_path.exists():
            return bundle_path

        legacy = cls._plugins_root / str(rec.id) / "current" / "index.js"
        if legacy.exists():
            return legacy

        raise HTTPException(status_code=404, detail="Plugin bundle not found")

    @classmethod
    async def get_bundle_path(cls, plugin_id: int) -> Path:
        return await cls._ensure_bundle(plugin_id)

    @classmethod
    async def create(cls, payload: PluginCreate) -> PluginOut:
        rec = await Plugin.create(**payload.model_dump())
        try:
            await cls._download_bundle(rec, rec.url)
        except Exception as exc:
            with contextlib.suppress(Exception):
                await rec.delete()
            raise HTTPException(status_code=400, detail=f"Failed to fetch plugin: {exc}")
        return PluginOut.model_validate(rec)

    @classmethod
    async def list_plugins(cls) -> list[PluginOut]:
        rows = await Plugin.all().order_by("-id")
        return [PluginOut.model_validate(r) for r in rows]

    @classmethod
    async def _get_or_404(cls, plugin_id: int) -> Plugin:
        rec = await Plugin.get_or_none(id=plugin_id)
    async def _get_by_key_or_404(cls, key: str) -> Plugin:
        """Get a plugin by key, raising 404 if it does not exist"""
        rec = await Plugin.get_or_none(key=key)
        if not rec:
            raise HTTPException(status_code=404, detail="Plugin not found")
        return rec

    @classmethod
    async def delete(cls, plugin_id: int) -> None:
        rec = await cls._get_or_404(plugin_id)
        await rec.delete()
        with contextlib.suppress(Exception):
            dirs = {cls._bundle_dir_from_rec(rec).parent, cls._plugins_root / str(rec.id)}
            for plugin_dir in dirs:
                if plugin_dir.exists():
                    shutil.rmtree(plugin_dir)
    async def _get_by_key_or_id(cls, key_or_id: Union[str, int]) -> Plugin:
        """Get a plugin by key or ID"""
        # Try as an ID first
        if isinstance(key_or_id, int) or (isinstance(key_or_id, str) and key_or_id.isdigit()):
            plugin_id = int(key_or_id)
            rec = await Plugin.get_or_none(id=plugin_id)
            if rec:
                return rec
        # Then try as a key
        if isinstance(key_or_id, str):
            rec = await Plugin.get_or_none(key=key_or_id)
            if rec:
                return rec
        raise HTTPException(status_code=404, detail="Plugin not found")
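`_get_by_key_or_id` resolves an identifier that may be either a numeric database ID or a string key: all-digit strings (and ints) are tried as IDs first, then any string falls back to a key lookup. The dispatch rule, sketched against a hypothetical in-memory table:

```python
from typing import Optional, Union

# Hypothetical in-memory stand-ins for the Plugin table.
BY_ID = {1: "plugin-one"}
BY_KEY = {"markdown-viewer": "plugin-two", "42-demo": "plugin-three"}

def resolve(key_or_id: Union[str, int]) -> Optional[str]:
    # Digit strings (and ints) are tried as IDs first...
    if isinstance(key_or_id, int) or (isinstance(key_or_id, str) and key_or_id.isdigit()):
        rec = BY_ID.get(int(key_or_id))
        if rec is not None:
            return rec
    # ...then any string is tried as a key.
    if isinstance(key_or_id, str):
        return BY_KEY.get(key_or_id)
    return None

print(resolve("1"), resolve("markdown-viewer"), resolve("42-demo"))
```

Note that a key like `"42-demo"` is not all digits, so it goes straight to the key lookup and cannot collide with ID 42.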

    # ========== Install ==========

    @classmethod
    async def update(cls, plugin_id: int, payload: PluginCreate) -> PluginOut:
        rec = await cls._get_or_404(plugin_id)
        url_changed = rec.url != payload.url
        if url_changed:
            try:
                await cls._download_bundle(rec, payload.url)
            except Exception as exc:
                raise HTTPException(status_code=400, detail=f"Failed to fetch plugin: {exc}")
        rec.url = payload.url
        rec.enabled = payload.enabled
        await rec.save()
        return PluginOut.model_validate(rec)
    async def install_package(cls, file_content: bytes, filename: str) -> PluginInstallResult:
        """
        Install a .foxpkg plugin package

        Args:
            file_content: the package bytes
            filename: the file name

        Returns:
            The installation result
        """
        errors: List[str] = []

        try:
            # Unpack the package
            manifest, plugin_dir = PluginLoader.unpack_foxpkg(file_content)
            plugin_key = manifest.key

            # Check whether the plugin already exists
            existing = await Plugin.get_or_none(key=plugin_key)
            if existing:
                # Update the existing plugin
                logger.info(f"Updating plugin: {plugin_key}")
                rec = existing
            else:
                # Create a new plugin
                logger.info(f"Installing new plugin: {plugin_key}")
                rec = Plugin(key=plugin_key)

            # Update fields
            rec.name = manifest.name
            rec.version = manifest.version
            rec.description = manifest.description
            rec.author = manifest.author
            rec.website = manifest.website
            rec.github = manifest.github
            rec.license = manifest.license
            rec.manifest = manifest.model_dump(mode="json")

            # Extract frontend configuration from manifest.frontend
            if manifest.frontend:
                rec.open_app = manifest.frontend.open_app or False
                rec.supported_exts = manifest.frontend.supported_exts
                rec.default_bounds = manifest.frontend.default_bounds
                rec.default_maximized = manifest.frontend.default_maximized
                rec.icon = manifest.frontend.icon

    @classmethod
    async def update_manifest(
        cls, plugin_id: int, manifest: PluginManifestUpdate
    ) -> PluginOut:
        rec = await cls._get_or_404(plugin_id)
        old_dir = cls._bundle_dir_from_rec(rec).parent
        updates = manifest.model_dump(exclude_none=True)
        if updates:
            for key, value in updates.items():
                setattr(rec, key, value)
            await rec.save()
        new_dir = cls._bundle_dir_from_rec(rec).parent
        if rec.key and new_dir != old_dir:
            candidate_dir = old_dir if old_dir.exists() else (cls._plugins_root / str(rec.id))
            if candidate_dir.exists():
                new_dir.parent.mkdir(parents=True, exist_ok=True)
                with contextlib.suppress(Exception):
                    if new_dir.exists():
                        shutil.rmtree(new_dir)
                    shutil.move(str(candidate_dir), str(new_dir))

            # Load backend components (if any)
            loaded_routes: List[str] = []
            loaded_processors: List[str] = []

            if manifest.backend:
                # Load routes
                if manifest.backend.routes:
                    try:
                        from main import app
                        routers = PluginLoader.load_all_routes(plugin_key, manifest)
                        for router in routers:
                            app.include_router(router)
                            loaded_routes.append(router.prefix)
                    except PluginLoadError as e:
                        errors.append(f"Route loading failed: {e}")
                        logger.error(f"Plugin {plugin_key}: route loading failed: {e}")
                    except Exception as e:
                        errors.append(f"Route loading failed: {e}")
                        logger.exception(f"Plugin {plugin_key}: route loading error")

                # Load processors
                if manifest.backend.processors:
                    try:
                        processor_types = PluginLoader.load_all_processors(plugin_key, manifest)
                        loaded_processors = processor_types
                    except PluginLoadError as e:
                        errors.append(f"Processor loading failed: {e}")
                        logger.error(f"Plugin {plugin_key}: processor loading failed: {e}")
                    except Exception as e:
                        errors.append(f"Processor loading failed: {e}")
                        logger.exception(f"Plugin {plugin_key}: processor loading error")

            # Update load status
            rec.loaded_routes = loaded_routes if loaded_routes else None
            rec.loaded_processors = loaded_processors if loaded_processors else None
            await rec.save()

            return PluginInstallResult(
                success=True,
                plugin=PluginOut.model_validate(rec),
                message="Installed successfully" if not errors else "Installed, but some components failed to load",
                errors=errors if errors else None,
            )

        except PluginLoadError as e:
            logger.error(f"Plugin installation failed: {e}")
            return PluginInstallResult(
                success=False,
                message=str(e),
                errors=[str(e)],
            )
        except Exception as e:
            logger.exception("Plugin installation error")
            return PluginInstallResult(
                success=False,
                message=f"Installation failed: {e}",
                errors=[str(e)],
            )

    # ========== Queries ==========

    @classmethod
    async def list_plugins(cls) -> List[PluginOut]:
        """List all plugins"""
        rows = await Plugin.all().order_by("-id")
        for rec in rows:
            try:
                manifest = PluginLoader.read_manifest(rec.key)
                if manifest:
                    rec.manifest = manifest.model_dump(mode="json")
            except Exception:
                continue
        return [PluginOut.model_validate(r) for r in rows]

    @classmethod
    async def get_plugin(cls, key_or_id: Union[str, int]) -> PluginOut:
        """Get details for a single plugin"""
        rec = await cls._get_by_key_or_id(key_or_id)
        try:
            manifest = PluginLoader.read_manifest(rec.key)
            if manifest:
                rec.manifest = manifest.model_dump(mode="json")
        except Exception:
            pass
        return PluginOut.model_validate(rec)

    @classmethod
    async def get_bundle_path(cls, key_or_id: Union[str, int]) -> Path:
        """Get the plugin's frontend bundle path"""
        rec = await cls._get_by_key_or_id(key_or_id)
        bundle_path = cls._get_bundle_path(rec)
        if not bundle_path.exists():
            raise HTTPException(status_code=404, detail="Plugin bundle not found")
        return bundle_path

    @classmethod
    async def get_asset_path(cls, key: str, asset_path: str) -> Path:
        """Get the path of a plugin static asset"""
        rec = await cls._get_by_key_or_404(key)
        plugin_dir = cls._get_plugin_dir(rec.key)

        # Security check: prevent path traversal
        asset_path = asset_path.lstrip("/")
        if ".." in asset_path:
            raise HTTPException(status_code=400, detail="Invalid asset path")

        full_path = plugin_dir / asset_path
        if not full_path.exists():
            raise HTTPException(status_code=404, detail="Asset not found")

        # Ensure the path stays inside the plugin directory
        try:
            full_path.resolve().relative_to(plugin_dir.resolve())
        except ValueError:
            raise HTTPException(status_code=400, detail="Invalid asset path")

        return full_path
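`get_asset_path` layers two defenses against path traversal: a quick `..` substring rejection, then a `resolve().relative_to()` containment check that also catches escapes via symlinks or normalization. The same checks in a standalone helper:

```python
import tempfile
from pathlib import Path
from typing import Optional

def safe_asset(root: Path, asset_path: str) -> Optional[Path]:
    asset_path = asset_path.lstrip("/")
    if ".." in asset_path:
        return None  # reject obvious traversal early
    full = (root / asset_path).resolve()
    try:
        full.relative_to(root.resolve())  # must remain inside root
    except ValueError:
        return None
    return full

root = Path(tempfile.mkdtemp()) / "demo-plugin"    # hypothetical plugin dir
root.mkdir(parents=True, exist_ok=True)
print(safe_asset(root, "../secret.txt"))           # None
print(safe_asset(root, "frontend/index.js").name)  # index.js
```

The `lstrip("/")` step also neutralizes absolute paths, since the stripped remainder is joined under `root` rather than replacing it.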

    # ========== Management ==========

    @classmethod
    async def delete(cls, key_or_id: Union[str, int]) -> None:
        """Delete (uninstall) a plugin"""
        rec = await cls._get_by_key_or_id(key_or_id)

        # Get the manifest for unloading components
        manifest: Optional[PluginManifest] = None
        if rec.manifest:
            try:
                manifest = PluginManifest.model_validate(rec.manifest)
            except Exception:
                pass

        # Unload backend components
        if manifest:
            PluginLoader.unload_plugin(rec.key, manifest)

        # Delete the database record
        await rec.delete()

        # Delete files
        with contextlib.suppress(Exception):
            plugin_dir = cls._get_plugin_dir(rec.key)
            if plugin_dir.exists():
                shutil.rmtree(plugin_dir)

        logger.info(f"Plugin {rec.key} uninstalled")
||||
115  domain/plugins/startup.py  Normal file
@@ -0,0 +1,115 @@
"""
Plugin startup loading module

Loads all installed plugins at application startup
"""

import logging
from typing import TYPE_CHECKING, List, Tuple

from .loader import PluginLoadError, PluginLoader
from .types import PluginManifest

if TYPE_CHECKING:
    from fastapi import FastAPI

logger = logging.getLogger(__name__)


async def load_installed_plugins(app: "FastAPI") -> Tuple[int, List[str]]:
    """
    Load all installed plugins

    Args:
        app: the FastAPI application instance

    Returns:
        (number loaded successfully, list of errors)
    """
    from models.database import Plugin

    errors: List[str] = []
    loaded_count = 0

    try:
        plugins = await Plugin.all()
    except Exception as e:
        logger.error(f"Failed to query the plugin list: {e}")
        return 0, [f"Failed to query the plugin list: {e}"]

    for plugin in plugins:
        if not plugin.key:
            continue

        try:
            # Get the manifest
            manifest = None
            if plugin.manifest:
                try:
                    manifest = PluginManifest.model_validate(plugin.manifest)
                except Exception:
                    # Fall back to reading from the filesystem
                    manifest = PluginLoader.read_manifest(plugin.key)
            else:
                manifest = PluginLoader.read_manifest(plugin.key)

            if not manifest:
                logger.warning(f"Plugin {plugin.key} has no manifest, skipping")
                continue

            # Load backend routes
            loaded_routes: List[str] = []
            if manifest.backend and manifest.backend.routes:
                try:
                    routers = PluginLoader.load_all_routes(plugin.key, manifest)
                    for router in routers:
                        app.include_router(router)
                        loaded_routes.append(router.prefix)
                    logger.info(f"Plugin {plugin.key} loaded {len(routers)} routes")
                except PluginLoadError as e:
                    errors.append(f"Plugin {plugin.key}: route loading failed: {e}")
                    logger.error(f"Plugin {plugin.key}: route loading failed: {e}")

            # Load processors
            loaded_processors: List[str] = []
            if manifest.backend and manifest.backend.processors:
                try:
                    processor_types = PluginLoader.load_all_processors(plugin.key, manifest)
                    loaded_processors = processor_types
                    logger.info(f"Plugin {plugin.key} registered {len(processor_types)} processors")
                except PluginLoadError as e:
                    errors.append(f"Plugin {plugin.key}: processor loading failed: {e}")
                    logger.error(f"Plugin {plugin.key}: processor loading failed: {e}")

            # Update the database record
            plugin.loaded_routes = loaded_routes if loaded_routes else None
            plugin.loaded_processors = loaded_processors if loaded_processors else None
            await plugin.save()

            loaded_count += 1
            logger.info(f"Plugin {plugin.key} loaded")

        except Exception as e:
            error_msg = f"Plugin {plugin.key}: load error: {e}"
            errors.append(error_msg)
            logger.exception(error_msg)

    return loaded_count, errors


async def init_plugins(app: "FastAPI") -> None:
    """
    Initialize the plugin system

    Called at application startup
    """
    logger.info("Loading installed plugins...")

    loaded_count, errors = await load_installed_plugins(app)

    if errors:
        logger.warning(f"Plugin loading finished: {loaded_count} succeeded, {len(errors)} errors")
        for error in errors:
            logger.warning(f" - {error}")
    else:
        logger.info(f"Plugin loading finished: {loaded_count} plugins")
@@ -1,48 +1,119 @@
from typing import Any, Dict, List, Optional

from pydantic import AliasChoices, BaseModel, ConfigDict, Field
from pydantic import BaseModel, ConfigDict, Field


class PluginCreate(BaseModel):
    url: str = Field(min_length=1)
    enabled: bool = True
# ========== Manifest types ==========


class PluginManifestUpdate(BaseModel):
class ManifestFrontend(BaseModel):
    """The frontend section of manifest.json"""

    model_config = ConfigDict(populate_by_name=True, extra="ignore")

    key: Optional[str] = None
    name: Optional[str] = None
    version: Optional[str] = None
    entry: Optional[str] = Field(default=None, description="Frontend entry file path")
    styles: Optional[List[str]] = Field(default=None, description="Frontend stylesheet paths (relative to the plugin root)")
    open_app: Optional[bool] = Field(
        default=None,
        validation_alias=AliasChoices("open_app", "openApp"),
        alias="openApp",
        description="Whether the plugin can be opened standalone",
    )
    supported_exts: Optional[List[str]] = Field(
        default=None,
        validation_alias=AliasChoices("supported_exts", "supportedExts"),
        alias="supportedExts",
        description="Supported file extensions",
    )
    default_bounds: Optional[Dict[str, Any]] = Field(
        default=None,
        validation_alias=AliasChoices("default_bounds", "defaultBounds"),
        alias="defaultBounds",
        description="Default window bounds",
    )
    default_maximized: Optional[bool] = Field(
        default=None,
        validation_alias=AliasChoices("default_maximized", "defaultMaximized"),
        alias="defaultMaximized",
        description="Whether to maximize by default",
    )
    icon: Optional[str] = None
    description: Optional[str] = None
    author: Optional[str] = None
    website: Optional[str] = None
    github: Optional[str] = None
    icon: Optional[str] = Field(default=None, description="Icon path")
    use_system_window: Optional[bool] = Field(
        default=None,
        alias="useSystemWindow",
        description="Whether to use the system window",
    )


class ManifestRouteConfig(BaseModel):
    """Route configuration in manifest.json"""

    model_config = ConfigDict(extra="ignore")

    module: str = Field(..., description="Route module path")
    prefix: str = Field(..., description="Route prefix")
    tags: Optional[List[str]] = Field(default=None, description="API tags")


class ManifestProcessorConfig(BaseModel):
    """Processor configuration in manifest.json"""

    model_config = ConfigDict(extra="ignore")

    module: str = Field(..., description="Processor module path")
    type: str = Field(..., description="Processor type identifier")
    name: Optional[str] = Field(default=None, description="Processor display name")


class ManifestBackend(BaseModel):
    """The backend section of manifest.json"""

    model_config = ConfigDict(extra="ignore")

    routes: Optional[List[ManifestRouteConfig]] = Field(default=None, description="Route list")
    processors: Optional[List[ManifestProcessorConfig]] = Field(
        default=None, description="Processor list"
    )


class ManifestDependencies(BaseModel):
    """Dependency configuration in manifest.json"""

    model_config = ConfigDict(extra="ignore")

    python: Optional[str] = Field(default=None, description="Required Python version")
    packages: Optional[List[str]] = Field(default=None, description="Python package dependencies")


class PluginManifest(BaseModel):
    """The full manifest.json structure"""

    model_config = ConfigDict(populate_by_name=True, extra="ignore")

    foxpkg: str = Field(default="1.0", description="foxpkg format version")
    key: str = Field(..., min_length=1, description="Unique plugin key")
    name: str = Field(..., min_length=1, description="Plugin name")
    version: str = Field(default="1.0.0", description="Plugin version")
    description: Optional[str] = Field(default=None, description="Plugin description")
    i18n: Optional[Dict[str, Dict[str, str]]] = Field(
        default=None,
        description="Localized name/description, e.g. {'en': {'name': '...', 'description': '...'}}",
    )
    author: Optional[str] = Field(default=None, description="Author")
    website: Optional[str] = Field(default=None, description="Website")
    github: Optional[str] = Field(default=None, description="GitHub URL")
    license: Optional[str] = Field(default=None, description="License")

    frontend: Optional[ManifestFrontend] = Field(default=None, description="Frontend configuration")
    backend: Optional[ManifestBackend] = Field(default=None, description="Backend configuration")
    dependencies: Optional[ManifestDependencies] = Field(default=None, description="Dependency configuration")


# ========== API request/response types ==========


class PluginOut(BaseModel):
    """Plugin output model"""

    id: int
    url: str
    enabled: bool
    key: str
    open_app: bool = False
    key: Optional[str] = None
    name: Optional[str] = None
    version: Optional[str] = None
    supported_exts: Optional[List[str]] = None
@@ -53,5 +124,20 @@ class PluginOut(BaseModel):
    author: Optional[str] = None
    website: Optional[str] = None
    github: Optional[str] = None
    license: Optional[str] = None

    # New fields
    manifest: Optional[Dict[str, Any]] = None
    loaded_routes: Optional[List[str]] = None
    loaded_processors: Optional[List[str]] = None

    model_config = ConfigDict(from_attributes=True)


class PluginInstallResult(BaseModel):
    """Installation result"""

    success: bool
    plugin: Optional[PluginOut] = None
    message: Optional[str] = None
    errors: Optional[List[str]] = None

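The `AliasChoices(...)` pairs above let a manifest author write either snake_case or camelCase keys (`open_app` or `openApp`) for the same field. What that buys can be illustrated with a plain stdlib normalizer (a hypothetical helper, not pydantic's internals):

```python
import re

def to_snake(name: str) -> str:
    """openApp -> open_app; snake_case names pass through unchanged."""
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

def normalize_keys(manifest: dict) -> dict:
    # Accept camelCase input by folding every key to its snake_case form.
    return {to_snake(k): v for k, v in manifest.items()}

print(normalize_keys({"openApp": True, "supported_exts": [".md"]}))
# → {'open_app': True, 'supported_exts': ['.md']}
```

With `populate_by_name=True`, pydantic additionally keeps the field's Python name usable alongside the serialization alias.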
35  domain/processors/__init__.py  Normal file
@@ -0,0 +1,35 @@
from .base import BaseProcessor
from .registry import (
    CONFIG_SCHEMAS,
    TYPE_MAP,
    get_config_schema,
    get_config_schemas,
    get_last_discovery_errors,
    get_module_path,
    reload_processors,
)
from .service import (
    ProcessorService,
    get_processor,
    list_processors,
    reload_processor_modules,
)
from .types import ProcessDirectoryRequest, ProcessRequest, UpdateSourceRequest

__all__ = [
    "BaseProcessor",
    "CONFIG_SCHEMAS",
    "TYPE_MAP",
    "get_config_schema",
    "get_config_schemas",
    "get_last_discovery_errors",
    "get_module_path",
    "reload_processors",
    "ProcessorService",
    "get_processor",
    "list_processors",
    "reload_processor_modules",
    "ProcessDirectoryRequest",
    "ProcessRequest",
    "UpdateSourceRequest",
]
@@ -4,10 +4,9 @@ from fastapi import APIRouter, Body, Depends, Request

from api.response import success
from domain.audit import AuditAction, audit
from domain.auth.service import get_current_active_user
from domain.auth.types import User
from domain.processors.service import ProcessorService
from domain.processors.types import (
from domain.auth import User, get_current_active_user
from .service import ProcessorService
from .types import (
    ProcessDirectoryRequest,
    ProcessRequest,
    UpdateSourceRequest,
@@ -8,8 +8,15 @@ from fastapi.responses import Response
from PIL import Image

from ..base import BaseProcessor
from domain.ai.inference import describe_image_base64, get_text_embedding, provider_service
from domain.ai.service import VectorDBService, DEFAULT_VECTOR_DIMENSION
from domain.ai import (
    DEFAULT_VECTOR_DIMENSION,
    FILE_COLLECTION_NAME,
    VECTOR_COLLECTION_NAME,
    VectorDBService,
    describe_image_base64,
    get_text_embedding,
    provider_service,
)


CHUNK_SIZE = 800
@@ -112,18 +119,20 @@ class VectorIndexProcessor:
        action = config.get("action", "create")
        index_type = config.get("index_type", "vector")
        vector_db = VectorDBService()
        collection_name = "vector_collection"
        vector_collection = VECTOR_COLLECTION_NAME
        file_collection = FILE_COLLECTION_NAME

        if action == "destroy":
            await vector_db.delete_vector(collection_name, path)
            target_collection = file_collection if index_type == "simple" else vector_collection
            await vector_db.delete_vector(target_collection, path)
            return Response(content=f"The {index_type} index for file {path} has been destroyed", media_type="text/plain")

        mime_type = _guess_mime(path)

        if index_type == "simple":
            await vector_db.ensure_collection(collection_name, vector=False)
            await vector_db.delete_vector(collection_name, path)
            await vector_db.upsert_vector(collection_name, {
            await vector_db.ensure_collection(file_collection, vector=False)
            await vector_db.delete_vector(file_collection, path)
            await vector_db.upsert_vector(file_collection, {
                "path": path,
                "source_path": path,
                "chunk_id": "filename",
@@ -146,8 +155,8 @@ class VectorIndexProcessor:
        if vector_dim <= 0:
            vector_dim = DEFAULT_VECTOR_DIMENSION

        await vector_db.ensure_collection(collection_name, vector=True, dim=vector_dim)
        await vector_db.delete_vector(collection_name, path)
        await vector_db.ensure_collection(vector_collection, vector=True, dim=vector_dim)
        await vector_db.delete_vector(vector_collection, path)

        if file_ext in ["jpg", "jpeg", "png", "bmp"]:
            processed_bytes, compression = _compress_image_for_embedding(input_bytes)
@@ -155,7 +164,7 @@ class VectorIndexProcessor:
            description = await describe_image_base64(base64_image)
            embedding = await get_text_embedding(description)
            image_mime = "image/jpeg" if compression else mime_type
            await vector_db.upsert_vector(collection_name, {
            await vector_db.upsert_vector(vector_collection, {
                "path": _chunk_key(path, "image"),
                "source_path": path,
                "chunk_id": "image",
@@ -177,7 +186,7 @@ class VectorIndexProcessor:
            chunks = _chunk_text(text)
            if not chunks:
                await vector_db.upsert_vector(collection_name, {
                await vector_db.upsert_vector(vector_collection, {
                    "path": _chunk_key(path, "0"),
                    "source_path": path,
                    "chunk_id": "0",
@@ -194,7 +203,7 @@ class VectorIndexProcessor:
            chunk_count = 0
            for chunk_id, chunk_text, start, end in chunks:
                embedding = await get_text_embedding(chunk_text)
                await vector_db.upsert_vector(collection_name, {
                await vector_db.upsert_vector(vector_collection, {
                    "path": _chunk_key(path, str(chunk_id)),
                    "source_path": path,
                    "chunk_id": str(chunk_id),
@@ -213,15 +222,15 @@ class VectorIndexProcessor:
        return Response(content="Text file indexed", media_type="text/plain")

        # Vector indexing is not supported for other types yet; fall back to a filename index
        await vector_db.delete_vector(collection_name, path)
        await vector_db.upsert_vector(collection_name, {
            "path": _chunk_key(path, "fallback"),
        await vector_db.ensure_collection(file_collection, vector=False)
        await vector_db.delete_vector(file_collection, path)
        await vector_db.upsert_vector(file_collection, {
            "path": path,
            "source_path": path,
            "chunk_id": "filename",
            "mime": mime_type,
            "type": "filename",
            "name": os.path.basename(path),
            "embedding": [0.0] * vector_dim,
        })
        return Response(content="Vector indexing for this type is not yet supported; a filename index was created", media_type="text/plain")

@@ -1,396 +0,0 @@
import hashlib
import json
import os
import re
from datetime import UTC, datetime
from pathlib import Path
from typing import Any, Dict, List, Optional, Tuple

import httpx

from domain.virtual_fs.service import VirtualFSService
from domain.virtual_fs.thumbnail import VIDEO_EXT, is_video_filename


DATA_ROOT = Path("data/.video")
TMDB_BASE_URL = "https://api.themoviedb.org/3"


def _sha1(text: str) -> str:
    return hashlib.sha1(text.encode("utf-8")).hexdigest()


def _store_path(media_type: str, source_path: str) -> Path:
    subdir = "tv" if media_type == "tv" else "movie"
    return DATA_ROOT / subdir / f"{_sha1(source_path)}.json"


def _write_json(path: Path, payload: dict) -> None:
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(payload, ensure_ascii=False, indent=2), encoding="utf-8")


_CLEAN_TAGS_RE = re.compile(
    r"\b("
    r"2160p|1080p|720p|480p|4k|hdr|dv|dolby|atmos|"
    r"x264|x265|h264|h265|hevc|av1|aac|dts|flac|"
    r"bluray|bdrip|web[- ]?dl|webrip|dvdrip|remux|proper|repack"
    r")\b",
    re.IGNORECASE,
)


def _clean_query_name(raw: str) -> str:
    name = raw
    name = name.replace(".", " ").replace("_", " ")
    name = re.sub(r"\[[^\]]*\]", " ", name)
    name = re.sub(r"\([^\)]*\)", " ", name)
    name = _CLEAN_TAGS_RE.sub(" ", name)
    name = re.sub(r"\s+", " ", name).strip()
    return name


def _guess_name_from_path(path: str, is_dir: bool) -> str:
    norm = path.rstrip("/") if is_dir else path
    p = Path(norm)
    raw = p.name if is_dir else p.stem
    return _clean_query_name(raw)
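`_clean_query_name` turns a release-style filename into a search query: separators become spaces, bracketed groups and quality tags are stripped, and whitespace is collapsed. Reproducing the same regexes for illustration:

```python
import re

CLEAN_TAGS_RE = re.compile(
    r"\b(2160p|1080p|720p|480p|4k|hdr|dv|dolby|atmos|"
    r"x264|x265|h264|h265|hevc|av1|aac|dts|flac|"
    r"bluray|bdrip|web[- ]?dl|webrip|dvdrip|remux|proper|repack)\b",
    re.IGNORECASE,
)

def clean_query_name(raw: str) -> str:
    name = raw.replace(".", " ").replace("_", " ")
    name = re.sub(r"\[[^\]]*\]", " ", name)   # [Group] tags
    name = re.sub(r"\([^\)]*\)", " ", name)   # (2020)-style parentheticals
    name = CLEAN_TAGS_RE.sub(" ", name)       # quality/codec tags
    return re.sub(r"\s+", " ", name).strip()

print(clean_query_name("The.Matrix.1999.1080p.BluRay"))  # The Matrix 1999
```

A bare year is deliberately kept, since it often helps disambiguate the TMDB search.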


def _as_bool(value: Any, default: bool) -> bool:
    if value is None:
        return default
    if isinstance(value, bool):
        return value
    if isinstance(value, int):
        return value != 0
    if isinstance(value, str):
        v = value.strip().lower()
        if v in {"1", "true", "yes", "y", "on"}:
            return True
        if v in {"0", "false", "no", "n", "off"}:
            return False
    return default


_SXXEYY_RE = re.compile(r"[Ss](\d{1,2})\s*[.\-_ ]*\s*[Ee](\d{1,3})")
_X_RE = re.compile(r"(\d{1,2})x(\d{1,3})", re.IGNORECASE)
_CN_EP_RE = re.compile(r"第\s*(\d{1,3})\s*[集话]")
_CN_SEASON_RE = re.compile(r"第\s*(\d{1,2})\s*季")
_SEASON_WORD_RE = re.compile(r"Season\s*(\d{1,2})", re.IGNORECASE)
_S_RE = re.compile(r"[Ss](\d{1,2})")


def _parse_season_episode(rel_path: str) -> Tuple[Optional[int], Optional[int]]:
    stem = Path(rel_path).stem

    m = _SXXEYY_RE.search(stem) or _SXXEYY_RE.search(rel_path)
    if m:
        return int(m.group(1)), int(m.group(2))

    m = _X_RE.search(stem)
    if m:
        return int(m.group(1)), int(m.group(2))

    m = _CN_EP_RE.search(stem)
    if m:
        episode = int(m.group(1))
        season = None
        for part in reversed(Path(rel_path).parts[:-1]):
            sm = _CN_SEASON_RE.search(part) or _SEASON_WORD_RE.search(part) or _S_RE.search(part)
            if sm:
                season = int(sm.group(1))
                break
        return season or 1, episode

    m = re.match(r"^(\d{1,3})(?!\d)", stem)
    if m:
        episode = int(m.group(1))
        season = None
        for part in reversed(Path(rel_path).parts[:-1]):
            sm = _CN_SEASON_RE.search(part) or _SEASON_WORD_RE.search(part) or _S_RE.search(part)
            if sm:
                season = int(sm.group(1))
                break
        return season or 1, episode

    return None, None
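The first pattern `_parse_season_episode` tries is the common `SxxEyy` form, searched in the file stem and then in the full relative path. That branch, reproduced standalone:

```python
import re
from pathlib import Path

SXXEYY_RE = re.compile(r"[Ss](\d{1,2})\s*[.\-_ ]*\s*[Ee](\d{1,3})")

def parse_sxxeyy(rel_path: str):
    stem = Path(rel_path).stem
    m = SXXEYY_RE.search(stem) or SXXEYY_RE.search(rel_path)
    if m:
        return int(m.group(1)), int(m.group(2))
    return None, None

print(parse_sxxeyy("Show.S02E05.1080p.mkv"))  # (2, 5)
print(parse_sxxeyy("Season 1/03.mkv"))        # (None, None) -- covered by later fallbacks
```

The later branches handle `1x03`, Chinese "第N集/第N季" naming, and bare leading episode numbers, inferring the season from parent directory names when needed.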


class TMDBClient:
    def __init__(self, access_token: str | None, api_key: str | None):
        self._access_token = access_token
        self._api_key = api_key

    @classmethod
    def from_env(cls) -> "TMDBClient":
        access_token = os.getenv("TMDB_ACCESS_TOKEN")
        api_key = os.getenv("TMDB_API_KEY")
        if not access_token and not api_key:
            raise RuntimeError("Missing TMDB_ACCESS_TOKEN or TMDB_API_KEY")
        return cls(access_token=access_token, api_key=api_key)

    def _headers(self) -> dict:
        headers = {"Accept": "application/json"}
        if self._access_token:
            headers["Authorization"] = f"Bearer {self._access_token}"
        return headers

    def _merge_params(self, params: dict) -> dict:
        merged = dict(params or {})
        if self._api_key:
            merged.setdefault("api_key", self._api_key)
        return merged

    async def get(self, path: str, params: dict) -> dict:
        url = f"{TMDB_BASE_URL}{path}"
        async with httpx.AsyncClient(timeout=30.0) as client:
            resp = await client.get(url, headers=self._headers(), params=self._merge_params(params))
            resp.raise_for_status()
            return resp.json()
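`TMDBClient` supports both TMDB auth styles: a read access token is sent as a bearer `Authorization` header, while an API key is merged into the query parameters without overriding an explicit one. The header/param logic, reproduced without the HTTP call:

```python
def headers(access_token):
    h = {"Accept": "application/json"}
    if access_token:
        h["Authorization"] = f"Bearer {access_token}"
    return h

def merge_params(api_key, params):
    merged = dict(params or {})
    if api_key:
        merged.setdefault("api_key", api_key)  # never clobbers a caller-supplied key
    return merged

print(headers("tok123"))
print(merge_params("k", {"query": "matrix"}))
```

Because both pieces are optional and independent, a client constructed with only one credential still produces valid requests.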
|
||||
|
||||
class VideoLibraryProcessor:
    name = "Video Library Ingestion"
    supported_exts = sorted(VIDEO_EXT)
    config_schema = [
        {
            "key": "name",
            "label": "Manual name (optional)",
            "type": "string",
            "required": False,
            "placeholder": "Leave blank to extract from the path",
        },
        {
            "key": "language",
            "label": "Language",
            "type": "string",
            "required": False,
            "default": "zh-CN",
        },
        {
            "key": "include_episodes",
            "label": "TV series: save each episode",
            "type": "select",
            "required": False,
            "default": 1,
            "options": [
                {"label": "Yes", "value": 1},
                {"label": "No", "value": 0},
            ],
        },
    ]
    produces_file = False
    supports_directory = True
    requires_input_bytes = False

    async def process(self, input_bytes: bytes, path: str, config: Dict[str, Any]) -> Dict[str, Any]:
        tmdb = TMDBClient.from_env()
        is_dir = await VirtualFSService.path_is_directory(path)
        language = str(config.get("language") or "zh-CN")
        manual_name = str(config.get("name") or "").strip()
        query_name = manual_name or _guess_name_from_path(path, is_dir=is_dir)
        scraped_at = datetime.now(UTC).isoformat()

        if is_dir:
            payload, saved_to = await self._process_tv_dir(tmdb, path, query_name, language, scraped_at, config)
            return {
                "ok": True,
                "type": "tv",
                "path": path,
                "tmdb_id": payload.get("tmdb", {}).get("id"),
                "saved_to": str(saved_to),
            }

        payload, saved_to = await self._process_movie_file(tmdb, path, query_name, language, scraped_at)
        return {
            "ok": True,
            "type": "movie",
            "path": path,
            "tmdb_id": payload.get("tmdb", {}).get("id"),
            "saved_to": str(saved_to),
        }

    async def _process_movie_file(
        self,
        tmdb: TMDBClient,
        path: str,
        query_name: str,
        language: str,
        scraped_at: str,
    ) -> Tuple[dict, Path]:
        search = await tmdb.get("/search/movie", {"query": query_name, "language": language})
        results = search.get("results") or []
        if not results:
            raise RuntimeError(f"No movie entry found for: {query_name}")

        chosen = results[0] or {}
        movie_id = chosen.get("id")
        if not movie_id:
            raise RuntimeError("TMDB search result is missing an id")

        detail = await tmdb.get(
            f"/movie/{movie_id}",
            {
                "language": language,
                "append_to_response": "credits,images,external_ids,videos",
            },
        )

        payload = {
            "type": "movie",
            "source_path": path,
            "query": {"name": query_name, "language": language},
            "scraped_at": scraped_at,
            "tmdb": {
                "id": movie_id,
                "search": {"page": search.get("page"), "total_results": search.get("total_results"), "results": results[:5]},
                "detail": detail,
            },
        }
        saved_to = _store_path("movie", path)
        _write_json(saved_to, payload)
        return payload, saved_to

    async def _process_tv_dir(
        self,
        tmdb: TMDBClient,
        path: str,
        query_name: str,
        language: str,
        scraped_at: str,
        config: Dict[str, Any],
    ) -> Tuple[dict, Path]:
        search = await tmdb.get("/search/tv", {"query": query_name, "language": language})
        results = search.get("results") or []
        if not results:
            raise RuntimeError(f"No TV series entry found for: {query_name}")

        chosen = results[0] or {}
        tv_id = chosen.get("id")
        if not tv_id:
            raise RuntimeError("TMDB search result is missing an id")

        detail = await tmdb.get(
            f"/tv/{tv_id}",
            {
                "language": language,
                "append_to_response": "credits,images,external_ids,videos",
            },
        )

        include_episodes = _as_bool(config.get("include_episodes"), True)
        episodes: List[dict] = []
        seasons_detail: Dict[str, Any] = {}
        if include_episodes:
            episodes = await self._collect_episode_files(path)
            seasons = sorted({ep["season"] for ep in episodes if ep.get("season") is not None})
            for season in seasons:
                seasons_detail[str(season)] = await tmdb.get(
                    f"/tv/{tv_id}/season/{int(season)}",
                    {"language": language},
                )
            self._attach_tmdb_episode_detail(episodes, seasons_detail)

        payload = {
            "type": "tv",
            "source_path": path,
            "query": {"name": query_name, "language": language},
            "scraped_at": scraped_at,
            "tmdb": {
                "id": tv_id,
                "search": {"page": search.get("page"), "total_results": search.get("total_results"), "results": results[:5]},
                "detail": detail,
                "seasons": seasons_detail,
            },
            "episodes": episodes,
        }

        saved_to = _store_path("tv", path)
        _write_json(saved_to, payload)
        return payload, saved_to

    async def _collect_episode_files(self, dir_path: str) -> List[dict]:
        adapter_instance, adapter_model, root, rel = await VirtualFSService.resolve_adapter_and_rel(dir_path)
        rel = rel.rstrip("/")
        list_dir = await VirtualFSService._ensure_method(adapter_instance, "list_dir")

        stack: List[str] = [rel]
        page_size = 200
        out: List[dict] = []

        while stack:
            current_rel = stack.pop()
            page = 1
            while True:
                entries, total = await list_dir(root, current_rel, page, page_size, "name", "asc")
                entries = entries or []
                if not entries and (total or 0) == 0:
                    break

                for entry in entries:
                    name = entry.get("name")
                    if not name:
                        continue
                    child_rel = VirtualFSService._join_rel(current_rel, name)
                    if entry.get("is_dir"):
                        stack.append(child_rel.rstrip("/"))
                        continue
                    if not is_video_filename(name):
                        continue

                    absolute_path = VirtualFSService._build_absolute_path(adapter_model.path, child_rel)
                    rel_in_show = child_rel
                    if rel and child_rel.startswith(rel.rstrip("/") + "/"):
                        rel_in_show = child_rel[len(rel.rstrip("/")) + 1 :]

                    season, episode = _parse_season_episode(rel_in_show)
                    out.append(
                        {
                            "path": absolute_path,
                            "rel": rel_in_show,
                            "name": name,
                            "size": entry.get("size"),
                            "mtime": entry.get("mtime"),
                            "season": season,
                            "episode": episode,
                        }
                    )

                if total is None or page * page_size >= total:
                    break
                page += 1

        return out

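The paging loop in `_collect_episode_files` above terminates when `page * page_size >= total` (or when the adapter returns no total). A minimal sketch with an in-memory stand-in for `list_dir`:

```python
# Sketch of the pagination-termination logic above; list_dir is a fake in-memory pager.
def list_all(items, page_size=3):
    def list_dir(page, size):
        start = (page - 1) * size
        return items[start:start + size], len(items)

    out, page = [], 1
    while True:
        entries, total = list_dir(page, page_size)
        out.extend(entries)
        if total is None or page * page_size >= total:
            break
        page += 1
    return out

print(list_all(list(range(7))))  # → [0, 1, 2, 3, 4, 5, 6], fetched in pages of 3
```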
    def _attach_tmdb_episode_detail(self, episodes: List[dict], seasons_detail: Dict[str, Any]) -> None:
        episode_maps: Dict[str, Dict[int, Any]] = {}
        for season_str, season_payload in (seasons_detail or {}).items():
            items = (season_payload or {}).get("episodes") or []
            m: Dict[int, Any] = {}
            for item in items:
                try:
                    number = int(item.get("episode_number"))
                except Exception:
                    continue
                m[number] = item
            episode_maps[season_str] = m

        for ep in episodes:
            season = ep.get("season")
            episode = ep.get("episode")
            if season is None or episode is None:
                continue
            m = episode_maps.get(str(season))
            if not m:
                continue
            detail = m.get(int(episode))
            if detail:
                ep["tmdb_episode"] = detail


PROCESSOR_TYPE = "video_library"
PROCESSOR_NAME = VideoLibraryProcessor.name
SUPPORTED_EXTS = VideoLibraryProcessor.supported_exts
CONFIG_SCHEMA = VideoLibraryProcessor.config_schema
PROCESSOR_FACTORY = lambda: VideoLibraryProcessor()
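The module-level `PROCESSOR_*` attributes above suggest a convention the registry can pick up by attribute lookup. A hedged sketch of how such discovery could work — the real `discover_processors` body is not shown in this diff, and `register` here is hypothetical:

```python
from types import SimpleNamespace

# Hypothetical registration step: read the PROCESSOR_* convention off a module.
def register(module, type_map, config_schemas):
    ptype = getattr(module, "PROCESSOR_TYPE", None)
    factory = getattr(module, "PROCESSOR_FACTORY", None)
    if ptype and callable(factory):
        type_map[ptype] = factory
        config_schemas[ptype] = getattr(module, "CONFIG_SCHEMA", [])

# Stand-in for an imported processor module.
mod = SimpleNamespace(
    PROCESSOR_TYPE="video_library",
    PROCESSOR_FACTORY=lambda: "processor-instance",
    CONFIG_SCHEMA=[{"key": "language"}],
)
type_map, schemas = {}, {}
register(mod, type_map, schemas)
```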
@@ -5,7 +5,7 @@ from pathlib import Path
 from types import ModuleType
 from typing import Callable, Dict, Optional
 
-from domain.processors.base import BaseProcessor
+from .base import BaseProcessor
 
 ProcessorFactory = Callable[[], BaseProcessor]
 TYPE_MAP: Dict[str, ProcessorFactory] = {}
@@ -16,7 +16,7 @@ LAST_DISCOVERY_ERRORS: list[str] = []
 
 def discover_processors(force_reload: bool = False) -> list[str]:
     """Scan and cache the available processor modules."""
-    from domain.processors import builtin as processors_pkg
+    from . import builtin as processors_pkg
 
     TYPE_MAP.clear()
     CONFIG_SCHEMAS.clear()
@@ -3,20 +3,20 @@ from typing import List, Tuple
 
 from fastapi import HTTPException
 from fastapi.concurrency import run_in_threadpool
-from domain.processors.registry import (
+from domain.tasks import task_queue_service
+from domain.virtual_fs import VirtualFSService
+from .registry import (
     get,
     get_config_schema,
     get_config_schemas,
     get_module_path,
     reload_processors,
 )
-from domain.processors.types import (
+from .types import (
     ProcessDirectoryRequest,
     ProcessRequest,
     UpdateSourceRequest,
 )
-from domain.virtual_fs.service import VirtualFSService
-from domain.tasks.task_queue import task_queue_service
 
 
 class ProcessorService:
domain/repositories/__init__.py (new file)
@@ -0,0 +1 @@
+__all__: list[str] = []
domain/share/__init__.py (new file)
@@ -0,0 +1,10 @@
+from .service import ShareService
+from .types import ShareCreate, ShareInfo, ShareInfoWithPassword, SharePassword
+
+__all__ = [
+    "ShareService",
+    "ShareCreate",
+    "ShareInfo",
+    "ShareInfoWithPassword",
+    "SharePassword",
+]
@@ -4,10 +4,9 @@ from fastapi import APIRouter, Depends, Request
 
 from api.response import success
 from domain.audit import AuditAction, audit
-from domain.auth.service import get_current_active_user
-from domain.auth.types import User
-from domain.share.service import ShareService
-from domain.share.types import (
+from domain.auth import User, get_current_active_user
+from .service import ShareService
+from .types import (
     ShareCreate,
     ShareInfo,
     ShareInfoWithPassword,
@@ -7,7 +7,7 @@ import bcrypt
 from fastapi import HTTPException, status
 from fastapi.responses import Response
 
-from domain.virtual_fs.service import VirtualFSService
+from domain.virtual_fs import VirtualFSService
 from models.database import ShareLink, UserAccount
 
 
domain/tasks/__init__.py (new file)
@@ -0,0 +1,24 @@
+from .service import TaskService
+from .task_queue import Task, TaskProgress, TaskStatus, task_queue_service
+from .types import (
+    AutomationTaskBase,
+    AutomationTaskCreate,
+    AutomationTaskRead,
+    AutomationTaskUpdate,
+    TaskQueueSettings,
+    TaskQueueSettingsResponse,
+)
+
+__all__ = [
+    "TaskService",
+    "Task",
+    "TaskProgress",
+    "TaskStatus",
+    "task_queue_service",
+    "AutomationTaskBase",
+    "AutomationTaskCreate",
+    "AutomationTaskRead",
+    "AutomationTaskUpdate",
+    "TaskQueueSettings",
+    "TaskQueueSettingsResponse",
+]
@@ -2,9 +2,9 @@ from fastapi import APIRouter, Depends, Request
 
 from api.response import success
 from domain.audit import AuditAction, audit
-from domain.auth.service import get_current_active_user
-from domain.tasks.service import TaskService
-from domain.tasks.types import (
+from domain.auth import get_current_active_user
+from .service import TaskService
+from .types import (
     AutomationTaskCreate,
     AutomationTaskUpdate,
     TaskQueueSettings,
@@ -3,17 +3,16 @@ from typing import Annotated, Any, Dict, Optional
 
 from fastapi import Depends, HTTPException
 
-from domain.auth.service import get_current_active_user
-from domain.auth.types import User
-from domain.config.service import ConfigService
-from domain.tasks.types import (
+from domain.auth import User, get_current_active_user
+from domain.config import ConfigService
+from .task_queue import task_queue_service
+from .types import (
     AutomationTaskCreate,
     AutomationTaskUpdate,
     TaskQueueSettings,
     TaskQueueSettingsResponse,
 )
 from models.database import AutomationTask
-from domain.tasks.task_queue import task_queue_service
 
 
 class TaskService:
@@ -74,7 +74,7 @@ class TaskQueueService:
 
         try:
             # Local import to avoid circular dependency during module load.
-            from domain.virtual_fs.service import VirtualFSService
+            from domain.virtual_fs import VirtualFSService
 
             if task.name == "process_file":
                 params = task.task_info
@@ -88,7 +88,7 @@ class TaskQueueService:
                 task.result = result
             elif task.name == "automation_task" or self._is_processor_task(task.name):
                 from models.database import AutomationTask
-                from domain.processors.service import get_processor
+                from domain.processors import get_processor
 
                 params = task.task_info
                 auto_task = await AutomationTask.get(id=params["task_id"])
@@ -116,7 +116,7 @@ class TaskQueueService:
                 await VirtualFSService.write_file(save_to, result)
                 task.result = "Automation task completed"
             elif task.name == "offline_http_download":
-                from domain.offline_downloads.service import OfflineDownloadService
+                from domain.offline_downloads import OfflineDownloadService
 
                 result_path = await OfflineDownloadService.run_http_download(task)
                 task.result = {"path": result_path}
@@ -124,7 +124,7 @@ class TaskQueueService:
                 result = await VirtualFSService.run_cross_mount_transfer_task(task)
                 task.result = result
             elif task.name == "send_email":
-                from domain.email.service import EmailService
+                from domain.email import EmailService
                 await EmailService.send_from_task(task.id, task.task_info)
                 task.result = "Email sent"
             else:
@@ -141,7 +141,7 @@ class TaskQueueService:
 
     def _is_processor_task(self, task_name: str) -> bool:
         try:
-            from domain.processors.service import get_processor
+            from domain.processors import get_processor
 
             return get_processor(task_name) is not None
         except Exception:
@@ -180,7 +180,7 @@ class TaskQueueService:
 
     async def start_worker(self, concurrency: int | None = None):
         if concurrency is None:
-            from domain.config.service import ConfigService
+            from domain.config import ConfigService
 
             stored_value = await ConfigService.get("TASK_QUEUE_CONCURRENCY", self._concurrency)
             try:
domain/virtual_fs/__init__.py (new file)
@@ -0,0 +1,11 @@
+from .service import VirtualFSService
+from .types import DirListing, MkdirRequest, MoveRequest, SearchResultItem, VfsEntry
+
+__all__ = [
+    "VirtualFSService",
+    "DirListing",
+    "MkdirRequest",
+    "MoveRequest",
+    "SearchResultItem",
+    "VfsEntry",
+]
@@ -4,10 +4,9 @@ from fastapi import APIRouter, Depends, File, Query, Request, UploadFile
 
 from api.response import success
 from domain.audit import AuditAction, audit
-from domain.auth.service import get_current_active_user
-from domain.auth.types import User
-from domain.virtual_fs.service import VirtualFSService
-from domain.virtual_fs.types import MkdirRequest, MoveRequest
+from domain.auth import User, get_current_active_user
+from .service import VirtualFSService
+from .types import MkdirRequest, MoveRequest
 
 router = APIRouter(prefix="/api/fs", tags=["virtual-fs"])
@@ -4,8 +4,8 @@ from typing import Any, AsyncIterator, Union
 from fastapi import HTTPException
 from fastapi.responses import Response
 
-from domain.tasks.service import TaskService
-from domain.virtual_fs.thumbnail import is_raw_filename, raw_bytes_to_jpeg
+from domain.tasks import TaskService
+from .thumbnail import is_raw_filename, raw_bytes_to_jpeg
 
 from .listing import VirtualFSListingMixin
@@ -3,9 +3,9 @@ from typing import Any, Dict, List, Tuple
 from fastapi import HTTPException
 
 from api.response import page
-from domain.adapters.registry import runtime_registry
-from domain.ai.service import VectorDBService
-from domain.virtual_fs.thumbnail import is_image_filename, is_video_filename
+from domain.adapters import runtime_registry
+from domain.ai import FILE_COLLECTION_NAME, VECTOR_COLLECTION_NAME, VectorDBService
+from .thumbnail import is_image_filename, is_video_filename
 from models import StorageAdapter
 
 from .resolver import VirtualFSResolverMixin
@@ -161,13 +161,19 @@ class VirtualFSListingMixin(VirtualFSResolverMixin):
     @classmethod
     async def _gather_vector_index(cls, full_path: str, limit: int = 20):
         vector_db = VectorDBService()
-        try:
-            raw_results = await vector_db.search_by_path("vector_collection", full_path, max(limit * 2, 20))
-        except Exception:
-            return None
-
         matched = []
-        if raw_results:
+        had_success = False
+        fetch_limit = max(limit * 2, 20)
+        for collection_name in (VECTOR_COLLECTION_NAME, FILE_COLLECTION_NAME):
+            try:
+                raw_results = await vector_db.search_by_path(collection_name, full_path, fetch_limit)
+            except Exception:
+                continue
+
+            if not raw_results:
+                had_success = True
+                continue
+            had_success = True
             buckets = raw_results if isinstance(raw_results, list) else [raw_results]
             for bucket in buckets:
                 if not bucket:
@@ -193,6 +199,9 @@ class VirtualFSListingMixin(VirtualFSResolverMixin):
                 entry["preview_truncated"] = len(text) > preview_limit
                 matched.append(entry)
 
+        if not had_success:
+            return None
+
         if not matched:
             return {"total": 0, "entries": [], "by_type": {}, "has_more": False}
 
@@ -0,0 +1 @@
+__all__: list[str] = []
@@ -15,8 +15,8 @@ from fastapi import APIRouter, Request, Response
 from fastapi import HTTPException
 
 from domain.audit import AuditAction, audit
-from domain.config.service import ConfigService
-from domain.virtual_fs.service import VirtualFSService
+from domain.config import ConfigService
+from domain.virtual_fs import VirtualFSService
 
 
 router = APIRouter(prefix="/s3", tags=["s3"])
@@ -9,10 +9,9 @@ from fastapi import APIRouter, Request, Response, HTTPException, Depends
 import xml.etree.ElementTree as ET
 
 from domain.audit import AuditAction, audit
-from domain.auth.service import AuthService
-from domain.auth.types import User, UserInDB
-from domain.virtual_fs.service import VirtualFSService
-from domain.config.service import ConfigService
+from domain.auth import AuthService, User, UserInDB
+from domain.config import ConfigService
+from domain.virtual_fs import VirtualFSService
 
 
 _WEBDAV_ENABLED_KEY = "WEBDAV_MAPPING_ENABLED"
@@ -172,12 +171,32 @@ async def propfind(
                 ctype = None if is_dir else (mimetypes.guess_type(name)[0] or "application/octet-stream")
                 responses.append(_build_prop_response(full_path, name, is_dir, size, mtime, ctype))
     except FileNotFoundError:
-        raise HTTPException(404, detail="Not found")
+        st = None
+    except HTTPException as e:
+        if e.status_code != 404:
+            raise
+        st = None
+
+    if st is None:
+        is_mount_root = False
+        try:
+            _, rel = await VirtualFSService.resolve_adapter_by_path(full_path)
+            is_mount_root = rel == ""
+        except HTTPException:
+            is_mount_root = False
+
+        if not is_mount_root and full_path != "/":
+            listing_probe = await VirtualFSService.list_virtual_dir(full_path, page_num=1, page_size=1)
+            if not (listing_probe.get("items") or []):
+                raise HTTPException(404, detail="Not found")
+
+        name = "/" if full_path == "/" else (full_path.rstrip("/").rsplit("/", 1)[-1] or "/")
+        responses.append(_build_prop_response(full_path, name, True, None, 0, None))
 
     if depth in ("1", "infinity"):
         try:
             listing = await VirtualFSService.list_virtual_dir(full_path, page_num=1, page_size=1000)
-            for ent in listing["items"]:
+            for ent in (listing.get("items") or []):
                 is_dir = bool(ent.get("is_dir"))
                 name = ent.get("name")
                 child_path = full_path.rstrip("/") + "/" + name
@@ -16,7 +16,7 @@ class VirtualFSProcessingMixin(VirtualFSTransferMixin):
         save_to: str | None = None,
         overwrite: bool = False,
     ) -> Any:
-        from domain.processors.service import get_processor
+        from domain.processors import get_processor
 
         processor = get_processor(processor_type)
         if not processor:
@@ -3,7 +3,7 @@ from typing import Tuple
 from fastapi import HTTPException
 from fastapi.responses import Response
 
-from domain.adapters.registry import runtime_registry
+from domain.adapters import runtime_registry
 from models import StorageAdapter
 
 from .common import VirtualFSCommonMixin
@@ -4,8 +4,8 @@ import re
 from fastapi import HTTPException, UploadFile
 from fastapi.responses import Response
 
-from domain.config.service import ConfigService
-from domain.virtual_fs.thumbnail import (
+from domain.config import ConfigService
+from .thumbnail import (
     get_or_create_thumb,
     is_image_filename,
     is_raw_filename,
@@ -0,0 +1,3 @@
+from .search_service import VirtualFSSearchService
+
+__all__ = ["VirtualFSSearchService"]
@@ -1,8 +1,8 @@
 from fastapi import APIRouter, Depends, Query
 
-from domain.auth.service import get_current_active_user
-from domain.auth.types import User
-from domain.virtual_fs.search.search_service import VirtualFSSearchService
+from api.response import success
+from domain.auth import User, get_current_active_user
+from .search_service import VirtualFSSearchService
 
 router = APIRouter(prefix="/api/fs/search", tags=["search"])
@@ -17,10 +17,11 @@ async def search_files(
     user: User = Depends(get_current_active_user),
 ):
     if not q.strip():
-        return {"items": [], "query": q}
+        return success({"items": [], "query": q, "mode": mode})
 
     top_k = max(top_k, 1)
     page = max(page, 1)
     page_size = max(min(page_size, 100), 1)
 
-    return await VirtualFSSearchService.search(q, top_k, mode, page, page_size)
+    data = await VirtualFSSearchService.search(q, top_k, mode, page, page_size)
+    return success(data)
@@ -1,8 +1,7 @@
 from typing import Any, Dict, List, Tuple
 
-from domain.virtual_fs.types import SearchResultItem
-from domain.ai.inference import get_text_embedding
-from domain.ai.service import VectorDBService
+from domain.ai import FILE_COLLECTION_NAME, VECTOR_COLLECTION_NAME, VectorDBService, get_text_embedding
+from ..types import SearchResultItem
 
 
 def _normalize_result(raw: Dict[str, Any], source: str, fallback_score: float = 0.0) -> SearchResultItem:
@@ -53,7 +52,7 @@ async def _vector_search(query: str, top_k: int) -> List[SearchResultItem]:
         return []
 
     try:
-        raw_results = await vector_db.search_vectors("vector_collection", embedding, max(top_k, 10))
+        raw_results = await vector_db.search_vectors(VECTOR_COLLECTION_NAME, embedding, max(top_k, 10))
     except Exception:
         return []
 
@@ -68,12 +67,15 @@ async def _filename_search(query: str, page: int, page_size: int) -> Tuple[List[
     vector_db = VectorDBService()
     limit = max(page * page_size + 1, page_size * (page + 2))
     limit = min(limit, 2000)
-    try:
-        raw_results = await vector_db.search_by_path("vector_collection", query, limit)
-    except Exception:
-        return [], False
+    records: List[Dict[str, Any]] = []
+    for collection_name in (FILE_COLLECTION_NAME, VECTOR_COLLECTION_NAME):
+        try:
+            raw_results = await vector_db.search_by_path(collection_name, query, limit)
+        except Exception:
+            continue
+        if raw_results:
+            records.extend(raw_results[0] or [])
 
-    records = raw_results[0] if raw_results else []
     deduped: List[SearchResultItem] = []
     seen_paths: set[str] = set()
     for record in records or []:
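Since the merged two-collection search above can now return the same file from both collections, the de-duplication step that follows keeps only first-seen paths. A standalone sketch of that step (helper name hypothetical):

```python
# Keep the first record for each path, preserving order — mirrors the
# seen_paths/deduped logic in _filename_search above.
def dedupe_by_path(records):
    deduped, seen = [], set()
    for record in records:
        path = record.get("path")
        if path in seen:
            continue
        seen.add(path)
        deduped.append(record)
    return deduped

merged = [{"path": "/a"}, {"path": "/b"}, {"path": "/a"}]
print(dedupe_by_path(merged))  # → [{'path': '/a'}, {'path': '/b'}]
```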
@@ -5,7 +5,7 @@ import time
 
 from fastapi import HTTPException
 
-from domain.config.service import ConfigService
+from domain.config import ConfigService
 
 from .processing import VirtualFSProcessingMixin
 
@@ -273,7 +273,7 @@ class VirtualFSTransferMixin(VirtualFSFileOpsMixin):
             "overwrite": overwrite,
         }
 
-        from domain.tasks.task_queue import task_queue_service
+        from domain.tasks import task_queue_service
 
         task = await task_queue_service.add_task("cross_mount_transfer", payload)
         return {
@@ -286,7 +286,7 @@ class VirtualFSTransferMixin(VirtualFSFileOpsMixin):
 
     @classmethod
     async def run_cross_mount_transfer_task(cls, task: "Task") -> Dict[str, Any]:
-        from domain.tasks.task_queue import task_queue_service
+        from domain.tasks import task_queue_service
 
         params = task.task_info or {}
         operation = params.get("operation")
main.py (59 changed lines)
@@ -2,15 +2,16 @@ import os
 from pathlib import Path
 from contextlib import asynccontextmanager
 
-from domain.config.service import ConfigService, VERSION
-from domain.adapters.registry import runtime_registry
+from domain.adapters import runtime_registry
+from domain.config import ConfigService, VERSION
 from db.session import close_db, init_db
 from api.routers import include_routers
 from fastapi.middleware.cors import CORSMiddleware
 from fastapi.staticfiles import StaticFiles
 from fastapi.responses import FileResponse
-from fastapi import FastAPI, HTTPException, Request
+from fastapi import FastAPI, HTTPException
 from fastapi.exceptions import RequestValidationError
+from starlette.exceptions import HTTPException as StarletteHTTPException
 from middleware.exception_handler import (
     global_exception_handler,
     http_exception_handler,
@@ -19,43 +20,63 @@ from middleware.exception_handler import (
 )
 import httpx
 from dotenv import load_dotenv
-from domain.tasks.task_queue import task_queue_service
+from domain.tasks import task_queue_service
 
 load_dotenv()
 
 
 class SPAStaticFiles(StaticFiles):
     async def get_response(self, path, scope):
-        response = await super().get_response(path, scope)
-        if response.status_code == 404:
-            return await super().get_response("index.html", scope)
+        try:
+            response = await super().get_response(path, scope)
+        except StarletteHTTPException as exc:
+            if exc.status_code != 404:
+                raise
+            if self._should_spa_fallback(scope):
+                return FileResponse(INDEX_FILE)
+            raise
+
+        if response.status_code == 404 and self._should_spa_fallback(scope):
+            return FileResponse(INDEX_FILE)
         return response
 
+    @staticmethod
+    def _should_spa_fallback(scope) -> bool:
+        return (
+            scope.get("method") == "GET"
+            and _request_accepts_html(scope)
+            and not (scope.get("path") or "").startswith(SPA_EXCLUDE_PREFIXES)
+            and INDEX_FILE.exists()
+        )
+
 
 INDEX_FILE = Path("web/dist/index.html")
 SPA_EXCLUDE_PREFIXES = ("/api", "/docs", "/openapi.json", "/webdav", "/s3")
 
 
-async def spa_fallback_middleware(request: Request, call_next):
-    response = await call_next(request)
-    if (
-        response.status_code == 404
-        and request.method == "GET"
-        and "text/html" in request.headers.get("accept", "")
-        and not request.url.path.startswith(SPA_EXCLUDE_PREFIXES)
-        and INDEX_FILE.exists()
-    ):
-        return FileResponse(INDEX_FILE)
-    return response
+def _request_accepts_html(scope) -> bool:
+    for k, v in scope.get("headers") or []:
+        if k == b"accept":
+            return "text/html" in v.decode("latin-1")
+    return False
 
 
 @asynccontextmanager
 async def lifespan(app: FastAPI):
     os.makedirs("data/db", exist_ok=True)
+    os.makedirs("data/plugins", exist_ok=True)
     await init_db()
     await runtime_registry.refresh()
     await ConfigService.set("APP_VERSION", VERSION)
     await task_queue_service.start_worker()
 
+    # Load installed plugins
+    from domain.plugins import init_plugins
+    await init_plugins(app)
+
+    # Mount the static file service after all routes are registered (kept last to avoid shadowing API routes)
+    app.mount("/", SPAStaticFiles(directory="web/dist", html=True, check_dir=False), name="static")
+
     try:
         yield
     finally:
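One detail worth flagging in the rewritten SPA fallback: ASGI scopes carry headers as a list of `(bytes, bytes)` pairs rather than a dict, which is why `_request_accepts_html` iterates and decodes before matching. A standalone sketch of that check:

```python
# Mirror of _request_accepts_html above: scan ASGI-style byte header pairs
# for an Accept header that allows text/html.
def request_accepts_html(scope) -> bool:
    for k, v in scope.get("headers") or []:
        if k == b"accept":
            return "text/html" in v.decode("latin-1")
    return False

scope = {"headers": [(b"accept", b"text/html,application/xhtml+xml")]}
print(request_accepts_html(scope))  # → True
```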
@@ -69,7 +90,6 @@ def create_app() -> FastAPI:
         description="A highly extensible private cloud storage solution for individuals and teams",
         lifespan=lifespan,
     )
-    app.middleware("http")(spa_fallback_middleware)
     include_routers(app)
     app.add_exception_handler(HTTPException, http_exception_handler)
     app.add_exception_handler(RequestValidationError, validation_exception_handler)
@@ -86,7 +106,6 @@ app.add_middleware(
     allow_methods=["*"],
     allow_headers=["*"],
 )
-app.mount("/", SPAStaticFiles(directory="web/dist", html=True, check_dir=False), name="static")
 if __name__ == "__main__":
     import uvicorn
     uvicorn.run("main:app", host="0.0.0.0", port=8000, reload=True)
@@ -168,24 +168,28 @@ class ShareLink(Model):


class Plugin(Model):
    id = fields.IntField(pk=True)
    url = fields.CharField(max_length=2048)
    enabled = fields.BooleanField(default=True)

    open_app = fields.BooleanField(default=False)

    key = fields.CharField(max_length=100, null=True)
    key = fields.CharField(max_length=100, unique=True)  # unique plugin identifier
    name = fields.CharField(max_length=255, null=True)
    version = fields.CharField(max_length=50, null=True)
    supported_exts = fields.JSONField(null=True)

    default_bounds = fields.JSONField(null=True)
    default_maximized = fields.BooleanField(null=True)

    icon = fields.CharField(max_length=2048, null=True)
    description = fields.TextField(null=True)
    author = fields.CharField(max_length=255, null=True)
    website = fields.CharField(max_length=2048, null=True)
    github = fields.CharField(max_length=2048, null=True)
    license = fields.CharField(max_length=100, null=True)

    # full manifest storage
    manifest = fields.JSONField(null=True)

    # frontend configuration (extracted from manifest.frontend)
    open_app = fields.BooleanField(default=False)
    supported_exts = fields.JSONField(null=True)
    default_bounds = fields.JSONField(null=True)
    default_maximized = fields.BooleanField(null=True)
    icon = fields.CharField(max_length=2048, null=True)

    # tracking of loaded components
    loaded_routes = fields.JSONField(null=True)  # ["/api/plugins/xxx", ...]
    loaded_processors = fields.JSONField(null=True)  # ["processor_type", ...]

    created_at = fields.DatetimeField(auto_now_add=True)
    updated_at = fields.DatetimeField(auto_now=True)
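The hunk above tightens the `key` column from `null=True` to `unique=True`, so two plugin rows can no longer share the same identifier. At the database level this maps to a UNIQUE constraint; a minimal stdlib `sqlite3` sketch (table and values are illustrative, not Foxel's actual schema):

```python
import sqlite3

# In-memory database with a UNIQUE column, mirroring unique=True on the model.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plugin (id INTEGER PRIMARY KEY, key TEXT UNIQUE)")
conn.execute("INSERT INTO plugin (key) VALUES ('viewer')")

# A second row with the same key is rejected by the constraint.
try:
    conn.execute("INSERT INTO plugin (key) VALUES ('viewer')")
    duplicate_ok = True
except sqlite3.IntegrityError:
    duplicate_ok = False

print(duplicate_ok)  # False
```

Existing rows with duplicate or NULL keys would have to be cleaned up before a migration adding such a constraint can apply.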
@@ -13,8 +13,8 @@ PROJECT_ROOT = Path(__file__).resolve().parents[1]
if str(PROJECT_ROOT) not in sys.path:
    sys.path.insert(0, str(PROJECT_ROOT))

from domain.auth.service import get_password_hash
from domain.config.service import VERSION
from domain.config import VERSION
from domain.auth import get_password_hash


def _project_root() -> Path:
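The hunk above swaps submodule imports (`domain.config.service`, `domain.auth.service`) for package-level imports, which only works if each package's `__init__.py` re-exports the names. A minimal, self-contained reproduction of that re-export pattern (the temp package layout is hypothetical, standing in for Foxel's `domain/` tree):

```python
import sys
import tempfile
from pathlib import Path

# Build a throwaway package: domain/config/service.py defines VERSION,
# and domain/config/__init__.py re-exports it at package level.
root = Path(tempfile.mkdtemp())
pkg = root / "domain" / "config"
pkg.mkdir(parents=True)
(root / "domain" / "__init__.py").write_text("")
(pkg / "service.py").write_text('VERSION = "1.0.0"\n')
(pkg / "__init__.py").write_text("from .service import VERSION\n")

sys.path.insert(0, str(root))
# Callers can now import from the package instead of the submodule.
from domain.config import VERSION

print(VERSION)  # 1.0.0
```

The re-export keeps `service.py` free to move internally while callers depend only on the stable package path.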
web/bun.lock (142 changes)
@@ -7,11 +7,9 @@
  "dependencies": {
    "@ant-design/icons": "6",
    "@monaco-editor/react": "^4.7.0",
    "@uiw/react-md-editor": "^4.0.11",
    "antd": "6",
    "artplayer": "^5.3.0",
    "date-fns": "^4.1.0",
    "monaco-editor": "^0.55.1",
    "react": "^19.2.3",
    "react-dom": "^19.2.3",
    "react-markdown": "^10.1.0",
@@ -337,8 +335,6 @@
"@types/ms": ["@types/ms@2.1.0", "", {}, "sha512-GsCCIZDE/p3i96vtEqx+7dBUGXrc7zeSK3wwPHIaRThS+9OhWIXRqzs4d6k1SVU8g91DrNRWxWUGhp5KXQb2VA=="],
"@types/prismjs": ["@types/prismjs@1.26.5", "", {}, "sha512-AUZTa7hQ2KY5L7AmtSiqxlhWxb4ina0yd8hNbl4TWuqnv/pFP0nDMb3YrfSBf4hJVGLh2YEIBfKaBW/9UEl6IQ=="],
"@types/react": ["@types/react@19.2.7", "", { "dependencies": { "csstype": "^3.2.2" } }, "sha512-MWtvHrGZLFttgeEj28VXHxpmwYbor/ATPYbBfSFZEIRK0ecCFLl2Qo55z52Hss+UV9CRN7trSeq1zbgx7YDWWg=="],
"@types/react-dom": ["@types/react-dom@19.2.3", "", { "peerDependencies": { "@types/react": "^19.2.0" } }, "sha512-jp2L/eY6fn+KgVVQAOqYItbF0VY/YApe5Mz2F0aykSO8gx31bYCZyvSeYxCHKvzHG5eZjc+zyaS5BrBWya2+kQ=="],
@@ -367,12 +363,6 @@
"@typescript-eslint/visitor-keys": ["@typescript-eslint/visitor-keys@8.51.0", "", { "dependencies": { "@typescript-eslint/types": "8.51.0", "eslint-visitor-keys": "^4.2.1" } }, "sha512-mM/JRQOzhVN1ykejrvwnBRV3+7yTKK8tVANVN3o1O0t0v7o+jqdVu9crPy5Y9dov15TJk/FTIgoUGHrTOVL3Zg=="],
"@uiw/copy-to-clipboard": ["@uiw/copy-to-clipboard@1.0.19", "", {}, "sha512-AYxzFUBkZrhtExb2QC0C4lFH2+BSx6JVId9iqeGHakBuosqiQHUQaNZCvIBeM97Ucp+nJ22flOh8FBT2pKRRAA=="],
"@uiw/react-markdown-preview": ["@uiw/react-markdown-preview@5.1.5", "", { "dependencies": { "@babel/runtime": "^7.17.2", "@uiw/copy-to-clipboard": "~1.0.12", "react-markdown": "~9.0.1", "rehype-attr": "~3.0.1", "rehype-autolink-headings": "~7.1.0", "rehype-ignore": "^2.0.0", "rehype-prism-plus": "2.0.0", "rehype-raw": "^7.0.0", "rehype-rewrite": "~4.0.0", "rehype-slug": "~6.0.0", "remark-gfm": "~4.0.0", "remark-github-blockquote-alert": "^1.0.0", "unist-util-visit": "^5.0.0" }, "peerDependencies": { "react": ">=16.8.0", "react-dom": ">=16.8.0" } }, "sha512-DNOqx1a6gJR7Btt57zpGEKTfHRlb7rWbtctMRO2f82wWcuoJsxPBrM+JWebDdOD0LfD8oe2CQvW2ICQJKHQhZg=="],
"@uiw/react-md-editor": ["@uiw/react-md-editor@4.0.11", "", { "dependencies": { "@babel/runtime": "^7.14.6", "@uiw/react-markdown-preview": "^5.0.6", "rehype": "~13.0.0", "rehype-prism-plus": "~2.0.0" }, "peerDependencies": { "react": ">=16.8.0", "react-dom": ">=16.8.0" } }, "sha512-F0OR5O1v54EkZYvJj3ew0I7UqLiPeU34hMAY4MdXS3hI86rruYi5DHVkG/VuvLkUZW7wIETM2QFtZ459gKIjQA=="],
"@ungap/structured-clone": ["@ungap/structured-clone@1.3.0", "", {}, "sha512-WmoN8qaIAo7WTYWbAZuG8PYEhn5fkz7dZrqTBZ7dtt//lL2Gwms1IcnQ5yHqjDfX8Ft5j4YzDM23f87zBfDe9g=="],
"@vitejs/plugin-react": ["@vitejs/plugin-react@5.1.2", "", { "dependencies": { "@babel/core": "^7.28.5", "@babel/plugin-transform-react-jsx-self": "^7.27.1", "@babel/plugin-transform-react-jsx-source": "^7.27.1", "@rolldown/pluginutils": "1.0.0-beta.53", "@types/babel__core": "^7.20.5", "react-refresh": "^0.18.0" }, "peerDependencies": { "vite": "^4.2.0 || ^5.0.0 || ^6.0.0 || ^7.0.0" } }, "sha512-EcA07pHJouywpzsoTUqNh5NwGayl2PPVEJKUSinGGSxFGYn+shYbqMGBg6FXDqgXum9Ou/ecb+411ssw8HImJQ=="],
@@ -397,10 +387,6 @@
"baseline-browser-mapping": ["baseline-browser-mapping@2.9.11", "", { "bin": { "baseline-browser-mapping": "dist/cli.js" } }, "sha512-Sg0xJUNDU1sJNGdfGWhVHX0kkZ+HWcvmVymJbj6NSgZZmW/8S9Y2HQ5euytnIgakgxN6papOAWiwDo1ctFDcoQ=="],
"bcp-47-match": ["bcp-47-match@2.0.3", "", {}, "sha512-JtTezzbAibu8G0R9op9zb3vcWZd9JF6M0xOYGPn0fNCd7wOpRB1mU2mH9T8gaBGbAAyIIVgB2G7xG0GP98zMAQ=="],
"boolbase": ["boolbase@1.0.0", "", {}, "sha512-JZOSA7Mo9sNGB8+UjSgzdLtokWAky1zbztM3WRLCbZ70/3cTANmQmOdR7y2g+J0e2WXywy1yS468tY+IruqEww=="],
"brace-expansion": ["brace-expansion@1.1.12", "", { "dependencies": { "balanced-match": "^1.0.0", "concat-map": "0.0.1" } }, "sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg=="],
"browserslist": ["browserslist@4.28.1", "", { "dependencies": { "baseline-browser-mapping": "^2.9.0", "caniuse-lite": "^1.0.30001759", "electron-to-chromium": "^1.5.263", "node-releases": "^2.0.27", "update-browserslist-db": "^1.2.0" }, "bin": { "browserslist": "cli.js" } }, "sha512-ZC5Bd0LgJXgwGqUknZY/vkUQ04r8NXnJZ3yYi4vDmSiZmC/pdSN0NbNRPxZpbtO4uAfDUAFffO8IZoM3Gj8IkA=="],
@@ -439,8 +425,6 @@
"cross-spawn": ["cross-spawn@7.0.6", "", { "dependencies": { "path-key": "^3.1.0", "shebang-command": "^2.0.0", "which": "^2.0.1" } }, "sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA=="],
"css-selector-parser": ["css-selector-parser@3.3.0", "", {}, "sha512-Y2asgMGFqJKF4fq4xHDSlFYIkeVfRsm69lQC1q9kbEsH5XtnINTMrweLkjYMeaUgiXBy/uvKeO/a1JHTNnmB2g=="],
"csstype": ["csstype@3.2.3", "", {}, "sha512-z1HGKcYy2xA8AGQfwrn0PAy+PB7X/GSj3UVJW9qKyn43xWa+gl5nXmU4qqLMRzWVLFC8KusUX8T/0kCiOYpAIQ=="],
"date-fns": ["date-fns@4.1.0", "", {}, "sha512-Ukq0owbQXxa/U3EGtsdVBkR1w7KOQ5gIBqdH2hkvknzZPYvBxb/aa6E8L7tmjFtkwZBu3UXBbjIgPo/Ez4xaNg=="],
@@ -457,14 +441,10 @@
"devlop": ["devlop@1.1.0", "", { "dependencies": { "dequal": "^2.0.0" } }, "sha512-RWmIqhcFf1lRYBvNmr7qTNuyCt/7/ns2jbpp1+PalgE/rDQcBT0fioSMUpJ93irlUhC5hrg4cYqe6U+0ImW0rA=="],
"direction": ["direction@2.0.1", "", { "bin": { "direction": "cli.js" } }, "sha512-9S6m9Sukh1cZNknO1CWAr2QAWsbKLafQiyM5gZ7VgXHeuaoUwffKN4q6NC4A/Mf9iiPlOXQEKW/Mv/mh9/3YFA=="],
"dompurify": ["dompurify@3.2.7", "", { "optionalDependencies": { "@types/trusted-types": "^2.0.7" } }, "sha512-WhL/YuveyGXJaerVlMYGWhvQswa7myDG17P7Vu65EWC05o8vfeNbvNf4d/BOvH99+ZW+LlQsc1GDKMa1vNK6dw=="],
"electron-to-chromium": ["electron-to-chromium@1.5.267", "", {}, "sha512-0Drusm6MVRXSOJpGbaSVgcQsuB4hEkMpHXaVstcPmhu5LIedxs1xNK/nIxmQIU/RPC0+1/o0AVZfBTkTNJOdUw=="],
"entities": ["entities@6.0.1", "", {}, "sha512-aN97NXWF6AWBTahfVOIrB/NShkzi5H7F9r1s9mD3cDj4Ko5f2qhhVoYMibXF7GlLveb/D2ioWay8lxI97Ven3g=="],
"esbuild": ["esbuild@0.27.2", "", { "optionalDependencies": { "@esbuild/aix-ppc64": "0.27.2", "@esbuild/android-arm": "0.27.2", "@esbuild/android-arm64": "0.27.2", "@esbuild/android-x64": "0.27.2", "@esbuild/darwin-arm64": "0.27.2", "@esbuild/darwin-x64": "0.27.2", "@esbuild/freebsd-arm64": "0.27.2", "@esbuild/freebsd-x64": "0.27.2", "@esbuild/linux-arm": "0.27.2", "@esbuild/linux-arm64": "0.27.2", "@esbuild/linux-ia32": "0.27.2", "@esbuild/linux-loong64": "0.27.2", "@esbuild/linux-mips64el": "0.27.2", "@esbuild/linux-ppc64": "0.27.2", "@esbuild/linux-riscv64": "0.27.2", "@esbuild/linux-s390x": "0.27.2", "@esbuild/linux-x64": "0.27.2", "@esbuild/netbsd-arm64": "0.27.2", "@esbuild/netbsd-x64": "0.27.2", "@esbuild/openbsd-arm64": "0.27.2", "@esbuild/openbsd-x64": "0.27.2", "@esbuild/openharmony-arm64": "0.27.2", "@esbuild/sunos-x64": "0.27.2", "@esbuild/win32-arm64": "0.27.2", "@esbuild/win32-ia32": "0.27.2", "@esbuild/win32-x64": "0.27.2" }, "bin": { "esbuild": "bin/esbuild" } }, "sha512-HyNQImnsOC7X9PMNaCIeAm4ISCQXs5a5YasTXVliKv4uuBo1dKrG0A+uQS8M5eXjVMnLg3WgXaKvprHlFJQffw=="],
"escalade": ["escalade@3.2.0", "", {}, "sha512-WUj2qlxaQtO4g6Pq5c29GTcWGDyd8itL8zTlipgECz3JesAiiOKotd8JU6otB3PACgG6xkJUyVhboMS+bje/jA=="],
@@ -515,50 +495,22 @@
"gensync": ["gensync@1.0.0-beta.2", "", {}, "sha512-3hN7NaskYvMDLQY55gnW3NQ+mesEAepTqlg+VEbj7zzqEMBVNhzcGYYeqFo/TlYz6eQiFcp1HcsCZO+nGgS8zg=="],
"github-slugger": ["github-slugger@2.0.0", "", {}, "sha512-IaOQ9puYtjrkq7Y0Ygl9KDZnrf/aiUJYUpVf89y8kyaxbRG7Y1SrX/jaumrv81vc61+kiMempujsM3Yw7w5qcw=="],
"glob-parent": ["glob-parent@6.0.2", "", { "dependencies": { "is-glob": "^4.0.3" } }, "sha512-XxwI8EOhVQgWp6iDL+3b0r86f4d6AX6zSU55HfB4ydCEuXLXc5FcYeOu+nnGftS4TEju/11rt4KJPTMgbfmv4A=="],
"globals": ["globals@16.5.0", "", {}, "sha512-c/c15i26VrJ4IRt5Z89DnIzCGDn9EcebibhAOjw5ibqEHsE1wLUgkPn9RDmNcUKyU87GeaL633nyJ+pplFR2ZQ=="],
"has-flag": ["has-flag@4.0.0", "", {}, "sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ=="],
"hast-util-from-html": ["hast-util-from-html@2.0.3", "", { "dependencies": { "@types/hast": "^3.0.0", "devlop": "^1.1.0", "hast-util-from-parse5": "^8.0.0", "parse5": "^7.0.0", "vfile": "^6.0.0", "vfile-message": "^4.0.0" } }, "sha512-CUSRHXyKjzHov8yKsQjGOElXy/3EKpyX56ELnkHH34vDVw1N1XSQ1ZcAvTyAPtGqLTuKP/uxM+aLkSPqF/EtMw=="],
"hast-util-from-parse5": ["hast-util-from-parse5@8.0.3", "", { "dependencies": { "@types/hast": "^3.0.0", "@types/unist": "^3.0.0", "devlop": "^1.0.0", "hastscript": "^9.0.0", "property-information": "^7.0.0", "vfile": "^6.0.0", "vfile-location": "^5.0.0", "web-namespaces": "^2.0.0" } }, "sha512-3kxEVkEKt0zvcZ3hCRYI8rqrgwtlIOFMWkbclACvjlDw8Li9S2hk/d51OI0nr/gIpdMHNepwgOKqZ/sy0Clpyg=="],
"hast-util-has-property": ["hast-util-has-property@3.0.0", "", { "dependencies": { "@types/hast": "^3.0.0" } }, "sha512-MNilsvEKLFpV604hwfhVStK0usFY/QmM5zX16bo7EjnAEGofr5YyI37kzopBlZJkHD4t887i+q/C8/tr5Q94cA=="],
"hast-util-heading-rank": ["hast-util-heading-rank@3.0.0", "", { "dependencies": { "@types/hast": "^3.0.0" } }, "sha512-EJKb8oMUXVHcWZTDepnr+WNbfnXKFNf9duMesmr4S8SXTJBJ9M4Yok08pu9vxdJwdlGRhVumk9mEhkEvKGifwA=="],
"hast-util-is-element": ["hast-util-is-element@3.0.0", "", { "dependencies": { "@types/hast": "^3.0.0" } }, "sha512-Val9mnv2IWpLbNPqc/pUem+a7Ipj2aHacCwgNfTiK0vJKl0LF+4Ba4+v1oPHFpf3bLYmreq0/l3Gud9S5OH42g=="],
"hast-util-parse-selector": ["hast-util-parse-selector@3.1.1", "", { "dependencies": { "@types/hast": "^2.0.0" } }, "sha512-jdlwBjEexy1oGz0aJ2f4GKMaVKkA9jwjr4MjAAI22E5fM/TXVZHuS5OpONtdeIkRKqAaryQ2E9xNQxijoThSZA=="],
"hast-util-raw": ["hast-util-raw@9.1.0", "", { "dependencies": { "@types/hast": "^3.0.0", "@types/unist": "^3.0.0", "@ungap/structured-clone": "^1.0.0", "hast-util-from-parse5": "^8.0.0", "hast-util-to-parse5": "^8.0.0", "html-void-elements": "^3.0.0", "mdast-util-to-hast": "^13.0.0", "parse5": "^7.0.0", "unist-util-position": "^5.0.0", "unist-util-visit": "^5.0.0", "vfile": "^6.0.0", "web-namespaces": "^2.0.0", "zwitch": "^2.0.0" } }, "sha512-Y8/SBAHkZGoNkpzqqfCldijcuUKh7/su31kEBp67cFY09Wy0mTRgtsLYsiIxMJxlu0f6AA5SUTbDR8K0rxnbUw=="],
"hast-util-select": ["hast-util-select@6.0.4", "", { "dependencies": { "@types/hast": "^3.0.0", "@types/unist": "^3.0.0", "bcp-47-match": "^2.0.0", "comma-separated-tokens": "^2.0.0", "css-selector-parser": "^3.0.0", "devlop": "^1.0.0", "direction": "^2.0.0", "hast-util-has-property": "^3.0.0", "hast-util-to-string": "^3.0.0", "hast-util-whitespace": "^3.0.0", "nth-check": "^2.0.0", "property-information": "^7.0.0", "space-separated-tokens": "^2.0.0", "unist-util-visit": "^5.0.0", "zwitch": "^2.0.0" } }, "sha512-RqGS1ZgI0MwxLaKLDxjprynNzINEkRHY2i8ln4DDjgv9ZhcYVIHN9rlpiYsqtFwrgpYU361SyWDQcGNIBVu3lw=="],
"hast-util-to-html": ["hast-util-to-html@9.0.5", "", { "dependencies": { "@types/hast": "^3.0.0", "@types/unist": "^3.0.0", "ccount": "^2.0.0", "comma-separated-tokens": "^2.0.0", "hast-util-whitespace": "^3.0.0", "html-void-elements": "^3.0.0", "mdast-util-to-hast": "^13.0.0", "property-information": "^7.0.0", "space-separated-tokens": "^2.0.0", "stringify-entities": "^4.0.0", "zwitch": "^2.0.4" } }, "sha512-OguPdidb+fbHQSU4Q4ZiLKnzWo8Wwsf5bZfbvu7//a9oTYoqD/fWpe96NuHkoS9h0ccGOTe0C4NGXdtS0iObOw=="],
"hast-util-to-jsx-runtime": ["hast-util-to-jsx-runtime@2.3.6", "", { "dependencies": { "@types/estree": "^1.0.0", "@types/hast": "^3.0.0", "@types/unist": "^3.0.0", "comma-separated-tokens": "^2.0.0", "devlop": "^1.0.0", "estree-util-is-identifier-name": "^3.0.0", "hast-util-whitespace": "^3.0.0", "mdast-util-mdx-expression": "^2.0.0", "mdast-util-mdx-jsx": "^3.0.0", "mdast-util-mdxjs-esm": "^2.0.0", "property-information": "^7.0.0", "space-separated-tokens": "^2.0.0", "style-to-js": "^1.0.0", "unist-util-position": "^5.0.0", "vfile-message": "^4.0.0" } }, "sha512-zl6s8LwNyo1P9uw+XJGvZtdFF1GdAkOg8ujOw+4Pyb76874fLps4ueHXDhXWdk6YHQ6OgUtinliG7RsYvCbbBg=="],
"hast-util-to-parse5": ["hast-util-to-parse5@8.0.1", "", { "dependencies": { "@types/hast": "^3.0.0", "comma-separated-tokens": "^2.0.0", "devlop": "^1.0.0", "property-information": "^7.0.0", "space-separated-tokens": "^2.0.0", "web-namespaces": "^2.0.0", "zwitch": "^2.0.0" } }, "sha512-MlWT6Pjt4CG9lFCjiz4BH7l9wmrMkfkJYCxFwKQic8+RTZgWPuWxwAfjJElsXkex7DJjfSJsQIt931ilUgmwdA=="],
"hast-util-to-string": ["hast-util-to-string@3.0.1", "", { "dependencies": { "@types/hast": "^3.0.0" } }, "sha512-XelQVTDWvqcl3axRfI0xSeoVKzyIFPwsAGSLIsKdJKQMXDYJS4WYrBNF/8J7RdhIcFI2BOHgAifggsvsxp/3+A=="],
"hast-util-whitespace": ["hast-util-whitespace@3.0.0", "", { "dependencies": { "@types/hast": "^3.0.0" } }, "sha512-88JUN06ipLwsnv+dVn+OIYOvAuvBMy/Qoi6O7mQHxdPXpjy+Cd6xRkWwux7DKO+4sYILtLBRIKgsdpS2gQc7qw=="],
"hastscript": ["hastscript@7.2.0", "", { "dependencies": { "@types/hast": "^2.0.0", "comma-separated-tokens": "^2.0.0", "hast-util-parse-selector": "^3.0.0", "property-information": "^6.0.0", "space-separated-tokens": "^2.0.0" } }, "sha512-TtYPq24IldU8iKoJQqvZOuhi5CyCQRAbvDOX0x1eW6rsHSxa/1i2CCiptNTotGHJ3VoHRGmqiv6/D3q113ikkw=="],
"hermes-estree": ["hermes-estree@0.25.1", "", {}, "sha512-0wUoCcLp+5Ev5pDW2OriHC2MJCbwLwuRx+gAqMTOkGKJJiBCLjtrvy4PWUGn6MIVefecRpzoOZ/UV6iGdOr+Cw=="],
"hermes-parser": ["hermes-parser@0.25.1", "", { "dependencies": { "hermes-estree": "0.25.1" } }, "sha512-6pEjquH3rqaI6cYAXYPcz9MS4rY6R4ngRgrgfDshRptUZIc3lw0MCIJIGDj9++mfySOuPTHB4nrSW99BCvOPIA=="],
"html-url-attributes": ["html-url-attributes@3.0.1", "", {}, "sha512-ol6UPyBWqsrO6EJySPz2O7ZSr856WDrEzM5zMqp+FJJLGMW35cLYmmZnl0vztAZxRUoNZJFTCohfjuIJ8I4QBQ=="],
"html-void-elements": ["html-void-elements@3.0.0", "", {}, "sha512-bEqo66MRXsUGxWHV5IP0PUiAWwoEjba4VCzg0LjFJBpchPaTfyfCKTG6bc5F8ucKec3q5y6qOdGyYTSBEvhCrg=="],
"ignore": ["ignore@5.3.2", "", {}, "sha512-hsBTNUqQTDwkWtcdYI2i06Y/nUBEsNEDJKjWdigLvegy8kDuJAS8uRlpkkcQpyEXL0Z/pjDy5HBmMjRCJ2gq+g=="],
"import-fresh": ["import-fresh@3.3.1", "", { "dependencies": { "parent-module": "^1.0.0", "resolve-from": "^4.0.0" } }, "sha512-TR3KfrTZTYLPB6jUjfx6MF9WcWrHL9su5TObK4ZkYgBdWKPOFoSoQIdEuTuR82pmtxH2spWG9h6etwfr1pLBqQ=="],
@@ -615,26 +567,10 @@
"lru-cache": ["lru-cache@5.1.1", "", { "dependencies": { "yallist": "^3.0.2" } }, "sha512-KpNARQA3Iwv+jTA0utUVVbrh+Jlrr1Fv0e56GGzAFOXN7dk/FviaDW8LHmK52DlcH4WP2n6gI8vN1aesBFgo9w=="],
"markdown-table": ["markdown-table@3.0.4", "", {}, "sha512-wiYz4+JrLyb/DqW2hkFJxP7Vd7JuTDm77fvbM8VfEQdmSMqcImWeeRbHwZjBjIFki/VaMK2BhFi7oUUZeM5bqw=="],
"marked": ["marked@14.0.0", "", { "bin": { "marked": "bin/marked.js" } }, "sha512-uIj4+faQ+MgHgwUW1l2PsPglZLOLOT1uErt06dAPtx2kjteLAkbsd/0FiYg/MGS+i7ZKLb7w2WClxHkzOOuryQ=="],
"mdast-util-find-and-replace": ["mdast-util-find-and-replace@3.0.2", "", { "dependencies": { "@types/mdast": "^4.0.0", "escape-string-regexp": "^5.0.0", "unist-util-is": "^6.0.0", "unist-util-visit-parents": "^6.0.0" } }, "sha512-Tmd1Vg/m3Xz43afeNxDIhWRtFZgM2VLyaf4vSTYwudTyeuTneoL3qtWMA5jeLyz/O1vDJmmV4QuScFCA2tBPwg=="],
"mdast-util-from-markdown": ["mdast-util-from-markdown@2.0.2", "", { "dependencies": { "@types/mdast": "^4.0.0", "@types/unist": "^3.0.0", "decode-named-character-reference": "^1.0.0", "devlop": "^1.0.0", "mdast-util-to-string": "^4.0.0", "micromark": "^4.0.0", "micromark-util-decode-numeric-character-reference": "^2.0.0", "micromark-util-decode-string": "^2.0.0", "micromark-util-normalize-identifier": "^2.0.0", "micromark-util-symbol": "^2.0.0", "micromark-util-types": "^2.0.0", "unist-util-stringify-position": "^4.0.0" } }, "sha512-uZhTV/8NBuw0WHkPTrCqDOl0zVe1BIng5ZtHoDk49ME1qqcjYmmLmOf0gELgcRMxN4w2iuIeVso5/6QymSrgmA=="],
"mdast-util-gfm": ["mdast-util-gfm@3.1.0", "", { "dependencies": { "mdast-util-from-markdown": "^2.0.0", "mdast-util-gfm-autolink-literal": "^2.0.0", "mdast-util-gfm-footnote": "^2.0.0", "mdast-util-gfm-strikethrough": "^2.0.0", "mdast-util-gfm-table": "^2.0.0", "mdast-util-gfm-task-list-item": "^2.0.0", "mdast-util-to-markdown": "^2.0.0" } }, "sha512-0ulfdQOM3ysHhCJ1p06l0b0VKlhU0wuQs3thxZQagjcjPrlFRqY215uZGHHJan9GEAXd9MbfPjFJz+qMkVR6zQ=="],
"mdast-util-gfm-autolink-literal": ["mdast-util-gfm-autolink-literal@2.0.1", "", { "dependencies": { "@types/mdast": "^4.0.0", "ccount": "^2.0.0", "devlop": "^1.0.0", "mdast-util-find-and-replace": "^3.0.0", "micromark-util-character": "^2.0.0" } }, "sha512-5HVP2MKaP6L+G6YaxPNjuL0BPrq9orG3TsrZ9YXbA3vDw/ACI4MEsnoDpn6ZNm7GnZgtAcONJyPhOP8tNJQavQ=="],
"mdast-util-gfm-footnote": ["mdast-util-gfm-footnote@2.1.0", "", { "dependencies": { "@types/mdast": "^4.0.0", "devlop": "^1.1.0", "mdast-util-from-markdown": "^2.0.0", "mdast-util-to-markdown": "^2.0.0", "micromark-util-normalize-identifier": "^2.0.0" } }, "sha512-sqpDWlsHn7Ac9GNZQMeUzPQSMzR6Wv0WKRNvQRg0KqHh02fpTz69Qc1QSseNX29bhz1ROIyNyxExfawVKTm1GQ=="],
"mdast-util-gfm-strikethrough": ["mdast-util-gfm-strikethrough@2.0.0", "", { "dependencies": { "@types/mdast": "^4.0.0", "mdast-util-from-markdown": "^2.0.0", "mdast-util-to-markdown": "^2.0.0" } }, "sha512-mKKb915TF+OC5ptj5bJ7WFRPdYtuHv0yTRxK2tJvi+BDqbkiG7h7u/9SI89nRAYcmap2xHQL9D+QG/6wSrTtXg=="],
"mdast-util-gfm-table": ["mdast-util-gfm-table@2.0.0", "", { "dependencies": { "@types/mdast": "^4.0.0", "devlop": "^1.0.0", "markdown-table": "^3.0.0", "mdast-util-from-markdown": "^2.0.0", "mdast-util-to-markdown": "^2.0.0" } }, "sha512-78UEvebzz/rJIxLvE7ZtDd/vIQ0RHv+3Mh5DR96p7cS7HsBhYIICDBCu8csTNWNO6tBWfqXPWekRuj2FNOGOZg=="],
"mdast-util-gfm-task-list-item": ["mdast-util-gfm-task-list-item@2.0.0", "", { "dependencies": { "@types/mdast": "^4.0.0", "devlop": "^1.0.0", "mdast-util-from-markdown": "^2.0.0", "mdast-util-to-markdown": "^2.0.0" } }, "sha512-IrtvNvjxC1o06taBAVJznEnkiHxLFTzgonUdy8hzFVeDun0uTjxxrRGVaNFqkU1wJR3RBPEfsxmU6jDWPofrTQ=="],
"mdast-util-mdx-expression": ["mdast-util-mdx-expression@2.0.1", "", { "dependencies": { "@types/estree-jsx": "^1.0.0", "@types/hast": "^3.0.0", "@types/mdast": "^4.0.0", "devlop": "^1.0.0", "mdast-util-from-markdown": "^2.0.0", "mdast-util-to-markdown": "^2.0.0" } }, "sha512-J6f+9hUp+ldTZqKRSg7Vw5V6MqjATc+3E4gf3CFNcuZNWD8XdyI6zQ8GqH7f8169MM6P7hMBRDVGnn7oHB9kXQ=="],
"mdast-util-mdx-jsx": ["mdast-util-mdx-jsx@3.2.0", "", { "dependencies": { "@types/estree-jsx": "^1.0.0", "@types/hast": "^3.0.0", "@types/mdast": "^4.0.0", "@types/unist": "^3.0.0", "ccount": "^2.0.0", "devlop": "^1.1.0", "mdast-util-from-markdown": "^2.0.0", "mdast-util-to-markdown": "^2.0.0", "parse-entities": "^4.0.0", "stringify-entities": "^4.0.0", "unist-util-stringify-position": "^4.0.0", "vfile-message": "^4.0.0" } }, "sha512-lj/z8v0r6ZtsN/cGNNtemmmfoLAFZnjMbNyLzBafjzikOM+glrjNHPlf6lQDOTccj9n5b0PPihEBbhneMyGs1Q=="],
@@ -653,20 +589,6 @@
"micromark-core-commonmark": ["micromark-core-commonmark@2.0.3", "", { "dependencies": { "decode-named-character-reference": "^1.0.0", "devlop": "^1.0.0", "micromark-factory-destination": "^2.0.0", "micromark-factory-label": "^2.0.0", "micromark-factory-space": "^2.0.0", "micromark-factory-title": "^2.0.0", "micromark-factory-whitespace": "^2.0.0", "micromark-util-character": "^2.0.0", "micromark-util-chunked": "^2.0.0", "micromark-util-classify-character": "^2.0.0", "micromark-util-html-tag-name": "^2.0.0", "micromark-util-normalize-identifier": "^2.0.0", "micromark-util-resolve-all": "^2.0.0", "micromark-util-subtokenize": "^2.0.0", "micromark-util-symbol": "^2.0.0", "micromark-util-types": "^2.0.0" } }, "sha512-RDBrHEMSxVFLg6xvnXmb1Ayr2WzLAWjeSATAoxwKYJV94TeNavgoIdA0a9ytzDSVzBy2YKFK+emCPOEibLeCrg=="],
"micromark-extension-gfm": ["micromark-extension-gfm@3.0.0", "", { "dependencies": { "micromark-extension-gfm-autolink-literal": "^2.0.0", "micromark-extension-gfm-footnote": "^2.0.0", "micromark-extension-gfm-strikethrough": "^2.0.0", "micromark-extension-gfm-table": "^2.0.0", "micromark-extension-gfm-tagfilter": "^2.0.0", "micromark-extension-gfm-task-list-item": "^2.0.0", "micromark-util-combine-extensions": "^2.0.0", "micromark-util-types": "^2.0.0" } }, "sha512-vsKArQsicm7t0z2GugkCKtZehqUm31oeGBV/KVSorWSy8ZlNAv7ytjFhvaryUiCUJYqs+NoE6AFhpQvBTM6Q4w=="],
"micromark-extension-gfm-autolink-literal": ["micromark-extension-gfm-autolink-literal@2.1.0", "", { "dependencies": { "micromark-util-character": "^2.0.0", "micromark-util-sanitize-uri": "^2.0.0", "micromark-util-symbol": "^2.0.0", "micromark-util-types": "^2.0.0" } }, "sha512-oOg7knzhicgQ3t4QCjCWgTmfNhvQbDDnJeVu9v81r7NltNCVmhPy1fJRX27pISafdjL+SVc4d3l48Gb6pbRypw=="],
"micromark-extension-gfm-footnote": ["micromark-extension-gfm-footnote@2.1.0", "", { "dependencies": { "devlop": "^1.0.0", "micromark-core-commonmark": "^2.0.0", "micromark-factory-space": "^2.0.0", "micromark-util-character": "^2.0.0", "micromark-util-normalize-identifier": "^2.0.0", "micromark-util-sanitize-uri": "^2.0.0", "micromark-util-symbol": "^2.0.0", "micromark-util-types": "^2.0.0" } }, "sha512-/yPhxI1ntnDNsiHtzLKYnE3vf9JZ6cAisqVDauhp4CEHxlb4uoOTxOCJ+9s51bIB8U1N1FJ1RXOKTIlD5B/gqw=="],
"micromark-extension-gfm-strikethrough": ["micromark-extension-gfm-strikethrough@2.1.0", "", { "dependencies": { "devlop": "^1.0.0", "micromark-util-chunked": "^2.0.0", "micromark-util-classify-character": "^2.0.0", "micromark-util-resolve-all": "^2.0.0", "micromark-util-symbol": "^2.0.0", "micromark-util-types": "^2.0.0" } }, "sha512-ADVjpOOkjz1hhkZLlBiYA9cR2Anf8F4HqZUO6e5eDcPQd0Txw5fxLzzxnEkSkfnD0wziSGiv7sYhk/ktvbf1uw=="],
"micromark-extension-gfm-table": ["micromark-extension-gfm-table@2.1.1", "", { "dependencies": { "devlop": "^1.0.0", "micromark-factory-space": "^2.0.0", "micromark-util-character": "^2.0.0", "micromark-util-symbol": "^2.0.0", "micromark-util-types": "^2.0.0" } }, "sha512-t2OU/dXXioARrC6yWfJ4hqB7rct14e8f7m0cbI5hUmDyyIlwv5vEtooptH8INkbLzOatzKuVbQmAYcbWoyz6Dg=="],
"micromark-extension-gfm-tagfilter": ["micromark-extension-gfm-tagfilter@2.0.0", "", { "dependencies": { "micromark-util-types": "^2.0.0" } }, "sha512-xHlTOmuCSotIA8TW1mDIM6X2O1SiX5P9IuDtqGonFhEK0qgRI4yeC6vMxEV2dgyr2TiD+2PQ10o+cOhdVAcwfg=="],
"micromark-extension-gfm-task-list-item": ["micromark-extension-gfm-task-list-item@2.1.0", "", { "dependencies": { "devlop": "^1.0.0", "micromark-factory-space": "^2.0.0", "micromark-util-character": "^2.0.0", "micromark-util-symbol": "^2.0.0", "micromark-util-types": "^2.0.0" } }, "sha512-qIBZhqxqI6fjLDYFTBIa4eivDMnP+OZqsNwmQ3xNLE4Cxwc+zfQEfbs6tzAo2Hjq+bh6q5F+Z8/cksrLFYWQQw=="],
"micromark-factory-destination": ["micromark-factory-destination@2.0.1", "", { "dependencies": { "micromark-util-character": "^2.0.0", "micromark-util-symbol": "^2.0.0", "micromark-util-types": "^2.0.0" } }, "sha512-Xe6rDdJlkmbFRExpTOmRj9N3MaWmbAgdpSrBQvCFqhezUn4AHqJHbaEnfbVYYiexVSs//tqOdY/DxhjdCiJnIA=="],
"micromark-factory-label": ["micromark-factory-label@2.0.1", "", { "dependencies": { "devlop": "^1.0.0", "micromark-util-character": "^2.0.0", "micromark-util-symbol": "^2.0.0", "micromark-util-types": "^2.0.0" } }, "sha512-VFMekyQExqIW7xIChcXn4ok29YE3rnuyveW3wZQWWqF4Nv9Wk5rgJ99KzPvHjkmPXF93FXIbBp6YdW3t71/7Vg=="],
@@ -717,8 +639,6 @@
"node-releases": ["node-releases@2.0.27", "", {}, "sha512-nmh3lCkYZ3grZvqcCH+fjmQ7X+H0OeZgP40OierEaAptX4XofMh5kwNbWh7lBduUzCcV/8kZ+NDLCwm2iorIlA=="],
"nth-check": ["nth-check@2.1.1", "", { "dependencies": { "boolbase": "^1.0.0" } }, "sha512-lqjrjmaOoAnWfMmBPL+XNnynZh2+swxiX3WUE0s4yEHI6m+AwrK2UZOimIRl3X/4QctVqS8AiZjFqyOGrMXb/w=="],
"option-validator": ["option-validator@2.0.6", "", { "dependencies": { "kind-of": "^6.0.3" } }, "sha512-tmZDan2LRIRQyhUGvkff68/O0R8UmF+Btmiiz0SmSw2ng3CfPZB9wJlIjHpe/MKUZqyIZkVIXCrwr1tIN+0Dzg=="],
"optionator": ["optionator@0.9.4", "", { "dependencies": { "deep-is": "^0.1.3", "fast-levenshtein": "^2.0.6", "levn": "^0.4.1", "prelude-ls": "^1.2.1", "type-check": "^0.4.0", "word-wrap": "^1.2.5" } }, "sha512-6IpQ7mKUxRcZNLIObR0hz7lxsapSSIYNZJwXPGeF0mTVqGKFIXj1DQcMoT22S3ROcLyY/rz0PWaWZ9ayWmad9g=="],
@@ -731,10 +651,6 @@
"parse-entities": ["parse-entities@4.0.2", "", { "dependencies": { "@types/unist": "^2.0.0", "character-entities-legacy": "^3.0.0", "character-reference-invalid": "^2.0.0", "decode-named-character-reference": "^1.0.0", "is-alphanumerical": "^2.0.0", "is-decimal": "^2.0.0", "is-hexadecimal": "^2.0.0" } }, "sha512-GG2AQYWoLgL877gQIKeRPGO1xF9+eG1ujIb5soS5gPvLQ1y2o8FL90w2QWNdf9I361Mpp7726c+lj3U0qK1uGw=="],
"parse-numeric-range": ["parse-numeric-range@1.3.0", "", {}, "sha512-twN+njEipszzlMJd4ONUYgSfZPDxgHhT9Ahed5uTigpQn90FggW4SA/AIPq/6a149fTbE9qBEcSwE3FAEp6wQQ=="],
"parse5": ["parse5@7.3.0", "", { "dependencies": { "entities": "^6.0.0" } }, "sha512-IInvU7fabl34qmi9gY8XOVxhYyMyuH2xUNpb2q8/Y+7552KlejkRvqvD19nMoUW/uQGGbqNpA6Tufu5FL5BZgw=="],
"path-exists": ["path-exists@4.0.0", "", {}, "sha512-ak9Qy5Q7jYb2Wwcey5Fpvg2KoAc/ZIhLSLOSBmRmygPsGwkVVt0fZa0qrtMz+m6tJTAHfZQ8FnmB4MG4LWy7/w=="],
"path-key": ["path-key@3.1.1", "", {}, "sha512-ojmeN0qd+y0jszEtoY48r0Peq5dwMEkIlCOu6Q5f41lfkswXuKtYrhgoTpLnyIcHm24Uhqx+5Tqm2InSwLhE6Q=="],
@@ -763,38 +679,10 @@
"react-router": ["react-router@7.11.0", "", { "dependencies": { "cookie": "^1.0.1", "set-cookie-parser": "^2.6.0" }, "peerDependencies": { "react": ">=18", "react-dom": ">=18" }, "optionalPeers": ["react-dom"] }, "sha512-uI4JkMmjbWCZc01WVP2cH7ZfSzH91JAZUDd7/nIprDgWxBV1TkkmLToFh7EbMTcMak8URFRa2YoBL/W8GWnCTQ=="],
"refractor": ["refractor@4.9.0", "", { "dependencies": { "@types/hast": "^2.0.0", "@types/prismjs": "^1.0.0", "hastscript": "^7.0.0", "parse-entities": "^4.0.0" } }, "sha512-nEG1SPXFoGGx+dcjftjv8cAjEusIh6ED1xhf5DG3C0x/k+rmZ2duKnc3QLpt6qeHv5fPb8uwN3VWN2BT7fr3Og=="],
"rehype": ["rehype@13.0.2", "", { "dependencies": { "@types/hast": "^3.0.0", "rehype-parse": "^9.0.0", "rehype-stringify": "^10.0.0", "unified": "^11.0.0" } }, "sha512-j31mdaRFrwFRUIlxGeuPXXKWQxet52RBQRvCmzl5eCefn/KGbomK5GMHNMsOJf55fgo3qw5tST5neDuarDYR2A=="],
"rehype-attr": ["rehype-attr@3.0.3", "", { "dependencies": { "unified": "~11.0.0", "unist-util-visit": "~5.0.0" } }, "sha512-Up50Xfra8tyxnkJdCzLBIBtxOcB2M1xdeKe1324U06RAvSjYm7ULSeoM+b/nYPQPVd7jsXJ9+39IG1WAJPXONw=="],
"rehype-autolink-headings": ["rehype-autolink-headings@7.1.0", "", { "dependencies": { "@types/hast": "^3.0.0", "@ungap/structured-clone": "^1.0.0", "hast-util-heading-rank": "^3.0.0", "hast-util-is-element": "^3.0.0", "unified": "^11.0.0", "unist-util-visit": "^5.0.0" } }, "sha512-rItO/pSdvnvsP4QRB1pmPiNHUskikqtPojZKJPPPAVx9Hj8i8TwMBhofrrAYRhYOOBZH9tgmG5lPqDLuIWPWmw=="],
"rehype-ignore": ["rehype-ignore@2.0.3", "", { "dependencies": { "hast-util-select": "^6.0.0", "unified": "^11.0.0", "unist-util-visit": "^5.0.0" } }, "sha512-IzhP6/u/6sm49sdktuYSmeIuObWB+5yC/5eqVws8BhuGA9kY25/byz6uCy/Ravj6lXUShEd2ofHM5MyAIj86Sg=="],
"rehype-parse": ["rehype-parse@9.0.1", "", { "dependencies": { "@types/hast": "^3.0.0", "hast-util-from-html": "^2.0.0", "unified": "^11.0.0" } }, "sha512-ksCzCD0Fgfh7trPDxr2rSylbwq9iYDkSn8TCDmEJ49ljEUBxDVCzCHv7QNzZOfODanX4+bWQ4WZqLCRWYLfhag=="],
"rehype-prism-plus": ["rehype-prism-plus@2.0.1", "", { "dependencies": { "hast-util-to-string": "^3.0.0", "parse-numeric-range": "^1.3.0", "refractor": "^4.8.0", "rehype-parse": "^9.0.0", "unist-util-filter": "^5.0.0", "unist-util-visit": "^5.0.0" } }, "sha512-Wglct0OW12tksTUseAPyWPo3srjBOY7xKlql/DPKi7HbsdZTyaLCAoO58QBKSczFQxElTsQlOY3JDOFzB/K++Q=="],
"rehype-raw": ["rehype-raw@7.0.0", "", { "dependencies": { "@types/hast": "^3.0.0", "hast-util-raw": "^9.0.0", "vfile": "^6.0.0" } }, "sha512-/aE8hCfKlQeA8LmyeyQvQF3eBiLRGNlfBJEvWH7ivp9sBqs7TNqBL5X3v157rM4IFETqDnIOO+z5M/biZbo9Ww=="],
"rehype-rewrite": ["rehype-rewrite@4.0.4", "", { "dependencies": { "hast-util-select": "^6.0.0", "unified": "^11.0.3", "unist-util-visit": "^5.0.0" } }, "sha512-L/FO96EOzSA6bzOam4DVu61/PB3AGKcSPXpa53yMIozoxH4qg1+bVZDF8zh1EsuxtSauAhzt5cCnvoplAaSLrw=="],
"rehype-slug": ["rehype-slug@6.0.0", "", { "dependencies": { "@types/hast": "^3.0.0", "github-slugger": "^2.0.0", "hast-util-heading-rank": "^3.0.0", "hast-util-to-string": "^3.0.0", "unist-util-visit": "^5.0.0" } }, "sha512-lWyvf/jwu+oS5+hL5eClVd3hNdmwM1kAC0BUvEGD19pajQMIzcNUd/k9GsfQ+FfECvX+JE+e9/btsKH0EjJT6A=="],
"rehype-stringify": ["rehype-stringify@10.0.1", "", { "dependencies": { "@types/hast": "^3.0.0", "hast-util-to-html": "^9.0.0", "unified": "^11.0.0" } }, "sha512-k9ecfXHmIPuFVI61B9DeLPN0qFHfawM6RsuX48hoqlaKSF61RskNjSm1lI8PhBEM0MRdLxVVm4WmTqJQccH9mA=="],
"remark-gfm": ["remark-gfm@4.0.1", "", { "dependencies": { "@types/mdast": "^4.0.0", "mdast-util-gfm": "^3.0.0", "micromark-extension-gfm": "^3.0.0", "remark-parse": "^11.0.0", "remark-stringify": "^11.0.0", "unified": "^11.0.0" } }, "sha512-1quofZ2RQ9EWdeN34S79+KExV1764+wCUGop5CPL1WGdD0ocPpu91lzPGbwWMECpEpd42kJGQwzRfyov9j4yNg=="],
"remark-github-blockquote-alert": ["remark-github-blockquote-alert@1.3.1", "", { "dependencies": { "unist-util-visit": "^5.0.0" } }, "sha512-OPNnimcKeozWN1w8KVQEuHOxgN3L4rah8geMOLhA5vN9wITqU4FWD+G26tkEsCGHiOVDbISx+Se5rGZ+D1p0Jg=="],
"remark-parse": ["remark-parse@11.0.0", "", { "dependencies": { "@types/mdast": "^4.0.0", "mdast-util-from-markdown": "^2.0.0", "micromark-util-types": "^2.0.0", "unified": "^11.0.0" } }, "sha512-FCxlKLNGknS5ba/1lmpYijMUzX2esxW5xQqjWxw2eHFfS2MSdaHVINFmhjo+qN1WhZhNimq0dZATN9pH0IDrpA=="],
"remark-rehype": ["remark-rehype@11.1.2", "", { "dependencies": { "@types/hast": "^3.0.0", "@types/mdast": "^4.0.0", "mdast-util-to-hast": "^13.0.0", "unified": "^11.0.0", "vfile": "^6.0.0" } }, "sha512-Dh7l57ianaEoIpzbp0PC9UKAdCSVklD8E5Rpw7ETfbTl3FqcOOgq5q2LVDhgGCkaBv7p24JXikPdvhhmHvKMsw=="],
"remark-stringify": ["remark-stringify@11.0.0", "", { "dependencies": { "@types/mdast": "^4.0.0", "mdast-util-to-markdown": "^2.0.0", "unified": "^11.0.0" } }, "sha512-1OSmLd3awB/t8qdoEOMazZkNsfVTeY4fTsgzcQFdXNq8ToTN4ZGwrMnlda4K6smTFKD+GRV6O48i6Z4iKgPPpw=="],
"resolve-from": ["resolve-from@4.0.0", "", {}, "sha512-pb/MYmXstAkysRFx8piNI1tGFNQIFA3vkE3Gq4EuA1dF6gHp/+vgZqsCGJapvy8N3Q+4o7FwvquPJcnZ7RYy4g=="],
"rollup": ["rollup@4.54.0", "", { "dependencies": { "@types/estree": "1.0.8" }, "optionalDependencies": { "@rollup/rollup-android-arm-eabi": "4.54.0", "@rollup/rollup-android-arm64": "4.54.0", "@rollup/rollup-darwin-arm64": "4.54.0", "@rollup/rollup-darwin-x64": "4.54.0", "@rollup/rollup-freebsd-arm64": "4.54.0", "@rollup/rollup-freebsd-x64": "4.54.0", "@rollup/rollup-linux-arm-gnueabihf": "4.54.0", "@rollup/rollup-linux-arm-musleabihf": "4.54.0", "@rollup/rollup-linux-arm64-gnu": "4.54.0", "@rollup/rollup-linux-arm64-musl": "4.54.0", "@rollup/rollup-linux-loong64-gnu": "4.54.0", "@rollup/rollup-linux-ppc64-gnu": "4.54.0", "@rollup/rollup-linux-riscv64-gnu": "4.54.0", "@rollup/rollup-linux-riscv64-musl": "4.54.0", "@rollup/rollup-linux-s390x-gnu": "4.54.0", "@rollup/rollup-linux-x64-gnu": "4.54.0", "@rollup/rollup-linux-x64-musl": "4.54.0", "@rollup/rollup-openharmony-arm64": "4.54.0", "@rollup/rollup-win32-arm64-msvc": "4.54.0", "@rollup/rollup-win32-ia32-msvc": "4.54.0", "@rollup/rollup-win32-x64-gnu": "4.54.0", "@rollup/rollup-win32-x64-msvc": "4.54.0", "fsevents": "~2.3.2" }, "bin": { "rollup": "dist/bin/rollup" } }, "sha512-3nk8Y3a9Ea8szgKhinMlGMhGMw89mqule3KWczxhIzqudyHdCIOHw8WJlj/r329fACjKLEh13ZSk7oE22kyeIw=="],
@@ -849,8 +737,6 @@
"unified": ["unified@11.0.5", "", { "dependencies": { "@types/unist": "^3.0.0", "bail": "^2.0.0", "devlop": "^1.0.0", "extend": "^3.0.0", "is-plain-obj": "^4.0.0", "trough": "^2.0.0", "vfile": "^6.0.0" } }, "sha512-xKvGhPWw3k84Qjh8bI3ZeJjqnyadK+GEFtazSfZv/rKeTkTjOJho6mFqh2SM96iIcZokxiOpg78GazTSg8+KHA=="],
"unist-util-filter": ["unist-util-filter@5.0.1", "", { "dependencies": { "@types/unist": "^3.0.0", "unist-util-is": "^6.0.0", "unist-util-visit-parents": "^6.0.0" } }, "sha512-pHx7D4Zt6+TsfwylH9+lYhBhzyhEnCXs/lbq/Hstxno5z4gVdyc2WEW0asfjGKPyG4pEKrnBv5hdkO6+aRnQJw=="],
"unist-util-is": ["unist-util-is@6.0.1", "", { "dependencies": { "@types/unist": "^3.0.0" } }, "sha512-LsiILbtBETkDz8I9p1dQ0uyRUWuaQzd/cuEeS1hoRSyW5E5XGmTzlwY1OrNzzakGowI9Dr/I8HVaw4hTtnxy8g=="],
"unist-util-position": ["unist-util-position@5.0.0", "", { "dependencies": { "@types/unist": "^3.0.0" } }, "sha512-fucsC7HjXvkB5R3kTCO7kUjRdrS0BJt3M/FPxmHMBOm8JQi2BsHAHFsy27E0EolP8rp0NzXsJ+jNPyDWvOJZPA=="],
@@ -867,14 +753,10 @@
"vfile": ["vfile@6.0.3", "", { "dependencies": { "@types/unist": "^3.0.0", "vfile-message": "^4.0.0" } }, "sha512-KzIbH/9tXat2u30jf+smMwFCsno4wHVdNmzFyL+T/L3UGqqk6JKfVqOFOZEpZSHADH1k40ab6NUIXZq422ov3Q=="],
"vfile-location": ["vfile-location@5.0.3", "", { "dependencies": { "@types/unist": "^3.0.0", "vfile": "^6.0.0" } }, "sha512-5yXvWDEgqeiYiBe1lbxYF7UMAIm/IcopxMHrMQDq3nvKcjPKIhZklUKL+AE7J7uApI4kwe2snsK+eI6UTj9EHg=="],
"vfile-message": ["vfile-message@4.0.3", "", { "dependencies": { "@types/unist": "^3.0.0", "unist-util-stringify-position": "^4.0.0" } }, "sha512-QTHzsGd1EhbZs4AsQ20JX1rC3cOlt/IWJruk893DfLRr57lcnOeMaWG4K0JrRta4mIJZKth2Au3mM3u03/JWKw=="],
"vite": ["vite@7.3.0", "", { "dependencies": { "esbuild": "^0.27.0", "fdir": "^6.5.0", "picomatch": "^4.0.3", "postcss": "^8.5.6", "rollup": "^4.43.0", "tinyglobby": "^0.2.15" }, "optionalDependencies": { "fsevents": "~2.3.3" }, "peerDependencies": { "@types/node": "^20.19.0 || >=22.12.0", "jiti": ">=1.21.0", "less": "^4.0.0", "lightningcss": "^1.21.0", "sass": "^1.70.0", "sass-embedded": "^1.70.0", "stylus": ">=0.54.8", "sugarss": "^5.0.0", "terser": "^5.16.0", "tsx": "^4.8.1", "yaml": "^2.4.2" }, "optionalPeers": ["@types/node", "jiti", "less", "lightningcss", "sass", "sass-embedded", "stylus", "sugarss", "terser", "tsx", "yaml"], "bin": { "vite": "bin/vite.js" } }, "sha512-dZwN5L1VlUBewiP6H9s2+B3e3Jg96D0vzN+Ry73sOefebhYr9f94wwkMNN/9ouoU8pV1BqA1d1zGk8928cx0rg=="],
"web-namespaces": ["web-namespaces@2.0.1", "", {}, "sha512-bKr1DkiNa2krS7qxNtdrtHAmzuYGFQLiQ13TsorsdT6ULTkPLKuu5+GsFpDlg6JFjUTwX2DyhMPG2be8uPrqsQ=="],
"which": ["which@2.0.2", "", { "dependencies": { "isexe": "^2.0.0" }, "bin": { "node-which": "./bin/node-which" } }, "sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA=="],
"word-wrap": ["word-wrap@1.2.5", "", {}, "sha512-BN22B5eaMMI9UMtjrGd5g5eCYPpCPDUy0FJXbYsaT5zYxjFOckS53SQDE3pWkVoWpHXVb3BrYcEN4Twa55B5cA=="],
@@ -899,32 +781,8 @@
"@typescript-eslint/typescript-estree/semver": ["semver@7.7.3", "", { "bin": { "semver": "bin/semver.js" } }, "sha512-SdsKMrI9TdgjdweUSR9MweHA4EJ8YxHn8DFaDisvhVlUOe4BF1tLD7GAj0lIqWVl+dPb/rExr0Btby5loQm20Q=="],
"@uiw/react-markdown-preview/react-markdown": ["react-markdown@9.0.3", "", { "dependencies": { "@types/hast": "^3.0.0", "devlop": "^1.0.0", "hast-util-to-jsx-runtime": "^2.0.0", "html-url-attributes": "^3.0.0", "mdast-util-to-hast": "^13.0.0", "remark-parse": "^11.0.0", "remark-rehype": "^11.0.0", "unified": "^11.0.0", "unist-util-visit": "^5.0.0", "vfile": "^6.0.0" }, "peerDependencies": { "@types/react": ">=18", "react": ">=18" } }, "sha512-Yk7Z94dbgYTOrdk41Z74GoKA7rThnsbbqBTRYuxoe08qvfQ9tJVhmAKw6BJS/ZORG7kTy/s1QvYzSuaoBA1qfw=="],
"@uiw/react-markdown-preview/rehype-prism-plus": ["rehype-prism-plus@2.0.0", "", { "dependencies": { "hast-util-to-string": "^3.0.0", "parse-numeric-range": "^1.3.0", "refractor": "^4.8.0", "rehype-parse": "^9.0.0", "unist-util-filter": "^5.0.0", "unist-util-visit": "^5.0.0" } }, "sha512-FeM/9V2N7EvDZVdR2dqhAzlw5YI49m9Tgn7ZrYJeYHIahM6gcXpH0K1y2gNnKanZCydOMluJvX2cB9z3lhY8XQ=="],
"hast-util-from-parse5/hastscript": ["hastscript@9.0.1", "", { "dependencies": { "@types/hast": "^3.0.0", "comma-separated-tokens": "^2.0.0", "hast-util-parse-selector": "^4.0.0", "property-information": "^7.0.0", "space-separated-tokens": "^2.0.0" } }, "sha512-g7df9rMFX/SPi34tyGCyUBREQoKkapwdY/T04Qn9TDWfHhAYt4/I0gMVirzK5wEzeUqIjEB+LXC/ypb7Aqno5w=="],
"hast-util-parse-selector/@types/hast": ["@types/hast@2.3.10", "", { "dependencies": { "@types/unist": "^2" } }, "sha512-McWspRw8xx8J9HurkVBfYj0xKoE25tOFlHGdx4MJ5xORQrMGZNqJhVQWaIbm6Oyla5kYOXtDiopzKRJzEOkwJw=="],
"hastscript/@types/hast": ["@types/hast@2.3.10", "", { "dependencies": { "@types/unist": "^2" } }, "sha512-McWspRw8xx8J9HurkVBfYj0xKoE25tOFlHGdx4MJ5xORQrMGZNqJhVQWaIbm6Oyla5kYOXtDiopzKRJzEOkwJw=="],
"hastscript/property-information": ["property-information@6.5.0", "", {}, "sha512-PgTgs/BlvHxOu8QuEN7wi5A0OmXaBcHpmCSTehcs6Uuu9IkDIEo13Hy7n898RHfrQ49vKCoGeWZSaAK01nwVig=="],
"mdast-util-find-and-replace/escape-string-regexp": ["escape-string-regexp@5.0.0", "", {}, "sha512-/veY75JbMK4j1yjvuUxuVsiS/hr/4iHs9FTT6cgTexxdE0Ly/glccBAkloH/DofkjRbZU3bnoj38mOmhkZ0lHw=="],
"parse-entities/@types/unist": ["@types/unist@2.0.11", "", {}, "sha512-CmBKiL6NNo/OqgmMn95Fk9Whlp2mtvIv+KNpQKN2F4SjvrEesubTRWGYSg+BnWZOnlCaSTU1sMpsBOzgbYhnsA=="],
"refractor/@types/hast": ["@types/hast@2.3.10", "", { "dependencies": { "@types/unist": "^2" } }, "sha512-McWspRw8xx8J9HurkVBfYj0xKoE25tOFlHGdx4MJ5xORQrMGZNqJhVQWaIbm6Oyla5kYOXtDiopzKRJzEOkwJw=="],
"@typescript-eslint/typescript-estree/minimatch/brace-expansion": ["brace-expansion@2.0.2", "", { "dependencies": { "balanced-match": "^1.0.0" } }, "sha512-Jt0vHyM+jmUBqojB7E1NIYadt0vI0Qxjxd2TErW94wDz+E2LAm5vKMXXwg6ZZBTHPuUlDgQHKXvjGBdfcF1ZDQ=="],
"hast-util-from-parse5/hastscript/hast-util-parse-selector": ["hast-util-parse-selector@4.0.0", "", { "dependencies": { "@types/hast": "^3.0.0" } }, "sha512-wkQCkSYoOGCRKERFWcxMVMOcYE2K1AaNLU8DXS9arxnLOUEWbOXKXiJUNzEpqZ3JOKpnha3jkFrumEjVliDe7A=="],
"hast-util-parse-selector/@types/hast/@types/unist": ["@types/unist@2.0.11", "", {}, "sha512-CmBKiL6NNo/OqgmMn95Fk9Whlp2mtvIv+KNpQKN2F4SjvrEesubTRWGYSg+BnWZOnlCaSTU1sMpsBOzgbYhnsA=="],
"hastscript/@types/hast/@types/unist": ["@types/unist@2.0.11", "", {}, "sha512-CmBKiL6NNo/OqgmMn95Fk9Whlp2mtvIv+KNpQKN2F4SjvrEesubTRWGYSg+BnWZOnlCaSTU1sMpsBOzgbYhnsA=="],
"refractor/@types/hast/@types/unist": ["@types/unist@2.0.11", "", {}, "sha512-CmBKiL6NNo/OqgmMn95Fk9Whlp2mtvIv+KNpQKN2F4SjvrEesubTRWGYSg+BnWZOnlCaSTU1sMpsBOzgbYhnsA=="],
}
}
@@ -12,11 +12,15 @@ export default tseslint.config([
    extends: [
      js.configs.recommended,
      tseslint.configs.recommended,
      reactHooks.configs['recommended-latest'],
      reactRefresh.configs.vite,
    ],
    plugins: {
      'react-hooks': reactHooks,
      'react-refresh': reactRefresh,
    },
    rules: {
      '@typescript-eslint/no-explicit-any': 'off',
      'react-hooks/rules-of-hooks': 'error',
      'react-hooks/exhaustive-deps': 'warn',
      'react-refresh/only-export-components': [
        'error',
        {
@@ -12,11 +12,9 @@
  "dependencies": {
    "@ant-design/icons": "6",
    "@monaco-editor/react": "^4.7.0",
    "@uiw/react-md-editor": "^4.0.11",
    "antd": "6",
    "artplayer": "^5.3.0",
    "date-fns": "^4.1.0",
    "monaco-editor": "^0.55.1",
    "react": "^19.2.3",
    "react-dom": "^19.2.3",
    "react-markdown": "^10.1.0",
web/plugin-frame.html (new file, 28 lines)
@@ -0,0 +1,28 @@
<!doctype html>
<html lang="zh-CN">

<head>
  <meta charset="UTF-8" />
  <meta name="viewport" content="width=device-width, initial-scale=1.0" />
  <title>Foxel Plugin Frame</title>
  <link rel='stylesheet' href='https://foxel.cc/fonts/result.css' />
  <style>
    html,
    body,
    #root {
      height: 100%;
      margin: 0;
    }

    * {
      font-family: 'Maple Mono Normal NL NF CN';
    }
  </style>
</head>

<body>
  <div id="root"></div>
  <script type="module" src="/src/plugin-frame.ts"></script>
</body>

</html>
web/public/icon/anthropic.svg (new file, 1 line)
@@ -0,0 +1 @@
<svg fill="currentColor" fill-rule="evenodd" height="1em" style="flex:none;line-height:1" viewBox="0 0 24 24" width="1em" xmlns="http://www.w3.org/2000/svg"><title>Anthropic</title><path d="M13.827 3.52h3.603L24 20h-3.603l-6.57-16.48zm-7.258 0h3.767L16.906 20h-3.674l-1.343-3.461H5.017l-1.344 3.46H0L6.57 3.522zm4.132 9.959L8.453 7.687 6.205 13.48H10.7z"></path></svg>
web/public/icon/azure-color.svg (new file, 1 line)
@@ -0,0 +1 @@
<svg height="1em" style="flex:none;line-height:1" viewBox="0 0 24 24" width="1em" xmlns="http://www.w3.org/2000/svg"><title>Azure</title><path d="M7.242 1.613A1.11 1.11 0 018.295.857h6.977L8.03 22.316a1.11 1.11 0 01-1.052.755h-5.43a1.11 1.11 0 01-1.053-1.466L7.242 1.613z" fill="url(#lobe-icons-azure-fill-0)"></path><path d="M18.397 15.296H7.4a.51.51 0 00-.347.882l7.066 6.595c.206.192.477.298.758.298h6.226l-2.706-7.775z" fill="#0078D4"></path><path d="M15.272.857H7.497L0 23.071h7.775l1.596-4.73 5.068 4.73h6.665l-2.707-7.775h-7.998L15.272.857z" fill="url(#lobe-icons-azure-fill-1)"></path><path d="M17.193 1.613a1.11 1.11 0 00-1.052-.756h-7.81.035c.477 0 .9.304 1.052.756l6.748 19.992a1.11 1.11 0 01-1.052 1.466h-.12 7.895a1.11 1.11 0 001.052-1.466L17.193 1.613z" fill="url(#lobe-icons-azure-fill-2)"></path><defs><linearGradient gradientUnits="userSpaceOnUse" id="lobe-icons-azure-fill-0" x1="8.247" x2="1.002" y1="1.626" y2="23.03"><stop stop-color="#114A8B"></stop><stop offset="1" stop-color="#0669BC"></stop></linearGradient><linearGradient gradientUnits="userSpaceOnUse" id="lobe-icons-azure-fill-1" x1="14.042" x2="12.324" y1="15.302" y2="15.888"><stop stop-opacity=".3"></stop><stop offset=".071" stop-opacity=".2"></stop><stop offset=".321" stop-opacity=".1"></stop><stop offset=".623" stop-opacity=".05"></stop><stop offset="1" stop-opacity="0"></stop></linearGradient><linearGradient gradientUnits="userSpaceOnUse" id="lobe-icons-azure-fill-2" x1="12.841" x2="20.793" y1="1.626" y2="22.814"><stop stop-color="#3CCBF4"></stop><stop offset="1" stop-color="#2892DF"></stop></linearGradient></defs></svg>
web/public/icon/ollama.svg (new file, 1 line)
@@ -0,0 +1 @@
<svg fill="currentColor" fill-rule="evenodd" height="1em" style="flex:none;line-height:1" viewBox="0 0 24 24" width="1em" xmlns="http://www.w3.org/2000/svg"><title>Ollama</title><path d="M7.905 1.09c.216.085.411.225.588.41.295.306.544.744.734 1.263.191.522.315 1.1.362 1.68a5.054 5.054 0 012.049-.636l.051-.004c.87-.07 1.73.087 2.48.474.101.053.2.11.297.17.05-.569.172-1.134.36-1.644.19-.52.439-.957.733-1.264a1.67 1.67 0 01.589-.41c.257-.1.53-.118.796-.042.401.114.745.368 1.016.737.248.337.434.769.561 1.287.23.934.27 2.163.115 3.645l.053.04.026.019c.757.576 1.284 1.397 1.563 2.35.435 1.487.216 3.155-.534 4.088l-.018.021.002.003c.417.762.67 1.567.724 2.4l.002.03c.064 1.065-.2 2.137-.814 3.19l-.007.01.01.024c.472 1.157.62 2.322.438 3.486l-.006.039a.651.651 0 01-.747.536.648.648 0 01-.54-.742c.167-1.033.01-2.069-.48-3.123a.643.643 0 01.04-.617l.004-.006c.604-.924.854-1.83.8-2.72-.046-.779-.325-1.544-.8-2.273a.644.644 0 01.18-.886l.009-.006c.243-.159.467-.565.58-1.12a4.229 4.229 0 00-.095-1.974c-.205-.7-.58-1.284-1.105-1.683-.595-.454-1.383-.673-2.38-.61a.653.653 0 01-.632-.371c-.314-.665-.772-1.141-1.343-1.436a3.288 3.288 0 00-1.772-.332c-1.245.099-2.343.801-2.67 1.686a.652.652 0 01-.61.425c-1.067.002-1.893.252-2.497.703-.522.39-.878.935-1.066 1.588a4.07 4.07 0 00-.068 1.886c.112.558.331 1.02.582 1.269l.008.007c.212.207.257.53.109.785-.36.622-.629 1.549-.673 2.44-.05 1.018.186 1.902.719 2.536l.016.019a.643.643 0 01.095.69c-.576 1.236-.753 2.252-.562 3.052a.652.652 0 01-1.269.298c-.243-1.018-.078-2.184.473-3.498l.014-.035-.008-.012a4.339 4.339 0 01-.598-1.309l-.005-.019a5.764 5.764 0 01-.177-1.785c.044-.91.278-1.842.622-2.59l.012-.026-.002-.002c-.293-.418-.51-.953-.63-1.545l-.005-.024a5.352 5.352 0 01.093-2.49c.262-.915.777-1.701 1.536-2.269.06-.045.123-.09.186-.132-.159-1.493-.119-2.73.112-3.67.127-.518.314-.95.562-1.287.27-.368.614-.622 1.015-.737.266-.076.54-.059.797.042zm4.116 9.09c.936 0 1.8.313 2.446.855.63.527 1.005 1.235 1.005 1.94 0 .888-.406 1.58-1.133 2.022-.62.375-1.451.557-2.403.557-1.009 0-1.871-.259-2.493-.734-.617-.47-.963-1.13-.963-1.845 0-.707.398-1.417 1.056-1.946.668-.537 1.55-.849 2.485-.849zm0 .896a3.07 3.07 0 00-1.916.65c-.461.37-.722.835-.722 1.25 0 .428.21.829.61 1.134.455.347 1.124.548 1.943.548.799 0 1.473-.147 1.932-.426.463-.28.7-.686.7-1.257 0-.423-.246-.89-.683-1.256-.484-.405-1.14-.643-1.864-.643zm.662 1.21l.004.004c.12.151.095.37-.056.49l-.292.23v.446a.375.375 0 01-.376.373.375.375 0 01-.376-.373v-.46l-.271-.218a.347.347 0 01-.052-.49.353.353 0 01.494-.051l.215.172.22-.174a.353.353 0 01.49.051zm-5.04-1.919c.478 0 .867.39.867.871a.87.87 0 01-.868.871.87.87 0 01-.867-.87.87.87 0 01.867-.872zm8.706 0c.48 0 .868.39.868.871a.87.87 0 01-.868.871.87.87 0 01-.867-.87.87.87 0 01.867-.872zM7.44 2.3l-.003.002a.659.659 0 00-.285.238l-.005.006c-.138.189-.258.467-.348.832-.17.692-.216 1.631-.124 2.782.43-.128.899-.208 1.404-.237l.01-.001.019-.034c.046-.082.095-.161.148-.239.123-.771.022-1.692-.253-2.444-.134-.364-.297-.65-.453-.813a.628.628 0 00-.107-.09L7.44 2.3zm9.174.04l-.002.001a.628.628 0 00-.107.09c-.156.163-.32.45-.453.814-.29.794-.387 1.776-.23 2.572l.058.097.008.014h.03a5.184 5.184 0 011.466.212c.086-1.124.038-2.043-.128-2.722-.09-.365-.21-.643-.349-.832l-.004-.006a.659.659 0 00-.285-.239h-.004z"></path></svg>
web/public/icon/zai.svg (new file, 1 line)
@@ -0,0 +1 @@
<svg fill="currentColor" fill-rule="evenodd" height="1em" style="flex:none;line-height:1" viewBox="0 0 24 24" width="1em" xmlns="http://www.w3.org/2000/svg"><title>Z.ai</title><path d="M12.105 2L9.927 4.953H.653L2.83 2h9.276zM23.254 19.048L21.078 22h-9.242l2.174-2.952h9.244zM24 2L9.264 22H0L14.736 2H24z"></path></svg>
web/src/api/agent.ts (new file, 121 lines)
@@ -0,0 +1,121 @@
import request, { API_BASE_URL } from './client';

export type AgentChatMessage = Record<string, any>;

export interface AgentChatContext {
  current_path?: string | null;
}

export interface AgentChatRequest {
  messages: AgentChatMessage[];
  auto_execute?: boolean;
  approved_tool_call_ids?: string[];
  rejected_tool_call_ids?: string[];
  context?: AgentChatContext;
}

export interface PendingToolCall {
  id: string;
  name: string;
  arguments: Record<string, any>;
  requires_confirmation: boolean;
}

export interface AgentChatResponse {
  messages: AgentChatMessage[];
  pending_tool_calls?: PendingToolCall[];
}

export type AgentSseEvent =
  | { event: 'assistant_start'; data: { id: string } }
  | { event: 'assistant_delta'; data: { id: string; delta: string } }
  | { event: 'assistant_end'; data: { id: string; message: AgentChatMessage } }
  | { event: 'tool_start'; data: { tool_call_id: string; name: string } }
  | { event: 'tool_end'; data: { tool_call_id: string; name: string; message: AgentChatMessage } }
  | { event: 'pending'; data: { pending_tool_calls: PendingToolCall[] } }
  | { event: 'done'; data: AgentChatResponse };

export const agentApi = {
  chat: (payload: AgentChatRequest) =>
    request<AgentChatResponse>('/agent/chat', {
      method: 'POST',
      json: payload,
    }),
  chatStream: async (
    payload: AgentChatRequest,
    onEvent: (evt: AgentSseEvent) => void,
    options?: { signal?: AbortSignal }
  ) => {
    const headers: Record<string, string> = {
      'Content-Type': 'application/json',
      'Accept': 'text/event-stream',
    };
    const token = localStorage.getItem('token');
    if (token) headers['Authorization'] = `Bearer ${token}`;

    const resp = await fetch(`${API_BASE_URL}/agent/chat/stream`, {
      method: 'POST',
      headers,
      body: JSON.stringify(payload),
      signal: options?.signal,
    });

    if (!resp.ok) {
      let errMsg = resp.statusText;
      try {
        const data = await resp.json();
        if (Array.isArray((data as any)?.detail)) {
          errMsg = (data as any).detail.map((e: any) => e.msg || JSON.stringify(e)).join('; ');
        } else {
          errMsg = (typeof (data as any)?.detail === 'string') ? (data as any).detail : JSON.stringify(data);
        }
      } catch {
        try {
          errMsg = await resp.text();
        } catch { void 0; }
      }
      throw new Error(errMsg || `Request failed: ${resp.status}`);
    }

    const reader = resp.body?.getReader();
    if (!reader) throw new Error('Stream not supported');

    const decoder = new TextDecoder();
    let buffer = '';

    const flush = (raw: string) => {
      const lines = raw.split(/\r?\n/);
      let eventName = 'message';
      const dataLines: string[] = [];
      for (const line of lines) {
        if (line.startsWith('event:')) {
          eventName = line.slice(6).trim();
        } else if (line.startsWith('data:')) {
          dataLines.push(line.slice(5).trimStart());
        }
      }
      const dataStr = dataLines.join('\n').trim();
      if (!eventName || !dataStr) return;
      try {
        const data = JSON.parse(dataStr);
        onEvent({ event: eventName as any, data } as any);
      } catch {
        // ignore parse error
      }
    };

    while (true) {
      const { value, done } = await reader.read();
      if (done) break;
      buffer += decoder.decode(value, { stream: true });
      while (true) {
        const idx = buffer.indexOf('\n\n');
        if (idx === -1) break;
        const chunk = buffer.slice(0, idx);
        buffer = buffer.slice(idx + 2);
        if (chunk.trim()) flush(chunk);
      }
    }
    if (buffer.trim()) flush(buffer);
  },
};
@@ -6,15 +6,16 @@ export interface AIProviderPayload {
  name: string;
  identifier: string;
  provider_type?: string | null;
  api_format: 'openai' | 'gemini';
  api_format: 'openai' | 'gemini' | 'anthropic' | 'ollama';
  base_url?: string | null;
  api_key?: string | null;
  logo_url?: string | null;
  extra_config?: Record<string, unknown> | null;
}

export interface AIProvider extends Omit<AIProviderPayload, 'extra_config'> {
export interface AIProvider extends Omit<AIProviderPayload, 'extra_config' | 'api_key'> {
  id: number;
  has_api_key: boolean;
  extra_config: Record<string, unknown>;
  created_at: string;
  updated_at: string;
@@ -28,7 +28,67 @@ export interface RepoQueryParams {
  pageSize?: number;
}

// Data structures from the foxel-core app center
export interface FoxelCoreApp {
  key: string;
  version: string;
  name: {
    zh: string;
    en: string;
  };
  description: {
    zh: string;
    en: string;
  };
  author: string;
  website: string;
  tags: {
    zh: string[];
    en: string[];
  };
  approvedAt: number;
  detailUrl: string;
  downloadUrl: string;
}

export interface FoxelCoreAppsResponse {
  apps: FoxelCoreApp[];
}

export interface FoxelCoreAppVersion {
  version: string;
  name: {
    zh: string;
    en: string;
  };
  description: {
    zh: string;
    en: string;
  };
  author: string;
  website: string;
  tags: {
    zh: string[];
    en: string[];
  };
  approvedAt: number;
  releaseNotesMd: string | null;
}

export interface FoxelCoreAppDetail {
  key: string;
  latest: FoxelCoreAppVersion & {
    downloadUrl: string;
  };
  versions: FoxelCoreAppVersion[];
}

export interface FoxelCoreAppDetailResponse {
  app: FoxelCoreAppDetail;
}

const CENTER_BASE = 'https://center.foxel.cc';
const FOXEL_CORE_BASE = 'https://foxel.cc';

export function buildCenterUrl(path: string) {
  return new URL(path, CENTER_BASE).href;
@@ -50,3 +110,46 @@ export async function fetchRepoList(params: RepoQueryParams = {}): Promise<RepoL
  return await resp.json();
}

/**
 * Fetch the app list from the foxel-core app center
 */
export async function fetchFoxelCoreApps(query?: string): Promise<FoxelCoreApp[]> {
  const url = new URL('/api/apps', FOXEL_CORE_BASE);
  const q = query?.trim();
  if (q) {
    url.searchParams.set('q', q);
  }
  const resp = await fetch(url.href);
  if (!resp.ok) {
    throw new Error(`Failed to fetch apps: ${resp.status}`);
  }
  const data: FoxelCoreAppsResponse = await resp.json();
  return data.apps;
}

/**
 * Fetch app details (including historical versions) from the foxel-core app center
 */
export async function fetchFoxelCoreAppDetail(appKey: string): Promise<FoxelCoreAppDetail> {
  const url = `${FOXEL_CORE_BASE}/api/apps/${encodeURIComponent(appKey)}`;
  const resp = await fetch(url);
  if (!resp.ok) {
    throw new Error(`Failed to fetch app detail: ${resp.status}`);
  }
  const data: FoxelCoreAppDetailResponse = await resp.json();
  return data.app;
}

/**
 * Download an app package file from foxel-core
 */
export async function downloadFoxelCoreApp(app: Pick<FoxelCoreApp, 'key' | 'version' | 'downloadUrl'>): Promise<File> {
  const url = `${FOXEL_CORE_BASE}${app.downloadUrl}`;
  const resp = await fetch(url);
  if (!resp.ok) {
    throw new Error(`Failed to download app: ${resp.status}`);
  }
  const blob = await resp.blob();
  const filename = `${app.key}-${app.version}.foxpkg`;
  return new File([blob], filename, { type: 'application/octet-stream' });
}
@@ -2,46 +2,67 @@ import request from './client';

export interface PluginItem {
  id: number;
  url: string;
  enabled: boolean;
  key: string;
  open_app?: boolean | null;
  key?: string | null;
  name?: string | null;
  version?: string | null;
  supported_exts?: string[] | null;
  default_bounds?: Record<string, any> | null;
  default_bounds?: Record<string, number> | null;
  default_maximized?: boolean | null;
  icon?: string | null;
  description?: string | null;
  author?: string | null;
  website?: string | null;
  github?: string | null;
  license?: string | null;
  manifest?: Record<string, unknown> | null;
  loaded_routes?: string[] | null;
  loaded_processors?: string[] | null;
}

export interface PluginCreate {
  url: string;
  enabled?: boolean;
}

export interface PluginManifestUpdate {
  key?: string;
  name?: string;
  version?: string;
  open_app?: boolean;
  supported_exts?: string[];
  default_bounds?: Record<string, any>;
  default_maximized?: boolean;
  icon?: string;
  description?: string;
  author?: string;
  website?: string;
  github?: string;
export interface PluginInstallResult {
  success: boolean;
  plugin?: PluginItem;
  message?: string;
  errors?: string[];
}

export const pluginsApi = {
  /**
   * Fetch the list of installed plugins
   */
  list: () => request<PluginItem[]>(`/plugins`),
  create: (payload: PluginCreate) => request<PluginItem>(`/plugins`, { method: 'POST', json: payload }),
  remove: (id: number) => request(`/plugins/${id}`, { method: 'DELETE' }),
  update: (id: number, payload: PluginCreate) => request<PluginItem>(`/plugins/${id}`, { method: 'PUT', json: payload }),
  updateManifest: (id: number, payload: PluginManifestUpdate) => request<PluginItem>(`/plugins/${id}/metadata`, { method: 'POST', json: payload }),

  /**
   * Fetch details of a single plugin
   */
  get: (key: string) => request<PluginItem>(`/plugins/${key}`),

  /**
   * Install a plugin (upload a .foxpkg)
   */
  install: async (file: File): Promise<PluginInstallResult> => {
    const formData = new FormData();
    formData.append('file', file);
    return request<PluginInstallResult>(`/plugins/install`, {
      method: 'POST',
      formData,
    });
  },

  /**
   * Remove / uninstall a plugin
   */
  remove: (key: string) => request(`/plugins/${key}`, { method: 'DELETE' }),

  /**
   * Get the plugin bundle URL
   */
  getBundleUrl: (key: string) => `/api/plugins/${key}/bundle.js`,

  /**
   * Get a plugin asset URL
   */
  getAssetUrl: (key: string, assetPath: string) =>
    `/api/plugins/${key}/assets/${assetPath}`,
};
@@ -1,35 +0,0 @@
import request from './client';

export type VideoLibraryMediaType = 'tv' | 'movie';

export interface VideoLibraryItem {
  id: string;
  type: VideoLibraryMediaType;
  title: string;
  year?: string | null;
  overview?: string | null;
  poster_path?: string | null;
  backdrop_path?: string | null;
  genres?: string[];
  tmdb_id?: number | null;
  source_path?: string | null;
  scraped_at?: string | null;
  updated_at?: string | null;
  episodes_count?: number;
  seasons_count?: number;
  vote_average?: number | null;
  vote_count?: number | null;
}

export const videoLibraryApi = {
  list: (params?: { q?: string; type?: VideoLibraryMediaType }) => {
    const search = new URLSearchParams();
    if (params?.q) search.set('q', params.q);
    if (params?.type) search.set('type', params.type);
    const suffix = search.toString();
    return request<VideoLibraryItem[]>(`/plugins/video-player/library${suffix ? `?${suffix}` : ''}`, { method: 'GET' });
  },
  get: (id: string) =>
    request<any>(`/plugins/video-player/library/${encodeURIComponent(id)}`, { method: 'GET' }),
};
@@ -212,6 +212,7 @@ export const AppWindowsLayer: React.FC<AppWindowsLayerProps> = ({ windows, onClo
            <div
              key={w.id}
              ref={el => { windowEls.current[w.id] = el; }}
              onMouseDown={() => onBringToFront(w.id)}
              style={{
                position: 'fixed',
                top: w.maximized ? 0 : w.y,
@@ -1,654 +0,0 @@
import React, { useCallback, useEffect, useMemo, useRef, useState } from 'react';
import {
  FileOutlined,
  DatabaseOutlined,
  ExpandOutlined,
  BgColorsOutlined,
  ClockCircleOutlined,
  FolderOutlined,
  AimOutlined,
  BulbOutlined,
  ThunderboltOutlined,
  AlertOutlined,
  CameraOutlined,
  ApiOutlined,
  FieldTimeOutlined,
} from '@ant-design/icons';
import { API_BASE_URL, vfsApi, type VfsEntry } from '../../api/client';
import type { AppComponentProps } from '../types';
import { ImageCanvas } from './components/ImageCanvas';
import { ViewerControls } from './components/ViewerControls';
import { Filmstrip } from './components/Filmstrip';
import { InfoPanel } from './components/InfoPanel';
import type { HistogramData, RgbColor, InfoItem } from './components/types';
import { viewerStyles } from './styles';

interface ExplorerSnapshot {
  path: string;
  entries: VfsEntry[];
  pagination?: { page: number; page_size: number; total: number };
  sortBy?: string;
  sortOrder?: string;
  timestamp: number;
}

interface FileStat {
  name?: string;
  is_dir?: boolean;
  size?: number;
  mtime?: number;
  mode?: number;
  path?: string;
  type?: string;
  exif?: Record<string, unknown>;
}

declare global {
  interface WindowEventMap {
    'foxel:file-explorer-page': CustomEvent<ExplorerSnapshot>;
  }
}
type ExplorerAwareWindow = Window & { __FOXEL_LAST_EXPLORER_PAGE__?: ExplorerSnapshot };

const DEFAULT_TONE: RgbColor = { r: 28, g: 32, b: 46 };

const isImageEntry = (ent: VfsEntry) => {
  if (ent.is_dir) return false;
  const maybe = ent as VfsEntry & { has_thumbnail?: boolean };
  if (typeof maybe.has_thumbnail === 'boolean' && maybe.has_thumbnail) return true;
  const ext = ent.name.split('.').pop()?.toLowerCase();
  if (!ext) return false;
  return ['png', 'jpg', 'jpeg', 'gif', 'webp', 'bmp', 'avif', 'ico', 'tif', 'tiff', 'svg', 'heic', 'heif', 'arw', 'cr2', 'cr3', 'nef', 'rw2', 'orf', 'pef', 'dng'].includes(ext);
};

const buildThumbUrl = (fullPath: string, w = 180, h = 120) => {
  const base = API_BASE_URL.replace(/\/+$/, '');
  const clean = fullPath.replace(/^\/+/, '');
  return `${base}/fs/thumb/${encodeURI(clean)}?w=${w}&h=${h}&fit=cover`;
};

const getDirectory = (fullPath: string) => {
  const path = fullPath.startsWith('/') ? fullPath : `/${fullPath}`;
  const idx = path.lastIndexOf('/');
  if (idx <= 0) return '/';
  return path.slice(0, idx) || '/';
};

const joinPath = (dir: string, name: string) => {
  if (dir === '/' || dir === '') return `/${name}`;
  return `${dir.replace(/\/$/, '')}/${name}`;
};

const clamp = (value: number, min: number, max: number) => Math.max(min, Math.min(max, value));

const parseNumberish = (raw: unknown): number | null => {
  if (typeof raw === 'number') return raw;
  if (typeof raw !== 'string') return null;
  if (raw.includes('/')) {
    const [a, b] = raw.split('/').map(v => Number(v));
    if (!Number.isNaN(a) && !Number.isNaN(b) && b !== 0) return a / b;
  }
  const val = Number(raw);
  return Number.isNaN(val) ? null : val;
};

const humanFileSize = (size: number | undefined) => {
  if (typeof size !== 'number') return '-';
  const units = ['B', 'KB', 'MB', 'GB', 'TB'];
  let value = size;
  let index = 0;
  while (value >= 1024 && index < units.length - 1) {
    value /= 1024;
    index += 1;
  }
  return `${value.toFixed(index === 0 ? 0 : 1)} ${units[index]}`;
};

const readExplorerSnapshot = (dir: string): ExplorerSnapshot | null => {
  if (typeof window === 'undefined') return null;
  const snap = (window as ExplorerAwareWindow).__FOXEL_LAST_EXPLORER_PAGE__;
  if (!snap) return null;
  const snapshotPath = snap.path === '' ? '/' : snap.path;
  const normalizedSnap = snapshotPath.endsWith('/') && snapshotPath !== '/' ? snapshotPath.slice(0, -1) : snapshotPath;
  const normalizedTarget = dir.endsWith('/') && dir !== '/' ? dir.slice(0, -1) : dir;
  if (normalizedSnap !== normalizedTarget) return null;
  return snap;
};

const formatDateTime = (ts?: number) => {
  if (!ts) return '-';
  try {
    return new Date(ts * 1000).toLocaleString();
  } catch {
    return '-';
  }
};

const clampChannel = (value: number) => Math.max(0, Math.min(255, value));

const mixColor = (base: RgbColor, target: RgbColor, ratio: number): RgbColor => ({
  r: clampChannel(base.r * (1 - ratio) + target.r * ratio),
  g: clampChannel(base.g * (1 - ratio) + target.g * ratio),
  b: clampChannel(base.b * (1 - ratio) + target.b * ratio),
});

const rgbToRgba = (color: RgbColor, alpha: number) => `rgba(${Math.round(color.r)}, ${Math.round(color.g)}, ${Math.round(color.b)}, ${alpha})`;

const computeImageStats = (img: HTMLImageElement): { histogram: HistogramData | null; dominantColor: RgbColor | null } => {
  try {
    const maxSide = 720;
    const naturalWidth = img.naturalWidth || 1;
    const naturalHeight = img.naturalHeight || 1;
    const ratio = Math.min(1, maxSide / Math.max(naturalWidth, naturalHeight));
    const width = Math.max(1, Math.floor(naturalWidth * ratio));
    const height = Math.max(1, Math.floor(naturalHeight * ratio));
    const canvas = document.createElement('canvas');
    canvas.width = width;
    canvas.height = height;
    const ctx = canvas.getContext('2d', { willReadFrequently: true });
    if (!ctx) return { histogram: null, dominantColor: null };
    ctx.drawImage(img, 0, 0, width, height);
    const { data } = ctx.getImageData(0, 0, width, height);
const r = new Array(256).fill(0);
|
||||
const g = new Array(256).fill(0);
|
||||
const b = new Array(256).fill(0);
|
||||
let rTotal = 0;
|
||||
let gTotal = 0;
|
||||
let bTotal = 0;
|
||||
let count = 0;
|
||||
for (let i = 0; i < data.length; i += 4) {
|
||||
r[data[i]] += 1;
|
||||
g[data[i + 1]] += 1;
|
||||
b[data[i + 2]] += 1;
|
||||
rTotal += data[i];
|
||||
gTotal += data[i + 1];
|
||||
bTotal += data[i + 2];
|
||||
count += 1;
|
||||
}
|
||||
const histogram: HistogramData = { r, g, b };
|
||||
if (count === 0) return { histogram, dominantColor: null };
|
||||
const dominantColor: RgbColor = {
|
||||
r: rTotal / count,
|
||||
g: gTotal / count,
|
||||
b: bTotal / count,
|
||||
};
|
||||
return { histogram, dominantColor };
|
||||
} catch {
|
||||
return { histogram: null, dominantColor: null };
|
||||
}
|
||||
};
|
||||
|
||||
export const ImageViewerApp: React.FC<AppComponentProps> = ({ filePath, entry, onRequestClose }) => {
|
||||
const normalizedInitialPath = filePath.startsWith('/') ? filePath : `/${filePath}`;
|
||||
const [activeEntry, setActiveEntry] = useState<VfsEntry>(entry);
|
||||
const [activePath, setActivePath] = useState<string>(normalizedInitialPath);
|
||||
const [imageUrl, setImageUrl] = useState<string>();
|
||||
const [loading, setLoading] = useState(true);
|
||||
const [error, setError] = useState<string>();
|
||||
const [stat, setStat] = useState<FileStat | null>(null);
|
||||
const [histogram, setHistogram] = useState<HistogramData | null>(null);
|
||||
const [dominantColor, setDominantColor] = useState<RgbColor | null>(null);
|
||||
const [scale, setScale] = useState(1);
|
||||
const [offset, setOffset] = useState({ x: 0, y: 0 });
|
||||
const [rotate, setRotate] = useState(0);
|
||||
const [isDragging, setIsDragging] = useState(false);
|
||||
const [filmstrip, setFilmstrip] = useState<VfsEntry[]>([]);
|
||||
const [pageInfo, setPageInfo] = useState<{ page: number; total: number; pageSize: number } | null>(null);
|
||||
|
||||
const containerRef = useRef<HTMLDivElement | null>(null);
|
||||
const imageRef = useRef<HTMLImageElement | null>(null);
|
||||
const dragPointRef = useRef<{ x: number; y: number } | null>(null);
|
||||
const pinchDistanceRef = useRef<number | null>(null);
|
||||
const transitionRef = useRef(false);
|
||||
const filmstripRefs = useRef<Record<string, HTMLDivElement | null>>({});
|
||||
|
||||
const directory = useMemo(() => getDirectory(activePath), [activePath]);
|
||||
|
||||
const baseTone = useMemo<RgbColor>(() => dominantColor ?? DEFAULT_TONE, [dominantColor]);
|
||||
|
||||
const containerStyle = useMemo(() => {
|
||||
const light = mixColor(baseTone, { r: 255, g: 255, b: 255 }, 0.18);
|
||||
const shadow = mixColor(baseTone, { r: 0, g: 0, b: 0 }, 0.62);
|
||||
return {
|
||||
...viewerStyles.container,
|
||||
background: `linear-gradient(135deg, ${rgbToRgba(light, 0.78)} 0%, ${rgbToRgba(baseTone, 0.86)} 48%, ${rgbToRgba(shadow, 0.96)} 100%)`,
|
||||
};
|
||||
}, [baseTone]);
|
||||
|
||||
const mainBackdropStyle = useMemo(() => {
|
||||
const glow = mixColor(baseTone, { r: 255, g: 255, b: 255 }, 0.32);
|
||||
const shade = mixColor(baseTone, { r: 0, g: 0, b: 0 }, 0.7);
|
||||
return {
|
||||
...viewerStyles.mainBackdrop,
|
||||
background: `radial-gradient(circle at 18% 22%, ${rgbToRgba(glow, 0.38)}, ${rgbToRgba(shade, 0.94)} 68%)`,
|
||||
};
|
||||
}, [baseTone]);
|
||||
|
||||
const viewerStyle = useMemo(() => {
|
||||
const surface = mixColor(baseTone, { r: 0, g: 0, b: 0 }, 0.45);
|
||||
const edge = mixColor(baseTone, { r: 0, g: 0, b: 0 }, 0.65);
|
||||
return {
|
||||
...viewerStyles.viewer,
|
||||
background: `linear-gradient(145deg, ${rgbToRgba(surface, 0.7)} 0%, ${rgbToRgba(edge, 0.92)} 100%)`,
|
||||
backdropFilter: 'blur(28px)',
|
||||
};
|
||||
}, [baseTone]);
|
||||
|
||||
const controlsStyle = useMemo(() => {
|
||||
const tone = mixColor(baseTone, { r: 0, g: 0, b: 0 }, 0.52);
|
||||
return {
|
||||
...viewerStyles.controls,
|
||||
background: rgbToRgba(tone, 0.74),
|
||||
backdropFilter: 'blur(18px)',
|
||||
};
|
||||
}, [baseTone]);
|
||||
|
||||
const filmstripShellStyle = useMemo(() => {
|
||||
const tone = mixColor(baseTone, { r: 0, g: 0, b: 0 }, 0.56);
|
||||
return {
|
||||
...viewerStyles.filmstripShell,
|
||||
background: rgbToRgba(tone, 0.7),
|
||||
backdropFilter: 'blur(22px)',
|
||||
};
|
||||
}, [baseTone]);
|
||||
|
||||
const getThumbUrl = useCallback((item: VfsEntry) => {
|
||||
const full = joinPath(directory, item.name);
|
||||
return buildThumbUrl(full, 160, 120);
|
||||
}, [directory]);
|
||||
|
||||
const sidePanelStyle = useMemo(() => {
|
||||
const panel = mixColor(baseTone, { r: 0, g: 0, b: 0 }, 0.6);
|
||||
const border = rgbToRgba(mixColor(baseTone, { r: 255, g: 255, b: 255 }, 0.1), 0.28);
|
||||
return {
|
||||
...viewerStyles.sidePanel,
|
||||
background: rgbToRgba(panel, 0.8),
|
||||
backdropFilter: 'blur(28px)',
|
||||
borderLeft: `1px solid ${border}`,
|
||||
};
|
||||
}, [baseTone]);
|
||||
|
||||
const histogramCardStyle = useMemo(() => {
|
||||
const tone = mixColor(baseTone, { r: 0, g: 0, b: 0 }, 0.55);
|
||||
const stroke = rgbToRgba(mixColor(baseTone, { r: 255, g: 255, b: 255 }, 0.12), 0.2);
|
||||
return {
|
||||
...viewerStyles.histogramCard,
|
||||
background: rgbToRgba(tone, 0.58),
|
||||
border: `1px solid ${stroke}`,
|
||||
};
|
||||
}, [baseTone]);
|
||||
|
||||
useEffect(() => {
|
||||
const normalized = filePath.startsWith('/') ? filePath : `/${filePath}`;
|
||||
setActiveEntry(entry);
|
||||
setActivePath(normalized);
|
||||
}, [entry, filePath]);
|
||||
|
||||
useEffect(() => {
|
||||
let cancelled = false;
|
||||
setLoading(true);
|
||||
setError(undefined);
|
||||
setHistogram(null);
|
||||
setDominantColor(null);
|
||||
const cleaned = activePath.replace(/^\/+/, '');
|
||||
Promise.all([
|
||||
vfsApi.getTempLinkToken(cleaned),
|
||||
vfsApi.stat(activePath) as Promise<FileStat>,
|
||||
])
|
||||
.then(([token, metadata]) => {
|
||||
if (cancelled) return;
|
||||
setImageUrl(vfsApi.getTempPublicUrl(token.token));
|
||||
setStat(metadata);
|
||||
setScale(1);
|
||||
setRotate(0);
|
||||
setOffset({ x: 0, y: 0 });
|
||||
})
|
||||
.catch((err: unknown) => {
|
||||
if (!cancelled) {
|
||||
setError(err instanceof Error ? err.message : '加载失败');
|
||||
}
|
||||
})
|
||||
.finally(() => {
|
||||
if (!cancelled) setLoading(false);
|
||||
});
|
||||
return () => {
|
||||
cancelled = true;
|
||||
};
|
||||
}, [activePath]);
|
||||
|
||||
const refreshFilmstrip = useCallback((dir: string) => {
|
||||
const snap = readExplorerSnapshot(dir);
|
||||
if (snap) {
|
||||
const images = snap.entries.filter(isImageEntry);
|
||||
const ensured = images.some(item => item.name === activeEntry.name) ? images : [...images, activeEntry];
|
||||
setFilmstrip(ensured);
|
||||
if (snap.pagination) {
|
||||
setPageInfo({
|
||||
page: snap.pagination.page,
|
||||
pageSize: snap.pagination.page_size,
|
||||
total: snap.pagination.total,
|
||||
});
|
||||
} else {
|
||||
setPageInfo(null);
|
||||
}
|
||||
return;
|
||||
}
|
||||
setFilmstrip([activeEntry]);
|
||||
setPageInfo(null);
|
||||
}, [activeEntry]);
|
||||
|
||||
useEffect(() => {
|
||||
refreshFilmstrip(directory);
|
||||
}, [directory, refreshFilmstrip]);
|
||||
|
||||
useEffect(() => {
|
||||
const handler = () => refreshFilmstrip(directory);
|
||||
window.addEventListener('foxel:file-explorer-page', handler);
|
||||
return () => window.removeEventListener('foxel:file-explorer-page', handler);
|
||||
}, [directory, refreshFilmstrip]);
|
||||
|
||||
useEffect(() => {
|
||||
const el = filmstripRefs.current[activeEntry.name];
|
||||
if (el) {
|
||||
el.scrollIntoView({ behavior: 'smooth', inline: 'center', block: 'nearest' });
|
||||
}
|
||||
}, [activeEntry, filmstrip]);
|
||||
|
||||
useEffect(() => {
|
||||
const keyHandler = (e: KeyboardEvent) => {
|
||||
if (e.key === 'ArrowRight') {
|
||||
e.preventDefault();
|
||||
switchRelative(1);
|
||||
} else if (e.key === 'ArrowLeft') {
|
||||
e.preventDefault();
|
||||
switchRelative(-1);
|
||||
} else if ((e.key === '+' || e.key === '=') && (e.ctrlKey || e.metaKey)) {
|
||||
e.preventDefault();
|
||||
zoom(1.15);
|
||||
} else if ((e.key === '-' || e.key === '_') && (e.ctrlKey || e.metaKey)) {
|
||||
e.preventDefault();
|
||||
zoom(0.85);
|
||||
}
|
||||
};
|
||||
window.addEventListener('keydown', keyHandler);
|
||||
return () => window.removeEventListener('keydown', keyHandler);
|
||||
});
|
||||
|
||||
const zoom = useCallback((factor: number) => {
|
||||
setScale(prev => {
|
||||
const next = clamp(prev * factor, 0.08, 10);
|
||||
transitionRef.current = true;
|
||||
window.setTimeout(() => { transitionRef.current = false; }, 120);
|
||||
return next;
|
||||
});
|
||||
}, []);
|
||||
|
||||
const rotateImage = () => {
|
||||
setRotate(prev => {
|
||||
transitionRef.current = true;
|
||||
window.setTimeout(() => { transitionRef.current = false; }, 180);
|
||||
return (prev + 90) % 360;
|
||||
});
|
||||
};
|
||||
|
||||
const resetView = () => {
|
||||
transitionRef.current = true;
|
||||
window.setTimeout(() => { transitionRef.current = false; }, 160);
|
||||
setScale(1);
|
||||
setOffset({ x: 0, y: 0 });
|
||||
setRotate(0);
|
||||
};
|
||||
|
||||
const fitToScreen = () => {
|
||||
resetView();
|
||||
};
|
||||
|
||||
const onWheel = (e: React.WheelEvent) => {
|
||||
e.preventDefault();
|
||||
const container = containerRef.current;
|
||||
if (!container) return;
|
||||
const rect = container.getBoundingClientRect();
|
||||
const cx = e.clientX - rect.left - rect.width / 2;
|
||||
const cy = e.clientY - rect.top - rect.height / 2;
|
||||
setScale(prev => {
|
||||
const factor = e.deltaY < 0 ? 1.12 : 0.88;
|
||||
const next = clamp(prev * factor, 0.08, 10);
|
||||
const ratio = next / prev;
|
||||
setOffset(off => ({ x: off.x - cx * (ratio - 1), y: off.y - cy * (ratio - 1) }));
|
||||
transitionRef.current = true;
|
||||
window.setTimeout(() => { transitionRef.current = false; }, 120);
|
||||
return next;
|
||||
});
|
||||
};
|
||||
|
||||
const onMouseDown = (e: React.MouseEvent) => {
|
||||
if (e.button !== 0) return;
|
||||
e.preventDefault();
|
||||
setIsDragging(true);
|
||||
dragPointRef.current = { x: e.clientX, y: e.clientY };
|
||||
};
|
||||
|
||||
const onMouseMove = (e: React.MouseEvent) => {
|
||||
if (!isDragging || !dragPointRef.current) return;
|
||||
e.preventDefault();
|
||||
const dx = e.clientX - dragPointRef.current.x;
|
||||
const dy = e.clientY - dragPointRef.current.y;
|
||||
dragPointRef.current = { x: e.clientX, y: e.clientY };
|
||||
setOffset(off => ({ x: off.x + dx, y: off.y + dy }));
|
||||
};
|
||||
|
||||
const stopDragging = () => {
|
||||
setIsDragging(false);
|
||||
dragPointRef.current = null;
|
||||
};
|
||||
|
||||
const dist = (t1: React.Touch, t2: React.Touch) => Math.hypot(t1.clientX - t2.clientX, t1.clientY - t2.clientY);
|
||||
|
||||
const onTouchStart = (e: React.TouchEvent) => {
|
||||
if (e.touches.length === 1) {
|
||||
const t = e.touches[0];
|
||||
dragPointRef.current = { x: t.clientX, y: t.clientY };
|
||||
} else if (e.touches.length === 2) {
|
||||
pinchDistanceRef.current = dist(e.touches[0], e.touches[1]);
|
||||
}
|
||||
};
|
||||
|
||||
const onTouchMove = (e: React.TouchEvent) => {
|
||||
if (e.touches.length === 1 && dragPointRef.current) {
|
||||
const t = e.touches[0];
|
||||
const dx = t.clientX - dragPointRef.current.x;
|
||||
const dy = t.clientY - dragPointRef.current.y;
|
||||
dragPointRef.current = { x: t.clientX, y: t.clientY };
|
||||
setOffset(off => ({ x: off.x + dx, y: off.y + dy }));
|
||||
} else if (e.touches.length === 2 && pinchDistanceRef.current) {
|
||||
const dNow = dist(e.touches[0], e.touches[1]);
|
||||
const ratio = dNow / pinchDistanceRef.current;
|
||||
pinchDistanceRef.current = dNow;
|
||||
setScale(prev => clamp(prev * ratio, 0.08, 10));
|
||||
}
|
||||
};
|
||||
|
||||
const onTouchEnd = () => {
|
||||
pinchDistanceRef.current = null;
|
||||
dragPointRef.current = null;
|
||||
};
|
||||
|
||||
const onDoubleClick = (e: React.MouseEvent) => {
|
||||
e.preventDefault();
|
||||
const next = scale > 1.4 ? 1 : 2.2;
|
||||
const container = containerRef.current;
|
||||
if (!container) {
|
||||
setScale(next);
|
||||
return;
|
||||
}
|
||||
const rect = container.getBoundingClientRect();
|
||||
const cx = e.clientX - rect.left - rect.width / 2;
|
||||
const cy = e.clientY - rect.top - rect.height / 2;
|
||||
const ratio = next / scale;
|
||||
setScale(next);
|
||||
setOffset(off => ({ x: off.x - cx * (ratio - 1), y: off.y - cy * (ratio - 1) }));
|
||||
};
|
||||
|
||||
const handleImageLoaded = () => {
|
||||
const img = imageRef.current;
|
||||
if (!img) return;
|
||||
const stats = computeImageStats(img);
|
||||
setHistogram(stats.histogram);
|
||||
setDominantColor(stats.dominantColor);
|
||||
};
|
||||
|
||||
const switchEntry = (target: VfsEntry) => {
|
||||
const nextPath = joinPath(directory, target.name);
|
||||
setActiveEntry(target);
|
||||
setActivePath(nextPath);
|
||||
};
|
||||
|
||||
const switchRelative = (step: number) => {
|
||||
if (filmstrip.length <= 1) return;
|
||||
const currentIndex = filmstrip.findIndex(item => item.name === activeEntry.name);
|
||||
if (currentIndex === -1) return;
|
||||
const target = filmstrip[(currentIndex + step + filmstrip.length) % filmstrip.length];
|
||||
if (target) switchEntry(target);
|
||||
};
|
||||
|
||||
const scaleLabel = `${(scale * 100).toFixed(scale >= 1 ? 0 : 1)}%`;
|
||||
|
||||
const imageStyle: React.CSSProperties = {
|
||||
maxWidth: '100%',
|
||||
maxHeight: '100%',
|
||||
transform: `translate(${offset.x}px, ${offset.y}px) scale(${scale}) rotate(${rotate}deg)`,
|
||||
transition: transitionRef.current ? 'transform 0.18s cubic-bezier(.4,.8,.4,1)' : undefined,
|
||||
cursor: isDragging ? 'grabbing' : scale > 1 ? 'grab' : 'zoom-in',
|
||||
willChange: 'transform',
|
||||
};
|
||||
|
||||
const controlsNode = (
|
||||
<ViewerControls
|
||||
style={controlsStyle}
|
||||
onPrev={() => switchRelative(-1)}
|
||||
onNext={() => switchRelative(1)}
|
||||
onZoomIn={() => zoom(1.18)}
|
||||
onZoomOut={() => zoom(0.82)}
|
||||
onRotate={rotateImage}
|
||||
onReset={resetView}
|
||||
onFit={fitToScreen}
|
||||
disableSwitch={filmstrip.length <= 1}
|
||||
/>
|
||||
);
|
||||
|
||||
const exif = (stat?.exif ?? {}) as Record<string, unknown>;
|
||||
const infoIconStyle: React.CSSProperties = { fontSize: 15, color: 'rgba(255,255,255,0.62)' };
|
||||
const exifValue = (key: string): string | number | null => {
|
||||
const value = exif[key];
|
||||
if (typeof value === 'string' || typeof value === 'number') return value;
|
||||
return null;
|
||||
};
|
||||
const focalLength = (() => {
|
||||
const v = parseNumberish(exifValue('37386') ?? exifValue('37377'));
|
||||
return v ? `${v.toFixed(1)} mm` : null;
|
||||
})();
|
||||
const aperture = (() => {
|
||||
const v = parseNumberish(exifValue('33437') ?? exifValue('37378'));
|
||||
return v ? `f/${v.toFixed(1)}` : null;
|
||||
})();
|
||||
const exposure = (() => {
|
||||
const v = parseNumberish(exifValue('33434'));
|
||||
if (!v) return null;
|
||||
if (v >= 1) return `${v.toFixed(1)} s`;
|
||||
const denom = Math.max(1, Math.round(1 / v));
|
||||
return `1/${denom}`;
|
||||
})();
|
||||
const isoValue = exifValue('34855') ?? exifValue('34864');
|
||||
const width = parseNumberish(exifValue('40962'));
|
||||
const height = parseNumberish(exifValue('40963'));
|
||||
const colorSpace = exifValue('40961');
|
||||
const cameraMake = exifValue('271');
|
||||
const cameraModel = exifValue('272');
|
||||
const lensModel = exifValue('42036');
|
||||
const captureTime = exifValue('36867') ?? exifValue('36868') ?? exifValue('306');
|
||||
|
||||
const basicList: InfoItem[] = [
|
||||
{ label: '文件名', value: activeEntry.name, icon: <FileOutlined style={infoIconStyle} /> },
|
||||
{ label: '文件大小', value: humanFileSize(stat?.size), icon: <DatabaseOutlined style={infoIconStyle} /> },
|
||||
{ label: '分辨率', value: width && height ? `${width} × ${height}` : null, icon: <ExpandOutlined style={infoIconStyle} /> },
|
||||
{ label: '颜色空间', value: colorSpace ?? null, icon: <BgColorsOutlined style={infoIconStyle} /> },
|
||||
{ label: '修改时间', value: stat?.mtime ? formatDateTime(stat.mtime) : null, icon: <ClockCircleOutlined style={infoIconStyle} /> },
|
||||
{ label: '路径', value: typeof stat?.path === 'string' ? stat.path : activePath, icon: <FolderOutlined style={infoIconStyle} /> },
|
||||
];
|
||||
|
||||
const shootingList: InfoItem[] = [
|
||||
{ label: '焦距', value: focalLength, icon: <AimOutlined style={infoIconStyle} /> },
|
||||
{ label: '光圈', value: aperture, icon: <BulbOutlined style={infoIconStyle} /> },
|
||||
{ label: '快门', value: exposure, icon: <ThunderboltOutlined style={infoIconStyle} /> },
|
||||
{ label: 'ISO', value: isoValue != null ? isoValue.toString() : null, icon: <AlertOutlined style={infoIconStyle} /> },
|
||||
];
|
||||
|
||||
const deviceList: InfoItem[] = [
|
||||
{
|
||||
label: '相机',
|
||||
value: cameraModel ? `${cameraMake ? `${cameraMake} ` : ''}${cameraModel}` : (cameraMake ?? null),
|
||||
icon: <CameraOutlined style={infoIconStyle} />,
|
||||
},
|
||||
{ label: '镜头', value: lensModel ?? null, icon: <ApiOutlined style={infoIconStyle} /> },
|
||||
];
|
||||
|
||||
const miscList: InfoItem[] = [
|
||||
{ label: '拍摄时间', value: captureTime, icon: <FieldTimeOutlined style={infoIconStyle} /> },
|
||||
];
|
||||
|
||||
return (
|
||||
<div style={containerStyle}>
|
||||
<section style={viewerStyles.main}>
|
||||
<div style={mainBackdropStyle} />
|
||||
<div style={viewerStyles.mainContent}>
|
||||
<ImageCanvas
|
||||
containerRef={containerRef}
|
||||
imageRef={imageRef}
|
||||
viewerStyle={viewerStyle}
|
||||
controls={controlsNode}
|
||||
scaleLabel={scaleLabel}
|
||||
imageStyle={imageStyle}
|
||||
loading={loading}
|
||||
error={error}
|
||||
imageUrl={imageUrl}
|
||||
activeEntry={activeEntry}
|
||||
onRequestClose={onRequestClose}
|
||||
onImageLoad={handleImageLoaded}
|
||||
onWheel={onWheel}
|
||||
onMouseDown={onMouseDown}
|
||||
onMouseMove={onMouseMove}
|
||||
onMouseLeave={stopDragging}
|
||||
onMouseUp={stopDragging}
|
||||
onDoubleClick={onDoubleClick}
|
||||
onTouchStart={onTouchStart}
|
||||
onTouchMove={onTouchMove}
|
||||
onTouchEnd={onTouchEnd}
|
||||
/>
|
||||
|
||||
<Filmstrip
|
||||
shellStyle={filmstripShellStyle}
|
||||
listStyle={viewerStyles.filmstrip}
|
||||
entries={filmstrip}
|
||||
activeEntry={activeEntry}
|
||||
onSelect={switchEntry}
|
||||
filmstripRefs={filmstripRefs}
|
||||
pageInfo={pageInfo}
|
||||
getThumbUrl={getThumbUrl}
|
||||
/>
|
||||
</div>
|
||||
</section>
|
||||
|
||||
<InfoPanel
|
||||
style={sidePanelStyle}
|
||||
histogramCardStyle={histogramCardStyle}
|
||||
title={activeEntry.name}
|
||||
captureTime={captureTime ?? null}
|
||||
basicList={basicList}
|
||||
shootingList={shootingList}
|
||||
deviceList={deviceList}
|
||||
miscList={miscList}
|
||||
histogram={histogram}
|
||||
/>
|
||||
</div>
|
||||
);
|
||||
};
|
||||
@@ -1,94 +0,0 @@
import React from 'react';
import { Typography } from 'antd';
import type { VfsEntry } from '../../../api/client';

interface PageInfo {
  page: number;
  total: number;
  pageSize: number;
}

interface FilmstripProps {
  shellStyle: React.CSSProperties;
  listStyle: React.CSSProperties;
  entries: VfsEntry[];
  activeEntry: VfsEntry;
  onSelect: (entry: VfsEntry) => void;
  filmstripRefs: React.MutableRefObject<Record<string, HTMLDivElement | null>>;
  pageInfo: PageInfo | null;
  getThumbUrl: (entry: VfsEntry) => string;
}

export const Filmstrip: React.FC<FilmstripProps> = ({
  shellStyle,
  listStyle,
  entries,
  activeEntry,
  onSelect,
  filmstripRefs,
  pageInfo,
  getThumbUrl,
}) => (
  <div style={shellStyle}>
    <div style={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center', marginBottom: 10 }}>
      <Typography.Text style={{ color: 'rgba(255,255,255,0.72)', fontWeight: 500 }}>
        胶片带 · {entries.length} 张
      </Typography.Text>
      {pageInfo && (
        <Typography.Text style={{ color: 'rgba(255,255,255,0.45)', fontSize: 12 }}>
          第 {pageInfo.page} 页 / 共 {Math.max(1, Math.ceil(pageInfo.total / pageInfo.pageSize))} 页
        </Typography.Text>
      )}
    </div>
    <div style={listStyle}>
      {entries.map(item => {
        const active = item.name === activeEntry.name;
        return (
          <div
            key={`${item.name}-${item.mtime ?? ''}`}
            ref={el => { filmstripRefs.current[item.name] = el; }}
            onClick={() => onSelect(item)}
            style={{
              width: 84,
              height: 64,
              overflow: 'hidden',
              border: active ? '2px solid #4e9bff' : '2px solid transparent',
              boxShadow: active ? '0 0 0 4px rgba(78,155,255,0.28)' : '0 10px 28px rgba(0,0,0,0.45)',
              cursor: 'pointer',
              position: 'relative',
              flex: '0 0 auto',
            }}
          >
            <img
              src={getThumbUrl(item)}
              alt={item.name}
              style={{ width: '100%', height: '100%', objectFit: 'cover', filter: active ? 'saturate(1)' : 'saturate(0.65)' }}
            />
            {active && (
              <div
                style={{
                  position: 'absolute',
                  bottom: 4,
                  left: 6,
                  right: 6,
                  padding: '2px 4px',
                  background: 'rgba(0,0,0,0.55)',
                  color: '#fff',
                  fontSize: 10,
                  whiteSpace: 'nowrap',
                  overflow: 'hidden',
                  textOverflow: 'ellipsis',
                }}
              >
                {item.name}
              </div>
            )}
          </div>
        );
      })}
      {entries.length === 0 && (
        <div style={{ color: 'rgba(255,255,255,0.45)' }}>暂无图片</div>
      )}
    </div>
  </div>
);
@@ -1,99 +0,0 @@
import React from 'react';
import { Spin, Typography, Tooltip, Button } from 'antd';
import { CloseOutlined } from '@ant-design/icons';
import type { VfsEntry } from '../../../api/client';
import { viewerStyles } from '../styles';

interface ImageCanvasProps {
  containerRef: React.RefObject<HTMLDivElement | null>;
  imageRef: React.RefObject<HTMLImageElement | null>;
  viewerStyle: React.CSSProperties;
  controls: React.ReactNode;
  scaleLabel: string;
  imageStyle: React.CSSProperties;
  loading: boolean;
  error?: string;
  imageUrl?: string;
  activeEntry: VfsEntry;
  onRequestClose: () => void;
  onImageLoad: () => void;
  onWheel: React.WheelEventHandler<HTMLDivElement>;
  onMouseDown: React.MouseEventHandler<HTMLDivElement>;
  onMouseMove: React.MouseEventHandler<HTMLDivElement>;
  onMouseLeave: React.MouseEventHandler<HTMLDivElement>;
  onMouseUp: React.MouseEventHandler<HTMLDivElement>;
  onDoubleClick: React.MouseEventHandler<HTMLDivElement>;
  onTouchStart: React.TouchEventHandler<HTMLDivElement>;
  onTouchMove: React.TouchEventHandler<HTMLDivElement>;
  onTouchEnd: React.TouchEventHandler<HTMLDivElement>;
}

export const ImageCanvas: React.FC<ImageCanvasProps> = ({
  containerRef,
  imageRef,
  viewerStyle,
  controls,
  scaleLabel,
  imageStyle,
  loading,
  error,
  imageUrl,
  activeEntry,
  onRequestClose,
  onImageLoad,
  onWheel,
  onMouseDown,
  onMouseMove,
  onMouseLeave,
  onMouseUp,
  onDoubleClick,
  onTouchStart,
  onTouchMove,
  onTouchEnd,
}) => (
  <div
    ref={containerRef}
    style={viewerStyle}
    onWheel={onWheel}
    onMouseDown={onMouseDown}
    onMouseMove={onMouseMove}
    onMouseLeave={onMouseLeave}
    onMouseUp={onMouseUp}
    onDoubleClick={onDoubleClick}
    onTouchStart={onTouchStart}
    onTouchMove={onTouchMove}
    onTouchEnd={onTouchEnd}
  >
    <div style={viewerStyles.viewerCloseWrap}>
      <Tooltip title="关闭">
        <Button
          type="text"
          icon={<CloseOutlined />}
          onClick={onRequestClose}
          style={viewerStyles.viewerClose}
        />
      </Tooltip>
    </div>
    {loading ? (
      <Spin tip="加载中" />
    ) : error ? (
      <Typography.Text type="danger">{error}</Typography.Text>
    ) : imageUrl ? (
      <img
        ref={imageRef}
        src={imageUrl}
        alt={activeEntry.name}
        onLoad={onImageLoad}
        draggable={false}
        crossOrigin="anonymous"
        style={imageStyle}
      />
    ) : (
      <Typography.Text>无可用内容</Typography.Text>
    )}

    <div style={viewerStyles.scaleBadge}>{scaleLabel}</div>

    {controls}
  </div>
);
@@ -1,116 +0,0 @@
import React from 'react';
import { Typography, Empty } from 'antd';
import type { HistogramData, InfoItem } from './types';

interface InfoPanelProps {
  style: React.CSSProperties;
  histogramCardStyle: React.CSSProperties;
  title: string;
  captureTime: string | number | null;
  basicList: InfoItem[];
  shootingList: InfoItem[];
  deviceList: InfoItem[];
  miscList: InfoItem[];
  histogram: HistogramData | null;
}

const SectionTitle: React.FC<{ children: React.ReactNode }> = ({ children }) => (
  <Typography.Title level={5} style={{ color: '#fff', fontSize: 15, marginTop: 24, marginBottom: 12 }}>
    {children}
  </Typography.Title>
);

const HistogramPlot: React.FC<{ data: HistogramData | null }> = ({ data }) => {
  if (!data) {
    return <Empty description="无法解析直方图" image={Empty.PRESENTED_IMAGE_SIMPLE} />;
  }
  const width = 260;
  const height = 140;
  const max = Math.max(...data.r, ...data.g, ...data.b, 1);
  const toPath = (arr: number[]) => arr
    .map((value, index) => {
      const x = (index / 255) * width;
      const y = height - (value / max) * height;
      return `${index === 0 ? 'M' : 'L'}${x.toFixed(2)},${y.toFixed(2)}`;
    })
    .join(' ');
  return (
    <svg width={width} height={height} viewBox={`0 0 ${width} ${height}`} style={{ width: '100%' }}>
      <rect x={0} y={0} width={width} height={height} fill="rgba(255,255,255,0.04)" />
      <path d={toPath(data.r)} stroke="rgba(255,99,132,0.88)" fill="none" strokeWidth={1.3} />
      <path d={toPath(data.g)} stroke="rgba(75,192,192,0.88)" fill="none" strokeWidth={1.3} />
      <path d={toPath(data.b)} stroke="rgba(54,162,235,0.88)" fill="none" strokeWidth={1.3} />
    </svg>
  );
};

const InfoRows: React.FC<{ items: InfoItem[] }> = ({ items }) => (
  <div style={{ display: 'grid', gridTemplateColumns: '100px 1fr', rowGap: 10, columnGap: 12 }}>
    {items
      .filter(item => item.value !== null && item.value !== undefined && item.value !== '')
      .map(item => (
        <React.Fragment key={item.label}>
          <span style={{ display: 'inline-flex', alignItems: 'center', gap: 6, color: 'rgba(255,255,255,0.55)' }}>
            {item.icon && <span style={{ display: 'inline-flex', alignItems: 'center' }}>{item.icon}</span>}
            <span>{item.label}</span>
          </span>
          <span style={{ color: '#fff', wordBreak: 'break-all' }}>{item.value}</span>
        </React.Fragment>
      ))}
  </div>
);

export const InfoPanel: React.FC<InfoPanelProps> = ({
  style,
  histogramCardStyle,
  title,
  captureTime,
  basicList,
  shootingList,
  deviceList,
  miscList,
  histogram,
}) => (
  <aside style={style}>
    <Typography.Title level={3} style={{ color: '#fff', marginTop: 6, wordBreak: 'break-all' }}>
      {title}
    </Typography.Title>
    {captureTime && (
      <Typography.Text style={{ color: 'rgba(255,255,255,0.6)' }}>拍摄时间 {captureTime}</Typography.Text>
    )}

    <SectionTitle>基本信息</SectionTitle>
    <InfoRows items={basicList} />

    {shootingList.some(i => i.value) && (
      <>
        <SectionTitle>拍摄参数</SectionTitle>
        <InfoRows items={shootingList} />
      </>
    )}

    {deviceList.some(i => i.value) && (
      <>
        <SectionTitle>设备信息</SectionTitle>
        <InfoRows items={deviceList} />
      </>
    )}

    {miscList.some(i => i.value) && (
      <>
        <SectionTitle>其他</SectionTitle>
        <InfoRows items={miscList} />
      </>
    )}

    <SectionTitle>直方图</SectionTitle>
    <div style={histogramCardStyle}>
      <HistogramPlot data={histogram} />
      <div style={{ marginTop: 12, display: 'flex', gap: 12, fontSize: 12 }}>
        <span style={{ color: 'rgba(255,99,132,0.88)' }}>R</span>
        <span style={{ color: 'rgba(75,192,192,0.88)' }}>G</span>
        <span style={{ color: 'rgba(54,162,235,0.88)' }}>B</span>
      </div>
    </div>
  </aside>
);
@@ -1,73 +0,0 @@
import React from 'react';
import { Button, Tooltip } from 'antd';
import {
  LeftOutlined,
  RightOutlined,
  ZoomInOutlined,
  ZoomOutOutlined,
  RotateRightOutlined,
  ReloadOutlined,
  CompressOutlined,
} from '@ant-design/icons';

interface ViewerControlsProps {
  style: React.CSSProperties;
  onPrev: () => void;
  onNext: () => void;
  onZoomIn: () => void;
  onZoomOut: () => void;
  onRotate: () => void;
  onReset: () => void;
  onFit: () => void;
  disableSwitch: boolean;
}

export const ViewerControls: React.FC<ViewerControlsProps> = ({
  style,
  onPrev,
  onNext,
  onZoomIn,
  onZoomOut,
  onRotate,
  onReset,
  onFit,
  disableSwitch,
}) => (
  <div style={style}>
    <Tooltip title="上一张">
      <Button
        shape="circle"
        type="text"
        icon={<LeftOutlined />}
        onClick={onPrev}
        disabled={disableSwitch}
        style={{ color: '#fff' }}
      />
    </Tooltip>
    <Tooltip title="缩小">
      <Button shape="circle" type="text" icon={<ZoomOutOutlined />} onClick={onZoomOut} style={{ color: '#fff' }} />
    </Tooltip>
    <Tooltip title="放大">
      <Button shape="circle" type="text" icon={<ZoomInOutlined />} onClick={onZoomIn} style={{ color: '#fff' }} />
    </Tooltip>
    <Tooltip title="旋转 90°">
      <Button shape="circle" type="text" icon={<RotateRightOutlined />} onClick={onRotate} style={{ color: '#fff' }} />
    </Tooltip>
    <Tooltip title="重置">
      <Button shape="circle" type="text" icon={<ReloadOutlined />} onClick={onReset} style={{ color: '#fff' }} />
    </Tooltip>
    <Tooltip title="适应窗口">
      <Button shape="circle" type="text" icon={<CompressOutlined />} onClick={onFit} style={{ color: '#fff' }} />
    </Tooltip>
    <Tooltip title="下一张">
      <Button
        shape="circle"
        type="text"
        icon={<RightOutlined />}
        onClick={onNext}
        disabled={disableSwitch}
        style={{ color: '#fff' }}
      />
    </Tooltip>
  </div>
);
@@ -1,19 +0,0 @@

```tsx
import type { ReactNode } from 'react';

export interface HistogramData {
  r: number[];
  g: number[];
  b: number[];
}

export interface RgbColor {
  r: number;
  g: number;
  b: number;
}

export interface InfoItem {
  label: string;
  value: string | number | null;
  icon?: ReactNode;
}
```
||||
@@ -1,23 +0,0 @@

```tsx
import type { AppDescriptor } from '../types';
import { ImageViewerApp } from './ImageViewer.tsx';

const supportedExts = ['png', 'jpg', 'jpeg', 'gif', 'webp', 'svg', 'bmp', 'ico', 'avif', 'arw', 'cr2', 'cr3', 'nef', 'rw2', 'orf', 'pef', 'dng'];

export const descriptor: AppDescriptor = {
  key: 'image-viewer',
  name: '图片查看器',
  iconUrl: 'https://api.iconify.design/mdi:image.svg',
  description: '内置图片查看器,支持常见图片与部分 RAW 格式预览。',
  author: 'Foxel',
  supportedExts,
  supported: (entry) => {
    if (entry.is_dir) return false;
    const ext = entry.name.split('.').pop()?.toLowerCase() || '';
    return supportedExts.includes(ext);
  },
  component: ImageViewerApp,
  default: true,
  defaultMaximized: true,
  useSystemWindow: false,
  defaultBounds: { width: 820, height: 620, x: 140, y: 96 }
};
```
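The descriptor's `supported` predicate reduces to a case-insensitive check of the file's last extension against `supportedExts`. A minimal standalone sketch of that logic (the `isImageSupported` helper name is ours, not part of Foxel's API):

```typescript
// Hypothetical standalone sketch of the extension check performed by the
// image-viewer descriptor's `supported` predicate.
const supportedExts = ['png', 'jpg', 'jpeg', 'gif', 'webp', 'svg', 'bmp', 'ico', 'avif', 'arw', 'cr2', 'cr3', 'nef', 'rw2', 'orf', 'pef', 'dng'];

function isImageSupported(name: string, isDir: boolean): boolean {
  if (isDir) return false; // directories are never previewable
  // Text after the last dot, lowercased; names without a dot yield the
  // whole name, which simply won't match any entry in the list.
  const ext = name.split('.').pop()?.toLowerCase() || '';
  return supportedExts.includes(ext);
}

console.log(isImageSupported('photo.JPG', false)); // true
console.log(isImageSupported('notes.txt', false)); // false
```

Note that RAW extensions such as `cr3` and `dng` pass this check even though actual decoding happens inside `ImageViewerApp`; the predicate only gates which app is offered for an entry.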
Some files were not shown because too many files have changed in this diff.