Compare commits

...

133 Commits

Author SHA1 Message Date
jxxghp
1145b32299 fix plugin install 2025-09-18 22:32:04 +08:00
jxxghp
ab71df0011 Merge pull request #4971 from cddjr/fix_glitch 2025-09-18 21:00:00 +08:00
jxxghp
fb137252a9 fix plugin id lower case 2025-09-18 18:00:15 +08:00
jxxghp
f57a680306 Support passing a repo_url parameter when installing plugins 2025-09-18 17:42:12 +08:00
景大侠
8bb3eaa320 fix NoneType exception when fetching the last search results
glitchtip#14
2025-09-18 17:23:20 +08:00
景大侠
9489730a44 fix NoneType exception when u115 access_token refresh fails
glitchtip#49549
2025-09-18 17:23:20 +08:00
景大侠
d4795bb897 fix "unexpected keyword argument" error when u115 retries a request
glitchtip#136696
2025-09-18 17:23:19 +08:00
景大侠
63775872c7 fix NoneType error caused by TMDB connection failures
glitchtip#11
2025-09-18 17:05:09 +08:00
jxxghp
beff508a1f Merge pull request #4970 from cddjr/fix_trimemedia 2025-09-18 15:55:46 +08:00
景大侠
deaae8a2c6 fix 2025-09-18 15:39:10 +08:00
景大侠
46a27bd50c fix: 飞牛影视 (Trimemedia) 2025-09-18 15:27:02 +08:00
jxxghp
24f2993433 Merge pull request #4958 from cddjr/fix_browse_mteam 2025-09-17 07:04:59 +08:00
景大侠
c80bfbfac5 fix: NoneType error when browsing M-Team (馒头) 2025-09-17 01:59:28 +08:00
jxxghp
06abfc45c7 Update version.py 2025-09-16 20:30:38 +08:00
jxxghp
440a773081 fix 2025-09-16 17:56:44 +08:00
jxxghp
0797bcb38b fix 2025-09-16 13:10:31 +08:00
jxxghp
d463b5bf0d Merge pull request #4955 from jxxghp/cursor/add-sort-type-to-subscription-queries-af67 2025-09-16 11:41:08 +08:00
Cursor Agent
0733c8edcc Add sort_type parameter to subscribe endpoints
Co-authored-by: jxxghp <jxxghp@qq.com>
2025-09-16 03:29:28 +00:00
jxxghp
86c7c05cb1 feat: add optional parameters to the subscribe-share data endpoint 2025-09-16 07:38:56 +08:00
jxxghp
18ff7ce753 feat: add optional parameters to subscribe statistics 2025-09-16 07:37:14 +08:00
jxxghp
8f2ed1004d Merge pull request #4952 from cddjr/fix_file_perm 2025-09-16 07:00:45 +08:00
景大侠
14961323c3 fix umask 2025-09-15 22:01:00 +08:00
景大侠
f8c682b183 fix: scraped files were created with 0600 permissions 2025-09-15 21:49:37 +08:00
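A note on the 0600 fix above: temp files are created with mode 0600, so files written through them inherit that. Python's `os.umask` has no read-only accessor, so the usual approach (function names here are illustrative, not the project's actual code) is a set-and-restore dance, then re-applying the process umask to a sane default mode:

```python
import os

def get_current_umask() -> int:
    # os.umask has no read-only API: briefly set it to 0,
    # capture the previous value, then restore it
    current = os.umask(0)
    os.umask(current)
    return current

def relax_temp_file_mode(path: str) -> None:
    # temp files are created as 0600; re-apply 0666 masked by the umask
    os.chmod(path, 0o666 & ~get_current_umask())
```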
jxxghp
dd92708f60 Merge pull request #4947 from pluto0x0/fix/4941-mttorent-imdb-search 2025-09-15 14:23:17 +08:00
Zifan Ying
4d9eeccefa fix: provide the full link when mtorrent searches by IMDb
fix: mtorrent IMDb searches require a full link (e.g. https://www.imdb.com/title/tt3058674)
Add the link prefix when the keyword is an IMDb entry
See https://wiki.m-team.cc/zh-tw/imdbtosearch
 
issue: https://github.com/jxxghp/MoviePilot/issues/4941
2025-09-15 00:31:45 -05:00
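The prefixing described in the commit above can be sketched as follows (the regex and function name are assumptions, not the actual patch):

```python
import re

# bare IMDb ids look like tt3058674 (7-8 digits)
IMDB_ID = re.compile(r"^tt\d{7,8}$")

def normalize_imdb_keyword(keyword: str) -> str:
    # m-team's IMDb search expects a full title URL, not a bare tt-id;
    # leave non-IMDb keywords untouched
    if IMDB_ID.match(keyword):
        return f"https://www.imdb.com/title/{keyword}"
    return keyword
```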
jxxghp
cd7b251031 Merge pull request #4946 from developer-wlj/wlj0914 2025-09-14 17:30:11 +08:00
developer-wlj
db614180b9 Revert "refactor: optimize temporary file creation and upload logic"
This reverts commit 77c0f8f39e.
2025-09-14 17:14:52 +08:00
jxxghp
b6e527e5f4 Merge pull request #4945 from developer-wlj/wlj0914 2025-09-14 16:54:37 +08:00
developer-wlj
77c0f8f39e refactor: optimize temporary file creation and upload logic
- Use a with statement to manage temporary file creation and closing automatically, improving readability and safety
- Restructured the code to reduce nested try statements and make it clearer
2025-09-14 16:46:27 +08:00
jxxghp
58816d73c8 Merge pull request #4944 from developer-wlj/wlj0914 2025-09-14 16:42:37 +08:00
developer-wlj
3b194d282e fix: scraping failed on Windows because the temporary file was still in use
- Changed the temporary-file creation and deletion logic in two functions
- Delete manually instead of relying on auto-delete to ensure temporary files are cleaned up
- Added exception handling to log failed temporary-file deletions
2025-09-14 16:28:24 +08:00
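The Windows issue above stems from `NamedTemporaryFile(delete=True)` keeping the file open and locked, so it cannot be re-opened by name until it is closed (and closing deletes it). A sketch of the manual-deletion approach the fix describes (names are illustrative):

```python
import os
import tempfile

def write_temp(data: bytes) -> str:
    # delete=False so the file can be closed (releasing the Windows lock)
    # and re-opened by name later; cleanup becomes the caller's job
    tmp = tempfile.NamedTemporaryFile(delete=False)
    try:
        tmp.write(data)
    finally:
        tmp.close()
    return tmp.name

def remove_temp(path: str) -> None:
    # delete manually; swallow (in real code: log) failures instead of crashing
    try:
        os.unlink(path)
    except OSError:
        pass  # a real implementation would log this
```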
jxxghp
397f66433d v2.8.0 2025-09-13 15:58:00 +08:00
jxxghp
04a4ed1d0e fix delete_media_file 2025-09-13 14:10:15 +08:00
jxxghp
625850d4e7 fix 2025-09-13 13:35:51 +08:00
jxxghp
6c572baca5 rollback 2025-09-13 13:32:48 +08:00
jxxghp
ee0406a13f Handle smb protocol key error during disconnect (#4938)
* Refactor: Improve SMB connection handling and add signal handling

Co-authored-by: jxxghp <jxxghp@qq.com>

* Remove test_smb_fix.py

Co-authored-by: jxxghp <jxxghp@qq.com>

---------

Co-authored-by: Cursor Agent <cursoragent@cursor.com>
Co-authored-by: jxxghp <jxxghp@qq.com>
2025-09-13 11:25:29 +08:00
jxxghp
608a049ba3 fix smb delete 2025-09-13 11:05:21 +08:00
jxxghp
4d9b5198e2 Enhance the delete functionality of SMB storage 2025-09-13 10:56:45 +08:00
jxxghp
24b6c970aa feat: Emby username 2025-09-13 10:34:41 +08:00
jxxghp
239c47f469 fix #4917 2025-09-13 10:13:33 +08:00
jxxghp
f0fc64c517 fix #4917 2025-09-13 10:12:40 +08:00
jxxghp
8481fd38ce fix #4933 2025-09-13 09:54:28 +08:00
jxxghp
5f425129d5 fix #4934 2025-09-13 09:46:04 +08:00
jxxghp
92955b1315 fix: run file organizing in a forked process 2025-09-13 08:56:05 +08:00
jxxghp
a3872d5bb5 fix: run file organizing in a forked process 2025-09-13 08:50:20 +08:00
jxxghp
a123ff2c04 feat: run file organizing in a forked process 2025-09-13 08:32:31 +08:00
jxxghp
188de34306 mini chunk size 2025-09-12 21:45:26 +08:00
jxxghp
3d43750e9b fix async event 2025-09-10 17:33:12 +08:00
jxxghp
fea228c68d add SUPERUSER_PASSWORD 2025-09-10 15:42:17 +08:00
jxxghp
a71a28e563 Update config.py 2025-09-10 07:00:10 +08:00
jxxghp
3b5d4982b5 add wizard flag 2025-09-09 13:50:11 +08:00
jxxghp
b201e9ab8c Revert "feat: operate on files in a subprocess"
This reverts commit 4f304a70b7.
2025-09-08 17:23:25 +08:00
jxxghp
d30b9282fd fix alipan u115 error log 2025-09-08 17:13:01 +08:00
jxxghp
4f304a70b7 feat: operate on files in a subprocess 2025-09-08 16:59:29 +08:00
jxxghp
59a54d4f04 fix plugin cache 2025-09-08 13:27:32 +08:00
jxxghp
1e94d794ed fix log 2025-09-08 12:12:00 +08:00
jxxghp
5bd210406b Merge pull request #4918 from cddjr/fix_4853 2025-09-08 11:36:41 +08:00
景大侠
e00514d36d fix: convert RSS publish dates to the local timezone 2025-09-08 11:28:08 +08:00
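RSS `pubDate` values are RFC 2822 dates with an explicit offset, so the conversion presumably amounts to parsing and calling `astimezone()`. A standard-library sketch (an assumed approach, not necessarily the patch's code):

```python
from datetime import datetime
from email.utils import parsedate_to_datetime

def rss_date_to_local(pubdate: str) -> datetime:
    # parsedate_to_datetime handles RFC 2822 offsets ("GMT", "+0800", ...)
    dt = parsedate_to_datetime(pubdate)
    # astimezone() with no argument converts to the system's local timezone
    return dt.astimezone()
```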
jxxghp
f013bf1931 fix 2025-09-08 10:59:28 +08:00
jxxghp
107cbbad1d fix 2025-09-08 10:54:45 +08:00
jxxghp
481f1f9d30 add full gc scheduler 2025-09-08 10:49:09 +08:00
jxxghp
704364061c fix redis test 2025-09-08 09:59:11 +08:00
jxxghp
c1bd2d6cf1 fix: optimize downloading 2025-09-08 09:50:08 +08:00
jxxghp
a018e1228c Merge pull request #4904 from DDS-Derek/fix_gosu 2025-09-05 21:40:41 +08:00
DDSRem
d962d9c7f6 feat(docker): add START_NOGOSU mode
fix https://github.com/jxxghp/MoviePilot/issues/4889
2025-09-05 21:30:59 +08:00
jxxghp
4ea28cbca5 fix #4902 2025-09-05 21:09:05 +08:00
jxxghp
1b48b8b4cc Merge pull request #4902 from DDS-Derek/dev 2025-09-05 20:06:42 +08:00
jxxghp
73df197e33 Merge pull request #4903 from imtms/v2 2025-09-05 20:05:28 +08:00
TMs
bdc66e55ca fix(LocalStorage): check whether the source and destination files are the same to prevent the file from being deleted. 2025-09-05 20:02:37 +08:00
DDSRem
926343ee86 fix(u115): code logic vulnerabilities 2025-09-05 19:37:41 +08:00
DDSRem
8e6021c5e7 fix(u115): code logic vulnerabilities 2025-09-05 19:23:23 +08:00
jxxghp
ac2b6c76ce Update version.py 2025-09-05 12:04:26 +08:00
jxxghp
9e966d0a7f Merge pull request #4898 from wumode/fix_alist 2025-09-04 21:16:58 +08:00
wumode
6c10defaa1 fix(Alist): add type hints 2025-09-04 21:08:25 +08:00
wumode
b6a76f6f7c fix(Alist): add __len__() 2025-09-04 20:47:13 +08:00
jxxghp
84e5b77a5c rollback orjson 2025-09-04 11:53:39 +08:00
jxxghp
89b0ea0bf1 remove monitoring 2025-09-04 11:23:22 +08:00
jxxghp
48aeb98bf1 add orjson 2025-09-04 08:52:36 +08:00
jxxghp
8a5d864812 Update config.py 2025-09-04 08:28:42 +08:00
jxxghp
ae79e645a6 Merge pull request #4893 from Aqr-K/feat-plugin-wheels 2025-09-03 14:30:01 +08:00
Aqr-K
0947deb372 fix plugin.py 2025-09-03 14:27:24 +08:00
jxxghp
69c92911a2 Update category.yaml 2025-09-03 14:26:40 +08:00
jxxghp
b16bb37b75 Merge pull request #4892 from Aqr-K/feat-plugin-wheels 2025-09-03 14:21:08 +08:00
Aqr-K
9c9ec8adf2 feat(plugin): Implement robust dependency installation with embedded wheels
- Support installing dependencies from wheels embedded in the plugin
2025-09-03 14:13:32 +08:00
jxxghp
eb0e67fc42 fix logging 2025-09-03 12:42:13 +08:00
jxxghp
9cc50bddab Merge pull request #4764 from 2Dou/v2 2025-09-03 12:01:37 +08:00
jxxghp
d3ba0fa487 Update category.py
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-03 11:58:07 +08:00
jxxghp
39f6505a80 fix: tune parameters and use orjson 2025-09-03 09:51:24 +08:00
jxxghp
36a6802439 fix:#4876 2025-09-02 12:45:44 +08:00
jxxghp
d7e2633a92 fix: remove the update blocker 2025-09-02 12:16:45 +08:00
jxxghp
88049e741e add SUBSCRIBE_SEARCH_INTERVAL 2025-09-02 11:41:52 +08:00
jxxghp
ff7fb14087 fix cache_clear 2025-09-02 08:35:48 +08:00
jxxghp
816c64bd48 Merge pull request #4883 from cikezhu/v2 2025-09-01 18:32:21 +08:00
cikezhu
d2756e6f2d schedule() # this returns a coroutine object, but we never awaited it 2025-09-01 17:39:46 +08:00
jxxghp
147e12acbb Merge pull request #4879 from sebastian0619/v2 2025-08-31 19:04:38 +08:00
jxxghp
4098018ee9 Update entrypoint.sh
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-08-31 19:04:24 +08:00
Sebastian
133e7578b9 Update NGINX SSL port configuration 2025-08-31 17:17:26 +08:00
jxxghp
74a2bdbf09 Merge pull request #4872 from Aqr-K/feat/v2.7.8/string/natural_sort 2025-08-30 09:45:23 +08:00
Aqr-K
f22bc68af4 Update string.py 2025-08-30 08:59:35 +08:00
Aqr-K
26cc6da650 fix(storage): Adjust to use natural_stort_key 2025-08-30 08:48:38 +08:00
Aqr-K
d21f1f1b87 feat(string): add natural_sort_key function 2025-08-30 08:44:41 +08:00
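A natural sort key splits digit runs out of a string so that, e.g., "ep2" sorts before "ep10" instead of after it. A common sketch of such a function (the project's actual StringUtils.natural_sort_key may differ in detail):

```python
import re

def natural_sort_key(s: str):
    # split into alternating non-digit / digit runs; compare digit runs
    # numerically and everything else case-insensitively
    return [int(part) if part.isdigit() else part.lower()
            for part in re.split(r"(\d+)", s)]
```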
jxxghp
7cdaafffe1 Merge pull request #4867 from aotuwuxi/hotfix/250829 2025-08-29 13:46:48 +08:00
jxxghp
0265dca197 Merge pull request #4866 from lostwindsenril/patch-1 2025-08-29 13:45:49 +08:00
wuxi
9d68366043 fix: workflow plugin calls could not access object attributes 2025-08-29 13:19:50 +08:00
lostwindsenril
c8c671d915 Update app/modules/indexer/spider/mtorrent.py
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-08-29 13:07:31 +08:00
lostwindsenril
142daa9d15 Support remaining-promotion-period detection for M-Team (馒头)
Add freedate to torrent if discountEndTime exists
2025-08-29 13:04:17 +08:00
jxxghp
2552219991 Update version.py 2025-08-28 11:11:32 +08:00
jxxghp
a038b698d7 fix haidan 2025-08-28 09:36:19 +08:00
jxxghp
a3b222574e add thetvdb cache 2025-08-28 08:05:10 +08:00
jxxghp
e0cd467293 rollback fix #4856 2025-08-28 07:51:05 +08:00
jxxghp
9c056030d2 fix: catch 115 & alipan request exceptions 2025-08-27 20:21:06 +08:00
jxxghp
19efa9d4cc fix #4795 2025-08-27 16:15:45 +08:00
jxxghp
90633a6495 fix #4851 2025-08-27 15:57:43 +08:00
jxxghp
edc432fbd8 fix #4846 2025-08-27 12:45:23 +08:00
jxxghp
1b7bdbf516 fix #4834 2025-08-27 08:28:16 +08:00
jxxghp
8c1be70c85 Update version.py 2025-08-26 12:20:16 +08:00
jxxghp
b8e0c0db9e feat: fine-grained event errors 2025-08-26 08:41:47 +08:00
jxxghp
7b7fb6cc82 Merge pull request #4836 from jxxghp/cursor/alter-siteuser-data-userid-to-character-type-9f4d 2025-08-25 22:05:19 +08:00
Cursor Agent
62512ba215 Remove SQLite-specific migration code for userid field
Co-authored-by: jxxghp <jxxghp@live.cn>
2025-08-25 14:00:33 +00:00
Cursor Agent
e1beb64c01 Simplify userid conversion to integer in Synology Chat module
Co-authored-by: jxxghp <jxxghp@live.cn>
2025-08-25 13:58:15 +00:00
Cursor Agent
c81f26ddad Remove downgrade methods for PostgreSQL and SQLite userid migration
Co-authored-by: jxxghp <jxxghp@live.cn>
2025-08-25 13:56:21 +00:00
Cursor Agent
340114c2a1 Remove migration README after completing SiteUserData userid type migration
Co-authored-by: jxxghp <jxxghp@live.cn>
2025-08-25 13:54:58 +00:00
Cursor Agent
cd7767b331 Checkpoint before follow-up message
Co-authored-by: jxxghp <jxxghp@live.cn>
2025-08-25 13:54:48 +00:00
Cursor Agent
25289dad8a Migrate SiteUserData userid field from Integer to String type
Co-authored-by: jxxghp <jxxghp@live.cn>
2025-08-25 13:50:58 +00:00
jxxghp
47c6917129 remove _check_restart_policy 2025-08-25 21:30:53 +08:00
jxxghp
6379cda148 fix async scheduled services 2025-08-25 21:19:07 +08:00
jxxghp
91a124ab8f fix async scheduled services 2025-08-25 20:44:38 +08:00
jxxghp
2357a7135e fix run_async 2025-08-25 17:46:06 +08:00
jxxghp
da0b3b3de9 fix: calendar cache 2025-08-25 16:46:10 +08:00
jxxghp
6664fb1716 feat: add automatic caching for plugins and the calendar 2025-08-25 16:37:02 +08:00
jxxghp
1206f24fa9 Fix a concurrency issue when iterating over the cache 2025-08-25 13:11:44 +08:00
jxxghp
ffb5823e84 fix #4829: optimize module import logic and add special handling for Async classes 2025-08-25 08:14:43 +08:00
2Dou
3723cf8ac2 Add exclusion support to the second-level category configuration 2025-08-15 09:54:56 +08:00
69 changed files with 1581 additions and 1865 deletions

View File

@@ -56,7 +56,7 @@ class InvokePluginAction(BaseAction):
logger.error(f"插件不存在: {params.plugin_id}")
return context
actions = plugin_actions[0].get("actions", [])
- action = next((action for action in actions if action.action_id == params.action_id), None)
+ action = next((action for action in actions if action.get("action_id") == params.action_id), None)
if not action or not action.get("func"):
logger.error(f"插件动作不存在: {params.plugin_id} - {params.action_id}")
return context
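The hunk above switches from attribute access to dict access: plugin actions are plain dicts, so `action.action_id` raises AttributeError, while `.get("action_id")` works on dicts and tolerates missing keys. A minimal, self-contained illustration of the pattern (the sample data is invented):

```python
# plugin actions are plain dicts, not objects with attributes
actions = [{"action_id": "scan", "func": lambda: None},
           {"action_id": "send", "func": lambda: None}]

def find_action(actions, action_id):
    # a.get() avoids AttributeError and also degrades gracefully
    # when an entry lacks the "action_id" key
    return next((a for a in actions if a.get("action_id") == action_id), None)
```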

View File

@@ -67,14 +67,8 @@ class ScanFileAction(BaseAction):
break
if not file.extension or f".{file.extension.lower()}" not in settings.RMT_MEDIAEXT:
continue
# 检查缓存
cache_key = f"{file.path}"
if self.check_cache(workflow_id, cache_key):
logger.info(f"{file.path} 已处理过,跳过")
continue
self._fileitems.append(fileitem)
# 保存缓存
self.save_cache(workflow_id, cache_key)
# 添加文件到队列,而不是目录
self._fileitems.append(file)
if self._fileitems:
context.fileitems.extend(self._fileitems)

View File

@@ -2,7 +2,7 @@ from fastapi import APIRouter
from app.api.endpoints import login, user, webhook, message, site, subscribe, \
media, douban, search, plugin, tmdb, history, system, download, dashboard, \
- transfer, mediaserver, bangumi, storage, discover, recommend, workflow, torrent, monitoring
+ transfer, mediaserver, bangumi, storage, discover, recommend, workflow, torrent
api_router = APIRouter()
api_router.include_router(login.router, prefix="/login", tags=["login"])
@@ -28,4 +28,3 @@ api_router.include_router(discover.router, prefix="/discover", tags=["discover"]
api_router.include_router(recommend.router, prefix="/recommend", tags=["recommend"])
api_router.include_router(workflow.router, prefix="/workflow", tags=["workflow"])
api_router.include_router(torrent.router, prefix="/torrent", tags=["torrent"])
- api_router.include_router(monitoring.router, prefix="/monitoring", tags=["monitoring"])

View File

@@ -123,7 +123,7 @@ async def schedule2(_: Annotated[str, Depends(verify_apitoken)]) -> Any:
"""
查询下载器信息 API_TOKEN认证?token=xxx
"""
- return schedule()
+ return await schedule()
@router.get("/transfer", summary="文件整理统计", response_model=List[int])
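The fix above awaits the coroutine instead of returning it: calling an async function without `await` only creates a coroutine object, which the endpoint would then hand back unexecuted. A standalone illustration (`schedule` here is a stand-in, not the real scheduler query):

```python
import asyncio

async def schedule():
    # stand-in for the real scheduler query
    return ["job1", "job2"]

async def schedule2_broken():
    return schedule()        # returns a coroutine object; it never runs

async def schedule2_fixed():
    return await schedule()  # actually runs the coroutine and returns its result
```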

View File

@@ -8,8 +8,10 @@ from app import schemas
from app.chain.user import UserChain
from app.core import security
from app.core.config import settings
from app.db.systemconfig_oper import SystemConfigOper
from app.helper.sites import SitesHelper # noqa
from app.helper.wallpaper import WallpaperHelper
from app.schemas.types import SystemConfigKey
router = APIRouter()
@@ -29,7 +31,10 @@ def login_access_token(
if not success:
raise HTTPException(status_code=401, detail=user_or_message)
# 用户等级
level = SitesHelper().auth_level
# 是否显示配置向导
show_wizard = not SystemConfigOper().get(SystemConfigKey.SetupWizardState) and not settings.ADVANCED_MODE
return schemas.Token(
access_token=security.create_access_token(
userid=user_or_message.id,
@@ -45,6 +50,7 @@ def login_access_token(
avatar=user_or_message.avatar,
level=level,
permissions=user_or_message.permissions or {},
widzard=show_wizard
)

View File

@@ -1,409 +0,0 @@
from typing import Any, List
from fastapi import APIRouter, Depends, Query
from fastapi.responses import HTMLResponse
from app import schemas
from app.core.security import verify_apitoken
from app.monitoring import monitor, get_metrics_response
from app.schemas.monitoring import (
PerformanceSnapshot,
EndpointStats,
ErrorRequest,
MonitoringOverview
)
router = APIRouter()
@router.get("/overview", summary="获取监控概览", response_model=schemas.MonitoringOverview)
def get_overview(_: str = Depends(verify_apitoken)) -> Any:
"""
获取完整的监控概览信息
"""
# 获取性能快照
performance = monitor.get_performance_snapshot()
# 获取最活跃端点
top_endpoints = monitor.get_top_endpoints(limit=10)
# 获取最近错误
recent_errors = monitor.get_recent_errors(limit=20)
# 检查告警
alerts = monitor.check_alerts()
return MonitoringOverview(
performance=PerformanceSnapshot(
timestamp=performance.timestamp,
cpu_usage=performance.cpu_usage,
memory_usage=performance.memory_usage,
active_requests=performance.active_requests,
request_rate=performance.request_rate,
avg_response_time=performance.avg_response_time,
error_rate=performance.error_rate,
slow_requests=performance.slow_requests
),
top_endpoints=[EndpointStats(**endpoint) for endpoint in top_endpoints],
recent_errors=[ErrorRequest(**error) for error in recent_errors],
alerts=alerts
)
@router.get("/performance", summary="获取性能快照", response_model=schemas.PerformanceSnapshot)
def get_performance(_: str = Depends(verify_apitoken)) -> Any:
"""
获取当前性能快照
"""
snapshot = monitor.get_performance_snapshot()
return PerformanceSnapshot(
timestamp=snapshot.timestamp,
cpu_usage=snapshot.cpu_usage,
memory_usage=snapshot.memory_usage,
active_requests=snapshot.active_requests,
request_rate=snapshot.request_rate,
avg_response_time=snapshot.avg_response_time,
error_rate=snapshot.error_rate,
slow_requests=snapshot.slow_requests
)
@router.get("/endpoints", summary="获取端点统计", response_model=List[schemas.EndpointStats])
def get_endpoints(
limit: int = Query(10, ge=1, le=50, description="返回的端点数量"),
_: str = Depends(verify_apitoken)
) -> Any:
"""
获取最活跃的API端点统计
"""
endpoints = monitor.get_top_endpoints(limit=limit)
return [EndpointStats(**endpoint) for endpoint in endpoints]
@router.get("/errors", summary="获取错误请求", response_model=List[schemas.ErrorRequest])
def get_errors(
limit: int = Query(20, ge=1, le=100, description="返回的错误数量"),
_: str = Depends(verify_apitoken)
) -> Any:
"""
获取最近的错误请求记录
"""
errors = monitor.get_recent_errors(limit=limit)
return [ErrorRequest(**error) for error in errors]
@router.get("/alerts", summary="获取告警信息", response_model=List[str])
def get_alerts(_: str = Depends(verify_apitoken)) -> Any:
"""
获取当前告警信息
"""
return monitor.check_alerts()
@router.get("/metrics", summary="Prometheus指标")
def get_prometheus_metrics(_: str = Depends(verify_apitoken)) -> Any:
"""
获取Prometheus格式的监控指标
"""
return get_metrics_response()
@router.get("/dashboard", summary="监控仪表板", response_class=HTMLResponse)
def get_dashboard(_: str = Depends(verify_apitoken)) -> Any:
"""
获取实时监控仪表板HTML页面
"""
return HTMLResponse(content="""
<!DOCTYPE html>
<html lang="zh-CN">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>MoviePilot 性能监控仪表板</title>
<script src="https://cdn.jsdelivr.net/npm/chart.js"></script>
<style>
body {
font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif;
margin: 0;
padding: 20px;
background-color: #f5f5f5;
}
.container {
max-width: 1200px;
margin: 0 auto;
}
.header {
text-align: center;
margin-bottom: 30px;
color: #333;
}
.metrics-grid {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(250px, 1fr));
gap: 20px;
margin-bottom: 30px;
}
.metric-card {
background: white;
padding: 20px;
border-radius: 10px;
box-shadow: 0 2px 10px rgba(0,0,0,0.1);
text-align: center;
}
.metric-value {
font-size: 2em;
font-weight: bold;
color: #2196F3;
}
.metric-label {
color: #666;
margin-top: 5px;
}
.chart-container {
background: white;
padding: 20px;
border-radius: 10px;
box-shadow: 0 2px 10px rgba(0,0,0,0.1);
margin-bottom: 20px;
}
.alerts {
background: #fff3cd;
border: 1px solid #ffeaa7;
border-radius: 5px;
padding: 15px;
margin-bottom: 20px;
}
.alert-item {
color: #856404;
margin: 5px 0;
}
.refresh-btn {
background: #2196F3;
color: white;
border: none;
padding: 10px 20px;
border-radius: 5px;
cursor: pointer;
margin-bottom: 20px;
}
.refresh-btn:hover {
background: #1976D2;
}
</style>
</head>
<body>
<div class="container">
<div class="header">
<h1>🎬 MoviePilot 性能监控仪表板</h1>
<button class="refresh-btn" onclick="refreshData()">刷新数据</button>
</div>
<div id="alerts" class="alerts" style="display: none;">
<h3>⚠️ 告警信息</h3>
<div id="alerts-list"></div>
</div>
<div class="metrics-grid">
<div class="metric-card">
<div class="metric-value" id="cpu-usage">--</div>
<div class="metric-label">CPU使用率 (%)</div>
</div>
<div class="metric-card">
<div class="metric-value" id="memory-usage">--</div>
<div class="metric-label">内存使用率 (%)</div>
</div>
<div class="metric-card">
<div class="metric-value" id="active-requests">--</div>
<div class="metric-label">活跃请求数</div>
</div>
<div class="metric-card">
<div class="metric-value" id="request-rate">--</div>
<div class="metric-label">请求率 (req/min)</div>
</div>
<div class="metric-card">
<div class="metric-value" id="avg-response-time">--</div>
<div class="metric-label">平均响应时间 (s)</div>
</div>
<div class="metric-card">
<div class="metric-value" id="error-rate">--</div>
<div class="metric-label">错误率 (%)</div>
</div>
</div>
<div class="chart-container">
<h3>📊 性能趋势</h3>
<canvas id="performanceChart" width="400" height="200"></canvas>
</div>
<div class="chart-container">
<h3>🔥 最活跃端点</h3>
<canvas id="endpointsChart" width="400" height="200"></canvas>
</div>
</div>
<script>
let performanceChart, endpointsChart;
let performanceData = {
labels: [],
cpu: [],
memory: [],
requests: []
};
// 初始化图表
function initCharts() {
const ctx1 = document.getElementById('performanceChart').getContext('2d');
performanceChart = new Chart(ctx1, {
type: 'line',
data: {
labels: performanceData.labels,
datasets: [{
label: 'CPU使用率 (%)',
data: performanceData.cpu,
borderColor: '#2196F3',
backgroundColor: 'rgba(33, 150, 243, 0.1)',
tension: 0.4
}, {
label: '内存使用率 (%)',
data: performanceData.memory,
borderColor: '#4CAF50',
backgroundColor: 'rgba(76, 175, 80, 0.1)',
tension: 0.4
}, {
label: '活跃请求数',
data: performanceData.requests,
borderColor: '#FF9800',
backgroundColor: 'rgba(255, 152, 0, 0.1)',
tension: 0.4
}]
},
options: {
responsive: true,
scales: {
y: {
beginAtZero: true
}
}
}
});
const ctx2 = document.getElementById('endpointsChart').getContext('2d');
endpointsChart = new Chart(ctx2, {
type: 'bar',
data: {
labels: [],
datasets: [{
label: '请求数',
data: [],
backgroundColor: 'rgba(33, 150, 243, 0.8)'
}]
},
options: {
responsive: true,
scales: {
y: {
beginAtZero: true
}
}
}
});
}
// 更新性能数据
function updatePerformanceData(data) {
const now = new Date().toLocaleTimeString();
performanceData.labels.push(now);
performanceData.cpu.push(data.performance.cpu_usage);
performanceData.memory.push(data.performance.memory_usage);
performanceData.requests.push(data.performance.active_requests);
// 保持最近20个数据点
if (performanceData.labels.length > 20) {
performanceData.labels.shift();
performanceData.cpu.shift();
performanceData.memory.shift();
performanceData.requests.shift();
}
// 更新图表
performanceChart.data.labels = performanceData.labels;
performanceChart.data.datasets[0].data = performanceData.cpu;
performanceChart.data.datasets[1].data = performanceData.memory;
performanceChart.data.datasets[2].data = performanceData.requests;
performanceChart.update();
// 更新端点图表
const endpointLabels = data.top_endpoints.map(e => e.endpoint.substring(0, 20));
const endpointData = data.top_endpoints.map(e => e.count);
endpointsChart.data.labels = endpointLabels;
endpointsChart.data.datasets[0].data = endpointData;
endpointsChart.update();
}
// 更新指标显示
function updateMetrics(data) {
document.getElementById('cpu-usage').textContent = data.performance.cpu_usage.toFixed(1);
document.getElementById('memory-usage').textContent = data.performance.memory_usage.toFixed(1);
document.getElementById('active-requests').textContent = data.performance.active_requests;
document.getElementById('request-rate').textContent = data.performance.request_rate.toFixed(0);
document.getElementById('avg-response-time').textContent = data.performance.avg_response_time.toFixed(3);
document.getElementById('error-rate').textContent = (data.performance.error_rate * 100).toFixed(2);
}
// 更新告警
function updateAlerts(alerts) {
const alertsDiv = document.getElementById('alerts');
const alertsList = document.getElementById('alerts-list');
if (alerts.length > 0) {
alertsDiv.style.display = 'block';
alertsList.innerHTML = alerts.map(alert =>
`<div class="alert-item">⚠️ ${alert}</div>`
).join('');
} else {
alertsDiv.style.display = 'none';
}
}
// 获取URL中的token参数
function getTokenFromUrl() {
const urlParams = new URLSearchParams(window.location.search);
return urlParams.get('token');
}
// 刷新数据
async function refreshData() {
try {
const token = getTokenFromUrl();
if (!token) {
console.error('未找到token参数');
return;
}
const response = await fetch(`/api/v1/monitoring/overview?token=${token}`);
if (response.ok) {
const data = await response.json();
updateMetrics(data);
updatePerformanceData(data);
updateAlerts(data.alerts);
}
} catch (error) {
console.error('获取监控数据失败:', error);
}
}
// 页面加载完成后初始化
document.addEventListener('DOMContentLoaded', function() {
initCharts();
refreshData();
// 每5秒自动刷新
setInterval(refreshData, 5000);
});
</script>
</body>
</html>
""")

View File

@@ -13,7 +13,7 @@ from app import schemas
from app.command import Command
from app.core.config import settings
from app.core.plugin import PluginManager
- from app.core.security import verify_apikey, verify_token, verify_apitoken
+ from app.core.security import verify_apikey, verify_token
from app.db.models import User
from app.db.systemconfig_oper import SystemConfigOper
from app.db.user_oper import get_current_active_superuser, get_current_active_superuser_async
@@ -21,7 +21,6 @@ from app.factory import app
from app.helper.plugin import PluginHelper
from app.log import logger
from app.scheduler import Scheduler
- from app.schemas.plugin import PluginMemoryInfo
from app.schemas.types import SystemConfigKey
PROTECTED_ROUTES = {"/api/v1/openapi.json", "/docs", "/docs/oauth2-redirect", "/redoc"}
@@ -494,57 +493,6 @@ def clone_plugin(plugin_id: str,
return schemas.Response(success=False, message=f"创建插件分身失败:{str(e)}")
@router.get("/memory", summary="插件内存使用统计", response_model=List[PluginMemoryInfo])
def plugin_memory_stats(_: Annotated[str, Depends(verify_apitoken)]) -> Any:
"""
获取所有插件的内存使用统计信息
"""
try:
plugin_manager = PluginManager()
memory_stats = plugin_manager.get_plugin_memory_stats()
return memory_stats
except Exception as e:
logger.error(f"获取插件内存统计失败:{str(e)}")
raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=f"获取插件内存统计失败:{str(e)}")
@router.get("/memory/{plugin_id}", summary="单个插件内存使用统计", response_model=PluginMemoryInfo)
def plugin_memory_stat(plugin_id: str, _: Annotated[str, Depends(verify_apitoken)]) -> Any:
"""
获取指定插件的内存使用统计信息
"""
try:
plugin_manager = PluginManager()
memory_stats = plugin_manager.get_plugin_memory_stats(plugin_id)
if not memory_stats:
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND,
detail=f"插件 {plugin_id} 不存在或未运行")
return memory_stats[0]
except HTTPException:
raise
except Exception as e:
logger.error(f"获取插件 {plugin_id} 内存统计失败:{str(e)}")
raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=f"获取插件内存统计失败:{str(e)}")
@router.delete("/memory/cache", summary="清除插件内存统计缓存")
def clear_plugin_memory_cache(_: Annotated[str, Depends(verify_apitoken)],
plugin_id: Optional[str] = None) -> Any:
"""
清除插件内存统计缓存
"""
try:
plugin_manager = PluginManager()
plugin_manager.clear_plugin_memory_cache(plugin_id)
message = f"已清除插件 {plugin_id} 的内存统计缓存" if plugin_id else "已清除所有插件的内存统计缓存"
return schemas.Response(success=True, message=message)
except Exception as e:
logger.error(f"清除插件内存统计缓存失败:{str(e)}")
return schemas.Response(success=False, message=f"清除缓存失败:{str(e)}")
@router.get("/{plugin_id}", summary="获取插件配置")
async def plugin_config(plugin_id: str,
_: User = Depends(get_current_active_superuser_async)) -> dict:

View File

@@ -20,7 +20,7 @@ async def search_latest(_: schemas.TokenPayload = Depends(verify_token)) -> Any:
"""
查询搜索结果
"""
- torrents = await SearchChain().async_last_search_results()
+ torrents = await SearchChain().async_last_search_results() or []
return [torrent.to_dict() for torrent in torrents]
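The `or []` in the hunk above normalizes a None result (no previous search) to an empty list before iterating, since iterating None raises TypeError. The pattern in isolation:

```python
def safe_results(results):
    # normalize None to an empty list before iterating;
    # mirrors `async_last_search_results() or []` in the hunk above
    return [r for r in (results or [])]
```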

View File

@@ -15,6 +15,7 @@ from app.db.models import User
from app.db.user_oper import get_current_active_superuser, get_current_active_superuser_async
from app.helper.progress import ProgressHelper
from app.schemas.types import ProgressKey
from app.utils.string import StringUtils
router = APIRouter()
@@ -80,7 +81,7 @@ def list_files(fileitem: schemas.FileItem,
file_list = StorageChain().list_files(fileitem)
if file_list:
if sort == "name":
- file_list.sort(key=lambda x: x.name or "")
+ file_list.sort(key=lambda x: StringUtils.natural_sort_key(x.name or ""))
else:
file_list.sort(key=lambda x: x.modify_time or datetime.min, reverse=True)
return file_list

View File

@@ -421,11 +421,23 @@ async def popular_subscribes(
page: Optional[int] = 1,
count: Optional[int] = 30,
min_sub: Optional[int] = None,
genre_id: Optional[int] = None,
min_rating: Optional[float] = None,
max_rating: Optional[float] = None,
sort_type: Optional[str] = None,
_: schemas.TokenPayload = Depends(verify_token)) -> Any:
"""
查询热门订阅
"""
subscribes = await SubscribeHelper().async_get_statistic(stype=stype, page=page, count=count)
subscribes = await SubscribeHelper().async_get_statistic(
stype=stype,
page=page,
count=count,
genre_id=genre_id,
min_rating=min_rating,
max_rating=max_rating,
sort_type=sort_type
)
if subscribes:
ret_medias = []
for sub in subscribes:
@@ -570,11 +582,23 @@ async def popular_subscribes(
name: Optional[str] = None,
page: Optional[int] = 1,
count: Optional[int] = 30,
genre_id: Optional[int] = None,
min_rating: Optional[float] = None,
max_rating: Optional[float] = None,
sort_type: Optional[str] = None,
_: schemas.TokenPayload = Depends(verify_token)) -> Any:
"""
查询分享的订阅
"""
return await SubscribeHelper().async_get_shares(name=name, page=page, count=count)
return await SubscribeHelper().async_get_shares(
name=name,
page=page,
count=count,
genre_id=genre_id,
min_rating=min_rating,
max_rating=max_rating,
sort_type=sort_type
)
@router.get("/share/statistics", summary="查询订阅分享统计", response_model=List[schemas.SubscribeShareStatistics])

View File

@@ -11,6 +11,7 @@ import aiofiles
import pillow_avif # noqa 用于自动注册AVIF支持
from PIL import Image
from anyio import Path as AsyncPath
from app.helper.sites import SitesHelper # noqa # noqa
from fastapi import APIRouter, Body, Depends, HTTPException, Header, Request, Response
from fastapi.responses import StreamingResponse
@@ -31,7 +32,6 @@ from app.helper.mediaserver import MediaServerHelper
from app.helper.message import MessageHelper
from app.helper.progress import ProgressHelper
from app.helper.rule import RuleHelper
from app.helper.sites import SitesHelper # noqa # noqa
from app.helper.subscribe import SubscribeHelper
from app.helper.system import SystemHelper
from app.log import logger
@@ -52,19 +52,20 @@ async def fetch_image(
proxy: bool = False,
use_cache: bool = False,
if_none_match: Optional[str] = None,
- allowed_domains: Optional[set[str]] = None) -> Response:
+ allowed_domains: Optional[set[str]] = None) -> Optional[Response]:
"""
处理图片缓存逻辑支持HTTP缓存和磁盘缓存
"""
if not url:
- raise HTTPException(status_code=404, detail="URL not provided")
+ return None
if allowed_domains is None:
allowed_domains = set(settings.SECURITY_IMAGE_DOMAINS)
# 验证URL安全性
if not SecurityUtils.is_safe_url(url, allowed_domains):
- raise HTTPException(status_code=404, detail="Unsafe URL")
+ logger.warn(f"Blocked unsafe image URL: {url}")
+ return None
# 缓存路径
sanitized_path = SecurityUtils.sanitize_url_path(url)
@@ -98,15 +99,16 @@ async def fetch_image(
response = await AsyncRequestUtils(ua=settings.NORMAL_USER_AGENT, proxies=proxies, referer=referer,
accept_type="image/avif,image/webp,image/apng,*/*").get_res(url=url)
if not response:
- raise HTTPException(status_code=502, detail="Failed to fetch the image from the remote server")
+ logger.warn(f"Failed to fetch image from URL: {url}")
+ return None
# 验证下载的内容是否为有效图片
try:
content = response.content
Image.open(io.BytesIO(content)).verify()
except Exception as e:
- logger.debug(f"Invalid image format for URL {url}: {e}")
- raise HTTPException(status_code=502, detail="Invalid image format")
+ logger.warn(f"Invalid image format for URL {url}: {e}")
+ return None
# 获取请求响应头
response_headers = response.headers

View File

@@ -105,7 +105,7 @@ class ChainBase(metaclass=ABCMeta):
"""
异步删除缓存同时删除Redis和本地缓存
"""
- pass
+ await self.async_filecache.delete(filename)
@staticmethod
def __is_valid_empty(ret):

View File

@@ -1,4 +1,6 @@
import os
from pathlib import Path
from tempfile import NamedTemporaryFile
from threading import Lock
from typing import Optional, List, Tuple, Union
@@ -20,6 +22,9 @@ from app.utils.string import StringUtils
recognize_lock = Lock()
scraping_lock = Lock()
current_umask = os.umask(0)
os.umask(current_umask)
class MediaChain(ChainBase):
"""
@@ -310,6 +315,21 @@ class MediaChain(ChainBase):
)
return None
@staticmethod
def is_bluray_folder(fileitem: schemas.FileItem) -> bool:
"""
判断是否为原盘目录
"""
if not fileitem or fileitem.type != "dir":
return False
# 蓝光原盘目录必备的文件或文件夹
required_files = ['BDMV', 'CERTIFICATE']
# 检查目录下是否存在所需文件或文件夹
for item in StorageChain().list_files(fileitem):
if item.name in required_files:
return True
return False
@eventmanager.register(EventType.MetadataScrape)
def scrape_metadata_event(self, event: Event):
"""
@@ -349,51 +369,60 @@ class MediaChain(ChainBase):
overwrite=overwrite)
else:
if file_list:
# 1. 收集fileitem和file_list中每个文件之间所有子目录
all_dirs = set()
root_path = Path(fileitem.path)
# 如果是BDMV原盘目录,只对根目录进行刮削,不处理子目录
if self.is_bluray_folder(fileitem):
logger.info(f"检测到BDMV原盘目录,只对根目录进行刮削:{fileitem.path}")
self.scrape_metadata(fileitem=fileitem,
mediainfo=mediainfo,
init_folder=True,
recursive=False,
overwrite=overwrite)
else:
# 1. 收集fileitem和file_list中每个文件之间所有子目录
all_dirs = set()
root_path = Path(fileitem.path)
logger.debug(f"开始收集目录,根目录:{root_path}")
# 收集根目录
all_dirs.add(root_path)
logger.debug(f"开始收集目录,根目录:{root_path}")
# 收集根目录
all_dirs.add(root_path)
# 收集所有目录(包括所有层级)
for sub_file in file_list:
sub_path = Path(sub_file)
# 收集从根目录到文件的所有父目录
current_path = sub_path.parent
while current_path != root_path and current_path.is_relative_to(root_path):
all_dirs.add(current_path)
current_path = current_path.parent
# 收集所有目录(包括所有层级)
for sub_file in file_list:
sub_path = Path(sub_file)
# 收集从根目录到文件的所有父目录
current_path = sub_path.parent
while current_path != root_path and current_path.is_relative_to(root_path):
all_dirs.add(current_path)
current_path = current_path.parent
logger.debug(f"共收集到 {len(all_dirs)} 个目录")
logger.debug(f"共收集到 {len(all_dirs)} 个目录")
# 2. 初始化一遍子目录,但不处理文件
for sub_dir in all_dirs:
sub_dir_item = storagechain.get_file_item(storage=fileitem.storage, path=sub_dir)
if sub_dir_item:
logger.info(f"为目录生成海报和nfo:{sub_dir}")
# 初始化目录元数据,但不处理文件
self.scrape_metadata(fileitem=sub_dir_item,
mediainfo=mediainfo,
init_folder=True,
recursive=False,
overwrite=overwrite)
else:
logger.warn(f"无法获取目录项:{sub_dir}")
# 2. 初始化一遍子目录,但不处理文件
for sub_dir in all_dirs:
sub_dir_item = storagechain.get_file_item(storage=fileitem.storage, path=sub_dir)
if sub_dir_item:
logger.info(f"为目录生成海报和nfo:{sub_dir}")
# 初始化目录元数据,但不处理文件
self.scrape_metadata(fileitem=sub_dir_item,
mediainfo=mediainfo,
init_folder=True,
recursive=False,
overwrite=overwrite)
else:
logger.warn(f"无法获取目录项:{sub_dir}")
# 3. 刮削每个文件
logger.info(f"开始刮削 {len(file_list)} 个文件")
for sub_file_path in file_list:
sub_file_item = storagechain.get_file_item(storage=fileitem.storage,
path=Path(sub_file_path))
if sub_file_item:
self.scrape_metadata(fileitem=sub_file_item,
mediainfo=mediainfo,
init_folder=False,
overwrite=overwrite)
else:
logger.warn(f"无法获取文件项:{sub_file_path}")
# 3. 刮削每个文件
logger.info(f"开始刮削 {len(file_list)} 个文件")
for sub_file_path in file_list:
sub_file_item = storagechain.get_file_item(storage=fileitem.storage,
path=Path(sub_file_path))
if sub_file_item:
self.scrape_metadata(fileitem=sub_file_item,
mediainfo=mediainfo,
init_folder=False,
overwrite=overwrite)
else:
logger.warn(f"无法获取文件项:{sub_file_path}")
else:
# 执行全量刮削
logger.info(f"开始刮削目录 {fileitem.path} ...")
@@ -417,20 +446,6 @@ class MediaChain(ChainBase):
storagechain = StorageChain()
def is_bluray_folder(_fileitem: schemas.FileItem) -> bool:
"""
判断是否为原盘目录
"""
if not _fileitem or _fileitem.type != "dir":
return False
# 蓝光原盘目录必备的文件或文件夹
required_files = ['BDMV', 'CERTIFICATE']
# 检查目录下是否存在所需文件或文件夹
for item in storagechain.list_files(_fileitem):
if item.name in required_files:
return True
return False
def __list_files(_fileitem: schemas.FileItem):
"""
列出下级文件
@@ -446,36 +461,65 @@ class MediaChain(ChainBase):
"""
if not _fileitem or not _content or not _path:
return
# 保存文件到临时目录
tmp_dir = settings.TEMP_PATH / StringUtils.generate_random_str(10)
tmp_dir.mkdir(parents=True, exist_ok=True)
tmp_file = tmp_dir / _path.name
tmp_file.write_bytes(_content)
# 获取文件的父目录
try:
item = storagechain.upload_file(fileitem=_fileitem, path=tmp_file, new_name=_path.name)
# 使用tempfile创建临时文件自动删除
with NamedTemporaryFile(delete=True, delete_on_close=False, suffix=_path.suffix) as tmp_file:
tmp_file_path = Path(tmp_file.name)
# 写入内容
if isinstance(_content, bytes):
tmp_file.write(_content)
else:
tmp_file.write(_content.encode('utf-8'))
tmp_file.flush()
tmp_file.close() # 关闭文件句柄
# 刮削文件只需要读写权限
tmp_file_path.chmod(0o666 & ~current_umask)
# 上传文件
item = storagechain.upload_file(fileitem=_fileitem, path=tmp_file_path, new_name=_path.name)
if item:
logger.info(f"已保存文件:{item.path}")
else:
logger.warn(f"文件保存失败:{_path}")
finally:
if tmp_file.exists():
tmp_file.unlink()
def __download_image(_url: str) -> Optional[bytes]:
def __download_and_save_image(_fileitem: schemas.FileItem, _path: Path, _url: str):
"""
下载图片并保存
流式下载图片并直接保存到文件(减少内存占用)
:param _fileitem: 关联的媒体文件项
:param _path: 图片文件路径
:param _url: 图片下载URL
"""
if not _fileitem or not _url or not _path:
return
try:
logger.info(f"正在下载图片:{_url} ...")
r = RequestUtils(proxies=settings.PROXY, ua=settings.NORMAL_USER_AGENT).get_res(url=_url)
if r:
return r.content
else:
logger.info(f"{_url} 图片下载失败,请检查网络连通性!")
request_utils = RequestUtils(proxies=settings.PROXY, ua=settings.NORMAL_USER_AGENT)
with request_utils.get_stream(url=_url) as r:
if r and r.status_code == 200:
# 使用tempfile创建临时文件自动删除
with NamedTemporaryFile(delete=True, delete_on_close=False, suffix=_path.suffix) as tmp_file:
tmp_file_path = Path(tmp_file.name)
# 流式写入文件
for chunk in r.iter_content(chunk_size=8192):
if chunk:
tmp_file.write(chunk)
tmp_file.flush()
tmp_file.close() # 关闭文件句柄
# 刮削的图片只需要读写权限
tmp_file_path.chmod(0o666 & ~current_umask)
# 上传文件
item = storagechain.upload_file(fileitem=_fileitem, path=tmp_file_path,
new_name=_path.name)
if item:
logger.info(f"已保存图片:{item.path}")
else:
logger.warn(f"图片保存失败:{_path}")
else:
logger.info(f"{_url} 图片下载失败")
except Exception as err:
logger.error(f"{_url} 图片下载失败:{str(err)}")
return None
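The temp-file pattern shared by `__save_file` and `__download_and_save_image` can be sketched in isolation. The diff relies on `NamedTemporaryFile(delete=True, delete_on_close=False)`, which is Python 3.12+; the portable variant below keeps the handle open during upload instead. The `save_stream_via_temp`/`upload` names are illustrative, not part of the project:

```python
import os
from pathlib import Path
from tempfile import NamedTemporaryFile
from typing import Callable, Iterable

# Capture the process umask once, as the diff does at module import.
_umask = os.umask(0)
os.umask(_umask)

def save_stream_via_temp(chunks: Iterable[bytes], suffix: str,
                         upload: Callable[[Path], None]) -> None:
    """Stream chunks into a temp file, relax its 0o600 default permissions,
    then hand the path to an upload callback. The context manager removes
    the file afterwards, so no manual unlink is needed."""
    with NamedTemporaryFile(suffix=suffix) as tmp:
        path = Path(tmp.name)
        for chunk in chunks:
            if chunk:  # skip keep-alive empty chunks
                tmp.write(chunk)
        tmp.flush()
        # NamedTemporaryFile creates files as 0o600; scraped artifacts only
        # need read/write for everyone, minus the umask.
        path.chmod(0o666 & ~_umask)
        upload(path)  # the handle stays open, so the file still exists here
    # file removed on context exit
```

Streaming in fixed-size chunks keeps memory flat regardless of image size, which is the motivation for replacing the old `__download_image` that buffered `r.content` whole.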
if not fileitem:
return
@@ -521,7 +565,7 @@ class MediaChain(ChainBase):
# 电影目录
if recursive:
# 处理文件
if is_bluray_folder(fileitem):
if self.is_bluray_folder(fileitem):
# 原盘目录
if scraping_switchs.get('movie_nfo', True):
nfo_path = filepath / (filepath.name + ".nfo")
@@ -541,6 +585,9 @@ class MediaChain(ChainBase):
# 处理目录内的文件
files = __list_files(_fileitem=fileitem)
for file in files:
if file.type == "dir":
# 电影不处理子目录
continue
self.scrape_metadata(fileitem=file,
mediainfo=mediainfo,
init_folder=False,
@@ -574,11 +621,8 @@ class MediaChain(ChainBase):
image_path = filepath.with_name(image_name)
if overwrite or not storagechain.get_file_item(storage=fileitem.storage,
path=image_path):
# 下载图片
content = __download_image(image_url)
# 写入图片到当前目录
if content:
__save_file(_fileitem=fileitem, _path=image_path, _content=content)
# 流式下载图片并直接保存
__download_and_save_image(_fileitem=fileitem, _path=image_path, _url=image_url)
else:
logger.info(f"已存在图片文件:{image_path}")
else:
@@ -624,13 +668,10 @@ class MediaChain(ChainBase):
for episode, image_url in image_dict.items():
image_path = filepath.with_suffix(Path(image_url).suffix)
if overwrite or not storagechain.get_file_item(storage=fileitem.storage, path=image_path):
# 下载图片
content = __download_image(image_url)
# 保存图片文件到当前目录
if content:
if not parent:
parent = storagechain.get_parent_item(fileitem)
__save_file(_fileitem=parent, _path=image_path, _content=content)
# 流式下载图片并直接保存
if not parent:
parent = storagechain.get_parent_item(fileitem)
__download_and_save_image(_fileitem=parent, _path=image_path, _url=image_url)
else:
logger.info(f"已存在图片文件:{image_path}")
else:
@@ -640,6 +681,9 @@ class MediaChain(ChainBase):
if recursive:
files = __list_files(_fileitem=fileitem)
for file in files:
if file.type == "dir" and not file.name.lower().startswith("season"):
# 电视剧不处理非季子目录
continue
self.scrape_metadata(fileitem=file,
mediainfo=mediainfo,
parent=fileitem if file.type == "file" else None,
@@ -678,13 +722,10 @@ class MediaChain(ChainBase):
image_path = filepath.with_name(image_name)
if overwrite or not storagechain.get_file_item(storage=fileitem.storage,
path=image_path):
# 下载图片
content = __download_image(image_url)
# 保存图片文件到剧集目录
if content:
if not parent:
parent = storagechain.get_parent_item(fileitem)
__save_file(_fileitem=parent, _path=image_path, _content=content)
# 流式下载图片并直接保存
if not parent:
parent = storagechain.get_parent_item(fileitem)
__download_and_save_image(_fileitem=parent, _path=image_path, _url=image_url)
else:
logger.info(f"已存在图片文件:{image_path}")
else:
@@ -714,13 +755,11 @@ class MediaChain(ChainBase):
continue
if overwrite or not storagechain.get_file_item(storage=fileitem.storage,
path=image_path):
# 下载图片
content = __download_image(image_url)
# 保存图片文件到当前目录
if content:
if not parent:
parent = storagechain.get_parent_item(fileitem)
__save_file(_fileitem=parent, _path=image_path, _content=content)
# 流式下载图片并直接保存
if not parent:
parent = storagechain.get_parent_item(fileitem)
__download_and_save_image(_fileitem=parent, _path=image_path,
_url=image_url)
else:
logger.info(f"已存在图片文件:{image_path}")
else:
@@ -770,11 +809,8 @@ class MediaChain(ChainBase):
image_path = filepath / image_name
if overwrite or not storagechain.get_file_item(storage=fileitem.storage,
path=image_path):
# 下载图片
content = __download_image(image_url)
# 保存图片文件到当前目录
if content:
__save_file(_fileitem=fileitem, _path=image_path, _content=content)
# 流式下载图片并直接保存
__download_and_save_image(_fileitem=fileitem, _path=image_path, _url=image_url)
else:
logger.info(f"已存在图片文件:{image_path}")
else:

View File

@@ -6,6 +6,7 @@ from datetime import datetime
from typing import Dict, Tuple
from typing import List, Optional
from app.helper.sites import SitesHelper # noqa
from fastapi.concurrency import run_in_threadpool
from app.chain import ChainBase
@@ -16,7 +17,6 @@ from app.core.event import eventmanager, Event
from app.core.metainfo import MetaInfo
from app.db.systemconfig_oper import SystemConfigOper
from app.helper.progress import ProgressHelper
from app.helper.sites import SitesHelper # noqa
from app.helper.torrent import TorrentHelper
from app.log import logger
from app.schemas import NotExistMediaInfo
@@ -86,13 +86,13 @@ class SearchChain(ChainBase):
self.save_cache(contexts, self.__result_temp_file)
return contexts
def last_search_results(self) -> List[Context]:
def last_search_results(self) -> Optional[List[Context]]:
"""
获取上次搜索结果
"""
return self.load_cache(self.__result_temp_file)
async def async_last_search_results(self) -> List[Context]:
async def async_last_search_results(self) -> Optional[List[Context]]:
"""
异步获取上次搜索结果
"""
@@ -324,9 +324,6 @@ class SearchChain(ChainBase):
:param _torrents: 种子列表
:return: 去重后的种子列表
"""
if not settings.SEARCH_MULTIPLE_NAME:
return _torrents
# 通过站点名称+标题+描述去重
return list({f"{t.torrent_info.site_name}_{t.torrent_info.title}_{t.torrent_info.description}": t
for t in _torrents}.values())
@@ -384,16 +381,23 @@ class SearchChain(ChainBase):
if search_count > 0:
logger.info(f"已搜索 {search_count} 次,强制休眠 1-10 秒 ...")
time.sleep(random.randint(1, 10))
# 搜索站点
torrents.extend(
self.__search_all_sites(
mediainfo=mediainfo,
keyword=search_word,
sites=sites,
area=area
) or []
)
results = self.__search_all_sites(
mediainfo=mediainfo,
keyword=search_word,
sites=sites,
area=area
) or []
# 合并结果
search_count += 1
torrents.extend(results)
# 有结果则停止
if not settings.SEARCH_MULTIPLE_NAME and torrents:
logger.info(f"共搜索到 {len(torrents)} 个资源,停止搜索")
break
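The reworked loop above — count each pass, and stop after the first word that yields results unless multi-name search is on — reduces to a small sketch (names are illustrative):

```python
from typing import Callable, List

def search_with_early_stop(search_words: List[str],
                           search_sites: Callable[[str], List[str]],
                           multiple_name: bool) -> List[str]:
    """Try each candidate name in turn; unless multi-name search is enabled,
    stop as soon as any name returns results."""
    torrents: List[str] = []
    for word in search_words:
        # One site-wide search per candidate name
        torrents.extend(search_sites(word) or [])
        if not multiple_name and torrents:
            break  # got results; skip the remaining names
    return torrents
```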
# 处理结果
return self.__parse_result(

View File

@@ -173,7 +173,7 @@ class StorageChain(ChainBase):
dir_item = fileitem if fileitem.type == "dir" else self.get_parent_item(fileitem)
if not dir_item:
logger.warn(f"{fileitem.storage}{fileitem.path} 上级目录不存在")
return False
return True
# 查找操作文件项匹配的配置目录(资源目录、媒体库目录)
associated_dir = max(

View File

@@ -1184,6 +1184,42 @@ class SubscribeChain(ChainBase):
logger.error(f'follow用户分享订阅 {title} 添加失败:{message}')
logger.info(f'follow用户分享订阅刷新完成,共添加 {success_count} 个订阅')
async def cache_calendar(self):
"""
预缓存订阅日历,实际上就是查询一遍所有订阅的媒体信息
前端请求是异步的,所以需要使用异步缓存方法
"""
logger.info(f'开始预缓存订阅日历 ...')
for subscribe in await SubscribeOper().async_list():
if global_vars.is_system_stopped:
break
try:
mtype = MediaType(subscribe.type)
except ValueError:
logger.error(f'订阅 {subscribe.name} 类型错误:{subscribe.type}')
continue
# 识别媒体信息
if mtype == MediaType.MOVIE:
mediainfo: MediaInfo = await self.async_recognize_media(mtype=mtype,
tmdbid=subscribe.tmdbid,
doubanid=subscribe.doubanid,
bangumiid=subscribe.bangumiid,
episode_group=subscribe.episode_group,
cache=False)
if not mediainfo:
logger.warn(
f'未识别到媒体信息,标题:{subscribe.name},tmdbid:{subscribe.tmdbid},doubanid:{subscribe.doubanid}')
continue
else:
episodes = await TmdbChain().async_tmdb_episodes(tmdbid=subscribe.tmdbid,
season=subscribe.season,
episode_group=subscribe.episode_group)
if not episodes:
logger.warn(
f'未识别到季集信息,标题:{subscribe.name},tmdbid:{subscribe.tmdbid},豆瓣ID:{subscribe.doubanid},季:{subscribe.season}')
continue
logger.info(f'订阅日历预缓存完成')
@staticmethod
def __update_subscribe_note(subscribe: Subscribe, downloads: Optional[List[Context]]):
"""

View File

@@ -591,7 +591,7 @@ class TransferChain(ChainBase, metaclass=Singleton):
text=__process_msg,
data={
"current": Path(fileitem.path).as_posix(),
"finished":finished_files
"finished": finished_files
})
# 整理
state, err_msg = self.__handle_transfer(task=task, callback=item.callback)
@@ -1471,13 +1471,9 @@ class TransferChain(ChainBase, metaclass=Singleton):
for file in torrent_files:
file_path = save_path / file.name
# 如果存在未被屏蔽的媒体文件,则不删除种子
if (
file_path.suffix in self.all_exts
and not self._is_blocked_by_exclude_words(
str(file_path), transfer_exclude_words
)
and file_path.exists()
):
if (file_path.suffix in self.all_exts
and not self._is_blocked_by_exclude_words(str(file_path), transfer_exclude_words)
and file_path.exists()):
return False
# 所有媒体文件都被屏蔽或不存在,可以删除种子

View File

@@ -455,7 +455,7 @@ class MemoryBackend(CacheBackend):
if region_cache:
with lock:
region_cache.clear()
logger.info(f"Cleared cache for region: {region}")
logger.debug(f"Cleared cache for region: {region}")
else:
# 清除所有区域的缓存
for region_cache in self._region_caches.values():
@@ -474,7 +474,11 @@ class MemoryBackend(CacheBackend):
if region_cache is None:
yield from ()
return
for item in region_cache.items():
# 使用锁保护迭代过程,避免在迭代时缓存被修改
with lock:
# 创建快照避免并发修改问题
items_snapshot = list(region_cache.items())
for item in items_snapshot:
yield item
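The snapshot-under-lock pattern applied to both `items()` implementations can be shown on a minimal dict-backed cache (a sketch, not the project's `CacheBackend` API): copy the items while holding the lock, then yield from the copy with the lock released, so concurrent writers can neither corrupt the iteration nor be blocked by a slow consumer.

```python
import threading

class RegionCache:
    """Minimal dict-backed cache whose items() iterates over a snapshot."""

    def __init__(self):
        self._data = {}
        self._lock = threading.Lock()

    def set(self, key, value):
        with self._lock:
            self._data[key] = value

    def items(self):
        with self._lock:
            # Copy while holding the lock to avoid
            # "dictionary changed size during iteration".
            snapshot = list(self._data.items())
        for item in snapshot:  # iterate without the lock held
            yield item
```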
def close(self) -> None:
@@ -585,13 +589,13 @@ class AsyncMemoryBackend(AsyncCacheBackend):
if region_cache:
with lock:
region_cache.clear()
logger.info(f"Cleared cache for region: {region}")
logger.debug(f"Cleared cache for region: {region}")
else:
# 清除所有区域的缓存
for region_cache in self._region_caches.values():
with lock:
region_cache.clear()
logger.info("Cleared all cache")
logger.info("All cache cleared")
async def items(self, region: Optional[str] = DEFAULT_CACHE_REGION) -> AsyncGenerator[Tuple[str, Any], None]:
"""
@@ -603,7 +607,11 @@ class AsyncMemoryBackend(AsyncCacheBackend):
region_cache = self.__get_region_cache(region)
if region_cache is None:
return
for item in region_cache.items():
# 使用锁保护迭代过程,避免在迭代时缓存被修改
with lock:
# 创建快照避免并发修改问题
items_snapshot = list(region_cache.items())
for item in items_snapshot:
yield item
async def close(self) -> None:
@@ -1385,7 +1393,7 @@ class TTLCache(CacheProxy):
def __init__(self,
region: Optional[str] = DEFAULT_CACHE_REGION,
maxsize: Optional[int] = DEFAULT_CACHE_SIZE,
ttl: Optional[int]= DEFAULT_CACHE_TTL):
ttl: Optional[int] = DEFAULT_CACHE_TTL):
"""
初始化 TTL 缓存
@@ -1404,7 +1412,7 @@ class LRUCache(CacheProxy):
def __init__(self,
region: Optional[str] = DEFAULT_CACHE_REGION,
maxsize: Optional[int]= DEFAULT_CACHE_SIZE
maxsize: Optional[int] = DEFAULT_CACHE_SIZE
):
"""
初始化 LRU 缓存

View File

@@ -75,6 +75,8 @@ class ConfigModel(BaseModel):
DEBUG: bool = False
# 是否开发模式
DEV: bool = False
# 高级设置模式
ADVANCED_MODE: bool = True
# ==================== 安全认证配置 ====================
# 密钥
@@ -87,8 +89,10 @@ class ConfigModel(BaseModel):
ACCESS_TOKEN_EXPIRE_MINUTES: int = 60 * 24 * 8
# RESOURCE_TOKEN过期时间
RESOURCE_ACCESS_TOKEN_EXPIRE_SECONDS: int = 60 * 30
# 超级管理员
# 超级管理员初始用户名
SUPERUSER: str = "admin"
# 超级管理员初始密码
SUPERUSER_PASSWORD: str = None
# 辅助认证,允许通过外部服务进行认证、单点登录以及自动创建用户
AUXILIARY_AUTH_ENABLE: bool = False
# API密钥需要更换
@@ -114,7 +118,7 @@ class ConfigModel(BaseModel):
# 数据库连接池获取连接的超时时间(秒)
DB_POOL_TIMEOUT: int = 30
# SQLite 连接池大小
DB_SQLITE_POOL_SIZE: int = 30
DB_SQLITE_POOL_SIZE: int = 10
# SQLite 连接池溢出数量
DB_SQLITE_MAX_OVERFLOW: int = 50
# PostgreSQL 主机地址
@@ -128,7 +132,7 @@ class ConfigModel(BaseModel):
# PostgreSQL 密码
DB_POSTGRESQL_PASSWORD: str = "moviepilot"
# PostgreSQL 连接池大小
DB_POSTGRESQL_POOL_SIZE: int = 30
DB_POSTGRESQL_POOL_SIZE: int = 10
# PostgreSQL 连接池溢出数量
DB_POSTGRESQL_MAX_OVERFLOW: int = 50
@@ -167,7 +171,7 @@ class ConfigModel(BaseModel):
# ==================== 媒体元数据配置 ====================
# 媒体搜索来源 themoviedb/douban/bangumi,多个用,分隔
SEARCH_SOURCE: str = "themoviedb,douban,bangumi"
SEARCH_SOURCE: str = "themoviedb"
# 媒体识别来源 themoviedb/douban
RECOGNIZE_SOURCE: str = "themoviedb"
# 刮削来源 themoviedb/douban
@@ -249,8 +253,10 @@ class ConfigModel(BaseModel):
SUBSCRIBE_STATISTIC_SHARE: bool = True
# 订阅搜索开关
SUBSCRIBE_SEARCH: bool = False
# 订阅搜索时间间隔(小时)
SUBSCRIBE_SEARCH_INTERVAL: int = 24
# 检查本地媒体库是否存在资源开关
LOCAL_EXISTS_SEARCH: bool = False
LOCAL_EXISTS_SEARCH: bool = True
# ==================== 站点配置 ====================
# 站点数据刷新间隔(小时)
@@ -358,12 +364,12 @@ class ConfigModel(BaseModel):
# ==================== 性能配置 ====================
# 大内存模式
BIG_MEMORY_MODE: bool = False
# FastApi性能监控
PERFORMANCE_MONITOR_ENABLE: bool = False
# 是否启用编码探测的性能模式
ENCODING_DETECTION_PERFORMANCE_MODE: bool = True
# 编码探测的最低置信度阈值
ENCODING_DETECTION_MIN_CONFIDENCE: float = 0.8
# 主动内存回收时间间隔(分钟),0为不启用
MEMORY_GC_INTERVAL: int = 30
# ==================== 安全配置 ====================
# 允许的图片缓存域名
@@ -663,7 +669,7 @@ class Settings(BaseSettings, ConfigModel, LogConfigModel):
douban=512,
bangumi=512,
fanart=512,
meta=(self.META_CACHE_EXPIRE or 24) * 3600,
meta=(self.META_CACHE_EXPIRE or 72) * 3600,
scheduler=100,
threadpool=100
)
@@ -674,7 +680,7 @@ class Settings(BaseSettings, ConfigModel, LogConfigModel):
douban=256,
bangumi=256,
fanart=128,
meta=(self.META_CACHE_EXPIRE or 2) * 3600,
meta=(self.META_CACHE_EXPIRE or 24) * 3600,
scheduler=50,
threadpool=50
)

View File

@@ -1,3 +1,4 @@
import asyncio
import importlib
import inspect
import random
@@ -71,15 +72,26 @@ class EventManager(metaclass=Singleton):
"""
def __init__(self):
self.__executor = ThreadHelper() # 动态线程池,用于消费事件
self.__consumer_threads = [] # 用于保存启动的事件消费者线程
self.__event_queue = PriorityQueue() # 优先级队列
self.__broadcast_subscribers: Dict[EventType, Dict[str, Callable]] = {} # 广播事件的订阅者
self.__chain_subscribers: Dict[ChainEventType, Dict[str, tuple[int, Callable]]] = {} # 链式事件的订阅者
self.__disabled_handlers = set() # 禁用的事件处理器集合
self.__disabled_classes = set() # 禁用的事件处理器类集合
self.__lock = threading.Lock() # 线程锁
self.__event = threading.Event() # 退出事件
# 动态线程池,用于消费事件
self.__executor = ThreadHelper()
# 用于保存启动的事件消费者线程
self.__consumer_threads = []
# 优先级队列
self.__event_queue = PriorityQueue()
# 广播事件的订阅者
self.__broadcast_subscribers: Dict[EventType, Dict[str, Callable]] = {}
# 链式事件的订阅者
self.__chain_subscribers: Dict[ChainEventType, Dict[str, tuple[int, Callable]]] = {}
# 禁用的事件处理器集合
self.__disabled_handlers = set()
# 禁用的事件处理器类集合
self.__disabled_classes = set()
# 线程锁
self.__lock = threading.Lock()
# 退出事件
self.__event = threading.Event()
# 当前事件循环
self.loop = asyncio.get_event_loop()
def start(self):
"""
@@ -438,7 +450,15 @@ class EventManager(metaclass=Singleton):
isolated_event = Event(event_type=event.event_type,
event_data=event_data_copy,
priority=event.priority)
self.__executor.submit(self.__safe_invoke_handler, handler, isolated_event)
if inspect.iscoroutinefunction(handler):
# 对于异步函数,直接在事件循环中运行
asyncio.run_coroutine_threadsafe(
self.__safe_invoke_handler_async(handler, isolated_event),
self.loop
)
else:
# 对于同步函数,在线程池中运行
self.__executor.submit(self.__safe_invoke_handler, handler, isolated_event)
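The sync/async dispatch added here can be sketched standalone (the `dispatch` helper is illustrative; the project wraps it in `__safe_invoke_handler*`). Coroutine functions are scheduled onto the event loop stored in `self.loop` via `run_coroutine_threadsafe`, while plain functions go to the thread pool; both paths hand back a `concurrent.futures.Future`.

```python
import asyncio
import inspect
from concurrent.futures import Executor, Future

def dispatch(handler, event, loop: asyncio.AbstractEventLoop,
             executor: Executor) -> Future:
    """Route a handler to the right runtime based on its type."""
    if inspect.iscoroutinefunction(handler):
        # Async handlers run on the event loop, from any thread.
        return asyncio.run_coroutine_threadsafe(handler(event), loop)
    # Sync handlers run in the thread pool so they never block the loop.
    return executor.submit(handler, event)
```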
def __safe_invoke_handler(self, handler: Callable, event: Event):
"""
@@ -450,10 +470,7 @@ class EventManager(metaclass=Singleton):
logger.debug(f"Handler {self.__get_handler_identifier(handler)} is disabled. Skipping execution")
return
try:
self.__invoke_handler_by_type_sync(handler, event)
except Exception as e:
self.__handle_event_error(event, handler, e)
self.__invoke_handler_by_type_sync(handler, event)
async def __safe_invoke_handler_async(self, handler: Callable, event: Event):
"""
@@ -465,10 +482,7 @@ class EventManager(metaclass=Singleton):
logger.debug(f"Handler {self.__get_handler_identifier(handler)} is disabled. Skipping execution")
return
try:
await self.__invoke_handler_by_type_async(handler, event)
except Exception as e:
self.__handle_event_error(event, handler, e)
await self.__invoke_handler_by_type_async(handler, event)
def __invoke_handler_by_type_sync(self, handler: Callable, event: Event):
"""
@@ -486,7 +500,17 @@ class EventManager(metaclass=Singleton):
if class_name in plugin_manager.get_plugin_ids():
# 插件处理器
plugin_manager.run_plugin_method(class_name, method_name, event)
plugin = plugin_manager.running_plugins.get(class_name)
if not plugin:
return
method = getattr(plugin, method_name, None)
if not method:
return
try:
method(event)
except Exception as e:
self.__handle_event_error(event=event, module_name=plugin.name,
class_name=class_name, method_name=method_name, e=e)
elif class_name in module_manager.get_module_ids():
# 模块处理器
module = module_manager.get_running_module(class_name)
@@ -495,16 +519,24 @@ class EventManager(metaclass=Singleton):
method = getattr(module, method_name, None)
if not method:
return
method(event)
try:
method(event)
except Exception as e:
self.__handle_event_error(event=event, module_name=module.get_name(),
class_name=class_name, method_name=method_name, e=e)
else:
# 全局处理器
class_obj = self.__get_class_instance(class_name)
if not class_obj or not hasattr(class_obj, method_name):
return
method = getattr(class_obj, method_name)
method = getattr(class_obj, method_name, None)
if not method:
return
method(event)
try:
method(event)
except Exception as e:
self.__handle_event_error(event=event, module_name=class_name,
class_name=class_name, method_name=method_name, e=e)
async def __invoke_handler_by_type_async(self, handler: Callable, event: Event):
"""
@@ -537,52 +569,62 @@ class EventManager(metaclass=Singleton):
names = handler.__qualname__.split(".")
return names[0], names[1]
@staticmethod
async def __invoke_plugin_method_async(handler: Any, class_name: str, method_name: str, event: Event):
async def __invoke_plugin_method_async(self, handler: Any, class_name: str, method_name: str, event: Event):
"""
异步调用插件方法
"""
plugin = handler.running_plugins.get(class_name)
if plugin and hasattr(plugin, method_name):
method = getattr(plugin, method_name)
if not plugin:
return
method = getattr(plugin, method_name, None)
if not method:
return
try:
if inspect.iscoroutinefunction(method):
await method(event)
else:
# 插件同步函数在异步环境中运行,避免阻塞
await run_in_threadpool(method, event)
except Exception as e:
self.__handle_event_error(event=event, handler=handler, e=e, module_name=plugin.name)
@staticmethod
async def __invoke_module_method_async(handler: Any, class_name: str, method_name: str, event: Event):
async def __invoke_module_method_async(self, handler: Any, class_name: str, method_name: str, event: Event):
"""
异步调用模块方法
"""
module = handler.get_running_module(class_name)
if not module:
return
method = getattr(module, method_name, None)
if not method:
return
if inspect.iscoroutinefunction(method):
await method(event)
else:
method(event)
try:
if inspect.iscoroutinefunction(method):
await method(event)
else:
method(event)
except Exception as e:
self.__handle_event_error(event=event, module_name=module.get_name(),
class_name=class_name, method_name=method_name, e=e)
async def __invoke_global_method_async(self, class_name: str, method_name: str, event: Event):
"""
异步调用全局对象方法
"""
class_obj = self.__get_class_instance(class_name)
if not class_obj or not hasattr(class_obj, method_name):
if not class_obj:
return
method = getattr(class_obj, method_name)
if inspect.iscoroutinefunction(method):
await method(event)
else:
method(event)
method = getattr(class_obj, method_name, None)
if not method:
return
try:
if inspect.iscoroutinefunction(method):
await method(event)
else:
method(event)
except Exception as e:
self.__handle_event_error(event=event, module_name=class_name,
class_name=class_name, method_name=method_name, e=e)
@staticmethod
def __get_class_instance(class_name: str):
@@ -609,7 +651,11 @@ class EventManager(metaclass=Singleton):
module_name = f"app.chain.{class_name[:-5].lower()}"
module = importlib.import_module(module_name)
elif class_name.endswith("Helper"):
module_name = f"app.helper.{class_name[:-6].lower()}"
# 特殊处理 Async 类
if class_name.startswith("Async"):
module_name = f"app.helper.{class_name[5:-6].lower()}"
else:
module_name = f"app.helper.{class_name[:-6].lower()}"
module = importlib.import_module(module_name)
else:
module_name = f"app.{class_name.lower()}"
@@ -649,18 +695,16 @@ class EventManager(metaclass=Singleton):
"""
logger.debug(f"{stage} - {event}")
def __handle_event_error(self, event: Event, handler: Callable, e: Exception):
def __handle_event_error(self, event: Event, module_name: str,
class_name: str, method_name: str, e: Exception):
"""
全局错误处理器,用于处理事件处理中的异常
"""
logger.error(f"事件处理出错:{str(e)} - {traceback.format_exc()}")
names = handler.__qualname__.split(".")
class_name, method_name = names[0], names[1]
logger.error(f"{module_name} 事件处理出错:{str(e)} - {traceback.format_exc()}")
# 发送系统错误通知
from app.helper.message import MessageHelper
MessageHelper().put(title=f"{event.event_type} 事件处理出错",
MessageHelper().put(title=f"{module_name} 处理事件 {event.event_type} 出错",
message=f"{class_name}.{method_name}{str(e)}",
role="system")
self.send_event(

View File

@@ -48,7 +48,7 @@ class ModuleManager(metaclass=Singleton):
# 通过模板开关控制加载
_module.init_module()
self._running_modules[module_id] = _module
logger.info(f"Module Loaded:{module_id}")
logger.debug(f"Module Loaded:{module_id}")
except Exception as err:
logger.error(f"Load Module Error:{module_id}:{str(err)} - {traceback.format_exc()}", exc_info=True)
@@ -61,7 +61,7 @@ class ModuleManager(metaclass=Singleton):
if hasattr(module, "stop"):
try:
module.stop()
logger.info(f"Module Stopped:{module_id}")
logger.debug(f"Module Stopped:{module_id}")
except Exception as err:
logger.error(f"Stop Module Error:{module_id}:{str(err)} - {traceback.format_exc()}", exc_info=True)
logger.info("所有模块停止完成")

View File

@@ -17,11 +17,12 @@ from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer
from app import schemas
from app.core.cache import cached
from app.core.config import settings
from app.core.event import eventmanager, Event
from app.db.plugindata_oper import PluginDataOper
from app.db.systemconfig_oper import SystemConfigOper
from app.helper.plugin import PluginHelper, PluginMemoryMonitor
from app.helper.plugin import PluginHelper
from app.helper.sites import SitesHelper # noqa
from app.log import logger
from app.schemas.types import EventType, SystemConfigKey
@@ -98,8 +99,6 @@ class PluginManager(metaclass=Singleton):
self._config_key: str = "plugin.%s"
# 监听器
self._observer: Observer = None
# 内存监控器
self._memory_monitor = PluginMemoryMonitor()
# 开发者模式监测插件修改
if settings.DEV or settings.PLUGIN_AUTO_RELOAD:
self.__start_monitor()
@@ -865,32 +864,14 @@ class PluginManager(metaclass=Singleton):
"""
return list(self._running_plugins.keys())
def get_plugin_memory_stats(self, pid: Optional[str] = None) -> List[Dict[str, Any]]:
"""
获取插件内存统计信息
:param pid: 插件ID,为空则获取所有插件
:return: 内存统计信息列表
"""
if pid:
plugin_instance = self._running_plugins.get(pid)
if plugin_instance:
return [self._memory_monitor.get_plugin_memory_usage(pid, plugin_instance)]
else:
return []
else:
return self._memory_monitor.get_all_plugins_memory_usage(self._running_plugins)
def clear_plugin_memory_cache(self, pid: Optional[str] = None):
"""
清除插件内存统计缓存
:param pid: 插件ID,为空则清除所有缓存
"""
self._memory_monitor.clear_cache(pid)
@cached(maxsize=1, ttl=1800)
def get_online_plugins(self, force: bool = False) -> List[schemas.Plugin]:
"""
获取所有在线插件信息
"""
if force:
self.get_online_plugins.cache_clear()
if not settings.PLUGIN_MARKET:
return []
@@ -1186,10 +1167,15 @@ class PluginManager(metaclass=Singleton):
return plugin
@cached(maxsize=1, ttl=1800)
async def async_get_online_plugins(self, force: bool = False) -> List[schemas.Plugin]:
"""
异步获取所有在线插件信息
:param force: 是否强制刷新(忽略缓存)
"""
if force:
await self.async_get_online_plugins.cache_clear()
if not settings.PLUGIN_MARKET:
return []
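The `force` + `cache_clear()` pattern used by both `get_online_plugins` variants can be illustrated with the stdlib `functools.lru_cache` standing in for the project's `@cached` decorator (which additionally applies a TTL); the factory and names below are illustrative:

```python
import functools

def fetch_online_plugins_factory():
    calls = {"n": 0}  # counts real (uncached) fetches

    @functools.lru_cache(maxsize=1)
    def _fetch():
        calls["n"] += 1
        return ["plugin-a", "plugin-b"]  # stand-in for the HTTP request

    def fetch(force: bool = False):
        # Mirrors `if force: self.get_online_plugins.cache_clear()`:
        # force=True drops the cached result before calling through.
        if force:
            _fetch.cache_clear()
        return _fetch()

    fetch.calls = calls
    return fetch
```

Clearing the cache inside the decorated function (rather than bypassing it) means the forced result is itself cached for subsequent non-forced callers.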

View File

@@ -20,7 +20,7 @@ class SiteUserData(Base):
# 用户名
username = Column(String)
# 用户ID
userid = Column(Integer)
userid = Column(String)
# 用户等级
user_level = Column(String)
# 加入时间

View File

@@ -119,6 +119,14 @@ class SubscribeOper(DbOper):
return Subscribe.get_by_state(self._db, state)
return Subscribe.list(self._db)
async def async_list(self, state: Optional[str] = None) -> List[Subscribe]:
"""
异步获取订阅列表
"""
if state:
return await Subscribe.async_get_by_state(self._db, state)
return await Subscribe.async_list(self._db)
def delete(self, sid: int):
"""
删除订阅

View File

@@ -2,7 +2,6 @@ from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from app.core.config import settings
from app.monitoring import setup_prometheus_metrics
from app.startup.lifecycle import lifespan
@@ -25,9 +24,6 @@ def create_app() -> FastAPI:
allow_headers=["*"],
)
# 设置性能监控
setup_prometheus_metrics(_app)
return _app

View File

@@ -713,6 +713,7 @@ class MessageQueueManager(metaclass=SingletonClass):
self._running = False
logger.info("正在停止消息队列...")
self.thread.join()
logger.info("消息队列已停止")
class MessageHelper(metaclass=Singleton):

View File

@@ -4,11 +4,10 @@ import json
import shutil
import site
import sys
import time
import traceback
import zipfile
from pathlib import Path
from typing import Dict, List, Optional, Tuple, Set, Callable, Awaitable, Any
from typing import Dict, List, Optional, Tuple, Set, Callable, Awaitable
import aiofiles
import aioshutil
@@ -25,7 +24,6 @@ from app.db.systemconfig_oper import SystemConfigOper
from app.log import logger
from app.schemas.types import SystemConfigKey
from app.utils.http import RequestUtils, AsyncRequestUtils
from app.utils.memory import MemoryCalculator
from app.utils.singleton import WeakSingleton
from app.utils.system import SystemUtils
from app.utils.url import UrlUtils
@@ -60,21 +58,22 @@ class PluginHelper(metaclass=WeakSingleton):
"""
# 如果强制刷新,直接调用不带缓存的版本
if force:
return self._get_plugins_uncached(repo_url, package_version)
return self._request_plugins(repo_url, package_version)
else:
return self._request_plugins_cached(repo_url, package_version)
# 正常情况下调用带缓存的版本
return self._get_plugins_cached(repo_url, package_version)
@cached(maxsize=64, ttl=1800)
def _get_plugins_cached(self, repo_url: str, package_version: Optional[str] = None) -> Optional[Dict[str, dict]]:
@cached(maxsize=128, ttl=1800)
def _request_plugins_cached(self, repo_url: str,
package_version: Optional[str] = None) -> Optional[Dict[str, dict]]:
"""
获取Github所有最新插件列表使用缓存
:param repo_url: Github仓库地址
:param package_version: 首选插件版本 (如 "v2", "v3"),如果不指定则获取 v1 版本
"""
return self._get_plugins_uncached(repo_url, package_version)
return self._request_plugins(repo_url, package_version)
def _get_plugins_uncached(self, repo_url: str, package_version: Optional[str] = None) -> Optional[Dict[str, dict]]:
def _request_plugins(self, repo_url: str,
package_version: Optional[str] = None) -> Optional[Dict[str, dict]]:
"""
获取Github所有最新插件列表不使用缓存
:param repo_url: Github仓库地址
@@ -163,7 +162,7 @@ class PluginHelper(metaclass=WeakSingleton):
return res.json()
return {}
def install_reg(self, pid: str) -> bool:
def install_reg(self, pid: str, repo_url: Optional[str] = None) -> bool:
"""
安装插件统计
"""
@@ -172,24 +171,39 @@ class PluginHelper(metaclass=WeakSingleton):
if not pid:
return False
install_reg_url = self._install_reg.format(pid=pid)
res = RequestUtils(proxies=settings.PROXY, timeout=5).get_res(install_reg_url)
res = RequestUtils(
proxies=settings.PROXY,
content_type="application/json",
timeout=5
).post(install_reg_url, json={
"plugin_id": pid,
"repo_url": repo_url
})
if res and res.status_code == 200:
return True
return False
def install_report(self) -> bool:
def install_report(self, items: Optional[List[Tuple[str, Optional[str]]]] = None) -> bool:
"""
上报存量插件安装统计
上报存量插件安装统计(批量)。支持上送 repo_url。
:param items: 可选,形如 [(plugin_id, repo_url), ...];不传则回落到历史配置,仅上送 plugin_id。
"""
if not settings.PLUGIN_STATISTIC_SHARE:
return False
plugins = self.systemconfig.get(SystemConfigKey.UserInstalledPlugins)
if not plugins:
return False
payload_plugins = []
if items:
for pid, repo_url in items:
if pid:
payload_plugins.append({"plugin_id": pid, "repo_url": repo_url})
else:
plugins = self.systemconfig.get(SystemConfigKey.UserInstalledPlugins)
if not plugins:
return False
payload_plugins = [{"plugin_id": plugin, "repo_url": None} for plugin in plugins]
res = RequestUtils(proxies=settings.PROXY,
content_type="application/json",
timeout=5).post(self._install_report,
json={"plugins": [{"plugin_id": plugin} for plugin in plugins]})
json={"plugins": payload_plugins})
return True if res else False
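The batch report above builds its JSON body either from explicit `(plugin_id, repo_url)` tuples or, as a fallback, from the stored plugin list with `repo_url` set to `None`. A minimal sketch of that payload construction (the helper name is hypothetical, not part of the codebase):

```python
from typing import List, Optional, Tuple

def build_report_payload(items: Optional[List[Tuple[str, Optional[str]]]],
                         stored_plugins: Optional[List[str]]) -> Optional[List[dict]]:
    """Build the JSON body for the install report.

    Prefers explicit (plugin_id, repo_url) tuples; falls back to the
    stored plugin list (repo_url=None); returns None when there is
    nothing to report.
    """
    if items:
        # Skip empty plugin IDs, keep repo_url even if None
        return [{"plugin_id": pid, "repo_url": repo_url}
                for pid, repo_url in items if pid]
    if not stored_plugins:
        return None
    return [{"plugin_id": pid, "repo_url": None} for pid in stored_plugins]
```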
def install(self, pid: str, repo_url: str, package_version: Optional[str] = None, force_install: bool = False) \
@@ -254,16 +268,16 @@ class PluginHelper(metaclass=WeakSingleton):
# 使用 release 进行安装
def prepare_release() -> Tuple[bool, str]:
return self.__install_from_release(
pid.lower(), user_repo, release_tag
pid, user_repo, release_tag
)
return self.__install_flow_sync(pid.lower(), force_install, prepare_release)
return self.__install_flow_sync(pid, force_install, prepare_release, repo_url)
else:
# 如果 release_tag 不存在,说明插件没有发布版本,使用文件列表方式安装
def prepare_filelist() -> Tuple[bool, str]:
return self.__prepare_content_via_filelist_sync(pid.lower(), user_repo, package_version)
return self.__install_flow_sync(pid.lower(), force_install, prepare_filelist)
return self.__install_flow_sync(pid, force_install, prepare_filelist, repo_url)
def __get_file_list(self, pid: str, user_repo: str, package_version: Optional[str] = None) -> \
Tuple[Optional[list], Optional[str]]:
@@ -277,7 +291,7 @@ class PluginHelper(metaclass=WeakSingleton):
# 如果 package_version 存在(如 "v2"),则加上版本号
if package_version:
file_api += f".{package_version}"
file_api += f"/{pid}"
file_api += f"/{pid.lower()}"
res = self.__request_with_fallback(file_api,
headers=settings.REPO_GITHUB_HEADERS(repo=user_repo),
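The casing changes in this commit follow one convention: on-disk paths and API path segments use `pid.lower()`, while the original plugin ID casing is preserved for logs and statistics. A tiny sketch of that convention (helper name hypothetical):

```python
from pathlib import PurePosixPath

def plugin_dir_for(pid: str, base: str = "app/plugins") -> PurePosixPath:
    """On-disk plugin directories are always the lower-cased plugin ID;
    the original casing is kept only for display and reporting."""
    return PurePosixPath(base) / pid.lower()
```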
@@ -410,8 +424,8 @@ class PluginHelper(metaclass=WeakSingleton):
:param pid: 插件 ID
:return: 备份目录路径
"""
plugin_dir = PLUGIN_DIR / pid
backup_dir = Path(settings.TEMP_PATH) / "plugin_backup" / pid
plugin_dir = PLUGIN_DIR / pid.lower()
backup_dir = Path(settings.TEMP_PATH) / "plugin_backup" / pid.lower()
if plugin_dir.exists():
# 备份时清理已有的备份目录,防止残留文件影响
@@ -431,7 +445,7 @@ class PluginHelper(metaclass=WeakSingleton):
:param pid: 插件 ID
:param backup_dir: 备份目录路径
"""
plugin_dir = PLUGIN_DIR / pid
plugin_dir = PLUGIN_DIR / pid.lower()
if plugin_dir.exists():
shutil.rmtree(plugin_dir, ignore_errors=True)
logger.debug(f"{pid} 已清理插件目录 {plugin_dir}")
@@ -448,7 +462,7 @@ class PluginHelper(metaclass=WeakSingleton):
删除旧插件
:param pid: 插件 ID
"""
plugin_dir = PLUGIN_DIR / pid
plugin_dir = PLUGIN_DIR / pid.lower()
if plugin_dir.exists():
shutil.rmtree(plugin_dir, ignore_errors=True)
@@ -459,7 +473,18 @@ class PluginHelper(metaclass=WeakSingleton):
:param requirements_file: 依赖的 requirements.txt 文件路径
:return: (是否成功, 错误信息)
"""
base_cmd = [sys.executable, "-m", "pip", "install", "-r", str(requirements_file)]
wheels_dir = requirements_file.parent / "wheels"
find_links_option = []
if wheels_dir.is_dir():
# 如果目录存在,增加 --find-links 选项
logger.debug(f"[PIP] 发现插件内嵌的 wheels 目录: {wheels_dir},将优先从本地安装。")
find_links_option = ["--find-links", str(wheels_dir)]
else:
# 如果不存在,选项为空列表,对后续命令无影响
logger.debug("[PIP] 未发现插件内嵌的 wheels 目录,将仅使用在线源。")

base_cmd = [sys.executable, "-m", "pip", "install"] + find_links_option + ["-r", str(requirements_file)]
strategies = []
# 添加策略到列表中
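The hunk above prepends `--find-links` to the pip command only when a bundled `wheels/` directory exists next to `requirements.txt`. A self-contained sketch of that command construction (no subprocess is run here):

```python
import sys
from pathlib import Path
from typing import List

def build_pip_command(requirements_file: Path) -> List[str]:
    """Build the pip install command, preferring a plugin-bundled
    wheels/ directory next to requirements.txt when it exists."""
    find_links: List[str] = []
    wheels_dir = requirements_file.parent / "wheels"
    if wheels_dir.is_dir():
        # Local wheels are tried before the online index
        find_links = ["--find-links", str(wheels_dir)]
    return [sys.executable, "-m", "pip", "install"] + find_links \
        + ["-r", str(requirements_file)]
```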
@@ -548,41 +573,42 @@ class PluginHelper(metaclass=WeakSingleton):
logger.error(f"获取插件 {pid} 元数据失败:{e}")
return {}
def __install_flow_sync(self, pid_lower: str, force_install: bool,
prepare_content: Callable[[], Tuple[bool, str]]) -> Tuple[bool, str]:
def __install_flow_sync(self, pid: str, force_install: bool,
prepare_content: Callable[[], Tuple[bool, str]],
repo_url: Optional[str] = None) -> Tuple[bool, str]:
"""
同步安装统一流程:备份→清理→准备内容→安装依赖→上报
prepare_content 负责把插件文件放到 app/plugins/{pid}
"""
backup_dir = None
if not force_install:
backup_dir = self.__backup_plugin(pid_lower)
backup_dir = self.__backup_plugin(pid)
self.__remove_old_plugin(pid_lower)
self.__remove_old_plugin(pid)
success, message = prepare_content()
if not success:
logger.error(f"{pid_lower} 准备插件内容失败:{message}")
logger.error(f"{pid} 准备插件内容失败:{message}")
if backup_dir:
self.__restore_plugin(pid_lower, backup_dir)
logger.warning(f"{pid_lower} 插件安装失败,已还原备份插件")
self.__restore_plugin(pid, backup_dir)
logger.warning(f"{pid} 插件安装失败,已还原备份插件")
else:
self.__remove_old_plugin(pid_lower)
logger.warning(f"{pid_lower} 已清理对应插件目录,请尝试重新安装")
self.__remove_old_plugin(pid)
logger.warning(f"{pid} 已清理对应插件目录,请尝试重新安装")
return False, message
dependencies_exist, dep_ok, dep_msg = self.__install_dependencies_if_required(pid_lower)
dependencies_exist, dep_ok, dep_msg = self.__install_dependencies_if_required(pid)
if dependencies_exist and not dep_ok:
logger.error(f"{pid_lower} 依赖安装失败:{dep_msg}")
logger.error(f"{pid} 依赖安装失败:{dep_msg}")
if backup_dir:
self.__restore_plugin(pid_lower, backup_dir)
logger.warning(f"{pid_lower} 插件安装失败,已还原备份插件")
self.__restore_plugin(pid, backup_dir)
logger.warning(f"{pid} 插件安装失败,已还原备份插件")
else:
self.__remove_old_plugin(pid_lower)
logger.warning(f"{pid_lower} 已清理对应插件目录,请尝试重新安装")
self.__remove_old_plugin(pid)
logger.warning(f"{pid} 已清理对应插件目录,请尝试重新安装")
return False, dep_msg
self.install_reg(pid_lower)
self.install_reg(pid, repo_url)
return True, ""
def __install_from_release(self, pid: str, user_repo: str, release_tag: str) -> Tuple[bool, str]:
@@ -610,14 +636,19 @@ class PluginHelper(metaclass=WeakSingleton):
asset = next((a for a in assets if a.get("name") == asset_name), None)
if not asset:
return False, f"未找到资产文件:{asset_name}"
download_url = asset.get("browser_download_url")
if not download_url:
return False, "资产缺少下载地址"
asset_id = asset.get("id")
if not asset_id:
return False, "资产缺少ID信息"
# 构建资产的API下载URL
download_url = f"https://api.github.com/repos/{user_repo}/releases/assets/{asset_id}"
except Exception as e:
logger.error(f"解析 Release 信息失败:{e}")
return False, f"解析 Release 信息失败:{e}"
res = self.__request_with_fallback(download_url, headers=settings.REPO_GITHUB_HEADERS(repo=user_repo))
# 使用资产的API端点下载需要设置Accept头为application/octet-stream
headers = settings.REPO_GITHUB_HEADERS(repo=user_repo).copy()
headers["Accept"] = "application/octet-stream"
res = self.__request_with_fallback(download_url, headers=headers, is_api=True)
if res is None or res.status_code != 200:
return False, f"下载资产失败:{res.status_code if res else '连接失败'}"
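Switching from `browser_download_url` to the API asset endpoint requires the `Accept: application/octet-stream` header, otherwise GitHub returns the asset's JSON metadata instead of the binary. A sketch of just the URL/header construction (no network call; the helper name is hypothetical):

```python
from typing import Dict, Tuple

def asset_download_request(user_repo: str, asset_id: int,
                           base_headers: Dict[str, str]) -> Tuple[str, Dict[str, str]]:
    """Return (url, headers) for fetching a release asset binary via the
    GitHub API. Copies the headers so the shared dict stays untouched."""
    url = f"https://api.github.com/repos/{user_repo}/releases/assets/{asset_id}"
    headers = dict(base_headers)
    headers["Accept"] = "application/octet-stream"  # binary, not JSON metadata
    return url, headers
```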
@@ -909,23 +940,23 @@ class PluginHelper(metaclass=WeakSingleton):
:param package_version: 首选插件版本 (如 "v2", "v3"),如果不指定则获取 v1 版本
:param force: 是否强制刷新,忽略缓存
"""
# 异步版本直接调用不带缓存的版本(缓存在异步环境下可能有并发问题)
if force:
return await self._async_get_plugins_uncached(repo_url, package_version)
return await self._async_get_plugins_cached(repo_url, package_version)
return await self._async_request_plugins(repo_url, package_version)
else:
return await self._async_request_plugins_cached(repo_url, package_version)
@cached(maxsize=64, ttl=1800)
async def _async_get_plugins_cached(self, repo_url: str,
package_version: Optional[str] = None) -> Optional[Dict[str, dict]]:
@cached(maxsize=128, ttl=1800)
async def _async_request_plugins_cached(self, repo_url: str,
package_version: Optional[str] = None) -> Optional[Dict[str, dict]]:
"""
获取Github所有最新插件列表使用缓存
:param repo_url: Github仓库地址
:param package_version: 首选插件版本 (如 "v2", "v3"),如果不指定则获取 v1 版本
"""
return await self._async_get_plugins_uncached(repo_url, package_version)
return await self._async_request_plugins(repo_url, package_version)
async def _async_get_plugins_uncached(self, repo_url: str,
package_version: Optional[str] = None) -> Optional[Dict[str, dict]]:
async def _async_request_plugins(self, repo_url: str,
package_version: Optional[str] = None) -> Optional[Dict[str, dict]]:
"""
异步获取Github所有最新插件列表不使用缓存
:param repo_url: Github仓库地址
@@ -966,7 +997,7 @@ class PluginHelper(metaclass=WeakSingleton):
return res.json()
return {}
async def async_install_reg(self, pid: str) -> bool:
async def async_install_reg(self, pid: str, repo_url: Optional[str] = None) -> bool:
"""
异步安装插件统计
"""
@@ -975,24 +1006,39 @@ class PluginHelper(metaclass=WeakSingleton):
if not pid:
return False
install_reg_url = self._install_reg.format(pid=pid)
res = await AsyncRequestUtils(proxies=settings.PROXY, timeout=5).get_res(install_reg_url)
res = await AsyncRequestUtils(
proxies=settings.PROXY,
content_type="application/json",
timeout=5
).post(install_reg_url, json={
"plugin_id": pid,
"repo_url": repo_url
})
if res and res.status_code == 200:
return True
return False
async def async_install_report(self) -> bool:
async def async_install_report(self, items: Optional[List[Tuple[str, Optional[str]]]] = None) -> bool:
"""
异步上报存量插件安装统计
异步上报存量插件安装统计(批量)。支持上送 repo_url。
:param items: 可选,形如 [(plugin_id, repo_url), ...];不传则回落到历史配置,仅上送 plugin_id。
"""
if not settings.PLUGIN_STATISTIC_SHARE:
return False
plugins = self.systemconfig.get(SystemConfigKey.UserInstalledPlugins)
if not plugins:
return False
payload_plugins = []
if items:
for pid, repo_url in items:
if pid:
payload_plugins.append({"plugin_id": pid, "repo_url": repo_url})
else:
plugins = self.systemconfig.get(SystemConfigKey.UserInstalledPlugins)
if not plugins:
return False
payload_plugins = [{"plugin_id": plugin, "repo_url": None} for plugin in plugins]
res = await AsyncRequestUtils(proxies=settings.PROXY,
content_type="application/json",
timeout=5).post(self._install_report,
json={"plugins": [{"plugin_id": plugin} for plugin in plugins]})
json={"plugins": payload_plugins})
return True if res else False
async def __async_get_file_list(self, pid: str, user_repo: str, package_version: Optional[str] = None) -> \
@@ -1007,7 +1053,7 @@ class PluginHelper(metaclass=WeakSingleton):
# 如果 package_version 存在(如 "v2"),则加上版本号
if package_version:
file_api += f".{package_version}"
file_api += f"/{pid}"
file_api += f"/{pid.lower()}"
res = await self.__async_request_with_fallback(file_api,
headers=settings.REPO_GITHUB_HEADERS(repo=user_repo),
@@ -1119,8 +1165,8 @@ class PluginHelper(metaclass=WeakSingleton):
:param pid: 插件 ID
:return: 备份目录路径
"""
plugin_dir = AsyncPath(PLUGIN_DIR) / pid
backup_dir = AsyncPath(settings.TEMP_PATH) / "plugin_backup" / pid
plugin_dir = AsyncPath(PLUGIN_DIR) / pid.lower()
backup_dir = AsyncPath(settings.TEMP_PATH) / "plugin_backup" / pid.lower()
if await plugin_dir.exists():
# 备份时清理已有的备份目录,防止残留文件影响
@@ -1140,7 +1186,7 @@ class PluginHelper(metaclass=WeakSingleton):
:param pid: 插件 ID
:param backup_dir: 备份目录路径
"""
plugin_dir = AsyncPath(PLUGIN_DIR) / pid
plugin_dir = AsyncPath(PLUGIN_DIR) / pid.lower()
if await plugin_dir.exists():
await aioshutil.rmtree(plugin_dir, ignore_errors=True)
logger.debug(f"{pid} 已清理插件目录 {plugin_dir}")
@@ -1158,7 +1204,7 @@ class PluginHelper(metaclass=WeakSingleton):
异步删除旧插件
:param pid: 插件 ID
"""
plugin_dir = AsyncPath(PLUGIN_DIR) / pid
plugin_dir = AsyncPath(PLUGIN_DIR) / pid.lower()
if await plugin_dir.exists():
await aioshutil.rmtree(plugin_dir, ignore_errors=True)
@@ -1400,16 +1446,16 @@ class PluginHelper(metaclass=WeakSingleton):
# 使用 release 进行安装
async def prepare_release() -> Tuple[bool, str]:
return await self.__async_install_from_release(
pid.lower(), user_repo, release_tag
pid, user_repo, release_tag
)
return await self.__install_flow_async(pid.lower(), force_install, prepare_release)
return await self.__install_flow_async(pid, force_install, prepare_release, repo_url)
else:
# 如果没有 release_tag则使用文件列表安装方式
async def prepare_filelist() -> Tuple[bool, str]:
return await self.__prepare_content_via_filelist_async(pid.lower(), user_repo, package_version)
return await self.__prepare_content_via_filelist_async(pid, user_repo, package_version)
return await self.__install_flow_async(pid.lower(), force_install, prepare_filelist)
return await self.__install_flow_async(pid, force_install, prepare_filelist, repo_url)
async def __async_get_plugin_meta(self, pid: str, repo_url: str,
package_version: Optional[str]) -> dict:
@@ -1424,78 +1470,79 @@ class PluginHelper(metaclass=WeakSingleton):
logger.warn(f"获取插件 {pid} 元数据失败:{e}")
return {}
async def __install_flow_async(self, pid_lower: str, force_install: bool,
prepare_content: Callable[[], Awaitable[Tuple[bool, str]]]) -> Tuple[bool, str]:
async def __install_flow_async(self, pid: str, force_install: bool,
prepare_content: Callable[[], Awaitable[Tuple[bool, str]]],
repo_url: Optional[str] = None) -> Tuple[bool, str]:
"""
异步安装流程,处理插件内容准备、依赖安装和注册
"""
backup_dir = None
if not force_install:
backup_dir = await self.__async_backup_plugin(pid_lower)
backup_dir = await self.__async_backup_plugin(pid)
await self.__async_remove_old_plugin(pid_lower)
await self.__async_remove_old_plugin(pid)
success, message = await prepare_content()
if not success:
logger.error(f"{pid_lower} 准备插件内容失败:{message}")
logger.error(f"{pid} 准备插件内容失败:{message}")
if backup_dir:
await self.__async_restore_plugin(pid_lower, backup_dir)
logger.warning(f"{pid_lower} 插件安装失败,已还原备份插件")
await self.__async_restore_plugin(pid, backup_dir)
logger.warning(f"{pid} 插件安装失败,已还原备份插件")
else:
await self.__async_remove_old_plugin(pid_lower)
logger.warning(f"{pid_lower} 已清理对应插件目录,请尝试重新安装")
await self.__async_remove_old_plugin(pid)
logger.warning(f"{pid} 已清理对应插件目录,请尝试重新安装")
return False, message
dependencies_exist, dep_ok, dep_msg = await self.__async_install_dependencies_if_required(pid_lower)
dependencies_exist, dep_ok, dep_msg = await self.__async_install_dependencies_if_required(pid)
if dependencies_exist and not dep_ok:
logger.error(f"{pid_lower} 依赖安装失败:{dep_msg}")
logger.error(f"{pid} 依赖安装失败:{dep_msg}")
if backup_dir:
await self.__async_restore_plugin(pid_lower, backup_dir)
logger.warning(f"{pid_lower} 插件安装失败,已还原备份插件")
await self.__async_restore_plugin(pid, backup_dir)
logger.warning(f"{pid} 插件安装失败,已还原备份插件")
else:
await self.__async_remove_old_plugin(pid_lower)
logger.warning(f"{pid_lower} 已清理对应插件目录,请尝试重新安装")
await self.__async_remove_old_plugin(pid)
logger.warning(f"{pid} 已清理对应插件目录,请尝试重新安装")
return False, dep_msg
await self.async_install_reg(pid_lower)
await self.async_install_reg(pid, repo_url)
return True, ""
def __prepare_content_via_filelist_sync(self, pid_lower: str, user_repo: str,
def __prepare_content_via_filelist_sync(self, pid: str, user_repo: str,
package_version: Optional[str]) -> Tuple[bool, str]:
"""
同步准备插件内容,通过文件列表获取插件文件和依赖
"""
file_list, msg = self.__get_file_list(pid_lower, user_repo, package_version)
file_list, msg = self.__get_file_list(pid, user_repo, package_version)
if not file_list:
return False, msg
requirements_file_info = next((f for f in file_list if f.get("name") == "requirements.txt"), None)
if requirements_file_info:
ok, m = self.__download_and_install_requirements(requirements_file_info, pid_lower, user_repo)
ok, m = self.__download_and_install_requirements(requirements_file_info, pid, user_repo)
if not ok:
logger.debug(f"{pid_lower} 依赖预安装失败:{m}")
logger.debug(f"{pid} 依赖预安装失败:{m}")
else:
logger.debug(f"{pid_lower} 依赖预安装成功")
ok, m = self.__download_files(pid_lower, file_list, user_repo, package_version, True)
logger.debug(f"{pid} 依赖预安装成功")
ok, m = self.__download_files(pid, file_list, user_repo, package_version, True)
if not ok:
return False, m
return True, ""
async def __prepare_content_via_filelist_async(self, pid_lower: str, user_repo: str,
async def __prepare_content_via_filelist_async(self, pid: str, user_repo: str,
package_version: Optional[str]) -> Tuple[bool, str]:
"""
异步准备插件内容,通过文件列表获取插件文件和依赖
"""
file_list, msg = await self.__async_get_file_list(pid_lower, user_repo, package_version)
file_list, msg = await self.__async_get_file_list(pid, user_repo, package_version)
if not file_list:
return False, msg
requirements_file_info = next((f for f in file_list if f.get("name") == "requirements.txt"), None)
if requirements_file_info:
ok, m = await self.__async_download_and_install_requirements(requirements_file_info, pid_lower, user_repo)
ok, m = await self.__async_download_and_install_requirements(requirements_file_info, pid, user_repo)
if not ok:
logger.debug(f"{pid_lower} 依赖预安装失败:{m}")
logger.debug(f"{pid} 依赖预安装失败:{m}")
else:
logger.debug(f"{pid_lower} 依赖预安装成功")
ok, m = await self.__async_download_files(pid_lower, file_list, user_repo, package_version, True)
logger.debug(f"{pid} 依赖预安装成功")
ok, m = await self.__async_download_files(pid, file_list, user_repo, package_version, True)
if not ok:
return False, m
return True, ""
@@ -1525,15 +1572,21 @@ class PluginHelper(metaclass=WeakSingleton):
asset = next((a for a in assets if a.get("name") == asset_name), None)
if not asset:
return False, f"未找到资产文件:{asset_name}"
download_url = asset.get("browser_download_url")
if not download_url:
return False, "资产缺少下载地址"
asset_id = asset.get("id")
if not asset_id:
return False, "资产缺少ID信息"
# 构建资产的API下载URL
download_url = f"https://api.github.com/repos/{user_repo}/releases/assets/{asset_id}"
except Exception as e:
logger.error(f"解析 Release 信息失败:{e}")
return False, f"解析 Release 信息失败:{e}"
# 使用资产的API端点下载需要设置Accept头为application/octet-stream
headers = settings.REPO_GITHUB_HEADERS(repo=user_repo).copy()
headers["Accept"] = "application/octet-stream"
res = await self.__async_request_with_fallback(download_url,
headers=settings.REPO_GITHUB_HEADERS(repo=user_repo))
headers=headers,
is_api=True)
if res is None or res.status_code != 200:
return False, f"下载资产失败:{res.status_code if res else '连接失败'}"
@@ -1571,87 +1624,3 @@ class PluginHelper(metaclass=WeakSingleton):
except Exception as e:
logger.error(f"解压 Release 压缩包失败:{e}")
return False, f"解压 Release 压缩包失败:{e}"
class PluginMemoryMonitor:
"""
插件内存监控器
"""
def __init__(self):
self._calculator = MemoryCalculator()
self._cache = {}
self._cache_ttl = 300 # 缓存5分钟
def get_plugin_memory_usage(self, plugin_id: str, plugin_instance: Any) -> Dict[str, Any]:
"""
获取插件内存使用情况
:param plugin_id: 插件ID
:param plugin_instance: 插件实例
:return: 内存使用信息
"""
# 检查缓存
if self._is_cache_valid(plugin_id):
return self._cache[plugin_id]
# 计算内存使用
memory_info = self._calculator.calculate_object_memory(plugin_instance)
# 添加插件信息
result = {
'plugin_id': plugin_id,
'plugin_name': getattr(plugin_instance, 'plugin_name', 'Unknown'),
'plugin_version': getattr(plugin_instance, 'plugin_version', 'Unknown'),
'timestamp': time.time(),
**memory_info
}
# 更新缓存
self._cache[plugin_id] = result
return result
def get_all_plugins_memory_usage(self, plugins: Dict[str, Any]) -> List[Dict[str, Any]]:
"""
获取所有插件的内存使用情况
:param plugins: 插件实例字典
:return: 内存使用信息列表
"""
results = []
for plugin_id, plugin_instance in plugins.items():
if plugin_instance:
try:
memory_info = self.get_plugin_memory_usage(plugin_id, plugin_instance)
results.append(memory_info)
except Exception as e:
logger.error(f"获取插件 {plugin_id} 内存使用情况失败:{str(e)}")
results.append({
'plugin_id': plugin_id,
'plugin_name': getattr(plugin_instance, 'plugin_name', 'Unknown'),
'error': str(e),
'total_memory_bytes': 0,
'total_memory_mb': 0,
'object_count': 0,
'calculation_time_ms': 0
})
# 按内存使用量排序
results.sort(key=lambda x: x.get('total_memory_bytes', 0), reverse=True)
return results
def _is_cache_valid(self, plugin_id: str) -> bool:
"""
检查缓存是否有效
"""
if plugin_id not in self._cache:
return False
return time.time() - self._cache[plugin_id]['timestamp'] < self._cache_ttl
def clear_cache(self, plugin_id: Optional[str] = None):
"""
清除缓存
:param plugin_id: 插件ID为空则清除所有缓存
"""
if plugin_id:
self._cache.pop(plugin_id, None)
else:
self._cache.clear()
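`PluginMemoryMonitor` treats a cached entry as valid while `now - timestamp < ttl`. The same pattern in isolation, as a minimal time-based cache (not the class above, just the validity check it uses):

```python
import time
from typing import Any, Dict, Optional

class TTLCache:
    """Minimal time-based cache: an entry is fresh while
    (now - timestamp) < ttl, otherwise lookups miss."""

    def __init__(self, ttl: float = 300):
        self._ttl = ttl
        self._data: Dict[Any, Dict[str, Any]] = {}

    def put(self, key: Any, value: Any) -> None:
        self._data[key] = {"value": value, "timestamp": time.time()}

    def get(self, key: Any) -> Optional[Any]:
        entry = self._data.get(key)
        if entry and time.time() - entry["timestamp"] < self._ttl:
            return entry["value"]
        return None  # missing or expired
```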


@@ -17,6 +17,11 @@ from app.utils.singleton import Singleton
_complex_serializable_types = set()
_simple_serializable_types = set()
# 默认连接参数
_socket_timeout = 30
_socket_connect_timeout = 5
_health_check_interval = 60
def serialize(value: Any) -> bytes:
"""
@@ -96,9 +101,9 @@ class RedisHelper(metaclass=Singleton):
self.client = redis.Redis.from_url(
self.redis_url,
decode_responses=False,
socket_timeout=30,
socket_connect_timeout=5,
health_check_interval=60,
socket_timeout=_socket_timeout,
socket_connect_timeout=_socket_connect_timeout,
health_check_interval=_health_check_interval,
)
# 测试连接确保Redis可用
self.client.ping()
@@ -253,10 +258,10 @@ class RedisHelper(metaclass=Singleton):
for key in self.client.scan_iter(redis_key):
pipe.delete(key)
pipe.execute()
logger.info(f"Cleared Redis cache for region: {region}")
logger.debug(f"Cleared Redis cache for region: {region}")
else:
self.client.flushdb()
logger.info("Cleared all Redis cache")
logger.info("All Redis cache cleared")
except Exception as e:
logger.error(f"Failed to clear cache, region: {region}, error: {e}")
@@ -317,10 +322,6 @@ class AsyncRedisHelper(metaclass=Singleton):
- 所有操作都是异步的
"""
# 类型缓存集合,针对非容器简单类型
_complex_serializable_types = set()
_simple_serializable_types = set()
def __init__(self):
"""
初始化异步Redis助手实例
@@ -337,9 +338,9 @@ class AsyncRedisHelper(metaclass=Singleton):
self.client = Redis.from_url(
self.redis_url,
decode_responses=False,
socket_timeout=30,
socket_connect_timeout=5,
health_check_interval=60,
socket_timeout=_socket_timeout,
socket_connect_timeout=_socket_connect_timeout,
health_check_interval=_health_check_interval,
)
# 测试连接确保Redis可用
await self.client.ping()
@@ -495,7 +496,7 @@ class AsyncRedisHelper(metaclass=Singleton):
async for key in self.client.scan_iter(redis_key):
await pipe.delete(key)
await pipe.execute()
logger.info(f"Cleared Redis cache for region (async): {region}")
logger.debug(f"Cleared Redis cache for region (async): {region}")
else:
await self.client.flushdb()
logger.info("Cleared all Redis cache (async)")


@@ -8,7 +8,6 @@ from app.log import logger
from app.utils.http import RequestUtils
from app.utils.string import StringUtils
from app.utils.system import SystemUtils
from version import APP_VERSION
class ResourceHelper:
@@ -59,12 +58,6 @@ class ResourceHelper:
if rtype == "auth":
# 站点认证资源
local_version = SitesHelper().auth_version
# 阻断站点认证资源v2.3.0以下的版本直接更新,避免无限重启
if StringUtils.compare_version(local_version, "<", "2.3.0"):
continue
# 阻断主程序版本v2.6.3以下的版本直接更新,避免搜索异常
if StringUtils.compare_version(APP_VERSION, "<", "2.6.3"):
continue
elif rtype == "sites":
# 站点索引资源
local_version = SitesHelper().indexer_version


@@ -384,6 +384,9 @@ class RssHelper:
pubdate = ""
if pubdate_nodes and pubdate_nodes[0].text:
pubdate = StringUtils.get_time(pubdate_nodes[0].text)
if pubdate is not None:
# 转为本地时区
pubdate = pubdate.astimezone(tz=None)
# 获取豆瓣昵称
nickname_nodes = item.xpath('.//*[local-name()="creator"]')
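The RSS fix above converts an aware `pubdate` to the local timezone via `astimezone(tz=None)`. A standalone sketch of parsing an RFC 2822 pubDate and applying that conversion (the helper name is hypothetical; the project's `StringUtils.get_time` is not reproduced here):

```python
from datetime import datetime
from email.utils import parsedate_to_datetime
from typing import Optional

def parse_pubdate_local(text: str) -> Optional[datetime]:
    """Parse an RFC 2822 pubDate and convert it to the local timezone;
    returns None when the date cannot be parsed."""
    try:
        dt = parsedate_to_datetime(text)
    except (TypeError, ValueError):  # unparseable input (varies by Python version)
        return None
    if dt is not None:
        dt = dt.astimezone(tz=None)  # 转为本地时区
    return dt
```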


@@ -131,7 +131,9 @@ class SubscribeHelper(metaclass=WeakSingleton):
return []
@cached(region=_shares_cache_region, maxsize=5, ttl=1800, skip_empty=True)
def get_statistic(self, stype: str, page: Optional[int] = 1, count: Optional[int] = 30) -> List[dict]:
def get_statistic(self, stype: str, page: Optional[int] = 1, count: Optional[int] = 30,
genre_id: Optional[int] = None, min_rating: Optional[float] = None,
max_rating: Optional[float] = None, sort_type: Optional[str] = None) -> List[dict]:
"""
获取订阅统计数据
"""
@@ -139,16 +141,30 @@ class SubscribeHelper(metaclass=WeakSingleton):
if not enabled:
return []
res = RequestUtils(proxies=settings.PROXY, timeout=15).get_res(self._sub_statistic, params={
params = {
"stype": stype,
"page": page,
"count": count
})
}
# 添加可选参数
if genre_id is not None:
params["genre_id"] = genre_id
if min_rating is not None:
params["min_rating"] = min_rating
if max_rating is not None:
params["max_rating"] = max_rating
if sort_type is not None:
params["sort_type"] = sort_type
res = RequestUtils(proxies=settings.PROXY, timeout=15).get_res(self._sub_statistic, params=params)
return self._handle_list_response(res)
@cached(region=_shares_cache_region, maxsize=5, ttl=1800, skip_empty=True)
async def async_get_statistic(self, stype: str, page: Optional[int] = 1, count: Optional[int] = 30) -> List[dict]:
async def async_get_statistic(self, stype: str, page: Optional[int] = 1, count: Optional[int] = 30,
genre_id: Optional[int] = None, min_rating: Optional[float] = None,
max_rating: Optional[float] = None, sort_type: Optional[str] = None) -> List[dict]:
"""
异步获取订阅统计数据
"""
@@ -156,11 +172,23 @@ class SubscribeHelper(metaclass=WeakSingleton):
if not enabled:
return []
res = await AsyncRequestUtils(proxies=settings.PROXY, timeout=15).get_res(self._sub_statistic, params={
params = {
"stype": stype,
"page": page,
"count": count
})
}
# 添加可选参数
if genre_id is not None:
params["genre_id"] = genre_id
if min_rating is not None:
params["min_rating"] = min_rating
if max_rating is not None:
params["max_rating"] = max_rating
if sort_type is not None:
params["sort_type"] = sort_type
res = await AsyncRequestUtils(proxies=settings.PROXY, timeout=15).get_res(self._sub_statistic, params=params)
return self._handle_list_response(res)
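All four statistic/share endpoints repeat the same pattern: start from the base query params and add each optional filter only when it is not `None`, so server-side defaults still apply. The repetition can be sketched as one helper (hypothetical, not in the codebase):

```python
from typing import Any, Dict

def with_optional_params(base: Dict[str, Any], **optional: Any) -> Dict[str, Any]:
    """Merge optional query parameters, skipping any left as None so the
    server-side defaults still apply."""
    params = dict(base)
    params.update({k: v for k, v in optional.items() if v is not None})
    return params
```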
@@ -358,7 +386,9 @@ class SubscribeHelper(metaclass=WeakSingleton):
return self._handle_response(res, clear_cache=False)
@cached(region=_shares_cache_region, maxsize=1, ttl=1800, skip_empty=True)
def get_shares(self, name: Optional[str] = None, page: Optional[int] = 1, count: Optional[int] = 30) -> List[dict]:
def get_shares(self, name: Optional[str] = None, page: Optional[int] = 1, count: Optional[int] = 30,
genre_id: Optional[int] = None, min_rating: Optional[float] = None,
max_rating: Optional[float] = None, sort_type: Optional[str] = None) -> List[dict]:
"""
获取订阅分享数据
"""
@@ -366,17 +396,30 @@ class SubscribeHelper(metaclass=WeakSingleton):
if not enabled:
return []
res = RequestUtils(proxies=settings.PROXY, timeout=15).get_res(self._sub_shares, params={
params = {
"name": name,
"page": page,
"count": count
})
}
# 添加可选参数
if genre_id is not None:
params["genre_id"] = genre_id
if min_rating is not None:
params["min_rating"] = min_rating
if max_rating is not None:
params["max_rating"] = max_rating
if sort_type is not None:
params["sort_type"] = sort_type
res = RequestUtils(proxies=settings.PROXY, timeout=15).get_res(self._sub_shares, params=params)
return self._handle_list_response(res)
@cached(region=_shares_cache_region, maxsize=1, ttl=1800, skip_empty=True)
async def async_get_shares(self, name: Optional[str] = None, page: Optional[int] = 1, count: Optional[int] = 30) -> \
List[dict]:
async def async_get_shares(self, name: Optional[str] = None, page: Optional[int] = 1, count: Optional[int] = 30,
genre_id: Optional[int] = None, min_rating: Optional[float] = None,
max_rating: Optional[float] = None, sort_type: Optional[str] = None) -> List[dict]:
"""
异步获取订阅分享数据
"""
@@ -384,11 +427,23 @@ class SubscribeHelper(metaclass=WeakSingleton):
if not enabled:
return []
res = await AsyncRequestUtils(proxies=settings.PROXY, timeout=15).get_res(self._sub_shares, params={
params = {
"name": name,
"page": page,
"count": count
})
}
# 添加可选参数
if genre_id is not None:
params["genre_id"] = genre_id
if min_rating is not None:
params["min_rating"] = min_rating
if max_rating is not None:
params["max_rating"] = max_rating
if sort_type is not None:
params["sort_type"] = sort_type
res = await AsyncRequestUtils(proxies=settings.PROXY, timeout=15).get_res(self._sub_shares, params=params)
return self._handle_list_response(res)


@@ -154,6 +154,7 @@ class DoubanApi(metaclass=WeakSingleton):
_api_url = "https://api.douban.com/v2"
def __init__(self):
self.__clear_async_cache__ = False
self._session = requests.Session()
@classmethod
@@ -171,28 +172,24 @@ class DoubanApi(metaclass=WeakSingleton):
).digest()
).decode()
@cached(maxsize=settings.CONF.douban, ttl=settings.CONF.meta)
def __invoke_recommend(self, url: str, **kwargs) -> dict:
"""
推荐/发现类API
"""
return self.__invoke(url, **kwargs)
@cached(maxsize=settings.CONF.douban, ttl=settings.CONF.meta)
async def __async_invoke_recommend(self, url: str, **kwargs) -> dict:
"""
推荐/发现类API异步版本
"""
return await self.__async_invoke(url, **kwargs)
@cached(maxsize=settings.CONF.douban, ttl=settings.CONF.meta)
def __invoke_search(self, url: str, **kwargs) -> dict:
"""
搜索类API
"""
return self.__invoke(url, **kwargs)
@cached(maxsize=settings.CONF.douban, ttl=settings.CONF.meta)
async def __async_invoke_search(self, url: str, **kwargs) -> dict:
"""
搜索类API异步版本
@@ -226,11 +223,9 @@ class DoubanApi(metaclass=WeakSingleton):
"""
处理HTTP响应
"""
if resp is not None and resp.status_code == 400 and "rate_limit" in resp.text:
return resp.json()
return resp.json() if resp else {}
return resp.json() if resp is not None else None
@cached(maxsize=settings.CONF.douban, ttl=settings.CONF.meta)
@cached(maxsize=settings.CONF.douban, ttl=settings.CONF.meta, skip_none=True)
def __invoke(self, url: str, **kwargs) -> dict:
"""
GET请求
@@ -242,11 +237,14 @@ class DoubanApi(metaclass=WeakSingleton):
).get_res(url=req_url, params=params)
return self._handle_response(resp)
@cached(maxsize=settings.CONF.douban, ttl=settings.CONF.meta)
@cached(maxsize=settings.CONF.douban, ttl=settings.CONF.meta, skip_none=True)
async def __async_invoke(self, url: str, **kwargs) -> dict:
"""
GET请求异步版本
"""
if self.__clear_async_cache__:
self.__clear_async_cache__ = False
await self.__async_invoke.cache_clear()
req_url, params = self._prepare_get_request(url, **kwargs)
resp = await AsyncRequestUtils(
ua=choice(self._user_agents)
@@ -265,7 +263,7 @@ class DoubanApi(metaclass=WeakSingleton):
params.pop('_ts')
return req_url, params
@cached(maxsize=settings.CONF.douban, ttl=settings.CONF.meta)
@cached(maxsize=settings.CONF.douban, ttl=settings.CONF.meta, skip_none=True)
def __post(self, url: str, **kwargs) -> dict:
"""
POST请求
@@ -287,7 +285,7 @@ class DoubanApi(metaclass=WeakSingleton):
).post_res(url=req_url, data=params)
return self._handle_response(resp)
@cached(maxsize=settings.CONF.douban, ttl=settings.CONF.meta)
@cached(maxsize=settings.CONF.douban, ttl=settings.CONF.meta, skip_none=True)
async def __async_post(self, url: str, **kwargs) -> dict:
"""
POST请求异步版本
@@ -866,8 +864,8 @@ class DoubanApi(metaclass=WeakSingleton):
"""
清空LRU缓存
"""
# 尚未支持缓存清理
pass
self.__invoke.cache_clear()
self.__clear_async_cache__ = True
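The `__clear_async_cache__` flag exists because the synchronous `clear_caches` cannot await the async cache's clear; instead it sets a flag that the next async invocation consumes before serving a result. A simplified sketch of that deferred-clear pattern (class and cache internals are illustrative, not the DoubanApi implementation):

```python
import asyncio
from functools import lru_cache

class ApiClient:
    """Deferred-clear pattern: sync clear_cache() clears the sync LRU
    immediately and flags the async cache for clearing on next use."""

    def __init__(self):
        self._clear_async_cache = False
        self._async_cache = {}

    @lru_cache(maxsize=None)
    def _invoke(self, url: str) -> str:
        return f"sync:{url}"

    async def _async_invoke(self, url: str) -> str:
        if self._clear_async_cache:
            # Consume the flag set by the sync side
            self._clear_async_cache = False
            self._async_cache.clear()
        if url not in self._async_cache:
            self._async_cache[url] = f"async:{url}"
        return self._async_cache[url]

    def clear_cache(self) -> None:
        self._invoke.cache_clear()
        self._clear_async_cache = True
```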
def close(self):
if self._session:


@@ -10,10 +10,10 @@ from requests import Response
from app import schemas
from app.core.config import settings
from app.log import logger
from app.schemas import MediaServerItem
from app.schemas.types import MediaType
from app.utils.http import RequestUtils
from app.utils.url import UrlUtils
from app.schemas import MediaServerItem
class Emby:
@@ -22,9 +22,10 @@ class Emby:
_apikey: Optional[str] = None
_sync_libraries: List[str] = []
user: Optional[Union[str, int]] = None
_username: Optional[str] = None
def __init__(self, host: Optional[str] = None, apikey: Optional[str] = None, play_host: Optional[str] = None,
sync_libraries: list = None, **kwargs):
username: Optional[str] = None, sync_libraries: list = None, **kwargs):
if not host or not apikey:
logger.error("Emby服务器配置不完整")
return
@@ -35,7 +36,8 @@ class Emby:
if self._playhost:
self._playhost = UrlUtils.standardize_base_url(self._playhost)
self._apikey = apikey
self.user = self.get_user(settings.SUPERUSER)
self._username = username
self.user = self.get_user(username or settings.SUPERUSER)
self.folders = self.get_emby_folders()
self.serverid = self.get_server_id()
self._sync_libraries = sync_libraries or []
@@ -139,7 +141,8 @@ class Emby:
logger.error(f"连接User/Views 出错:" + str(e))
return []
def get_librarys(self, username: Optional[str] = None, hidden: Optional[bool] = False) -> List[schemas.MediaServerLibrary]:
def get_librarys(self, username: Optional[str] = None, hidden: Optional[bool] = False) -> List[
schemas.MediaServerLibrary]:
"""
获取媒体服务器所有媒体库列表
"""
@@ -567,6 +570,7 @@ class Emby:
if library_id != "/":
return self.__refresh_emby_library_by_id(library_id)
logger.info(f"Emby媒体库刷新完成")
return True
def __get_emby_library_id_by_item(self, item: schemas.RefreshMediaItem) -> Optional[str]:
"""
@@ -706,9 +710,9 @@ class Emby:
yield items
elif item.get("Type") in ["Movie", "Series"]:
yield self.__format_item_info(item)
except Exception as e:
logger.error(f"连接Users/Items出错" + str(e))
return None
def get_webhook_message(self, form: any, args: dict) -> Optional[schemas.WebhookEventInfo]:
"""
@@ -1109,7 +1113,8 @@ class Emby:
return ""
return "%sItems/%s/Images/Primary" % (self._host, item_id)
def get_resume(self, num: Optional[int] = 12, username: Optional[str] = None) -> Optional[List[schemas.MediaServerPlayItem]]:
def get_resume(self, num: Optional[int] = 12, username: Optional[str] = None) -> Optional[
List[schemas.MediaServerPlayItem]]:
"""
获得继续观看
"""
@@ -1178,7 +1183,8 @@ class Emby:
logger.error(f"连接Users/Items/Resume出错" + str(e))
return []
def get_latest(self, num: Optional[int] = 20, username: Optional[str] = None) -> Optional[List[schemas.MediaServerPlayItem]]:
def get_latest(self, num: Optional[int] = 20, username: Optional[str] = None) -> Optional[
List[schemas.MediaServerPlayItem]]:
"""
获得最近更新
"""

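The Emby constructor change above threads an optional `username` through and falls back to the configured superuser. A small sketch of that fallback (`SUPERUSER` and `_get_user_id` are stand-ins for `settings.SUPERUSER` and the real Users API lookup):

```python
from typing import Optional

SUPERUSER = "admin"  # stand-in for settings.SUPERUSER

class MediaServerClient:
    """Sketch of the optional-username constructor parameter."""

    def __init__(self, username: Optional[str] = None):
        self._username = username
        # Fall back to the superuser account when no username is supplied
        self.user = self._get_user_id(username or SUPERUSER)

    @staticmethod
    def _get_user_id(name: str) -> str:
        # The real client queries the media server's Users endpoint here
        return f"id-of-{name}"
```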
View File

@@ -15,7 +15,7 @@ def transfer_process(path: str) -> Callable[[int | float], None]:
"""
传输进度回调
"""
pbar = tqdm(total=100, desc="整理进度", unit="%")
pbar = tqdm(total=100, desc="进度", unit="%")
progress = ProgressHelper(HashUtils.md5(path))
progress.start()
@@ -23,7 +23,7 @@ def transfer_process(path: str) -> Callable[[int | float], None]:
"""
更新进度百分比
"""
percent_value = int(percent)
percent_value = round(percent, 2) if isinstance(percent, float) else percent
pbar.n = percent_value
# 更新进度
pbar.refresh()

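The progress-callback change above stops truncating float percentages to `int`. A sketch of the factory-plus-closure shape (the `tqdm` bar is omitted; `state` is exposed only so the sketch is inspectable):

```python
from typing import Callable, Union

def transfer_progress(path: str) -> Callable[[Union[int, float]], None]:
    """Sketch of the progress-callback factory: the returned closure keeps
    float percentages at two decimals instead of truncating them to int."""
    state = {"path": path, "percent": 0.0}

    def update(percent: Union[int, float]) -> None:
        # round() preserves fractional progress (e.g. 99.97%) that int() would drop
        value = round(percent, 2) if isinstance(percent, float) else percent
        state["percent"] = value

    update.state = state  # exposed for inspection in this sketch
    return update
```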
View File

@@ -14,6 +14,7 @@ from app.log import logger
from app.modules.filemanager import StorageBase
from app.modules.filemanager.storages import transfer_process
from app.schemas.types import StorageSchema
from app.utils.http import RequestUtils
from app.utils.singleton import WeakSingleton
from app.utils.string import StringUtils
@@ -251,10 +252,18 @@ class AliPan(StorageBase, metaclass=WeakSingleton):
# 检查会话
self._check_session()
resp = self.session.request(
method, f"{self.base_url}{endpoint}",
**kwargs
)
# 错误日志控制
no_error_log = kwargs.pop("no_error_log", False)
try:
resp = self.session.request(
method, f"{self.base_url}{endpoint}",
**kwargs
)
except requests.exceptions.RequestException as e:
logger.error(f"【阿里云盘】{method} 请求 {endpoint} 网络错误: {str(e)}")
return None
if resp is None:
logger.warn(f"【阿里云盘】{method} 请求 {endpoint} 失败!")
return None
@@ -268,7 +277,8 @@ class AliPan(StorageBase, metaclass=WeakSingleton):
# 返回数据
ret_data = resp.json()
if ret_data.get("code"):
logger.warn(f"【阿里云盘】{method} {endpoint} 返回:{ret_data.get('code')} {ret_data.get('message')}")
if not no_error_log:
logger.warn(f"【阿里云盘】{method} {endpoint} 返回:{ret_data.get('code')} {ret_data.get('message')}")
if result_key:
return ret_data.get(result_key)
@@ -592,7 +602,7 @@ class AliPan(StorageBase, metaclass=WeakSingleton):
file_size = local_path.stat().st_size
# 1. 创建文件并检查秒传
chunk_size = 100 * 1024 * 1024 # 分片大小 100M
chunk_size = 10 * 1024 * 1024 # 分片大小 10M
create_res = self._create_file(drive_id=target_dir.drive_id,
parent_file_id=target_dir.fileid,
file_name=target_name,
@@ -724,7 +734,25 @@ class AliPan(StorageBase, metaclass=WeakSingleton):
progress_callback = transfer_process(Path(fileitem.path).as_posix())
try:
with requests.get(download_url, stream=True) as r:
# 构建请求头,包含必要的认证信息
headers = {
"User-Agent": settings.NORMAL_USER_AGENT,
"Referer": "https://www.aliyundrive.com/",
"Accept": "*/*",
"Accept-Language": "zh-CN,zh;q=0.9,en;q=0.8",
"Accept-Encoding": "gzip, deflate, br",
"Connection": "keep-alive",
"Sec-Fetch-Dest": "empty",
"Sec-Fetch-Mode": "cors",
"Sec-Fetch-Site": "cross-site"
}
# 如果有access_token添加到请求头
if self.access_token:
headers["Authorization"] = f"Bearer {self.access_token}"
request_utils = RequestUtils(headers=headers)
with request_utils.get_stream(download_url, raise_exception=True) as r:
r.raise_for_status()
downloaded_size = 0
with open(local_path, "wb") as f:
@@ -743,22 +771,13 @@ class AliPan(StorageBase, metaclass=WeakSingleton):
# 完成下载
progress_callback(100)
logger.info(f"【阿里云盘】下载完成: {fileitem.name}")
except requests.exceptions.RequestException as e:
logger.error(f"【阿里云盘】下载网络错误: {fileitem.name} - {str(e)}")
# 删除可能部分下载的文件
if local_path.exists():
local_path.unlink()
return None
return local_path
except Exception as e:
logger.error(f"【阿里云盘】下载失败: {fileitem.name} - {str(e)}")
# 删除可能部分下载的文件
if local_path.exists():
local_path.unlink()
return None
return local_path
def check(self) -> bool:
return self.access_token is not None
@@ -810,7 +829,8 @@ class AliPan(StorageBase, metaclass=WeakSingleton):
json={
"drive_id": drive_id or self._default_drive_id,
"file_path": path.as_posix()
}
},
no_error_log=True
)
if not resp:
return None

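The AliPan hunks above add two things to the request wrapper: a `no_error_log` flag (popped from `kwargs` before the HTTP call, so callers like the existence probe can stay quiet) and a catch for network-level exceptions. A self-contained sketch of that shape, with a fake session standing in for `requests.Session`:

```python
import logging
from typing import Optional

logger = logging.getLogger("pan")

class FakeSession:
    """Stand-in for requests.Session in this sketch; records what it received."""
    def __init__(self):
        self.last_kwargs = None

    def request(self, method: str, url: str, **kwargs) -> dict:
        self.last_kwargs = kwargs
        return {"code": "NotFound", "message": "no such file"}

class PanClient:
    base_url = "https://example.invalid/api"

    def __init__(self):
        self.session = FakeSession()

    def _request_api(self, method: str, endpoint: str, **kwargs) -> Optional[dict]:
        # Pop the control flag BEFORE forwarding kwargs, so the HTTP layer never sees it
        no_error_log = kwargs.pop("no_error_log", False)
        try:
            resp = self.session.request(method, f"{self.base_url}{endpoint}", **kwargs)
        except Exception as e:
            logger.error("%s %s network error: %s", method, endpoint, e)
            return None
        if resp.get("code"):
            if not no_error_log:  # expected misses (existence probes) stay quiet
                logger.warning("%s %s returned %s", method, endpoint, resp["message"])
            return None
        return resp
```

Popping the flag first matters: otherwise `session.request()` would reject the unknown keyword.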
View File

@@ -4,8 +4,6 @@ from datetime import datetime
from pathlib import Path
from typing import Optional, List
import requests
from app import schemas
from app.core.cache import cached
from app.core.config import settings, global_vars
@@ -569,18 +567,22 @@ class Alist(StorageBase, metaclass=WeakSingleton):
else:
local_path = path / fileitem.name
with requests.get(download_url, headers=self.__get_header_with_token(), stream=True) as r:
r.raise_for_status()
with open(local_path, "wb") as f:
for chunk in r.iter_content(chunk_size=8192):
if global_vars.is_transfer_stopped(fileitem.path):
logger.info(f"【OpenList】{fileitem.path} 下载已取消!")
return None
f.write(chunk)
request_utils = RequestUtils(headers=self.__get_header_with_token())
try:
with request_utils.get_stream(download_url, raise_exception=True) as r:
r.raise_for_status()
with open(local_path, "wb") as f:
for chunk in r.iter_content(chunk_size=8192):
if global_vars.is_transfer_stopped(fileitem.path):
logger.info(f"【OpenList】{fileitem.path} 下载已取消!")
return None
f.write(chunk)
except Exception as e:
logger.error(f"【OpenList】下载文件 {fileitem.path} 失败:{e}")
if local_path.exists():
return local_path
return None
return local_path
def upload(
self, fileitem: schemas.FileItem, path: Path, new_name: Optional[str] = None, task: bool = False
@@ -615,6 +617,9 @@ class Alist(StorageBase, metaclass=WeakSingleton):
self.uploaded_size = 0
self.file_size = file_path.stat().st_size
def __len__(self) -> int:
return self.file_size
def read(self, size=-1):
if global_vars.is_transfer_stopped(path.as_posix()):
logger.info(f"【OpenList】{path} 上传已取消!")

View File

@@ -26,8 +26,8 @@ class LocalStorage(StorageBase):
"softlink": "软链接"
}
# 文件块大小默认100MB
chunk_size = 100 * 1024 * 1024
# 文件块大小默认10MB
chunk_size = 10 * 1024 * 1024
def init_storage(self):
"""
@@ -246,6 +246,17 @@ class LocalStorage(StorageBase):
logger.error(f"【本地】移动文件失败:{err}")
return None
@staticmethod
def __should_show_progress(src: Path, dest: Path):
"""
是否显示进度条
"""
src_isnetwork = SystemUtils.is_network_filesystem(src)
dest_isnetwork = SystemUtils.is_network_filesystem(dest)
if src_isnetwork and dest_isnetwork and SystemUtils.is_same_disk(src, dest):
return True
return False
def copy(
self,
fileitem: schemas.FileItem,
@@ -258,8 +269,15 @@ class LocalStorage(StorageBase):
try:
src = Path(fileitem.path)
dest = path / new_name
if self._copy_with_progress(src, dest):
return True
if self.__should_show_progress(src, dest):
if self._copy_with_progress(src, dest):
return True
else:
code, message = SystemUtils.copy(src, dest)
if code == 0:
return True
else:
logger.error(f"【本地】复制文件失败:{message}")
except Exception as err:
logger.error(f"【本地】复制文件失败:{err}")
return False
@@ -276,10 +294,20 @@ class LocalStorage(StorageBase):
try:
src = Path(fileitem.path)
dest = path / new_name
if self._copy_with_progress(src, dest):
# 复制成功删除源文件
src.unlink()
if src == dest:
# 目标和源文件相同,直接返回成功,不做任何操作
return True
if self.__should_show_progress(src, dest):
if self._copy_with_progress(src, dest):
# 复制成功删除源文件
src.unlink()
return True
else:
code, message = SystemUtils.move(src, dest)
if code == 0:
return True
else:
logger.error(f"【本地】移动文件失败:{message}")
except Exception as err:
logger.error(f"【本地】移动文件失败:{err}")
return False

View File

@@ -39,8 +39,8 @@ class SMB(StorageBase, metaclass=WeakSingleton):
"copy": "复制",
}
# 文件块大小默认100MB
chunk_size = 100 * 1024 * 1024
# 文件块大小默认10MB
chunk_size = 10 * 1024 * 1024
def __init__(self):
super().__init__()
@@ -49,6 +49,7 @@ class SMB(StorageBase, metaclass=WeakSingleton):
self._host = None
self._username = None
self._password = None
self._init_connection()
def _init_connection(self):
@@ -380,19 +381,95 @@ class SMB(StorageBase, metaclass=WeakSingleton):
self._check_connection()
smb_path = self._normalize_path(fileitem.path.rstrip("/"))
logger.info(f"【SMB】开始删除: {fileitem.path} (类型: {fileitem.type})")
# 先检查路径是否存在
if not smbclient.path.exists(smb_path):
logger.warn(f"【SMB】路径不存在跳过删除: {fileitem.path}")
return True
if fileitem.type == "dir":
# 删除目录
smbclient.rmdir(smb_path)
# 递归删除目录及其内容
logger.debug(f"【SMB】递归删除目录: {smb_path}")
self._recursive_delete(smb_path)
else:
# 删除文件
logger.debug(f"【SMB】删除文件: {smb_path}")
smbclient.remove(smb_path)
logger.info(f"【SMB】删除成功: {fileitem.path}")
return True
except Exception as e:
logger.error(f"【SMB】删除失败: {e}")
except SMBConnectionError as e:
logger.error(f"【SMB】删除失败 - 连接错误: {fileitem.path} - {e}")
return False
except SMBResponseException as e:
logger.error(f"【SMB】删除失败 - SMB响应错误: {fileitem.path} - {e}")
return False
except SMBException as e:
logger.error(f"【SMB】删除失败 - SMB错误: {fileitem.path} - {e}")
return False
except Exception as e:
logger.error(f"【SMB】删除失败 - 未知错误: {fileitem.path} - {e}")
return False
def _recursive_delete(self, smb_path: str):
"""
递归删除目录及其所有内容
"""
try:
# 检查路径是否存在
if not smbclient.path.exists(smb_path):
logger.debug(f"【SMB】路径不存在跳过删除: {smb_path}")
return
# 如果是文件,直接删除
if smbclient.path.isfile(smb_path):
logger.debug(f"【SMB】删除文件: {smb_path}")
smbclient.remove(smb_path)
return
# 如果是目录,先删除其内容
if smbclient.path.isdir(smb_path):
logger.debug(f"【SMB】开始删除目录内容: {smb_path}")
try:
# 列出目录内容
entries = smbclient.listdir(smb_path)
logger.debug(f"【SMB】目录 {smb_path} 包含 {len(entries)} 个项目")
for entry in entries:
if entry in [".", ".."]:
continue
entry_path = f"{smb_path}\\{entry}"
logger.debug(f"【SMB】递归删除子项: {entry_path}")
# 递归删除子项
self._recursive_delete(entry_path)
# 删除空目录
logger.debug(f"【SMB】删除空目录: {smb_path}")
smbclient.rmdir(smb_path)
logger.debug(f"【SMB】目录删除成功: {smb_path}")
except SMBResponseException as e:
# 如果目录不为空,尝试强制删除
logger.warn(f"【SMB】目录不为空尝试强制删除: {smb_path} - {e}")
# 使用remove方法尝试删除某些SMB服务器支持
try:
smbclient.remove(smb_path)
logger.info(f"【SMB】强制删除目录成功: {smb_path}")
except Exception as remove_error:
# 如果还是失败,记录错误并抛出异常
logger.error(f"【SMB】无法删除非空目录: {smb_path} - {remove_error}")
raise SMBConnectionError(f"无法删除非空目录 {smb_path}: {remove_error}")
except SMBException as e:
logger.error(f"【SMB】SMB操作失败: {smb_path} - {e}")
raise SMBConnectionError(f"SMB操作失败 {smb_path}: {e}")
except SMBConnectionError:
# 重新抛出SMB连接错误
raise
except Exception as e:
logger.error(f"【SMB】递归删除失败: {smb_path} - {e}")
raise SMBConnectionError(f"递归删除失败 {smb_path}: {e}")
def rename(self, fileitem: schemas.FileItem, name: str) -> bool:
"""
@@ -584,8 +661,7 @@ class SMB(StorageBase, metaclass=WeakSingleton):
析构函数,清理连接
"""
try:
# smbclient 自动管理连接池,但我们可以重置缓存
if hasattr(self, '_connected') and self._connected:
if self._connected:
reset_connection_cache()
except Exception as e:
logger.debug(f"【SMB】清理连接失败: {e}")

View File

@@ -46,6 +46,9 @@ class U115Pan(StorageBase, metaclass=WeakSingleton):
# 文件块大小默认10MB
chunk_size = 10 * 1024 * 1024
# 流控重试间隔时间
retry_delay = 70
def __init__(self):
super().__init__()
self._auth_state = {}
@@ -88,6 +91,8 @@ class U115Pan(StorageBase, metaclass=WeakSingleton):
"refresh_time": int(time.time()),
**tokens
})
else:
return None
access_token = tokens.get("access_token")
if access_token:
self.session.headers.update({"Authorization": f"Bearer {access_token}"})
@@ -195,6 +200,7 @@ class U115Pan(StorageBase, metaclass=WeakSingleton):
result = resp.json()
if result.get("code") != 0:
logger.warn(f"【115】刷新 access_token 失败:{result.get('code')} - {result.get('message')}")
return None
return result.get("data")
def _request_api(self, method: str, endpoint: str,
@@ -205,14 +211,26 @@ class U115Pan(StorageBase, metaclass=WeakSingleton):
# 检查会话
self._check_session()
resp = self.session.request(
method, f"{self.base_url}{endpoint}",
**kwargs
)
# 错误日志标志
no_error_log = kwargs.pop("no_error_log", False)
# 重试次数
retry_times = kwargs.pop("retry_limit", 5)
try:
resp = self.session.request(
method, f"{self.base_url}{endpoint}",
**kwargs
)
except requests.exceptions.RequestException as e:
logger.error(f"【115】{method} 请求 {endpoint} 网络错误: {str(e)}")
return None
if resp is None:
logger.warn(f"【115】{method} 请求 {endpoint} 失败!")
return None
kwargs["retry_limit"] = retry_times
# 处理速率限制
if resp.status_code == 429:
reset_time = 5 + int(resp.headers.get("X-RateLimit-Reset", 60))
@@ -228,7 +246,18 @@ class U115Pan(StorageBase, metaclass=WeakSingleton):
# 返回数据
ret_data = resp.json()
if ret_data.get("code") != 0:
logger.warn(f"【115】{method} 请求 {endpoint} 出错:{ret_data.get('message')}")
error_msg = ret_data.get("message")
if not no_error_log:
logger.warn(f"【115】{method} 请求 {endpoint} 出错:{error_msg}")
if "已达到当前访问上限" in error_msg:
if retry_times <= 0:
logger.error(f"【115】{method} 请求 {endpoint} 达到访问上限,重试次数用尽!")
return None
kwargs["retry_limit"] = retry_times - 1
logger.info(f"【115】{method} 请求 {endpoint} 达到访问上限,等待 {self.retry_delay} 秒后重试...")
time.sleep(self.retry_delay)
return self._request_api(method, endpoint, result_key, **kwargs)
return None
if result_key:
return ret_data.get(result_key)
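The 115 hunk above retries a rate-limited request by passing the remaining budget (`retry_limit`) back through the recursive call. A compact sketch of that retry-budget pattern, with the API simulated by a list of canned replies:

```python
import time
from typing import List, Optional

RETRY_DELAY = 0  # the real client waits 70 seconds between rate-limit retries

def request_api(endpoint: str, responses: List[dict],
                retry_limit: int = 5) -> Optional[dict]:
    """Sketch of the retry-budget pattern: the remaining budget travels down
    the recursive call, so a rate-limited endpoint is retried a bounded number
    of times instead of forever. `responses` simulates successive API replies."""
    resp = responses.pop(0)
    if "已达到当前访问上限" in resp.get("message", ""):
        if retry_limit <= 0:
            return None  # budget exhausted: give up
        time.sleep(RETRY_DELAY)
        return request_api(endpoint, responses, retry_limit - 1)
    return resp
```

Decrementing the budget on every recursive call is what guarantees termination even if the server stays rate-limited.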
@@ -254,8 +283,8 @@ class U115Pan(StorageBase, metaclass=WeakSingleton):
"""
自动延迟重试 get_item 模块
"""
for _ in range(2):
time.sleep(2)
for i in range(1, 4):
time.sleep(2 ** i)
fileitem = self.get_item(path)
if fileitem:
return fileitem
@@ -430,6 +459,9 @@ class U115Pan(StorageBase, metaclass=WeakSingleton):
)
if not init_resp:
return None
if not init_resp.get("state"):
logger.warn(f"【115】上传二次认证失败: {init_resp.get('error')}")
return None
# 二次认证结果
init_result = init_resp.get("data")
logger.debug(f"【115】上传 Step 2 二次认证结果: {init_result}")
@@ -513,8 +545,8 @@ class U115Pan(StorageBase, metaclass=WeakSingleton):
security_token=SecurityToken
)
bucket = oss2.Bucket(auth, endpoint, bucket_name) # noqa
# determine_part_size方法用于确定分片大小设置分片大小为 100M
part_size = determine_part_size(file_size, preferred_size=100 * 1024 * 1024)
# determine_part_size方法用于确定分片大小设置分片大小为 10M
part_size = determine_part_size(file_size, preferred_size=10 * 1024 * 1024)
# 初始化进度条
logger.info(f"【115】开始上传: {local_path} -> {target_path},分片大小:{StringUtils.str_filesize(part_size)}")
@@ -695,7 +727,8 @@ class U115Pan(StorageBase, metaclass=WeakSingleton):
"data",
data={
"path": path.as_posix()
}
},
no_error_log=True
)
if not resp:
return None
@@ -782,8 +815,10 @@ class U115Pan(StorageBase, metaclass=WeakSingleton):
if resp["state"]:
new_path = Path(path) / fileitem.name
new_item = self._delay_get_item(new_path)
self.rename(new_item, new_name)
return True
if not new_item:
return False
if self.rename(new_item, new_name):
return True
return False
def move(self, fileitem: schemas.FileItem, path: Path, new_name: str) -> bool:
@@ -812,8 +847,10 @@ class U115Pan(StorageBase, metaclass=WeakSingleton):
if resp["state"]:
new_path = Path(path) / fileitem.name
new_file = self._delay_get_item(new_path)
self.rename(new_file, new_name)
return True
if not new_file:
return False
if self.rename(new_file, new_name):
return True
return False
def link(self, fileitem: schemas.FileItem, target_file: Path) -> bool:

View File

@@ -14,10 +14,10 @@ from app.helper.directory import DirectoryHelper
from app.helper.message import TemplateHelper
from app.log import logger
from app.modules.filemanager.storages import StorageBase
from app.schemas import TransferInfo, TmdbEpisode, TransferDirectoryConf, FileItem, TransferInterceptEventData
from app.schemas import TransferInfo, TmdbEpisode, TransferDirectoryConf, FileItem, TransferInterceptEventData, \
TransferRenameEventData
from app.schemas.types import MediaType, ChainEventType
from app.utils.system import SystemUtils
from app.schemas import TransferRenameEventData
lock = Lock()
@@ -239,7 +239,8 @@ class TransHandler:
overflag = True
if not overflag:
# 目标文件已存在
logger.info(f"目的文件系统中已经存在同名文件 {target_file},当前整理覆盖模式设置为 {overwrite_mode}")
logger.info(
f"目的文件系统中已经存在同名文件 {target_file},当前整理覆盖模式设置为 {overwrite_mode}")
if overwrite_mode == 'always':
# 总是覆盖同名文件
overflag = True

View File

@@ -85,7 +85,7 @@ class HaiDanSpider:
categories = self._movie_category
# 搜索类型
if keyword.startswith('tt'):
if keyword and keyword.startswith('tt'):
search_area = '4'
else:
search_area = '0'

View File

@@ -75,6 +75,9 @@ class MTorrentSpider:
categories = self._tv_category
else:
categories = self._movie_category
# mtorrent搜索imdb需要输入完整imdb链接参见 https://wiki.m-team.cc/zh-tw/imdbtosearch
if keyword and keyword.startswith("tt"):
keyword = f"https://www.imdb.com/title/{keyword}"
return {
"keyword": keyword,
"categories": categories,
@@ -127,6 +130,8 @@ class MTorrentSpider:
'labels': labels,
'category': category
}
if discount_end_time := (result.get('status') or {}).get('discountEndTime'):
torrent['freedate'] = StringUtils.format_timestamp(discount_end_time)
torrents.append(torrent)
return torrents
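The spider change above rewrites a bare IMDb id into the full URL that M-Team's search expects. The transformation in isolation:

```python
def normalize_keyword(keyword: str) -> str:
    """M-Team's IMDb search wants the full IMDb URL rather than the bare
    tt-id (per https://wiki.m-team.cc/zh-tw/imdbtosearch)."""
    if keyword and keyword.startswith("tt"):
        return f"https://www.imdb.com/title/{keyword}"
    return keyword
```

The `keyword and` guard mirrors the diff: an empty or `None`-ish keyword passes through untouched.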

View File

@@ -51,10 +51,6 @@ class RedisModule(_ModuleBase):
"""
if settings.CACHE_BACKEND_TYPE != "redis":
return None
redis_helper = RedisHelper()
try:
if redis_helper.test():
return True, ""
return False, "Redis连接失败请检查配置"
finally:
redis_helper.close()
if RedisHelper().test():
return True, ""
return False, "Redis连接失败请检查配置"

View File

@@ -108,6 +108,7 @@ class CategoryHelper(metaclass=WeakSingleton):
return ""
if not categorys:
return ""
for key, item in categorys.items():
if not item:
return key
@@ -134,23 +135,41 @@ class CategoryHelper(metaclass=WeakSingleton):
else:
info_values = [str(info_value).upper()]
if value.find(",") != -1:
# , 分隔多个值
values = [str(val).upper() for val in value.split(",") if val]
elif value.find("-") != -1:
# - 表示范围,仅限于数字
value_begin = value.split("-")[0]
value_end = value.split("-")[1]
values = []
invert_values = []
# 如果有 "," 进行分割
values = [str(val) for val in value.split(",") if val]
expanded_values = []
for v in values:
if "-" not in v:
expanded_values.append(v)
continue
# - 表示范围
value_begin, value_end = v.split("-", 1)
prefix = ""
if value_begin.startswith('!'):
prefix = '!'
value_begin = value_begin[1:]
if value_begin.isdigit() and value_end.isdigit():
# 数字范围
values = [str(val) for val in range(int(value_begin), int(value_end) + 1)]
expanded_values.extend(f"{prefix}{val}" for val in range(int(value_begin), int(value_end) + 1))
else:
# 字符串范围
values = [str(value_begin), str(value_end)]
else:
values = [str(value).upper()]
expanded_values.extend([f"{prefix}{value_begin}", f"{prefix}{value_end}"])
if not set(values).intersection(set(info_values)):
values = list(map(str.upper, expanded_values))
invert_values = [val[1:] for val in values if val.startswith('!')]
values = [val for val in values if not val.startswith('!')]
if values and not set(values).intersection(set(info_values)):
match_flag = False
if invert_values and set(invert_values).intersection(set(info_values)):
match_flag = False
if match_flag:
return key
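The CategoryHelper rewrite above generalizes rule values: comma-separated items, numeric `a-b` ranges that preserve a leading `!` negation prefix, and a split of the result into positive and inverted match sets. A standalone sketch of that expansion (`expand_rule` is a hypothetical helper name; the real logic is inlined in the matcher):

```python
from typing import List, Tuple

def expand_rule(value: str) -> Tuple[List[str], List[str]]:
    """Sketch of the rewritten rule parser: split on ',', expand numeric 'a-b'
    ranges (carrying a leading '!' prefix across the range), normalize case,
    then separate positive values from inverted ('!') ones."""
    expanded = []
    for v in (s for s in value.split(",") if s):
        if "-" not in v:
            expanded.append(v)
            continue
        begin, end = v.split("-", 1)
        prefix = ""
        if begin.startswith("!"):
            prefix, begin = "!", begin[1:]
        if begin.isdigit() and end.isdigit():
            # Numeric range: expand to every value, keeping the negation prefix
            expanded.extend(f"{prefix}{n}" for n in range(int(begin), int(end) + 1))
        else:
            # Non-numeric "range": keep just the two endpoints
            expanded.extend([f"{prefix}{begin}", f"{prefix}{end}"])
    values = [v.upper() for v in expanded]
    invert = [v[1:] for v in values if v.startswith("!")]
    positive = [v for v in values if not v.startswith("!")]
    return positive, invert
```

A rule then matches when the item intersects `positive` (if any) and avoids `invert`.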

View File

@@ -747,6 +747,9 @@ class TmdbApi:
logger.info("正在从TheDbMovie网站查询%s ..." % name)
tmdb_url = self._build_tmdb_search_url(name)
res = RequestUtils(timeout=5, ua=settings.NORMAL_USER_AGENT, proxies=settings.PROXY).get_res(url=tmdb_url)
if res is None:
logger.error("无法连接TheDbMovie")
return None
# 响应验证
response_result = self._validate_response(res)
@@ -1857,6 +1860,9 @@ class TmdbApi:
tmdb_url = self._build_tmdb_search_url(name)
res = await AsyncRequestUtils(timeout=5, ua=settings.NORMAL_USER_AGENT, proxies=settings.PROXY).get_res(
url=tmdb_url)
if res is None:
logger.error("无法连接TheDbMovie")
return None
# 响应验证
response_result = self._validate_response(res)

View File

@@ -43,6 +43,8 @@ class TMDb(object):
self._timeout = 15
self.obj_cached = obj_cached
self.__clear_async_cache__ = False
@property
def page(self):
return self._page
@@ -125,14 +127,17 @@ class TMDb(object):
def cache(self, cache):
self._cache_enabled = bool(cache)
@cached(maxsize=settings.CONF.tmdb, ttl=settings.CONF.meta)
@cached(maxsize=settings.CONF.tmdb, ttl=settings.CONF.meta, skip_none=True)
def cached_request(self, method, url, data, json,
_ts=datetime.strftime(datetime.now(), '%Y%m%d')):
return self.request(method, url, data, json)
@cached(maxsize=settings.CONF.tmdb, ttl=settings.CONF.meta)
@cached(maxsize=settings.CONF.tmdb, ttl=settings.CONF.meta, skip_none=True)
async def async_cached_request(self, method, url, data, json,
_ts=datetime.strftime(datetime.now(), '%Y%m%d')):
if self.__clear_async_cache__:
self.__clear_async_cache__ = False
await self.async_cached_request.cache_clear()
return await self.async_request(method, url, data, json)
def request(self, method, url, data, json):
@@ -154,6 +159,7 @@ class TMDb(object):
return req
def cache_clear(self):
self.__clear_async_cache__ = True
return self.cached_request.cache_clear()
def _validate_api_key(self):

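The TMDb hunk above adds `skip_none=True` to the cache decorators so a failed request (which returns `None`) is not cached and replayed for the whole TTL. A toy decorator showing just that option (this is a sketch, not the project's actual `@cached` implementation):

```python
import functools

def cached(maxsize=128, skip_none=False):
    """Sketch of a cache decorator with a skip_none option: None results
    (e.g. a failed upstream request) are returned but never stored, so a
    later call can succeed instead of replaying the cached failure."""
    def decorator(func):
        store = {}

        @functools.wraps(func)
        def wrapper(*args):
            if args in store:
                return store[args]
            result = func(*args)
            if result is None and skip_none:
                return None  # do not poison the cache with a transient failure
            store[args] = result
            return result

        wrapper.cache_clear = store.clear
        return wrapper
    return decorator

calls = {"n": 0}

@cached(skip_none=True)
def fetch(x):
    calls["n"] += 1
    return None if calls["n"] == 1 else x * 2
```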
View File

@@ -7,6 +7,8 @@ import json
import urllib.parse
from http import HTTPStatus
from app.core.cache import cached
from app.core.config import settings
from app.utils.http import RequestUtils
@@ -15,7 +17,7 @@ class Auth:
TVDB认证类
"""
def __init__(self, url, apikey, pin="", proxy=None, timeout: int = 15):
def __init__(self, url: str, apikey: str, pin: str = "", proxy: dict = None, timeout: int = 15):
login_info = {"apikey": apikey}
if pin != "":
login_info["pin"] = pin
@@ -35,13 +37,14 @@ class Auth:
result = response.json()
self.token = result["data"]["token"]
else:
error_msg = f"登录失败,状态码: {response.status_code if response else 'None'}"
if response:
if response is not None:
try:
error_data = response.json()
error_msg = f"Code: {response.status_code}, {error_data.get('message', '未知错误')}"
except Exception as err:
error_msg = f"Code: {response.status_code}, 响应解析失败:{err}"
else:
error_msg = "网络连接失败,未收到响应"
raise Exception(error_msg)
except Exception as e:
raise Exception(f"TVDB认证失败: {str(e)}")
@@ -58,13 +61,14 @@ class Request:
请求处理类
"""
def __init__(self, auth_token, proxy=None, timeout=15):
def __init__(self, auth_token: str, proxy: dict = None, timeout: int = 15):
self.auth_token = auth_token
self.links = None
self.proxy = proxy
self.timeout = timeout
def make_request(self, url, if_modified_since=None):
@cached(maxsize=settings.CONF.tmdb, ttl=settings.CONF.meta, skip_none=True)
def make_request(self, url: str, if_modified_since: bool = None):
"""
向指定的 URL 发起请求并返回数据
"""
@@ -118,7 +122,8 @@ class Url:
def __init__(self):
self.base_url = "https://api4.thetvdb.com/v4/"
def construct(self, url_sect, url_id=None, url_subsect=None, url_lang=None, **kwargs):
def construct(self, url_sect: str, url_id: int = None,
url_subsect: str = None, url_lang: str = None, **kwargs):
"""
构建API URL
"""
@@ -141,7 +146,7 @@ class TVDB:
TVDB API主类
"""
def __init__(self, apikey: str, pin="", proxy=None, timeout: int = 15):
def __init__(self, apikey: str, pin: str = "", proxy: dict = None, timeout: int = 15):
self.url = Url()
login_url = self.url.construct("login")
self.auth = Auth(login_url, apikey, pin, proxy, timeout)
@@ -154,126 +159,126 @@ class TVDB:
"""
return self.request.links
def get_artwork_statuses(self, meta=None, if_modified_since=None) -> list:
def get_artwork_statuses(self, meta: str = None, if_modified_since: bool = None) -> list:
"""
返回艺术图状态列表
"""
url = self.url.construct("artwork/statuses", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_artwork_types(self, meta=None, if_modified_since=None) -> list:
def get_artwork_types(self, meta: str = None, if_modified_since: bool = None) -> list:
"""
返回艺术图类型列表
"""
url = self.url.construct("artwork/types", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_artwork(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_artwork(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
返回单个艺术图信息的字典
"""
url = self.url.construct("artwork", id, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_artwork_extended(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_artwork_extended(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
返回单个艺术图的扩展信息字典
"""
url = self.url.construct("artwork", id, "extended", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_all_awards(self, meta=None, if_modified_since=None) -> list:
def get_all_awards(self, meta: str = None, if_modified_since: bool = None) -> list:
"""
返回奖项列表
"""
url = self.url.construct("awards", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_award(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_award(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
返回单个奖项信息的字典
"""
url = self.url.construct("awards", id, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_award_extended(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_award_extended(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
返回单个奖项的扩展信息字典
"""
url = self.url.construct("awards", id, "extended", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_all_award_categories(self, meta=None, if_modified_since=None) -> list:
def get_all_award_categories(self, meta: str = None, if_modified_since: bool = None) -> list:
"""
返回奖项类别列表
"""
url = self.url.construct("awards/categories", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_award_category(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_award_category(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
返回单个奖项类别信息的字典
"""
url = self.url.construct("awards/categories", id, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_award_category_extended(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_award_category_extended(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
返回单个奖项类别的扩展信息字典
"""
url = self.url.construct("awards/categories", id, "extended", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_content_ratings(self, meta=None, if_modified_since=None) -> list:
def get_content_ratings(self, meta: str = None, if_modified_since: bool = None) -> list:
"""
返回内容分级列表
"""
url = self.url.construct("content/ratings", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_countries(self, meta=None, if_modified_since=None) -> list:
def get_countries(self, meta: str = None, if_modified_since: bool = None) -> list:
"""
返回国家列表
"""
url = self.url.construct("countries", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_all_companies(self, page=None, meta=None, if_modified_since=None) -> list:
def get_all_companies(self, page: int = None, meta: str = None, if_modified_since: bool = None) -> list:
"""
返回公司列表 (可分页)
"""
url = self.url.construct("companies", page=page, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_company_types(self, meta=None, if_modified_since=None) -> list:
def get_company_types(self, meta: str = None, if_modified_since: bool = None) -> list:
"""
返回公司类型列表
"""
url = self.url.construct("companies/types", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_company(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_company(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
返回单个公司信息的字典
"""
url = self.url.construct("companies", id, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_all_series(self, page=None, meta=None, if_modified_since=None) -> list:
def get_all_series(self, page: int = None, meta: str = None, if_modified_since: bool = None) -> list:
"""
返回剧集列表 (可分页)
"""
url = self.url.construct("series", page=page, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_series(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_series(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
返回单个剧集信息的字典
"""
url = self.url.construct("series", id, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_series_by_slug(self, slug: str, meta=None, if_modified_since=None) -> dict:
def get_series_by_slug(self, slug: str, meta: str = None, if_modified_since: bool = None) -> dict:
"""
通过 slug (别名) 返回单个剧集信息的字典
"""
@@ -288,7 +293,7 @@ class TVDB:
return self.request.make_request(url, if_modified_since)
def get_series_episodes(self, id: int, season_type: str = "default", page: int = 0,
lang: str = None, meta=None, if_modified_since=None, **kwargs) -> dict:
lang: str = None, meta: str = None, if_modified_since: bool = None, **kwargs) -> dict:
"""
返回指定剧集和季类型的各集信息字典 (可分页,可指定语言)
"""
@@ -297,7 +302,7 @@ class TVDB:
)
return self.request.make_request(url, if_modified_since)
def get_series_translation(self, id: int, lang: str, meta=None, if_modified_since=None) -> dict:
def get_series_translation(self, id: int, lang: str, meta: str = None, if_modified_since: bool = None) -> dict:
"""
返回剧集的指定语言翻译信息字典
"""
@@ -318,21 +323,21 @@ class TVDB:
url = self.url.construct("series", id, "nextAired")
return self.request.make_request(url, if_modified_since)
def get_all_movies(self, page=None, meta=None, if_modified_since=None) -> list:
def get_all_movies(self, page: int = None, meta: str = None, if_modified_since: bool = None) -> list:
"""
返回电影列表 (可分页)
"""
url = self.url.construct("movies", page=page, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_movie(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_movie(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
返回单个电影信息的字典
"""
url = self.url.construct("movies", id, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_movie_by_slug(self, slug: str, meta=None, if_modified_since=None) -> dict:
def get_movie_by_slug(self, slug: str, meta: str = None, if_modified_since: bool = None) -> dict:
"""
通过 slug (别名) 返回单个电影信息的字典
"""
@@ -346,70 +351,70 @@ class TVDB:
url = self.url.construct("movies", id, "extended", meta=meta, short=short)
return self.request.make_request(url, if_modified_since)
def get_movie_translation(self, id: int, lang: str, meta=None, if_modified_since=None) -> dict:
def get_movie_translation(self, id: int, lang: str, meta: str = None, if_modified_since: bool = None) -> dict:
"""
返回电影的指定语言翻译信息字典
"""
url = self.url.construct("movies", id, "translations", lang, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_all_seasons(self, page=None, meta=None, if_modified_since=None) -> list:
def get_all_seasons(self, page: int = None, meta: str = None, if_modified_since: bool = None) -> list:
"""
返回季列表 (可分页)
"""
url = self.url.construct("seasons", page=page, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_season(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_season(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
返回单季信息的字典
"""
url = self.url.construct("seasons", id, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_season_extended(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_season_extended(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
返回单季的扩展信息字典
"""
url = self.url.construct("seasons", id, "extended", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_season_types(self, meta=None, if_modified_since=None) -> list:
def get_season_types(self, meta: str = None, if_modified_since: bool = None) -> list:
"""
返回季类型列表
"""
url = self.url.construct("seasons/types", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_season_translation(self, id: int, lang: str, meta=None, if_modified_since=None) -> dict:
def get_season_translation(self, id: int, lang: str, meta: str = None, if_modified_since: bool = None) -> dict:
"""
Return a dict with the season's translation info for the given language
"""
url = self.url.construct("seasons", id, "translations", lang, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_all_episodes(self, page=None, meta=None, if_modified_since=None) -> list:
def get_all_episodes(self, page: int = None, meta: str = None, if_modified_since: bool = None) -> list:
"""
Return the list of episodes (pageable)
"""
url = self.url.construct("episodes", page=page, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_episode(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_episode(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
Return a dict of a single episode's info
"""
url = self.url.construct("episodes", id, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_episode_extended(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_episode_extended(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
Return a dict of a single episode's extended info
"""
url = self.url.construct("episodes", id, "extended", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_episode_translation(self, id: int, lang: str, meta=None, if_modified_since=None) -> dict:
def get_episode_translation(self, id: int, lang: str, meta: str = None, if_modified_since: bool = None) -> dict:
"""
Return a dict with the episode's translation info for the given language
"""
@@ -419,70 +424,70 @@ class TVDB:
# Alias kept for backward compatibility with the old function name.
get_episodes_translation = get_episode_translation
def get_all_genders(self, meta=None, if_modified_since=None) -> list:
def get_all_genders(self, meta: str = None, if_modified_since: bool = None) -> list:
"""
Return the list of genders
"""
url = self.url.construct("genders", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_all_genres(self, meta=None, if_modified_since=None) -> list:
def get_all_genres(self, meta: str = None, if_modified_since: bool = None) -> list:
"""
Return the list of genres
"""
url = self.url.construct("genres", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_genre(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_genre(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
Return a dict of a single genre's info
"""
url = self.url.construct("genres", id, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_all_languages(self, meta=None, if_modified_since=None) -> list:
def get_all_languages(self, meta: str = None, if_modified_since: bool = None) -> list:
"""
Return the list of languages
"""
url = self.url.construct("languages", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_all_people(self, page=None, meta=None, if_modified_since=None) -> list:
def get_all_people(self, page: int = None, meta: str = None, if_modified_since: bool = None) -> list:
"""
Return the list of people (pageable)
"""
url = self.url.construct("people", page=page, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_person(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_person(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
Return a dict of a single person's info
"""
url = self.url.construct("people", id, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_person_extended(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_person_extended(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
Return a dict of a single person's extended info
"""
url = self.url.construct("people", id, "extended", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_person_translation(self, id: int, lang: str, meta=None, if_modified_since=None) -> dict:
def get_person_translation(self, id: int, lang: str, meta: str = None, if_modified_since: bool = None) -> dict:
"""
Return a dict with the person's translation info for the given language
"""
url = self.url.construct("people", id, "translations", lang, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_character(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_character(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
Return a dict of a character's info
"""
url = self.url.construct("characters", id, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_people_types(self, meta=None, if_modified_since=None) -> list:
def get_people_types(self, meta: str = None, if_modified_since: bool = None) -> list:
"""
Return the list of people types
"""
@@ -492,7 +497,7 @@ class TVDB:
# Alias kept for backward compatibility
get_all_people_types = get_people_types
def get_source_types(self, meta=None, if_modified_since=None) -> list:
def get_source_types(self, meta: str = None, if_modified_since: bool = None) -> list:
"""
Return the list of source types
"""
@@ -509,56 +514,56 @@ class TVDB:
url = self.url.construct("updates", since=since, **kwargs)
return self.request.make_request(url)
def get_all_tag_options(self, page=None, meta=None, if_modified_since=None) -> list:
def get_all_tag_options(self, page: int = None, meta: str = None, if_modified_since: bool = None) -> list:
"""
Return the list of tag options (pageable)
"""
url = self.url.construct("tags/options", page=page, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_tag_option(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_tag_option(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
Return a dict of a single tag option's info
"""
url = self.url.construct("tags/options", id, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_all_lists(self, page=None, meta=None) -> dict:
def get_all_lists(self, page: int = None, meta=None) -> dict:
"""
Return all public lists (pageable)
"""
url = self.url.construct("lists", page=page, meta=meta)
return self.request.make_request(url)
def get_list(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_list(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
Return a dict of a single list's info
"""
url = self.url.construct("lists", id, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_list_by_slug(self, slug: str, meta=None, if_modified_since=None) -> dict:
def get_list_by_slug(self, slug: str, meta: str = None, if_modified_since: bool = None) -> dict:
"""
Return a dict of a single list's info, looked up by slug (alias)
"""
url = self.url.construct("lists/slug", slug, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_list_extended(self, id: int, meta=None, if_modified_since=None) -> dict:
def get_list_extended(self, id: int, meta: str = None, if_modified_since: bool = None) -> dict:
"""
Return a dict of a single list's extended info
"""
url = self.url.construct("lists", id, "extended", meta=meta)
return self.request.make_request(url, if_modified_since)
def get_list_translation(self, id: int, lang: str, meta=None, if_modified_since=None) -> dict:
def get_list_translation(self, id: int, lang: str, meta: str = None, if_modified_since: bool = None) -> dict:
"""
Return a dict with the list's translation info for the given language
"""
url = self.url.construct("lists", id, "translations", lang, meta=meta)
return self.request.make_request(url, if_modified_since)
def get_inspiration_types(self, meta=None, if_modified_since=None) -> dict:
def get_inspiration_types(self, meta: str = None, if_modified_since: bool = None) -> dict:
"""
Return the list of inspiration types
"""
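The TVDB hunks above only tighten signatures: each bare `meta=None` becomes `meta: str = None`. A minimal sketch of the pattern follows; `UrlBuilder` and `get_movie` are hypothetical stand-ins for the project's real helpers, and note that PEP 484's explicit spelling for a parameter defaulting to `None` is `Optional[str]` rather than `str`.

```python
from typing import Optional

# Illustrative sketch of the annotated-wrapper pattern; UrlBuilder is an
# assumed stand-in for the project's URL-construction helper.
class UrlBuilder:
    base = "https://api4.thetvdb.com/v4"  # assumed TVDB v4 base URL

    def construct(self, *parts, **queries) -> str:
        # Join path segments, then append only the query params that are set
        path = "/".join(str(p) for p in parts)
        query = "&".join(f"{k}={v}" for k, v in queries.items() if v is not None)
        return f"{self.base}/{path}" + (f"?{query}" if query else "")

def get_movie(url: UrlBuilder, id: int, meta: Optional[str] = None) -> str:
    # Mirrors the wrappers above: an unset optional param is simply dropped
    return url.construct("movies", id, meta=meta)
```

With this shape, a type checker can flag callers that pass a non-string `meta`, which is the whole point of the annotation-only diff.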


@@ -154,7 +154,7 @@ class TrimeMediaModule(_ModuleBase, _MediaServerBase[TrimeMedia]):
"""
source = args.get("source")
if source:
server: TrimeMedia = self.get_instance(source)
server: Optional[TrimeMedia] = self.get_instance(source)
if not server:
return None
result = server.get_webhook_message(body)
@@ -247,7 +247,7 @@ class TrimeMediaModule(_ModuleBase, _MediaServerBase[TrimeMedia]):
Media item counts
"""
if server:
server_obj: TrimeMedia = self.get_instance(server)
server_obj: Optional[TrimeMedia] = self.get_instance(server)
if not server_obj:
return None
servers = [server_obj]
@@ -268,7 +268,7 @@ class TrimeMediaModule(_ModuleBase, _MediaServerBase[TrimeMedia]):
"""
Media library list
"""
server_obj: TrimeMedia = self.get_instance(server)
server_obj: Optional[TrimeMedia] = self.get_instance(server)
if server_obj:
return server_obj.get_librarys(hidden=hidden)
return None
@@ -290,7 +290,7 @@ class TrimeMediaModule(_ModuleBase, _MediaServerBase[TrimeMedia]):
:return: a generator that yields the media server's items incrementally
"""
server_obj: TrimeMedia = self.get_instance(server)
server_obj: Optional[TrimeMedia] = self.get_instance(server)
if server_obj:
return server_obj.get_items(library_id, start_index, limit)
return None
@@ -301,7 +301,7 @@ class TrimeMediaModule(_ModuleBase, _MediaServerBase[TrimeMedia]):
"""
Media library item details
"""
server_obj: TrimeMedia = self.get_instance(server)
server_obj: Optional[TrimeMedia] = self.get_instance(server)
if server_obj:
return server_obj.get_iteminfo(item_id)
return None
@@ -312,7 +312,9 @@ class TrimeMediaModule(_ModuleBase, _MediaServerBase[TrimeMedia]):
"""
Get TV episode info
"""
server_obj: TrimeMedia = self.get_instance(server)
if not isinstance(item_id, str):
return None
server_obj: Optional[TrimeMedia] = self.get_instance(server)
if not server_obj:
return None
_, seasoninfo = server_obj.get_tv_episodes(item_id=item_id)
@@ -329,10 +331,10 @@ class TrimeMediaModule(_ModuleBase, _MediaServerBase[TrimeMedia]):
"""
Get the media server's now-playing (resume) info
"""
server_obj: TrimeMedia = self.get_instance(server)
server_obj: Optional[TrimeMedia] = self.get_instance(server)
if not server_obj:
return []
return server_obj.get_resume(num=count)
return server_obj.get_resume(num=count) or []
def mediaserver_play_url(
self, server: str, item_id: Union[str, int]
@@ -340,7 +342,9 @@ class TrimeMediaModule(_ModuleBase, _MediaServerBase[TrimeMedia]):
"""
Get the media library playback URL
"""
server_obj: TrimeMedia = self.get_instance(server)
if not isinstance(item_id, str):
return None
server_obj: Optional[TrimeMedia] = self.get_instance(server)
if not server_obj:
return None
return server_obj.get_play_url(item_id)
@@ -354,10 +358,10 @@ class TrimeMediaModule(_ModuleBase, _MediaServerBase[TrimeMedia]):
"""
Get the media server's most recently added items
"""
server_obj: TrimeMedia = self.get_instance(server)
server_obj: Optional[TrimeMedia] = self.get_instance(server)
if not server_obj:
return []
return server_obj.get_latest(num=count)
return server_obj.get_latest(num=count) or []
def mediaserver_latest_images(
self,
@@ -374,7 +378,7 @@ class TrimeMediaModule(_ModuleBase, _MediaServerBase[TrimeMedia]):
:param remote: True for an external (WAN) link, False for an internal (LAN) link
:return: list of image URLs
"""
server_obj: TrimeMedia = self.get_instance(server)
server_obj: Optional[TrimeMedia] = self.get_instance(server)
if not server_obj:
return []
return server_obj.get_latest_backdrops(num=count, remote=remote)
return server_obj.get_latest_backdrops(num=count, remote=remote) or []
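The recurring fix in this module is the same three-part guard: annotate the instance lookup as `Optional[...]`, return early when it yields `None`, and normalize a falsy list result with `or []`. A self-contained sketch, with invented class and function names for illustration:

```python
from typing import Optional

# Hypothetical stand-in for a media-server backend whose queries may
# legitimately return None instead of a list.
class Server:
    def get_resume(self, num: int) -> Optional[list]:
        return None  # e.g. the backend had no data for this query

_instances = {"main": Server()}

def get_instance(name: str) -> Optional[Server]:
    # Lookup may fail, hence the Optional annotation from the diff
    return _instances.get(name)

def mediaserver_playing(server: str, count: int = 20) -> list:
    server_obj: Optional[Server] = get_instance(server)
    if not server_obj:
        return []
    # `or []` coerces a None/empty backend result into a stable empty list
    return server_obj.get_resume(num=count) or []
```

The `or []` tail is what keeps list-returning endpoints from leaking `None` to their callers, which matches the NoneType fixes named in the commit log.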


@@ -140,13 +140,13 @@ class Api:
self._token: Optional[str] = None
self._version: Optional[Version] = None
self._session = requests.Session()
self._request_utils = RequestUtils(session=self._session)
self._request_utils = RequestUtils(session=self._session, timeout=10)
def sys_version(self) -> Optional[Version]:
"""
TrimeMedia server version number
"""
if (res := self.__request_api("/sys/version")) and res.success:
if (res := self.request("/sys/version")) and res.success:
if res.data:
self._version = Version(
frontend=res.data.get("version"),
@@ -162,7 +162,7 @@ class Api:
:return: the token on success, otherwise None
"""
if (
res := self.__request_api(
res := self.request(
"/login",
data={
"username": username,
@@ -178,7 +178,9 @@ class Api:
"""
Log out of the account
"""
if (res := self.__request_api("/user/logout", method="post")) and res.success:
if not self._token:
return True
if (res := self.request("/user/logout", method="post")) and res.success:
if res.data:
self._token = None
return True
@@ -188,7 +190,9 @@ class Api:
"""
User list (admins only)
"""
if (res := self.__request_api("/manager/user/list")) and res.success:
if (res := self.request("/manager/user/list")) and res.success:
if not res.data:
return []
return [
User(
guid=info.get("guid"),
@@ -203,7 +207,7 @@ class Api:
"""
Current user info
"""
if (res := self.__request_api("/user/info")) and res.success:
if (res := self.request("/user/info")) and res.success:
_user = User("", "")
_user.__dict__.update(res.data)
return _user
@@ -213,7 +217,7 @@ class Api:
"""
Media item counts
"""
if (res := self.__request_api("/mediadb/sum")) and res.success:
if (res := self.request("/mediadb/sum")) and res.success:
sums = MediaDbSummary()
sums.__dict__.update(res.data)
return sums
@@ -223,9 +227,9 @@ class Api:
"""
Media library list (regular user)
"""
if (res := self.__request_api("/mediadb/list")) and res.success:
if (res := self.request("/mediadb/list")) and res.success:
_items = []
for info in res.data:
for info in res.data or []:
mdb = MediaDb(
guid=info.get("guid"),
category=Category(info.get("category")),
@@ -250,9 +254,9 @@ class Api:
"""
Media library list (admin)
"""
if (res := self.__request_api("/mdb/list")) and res.success:
if (res := self.request("/mdb/list")) and res.success:
_items = []
for info in res.data:
for info in res.data or []:
mdb = MediaDb(
guid=info.get("guid"),
category=Category(info.get("category")),
@@ -271,7 +275,7 @@ class Api:
"""
Scan all media libraries
"""
if (res := self.__request_api("/mdb/scanall", method="post")) and res.success:
if (res := self.request("/mdb/scanall", method="post")) and res.success:
if res.data:
return True
return False
@@ -280,9 +284,7 @@ class Api:
"""
Scan the given media library
"""
if (
res := self.__request_api(f"/mdb/scan/{mdb.guid}", data={})
) and res.success:
if (res := self.request(f"/mdb/scan/{mdb.guid}", data={})) and res.success:
if res.data:
return True
return False
@@ -291,9 +293,7 @@ class Api:
"""
Tasks currently running
"""
if (
res := self.__request_api("/task/running")
) and res.success:
if (res := self.request("/task/running")) and res.success:
if res.data:
# TODO: report which tasks are actually running
return True
@@ -341,7 +341,9 @@ class Api:
if exclude_grouped_video:
post["exclude_grouped_video"] = 1
if (res := self.__request_api("/item/list", data=post)) and res.success:
if (res := self.request("/item/list", data=post)) and res.success:
if not res.data:
return []
return [self.__build_item(info) for info in res.data.get("list", [])]
return None
@@ -350,8 +352,10 @@ class Api:
Search for titles and actors
"""
if (
res := self.__request_api("/search/list", params={"q": keywords})
res := self.request("/search/list", params={"q": keywords})
) and res.success:
if not res.data:
return []
return [self.__build_item(info) for info in res.data]
return None
@@ -359,7 +363,7 @@ class Api:
"""
Query media details
"""
if (res := self.__request_api(f"/item/{guid}")) and res.success:
if (res := self.request(f"/item/{guid}")) and res.success:
return self.__build_item(res.data)
return None
@@ -370,7 +374,7 @@ class Api:
:param delete_file: True deletes the media files; False only removes the item from the library
"""
if (
res := self.__request_api(
res := self.request(
f"/item/{guid}",
method="delete",
data={"delete_file": 1 if delete_file else 0, "media_guids": []},
@@ -384,7 +388,9 @@ class Api:
"""
Query the season list
"""
if (res := self.__request_api(f"/season/list/{tv_guid}")) and res.success:
if (res := self.request(f"/season/list/{tv_guid}")) and res.success:
if not res.data:
return []
return [self.__build_item(info) for info in res.data]
return None
@@ -392,7 +398,9 @@ class Api:
"""
Query the episode list
"""
if (res := self.__request_api(f"/episode/list/{season_guid}")) and res.success:
if (res := self.request(f"/episode/list/{season_guid}")) and res.success:
if not res.data:
return []
return [self.__build_item(info) for info in res.data]
return None
@@ -400,7 +408,9 @@ class Api:
"""
Continue-watching list
"""
if (res := self.__request_api("/play/list")) and res.success:
if (res := self.request("/play/list")) and res.success:
if not res.data:
return []
return [self.__build_item(info) for info in res.data]
return None
@@ -431,7 +441,7 @@ class Api:
sign = md5.hexdigest()
return f"nonce={nonce}&timestamp={ts}&sign={sign}"
def __request_api(
def request(
self,
api: str,
method: Optional[str] = None,
@@ -482,6 +492,8 @@ class Api:
queries_unquoted = None
headers = {
"User-Agent": settings.USER_AGENT,
"Accept": "application/json",
"Referer": self._host,
"Authorization": self._token,
"authx": self.__get_authx(api_path, json_body or queries_unquoted),
}
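Every endpoint in this Api class funnels through the same walrus guard, `if (res := self.request(...)) and res.success:`, and the diff adds an `if not res.data: return []` check so that a successful response with an empty body no longer raises a NoneType error. A minimal sketch of that control flow; `Result` and the endpoint path are stand-ins, not the project's real types:

```python
from dataclasses import dataclass
from typing import Any, List, Optional

# Assumed shape of the response wrapper used throughout the Api class.
@dataclass
class Result:
    success: bool
    data: Any = None

def request(api: str) -> Optional[Result]:
    # A transport failure returns None instead of raising, mirroring the diff
    if api == "/play/list":
        return Result(success=True, data=[{"guid": "abc"}])
    return None

def playing_list() -> Optional[List[str]]:
    # Walrus guard: short-circuits on a None result or an unsuccessful one
    if (res := request("/play/list")) and res.success:
        if not res.data:
            return []  # success with an empty body is now a valid empty list
        return [info["guid"] for info in res.data]
    return None
```

The distinction the guard preserves is deliberate: `None` still means "the request itself failed", while `[]` means "the server answered and there was nothing".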


@@ -13,6 +13,7 @@ class TrimeMedia:
_password: Optional[str] = None
_userinfo: Optional[fnapi.User] = None
_host: Optional[str] = None
_playhost: Optional[str] = None
_libraries: dict[str, fnapi.MediaDb] = {}
@@ -34,20 +35,19 @@ class TrimeMedia:
return
self._username = username
self._password = password
self._host = host
self._sync_libraries = sync_libraries or []
if (api := self.__create_api(host)) is None:
if not self.reconnect():
logger.error(f"请检查服务端地址 {host}")
return
self._api = api
if play_api := self.__create_api(play_host):
self._playhost = play_api.host
if result := self.__create_api(play_host):
self._playhost = result.api.host
result.api.close()
elif play_host:
logger.warning(f"请检查外网播放地址 {play_host}")
self._playhost = UrlUtils.standardize_base_url(play_host).rstrip("/")
self.reconnect()
@property
def api(self) -> Optional[fnapi.Api]:
"""
@@ -55,14 +55,19 @@ class TrimeMedia:
"""
return self._api
class _ApiCreateResult:
api: fnapi.Api
version: fnapi.Version
@staticmethod
def __create_api(host: Optional[str]) -> Optional[fnapi.Api]:
def __create_api(host: Optional[str]) -> Optional["TrimeMedia._ApiCreateResult"]:
"""
Create a TrimeMedia API client
:param host: server address
:return: None if the address is invalid or unreachable
"""
if not host:
return None
api_key = "16CCEB3D-AB42-077D-36A1-F355324E4237"
@@ -70,21 +75,35 @@ class TrimeMedia:
if not host.endswith("/v"):
# Try appending a trailing /v and test whether the server responds
api = fnapi.Api(host + "/v", api_key)
if api.sys_version():
return api
res = TrimeMedia._ApiCreateResult()
res.api = fnapi.Api(host + "/v", api_key)
if fnver := res.api.sys_version():
res.version = fnver
return res
# Test the address exactly as the user configured it
api = fnapi.Api(host, api_key)
return api if api.sys_version() else None
res = TrimeMedia._ApiCreateResult()
res.api = fnapi.Api(host, api_key)
if fnver := res.api.sys_version():
res.version = fnver
return res
return None
def close(self):
self.disconnect()
def is_configured(self) -> bool:
return self._api is not None
return bool(self._host and self._username and self._password)
def is_authenticated(self) -> bool:
return self.is_configured() and self._api.token is not None
"""
Whether we are logged in
"""
return (
self.is_configured()
and self._api is not None
and self._api.token is not None
and self._userinfo is not None
)
def is_inactive(self) -> bool:
"""
@@ -101,10 +120,15 @@ class TrimeMedia:
"""
if not self.is_configured():
return False
if (fnver := self._api.sys_version()) is None:
self.disconnect()
if result := self.__create_api(self._host):
self._api = result.api
# e.g. frontend version: 0.8.53, backend version: 0.8.23
logger.debug(
f"版本号:{result.version.frontend}, 服务版本:{result.version.backend}"
)
else:
return False
# e.g. frontend version: 0.8.36, backend version: 0.8.19
logger.debug(f"版本号:{fnver.frontend}, 服务版本:{fnver.backend}")
if self._api.login(self._username, self._password) is None:
return False
self._userinfo = self._api.user_info()
@@ -119,9 +143,10 @@ class TrimeMedia:
"""
Disconnect from the TrimeMedia server
"""
if self.is_authenticated():
if self._api:
self._api.logout()
self._api.close()
self._api = None
self._userinfo = None
logger.debug(f"{self._username} 已断开飞牛影视")
@@ -163,7 +188,7 @@ class TrimeMedia:
for img_path in library.posters or []
],
link=f"{self._playhost or self._api.host}/library/{library.guid}",
server_type='trimemedia'
server_type="trimemedia",
)
)
return libraries
@@ -205,10 +230,12 @@ class TrimeMedia:
return None
if not self.is_configured():
return None
feiniu = fnapi.Api(self._api.host, self._api.apikey)
if token := feiniu.login(username, password):
feiniu.logout()
return token
if result := self.__create_api(self._host):
try:
return result.api.login(username, password)
finally:
result.api.logout()
result.api.close()
def get_movies(
self, title: str, year: Optional[str] = None, tmdb_id: Optional[int] = None
@@ -459,7 +486,7 @@ class TrimeMedia:
if item.duration and item.ts is not None
else 0
),
server_type='trimemedia',
server_type="trimemedia",
)
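The token-fetch rewrite above swaps a fire-and-forget client for a `try/finally` that always logs out and closes the temporary API object, even when login raises. Sketched below with a dummy client; the class and function names are illustrative, not the project's real API:

```python
# Hypothetical throwaway client illustrating the try/finally cleanup pattern.
class Client:
    def __init__(self):
        self.closed = False
        self.logged_in = False

    def login(self, user: str, pwd: str) -> str:
        self.logged_in = True
        return f"token-{user}"

    def logout(self):
        self.logged_in = False

    def close(self):
        self.closed = True

def fetch_token(user: str, pwd: str) -> str:
    client = Client()
    try:
        # Return the token, but never skip the cleanup below
        return client.login(user, pwd)
    finally:
        client.logout()
        client.close()
```

Because `finally` runs even on the `return` path, the temporary session can never be leaked, which is what the diff's nested `try/finally` around `result.api.login(...)` guarantees.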
def get_items(


@@ -1,7 +1,6 @@
import json
import platform
import re
import subprocess
import threading
import time
import traceback
@@ -10,13 +9,13 @@ from threading import Lock
from typing import Any, Optional, Dict, List
from apscheduler.schedulers.background import BackgroundScheduler
from app.core.cache import TTLCache, FileCache
from watchdog.events import FileSystemEventHandler, FileSystemMovedEvent, FileSystemEvent
from watchdog.observers.polling import PollingObserver
from app.chain import ChainBase
from app.chain.storage import StorageChain
from app.chain.transfer import TransferChain
from app.core.cache import TTLCache, FileCache
from app.core.config import settings
from app.core.event import Event, eventmanager
from app.helper.directory import DirectoryHelper
@@ -25,7 +24,8 @@ from app.log import logger
from app.schemas import ConfigChangeEventData
from app.schemas import FileItem
from app.schemas.types import SystemConfigKey, EventType
from app.utils.singleton import Singleton
from app.utils.singleton import SingletonClass
from app.utils.system import SystemUtils
lock = Lock()
snapshot_lock = Lock()
@@ -54,7 +54,7 @@ class FileMonitorHandler(FileSystemEventHandler):
file_size=Path(event.dest_path).stat().st_size)
class Monitor(metaclass=Singleton):
class Monitor(metaclass=SingletonClass):
"""
Directory-watch processing chain, singleton
"""
@@ -355,7 +355,8 @@ class Monitor(metaclass=Singleton):
return tips
def should_use_polling(self, directory: Path, monitor_mode: str,
@staticmethod
def should_use_polling(directory: Path, monitor_mode: str,
file_count: int, limits: dict) -> tuple[bool, str]:
"""
Decide whether polling mode should be used
@@ -369,7 +370,7 @@ class Monitor(metaclass=Singleton):
return True, "用户配置为兼容模式"
# Check for a network filesystem
if self.is_network_filesystem(directory):
if SystemUtils.is_network_filesystem(directory):
return True, "检测到网络文件系统,建议使用兼容模式"
max_watches = limits.get('max_user_watches')
@@ -377,45 +378,6 @@ class Monitor(metaclass=Singleton):
return True, f"目录文件数量({file_count})接近系统限制({max_watches})"
return False, "使用快速模式"
@staticmethod
def is_network_filesystem(directory: Path) -> bool:
"""
Detect whether the directory lives on a network filesystem
:param directory: directory path
:return: True if it is a network filesystem
"""
try:
system = platform.system()
if system == 'Linux':
# Inspect mount info
result = subprocess.run(['df', '-T', str(directory)],
capture_output=True, text=True, timeout=5)
if result.returncode == 0:
output = result.stdout.lower()
# The following local filesystems contain the 'fuse' keyword
local_fs = [
"fuse.shfs", # Unraid
"zfuse.zfsv", # ZSpace NAS (zfuse.zfsv2, zfuse.zfsv3, ...)
# TBD
]
if any(fs in output for fs in local_fs):
return False
network_fs = ['nfs', 'cifs', 'smbfs', 'fuse', 'sshfs', 'ftpfs']
return any(fs in output for fs in network_fs)
elif system == 'Darwin':
# macOS check
result = subprocess.run(['df', '-T', str(directory)],
capture_output=True, text=True, timeout=5)
if result.returncode == 0:
output = result.stdout.lower()
return 'nfs' in output or 'smbfs' in output
elif system == 'Windows':
# Windows: check for a UNC network path
return str(directory).startswith('\\\\')
except Exception as e:
logger.debug(f"检测网络文件系统时出错: {e}")
return False
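The removed `is_network_filesystem` now lives in `SystemUtils` (per the call-site change above). For reference, a hedged reconstruction of the same `df -T` probe; the filesystem-name lists mirror the deleted code, and non-Linux behavior is simplified to the Windows UNC check:

```python
import platform
import subprocess
from pathlib import Path

# Local filesystems whose type strings contain "fuse" (mirrors the deleted list)
LOCAL_FUSE_FS = ["fuse.shfs", "zfuse.zfsv"]  # Unraid, ZSpace NAS
NETWORK_FS = ["nfs", "cifs", "smbfs", "fuse", "sshfs", "ftpfs"]

def is_network_filesystem(directory: Path) -> bool:
    try:
        system = platform.system()
        if system == "Linux":
            # `df -T` prints the filesystem type for the mount holding `directory`
            result = subprocess.run(["df", "-T", str(directory)],
                                    capture_output=True, text=True, timeout=5)
            if result.returncode == 0:
                output = result.stdout.lower()
                if any(fs in output for fs in LOCAL_FUSE_FS):
                    return False  # fuse-named but actually local
                return any(fs in output for fs in NETWORK_FS)
        elif system == "Windows":
            # A path starting with \\ is a UNC network share
            return str(directory).startswith("\\\\")
    except Exception:
        pass  # on any probe failure, assume a local filesystem
    return False
```

Matching substrings in `df` output is deliberately coarse, which is why the deleted code needed the fuse allow-list in the first place.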
def init(self):
"""
Start monitoring


@@ -1,340 +0,0 @@
import threading
import time
from collections import defaultdict, deque
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Dict, List, Any
import psutil
from fastapi import Request, Response
from fastapi.responses import PlainTextResponse
from prometheus_client import Counter, Histogram, Gauge, generate_latest, CONTENT_TYPE_LATEST
from prometheus_fastapi_instrumentator import Instrumentator
from app.core.config import settings
from app.log import logger
@dataclass
class RequestMetrics:
"""
Request metrics dataclass
"""
path: str
method: str
status_code: int
response_time: float
timestamp: datetime
client_ip: str
user_agent: str
@dataclass
class PerformanceSnapshot:
"""
Performance snapshot dataclass
"""
timestamp: datetime
cpu_usage: float
memory_usage: float
active_requests: int
request_rate: float
avg_response_time: float
error_rate: float
slow_requests: int
class FastAPIMonitor:
"""
FastAPI performance monitor
"""
def __init__(self, max_history: int = 1000, window_size: int = 60):
self.max_history = max_history
self.window_size = window_size  # seconds
# Request history
self.request_history: deque = deque(maxlen=max_history)
# Live counters
self.active_requests = 0
self.total_requests = 0
self.error_requests = 0
self.slow_requests = 0  # requests slower than 1 second
# Sliding-window stats
self.window_requests: deque = deque(maxlen=window_size)
self.window_response_times: deque = deque(maxlen=window_size)
# Thread lock
self._lock = threading.Lock()
# Performance thresholds
self.slow_request_threshold = 1.0  # 1 second
self.error_threshold = 0.05 # 5%
self.cpu_threshold = 80.0 # 80%
self.memory_threshold = 80.0 # 80%
# Alert state
self.alerts: List[str] = []
logger.info("FastAPI性能监控器已初始化")
def record_request(self, request: Request, response: Response, response_time: float):
"""
Record request metrics
"""
with self._lock:
# Build the request metrics record
metrics = RequestMetrics(
path=str(request.url.path),
method=request.method,
status_code=response.status_code,
response_time=response_time,
timestamp=datetime.now(),
client_ip=request.client.host if request.client else "unknown",
user_agent=request.headers.get("user-agent", "unknown")
)
# Append to history
self.request_history.append(metrics)
# Update counters
self.total_requests += 1
if response.status_code >= 400:
self.error_requests += 1
if response_time > self.slow_request_threshold:
self.slow_requests += 1
# Add to the sliding window
self.window_requests.append(metrics)
self.window_response_times.append(response_time)
def start_request(self):
"""
Mark a request as started
"""
with self._lock:
self.active_requests += 1
def end_request(self):
"""
Mark a request as finished
"""
with self._lock:
self.active_requests = max(0, self.active_requests - 1)
def get_performance_snapshot(self) -> PerformanceSnapshot:
"""
Take a performance snapshot
"""
with self._lock:
now = datetime.now()
# Request rate (per minute)
recent_requests = [
req for req in self.window_requests
if now - req.timestamp < timedelta(seconds=self.window_size)
]
request_rate = len(recent_requests) / (self.window_size / 60)
# Average response time
recent_response_times = [
rt for rt in self.window_response_times
if len(self.window_response_times) > 0
]
avg_response_time = sum(recent_response_times) / len(recent_response_times) if recent_response_times else 0
# Error rate
error_rate = self.error_requests / self.total_requests if self.total_requests > 0 else 0
# System resource usage
cpu_usage = psutil.cpu_percent(interval=0.1)
memory_usage = psutil.virtual_memory().percent
return PerformanceSnapshot(
timestamp=now,
cpu_usage=cpu_usage,
memory_usage=memory_usage,
active_requests=self.active_requests,
request_rate=request_rate,
avg_response_time=avg_response_time,
error_rate=error_rate,
slow_requests=self.slow_requests
)
def get_top_endpoints(self, limit: int = 10) -> List[Dict[str, Any]]:
"""
Get the busiest endpoints
"""
with self._lock:
endpoint_stats = defaultdict(lambda: {
'count': 0,
'total_time': 0,
'errors': 0,
'avg_time': 0
})
for req in self.request_history:
key = f"{req.method} {req.path}"
endpoint_stats[key]['count'] += 1
endpoint_stats[key]['total_time'] += req.response_time
if req.status_code >= 400:
endpoint_stats[key]['errors'] += 1
# Compute average time per endpoint
for stats in endpoint_stats.values():
if stats['count'] > 0:
stats['avg_time'] = stats['total_time'] / stats['count']
# Sort by request count
sorted_endpoints = sorted(
[{'endpoint': k, **v} for k, v in endpoint_stats.items()],
key=lambda x: x['count'],
reverse=True
)
return sorted_endpoints[:limit]
def get_recent_errors(self, limit: int = 20) -> List[Dict[str, Any]]:
"""
Get recent error requests
"""
with self._lock:
errors = [
{
'timestamp': req.timestamp.isoformat(),
'method': req.method,
'path': req.path,
'status_code': req.status_code,
'response_time': req.response_time,
'client_ip': req.client_ip
}
for req in self.request_history
if req.status_code >= 400
]
return errors[-limit:]
def check_alerts(self) -> List[str]:
"""
Check alert conditions
"""
snapshot = self.get_performance_snapshot()
alerts = []
if snapshot.error_rate > self.error_threshold:
alerts.append(f"错误率过高: {snapshot.error_rate:.2%}")
if snapshot.cpu_usage > self.cpu_threshold:
alerts.append(f"CPU使用率过高: {snapshot.cpu_usage:.1f}%")
if snapshot.memory_usage > self.memory_threshold:
alerts.append(f"内存使用率过高: {snapshot.memory_usage:.1f}%")
if snapshot.avg_response_time > self.slow_request_threshold:
alerts.append(f"平均响应时间过长: {snapshot.avg_response_time:.2f}s")
if snapshot.request_rate > 1000:  # 1000 requests per minute
alerts.append(f"请求率过高: {snapshot.request_rate:.0f} req/min")
self.alerts = alerts
return alerts
# Global monitor instance
monitor = FastAPIMonitor()
def setup_prometheus_metrics(app):
"""
Set up Prometheus metrics
"""
if not settings.PERFORMANCE_MONITOR_ENABLE:
return
# Create Prometheus metrics
request_counter = Counter(
"http_requests_total",
"Total number of HTTP requests",
["method", "endpoint", "status"]
)
request_duration = Histogram(
"http_request_duration_seconds",
"HTTP request duration in seconds",
["method", "endpoint"]
)
active_requests = Gauge(
"http_active_requests",
"Number of active HTTP requests"
)
# Custom metric collection callback
def custom_metrics(request: Request, response: Response, response_time: float):
request_counter.labels(
method=request.method,
endpoint=request.url.path,
status=response.status_code
).inc()
request_duration.labels(
method=request.method,
endpoint=request.url.path
).observe(response_time)
active_requests.set(monitor.active_requests)
# Install the Prometheus instrumentator
Instrumentator().instrument(app).expose(app, include_in_schema=False, should_gzip=True)
# Add custom metrics via middleware
@app.middleware("http")
async def monitor_middleware(request: Request, call_next):
start_time = time.time()
# Request started
monitor.start_request()
try:
response = await call_next(request)
response_time = time.time() - start_time
# Record request metrics
monitor.record_request(request, response, response_time)
# Update Prometheus metrics
custom_metrics(request, response, response_time)
return response
except Exception as e:
response_time = time.time() - start_time
logger.error(f"请求处理异常: {e}")
# Build an error response
response = Response(
content=str(e),
status_code=500,
media_type="text/plain"
)
# Record the failed request
monitor.record_request(request, response, response_time)
return response
finally:
# Request finished
monitor.end_request()
logger.info("Prometheus指标监控已设置")
def get_metrics_response():
"""
Return the Prometheus metrics response
"""
return PlainTextResponse(
generate_latest(),
media_type=CONTENT_TYPE_LATEST
)


@@ -1,3 +1,7 @@
import asyncio
import gc
import inspect
import multiprocessing
import threading
import traceback
from datetime import datetime, timedelta
@@ -27,7 +31,8 @@ from app.helper.wallpaper import WallpaperHelper
from app.log import logger
from app.schemas import Notification, NotificationType, Workflow, ConfigChangeEventData
from app.schemas.types import EventType, SystemConfigKey
from app.utils.singleton import Singleton
from app.utils.gc import get_memory_usage
from app.utils.singleton import SingletonClass
from app.utils.timer import TimerUtils
lock = threading.Lock()
@@ -37,7 +42,7 @@ class SchedulerChain(ChainBase):
pass
class Scheduler(metaclass=Singleton):
class Scheduler(metaclass=SingletonClass):
"""
Scheduled task management
"""
@@ -55,6 +60,8 @@ class Scheduler(metaclass=Singleton):
self._auth_count = 0
# Whether the auth-failure notification has been sent
self._auth_message = False
# Current event loop
self.loop = asyncio.get_event_loop()
self.init()
@eventmanager.register(EventType.ConfigChanged)
@@ -67,7 +74,8 @@ class Scheduler(metaclass=Singleton):
return
event_data: ConfigChangeEventData = event.event_data
if event_data.key not in ['DEV', 'COOKIECLOUD_INTERVAL', 'MEDIASERVER_SYNC_INTERVAL', 'SUBSCRIBE_SEARCH',
'SUBSCRIBE_MODE', 'SUBSCRIBE_RSS_INTERVAL', 'SITEDATA_REFRESH_INTERVAL']:
'SUBSCRIBE_SEARCH_INTERVAL', 'SUBSCRIBE_MODE', 'SUBSCRIBE_RSS_INTERVAL',
'SITEDATA_REFRESH_INTERVAL']:
return
logger.info(f"配置项 {event_data.key} 变更,重新初始化定时服务...")
self.init()
@@ -90,17 +98,17 @@ class Scheduler(metaclass=Singleton):
"cookiecloud": {
"name": "同步CookieCloud站点",
"func": SiteChain().sync_cookies,
"running": False,
"running": False
},
"mediaserver_sync": {
"name": "同步媒体服务器",
"func": MediaServerChain().sync,
"running": False,
"running": False
},
"subscribe_tmdb": {
"name": "订阅元数据更新",
"func": SubscribeChain().check,
"running": False,
"running": False
},
"subscribe_search": {
"name": "订阅搜索补全",
@@ -121,47 +129,65 @@ class Scheduler(metaclass=Singleton):
"subscribe_refresh": {
"name": "订阅刷新",
"func": SubscribeChain().refresh,
"running": False,
"running": False
},
"subscribe_follow": {
"name": "关注的订阅分享",
"func": SubscribeChain().follow,
"running": False,
"running": False
},
"transfer": {
"name": "下载文件整理",
"func": TransferChain().process,
"running": False,
"running": False
},
"clear_cache": {
"name": "缓存清理",
"func": self.clear_cache,
"running": False,
"running": False
},
"user_auth": {
"name": "用户认证检查",
"func": self.user_auth,
"running": False,
"running": False
},
"scheduler_job": {
"name": "公共定时服务",
"func": SchedulerChain().scheduler_job,
"running": False,
"running": False
},
"random_wallpager": {
"name": "壁纸缓存",
"func": WallpaperHelper().get_wallpapers,
"running": False,
"running": False
},
"sitedata_refresh": {
"name": "站点数据刷新",
"func": SiteChain().refresh_userdatas,
"running": False,
"running": False
},
"recommend_refresh": {
"name": "推荐缓存",
"func": RecommendChain().refresh_recommend,
"running": False
},
"plugin_market_refresh": {
"name": "插件市场缓存",
"func": PluginManager().async_get_online_plugins,
"running": False,
"kwargs": {
"force": True
}
},
"subscribe_calendar_cache": {
"name": "订阅日历缓存",
"func": SubscribeChain().cache_calendar,
"running": False
},
"full_gc": {
"name": "主动内存回收",
"func": self.full_gc,
"running": False
}
}
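The registry above maps each job id to its display name, the callable, a running flag, and optional per-job kwargs (as in `plugin_market_refresh`, which passes `force=True`). A minimal sketch of how such a registry is dispatched — the entry and helper names here are illustrative, not the project's API:

```python
# Hypothetical miniature of the scheduler's job registry: each entry
# carries the callable plus optional per-job keyword arguments.
jobs = {
    "plugin_market_refresh": {
        "name": "插件市场缓存",
        "func": lambda force=False: f"refreshed(force={force})",
        "running": False,
        "kwargs": {"force": True},
    },
}

def start(job_id: str):
    job = jobs[job_id]
    kwargs = job.get("kwargs") or {}  # fall back to no arguments
    return job["func"](**kwargs)

print(start("plugin_market_refresh"))
```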
@@ -180,7 +206,7 @@ class Scheduler(metaclass=Singleton):
id="cookiecloud",
name="同步CookieCloud站点",
minutes=int(settings.COOKIECLOUD_INTERVAL),
next_run_time=datetime.now(pytz.timezone(settings.TZ)) + timedelta(minutes=1),
next_run_time=datetime.now(pytz.timezone(settings.TZ)) + timedelta(minutes=5),
kwargs={
'job_id': 'cookiecloud'
}
@@ -195,7 +221,7 @@ class Scheduler(metaclass=Singleton):
id="mediaserver_sync",
name="同步媒体服务器",
hours=int(settings.MEDIASERVER_SYNC_INTERVAL),
next_run_time=datetime.now(pytz.timezone(settings.TZ)) + timedelta(minutes=5),
next_run_time=datetime.now(pytz.timezone(settings.TZ)) + timedelta(minutes=10),
kwargs={
'job_id': 'mediaserver_sync'
}
@@ -232,7 +258,7 @@ class Scheduler(metaclass=Singleton):
"interval",
id="subscribe_search",
name="订阅搜索补全",
hours=24,
hours=settings.SUBSCRIBE_SEARCH_INTERVAL,
kwargs={
'job_id': 'subscribe_search'
}
@@ -301,7 +327,7 @@ class Scheduler(metaclass=Singleton):
id="random_wallpager",
name="壁纸缓存",
minutes=30,
next_run_time=datetime.now(pytz.timezone(settings.TZ)) + timedelta(seconds=3),
next_run_time=datetime.now(pytz.timezone(settings.TZ)) + timedelta(seconds=1),
kwargs={
'job_id': 'random_wallpager'
}
@@ -363,21 +389,56 @@ class Scheduler(metaclass=Singleton):
id="recommend_refresh",
name="推荐缓存",
hours=24,
next_run_time=datetime.now(pytz.timezone(settings.TZ)) + timedelta(seconds=3),
next_run_time=datetime.now(pytz.timezone(settings.TZ)) + timedelta(seconds=5),
kwargs={
'job_id': 'recommend_refresh'
}
)
# 插件市场缓存
self._scheduler.add_job(
self.start,
"interval",
id="plugin_market_refresh",
name="插件市场缓存",
minutes=30,
kwargs={
'job_id': 'plugin_market_refresh'
}
)
# 订阅日历缓存
self._scheduler.add_job(
self.start,
"interval",
id="subscribe_calendar_cache",
name="订阅日历缓存",
hours=6,
next_run_time=datetime.now(pytz.timezone(settings.TZ)) + timedelta(minutes=2),
kwargs={
'job_id': 'subscribe_calendar_cache'
}
)
# 主动内存回收
if settings.MEMORY_GC_INTERVAL:
self._scheduler.add_job(
self.start,
"interval",
id="full_gc",
name="主动内存回收",
minutes=settings.MEMORY_GC_INTERVAL,
kwargs={
'job_id': 'full_gc'
}
)
# 初始化工作流服务
self.init_workflow_jobs()
# 初始化插件服务
self.init_plugin_jobs()
# 打印服务
self._scheduler.print_jobs()
# 启动定时服务
self._scheduler.start()
@@ -409,6 +470,13 @@ class Scheduler(metaclass=Singleton):
"""
启动定时服务
"""
def __start_coro(coro):
"""
启动协程
"""
return asyncio.run_coroutine_threadsafe(coro, self.loop)
# 获取定时任务
job = self.__prepare_job(job_id)
if not job:
@@ -417,7 +485,22 @@ class Scheduler(metaclass=Singleton):
try:
if not kwargs:
kwargs = job.get("kwargs") or {}
job["func"](*args, **kwargs)
func = job.get("func")
if not func:
return
# 是否多进程运行
run_in_process = job.get("run_in_process", False)
if inspect.iscoroutinefunction(func):
# 协程函数
__start_coro(func(*args, **kwargs))
elif run_in_process:
# 多进程运行
p = multiprocessing.Process(target=func, args=args, kwargs=kwargs)
p.start()
p.join()
else:
# 普通函数
func(*args, **kwargs)
except Exception as e:
logger.error(f"定时任务 {job.get('name')} 执行失败:{str(e)} - {traceback.format_exc()}")
MessageHelper().put(title=f"{job.get('name')} 执行失败",
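The dispatch above — coroutine functions submitted to a long-lived loop via `asyncio.run_coroutine_threadsafe`, plain functions called directly (or in a subprocess) — can be sketched standalone. The loop/thread setup below is an assumption for illustration, standing in for `self.loop`:

```python
import asyncio
import inspect
import threading

# A long-lived event loop running in a background thread, mirroring
# the scheduler's self.loop (this setup is illustrative).
loop = asyncio.new_event_loop()
threading.Thread(target=loop.run_forever, daemon=True).start()

def run_job(func, *args, **kwargs):
    if inspect.iscoroutinefunction(func):
        # Coroutine function: submit to the background loop thread-safely
        future = asyncio.run_coroutine_threadsafe(func(*args, **kwargs), loop)
        return future.result(timeout=5)
    # Plain function: call in the current thread
    return func(*args, **kwargs)

async def async_job(x):
    await asyncio.sleep(0)
    return x * 2

print(run_job(async_job, 21))
print(run_job(len, "abcd"))
```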
@@ -519,7 +602,7 @@ class Scheduler(metaclass=Singleton):
except JobLookupError:
pass
if job_removed:
logger.info(f"移除插件服务({plugin_name}){service.get('name')}")
logger.info(f"移除插件服务({plugin_name}){service.get('name')}") # noqa
except Exception as e:
logger.error(f"移除插件服务失败:{str(e)} - {job_id}: {service}")
SchedulerChain().messagehelper.put(title=f"插件 {plugin_name} 服务移除失败",
@@ -684,6 +767,17 @@ class Scheduler(metaclass=Singleton):
"""
SchedulerChain().clear_cache()
@staticmethod
def full_gc():
"""
主动内存回收
"""
memory_before = get_memory_usage()
collected = gc.collect()
memory_after = get_memory_usage()
memory_freed = memory_before - memory_after
logger.info(f"主动内存回收完成,回收对象数: {collected},释放内存: {memory_freed:.2f} MB")
def user_auth(self):
"""
用户认证检查

View File

@@ -77,7 +77,7 @@ class SiteUserData(BaseModel):
# 用户名
username: Optional[str] = None
# 用户ID
userid: Optional[Union[int, str]] = None
userid: Optional[str] = None
# 用户等级
user_level: Optional[str] = None
# 加入时间

View File

@@ -20,6 +20,8 @@ class Token(BaseModel):
level: int = 1
# 详细权限
permissions: Optional[dict] = Field(default_factory=dict)
# 是否显示配置向导
widzard: Optional[bool] = None
class TokenPayload(BaseModel):

View File

@@ -9,6 +9,13 @@ class MediaType(Enum):
UNKNOWN = '未知'
# 排序类型枚举
class SortType(Enum):
TIME = "time" # 按时间排序
COUNT = "count" # 按人数排序
RATING = "rating" # 按评分排序
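A value-backed `Enum` like this lets an endpoint parse the incoming `sort_type` query string directly and fail loudly on unknown values. A small sketch — the record shape is hypothetical:

```python
from enum import Enum

class SortType(Enum):
    TIME = "time"      # 按时间排序
    COUNT = "count"    # 按人数排序
    RATING = "rating"  # 按评分排序

# Parse the raw query parameter (raises ValueError if invalid)
sort_type = SortType("rating")

# Hypothetical subscribe records sorted by the chosen field, descending
records = [{"name": "A", "rating": 7.1}, {"name": "B", "rating": 9.0}]
records.sort(key=lambda r: r[sort_type.value], reverse=True)
print([r["name"] for r in records])
```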
# 种子状态
class TorrentStatus(Enum):
TRANSFER = "可转移"
@@ -175,8 +182,6 @@ class SystemConfigKey(Enum):
UserCustomCSS = "UserCustomCSS"
# 用户已安装的插件
UserInstalledPlugins = "UserInstalledPlugins"
# 插件安装统计
PluginInstallReport = "PluginInstallReport"
# 插件文件夹分组配置
PluginFolders = "PluginFolders"
# 默认电影订阅规则
@@ -193,6 +198,10 @@ class SystemConfigKey(Enum):
NotificationTemplates = "NotificationTemplates"
# 刮削开关设置
ScrapingSwitchs = "ScrapingSwitchs"
# 插件安装统计
PluginInstallReport = "PluginInstallReport"
# 配置向导状态
SetupWizardState = "SetupWizardState"
# 处理进度Key字典

View File

@@ -35,10 +35,10 @@ async def lifespan(app: FastAPI):
定义应用的生命周期事件
"""
print("Starting up...")
# 初始化模块
init_modules()
# 初始化路由
init_routers(app)
# 初始化模块
init_modules()
# 恢复插件备份
SystemChain().restore_plugins()
# 初始化插件

View File

@@ -1,83 +0,0 @@
import asyncio
import threading
from concurrent.futures import ThreadPoolExecutor
from typing import Coroutine, Any, TypeVar
T = TypeVar('T')
class AsyncUtils:
"""
异步工具类,用于在同步环境中调用异步方法
"""
@staticmethod
def run_async(coro: Coroutine[Any, Any, T]) -> T:
"""
在同步环境中安全地执行异步协程
:param coro: 要执行的协程
:return: 协程的返回值
:raises: 协程执行过程中的任何异常
"""
try:
# 尝试获取当前运行的事件循环
asyncio.get_running_loop()
# 如果有运行中的事件循环,在新线程中执行
return AsyncUtils._run_in_thread(coro)
except RuntimeError:
# 没有运行中的事件循环,直接使用 asyncio.run
return asyncio.run(coro)
@staticmethod
def _run_in_thread(coro: Coroutine[Any, Any, T]) -> T:
"""
在新线程中创建事件循环并执行协程
:param coro: 要执行的协程
:return: 协程的返回值
"""
result = None
exception = None
def _run():
nonlocal result, exception
try:
# 在新线程中创建新的事件循环
new_loop = asyncio.new_event_loop()
asyncio.set_event_loop(new_loop)
try:
result = new_loop.run_until_complete(coro)
finally:
new_loop.close()
except Exception as e:
exception = e
# 在新线程中执行
thread = threading.Thread(target=_run)
thread.start()
thread.join()
if exception:
raise exception
return result
@staticmethod
def run_async_in_executor(coro: Coroutine[Any, Any, T]) -> T:
"""
使用线程池执行器在新线程中运行异步协程
:param coro: 要执行的协程
:return: 协程的返回值
"""
try:
# 检查是否有运行中的事件循环
asyncio.get_running_loop()
# 有运行中的事件循环,使用线程池
with ThreadPoolExecutor() as executor:
future = executor.submit(asyncio.run, coro)
return future.result()
except RuntimeError:
# 没有运行中的事件循环,直接运行
return asyncio.run(coro)

app/utils/gc.py Normal file
View File

@@ -0,0 +1,125 @@
"""
内存回收装饰器模块
提供装饰器用于在函数执行后立即回收内存
"""
import gc
import functools
import psutil
import os
from typing import Callable, Any, Optional
from app.log import logger
def memory_gc(force_collect: bool = True,
log_memory_usage: bool = False) -> Callable:
"""
内存回收装饰器
Args:
force_collect: 是否强制执行垃圾回收,默认True
log_memory_usage: 是否记录内存使用日志,默认False
Returns:
装饰器函数
"""
def decorator(func: Callable) -> Callable:
@functools.wraps(func)
def wrapper(*args, **kwargs) -> Any:
# 记录函数执行前的内存使用情况
memory_before = None
memory_after = None
if log_memory_usage:
memory_before = get_memory_usage()
logger.info(f"函数 {func.__name__} 执行前内存使用: {memory_before}")
try:
# 执行原函数
result = func(*args, **kwargs)
# 记录函数执行后的内存使用情况
if log_memory_usage:
memory_after = get_memory_usage()
logger.info(f"函数 {func.__name__} 执行后内存使用: {memory_after}")
if memory_before is not None:
memory_diff = memory_after - memory_before
logger.info(f"函数 {func.__name__} 内存变化: {memory_diff:.2f} MB")
return result
finally:
# 强制垃圾回收
if force_collect:
collected = gc.collect()
if log_memory_usage:
logger.info(f"函数 {func.__name__} 垃圾回收完成,回收对象数: {collected}")
# 记录垃圾回收后的内存使用情况
if log_memory_usage:
memory_after_gc = get_memory_usage()
logger.info(f"函数 {func.__name__} 垃圾回收后内存使用: {memory_after_gc}")
if memory_after is not None:
memory_freed = memory_after - memory_after_gc
logger.info(f"函数 {func.__name__} 释放内存: {memory_freed:.2f} MB")
return wrapper
return decorator
def get_memory_usage() -> float:
"""
获取当前进程的内存使用情况(MB)
Returns:
内存使用量MB
"""
try:
process = psutil.Process(os.getpid())
memory_info = process.memory_info()
return memory_info.rss / 1024 / 1024 # 转换为MB
except Exception as e:
logger.warning(f"获取内存使用情况失败: {e}")
return 0.0
def memory_monitor(threshold_mb: Optional[float] = None) -> Callable:
"""
内存监控装饰器,当内存使用超过阈值时自动触发垃圾回收
Args:
threshold_mb: 内存阈值(MB),超过此值将触发垃圾回收
Returns:
装饰器函数
"""
def decorator(func: Callable) -> Callable:
@functools.wraps(func)
def wrapper(*args, **kwargs) -> Any:
# 检查内存使用情况
current_memory = get_memory_usage()
if threshold_mb and current_memory > threshold_mb:
logger.warning(f"内存使用超过阈值 {threshold_mb}MB,当前使用: {current_memory:.2f}MB")
collected = gc.collect()
logger.info(f"自动垃圾回收完成,回收对象数: {collected}")
# 执行原函数
result = func(*args, **kwargs)
# 执行后再次检查并回收
if threshold_mb:
memory_after = get_memory_usage()
if memory_after > threshold_mb:
collected = gc.collect()
logger.info(f"函数执行后垃圾回收完成,回收对象数: {collected}")
return result
return wrapper
return decorator
# 便捷的装饰器别名
memory_cleanup = memory_gc
auto_gc = memory_gc(force_collect=True, log_memory_usage=True)
memory_watch = memory_monitor
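The decorators above depend on `psutil` for RSS readings; the core run-then-force-collect pattern can be shown with the standard library alone. A simplified sketch mirroring the `force_collect` branch of `memory_gc`, not the module's full behavior:

```python
import functools
import gc

def force_gc_after(func):
    """Run func, then force a cyclic collection in a finally block."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        finally:
            collected = gc.collect()
            print(f"{func.__name__}: reclaimed {collected} objects")
    return wrapper

@force_gc_after
def build_cycles(n: int) -> int:
    # Reference cycles are invisible to refcounting; only gc.collect frees them
    for _ in range(n):
        a, b = [], []
        a.append(b)
        b.append(a)
    return n

print(build_cycles(100))
```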

View File

@@ -1,178 +0,0 @@
import sys
import time
from collections import deque
from typing import Any, Dict, Set
from app.log import logger
class MemoryCalculator:
"""
内存计算器,用于递归计算对象的内存占用
"""
def __init__(self):
# 缓存已计算的对象ID避免重复计算
self._calculated_ids: Set[int] = set()
# 最大递归深度,防止无限递归
self._max_depth = 10
# 最大对象数量,防止计算过多对象
self._max_objects = 10000
def calculate_object_memory(self, obj: Any, max_depth: int = None, max_objects: int = None) -> Dict[str, Any]:
"""
计算对象的内存占用
:param obj: 要计算的对象
:param max_depth: 最大递归深度
:param max_objects: 最大对象数量
:return: 内存统计信息
"""
if max_depth is None:
max_depth = self._max_depth
if max_objects is None:
max_objects = self._max_objects
# 重置缓存
self._calculated_ids.clear()
start_time = time.time()
object_details = []
try:
# 递归计算内存
memory_info = self._calculate_recursive(obj, depth=0, max_depth=max_depth,
max_objects=max_objects, object_count=0)
total_memory = memory_info['total_memory']
object_count = memory_info['object_count']
object_details = memory_info['object_details']
except Exception as e:
logger.error(f"计算对象内存时出错:{str(e)}")
total_memory = 0
object_count = 0
calculation_time = time.time() - start_time
return {
'total_memory_bytes': total_memory,
'total_memory_mb': round(total_memory / (1024 * 1024), 2),
'object_count': object_count,
'calculation_time_ms': round(calculation_time * 1000, 2),
'object_details': object_details[:10] # 只返回前10个最大的对象
}
def _calculate_recursive(self, obj: Any, depth: int, max_depth: int,
max_objects: int, object_count: int) -> Dict[str, Any]:
"""
递归计算对象内存
"""
if depth > max_depth or object_count > max_objects:
return {
'total_memory': 0,
'object_count': object_count,
'object_details': []
}
total_memory = 0
object_details = []
# 获取对象ID避免重复计算
obj_id = id(obj)
if obj_id in self._calculated_ids:
return {
'total_memory': 0,
'object_count': object_count,
'object_details': []
}
self._calculated_ids.add(obj_id)
object_count += 1
try:
# 计算对象本身的内存
obj_memory = sys.getsizeof(obj)
total_memory += obj_memory
# 记录大对象
if obj_memory > 1024: # 大于1KB的对象
object_details.append({
'type': type(obj).__name__,
'memory_bytes': obj_memory,
'memory_mb': round(obj_memory / (1024 * 1024), 2),
'depth': depth
})
# 递归计算容器对象的内容
if depth < max_depth:
container_memory = self._calculate_container_memory(
obj, depth + 1, max_depth, max_objects, object_count
)
total_memory += container_memory['total_memory']
object_count = container_memory['object_count']
object_details.extend(container_memory['object_details'])
except Exception as e:
logger.debug(f"计算对象 {type(obj).__name__} 内存时出错:{str(e)}")
return {
'total_memory': total_memory,
'object_count': object_count,
'object_details': object_details
}
def _calculate_container_memory(self, obj: Any, depth: int, max_depth: int,
max_objects: int, object_count: int) -> Dict[str, Any]:
"""
计算容器对象的内存
"""
total_memory = 0
object_details = []
try:
# 处理不同类型的容器
if isinstance(obj, (list, tuple, deque)):
for item in obj:
if object_count > max_objects:
break
item_memory = self._calculate_recursive(item, depth, max_depth, max_objects, object_count)
total_memory += item_memory['total_memory']
object_count = item_memory['object_count']
object_details.extend(item_memory['object_details'])
elif isinstance(obj, dict):
for key, value in obj.items():
if object_count > max_objects:
break
# 计算key的内存
key_memory = self._calculate_recursive(key, depth, max_depth, max_objects, object_count)
total_memory += key_memory['total_memory']
object_count = key_memory['object_count']
object_details.extend(key_memory['object_details'])
# 计算value的内存
value_memory = self._calculate_recursive(value, depth, max_depth, max_objects, object_count)
total_memory += value_memory['total_memory']
object_count = value_memory['object_count']
object_details.extend(value_memory['object_details'])
elif hasattr(obj, '__dict__'):
# 处理有__dict__属性的对象
for attr_name, attr_value in obj.__dict__.items():
if object_count > max_objects:
break
# 跳过一些特殊属性
if attr_name.startswith('_') and attr_name not in ['_calculated_ids']:
continue
attr_memory = self._calculate_recursive(attr_value, depth, max_depth, max_objects, object_count)
total_memory += attr_memory['total_memory']
object_count = attr_memory['object_count']
object_details.extend(attr_memory['object_details'])
except Exception as e:
logger.debug(f"计算容器对象 {type(obj).__name__} 内存时出错:{str(e)}")
return {
'total_memory': total_memory,
'object_count': object_count,
'object_details': object_details
}

View File

@@ -938,3 +938,19 @@ class StringUtils:
if isinstance(content, bytes) and content.startswith(b"magnet:"):
return True
return False
@staticmethod
def natural_sort_key(text: str) -> List[Union[int, str]]:
"""
自然排序
将字符串拆分为数字和非数字部分,数字部分转换为整数,非数字部分转换为小写字母
:param text: 要处理的字符串
:return: 用于排序的数字和字符串列表
"""
if text is None:
return []
if not isinstance(text, str):
text = str(text)
return [int(part) if part.isdigit() else part.lower() for part in re.split(r'(\d+)', text)]
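`natural_sort_key` splits on digit runs so that "10" sorts after "2" instead of lexicographically before it. The helper can be exercised standalone:

```python
import re
from typing import List, Union

def natural_sort_key(text: str) -> List[Union[int, str]]:
    # Digit runs become ints, everything else is lower-cased text
    if text is None:
        return []
    if not isinstance(text, str):
        text = str(text)
    return [int(part) if part.isdigit() else part.lower()
            for part in re.split(r"(\d+)", text)]

names = ["Episode 10", "Episode 2", "episode 1"]
print(sorted(names, key=natural_sort_key))
```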

View File

@@ -527,6 +527,45 @@ class SystemUtils:
print(f"Error occurred: {e}")
return False
@staticmethod
def is_network_filesystem(directory: Path) -> bool:
"""
检测是否为网络文件系统
:param directory: 目录路径
:return: 是否为网络文件系统
"""
try:
system = platform.system()
if system == 'Linux':
# 检查挂载信息
result = subprocess.run(['df', '-T', str(directory)],
capture_output=True, text=True, timeout=5)
if result.returncode == 0:
output = result.stdout.lower()
# 以下本地文件系统含有fuse关键字
local_fs = [
"fuse.shfs", # Unraid
"zfuse.zfsv", # 极空间(zfuse.zfsv2、zfuse.zfsv3、...)
# TBD
]
if any(fs in output for fs in local_fs):
return False
network_fs = ['nfs', 'cifs', 'smbfs', 'fuse', 'sshfs', 'ftpfs']
return any(fs in output for fs in network_fs)
elif system == 'Darwin':
# macOS 检查
result = subprocess.run(['df', '-T', str(directory)],
capture_output=True, text=True, timeout=5)
if result.returncode == 0:
output = result.stdout.lower()
return 'nfs' in output or 'smbfs' in output
elif system == 'Windows':
# Windows 检查网络驱动器
return str(directory).startswith('\\\\')
except Exception as e:
print(f"Error occurred: {e}")
return False
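In the Linux branch, the allowlist of fuse-named local filesystems must be checked before the generic patterns, since `fuse` alone would otherwise match Unraid's `fuse.shfs`. The classification of the `df -T` output can be isolated as a pure function — the function name is ours, not the project's:

```python
def is_network_fs_output(df_output: str) -> bool:
    """Classify `df -T` output; allowlisted fuse-based local
    filesystems are tested before the network patterns."""
    output = df_output.lower()
    local_fs = ["fuse.shfs", "zfuse.zfsv"]  # Unraid, 极空间
    if any(fs in output for fs in local_fs):
        return False
    network_fs = ["nfs", "cifs", "smbfs", "fuse", "sshfs", "ftpfs"]
    return any(fs in output for fs in network_fs)

print(is_network_fs_output("/mnt/user fuse.shfs 1T"))   # local despite "fuse"
print(is_network_fs_output("server:/data nfs4 1T"))     # network
print(is_network_fs_output("/dev/sda1 ext4 1T"))        # local
```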
@staticmethod
def is_same_disk(src: Path, dest: Path) -> bool:
"""

View File

@@ -8,6 +8,7 @@
# `release_year` 发行年份格式YYYY电影实际对应`release_date`字段,电视剧实际对应`first_air_date`字段,支持范围设定,如:`YYYY-YYYY`
# themoviedb 详情API返回的其它一级字段
# 4. 配置多项条件时需要同时满足,一个条件需要匹配多个值是使用`,`分隔
# 5. !条件值表示排除该值
# 配置电影的分类策略
movie:

View File

@@ -31,13 +31,16 @@ def upgrade() -> None:
# 初始化超级管理员
_user = User.get_by_name(db=db, name=settings.SUPERUSER)
if not _user:
# 生成随机密码
random_password = secrets.token_urlsafe(16)
logger.info(
f"【超级管理员初始密码】{random_password} 请登录系统后在设定中修改。 注:该密码只会显示一次,请注意保存。")
if settings.SUPERUSER_PASSWORD:
init_password = settings.SUPERUSER_PASSWORD
else:
# 生成随机密码
init_password = secrets.token_urlsafe(16)
logger.info(
f"【超级管理员初始密码】{init_password} 请登录系统后在设定中修改。 注:该密码只会显示一次,请注意保存。")
_user = User(
name=settings.SUPERUSER,
hashed_password=get_password_hash(random_password),
hashed_password=get_password_hash(init_password),
email="admin@movie-pilot.org",
is_superuser=True,
avatar=""
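The initialization above prefers a configured `SUPERUSER_PASSWORD` and otherwise generates a one-time random password via `secrets.token_urlsafe(16)` (16 random bytes, base64url-encoded into 22 URL-safe characters). The selection logic in isolation, hashing omitted and the helper name ours:

```python
import secrets
from typing import Optional

def pick_initial_password(configured: Optional[str]) -> str:
    """Use the configured password when set, otherwise generate a
    random URL-safe one (logged once at first startup)."""
    return configured or secrets.token_urlsafe(16)

print(pick_initial_password("hunter2"))
print(len(pick_initial_password(None)))  # 22 characters
```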

View File

@@ -0,0 +1,80 @@
"""2.2.1
Revision ID: a946dae52526
Revises: 5b3355c964bb
Create Date: 2025-08-20 17:50:00.000000
"""
import sqlalchemy as sa
from alembic import op
from app.log import logger
from app.core.config import settings
# revision identifiers, used by Alembic.
revision = 'a946dae52526'
down_revision = '5b3355c964bb'
branch_labels = None
depends_on = None
def upgrade() -> None:
"""
升级:将SiteUserData表的userid字段从Integer改为String
"""
connection = op.get_bind()
if settings.DB_TYPE.lower() == "postgresql":
# PostgreSQL数据库迁移
migrate_postgresql_userid(connection)
def downgrade() -> None:
"""
降级:不做处理(字符串userid无法安全还原为Integer)
"""
pass
def migrate_postgresql_userid(connection):
"""
PostgreSQL数据库userid字段迁移
"""
try:
logger.info("开始PostgreSQL数据库userid字段迁移...")
# 1. 创建临时列
connection.execute(sa.text("""
ALTER TABLE siteuserdata
ADD COLUMN userid_new VARCHAR
"""))
# 2. 将现有数据转换为字符串并复制到新列
connection.execute(sa.text("""
UPDATE siteuserdata
SET userid_new = CAST(userid AS VARCHAR)
WHERE userid IS NOT NULL
"""))
# 3. 删除旧列
connection.execute(sa.text("""
ALTER TABLE siteuserdata
DROP COLUMN userid
"""))
# 4. 重命名新列
connection.execute(sa.text("""
ALTER TABLE siteuserdata
RENAME COLUMN userid_new TO userid
"""))
logger.info("PostgreSQL数据库userid字段迁移完成")
except Exception as e:
logger.error(f"PostgreSQL数据库userid字段迁移失败: {e}")
raise
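The add-copy-drop-rename sequence above is the portable way to retype a column (PostgreSQL could also do it in one statement with `ALTER COLUMN userid TYPE VARCHAR USING userid::VARCHAR`, at the cost of portability). The same four statements can be exercised against SQLite — assuming SQLite >= 3.35 for `DROP COLUMN`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE siteuserdata (userid INTEGER)")
conn.execute("INSERT INTO siteuserdata VALUES (12345)")
# 1. create the temporary column
conn.execute("ALTER TABLE siteuserdata ADD COLUMN userid_new VARCHAR")
# 2. copy existing data, cast to string
conn.execute("UPDATE siteuserdata SET userid_new = CAST(userid AS VARCHAR) "
             "WHERE userid IS NOT NULL")
# 3. drop the old column
conn.execute("ALTER TABLE siteuserdata DROP COLUMN userid")
# 4. rename the new column into place
conn.execute("ALTER TABLE siteuserdata RENAME COLUMN userid_new TO userid")
print(conn.execute("SELECT userid FROM siteuserdata").fetchone()[0])
```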

View File

@@ -31,23 +31,34 @@ if [ "${ENABLE_SSL}" = "true" ] && \
if [ ! -d "/config/acme.sh" ]; then
INFO "→ 安装acme.sh..."
# 生成安装参数
INSTALL_ARGS=(
"--install-online"
"--home" "/config/acme.sh"
"--config-home" "/config/acme.sh/data"
"--cert-home" "/config/certs"
)
# 设置安装环境变量
export LE_WORKING_DIR="/config/acme.sh"
export LE_CONFIG_HOME="/config/acme.sh/data"
export LE_CERT_HOME="/config/certs"
# 添加邮箱参数(如果设置)
# 执行官方安装命令(添加错误处理
INFO "正在下载并安装 acme.sh..."
# 构建安装命令
INSTALL_CMD="curl -sSL https://get.acme.sh | sh -s -- --install-online"
if [ -n "${SSL_EMAIL}" ]; then
INSTALL_ARGS+=("--accountemail" "${SSL_EMAIL}")
INSTALL_CMD="${INSTALL_CMD} --accountemail ${SSL_EMAIL}"
else
WARN "未设置SSL_EMAIL建议配置邮箱用于证书过期提醒"
fi
if ! eval "${INSTALL_CMD}"; then
ERROR "acme.sh 安装失败"
exit 1
fi
# 执行官方安装命令
curl -sSL https://get.acme.sh | sh -s -- "${INSTALL_ARGS[@]}"
# 验证安装是否成功
if [ ! -f "/config/acme.sh/acme.sh" ]; then
ERROR "acme.sh 安装后文件不存在,安装可能失败"
exit 1
fi
INFO "acme.sh 安装成功"
fi
# 签发证书(仅当证书不存在时)
@@ -77,17 +88,24 @@ if [ "${ENABLE_SSL}" = "true" ] && \
fi
done
# 签发证书
/config/acme.sh/acme.sh --issue \
# 签发证书(添加错误处理)
INFO "正在签发证书..."
if ! /config/acme.sh/acme.sh --issue \
--dns "${DNS_PROVIDER}" \
--domain "${SSL_DOMAIN}" \
--key-file /config/certs/"${SSL_DOMAIN}"/privkey.pem \
--fullchain-file /config/certs/"${SSL_DOMAIN}"/fullchain.pem \
--reloadcmd "nginx -s reload" \
--force
--force; then
ERROR "证书签发失败"
exit 1
fi
# 创建稳定符号链接
ln -sf /config/certs/"${SSL_DOMAIN}" /config/certs/latest
INFO "证书签发成功"
else
INFO "证书已存在,跳过签发步骤"
fi
# 配置自动更新任务
@@ -98,4 +116,12 @@ if [ "${ENABLE_SSL}" = "true" ] && \
elif [ "${ENABLE_SSL}" = "true" ] && [ "${AUTO_ISSUE_CERT}" = "true" ] && [ -z "${SSL_DOMAIN}" ]; then
WARN "已启用自动签发证书但未设置SSL_DOMAIN跳过证书管理"
elif [ "${ENABLE_SSL}" = "true" ] && [ "${AUTO_ISSUE_CERT}" = "false" ]; then
INFO "SSL已启用但自动签发证书已禁用将使用手动配置的证书"
# 检查证书文件是否存在
if [ -f "/config/certs/latest/fullchain.pem" ] && [ -f "/config/certs/latest/privkey.pem" ]; then
INFO "检测到证书文件SSL配置正常"
else
WARN "未检测到证书文件,请确保手动配置了正确的证书路径"
fi
fi

View File

@@ -183,8 +183,8 @@ if [ "${ENABLE_SSL}" = "true" ]; then
include /etc/nginx/mime.types;
default_type application/octet-stream;
listen 443 ssl;
listen [::]:443 ssl;
listen ${SSL_NGINX_PORT:-443} ssl;
listen [::]:${SSL_NGINX_PORT:-443} ssl;
server_name ${SSL_DOMAIN:-moviepilot};
# SSL证书路径
@@ -274,4 +274,8 @@ fi
# 启动后端服务
INFO "→ 启动后端服务..."
exec dumb-init gosu moviepilot:moviepilot ${VENV_PATH}/bin/python3 app/main.py
if [ "${START_NOGOSU:-false}" = "true" ]; then
exec dumb-init "${VENV_PATH}/bin/python3" app/main.py
else
exec dumb-init gosu moviepilot:moviepilot "${VENV_PATH}/bin/python3" app/main.py
fi

View File

@@ -76,6 +76,4 @@ setuptools~=78.1.0
pympler~=1.1
smbprotocol~=1.15.0
setproctitle~=1.3.6
httpx[socks]~=0.28.1
prometheus-client~=0.22.1
prometheus-fastapi-instrumentator~=7.1.0
httpx[socks]~=0.28.1

View File

@@ -1,2 +1,2 @@
APP_VERSION = 'v2.7.6'
FRONTEND_VERSION = 'v2.7.6'
APP_VERSION = 'v2.8.1'
FRONTEND_VERSION = 'v2.8.1'