Compare commits


26 Commits
1.0.1 ... 1.1.1

Author SHA1 Message Date
amtoaer
9d151b4731 feat: make commands skip existing content by default, update docs 2023-12-06 01:19:08 +08:00
amtoaer
1686c1a8df feat: support danmaku download 2023-12-06 00:39:46 +08:00
amtoaer
de6eaeb4a6 chore: tidy up code logic, leave an entry point for subtitle download 2023-12-06 00:00:42 +08:00
amtoaer
46d1810e7c chore: move the success log one level up 2023-12-04 01:50:31 +08:00
amtoaer
89e2567fef feat: keep tag fetch failures from breaking the main flow 2023-12-04 01:35:29 +08:00
amtoaer
38caf1f0d6 fix: fix runtime error 2023-12-04 01:02:04 +08:00
amtoaer
6877171f4d fix: fix argument error 2023-12-04 00:54:04 +08:00
amtoaer
29d06a040b fix: register the commands 2023-12-04 00:50:22 +08:00
amtoaer
ceec5d6780 feat: initial support for video tags 2023-12-04 00:39:42 +08:00
amtoaer
650498d4a1 fix: check the credential only once per day 2023-12-03 23:14:15 +08:00
amtoaer
96ff84391d doc: fix overly long image path 2023-12-02 01:29:33 +08:00
amtoaer
44e8a2c97d doc: describe the extra commands, add images 2023-12-02 01:26:55 +08:00
amtoaer
c3bfb3c2e5 doc: uploader avatar issue resolved, update README 2023-12-02 01:12:59 +08:00
amtoaer
ec91cbf3ed fix: continue fixing field errors 2023-12-02 00:51:21 +08:00
amtoaer
f174a3b898 fix: fix field error, start credential refresh from the second day 2023-12-02 00:46:21 +08:00
amtoaer
c8fca7fcca style: format code 2023-12-02 00:40:55 +08:00
amtoaer
6ef25d6409 fix: fix type error, add some logging 2023-12-02 00:40:35 +08:00
amtoaer
f10fc9dd97 fix: fix wrong field value 2023-12-02 00:30:56 +08:00
amtoaer
d21f14d851 ci: build a debug image on pushes to main 2023-12-02 00:30:47 +08:00
amtoaer
012b3f9f31 fix: attempt to fix the emby avatar path, make all file operations async 2023-12-02 00:26:19 +08:00
amtoaer
bbde9d6ba6 fix: fix a condition check 2023-11-30 18:12:15 +08:00
amtoaer
e040ab2d75 chore: reduce credential refresh frequency to avoid triggering risk control 2023-11-30 17:50:36 +08:00
amtoaer
ad977e41d4 fix: await an async function that was missing await 2023-11-28 13:55:12 +08:00
amtoaer
dc612ec6f1 feat: add recheck, re-marking items as not downloaded after local files are deleted 2023-11-26 12:51:27 +08:00
amtoaer
eee99d9fda fix: add a reliable credential refresh mechanism 2023-11-26 12:37:39 +08:00
amtoaer
e977f12568 doc: add a note about colored output 2023-11-25 23:39:58 +08:00
12 changed files with 458 additions and 131 deletions


@@ -0,0 +1,29 @@
name: Docker Image CI (DEBUG)
on:
push:
branches:
- main
jobs:
build:
runs-on: ubuntu-latest
steps:
-
name: Checkout
uses: actions/checkout@v3
-
name: Login to DockerHub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
-
name: Build and push images
uses: docker/build-push-action@v5
with:
context: .
file: ./Dockerfile
push: true
tags: |
${{ secrets.DOCKERHUB_USERNAME }}/bili-sync:debug

.gitignore

@@ -4,7 +4,7 @@ debug.py
videos
config.test.json
database.test.db*
example.json
example*.json
thumbs.test
config
data


@@ -1,4 +1,4 @@
.PHONY: install fmt start-daemon start-once
.PHONY: install fmt start-daemon start-once db-init db-migrate db-upgrade sync-conf
install:
@echo "Installing dependencies..."
@@ -22,4 +22,10 @@ db-migrate:
@poetry run aerich migrate
db-upgrade:
@poetry run aerich upgrade
@poetry run aerich upgrade
sync-conf:
@echo "Syncing config..."
@cp ${CONFIG_SRC} ./config/
@cp ${DB_SRC} ./data/
@echo "Done."


@@ -1,4 +1,6 @@
# bili-sync
![bili-sync](https://socialify.git.ci/amtoaer/bili-sync/image?description=1&font=KoHo&issues=1&language=1&logo=https%3A%2F%2Fs2.loli.net%2F2023%2F12%2F02%2F9EwT2yInOu1d3zm.png&name=1&owner=1&pattern=Signal&pulls=1&stargazers=1&theme=Light)
## Introduction
A BILIBILI favorites sync tool written for NAS users; synced content can conveniently be imported into media library tools such as EMBY for browsing.
@@ -37,15 +39,6 @@ class Config(DataClassJsonMixin):
That is: run the program once, let it write the initial config and exit with a configuration error, then edit `config.json` and run it again.
## About uploader avatars
The global environment variable `THUMB_PATH` is currently exposed as the storage location for uploader avatars.
When a video is downloaded, if the uploader's avatar does not exist yet, it is downloaded to `THUMB_PATH`, and the avatar's absolute path is written into the video's NFO file.
In practice, however, EMBY does not seem to read local avatar paths from NFO files correctly; this will be fixed once a workaround is found.
> That said, the basic logic is: for the avatar paths that `bili-sync` writes into NFO files to be readable by EMBY, the absolute paths must be identical in both containers. So although avatars cannot be loaded correctly yet, it is still recommended to set `THUMB_PATH` and to make sure it points to the same folder in both the `bili-sync` and `emby` containers (i.e. mount one folder at `THUMB_PATH` in both).
## Docker example
@@ -55,21 +48,21 @@ services:
bili-sync:
image: amtoaer/bili-sync:latest
user: 1000:1000 # run with this user's permissions; defaults to root if omitted, setting it is recommended
tty: true # add this line for colored logs
volumes:
- /home/amtoaer/Videos/Bilibilis/:/Videos/Bilibilis/ # video files
- /home/amtoaer/.config/nas/bili-sync/config/:/app/config/ # config files
- /home/amtoaer/.config/nas/bili-sync/data/:/app/data/ # database
# note: to see uploader avatars inside emby, mount emby's metadata/people/ config directory at this container's /app/thumb/
- /home/amtoaer/.config/nas/emby/metadata/people/:/app/thumb/
environment:
- THUMB_PATH=/Videos/Bilibilis/thumb/ # store avatars in the thumb folder next to the video files
- TZ=Asia/Shanghai
restart: always
network_mode: bridge
hostname: bili-sync
container_name: bili-sync
logging:
driver: "json-file"
options:
max-size: "30m"
driver: "local"
```
The corresponding config file:
@@ -91,9 +84,33 @@ services:
}
```
## Current issues
## Extra commands
- [ ] Look into NFO handling to load local actor thumbnails correctly
To cover extra needs, the application includes several standalone commands, which can be run from the program directory with `python entry.py ${command name}`.
1. `once`
Process the favorites exactly as a regular scheduled run would, but only once.
2. `recheck`
Mark video files that no longer exist locally as not downloaded; they will be downloaded again when the next scheduled run fires.
3. `refresh_poster`
Refresh the posters of local videos.
4. `refresh_upper`
Refresh local uploader avatars and metadata.
5. `refresh_nfo`
Refresh the metadata of local videos (tags, titles, and so on).
6. `refresh_video`
Refresh the local video source files.
7. `refresh_subtitle`
Refresh the local danmaku (subtitle) files.
**Every command starting with refresh supports a --force flag: with --force the corresponding content is fully overwritten; otherwise only the missing parts are filled in by default.**
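The --force semantics described above (wipe everything and regenerate, versus only filling in what is missing) can be modeled with a toy sketch; `refresh`, the item names, and the `rendered:` marker are hypothetical stand-ins, not part of the codebase:

```python
def refresh(existing: dict[str, str], wanted: list[str], force: bool = False) -> dict[str, str]:
    """Toy model of a refresh command: `existing` maps item name -> content on disk."""
    if force:
        existing = {}  # --force: discard everything already present
    for name in wanted:
        if name not in existing:  # default mode: only produce the missing pieces
            existing[name] = f"rendered:{name}"
    return existing
```

Without `force`, anything already on disk is left untouched; with it, every item is produced from scratch.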
## Roadmap

commands.py

@@ -0,0 +1,101 @@
import asyncio
import functools
from pathlib import Path
from typing import Callable
from loguru import logger
from constants import MediaStatus, MediaType
from models import FavoriteItem
from processor import process_favorite_item
from utils import aexists, aremove
async def recheck():
"""Refresh the status of videos in the database: files that no longer exist are marked as not downloaded so the next run re-downloads them. Call this after manually deleting files."""
items = await FavoriteItem.filter(
type=MediaType.VIDEO,
status=MediaStatus.NORMAL,
downloaded=True,
)
exists = await asyncio.gather(
*[aexists(item.video_path) for item in items],
return_exceptions=True,  # required for the Exception check below
)
for item, exist in zip(items, exists):
if isinstance(exist, Exception):
logger.error(
"Error when checking file {} {}: {}",
item.bvid,
item.name,
exist,
)
continue
if not exist:
logger.info(
"File {} {} not exists, mark as not downloaded.",
item.bvid,
item.name,
)
item.downloaded = False
logger.info("Updating database...")
await FavoriteItem.bulk_update(items, fields=["downloaded"])
logger.info("Database updated.")
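A self-contained model of this recheck pass, with plain dicts standing in for the ORM records (the real code uses `FavoriteItem` and `aexists`):

```python
import asyncio
import tempfile
from pathlib import Path

async def recheck_paths(items: list[dict]) -> list[dict]:
    """Mark records whose file vanished from disk as not downloaded."""
    exists = await asyncio.gather(
        *[asyncio.to_thread(Path(item["path"]).exists) for item in items],
        return_exceptions=True,  # one bad path should not abort the whole pass
    )
    for item, ok in zip(items, exists):
        if isinstance(ok, Exception):
            continue  # the real code logs the error and skips the item
        if not ok:
            item["downloaded"] = False
    return items

with tempfile.TemporaryDirectory() as d:
    kept = Path(d) / "kept.mp4"
    kept.touch()
    records = asyncio.run(recheck_paths([
        {"path": str(kept), "downloaded": True},
        {"path": str(Path(d) / "gone.mp4"), "downloaded": True},
    ]))
```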
async def _refresh_favorite_item_info(
path_getter: Callable[[FavoriteItem], list[Path]],
process_poster: bool = False,
process_video: bool = False,
process_nfo: bool = False,
process_upper: bool = False,
process_subtitle: bool = False,
force: bool = False,
):
items = await FavoriteItem.filter(downloaded=True).prefetch_related("upper")
if force:
# force refresh: delete everything that already exists first
await asyncio.gather(
*[aremove(path) for item in items for path in path_getter(item)],
return_exceptions=True,
)
await asyncio.gather(
*[
process_favorite_item(
item,
process_poster=process_poster,
process_video=process_video,
process_nfo=process_nfo,
process_upper=process_upper,
process_subtitle=process_subtitle,
)
for item in items
],
return_exceptions=True,
)
refresh_nfo = functools.partial(
_refresh_favorite_item_info, lambda item: [item.nfo_path], process_nfo=True
)
refresh_poster = functools.partial(
_refresh_favorite_item_info,
lambda item: [item.poster_path],
process_poster=True,
)
refresh_video = functools.partial(
_refresh_favorite_item_info,
lambda item: [item.video_path],
process_video=True,
)
refresh_upper = functools.partial(
_refresh_favorite_item_info,
lambda item: item.upper_path,
process_upper=True,
)
refresh_subtitle = functools.partial(
_refresh_favorite_item_info,
lambda item: [item.subtitle_path],
process_subtitle=True,
)
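The pattern above — one generic worker specialized into several named commands via `functools.partial` — in isolation (`_process` and its return strings are illustrative, not the project's API):

```python
import functools

def _process(kind: str, *, force: bool = False) -> str:
    # generic worker; each partial below pins the `kind` argument
    return f"{'full' if force else 'incremental'} refresh of {kind}"

refresh_nfo = functools.partial(_process, "nfo")
refresh_poster = functools.partial(_process, "poster")
```

Keyword arguments such as `force` still pass through the partial unchanged, which is what lets `entry.py` call every refresh command uniformly with `func(force=force)`.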


@@ -4,6 +4,14 @@ import sys
import uvloop
from loguru import logger
from commands import (
recheck,
refresh_nfo,
refresh_poster,
refresh_subtitle,
refresh_upper,
refresh_video,
)
from models import init_model
from processor import cleanup, process
from settings import settings
@@ -13,11 +21,23 @@ asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
async def entry() -> None:
await init_model()
if any("once" in _ for _ in sys.argv):
# single run
logger.info("Running once...")
await process()
return
force = any("force" in _ for _ in sys.argv)
for command, func in (
("once", process),
("recheck", recheck),
("refresh_poster", refresh_poster),
("refresh_upper", refresh_upper),
("refresh_nfo", refresh_nfo),
("refresh_video", refresh_video),
("refresh_subtitle", refresh_subtitle),
):
if any(command in _ for _ in sys.argv):
logger.info("Running {}...", command)
if command.startswith("refresh"):
await func(force=force)
else:
await func()
return
logger.info("Running daemon...")
while True:
await process()
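The dispatch in `entry()` is a substring match: any `sys.argv` element that merely contains a command name triggers that command. Pulled out on its own (the function name here is ours):

```python
from typing import Optional

COMMANDS = (
    "once", "recheck", "refresh_poster", "refresh_upper",
    "refresh_nfo", "refresh_video", "refresh_subtitle",
)

def pick_command(argv: list[str]) -> Optional[str]:
    """Return the first command whose name appears inside any argv element."""
    for command in COMMANDS:
        if any(command in arg for arg in argv):
            return command
    return None  # no match: the program falls through to daemon mode
```

Because the check is `command in arg` rather than equality, variants like `--recheck` also match; commands are tried in declaration order, so earlier names take precedence.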


@@ -0,0 +1,11 @@
from tortoise import BaseDBAsyncClient
async def upgrade(db: BaseDBAsyncClient) -> str:
return """
ALTER TABLE "favoriteitem" ADD "tags" JSON;"""
async def downgrade(db: BaseDBAsyncClient) -> str:
return """
ALTER TABLE "favoriteitem" DROP COLUMN "tags";"""


@@ -13,6 +13,7 @@ from constants import (
MediaType,
)
from settings import settings
from utils import aopen
class FavoriteList(Model):
@@ -39,7 +40,31 @@ class Upper(Model):
@property
def thumb_path(self) -> Path:
return DEFAULT_THUMB_PATH / f"{self.mid}.jpg"
return (
DEFAULT_THUMB_PATH / str(self.mid)[0] / f"{self.mid}" / "folder.jpg"
)
@property
def meta_path(self) -> Path:
return (
DEFAULT_THUMB_PATH / str(self.mid)[0] / f"{self.mid}" / "person.nfo"
)
async def save_metadata(self):
async with aopen(self.meta_path, "w") as f:
await f.write(
f"""
<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<person>
<plot />
<outline />
<lockdata>false</lockdata>
<dateadded>{self.created_at.strftime("%Y-%m-%d %H:%M:%S")}</dateadded>
<title>{self.mid}</title>
<sorttitle>{self.mid}</sorttitle>
</person>
""".strip()
)
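The new layout buckets each uploader under the first character of their `mid`, mirroring how Emby organizes `metadata/people/`. A sketch of just the path arithmetic (the base directory is a hypothetical stand-in for `DEFAULT_THUMB_PATH`):

```python
from pathlib import Path

THUMB_BASE = Path("/app/thumb")  # stand-in for DEFAULT_THUMB_PATH

def upper_paths(mid: int) -> tuple[Path, Path]:
    """Return (avatar, metadata) paths: <base>/<first char of mid>/<mid>/..."""
    base = THUMB_BASE / str(mid)[0] / str(mid)
    return base / "folder.jpg", base / "person.nfo"
```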
class FavoriteItem(Model):
@@ -54,6 +79,7 @@ class FavoriteItem(Model):
bvid = fields.CharField(max_length=255)
desc = fields.TextField()
cover = fields.TextField()
tags = fields.JSONField(null=True)
favorite_list = fields.ForeignKeyField(
"models.FavoriteList", related_name="items"
)
@@ -107,6 +133,20 @@ class FavoriteItem(Model):
/ f"{self.bvid}-poster.jpg"
)
@property
def upper_path(self) -> list[Path]:
return [
self.upper.thumb_path,
self.upper.meta_path,
]
@property
def subtitle_path(self) -> Path:
return (
Path(settings.path_mapper[self.favorite_list_id])
/ f"{self.bvid}.zh-CN.default.ass"
)
async def init_model() -> None:
await Tortoise.init(config=TORTOISE_ORM)

nfo.py

@@ -2,19 +2,19 @@ import datetime
from dataclasses import dataclass
from pathlib import Path
from utils import aopen
@dataclass
class Actor:
name: str
role: str
thumb: Path
def to_xml(self) -> str:
return f"""
<actor>
<name>{self.name}</name>
<role>{self.role}</role>
<thumb>{self.thumb.resolve()}</thumb>
</actor>
""".strip(
"\n"
@@ -25,16 +25,22 @@ class Actor:
class EpisodeInfo:
title: str
plot: str
tags: list[str]
actor: list[Actor]
bvid: str
aired: datetime.datetime
def write_nfo(self, path: Path) -> None:
with path.open("w", encoding="utf-8") as f:
f.write(self.to_xml())
async def write_nfo(self, path: Path) -> None:
async with aopen(path, "w", encoding="utf-8") as f:
await f.write(self.to_xml())
def to_xml(self) -> str:
actor = "\n".join(_.to_xml() for _ in self.actor)
tags = (
"\n".join(f" <genre>{_}</genre>" for _ in self.tags)
if isinstance(self.tags, list)
else ""
)
return f"""
<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<episodedetails>
@@ -43,6 +49,7 @@ class EpisodeInfo:
<title>{self.title}</title>
{actor}
<year>{self.aired.year}</year>
{tags}
<uniqueid type="bilibili">{self.bvid}</uniqueid>
<aired>{self.aired.strftime("%Y-%m-%d")}</aired>
</episodedetails>
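The tag handling added here turns each tag into a `<genre>` element and degrades to an empty string when tags are missing (they are `None` when the tag fetch failed upstream). The helper name is ours:

```python
def genres_xml(tags) -> str:
    """Render tags as indented <genre> lines; non-lists yield an empty string."""
    if not isinstance(tags, list):
        return ""  # tags is None when the tag fetch failed
    return "\n".join(f"  <genre>{tag}</genre>" for tag in tags)
```

Note that, like the surrounding `to_xml`, this interpolates values verbatim; tags containing `&` or `<` would need escaping (e.g. `xml.sax.saxutils.escape`) to stay valid XML.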

poetry.lock

@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 1.7.0 and should not be changed by hand.
# This file is automatically @generated by Poetry 1.7.1 and should not be changed by hand.
[[package]]
name = "aerich"
@@ -301,7 +301,7 @@ pillow = ">=10.1.0,<10.2.0"
pycryptodomex = ">=3.19.0,<3.20.0"
pyyaml = ">=6.0,<7.0"
qrcode = ">=7.4.2,<7.5.0"
qrcode-terminal = ">=0.8,<1.0"
qrcode_terminal = ">=0.8,<1.0"
requests = ">=2.31.0,<2.32.0"
rsa = ">=4.9,<5.0"
tqdm = ">=4.66.1,<4.67.0"
@@ -311,7 +311,7 @@ yarl = ">=1.9.2,<1.10.0"
type = "git"
url = "https://github.com/amtoaer/bilibili-api.git"
reference = "dev"
resolved_reference = "30db504b410787bdc8a5cd1ff63bacf6b365f393"
resolved_reference = "0281d7c58e8a26706cefc96e7e427cb05a26d866"
[[package]]
name = "black"


@@ -2,11 +2,8 @@ import asyncio
import datetime
from asyncio import Semaphore, create_subprocess_exec
from asyncio.subprocess import DEVNULL
from pathlib import Path
import aiofiles
import httpx
from bilibili_api import HEADERS, favorite_list, video
from bilibili_api import ass, favorite_list, video
from bilibili_api.exceptions import ResponseCodeException
from loguru import logger
from tortoise import Tortoise
@@ -16,10 +13,9 @@ from credential import credential
from models import FavoriteItem, FavoriteList, Upper
from nfo import Actor, EpisodeInfo
from settings import settings
from utils import aexists, amakedirs, client, download_content
anchor = datetime.datetime.today()
client = httpx.AsyncClient(headers=HEADERS)
anchor = datetime.date.today()
async def cleanup() -> None:
@@ -40,16 +36,6 @@ def concurrent_decorator(concurrency: int) -> callable:
return decorator
async def download_content(url: str, path: Path) -> None:
async with client.stream("GET", url) as resp, aiofiles.open(
path, "wb"
) as f:
async for chunk in resp.aiter_bytes(40960):
if not chunk:
return
await f.write(chunk)
async def manage_model(medias: list[dict], fav_list: FavoriteList) -> None:
uppers = [
Upper(
@@ -95,15 +81,16 @@ async def manage_model(medias: list[dict], fav_list: FavoriteList) -> None:
async def process() -> None:
global anchor
if (datetime.datetime.now() - anchor).days >= 3:
# tentatively refresh the credential every three days; adjust as needed
try:
credential.refresh()
anchor = datetime.datetime.today()
logger.info("Credential refreshed.")
except Exception:
logger.exception("Failed to refresh credential.")
return
if (today := datetime.date.today()) > anchor:
anchor = today
logger.info("Check credential.")
if await credential.check_refresh():
try:
await credential.refresh()
logger.info("Credential refreshed.")
except Exception:
logger.exception("Failed to refresh credential.")
return
for favorite_id in settings.favorite_ids:
if favorite_id not in settings.path_mapper:
logger.warning(
@@ -162,89 +149,160 @@ async def process_favorite(favorite_id: int) -> None:
downloaded=False,
).prefetch_related("upper")
await asyncio.gather(
*[process_video(item) for item in all_unprocessed_items],
*[process_favorite_item(item) for item in all_unprocessed_items],
return_exceptions=True,
)
logger.info("Favorite {} {} processed successfully.", favorite_id, title)
@concurrent_decorator(4)
async def process_video(fav_item: FavoriteItem) -> None:
async def process_favorite_item(
fav_item: FavoriteItem,
process_poster=True,
process_video=True,
process_nfo=True,
process_upper=True,
process_subtitle=True,
) -> None:
logger.info("Start to process video {} {}", fav_item.bvid, fav_item.name)
if fav_item.type != MediaType.VIDEO:
logger.warning("Media {} is not a video, skipped.", fav_item.name)
return
v = video.Video(fav_item.bvid, credential=credential)
try:
if fav_item.video_path.exists():
fav_item.downloaded = True
await fav_item.save()
logger.info(
"{} {} already exists, skipped.", fav_item.bvid, fav_item.name
)
return
# write the uploader avatar
if not fav_item.upper.thumb_path.exists():
await download_content(
fav_item.upper.thumb, fav_item.upper.thumb_path
)
# write the nfo
EpisodeInfo(
title=fav_item.name,
plot=fav_item.desc,
actor=[
Actor(
name=fav_item.upper.mid,
role=fav_item.upper.name,
thumb=fav_item.upper.thumb_path,
if process_upper:
# write the uploader avatar
if not all(
await asyncio.gather(
aexists(fav_item.upper.thumb_path),
aexists(fav_item.upper.meta_path),
)
],
bvid=fav_item.bvid,
aired=fav_item.ctime,
).write_nfo(fav_item.nfo_path)
# write the poster
await download_content(fav_item.cover, fav_item.poster_path)
# start processing the video content
v = video.Video(fav_item.bvid, credential=credential)
detector = video.VideoDownloadURLDataDetecter(
await v.get_download_url(page_index=0)
)
streams = detector.detect_best_streams()
if detector.check_flv_stream():
await download_content(streams[0].url, fav_item.tmp_video_path)
process = await create_subprocess_exec(
FFMPEG_COMMAND,
"-i",
str(fav_item.tmp_video_path),
str(fav_item.video_path),
stdout=DEVNULL,
stderr=DEVNULL,
)
await process.communicate()
fav_item.tmp_video_path.unlink()
else:
await asyncio.gather(
download_content(streams[0].url, fav_item.tmp_video_path),
download_content(streams[1].url, fav_item.tmp_audio_path),
)
process = await create_subprocess_exec(
FFMPEG_COMMAND,
"-i",
str(fav_item.tmp_video_path),
"-i",
str(fav_item.tmp_audio_path),
"-c",
"copy",
str(fav_item.video_path),
stdout=DEVNULL,
stderr=DEVNULL,
)
await process.communicate()
fav_item.tmp_video_path.unlink()
fav_item.tmp_audio_path.unlink()
fav_item.downloaded = True
await fav_item.save()
):
await amakedirs(fav_item.upper.thumb_path.parent, exist_ok=True)
await asyncio.gather(
fav_item.upper.save_metadata(),
download_content(
fav_item.upper.thumb, fav_item.upper.thumb_path
),
return_exceptions=True,
)
else:
logger.info(
"Upper {} {} already exists, skipped.",
fav_item.upper.mid,
fav_item.upper.name,
)
if process_nfo:
if not await aexists(fav_item.nfo_path):
if fav_item.tags is None:
try:
fav_item.tags = [
_["tag_name"] for _ in await v.get_tags()
]
except Exception:
logger.exception(
"Failed to get tags of video {} {}",
fav_item.bvid,
fav_item.name,
)
# write the nfo
await EpisodeInfo(
title=fav_item.name,
plot=fav_item.desc,
actor=[
Actor(
name=fav_item.upper.mid,
role=fav_item.upper.name,
)
],
tags=fav_item.tags,
bvid=fav_item.bvid,
aired=fav_item.ctime,
).write_nfo(fav_item.nfo_path)
else:
logger.info(
"NFO of {} {} already exists, skipped.",
fav_item.bvid,
fav_item.name,
)
if process_poster:
# write the poster
if not await aexists(fav_item.poster_path):
await download_content(fav_item.cover, fav_item.poster_path)
else:
logger.info(
"Poster of {} {} already exists, skipped.",
fav_item.bvid,
fav_item.name,
)
if process_subtitle:
if not await aexists(fav_item.subtitle_path):
await ass.make_ass_file_danmakus_protobuf(
v, 0, str(fav_item.subtitle_path.resolve())
)
else:
logger.info(
"Subtitle of {} {} already exists, skipped.",
fav_item.bvid,
fav_item.name,
)
if process_video:
if await aexists(fav_item.video_path):
fav_item.downloaded = True
logger.info(
"Video {} {} already exists, skipped.",
fav_item.bvid,
fav_item.name,
)
else:
# start processing the video content
detector = video.VideoDownloadURLDataDetecter(
await v.get_download_url(page_index=0)
)
streams = detector.detect_best_streams()
if detector.check_flv_stream():
await download_content(
streams[0].url, fav_item.tmp_video_path
)
process = await create_subprocess_exec(
FFMPEG_COMMAND,
"-i",
str(fav_item.tmp_video_path),
str(fav_item.video_path),
stdout=DEVNULL,
stderr=DEVNULL,
)
await process.communicate()
fav_item.tmp_video_path.unlink()
else:
await asyncio.gather(
download_content(
streams[0].url, fav_item.tmp_video_path
),
download_content(
streams[1].url, fav_item.tmp_audio_path
),
)
process = await create_subprocess_exec(
FFMPEG_COMMAND,
"-i",
str(fav_item.tmp_video_path),
"-i",
str(fav_item.tmp_audio_path),
"-c",
"copy",
str(fav_item.video_path),
stdout=DEVNULL,
stderr=DEVNULL,
)
await process.communicate()
fav_item.tmp_video_path.unlink()
fav_item.tmp_audio_path.unlink()
fav_item.downloaded = True
logger.info(
"{} {} processed successfully.", fav_item.bvid, fav_item.name
"{} {} processed successfully.",
fav_item.bvid,
fav_item.name,
)
except ResponseCodeException as e:
match e.code:
@@ -260,7 +318,6 @@ async def process_video(fav_item: FavoriteItem) -> None:
e.code,
)
return
await fav_item.save()
logger.error(
"Video {} {} is not available, marked as {}",
fav_item.bvid,
@@ -271,3 +328,5 @@ async def process_video(fav_item: FavoriteItem) -> None:
logger.exception(
"Failed to process video {} {}", fav_item.bvid, fav_item.name
)
finally:
await fav_item.save()
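The two ffmpeg invocations above differ only in their argument lists: an FLV stream is remuxed directly, while DASH delivers audio and video separately and needs `-c copy` to mux them losslessly. A sketch that only builds the command line (the helper name is ours; running ffmpeg is left to the real code):

```python
from typing import Optional

def ffmpeg_args(video: str, out: str, audio: Optional[str] = None) -> list[str]:
    """Build the ffmpeg command: one input for FLV, two plus stream copy for DASH."""
    args = ["ffmpeg", "-i", video]
    if audio is not None:
        args += ["-i", audio, "-c", "copy"]
    return args + [out]
```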

utils.py

@@ -0,0 +1,37 @@
from pathlib import Path
import aiofiles
import httpx
from aiofiles.base import AiofilesContextManager
from aiofiles.os import makedirs, remove
from aiofiles.ospath import exists
from aiofiles.threadpool.text import AsyncTextIOWrapper
from bilibili_api import HEADERS
client = httpx.AsyncClient(headers=HEADERS)
async def download_content(url: str, path: Path) -> None:
async with client.stream("GET", url) as resp, aopen(path, "wb") as f:
async for chunk in resp.aiter_bytes(40960):
if not chunk:
return
await f.write(chunk)
async def aexists(path: Path) -> bool:
return await exists(path)
async def amakedirs(path: Path, exist_ok=False) -> None:
await makedirs(path, exist_ok=exist_ok)
def aopen(
path: Path, mode: str = "r", **kwargs
) -> AiofilesContextManager[None, None, AsyncTextIOWrapper]:
return aiofiles.open(path, mode, **kwargs)
async def aremove(path: Path) -> None:
await remove(path)
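`download_content` streams the response body to disk in 40 KiB chunks and stops at the first empty chunk. The same write loop against an arbitrary chunk iterator, without httpx or aiofiles (the helper name is ours):

```python
import tempfile
from pathlib import Path

def write_chunks(chunks, path: Path) -> int:
    """Write byte chunks to `path`, stopping at the first empty chunk; return bytes written."""
    written = 0
    with path.open("wb") as f:
        for chunk in chunks:
            if not chunk:  # mirrors the early return in download_content
                break
            written += f.write(chunk)
    return written

target = Path(tempfile.mkdtemp()) / "blob.bin"
n = write_chunks([b"ab", b"cd", b"", b"ef"], target)
```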