## User feedback

"The Qingchen assistant's fallback-model configuration is complicated and tedious" — in the old UI, every fallback required 6 fields (label / baseUrl / apiKey / model / apiType / enabled); each card was 3 rows of 6 inputs at ~200px tall, and adding a fallback took more work than configuring the primary model.

## Redesign principles

A fallback model is essentially "what to fall back on when the primary model fails" — the interaction should be **pick one**, not **configure one from scratch**. We reuse the 16 vendor presets already in PROVIDER_PRESETS to pre-fill baseUrl / apiType with one click.

## New UI structure

```
┌─ Fallback models (2 enabled) ──────────────────────────── [▼] ─┐
│ On primary failure, fall back in order (except 401/403)        │
│                                                                │
│ 📌 Primary (current)  gpt-4o-mini  gpt.qt.cool                 │ ← read-only
│ ⋮⋮ #2 [QingchenCloud] claude-haiku · api.anthropic    Edit  ×  │ ← compact row
│ ⋮⋮ #3 qwen3-30b · localhost:8000                      Edit  ×  │
│                                                                │
│ Pick a provider to add quickly:                                │
│ [★QingchenCloud][OpenAI][Anthropic][DeepSeek][Google][Ollama]  │
│ [📋 Copy from primary] [+ Custom/self-hosted] [More providers…]│
└────────────────────────────────────────────────────────────────┘
```

## Key changes

| Dimension | Old | New |
|---|---|---|
| Row height | ~200px card | ~36px compact row; expands only on Edit |
| Add flow | Blank card, fill in 6 fields | Click a provider → fill only apiKey + pick a model |
| label field | Typed by the user | Removed; the model name is shown instead |
| enabled toggle | Explicit | Removed (delete = disable) + migrate old disabled entries |
| Primary model visible | No | Shown at the top of the list as the full call chain (📌 primary row) |
| Ordering | Implicit array order | Explicit HTML5 drag-and-drop handle |
| baseUrl/apiType | Always exposed | Collapsed into "Advanced options" (untouched after picking a preset) |
| "Copy from primary" shortcut | None | Added (covers "same as primary, just a different model") |

## Implementation details

- Reuse the 16 vendor presets in `PROVIDER_PRESETS` from `src/lib/model-presets.js`
- The main button area shows the 6 most common providers (qtcool / openai / anthropic / deepseek / google / ollama); the rest expand under "More providers…"
- Clicking a provider pushes a draft object (with temporary fields `_editing` / `_brandLabel`), opens it in the expanded edit state, and autofocuses the apiKey input
- On save, a `.map` keeps only the 5 persisted fields (temporary fields are stripped automatically, and `enabled` is no longer written)
- Migration: on save, old entries with `enabled === false` are filtered out, so everything the user sees next time is enabled (avoiding the "invisibly disabled" confusion once the UI no longer exposes `enabled`)
- The read-only primary-model row tracks the `#ast-baseurl` / `#ast-model` form fields in real time
- Each row's collapsed model / hostname updates live while typing in the edit state (no full re-render, so inputs keep focus)
- HTML5 drag-and-drop ordering (dragstart / dragover / drop), no third-party library

## i18n

10 new translation keys (covering at least zh-CN / en / zh-TW / ja / ko / vi):

- `fallbackPrimaryRow` / `fallbackPickProviderHint`
- `fallbackAddCopyPrimary` / `fallbackAddCustom` / `fallbackMoreProviders`
- `fallbackEditAdvanced` / `fallbackHideAdvanced` / `fallbackShowAdvanced`
- `fallbackUnnamedModel` / `fallbackPickProviderTitle`

## Verification

- `npm run build` passes
- assistant chunk 156.24KB → 162.02KB (gzip +1.5KB), acceptable
- Backward compatible: previously saved fallbackModels data (with label/enabled fields) still loads and displays correctly; saving performs an "implicit migration" (the enabled field is dropped and disabled entries are cleaned out)

## Related

- #Compat-3 series (fallback-model failover)
- User feedback: "It's complicated and tedious — redesign the whole thing!"
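The save-time migration described above could be sketched as follows. This is a minimal illustration, not the actual implementation: the function name is hypothetical, and the exact set of persisted fields is an assumption (the note says five fields are kept without listing them; the four shown here are the ones named in the schema discussion).

```javascript
// Hypothetical sketch of the save-time normalization described above.
// Assumption: baseUrl / apiKey / model / apiType are among the persisted
// fields; the real field list may differ.
function normalizeFallbackModels(models) {
  return models
    // Implicit migration: silently drop entries the old UI left disabled.
    .filter((m) => m.enabled !== false)
    // Keep only persisted fields; this strips the temporary _editing /
    // _brandLabel draft fields and the removed label / enabled fields.
    .map(({ baseUrl, apiKey, model, apiType }) => ({ baseUrl, apiKey, model, apiType }));
}
```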
# OpenClaw & Hermes Agent Management Panel with Built-in AI Assistant — Multi-Engine AI Framework Management
🇨🇳 中文 | 🇺🇸 English | 🇹🇼 繁體中文 | 🇯🇵 日本語 | 🇰🇷 한국어 | 🇻🇳 Tiếng Việt | 🇪🇸 Español | 🇧🇷 Português | 🇷🇺 Русский | 🇫🇷 Français | 🇩🇪 Deutsch
ClawPanel is a visual management panel supporting multiple AI Agent frameworks, currently with OpenClaw and Hermes Agent dual-engine support. It features a built-in intelligent AI assistant that helps you install, auto-diagnose configurations, troubleshoot issues, and fix errors. 8 tools + 4 modes + interactive Q&A — easy to manage for beginners and experts alike.
🌐 Website: claw.qt.cool | 📦 Download: GitHub Releases
🎁 QingchenCloud AI API
Internal technical testing platform, open for selected users. Sign in daily to earn credits.
- Daily Sign-in Credits — Sign in daily + invite friends to earn test credits
- OpenAI-Compatible API — Seamless integration with OpenClaw, plug and play
- Resource Policy — Rate limiting + request caps, may queue during peak hours
- Model Availability — Models/APIs subject to actual page display, may rotate versions
⚠️ Compliance: This platform is for technical testing only. Illegal use or circumventing security mechanisms is prohibited. Keep your API Key secure. Rules subject to latest platform policies.
🔥 Dev Board / Embedded Device Support
ClawPanel provides a pure Web deployment mode (zero GUI dependency), natively compatible with ARM64 boards:
- Orange Pi / Raspberry Pi / RK3588 — `npm run serve` to run
- Docker ARM64 — `docker run ghcr.io/qingchencloud/openclaw:latest`
- Armbian / Debian / Ubuntu Server — Auto-detect architecture
- No Rust / Tauri / GUI needed — only Node.js 18+ required
📖 See Armbian Deployment Guide | Web Dev Mode
## Community
A community of passionate AI Agent developers and enthusiasts — join us!
Discord · Discussions · Report Issue
## Features
- 🤖 AI Assistant (New) — Built-in AI assistant, 4 modes + 8 tools + interactive Q&A. See AI Assistant Highlights
- 🧩 Multi-Engine Architecture — Supports both OpenClaw and Hermes Agent dual engines, freely switchable, independently managed
- 🤖 Hermes Agent Chat — Built-in Hermes Agent chat interface with tool call visualization, file system access toggle, SSE streaming output
- 🖼️ Image Recognition — Paste screenshots or drag images, AI auto-analyzes, multimodal conversations
- Dashboard — System overview, real-time service monitoring, quick actions
- Service Management — OpenClaw / Hermes Gateway start/stop, version detection & one-click upgrade, config backup & restore
- Model Configuration — Multi-provider management, model CRUD, batch connectivity tests, latency detection, drag-to-reorder, auto-save + undo
- Gateway Configuration — Port, access scope (localhost/LAN), auth Token, Tailscale networking
- Messaging Channels — Unified Telegram, Discord, Feishu, DingTalk, QQ management, multi-Agent binding per platform
- Communication & Automation — Message settings, broadcast strategies, slash commands, Webhooks, execution approval
- Usage Analytics — Token usage, API costs, model/provider/tool rankings, daily usage charts
- Agent Management — Agent CRUD, identity editing, model config, workspace management
- Chat — Streaming, Markdown rendering, session management, /fast /think /verbose /reasoning commands
- Cron Jobs — Cron-based scheduled execution, multi-channel delivery
- Log Viewer — Multi-source real-time logs with keyword search
- Memory Management — Memory file view/edit, categorized management, ZIP export, Agent switching
- QingchenCloud AI API — Internal testing platform, OpenAI-compatible, daily sign-in credits
- Extensions — cftunnel tunnel management, ClawApp status monitoring
- About — Version info, community links, related projects, one-click upgrade
## Download & Install
Go to Releases for the latest version:
### macOS

| Chip | Installer | Notes |
|---|---|---|
| Apple Silicon (M1/M2/M3/M4) | `ClawPanel_x.x.x_aarch64.dmg` | Macs from late 2020+ |
| Intel | `ClawPanel_x.x.x_x64.dmg` | Macs 2020 and earlier |
⚠️ "Damaged" or "unverified developer"? The app is unsigned. Run:

```shell
sudo xattr -rd com.apple.quarantine /Applications/ClawPanel.app
```
### Windows

| Format | Installer | Notes |
|---|---|---|
| EXE | `ClawPanel_x.x.x_x64-setup.exe` | Recommended |
| MSI | `ClawPanel_x.x.x_x64_en-US.msi` | Enterprise / silent install |
### Linux

| Format | Installer | Notes |
|---|---|---|
| AppImage | `ClawPanel_x.x.x_amd64.AppImage` | No install; `chmod +x` and run |
| DEB | `ClawPanel_x.x.x_amd64.deb` | `sudo dpkg -i *.deb` |
| RPM | `ClawPanel-x.x.x-1.x86_64.rpm` | `sudo rpm -i *.rpm` |
### Linux Server (Web Version)

```shell
curl -fsSL https://raw.githubusercontent.com/qingchencloud/clawpanel/main/scripts/linux-deploy.sh | bash
```

Visit http://YOUR_SERVER_IP:1420 after deployment. 📖 Linux Deployment Guide
### Docker

```shell
docker run -d --name clawpanel --restart unless-stopped \
  -p 1420:1420 -v clawpanel-data:/root/.openclaw \
  node:22-slim \
  sh -c "apt-get update && apt-get install -y git && \
    npm install -g @qingchencloud/openclaw-zh --registry https://registry.npmmirror.com && \
    git clone https://github.com/qingchencloud/clawpanel.git /app && \
    cd /app && npm install && npm run build && npm run serve"
```
## Quick Start
- Initial Setup — First launch auto-detects Node.js, Git, OpenClaw. One-click install if missing.
- Configure Models — Add AI providers (DeepSeek, MiniMax, OpenAI, Ollama, etc.) with API keys. Test connectivity.
- Start Gateway — Go to Service Management, click Start. Green status = ready.
- Start Chatting — Go to Live Chat, select model, start conversation with streaming & Markdown.
## 🤖 AI Assistant Highlights
Built-in AI assistant that can directly operate your system — diagnose, fix, even submit PRs.
### Four Modes
| Mode | Icon | Tools | Write | Confirm | Use Case |
|---|---|---|---|---|---|
| Chat | 💬 | ❌ | ❌ | — | Pure Q&A |
| Plan | 📋 | ✅ | ❌ | ✅ | Read configs/logs, output plans |
| Execute | ⚡ | ✅ | ✅ | ✅ | Normal work, dangerous ops need confirm |
| Unlimited | ∞ | ✅ | ✅ | ❌ | Full auto, no prompts |
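The gating rules in the table above are simple enough to express as data. The sketch below is purely illustrative — mode keys, tool fields, and the return shape are assumptions, not ClawPanel's actual API:

```javascript
// Illustrative encoding of the mode-gating table above; names and
// shapes are assumptions, not ClawPanel's implementation.
const MODES = {
  chat:      { tools: false, write: false, confirm: false },
  plan:      { tools: true,  write: false, confirm: true  },
  execute:   { tools: true,  write: true,  confirm: true  },
  unlimited: { tools: true,  write: true,  confirm: false },
};

// Decide whether a tool call is allowed in a mode, and whether it
// should pause for user confirmation first.
function canRun(mode, tool) {
  const m = MODES[mode];
  if (!m || !m.tools) return { allowed: false };           // Chat: no tools at all
  if (tool.writes && !m.write) return { allowed: false };  // Plan: read-only
  return { allowed: true, needsConfirm: m.confirm && !!tool.dangerous };
}
```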
### Eight Tools

| Tool | Function |
|---|---|
| `ask_user` | Ask the user questions (single/multi/text) |
| `get_system_info` | Get OS, architecture, home directory |
| `run_command` | Execute shell commands |
| `read_file` / `write_file` | Read/write files |
| `list_directory` | Browse directories |
| `list_processes` | View processes |
| `check_port` | Check port usage |
## Tech Architecture
| Layer | Technology | Description |
|---|---|---|
| Frontend | Vanilla JS + Vite | Zero framework, lightweight |
| Backend | Rust + Tauri v2 | Native performance, cross-platform |
| Communication | Tauri IPC + Shell Plugin | Frontend-backend bridge |
| Styling | Pure CSS (CSS Variables) | Dark/Light themes, glassmorphism |
## Build from Source

```shell
git clone https://github.com/qingchencloud/clawpanel.git
cd clawpanel && npm install

# Desktop (requires Rust + Tauri v2)
npm run tauri dev     # Development
npm run tauri build   # Production

# Web only (no Rust needed)
npm run dev                      # Dev with hot reload
npm run build && npm run serve   # Production
```
## Related Projects
| Project | Description |
|---|---|
| OpenClaw | AI Agent Framework |
| ClawApp | Cross-platform mobile chat client |
| cftunnel | Cloudflare Tunnel tool |
## Contributing
Issues and Pull Requests are welcome. See CONTRIBUTING.md for guidelines.
## Acknowledgements
ClawPanel keeps growing because of every contributor in the community. Thank you for helping make the project better.
### Code Contributors
Thanks to these developers for submitting Pull Requests and contributing directly to the codebase:
- liucong2013 (#88)
- axdlee (#58)
- ATGCS (#107)
- livisun (#106)
- kiss-kedaya (#101, #94)
- wzh4869 (#82)
- 0xsline (#15)
- jonntd (#18)
### Community Reporters
Thanks to community members who opened issues, reported bugs, and suggested features:
If we missed your contribution, please open an issue and we will add it promptly.
## Sponsor

If you find this project useful, consider supporting us via USDT (BNB Smart Chain):

`0xbdd7ebdf2b30d873e556799711021c6671ffe88f`
## Contact
- Email: support@qctx.net
- Website: qingchencloud.com
- Product: claw.qt.cool
## License
This project is licensed under AGPL-3.0. For commercial/proprietary use without open-source requirements, contact us for a commercial license.
© 2026 QingchenCloud (武汉晴辰天下网络科技有限公司) | claw.qt.cool