Fixes two user-reported issues: the "Test" button in the Qingchen assistant settings showing Response Status: 0 with an empty body on some providers (e.g. gpt.qt.cool), and the fact that only a single model could be configured, leaving the assistant unusable whenever that one model went down.
## 1. Test Button Status 0: Root Cause & Fix
**Root cause**: the Tauri desktop build previously hit external APIs directly through the webview's `fetch()`, which is subject to the Chromium network stack's restrictions. Some providers' HTTP/2 chunked encoding, TLS handshake, CORS preflight, or unusual response headers get silently rejected with `TypeError: Failed to fetch`; the frontend catch block then hard-codes `respStatus` to 0 and leaves `respBody` empty. This is not a provider problem, and not an API-key problem, but a Chromium net-stack compatibility issue.
**Fix**: added a Rust command `test_model_verbose` (built on the existing reqwest HTTP client) that returns structured JSON:
`{success, status, reqUrl, reqBody, respBody, reply, error, elapsedMs, usedApi}`.
The frontend test button now calls `api.testModelVerbose()` in both Tauri and Web modes:
- Tauri → `invoke('test_model_verbose')` → native reqwest
- Web → `fetch('/__api/test_model_verbose')` → server-side fetch in dev-api.js
This bypasses every compatibility pitfall in the webview network stack: we always get the real HTTP status (including 401/429/5xx) and the raw body, and the debug panel shows the full picture. Unlike the old `test_model` command, `test_model_verbose` does not swallow 400/422/429 errors and report "connection OK"; it passes them through verbatim so users can diagnose the problem.
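The structured result makes the debug panel straightforward to render. A minimal sketch in plain JavaScript (`summarizeTestResult` is a hypothetical helper for illustration, not part of the codebase; field names follow the JSON shape above):

```javascript
// Hypothetical helper: format the structured test_model_verbose result
// for display. Success shows status, API path, and latency; failure keeps
// the real HTTP status and raw body instead of "status 0".
function summarizeTestResult(res) {
  if (res.success) {
    return `OK ${res.status} via ${res.usedApi} in ${res.elapsedMs}ms: ${res.reply}`;
  }
  return `FAIL ${res.status ?? '?'}: ${res.error ?? ''}\n${res.respBody ?? ''}`.trim();
}

console.log(summarizeTestResult({
  success: false, status: 401, error: 'Unauthorized',
  respBody: '{"error":"invalid api key"}',
}));
```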
## 2. Fallback Model Group Failover (inspired by OpenClaw)
**New config**: `_config.fallbackModels: Array<{label, baseUrl, apiKey, model, apiType, enabled}>`, persisted in localStorage.
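Only entries that are enabled and fully configured join the failover chain. A sketch of that filter (`isSlotComplete` is an illustrative name, not the real code; the example slots and URL are made up):

```javascript
// Hypothetical validator for a fallbackModels entry: a slot only joins the
// failover chain when it is enabled and has every field a request needs.
function isSlotComplete(slot) {
  return Boolean(slot && slot.enabled && slot.baseUrl && slot.apiKey && slot.model);
}

const slots = [
  { label: 'backup-1', baseUrl: 'https://example.invalid/v1', apiKey: 'sk-x',
    model: 'gpt-4o-mini', apiType: 'openai', enabled: true },
  { label: 'half-filled', baseUrl: '', apiKey: '', model: '', apiType: 'openai', enabled: true },
];
console.log(slots.filter(isSlotComplete).map(s => s.label)); // → [ 'backup-1' ]
```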
**callAI refactor**:
- The old `callAI` is renamed `_callAIOnce`, otherwise unchanged
- New `callAIWithSlot(slot, messages, onChunk)`: temporarily injects the slot into `_config`, calls `_callAIOnce`, and restores it in `finally` (single-threaded safe, since `_isStreaming` prevents concurrency)
- New `callAI`: `buildActiveSlots()` collects the primary model plus every enabled, fully configured fallback, then tries them in order:
  - Success → return
  - `AbortError` (user aborted) → rethrow, no failover
  - Auth errors 401/403/`unauthorized`/`invalid api key` → rethrow, no failover (switching would not help)
  - Other retryable errors (network/timeout/5xx/429/400 bad request/model not found) → insert a `⚠ Model "X" failed, switching to fallback "Y"` blockquote into the chat and continue with the next slot
  - All slots failed → throw the last error, triggering the existing retry bar + circuit-breaker flow
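The loop above can be sketched as follows, under the assumption that `callOnce(slot)` performs one request; all names here are illustrative, not the real implementation:

```javascript
// Illustrative failover loop: try each slot in order, rethrowing aborts and
// auth errors immediately, falling through on retryable errors.
function isAuthError(err) {
  const msg = String(err.message || '').toLowerCase();
  return /\b(401|403)\b/.test(msg) ||
         msg.includes('unauthorized') || msg.includes('invalid api key');
}

async function callWithFailover(slots, callOnce, onSwitch) {
  let lastErr;
  for (let i = 0; i < slots.length; i++) {
    try {
      return await callOnce(slots[i]);           // success → done
    } catch (err) {
      if (err.name === 'AbortError') throw err;  // user abort: never failover
      if (isAuthError(err)) throw err;           // auth error: switching won't help
      lastErr = err;
      const next = slots[i + 1];
      if (next && onSwitch) onSwitch(slots[i], next); // e.g. insert the ⚠ notice
    }
  }
  throw lastErr; // all slots failed → existing retry bar / circuit breaker
}
```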
**UI**: in the settings panel's API tab, below the QingchenCloud promo card, a new collapsed-by-default `<details>` section "Fallback Models":
- The summary line shows the enabled count plus a collapse arrow
- Each card: label / baseUrl / apiType / apiKey / model (compact 2-column grid) + an enabled toggle + a delete button
- An "Add fallback model" button at the top; new entries inherit the primary model's apiType by default to reduce setup work
- Editing uses fallbackDrafts (deep copies); the save button filters out empty cards before writing back to `_config.fallbackModels`
- Individual input changes only sync the drafts and update the count, without re-rendering the list (keeps input focus)
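The drafts-then-save flow can be sketched like this (the draft data is made up; `structuredClone` stands in for whatever deep-copy the real code uses):

```javascript
// Illustrative save path: edit deep copies so _config stays untouched until
// save, then drop cards the user never filled in.
const fallbackDrafts = structuredClone([
  { label: 'backup', baseUrl: 'https://example.invalid/v1', apiKey: 'sk-x',
    model: 'm1', apiType: 'openai', enabled: true },
  { label: '', baseUrl: '', apiKey: '', model: '', apiType: 'openai', enabled: false },
]);

// On save: a card survives only if the user entered something substantive.
const saved = fallbackDrafts.filter(d => d.baseUrl || d.apiKey || d.model);
console.log(saved.length); // → 1
```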
**Files changed**:
- `src-tauri/src/commands/config.rs`: +175 lines, `test_model_verbose`
- `src-tauri/src/lib.rs`: register the new command
- `src/lib/tauri-api.js`: +1 line, `testModelVerbose` wrapper
- `scripts/dev-api.js`: +75 lines, Web-mode test_model_verbose handler
- `src/pages/assistant.js`:
  - `loadConfig`: new `fallbackModels = []` default
  - `callAI` refactored into a failover loop (+80 lines)
  - Test button: removed the 90-line dual-branch webview fetch, now calls the verbose API in both modes (net −~60 lines)
  - `showSettings`: new fallback-model UI + event bindings (+85 lines)
  - Save button: collects fallbackDrafts and writes them back to `_config`
- `src/locales/modules/assistant.js`: translations for 11 languages (slotPrimary / failoverNotice / fallbackModelsTitle / fallbackModelsDesc / fallbackEnabledSuffix / fallbackEmpty / fallbackAdd / fallbackRemove / fallbackEnabled / placeholders)
## Verification
- `npm run build` passes (assistant chunk 149.85 → 153.98 kB)
- `cargo fmt --check` passes
- `cargo clippy --all-targets -- -D warnings` passes
- Backward compatible: existing users' `localStorage` has no `fallbackModels` field; loadConfig initializes an empty array and existing behavior is unchanged
Refs: model compatibility improvements + multi-model failover feature request
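The back-compat default can be sketched as a one-line normalization on load (`normalizeConfig` is an illustrative name, not the real `loadConfig`):

```javascript
// Back-compat sketch: older stored configs lack fallbackModels; normalizing
// to an empty array on load keeps every existing code path unchanged.
function normalizeConfig(raw) {
  const cfg = JSON.parse(raw || '{}');
  if (!Array.isArray(cfg.fallbackModels)) cfg.fallbackModels = [];
  return cfg;
}

console.log(normalizeConfig('{"model":"gpt-4o"}').fallbackModels); // → []
```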
OpenClaw & Hermes Agent Management Panel with Built-in AI Assistant — Multi-Engine AI Framework Management
🇨🇳 中文 | 🇺🇸 English | 🇹🇼 繁體中文 | 🇯🇵 日本語 | 🇰🇷 한국어 | 🇻🇳 Tiếng Việt | 🇪🇸 Español | 🇧🇷 Português | 🇷🇺 Русский | 🇫🇷 Français | 🇩🇪 Deutsch
ClawPanel is a visual management panel supporting multiple AI Agent frameworks, currently with OpenClaw and Hermes Agent dual-engine support. It features a built-in intelligent AI assistant that helps you install, auto-diagnose configurations, troubleshoot issues, and fix errors. 8 tools + 4 modes + interactive Q&A — easy to manage for beginners and experts alike.
🌐 Website: claw.qt.cool | 📦 Download: GitHub Releases
🎁 QingchenCloud AI API
Internal technical testing platform, open for selected users. Sign in daily to earn credits.
- Daily Sign-in Credits — Sign in daily + invite friends to earn test credits
- OpenAI-Compatible API — Seamless integration with OpenClaw, plug and play
- Resource Policy — Rate limiting + request caps, may queue during peak hours
- Model Availability — Models/APIs subject to actual page display, may rotate versions
⚠️ Compliance: This platform is for technical testing only. Illegal use or circumventing security mechanisms is prohibited. Keep your API Key secure. Rules subject to latest platform policies.
🔥 Dev Board / Embedded Device Support
ClawPanel provides a pure Web deployment mode (zero GUI dependency), natively compatible with ARM64 boards:
- Orange Pi / Raspberry Pi / RK3588 — `npm run serve` to run
- Docker ARM64 — `docker run ghcr.io/qingchencloud/openclaw:latest`
- Armbian / Debian / Ubuntu Server — auto-detects architecture
- No Rust / Tauri / GUI needed — only Node.js 18+ required
📖 See Armbian Deployment Guide | Web Dev Mode
Community
A community of passionate AI Agent developers and enthusiasts — join us!
Discord · Discussions · Report Issue
Features
- 🤖 AI Assistant (New) — Built-in AI assistant, 4 modes + 8 tools + interactive Q&A. See AI Assistant Highlights
- 🧩 Multi-Engine Architecture — Supports both OpenClaw and Hermes Agent dual engines, freely switchable, independently managed
- 🤖 Hermes Agent Chat — Built-in Hermes Agent chat interface with tool call visualization, file system access toggle, SSE streaming output
- 🖼️ Image Recognition — Paste screenshots or drag images, AI auto-analyzes, multimodal conversations
- Dashboard — System overview, real-time service monitoring, quick actions
- Service Management — OpenClaw / Hermes Gateway start/stop, version detection & one-click upgrade, config backup & restore
- Model Configuration — Multi-provider management, model CRUD, batch connectivity tests, latency detection, drag-to-reorder, auto-save + undo
- Gateway Configuration — Port, access scope (localhost/LAN), auth Token, Tailscale networking
- Messaging Channels — Unified Telegram, Discord, Feishu, DingTalk, QQ management, multi-Agent binding per platform
- Communication & Automation — Message settings, broadcast strategies, slash commands, Webhooks, execution approval
- Usage Analytics — Token usage, API costs, model/provider/tool rankings, daily usage charts
- Agent Management — Agent CRUD, identity editing, model config, workspace management
- Chat — Streaming, Markdown rendering, session management, /fast /think /verbose /reasoning commands
- Cron Jobs — Cron-based scheduled execution, multi-channel delivery
- Log Viewer — Multi-source real-time logs with keyword search
- Memory Management — Memory file view/edit, categorized management, ZIP export, Agent switching
- QingchenCloud AI API — Internal testing platform, OpenAI-compatible, daily sign-in credits
- Extensions — cftunnel tunnel management, ClawApp status monitoring
- About — Version info, community links, related projects, one-click upgrade
Download & Install
Go to Releases for the latest version:
macOS
| Chip | Installer | Notes |
|---|---|---|
| Apple Silicon (M1/M2/M3/M4) | ClawPanel_x.x.x_aarch64.dmg | Macs from late 2020 onward |
| Intel | ClawPanel_x.x.x_x64.dmg | Macs from 2020 and earlier |
⚠️ "Damaged" or "unverified developer"? App is unsigned. Run:
sudo xattr -rd com.apple.quarantine /Applications/ClawPanel.app
Windows
| Format | Installer | Notes |
|---|---|---|
| EXE | ClawPanel_x.x.x_x64-setup.exe | Recommended |
| MSI | ClawPanel_x.x.x_x64_en-US.msi | Enterprise / silent install |
Linux
| Format | Installer | Notes |
|---|---|---|
| AppImage | ClawPanel_x.x.x_amd64.AppImage | No install, `chmod +x` and run |
| DEB | ClawPanel_x.x.x_amd64.deb | `sudo dpkg -i *.deb` |
| RPM | ClawPanel-x.x.x-1.x86_64.rpm | `sudo rpm -i *.rpm` |
Linux Server (Web Version)
curl -fsSL https://raw.githubusercontent.com/qingchencloud/clawpanel/main/scripts/linux-deploy.sh | bash
Visit http://YOUR_SERVER_IP:1420 after deployment. 📖 Linux Deployment Guide
Docker
docker run -d --name clawpanel --restart unless-stopped \
-p 1420:1420 -v clawpanel-data:/root/.openclaw \
node:22-slim \
sh -c "apt-get update && apt-get install -y git && \
npm install -g @qingchencloud/openclaw-zh --registry https://registry.npmmirror.com && \
git clone https://github.com/qingchencloud/clawpanel.git /app && \
cd /app && npm install && npm run build && npm run serve"
Quick Start
- Initial Setup — First launch auto-detects Node.js, Git, OpenClaw. One-click install if missing.
- Configure Models — Add AI providers (DeepSeek, MiniMax, OpenAI, Ollama, etc.) with API keys. Test connectivity.
- Start Gateway — Go to Service Management, click Start. Green status = ready.
- Start Chatting — Go to Live Chat, select model, start conversation with streaming & Markdown.
🤖 AI Assistant Highlights
Built-in AI assistant that can directly operate your system — diagnose, fix, even submit PRs.
Four Modes
| Mode | Icon | Tools | Write | Confirm | Use Case |
|---|---|---|---|---|---|
| Chat | 💬 | ❌ | ❌ | — | Pure Q&A |
| Plan | 📋 | ✅ | ❌ | ✅ | Read configs/logs, output plans |
| Execute | ⚡ | ✅ | ✅ | ✅ | Normal work, dangerous ops need confirm |
| Unlimited | ∞ | ✅ | ✅ | ❌ | Full auto, no prompts |
Eight Tools
| Tool | Function |
|---|---|
| ask_user | Ask the user questions (single/multi/text) |
| get_system_info | Get OS, architecture, home directory |
| run_command | Execute shell commands |
| read_file / write_file | Read/write files |
| list_directory | Browse directories |
| list_processes | View processes |
| check_port | Check port usage |
Tech Architecture
| Layer | Technology | Description |
|---|---|---|
| Frontend | Vanilla JS + Vite | Zero framework, lightweight |
| Backend | Rust + Tauri v2 | Native performance, cross-platform |
| Communication | Tauri IPC + Shell Plugin | Frontend-backend bridge |
| Styling | Pure CSS (CSS Variables) | Dark/Light themes, glassmorphism |
Build from Source
git clone https://github.com/qingchencloud/clawpanel.git
cd clawpanel && npm install
# Desktop (requires Rust + Tauri v2)
npm run tauri dev # Development
npm run tauri build # Production
# Web only (no Rust needed)
npm run dev # Dev with hot reload
npm run build && npm run serve # Production
Related Projects
| Project | Description |
|---|---|
| OpenClaw | AI Agent Framework |
| ClawApp | Cross-platform mobile chat client |
| cftunnel | Cloudflare Tunnel tool |
Contributing
Issues and Pull Requests are welcome. See CONTRIBUTING.md for guidelines.
Acknowledgements
ClawPanel keeps growing because of every contributor in the community. Thank you for helping make the project better.
Code Contributors
Thanks to these developers for submitting Pull Requests and contributing directly to the codebase:
- liucong2013 (#88)
- axdlee (#58)
- ATGCS (#107)
- livisun (#106)
- kiss-kedaya (#101, #94)
- wzh4869 (#82)
- 0xsline (#15)
- jonntd (#18)
Community Reporters
Thanks to community members who opened issues, reported bugs, and suggested features:
If we missed your contribution, please open an issue and we will add it promptly.
Sponsor
If you find this project useful, consider supporting us via USDT (BNB Smart Chain):
0xbdd7ebdf2b30d873e556799711021c6671ffe88f
Contact
- Email: support@qctx.net
- Website: qingchencloud.com
- Product: claw.qt.cool
License
This project is licensed under AGPL-3.0. For commercial/proprietary use without open-source requirements, contact us for a commercial license.
© 2026 QingchenCloud (武汉晴辰天下网络科技有限公司) | claw.qt.cool