Compare commits

...

3 Commits

Author SHA1 Message Date
Syngnat
6ad690cffc release/0.5.6 (#210)
* 🐛 fix(data-viewer): Fix ClickHouse tail-page pagination failures and improve DuckDB complex-type compatibility

- DataViewer adds a reverse-pagination strategy for ClickHouse, fixing failed queries on the last and second-to-last pages
- When a DuckDB query fails, generate a safe SELECT from the column types and retry with complex types cast to VARCHAR
- Pagination state is now backfilled uniformly from currentPage, avoiding inconsistencies between the page number and the derived total
- Improve query-error logging and retry paths, reducing stalls and false alarms on large tables

* feat(frontend-driver): Add quick search to driver management and improve information display

- New search box for quickly locating drivers by keywords such as DuckDB/ClickHouse
- Show a "matched x / y" counter and a no-results hint
- Improve header layout and visual alignment in transparent/dark themes

* 🔧 fix(connection-modal): Fix URI-import parsing for multiple data sources and correct Oracle service-name validation

- Add a single-host URI parsing map covering postgres/postgresql, sqlserver, redis, tdengine, dameng (dm), kingbase, highgo, vastbase, clickhouse, and oracle
- Extract parseSingleHostUri for reuse, unifying how host/port/user/password/database fields are backfilled
- Make the Oracle service name a required field and remove the implicit "fall back to the username when the service name is empty" behavior
- Add an Oracle service-name input and URI examples to the connection modal
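The single-host URI parsing above can be sketched with Go's standard `net/url`. This is an illustration only: the project's actual `parseSingleHostUri` lives in the frontend, and the alias table below is my assumption, not GoNavi's real mapping.

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// ConnInfo holds the fields the connection form backfills from a URI.
type ConnInfo struct {
	Driver, Host, Port, User, Password, Database string
}

// schemeAlias maps URI scheme variants onto one canonical driver name.
// Illustrative subset only, not the project's full table.
var schemeAlias = map[string]string{
	"postgres": "postgresql", "postgresql": "postgresql",
	"dm": "dameng", "dameng": "dameng",
	"sqlserver": "sqlserver", "redis": "redis", "oracle": "oracle",
	"clickhouse": "clickhouse", "kingbase": "kingbase",
}

// parseSingleHostURI parses "scheme://user:pass@host:port/database" style
// URIs for single-host data sources.
func parseSingleHostURI(raw string) (ConnInfo, error) {
	u, err := url.Parse(raw)
	if err != nil {
		return ConnInfo{}, err
	}
	driver, ok := schemeAlias[strings.ToLower(u.Scheme)]
	if !ok {
		return ConnInfo{}, fmt.Errorf("unsupported scheme %q", u.Scheme)
	}
	pw, _ := u.User.Password() // nil-safe: returns "" when no userinfo
	return ConnInfo{
		Driver:   driver,
		Host:     u.Hostname(),
		Port:     u.Port(),
		User:     u.User.Username(),
		Password: pw,
		Database: strings.TrimPrefix(u.Path, "/"),
	}, nil
}

func main() {
	info, _ := parseSingleHostURI("postgres://app:secret@10.0.0.5:5432/orders")
	fmt.Printf("%+v\n", info)
}
```

Centralizing the alias lookup is what lets one parser backfill the same five fields for every single-host data source.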

* 🐛 fix(query-export): Fix query-result export hanging and route exports by data-source capability

- Add a reliable fallback to query-result export so loading is always dismissed on error instead of spinning forever
- Split DataGrid export logic by data-source capability: prefer the backend ExportQuery path and keep result-set export as a fallback
- QueryEditor passes the result-export SQL so the exported scope matches the current result set
- Add key ExportData/ExportQuery logs on the backend to improve observability of the export pipeline

* 🐛 fix(precision): Fix big-integer precision loss in the query pipeline and pagination totals

- Decode proxied response data with UseNumber, so the default float64 decoding no longer swallows precision
- Normalize json.Number and out-of-range integers uniformly; values beyond the JS safe-integer range are converted to strings
- Fix DataViewer total-count parsing so oversized values are no longer coerced to Number for pagination
- refs #142
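The UseNumber approach above can be sketched in Go. A minimal illustration (function names are mine, not GoNavi's actual code): decode with `json.Decoder.UseNumber` so integers arrive as `json.Number` rather than `float64`, then stringify anything beyond JavaScript's safe-integer range.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
)

// maxSafeInteger mirrors JavaScript's Number.MAX_SAFE_INTEGER (2^53 - 1).
const maxSafeInteger = 1<<53 - 1

// decodeRow decodes a JSON object with UseNumber so integers arrive as
// json.Number instead of float64, which would silently round values
// above 2^53.
func decodeRow(raw []byte) (map[string]any, error) {
	dec := json.NewDecoder(bytes.NewReader(raw))
	dec.UseNumber()
	var row map[string]any
	if err := dec.Decode(&row); err != nil {
		return nil, err
	}
	for k, v := range row {
		row[k] = normalize(v)
	}
	return row, nil
}

// normalize keeps JS-safe integers as int64 and converts anything larger
// to a string; non-integral numbers become float64.
func normalize(v any) any {
	n, ok := v.(json.Number)
	if !ok {
		return v
	}
	if i, err := n.Int64(); err == nil {
		if i > maxSafeInteger || i < -maxSafeInteger {
			return n.String() // too big for a JS Number: ship as a string
		}
		return i
	}
	if f, err := n.Float64(); err == nil {
		return f
	}
	return n.String() // e.g. an integer wider than int64
}

func main() {
	row, _ := decodeRow([]byte(`{"id":9007199254740993,"count":42}`))
	fmt.Println(row["id"], row["count"]) // 9007199254740993 42
}
```

The string form survives the trip to the frontend intact, which is why the DataViewer total-count fix parses it as text rather than `Number`.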

* 🐛 fix(driver-manager): Deduplicate driver-manager network warnings and strengthen proxy guidance

- Add domain probing for the download path, distinguishing "GitHub reachable but the driver download path unreachable"
- Keep only the red high-priority alert when the network is unreachable; remove the duplicated secondary warning
- Add an "open global proxy settings" entry to the alert, steering users toward the GoNavi global proxy first
- Unify icon sizes in the network-check and directory hints, fixing visual inconsistency while loading
- refs #141

* ♻️ refactor(frontend-interaction): Unify tab drag-and-drop and dark-theme interactions

- Rework tab drag-sorting into a configurable drag engine
- Clarify the boundary between drag and click events for more consistent interaction
- Unify the dark/transparent styling strategy across components, reducing hard-coded color values
- Improve visual consistency of the Redis, table, and connection panels in transparent mode
- refs #144

* ♻️ refactor(update-state): Rework the online-update state flow and unify progress display per version

- Rework update-check and download state synchronization to reduce frontend/backend state divergence
- Bind progress display strictly to latestVersion to avoid mixing state across versions
- Improve silent-check state backfill when the About dialog is opened
- Unify download-dialog close and background-hide behavior
- Keep the existing install flow and add the ability to open the download directory

* 🎨 style(sidebar-log): Turn the SQL execution-log entry into a floating pill

- Remove the full-width log-entry container at the bottom of the sidebar
- Add floating-button shadow/border/transparent background adapted to light and dark themes
- Reserve bottom space in the tree area so the entry does not cover content

* feat(redis-cluster): Support logical multi-database isolation and db 0-15 switching in cluster mode

- Frontend restores db0-db15 database selection and display for Redis cluster connections
- Backend adds a logical-database namespace prefix mapping for clusters, unifying key/pattern read/write isolation
- Key-mapping rules cover the core operations: scan, read, write, delete, and rename
- The cluster command channel supports SELECT for logical database switching and FLUSHDB for clearing a logical database
- refs #145
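Since Redis Cluster only exposes db 0, the logical databases above have to be emulated by namespacing keys. A minimal Go sketch of the idea — the `db%d:` prefix shape and function names here are assumptions for illustration, not GoNavi's exact format:

```go
package main

import "fmt"

// logicalKey namespaces a key under a logical database index so a Redis
// Cluster (which only has db 0) can emulate db0-db15.
func logicalKey(db int, key string) string {
	if db == 0 {
		return key // db0 stays unprefixed for backward compatibility
	}
	return fmt.Sprintf("db%d:%s", db, key)
}

// logicalPattern applies the same mapping to SCAN match patterns so
// scans, reads, and writes all stay isolated per logical database.
func logicalPattern(db int, pattern string) string {
	return logicalKey(db, pattern)
}

func main() {
	fmt.Println(logicalKey(3, "user:42"))    // db3:user:42
	fmt.Println(logicalPattern(3, "user:*")) // db3:user:*
	fmt.Println(logicalKey(0, "user:42"))    // user:42
}
```

As long as every operation (scan, read, write, delete, rename) routes through the same mapping, SELECT becomes a change of prefix and FLUSHDB becomes a prefixed SCAN-and-delete.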

* feat(DataGrid): Virtual-scrolling performance optimization for large tables and UI consistency fixes

- Enable dynamic virtual scrolling (switches automatically at ≥500 rows), fixing lag on tables with tens of thousands of rows
- In virtual mode, EditableCell renders as a div; CSS selectors switch from element-level to class-level to match the virtual DOM
- Fix double horizontal scrollbars in virtual mode: style the built-in rc-virtual-list scrollbar as a pill and disable the custom external scrollbar
- Add mouse-wheel support to the rc-virtual-list horizontal scrollbar (driven by MutationObserver + marginLeft)
- Fix insufficient contrast of the column-name hover Tooltip in the light theme's transparent mode
- Add light-theme global scrollbar styles adapted to transparent mode (App.css)
- Polish App.tsx theme tokens and component styles
- refs #147

* 🔧 chore(app): Clean up App.tsx type warnings and tidy the frontend shell

- Remove unused code and redundant state
- Replace deprecated APIs to silence IDE hints
- Handle floating Promises explicitly to avoid warnings
- Keep existing update-check and proxy-settings behavior unchanged

* 🔧 fix(ci): Fix the DuckDB driver build pipeline on Windows AMD64

- Switch DuckDB toolchain preparation to prefer MSYS2
- Add existence and version checks for gcc and g++
- Fall back to installing MinGW via Chocolatey when MSYS2 fails
- Keep Windows ARM64 skipping the DuckDB build, consistent with platform support

* 🔧 fix(ci): Fix the DuckDB driver build pipeline on Windows AMD64

- Switch DuckDB toolchain preparation to prefer MSYS2
- Add existence and version checks for gcc and g++
- Fall back to installing MinGW via Chocolatey when MSYS2 fails
- Keep Windows ARM64 skipping the DuckDB build, consistent with platform support

* 🔧 fix(ci): Fix the DuckDB driver build toolchain on Windows AMD64

- Switch the DuckDB build chain from MINGW64 to MSYS2 UCRT64
- Correct the gcc and g++ detection paths on Windows AMD64
- Add a DuckDB compiler version-check step

* 📝 docs(contributing): Add contribution guides in English and Chinese and unify the README entry points

- Add the English CONTRIBUTING.md as the canonical contribution guide
- Add the Chinese CONTRIBUTING.zh-CN.md
- Point the contribution sections of README and README.zh-CN at the matching-language documents

* - feat(connection,metadata,kingbase): Enhance multi-data-source connectivity and fix Kingbase/Dameng/Oracle/ClickHouse compatibility issues (#188) (#190)

* feat(http-tunnel): Support standalone HTTP tunnel connections across multiple data sources

refs #168

* fix(kingbase-data-grid): Fix lag when opening Kingbase tables and reduce object-rendering overhead

refs #178

* fix(kingbase-transaction): Fix syntax errors caused by duplicated quotes on Kingbase transaction commit

refs #176

* fix(driver-agent): Fix the Kingbase driver agent failing to start after upgrades on older Win10 builds

refs #177

* chore(ci): Add a manually triggered macOS test-build workflow

* chore(ci): Allow the test workflow to auto-trigger on the current branch

* fix(query-editor): Fix the cursor randomly jumping to the end while editing SQL refs #185

* feat(data-sync): Add diff-SQL preview to ease review refs #174

* fix(clickhouse-connect): Auto-detect and fall back between HTTP and Native protocol connections refs #181

* fix(oracle-metadata): Fix schema filtering when loading views and functions refs #155

* fix(dameng-databases): Fix an incomplete database list when showing all databases refs #154

* fix(connection,db-list): Handle empty-list responses uniformly and fix Dameng connection-test errors refs #157

Co-authored-by: 辣条 <69459608+tianqijiuyun-latiao@users.noreply.github.com>

* feat(release-notes): Support auto-generating Release notes and distinguish config-file naming

* 🔁 chore(sync): Backfill main into dev (#192)

* Release/0.5.3 (#191)

---------

Co-authored-by: 辣条 <69459608+tianqijiuyun-latiao@users.noreply.github.com>
Co-authored-by: Syngnat <92659908+Syngnat@users.noreply.github.com>

* 🐛 fix(branch-sync): Fix auto-merge being skipped because mergeable is computed asynchronously when backfilling main into dev

- Poll the mergeable status to avoid getting UNKNOWN right after the sync PR is created
- Emit a warning and an execution summary while the merge state is still unstable
- Keep the handling paths for conflicting, still-computing, and auto-mergeable branches distinct

* 🔁 chore(sync): Backfill main into dev (#195)

* - chore(ci): Add a manually triggered workflow that builds test packages for all platforms (#194)

* fix(kingbase): Add missing primary-key detection and reduce wide-table lag refs #176 refs #178

* fix(query-execution): Recognize read-query results when the SQL starts with leading comments

* chore(ci): Add a manually triggered workflow that builds test packages for all platforms

---------

Co-authored-by: 辣条 <69459608+tianqijiuyun-latiao@users.noreply.github.com>
Co-authored-by: Syngnat <92659908+Syngnat@users.noreply.github.com>

* ♻️ refactor(frontend-sync): Polish desktop interaction details and remove the main-into-dev backfill automation

- Polish the UI of new-connection, theme settings, the sidebar tool area, and the SQL log
- Adjust pagination, filtering, transparent-mode, and dialog styles for a unified interaction hierarchy
- Consolidate how appearance parameters take effect and adapt the remaining components
- Delete the sync-main-to-dev workflow and update the maintainer notes on manual backfill

* feat: Unify the width of filter-condition logic buttons (#201)

* 🐛 fix(oracle-query): Fix Oracle table-data pagination SQL compatibility refs #196 (#202)

* feat(datasource): Support DuckDB Parquet file mode and streamline the modal opening path

- Unify DuckDB file-database and Parquet file access
- Add URI, file-selection, read-only mount, and connection cache-key handling
- Drop the synchronous driver query that ran before a data-source card was clicked, fixing the opening lag

* feat(datasource): Support DuckDB Parquet file mode and streamline the modal opening path

- Unify DuckDB file-database and Parquet file access
- Add URI, file-selection, read-only mount, and connection cache-key handling
- Drop the synchronous driver query that ran before a data-source card was clicked, fixing the opening lag
- refs #166

* 🐛 fix(dameng): Fix the database list being empty after a successful Dameng connection

- Adjust the Dameng database-list strategy, falling back to querying the current schema and current user first
- Keep the visible-user and owner aggregation logic to support low-privilege accounts
- Add a frontend empty-list hint and backend unit tests to lower troubleshooting cost
- close #203

* feat(data-sync): Extend the cross-database migration paths and improve the data-sync UX

- Unify the entry points for same-database sync and cross-database migration, with mode distinction and risk warnings
- Extend bidirectional ClickHouse/PG-like migration and add PG-like, ClickHouse, and TDengine to MongoDB migration routes
- Complete the TDengine target-side table-creation planning, regression tests, and requirement-tracking docs
- refs #51

* 🐛 fix(connection): Fix form data loss when switching tabs while creating a connection

- Testing the connection from the SSH tab reset the basic-info host to the default localhost
- Saving from the basic-info tab dropped the SSH configuration
- The saved result contained only the fields of the currently selected tab
- refs #208

* 🐛 fix(mongodb): Fix standalone-mode connections to replica-set members being redirected to an internal address

- getURI did not set directConnection=true when topology=single
- After connecting, the driver followed replica-set member discovery and switched to localhost:27017
- Add directConnection=true in mongodb_impl.go and mongodb_impl_v1.go
- Applied only when the topology is not replica, no replicaSet is set, and the URI is not SRV
- refs #205
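The directConnection rule above can be illustrated with plain URI manipulation (a sketch under the stated conditions; the real change sits inside getURI in mongodb_impl.go, and the function name here is mine):

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// ensureDirectConnection appends directConnection=true to a standalone
// mongodb:// URI so the driver will not follow replica-set member
// discovery to an internal address. Skipped for mongodb+srv URIs,
// explicit replicaSet options, and multi-host seed lists.
func ensureDirectConnection(uri string) string {
	if strings.HasPrefix(uri, "mongodb+srv://") {
		return uri // the SRV scheme forbids directConnection=true
	}
	u, err := url.Parse(uri)
	if err != nil {
		return uri // leave unparseable URIs untouched
	}
	if strings.Contains(u.Host, ",") {
		return uri // multi-host: treat as a replica-set seed list
	}
	q := u.Query()
	if q.Has("replicaSet") || q.Has("directConnection") {
		return uri
	}
	q.Set("directConnection", "true")
	u.RawQuery = q.Encode()
	return u.String()
}

func main() {
	fmt.Println(ensureDirectConnection("mongodb://10.0.0.8:27017/admin"))
}
```

With directConnection=true the driver talks only to the listed host, so a replica-set member advertising localhost:27017 internally no longer hijacks the connection.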

* 🐛 fix(DataGrid): Fix the context menu not working in virtual-scrolling mode

- The enable conditions for the row-level and cell-level context menus were mutually exclusive, so both were disabled in virtual mode
- enableLargeResultOptimizedEditing turned off inline editing without falling back to the row-level menu
- Adjust the useContextMenuRow and enableRowContextMenu conditions to enable the row-level menu in virtual mode
- Update the useMemo dependency array of dataContextValue
- refs #209

* 🐛 fix(sqlserver): Fix pagination syntax and identifier quoting when viewing SQL Server table data

- quoteIdentPart lacked a sqlserver branch, so identifiers were quoted with double quotes instead of [bracket]s
- buildPaginatedSelectSQL adds an mssql alias fallback so dbType variants no longer fall through to the default branch
- After the fix, identifiers use [bracket]s and pagination uses OFFSET ... FETCH NEXT syntax
- refs #204
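A minimal Go sketch of the two fixes above — simplified signatures, and only a few dialects; the project's real quoteIdentPart and buildPaginatedSelectSQL cover far more:

```go
package main

import (
	"fmt"
	"strings"
)

// quoteIdentPart quotes one identifier segment per database dialect.
func quoteIdentPart(dbType, part string) string {
	switch strings.ToLower(dbType) {
	case "sqlserver", "mssql":
		// SQL Server wants [bracket] quoting; escape "]" by doubling it.
		return "[" + strings.ReplaceAll(part, "]", "]]") + "]"
	default:
		// The ANSI double-quote default that previously (wrongly) caught
		// SQL Server as well.
		return `"` + strings.ReplaceAll(part, `"`, `""`) + `"`
	}
}

// buildPaginatedSelectSQL builds a paged SELECT. SQL Server needs
// ORDER BY + OFFSET ... FETCH NEXT rather than LIMIT/OFFSET, and the
// "mssql" alias fallback keeps dbType variants out of the default branch.
func buildPaginatedSelectSQL(dbType, table, orderBy string, offset, limit int) string {
	ident := quoteIdentPart(dbType, table)
	switch strings.ToLower(dbType) {
	case "sqlserver", "mssql":
		return fmt.Sprintf("SELECT * FROM %s ORDER BY %s OFFSET %d ROWS FETCH NEXT %d ROWS ONLY",
			ident, quoteIdentPart(dbType, orderBy), offset, limit)
	default:
		return fmt.Sprintf("SELECT * FROM %s ORDER BY %s LIMIT %d OFFSET %d",
			ident, quoteIdentPart(dbType, orderBy), limit, offset)
	}
}

func main() {
	fmt.Println(buildPaginatedSelectSQL("mssql", "Orders", "Id", 200, 100))
}
```

Note that OFFSET ... FETCH requires an ORDER BY clause on SQL Server, which is why the sketch threads one through.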

* feat(DataGrid): Unify the table context-menu experience

- Remove the feature-poor row-level ContextMenuRow entirely in favor of the richer cell-level context menu
- Optimize rendering in virtual-scrolling and read-only modes so the cell context menu can still be triggered
- Adaptive menu display: hide editing items such as "Set to NULL" and "Fill into selected rows" in read-only or non-editable scenarios
- refs #209

* 🔧 fix(DataGrid): Enable virtual scrolling by default and fix multi-cell selection highlighting

- Remove the row/column-count thresholds for enabling virtual scrolling and enable it for the whole table view by default, eliminating the lag for good
- Fix updateCellSelection hard-coding the `td` selector when locating coordinate nodes; match `.ant-table-cell` precisely instead, compatible with the `div` rendering used in virtual mode
- Fix the transparent-window `transparent !important` rule overriding the highlight styles by raising the specificity of the selection background and border CSS
- Remove the right-side gap caused by nested cell attributes so the highlight box sits flush with the cell edge
- Adapt the highlight to the theme (dark mode uses a deep yellow highlight, light mode the default blue)

---------

Co-authored-by: Syngnat <yangguofeng919@gmail.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: 辣条 <69459608+tianqijiuyun-latiao@users.noreply.github.com>
Co-authored-by: TSS <266256496+Zencok@users.noreply.github.com>
2026-03-10 11:26:02 +08:00
Syngnat
22bd1c4c28 Release/0.5.5 (#207)
---------

Co-authored-by: Syngnat <yangguofeng919@gmail.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: 辣条 <69459608+tianqijiuyun-latiao@users.noreply.github.com>
Co-authored-by: TSS <266256496+Zencok@users.noreply.github.com>
2026-03-09 17:36:52 +08:00
Syngnat
89c81823bc Release/0.5.4 (#199)
---------

Co-authored-by: Syngnat <yangguofeng919@gmail.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: 辣条 <69459608+tianqijiuyun-latiao@users.noreply.github.com>
2026-03-07 17:15:30 +08:00
53 changed files with 10821 additions and 1642 deletions


@@ -1,159 +0,0 @@
name: Backfill main into dev
on:
  push:
    branches:
      - main
  workflow_dispatch:
permissions:
  contents: write
  pull-requests: write
concurrency:
  group: sync-main-to-dev
  cancel-in-progress: true
jobs:
  sync-main-to-dev:
    name: Run the backfill sync
    runs-on: ubuntu-latest
    steps:
      - name: Check out code
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Check whether a sync is needed
        id: diff_check
        shell: bash
        run: |
          set -euo pipefail
          echo "Checking branch divergence between main and dev..."
          git fetch origin main dev
          ahead_count="$(git rev-list --count origin/dev..origin/main)"
          echo "ahead_count=${ahead_count}" >> "$GITHUB_OUTPUT"
          if [ "${ahead_count}" -eq 0 ]; then
            echo "No sync needed: dev already contains the latest commits from main."
            echo "has_changes=false" >> "$GITHUB_OUTPUT"
          else
            echo "Detected ${ahead_count} commits to sync; creating or reusing the sync PR."
            echo "has_changes=true" >> "$GITHUB_OUTPUT"
          fi
      - name: Create or reuse the sync PR
        id: sync_pr
        if: steps.diff_check.outputs.has_changes == 'true'
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        shell: bash
        run: |
          set -euo pipefail
          echo "permission_blocked=false" >> "$GITHUB_OUTPUT"
          existing_number="$(gh pr list --base dev --head main --state open --json number --jq '.[0].number // empty')"
          if [ -n "${existing_number}" ]; then
            pr_number="${existing_number}"
            pr_url="$(gh pr view "${pr_number}" --json url --jq '.url')"
            echo "Reusing existing sync PR #${pr_number}."
            echo "created=false" >> "$GITHUB_OUTPUT"
          else
            body_file="$(mktemp)"
            error_file="$(mktemp)"
            {
              echo "## Automatic backfill: \`main -> dev\`"
              echo
              echo "- Trigger: new commits on \`main\` (including contributor PRs merged directly into \`main\`)"
              echo "- Goal: keep \`dev\` continuously absorbing updates from \`main\`, avoiding concentrated conflicts before a release"
              echo
              echo "### Merge guidance"
              echo "- No conflicts: merge this PR directly (\`Merge commit\` recommended)"
              echo "- Conflicts: resolve them within this PR, then merge"
            } > "${body_file}"
            if pr_url="$(gh pr create \
              --base dev \
              --head main \
              --title "🔁 chore(sync): backfill main into dev" \
              --body-file "${body_file}" 2>"${error_file}")"; then
              pr_number="${pr_url##*/}"
              echo "Created sync PR #${pr_number}."
              echo "created=true" >> "$GITHUB_OUTPUT"
            else
              error_message="$(tr '\n' ' ' < "${error_file}")"
              if printf '%s' "${error_message}" | grep -Fq "GitHub Actions is not permitted to create or approve pull requests"; then
                echo "::warning::The repository has not enabled \"Allow GitHub Actions to create and approve pull requests\"; skipped creating the sync PR automatically."
                echo "permission_blocked=true" >> "$GITHUB_OUTPUT"
                echo "created=false" >> "$GITHUB_OUTPUT"
                echo "pr_number=" >> "$GITHUB_OUTPUT"
                echo "pr_url=" >> "$GITHUB_OUTPUT"
                exit 0
              fi
              echo "::error::Failed to create the sync PR: ${error_message}"
              exit 1
            fi
          fi
          echo "pr_number=${pr_number}" >> "$GITHUB_OUTPUT"
          echo "pr_url=${pr_url}" >> "$GITHUB_OUTPUT"
      - name: Check merge status
        id: merge_state
        if: steps.diff_check.outputs.has_changes == 'true' && steps.sync_pr.outputs.permission_blocked != 'true'
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        shell: bash
        run: |
          set -euo pipefail
          pr_number="${{ steps.sync_pr.outputs.pr_number }}"
          mergeable="$(gh pr view "${pr_number}" --json mergeable --jq '.mergeable')"
          merge_state_status="$(gh pr view "${pr_number}" --json mergeStateStatus --jq '.mergeStateStatus')"
          echo "PR #${pr_number} merge status: mergeable=${mergeable}, mergeStateStatus=${merge_state_status}"
          echo "mergeable=${mergeable}" >> "$GITHUB_OUTPUT"
          echo "merge_state_status=${merge_state_status}" >> "$GITHUB_OUTPUT"
      - name: Enable auto-merge when mergeable
        id: auto_merge
        if: steps.diff_check.outputs.has_changes == 'true' && steps.sync_pr.outputs.permission_blocked != 'true' && steps.merge_state.outputs.mergeable == 'MERGEABLE'
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        shell: bash
        run: |
          set -euo pipefail
          pr_number="${{ steps.sync_pr.outputs.pr_number }}"
          if gh pr merge "${pr_number}" --merge --auto; then
            echo "Auto-merge enabled for PR #${pr_number}."
            echo "result=enabled" >> "$GITHUB_OUTPUT"
          else
            echo "::warning::Failed to enable auto-merge; please handle and merge the PR manually."
            echo "result=failed" >> "$GITHUB_OUTPUT"
          fi
      - name: Write execution summary
        if: always()
        shell: bash
        run: |
          {
            echo "## Result of backfilling main into dev"
            if [ "${{ steps.diff_check.outputs.has_changes }}" != "true" ]; then
              echo "- Status: no sync needed (dev already contains the latest commits from main)"
              exit 0
            fi
            if [ "${{ steps.sync_pr.outputs.permission_blocked }}" = "true" ]; then
              echo "- Status: skipped creating the sync PR automatically"
              echo "- Reason: GitHub Actions is not allowed to create and approve pull requests in this repository"
              echo "- Fix: go to Settings -> Actions -> General -> Workflow permissions and enable Allow GitHub Actions to create and approve pull requests"
              echo "- Fallback: a maintainer merges main into dev manually, or re-runs the workflow after enabling the setting"
              exit 0
            fi
            echo "- PR: ${{ steps.sync_pr.outputs.pr_url }}"
            echo "- Mergeable: ${{ steps.merge_state.outputs.mergeable }}"
            echo "- Merge state detail: ${{ steps.merge_state.outputs.merge_state_status }}"
            if [ "${{ steps.merge_state.outputs.mergeable }}" = "CONFLICTING" ]; then
              echo "- Conclusion: conflicts detected; resolve manually, then merge"
            elif [ "${{ steps.auto_merge.outputs.result }}" = "enabled" ]; then
              echo "- Conclusion: auto-merge enabled (the PR will merge into dev once protection rules are satisfied)"
            else
              echo "- Conclusion: the PR was created/reused; merge it manually per the branching policy"
            fi
          } >> "$GITHUB_STEP_SUMMARY"

.github/workflows/test-macos-build.yml

@@ -0,0 +1,91 @@
name: Test Build macOS (Manual)
on:
  workflow_dispatch:
    inputs:
      build_label:
        description: "Test-build label (used only in file names)"
        required: false
        default: "test"
  push:
    branches:
      - feature/kingbase_opt
    paths:
      - ".github/workflows/test-macos-build.yml"
permissions:
  contents: read
jobs:
  build-macos:
    name: Build macOS ${{ matrix.arch }}
    runs-on: macos-latest
    strategy:
      fail-fast: false
      matrix:
        include:
          - platform: darwin/amd64
            arch: amd64
          - platform: darwin/arm64
            arch: arm64
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Go
        uses: actions/setup-go@v5
        with:
          go-version: "1.24.3"
          check-latest: true
      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: "20"
      - name: Install Wails
        run: go install github.com/wailsapp/wails/v2/cmd/wails@v2.11.0
      - name: Build App
        run: |
          set -euo pipefail
          OUTPUT_NAME="gonavi-test-${{ matrix.arch }}"
          BUILD_LABEL="${{ inputs.build_label }}"
          if [ -z "$BUILD_LABEL" ]; then
            BUILD_LABEL="test"
          fi
          APP_VERSION="${BUILD_LABEL}-${GITHUB_RUN_NUMBER}"
          wails build \
            -platform "${{ matrix.platform }}" \
            -clean \
            -o "$OUTPUT_NAME" \
            -ldflags "-s -w -X GoNavi-Wails/internal/app.AppVersion=${APP_VERSION}"
      - name: Package Zip
        run: |
          set -euo pipefail
          APP_PATH="build/bin/gonavi-test-${{ matrix.arch }}.app"
          if [ ! -d "$APP_PATH" ]; then
            APP_PATH=$(find build/bin -maxdepth 1 -name "*.app" | head -n 1 || true)
          fi
          if [ -z "$APP_PATH" ] || [ ! -d "$APP_PATH" ]; then
            echo "No .app artifact found"
            ls -la build/bin || true
            exit 1
          fi
          LABEL="${{ inputs.build_label }}"
          if [ -z "$LABEL" ]; then
            LABEL="test"
          fi
          ZIP_NAME="GoNavi-${LABEL}-macos-${{ matrix.arch }}-run${GITHUB_RUN_NUMBER}.zip"
          mkdir -p artifacts
          ditto -c -k --sequesterRsrc --keepParent "$APP_PATH" "artifacts/$ZIP_NAME"
          shasum -a 256 "artifacts/$ZIP_NAME" > "artifacts/$ZIP_NAME.sha256"
      - name: Upload Artifact
        uses: actions/upload-artifact@v4
        with:
          name: gonavi-macos-${{ matrix.arch }}-run${{ github.run_number }}
          path: artifacts/*
          if-no-files-found: error


@@ -79,14 +79,8 @@ Because external pull requests are merged directly into `main`, maintainers must
 ### 1. Sync `main` -> `dev` (required)
-This repository provides automatic sync via GitHub Actions workflow:
-- `.github/workflows/sync-main-to-dev.yml`
-- Trigger: every push to `main`
-- Behavior: create/reuse a PR from `main` to `dev`; if mergeable, it tries to enable auto-merge
-- Prerequisite: in `Settings -> Actions -> General -> Workflow permissions`, enable `Allow GitHub Actions to create and approve pull requests`; otherwise the workflow will skip PR creation and only emit a warning summary
-Manual fallback (when conflicts or automation is unavailable):
+The automatic GitHub Actions sync workflow has been removed.
+Maintainers should sync `main` back to `dev` manually when needed:
 ```bash
 git checkout dev

View File

@@ -79,14 +79,8 @@ feature/* / fix/* -> dev -> release/* -> main -> tag(vX.Y.Z)
### 1. Sync `main` → `dev` (required)
The repository provides an automatic sync mechanism via GitHub Actions:
- `.github/workflows/sync-main-to-dev.yml`
- Trigger: every push to `main`
- Behavior: automatically create or reuse a sync PR from `main` to `dev`; if mergeable, try to enable auto-merge
- Prerequisite: enable `Allow GitHub Actions to create and approve pull requests` under `Settings -> Actions -> General -> Workflow permissions`; otherwise the workflow only emits a warning summary and skips creating the PR
Manual fallback when conflicts occur or automation is unavailable:
The automatic GitHub Actions sync workflow has been removed.
`main` is now synced back to `dev` manually:
```bash
git checkout dev

View File

@@ -1 +1 @@
d0f9366af59a6367ad3c7e2d4185ead4
5b8157374dae5f9340e31b2d0bd2c00e

View File

@@ -1,7 +1,7 @@
import React, { useState, useEffect } from 'react';
import React, { useState, useEffect, useMemo } from 'react';
import { Layout, Button, ConfigProvider, theme, Dropdown, MenuProps, message, Modal, Spin, Slider, Progress, Switch, Input, InputNumber, Select } from 'antd';
import zhCN from 'antd/locale/zh_CN';
import { PlusOutlined, ConsoleSqlOutlined, UploadOutlined, DownloadOutlined, CloudDownloadOutlined, BugOutlined, ToolOutlined, GlobalOutlined, InfoCircleOutlined, GithubOutlined, SkinOutlined, CheckOutlined, MinusOutlined, BorderOutlined, CloseOutlined, SettingOutlined, LinkOutlined } from '@ant-design/icons';
import { PlusOutlined, ConsoleSqlOutlined, UploadOutlined, DownloadOutlined, CloudDownloadOutlined, BugOutlined, ToolOutlined, GlobalOutlined, InfoCircleOutlined, GithubOutlined, SkinOutlined, CheckOutlined, MinusOutlined, BorderOutlined, CloseOutlined, SettingOutlined, LinkOutlined, BgColorsOutlined, AppstoreOutlined } from '@ant-design/icons';
import { BrowserOpenURL, Environment, EventsOn, Quit, WindowFullscreen, WindowGetSize, WindowIsFullscreen, WindowIsMaximised, WindowMaximise, WindowMinimise, WindowSetSize, WindowToggleMaximise } from '../wailsjs/runtime';
import Sidebar from './components/Sidebar';
import TabManager from './components/TabManager';
@@ -11,7 +11,7 @@ import DriverManagerModal from './components/DriverManagerModal';
import LogPanel from './components/LogPanel';
import { useStore } from './store';
import { SavedConnection } from './types';
import { blurToFilter, normalizeBlurForPlatform, normalizeOpacityForPlatform, isWindowsPlatform } from './utils/appearance';
import { blurToFilter, normalizeBlurForPlatform, normalizeOpacityForPlatform, isWindowsPlatform, resolveAppearanceValues } from './utils/appearance';
import {
SHORTCUT_ACTION_META,
SHORTCUT_ACTION_ORDER,
@@ -78,11 +78,11 @@ function App() {
const tokenControlHeightLG = Math.max(30, Math.round(40 * effectiveUiScale));
const appComponentSize: 'small' | 'middle' | 'large' = effectiveUiScale <= 0.92 ? 'small' : (effectiveUiScale >= 1.12 ? 'large' : 'middle');
const titleBarHeight = Math.max(28, Math.round(32 * effectiveUiScale));
const toolbarHeight = Math.max(32, Math.round(36 * effectiveUiScale));
const titleBarButtonWidth = Math.max(40, Math.round(46 * effectiveUiScale));
const floatingLogButtonHeight = Math.max(30, Math.round(34 * effectiveUiScale));
const effectiveOpacity = normalizeOpacityForPlatform(appearance.opacity);
const effectiveBlur = normalizeBlurForPlatform(appearance.blur);
const resolvedAppearance = resolveAppearanceValues(appearance);
const effectiveOpacity = normalizeOpacityForPlatform(resolvedAppearance.opacity);
const effectiveBlur = normalizeBlurForPlatform(resolvedAppearance.blur);
const blurFilter = blurToFilter(effectiveBlur);
const windowCornerRadius = 14;
const [runtimePlatform, setRuntimePlatform] = useState('');
@@ -93,8 +93,8 @@ function App() {
// Sync macOS window translucency: turn off NSVisualEffectView when opacity=1.0 and blur=0,
// to avoid the GPU continuously compositing the blur behind the window
useEffect(() => {
void SetWindowTranslucency(appearance.opacity, appearance.blur).catch(() => undefined);
}, [appearance.opacity, appearance.blur]);
void SetWindowTranslucency(resolvedAppearance.opacity, resolvedAppearance.blur).catch(() => undefined);
}, [resolvedAppearance.blur, resolvedAppearance.opacity]);
useEffect(() => {
let cancelled = false;
@@ -409,6 +409,141 @@ function App() {
const floatingLogButtonShadow = darkMode
? '0 8px 22px rgba(0,0,0,0.38)'
: '0 8px 20px rgba(0,0,0,0.16)';
const isOpaqueUtilityMode = resolvedAppearance.opacity >= 0.999 && resolvedAppearance.blur <= 0;
const utilityButtonBgAlpha = darkMode
? Math.max(0.28, Math.min(0.76, effectiveOpacity * 0.72))
: Math.max(0.52, Math.min(0.92, effectiveOpacity * 0.9));
const utilityButtonBgColor = isOpaqueUtilityMode
? 'transparent'
: (darkMode
? `rgba(20, 26, 38, ${utilityButtonBgAlpha})`
: `rgba(255, 255, 255, ${utilityButtonBgAlpha})`);
const utilityButtonBorderColor = isOpaqueUtilityMode
? (darkMode ? 'rgba(255,255,255,0.12)' : 'rgba(16,24,40,0.10)')
: (darkMode
? `rgba(255,255,255,${Math.max(0.08, Math.min(0.18, effectiveOpacity * 0.16))})`
: `rgba(16,24,40,${Math.max(0.06, Math.min(0.14, effectiveOpacity * 0.12))})`);
const utilityButtonShadow = isOpaqueUtilityMode
? 'none'
: (darkMode
? `0 8px 18px rgba(0,0,0,${Math.max(0.10, Math.min(0.22, effectiveOpacity * 0.24))})`
: `0 8px 18px rgba(15,23,42,${Math.max(0.04, Math.min(0.12, effectiveOpacity * 0.12))})`);
const utilityButtonStyle = useMemo(() => ({
height: Math.max(30, Math.round(32 * effectiveUiScale)),
width: '100%',
paddingInline: Math.max(10, Math.round(12 * effectiveUiScale)),
borderRadius: 10,
border: `1px solid ${utilityButtonBorderColor}`,
background: utilityButtonBgColor,
color: darkMode ? 'rgba(255,255,255,0.94)' : '#162033',
boxShadow: utilityButtonShadow,
backdropFilter: isOpaqueUtilityMode ? 'none' : blurFilter,
WebkitBackdropFilter: isOpaqueUtilityMode ? 'none' : blurFilter,
display: 'inline-flex',
alignItems: 'center',
justifyContent: 'center',
gap: 6,
}), [blurFilter, darkMode, effectiveUiScale, isOpaqueUtilityMode, utilityButtonBgColor, utilityButtonBorderColor, utilityButtonShadow]);
const utilityDropdownShellStyle = useMemo(() => ({
borderRadius: 14,
padding: 6,
background: darkMode ? 'linear-gradient(180deg, rgba(20,26,38,0.96) 0%, rgba(13,17,26,0.98) 100%)' : 'linear-gradient(180deg, rgba(255,255,255,0.98) 0%, rgba(246,248,252,0.98) 100%)',
border: darkMode ? '1px solid rgba(255,255,255,0.08)' : '1px solid rgba(16,24,40,0.08)',
boxShadow: darkMode ? '0 20px 48px rgba(0,0,0,0.32)' : '0 16px 36px rgba(15,23,42,0.12)',
backdropFilter: darkMode ? 'blur(16px)' : 'none',
overflow: 'hidden',
}), [darkMode]);
const sidebarQuickActionBaseStyle = useMemo(() => ({
height: Math.max(34, Math.round(36 * effectiveUiScale)),
borderRadius: 12,
display: 'inline-flex',
alignItems: 'center',
justifyContent: 'center',
gap: 8,
paddingInline: Math.max(12, Math.round(14 * effectiveUiScale)),
fontWeight: 700,
boxShadow: darkMode ? '0 8px 18px rgba(0,0,0,0.16)' : '0 8px 16px rgba(15,23,42,0.08)',
backdropFilter: blurFilter,
WebkitBackdropFilter: blurFilter,
minWidth: 0,
whiteSpace: 'nowrap',
}), [blurFilter, darkMode, effectiveUiScale]);
const sidebarQueryActionStyle = useMemo(() => ({
...sidebarQuickActionBaseStyle,
flex: '1 1 0',
border: `1px solid ${darkMode ? 'rgba(255,255,255,0.12)' : 'rgba(16,24,40,0.10)'}`,
background: darkMode ? `rgba(255,255,255,0.05)` : 'rgba(255,255,255,0.88)',
color: darkMode ? 'rgba(255,255,255,0.92)' : '#162033',
}), [darkMode, sidebarQuickActionBaseStyle]);
const sidebarCreateConnectionActionStyle = useMemo(() => ({
...sidebarQuickActionBaseStyle,
flex: '1 1 0',
border: 'none',
background: 'linear-gradient(135deg, rgba(255,214,102,0.96) 0%, rgba(240,183,39,0.92) 100%)',
color: '#2a1f00',
}), [sidebarQuickActionBaseStyle]);
const utilityMenuTheme = useMemo(() => ({
components: {
Menu: {
popupBg: 'transparent',
darkPopupBg: 'transparent',
itemBg: 'transparent',
darkItemBg: 'transparent',
subMenuItemBg: 'transparent',
itemColor: darkMode ? 'rgba(255,255,255,0.88)' : '#162033',
itemHoverColor: darkMode ? '#fff7d6' : '#0f172a',
itemHoverBg: darkMode ? 'rgba(255,214,102,0.10)' : 'rgba(24,144,255,0.08)',
itemSelectedColor: darkMode ? '#ffd666' : '#1677ff',
itemSelectedBg: darkMode ? 'rgba(255,214,102,0.14)' : 'rgba(24,144,255,0.12)',
itemBorderRadius: 10,
itemMarginBlock: 4,
itemMarginInline: 0,
itemPaddingInline: 12,
itemHeight: 40,
groupTitleColor: darkMode ? 'rgba(255,255,255,0.48)' : 'rgba(16,24,40,0.48)',
},
},
}), [darkMode]);
const renderUtilityDropdown = (menu: React.ReactNode) => (
<ConfigProvider theme={utilityMenuTheme}>
<div style={{ ...utilityDropdownShellStyle, minWidth: 220 }}>
{menu}
</div>
</ConfigProvider>
);
const utilityModalShellStyle = useMemo(() => ({
background: darkMode ? 'linear-gradient(180deg, rgba(20,26,38,0.96) 0%, rgba(13,17,26,0.98) 100%)' : 'linear-gradient(180deg, rgba(255,255,255,0.98) 0%, rgba(246,248,252,0.98) 100%)',
border: darkMode ? '1px solid rgba(255,255,255,0.08)' : '1px solid rgba(16,24,40,0.08)',
boxShadow: darkMode ? '0 24px 56px rgba(0,0,0,0.32)' : '0 18px 42px rgba(15,23,42,0.12)',
backdropFilter: darkMode ? 'blur(18px)' : 'none',
}), [darkMode]);
const utilityPanelStyle = useMemo(() => ({
padding: 16,
borderRadius: 14,
border: darkMode ? '1px solid rgba(255,255,255,0.08)' : '1px solid rgba(16,24,40,0.08)',
background: darkMode ? 'rgba(255,255,255,0.03)' : 'rgba(255,255,255,0.84)',
}), [darkMode]);
const utilityMutedTextStyle = useMemo(() => ({
color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)',
fontSize: 12,
lineHeight: 1.6,
}), [darkMode]);
const renderUtilityModalTitle = (icon: React.ReactNode, title: string, description: string) => (
<div style={{ display: 'flex', alignItems: 'flex-start', gap: 12 }}>
<div style={{ width: 36, height: 36, borderRadius: 12, display: 'grid', placeItems: 'center', background: darkMode ? 'rgba(255,214,102,0.12)' : 'rgba(24,144,255,0.1)', color: darkMode ? '#ffd666' : '#1677ff', flexShrink: 0 }}>
{icon}
</div>
<div style={{ minWidth: 0 }}>
<div style={{ fontSize: 16, fontWeight: 700, color: darkMode ? '#f5f7ff' : '#162033' }}>{title}</div>
<div style={{ marginTop: 4, color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)', fontSize: 12, lineHeight: 1.6 }}>{description}</div>
</div>
</div>
);
const sidebarHorizontalPadding = 10;
const addTab = useStore(state => state.addTab);
const activeContext = useStore(state => state.activeContext);
@@ -825,37 +960,18 @@ function App() {
label: '驱动管理',
icon: <SettingOutlined />,
onClick: () => setIsDriverModalOpen(true)
}
];
const themeMenu: MenuProps['items'] = [
{
key: 'light',
label: '亮色主题',
icon: themeMode === 'light' ? <CheckOutlined /> : undefined,
onClick: () => setTheme('light')
},
{
key: 'dark',
label: '暗色主题',
icon: themeMode === 'dark' ? <CheckOutlined /> : undefined,
onClick: () => setTheme('dark')
},
{ type: 'divider' },
{
key: 'settings',
label: '外观设置...',
icon: <SettingOutlined />,
onClick: () => setIsAppearanceModalOpen(true)
},
{
key: 'shortcut-settings',
label: '快捷键管理...',
label: '快捷键管理',
icon: <LinkOutlined />,
onClick: () => setIsShortcutModalOpen(true)
}
];
const [isThemeModalOpen, setIsThemeModalOpen] = useState(false);
const [themeModalSection, setThemeModalSection] = useState<'theme' | 'appearance'>('theme');
const [isAppearanceModalOpen, setIsAppearanceModalOpen] = useState(false);
const [isShortcutModalOpen, setIsShortcutModalOpen] = useState(false);
const [capturingShortcutAction, setCapturingShortcutAction] = useState<ShortcutAction | null>(null);
@@ -1229,8 +1345,6 @@ function App() {
},
components: {
Layout: {
colorBgBody: 'transparent',
colorBgHeader: 'transparent',
bodyBg: 'transparent',
headerBg: 'transparent',
siderBg: 'transparent',
@@ -1311,28 +1425,6 @@ function App() {
</div>
</div>
<div
style={{
height: toolbarHeight,
flexShrink: 0,
display: 'flex',
alignItems: 'center',
justifyContent: 'flex-start',
gap: Math.max(2, Math.round(4 * effectiveUiScale)),
padding: `0 ${Math.max(6, Math.round(8 * effectiveUiScale))}px`,
borderBottom: 'none',
background: bgMain,
}}
>
<Dropdown menu={{ items: toolsMenu }} placement="bottomLeft">
<Button type="text" icon={<ToolOutlined />} title="工具"></Button>
</Dropdown>
<Button type="text" icon={<GlobalOutlined />} title="代理" onClick={() => setIsProxyModalOpen(true)}></Button>
<Dropdown menu={{ items: themeMenu }} placement="bottomLeft">
<Button type="text" icon={<SkinOutlined />} title="主题"></Button>
</Dropdown>
<Button type="text" icon={<InfoCircleOutlined />} title="关于" onClick={() => setIsAboutOpen(true)}></Button>
</div>
<Layout style={{ flex: 1, minHeight: 0, minWidth: 0 }}>
<Sider
width={sidebarWidth}
@@ -1343,13 +1435,26 @@ function App() {
}}
>
<div style={{ height: '100%', display: 'flex', flexDirection: 'column', overflow: 'hidden' }}>
<div style={{ padding: '10px', borderBottom: 'none', display: 'flex', justifyContent: 'flex-end', alignItems: 'center', flexShrink: 0 }}>
<div>
<Button type="text" icon={<ConsoleSqlOutlined />} onClick={handleNewQuery} title="新建查询" />
<Button type="text" icon={<PlusOutlined />} onClick={() => setIsModalOpen(true)} title="新建连接" />
<div style={{ padding: `12px ${sidebarHorizontalPadding}px 8px`, borderBottom: 'none', display: 'flex', alignItems: 'center', flexShrink: 0 }}>
<div style={{ display: 'grid', gridTemplateColumns: 'repeat(4, minmax(0, 1fr))', gap: 8, width: '100%' }}>
<Dropdown menu={{ items: toolsMenu }} placement="bottomLeft" dropdownRender={renderUtilityDropdown}>
<Button type="text" icon={<ToolOutlined />} title="工具" style={utilityButtonStyle}></Button>
</Dropdown>
<Button type="text" icon={<GlobalOutlined />} title="代理" style={utilityButtonStyle} onClick={() => setIsProxyModalOpen(true)}></Button>
<Button type="text" icon={<SkinOutlined />} title="主题" style={utilityButtonStyle} onClick={() => setIsThemeModalOpen(true)}></Button>
<Button type="text" icon={<InfoCircleOutlined />} title="关于" style={utilityButtonStyle} onClick={() => setIsAboutOpen(true)}></Button>
</div>
</div>
<div style={{ padding: `0 ${sidebarHorizontalPadding}px 10px`, borderBottom: 'none', display: 'flex', alignItems: 'center', flexShrink: 0 }}>
<div style={{ display: 'grid', gridTemplateColumns: 'minmax(0, 1fr) minmax(0, 1fr)', gap: 8, width: '100%' }}>
<Button icon={<ConsoleSqlOutlined />} onClick={handleNewQuery} title="新建查询" style={sidebarQueryActionStyle}>
</Button>
<Button icon={<PlusOutlined />} onClick={() => setIsModalOpen(true)} title="新建连接" style={sidebarCreateConnectionActionStyle}>
</Button>
</div>
</div>
</div>
<div style={{ flex: 1, overflow: 'hidden', paddingBottom: 58 }}>
<Sidebar onEditConnection={handleEditConnection} />
@@ -1409,8 +1514,8 @@ function App() {
title="拖动调整宽度"
/>
</Sider>
<Content style={{ background: 'transparent', overflow: 'hidden', display: 'flex', flexDirection: 'column', minWidth: 0 }}>
<div style={{ flex: 1, minHeight: 0, minWidth: 0, overflow: 'hidden', display: 'flex', flexDirection: 'column', background: bgContent }}>
<Content style={{ background: isLogPanelOpen ? bgContent : 'transparent', overflow: 'hidden', display: 'flex', flexDirection: 'column', minWidth: 0 }}>
<div style={{ flex: 1, minHeight: 0, minWidth: 0, overflow: 'hidden', display: 'flex', flexDirection: 'column', background: bgContent, marginBottom: isLogPanelOpen ? 8 : 0, borderRadius: isLogPanelOpen ? windowCornerRadius : 0, clipPath: isLogPanelOpen ? `inset(0 round ${windowCornerRadius}px)` : 'none' }}>
<TabManager />
</div>
{isLogPanelOpen && (
@@ -1438,9 +1543,10 @@ function App() {
onOpenGlobalProxySettings={() => setIsProxyModalOpen(true)}
/>
<Modal
title="关于 GoNavi"
title={renderUtilityModalTitle(<InfoCircleOutlined />, '关于 GoNavi', '查看版本信息、仓库地址、更新状态与下载入口。')}
open={isAboutOpen}
onCancel={() => setIsAboutOpen(false)}
styles={{ content: utilityModalShellStyle, header: { background: 'transparent', borderBottom: 'none', paddingBottom: 8 }, body: { paddingTop: 8 }, footer: { background: 'transparent', borderTop: 'none', paddingTop: 10 } }}
footer={[
canShowProgressEntry ? (
<Button key="progress" icon={<DownloadOutlined />} onClick={showUpdateDownloadProgress}></Button>
@@ -1460,150 +1566,274 @@ function App() {
<Spin />
</div>
) : (
<div style={{ display: 'flex', flexDirection: 'column', gap: 8 }}>
<div>{aboutInfo?.version || '未知'}</div>
<div>{aboutInfo?.author || '未知'}</div>
{aboutInfo?.communityUrl ? (
<div><a onClick={(e) => { e.preventDefault(); if (aboutInfo?.communityUrl) BrowserOpenURL(aboutInfo.communityUrl); }} href={aboutInfo.communityUrl}>AI全书</a></div>
) : null}
<div>{aboutUpdateStatus || '未检查'}</div>
<div style={{ display: 'flex', alignItems: 'center', gap: 6 }}>
<GithubOutlined />
{aboutInfo?.repoUrl ? (
<a onClick={(e) => { e.preventDefault(); if (aboutInfo?.repoUrl) BrowserOpenURL(aboutInfo.repoUrl); }} href={aboutInfo.repoUrl}>
{aboutInfo.repoUrl}
</a>
) : '未知'}
<div style={{ display: 'flex', flexDirection: 'column', gap: 16 }}>
<div style={utilityPanelStyle}>
<div style={{ display: 'grid', gridTemplateColumns: 'repeat(2, minmax(0, 1fr))', gap: 12 }}>
<div>
<div style={{ marginBottom: 6, fontWeight: 600 }}></div>
<div style={utilityMutedTextStyle}>{aboutInfo?.version || '未知'}</div>
</div>
<div>
<div style={{ marginBottom: 6, fontWeight: 600 }}></div>
<div style={utilityMutedTextStyle}>{aboutInfo?.author || '未知'}</div>
</div>
<div style={{ gridColumn: '1 / -1' }}>
<div style={{ marginBottom: 6, fontWeight: 600 }}></div>
<div style={utilityMutedTextStyle}>{aboutUpdateStatus || '未检查'}</div>
</div>
{aboutInfo?.communityUrl ? (
<div style={{ gridColumn: '1 / -1' }}>
<div style={{ marginBottom: 6, fontWeight: 600 }}></div>
<a onClick={(e) => { e.preventDefault(); if (aboutInfo?.communityUrl) BrowserOpenURL(aboutInfo.communityUrl); }} href={aboutInfo.communityUrl}>AI全书</a>
</div>
) : null}
</div>
</div>
<div style={utilityPanelStyle}>
<div style={{ marginBottom: 10, fontWeight: 600 }}></div>
<div style={{ display: 'grid', gap: 10 }}>
<div style={{ display: 'flex', alignItems: 'center', gap: 8 }}>
<GithubOutlined />
{aboutInfo?.repoUrl ? (
<a onClick={(e) => { e.preventDefault(); if (aboutInfo?.repoUrl) BrowserOpenURL(aboutInfo.repoUrl); }} href={aboutInfo.repoUrl}>{aboutInfo.repoUrl}</a>
) : '未知'}
</div>
<div style={{ display: 'flex', alignItems: 'center', gap: 8 }}>
<BugOutlined />
{aboutInfo?.issueUrl ? (
<a onClick={(e) => { e.preventDefault(); if (aboutInfo?.issueUrl) BrowserOpenURL(aboutInfo.issueUrl); }} href={aboutInfo.issueUrl}>{aboutInfo.issueUrl}</a>
) : '未知'}
</div>
<div style={{ display: 'flex', alignItems: 'center', gap: 8 }}>
<CloudDownloadOutlined />
{aboutInfo?.releaseUrl ? (
<a onClick={(e) => { e.preventDefault(); if (aboutInfo?.releaseUrl) BrowserOpenURL(aboutInfo.releaseUrl); }} href={aboutInfo.releaseUrl}>{aboutInfo.releaseUrl}</a>
) : '未知'}
</div>
</div>
</div>
</div>
<div style={{ display: 'flex', alignItems: 'center', gap: 6 }}>
<BugOutlined />
{aboutInfo?.issueUrl ? (
<a onClick={(e) => { e.preventDefault(); if (aboutInfo?.issueUrl) BrowserOpenURL(aboutInfo.issueUrl); }} href={aboutInfo.issueUrl}>
{aboutInfo.issueUrl}
</a>
) : '未知'}
</div>
<div style={{ display: 'flex', alignItems: 'center', gap: 6 }}>
<CloudDownloadOutlined />
{aboutInfo?.releaseUrl ? (
<a onClick={(e) => { e.preventDefault(); if (aboutInfo?.releaseUrl) BrowserOpenURL(aboutInfo.releaseUrl); }} href={aboutInfo.releaseUrl}>
{aboutInfo.releaseUrl}
</a>
) : '未知'}
</div>
</div>
)}
</Modal>
<Modal
title="外观设置"
open={isAppearanceModalOpen}
onCancel={() => setIsAppearanceModalOpen(false)}
title={renderUtilityModalTitle(
themeModalSection === 'theme' ? <SkinOutlined /> : <BgColorsOutlined />,
themeModalSection === 'theme' ? '主题设置' : '外观设置',
themeModalSection === 'theme'
? '切换亮暗主题,保持整体视觉风格统一。'
: '统一调整缩放、字体、透明度与模糊效果。'
)}
open={isThemeModalOpen}
onCancel={() => { setIsThemeModalOpen(false); setThemeModalSection('theme'); }}
footer={null}
width={460}
width={820}
styles={{ content: utilityModalShellStyle, header: { background: 'transparent', borderBottom: 'none', paddingBottom: 8 }, body: { paddingTop: 8, height: 620, overflow: 'hidden' }, footer: { background: 'transparent', borderTop: 'none', paddingTop: 10 } }}
>
<div style={{ display: 'flex', flexDirection: 'column', gap: 24, padding: '12px 0' }}>
<div>
<div style={{ marginBottom: 8, fontWeight: 500 }}> (UI Scale)</div>
<div style={{ display: 'flex', alignItems: 'center', gap: 16 }}>
<Slider
min={MIN_UI_SCALE}
max={MAX_UI_SCALE}
step={0.05}
value={effectiveUiScale}
onChange={(v) => setUiScale(Number(v))}
style={{ flex: 1 }}
/>
<span style={{ width: 56 }}>{Math.round(effectiveUiScale * 100)}%</span>
</div>
<div style={{ fontSize: 12, color: '#888', marginTop: 4 }}>
* 85%-95%
<div style={{ display: 'grid', gridTemplateColumns: '180px minmax(0, 1fr)', gap: 16, padding: '12px 0', height: '100%', minHeight: 0, overflow: 'hidden', alignItems: 'stretch' }}>
<div style={{ ...utilityPanelStyle, padding: 12, height: 'fit-content' }}>
<div style={{ marginBottom: 12, fontWeight: 600 }}></div>
<div style={{ display: 'grid', gap: 10 }}>
{[
{ key: 'theme', title: '主题模式', description: '亮色与暗色切换', icon: <SkinOutlined /> },
{ key: 'appearance', title: '外观参数', description: '缩放、字体与透明度', icon: <BgColorsOutlined /> },
].map((item) => {
const active = themeModalSection === item.key;
return (
<button
key={item.key}
type="button"
onClick={() => setThemeModalSection(item.key as 'theme' | 'appearance')}
style={{
textAlign: 'left',
padding: '12px 12px',
borderRadius: 12,
border: `1px solid ${active
? (darkMode ? 'rgba(255,214,102,0.3)' : 'rgba(24,144,255,0.24)')
: (darkMode ? 'rgba(255,255,255,0.08)' : 'rgba(16,24,40,0.08)')}`,
background: active
? (darkMode ? 'linear-gradient(180deg, rgba(255,214,102,0.12) 0%, rgba(255,214,102,0.06) 100%)' : 'linear-gradient(180deg, rgba(24,144,255,0.10) 0%, rgba(24,144,255,0.05) 100%)')
: (darkMode ? 'rgba(255,255,255,0.02)' : 'rgba(255,255,255,0.72)'),
color: active ? (darkMode ? '#f5f7ff' : '#162033') : (darkMode ? 'rgba(255,255,255,0.82)' : '#3f4b5e'),
cursor: 'pointer',
}}
>
<div style={{ display: 'flex', alignItems: 'center', gap: 10 }}>
<span>{item.icon}</span>
<span style={{ fontWeight: 700 }}>{item.title}</span>
</div>
<div style={{ marginTop: 6, fontSize: 12, lineHeight: 1.6, color: active ? (darkMode ? 'rgba(255,255,255,0.68)' : 'rgba(22,32,51,0.68)') : utilityMutedTextStyle.color }}>
{item.description}
</div>
</button>
);
})}
</div>
</div>
<div>
<div style={{ marginBottom: 8, fontWeight: 500 }}> (Font Size)</div>
<div style={{ display: 'flex', alignItems: 'center', gap: 16 }}>
<Slider
min={MIN_FONT_SIZE}
max={MAX_FONT_SIZE}
step={1}
value={effectiveFontSize}
onChange={(v) => setFontSize(Number(v))}
style={{ flex: 1 }}
/>
<span style={{ width: 56 }}>{effectiveFontSize}px</span>
</div>
</div>
<div>
<div style={{ marginBottom: 8, fontWeight: 500 }}> (Opacity)</div>
<div style={{ display: 'flex', alignItems: 'center', gap: 16 }}>
<Slider
min={0.1}
max={1.0}
step={0.05}
value={appearance.opacity ?? 1.0}
onChange={(v) => setAppearance({ opacity: v })}
style={{ flex: 1 }}
/>
<span style={{ width: 40 }}>{Math.round((appearance.opacity ?? 1.0) * 100)}%</span>
</div>
</div>
<div>
<div style={{ marginBottom: 8, fontWeight: 500 }}> (Blur)</div>
{isWindowsPlatform() ? (
<div style={{ fontSize: 12, color: '#888' }}>
Windows 使 Acrylic
<div style={{ minWidth: 0, minHeight: 0, height: '100%', overflowY: 'auto', overflowX: 'hidden', paddingRight: 8, paddingBottom: 28 }}>
{themeModalSection === 'theme' ? (
<div style={{ display: 'flex', flexDirection: 'column', gap: 16 }}>
<div style={utilityPanelStyle}>
<div style={{ marginBottom: 10, fontWeight: 600 }}></div>
<div style={{ display: 'grid', gridTemplateColumns: 'repeat(2, minmax(0, 1fr))', gap: 12 }}>
{[
{ key: 'light', label: '亮色主题', description: '适合明亮环境,层次更轻。' },
{ key: 'dark', label: '暗色主题', description: '适合低光环境,视觉更沉稳。' },
].map((item) => {
const active = themeMode === item.key;
return (
<button
key={item.key}
type="button"
onClick={() => setTheme(item.key as 'light' | 'dark')}
style={{
textAlign: 'left',
padding: '14px 14px',
borderRadius: 14,
border: `1px solid ${active
? (darkMode ? 'rgba(255,214,102,0.3)' : 'rgba(24,144,255,0.24)')
: (darkMode ? 'rgba(255,255,255,0.08)' : 'rgba(16,24,40,0.08)')}`,
background: active
? (darkMode ? 'linear-gradient(180deg, rgba(255,214,102,0.12) 0%, rgba(255,214,102,0.06) 100%)' : 'linear-gradient(180deg, rgba(24,144,255,0.10) 0%, rgba(24,144,255,0.05) 100%)')
: (darkMode ? 'rgba(255,255,255,0.02)' : 'rgba(255,255,255,0.72)'),
color: active ? (darkMode ? '#f5f7ff' : '#162033') : (darkMode ? 'rgba(255,255,255,0.82)' : '#3f4b5e'),
cursor: 'pointer',
}}
>
<div style={{ display: 'flex', alignItems: 'center', justifyContent: 'space-between', gap: 8 }}>
<span style={{ fontSize: 14, fontWeight: 700 }}>{item.label}</span>
{active ? <CheckOutlined style={{ color: darkMode ? '#ffd666' : '#1677ff' }} /> : null}
</div>
<div style={{ marginTop: 6, fontSize: 12, lineHeight: 1.6, color: active ? (darkMode ? 'rgba(255,255,255,0.68)' : 'rgba(22,32,51,0.68)') : utilityMutedTextStyle.color }}>
{item.description}
</div>
</button>
);
})}
</div>
</div>
</div>
) : (
<>
<div style={{ display: 'flex', alignItems: 'center', gap: 16 }}>
<Slider
min={0}
max={20}
value={appearance.blur ?? 0}
onChange={(v) => setAppearance({ blur: v })}
style={{ flex: 1 }}
/>
<span style={{ width: 40 }}>{appearance.blur}px</span>
<div style={{ display: 'flex', flexDirection: 'column', gap: 16 }}>
<div style={utilityPanelStyle}>
<div style={{ marginBottom: 8, fontWeight: 500 }}> (UI Scale)</div>
<div style={{ display: 'flex', alignItems: 'center', gap: 16 }}>
<Slider
min={MIN_UI_SCALE}
max={MAX_UI_SCALE}
step={0.05}
value={effectiveUiScale}
onChange={(v) => setUiScale(Number(v))}
style={{ flex: 1 }}
/>
<span style={{ width: 56 }}>{Math.round(effectiveUiScale * 100)}%</span>
</div>
<div style={{ fontSize: 12, color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)', marginTop: 4 }}>
* 85%-95%
</div>
</div>
<div style={{ fontSize: 12, color: '#888', marginTop: 4 }}>
*
<div style={utilityPanelStyle}>
<div style={{ marginBottom: 8, fontWeight: 500 }}> (Font Size)</div>
<div style={{ display: 'flex', alignItems: 'center', gap: 16 }}>
<Slider
min={MIN_FONT_SIZE}
max={MAX_FONT_SIZE}
step={1}
value={effectiveFontSize}
onChange={(v) => setFontSize(Number(v))}
style={{ flex: 1 }}
/>
<span style={{ width: 56 }}>{effectiveFontSize}px</span>
</div>
</div>
</>
<div style={utilityPanelStyle}>
<div style={{ marginBottom: 10, fontWeight: 500 }}></div>
<div style={{ display: 'flex', alignItems: 'center', justifyContent: 'space-between', gap: 12, marginBottom: 12 }}>
<div>
<div style={{ fontWeight: 500 }}></div>
<div style={{ ...utilityMutedTextStyle, marginTop: 4 }}></div>
</div>
<Switch checked={appearance.enabled !== false} onChange={(checked) => setAppearance({ enabled: checked })} />
</div>
<div style={{ display: 'grid', gap: 14, opacity: appearance.enabled !== false ? 1 : 0.6 }}>
<div>
<div style={{ marginBottom: 8, fontWeight: 500 }}> (Opacity)</div>
<div style={{ display: 'flex', alignItems: 'center', gap: 16 }}>
<Slider
min={0.1}
max={1.0}
step={0.05}
disabled={appearance.enabled === false}
value={appearance.opacity ?? 1.0}
onChange={(v) => setAppearance({ opacity: v })}
style={{ flex: 1 }}
/>
<span style={{ width: 40 }}>{Math.round((appearance.opacity ?? 1.0) * 100)}%</span>
</div>
</div>
<div>
<div style={{ marginBottom: 8, fontWeight: 500 }}> (Blur)</div>
{isWindowsPlatform() ? (
<div style={{ fontSize: 12, color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)' }}>
Windows 使 Acrylic
</div>
) : (
<>
<div style={{ display: 'flex', alignItems: 'center', gap: 16 }}>
<Slider
min={0}
max={20}
disabled={appearance.enabled === false}
value={appearance.blur ?? 0}
onChange={(v) => setAppearance({ blur: v })}
style={{ flex: 1 }}
/>
<span style={{ width: 40 }}>{appearance.blur ?? 0}px</span>
</div>
<div style={{ fontSize: 12, color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)', marginTop: 4 }}>
*
</div>
</>
)}
</div>
</div>
</div>
<div style={utilityPanelStyle}>
<div style={{ marginBottom: 8, fontWeight: 500 }}></div>
<div style={{ display: 'flex', alignItems: 'center', justifyContent: 'space-between', gap: 12 }}>
<span></span>
<Switch checked={startupFullscreen} onChange={(checked) => setStartupFullscreen(checked)} />
</div>
<div style={{ fontSize: 12, color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)', marginTop: 4 }}>
*
</div>
</div>
<div style={{ display: 'flex', justifyContent: 'flex-end', alignItems: 'center', gap: 12, paddingTop: 8, paddingBottom: 12 }}>
<Button
onClick={() => {
setUiScale(DEFAULT_UI_SCALE);
setFontSize(DEFAULT_FONT_SIZE);
setAppearance({ enabled: true, opacity: 1.0, blur: 0 });
}}
>
</Button>
</div>
</div>
)}
</div>
<div>
<div style={{ marginBottom: 8, fontWeight: 500 }}></div>
<div style={{ display: 'flex', alignItems: 'center', justifyContent: 'space-between', gap: 12 }}>
<span></span>
<Switch checked={startupFullscreen} onChange={(checked) => setStartupFullscreen(checked)} />
</div>
<div style={{ fontSize: 12, color: '#888', marginTop: 4 }}>
*
</div>
</div>
<div style={{ display: 'flex', justifyContent: 'flex-end' }}>
<Button
onClick={() => {
setUiScale(DEFAULT_UI_SCALE);
setFontSize(DEFAULT_FONT_SIZE);
setAppearance({ opacity: 1.0, blur: 0 });
}}
>
</Button>
</div>
</div>
</Modal>
<Modal
title="快捷键管理"
title={renderUtilityModalTitle(<LinkOutlined />, '快捷键管理', '统一查看、录制与启停常用快捷键,保持操作习惯一致。')}
open={isShortcutModalOpen}
onCancel={() => {
setIsShortcutModalOpen(false);
setCapturingShortcutAction(null);
}}
width={720}
width={760}
styles={{ content: utilityModalShellStyle, header: { background: 'transparent', borderBottom: 'none', paddingBottom: 8 }, body: { paddingTop: 8 }, footer: { background: 'transparent', borderTop: 'none', paddingTop: 10 } }}
footer={[
<Button
key="reset"
@@ -1627,9 +1857,11 @@ function App() {
</Button>,
]}
>
<div style={{ display: 'flex', flexDirection: 'column', gap: 12, paddingTop: 8 }}>
<div style={{ fontSize: 12, color: '#8c8c8c' }}>
Esc Ctrl/Alt/Shift/Meta
<div style={{ display: 'flex', flexDirection: 'column', gap: 16, paddingTop: 8 }}>
<div style={utilityPanelStyle}>
<div style={{ fontSize: 12, color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)' }}>
Esc Ctrl/Alt/Shift/Meta
</div>
</div>
{SHORTCUT_ACTION_ORDER.map((action) => {
const meta = SHORTCUT_ACTION_META[action];
@@ -1639,18 +1871,17 @@ function App() {
<div
key={action}
style={{
...utilityPanelStyle,
display: 'grid',
gridTemplateColumns: '1fr auto',
gap: 12,
alignItems: 'center',
padding: '10px 12px',
border: '1px solid rgba(128, 128, 128, 0.2)',
borderRadius: 8,
}}
>
<div>
<div style={{ fontWeight: 500 }}>{meta.label}</div>
<div style={{ fontSize: 12, color: '#8c8c8c' }}>{meta.description}</div>
<div style={{ fontSize: 12, color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)' }}>{meta.description}</div>
</div>
<div style={{ display: 'flex', alignItems: 'center', gap: 8 }}>
<Input
@@ -1675,14 +1906,15 @@ function App() {
</div>
</Modal>
<Modal
title="全局代理设置"
title={renderUtilityModalTitle(<GlobalOutlined />, '全局代理设置', '统一配置更新检查、驱动管理与未单独指定代理的连接网络出口。')}
open={isProxyModalOpen}
onCancel={() => setIsProxyModalOpen(false)}
footer={null}
width={460}
width={520}
styles={{ content: utilityModalShellStyle, header: { background: 'transparent', borderBottom: 'none', paddingBottom: 8 }, body: { paddingTop: 8 }, footer: { background: 'transparent', borderTop: 'none', paddingTop: 10 } }}
>
<div style={{ display: 'flex', flexDirection: 'column', gap: 24, padding: '12px 0' }}>
<div>
<div style={{ display: 'flex', flexDirection: 'column', gap: 16, padding: '12px 0' }}>
<div style={utilityPanelStyle}>
<div style={{ marginBottom: 8, fontWeight: 500 }}></div>
<div style={{ display: 'flex', alignItems: 'center', justifyContent: 'space-between', gap: 12 }}>
<span></span>
@@ -1690,7 +1922,7 @@ function App() {
</div>
<div style={{ marginTop: 12, display: 'grid', gridTemplateColumns: '1fr 1fr', gap: 12, opacity: globalProxy.enabled ? 1 : 0.7 }}>
<div>
<div style={{ marginBottom: 6, fontSize: 12, color: '#8c8c8c' }}></div>
<div style={{ marginBottom: 6, fontSize: 12, color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)' }}></div>
<Select
value={globalProxy.type}
disabled={!globalProxy.enabled}
@@ -1702,7 +1934,7 @@ function App() {
/>
</div>
<div>
<div style={{ marginBottom: 6, fontSize: 12, color: '#8c8c8c' }}></div>
<div style={{ marginBottom: 6, fontSize: 12, color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)' }}></div>
<InputNumber
min={1}
max={65535}
@@ -1715,7 +1947,7 @@ function App() {
/>
</div>
<div style={{ gridColumn: '1 / span 2' }}>
<div style={{ marginBottom: 6, fontSize: 12, color: '#8c8c8c' }}></div>
<div style={{ marginBottom: 6, fontSize: 12, color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)' }}></div>
<Input
placeholder="例如127.0.0.1"
value={globalProxy.host}
@@ -1724,7 +1956,7 @@ function App() {
/>
</div>
<div>
<div style={{ marginBottom: 6, fontSize: 12, color: '#8c8c8c' }}></div>
<div style={{ marginBottom: 6, fontSize: 12, color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)' }}></div>
<Input
placeholder="proxy-user"
value={globalProxy.user}
@@ -1733,7 +1965,7 @@ function App() {
/>
</div>
<div>
<div style={{ marginBottom: 6, fontSize: 12, color: '#8c8c8c' }}></div>
<div style={{ marginBottom: 6, fontSize: 12, color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)' }}></div>
<Input.Password
placeholder="proxy-password"
value={globalProxy.password}
@@ -1742,7 +1974,7 @@ function App() {
/>
</div>
</div>
<div style={{ fontSize: 12, color: '#888', marginTop: 6 }}>
<div style={{ fontSize: 12, color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)', marginTop: 6 }}>
*
</div>
</div>
@@ -1777,7 +2009,7 @@ function App() {
percent={Math.round(updateDownloadProgress.percent)}
status={updateDownloadProgress.status === 'error' ? 'exception' : (updateDownloadProgress.status === 'done' ? 'success' : 'active')}
/>
<div style={{ fontSize: 12, color: '#8c8c8c' }}>
<div style={{ fontSize: 12, color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)' }}>
{`${formatBytes(updateDownloadProgress.downloaded)} / ${formatBytes(updateDownloadProgress.total)}`}
</div>
{updateDownloadProgress.message ? (


@@ -2,7 +2,7 @@ import React, { useState, useEffect, useRef, useContext, useMemo, useCallback }
import { createPortal } from 'react-dom';
import { Table, message, Input, Button, Dropdown, MenuProps, Form, Pagination, Select, Modal, Checkbox, Segmented, Tooltip, Popover } from 'antd';
import type { SortOrder } from 'antd/es/table/interface';
import { ReloadOutlined, ImportOutlined, ExportOutlined, DownOutlined, PlusOutlined, DeleteOutlined, SaveOutlined, UndoOutlined, FilterOutlined, CloseOutlined, ConsoleSqlOutlined, FileTextOutlined, CopyOutlined, ClearOutlined, EditOutlined, VerticalAlignBottomOutlined } from '@ant-design/icons';
import { ReloadOutlined, ImportOutlined, ExportOutlined, DownOutlined, PlusOutlined, DeleteOutlined, SaveOutlined, UndoOutlined, FilterOutlined, CloseOutlined, ConsoleSqlOutlined, FileTextOutlined, CopyOutlined, ClearOutlined, EditOutlined, VerticalAlignBottomOutlined, LeftOutlined, RightOutlined } from '@ant-design/icons';
import Editor from '@monaco-editor/react';
import { ImportData, ExportTable, ExportData, ExportQuery, ApplyChanges, DBGetColumns } from '../../wailsjs/go/app/App';
import ImportPreviewModal from './ImportPreviewModal';
@@ -10,8 +10,8 @@ import { useStore } from '../store';
import type { ColumnDefinition } from '../types';
import { v4 as uuidv4 } from 'uuid';
import 'react-resizable/css/styles.css';
import { buildOrderBySQL, buildWhereSQL, escapeLiteral, quoteIdentPart, quoteQualifiedIdent, withSortBufferTuningSQL, type FilterCondition } from '../utils/sql';
import { isMacLikePlatform, normalizeOpacityForPlatform } from '../utils/appearance';
import { buildOrderBySQL, buildPaginatedSelectSQL, buildWhereSQL, escapeLiteral, quoteIdentPart, quoteQualifiedIdent, withSortBufferTuningSQL, type FilterCondition } from '../utils/sql';
import { isMacLikePlatform, normalizeOpacityForPlatform, resolveAppearanceValues } from '../utils/appearance';
import { getDataSourceCapabilities } from '../utils/dataSourceCapabilities';
// --- Error Boundary ---
@@ -642,7 +642,8 @@ const DataGrid: React.FC<DataGridProps> = ({
const setQueryOptions = useStore(state => state.setQueryOptions);
const isMacLike = useMemo(() => isMacLikePlatform(), []);
const darkMode = theme === 'dark';
const opacity = normalizeOpacityForPlatform(appearance.opacity);
const resolvedAppearance = resolveAppearanceValues(appearance);
const opacity = normalizeOpacityForPlatform(resolvedAppearance.opacity);
const canModifyData = !readOnly && !!tableName;
const showColumnComment = queryOptions?.showColumnComment !== false;
const showColumnType = queryOptions?.showColumnType !== false;
@@ -709,6 +710,33 @@ const DataGrid: React.FC<DataGridProps> = ({
const toolbarDividerColor = darkMode ? 'rgba(255, 255, 255, 0.12)' : 'rgba(0, 0, 0, 0.10)';
const columnMetaHintColor = darkMode ? darkHighlightTextColor : lightMetaHintColor;
const columnMetaTooltipColor = darkMode ? darkHighlightTextColor : lightMetaTooltipColor;
const paginationPageSizeOptions = ['100', '200', '500', '1000'];
const paginationGlassMode = opacity < 0.999 || resolvedAppearance.blur > 0;
const paginationShellBg = darkMode
? `linear-gradient(135deg, rgba(17,22,34,${paginationGlassMode ? Math.max(0.22, opacity * 0.38) : 0.82}) 0%, rgba(10,14,24,${paginationGlassMode ? Math.max(0.28, opacity * 0.46) : 0.9}) 100%)`
: `linear-gradient(135deg, rgba(255,255,255,${paginationGlassMode ? Math.max(0.24, opacity * 0.36) : 0.96}) 0%, rgba(246,248,252,${paginationGlassMode ? Math.max(0.32, opacity * 0.44) : 0.99}) 100%)`;
const paginationShellBorderColor = darkMode
? `rgba(255,255,255,${paginationGlassMode ? 0.10 : 0.08})`
: `rgba(16,24,40,${paginationGlassMode ? 0.08 : 0.08})`;
const paginationShellShadow = darkMode
? `0 16px 34px rgba(0,0,0,${paginationGlassMode ? 0.10 : 0.22})`
: `0 14px 30px rgba(15,23,42,${paginationGlassMode ? 0.03 : 0.08})`;
const paginationChipBg = darkMode
? `rgba(255,255,255,${paginationGlassMode ? Math.max(0.02, opacity * 0.035) : 0.04})`
: `rgba(255,255,255,${paginationGlassMode ? Math.max(0.18, opacity * 0.26) : 0.86})`;
const paginationChipBorderColor = darkMode
? `rgba(255,255,255,${paginationGlassMode ? 0.10 : 0.08})`
: `rgba(16,24,40,${paginationGlassMode ? 0.10 : 0.08})`;
const paginationHoverBg = darkMode
? `rgba(255,255,255,${paginationGlassMode ? Math.max(0.04, opacity * 0.06) : 0.07})`
: `rgba(255,255,255,${paginationGlassMode ? Math.max(0.24, opacity * 0.34) : 0.96})`;
const paginationPrimaryTextColor = darkMode ? '#f5f7ff' : '#162033';
const paginationSecondaryTextColor = darkMode ? 'rgba(255,255,255,0.54)' : 'rgba(16,24,40,0.56)';
const paginationAccentBg = darkMode ? 'rgba(255,214,102,0.14)' : 'rgba(24,144,255,0.10)';
const paginationAccentBorderColor = darkMode ? 'rgba(255,214,102,0.38)' : 'rgba(24,144,255,0.22)';
const paginationActiveItemBg = darkMode ? 'rgba(255,214,102,0.18)' : 'rgba(24,144,255,0.12)';
const paginationActiveItemBorderColor = darkMode ? 'rgba(255,214,102,0.46)' : 'rgba(24,144,255,0.28)';
const paginationActiveItemTextColor = darkMode ? '#fff7d6' : '#0958d9';
const [form] = Form.useForm();
const [modal, contextHolder] = Modal.useModal();
@@ -1174,11 +1202,11 @@ const DataGrid: React.FC<DataGridProps> = ({
// Update the selection highlight by manipulating the DOM directly, avoiding React re-renders
const updateCellSelection = useCallback((newSelection: Set<string>) => {
const tableBody = containerRef.current?.querySelector('.ant-table-body');
if (!tableBody) return;
const container = containerRef.current;
if (!container) return;
// Only sync visible cells (compatible with virtual rendering + very large selections)
const visibleCells = tableBody.querySelectorAll('td[data-row-key][data-col-name]');
// Only sync visible cells, strictly scoped to `.ant-table-cell`, so EditableCell elements nested inside the virtual list are not matched a second time and given the selected style, which would produce white fringes.
const visibleCells = container.querySelectorAll('.ant-table-cell[data-row-key][data-col-name]');
visibleCells.forEach((cell) => {
const el = cell as HTMLElement;
const rowKey = el.getAttribute('data-row-key');
@@ -1306,10 +1334,10 @@ const DataGrid: React.FC<DataGridProps> = ({
const getCellInfo = (target: HTMLElement | null): { rowKey: string; colName: string } | null => {
if (!target) return null;
const td = target.closest('td[data-row-key][data-col-name]') as HTMLElement;
if (!td) return null;
const rowKey = td.getAttribute('data-row-key');
const colName = td.getAttribute('data-col-name');
const cell = target.closest('[data-row-key][data-col-name]') as HTMLElement;
if (!cell) return null;
const rowKey = cell.getAttribute('data-row-key');
const colName = cell.getAttribute('data-col-name');
if (!rowKey || !colName) return null;
return { rowKey, colName };
};
@@ -2077,16 +2105,9 @@ const DataGrid: React.FC<DataGridProps> = ({
closeRowEditor();
}, [rowEditorRowKey, rowEditorForm, addedRows, columnNames, rowKeyStr, closeRowEditor]);
const estimatedVisibleCellCount = mergedDisplayData.length * Math.max(columnNames.length, 1);
const enableLargeResultOptimizedEditing =
viewMode === 'table' && (
mergedDisplayData.length >= 60 ||
estimatedVisibleCellCount >= 1600 ||
columnNames.length >= 36 ||
(isMacLike && columnNames.length >= 24)
);
const enableVirtual = enableLargeResultOptimizedEditing;
const enableInlineEditableCell = canModifyData && !enableLargeResultOptimizedEditing;
const enableVirtual = viewMode === 'table';
const enableInlineEditableCell = canModifyData;
const columns = useMemo(() => {
return columnNames.map(key => ({
@@ -2141,22 +2162,28 @@ const DataGrid: React.FC<DataGridProps> = ({
return {
...col,
onCell: (record: Item) => {
if (!enableInlineEditableCell) {
const rowKey = record?.[GONAVI_ROW_KEY];
return {
'data-row-key': rowKey === undefined || rowKey === null ? undefined : String(rowKey),
'data-col-name': dataIndex,
onDoubleClick: () => handleVirtualCellActivate(record, dataIndex, dataIndex),
};
}
return {
record,
editable: col.editable,
dataIndex: col.dataIndex,
title: dataIndex,
handleSave: handleCellSave,
focusCell: openCellEditor,
const rowKey = record?.[GONAVI_ROW_KEY];
const cellProps: any = {
'data-row-key': rowKey === undefined || rowKey === null ? undefined : String(rowKey),
'data-col-name': dataIndex,
};
if (!enableInlineEditableCell) {
cellProps.onDoubleClick = () => handleVirtualCellActivate(record, dataIndex, dataIndex);
cellProps.onContextMenu = (e: React.MouseEvent) => {
e.preventDefault();
e.stopPropagation();
showCellContextMenu(e, record, dataIndex, dataIndex);
};
} else {
cellProps.record = record;
cellProps.editable = col.editable;
cellProps.dataIndex = col.dataIndex;
cellProps.title = dataIndex;
cellProps.handleSave = handleCellSave;
cellProps.focusCell = openCellEditor;
}
return cellProps;
},
render: (text: any, record: Item, index: number) => {
const originalRenderContent = col.render ? (col.render as any)(text, record, index) : text;
@@ -2176,10 +2203,24 @@ const DataGrid: React.FC<DataGridProps> = ({
</EditableCell>
);
}
if (enableVirtual) {
return (
<div
style={{ margin: -8, padding: '8px 8px 8px 8px' }}
onContextMenu={(e) => {
e.preventDefault();
e.stopPropagation();
showCellContextMenu(e, record, dataIndex, dataIndex);
}}
>
{originalRenderContent}
</div>
);
}
return originalRenderContent;
}
};
}), [columns, enableInlineEditableCell, enableVirtual, handleCellSave, openCellEditor, handleVirtualCellActivate]);
}), [columns, enableInlineEditableCell, enableVirtual, handleCellSave, openCellEditor, handleVirtualCellActivate, showCellContextMenu]);
const handleAddRow = () => {
const newKey = `new-${Date.now()}`;
@@ -2419,18 +2460,18 @@ const DataGrid: React.FC<DataGridProps> = ({
return clauses.join(' OR ');
}, [pkColumns, tableName]);
const buildCurrentPageSql = useCallback((dbType: string) => {
if (!tableName || !pagination) return '';
const whereSQL = buildWhereSQL(dbType, filterConditions);
let sql = `SELECT * FROM ${quoteQualifiedIdent(dbType, tableName)} ${whereSQL}`;
sql += buildOrderBySQL(dbType, sortInfo, pkColumns);
const baseSql = `SELECT * FROM ${quoteQualifiedIdent(dbType, tableName)} ${whereSQL}`;
const orderBySQL = buildOrderBySQL(dbType, sortInfo, pkColumns);
const normalizedType = String(dbType || '').trim().toLowerCase();
const hasExplicitSort = !!sortInfo?.columnKey && (sortInfo?.order === 'ascend' || sortInfo?.order === 'descend');
const offset = (pagination.current - 1) * pagination.pageSize;
let sql = buildPaginatedSelectSQL(dbType, baseSql, orderBySQL, pagination.pageSize, offset);
if (hasExplicitSort && (normalizedType === 'mysql' || normalizedType === 'mariadb')) {
sql = withSortBufferTuningSQL(normalizedType, sql, 32 * 1024 * 1024);
}
const offset = (pagination.current - 1) * pagination.pageSize;
sql += ` LIMIT ${pagination.pageSize} OFFSET ${offset}`;
return sql;
}, [tableName, pagination, filterConditions, sortInfo, pkColumns]);
@@ -2718,7 +2759,7 @@ const DataGrid: React.FC<DataGridProps> = ({
handleExportSelected,
copyToClipboard,
tableName,
enableRowContextMenu: !canModifyData,
enableRowContextMenu: false,
supportsCopyInsert,
}), [handleCopyCsv, handleCopyInsert, handleCopyJson, handleExportSelected, copyToClipboard, tableName, canModifyData, supportsCopyInsert]);
@@ -2736,7 +2777,7 @@ const DataGrid: React.FC<DataGridProps> = ({
const rowPropsFactory = useCallback((record: any) => ({ record } as any), []);
const totalWidth = columns.reduce((sum, col) => sum + (Number(col.width) || 200), 0) + selectionColumnWidth;
const useContextMenuRow = !canModifyData;
const useContextMenuRow = false;
const tableScrollX = useMemo(() => {
const baseWidth = Math.max(totalWidth, 1000);
if (!isMacLike || tableViewportWidth <= 0) return baseWidth;
@@ -3132,6 +3173,49 @@ const DataGrid: React.FC<DataGridProps> = ({
};
}, [viewMode, tableScrollX, mergedDisplayData.length, syncExternalScrollFromTargets, pickHorizontalScrollTargets]);
const paginationSummaryText = useMemo(() => {
if (!pagination) return '';
const total = Number.isFinite(pagination.total) ? pagination.total : 0;
const rangeStart = Math.max(0, (pagination.current - 1) * pagination.pageSize + (total > 0 ? 1 : 0));
const hasValidRange = total > 0 && rangeStart > 0;
const rangeEnd = hasValidRange ? Math.min(total, rangeStart + pagination.pageSize - 1) : 0;
const currentCount = hasValidRange ? Math.max(0, rangeEnd - rangeStart + 1) : 0;
if (pagination.totalKnown === false) {
if (isDuckDBConnection) {
if (pagination.totalCountLoading) return `当前 ${currentCount} 条 / 正在统计精确总数…`;
if (pagination.totalApprox && Number.isFinite(total) && total > 0) return `当前 ${currentCount} 条 / 约 ${total}`;
if (pagination.totalCountCancelled) return `当前 ${currentCount} 条 / 已取消统计`;
return `当前 ${currentCount} 条 / 总数未统计`;
}
return `当前 ${currentCount} 条 / 正在统计总数…`;
}
if (isDuckDBConnection && (!Number.isFinite(total) || total <= 0)) {
return '当前 0 条 / 共 0 条';
}
return `当前 ${currentCount} 条 / 共 ${total}`;
}, [pagination, isDuckDBConnection]);
const paginationPageText = useMemo(() => {
if (!pagination) return '';
const total = Number.isFinite(pagination.total) ? pagination.total : 0;
const canShowTotalPages = pagination.totalKnown !== false || (isDuckDBConnection && pagination.totalApprox && total > 0);
if (!canShowTotalPages || total <= 0) return `${pagination.current}`;
const totalPages = Math.max(1, Math.ceil(total / Math.max(1, pagination.pageSize)));
return `${pagination.current} / ${totalPages}`;
}, [pagination, isDuckDBConnection]);
const handlePageSizeChange = useCallback((value: string) => {
if (!pagination || !onPageChange) return;
const nextSize = Number(value);
if (!Number.isFinite(nextSize) || nextSize <= 0) return;
const firstRowIndex = Math.max(0, (pagination.current - 1) * pagination.pageSize);
const nextPage = Math.floor(firstRowIndex / nextSize) + 1;
onPageChange(nextPage, nextSize);
}, [pagination, onPageChange]);
return (
<div className={`${gridId}${cellEditMode ? ' cell-edit-mode' : ''} data-grid-root`} style={{ flex: '1 1 auto', height: '100%', overflow: 'hidden', padding: 0, display: 'flex', flexDirection: 'column', minHeight: 0, minWidth: 0, background: 'transparent' }}>
{/* Toolbar + Filter Panel */}
@@ -3313,22 +3397,17 @@ const DataGrid: React.FC<DataGridProps> = ({
<Checkbox
checked={cond.enabled !== false}
onChange={e => updateFilter(cond.id, 'enabled', e.target.checked)}
style={{ marginTop: 6 }}
style={{ marginTop: 6, flex: '0 0 auto', whiteSpace: 'nowrap' }}
>
</Checkbox>
{condIndex === 0 ? (
<div style={{ width: 96, marginTop: 7, textAlign: 'center', fontSize: 12, color: '#8c8c8c' }}>
</div>
) : (
<Select
style={{ width: 96 }}
value={cond.logic === 'OR' ? 'OR' : 'AND'}
onChange={v => updateFilter(cond.id, 'logic', v)}
options={filterLogicOptions as any}
/>
)}
<Select
style={{ width: 96, minWidth: 96, maxWidth: 96, flex: '0 0 96px' }}
value={condIndex === 0 ? '__FIRST__' : (cond.logic === 'OR' ? 'OR' : 'AND')}
onChange={v => updateFilter(cond.id, 'logic', v)}
options={condIndex === 0 ? [{ value: '__FIRST__', label: '首条' }] : (filterLogicOptions as any)}
disabled={condIndex === 0}
/>
<Select
style={{ width: 180 }}
value={cond.column}
@@ -3690,6 +3769,8 @@ const DataGrid: React.FC<DataGridProps> = ({
}}
onClick={(e) => e.stopPropagation()}
>
{canModifyData && (
<>
<div
style={{
padding: '8px 12px',
@@ -3723,6 +3804,8 @@ const DataGrid: React.FC<DataGridProps> = ({
({selectedRowKeys.length})
</div>
<div style={{ height: 1, background: darkMode ? '#303030' : '#f0f0f0', margin: '4px 0' }} />
</>
)}
{supportsCopyInsert && (
<div
style={{
@@ -3859,33 +3942,41 @@ const DataGrid: React.FC<DataGridProps> = ({
</div>
{pagination && (
<div style={{ padding: '8px', borderTop: 'none', display: 'flex', justifyContent: 'flex-end' }}>
<Pagination
current={pagination.current}
pageSize={pagination.pageSize}
total={pagination.total}
showTotal={(total, range) => {
const hasValidRange = Array.isArray(range) && range[0] > 0 && range[1] >= range[0];
const currentCount = hasValidRange ? Math.max(0, range[1] - range[0] + 1) : 0;
if (pagination.totalKnown === false) {
if (isDuckDBConnection) {
if (pagination.totalCountLoading) return `当前 ${currentCount} 条 / 正在统计精确总数...`;
if (pagination.totalApprox && Number.isFinite(total) && total > 0) return `当前 ${currentCount} 条 / 约 ${total}`;
if (pagination.totalCountCancelled) return `当前 ${currentCount} 条 / 已取消统计`;
return `当前 ${currentCount} 条 / 总数未统计`;
<div style={{ padding: '12px 0 0', borderTop: 'none', display: 'flex', justifyContent: 'flex-end' }}>
<div className="data-grid-pagination-shell">
<div className="data-grid-pagination-summary" aria-live="polite">
<span className="data-grid-pagination-kicker"></span>
<span className="data-grid-pagination-summary-value">{paginationSummaryText}</span>
</div>
<div className="data-grid-pagination-page-chip">{paginationPageText}</div>
<Pagination
current={pagination.current}
pageSize={pagination.pageSize}
total={pagination.total}
showSizeChanger={false}
onChange={onPageChange}
showTitle={false}
size="small"
itemRender={(_page, type, originalElement) => {
if (type === 'prev') {
return <span className="data-grid-pagination-nav-icon" aria-hidden="true"><LeftOutlined /></span>;
}
return `当前 ${currentCount} 条 / 正在统计总数...`;
}
if (isDuckDBConnection && (!Number.isFinite(total) || total <= 0)) {
return '当前 0 条 / 共 0 条';
}
return `当前 ${currentCount} 条 / 共 ${total}`;
}}
showSizeChanger
pageSizeOptions={['100', '200', '500', '1000']}
onChange={onPageChange}
size="small"
/>
if (type === 'next') {
return <span className="data-grid-pagination-nav-icon" aria-hidden="true"><RightOutlined /></span>;
}
return originalElement;
}}
/>
<Select
size="small"
popupMatchSelectWidth={false}
value={String(pagination.pageSize)}
onChange={handlePageSizeChange}
options={paginationPageSizeOptions.map((value) => ({ value, label: `${value} 条 / 页` }))}
className="data-grid-pagination-size-select"
aria-label="每页条数"
/>
</div>
</div>
)}
@@ -3947,12 +4038,13 @@ const DataGrid: React.FC<DataGridProps> = ({
.${gridId} .ant-table-tbody .ant-table-row.row-added:hover > .ant-table-cell { background-color: ${rowAddedHover} !important; }
.${gridId} .ant-table-tbody > tr.row-modified:hover > td,
.${gridId} .ant-table-tbody .ant-table-row.row-modified:hover > .ant-table-cell { background-color: ${rowModHover} !important; }
.${gridId}.cell-edit-mode .ant-table-tbody > tr > td[data-col-name],
.${gridId}.cell-edit-mode .ant-table-tbody .ant-table-row > .ant-table-cell[data-col-name] { user-select: none; -webkit-user-select: none; cursor: crosshair; }
.${gridId}.cell-edit-mode .ant-table-tbody > tr > td[data-cell-selected="true"],
.${gridId}.cell-edit-mode .ant-table-tbody .ant-table-row > .ant-table-cell[data-cell-selected="true"] {
box-shadow: inset 0 0 0 2px ${selectionAccentHex};
background-image: linear-gradient(${darkMode ? `rgba(${selectionAccentRgb}, 0.20)` : `rgba(${selectionAccentRgb}, 0.08)`}, ${darkMode ? `rgba(${selectionAccentRgb}, 0.20)` : `rgba(${selectionAccentRgb}, 0.08)`});
.${gridId} .ant-table-tbody > tr > td[data-col-name],
.${gridId} .ant-table-tbody .ant-table-row > .ant-table-cell[data-col-name] { user-select: none; -webkit-user-select: none; cursor: crosshair; }
.${gridId} .ant-table-tbody > tr > td[data-cell-selected="true"],
.${gridId} .ant-table-tbody .ant-table-row > .ant-table-cell[data-cell-selected="true"],
.${gridId} [data-cell-selected="true"] {
box-shadow: inset 0 0 0 2px ${selectionAccentHex} !important;
background-image: linear-gradient(${darkMode ? `rgba(${selectionAccentRgb}, 0.20)` : `rgba(${selectionAccentRgb}, 0.08)`}, ${darkMode ? `rgba(${selectionAccentRgb}, 0.20)` : `rgba(${selectionAccentRgb}, 0.08)`}) !important;
}
.${gridId} .ant-table-content,
.${gridId} .ant-table-body {
@@ -4061,6 +4153,266 @@ const DataGrid: React.FC<DataGridProps> = ({
.${gridId} .data-grid-external-hscroll-inner {
height: 1px;
}
.${gridId} .data-grid-pagination-shell {
display: inline-flex;
align-items: center;
justify-content: flex-end;
gap: 10px;
flex-wrap: wrap;
max-width: 100%;
padding: 8px 10px;
border-radius: 16px;
border: 1px solid ${paginationShellBorderColor};
background: ${paginationShellBg};
box-shadow: ${paginationShellShadow};
backdrop-filter: ${opacity < 0.999 ? 'blur(14px)' : 'none'};
-webkit-backdrop-filter: ${opacity < 0.999 ? 'blur(14px)' : 'none'};
}
.${gridId} .data-grid-pagination-summary,
.${gridId} .data-grid-pagination-page-chip {
display: inline-flex;
align-items: center;
gap: 8px;
min-height: 34px;
padding: 0 12px;
border-radius: 999px;
border: 1px solid ${paginationChipBorderColor};
background: ${paginationChipBg};
color: ${paginationPrimaryTextColor};
font-size: 12px;
line-height: 1;
font-variant-numeric: tabular-nums;
white-space: nowrap;
}
.${gridId} .data-grid-pagination-kicker {
display: inline-flex;
align-items: center;
height: 20px;
padding: 0 8px;
border-radius: 999px;
background: ${paginationAccentBg};
border: 1px solid ${paginationAccentBorderColor};
color: ${paginationActiveItemTextColor};
font-size: 11px;
font-weight: 700;
letter-spacing: 0.02em;
}
.${gridId} .data-grid-pagination-summary-value {
color: ${paginationPrimaryTextColor};
font-weight: 600;
font-variant-numeric: tabular-nums;
}
.${gridId} .data-grid-pagination-page-chip {
color: ${paginationSecondaryTextColor};
font-weight: 600;
}
.${gridId} .ant-pagination {
display: inline-flex;
align-items: center;
gap: 6px;
margin: 0;
color: ${paginationPrimaryTextColor};
}
.${gridId} .ant-pagination .ant-pagination-item,
.${gridId} .ant-pagination .ant-pagination-prev,
.${gridId} .ant-pagination .ant-pagination-next,
.${gridId} .ant-pagination .ant-pagination-jump-prev,
.${gridId} .ant-pagination .ant-pagination-jump-next {
min-width: 34px;
height: 34px;
margin-inline-end: 0;
border-radius: 12px;
border: 1px solid ${paginationChipBorderColor};
background: ${paginationChipBg};
box-shadow: none;
display: inline-flex;
align-items: center;
justify-content: center;
overflow: hidden;
transition: border-color 160ms ease, background-color 160ms ease, transform 160ms ease, box-shadow 160ms ease;
}
.${gridId} .ant-pagination .ant-pagination-item a,
.${gridId} .ant-pagination .ant-pagination-prev .ant-pagination-item-link,
.${gridId} .ant-pagination .ant-pagination-next .ant-pagination-item-link,
.${gridId} .ant-pagination .ant-pagination-prev > *,
.${gridId} .ant-pagination .ant-pagination-next > * {
display: flex;
align-items: center;
justify-content: center;
width: 100%;
height: 100%;
color: ${paginationPrimaryTextColor};
font-weight: 600;
border: none;
background: transparent;
border-radius: inherit;
line-height: 1;
}
.${gridId} .ant-pagination .ant-pagination-item:hover,
.${gridId} .ant-pagination .ant-pagination-prev:hover,
.${gridId} .ant-pagination .ant-pagination-next:hover {
background: ${paginationHoverBg};
border-color: ${paginationActiveItemBorderColor};
transform: translateY(-1px);
}
.${gridId} .ant-pagination .ant-pagination-item-active {
border-color: ${paginationActiveItemBorderColor};
background: ${paginationActiveItemBg};
box-shadow: inset 0 0 0 1px ${paginationAccentBorderColor};
}
.${gridId} .ant-pagination .ant-pagination-item-active a {
color: ${paginationActiveItemTextColor};
}
.${gridId} .ant-pagination .ant-pagination-disabled,
.${gridId} .ant-pagination .ant-pagination-disabled:hover {
background: transparent;
border-color: ${paginationChipBorderColor};
transform: none;
opacity: 0.42;
}
.${gridId} .ant-pagination .ant-pagination-jump-prev,
.${gridId} .ant-pagination .ant-pagination-jump-next {
padding: 0;
}
.${gridId} .ant-pagination .ant-pagination-jump-prev .ant-pagination-item-link,
.${gridId} .ant-pagination .ant-pagination-jump-next .ant-pagination-item-link {
position: relative;
display: flex;
align-items: center;
justify-content: center;
width: 100%;
height: 100%;
padding: 0;
margin: 0;
line-height: 1;
}
.${gridId} .ant-pagination .ant-pagination-jump-prev .ant-pagination-item-container,
.${gridId} .ant-pagination .ant-pagination-jump-next .ant-pagination-item-container {
display: flex;
align-items: center;
justify-content: center;
width: 100%;
height: 100%;
position: relative;
line-height: 1;
}
.${gridId} .ant-pagination .ant-pagination-jump-prev .ant-pagination-item-ellipsis,
.${gridId} .ant-pagination .ant-pagination-jump-next .ant-pagination-item-ellipsis,
.${gridId} .ant-pagination .ant-pagination-jump-prev .ant-pagination-item-link-icon,
.${gridId} .ant-pagination .ant-pagination-jump-next .ant-pagination-item-link-icon {
position: absolute !important;
top: 0 !important;
right: 0 !important;
bottom: 0 !important;
left: 0 !important;
inset: 0 !important;
width: fit-content !important;
height: fit-content !important;
min-width: 0 !important;
min-height: 0 !important;
margin: auto !important;
padding: 0 !important;
transform: none !important;
display: inline-flex !important;
align-items: center !important;
justify-content: center !important;
line-height: 1 !important;
color: ${paginationSecondaryTextColor};
}
.${gridId} .ant-pagination .ant-pagination-jump-prev .ant-pagination-item-ellipsis,
.${gridId} .ant-pagination .ant-pagination-jump-next .ant-pagination-item-ellipsis {
letter-spacing: 0.18em;
text-indent: 0.18em;
text-align: center;
}
.${gridId} .ant-pagination .ant-pagination-jump-prev .ant-pagination-item-link-icon .anticon,
.${gridId} .ant-pagination .ant-pagination-jump-next .ant-pagination-item-link-icon .anticon,
.${gridId} .ant-pagination .ant-pagination-jump-prev .ant-pagination-item-link-icon svg,
.${gridId} .ant-pagination .ant-pagination-jump-next .ant-pagination-item-link-icon svg {
display: inline-flex !important;
align-items: center !important;
justify-content: center !important;
width: 1em;
height: 1em;
line-height: 1;
}
.${gridId} .data-grid-pagination-nav-icon {
display: inline-flex;
align-items: center;
justify-content: center;
width: 100%;
height: 100%;
font-size: 12px;
line-height: 1;
}
.${gridId} .data-grid-pagination-nav-icon .anticon {
display: inline-flex;
align-items: center;
justify-content: center;
width: 100%;
height: 100%;
}
.${gridId} .data-grid-pagination-size-select {
min-width: 112px;
height: 34px;
display: inline-flex;
align-items: stretch;
}
.${gridId} .data-grid-pagination-size-select.ant-select-single,
.${gridId} .data-grid-pagination-size-select.ant-select-single.ant-select-sm {
height: 34px;
}
.${gridId} .data-grid-pagination-size-select .ant-select-selector {
height: 34px !important;
border-radius: 12px !important;
border: 1px solid ${paginationChipBorderColor} !important;
background: ${paginationChipBg} !important;
box-shadow: none !important;
padding: 0 12px !important;
display: flex !important;
align-items: center !important;
}
.${gridId} .data-grid-pagination-size-select .ant-select-selection-wrap {
display: flex !important;
align-items: center !important;
height: 100%;
}
.${gridId} .data-grid-pagination-size-select .ant-select-selection-search,
.${gridId} .data-grid-pagination-size-select .ant-select-selection-search-input {
height: 100% !important;
}
.${gridId} .data-grid-pagination-size-select .ant-select-selection-item,
.${gridId} .data-grid-pagination-size-select .ant-select-selection-placeholder {
display: flex;
align-items: center;
height: 100%;
line-height: 34px !important;
color: ${paginationPrimaryTextColor};
font-weight: 600;
font-variant-numeric: tabular-nums;
}
.${gridId} .data-grid-pagination-size-select .ant-select-selection-search {
inset-inline-start: 12px !important;
inset-inline-end: 32px !important;
}
.${gridId} .data-grid-pagination-size-select .ant-select-arrow {
color: ${paginationSecondaryTextColor};
inset-inline-end: 12px;
top: 50%;
transform: translateY(-50%);
margin-top: 0;
display: inline-flex;
align-items: center;
justify-content: center;
height: 16px;
line-height: 1;
}
.${gridId} .data-grid-pagination-size-select .ant-select-arrow .anticon {
display: inline-flex;
align-items: center;
justify-content: center;
line-height: 1;
}
`}</style>
{/* Ghost Resize Line for Columns */}


@@ -1,9 +1,11 @@
import React, { useState, useEffect, useMemo, useRef } from 'react';
import { Modal, Form, Select, Button, message, Steps, Transfer, Card, Alert, Divider, Typography, Progress, Checkbox, Table, Drawer, Tabs } from 'antd';
import { Modal, Form, Select, Input, Button, message, Steps, Transfer, Card, Alert, Divider, Typography, Progress, Checkbox, Table, Drawer, Tabs, theme as antdTheme } from 'antd';
import { DatabaseOutlined, RocketOutlined, SwapOutlined, TableOutlined } from '@ant-design/icons';
import { useStore } from '../store';
import { DBGetDatabases, DBGetTables, DataSync, DataSyncAnalyze, DataSyncPreview } from '../../wailsjs/go/app/App';
import { SavedConnection } from '../types';
import { EventsOn } from '../../wailsjs/runtime/runtime';
import { normalizeOpacityForPlatform, resolveAppearanceValues } from '../utils/appearance';
const { Title, Text } = Typography;
const { Step } = Steps;
@@ -21,6 +23,12 @@ type TableDiffSummary = {
deletes?: number;
same?: number;
message?: string;
targetTableExists?: boolean;
plannedAction?: string;
warnings?: string[];
unsupportedObjects?: string[];
indexesToCreate?: number;
indexesSkipped?: number;
};
type TableOps = {
insert: boolean;
@@ -31,6 +39,8 @@ type TableOps = {
selectedDeletePks?: string[];
};
type WorkflowType = 'sync' | 'migration';
const quoteSqlIdent = (dbType: string, ident: string): string => {
const raw = String(ident || '').trim();
if (!raw) return raw;
@@ -76,6 +86,11 @@ const toSqlLiteral = (value: any, dbType: string): string => {
return `'${String(value).replace(/'/g, "''")}'`;
};
const resolveRedisDbIndex = (raw?: string): number => {
const value = Number(String(raw || '').trim());
return Number.isInteger(value) && value >= 0 && value <= 15 ? value : 0;
};
const buildSqlPreview = (
previewData: any,
tableName: string,
@@ -145,8 +160,14 @@ const buildSqlPreview = (
const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open, onClose }) => {
const connections = useStore((state) => state.connections);
const themeMode = useStore((state) => state.theme);
const appearance = useStore((state) => state.appearance);
const [currentStep, setCurrentStep] = useState(0);
const [loading, setLoading] = useState(false);
const { token } = antdTheme.useToken();
const darkMode = themeMode === 'dark';
const resolvedAppearance = resolveAppearanceValues(appearance);
const effectiveOpacity = normalizeOpacityForPlatform(resolvedAppearance.opacity);
// Step 1: Config
const [sourceConnId, setSourceConnId] = useState<string>('');
@@ -162,9 +183,13 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
const [selectedTables, setSelectedTables] = useState<string[]>([]);
// Options
const [workflowType, setWorkflowType] = useState<WorkflowType>('sync');
const [syncContent, setSyncContent] = useState<'data' | 'schema' | 'both'>('data');
const [syncMode, setSyncMode] = useState<string>('insert_update');
const [autoAddColumns, setAutoAddColumns] = useState<boolean>(true);
const [targetTableStrategy, setTargetTableStrategy] = useState<'existing_only' | 'auto_create_if_missing' | 'smart'>('existing_only');
const [createIndexes, setCreateIndexes] = useState<boolean>(false);
const [mongoCollectionName, setMongoCollectionName] = useState<string>('');
const [showSameTables, setShowSameTables] = useState<boolean>(false);
const [analyzing, setAnalyzing] = useState<boolean>(false);
const [diffTables, setDiffTables] = useState<TableDiffSummary[]>([]);
@@ -240,9 +265,12 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
setSourceDb('');
setTargetDb('');
setSelectedTables([]);
setWorkflowType('sync');
setSyncContent('data');
setSyncMode('insert_update');
setAutoAddColumns(true);
setTargetTableStrategy('existing_only');
setCreateIndexes(false);
setShowSameTables(false);
setAnalyzing(false);
setDiffTables([]);
@@ -260,6 +288,30 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
}
}, [open]);
useEffect(() => {
if (workflowType === 'migration') {
if (syncMode === 'insert_update') {
setSyncMode('insert_only');
}
if (syncContent === 'schema') {
setSyncContent('both');
}
if (targetTableStrategy === 'existing_only') {
setTargetTableStrategy('smart');
}
if (!createIndexes) {
setCreateIndexes(true);
}
} else {
if (targetTableStrategy !== 'existing_only') {
setTargetTableStrategy('existing_only');
}
if (createIndexes) {
setCreateIndexes(false);
}
}
}, [workflowType]);
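The effect above can be read as a pure state-normalization rule: migration forces sensible defaults (no upsert, schema implies schema+data, smart table strategy, index creation on), while sync pins the conservative defaults. A hedged sketch of that rule as a pure function (names are illustrative, not part of the component):

```typescript
// Illustrative pure form of the workflow-normalization effect above:
// given the current option state, return the values the effect settles on.
type SyncOptions = {
  syncMode: string;
  syncContent: 'data' | 'schema' | 'both';
  targetTableStrategy: 'existing_only' | 'auto_create_if_missing' | 'smart';
  createIndexes: boolean;
};

const normalizeForWorkflow = (
  workflow: 'sync' | 'migration',
  o: SyncOptions,
): SyncOptions => {
  if (workflow === 'migration') {
    return {
      // Upsert is not offered for migration; downgrade to insert-only.
      syncMode: o.syncMode === 'insert_update' ? 'insert_only' : o.syncMode,
      // Schema-only migration is promoted to schema + data.
      syncContent: o.syncContent === 'schema' ? 'both' : o.syncContent,
      // "Existing tables only" makes no sense when migrating; use smart.
      targetTableStrategy:
        o.targetTableStrategy === 'existing_only' ? 'smart' : o.targetTableStrategy,
      createIndexes: true,
    };
  }
  // Sync mode always pins the conservative defaults.
  return { ...o, targetTableStrategy: 'existing_only', createIndexes: false };
};
```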
const handleSourceConnChange = async (connId: string) => {
setSourceConnId(connId);
setSourceDb('');
@@ -357,6 +409,9 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
content: syncContent,
mode: "insert_update",
autoAddColumns,
targetTableStrategy,
createIndexes,
mongoCollectionName: mongoCollectionName.trim(),
jobId,
};
@@ -407,6 +462,9 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
content: "data",
mode: "insert_update",
autoAddColumns,
targetTableStrategy,
createIndexes,
mongoCollectionName: mongoCollectionName.trim(),
};
try {
@@ -483,6 +541,9 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
content: syncContent,
mode: syncMode,
autoAddColumns,
targetTableStrategy,
createIndexes,
mongoCollectionName: mongoCollectionName.trim(),
tableOptions,
jobId,
};
@@ -530,10 +591,132 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
return buildSqlPreview(previewData, previewTable, targetType, ops);
}, [previewData, previewTable, targetConnId, connections, tableOptions]);
const analysisWarnings = useMemo(() => {
const items: string[] = [];
diffTables.forEach((table) => {
(table.warnings || []).forEach((warning) => items.push(`${table.table}: ${warning}`));
(table.unsupportedObjects || []).forEach((warning) => items.push(`${table.table}: ${warning}`));
});
return Array.from(new Set(items));
}, [diffTables]);
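The memo above flattens per-table warnings and unsupported-object notes into one deduplicated list for the pre-check alert; the same aggregation as a standalone sketch:

```typescript
// Flatten per-table warnings and unsupported-object notes into
// "table: warning" strings, deduplicated while preserving first-seen order.
type TableDiff = { table: string; warnings?: string[]; unsupportedObjects?: string[] };

const collectWarnings = (tables: TableDiff[]): string[] => {
  const items: string[] = [];
  tables.forEach((t) => {
    (t.warnings || []).forEach((w) => items.push(`${t.table}: ${w}`));
    (t.unsupportedObjects || []).forEach((w) => items.push(`${t.table}: ${w}`));
  });
  // Set iteration order follows insertion order, so dedup keeps ordering.
  return Array.from(new Set(items));
};
```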
const isMigrationWorkflow = workflowType === 'migration';
const sourceConn = useMemo(() => connections.find(c => c.id === sourceConnId), [connections, sourceConnId]);
const targetConn = useMemo(() => connections.find(c => c.id === targetConnId), [connections, targetConnId]);
const sourceType = String(sourceConn?.config?.type || '').toLowerCase();
const targetType = String(targetConn?.config?.type || '').toLowerCase();
const isRedisMongoKeyspaceMigration = isMigrationWorkflow && (
(sourceType === 'redis' && targetType === 'mongodb') ||
(sourceType === 'mongodb' && targetType === 'redis')
);
const defaultMongoCollectionName = useMemo(() => {
if (sourceType === 'redis' && targetType === 'mongodb') {
return `redis_db_${resolveRedisDbIndex(sourceDb || sourceConn?.config?.database)}_keys`;
}
if (sourceType === 'mongodb' && targetType === 'redis') {
return selectedTables[0] || `redis_db_${resolveRedisDbIndex(targetDb || targetConn?.config?.database)}_keys`;
}
return '';
}, [sourceType, targetType, sourceDb, targetDb, sourceConn, targetConn, selectedTables]);
const modalPanelStyle = useMemo(() => ({
background: darkMode
? 'linear-gradient(180deg, rgba(16,22,34,0.96) 0%, rgba(10,14,24,0.98) 100%)'
: 'linear-gradient(180deg, rgba(255,255,255,0.98) 0%, rgba(246,248,252,0.98) 100%)',
border: darkMode ? '1px solid rgba(255,255,255,0.08)' : '1px solid rgba(16,24,40,0.08)',
boxShadow: darkMode ? '0 24px 56px rgba(0,0,0,0.36)' : '0 18px 44px rgba(15,23,42,0.14)',
backdropFilter: darkMode ? 'blur(18px)' : 'none',
}), [darkMode]);
const shellCardStyle = useMemo<React.CSSProperties>(() => ({
borderRadius: 18,
border: darkMode ? '1px solid rgba(255,255,255,0.08)' : '1px solid rgba(15,23,42,0.08)',
background: darkMode ? 'rgba(255,255,255,0.03)' : `rgba(255,255,255,${Math.max(effectiveOpacity, 0.88)})`,
boxShadow: darkMode ? '0 12px 32px rgba(0,0,0,0.22)' : '0 10px 24px rgba(15,23,42,0.08)',
overflow: 'hidden',
}), [darkMode, effectiveOpacity]);
const heroPanelStyle = useMemo<React.CSSProperties>(() => ({
padding: 18,
borderRadius: 18,
border: darkMode ? '1px solid rgba(255,214,102,0.12)' : '1px solid rgba(24,144,255,0.12)',
background: darkMode
? 'linear-gradient(135deg, rgba(255,214,102,0.10) 0%, rgba(255,255,255,0.03) 100%)'
: 'linear-gradient(135deg, rgba(24,144,255,0.10) 0%, rgba(255,255,255,0.95) 100%)',
marginBottom: 18,
}), [darkMode]);
const badgeStyle = useMemo<React.CSSProperties>(() => ({
display: 'inline-flex',
alignItems: 'center',
gap: 6,
padding: '6px 10px',
borderRadius: 999,
border: darkMode ? '1px solid rgba(255,255,255,0.10)' : '1px solid rgba(15,23,42,0.08)',
background: darkMode ? 'rgba(255,255,255,0.04)' : 'rgba(255,255,255,0.86)',
color: darkMode ? 'rgba(255,255,255,0.88)' : '#334155',
fontSize: 12,
fontWeight: 600,
}), [darkMode]);
const quietPanelStyle = useMemo<React.CSSProperties>(() => ({
padding: 14,
borderRadius: 16,
border: darkMode ? '1px solid rgba(255,255,255,0.08)' : '1px solid rgba(15,23,42,0.08)',
background: darkMode ? 'rgba(255,255,255,0.025)' : 'rgba(248,250,252,0.92)',
}), [darkMode]);
const modalWorkspaceStyle = useMemo<React.CSSProperties>(() => ({
display: 'flex',
flexDirection: 'column',
height: '100%',
minHeight: 0,
}), []);
const modalScrollableContentStyle = useMemo<React.CSSProperties>(() => ({
flex: 1,
minHeight: 0,
overflowY: 'auto',
overflowX: 'hidden',
paddingRight: 4,
overscrollBehavior: 'contain',
}), []);
const modalFooterBarStyle = useMemo<React.CSSProperties>(() => ({
marginTop: 18,
display: 'flex',
justifyContent: 'flex-end',
gap: 8,
paddingTop: 12,
borderTop: darkMode ? '1px solid rgba(255,255,255,0.06)' : '1px solid rgba(15,23,42,0.06)',
flex: '0 0 auto',
}), [darkMode]);
const renderModalTitle = (title: string, description: string) => (
<div style={{ display: 'flex', alignItems: 'flex-start', gap: 12 }}>
<div style={{
width: 38,
height: 38,
borderRadius: 14,
display: 'grid',
placeItems: 'center',
background: darkMode ? 'rgba(255,214,102,0.12)' : 'rgba(24,144,255,0.10)',
color: darkMode ? '#ffd666' : token.colorPrimary,
flexShrink: 0,
}}>
{isMigrationWorkflow ? <RocketOutlined /> : <SwapOutlined />}
</div>
<div style={{ minWidth: 0 }}>
<div style={{ fontSize: 16, fontWeight: 700, color: darkMode ? '#f8fafc' : '#0f172a' }}>{title}</div>
<div style={{ marginTop: 4, fontSize: 12, lineHeight: 1.6, color: darkMode ? 'rgba(255,255,255,0.56)' : 'rgba(15,23,42,0.58)' }}>{description}</div>
</div>
</div>
);
return (
<>
<Modal
title="Data Sync"
title={renderModalTitle(isMigrationWorkflow ? 'Cross-Database Migration Workbench' : 'Data Sync Workbench', isMigrationWorkflow ? 'Create tables, import data, and pre-check risks from source to target.' : 'Diff against existing target tables, execute the sync, and confirm results.')}
open={open}
onCancel={() => {
if (syncing) {
@@ -542,23 +725,61 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
}
onClose();
}}
width={800}
width={920}
footer={null}
destroyOnHidden
closable={!syncing}
maskClosable={!syncing}
styles={{
content: modalPanelStyle,
header: { background: 'transparent', borderBottom: 'none', paddingBottom: 10 },
body: {
paddingTop: 8,
height: 760,
maxHeight: 'calc(100vh - 120px)',
overflow: 'hidden',
display: 'flex',
flexDirection: 'column',
},
footer: { background: 'transparent', borderTop: 'none', paddingTop: 12 },
}}
>
<div style={modalWorkspaceStyle}>
<div style={{ flex: '0 0 auto' }}>
<div style={heroPanelStyle}>
<div style={{ display: 'flex', justifyContent: 'space-between', gap: 12, alignItems: 'flex-start', flexWrap: 'wrap' }}>
<div style={{ minWidth: 0 }}>
<div style={{ fontSize: 18, fontWeight: 700, color: darkMode ? '#f8fafc' : '#0f172a' }}>{isMigrationWorkflow ? 'Cross-Datasource Migration' : 'Data Sync'}</div>
<div style={{ marginTop: 6, fontSize: 13, lineHeight: 1.7, color: darkMode ? 'rgba(255,255,255,0.62)' : 'rgba(15,23,42,0.62)' }}>
{isMigrationWorkflow
? 'For migrating source tables to another database: tables can be auto-created by strategy, data imported, and compatible indexes rebuilt.'
: 'For targets whose tables already exist: run a diff analysis first, then apply the selected inserts, updates, or deletes.'}
</div>
</div>
<div style={{ display: 'flex', flexWrap: 'wrap', gap: 8 }}>
<span style={badgeStyle}>{isMigrationWorkflow ? <RocketOutlined /> : <SwapOutlined />} {isMigrationWorkflow ? 'Migration mode' : 'Sync mode'}</span>
<span style={badgeStyle}><DatabaseOutlined /> {sourceConnId ? 'Source connection selected' : 'Source connection pending'}</span>
<span style={badgeStyle}><TableOutlined /> {selectedTables.length || 0} tables</span>
</div>
</div>
</div>
<Steps current={currentStep} style={{ marginBottom: 24 }}>
<Step title="Configure Source & Target" />
<Step title="Select Tables" />
<Step title="Execution Result" />
</Steps>
</div>
<div style={modalScrollableContentStyle}>
{/* STEP 1: CONFIG */}
{currentStep === 0 && (
<div>
<div style={{ display: 'flex', gap: 24, justifyContent: 'center' }}>
<Card title="Source Database" style={{ width: 350 }}>
<div style={{ display: 'grid', gridTemplateColumns: 'minmax(0, 1fr) 44px minmax(0, 1fr)', gap: 18, alignItems: 'stretch' }}>
<Card
title="Source Database"
style={shellCardStyle}
styles={{ header: { borderBottom: darkMode ? '1px solid rgba(255,255,255,0.08)' : '1px solid rgba(15,23,42,0.06)', fontWeight: 700 }, body: { padding: 18 } }}
>
<Form layout="vertical">
<Form.Item label="Connection">
<Select value={sourceConnId} onChange={handleSourceConnChange}>
@@ -572,8 +793,16 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
</Form.Item>
</Form>
</Card>
<div style={{ display: 'flex', alignItems: 'center' }}></div>
<Card title="Target Database" style={{ width: 350 }}>
<div style={{ display: 'grid', placeItems: 'center' }}>
<div style={{ ...badgeStyle, width: 44, height: 44, borderRadius: 14, justifyContent: 'center', padding: 0 }}>
<SwapOutlined />
</div>
</div>
<Card
title="Target Database"
style={shellCardStyle}
styles={{ header: { borderBottom: darkMode ? '1px solid rgba(255,255,255,0.08)' : '1px solid rgba(15,23,42,0.06)', fontWeight: 700 }, body: { padding: 18 } }}
>
<Form layout="vertical">
<Form.Item label="Connection">
<Select value={targetConnId} onChange={handleTargetConnChange}>
@@ -589,27 +818,94 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
</Card>
</div>
<Card title="Sync Options" style={{ marginTop: 16 }}>
<Card
title={isMigrationWorkflow ? 'Migration Options' : 'Sync Options'}
style={{ ...shellCardStyle, marginTop: 18 }}
styles={{ header: { borderBottom: darkMode ? '1px solid rgba(255,255,255,0.08)' : '1px solid rgba(15,23,42,0.06)', fontWeight: 700 }, body: { padding: 18 } }}
>
<div style={{ ...quietPanelStyle, marginBottom: 14 }}>
<Text style={{ color: darkMode ? 'rgba(255,255,255,0.72)' : 'rgba(15,23,42,0.68)', lineHeight: 1.7 }}>
</Text>
</div>
<Form layout="vertical">
<Form.Item label="Sync Content">
<Form.Item label="Workflow Type">
<Select value={workflowType} onChange={setWorkflowType}>
<Option value="sync">Data Sync</Option>
<Option value="migration">Cross-Database Migration</Option>
</Select>
</Form.Item>
<Alert
type={isMigrationWorkflow ? 'info' : 'success'}
showIcon
style={{ marginBottom: 12 }}
message={isMigrationWorkflow
? 'Cross-Database Migration mode: migrate tables to another data source, with automatic table creation and data import.'
: 'Data Sync mode: run incremental sync or overwrite import against target tables that already exist.'}
/>
<Form.Item label={isMigrationWorkflow ? 'Migration Content' : 'Sync Content'}>
<Select value={syncContent} onChange={setSyncContent}>
<Option value="data">Data only</Option>
<Option value="schema">Schema only</Option>
<Option value="both">Schema + Data</Option>
</Select>
</Form.Item>
<Form.Item label="Sync Mode">
<Form.Item label={isMigrationWorkflow ? 'Migration Mode' : 'Sync Mode'}>
<Select value={syncMode} onChange={setSyncMode} disabled={syncContent === 'schema'}>
<Option value="insert_update">Insert / Update (upsert)</Option>
<Option value="insert_only">Insert only</Option>
<Option value="full_overwrite">Full overwrite</Option>
</Select>
</Form.Item>
<Form.Item label={isMigrationWorkflow ? 'Target Table Strategy' : 'Target Table Requirement'}>
<Select value={targetTableStrategy} onChange={setTargetTableStrategy} disabled={!isMigrationWorkflow}>
<Option value="existing_only">Use existing tables only</Option>
<Option value="auto_create_if_missing">Auto-create when missing</Option>
<Option value="smart">Smart (create or reuse automatically)</Option>
</Select>
</Form.Item>
{isRedisMongoKeyspaceMigration && (
<Form.Item
label="Mongo Collection Name (optional)"
extra={sourceType === 'redis'
? 'Leave blank to use the default collection name; when set, this Redis keyspace is written into that single Mongo collection.'
: 'For MongoDB → Redis the source collection is usually selected directly; leave this blank, as it is only used as a fallback when no collection is explicitly selected.'}
>
<Input
value={mongoCollectionName}
onChange={(e) => setMongoCollectionName(e.target.value)}
placeholder={defaultMongoCollectionName || 'Enter a Mongo collection name'}
allowClear
maxLength={128}
/>
</Form.Item>
)}
<Form.Item>
<Checkbox checked={autoAddColumns} onChange={(e) => setAutoAddColumns(e.target.checked)}>
Auto-add missing columns (MySQL targets)
Auto-add missing columns (MySQL → MySQL and MySQL → Kingbase)
</Checkbox>
</Form.Item>
<Form.Item>
<Checkbox checked={createIndexes} onChange={(e) => setCreateIndexes(e.target.checked)} disabled={!isMigrationWorkflow || targetTableStrategy === 'existing_only'}>
Create normal/unique indexes along with the tables
</Checkbox>
</Form.Item>
{isMigrationWorkflow && targetTableStrategy !== 'existing_only' && (
<Alert
type="info"
showIcon
message="Auto-create table mode currently supports MySQL → Kingbase only: columns, primary keys, and normal/unique/composite indexes are migrated, while full-text, spatial, prefix, and functional indexes are explicitly skipped."
style={{ marginBottom: 12 }}
/>
)}
{!isMigrationWorkflow && (
<Alert
type="info"
showIcon
message="Data Sync mode runs against existing target tables by default; to create tables and import across data sources, switch to Cross-Database Migration."
style={{ marginBottom: 12 }}
/>
)}
{syncContent !== 'schema' && syncMode === 'full_overwrite' && (
<Alert
type="warning"
@@ -624,26 +920,42 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
{/* STEP 2: TABLES */}
{currentStep === 1 && (
<div style={{ display: 'flex', flexDirection: 'column', gap: 12 }}>
<div style={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center' }}>
<Text type="secondary">Select the tables to sync:</Text>
<div style={{ display: 'flex', flexDirection: 'column', gap: 14 }}>
<div style={quietPanelStyle}>
<div style={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center', marginBottom: 10 }}>
<Text type="secondary">Select the tables to sync</Text>
<Checkbox checked={showSameTables} onChange={(e) => setShowSameTables(e.target.checked)}>
Show identical tables
</Checkbox>
</div>
<Transfer
</div>
<Transfer
dataSource={allTables.map(t => ({ key: t, title: t }))}
titles={['Source Tables', 'Selected Tables']}
targetKeys={selectedTables}
onChange={(keys) => setSelectedTables(keys as string[])}
render={item => item.title}
listStyle={{ width: 350, height: 280, marginTop: 0 }}
locale={{ itemUnit: 'item', itemsUnit: 'items', searchPlaceholder: 'Search tables', notFoundContent: 'No data' }}
listStyle={{ width: 390, height: 320, marginTop: 0, borderRadius: 14, overflow: 'hidden' }}
locale={{ itemUnit: 'item', itemsUnit: 'items', searchPlaceholder: 'Search tables', notFoundContent: 'No data' }}
/>
</div>
{diffTables.length > 0 && (
<div>
<Divider orientation="left">Diff Analysis</Divider>
<div style={quietPanelStyle}>
<Divider orientation="left" style={{ marginTop: 0 }}>Diff Analysis</Divider>
{analysisWarnings.length > 0 && (
<Alert
type="warning"
showIcon
message="Pre-check found risks or downgraded items; confirm before executing"
description={
<ul style={{ margin: 0, paddingLeft: 18 }}>
{analysisWarnings.slice(0, 8).map((item) => <li key={item}>{item}</li>)}
{analysisWarnings.length > 8 && <li>+{analysisWarnings.length - 8} more</li>}
</ul>
}
style={{ marginBottom: 12 }}
/>
)}
<Table
size="small"
pagination={false}
@@ -655,13 +967,29 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
const same = Number(t.same || 0);
const msg = String(t.message || '').trim();
const can = !!t.canSync;
const warns = Array.isArray(t.warnings) ? t.warnings.length : 0;
const unsupported = Array.isArray(t.unsupportedObjects) ? t.unsupportedObjects.length : 0;
if (showSameTables) return true;
if (!can) return true;
if (msg) return true;
if (msg || warns > 0 || unsupported > 0) return true;
return ins > 0 || upd > 0 || del > 0 || same === 0;
})}
columns={[
{ title: 'Table', dataIndex: 'table', key: 'table', ellipsis: true },
{
title: 'Target Table',
key: 'targetTableExists',
width: 90,
render: (_: any, r: any) => r.targetTableExists ? 'Exists' : 'Missing'
},
{
title: 'Plan',
dataIndex: 'plannedAction',
key: 'plannedAction',
width: 220,
ellipsis: true,
render: (v: any) => String(v || '')
},
{
title: 'Insert',
key: 'inserts',
@@ -670,11 +998,7 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
const ops = tableOptions[r.table] || { insert: true, update: true, delete: false };
const disabled = !r.canSync || analyzing || Number(r.inserts || 0) === 0;
return (
<Checkbox
checked={!!ops.insert}
disabled={disabled}
onChange={(e) => updateTableOption(r.table, 'insert', e.target.checked)}
>
<Checkbox checked={!!ops.insert} disabled={disabled} onChange={(e) => updateTableOption(r.table, 'insert', e.target.checked)}>
{Number(r.inserts || 0)}
</Checkbox>
);
@@ -688,11 +1012,7 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
const ops = tableOptions[r.table] || { insert: true, update: true, delete: false };
const disabled = !r.canSync || analyzing || Number(r.updates || 0) === 0;
return (
<Checkbox
checked={!!ops.update}
disabled={disabled}
onChange={(e) => updateTableOption(r.table, 'update', e.target.checked)}
>
<Checkbox checked={!!ops.update} disabled={disabled} onChange={(e) => updateTableOption(r.table, 'update', e.target.checked)}>
{Number(r.updates || 0)}
</Checkbox>
);
@@ -706,18 +1026,28 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
const ops = tableOptions[r.table] || { insert: true, update: true, delete: false };
const disabled = !r.canSync || analyzing || Number(r.deletes || 0) === 0;
return (
<Checkbox
checked={!!ops.delete}
disabled={disabled}
onChange={(e) => updateTableOption(r.table, 'delete', e.target.checked)}
>
<Checkbox checked={!!ops.delete} disabled={disabled} onChange={(e) => updateTableOption(r.table, 'delete', e.target.checked)}>
{Number(r.deletes || 0)}
</Checkbox>
);
}
},
{ title: 'Same', dataIndex: 'same', key: 'same', width: 70, render: (v: any) => Number(v || 0) },
{ title: 'Message', dataIndex: 'message', key: 'message', ellipsis: true, render: (v: any) => (v ? String(v) : '') },
{
title: 'Risk',
key: 'warnings',
width: 220,
render: (_: any, r: any) => {
const warns = [...(Array.isArray(r.warnings) ? r.warnings : []), ...(Array.isArray(r.unsupportedObjects) ? r.unsupportedObjects : [])];
if (warns.length === 0) return '-';
return (
<div style={{ color: '#d48806', fontSize: 12, lineHeight: 1.5 }}>
{warns.slice(0, 2).map((item: string) => <div key={item}>{item}</div>)}
{warns.length > 2 && <div>+{warns.length - 2} more</div>}
</div>
);
}
},
{
title: 'Preview',
key: 'preview',
@@ -741,7 +1071,8 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
{/* STEP 3: RESULT */}
{currentStep === 2 && (
<div>
<div style={{ display: 'flex', flexDirection: 'column', gap: 14 }}>
<div style={quietPanelStyle}>
<Alert
message={syncing ? "Syncing" : (syncResult?.success ? "Sync Completed" : "Sync Failed")}
description={
@@ -753,7 +1084,7 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
showIcon
/>
<div style={{ marginTop: 12 }}>
<div style={{ marginTop: 14 }}>
<Progress
percent={syncProgress.percent}
status={syncing ? "active" : (syncResult?.success ? "success" : "exception")}
@@ -761,7 +1092,9 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
/>
</div>
<Divider orientation="left">Sync Logs</Divider>
</div>
<div style={quietPanelStyle}>
<Divider orientation="left" style={{ marginTop: 0 }}>Sync Logs</Divider>
<div
ref={logBoxRef}
onScroll={() => {
@@ -770,14 +1103,25 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
const nearBottom = el.scrollHeight - el.scrollTop - el.clientHeight < 40;
autoScrollRef.current = nearBottom;
}}
style={{ background: '#f5f5f5', padding: 12, height: 300, overflowY: 'auto', fontFamily: 'monospace' }}
style={{
background: darkMode ? 'rgba(255,255,255,0.03)' : 'rgba(248,250,252,0.92)',
border: darkMode ? '1px solid rgba(255,255,255,0.08)' : '1px solid rgba(15,23,42,0.06)',
borderRadius: 14,
padding: 12,
height: 300,
overflowY: 'auto',
fontFamily: 'SFMono-Regular, ui-monospace, Menlo, Consolas, monospace'
}}
>
{syncLogs.map((item, i: number) => <div key={i}>{renderSyncLogItem(item)}</div>)}
</div>
</div>
</div>
)}
<div style={{ marginTop: 24, textAlign: 'right' }}>
</div>
<div style={modalFooterBarStyle}>
{currentStep === 0 && (
<Button type="primary" onClick={nextToTables} loading={loading}>Next</Button>
)}
@@ -804,14 +1148,16 @@ const DataSyncModal: React.FC<{ open: boolean; onClose: () => void }> = ({ open,
</>
)}
</div>
</div>
</Modal>
<Drawer
title={`Diff Preview: ${previewTable}`}
styles={{ body: { background: darkMode ? 'rgba(9,13,20,0.98)' : '#f8fafc' } }}
open={previewOpen}
onClose={() => { setPreviewOpen(false); setPreviewTable(''); setPreviewData(null); }}
width={900}
>
{previewLoading && <Alert type="info" showIcon message="Loading diff preview..." />}
{previewLoading && <Alert type="info" showIcon message="Loading diff preview…" />}
{!previewLoading && previewData && (
<div>
<Alert

View File

@@ -4,7 +4,7 @@ import { TabData, ColumnDefinition } from '../types';
import { useStore } from '../store';
import { DBQuery, DBGetColumns } from '../../wailsjs/go/app/App';
import DataGrid, { GONAVI_ROW_KEY } from './DataGrid';
import { buildOrderBySQL, buildWhereSQL, quoteIdentPart, quoteQualifiedIdent, withSortBufferTuningSQL, type FilterCondition } from '../utils/sql';
import { buildOrderBySQL, buildPaginatedSelectSQL, buildWhereSQL, quoteIdentPart, quoteQualifiedIdent, withSortBufferTuningSQL, type FilterCondition } from '../utils/sql';
import { buildMongoCountCommand, buildMongoFilter, buildMongoFindCommand, buildMongoSort } from '../utils/mongodb';
import { getDataSourceCapabilities } from '../utils/dataSourceCapabilities';
@@ -455,7 +455,7 @@ const DataViewer: React.FC<{ tab: TabData }> = ({ tab }) => {
if (pageRowCount > 0) {
const tailOffset = Math.max(0, totalRows - (offset + pageRowCount));
if (tailOffset < offset) {
sql = `${baseSql}${reverseOrderSQL} LIMIT ${pageRowCount} OFFSET ${tailOffset}`;
sql = buildPaginatedSelectSQL(dbType, baseSql, reverseOrderSQL, pageRowCount, tailOffset);
useClickHouseReversePagination = true;
clickHouseReverseLimit = pageRowCount;
clickHouseReverseHasMore = currentPage < totalPages;
@@ -464,7 +464,7 @@ const DataViewer: React.FC<{ tab: TabData }> = ({ tab }) => {
}
if (!useClickHouseReversePagination) {
// Large-table performance: opening a table does not block on COUNT(*). Fetch one extra row to detect whether a next page exists; the total is computed in the background and backfilled asynchronously.
sql += ` LIMIT ${size + 1} OFFSET ${offset}`;
sql = buildPaginatedSelectSQL(dbType, baseSql, orderBySQL, size + 1, offset);
}
}
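The branch above decides when ClickHouse should paginate from the tail: when the requested page sits closer to the end of the table than to the start, it is cheaper to reverse the ORDER BY and use a small tail offset than a huge head OFFSET. An illustrative standalone form of that decision (names are hypothetical):

```typescript
// Illustrative form of the ClickHouse tail-pagination decision in the diff:
// when the page is closer to the table's end than its start, read it with a
// reversed ORDER BY and a small offset from the tail instead of a large
// OFFSET from the head. Rows fetched this way must be re-reversed client-side.
const planTailPagination = (
  totalRows: number,
  offset: number,
  pageRowCount: number,
): { reversed: boolean; offset: number } => {
  const tailOffset = Math.max(0, totalRows - (offset + pageRowCount));
  return tailOffset < offset
    ? { reversed: true, offset: tailOffset }
    : { reversed: false, offset };
};
```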
@@ -534,8 +534,7 @@ const DataViewer: React.FC<{ tab: TabData }> = ({ tab }) => {
if (safeSelect) {
let fallbackSql = `SELECT ${safeSelect} FROM ${quoteQualifiedIdent(dbType, tableName)} ${whereSQL}`;
fallbackSql += buildOrderBySQL(dbType, sortInfo, pkColumns);
fallbackSql += ` LIMIT ${size + 1} OFFSET ${offset}`;
fallbackSql = buildPaginatedSelectSQL(dbType, fallbackSql, buildOrderBySQL(dbType, sortInfo, pkColumns), size + 1, offset);
executedSql = fallbackSql;
resData = await executeDataQuery(fallbackSql, 'complex-type fallback retry');
}
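Both call sites above now route pagination through `buildPaginatedSelectSQL` from `../utils/sql`. The real implementation is not shown in this diff; a plausible sketch of such a dialect-aware helper, under the assumption that most supported dialects accept `LIMIT … OFFSET` while SQL Server needs `OFFSET … FETCH` (the SQL Server branch is illustrative, not confirmed by the diff):

```typescript
// Hypothetical sketch of buildPaginatedSelectSQL (the actual helper in
// ../utils/sql may differ): append the ORDER BY clause, then emit the
// dialect-appropriate pagination suffix.
const buildPaginatedSelectSQL = (
  dbType: string,
  baseSql: string,
  orderBySQL: string,
  limit: number,
  offset: number,
): string => {
  const type = dbType.toLowerCase();
  if (type === 'sqlserver' || type === 'mssql') {
    // SQL Server requires an ORDER BY before OFFSET ... FETCH.
    return `${baseSql}${orderBySQL} OFFSET ${offset} ROWS FETCH NEXT ${limit} ROWS ONLY`;
  }
  // MySQL / PostgreSQL / ClickHouse / DuckDB all accept LIMIT ... OFFSET.
  return `${baseSql}${orderBySQL} LIMIT ${limit} OFFSET ${offset}`;
};
```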

View File

@@ -3,7 +3,7 @@ import { Alert, Button, Collapse, Input, Modal, Progress, Select, Space, Switch,
import { DeleteOutlined, DownloadOutlined, FileSearchOutlined, FolderOpenOutlined, InfoCircleFilled, ReloadOutlined } from '@ant-design/icons';
import { EventsOn } from '../../wailsjs/runtime/runtime';
import { useStore } from '../store';
import { normalizeOpacityForPlatform } from '../utils/appearance';
import { normalizeOpacityForPlatform, resolveAppearanceValues } from '../utils/appearance';
import {
CheckDriverNetworkStatus,
DownloadDriverPackage,
@@ -166,7 +166,8 @@ const DriverManagerModal: React.FC<{ open: boolean; onClose: () => void; onOpenG
const theme = useStore((state) => state.theme);
const appearance = useStore((state) => state.appearance);
const darkMode = theme === 'dark';
const opacity = normalizeOpacityForPlatform(appearance.opacity);
const resolvedAppearance = resolveAppearanceValues(appearance);
const opacity = normalizeOpacityForPlatform(resolvedAppearance.opacity);
const modalContentRef = useRef<HTMLDivElement | null>(null);
const tableContainerRef = useRef<HTMLDivElement | null>(null);
const tableScrollTargetsRef = useRef<HTMLElement[]>([]);
@@ -1223,7 +1224,7 @@ const DriverManagerModal: React.FC<{ open: boolean; onClose: () => void; onOpenG
paddingRight: 18,
},
}}
destroyOnClose
destroyOnHidden
footer={(
<div className="driver-manager-footer">
<div

View File

@@ -1,8 +1,8 @@
import React, { useRef, useEffect } from 'react';
import { Table, Tag, Button, Tooltip } from 'antd';
import { ClearOutlined, CloseOutlined, CaretRightOutlined, BugOutlined } from '@ant-design/icons';
import { Table, Tag, Button, Tooltip, Empty } from 'antd';
import { ClearOutlined, CloseOutlined, BugOutlined, ClockCircleOutlined } from '@ant-design/icons';
import { useStore } from '../store';
import { normalizeOpacityForPlatform } from '../utils/appearance';
import { normalizeOpacityForPlatform, resolveAppearanceValues } from '../utils/appearance';
interface LogPanelProps {
height: number;
@@ -16,7 +16,8 @@ const LogPanel: React.FC<LogPanelProps> = ({ height, onClose, onResizeStart }) =
const theme = useStore(state => state.theme);
const appearance = useStore(state => state.appearance);
const darkMode = theme === 'dark';
const opacity = normalizeOpacityForPlatform(appearance.opacity);
const resolvedAppearance = resolveAppearanceValues(appearance);
const opacity = normalizeOpacityForPlatform(resolvedAppearance.opacity);
// Background Helper
const getBg = (darkHex: string) => {
@@ -28,10 +29,25 @@ const LogPanel: React.FC<LogPanelProps> = ({ height, onClose, onResizeStart }) =
return `rgba(${r}, ${g}, ${b}, ${opacity})`;
};
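`getBg` above turns a hex color into an `rgba()` string scaled by the normalized panel opacity; only the final `return` is visible in this hunk, so the following self-contained reconstruction of the conversion is an assumption:

```typescript
// Convert a 6-digit hex color into an rgba() string at a given opacity,
// mirroring the getBg helper above (the parsing lines sit in the elided
// part of the hunk, so this is a reconstruction, not the exact source).
const hexToRgba = (hex: string, opacity: number): string => {
  const value = hex.replace('#', '');
  const r = parseInt(value.slice(0, 2), 16);
  const g = parseInt(value.slice(2, 4), 16);
  const b = parseInt(value.slice(4, 6), 16);
  return `rgba(${r}, ${g}, ${b}, ${opacity})`;
};
```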
const bgMain = getBg('#1d1d1d');
const panelDividerColor = darkMode ? 'rgba(255,255,255,0.08)' : 'rgba(0,0,0,0.08)';
const shellOpacity = darkMode ? Math.max(0.18, opacity * 0.82) : Math.max(0.28, opacity * 0.92);
const shellOpacityStrong = darkMode ? Math.max(0.22, opacity * 0.9) : Math.max(0.34, opacity * 0.96);
const panelDividerColor = darkMode
? `rgba(255,255,255,${Math.max(0.04, opacity * 0.10)})`
: `rgba(0,0,0,${Math.max(0.04, opacity * 0.08)})`;
const panelMutedTextColor = darkMode ? 'rgba(255,255,255,0.62)' : 'rgba(0,0,0,0.58)';
const logScrollbarThumb = darkMode ? 'rgba(255, 255, 255, 0.34)' : 'rgba(0, 0, 0, 0.26)';
const logScrollbarThumbHover = darkMode ? 'rgba(255, 255, 255, 0.5)' : 'rgba(0, 0, 0, 0.36)';
const panelShellBg = darkMode
? `linear-gradient(180deg, rgba(15,20,30,${shellOpacity}) 0%, rgba(9,13,22,${shellOpacityStrong}) 100%)`
: `linear-gradient(180deg, rgba(255,255,255,${shellOpacityStrong}) 0%, rgba(246,248,252,${shellOpacity}) 100%)`;
const panelAccentColor = darkMode ? '#ffd666' : '#1677ff';
const panelShadow = darkMode
? `0 12px 28px rgba(0,0,0,${Math.max(0.05, opacity * 0.18)})`
: `0 12px 24px rgba(15,23,42,${Math.max(0.02, opacity * 0.08)})`;
const logScrollbarThumb = darkMode
? `rgba(255, 255, 255, ${Math.max(0.18, opacity * 0.34)})`
: `rgba(0, 0, 0, ${Math.max(0.12, opacity * 0.26)})`;
const logScrollbarThumbHover = darkMode
? `rgba(255, 255, 255, ${Math.max(0.28, opacity * 0.48)})`
: `rgba(0, 0, 0, ${Math.max(0.18, opacity * 0.36)})`;
const columns = [
{
@@ -45,7 +61,7 @@ const LogPanel: React.FC<LogPanelProps> = ({ height, onClose, onResizeStart }) =
dataIndex: 'status',
width: 70,
render: (status: string) => (
<Tag color={status === 'success' ? 'success' : 'error'} style={{ marginRight: 0 }}>
<Tag color={status === 'success' ? 'success' : 'error'} style={{ marginRight: 0, borderRadius: 999, paddingInline: 8, fontSize: 11, fontWeight: 700 }}>
{status === 'success' ? 'OK' : 'ERR'}
</Tag>
)
@@ -60,7 +76,7 @@ const LogPanel: React.FC<LogPanelProps> = ({ height, onClose, onResizeStart }) =
title: 'SQL / Message',
dataIndex: 'sql',
render: (text: string, record: any) => (
<div style={{ fontFamily: 'monospace', wordBreak: 'break-all', fontSize: '12px', lineHeight: '1.2' }}>
<div style={{ fontFamily: 'monospace', wordBreak: 'break-all', fontSize: '12px', lineHeight: '1.45' }}>
<div style={{ color: darkMode ? '#a6e22e' : '#005cc5' }}>{text}</div>
{record.message && <div style={{ color: '#ff4d4f', marginTop: 2 }}>{record.message}</div>}
{record.affectedRows !== undefined && <div style={{ color: panelMutedTextColor, marginTop: 1 }}>Affected: {record.affectedRows}</div>}
@@ -72,12 +88,18 @@ const LogPanel: React.FC<LogPanelProps> = ({ height, onClose, onResizeStart }) =
return (
<div style={{
height,
borderTop: `1px solid ${panelDividerColor}`,
background: bgMain,
margin: 0,
border: `1px solid ${panelDividerColor}`,
borderRadius: 14,
background: panelShellBg,
WebkitBackdropFilter: opacity < 0.999 ? 'blur(14px)' : 'none',
boxShadow: panelShadow,
backdropFilter: darkMode && opacity < 0.999 ? 'blur(18px)' : 'none',
display: 'flex',
flexDirection: 'column',
position: 'relative',
zIndex: 100 // Ensure above other content
overflow: 'hidden',
zIndex: 100
}}>
{/* Resize Handle */}
<div
@@ -95,38 +117,53 @@ const LogPanel: React.FC<LogPanelProps> = ({ height, onClose, onResizeStart }) =
{/* Toolbar */}
<div style={{
padding: '4px 8px',
padding: '10px 14px',
borderBottom: `1px solid ${panelDividerColor}`,
display: 'flex',
justifyContent: 'space-between',
alignItems: 'center',
height: 32
gap: 12,
minHeight: 48
}}>
<div style={{ display: 'flex', alignItems: 'center', gap: 8, fontWeight: 'bold', fontSize: '12px' }}>
<BugOutlined /> SQL Execution Log
<div style={{ display: 'flex', alignItems: 'center', gap: 10, minWidth: 0 }}>
<div style={{ width: 30, height: 30, borderRadius: 10, display: 'grid', placeItems: 'center', background: darkMode ? `rgba(255,214,102,${Math.max(0.10, Math.min(0.18, opacity * 0.18))})` : `rgba(24,144,255,${Math.max(0.08, Math.min(0.16, opacity * 0.16))})`, color: panelAccentColor, flexShrink: 0 }}>
<BugOutlined />
</div>
<div style={{ minWidth: 0 }}>
<div style={{ fontWeight: 700, fontSize: 13, color: darkMode ? '#f5f7ff' : '#162033' }}>SQL Execution Log</div>
<div style={{ fontSize: 12, color: panelMutedTextColor }}>Records executed statements for easier troubleshooting</div>
</div>
</div>
<div>
<div style={{ display: 'flex', alignItems: 'center', gap: 6 }}>
<Tooltip title="Clear logs">
<Button type="text" size="small" icon={<ClearOutlined />} onClick={clearSqlLogs} />
<Button type="text" size="small" icon={<ClearOutlined />} onClick={clearSqlLogs} style={{ color: panelMutedTextColor }} />
</Tooltip>
<Tooltip title="Close panel">
<Button type="text" size="small" icon={<CloseOutlined />} onClick={onClose} />
<Button type="text" size="small" icon={<CloseOutlined />} onClick={onClose} style={{ color: panelMutedTextColor }} />
</Tooltip>
</div>
</div>
{/* List */}
<div className="log-panel-scroll" style={{ flex: 1, overflow: 'auto' }}>
<Table
className="log-panel-table"
dataSource={sqlLogs}
columns={columns}
size="small"
pagination={false}
rowKey="id"
showHeader={false}
// scroll={{ y: height - 32 }} // Let flex handle it
/>
<div className="log-panel-scroll" style={{ flex: 1, overflow: 'auto', padding: '8px 10px 10px' }}>
{sqlLogs.length === 0 ? (
<div style={{ height: '100%', minHeight: 160, display: 'grid', placeItems: 'center' }}>
<Empty
image={Empty.PRESENTED_IMAGE_SIMPLE}
description={<span style={{ color: panelMutedTextColor }}> SQL </span>}
/>
</div>
) : (
<Table
className="log-panel-table"
dataSource={sqlLogs}
columns={columns}
size="small"
pagination={false}
rowKey="id"
showHeader={false}
/>
)}
</div>
<style>{`
.log-panel-scroll {
@@ -156,6 +193,16 @@ const LogPanel: React.FC<LogPanelProps> = ({ height, onClose, onResizeStart }) =
.log-panel-table .ant-table-tbody > tr > td {
background: transparent !important;
}
.log-panel-table .ant-table-tbody > tr > td {
padding: 8px 10px !important;
border-bottom: 1px solid ${panelDividerColor} !important;
}
.log-panel-table .ant-table-tbody > tr:last-child > td {
border-bottom: none !important;
}
.log-panel-table .ant-table-row:hover > td {
background: ${darkMode ? 'rgba(255,255,255,0.03)' : 'rgba(16,24,40,0.03)'} !important;
}
`}</style>
</div>
);


@@ -5,7 +5,7 @@ import { useStore } from '../store';
import { RedisKeyInfo, RedisValue, StreamEntry } from '../types';
import Editor from '@monaco-editor/react';
import type { DataNode } from 'antd/es/tree';
import { normalizeOpacityForPlatform } from '../utils/appearance';
import { normalizeOpacityForPlatform, resolveAppearanceValues } from '../utils/appearance';
const { Search } = Input;
@@ -399,7 +399,8 @@ const RedisViewer: React.FC<RedisViewerProps> = ({ connectionId, redisDB }) => {
const theme = useStore(state => state.theme);
const appearance = useStore(state => state.appearance);
const darkMode = theme === 'dark';
const opacity = normalizeOpacityForPlatform(appearance.opacity);
const resolvedAppearance = resolveAppearanceValues(appearance);
const opacity = normalizeOpacityForPlatform(resolvedAppearance.opacity);
const connection = connections.find(c => c.id === connectionId);
const keyAccentColor = darkMode ? '#ffd666' : '#1677ff';
const jsonAccentColor = darkMode ? '#f6c453' : '#1890ff';


@@ -27,12 +27,15 @@ import { Tree, message, Dropdown, MenuProps, Input, Button, Modal, Form, Badge,
DisconnectOutlined,
CloudOutlined,
CheckSquareOutlined,
CodeOutlined
CodeOutlined,
TagOutlined,
CheckOutlined,
FilterOutlined
} from '@ant-design/icons';
import { useStore } from '../store';
import { SavedConnection } from '../types';
import { DBGetDatabases, DBGetTables, DBQuery, DBShowCreateTable, ExportTable, OpenSQLFile, CreateDatabase, RenameDatabase, DropDatabase, RenameTable, DropTable, DropView, DropFunction, RenameView } from '../../wailsjs/go/app/App';
import { normalizeOpacityForPlatform } from '../utils/appearance';
import { normalizeOpacityForPlatform, resolveAppearanceValues } from '../utils/appearance';
const { Search } = Input;
@@ -73,6 +76,15 @@ const SEARCH_SCOPE_LABEL_MAP: Record<SearchScope, string> = SEARCH_SCOPE_OPTIONS
return acc;
}, {} as Record<SearchScope, string>);
const SEARCH_SCOPE_ICON_MAP: Record<SearchScope, React.ReactNode> = {
smart: <ThunderboltOutlined />,
object: <TableOutlined />,
database: <DatabaseOutlined />,
host: <CloudOutlined />,
tag: <TagOutlined />,
};
const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }> = ({ onEditConnection }) => {
const connections = useStore(state => state.connections);
const savedQueries = useStore(state => state.savedQueries);
@@ -95,7 +107,8 @@ const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }>
const recordTableAccess = useStore(state => state.recordTableAccess);
const setTableSortPreference = useStore(state => state.setTableSortPreference);
const darkMode = theme === 'dark';
const opacity = normalizeOpacityForPlatform(appearance.opacity);
const resolvedAppearance = resolveAppearanceValues(appearance);
const opacity = normalizeOpacityForPlatform(resolvedAppearance.opacity);
const [treeData, setTreeData] = useState<TreeNode[]>([]);
// Background Helper (Duplicate logic for now, ideally shared)
@@ -108,6 +121,44 @@ const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }>
return `rgba(${r}, ${g}, ${b}, ${opacity})`;
};
const bgMain = getBg('#141414');
const modalPanelStyle = useMemo(() => ({
background: darkMode
? 'linear-gradient(180deg, rgba(20,26,38,0.96) 0%, rgba(13,17,26,0.98) 100%)'
: 'linear-gradient(180deg, rgba(255,255,255,0.98) 0%, rgba(246,248,252,0.98) 100%)',
border: darkMode ? '1px solid rgba(255,255,255,0.08)' : '1px solid rgba(16,24,40,0.08)',
boxShadow: darkMode ? '0 20px 48px rgba(0,0,0,0.38)' : '0 18px 42px rgba(15,23,42,0.12)',
backdropFilter: darkMode ? 'blur(18px)' : 'none',
}), [darkMode]);
const modalSectionStyle = useMemo(() => ({
padding: 14,
borderRadius: 14,
border: darkMode ? '1px solid rgba(255,255,255,0.08)' : '1px solid rgba(16,24,40,0.08)',
background: darkMode ? 'rgba(255,255,255,0.03)' : 'rgba(255,255,255,0.84)',
}), [darkMode]);
const modalScrollSectionStyle = useMemo(() => ({
maxHeight: 400,
overflow: 'auto' as const,
border: darkMode ? '1px solid rgba(255,255,255,0.08)' : '1px solid rgba(16,24,40,0.08)',
borderRadius: 14,
padding: 12,
background: darkMode ? 'rgba(255,255,255,0.03)' : 'rgba(255,255,255,0.8)',
}), [darkMode]);
const modalHintTextStyle = useMemo(() => ({
color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)',
fontSize: 12,
lineHeight: 1.6,
}), [darkMode]);
const renderSidebarModalTitle = (icon: React.ReactNode, title: string, description: string) => (
<div style={{ display: 'flex', alignItems: 'flex-start', gap: 12 }}>
<div style={{ width: 34, height: 34, borderRadius: 12, display: 'grid', placeItems: 'center', background: darkMode ? 'rgba(255,214,102,0.12)' : 'rgba(24,144,255,0.1)', color: darkMode ? '#ffd666' : '#1677ff', flexShrink: 0 }}>
{icon}
</div>
<div style={{ minWidth: 0 }}>
<div style={{ fontSize: 16, fontWeight: 700, color: darkMode ? '#f5f7ff' : '#162033' }}>{title}</div>
<div style={{ marginTop: 4, color: darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)', fontSize: 12, lineHeight: 1.6 }}>{description}</div>
</div>
</div>
);
const [searchValue, setSearchValue] = useState('');
const [searchScopes, setSearchScopes] = useState<SearchScope[]>(['smart']);
const [isSearchScopePopoverOpen, setIsSearchScopePopoverOpen] = useState(false);
@@ -2471,32 +2522,100 @@ const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }>
const searchScopePopoverContent = useMemo(() => {
const smartSelected = searchScopes.includes('smart');
const scopedOptions = SEARCH_SCOPE_OPTIONS.filter((option) => option.value !== 'smart');
const borderColor = darkMode ? 'rgba(255,255,255,0.08)' : 'rgba(16,24,40,0.08)';
const mutedTextColor = darkMode ? 'rgba(255,255,255,0.5)' : 'rgba(16,24,40,0.55)';
const titleColor = darkMode ? 'rgba(255,255,255,0.92)' : '#162033';
const panelBg = darkMode
? 'linear-gradient(180deg, rgba(17,24,39,0.96) 0%, rgba(10,15,26,0.98) 100%)'
: 'linear-gradient(180deg, rgba(255,255,255,0.98) 0%, rgba(246,248,252,0.98) 100%)';
const smartBg = smartSelected
? (darkMode ? 'linear-gradient(135deg, rgba(255,214,102,0.22) 0%, rgba(255,179,71,0.16) 100%)' : 'linear-gradient(135deg, rgba(255,214,102,0.26) 0%, rgba(255,244,204,0.92) 100%)')
: (darkMode ? 'rgba(255,255,255,0.03)' : 'rgba(255,255,255,0.72)');
const smartBorder = smartSelected
? (darkMode ? 'rgba(255,214,102,0.42)' : 'rgba(245,176,65,0.34)')
: borderColor;
const getOptionCardStyle = (checked: boolean) => ({
display: 'flex',
alignItems: 'center' as const,
justifyContent: 'space-between' as const,
gap: 12,
padding: '10px 12px',
borderRadius: 12,
border: `1px solid ${checked ? (darkMode ? 'rgba(118,169,250,0.44)' : 'rgba(24,144,255,0.32)') : borderColor}`,
background: checked
? (darkMode ? 'rgba(64,124,255,0.18)' : 'rgba(24,144,255,0.08)')
: (darkMode ? 'rgba(255,255,255,0.03)' : 'rgba(255,255,255,0.76)'),
transition: 'all 120ms ease',
});
return (
<div style={{ minWidth: 220, display: 'flex', flexDirection: 'column', gap: 8 }}>
<div style={{ fontSize: 12, color: '#8c8c8c' }}></div>
<Checkbox
checked={smartSelected}
onChange={(e) => setSearchScopeChecked('smart', e.target.checked)}
>
</Checkbox>
<div style={{ paddingLeft: 12, display: 'grid', gap: 6 }}>
{scopedOptions.map((option) => (
<Checkbox
key={option.value}
checked={searchScopes.includes(option.value)}
onChange={(e) => setSearchScopeChecked(option.value, e.target.checked)}
>
{option.label}
</Checkbox>
))}
<div style={{ minWidth: 280, display: 'flex', flexDirection: 'column', background: panelBg, padding: 14, gap: 12 }}>
<div style={{ display: 'flex', alignItems: 'flex-start', justifyContent: 'space-between', gap: 12 }}>
<div>
<div style={{ fontSize: 12, fontWeight: 700, letterSpacing: 0.4, color: mutedTextColor, textTransform: 'uppercase' }}></div>
<div style={{ marginTop: 4, fontSize: 13, lineHeight: 1.5, color: mutedTextColor }}></div>
</div>
<div style={{ width: 32, height: 32, borderRadius: 10, display: 'grid', placeItems: 'center', background: darkMode ? 'rgba(255,255,255,0.05)' : 'rgba(17,24,39,0.06)', color: darkMode ? '#ffd666' : '#1677ff', flexShrink: 0 }}>
<FilterOutlined />
</div>
</div>
<div style={{ fontSize: 12, color: '#8c8c8c' }}>
<label style={{ display: 'block', cursor: 'pointer' }}>
<div style={{ display: 'flex', alignItems: 'center', gap: 12, padding: '12px 14px', borderRadius: 14, border: `1px solid ${smartBorder}`, background: smartBg, boxShadow: smartSelected ? (darkMode ? '0 10px 24px rgba(0,0,0,0.24)' : '0 10px 24px rgba(245,176,65,0.14)') : 'none' }}>
<Checkbox
checked={smartSelected}
onChange={(e) => setSearchScopeChecked('smart', e.target.checked)}
/>
<div style={{ width: 30, height: 30, borderRadius: 10, display: 'grid', placeItems: 'center', background: darkMode ? 'rgba(255,214,102,0.16)' : 'rgba(255,214,102,0.3)', color: darkMode ? '#ffd666' : '#ad6800', flexShrink: 0 }}>
{SEARCH_SCOPE_ICON_MAP.smart}
</div>
<div style={{ flex: 1, minWidth: 0 }}>
<div style={{ display: 'flex', alignItems: 'center', gap: 8, flexWrap: 'wrap' }}>
<span style={{ fontSize: 14, fontWeight: 700, color: titleColor }}></span>
<span style={{ padding: '2px 8px', borderRadius: 999, fontSize: 11, fontWeight: 700, color: darkMode ? '#ffe58f' : '#ad6800', background: darkMode ? 'rgba(255,214,102,0.16)' : 'rgba(255,214,102,0.35)' }}></span>
</div>
<div style={{ marginTop: 3, fontSize: 12, lineHeight: 1.5, color: mutedTextColor }}>Host </div>
</div>
</div>
</label>
<div style={{ height: 1, background: borderColor, opacity: 0.9 }} />
<div style={{ display: 'flex', alignItems: 'center', justifyContent: 'space-between', gap: 12 }}>
<div style={{ fontSize: 12, fontWeight: 700, letterSpacing: 0.3, color: mutedTextColor, textTransform: 'uppercase' }}></div>
<div style={{ fontSize: 12, color: mutedTextColor }}></div>
</div>
<div style={{ display: 'grid', gap: 8 }}>
{scopedOptions.map((option) => {
const checked = searchScopes.includes(option.value);
return (
<label key={option.value} style={{ display: 'block', cursor: 'pointer' }}>
<div style={getOptionCardStyle(checked)}>
<div style={{ display: 'flex', alignItems: 'center', gap: 12, minWidth: 0 }}>
<Checkbox
checked={checked}
onChange={(e) => setSearchScopeChecked(option.value, e.target.checked)}
/>
<div style={{ width: 28, height: 28, borderRadius: 9, display: 'grid', placeItems: 'center', background: checked ? (darkMode ? 'rgba(118,169,250,0.2)' : 'rgba(24,144,255,0.12)') : (darkMode ? 'rgba(255,255,255,0.05)' : 'rgba(17,24,39,0.06)'), color: checked ? (darkMode ? '#91caff' : '#1677ff') : mutedTextColor, flexShrink: 0 }}>
{SEARCH_SCOPE_ICON_MAP[option.value]}
</div>
<span style={{ fontSize: 14, fontWeight: 600, color: titleColor, whiteSpace: 'nowrap' }}>{option.label}</span>
</div>
<div style={{ width: 18, display: 'flex', justifyContent: 'center', color: checked ? (darkMode ? '#91caff' : '#1677ff') : 'transparent', flexShrink: 0 }}>
<CheckOutlined />
</div>
</div>
</label>
);
})}
</div>
<div style={{ padding: '10px 12px', borderRadius: 12, background: darkMode ? 'rgba(255,255,255,0.03)' : 'rgba(17,24,39,0.04)', color: mutedTextColor, fontSize: 12, lineHeight: 1.6 }}>
Host
</div>
</div>
);
}, [searchScopes]);
}, [darkMode, searchScopes]);
const parseHostOnlyToken = (value: unknown): string[] => {
const raw = String(value || '').trim();
@@ -3301,14 +3420,14 @@ const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }>
return (
<div style={{ display: 'flex', flexDirection: 'column', height: '100%' }}>
<div style={{ padding: '4px 8px' }}>
<Space.Compact block size="small">
<div style={{ padding: '4px 10px' }}>
<div style={{ display: 'flex', alignItems: 'center', gap: 8 }}>
<Search
ref={searchInputRef}
placeholder="搜索..."
onChange={onSearch}
size="small"
style={{ width: '100%' }}
style={{ flex: 1, minWidth: 0 }}
/>
<Popover
content={searchScopePopoverContent}
@@ -3316,18 +3435,66 @@ const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }>
placement="bottomRight"
open={isSearchScopePopoverOpen}
onOpenChange={setIsSearchScopePopoverOpen}
styles={{ body: { padding: 0, borderRadius: 18, overflow: 'hidden' } }}
>
<Tooltip title={`搜索范围:${searchScopeSummary}`}>
<Button size="small" icon={<DownOutlined />} style={{ width: 86 }}>
{searchScopes.includes('smart') ? '(智)' : `(${searchScopes.length})`}
<Button
size="small"
style={{
minWidth: 86,
display: 'inline-flex',
alignItems: 'center',
justifyContent: 'center',
gap: 6,
paddingInline: 10,
borderRadius: 10,
borderColor: darkMode ? 'rgba(255,255,255,0.12)' : 'rgba(16,24,40,0.12)',
background: darkMode ? bgMain : 'rgba(255,255,255,0.92)',
color: darkMode ? 'rgba(255,255,255,0.88)' : '#162033',
boxShadow: isSearchScopePopoverOpen
? (darkMode ? '0 0 0 1px rgba(255,214,102,0.22) inset' : '0 0 0 1px rgba(24,144,255,0.24) inset')
: 'none',
backdropFilter: darkMode ? 'blur(10px)' : 'none',
flexShrink: 0,
}}
>
<span style={{ display: 'inline-flex', alignItems: 'center', color: searchScopes.includes('smart') ? '#ffd666' : (darkMode ? 'rgba(255,255,255,0.72)' : 'rgba(22,32,51,0.72)') }}>
<FilterOutlined />
</span>
<span style={{ fontWeight: 700, color: darkMode ? 'rgba(255,255,255,0.88)' : '#162033' }}></span>
<span
style={{
minWidth: 18,
height: 18,
padding: '0 5px',
borderRadius: 999,
display: 'inline-flex',
alignItems: 'center',
justifyContent: 'center',
fontSize: 11,
fontWeight: 700,
lineHeight: 1,
background: searchScopes.includes('smart')
? (darkMode ? 'rgba(255,214,102,0.16)' : 'rgba(24,144,255,0.12)')
: (darkMode ? 'rgba(118,169,250,0.18)' : 'rgba(24,144,255,0.12)'),
color: searchScopes.includes('smart')
? (darkMode ? '#ffd666' : '#1677ff')
: (darkMode ? '#91caff' : '#1677ff'),
}}
>
{searchScopes.includes('smart') ? '智' : searchScopes.length}
</span>
<span style={{ display: 'inline-flex', alignItems: 'center', color: darkMode ? 'rgba(255,255,255,0.48)' : 'rgba(22,32,51,0.4)', fontSize: 12 }}>
<DownOutlined />
</span>
</Button>
</Tooltip>
</Popover>
</Space.Compact>
</div>
</div>
{/* Toolbar */}
<div style={{ padding: '4px 8px', borderBottom: 'none', display: 'flex', flexWrap: 'wrap', gap: 4 }}>
<div style={{ padding: '4px 10px', borderBottom: 'none', display: 'flex', flexWrap: 'wrap', gap: 4 }}>
<Button
size="small"
icon={<FolderOpenOutlined />}
@@ -3395,8 +3562,14 @@ const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }>
)}
<Modal
title={renameViewTarget?.type === 'tag' ? "编辑标签" : "新建组"}
title={renderSidebarModalTitle(
<FolderOpenOutlined />,
renameViewTarget?.type === 'tag' ? "编辑标签" : "新建组",
renameViewTarget?.type === 'tag' ? "调整分组名称和包含的连接。" : "为连接树创建一个更清晰的分组视图。"
)}
open={isCreateTagModalOpen}
centered
styles={{ content: modalPanelStyle, header: { background: 'transparent', borderBottom: 'none', paddingBottom: 10 }, body: { paddingTop: 8 }, footer: { background: 'transparent', borderTop: 'none', paddingTop: 12 } }}
onOk={() => {
createTagForm.validateFields().then(values => {
if (renameViewTarget?.type === 'tag') {
@@ -3431,20 +3604,24 @@ const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }>
onCancel={() => setIsCreateTagModalOpen(false)}
>
<Form form={createTagForm} layout="vertical">
<Form.Item name="name" label="标签名称" rules={[{ required: true, message: '请输入标签名称' }]}>
<Input />
</Form.Item>
<Form.Item name="connectionIds" label="选择连接">
<Checkbox.Group style={{ width: '100%' }}>
<Space direction="vertical" style={{ width: '100%', maxHeight: '400px', overflowY: 'auto' }}>
{connections.map(conn => (
<Checkbox key={conn.id} value={conn.id}>
{conn.name} {conn.config.host ? `(${conn.config.host})` : ''}
</Checkbox>
))}
</Space>
</Checkbox.Group>
</Form.Item>
<div style={modalSectionStyle}>
<Form.Item name="name" label="标签名称" rules={[{ required: true, message: '请输入标签名称' }]}>
<Input placeholder="例如:线上环境 / 核心业务 / 临时调试" />
</Form.Item>
<Form.Item name="connectionIds" label="选择连接" style={{ marginBottom: 0 }}>
<Checkbox.Group style={{ width: '100%' }}>
<div style={modalScrollSectionStyle}>
<Space direction="vertical" style={{ width: '100%' }}>
{connections.map(conn => (
<Checkbox key={conn.id} value={conn.id}>
{conn.name} {conn.config.host ? `(${conn.config.host})` : ''}
</Checkbox>
))}
</Space>
</div>
</Checkbox.Group>
</Form.Item>
</div>
</Form>
</Modal>
@@ -3514,10 +3691,12 @@ const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }>
</Modal>
<Modal
title="批量操作表"
title={renderSidebarModalTitle(<TableOutlined />, "批量操作表", "按对象批量导出结构、数据或完整备份。")}
open={isBatchModalOpen}
onCancel={() => setIsBatchModalOpen(false)}
width={680}
width={720}
centered
styles={{ content: modalPanelStyle, header: { background: 'transparent', borderBottom: 'none', paddingBottom: 10 }, body: { paddingTop: 8 }, footer: { background: 'transparent', borderTop: 'none', paddingTop: 12 } }}
footer={
<div style={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center', gap: 8, flexWrap: 'wrap' }}>
<Button key="cancel" onClick={() => setIsBatchModalOpen(false)}>
@@ -3553,7 +3732,7 @@ const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }>
</div>
}
>
<div style={{ marginBottom: 16 }}>
<div style={{ ...modalSectionStyle, marginBottom: 16 }}>
<div style={{ marginBottom: 8 }}>
<label style={{ display: 'block', marginBottom: 4, fontWeight: 500 }}></label>
<Select
@@ -3585,10 +3764,11 @@ const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }>
))}
</Select>
</div>
<div style={modalHintTextStyle}></div>
</div>
{batchTables.length > 0 && (
<div style={{ marginBottom: 16 }}>
<div style={{ ...modalSectionStyle, marginBottom: 16 }}>
<Space wrap size={8} style={{ width: '100%' }}>
<Input
allowClear
@@ -3626,7 +3806,7 @@ const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }>
{batchTables.length > 0 && (
<>
<div style={{ marginBottom: 16 }}>
<div style={{ ...modalSectionStyle, marginBottom: 16 }}>
<Space>
<Button
size="small"
@@ -3654,7 +3834,7 @@ const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }>
</span>
</Space>
</div>
<div style={{ maxHeight: 400, overflow: 'auto', border: darkMode ? '1px solid #303030' : '1px solid #f0f0f0', borderRadius: 4, padding: 8 }}>
<div style={modalScrollSectionStyle}>
<Checkbox.Group
value={checkedTableKeys}
onChange={(values) => setCheckedTableKeys(values as string[])}
@@ -3704,10 +3884,12 @@ const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }>
</Modal>
<Modal
title="批量操作库"
title={renderSidebarModalTitle(<DatabaseOutlined />, "批量操作库", "按数据库批量导出结构,或生成结构加数据的备份。")}
open={isBatchDbModalOpen}
onCancel={() => setIsBatchDbModalOpen(false)}
width={600}
width={640}
centered
styles={{ content: modalPanelStyle, header: { background: 'transparent', borderBottom: 'none', paddingBottom: 10 }, body: { paddingTop: 8 }, footer: { background: 'transparent', borderTop: 'none', paddingTop: 12 } }}
footer={[
<Button key="cancel" onClick={() => setIsBatchDbModalOpen(false)}>
@@ -3731,8 +3913,8 @@ const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }>
</Button>
]}
>
<div style={{ marginBottom: 16 }}>
<label style={{ display: 'block', marginBottom: 4, fontWeight: 500 }}></label>
<div style={{ ...modalSectionStyle, marginBottom: 16 }}>
<label style={{ display: 'block', marginBottom: 4, fontWeight: 600, color: darkMode ? '#f5f7ff' : '#162033' }}></label>
<Select
value={selectedDbConnection}
onChange={handleDbConnectionChange}
@@ -3745,11 +3927,12 @@ const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }>
</Select.Option>
))}
</Select>
<div style={{ ...modalHintTextStyle, marginTop: 10 }}></div>
</div>
{batchDatabases.length > 0 && (
<>
<div style={{ marginBottom: 16 }}>
<div style={{ ...modalSectionStyle, marginBottom: 16 }}>
<Space>
<Button
size="small"
@@ -3774,7 +3957,7 @@ const Sidebar: React.FC<{ onEditConnection?: (conn: SavedConnection) => void }>
</span>
</Space>
</div>
<div style={{ maxHeight: 400, overflow: 'auto', border: darkMode ? '1px solid #303030' : '1px solid #f0f0f0', borderRadius: 4, padding: 8 }}>
<div style={modalScrollSectionStyle}>
<Checkbox.Group
value={checkedDbKeys}
onChange={(values) => setCheckedDbKeys(values as string[])}


@@ -2491,7 +2491,7 @@ END;`;
okText="应用"
cancelText="取消"
width={640}
destroyOnClose
destroyOnHidden
>
<Input.TextArea
value={commentEditorValue}


@@ -10,7 +10,7 @@ import {
sanitizeShortcutOptions,
} from './utils/shortcuts';
const DEFAULT_APPEARANCE = { opacity: 1.0, blur: 0 };
const DEFAULT_APPEARANCE = { enabled: true, opacity: 1.0, blur: 0 };
const DEFAULT_UI_SCALE = 1.0;
const MIN_UI_SCALE = 0.8;
const MAX_UI_SCALE = 1.25;
@@ -25,7 +25,7 @@ const MAX_HOST_ENTRY_LENGTH = 512;
const MAX_HOST_ENTRIES = 64;
const DEFAULT_TIMEOUT_SECONDS = 30;
const MAX_TIMEOUT_SECONDS = 3600;
const PERSIST_VERSION = 5;
const PERSIST_VERSION = 6;
const DEFAULT_CONNECTION_TYPE = 'mysql';
const DEFAULT_GLOBAL_PROXY: GlobalProxyConfig = {
enabled: false,
@@ -405,7 +405,7 @@ interface AppState {
activeContext: { connectionId: string; dbName: string } | null;
savedQueries: SavedQuery[];
theme: 'light' | 'dark';
appearance: { opacity: number; blur: number };
appearance: { enabled: boolean; opacity: number; blur: number };
uiScale: number;
fontSize: number;
startupFullscreen: boolean;
@@ -443,7 +443,7 @@ interface AppState {
deleteQuery: (id: string) => void;
setTheme: (theme: 'light' | 'dark') => void;
setAppearance: (appearance: Partial<{ opacity: number; blur: number }>) => void;
setAppearance: (appearance: Partial<{ enabled: boolean; opacity: number; blur: number }>) => void;
setUiScale: (scale: number) => void;
setFontSize: (size: number) => void;
setStartupFullscreen: (enabled: boolean) => void;
@@ -522,13 +522,14 @@ const sanitizeTableSortPreference = (value: unknown): Record<string, 'name' | 'f
};
const sanitizeAppearance = (
appearance: Partial<{ opacity: number; blur: number }> | undefined,
appearance: Partial<{ enabled: boolean; opacity: number; blur: number }> | undefined,
version: number
): { opacity: number; blur: number } => {
): { enabled: boolean; opacity: number; blur: number } => {
if (!appearance || typeof appearance !== 'object') {
return { ...DEFAULT_APPEARANCE };
}
const nextAppearance = {
enabled: typeof appearance.enabled === 'boolean' ? appearance.enabled : DEFAULT_APPEARANCE.enabled,
opacity: typeof appearance.opacity === 'number' ? appearance.opacity : DEFAULT_APPEARANCE.opacity,
blur: typeof appearance.blur === 'number' ? appearance.blur : DEFAULT_APPEARANCE.blur,
};


@@ -10,6 +10,22 @@ const WINDOWS_BLUR_FACTOR = 1.00;
const clamp = (value: number, min: number, max: number) => Math.min(max, Math.max(min, value));
export interface AppearanceSettingsLike {
enabled?: boolean;
opacity?: number;
blur?: number;
}
export const resolveAppearanceValues = (appearance: AppearanceSettingsLike | undefined): { opacity: number; blur: number } => {
if (!appearance || appearance.enabled !== false) {
return {
opacity: appearance?.opacity ?? DEFAULT_OPACITY,
blur: appearance?.blur ?? 0,
};
}
return { opacity: DEFAULT_OPACITY, blur: 0 };
};
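The helper above makes the appearance toggle authoritative: anything other than an explicit `enabled: false` passes the stored opacity/blur through, while a disabled toggle falls back to a fully opaque, blur-free default. A standalone sketch of that behavior, assuming `DEFAULT_OPACITY` is `1.0` (matching `DEFAULT_APPEARANCE = { enabled: true, opacity: 1.0, blur: 0 }` in the store diff):

```typescript
// Sketch of resolveAppearanceValues; DEFAULT_OPACITY = 1.0 is an assumption
// based on the store's DEFAULT_APPEARANCE shown in this release.
const DEFAULT_OPACITY = 1.0;

interface AppearanceSettingsLike {
  enabled?: boolean;
  opacity?: number;
  blur?: number;
}

const resolveAppearanceValues = (
  appearance: AppearanceSettingsLike | undefined,
): { opacity: number; blur: number } => {
  // Anything except an explicit `enabled: false` keeps the stored values.
  if (!appearance || appearance.enabled !== false) {
    return {
      opacity: appearance?.opacity ?? DEFAULT_OPACITY,
      blur: appearance?.blur ?? 0,
    };
  }
  // Toggle off: ignore stored opacity/blur and render fully opaque.
  return { opacity: DEFAULT_OPACITY, blur: 0 };
};

console.log(resolveAppearanceValues({ enabled: false, opacity: 0.6, blur: 12 })); // { opacity: 1, blur: 0 }
console.log(resolveAppearanceValues({ opacity: 0.6, blur: 12 }));                 // { opacity: 0.6, blur: 12 }
```

This is why components switched from `normalizeOpacityForPlatform(appearance.opacity)` to resolving through this helper first: stale opacity/blur values no longer leak through when the effect is switched off.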
export const isMacLikePlatform = (): boolean => {
if (typeof navigator === 'undefined') {
return false;


@@ -50,6 +50,11 @@ export const quoteIdentPart = (dbType: string, ident: string) => {
return raw;
}
// SQL Server uses [bracket] identifiers
if (dbTypeLower === 'sqlserver' || dbTypeLower === 'mssql') {
return `[${raw.replace(/]/g, ']]')}]`;
}
// Other databases default to double-quoted identifiers
return `"${raw.replace(/"/g, '""')}"`;
};
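The bracket branch doubles any `]` inside the identifier, while the ANSI fallback doubles embedded `"` characters. A minimal self-contained sketch of the two branches (`quoteIdent` is an illustrative name; the project's helper is `quoteIdentPart`, which also has earlier branches not shown here):

```typescript
// Minimal sketch of the dialect-aware quoting shown above.
// `quoteIdent` is an illustrative name, not the project's API.
const quoteIdent = (dbType: string, ident: string): string => {
  const raw = String(ident ?? '');
  const dbTypeLower = String(dbType ?? '').trim().toLowerCase();
  // SQL Server escapes `]` by doubling it inside [brackets].
  if (dbTypeLower === 'sqlserver' || dbTypeLower === 'mssql') {
    return `[${raw.replace(/]/g, ']]')}]`;
  }
  // Default: ANSI double quotes, with embedded quotes doubled.
  return `"${raw.replace(/"/g, '""')}"`;
};

console.log(quoteIdent('mssql', 'weird]name')); // [weird]]name]
console.log(quoteIdent('postgres', 'a"b'));     // "a""b"
```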
@@ -134,6 +139,42 @@ export const buildOrderBySQL = (
return '';
};
export const buildPaginatedSelectSQL = (
dbType: string,
baseSql: string,
orderBySQL: string,
limit: number,
offset: number,
) => {
const normalizedType = String(dbType || '').trim().toLowerCase();
const safeLimit = Math.max(0, Math.floor(Number(limit) || 0));
const safeOffset = Math.max(0, Math.floor(Number(offset) || 0));
const base = String(baseSql || '').trim();
const orderBy = String(orderBySQL || '');
if (!base || safeLimit <= 0) {
return `${base}${orderBy}`;
}
switch (normalizedType) {
case 'oracle': {
const orderedSql = `${base}${orderBy}`;
const upperBound = safeOffset + safeLimit;
if (safeOffset <= 0) {
return `SELECT * FROM (${orderedSql}) WHERE ROWNUM <= ${upperBound}`;
}
return `SELECT * FROM (SELECT "__gonavi_page__".*, ROWNUM "__gonavi_rn__" FROM (${orderedSql}) "__gonavi_page__" WHERE ROWNUM <= ${upperBound}) WHERE "__gonavi_rn__" > ${safeOffset}`;
}
case 'sqlserver':
case 'mssql': {
const effectiveOrderBy = orderBy.trim() ? orderBy : ' ORDER BY (SELECT NULL)';
return `${base}${effectiveOrderBy} OFFSET ${safeOffset} ROWS FETCH NEXT ${safeLimit} ROWS ONLY`;
}
default:
return `${base}${orderBy} LIMIT ${safeLimit} OFFSET ${safeOffset}`;
}
};
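To see what this builder emits per dialect, the snippet below copies the function exactly as introduced above and exercises it with a few made-up table/column names:

```typescript
// buildPaginatedSelectSQL as shown in the diff; sample inputs are illustrative.
const buildPaginatedSelectSQL = (
  dbType: string,
  baseSql: string,
  orderBySQL: string,
  limit: number,
  offset: number,
) => {
  const normalizedType = String(dbType || '').trim().toLowerCase();
  const safeLimit = Math.max(0, Math.floor(Number(limit) || 0));
  const safeOffset = Math.max(0, Math.floor(Number(offset) || 0));
  const base = String(baseSql || '').trim();
  const orderBy = String(orderBySQL || '');
  if (!base || safeLimit <= 0) {
    return `${base}${orderBy}`;
  }
  switch (normalizedType) {
    case 'oracle': {
      // Classic ROWNUM pagination: a single wrap for page one, a double wrap otherwise.
      const orderedSql = `${base}${orderBy}`;
      const upperBound = safeOffset + safeLimit;
      if (safeOffset <= 0) {
        return `SELECT * FROM (${orderedSql}) WHERE ROWNUM <= ${upperBound}`;
      }
      return `SELECT * FROM (SELECT "__gonavi_page__".*, ROWNUM "__gonavi_rn__" FROM (${orderedSql}) "__gonavi_page__" WHERE ROWNUM <= ${upperBound}) WHERE "__gonavi_rn__" > ${safeOffset}`;
    }
    case 'sqlserver':
    case 'mssql': {
      // OFFSET/FETCH requires an ORDER BY; (SELECT NULL) is the conventional no-op.
      const effectiveOrderBy = orderBy.trim() ? orderBy : ' ORDER BY (SELECT NULL)';
      return `${base}${effectiveOrderBy} OFFSET ${safeOffset} ROWS FETCH NEXT ${safeLimit} ROWS ONLY`;
    }
    default:
      return `${base}${orderBy} LIMIT ${safeLimit} OFFSET ${safeOffset}`;
  }
};

console.log(buildPaginatedSelectSQL('mysql', 'SELECT * FROM t', ' ORDER BY id', 50, 100));
// SELECT * FROM t ORDER BY id LIMIT 50 OFFSET 100
console.log(buildPaginatedSelectSQL('sqlserver', 'SELECT * FROM t', '', 50, 0));
// SELECT * FROM t ORDER BY (SELECT NULL) OFFSET 0 ROWS FETCH NEXT 50 ROWS ONLY
console.log(buildPaginatedSelectSQL('oracle', 'SELECT * FROM t', ' ORDER BY id', 20, 40));
```

Note that `orderBySQL` is expected to carry its own leading space (` ORDER BY …`), as produced by `buildOrderBySQL` above, which is why the builder concatenates it without inserting one.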
export const parseListValues = (val: string) => {
const raw = (val || '').trim();
if (!raw) return [];


@@ -277,6 +277,9 @@ export namespace sync {
mode: string;
jobId?: string;
autoAddColumns?: boolean;
targetTableStrategy?: string;
createIndexes?: boolean;
mongoCollectionName?: string;
tableOptions?: Record<string, TableOptions>;
static createFrom(source: any = {}) {
@@ -292,6 +295,9 @@ export namespace sync {
this.mode = source["mode"];
this.jobId = source["jobId"];
this.autoAddColumns = source["autoAddColumns"];
this.targetTableStrategy = source["targetTableStrategy"];
this.createIndexes = source["createIndexes"];
this.mongoCollectionName = source["mongoCollectionName"];
this.tableOptions = this.convertValues(source["tableOptions"], TableOptions, true);
}


@@ -1,6 +1,7 @@
package app
import (
"strconv"
"strings"
"GoNavi-Wails/internal/connection"
@@ -20,6 +21,11 @@ func normalizeRunConfig(config connection.ConnectionConfig, dbName string) conne
case "dameng":
// Dameng uses the schema parameter; keep existing behavior where dbName means the schema.
runConfig.Database = name
case "redis":
runConfig.Database = name
if idx, err := strconv.Atoi(name); err == nil && idx >= 0 && idx <= 15 {
runConfig.RedisDB = idx
}
default:
// oracle: dbName means the schema/owner and must not overwrite config.Database (the service name)
// sqlite: no Database needs to be set
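The redis branch above only honors logical database names that parse as an integer in 0–15; anything else leaves `RedisDB` untouched. An illustrative TypeScript port of that guard (`parseRedisDbIndex` is a hypothetical name used here only to show the rule):

```typescript
// Illustrative port of the Go guard: a dbName like "3" selects logical
// database 3; non-numeric or out-of-range names are ignored (null).
const parseRedisDbIndex = (name: string): number | null => {
  const trimmed = name.trim();
  if (!/^\d+$/.test(trimmed)) {
    return null; // mirrors strconv.Atoi returning an error
  }
  const idx = Number(trimmed);
  return idx >= 0 && idx <= 15 ? idx : null;
};

console.log(parseRedisDbIndex('7'));   // 7
console.log(parseRedisDbIndex('16'));  // null
console.log(parseRedisDbIndex('db0')); // null
```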


@@ -3,6 +3,7 @@ package app
import (
"context"
"fmt"
"strconv"
"strings"
"time"
@@ -547,6 +548,24 @@ func ensureNonNilSlice[T any](items []T) []T {
func (a *App) DBGetDatabases(config connection.ConnectionConfig) connection.QueryResult {
runConfig := normalizeRunConfig(config, "")
if strings.EqualFold(strings.TrimSpace(runConfig.Type), "redis") {
runConfig.Type = "redis"
client, err := a.getRedisClient(runConfig)
if err != nil {
logger.Error(err, "DBGetDatabases 获取 Redis 连接失败:%s", formatConnSummary(runConfig))
return connection.QueryResult{Success: false, Message: err.Error()}
}
dbs, err := client.GetDatabases()
if err != nil {
logger.Error(err, "DBGetDatabases 获取 Redis 库列表失败:%s", formatConnSummary(runConfig))
return connection.QueryResult{Success: false, Message: err.Error()}
}
resData := make([]map[string]string, 0, len(dbs))
for _, item := range dbs {
resData = append(resData, map[string]string{"Database": strconv.Itoa(item.Index)})
}
return connection.QueryResult{Success: true, Data: resData}
}
dbInst, err := a.getDatabase(runConfig)
if err != nil {
logger.Error(err, "DBGetDatabases 获取连接失败:%s", formatConnSummary(runConfig))
@@ -579,6 +598,48 @@ func (a *App) DBGetDatabases(config connection.ConnectionConfig) connection.Quer
func (a *App) DBGetTables(config connection.ConnectionConfig, dbName string) connection.QueryResult {
runConfig := normalizeRunConfig(config, dbName)
if strings.EqualFold(strings.TrimSpace(runConfig.Type), "redis") {
runConfig.Type = "redis"
client, err := a.getRedisClient(runConfig)
if err != nil {
logger.Error(err, "DBGetTables 获取 Redis 连接失败:%s", formatConnSummary(runConfig))
return connection.QueryResult{Success: false, Message: err.Error()}
}
cursor := uint64(0)
tables := make([]string, 0, 128)
seen := make(map[string]struct{}, 128)
for {
result, err := client.ScanKeys("*", cursor, 1000)
if err != nil {
logger.Error(err, "DBGetTables 扫描 Redis Key 失败:%s", formatConnSummary(runConfig))
return connection.QueryResult{Success: false, Message: err.Error()}
}
for _, item := range result.Keys {
key := strings.TrimSpace(item.Key)
if key == "" {
continue
}
if _, ok := seen[key]; ok {
continue
}
seen[key] = struct{}{}
tables = append(tables, key)
}
if strings.TrimSpace(result.Cursor) == "" || strings.TrimSpace(result.Cursor) == "0" {
break
}
next, err := strconv.ParseUint(strings.TrimSpace(result.Cursor), 10, 64)
if err != nil || next == cursor {
break
}
cursor = next
}
resData := make([]map[string]string, 0, len(tables))
for _, name := range tables {
resData = append(resData, map[string]string{"Table": name})
}
return connection.QueryResult{Success: true, Data: resData}
}
dbInst, err := a.getDatabase(runConfig)
if err != nil {


@@ -90,6 +90,7 @@ type IndexDefinition struct {
NonUnique int `json:"nonUnique"`
SeqInIndex int `json:"seqInIndex"`
IndexType string `json:"indexType"`
SubPart int `json:"subPart,omitempty"`
}
// ForeignKeyDefinition represents a foreign key


@@ -8,6 +8,7 @@ import (
"fmt"
"net"
"net/url"
"sort"
"strconv"
"strings"
"time"
@@ -678,3 +679,134 @@ func isClickHouseTruthy(value interface{}) bool {
return normalized == "1" || normalized == "true" || normalized == "yes" || normalized == "y"
}
}
func (c *ClickHouseDB) ApplyChanges(tableName string, changes connection.ChangeSet) error {
if c.conn == nil {
return fmt.Errorf("connection not open")
}
database, table, err := c.resolveDatabaseAndTable(c.database, tableName)
if err != nil {
return err
}
qualifiedTable := fmt.Sprintf("%s.%s", quoteClickHouseIdentifier(database), quoteClickHouseIdentifier(table))
for _, pk := range changes.Deletes {
whereExpr := buildClickHouseWhereClause(pk)
if whereExpr == "" {
continue
}
query := fmt.Sprintf("ALTER TABLE %s DELETE WHERE %s", qualifiedTable, whereExpr)
if _, err := c.conn.Exec(query); err != nil {
return fmt.Errorf("delete error: %v; sql=%s", err, query)
}
}
for _, update := range changes.Updates {
setExpr := buildClickHouseAssignments(update.Values)
whereExpr := buildClickHouseWhereClause(update.Keys)
if setExpr == "" || whereExpr == "" {
continue
}
query := fmt.Sprintf("ALTER TABLE %s UPDATE %s WHERE %s", qualifiedTable, setExpr, whereExpr)
if _, err := c.conn.Exec(query); err != nil {
return fmt.Errorf("update error: %v; sql=%s", err, query)
}
}
for _, row := range changes.Inserts {
query, err := buildClickHouseInsertSQL(qualifiedTable, row)
if err != nil {
return err
}
if query == "" {
continue
}
if _, err := c.conn.Exec(query); err != nil {
return fmt.Errorf("insert error: %v; sql=%s", err, query)
}
}
return nil
}
func buildClickHouseInsertSQL(qualifiedTable string, row map[string]interface{}) (string, error) {
if len(row) == 0 {
return "", nil
}
cols := make([]string, 0, len(row))
for k := range row {
if strings.TrimSpace(k) == "" {
continue
}
cols = append(cols, k)
}
if len(cols) == 0 {
return "", nil
}
sort.Strings(cols)
quotedCols := make([]string, 0, len(cols))
values := make([]string, 0, len(cols))
for _, col := range cols {
quotedCols = append(quotedCols, quoteClickHouseIdentifier(col))
values = append(values, clickHouseLiteral(row[col]))
}
return fmt.Sprintf("INSERT INTO %s (%s) VALUES (%s)", qualifiedTable, strings.Join(quotedCols, ", "), strings.Join(values, ", ")), nil
}
func buildClickHouseAssignments(values map[string]interface{}) string {
if len(values) == 0 {
return ""
}
cols := make([]string, 0, len(values))
for k := range values {
if strings.TrimSpace(k) == "" {
continue
}
cols = append(cols, k)
}
sort.Strings(cols)
parts := make([]string, 0, len(cols))
for _, col := range cols {
parts = append(parts, fmt.Sprintf("%s = %s", quoteClickHouseIdentifier(col), clickHouseLiteral(values[col])))
}
return strings.Join(parts, ", ")
}
func buildClickHouseWhereClause(keys map[string]interface{}) string {
if len(keys) == 0 {
return ""
}
cols := make([]string, 0, len(keys))
for k := range keys {
if strings.TrimSpace(k) == "" {
continue
}
cols = append(cols, k)
}
sort.Strings(cols)
parts := make([]string, 0, len(cols))
for _, col := range cols {
parts = append(parts, fmt.Sprintf("%s = %s", quoteClickHouseIdentifier(col), clickHouseLiteral(keys[col])))
}
return strings.Join(parts, " AND ")
}
func clickHouseLiteral(value interface{}) string {
switch val := value.(type) {
case nil:
return "NULL"
case bool:
if val {
return "1"
}
return "0"
case int, int8, int16, int32, int64, uint, uint8, uint16, uint32, uint64, float32, float64:
return fmt.Sprintf("%v", val)
case time.Time:
return fmt.Sprintf("'%s'", val.Format("2006-01-02 15:04:05"))
case []byte:
return fmt.Sprintf("'%s'", strings.ReplaceAll(string(val), "'", "''"))
default:
return fmt.Sprintf("'%s'", strings.ReplaceAll(fmt.Sprintf("%v", val), "'", "''"))
}
}
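Because the ClickHouse mutations above are assembled as literal SQL rather than parameterized statements, correctness rests entirely on identifier quoting and single-quote doubling. A minimal, self-contained sketch of that assembly; `quoteIdent`, `literal`, and `buildUpdate` are illustrative stand-ins, not the project's helpers:

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// quoteIdent wraps an identifier in backquotes, doubling any embedded
// backquote (illustrative stand-in for quoteClickHouseIdentifier).
func quoteIdent(name string) string {
	return "`" + strings.ReplaceAll(name, "`", "``") + "`"
}

// literal renders a Go value as an inline ClickHouse literal, doubling
// single quotes inside strings.
func literal(v interface{}) string {
	switch val := v.(type) {
	case nil:
		return "NULL"
	case int, int64, float64:
		return fmt.Sprintf("%v", val)
	default:
		return "'" + strings.ReplaceAll(fmt.Sprintf("%v", val), "'", "''") + "'"
	}
}

// buildUpdate assembles an ALTER TABLE ... UPDATE mutation with column
// names sorted for deterministic output.
func buildUpdate(table string, sets, keys map[string]interface{}) string {
	render := func(m map[string]interface{}, sep string) string {
		cols := make([]string, 0, len(m))
		for k := range m {
			cols = append(cols, k)
		}
		sort.Strings(cols)
		parts := make([]string, 0, len(cols))
		for _, c := range cols {
			parts = append(parts, fmt.Sprintf("%s = %s", quoteIdent(c), literal(m[c])))
		}
		return strings.Join(parts, sep)
	}
	return fmt.Sprintf("ALTER TABLE %s UPDATE %s WHERE %s", table, render(sets, ", "), render(keys, " AND "))
}

func main() {
	fmt.Println(buildUpdate("db.t",
		map[string]interface{}{"name": "O'Brien"},
		map[string]interface{}{"id": 7}))
}
```

Sorting the column names, as the real builders do, keeps the generated SQL stable across runs, which also makes it testable by string comparison.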

View File

@@ -8,7 +8,6 @@ import (
"fmt"
"net"
"net/url"
"sort"
"strconv"
"strings"
"time"
@@ -205,80 +204,9 @@ func (d *DamengDB) Exec(query string) (int64, error) {
}
func (d *DamengDB) GetDatabases() ([]string, error) {
// Dameng exposes users/schemas as its "database" list, and what is visible varies by privilege.
// Aggregate several catalog queries so relying on a single view does not under-report databases.
queries := []string{
"SELECT USERNAME AS DATABASE_NAME FROM SYS.DBA_USERS ORDER BY USERNAME",
"SELECT USERNAME AS DATABASE_NAME FROM DBA_USERS ORDER BY USERNAME",
"SELECT USERNAME AS DATABASE_NAME FROM ALL_USERS ORDER BY USERNAME",
"SELECT USERNAME AS DATABASE_NAME FROM USER_USERS",
"SELECT DISTINCT OWNER AS DATABASE_NAME FROM ALL_TABLES ORDER BY OWNER",
}
seen := make(map[string]struct{})
dbs := make([]string, 0, 64)
var lastErr error
success := false
for _, q := range queries {
data, _, err := d.Query(q)
if err != nil {
lastErr = err
continue
}
success = true
for _, row := range data {
name := getDamengRowString(row, "DATABASE_NAME", "USERNAME", "OWNER", "SCHEMA_NAME")
if name == "" {
// Fall back to the first column to tolerate differences in driver-reported column names.
for _, v := range row {
text := strings.TrimSpace(fmt.Sprintf("%v", v))
if text == "" || strings.EqualFold(text, "<nil>") {
continue
}
name = text
break
}
}
if name == "" {
continue
}
key := strings.ToUpper(name)
if _, ok := seen[key]; ok {
continue
}
seen[key] = struct{}{}
dbs = append(dbs, name)
}
}
if !success && lastErr != nil {
return nil, lastErr
}
sort.Slice(dbs, func(i, j int) bool {
return strings.ToUpper(dbs[i]) < strings.ToUpper(dbs[j])
})
return dbs, nil
}
func getDamengRowString(row map[string]interface{}, keys ...string) string {
if len(row) == 0 {
return ""
}
for _, key := range keys {
for k, v := range row {
if !strings.EqualFold(strings.TrimSpace(k), strings.TrimSpace(key)) {
continue
}
text := strings.TrimSpace(fmt.Sprintf("%v", v))
if text == "" || strings.EqualFold(text, "<nil>") {
return ""
}
return text
}
}
return ""
// In this project Dameng presents schemas/owners as "databases".
// Query the current schema/user first, then aggregate visible users and owners so a privilege-restricted session does not come back empty.
return collectDamengDatabaseNames(d.Query)
}
func (d *DamengDB) GetTables(dbName string) ([]string, error) {

View File

@@ -0,0 +1,91 @@
package db
import (
"fmt"
"sort"
"strings"
)
var damengDatabaseQueries = []string{
"SELECT SYS_CONTEXT('USERENV', 'CURRENT_SCHEMA') AS DATABASE_NAME FROM DUAL",
"SELECT SYS_CONTEXT('USERENV', 'CURRENT_USER') AS DATABASE_NAME FROM DUAL",
"SELECT USERNAME AS DATABASE_NAME FROM USER_USERS",
"SELECT USERNAME AS DATABASE_NAME FROM ALL_USERS ORDER BY USERNAME",
"SELECT USERNAME AS DATABASE_NAME FROM DBA_USERS ORDER BY USERNAME",
"SELECT USERNAME AS DATABASE_NAME FROM SYS.DBA_USERS ORDER BY USERNAME",
"SELECT DISTINCT OWNER AS DATABASE_NAME FROM ALL_OBJECTS ORDER BY OWNER",
"SELECT DISTINCT OWNER AS DATABASE_NAME FROM ALL_TABLES ORDER BY OWNER",
}
type damengQueryFunc func(query string) ([]map[string]interface{}, []string, error)
func collectDamengDatabaseNames(query damengQueryFunc) ([]string, error) {
seen := make(map[string]struct{})
dbs := make([]string, 0, 64)
var lastErr error
for _, q := range damengDatabaseQueries {
data, _, err := query(q)
if err != nil {
lastErr = err
continue
}
for _, row := range data {
name := getDamengRowString(row,
"DATABASE_NAME",
"USERNAME",
"OWNER",
"SCHEMA_NAME",
"CURRENT_SCHEMA",
"CURRENT_USER",
)
if name == "" {
for _, v := range row {
text := strings.TrimSpace(fmt.Sprintf("%v", v))
if text == "" || strings.EqualFold(text, "<nil>") {
continue
}
name = text
break
}
}
if name == "" {
continue
}
key := strings.ToUpper(name)
if _, ok := seen[key]; ok {
continue
}
seen[key] = struct{}{}
dbs = append(dbs, name)
}
}
if len(dbs) == 0 && lastErr != nil {
return nil, lastErr
}
sort.Slice(dbs, func(i, j int) bool {
return strings.ToUpper(dbs[i]) < strings.ToUpper(dbs[j])
})
return dbs, nil
}
func getDamengRowString(row map[string]interface{}, keys ...string) string {
if len(row) == 0 {
return ""
}
for _, key := range keys {
for k, v := range row {
if !strings.EqualFold(strings.TrimSpace(k), strings.TrimSpace(key)) {
continue
}
text := strings.TrimSpace(fmt.Sprintf("%v", v))
if text == "" || strings.EqualFold(text, "<nil>") {
return ""
}
return text
}
}
return ""
}

View File

@@ -0,0 +1,73 @@
package db
import (
"errors"
"reflect"
"testing"
)
func TestCollectDamengDatabaseNames_UsesCurrentSchemaFallback(t *testing.T) {
t.Parallel()
got, err := collectDamengDatabaseNames(func(query string) ([]map[string]interface{}, []string, error) {
switch query {
case damengDatabaseQueries[0]:
return []map[string]interface{}{{"DATABASE_NAME": "APP_SCHEMA"}}, nil, nil
case damengDatabaseQueries[1]:
return []map[string]interface{}{{"DATABASE_NAME": "app_schema"}}, nil, nil
default:
return nil, nil, errors.New("permission denied")
}
})
if err != nil {
t.Fatalf("collectDamengDatabaseNames returned an error: %v", err)
}
want := []string{"APP_SCHEMA"}
if !reflect.DeepEqual(got, want) {
t.Fatalf("unexpected database names, got=%v want=%v", got, want)
}
}
func TestCollectDamengDatabaseNames_CollectsOwnersWhenVisible(t *testing.T) {
t.Parallel()
got, err := collectDamengDatabaseNames(func(query string) ([]map[string]interface{}, []string, error) {
switch query {
case damengDatabaseQueries[0], damengDatabaseQueries[1], damengDatabaseQueries[2], damengDatabaseQueries[3], damengDatabaseQueries[4], damengDatabaseQueries[5]:
return []map[string]interface{}{}, nil, nil
case damengDatabaseQueries[6]:
return []map[string]interface{}{{"OWNER": "BIZ"}, {"OWNER": "audit"}}, nil, nil
case damengDatabaseQueries[7]:
return []map[string]interface{}{{"OWNER": "BIZ"}}, nil, nil
default:
return nil, nil, nil
}
})
if err != nil {
t.Fatalf("collectDamengDatabaseNames returned an error: %v", err)
}
want := []string{"audit", "BIZ"}
if !reflect.DeepEqual(got, want) {
t.Fatalf("unexpected database names, got=%v want=%v", got, want)
}
}
func TestCollectDamengDatabaseNames_ReturnsErrorWhenNoNameResolved(t *testing.T) {
t.Parallel()
expectErr := errors.New("last query failed")
got, err := collectDamengDatabaseNames(func(query string) ([]map[string]interface{}, []string, error) {
if query == damengDatabaseQueries[len(damengDatabaseQueries)-1] {
return nil, nil, expectErr
}
return nil, nil, errors.New("permission denied")
})
if err == nil {
t.Fatalf("expected an error, got=%v", got)
}
if !errors.Is(err, expectErr) {
t.Fatalf("unexpected error: %v", err)
}
}

View File

@@ -250,12 +250,22 @@ func (m *MariaDB) GetIndexes(dbName, tableName string) ([]connection.IndexDefini
}
}
subPart := 0
if val, ok := row["Sub_part"]; ok && val != nil {
if f, ok := val.(float64); ok {
subPart = int(f)
} else if i, ok := val.(int64); ok {
subPart = int(i)
}
}
idx := connection.IndexDefinition{
Name: fmt.Sprintf("%v", row["Key_name"]),
ColumnName: fmt.Sprintf("%v", row["Column_name"]),
NonUnique: nonUnique,
SeqInIndex: seq,
IndexType: fmt.Sprintf("%v", row["Index_type"]),
SubPart: subPart,
}
indexes = append(indexes, idx)
}
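The `Sub_part` value can arrive as `float64` or `int64` depending on how the driver decoded the row, so both branches are needed before it becomes an `int`. A tiny sketch of that coercion (`toInt` is an illustrative name):

```go
package main

import "fmt"

// toInt coerces a driver-decoded numeric value: float64 (e.g. after JSON
// decoding) and int64 (native) are converted; anything else, including
// nil, falls back to zero.
func toInt(val interface{}) int {
	switch v := val.(type) {
	case float64:
		return int(v)
	case int64:
		return int(v)
	default:
		return 0
	}
}

func main() {
	fmt.Println(toInt(float64(10)), toInt(int64(8)), toInt(nil)) // 10 8 0
}
```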
@@ -323,7 +333,7 @@ func (m *MariaDB) ApplyChanges(tableName string, changes connection.ChangeSet) e
var args []interface{}
for k, v := range pk {
wheres = append(wheres, fmt.Sprintf("`%s` = ?", k))
args = append(args, normalizeMySQLDateTimeValue(v))
args = append(args, normalizeMySQLComplexValue(normalizeMySQLDateTimeValue(v)))
}
if len(wheres) == 0 {
continue
@@ -341,7 +351,7 @@ func (m *MariaDB) ApplyChanges(tableName string, changes connection.ChangeSet) e
for k, v := range update.Values {
sets = append(sets, fmt.Sprintf("`%s` = ?", k))
args = append(args, normalizeMySQLDateTimeValue(v))
args = append(args, normalizeMySQLComplexValue(normalizeMySQLDateTimeValue(v)))
}
if len(sets) == 0 {
@@ -351,7 +361,7 @@ func (m *MariaDB) ApplyChanges(tableName string, changes connection.ChangeSet) e
var wheres []string
for k, v := range update.Keys {
wheres = append(wheres, fmt.Sprintf("`%s` = ?", k))
args = append(args, normalizeMySQLDateTimeValue(v))
args = append(args, normalizeMySQLComplexValue(normalizeMySQLDateTimeValue(v)))
}
if len(wheres) == 0 {
@@ -373,7 +383,7 @@ func (m *MariaDB) ApplyChanges(tableName string, changes connection.ChangeSet) e
for k, v := range row {
cols = append(cols, fmt.Sprintf("`%s`", k))
placeholders = append(placeholders, "?")
args = append(args, normalizeMySQLDateTimeValue(v))
args = append(args, normalizeMySQLComplexValue(normalizeMySQLDateTimeValue(v)))
}
if len(cols) == 0 {

View File

@@ -251,6 +251,11 @@ func (m *MongoDB) getURI(config connection.ConnectionConfig) string {
params.Set("authMechanism", authMechanism)
}
// In standalone mode with no replica set name, enable directConnection so the driver does not auto-follow replica set member discovery
if strings.TrimSpace(config.Topology) != "replica" && strings.TrimSpace(config.ReplicaSet) == "" && !config.MongoSRV {
params.Set("directConnection", "true")
}
if encoded := params.Encode(); encoded != "" {
uri += "?" + encoded
}
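The guard above appends `directConnection=true` only when all three standalone conditions hold. A self-contained sketch of the query-string assembly; the parameter names mirror the config fields but are assumptions here:

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// buildMongoURI appends directConnection=true only for a standalone host:
// topology is not "replica", no replica set name is given, and SRV lookup
// is off.
func buildMongoURI(host string, port int, topology, replicaSet string, srv bool) string {
	uri := fmt.Sprintf("mongodb://%s:%d/", host, port)
	params := url.Values{}
	if strings.TrimSpace(topology) != "replica" && strings.TrimSpace(replicaSet) == "" && !srv {
		params.Set("directConnection", "true")
	}
	if encoded := params.Encode(); encoded != "" {
		uri += "?" + encoded
	}
	return uri
}

func main() {
	fmt.Println(buildMongoURI("127.0.0.1", 27017, "", "", false))
	fmt.Println(buildMongoURI("127.0.0.1", 27017, "replica", "rs0", false))
}
```

Without this flag the official driver treats a single seed host as a replica set seed list and may redirect to discovered members, which is exactly the behavior the change avoids for standalone servers.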

View File

@@ -252,6 +252,11 @@ func (m *MongoDBV1) getURI(config connection.ConnectionConfig) string {
params.Set("authMechanism", authMechanism)
}
// In standalone mode with no replica set name, enable directConnection so the driver does not auto-follow replica set member discovery
if strings.TrimSpace(config.Topology) != "replica" && strings.TrimSpace(config.ReplicaSet) == "" && !config.MongoSRV {
params.Set("directConnection", "true")
}
if encoded := params.Encode(); encoded != "" {
uri += "?" + encoded
}

View File

@@ -3,6 +3,7 @@ package db
import (
"context"
"database/sql"
"encoding/json"
"fmt"
"net/url"
"strconv"
@@ -441,12 +442,22 @@ func (m *MySQLDB) GetIndexes(dbName, tableName string) ([]connection.IndexDefini
}
}
subPart := 0
if val, ok := row["Sub_part"]; ok && val != nil {
if f, ok := val.(float64); ok {
subPart = int(f)
} else if i, ok := val.(int64); ok {
subPart = int(i)
}
}
idx := connection.IndexDefinition{
Name: fmt.Sprintf("%v", row["Key_name"]),
ColumnName: fmt.Sprintf("%v", row["Column_name"]),
NonUnique: nonUnique,
SeqInIndex: seq,
IndexType: fmt.Sprintf("%v", row["Index_type"]),
SubPart: subPart,
}
indexes = append(indexes, idx)
}
@@ -606,6 +617,18 @@ func (m *MySQLDB) ApplyChanges(tableName string, changes connection.ChangeSet) e
return tx.Commit()
}
func normalizeMySQLComplexValue(value interface{}) interface{} {
switch v := value.(type) {
case map[string]interface{}, []interface{}:
if data, err := json.Marshal(v); err == nil {
return string(data)
}
return fmt.Sprintf("%v", value)
default:
return value
}
}
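`normalizeMySQLComplexValue` exists because decoded result rows can contain `map[string]interface{}` or `[]interface{}` values, which MySQL drivers cannot bind as placeholder arguments. A standalone sketch of the same idea:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// normalizeComplexValue re-serializes maps and slices to a JSON string so
// they can be bound as a MySQL placeholder argument; scalars pass through
// unchanged.
func normalizeComplexValue(value interface{}) interface{} {
	switch v := value.(type) {
	case map[string]interface{}, []interface{}:
		if data, err := json.Marshal(v); err == nil {
			return string(data)
		}
		return fmt.Sprintf("%v", value)
	default:
		return value
	}
}

func main() {
	fmt.Println(normalizeComplexValue(map[string]interface{}{"a": 1}))
	fmt.Println(normalizeComplexValue([]interface{}{1, "x"}))
	fmt.Println(normalizeComplexValue(42))
}
```

This is why the diff wraps every `args = append(...)` site with the normalizer: JSON columns edited in the grid round-trip as strings instead of failing at bind time.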
func normalizeMySQLDateTimeValue(value interface{}) interface{} {
text, ok := value.(string)
if !ok {
@@ -670,7 +693,7 @@ func (m *MySQLDB) loadColumnTypeMap(tableName string) map[string]string {
func normalizeMySQLValueForInsert(columnName string, value interface{}, columnTypeMap map[string]string) (interface{}, bool) {
columnType := strings.ToLower(strings.TrimSpace(columnTypeMap[strings.ToLower(strings.TrimSpace(columnName))]))
if !isMySQLTemporalColumnType(columnType) {
return value, false
return normalizeMySQLComplexValue(value), false
}
text, ok := value.(string)
if ok && strings.TrimSpace(text) == "" {

View File

@@ -1,4 +1,4 @@
package db
package db
import (
"encoding/hex"

View File

@@ -0,0 +1,168 @@
//go:build gonavi_full_drivers || gonavi_tdengine_driver
package db
import (
"context"
"database/sql"
"database/sql/driver"
"fmt"
"strings"
"sync"
"testing"
"GoNavi-Wails/internal/connection"
)
const tdengineRecordingDriverName = "gonavi_tdengine_recording"
var (
registerTDengineRecordingDriverOnce sync.Once
tdengineRecordingDriverMu sync.Mutex
tdengineRecordingDriverSeq int
tdengineRecordingDriverStates = map[string]*tdengineRecordingState{}
)
type tdengineRecordingState struct {
mu sync.Mutex
queries []string
execErr error
}
func (s *tdengineRecordingState) snapshotQueries() []string {
s.mu.Lock()
defer s.mu.Unlock()
queries := make([]string, len(s.queries))
copy(queries, s.queries)
return queries
}
type tdengineRecordingDriver struct{}
func (tdengineRecordingDriver) Open(name string) (driver.Conn, error) {
tdengineRecordingDriverMu.Lock()
state := tdengineRecordingDriverStates[name]
tdengineRecordingDriverMu.Unlock()
if state == nil {
return nil, fmt.Errorf("recording state not found: %s", name)
}
return &tdengineRecordingConn{state: state}, nil
}
type tdengineRecordingConn struct {
state *tdengineRecordingState
}
func (c *tdengineRecordingConn) Prepare(query string) (driver.Stmt, error) {
return nil, fmt.Errorf("prepare not supported in tdengine recording driver: %s", query)
}
func (c *tdengineRecordingConn) Close() error { return nil }
func (c *tdengineRecordingConn) Begin() (driver.Tx, error) {
return nil, fmt.Errorf("transactions not supported in tdengine recording driver")
}
func (c *tdengineRecordingConn) ExecContext(_ context.Context, query string, args []driver.NamedValue) (driver.Result, error) {
if len(args) > 0 {
return nil, fmt.Errorf("unexpected exec args: %d", len(args))
}
c.state.mu.Lock()
defer c.state.mu.Unlock()
if c.state.execErr != nil {
return nil, c.state.execErr
}
c.state.queries = append(c.state.queries, query)
return driver.RowsAffected(1), nil
}
var _ driver.ExecerContext = (*tdengineRecordingConn)(nil)
func openTDengineRecordingDB(t *testing.T) (*sql.DB, *tdengineRecordingState) {
t.Helper()
registerTDengineRecordingDriverOnce.Do(func() {
sql.Register(tdengineRecordingDriverName, tdengineRecordingDriver{})
})
tdengineRecordingDriverMu.Lock()
tdengineRecordingDriverSeq++
dsn := fmt.Sprintf("tdengine-recording-%d", tdengineRecordingDriverSeq)
state := &tdengineRecordingState{}
tdengineRecordingDriverStates[dsn] = state
tdengineRecordingDriverMu.Unlock()
dbConn, err := sql.Open(tdengineRecordingDriverName, dsn)
if err != nil {
t.Fatalf("failed to open recording db: %v", err)
}
t.Cleanup(func() {
_ = dbConn.Close()
tdengineRecordingDriverMu.Lock()
delete(tdengineRecordingDriverStates, dsn)
tdengineRecordingDriverMu.Unlock()
})
return dbConn, state
}
func TestTDengineApplyChanges_InsertsIntoQualifiedTable(t *testing.T) {
t.Parallel()
dbConn, state := openTDengineRecordingDB(t)
td := &TDengineDB{conn: dbConn}
changes := connection.ChangeSet{
Inserts: []map[string]interface{}{
{
"ts": "2026-03-09 10:00:00",
"value": 12.5,
"device": "sensor-a",
"enabled": true,
},
},
}
if err := td.ApplyChanges("analytics.metrics", changes); err != nil {
t.Fatalf("ApplyChanges returned an error: %v", err)
}
queries := state.snapshotQueries()
if len(queries) != 1 {
t.Fatalf("expected 1 SQL statement, got %d: %#v", len(queries), queries)
}
want := "INSERT INTO `analytics`.`metrics` (`device`, `enabled`, `ts`, `value`) VALUES ('sensor-a', 1, '2026-03-09 10:00:00', 12.5)"
if queries[0] != want {
t.Fatalf("insert SQL mismatch\nwant: %s\n got: %s", want, queries[0])
}
}
func TestTDengineApplyChanges_RejectsMixedUpdatesWithoutPartialWrite(t *testing.T) {
t.Parallel()
dbConn, state := openTDengineRecordingDB(t)
td := &TDengineDB{conn: dbConn}
changes := connection.ChangeSet{
Inserts: []map[string]interface{}{{
"ts": "2026-03-09 10:00:00",
"value": 12.5,
}},
Updates: []connection.UpdateRow{{
Keys: map[string]interface{}{"ts": "2026-03-09 10:00:00"},
Values: map[string]interface{}{"value": 18.8},
}},
}
err := td.ApplyChanges("metrics", changes)
if err == nil {
t.Fatalf("expected mixed changes to be rejected")
}
if !strings.Contains(err.Error(), "UPDATE/DELETE") {
t.Fatalf("error message does not explain the limitation: %v", err)
}
if queries := state.snapshotQueries(); len(queries) != 0 {
t.Fatalf("expected no SQL to be executed when mixed changes are rejected, got=%#v", queries)
}
}

View File

@@ -7,6 +7,7 @@ import (
"database/sql"
"fmt"
"net"
"sort"
"strconv"
"strings"
"time"
@@ -362,6 +363,83 @@ func (t *TDengineDB) GetTriggers(dbName, tableName string) ([]connection.Trigger
return []connection.TriggerDefinition{}, nil
}
func (t *TDengineDB) ApplyChanges(tableName string, changes connection.ChangeSet) error {
if t.conn == nil {
return fmt.Errorf("connection not open")
}
if strings.TrimSpace(tableName) == "" {
return fmt.Errorf("table name required")
}
if len(changes.Updates) > 0 || len(changes.Deletes) > 0 {
return fmt.Errorf("TDengine targets currently support INSERT writes only; UPDATE/DELETE diff sync is not supported. Use insert-only or full-overwrite mode instead")
}
qualifiedTable := quoteTDengineTable("", tableName)
for _, row := range changes.Inserts {
query, err := buildTDengineInsertSQL(qualifiedTable, row)
if err != nil {
return err
}
if query == "" {
continue
}
if _, err := t.conn.Exec(query); err != nil {
return fmt.Errorf("insert error: %v; sql=%s", err, query)
}
}
return nil
}
func buildTDengineInsertSQL(qualifiedTable string, row map[string]interface{}) (string, error) {
if strings.TrimSpace(qualifiedTable) == "" {
return "", fmt.Errorf("qualified table required")
}
if len(row) == 0 {
return "", nil
}
cols := make([]string, 0, len(row))
for key := range row {
if strings.TrimSpace(key) == "" {
continue
}
cols = append(cols, key)
}
if len(cols) == 0 {
return "", nil
}
sort.Strings(cols)
quotedCols := make([]string, 0, len(cols))
values := make([]string, 0, len(cols))
for _, col := range cols {
quotedCols = append(quotedCols, fmt.Sprintf("`%s`", escapeBacktickIdent(col)))
values = append(values, tdengineLiteral(row[col]))
}
return fmt.Sprintf("INSERT INTO %s (%s) VALUES (%s)", qualifiedTable, strings.Join(quotedCols, ", "), strings.Join(values, ", ")), nil
}
func tdengineLiteral(value interface{}) string {
switch val := value.(type) {
case nil:
return "NULL"
case bool:
if val {
return "1"
}
return "0"
case int, int8, int16, int32, int64, uint, uint8, uint16, uint32, uint64, float32, float64:
return fmt.Sprintf("%v", val)
case time.Time:
return fmt.Sprintf("'%s'", val.Format("2006-01-02 15:04:05"))
case []byte:
return fmt.Sprintf("'%s'", strings.ReplaceAll(string(val), "'", "''"))
default:
return fmt.Sprintf("'%s'", strings.ReplaceAll(fmt.Sprintf("%v", val), "'", "''"))
}
}
func getValueFromRow(row map[string]interface{}, keys ...string) (interface{}, bool) {
if len(row) == 0 {
return nil, false

View File

@@ -1,22 +1,27 @@
package sync
import (
"GoNavi-Wails/internal/db"
"GoNavi-Wails/internal/logger"
"fmt"
"strings"
)
type TableDiffSummary struct {
Table string `json:"table"`
PKColumn string `json:"pkColumn,omitempty"`
CanSync bool `json:"canSync"`
Inserts int `json:"inserts"`
Updates int `json:"updates"`
Deletes int `json:"deletes"`
Same int `json:"same"`
Message string `json:"message,omitempty"`
HasSchema bool `json:"hasSchema,omitempty"`
Table string `json:"table"`
PKColumn string `json:"pkColumn,omitempty"`
CanSync bool `json:"canSync"`
Inserts int `json:"inserts"`
Updates int `json:"updates"`
Deletes int `json:"deletes"`
Same int `json:"same"`
Message string `json:"message,omitempty"`
HasSchema bool `json:"hasSchema,omitempty"`
TargetTableExists bool `json:"targetTableExists,omitempty"`
PlannedAction string `json:"plannedAction,omitempty"`
Warnings []string `json:"warnings,omitempty"`
UnsupportedObjects []string `json:"unsupportedObjects,omitempty"`
IndexesToCreate int `json:"indexesToCreate,omitempty"`
IndexesSkipped int `json:"indexesSkipped,omitempty"`
}
type SyncAnalyzeResult struct {
@@ -27,6 +32,12 @@ type SyncAnalyzeResult struct {
func (s *SyncEngine) Analyze(config SyncConfig) SyncAnalyzeResult {
result := SyncAnalyzeResult{Success: true, Tables: []TableDiffSummary{}}
if isRedisToMongoKeyspacePair(config) {
return s.analyzeRedisToMongo(config)
}
if isMongoToRedisKeyspacePair(config) {
return s.analyzeMongoToRedis(config)
}
contentRaw := strings.ToLower(strings.TrimSpace(config.Content))
syncSchema := false
@@ -48,25 +59,23 @@ func (s *SyncEngine) Analyze(config SyncConfig) SyncAnalyzeResult {
totalTables := len(config.Tables)
s.progress(config.JobID, 0, totalTables, "", "diff analysis started")
sourceDB, err := db.NewDatabase(config.SourceConfig.Type)
sourceDB, err := newSyncDatabase(config.SourceConfig.Type)
if err != nil {
logger.Error(err, "failed to init source database driver: type=%s", config.SourceConfig.Type)
return SyncAnalyzeResult{Success: false, Message: "failed to init source database driver: " + err.Error()}
}
targetDB, err := db.NewDatabase(config.TargetConfig.Type)
targetDB, err := newSyncDatabase(config.TargetConfig.Type)
if err != nil {
logger.Error(err, "failed to init target database driver: type=%s", config.TargetConfig.Type)
return SyncAnalyzeResult{Success: false, Message: "failed to init target database driver: " + err.Error()}
}
// Connect Source
if err := sourceDB.Connect(config.SourceConfig); err != nil {
logger.Error(err, "source database connection failed: %s", formatConnSummaryForSync(config.SourceConfig))
return SyncAnalyzeResult{Success: false, Message: "source database connection failed: " + err.Error()}
}
defer sourceDB.Close()
// Connect Target
if err := targetDB.Connect(config.TargetConfig); err != nil {
logger.Error(err, "target database connection failed: %s", formatConnSummaryForSync(config.TargetConfig))
return SyncAnalyzeResult{Success: false, Message: "target database connection failed: " + err.Error()}
@@ -88,51 +97,76 @@ func (s *SyncEngine) Analyze(config SyncConfig) SyncAnalyzeResult {
HasSchema: syncSchema,
}
sourceSchema, sourceTable := normalizeSchemaAndTable(config.SourceConfig.Type, config.SourceConfig.Database, tableName)
targetSchema, targetTable := normalizeSchemaAndTable(config.TargetConfig.Type, config.TargetConfig.Database, tableName)
sourceQueryTable := qualifiedNameForQuery(config.SourceConfig.Type, sourceSchema, sourceTable, tableName)
targetQueryTable := qualifiedNameForQuery(config.TargetConfig.Type, targetSchema, targetTable, tableName)
cols, err := sourceDB.GetColumns(sourceSchema, sourceTable)
plan, cols, _, err := buildSchemaMigrationPlan(config, tableName, sourceDB, targetDB)
if err != nil {
summary.Message = "failed to read source table columns: " + err.Error()
summary.Message = err.Error()
result.Tables = append(result.Tables, summary)
return
}
summary.TargetTableExists = plan.TargetTableExists
summary.PlannedAction = plan.PlannedAction
summary.Warnings = append(summary.Warnings, plan.Warnings...)
summary.UnsupportedObjects = append(summary.UnsupportedObjects, plan.UnsupportedObjects...)
summary.IndexesToCreate = plan.IndexesToCreate
summary.IndexesSkipped = plan.IndexesSkipped
if !plan.TargetTableExists && !plan.AutoCreate {
summary.Message = firstNonEmpty(plan.PlannedAction, "target table does not exist; sync cannot run")
result.Tables = append(result.Tables, summary)
return
}
if !syncData {
summary.CanSync = true
summary.Message = "schema-only sync; data diff analysis was not performed"
summary.Message = firstNonEmpty(plan.PlannedAction, "schema-only sync; data diff analysis was not performed")
result.Tables = append(result.Tables, summary)
return
}
tableMode := normalizeSyncMode(config.Mode)
pkCols := make([]string, 0, 2)
for _, c := range cols {
if c.Key == "PRI" || c.Key == "PK" {
pkCols = append(pkCols, c.Name)
}
}
if len(pkCols) == 0 {
summary.Message = "no primary key; data diff/sync is not supported"
result.Tables = append(result.Tables, summary)
return
}
if len(pkCols) > 1 {
summary.Message = fmt.Sprintf("composite primary key (%s) is not yet supported for data diff/sync", strings.Join(pkCols, ","))
result.Tables = append(result.Tables, summary)
return
}
summary.PKColumn = pkCols[0]
// Query data for diff
sourceRows, _, err := sourceDB.Query(fmt.Sprintf("SELECT * FROM %s", quoteQualifiedIdentByType(config.SourceConfig.Type, sourceQueryTable)))
sourceRows, _, err := sourceDB.Query(fmt.Sprintf("SELECT * FROM %s", quoteQualifiedIdentByType(config.SourceConfig.Type, plan.SourceQueryTable)))
if err != nil {
summary.Message = "failed to read source table: " + err.Error()
result.Tables = append(result.Tables, summary)
return
}
targetRows, _, err := targetDB.Query(fmt.Sprintf("SELECT * FROM %s", quoteQualifiedIdentByType(config.TargetConfig.Type, targetQueryTable)))
if !plan.TargetTableExists && plan.AutoCreate {
summary.CanSync = true
summary.Inserts = len(sourceRows)
summary.Message = firstNonEmpty(plan.PlannedAction, "target table does not exist; it will be auto-created and all source data imported on execution")
result.Tables = append(result.Tables, summary)
return
}
if tableMode != "insert_update" {
summary.CanSync = true
summary.Inserts = len(sourceRows)
summary.Message = firstNonEmpty(plan.PlannedAction, "current mode needs no diff comparison; source rows will be imported as-is")
result.Tables = append(result.Tables, summary)
return
}
if len(pkCols) == 0 {
summary.Message = "no primary key; diff-based sync is not supported. For direct import use insert-only or full-overwrite mode"
result.Tables = append(result.Tables, summary)
return
}
if len(pkCols) > 1 {
summary.Message = fmt.Sprintf("composite primary key (%s) is not yet supported for diff-based sync", strings.Join(pkCols, ","))
result.Tables = append(result.Tables, summary)
return
}
summary.PKColumn = pkCols[0]
targetRows, _, err := targetDB.Query(fmt.Sprintf("SELECT * FROM %s", quoteQualifiedIdentByType(config.TargetConfig.Type, plan.TargetQueryTable)))
if err != nil {
summary.Message = "failed to read target table: " + err.Error()
result.Tables = append(result.Tables, summary)
@@ -188,6 +222,9 @@ func (s *SyncEngine) Analyze(config SyncConfig) SyncAnalyzeResult {
}
summary.CanSync = true
if strings.TrimSpace(summary.Message) == "" {
summary.Message = firstNonEmpty(plan.PlannedAction, "diff analysis complete")
}
result.Tables = append(result.Tables, summary)
}()
}
@@ -196,3 +233,12 @@ func (s *SyncEngine) Analyze(config SyncConfig) SyncAnalyzeResult {
result.Message = fmt.Sprintf("diff analysis completed for %d table(s)", len(result.Tables))
return result
}
func firstNonEmpty(values ...string) string {
for _, value := range values {
if strings.TrimSpace(value) != "" {
return value
}
}
return ""
}
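`firstNonEmpty` is the small helper the analyzer uses to prefer a plan's `PlannedAction` over generic fallback text. A standalone sketch with example inputs:

```go
package main

import (
	"fmt"
	"strings"
)

// firstNonEmpty returns the first argument whose trimmed value is
// non-empty, so a specific planned action can override a generic fallback
// message; blank and whitespace-only strings are skipped.
func firstNonEmpty(values ...string) string {
	for _, value := range values {
		if strings.TrimSpace(value) != "" {
			return value
		}
	}
	return ""
}

func main() {
	fmt.Println(firstNonEmpty("", "   ", "auto-create table then import"))
	fmt.Println(firstNonEmpty("", "diff analysis complete"))
}
```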

View File

@@ -0,0 +1,741 @@
package sync
import (
"GoNavi-Wails/internal/connection"
"GoNavi-Wails/internal/db"
"fmt"
"regexp"
"strings"
)
func buildMySQLToClickHousePlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
plan := SchemaMigrationPlan{}
plan.SourceSchema, plan.SourceTable = normalizeSchemaAndTable(config.SourceConfig.Type, config.SourceConfig.Database, tableName)
plan.TargetSchema, plan.TargetTable = normalizeSchemaAndTable(config.TargetConfig.Type, config.TargetConfig.Database, tableName)
plan.SourceQueryTable = qualifiedNameForQuery(config.SourceConfig.Type, plan.SourceSchema, plan.SourceTable, tableName)
plan.TargetQueryTable = qualifiedNameForQuery(config.TargetConfig.Type, plan.TargetSchema, plan.TargetTable, tableName)
plan.PlannedAction = "import into existing target table"
sourceCols, sourceExists, err := inspectTableColumns(sourceDB, plan.SourceSchema, plan.SourceTable)
if err != nil {
return plan, nil, nil, fmt.Errorf("failed to read source table columns: %w", err)
}
if !sourceExists {
return plan, nil, nil, fmt.Errorf("source table does not exist or has no column definitions: %s", tableName)
}
targetCols, targetExists, err := inspectTableColumns(targetDB, plan.TargetSchema, plan.TargetTable)
if err != nil {
return plan, sourceCols, nil, fmt.Errorf("failed to read target table columns: %w", err)
}
plan.TargetTableExists = targetExists
strategy := normalizeTargetTableStrategy(config.TargetTableStrategy)
if targetExists {
missing := diffMissingColumnNames(sourceCols, targetCols)
if len(missing) > 0 {
plan.Warnings = append(plan.Warnings, fmt.Sprintf("target table is missing %d column(s): %s", len(missing), strings.Join(missing, ", ")))
}
if config.AutoAddColumns {
addSQL, addWarnings := buildMySQLToClickHouseAddColumnSQL(plan.TargetQueryTable, sourceCols, targetCols)
plan.PreDataSQL = append(plan.PreDataSQL, addSQL...)
plan.Warnings = append(plan.Warnings, addWarnings...)
if len(addSQL) > 0 {
plan.PlannedAction = fmt.Sprintf("add %d missing column(s), then import", len(addSQL))
}
}
plan.Warnings = append(plan.Warnings, "for ClickHouse targets, prefer insert-only or full-overwrite; UPDATE/DELETE semantics differ from traditional relational databases")
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
switch strategy {
case "existing_only":
plan.PlannedAction = "target table does not exist; create it manually first"
plan.Warnings = append(plan.Warnings, "the current strategy requires an existing target table; no auto-create will run at execution time")
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
case "smart", "auto_create_if_missing":
plan.AutoCreate = true
plan.PlannedAction = "target table does not exist; it will be auto-created before import"
createSQL, warnings, unsupported := buildMySQLToClickHouseCreateTableSQL(plan.TargetQueryTable, sourceCols)
plan.CreateTableSQL = createSQL
plan.Warnings = append(plan.Warnings, warnings...)
plan.UnsupportedObjects = append(plan.UnsupportedObjects, unsupported...)
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
default:
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
}
func buildPGLikeToClickHousePlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
plan := SchemaMigrationPlan{}
sourceType := resolveMigrationDBType(config.SourceConfig)
targetType := resolveMigrationDBType(config.TargetConfig)
plan.SourceSchema, plan.SourceTable = normalizeSchemaAndTable(sourceType, config.SourceConfig.Database, tableName)
plan.TargetSchema, plan.TargetTable = normalizeSchemaAndTable(targetType, config.TargetConfig.Database, tableName)
plan.SourceQueryTable = qualifiedNameForQuery(sourceType, plan.SourceSchema, plan.SourceTable, tableName)
plan.TargetQueryTable = qualifiedNameForQuery(targetType, plan.TargetSchema, plan.TargetTable, tableName)
plan.PlannedAction = "import into existing target table"
sourceCols, sourceExists, err := inspectTableColumns(sourceDB, plan.SourceSchema, plan.SourceTable)
if err != nil {
return plan, nil, nil, fmt.Errorf("failed to read source table columns: %w", err)
}
if !sourceExists {
return plan, nil, nil, fmt.Errorf("source table does not exist or has no column definitions: %s", tableName)
}
targetCols, targetExists, err := inspectTableColumns(targetDB, plan.TargetSchema, plan.TargetTable)
if err != nil {
return plan, sourceCols, nil, fmt.Errorf("failed to read target table columns: %w", err)
}
plan.TargetTableExists = targetExists
strategy := normalizeTargetTableStrategy(config.TargetTableStrategy)
if targetExists {
missing := diffMissingColumnNames(sourceCols, targetCols)
if len(missing) > 0 {
plan.Warnings = append(plan.Warnings, fmt.Sprintf("target table is missing %d column(s): %s", len(missing), strings.Join(missing, ", ")))
}
if config.AutoAddColumns {
addSQL, addWarnings := buildPGLikeToClickHouseAddColumnSQL(plan.TargetQueryTable, sourceCols, targetCols)
plan.PreDataSQL = append(plan.PreDataSQL, addSQL...)
plan.Warnings = append(plan.Warnings, addWarnings...)
if len(addSQL) > 0 {
plan.PlannedAction = fmt.Sprintf("add %d missing column(s), then import", len(addSQL))
}
}
plan.Warnings = append(plan.Warnings, "for ClickHouse targets, prefer insert-only or full-overwrite; UPDATE/DELETE semantics differ from traditional relational databases")
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
switch strategy {
case "existing_only":
plan.PlannedAction = "target table does not exist; create it manually first"
plan.Warnings = append(plan.Warnings, "the current strategy requires an existing target table; no auto-create will run at execution time")
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
case "smart", "auto_create_if_missing":
plan.AutoCreate = true
plan.PlannedAction = "target table does not exist; it will be auto-created before import"
createSQL, warnings, unsupported := buildPGLikeToClickHouseCreateTableSQL(plan.TargetQueryTable, sourceCols)
plan.CreateTableSQL = createSQL
plan.Warnings = append(plan.Warnings, warnings...)
plan.UnsupportedObjects = append(plan.UnsupportedObjects, unsupported...)
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
default:
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
}
func buildClickHouseToMySQLPlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
plan := SchemaMigrationPlan{}
plan.SourceSchema, plan.SourceTable = normalizeSchemaAndTable(config.SourceConfig.Type, config.SourceConfig.Database, tableName)
plan.TargetSchema, plan.TargetTable = normalizeSchemaAndTable(config.TargetConfig.Type, config.TargetConfig.Database, tableName)
plan.SourceQueryTable = qualifiedNameForQuery(config.SourceConfig.Type, plan.SourceSchema, plan.SourceTable, tableName)
plan.TargetQueryTable = qualifiedNameForQuery(config.TargetConfig.Type, plan.TargetSchema, plan.TargetTable, tableName)
plan.PlannedAction = "使用已有目标表导入"
sourceCols, sourceExists, err := inspectTableColumns(sourceDB, plan.SourceSchema, plan.SourceTable)
if err != nil {
return plan, nil, nil, fmt.Errorf("获取源表字段失败: %w", err)
}
if !sourceExists {
return plan, nil, nil, fmt.Errorf("源表不存在或无列定义: %s", tableName)
}
targetCols, targetExists, err := inspectTableColumns(targetDB, plan.TargetSchema, plan.TargetTable)
if err != nil {
return plan, sourceCols, nil, fmt.Errorf("获取目标表字段失败: %w", err)
}
plan.TargetTableExists = targetExists
strategy := normalizeTargetTableStrategy(config.TargetTableStrategy)
if targetExists {
missing := diffMissingColumnNames(sourceCols, targetCols)
if len(missing) > 0 {
plan.Warnings = append(plan.Warnings, fmt.Sprintf("目标表缺失字段 %d 个:%s", len(missing), strings.Join(missing, ", ")))
}
if config.AutoAddColumns {
addSQL, addWarnings := buildClickHouseToMySQLAddColumnSQL(plan.TargetQueryTable, sourceCols, targetCols)
plan.PreDataSQL = append(plan.PreDataSQL, addSQL...)
plan.Warnings = append(plan.Warnings, addWarnings...)
if len(addSQL) > 0 {
plan.PlannedAction = fmt.Sprintf("补齐缺失字段(%d)后导入", len(addSQL))
}
}
plan.Warnings = append(plan.Warnings, "ClickHouse 源端索引/约束元数据有限,反向迁移将以字段和数据为主")
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
switch strategy {
case "existing_only":
plan.PlannedAction = "目标表不存在,需先手工创建"
plan.Warnings = append(plan.Warnings, "当前策略要求目标表已存在,执行时不会自动建表")
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
case "smart", "auto_create_if_missing":
plan.AutoCreate = true
plan.PlannedAction = "目标表不存在,将自动建表后导入"
createSQL, warnings := buildClickHouseToMySQLCreateTableSQL(plan.TargetQueryTable, sourceCols)
plan.CreateTableSQL = createSQL
plan.Warnings = append(plan.Warnings, warnings...)
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
default:
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
}
func buildClickHouseToPGLikePlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
plan := SchemaMigrationPlan{}
sourceType := resolveMigrationDBType(config.SourceConfig)
targetType := resolveMigrationDBType(config.TargetConfig)
plan.SourceSchema, plan.SourceTable = normalizeSchemaAndTable(sourceType, config.SourceConfig.Database, tableName)
plan.TargetSchema, plan.TargetTable = normalizeSchemaAndTable(targetType, config.TargetConfig.Database, tableName)
plan.SourceQueryTable = qualifiedNameForQuery(sourceType, plan.SourceSchema, plan.SourceTable, tableName)
plan.TargetQueryTable = qualifiedNameForQuery(targetType, plan.TargetSchema, plan.TargetTable, tableName)
plan.PlannedAction = "使用已有目标表导入"
sourceCols, sourceExists, err := inspectTableColumns(sourceDB, plan.SourceSchema, plan.SourceTable)
if err != nil {
return plan, nil, nil, fmt.Errorf("获取源表字段失败: %w", err)
}
if !sourceExists {
return plan, nil, nil, fmt.Errorf("源表不存在或无列定义: %s", tableName)
}
targetCols, targetExists, err := inspectTableColumns(targetDB, plan.TargetSchema, plan.TargetTable)
if err != nil {
return plan, sourceCols, nil, fmt.Errorf("获取目标表字段失败: %w", err)
}
plan.TargetTableExists = targetExists
strategy := normalizeTargetTableStrategy(config.TargetTableStrategy)
if targetExists {
missing := diffMissingColumnNames(sourceCols, targetCols)
if len(missing) > 0 {
plan.Warnings = append(plan.Warnings, fmt.Sprintf("目标表缺失字段 %d 个:%s", len(missing), strings.Join(missing, ", ")))
}
if config.AutoAddColumns {
addSQL, addWarnings := buildClickHouseToPGLikeAddColumnSQL(targetType, plan.TargetQueryTable, sourceCols, targetCols)
plan.PreDataSQL = append(plan.PreDataSQL, addSQL...)
plan.Warnings = append(plan.Warnings, addWarnings...)
if len(addSQL) > 0 {
plan.PlannedAction = fmt.Sprintf("补齐缺失字段(%d)后导入", len(addSQL))
}
}
plan.Warnings = append(plan.Warnings, "ClickHouse 源端索引/约束元数据有限,反向迁移将以字段和数据为主")
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
switch strategy {
case "existing_only":
plan.PlannedAction = "目标表不存在,需先手工创建"
plan.Warnings = append(plan.Warnings, "当前策略要求目标表已存在,执行时不会自动建表")
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
case "smart", "auto_create_if_missing":
plan.AutoCreate = true
plan.PlannedAction = "目标表不存在,将自动建表后导入"
createSQL, warnings, unsupported := buildClickHouseToPGLikeCreateTableSQL(targetType, plan.TargetQueryTable, sourceCols)
plan.CreateTableSQL = createSQL
plan.Warnings = append(plan.Warnings, warnings...)
plan.UnsupportedObjects = append(plan.UnsupportedObjects, unsupported...)
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
default:
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
}
func buildPGLikeToClickHouseAddColumnSQL(targetQueryTable string, sourceCols, targetCols []connection.ColumnDefinition) ([]string, []string) {
targetSet := make(map[string]struct{}, len(targetCols))
for _, col := range targetCols {
key := strings.ToLower(strings.TrimSpace(col.Name))
if key == "" {
continue
}
targetSet[key] = struct{}{}
}
var sqlList []string
var warnings []string
for _, col := range sourceCols {
key := strings.ToLower(strings.TrimSpace(col.Name))
if key == "" {
continue
}
if _, ok := targetSet[key]; ok {
continue
}
colType, mapWarnings := mapPGLikeColumnToClickHouse(col)
warnings = append(warnings, mapWarnings...)
sqlList = append(sqlList, fmt.Sprintf("ALTER TABLE %s ADD COLUMN %s %s",
quoteQualifiedIdentByType("clickhouse", targetQueryTable),
quoteIdentByType("clickhouse", col.Name),
colType,
))
}
return sqlList, dedupeStrings(warnings)
}
func buildMySQLToClickHouseAddColumnSQL(targetQueryTable string, sourceCols, targetCols []connection.ColumnDefinition) ([]string, []string) {
targetSet := make(map[string]struct{}, len(targetCols))
for _, col := range targetCols {
key := strings.ToLower(strings.TrimSpace(col.Name))
if key == "" {
continue
}
targetSet[key] = struct{}{}
}
var sqlList []string
var warnings []string
for _, col := range sourceCols {
key := strings.ToLower(strings.TrimSpace(col.Name))
if key == "" {
continue
}
if _, ok := targetSet[key]; ok {
continue
}
colType, mapWarnings := mapMySQLColumnToClickHouse(col)
warnings = append(warnings, mapWarnings...)
sqlList = append(sqlList, fmt.Sprintf("ALTER TABLE %s ADD COLUMN %s %s",
quoteQualifiedIdentByType("clickhouse", targetQueryTable),
quoteIdentByType("clickhouse", col.Name),
colType,
))
}
return sqlList, dedupeStrings(warnings)
}
func buildClickHouseToPGLikeAddColumnSQL(targetType string, targetQueryTable string, sourceCols, targetCols []connection.ColumnDefinition) ([]string, []string) {
targetSet := make(map[string]struct{}, len(targetCols))
for _, col := range targetCols {
key := strings.ToLower(strings.TrimSpace(col.Name))
if key == "" {
continue
}
targetSet[key] = struct{}{}
}
var sqlList []string
var warnings []string
for _, col := range sourceCols {
key := strings.ToLower(strings.TrimSpace(col.Name))
if key == "" {
continue
}
if _, ok := targetSet[key]; ok {
continue
}
colType, mapWarnings := mapClickHouseColumnToPGLike(col)
warnings = append(warnings, mapWarnings...)
sqlList = append(sqlList, fmt.Sprintf("ALTER TABLE %s ADD COLUMN %s %s NULL",
quoteQualifiedIdentByType(targetType, targetQueryTable),
quoteIdentByType(targetType, col.Name),
colType,
))
}
return sqlList, dedupeStrings(warnings)
}
func buildClickHouseToMySQLAddColumnSQL(targetQueryTable string, sourceCols, targetCols []connection.ColumnDefinition) ([]string, []string) {
targetSet := make(map[string]struct{}, len(targetCols))
for _, col := range targetCols {
key := strings.ToLower(strings.TrimSpace(col.Name))
if key == "" {
continue
}
targetSet[key] = struct{}{}
}
var sqlList []string
var warnings []string
for _, col := range sourceCols {
key := strings.ToLower(strings.TrimSpace(col.Name))
if key == "" {
continue
}
if _, ok := targetSet[key]; ok {
continue
}
colType, mapWarnings := mapClickHouseColumnToMySQL(col)
warnings = append(warnings, mapWarnings...)
sqlList = append(sqlList, fmt.Sprintf("ALTER TABLE %s ADD COLUMN %s %s NULL",
quoteQualifiedIdentByType("mysql", targetQueryTable),
quoteIdentByType("mysql", col.Name),
colType,
))
}
return sqlList, dedupeStrings(warnings)
}
func buildPGLikeToClickHouseCreateTableSQL(targetQueryTable string, sourceCols []connection.ColumnDefinition) (string, []string, []string) {
columnDefs := make([]string, 0, len(sourceCols))
warnings := make([]string, 0)
unsupported := make([]string, 0)
orderByCols := make([]string, 0)
for _, col := range sourceCols {
def, colWarnings := buildPGLikeToClickHouseColumnDefinition(col)
warnings = append(warnings, colWarnings...)
columnDefs = append(columnDefs, fmt.Sprintf("%s %s", quoteIdentByType("clickhouse", col.Name), def))
if col.Key == "PRI" || col.Key == "PK" {
orderByCols = append(orderByCols, quoteIdentByType("clickhouse", col.Name))
}
}
orderExpr := "tuple()"
if len(orderByCols) > 0 {
orderExpr = "(" + strings.Join(orderByCols, ", ") + ")"
} else {
warnings = append(warnings, "源表未识别到主键,ClickHouse 将使用 ORDER BY tuple() 建表,后续查询性能可能受影响")
}
warnings = append(warnings, "ClickHouse 不保留关系型外键/唯一约束语义,将仅迁移字段与数据")
createSQL := fmt.Sprintf("CREATE TABLE %s (\n %s\n) ENGINE = MergeTree() ORDER BY %s", quoteQualifiedIdentByType("clickhouse", targetQueryTable), strings.Join(columnDefs, ",\n "), orderExpr)
return createSQL, dedupeStrings(warnings), dedupeStrings(unsupported)
}
func buildMySQLToClickHouseCreateTableSQL(targetQueryTable string, sourceCols []connection.ColumnDefinition) (string, []string, []string) {
columnDefs := make([]string, 0, len(sourceCols))
warnings := make([]string, 0)
unsupported := make([]string, 0)
orderByCols := make([]string, 0)
for _, col := range sourceCols {
def, colWarnings := buildMySQLToClickHouseColumnDefinition(col)
warnings = append(warnings, colWarnings...)
columnDefs = append(columnDefs, fmt.Sprintf("%s %s", quoteIdentByType("clickhouse", col.Name), def))
if col.Key == "PRI" || col.Key == "PK" {
orderByCols = append(orderByCols, quoteIdentByType("clickhouse", col.Name))
}
}
orderExpr := "tuple()"
if len(orderByCols) > 0 {
orderExpr = "(" + strings.Join(orderByCols, ", ") + ")"
} else {
warnings = append(warnings, "源表未识别到主键,ClickHouse 将使用 ORDER BY tuple() 建表,后续查询性能可能受影响")
}
warnings = append(warnings, "ClickHouse 不保留关系型外键/唯一约束语义,将仅迁移字段与数据")
createSQL := fmt.Sprintf("CREATE TABLE %s (\n %s\n) ENGINE = MergeTree() ORDER BY %s", quoteQualifiedIdentByType("clickhouse", targetQueryTable), strings.Join(columnDefs, ",\n "), orderExpr)
return createSQL, dedupeStrings(warnings), dedupeStrings(unsupported)
}
func buildClickHouseToPGLikeCreateTableSQL(targetType string, targetQueryTable string, sourceCols []connection.ColumnDefinition) (string, []string, []string) {
columnDefs := make([]string, 0, len(sourceCols)+1)
warnings := make([]string, 0)
unsupported := []string{"ClickHouse ORDER BY/PARTITION/TTL/Projection/物化视图 语义当前不会自动迁移到 PG-like"}
pkCols := make([]string, 0)
for _, col := range sourceCols {
def, colWarnings := buildClickHouseToPGLikeColumnDefinition(col)
warnings = append(warnings, colWarnings...)
columnDefs = append(columnDefs, fmt.Sprintf("%s %s", quoteIdentByType(targetType, col.Name), def))
if col.Key == "PRI" || col.Key == "PK" {
pkCols = append(pkCols, quoteIdentByType(targetType, col.Name))
}
}
if len(pkCols) > 0 {
columnDefs = append(columnDefs, fmt.Sprintf("PRIMARY KEY (%s)", strings.Join(pkCols, ", ")))
} else {
warnings = append(warnings, "ClickHouse 源端未返回主键信息,目标 PG-like 表将不自动创建主键")
}
createSQL := fmt.Sprintf("CREATE TABLE %s (\n %s\n)", quoteQualifiedIdentByType(targetType, targetQueryTable), strings.Join(columnDefs, ",\n "))
return createSQL, dedupeStrings(warnings), dedupeStrings(unsupported)
}
func buildClickHouseToMySQLCreateTableSQL(targetQueryTable string, sourceCols []connection.ColumnDefinition) (string, []string) {
columnDefs := make([]string, 0, len(sourceCols)+1)
warnings := make([]string, 0)
pkCols := make([]string, 0)
for _, col := range sourceCols {
def, colWarnings := buildClickHouseToMySQLColumnDefinition(col)
warnings = append(warnings, colWarnings...)
columnDefs = append(columnDefs, fmt.Sprintf("%s %s", quoteIdentByType("mysql", col.Name), def))
if col.Key == "PRI" || col.Key == "PK" {
pkCols = append(pkCols, quoteIdentByType("mysql", col.Name))
}
}
if len(pkCols) > 0 {
columnDefs = append(columnDefs, fmt.Sprintf("PRIMARY KEY (%s)", strings.Join(pkCols, ", ")))
} else {
warnings = append(warnings, "ClickHouse 源端未返回主键信息,目标 MySQL 表将不自动创建主键")
}
createSQL := fmt.Sprintf("CREATE TABLE %s (\n %s\n)", quoteQualifiedIdentByType("mysql", targetQueryTable), strings.Join(columnDefs, ",\n "))
return createSQL, dedupeStrings(warnings)
}
func buildPGLikeToClickHouseColumnDefinition(col connection.ColumnDefinition) (string, []string) {
targetType, warnings := mapPGLikeColumnToClickHouse(col)
return targetType, dedupeStrings(warnings)
}
func buildMySQLToClickHouseColumnDefinition(col connection.ColumnDefinition) (string, []string) {
targetType, warnings := mapMySQLColumnToClickHouse(col)
// ClickHouse 列的可空性已由 Nullable(...) 包装表达,NOT NULL 无需追加后缀
return targetType, dedupeStrings(warnings)
}
func buildClickHouseToPGLikeColumnDefinition(col connection.ColumnDefinition) (string, []string) {
targetType, warnings := mapClickHouseColumnToPGLike(col)
parts := []string{targetType}
if strings.EqualFold(strings.TrimSpace(col.Nullable), "NO") {
parts = append(parts, "NOT NULL")
}
return strings.Join(parts, " "), dedupeStrings(warnings)
}
func buildClickHouseToMySQLColumnDefinition(col connection.ColumnDefinition) (string, []string) {
targetType, warnings := mapClickHouseColumnToMySQL(col)
parts := []string{targetType}
if strings.EqualFold(strings.TrimSpace(col.Nullable), "NO") {
parts = append(parts, "NOT NULL")
}
return strings.Join(parts, " "), dedupeStrings(warnings)
}
func mapPGLikeColumnToClickHouse(col connection.ColumnDefinition) (string, []string) {
raw := strings.ToLower(strings.TrimSpace(col.Type))
warnings := make([]string, 0)
if raw == "" {
return "String", []string{fmt.Sprintf("字段 %s 类型为空,已降级为 String", col.Name)}
}
baseType := "String"
switch {
case raw == "boolean" || strings.HasPrefix(raw, "bool"):
baseType = "UInt8"
case raw == "smallint":
baseType = "Int16"
case raw == "integer" || raw == "int4":
baseType = "Int32"
case raw == "bigint" || raw == "int8":
baseType = "Int64"
case strings.HasPrefix(raw, "numeric"), strings.HasPrefix(raw, "decimal"):
baseType = replaceTypeBase(raw, []string{"numeric", "decimal"}, "Decimal")
case raw == "real" || raw == "float4":
baseType = "Float32"
case raw == "double precision" || raw == "float8":
baseType = "Float64"
case raw == "date":
baseType = "Date"
case strings.HasPrefix(raw, "timestamp"):
baseType = "DateTime"
case strings.HasPrefix(raw, "time"):
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 已降级为 String", col.Name, col.Type))
baseType = "String"
case strings.HasPrefix(raw, "character varying"), strings.HasPrefix(raw, "varchar("), strings.HasPrefix(raw, "character("), strings.HasPrefix(raw, "char("), raw == "character", raw == "text", raw == "uuid":
baseType = "String"
case raw == "json" || raw == "jsonb" || raw == "bytea":
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 已降级为 String", col.Name, col.Type))
baseType = "String"
case strings.HasSuffix(raw, "[]") || strings.HasPrefix(raw, "array"):
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 已降级为 String", col.Name, col.Type))
baseType = "String"
case raw == "user-defined":
warnings = append(warnings, fmt.Sprintf("字段 %s 为用户自定义类型,已降级为 String", col.Name))
baseType = "String"
default:
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 暂无专门映射,已降级为 String", col.Name, col.Type))
baseType = "String"
}
if strings.EqualFold(strings.TrimSpace(col.Nullable), "YES") && !strings.HasPrefix(strings.ToLower(baseType), "nullable(") {
baseType = fmt.Sprintf("Nullable(%s)", baseType)
}
if strings.Contains(strings.ToLower(strings.TrimSpace(col.Extra)), "identity") || strings.Contains(strings.ToLower(strings.TrimSpace(col.Extra)), "auto_increment") {
warnings = append(warnings, fmt.Sprintf("字段 %s 的 identity/自增语义在 ClickHouse 中不保留", col.Name))
}
return baseType, dedupeStrings(warnings)
}
func mapMySQLColumnToClickHouse(col connection.ColumnDefinition) (string, []string) {
raw := strings.ToLower(strings.TrimSpace(col.Type))
warnings := make([]string, 0)
if raw == "" {
return "String", []string{fmt.Sprintf("字段 %s 类型为空,已降级为 String", col.Name)}
}
unsigned := strings.Contains(raw, "unsigned")
clean := strings.ReplaceAll(raw, " unsigned", "")
clean = strings.ReplaceAll(clean, " zerofill", "")
baseType := "String"
switch {
case strings.HasPrefix(clean, "tinyint(1)"):
baseType = "UInt8"
case strings.HasPrefix(clean, "tinyint"):
if unsigned {
baseType = "UInt8"
} else {
baseType = "Int8"
}
case strings.HasPrefix(clean, "smallint"):
if unsigned {
baseType = "UInt16"
} else {
baseType = "Int16"
}
case strings.HasPrefix(clean, "mediumint"), strings.HasPrefix(clean, "int"), strings.HasPrefix(clean, "integer"):
if unsigned {
baseType = "UInt32"
} else {
baseType = "Int32"
}
case strings.HasPrefix(clean, "bigint"):
if unsigned {
baseType = "UInt64"
} else {
baseType = "Int64"
}
case strings.HasPrefix(clean, "decimal"), strings.HasPrefix(clean, "numeric"):
baseType = replaceTypeBase(clean, []string{"decimal", "numeric"}, "Decimal")
case strings.HasPrefix(clean, "float"):
baseType = "Float32"
case strings.HasPrefix(clean, "double"):
baseType = "Float64"
case strings.HasPrefix(clean, "date"):
baseType = "Date"
case strings.HasPrefix(clean, "datetime"), strings.HasPrefix(clean, "timestamp"):
baseType = "DateTime"
case strings.HasPrefix(clean, "time"):
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 time 已降级为 String", col.Name))
baseType = "String"
case strings.HasPrefix(clean, "json"), strings.HasPrefix(clean, "enum"), strings.HasPrefix(clean, "set"), strings.HasPrefix(clean, "char"), strings.HasPrefix(clean, "varchar"), strings.Contains(clean, "text"):
baseType = "String"
case strings.Contains(clean, "blob"), strings.Contains(clean, "binary"):
warnings = append(warnings, fmt.Sprintf("字段 %s 二进制类型已降级为 String", col.Name))
baseType = "String"
default:
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 暂无专门映射,已降级为 String", col.Name, col.Type))
baseType = "String"
}
if strings.EqualFold(strings.TrimSpace(col.Nullable), "YES") && !strings.HasPrefix(strings.ToLower(baseType), "nullable(") {
baseType = fmt.Sprintf("Nullable(%s)", baseType)
}
if strings.Contains(strings.ToLower(strings.TrimSpace(col.Extra)), "auto_increment") {
warnings = append(warnings, fmt.Sprintf("字段 %s 的 AUTO_INCREMENT 在 ClickHouse 中不保留自增语义", col.Name))
}
return baseType, dedupeStrings(warnings)
}
var clickHouseDecimalPattern = regexp.MustCompile(`^(decimal|numeric)\((\d+)\s*,\s*(\d+)\)$`)
var clickHouseStringArgsPattern = regexp.MustCompile(`^fixedstring\((\d+)\)$`)
func mapClickHouseColumnToPGLike(col connection.ColumnDefinition) (string, []string) {
raw := strings.TrimSpace(col.Type)
lower := strings.ToLower(raw)
warnings := make([]string, 0)
if strings.HasPrefix(lower, "nullable(") && strings.HasSuffix(lower, ")") {
raw = strings.TrimSpace(raw[len("Nullable(") : len(raw)-1])
lower = strings.ToLower(raw)
}
for {
if strings.HasPrefix(lower, "lowcardinality(") && strings.HasSuffix(lower, ")") {
raw = strings.TrimSpace(raw[len("LowCardinality(") : len(raw)-1])
lower = strings.ToLower(raw)
continue
}
break
}
switch {
case lower == "bool" || lower == "boolean":
return "boolean", warnings
case lower == "int8":
return "smallint", warnings
case lower == "uint8":
return "smallint", warnings
case lower == "int16":
return "smallint", warnings
case lower == "uint16":
return "integer", warnings
case lower == "int32":
return "integer", warnings
case lower == "uint32":
return "bigint", warnings
case lower == "int64":
return "bigint", warnings
case lower == "uint64":
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 已映射为 numeric(20,0) 以避免无符号溢出", col.Name, col.Type))
return "numeric(20,0)", warnings
case lower == "float32":
return "real", warnings
case lower == "float64":
return "double precision", warnings
case lower == "date":
return "date", warnings
case strings.HasPrefix(lower, "datetime"):
return "timestamp", warnings
case lower == "string":
return "text", warnings
case lower == "uuid":
return "uuid", warnings
case lower == "json", strings.HasPrefix(lower, "map("), strings.HasPrefix(lower, "array("), strings.HasPrefix(lower, "tuple("), strings.HasPrefix(lower, "nested("):
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 已降级为 jsonb", col.Name, col.Type))
return "jsonb", warnings
case strings.HasPrefix(lower, "enum8("), strings.HasPrefix(lower, "enum16("):
warnings = append(warnings, fmt.Sprintf("字段 %s 枚举类型 %s 已降级为 varchar(255)", col.Name, col.Type))
return "varchar(255)", warnings
case clickHouseDecimalPattern.MatchString(lower):
parts := clickHouseDecimalPattern.FindStringSubmatch(lower)
return fmt.Sprintf("numeric(%s,%s)", parts[2], parts[3]), warnings
case clickHouseStringArgsPattern.MatchString(lower):
parts := clickHouseStringArgsPattern.FindStringSubmatch(lower)
return fmt.Sprintf("varchar(%s)", parts[1]), warnings
default:
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 暂无专门 PG-like 映射,已降级为 text", col.Name, col.Type))
return "text", warnings
}
}
func mapClickHouseColumnToMySQL(col connection.ColumnDefinition) (string, []string) {
raw := strings.TrimSpace(col.Type)
lower := strings.ToLower(raw)
warnings := make([]string, 0)
if strings.HasPrefix(lower, "nullable(") && strings.HasSuffix(lower, ")") {
raw = strings.TrimSpace(raw[len("Nullable(") : len(raw)-1])
lower = strings.ToLower(raw)
}
for {
if strings.HasPrefix(lower, "lowcardinality(") && strings.HasSuffix(lower, ")") {
raw = strings.TrimSpace(raw[len("LowCardinality(") : len(raw)-1])
lower = strings.ToLower(raw)
continue
}
break
}
switch {
case lower == "bool" || lower == "boolean" || lower == "uint8":
return "tinyint(1)", warnings
case lower == "int8":
return "tinyint", warnings
case lower == "uint16":
return "smallint unsigned", warnings
case lower == "int16":
return "smallint", warnings
case lower == "uint32":
return "int unsigned", warnings
case lower == "int32":
return "int", warnings
case lower == "uint64":
return "bigint unsigned", warnings
case lower == "int64":
return "bigint", warnings
case lower == "float32":
return "float", warnings
case lower == "float64":
return "double", warnings
case lower == "date":
return "date", warnings
case strings.HasPrefix(lower, "datetime"):
return "datetime", warnings
case lower == "string":
return "text", warnings
case lower == "uuid":
return "char(36)", warnings
case lower == "json", strings.HasPrefix(lower, "map("), strings.HasPrefix(lower, "array("), strings.HasPrefix(lower, "tuple("), strings.HasPrefix(lower, "nested("):
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 已降级为 json", col.Name, col.Type))
return "json", warnings
case clickHouseDecimalPattern.MatchString(lower):
parts := clickHouseDecimalPattern.FindStringSubmatch(lower)
return fmt.Sprintf("decimal(%s,%s)", parts[2], parts[3]), warnings
case clickHouseStringArgsPattern.MatchString(lower):
parts := clickHouseStringArgsPattern.FindStringSubmatch(lower)
return fmt.Sprintf("varchar(%s)", parts[1]), warnings
default:
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 暂无专门映射,已降级为 text", col.Name, col.Type))
return "text", warnings
}
}
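Both ClickHouse-source mappers above begin by peeling the `Nullable(...)` wrapper and any nested `LowCardinality(...)` wrappers off the reported column type before matching on the base type. A standalone sketch of that unwrapping step (the name `unwrapCHType` is illustrative, not part of this codebase):

```go
package main

import (
	"fmt"
	"strings"
)

// unwrapCHType strips an outer Nullable(...) wrapper and any nested
// LowCardinality(...) wrappers from a ClickHouse type name, returning the
// inner base type and whether the column was Nullable. The prefix/suffix
// trimming mirrors mapClickHouseColumnToPGLike / mapClickHouseColumnToMySQL.
func unwrapCHType(raw string) (string, bool) {
	raw = strings.TrimSpace(raw)
	nullable := false
	if lower := strings.ToLower(raw); strings.HasPrefix(lower, "nullable(") && strings.HasSuffix(lower, ")") {
		nullable = true
		raw = strings.TrimSpace(raw[len("Nullable(") : len(raw)-1])
	}
	for {
		lower := strings.ToLower(raw)
		if strings.HasPrefix(lower, "lowcardinality(") && strings.HasSuffix(lower, ")") {
			raw = strings.TrimSpace(raw[len("LowCardinality(") : len(raw)-1])
			continue
		}
		break
	}
	return raw, nullable
}

func main() {
	base, nullable := unwrapCHType("Nullable(LowCardinality(String))")
	fmt.Println(base, nullable) // String true
}
```

Note the order: `Nullable` always wraps `LowCardinality` in ClickHouse type names, so the single outer check followed by the inner loop is sufficient.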


@@ -0,0 +1,379 @@
package sync
import (
"GoNavi-Wails/internal/connection"
"GoNavi-Wails/internal/db"
"fmt"
"strings"
)
type genericLegacyPlanner struct{}
type mysqlToPGLikePlanner struct{}
type mysqlToClickHousePlanner struct{}
type pgLikeToClickHousePlanner struct{}
type clickHouseToMySQLPlanner struct{}
type clickHouseToPGLikePlanner struct{}
type mysqlToMongoPlanner struct{}
type pgLikeToMongoPlanner struct{}
type clickHouseToMongoPlanner struct{}
type tdengineToMongoPlanner struct{}
type mongoToMySQLPlanner struct{}
type mongoToPGLikePlanner struct{}
type pgLikeToMySQLPlanner struct{}
type tdengineToMySQLPlanner struct{}
type tdengineToPGLikePlanner struct{}
type mongoToRelationalPlanner struct{}
func buildSchemaMigrationPlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
ctx := MigrationBuildContext{
Config: config,
TableName: tableName,
SourceDB: sourceDB,
TargetDB: targetDB,
}
planner := resolveMigrationPlanner(ctx)
if planner == nil {
return buildSchemaMigrationPlanLegacy(config, tableName, sourceDB, targetDB)
}
return planner.BuildPlan(ctx)
}
func resolveMigrationPlanner(ctx MigrationBuildContext) MigrationPlanner {
planners := []MigrationPlanner{
mysqlToPGLikePlanner{},
mySQLLikeToTDenginePlanner{},
pgLikeToTDenginePlanner{},
clickHouseToTDenginePlanner{},
tdengineToTDenginePlanner{},
tdengineToPGLikePlanner{},
tdengineToMySQLPlanner{},
mysqlToClickHousePlanner{},
pgLikeToClickHousePlanner{},
clickHouseToMySQLPlanner{},
clickHouseToPGLikePlanner{},
mysqlToMongoPlanner{},
pgLikeToMongoPlanner{},
clickHouseToMongoPlanner{},
tdengineToMongoPlanner{},
mongoToMySQLPlanner{},
mongoToPGLikePlanner{},
pgLikeToMySQLPlanner{},
mongoToRelationalPlanner{},
genericLegacyPlanner{},
}
bestLevel := MigrationSupportLevelUnsupported
var bestPlanner MigrationPlanner
for _, planner := range planners {
level := planner.SupportLevel(ctx)
if migrationSupportRank(level) > migrationSupportRank(bestLevel) {
bestLevel = level
bestPlanner = planner
}
}
return bestPlanner
}
func migrationSupportRank(level MigrationSupportLevel) int {
switch level {
case MigrationSupportLevelFull:
return 4
case MigrationSupportLevelPlanned:
return 3
case MigrationSupportLevelPartial:
return 2
default:
return 1
}
}
func isMySQLLikeType(dbType string) bool {
return isMySQLLikeWritableTargetType(dbType)
}
func classifyMigrationDataModel(dbType string) MigrationDataModel {
switch normalizeMigrationDBType(dbType) {
case "mysql", "mariadb", "postgres", "kingbase", "highgo", "vastbase", "oracle", "sqlserver", "dameng", "sqlite", "duckdb":
return MigrationDataModelRelational
case "mongodb":
return MigrationDataModelDocument
case "clickhouse", "diros", "sphinx":
return MigrationDataModelColumnar
case "tdengine":
return MigrationDataModelTimeSeries
case "redis":
return MigrationDataModelKeyValue
default:
return MigrationDataModelCustom
}
}
func (genericLegacyPlanner) Name() string { return "generic-legacy-planner" }
func (genericLegacyPlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
_ = ctx
return MigrationSupportLevelPartial
}
func (genericLegacyPlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildSchemaMigrationPlanLegacy(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (mysqlToPGLikePlanner) Name() string { return "mysql-pglike-planner" }
func (mysqlToPGLikePlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if isMySQLLikeSourceType(sourceType) && isPGLikeTarget(targetType) {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (mysqlToPGLikePlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildMySQLToPGLikePlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (tdengineToMySQLPlanner) Name() string { return "tdengine-mysql-planner" }
func (tdengineToMySQLPlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if sourceType == "tdengine" && isMySQLLikeWritableTargetType(targetType) {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (tdengineToMySQLPlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildTDengineToMySQLPlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (tdengineToPGLikePlanner) Name() string { return "tdengine-pglike-planner" }
func (tdengineToPGLikePlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if sourceType == "tdengine" && isPGLikeTarget(targetType) {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (tdengineToPGLikePlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildTDengineToPGLikePlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (mysqlToClickHousePlanner) Name() string { return "mysql-clickhouse-planner" }
func (mysqlToClickHousePlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if isMySQLCoreType(sourceType) && targetType == "clickhouse" {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (mysqlToClickHousePlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildMySQLToClickHousePlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (pgLikeToClickHousePlanner) Name() string { return "pglike-clickhouse-planner" }
func (pgLikeToClickHousePlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if isPGLikeSource(sourceType) && targetType == "clickhouse" {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (pgLikeToClickHousePlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildPGLikeToClickHousePlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (clickHouseToMySQLPlanner) Name() string { return "clickhouse-mysql-planner" }
func (clickHouseToMySQLPlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if sourceType == "clickhouse" && isMySQLLikeWritableTargetType(targetType) {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (clickHouseToMySQLPlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildClickHouseToMySQLPlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (clickHouseToPGLikePlanner) Name() string { return "clickhouse-pglike-planner" }
func (clickHouseToPGLikePlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if sourceType == "clickhouse" && isPGLikeTarget(targetType) {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (clickHouseToPGLikePlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildClickHouseToPGLikePlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (mysqlToMongoPlanner) Name() string { return "mysql-mongo-planner" }
func (mysqlToMongoPlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if isMySQLCoreType(sourceType) && targetType == "mongodb" {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (mysqlToMongoPlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildMySQLToMongoPlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (pgLikeToMongoPlanner) Name() string { return "pglike-mongo-planner" }
func (pgLikeToMongoPlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if isPGLikeSource(sourceType) && targetType == "mongodb" {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (pgLikeToMongoPlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildPGLikeToMongoPlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (clickHouseToMongoPlanner) Name() string { return "clickhouse-mongo-planner" }
func (clickHouseToMongoPlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if sourceType == "clickhouse" && targetType == "mongodb" {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (clickHouseToMongoPlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildClickHouseToMongoPlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (tdengineToMongoPlanner) Name() string { return "tdengine-mongo-planner" }
func (tdengineToMongoPlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if sourceType == "tdengine" && targetType == "mongodb" {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (tdengineToMongoPlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildTDengineToMongoPlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (mongoToMySQLPlanner) Name() string { return "mongo-mysql-planner" }
func (mongoToMySQLPlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if sourceType == "mongodb" && isMySQLLikeWritableTargetType(targetType) {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (mongoToMySQLPlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildMongoToMySQLPlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (mongoToPGLikePlanner) Name() string { return "mongo-pglike-planner" }
func (mongoToPGLikePlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if sourceType == "mongodb" && isPGLikeTarget(targetType) {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (mongoToPGLikePlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildMongoToPGLikePlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (pgLikeToMySQLPlanner) Name() string { return "pglike-mysql-planner" }
func (pgLikeToMySQLPlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if isPGLikeSource(sourceType) && isMySQLLikeWritableTargetType(targetType) {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (pgLikeToMySQLPlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildPGLikeToMySQLPlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (mongoToRelationalPlanner) Name() string { return "mongo-relational-inference-planner" }
func (mongoToRelationalPlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if !shouldUseSchemaInference(sourceType, targetType) {
return MigrationSupportLevelUnsupported
}
return MigrationSupportLevelPlanned
}
func (mongoToRelationalPlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
inference, err := inferSchemaForPair(sourceType, targetType, ctx.TableName)
if err != nil {
return SchemaMigrationPlan{}, nil, nil, err
}
plan := SchemaMigrationPlan{}
plan.SourceSchema, plan.SourceTable = normalizeSchemaAndTable(sourceType, ctx.Config.SourceConfig.Database, ctx.TableName)
plan.TargetSchema, plan.TargetTable = normalizeSchemaAndTable(targetType, ctx.Config.TargetConfig.Database, ctx.TableName)
plan.SourceQueryTable = qualifiedNameForQuery(sourceType, plan.SourceSchema, plan.SourceTable, ctx.TableName)
plan.TargetQueryTable = qualifiedNameForQuery(targetType, plan.TargetSchema, plan.TargetTable, ctx.TableName)
plan.PlannedAction = "当前库对已进入迁移内核规划阶段,等待 schema 推断与目标方言生成器落地"
for _, issue := range inference.Issues {
msg := strings.TrimSpace(issue.Message)
if msg == "" {
continue
}
plan.Warnings = append(plan.Warnings, msg)
}
plan.Warnings = append(plan.Warnings, fmt.Sprintf("迁移对象=%s,目标类型=%s,当前仅提供规划入口,暂不执行自动建表", inference.Object.Kind, targetType))
return dedupeSchemaMigrationPlan(plan), nil, nil, nil
}


@@ -0,0 +1,447 @@
package sync
import (
"GoNavi-Wails/internal/connection"
"strings"
"testing"
)
func TestClassifyMigrationDataModel(t *testing.T) {
t.Parallel()
cases := map[string]MigrationDataModel{
"mysql": MigrationDataModelRelational,
"postgres": MigrationDataModelRelational,
"kingbase": MigrationDataModelRelational,
"mongodb": MigrationDataModelDocument,
"clickhouse": MigrationDataModelColumnar,
"tdengine": MigrationDataModelTimeSeries,
"redis": MigrationDataModelKeyValue,
"custom": MigrationDataModelCustom,
}
for input, want := range cases {
input, want := input, want
t.Run(input, func(t *testing.T) {
t.Parallel()
got := classifyMigrationDataModel(input)
if got != want {
t.Fatalf("unexpected data model, input=%s got=%s want=%s", input, got, want)
}
})
}
}
func TestResolveMigrationPlanner_PrefersMySQLKingbasePlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mysql"},
TargetConfig: connection.ConnectionConfig{Type: "kingbase"},
},
})
if planner == nil {
t.Fatalf("expected planner")
}
if planner.Name() != "mysql-pglike-planner" {
t.Fatalf("unexpected planner: %s", planner.Name())
}
}
func TestResolveMigrationPlanner_UsesSchemaInferencePlannerForMongoToMySQL(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mongodb"},
TargetConfig: connection.ConnectionConfig{Type: "mysql"},
},
})
if planner == nil {
t.Fatalf("expected planner")
}
if planner.Name() != "mongo-mysql-planner" {
t.Fatalf("unexpected planner: %s", planner.Name())
}
}
func TestInferSchemaForPair_MongoToMySQLReturnsPlannedWarning(t *testing.T) {
t.Parallel()
result, err := inferSchemaForPair("mongodb", "mysql", "users")
if err != nil {
t.Fatalf("inferSchemaForPair returned error: %v", err)
}
if !result.NeedsReview {
t.Fatalf("expected needs review")
}
if result.Object.Name != "users" {
t.Fatalf("unexpected object name: %s", result.Object.Name)
}
if len(result.Issues) == 0 || !strings.Contains(result.Issues[0].Message, "schema 推断") {
t.Fatalf("unexpected issues: %+v", result.Issues)
}
}
func TestResolveMigrationPlanner_UsesPGLikeMySQLPlannerForKingbaseToMySQL(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "kingbase"},
TargetConfig: connection.ConnectionConfig{Type: "mysql"},
},
})
if planner == nil {
t.Fatalf("expected planner")
}
if planner.Name() != "pglike-mysql-planner" {
t.Fatalf("unexpected planner: %s", planner.Name())
}
}
func TestResolveMigrationPlanner_UsesMySQLPGLikePlannerForMySQLToPostgres(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mysql"},
TargetConfig: connection.ConnectionConfig{Type: "postgres"},
},
})
if planner == nil {
t.Fatalf("expected planner")
}
if planner.Name() != "mysql-pglike-planner" {
t.Fatalf("unexpected planner: %s", planner.Name())
}
}
func TestResolveMigrationPlanner_UsesMySQLClickHousePlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mysql"},
TargetConfig: connection.ConnectionConfig{Type: "clickhouse"},
},
})
if planner == nil {
t.Fatalf("expected planner")
}
if planner.Name() != "mysql-clickhouse-planner" {
t.Fatalf("unexpected planner: %s", planner.Name())
}
}
func TestResolveMigrationPlanner_UsesClickHouseMySQLPlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "clickhouse"},
TargetConfig: connection.ConnectionConfig{Type: "mysql"},
},
})
if planner == nil {
t.Fatalf("expected planner")
}
if planner.Name() != "clickhouse-mysql-planner" {
t.Fatalf("unexpected planner: %s", planner.Name())
}
}
func TestResolveMigrationPlanner_UsesMySQLMongoPlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mysql"},
TargetConfig: connection.ConnectionConfig{Type: "mongodb"},
},
})
if planner == nil || planner.Name() != "mysql-mongo-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesMongoMySQLPlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mongodb"},
TargetConfig: connection.ConnectionConfig{Type: "mysql"},
},
})
if planner == nil || planner.Name() != "mongo-mysql-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesMongoPGLikePlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mongodb"},
TargetConfig: connection.ConnectionConfig{Type: "postgres"},
},
})
if planner == nil || planner.Name() != "mongo-pglike-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesPGLikeMongoPlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "postgres"},
TargetConfig: connection.ConnectionConfig{Type: "mongodb"},
},
})
if planner == nil || planner.Name() != "pglike-mongo-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesClickHouseMongoPlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "clickhouse"},
TargetConfig: connection.ConnectionConfig{Type: "mongodb"},
},
})
if planner == nil || planner.Name() != "clickhouse-mongo-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesTDengineMongoPlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "tdengine"},
TargetConfig: connection.ConnectionConfig{Type: "mongodb"},
},
})
if planner == nil || planner.Name() != "tdengine-mongo-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesMySQLPGLikePlannerForDirosToPostgres(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "diros"},
TargetConfig: connection.ConnectionConfig{Type: "postgres"},
},
})
if planner == nil || planner.Name() != "mysql-pglike-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesPGLikeMySQLPlannerForPostgresToDiros(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "postgres"},
TargetConfig: connection.ConnectionConfig{Type: "diros"},
},
})
if planner == nil || planner.Name() != "pglike-mysql-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesMySQLPGLikePlannerForMySQLToDuckDB(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mysql"},
TargetConfig: connection.ConnectionConfig{Type: "duckdb"},
},
})
if planner == nil || planner.Name() != "mysql-pglike-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesPGLikeClickHousePlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "postgres"},
TargetConfig: connection.ConnectionConfig{Type: "clickhouse"},
},
})
if planner == nil || planner.Name() != "pglike-clickhouse-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesPGLikeMySQLPlannerForDuckDBToMySQL(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "duckdb"},
TargetConfig: connection.ConnectionConfig{Type: "mysql"},
},
})
if planner == nil || planner.Name() != "pglike-mysql-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesMySQLPGLikePlannerForSphinxToPostgres(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "sphinx"},
TargetConfig: connection.ConnectionConfig{Type: "postgres"},
},
})
if planner == nil || planner.Name() != "mysql-pglike-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesPGLikeMySQLPlannerForCustomKingbaseToMySQL(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "custom", Driver: "kingbase8"},
TargetConfig: connection.ConnectionConfig{Type: "mysql"},
},
})
if planner == nil || planner.Name() != "pglike-mysql-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesMySQLPGLikePlannerForMySQLToCustomPostgres(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mysql"},
TargetConfig: connection.ConnectionConfig{Type: "custom", Driver: "postgresql"},
},
})
if planner == nil || planner.Name() != "mysql-pglike-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesTDengineMySQLPlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "tdengine"},
TargetConfig: connection.ConnectionConfig{Type: "mysql"},
},
})
if planner == nil || planner.Name() != "tdengine-mysql-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesTDenginePGLikePlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "tdengine"},
TargetConfig: connection.ConnectionConfig{Type: "kingbase"},
},
})
if planner == nil || planner.Name() != "tdengine-pglike-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesMySQLLikeTDenginePlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mysql"},
TargetConfig: connection.ConnectionConfig{Type: "tdengine"},
},
})
if planner == nil || planner.Name() != "mysqllike-tdengine-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesPGLikeTDenginePlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "postgres"},
TargetConfig: connection.ConnectionConfig{Type: "tdengine"},
},
})
if planner == nil || planner.Name() != "pglike-tdengine-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesClickHouseTDenginePlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "clickhouse"},
TargetConfig: connection.ConnectionConfig{Type: "tdengine"},
},
})
if planner == nil || planner.Name() != "clickhouse-tdengine-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesClickHousePGLikePlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "clickhouse"},
TargetConfig: connection.ConnectionConfig{Type: "postgres"},
},
})
if planner == nil || planner.Name() != "clickhouse-pglike-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}
func TestResolveMigrationPlanner_UsesTDengineTDenginePlanner(t *testing.T) {
t.Parallel()
planner := resolveMigrationPlanner(MigrationBuildContext{
Config: SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "tdengine"},
TargetConfig: connection.ConnectionConfig{Type: "tdengine"},
},
})
if planner == nil || planner.Name() != "tdengine-tdengine-planner" {
t.Fatalf("unexpected planner: %v", planner)
}
}


@@ -0,0 +1,104 @@
package sync
import (
"GoNavi-Wails/internal/connection"
"GoNavi-Wails/internal/db"
)
type MigrationDataModel string
const (
MigrationDataModelRelational MigrationDataModel = "relational"
MigrationDataModelDocument MigrationDataModel = "document"
MigrationDataModelColumnar MigrationDataModel = "columnar"
MigrationDataModelTimeSeries MigrationDataModel = "timeseries"
MigrationDataModelKeyValue MigrationDataModel = "keyvalue"
MigrationDataModelCustom MigrationDataModel = "custom"
)
type MigrationObjectKind string
const (
MigrationObjectKindTable MigrationObjectKind = "table"
MigrationObjectKindCollection MigrationObjectKind = "collection"
MigrationObjectKindKeyspace MigrationObjectKind = "keyspace"
)
type MigrationSupportLevel string
const (
MigrationSupportLevelFull MigrationSupportLevel = "full"
MigrationSupportLevelPartial MigrationSupportLevel = "partial"
MigrationSupportLevelPlanned MigrationSupportLevel = "planned"
MigrationSupportLevelUnsupported MigrationSupportLevel = "unsupported"
)
type CanonicalFieldSpec struct {
Name string
SourceType string
CanonicalType string
Nullable bool
DefaultValue *string
AutoIncrement bool
Comment string
NestedPath string
Confidence float64
}
type CanonicalIndexSpec struct {
Name string
Kind string
Columns []string
Expression string
PrefixLength int
Supported bool
DegradeStrategy string
Unique bool
}
type CanonicalConstraintSpec struct {
Name string
Kind string
Columns []string
RefName string
}
type CanonicalObjectSpec struct {
Name string
Schema string
Kind MigrationObjectKind
Fields []CanonicalFieldSpec
PrimaryKey []string
Indexes []CanonicalIndexSpec
Constraints []CanonicalConstraintSpec
Comments []string
SourceHints map[string]string
}
type SchemaInferenceIssue struct {
Field string
Level string
Message string
Resolution string
}
type SchemaInferenceResult struct {
Object CanonicalObjectSpec
Issues []SchemaInferenceIssue
SampleSize int
Confidence float64
NeedsReview bool
}
type MigrationBuildContext struct {
Config SyncConfig
TableName string
SourceDB db.Database
TargetDB db.Database
}
type MigrationPlanner interface {
Name() string
SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel
BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error)
}


@@ -0,0 +1,603 @@
package sync
import (
"GoNavi-Wails/internal/connection"
"GoNavi-Wails/internal/db"
"encoding/json"
"fmt"
"sort"
"strings"
"time"
)
func buildMySQLToMongoPlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildTabularToMongoPlan(config, tableName, sourceDB, targetDB)
}
func buildPGLikeToMongoPlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildTabularToMongoPlan(config, tableName, sourceDB, targetDB)
}
func buildClickHouseToMongoPlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildTabularToMongoPlan(config, tableName, sourceDB, targetDB)
}
func buildTDengineToMongoPlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildTabularToMongoPlan(config, tableName, sourceDB, targetDB)
}
func buildTabularToMongoPlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
plan := SchemaMigrationPlan{}
sourceType := resolveMigrationDBType(config.SourceConfig)
targetType := resolveMigrationDBType(config.TargetConfig)
plan.SourceSchema, plan.SourceTable = normalizeSchemaAndTable(sourceType, config.SourceConfig.Database, tableName)
plan.TargetSchema, plan.TargetTable = normalizeSchemaAndTable(targetType, config.TargetConfig.Database, tableName)
plan.SourceQueryTable = qualifiedNameForQuery(sourceType, plan.SourceSchema, plan.SourceTable, tableName)
plan.TargetQueryTable = qualifiedNameForQuery(targetType, plan.TargetSchema, plan.TargetTable, tableName)
plan.PlannedAction = "使用已有目标集合导入"
sourceCols, sourceExists, err := inspectTableColumns(sourceDB, plan.SourceSchema, plan.SourceTable)
if err != nil {
return plan, nil, nil, fmt.Errorf("获取源表字段失败: %w", err)
}
if !sourceExists {
return plan, nil, nil, fmt.Errorf("源表不存在或无列定义: %s", tableName)
}
targetExists, err := inspectMongoCollection(targetDB, plan.TargetSchema, plan.TargetTable)
if err != nil {
return plan, sourceCols, nil, fmt.Errorf("检查目标集合失败: %w", err)
}
plan.TargetTableExists = targetExists
strategy := normalizeTargetTableStrategy(config.TargetTableStrategy)
if targetExists {
plan.Warnings = append(plan.Warnings, "MongoDB 为弱 schema 目标,字段结构以写入文档为准,不执行目标列校验")
return dedupeSchemaMigrationPlan(plan), sourceCols, nil, nil
}
switch strategy {
case "existing_only":
plan.PlannedAction = "目标集合不存在,需先手工创建"
plan.Warnings = append(plan.Warnings, "当前策略要求目标集合已存在,执行时不会自动创建")
return dedupeSchemaMigrationPlan(plan), sourceCols, nil, nil
case "smart", "auto_create_if_missing":
plan.AutoCreate = true
plan.PlannedAction = "目标集合不存在,将自动创建集合后导入"
createCmd, err := buildMongoCreateCollectionCommand(plan.TargetTable)
if err != nil {
return plan, sourceCols, nil, err
}
plan.PreDataSQL = append(plan.PreDataSQL, createCmd)
if config.CreateIndexes {
indexCmds, warnings, unsupported, created, skipped, err := buildMongoIndexCommands(sourceDB, plan.SourceSchema, plan.SourceTable, plan.TargetTable)
if err != nil {
plan.Warnings = append(plan.Warnings, fmt.Sprintf("读取源表索引失败,已跳过索引迁移:%v", err))
} else {
plan.PostDataSQL = append(plan.PostDataSQL, indexCmds...)
plan.Warnings = append(plan.Warnings, warnings...)
plan.UnsupportedObjects = append(plan.UnsupportedObjects, unsupported...)
plan.IndexesToCreate = created
plan.IndexesSkipped = skipped
}
}
return dedupeSchemaMigrationPlan(plan), sourceCols, nil, nil
default:
return dedupeSchemaMigrationPlan(plan), sourceCols, nil, nil
}
}
func buildMongoToMySQLPlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
plan := SchemaMigrationPlan{}
plan.SourceSchema, plan.SourceTable = normalizeSchemaAndTable(config.SourceConfig.Type, config.SourceConfig.Database, tableName)
plan.TargetSchema, plan.TargetTable = normalizeSchemaAndTable(config.TargetConfig.Type, config.TargetConfig.Database, tableName)
plan.SourceQueryTable = qualifiedNameForQuery(config.SourceConfig.Type, plan.SourceSchema, plan.SourceTable, tableName)
plan.TargetQueryTable = qualifiedNameForQuery(config.TargetConfig.Type, plan.TargetSchema, plan.TargetTable, tableName)
plan.PlannedAction = "使用已有目标表导入"
sourceCols, warnings, err := inferMongoCollectionColumns(sourceDB, plan.SourceTable)
if err != nil {
return plan, nil, nil, err
}
plan.Warnings = append(plan.Warnings, warnings...)
if len(sourceCols) == 0 {
return plan, nil, nil, fmt.Errorf("源集合未推断出可迁移字段: %s", tableName)
}
targetCols, targetExists, err := inspectTableColumns(targetDB, plan.TargetSchema, plan.TargetTable)
if err != nil {
return plan, sourceCols, nil, fmt.Errorf("获取目标表字段失败: %w", err)
}
plan.TargetTableExists = targetExists
strategy := normalizeTargetTableStrategy(config.TargetTableStrategy)
if targetExists {
missing := diffMissingColumnNames(sourceCols, targetCols)
if len(missing) > 0 {
plan.Warnings = append(plan.Warnings, fmt.Sprintf("目标表缺失字段 %d 个:%s", len(missing), strings.Join(missing, ", ")))
}
if config.AutoAddColumns {
addSQL, addWarnings := buildMongoToMySQLAddColumnSQL(plan.TargetQueryTable, sourceCols, targetCols)
plan.PreDataSQL = append(plan.PreDataSQL, addSQL...)
plan.Warnings = append(plan.Warnings, addWarnings...)
if len(addSQL) > 0 {
plan.PlannedAction = fmt.Sprintf("补齐缺失字段(%d)后导入", len(addSQL))
}
}
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
switch strategy {
case "existing_only":
plan.PlannedAction = "目标表不存在,需先手工创建"
plan.Warnings = append(plan.Warnings, "当前策略要求目标表已存在,执行时不会自动建表")
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
case "smart", "auto_create_if_missing":
plan.AutoCreate = true
plan.PlannedAction = "目标表不存在,将自动建表后导入"
createSQL, postSQL, moreWarnings, unsupported, idxCreate, idxSkip, err := buildMongoToMySQLCreateTablePlan(config, plan.TargetQueryTable, sourceCols, sourceDB, plan.SourceSchema, plan.SourceTable)
if err != nil {
return plan, sourceCols, targetCols, err
}
plan.CreateTableSQL = createSQL
plan.PostDataSQL = append(plan.PostDataSQL, postSQL...)
plan.Warnings = append(plan.Warnings, moreWarnings...)
plan.UnsupportedObjects = append(plan.UnsupportedObjects, unsupported...)
plan.IndexesToCreate = idxCreate
plan.IndexesSkipped = idxSkip
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
default:
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
}
func inspectMongoCollection(database db.Database, dbName, collection string) (bool, error) {
items, err := database.GetTables(dbName)
if err != nil {
return false, err
}
target := strings.TrimSpace(collection)
for _, item := range items {
if strings.EqualFold(strings.TrimSpace(item), target) {
return true, nil
}
}
return false, nil
}
func buildMongoCreateCollectionCommand(collection string) (string, error) {
cmd := map[string]interface{}{"create": strings.TrimSpace(collection)}
data, err := json.Marshal(cmd)
if err != nil {
return "", err
}
return string(data), nil
}
func buildMongoIndexCommands(sourceDB db.Database, dbName, tableName, targetCollection string) ([]string, []string, []string, int, int, error) {
indexes, err := sourceDB.GetIndexes(dbName, tableName)
if err != nil {
return nil, nil, nil, 0, 0, err
}
grouped := groupIndexDefinitions(indexes)
cmds := make([]string, 0, len(grouped))
warnings := make([]string, 0)
unsupported := make([]string, 0)
created := 0
skipped := 0
for _, idx := range grouped {
name := strings.TrimSpace(idx.Name)
if name == "" || strings.EqualFold(name, "primary") {
continue
}
if len(idx.Columns) == 0 {
skipped++
unsupported = append(unsupported, fmt.Sprintf("索引 %s 缺少列定义,已跳过", name))
continue
}
kind := strings.ToLower(strings.TrimSpace(idx.IndexType))
if idx.SubPart > 0 {
skipped++
unsupported = append(unsupported, fmt.Sprintf("索引 %s 使用前缀长度,MongoDB 目标暂不支持等价迁移", name))
continue
}
if kind != "" && kind != "btree" {
warnings = append(warnings, fmt.Sprintf("索引 %s 类型=%s 将按普通索引迁移到 MongoDB", name, idx.IndexType))
}
keySpec := make(map[string]int)
for _, col := range idx.Columns {
keySpec[col] = 1
}
command := map[string]interface{}{
"createIndexes": strings.TrimSpace(targetCollection),
"indexes": []map[string]interface{}{{
"name": name,
"key": keySpec,
"unique": idx.Unique,
}},
}
data, err := json.Marshal(command)
if err != nil {
skipped++
unsupported = append(unsupported, fmt.Sprintf("索引 %s 生成 MongoDB createIndexes 命令失败:%v", name, err))
continue
}
cmds = append(cmds, string(data))
created++
}
return cmds, dedupeStrings(warnings), dedupeStrings(unsupported), created, skipped, nil
}
func inferMongoCollectionColumns(sourceDB db.Database, collection string) ([]connection.ColumnDefinition, []string, error) {
query := fmt.Sprintf(`{"find":"%s","filter":{},"limit":200}`, strings.TrimSpace(collection))
rows, _, err := sourceDB.Query(query)
if err != nil {
return nil, nil, fmt.Errorf("读取源集合样本失败: %w", err)
}
if len(rows) == 0 {
return []connection.ColumnDefinition{{Name: "_id", Type: "varchar(64)", Nullable: "NO", Key: "PRI"}}, []string{"源集合暂无样本数据,仅按 `_id` 生成基础主键列"}, nil
}
fieldNames := make(map[string]struct{})
for _, row := range rows {
for key := range row {
fieldNames[key] = struct{}{}
}
}
orderedFields := make([]string, 0, len(fieldNames))
for key := range fieldNames {
orderedFields = append(orderedFields, key)
}
sort.Strings(orderedFields)
if containsString(orderedFields, "_id") {
orderedFields = moveStringToFront(orderedFields, "_id")
}
columns := make([]connection.ColumnDefinition, 0, len(orderedFields))
warnings := make([]string, 0)
for _, field := range orderedFields {
typeName, nullable, fieldWarnings := inferMongoFieldType(rows, field)
warnings = append(warnings, fieldWarnings...)
col := connection.ColumnDefinition{
Name: field,
Type: typeName,
Nullable: ternaryString(nullable, "YES", "NO"),
Key: "",
Extra: "",
}
if field == "_id" {
col.Key = "PRI"
col.Nullable = "NO"
}
columns = append(columns, col)
}
return columns, dedupeStrings(warnings), nil
}
func inferMongoFieldType(rows []map[string]interface{}, field string) (string, bool, []string) {
nullable := false
hasString, hasBool, hasInt, hasFloat, hasTime, hasComplex := false, false, false, false, false, false
for _, row := range rows {
value, ok := row[field]
if !ok || value == nil {
nullable = true
continue
}
switch value.(type) {
case bool:
hasBool = true
case int, int8, int16, int32, int64, uint, uint8, uint16, uint32, uint64:
hasInt = true
case float32, float64:
hasFloat = true
case time.Time:
hasTime = true
case map[string]interface{}, []interface{}:
hasComplex = true
default:
hasString = true
}
}
kinds := 0
for _, flag := range []bool{hasString, hasBool, hasInt, hasFloat, hasTime, hasComplex} {
if flag {
kinds++
}
}
warnings := make([]string, 0)
if kinds > 1 {
warnings = append(warnings, fmt.Sprintf("字段 %s 存在多种 BSON 值类型,已按兼容类型降级", field))
}
if field == "_id" {
return "varchar(64)", false, warnings
}
switch {
case hasComplex:
return "json", nullable, warnings
case hasTime:
return "datetime", nullable, warnings
case hasFloat:
return "double", nullable, warnings
case hasInt:
return "bigint", nullable, warnings
case hasBool:
return "tinyint(1)", nullable, warnings
default:
return "varchar(255)", nullable, warnings
}
}
func buildMongoToMySQLAddColumnSQL(targetQueryTable string, sourceCols, targetCols []connection.ColumnDefinition) ([]string, []string) {
targetSet := make(map[string]struct{}, len(targetCols))
for _, col := range targetCols {
key := strings.ToLower(strings.TrimSpace(col.Name))
if key == "" {
continue
}
targetSet[key] = struct{}{}
}
var sqlList []string
for _, col := range sourceCols {
key := strings.ToLower(strings.TrimSpace(col.Name))
if key == "" {
continue
}
if _, ok := targetSet[key]; ok {
continue
}
sqlList = append(sqlList, fmt.Sprintf("ALTER TABLE %s ADD COLUMN %s %s NULL",
quoteQualifiedIdentByType("mysql", targetQueryTable),
quoteIdentByType("mysql", col.Name),
strings.TrimSpace(col.Type),
))
}
return sqlList, nil
}
func buildMongoToMySQLCreateTablePlan(config SyncConfig, targetQueryTable string, sourceCols []connection.ColumnDefinition, sourceDB db.Database, sourceSchema, sourceTable string) (string, []string, []string, []string, int, int, error) {
columnDefs := make([]string, 0, len(sourceCols)+1)
warnings := make([]string, 0)
unsupported := make([]string, 0)
pkCols := make([]string, 0, 1)
for _, col := range sourceCols {
columnDef := fmt.Sprintf("%s %s", quoteIdentByType("mysql", col.Name), strings.TrimSpace(col.Type))
if strings.EqualFold(strings.TrimSpace(col.Nullable), "NO") {
columnDef += " NOT NULL"
}
columnDefs = append(columnDefs, columnDef)
if col.Key == "PRI" || col.Key == "PK" {
pkCols = append(pkCols, quoteIdentByType("mysql", col.Name))
}
}
if len(pkCols) > 0 {
columnDefs = append(columnDefs, fmt.Sprintf("PRIMARY KEY (%s)", strings.Join(pkCols, ", ")))
} else {
warnings = append(warnings, "MongoDB 源集合未推断出稳定主键,目标表将不自动创建主键")
}
createSQL := fmt.Sprintf("CREATE TABLE %s (\n %s\n)", quoteQualifiedIdentByType("mysql", targetQueryTable), strings.Join(columnDefs, ",\n "))
if !config.CreateIndexes {
return createSQL, nil, dedupeStrings(warnings), dedupeStrings(unsupported), 0, 0, nil
}
indexes, err := sourceDB.GetIndexes(sourceSchema, sourceTable)
if err != nil {
warnings = append(warnings, fmt.Sprintf("读取源集合索引失败,已跳过索引迁移:%v", err))
return createSQL, nil, dedupeStrings(warnings), dedupeStrings(unsupported), 0, 0, nil
}
grouped := groupIndexDefinitions(indexes)
postSQL := make([]string, 0, len(grouped))
created := 0
skipped := 0
for _, idx := range grouped {
name := strings.TrimSpace(idx.Name)
if name == "" || strings.EqualFold(name, "_id_") || strings.EqualFold(name, "primary") {
continue
}
if len(idx.Columns) == 0 {
skipped++
unsupported = append(unsupported, fmt.Sprintf("索引 %s 缺少列定义,已跳过", name))
continue
}
quotedCols := make([]string, 0, len(idx.Columns))
for _, col := range idx.Columns {
quotedCols = append(quotedCols, quoteIdentByType("mysql", col))
}
prefix := "CREATE INDEX"
if idx.Unique {
prefix = "CREATE UNIQUE INDEX"
}
postSQL = append(postSQL, fmt.Sprintf("%s %s ON %s (%s)", prefix, quoteIdentByType("mysql", name), quoteQualifiedIdentByType("mysql", targetQueryTable), strings.Join(quotedCols, ", ")))
created++
}
return createSQL, postSQL, dedupeStrings(warnings), dedupeStrings(unsupported), created, skipped, nil
}
func containsString(items []string, target string) bool {
for _, item := range items {
if item == target {
return true
}
}
return false
}
func moveStringToFront(items []string, target string) []string {
out := make([]string, 0, len(items))
for _, item := range items {
if item == target {
continue
}
out = append(out, item)
}
return append([]string{target}, out...)
}
func buildMongoToPGLikePlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
plan := SchemaMigrationPlan{}
targetType := strings.ToLower(strings.TrimSpace(config.TargetConfig.Type))
plan.SourceSchema, plan.SourceTable = normalizeSchemaAndTable(config.SourceConfig.Type, config.SourceConfig.Database, tableName)
plan.TargetSchema, plan.TargetTable = normalizeSchemaAndTable(config.TargetConfig.Type, config.TargetConfig.Database, tableName)
plan.SourceQueryTable = qualifiedNameForQuery(config.SourceConfig.Type, plan.SourceSchema, plan.SourceTable, tableName)
plan.TargetQueryTable = qualifiedNameForQuery(config.TargetConfig.Type, plan.TargetSchema, plan.TargetTable, tableName)
plan.PlannedAction = "使用已有目标表导入"
sourceCols, warnings, err := inferMongoCollectionColumns(sourceDB, plan.SourceTable)
if err != nil {
return plan, nil, nil, err
}
plan.Warnings = append(plan.Warnings, warnings...)
if len(sourceCols) == 0 {
return plan, nil, nil, fmt.Errorf("源集合未推断出可迁移字段: %s", tableName)
}
targetCols, targetExists, err := inspectTableColumns(targetDB, plan.TargetSchema, plan.TargetTable)
if err != nil {
return plan, sourceCols, nil, fmt.Errorf("获取目标表字段失败: %w", err)
}
plan.TargetTableExists = targetExists
strategy := normalizeTargetTableStrategy(config.TargetTableStrategy)
if targetExists {
missing := diffMissingColumnNames(sourceCols, targetCols)
if len(missing) > 0 {
plan.Warnings = append(plan.Warnings, fmt.Sprintf("目标表缺失字段 %d 个:%s", len(missing), strings.Join(missing, ", ")))
}
if config.AutoAddColumns {
addSQL, addWarnings := buildMongoToPGLikeAddColumnSQL(targetType, plan.TargetQueryTable, sourceCols, targetCols)
plan.PreDataSQL = append(plan.PreDataSQL, addSQL...)
plan.Warnings = append(plan.Warnings, addWarnings...)
if len(addSQL) > 0 {
plan.PlannedAction = fmt.Sprintf("补齐缺失字段(%d)后导入", len(addSQL))
}
}
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
switch strategy {
case "existing_only":
plan.PlannedAction = "目标表不存在,需先手工创建"
plan.Warnings = append(plan.Warnings, "当前策略要求目标表已存在,执行时不会自动建表")
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
case "smart", "auto_create_if_missing":
plan.AutoCreate = true
plan.PlannedAction = "目标表不存在,将自动建表后导入"
createSQL, postSQL, moreWarnings, unsupported, idxCreate, idxSkip, err := buildMongoToPGLikeCreateTablePlan(targetType, config, plan.TargetQueryTable, sourceCols, sourceDB, plan.SourceSchema, plan.SourceTable)
if err != nil {
return plan, sourceCols, targetCols, err
}
plan.CreateTableSQL = createSQL
plan.PostDataSQL = append(plan.PostDataSQL, postSQL...)
plan.Warnings = append(plan.Warnings, moreWarnings...)
plan.UnsupportedObjects = append(plan.UnsupportedObjects, unsupported...)
plan.IndexesToCreate = idxCreate
plan.IndexesSkipped = idxSkip
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
default:
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
}
func buildMongoToPGLikeAddColumnSQL(targetType string, targetQueryTable string, sourceCols, targetCols []connection.ColumnDefinition) ([]string, []string) {
targetSet := make(map[string]struct{}, len(targetCols))
for _, col := range targetCols {
key := strings.ToLower(strings.TrimSpace(col.Name))
if key == "" {
continue
}
targetSet[key] = struct{}{}
}
var sqlList []string
var warnings []string
for _, col := range sourceCols {
key := strings.ToLower(strings.TrimSpace(col.Name))
if key == "" {
continue
}
if _, ok := targetSet[key]; ok {
continue
}
colType, mapWarnings := mapMongoInferredColumnToPGLike(col)
warnings = append(warnings, mapWarnings...)
sqlList = append(sqlList, fmt.Sprintf("ALTER TABLE %s ADD COLUMN %s %s NULL",
quoteQualifiedIdentByType(targetType, targetQueryTable),
quoteIdentByType(targetType, col.Name),
colType,
))
}
return sqlList, dedupeStrings(warnings)
}
func buildMongoToPGLikeCreateTablePlan(targetType string, config SyncConfig, targetQueryTable string, sourceCols []connection.ColumnDefinition, sourceDB db.Database, sourceSchema, sourceTable string) (string, []string, []string, []string, int, int, error) {
columnDefs := make([]string, 0, len(sourceCols)+1)
warnings := make([]string, 0)
unsupported := make([]string, 0)
pkCols := make([]string, 0, 1)
for _, col := range sourceCols {
colType, colWarnings := mapMongoInferredColumnToPGLike(col)
warnings = append(warnings, colWarnings...)
parts := []string{colType}
if strings.EqualFold(strings.TrimSpace(col.Nullable), "NO") {
parts = append(parts, "NOT NULL")
}
columnDefs = append(columnDefs, fmt.Sprintf("%s %s", quoteIdentByType(targetType, col.Name), strings.Join(parts, " ")))
if col.Key == "PRI" || col.Key == "PK" {
pkCols = append(pkCols, quoteIdentByType(targetType, col.Name))
}
}
if len(pkCols) > 0 {
columnDefs = append(columnDefs, fmt.Sprintf("PRIMARY KEY (%s)", strings.Join(pkCols, ", ")))
}
createSQL := fmt.Sprintf("CREATE TABLE %s (\n %s\n)", quoteQualifiedIdentByType(targetType, targetQueryTable), strings.Join(columnDefs, ",\n "))
if !config.CreateIndexes {
return createSQL, nil, dedupeStrings(warnings), dedupeStrings(unsupported), 0, 0, nil
}
indexes, err := sourceDB.GetIndexes(sourceSchema, sourceTable)
if err != nil {
warnings = append(warnings, fmt.Sprintf("读取源集合索引失败,已跳过索引迁移:%v", err))
return createSQL, nil, dedupeStrings(warnings), dedupeStrings(unsupported), 0, 0, nil
}
grouped := groupIndexDefinitions(indexes)
postSQL := make([]string, 0, len(grouped))
created := 0
skipped := 0
for _, idx := range grouped {
name := strings.TrimSpace(idx.Name)
if name == "" || strings.EqualFold(name, "_id_") || strings.EqualFold(name, "primary") {
continue
}
if len(idx.Columns) == 0 {
skipped++
unsupported = append(unsupported, fmt.Sprintf("索引 %s 缺少列定义,已跳过", name))
continue
}
quotedCols := make([]string, 0, len(idx.Columns))
for _, col := range idx.Columns {
quotedCols = append(quotedCols, quoteIdentByType(targetType, col))
}
prefix := "CREATE INDEX"
if idx.Unique {
prefix = "CREATE UNIQUE INDEX"
}
postSQL = append(postSQL, fmt.Sprintf("%s %s ON %s (%s)", prefix, quoteIdentByType(targetType, name), quoteQualifiedIdentByType(targetType, targetQueryTable), strings.Join(quotedCols, ", ")))
created++
}
return createSQL, postSQL, dedupeStrings(warnings), dedupeStrings(unsupported), created, skipped, nil
}
func mapMongoInferredColumnToPGLike(col connection.ColumnDefinition) (string, []string) {
raw := strings.ToLower(strings.TrimSpace(col.Type))
warnings := make([]string, 0)
switch {
case strings.HasPrefix(raw, "varchar"):
return col.Type, warnings
case raw == "json":
return "jsonb", warnings
case raw == "datetime":
return "timestamp", warnings
case raw == "tinyint(1)":
return "boolean", warnings
case raw == "double":
return "double precision", warnings
case raw == "bigint":
return "bigint", warnings
default:
return col.Type, warnings
}
}

File diff suppressed because it is too large


@@ -0,0 +1,58 @@
package sync
import (
"GoNavi-Wails/internal/connection"
"fmt"
"strings"
)
func supportsAutoAddColumnsForPair(sourceType string, targetType string) bool {
source := normalizeMigrationDBType(sourceType)
target := normalizeMigrationDBType(targetType)
if isMySQLLikeWritableTargetType(target) {
return isMySQLCoreType(source)
}
if isPGLikeTarget(target) {
return isMySQLLikeSourceType(source)
}
return false
}
func buildAddColumnSQLForPair(sourceType string, targetType string, targetQueryTable string, sourceCol connection.ColumnDefinition) (string, error) {
source := normalizeMigrationDBType(sourceType)
target := normalizeMigrationDBType(targetType)
switch {
case isMySQLCoreType(source) && isMySQLLikeWritableTargetType(target):
colType := sanitizeMySQLColumnType(sourceCol.Type)
return fmt.Sprintf("ALTER TABLE %s ADD COLUMN %s %s NULL",
quoteQualifiedIdentByType("mysql", targetQueryTable),
quoteIdentByType("mysql", sourceCol.Name),
colType,
), nil
case isMySQLLikeSourceType(source) && isPGLikeTarget(target):
// Be conservative when back-filling columns on an existing target table:
// identity/auto-increment semantics are intentionally not recreated here,
// so identity-related warnings from the mapper are ignored.
colType, _, _ := mapMySQLColumnToKingbase(sourceCol)
return fmt.Sprintf("ALTER TABLE %s ADD COLUMN %s %s NULL",
quoteQualifiedIdentByType(target, targetQueryTable),
quoteIdentByType(target, sourceCol.Name),
colType,
), nil
default:
return "", fmt.Errorf("当前不支持 source=%s target=%s 的自动补字段", sourceType, targetType)
}
}
func executeSQLStatements(execFn func(string) (int64, error), statements []string) error {
for _, stmt := range statements {
trimmed := strings.TrimSpace(stmt)
if trimmed == "" {
continue
}
if _, err := execFn(trimmed); err != nil {
return err
}
}
return nil
}


@@ -0,0 +1,53 @@
package sync
import (
"fmt"
"strings"
)
type SchemaInferenceStrategy string
const (
SchemaInferenceStrategySample SchemaInferenceStrategy = "sample"
SchemaInferenceStrategyStrict SchemaInferenceStrategy = "strict"
)
func shouldUseSchemaInference(sourceType string, targetType string) bool {
sourceModel := classifyMigrationDataModel(sourceType)
targetModel := classifyMigrationDataModel(targetType)
return sourceModel == MigrationDataModelDocument && targetModel == MigrationDataModelRelational
}
func inferMigrationObjectKind(sourceType string, targetType string) MigrationObjectKind {
sourceModel := classifyMigrationDataModel(sourceType)
targetModel := classifyMigrationDataModel(targetType)
switch {
case sourceModel == MigrationDataModelDocument || targetModel == MigrationDataModelDocument:
return MigrationObjectKindCollection
case sourceModel == MigrationDataModelKeyValue || targetModel == MigrationDataModelKeyValue:
return MigrationObjectKindKeyspace
default:
return MigrationObjectKindTable
}
}
func inferSchemaForPair(sourceType string, targetType string, objectName string) (SchemaInferenceResult, error) {
if !shouldUseSchemaInference(sourceType, targetType) {
return SchemaInferenceResult{}, fmt.Errorf("当前迁移对 %s -> %s 不需要 schema 推断", sourceType, targetType)
}
return SchemaInferenceResult{
Object: CanonicalObjectSpec{
Name: strings.TrimSpace(objectName),
Kind: MigrationObjectKindCollection,
Fields: []CanonicalFieldSpec{},
},
Issues: []SchemaInferenceIssue{
{
Level: "info",
Message: "MongoDB -> 关系型数据库的 schema 推断能力尚在建设中,当前仅提供内核入口。",
Resolution: "后续将基于样本数据生成列定义与类型降级策略。",
},
},
NeedsReview: true,
}, nil
}


@@ -0,0 +1,296 @@
package sync
import (
"GoNavi-Wails/internal/connection"
"GoNavi-Wails/internal/db"
"fmt"
"strconv"
"strings"
)
func buildTDengineToMySQLPlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
plan := SchemaMigrationPlan{}
sourceType := resolveMigrationDBType(config.SourceConfig)
targetType := resolveMigrationDBType(config.TargetConfig)
plan.SourceSchema, plan.SourceTable = normalizeSchemaAndTable(sourceType, config.SourceConfig.Database, tableName)
plan.TargetSchema, plan.TargetTable = normalizeSchemaAndTable(targetType, config.TargetConfig.Database, tableName)
plan.SourceQueryTable = qualifiedNameForQuery(sourceType, plan.SourceSchema, plan.SourceTable, tableName)
plan.TargetQueryTable = qualifiedNameForQuery(targetType, plan.TargetSchema, plan.TargetTable, tableName)
plan.PlannedAction = "使用已有目标表导入"
sourceCols, sourceExists, err := inspectTableColumns(sourceDB, plan.SourceSchema, plan.SourceTable)
if err != nil {
return plan, nil, nil, fmt.Errorf("获取源表字段失败: %w", err)
}
if !sourceExists {
return plan, nil, nil, fmt.Errorf("源表不存在或无列定义: %s", tableName)
}
targetCols, targetExists, err := inspectTableColumns(targetDB, plan.TargetSchema, plan.TargetTable)
if err != nil {
return plan, sourceCols, nil, fmt.Errorf("获取目标表字段失败: %w", err)
}
plan.TargetTableExists = targetExists
plan.Warnings = append(plan.Warnings, tdengineSemanticWarnings(sourceCols)...)
strategy := normalizeTargetTableStrategy(config.TargetTableStrategy)
if targetExists {
missing := diffMissingColumnNames(sourceCols, targetCols)
if len(missing) > 0 {
plan.Warnings = append(plan.Warnings, fmt.Sprintf("目标表缺失字段 %d 个:%s", len(missing), strings.Join(missing, ", ")))
}
if strategy != "existing_only" {
plan.Warnings = append(plan.Warnings, "TDengine 源端当前不自动补齐已有目标表字段,请先确认目标表结构")
}
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
switch strategy {
case "existing_only":
plan.PlannedAction = "目标表不存在,需先手工创建"
plan.Warnings = append(plan.Warnings, "当前策略要求目标表已存在,执行时不会自动建表")
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
case "smart", "auto_create_if_missing":
plan.AutoCreate = true
plan.PlannedAction = "目标表不存在,将自动建表后导入"
createSQL, warnings, unsupported := buildTDengineToMySQLCreateTableSQL(plan.TargetQueryTable, sourceCols)
plan.CreateTableSQL = createSQL
plan.Warnings = append(plan.Warnings, warnings...)
plan.UnsupportedObjects = append(plan.UnsupportedObjects, unsupported...)
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
default:
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
}
func buildTDengineToPGLikePlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
plan := SchemaMigrationPlan{}
sourceType := resolveMigrationDBType(config.SourceConfig)
targetType := resolveMigrationDBType(config.TargetConfig)
plan.SourceSchema, plan.SourceTable = normalizeSchemaAndTable(sourceType, config.SourceConfig.Database, tableName)
plan.TargetSchema, plan.TargetTable = normalizeSchemaAndTable(targetType, config.TargetConfig.Database, tableName)
plan.SourceQueryTable = qualifiedNameForQuery(sourceType, plan.SourceSchema, plan.SourceTable, tableName)
plan.TargetQueryTable = qualifiedNameForQuery(targetType, plan.TargetSchema, plan.TargetTable, tableName)
plan.PlannedAction = "使用已有目标表导入"
sourceCols, sourceExists, err := inspectTableColumns(sourceDB, plan.SourceSchema, plan.SourceTable)
if err != nil {
return plan, nil, nil, fmt.Errorf("获取源表字段失败: %w", err)
}
if !sourceExists {
return plan, nil, nil, fmt.Errorf("源表不存在或无列定义: %s", tableName)
}
targetCols, targetExists, err := inspectTableColumns(targetDB, plan.TargetSchema, plan.TargetTable)
if err != nil {
return plan, sourceCols, nil, fmt.Errorf("获取目标表字段失败: %w", err)
}
plan.TargetTableExists = targetExists
plan.Warnings = append(plan.Warnings, tdengineSemanticWarnings(sourceCols)...)
strategy := normalizeTargetTableStrategy(config.TargetTableStrategy)
if targetExists {
missing := diffMissingColumnNames(sourceCols, targetCols)
if len(missing) > 0 {
plan.Warnings = append(plan.Warnings, fmt.Sprintf("目标表缺失字段 %d 个:%s", len(missing), strings.Join(missing, ", ")))
}
if strategy != "existing_only" {
plan.Warnings = append(plan.Warnings, "TDengine 源端当前不自动补齐已有目标表字段,请先确认目标表结构")
}
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
switch strategy {
case "existing_only":
plan.PlannedAction = "目标表不存在,需先手工创建"
plan.Warnings = append(plan.Warnings, "当前策略要求目标表已存在,执行时不会自动建表")
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
case "smart", "auto_create_if_missing":
plan.AutoCreate = true
plan.PlannedAction = "目标表不存在,将自动建表后导入"
createSQL, warnings, unsupported := buildTDengineToPGLikeCreateTableSQL(targetType, plan.TargetQueryTable, sourceCols)
plan.CreateTableSQL = createSQL
plan.Warnings = append(plan.Warnings, warnings...)
plan.UnsupportedObjects = append(plan.UnsupportedObjects, unsupported...)
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
default:
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
}
func buildTDengineToMySQLCreateTableSQL(targetQueryTable string, sourceCols []connection.ColumnDefinition) (string, []string, []string) {
columnDefs := make([]string, 0, len(sourceCols))
warnings := make([]string, 0)
unsupported := []string{"TDengine 的索引/外键/触发器/超级表/TTL 等时序语义当前不会自动迁移"}
for _, col := range sourceCols {
def, colWarnings := buildTDengineToMySQLColumnDefinition(col)
columnDefs = append(columnDefs, fmt.Sprintf("%s %s", quoteIdentByType("mysql", col.Name), def))
warnings = append(warnings, colWarnings...)
}
createSQL := fmt.Sprintf("CREATE TABLE %s (\n %s\n)", quoteQualifiedIdentByType("mysql", targetQueryTable), strings.Join(columnDefs, ",\n "))
return createSQL, dedupeStrings(warnings), dedupeStrings(unsupported)
}
func buildTDengineToPGLikeCreateTableSQL(targetType string, targetQueryTable string, sourceCols []connection.ColumnDefinition) (string, []string, []string) {
columnDefs := make([]string, 0, len(sourceCols))
warnings := make([]string, 0)
unsupported := []string{"TDengine 的索引/外键/触发器/超级表/TTL 等时序语义当前不会自动迁移"}
for _, col := range sourceCols {
def, colWarnings := buildTDengineToPGLikeColumnDefinition(col)
columnDefs = append(columnDefs, fmt.Sprintf("%s %s", quoteIdentByType(targetType, col.Name), def))
warnings = append(warnings, colWarnings...)
}
createSQL := fmt.Sprintf("CREATE TABLE %s (\n %s\n)", quoteQualifiedIdentByType(targetType, targetQueryTable), strings.Join(columnDefs, ",\n "))
return createSQL, dedupeStrings(warnings), dedupeStrings(unsupported)
}
func buildTDengineToMySQLColumnDefinition(col connection.ColumnDefinition) (string, []string) {
targetType, warnings := mapTDengineColumnToMySQL(col)
parts := []string{targetType}
if strings.EqualFold(strings.TrimSpace(col.Nullable), "NO") {
parts = append(parts, "NOT NULL")
} else {
parts = append(parts, "NULL")
}
return strings.Join(parts, " "), warnings
}
func buildTDengineToPGLikeColumnDefinition(col connection.ColumnDefinition) (string, []string) {
targetType, warnings := mapTDengineColumnToPGLike(col)
parts := []string{targetType}
if strings.EqualFold(strings.TrimSpace(col.Nullable), "NO") {
parts = append(parts, "NOT NULL")
} else {
parts = append(parts, "NULL")
}
return strings.Join(parts, " "), warnings
}
func tdengineSemanticWarnings(sourceCols []connection.ColumnDefinition) []string {
warnings := []string{"TDengine 到关系型目标库当前仅迁移列与数据,超级表、TAG 关联、保留策略等时序语义会降级或丢失"}
for _, col := range sourceCols {
if isTDengineTagColumn(col) {
warnings = append(warnings, fmt.Sprintf("字段 %s 为 TDengine TAG 列,迁移到关系型目标后将降级为普通字段", col.Name))
}
}
return dedupeStrings(warnings)
}
func isTDengineTagColumn(col connection.ColumnDefinition) bool {
return strings.EqualFold(strings.TrimSpace(col.Key), "TAG") || strings.Contains(strings.ToUpper(strings.TrimSpace(col.Extra)), "TAG")
}
func parseTDengineType(raw string) (string, int) {
cleaned := strings.TrimSpace(strings.ToUpper(raw))
if cleaned == "" {
return "", 0
}
base := cleaned
length := 0
if idx := strings.Index(base, "("); idx >= 0 {
end := strings.Index(base[idx+1:], ")")
if end >= 0 {
lengthText := strings.TrimSpace(base[idx+1 : idx+1+end])
if v, err := strconv.Atoi(lengthText); err == nil {
length = v
}
}
base = strings.TrimSpace(base[:idx])
}
return base, length
}
func mapTDengineColumnToMySQL(col connection.ColumnDefinition) (string, []string) {
base, length := parseTDengineType(col.Type)
warnings := make([]string, 0)
if isTDengineTagColumn(col) {
warnings = append(warnings, fmt.Sprintf("字段 %s 为 TDengine TAG 列,已按普通列映射", col.Name))
}
switch base {
case "BOOL", "BOOLEAN":
return "tinyint(1)", warnings
case "TINYINT":
return "tinyint", warnings
case "UTINYINT":
return "tinyint unsigned", warnings
case "SMALLINT":
return "smallint", warnings
case "USMALLINT":
return "smallint unsigned", warnings
case "INT", "INTEGER":
return "int", warnings
case "UINT":
return "int unsigned", warnings
case "BIGINT":
return "bigint", warnings
case "UBIGINT":
return "bigint unsigned", warnings
case "FLOAT":
return "float", warnings
case "DOUBLE":
return "double", warnings
case "DECIMAL", "NUMERIC":
if length > 0 {
return strings.ToLower(strings.TrimSpace(col.Type)), warnings
}
return "decimal(38,10)", warnings
case "TIMESTAMP":
return "datetime", warnings
case "DATE":
return "date", warnings
case "JSON":
return "json", warnings
case "BINARY", "NCHAR", "VARCHAR", "VARBINARY":
if length > 0 && length <= 65535 {
return fmt.Sprintf("varchar(%d)", length), warnings
}
return "text", warnings
default:
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 暂无专门 MySQL 映射,已降级为 text", col.Name, col.Type))
return "text", warnings
}
}
func mapTDengineColumnToPGLike(col connection.ColumnDefinition) (string, []string) {
base, length := parseTDengineType(col.Type)
warnings := make([]string, 0)
if isTDengineTagColumn(col) {
warnings = append(warnings, fmt.Sprintf("字段 %s 为 TDengine TAG 列,已按普通列映射", col.Name))
}
switch base {
case "BOOL", "BOOLEAN":
return "boolean", warnings
case "TINYINT", "UTINYINT", "SMALLINT":
return "smallint", warnings
case "USMALLINT", "INT", "INTEGER":
return "integer", warnings
case "UINT", "BIGINT":
return "bigint", warnings
case "UBIGINT":
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 UBIGINT 已映射为 numeric(20,0) 以避免无符号溢出", col.Name))
return "numeric(20,0)", warnings
case "FLOAT":
return "real", warnings
case "DOUBLE":
return "double precision", warnings
case "DECIMAL", "NUMERIC":
if length > 0 {
return strings.ToLower(strings.TrimSpace(col.Type)), warnings
}
return "numeric(38,10)", warnings
case "TIMESTAMP":
return "timestamp", warnings
case "DATE":
return "date", warnings
case "JSON":
return "jsonb", warnings
case "BINARY", "NCHAR", "VARCHAR", "VARBINARY":
if length > 0 {
return fmt.Sprintf("varchar(%d)", length), warnings
}
return "text", warnings
default:
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 暂无专门 PG-like 映射,已降级为 text", col.Name, col.Type))
return "text", warnings
}
}


@@ -0,0 +1,657 @@
package sync
import (
"GoNavi-Wails/internal/connection"
"GoNavi-Wails/internal/db"
"fmt"
"strconv"
"strings"
)
type mySQLLikeToTDenginePlanner struct{}
type pgLikeToTDenginePlanner struct{}
type clickHouseToTDenginePlanner struct{}
type tdengineToTDenginePlanner struct{}
func (mySQLLikeToTDenginePlanner) Name() string { return "mysqllike-tdengine-planner" }
func (mySQLLikeToTDenginePlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if isMySQLLikeSourceType(sourceType) && targetType == "tdengine" {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (mySQLLikeToTDenginePlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildMySQLLikeToTDenginePlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (pgLikeToTDenginePlanner) Name() string { return "pglike-tdengine-planner" }
func (pgLikeToTDenginePlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if isPGLikeSource(sourceType) && targetType == "tdengine" {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (pgLikeToTDenginePlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildPGLikeToTDenginePlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func buildMySQLLikeToTDenginePlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildSourceToTDenginePlan(config, tableName, sourceDB, targetDB, isMySQLLikeTDengineTimestampCandidate, buildMySQLLikeToTDengineCreateTableSQL)
}
func buildPGLikeToTDenginePlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildSourceToTDenginePlan(config, tableName, sourceDB, targetDB, isPGLikeTDengineTimestampCandidate, buildPGLikeToTDengineCreateTableSQL)
}
func buildClickHouseToTDenginePlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildSourceToTDenginePlan(config, tableName, sourceDB, targetDB, isClickHouseTDengineTimestampCandidate, buildClickHouseToTDengineCreateTableSQL)
}
func buildTDengineToTDenginePlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildSourceToTDenginePlan(config, tableName, sourceDB, targetDB, isTDengineTDengineTimestampCandidate, buildTDengineToTDengineCreateTableSQL)
}
func (clickHouseToTDenginePlanner) Name() string { return "clickhouse-tdengine-planner" }
func (clickHouseToTDenginePlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if sourceType == "clickhouse" && targetType == "tdengine" {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (clickHouseToTDenginePlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildClickHouseToTDenginePlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
func (tdengineToTDenginePlanner) Name() string { return "tdengine-tdengine-planner" }
func (tdengineToTDenginePlanner) SupportLevel(ctx MigrationBuildContext) MigrationSupportLevel {
sourceType := resolveMigrationDBType(ctx.Config.SourceConfig)
targetType := resolveMigrationDBType(ctx.Config.TargetConfig)
if sourceType == "tdengine" && targetType == "tdengine" {
return MigrationSupportLevelFull
}
return MigrationSupportLevelUnsupported
}
func (tdengineToTDenginePlanner) BuildPlan(ctx MigrationBuildContext) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
return buildTDengineToTDenginePlan(ctx.Config, ctx.TableName, ctx.SourceDB, ctx.TargetDB)
}
type tdengineTimestampCandidate func(connection.ColumnDefinition) bool
type tdengineCreateTableBuilder func(string, []connection.ColumnDefinition, int) (string, []string, []string)
func buildSourceToTDenginePlan(config SyncConfig, tableName string, sourceDB db.Database, targetDB db.Database, isTimestamp tdengineTimestampCandidate, buildCreateSQL tdengineCreateTableBuilder) (SchemaMigrationPlan, []connection.ColumnDefinition, []connection.ColumnDefinition, error) {
plan := SchemaMigrationPlan{}
sourceType := resolveMigrationDBType(config.SourceConfig)
targetType := resolveMigrationDBType(config.TargetConfig)
plan.SourceSchema, plan.SourceTable = normalizeSchemaAndTable(sourceType, config.SourceConfig.Database, tableName)
plan.TargetSchema, plan.TargetTable = normalizeSchemaAndTable(targetType, config.TargetConfig.Database, tableName)
plan.SourceQueryTable = qualifiedNameForQuery(sourceType, plan.SourceSchema, plan.SourceTable, tableName)
plan.TargetQueryTable = qualifiedNameForQuery(targetType, plan.TargetSchema, plan.TargetTable, tableName)
plan.PlannedAction = "使用已有目标表导入"
sourceCols, sourceExists, err := inspectTableColumns(sourceDB, plan.SourceSchema, plan.SourceTable)
if err != nil {
return plan, nil, nil, fmt.Errorf("获取源表字段失败: %w", err)
}
if !sourceExists {
return plan, nil, nil, fmt.Errorf("源表不存在或无列定义: %s", tableName)
}
plan.Warnings = append(plan.Warnings, tdengineTargetBaseWarnings()...)
timestampIndex := findTDengineTimestampColumn(sourceCols, isTimestamp)
if timestampIndex < 0 {
plan.Warnings = append(plan.Warnings, tdengineTargetMissingTimeWarning())
}
targetCols, targetExists, err := inspectTableColumns(targetDB, plan.TargetSchema, plan.TargetTable)
if err != nil {
return plan, sourceCols, nil, fmt.Errorf("获取目标表字段失败: %w", err)
}
plan.TargetTableExists = targetExists
strategy := normalizeTargetTableStrategy(config.TargetTableStrategy)
if targetExists {
missing := diffMissingColumnNames(sourceCols, targetCols)
if len(missing) > 0 {
plan.Warnings = append(plan.Warnings, fmt.Sprintf("目标表缺失字段 %d 个:%s", len(missing), strings.Join(missing, ", ")))
}
if strategy != "existing_only" {
plan.Warnings = append(plan.Warnings, "TDengine 目标端当前不自动补齐已有目标表字段,请先确认目标表结构")
}
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
switch strategy {
case "existing_only":
plan.PlannedAction = "目标表不存在,需先手工创建"
plan.Warnings = append(plan.Warnings, "当前策略要求目标表已存在,执行时不会自动建表")
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
case "smart", "auto_create_if_missing":
if timestampIndex < 0 {
plan.PlannedAction = "源表未识别到可映射为 TDengine 首列的时间列,无法自动建表"
plan.UnsupportedObjects = append(plan.UnsupportedObjects, "TDengine regular table 首列必须为 TIMESTAMP,当前源表缺少可直接映射的时间列")
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
plan.AutoCreate = true
plan.PlannedAction = "目标表不存在,将自动建表后导入"
createSQL, warnings, unsupported := buildCreateSQL(plan.TargetQueryTable, sourceCols, timestampIndex)
plan.CreateTableSQL = createSQL
plan.Warnings = append(plan.Warnings, warnings...)
plan.UnsupportedObjects = append(plan.UnsupportedObjects, unsupported...)
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
default:
return dedupeSchemaMigrationPlan(plan), sourceCols, targetCols, nil
}
}
func tdengineTargetBaseWarnings() []string {
return []string{
"TDengine 目标端当前仅支持 INSERT 写入;若存在差异 update/delete,执行期会被拒绝",
"TDengine 目标端 auto-create 当前仅创建基础表,索引、外键、触发器、supertable/TAGS/TTL 不会自动迁移",
}
}
func tdengineTargetMissingTimeWarning() string {
return "源表缺少可映射的时间列,自动建表将不可用;如需继续,请先人工准备 TDengine 目标表与时间列"
}
func findTDengineTimestampColumn(sourceCols []connection.ColumnDefinition, candidate tdengineTimestampCandidate) int {
preferred := []string{"ts", "timestamp", "event_time", "eventtime", "created_at", "create_time", "occurred_at"}
for _, name := range preferred {
for idx, col := range sourceCols {
if !candidate(col) {
continue
}
if strings.EqualFold(strings.TrimSpace(col.Name), name) {
return idx
}
}
}
for idx, col := range sourceCols {
if candidate(col) {
return idx
}
}
return -1
}
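The selection logic above can be exercised in isolation: among candidate columns, well-known names win regardless of position, and only then does the first candidate in declaration order apply. Below is a minimal standalone sketch; `col`, `isTS`, and `pickTimestampIndex` are simplified stand-ins for illustration, not the real `connection.ColumnDefinition` API.

```go
package main

import (
	"fmt"
	"strings"
)

type col struct{ Name, Type string }

// isTS is a simplified timestamp-candidate check (sketch).
func isTS(c col) bool {
	t := strings.ToLower(c.Type)
	return strings.HasPrefix(t, "timestamp") || strings.HasPrefix(t, "datetime")
}

// pickTimestampIndex prefers well-known column names, then falls back
// to the first candidate in declaration order; -1 means no candidate.
func pickTimestampIndex(cols []col) int {
	preferred := []string{"ts", "timestamp", "event_time", "created_at"}
	for _, name := range preferred {
		for idx, c := range cols {
			if isTS(c) && strings.EqualFold(c.Name, name) {
				return idx
			}
		}
	}
	for idx, c := range cols {
		if isTS(c) {
			return idx
		}
	}
	return -1
}

func main() {
	cols := []col{{"updated_at", "datetime"}, {"ts", "timestamp"}, {"note", "varchar(64)"}}
	// "ts" wins although "updated_at" appears earlier and is also a candidate.
	fmt.Println(pickTimestampIndex(cols))
}
```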
func reorderTDengineColumns(sourceCols []connection.ColumnDefinition, timestampIndex int) []connection.ColumnDefinition {
if timestampIndex <= 0 || timestampIndex >= len(sourceCols) {
cloned := make([]connection.ColumnDefinition, len(sourceCols))
copy(cloned, sourceCols)
return cloned
}
ordered := make([]connection.ColumnDefinition, 0, len(sourceCols))
ordered = append(ordered, sourceCols[timestampIndex])
for idx, col := range sourceCols {
if idx == timestampIndex {
continue
}
ordered = append(ordered, col)
}
return ordered
}
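The reordering is a stable move-to-front: the chosen timestamp column becomes the first column and the relative order of the rest is preserved. A minimal sketch over plain column names (`moveToFront` is an illustrative stand-in for `reorderTDengineColumns`):

```go
package main

import "fmt"

// moveToFront returns a copy of names with the element at idx moved to
// position 0; the relative order of the remaining elements is kept.
// An out-of-range or zero idx returns an unchanged copy.
func moveToFront(names []string, idx int) []string {
	if idx <= 0 || idx >= len(names) {
		out := make([]string, len(names))
		copy(out, names)
		return out
	}
	out := make([]string, 0, len(names))
	out = append(out, names[idx])
	for i, n := range names {
		if i != idx {
			out = append(out, n)
		}
	}
	return out
}

func main() {
	fmt.Println(moveToFront([]string{"id", "name", "created_at"}, 2))
}
```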
func buildMySQLLikeToTDengineCreateTableSQL(targetQueryTable string, sourceCols []connection.ColumnDefinition, timestampIndex int) (string, []string, []string) {
ordered := reorderTDengineColumns(sourceCols, timestampIndex)
columnDefs := make([]string, 0, len(ordered))
warnings := make([]string, 0)
unsupported := []string{"源表索引/外键/触发器/唯一约束/自增语义当前不会自动迁移到 TDengine"}
if timestampIndex > 0 && timestampIndex < len(sourceCols) {
warnings = append(warnings, fmt.Sprintf("TDengine 基础表要求时间列优先,已将字段 %s 调整为首列", sourceCols[timestampIndex].Name))
}
for idx, col := range ordered {
def, colWarnings := mapMySQLLikeColumnToTDengine(col, idx == 0)
warnings = append(warnings, colWarnings...)
columnDefs = append(columnDefs, fmt.Sprintf("%s %s", quoteIdentByType("tdengine", col.Name), def))
}
createSQL := fmt.Sprintf("CREATE TABLE %s (\n %s\n)", quoteQualifiedIdentByType("tdengine", targetQueryTable), strings.Join(columnDefs, ",\n "))
return createSQL, dedupeStrings(warnings), dedupeStrings(unsupported)
}
func buildPGLikeToTDengineCreateTableSQL(targetQueryTable string, sourceCols []connection.ColumnDefinition, timestampIndex int) (string, []string, []string) {
ordered := reorderTDengineColumns(sourceCols, timestampIndex)
columnDefs := make([]string, 0, len(ordered))
warnings := make([]string, 0)
unsupported := []string{"源表索引/外键/触发器/唯一约束/identity/sequence 语义当前不会自动迁移到 TDengine"}
if timestampIndex > 0 && timestampIndex < len(sourceCols) {
warnings = append(warnings, fmt.Sprintf("TDengine 基础表要求时间列优先,已将字段 %s 调整为首列", sourceCols[timestampIndex].Name))
}
for idx, col := range ordered {
def, colWarnings := mapPGLikeColumnToTDengine(col, idx == 0)
warnings = append(warnings, colWarnings...)
columnDefs = append(columnDefs, fmt.Sprintf("%s %s", quoteIdentByType("tdengine", col.Name), def))
}
createSQL := fmt.Sprintf("CREATE TABLE %s (\n %s\n)", quoteQualifiedIdentByType("tdengine", targetQueryTable), strings.Join(columnDefs, ",\n "))
return createSQL, dedupeStrings(warnings), dedupeStrings(unsupported)
}
func buildClickHouseToTDengineCreateTableSQL(targetQueryTable string, sourceCols []connection.ColumnDefinition, timestampIndex int) (string, []string, []string) {
ordered := reorderTDengineColumns(sourceCols, timestampIndex)
columnDefs := make([]string, 0, len(ordered))
warnings := make([]string, 0)
unsupported := []string{"源表 ORDER BY/PARTITION/TTL/Projection/物化视图 语义当前不会自动迁移到 TDengine"}
if timestampIndex > 0 && timestampIndex < len(sourceCols) {
warnings = append(warnings, fmt.Sprintf("TDengine 基础表要求时间列优先,已将字段 %s 调整为首列", sourceCols[timestampIndex].Name))
}
for idx, col := range ordered {
def, colWarnings := mapClickHouseColumnToTDengine(col, idx == 0)
warnings = append(warnings, colWarnings...)
columnDefs = append(columnDefs, fmt.Sprintf("%s %s", quoteIdentByType("tdengine", col.Name), def))
}
createSQL := fmt.Sprintf("CREATE TABLE %s (\n %s\n)", quoteQualifiedIdentByType("tdengine", targetQueryTable), strings.Join(columnDefs, ",\n "))
return createSQL, dedupeStrings(warnings), dedupeStrings(unsupported)
}
func buildTDengineToTDengineCreateTableSQL(targetQueryTable string, sourceCols []connection.ColumnDefinition, timestampIndex int) (string, []string, []string) {
ordered := reorderTDengineColumns(sourceCols, timestampIndex)
columnDefs := make([]string, 0, len(ordered))
warnings := make([]string, 0)
unsupported := []string{"源表 supertable/TAGS/TTL/保留策略/索引 语义当前不会自动迁移到 TDengine regular table"}
if timestampIndex > 0 && timestampIndex < len(sourceCols) {
warnings = append(warnings, fmt.Sprintf("TDengine 基础表要求时间列优先,已将字段 %s 调整为首列", sourceCols[timestampIndex].Name))
}
for idx, col := range ordered {
def, colWarnings := mapTDengineColumnToTDengine(col, idx == 0)
warnings = append(warnings, colWarnings...)
columnDefs = append(columnDefs, fmt.Sprintf("%s %s", quoteIdentByType("tdengine", col.Name), def))
}
createSQL := fmt.Sprintf("CREATE TABLE %s (\n %s\n)", quoteQualifiedIdentByType("tdengine", targetQueryTable), strings.Join(columnDefs, ",\n "))
return createSQL, dedupeStrings(warnings), dedupeStrings(unsupported)
}
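All four builders share the same assembly shape: map each column to a TDengine definition, then join the definitions into one CREATE TABLE statement. A minimal sketch of that final step (indentation is illustrative; real identifier quoting is handled by `quoteIdentByType`/`quoteQualifiedIdentByType`):

```go
package main

import (
	"fmt"
	"strings"
)

// buildCreateSQL assembles pre-rendered column definitions into the
// same CREATE TABLE shape the builders above produce (sketch only).
func buildCreateSQL(table string, defs []string) string {
	return fmt.Sprintf("CREATE TABLE %s (\n  %s\n)", table, strings.Join(defs, ",\n  "))
}

func main() {
	defs := []string{"`ts` TIMESTAMP", "`val` DOUBLE", "`tag` VARCHAR(255)"}
	fmt.Println(buildCreateSQL("db.sensor", defs))
}
```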
func isMySQLLikeTDengineTimestampCandidate(col connection.ColumnDefinition) bool {
raw := strings.ToLower(strings.TrimSpace(col.Type))
clean := strings.ReplaceAll(raw, " unsigned", "")
clean = strings.ReplaceAll(clean, " zerofill", "")
return strings.HasPrefix(clean, "timestamp") || strings.HasPrefix(clean, "datetime")
}
func isPGLikeTDengineTimestampCandidate(col connection.ColumnDefinition) bool {
raw := strings.ToLower(strings.TrimSpace(col.Type))
return strings.HasPrefix(raw, "timestamp")
}
func isClickHouseTDengineTimestampCandidate(col connection.ColumnDefinition) bool {
lower, _ := unwrapClickHouseTDengineType(col.Type)
return strings.HasPrefix(lower, "datetime")
}
func isTDengineTDengineTimestampCandidate(col connection.ColumnDefinition) bool {
base, _ := parseTDengineType(col.Type)
return base == "TIMESTAMP"
}
func mapMySQLLikeColumnToTDengine(col connection.ColumnDefinition, forceTimestamp bool) (string, []string) {
warnings := make([]string, 0)
if forceTimestamp {
if !isMySQLLikeTDengineTimestampCandidate(col) {
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 已提升为 TDengine 首列 TIMESTAMP", col.Name, col.Type))
}
return "TIMESTAMP", warnings
}
raw := strings.ToLower(strings.TrimSpace(col.Type))
if raw == "" {
return "VARCHAR(1024)", []string{fmt.Sprintf("字段 %s 类型为空,已降级为 VARCHAR(1024)", col.Name)}
}
unsigned := strings.Contains(raw, "unsigned")
clean := strings.ReplaceAll(raw, " unsigned", "")
clean = strings.ReplaceAll(clean, " zerofill", "")
isAutoIncrement := strings.Contains(strings.ToLower(strings.TrimSpace(col.Extra)), "auto_increment")
if isAutoIncrement {
warnings = append(warnings, fmt.Sprintf("字段 %s 自增语义不会迁移到 TDengine", col.Name))
}
if col.Key == "PRI" || col.Key == "PK" {
warnings = append(warnings, fmt.Sprintf("字段 %s 主键语义不会按关系型约束迁移到 TDengine", col.Name))
}
switch {
case strings.HasPrefix(clean, "tinyint(1)") && !unsigned && !isAutoIncrement:
return "BOOL", warnings
case strings.HasPrefix(clean, "tinyint"):
if unsigned {
return "UTINYINT", warnings
}
return "TINYINT", warnings
case strings.HasPrefix(clean, "smallint"):
if unsigned {
return "USMALLINT", warnings
}
return "SMALLINT", warnings
case strings.HasPrefix(clean, "mediumint"), strings.HasPrefix(clean, "int"), strings.HasPrefix(clean, "integer"):
if unsigned {
return "UINT", warnings
}
return "INT", warnings
case strings.HasPrefix(clean, "bigint"):
if unsigned {
return "UBIGINT", warnings
}
return "BIGINT", warnings
case strings.HasPrefix(clean, "decimal"), strings.HasPrefix(clean, "numeric"):
return normalizeTDengineDecimalType(clean), warnings
case strings.HasPrefix(clean, "float"):
return "FLOAT", warnings
case strings.HasPrefix(clean, "double"):
return "DOUBLE", warnings
case strings.HasPrefix(clean, "date"):
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 date 已降级映射为 TIMESTAMP", col.Name))
return "TIMESTAMP", warnings
case strings.HasPrefix(clean, "timestamp"), strings.HasPrefix(clean, "datetime"):
return "TIMESTAMP", warnings
case strings.HasPrefix(clean, "time"):
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 暂无稳定 TDengine 时间-only 映射,已降级为 VARCHAR(64)", col.Name, col.Type))
return "VARCHAR(64)", warnings
case strings.HasPrefix(clean, "char("), strings.HasPrefix(clean, "varchar("):
return fmt.Sprintf("VARCHAR(%d)", normalizeTDengineVarcharLength(extractFirstTypeLength(clean), 255)), warnings
case strings.HasPrefix(clean, "tinytext"), strings.HasPrefix(clean, "text"), strings.HasPrefix(clean, "mediumtext"), strings.HasPrefix(clean, "longtext"):
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 已降级为 VARCHAR(4096)", col.Name, col.Type))
return "VARCHAR(4096)", warnings
case strings.HasPrefix(clean, "json"):
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 因 TDengine JSON 仅适用于 TAG,已降级为 VARCHAR(4096)", col.Name, col.Type))
return "VARCHAR(4096)", warnings
case strings.HasPrefix(clean, "enum"), strings.HasPrefix(clean, "set"):
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 已降级为 VARCHAR(255)", col.Name, col.Type))
return "VARCHAR(255)", warnings
case strings.HasPrefix(clean, "binary"), strings.HasPrefix(clean, "varbinary"), strings.HasPrefix(clean, "tinyblob"), strings.HasPrefix(clean, "blob"), strings.HasPrefix(clean, "mediumblob"), strings.HasPrefix(clean, "longblob"):
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 已按字符串语义降级为 VARCHAR(4096)", col.Name, col.Type))
return "VARCHAR(4096)", warnings
default:
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 暂无专门 TDengine 映射,已降级为 VARCHAR(1024)", col.Name, col.Type))
return "VARCHAR(1024)", warnings
}
}
func mapPGLikeColumnToTDengine(col connection.ColumnDefinition, forceTimestamp bool) (string, []string) {
warnings := make([]string, 0)
if forceTimestamp {
if raw := strings.ToLower(strings.TrimSpace(col.Type)); !strings.HasPrefix(raw, "timestamp") {
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 已提升为 TDengine 首列 TIMESTAMP", col.Name, col.Type))
}
return "TIMESTAMP", warnings
}
raw := strings.ToLower(strings.TrimSpace(col.Type))
if raw == "" {
return "VARCHAR(1024)", []string{fmt.Sprintf("字段 %s 类型为空,已降级为 VARCHAR(1024)", col.Name)}
}
if col.Key == "PRI" || col.Key == "PK" {
warnings = append(warnings, fmt.Sprintf("字段 %s 主键语义不会按关系型约束迁移到 TDengine", col.Name))
}
if strings.Contains(strings.ToLower(strings.TrimSpace(col.Extra)), "identity") || strings.Contains(strings.ToLower(strings.TrimSpace(col.Extra)), "auto_increment") {
warnings = append(warnings, fmt.Sprintf("字段 %s 自增/identity 语义不会迁移到 TDengine", col.Name))
}
switch {
case raw == "boolean" || strings.HasPrefix(raw, "bool"):
return "BOOL", warnings
case raw == "smallint":
return "SMALLINT", warnings
case raw == "integer" || raw == "int4":
return "INT", warnings
case raw == "bigint" || raw == "int8":
return "BIGINT", warnings
case strings.HasPrefix(raw, "numeric"), strings.HasPrefix(raw, "decimal"):
return normalizeTDengineDecimalType(raw), warnings
case raw == "real" || raw == "float4":
return "FLOAT", warnings
case raw == "double precision" || raw == "float8":
return "DOUBLE", warnings
case raw == "date":
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 date 已降级映射为 TIMESTAMP", col.Name))
return "TIMESTAMP", warnings
case strings.HasPrefix(raw, "timestamp"):
return "TIMESTAMP", warnings
case strings.HasPrefix(raw, "time"):
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 暂无稳定 TDengine 时间-only 映射,已降级为 VARCHAR(64)", col.Name, col.Type))
return "VARCHAR(64)", warnings
case strings.HasPrefix(raw, "character varying("), strings.HasPrefix(raw, "varchar("), strings.HasPrefix(raw, "character("), strings.HasPrefix(raw, "char("):
return fmt.Sprintf("VARCHAR(%d)", normalizeTDengineVarcharLength(extractFirstTypeLength(raw), 255)), warnings
case raw == "text":
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 text 已降级为 VARCHAR(4096)", col.Name))
return "VARCHAR(4096)", warnings
case raw == "uuid":
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 uuid 已降级为 VARCHAR(36)", col.Name))
return "VARCHAR(36)", warnings
case raw == "json" || raw == "jsonb":
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 因 TDengine JSON 仅适用于 TAG,已降级为 VARCHAR(4096)", col.Name, col.Type))
return "VARCHAR(4096)", warnings
case raw == "bytea":
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 bytea 已按字符串语义降级为 VARCHAR(4096)", col.Name))
return "VARCHAR(4096)", warnings
case strings.HasSuffix(raw, "[]") || strings.HasPrefix(raw, "array"):
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 已降级为 VARCHAR(4096)", col.Name, col.Type))
return "VARCHAR(4096)", warnings
case raw == "user-defined":
warnings = append(warnings, fmt.Sprintf("字段 %s 为用户自定义类型,已降级为 VARCHAR(1024)", col.Name))
return "VARCHAR(1024)", warnings
default:
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 暂无专门 TDengine 映射,已降级为 VARCHAR(1024)", col.Name, col.Type))
return "VARCHAR(1024)", warnings
}
}
func mapClickHouseColumnToTDengine(col connection.ColumnDefinition, forceTimestamp bool) (string, []string) {
warnings := make([]string, 0)
if forceTimestamp {
if !isClickHouseTDengineTimestampCandidate(col) {
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 已提升为 TDengine 首列 TIMESTAMP", col.Name, col.Type))
}
return "TIMESTAMP", warnings
}
lower, _ := unwrapClickHouseTDengineType(col.Type)
if lower == "" {
return "VARCHAR(1024)", []string{fmt.Sprintf("字段 %s 类型为空,已降级为 VARCHAR(1024)", col.Name)}
}
switch {
case lower == "bool" || lower == "boolean":
return "BOOL", warnings
case lower == "int8":
return "TINYINT", warnings
case lower == "uint8":
return "UTINYINT", warnings
case lower == "int16":
return "SMALLINT", warnings
case lower == "uint16":
return "USMALLINT", warnings
case lower == "int32":
return "INT", warnings
case lower == "uint32":
return "UINT", warnings
case lower == "int64":
return "BIGINT", warnings
case lower == "uint64":
return "UBIGINT", warnings
case lower == "float32":
return "FLOAT", warnings
case lower == "float64":
return "DOUBLE", warnings
case lower == "date":
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 date 已降级映射为 TIMESTAMP", col.Name))
return "TIMESTAMP", warnings
case strings.HasPrefix(lower, "datetime"):
return "TIMESTAMP", warnings
case lower == "string":
return "VARCHAR(1024)", warnings
case lower == "uuid":
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 uuid 已降级为 VARCHAR(36)", col.Name))
return "VARCHAR(36)", warnings
case lower == "json", strings.HasPrefix(lower, "map("), strings.HasPrefix(lower, "array("), strings.HasPrefix(lower, "tuple("), strings.HasPrefix(lower, "nested("):
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 已降级为 VARCHAR(4096)", col.Name, col.Type))
return "VARCHAR(4096)", warnings
case strings.HasPrefix(lower, "enum8("), strings.HasPrefix(lower, "enum16("):
warnings = append(warnings, fmt.Sprintf("字段 %s 枚举类型 %s 已降级为 VARCHAR(255)", col.Name, col.Type))
return "VARCHAR(255)", warnings
case clickHouseDecimalPattern.MatchString(lower):
parts := clickHouseDecimalPattern.FindStringSubmatch(lower)
return fmt.Sprintf("DECIMAL(%s,%s)", parts[2], parts[3]), warnings
case clickHouseStringArgsPattern.MatchString(lower):
parts := clickHouseStringArgsPattern.FindStringSubmatch(lower)
length, err := strconv.Atoi(parts[1])
if err != nil {
warnings = append(warnings, fmt.Sprintf("字段 %s FixedString 长度解析失败,已降级为 VARCHAR(255)", col.Name))
return "VARCHAR(255)", warnings
}
return fmt.Sprintf("VARCHAR(%d)", normalizeTDengineVarcharLength(length, 255)), warnings
default:
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 暂无专门 TDengine 映射,已降级为 VARCHAR(1024)", col.Name, col.Type))
return "VARCHAR(1024)", warnings
}
}
func mapTDengineColumnToTDengine(col connection.ColumnDefinition, forceTimestamp bool) (string, []string) {
warnings := make([]string, 0)
if forceTimestamp {
if !isTDengineTDengineTimestampCandidate(col) {
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 已提升为 TDengine 首列 TIMESTAMP", col.Name, col.Type))
}
return "TIMESTAMP", warnings
}
base, length := parseTDengineType(col.Type)
if base == "" {
return "VARCHAR(1024)", []string{fmt.Sprintf("字段 %s 类型为空,已降级为 VARCHAR(1024)", col.Name)}
}
if isTDengineTagColumn(col) {
warnings = append(warnings, fmt.Sprintf("字段 %s 为 TDengine TAG 列,迁移到 regular table 后将降级为普通字段", col.Name))
}
switch base {
case "BOOL", "BOOLEAN":
return "BOOL", warnings
case "TINYINT":
return "TINYINT", warnings
case "UTINYINT":
return "UTINYINT", warnings
case "SMALLINT":
return "SMALLINT", warnings
case "USMALLINT":
return "USMALLINT", warnings
case "INT", "INTEGER":
return "INT", warnings
case "UINT":
return "UINT", warnings
case "BIGINT":
return "BIGINT", warnings
case "UBIGINT":
return "UBIGINT", warnings
case "FLOAT":
return "FLOAT", warnings
case "DOUBLE":
return "DOUBLE", warnings
case "DECIMAL", "NUMERIC":
return normalizeTDengineDecimalType(col.Type), warnings
case "TIMESTAMP":
return "TIMESTAMP", warnings
case "DATE":
return "DATE", warnings
case "JSON":
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 JSON 在 TDengine regular table 中不保留 TAG 语义,已降级为 VARCHAR(4096)", col.Name))
return "VARCHAR(4096)", warnings
case "BINARY", "NCHAR", "VARCHAR", "VARBINARY":
if length > 0 {
return fmt.Sprintf("%s(%d)", base, normalizeTDengineVarcharLength(length, length)), warnings
}
fallback := 255
if base == "VARCHAR" {
fallback = 1024
}
return fmt.Sprintf("%s(%d)", base, fallback), warnings
default:
warnings = append(warnings, fmt.Sprintf("字段 %s 类型 %s 暂无专门 TDengine 同库映射,已降级为 VARCHAR(1024)", col.Name, col.Type))
return "VARCHAR(1024)", warnings
}
}
func unwrapClickHouseTDengineType(raw string) (string, bool) {
text := strings.TrimSpace(raw)
lower := strings.ToLower(text)
nullable := false
for {
switched := false
if strings.HasPrefix(lower, "nullable(") && strings.HasSuffix(lower, ")") {
text = strings.TrimSpace(text[len("Nullable(") : len(text)-1])
lower = strings.ToLower(text)
nullable = true
switched = true
}
if strings.HasPrefix(lower, "lowcardinality(") && strings.HasSuffix(lower, ")") {
text = strings.TrimSpace(text[len("LowCardinality(") : len(text)-1])
lower = strings.ToLower(text)
switched = true
}
if !switched {
break
}
}
return lower, nullable
}
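The unwrap loop peels arbitrarily nested `Nullable(...)` and `LowCardinality(...)` wrappers, remembering whether any `Nullable` layer was seen. A standalone sketch of the same peeling (`unwrap` is an illustrative stand-in for `unwrapClickHouseTDengineType`):

```go
package main

import (
	"fmt"
	"strings"
)

// unwrap strips Nullable(...) and LowCardinality(...) wrappers and
// reports whether a Nullable layer was present; the returned base
// type is lower-cased, matching the callers above.
func unwrap(raw string) (string, bool) {
	text := strings.TrimSpace(raw)
	lower := strings.ToLower(text)
	nullable := false
	for {
		switch {
		case strings.HasPrefix(lower, "nullable(") && strings.HasSuffix(lower, ")"):
			text = strings.TrimSpace(text[len("nullable(") : len(text)-1])
			nullable = true
		case strings.HasPrefix(lower, "lowcardinality(") && strings.HasSuffix(lower, ")"):
			text = strings.TrimSpace(text[len("lowcardinality(") : len(text)-1])
		default:
			return lower, nullable
		}
		lower = strings.ToLower(text)
	}
}

func main() {
	fmt.Println(unwrap("Nullable(LowCardinality(String))")) // string true
}
```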
func normalizeTDengineDecimalType(raw string) string {
text := strings.TrimSpace(raw)
if text == "" {
return "DECIMAL(38,10)"
}
lower := strings.ToLower(text)
if strings.HasPrefix(lower, "numeric") {
return "DECIMAL" + text[len("numeric"):]
}
if strings.HasPrefix(lower, "decimal") {
return "DECIMAL" + text[len("decimal"):]
}
return "DECIMAL(38,10)"
}
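The decimal normalization keeps the source `(precision,scale)` suffix verbatim and only swaps the keyword, falling back to `DECIMAL(38,10)` when nothing usable is present. A minimal sketch (`toTDengineDecimal` is an illustrative stand-in):

```go
package main

import (
	"fmt"
	"strings"
)

// toTDengineDecimal reuses the (precision,scale) suffix from a
// decimal/numeric declaration; anything else falls back to
// DECIMAL(38,10), mirroring normalizeTDengineDecimalType.
func toTDengineDecimal(raw string) string {
	text := strings.TrimSpace(raw)
	lower := strings.ToLower(text)
	switch {
	case strings.HasPrefix(lower, "numeric"):
		return "DECIMAL" + text[len("numeric"):]
	case strings.HasPrefix(lower, "decimal"):
		return "DECIMAL" + text[len("decimal"):]
	default:
		return "DECIMAL(38,10)"
	}
}

func main() {
	fmt.Println(toTDengineDecimal("numeric(10,2)")) // DECIMAL(10,2)
	fmt.Println(toTDengineDecimal(""))              // DECIMAL(38,10)
}
```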
func normalizeTDengineVarcharLength(length int, fallback int) int {
if fallback <= 0 {
fallback = 255
}
if length <= 0 {
return fallback
}
if length > 16384 {
return 16384
}
return length
}
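The length clamp has three cases: non-positive lengths take the fallback (itself defaulted to 255), oversized lengths are capped at 16384, everything else passes through. Sketched standalone (`clampVarcharLen` is an illustrative stand-in):

```go
package main

import "fmt"

// clampVarcharLen applies the same bounds as
// normalizeTDengineVarcharLength: non-positive lengths use the
// fallback, and anything above 16384 is capped at 16384.
func clampVarcharLen(length, fallback int) int {
	if fallback <= 0 {
		fallback = 255
	}
	if length <= 0 {
		return fallback
	}
	if length > 16384 {
		return 16384
	}
	return length
}

func main() {
	fmt.Println(clampVarcharLen(0, 1024), clampVarcharLen(65535, 255), clampVarcharLen(300, 255)) // 1024 16384 300
}
```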
func extractFirstTypeLength(raw string) int {
start := strings.Index(raw, "(")
if start < 0 {
return 0
}
end := strings.Index(raw[start+1:], ")")
if end < 0 {
return 0
}
inside := strings.TrimSpace(raw[start+1 : start+1+end])
if inside == "" {
return 0
}
parts := strings.SplitN(inside, ",", 2)
length, err := strconv.Atoi(strings.TrimSpace(parts[0]))
if err != nil {
return 0
}
return length
}
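The length extractor reads only the first number inside the first pair of parentheses, so `decimal(10,2)` yields 10 and a type without parentheses yields 0. A standalone sketch (`firstTypeLength` is an illustrative stand-in for `extractFirstTypeLength`):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// firstTypeLength reads the first number inside the parentheses of a
// type declaration; 0 means "no usable length".
func firstTypeLength(raw string) int {
	start := strings.Index(raw, "(")
	if start < 0 {
		return 0
	}
	end := strings.Index(raw[start+1:], ")")
	if end < 0 {
		return 0
	}
	inside := strings.TrimSpace(raw[start+1 : start+1+end])
	parts := strings.SplitN(inside, ",", 2)
	n, err := strconv.Atoi(strings.TrimSpace(parts[0]))
	if err != nil {
		return 0
	}
	return n
}

func main() {
	fmt.Println(firstTypeLength("varchar(255)"), firstTypeLength("decimal(10,2)"), firstTypeLength("text")) // 255 10 0
}
```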


@@ -0,0 +1,98 @@
package sync
import (
"GoNavi-Wails/internal/connection"
"strings"
)
func normalizeMigrationDBType(dbType string) string {
normalized := strings.ToLower(strings.TrimSpace(dbType))
switch normalized {
case "doris":
return "diros"
case "postgresql":
return "postgres"
case "dm", "dm8":
return "dameng"
case "sqlite3":
return "sqlite"
default:
return normalized
}
}
func resolveMigrationDBType(config connection.ConnectionConfig) string {
dbType := normalizeMigrationDBType(config.Type)
if dbType != "custom" {
return dbType
}
driver := strings.ToLower(strings.TrimSpace(config.Driver))
switch driver {
case "postgresql", "postgres", "pg", "pq", "pgx":
return "postgres"
case "dm", "dameng", "dm8":
return "dameng"
case "sqlite3", "sqlite":
return "sqlite"
case "sphinxql":
return "sphinx"
case "diros", "doris":
return "diros"
case "kingbase", "kingbase8", "kingbasees", "kingbasev8":
return "kingbase"
case "highgo":
return "highgo"
case "vastbase":
return "vastbase"
case "mysql", "mysql2":
return "mysql"
case "mariadb":
return "mariadb"
}
switch {
case strings.Contains(driver, "postgres"):
return "postgres"
case strings.Contains(driver, "kingbase"):
return "kingbase"
case strings.Contains(driver, "highgo"):
return "highgo"
case strings.Contains(driver, "vastbase"):
return "vastbase"
case strings.Contains(driver, "sqlite"):
return "sqlite"
case strings.Contains(driver, "sphinx"):
return "sphinx"
case strings.Contains(driver, "diros"), strings.Contains(driver, "doris"):
return "diros"
case strings.Contains(driver, "maria"):
return "mariadb"
case strings.Contains(driver, "mysql"):
return "mysql"
case strings.Contains(driver, "dameng"), strings.Contains(driver, "dm"):
return "dameng"
default:
return normalizeMigrationDBType(driver)
}
}
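Resolution is two-staged: aliases are normalized first, and only `custom` connections fall back to inspecting the driver name. A minimal sketch that reproduces a few of the mappings above (`resolveType` is an illustrative stand-in for `resolveMigrationDBType`; only a subset of aliases is shown):

```go
package main

import (
	"fmt"
	"strings"
)

// resolveType normalizes a declared db type, then, for "custom"
// connections, derives the type from the driver name (subset sketch).
func resolveType(dbType, driver string) string {
	normalize := func(s string) string {
		switch strings.ToLower(strings.TrimSpace(s)) {
		case "postgresql":
			return "postgres"
		case "dm", "dm8":
			return "dameng"
		case "sqlite3":
			return "sqlite"
		default:
			return strings.ToLower(strings.TrimSpace(s))
		}
	}
	t := normalize(dbType)
	if t != "custom" {
		return t
	}
	d := strings.ToLower(strings.TrimSpace(driver))
	switch {
	case strings.Contains(d, "postgres"), d == "pg", d == "pq", d == "pgx":
		return "postgres"
	case strings.Contains(d, "sqlite"):
		return "sqlite"
	default:
		return normalize(d)
	}
}

func main() {
	fmt.Println(resolveType("PostgreSQL", ""), resolveType("custom", "pgx"), resolveType("custom", "sqlite3"))
}
```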
func isMySQLCoreType(dbType string) bool {
switch normalizeMigrationDBType(dbType) {
case "mysql", "mariadb", "diros":
return true
default:
return false
}
}
func isMySQLLikeSourceType(dbType string) bool {
if isMySQLCoreType(dbType) {
return true
}
return normalizeMigrationDBType(dbType) == "sphinx"
}
func isMySQLLikeWritableTargetType(dbType string) bool {
return isMySQLCoreType(dbType)
}


@@ -1,7 +1,7 @@
package sync
import (
"GoNavi-Wails/internal/db"
+"errors"
"fmt"
"strings"
)
@@ -36,12 +36,18 @@ func (s *SyncEngine) Preview(config SyncConfig, tableName string, limit int) (Ta
if limit > 500 {
limit = 500
}
if isRedisToMongoKeyspacePair(config) {
return s.previewRedisToMongo(config, tableName, limit)
}
if isMongoToRedisKeyspacePair(config) {
return s.previewMongoToRedis(config, tableName, limit)
}
-sourceDB, err := db.NewDatabase(config.SourceConfig.Type)
+sourceDB, err := newSyncDatabase(config.SourceConfig.Type)
if err != nil {
return TableDiffPreview{}, fmt.Errorf("初始化源数据库驱动失败: %w", err)
}
-targetDB, err := db.NewDatabase(config.TargetConfig.Type)
+targetDB, err := newSyncDatabase(config.TargetConfig.Type)
if err != nil {
return TableDiffPreview{}, fmt.Errorf("初始化目标数据库驱动失败: %w", err)
}
@@ -56,14 +62,12 @@ func (s *SyncEngine) Preview(config SyncConfig, tableName string, limit int) (Ta
}
defer targetDB.Close()
sourceSchema, sourceTable := normalizeSchemaAndTable(config.SourceConfig.Type, config.SourceConfig.Database, tableName)
targetSchema, targetTable := normalizeSchemaAndTable(config.TargetConfig.Type, config.TargetConfig.Database, tableName)
sourceQueryTable := qualifiedNameForQuery(config.SourceConfig.Type, sourceSchema, sourceTable, tableName)
targetQueryTable := qualifiedNameForQuery(config.TargetConfig.Type, targetSchema, targetTable, tableName)
-cols, err := sourceDB.GetColumns(sourceSchema, sourceTable)
+plan, cols, _, err := buildSchemaMigrationPlan(config, tableName, sourceDB, targetDB)
if err != nil {
-return TableDiffPreview{}, fmt.Errorf("获取源表字段失败: %w", err)
+return TableDiffPreview{}, err
}
+if !plan.TargetTableExists && !plan.AutoCreate {
+return TableDiffPreview{}, errors.New(firstNonEmpty(plan.PlannedAction, "目标表不存在,无法预览差异"))
+}
pkCols := make([]string, 0, 2)
@@ -80,13 +84,17 @@ func (s *SyncEngine) Preview(config SyncConfig, tableName string, limit int) (Ta
}
pkCol := pkCols[0]
-sourceRows, _, err := sourceDB.Query(fmt.Sprintf("SELECT * FROM %s", quoteQualifiedIdentByType(config.SourceConfig.Type, sourceQueryTable)))
+sourceRows, _, err := sourceDB.Query(fmt.Sprintf("SELECT * FROM %s", quoteQualifiedIdentByType(resolveMigrationDBType(config.SourceConfig), plan.SourceQueryTable)))
if err != nil {
return TableDiffPreview{}, fmt.Errorf("读取源表失败: %w", err)
}
-targetRows, _, err := targetDB.Query(fmt.Sprintf("SELECT * FROM %s", quoteQualifiedIdentByType(config.TargetConfig.Type, targetQueryTable)))
-if err != nil {
-return TableDiffPreview{}, fmt.Errorf("读取目标表失败: %w", err)
+targetRows := make([]map[string]interface{}, 0)
+if plan.TargetTableExists {
+targetRows, _, err = targetDB.Query(fmt.Sprintf("SELECT * FROM %s", quoteQualifiedIdentByType(resolveMigrationDBType(config.TargetConfig), plan.TargetQueryTable)))
+if err != nil {
+return TableDiffPreview{}, fmt.Errorf("读取目标表失败: %w", err)
+}
+}
targetMap := make(map[string]map[string]interface{}, len(targetRows))
@@ -133,12 +141,7 @@ func (s *SyncEngine) Preview(config SyncConfig, tableName string, limit int) (Ta
if len(changedColumns) > 0 {
out.TotalUpdates++
if len(out.Updates) < limit {
-out.Updates = append(out.Updates, PreviewUpdateRow{
-PK: pkVal,
-ChangedColumns: changedColumns,
-Source: sRow,
-Target: tRow,
-})
+out.Updates = append(out.Updates, PreviewUpdateRow{PK: pkVal, ChangedColumns: changedColumns, Source: sRow, Target: tRow})
}
}
continue


@@ -0,0 +1,490 @@
package sync
import (
"GoNavi-Wails/internal/connection"
"GoNavi-Wails/internal/db"
redispkg "GoNavi-Wails/internal/redis"
"fmt"
"sort"
"strings"
"testing"
)
type fakeRedisMigrationClient struct {
values map[string]*redispkg.RedisValue
scannedKeys []string
connectConfig connection.ConnectionConfig
closed bool
}
func (f *fakeRedisMigrationClient) Connect(config connection.ConnectionConfig) error {
f.connectConfig = config
return nil
}
func (f *fakeRedisMigrationClient) Close() error {
f.closed = true
return nil
}
func (f *fakeRedisMigrationClient) ScanKeys(pattern string, cursor uint64, count int64) (*redispkg.RedisScanResult, error) {
items := make([]redispkg.RedisKeyInfo, 0, len(f.scannedKeys))
for _, key := range f.scannedKeys {
items = append(items, redispkg.RedisKeyInfo{Key: key, Type: "string", TTL: -1})
}
return &redispkg.RedisScanResult{Keys: items, Cursor: "0"}, nil
}
func (f *fakeRedisMigrationClient) GetKeyType(key string) (string, error) {
if value, ok := f.values[key]; ok && value != nil {
return value.Type, nil
}
return "none", nil
}
func (f *fakeRedisMigrationClient) GetValue(key string) (*redispkg.RedisValue, error) {
if value, ok := f.values[key]; ok {
return value, nil
}
return nil, fmt.Errorf("key not found: %s", key)
}
func (f *fakeRedisMigrationClient) DeleteKeys(keys []string) (int64, error) {
var deleted int64
for _, key := range keys {
if _, ok := f.values[key]; ok {
delete(f.values, key)
deleted++
}
}
return deleted, nil
}
func (f *fakeRedisMigrationClient) SetTTL(key string, ttl int64) error {
value, ok := f.values[key]
if !ok {
return nil
}
value.TTL = ttl
return nil
}
func (f *fakeRedisMigrationClient) SetString(key, value string, ttl int64) error {
if f.values == nil {
f.values = map[string]*redispkg.RedisValue{}
}
f.values[key] = &redispkg.RedisValue{Type: "string", TTL: ttl, Value: value, Length: int64(len(value))}
return nil
}
func (f *fakeRedisMigrationClient) SetHashField(key, field, value string) error {
if f.values == nil {
f.values = map[string]*redispkg.RedisValue{}
}
current, ok := f.values[key]
if !ok || current == nil || current.Type != "hash" {
current = &redispkg.RedisValue{Type: "hash", TTL: -1, Value: map[string]string{}}
f.values[key] = current
}
hash, _ := current.Value.(map[string]string)
if hash == nil {
hash = map[string]string{}
}
hash[field] = value
current.Value = hash
current.Length = int64(len(hash))
return nil
}
func (f *fakeRedisMigrationClient) ListPush(key string, values ...string) error {
if f.values == nil {
f.values = map[string]*redispkg.RedisValue{}
}
current, ok := f.values[key]
if !ok || current == nil || current.Type != "list" {
current = &redispkg.RedisValue{Type: "list", TTL: -1, Value: []string{}}
f.values[key] = current
}
list, _ := current.Value.([]string)
list = append(list, values...)
current.Value = list
current.Length = int64(len(list))
return nil
}
func (f *fakeRedisMigrationClient) SetAdd(key string, members ...string) error {
if f.values == nil {
f.values = map[string]*redispkg.RedisValue{}
}
current, ok := f.values[key]
if !ok || current == nil || current.Type != "set" {
current = &redispkg.RedisValue{Type: "set", TTL: -1, Value: []string{}}
f.values[key] = current
}
setValues, _ := current.Value.([]string)
seen := make(map[string]struct{}, len(setValues)+len(members))
for _, item := range setValues {
seen[item] = struct{}{}
}
for _, item := range members {
if _, ok := seen[item]; ok {
continue
}
seen[item] = struct{}{}
setValues = append(setValues, item)
}
sort.Strings(setValues)
current.Value = setValues
current.Length = int64(len(setValues))
return nil
}
func (f *fakeRedisMigrationClient) ZSetAdd(key string, members ...redispkg.ZSetMember) error {
if f.values == nil {
f.values = map[string]*redispkg.RedisValue{}
}
copied := append([]redispkg.ZSetMember(nil), members...)
sort.Slice(copied, func(i, j int) bool {
if copied[i].Score == copied[j].Score {
return copied[i].Member < copied[j].Member
}
return copied[i].Score < copied[j].Score
})
f.values[key] = &redispkg.RedisValue{Type: "zset", TTL: -1, Value: copied, Length: int64(len(copied))}
return nil
}
func (f *fakeRedisMigrationClient) StreamAdd(key string, fields map[string]string, id string) (string, error) {
if f.values == nil {
f.values = map[string]*redispkg.RedisValue{}
}
current, ok := f.values[key]
if !ok || current == nil || current.Type != "stream" {
current = &redispkg.RedisValue{Type: "stream", TTL: -1, Value: []redispkg.StreamEntry{}}
f.values[key] = current
}
entries, _ := current.Value.([]redispkg.StreamEntry)
entryID := id
if entryID == "" {
entryID = fmt.Sprintf("%d-0", len(entries)+1)
}
entries = append(entries, redispkg.StreamEntry{ID: entryID, Fields: fields})
current.Value = entries
current.Length = int64(len(entries))
return entryID, nil
}
type fakeRedisMongoTargetDB struct {
tables []string
queryTable string
queryRows []map[string]interface{}
execs []string
applyTable string
applySet connection.ChangeSet
}
func (f *fakeRedisMongoTargetDB) Connect(config connection.ConnectionConfig) error { return nil }
func (f *fakeRedisMongoTargetDB) Close() error { return nil }
func (f *fakeRedisMongoTargetDB) Ping() error { return nil }
func (f *fakeRedisMongoTargetDB) Query(query string) ([]map[string]interface{}, []string, error) {
queryTable := strings.TrimSpace(f.queryTable)
if queryTable == "" {
queryTable = "redis_db_0_keys"
}
if strings.Contains(query, fmt.Sprintf(`"find":"%s"`, queryTable)) {
return f.queryRows, []string{"_id", "key", "value"}, nil
}
return nil, nil, nil
}
func (f *fakeRedisMongoTargetDB) Exec(query string) (int64, error) {
f.execs = append(f.execs, query)
return 1, nil
}
func (f *fakeRedisMongoTargetDB) GetDatabases() ([]string, error) { return []string{"app"}, nil }
func (f *fakeRedisMongoTargetDB) GetTables(dbName string) ([]string, error) {
return f.tables, nil
}
func (f *fakeRedisMongoTargetDB) GetCreateStatement(dbName, tableName string) (string, error) {
return "", nil
}
func (f *fakeRedisMongoTargetDB) GetColumns(dbName, tableName string) ([]connection.ColumnDefinition, error) {
return nil, nil
}
func (f *fakeRedisMongoTargetDB) GetAllColumns(dbName string) ([]connection.ColumnDefinitionWithTable, error) {
return nil, nil
}
func (f *fakeRedisMongoTargetDB) GetIndexes(dbName, tableName string) ([]connection.IndexDefinition, error) {
return nil, nil
}
func (f *fakeRedisMongoTargetDB) GetForeignKeys(dbName, tableName string) ([]connection.ForeignKeyDefinition, error) {
return nil, nil
}
func (f *fakeRedisMongoTargetDB) GetTriggers(dbName, tableName string) ([]connection.TriggerDefinition, error) {
return nil, nil
}
func (f *fakeRedisMongoTargetDB) ApplyChanges(tableName string, changes connection.ChangeSet) error {
f.applyTable = tableName
f.applySet = changes
return nil
}
type fakeMongoRedisSourceDB struct {
tables []string
rowsByTable map[string][]map[string]interface{}
connectConfig connection.ConnectionConfig
}
func (f *fakeMongoRedisSourceDB) Connect(config connection.ConnectionConfig) error {
f.connectConfig = config
return nil
}
func (f *fakeMongoRedisSourceDB) Close() error { return nil }
func (f *fakeMongoRedisSourceDB) Ping() error { return nil }
func (f *fakeMongoRedisSourceDB) Query(query string) ([]map[string]interface{}, []string, error) {
for tableName, rows := range f.rowsByTable {
if strings.Contains(query, fmt.Sprintf(`"find":"%s"`, tableName)) {
return rows, []string{"_id", "key", "type", "ttl", "value"}, nil
}
}
return nil, nil, fmt.Errorf("unexpected query: %s", query)
}
func (f *fakeMongoRedisSourceDB) Exec(query string) (int64, error) { return 0, nil }
func (f *fakeMongoRedisSourceDB) GetDatabases() ([]string, error) { return []string{"app"}, nil }
func (f *fakeMongoRedisSourceDB) GetTables(dbName string) ([]string, error) {
return f.tables, nil
}
func (f *fakeMongoRedisSourceDB) GetCreateStatement(dbName, tableName string) (string, error) {
return "", nil
}
func (f *fakeMongoRedisSourceDB) GetColumns(dbName, tableName string) ([]connection.ColumnDefinition, error) {
return nil, nil
}
func (f *fakeMongoRedisSourceDB) GetAllColumns(dbName string) ([]connection.ColumnDefinitionWithTable, error) {
return nil, nil
}
func (f *fakeMongoRedisSourceDB) GetIndexes(dbName, tableName string) ([]connection.IndexDefinition, error) {
return nil, nil
}
func (f *fakeMongoRedisSourceDB) GetForeignKeys(dbName, tableName string) ([]connection.ForeignKeyDefinition, error) {
return nil, nil
}
func (f *fakeMongoRedisSourceDB) GetTriggers(dbName, tableName string) ([]connection.TriggerDefinition, error) {
return nil, nil
}
func TestRunSync_RedisToMongoAppliesInsertAndUpdate(t *testing.T) {
fakeRedis := &fakeRedisMigrationClient{
values: map[string]*redispkg.RedisValue{
"user:1": {Type: "hash", TTL: 120, Length: 2, Value: map[string]string{"name": "alice"}},
"user:2": {Type: "string", TTL: -1, Length: 1, Value: "online"},
},
}
fakeTarget := &fakeRedisMongoTargetDB{
tables: []string{"redis_db_0_keys"},
queryRows: []map[string]interface{}{
{"_id": "db0:user:1", "redisDb": 0, "key": "user:1", "type": "hash", "ttl": 120, "length": int64(2), "value": map[string]interface{}{"name": "old"}},
},
}
oldNewRedisClient := newRedisSourceClient
oldNewDatabase := newSyncDatabase
defer func() {
newRedisSourceClient = oldNewRedisClient
newSyncDatabase = oldNewDatabase
}()
newRedisSourceClient = func() redisMigrationClient { return fakeRedis }
newSyncDatabase = func(dbType string) (db.Database, error) { return fakeTarget, nil }
engine := NewSyncEngine(Reporter{})
result := engine.RunSync(SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "redis", Database: "0"},
TargetConfig: connection.ConnectionConfig{Type: "mongodb", Database: "app"},
Tables: []string{"user:1", "user:2"},
Content: "data",
Mode: "insert_update",
})
if !result.Success {
t.Fatalf("expected success, got: %+v", result)
}
if fakeRedis.connectConfig.RedisDB != 0 {
t.Fatalf("expected redis db 0, got %d", fakeRedis.connectConfig.RedisDB)
}
if fakeTarget.applyTable != "redis_db_0_keys" {
t.Fatalf("unexpected apply table: %s", fakeTarget.applyTable)
}
if len(fakeTarget.applySet.Inserts) != 1 || len(fakeTarget.applySet.Updates) != 1 {
t.Fatalf("unexpected change set: %+v", fakeTarget.applySet)
}
}
func TestRunSync_RedisToMongoUsesConfiguredCollectionName(t *testing.T) {
fakeRedis := &fakeRedisMigrationClient{
values: map[string]*redispkg.RedisValue{
"user:1": {Type: "string", TTL: -1, Length: 1, Value: "online"},
},
}
fakeTarget := &fakeRedisMongoTargetDB{
tables: []string{"custom_keyspace_docs"},
queryTable: "custom_keyspace_docs",
}
oldNewRedisClient := newRedisSourceClient
oldNewDatabase := newSyncDatabase
defer func() {
newRedisSourceClient = oldNewRedisClient
newSyncDatabase = oldNewDatabase
}()
newRedisSourceClient = func() redisMigrationClient { return fakeRedis }
newSyncDatabase = func(dbType string) (db.Database, error) { return fakeTarget, nil }
engine := NewSyncEngine(Reporter{})
result := engine.RunSync(SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "redis", Database: "0"},
TargetConfig: connection.ConnectionConfig{Type: "mongodb", Database: "app"},
Tables: []string{"user:1"},
Content: "data",
Mode: "insert_update",
MongoCollectionName: "custom_keyspace_docs",
})
if !result.Success {
t.Fatalf("expected success, got: %+v", result)
}
if fakeTarget.applyTable != "custom_keyspace_docs" {
t.Fatalf("unexpected apply table: %s", fakeTarget.applyTable)
}
}
func TestPreview_RedisToMongoReturnsDocumentPreview(t *testing.T) {
fakeRedis := &fakeRedisMigrationClient{
values: map[string]*redispkg.RedisValue{
"session:1": {Type: "string", TTL: 60, Length: 1, Value: "token"},
},
}
fakeTarget := &fakeRedisMongoTargetDB{}
oldNewRedisClient := newRedisSourceClient
oldNewDatabase := newSyncDatabase
defer func() {
newRedisSourceClient = oldNewRedisClient
newSyncDatabase = oldNewDatabase
}()
newRedisSourceClient = func() redisMigrationClient { return fakeRedis }
newSyncDatabase = func(dbType string) (db.Database, error) { return fakeTarget, nil }
engine := NewSyncEngine(Reporter{})
preview, err := engine.Preview(SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "redis", Database: "0"},
TargetConfig: connection.ConnectionConfig{Type: "mongodb", Database: "app"},
Tables: []string{"session:1"},
Content: "data",
Mode: "insert_update",
}, "session:1", 20)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if preview.PKColumn != "_id" {
t.Fatalf("unexpected pk column: %s", preview.PKColumn)
}
if preview.TotalInserts != 1 || len(preview.Inserts) != 1 {
t.Fatalf("unexpected preview: %+v", preview)
}
if preview.Inserts[0].PK != "db0:session:1" {
t.Fatalf("unexpected preview pk: %+v", preview.Inserts[0])
}
}
func TestRunSync_MongoToRedisAppliesStringAndHash(t *testing.T) {
fakeSource := &fakeMongoRedisSourceDB{
tables: []string{"redis_db_0_keys"},
rowsByTable: map[string][]map[string]interface{}{
"redis_db_0_keys": {
{"_id": "db0:session:1", "key": "session:1", "type": "string", "ttl": int64(60), "value": "token"},
{"_id": "db0:user:1", "key": "user:1", "type": "hash", "ttl": int64(120), "value": map[string]interface{}{"name": "alice", "role": "admin"}},
},
},
}
fakeRedis := &fakeRedisMigrationClient{
values: map[string]*redispkg.RedisValue{
"user:1": {Type: "hash", TTL: 120, Length: 1, Value: map[string]string{"name": "old"}},
},
}
oldNewRedisClient := newRedisSourceClient
oldNewDatabase := newSyncDatabase
defer func() {
newRedisSourceClient = oldNewRedisClient
newSyncDatabase = oldNewDatabase
}()
newRedisSourceClient = func() redisMigrationClient { return fakeRedis }
newSyncDatabase = func(dbType string) (db.Database, error) { return fakeSource, nil }
engine := NewSyncEngine(Reporter{})
result := engine.RunSync(SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mongodb", Database: "app"},
TargetConfig: connection.ConnectionConfig{Type: "redis", Database: "0"},
Tables: []string{"redis_db_0_keys"},
Content: "data",
Mode: "insert_update",
})
if !result.Success {
t.Fatalf("expected success, got: %+v", result)
}
if fakeRedis.connectConfig.RedisDB != 0 {
t.Fatalf("expected redis db 0, got %d", fakeRedis.connectConfig.RedisDB)
}
if got := fakeRedis.values["session:1"]; got == nil || got.Type != "string" || got.Value != "token" || got.TTL != 60 {
t.Fatalf("unexpected string value: %+v", got)
}
gotHash, _ := fakeRedis.values["user:1"].Value.(map[string]string)
if gotHash["name"] != "alice" || gotHash["role"] != "admin" {
t.Fatalf("unexpected hash value: %+v", fakeRedis.values["user:1"])
}
if result.RowsInserted != 1 || result.RowsUpdated != 1 {
t.Fatalf("unexpected sync result: %+v", result)
}
}
func TestPreview_MongoToRedisReturnsCollectionPreview(t *testing.T) {
fakeSource := &fakeMongoRedisSourceDB{
tables: []string{"redis_db_0_keys"},
rowsByTable: map[string][]map[string]interface{}{
"redis_db_0_keys": {
{"_id": "db0:session:1", "key": "session:1", "type": "string", "ttl": int64(60), "value": "token"},
},
},
}
fakeRedis := &fakeRedisMigrationClient{values: map[string]*redispkg.RedisValue{}}
oldNewRedisClient := newRedisSourceClient
oldNewDatabase := newSyncDatabase
defer func() {
newRedisSourceClient = oldNewRedisClient
newSyncDatabase = oldNewDatabase
}()
newRedisSourceClient = func() redisMigrationClient { return fakeRedis }
newSyncDatabase = func(dbType string) (db.Database, error) { return fakeSource, nil }
engine := NewSyncEngine(Reporter{})
preview, err := engine.Preview(SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mongodb", Database: "app"},
TargetConfig: connection.ConnectionConfig{Type: "redis", Database: "0"},
Tables: []string{"redis_db_0_keys"},
Content: "data",
Mode: "insert_update",
}, "redis_db_0_keys", 20)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if preview.Table != "redis_db_0_keys" || preview.PKColumn != "key" {
t.Fatalf("unexpected preview header: %+v", preview)
}
if preview.TotalInserts != 1 || len(preview.Inserts) != 1 {
t.Fatalf("unexpected preview rows: %+v", preview)
}
if preview.Inserts[0].PK != "session:1" {
t.Fatalf("unexpected preview pk: %+v", preview.Inserts[0])
}
}

File diff suppressed because it is too large


@@ -0,0 +1,957 @@
package sync
import (
"GoNavi-Wails/internal/connection"
"context"
"reflect"
"strings"
"testing"
)
type fakeMigrationDB struct {
columns map[string][]connection.ColumnDefinition
indexes map[string][]connection.IndexDefinition
queryData map[string][]map[string]interface{}
queryCols map[string][]string
}
func (f *fakeMigrationDB) Connect(config connection.ConnectionConfig) error { return nil }
func (f *fakeMigrationDB) Close() error { return nil }
func (f *fakeMigrationDB) Ping() error { return nil }
func (f *fakeMigrationDB) Query(query string) ([]map[string]interface{}, []string, error) {
if rows, ok := f.queryData[query]; ok {
return rows, f.queryCols[query], nil
}
return nil, nil, nil
}
func (f *fakeMigrationDB) Exec(query string) (int64, error) { return 0, nil }
func (f *fakeMigrationDB) GetDatabases() ([]string, error) { return nil, nil }
func (f *fakeMigrationDB) GetTables(dbName string) ([]string, error) {
return nil, nil
}
func (f *fakeMigrationDB) GetCreateStatement(dbName, tableName string) (string, error) {
return "", nil
}
func (f *fakeMigrationDB) GetColumns(dbName, tableName string) ([]connection.ColumnDefinition, error) {
key := dbName + "." + tableName
if rows, ok := f.columns[key]; ok {
return rows, nil
}
return []connection.ColumnDefinition{}, nil
}
func (f *fakeMigrationDB) GetAllColumns(dbName string) ([]connection.ColumnDefinitionWithTable, error) {
return nil, nil
}
func (f *fakeMigrationDB) GetIndexes(dbName, tableName string) ([]connection.IndexDefinition, error) {
key := dbName + "." + tableName
if rows, ok := f.indexes[key]; ok {
return rows, nil
}
return nil, nil
}
func (f *fakeMigrationDB) GetForeignKeys(dbName, tableName string) ([]connection.ForeignKeyDefinition, error) {
return nil, nil
}
func (f *fakeMigrationDB) GetTriggers(dbName, tableName string) ([]connection.TriggerDefinition, error) {
return nil, nil
}
func (f *fakeMigrationDB) QueryContext(ctx context.Context, query string) ([]map[string]interface{}, []string, error) {
return f.Query(query)
}
func (f *fakeMigrationDB) ExecContext(ctx context.Context, query string) (int64, error) {
return 0, nil
}
func TestBuildMySQLToKingbaseColumnDefinition_AutoIncrementAndBoolean(t *testing.T) {
t.Parallel()
def, warnings := buildMySQLToKingbaseColumnDefinition(connection.ColumnDefinition{
Name: "id",
Type: "int unsigned",
Nullable: "NO",
Extra: "auto_increment",
})
if !strings.Contains(def, "bigint") || !strings.Contains(def, "GENERATED BY DEFAULT AS IDENTITY") || !strings.Contains(def, "NOT NULL") {
t.Fatalf("unexpected definition: %s", def)
}
if len(warnings) != 0 {
t.Fatalf("unexpected warnings: %v", warnings)
}
def, warnings = buildMySQLToKingbaseColumnDefinition(connection.ColumnDefinition{
Name: "enabled",
Type: "tinyint(1)",
Nullable: "YES",
Default: stringPtr("1"),
})
if !strings.Contains(def, "boolean") || !strings.Contains(def, "DEFAULT TRUE") {
t.Fatalf("unexpected boolean definition: %s", def)
}
if len(warnings) != 0 {
t.Fatalf("unexpected warnings for boolean: %v", warnings)
}
}
func TestBuildMySQLToKingbaseCreateTablePlan_GeneratesAndSkipsIndexes(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
indexes: map[string][]connection.IndexDefinition{
"shop.orders": {
{Name: "PRIMARY", ColumnName: "id", NonUnique: 0, SeqInIndex: 1, IndexType: "BTREE"},
{Name: "idx_user_status", ColumnName: "user_id", NonUnique: 1, SeqInIndex: 1, IndexType: "BTREE"},
{Name: "idx_user_status", ColumnName: "status", NonUnique: 1, SeqInIndex: 2, IndexType: "BTREE"},
{Name: "idx_name_prefix", ColumnName: "name", NonUnique: 1, SeqInIndex: 1, IndexType: "BTREE", SubPart: 12},
{Name: "idx_fulltext_note", ColumnName: "note", NonUnique: 1, SeqInIndex: 1, IndexType: "FULLTEXT"},
},
},
}
cols := []connection.ColumnDefinition{
{Name: "id", Type: "bigint", Nullable: "NO", Key: "PRI", Extra: "auto_increment"},
{Name: "user_id", Type: "bigint", Nullable: "NO"},
{Name: "status", Type: "varchar(32)", Nullable: "YES"},
{Name: "name", Type: "varchar(128)", Nullable: "YES"},
{Name: "note", Type: "text", Nullable: "YES"},
}
cfg := SyncConfig{CreateIndexes: true}
createSQL, postSQL, warnings, unsupported, idxCreate, idxSkip, err := buildMySQLToKingbaseCreateTablePlan(cfg, "public.orders", cols, sourceDB, "shop", "orders")
if err != nil {
t.Fatalf("buildMySQLToKingbaseCreateTablePlan returned error: %v", err)
}
if !strings.Contains(createSQL, `CREATE TABLE "public"."orders"`) {
t.Fatalf("unexpected create SQL: %s", createSQL)
}
if !strings.Contains(createSQL, `PRIMARY KEY ("id")`) {
t.Fatalf("create SQL missing primary key: %s", createSQL)
}
if idxCreate != 1 || idxSkip != 2 {
t.Fatalf("unexpected index summary: create=%d skip=%d", idxCreate, idxSkip)
}
if len(postSQL) != 1 || !strings.Contains(postSQL[0], `CREATE INDEX "idx_user_status"`) {
t.Fatalf("unexpected post SQL: %v", postSQL)
}
if len(warnings) != 0 {
t.Fatalf("unexpected warnings: %v", warnings)
}
// The engine reports unsupported index features as Chinese messages; the literals
// below must match the implementation byte-for-byte. Roughly: "index idx_name_prefix
// uses a prefix length, migration not yet supported" and "index idx_fulltext_note
// type=FULLTEXT is not yet supported for automatic migration".
wantUnsupported := []string{
"索引 idx_name_prefix 使用前缀长度,当前暂不支持迁移",
"索引 idx_fulltext_note 类型=FULLTEXT当前暂不支持自动迁移",
}
if !reflect.DeepEqual(unsupported, wantUnsupported) {
t.Fatalf("unexpected unsupported objects: got=%v want=%v", unsupported, wantUnsupported)
}
}
func TestBuildSchemaMigrationPlan_AutoCreateWhenTargetMissing(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"shop.orders": {
{Name: "id", Type: "bigint", Nullable: "NO", Key: "PRI", Extra: "auto_increment"},
{Name: "name", Type: "varchar(128)", Nullable: "YES"},
},
},
indexes: map[string][]connection.IndexDefinition{},
}
targetDB := &fakeMigrationDB{columns: map[string][]connection.ColumnDefinition{}}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mysql", Database: "shop"},
TargetConfig: connection.ConnectionConfig{Type: "kingbase", Database: "demo"},
TargetTableStrategy: "smart",
CreateIndexes: true,
}
plan, sourceCols, targetCols, err := buildSchemaMigrationPlan(cfg, "orders", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildSchemaMigrationPlan returned error: %v", err)
}
if len(sourceCols) != 2 || len(targetCols) != 0 {
t.Fatalf("unexpected columns lengths: source=%d target=%d", len(sourceCols), len(targetCols))
}
if plan.TargetTableExists {
t.Fatalf("expected target table missing")
}
if !plan.AutoCreate {
t.Fatalf("expected auto create enabled")
}
if !strings.Contains(plan.PlannedAction, "自动建表") { // "自动建表" = "auto-create table"
t.Fatalf("unexpected planned action: %s", plan.PlannedAction)
}
if !strings.Contains(plan.CreateTableSQL, `CREATE TABLE "public"."orders"`) {
t.Fatalf("unexpected create table SQL: %s", plan.CreateTableSQL)
}
}
func stringPtr(v string) *string { return &v }
func TestBuildPGLikeToMySQLCreateTablePlan_GeneratesMySQLDDL(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
indexes: map[string][]connection.IndexDefinition{
"public.users": {
{Name: "users_email_key", ColumnName: "email", NonUnique: 0, SeqInIndex: 1, IndexType: "BTREE"},
{Name: "idx_users_name", ColumnName: "name", NonUnique: 1, SeqInIndex: 1, IndexType: "BTREE"},
},
},
}
cols := []connection.ColumnDefinition{
{Name: "id", Type: "integer", Nullable: "NO", Key: "PRI", Extra: "auto_increment"},
{Name: "email", Type: "character varying(120)", Nullable: "NO"},
{Name: "name", Type: "text", Nullable: "YES"},
{Name: "profile", Type: "jsonb", Nullable: "YES"},
}
cfg := SyncConfig{CreateIndexes: true}
createSQL, postSQL, warnings, unsupported, idxCreate, idxSkip, err := buildPGLikeToMySQLCreateTablePlan(cfg, "app.users", cols, sourceDB, "public", "users")
if err != nil {
t.Fatalf("buildPGLikeToMySQLCreateTablePlan returned error: %v", err)
}
if !strings.Contains(createSQL, "CREATE TABLE `app`.`users`") {
t.Fatalf("unexpected create SQL: %s", createSQL)
}
if !strings.Contains(createSQL, "`id` int AUTO_INCREMENT NOT NULL") {
t.Fatalf("unexpected id definition: %s", createSQL)
}
if !strings.Contains(createSQL, "`profile` json") {
t.Fatalf("unexpected json definition: %s", createSQL)
}
if idxCreate != 2 || idxSkip != 0 {
t.Fatalf("unexpected index summary: create=%d skip=%d", idxCreate, idxSkip)
}
if len(postSQL) != 2 {
t.Fatalf("unexpected post sql length: %v", postSQL)
}
if len(warnings) != 0 {
t.Fatalf("unexpected warnings: %v", warnings)
}
if len(unsupported) != 0 {
t.Fatalf("unexpected unsupported: %v", unsupported)
}
}
func TestBuildPGLikeToMySQLPlan_AutoCreateWhenTargetMissing(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"public.orders": {
{Name: "id", Type: "bigint", Nullable: "NO", Key: "PRI", Extra: "auto_increment"},
{Name: "amount", Type: "numeric(10,2)", Nullable: "NO"},
},
},
indexes: map[string][]connection.IndexDefinition{},
}
targetDB := &fakeMigrationDB{columns: map[string][]connection.ColumnDefinition{}}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "kingbase", Database: "public"},
TargetConfig: connection.ConnectionConfig{Type: "mysql", Database: "app"},
TargetTableStrategy: "smart",
CreateIndexes: true,
}
plan, sourceCols, targetCols, err := buildPGLikeToMySQLPlan(cfg, "orders", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildPGLikeToMySQLPlan returned error: %v", err)
}
if len(sourceCols) != 2 || len(targetCols) != 0 {
t.Fatalf("unexpected columns lengths: source=%d target=%d", len(sourceCols), len(targetCols))
}
if plan.TargetTableExists {
t.Fatalf("expected target table missing")
}
if !plan.AutoCreate {
t.Fatalf("expected auto create enabled")
}
if !strings.Contains(plan.CreateTableSQL, "CREATE TABLE `app`.`orders`") {
t.Fatalf("unexpected create table SQL: %s", plan.CreateTableSQL)
}
}
func TestBuildMySQLToPGLikeCreateTablePlan_GeneratesPostgresDDL(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
indexes: map[string][]connection.IndexDefinition{
"shop.orders": {
{Name: "idx_orders_user", ColumnName: "user_id", NonUnique: 1, SeqInIndex: 1, IndexType: "BTREE"},
{Name: "idx_orders_user", ColumnName: "status", NonUnique: 1, SeqInIndex: 2, IndexType: "BTREE"},
},
},
}
cols := []connection.ColumnDefinition{
{Name: "id", Type: "bigint", Nullable: "NO", Key: "PRI", Extra: "auto_increment"},
{Name: "user_id", Type: "bigint", Nullable: "NO"},
{Name: "status", Type: "varchar(32)", Nullable: "YES"},
{Name: "payload", Type: "json", Nullable: "YES"},
}
cfg := SyncConfig{CreateIndexes: true}
createSQL, postSQL, warnings, unsupported, idxCreate, idxSkip, err := buildMySQLToPGLikeCreateTablePlan("postgres", cfg, "public.orders", cols, sourceDB, "shop", "orders")
if err != nil {
t.Fatalf("buildMySQLToPGLikeCreateTablePlan returned error: %v", err)
}
if !strings.Contains(createSQL, `CREATE TABLE "public"."orders"`) {
t.Fatalf("unexpected create SQL: %s", createSQL)
}
if !strings.Contains(createSQL, `GENERATED BY DEFAULT AS IDENTITY`) {
t.Fatalf("missing identity mapping: %s", createSQL)
}
if !strings.Contains(createSQL, `jsonb`) {
t.Fatalf("missing jsonb mapping: %s", createSQL)
}
if idxCreate != 1 || idxSkip != 0 {
t.Fatalf("unexpected index summary: create=%d skip=%d", idxCreate, idxSkip)
}
if len(postSQL) != 1 || !strings.Contains(postSQL[0], `CREATE INDEX "idx_orders_user"`) {
t.Fatalf("unexpected post SQL: %v", postSQL)
}
if len(warnings) != 0 || len(unsupported) != 0 {
t.Fatalf("unexpected warnings/unsupported: warnings=%v unsupported=%v", warnings, unsupported)
}
}
func TestBuildMySQLToClickHouseCreateTableSQL_GeneratesMergeTree(t *testing.T) {
t.Parallel()
cols := []connection.ColumnDefinition{
{Name: "id", Type: "bigint unsigned", Nullable: "NO", Key: "PRI"},
{Name: "name", Type: "varchar(128)", Nullable: "YES"},
{Name: "payload", Type: "json", Nullable: "YES"},
}
createSQL, warnings, unsupported := buildMySQLToClickHouseCreateTableSQL("analytics.orders", cols)
if !strings.Contains(createSQL, "ENGINE = MergeTree()") {
t.Fatalf("unexpected create SQL: %s", createSQL)
}
if !strings.Contains(createSQL, "ORDER BY (`id`)") {
t.Fatalf("unexpected order by: %s", createSQL)
}
if !strings.Contains(createSQL, "`payload` Nullable(String)") {
t.Fatalf("unexpected json mapping: %s", createSQL)
}
if len(warnings) == 0 {
t.Fatalf("expected warnings for clickhouse semantics")
}
if len(unsupported) != 0 {
t.Fatalf("unexpected unsupported: %v", unsupported)
}
}
func TestBuildClickHouseToMySQLCreateTableSQL_GeneratesMySQLDDL(t *testing.T) {
t.Parallel()
cols := []connection.ColumnDefinition{
{Name: "id", Type: "UInt64", Nullable: "NO", Key: "PRI"},
{Name: "event_time", Type: "DateTime", Nullable: "NO"},
{Name: "payload", Type: "Map(String, String)", Nullable: "YES"},
}
createSQL, warnings := buildClickHouseToMySQLCreateTableSQL("app.metrics", cols)
if !strings.Contains(createSQL, "CREATE TABLE `app`.`metrics`") {
t.Fatalf("unexpected create SQL: %s", createSQL)
}
if !strings.Contains(createSQL, "`id` bigint unsigned NOT NULL") {
t.Fatalf("unexpected uint64 mapping: %s", createSQL)
}
if !strings.Contains(createSQL, "`payload` json") {
t.Fatalf("unexpected complex type mapping: %s", createSQL)
}
if len(warnings) == 0 {
t.Fatalf("expected warning for limited clickhouse reverse semantics")
}
}
func TestBuildMySQLToMongoPlan_AutoCreateCollection(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"shop.users": {
{Name: "id", Type: "bigint", Nullable: "NO", Key: "PRI"},
{Name: "name", Type: "varchar(64)", Nullable: "YES"},
},
},
indexes: map[string][]connection.IndexDefinition{
"shop.users": {
{Name: "idx_users_name", ColumnName: "name", NonUnique: 1, SeqInIndex: 1, IndexType: "BTREE"},
},
},
}
targetDB := &fakeMigrationDB{}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mysql", Database: "shop"},
TargetConfig: connection.ConnectionConfig{Type: "mongodb", Database: "app"},
TargetTableStrategy: "smart",
CreateIndexes: true,
}
plan, sourceCols, targetCols, err := buildMySQLToMongoPlan(cfg, "users", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildMySQLToMongoPlan returned error: %v", err)
}
if len(sourceCols) != 2 || targetCols != nil {
t.Fatalf("unexpected source/target columns: %d / %v", len(sourceCols), targetCols)
}
if !plan.AutoCreate || len(plan.PreDataSQL) == 0 {
t.Fatalf("expected auto create collection command: %+v", plan)
}
if !strings.Contains(plan.PreDataSQL[0], `"create":"users"`) {
t.Fatalf("unexpected create collection command: %v", plan.PreDataSQL)
}
if len(plan.PostDataSQL) != 1 || !strings.Contains(plan.PostDataSQL[0], `"createIndexes":"users"`) {
t.Fatalf("unexpected index commands: %v", plan.PostDataSQL)
}
}
func TestBuildPGLikeToMongoPlan_AutoCreateCollection(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"public.orders": {
{Name: "id", Type: "bigint", Nullable: "NO", Key: "PRI"},
{Name: "name", Type: "varchar(64)", Nullable: "YES"},
},
},
indexes: map[string][]connection.IndexDefinition{
"public.orders": {
{Name: "idx_orders_name", ColumnName: "name", NonUnique: 1, SeqInIndex: 1, IndexType: "BTREE"},
},
},
}
targetDB := &fakeMigrationDB{}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "postgres", Database: "public"},
TargetConfig: connection.ConnectionConfig{Type: "mongodb", Database: "app"},
TargetTableStrategy: "smart",
CreateIndexes: true,
}
plan, sourceCols, targetCols, err := buildPGLikeToMongoPlan(cfg, "orders", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildPGLikeToMongoPlan returned error: %v", err)
}
if len(sourceCols) != 2 || targetCols != nil {
t.Fatalf("unexpected source/target columns: %d / %v", len(sourceCols), targetCols)
}
if !plan.AutoCreate || len(plan.PreDataSQL) == 0 {
t.Fatalf("expected auto create collection command: %+v", plan)
}
if !strings.Contains(plan.PreDataSQL[0], `"create":"orders"`) {
t.Fatalf("unexpected create collection command: %v", plan.PreDataSQL)
}
if len(plan.PostDataSQL) != 1 || !strings.Contains(plan.PostDataSQL[0], `"createIndexes":"orders"`) {
t.Fatalf("unexpected index commands: %v", plan.PostDataSQL)
}
}
func TestBuildClickHouseToMongoPlan_AutoCreateCollection(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"analytics.metrics": {
{Name: "id", Type: "UInt64", Nullable: "NO", Key: "PRI"},
{Name: "host", Type: "String", Nullable: "YES"},
},
},
}
targetDB := &fakeMigrationDB{}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "clickhouse", Database: "analytics"},
TargetConfig: connection.ConnectionConfig{Type: "mongodb", Database: "app"},
TargetTableStrategy: "smart",
}
plan, sourceCols, targetCols, err := buildClickHouseToMongoPlan(cfg, "metrics", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildClickHouseToMongoPlan returned error: %v", err)
}
if len(sourceCols) != 2 || targetCols != nil {
t.Fatalf("unexpected source/target columns: %d / %v", len(sourceCols), targetCols)
}
if !plan.AutoCreate || len(plan.PreDataSQL) == 0 {
t.Fatalf("expected auto create collection command: %+v", plan)
}
if !strings.Contains(plan.PreDataSQL[0], `"create":"metrics"`) {
t.Fatalf("unexpected create collection command: %v", plan.PreDataSQL)
}
}
func TestBuildTDengineToMongoPlan_AutoCreateCollection(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"src.cpu": {
{Name: "ts", Type: "TIMESTAMP", Nullable: "NO"},
{Name: "host", Type: "NCHAR(64)", Nullable: "YES"},
},
},
}
targetDB := &fakeMigrationDB{}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "tdengine", Database: "src"},
TargetConfig: connection.ConnectionConfig{Type: "mongodb", Database: "app"},
TargetTableStrategy: "smart",
}
plan, sourceCols, targetCols, err := buildTDengineToMongoPlan(cfg, "cpu", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildTDengineToMongoPlan returned error: %v", err)
}
if len(sourceCols) != 2 || targetCols != nil {
t.Fatalf("unexpected source/target columns: %d / %v", len(sourceCols), targetCols)
}
if !plan.AutoCreate || len(plan.PreDataSQL) == 0 {
t.Fatalf("expected auto create collection command: %+v", plan)
}
if !strings.Contains(plan.PreDataSQL[0], `"create":"cpu"`) {
t.Fatalf("unexpected create collection command: %v", plan.PreDataSQL)
}
}
func TestBuildMongoToMySQLPlan_InfersColumnsAndCreatesTable(t *testing.T) {
t.Parallel()
query := `{"find":"users","filter":{},"limit":200}`
sourceDB := &fakeMigrationDB{
queryData: map[string][]map[string]interface{}{
query: {
{"_id": "a1", "name": "alice", "age": int64(18), "profile": map[string]interface{}{"city": "shanghai"}},
{"_id": "b2", "name": "bob", "profile": map[string]interface{}{"city": "beijing"}},
},
},
queryCols: map[string][]string{query: {"_id", "name", "age", "profile"}},
indexes: map[string][]connection.IndexDefinition{
"crm.users": {{Name: "email_1", ColumnName: "name", NonUnique: 1, SeqInIndex: 1, IndexType: "BTREE"}},
},
}
targetDB := &fakeMigrationDB{}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mongodb", Database: "crm"},
TargetConfig: connection.ConnectionConfig{Type: "mysql", Database: "app"},
TargetTableStrategy: "smart",
CreateIndexes: true,
}
plan, sourceCols, _, err := buildMongoToMySQLPlan(cfg, "users", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildMongoToMySQLPlan returned error: %v", err)
}
if len(sourceCols) == 0 {
t.Fatalf("expected inferred source cols")
}
if !plan.AutoCreate || !strings.Contains(plan.CreateTableSQL, "CREATE TABLE `app`.`users`") {
t.Fatalf("unexpected create table sql: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, "`_id` text NOT NULL") && !strings.Contains(plan.CreateTableSQL, "`_id` varchar") {
t.Fatalf("missing inferred _id column: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, "`profile` json") {
t.Fatalf("expected nested field degrade to json: %s", plan.CreateTableSQL)
}
if len(plan.PostDataSQL) != 1 {
t.Fatalf("expected one post index sql, got=%v", plan.PostDataSQL)
}
}
func TestBuildTDengineToMySQLPlan_AutoCreateWhenTargetMissing(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"metrics.cpu": {
{Name: "ts", Type: "TIMESTAMP", Nullable: "NO"},
{Name: "host", Type: "NCHAR(64)", Nullable: "YES", Key: "TAG", Extra: "TAG"},
{Name: "usage", Type: "DOUBLE", Nullable: "YES"},
},
},
}
targetDB := &fakeMigrationDB{}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "tdengine", Database: "metrics"},
TargetConfig: connection.ConnectionConfig{Type: "mysql", Database: "app"},
TargetTableStrategy: "smart",
}
plan, sourceCols, targetCols, err := buildTDengineToMySQLPlan(cfg, "cpu", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildTDengineToMySQLPlan returned error: %v", err)
}
if len(sourceCols) != 3 || len(targetCols) != 0 {
t.Fatalf("unexpected columns lengths: source=%d target=%d", len(sourceCols), len(targetCols))
}
if !plan.AutoCreate {
t.Fatalf("expected auto create enabled")
}
if !strings.Contains(plan.CreateTableSQL, "CREATE TABLE `app`.`cpu`") {
t.Fatalf("unexpected create table sql: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, "`ts` datetime") {
t.Fatalf("expected timestamp mapping, got: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, "`host` varchar(64)") {
t.Fatalf("expected nchar mapping, got: %s", plan.CreateTableSQL)
}
if len(plan.Warnings) == 0 || !strings.Contains(strings.Join(plan.Warnings, " "), "TAG") {
t.Fatalf("expected TAG warning, got: %v", plan.Warnings)
}
}
func TestBuildTDengineToPGLikePlan_AutoCreateWhenTargetMissing(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"metrics.cpu": {
{Name: "ts", Type: "TIMESTAMP", Nullable: "NO"},
{Name: "payload", Type: "JSON", Nullable: "YES"},
{Name: "host", Type: "BINARY(32)", Nullable: "YES", Key: "TAG", Extra: "TAG"},
},
},
}
targetDB := &fakeMigrationDB{}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "tdengine", Database: "metrics"},
TargetConfig: connection.ConnectionConfig{Type: "kingbase", Database: "ignored"},
TargetTableStrategy: "smart",
}
plan, sourceCols, targetCols, err := buildTDengineToPGLikePlan(cfg, "cpu", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildTDengineToPGLikePlan returned error: %v", err)
}
if len(sourceCols) != 3 || len(targetCols) != 0 {
t.Fatalf("unexpected columns lengths: source=%d target=%d", len(sourceCols), len(targetCols))
}
if !plan.AutoCreate {
t.Fatalf("expected auto create enabled")
}
if !strings.Contains(plan.CreateTableSQL, `CREATE TABLE "public"."cpu"`) {
t.Fatalf("unexpected create table sql: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, `"ts" timestamp`) {
t.Fatalf("expected timestamp mapping, got: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, `"payload" jsonb`) {
t.Fatalf("expected json mapping, got: %s", plan.CreateTableSQL)
}
if len(plan.Warnings) == 0 || !strings.Contains(strings.Join(plan.Warnings, " "), "TAG") {
t.Fatalf("expected TAG warning, got: %v", plan.Warnings)
}
}
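
Both TDengine source tests assert the same family of scalar mappings. A condensed sketch of such a mapping table for the MySQL target, with a hypothetical `tdengineToMySQL` helper — the assertions above only pin down TIMESTAMP, NCHAR, and JSON, so the fallback here is an assumption:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// ncharRe captures the declared length of an NCHAR(n) column.
var ncharRe = regexp.MustCompile(`(?i)^NCHAR\((\d+)\)$`)

// tdengineToMySQL sketches the scalar mappings the tests assert:
// TIMESTAMP -> datetime, NCHAR(n) -> varchar(n), DOUBLE -> double.
func tdengineToMySQL(t string) string {
	t = strings.TrimSpace(t)
	if m := ncharRe.FindStringSubmatch(t); m != nil {
		return "varchar(" + m[1] + ")"
	}
	switch strings.ToUpper(t) {
	case "TIMESTAMP":
		return "datetime"
	case "DOUBLE":
		return "double"
	case "JSON":
		return "json"
	default:
		return "text" // assumed fallback, not the planner's actual rule
	}
}

func main() {
	for _, t := range []string{"TIMESTAMP", "NCHAR(64)", "DOUBLE"} {
		fmt.Println(t, "->", tdengineToMySQL(t))
	}
}
```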
func TestBuildSchemaMigrationPlan_TDengineTargetWarnsInsertOnlyBoundary(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"shop.metrics": {
{Name: "id", Type: "bigint", Nullable: "NO", Key: "PRI"},
{Name: "ts", Type: "datetime", Nullable: "NO"},
{Name: "value", Type: "double", Nullable: "YES"},
},
},
}
targetDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"taos.metrics": {
{Name: "id", Type: "bigint", Nullable: "NO"},
{Name: "ts", Type: "timestamp", Nullable: "NO"},
{Name: "value", Type: "double", Nullable: "YES"},
},
},
}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mysql", Database: "shop"},
TargetConfig: connection.ConnectionConfig{Type: "tdengine", Database: "taos"},
Mode: "insert_update",
}
plan, _, _, err := buildSchemaMigrationPlan(cfg, "metrics", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildSchemaMigrationPlan returned error: %v", err)
}
warnings := strings.Join(plan.Warnings, " ")
if !strings.Contains(warnings, "仅支持 INSERT 写入") {
t.Fatalf("expected TDengine target warning, got: %v", plan.Warnings)
}
}
func TestBuildMySQLLikeToTDenginePlan_AutoCreateWhenTargetMissing(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"shop.metrics": {
{Name: "id", Type: "bigint", Nullable: "NO", Key: "PRI", Extra: "auto_increment"},
{Name: "ts", Type: "datetime", Nullable: "NO"},
{Name: "payload", Type: "json", Nullable: "YES"},
},
},
}
targetDB := &fakeMigrationDB{}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mysql", Database: "shop"},
TargetConfig: connection.ConnectionConfig{Type: "tdengine", Database: "taos"},
TargetTableStrategy: "smart",
}
plan, sourceCols, targetCols, err := buildMySQLLikeToTDenginePlan(cfg, "metrics", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildMySQLLikeToTDenginePlan returned error: %v", err)
}
if len(sourceCols) != 3 || len(targetCols) != 0 {
t.Fatalf("unexpected columns lengths: source=%d target=%d", len(sourceCols), len(targetCols))
}
if !plan.AutoCreate {
t.Fatalf("expected auto create enabled")
}
if !strings.Contains(plan.CreateTableSQL, "CREATE TABLE `taos`.`metrics`") {
t.Fatalf("unexpected create sql: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, "`ts` TIMESTAMP") {
t.Fatalf("expected ts first column mapped to TIMESTAMP, got: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, "`payload` VARCHAR(") {
t.Fatalf("expected json degrade to VARCHAR, got: %s", plan.CreateTableSQL)
}
if !strings.Contains(strings.Join(plan.Warnings, " "), "insert-only") && !strings.Contains(strings.Join(plan.Warnings, " "), "INSERT") {
t.Fatalf("expected tdengine target warning, got: %v", plan.Warnings)
}
}
func TestBuildPGLikeToTDenginePlan_AutoCreateWhenTargetMissing(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"public.metrics": {
{Name: "event_time", Type: "timestamp without time zone", Nullable: "NO"},
{Name: "name", Type: "character varying(64)", Nullable: "YES"},
{Name: "meta", Type: "jsonb", Nullable: "YES"},
},
},
}
targetDB := &fakeMigrationDB{}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "postgres", Database: "ignored"},
TargetConfig: connection.ConnectionConfig{Type: "tdengine", Database: "taos"},
TargetTableStrategy: "smart",
}
plan, sourceCols, targetCols, err := buildPGLikeToTDenginePlan(cfg, "metrics", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildPGLikeToTDenginePlan returned error: %v", err)
}
if len(sourceCols) != 3 || len(targetCols) != 0 {
t.Fatalf("unexpected columns lengths: source=%d target=%d", len(sourceCols), len(targetCols))
}
if !plan.AutoCreate {
t.Fatalf("expected auto create enabled")
}
if !strings.Contains(plan.CreateTableSQL, "CREATE TABLE `taos`.`metrics`") {
t.Fatalf("unexpected create sql: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, "`event_time` TIMESTAMP") {
t.Fatalf("expected timestamp mapping, got: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, "`meta` VARCHAR(") {
t.Fatalf("expected jsonb degrade to VARCHAR, got: %s", plan.CreateTableSQL)
}
}
func TestBuildMySQLLikeToTDenginePlan_RejectsAutoCreateWithoutTimestampColumn(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"shop.metrics": {
{Name: "id", Type: "bigint", Nullable: "NO", Key: "PRI"},
{Name: "name", Type: "varchar(64)", Nullable: "YES"},
},
},
}
targetDB := &fakeMigrationDB{}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "mysql", Database: "shop"},
TargetConfig: connection.ConnectionConfig{Type: "tdengine", Database: "taos"},
TargetTableStrategy: "smart",
}
plan, _, _, err := buildMySQLLikeToTDenginePlan(cfg, "metrics", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildMySQLLikeToTDenginePlan returned error: %v", err)
}
if plan.AutoCreate {
t.Fatalf("expected auto create disabled when source has no timestamp column")
}
if !strings.Contains(plan.PlannedAction, "时间列") {
t.Fatalf("unexpected planned action: %s", plan.PlannedAction)
}
if !strings.Contains(strings.Join(plan.Warnings, " "), "时间列") {
t.Fatalf("expected missing timestamp warning, got: %v", plan.Warnings)
}
}
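
The rejection above hinges on finding a usable time column, since a TDengine table requires a TIMESTAMP as its first column. A sketch of such a check, assuming MySQL-style type names and a hypothetical `hasTimestampColumn` helper:

```go
package main

import (
	"fmt"
	"strings"
)

// hasTimestampColumn reports whether any source column type looks like a
// time column that could serve as TDengine's mandatory first TIMESTAMP
// column; the prefix matching is a simplification of the real check.
func hasTimestampColumn(types []string) bool {
	for _, t := range types {
		lower := strings.ToLower(strings.TrimSpace(t))
		if strings.HasPrefix(lower, "datetime") || strings.HasPrefix(lower, "timestamp") {
			return true
		}
	}
	return false
}

func main() {
	// Mirrors the test's source table: bigint PK + varchar, no time column.
	fmt.Println(hasTimestampColumn([]string{"bigint", "varchar(64)"})) // false
	fmt.Println(hasTimestampColumn([]string{"bigint", "datetime"}))    // true
}
```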
func TestBuildClickHouseToTDenginePlan_AutoCreateWhenTargetMissing(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"analytics.metrics": {
{Name: "event_time", Type: "DateTime64(3)", Nullable: "NO"},
{Name: "host", Type: "FixedString(64)", Nullable: "YES"},
{Name: "payload", Type: "Map(String,String)", Nullable: "YES"},
},
},
}
targetDB := &fakeMigrationDB{}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "clickhouse", Database: "analytics"},
TargetConfig: connection.ConnectionConfig{Type: "tdengine", Database: "taos"},
TargetTableStrategy: "smart",
}
plan, sourceCols, targetCols, err := buildClickHouseToTDenginePlan(cfg, "metrics", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildClickHouseToTDenginePlan returned error: %v", err)
}
if len(sourceCols) != 3 || len(targetCols) != 0 {
t.Fatalf("unexpected columns lengths: source=%d target=%d", len(sourceCols), len(targetCols))
}
if !plan.AutoCreate {
t.Fatalf("expected auto create enabled")
}
if !strings.Contains(plan.CreateTableSQL, "CREATE TABLE `taos`.`metrics`") {
t.Fatalf("unexpected create sql: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, "`event_time` TIMESTAMP") {
t.Fatalf("expected datetime64 mapping, got: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, "`host` VARCHAR(64)") {
t.Fatalf("expected fixedstring mapping, got: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, "`payload` VARCHAR(") {
t.Fatalf("expected complex type degrade to VARCHAR, got: %s", plan.CreateTableSQL)
}
}
func TestBuildClickHouseToPGLikePlan_AutoCreateWhenTargetMissing(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"analytics.metrics": {
{Name: "id", Type: "UInt64", Nullable: "NO", Key: "PRI"},
{Name: "event_time", Type: "DateTime64(3)", Nullable: "NO"},
{Name: "host", Type: "FixedString(64)", Nullable: "YES"},
{Name: "payload", Type: "Map(String,String)", Nullable: "YES"},
},
},
}
targetDB := &fakeMigrationDB{}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "clickhouse", Database: "analytics"},
TargetConfig: connection.ConnectionConfig{Type: "postgres", Database: "public"},
TargetTableStrategy: "smart",
}
plan, sourceCols, targetCols, err := buildClickHouseToPGLikePlan(cfg, "metrics", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildClickHouseToPGLikePlan returned error: %v", err)
}
if len(sourceCols) != 4 || len(targetCols) != 0 {
t.Fatalf("unexpected columns lengths: source=%d target=%d", len(sourceCols), len(targetCols))
}
if !plan.AutoCreate {
t.Fatalf("expected auto create enabled")
}
if !strings.Contains(plan.CreateTableSQL, `CREATE TABLE "public"."metrics"`) {
t.Fatalf("unexpected create sql: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, `"id" numeric(20,0)`) {
t.Fatalf("expected uint64 safeguard mapping, got: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, `"event_time" timestamp`) {
t.Fatalf("expected datetime64 mapping, got: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, `"host" varchar(64)`) {
t.Fatalf("expected fixedstring mapping, got: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, `"payload" jsonb`) {
t.Fatalf("expected complex type degrade to jsonb, got: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, `PRIMARY KEY ("id")`) {
t.Fatalf("expected primary key preservation, got: %s", plan.CreateTableSQL)
}
}
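
The ClickHouse-to-PG test encodes the interesting safeguards: UInt64 cannot land in a signed bigint, and container types lose their shape. A condensed sketch of those mappings, with a hypothetical `clickhouseToPG` helper:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

var fixedStringRe = regexp.MustCompile(`(?i)^FixedString\((\d+)\)$`)

// clickhouseToPG sketches the mappings the test pins down: UInt64 needs
// numeric(20,0) because its maximum (20 digits) overflows a signed
// bigint, DateTime64 narrows to timestamp, FixedString(n) becomes
// varchar(n), and container types (Map/Array/Tuple) degrade to jsonb.
// The text fallback is an assumption for this sketch.
func clickhouseToPG(t string) string {
	t = strings.TrimSpace(t)
	if m := fixedStringRe.FindStringSubmatch(t); m != nil {
		return "varchar(" + m[1] + ")"
	}
	switch {
	case strings.EqualFold(t, "UInt64"):
		return "numeric(20,0)"
	case strings.HasPrefix(t, "DateTime"):
		return "timestamp"
	case strings.HasPrefix(t, "Map(") || strings.HasPrefix(t, "Array(") || strings.HasPrefix(t, "Tuple("):
		return "jsonb"
	default:
		return "text"
	}
}

func main() {
	for _, t := range []string{"UInt64", "DateTime64(3)", "FixedString(64)", "Map(String,String)"} {
		fmt.Println(t, "->", clickhouseToPG(t))
	}
}
```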
func TestBuildPGLikeToClickHousePlan_AutoCreateWhenTargetMissing(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"public.orders": {
{Name: "id", Type: "bigint", Nullable: "NO", Key: "PRI"},
{Name: "created_at", Type: "timestamp without time zone", Nullable: "NO"},
{Name: "profile", Type: "jsonb", Nullable: "YES"},
},
},
}
targetDB := &fakeMigrationDB{}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "postgres", Database: "public"},
TargetConfig: connection.ConnectionConfig{Type: "clickhouse", Database: "analytics"},
TargetTableStrategy: "smart",
}
plan, sourceCols, targetCols, err := buildPGLikeToClickHousePlan(cfg, "orders", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildPGLikeToClickHousePlan returned error: %v", err)
}
if len(sourceCols) != 3 || len(targetCols) != 0 {
t.Fatalf("unexpected columns lengths: source=%d target=%d", len(sourceCols), len(targetCols))
}
if !plan.AutoCreate {
t.Fatalf("expected auto create enabled")
}
if !strings.Contains(plan.CreateTableSQL, "CREATE TABLE `analytics`.`orders`") {
t.Fatalf("unexpected create sql: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, "`created_at` DateTime") {
t.Fatalf("expected timestamp mapping, got: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, "`profile` Nullable(String)") {
t.Fatalf("expected jsonb degrade to Nullable(String), got: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, "ORDER BY (`id`)") {
t.Fatalf("expected primary key order by, got: %s", plan.CreateTableSQL)
}
}
func TestBuildTDengineToTDenginePlan_AutoCreateWhenTargetMissing(t *testing.T) {
t.Parallel()
sourceDB := &fakeMigrationDB{
columns: map[string][]connection.ColumnDefinition{
"src.cpu": {
{Name: "ts", Type: "TIMESTAMP", Nullable: "NO"},
{Name: "host", Type: "NCHAR(64)", Nullable: "YES"},
{Name: "region", Type: "NCHAR(32)", Nullable: "YES", Key: "TAG"},
},
},
}
targetDB := &fakeMigrationDB{}
cfg := SyncConfig{
SourceConfig: connection.ConnectionConfig{Type: "tdengine", Database: "src"},
TargetConfig: connection.ConnectionConfig{Type: "tdengine", Database: "dst"},
TargetTableStrategy: "smart",
}
plan, sourceCols, targetCols, err := buildTDengineToTDenginePlan(cfg, "cpu", sourceDB, targetDB)
if err != nil {
t.Fatalf("buildTDengineToTDenginePlan returned error: %v", err)
}
if len(sourceCols) != 3 || len(targetCols) != 0 {
t.Fatalf("unexpected columns lengths: source=%d target=%d", len(sourceCols), len(targetCols))
}
if !plan.AutoCreate {
t.Fatalf("expected auto create enabled")
}
if !strings.Contains(plan.CreateTableSQL, "CREATE TABLE `dst`.`cpu`") {
t.Fatalf("unexpected create sql: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, "`ts` TIMESTAMP") {
t.Fatalf("expected timestamp preserved, got: %s", plan.CreateTableSQL)
}
if !strings.Contains(plan.CreateTableSQL, "`region` NCHAR(32)") {
t.Fatalf("expected tag degrade to regular nchar column, got: %s", plan.CreateTableSQL)
}
if !strings.Contains(strings.Join(plan.Warnings, " "), "TAG") {
t.Fatalf("expected TAG degrade warning, got: %v", plan.Warnings)
}
}

View File

@@ -7,15 +7,16 @@ import (
 )
 func (s *SyncEngine) syncTableSchema(config SyncConfig, res *SyncResult, sourceDB db.Database, targetDB db.Database, tableName string) error {
-	targetType := strings.ToLower(strings.TrimSpace(config.TargetConfig.Type))
+	targetType := resolveMigrationDBType(config.TargetConfig)
 	if targetType != "mysql" {
 		s.appendLog(config.JobID, res, "warn", fmt.Sprintf("目标数据库类型=%s 暂不支持结构同步,已跳过表 %s", config.TargetConfig.Type, tableName))
 		return nil
 	}
-	sourceSchema, sourceTable := normalizeSchemaAndTable(config.SourceConfig.Type, config.SourceConfig.Database, tableName)
-	targetSchema, targetTable := normalizeSchemaAndTable(config.TargetConfig.Type, config.TargetConfig.Database, tableName)
-	targetQueryTable := qualifiedNameForQuery(config.TargetConfig.Type, targetSchema, targetTable, tableName)
+	sourceType := resolveMigrationDBType(config.SourceConfig)
+	sourceSchema, sourceTable := normalizeSchemaAndTable(sourceType, config.SourceConfig.Database, tableName)
+	targetSchema, targetTable := normalizeSchemaAndTable(targetType, config.TargetConfig.Database, tableName)
+	targetQueryTable := qualifiedNameForQuery(targetType, targetSchema, targetTable, tableName)
 	// 1) fetch source table columns
 	sourceCols, err := sourceDB.GetColumns(sourceSchema, sourceTable)
@@ -26,7 +27,6 @@ func (s *SyncEngine) syncTableSchema(config SyncConfig, res *SyncResult, sourceD
 	// 2) ensure the target table exists
 	targetCols, err := targetDB.GetColumns(targetSchema, targetTable)
 	if err != nil {
-		sourceType := strings.ToLower(strings.TrimSpace(config.SourceConfig.Type))
 		if sourceType != "mysql" {
 			return fmt.Errorf("目标表不存在且源类型=%s 暂不支持自动建表: %w", config.SourceConfig.Type, err)
 		}
@@ -62,7 +62,6 @@ func (s *SyncEngine) syncTableSchema(config SyncConfig, res *SyncResult, sourceD
 	// 3) add missing target columns (safety policy: new columns always allow NULL)
 	missing := make([]string, 0)
-	sourceType := strings.ToLower(strings.TrimSpace(config.SourceConfig.Type))
 	for _, c := range sourceCols {
 		colName := strings.TrimSpace(c.Name)
 		if colName == "" {

View File

@@ -22,7 +22,7 @@ func quoteIdentByType(dbType string, ident string) string {
 	}
 	switch dbType {
-	case "mysql", "mariadb", "diros", "sphinx":
+	case "mysql", "mariadb", "diros", "sphinx", "clickhouse", "tdengine":
 		return "`" + strings.ReplaceAll(ident, "`", "``") + "`"
 	case "sqlserver":
 		escaped := strings.ReplaceAll(ident, "]", "]]")
@@ -74,8 +74,10 @@ func normalizeSchemaAndTable(dbType string, dbName string, tableName string) (st
 	}
 	switch strings.ToLower(strings.TrimSpace(dbType)) {
-	case "postgres", "kingbase", "vastbase":
+	case "postgres", "kingbase", "highgo", "vastbase":
 		return "public", rawTable
+	case "duckdb":
+		return "main", rawTable
 	default:
 		return rawDB, rawTable
 	}
@@ -91,7 +93,7 @@ func qualifiedNameForQuery(dbType string, schema string, table string, original
 	}
 	switch strings.ToLower(strings.TrimSpace(dbType)) {
-	case "postgres", "kingbase", "vastbase":
+	case "postgres", "kingbase", "highgo", "vastbase":
 		s := strings.TrimSpace(schema)
 		if s == "" {
 			s = "public"
@@ -100,7 +102,16 @@ func qualifiedNameForQuery(dbType string, schema string, table string, original
 			return raw
 		}
 		return s + "." + table
-	case "mysql", "mariadb", "diros", "sphinx":
+	case "duckdb":
+		s := strings.TrimSpace(schema)
+		if s == "" {
+			s = "main"
+		}
+		if table == "" {
+			return raw
+		}
+		return s + "." + table
+	case "mysql", "mariadb", "diros", "sphinx", "clickhouse", "tdengine":
 		s := strings.TrimSpace(schema)
 		if s == "" || table == "" {
 			return table
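
The hunk above extends backtick quoting to ClickHouse and TDengine and double-quoting to the wider PostgreSQL family. A standalone sketch of the dialect switch, using the same escaping rules the diff shows (the default arm here is a simplification of the full switch):

```go
package main

import (
	"fmt"
	"strings"
)

// quoteIdent mirrors the quoting rules in the hunk above: backticks for
// the MySQL family plus ClickHouse/TDengine, brackets for SQL Server,
// and double quotes for PostgreSQL-like engines. Each quote character
// inside the identifier is escaped by doubling it.
func quoteIdent(dbType, ident string) string {
	switch strings.ToLower(strings.TrimSpace(dbType)) {
	case "mysql", "mariadb", "clickhouse", "tdengine":
		return "`" + strings.ReplaceAll(ident, "`", "``") + "`"
	case "sqlserver":
		return "[" + strings.ReplaceAll(ident, "]", "]]") + "]"
	default:
		return `"` + strings.ReplaceAll(ident, `"`, `""`) + `"`
	}
}

func main() {
	fmt.Println(quoteIdent("clickhouse", "event_time"))
	fmt.Println(quoteIdent("sqlserver", "odd]name"))
	fmt.Println(quoteIdent("postgres", "profile"))
}
```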

View File

@@ -12,14 +12,17 @@ import (
 // SyncConfig defines the parameters for a synchronization task
 type SyncConfig struct {
-	SourceConfig   connection.ConnectionConfig `json:"sourceConfig"`
-	TargetConfig   connection.ConnectionConfig `json:"targetConfig"`
-	Tables         []string                    `json:"tables"` // Tables to sync
-	Content        string                      `json:"content,omitempty"` // "data", "schema", "both"
-	Mode           string                      `json:"mode"` // "insert_update", "insert_only", "full_overwrite"
-	JobID          string                      `json:"jobId,omitempty"`
-	AutoAddColumns bool                        `json:"autoAddColumns,omitempty"` // auto-fill missing columns (currently MySQL targets only)
-	TableOptions   map[string]TableOptions     `json:"tableOptions,omitempty"`
+	SourceConfig        connection.ConnectionConfig `json:"sourceConfig"`
+	TargetConfig        connection.ConnectionConfig `json:"targetConfig"`
+	Tables              []string                    `json:"tables"`
+	Content             string                      `json:"content,omitempty"` // "data", "schema", "both"
+	Mode                string                      `json:"mode"` // "insert_update", "insert_only", "full_overwrite"
+	JobID               string                      `json:"jobId,omitempty"`
+	AutoAddColumns      bool                        `json:"autoAddColumns,omitempty"` // auto-fill missing columns
+	TargetTableStrategy string                      `json:"targetTableStrategy,omitempty"`
+	CreateIndexes       bool                        `json:"createIndexes,omitempty"`
+	MongoCollectionName string                      `json:"mongoCollectionName,omitempty"`
+	TableOptions        map[string]TableOptions     `json:"tableOptions,omitempty"`
 }
 // SyncResult holds the result of the sync operation
@@ -45,6 +48,13 @@ func NewSyncEngine(reporter Reporter) *SyncEngine {
 func (s *SyncEngine) RunSync(config SyncConfig) SyncResult {
 	result := SyncResult{Success: true, Logs: []string{}}
 	logger.Infof("开始数据同步:源=%s 目标=%s 表数量=%d", formatConnSummaryForSync(config.SourceConfig), formatConnSummaryForSync(config.TargetConfig), len(config.Tables))
+	if isRedisToMongoKeyspacePair(config) {
+		return s.runRedisToMongoSync(config, result)
+	}
+	if isMongoToRedisKeyspacePair(config) {
+		return s.runMongoToRedisSync(config, result)
+	}
 	totalTables := len(config.Tables)
 	s.progress(config.JobID, 0, totalTables, "", "开始同步")
@@ -70,6 +80,7 @@ func (s *SyncEngine) RunSync(config SyncConfig) SyncResult {
 		s.appendLog(config.JobID, &result, "warn", fmt.Sprintf("未知同步模式 %q,已自动使用 insert_update", config.Mode))
 	}
 	defaultMode := normalizeSyncMode(config.Mode)
+	strategy := normalizeTargetTableStrategy(config.TargetTableStrategy)
 	contentLabel := "仅同步数据"
 	if syncSchema && syncData {
@@ -77,9 +88,9 @@ func (s *SyncEngine) RunSync(config SyncConfig) SyncResult {
 	} else if syncSchema {
 		contentLabel = "仅同步结构"
 	}
-	s.appendLog(config.JobID, &result, "info", fmt.Sprintf("同步内容:%s;模式:%s;自动补字段:%v", contentLabel, defaultMode, config.AutoAddColumns))
+	s.appendLog(config.JobID, &result, "info", fmt.Sprintf("同步内容:%s;模式:%s;自动补字段:%v;目标表策略:%s;创建索引:%v", contentLabel, defaultMode, config.AutoAddColumns, strategy, config.CreateIndexes))
-	sourceDB, err := db.NewDatabase(config.SourceConfig.Type)
+	sourceDB, err := newSyncDatabase(config.SourceConfig.Type)
 	if err != nil {
 		logger.Error(err, "初始化源数据库驱动失败:类型=%s", config.SourceConfig.Type)
 		return s.fail(config.JobID, totalTables, result, "初始化源数据库驱动失败: "+err.Error())
@@ -88,7 +99,7 @@ func (s *SyncEngine) RunSync(config SyncConfig) SyncResult {
 		// Custom DB setup would go here if needed
 	}
-	targetDB, err := db.NewDatabase(config.TargetConfig.Type)
+	targetDB, err := newSyncDatabase(config.TargetConfig.Type)
 	if err != nil {
 		logger.Error(err, "初始化目标数据库驱动失败:类型=%s", config.TargetConfig.Type)
 		return s.fail(config.JobID, totalTables, result, "初始化目标数据库驱动失败: "+err.Error())
@@ -112,7 +123,6 @@ func (s *SyncEngine) RunSync(config SyncConfig) SyncResult {
 	}
 	defer targetDB.Close()
-	// Iterate Tables
 	for i, tableName := range config.Tables {
 		func() {
 			tableMode := defaultMode
@@ -120,30 +130,82 @@ func (s *SyncEngine) RunSync(config SyncConfig) SyncResult {
 			s.progress(config.JobID, i, totalTables, tableName, fmt.Sprintf("同步表(%d/%d)", i+1, totalTables))
 			defer s.progress(config.JobID, i+1, totalTables, tableName, "表处理完成")
 			if syncSchema {
-				s.progress(config.JobID, i, totalTables, tableName, "同步表结构")
-				if err := s.syncTableSchema(config, &result, sourceDB, targetDB, tableName); err != nil {
-					s.appendLog(config.JobID, &result, "error", fmt.Sprintf("表结构同步失败:表=%s 错误=%v", tableName, err))
+				plan, cols, targetCols, err := buildSchemaMigrationPlan(config, tableName, sourceDB, targetDB)
+				if err != nil {
+					s.appendLog(config.JobID, &result, "error", fmt.Sprintf("生成迁移计划失败:表=%s 错误=%v", tableName, err))
+					return
+				}
+				for _, warning := range plan.Warnings {
+					s.appendLog(config.JobID, &result, "warn", fmt.Sprintf(" -> %s", warning))
+				}
+				for _, unsupported := range plan.UnsupportedObjects {
+					s.appendLog(config.JobID, &result, "warn", fmt.Sprintf(" -> %s", unsupported))
+				}
+				if strings.TrimSpace(plan.PlannedAction) != "" {
+					s.appendLog(config.JobID, &result, "info", fmt.Sprintf(" -> %s", plan.PlannedAction))
+				}
+				if !plan.TargetTableExists && !plan.AutoCreate {
+					s.appendLog(config.JobID, &result, "warn", fmt.Sprintf("表 %s 目标表不存在,当前策略不允许自动建表,已跳过", tableName))
+					return
+				}
+				if !plan.TargetTableExists && plan.AutoCreate {
+					s.progress(config.JobID, i, totalTables, tableName, "创建目标表")
+					if len(plan.PreDataSQL) > 0 {
+						if err := executeSQLStatements(targetDB.Exec, plan.PreDataSQL); err != nil {
+							s.appendLog(config.JobID, &result, "error", fmt.Sprintf("预执行建表 SQL 失败:表=%s 错误=%v", tableName, err))
+							return
+						}
+					}
+					if strings.TrimSpace(plan.CreateTableSQL) == "" {
+						s.appendLog(config.JobID, &result, "error", fmt.Sprintf("表 %s 自动建表失败:建表 SQL 为空", tableName))
+						return
+					}
+					if _, err := targetDB.Exec(plan.CreateTableSQL); err != nil {
+						s.appendLog(config.JobID, &result, "error", fmt.Sprintf("创建目标表失败:表=%s 错误=%v", tableName, err))
+						return
+					}
+					s.appendLog(config.JobID, &result, "info", fmt.Sprintf("目标表创建成功:%s", tableName))
+					targetCols, err = targetDB.GetColumns(plan.TargetSchema, plan.TargetTable)
+					if err != nil {
+						s.appendLog(config.JobID, &result, "error", fmt.Sprintf("创建目标表后获取字段失败:表=%s 错误=%v", tableName, err))
+						return
+					}
+				} else if len(plan.PreDataSQL) > 0 {
+					s.progress(config.JobID, i, totalTables, tableName, "同步表结构")
+					if err := executeSQLStatements(targetDB.Exec, plan.PreDataSQL); err != nil {
+						s.appendLog(config.JobID, &result, "error", fmt.Sprintf("同步表结构失败:表=%s 错误=%v", tableName, err))
+						return
+					}
+					targetCols, err = targetDB.GetColumns(plan.TargetSchema, plan.TargetTable)
+					if err != nil {
+						s.appendLog(config.JobID, &result, "warn", fmt.Sprintf("补字段后刷新目标字段失败:表=%s 错误=%v", tableName, err))
+					}
+				}
+				if !syncData {
+					if len(plan.PostDataSQL) > 0 {
+						s.progress(config.JobID, i, totalTables, tableName, "创建索引")
+						if err := executeSQLStatements(targetDB.Exec, plan.PostDataSQL); err != nil {
+							s.appendLog(config.JobID, &result, "error", fmt.Sprintf("创建索引失败:表=%s 错误=%v", tableName, err))
+							return
+						}
+					}
+					result.TablesSynced++
+					return
+				}
-			sourceSchema, sourceTable := normalizeSchemaAndTable(config.SourceConfig.Type, config.SourceConfig.Database, tableName)
-			targetSchema, targetTable := normalizeSchemaAndTable(config.TargetConfig.Type, config.TargetConfig.Database, tableName)
-			sourceQueryTable := qualifiedNameForQuery(config.SourceConfig.Type, sourceSchema, sourceTable, tableName)
-			targetQueryTable := qualifiedNameForQuery(config.TargetConfig.Type, targetSchema, targetTable, tableName)
-			// 1. Get Columns & PKs
-			cols, err := sourceDB.GetColumns(sourceSchema, sourceTable)
-			if err != nil {
-				logger.Error(err, "获取源表列信息失败:表=%s", tableName)
-				s.appendLog(config.JobID, &result, "error", fmt.Sprintf("获取表 %s 的列信息失败: %v", tableName, err))
-				return
+			targetType := resolveMigrationDBType(config.TargetConfig)
+			sourceType := resolveMigrationDBType(config.SourceConfig)
+			targetTable := plan.TargetTable
+			sourceQueryTable, targetQueryTable := plan.SourceQueryTable, plan.TargetQueryTable
+			applyTableName := targetTable
+			switch targetType {
+			case "postgres", "kingbase", "highgo", "vastbase", "sqlserver":
+				applyTableName = targetQueryTable
+			}
 			sourceColsByLower := make(map[string]connection.ColumnDefinition, len(cols))
 			for _, col := range cols {
 				if strings.TrimSpace(col.Name) == "" {
@@ -158,25 +220,24 @@ func (s *SyncEngine) RunSync(config SyncConfig) SyncResult {
pkCols = append(pkCols, col.Name)
}
}
if len(pkCols) == 0 {
s.appendLog(config.JobID, &result, "warn", fmt.Sprintf("表 %s 未找到主键,已跳过数据同步(避免产生重复数据)", tableName))
return
requirePK := tableMode == "insert_update" && plan.TargetTableExists
pkCol := ""
if requirePK {
if len(pkCols) == 0 {
s.appendLog(config.JobID, &result, "warn", fmt.Sprintf("表 %s 未找到主键,当前模式需要差异对比,已跳过", tableName))
return
}
if len(pkCols) > 1 {
s.appendLog(config.JobID, &result, "warn", fmt.Sprintf("表 %s 为复合主键(%s当前暂不支持差异同步", tableName, strings.Join(pkCols, ",")))
return
}
pkCol = pkCols[0]
}
if len(pkCols) > 1 {
s.appendLog(config.JobID, &result, "warn", fmt.Sprintf("表 %s 为复合主键(%s当前暂不支持数据同步", tableName, strings.Join(pkCols, ",")))
return
}
pkCol := pkCols[0]
opts := TableOptions{Insert: true, Update: true, Delete: false}
if config.TableOptions != nil {
if t, ok := config.TableOptions[tableName]; ok {
opts = t
// default guard: if the user did not set any of the flags, keep insert/update defaulting to true and delete defaulting to false
if !t.Insert && !t.Update && !t.Delete {
opts = t
}
}
}
if !opts.Insert && !opts.Update && !opts.Delete {
@@ -184,10 +245,8 @@ func (s *SyncEngine) RunSync(config SyncConfig) SyncResult {
return
}
// 2. Fetch Data (MEMORY INTENSIVE - PROTOTYPE ONLY)
// TODO: Implement paging/streaming
s.progress(config.JobID, i, totalTables, tableName, "读取源表数据")
sourceRows, _, err := sourceDB.Query(fmt.Sprintf("SELECT * FROM %s", quoteQualifiedIdentByType(config.SourceConfig.Type, sourceQueryTable)))
sourceRows, _, err := sourceDB.Query(fmt.Sprintf("SELECT * FROM %s", quoteQualifiedIdentByType(sourceType, sourceQueryTable)))
if err != nil {
logger.Error(err, "读取源表失败:表=%s", tableName)
s.appendLog(config.JobID, &result, "error", fmt.Sprintf("读取源表 %s 失败: %v", tableName, err))
@@ -196,19 +255,19 @@ func (s *SyncEngine) RunSync(config SyncConfig) SyncResult {
var inserts []map[string]interface{}
var updates []connection.UpdateRow
var deletes []map[string]interface{}
if tableMode == "insert_update" {
if tableMode == "insert_update" && plan.TargetTableExists {
s.progress(config.JobID, i, totalTables, tableName, "读取目标表数据")
targetRows, _, err := targetDB.Query(fmt.Sprintf("SELECT * FROM %s", quoteQualifiedIdentByType(config.TargetConfig.Type, targetQueryTable)))
targetRows, _, err := targetDB.Query(fmt.Sprintf("SELECT * FROM %s", quoteQualifiedIdentByType(targetType, targetQueryTable)))
if err != nil {
logger.Error(err, "读取目标表失败:表=%s", tableName)
s.appendLog(config.JobID, &result, "error", fmt.Sprintf("读取目标表 %s 失败: %v", tableName, err))
return
}
// 3. Compare (In-Memory Hash Map)
s.progress(config.JobID, i, totalTables, tableName, "对比差异")
targetMap := make(map[string]map[string]interface{})
targetMap := make(map[string]map[string]interface{}, len(targetRows))
for _, row := range targetRows {
if row[pkCol] == nil {
continue
@@ -220,7 +279,6 @@ func (s *SyncEngine) RunSync(config SyncConfig) SyncResult {
targetMap[pkVal] = row
}
sourcePKSet := make(map[string]struct{}, len(sourceRows))
for _, sRow := range sourceRows {
if sRow[pkCol] == nil {
continue
@@ -230,7 +288,6 @@ func (s *SyncEngine) RunSync(config SyncConfig) SyncResult {
continue
}
sourcePKSet[pkVal] = struct{}{}
if tRow, exists := targetMap[pkVal]; exists {
changes := make(map[string]interface{})
for k, v := range sRow {
@@ -239,17 +296,12 @@ func (s *SyncEngine) RunSync(config SyncConfig) SyncResult {
}
}
if len(changes) > 0 {
updates = append(updates, connection.UpdateRow{
Keys: map[string]interface{}{pkCol: sRow[pkCol]},
Values: changes,
})
updates = append(updates, connection.UpdateRow{Keys: map[string]interface{}{pkCol: sRow[pkCol]}, Values: changes})
}
} else {
inserts = append(inserts, sRow)
}
}
var deletes []map[string]interface{}
if opts.Delete {
for pkStr, row := range targetMap {
if _, ok := sourcePKSet[pkStr]; ok {
@@ -258,150 +310,49 @@ func (s *SyncEngine) RunSync(config SyncConfig) SyncResult {
 				deletes = append(deletes, map[string]interface{}{pkCol: row[pkCol]})
 			}
 		}
 		// apply operation selection
 		inserts = filterRowsByPKSelection(pkCol, inserts, opts.Insert, opts.SelectedInsertPKs)
 		updates = filterUpdatesByPKSelection(pkCol, updates, opts.Update, opts.SelectedUpdatePKs)
 		deletes = filterRowsByPKSelection(pkCol, deletes, opts.Delete, opts.SelectedDeletePKs)
-		changeSet := connection.ChangeSet{
-			Inserts: inserts,
-			Updates: updates,
-			Deletes: deletes,
-		}
+	} else {
+		inserts = sourceRows
+		if !opts.Insert {
+			inserts = nil
+		}
+		if tableMode == "full_overwrite" && plan.TargetTableExists {
+			s.appendLog(config.JobID, &result, "warn", fmt.Sprintf(" -> 全量覆盖模式:即将清空目标表 %s", tableName))
+			s.progress(config.JobID, i, totalTables, tableName, "清空目标表")
+			clearSQL := ""
+			if targetType == "mysql" {
+				clearSQL = fmt.Sprintf("TRUNCATE TABLE %s", quoteQualifiedIdentByType(targetType, targetQueryTable))
+			} else {
+				clearSQL = fmt.Sprintf("DELETE FROM %s", quoteQualifiedIdentByType(targetType, targetQueryTable))
+			}
+			if _, err := targetDB.Exec(clearSQL); err != nil {
+				s.appendLog(config.JobID, &result, "error", fmt.Sprintf(" -> 清空目标表失败: %v", err))
+				return
+			}
+		}
+	}
-		// 4. Align schema (target missing columns)
-		s.progress(config.JobID, i, totalTables, tableName, "检查字段一致性")
-		requiredCols := collectRequiredColumns(changeSet.Inserts, changeSet.Updates)
-		targetCols, err := targetDB.GetColumns(targetSchema, targetTable)
+	changeSet := connection.ChangeSet{Inserts: inserts, Updates: updates, Deletes: deletes}
+	s.progress(config.JobID, i, totalTables, tableName, "检查字段一致性")
+	targetColsResolved := targetCols
+	if len(targetColsResolved) == 0 {
+		targetColsResolved, err = targetDB.GetColumns(plan.TargetSchema, plan.TargetTable)
+		if err != nil {
+			s.appendLog(config.JobID, &result, "warn", fmt.Sprintf(" -> 获取目标表字段失败,已跳过字段一致性检查: %v", err))
+		}
+	}
-		if err != nil {
-			s.appendLog(config.JobID, &result, "warn", fmt.Sprintf(" -> 获取目标表字段失败,已跳过字段一致性检查: %v", err))
-		} else {
-			targetColSet := make(map[string]struct{}, len(targetCols))
-			for _, c := range targetCols {
-				name := strings.ToLower(strings.TrimSpace(c.Name))
-				if name == "" {
-					continue
-				}
-				targetColSet[name] = struct{}{}
-			}
-			missing := make([]string, 0)
-			for lower, original := range requiredCols {
-				if _, ok := targetColSet[lower]; !ok {
-					missing = append(missing, original)
-				}
-			}
-			sort.Strings(missing)
-			if len(missing) > 0 {
-				if config.AutoAddColumns && strings.ToLower(strings.TrimSpace(config.TargetConfig.Type)) == "mysql" {
-					s.appendLog(config.JobID, &result, "warn", fmt.Sprintf(" -> 目标表缺少字段 %d 个,开始自动补齐: %s", len(missing), strings.Join(missing, ", ")))
-					added := 0
-					for _, colName := range missing {
-						colLower := strings.ToLower(strings.TrimSpace(colName))
-						colType := "TEXT"
-						if strings.ToLower(strings.TrimSpace(config.SourceConfig.Type)) == "mysql" {
-							if srcCol, ok := sourceColsByLower[colLower]; ok {
-								colType = sanitizeMySQLColumnType(srcCol.Type)
-							}
-						}
-						alterSQL := fmt.Sprintf("ALTER TABLE %s ADD COLUMN %s %s NULL",
-							quoteQualifiedIdentByType("mysql", targetQueryTable),
-							quoteIdentByType("mysql", colName),
-							colType,
-						)
-						if _, err := targetDB.Exec(alterSQL); err != nil {
-							s.appendLog(config.JobID, &result, "error", fmt.Sprintf(" -> 自动补字段失败:字段=%s 错误=%v", colName, err))
-							continue
-						}
-						added++
-					}
-					s.appendLog(config.JobID, &result, "info", fmt.Sprintf(" -> 自动补字段完成:成功=%d 失败=%d", added, len(missing)-added))
-					// refresh columns
-					targetCols, err = targetDB.GetColumns(targetSchema, targetTable)
-					if err == nil {
-						targetColSet = make(map[string]struct{}, len(targetCols))
-						for _, c := range targetCols {
-							name := strings.ToLower(strings.TrimSpace(c.Name))
-							if name == "" {
-								continue
-							}
-							targetColSet[name] = struct{}{}
-						}
-					}
-				} else {
-					s.appendLog(config.JobID, &result, "warn", fmt.Sprintf(" -> 目标表缺少字段 %d 个(未开启自动补齐),将自动忽略:%s", len(missing), strings.Join(missing, ", ")))
-				}
-				// filter out still-missing columns to avoid apply failure
-				changeSet.Inserts = filterInsertRows(changeSet.Inserts, targetColSet)
-				changeSet.Updates = filterUpdateRows(changeSet.Updates, targetColSet)
-			}
-		}
-		// 5. Apply Changes
-		s.progress(config.JobID, i, totalTables, tableName, "应用变更")
-		if len(changeSet.Inserts) > 0 || len(changeSet.Updates) > 0 || len(changeSet.Deletes) > 0 {
-			s.appendLog(config.JobID, &result, "info", fmt.Sprintf(" -> 需插入: %d 行, 需更新: %d 行, 需删除: %d 行", len(changeSet.Inserts), len(changeSet.Updates), len(changeSet.Deletes)))
-			if applier, ok := targetDB.(db.BatchApplier); ok {
-				if err := applier.ApplyChanges(targetTable, changeSet); err != nil {
-					s.appendLog(config.JobID, &result, "error", fmt.Sprintf(" -> 应用变更失败: %v", err))
-				} else {
-					result.RowsInserted += len(changeSet.Inserts)
-					result.RowsUpdated += len(changeSet.Updates)
-					result.RowsDeleted += len(changeSet.Deletes)
-				}
-			} else {
-				s.appendLog(config.JobID, &result, "warn", " -> 目标驱动不支持应用数据变更 (ApplyChanges).")
-			}
-		} else {
-			s.appendLog(config.JobID, &result, "info", " -> 数据一致,无需变更.")
-		}
-		result.TablesSynced++
-		return
-	} else {
-		// insert_only / full_overwrite: do not compare target, just insert source rows
-		inserts = sourceRows
-	}
-	// full_overwrite: clear target table first
-	if tableMode == "full_overwrite" {
-		s.appendLog(config.JobID, &result, "warn", fmt.Sprintf(" -> 全量覆盖模式:即将清空目标表 %s", tableName))
-		s.progress(config.JobID, i, totalTables, tableName, "清空目标表")
-		clearSQL := ""
-		if strings.ToLower(strings.TrimSpace(config.TargetConfig.Type)) == "mysql" {
-			clearSQL = fmt.Sprintf("TRUNCATE TABLE %s", quoteQualifiedIdentByType(config.TargetConfig.Type, targetQueryTable))
-		} else {
-			clearSQL = fmt.Sprintf("DELETE FROM %s", quoteQualifiedIdentByType(config.TargetConfig.Type, targetQueryTable))
-		}
-		if _, err := targetDB.Exec(clearSQL); err != nil {
-			s.appendLog(config.JobID, &result, "error", fmt.Sprintf(" -> 清空目标表失败: %v", err))
-			return
-		}
-	}
-	// 4. Align schema (target missing columns)
-	s.progress(config.JobID, i, totalTables, tableName, "检查字段一致性")
-	requiredCols := collectRequiredColumns(inserts, updates)
-	targetCols, err := targetDB.GetColumns(targetSchema, targetTable)
-	if err != nil {
-		s.appendLog(config.JobID, &result, "warn", fmt.Sprintf(" -> 获取目标表字段失败,已跳过字段一致性检查: %v", err))
-	} else {
-		targetColSet := make(map[string]struct{}, len(targetCols))
-		for _, c := range targetCols {
+	if len(targetColsResolved) > 0 {
+		targetColSet := make(map[string]struct{}, len(targetColsResolved))
+		for _, c := range targetColsResolved {
 			name := strings.ToLower(strings.TrimSpace(c.Name))
 			if name == "" {
 				continue
 			}
 			targetColSet[name] = struct{}{}
 		}
+		requiredCols := collectRequiredColumns(changeSet.Inserts, changeSet.Updates)
 		missing := make([]string, 0)
 		for lower, original := range requiredCols {
 			if _, ok := targetColSet[lower]; !ok {
@@ -409,79 +360,64 @@ func (s *SyncEngine) RunSync(config SyncConfig) SyncResult {
 				missing = append(missing, original)
 			}
 		}
 		sort.Strings(missing)
 		if len(missing) > 0 {
-			if config.AutoAddColumns && strings.ToLower(strings.TrimSpace(config.TargetConfig.Type)) == "mysql" {
+			if config.AutoAddColumns && supportsAutoAddColumnsForPair(sourceType, targetType) {
 				s.appendLog(config.JobID, &result, "warn", fmt.Sprintf(" -> 目标表缺少字段 %d 个,开始自动补齐: %s", len(missing), strings.Join(missing, ", ")))
 				added := 0
 				for _, colName := range missing {
 					colLower := strings.ToLower(strings.TrimSpace(colName))
-					colType := "TEXT"
-					if strings.ToLower(strings.TrimSpace(config.SourceConfig.Type)) == "mysql" {
-						if srcCol, ok := sourceColsByLower[colLower]; ok {
-							colType = sanitizeMySQLColumnType(srcCol.Type)
-						}
-					}
+					srcCol, ok := sourceColsByLower[colLower]
+					if !ok {
+						continue
+					}
+					alterSQL, err := buildAddColumnSQLForPair(sourceType, targetType, targetQueryTable, srcCol)
+					if err != nil {
+						s.appendLog(config.JobID, &result, "error", fmt.Sprintf(" -> 自动补字段失败:字段=%s 错误=%v", colName, err))
+						continue
+					}
-					alterSQL := fmt.Sprintf("ALTER TABLE %s ADD COLUMN %s %s NULL",
-						quoteQualifiedIdentByType("mysql", targetQueryTable),
-						quoteIdentByType("mysql", colName),
-						colType,
-					)
 					if _, err := targetDB.Exec(alterSQL); err != nil {
 						s.appendLog(config.JobID, &result, "error", fmt.Sprintf(" -> 自动补字段失败:字段=%s 错误=%v", colName, err))
 						continue
 					}
 					added++
+					targetColSet[colLower] = struct{}{}
 				}
 				s.appendLog(config.JobID, &result, "info", fmt.Sprintf(" -> 自动补字段完成:成功=%d 失败=%d", added, len(missing)-added))
-				// refresh columns
-				targetCols, err = targetDB.GetColumns(targetSchema, targetTable)
-				if err == nil {
-					targetColSet = make(map[string]struct{}, len(targetCols))
-					for _, c := range targetCols {
-						name := strings.ToLower(strings.TrimSpace(c.Name))
-						if name == "" {
-							continue
-						}
-						targetColSet[name] = struct{}{}
-					}
-				}
 			} else {
 				s.appendLog(config.JobID, &result, "warn", fmt.Sprintf(" -> 目标表缺少字段 %d 个(未开启自动补齐),将自动忽略:%s", len(missing), strings.Join(missing, ", ")))
 			}
 			// filter out still-missing columns to avoid apply failure
-			inserts = filterInsertRows(inserts, targetColSet)
-			updates = filterUpdateRows(updates, targetColSet)
+			changeSet.Inserts = filterInsertRows(changeSet.Inserts, targetColSet)
+			changeSet.Updates = filterUpdateRows(changeSet.Updates, targetColSet)
 		}
 	}
 	// 5. Apply Changes
 	s.progress(config.JobID, i, totalTables, tableName, "应用变更")
-	changeSet := connection.ChangeSet{
-		Inserts: inserts,
-		Updates: updates,
-	}
-	if len(changeSet.Inserts) > 0 || len(changeSet.Updates) > 0 {
-		s.appendLog(config.JobID, &result, "info", fmt.Sprintf(" -> 需插入: %d 行, 需更新: %d 行", len(changeSet.Inserts), len(changeSet.Updates)))
+	if len(changeSet.Inserts) > 0 || len(changeSet.Updates) > 0 || len(changeSet.Deletes) > 0 {
+		s.appendLog(config.JobID, &result, "info", fmt.Sprintf(" -> 需插入: %d 行, 需更新: %d 行, 需删除: %d 行", len(changeSet.Inserts), len(changeSet.Updates), len(changeSet.Deletes)))
 		if applier, ok := targetDB.(db.BatchApplier); ok {
-			if err := applier.ApplyChanges(targetTable, changeSet); err != nil {
+			if err := applier.ApplyChanges(applyTableName, changeSet); err != nil {
 				s.appendLog(config.JobID, &result, "error", fmt.Sprintf(" -> 应用变更失败: %v", err))
-			} else {
-				result.RowsInserted += len(changeSet.Inserts)
-				result.RowsUpdated += len(changeSet.Updates)
-			}
+				return
+			}
+			result.RowsInserted += len(changeSet.Inserts)
+			result.RowsUpdated += len(changeSet.Updates)
+			result.RowsDeleted += len(changeSet.Deletes)
 		} else {
 			s.appendLog(config.JobID, &result, "warn", " -> 目标驱动不支持应用数据变更 (ApplyChanges).")
+			return
 		}
 	} else {
 		s.appendLog(config.JobID, &result, "info", " -> 数据一致,无需变更.")
 	}
+	if len(plan.PostDataSQL) > 0 {
+		s.progress(config.JobID, i, totalTables, tableName, "创建索引")
+		if err := executeSQLStatements(targetDB.Exec, plan.PostDataSQL); err != nil {
+			s.appendLog(config.JobID, &result, "error", fmt.Sprintf("创建索引失败:表=%s 错误=%v", tableName, err))
+			return
+		}
+	}
 	result.TablesSynced++
 	}()
 }
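The schema-alignment step above drops any row column that is still missing on the target before calling `ApplyChanges`, so batch application cannot fail on unknown columns. The helpers themselves (`filterInsertRows` / `filterUpdateRows`) are not shown in this diff; a minimal sketch of the insert-row variant, assuming a `[]map[string]interface{}` row shape and the lower-cased `targetColSet` built in `RunSync` (the real signatures in the repository may differ), could look like:

```go
package main

import (
	"fmt"
	"strings"
)

// filterInsertRows is a hypothetical sketch of the helper referenced in the
// diff: for each row it keeps only columns that exist on the target table.
func filterInsertRows(rows []map[string]interface{}, targetColSet map[string]struct{}) []map[string]interface{} {
	out := make([]map[string]interface{}, 0, len(rows))
	for _, row := range rows {
		filtered := make(map[string]interface{}, len(row))
		for col, val := range row {
			// Compare case-insensitively, matching how the diff builds
			// targetColSet from lower-cased, trimmed column names.
			if _, ok := targetColSet[strings.ToLower(strings.TrimSpace(col))]; ok {
				filtered[col] = val
			}
		}
		out = append(out, filtered)
	}
	return out
}

func main() {
	targetColSet := map[string]struct{}{"id": {}, "name": {}}
	rows := []map[string]interface{}{{"id": 1, "name": "a", "legacy_col": "x"}}
	fmt.Println(filterInsertRows(rows, targetColSet))
}
```

Note the filter drops columns, never rows: a row that loses every column still reaches the applier as an empty map, which matches the "自动忽略" (auto-ignore) wording in the log message rather than skipping the row outright.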
@@ -554,3 +490,26 @@ func (s *SyncEngine) fail(jobID string, totalTables int, res SyncResult, msg str
 	s.progress(jobID, res.TablesSynced, totalTables, "", "同步失败")
 	return res
 }
+
+func (s *SyncEngine) execDDLStatements(jobID string, res *SyncResult, database db.Database, tableName string, stage string, statements []string) error {
+	for _, statement := range statements {
+		sqlText := strings.TrimSpace(statement)
+		if sqlText == "" {
+			continue
+		}
+		if _, err := database.Exec(sqlText); err != nil {
+			return fmt.Errorf("%s失败: %w", stage, err)
+		}
+		s.appendLog(jobID, res, "info", fmt.Sprintf("表 %s %s成功%s", tableName, stage, shortenSyncSQL(sqlText)))
+	}
+	return nil
+}
+
+func shortenSyncSQL(sqlText string) string {
+	text := strings.TrimSpace(strings.ReplaceAll(strings.ReplaceAll(sqlText, "\n", " "), "\t", " "))
+	text = strings.Join(strings.Fields(text), " ")
+	if len(text) <= 120 {
+		return text
+	}
+	return text[:117] + "..."
+}


@@ -27,4 +27,3 @@ type Reporter struct {
 	OnLog      func(event SyncLogEvent)
 	OnProgress func(event SyncProgressEvent)
 }
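The `Reporter` struct keeps the sync engine UI-agnostic: log and progress events are pushed through optional callbacks rather than written to any concrete sink. A minimal usage sketch — the `SyncLogEvent` / `SyncProgressEvent` fields below are hypothetical stand-ins, since their definitions are not part of this diff:

```go
package main

import "fmt"

// Hypothetical minimal stand-ins for the event types consumed by Reporter;
// the real field sets in the repository are not shown in this diff.
type SyncLogEvent struct {
	JobID, Level, Message string
}

type SyncProgressEvent struct {
	JobID, Table, Stage string
	Done, Total         int
}

// Reporter mirrors the struct in the diff: both callbacks are optional,
// so callers must nil-check before invoking.
type Reporter struct {
	OnLog      func(event SyncLogEvent)
	OnProgress func(event SyncProgressEvent)
}

func main() {
	r := Reporter{
		OnLog: func(e SyncLogEvent) { fmt.Printf("[%s] %s\n", e.Level, e.Message) },
		OnProgress: func(e SyncProgressEvent) {
			fmt.Printf("%s %d/%d %s\n", e.Table, e.Done, e.Total, e.Stage)
		},
	}
	if r.OnLog != nil {
		r.OnLog(SyncLogEvent{JobID: "job-1", Level: "info", Message: "sync started"})
	}
	if r.OnProgress != nil {
		r.OnProgress(SyncProgressEvent{JobID: "job-1", Table: "users", Done: 1, Total: 3, Stage: "apply"})
	}
}
```

Because the zero value of `Reporter` has nil callbacks, the engine can emit events unconditionally through small nil-checking wrappers instead of forcing every caller to wire up both hooks.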