Compare commits

23 Commits

Author SHA1 Message Date
木炎
c60cd308ca feat: service console auto-connect, settings panel, and batch of enhancements
- Auto-connect WebSocket on page load in service console
- Settings modal for editing sgclaw_config.json (API key, base URL, model, skills dir, etc.)
- UpdateConfig/ConfigUpdated protocol messages for remote config save
- save_to_path() for SgClawSettings serialization
- ConfigUpdated handler in sg_claw_client binary
- Protocol serialization tests for new message types
- HTML test assertions for auto-connect and settings UI
- Additional pending changes: deterministic submit, org units, lineloss xlsx export, browser script tool, and docs

🤖 Generated with [Qoder][https://qoder.com]
2026-04-14 14:32:46 +08:00
木炎
6aa0c110bd fix(callback_host): close orphaned helper page before opening on same WS connection
Sends sgHideBrowerserClosePage (best-effort) before sgHideBrowerserOpenPage
on the same bootstrap WebSocket connection. This prevents duplicate helper
pages across process restarts. Also enables hidden domain mode so the helper
page is not visible to users.

🤖 Generated with [Qoder][https://qoder.com]
2026-04-14 10:01:50 +08:00
木炎
390a431a4b fix(callback_host): revert close_helper_page that broke helper page loading
The close_helper_page function opened a second browser WebSocket
connection during Drop and sent a close command directly via the WS
bypassing the HTTP polling system. This interfered with the browser's
normal state and caused the helper page to fail to open.

The cached_host lift (previous commit) already solves the duplicate
helper page issue within a single process lifetime. The Drop-based
close logic is deferred until a proper cleanup mechanism is designed.

🤖 Generated with [Qoder][https://qoder.com]
2026-04-14 09:41:33 +08:00
木炎
0f70702914 test(callback_host): add hidden domain bootstrap test
🤖 Generated with [Qoder][https://qoder.com]
2026-04-14 09:21:17 +08:00
木炎
8decd9554c fix(service): lift cached_host to outer loop to prevent duplicate helper pages
🤖 Generated with [Qoder][https://qoder.com]
2026-04-14 09:15:33 +08:00
木炎
adb64429ee feat(callback_host): close helper page on Drop via browser WS
🤖 Generated with [Qoder][https://qoder.com]
2026-04-14 09:09:59 +08:00
木炎
32e2c59a40 feat(callback_host): add use_hidden_domain param to bootstrap_helper_page
🤖 Generated with [Qoder][https://qoder.com]
2026-04-14 09:07:20 +08:00
木炎
fae2fd57d6 docs: add helper page lifecycle fix implementation plan
🤖 Generated with [Qoder][https://qoder.com]
2026-04-14 09:01:59 +08:00
木炎
899c670e5c docs: add helper page lifecycle fix & hidden domain design spec
🤖 Generated with [Qoder][https://qoder.com]
2026-04-14 08:59:15 +08:00
木炎
583bb117cb docs: add async eval .then() fix design spec
🤖 Generated with [Qoder][https://qoder.com]
2026-04-13 18:32:05 +08:00
木炎
ad3778d4c5 fix: pass expected_domain to wrapped browser scripts
The `expected_domain` was removed from args for normalization but never
re-inserted, causing JS scripts to receive empty expected_domain and
report "missing_expected_domain" errors.

🤖 Generated with [Qoder][https://qoder.com]
2026-04-13 17:20:48 +08:00
木炎
4d1070dff0 docs: add expected_domain arg fix spec
🤖 Generated with [Qoder][https://qoder.com]
2026-04-13 17:13:53 +08:00
木炎
0303111d5b test: add async browser script test case
🤖 Generated with [Qoder][https://qoder.com]
2026-04-13 16:13:46 +08:00
木炎
7320fb7f79 fix: support async browser scripts in build_eval_js
Wrap eval script in async IIFE and await Promise-like results.
Fixes Promise serialization returning '{}' for async skill scripts.

🤖 Generated with [Qoder][https://qoder.com]
2026-04-13 16:12:08 +08:00
木炎
dbbc5d030b docs: add async browser script support implementation plan
Plan for modifying build_eval_js to support async scripts.
Two tasks: modify callback_backend.rs, add test case.

🤖 Generated with [Qoder][https://qoder.com]
2026-04-13 16:09:09 +08:00
木炎
ce6b3e6749 docs: add async browser script support design
Design for fixing Promise serialization issue in build_eval_js.
Async functions return Promise which gets JSON.stringify'd to "{}".

🤖 Generated with [Qoder][https://qoder.com]
2026-04-13 16:06:41 +08:00
木炎
a957712590 fix: add target_url param for Action::Eval in browser_script_skill_tool
🤖 Generated with [Qoder][https://qoder.com]
2026-04-13 15:03:49 +08:00
木炎
0ebe060484 docs: add lineloss target_url fix implementation plan
🤖 Generated with [Qoder][https://qoder.com]
2026-04-13 15:01:23 +08:00
木炎
695a888840 docs: add lineloss target_url fix design spec
🤖 Generated with [Qoder][https://qoder.com]
2026-04-13 14:59:51 +08:00
木炎
733aee1e9a feat: add lineloss URL mapping in derive_request_url_from_instruction
Temporary approach: when the instruction contains '线损' or 'lineloss', return the station-area lineloss platform URL

🤖 Generated with [Qoder][https://qoder.com]
2026-04-13 14:41:14 +08:00
木炎
f8f822e1f3 test: add lineloss requesturl mapping test
2026-04-13 14:38:03 +08:00
木炎
3b156e4bd1 docs: add lineloss requesturl fix implementation plan
🤖 Generated with [Qoder][https://qoder.com]
2026-04-13 14:34:35 +08:00
木炎
645dc60bae docs: add lineloss requesturl fix design spec
Temporary approach: add the station-area lineloss URL mapping in derive_request_url_from_instruction

🤖 Generated with [Qoder][https://qoder.com]
2026-04-13 14:31:39 +08:00
44 changed files with 6447 additions and 33 deletions

Binary file not shown.


@@ -0,0 +1,10 @@
{
"last_run_at": "2026-04-12T03:08:25.764151200+00:00",
"last_report": {
"archived_memory_files": 0,
"archived_session_files": 0,
"purged_memory_archives": 0,
"purged_session_archives": 0,
"pruned_conversation_rows": 0
}
}


@@ -0,0 +1,422 @@
# collect_lineloss.js: full troubleshooting record, from generation to working
This document records the entire troubleshooting process for the `tq-lineloss-report` skill script, from initial generation to a working state, covering every error encountered, its root-cause analysis, and the fix applied. It can serve as a debugging template for similar skill development.
---
## Background
### Architecture overview
```
User input: "兰州公司 月累计 2026-03。。。"
sgClaw Rust process
├── Parse instruction → DeterministicExecutionPlan
├── Read the collect_lineloss.js script
├── Wrap it in an IIFE: (function(){ const args = {...}; <script body> })()
├── Call sgBrowserExcuteJsCodeByDomain(domain, wrappedJs)
│     which injects the script into the browser page matching domain
├── Wait for the callback: the script returns a JSON result via callBackJsToCpp
├── Parse the artifact JSON → extract status/rows/reasons
└── Generate the XLSX (Rust side) → return the outcome
```
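The wrapping step above can be sketched in plain JavaScript. This is an illustrative sketch only; the real wrapper string is assembled by `build_eval_js` on the Rust side, and the names `wrapSkillScript` and `scriptBody` are assumptions for illustration:

```javascript
// Illustrative sketch of how a skill script is wrapped before injection.
// The args object is serialized into the IIFE so the injected script can read
// expected_domain, org_code, period_value, etc. from a local `args` binding.
function wrapSkillScript(scriptBody, args) {
  return '(function(){ const args = ' + JSON.stringify(args) + '; ' + scriptBody + ' })()';
}

const wrapped = wrapSkillScript(
  'return args.expected_domain;',
  { expected_domain: '20.76.57.61', period_mode: 'month' }
);
console.log(wrapped);
```

Evaluating the wrapped string on the target page runs the script body with `args` in scope and yields its return value.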
### Key differences: original scene vs. skill mode
| Aspect | Original scene (index.html) | Skill mode |
|--------|----------------------------|------------|
| Script injection | `sgBrowserExcuteJsCode(exactURL, js)` — exact URL | `sgBrowserExcuteJsCodeByDomain(domain, js)` — domain match only |
| Execution page | business sub-page `/tqLinelossStatis/tqQualifyRateMonitor` | may hit the parent frame page `/gsllys` |
| `window.mac` | present (Vue instance; `mounted()` sets `window.mac = this`) | absent (no Vue instance) |
| Excel export | JS calls `localhost:13313` (reachable from the local scene page) | JS cannot call `localhost:13313` (blocked by CORS) |
| Result return | Rust only needs the `.then()` callback result | same, but the script is an async function and needs `.then()` handling |
---
## Troubleshooting timeline
### Phase 1: basic pipeline issues
#### Problem 1: `missing_expected_domain`
**Symptom**: `status=blocked reasons=missing_expected_domain`
**Root cause**: the Rust side (`deterministic_submit.rs`) did not pass an `expected_domain` field when constructing args. `derive_expected_domain()` extracted the host from `page_url` (domain only, no port), but the key did not match when the value was inserted into args.
**Fix**: make `deterministic_submit_args()` insert `expected_domain` into the args map correctly.
**Files involved**: `src/compat/deterministic_submit.rs`
**Recompile required**: yes
---
#### Problem 2: `target_url` missing the port number
**Symptom**: script injection failed, or the script was injected into the wrong page.
**Root cause**: `target_url` was set to `http://20.76.57.61` (no port), but the actual business page lives at `http://20.76.57.61:18080/gsllys/...`, and `sgBrowserExcuteJsCodeByDomain` needs to match the correct tab.
**Fix**: set the full `target_url` in `deterministic_submit.rs`:
```rust
const LINELLOSS_TARGET_URL: &str = "http://20.76.57.61:18080/gsllys/tqLinelossStatis/tqQualifyRateMonitor";
```
**Files involved**: `src/compat/deterministic_submit.rs`
**Recompile required**: yes
---
#### Problem 3: script returns an empty `{}` object
**Symptom**: the artifact received on the Rust side was `{}`, with no data at all.
**Root cause**: the entry point of `collect_lineloss.js`, `buildBrowserEntrypointResult()`, is an `async` function and returns a Promise. The `build_eval_js` wrapper on the Rust side originally called `_s(v)` directly to send the result, but `v` was a Promise object, which JSON.stringify turns into `{}`.
**Fix**: add Promise detection to `build_eval_js` (in `callback_backend.rs`):
```rust
// old code
"_s(v);"
// new code
"if(v&&typeof v.then==='function'){v.then(_s).catch(function(){});}else{_s(v);}"
```
If the return value is a thenable (a Promise), wait for it to resolve before sending the callback.
**Files involved**: `src/browser/callback_backend.rs`, the `build_eval_js` function
**Recompile required**: yes
**Lesson**: every browser_script skill whose entry function is async (and therefore returns a Promise) needs this `.then()` handling. This is a generic pipeline-level fix.
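The thenable check embedded in the wrapper can be exercised in plain JavaScript. In this minimal sketch, `dispatchResult` and `send` are illustrative names standing in for the generated wrapper code and the real callback channel:

```javascript
// Minimal sketch of the async-aware result dispatch used by the eval wrapper.
function dispatchResult(v, send) {
  // A thenable (e.g. a Promise from an async entry function) must be awaited;
  // otherwise JSON.stringify(v) on the receiver would yield "{}".
  if (v && typeof v.then === 'function') {
    v.then(send).catch(function () {});
  } else {
    send(v);
  }
}

// Sync result: delivered immediately.
dispatchResult({ status: 'success' }, (r) => console.log(JSON.stringify(r)));
// → {"status":"success"}

// Async result: delivered after the Promise resolves.
async function entry() { return { status: 'success', rows: 12 }; }
dispatchResult(entry(), (r) => console.log(JSON.stringify(r)));
// → {"status":"success","rows":12}
```

Without the `typeof v.then === 'function'` branch, the second call would serialize the pending Promise itself and reproduce the empty `{}` artifact.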
---
### Phase 2: page context issues
#### Problem 4: `page_context_unavailable` (mac_missing)
**Symptom**:
```
tq-lineloss-report 国网兰州供电公司 2026-03 status=blocked rows=0 reasons=page_context_unavailable
```
**Investigation**:
1. Add diagnostic output inside `validatePageContext`:
```javascript
// temporary diagnostic code
const diag = 'href=' + href + '|host=' + host + '|port=' + port + '|title=' + title + '|mac=' + hasMac;
return { ok: false, reason: 'page_context_unavailable:mac_missing|' + diag };
```
2. Diagnostics returned by the page:
```
href=http://20.76.57.61:18080/gsllys
host=20.76.57.61
port=18080
title=台区线损大数据分析模块
mac=false
```
**Root cause**: `sgBrowserExcuteJsCodeByDomain("20.76.57.61")` matched the parent frame page `/gsllys` rather than the business sub-page. `window.mac` is the business sub-page's Vue instance, set via `window.mac = this` in `mounted()`; the parent frame page has no such instance.
**Key insight**: in skill mode there is no Vue instance, so the `window.mac` check is architecturally inapplicable. The script issues AJAX requests with absolute URLs and does not depend on page-local state.
**Fix**: drop the `globalThis.mac` check and keep only the host match:
```javascript
// before the fix
validatePageContext(args) {
  // ... mac check + diagnostic code
  if (!hasMac) {
    return { ok: false, reason: 'page_context_unavailable:mac_missing|' + diag };
  }
}
// after the fix
validatePageContext(args) {
  const host = normalizeText(globalThis.location?.hostname);
  const expected = normalizeText(args.expected_domain);
  if (!host) {
    return { ok: false, reason: 'page_context_unavailable' };
  }
  if (host !== expected) {
    return { ok: false, reason: 'page_context_mismatch' };
  }
  return { ok: true };
},
```
**Files involved**: `collect_lineloss.js`, the `validatePageContext` function
**Recompile required**: no (the JS file is read at runtime)
**Debugging tip**: concatenate the diagnostics (href/host/port/title/mac) into `reasons`. No F12 console is needed — they show up directly in the Rust-side summary output.
---
### Phase 3: API request issues
#### Problem 5: `api_query_failed` — HTML returned instead of JSON
**Symptom**:
```
status=error rows=0 reasons=api_query_failed:month_api_failed: SyntaxError: Unexpected token '<', "<!DOCTYPE "... is not valid JSON
```
**Root cause**: the backend service saw a request without the `X-Requested-With: XMLHttpRequest` header, treated it as a non-AJAX request, and returned an HTML login page. jQuery's `$.ajax` does not add this header automatically.
**Fix**: add the request header to the `$.ajax` calls in `queryMonthData` and `queryWeekData`:
```javascript
$.ajax({
  url,
  type: 'POST',
  dataType: 'json',
  crossDomain: true,
  headers: { 'X-Requested-With': 'XMLHttpRequest' }, // <-- added
  data: request,
  contentType: 'application/x-www-form-urlencoded;charset=UTF-8',
  success: resolve,
  error: (xhr, _status, err) => reject(new Error(
    `month_api_failed(${xhr.status}): ${String(err)}|body=${String(xhr.responseText || '').substring(0, 200)}`
  ))
});
```
**Files involved**: `collect_lineloss.js`, `queryMonthData` and `queryWeekData`
**Recompile required**: no
**Debugging tip**: concatenate the first 200 characters of `xhr.responseText` into reasons inside the error handler. If it starts with `<!DOCTYPE`, the backend returned HTML instead of JSON.
**General rule**: intranet Java backends commonly rely on `X-Requested-With: XMLHttpRequest` to distinguish page requests from AJAX requests. Add this header to every `$.ajax` call against intranet APIs.
---
### Phase 4: data normalization issues
#### Problem 6: `row_normalization_failed` — column names mismatch
**Symptom**:
```
status=error rows=0 reasons=row_normalization_failed:rawRows=12|keys=YGDL,ORG_NO,YXSL,TG_NUM...
```
**Root cause**: the initially generated `MONTH_COLUMN_DEFS` used guessed column names:
```javascript
// wrong column names
['LINE_LOSS_RATE', '线损完成率(%)'],
['PPQ', '累计供电量'],
['UPQ', '累计售电量'],
```
whereas the API actually returns these columns (see `cols2` in the original scene's `index.html`):
```javascript
// correct column names
['ORG_NAME', '供电单位'],
['YGDL', '累计供电量'],
['YYDL', '累计售电量'],
['YXSL', '线损完成率(%)'],
['RAT_SCOPE', '线损率累计目标值'],
['BLANK3', '目标完成率'],
['BLANK2', '排行']
```
**Fix**: correct `MONTH_COLUMN_DEFS` to match the `cols2` definition in the original scene's `index.html`.
**Debugging tip**: concatenating `rawRows.length` and `Object.keys(rawRows[0]).join(',')` into `reasons` shows exactly which fields the API returned.
**General rule**: when generating a skill script, copy column definitions verbatim from the original scene's code; never guess. Look for `cols1`/`cols2` or the table-rendering code.
---
#### Problem 7: `row_normalization_failed` — incompatible value types
**Symptom**: after the column names were corrected, `row_normalization_failed:rawRows=12` persisted; all 12 rows were filtered out.
**Root cause**: the `pickFirstNonEmpty()` function only recognizes string values:
```javascript
function pickFirstNonEmpty(...values) {
  for (const value of values) {
    if (isNonEmptyString(value)) { // isNonEmptyString: typeof value === 'string'
      return value.trim();
    }
  }
  return ''; // the API returns numbers (12345.67, typeof === 'number'), treated as empty
}
```
The API returns numeric field values (e.g. `YGDL: 12345.67`), not strings. `pickFirstNonEmpty` returned `''` for numbers, so every field of every row came back empty and all rows were filtered out.
**Fix**: stop using `pickFirstNonEmpty` in `normalizeMonthRow`; handle values of any type directly:
```javascript
// before the fix
function normalizeMonthRow(rawRow) {
  const row = {};
  for (const key of MONTH_COLUMNS) {
    row[key] = pickFirstNonEmpty(rawRow?.[key]); // number → ''
  }
  return MONTH_COLUMNS.every((key) => row[key] !== '') ? row : null;
}
// after the fix
function normalizeMonthRow(rawRow) {
  const row = {};
  for (const key of MONTH_COLUMNS) {
    const v = rawRow?.[key];
    row[key] = (v === null || v === undefined || v === '') ? '' : String(v).trim();
  }
  return MONTH_COLUMNS.every((key) => row[key] !== '') ? row : null;
}
```
**Files involved**: `collect_lineloss.js`, `normalizeMonthRow`
**Recompile required**: no
**General rule**: numeric fields in intranet API JSON are usually `number`, not string. Row normalization must coerce with `String(v)` rather than rely on a `typeof === 'string'` check.
---
### Phase 5: export issue (architectural)
#### Problem 8: export hangs forever
**Symptom**:
```
tq-lineloss-report 国网兰州供电公司 2026-03 status=pl rows=12
```
Data collection succeeded (12 rows), but nothing ever came back afterwards; the script hung at the export step.
**Investigation**:
1. `exportWorkbook` calls `fetch('http://localhost:13313/...')` — blocked by CORS
2. Switching to `$.ajax({ crossDomain: true })` — blocked as well
3. Confirmed this is a browser security-model restriction, not a configuration problem
**Root cause**: the script runs on the remote page `http://20.76.57.61:18080`, and the browser forbids requests from a remote page to `localhost:13313` (same-origin policy plus mixed content). `crossDomain: true` only tells jQuery to use cross-domain mode; it cannot bypass browser security policy.
The original scene solves this with a local scene page (`index.html` served from `localhost`) acting as a proxy: data is collected on the remote page, passed back to the local page via `postMessage` or a callback, and the local page then calls `localhost:13313`.
Skill mode has no local scene page, so that proxy mechanism does not exist.
**Solution**: move the export logic from browser JS to the Rust side (option A2: generate the XLSX locally in Rust).
**Final architecture**:
```
JS (browser): collect data → return artifact { rows, column_defs, status }
Rust (local): parse artifact → extract rows + column_defs → generate the XLSX file
```
**Concrete changes**:
1. **JS side**: delete the export-related code (`exportWorkbook()`, `writeReportLog()`, `postJson()`, `buildExportPayload()`). Add a `column_defs` field to the artifact and set the export status to `deferred_to_rust`.
2. **Rust side**: add `lineloss_xlsx_export.rs`, generating the XLSX with the `zip` crate plus OpenXML XML. In `deterministic_submit.rs`, call the XLSX generator after receiving the artifact.
**Files involved**:
- `collect_lineloss.js` — export code removed, `column_defs` added
- `src/compat/lineloss_xlsx_export.rs` — new
- `src/compat/deterministic_submit.rs` — export integration added
- `src/compat/mod.rs` — new module registered
**Recompile required**: yes
**General rule**: any call from a remote page to `localhost` is impossible in skill mode. Features that need a local service (export, report logging, and similar) must be implemented on the Rust side.
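With export deferred to Rust, the JS side only has to shape the artifact. This is a sketch under the field names described above (`rows`, `column_defs`, `deferred_to_rust`); the exact schema in `collect_lineloss.js` may differ, and `buildArtifact` is a hypothetical helper name:

```javascript
// Sketch of the artifact shape returned once export is deferred to Rust.
function buildArtifact(rows, columnDefs) {
  return {
    status: rows.length > 0 ? 'success' : 'error',
    rows: rows,
    column_defs: columnDefs,       // Rust uses these to lay out the XLSX columns
    export: 'deferred_to_rust',    // no localhost call from the remote page
    reasons: rows.length > 0 ? [] : ['row_normalization_failed:rawRows=0']
  };
}

const artifact = buildArtifact(
  [{ ORG_NAME: '国网兰州供电公司', YGDL: '12345.67' }],
  [['ORG_NAME', '供电单位'], ['YGDL', '累计供电量']]
);
console.log(JSON.stringify(artifact));
```

The Rust side parses this JSON, iterates `column_defs` for the header row, and writes one spreadsheet row per entry in `rows`.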
---
## Debugging methodology summary
### 1. Diagnostic-injection pattern
The script runs in the browser, with no access to the F12 console. The only information channel is the `reasons` field of the artifact JSON.
```javascript
// inject detailed errors in the catch block
reasons: ['api_query_failed:' + String(error?.message || error || 'unknown')]
// inject a raw-data summary when normalization fails
reasons: ['row_normalization_failed:rawRows=' + rawRows.length + '|keys=' + Object.keys(rawRows[0]).join(',')]
// inject environment info in the page-context check
reason: 'page_context_unavailable:mac_missing|href=' + href + '|host=' + host + '|port=' + port
```
The Rust-side summary output includes these reasons, so they are visible directly in the logs.
### 2. Layer-by-layer debugging order
```
Layer 1: pipeline (Rust)
├── Are the args passed correctly? (expected_domain, target_url, org_code, ...)
├── Is the script file read correctly?
├── Is the async return value handled? (the .then() pattern)
└── Does the callback come back successfully?
Layer 2: page context (JS)
├── Which page did the script get injected into? (href, title)
├── Does the page have the required globals? (window.mac, ...)
└── Does the domain match?
Layer 3: API requests (JS)
├── Are the request headers complete? (X-Requested-With)
├── Is the response format correct? (JSON vs HTML)
└── What is the response status code?
Layer 4: data processing (JS)
├── Do the API field names match the column definitions?
├── Are the value types compatible? (number vs string)
└── Are there valid rows after normalization?
Layer 5: export (architecture)
├── Does it involve cross-origin requests?
├── Is localhost reachable?
└── Does it need to move to the Rust side?
```
### 3. Post-change verification checklist
- [ ] JS syntax check: `node -e "require('./collect_lineloss.js')"`
- [ ] If Rust code changed: `cargo build` compiles
- [ ] `cargo test` passes (excluding known pre-existing failures)
- [ ] Copy the JS file into the deployment directory
- [ ] If Rust changed: redeploy the compiled sgclaw binary
---
## Final file inventory
### JS file: `collect_lineloss.js`
**Location**: `D:/data/ideaSpace/rust/sgClaw/claw/claw/skills/skill_staging/skills/tq-lineloss-report/scripts/collect_lineloss.js`
**Does**: pure data collection. Injected into the browser, queries the lineloss platform API, returns a structured artifact.
**Does not**: no calls to localhost:13313, no Excel export, no report-log writes.
### Rust files: change list
| File | Change | Type |
|------|--------|------|
| `src/browser/callback_backend.rs` | `build_eval_js` gains `.then()` handling for async return values | generic pipeline fix |
| `src/compat/deterministic_submit.rs` | full `target_url`; XLSX export invoked after parsing the artifact | business integration |
| `src/compat/lineloss_xlsx_export.rs` | XLSX generation (zip + OpenXML) | new |
| `src/compat/mod.rs` | registers the `lineloss_xlsx_export` module | new |
---
## Quick-reuse template
When building a similar skill, check these points directly:
1. **Does `build_eval_js` support async?** If the entry function is `async`, confirm that `callback_backend.rs` has the `.then()` handling.
2. **`validatePageContext` must not check page-local state**: check only the host, never scene-page-specific globals like `window.mac` or `window.app`.
3. **API requests must carry `X-Requested-With: XMLHttpRequest`**: standard for intranet Java backends.
4. **Copy column definitions verbatim from the original scene code**: look for `cols1`/`cols2` or the table `columns` config.
5. **`normalizeRow` uses `String(v)`, not `pickFirstNonEmpty`**: the API returns numbers, not strings.
6. **Export happens on the Rust side, not in the browser**: JS returns rows + column_defs; Rust generates the XLSX.
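Taken together, the checklist points sketch a minimal async entry point. Everything here is illustrative — `runSkill` and `queryApi` are hypothetical names, not the real skill's API, and the simulated `location` is only needed for a standalone run:

```javascript
// Minimal async entry-point skeleton applying the checklist above.
async function runSkill(args) {
  // Point 2: check only the host, never page-local globals like window.mac.
  const host = String(globalThis.location?.hostname || '').trim();
  if (!host) return { status: 'blocked', rows: [], reasons: ['page_context_unavailable'] };
  if (host !== String(args.expected_domain || '').trim()) {
    return { status: 'blocked', rows: [], reasons: ['page_context_mismatch'] };
  }
  try {
    const raw = await queryApi(args); // real call must send X-Requested-With (point 3)
    // Point 5: coerce every value with String(v); API fields are often numbers.
    const rows = raw.map((r) => ({ ORG_NAME: String(r.ORG_NAME ?? '').trim() }));
    // Point 6: return rows + column_defs and let Rust generate the XLSX.
    return { status: 'success', rows, column_defs: [['ORG_NAME', '供电单位']], reasons: [] };
  } catch (error) {
    // Diagnostic-injection pattern: reasons is the only log channel.
    return { status: 'error', rows: [], reasons: ['api_query_failed:' + String(error?.message || error)] };
  }
}

// Hypothetical stand-in for the real $.ajax-based query.
async function queryApi(args) {
  return [{ ORG_NAME: '国网兰州供电公司' }];
}

// Simulate the page environment for a standalone run (present in the browser).
globalThis.location ??= { hostname: '20.76.57.61' };
runSkill({ expected_domain: '20.76.57.61' }).then((a) => console.log(a.status));
```

Because the entry point is async, it also depends on the `.then()` handling in `build_eval_js` (point 1) to deliver its result.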


@@ -0,0 +1,448 @@
# TQ Lineloss WS Dual-Transport Implementation Plan
> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.
**Goal:** Add ws communication support for the existing `tq-lineloss-report.collect_lineloss` deterministic browser_script path on the `feature/claw-ws` branch while preserving the current pipe path and validated Zhihu ws behavior.
**Architecture:** Reuse the existing backend-neutral execution seam that already exists for deterministic submit and browser_script execution. Keep lineloss business parsing, canonical args, and artifact interpretation unchanged; only make the ws backend/protocol and submit-path verification complete enough for the same lineloss skill contract to run over both pipe and ws.
**Tech Stack:** Rust 2021, Cargo tests, existing `BrowserBackend` abstraction, `WsBrowserBackend`, `ws_protocol`, browser websocket contract in `docs/_tmp_sgbrowser_ws_api_doc.txt`, existing staged `browser_script` skill execution seam.
---
## Execution Context
- Follow @superpowers:test-driven-development for each behavior change.
- Follow @superpowers:verification-before-completion before claiming each task is done.
- Do **not** create a git worktree unless the user explicitly asks.
- This plan is **ws enablement only** for the already-added lineloss deterministic skill path.
- Do **not** redesign deterministic routing, org parsing, period parsing, staged skill packaging, or artifact contracts unless a failing ws-specific test proves a minimal compatibility fix is required.
- Do **not** modify validated Zhihu hotlist/export business behavior; only add regression coverage around it.
- Preserve the current pipe execution path as the control implementation.
- Preserve the current `BrowserBackend` seam; do not introduce a second lineloss-specific ws execution path.
## Scope Boundary
### In scope
- Make the existing lineloss deterministic `browser_script` skill path run through ws on this branch.
- Keep the same canonical tool args and returned artifact interpretation for both pipe and ws.
- Verify ws browser-script execution against the documented browser ws contract.
- Add focused tests for ws lineloss execution and regressions for Zhihu ws + pipe lineloss.
### Out of scope
- Changing lineloss trigger semantics (`。。。`).
- Changing org/unit normalization semantics or source dictionary shape.
- Changing period normalization semantics.
- Reworking staged skill docs or JS business collection logic beyond ws-compatibility necessities.
- Any Zhihu feature work.
- Any pipe-only cleanup/refactor.
- Any general scene-registry redesign.
## File Map
### Expected code changes
- Modify: `src/pipe/protocol.rs:49-78,130-165,192-209`
- keep `Action::Eval` encoding aligned with the current transport contract and lineloss skill expectations
- Modify: `src/pipe/browser_tool.rs:62-125`
- ensure eval response correlation and payload handling remain sufficient for deterministic lineloss execution
- Modify only if a focused test proves it is necessary: `src/compat/browser_script_skill_tool.rs:135-255`
- preserve browser_script contract; only make minimal output-shape handling fixes if eval payloads differ from the pipe baseline in a way current code cannot consume
- Modify only if a focused parity test proves it is necessary: `src/compat/direct_skill_runtime.rs:50-129`
- preserve shared backend-neutral execution helper behavior; no business logic changes
- Read and normally leave unchanged: `src/compat/deterministic_submit.rs:96-157`
- this is the business contract baseline and should not be rewritten for transport parity work
- Read and normally leave unchanged: `src/agent/mod.rs:242-285`
- this contains the current deterministic dispatch split used by this branch
### Expected test changes
- Modify: `tests/agent_runtime_test.rs`
- add/extend deterministic lineloss runtime coverage and parity assertions using the current runtime path
- Modify: `tests/compat_runtime_test.rs`
- add/extend focused pipe lineloss regression assertions so transport work cannot silently break pipe
- Modify only if end-to-end submit coverage truly needs it: `tests/runtime_task_flow_test.rs`
- verify broader submit-flow expectations remain intact
### Reference-only files
- Read only: `docs/superpowers/plans/2026-04-11-tq-lineloss-deterministic-skill-plan.md`
- Read only: `docs/superpowers/specs/2026-04-11-tq-lineloss-deterministic-skill-design.md`
- Read only: `docs/_tmp_sgbrowser_ws_api_doc.txt`
---
## Locked contracts
### Contract 1: Same lineloss deterministic business contract on both transports
The ws path must reuse the existing values produced by `src/compat/deterministic_submit.rs:84-95` and `src/compat/deterministic_submit.rs:135-166`:
- `expected_domain`
- `org_label`
- `org_code`
- `period_mode`
- `period_mode_code`
- `period_value`
- `period_payload`
No ws-specific lineloss args may be introduced in this slice.
### Contract 2: Same browser_script execution seam on both transports
The ws path must continue to use `execute_browser_script_skill_raw_output_with_browser_backend(...)` from `src/compat/direct_skill_runtime.rs:95-112`, which in turn uses the same browser_script tool path as pipe. Do not add a second lineloss-only ws runner.
### Contract 3: Same artifact interpretation on both transports
The ws path must produce output that remains consumable by `summarize_lineloss_output(...)` / `summarize_lineloss_artifact(...)` in `src/compat/deterministic_submit.rs:168-257` without transport-specific branching.
### Contract 4: Zhihu ws behavior must stay unchanged
The existing ws browser-script / export path already validated by `tests/agent_runtime_test.rs` and `tests/compat_runtime_test.rs` is a hard regression boundary. If a change breaks Zhihu tests, fix the ws seam instead of weakening Zhihu expectations.
### Contract 5: Pipe remains the baseline
For identical lineloss deterministic inputs, the pipe path should continue to succeed without requiring ws configuration.
---
### Task 1: Lock the ws contract with failing transport-level tests
**Files:**
- Modify: `tests/agent_runtime_test.rs`
- Modify: `tests/compat_runtime_test.rs`
- Read: `docs/_tmp_sgbrowser_ws_api_doc.txt`
- [ ] **Step 1: Add a failing ws lineloss deterministic runtime test**
Model it after the existing ws harness in `tests/agent_runtime_test.rs:69-166`, but target lineloss deterministic execution instead of Zhihu. The test should:
- configure `browserWsUrl`
- submit a deterministic lineloss instruction ending with `。。。`
- return a ws callback payload representing a lineloss `report-artifact`
- assert success summary includes canonical org, period, status, and rows
Suggested skeleton:
```rust
#[test]
fn ws_deterministic_lineloss_submit_executes_browser_script_and_summarizes_artifact() {
// arrange ws config + ws server + lineloss artifact callback
// act handle_browser_message_with_context(... SubmitTask ...)
// assert TaskComplete success summary contains canonical org/period/rows
}
```
- [ ] **Step 2: Add a failing pipe regression test for the same lineloss contract**
In `tests/compat_runtime_test.rs`, add a focused pipe-side assertion that the same deterministic lineloss instruction still succeeds through the current pipe seam and uses the same summary contract.
Suggested skeleton:
```rust
#[test]
fn pipe_deterministic_lineloss_submit_preserves_existing_summary_contract() {
// arrange MockTransport responses for browser_script eval
// act handle_browser_message_with_context(...)
// assert success summary matches canonical contract
}
```
- [ ] **Step 3: Add a failing ws regression assertion for Zhihu**
Add or tighten a Zhihu ws assertion proving ordinary Zhihu requests still use the existing ws path and do not get intercepted by lineloss deterministic logic.
- [ ] **Step 4: Run the three focused tests to confirm failure**
Run:
```bash
cargo test ws_deterministic_lineloss_submit_executes_browser_script_and_summarizes_artifact -- --exact
cargo test pipe_deterministic_lineloss_submit_preserves_existing_summary_contract -- --exact
cargo test ws_zhihu_submit_path_remains_unchanged_after_lineloss_transport_work -- --exact
```
Expected: at least the new ws lineloss test fails before the seam is completed.
- [ ] **Step 5: Commit**
```bash
git add tests/agent_runtime_test.rs tests/compat_runtime_test.rs
git commit -m "test: lock ws and pipe lineloss transport contracts"
```
---
### Task 2: Make the current eval transport contract explicitly satisfy browser-script requirements
**Files:**
- Modify: `src/pipe/protocol.rs:49-78,130-165,192-209`
- Modify: `src/pipe/browser_tool.rs:62-124`
- Modify only if tests prove necessary: `src/compat/browser_script_skill_tool.rs:99-180,214-255`
- Modify: `tests/pipe_protocol_test.rs`
- Modify: `tests/browser_tool_test.rs`
- Modify: `tests/browser_script_skill_tool_test.rs`
- [ ] **Step 1: Add failing protocol/result-contract tests first**
Extend or add focused tests to lock the current branch's real transport contract:
- `Action::Eval` remains supported by the line protocol and command encoding
- eval request/response correlation remains stable via `seq` matching for lineloss-style target URLs
- eval/browser_script result handling preserves the full JSON artifact string without truncation before deterministic lineloss summarization consumes it
Suggested skeletons:
```rust
#[test]
fn eval_action_remains_supported_in_protocol() {}
#[test]
fn browser_tool_matches_eval_response_by_seq_for_lineloss_flow() {}
#[test]
fn browser_script_tool_preserves_json_artifact_string_for_lineloss() {}
```
- [ ] **Step 2: Run the focused Task 2 tests to confirm failure**
Run:
```bash
cargo test eval_action_remains_supported_in_protocol -- --exact
cargo test browser_tool_matches_eval_response_by_seq_for_lineloss_flow -- --exact
cargo test browser_script_tool_preserves_json_artifact_string_for_lineloss -- --exact
```
Expected: at least one test fails if the current protocol/correlation/result handling is still insufficient for the lineloss artifact path.
- [ ] **Step 3: Implement the minimal transport-contract fix**
Allowed changes:
- adjust only the `Action::Eval` protocol/encoding support in `src/pipe/protocol.rs`
- adjust only request/response correlation in `src/pipe/browser_tool.rs`
- if and only if tests still prove it necessary, make a tiny result-shape/stringification fix in `src/compat/browser_script_skill_tool.rs`
- keep existing Zhihu-compatible behavior intact
Not allowed:
- adding lineloss-only transport fields
- adding a second lineloss-specific execution path
- changing deterministic lineloss business parsing or summary rules
- [ ] **Step 4: Re-run the focused Task 2 tests**
Run:
```bash
cargo test eval_action_remains_supported_in_protocol -- --exact
cargo test browser_tool_matches_eval_response_by_seq_for_lineloss_flow -- --exact
cargo test browser_script_tool_preserves_json_artifact_string_for_lineloss -- --exact
```
Expected: PASS.
- [ ] **Step 5: Re-run the focused ws lineloss runtime test from Task 1**
Run:
```bash
cargo test ws_deterministic_lineloss_submit_executes_browser_script_and_summarizes_artifact -- --exact
```
Expected: PASS.
- [ ] **Step 6: Commit**
```bash
git add src/pipe/protocol.rs src/pipe/browser_tool.rs src/compat/browser_script_skill_tool.rs tests/pipe_protocol_test.rs tests/browser_tool_test.rs tests/browser_script_skill_tool_test.rs
git commit -m "fix: align eval transport contract with lineloss browser script flow"
```
---
### Task 3: Make eval result-shape handling surface the lineloss artifact cleanly
**Files:**
- Modify: `src/pipe/browser_tool.rs:62-125`
- Modify only if tests prove necessary: `src/compat/browser_script_skill_tool.rs:159-180,248-255`
- Modify: `tests/browser_script_skill_tool_test.rs`
- [ ] **Step 1: Add a failing result-shape test**
Lock that an eval response carrying a JSON string report artifact is surfaced as the same browser_script tool output shape expected by `execute_browser_script_tool(...)`.
Suggested skeleton:
```rust
#[test]
fn ws_backend_eval_returns_text_payload_consumable_by_browser_script_tool() {
// arrange an eval response whose data.text is a JSON string artifact
// assert execute_browser_script_tool(...) returns the full artifact text without truncation
}
```
- [ ] **Step 2: Run the result-shape test to confirm failure**
Run:
```bash
cargo test ws_backend_eval_returns_text_payload_consumable_by_browser_script_tool -- --exact
```
Expected: FAIL only if current eval/result handling is not sufficient for full lineloss artifact output.
- [ ] **Step 3: Implement the minimal result-shape fix**
Allowed fixes:
- adjust `BrowserPipeTool::invoke(...)` only if response packaging itself is wrong
- if and only if still required, make a tiny output-shape compatibility fix in `src/compat/browser_script_skill_tool.rs` so JSON string `data.text` payloads are preserved identically to the pipe baseline
Not allowed:
- transport-specific lineloss parsing
- changes to deterministic business logic
- adding a second lineloss-specific execution path
- [ ] **Step 4: Re-run the result-shape test**
Run:
```bash
cargo test ws_backend_eval_returns_text_payload_consumable_by_browser_script_tool -- --exact
```
Expected: PASS.
- [ ] **Step 5: Re-run the focused ws lineloss runtime test from Task 1**
Run:
```bash
cargo test ws_deterministic_lineloss_submit_executes_browser_script_and_summarizes_artifact -- --exact
```
Expected: PASS.
- [ ] **Step 6: Commit**
```bash
git add src/pipe/browser_tool.rs src/compat/browser_script_skill_tool.rs tests/browser_script_skill_tool_test.rs
git commit -m "fix: make eval result shape match browser script contract"
```
---
### Task 4: Verify the current backend-neutral deterministic execution path without changing business rules
**Files:**
- Read baseline: `src/agent/mod.rs:242-285`
- Read baseline: `src/compat/deterministic_submit.rs:96-157`
- Modify only if a focused parity test proves it is necessary: `src/compat/direct_skill_runtime.rs:50-129`
- Modify: `tests/agent_runtime_test.rs`
- Modify: `tests/compat_runtime_test.rs`
- [ ] **Step 1: Add a failing integration test for backend-neutral parity**
Add a test proving these two current-branch paths produce the same lineloss summary contract for equivalent artifact payloads:
- pipe path via the existing deterministic submit flow in `tests/compat_runtime_test.rs`
- runtime path via `handle_browser_message_with_context(...)` deterministic submit routing in `tests/agent_runtime_test.rs`
Suggested skeleton:
```rust
#[test]
fn deterministic_lineloss_pipe_and_ws_paths_share_summary_contract() {}
```
- [ ] **Step 2: Run the parity test to confirm failure or gap**
Run:
```bash
cargo test deterministic_lineloss_pipe_and_ws_paths_share_summary_contract -- --exact
```
Expected: FAIL only if a remaining shared execution seam gap still exists.
- [ ] **Step 3: Apply the smallest shared execution fix if needed**
Allowed changes:
- tiny helper extraction or result handling in `src/compat/direct_skill_runtime.rs`
- no new lineloss-specific branch
- no change to deterministic lineloss business parsing or summary rules
- no change to configured direct-submit behavior for non-lineloss skills
- [ ] **Step 4: Re-run the parity test**
Run:
```bash
cargo test deterministic_lineloss_pipe_and_ws_paths_share_summary_contract -- --exact
```
Expected: PASS.
- [ ] **Step 5: Commit**
```bash
git add src/compat/direct_skill_runtime.rs tests/agent_runtime_test.rs tests/compat_runtime_test.rs
git commit -m "fix: preserve shared deterministic execution across pipe and ws"
```
---
### Task 5: Run the full focused verification set and stop if any Zhihu or pipe regression appears
**Files:**
- Reuse: `tests/agent_runtime_test.rs`
- Reuse: `tests/compat_runtime_test.rs`
- Reuse: `tests/runtime_task_flow_test.rs`
- [ ] **Step 1: Run focused ws + lineloss + Zhihu regression tests**
Run:
```bash
cargo test --test agent_runtime_test
cargo test --test compat_runtime_test
cargo test --test runtime_task_flow_test
```
Expected: PASS.
- [ ] **Step 2: Run targeted protocol/backend unit tests**
Run:
```bash
cargo test eval_action_remains_supported_in_protocol -- --exact
cargo test browser_tool_matches_eval_response_by_seq_for_lineloss_flow -- --exact
cargo test browser_script_tool_preserves_json_artifact_string_for_lineloss -- --exact
cargo test ws_backend_eval_returns_text_payload_consumable_by_browser_script_tool -- --exact
cargo test deterministic_lineloss_pipe_and_ws_paths_share_summary_contract -- --exact
```
Expected: PASS.
- [ ] **Step 3: Run the full Rust suite**
Run:
```bash
cargo test
```
Expected: PASS.
- [ ] **Step 4: Manual review of diff scope**
Confirm the diff only touches:
- current transport/result seam files (`src/pipe/protocol.rs`, `src/pipe/browser_tool.rs`)
- narrow shared browser_script/result compatibility helpers if strictly necessary
- tests
If diff includes Zhihu business logic, lineloss parsing rules, staged skill business JS, or unrelated cleanup, remove those changes before completion.
- [ ] **Step 5: Commit**
```bash
git add src/pipe/protocol.rs src/pipe/browser_tool.rs src/compat/browser_script_skill_tool.rs src/compat/direct_skill_runtime.rs tests/pipe_protocol_test.rs tests/browser_tool_test.rs tests/browser_script_skill_tool_test.rs tests/agent_runtime_test.rs tests/compat_runtime_test.rs
git commit -m "test: verify lineloss ws transport without regressing pipe or zhihu"
```
---
## Final verification checklist
- [ ] The same lineloss deterministic instruction works on pipe and ws.
- [ ] Pipe still works without any ws configuration.
- [ ] Eval transport support remains available for deterministic lineloss execution.
- [ ] Eval response payloads preserve the full lineloss artifact JSON string.
- [ ] `src/compat/deterministic_submit.rs` business rules remain transport-neutral.
- [ ] No ws-specific lineloss args were introduced.
- [ ] Zhihu ws tests still pass unchanged in behavior.
- [ ] No ordinary Zhihu request is intercepted by lineloss deterministic routing.
- [ ] No new transport-specific business branch was added for lineloss.
## Implementation notes
- Default to changing the current transport/result seam first: `src/pipe/protocol.rs` and `src/pipe/browser_tool.rs`.
- Treat `src/compat/browser_script_skill_tool.rs` and `src/compat/direct_skill_runtime.rs` as shared seams: change them only if a focused failing test shows a transport-neutral compatibility bug.
- If a proposed fix requires changing `src/compat/deterministic_submit.rs` business logic, stop and re-evaluate; that likely means the seam fix is happening at the wrong layer.
- If a proposed fix changes Zhihu expectations, stop and repair the seam instead.


@@ -0,0 +1,228 @@
# Async Browser Script Support Implementation Plan
> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.
**Goal:** Modify the `build_eval_js` function to support async scripts, fixing the problem where a Promise gets serialized to `{}` by `JSON.stringify`.
**Architecture:** Change the JavaScript that `build_eval_js` generates from a synchronous IIFE to an async IIFE, `await` the script's result, and detect Promise-like values for a second await.
**Tech Stack:** Rust, JavaScript (generated code)
---
## File Structure
| File | Action | Notes |
|------|------|------|
| `src/browser/callback_backend.rs` | Modify | Update the `build_eval_js` function |
| `tests/browser_script_skill_tool_test.rs` | Add tests | Add async script test cases |
---
### Task 1: Modify build_eval_js to support async scripts
**Files:**
- Modify: `src/browser/callback_backend.rs:433-447`
**Current code:**
```rust
fn build_eval_js(source_url: &str, script: &str) -> String {
let escaped_source_url = escape_js_single_quoted(source_url);
let callback = EVAL_CALLBACK_NAME;
let events_url = escape_js_single_quoted(&events_endpoint_url(source_url));
format!(
"(function(){{try{{var v=(function(){{return {script}}})();\
var t=(typeof v==='string')?v:JSON.stringify(v);\
try{{callBackJsToCpp('{escaped_source_url}@_@'+window.location.href+'@_@{callback}@_@sgBrowserExcuteJsCodeByDomain@_@'+(t??''))}}catch(_){{}}\
var j=JSON.stringify({{type:'callback',callback:'{callback}',request_url:'{escaped_source_url}',payload:{{value:(t??'')}}}});\
try{{var r=new XMLHttpRequest();r.open('POST','{events_url}',true);r.setRequestHeader('Content-Type','application/json');r.send(j)}}catch(_){{}}\
try{{navigator.sendBeacon('{events_url}',new Blob([j],{{type:'application/json'}}))}}catch(_){{}}\
}}catch(e){{}}}})()"
)
}
```
**Updated code:**
```rust
fn build_eval_js(source_url: &str, script: &str) -> String {
let escaped_source_url = escape_js_single_quoted(source_url);
let callback = EVAL_CALLBACK_NAME;
let events_url = escape_js_single_quoted(&events_endpoint_url(source_url));
format!(
"(async function(){{try{{\
var v=await (async function(){{return {script}}})();\
if(v&&typeof v.then==='function'){{v=await v;}}\
var t=(typeof v==='string')?v:JSON.stringify(v);\
try{{callBackJsToCpp('{escaped_source_url}@_@'+window.location.href+'@_@{callback}@_@sgBrowserExcuteJsCodeByDomain@_@'+(t??''))}}catch(_){{}}\
var j=JSON.stringify({{type:'callback',callback:'{callback}',request_url:'{escaped_source_url}',payload:{{value:(t??'')}}}});\
try{{var r=new XMLHttpRequest();r.open('POST','{events_url}',true);r.setRequestHeader('Content-Type','application/json');r.send(j)}}catch(_){{}}\
try{{navigator.sendBeacon('{events_url}',new Blob([j],{{type:'application/json'}}))}}catch(_){{}}\
}}catch(e){{}}}})()"
)
}
```
**Key changes:**
1. `(function()` → `(async function()`: the whole IIFE becomes async
2. `var v=(function(){return {script}})()` → `var v=await (async function(){return {script}})()`: the inner wrapper also becomes async and is awaited
3. Added `if(v&&typeof v.then==='function'){v=await v;}` to detect and await Promise-like values
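As a sanity check on the interaction of changes 2 and 3 (a Node sketch, not code from the plan): `await` on the async wrapper already unwraps a returned Promise, so the explicit thenable check is a defensive second pass:

```javascript
(async function () {
  // Change 2: the inner wrapper is async and awaited, so a script that
  // returns a Promise is already unwrapped here.
  let v = await (async function () { return Promise.resolve("data"); })();
  // Change 3: defensive re-check for any remaining thenable.
  if (v && typeof v.then === "function") { v = await v; }
  console.log(v); // data
})();
```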
- [ ] **Step 1: Modify the build_eval_js function**
Edit `src/browser/callback_backend.rs` lines 433-447, replacing them with the new code above.
- [ ] **Step 2: Build**
Run: `cargo build`
Expected: compiles with no errors
- [ ] **Step 3: Run existing tests**
Run: `cargo test browser_script_skill_tool`
Expected: all tests pass
- [ ] **Step 4: Commit**
```bash
git add src/browser/callback_backend.rs
git commit -m "fix: support async browser scripts in build_eval_js
Wrap eval script in async IIFE and await Promise-like results.
Fixes Promise serialization returning '{}' for async skill scripts.
🤖 Generated with [Qoder](https://qoder.com)"
```
---
### Task 2: Add an async script test case
**Files:**
- Modify: `tests/browser_script_skill_tool_test.rs`
- [ ] **Step 1: Add the async script test case**
Append the new test at the end of `tests/browser_script_skill_tool_test.rs`:
```rust
#[tokio::test]
async fn execute_browser_script_tool_awaits_async_script() {
let skill_dir = unique_temp_dir("sgclaw-browser-script-async");
let scripts_dir = skill_dir.join("scripts");
fs::create_dir_all(&scripts_dir).unwrap();
// Async script that returns a Promise
fs::write(
scripts_dir.join("async_extract.js"),
"return (async function() { return { async: true, args: args }; })();\n",
)
.unwrap();
let transport = Arc::new(MockTransport::new(vec![BrowserMessage::Response {
seq: 1,
success: true,
data: json!({
"text": {
"async": true,
"args": { "expected_domain": "example.com" }
}
}),
aom_snapshot: vec![],
timing: Timing {
queue_ms: 1,
exec_ms: 5,
},
}]));
// Policy that allows example.com in addition to the base test policy
let policy_json = MacPolicy::from_json_str(
r#"{
"version": "1.0",
"domains": { "allowed": ["www.zhihu.com", "example.com"] },
"pipe_actions": {
"allowed": ["click", "type", "navigate", "getText", "eval"],
"blocked": []
}
}"#,
)
.unwrap();
let browser_tool = BrowserPipeTool::new(
transport.clone(),
policy_json,
vec![1, 2, 3, 4, 5, 6, 7, 8],
)
.with_response_timeout(Duration::from_secs(1));
let skill_tool = SkillTool {
name: "async_extract".to_string(),
description: "Extract data asynchronously".to_string(),
kind: "browser_script".to_string(),
command: "scripts/async_extract.js".to_string(),
args: HashMap::new(),
};
let result = execute_browser_script_tool(
&skill_tool,
&skill_dir,
&PipeBrowserBackend::from_inner(browser_tool),
json!({
"expected_domain": "example.com"
}),
)
.await
.unwrap();
assert!(result.success);
let output = serde_json::from_str::<serde_json::Value>(&result.output).unwrap();
assert_eq!(output["async"], true);
}
```
- [ ] **Step 2: Run the new test**
Run: `cargo test execute_browser_script_tool_awaits_async_script`
Expected: test passes
- [ ] **Step 3: Commit**
```bash
git add tests/browser_script_skill_tool_test.rs
git commit -m "test: add async browser script test case
🤖 Generated with [Qoder](https://qoder.com)"
```
---
### Task 3: End-to-End Verification
**Files:**
- No file changes; verification only
- [ ] **Step 1: Full build**
Run: `cargo build`
Expected: compiles successfully
- [ ] **Step 2: Run all tests**
Run: `cargo test`
Expected: all tests pass
- [ ] **Step 3: Manual end-to-end test**
Use the service console to test `tq-lineloss-report.collect_lineloss`:
1. Start sgclaw: `target/debug/sg_claw.exe`
2. In the service console, enter: `兰州公司 台区线损大数据 月累计线损率统计分析。。。`
3. Expected: actual report data is returned, not `{}`
---
## Self-Check Checklist
- [x] Spec coverage: every point in the design document has a corresponding task
- [x] No placeholders: all code is complete
- [x] Type consistency: no function signatures changed


@@ -0,0 +1,73 @@
# Async Eval .then() Fix Implementation Plan
> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.
**Goal:** Fix `build_eval_js` to handle async script return values using `.then()` instead of an async IIFE.
**Architecture:** Extract callback-sending logic into a `_s` helper function inside the generated JS. If the script returns a Promise, call `_s` via `.then()`; otherwise call `_s` synchronously. This keeps the outer IIFE synchronous for C++ injection compatibility.
**Tech Stack:** Rust, JavaScript
---
## Files
- Modify: `src/browser/callback_backend.rs:433-447` - `build_eval_js` function
---
### Task 1: Modify build_eval_js to support async via .then()
**Files:**
- Modify: `src/browser/callback_backend.rs:433-447`
- [ ] **Step 1: Replace build_eval_js implementation**
Replace the entire `build_eval_js` function body (lines 433-447) with:
```rust
fn build_eval_js(source_url: &str, script: &str) -> String {
let escaped_source_url = escape_js_single_quoted(source_url);
let callback = EVAL_CALLBACK_NAME;
let events_url = escape_js_single_quoted(&events_endpoint_url(source_url));
format!(
"(function(){{try{{\
var v=(function(){{return {script}}})();\
function _s(v){{\
var t=(typeof v==='string')?v:JSON.stringify(v);\
try{{callBackJsToCpp('{escaped_source_url}@_@'+window.location.href+'@_@{callback}@_@sgBrowserExcuteJsCodeByDomain@_@'+(t??''))}}catch(_){{}}\
var j=JSON.stringify({{type:'callback',callback:'{callback}',request_url:'{escaped_source_url}',payload:{{value:(t??'')}}}});\
try{{var r=new XMLHttpRequest();r.open('POST','{events_url}',true);r.setRequestHeader('Content-Type','application/json');r.send(j)}}catch(_){{}}\
try{{navigator.sendBeacon('{events_url}',new Blob([j],{{type:'application/json'}}))}}catch(_){{}}\
}}\
if(v&&typeof v.then==='function'){{v.then(_s).catch(function(){{}});}}else{{_s(v);}}\
}}catch(e){{}}}})()"
)
}
```
- [ ] **Step 2: Run tests**
Run: `cargo test browser_script_skill_tool --no-fail-fast`
Expected: All tests pass.
- [ ] **Step 3: Run full test suite**
Run: `cargo test`
Expected: All tests pass (except pre-existing `lineloss_period_resolver_prompts_for_missing_period` failure which is unrelated).
- [ ] **Step 4: Build**
Run: `cargo build`
Expected: Compiles with no errors.
- [ ] **Step 5: Commit**
```bash
git add src/browser/callback_backend.rs
git commit -m "fix: support async browser scripts via .then() in build_eval_js"
```


@@ -0,0 +1,52 @@
# Expected Domain Arg Fix Implementation Plan
> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.
**Goal:** Fix browser_script_skill_tool to pass expected_domain to wrapped JS scripts.
**Architecture:** Insert the normalized expected_domain back into args HashMap after domain normalization, before script wrapping.
**Tech Stack:** Rust, serde_json
---
## Files
- Modify: `src/compat/browser_script_skill_tool.rs:210` - Insert expected_domain back into args
---
### Task 1: Insert expected_domain into args
**Files:**
- Modify: `src/compat/browser_script_skill_tool.rs:210`
- [ ] **Step 1: Add expected_domain to args after normalization**
Edit `src/compat/browser_script_skill_tool.rs`, insert after line 209 (`eprintln!("[execute_browser_script_impl] expected_domain: {}", expected_domain);`):
```rust
args.insert("expected_domain".to_string(), Value::String(expected_domain.clone()));
```
The context around line 209-211 should look like this after the edit:
```rust
eprintln!("[execute_browser_script_impl] expected_domain: {}", expected_domain);
args.insert("expected_domain".to_string(), Value::String(expected_domain.clone()));
for required_arg in tool.args.keys() {
```
- [ ] **Step 2: Run tests to verify the fix**
Run: `cargo test browser_script_skill_tool --no-fail-fast -- --nocapture`
Expected: All tests pass, including `execute_browser_script_tool_runs_packaged_script_with_expected_domain`
- [ ] **Step 3: Commit**
```bash
git add src/compat/browser_script_skill_tool.rs
git commit -m "fix: pass expected_domain to wrapped browser scripts"
```


@@ -0,0 +1,163 @@
# Lineloss requesturl Quick Fix Implementation Plan
> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.
**Goal:** Add a lineloss URL mapping in `derive_request_url_from_instruction` so the `sgHideBrowerserOpenPage` command can execute correctly.
**Architecture:** Append a hardcoded mapping for the lineloss scenario after the existing Zhihu URL mapping pattern.
**Tech Stack:** Rust
---
### Task 1: Add a test case
**Files:**
- Modify: `src/service/server.rs:828` (tests module)
- [ ] **Step 1: Add a lineloss URL mapping test to the tests module**
Add the new test after the `initial_request_url_falls_back_to_zhihu_origin_for_generated_article_publish_routes` test:
```rust
#[test]
fn initial_request_url_falls_back_to_lineloss_origin_for_lineloss_instructions() {
let request = SubmitTaskRequest {
instruction: "兰州公司 台区线损大数据 月累计线损率统计分析。。。".to_string(),
..SubmitTaskRequest::default()
};
assert_eq!(
initial_request_url_for_submit_task(&request),
"http://20.76.57.61:18080"
);
}
```
- [ ] **Step 2: Run the test to verify it fails**
Run: `cargo test initial_request_url_falls_back_to_lineloss_origin_for_lineloss_instructions -- --nocapture`
Expected: FAIL - the test should fail because the mapping logic is not implemented yet
- [ ] **Step 3: Commit the test file**
```bash
git add src/service/server.rs
git commit -m "test: add lineloss requesturl mapping test"
```
---
### Task 2: Implement the lineloss URL mapping
**Files:**
- Modify: `src/service/server.rs:354-382` (the derive_request_url_from_instruction function)
- [ ] **Step 1: Add the lineloss mapping in derive_request_url_from_instruction**
Add after the second Zhihu check block, before the final `None`:
```rust
// Lineloss-related
// TODO: temporary workaround; the URL should later come from skill config or the deterministic_submit parse result
if instruction.contains("线损") || instruction.contains("lineloss") {
return Some("http://20.76.57.61:18080".to_string());
}
None
```
The full function should read:
```rust
fn derive_request_url_from_instruction(instruction: &str) -> Option<String> {
if crate::compat::workflow_executor::detect_route(instruction, None, None)
.is_some_and(|route| {
matches!(
route,
crate::compat::workflow_executor::WorkflowRoute::ZhihuHotlistExportXlsx
| crate::compat::workflow_executor::WorkflowRoute::ZhihuHotlistScreen
| crate::compat::workflow_executor::WorkflowRoute::ZhihuArticleEntry
| crate::compat::workflow_executor::WorkflowRoute::ZhihuArticleAutoPublishGenerated
)
})
{
return Some("https://www.zhihu.com".to_string());
}
if crate::compat::workflow_executor::detect_route(instruction, None, None)
.is_some_and(|route| {
matches!(
route,
crate::compat::workflow_executor::WorkflowRoute::ZhihuArticleDraft
| crate::compat::workflow_executor::WorkflowRoute::ZhihuArticlePublish
)
})
{
return Some("https://zhuanlan.zhihu.com".to_string());
}
// Lineloss-related
// TODO: temporary workaround; the URL should later come from skill config or the deterministic_submit parse result
if instruction.contains("线损") || instruction.contains("lineloss") {
return Some("http://20.76.57.61:18080".to_string());
}
None
}
```
- [ ] **Step 2: Run the test to verify it passes**
Run: `cargo test initial_request_url_falls_back_to_lineloss_origin_for_lineloss_instructions -- --nocapture`
Expected: PASS
- [ ] **Step 3: Run all related tests**
Run: `cargo test initial_request_url -- --nocapture`
Expected: all tests pass
- [ ] **Step 4: Build the project**
Run: `cargo build`
Expected: compiles with no errors
- [ ] **Step 5: Commit the implementation**
```bash
git add src/service/server.rs
git commit -m "feat: add lineloss URL mapping in derive_request_url_from_instruction
Temporary workaround: return the lineloss platform URL when the instruction contains '线损' or 'lineloss'
🤖 Generated with [Qoder](https://qoder.com)"
```
---
### Task 3: End-to-End Verification
**Files:**
- No file changes; verification only
- [ ] **Step 1: Stop any running sgclaw process**
Make sure no `sg_claw.exe` process is running
- [ ] **Step 2: Start the sgclaw service**
Run: `target\debug\sg_claw.exe --config-path ..\sgclaw_config.json service`
- [ ] **Step 3: Send the test instruction from the service console**
Instruction: `兰州公司 台区线损大数据 月累计线损率统计分析。。。`
Expected: logs show `bootstrap_url=http://20.76.57.61:18080` instead of `about:blank`
- [ ] **Step 4: Verify the helper page opens successfully**
Expected: logs show `helper_loaded=true, ready=true` with no more timeouts


@@ -0,0 +1,76 @@
# Lineloss Missing target_url Fix Implementation Plan
> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.
**Goal:** Add a `target_url` parameter when `browser_script_skill_tool.rs` invokes `Action::Eval`.
**Architecture:** Build the full URL from `expected_domain` (`http://{expected_domain}`) and add it to the invoke params.
**Tech Stack:** Rust, serde_json
---
### Task 1: Add the target_url parameter
**Files:**
- Modify: `src/compat/browser_script_skill_tool.rs:238-241` (the invoke call)
- [ ] **Step 1: Modify the invoke call to add target_url**
Change:
```rust
let result = match browser_tool.invoke(
Action::Eval,
json!({ "script": wrapped_script }),
&expected_domain,
) {
```
to:
```rust
let target_url = format!("http://{}", expected_domain);
let result = match browser_tool.invoke(
Action::Eval,
json!({
"script": wrapped_script,
"target_url": target_url,
}),
&expected_domain,
) {
```
- [ ] **Step 2: Build the project**
Run: `cargo build`
Expected: compiles with no errors
- [ ] **Step 3: Commit**
```bash
git add src/compat/browser_script_skill_tool.rs
git commit -m "fix: add target_url param for Action::Eval in browser_script_skill_tool
🤖 Generated with [Qoder](https://qoder.com)"
```
---
### Task 2: End-to-End Verification
**Files:**
- No file changes; verification only
- [ ] **Step 1: Stop any running sgclaw process**
Make sure no `sg_claw.exe` process is running
- [ ] **Step 2: Start the sgclaw service**
Run: `target\debug\sg_claw.exe --config-path ..\sgclaw_config.json service`
- [ ] **Step 3: Send the test instruction from the service console**
Instruction: `兰州公司 台区线损大数据 月累计线损率统计分析。。。`
Expected: logs show the invoke succeeding, with no more `target_url is required for eval` errors


@@ -0,0 +1,912 @@
# Rust-Side Lineloss XLSX Export Implementation Plan
> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.
**Goal:** Move XLSX export from browser JS (blocked by CORS) to Rust side, so `collect_lineloss.js` only collects data and Rust generates the `.xlsx` file locally.
**Architecture:** JS collects API data and returns a `report-artifact` JSON with `rows`, `column_defs`, and metadata. Rust parses the artifact, extracts rows + column definitions, and generates a standard `.xlsx` file using the `zip` crate + OpenXML XML strings (same pattern as `openxml_office_tool.rs`). Report log is deferred.
**Tech Stack:** Rust, `zip` 0.6.6, `serde_json`, OpenXML Spreadsheet ML, JavaScript (browser-injected)
**Spec:** `docs/superpowers/specs/2026-04-13-rust-side-lineloss-xlsx-export.md`
---
## File Structure
| File | Responsibility |
|------|---------------|
| `src/compat/lineloss_xlsx_export.rs` | **New.** Pure XLSX generation: takes column defs + row data, produces `.xlsx` file. No business logic. |
| `src/compat/deterministic_submit.rs` | **Modify.** After receiving JS artifact, extract rows + column_defs, call XLSX export, attach path to outcome. |
| `src/compat/mod.rs` | **Modify.** Register `lineloss_xlsx_export` module. |
| `D:/data/ideaSpace/rust/sgClaw/claw/claw/skills/skill_staging/skills/tq-lineloss-report/scripts/collect_lineloss.js` | **Modify.** Remove `exportWorkbook`/`writeReportLog` calls. Add `column_defs` to artifact. |
| `tests/lineloss_xlsx_export_test.rs` | **New.** Unit tests for XLSX generation. |
---
### Task 1: Create `lineloss_xlsx_export.rs` with Tests
**Files:**
- Create: `src/compat/lineloss_xlsx_export.rs`
- Create: `tests/lineloss_xlsx_export_test.rs`
- Modify: `src/compat/mod.rs`
- [ ] **Step 1: Register the new module in `src/compat/mod.rs`**
Add the module declaration in alphabetical order. In `src/compat/mod.rs`, insert after `pub mod event_bridge;`:
```rust
pub mod lineloss_xlsx_export;
```
The full file becomes:
```rust
pub mod artifact_open;
pub mod browser_script_skill_tool;
pub mod browser_tool_adapter;
pub mod config_adapter;
pub mod cron_adapter;
pub mod deterministic_submit;
pub mod direct_skill_runtime;
pub mod event_bridge;
pub mod lineloss_xlsx_export;
pub mod memory_adapter;
pub mod openxml_office_tool;
pub mod orchestration;
pub mod runtime;
pub mod screen_html_export_tool;
pub mod tq_lineloss;
pub mod workflow_executor;
```
- [ ] **Step 2: Write the failing test for XLSX generation**
Create `tests/lineloss_xlsx_export_test.rs`:
```rust
use std::fs;
use std::path::PathBuf;
use serde_json::json;
use sgclaw::compat::lineloss_xlsx_export::{export_lineloss_xlsx, LinelossExportRequest};
fn temp_output_path(name: &str) -> PathBuf {
let dir = std::env::temp_dir().join("sgclaw-test-xlsx");
fs::create_dir_all(&dir).unwrap();
dir.join(name)
}
#[test]
fn export_month_lineloss_produces_valid_xlsx() {
let output_path = temp_output_path("month-test.xlsx");
if output_path.exists() {
fs::remove_file(&output_path).unwrap();
}
let request = LinelossExportRequest {
sheet_name: "国网兰州供电公司月度线损分析报表(2026-03)".to_string(),
column_defs: vec![
("ORG_NAME".to_string(), "供电单位".to_string()),
("YGDL".to_string(), "累计供电量".to_string()),
("YYDL".to_string(), "累计售电量".to_string()),
("YXSL".to_string(), "线损完成率(%)".to_string()),
("RAT_SCOPE".to_string(), "线损率累计目标值".to_string()),
("BLANK3".to_string(), "目标完成率".to_string()),
("BLANK2".to_string(), "排行".to_string()),
],
rows: vec![
serde_json::from_value(json!({
"ORG_NAME": "城关供电",
"YGDL": "12345.67",
"YYDL": "11234.56",
"YXSL": "9.00",
"RAT_SCOPE": "9.50",
"BLANK3": "94.74",
"BLANK2": "1"
}))
.unwrap(),
serde_json::from_value(json!({
"ORG_NAME": "七里河供电",
"YGDL": "9876.54",
"YYDL": "8765.43",
"YXSL": "11.24",
"RAT_SCOPE": "10.00",
"BLANK3": "112.40",
"BLANK2": "2"
}))
.unwrap(),
],
output_path: output_path.clone(),
};
let result_path = export_lineloss_xlsx(&request).unwrap();
assert_eq!(result_path, output_path);
assert!(output_path.exists());
// Verify it's a valid ZIP (xlsx is a zip archive)
let file = fs::File::open(&output_path).unwrap();
let mut archive = zip::ZipArchive::new(file).unwrap();
// Must contain the standard OpenXML entries
let entry_names: Vec<String> = (0..archive.len())
.map(|i| archive.by_index(i).unwrap().name().to_string())
.collect();
assert!(entry_names.contains(&"[Content_Types].xml".to_string()));
assert!(entry_names.contains(&"xl/worksheets/sheet1.xml".to_string()));
assert!(entry_names.contains(&"xl/workbook.xml".to_string()));
// Read sheet1.xml and verify it contains our data
let mut sheet = archive.by_name("xl/worksheets/sheet1.xml").unwrap();
let mut xml = String::new();
std::io::Read::read_to_string(&mut sheet, &mut xml).unwrap();
assert!(xml.contains("供电单位"), "header row should contain 供电单位");
assert!(xml.contains("累计供电量"), "header row should contain 累计供电量");
assert!(xml.contains("城关供电"), "data should contain 城关供电");
assert!(xml.contains("12345.67"), "data should contain 12345.67");
assert!(xml.contains("七里河供电"), "data should contain second row");
// Cleanup
fs::remove_file(&output_path).unwrap();
}
#[test]
fn export_empty_rows_returns_error() {
let output_path = temp_output_path("empty-test.xlsx");
let request = LinelossExportRequest {
sheet_name: "test".to_string(),
column_defs: vec![("A".to_string(), "ColA".to_string())],
rows: vec![],
output_path: output_path.clone(),
};
let result = export_lineloss_xlsx(&request);
assert!(result.is_err());
assert!(
result.unwrap_err().to_string().contains("rows must not be empty"),
"should reject empty rows"
);
}
```
- [ ] **Step 3: Run the test to verify it fails**
Run: `cargo test --test lineloss_xlsx_export_test -- --nocapture`
Expected: compilation error — `lineloss_xlsx_export` module doesn't exist yet or `export_lineloss_xlsx` / `LinelossExportRequest` not defined.
- [ ] **Step 4: Implement `src/compat/lineloss_xlsx_export.rs`**
```rust
use std::fs;
use std::io::Write;
use std::path::{Path, PathBuf};
use serde_json::{Map, Value};
use zip::write::FileOptions;
use zip::{CompressionMethod, ZipWriter};
pub struct LinelossExportRequest {
pub sheet_name: String,
pub column_defs: Vec<(String, String)>,
pub rows: Vec<Map<String, Value>>,
pub output_path: PathBuf,
}
pub fn export_lineloss_xlsx(request: &LinelossExportRequest) -> anyhow::Result<PathBuf> {
if request.rows.is_empty() {
anyhow::bail!("rows must not be empty");
}
if request.column_defs.is_empty() {
anyhow::bail!("column_defs must not be empty");
}
let sheet_xml = build_worksheet_xml(&request.column_defs, &request.rows);
write_xlsx(
&request.output_path,
&request.sheet_name,
&sheet_xml,
)?;
Ok(request.output_path.clone())
}
fn build_worksheet_xml(
column_defs: &[(String, String)],
rows: &[Map<String, Value>],
) -> String {
let mut xml_rows = Vec::with_capacity(rows.len() + 1);
// Header row (row 1)
let header_cells: Vec<String> = column_defs
.iter()
.enumerate()
.map(|(col_idx, (_key, label))| {
let col_letter = column_letter(col_idx);
format!(
"<c r=\"{col_letter}1\" t=\"inlineStr\"><is><t>{}</t></is></c>",
xml_escape(label)
)
})
.collect();
xml_rows.push(format!("<row r=\"1\">{}</row>", header_cells.join("")));
// Data rows (row 2+)
for (row_idx, row) in rows.iter().enumerate() {
let excel_row = row_idx + 2;
let cells: Vec<String> = column_defs
.iter()
.enumerate()
.map(|(col_idx, (key, _label))| {
let col_letter = column_letter(col_idx);
let value = row
.get(key)
.map(|v| value_to_string(v))
.unwrap_or_default();
format!(
"<c r=\"{col_letter}{excel_row}\" t=\"inlineStr\"><is><t>{}</t></is></c>",
xml_escape(&value)
)
})
.collect();
xml_rows.push(format!("<row r=\"{excel_row}\">{}</row>", cells.join("")));
}
format!(
"<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"yes\"?>\
<worksheet xmlns=\"http://schemas.openxmlformats.org/spreadsheetml/2006/main\">\
<sheetData>{}</sheetData>\
</worksheet>",
xml_rows.join("")
)
}
fn column_letter(index: usize) -> String {
let mut result = String::new();
let mut n = index;
loop {
result.insert(0, (b'A' + (n % 26) as u8) as char);
if n < 26 {
break;
}
n = n / 26 - 1;
}
result
}
fn value_to_string(value: &Value) -> String {
match value {
Value::String(text) => text.clone(),
Value::Number(number) => number.to_string(),
Value::Bool(flag) => flag.to_string(),
Value::Null => String::new(),
other => other.to_string(),
}
}
fn xml_escape(value: &str) -> String {
value
.replace('&', "&amp;")
.replace('<', "&lt;")
.replace('>', "&gt;")
}
fn write_xlsx(output_path: &Path, sheet_name: &str, sheet_xml: &str) -> anyhow::Result<()> {
if let Some(parent) = output_path.parent() {
fs::create_dir_all(parent)?;
}
if output_path.exists() {
fs::remove_file(output_path)?;
}
let file = fs::File::create(output_path)?;
let mut zip = ZipWriter::new(file);
let options = FileOptions::default().compression_method(CompressionMethod::Stored);
zip.start_file("[Content_Types].xml", options)?;
zip.write_all(content_types_xml().as_bytes())?;
zip.start_file("_rels/.rels", options)?;
zip.write_all(root_rels_xml().as_bytes())?;
zip.start_file("docProps/app.xml", options)?;
zip.write_all(app_xml().as_bytes())?;
zip.start_file("docProps/core.xml", options)?;
zip.write_all(core_xml().as_bytes())?;
zip.start_file("xl/workbook.xml", options)?;
zip.write_all(workbook_xml(&xml_escape(sheet_name)).as_bytes())?;
zip.start_file("xl/_rels/workbook.xml.rels", options)?;
zip.write_all(workbook_rels_xml().as_bytes())?;
zip.start_file("xl/worksheets/sheet1.xml", options)?;
zip.write_all(sheet_xml.as_bytes())?;
zip.finish()?;
Ok(())
}
fn content_types_xml() -> &'static str {
r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Types xmlns="http://schemas.openxmlformats.org/package/2006/content-types">
<Default Extension="rels" ContentType="application/vnd.openxmlformats-package.relationships+xml"/>
<Default Extension="xml" ContentType="application/xml"/>
<Override PartName="/xl/workbook.xml" ContentType="application/vnd.openxmlformats-officedocument.spreadsheetml.sheet.main+xml"/>
<Override PartName="/xl/worksheets/sheet1.xml" ContentType="application/vnd.openxmlformats-officedocument.spreadsheetml.worksheet+xml"/>
<Override PartName="/docProps/core.xml" ContentType="application/vnd.openxmlformats-package.core-properties+xml"/>
<Override PartName="/docProps/app.xml" ContentType="application/vnd.openxmlformats-officedocument.extended-properties+xml"/>
</Types>"#
}
fn root_rels_xml() -> &'static str {
r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Relationships xmlns="http://schemas.openxmlformats.org/package/2006/relationships">
<Relationship Id="rId1" Type="http://schemas.openxmlformats.org/officeDocument/2006/relationships/officeDocument" Target="xl/workbook.xml"/>
<Relationship Id="rId2" Type="http://schemas.openxmlformats.org/package/2006/relationships/metadata/core-properties" Target="docProps/core.xml"/>
<Relationship Id="rId3" Type="http://schemas.openxmlformats.org/officeDocument/2006/relationships/extended-properties" Target="docProps/app.xml"/>
</Relationships>"#
}
fn app_xml() -> &'static str {
r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Properties xmlns="http://schemas.openxmlformats.org/officeDocument/2006/extended-properties"
xmlns:vt="http://schemas.openxmlformats.org/officeDocument/2006/docPropsVTypes">
<Application>sgClaw</Application>
</Properties>"#
}
fn core_xml() -> &'static str {
r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<cp:coreProperties xmlns:cp="http://schemas.openxmlformats.org/package/2006/metadata/core-properties"
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:dcterms="http://purl.org/dc/terms/"
xmlns:dcmitype="http://purl.org/dc/dcmitype/"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<dc:title>台区线损报表</dc:title>
</cp:coreProperties>"#
}
fn workbook_xml(sheet_name: &str) -> String {
format!(
r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<workbook xmlns="http://schemas.openxmlformats.org/spreadsheetml/2006/main"
xmlns:r="http://schemas.openxmlformats.org/officeDocument/2006/relationships">
<sheets>
<sheet name="{sheet_name}" sheetId="1" r:id="rId1"/>
</sheets>
</workbook>"#
)
}
fn workbook_rels_xml() -> &'static str {
r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Relationships xmlns="http://schemas.openxmlformats.org/package/2006/relationships">
<Relationship Id="rId1" Type="http://schemas.openxmlformats.org/officeDocument/2006/relationships/worksheet" Target="worksheets/sheet1.xml"/>
</Relationships>"#
}
#[cfg(test)]
mod tests {
use super::column_letter;
#[test]
fn column_letter_maps_indices_correctly() {
assert_eq!(column_letter(0), "A");
assert_eq!(column_letter(1), "B");
assert_eq!(column_letter(6), "G");
assert_eq!(column_letter(25), "Z");
assert_eq!(column_letter(26), "AA");
}
}
```
- [ ] **Step 5: Run the tests to verify they pass**
Run: `cargo test --test lineloss_xlsx_export_test -- --nocapture`
Expected: both `export_month_lineloss_produces_valid_xlsx` and `export_empty_rows_returns_error` PASS.
Also run the internal unit test:
Run: `cargo test lineloss_xlsx_export -- --nocapture`
Expected: `column_letter_maps_indices_correctly` PASS.
- [ ] **Step 6: Commit**
```bash
git add src/compat/lineloss_xlsx_export.rs src/compat/mod.rs tests/lineloss_xlsx_export_test.rs
git commit -m "feat(lineloss): add Rust-side XLSX generation for lineloss reports"
```
---
### Task 2: Integrate XLSX Export into `deterministic_submit.rs`
**Files:**
- Modify: `src/compat/deterministic_submit.rs`
- [ ] **Step 1: Add imports and helper function to extract export data from artifact**
At the top of `src/compat/deterministic_submit.rs`, add the import:
```rust
use crate::compat::lineloss_xlsx_export::{export_lineloss_xlsx, LinelossExportRequest};
```
Then add a new helper function after `summarize_lineloss_artifact`:
```rust
struct LinelossArtifactExportData {
sheet_name: String,
column_defs: Vec<(String, String)>,
rows: Vec<Map<String, Value>>,
}
fn extract_export_data(output: &str) -> Option<LinelossArtifactExportData> {
let payload: Value = serde_json::from_str(output).ok()?;
let artifact = payload
.as_object()
.and_then(|object| object.get("text"))
.unwrap_or(&payload);
let artifact = artifact.as_object()?;
if artifact.get("type").and_then(Value::as_str) != Some("report-artifact") {
return None;
}
let status = artifact.get("status").and_then(Value::as_str).unwrap_or("");
if !matches!(status, "ok" | "partial") {
return None;
}
let rows = artifact
.get("rows")
.and_then(Value::as_array)?;
if rows.is_empty() {
return None;
}
let rows: Vec<Map<String, Value>> = rows
.iter()
.filter_map(|row| row.as_object().cloned())
.collect();
if rows.is_empty() {
return None;
}
let column_defs: Vec<(String, String)> = artifact
.get("column_defs")
.and_then(Value::as_array)
.map(|defs| {
defs.iter()
.filter_map(|def| {
let arr = def.as_array()?;
let key = arr.first()?.as_str()?.to_string();
let label = arr.get(1)?.as_str()?.to_string();
Some((key, label))
})
.collect()
})
.unwrap_or_default();
// Fallback: if column_defs not in artifact, try "columns" array as keys
let column_defs = if column_defs.is_empty() {
let columns = artifact
.get("columns")
.and_then(Value::as_array)?;
columns
.iter()
.filter_map(|col| {
let key = col.as_str()?.to_string();
Some((key.clone(), key))
})
.collect()
} else {
column_defs
};
if column_defs.is_empty() {
return None;
}
let org_label = artifact
.get("org")
.and_then(Value::as_object)
.and_then(|org| org.get("label"))
.and_then(Value::as_str)
.unwrap_or("lineloss");
let period_mode = artifact
.get("period")
.and_then(Value::as_object)
.and_then(|p| p.get("mode"))
.and_then(Value::as_str)
.unwrap_or("month");
let period_value = artifact
.get("period")
.and_then(Value::as_object)
.and_then(|p| p.get("value"))
.and_then(Value::as_str)
.unwrap_or("");
let mode_label = if period_mode == "week" { "周度" } else { "月度" };
let sheet_name = format!("{org_label}{mode_label}线损分析报表({period_value})");
Some(LinelossArtifactExportData {
sheet_name,
column_defs,
rows,
})
}
```
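For orientation, this is the artifact shape `extract_export_data` expects to find in `output` (optionally wrapped in a `text` field). The column keys `ORG_NAME`/`YGDL` come from the skill's column set; every value below is a made-up sample, not real data:

```json
{
  "text": {
    "type": "report-artifact",
    "status": "ok",
    "org": { "label": "示例供电所", "code": "0000" },
    "period": { "mode": "month", "value": "2026-03" },
    "columns": ["ORG_NAME", "YGDL"],
    "column_defs": [["ORG_NAME", "台区名称"], ["YGDL", "有功电量"]],
    "rows": [{ "ORG_NAME": "示例台区", "YGDL": "1234.5" }]
  }
}
```

With this input the extractor builds `column_defs` from the key+label pairs; if `column_defs` were absent it would fall back to using each entry of `columns` as both key and label.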
- [ ] **Step 2: Add the export-after-collection function**
Add a new function that wraps the existing flow with XLSX export:
```rust
fn try_export_lineloss_xlsx(
output: &str,
workspace_root: &Path,
) -> Option<PathBuf> {
let data = extract_export_data(output)?;
let nanos = std::time::SystemTime::now()
.duration_since(std::time::UNIX_EPOCH)
.map(|d| d.as_nanos())
.unwrap_or_default();
let out_dir = workspace_root.join("out");
let output_path = out_dir.join(format!("tq-lineloss-{nanos}.xlsx"));
let request = LinelossExportRequest {
sheet_name: data.sheet_name,
column_defs: data.column_defs,
rows: data.rows,
output_path,
};
match export_lineloss_xlsx(&request) {
Ok(path) => {
eprintln!("[deterministic_submit] XLSX exported to: {}", path.display());
Some(path)
}
Err(err) => {
eprintln!("[deterministic_submit] XLSX export failed: {err}");
None
}
}
}
```
- [ ] **Step 3: Modify `execute_deterministic_submit_with_browser_backend` to call export**
Replace the body of `execute_deterministic_submit_with_browser_backend` (lines 119-136 of the original file):
```rust
pub fn execute_deterministic_submit_with_browser_backend(
browser_backend: Arc<dyn BrowserBackend>,
plan: &DeterministicExecutionPlan,
workspace_root: &Path,
settings: &SgClawSettings,
) -> Result<DirectSubmitOutcome, PipeError> {
let args = deterministic_submit_args(plan);
let output =
crate::compat::direct_skill_runtime::execute_browser_script_skill_raw_output_with_browser_backend(
browser_backend,
&plan.tool_name,
workspace_root,
settings,
args,
)?;
let export_path = try_export_lineloss_xlsx(&output, workspace_root);
Ok(summarize_lineloss_output_with_export(&output, export_path.as_deref()))
}
```
Apply the same change to `execute_deterministic_submit` (the non-backend variant, lines 101-117):
```rust
pub fn execute_deterministic_submit<T: Transport + 'static>(
browser_tool: BrowserPipeTool<T>,
plan: &DeterministicExecutionPlan,
workspace_root: &Path,
settings: &SgClawSettings,
) -> Result<DirectSubmitOutcome, PipeError> {
let args = deterministic_submit_args(plan);
let output = crate::compat::direct_skill_runtime::execute_browser_script_skill_raw_output(
browser_tool,
&plan.tool_name,
workspace_root,
settings,
args,
)?;
let export_path = try_export_lineloss_xlsx(&output, workspace_root);
Ok(summarize_lineloss_output_with_export(&output, export_path.as_deref()))
}
```
- [ ] **Step 4: Add `summarize_lineloss_output_with_export` function**
Add this new function. It wraps the existing `summarize_lineloss_output` and appends the export path:
```rust
fn summarize_lineloss_output_with_export(output: &str, export_path: Option<&Path>) -> DirectSubmitOutcome {
let mut outcome = summarize_lineloss_output(output);
if let Some(path) = export_path {
outcome.summary.push_str(&format!(" export_path={}", path.display()));
}
outcome
}
```
- [ ] **Step 5: Run existing tests to ensure nothing breaks**
Run: `cargo test --test deterministic_submit_test -- --nocapture`
Expected: all existing tests PASS (the tests don't call `execute_deterministic_submit`; they exercise `decide_deterministic_submit` and the parsing logic, which are unchanged).
Run: `cargo test deterministic_submit -- --nocapture`
Expected: PASS.
- [ ] **Step 6: Commit**
```bash
git add src/compat/deterministic_submit.rs
git commit -m "feat(lineloss): integrate Rust-side XLSX export into deterministic submit pipeline"
```
---
### Task 3: Modify `collect_lineloss.js` to Skip Browser-Side Export
**Files:**
- Modify: `D:/data/ideaSpace/rust/sgClaw/claw/claw/skills/skill_staging/skills/tq-lineloss-report/scripts/collect_lineloss.js`
- [ ] **Step 1: Add `column_defs` to the artifact returned by `buildArtifact`**
In the `buildArtifact` function (around line 198), the `columns` field currently contains just column keys (e.g., `["ORG_NAME", "YGDL", ...]`). Add a `column_defs` field that includes the full key+label pairs. Change the `buildArtifact` function to also accept and emit `column_defs`:
Find this block in `buildArtifact` (line 198-242):
```javascript
function buildArtifact({
status,
blockedReason = '',
fatalError = '',
org_label = '',
org_code = '',
period_mode = '',
period_mode_code = '',
period_value = '',
period_payload = {},
columns = [],
rows = [],
export: exportState,
reasons = []
}) {
```
Replace with:
```javascript
function buildArtifact({
status,
blockedReason = '',
fatalError = '',
org_label = '',
org_code = '',
period_mode = '',
period_mode_code = '',
period_value = '',
period_payload = {},
columns = [],
column_defs = [],
rows = [],
export: exportState,
reasons = []
}) {
```
In the returned object (the `return { ... }` block inside `buildArtifact`), add `column_defs` after `columns`:
```javascript
columns: [...columns],
column_defs: [...column_defs],
rows: [...rows],
```
- [ ] **Step 2: Pass `column_defs` from `buildBrowserEntrypointResult`**
In `buildBrowserEntrypointResult`, after the `columns` assignment (around line 452), add:
```javascript
const columns = normalizedArgs.period_mode === 'week' ? WEEK_COLUMNS : MONTH_COLUMNS;
const columnDefs = normalizedArgs.period_mode === 'week' ? WEEK_COLUMN_DEFS : MONTH_COLUMN_DEFS;
```
Then in every call to `buildArtifact` inside `buildBrowserEntrypointResult`, add `column_defs: columnDefs` alongside `columns`. There are 5 calls:
**Call 1** (API error, around line 466):
```javascript
columns,
column_defs: columnDefs,
rows: [],
```
**Call 2** (empty rows, around line 483):
```javascript
columns,
column_defs: columnDefs,
rows: []
```
**Call 3** (normalization failure, around line 497):
```javascript
columns,
column_defs: columnDefs,
rows: [],
```
**Call 4** (success, around line 558):
```javascript
columns,
column_defs: columnDefs,
rows,
```
Note: the two `buildArtifact` calls before the `columns` variable is assigned (validation failure and page context failure, around lines 422 and 439) don't need `column_defs` since they don't have data.
- [ ] **Step 3: Remove the `exportWorkbook` and `writeReportLog` calls from the success path**
In `buildBrowserEntrypointResult`, replace the entire export block (lines 518-556) with a simplified version:
Find:
```javascript
const exportState = {
attempted: false,
status: 'skipped',
message: null
};
if (typeof deps.exportWorkbook === 'function') {
exportState.attempted = true;
try {
const exportPayload = buildExportPayload({
mode: normalizedArgs.period_mode,
orgLabel: normalizedArgs.org_label,
periodValue: normalizedArgs.period_value,
rows
});
const exportResult = await deps.exportWorkbook(exportPayload);
const exportPath = pickFirstNonEmpty(exportResult?.path, exportResult?.data?.path, exportResult?.data?.data);
if (!exportPath) {
throw new Error('export_failed');
}
exportState.status = 'ok';
exportState.message = exportPath;
if (typeof deps.writeReportLog === 'function') {
try {
const reportLog = await deps.writeReportLog(buildReportName(normalizedArgs), exportPath);
if (reportLog?.success === false) {
reasons.push('report_log_failed');
}
} catch (_error) {
reasons.push('report_log_failed');
}
}
} catch (error) {
reasons.push('export_failed');
exportState.status = 'failed';
exportState.message = pickFirstNonEmpty(error?.message, 'export_failed');
}
}
```
Replace with:
```javascript
// Export is handled by Rust side after receiving the artifact.
// JS only provides rows + column_defs in the artifact.
const exportState = {
attempted: false,
status: 'deferred_to_rust',
message: null
};
```
- [ ] **Step 4: Remove unused constants and functions**
Remove these constants (lines 5-6), since the JS side no longer references them:
```javascript
const EXPORT_SERVICE_URL = 'http://localhost:13313/SurfaceServices/personalBread/export/faultDetailsExportXLSX';
const REPORT_LOG_URL = 'http://localhost:13313/ReportServices/Api/setReportLog';
```
Remove the `postJson` function (lines 264-294) — it is no longer needed since no JS-side HTTP calls are made to localhost.
Remove these functions from `defaultBrowserDeps()`:
- `exportWorkbook` (lines 350-373)
- `writeReportLog` (lines 375-409)
Remove these now-unused functions:
- `buildExportTitles` (lines 244-254)
- `buildExportPayload` (lines 256-262)
- `buildReportName` (lines 413-415)
- [ ] **Step 5: Update the module.exports to remove unused exports**
Update the `module.exports` block (lines 572-586). Keep `buildBrowserEntrypointResult` in the exports so existing tests that drive the full flow keep working. The final exports block:
```javascript
if (typeof module !== 'undefined' && module.exports) {
module.exports = {
MONTH_COLUMNS,
WEEK_COLUMNS,
MONTH_COLUMN_DEFS,
WEEK_COLUMN_DEFS,
validateArgs,
buildMonthRequest,
buildWeekRequest,
normalizeRows,
determineArtifactStatus,
buildArtifact,
buildBrowserEntrypointResult
};
} else {
return buildBrowserEntrypointResult(args);
}
```
- [ ] **Step 6: Verify the JS file has no syntax errors**
Run: `node -c "D:/data/ideaSpace/rust/sgClaw/claw/claw/skills/skill_staging/skills/tq-lineloss-report/scripts/collect_lineloss.js"`
Expected: no syntax errors. (Note: the file uses `return` at top level inside a wrapped IIFE when injected into the browser, so Node syntax check may warn — the important thing is no parse errors.)
Alternatively, check the test file still works:
Run: `node "D:/data/ideaSpace/rust/sgClaw/claw/claw/skills/skill_staging/skills/tq-lineloss-report/scripts/collect_lineloss.test.js"`
Expected: tests pass (or at least no JS parse errors).
- [ ] **Step 7: Commit**
```bash
git add "D:/data/ideaSpace/rust/sgClaw/claw/claw/skills/skill_staging/skills/tq-lineloss-report/scripts/collect_lineloss.js"
git commit -m "feat(lineloss): remove browser-side export, defer to Rust-side XLSX generation"
```
---
### Task 4: Full Build Verification
**Files:** None (verification only)
- [ ] **Step 1: Run full cargo build**
Run: `cargo build`
Expected: successful compilation with no errors.
- [ ] **Step 2: Run all tests**
Run: `cargo test -- --nocapture`
Expected: all tests pass, including:
- `lineloss_xlsx_export_test::export_month_lineloss_produces_valid_xlsx`
- `lineloss_xlsx_export_test::export_empty_rows_returns_error`
- `lineloss_xlsx_export::tests::column_letter_maps_indices_correctly`
- All existing `deterministic_submit_test` tests
- [ ] **Step 3: Commit (if any fixups needed)**
Only if compilation or test fixes were required in this step.


@@ -0,0 +1,117 @@
# Helper Page Lifecycle Fix v2 — Same-Connection Close + Open
> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.
**Goal:** Prevent orphaned helper pages across process restarts by closing existing ones before opening new ones, all on the same WebSocket connection.
**Architecture:** In `bootstrap_helper_page`, after registering with the browser WS, send `sgHideBrowerserClosePage` (best-effort, silently ignored if no page exists), then send `sgHideBrowerserOpenPage`. Change `use_hidden_domain` to `true`.
**Tech Stack:** Rust, tungstenite, SuperRPA browser WS protocol
---
### Task 1: Add close-before-open in bootstrap_helper_page
**Files:**
- Modify: `src/browser/callback_host.rs:345-374` (bootstrap_helper_page function)
- [ ] **Step 1: Add close command before open command in bootstrap_helper_page**
Replace the current `bootstrap_helper_page` function. After `recv_bootstrap_prelude`, send the close command first, then the open command:
```rust
fn bootstrap_helper_page(
browser_ws_url: &str,
request_url: &str,
helper_url: &str,
use_hidden_domain: bool,
) -> Result<(), PipeError> {
let (mut websocket, _) = connect(browser_ws_url)
.map_err(|err| PipeError::Protocol(format!("browser websocket connect failed: {err}")))?;
configure_bootstrap_socket(&mut websocket)?;
websocket
.send(Message::Text(
r#"{"type":"register","role":"web"}"#.to_string().into(),
))
.map_err(|err| PipeError::Protocol(format!("browser websocket register failed: {err}")))?;
let _ = recv_bootstrap_prelude(&mut websocket);
// Close any orphaned helper page from a previous process run.
// Best-effort: if no page exists, the browser silently ignores this.
let (open_action, close_action) = if use_hidden_domain {
("sgHideBrowerserOpenPage", "sgHideBrowerserClosePage")
} else {
("sgBrowerserOpenPage", "sgBrowserClosePage")
};
let close_payload = json!([request_url, close_action, helper_url]).to_string();
let _ = websocket.send(Message::Text(close_payload.into()));
let payload = json!([
request_url,
open_action,
helper_url,
])
.to_string();
websocket
.send(Message::Text(payload.into()))
.map_err(|err| PipeError::Protocol(format!("helper bootstrap send failed: {err}")))?;
Ok(())
}
```
Key changes from current code:
- After `recv_bootstrap_prelude`, add the close command (best-effort, ignore errors)
- Compute both `open_action` and `close_action` from `use_hidden_domain` flag
- Send close first, then open on the same WebSocket connection
- [ ] **Step 2: Change `use_hidden_domain` to `true` in server.rs**
In `src/service/server.rs`, at the `start_with_browser_ws_url` call, change `false` to `true`:
```rust
match LiveBrowserCallbackHost::start_with_browser_ws_url(
browser_ws_url,
&bootstrap_url,
Duration::from_secs(15),
BROWSER_RESPONSE_TIMEOUT,
true, // use_hidden_domain: hidden domain for invisible helper
) {
```
- [ ] **Step 3: Build**
Run: `cargo build 2>&1`
Expected: 0 errors.
- [ ] **Step 4: Run callback_host tests**
Run: `cargo test --lib -- callback_host 2>&1`
Expected: 12 tests pass (including `live_callback_host_sends_bootstrap_open_page_command` which still checks for `sgBrowerserOpenPage` because the test passes `false`, and `live_callback_host_hidden_domain_sends_hide_open_page_command` which passes `true`).
Note: The test passes `false` for `use_hidden_domain`, so the close command will use `sgBrowserClosePage`. The test's fake WebSocket server will receive both the close and open frames. The test only checks that `sgBrowerserOpenPage` is present, which is still true.
- [ ] **Step 5: Commit**
```bash
git add src/browser/callback_host.rs src/service/server.rs
git commit -m "fix(callback_host): close orphaned helper page before opening new one on same WS"
```
---
### Task 2: Full verification
**Files:** None (verification only)
- [ ] **Step 1: Full test suite**
Run: `cargo test 2>&1`
Expected: All tests pass except pre-existing `lineloss_period_resolver_prompts_for_missing_period` failure.
- [ ] **Step 2: Verify key behavioral changes**
Manually confirm:
1. `bootstrap_helper_page` sends close command before open command (both on same WS connection)
2. `use_hidden_domain` is `true` in `server.rs` — helper page opens in hidden domain
3. `Drop for LiveBrowserCallbackHost` remains simple (shutdown only, no close attempt)
4. `cached_host` is still in `mod.rs` outer loop (process-internal deduplication)


@@ -0,0 +1,475 @@
# Helper Page Lifecycle Fix & Hidden Domain Support — Implementation Plan
> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.
**Goal:** Fix duplicate browser-helper.html pages caused by WebSocket reconnections, add cleanup on Drop, and introduce a config switch for hidden-domain page opening.
**Architecture:** Three changes: (1) lift `cached_host` from `serve_client()` to the outer `run()` loop so reconnections share one host, (2) enhance `Drop for LiveBrowserCallbackHost` to send a close-page command via browser WS, (3) add `use_hidden_domain: bool` parameter that selects between `sgBrowerserOpenPage`/`sgHideBrowerserOpenPage` and their corresponding close APIs.
**Tech Stack:** Rust, tungstenite WebSocket crate, SuperRPA browser WS protocol
---
### Task 1: Add `use_hidden_domain` field and update `bootstrap_helper_page`
**Files:**
- Modify: `src/browser/callback_host.rs:28` (constant), `:44-51` (struct), `:215-252` (constructor), `:340-359` (bootstrap fn)
- [ ] **Step 1: Change `HELPER_BOOTSTRAP_ACTION` from constant to a function of `use_hidden_domain`**
Replace the constant and update `bootstrap_helper_page` to accept and use the flag:
```rust
// DELETE this line:
// const HELPER_BOOTSTRAP_ACTION: &str = "sgBrowerserOpenPage";
// REPLACE bootstrap_helper_page signature and body:
fn bootstrap_helper_page(
browser_ws_url: &str,
request_url: &str,
helper_url: &str,
use_hidden_domain: bool,
) -> Result<(), PipeError> {
let (mut websocket, _) = connect(browser_ws_url)
.map_err(|err| PipeError::Protocol(format!("browser websocket connect failed: {err}")))?;
configure_bootstrap_socket(&mut websocket)?;
websocket
.send(Message::Text(
r#"{"type":"register","role":"web"}"#.to_string().into(),
))
.map_err(|err| PipeError::Protocol(format!("browser websocket register failed: {err}")))?;
let _ = recv_bootstrap_prelude(&mut websocket);
let open_action = if use_hidden_domain {
"sgHideBrowerserOpenPage"
} else {
"sgBrowerserOpenPage"
};
let payload = json!([
request_url,
open_action,
helper_url,
])
.to_string();
websocket
.send(Message::Text(payload.into()))
.map_err(|err| PipeError::Protocol(format!("helper bootstrap send failed: {err}")))?;
Ok(())
}
```
- [ ] **Step 2: Add new fields to `LiveBrowserCallbackHost`**
```rust
#[derive(Debug)]
pub(crate) struct LiveBrowserCallbackHost {
host: Arc<BrowserCallbackHost>,
shutdown: Arc<AtomicBool>,
server_thread: Mutex<Option<JoinHandle<()>>>,
command_lock: Mutex<()>,
result_timeout: Duration,
browser_ws_url: String,
use_hidden_domain: bool,
}
```
- [ ] **Step 3: Update `start_with_browser_ws_url` to accept and store the new parameter**
```rust
impl LiveBrowserCallbackHost {
pub(crate) fn start_with_browser_ws_url(
browser_ws_url: &str,
bootstrap_request_url: &str,
ready_timeout: Duration,
result_timeout: Duration,
use_hidden_domain: bool,
) -> Result<Self, PipeError> {
let listener = TcpListener::bind("127.0.0.1:0").map_err(|err| {
PipeError::Protocol(format!("failed to bind callback host listener: {err}"))
})?;
listener.set_nonblocking(true).map_err(|err| {
PipeError::Protocol(format!("failed to configure callback host listener: {err}"))
})?;
let origin = format!(
"http://{}",
listener.local_addr().map_err(|err| {
PipeError::Protocol(format!(
"failed to resolve callback host listener address: {err}"
))
})?
);
let host = Arc::new(BrowserCallbackHost::with_urls(&origin, browser_ws_url));
let shutdown = Arc::new(AtomicBool::new(false));
let thread_host = host.clone();
let thread_shutdown = shutdown.clone();
let server_thread = thread::spawn(move || serve_loop(listener, thread_host, thread_shutdown));
bootstrap_helper_page(browser_ws_url, bootstrap_request_url, host.helper_url(), use_hidden_domain)?;
wait_for_helper_ready(host.as_ref(), ready_timeout)?;
let live_host = Self {
host,
shutdown,
server_thread: Mutex::new(Some(server_thread)),
command_lock: Mutex::new(()),
result_timeout,
browser_ws_url: browser_ws_url.to_string(),
use_hidden_domain,
};
Ok(live_host)
}
```
- [ ] **Step 4: Fix the inline test struct literal that constructs `LiveBrowserCallbackHost` directly**
In the `live_callback_host_treats_simulated_mouse_command_as_fire_and_forget` test (around line 1110), add the new fields:
```rust
let host = LiveBrowserCallbackHost {
host: Arc::new(BrowserCallbackHost::new()),
shutdown: Arc::new(AtomicBool::new(false)),
server_thread: Mutex::new(None),
command_lock: Mutex::new(()),
result_timeout: Duration::from_millis(10),
browser_ws_url: "ws://127.0.0.1:12345".to_string(),
use_hidden_domain: false,
};
```
- [ ] **Step 5: Run build to verify compilation**
Run: `cargo build 2>&1`
Expected: 0 errors. The `HELPER_BOOTSTRAP_ACTION` constant removal and signature changes should all be internally consistent.
- [ ] **Step 6: Run tests to verify existing behavior is preserved**
Run: `cargo test -- callback_host 2>&1`
Expected: All existing callback_host tests pass (including `live_callback_host_sends_bootstrap_open_page_command` which still checks for `sgBrowerserOpenPage` since no caller passes `true` yet).
- [ ] **Step 7: Commit**
```bash
git add src/browser/callback_host.rs
git commit -m "feat(callback_host): add use_hidden_domain param to bootstrap_helper_page"
```
---
### Task 2: Enhance `Drop` to close the helper page
**Files:**
- Modify: `src/browser/callback_host.rs:321-328` (Drop impl)
- [ ] **Step 1: Add `close_helper_page` helper function**
Add this function near `bootstrap_helper_page` (after line ~360):
```rust
/// Best-effort attempt to close the helper page tab via browser WebSocket.
/// Silently ignores all errors — this runs during Drop and must not panic.
fn close_helper_page(browser_ws_url: &str, helper_url: &str, use_hidden_domain: bool) {
let close_action = if use_hidden_domain {
"sgHideBrowerserClosePage"
} else {
"sgBrowserClosePage"
};
let result: Result<(), Box<dyn std::error::Error>> = (|| {
// Use a raw TcpStream with timeouts instead of tungstenite::connect
// which does not expose a connection timeout.
let addr = browser_ws_url
.trim_start_matches("ws://")
.trim_start_matches("wss://");
let stream = TcpStream::connect_timeout(
&addr.parse().map_err(|e| format!("addr parse: {e}"))?,
Duration::from_millis(100),
)?;
stream.set_read_timeout(Some(Duration::from_millis(200)))?;
stream.set_write_timeout(Some(Duration::from_millis(200)))?;
let (mut websocket, _) = tungstenite::client(
browser_ws_url,
stream,
)?;
websocket.send(Message::Text(
r#"{"type":"register","role":"web"}"#.to_string().into(),
))?;
// Drain the welcome prelude (best-effort, ignore timeout).
let _ = websocket.read();
let payload = json!([helper_url, close_action, helper_url]).to_string();
websocket.send(Message::Text(payload.into()))?;
Ok(())
})();
if let Err(err) = result {
eprintln!("close_helper_page best-effort failed (harmless): {err}");
}
}
```
- [ ] **Step 2: Update `Drop for LiveBrowserCallbackHost` to call `close_helper_page`**
```rust
impl Drop for LiveBrowserCallbackHost {
fn drop(&mut self) {
// Best-effort: tell the browser to close the helper page tab.
close_helper_page(
&self.browser_ws_url,
self.host.helper_url(),
self.use_hidden_domain,
);
self.shutdown.store(true, Ordering::Relaxed);
if let Some(handle) = self.server_thread.lock().unwrap().take() {
let _ = handle.join();
}
}
}
```
- [ ] **Step 3: Run build to verify compilation**
Run: `cargo build 2>&1`
Expected: 0 errors. `close_helper_page` uses types already imported (`TcpStream`, `Duration`, `json!`, `Message`).
- [ ] **Step 4: Run tests**
Run: `cargo test -- callback_host 2>&1`
Expected: All pass. The Drop enhancement is best-effort and the test helper constructs hosts with `server_thread: Mutex::new(None)`, so Drop completes cleanly.
- [ ] **Step 5: Commit**
```bash
git add src/browser/callback_host.rs
git commit -m "feat(callback_host): close helper page on Drop via browser WS"
```
---
### Task 3: Lift `cached_host` to outer loop and update `serve_client` signature
**Files:**
- Modify: `src/service/mod.rs:72-96` (run loop)
- Modify: `src/service/server.rs:232-241` (serve_client signature and cached_host init)
- [ ] **Step 1: Change `serve_client` to accept `cached_host` as a parameter**
In `src/service/server.rs`, change the function signature and remove the local `cached_host` variable:
```rust
pub fn serve_client(
context: &AgentRuntimeContext,
session: &ServiceSession,
sink: Arc<ServiceEventSink>,
browser_ws_url: &str,
mac_policy: &MacPolicy,
cached_host: &mut Option<Arc<LiveBrowserCallbackHost>>,
) -> Result<(), PipeError> {
// DELETE the line: let mut cached_host: Option<Arc<LiveBrowserCallbackHost>> = None;
loop {
// ... rest of function body unchanged, `cached_host` is now the parameter
```
The body references to `cached_host` remain identical — they just operate on the borrowed mutable reference instead of a local variable.
- [ ] **Step 2: Update `start_with_browser_ws_url` call to pass `false` for `use_hidden_domain`**
In `src/service/server.rs`, at the `LiveBrowserCallbackHost::start_with_browser_ws_url` call (around line 288), add the `false` argument:
```rust
match LiveBrowserCallbackHost::start_with_browser_ws_url(
browser_ws_url,
&bootstrap_url,
Duration::from_secs(15),
BROWSER_RESPONSE_TIMEOUT,
false, // use_hidden_domain: visible tab for now
) {
```
- [ ] **Step 3: Lift `cached_host` into `run()` in `mod.rs`**
In `src/service/mod.rs`, declare `cached_host` before the loop and pass it to `serve_client`:
```rust
// Add this import at the top of the function or file:
use crate::browser::callback_host::LiveBrowserCallbackHost;
// Before the loop (after line 64, after `let session = ...`):
let mut cached_host: Option<Arc<LiveBrowserCallbackHost>> = None;
loop {
let (stream, _) = listener.accept()?;
let websocket = accept(stream)
.map_err(|err| PipeError::Protocol(format!("service websocket accept failed: {err}")))?;
let sink = Arc::new(ServiceEventSink::from_websocket(websocket));
match session.try_attach_client() {
Ok(()) => {
let result = serve_client(
&runtime_context,
&session,
sink.clone(),
browser_ws_url,
&mac_policy,
&mut cached_host,
);
session.detach_client();
match result {
Ok(()) | Err(PipeError::PipeClosed) => {}
Err(err) => return Err(err),
}
}
Err(message) => {
sink.send_service_message(message)?;
}
}
}
```
- [ ] **Step 4: Update the `pub use` export if needed**
Check `src/service/mod.rs:17`:
```rust
pub use server::{serve_client, ServiceEventSink, ServiceSession};
```
The signature change is compatible — `serve_client` is still public with an added parameter. Any external callers will get a compile error guiding them to add the parameter, which is the desired behavior.
- [ ] **Step 5: Run build to verify compilation**
Run: `cargo build 2>&1`
Expected: 0 errors. If there are external test files calling `serve_client`, they will fail here and need the new parameter added.
- [ ] **Step 6: Run full test suite**
Run: `cargo test 2>&1`
Expected: All tests pass. External test files that call `serve_client` indirectly through the service protocol tests should still work because they use the WS protocol layer, not `serve_client` directly. (Verified: grep found 0 test files referencing `serve_client` or `LiveBrowserCallbackHost`.)
- [ ] **Step 7: Commit**
```bash
git add src/service/mod.rs src/service/server.rs
git commit -m "fix(service): lift cached_host to outer loop to prevent duplicate helper pages"
```
---
### Task 4: Add tests for hidden domain bootstrap
**Files:**
- Modify: `src/browser/callback_host.rs` (inline tests module, around line 1071)
- [ ] **Step 1: Update existing `live_callback_host_sends_bootstrap_open_page_command` test**
The test currently calls `start_with_browser_ws_url` with 4 args. Add the 5th arg `false`:
```rust
#[test]
fn live_callback_host_sends_bootstrap_open_page_command() {
let (ws_url, frames, handle) = start_fake_browser_status_server();
let result = LiveBrowserCallbackHost::start_with_browser_ws_url(
&ws_url,
"https://www.zhihu.com",
Duration::from_millis(100),
Duration::from_millis(50),
false,
);
assert!(result.is_err(), "expected timeout because no real helper page loads");
drop(result);
handle.join().unwrap();
let sent = frames.lock().unwrap().clone();
assert!(
sent.iter().any(|frame| frame.contains("sgBrowerserOpenPage")),
"bootstrap should send sgBrowerserOpenPage to the browser WS; sent frames: {sent:?}"
);
assert!(
sent.iter().any(|frame| frame.contains("/sgclaw/browser-helper.html")),
"bootstrap should include the helper page URL; sent frames: {sent:?}"
);
assert!(
sent.iter().any(|frame| frame.contains("https://www.zhihu.com")),
"bootstrap requestUrl should be the provided page URL; sent frames: {sent:?}"
);
}
```
- [ ] **Step 2: Add new test for hidden domain bootstrap**
Add this test after the existing one:
```rust
#[test]
fn live_callback_host_hidden_domain_sends_hide_open_page_command() {
let (ws_url, frames, handle) = start_fake_browser_status_server();
let result = LiveBrowserCallbackHost::start_with_browser_ws_url(
&ws_url,
"https://www.zhihu.com",
Duration::from_millis(100),
Duration::from_millis(50),
true,
);
assert!(result.is_err(), "expected timeout because no real helper page loads");
drop(result);
handle.join().unwrap();
let sent = frames.lock().unwrap().clone();
assert!(
sent.iter().any(|frame| frame.contains("sgHideBrowerserOpenPage")),
"hidden domain bootstrap should send sgHideBrowerserOpenPage; sent frames: {sent:?}"
);
assert!(
!sent.iter().any(|frame| {
frame.contains("\"sgBrowerserOpenPage\"")
}),
"hidden domain bootstrap should NOT send visible sgBrowerserOpenPage; sent frames: {sent:?}"
);
assert!(
sent.iter().any(|frame| frame.contains("/sgclaw/browser-helper.html")),
"bootstrap should include the helper page URL; sent frames: {sent:?}"
);
}
```
- [ ] **Step 3: Run all callback_host tests**
Run: `cargo test -- callback_host 2>&1`
Expected: All 3 tests pass:
- `live_callback_host_sends_bootstrap_open_page_command` — regression, visible domain
- `live_callback_host_hidden_domain_sends_hide_open_page_command` — new, hidden domain
- `live_callback_host_treats_simulated_mouse_command_as_fire_and_forget` — unchanged
- [ ] **Step 4: Run full test suite**
Run: `cargo test 2>&1`
Expected: All tests pass.
- [ ] **Step 5: Commit**
```bash
git add src/browser/callback_host.rs
git commit -m "test(callback_host): add hidden domain bootstrap test"
```
---
### Task 5: Full build verification
**Files:** None (verification only)
- [ ] **Step 1: Clean build**
Run: `cargo build 2>&1`
Expected: 0 errors. Warnings about dead code in unrelated modules are acceptable.
- [ ] **Step 2: Full test suite**
Run: `cargo test 2>&1`
Expected: All tests pass. The pre-existing `lineloss_period_resolver_prompts_for_missing_period` failure (from previous work) is known and unrelated.
- [ ] **Step 3: Verify the key behavioral changes in code**
Manually confirm:
1. `src/service/mod.rs`: `cached_host` is declared BEFORE the `loop`, not inside `serve_client`
2. `src/browser/callback_host.rs`: `Drop::drop` calls `close_helper_page` before shutdown
3. `src/browser/callback_host.rs`: `bootstrap_helper_page` uses `"sgHideBrowerserOpenPage"` when `use_hidden_domain == true` and `"sgBrowerserOpenPage"` when `false`
4. `src/service/server.rs`: the `start_with_browser_ws_url` call passes `false` as `use_hidden_domain`


@@ -0,0 +1,762 @@
# Service Console Enhancement Implementation Plan
> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.
**Goal:** Add auto-connect on page load and a settings panel to sg_claw_service_console.html, with config save via WebSocket to the sgClaw service.
**Architecture:** The HTML page auto-connects on load and provides a settings modal. When user saves, the page sends an `update_config` WebSocket message. The Rust service receives it, merges with existing config, writes to `sgclaw_config.json`, and responds.
**Tech Stack:** Rust (serde, tungstenite), vanilla JavaScript/HTML/CSS
---
### Task 1: Add `UpdateConfig` and `ConfigUpdated` protocol types
**Files:**
- Modify: `src/service/protocol.rs`
- [ ] **Step 1: Add `ConfigUpdatePayload` struct and `UpdateConfig` variant to `ClientMessage`**
Add this struct above the `ClientMessage` enum, and add the `UpdateConfig` variant to the enum:
```rust
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub struct ConfigUpdatePayload {
#[serde(rename = "apiKey", default)]
pub api_key: Option<String>,
#[serde(rename = "baseUrl", default)]
pub base_url: Option<String>,
#[serde(default)]
pub model: Option<String>,
#[serde(rename = "skillsDir", default)]
pub skills_dir: Option<String>,
#[serde(rename = "directSubmitSkill", default)]
pub direct_submit_skill: Option<String>,
#[serde(rename = "runtimeProfile", default)]
pub runtime_profile: Option<String>,
#[serde(rename = "browserBackend", default)]
pub browser_backend: Option<String>,
}
```
Add `UpdateConfig` variant to `ClientMessage` enum (after `Ping`):
```rust
UpdateConfig {
config: ConfigUpdatePayload,
},
```
- [ ] **Step 2: Add `ConfigUpdated` variant to `ServiceMessage`**
Add after `Pong`:
```rust
ConfigUpdated {
success: bool,
message: String,
},
```
- [ ] **Step 3: Update `into_submit_task_request` to handle `UpdateConfig`**
In the match arm, add `ClientMessage::UpdateConfig { .. }` to the list that returns `None`:
```rust
ClientMessage::Connect
| ClientMessage::Start
| ClientMessage::Stop
| ClientMessage::Ping
| ClientMessage::UpdateConfig { .. } => None,
```
- [ ] **Step 4: Run tests to verify protocol compiles**
Run: `cargo test --lib service::protocol`
Expected: PASS (no protocol-specific tests yet, but it should compile)
### Task 2: Add `config_path()` getter to `AgentRuntimeContext`
**Files:**
- Modify: `src/agent/task_runner.rs`
- [ ] **Step 1: Add public getter method**
In the `impl AgentRuntimeContext` block, add after `load_sgclaw_settings()`:
```rust
pub fn config_path(&self) -> Option<&Path> {
self.config_path.as_deref()
}
```
Add the import at the top of the file if not present:
```rust
use std::path::Path;
```
- [ ] **Step 2: Run tests to verify**
Run: `cargo test agent::task_runner`
Expected: PASS
### Task 3: Add `save_to_path()` method to `SgClawSettings`
**Files:**
- Modify: `src/config/settings.rs`
- [ ] **Step 1: Add Serialize derive to SgClawSettings and related types**
The `RawSgClawSettings` struct uses `Deserialize` only. We need to add `Serialize` to `SgClawSettings` for writing. Add `use serde::Serialize;` at the top.
Add `Serialize` derive to `SgClawSettings`:
```rust
#[derive(Debug, Clone, PartialEq, Eq, Serialize)]
pub struct SgClawSettings {
```
However, `SgClawSettings` has enum fields (`RuntimeProfile`, `SkillsPromptMode`, `PlannerMode`, `BrowserBackend`, `OfficeBackend`) that don't implement `Serialize`, so deriving it directly would require adding `Serialize` to all of those types as well.
The simpler approach is to write a `to_serializable()` method that converts `SgClawSettings` into a dedicated serializable struct, then serialize that.
- [ ] **Step 2: Create serializable raw config struct**
Add a new struct at the bottom of the file (before tests if any):
```rust
#[derive(Debug, Serialize)]
struct SerializableRawSgClawSettings {
#[serde(rename = "apiKey")]
api_key: String,
#[serde(rename = "baseUrl")]
base_url: String,
model: String,
#[serde(rename = "skillsDir", skip_serializing_if = "Option::is_none")]
skills_dir: Option<String>,
#[serde(rename = "directSubmitSkill", skip_serializing_if = "Option::is_none")]
direct_submit_skill: Option<String>,
#[serde(rename = "skillsPromptMode", skip_serializing_if = "Option::is_none")]
skills_prompt_mode: Option<String>,
#[serde(rename = "runtimeProfile", skip_serializing_if = "Option::is_none")]
runtime_profile: Option<String>,
#[serde(rename = "plannerMode", skip_serializing_if = "Option::is_none")]
planner_mode: Option<String>,
#[serde(rename = "activeProvider", skip_serializing_if = "Option::is_none")]
active_provider: Option<String>,
#[serde(rename = "browserBackend", skip_serializing_if = "Option::is_none")]
browser_backend: Option<String>,
#[serde(rename = "officeBackend", skip_serializing_if = "Option::is_none")]
office_backend: Option<String>,
#[serde(rename = "browserWsUrl", skip_serializing_if = "Option::is_none")]
browser_ws_url: Option<String>,
#[serde(rename = "serviceWsListenAddr", skip_serializing_if = "Option::is_none")]
service_ws_listen_addr: Option<String>,
providers: Vec<SerializableProviderSettings>,
}
#[derive(Debug, Serialize)]
struct SerializableProviderSettings {
id: String,
provider: Option<String>,
#[serde(rename = "apiKey")]
api_key: String,
#[serde(rename = "baseUrl", skip_serializing_if = "Option::is_none")]
base_url: Option<String>,
model: String,
#[serde(rename = "apiPath", skip_serializing_if = "Option::is_none")]
api_path: Option<String>,
#[serde(rename = "wireApi", skip_serializing_if = "Option::is_none")]
wire_api: Option<String>,
#[serde(rename = "requiresOpenaiAuth")]
requires_openai_auth: bool,
}
```
Add `use serde::Serialize;` at the top of the file (combine with existing `use serde::Deserialize;`):
```rust
use serde::{Deserialize, Serialize};
```
- [ ] **Step 3: Add `to_serializable()` method to `SgClawSettings`**
In the `impl SgClawSettings` block, add:
```rust
fn to_serializable(&self) -> SerializableRawSgClawSettings {
let format_enum_value = |s: &str| s.to_string();
SerializableRawSgClawSettings {
api_key: self.provider_api_key.clone(),
base_url: self.provider_base_url.clone(),
model: self.provider_model.clone(),
skills_dir: self.skills_dir.as_ref().map(|p| p.to_string_lossy().into_owned()),
direct_submit_skill: self.direct_submit_skill.clone(),
skills_prompt_mode: Some(format_enum_value(match self.skills_prompt_mode {
SkillsPromptMode::Full => "full",
SkillsPromptMode::Compact => "compact",
})),
runtime_profile: Some(format_enum_value(match self.runtime_profile {
RuntimeProfile::BrowserAttached => "browser-attached",
RuntimeProfile::BrowserHeavy => "browser-heavy",
RuntimeProfile::GeneralAssistant => "general-assistant",
})),
planner_mode: Some(format_enum_value(match self.planner_mode {
PlannerMode::ZeroclawPlanFirst => "zeroclaw-plan-first",
PlannerMode::LegacyDeterministic => "legacy-deterministic",
})),
active_provider: Some(self.active_provider.clone()),
browser_backend: Some(format_enum_value(match self.browser_backend {
BrowserBackend::SuperRpa => "super-rpa",
BrowserBackend::AgentBrowser => "agent-browser",
BrowserBackend::RustNative => "rust-native",
BrowserBackend::ComputerUse => "computer-use",
BrowserBackend::Auto => "auto",
})),
office_backend: Some(format_enum_value(match self.office_backend {
OfficeBackend::OpenXml => "openxml",
OfficeBackend::Disabled => "disabled",
})),
browser_ws_url: self.browser_ws_url.clone(),
service_ws_listen_addr: self.service_ws_listen_addr.clone(),
providers: self
.providers
.iter()
.map(|p| SerializableProviderSettings {
id: p.id.clone(),
provider: Some(p.provider.clone()),
api_key: p.api_key.clone(),
base_url: p.base_url.clone(),
model: p.model.clone(),
api_path: p.api_path.clone(),
wire_api: p.wire_api.clone(),
requires_openai_auth: p.requires_openai_auth,
})
.collect(),
}
}
```
- [ ] **Step 4: Add `save_to_path()` method**
In the same `impl SgClawSettings` block, add:
```rust
pub fn save_to_path(&self, path: &Path) -> Result<(), ConfigError> {
let serializable = self.to_serializable();
let json = serde_json::to_string_pretty(&serializable)
.map_err(|err| ConfigError::ConfigParse(path.to_path_buf(), err.to_string()))?;
std::fs::write(path, json)
.map_err(|err| ConfigError::ConfigRead(path.to_path_buf(), err.to_string()))
}
```
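Note that `std::fs::write` replaces the file in place, so a crash mid-write can leave a truncated config. If that ever matters, a write-to-temp-then-rename variant is a common alternative; a std-only sketch (the path handling here is illustrative, not part of this plan):

```rust
use std::fs;
use std::io;
use std::path::Path;

// Write `json` to a sibling temp file, then rename it into place.
// Rename within one directory is atomic on Unix and replace-on-rename
// on Windows, so readers see either the old file or the new one.
fn write_config_atomically(path: &Path, json: &str) -> io::Result<()> {
    let tmp = path.with_extension("json.tmp");
    fs::write(&tmp, json)?;
    fs::rename(&tmp, path) // replaces the destination if it exists
}

fn main() -> io::Result<()> {
    let target = std::env::temp_dir().join("sgclaw_config_demo.json");
    write_config_atomically(&target, "{\"apiKey\":\"k\"}")?;
    println!("{}", fs::read_to_string(&target)?);
    fs::remove_file(&target)?;
    Ok(())
}
```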
- [ ] **Step 5: Run tests to verify compilation**
Run: `cargo test --lib config::settings`
Expected: PASS
### Task 4: Handle `UpdateConfig` in the service server
**Files:**
- Modify: `src/service/server.rs`
- Modify: `src/service/mod.rs` (if needed for imports)
- [ ] **Step 1: Add `UpdateConfig` match arm in `serve_client`**
In the `match message` block in `serve_client`, after the `SubmitTask` arm, add:
```rust
ClientMessage::UpdateConfig { config } => {
let Some(config_path) = context.config_path() else {
sink.send_service_message(ServiceMessage::ConfigUpdated {
success: false,
message: "未找到配置文件路径。请通过 --config-path 参数启动 sg_claw 后再使用此功能。".to_string(),
})?;
continue;
};
if !config_path.exists() {
sink.send_service_message(ServiceMessage::ConfigUpdated {
success: false,
message: format!("配置文件不存在: {}", config_path.display()),
})?;
continue;
}
let result = update_config_file(config_path, config);
match result {
Ok(()) => {
sink.send_service_message(ServiceMessage::ConfigUpdated {
success: true,
message: "配置已保存。重启 sg_claw 以应用新配置。".to_string(),
})?;
}
Err(err) => {
sink.send_service_message(ServiceMessage::ConfigUpdated {
success: false,
message: format!("保存配置失败: {}", err),
})?;
}
}
}
```
- [ ] **Step 2: Add `update_config_file` helper function**
Add this function above `serve_client` in `server.rs`:
```rust
use crate::config::settings::{ConfigError, SgClawSettings};
use crate::service::protocol::ConfigUpdatePayload;
use std::path::Path;
fn update_config_file(config_path: &Path, config: ConfigUpdatePayload) -> Result<(), String> {
let mut settings = SgClawSettings::load(Some(config_path))
.map_err(|e| e.to_string())?
.ok_or_else(|| "无法读取现有配置".to_string())?;
if let Some(v) = config.api_key {
settings.provider_api_key = v;
}
if let Some(v) = config.base_url {
settings.provider_base_url = v;
}
if let Some(v) = config.model {
settings.provider_model = v;
}
if let Some(v) = config.skills_dir {
settings.skills_dir = Some(PathBuf::from(&v));
}
if let Some(v) = config.direct_submit_skill {
settings.direct_submit_skill = Some(v);
}
if let Some(v) = config.runtime_profile {
settings.runtime_profile = match v.as_str() {
"browser-attached" => crate::config::settings::RuntimeProfile::BrowserAttached,
"browser-heavy" => crate::config::settings::RuntimeProfile::BrowserHeavy,
"general-assistant" => crate::config::settings::RuntimeProfile::GeneralAssistant,
_ => return Err(format!("无效的 runtimeProfile: {}", v)),
};
}
if let Some(v) = config.browser_backend {
settings.browser_backend = match v.as_str() {
"super-rpa" => crate::config::settings::BrowserBackend::SuperRpa,
"agent-browser" => crate::config::settings::BrowserBackend::AgentBrowser,
"rust-native" => crate::config::settings::BrowserBackend::RustNative,
"computer-use" => crate::config::settings::BrowserBackend::ComputerUse,
"auto" => crate::config::settings::BrowserBackend::Auto,
_ => return Err(format!("无效的 browserBackend: {}", v)),
};
}
settings
.save_to_path(config_path)
.map_err(|e| format!("写入配置文件失败: {}", e))
}
```
Add the import at the top of server.rs:
```rust
use std::path::PathBuf;
```
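The string values matched here must stay in sync with the ones emitted by `to_serializable()` in Task 3. A minimal std-only sketch of keeping the two directions paired (the enum and its names mirror this plan; the real types live in `crate::config::settings`):

```rust
// Mirror of the plan's RuntimeProfile. Keeping both directions of the
// mapping next to each other makes drift between the writer
// (to_serializable) and the parser (update_config_file) harder.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum RuntimeProfile {
    BrowserAttached,
    BrowserHeavy,
    GeneralAssistant,
}

impl RuntimeProfile {
    fn as_str(self) -> &'static str {
        match self {
            RuntimeProfile::BrowserAttached => "browser-attached",
            RuntimeProfile::BrowserHeavy => "browser-heavy",
            RuntimeProfile::GeneralAssistant => "general-assistant",
        }
    }

    fn parse(value: &str) -> Result<Self, String> {
        match value {
            "browser-attached" => Ok(RuntimeProfile::BrowserAttached),
            "browser-heavy" => Ok(RuntimeProfile::BrowserHeavy),
            "general-assistant" => Ok(RuntimeProfile::GeneralAssistant),
            other => Err(format!("无效的 runtimeProfile: {other}")),
        }
    }
}

fn main() {
    // Round-trip every variant to prove the two mappings agree.
    for profile in [
        RuntimeProfile::BrowserAttached,
        RuntimeProfile::BrowserHeavy,
        RuntimeProfile::GeneralAssistant,
    ] {
        assert_eq!(RuntimeProfile::parse(profile.as_str()), Ok(profile));
    }
    println!("round-trip ok");
}
```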
- [ ] **Step 3: Run tests to verify compilation**
Run: `cargo build`
Expected: SUCCESS
### Task 5: Add auto-connect and settings UI to the service console HTML
**Files:**
- Modify: `frontend/service-console/sg_claw_service_console.html`
- [ ] **Step 1: Add auto-connect on page load**
At the very end of the `<script>` section, after the existing event listeners and `updateUiState()`, add:
```javascript
// Auto-connect on page load
window.addEventListener("DOMContentLoaded", () => {
connectOrDisconnectService(true);
});
```
- [ ] **Step 2: Add Settings button HTML**
In the sidebar section of the HTML, after the connect button and before the "Composer" section label, add:
```html
<button id="settingsBtn" class="ghost-btn" style="margin-top: 8px;">⚙ 设置</button>
```
- [ ] **Step 3: Add Settings modal HTML**
Before the closing `</body>` tag, add the modal HTML:
```html
<!-- Settings Modal -->
<div id="settingsModal" style="display: none; position: fixed; top: 0; left: 0; width: 100%; height: 100%; background: rgba(0,0,0,0.5); z-index: 1000; align-items: center; justify-content: center;">
<div style="background: var(--panel); border-radius: 20px; padding: 28px; width: min(520px, 90%); max-height: 85vh; overflow-y: auto; box-shadow: var(--shadow);">
<h3 style="margin: 0 0 20px; font-size: 1.2rem;">sgClaw 配置</h3>
<div class="field">
<label for="settingApiKey">API 密钥 *</label>
<input id="settingApiKey" type="password" placeholder="输入模型 API 密钥" />
</div>
<div class="field">
<label for="settingBaseUrl">模型服务地址 *</label>
<input id="settingBaseUrl" type="url" placeholder="例如https://api.deepseek.com" />
</div>
<div class="field">
<label for="settingModel">模型名称 *</label>
<input id="settingModel" type="text" placeholder="例如deepseek-chat" />
</div>
<div class="field">
<label for="settingSkillsDir">Skills 目录路径</label>
<input id="settingSkillsDir" type="text" placeholder="例如D:/data/ideaSpace/rust/sgClaw/claw/claw/skills/skill_staging/skills" />
</div>
<div class="field">
<label for="settingDirectSubmitSkill">直接提交技能</label>
<input id="settingDirectSubmitSkill" type="text" placeholder="例如tq-lineloss-report.collect_lineloss" />
</div>
<div class="field">
<label for="settingRuntimeProfile">运行模式</label>
<select id="settingRuntimeProfile" style="width: 100%; border: 1px solid var(--line); border-radius: 16px; padding: 14px 16px; background: rgba(255, 255, 255, 0.92); color: var(--text); font: inherit;">
<option value="browser-attached">browser-attached</option>
<option value="browser-heavy">browser-heavy</option>
<option value="general-assistant">general-assistant</option>
</select>
</div>
<div class="field">
<label for="settingBrowserBackend">浏览器后端</label>
<select id="settingBrowserBackend" style="width: 100%; border: 1px solid var(--line); border-radius: 16px; padding: 14px 16px; background: rgba(255, 255, 255, 0.92); color: var(--text); font: inherit;">
<option value="super-rpa">super-rpa</option>
<option value="agent-browser">agent-browser</option>
<option value="rust-native">rust-native</option>
<option value="computer-use">computer-use</option>
<option value="auto">auto</option>
</select>
</div>
<div id="settingsValidation" style="color: var(--error); font-size: 0.92rem; min-height: 1.4em; margin: 10px 0;"></div>
<div style="display: flex; gap: 12px; margin-top: 16px;">
<button id="settingsSaveBtn" class="primary-btn" style="flex: 1;">保存</button>
<button id="settingsCancelBtn" class="ghost-btn" style="flex: 1;">取消</button>
</div>
</div>
</div>
```
- [ ] **Step 4: Add settings modal CSS**
Add these CSS rules inside the `<style>` block, before the `@media` query:
```css
/* Settings modal elements */
select {
width: 100%;
border: 1px solid var(--line);
border-radius: 16px;
padding: 14px 16px;
background: rgba(255, 255, 255, 0.92);
color: var(--text);
font: inherit;
outline: none;
cursor: pointer;
}
select:focus {
border-color: rgba(15, 118, 110, 0.5);
box-shadow: 0 0 0 4px rgba(15, 118, 110, 0.12);
}
```
- [ ] **Step 5: Add settings modal JavaScript logic**
Add this JavaScript at the end of the `<script>` section, before the closing `</script>` tag:
```javascript
// Settings modal state
const settingsElements = {
modal: document.getElementById("settingsModal"),
apiKey: document.getElementById("settingApiKey"),
baseUrl: document.getElementById("settingBaseUrl"),
model: document.getElementById("settingModel"),
skillsDir: document.getElementById("settingSkillsDir"),
directSubmitSkill: document.getElementById("settingDirectSubmitSkill"),
runtimeProfile: document.getElementById("settingRuntimeProfile"),
browserBackend: document.getElementById("settingBrowserBackend"),
validation: document.getElementById("settingsValidation"),
saveBtn: document.getElementById("settingsSaveBtn"),
cancelBtn: document.getElementById("settingsCancelBtn"),
};
let settingsOpenBtn = null; // will be set below
function openSettingsModal() {
// Pre-fill with current values from wsUrl field (for baseUrl hint)
settingsElements.apiKey.value = "";
settingsElements.baseUrl.value = "";
settingsElements.model.value = "";
settingsElements.skillsDir.value = "";
settingsElements.directSubmitSkill.value = "";
settingsElements.runtimeProfile.value = "browser-attached";
settingsElements.browserBackend.value = "super-rpa";
settingsElements.validation.textContent = "";
settingsElements.modal.style.display = "flex";
}
function closeSettingsModal() {
settingsElements.modal.style.display = "none";
}
function validateSettings() {
const apiKey = settingsElements.apiKey.value.trim();
const baseUrl = settingsElements.baseUrl.value.trim();
const model = settingsElements.model.value.trim();
if (!apiKey) {
return "API 密钥不能为空";
}
if (!model) {
return "模型名称不能为空";
}
if (!baseUrl) {
return "模型服务地址不能为空";
}
try {
new URL(baseUrl);
} catch {
return "模型服务地址格式无效,请输入有效的 URL";
}
return "";
}
function saveSettings() {
const error = validateSettings();
if (error) {
settingsElements.validation.textContent = error;
return;
}
if (!socket || socket.readyState !== WebSocket.OPEN) {
settingsElements.validation.textContent = "请先连接服务";
return;
}
settingsElements.validation.textContent = "";
settingsElements.saveBtn.disabled = true;
settingsElements.saveBtn.textContent = "保存中...";
const config = {
apiKey: settingsElements.apiKey.value.trim(),
baseUrl: settingsElements.baseUrl.value.trim(),
model: settingsElements.model.value.trim(),
};
const skillsDir = settingsElements.skillsDir.value.trim();
if (skillsDir) config.skillsDir = skillsDir;
const directSubmitSkill = settingsElements.directSubmitSkill.value.trim();
if (directSubmitSkill) config.directSubmitSkill = directSubmitSkill;
config.runtimeProfile = settingsElements.runtimeProfile.value;
config.browserBackend = settingsElements.browserBackend.value;
socket.send(JSON.stringify({
type: "update_config",
config,
}));
}
function handleConfigResponse(message) {
settingsElements.saveBtn.disabled = false;
settingsElements.saveBtn.textContent = "保存";
if (message.success) {
settingsElements.validation.textContent = message.message;
settingsElements.validation.style.color = "var(--success)";
// Auto-close after 2 seconds on success
setTimeout(closeSettingsModal, 2000);
} else {
settingsElements.validation.textContent = message.message;
settingsElements.validation.style.color = "var(--error)";
}
}
// Event listeners for settings
settingsOpenBtn = document.getElementById("settingsBtn");
settingsOpenBtn.addEventListener("click", openSettingsModal);
settingsElements.cancelBtn.addEventListener("click", closeSettingsModal);
settingsElements.saveBtn.addEventListener("click", saveSettings);
// Close modal on background click
settingsElements.modal.addEventListener("click", (e) => {
if (e.target === settingsElements.modal) {
closeSettingsModal();
}
});
```
- [ ] **Step 6: Handle `config_updated` message in `handleMessage`**
In the existing `handleMessage` function, add a new case in the switch statement:
```javascript
case "config_updated":
handleConfigResponse(message);
break;
```
- [ ] **Step 7: Verify the HTML is well-formed**
Open the file in a browser and visually check that:
- The settings button appears below the connect button
- Clicking it opens the modal
- The modal closes on Cancel or background click
### Task 6: Add protocol tests for new message types
**Files:**
- Modify: `tests/service_console_html_test.rs`
- Create: `tests/service_protocol_update_config_test.rs`
- [ ] **Step 1: Create protocol serialization test**
Create `tests/service_protocol_update_config_test.rs`:
```rust
use sgclaw::service::protocol::{ClientMessage, ConfigUpdatePayload, ServiceMessage};
#[test]
fn update_config_serializes_correctly() {
let config = ConfigUpdatePayload {
api_key: Some("test-key".to_string()),
base_url: Some("https://api.example.com".to_string()),
model: Some("test-model".to_string()),
skills_dir: Some("/path/to/skills".to_string()),
direct_submit_skill: Some("my-skill.my-tool".to_string()),
runtime_profile: Some("browser-attached".to_string()),
browser_backend: Some("super-rpa".to_string()),
};
let msg = ClientMessage::UpdateConfig { config };
let json = serde_json::to_string(&msg).unwrap();
assert!(json.contains("\"type\":\"update_config\""));
assert!(json.contains("\"apiKey\":\"test-key\""));
assert!(json.contains("\"baseUrl\":\"https://api.example.com\""));
assert!(json.contains("\"model\":\"test-model\""));
}
#[test]
fn update_config_deserializes_correctly() {
let json = r#"{
"type": "update_config",
"config": {
"apiKey": "key123",
"baseUrl": "https://api.test.com",
"model": "gpt-4"
}
}"#;
let msg: ClientMessage = serde_json::from_str(json).unwrap();
match msg {
ClientMessage::UpdateConfig { config } => {
assert_eq!(config.api_key, Some("key123".to_string()));
assert_eq!(config.base_url, Some("https://api.test.com".to_string()));
assert_eq!(config.model, Some("gpt-4".to_string()));
assert!(config.skills_dir.is_none());
}
_ => panic!("expected UpdateConfig variant"),
}
}
#[test]
fn config_updated_serializes_correctly() {
let msg = ServiceMessage::ConfigUpdated {
success: true,
message: "配置已保存".to_string(),
};
let json = serde_json::to_string(&msg).unwrap();
assert!(json.contains("\"type\":\"config_updated\""));
assert!(json.contains("\"success\":true"));
assert!(json.contains("配置已保存"));
}
#[test]
fn config_updated_deserializes_correctly() {
let json = r#"{"type":"config_updated","success":false,"message":"保存失败"}"#;
let msg: ServiceMessage = serde_json::from_str(json).unwrap();
match msg {
ServiceMessage::ConfigUpdated { success, message } => {
assert!(!success);
assert_eq!(message, "保存失败");
}
_ => panic!("expected ConfigUpdated variant"),
}
}
```
- [ ] **Step 2: Update service console HTML test**
Add to `tests/service_console_html_test.rs`, at the end of the existing test:
```rust
// New enhancement assertions
assert!(source.contains("DOMContentLoaded"));
assert!(source.contains("settingsBtn"));
assert!(source.contains("settingsModal"));
assert!(source.contains("update_config"));
assert!(source.contains("config_updated"));
assert!(source.contains("settingApiKey"));
assert!(source.contains("settingBaseUrl"));
assert!(source.contains("settingModel"));
```
- [ ] **Step 3: Run all new tests**
Run: `cargo test --test service_protocol_update_config_test`
Run: `cargo test --test service_console_html_test`
Expected: All PASS
### Task 7: Full build and test verification
- [ ] **Step 1: Run full test suite**
Run: `cargo test 2>&1`
Expected: All tests pass (except pre-existing `lineloss_period_resolver_prompts_for_missing_period` which was already failing before our changes)
- [ ] **Step 2: Build release binary**
Run: `cargo build --release 2>&1`
Expected: SUCCESS
### Task 8: Manual smoke test instructions
After implementation, verify manually:
1. Start sg_claw with config path: `sg_claw.exe --config-path sgclaw_config.json`
2. Open `sg_claw_service_console.html` in browser
3. Verify: Page auto-connects (should show "已连接" within a few seconds)
4. Click "设置" button
5. Fill in API Key, Base URL, Model
6. Click "保存"
7. Verify: Modal shows "配置已保存。重启 sg_claw 以应用新配置。" and auto-closes after 2 seconds
8. Verify: `sgclaw_config.json` file contains the new values
9. Verify: Existing task submission still works (send a test instruction)


@@ -0,0 +1,69 @@
# Async Browser Script Support Design
## Problem
`buildBrowserEntrypointResult` in `collect_lineloss.js` is an async function, but the execution code generated by `build_eval_js` runs synchronously, so the returned Promise is serialized by `JSON.stringify` as `{}`.
**Observed log output:**
```
[execute_browser_script_impl] 返回成功, payload 长度: 4
```
The payload is `{}` (4 characters) instead of the actual report data.
## Root Cause
The `build_eval_js` function in `callback_backend.rs`:
```javascript
var v=(function(){return {script}})(); // executes synchronously
var t=(typeof v==='string')?v:JSON.stringify(v); // Promise -> "{}"
```
When the script returns a Promise, `JSON.stringify(Promise)` yields `{}`.
## Solution
Modify `build_eval_js` to support Promises:
1. `await` the script's result
2. Detect whether the result is a Promise; if so, wait for it to resolve
3. Stay backward compatible with synchronous scripts
## Implementation Details
Modify the `build_eval_js` function in `src/browser/callback_backend.rs`:
```javascript
(async function(){
  try {
    var v = await (function(){return {script}})();
    // wait for the Promise to resolve (defensive; the await above
    // already unwraps thenables)
    if (v && typeof v.then === 'function') {
      v = await v;
    }
    var t = (typeof v === 'string') ? v : JSON.stringify(v);
    // ... callback logic unchanged
  } catch(e) {}
})()
```
Key points:
- Wrap the whole IIFE as async
- `await` the script's execution
- Detect Promise-like objects (`typeof v.then === 'function'`) as a defensive extra check
- Backward compatible: synchronous scripts return their value directly, while async scripts return a Promise that is awaited
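A sketch of how the Rust side might emit this wrapper (the function name comes from this doc; the real `build_eval_js` also interpolates callback plumbing, which is elided here):

```rust
// Sketch: wrap a user script in an async IIFE so that a returned
// Promise is awaited before JSON.stringify runs. The callback-sending
// logic of the real build_eval_js is elided as a comment.
fn build_eval_js(script: &str) -> String {
    format!(
        "(async function(){{try{{\
         var v=await (function(){{return {script}}})();\
         if(v&&typeof v.then==='function'){{v=await v;}}\
         var t=(typeof v==='string')?v:JSON.stringify(v);\
         /* ... send t back via the callback channel ... */\
         }}catch(e){{}}}})()"
    )
}

fn main() {
    let js = build_eval_js("fetchReport()");
    assert!(js.starts_with("(async function()"));
    assert!(js.contains("await (function(){return fetchReport()})()"));
    println!("{js}");
}
```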
## Scope of Impact
- `src/browser/callback_backend.rs`: modify the `build_eval_js` function
- All `browser_script`-type skills gain async support automatically
## Verification
1. Run `cargo test` to confirm the existing tests pass
2. End-to-end test: `tq-lineloss-report.collect_lineloss` returns real data instead of `{}`
3. Verify that synchronous scripts (e.g. the Zhihu hot list) still work
## Out of Scope
- `wrap_browser_script` is not modified (that is Option C's approach)
- The skill scripts themselves are not modified


@@ -0,0 +1,47 @@
# Fix build_eval_js Async Support + validatePageContext Diagnostic Logging
## Problem Description
1. `buildBrowserEntrypointResult` in `collect_lineloss.js` is an async function and returns a Promise
2. With the current synchronous `build_eval_js`, `JSON.stringify(Promise)` = `"{}"`
3. The earlier async-IIFE approach caused `page_context_unavailable` (root cause still under investigation)
## Approach
### Change 1: build_eval_js uses a .then() branch
File: `src/browser/callback_backend.rs`, `build_eval_js` function
Logic:
1. Keep the outer IIFE synchronous (compatible with the C++ injection layer)
2. Extract the callback-sending logic into an `_s` function
3. If the return value is a Promise (has a `.then` method), wait for the result asynchronously via `.then(_s)`
4. Otherwise call `_s(v)` synchronously
```javascript
(function(){try{
var v=(function(){return {script}})();
function _s(v){
var t=(typeof v==='string')?v:JSON.stringify(v);
try{callBackJsToCpp(...);}catch(_){}
var j=JSON.stringify({...});
try{XHR...}catch(_){}
try{sendBeacon...}catch(_){}
}
if(v&&typeof v.then==='function'){v.then(_s).catch(function(){});}
else{_s(v);}
}catch(e){}})()
```
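The same generator idea, revised per this doc so the outer IIFE stays synchronous (a sketch; the `_s` body plumbing is elided as in the pseudocode above):

```rust
// Sketch: synchronous outer IIFE; async results are handled via a
// .then() branch instead of making the whole wrapper async.
fn build_eval_js(script: &str) -> String {
    format!(
        "(function(){{try{{\
         var v=(function(){{return {script}}})();\
         function _s(v){{var t=(typeof v==='string')?v:JSON.stringify(v);\
         /* callBackJsToCpp + XHR + sendBeacon plumbing elided */}}\
         if(v&&typeof v.then==='function'){{v.then(_s).catch(function(){{}});}}\
         else{{_s(v);}}\
         }}catch(e){{}}}})()"
    )
}

fn main() {
    let js = build_eval_js("fetchReport()");
    // The injection layer sees a plain synchronous IIFE...
    assert!(js.starts_with("(function()"));
    assert!(!js.contains("async"));
    // ...while Promise results still get a resolution path.
    assert!(js.contains("v.then(_s)"));
    println!("{js}");
}
```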
### Change 2: Add diagnostic logging to validatePageContext
File: `D:\data\ideaSpace\rust\sgClaw\claw\claw\skills\skill_staging\skills\tq-lineloss-report\scripts\collect_lineloss.js`
Add a console.log at each check point in `validatePageContext`, recording the host, expected_domain, and mac status.
## Verification
1. `cargo test` passes
2. After compiling, copy the exe to the live environment
3. Execute the skill and confirm it no longer returns `{}`
4. If `page_context_unavailable` appears, check the browser console logs


@@ -0,0 +1,55 @@
# Fix Missing expected_domain Argument in the Browser Script Skill Tool
## Problem Description
Running the `tq-lineloss-report.collect_lineloss` skill returns a `status=blocked row=0 reasons=missing_expected_domain` error.
## Root Cause
In the `execute_browser_script_impl` function in `src/compat/browser_script_skill_tool.rs`:
```rust
// line 183: expected_domain is removed from args
let raw_expected_domain = match args.remove("expected_domain") {
    Some(Value::String(value)) if !value.trim().is_empty() => value,
    // ...
};
// line 200: normalize the domain (strip scheme, port, etc.)
let expected_domain = match normalize_domain_like(&raw_expected_domain) {
    Some(value) => value,
    // ...
};
// line 234: by the time the script is wrapped, args no longer contains expected_domain!
let wrapped_script = wrap_browser_script(&script_body, &Value::Object(args.clone()));
```
`args.remove()` deletes the key-value pair from the HashMap, so the args later passed to `wrap_browser_script()` no longer contain `expected_domain`, and the `const args = {...}` object in the JS script is missing that field.
## Solution
After normalizing the domain, re-insert `expected_domain` into args.
### Change Location
File: `src/compat/browser_script_skill_tool.rs`
Position: after line 209 (after the `expected_domain` assignment and before the `for required_arg` loop)
### Change
```rust
// add after line 209:
args.insert("expected_domain".to_string(), Value::String(expected_domain.clone()));
```
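The remove-then-re-insert dance can be illustrated with a plain `HashMap` (std only; the real code works on `serde_json` maps, and `normalize_domain_like` is stubbed here as a hypothetical scheme-stripping simplification):

```rust
use std::collections::HashMap;

// Hypothetical stand-in for normalize_domain_like: strip a scheme
// prefix and a trailing slash. The real normalization differs.
fn normalize_domain_like(raw: &str) -> Option<String> {
    let trimmed = raw.trim();
    if trimmed.is_empty() {
        return None;
    }
    let stripped = trimmed
        .strip_prefix("https://")
        .or_else(|| trimmed.strip_prefix("http://"))
        .unwrap_or(trimmed);
    Some(stripped.trim_end_matches('/').to_string())
}

fn main() {
    let mut args: HashMap<String, String> = HashMap::new();
    args.insert("expected_domain".into(), "http://20.76.57.61:18080".into());

    // Bug shape: remove() takes the entry out of the map entirely.
    let raw = args.remove("expected_domain").unwrap();
    assert!(!args.contains_key("expected_domain"));

    // Fix: put the normalized value back before the args are wrapped.
    let expected_domain = normalize_domain_like(&raw).unwrap();
    args.insert("expected_domain".into(), expected_domain.clone());
    assert_eq!(args["expected_domain"], "20.76.57.61:18080");
    println!("{expected_domain}");
}
```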
## Scope of Impact
- Only `browser_script_skill_tool.rs` is affected
- Every browser_script skill that uses `expected_domain` benefits
- No breaking changes
## Verification
1. Run the existing tests: `cargo test browser_script_skill_tool`
2. Intranet verification: run the `tq-lineloss-report.collect_lineloss` skill


@@ -0,0 +1,48 @@
# District Line-Loss (台区线损) Skill: requesturl Quick Fix
## Background
The `sgHideBrowerserOpenPage` command requires a `requesturl` parameter (the URL of the page initiating the call), but resolving the current line-loss instruction yields `about:blank`, so the browser does not execute the command.
The Zhihu hot-list scenario works because `derive_request_url_from_instruction` returns `https://www.zhihu.com`.
## Design
**Approach: add a line-loss URL mapping to `derive_request_url_from_instruction`**
### Change Location
`src/service/server.rs`, the `derive_request_url_from_instruction` function
### Change
```rust
fn derive_request_url_from_instruction(instruction: &str) -> Option<String> {
    // existing: Zhihu-related (unchanged)
    if crate::compat::workflow_executor::detect_route(instruction, None, None)
        .is_some_and(|route| { ... })
    {
        return Some("https://www.zhihu.com".to_string());
    }
    // new: line-loss-related
    // TODO: temporary; should later come from the skill config or the
    // deterministic_submit resolution result
    if instruction.contains("线损") || instruction.contains("lineloss") {
        return Some("http://20.76.57.61:18080".to_string());
    }
    None
}
```
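A simplified, std-only version of the new branch can be exercised on its own (the Zhihu `detect_route` check is omitted here because it depends on crate internals):

```rust
// Simplified: only the new line-loss keyword branch from this plan.
fn derive_request_url_from_instruction(instruction: &str) -> Option<String> {
    // TODO (as in the plan): temporary hardcoded mapping.
    if instruction.contains("线损") || instruction.contains("lineloss") {
        return Some("http://20.76.57.61:18080".to_string());
    }
    None
}

fn main() {
    assert_eq!(
        derive_request_url_from_instruction("统计台区线损"),
        Some("http://20.76.57.61:18080".to_string())
    );
    assert_eq!(
        derive_request_url_from_instruction("collect lineloss report"),
        Some("http://20.76.57.61:18080".to_string())
    );
    // Unrelated instructions fall through (the Zhihu branch is omitted here).
    assert_eq!(derive_request_url_from_instruction("open the dashboard"), None);
    println!("ok");
}
```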
### Constraints
- The URL is hardcoded and should later be refactored into a generic mechanism
- Only matches instructions containing "线损" or "lineloss"
## Follow-Up Plan
A generic mechanism will follow:
- Build the full URL from `DeterministicExecutionPlan.expected_domain`
- Or read the target URL from the skill's config file
- Reorder the flow to resolve the skill before opening the helper page


@@ -0,0 +1,36 @@
# District Line-Loss (台区线损) Skill: Missing target_url Fix
## Background
When `browser_script_skill_tool.rs` invokes `Action::Eval`, it passes only the `script` parameter, without `target_url`. The `target_url` method in `callback_backend.rs` needs a value from either the params or `current_target_url`; when both are missing, it returns an error.
The Zhihu hot-list case works because an `Action::Navigate` runs first and sets `current_target_url`.
## Design
**Approach: add `target_url` to the params in `browser_script_skill_tool.rs`**
### Change Location
`src/compat/browser_script_skill_tool.rs`, the `execute_browser_script_impl` function
### Change
When calling `browser_tool.invoke(Action::Eval, ...)`, build the full URL from `expected_domain` and add it to the params:
```rust
let target_url = format!("http://{}", expected_domain);
let result = match browser_tool.invoke(
    Action::Eval,
    json!({
        "script": wrapped_script,
        "target_url": target_url,
    }),
    &expected_domain,
) {
```
### Constraints
- Uses the `http://` scheme prefix
- `expected_domain` may include a port (e.g. `20.76.57.61:18080`); direct concatenation works
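The concatenation above, sketched with a port-bearing domain to confirm the constraint (std only; `build_target_url` is an illustrative name, not a function in the codebase):

```rust
// expected_domain may already carry a port; format! just prefixes the scheme.
fn build_target_url(expected_domain: &str) -> String {
    format!("http://{expected_domain}")
}

fn main() {
    // A port in the domain is passed through untouched.
    assert_eq!(build_target_url("20.76.57.61:18080"), "http://20.76.57.61:18080");
    assert_eq!(build_target_url("www.zhihu.com"), "http://www.zhihu.com");
    println!("ok");
}
```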


@@ -0,0 +1,84 @@
# Remove mac Guard from validatePageContext
## Date
2026-04-13
## Problem
`tq-lineloss-report` skill execution reports `status=blocked rows=0 reasons=page_context_unavailable`.
Diagnostic instrumentation confirmed:
```
href=http://20.76.57.61:18080/gsllys
host=20.76.57.61
port=18080
title=台区线损大数据分析模块
mac=false
```
The script executes on the correct domain but `globalThis.mac` does not exist, triggering the `page_context_unavailable` guard.
## Root Cause
`window.mac` is a Vue instance created by the **original scene page** (`index.html`), assigned via `window.mac = this` in `mounted()`. The original scene page acts as a controller that injects JS into the business page via `BrowserAction('sgBrowserExcuteJsCode', exactURL, jsCode)`.
In the skill execution model, there is no scene page. The script is injected directly via `sgBrowserExcuteJsCodeByDomain` onto a page matching the domain. No Vue instance is created, so `globalThis.mac` is always `undefined`. The `mac` check is architecturally invalid for the skill model.
Additionally, `sgBrowserExcuteJsCodeByDomain("20.76.57.61")` matches the parent frame page (`/gsllys`) rather than the business sub-page (`/gsllys/tqLinelossStatis/tqQualifyRateMonitor`). This is acceptable because the skill script makes direct HTTP requests with absolute URLs and does not depend on page-local state.
## Design
Remove the `globalThis.mac` existence check from `validatePageContext` in `collect_lineloss.js`. Retain the `host` matching check as a basic domain guard.
Also clean up the temporary diagnostic code (`diag` variable, `console.log` statements, enriched reason strings) added during debugging.
### Before
```javascript
validatePageContext(args) {
const host = normalizeText(globalThis.location?.hostname);
const port = normalizeText(globalThis.location?.port);
const href = normalizeText(globalThis.location?.href);
const title = normalizeText(globalThis.document?.title);
const expected = normalizeText(args.expected_domain);
const hasMac = !!globalThis.mac;
const diag = 'href=' + href + '|host=' + host + '|port=' + port + '|title=' + title + '|mac=' + hasMac;
console.log('[validatePageContext] ' + diag);
if (!host) {
return { ok: false, reason: 'page_context_unavailable:host_empty|' + diag };
}
if (host !== expected) {
return { ok: false, reason: 'page_context_mismatch:host=' + host + ',expected=' + expected + '|' + diag };
}
if (!hasMac) {
return { ok: false, reason: 'page_context_unavailable:mac_missing|' + diag };
}
return { ok: true };
},
```
### After
```javascript
validatePageContext(args) {
const host = normalizeText(globalThis.location?.hostname);
const expected = normalizeText(args.expected_domain);
if (!host) {
return { ok: false, reason: 'page_context_unavailable' };
}
if (host !== expected) {
return { ok: false, reason: 'page_context_mismatch' };
}
return { ok: true };
},
```
## Files Changed
- `claw/claw/skills/skill_staging/skills/tq-lineloss-report/scripts/collect_lineloss.js` — `validatePageContext` function only
## No Recompilation Required
The JS file is read at runtime via `fs::read_to_string`. No Rust code changes.

View File

@@ -0,0 +1,111 @@
# Rust-Side Lineloss XLSX Export
## Problem
`collect_lineloss.js` runs on a remote page (`http://20.76.57.61:18080/gsllys`).
The script successfully queries API data (12 rows), but cannot call
`http://localhost:13313/.../faultDetailsExportXLSX` because the browser blocks
cross-origin requests from a remote page to `localhost`.
The original scene architecture had a local scene page acting as a proxy,
but skill mode has no local page -- so export is architecturally impossible
from the browser side.
## Decision
Move XLSX generation to the Rust side. JS only collects data; Rust generates
the `.xlsx` file locally after receiving the artifact.
Report log (`setReportLog`) is deferred to a later iteration.
## Design
### JS Changes (`collect_lineloss.js`)
1. Remove `exportWorkbook()` call and `writeReportLog()` call
2. Return artifact with `rows` array and `column_defs` array
3. Status is `ok` when rows > 0, `empty` when rows == 0, `error`/`blocked` unchanged
Artifact shape:
```json
{
"type": "report-artifact",
"report_name": "tq-lineloss-report",
"status": "ok",
"org": { "label": "...", "code": "..." },
"period": { "mode": "month", "value": "2026-03" },
"column_defs": [["ORG_NAME","供电单位"], ["YGDL","累计供电量"], ...],
"rows": [
{"ORG_NAME":"xxx", "YGDL":"12345.67", ...}
],
"counts": { "rows": 12 }
}
```
### Rust Changes
#### New file: `src/compat/lineloss_xlsx_export.rs`
Generates a standard `.xlsx` file using `zip` crate + OpenXML XML strings.
Follows the pattern established in `openxml_office_tool.rs`.
Public API:
```rust
pub struct LinelossExportRequest {
pub column_defs: Vec<(String, String)>, // (key, chinese_header)
pub rows: Vec<Map<String, Value>>,
pub sheet_name: String,
pub output_path: PathBuf,
}
pub fn export_lineloss_xlsx(request: &LinelossExportRequest) -> anyhow::Result<PathBuf>;
```
Internals:
- Build header row from `column_defs[*].1` (Chinese names)
- Build data rows by looking up `column_defs[*].0` keys in each row map
- Generate `worksheet_xml` with inline string cells
- Package with standard OpenXML boilerplate (content_types, rels, workbook)
- Write to `output_path`
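A stdlib-only sketch of the inline-string worksheet step (function names and shapes here are assumptions based on the internals above; the real module would also add the OpenXML packaging via the `zip` crate):

```rust
use std::collections::BTreeMap;

// Minimal XML text escaping for cell values.
fn xml_escape(s: &str) -> String {
    s.replace('&', "&amp;").replace('<', "&lt;").replace('>', "&gt;")
}

// Build sheetData with a header row from column_defs[*].1, then one row
// per record, looking up column_defs[*].0 in each row map.
fn build_worksheet_xml(
    column_defs: &[(String, String)],
    rows: &[BTreeMap<String, String>],
) -> String {
    let cell = |v: &str| format!("<c t=\"inlineStr\"><is><t>{}</t></is></c>", xml_escape(v));
    let mut xml = String::from(
        "<worksheet xmlns=\"http://schemas.openxmlformats.org/spreadsheetml/2006/main\"><sheetData>",
    );
    xml.push_str("<row>");
    for (_, header) in column_defs {
        xml.push_str(&cell(header));
    }
    xml.push_str("</row>");
    for row in rows {
        xml.push_str("<row>");
        for (key, _) in column_defs {
            xml.push_str(&cell(row.get(key).map(String::as_str).unwrap_or("")));
        }
        xml.push_str("</row>");
    }
    xml.push_str("</sheetData></worksheet>");
    xml
}

fn main() {
    let defs = vec![
        ("ORG_NAME".to_string(), "供电单位".to_string()),
        ("YGDL".to_string(), "累计供电量".to_string()),
    ];
    let mut row = BTreeMap::new();
    row.insert("ORG_NAME".to_string(), "某供电所".to_string());
    row.insert("YGDL".to_string(), "12345.67".to_string());
    let xml = build_worksheet_xml(&defs, &[row]);
    assert!(xml.contains("<t>供电单位</t>"));
    assert!(xml.contains("<t>12345.67</t>"));
    println!("worksheet xml built");
}
```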
#### Modified: `src/compat/deterministic_submit.rs`
In `execute_deterministic_submit_with_browser_backend` (and the non-backend variant):
```
let output = execute_browser_script_skill_raw_output_with_browser_backend(...)?;
let artifact = parse_lineloss_artifact(&output);
if artifact has rows > 0 && column_defs present:
let export_path = workspace_root/out/tq-lineloss-{timestamp}.xlsx
export_lineloss_xlsx(LinelossExportRequest { ... })?
// attach export_path to outcome summary
Ok(summarize_lineloss_output_with_export(&output, export_path))
```
#### Modified: `src/compat/mod.rs`
Add `pub mod lineloss_xlsx_export;`
### Output Path
`{workspace_root}/out/tq-lineloss-{org_label}-{period}-{timestamp_nanos}.xlsx`
### Error Handling
- XLSX generation failure: outcome status = `partial`, reason = `xlsx_export_failed`
- Artifact parse failure: fall through to existing `summarize_lineloss_output`
## Files Changed
| File | Change Type |
|------|-------------|
| `collect_lineloss.js` | Modify: remove export/log calls, add rows+column_defs to artifact |
| `src/compat/lineloss_xlsx_export.rs` | New: XLSX generation |
| `src/compat/deterministic_submit.rs` | Modify: post-process artifact, call XLSX export |
| `src/compat/mod.rs` | Modify: register new module |
## Requires Recompilation
Yes. Rust code changes require `cargo build`.

View File

@@ -0,0 +1,55 @@
# Helper Page Lifecycle Fix v2 — Same-Connection Close + Open
**Date:** 2026-04-14
**Status:** Approved
## Problem
Two issues remain after v1:
1. **Process restart leaves orphaned helper pages**: When the sg_claw process restarts, the old helper page tab remains open in the browser. The new process opens another one.
2. **Helper page is visible**: Uses `sgBrowerserOpenPage` (visible tab API) instead of `sgHideBrowerserOpenPage` (hidden domain API).
## Root Cause of v1 Failure
The v1 `close_helper_page` function created a **second** WebSocket connection to the browser during `Drop`. This likely conflicted with the existing bootstrap connection, causing the browser's WebSocket state to become confused.
## Solution
Send the close command on the **same** WebSocket connection used for bootstrap, before sending the open command:
1. Connect to browser WS
2. Register as "web" role
3. **Blindly send** `sgHideBrowerserClosePage(helper_url)` — closes any orphaned page from a previous process run
4. Send `sgHideBrowerserOpenPage(helper_url)` — opens the new helper page
5. Poll `/sgclaw/callback/ready` for page readiness
Both `use_hidden_domain = true` and the close+open logic are combined into a single change.
## Why This Works
- **Same connection**: Only one WebSocket connection to the browser. No conflict with existing connections.
- **Best-effort close**: If no orphaned page exists (first run ever), the close command is silently ignored by the browser. This does not affect the subsequent open command.
- **Fire-and-forget**: Both close and open commands use the same fire-and-forget semantics as the existing bootstrap command.
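Under the wire format in the API reference, the two frames sent on the bootstrap connection could be built like this (a hand-rolled sketch, not the actual `callback_host.rs` code; the helper URL is hypothetical and assumed to need no JSON escaping):

```rust
// Build a browser command frame: a JSON array [requesturl, api, url].
fn frame(request_url: &str, api: &str, url: &str) -> String {
    format!(r#"["{request_url}","{api}","{url}"]"#)
}

fn main() {
    let helper = "http://127.0.0.1:8099/sgclaw/helper"; // hypothetical helper URL
    // Best-effort close of any orphaned page, then open the new one,
    // both sent on the same WebSocket connection.
    let close = frame(helper, "sgHideBrowerserClosePage", helper);
    let open = frame(helper, "sgHideBrowerserOpenPage", helper);
    println!("{close}");
    println!("{open}");
}
```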
## API Reference
| API | Wire format | Effect |
|-----|------------|--------|
| `sgHideBrowerserOpenPage` (API #6) | `[requesturl, "sgHideBrowerserOpenPage", url]` | Opens in hidden domain |
| `sgHideBrowerserClosePage` (API #68) | `[requesturl, "sgHideBrowerserClosePage", url]` | Closes hidden domain page |
## Affected Files
| File | Change |
|------|--------|
| `src/browser/callback_host.rs` | In `bootstrap_helper_page`: add close command before open command |
| `src/service/server.rs` | Change `use_hidden_domain` from `false` to `true` |
## What Does NOT Change
- `callback_backend.rs` — `SHOW_AREA` and `build_command` unchanged
- `sgBrowserExcuteJsCodeByDomain` area parameter — stays `"show"`
- Helper page HTML content — unchanged
- `Drop for LiveBrowserCallbackHost` — remains simple (shutdown only, no close attempt)
- `cached_host` in `mod.rs` — remains lifted to outer loop

View File

@@ -0,0 +1,99 @@
# Helper Page Lifecycle Fix & Hidden Domain Support
**Date:** 2026-04-14
**Status:** Approved
## Problem Statement
Two bugs in the browser-helper.html page management:
1. **Duplicate helper pages**: Every WebSocket client reconnection triggers a new `serve_client()` call, which creates a new `LiveBrowserCallbackHost` and opens a new helper page via `sgBrowerserOpenPage`. The old helper page tab is never closed, causing accumulation of orphaned tabs.
2. **Helper page is visible**: The bootstrap uses `sgBrowerserOpenPage` (visible tab API) instead of `sgHideBrowerserOpenPage` (hidden domain API). The helper page should not be visible to the user.
## Root Cause Analysis
### Duplicate pages
Call chain:
- `src/service/mod.rs:72` — outer `loop` accepts new WebSocket connections
- `src/service/mod.rs:79` — each connection calls `serve_client()`
- `src/service/server.rs:241` — `cached_host` declared as a local variable, re-initialized to `None` on each call
- `src/service/server.rs:288` → `callback_host.rs:241` — `bootstrap_helper_page()` opens a new helper tab
`Drop for LiveBrowserCallbackHost` (`callback_host.rs:321-328`) only shuts down the HTTP server thread. It does not send a browser close command for the helper tab.
### Visible page
`callback_host.rs:28`: `HELPER_BOOTSTRAP_ACTION = "sgBrowerserOpenPage"` — this is the visible-domain open API (API #7). The hidden-domain equivalent is `sgHideBrowerserOpenPage` (API #6).
## Solution: Approach C — Incremental Fix
### Step 1: Fix lifecycle (immediate, deterministic fix)
#### 1a. Lift `cached_host` to outer loop
Move `cached_host: Option<Arc<LiveBrowserCallbackHost>>` from inside `serve_client()` to before the `loop` in `run_service()` (`mod.rs`). Change `serve_client()` signature to accept `&mut Option<Arc<LiveBrowserCallbackHost>>` instead of creating its own.
Effect: Multiple WebSocket reconnections share the same host. Helper page opens once per process lifetime.
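The lifted ownership can be sketched as follows (signatures and the stand-in type are assumptions based on the description above):

```rust
use std::sync::Arc;

// Stand-in for the real LiveBrowserCallbackHost (fields omitted).
struct LiveBrowserCallbackHost;

// serve_client no longer owns the cache; it borrows the one that the
// outer accept loop in run_service keeps alive across reconnections.
fn serve_client(
    cached_host: &mut Option<Arc<LiveBrowserCallbackHost>>,
) -> Arc<LiveBrowserCallbackHost> {
    cached_host
        .get_or_insert_with(|| Arc::new(LiveBrowserCallbackHost)) // bootstrap at most once
        .clone()
}

fn main() {
    let mut cached_host: Option<Arc<LiveBrowserCallbackHost>> = None;
    let first = serve_client(&mut cached_host); // first connection: host created
    let second = serve_client(&mut cached_host); // reconnection: same host reused
    assert!(Arc::ptr_eq(&first, &second));
    println!("helper page bootstrapped once per process");
}
```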
#### 1b. Close helper page on Drop
Enhance `Drop for LiveBrowserCallbackHost`:
- Add `browser_ws_url: String` field to `LiveBrowserCallbackHost` (stored at construction time)
- Add `use_hidden_domain: bool` field (stored at construction time)
- In `Drop::drop`, before shutting down the server thread:
1. Connect to `browser_ws_url` with 100ms connection timeout
2. Send register message
3. Send close command: `[helper_url, close_api, helper_url]`
- `close_api` = `"sgBrowserClosePage"` when `use_hidden_domain == false`
- `close_api` = `"sgHideBrowerserClosePage"` when `use_hidden_domain == true`
4. All steps are best-effort: failures are silently ignored
5. Total timeout cap: 500ms
### Step 2: Hidden domain config switch (for testing/gradual rollout)
#### 2a. Parameter plumbing
- `LiveBrowserCallbackHost::start_with_browser_ws_url` gains parameter `use_hidden_domain: bool`
- `bootstrap_helper_page` selects API based on this flag:
- `true` → `"sgHideBrowerserOpenPage"`
- `false` → `"sgBrowerserOpenPage"` (current behavior, default)
- `LiveBrowserCallbackHost` stores the flag for Drop close-command selection
#### 2b. Caller changes
- `mod.rs` / `server.rs` pass `false` as default
- To enable hidden domain, change the call site to pass `true`
## What Does NOT Change
- `callback_backend.rs` `SHOW_AREA = "show"` — JS injection targets visible business pages, not the helper itself
- `sgBrowserExcuteJsCodeByDomain` area parameter — stays `"show"` regardless of helper domain
- Helper page HTML content — WebSocket connection and command polling JS remain the same
- `collect_lineloss.js` — not affected
## Affected Files
| File | Change |
|------|--------|
| `src/browser/callback_host.rs` | New fields on `LiveBrowserCallbackHost`, `start_with_browser_ws_url` signature change, `Drop` enhancement, new `close_helper_page` helper fn |
| `src/service/mod.rs` | `cached_host` lifted to outer loop, passed to `serve_client` |
| `src/service/server.rs` | `serve_client` signature change to accept `&mut Option<Arc<LiveBrowserCallbackHost>>` |
| Existing test files | Adapt `start_with_browser_ws_url` calls with new `use_hidden_domain` parameter |
## Testing
- Existing `callback_host` tests: adapt to new signature (add `false` parameter)
- New unit test: `use_hidden_domain = true` → bootstrap sends `sgHideBrowerserOpenPage`
- New unit test: `use_hidden_domain = false` → bootstrap sends `sgBrowerserOpenPage` (regression)
- `cargo build` + `cargo test` full verification
## Browser API Reference
| API | Wire format | Effect |
|-----|------------|--------|
| `sgBrowerserOpenPage` (API #7) | `[requesturl, "sgBrowerserOpenPage", url]` | Opens visible tab |
| `sgHideBrowerserOpenPage` (API #6) | `[requesturl, "sgHideBrowerserOpenPage", url]` | Opens in hidden domain |
| `sgBrowserClosePage` (API #64) | `[requesturl, "sgBrowserClosePage", url]` | Closes visible tab |
| `sgHideBrowerserClosePage` (API #68) | `[requesturl, "sgHideBrowerserClosePage", url]` | Closes hidden domain page |

View File

@@ -0,0 +1,284 @@
# sgClaw Service Console Enhancement Design
## Background
The current `sg_claw_service_console.html` provides a basic UI for connecting to the sgClaw service WebSocket and submitting tasks. However, it requires manual connection on first load and has no way to configure the sgClaw settings (API key, model, base URL, skills directory) from the UI.
Users need to manually edit `sgclaw_config.json` before using the console, which is inconvenient for routine operations.
## Problem Statement
1. Page requires manual "Connect" button click on first load
2. No UI for configuring sgClaw runtime settings (model, API key, base URL, skills dir)
3. Users must manually edit `sgclaw_config.json` file to change configuration
## Goal
Enhance the service console page with:
1. **Auto-connect on page load** - attempt WebSocket connection immediately
2. **Settings panel** - edit sgClaw configuration fields through a friendly UI
3. **Config save via WebSocket** - send configuration updates to the running sgClaw service, which writes them to `sgclaw_config.json`
## Non-goals
- Auto-starting `sg_claw.exe` process (browser security limitation, deferred)
- Changing existing `submit_task` protocol or execution flow
- Modifying browser-helper.html or browser execution logic
- Adding authentication or multi-user support
- Configuration validation beyond basic field checks
## Architecture
### Component Overview
```
┌─────────────────────────────────────────┐
│ sg_claw_service_console.html │
│ ┌───────────────────────────────────┐ │
│ │ Auto-connect on load │ │
│ │ (ws://127.0.0.1:42321 default) │ │
│ └───────────────────────────────────┘ │
│ ┌───────────────────────────────────┐ │
│ │ Settings Panel (Modal) │ │
│ │ - API Key │ │
│ │ - Base URL │ │
│ │ - Model │ │
│ │ - Skills Directory │ │
│ │ - Direct Submit Skill (optional) │ │
│ │ - Runtime Profile (dropdown) │ │
│ │ - Browser Backend (dropdown) │ │
│ │ [Save] [Cancel] │ │
│ └───────────────────────────────────┘ │
│ ┌───────────────────────────────────┐ │
│ │ Existing: Connection + Composer │ │
│ └───────────────────────────────────┘ │
└──────────────┬──────────────────────────┘
│ WebSocket
│ submit_task / update_config
┌─────────────────────────────────────────┐
│ sg_claw.exe (service) │
│ ┌───────────────────────────────────┐ │
│ │ ClientMessage handler │ │
│ │ - SubmitTask (existing) │ │
│ │ - UpdateConfig (new) │ │
│ └───────────────────────────────────┘ │
│ ┌───────────────────────────────────┐ │
│ │ Config writer │ │
│ │ Writes to sgclaw_config.json │ │
│ └───────────────────────────────────┘ │
└─────────────────────────────────────────┘
```
### Data Flow
1. **Auto-connect flow:**
- Page loads → JavaScript calls `connect()` automatically
- If WS opens → show the "已连接" (connected) chip, enable send button
- If WS fails → show the "未连接" (disconnected) chip, keep send disabled
- Reconnect logic remains unchanged (existing heartbeat/reconnect)
2. **Config save flow:**
- User clicks the "设置" (Settings) button → modal opens with current config values
- User edits fields → clicks "保存" (Save)
- Page sends `update_config` message via WS:
```json
{
"type": "update_config",
"config": {
"apiKey": "...",
"baseUrl": "...",
"model": "...",
"skillsDir": "...",
"directSubmitSkill": "...",
"runtimeProfile": "...",
"browserBackend": "..."
}
}
```
- sgClaw service receives message → validates → writes to `sgclaw_config.json`
- Service responds with success/error → page shows notification
- Service reloads config in-memory (or requires restart - see below)
### Protocol Changes
#### New ClientMessage variant
Add to `src/service/protocol.rs`:
```rust
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
#[serde(tag = "type", rename_all = "snake_case")]
pub enum ClientMessage {
Connect,
Start,
Stop,
SubmitTask { ... },
Ping,
UpdateConfig { // NEW
config: ConfigUpdatePayload,
},
}
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub struct ConfigUpdatePayload {
pub api_key: Option<String>,
pub base_url: Option<String>,
pub model: Option<String>,
pub skills_dir: Option<String>,
pub direct_submit_skill: Option<String>,
pub runtime_profile: Option<String>,
pub browser_backend: Option<String>,
}
```
#### New ServiceMessage variant (optional)
Add to `src/service/protocol.rs`:
```rust
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
#[serde(tag = "type", rename_all = "snake_case")]
pub enum ServiceMessage {
StatusChanged { state: String },
LogEntry { level: String, message: String },
TaskComplete { success: bool, summary: String },
Busy { message: String },
Pong,
ConfigUpdated { success: bool, message: String }, // NEW
}
```
### Config Persistence
The service will:
1. Load current `sgclaw_config.json` from the config path (derived from process args)
2. Merge incoming `ConfigUpdatePayload` fields (only non-null fields are updated)
3. Write the merged config back to the same file
4. Respond with success/error message
5. **Hot reload**: The service should reload config in-memory without requiring restart
**Important:** If the config file path cannot be resolved (no `--config-path` arg), the service should respond with an error message indicating that config updates are not supported in env-var-only mode.
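Step 2 of the persistence flow (merge only non-null fields) can be sketched with abbreviated stand-ins; the real types are `SgClawSettings` and `ConfigUpdatePayload`, and the field set here is truncated for illustration:

```rust
// Abbreviated stand-in for SgClawSettings.
#[derive(Default, Debug, PartialEq)]
struct Settings {
    provider_api_key: String,
    provider_model: String,
}

// Abbreviated stand-in for ConfigUpdatePayload.
struct Payload {
    api_key: Option<String>,
    model: Option<String>,
}

// Only fields carried as Some(..) in the payload overwrite the current
// settings; None leaves the existing value untouched.
fn merge(mut current: Settings, payload: Payload) -> Settings {
    if let Some(v) = payload.api_key {
        current.provider_api_key = v;
    }
    if let Some(v) = payload.model {
        current.provider_model = v;
    }
    current
}

fn main() {
    let current = Settings {
        provider_api_key: "old-key".into(),
        provider_model: "old-model".into(),
    };
    let merged = merge(current, Payload { api_key: Some("new-key".into()), model: None });
    assert_eq!(merged.provider_api_key, "new-key");
    assert_eq!(merged.provider_model, "old-model"); // untouched by a None field
    println!("merge keeps fields the payload omits");
}
```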
### UI Design
#### Settings Button
- Add a "设置" (Settings) button in the sidebar, below the existing connect button
- Styled as a ghost button with a gear icon (using unicode ⚙ or CSS-only icon)
#### Settings Modal
- Overlay modal with centered card
- Form fields with labels in Chinese:
- `API 密钥` (apiKey) - password input type with show/hide toggle
- `模型服务地址` (baseUrl) - text input
- `模型名称` (model) - text input
- `Skills 目录路径` (skillsDir) - text input with path validation
- `直接提交技能` (directSubmitSkill) - text input (optional, can be empty)
- `运行模式` (runtimeProfile) - dropdown: `browser-attached` / `service-standalone`
- `浏览器后端` (browserBackend) - dropdown: `super-rpa` / `pipe` / `none`
- [保存] (Save) primary button, [取消] (Cancel) ghost button
- Validation:
- API Key and Model are required (show red error if empty on save)
- Base URL must be a valid URL format
- Skills Dir must be a valid path format
- Other fields are optional
#### Connection State Auto-detection
- On page load, call `connect()` automatically
- Connection state chip updates as before
- Reconnect logic (existing) remains unchanged
### File Changes
| File | Change |
|------|--------|
| `frontend/service-console/sg_claw_service_console.html` | Add auto-connect on load, settings modal UI, save logic |
| `src/service/protocol.rs` | Add `UpdateConfig` variant and `ConfigUpdatePayload` struct |
| `src/service/protocol.rs` | Add `ConfigUpdated` service message variant |
| `src/service/server.rs` | Handle `UpdateConfig` message, merge config, write file |
| `src/agent/task_runner.rs` | Add `pub fn config_path(&self) -> Option<&Path>` getter to `AgentRuntimeContext` |
| `src/config/settings.rs` | Add `save_to_path()` method for writing config to file |
| `tests/service_console_html_test.rs` | Add assertions for settings modal and update_config message |
### Config Save Implementation
In `src/service/server.rs`, when handling `UpdateConfig`:
```rust
ClientMessage::UpdateConfig { config } => {
// 1. Load current config from config_path
let config_path = runtime_context.config_path(); // needs to be exposed
let current = SgClawSettings::load(config_path.as_deref())?;
// 2. Merge: only overwrite fields that are Some in the payload
let mut merged = current.unwrap_or_default();
if let Some(v) = config.api_key { merged.provider_api_key = v; }
if let Some(v) = config.base_url { merged.provider_base_url = v; }
if let Some(v) = config.model { merged.provider_model = v; }
if let Some(v) = config.skills_dir { merged.skills_dir = Some(PathBuf::from(v)); }
// ... etc for other fields
// 3. Write back to file
merged.save_to_path(config_path.as_ref().ok_or("no config path")?)?;
// 4. Respond
sink.send_service_message(ServiceMessage::ConfigUpdated {
success: true,
message: "配置已保存".to_string(),
})?;
}
```
### Hot Reload Consideration
After saving config, the service should reload its in-memory settings. This requires:
1. Storing the loaded `SgClawSettings` in a reloadable container (e.g., `Arc<Mutex<SgClawSettings>>` or `Arc<RwLock<...>>`)
2. Or, the service can respond with "配置已保存,请重启 sg_claw 以应用更改" (simpler, avoids hot reload complexity)
**Recommended:** Start with "requires restart" approach. Hot reload can be added later if needed.
### Error Handling
| Scenario | Response |
|----------|----------|
| WS not connected when saving | Show inline error: "请先连接服务" |
| Config file not found | Service responds: "未找到配置文件,请通过 --config-path 指定" |
| Invalid config values | Service validates and responds with specific error |
| Write permission denied | Service responds: "无法写入配置文件,请检查文件权限" |
| WS disconnected during save | Show error: "连接断开,保存失败,请重试" |
### Test Strategy
1. **Integration test** (`tests/service_console_html_test.rs`):
- Assert page contains settings modal HTML
- Assert page contains "设置" button
- Assert page sends `update_config` message shape
- Assert page auto-connects on load (contains `window.onload` or equivalent)
2. **Protocol test** (new or existing test file):
- Assert `ClientMessage::UpdateConfig` serializes correctly
- Assert `ServiceMessage::ConfigUpdated` deserializes correctly
3. **Config save test** (new test in `tests/compat_config_test.rs` or new file):
- Create temp config file
- Send UpdateConfig message
- Verify file contents match expected merged config
## Acceptance Criteria
1. Page auto-connects to WS on load without manual button click
2. Settings button visible in sidebar
3. Settings modal opens with form fields for all configurable options
4. Clicking "保存" sends `update_config` message via WS
5. Service receives message and writes to `sgclaw_config.json`
6. Service responds with success/error message
7. Page displays save result notification
8. Existing task submission flow unchanged
9. Existing heartbeat/reconnect logic unchanged
10. Automated tests pass

View File

@@ -309,6 +309,24 @@
}
}
/* Settings modal elements */
select {
width: 100%;
border: 1px solid var(--line);
border-radius: 16px;
padding: 14px 16px;
background: rgba(255, 255, 255, 0.92);
color: var(--text);
font: inherit;
outline: none;
cursor: pointer;
}
select:focus {
border-color: rgba(15, 118, 110, 0.5);
box-shadow: 0 0 0 4px rgba(15, 118, 110, 0.12);
}
@media (max-width: 900px) {
body {
padding: 16px;
@@ -347,6 +365,7 @@
<input id="wsUrl" value="ws://127.0.0.1:42321" />
</div>
<button id="connectBtn" class="ghost-btn">连接</button>
<button id="settingsBtn" class="ghost-btn" style="margin-top: 8px;">⚙ 设置</button>
<p class="section-label" style="margin-top: 26px;">Composer</p>
<div class="field">
@@ -372,6 +391,65 @@
</div>
</div>
<!-- Settings Modal -->
<div id="settingsModal" style="display: none; position: fixed; top: 0; left: 0; width: 100%; height: 100%; background: rgba(0,0,0,0.5); z-index: 1000; align-items: center; justify-content: center;">
<div style="background: var(--panel); border-radius: 20px; padding: 28px; width: min(520px, 90%); max-height: 85vh; overflow-y: auto; box-shadow: var(--shadow);">
<h3 style="margin: 0 0 20px; font-size: 1.2rem;">sgClaw 配置</h3>
<div class="field">
<label for="settingApiKey">API 密钥 *</label>
<input id="settingApiKey" type="password" placeholder="输入模型 API 密钥" />
</div>
<div class="field">
<label for="settingBaseUrl">模型服务地址 *</label>
<input id="settingBaseUrl" type="url" placeholder="例如:https://api.deepseek.com" />
</div>
<div class="field">
<label for="settingModel">模型名称 *</label>
<input id="settingModel" type="text" placeholder="例如:deepseek-chat" />
</div>
<div class="field">
<label for="settingSkillsDir">Skills 目录路径</label>
<input id="settingSkillsDir" type="text" placeholder="例如:D:/data/ideaSpace/rust/sgClaw/claw/claw/skills/skill_staging/skills" />
</div>
<div class="field">
<label for="settingDirectSubmitSkill">直接提交技能</label>
<input id="settingDirectSubmitSkill" type="text" placeholder="例如:tq-lineloss-report.collect_lineloss" />
</div>
<div class="field">
<label for="settingRuntimeProfile">运行模式</label>
<select id="settingRuntimeProfile" style="width: 100%; border: 1px solid var(--line); border-radius: 16px; padding: 14px 16px; background: rgba(255, 255, 255, 0.92); color: var(--text); font: inherit;">
<option value="browser-attached">browser-attached</option>
<option value="browser-heavy">browser-heavy</option>
<option value="general-assistant">general-assistant</option>
</select>
</div>
<div class="field">
<label for="settingBrowserBackend">浏览器后端</label>
<select id="settingBrowserBackend" style="width: 100%; border: 1px solid var(--line); border-radius: 16px; padding: 14px 16px; background: rgba(255, 255, 255, 0.92); color: var(--text); font: inherit;">
<option value="super-rpa">super-rpa</option>
<option value="agent-browser">agent-browser</option>
<option value="rust-native">rust-native</option>
<option value="computer-use">computer-use</option>
<option value="auto">auto</option>
</select>
</div>
<div id="settingsValidation" style="color: var(--error); font-size: 0.92rem; min-height: 1.4em; margin: 10px 0;"></div>
<div style="display: flex; gap: 12px; margin-top: 16px;">
<button id="settingsSaveBtn" class="primary-btn" style="flex: 1;">保存</button>
<button id="settingsCancelBtn" class="ghost-btn" style="flex: 1;">取消</button>
</div>
</div>
</div>
<script>
const defaultWsUrl = "ws://127.0.0.1:42321";
const elements = {
@@ -592,6 +670,9 @@
break;
case "pong":
break;
case "config_updated":
handleConfigResponse(message);
break;
default:
appendRow("error", "unknown service message: " + event.data);
}
@@ -627,6 +708,128 @@
});
updateUiState();
// Auto-connect on page load
window.addEventListener("DOMContentLoaded", () => {
connectOrDisconnectService(true);
});
// Settings modal state
const settingsElements = {
modal: document.getElementById("settingsModal"),
apiKey: document.getElementById("settingApiKey"),
baseUrl: document.getElementById("settingBaseUrl"),
model: document.getElementById("settingModel"),
skillsDir: document.getElementById("settingSkillsDir"),
directSubmitSkill: document.getElementById("settingDirectSubmitSkill"),
runtimeProfile: document.getElementById("settingRuntimeProfile"),
browserBackend: document.getElementById("settingBrowserBackend"),
validation: document.getElementById("settingsValidation"),
saveBtn: document.getElementById("settingsSaveBtn"),
cancelBtn: document.getElementById("settingsCancelBtn"),
};
let settingsOpenBtn = null;
function openSettingsModal() {
settingsElements.apiKey.value = "";
settingsElements.baseUrl.value = "";
settingsElements.model.value = "";
settingsElements.skillsDir.value = "";
settingsElements.directSubmitSkill.value = "";
settingsElements.runtimeProfile.value = "browser-attached";
settingsElements.browserBackend.value = "super-rpa";
settingsElements.validation.textContent = "";
settingsElements.modal.style.display = "flex";
}
function closeSettingsModal() {
settingsElements.modal.style.display = "none";
}
function validateSettings() {
const apiKey = settingsElements.apiKey.value.trim();
const baseUrl = settingsElements.baseUrl.value.trim();
const model = settingsElements.model.value.trim();
if (!apiKey) {
return "API 密钥不能为空";
}
if (!model) {
return "模型名称不能为空";
}
if (!baseUrl) {
return "模型服务地址不能为空";
}
try {
new URL(baseUrl);
} catch {
return "模型服务地址格式无效,请输入有效的 URL";
}
return "";
}
function saveSettings() {
const error = validateSettings();
if (error) {
settingsElements.validation.textContent = error;
return;
}
if (!socket || socket.readyState !== WebSocket.OPEN) {
settingsElements.validation.textContent = "请先连接服务";
return;
}
settingsElements.validation.textContent = "";
settingsElements.saveBtn.disabled = true;
settingsElements.saveBtn.textContent = "保存中...";
const config = {
apiKey: settingsElements.apiKey.value.trim(),
baseUrl: settingsElements.baseUrl.value.trim(),
model: settingsElements.model.value.trim(),
};
const skillsDir = settingsElements.skillsDir.value.trim();
if (skillsDir) config.skillsDir = skillsDir;
const directSubmitSkill = settingsElements.directSubmitSkill.value.trim();
if (directSubmitSkill) config.directSubmitSkill = directSubmitSkill;
config.runtimeProfile = settingsElements.runtimeProfile.value;
config.browserBackend = settingsElements.browserBackend.value;
socket.send(JSON.stringify({
type: "update_config",
config,
}));
}
function handleConfigResponse(message) {
settingsElements.saveBtn.disabled = false;
settingsElements.saveBtn.textContent = "保存";
if (message.success) {
settingsElements.validation.textContent = message.message;
settingsElements.validation.style.color = "var(--success)";
setTimeout(closeSettingsModal, 2000);
} else {
settingsElements.validation.textContent = message.message;
settingsElements.validation.style.color = "var(--error)";
}
}
// Event listeners for settings
settingsOpenBtn = document.getElementById("settingsBtn");
settingsOpenBtn.addEventListener("click", openSettingsModal);
settingsElements.cancelBtn.addEventListener("click", closeSettingsModal);
settingsElements.saveBtn.addEventListener("click", saveSettings);
settingsElements.modal.addEventListener("click", (e) => {
if (e.target === settingsElements.modal) {
closeSettingsModal();
}
});
</script>
</body>
</html>

sgclaw_config.json (new file, 9 lines)
View File

@@ -0,0 +1,9 @@
{
"apiKey": "direct-submit-placeholder-key",
"baseUrl": "http://127.0.0.1/direct-submit",
"model": "direct-submit-placeholder-model",
"skillsDir": "D:/data/ideaSpace/rust/sgClaw/claw/claw/skills/skill_staging/skills",
"directSubmitSkill": "tq-lineloss-report.collect_lineloss",
"runtimeProfile": "browser-attached",
"browserBackend": "super-rpa"
}

View File

@@ -1,5 +1,5 @@
use std::ffi::OsString;
use std::path::PathBuf;
use std::path::{Path, PathBuf};
use std::sync::Arc;
use crate::browser::BrowserBackend;
@@ -64,6 +64,10 @@ impl AgentRuntimeContext {
.map_err(|err| PipeError::Protocol(err.to_string()))
}
pub fn config_path(&self) -> Option<&Path> {
self.config_path.as_deref()
}
fn settings_source_label(&self) -> String {
match &self.config_path {
Some(path) if path.exists() => path.display().to_string(),

View File

@@ -84,6 +84,13 @@ fn run() -> Result<(), String> {
break;
}
ServiceMessage::Pong => {}
ServiceMessage::ConfigUpdated { success, message } => {
if success {
println!("config updated: {message}");
} else {
eprintln!("config update failed: {message}");
}
}
}
}
Message::Close(_) => {

View File

@@ -436,12 +436,16 @@ fn build_eval_js(source_url: &str, script: &str) -> String {
let events_url = escape_js_single_quoted(&events_endpoint_url(source_url));
format!(
"(function(){{try{{var v=(function(){{return {script}}})();\
"(function(){{try{{\
var v=(function(){{return {script}}})();\
function _s(v){{\
var t=(typeof v==='string')?v:JSON.stringify(v);\
try{{callBackJsToCpp('{escaped_source_url}@_@'+window.location.href+'@_@{callback}@_@sgBrowserExcuteJsCodeByDomain@_@'+(t??''))}}catch(_){{}}\
var j=JSON.stringify({{type:'callback',callback:'{callback}',request_url:'{escaped_source_url}',payload:{{value:(t??'')}}}});\
try{{var r=new XMLHttpRequest();r.open('POST','{events_url}',true);r.setRequestHeader('Content-Type','application/json');r.send(j)}}catch(_){{}}\
try{{navigator.sendBeacon('{events_url}',new Blob([j],{{type:'application/json'}}))}}catch(_){{}}\
}}\
if(v&&typeof v.then==='function'){{v.then(_s).catch(function(){{}});}}else{{_s(v);}}\
}}catch(e){{}}}})()"
)
}
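`build_eval_js` relies on `escape_js_single_quoted` (defined elsewhere in this file, not shown in this hunk) to make URLs safe inside single-quoted JS string literals. A minimal sketch of what such a helper might look like — this is an assumed implementation, not the project's actual one:

```rust
// Hypothetical sketch of escape_js_single_quoted: escape characters that
// would terminate or corrupt a single-quoted JavaScript string literal.
fn escape_js_single_quoted(input: &str) -> String {
    let mut out = String::with_capacity(input.len());
    for ch in input.chars() {
        match ch {
            '\\' => out.push_str("\\\\"), // backslash first, so later escapes survive
            '\'' => out.push_str("\\'"),
            '\n' => out.push_str("\\n"),
            '\r' => out.push_str("\\r"),
            _ => out.push(ch),
        }
    }
    out
}

fn main() {
    assert_eq!(escape_js_single_quoted("a'b"), "a\\'b");
    assert_eq!(escape_js_single_quoted("a\\b"), "a\\\\b");
    assert_eq!(escape_js_single_quoted("line\nbreak"), "line\\nbreak");
    println!("ok");
}
```

Escaping the backslash before the quote matters; reversing the order would double-escape the backslashes the quote pass introduces.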

View File

@@ -25,7 +25,6 @@ const COMMANDS_ENDPOINT_PATH: &str = "/sgclaw/callback/commands/next";
const COMMAND_ACK_ENDPOINT_PATH: &str = "/sgclaw/callback/commands/ack";
const COMMAND_POLL_INTERVAL: Duration = Duration::from_millis(25);
const HELPER_POLL_INTERVAL: Duration = Duration::from_millis(50);
const HELPER_BOOTSTRAP_ACTION: &str = "sgBrowerserOpenPage";
const NAVIGATE_CALLBACK_NAME: &str = "sgclawOnLoaded";
const CLICK_PROBE_CALLBACK_NAME: &str = "sgclawOnClickProbe";
const CLICK_CALLBACK_NAME: &str = "sgclawOnClick";
@@ -42,12 +41,15 @@ pub(crate) struct BrowserCallbackHost {
}
#[derive(Debug)]
#[allow(dead_code)]
pub(crate) struct LiveBrowserCallbackHost {
host: Arc<BrowserCallbackHost>,
shutdown: Arc<AtomicBool>,
server_thread: Mutex<Option<JoinHandle<()>>>,
command_lock: Mutex<()>,
result_timeout: Duration,
browser_ws_url: String,
use_hidden_domain: bool,
}
#[derive(Debug, Default)]
@@ -217,6 +219,7 @@ impl LiveBrowserCallbackHost {
bootstrap_request_url: &str,
ready_timeout: Duration,
result_timeout: Duration,
use_hidden_domain: bool,
) -> Result<Self, PipeError> {
let listener = TcpListener::bind("127.0.0.1:0").map_err(|err| {
PipeError::Protocol(format!("failed to bind callback host listener: {err}"))
@@ -238,7 +241,7 @@ impl LiveBrowserCallbackHost {
let thread_shutdown = shutdown.clone();
let server_thread = thread::spawn(move || serve_loop(listener, thread_host, thread_shutdown));
bootstrap_helper_page(browser_ws_url, bootstrap_request_url, host.helper_url())?;
bootstrap_helper_page(browser_ws_url, bootstrap_request_url, host.helper_url(), use_hidden_domain)?;
wait_for_helper_ready(host.as_ref(), ready_timeout)?;
let live_host = Self {
@@ -247,6 +250,8 @@ impl LiveBrowserCallbackHost {
server_thread: Mutex::new(Some(server_thread)),
command_lock: Mutex::new(()),
result_timeout,
browser_ws_url: browser_ws_url.to_string(),
use_hidden_domain,
};
Ok(live_host)
}
@@ -337,7 +342,12 @@ fn normalize_loopback_origin(origin: &str) -> String {
origin.trim_end_matches('/').to_string()
}
fn bootstrap_helper_page(browser_ws_url: &str, request_url: &str, helper_url: &str) -> Result<(), PipeError> {
fn bootstrap_helper_page(
browser_ws_url: &str,
request_url: &str,
helper_url: &str,
use_hidden_domain: bool,
) -> Result<(), PipeError> {
let (mut websocket, _) = connect(browser_ws_url)
.map_err(|err| PipeError::Protocol(format!("browser websocket connect failed: {err}")))?;
configure_bootstrap_socket(&mut websocket)?;
@@ -347,9 +357,20 @@ fn bootstrap_helper_page(browser_ws_url: &str, request_url: &str, helper_url: &s
))
.map_err(|err| PipeError::Protocol(format!("browser websocket register failed: {err}")))?;
let _ = recv_bootstrap_prelude(&mut websocket);
// Close any orphaned helper page from a previous process run.
// Best-effort: if no page exists, the browser silently ignores this.
let (open_action, close_action) = if use_hidden_domain {
("sgHideBrowerserOpenPage", "sgHideBrowerserClosePage")
} else {
("sgBrowerserOpenPage", "sgBrowserClosePage")
};
let close_payload = json!([request_url, close_action, helper_url]).to_string();
let _ = websocket.send(Message::Text(close_payload.into()));
let payload = json!([
request_url,
HELPER_BOOTSTRAP_ACTION,
open_action,
helper_url,
])
.to_string();
@@ -1080,6 +1101,7 @@ mod tests {
"https://www.zhihu.com",
Duration::from_millis(100),
Duration::from_millis(50),
false,
);
assert!(result.is_err(), "expected timeout because no real helper page loads");
drop(result);
@@ -1100,6 +1122,38 @@ mod tests {
);
}
#[test]
fn live_callback_host_hidden_domain_sends_hide_open_page_command() {
let (ws_url, frames, handle) = start_fake_browser_status_server();
let result = LiveBrowserCallbackHost::start_with_browser_ws_url(
&ws_url,
"https://www.zhihu.com",
Duration::from_millis(100),
Duration::from_millis(50),
true,
);
assert!(result.is_err(), "expected timeout because no real helper page loads");
drop(result);
handle.join().unwrap();
let sent = frames.lock().unwrap().clone();
assert!(
sent.iter().any(|frame| frame.contains("sgHideBrowerserOpenPage")),
"hidden domain bootstrap should send sgHideBrowerserOpenPage; sent frames: {sent:?}"
);
assert!(
!sent.iter().any(|frame| {
frame.contains("\"sgBrowerserOpenPage\"")
}),
"hidden domain bootstrap should NOT send visible sgBrowerserOpenPage; sent frames: {sent:?}"
);
assert!(
sent.iter().any(|frame| frame.contains("/sgclaw/browser-helper.html")),
"bootstrap should include the helper page URL; sent frames: {sent:?}"
);
}
#[test]
fn live_callback_host_treats_simulated_mouse_command_as_fire_and_forget() {
use crate::browser::callback_backend::{
@@ -1113,6 +1167,8 @@ mod tests {
server_thread: Mutex::new(None),
command_lock: Mutex::new(()),
result_timeout: Duration::from_millis(10),
browser_ws_url: "ws://127.0.0.1:12345".to_string(),
use_hidden_domain: false,
};
let response = host.execute(BrowserCallbackRequest {

View File

@@ -154,11 +154,26 @@ fn execute_browser_script_impl(
browser_tool: &dyn BrowserBackend,
args: Value,
) -> anyhow::Result<ToolResult> {
eprintln!("[execute_browser_script_impl] starting execution");
eprintln!("[execute_browser_script_impl] tool.name: {}", tool.name);
eprintln!("[execute_browser_script_impl] tool.command: {}", tool.command);
eprintln!("[execute_browser_script_impl] skill_root: {:?}", skill_root);
eprintln!("[execute_browser_script_impl] args: {:?}", args);
let script_path = resolve_browser_script_path(skill_root, &tool.command)?;
eprintln!("[execute_browser_script_impl] script_path: {:?}", script_path);
// Check whether the script file exists
if !script_path.exists() {
eprintln!("[execute_browser_script_impl] script file does not exist!");
} else {
eprintln!("[execute_browser_script_impl] script file exists");
}
let mut args = match args {
Value::Object(args) => args,
other => {
eprintln!("[execute_browser_script_impl] args is not an Object: {:?}", other);
return Ok(failed_tool_result(format!(
"expected object arguments, got {other}"
)))
@@ -168,27 +183,35 @@ fn execute_browser_script_impl(
let raw_expected_domain = match args.remove("expected_domain") {
Some(Value::String(value)) if !value.trim().is_empty() => value,
Some(other) => {
eprintln!("[execute_browser_script_impl] malformed expected_domain: {:?}", other);
return Ok(failed_tool_result(format!(
"expected_domain must be a non-empty string, got {other}"
)))
}
None => {
eprintln!("[execute_browser_script_impl] missing expected_domain");
return Ok(failed_tool_result(
"missing required field expected_domain".to_string(),
))
}
};
eprintln!("[execute_browser_script_impl] raw_expected_domain: {}", raw_expected_domain);
let expected_domain = match normalize_domain_like(&raw_expected_domain) {
Some(value) => value,
None => {
eprintln!("[execute_browser_script_impl] failed to normalize expected_domain");
return Ok(failed_tool_result(format!(
"expected_domain must resolve to a hostname, got {raw_expected_domain:?}"
)))
}
};
eprintln!("[execute_browser_script_impl] expected_domain: {}", expected_domain);
args.insert("expected_domain".to_string(), Value::String(expected_domain.clone()));
for required_arg in tool.args.keys() {
if !args.contains_key(required_arg) {
eprintln!("[execute_browser_script_impl] missing required argument: {}", required_arg);
return Ok(failed_tool_result(format!(
"missing required field {required_arg}"
)));
@@ -196,8 +219,12 @@ fn execute_browser_script_impl(
}
let script_body = match fs::read_to_string(&script_path) {
Ok(value) => value,
Ok(value) => {
eprintln!("[execute_browser_script_impl] script read OK, length: {} bytes", value.len());
value
}
Err(err) => {
eprintln!("[execute_browser_script_impl] failed to read script: {}", err);
return Ok(failed_tool_result(format!(
"failed to read browser script {}: {err}",
script_path.display()
@@ -206,16 +233,36 @@ fn execute_browser_script_impl(
};
let wrapped_script = wrap_browser_script(&script_body, &Value::Object(args.clone()));
eprintln!("[execute_browser_script_impl] wrapped script length: {} bytes", wrapped_script.len());
// Truncate on a char boundary: byte slicing `&wrapped_script[..500]` can panic on UTF-8 input.
let preview: String = wrapped_script.chars().take(500).collect();
eprintln!("[execute_browser_script_impl] wrapped script preview (first 500 chars): {preview}");
eprintln!("[execute_browser_script_impl] calling browser_tool.invoke(Action::Eval)...");
let target_url = args.get("target_url")
.and_then(|v| v.as_str())
.map(|s| s.to_string())
.unwrap_or_else(|| format!("http://{}", expected_domain));
eprintln!("[execute_browser_script_impl] target_url: {}", target_url);
let result = match browser_tool.invoke(
Action::Eval,
json!({ "script": wrapped_script }),
json!({
"script": wrapped_script,
"target_url": target_url,
}),
&expected_domain,
) {
Ok(result) => result,
Err(err) => return Ok(failed_tool_result(err.to_string())),
Ok(result) => {
eprintln!("[execute_browser_script_impl] invoke succeeded, result.success: {}", result.success);
result
}
Err(err) => {
eprintln!("[execute_browser_script_impl] invoke failed: {}", err);
return Ok(failed_tool_result(err.to_string()))
}
};
if !result.success {
eprintln!("[execute_browser_script_impl] result.success=false, data: {:?}", result.data);
return Ok(failed_tool_result(format_browser_script_error(&result.data)));
}
@@ -224,6 +271,7 @@ fn execute_browser_script_impl(
.get("text")
.cloned()
.unwrap_or_else(|| result.data.clone());
eprintln!("[execute_browser_script_impl] returning success, payload length: {}", payload.to_string().len());
Ok(ToolResult {
success: true,
output: stringify_tool_payload(&payload)?,

View File

@@ -1,10 +1,12 @@
use std::path::Path;
use std::path::{Path, PathBuf};
use std::sync::Arc;
use serde_json::{Map, Value};
use crate::browser::BrowserBackend;
use crate::compat::artifact_open::{open_exported_xlsx, PostExportOpen};
use crate::compat::direct_skill_runtime::DirectSubmitOutcome;
use crate::compat::lineloss_xlsx_export::{export_lineloss_xlsx, LinelossExportRequest};
use crate::config::SgClawSettings;
use crate::pipe::{BrowserPipeTool, PipeError, Transport};
@@ -13,6 +15,7 @@ pub struct DeterministicExecutionPlan {
pub instruction: String,
pub tool_name: String,
pub expected_domain: String,
pub target_url: String,
pub org_label: String,
pub org_code: String,
pub period_mode: String,
@@ -30,6 +33,7 @@ pub enum DeterministicSubmitDecision {
const DETERMINISTIC_SUFFIX: &str = "。。。";
const LINELLOSS_HOST: &str = "20.76.57.61";
const LINELLOSS_TARGET_URL: &str = "http://20.76.57.61:18080/gsllys/tqLinelossStatis/tqQualifyRateMonitor";
const LINELLOSS_TOOL: &str = "tq-lineloss-report.collect_lineloss";
pub fn decide_deterministic_submit(
@@ -85,6 +89,7 @@ pub fn decide_deterministic_submit(
instruction: normalized_instruction.to_string(),
tool_name: LINELLOSS_TOOL.to_string(),
expected_domain: LINELLOSS_HOST.to_string(),
target_url: LINELLOSS_TARGET_URL.to_string(),
org_label: resolved_org.label,
org_code: resolved_org.code,
period_mode: period_mode_name(&resolved_period.mode).to_string(),
@@ -110,7 +115,8 @@ pub fn execute_deterministic_submit<T: Transport + 'static>(
args,
)?;
Ok(summarize_lineloss_output(&output))
let export_path = try_export_lineloss_xlsx(&output, workspace_root);
Ok(summarize_lineloss_output_with_export(&output, export_path.as_deref()))
}
pub fn execute_deterministic_submit_with_browser_backend(
@@ -129,7 +135,8 @@ pub fn execute_deterministic_submit_with_browser_backend(
args,
)?;
Ok(summarize_lineloss_output(&output))
let export_path = try_export_lineloss_xlsx(&output, workspace_root);
Ok(summarize_lineloss_output_with_export(&output, export_path.as_deref()))
}
fn deterministic_submit_args(plan: &DeterministicExecutionPlan) -> Map<String, Value> {
@@ -138,6 +145,10 @@ fn deterministic_submit_args(plan: &DeterministicExecutionPlan) -> Map<String, V
"expected_domain".to_string(),
Value::String(plan.expected_domain.clone()),
);
args.insert(
"target_url".to_string(),
Value::String(plan.target_url.clone()),
);
args.insert(
"org_label".to_string(),
Value::String(plan.org_label.clone()),
@@ -256,6 +267,155 @@ fn summarize_lineloss_artifact(artifact: &Value) -> DirectSubmitOutcome {
}
}
fn summarize_lineloss_output_with_export(output: &str, export_path: Option<&Path>) -> DirectSubmitOutcome {
let mut outcome = summarize_lineloss_output(output);
if let Some(path) = export_path {
outcome.summary.push_str(&format!(" export_path={}", path.display()));
match open_exported_xlsx(path) {
PostExportOpen::Opened => {
outcome.summary.push_str(" 已自动打开Excel");
}
PostExportOpen::Failed(reason) => {
outcome.summary.push_str(&format!(" 自动打开Excel失败: {}", reason));
}
}
}
outcome
}
struct LinelossArtifactExportData {
sheet_name: String,
column_defs: Vec<(String, String)>,
rows: Vec<Map<String, Value>>,
}
fn extract_export_data(output: &str) -> Option<LinelossArtifactExportData> {
let payload: Value = serde_json::from_str(output).ok()?;
let artifact = payload
.as_object()
.and_then(|object| object.get("text"))
.unwrap_or(&payload);
let artifact = artifact.as_object()?;
if artifact.get("type").and_then(Value::as_str) != Some("report-artifact") {
return None;
}
let status = artifact.get("status").and_then(Value::as_str).unwrap_or("");
if !matches!(status, "ok" | "partial") {
return None;
}
let rows = artifact
.get("rows")
.and_then(Value::as_array)?;
if rows.is_empty() {
return None;
}
let rows: Vec<Map<String, Value>> = rows
.iter()
.filter_map(|row| row.as_object().cloned())
.collect();
if rows.is_empty() {
return None;
}
let column_defs: Vec<(String, String)> = artifact
.get("column_defs")
.and_then(Value::as_array)
.map(|defs| {
defs.iter()
.filter_map(|def| {
let arr = def.as_array()?;
let key = arr.first()?.as_str()?.to_string();
let label = arr.get(1)?.as_str()?.to_string();
Some((key, label))
})
.collect()
})
.unwrap_or_default();
// Fallback: if column_defs not in artifact, try "columns" array as keys
let column_defs = if column_defs.is_empty() {
let columns = artifact
.get("columns")
.and_then(Value::as_array)?;
columns
.iter()
.filter_map(|col| {
let key = col.as_str()?.to_string();
Some((key.clone(), key))
})
.collect()
} else {
column_defs
};
if column_defs.is_empty() {
return None;
}
let org_label = artifact
.get("org")
.and_then(Value::as_object)
.and_then(|org| org.get("label"))
.and_then(Value::as_str)
.unwrap_or("lineloss");
let period_mode = artifact
.get("period")
.and_then(Value::as_object)
.and_then(|p| p.get("mode"))
.and_then(Value::as_str)
.unwrap_or("month");
let period_value = artifact
.get("period")
.and_then(Value::as_object)
.and_then(|p| p.get("value"))
.and_then(Value::as_str)
.unwrap_or("");
let mode_label = if period_mode == "week" { "周度" } else { "月度" };
let sheet_name = format!("{org_label}{mode_label}线损分析报表({period_value})");
Some(LinelossArtifactExportData {
sheet_name,
column_defs,
rows,
})
}
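For reference, `extract_export_data` accepts a payload shaped roughly like the following (the artifact may also arrive wrapped under a top-level `"text"` key). The field names come from the accessors above; the keys, labels, and values in this example are purely illustrative:

```json
{
  "type": "report-artifact",
  "status": "ok",
  "org": { "label": "国网兰州供电公司" },
  "period": { "mode": "month", "value": "2026-03" },
  "column_defs": [["org_name", "单位名称"], ["lineloss_rate", "线损率"]],
  "rows": [
    { "org_name": "城关供电分公司", "lineloss_rate": "1.23" }
  ]
}
```

When `column_defs` is absent, a plain `"columns"` array of keys is accepted as a fallback, with each key doubling as its own column label.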
fn try_export_lineloss_xlsx(
output: &str,
workspace_root: &Path,
) -> Option<PathBuf> {
let data = extract_export_data(output)?;
let nanos = std::time::SystemTime::now()
.duration_since(std::time::UNIX_EPOCH)
.map(|d| d.as_nanos())
.unwrap_or_default();
let out_dir = workspace_root.join("out");
let output_path = out_dir.join(format!("tq-lineloss-{nanos}.xlsx"));
let request = LinelossExportRequest {
sheet_name: data.sheet_name,
column_defs: data.column_defs,
rows: data.rows,
output_path,
};
match export_lineloss_xlsx(&request) {
Ok(path) => {
eprintln!("[deterministic_submit] XLSX exported to: {}", path.display());
Some(path)
}
Err(err) => {
eprintln!("[deterministic_submit] XLSX export failed: {err}");
None
}
}
}
fn strip_exact_deterministic_suffix(raw_instruction: &str) -> Option<&str> {
let without_suffix = raw_instruction.strip_suffix(DETERMINISTIC_SUFFIX)?;
if without_suffix.ends_with('。') {

View File

@@ -330,8 +330,7 @@ fn derive_expected_domain(task_context: &CompatTaskContext) -> Result<String, Pi
.filter(|value| !value.is_empty())
.ok_or_else(|| {
PipeError::Protocol(
"direct submit skill requires page_url so expected_domain can be derived"
.to_string(),
"当前命令需要浏览器页面上下文才能执行。请在浏览器中打开目标页面后重试,或在指令末尾添加'。。。'使用确定性提交。".to_string(),
)
})?;

View File

@@ -0,0 +1,223 @@
use std::fs;
use std::io::Write;
use std::path::{Path, PathBuf};
use serde_json::{Map, Value};
use zip::write::FileOptions;
use zip::{CompressionMethod, ZipWriter};
pub struct LinelossExportRequest {
pub sheet_name: String,
pub column_defs: Vec<(String, String)>,
pub rows: Vec<Map<String, Value>>,
pub output_path: PathBuf,
}
pub fn export_lineloss_xlsx(request: &LinelossExportRequest) -> anyhow::Result<PathBuf> {
if request.rows.is_empty() {
anyhow::bail!("rows must not be empty");
}
if request.column_defs.is_empty() {
anyhow::bail!("column_defs must not be empty");
}
let sheet_xml = build_worksheet_xml(&request.column_defs, &request.rows);
write_xlsx(
&request.output_path,
&request.sheet_name,
&sheet_xml,
)?;
Ok(request.output_path.clone())
}
fn build_worksheet_xml(
column_defs: &[(String, String)],
rows: &[Map<String, Value>],
) -> String {
let mut xml_rows = Vec::with_capacity(rows.len() + 1);
// Header row (row 1)
let header_cells: Vec<String> = column_defs
.iter()
.enumerate()
.map(|(col_idx, (_key, label))| {
let col_letter = column_letter(col_idx);
format!(
"<c r=\"{col_letter}1\" t=\"inlineStr\"><is><t>{}</t></is></c>",
xml_escape(label)
)
})
.collect();
xml_rows.push(format!("<row r=\"1\">{}</row>", header_cells.join("")));
// Data rows (row 2+)
for (row_idx, row) in rows.iter().enumerate() {
let excel_row = row_idx + 2;
let cells: Vec<String> = column_defs
.iter()
.enumerate()
.map(|(col_idx, (key, _label))| {
let col_letter = column_letter(col_idx);
let value = row
.get(key)
.map(|v| value_to_string(v))
.unwrap_or_default();
format!(
"<c r=\"{col_letter}{excel_row}\" t=\"inlineStr\"><is><t>{}</t></is></c>",
xml_escape(&value)
)
})
.collect();
xml_rows.push(format!("<row r=\"{excel_row}\">{}</row>", cells.join("")));
}
format!(
"<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"yes\"?>\
<worksheet xmlns=\"http://schemas.openxmlformats.org/spreadsheetml/2006/main\">\
<sheetData>{}</sheetData>\
</worksheet>",
xml_rows.join("")
)
}
fn column_letter(index: usize) -> String {
let mut result = String::new();
let mut n = index;
loop {
result.insert(0, (b'A' + (n % 26) as u8) as char);
if n < 26 {
break;
}
n = n / 26 - 1;
}
result
}
fn value_to_string(value: &Value) -> String {
match value {
Value::String(text) => text.clone(),
Value::Number(number) => number.to_string(),
Value::Bool(flag) => flag.to_string(),
Value::Null => String::new(),
other => other.to_string(),
}
}
fn xml_escape(value: &str) -> String {
// Also escape double quotes: workbook_xml embeds the escaped sheet name
// inside an XML attribute value.
value
.replace('&', "&amp;")
.replace('<', "&lt;")
.replace('>', "&gt;")
.replace('"', "&quot;")
}
fn write_xlsx(output_path: &Path, sheet_name: &str, sheet_xml: &str) -> anyhow::Result<()> {
if let Some(parent) = output_path.parent() {
fs::create_dir_all(parent)?;
}
if output_path.exists() {
fs::remove_file(output_path)?;
}
let file = fs::File::create(output_path)?;
let mut zip = ZipWriter::new(file);
let options = FileOptions::default().compression_method(CompressionMethod::Stored);
zip.start_file("[Content_Types].xml", options)?;
zip.write_all(content_types_xml().as_bytes())?;
zip.start_file("_rels/.rels", options)?;
zip.write_all(root_rels_xml().as_bytes())?;
zip.start_file("docProps/app.xml", options)?;
zip.write_all(app_xml().as_bytes())?;
zip.start_file("docProps/core.xml", options)?;
zip.write_all(core_xml().as_bytes())?;
zip.start_file("xl/workbook.xml", options)?;
zip.write_all(workbook_xml(&xml_escape(sheet_name)).as_bytes())?;
zip.start_file("xl/_rels/workbook.xml.rels", options)?;
zip.write_all(workbook_rels_xml().as_bytes())?;
zip.start_file("xl/worksheets/sheet1.xml", options)?;
zip.write_all(sheet_xml.as_bytes())?;
zip.finish()?;
Ok(())
}
fn content_types_xml() -> &'static str {
r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Types xmlns="http://schemas.openxmlformats.org/package/2006/content-types">
<Default Extension="rels" ContentType="application/vnd.openxmlformats-package.relationships+xml"/>
<Default Extension="xml" ContentType="application/xml"/>
<Override PartName="/xl/workbook.xml" ContentType="application/vnd.openxmlformats-officedocument.spreadsheetml.sheet.main+xml"/>
<Override PartName="/xl/worksheets/sheet1.xml" ContentType="application/vnd.openxmlformats-officedocument.spreadsheetml.worksheet+xml"/>
<Override PartName="/docProps/core.xml" ContentType="application/vnd.openxmlformats-package.core-properties+xml"/>
<Override PartName="/docProps/app.xml" ContentType="application/vnd.openxmlformats-officedocument.extended-properties+xml"/>
</Types>"#
}
fn root_rels_xml() -> &'static str {
r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Relationships xmlns="http://schemas.openxmlformats.org/package/2006/relationships">
<Relationship Id="rId1" Type="http://schemas.openxmlformats.org/officeDocument/2006/relationships/officeDocument" Target="xl/workbook.xml"/>
<Relationship Id="rId2" Type="http://schemas.openxmlformats.org/package/2006/relationships/metadata/core-properties" Target="docProps/core.xml"/>
<Relationship Id="rId3" Type="http://schemas.openxmlformats.org/officeDocument/2006/relationships/extended-properties" Target="docProps/app.xml"/>
</Relationships>"#
}
fn app_xml() -> &'static str {
r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Properties xmlns="http://schemas.openxmlformats.org/officeDocument/2006/extended-properties"
xmlns:vt="http://schemas.openxmlformats.org/officeDocument/2006/docPropsVTypes">
<Application>sgClaw</Application>
</Properties>"#
}
fn core_xml() -> &'static str {
r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<cp:coreProperties xmlns:cp="http://schemas.openxmlformats.org/package/2006/metadata/core-properties"
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:dcterms="http://purl.org/dc/terms/"
xmlns:dcmitype="http://purl.org/dc/dcmitype/"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<dc:title>台区线损报表</dc:title>
</cp:coreProperties>"#
}
fn workbook_xml(sheet_name: &str) -> String {
format!(
r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<workbook xmlns="http://schemas.openxmlformats.org/spreadsheetml/2006/main"
xmlns:r="http://schemas.openxmlformats.org/officeDocument/2006/relationships">
<sheets>
<sheet name="{sheet_name}" sheetId="1" r:id="rId1"/>
</sheets>
</workbook>"#
)
}
fn workbook_rels_xml() -> &'static str {
r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Relationships xmlns="http://schemas.openxmlformats.org/package/2006/relationships">
<Relationship Id="rId1" Type="http://schemas.openxmlformats.org/officeDocument/2006/relationships/worksheet" Target="worksheets/sheet1.xml"/>
</Relationships>"#
}
#[cfg(test)]
mod tests {
use super::column_letter;
#[test]
fn column_letter_maps_indices_correctly() {
assert_eq!(column_letter(0), "A");
assert_eq!(column_letter(1), "B");
assert_eq!(column_letter(6), "G");
assert_eq!(column_letter(25), "Z");
assert_eq!(column_letter(26), "AA");
}
}
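The unit test stops at "AA". The mapping is bijective base-26 (A=0 … Z=25, AA=26), and the carry step `n = n / 26 - 1` is the subtle part. A standalone sketch extending the test's coverage past "AA":

```rust
// Standalone copy of the bijective base-26 scheme used by column_letter,
// checked against the boundary cases the in-tree test omits.
fn column_letter(index: usize) -> String {
    let mut result = String::new();
    let mut n = index;
    loop {
        // Prepend the current least-significant "digit".
        result.insert(0, (b'A' + (n % 26) as u8) as char);
        if n < 26 {
            break;
        }
        // Bijective carry: shift out the digit, then subtract one.
        n = n / 26 - 1;
    }
    result
}

fn main() {
    assert_eq!(column_letter(27), "AB");
    assert_eq!(column_letter(51), "AZ");
    assert_eq!(column_letter(52), "BA");
    assert_eq!(column_letter(701), "ZZ");
    assert_eq!(column_letter(702), "AAA");
    println!("ok");
}
```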

View File

@@ -6,6 +6,7 @@ pub mod cron_adapter;
pub mod deterministic_submit;
pub mod direct_skill_runtime;
pub mod event_bridge;
pub mod lineloss_xlsx_export;
pub mod memory_adapter;
pub mod openxml_office_tool;
pub mod orchestration;

View File

@@ -5,29 +5,598 @@ pub(crate) struct OrgUnit {
}
pub(crate) const ORG_UNITS: &[OrgUnit] = &[
// ===== Province-level =====
OrgUnit {
label: "国网甘肃省电力公司",
code: "62101",
aliases: &["国网甘肃省电力公司", "甘肃省电力公司", "甘肃电力公司", "甘肃省公司"],
},
// ===== City-level (lv=2) =====
OrgUnit {
label: "国网兰州供电公司",
code: "62401",
aliases: &["国网兰州供电公司", "兰州供电公司", "兰州公司"],
},
OrgUnit {
label: "国网白银供电公司",
code: "62402",
aliases: &["国网白银供电公司", "白银供电公司", "白银公司"],
},
OrgUnit {
label: "国网天水供电公司",
code: "62403",
aliases: &["国网天水供电公司", "天水供电公司", "天水公司"],
},
OrgUnit {
label: "国网平凉供电公司",
code: "62404",
aliases: &["国网平凉供电公司", "平凉供电公司", "平凉公司"],
},
OrgUnit {
label: "国网金昌供电公司",
code: "62405",
aliases: &["国网金昌供电公司", "金昌供电公司", "金昌公司"],
},
OrgUnit {
label: "国网张掖供电公司",
code: "62406",
aliases: &["国网张掖供电公司", "张掖供电公司", "张掖公司"],
},
OrgUnit {
label: "国网陇南供电公司",
code: "62407",
aliases: &["国网陇南供电公司", "陇南供电公司", "陇南公司"],
},
OrgUnit {
label: "国网定西供电公司",
code: "62408",
aliases: &["国网定西供电公司", "定西供电公司", "定西公司"],
},
OrgUnit {
label: "国网庆阳供电公司",
code: "62409",
aliases: &["国网庆阳供电公司", "庆阳供电公司", "庆阳公司"],
},
OrgUnit {
label: "国网武威供电公司",
code: "62410",
aliases: &["国网武威供电公司", "武威供电公司", "武威公司"],
},
OrgUnit {
label: "国网酒泉供电公司",
code: "62411",
aliases: &["国网酒泉供电公司", "酒泉供电公司", "酒泉公司"],
},
OrgUnit {
label: "国网临夏供电公司",
code: "62412",
aliases: &["国网临夏供电公司", "临夏供电公司", "临夏公司"],
},
OrgUnit {
label: "国网甘南供电公司",
code: "62413",
aliases: &["国网甘南供电公司", "甘南供电公司", "甘南公司"],
},
OrgUnit {
label: "国网嘉峪关供电公司",
code: "62414",
aliases: &["国网嘉峪关供电公司", "嘉峪关供电公司", "嘉峪关公司"],
},
OrgUnit {
label: "国网兰州新区供电公司",
code: "62415",
aliases: &["国网兰州新区供电公司", "兰州新区供电公司", "兰州新区公司"],
},
// ===== 兰州供电公司 children (lv=3) =====
OrgUnit {
label: "城关供电分公司",
code: "6240108",
aliases: &["城关供电分公司", "城关分公司"],
},
OrgUnit {
label: "七里河供电分公司",
code: "6240109",
aliases: &["七里河供电分公司", "七里河分公司"],
},
OrgUnit {
label: "西固供电分公司",
code: "6240107",
aliases: &["西固供电分公司", "西固分公司"],
},
OrgUnit {
label: "安宁供电分公司",
code: "6240111",
aliases: &["安宁供电分公司", "安宁分公司"],
},
OrgUnit {
label: "红古供电分公司",
code: "6240102",
aliases: &["红古供电分公司", "红古分公司"],
},
OrgUnit {
label: "东岗供电分公司",
code: "6240110",
aliases: &["东岗供电分公司", "东岗分公司"],
},
OrgUnit {
label: "国网永登县供电公司",
code: "6240122",
aliases: &["国网永登县供电公司", "永登县供电公司", "永登县公司"],
},
OrgUnit {
label: "国网榆中县供电公司",
code: "6240121",
aliases: &["国网榆中县供电公司", "榆中县供电公司", "榆中县公司"],
},
OrgUnit {
label: "榆中城关供电所",
code: "624012108",
aliases: &["榆中城关供电所"],
},
OrgUnit {
label: "国网永靖县供电公司",
code: "6240123",
aliases: &["国网永靖县供电公司", "永靖县供电公司", "永靖县公司"],
},
OrgUnit {
label: "兰州客户服务中心",
code: "6240101",
aliases: &["兰州客户服务中心", "兰州客服中心"],
},
// ===== 白银供电公司 children (lv=3) =====
OrgUnit {
label: "城区供电分公司",
code: "6240201",
aliases: &["城区供电分公司", "城区分公司"],
},
OrgUnit {
label: "国网白银市城区供电分公司",
code: "6240201",
aliases: &["国网白银市城区供电分公司", "白银市城区供电分公司", "白银城区分公司"],
},
OrgUnit {
label: "国网皋兰县供电公司",
code: "6240223",
aliases: &["国网皋兰县供电公司", "皋兰县供电公司", "皋兰县公司"],
},
OrgUnit {
label: "国网靖远县供电公司",
code: "6240221",
aliases: &["国网靖远县供电公司", "靖远县供电公司", "靖远县公司"],
},
OrgUnit {
label: "国网景泰县供电公司",
code: "6240222",
aliases: &["国网景泰县供电公司", "景泰县供电公司", "景泰县公司"],
},
OrgUnit {
label: "国网会宁县供电公司",
code: "6240225",
aliases: &["国网会宁县供电公司", "会宁县供电公司", "会宁县公司"],
},
OrgUnit {
label: "国网白银市平川区供电公司",
code: "6240224",
aliases: &["国网白银市平川区供电公司", "白银市平川区供电公司", "平川区供电公司", "平川区公司"],
},
OrgUnit {
label: "白银客户服务中心",
code: "6240207",
aliases: &["白银客户服务中心", "白银客服中心"],
},
// ===== 天水供电公司 children (lv=3) =====
OrgUnit {
label: "国网天水市秦州区供电公司",
code: "6240323",
aliases: &["国网天水市秦州区供电公司", "天水市秦州区供电公司", "秦州区供电公司", "秦州区公司"],
},
OrgUnit {
label: "秦州区供电公司",
code: "6240323",
aliases: &["秦州区供电公司", "秦州区公司"],
},
OrgUnit {
label: "国网天水市麦积区供电公司",
code: "6240305",
aliases: &["国网天水市麦积区供电公司", "天水市麦积区供电公司", "麦积区供电公司", "麦积区公司"],
},
OrgUnit {
label: "国网麦积区供电公司",
code: "6240305",
aliases: &["国网麦积区供电公司", "麦积区供电公司", "麦积区公司"],
},
OrgUnit {
label: "国网武山县供电公司",
code: "6240321",
aliases: &["国网武山县供电公司", "武山县供电公司", "武山县公司"],
},
OrgUnit {
label: "武山县供电公司",
code: "6240321",
aliases: &["武山县供电公司", "武山县公司"],
},
OrgUnit {
label: "国网甘谷县供电公司",
code: "6240322",
aliases: &["国网甘谷县供电公司", "甘谷县供电公司", "甘谷县公司"],
},
OrgUnit {
label: "甘谷县供电公司",
code: "6240322",
aliases: &["甘谷县供电公司", "甘谷县公司"],
},
OrgUnit {
label: "国网秦安县供电公司",
code: "6240324",
aliases: &["国网秦安县供电公司", "秦安县供电公司", "秦安县公司"],
},
OrgUnit {
label: "清水县供电公司",
code: "6240325",
aliases: &["清水县供电公司", "清水县公司"],
},
OrgUnit {
label: "张家川县供电公司",
code: "6240326",
aliases: &["张家川县供电公司", "张家川县公司"],
},
OrgUnit {
label: "天水客户服务中心",
code: "6240306",
aliases: &["天水客户服务中心", "天水客服中心"],
},
// ===== 平凉供电公司 children (lv=3) =====
OrgUnit {
label: "国网崇信县供电公司",
code: "6240401",
aliases: &["国网崇信县供电公司", "崇信县供电公司", "崇信县公司"],
},
OrgUnit {
label: "国网庄浪县供电公司",
code: "6240402",
aliases: &["国网庄浪县供电公司", "庄浪县供电公司", "庄浪县公司"],
},
OrgUnit {
label: "国网泾川县供电公司",
code: "6240403",
aliases: &["国网泾川县供电公司", "泾川县供电公司", "泾川县公司"],
},
OrgUnit {
label: "国网静宁县供电公司",
code: "6240404",
aliases: &["国网静宁县供电公司", "静宁县供电公司", "静宁县公司"],
},
OrgUnit {
label: "国网崆峒区供电公司",
code: "6240405",
aliases: &["国网崆峒区供电公司", "崆峒区供电公司", "崆峒区公司"],
},
OrgUnit {
label: "国网华亭市公司",
code: "6240407",
aliases: &["国网华亭市公司", "华亭市公司"],
},
OrgUnit {
label: "国网灵台县供电公司",
code: "6240408",
aliases: &["国网灵台县供电公司", "灵台县供电公司", "灵台县公司"],
},
// ===== 金昌供电公司 children (lv=3) =====
OrgUnit {
label: "金川区供电公司",
code: "6240522",
aliases: &["金川区供电公司", "金川区公司"],
},
OrgUnit {
label: "国网永昌县供电公司",
code: "6240523",
aliases: &["国网永昌县供电公司", "永昌县供电公司", "永昌县公司"],
},
OrgUnit {
label: "城区供电服务中心",
code: "6240505",
aliases: &["城区供电服务中心"],
},
OrgUnit {
label: "金昌客户服务中心",
code: "6240507",
aliases: &["金昌客户服务中心", "金昌客服中心"],
},
// ===== 张掖供电公司 children (lv=3) =====
OrgUnit {
label: "国网甘州区供电公司",
code: "6240621",
aliases: &["国网甘州区供电公司", "甘州区供电公司", "甘州区公司"],
},
OrgUnit {
label: "肃南县供电公司",
code: "6240622",
aliases: &["肃南县供电公司", "肃南县公司"],
},
OrgUnit {
label: "国网高台县供电公司",
code: "6240623",
aliases: &["国网高台县供电公司", "高台县供电公司", "高台县公司"],
},
OrgUnit {
label: "国网山丹县供电公司",
code: "6240624",
aliases: &["国网山丹县供电公司", "山丹县供电公司", "山丹县公司"],
},
OrgUnit {
label: "国网民乐县供电公司",
code: "6240625",
aliases: &["国网民乐县供电公司", "民乐县供电公司", "民乐县公司"],
},
OrgUnit {
label: "国网临泽县供电公司",
code: "6240626",
aliases: &["国网临泽县供电公司", "临泽县供电公司", "临泽县公司"],
},
// ===== 陇南供电公司 children (lv=3) =====
OrgUnit {
label: "国网武都区供电公司",
code: "6240701",
aliases: &["国网武都区供电公司", "武都区供电公司", "武都区公司"],
},
OrgUnit {
label: "国网宕昌县供电公司",
code: "6240702",
aliases: &["国网宕昌县供电公司", "宕昌县供电公司", "宕昌县公司"],
},
OrgUnit {
label: "国网文县供电公司",
code: "6240703",
aliases: &["国网文县供电公司", "文县供电公司", "文县公司"],
},
OrgUnit {
label: "国网康县供电公司",
code: "6240704",
aliases: &["国网康县供电公司", "康县供电公司", "康县公司"],
},
OrgUnit {
label: "国网西和县供电公司",
code: "6240705",
aliases: &["国网西和县供电公司", "西和县供电公司", "西和县公司"],
},
OrgUnit {
label: "国网礼县供电公司",
code: "6240706",
aliases: &["国网礼县供电公司", "礼县供电公司", "礼县公司"],
},
OrgUnit {
label: "国网成县供电公司",
code: "6240707",
aliases: &["国网成县供电公司", "成县供电公司", "成县公司"],
},
OrgUnit {
label: "国网徽县供电公司",
code: "6240708",
aliases: &["国网徽县供电公司", "徽县供电公司", "徽县公司"],
},
OrgUnit {
label: "国网两当县供电公司",
code: "6240709",
aliases: &["国网两当县供电公司", "两当县供电公司", "两当县公司"],
},
// ===== 定西供电公司 children (lv=3) =====
OrgUnit {
label: "国网定西市安定区供电公司",
code: "6240801",
aliases: &["国网定西市安定区供电公司", "定西市安定区供电公司", "安定区供电公司", "安定区公司"],
},
OrgUnit {
label: "国网通渭县供电公司",
code: "6240802",
aliases: &["国网通渭县供电公司", "通渭县供电公司", "通渭县公司"],
},
OrgUnit {
label: "国网陇西县供电公司",
code: "6240803",
aliases: &["国网陇西县供电公司", "陇西县供电公司", "陇西县公司"],
},
OrgUnit {
label: "国网渭源县供电公司",
code: "6240804",
aliases: &["国网渭源县供电公司", "渭源县供电公司", "渭源县公司"],
},
OrgUnit {
label: "国网临洮县供电公司",
code: "6240805",
aliases: &["国网临洮县供电公司", "临洮县供电公司", "临洮县公司"],
},
OrgUnit {
label: "国网漳县供电公司",
code: "6240806",
aliases: &["国网漳县供电公司", "漳县供电公司", "漳县公司"],
},
OrgUnit {
label: "国网岷县供电公司",
code: "6240807",
aliases: &["国网岷县供电公司", "岷县供电公司", "岷县公司"],
},
// ===== 庆阳供电公司 children (lv=3) =====
OrgUnit {
label: "西峰区供电公司",
code: "6240901",
aliases: &["西峰区供电公司", "西峰区公司"],
},
OrgUnit {
label: "国网庆城县供电公司",
code: "6240902",
aliases: &["国网庆城县供电公司", "庆城县供电公司", "庆城县公司"],
},
OrgUnit {
label: "国网正宁县供电公司",
code: "6240903",
aliases: &["国网正宁县供电公司", "正宁县供电公司", "正宁县公司"],
},
OrgUnit {
label: "国网镇原县供电公司",
code: "6240904",
aliases: &["国网镇原县供电公司", "镇原县供电公司", "镇原县公司"],
},
OrgUnit {
label: "国网环县供电公司",
code: "6240905",
aliases: &["国网环县供电公司", "环县供电公司", "环县公司"],
},
OrgUnit {
label: "国网华池县供电公司",
code: "6240906",
aliases: &["国网华池县供电公司", "华池县供电公司", "华池县公司"],
},
OrgUnit {
label: "国网合水县供电公司",
code: "6240907",
aliases: &["国网合水县供电公司", "合水县供电公司", "合水县公司"],
},
OrgUnit {
label: "国网宁县供电公司",
code: "6240908",
aliases: &["国网宁县供电公司", "宁县供电公司", "宁县公司"],
},
// ===== 武威供电公司 children (lv=3) =====
OrgUnit {
label: "国网古浪县供电公司",
code: "6241001",
aliases: &["国网古浪县供电公司", "古浪县供电公司", "古浪县公司"],
},
OrgUnit {
label: "国网凉州区供电公司",
code: "6241002",
aliases: &["国网凉州区供电公司", "凉州区供电公司", "凉州区公司"],
},
OrgUnit {
label: "国网民勤县供电公司",
code: "6241003",
aliases: &["国网民勤县供电公司", "民勤县供电公司", "民勤县公司"],
},
OrgUnit {
label: "国网天祝县供电公司",
code: "6241004",
aliases: &["国网天祝县供电公司", "天祝县供电公司", "天祝县公司"],
},
// ===== 酒泉供电公司 children (lv=3) =====
OrgUnit {
label: "国网酒泉市肃州区供电公司",
code: "6241101",
aliases: &["国网酒泉市肃州区供电公司", "酒泉市肃州区供电公司", "肃州区供电公司", "肃州区公司"],
},
OrgUnit {
label: "国网金塔县供电公司",
code: "6241102",
aliases: &["国网金塔县供电公司", "金塔县供电公司", "金塔县公司"],
},
OrgUnit {
label: "国网玉门市供电公司",
code: "6241103",
aliases: &["国网玉门市供电公司", "玉门市供电公司", "玉门市公司"],
},
OrgUnit {
label: "国网瓜州县供电公司",
code: "6241104",
aliases: &["国网瓜州县供电公司", "瓜州县供电公司", "瓜州县公司"],
},
OrgUnit {
label: "国网敦煌市供电公司",
code: "6241105",
aliases: &["国网敦煌市供电公司", "敦煌市供电公司", "敦煌市公司"],
},
OrgUnit {
label: "国网肃北县供电公司",
code: "6241106",
aliases: &["国网肃北县供电公司", "肃北县供电公司", "肃北县公司"],
},
OrgUnit {
label: "国网阿克塞县供电公司",
code: "6241107",
aliases: &["国网阿克塞县供电公司", "阿克塞县供电公司", "阿克塞县公司"],
},
// ===== 临夏供电公司 children (lv=3) =====
OrgUnit {
label: "临夏市城关营业班",
code: "6241201",
aliases: &["临夏市城关营业班"],
},
OrgUnit {
label: "国网临夏县供电公司",
code: "6241202",
aliases: &["国网临夏县供电公司", "临夏县供电公司", "临夏县公司"],
},
OrgUnit {
label: "国网东乡县供电公司",
code: "6241203",
aliases: &["国网东乡县供电公司", "东乡县供电公司", "东乡县公司"],
},
OrgUnit {
label: "国网和政县供电公司",
code: "6241204",
aliases: &["国网和政县供电公司", "和政县供电公司", "和政县公司"],
},
OrgUnit {
label: "国网广河县供电公司",
code: "6241205",
aliases: &["国网广河县供电公司", "广河县供电公司", "广河县公司"],
},
OrgUnit {
label: "国网积石山县供电公司",
code: "6241206",
aliases: &["国网积石山县供电公司", "积石山县供电公司", "积石山县公司"],
},
OrgUnit {
label: "国网康乐县供电公司",
code: "6241207",
aliases: &["国网康乐县供电公司", "康乐县供电公司", "康乐县公司"],
},
// ===== 甘南供电公司 children (lv=3) =====
OrgUnit {
label: "国网合作市供电公司",
code: "6241301",
aliases: &["国网合作市供电公司", "合作市供电公司", "合作市公司"],
},
OrgUnit {
label: "国网夏河县供电公司",
code: "6241302",
aliases: &["国网夏河县供电公司", "夏河县供电公司", "夏河县公司"],
},
OrgUnit {
label: "国网卓尼县供电公司",
code: "6241303",
aliases: &["国网卓尼县供电公司", "卓尼县供电公司", "卓尼县公司"],
},
OrgUnit {
label: "国网临潭县供电公司",
code: "6241304",
aliases: &["国网临潭县供电公司", "临潭县供电公司", "临潭县公司"],
},
OrgUnit {
label: "国网碌曲县供电公司",
code: "6241305",
aliases: &["国网碌曲县供电公司", "碌曲县供电公司", "碌曲县公司"],
},
OrgUnit {
label: "国网玛曲县供电公司",
code: "6241306",
aliases: &["国网玛曲县供电公司", "玛曲县供电公司", "玛曲县公司"],
},
OrgUnit {
label: "国网迭部县供电公司",
code: "6241307",
aliases: &["国网迭部县供电公司", "迭部县供电公司", "迭部县公司"],
},
OrgUnit {
label: "国网舟曲县供电公司",
code: "6241308",
aliases: &["国网舟曲县供电公司", "舟曲县供电公司", "舟曲县公司"],
},
];

View File

@@ -1,5 +1,7 @@
mod settings;
pub use crate::runtime::RuntimeProfile;
pub use settings::{
BrowserBackend, ConfigError, DeepSeekSettings, OfficeBackend, PlannerMode, ProviderSettings,
SgClawSettings, SkillsPromptMode,

View File

@@ -1,6 +1,6 @@
use std::path::{Path, PathBuf};
use serde::Deserialize;
use serde::{Deserialize, Serialize};
use thiserror::Error;
use crate::runtime::RuntimeProfile;
@@ -200,6 +200,65 @@ impl SgClawSettings {
.expect("active_provider should always resolve to a configured provider")
}
fn to_serializable(&self) -> SerializableRawSgClawSettings {
SerializableRawSgClawSettings {
api_key: self.provider_api_key.clone(),
base_url: self.provider_base_url.clone(),
model: self.provider_model.clone(),
skills_dir: self.skills_dir.as_ref().map(|p| p.to_string_lossy().into_owned()),
direct_submit_skill: self.direct_submit_skill.clone(),
skills_prompt_mode: Some(match self.skills_prompt_mode {
SkillsPromptMode::Full => "full".to_string(),
SkillsPromptMode::Compact => "compact".to_string(),
}),
runtime_profile: Some(match self.runtime_profile {
RuntimeProfile::BrowserAttached => "browser-attached".to_string(),
RuntimeProfile::BrowserHeavy => "browser-heavy".to_string(),
RuntimeProfile::GeneralAssistant => "general-assistant".to_string(),
}),
planner_mode: Some(match self.planner_mode {
PlannerMode::ZeroclawPlanFirst => "zeroclaw-plan-first".to_string(),
PlannerMode::LegacyDeterministic => "legacy-deterministic".to_string(),
}),
active_provider: Some(self.active_provider.clone()),
browser_backend: Some(match self.browser_backend {
BrowserBackend::SuperRpa => "super-rpa".to_string(),
BrowserBackend::AgentBrowser => "agent-browser".to_string(),
BrowserBackend::RustNative => "rust-native".to_string(),
BrowserBackend::ComputerUse => "computer-use".to_string(),
BrowserBackend::Auto => "auto".to_string(),
}),
office_backend: Some(match self.office_backend {
OfficeBackend::OpenXml => "openxml".to_string(),
OfficeBackend::Disabled => "disabled".to_string(),
}),
browser_ws_url: self.browser_ws_url.clone(),
service_ws_listen_addr: self.service_ws_listen_addr.clone(),
providers: self
.providers
.iter()
.map(|p| SerializableProviderSettings {
id: p.id.clone(),
provider: Some(p.provider.clone()),
api_key: p.api_key.clone(),
base_url: p.base_url.clone(),
model: p.model.clone(),
api_path: p.api_path.clone(),
wire_api: p.wire_api.clone(),
requires_openai_auth: p.requires_openai_auth,
})
.collect(),
}
}
pub fn save_to_path(&self, path: &Path) -> Result<(), ConfigError> {
let serializable = self.to_serializable();
let json = serde_json::to_string_pretty(&serializable)
.map_err(|err| ConfigError::ConfigParse(path.to_path_buf(), err.to_string()))?;
std::fs::write(path, json)
.map_err(|err| ConfigError::ConfigRead(path.to_path_buf(), err.to_string()))
}
fn maybe_from_env() -> Result<Option<Self>, ConfigError> {
let api_key = match std::env::var("DEEPSEEK_API_KEY") {
Ok(value) => value,
@@ -529,6 +588,54 @@ fn normalize_enum_token(raw: &str) -> String {
.to_ascii_lowercase()
}
#[derive(Debug, Serialize)]
struct SerializableRawSgClawSettings {
#[serde(rename = "apiKey")]
api_key: String,
#[serde(rename = "baseUrl")]
base_url: String,
model: String,
#[serde(rename = "skillsDir", skip_serializing_if = "Option::is_none")]
skills_dir: Option<String>,
#[serde(rename = "directSubmitSkill", skip_serializing_if = "Option::is_none")]
direct_submit_skill: Option<String>,
#[serde(rename = "skillsPromptMode", skip_serializing_if = "Option::is_none")]
skills_prompt_mode: Option<String>,
#[serde(rename = "runtimeProfile", skip_serializing_if = "Option::is_none")]
runtime_profile: Option<String>,
#[serde(rename = "plannerMode", skip_serializing_if = "Option::is_none")]
planner_mode: Option<String>,
#[serde(rename = "activeProvider", skip_serializing_if = "Option::is_none")]
active_provider: Option<String>,
#[serde(rename = "browserBackend", skip_serializing_if = "Option::is_none")]
browser_backend: Option<String>,
#[serde(rename = "officeBackend", skip_serializing_if = "Option::is_none")]
office_backend: Option<String>,
#[serde(rename = "browserWsUrl", skip_serializing_if = "Option::is_none")]
browser_ws_url: Option<String>,
#[serde(rename = "serviceWsListenAddr", skip_serializing_if = "Option::is_none")]
service_ws_listen_addr: Option<String>,
#[serde(default)]
providers: Vec<SerializableProviderSettings>,
}
#[derive(Debug, Serialize)]
struct SerializableProviderSettings {
id: String,
provider: Option<String>,
#[serde(rename = "apiKey")]
api_key: String,
#[serde(rename = "baseUrl", skip_serializing_if = "Option::is_none")]
base_url: Option<String>,
model: String,
#[serde(rename = "apiPath", skip_serializing_if = "Option::is_none")]
api_path: Option<String>,
#[serde(rename = "wireApi", skip_serializing_if = "Option::is_none")]
wire_api: Option<String>,
#[serde(rename = "requiresOpenaiAuth")]
requires_openai_auth: bool,
}
#[derive(Debug, Deserialize)]
struct RawSgClawSettings {
#[serde(rename = "apiKey", default)]

View File

@@ -7,14 +7,15 @@ use std::sync::Arc;
use tungstenite::accept;
use crate::agent::AgentRuntimeContext;
use crate::browser::callback_host::LiveBrowserCallbackHost;
use crate::pipe::PipeError;
use crate::security::MacPolicy;
const DEFAULT_BROWSER_WS_URL: &str = "ws://127.0.0.1:12345";
const DEFAULT_SERVICE_WS_LISTEN_ADDR: &str = "127.0.0.1:42321";
pub use protocol::{ClientMessage, ServiceMessage};
pub use server::{serve_client, ServiceEventSink, ServiceSession};
pub use protocol::{ClientMessage, ConfigUpdatePayload, ServiceMessage};
pub use server::{ServiceEventSink, ServiceSession};
pub(crate) mod browser_ws_client {
pub(crate) use super::server::{initial_request_url_for_submit_task, ServiceWsClient};
@@ -69,6 +70,11 @@ pub fn run() -> Result<(), PipeError> {
browser_ws_url,
);
// Cache the browser callback host across client sessions so the helper
// page tab is opened only once per process lifetime instead of once per
// WebSocket reconnection.
let mut cached_host: Option<Arc<LiveBrowserCallbackHost>> = None;
loop {
let (stream, _) = listener.accept()?;
let websocket = accept(stream)
@@ -76,12 +82,13 @@ pub fn run() -> Result<(), PipeError> {
let sink = Arc::new(ServiceEventSink::from_websocket(websocket));
match session.try_attach_client() {
Ok(()) => {
let result = serve_client(
let result = server::serve_client(
&runtime_context,
&session,
sink.clone(),
browser_ws_url,
&mac_policy,
&mut cached_host,
);
session.detach_client();
match result {

View File
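The `cached_host` lift in the hunk above is the usual pattern of hoisting an `Option` cache out of the accept loop and threading it down by `&mut`, so the helper page opened for one client session survives WebSocket reconnections. A minimal std-only sketch of that shape (names like `HelperHost` are illustrative stand-ins, not the real `LiveBrowserCallbackHost` API):

```rust
use std::sync::Arc;

// Stand-in for LiveBrowserCallbackHost; the real type opens a helper page.
struct HelperHost;

// Hoisted cache: the Option lives outside the accept loop and is passed
// down by &mut, so the host created for one client session is reused by
// the next instead of opening a duplicate helper page.
fn serve_client(cached: &mut Option<Arc<HelperHost>>, created: &mut u32) {
    if cached.is_none() {
        *cached = Some(Arc::new(HelperHost));
        *created += 1;
    }
}

fn main() {
    let mut cached: Option<Arc<HelperHost>> = None;
    let mut created = 0;
    serve_client(&mut cached, &mut created); // first connection creates it
    serve_client(&mut cached, &mut created); // reconnect reuses it
    assert_eq!(created, 1);
    println!("helper host created {created} time(s) across 2 sessions");
}
```

Within a single process lifetime this gives the "opened only once" behavior the comment describes; cleanup across process restarts is handled separately by the close-before-open commit.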

@@ -3,6 +3,24 @@ use serde::{Deserialize, Serialize};
use crate::agent::SubmitTaskRequest;
use crate::pipe::ConversationMessage;
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub struct ConfigUpdatePayload {
#[serde(rename = "apiKey", default)]
pub api_key: Option<String>,
#[serde(rename = "baseUrl", default)]
pub base_url: Option<String>,
#[serde(default)]
pub model: Option<String>,
#[serde(rename = "skillsDir", default)]
pub skills_dir: Option<String>,
#[serde(rename = "directSubmitSkill", default)]
pub direct_submit_skill: Option<String>,
#[serde(rename = "runtimeProfile", default)]
pub runtime_profile: Option<String>,
#[serde(rename = "browserBackend", default)]
pub browser_backend: Option<String>,
}
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
#[serde(tag = "type", rename_all = "snake_case")]
pub enum ClientMessage {
@@ -21,6 +39,9 @@ pub enum ClientMessage {
page_title: String,
},
Ping,
UpdateConfig {
config: ConfigUpdatePayload,
},
}
impl ClientMessage {
@@ -39,7 +60,11 @@ impl ClientMessage {
page_url: normalize_optional_field(page_url),
page_title: normalize_optional_field(page_title),
}),
ClientMessage::Connect | ClientMessage::Start | ClientMessage::Stop | ClientMessage::Ping => None,
ClientMessage::Connect
| ClientMessage::Start
| ClientMessage::Stop
| ClientMessage::Ping
| ClientMessage::UpdateConfig { .. } => None,
}
}
}
@@ -52,6 +77,10 @@ pub enum ServiceMessage {
TaskComplete { success: bool, summary: String },
Busy { message: String },
Pong,
ConfigUpdated {
success: bool,
message: String,
},
}
fn normalize_optional_field(value: String) -> Option<String> {

View File
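Taken together, `UpdateConfig` and `ConfigUpdated` form a simple request/reply pair over the service WebSocket. The sketch below builds the wire frames by hand, following the snake_case `type` tags and camelCase field renames in the serde attributes above; the key, URL, and model values are placeholders, not real configuration:

```rust
// Illustrative wire frames for the update_config round trip. Field names
// mirror the serde renames in ConfigUpdatePayload / ServiceMessage; the
// apiKey, baseUrl, and model values are placeholders.
fn update_config_frame() -> String {
    concat!(
        r#"{"type":"update_config","config":{"#,
        r#""apiKey":"sk-placeholder","#,
        r#""baseUrl":"https://api.example.com","#,
        r#""model":"deepseek-chat"}}"#
    )
    .to_string()
}

fn config_updated_frame(success: bool) -> String {
    format!(r#"{{"type":"config_updated","success":{success},"message":"..."}}"#)
}

fn main() {
    // Client -> service, then service -> client.
    println!("{}", update_config_frame());
    println!("{}", config_updated_frame(true));
}
```

Omitted optional fields such as `skillsDir` simply deserialize to `None` on the service side, which is what the `#[serde(default)]` attributes provide.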

@@ -1,4 +1,6 @@
use std::net::TcpStream;
use std::path::Path;
use std::path::PathBuf;
use std::sync::{Arc, Mutex};
use std::time::Duration;
@@ -229,17 +231,60 @@ fn send_status_changed(sink: &ServiceEventSink, state: &str) -> Result<(), PipeE
})
}
pub fn serve_client(
fn update_config_file(config_path: &Path, config: crate::service::protocol::ConfigUpdatePayload) -> Result<(), String> {
use crate::config::{BrowserBackend, RuntimeProfile, SgClawSettings};
let mut settings = SgClawSettings::load(Some(config_path))
.map_err(|e| e.to_string())?
.ok_or_else(|| "无法读取现有配置".to_string())?;
if let Some(v) = config.api_key {
settings.provider_api_key = v;
}
if let Some(v) = config.base_url {
settings.provider_base_url = v;
}
if let Some(v) = config.model {
settings.provider_model = v;
}
if let Some(v) = config.skills_dir {
settings.skills_dir = Some(PathBuf::from(&v));
}
if let Some(v) = config.direct_submit_skill {
settings.direct_submit_skill = Some(v);
}
if let Some(v) = config.runtime_profile {
settings.runtime_profile = match v.as_str() {
"browser-attached" => RuntimeProfile::BrowserAttached,
"browser-heavy" => RuntimeProfile::BrowserHeavy,
"general-assistant" => RuntimeProfile::GeneralAssistant,
_ => return Err(format!("无效的 runtimeProfile: {}", v)),
};
}
if let Some(v) = config.browser_backend {
settings.browser_backend = match v.as_str() {
"super-rpa" => BrowserBackend::SuperRpa,
"agent-browser" => BrowserBackend::AgentBrowser,
"rust-native" => BrowserBackend::RustNative,
"computer-use" => BrowserBackend::ComputerUse,
"auto" => BrowserBackend::Auto,
_ => return Err(format!("无效的 browserBackend: {}", v)),
};
}
settings
.save_to_path(config_path)
.map_err(|e| format!("写入配置文件失败: {}", e))
}
pub(crate) fn serve_client(
context: &AgentRuntimeContext,
session: &ServiceSession,
sink: Arc<ServiceEventSink>,
browser_ws_url: &str,
mac_policy: &MacPolicy,
cached_host: &mut Option<Arc<LiveBrowserCallbackHost>>,
) -> Result<(), PipeError> {
// Cache the browser callback host across tasks so the helper page tab is
// opened only once per client session instead of once per task.
let mut cached_host: Option<Arc<LiveBrowserCallbackHost>> = None;
loop {
let Some(message) = sink.recv_client_message()? else {
return Ok(());
@@ -290,9 +335,10 @@ pub fn serve_client(
&bootstrap_url,
Duration::from_secs(15),
BROWSER_RESPONSE_TIMEOUT,
true, // use_hidden_domain: hidden domain for invisible helper
) {
Ok(host) => {
cached_host = Some(Arc::new(host));
*cached_host = Some(Arc::new(host));
}
Err(err) => {
session.finish_task();
@@ -336,6 +382,39 @@ pub fn serve_client(
}
}
}
ClientMessage::UpdateConfig { config } => {
let Some(config_path) = context.config_path() else {
sink.send_service_message(ServiceMessage::ConfigUpdated {
success: false,
message: "未找到配置文件路径。请通过 --config-path 参数启动 sg_claw 后再使用此功能。".to_string(),
})?;
continue;
};
if !config_path.exists() {
sink.send_service_message(ServiceMessage::ConfigUpdated {
success: false,
message: format!("配置文件不存在: {}", config_path.display()),
})?;
continue;
}
let result = update_config_file(config_path, config);
match result {
Ok(()) => {
sink.send_service_message(ServiceMessage::ConfigUpdated {
success: true,
message: "配置已保存。重启 sg_claw 以应用新配置。".to_string(),
})?;
}
Err(err) => {
sink.send_service_message(ServiceMessage::ConfigUpdated {
success: false,
message: format!("保存配置失败: {}", err),
})?;
}
}
}
}
}
}
@@ -378,6 +457,12 @@ fn derive_request_url_from_instruction(instruction: &str) -> Option<String> {
return Some("https://zhuanlan.zhihu.com".to_string());
}
// Station-area line loss related
// TODO: temporary workaround; should later come from skill config or the deterministic_submit parse result
if instruction.contains("线损") || instruction.contains("lineloss") {
return Some("http://20.76.57.61:18080".to_string());
}
None
}
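The new lineloss fallback above is a plain substring heuristic. A self-contained sketch of just that branch (only the origin hard-coded in the diff; the zhihu branch from the surrounding context is omitted):

```rust
/// Sketch of the lineloss branch of derive_request_url_from_instruction:
/// a substring match on either the Chinese term or the English keyword
/// falls back to the lineloss origin hard-coded in the diff.
fn lineloss_fallback_url(instruction: &str) -> Option<String> {
    if instruction.contains("线损") || instruction.contains("lineloss") {
        return Some("http://20.76.57.61:18080".to_string());
    }
    None
}

fn main() {
    assert_eq!(
        lineloss_fallback_url("兰州公司 台区线损统计"),
        Some("http://20.76.57.61:18080".to_string())
    );
    assert_eq!(lineloss_fallback_url("unrelated instruction"), None);
    println!("lineloss fallback checks passed");
}
```

As the TODO notes, this is a stopgap until the URL can be resolved from skill configuration or the deterministic submit path.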
@@ -813,6 +898,19 @@ mod tests {
);
}
#[test]
fn initial_request_url_falls_back_to_lineloss_origin_for_lineloss_instructions() {
let request = SubmitTaskRequest {
instruction: "兰州公司 台区线损大数据 月累计线损率统计分析。。。".to_string(),
..SubmitTaskRequest::default()
};
assert_eq!(
initial_request_url_for_submit_task(&request),
"http://20.76.57.61:18080"
);
}
#[test]
fn bridge_base_url_defaults_local_browser_ws_endpoint_to_http_bridge() {
assert_eq!(

View File

@@ -105,7 +105,8 @@ async fn execute_browser_script_tool_runs_packaged_script_with_expected_domain()
..
} if action == &Action::Eval
&& security.expected_domain == "www.zhihu.com"
&& params["script"].as_str().unwrap().contains("const args = {\"top_n\":\"10\"};")
&& params["script"].as_str().unwrap().contains("\"expected_domain\":\"www.zhihu.com\"")
&& params["script"].as_str().unwrap().contains("\"top_n\":\"10\"")
&& params["script"].as_str().unwrap().contains("source: \"packaged script\"")
));
}
@@ -278,7 +279,8 @@ return {
..
} if action == &Action::Eval
&& security.expected_domain == "www.zhihu.com"
&& params["script"].as_str().unwrap().contains("const args = {\"top_n\":\"10\"};")
&& params["script"].as_str().unwrap().contains("\"expected_domain\":\"www.zhihu.com\"")
&& params["script"].as_str().unwrap().contains("\"top_n\":\"10\"")
&& params["script"].as_str().unwrap().contains("return {")
));
}
@@ -360,7 +362,8 @@ return {
..
} if action == &Action::Eval
&& security.expected_domain == "www.zhihu.com"
&& params["script"].as_str().unwrap().contains("const args = {\"top_n\":\"10条\"};")
&& params["script"].as_str().unwrap().contains("\"expected_domain\":\"www.zhihu.com\"")
&& params["script"].as_str().unwrap().contains("\"top_n\":\"10条\"")
&& params["script"].as_str().unwrap().contains("rows: [[1, \"标题\", args.top_n]]")
));
}
@@ -444,7 +447,8 @@ return {
..
} if action == &Action::Eval
&& security.expected_domain == "www.zhihu.com"
&& params["script"].as_str().unwrap().contains("const args = {\"period\":\"2026-04\"};")
&& params["script"].as_str().unwrap().contains("\"expected_domain\":\"www.zhihu.com\"")
&& params["script"].as_str().unwrap().contains("\"period\":\"2026-04\"")
&& params["script"].as_str().unwrap().contains("sheet_name")
));
}
@@ -610,6 +614,77 @@ return {
);
}
#[tokio::test]
async fn execute_browser_script_tool_awaits_async_script() {
let skill_dir = unique_temp_dir("sgclaw-browser-script-async");
let scripts_dir = skill_dir.join("scripts");
fs::create_dir_all(&scripts_dir).unwrap();
// Async script that returns a Promise
fs::write(
scripts_dir.join("async_extract.js"),
"return (async function() { return { async: true, args: args }; })();\n",
)
.unwrap();
let transport = Arc::new(MockTransport::new(vec![BrowserMessage::Response {
seq: 1,
success: true,
data: json!({
"text": {
"async": true,
"args": { "expected_domain": "example.com" }
}
}),
aom_snapshot: vec![],
timing: Timing {
queue_ms: 1,
exec_ms: 5,
},
}]));
let policy_json = MacPolicy::from_json_str(
r#"{
"version": "1.0",
"domains": { "allowed": ["www.zhihu.com", "example.com"] },
"pipe_actions": {
"allowed": ["click", "type", "navigate", "getText", "eval"],
"blocked": []
}
}"#,
)
.unwrap();
let browser_tool = BrowserPipeTool::new(
transport.clone(),
policy_json,
vec![1, 2, 3, 4, 5, 6, 7, 8],
)
.with_response_timeout(Duration::from_secs(1));
let skill_tool = SkillTool {
name: "async_extract".to_string(),
description: "Extract data asynchronously".to_string(),
kind: "browser_script".to_string(),
command: "scripts/async_extract.js".to_string(),
args: HashMap::new(),
};
let result = execute_browser_script_tool(
&skill_tool,
&skill_dir,
&PipeBrowserBackend::from_inner(browser_tool),
json!({
"expected_domain": "example.com"
}),
)
.await
.unwrap();
assert!(result.success);
let output = serde_json::from_str::<serde_json::Value>(&result.output).unwrap();
assert_eq!(output["async"], true);
}
fn unique_temp_dir(prefix: &str) -> PathBuf {
let nanos = SystemTime::now()
.duration_since(UNIX_EPOCH)

View File

@@ -0,0 +1,105 @@
use std::fs;
use std::path::PathBuf;
use serde_json::json;
use sgclaw::compat::lineloss_xlsx_export::{export_lineloss_xlsx, LinelossExportRequest};
fn temp_output_path(name: &str) -> PathBuf {
let dir = std::env::temp_dir().join("sgclaw-test-xlsx");
fs::create_dir_all(&dir).unwrap();
dir.join(name)
}
#[test]
fn export_month_lineloss_produces_valid_xlsx() {
let output_path = temp_output_path("month-test.xlsx");
if output_path.exists() {
fs::remove_file(&output_path).unwrap();
}
let request = LinelossExportRequest {
sheet_name: "国网兰州供电公司月度线损分析报表(2026-03)".to_string(),
column_defs: vec![
("ORG_NAME".to_string(), "供电单位".to_string()),
("YGDL".to_string(), "累计供电量".to_string()),
("YYDL".to_string(), "累计售电量".to_string()),
("YXSL".to_string(), "线损完成率(%)".to_string()),
("RAT_SCOPE".to_string(), "线损率累计目标值".to_string()),
("BLANK3".to_string(), "目标完成率".to_string()),
("BLANK2".to_string(), "排行".to_string()),
],
rows: vec![
serde_json::from_value(json!({
"ORG_NAME": "城关供电",
"YGDL": "12345.67",
"YYDL": "11234.56",
"YXSL": "9.00",
"RAT_SCOPE": "9.50",
"BLANK3": "94.74",
"BLANK2": "1"
}))
.unwrap(),
serde_json::from_value(json!({
"ORG_NAME": "七里河供电",
"YGDL": "9876.54",
"YYDL": "8765.43",
"YXSL": "11.24",
"RAT_SCOPE": "10.00",
"BLANK3": "112.40",
"BLANK2": "2"
}))
.unwrap(),
],
output_path: output_path.clone(),
};
let result_path = export_lineloss_xlsx(&request).unwrap();
assert_eq!(result_path, output_path);
assert!(output_path.exists());
// Verify it's a valid ZIP (xlsx is a zip archive)
let file = fs::File::open(&output_path).unwrap();
let mut archive = zip::ZipArchive::new(file).unwrap();
// Must contain the standard OpenXML entries
let entry_names: Vec<String> = (0..archive.len())
.map(|i| archive.by_index(i).unwrap().name().to_string())
.collect();
assert!(entry_names.contains(&"[Content_Types].xml".to_string()));
assert!(entry_names.contains(&"xl/worksheets/sheet1.xml".to_string()));
assert!(entry_names.contains(&"xl/workbook.xml".to_string()));
// Read sheet1.xml and verify it contains our data
let mut sheet = archive.by_name("xl/worksheets/sheet1.xml").unwrap();
let mut xml = String::new();
std::io::Read::read_to_string(&mut sheet, &mut xml).unwrap();
assert!(xml.contains("供电单位"), "header row should contain 供电单位");
assert!(xml.contains("累计供电量"), "header row should contain 累计供电量");
assert!(xml.contains("城关供电"), "data should contain 城关供电");
assert!(xml.contains("12345.67"), "data should contain 12345.67");
assert!(xml.contains("七里河供电"), "data should contain second row");
// Cleanup
fs::remove_file(&output_path).unwrap();
}
#[test]
fn export_empty_rows_returns_error() {
let output_path = temp_output_path("empty-test.xlsx");
let request = LinelossExportRequest {
sheet_name: "test".to_string(),
column_defs: vec![("A".to_string(), "ColA".to_string())],
rows: vec![],
output_path: output_path.clone(),
};
let result = export_lineloss_xlsx(&request);
assert!(result.is_err());
assert!(
result.unwrap_err().to_string().contains("rows must not be empty"),
"should reject empty rows"
);
}

View File

@@ -24,4 +24,14 @@ fn service_console_html_stays_on_service_ws_boundary() {
assert!(!source.contains("/sgclaw/callback/commands/next"));
assert!(!source.contains("/sgclaw/callback/commands/ack"));
assert!(!source.contains("ws://127.0.0.1:12345"));
// Auto-connect and settings enhancement assertions
assert!(source.contains("DOMContentLoaded"));
assert!(source.contains("settingsBtn"));
assert!(source.contains("settingsModal"));
assert!(source.contains("update_config"));
assert!(source.contains("config_updated"));
assert!(source.contains("settingApiKey"));
assert!(source.contains("settingBaseUrl"));
assert!(source.contains("settingModel"));
}

View File

@@ -0,0 +1,72 @@
use sgclaw::service::{ClientMessage, ConfigUpdatePayload, ServiceMessage};
#[test]
fn update_config_serializes_correctly() {
let config = ConfigUpdatePayload {
api_key: Some("test-key".to_string()),
base_url: Some("https://api.example.com".to_string()),
model: Some("test-model".to_string()),
skills_dir: Some("/path/to/skills".to_string()),
direct_submit_skill: Some("my-skill.my-tool".to_string()),
runtime_profile: Some("browser-attached".to_string()),
browser_backend: Some("super-rpa".to_string()),
};
let msg = ClientMessage::UpdateConfig { config };
let json = serde_json::to_string(&msg).unwrap();
assert!(json.contains("\"type\":\"update_config\""));
assert!(json.contains("\"apiKey\":\"test-key\""));
assert!(json.contains("\"baseUrl\":\"https://api.example.com\""));
assert!(json.contains("\"model\":\"test-model\""));
}
#[test]
fn update_config_deserializes_correctly() {
let json = r#"{
"type": "update_config",
"config": {
"apiKey": "key123",
"baseUrl": "https://api.test.com",
"model": "gpt-4"
}
}"#;
let msg: ClientMessage = serde_json::from_str(json).unwrap();
match msg {
ClientMessage::UpdateConfig { config } => {
assert_eq!(config.api_key, Some("key123".to_string()));
assert_eq!(config.base_url, Some("https://api.test.com".to_string()));
assert_eq!(config.model, Some("gpt-4".to_string()));
assert!(config.skills_dir.is_none());
}
_ => panic!("expected UpdateConfig variant"),
}
}
#[test]
fn config_updated_serializes_correctly() {
let msg = ServiceMessage::ConfigUpdated {
success: true,
message: "配置已保存".to_string(),
};
let json = serde_json::to_string(&msg).unwrap();
assert!(json.contains("\"type\":\"config_updated\""));
assert!(json.contains("\"success\":true"));
assert!(json.contains("配置已保存"));
}
#[test]
fn config_updated_deserializes_correctly() {
let json = r#"{"type":"config_updated","success":false,"message":"保存失败"}"#;
let msg: ServiceMessage = serde_json::from_str(json).unwrap();
match msg {
ServiceMessage::ConfigUpdated { success, message } => {
assert!(!success);
assert_eq!(message, "保存失败");
}
_ => panic!("expected ConfigUpdated variant"),
}
}