# Rust-Side Lineloss XLSX Export Implementation Plan

> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Move XLSX export from browser JS (blocked by CORS) to the Rust side, so that `collect_lineloss.js` only collects data and Rust generates the `.xlsx` file locally.

**Architecture:** JS collects API data and returns a `report-artifact` JSON with `rows`, `column_defs`, and metadata. Rust parses the artifact, extracts the rows and column definitions, and generates a standard `.xlsx` file using the `zip` crate plus OpenXML XML strings (the same pattern as `openxml_office_tool.rs`). The report log is deferred.

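For concreteness, here is a minimal sketch of the artifact payload this plan assumes the JS collector emits. The field names mirror what the Rust-side extractor in Task 2 reads (`type`, `status`, `org`, `period`, `columns`, `column_defs`, `rows`); the values and exact shape are illustrative, and the real collector may include additional metadata:

```rust
// Hypothetical report-artifact payload, for illustration only.
fn sample_artifact() -> &'static str {
    r#"{
  "type": "report-artifact",
  "status": "ok",
  "org": { "label": "国网兰州供电公司" },
  "period": { "mode": "month", "value": "2026-03" },
  "columns": ["ORG_NAME", "YGDL"],
  "column_defs": [["ORG_NAME", "供电单位"], ["YGDL", "累计供电量"]],
  "rows": [{ "ORG_NAME": "城关供电", "YGDL": "12345.67" }]
}"#
}

fn main() {
    // The Rust side later parses this with serde_json and drives XLSX generation.
    let artifact = sample_artifact();
    assert!(artifact.contains("report-artifact"));
    assert!(artifact.contains("column_defs"));
    println!("sample artifact is {} bytes", artifact.len());
}
```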
**Tech Stack:** Rust, `zip` 0.6.6, `serde_json`, OpenXML SpreadsheetML, JavaScript (browser-injected)

**Spec:** `docs/superpowers/specs/2026-04-13-rust-side-lineloss-xlsx-export.md`

---

## File Structure

| File | Responsibility |
|------|----------------|
| `src/compat/lineloss_xlsx_export.rs` | **New.** Pure XLSX generation: takes column defs + row data, produces an `.xlsx` file. No business logic. |
| `src/compat/deterministic_submit.rs` | **Modify.** After receiving the JS artifact, extract rows + column_defs, call the XLSX export, and attach the path to the outcome. |
| `src/compat/mod.rs` | **Modify.** Register the `lineloss_xlsx_export` module. |
| `D:/data/ideaSpace/rust/sgClaw/claw/claw/skills/skill_staging/skills/tq-lineloss-report/scripts/collect_lineloss.js` | **Modify.** Remove the `exportWorkbook`/`writeReportLog` calls. Add `column_defs` to the artifact. |
| `tests/lineloss_xlsx_export_test.rs` | **New.** Unit tests for XLSX generation. |

---

### Task 1: Create `lineloss_xlsx_export.rs` with Tests

**Files:**

- Create: `src/compat/lineloss_xlsx_export.rs`
- Create: `tests/lineloss_xlsx_export_test.rs`
- Modify: `src/compat/mod.rs`

- [ ] **Step 1: Register the new module in `src/compat/mod.rs`**

Add the module declaration in alphabetical order. In `src/compat/mod.rs`, insert after `pub mod event_bridge;`:

```rust
pub mod lineloss_xlsx_export;
```

The full file becomes:

```rust
pub mod artifact_open;
pub mod browser_script_skill_tool;
pub mod browser_tool_adapter;
pub mod config_adapter;
pub mod cron_adapter;
pub mod deterministic_submit;
pub mod direct_skill_runtime;
pub mod event_bridge;
pub mod lineloss_xlsx_export;
pub mod memory_adapter;
pub mod openxml_office_tool;
pub mod orchestration;
pub mod runtime;
pub mod screen_html_export_tool;
pub mod tq_lineloss;
pub mod workflow_executor;
```

- [ ] **Step 2: Write the failing test for XLSX generation**

Create `tests/lineloss_xlsx_export_test.rs`:

```rust
use std::fs;
use std::path::PathBuf;

use serde_json::json;
use sgclaw::compat::lineloss_xlsx_export::{export_lineloss_xlsx, LinelossExportRequest};

fn temp_output_path(name: &str) -> PathBuf {
    let dir = std::env::temp_dir().join("sgclaw-test-xlsx");
    fs::create_dir_all(&dir).unwrap();
    dir.join(name)
}

#[test]
fn export_month_lineloss_produces_valid_xlsx() {
    let output_path = temp_output_path("month-test.xlsx");
    if output_path.exists() {
        fs::remove_file(&output_path).unwrap();
    }

    let request = LinelossExportRequest {
        sheet_name: "国网兰州供电公司月度线损分析报表(2026-03)".to_string(),
        column_defs: vec![
            ("ORG_NAME".to_string(), "供电单位".to_string()),
            ("YGDL".to_string(), "累计供电量".to_string()),
            ("YYDL".to_string(), "累计售电量".to_string()),
            ("YXSL".to_string(), "线损完成率(%)".to_string()),
            ("RAT_SCOPE".to_string(), "线损率累计目标值".to_string()),
            ("BLANK3".to_string(), "目标完成率".to_string()),
            ("BLANK2".to_string(), "排行".to_string()),
        ],
        rows: vec![
            serde_json::from_value(json!({
                "ORG_NAME": "城关供电",
                "YGDL": "12345.67",
                "YYDL": "11234.56",
                "YXSL": "9.00",
                "RAT_SCOPE": "9.50",
                "BLANK3": "94.74",
                "BLANK2": "1"
            }))
            .unwrap(),
            serde_json::from_value(json!({
                "ORG_NAME": "七里河供电",
                "YGDL": "9876.54",
                "YYDL": "8765.43",
                "YXSL": "11.24",
                "RAT_SCOPE": "10.00",
                "BLANK3": "112.40",
                "BLANK2": "2"
            }))
            .unwrap(),
        ],
        output_path: output_path.clone(),
    };

    let result_path = export_lineloss_xlsx(&request).unwrap();
    assert_eq!(result_path, output_path);
    assert!(output_path.exists());

    // Verify it's a valid ZIP (an .xlsx file is a zip archive)
    let file = fs::File::open(&output_path).unwrap();
    let mut archive = zip::ZipArchive::new(file).unwrap();

    // Must contain the standard OpenXML entries
    let entry_names: Vec<String> = (0..archive.len())
        .map(|i| archive.by_index(i).unwrap().name().to_string())
        .collect();

    assert!(entry_names.contains(&"[Content_Types].xml".to_string()));
    assert!(entry_names.contains(&"xl/worksheets/sheet1.xml".to_string()));
    assert!(entry_names.contains(&"xl/workbook.xml".to_string()));

    // Read sheet1.xml and verify it contains our data
    let mut sheet = archive.by_name("xl/worksheets/sheet1.xml").unwrap();
    let mut xml = String::new();
    std::io::Read::read_to_string(&mut sheet, &mut xml).unwrap();

    assert!(xml.contains("供电单位"), "header row should contain 供电单位");
    assert!(xml.contains("累计供电量"), "header row should contain 累计供电量");
    assert!(xml.contains("城关供电"), "data should contain 城关供电");
    assert!(xml.contains("12345.67"), "data should contain 12345.67");
    assert!(xml.contains("七里河供电"), "data should contain second row");

    // Cleanup
    fs::remove_file(&output_path).unwrap();
}

#[test]
fn export_empty_rows_returns_error() {
    let output_path = temp_output_path("empty-test.xlsx");

    let request = LinelossExportRequest {
        sheet_name: "test".to_string(),
        column_defs: vec![("A".to_string(), "ColA".to_string())],
        rows: vec![],
        output_path: output_path.clone(),
    };

    let result = export_lineloss_xlsx(&request);
    assert!(result.is_err());
    assert!(
        result.unwrap_err().to_string().contains("rows must not be empty"),
        "should reject empty rows"
    );
}
```

- [ ] **Step 3: Run the test to verify it fails**

Run: `cargo test --test lineloss_xlsx_export_test -- --nocapture`

Expected: compilation error — the `lineloss_xlsx_export` module doesn't exist yet, so `export_lineloss_xlsx` / `LinelossExportRequest` are undefined.

- [ ] **Step 4: Implement `src/compat/lineloss_xlsx_export.rs`**

```rust
use std::fs;
use std::io::Write;
use std::path::{Path, PathBuf};

use serde_json::{Map, Value};
use zip::write::FileOptions;
use zip::{CompressionMethod, ZipWriter};

pub struct LinelossExportRequest {
    pub sheet_name: String,
    pub column_defs: Vec<(String, String)>,
    pub rows: Vec<Map<String, Value>>,
    pub output_path: PathBuf,
}

pub fn export_lineloss_xlsx(request: &LinelossExportRequest) -> anyhow::Result<PathBuf> {
    if request.rows.is_empty() {
        anyhow::bail!("rows must not be empty");
    }
    if request.column_defs.is_empty() {
        anyhow::bail!("column_defs must not be empty");
    }

    let sheet_xml = build_worksheet_xml(&request.column_defs, &request.rows);

    write_xlsx(&request.output_path, &request.sheet_name, &sheet_xml)?;

    Ok(request.output_path.clone())
}

fn build_worksheet_xml(
    column_defs: &[(String, String)],
    rows: &[Map<String, Value>],
) -> String {
    let mut xml_rows = Vec::with_capacity(rows.len() + 1);

    // Header row (row 1)
    let header_cells: Vec<String> = column_defs
        .iter()
        .enumerate()
        .map(|(col_idx, (_key, label))| {
            let col_letter = column_letter(col_idx);
            format!(
                "<c r=\"{col_letter}1\" t=\"inlineStr\"><is><t>{}</t></is></c>",
                xml_escape(label)
            )
        })
        .collect();
    xml_rows.push(format!("<row r=\"1\">{}</row>", header_cells.join("")));

    // Data rows (row 2+)
    for (row_idx, row) in rows.iter().enumerate() {
        let excel_row = row_idx + 2;
        let cells: Vec<String> = column_defs
            .iter()
            .enumerate()
            .map(|(col_idx, (key, _label))| {
                let col_letter = column_letter(col_idx);
                let value = row
                    .get(key)
                    .map(value_to_string)
                    .unwrap_or_default();
                format!(
                    "<c r=\"{col_letter}{excel_row}\" t=\"inlineStr\"><is><t>{}</t></is></c>",
                    xml_escape(&value)
                )
            })
            .collect();
        xml_rows.push(format!("<row r=\"{excel_row}\">{}</row>", cells.join("")));
    }

    format!(
        "<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"yes\"?>\
        <worksheet xmlns=\"http://schemas.openxmlformats.org/spreadsheetml/2006/main\">\
        <sheetData>{}</sheetData>\
        </worksheet>",
        xml_rows.join("")
    )
}

fn column_letter(index: usize) -> String {
    let mut result = String::new();
    let mut n = index;
    loop {
        result.insert(0, (b'A' + (n % 26) as u8) as char);
        if n < 26 {
            break;
        }
        n = n / 26 - 1;
    }
    result
}

fn value_to_string(value: &Value) -> String {
    match value {
        Value::String(text) => text.clone(),
        Value::Number(number) => number.to_string(),
        Value::Bool(flag) => flag.to_string(),
        Value::Null => String::new(),
        other => other.to_string(),
    }
}

fn xml_escape(value: &str) -> String {
    value
        .replace('&', "&amp;")
        .replace('<', "&lt;")
        .replace('>', "&gt;")
}

fn write_xlsx(output_path: &Path, sheet_name: &str, sheet_xml: &str) -> anyhow::Result<()> {
    if let Some(parent) = output_path.parent() {
        fs::create_dir_all(parent)?;
    }
    if output_path.exists() {
        fs::remove_file(output_path)?;
    }

    let file = fs::File::create(output_path)?;
    let mut zip = ZipWriter::new(file);
    let options = FileOptions::default().compression_method(CompressionMethod::Stored);

    zip.start_file("[Content_Types].xml", options)?;
    zip.write_all(content_types_xml().as_bytes())?;

    zip.start_file("_rels/.rels", options)?;
    zip.write_all(root_rels_xml().as_bytes())?;

    zip.start_file("docProps/app.xml", options)?;
    zip.write_all(app_xml().as_bytes())?;

    zip.start_file("docProps/core.xml", options)?;
    zip.write_all(core_xml().as_bytes())?;

    zip.start_file("xl/workbook.xml", options)?;
    zip.write_all(workbook_xml(&xml_escape(sheet_name)).as_bytes())?;

    zip.start_file("xl/_rels/workbook.xml.rels", options)?;
    zip.write_all(workbook_rels_xml().as_bytes())?;

    zip.start_file("xl/worksheets/sheet1.xml", options)?;
    zip.write_all(sheet_xml.as_bytes())?;

    zip.finish()?;
    Ok(())
}

fn content_types_xml() -> &'static str {
    r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Types xmlns="http://schemas.openxmlformats.org/package/2006/content-types">
<Default Extension="rels" ContentType="application/vnd.openxmlformats-package.relationships+xml"/>
<Default Extension="xml" ContentType="application/xml"/>
<Override PartName="/xl/workbook.xml" ContentType="application/vnd.openxmlformats-officedocument.spreadsheetml.sheet.main+xml"/>
<Override PartName="/xl/worksheets/sheet1.xml" ContentType="application/vnd.openxmlformats-officedocument.spreadsheetml.worksheet+xml"/>
<Override PartName="/docProps/core.xml" ContentType="application/vnd.openxmlformats-package.core-properties+xml"/>
<Override PartName="/docProps/app.xml" ContentType="application/vnd.openxmlformats-officedocument.extended-properties+xml"/>
</Types>"#
}

fn root_rels_xml() -> &'static str {
    r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Relationships xmlns="http://schemas.openxmlformats.org/package/2006/relationships">
<Relationship Id="rId1" Type="http://schemas.openxmlformats.org/officeDocument/2006/relationships/officeDocument" Target="xl/workbook.xml"/>
<Relationship Id="rId2" Type="http://schemas.openxmlformats.org/package/2006/relationships/metadata/core-properties" Target="docProps/core.xml"/>
<Relationship Id="rId3" Type="http://schemas.openxmlformats.org/officeDocument/2006/relationships/extended-properties" Target="docProps/app.xml"/>
</Relationships>"#
}

fn app_xml() -> &'static str {
    r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Properties xmlns="http://schemas.openxmlformats.org/officeDocument/2006/extended-properties"
xmlns:vt="http://schemas.openxmlformats.org/officeDocument/2006/docPropsVTypes">
<Application>sgClaw</Application>
</Properties>"#
}

fn core_xml() -> &'static str {
    r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<cp:coreProperties xmlns:cp="http://schemas.openxmlformats.org/package/2006/metadata/core-properties"
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:dcterms="http://purl.org/dc/terms/"
xmlns:dcmitype="http://purl.org/dc/dcmitype/"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<dc:title>台区线损报表</dc:title>
</cp:coreProperties>"#
}

fn workbook_xml(sheet_name: &str) -> String {
    format!(
        r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<workbook xmlns="http://schemas.openxmlformats.org/spreadsheetml/2006/main"
xmlns:r="http://schemas.openxmlformats.org/officeDocument/2006/relationships">
<sheets>
<sheet name="{sheet_name}" sheetId="1" r:id="rId1"/>
</sheets>
</workbook>"#
    )
}

fn workbook_rels_xml() -> &'static str {
    r#"<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Relationships xmlns="http://schemas.openxmlformats.org/package/2006/relationships">
<Relationship Id="rId1" Type="http://schemas.openxmlformats.org/officeDocument/2006/relationships/worksheet" Target="worksheets/sheet1.xml"/>
</Relationships>"#
}

#[cfg(test)]
mod tests {
    use super::column_letter;

    #[test]
    fn column_letter_maps_indices_correctly() {
        assert_eq!(column_letter(0), "A");
        assert_eq!(column_letter(1), "B");
        assert_eq!(column_letter(6), "G");
        assert_eq!(column_letter(25), "Z");
        assert_eq!(column_letter(26), "AA");
    }
}
```

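As a quick sanity check outside the crate, the two pure helpers above can be exercised standalone. This is a copy of the plan's `column_letter` and `xml_escape` (with the ampersand-first escaping order, which matters so that already-produced entities are not double-escaped):

```rust
// Standalone copies of the plan's helpers, for a behavioral sanity check.
fn column_letter(index: usize) -> String {
    let mut result = String::new();
    let mut n = index;
    loop {
        result.insert(0, (b'A' + (n % 26) as u8) as char);
        if n < 26 {
            break;
        }
        n = n / 26 - 1;
    }
    result
}

fn xml_escape(value: &str) -> String {
    // '&' must be replaced first, otherwise the '&' in "&lt;"/"&gt;" would be re-escaped.
    value
        .replace('&', "&amp;")
        .replace('<', "&lt;")
        .replace('>', "&gt;")
}

fn main() {
    assert_eq!(column_letter(0), "A");
    assert_eq!(column_letter(26), "AA");
    assert_eq!(column_letter(27), "AB");
    assert_eq!(xml_escape("P&G <1%"), "P&amp;G &lt;1%");
    println!("helpers ok");
}
```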
- [ ] **Step 5: Run the tests to verify they pass**

Run: `cargo test --test lineloss_xlsx_export_test -- --nocapture`

Expected: both `export_month_lineloss_produces_valid_xlsx` and `export_empty_rows_returns_error` PASS.

Also run the internal unit test:

Run: `cargo test lineloss_xlsx_export -- --nocapture`

Expected: `column_letter_maps_indices_correctly` PASS.

- [ ] **Step 6: Commit**

```bash
git add src/compat/lineloss_xlsx_export.rs src/compat/mod.rs tests/lineloss_xlsx_export_test.rs
git commit -m "feat(lineloss): add Rust-side XLSX generation for lineloss reports"
```

---

### Task 2: Integrate XLSX Export into `deterministic_submit.rs`

**Files:**

- Modify: `src/compat/deterministic_submit.rs`

- [ ] **Step 1: Add imports and a helper function to extract export data from the artifact**

At the top of `src/compat/deterministic_submit.rs`, add the import:

```rust
use crate::compat::lineloss_xlsx_export::{export_lineloss_xlsx, LinelossExportRequest};
```

Then add a new helper function after `summarize_lineloss_artifact`:

```rust
struct LinelossArtifactExportData {
    sheet_name: String,
    column_defs: Vec<(String, String)>,
    rows: Vec<Map<String, Value>>,
}

fn extract_export_data(output: &str) -> Option<LinelossArtifactExportData> {
    let payload: Value = serde_json::from_str(output).ok()?;
    let artifact = payload
        .as_object()
        .and_then(|object| object.get("text"))
        .unwrap_or(&payload);
    let artifact = artifact.as_object()?;

    if artifact.get("type").and_then(Value::as_str) != Some("report-artifact") {
        return None;
    }

    let status = artifact.get("status").and_then(Value::as_str).unwrap_or("");
    if !matches!(status, "ok" | "partial") {
        return None;
    }

    let rows = artifact.get("rows").and_then(Value::as_array)?;
    if rows.is_empty() {
        return None;
    }
    let rows: Vec<Map<String, Value>> = rows
        .iter()
        .filter_map(|row| row.as_object().cloned())
        .collect();
    if rows.is_empty() {
        return None;
    }

    let column_defs: Vec<(String, String)> = artifact
        .get("column_defs")
        .and_then(Value::as_array)
        .map(|defs| {
            defs.iter()
                .filter_map(|def| {
                    let arr = def.as_array()?;
                    let key = arr.first()?.as_str()?.to_string();
                    let label = arr.get(1)?.as_str()?.to_string();
                    Some((key, label))
                })
                .collect()
        })
        .unwrap_or_default();

    // Fallback: if column_defs is absent from the artifact, use the "columns" array as keys
    let column_defs = if column_defs.is_empty() {
        let columns = artifact.get("columns").and_then(Value::as_array)?;
        columns
            .iter()
            .filter_map(|col| {
                let key = col.as_str()?.to_string();
                Some((key.clone(), key))
            })
            .collect()
    } else {
        column_defs
    };

    if column_defs.is_empty() {
        return None;
    }

    let org_label = artifact
        .get("org")
        .and_then(Value::as_object)
        .and_then(|org| org.get("label"))
        .and_then(Value::as_str)
        .unwrap_or("lineloss");
    let period_mode = artifact
        .get("period")
        .and_then(Value::as_object)
        .and_then(|p| p.get("mode"))
        .and_then(Value::as_str)
        .unwrap_or("month");
    let period_value = artifact
        .get("period")
        .and_then(Value::as_object)
        .and_then(|p| p.get("value"))
        .and_then(Value::as_str)
        .unwrap_or("");
    let mode_label = if period_mode == "week" { "周度" } else { "月度" };
    let sheet_name = format!("{org_label}{mode_label}线损分析报表({period_value})");

    Some(LinelossArtifactExportData {
        sheet_name,
        column_defs,
        rows,
    })
}
```

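The sheet-name construction at the end of the helper can be sketched as a standalone function (the plan inlines it; `lineloss_sheet_name` is a name invented for this example, and the week-mode period value shown is illustrative):

```rust
// Mirrors the mode_label / sheet_name logic inlined in extract_export_data.
fn lineloss_sheet_name(org_label: &str, period_mode: &str, period_value: &str) -> String {
    let mode_label = if period_mode == "week" { "周度" } else { "月度" };
    format!("{org_label}{mode_label}线损分析报表({period_value})")
}

fn main() {
    // Matches the sheet name used in the Task 1 test.
    assert_eq!(
        lineloss_sheet_name("国网兰州供电公司", "month", "2026-03"),
        "国网兰州供电公司月度线损分析报表(2026-03)"
    );
    println!("{}", lineloss_sheet_name("国网兰州供电公司", "week", "2026-W15"));
}
```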
- [ ] **Step 2: Add the export-after-collection function**

Add a new function that wraps the existing flow with XLSX export:

```rust
fn try_export_lineloss_xlsx(output: &str, workspace_root: &Path) -> Option<PathBuf> {
    let data = extract_export_data(output)?;
    let nanos = std::time::SystemTime::now()
        .duration_since(std::time::UNIX_EPOCH)
        .map(|d| d.as_nanos())
        .unwrap_or_default();
    let out_dir = workspace_root.join("out");
    let output_path = out_dir.join(format!("tq-lineloss-{nanos}.xlsx"));

    let request = LinelossExportRequest {
        sheet_name: data.sheet_name,
        column_defs: data.column_defs,
        rows: data.rows,
        output_path,
    };

    match export_lineloss_xlsx(&request) {
        Ok(path) => {
            eprintln!("[deterministic_submit] XLSX exported to: {}", path.display());
            Some(path)
        }
        Err(err) => {
            eprintln!("[deterministic_submit] XLSX export failed: {err}");
            None
        }
    }
}
```

- [ ] **Step 3: Modify `execute_deterministic_submit_with_browser_backend` to call the export**

Replace the body of `execute_deterministic_submit_with_browser_backend` (lines 119-136 of the original file):

```rust
pub fn execute_deterministic_submit_with_browser_backend(
    browser_backend: Arc<dyn BrowserBackend>,
    plan: &DeterministicExecutionPlan,
    workspace_root: &Path,
    settings: &SgClawSettings,
) -> Result<DirectSubmitOutcome, PipeError> {
    let args = deterministic_submit_args(plan);
    let output =
        crate::compat::direct_skill_runtime::execute_browser_script_skill_raw_output_with_browser_backend(
            browser_backend,
            &plan.tool_name,
            workspace_root,
            settings,
            args,
        )?;

    let export_path = try_export_lineloss_xlsx(&output, workspace_root);
    Ok(summarize_lineloss_output_with_export(&output, export_path.as_deref()))
}
```

Apply the same change to `execute_deterministic_submit` (the non-backend variant, lines 101-117):

```rust
pub fn execute_deterministic_submit<T: Transport + 'static>(
    browser_tool: BrowserPipeTool<T>,
    plan: &DeterministicExecutionPlan,
    workspace_root: &Path,
    settings: &SgClawSettings,
) -> Result<DirectSubmitOutcome, PipeError> {
    let args = deterministic_submit_args(plan);
    let output = crate::compat::direct_skill_runtime::execute_browser_script_skill_raw_output(
        browser_tool,
        &plan.tool_name,
        workspace_root,
        settings,
        args,
    )?;

    let export_path = try_export_lineloss_xlsx(&output, workspace_root);
    Ok(summarize_lineloss_output_with_export(&output, export_path.as_deref()))
}
```

- [ ] **Step 4: Add a `summarize_lineloss_output_with_export` function**

Add this new function. It wraps the existing `summarize_lineloss_output` and appends the export path:

```rust
fn summarize_lineloss_output_with_export(
    output: &str,
    export_path: Option<&Path>,
) -> DirectSubmitOutcome {
    let mut outcome = summarize_lineloss_output(output);

    if let Some(path) = export_path {
        outcome.summary.push_str(&format!(" export_path={}", path.display()));
    }

    outcome
}
```

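A minimal standalone illustration of the append behavior (the summary prefix and the `append_export_path` helper are invented for this example; the real code mutates `DirectSubmitOutcome.summary` in place):

```rust
use std::path::Path;

// Appends " export_path=<path>" to a summary string when an export path exists.
fn append_export_path(mut summary: String, export_path: Option<&Path>) -> String {
    if let Some(path) = export_path {
        summary.push_str(&format!(" export_path={}", path.display()));
    }
    summary
}

fn main() {
    let base = String::from("lineloss status=ok rows=2"); // illustrative prefix
    let with_path =
        append_export_path(base.clone(), Some(Path::new("out/tq-lineloss-1.xlsx")));
    assert!(with_path.ends_with("export_path=out/tq-lineloss-1.xlsx"));
    // Without a path, the summary is unchanged.
    assert_eq!(append_export_path(base.clone(), None), base);
    println!("{with_path}");
}
```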
- [ ] **Step 5: Run existing tests to ensure nothing breaks**

Run: `cargo test --test deterministic_submit_test -- --nocapture`

Expected: all existing tests PASS (the tests don't call `execute_deterministic_submit`; they test `decide_deterministic_submit` and parsing logic, which is unchanged).

Run: `cargo test deterministic_submit -- --nocapture`

Expected: PASS.

- [ ] **Step 6: Commit**

```bash
git add src/compat/deterministic_submit.rs
git commit -m "feat(lineloss): integrate Rust-side XLSX export into deterministic submit pipeline"
```

---

### Task 3: Modify `collect_lineloss.js` to Skip Browser-Side Export

**Files:**

- Modify: `D:/data/ideaSpace/rust/sgClaw/claw/claw/skills/skill_staging/skills/tq-lineloss-report/scripts/collect_lineloss.js`

- [ ] **Step 1: Add `column_defs` to the artifact returned by `buildArtifact`**

In the `buildArtifact` function (around line 198), the `columns` field currently contains only column keys (e.g., `["ORG_NAME", "YGDL", ...]`). Add a `column_defs` field that carries the full key+label pairs, by changing `buildArtifact` to accept and emit `column_defs`.

Find this block in `buildArtifact` (lines 198-242):

```javascript
function buildArtifact({
  status,
  blockedReason = '',
  fatalError = '',
  org_label = '',
  org_code = '',
  period_mode = '',
  period_mode_code = '',
  period_value = '',
  period_payload = {},
  columns = [],
  rows = [],
  export: exportState,
  reasons = []
}) {
```

Replace with:

```javascript
function buildArtifact({
  status,
  blockedReason = '',
  fatalError = '',
  org_label = '',
  org_code = '',
  period_mode = '',
  period_mode_code = '',
  period_value = '',
  period_payload = {},
  columns = [],
  column_defs = [],
  rows = [],
  export: exportState,
  reasons = []
}) {
```

In the returned object (the `return { ... }` block inside `buildArtifact`), add `column_defs` after `columns`:

```javascript
columns: [...columns],
column_defs: [...column_defs],
rows: [...rows],
```

- [ ] **Step 2: Pass `column_defs` from `buildBrowserEntrypointResult`**

In `buildBrowserEntrypointResult`, after the `columns` assignment (around line 452), add the `columnDefs` line:

```javascript
const columns = normalizedArgs.period_mode === 'week' ? WEEK_COLUMNS : MONTH_COLUMNS;
const columnDefs = normalizedArgs.period_mode === 'week' ? WEEK_COLUMN_DEFS : MONTH_COLUMN_DEFS;
```

Then, in every call to `buildArtifact` inside `buildBrowserEntrypointResult` that passes `columns`, add `column_defs: columnDefs` alongside it. There are four such calls:

**Call 1** (API error, around line 466):

```javascript
columns,
column_defs: columnDefs,
rows: [],
```

**Call 2** (empty rows, around line 483):

```javascript
columns,
column_defs: columnDefs,
rows: []
```

**Call 3** (normalization failure, around line 497):

```javascript
columns,
column_defs: columnDefs,
rows: [],
```

**Call 4** (success, around line 558):

```javascript
columns,
column_defs: columnDefs,
rows,
```

Note: the two `buildArtifact` calls made before the `columns` variable is assigned (validation failure and page-context failure, around lines 422 and 439) don't need `column_defs`, since they carry no data.

- [ ] **Step 3: Remove the `exportWorkbook` and `writeReportLog` calls from the success path**

In `buildBrowserEntrypointResult`, replace the entire export block (lines 518-556) with a simplified version.

Find:

```javascript
const exportState = {
  attempted: false,
  status: 'skipped',
  message: null
};

if (typeof deps.exportWorkbook === 'function') {
  exportState.attempted = true;
  try {
    const exportPayload = buildExportPayload({
      mode: normalizedArgs.period_mode,
      orgLabel: normalizedArgs.org_label,
      periodValue: normalizedArgs.period_value,
      rows
    });
    const exportResult = await deps.exportWorkbook(exportPayload);
    const exportPath = pickFirstNonEmpty(exportResult?.path, exportResult?.data?.path, exportResult?.data?.data);
    if (!exportPath) {
      throw new Error('export_failed');
    }
    exportState.status = 'ok';
    exportState.message = exportPath;

    if (typeof deps.writeReportLog === 'function') {
      try {
        const reportLog = await deps.writeReportLog(buildReportName(normalizedArgs), exportPath);
        if (reportLog?.success === false) {
          reasons.push('report_log_failed');
        }
      } catch (_error) {
        reasons.push('report_log_failed');
      }
    }
  } catch (error) {
    reasons.push('export_failed');
    exportState.status = 'failed';
    exportState.message = pickFirstNonEmpty(error?.message, 'export_failed');
  }
}
```

Replace with:

```javascript
// Export is handled by the Rust side after it receives the artifact.
// JS only provides rows + column_defs in the artifact.
const exportState = {
  attempted: false,
  status: 'deferred_to_rust',
  message: null
};
```

- [ ] **Step 4: Remove unused constants and functions**

Remove these constants (lines 5-6), since they are no longer used from JS:

```javascript
const EXPORT_SERVICE_URL = 'http://localhost:13313/SurfaceServices/personalBread/export/faultDetailsExportXLSX';
const REPORT_LOG_URL = 'http://localhost:13313/ReportServices/Api/setReportLog';
```

Remove the `postJson` function (lines 264-294) — it is no longer needed, since JS no longer makes HTTP calls to localhost.

Remove these functions from `defaultBrowserDeps()`:

- `exportWorkbook` (lines 350-373)
- `writeReportLog` (lines 375-409)

Remove these now-unused functions:

- `buildExportTitles` (lines 244-254)
- `buildExportPayload` (lines 256-262)
- `buildReportName` (lines 413-415)

- [ ] **Step 5: Update `module.exports` to remove unused exports**

Update the `module.exports` block (lines 572-586). Remove `buildBrowserEntrypointResult` from the exports if it was only used for testing with full deps, or keep it for test compatibility. The final exports block:

```javascript
if (typeof module !== 'undefined' && module.exports) {
  module.exports = {
    MONTH_COLUMNS,
    WEEK_COLUMNS,
    MONTH_COLUMN_DEFS,
    WEEK_COLUMN_DEFS,
    validateArgs,
    buildMonthRequest,
    buildWeekRequest,
    normalizeRows,
    determineArtifactStatus,
    buildArtifact,
    buildBrowserEntrypointResult
  };
} else {
  return buildBrowserEntrypointResult(args);
}
```

- [ ] **Step 6: Verify the JS file has no syntax errors**

Run: `node -c "D:/data/ideaSpace/rust/sgClaw/claw/claw/skills/skill_staging/skills/tq-lineloss-report/scripts/collect_lineloss.js"`

Expected: no syntax errors. (Note: when injected into the browser, the file runs inside a wrapper IIFE and uses `return` at top level, so the Node syntax check may warn — the important thing is that there are no parse errors.)

Alternatively, check that the test file still works:

Run: `node "D:/data/ideaSpace/rust/sgClaw/claw/claw/skills/skill_staging/skills/tq-lineloss-report/scripts/collect_lineloss.test.js"`

Expected: tests pass (or at least no JS parse errors).

- [ ] **Step 7: Commit**

```bash
git add "D:/data/ideaSpace/rust/sgClaw/claw/claw/skills/skill_staging/skills/tq-lineloss-report/scripts/collect_lineloss.js"
git commit -m "feat(lineloss): remove browser-side export, defer to Rust-side XLSX generation"
```

---

### Task 4: Full Build Verification

**Files:** None (verification only)

- [ ] **Step 1: Run a full cargo build**

Run: `cargo build`

Expected: successful compilation with no errors.

- [ ] **Step 2: Run all tests**

Run: `cargo test -- --nocapture`

Expected: all tests pass, including:

- `lineloss_xlsx_export_test::export_month_lineloss_produces_valid_xlsx`
- `lineloss_xlsx_export_test::export_empty_rows_returns_error`
- `lineloss_xlsx_export::tests::column_letter_maps_indices_correctly`
- All existing `deterministic_submit_test` tests

- [ ] **Step 3: Commit (if any fixups were needed)**

Commit only if compilation or test fixes were required in this task.