claw/docs/superpowers/plans/2026-04-26-skill-normalized-result-and-dashboard-local-reader-implementation-plan.md

Skill Normalized Result And Dashboard Local Reader Implementation Plan

For agentic workers: REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (- [ ]) syntax for tracking.

Goal: Build a stable normalized-result pipeline in claw-new, a local HTTP reader in digital-employee, and a dashboard data path that consumes normalized skill results instead of raw run-record.json.

Architecture: claw-new adds a result normalizer that reads scheduled skill run-record.json files and writes stable normalized snapshots plus index.json. digital-employee adds an embedded localhost-only Node reader that serves those normalized files over HTTP, while the Vue dashboard replaces static mock imports with a polling data layer that consumes the local API and supports count_snapshot and detail_snapshot.

Tech Stack: Rust (serde, serde_json, existing sg_claw CLI), Node/Express in digital-employee, Vue 2 + Vuex + Vue CLI dev proxy.


File Structure

claw-new files

Create:

  • src/result_normalization/mod.rs
  • src/result_normalization/contract.rs
  • src/result_normalization/registry.rs
  • src/result_normalization/writer.rs
  • src/result_normalization/index_builder.rs
  • src/result_normalization/extractors/mod.rs
  • src/result_normalization/extractors/available_balance.rs
  • src/result_normalization/extractors/archive_workorder.rs
  • src/result_normalization/extractors/command_center_fee_control.rs
  • src/result_normalization/extractors/sgcc_todo_crawler.rs
  • tests/result_normalization_contract_test.rs
  • tests/result_normalization_writer_test.rs
  • tests/result_normalization_extractors_test.rs
  • tests/result_normalization_cli_test.rs
  • tests/fixtures/result_normalization/available-balance-below-zero-monitor.run-record.json
  • tests/fixtures/result_normalization/archive-workorder-grid-push-monitor.run-record.json
  • tests/fixtures/result_normalization/command-center-fee-control-monitor.run-record.json
  • tests/fixtures/result_normalization/sgcc-todo-crawler.run-record.json

Modify:

  • src/lib.rs
  • src/bin/sg_claw.rs
  • Cargo.toml

digital-employee files

Create:

  • D:/data/ideaSpace/rust/sgClaw/digital-employee/server/index.js
  • D:/data/ideaSpace/rust/sgClaw/digital-employee/server/config.js
  • D:/data/ideaSpace/rust/sgClaw/digital-employee/server/routes/results.js
  • D:/data/ideaSpace/rust/sgClaw/digital-employee/server/services/fileRepository.js
  • D:/data/ideaSpace/rust/sgClaw/digital-employee/server/services/resultStore.js
  • D:/data/ideaSpace/rust/sgClaw/digital-employee/server/services/validators.js
  • D:/data/ideaSpace/rust/sgClaw/digital-employee/src/api/results.js
  • D:/data/ideaSpace/rust/sgClaw/digital-employee/src/store/modules/results.js

Modify:

  • D:/data/ideaSpace/rust/sgClaw/digital-employee/package.json
  • D:/data/ideaSpace/rust/sgClaw/digital-employee/vue.config.js
  • D:/data/ideaSpace/rust/sgClaw/digital-employee/src/store/index.js
  • D:/data/ideaSpace/rust/sgClaw/digital-employee/src/views/Dashboard.vue
  • D:/data/ideaSpace/rust/sgClaw/digital-employee/src/components/WorkReport.vue
  • D:/data/ideaSpace/rust/sgClaw/digital-employee/README.md

Test / validate:

  • D:/data/ideaSpace/rust/sgClaw/digital-employee dev server + local API manual smoke test

Task 1: Add Public Normalized Result Contract In claw-new

Files:

  • Create: src/result_normalization/contract.rs

  • Create: src/result_normalization/mod.rs

  • Modify: src/lib.rs

  • Test: tests/result_normalization_contract_test.rs

  • Step 1: Write the failing contract test

Create tests/result_normalization_contract_test.rs with tests that force the public schema shape to exist before implementation:

use sgclaw::result_normalization::contract::{
    NormalizedResult, NormalizedResultType, ResultFreshness, ResultMetric, ResultSource,
    ResultStatus,
};

#[test]
fn normalized_result_contract_serializes_required_envelope_fields() {
    let result = NormalizedResult {
        schema_version: "1.0".to_string(),
        skill_id: "available-balance-below-zero-monitor".to_string(),
        skill_name: "可用电费小于零监测".to_string(),
        category: "monitor".to_string(),
        result_type: NormalizedResultType::CountSnapshot,
        observed_at: "2026-04-25T19:46:24+08:00".to_string(),
        generated_at: "2026-04-25T19:46:26+08:00".to_string(),
        status: ResultStatus::Ok,
        freshness: ResultFreshness {
            stale_after_seconds: 900,
            is_stale: false,
        },
        summary: "2026-04-25 19:46:24--可用电费小于零监测检测到【数量】4265".to_string(),
        metric: ResultMetric {
            label: "数量".to_string(),
            value: 4265,
            unit: "items".to_string(),
        },
        payload: None,
        diagnostics: serde_json::json!({}),
        source: ResultSource {
            kind: "run_record".to_string(),
            run_record_path: r"D:\desk\sgclaw\sgclaw\results\available-balance-below-zero-monitor.run-record.json".to_string(),
            run_record_mtime: "2026-04-25T19:46:26+08:00".to_string(),
            extractor_version: "1.0".to_string(),
        },
    };

    let value = serde_json::to_value(&result).unwrap();
    assert_eq!(value["schemaVersion"], "1.0");
    assert_eq!(value["skillId"], "available-balance-below-zero-monitor");
    assert_eq!(value["resultType"], "count_snapshot");
    assert_eq!(value["status"], "ok");
    assert_eq!(value["metric"]["value"], 4265);
}

#[test]
fn normalized_result_type_serializes_public_names() {
    assert_eq!(
        serde_json::to_string(&NormalizedResultType::CountSnapshot).unwrap(),
        "\"count_snapshot\""
    );
    assert_eq!(
        serde_json::to_string(&NormalizedResultType::DetailSnapshot).unwrap(),
        "\"detail_snapshot\""
    );
}
  • Step 2: Run test to verify it fails

Run:

cargo test --test result_normalization_contract_test

Expected:

FAIL
unresolved import `sgclaw::result_normalization`
  • Step 3: Write the minimal public contract

Create src/result_normalization/mod.rs:

pub mod contract;
pub mod extractors;
pub mod index_builder;
pub mod registry;
pub mod writer;

Create src/result_normalization/contract.rs:

use serde::{Deserialize, Serialize};

#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
#[serde(rename_all = "snake_case")]
pub enum NormalizedResultType {
    CountSnapshot,
    DetailSnapshot,
}

#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
#[serde(rename_all = "snake_case")]
pub enum ResultStatus {
    Ok,
    Empty,
    SoftError,
    Error,
}

#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ResultFreshness {
    pub stale_after_seconds: u64,
    pub is_stale: bool,
}

#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ResultMetric {
    pub label: String,
    pub value: i64,
    pub unit: String,
}

#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ResultSource {
    pub kind: String,
    pub run_record_path: String,
    pub run_record_mtime: String,
    pub extractor_version: String,
}

#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct NormalizedResult {
    pub schema_version: String,
    pub skill_id: String,
    pub skill_name: String,
    pub category: String,
    pub result_type: NormalizedResultType,
    pub observed_at: String,
    pub generated_at: String,
    pub status: ResultStatus,
    pub freshness: ResultFreshness,
    pub summary: String,
    pub metric: ResultMetric,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub payload: Option<serde_json::Value>,
    pub diagnostics: serde_json::Value,
    pub source: ResultSource,
}

Modify src/lib.rs:

pub mod result_normalization;
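
The contract carries `is_stale` as data rather than recomputing it on every read, so the producer has to derive it when writing a snapshot. A minimal stdlib sketch of that derivation from the run record's mtime (helper name hypothetical, not part of the contract above):

```rust
use std::time::{Duration, SystemTime};

// Hypothetical helper: a snapshot goes stale once the run record's mtime is
// older than the registry's stale_after_seconds budget.
fn is_stale(run_record_mtime: SystemTime, stale_after_seconds: u64) -> bool {
    SystemTime::now()
        .duration_since(run_record_mtime)
        // A future mtime (clock skew) yields Err; treat that as fresh.
        .map(|age| age >= Duration::from_secs(stale_after_seconds))
        .unwrap_or(false)
}
```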
  • Step 4: Run test to verify it passes

Run:

cargo test --test result_normalization_contract_test

Expected:

PASS
2 passed
  • Step 5: Commit
git add src/lib.rs src/result_normalization/mod.rs src/result_normalization/contract.rs tests/result_normalization_contract_test.rs
git commit -m "feat: add normalized result contract"

Task 2: Add Registry And index.json Models

Files:

  • Create: src/result_normalization/registry.rs

  • Create: src/result_normalization/index_builder.rs

  • Test: tests/result_normalization_contract_test.rs

  • Step 1: Extend tests to lock registry and index schema

Append to tests/result_normalization_contract_test.rs:

use sgclaw::result_normalization::contract::{NormalizedResultsIndex, NormalizedResultsIndexEntry};
use sgclaw::result_normalization::registry::skill_registry;

#[test]
fn registry_contains_four_initial_skills() {
    let ids: Vec<_> = skill_registry().iter().map(|item| item.skill_id).collect();
    assert_eq!(
        ids,
        vec![
            "archive-workorder-grid-push-monitor",
            "available-balance-below-zero-monitor",
            "command-center-fee-control-monitor",
            "sgcc-todo-crawler"
        ]
    );
}

#[test]
fn index_contract_serializes_skill_entries() {
    let index = NormalizedResultsIndex {
        schema_version: "1.0".to_string(),
        generated_at: "2026-04-25T19:48:00+08:00".to_string(),
        revision: "2026-04-25T19:48:00+08:00".to_string(),
        skills: vec![NormalizedResultsIndexEntry {
            skill_id: "available-balance-below-zero-monitor".to_string(),
            skill_name: "可用电费小于零监测".to_string(),
            category: "monitor".to_string(),
            result_type: NormalizedResultType::CountSnapshot,
            status: ResultStatus::Ok,
            observed_at: "2026-04-25T19:46:24+08:00".to_string(),
            metric: ResultMetric {
                label: "数量".to_string(),
                value: 4265,
                unit: "items".to_string(),
            },
            summary: "2026-04-25 19:46:24--可用电费小于零监测检测到【数量】4265".to_string(),
            current_file: "latest/available-balance-below-zero-monitor.json".to_string(),
            last_good_file: "last-good/available-balance-below-zero-monitor.json".to_string(),
        }],
    };

    let value = serde_json::to_value(index).unwrap();
    assert_eq!(value["skills"][0]["currentFile"], "latest/available-balance-below-zero-monitor.json");
}
  • Step 2: Run test to verify it fails

Run:

cargo test --test result_normalization_contract_test

Expected:

FAIL
cannot find type `NormalizedResultsIndex`
cannot find function `skill_registry`
  • Step 3: Implement registry and index models

Extend src/result_normalization/contract.rs with:

#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct NormalizedResultsIndexEntry {
    pub skill_id: String,
    pub skill_name: String,
    pub category: String,
    pub result_type: NormalizedResultType,
    pub status: ResultStatus,
    pub observed_at: String,
    pub metric: ResultMetric,
    pub summary: String,
    pub current_file: String,
    pub last_good_file: String,
}

#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct NormalizedResultsIndex {
    pub schema_version: String,
    pub generated_at: String,
    pub revision: String,
    pub skills: Vec<NormalizedResultsIndexEntry>,
}

Create src/result_normalization/registry.rs:

use crate::result_normalization::contract::NormalizedResultType;

#[derive(Debug, Clone, PartialEq, Eq)]
pub struct SkillRegistryItem {
    pub skill_id: &'static str,
    pub skill_name: &'static str,
    pub category: &'static str,
    pub result_type: NormalizedResultType,
    pub stale_after_seconds: u64,
}

pub fn skill_registry() -> Vec<SkillRegistryItem> {
    vec![
        SkillRegistryItem {
            skill_id: "archive-workorder-grid-push-monitor",
            skill_name: "归档工单配网推送监测",
            category: "monitor",
            result_type: NormalizedResultType::CountSnapshot,
            stale_after_seconds: 900,
        },
        SkillRegistryItem {
            skill_id: "available-balance-below-zero-monitor",
            skill_name: "可用电费小于零监测",
            category: "monitor",
            result_type: NormalizedResultType::CountSnapshot,
            stale_after_seconds: 900,
        },
        SkillRegistryItem {
            skill_id: "command-center-fee-control-monitor",
            skill_name: "指挥中心费控异常监测",
            category: "monitor",
            result_type: NormalizedResultType::CountSnapshot,
            stale_after_seconds: 900,
        },
        SkillRegistryItem {
            skill_id: "sgcc-todo-crawler",
            skill_name: "国网待办抓取",
            category: "crawler",
            result_type: NormalizedResultType::DetailSnapshot,
            stale_after_seconds: 900,
        },
    ]
}

Create src/result_normalization/index_builder.rs:

use crate::result_normalization::contract::{NormalizedResult, NormalizedResultsIndex, NormalizedResultsIndexEntry};

pub fn build_index(generated_at: String, revision: String, items: &[NormalizedResult]) -> NormalizedResultsIndex {
    let skills = items
        .iter()
        .map(|item| NormalizedResultsIndexEntry {
            skill_id: item.skill_id.clone(),
            skill_name: item.skill_name.clone(),
            category: item.category.clone(),
            result_type: item.result_type.clone(),
            status: item.status.clone(),
            observed_at: item.observed_at.clone(),
            metric: item.metric.clone(),
            summary: item.summary.clone(),
            current_file: format!("latest/{}.json", item.skill_id),
            last_good_file: format!("last-good/{}.json", item.skill_id),
        })
        .collect();

    NormalizedResultsIndex {
        schema_version: "1.0".to_string(),
        generated_at,
        revision,
        skills,
    }
}
  • Step 4: Run test to verify it passes

Run:

cargo test --test result_normalization_contract_test

Expected:

PASS
4 passed
  • Step 5: Commit
git add src/result_normalization/contract.rs src/result_normalization/registry.rs src/result_normalization/index_builder.rs tests/result_normalization_contract_test.rs
git commit -m "feat: add normalized result registry and index models"

Task 3: Implement Raw Fixture Extractors For Four Initial Skills

Files:

  • Create: src/result_normalization/extractors/mod.rs

  • Create: src/result_normalization/extractors/available_balance.rs

  • Create: src/result_normalization/extractors/archive_workorder.rs

  • Create: src/result_normalization/extractors/command_center_fee_control.rs

  • Create: src/result_normalization/extractors/sgcc_todo_crawler.rs

  • Create: tests/result_normalization_extractors_test.rs

  • Create: tests/fixtures/result_normalization/*.run-record.json

  • Step 1: Add failing extractor tests with fixture expectations

Create tests/result_normalization_extractors_test.rs:

use std::fs;
use std::path::Path;

use sgclaw::result_normalization::extractors::normalize_skill_run_record;

fn fixture(name: &str) -> String {
    let path = Path::new("tests/fixtures/result_normalization").join(name);
    fs::read_to_string(path).unwrap()
}

#[test]
fn available_balance_extractor_reads_raw_merged_count() {
    let raw = fixture("available-balance-below-zero-monitor.run-record.json");
    let result = normalize_skill_run_record(
        "available-balance-below-zero-monitor",
        r"D:\desk\sgclaw\sgclaw\results\available-balance-below-zero-monitor.run-record.json",
        &raw,
        "2026-04-25T19:46:26+08:00",
        "2026-04-25T19:46:26+08:00",
    )
    .unwrap();

    assert_eq!(result.metric.value, 4265);
    assert_eq!(result.status, sgclaw::result_normalization::contract::ResultStatus::Ok);
}

#[test]
fn archive_workorder_extractor_marks_soft_error_when_query_soft_error_exists() {
    let raw = fixture("archive-workorder-grid-push-monitor.run-record.json");
    let result = normalize_skill_run_record(
        "archive-workorder-grid-push-monitor",
        r"D:\desk\sgclaw\sgclaw\results\archive-workorder-grid-push-monitor.run-record.json",
        &raw,
        "2026-04-25T19:46:06+08:00",
        "2026-04-25T19:46:06+08:00",
    )
    .unwrap();

    assert_eq!(result.metric.value, 0);
    assert_eq!(result.status, sgclaw::result_normalization::contract::ResultStatus::SoftError);
}

#[test]
fn command_center_extractor_uses_query_abnor_list_count() {
    let raw = fixture("command-center-fee-control-monitor.run-record.json");
    let result = normalize_skill_run_record(
        "command-center-fee-control-monitor",
        r"D:\desk\sgclaw\sgclaw\results\command-center-fee-control-monitor.run-record.json",
        &raw,
        "2026-04-25T19:47:56+08:00",
        "2026-04-25T19:47:56+08:00",
    )
    .unwrap();

    assert_eq!(result.metric.value, 1);
}

#[test]
fn sgcc_todo_crawler_extractor_projects_items_and_raw_items() {
    let raw = fixture("sgcc-todo-crawler.run-record.json");
    let result = normalize_skill_run_record(
        "sgcc-todo-crawler",
        r"D:\desk\sgclaw\sgclaw\results\sgcc-todo-crawler.run-record.json",
        &raw,
        "2026-04-25T19:47:36+08:00",
        "2026-04-25T19:47:36+08:00",
    )
    .unwrap();

    assert!(result.payload.is_some());
    let payload = result.payload.unwrap();
    assert_eq!(payload["aggregates"]["total"], payload["items"].as_array().unwrap().len());
    assert!(payload["rawItems"].as_array().unwrap().len() >= payload["items"].as_array().unwrap().len());
}

Copy local fixtures into:

tests/fixtures/result_normalization/
  available-balance-below-zero-monitor.run-record.json
  archive-workorder-grid-push-monitor.run-record.json
  command-center-fee-control-monitor.run-record.json
  sgcc-todo-crawler.run-record.json
  • Step 2: Run test to verify it fails

Run:

cargo test --test result_normalization_extractors_test

Expected:

FAIL
cannot find function `normalize_skill_run_record`
  • Step 3: Implement extractor dispatch and four extractors

Create src/result_normalization/extractors/mod.rs:

pub mod archive_workorder;
pub mod available_balance;
pub mod command_center_fee_control;
pub mod sgcc_todo_crawler;

use anyhow::{anyhow, Result};

use crate::result_normalization::contract::NormalizedResult;

pub fn normalize_skill_run_record(
    skill_id: &str,
    run_record_path: &str,
    raw_json: &str,
    observed_at: &str,
    generated_at: &str,
) -> Result<NormalizedResult> {
    match skill_id {
        "available-balance-below-zero-monitor" => available_balance::normalize(run_record_path, raw_json, observed_at, generated_at),
        "archive-workorder-grid-push-monitor" => archive_workorder::normalize(run_record_path, raw_json, observed_at, generated_at),
        "command-center-fee-control-monitor" => command_center_fee_control::normalize(run_record_path, raw_json, observed_at, generated_at),
        "sgcc-todo-crawler" => sgcc_todo_crawler::normalize(run_record_path, raw_json, observed_at, generated_at),
        other => Err(anyhow!("unsupported normalized skill id: {other}")),
    }
}

Implement each extractor with these minimal rules:

  • available_balance.rs

    • read auditPreview.detectReadDiagnostics.rawMergedCount
    • build diagnostics with slice counts and traces
    • mark SoftError if sliceErrors non-empty or trace status contains non-ok
  • archive_workorder.rs

    • read newItemCount, fallback filteredCount, fallback pending_count
    • mark SoftError when queryStatus != "ok" or any trace has soft_error
  • command_center_fee_control.rs

    • read queryAbnorListCount
    • preserve supporting diagnostics like getOrgTreeStatus
    • mark SoftError when primary read absent but fallback still available
  • sgcc_todo_crawler.rs

    • read decisionPreview.pendingList
    • build projected items, original rawItems, and aggregates { total, unread, read }

Use shared helper patterns in each extractor:

let root: serde_json::Value = serde_json::from_str(raw_json)?;
let diagnostics = root
    .pointer("/auditPreview/detectReadDiagnostics")
    .cloned()
    .unwrap_or_else(|| serde_json::json!({}));
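
The SoftError downgrade rule shared by the count extractors can be sketched as a small pure function (names hypothetical; the real inputs come from the diagnostics JSON read with the pointer pattern above):

```rust
#[derive(Debug, PartialEq)]
enum ResultStatus {
    Ok,
    SoftError,
}

// Hypothetical shared rule: any slice error or non-ok trace status downgrades
// the snapshot to SoftError; otherwise the extracted count is trusted as Ok.
fn derive_status(slice_errors: &[String], trace_statuses: &[&str]) -> ResultStatus {
    if !slice_errors.is_empty() || trace_statuses.iter().any(|status| *status != "ok") {
        ResultStatus::SoftError
    } else {
        ResultStatus::Ok
    }
}
```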
  • Step 4: Run test to verify it passes

Run:

cargo test --test result_normalization_extractors_test

Expected:

PASS
4 passed
  • Step 5: Commit
git add src/result_normalization/extractors tests/result_normalization_extractors_test.rs tests/fixtures/result_normalization
git commit -m "feat: add scheduled skill result extractors"

Task 4: Implement Writer And last-good Policy

Files:

  • Create: src/result_normalization/writer.rs

  • Create: tests/result_normalization_writer_test.rs

  • Step 1: Add failing writer tests

Create tests/result_normalization_writer_test.rs:

use std::fs;
use std::time::{SystemTime, UNIX_EPOCH};

use sgclaw::result_normalization::contract::{
    NormalizedResult, NormalizedResultType, ResultFreshness, ResultMetric, ResultSource,
    ResultStatus,
};
use sgclaw::result_normalization::writer::write_normalized_result;

fn temp_dir() -> std::path::PathBuf {
    let nanos = SystemTime::now().duration_since(UNIX_EPOCH).unwrap().as_nanos();
    let dir = std::env::temp_dir().join(format!("normalized-writer-{nanos}"));
    fs::create_dir_all(&dir).unwrap();
    dir
}

fn sample(status: ResultStatus) -> NormalizedResult {
    NormalizedResult {
        schema_version: "1.0".to_string(),
        skill_id: "available-balance-below-zero-monitor".to_string(),
        skill_name: "可用电费小于零监测".to_string(),
        category: "monitor".to_string(),
        result_type: NormalizedResultType::CountSnapshot,
        observed_at: "2026-04-25T19:46:24+08:00".to_string(),
        generated_at: "2026-04-25T19:46:26+08:00".to_string(),
        status,
        freshness: ResultFreshness { stale_after_seconds: 900, is_stale: false },
        summary: "summary".to_string(),
        metric: ResultMetric { label: "数量".to_string(), value: 4265, unit: "items".to_string() },
        payload: None,
        diagnostics: serde_json::json!({}),
        source: ResultSource {
            kind: "run_record".to_string(),
            run_record_path: "x".to_string(),
            run_record_mtime: "2026-04-25T19:46:26+08:00".to_string(),
            extractor_version: "1.0".to_string(),
        },
    }
}

#[test]
fn writer_creates_latest_and_history() {
    let dir = temp_dir();
    write_normalized_result(&dir, &sample(ResultStatus::Ok), true).unwrap();

    assert!(dir.join("latest/available-balance-below-zero-monitor.json").exists());
    assert!(dir.join("last-good/available-balance-below-zero-monitor.json").exists());
}

#[test]
fn writer_does_not_replace_last_good_when_status_is_error() {
    let dir = temp_dir();
    write_normalized_result(&dir, &sample(ResultStatus::Ok), true).unwrap();

    let original = fs::read_to_string(dir.join("last-good/available-balance-below-zero-monitor.json")).unwrap();
    write_normalized_result(&dir, &sample(ResultStatus::Error), false).unwrap();
    let after = fs::read_to_string(dir.join("last-good/available-balance-below-zero-monitor.json")).unwrap();

    assert_eq!(original, after);
}
  • Step 2: Run test to verify it fails

Run:

cargo test --test result_normalization_writer_test

Expected:

FAIL
cannot find function `write_normalized_result`
  • Step 3: Implement atomic writer

Create src/result_normalization/writer.rs:

use std::fs;
use std::path::{Path, PathBuf};

use anyhow::Result;

use crate::result_normalization::contract::{NormalizedResult, ResultStatus};

fn write_atomic(path: &Path, body: &str) -> Result<()> {
    if let Some(parent) = path.parent() {
        fs::create_dir_all(parent)?;
    }
    let tmp = path.with_extension("json.tmp");
    fs::write(&tmp, body)?;
    fs::rename(tmp, path)?;
    Ok(())
}

pub fn should_update_last_good(status: &ResultStatus, allow_soft_error_last_good: bool) -> bool {
    match status {
        ResultStatus::Ok | ResultStatus::Empty => true,
        ResultStatus::SoftError => allow_soft_error_last_good,
        ResultStatus::Error => false,
    }
}

pub fn write_normalized_result(
    normalized_root: &Path,
    result: &NormalizedResult,
    allow_soft_error_last_good: bool,
) -> Result<PathBuf> {
    let latest_path = normalized_root.join("latest").join(format!("{}.json", result.skill_id));
    let last_good_path = normalized_root.join("last-good").join(format!("{}.json", result.skill_id));
    let history_path = normalized_root
        .join("history")
        .join(&result.skill_id)
        .join(result.observed_at[0..4].to_string())
        .join(result.observed_at[5..7].to_string())
        .join(format!("{}.json", result.generated_at.replace(':', "-")));

    let body = serde_json::to_string_pretty(result)?;
    write_atomic(&latest_path, &body)?;
    if should_update_last_good(&result.status, allow_soft_error_last_good) {
        write_atomic(&last_good_path, &body)?;
    }
    write_atomic(&history_path, &body)?;
    Ok(latest_path)
}
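
For orientation, the writer above produces this on-disk layout for the sample result (history year/month come from observed_at, and the history filename is generated_at with `:` replaced by `-`):

```
<normalized_root>/
  latest/available-balance-below-zero-monitor.json
  last-good/available-balance-below-zero-monitor.json
  history/available-balance-below-zero-monitor/2026/04/2026-04-25T19-46-26+08-00.json
```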
  • Step 4: Run test to verify it passes

Run:

cargo test --test result_normalization_writer_test

Expected:

PASS
2 passed
  • Step 5: Commit
git add src/result_normalization/writer.rs tests/result_normalization_writer_test.rs
git commit -m "feat: add normalized result writer"

Task 5: Wire A normalize-results CLI Into sg_claw

Files:

  • Modify: src/bin/sg_claw.rs

  • Create: tests/result_normalization_cli_test.rs

  • Step 1: Add failing CLI parser tests

Create tests/result_normalization_cli_test.rs:

use std::process::Command;

#[test]
fn sg_claw_reports_missing_results_dir_for_normalize_results() {
    let output = Command::new(env!("CARGO_BIN_EXE_sg_claw"))
        .arg("--normalize-results")
        .output()
        .unwrap();

    assert!(!output.status.success());
    let stderr = String::from_utf8_lossy(&output.stderr);
    assert!(stderr.contains("missing required --results-dir for --normalize-results"));
}
  • Step 2: Run test to verify it fails

Run:

cargo test --test result_normalization_cli_test

Expected:

FAIL
stderr does not contain "missing required --results-dir for --normalize-results"
  • Step 3: Implement CLI parsing and execution hook

Modify src/bin/sg_claw.rs:

  1. Add a new NormalizeResultsCliConfig:
#[derive(Debug, Clone, PartialEq, Eq)]
struct NormalizeResultsCliConfig {
    results_dir: PathBuf,
    skills: Option<Vec<String>>,
}
  2. Add parser:
fn parse_normalize_results_cli(args: &[String]) -> Result<Option<NormalizeResultsCliConfig>, String> {
    let mut enabled = false;
    let mut results_dir = None;
    let mut skills = None;

    let mut iter = args.iter().cloned();
    while let Some(arg) = iter.next() {
        match arg.as_str() {
            "--normalize-results" => enabled = true,
            "--results-dir" => results_dir = Some(PathBuf::from(next_arg(&mut iter, "--results-dir")?)),
            "--skills" => {
                let raw = next_arg(&mut iter, "--skills")?;
                skills = Some(raw.split(',').map(|item| item.trim().to_string()).filter(|item| !item.is_empty()).collect());
            }
            _ => {
                if let Some(value) = arg.strip_prefix("--results-dir=") {
                    results_dir = Some(PathBuf::from(value));
                } else if let Some(value) = arg.strip_prefix("--skills=") {
                    skills = Some(value.split(',').map(|item| item.trim().to_string()).filter(|item| !item.is_empty()).collect());
                }
            }
        }
    }

    if !enabled {
        return Ok(None);
    }

    Ok(Some(NormalizeResultsCliConfig {
        results_dir: results_dir.ok_or_else(|| "missing required --results-dir for --normalize-results".to_string())?,
        skills,
    }))
}
  3. Before scheduled-monitoring parsing in main(), add:
let raw_args: Vec<String> = std::env::args().skip(1).collect();
match parse_normalize_results_cli(&raw_args) {
    Ok(Some(config)) => {
        match sgclaw::result_normalization::run_normalize_results(&config.results_dir, config.skills.as_deref()) {
            Ok(_) => return ExitCode::SUCCESS,
            Err(err) => {
                eprintln!("normalize results failed: {err}");
                return ExitCode::FAILURE;
            }
        }
    }
    Ok(None) => {}
    Err(err) => {
        eprintln!("sg_claw argument error: {err}");
        return ExitCode::FAILURE;
    }
}
  4. In src/result_normalization/mod.rs, add a temporary orchestration stub:
use std::path::Path;

pub fn run_normalize_results(_results_dir: &Path, _skills: Option<&[String]>) -> anyhow::Result<()> {
    Ok(())
}
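
Note that parse_normalize_results_cli calls a `next_arg` helper. If src/bin/sg_claw.rs does not already define one, a minimal version is:

```rust
// Minimal next_arg: consume the value that must follow a flag, or report
// which flag was left dangling. Only needed if sg_claw.rs has no equivalent.
fn next_arg(iter: &mut impl Iterator<Item = String>, flag: &str) -> Result<String, String> {
    iter.next().ok_or_else(|| format!("missing value after {flag}"))
}
```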
  • Step 4: Run test to verify it passes

Run:

cargo test --test result_normalization_cli_test

Expected:

PASS
1 passed
  • Step 5: Commit
git add src/bin/sg_claw.rs src/result_normalization/mod.rs tests/result_normalization_cli_test.rs
git commit -m "feat: add normalize-results cli entry"

Task 6: Implement Full Normalization Orchestration And index.json Rebuild

Files:

  • Modify: src/result_normalization/mod.rs

  • Modify: src/result_normalization/index_builder.rs

  • Test: tests/result_normalization_cli_test.rs

  • Step 1: Extend CLI tests to require normalized outputs

Append to tests/result_normalization_cli_test.rs:

use std::fs;
use std::time::{SystemTime, UNIX_EPOCH};

fn temp_results_dir() -> std::path::PathBuf {
    let nanos = SystemTime::now().duration_since(UNIX_EPOCH).unwrap().as_nanos();
    let dir = std::env::temp_dir().join(format!("normalize-results-cli-{nanos}"));
    fs::create_dir_all(&dir).unwrap();
    dir
}

#[test]
fn normalize_results_cli_generates_latest_and_index() {
    let dir = temp_results_dir();
    fs::copy(
        "tests/fixtures/result_normalization/available-balance-below-zero-monitor.run-record.json",
        dir.join("available-balance-below-zero-monitor.run-record.json"),
    )
    .unwrap();

    let output = Command::new(env!("CARGO_BIN_EXE_sg_claw"))
        .arg("--normalize-results")
        .arg("--results-dir")
        .arg(&dir)
        .arg("--skills")
        .arg("available-balance-below-zero-monitor")
        .output()
        .unwrap();

    assert!(output.status.success(), "stderr={}", String::from_utf8_lossy(&output.stderr));
    assert!(dir.join("normalized/latest/available-balance-below-zero-monitor.json").exists());
    assert!(dir.join("normalized/index.json").exists());
}
  • Step 2: Run test to verify it fails

Run:

cargo test --test result_normalization_cli_test

Expected:

FAIL
normalized/latest/available-balance-below-zero-monitor.json does not exist
  • Step 3: Implement orchestration

Replace src/result_normalization/mod.rs with:

```rust
pub mod contract;
pub mod extractors;
pub mod index_builder;
pub mod registry;
pub mod writer;

use std::fs;
use std::path::Path;

use anyhow::{Context, Result};
use chrono::{DateTime, Local};

use crate::result_normalization::extractors::normalize_skill_run_record;
use crate::result_normalization::index_builder::build_index;
use crate::result_normalization::registry::skill_registry;
use crate::result_normalization::writer::write_normalized_result;

fn observed_time_from_metadata(metadata: &std::fs::Metadata) -> String {
    let dt: DateTime<Local> = DateTime::from(metadata.modified().unwrap_or_else(|_| std::time::SystemTime::now()));
    dt.to_rfc3339()
}

pub fn run_normalize_results(results_dir: &Path, skills: Option<&[String]>) -> Result<()> {
    let normalized_root = results_dir.join("normalized");
    let registry = skill_registry();
    let selected: Vec<_> = registry
        .into_iter()
        .filter(|item| skills.map(|list| list.iter().any(|skill| skill == item.skill_id)).unwrap_or(true))
        .collect();

    let mut generated = Vec::new();

    for item in selected {
        let run_record_path = results_dir.join(format!("{}.run-record.json", item.skill_id));
        let raw = fs::read_to_string(&run_record_path)
            .with_context(|| format!("read run record failed: {}", run_record_path.display()))?;
        let metadata = fs::metadata(&run_record_path)?;
        let observed_at = observed_time_from_metadata(&metadata);
        let generated_at = chrono::Local::now().to_rfc3339();
        let result = normalize_skill_run_record(
            item.skill_id,
            &run_record_path.display().to_string(),
            &raw,
            &observed_at,
            &generated_at,
        )?;
        write_normalized_result(&normalized_root, &result, false)?;
        generated.push(result);
    }

    let revision = chrono::Local::now().to_rfc3339();
    let index = build_index(revision.clone(), revision, &generated);
    let index_body = serde_json::to_string_pretty(&index)?;
    crate::result_normalization::writer::write_index(&normalized_root, &index_body)?;

    Ok(())
}
```

Extend src/result_normalization/writer.rs with:

```rust
pub fn write_index(normalized_root: &Path, body: &str) -> Result<()> {
    let index_path = normalized_root.join("index.json");
    write_atomic(&index_path, body)
}
```
- [ ] **Step 4: Run test to verify it passes**

Run:

```shell
cargo test --test result_normalization_cli_test
```

Expected:

```
PASS
2 passed
```
- [ ] **Step 5: Commit**

```shell
git add src/result_normalization/mod.rs src/result_normalization/writer.rs src/result_normalization/index_builder.rs tests/result_normalization_cli_test.rs
git commit -m "feat: orchestrate normalized result generation"
```

## Task 7: Add digital-employee Local Reader Service

Files:

- Create: D:/data/ideaSpace/rust/sgClaw/digital-employee/server/index.js
- Create: D:/data/ideaSpace/rust/sgClaw/digital-employee/server/config.js
- Create: D:/data/ideaSpace/rust/sgClaw/digital-employee/server/routes/results.js
- Create: D:/data/ideaSpace/rust/sgClaw/digital-employee/server/services/fileRepository.js
- Create: D:/data/ideaSpace/rust/sgClaw/digital-employee/server/services/resultStore.js
- Create: D:/data/ideaSpace/rust/sgClaw/digital-employee/server/services/validators.js
- Modify: D:/data/ideaSpace/rust/sgClaw/digital-employee/package.json

- [ ] **Step 1: Add server dependencies and scripts**

Modify D:/data/ideaSpace/rust/sgClaw/digital-employee/package.json:

```json
{
  "scripts": {
    "serve": "vue-cli-service serve",
    "serve:web": "vue-cli-service serve",
    "serve:api": "node server/index.js",
    "dev": "concurrently \"npm run serve:api\" \"npm run serve:web\"",
    "build": "vue-cli-service build",
    "build:web": "vue-cli-service build",
    "lint": "vue-cli-service lint"
  },
  "dependencies": {
    "concurrently": "^9.0.1",
    "core-js": "^3.8.3",
    "echarts": "^6.0.0",
    "element-ui": "^2.15.14",
    "express": "^4.21.2",
    "vue": "^2.6.14",
    "vue-router": "^3.5.1",
    "vuex": "^3.6.2"
  }
}
```
- [ ] **Step 2: Create local reader configuration**

Create D:/data/ideaSpace/rust/sgClaw/digital-employee/server/config.js:

```js
const path = require('path')

module.exports = {
  host: process.env.LOCAL_READER_HOST || '127.0.0.1',
  port: Number(process.env.LOCAL_READER_PORT || 31337),
  resultsDir: process.env.SGCLAW_RESULTS_DIR || 'D:\\desk\\sgclaw\\sgclaw\\results',
  normalizedDir:
    process.env.SGCLAW_NORMALIZED_DIR ||
    path.join(process.env.SGCLAW_RESULTS_DIR || 'D:\\desk\\sgclaw\\sgclaw\\results', 'normalized'),
  staleAfterSeconds: Number(process.env.RESULT_STALE_AFTER_SECONDS || 900)
}
```
- [ ] **Step 3: Create validation and repository helpers**

Create D:/data/ideaSpace/rust/sgClaw/digital-employee/server/services/validators.js:

```js
function assertSkillId(skillId) {
  if (!/^[a-z0-9-]+$/.test(skillId)) {
    const error = new Error(`invalid skill id: ${skillId}`)
    error.statusCode = 400
    throw error
  }
}

module.exports = { assertSkillId }
```
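The allowlist regex doubles as a path-traversal guard: any `/`, `\`, or `.` fails the match before a skill id can reach the filesystem layer. A standalone sketch of the same check (copied here for illustration; `isValidSkillId` is a name introduced for the sketch):

```javascript
// Same allowlist as server/services/validators.js: lowercase letters,
// digits, and hyphens only, so separators and dots never reach path.join.
function isValidSkillId(skillId) {
  return /^[a-z0-9-]+$/.test(skillId)
}

console.log(isValidSkillId('available-balance-below-zero-monitor')) // true
console.log(isValidSkillId('../../etc/passwd'))                     // false: traversal blocked
console.log(isValidSkillId('Skill_One'))                            // false: uppercase and underscore rejected
```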

Create D:/data/ideaSpace/rust/sgClaw/digital-employee/server/services/fileRepository.js:

```js
const fs = require('fs')
const path = require('path')

function readJson(filePath) {
  return JSON.parse(fs.readFileSync(filePath, 'utf8'))
}

function resolveNormalizedPath(normalizedDir, ...parts) {
  return path.join(normalizedDir, ...parts)
}

module.exports = { readJson, resolveNormalizedPath }
```
- [ ] **Step 4: Create result store and routes**

Create D:/data/ideaSpace/rust/sgClaw/digital-employee/server/services/resultStore.js:

```js
const fs = require('fs')
const path = require('path')
const { readJson, resolveNormalizedPath } = require('./fileRepository')
const { assertSkillId } = require('./validators')

function safeRead(filePath) {
  return fs.existsSync(filePath) ? readJson(filePath) : null
}

function createResultStore(config) {
  function readIndex() {
    return readJson(resolveNormalizedPath(config.normalizedDir, 'index.json'))
  }

  function readSkill(skillId) {
    assertSkillId(skillId)
    return {
      current: safeRead(resolveNormalizedPath(config.normalizedDir, 'latest', `${skillId}.json`)),
      lastGood: safeRead(resolveNormalizedPath(config.normalizedDir, 'last-good', `${skillId}.json`))
    }
  }

  function readSkillItems(skillId) {
    const { current } = readSkill(skillId)
    return current?.payload?.items || []
  }

  function readSkillRawItems(skillId) {
    const { current } = readSkill(skillId)
    return current?.payload?.rawItems || []
  }

  function readHistory(skillId, limit = 30) {
    assertSkillId(skillId)
    const dir = resolveNormalizedPath(config.normalizedDir, 'history', skillId)
    if (!fs.existsSync(dir)) return []
    const paths = []
    for (const year of fs.readdirSync(dir)) {
      const yearDir = path.join(dir, year)
      for (const month of fs.readdirSync(yearDir)) {
        const monthDir = path.join(yearDir, month)
        for (const file of fs.readdirSync(monthDir)) {
          paths.push(path.join(monthDir, file))
        }
      }
    }
    return paths
      .sort()
      .reverse()
      .slice(0, limit)
      .map(readJson)
  }

  return { readIndex, readSkill, readSkillItems, readSkillRawItems, readHistory }
}

module.exports = { createResultStore }
```
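`readHistory` orders snapshots with a plain lexicographic sort of the collected paths, which matches chronological order only while the year and month directories and the file names stay zero-padded. A small sketch of that assumption (the paths below are hypothetical examples of the `normalized/history/<skill>/<year>/<month>/` layout):

```javascript
// Hypothetical snapshot paths. Because every component is zero-padded,
// string order equals time order, so sort().reverse() yields newest-first.
const paths = [
  'history/demo-skill/2026/04/2026-04-26T09-00-00.json',
  'history/demo-skill/2025/12/2025-12-01T08-00-00.json',
  'history/demo-skill/2026/04/2026-04-26T21-30-00.json'
]

const newestFirst = [...paths].sort().reverse()
console.log(newestFirst[0]) // the 2026-04-26T21-30-00 snapshot
```

If the writer ever emits unpadded months or non-sortable file names, this ordering silently breaks, so the padding is a contract between the Rust writer and this reader.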

Create D:/data/ideaSpace/rust/sgClaw/digital-employee/server/routes/results.js:

```js
const express = require('express')

function createResultsRouter(store, config) {
  const router = express.Router()

  router.get('/health', (req, res) => {
    let revision = null
    try {
      revision = store.readIndex().revision
    } catch (_) {}
    res.json({
      ok: true,
      host: config.host,
      port: config.port,
      resultsDir: config.resultsDir,
      normalizedDir: config.normalizedDir,
      revision
    })
  })

  router.get('/results', (req, res, next) => {
    try {
      res.json(store.readIndex())
    } catch (error) {
      next(error)
    }
  })

  router.get('/results/:skillId', (req, res, next) => {
    try {
      res.json(store.readSkill(req.params.skillId))
    } catch (error) {
      next(error)
    }
  })

  router.get('/results/:skillId/items', (req, res, next) => {
    try {
      res.json({ items: store.readSkillItems(req.params.skillId) })
    } catch (error) {
      next(error)
    }
  })

  router.get('/results/:skillId/raw-items', (req, res, next) => {
    try {
      res.json({ items: store.readSkillRawItems(req.params.skillId) })
    } catch (error) {
      next(error)
    }
  })

  router.get('/results/:skillId/history', (req, res, next) => {
    try {
      const limit = Number(req.query.limit || 30)
      res.json({ items: store.readHistory(req.params.skillId, limit) })
    } catch (error) {
      next(error)
    }
  })

  return router
}

module.exports = { createResultsRouter }
```

Create D:/data/ideaSpace/rust/sgClaw/digital-employee/server/index.js:

```js
const express = require('express')
const config = require('./config')
const { createResultStore } = require('./services/resultStore')
const { createResultsRouter } = require('./routes/results')

const app = express()
const store = createResultStore(config)

app.use('/api', createResultsRouter(store, config))
app.use((error, req, res, next) => {
  const status = error.statusCode || 500
  res.status(status).json({
    ok: false,
    error: error.message || 'local reader error'
  })
})

app.listen(config.port, config.host, () => {
  console.log(`local reader listening on http://${config.host}:${config.port}`)
})
```
- [ ] **Step 5: Install dependencies and smoke run the API**

Run:

```shell
npm install
npm run serve:api
```

Expected:

```
local reader listening on http://127.0.0.1:31337
```
  • Step 6: Commit
git -C D:/data/ideaSpace/rust/sgClaw/digital-employee add package.json package-lock.json server
git -C D:/data/ideaSpace/rust/sgClaw/digital-employee commit -m "feat: add local reader api"

## Task 8: Add Dev Proxy And Frontend API Wrapper

Files:

- Modify: D:/data/ideaSpace/rust/sgClaw/digital-employee/vue.config.js
- Create: D:/data/ideaSpace/rust/sgClaw/digital-employee/src/api/results.js

- [ ] **Step 1: Add API proxy**

Modify D:/data/ideaSpace/rust/sgClaw/digital-employee/vue.config.js:

```js
const { defineConfig } = require('@vue/cli-service')

module.exports = defineConfig({
  transpileDependencies: true,
  devServer: {
    proxy: {
      '/api': {
        target: 'http://127.0.0.1:31337',
        changeOrigin: false
      }
    }
  }
})
```
- [ ] **Step 2: Add browser API wrapper**

Create D:/data/ideaSpace/rust/sgClaw/digital-employee/src/api/results.js:

```js
async function request(pathname) {
  const response = await fetch(pathname)
  if (!response.ok) {
    const body = await response.json().catch(() => ({}))
    throw new Error(body.error || `request failed: ${pathname}`)
  }
  return response.json()
}

export function fetchResultsIndex() {
  return request('/api/results')
}

export function fetchSkillResult(skillId) {
  return request(`/api/results/${skillId}`)
}

export function fetchSkillItems(skillId) {
  return request(`/api/results/${skillId}/items`)
}
```
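The wrapper surfaces the reader's `{ ok: false, error }` envelope as a thrown `Error`. A sketch of that error path with a stubbed global `fetch`, so it runs without a server (the stubbed response body is an assumption shaped like the local reader's error handler):

```javascript
// Same request helper as the wrapper above, exercised against a stubbed
// fetch that mimics the local reader's JSON error envelope.
async function request(pathname) {
  const response = await fetch(pathname)
  if (!response.ok) {
    const body = await response.json().catch(() => ({}))
    throw new Error(body.error || `request failed: ${pathname}`)
  }
  return response.json()
}

// Stub: every call fails with a 4xx-style envelope.
globalThis.fetch = async () => ({
  ok: false,
  json: async () => ({ ok: false, error: 'invalid skill id: ..' })
})

request('/api/results/%2e%2e').catch(error => console.log(error.message))
// logs: invalid skill id: ..
```

Callers (the Vuex actions below in the plan) therefore only ever see a plain `Error` with the server's message, never a half-parsed response object.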
- [ ] **Step 3: Smoke the proxy path**

Run:

```shell
npm run serve:api
npm run serve:web
```

Expected:

```
browser requests to /api/results are proxied to http://127.0.0.1:31337/api/results
```
- [ ] **Step 4: Commit**

```shell
git -C D:/data/ideaSpace/rust/sgClaw/digital-employee add vue.config.js src/api/results.js
git -C D:/data/ideaSpace/rust/sgClaw/digital-employee commit -m "feat: add dashboard result api client"
```

## Task 9: Add Vuex Results Module With Polling

Files:

- Create: D:/data/ideaSpace/rust/sgClaw/digital-employee/src/store/modules/results.js
- Modify: D:/data/ideaSpace/rust/sgClaw/digital-employee/src/store/index.js

- [ ] **Step 1: Create results module**

Create D:/data/ideaSpace/rust/sgClaw/digital-employee/src/store/modules/results.js:

```js
import { fetchResultsIndex, fetchSkillResult } from '@/api/results'

const state = () => ({
  index: null,
  skillDetails: {},
  loading: false,
  error: null,
  pollTimer: null
})

const mutations = {
  SET_LOADING(state, value) {
    state.loading = value
  },
  SET_ERROR(state, value) {
    state.error = value
  },
  SET_INDEX(state, value) {
    state.index = value
  },
  SET_SKILL_DETAIL(state, { skillId, detail }) {
    state.skillDetails = { ...state.skillDetails, [skillId]: detail }
  },
  SET_POLL_TIMER(state, timer) {
    state.pollTimer = timer
  },
  CLEAR_POLL_TIMER(state) {
    if (state.pollTimer) clearInterval(state.pollTimer)
    state.pollTimer = null
  }
}

const actions = {
  async refreshIndex({ commit }) {
    commit('SET_LOADING', true)
    commit('SET_ERROR', null)
    try {
      const index = await fetchResultsIndex()
      commit('SET_INDEX', index)
    } catch (error) {
      commit('SET_ERROR', error.message)
    } finally {
      commit('SET_LOADING', false)
    }
  },
  async loadSkillDetail({ commit }, skillId) {
    const detail = await fetchSkillResult(skillId)
    commit('SET_SKILL_DETAIL', { skillId, detail })
  },
  startPolling({ dispatch, commit, state }) {
    if (state.pollTimer) return
    dispatch('refreshIndex')
    const timer = setInterval(() => dispatch('refreshIndex'), 30000)
    commit('SET_POLL_TIMER', timer)
  },
  stopPolling({ commit }) {
    commit('CLEAR_POLL_TIMER')
  }
}

const getters = {
  skills(state) {
    return state.index?.skills || []
  }
}

export default {
  namespaced: true,
  state,
  mutations,
  actions,
  getters
}
```
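The start/stop actions reduce to a reentrancy guard around `setInterval`: a second `startPolling` must not stack a second timer. A framework-free sketch of the same lifecycle (`createPoller` is a name introduced for this sketch):

```javascript
// Framework-free equivalent of startPolling/stopPolling: a second start()
// is ignored while a timer is live, and stop() clears and resets it.
function createPoller(refresh, intervalMs) {
  let timer = null
  return {
    start() {
      if (timer) return          // mirrors `if (state.pollTimer) return`
      refresh()                   // immediate refresh before the interval kicks in
      timer = setInterval(refresh, intervalMs)
    },
    stop() {
      if (timer) clearInterval(timer)
      timer = null
    },
    active() {
      return timer !== null
    }
  }
}

let calls = 0
const poller = createPoller(() => { calls += 1 }, 30000)
poller.start()
poller.start()               // ignored: a timer is already running
console.log(calls)           // 1 (only the immediate refresh has run)
poller.stop()
console.log(poller.active()) // false
```

Dispatching the immediate refresh before arming the interval is what keeps the dashboard from showing stale data for the first 30 seconds after mount.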
- [ ] **Step 2: Register the module**

Modify D:/data/ideaSpace/rust/sgClaw/digital-employee/src/store/index.js:

```js
import results from './modules/results'

export default new Vuex.Store({
  modules: {
    results
  },
  state: {
    // ...existing state and store options continue unchanged
```
- [ ] **Step 3: Smoke the module**

Run:

```shell
npm run dev
```

Expected:

```
dashboard boot does not throw "unknown namespace results"
```
- [ ] **Step 4: Commit**

```shell
git -C D:/data/ideaSpace/rust/sgClaw/digital-employee add src/store/index.js src/store/modules/results.js
git -C D:/data/ideaSpace/rust/sgClaw/digital-employee commit -m "feat: add dashboard results store"
```

## Task 10: Replace Static Report Data With Live Results In Dashboard

Files:

  • Modify: D:/data/ideaSpace/rust/sgClaw/digital-employee/src/views/Dashboard.vue

  • Modify: D:/data/ideaSpace/rust/sgClaw/digital-employee/src/components/WorkReport.vue

- [ ] **Step 1: Start and stop polling with dashboard lifecycle**

Modify D:/data/ideaSpace/rust/sgClaw/digital-employee/src/views/Dashboard.vue:

```js
export default {
  name: 'Dashboard',
  components: { Header, UserInfo, SkillManager, Avatar3D, WorkReport, SkinSwitch },
  mounted() {
    this.$store.dispatch('results/startPolling')
  },
  beforeDestroy() {
    this.$store.dispatch('results/stopPolling')
  }
}
```
- [ ] **Step 2: Replace static imports in WorkReport.vue with store-driven data**

Modify D:/data/ideaSpace/rust/sgClaw/digital-employee/src/components/WorkReport.vue:

1. Remove:

```js
import reportData from '@/data/work-reports.json'
import anomalyData from '@/data/anomaly-logs.json'
```

2. Replace component data/computed with:

```js
export default {
  name: 'WorkReport',
  data() {
    return { chart: null }
  },
  computed: {
    skills() {
      return this.$store.getters['results/skills']
    },
    report() {
      const countSkills = this.skills.filter(item => item.resultType === 'count_snapshot')
      return {
        totalProcessed: countSkills.reduce((sum, item) => sum + Number(item.metric?.value || 0), 0),
        successCount: this.skills.filter(item => item.status === 'ok' || item.status === 'empty').length,
        failCount: this.skills.filter(item => item.status === 'error' || item.status === 'soft_error').length,
        trend: countSkills.map(item => ({
          month: item.skillName,
          count: Number(item.metric?.value || 0)
        }))
      }
    },
    anomalyLogs() {
      return this.skills
        .filter(item => item.status === 'error' || item.status === 'soft_error')
        .map((item, index) => ({
          id: index + 1,
          message: `${item.skillName}: ${item.summary}`
        }))
    }
  },
```

3. Add a chart refresh watcher:

```js
  watch: {
    report: {
      deep: true,
      handler() {
        if (this.chart) this.initChart()
      }
    }
  },
```

4. Guard empty chart data:

```js
data: this.report.trend.length ? this.report.trend.map(t => t.count) : [0]
```
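Applied to a hypothetical index payload (the sample entries below are made up, shaped like the normalized index's skill entries), the `report` aggregation behaves like this:

```javascript
// Hypothetical skill entries as the index would expose them.
const skills = [
  { skillName: 'balance monitor', resultType: 'count_snapshot', status: 'ok', metric: { value: '3' } },
  { skillName: 'archive workorder', resultType: 'count_snapshot', status: 'empty', metric: { value: 7 } },
  { skillName: 'todo crawler', resultType: 'detail_snapshot', status: 'error', summary: 'login failed' }
]

// Same aggregation as the `report` computed property: only count_snapshot
// skills feed the total, while every skill feeds the success/fail tallies.
const countSkills = skills.filter(item => item.resultType === 'count_snapshot')
const report = {
  totalProcessed: countSkills.reduce((sum, item) => sum + Number(item.metric?.value || 0), 0),
  successCount: skills.filter(item => item.status === 'ok' || item.status === 'empty').length,
  failCount: skills.filter(item => item.status === 'error' || item.status === 'soft_error').length
}

console.log(report) // { totalProcessed: 10, successCount: 2, failCount: 1 }
```

Note that `Number(...)` tolerates metrics serialized as strings, and `detail_snapshot` skills count toward success/failure but never toward `totalProcessed`.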
- [ ] **Step 3: Smoke the live dashboard path**

Run:

```shell
npm run dev
```

Expected:

```
dashboard loads
WorkReport renders values from /api/results
no static import dependency remains for live result cards
```
- [ ] **Step 4: Commit**

```shell
git -C D:/data/ideaSpace/rust/sgClaw/digital-employee add src/views/Dashboard.vue src/components/WorkReport.vue
git -C D:/data/ideaSpace/rust/sgClaw/digital-employee commit -m "feat: wire dashboard to normalized results"
```

## Task 11: Document Operator Workflow And Validate End-To-End

Files:

- Modify: D:/data/ideaSpace/rust/sgClaw/digital-employee/README.md
- Modify: docs/superpowers/plans/2026-04-26-skill-normalized-result-and-dashboard-local-reader-implementation-plan.md

- [ ] **Step 1: Update digital-employee/README.md for local-reader workflow**

Append this section:

````markdown
## Local skill result dashboard workflow

1. Copy raw `*.run-record.json` files into `D:\desk\sgclaw\sgclaw\results`
2. Run:

   ```powershell
   sg_claw.exe --normalize-results --results-dir D:\desk\sgclaw\sgclaw\results
   ```

3. Start dashboard + local API:

   ```powershell
   npm install
   npm run dev
   ```

4. Open the Vue dashboard and verify it loads `/api/results`
````
- [ ] **Step 2: Run end-to-end verification**

Run:

```powershell
cargo test --test result_normalization_contract_test
cargo test --test result_normalization_extractors_test
cargo test --test result_normalization_writer_test
cargo test --test result_normalization_cli_test
cargo build --bin sg_claw
```

Then in D:/data/ideaSpace/rust/sgClaw/digital-employee run:

```powershell
npm install
npm run build
```

Manual smoke:

1. Copy at least `available-balance-below-zero-monitor.run-record.json` into `D:\desk\sgclaw\sgclaw\results`
2. Run:

   ```powershell
   target\debug\sg_claw.exe --normalize-results --results-dir D:\desk\sgclaw\sgclaw\results --skills available-balance-below-zero-monitor
   ```

3. Start:

   ```powershell
   npm run serve:api
   npm run serve:web
   ```

4. Visit the dashboard and verify:
   - `/api/health` returns `ok: true`
   - `/api/results` includes `available-balance-below-zero-monitor`
   - WorkReport shows non-static live values
- [ ] **Step 3: Mark the plan complete and commit docs if needed**

```shell
git add docs/superpowers/plans/2026-04-26-skill-normalized-result-and-dashboard-local-reader-implementation-plan.md D:/data/ideaSpace/rust/sgClaw/digital-employee/README.md
git commit -m "docs: document normalized result implementation workflow"
```