feat: refactor sgclaw around zeroclaw compat runtime

This commit is contained in:
zyl
2026-03-26 16:23:31 +08:00
parent bca5b75801
commit ff0771a83f
1059 changed files with 409460 additions and 23 deletions


@@ -0,0 +1,134 @@
# DeepSeek Browser Smoke Implementation Plan
> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.
**Goal:** Add a repo-local verification path that exercises the browser-delivered `sgclaw` binary through the ZeroClaw/DeepSeek compat runtime without requiring a real DeepSeek account.
**Architecture:** Keep the existing SuperRPA browser smoke script unchanged. Add a small sgClaw-owned helper module that behaves like a fake OpenAI-compatible DeepSeek server and a runner script that starts that server, injects `DEEPSEEK_*` into the browser process environment, and delegates the actual browser/UI verification to the existing `sgclaw_chat_smoke.mjs`.
**Tech Stack:** Node.js ESM, Node built-in `node:test`, local HTTP server, Chromium `build_sgclaw.py`, existing SuperRPA `sgclaw_chat_smoke.mjs`.
### Task 1: Add Fake DeepSeek Response Planner
**Files:**
- Create: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/tools/browser_smoke/fake_deepseek_server.mjs`
- Test: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/tools/browser_smoke/fake_deepseek_server.test.mjs`
**Step 1: Write the failing test**
Add `node:test` coverage that proves the fake server planner:
- returns Baidu tool calls for `打开百度搜索天气` ("open Baidu and search for the weather")
- returns Zhihu navigate tool calls for `打开知乎搜索天气` ("open Zhihu and search for the weather")
- returns final summaries matching the existing smoke script expectations
- rejects unsupported instructions clearly
**Step 2: Run test to verify it fails**
Run:
```bash
node --test tools/browser_smoke/fake_deepseek_server.test.mjs
```
Expected: FAIL because the helper module does not exist yet.
**Step 3: Implement the minimal helper**
The helper should:
- inspect the latest message to determine the phase: a user instruction (first round) or a tool result (second round)
- emit OpenAI-compatible `choices[0].message.tool_calls` for the first round
- emit `choices[0].message.content` for the second round
- keep summaries identical to the current smoke assertions
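A minimal sketch of that two-round shape, assuming a hypothetical `planResponse(messages)` export and a `browser_action` tool name; the URLs and the summary string below are placeholders, and the real summaries must match the existing smoke assertions exactly:

```javascript
// Sketch only: planResponse() is a hypothetical name, and the argument and
// summary values are placeholders, not the strings the smoke script asserts on.
export function planResponse(messages) {
  const last = messages[messages.length - 1];

  // Second round: the latest message is a tool result, so emit a final summary.
  if (last.role === 'tool') {
    return {
      choices: [{
        message: { role: 'assistant', content: 'summary placeholder' },
        finish_reason: 'stop',
      }],
    };
  }

  // First round: map the user instruction to an OpenAI-compatible tool call.
  const text = last.content ?? '';
  let args;
  if (text.includes('百度')) {
    args = { action: 'navigate', url: 'https://www.baidu.com' };
  } else if (text.includes('知乎')) {
    args = { action: 'navigate', url: 'https://www.zhihu.com' };
  } else {
    throw new Error(`unsupported instruction: ${text}`);
  }

  return {
    choices: [{
      message: {
        role: 'assistant',
        content: null,
        tool_calls: [{
          id: 'call_1',
          type: 'function',
          function: { name: 'browser_action', arguments: JSON.stringify(args) },
        }],
      },
      finish_reason: 'tool_calls',
    }],
  };
}
```

The `choices[0].message.tool_calls` / `choices[0].message.content` split mirrors what an OpenAI-compatible client expects in the two rounds.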
**Step 4: Run test to verify it passes**
Run:
```bash
node --test tools/browser_smoke/fake_deepseek_server.test.mjs
```
Expected: PASS
### Task 2: Add DeepSeek Smoke Wrapper Script
**Files:**
- Create: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/tools/browser_smoke/run_deepseek_browser_smoke.mjs`
- Modify: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/README.md`
**Step 1: Write the failing wrapper expectation**
Extend the helper test with a small dry-run seam that proves the wrapper-built environment includes:
- `DEEPSEEK_API_KEY`
- `DEEPSEEK_BASE_URL`
- `DEEPSEEK_MODEL`
and points at the fake local server.
**Step 2: Run the targeted test to verify it fails**
Run:
```bash
node --test tools/browser_smoke/fake_deepseek_server.test.mjs
```
Expected: FAIL because no wrapper/env builder exists yet.
**Step 3: Implement the wrapper**
The wrapper should:
- start the fake DeepSeek server
- invoke:
```bash
node /home/zyl/projects/superRpa/src/chrome/browser/superrpa/sgclaw/sgclaw_chat_smoke.mjs
```
- inject `DEEPSEEK_*` into the child environment
- print the child stdout/stderr through
- stop the fake server on exit
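That lifecycle could be sketched as follows; `buildDeepSeekEnv` and `runWithFakeServer` are illustrative names, the key/model values are placeholders, and the real wrapper would pass the `sgclaw_chat_smoke.mjs` path as the command:

```javascript
import { createServer } from 'node:http';
import { spawn } from 'node:child_process';

// Sketch only: hypothetical helpers showing the start/inject/delegate/stop flow.
export function buildDeepSeekEnv(baseEnv, port) {
  return {
    ...baseEnv,
    DEEPSEEK_API_KEY: 'sk-local-fake',                // placeholder key
    DEEPSEEK_BASE_URL: `http://127.0.0.1:${port}/v1`, // points at the fake server
    DEEPSEEK_MODEL: 'deepseek-chat',                  // placeholder model name
  };
}

export async function runWithFakeServer(command, args, handler) {
  const server = createServer(handler);
  await new Promise((resolve) => server.listen(0, '127.0.0.1', resolve));
  const { port } = server.address();
  try {
    const child = spawn(command, args, {
      env: buildDeepSeekEnv(process.env, port),
      stdio: 'inherit', // print the child's stdout/stderr through
    });
    return await new Promise((resolve, reject) => {
      child.once('error', reject);
      child.once('exit', (code) => resolve(code));
    });
  } finally {
    server.close(); // stop the fake server on exit
  }
}
```

Listening on port `0` lets the OS pick a free port, so the wrapper never collides with a developer's other local servers.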
**Step 4: Run the targeted test to verify it passes**
Run:
```bash
node --test tools/browser_smoke/fake_deepseek_server.test.mjs
```
Expected: PASS
### Task 3: Verify the Browser-Delivered DeepSeek Path
**Files:**
- Verify: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/tools/browser_smoke/*`
- Verify: `/home/zyl/projects/superRpa/src/chrome/browser/superrpa/sgclaw/build_sgclaw.py`
**Step 1: Build the browser-delivered binary from the worktree**
Run:
```bash
python3 /home/zyl/projects/superRpa/src/chrome/browser/superrpa/sgclaw/build_sgclaw.py \
--manifest-path /home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/Cargo.toml \
--out /home/zyl/projects/superRpa/src/out/KylinRelease/sgclaw
```
Expected: PASS
**Step 2: Run the DeepSeek smoke wrapper**
Run:
```bash
node tools/browser_smoke/run_deepseek_browser_smoke.mjs
```
Expected:
- existing browser smoke passes
- `sgclaw` is forced down the compat runtime path through `DEEPSEEK_*`
- Baidu and Zhihu tasks still complete
**Step 3: Re-run full Rust tests to guard against regressions**
Run:
```bash
python3 /home/zyl/projects/superRpa/src/tools/crates/run_cargo.py test \
--manifest-path /home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/Cargo.toml \
--tests
```
Expected: PASS


@@ -0,0 +1,274 @@
# ZeroClaw Core Refactor Implementation Plan
> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.
**Goal:** Rebuild `sgClaw` on top of vendored ZeroClaw core while preserving the existing SuperRPA browser pipe protocol, `FunctionsUI` bridge names, and `sgclaw` binary contract.
**Architecture:** Keep `sgclaw` as the compatibility shell and replace its current minimal runtime with a ZeroClaw-based core adapter. Vendor the upstream ZeroClaw workspace into this repository for reproducible builds, then build a `compat` layer that translates `submit_task` / `task_complete` / log events to and from ZeroClaw agent, memory, cron, and tool abstractions. Do not integrate the upstream ZeroClaw gateway in this phase; the future standalone gateway will reuse the same vendored core through a separate entrypoint.
**Tech Stack:** Rust workspace, vendored upstream ZeroClaw (`zeroclawlabs`), current sgClaw pipe protocol and browser tool, DeepSeek via ZeroClaw provider routing, SQLite memory backends, Chromium `run_cargo.py` build flow.
### Task 1: Vendor ZeroClaw Upstream Snapshot
**Files:**
- Create: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/third_party/zeroclaw/**`
- Create: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/third_party/zeroclaw/VENDORED_FROM.md`
- Modify: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/.gitignore`
**Step 1: Copy the upstream snapshot into the repo**
Source:
```bash
/home/zyl/Downloads/zeroclaw-master.zip
```
Destination:
```bash
/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/third_party/zeroclaw
```
Strip the top-level `zeroclaw-master/` folder so the vendored directory itself is the workspace root.
**Step 2: Record provenance**
Write `third_party/zeroclaw/VENDORED_FROM.md` with:
- upstream repo URL
- upstream default branch (`master`)
- source ZIP filename
- vendoring date
- a note that this copy is used to guarantee offline/reproducible browser builds
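One possible shape for the provenance file; every bracketed value is a placeholder to fill in at vendoring time:

```markdown
# Vendored From

- Upstream repo: <upstream ZeroClaw repository URL>
- Upstream default branch: master
- Source archive: zeroclaw-master.zip
- Vendored on: <YYYY-MM-DD>

This copy exists to guarantee offline/reproducible browser builds; do not
edit it in place, re-vendor from a fresh upstream snapshot instead.
```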
**Step 3: Verify the vendor tree exists**
Run:
```bash
find third_party/zeroclaw -maxdepth 2 -name Cargo.toml -o -name README.md
```
Expected: upstream workspace files are present.
### Task 2: Convert sgClaw into a ZeroClaw-Backed Workspace Shell
**Files:**
- Modify: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/Cargo.toml`
- Modify: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/src/lib.rs`
- Modify: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/src/main.rs`
- Create: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/src/compat/mod.rs`
**Step 1: Add the vendored ZeroClaw dependency**
Use a local path dependency:
```toml
zeroclaw = { package = "zeroclawlabs", path = "third_party/zeroclaw" }
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
```
Do not use a git dependency. Browser builds must not depend on network access.
**Step 2: Preserve the root crate identity**
Keep:
- package name `sgclaw`
- binary name `sgclaw`
- current manifest path used by SuperRPA browser build scripts
This avoids breaking `/home/zyl/projects/superRpa/src/chrome/browser/superrpa/sgclaw/build_sgclaw.py`.
**Step 3: Route the process entrypoint through the compatibility layer**
`src/lib.rs` should keep:
- current handshake
- current `BrowserPipeTool`
- current message loop
Task execution, however, must be delegated to `compat::runtime` rather than to the current thin planner/runtime path.
### Task 3: Introduce the sgClaw Compatibility Layer
**Files:**
- Create: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/src/compat/runtime.rs`
- Create: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/src/compat/browser_tool_adapter.rs`
- Create: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/src/compat/config_adapter.rs`
- Create: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/src/compat/event_bridge.rs`
- Create: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/src/compat/memory_adapter.rs`
- Modify: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/src/agent/mod.rs`
**Step 1: Define the boundary**
`compat::runtime` owns:
- creating the ZeroClaw config/provider/runtime/memory/tool registry
- executing a task from a browser `submit_task`
- translating ZeroClaw progress into current `AgentMessage::LogEntry`
- returning the final summary string for current `task_complete`
`compat::event_bridge` owns all formatting decisions for:
- `[info] ...`
- `[error] ...`
- final summary propagation
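The real `event_bridge` lives in Rust; as a format illustration only, the line shapes the existing smoke script already expects can be pinned down like this (function names are illustrative):

```javascript
// Format illustration only, not the Rust implementation: these pin down the
// log-line shapes that must survive the compat refactor unchanged.
export function formatLogEntry(level, message) {
  // level is 'info' or 'error'; the line is forwarded verbatim as a LogEntry.
  return `[${level}] ${message}`;
}

export function parseLogEntry(line) {
  // Returns { level, message } for bridge-formatted lines, null otherwise
  // (e.g. for the final summary, which carries no level prefix here).
  const m = /^\[(info|error)\] (.*)$/.exec(line);
  return m ? { level: m[1], message: m[2] } : null;
}
```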
**Step 2: Keep the browser protocol unchanged**
Do not change these wire-level contracts:
- `BrowserMessage::SubmitTask`
- `AgentMessage::TaskComplete`
- `AgentMessage::LogEntry`
- `init/init_ack`
The browser side must not require any corresponding protocol change.
**Step 3: Retire direct planner ownership from the main path**
`src/agent/mod.rs` should stop owning the main task intelligence flow. The current rule-based planner can remain only as:
- transitional fallback, or
- deterministic test fixture
It must no longer be the primary execution engine.
### Task 4: Adapt BrowserPipeTool into a ZeroClaw Tool
**Files:**
- Create: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/src/compat/browser_tool_adapter.rs`
- Modify: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/src/pipe/browser_tool.rs`
- Test: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/tests/compat_browser_tool_test.rs`
**Step 1: Write the failing adapter test**
Add a focused test that proves:
- a ZeroClaw tool invocation can issue `navigate`, `type`, `click`, `getText`
- domain validation still flows through current MAC/rules enforcement
- returned observation data includes browser response payload and AOM snapshot
**Step 2: Verify RED**
Run:
```bash
python3 /home/zyl/projects/superRpa/src/tools/crates/run_cargo.py test \
--manifest-path /home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/Cargo.toml \
--test compat_browser_tool_test
```
Expected: fail because the adapter does not exist yet.
**Step 3: Implement the adapter**
Wrap the current `BrowserPipeTool` behind ZeroClaw's async `Tool` trait:
- tool name should stay stable and sgClaw-specific, for example `browser_action`
- schema should only expose the currently supported safe actions
- `ToolResult` should include serialized `data`, `aom_snapshot`, `timing`
### Task 5: Build the DeepSeek-Backed ZeroClaw Runtime Path
**Files:**
- Create: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/tests/compat_runtime_test.rs`
- Modify: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/src/compat/runtime.rs`
- Modify: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/src/compat/config_adapter.rs`
- Modify: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/src/config/settings.rs`
**Step 1: Write the failing runtime test**
Add a compatibility runtime test that proves:
- when `DEEPSEEK_API_KEY` is configured, sgClaw uses the ZeroClaw provider path
- the runtime can execute a simple mocked `browser_action` sequence
- the final result is returned as current sgClaw `task_complete`
Use a fake provider or deterministic ZeroClaw test seam for RED/GREEN speed.
**Step 2: Verify RED**
Run:
```bash
python3 /home/zyl/projects/superRpa/src/tools/crates/run_cargo.py test \
--manifest-path /home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/Cargo.toml \
--test compat_runtime_test
```
Expected: fail because the compatibility runtime is not wired yet.
**Step 3: Implement DeepSeek mapping**
Map current sgClaw env/config into ZeroClaw provider config:
- `DEEPSEEK_API_KEY`
- `DEEPSEEK_BASE_URL`
- `DEEPSEEK_MODEL`
DeepSeek should be routed as an OpenAI-compatible provider through ZeroClaw, not through the old local `DeepSeekProvider`.
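For manual local testing, the mapping consumes these three variables; the values shown are placeholders (the base URL and port depend on where the fake or real endpoint is running):

```shell
# Placeholder values for local testing; the wrapper injects these programmatically.
export DEEPSEEK_API_KEY="sk-local-fake"
export DEEPSEEK_BASE_URL="http://127.0.0.1:8000/v1"
export DEEPSEEK_MODEL="deepseek-chat"
```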
### Task 6: Introduce Memory and Cron Through the Compatibility Core
**Files:**
- Modify: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/src/compat/config_adapter.rs`
- Modify: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/src/compat/memory_adapter.rs`
- Create: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/tests/compat_memory_test.rs`
- Create: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/tests/compat_cron_test.rs`
**Step 1: Memory**
Configure a workspace-local ZeroClaw memory backend suitable for browser embedding:
- default to SQLite
- keep storage under sgClaw-owned data path
- avoid enabling unrelated gateway/channel storage
**Step 2: Cron**
Expose ZeroClaw cron internally, but do not yet bind it to browser UI.
This phase only requires:
- creating validated agent jobs
- listing/running due jobs in tests
The future standalone gateway will surface management UI for cron.
### Task 7: Verification and Browser Integration
**Files:**
- Verify: `/home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/tests/*.rs`
- Verify: `/home/zyl/projects/superRpa/src/chrome/browser/superrpa/sgclaw/build_sgclaw.py`
- Verify: `/home/zyl/projects/superRpa/src/chrome/browser/superrpa/sgclaw/sgclaw_chat_smoke.mjs`
**Step 1: Run the full Rust test baseline**
Run:
```bash
python3 /home/zyl/projects/superRpa/src/tools/crates/run_cargo.py test \
--manifest-path /home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/Cargo.toml \
--tests
```
Expected: current protocol/tool/planner compatibility tests still pass, or are deliberately replaced with equivalent compat tests.
**Step 2: Build the browser-delivered binary from the worktree**
Run:
```bash
python3 /home/zyl/projects/superRpa/src/chrome/browser/superrpa/sgclaw/build_sgclaw.py \
--manifest-path /home/zyl/projects/sgClaw/claw/.worktrees/zeroclaw-core-refactor/Cargo.toml \
--out /home/zyl/projects/superRpa/src/out/KylinRelease/sgclaw
```
Expected: the compatibility-shell binary is produced at the same output path as today.
**Step 3: Run browser smoke**
Run:
```bash
node /home/zyl/projects/superRpa/src/chrome/browser/superrpa/sgclaw/sgclaw_chat_smoke.mjs
```
Expected:
- browser protocol still starts and stops correctly
- Baidu task still succeeds
- Zhihu task still succeeds
- no browser-side API/bridge changes are required
### Non-Goals for This Refactor
- Do not replace the current SuperRPA browser protocol with ZeroClaw gateway protocols.
- Do not expose the upstream ZeroClaw web dashboard inside FunctionsUI.
- Do not ship the standalone gateway in this phase.
- Do not migrate browser-side code to a new transport.
### Phase 2 After This Refactor
After this compatibility refactor is stable:
- add a separate `gateway` crate or binary that uses the same vendored ZeroClaw core
- expose memory/cron/agent management there
- keep browser-side `sgclaw` as a thin local execution shell