Compare commits
6 Commits
fix/node-i...feat/confi

| Author | SHA1 | Date |
|---|---|---|
| | 5a4258815b | |
| | b4d95498d8 | |
| | 1d0192bb91 | |
| | 20ad85dee4 | |
| | 9aa2acf7cb | |
| | 2a49df0cee | |
22
CHANGELOG.md
@@ -6,12 +6,9 @@ Docs: https://docs.clawd.bot
### Changes
- Deps: update workspace + memory-lancedb dependencies.
- Dev: use tsgo for dev/watch builds by default; set `CLAWDBOT_TS_COMPILER=tsc` to opt out.
- Repo: remove the Peekaboo git submodule now that the SPM release is used.
- Update: sync plugin sources on channel switches and update npm-installed plugins during `clawdbot update`.
- Plugins: share npm plugin update logic between `clawdbot update` and `clawdbot plugins update`.
- Browser: allow config defaults for efficient snapshots in the tool/CLI. (#1336) — thanks @sebslight.
- Channels: add the Nostr plugin channel with profile management + onboarding install defaults. (#1323) — thanks @joelklabo.
- Plugins: require manifest-embedded config schemas, validate configs without loading plugin code, and surface plugin config warnings. (#1272) — thanks @thewilloftheshadow.
- Plugins: move channel catalog metadata into plugin manifests; align Nextcloud Talk policy helpers with core patterns. (#1290) — thanks @NicholaiVogel.
- Discord: fall back to /skill when native command limits are exceeded; expose /skill globally. (#1287) — thanks @thewilloftheshadow.
@@ -19,33 +16,14 @@ Docs: https://docs.clawd.bot
- Matrix: migrate to matrix-bot-sdk with E2EE support, location handling, and group allowlist upgrades. (#1298) — thanks @sibbl.
- Plugins/UI: let channel plugin metadata drive UI labels/icons and cron channel options. (#1306) — thanks @steipete.
- Zalouser: add channel dock metadata, config schema, setup wiring, probe, and status issues. (#1219) — thanks @suminhthanh.
- Security: warn when <=300B models run without sandboxing and with web tools enabled.
- Skills: add download installs with OS-filtered install options; add local sherpa-onnx-tts skill.
- Docs: clarify WhatsApp voice notes and Windows WSL portproxy LAN access notes.
- UI: add copy-as-markdown with error feedback and drop legacy list view. (#1345) — thanks @bradleypriest.
- TUI: add input history (up/down) for submitted messages. (#1348) — thanks @vignesh07.
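The `tsgo` opt-out noted above is a plain environment-variable switch. As a minimal sketch of the selection logic (the function name and env shape are illustrative, not the actual clawdbot build code):

```typescript
type TsCompiler = "tsgo" | "tsc";

// Prefer tsgo for dev/watch builds; only the documented opt-out value
// CLAWDBOT_TS_COMPILER=tsc switches back to the classic compiler.
function pickTsCompiler(env: Record<string, string | undefined>): TsCompiler {
  return env["CLAWDBOT_TS_COMPILER"] === "tsc" ? "tsc" : "tsgo";
}
```

In practice this would read `process.env` at build-script startup; any unset or unrecognized value keeps the `tsgo` default.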
### Fixes
- Discovery: shorten Bonjour DNS-SD service type to `_clawdbot-gw._tcp` and update discovery clients/docs.
- Agents: preserve subagent announce thread/topic routing + queued replies across channels. (#1241) — thanks @gnarco.
- Agents: avoid treating timeout errors with "aborted" messages as user aborts, so model fallback still runs.
- Diagnostics: export OTLP logs, correct queue depth tracking, and document message-flow telemetry.
- Diagnostics: emit message-flow diagnostics across channels via shared dispatch; gate heartbeat/webhook logging. (#1244) — thanks @oscargavin.
- CLI: preserve cron delivery settings when editing message payloads. (#1322) — thanks @KrauseFx.
- CLI: keep `clawdbot logs` output resilient to broken pipes while preserving progress output.
- Nodes: enforce node.invoke timeouts for node handlers. (#1357) — thanks @vignesh07.
- Model catalog: avoid caching import failures, log transient discovery errors, and keep partial results. (#1332) — thanks @dougvk.
- Doctor: clarify plugin auto-enable hint text in the startup banner.
- Gateway: clarify unauthorized handshake responses with token/password mismatch guidance.
- Gateway: clarify connect/validation errors for gateway params. (#1347) — thanks @vignesh07.
- Gateway: preserve restart wake routing + thread replies across restarts. (#1337) — thanks @John-Rood.
- Gateway: reschedule per-agent heartbeats on config hot reload without restarting the runner.
- Config: log invalid config issues once per run and keep invalid-config errors stackless.
- Exec: default gateway/node exec security to allowlist when unset (sandbox stays deny).
- UI: keep config form enums typed, preserve empty strings, protect sensitive defaults, and deepen config search. (#1315) — thanks @MaudeBot.
- UI: preserve ordered list numbering in chat markdown. (#1341) — thanks @bradleypriest.
- UI: allow Control UI to read gatewayUrl from URL params for remote WebSocket targets. (#1342) — thanks @ameno-.
- Web search: infer Perplexity base URL from API key source (direct vs OpenRouter).
- Web fetch: harden SSRF protection with shared hostname checks and redirect limits. (#1346) — thanks @fogboots.
- TUI: keep thinking blocks ordered before content during streaming and isolate per-run assembly. (#1202) — thanks @aaronveklabs.
- TUI: align custom editor initialization with the latest pi-tui API. (#1298) — thanks @sibbl.
- CLI: avoid duplicating --profile/--dev flags when formatting commands.
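The Control UI `gatewayUrl` fix above amounts to reading a query parameter with a fallback. A hedged sketch (function and parameter names are assumptions, not the actual clawdbot code):

```typescript
// Read a gatewayUrl override from the page's query string so the Control UI
// can target a remote WebSocket gateway; fall back to the bundled default.
function resolveGatewayUrl(search: string, fallback: string): string {
  const params = new URLSearchParams(search);
  return params.get("gatewayUrl") ?? fallback;
}
```

In a browser this would be called with `window.location.search`; an absent or empty query string leaves the default target untouched.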
46
README.md
@@ -479,27 +479,27 @@ Core contributors:
Thanks to all clawtributors:
<p align="left">
<a href="https://github.com/steipete"><img src="https://avatars.githubusercontent.com/u/58493?v=4&s=48" width="48" height="48" alt="steipete" title="steipete"/></a> <a href="https://github.com/bohdanpodvirnyi"><img src="https://avatars.githubusercontent.com/u/31819391?v=4&s=48" width="48" height="48" alt="bohdanpodvirnyi" title="bohdanpodvirnyi"/></a> <a href="https://github.com/joaohlisboa"><img src="https://avatars.githubusercontent.com/u/8200873?v=4&s=48" width="48" height="48" alt="joaohlisboa" title="joaohlisboa"/></a> <a href="https://github.com/mneves75"><img src="https://avatars.githubusercontent.com/u/2423436?v=4&s=48" width="48" height="48" alt="mneves75" title="mneves75"/></a> <a href="https://github.com/MatthieuBizien"><img src="https://avatars.githubusercontent.com/u/173090?v=4&s=48" width="48" height="48" alt="MatthieuBizien" title="MatthieuBizien"/></a> <a href="https://github.com/MaudeBot"><img src="https://avatars.githubusercontent.com/u/255777700?v=4&s=48" width="48" height="48" alt="MaudeBot" title="MaudeBot"/></a> <a href="https://github.com/rahthakor"><img src="https://avatars.githubusercontent.com/u/8470553?v=4&s=48" width="48" height="48" alt="rahthakor" title="rahthakor"/></a> <a href="https://github.com/vrknetha"><img src="https://avatars.githubusercontent.com/u/20596261?v=4&s=48" width="48" height="48" alt="vrknetha" title="vrknetha"/></a> <a href="https://github.com/radek-paclt"><img src="https://avatars.githubusercontent.com/u/50451445?v=4&s=48" width="48" height="48" alt="radek-paclt" title="radek-paclt"/></a> <a href="https://github.com/joshp123"><img src="https://avatars.githubusercontent.com/u/1497361?v=4&s=48" width="48" height="48" alt="joshp123" title="joshp123"/></a>
<a href="https://github.com/mukhtharcm"><img src="https://avatars.githubusercontent.com/u/56378562?v=4&s=48" width="48" height="48" alt="mukhtharcm" title="mukhtharcm"/></a> <a href="https://github.com/maxsumrall"><img src="https://avatars.githubusercontent.com/u/628843?v=4&s=48" width="48" height="48" alt="maxsumrall" title="maxsumrall"/></a> <a href="https://github.com/xadenryan"><img src="https://avatars.githubusercontent.com/u/165437834?v=4&s=48" width="48" height="48" alt="xadenryan" title="xadenryan"/></a> <a href="https://github.com/tobiasbischoff"><img src="https://avatars.githubusercontent.com/u/711564?v=4&s=48" width="48" height="48" alt="Tobias Bischoff" title="Tobias Bischoff"/></a> <a href="https://github.com/juanpablodlc"><img src="https://avatars.githubusercontent.com/u/92012363?v=4&s=48" width="48" height="48" alt="juanpablodlc" title="juanpablodlc"/></a> <a href="https://github.com/hsrvc"><img src="https://avatars.githubusercontent.com/u/129702169?v=4&s=48" width="48" height="48" alt="hsrvc" title="hsrvc"/></a> <a href="https://github.com/magimetal"><img src="https://avatars.githubusercontent.com/u/36491250?v=4&s=48" width="48" height="48" alt="magimetal" title="magimetal"/></a> <a href="https://github.com/meaningfool"><img src="https://avatars.githubusercontent.com/u/2862331?v=4&s=48" width="48" height="48" alt="meaningfool" title="meaningfool"/></a> <a href="https://github.com/NicholasSpisak"><img src="https://avatars.githubusercontent.com/u/129075147?v=4&s=48" width="48" height="48" alt="NicholasSpisak" title="NicholasSpisak"/></a> <a href="https://github.com/AbhisekBasu1"><img src="https://avatars.githubusercontent.com/u/40645221?v=4&s=48" width="48" height="48" alt="abhisekbasu1" title="abhisekbasu1"/></a>
<a href="https://github.com/sebslight"><img src="https://avatars.githubusercontent.com/u/19554889?v=4&s=48" width="48" height="48" alt="sebslight" title="sebslight"/></a> <a href="https://github.com/claude"><img src="https://avatars.githubusercontent.com/u/81847?v=4&s=48" width="48" height="48" alt="claude" title="claude"/></a> <a href="https://github.com/jamesgroat"><img src="https://avatars.githubusercontent.com/u/2634024?v=4&s=48" width="48" height="48" alt="jamesgroat" title="jamesgroat"/></a> <a href="https://github.com/Hyaxia"><img src="https://avatars.githubusercontent.com/u/36747317?v=4&s=48" width="48" height="48" alt="Hyaxia" title="Hyaxia"/></a> <a href="https://github.com/dantelex"><img src="https://avatars.githubusercontent.com/u/631543?v=4&s=48" width="48" height="48" alt="dantelex" title="dantelex"/></a> <a href="https://github.com/daveonkels"><img src="https://avatars.githubusercontent.com/u/533642?v=4&s=48" width="48" height="48" alt="daveonkels" title="daveonkels"/></a> <a href="https://github.com/mteam88"><img src="https://avatars.githubusercontent.com/u/84196639?v=4&s=48" width="48" height="48" alt="mteam88" title="mteam88"/></a> <a href="https://github.com/omniwired"><img src="https://avatars.githubusercontent.com/u/322761?v=4&s=48" width="48" height="48" alt="Eng. Juan Combetto" title="Eng. Juan Combetto"/></a> <a href="https://github.com/dbhurley"><img src="https://avatars.githubusercontent.com/u/5251425?v=4&s=48" width="48" height="48" alt="dbhurley" title="dbhurley"/></a> <a href="https://github.com/mbelinky"><img src="https://avatars.githubusercontent.com/u/132747814?v=4&s=48" width="48" height="48" alt="Mariano Belinky" title="Mariano Belinky"/></a>
<a href="https://github.com/TSavo"><img src="https://avatars.githubusercontent.com/u/877990?v=4&s=48" width="48" height="48" alt="TSavo" title="TSavo"/></a> <a href="https://github.com/julianengel"><img src="https://avatars.githubusercontent.com/u/10634231?v=4&s=48" width="48" height="48" alt="julianengel" title="julianengel"/></a> <a href="https://github.com/benithors"><img src="https://avatars.githubusercontent.com/u/20652882?v=4&s=48" width="48" height="48" alt="benithors" title="benithors"/></a> <a href="https://github.com/bradleypriest"><img src="https://avatars.githubusercontent.com/u/167215?v=4&s=48" width="48" height="48" alt="bradleypriest" title="bradleypriest"/></a> <a href="https://github.com/timolins"><img src="https://avatars.githubusercontent.com/u/1440854?v=4&s=48" width="48" height="48" alt="timolins" title="timolins"/></a> <a href="https://github.com/Nachx639"><img src="https://avatars.githubusercontent.com/u/71144023?v=4&s=48" width="48" height="48" alt="nachx639" title="nachx639"/></a> <a href="https://github.com/sreekaransrinath"><img src="https://avatars.githubusercontent.com/u/50989977?v=4&s=48" width="48" height="48" alt="sreekaransrinath" title="sreekaransrinath"/></a> <a href="https://github.com/gupsammy"><img src="https://avatars.githubusercontent.com/u/20296019?v=4&s=48" width="48" height="48" alt="gupsammy" title="gupsammy"/></a> <a href="https://github.com/cristip73"><img src="https://avatars.githubusercontent.com/u/24499421?v=4&s=48" width="48" height="48" alt="cristip73" title="cristip73"/></a> <a href="https://github.com/nachoiacovino"><img src="https://avatars.githubusercontent.com/u/50103937?v=4&s=48" width="48" height="48" alt="nachoiacovino" title="nachoiacovino"/></a>
<a href="https://github.com/vsabavat"><img src="https://avatars.githubusercontent.com/u/50385532?v=4&s=48" width="48" height="48" alt="Vasanth Rao Naik Sabavat" title="Vasanth Rao Naik Sabavat"/></a> <a href="https://github.com/cpojer"><img src="https://avatars.githubusercontent.com/u/13352?v=4&s=48" width="48" height="48" alt="cpojer" title="cpojer"/></a> <a href="https://github.com/lc0rp"><img src="https://avatars.githubusercontent.com/u/2609441?v=4&s=48" width="48" height="48" alt="lc0rp" title="lc0rp"/></a> <a href="https://github.com/scald"><img src="https://avatars.githubusercontent.com/u/1215913?v=4&s=48" width="48" height="48" alt="scald" title="scald"/></a> <a href="https://github.com/gumadeiras"><img src="https://avatars.githubusercontent.com/u/5599352?v=4&s=48" width="48" height="48" alt="gumadeiras" title="gumadeiras"/></a> <a href="https://github.com/andranik-sahakyan"><img src="https://avatars.githubusercontent.com/u/8908029?v=4&s=48" width="48" height="48" alt="andranik-sahakyan" title="andranik-sahakyan"/></a> <a href="https://github.com/davidguttman"><img src="https://avatars.githubusercontent.com/u/431696?v=4&s=48" width="48" height="48" alt="davidguttman" title="davidguttman"/></a> <a href="https://github.com/sleontenko"><img src="https://avatars.githubusercontent.com/u/7135949?v=4&s=48" width="48" height="48" alt="sleontenko" title="sleontenko"/></a> <a href="https://github.com/sircrumpet"><img src="https://avatars.githubusercontent.com/u/4436535?v=4&s=48" width="48" height="48" alt="sircrumpet" title="sircrumpet"/></a> <a href="https://github.com/peschee"><img src="https://avatars.githubusercontent.com/u/63866?v=4&s=48" width="48" height="48" alt="peschee" title="peschee"/></a>
<a href="https://github.com/rafaelreis-r"><img src="https://avatars.githubusercontent.com/u/57492577?v=4&s=48" width="48" height="48" alt="rafaelreis-r" title="rafaelreis-r"/></a> <a href="https://github.com/thewilloftheshadow"><img src="https://avatars.githubusercontent.com/u/35580099?v=4&s=48" width="48" height="48" alt="thewilloftheshadow" title="thewilloftheshadow"/></a> <a href="https://github.com/ratulsarna"><img src="https://avatars.githubusercontent.com/u/105903728?v=4&s=48" width="48" height="48" alt="ratulsarna" title="ratulsarna"/></a> <a href="https://github.com/lutr0"><img src="https://avatars.githubusercontent.com/u/76906369?v=4&s=48" width="48" height="48" alt="lutr0" title="lutr0"/></a> <a href="https://github.com/danielz1z"><img src="https://avatars.githubusercontent.com/u/235270390?v=4&s=48" width="48" height="48" alt="danielz1z" title="danielz1z"/></a> <a href="https://github.com/emanuelst"><img src="https://avatars.githubusercontent.com/u/9994339?v=4&s=48" width="48" height="48" alt="emanuelst" title="emanuelst"/></a> <a href="https://github.com/KristijanJovanovski"><img src="https://avatars.githubusercontent.com/u/8942284?v=4&s=48" width="48" height="48" alt="KristijanJovanovski" title="KristijanJovanovski"/></a> <a href="https://github.com/CashWilliams"><img src="https://avatars.githubusercontent.com/u/613573?v=4&s=48" width="48" height="48" alt="CashWilliams" title="CashWilliams"/></a> <a href="https://github.com/rdev"><img src="https://avatars.githubusercontent.com/u/8418866?v=4&s=48" width="48" height="48" alt="rdev" title="rdev"/></a> <a href="https://github.com/osolmaz"><img src="https://avatars.githubusercontent.com/u/2453968?v=4&s=48" width="48" height="48" alt="osolmaz" title="osolmaz"/></a>
<a href="https://github.com/joshrad-dev"><img src="https://avatars.githubusercontent.com/u/62785552?v=4&s=48" width="48" height="48" alt="joshrad-dev" title="joshrad-dev"/></a> <a href="https://github.com/kiranjd"><img src="https://avatars.githubusercontent.com/u/25822851?v=4&s=48" width="48" height="48" alt="kiranjd" title="kiranjd"/></a> <a href="https://github.com/adityashaw2"><img src="https://avatars.githubusercontent.com/u/41204444?v=4&s=48" width="48" height="48" alt="adityashaw2" title="adityashaw2"/></a> <a href="https://github.com/search?q=sheeek"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="sheeek" title="sheeek"/></a> <a href="https://github.com/artuskg"><img src="https://avatars.githubusercontent.com/u/11966157?v=4&s=48" width="48" height="48" alt="artuskg" title="artuskg"/></a> <a href="https://github.com/onutc"><img src="https://avatars.githubusercontent.com/u/152018508?v=4&s=48" width="48" height="48" alt="onutc" title="onutc"/></a> <a href="https://github.com/tyler6204"><img src="https://avatars.githubusercontent.com/u/64381258?v=4&s=48" width="48" height="48" alt="tyler6204" title="tyler6204"/></a> <a href="https://github.com/ManuelHettich"><img src="https://avatars.githubusercontent.com/u/17690367?v=4&s=48" width="48" height="48" alt="manuelhettich" title="manuelhettich"/></a> <a href="https://github.com/minghinmatthewlam"><img src="https://avatars.githubusercontent.com/u/14224566?v=4&s=48" width="48" height="48" alt="minghinmatthewlam" title="minghinmatthewlam"/></a> <a href="https://github.com/myfunc"><img src="https://avatars.githubusercontent.com/u/19294627?v=4&s=48" width="48" height="48" alt="myfunc" title="myfunc"/></a>
<a href="https://github.com/buddyh"><img src="https://avatars.githubusercontent.com/u/31752869?v=4&s=48" width="48" height="48" alt="buddyh" title="buddyh"/></a> <a href="https://github.com/connorshea"><img src="https://avatars.githubusercontent.com/u/2977353?v=4&s=48" width="48" height="48" alt="connorshea" title="connorshea"/></a> <a href="https://github.com/mcinteerj"><img src="https://avatars.githubusercontent.com/u/3613653?v=4&s=48" width="48" height="48" alt="mcinteerj" title="mcinteerj"/></a> <a href="https://github.com/John-Rood"><img src="https://avatars.githubusercontent.com/u/62669593?v=4&s=48" width="48" height="48" alt="John-Rood" title="John-Rood"/></a> <a href="https://github.com/timkrase"><img src="https://avatars.githubusercontent.com/u/38947626?v=4&s=48" width="48" height="48" alt="timkrase" title="timkrase"/></a> <a href="https://github.com/zerone0x"><img src="https://avatars.githubusercontent.com/u/39543393?v=4&s=48" width="48" height="48" alt="zerone0x" title="zerone0x"/></a> <a href="https://github.com/gerardward2007"><img src="https://avatars.githubusercontent.com/u/3002155?v=4&s=48" width="48" height="48" alt="gerardward2007" title="gerardward2007"/></a> <a href="https://github.com/obviyus"><img src="https://avatars.githubusercontent.com/u/22031114?v=4&s=48" width="48" height="48" alt="obviyus" title="obviyus"/></a> <a href="https://github.com/tosh-hamburg"><img src="https://avatars.githubusercontent.com/u/58424326?v=4&s=48" width="48" height="48" alt="tosh-hamburg" title="tosh-hamburg"/></a> <a href="https://github.com/azade-c"><img src="https://avatars.githubusercontent.com/u/252790079?v=4&s=48" width="48" height="48" alt="azade-c" title="azade-c"/></a>
<a href="https://github.com/roshanasingh4"><img src="https://avatars.githubusercontent.com/u/88576930?v=4&s=48" width="48" height="48" alt="roshanasingh4" title="roshanasingh4"/></a> <a href="https://github.com/bjesuiter"><img src="https://avatars.githubusercontent.com/u/2365676?v=4&s=48" width="48" height="48" alt="bjesuiter" title="bjesuiter"/></a> <a href="https://github.com/cheeeee"><img src="https://avatars.githubusercontent.com/u/21245729?v=4&s=48" width="48" height="48" alt="cheeeee" title="cheeeee"/></a> <a href="https://github.com/j1philli"><img src="https://avatars.githubusercontent.com/u/3744255?v=4&s=48" width="48" height="48" alt="Josh Phillips" title="Josh Phillips"/></a> <a href="https://github.com/Whoaa512"><img src="https://avatars.githubusercontent.com/u/1581943?v=4&s=48" width="48" height="48" alt="Whoaa512" title="Whoaa512"/></a> <a href="https://github.com/YuriNachos"><img src="https://avatars.githubusercontent.com/u/19365375?v=4&s=48" width="48" height="48" alt="YuriNachos" title="YuriNachos"/></a> <a href="https://github.com/chriseidhof"><img src="https://avatars.githubusercontent.com/u/5382?v=4&s=48" width="48" height="48" alt="chriseidhof" title="chriseidhof"/></a> <a href="https://github.com/vignesh07"><img src="https://avatars.githubusercontent.com/u/1436853?v=4&s=48" width="48" height="48" alt="vignesh07" title="vignesh07"/></a> <a href="https://github.com/ysqander"><img src="https://avatars.githubusercontent.com/u/80843820?v=4&s=48" width="48" height="48" alt="ysqander" title="ysqander"/></a> <a href="https://github.com/superman32432432"><img src="https://avatars.githubusercontent.com/u/7228420?v=4&s=48" width="48" height="48" alt="superman32432432" title="superman32432432"/></a>
<a href="https://github.com/search?q=Yurii%20Chukhlib"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Yurii Chukhlib" title="Yurii Chukhlib"/></a> <a href="https://github.com/grp06"><img src="https://avatars.githubusercontent.com/u/1573959?v=4&s=48" width="48" height="48" alt="grp06" title="grp06"/></a> <a href="https://github.com/antons"><img src="https://avatars.githubusercontent.com/u/129705?v=4&s=48" width="48" height="48" alt="antons" title="antons"/></a> <a href="https://github.com/austinm911"><img src="https://avatars.githubusercontent.com/u/31991302?v=4&s=48" width="48" height="48" alt="austinm911" title="austinm911"/></a> <a href="https://github.com/apps/blacksmith-sh"><img src="https://avatars.githubusercontent.com/in/807020?v=4&s=48" width="48" height="48" alt="blacksmith-sh[bot]" title="blacksmith-sh[bot]"/></a> <a href="https://github.com/dan-dr"><img src="https://avatars.githubusercontent.com/u/6669808?v=4&s=48" width="48" height="48" alt="dan-dr" title="dan-dr"/></a> <a href="https://github.com/HeimdallStrategy"><img src="https://avatars.githubusercontent.com/u/223014405?v=4&s=48" width="48" height="48" alt="HeimdallStrategy" title="HeimdallStrategy"/></a> <a href="https://github.com/imfing"><img src="https://avatars.githubusercontent.com/u/5097752?v=4&s=48" width="48" height="48" alt="imfing" title="imfing"/></a> <a href="https://github.com/jalehman"><img src="https://avatars.githubusercontent.com/u/550978?v=4&s=48" width="48" height="48" alt="jalehman" title="jalehman"/></a> <a href="https://github.com/jarvis-medmatic"><img src="https://avatars.githubusercontent.com/u/252428873?v=4&s=48" width="48" height="48" alt="jarvis-medmatic" title="jarvis-medmatic"/></a>
<a href="https://github.com/kkarimi"><img src="https://avatars.githubusercontent.com/u/875218?v=4&s=48" width="48" height="48" alt="kkarimi" title="kkarimi"/></a> <a href="https://github.com/mahmoudashraf93"><img src="https://avatars.githubusercontent.com/u/9130129?v=4&s=48" width="48" height="48" alt="mahmoudashraf93" title="mahmoudashraf93"/></a> <a href="https://github.com/petter-b"><img src="https://avatars.githubusercontent.com/u/62076402?v=4&s=48" width="48" height="48" alt="petter-b" title="petter-b"/></a> <a href="https://github.com/pkrmf"><img src="https://avatars.githubusercontent.com/u/1714267?v=4&s=48" width="48" height="48" alt="pkrmf" title="pkrmf"/></a> <a href="https://github.com/RandyVentures"><img src="https://avatars.githubusercontent.com/u/149904821?v=4&s=48" width="48" height="48" alt="RandyVentures" title="RandyVentures"/></a> <a href="https://github.com/search?q=Ryan%20Lisse"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Ryan Lisse" title="Ryan Lisse"/></a> <a href="https://github.com/dougvk"><img src="https://avatars.githubusercontent.com/u/401660?v=4&s=48" width="48" height="48" alt="dougvk" title="dougvk"/></a> <a href="https://github.com/erikpr1994"><img src="https://avatars.githubusercontent.com/u/6299331?v=4&s=48" width="48" height="48" alt="erikpr1994" title="erikpr1994"/></a> <a href="https://github.com/search?q=Ghost"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Ghost" title="Ghost"/></a> <a href="https://github.com/jonasjancarik"><img src="https://avatars.githubusercontent.com/u/2459191?v=4&s=48" width="48" height="48" alt="jonasjancarik" title="jonasjancarik"/></a>
<a href="https://github.com/search?q=Keith%20the%20Silly%20Goose"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Keith the Silly Goose" title="Keith the Silly Goose"/></a> <a href="https://github.com/search?q=L36%20Server"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="L36 Server" title="L36 Server"/></a> <a href="https://github.com/search?q=Marc"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Marc" title="Marc"/></a> <a href="https://github.com/mitschabaude-bot"><img src="https://avatars.githubusercontent.com/u/247582884?v=4&s=48" width="48" height="48" alt="mitschabaude-bot" title="mitschabaude-bot"/></a> <a href="https://github.com/neist"><img src="https://avatars.githubusercontent.com/u/1029724?v=4&s=48" width="48" height="48" alt="neist" title="neist"/></a> <a href="https://github.com/ngutman"><img src="https://avatars.githubusercontent.com/u/1540134?v=4&s=48" width="48" height="48" alt="ngutman" title="ngutman"/></a> <a href="https://github.com/chrisrodz"><img src="https://avatars.githubusercontent.com/u/2967620?v=4&s=48" width="48" height="48" alt="chrisrodz" title="chrisrodz"/></a> <a href="https://github.com/search?q=Friederike%20Seiler"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Friederike Seiler" title="Friederike Seiler"/></a> <a href="https://github.com/gabriel-trigo"><img src="https://avatars.githubusercontent.com/u/38991125?v=4&s=48" width="48" height="48" alt="gabriel-trigo" title="gabriel-trigo"/></a> <a href="https://github.com/Iamadig"><img src="https://avatars.githubusercontent.com/u/102129234?v=4&s=48" width="48" height="48" alt="iamadig" title="iamadig"/></a>
<a href="https://github.com/search?q=Kit"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Kit" title="Kit"/></a> <a href="https://github.com/koala73"><img src="https://avatars.githubusercontent.com/u/996596?v=4&s=48" width="48" height="48" alt="koala73" title="koala73"/></a> <a href="https://github.com/manmal"><img src="https://avatars.githubusercontent.com/u/142797?v=4&s=48" width="48" height="48" alt="manmal" title="manmal"/></a> <a href="https://github.com/ogulcancelik"><img src="https://avatars.githubusercontent.com/u/7064011?v=4&s=48" width="48" height="48" alt="ogulcancelik" title="ogulcancelik"/></a> <a href="https://github.com/pasogott"><img src="https://avatars.githubusercontent.com/u/23458152?v=4&s=48" width="48" height="48" alt="pasogott" title="pasogott"/></a> <a href="https://github.com/petradonka"><img src="https://avatars.githubusercontent.com/u/7353770?v=4&s=48" width="48" height="48" alt="petradonka" title="petradonka"/></a> <a href="https://github.com/rubyrunsstuff"><img src="https://avatars.githubusercontent.com/u/246602379?v=4&s=48" width="48" height="48" alt="rubyrunsstuff" title="rubyrunsstuff"/></a> <a href="https://github.com/sibbl"><img src="https://avatars.githubusercontent.com/u/866535?v=4&s=48" width="48" height="48" alt="sibbl" title="sibbl"/></a> <a href="https://github.com/suminhthanh"><img src="https://avatars.githubusercontent.com/u/2907636?v=4&s=48" width="48" height="48" alt="suminhthanh" title="suminhthanh"/></a> <a href="https://github.com/VACInc"><img src="https://avatars.githubusercontent.com/u/3279061?v=4&s=48" width="48" height="48" alt="VACInc" title="VACInc"/></a>
<a href="https://github.com/wes-davis"><img src="https://avatars.githubusercontent.com/u/16506720?v=4&s=48" width="48" height="48" alt="wes-davis" title="wes-davis"/></a> <a href="https://github.com/zats"><img src="https://avatars.githubusercontent.com/u/2688806?v=4&s=48" width="48" height="48" alt="zats" title="zats"/></a> <a href="https://github.com/24601"><img src="https://avatars.githubusercontent.com/u/1157207?v=4&s=48" width="48" height="48" alt="24601" title="24601"/></a> <a href="https://github.com/search?q=Chris%20Taylor"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Chris Taylor" title="Chris Taylor"/></a> <a href="https://github.com/djangonavarro220"><img src="https://avatars.githubusercontent.com/u/251162586?v=4&s=48" width="48" height="48" alt="Django Navarro" title="Django Navarro"/></a> <a href="https://github.com/evalexpr"><img src="https://avatars.githubusercontent.com/u/23485511?v=4&s=48" width="48" height="48" alt="evalexpr" title="evalexpr"/></a> <a href="https://github.com/henrino3"><img src="https://avatars.githubusercontent.com/u/4260288?v=4&s=48" width="48" height="48" alt="henrino3" title="henrino3"/></a> <a href="https://github.com/humanwritten"><img src="https://avatars.githubusercontent.com/u/206531610?v=4&s=48" width="48" height="48" alt="humanwritten" title="humanwritten"/></a> <a href="https://github.com/larlyssa"><img src="https://avatars.githubusercontent.com/u/13128869?v=4&s=48" width="48" height="48" alt="larlyssa" title="larlyssa"/></a> <a href="https://github.com/mkbehr"><img src="https://avatars.githubusercontent.com/u/1285?v=4&s=48" width="48" height="48" alt="mkbehr" title="mkbehr"/></a>
<a href="https://github.com/oswalpalash"><img src="https://avatars.githubusercontent.com/u/6431196?v=4&s=48" width="48" height="48" alt="oswalpalash" title="oswalpalash"/></a> <a href="https://github.com/pcty-nextgen-service-account"><img src="https://avatars.githubusercontent.com/u/112553441?v=4&s=48" width="48" height="48" alt="pcty-nextgen-service-account" title="pcty-nextgen-service-account"/></a> <a href="https://github.com/Syhids"><img src="https://avatars.githubusercontent.com/u/671202?v=4&s=48" width="48" height="48" alt="Syhids" title="Syhids"/></a> <a href="https://github.com/search?q=Aaron%20Konyer"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Aaron Konyer" title="Aaron Konyer"/></a> <a href="https://github.com/aaronveklabs"><img src="https://avatars.githubusercontent.com/u/225997828?v=4&s=48" width="48" height="48" alt="aaronveklabs" title="aaronveklabs"/></a> <a href="https://github.com/adam91holt"><img src="https://avatars.githubusercontent.com/u/9592417?v=4&s=48" width="48" height="48" alt="adam91holt" title="adam91holt"/></a> <a href="https://github.com/search?q=ClawdFx"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="ClawdFx" title="ClawdFx"/></a> <a href="https://github.com/erik-agens"><img src="https://avatars.githubusercontent.com/u/80908960?v=4&s=48" width="48" height="48" alt="erik-agens" title="erik-agens"/></a> <a href="https://github.com/fcatuhe"><img src="https://avatars.githubusercontent.com/u/17382215?v=4&s=48" width="48" height="48" alt="fcatuhe" title="fcatuhe"/></a> <a href="https://github.com/ivanrvpereira"><img src="https://avatars.githubusercontent.com/u/183991?v=4&s=48" width="48" height="48" alt="ivanrvpereira" title="ivanrvpereira"/></a>
<a href="https://github.com/jayhickey"><img src="https://avatars.githubusercontent.com/u/1676460?v=4&s=48" width="48" height="48" alt="jayhickey" title="jayhickey"/></a> <a href="https://github.com/jeffersonwarrior"><img src="https://avatars.githubusercontent.com/u/89030989?v=4&s=48" width="48" height="48" alt="jeffersonwarrior" title="jeffersonwarrior"/></a> <a href="https://github.com/search?q=jeffersonwarrior"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="jeffersonwarrior" title="jeffersonwarrior"/></a> <a href="https://github.com/jdrhyne"><img src="https://avatars.githubusercontent.com/u/7828464?v=4&s=48" width="48" height="48" alt="Jonathan D. Rhyne (DJ-D)" title="Jonathan D. Rhyne (DJ-D)"/></a> <a href="https://github.com/jverdi"><img src="https://avatars.githubusercontent.com/u/345050?v=4&s=48" width="48" height="48" alt="jverdi" title="jverdi"/></a> <a href="https://github.com/longmaba"><img src="https://avatars.githubusercontent.com/u/9361500?v=4&s=48" width="48" height="48" alt="longmaba" title="longmaba"/></a> <a href="https://github.com/mickahouan"><img src="https://avatars.githubusercontent.com/u/31423109?v=4&s=48" width="48" height="48" alt="mickahouan" title="mickahouan"/></a> <a href="https://github.com/mjrussell"><img src="https://avatars.githubusercontent.com/u/1641895?v=4&s=48" width="48" height="48" alt="mjrussell" title="mjrussell"/></a> <a href="https://github.com/p6l-richard"><img src="https://avatars.githubusercontent.com/u/18185649?v=4&s=48" width="48" height="48" alt="p6l-richard" title="p6l-richard"/></a> <a href="https://github.com/philipp-spiess"><img src="https://avatars.githubusercontent.com/u/458591?v=4&s=48" width="48" height="48" alt="philipp-spiess" title="philipp-spiess"/></a>
<br/>
<a href="https://github.com/robaxelsen"><img src="https://avatars.githubusercontent.com/u/13132899?v=4&s=48" width="48" height="48" alt="robaxelsen" title="robaxelsen"/></a> <a href="https://github.com/search?q=Sash%20Catanzarite"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Sash Catanzarite" title="Sash Catanzarite"/></a> <a href="https://github.com/T5-AndyML"><img src="https://avatars.githubusercontent.com/u/22801233?v=4&s=48" width="48" height="48" alt="T5-AndyML" title="T5-AndyML"/></a> <a href="https://github.com/search?q=VAC"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="VAC" title="VAC"/></a> <a href="https://github.com/zknicker"><img src="https://avatars.githubusercontent.com/u/1164085?v=4&s=48" width="48" height="48" alt="zknicker" title="zknicker"/></a> <a href="https://github.com/search?q=alejandro%20maza"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="alejandro maza" title="alejandro maza"/></a> <a href="https://github.com/ameno-"><img src="https://avatars.githubusercontent.com/u/2416135?v=4&s=48" width="48" height="48" alt="ameno-" title="ameno-"/></a> <a href="https://github.com/andrewting19"><img src="https://avatars.githubusercontent.com/u/10536704?v=4&s=48" width="48" height="48" alt="andrewting19" title="andrewting19"/></a> <a href="https://github.com/anpoirier"><img src="https://avatars.githubusercontent.com/u/1245729?v=4&s=48" width="48" height="48" alt="anpoirier" title="anpoirier"/></a> <a href="https://github.com/Asleep123"><img src="https://avatars.githubusercontent.com/u/122379135?v=4&s=48" width="48" height="48" alt="Asleep123" title="Asleep123"/></a>
<br/>
<a href="https://github.com/bolismauro"><img src="https://avatars.githubusercontent.com/u/771999?v=4&s=48" width="48" height="48" alt="bolismauro" title="bolismauro"/></a> <a href="https://github.com/cash-echo-bot"><img src="https://avatars.githubusercontent.com/u/252747386?v=4&s=48" width="48" height="48" alt="cash-echo-bot" title="cash-echo-bot"/></a> <a href="https://github.com/search?q=Clawd"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Clawd" title="Clawd"/></a> <a href="https://github.com/conhecendocontato"><img src="https://avatars.githubusercontent.com/u/82890727?v=4&s=48" width="48" height="48" alt="conhecendocontato" title="conhecendocontato"/></a> <a href="https://github.com/search?q=Dimitrios%20Ploutarchos"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Dimitrios Ploutarchos" title="Dimitrios Ploutarchos"/></a> <a href="https://github.com/search?q=Drake%20Thomsen"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Drake Thomsen" title="Drake Thomsen"/></a> <a href="https://github.com/search?q=Felix%20Krause"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Felix Krause" title="Felix Krause"/></a> <a href="https://github.com/gtsifrikas"><img src="https://avatars.githubusercontent.com/u/8904378?v=4&s=48" width="48" height="48" alt="gtsifrikas" title="gtsifrikas"/></a> <a href="https://github.com/HazAT"><img src="https://avatars.githubusercontent.com/u/363802?v=4&s=48" width="48" height="48" alt="HazAT" title="HazAT"/></a> <a href="https://github.com/hrdwdmrbl"><img src="https://avatars.githubusercontent.com/u/554881?v=4&s=48" width="48" height="48" alt="hrdwdmrbl" title="hrdwdmrbl"/></a>
<br/>
<a href="https://github.com/hugobarauna"><img src="https://avatars.githubusercontent.com/u/2719?v=4&s=48" width="48" height="48" alt="hugobarauna" title="hugobarauna"/></a> <a href="https://github.com/search?q=Jamie%20Openshaw"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Jamie Openshaw" title="Jamie Openshaw"/></a> <a href="https://github.com/search?q=Jarvis"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Jarvis" title="Jarvis"/></a> <a href="https://github.com/search?q=Jefferson%20Nunn"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Jefferson Nunn" title="Jefferson Nunn"/></a> <a href="https://github.com/search?q=Kevin%20Lin"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Kevin Lin" title="Kevin Lin"/></a> <a href="https://github.com/kitze"><img src="https://avatars.githubusercontent.com/u/1160594?v=4&s=48" width="48" height="48" alt="kitze" title="kitze"/></a> <a href="https://github.com/levifig"><img src="https://avatars.githubusercontent.com/u/1605?v=4&s=48" width="48" height="48" alt="levifig" title="levifig"/></a> <a href="https://github.com/search?q=Lloyd"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Lloyd" title="Lloyd"/></a> <a href="https://github.com/loukotal"><img src="https://avatars.githubusercontent.com/u/18210858?v=4&s=48" width="48" height="48" alt="loukotal" title="loukotal"/></a> <a href="https://github.com/martinpucik"><img src="https://avatars.githubusercontent.com/u/5503097?v=4&s=48" width="48" height="48" alt="martinpucik" title="martinpucik"/></a>
<br/>
<a href="https://github.com/search?q=Miles"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Miles" title="Miles"/></a> <a href="https://github.com/mrdbstn"><img src="https://avatars.githubusercontent.com/u/58957632?v=4&s=48" width="48" height="48" alt="mrdbstn" title="mrdbstn"/></a> <a href="https://github.com/MSch"><img src="https://avatars.githubusercontent.com/u/7475?v=4&s=48" width="48" height="48" alt="MSch" title="MSch"/></a> <a href="https://github.com/search?q=Mustafa%20Tag%20Eldeen"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Mustafa Tag Eldeen" title="Mustafa Tag Eldeen"/></a> <a href="https://github.com/ndraiman"><img src="https://avatars.githubusercontent.com/u/12609607?v=4&s=48" width="48" height="48" alt="ndraiman" title="ndraiman"/></a> <a href="https://github.com/nexty5870"><img src="https://avatars.githubusercontent.com/u/3869659?v=4&s=48" width="48" height="48" alt="nexty5870" title="nexty5870"/></a> <a href="https://github.com/odysseus0"><img src="https://avatars.githubusercontent.com/u/8635094?v=4&s=48" width="48" height="48" alt="odysseus0" title="odysseus0"/></a> <a href="https://github.com/prathamdby"><img src="https://avatars.githubusercontent.com/u/134331217?v=4&s=48" width="48" height="48" alt="prathamdby" title="prathamdby"/></a> <a href="https://github.com/reeltimeapps"><img src="https://avatars.githubusercontent.com/u/637338?v=4&s=48" width="48" height="48" alt="reeltimeapps" title="reeltimeapps"/></a> <a href="https://github.com/RLTCmpe"><img src="https://avatars.githubusercontent.com/u/10762242?v=4&s=48" width="48" height="48" alt="RLTCmpe" title="RLTCmpe"/></a>
<br/>
<a href="https://github.com/rodrigouroz"><img src="https://avatars.githubusercontent.com/u/384037?v=4&s=48" width="48" height="48" alt="rodrigouroz" title="rodrigouroz"/></a> <a href="https://github.com/search?q=Rolf%20Fredheim"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Rolf Fredheim" title="Rolf Fredheim"/></a> <a href="https://github.com/search?q=Rony%20Kelner"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Rony Kelner" title="Rony Kelner"/></a> <a href="https://github.com/search?q=Samrat%20Jha"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Samrat Jha" title="Samrat Jha"/></a> <a href="https://github.com/siraht"><img src="https://avatars.githubusercontent.com/u/73152895?v=4&s=48" width="48" height="48" alt="siraht" title="siraht"/></a> <a href="https://github.com/snopoke"><img src="https://avatars.githubusercontent.com/u/249606?v=4&s=48" width="48" height="48" alt="snopoke" title="snopoke"/></a> <a href="https://github.com/search?q=The%20Admiral"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="The Admiral" title="The Admiral"/></a> <a href="https://github.com/thesash"><img src="https://avatars.githubusercontent.com/u/1166151?v=4&s=48" width="48" height="48" alt="thesash" title="thesash"/></a> <a href="https://github.com/search?q=Ubuntu"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Ubuntu" title="Ubuntu"/></a> <a href="https://github.com/voidserf"><img src="https://avatars.githubusercontent.com/u/477673?v=4&s=48" width="48" height="48" alt="voidserf" title="voidserf"/></a>
<br/>
<a href="https://github.com/wstock"><img src="https://avatars.githubusercontent.com/u/1394687?v=4&s=48" width="48" height="48" alt="wstock" title="wstock"/></a> <a href="https://github.com/search?q=Zach%20Knickerbocker"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Zach Knickerbocker" title="Zach Knickerbocker"/></a> <a href="https://github.com/Alphonse-arianee"><img src="https://avatars.githubusercontent.com/u/254457365?v=4&s=48" width="48" height="48" alt="Alphonse-arianee" title="Alphonse-arianee"/></a> <a href="https://github.com/search?q=Azade"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Azade" title="Azade"/></a> <a href="https://github.com/carlulsoe"><img src="https://avatars.githubusercontent.com/u/34673973?v=4&s=48" width="48" height="48" alt="carlulsoe" title="carlulsoe"/></a> <a href="https://github.com/search?q=ddyo"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="ddyo" title="ddyo"/></a> <a href="https://github.com/search?q=Erik"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Erik" title="Erik"/></a> <a href="https://github.com/latitudeki5223"><img src="https://avatars.githubusercontent.com/u/119656367?v=4&s=48" width="48" height="48" alt="latitudeki5223" title="latitudeki5223"/></a> <a href="https://github.com/search?q=Manuel%20Maly"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Manuel Maly" title="Manuel Maly"/></a> <a href="https://github.com/search?q=Mourad%20Boustani"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Mourad Boustani" title="Mourad Boustani"/></a>
<br/>
<a href="https://github.com/odrobnik"><img src="https://avatars.githubusercontent.com/u/333270?v=4&s=48" width="48" height="48" alt="odrobnik" title="odrobnik"/></a> <a href="https://github.com/pcty-nextgen-ios-builder"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="pcty-nextgen-ios-builder" title="pcty-nextgen-ios-builder"/></a> <a href="https://github.com/search?q=Quentin"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Quentin" title="Quentin"/></a> <a href="https://github.com/search?q=Randy%20Torres"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Randy Torres" title="Randy Torres"/></a> <a href="https://github.com/rhjoh"><img src="https://avatars.githubusercontent.com/u/105699450?v=4&s=48" width="48" height="48" alt="rhjoh" title="rhjoh"/></a> <a href="https://github.com/ronak-guliani"><img src="https://avatars.githubusercontent.com/u/23518228?v=4&s=48" width="48" height="48" alt="ronak-guliani" title="ronak-guliani"/></a> <a href="https://github.com/search?q=William%20Stock"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="William Stock" title="William Stock"/></a>
<br/>
<a href="https://github.com/steipete"><img src="https://avatars.githubusercontent.com/u/58493?v=4&s=48" width="48" height="48" alt="steipete" title="steipete"/></a> <a href="https://github.com/bohdanpodvirnyi"><img src="https://avatars.githubusercontent.com/u/31819391?v=4&s=48" width="48" height="48" alt="bohdanpodvirnyi" title="bohdanpodvirnyi"/></a> <a href="https://github.com/joaohlisboa"><img src="https://avatars.githubusercontent.com/u/8200873?v=4&s=48" width="48" height="48" alt="joaohlisboa" title="joaohlisboa"/></a> <a href="https://github.com/mneves75"><img src="https://avatars.githubusercontent.com/u/2423436?v=4&s=48" width="48" height="48" alt="mneves75" title="mneves75"/></a> <a href="https://github.com/MatthieuBizien"><img src="https://avatars.githubusercontent.com/u/173090?v=4&s=48" width="48" height="48" alt="MatthieuBizien" title="MatthieuBizien"/></a> <a href="https://github.com/rahthakor"><img src="https://avatars.githubusercontent.com/u/8470553?v=4&s=48" width="48" height="48" alt="rahthakor" title="rahthakor"/></a> <a href="https://github.com/vrknetha"><img src="https://avatars.githubusercontent.com/u/20596261?v=4&s=48" width="48" height="48" alt="vrknetha" title="vrknetha"/></a> <a href="https://github.com/radek-paclt"><img src="https://avatars.githubusercontent.com/u/50451445?v=4&s=48" width="48" height="48" alt="radek-paclt" title="radek-paclt"/></a> <a href="https://github.com/joshp123"><img src="https://avatars.githubusercontent.com/u/1497361?v=4&s=48" width="48" height="48" alt="joshp123" title="joshp123"/></a> <a href="https://github.com/mukhtharcm"><img src="https://avatars.githubusercontent.com/u/56378562?v=4&s=48" width="48" height="48" alt="mukhtharcm" title="mukhtharcm"/></a>
<br/>
<a href="https://github.com/maxsumrall"><img src="https://avatars.githubusercontent.com/u/628843?v=4&s=48" width="48" height="48" alt="maxsumrall" title="maxsumrall"/></a> <a href="https://github.com/xadenryan"><img src="https://avatars.githubusercontent.com/u/165437834?v=4&s=48" width="48" height="48" alt="xadenryan" title="xadenryan"/></a> <a href="https://github.com/tobiasbischoff"><img src="https://avatars.githubusercontent.com/u/711564?v=4&s=48" width="48" height="48" alt="Tobias Bischoff" title="Tobias Bischoff"/></a> <a href="https://github.com/juanpablodlc"><img src="https://avatars.githubusercontent.com/u/92012363?v=4&s=48" width="48" height="48" alt="juanpablodlc" title="juanpablodlc"/></a> <a href="https://github.com/hsrvc"><img src="https://avatars.githubusercontent.com/u/129702169?v=4&s=48" width="48" height="48" alt="hsrvc" title="hsrvc"/></a> <a href="https://github.com/magimetal"><img src="https://avatars.githubusercontent.com/u/36491250?v=4&s=48" width="48" height="48" alt="magimetal" title="magimetal"/></a> <a href="https://github.com/meaningfool"><img src="https://avatars.githubusercontent.com/u/2862331?v=4&s=48" width="48" height="48" alt="meaningfool" title="meaningfool"/></a> <a href="https://github.com/NicholasSpisak"><img src="https://avatars.githubusercontent.com/u/129075147?v=4&s=48" width="48" height="48" alt="NicholasSpisak" title="NicholasSpisak"/></a> <a href="https://github.com/AbhisekBasu1"><img src="https://avatars.githubusercontent.com/u/40645221?v=4&s=48" width="48" height="48" alt="abhisekbasu1" title="abhisekbasu1"/></a> <a href="https://github.com/sebslight"><img src="https://avatars.githubusercontent.com/u/19554889?v=4&s=48" width="48" height="48" alt="sebslight" title="sebslight"/></a>
<br/>
<a href="https://github.com/claude"><img src="https://avatars.githubusercontent.com/u/81847?v=4&s=48" width="48" height="48" alt="claude" title="claude"/></a> <a href="https://github.com/jamesgroat"><img src="https://avatars.githubusercontent.com/u/2634024?v=4&s=48" width="48" height="48" alt="jamesgroat" title="jamesgroat"/></a> <a href="https://github.com/Hyaxia"><img src="https://avatars.githubusercontent.com/u/36747317?v=4&s=48" width="48" height="48" alt="Hyaxia" title="Hyaxia"/></a> <a href="https://github.com/dantelex"><img src="https://avatars.githubusercontent.com/u/631543?v=4&s=48" width="48" height="48" alt="dantelex" title="dantelex"/></a> <a href="https://github.com/daveonkels"><img src="https://avatars.githubusercontent.com/u/533642?v=4&s=48" width="48" height="48" alt="daveonkels" title="daveonkels"/></a> <a href="https://github.com/mteam88"><img src="https://avatars.githubusercontent.com/u/84196639?v=4&s=48" width="48" height="48" alt="mteam88" title="mteam88"/></a> <a href="https://github.com/omniwired"><img src="https://avatars.githubusercontent.com/u/322761?v=4&s=48" width="48" height="48" alt="Eng. Juan Combetto" title="Eng. Juan Combetto"/></a> <a href="https://github.com/dbhurley"><img src="https://avatars.githubusercontent.com/u/5251425?v=4&s=48" width="48" height="48" alt="dbhurley" title="dbhurley"/></a> <a href="https://github.com/mbelinky"><img src="https://avatars.githubusercontent.com/u/132747814?v=4&s=48" width="48" height="48" alt="Mariano Belinky" title="Mariano Belinky"/></a> <a href="https://github.com/TSavo"><img src="https://avatars.githubusercontent.com/u/877990?v=4&s=48" width="48" height="48" alt="TSavo" title="TSavo"/></a>
<br/>
<a href="https://github.com/julianengel"><img src="https://avatars.githubusercontent.com/u/10634231?v=4&s=48" width="48" height="48" alt="julianengel" title="julianengel"/></a> <a href="https://github.com/benithors"><img src="https://avatars.githubusercontent.com/u/20652882?v=4&s=48" width="48" height="48" alt="benithors" title="benithors"/></a> <a href="https://github.com/bradleypriest"><img src="https://avatars.githubusercontent.com/u/167215?v=4&s=48" width="48" height="48" alt="bradleypriest" title="bradleypriest"/></a> <a href="https://github.com/timolins"><img src="https://avatars.githubusercontent.com/u/1440854?v=4&s=48" width="48" height="48" alt="timolins" title="timolins"/></a> <a href="https://github.com/Nachx639"><img src="https://avatars.githubusercontent.com/u/71144023?v=4&s=48" width="48" height="48" alt="nachx639" title="nachx639"/></a> <a href="https://github.com/sreekaransrinath"><img src="https://avatars.githubusercontent.com/u/50989977?v=4&s=48" width="48" height="48" alt="sreekaransrinath" title="sreekaransrinath"/></a> <a href="https://github.com/gupsammy"><img src="https://avatars.githubusercontent.com/u/20296019?v=4&s=48" width="48" height="48" alt="gupsammy" title="gupsammy"/></a> <a href="https://github.com/cristip73"><img src="https://avatars.githubusercontent.com/u/24499421?v=4&s=48" width="48" height="48" alt="cristip73" title="cristip73"/></a> <a href="https://github.com/nachoiacovino"><img src="https://avatars.githubusercontent.com/u/50103937?v=4&s=48" width="48" height="48" alt="nachoiacovino" title="nachoiacovino"/></a> <a href="https://github.com/vsabavat"><img src="https://avatars.githubusercontent.com/u/50385532?v=4&s=48" width="48" height="48" alt="Vasanth Rao Naik Sabavat" title="Vasanth Rao Naik Sabavat"/></a>
<br/>
<a href="https://github.com/cpojer"><img src="https://avatars.githubusercontent.com/u/13352?v=4&s=48" width="48" height="48" alt="cpojer" title="cpojer"/></a> <a href="https://github.com/lc0rp"><img src="https://avatars.githubusercontent.com/u/2609441?v=4&s=48" width="48" height="48" alt="lc0rp" title="lc0rp"/></a> <a href="https://github.com/scald"><img src="https://avatars.githubusercontent.com/u/1215913?v=4&s=48" width="48" height="48" alt="scald" title="scald"/></a> <a href="https://github.com/gumadeiras"><img src="https://avatars.githubusercontent.com/u/5599352?v=4&s=48" width="48" height="48" alt="gumadeiras" title="gumadeiras"/></a> <a href="https://github.com/andranik-sahakyan"><img src="https://avatars.githubusercontent.com/u/8908029?v=4&s=48" width="48" height="48" alt="andranik-sahakyan" title="andranik-sahakyan"/></a> <a href="https://github.com/davidguttman"><img src="https://avatars.githubusercontent.com/u/431696?v=4&s=48" width="48" height="48" alt="davidguttman" title="davidguttman"/></a> <a href="https://github.com/sleontenko"><img src="https://avatars.githubusercontent.com/u/7135949?v=4&s=48" width="48" height="48" alt="sleontenko" title="sleontenko"/></a> <a href="https://github.com/sircrumpet"><img src="https://avatars.githubusercontent.com/u/4436535?v=4&s=48" width="48" height="48" alt="sircrumpet" title="sircrumpet"/></a> <a href="https://github.com/peschee"><img src="https://avatars.githubusercontent.com/u/63866?v=4&s=48" width="48" height="48" alt="peschee" title="peschee"/></a> <a href="https://github.com/rafaelreis-r"><img src="https://avatars.githubusercontent.com/u/57492577?v=4&s=48" width="48" height="48" alt="rafaelreis-r" title="rafaelreis-r"/></a>
<br/>
<a href="https://github.com/thewilloftheshadow"><img src="https://avatars.githubusercontent.com/u/35580099?v=4&s=48" width="48" height="48" alt="thewilloftheshadow" title="thewilloftheshadow"/></a> <a href="https://github.com/ratulsarna"><img src="https://avatars.githubusercontent.com/u/105903728?v=4&s=48" width="48" height="48" alt="ratulsarna" title="ratulsarna"/></a> <a href="https://github.com/lutr0"><img src="https://avatars.githubusercontent.com/u/76906369?v=4&s=48" width="48" height="48" alt="lutr0" title="lutr0"/></a> <a href="https://github.com/danielz1z"><img src="https://avatars.githubusercontent.com/u/235270390?v=4&s=48" width="48" height="48" alt="danielz1z" title="danielz1z"/></a> <a href="https://github.com/emanuelst"><img src="https://avatars.githubusercontent.com/u/9994339?v=4&s=48" width="48" height="48" alt="emanuelst" title="emanuelst"/></a> <a href="https://github.com/KristijanJovanovski"><img src="https://avatars.githubusercontent.com/u/8942284?v=4&s=48" width="48" height="48" alt="KristijanJovanovski" title="KristijanJovanovski"/></a> <a href="https://github.com/CashWilliams"><img src="https://avatars.githubusercontent.com/u/613573?v=4&s=48" width="48" height="48" alt="CashWilliams" title="CashWilliams"/></a> <a href="https://github.com/rdev"><img src="https://avatars.githubusercontent.com/u/8418866?v=4&s=48" width="48" height="48" alt="rdev" title="rdev"/></a> <a href="https://github.com/osolmaz"><img src="https://avatars.githubusercontent.com/u/2453968?v=4&s=48" width="48" height="48" alt="osolmaz" title="osolmaz"/></a> <a href="https://github.com/joshrad-dev"><img src="https://avatars.githubusercontent.com/u/62785552?v=4&s=48" width="48" height="48" alt="joshrad-dev" title="joshrad-dev"/></a>
<br/>
<a href="https://github.com/kiranjd"><img src="https://avatars.githubusercontent.com/u/25822851?v=4&s=48" width="48" height="48" alt="kiranjd" title="kiranjd"/></a> <a href="https://github.com/adityashaw2"><img src="https://avatars.githubusercontent.com/u/41204444?v=4&s=48" width="48" height="48" alt="adityashaw2" title="adityashaw2"/></a> <a href="https://github.com/search?q=sheeek"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="sheeek" title="sheeek"/></a> <a href="https://github.com/artuskg"><img src="https://avatars.githubusercontent.com/u/11966157?v=4&s=48" width="48" height="48" alt="artuskg" title="artuskg"/></a> <a href="https://github.com/onutc"><img src="https://avatars.githubusercontent.com/u/152018508?v=4&s=48" width="48" height="48" alt="onutc" title="onutc"/></a> <a href="https://github.com/tyler6204"><img src="https://avatars.githubusercontent.com/u/64381258?v=4&s=48" width="48" height="48" alt="tyler6204" title="tyler6204"/></a> <a href="https://github.com/ManuelHettich"><img src="https://avatars.githubusercontent.com/u/17690367?v=4&s=48" width="48" height="48" alt="manuelhettich" title="manuelhettich"/></a> <a href="https://github.com/minghinmatthewlam"><img src="https://avatars.githubusercontent.com/u/14224566?v=4&s=48" width="48" height="48" alt="minghinmatthewlam" title="minghinmatthewlam"/></a> <a href="https://github.com/myfunc"><img src="https://avatars.githubusercontent.com/u/19294627?v=4&s=48" width="48" height="48" alt="myfunc" title="myfunc"/></a> <a href="https://github.com/buddyh"><img src="https://avatars.githubusercontent.com/u/31752869?v=4&s=48" width="48" height="48" alt="buddyh" title="buddyh"/></a>
<br/>
<a href="https://github.com/connorshea"><img src="https://avatars.githubusercontent.com/u/2977353?v=4&s=48" width="48" height="48" alt="connorshea" title="connorshea"/></a> <a href="https://github.com/mcinteerj"><img src="https://avatars.githubusercontent.com/u/3613653?v=4&s=48" width="48" height="48" alt="mcinteerj" title="mcinteerj"/></a> <a href="https://github.com/John-Rood"><img src="https://avatars.githubusercontent.com/u/62669593?v=4&s=48" width="48" height="48" alt="John-Rood" title="John-Rood"/></a> <a href="https://github.com/timkrase"><img src="https://avatars.githubusercontent.com/u/38947626?v=4&s=48" width="48" height="48" alt="timkrase" title="timkrase"/></a> <a href="https://github.com/zerone0x"><img src="https://avatars.githubusercontent.com/u/39543393?v=4&s=48" width="48" height="48" alt="zerone0x" title="zerone0x"/></a> <a href="https://github.com/gerardward2007"><img src="https://avatars.githubusercontent.com/u/3002155?v=4&s=48" width="48" height="48" alt="gerardward2007" title="gerardward2007"/></a> <a href="https://github.com/obviyus"><img src="https://avatars.githubusercontent.com/u/22031114?v=4&s=48" width="48" height="48" alt="obviyus" title="obviyus"/></a> <a href="https://github.com/tosh-hamburg"><img src="https://avatars.githubusercontent.com/u/58424326?v=4&s=48" width="48" height="48" alt="tosh-hamburg" title="tosh-hamburg"/></a> <a href="https://github.com/azade-c"><img src="https://avatars.githubusercontent.com/u/252790079?v=4&s=48" width="48" height="48" alt="azade-c" title="azade-c"/></a> <a href="https://github.com/roshanasingh4"><img src="https://avatars.githubusercontent.com/u/88576930?v=4&s=48" width="48" height="48" alt="roshanasingh4" title="roshanasingh4"/></a>
<br/>
<a href="https://github.com/bjesuiter"><img src="https://avatars.githubusercontent.com/u/2365676?v=4&s=48" width="48" height="48" alt="bjesuiter" title="bjesuiter"/></a> <a href="https://github.com/cheeeee"><img src="https://avatars.githubusercontent.com/u/21245729?v=4&s=48" width="48" height="48" alt="cheeeee" title="cheeeee"/></a> <a href="https://github.com/j1philli"><img src="https://avatars.githubusercontent.com/u/3744255?v=4&s=48" width="48" height="48" alt="Josh Phillips" title="Josh Phillips"/></a> <a href="https://github.com/Whoaa512"><img src="https://avatars.githubusercontent.com/u/1581943?v=4&s=48" width="48" height="48" alt="Whoaa512" title="Whoaa512"/></a> <a href="https://github.com/YuriNachos"><img src="https://avatars.githubusercontent.com/u/19365375?v=4&s=48" width="48" height="48" alt="YuriNachos" title="YuriNachos"/></a> <a href="https://github.com/chriseidhof"><img src="https://avatars.githubusercontent.com/u/5382?v=4&s=48" width="48" height="48" alt="chriseidhof" title="chriseidhof"/></a> <a href="https://github.com/ysqander"><img src="https://avatars.githubusercontent.com/u/80843820?v=4&s=48" width="48" height="48" alt="ysqander" title="ysqander"/></a> <a href="https://github.com/superman32432432"><img src="https://avatars.githubusercontent.com/u/7228420?v=4&s=48" width="48" height="48" alt="superman32432432" title="superman32432432"/></a> <a href="https://github.com/vignesh07"><img src="https://avatars.githubusercontent.com/u/1436853?v=4&s=48" width="48" height="48" alt="vignesh07" title="vignesh07"/></a> <a href="https://github.com/search?q=Yurii%20Chukhlib"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Yurii Chukhlib" title="Yurii Chukhlib"/></a>
<br/>
<a href="https://github.com/grp06"><img src="https://avatars.githubusercontent.com/u/1573959?v=4&s=48" width="48" height="48" alt="grp06" title="grp06"/></a> <a href="https://github.com/antons"><img src="https://avatars.githubusercontent.com/u/129705?v=4&s=48" width="48" height="48" alt="antons" title="antons"/></a> <a href="https://github.com/austinm911"><img src="https://avatars.githubusercontent.com/u/31991302?v=4&s=48" width="48" height="48" alt="austinm911" title="austinm911"/></a> <a href="https://github.com/apps/blacksmith-sh"><img src="https://avatars.githubusercontent.com/in/807020?v=4&s=48" width="48" height="48" alt="blacksmith-sh[bot]" title="blacksmith-sh[bot]"/></a> <a href="https://github.com/dan-dr"><img src="https://avatars.githubusercontent.com/u/6669808?v=4&s=48" width="48" height="48" alt="dan-dr" title="dan-dr"/></a> <a href="https://github.com/HeimdallStrategy"><img src="https://avatars.githubusercontent.com/u/223014405?v=4&s=48" width="48" height="48" alt="HeimdallStrategy" title="HeimdallStrategy"/></a> <a href="https://github.com/imfing"><img src="https://avatars.githubusercontent.com/u/5097752?v=4&s=48" width="48" height="48" alt="imfing" title="imfing"/></a> <a href="https://github.com/jalehman"><img src="https://avatars.githubusercontent.com/u/550978?v=4&s=48" width="48" height="48" alt="jalehman" title="jalehman"/></a> <a href="https://github.com/jarvis-medmatic"><img src="https://avatars.githubusercontent.com/u/252428873?v=4&s=48" width="48" height="48" alt="jarvis-medmatic" title="jarvis-medmatic"/></a> <a href="https://github.com/kkarimi"><img src="https://avatars.githubusercontent.com/u/875218?v=4&s=48" width="48" height="48" alt="kkarimi" title="kkarimi"/></a>
<br/>
<a href="https://github.com/mahmoudashraf93"><img src="https://avatars.githubusercontent.com/u/9130129?v=4&s=48" width="48" height="48" alt="mahmoudashraf93" title="mahmoudashraf93"/></a> <a href="https://github.com/petter-b"><img src="https://avatars.githubusercontent.com/u/62076402?v=4&s=48" width="48" height="48" alt="petter-b" title="petter-b"/></a> <a href="https://github.com/pkrmf"><img src="https://avatars.githubusercontent.com/u/1714267?v=4&s=48" width="48" height="48" alt="pkrmf" title="pkrmf"/></a> <a href="https://github.com/RandyVentures"><img src="https://avatars.githubusercontent.com/u/149904821?v=4&s=48" width="48" height="48" alt="RandyVentures" title="RandyVentures"/></a> <a href="https://github.com/search?q=Ryan%20Lisse"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Ryan Lisse" title="Ryan Lisse"/></a> <a href="https://github.com/erikpr1994"><img src="https://avatars.githubusercontent.com/u/6299331?v=4&s=48" width="48" height="48" alt="erikpr1994" title="erikpr1994"/></a> <a href="https://github.com/search?q=Ghost"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Ghost" title="Ghost"/></a> <a href="https://github.com/jonasjancarik"><img src="https://avatars.githubusercontent.com/u/2459191?v=4&s=48" width="48" height="48" alt="jonasjancarik" title="jonasjancarik"/></a> <a href="https://github.com/search?q=Keith%20the%20Silly%20Goose"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Keith the Silly Goose" title="Keith the Silly Goose"/></a> <a href="https://github.com/search?q=L36%20Server"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="L36 Server" title="L36 Server"/></a>
<br/>
<a href="https://github.com/search?q=Marc"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Marc" title="Marc"/></a> <a href="https://github.com/mitschabaude-bot"><img src="https://avatars.githubusercontent.com/u/247582884?v=4&s=48" width="48" height="48" alt="mitschabaude-bot" title="mitschabaude-bot"/></a> <a href="https://github.com/neist"><img src="https://avatars.githubusercontent.com/u/1029724?v=4&s=48" width="48" height="48" alt="neist" title="neist"/></a> <a href="https://github.com/ngutman"><img src="https://avatars.githubusercontent.com/u/1540134?v=4&s=48" width="48" height="48" alt="ngutman" title="ngutman"/></a> <a href="https://github.com/chrisrodz"><img src="https://avatars.githubusercontent.com/u/2967620?v=4&s=48" width="48" height="48" alt="chrisrodz" title="chrisrodz"/></a> <a href="https://github.com/dougvk"><img src="https://avatars.githubusercontent.com/u/401660?v=4&s=48" width="48" height="48" alt="dougvk" title="dougvk"/></a> <a href="https://github.com/search?q=Friederike%20Seiler"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Friederike Seiler" title="Friederike Seiler"/></a> <a href="https://github.com/gabriel-trigo"><img src="https://avatars.githubusercontent.com/u/38991125?v=4&s=48" width="48" height="48" alt="gabriel-trigo" title="gabriel-trigo"/></a> <a href="https://github.com/Iamadig"><img src="https://avatars.githubusercontent.com/u/102129234?v=4&s=48" width="48" height="48" alt="iamadig" title="iamadig"/></a> <a href="https://github.com/search?q=Kit"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Kit" title="Kit"/></a>
<br/>
<a href="https://github.com/koala73"><img src="https://avatars.githubusercontent.com/u/996596?v=4&s=48" width="48" height="48" alt="koala73" title="koala73"/></a> <a href="https://github.com/manmal"><img src="https://avatars.githubusercontent.com/u/142797?v=4&s=48" width="48" height="48" alt="manmal" title="manmal"/></a> <a href="https://github.com/ogulcancelik"><img src="https://avatars.githubusercontent.com/u/7064011?v=4&s=48" width="48" height="48" alt="ogulcancelik" title="ogulcancelik"/></a> <a href="https://github.com/pasogott"><img src="https://avatars.githubusercontent.com/u/23458152?v=4&s=48" width="48" height="48" alt="pasogott" title="pasogott"/></a> <a href="https://github.com/petradonka"><img src="https://avatars.githubusercontent.com/u/7353770?v=4&s=48" width="48" height="48" alt="petradonka" title="petradonka"/></a> <a href="https://github.com/rubyrunsstuff"><img src="https://avatars.githubusercontent.com/u/246602379?v=4&s=48" width="48" height="48" alt="rubyrunsstuff" title="rubyrunsstuff"/></a> <a href="https://github.com/sibbl"><img src="https://avatars.githubusercontent.com/u/866535?v=4&s=48" width="48" height="48" alt="sibbl" title="sibbl"/></a> <a href="https://github.com/suminhthanh"><img src="https://avatars.githubusercontent.com/u/2907636?v=4&s=48" width="48" height="48" alt="suminhthanh" title="suminhthanh"/></a> <a href="https://github.com/VACInc"><img src="https://avatars.githubusercontent.com/u/3279061?v=4&s=48" width="48" height="48" alt="VACInc" title="VACInc"/></a> <a href="https://github.com/wes-davis"><img src="https://avatars.githubusercontent.com/u/16506720?v=4&s=48" width="48" height="48" alt="wes-davis" title="wes-davis"/></a>
<a href="https://github.com/zats"><img src="https://avatars.githubusercontent.com/u/2688806?v=4&s=48" width="48" height="48" alt="zats" title="zats"/></a> <a href="https://github.com/24601"><img src="https://avatars.githubusercontent.com/u/1157207?v=4&s=48" width="48" height="48" alt="24601" title="24601"/></a> <a href="https://github.com/search?q=Chris%20Taylor"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Chris Taylor" title="Chris Taylor"/></a> <a href="https://github.com/djangonavarro220"><img src="https://avatars.githubusercontent.com/u/251162586?v=4&s=48" width="48" height="48" alt="Django Navarro" title="Django Navarro"/></a> <a href="https://github.com/evalexpr"><img src="https://avatars.githubusercontent.com/u/23485511?v=4&s=48" width="48" height="48" alt="evalexpr" title="evalexpr"/></a> <a href="https://github.com/henrino3"><img src="https://avatars.githubusercontent.com/u/4260288?v=4&s=48" width="48" height="48" alt="henrino3" title="henrino3"/></a> <a href="https://github.com/humanwritten"><img src="https://avatars.githubusercontent.com/u/206531610?v=4&s=48" width="48" height="48" alt="humanwritten" title="humanwritten"/></a> <a href="https://github.com/larlyssa"><img src="https://avatars.githubusercontent.com/u/13128869?v=4&s=48" width="48" height="48" alt="larlyssa" title="larlyssa"/></a> <a href="https://github.com/mkbehr"><img src="https://avatars.githubusercontent.com/u/1285?v=4&s=48" width="48" height="48" alt="mkbehr" title="mkbehr"/></a> <a href="https://github.com/oswalpalash"><img src="https://avatars.githubusercontent.com/u/6431196?v=4&s=48" width="48" height="48" alt="oswalpalash" title="oswalpalash"/></a>
<a href="https://github.com/pcty-nextgen-service-account"><img src="https://avatars.githubusercontent.com/u/112553441?v=4&s=48" width="48" height="48" alt="pcty-nextgen-service-account" title="pcty-nextgen-service-account"/></a> <a href="https://github.com/Syhids"><img src="https://avatars.githubusercontent.com/u/671202?v=4&s=48" width="48" height="48" alt="Syhids" title="Syhids"/></a> <a href="https://github.com/search?q=Aaron%20Konyer"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Aaron Konyer" title="Aaron Konyer"/></a> <a href="https://github.com/aaronveklabs"><img src="https://avatars.githubusercontent.com/u/225997828?v=4&s=48" width="48" height="48" alt="aaronveklabs" title="aaronveklabs"/></a> <a href="https://github.com/adam91holt"><img src="https://avatars.githubusercontent.com/u/9592417?v=4&s=48" width="48" height="48" alt="adam91holt" title="adam91holt"/></a> <a href="https://github.com/erik-agens"><img src="https://avatars.githubusercontent.com/u/80908960?v=4&s=48" width="48" height="48" alt="erik-agens" title="erik-agens"/></a> <a href="https://github.com/fcatuhe"><img src="https://avatars.githubusercontent.com/u/17382215?v=4&s=48" width="48" height="48" alt="fcatuhe" title="fcatuhe"/></a> <a href="https://github.com/ivanrvpereira"><img src="https://avatars.githubusercontent.com/u/183991?v=4&s=48" width="48" height="48" alt="ivanrvpereira" title="ivanrvpereira"/></a> <a href="https://github.com/jayhickey"><img src="https://avatars.githubusercontent.com/u/1676460?v=4&s=48" width="48" height="48" alt="jayhickey" title="jayhickey"/></a> <a href="https://github.com/jeffersonwarrior"><img src="https://avatars.githubusercontent.com/u/89030989?v=4&s=48" width="48" height="48" alt="jeffersonwarrior" title="jeffersonwarrior"/></a>
<a href="https://github.com/search?q=jeffersonwarrior"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="jeffersonwarrior" title="jeffersonwarrior"/></a> <a href="https://github.com/jdrhyne"><img src="https://avatars.githubusercontent.com/u/7828464?v=4&s=48" width="48" height="48" alt="Jonathan D. Rhyne (DJ-D)" title="Jonathan D. Rhyne (DJ-D)"/></a> <a href="https://github.com/jverdi"><img src="https://avatars.githubusercontent.com/u/345050?v=4&s=48" width="48" height="48" alt="jverdi" title="jverdi"/></a> <a href="https://github.com/longmaba"><img src="https://avatars.githubusercontent.com/u/9361500?v=4&s=48" width="48" height="48" alt="longmaba" title="longmaba"/></a> <a href="https://github.com/mickahouan"><img src="https://avatars.githubusercontent.com/u/31423109?v=4&s=48" width="48" height="48" alt="mickahouan" title="mickahouan"/></a> <a href="https://github.com/mjrussell"><img src="https://avatars.githubusercontent.com/u/1641895?v=4&s=48" width="48" height="48" alt="mjrussell" title="mjrussell"/></a> <a href="https://github.com/p6l-richard"><img src="https://avatars.githubusercontent.com/u/18185649?v=4&s=48" width="48" height="48" alt="p6l-richard" title="p6l-richard"/></a> <a href="https://github.com/philipp-spiess"><img src="https://avatars.githubusercontent.com/u/458591?v=4&s=48" width="48" height="48" alt="philipp-spiess" title="philipp-spiess"/></a> <a href="https://github.com/robaxelsen"><img src="https://avatars.githubusercontent.com/u/13132899?v=4&s=48" width="48" height="48" alt="robaxelsen" title="robaxelsen"/></a> <a href="https://github.com/search?q=Sash%20Catanzarite"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Sash Catanzarite" title="Sash Catanzarite"/></a>
<a href="https://github.com/T5-AndyML"><img src="https://avatars.githubusercontent.com/u/22801233?v=4&s=48" width="48" height="48" alt="T5-AndyML" title="T5-AndyML"/></a> <a href="https://github.com/search?q=VAC"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="VAC" title="VAC"/></a> <a href="https://github.com/zknicker"><img src="https://avatars.githubusercontent.com/u/1164085?v=4&s=48" width="48" height="48" alt="zknicker" title="zknicker"/></a> <a href="https://github.com/search?q=alejandro%20maza"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="alejandro maza" title="alejandro maza"/></a> <a href="https://github.com/andrewting19"><img src="https://avatars.githubusercontent.com/u/10536704?v=4&s=48" width="48" height="48" alt="andrewting19" title="andrewting19"/></a> <a href="https://github.com/anpoirier"><img src="https://avatars.githubusercontent.com/u/1245729?v=4&s=48" width="48" height="48" alt="anpoirier" title="anpoirier"/></a> <a href="https://github.com/Asleep123"><img src="https://avatars.githubusercontent.com/u/122379135?v=4&s=48" width="48" height="48" alt="Asleep123" title="Asleep123"/></a> <a href="https://github.com/bolismauro"><img src="https://avatars.githubusercontent.com/u/771999?v=4&s=48" width="48" height="48" alt="bolismauro" title="bolismauro"/></a> <a href="https://github.com/cash-echo-bot"><img src="https://avatars.githubusercontent.com/u/252747386?v=4&s=48" width="48" height="48" alt="cash-echo-bot" title="cash-echo-bot"/></a> <a href="https://github.com/search?q=Clawd"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Clawd" title="Clawd"/></a>
<a href="https://github.com/conhecendocontato"><img src="https://avatars.githubusercontent.com/u/82890727?v=4&s=48" width="48" height="48" alt="conhecendocontato" title="conhecendocontato"/></a> <a href="https://github.com/search?q=Dimitrios%20Ploutarchos"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Dimitrios Ploutarchos" title="Dimitrios Ploutarchos"/></a> <a href="https://github.com/search?q=Drake%20Thomsen"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Drake Thomsen" title="Drake Thomsen"/></a> <a href="https://github.com/search?q=Felix%20Krause"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Felix Krause" title="Felix Krause"/></a> <a href="https://github.com/gtsifrikas"><img src="https://avatars.githubusercontent.com/u/8904378?v=4&s=48" width="48" height="48" alt="gtsifrikas" title="gtsifrikas"/></a> <a href="https://github.com/HazAT"><img src="https://avatars.githubusercontent.com/u/363802?v=4&s=48" width="48" height="48" alt="HazAT" title="HazAT"/></a> <a href="https://github.com/hrdwdmrbl"><img src="https://avatars.githubusercontent.com/u/554881?v=4&s=48" width="48" height="48" alt="hrdwdmrbl" title="hrdwdmrbl"/></a> <a href="https://github.com/hugobarauna"><img src="https://avatars.githubusercontent.com/u/2719?v=4&s=48" width="48" height="48" alt="hugobarauna" title="hugobarauna"/></a> <a href="https://github.com/search?q=Jamie%20Openshaw"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Jamie Openshaw" title="Jamie Openshaw"/></a> <a href="https://github.com/search?q=Jarvis"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Jarvis" title="Jarvis"/></a>
<a href="https://github.com/search?q=Jefferson%20Nunn"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Jefferson Nunn" title="Jefferson Nunn"/></a> <a href="https://github.com/search?q=Kevin%20Lin"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Kevin Lin" title="Kevin Lin"/></a> <a href="https://github.com/kitze"><img src="https://avatars.githubusercontent.com/u/1160594?v=4&s=48" width="48" height="48" alt="kitze" title="kitze"/></a> <a href="https://github.com/levifig"><img src="https://avatars.githubusercontent.com/u/1605?v=4&s=48" width="48" height="48" alt="levifig" title="levifig"/></a> <a href="https://github.com/search?q=Lloyd"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Lloyd" title="Lloyd"/></a> <a href="https://github.com/loukotal"><img src="https://avatars.githubusercontent.com/u/18210858?v=4&s=48" width="48" height="48" alt="loukotal" title="loukotal"/></a> <a href="https://github.com/martinpucik"><img src="https://avatars.githubusercontent.com/u/5503097?v=4&s=48" width="48" height="48" alt="martinpucik" title="martinpucik"/></a> <a href="https://github.com/search?q=Miles"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Miles" title="Miles"/></a> <a href="https://github.com/mrdbstn"><img src="https://avatars.githubusercontent.com/u/58957632?v=4&s=48" width="48" height="48" alt="mrdbstn" title="mrdbstn"/></a> <a href="https://github.com/MSch"><img src="https://avatars.githubusercontent.com/u/7475?v=4&s=48" width="48" height="48" alt="MSch" title="MSch"/></a>
<a href="https://github.com/search?q=Mustafa%20Tag%20Eldeen"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Mustafa Tag Eldeen" title="Mustafa Tag Eldeen"/></a> <a href="https://github.com/ndraiman"><img src="https://avatars.githubusercontent.com/u/12609607?v=4&s=48" width="48" height="48" alt="ndraiman" title="ndraiman"/></a> <a href="https://github.com/nexty5870"><img src="https://avatars.githubusercontent.com/u/3869659?v=4&s=48" width="48" height="48" alt="nexty5870" title="nexty5870"/></a> <a href="https://github.com/odysseus0"><img src="https://avatars.githubusercontent.com/u/8635094?v=4&s=48" width="48" height="48" alt="odysseus0" title="odysseus0"/></a> <a href="https://github.com/prathamdby"><img src="https://avatars.githubusercontent.com/u/134331217?v=4&s=48" width="48" height="48" alt="prathamdby" title="prathamdby"/></a> <a href="https://github.com/reeltimeapps"><img src="https://avatars.githubusercontent.com/u/637338?v=4&s=48" width="48" height="48" alt="reeltimeapps" title="reeltimeapps"/></a> <a href="https://github.com/RLTCmpe"><img src="https://avatars.githubusercontent.com/u/10762242?v=4&s=48" width="48" height="48" alt="RLTCmpe" title="RLTCmpe"/></a> <a href="https://github.com/rodrigouroz"><img src="https://avatars.githubusercontent.com/u/384037?v=4&s=48" width="48" height="48" alt="rodrigouroz" title="rodrigouroz"/></a> <a href="https://github.com/search?q=Rolf%20Fredheim"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Rolf Fredheim" title="Rolf Fredheim"/></a> <a href="https://github.com/search?q=Rony%20Kelner"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Rony Kelner" title="Rony Kelner"/></a>
<a href="https://github.com/search?q=Samrat%20Jha"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Samrat Jha" title="Samrat Jha"/></a> <a href="https://github.com/siraht"><img src="https://avatars.githubusercontent.com/u/73152895?v=4&s=48" width="48" height="48" alt="siraht" title="siraht"/></a> <a href="https://github.com/snopoke"><img src="https://avatars.githubusercontent.com/u/249606?v=4&s=48" width="48" height="48" alt="snopoke" title="snopoke"/></a> <a href="https://github.com/search?q=The%20Admiral"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="The Admiral" title="The Admiral"/></a> <a href="https://github.com/thesash"><img src="https://avatars.githubusercontent.com/u/1166151?v=4&s=48" width="48" height="48" alt="thesash" title="thesash"/></a> <a href="https://github.com/search?q=Ubuntu"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Ubuntu" title="Ubuntu"/></a> <a href="https://github.com/voidserf"><img src="https://avatars.githubusercontent.com/u/477673?v=4&s=48" width="48" height="48" alt="voidserf" title="voidserf"/></a> <a href="https://github.com/wstock"><img src="https://avatars.githubusercontent.com/u/1394687?v=4&s=48" width="48" height="48" alt="wstock" title="wstock"/></a> <a href="https://github.com/search?q=Zach%20Knickerbocker"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Zach Knickerbocker" title="Zach Knickerbocker"/></a> <a href="https://github.com/Alphonse-arianee"><img src="https://avatars.githubusercontent.com/u/254457365?v=4&s=48" width="48" height="48" alt="Alphonse-arianee" title="Alphonse-arianee"/></a>
<a href="https://github.com/search?q=Azade"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Azade" title="Azade"/></a> <a href="https://github.com/carlulsoe"><img src="https://avatars.githubusercontent.com/u/34673973?v=4&s=48" width="48" height="48" alt="carlulsoe" title="carlulsoe"/></a> <a href="https://github.com/search?q=ddyo"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="ddyo" title="ddyo"/></a> <a href="https://github.com/search?q=Erik"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Erik" title="Erik"/></a> <a href="https://github.com/latitudeki5223"><img src="https://avatars.githubusercontent.com/u/119656367?v=4&s=48" width="48" height="48" alt="latitudeki5223" title="latitudeki5223"/></a> <a href="https://github.com/search?q=Manuel%20Maly"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Manuel Maly" title="Manuel Maly"/></a> <a href="https://github.com/search?q=Mourad%20Boustani"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Mourad Boustani" title="Mourad Boustani"/></a> <a href="https://github.com/odrobnik"><img src="https://avatars.githubusercontent.com/u/333270?v=4&s=48" width="48" height="48" alt="odrobnik" title="odrobnik"/></a> <a href="https://github.com/pcty-nextgen-ios-builder"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="pcty-nextgen-ios-builder" title="pcty-nextgen-ios-builder"/></a> <a href="https://github.com/search?q=Quentin"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Quentin" title="Quentin"/></a>
<a href="https://github.com/search?q=Randy%20Torres"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="Randy Torres" title="Randy Torres"/></a> <a href="https://github.com/rhjoh"><img src="https://avatars.githubusercontent.com/u/105699450?v=4&s=48" width="48" height="48" alt="rhjoh" title="rhjoh"/></a> <a href="https://github.com/ronak-guliani"><img src="https://avatars.githubusercontent.com/u/23518228?v=4&s=48" width="48" height="48" alt="ronak-guliani" title="ronak-guliani"/></a> <a href="https://github.com/search?q=William%20Stock"><img src="assets/avatar-placeholder.svg" width="48" height="48" alt="William Stock" title="William Stock"/></a>
</p>
@@ -6,19 +6,15 @@ struct ConfigSettings: View {
    private let isNixMode = ProcessInfo.processInfo.isNixMode
    @Bindable var store: ChannelsStore
    @State private var hasLoaded = false
    @State private var activeSectionKey: String?
    @State private var activeSubsection: SubsectionSelection?

    init(store: ChannelsStore = .shared) {
        self.store = store
    }

    var body: some View {
        HStack(spacing: 16) {
            self.sidebar
            self.detail
            ScrollView {
                self.content
            }
            .frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .topLeading)
            .task {
                guard !self.hasLoaded else { return }
                guard !self.isPreview else { return }
@@ -26,125 +22,42 @@ struct ConfigSettings: View {
                await self.store.loadConfigSchema()
                await self.store.loadConfig()
            }
            .onAppear { self.ensureSelection() }
            .onChange(of: self.store.configSchemaLoading) { _, loading in
                if !loading { self.ensureSelection() }
            }
        }
    }
}

extension ConfigSettings {
    private enum SubsectionSelection: Hashable {
        case all
        case key(String)
    }

    private struct ConfigSection: Identifiable {
        let key: String
        let label: String
        let help: String?
        let node: ConfigSchemaNode

        var id: String { self.key }
    }

    private struct ConfigSubsection: Identifiable {
        let key: String
        let label: String
        let help: String?
        let node: ConfigSchemaNode
        let path: ConfigPath

        var id: String { self.key }
    }

    private var sections: [ConfigSection] {
        guard let schema = self.store.configSchema else { return [] }
        return self.resolveSections(schema)
    }

    private var activeSection: ConfigSection? {
        self.sections.first { $0.key == self.activeSectionKey }
    }

    private var sidebar: some View {
        ScrollView {
            LazyVStack(alignment: .leading, spacing: 8) {
                if self.sections.isEmpty {
                    Text("No config sections available.")
    private var content: some View {
        VStack(alignment: .leading, spacing: 16) {
            self.header
            if let status = self.store.configStatus {
                Text(status)
                    .font(.callout)
                    .foregroundStyle(.secondary)
            }
            self.actionRow
            Group {
                if self.store.configSchemaLoading {
                    ProgressView().controlSize(.small)
                } else if let schema = self.store.configSchema {
                    ConfigSchemaForm(store: self.store, schema: schema, path: [])
                        .disabled(self.isNixMode)
                } else {
                    Text("Schema unavailable.")
                        .font(.caption)
                        .foregroundStyle(.secondary)
                        .padding(.horizontal, 6)
                        .padding(.vertical, 4)
                } else {
                    ForEach(self.sections) { section in
                        self.sidebarRow(section)
                    }
                }
            }
            .padding(.vertical, 10)
            .padding(.horizontal, 10)
        }
        .frame(minWidth: 220, idealWidth: 240, maxWidth: 280, maxHeight: .infinity, alignment: .topLeading)
        .background(
            RoundedRectangle(cornerRadius: 12, style: .continuous)
                .fill(Color(nsColor: .windowBackgroundColor)))
        .clipShape(RoundedRectangle(cornerRadius: 12, style: .continuous))
    }

    private var detail: some View {
        VStack(alignment: .leading, spacing: 16) {
            if self.store.configSchemaLoading {
                ProgressView().controlSize(.small)
            } else if let section = self.activeSection {
                self.sectionDetail(section)
            } else if self.store.configSchema != nil {
                self.emptyDetail
            } else {
                Text("Schema unavailable.")
            if self.store.configDirty, !self.isNixMode {
                Text("Unsaved changes")
                    .font(.caption)
                    .foregroundStyle(.secondary)
            }
            Spacer(minLength: 0)
        }
        .frame(minWidth: 460, maxWidth: .infinity, maxHeight: .infinity, alignment: .topLeading)
    }

    private var emptyDetail: some View {
        VStack(alignment: .leading, spacing: 8) {
            self.header
            Text("Select a config section to view settings.")
                .font(.callout)
                .foregroundStyle(.secondary)
        }
        .frame(maxWidth: .infinity, alignment: .leading)
        .padding(.horizontal, 24)
        .padding(.vertical, 18)
    }

    private func sectionDetail(_ section: ConfigSection) -> some View {
        ScrollView(.vertical) {
            VStack(alignment: .leading, spacing: 16) {
                self.header
                if let status = self.store.configStatus {
                    Text(status)
                        .font(.callout)
                        .foregroundStyle(.secondary)
                }
                self.actionRow
                self.sectionHeader(section)
                self.subsectionNav(section)
                self.sectionForm(section)
                if self.store.configDirty, !self.isNixMode {
                    Text("Unsaved changes")
                        .font(.caption)
                        .foregroundStyle(.secondary)
                }
                Spacer(minLength: 0)
            }
            .frame(maxWidth: .infinity, alignment: .leading)
            .padding(.horizontal, 24)
            .padding(.vertical, 18)
            .groupBoxStyle(PlainSettingsGroupBoxStyle())
        }
        .groupBoxStyle(PlainSettingsGroupBoxStyle())
    }

    @ViewBuilder
@@ -158,18 +71,6 @@ extension ConfigSettings {
            .foregroundStyle(.secondary)
    }

    private func sectionHeader(_ section: ConfigSection) -> some View {
        VStack(alignment: .leading, spacing: 6) {
            Text(section.label)
                .font(.title3.weight(.semibold))
            if let help = section.help {
                Text(help)
                    .font(.callout)
                    .foregroundStyle(.secondary)
            }
        }
    }

    private var actionRow: some View {
        HStack(spacing: 10) {
            Button("Reload") {
@@ -184,204 +85,6 @@ extension ConfigSettings {
        }
        .buttonStyle(.bordered)
    }

    private func sidebarRow(_ section: ConfigSection) -> some View {
        let isSelected = self.activeSectionKey == section.key
        return Button {
            self.selectSection(section)
        } label: {
            VStack(alignment: .leading, spacing: 2) {
                Text(section.label)
                if let help = section.help {
                    Text(help)
                        .font(.caption)
                        .foregroundStyle(.secondary)
                        .lineLimit(2)
                }
            }
            .padding(.vertical, 6)
            .padding(.horizontal, 8)
            .frame(maxWidth: .infinity, alignment: .leading)
            .background(isSelected ? Color.accentColor.opacity(0.18) : Color.clear)
            .clipShape(RoundedRectangle(cornerRadius: 10, style: .continuous))
            .background(Color.clear)
            .contentShape(Rectangle())
        }
        .frame(maxWidth: .infinity, alignment: .leading)
        .buttonStyle(.plain)
        .contentShape(Rectangle())
    }

    @ViewBuilder
    private func subsectionNav(_ section: ConfigSection) -> some View {
        let subsections = self.resolveSubsections(for: section)
        if subsections.isEmpty {
            EmptyView()
        } else {
            ScrollView(.horizontal, showsIndicators: false) {
                HStack(spacing: 8) {
                    self.subsectionButton(
                        title: "All",
                        isSelected: self.activeSubsection == .all)
                    {
                        self.activeSubsection = .all
                    }
                    ForEach(subsections) { subsection in
                        self.subsectionButton(
                            title: subsection.label,
                            isSelected: self.activeSubsection == .key(subsection.key))
                        {
                            self.activeSubsection = .key(subsection.key)
                        }
                    }
                }
                .padding(.vertical, 2)
            }
        }
    }

    private func subsectionButton(
        title: String,
        isSelected: Bool,
        action: @escaping () -> Void) -> some View
    {
        Button(action: action) {
            Text(title)
                .font(.callout.weight(.semibold))
                .foregroundStyle(isSelected ? Color.accentColor : .primary)
                .padding(.horizontal, 10)
                .padding(.vertical, 6)
                .background(isSelected ? Color.accentColor.opacity(0.18) : Color(nsColor: .controlBackgroundColor))
                .clipShape(Capsule())
        }
        .buttonStyle(.plain)
    }

    private func sectionForm(_ section: ConfigSection) -> some View {
        let subsection = self.activeSubsection
        let defaultPath: ConfigPath = [.key(section.key)]
        let subsections = self.resolveSubsections(for: section)
        let resolved: (ConfigSchemaNode, ConfigPath) = {
            if case let .key(key) = subsection,
               let match = subsections.first(where: { $0.key == key })
            {
                return (match.node, match.path)
            }
            return (self.resolvedSchemaNode(section.node), defaultPath)
        }()

        return ConfigSchemaForm(store: self.store, schema: resolved.0, path: resolved.1)
            .disabled(self.isNixMode)
    }

    private func ensureSelection() {
        guard let schema = self.store.configSchema else { return }
        let sections = self.resolveSections(schema)
        guard !sections.isEmpty else { return }

        let active = sections.first { $0.key == self.activeSectionKey } ?? sections[0]
        if self.activeSectionKey != active.key {
            self.activeSectionKey = active.key
        }
        self.ensureSubsection(for: active)
    }

    private func ensureSubsection(for section: ConfigSection) {
        let subsections = self.resolveSubsections(for: section)
        guard !subsections.isEmpty else {
            self.activeSubsection = nil
            return
        }

        switch self.activeSubsection {
        case .all:
            return
        case let .key(key):
            if subsections.contains(where: { $0.key == key }) { return }
        case .none:
            break
        }

        if let first = subsections.first {
            self.activeSubsection = .key(first.key)
        }
    }

    private func selectSection(_ section: ConfigSection) {
        guard self.activeSectionKey != section.key else { return }
        self.activeSectionKey = section.key
        let subsections = self.resolveSubsections(for: section)
        if let first = subsections.first {
            self.activeSubsection = .key(first.key)
        } else {
            self.activeSubsection = nil
        }
    }

    private func resolveSections(_ root: ConfigSchemaNode) -> [ConfigSection] {
        let node = self.resolvedSchemaNode(root)
        let hints = self.store.configUiHints
        let keys = node.properties.keys.sorted { lhs, rhs in
            let orderA = hintForPath([.key(lhs)], hints: hints)?.order ?? 0
            let orderB = hintForPath([.key(rhs)], hints: hints)?.order ?? 0
            if orderA != orderB { return orderA < orderB }
            return lhs < rhs
        }

        return keys.compactMap { key in
            guard let child = node.properties[key] else { return nil }
            let path: ConfigPath = [.key(key)]
            let hint = hintForPath(path, hints: hints)
            let label = hint?.label
                ?? child.title
                ?? self.humanize(key)
            let help = hint?.help ?? child.description
            return ConfigSection(key: key, label: label, help: help, node: child)
        }
    }

    private func resolveSubsections(for section: ConfigSection) -> [ConfigSubsection] {
        let node = self.resolvedSchemaNode(section.node)
        guard node.schemaType == "object" else { return [] }
        let hints = self.store.configUiHints
        let keys = node.properties.keys.sorted { lhs, rhs in
            let orderA = hintForPath([.key(section.key), .key(lhs)], hints: hints)?.order ?? 0
            let orderB = hintForPath([.key(section.key), .key(rhs)], hints: hints)?.order ?? 0
            if orderA != orderB { return orderA < orderB }
            return lhs < rhs
        }

        return keys.compactMap { key in
            guard let child = node.properties[key] else { return nil }
            let path: ConfigPath = [.key(section.key), .key(key)]
            let hint = hintForPath(path, hints: hints)
            let label = hint?.label
                ?? child.title
                ?? self.humanize(key)
            let help = hint?.help ?? child.description
            return ConfigSubsection(
                key: key,
                label: label,
                help: help,
                node: child,
                path: path)
        }
    }

    private func resolvedSchemaNode(_ node: ConfigSchemaNode) -> ConfigSchemaNode {
        let variants = node.anyOf.isEmpty ? node.oneOf : node.anyOf
        if !variants.isEmpty {
            let nonNull = variants.filter { !$0.isNullSchema }
            if nonNull.count == 1, let only = nonNull.first { return only }
        }
        return node
    }

    private func humanize(_ key: String) -> String {
        key.replacingOccurrences(of: "_", with: " ")
            .replacingOccurrences(of: "-", with: " ")
            .capitalized
    }
}

struct ConfigSettings_Previews: PreviewProvider {

@@ -87,7 +87,15 @@ final class ControlChannel {

    func configure() async {
        self.logger.info("control channel configure mode=local")
        await self.refreshEndpoint(reason: "configure")
        self.state = .connecting
        do {
            try await GatewayConnection.shared.refresh()
            self.state = .connected
            PresenceReporter.shared.sendImmediate(reason: "connect")
        } catch {
            let message = self.friendlyGatewayMessage(error)
            self.state = .degraded(message)
        }
    }

    func configure(mode: Mode = .local) async throws {
@@ -103,7 +111,7 @@ final class ControlChannel {
                "target=\(target, privacy: .public) identitySet=\(idSet, privacy: .public)")
            self.state = .connecting
            _ = try await GatewayEndpointStore.shared.ensureRemoteControlTunnel()
            await self.refreshEndpoint(reason: "configure")
            await self.configure()
        } catch {
            self.state = .degraded(error.localizedDescription)
            throw error
@@ -111,19 +119,6 @@ final class ControlChannel {
        }
    }

    func refreshEndpoint(reason: String) async {
        self.logger.info("control channel refresh endpoint reason=\(reason, privacy: .public)")
        self.state = .connecting
        do {
            try await self.establishGatewayConnection()
            self.state = .connected
            PresenceReporter.shared.sendImmediate(reason: "connect")
        } catch {
            let message = self.friendlyGatewayMessage(error)
            self.state = .degraded(message)
        }
    }

    func disconnect() async {
        await GatewayConnection.shared.shutdown()
        self.state = .disconnected
@@ -280,28 +275,18 @@ final class ControlChannel {
        }
    }

        await self.refreshEndpoint(reason: "recovery:\(reasonText)")
        if case .connected = self.state {
            do {
                try await GatewayConnection.shared.refresh()
            self.logger.info("control channel recovery finished")
        } else if case let .degraded(message) = self.state {
            self.logger.error("control channel recovery failed \(message, privacy: .public)")
            } catch {
                self.logger.error(
                    "control channel recovery failed \(error.localizedDescription, privacy: .public)")
            }

        self.recoveryTask = nil
    }
}

    private func establishGatewayConnection(timeoutMs: Int = 5000) async throws {
        try await GatewayConnection.shared.refresh()
        let ok = try await GatewayConnection.shared.healthOK(timeoutMs: timeoutMs)
        if ok == false {
            throw NSError(
                domain: "Gateway",
                code: 0,
                userInfo: [NSLocalizedDescriptionKey: "gateway health not ok"])
        }
    }

    func sendSystemEvent(_ text: String, params: [String: AnyHashable] = [:]) async throws {
        var merged = params
        merged["text"] = AnyHashable(text)

@@ -319,7 +319,7 @@ private enum ExecHostExecutor {
                security: context.security,
                allowlistMatch: context.allowlistMatch,
                skillAllow: context.skillAllow),
            approvalDecision == nil
            approvalDecision == nil
        {
            let decision = ExecApprovalsPromptPresenter.prompt(
                ExecApprovalPromptRequest(

@@ -148,27 +148,6 @@ actor GatewayConnection {
}
}

let nsError = lastError as NSError
if nsError.domain == URLError.errorDomain,
let fallback = await GatewayEndpointStore.shared.maybeFallbackToTailnet(from: cfg.url)
{
await self.configure(url: fallback.url, token: fallback.token, password: fallback.password)
for delayMs in [150, 400, 900] {
try await Task.sleep(nanoseconds: UInt64(delayMs) * 1_000_000)
do {
guard let client = self.client else {
throw NSError(
domain: "Gateway",
code: 0,
userInfo: [NSLocalizedDescriptionKey: "gateway not configured"])
}
return try await client.request(method: method, params: params, timeoutMs: timeoutMs)
} catch {
lastError = error
}
}
}

throw lastError
case .remote:
let nsError = error as NSError

@@ -1,63 +0,0 @@
import Foundation
import Observation
import OSLog

@MainActor
@Observable
final class GatewayConnectivityCoordinator {
static let shared = GatewayConnectivityCoordinator()

private let logger = Logger(subsystem: "com.clawdbot", category: "gateway.connectivity")
private var endpointTask: Task<Void, Never>?
private var lastResolvedURL: URL?

private(set) var endpointState: GatewayEndpointState?
private(set) var resolvedURL: URL?
private(set) var resolvedMode: AppState.ConnectionMode?
private(set) var resolvedHostLabel: String?

private init() {
self.start()
}

func start() {
guard self.endpointTask == nil else { return }
self.endpointTask = Task { [weak self] in
guard let self else { return }
let stream = await GatewayEndpointStore.shared.subscribe()
for await state in stream {
await MainActor.run { self.handleEndpointState(state) }
}
}
}

var localEndpointHostLabel: String? {
guard self.resolvedMode == .local, let url = self.resolvedURL else { return nil }
return Self.hostLabel(for: url)
}

private func handleEndpointState(_ state: GatewayEndpointState) {
self.endpointState = state
switch state {
case let .ready(mode, url, _, _):
self.resolvedMode = mode
self.resolvedURL = url
self.resolvedHostLabel = Self.hostLabel(for: url)
let urlChanged = self.lastResolvedURL?.absoluteString != url.absoluteString
if urlChanged {
self.lastResolvedURL = url
Task { await ControlChannel.shared.refreshEndpoint(reason: "endpoint changed") }
}
case let .connecting(mode, _):
self.resolvedMode = mode
case let .unavailable(mode, _):
self.resolvedMode = mode
}
}

private static func hostLabel(for url: URL) -> String {
let host = url.host ?? url.absoluteString
if let port = url.port { return "\(host):\(port)" }
return host
}
}
@@ -68,7 +68,6 @@ actor GatewayEndpointStore {
env: ProcessInfo.processInfo.environment)
let customBindHost = GatewayEndpointStore.resolveGatewayCustomBindHost(root: root)
let tailscaleIP = await MainActor.run { TailscaleService.shared.tailscaleIP }
?? TailscaleService.fallbackTailnetIPv4()
return GatewayEndpointStore.resolveLocalGatewayHost(
bindMode: bind,
customBindHost: customBindHost,
@@ -173,10 +172,6 @@ actor GatewayEndpointStore {
return configToken
}

if isRemote {
return nil
}

if let token = launchdSnapshot?.token?.trimmingCharacters(in: .whitespacesAndNewlines),
!token.isEmpty
{
@@ -474,36 +469,6 @@ actor GatewayEndpointStore {
}
}

func maybeFallbackToTailnet(from currentURL: URL) async -> GatewayConnection.Config? {
let mode = await self.deps.mode()
guard mode == .local else { return nil }

let root = ClawdbotConfigFile.loadDict()
let bind = GatewayEndpointStore.resolveGatewayBindMode(
root: root,
env: ProcessInfo.processInfo.environment)
guard bind == "auto" else { return nil }

let currentHost = currentURL.host?.lowercased() ?? ""
guard currentHost == "127.0.0.1" || currentHost == "localhost" else { return nil }

let tailscaleIP = await MainActor.run { TailscaleService.shared.tailscaleIP }
?? TailscaleService.fallbackTailnetIPv4()
guard let tailscaleIP, !tailscaleIP.isEmpty else { return nil }

let scheme = GatewayEndpointStore.resolveGatewayScheme(
root: root,
env: ProcessInfo.processInfo.environment)
let port = self.deps.localPort()
let token = self.deps.token()
let password = self.deps.password()
let url = URL(string: "\(scheme)://\(tailscaleIP):\(port)")!

self.logger.info("auto bind fallback to tailnet host=\(tailscaleIP, privacy: .public)")
self.setState(.ready(mode: .local, url: url, token: token, password: password))
return (url, token, password)
}

private static func resolveGatewayBindMode(
root: [String: Any],
env: [String: String]) -> String?
@@ -559,17 +524,12 @@ actor GatewayEndpointStore {
tailscaleIP: String?) -> String
{
switch bindMode {
case "tailnet":
return tailscaleIP ?? "127.0.0.1"
case "auto":
if let tailscaleIP, !tailscaleIP.isEmpty {
return tailscaleIP
}
return "127.0.0.1"
case "tailnet", "auto":
tailscaleIP ?? "127.0.0.1"
case "custom":
return customBindHost ?? "127.0.0.1"
customBindHost ?? "127.0.0.1"
default:
return "127.0.0.1"
"127.0.0.1"
}
}
}
@@ -640,12 +600,11 @@ extension GatewayEndpointStore {

static func _testResolveLocalGatewayHost(
bindMode: String?,
tailscaleIP: String?,
customBindHost: String? = nil) -> String
tailscaleIP: String?) -> String
{
self.resolveLocalGatewayHost(
bindMode: bindMode,
customBindHost: customBindHost,
customBindHost: nil,
tailscaleIP: tailscaleIP)
}
}

@@ -235,8 +235,8 @@ final class HealthStore {
let lower = error.lowercased()
if lower.contains("connection refused") {
let port = GatewayEnvironment.gatewayPort()
let host = GatewayConnectivityCoordinator.shared.localEndpointHostLabel ?? "127.0.0.1:\(port)"
return "The gateway control port (\(host)) isn’t listening — restart Clawdbot to bring it back."
return "The gateway control port (127.0.0.1:\(port)) isn’t listening — " +
"restart Clawdbot to bring it back."
}
if lower.contains("timeout") {
return "Timed out waiting for the control server; the gateway may be crashed or still starting."

@@ -13,7 +13,6 @@ struct ClawdbotApp: App {
private let gatewayManager = GatewayProcessManager.shared
private let controlChannel = ControlChannel.shared
private let activityStore = WorkActivityStore.shared
private let connectivityCoordinator = GatewayConnectivityCoordinator.shared
@State private var statusItem: NSStatusItem?
@State private var isMenuPresented = false
@State private var isPanelVisible = false

@@ -469,7 +469,7 @@ extension MenuSessionsInjector {
}
case .local:
platform = "local"
host = GatewayConnectivityCoordinator.shared.localEndpointHostLabel ?? "127.0.0.1:\(port)"
host = "127.0.0.1:\(port)"
case .unconfigured:
platform = nil
host = nil
@@ -2,9 +2,6 @@ import AppKit
import Foundation
import Observation
import os
#if canImport(Darwin)
import Darwin
#endif

/// Manages Tailscale integration and status checking.
@Observable
@@ -103,14 +100,16 @@ final class TailscaleService {
}

func checkTailscaleStatus() async {
let previousIP = self.tailscaleIP
self.isInstalled = self.checkAppInstallation()
if !self.isInstalled {
guard self.isInstalled else {
self.isRunning = false
self.tailscaleHostname = nil
self.tailscaleIP = nil
self.statusError = "Tailscale is not installed"
} else if let apiResponse = await fetchTailscaleStatus() {
return
}

if let apiResponse = await fetchTailscaleStatus() {
self.isRunning = apiResponse.status.lowercased() == "running"

if self.isRunning {
@@ -139,19 +138,6 @@ final class TailscaleService {
self.statusError = "Please start the Tailscale app"
self.logger.info("Tailscale API not responding; app likely not running")
}

if self.tailscaleIP == nil, let fallback = Self.detectTailnetIPv4() {
self.tailscaleIP = fallback
if !self.isRunning {
self.isRunning = true
}
self.statusError = nil
self.logger.info("Tailscale interface IP detected (fallback) ip=\(fallback, privacy: .public)")
}

if previousIP != self.tailscaleIP {
await GatewayEndpointStore.shared.refresh()
}
}

func openTailscaleApp() {
@@ -177,50 +163,4 @@ final class TailscaleService {
NSWorkspace.shared.open(url)
}
}

private static func isTailnetIPv4(_ address: String) -> Bool {
let parts = address.split(separator: ".")
guard parts.count == 4 else { return false }
let octets = parts.compactMap { Int($0) }
guard octets.count == 4 else { return false }
let a = octets[0]
let b = octets[1]
return a == 100 && b >= 64 && b <= 127
}

private static func detectTailnetIPv4() -> String? {
var addrList: UnsafeMutablePointer<ifaddrs>?
guard getifaddrs(&addrList) == 0, let first = addrList else { return nil }
defer { freeifaddrs(addrList) }

for ptr in sequence(first: first, next: { $0.pointee.ifa_next }) {
let flags = Int32(ptr.pointee.ifa_flags)
let isUp = (flags & IFF_UP) != 0
let isLoopback = (flags & IFF_LOOPBACK) != 0
let family = ptr.pointee.ifa_addr.pointee.sa_family
if !isUp || isLoopback || family != UInt8(AF_INET) { continue }

var addr = ptr.pointee.ifa_addr.pointee
var buffer = [CChar](repeating: 0, count: Int(NI_MAXHOST))
let result = getnameinfo(
&addr,
socklen_t(ptr.pointee.ifa_addr.pointee.sa_len),
&buffer,
socklen_t(buffer.count),
nil,
0,
NI_NUMERICHOST)
guard result == 0 else { continue }
let len = buffer.prefix { $0 != 0 }
let bytes = len.map { UInt8(bitPattern: $0) }
guard let ip = String(bytes: bytes, encoding: .utf8) else { continue }
if Self.isTailnetIPv4(ip) { return ip }
}

return nil
}

nonisolated static func fallbackTailnetIPv4() -> String? {
Self.detectTailnetIPv4()
}
}
@@ -1,16 +1,13 @@
import ClawdbotKit
import ClawdbotProtocol
import Foundation
#if canImport(Darwin)
import Darwin
#endif

struct ConnectOptions {
var url: String?
var token: String?
var password: String?
var mode: String?
var timeoutMs: Int = 15000
var timeoutMs: Int = 15_000
var json: Bool = false
var probe: Bool = false
var clientId: String = "clawdbot-macos"
@@ -22,43 +19,53 @@ struct ConnectOptions {

static func parse(_ args: [String]) -> ConnectOptions {
var opts = ConnectOptions()
let flagHandlers: [String: (inout ConnectOptions) -> Void] = [
"-h": { $0.help = true },
"--help": { $0.help = true },
"--json": { $0.json = true },
"--probe": { $0.probe = true },
]
let valueHandlers: [String: (inout ConnectOptions, String) -> Void] = [
"--url": { $0.url = $1 },
"--token": { $0.token = $1 },
"--password": { $0.password = $1 },
"--mode": { $0.mode = $1 },
"--timeout": { opts, raw in
if let parsed = Int(raw.trimmingCharacters(in: .whitespacesAndNewlines)) {
opts.timeoutMs = max(250, parsed)
}
},
"--client-id": { $0.clientId = $1 },
"--client-mode": { $0.clientMode = $1 },
"--display-name": { $0.displayName = $1 },
"--role": { $0.role = $1 },
"--scopes": { opts, raw in
opts.scopes = raw.split(separator: ",").map { $0.trimmingCharacters(in: .whitespacesAndNewlines) }
.filter { !$0.isEmpty }
},
]
var i = 0
while i < args.count {
let arg = args[i]
if let handler = flagHandlers[arg] {
handler(&opts)
i += 1
continue
}
if let handler = valueHandlers[arg], let value = self.nextValue(args, index: &i) {
handler(&opts, value)
i += 1
continue
switch arg {
case "-h", "--help":
opts.help = true
case "--json":
opts.json = true
case "--probe":
opts.probe = true
case "--url":
opts.url = self.nextValue(args, index: &i)
case "--token":
opts.token = self.nextValue(args, index: &i)
case "--password":
opts.password = self.nextValue(args, index: &i)
case "--mode":
if let value = self.nextValue(args, index: &i) {
opts.mode = value
}
case "--timeout":
if let raw = self.nextValue(args, index: &i),
let parsed = Int(raw.trimmingCharacters(in: .whitespacesAndNewlines))
{
opts.timeoutMs = max(250, parsed)
}
case "--client-id":
if let value = self.nextValue(args, index: &i) {
opts.clientId = value
}
case "--client-mode":
if let value = self.nextValue(args, index: &i) {
opts.clientMode = value
}
case "--display-name":
opts.displayName = self.nextValue(args, index: &i)
case "--role":
if let value = self.nextValue(args, index: &i) {
opts.role = value
}
case "--scopes":
if let value = self.nextValue(args, index: &i) {
opts.scopes = value.split(separator: ",").map { $0.trimmingCharacters(in: .whitespacesAndNewlines) }
.filter { !$0.isEmpty }
}
default:
break
}
i += 1
}
@@ -247,12 +254,8 @@ private func resolveGatewayEndpoint(opts: ConnectOptions, config: GatewayConfig)

if resolvedMode == "remote" {
guard let raw = config.remoteUrl?.trimmingCharacters(in: .whitespacesAndNewlines),
!raw.isEmpty
else {
throw NSError(
domain: "Gateway",
code: 1,
userInfo: [NSLocalizedDescriptionKey: "gateway.remote.url is missing"])
!raw.isEmpty else {
throw NSError(domain: "Gateway", code: 1, userInfo: [NSLocalizedDescriptionKey: "gateway.remote.url is missing"])
}
guard let url = URL(string: raw) else {
throw NSError(domain: "Gateway", code: 1, userInfo: [NSLocalizedDescriptionKey: "invalid url: \(raw)"])
@@ -265,12 +268,9 @@ private func resolveGatewayEndpoint(opts: ConnectOptions, config: GatewayConfig)
}

let port = config.port ?? 18789
let host = resolveLocalHost(bind: config.bind)
let host = "127.0.0.1"
guard let url = URL(string: "ws://\(host):\(port)") else {
throw NSError(
domain: "Gateway",
code: 1,
userInfo: [NSLocalizedDescriptionKey: "invalid url: ws://\(host):\(port)"])
throw NSError(domain: "Gateway", code: 1, userInfo: [NSLocalizedDescriptionKey: "invalid url: ws://\(host):\(port)"])
}
return GatewayEndpoint(
url: url,
@@ -280,7 +280,7 @@ private func resolveGatewayEndpoint(opts: ConnectOptions, config: GatewayConfig)
}

private func bestEffortEndpoint(opts: ConnectOptions, config: GatewayConfig) -> GatewayEndpoint? {
try? resolveGatewayEndpoint(opts: opts, config: config)
return try? resolveGatewayEndpoint(opts: opts, config: config)
}

private func resolvedToken(opts: ConnectOptions, mode: String, config: GatewayConfig) -> String? {
@@ -304,56 +304,3 @@ private func resolvedPassword(opts: ConnectOptions, mode: String, config: Gatewa
}
return config.password
}

private func resolveLocalHost(bind: String?) -> String {
let normalized = (bind ?? "").trimmingCharacters(in: .whitespacesAndNewlines).lowercased()
let tailnetIP = detectTailnetIPv4()
switch normalized {
case "tailnet", "auto":
return tailnetIP ?? "127.0.0.1"
default:
return "127.0.0.1"
}
}

private func detectTailnetIPv4() -> String? {
var addrList: UnsafeMutablePointer<ifaddrs>?
guard getifaddrs(&addrList) == 0, let first = addrList else { return nil }
defer { freeifaddrs(addrList) }

for ptr in sequence(first: first, next: { $0.pointee.ifa_next }) {
let flags = Int32(ptr.pointee.ifa_flags)
let isUp = (flags & IFF_UP) != 0
let isLoopback = (flags & IFF_LOOPBACK) != 0
let family = ptr.pointee.ifa_addr.pointee.sa_family
if !isUp || isLoopback || family != UInt8(AF_INET) { continue }

var addr = ptr.pointee.ifa_addr.pointee
var buffer = [CChar](repeating: 0, count: Int(NI_MAXHOST))
let result = getnameinfo(
&addr,
socklen_t(ptr.pointee.ifa_addr.pointee.sa_len),
&buffer,
socklen_t(buffer.count),
nil,
0,
NI_NUMERICHOST)
guard result == 0 else { continue }
let len = buffer.prefix { $0 != 0 }
let bytes = len.map { UInt8(bitPattern: $0) }
guard let ip = String(bytes: bytes, encoding: .utf8) else { continue }
if isTailnetIPv4(ip) { return ip }
}

return nil
}

private func isTailnetIPv4(_ address: String) -> Bool {
let parts = address.split(separator: ".")
guard parts.count == 4 else { return false }
let octets = parts.compactMap { Int($0) }
guard octets.count == 4 else { return false }
let a = octets[0]
let b = octets[1]
return a == 100 && b >= 64 && b <= 127
}
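Both copies of `isTailnetIPv4` in this diff implement the same test: an address counts as a tailnet address when it falls in the CGNAT block 100.64.0.0/10 (first octet 100, second octet 64 through 127), the range Tailscale assigns from. A minimal standalone sketch of that check, written for illustration and not part of this diff:

```swift
// Sketch of the tailnet test used above: Tailscale hands out addresses
// from the CGNAT range 100.64.0.0/10, i.e. 100.64.0.0 through 100.127.255.255.
func isTailnetIPv4(_ address: String) -> Bool {
    let octets = address.split(separator: ".").compactMap { Int($0) }
    guard octets.count == 4 else { return false }
    return octets[0] == 100 && (64...127).contains(octets[1])
}

print(isTailnetIPv4("100.64.1.2"))    // true: inside 100.64.0.0/10
print(isTailnetIPv4("100.128.0.1"))   // false: second octet out of range
print(isTailnetIPv4("192.168.1.10"))  // false: not a CGNAT address
```

Collapsing the split/compactMap into one pass behaves the same as the two-step `parts.count` / `octets.count` guard in the diff: any non-numeric or short address fails the four-octet check.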
@@ -473,7 +473,6 @@ public struct AgentParams: Codable, Sendable {
public let replychannel: String?
public let accountid: String?
public let replyaccountid: String?
public let threadid: String?
public let timeout: Int?
public let lane: String?
public let extrasystemprompt: String?
@@ -495,7 +494,6 @@ public struct AgentParams: Codable, Sendable {
replychannel: String?,
accountid: String?,
replyaccountid: String?,
threadid: String?,
timeout: Int?,
lane: String?,
extrasystemprompt: String?,
@@ -516,7 +514,6 @@ public struct AgentParams: Codable, Sendable {
self.replychannel = replychannel
self.accountid = accountid
self.replyaccountid = replyaccountid
self.threadid = threadid
self.timeout = timeout
self.lane = lane
self.extrasystemprompt = extrasystemprompt
@@ -538,7 +535,6 @@ public struct AgentParams: Codable, Sendable {
case replychannel = "replyChannel"
case accountid = "accountId"
case replyaccountid = "replyAccountId"
case threadid = "threadId"
case timeout
case lane
case extrasystemprompt = "extraSystemPrompt"
@@ -839,47 +835,35 @@ public struct SessionsListParams: Codable, Sendable {
public let activeminutes: Int?
public let includeglobal: Bool?
public let includeunknown: Bool?
public let includederivedtitles: Bool?
public let includelastmessage: Bool?
public let label: String?
public let spawnedby: String?
public let agentid: String?
public let search: String?

public init(
limit: Int?,
activeminutes: Int?,
includeglobal: Bool?,
includeunknown: Bool?,
includederivedtitles: Bool?,
includelastmessage: Bool?,
label: String?,
spawnedby: String?,
agentid: String?,
search: String?
agentid: String?
) {
self.limit = limit
self.activeminutes = activeminutes
self.includeglobal = includeglobal
self.includeunknown = includeunknown
self.includederivedtitles = includederivedtitles
self.includelastmessage = includelastmessage
self.label = label
self.spawnedby = spawnedby
self.agentid = agentid
self.search = search
}
private enum CodingKeys: String, CodingKey {
case limit
case activeminutes = "activeMinutes"
case includeglobal = "includeGlobal"
case includeunknown = "includeUnknown"
case includederivedtitles = "includeDerivedTitles"
case includelastmessage = "includeLastMessage"
case label
case spawnedby = "spawnedBy"
case agentid = "agentId"
case search
}
}
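The `CodingKeys` enums above map this codebase's all-lowercase stored properties onto camelCase wire keys. A minimal standalone illustration of that renaming pattern (the `ListParams` type here is hypothetical, not from this repo):

```swift
import Foundation

// Hypothetical illustration of the CodingKeys renaming used above:
// the stored property stays lowercase, the JSON key stays camelCase.
struct ListParams: Codable {
    let activeminutes: Int?

    private enum CodingKeys: String, CodingKey {
        case activeminutes = "activeMinutes"
    }
}

let data = try! JSONEncoder().encode(ListParams(activeminutes: 30))
print(String(data: data, encoding: .utf8)!) // {"activeMinutes":30}
```

Dropping a case from such an enum (as the diff does for `search` and `threadId`) removes the field from both encoding and decoding without touching the wire format of the remaining keys.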
@@ -3,8 +3,6 @@ import SwiftUI
import Testing
@testable import Clawdbot

private typealias SnapshotAnyCodable = Clawdbot.AnyCodable

@Suite(.serialized)
@MainActor
struct ChannelsSettingsSmokeTests {
@@ -19,11 +17,8 @@ struct ChannelsSettingsSmokeTests {
"signal": "Signal",
"imessage": "iMessage",
],
channelDetailLabels: nil,
channelSystemImages: nil,
channelMeta: nil,
channels: [
"whatsapp": SnapshotAnyCodable([
"whatsapp": AnyCodable([
"configured": true,
"linked": true,
"authAgeMs": 86_400_000,
@@ -42,7 +37,7 @@ struct ChannelsSettingsSmokeTests {
"lastEventAt": 1_700_000_060_000,
"lastError": "needs login",
]),
"telegram": SnapshotAnyCodable([
"telegram": AnyCodable([
"configured": true,
"tokenSource": "env",
"running": true,
@@ -57,7 +52,7 @@ struct ChannelsSettingsSmokeTests {
],
"lastProbeAt": 1_700_000_050_000,
]),
"signal": SnapshotAnyCodable([
"signal": AnyCodable([
"configured": true,
"baseUrl": "http://127.0.0.1:8080",
"running": true,
@@ -70,7 +65,7 @@ struct ChannelsSettingsSmokeTests {
],
"lastProbeAt": 1_700_000_050_000,
]),
"imessage": SnapshotAnyCodable([
"imessage": AnyCodable([
"configured": false,
"running": false,
"lastError": "not configured",
@@ -105,18 +100,15 @@ struct ChannelsSettingsSmokeTests {
"signal": "Signal",
"imessage": "iMessage",
],
channelDetailLabels: nil,
channelSystemImages: nil,
channelMeta: nil,
channels: [
"whatsapp": SnapshotAnyCodable([
"whatsapp": AnyCodable([
"configured": false,
"linked": false,
"running": false,
"connected": false,
"reconnectAttempts": 0,
]),
"telegram": SnapshotAnyCodable([
"telegram": AnyCodable([
"configured": false,
"running": false,
"lastError": "bot missing",
@@ -128,7 +120,7 @@ struct ChannelsSettingsSmokeTests {
],
"lastProbeAt": 1_700_000_100_000,
]),
"signal": SnapshotAnyCodable([
"signal": AnyCodable([
"configured": false,
"baseUrl": "http://127.0.0.1:8080",
"running": false,
@@ -141,7 +133,7 @@ struct ChannelsSettingsSmokeTests {
],
"lastProbeAt": 1_700_000_200_000,
]),
"imessage": SnapshotAnyCodable([
"imessage": AnyCodable([
"configured": false,
"running": false,
"lastError": "not configured",
@@ -11,19 +11,16 @@ struct CronJobEditorSmokeTests {
}

@Test func cronJobEditorBuildsBodyForNewJob() {
let channelsStore = ChannelsStore(isPreview: true)
let view = CronJobEditor(
job: nil,
isSaving: .constant(false),
error: .constant(nil),
channelsStore: channelsStore,
onCancel: {},
onSave: { _ in })
_ = view.body
}

@Test func cronJobEditorBuildsBodyForExistingJob() {
let channelsStore = ChannelsStore(isPreview: true)
let job = CronJob(
id: "job-1",
agentId: "ops",
@@ -57,36 +54,31 @@ struct CronJobEditorSmokeTests {
job: job,
isSaving: .constant(false),
error: .constant(nil),
channelsStore: channelsStore,
onCancel: {},
onSave: { _ in })
_ = view.body
}

@Test func cronJobEditorExercisesBuilders() {
let channelsStore = ChannelsStore(isPreview: true)
var view = CronJobEditor(
job: nil,
isSaving: .constant(false),
error: .constant(nil),
channelsStore: channelsStore,
onCancel: {},
onSave: { _ in })
view.exerciseForTesting()
}

@Test func cronJobEditorIncludesDeleteAfterRunForAtSchedule() throws {
let channelsStore = ChannelsStore(isPreview: true)
let view = CronJobEditor(
job: nil,
isSaving: .constant(false),
error: .constant(nil),
channelsStore: channelsStore,
onCancel: {},
onSave: { _ in })

var root: [String: Any] = [:]
view.applyDeleteAfterRun(to: &root, scheduleKind: CronJobEditor.ScheduleKind.at, deleteAfterRun: true)
view.applyDeleteAfterRun(to: &root, scheduleKind: .at, deleteAfterRun: true)
let raw = root["deleteAfterRun"] as? Bool
#expect(raw == true)
}
@@ -1,4 +1,3 @@
import ClawdbotKit
import Foundation
import os
import Testing

@@ -1,4 +1,3 @@
import ClawdbotKit
import Foundation
import os
import Testing

@@ -1,4 +1,3 @@
import ClawdbotKit
import Foundation
import os
import Testing

@@ -1,4 +1,3 @@
import ClawdbotKit
import Foundation
import os
import Testing

@@ -1,4 +1,3 @@
import ClawdbotKit
import Foundation
import Testing
@testable import Clawdbot
@@ -139,40 +139,4 @@ import Testing
let resolved = ConnectionModeResolver.resolve(root: root, defaults: defaults)
#expect(resolved.mode == .remote)
}

@Test func resolveLocalGatewayHostPrefersTailnetForAuto() {
let host = GatewayEndpointStore._testResolveLocalGatewayHost(
bindMode: "auto",
tailscaleIP: "100.64.1.2")
#expect(host == "100.64.1.2")
}

@Test func resolveLocalGatewayHostFallsBackToLoopbackForAuto() {
let host = GatewayEndpointStore._testResolveLocalGatewayHost(
bindMode: "auto",
tailscaleIP: nil)
#expect(host == "127.0.0.1")
}

@Test func resolveLocalGatewayHostPrefersTailnetForTailnetMode() {
let host = GatewayEndpointStore._testResolveLocalGatewayHost(
bindMode: "tailnet",
tailscaleIP: "100.64.1.5")
#expect(host == "100.64.1.5")
}

@Test func resolveLocalGatewayHostFallsBackToLoopbackForTailnetMode() {
let host = GatewayEndpointStore._testResolveLocalGatewayHost(
bindMode: "tailnet",
tailscaleIP: nil)
#expect(host == "127.0.0.1")
}

@Test func resolveLocalGatewayHostUsesCustomBindHost() {
let host = GatewayEndpointStore._testResolveLocalGatewayHost(
bindMode: "custom",
tailscaleIP: "100.64.1.9",
customBindHost: "192.168.1.10")
#expect(host == "192.168.1.10")
}
}
@@ -7,17 +7,15 @@ import Testing

@Suite(.serialized)
struct LowCoverageHelperTests {
private typealias ProtoAnyCodable = ClawdbotProtocol.AnyCodable

@Test func anyCodableHelperAccessors() throws {
let payload: [String: ProtoAnyCodable] = [
"title": ProtoAnyCodable("Hello"),
"flag": ProtoAnyCodable(true),
"count": ProtoAnyCodable(3),
"ratio": ProtoAnyCodable(1.25),
"list": ProtoAnyCodable([ProtoAnyCodable("a"), ProtoAnyCodable(2)]),
let payload: [String: AnyCodable] = [
"title": AnyCodable("Hello"),
"flag": AnyCodable(true),
"count": AnyCodable(3),
"ratio": AnyCodable(1.25),
"list": AnyCodable([AnyCodable("a"), AnyCodable(2)]),
]
let any = ProtoAnyCodable(payload)
let any = AnyCodable(payload)
let dict = try #require(any.dictionaryValue)
#expect(dict["title"]?.stringValue == "Hello")
#expect(dict["flag"]?.boolValue == true)
@@ -78,27 +76,31 @@ struct LowCoverageHelperTests {
#expect(result.stderr.contains("stderr-1999"))
}

@Test func nodeInfoCodableRoundTrip() throws {
let info = NodeInfo(
@Test func pairedNodesStorePersists() async throws {
let dir = FileManager().temporaryDirectory
.appendingPathComponent("paired-\(UUID().uuidString)", isDirectory: true)
try FileManager().createDirectory(at: dir, withIntermediateDirectories: true)
let url = dir.appendingPathComponent("nodes.json")
let store = PairedNodesStore(fileURL: url)
await store.load()
#expect(await store.all().isEmpty)

let node = PairedNode(
nodeId: "node-1",
displayName: "Node One",
platform: "macOS",
version: "1.0",
coreVersion: "1.0-core",
uiVersion: "1.0-ui",
deviceFamily: "Mac",
modelIdentifier: "MacBookPro",
remoteIp: "192.168.1.2",
caps: ["chat"],
commands: ["send"],
permissions: ["send": true],
paired: true,
connected: false)
let data = try JSONEncoder().encode(info)
let decoded = try JSONDecoder().decode(NodeInfo.self, from: data)
#expect(decoded.nodeId == "node-1")
#expect(decoded.isPaired == true)
#expect(decoded.isConnected == false)
token: "token",
createdAtMs: 1,
lastSeenAtMs: nil)
try await store.upsert(node)
#expect(await store.find(nodeId: "node-1")?.displayName == "Node One")

try await store.touchSeen(nodeId: "node-1")
let updated = await store.find(nodeId: "node-1")
#expect(updated?.lastSeenAtMs != nil)
}

@Test @MainActor func presenceReporterHelpers() {
@@ -21,7 +21,6 @@ import Testing
|
||||
features: [:],
|
||||
snapshot: snapshot,
|
||||
canvashosturl: nil,
|
||||
auth: nil,
|
||||
policy: [:])
|
||||
|
||||
let mapped = MacGatewayChatTransport.mapPushToTransportEvent(.snapshot(hello))
|
||||
|
||||
@@ -3,15 +3,13 @@ import SwiftUI
import Testing
@testable import Clawdbot

private typealias ProtoAnyCodable = ClawdbotProtocol.AnyCodable

@Suite(.serialized)
@MainActor
struct OnboardingWizardStepViewTests {
@Test func noteStepBuilds() {
let step = WizardStep(
id: "step-1",
type: ProtoAnyCodable("note"),
type: AnyCodable("note"),
title: "Welcome",
message: "Hello",
options: nil,
@@ -24,17 +22,17 @@ struct OnboardingWizardStepViewTests {
}

@Test func selectStepBuilds() {
let options: [[String: ProtoAnyCodable]] = [
["value": ProtoAnyCodable("local"), "label": ProtoAnyCodable("Local"), "hint": ProtoAnyCodable("This Mac")],
["value": ProtoAnyCodable("remote"), "label": ProtoAnyCodable("Remote")],
let options: [[String: AnyCodable]] = [
["value": AnyCodable("local"), "label": AnyCodable("Local"), "hint": AnyCodable("This Mac")],
["value": AnyCodable("remote"), "label": AnyCodable("Remote")],
]
let step = WizardStep(
id: "step-2",
type: ProtoAnyCodable("select"),
type: AnyCodable("select"),
title: "Mode",
message: "Choose a mode",
options: options,
initialvalue: ProtoAnyCodable("local"),
initialvalue: AnyCodable("local"),
placeholder: nil,
sensitive: nil,
executor: nil)

@@ -15,9 +15,6 @@ extension URLSessionWebSocketTask: WebSocketTasking {}

public struct WebSocketTaskBox: @unchecked Sendable {
public let task: any WebSocketTasking
public init(task: any WebSocketTasking) {
self.task = task
}

public var state: URLSessionTask.State { self.task.state }

@@ -11,35 +11,6 @@ private struct NodeInvokeRequestPayload: Codable, Sendable {
var idempotencyKey: String?
}

// Ensures the timeout can win even if the invoke task never completes.
private actor InvokeTimeoutRace {
private var finished = false
private let continuation: CheckedContinuation<BridgeInvokeResponse, Never>
private var invokeTask: Task<Void, Never>?
private var timeoutTask: Task<Void, Never>?

init(continuation: CheckedContinuation<BridgeInvokeResponse, Never>) {
self.continuation = continuation
}

func registerTasks(invoke: Task<Void, Never>, timeout: Task<Void, Never>) {
self.invokeTask = invoke
self.timeoutTask = timeout
if finished {
invoke.cancel()
timeout.cancel()
}
}

func finish(_ response: BridgeInvokeResponse) {
guard !finished else { return }
finished = true
continuation.resume(returning: response)
invokeTask?.cancel()
timeoutTask?.cancel()
}
}

public actor GatewayNodeSession {
private let logger = Logger(subsystem: "com.clawdbot", category: "node.gateway")
private let decoder = JSONDecoder()
@@ -52,45 +23,6 @@ public actor GatewayNodeSession {
private var onConnected: (@Sendable () async -> Void)?
private var onDisconnected: (@Sendable (String) async -> Void)?
private var onInvoke: (@Sendable (BridgeInvokeRequest) async -> BridgeInvokeResponse)?

static func invokeWithTimeout(
request: BridgeInvokeRequest,
timeoutMs: Int?,
onInvoke: @escaping @Sendable (BridgeInvokeRequest) async -> BridgeInvokeResponse
) async -> BridgeInvokeResponse {
let timeout = max(0, timeoutMs ?? 0)
guard timeout > 0 else {
return await onInvoke(request)
}

let cappedTimeout = min(timeout, Int(UInt64.max / 1_000_000))
let timeoutResponse = BridgeInvokeResponse(
id: request.id,
ok: false,
error: ClawdbotNodeError(
code: .unavailable,
message: "node invoke timed out")
)

return await withCheckedContinuation { continuation in
let race = InvokeTimeoutRace(continuation: continuation)
let invokeTask = Task {
let response = await onInvoke(request)
await race.finish(response)
}
let timeoutTask = Task {
do {
try await Task.sleep(nanoseconds: UInt64(cappedTimeout) * 1_000_000)
} catch {
return
}
await race.finish(timeoutResponse)
}
Task {
await race.registerTasks(invoke: invokeTask, timeout: timeoutTask)
}
}
}
private var serverEventSubscribers: [UUID: AsyncStream<EventFrame>.Continuation] = [:]
private var canvasHostUrl: String?

@@ -235,11 +167,7 @@ public actor GatewayNodeSession {
let request = try self.decoder.decode(NodeInvokeRequestPayload.self, from: data)
guard let onInvoke else { return }
let req = BridgeInvokeRequest(id: request.id, command: request.command, paramsJSON: request.paramsJSON)
let response = await Self.invokeWithTimeout(
request: req,
timeoutMs: request.timeoutMs,
onInvoke: onInvoke
)
let response = await onInvoke(req)
await self.sendInvokeResult(request: request, response: response)
} catch {
self.logger.error("node invoke decode failed: \(error.localizedDescription, privacy: .public)")
@@ -252,10 +180,8 @@ public actor GatewayNodeSession {
"id": AnyCodable(request.id),
"nodeId": AnyCodable(request.nodeId),
"ok": AnyCodable(response.ok),
"payloadJSON": AnyCodable(response.payloadJSON ?? NSNull()),
]
if let payloadJSON = response.payloadJSON {
params["payloadJSON"] = AnyCodable(payloadJSON)
}
if let error = response.error {
params["error"] = AnyCodable([
"code": AnyCodable(error.code.rawValue),

@@ -473,7 +473,6 @@ public struct AgentParams: Codable, Sendable {
public let replychannel: String?
public let accountid: String?
public let replyaccountid: String?
public let threadid: String?
public let timeout: Int?
public let lane: String?
public let extrasystemprompt: String?
@@ -495,7 +494,6 @@ public struct AgentParams: Codable, Sendable {
replychannel: String?,
accountid: String?,
replyaccountid: String?,
threadid: String?,
timeout: Int?,
lane: String?,
extrasystemprompt: String?,
@@ -516,7 +514,6 @@ public struct AgentParams: Codable, Sendable {
self.replychannel = replychannel
self.accountid = accountid
self.replyaccountid = replyaccountid
self.threadid = threadid
self.timeout = timeout
self.lane = lane
self.extrasystemprompt = extrasystemprompt
@@ -538,7 +535,6 @@ public struct AgentParams: Codable, Sendable {
case replychannel = "replyChannel"
case accountid = "accountId"
case replyaccountid = "replyAccountId"
case threadid = "threadId"
case timeout
case lane
case extrasystemprompt = "extraSystemPrompt"
@@ -839,47 +835,35 @@ public struct SessionsListParams: Codable, Sendable {
public let activeminutes: Int?
public let includeglobal: Bool?
public let includeunknown: Bool?
public let includederivedtitles: Bool?
public let includelastmessage: Bool?
public let label: String?
public let spawnedby: String?
public let agentid: String?
public let search: String?

public init(
limit: Int?,
activeminutes: Int?,
includeglobal: Bool?,
includeunknown: Bool?,
includederivedtitles: Bool?,
includelastmessage: Bool?,
label: String?,
spawnedby: String?,
agentid: String?,
search: String?
agentid: String?
) {
self.limit = limit
self.activeminutes = activeminutes
self.includeglobal = includeglobal
self.includeunknown = includeunknown
self.includederivedtitles = includederivedtitles
self.includelastmessage = includelastmessage
self.label = label
self.spawnedby = spawnedby
self.agentid = agentid
self.search = search
}
private enum CodingKeys: String, CodingKey {
case limit
case activeminutes = "activeMinutes"
case includeglobal = "includeGlobal"
case includeunknown = "includeUnknown"
case includederivedtitles = "includeDerivedTitles"
case includelastmessage = "includeLastMessage"
case label
case spawnedby = "spawnedBy"
case agentid = "agentId"
case search
}
}

@@ -1340,9 +1324,6 @@ public struct ChannelsStatusResult: Codable, Sendable {
public let ts: Int
public let channelorder: [String]
public let channellabels: [String: AnyCodable]
public let channeldetaillabels: [String: AnyCodable]?
public let channelsystemimages: [String: AnyCodable]?
public let channelmeta: [[String: AnyCodable]]?
public let channels: [String: AnyCodable]
public let channelaccounts: [String: AnyCodable]
public let channeldefaultaccountid: [String: AnyCodable]
@@ -1351,9 +1332,6 @@ public struct ChannelsStatusResult: Codable, Sendable {
ts: Int,
channelorder: [String],
channellabels: [String: AnyCodable],
channeldetaillabels: [String: AnyCodable]?,
channelsystemimages: [String: AnyCodable]?,
channelmeta: [[String: AnyCodable]]?,
channels: [String: AnyCodable],
channelaccounts: [String: AnyCodable],
channeldefaultaccountid: [String: AnyCodable]
@@ -1361,9 +1339,6 @@ public struct ChannelsStatusResult: Codable, Sendable {
self.ts = ts
self.channelorder = channelorder
self.channellabels = channellabels
self.channeldetaillabels = channeldetaillabels
self.channelsystemimages = channelsystemimages
self.channelmeta = channelmeta
self.channels = channels
self.channelaccounts = channelaccounts
self.channeldefaultaccountid = channeldefaultaccountid
@@ -1372,9 +1347,6 @@ public struct ChannelsStatusResult: Codable, Sendable {
case ts
case channelorder = "channelOrder"
case channellabels = "channelLabels"
case channeldetaillabels = "channelDetailLabels"
case channelsystemimages = "channelSystemImages"
case channelmeta = "channelMeta"
case channels
case channelaccounts = "channelAccounts"
case channeldefaultaccountid = "channelDefaultAccountId"

@@ -1,78 +0,0 @@
import Foundation
import Testing
@testable import ClawdbotKit
import ClawdbotProtocol

struct GatewayNodeSessionTests {
@Test
func invokeWithTimeoutReturnsUnderlyingResponseBeforeTimeout() async {
let request = BridgeInvokeRequest(id: "1", command: "x", paramsJSON: nil)
let response = await GatewayNodeSession.invokeWithTimeout(
request: request,
timeoutMs: 50,
onInvoke: { req in
#expect(req.id == "1")
return BridgeInvokeResponse(id: req.id, ok: true, payloadJSON: "{}", error: nil)
}
)

#expect(response.ok == true)
#expect(response.error == nil)
#expect(response.payloadJSON == "{}")
}

@Test
func invokeWithTimeoutReturnsTimeoutError() async {
let request = BridgeInvokeRequest(id: "abc", command: "x", paramsJSON: nil)
let response = await GatewayNodeSession.invokeWithTimeout(
request: request,
timeoutMs: 10,
onInvoke: { _ in
try? await Task.sleep(nanoseconds: 200_000_000) // 200ms
return BridgeInvokeResponse(id: "abc", ok: true, payloadJSON: "{}", error: nil)
}
)

#expect(response.ok == false)
#expect(response.error?.code == .unavailable)
#expect(response.error?.message.contains("timed out") == true)
}

@Test
func invokeWithTimeoutReturnsWhenHandlerNeverCompletes() async {
let request = BridgeInvokeRequest(id: "stall", command: "x", paramsJSON: nil)
let response = try? await AsyncTimeout.withTimeoutMs(
timeoutMs: 200,
onTimeout: { NSError(domain: "GatewayNodeSessionTests", code: 1) },
operation: {
await GatewayNodeSession.invokeWithTimeout(
request: request,
timeoutMs: 10,
onInvoke: { _ in
await withCheckedContinuation { _ in }
}
)
}
)

#expect(response != nil)
#expect(response?.ok == false)
#expect(response?.error?.code == .unavailable)
}

@Test
func invokeWithTimeoutZeroDisablesTimeout() async {
let request = BridgeInvokeRequest(id: "1", command: "x", paramsJSON: nil)
let response = await GatewayNodeSession.invokeWithTimeout(
request: request,
timeoutMs: 0,
onInvoke: { req in
try? await Task.sleep(nanoseconds: 5_000_000)
return BridgeInvokeResponse(id: req.id, ok: true, payloadJSON: nil, error: nil)
}
)

#expect(response.ok == true)
#expect(response.error == nil)
}
}
@@ -21,7 +21,6 @@ Text is supported everywhere; media and reactions vary by channel.
- [Microsoft Teams](/channels/msteams) — Bot Framework; enterprise support (plugin, installed separately).
- [Nextcloud Talk](/channels/nextcloud-talk) — Self-hosted chat via Nextcloud Talk (plugin, installed separately).
- [Matrix](/channels/matrix) — Matrix protocol (plugin, installed separately).
- [Nostr](/channels/nostr) — Decentralized DMs via NIP-04 (plugin, installed separately).
- [Zalo](/channels/zalo) — Zalo Bot API; Vietnam's popular messenger (plugin, installed separately).
- [Zalo Personal](/channels/zalouser) — Zalo personal account via QR login (plugin, installed separately).
- [WebChat](/web/webchat) — Gateway WebChat UI over WebSocket.

@@ -1,235 +0,0 @@
---
summary: "Nostr DM channel via NIP-04 encrypted messages"
read_when:
- You want Clawdbot to receive DMs via Nostr
- You're setting up decentralized messaging
---
# Nostr

**Status:** Optional plugin (disabled by default).

Nostr is a decentralized protocol for social networking. This channel enables Clawdbot to receive and respond to encrypted direct messages (DMs) via NIP-04.

## Install (on demand)

### Onboarding (recommended)

- The onboarding wizard (`clawdbot onboard`) and `clawdbot channels add` list optional channel plugins.
- Selecting Nostr prompts you to install the plugin on demand.

Install defaults:

- **Dev channel + git checkout available:** uses the local plugin path.
- **Stable/Beta:** downloads from npm.

You can always override the choice in the prompt.

### Manual install

```bash
clawdbot plugins install @clawdbot/nostr
```

Use a local checkout (dev workflows):

```bash
clawdbot plugins install --link <path-to-clawdbot>/extensions/nostr
```

Restart the Gateway after installing or enabling plugins.

## Quick setup

1) Generate a Nostr keypair (if needed):

```bash
# Using nak
nak key generate
```

2) Add to config:

```json
{
  "channels": {
    "nostr": {
      "privateKey": "${NOSTR_PRIVATE_KEY}"
    }
  }
}
```

3) Export the key:

```bash
export NOSTR_PRIVATE_KEY="nsec1..."
```

4) Restart the Gateway.

## Configuration reference

| Key | Type | Default | Description |
| --- | --- | --- | --- |
| `privateKey` | string | required | Private key in `nsec` or hex format |
| `relays` | string[] | `['wss://relay.damus.io', 'wss://nos.lol']` | Relay URLs (WebSocket) |
| `dmPolicy` | string | `pairing` | DM access policy |
| `allowFrom` | string[] | `[]` | Allowed sender pubkeys |
| `enabled` | boolean | `true` | Enable/disable channel |
| `name` | string | - | Display name |
| `profile` | object | - | NIP-01 profile metadata |

## Profile metadata

Profile data is published as a NIP-01 `kind:0` event. You can manage it from the Control UI (Channels -> Nostr -> Profile) or set it directly in config.

Example:

```json
{
  "channels": {
    "nostr": {
      "privateKey": "${NOSTR_PRIVATE_KEY}",
      "profile": {
        "name": "clawdbot",
        "displayName": "Clawdbot",
        "about": "Personal assistant DM bot",
        "picture": "https://example.com/avatar.png",
        "banner": "https://example.com/banner.png",
        "website": "https://example.com",
        "nip05": "clawdbot@example.com",
        "lud16": "clawdbot@example.com"
      }
    }
  }
}
```

Notes:

- Profile URLs must use `https://`.
- Importing from relays merges fields and preserves local overrides.

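The merge rule can be sketched as follows (a hypothetical illustration of the described behavior, not the plugin's actual implementation; `NostrProfile` and `mergeProfiles` are made-up names):

```typescript
// Hypothetical sketch: imported relay fields fill gaps, local overrides win.
type NostrProfile = Record<string, string | undefined>;

function mergeProfiles(imported: NostrProfile, local: NostrProfile): NostrProfile {
  const merged: NostrProfile = { ...imported };
  for (const [key, value] of Object.entries(local)) {
    if (value !== undefined) merged[key] = value; // local override preserved
  }
  return merged;
}
```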
## Access control

### DM policies

- **pairing** (default): unknown senders get a pairing code.
- **allowlist**: only pubkeys in `allowFrom` can DM.
- **open**: public inbound DMs (requires `allowFrom: ["*"]`).
- **disabled**: ignore inbound DMs.

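For example, an open-DM config pairs the policy with the wildcard allowlist (illustrative values, following the config shape above):

```json
{
  "channels": {
    "nostr": {
      "privateKey": "${NOSTR_PRIVATE_KEY}",
      "dmPolicy": "open",
      "allowFrom": ["*"]
    }
  }
}
```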
### Allowlist example

```json
{
  "channels": {
    "nostr": {
      "privateKey": "${NOSTR_PRIVATE_KEY}",
      "dmPolicy": "allowlist",
      "allowFrom": ["npub1abc...", "npub1xyz..."]
    }
  }
}
```

## Key formats

Accepted formats:

- **Private key:** `nsec...` or 64-char hex
- **Pubkeys (`allowFrom`):** `npub...` or hex

## Relays

Defaults: `relay.damus.io` and `nos.lol`.

```json
{
  "channels": {
    "nostr": {
      "privateKey": "${NOSTR_PRIVATE_KEY}",
      "relays": [
        "wss://relay.damus.io",
        "wss://relay.primal.net",
        "wss://nostr.wine"
      ]
    }
  }
}
```

Tips:

- Use 2-3 relays for redundancy.
- Avoid too many relays (latency, duplication).
- Paid relays can improve reliability.
- Local relays are fine for testing (`ws://localhost:7777`).

## Protocol support

| NIP | Status | Description |
| --- | --- | --- |
| NIP-01 | Supported | Basic event format + profile metadata |
| NIP-04 | Supported | Encrypted DMs (`kind:4`) |
| NIP-17 | Planned | Gift-wrapped DMs |
| NIP-44 | Planned | Versioned encryption |

## Testing

### Local relay

```bash
# Start strfry
docker run -p 7777:7777 ghcr.io/hoytech/strfry
```

```json
{
  "channels": {
    "nostr": {
      "privateKey": "${NOSTR_PRIVATE_KEY}",
      "relays": ["ws://localhost:7777"]
    }
  }
}
```

### Manual test

1) Note the bot pubkey (npub) from logs.
2) Open a Nostr client (Damus, Amethyst, etc.).
3) DM the bot pubkey.
4) Verify the response.

## Troubleshooting

### Not receiving messages

- Verify the private key is valid.
- Ensure relay URLs are reachable and use `wss://` (or `ws://` for local).
- Confirm `enabled` is not `false`.
- Check Gateway logs for relay connection errors.

### Not sending responses

- Check relay accepts writes.
- Verify outbound connectivity.
- Watch for relay rate limits.

### Duplicate responses

- Expected when using multiple relays.
- Messages are deduplicated by event ID; only the first delivery triggers a response.

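The dedup behavior above can be illustrated with a minimal sketch (assumed shape, not the actual implementation):

```typescript
// Track event IDs already handled; a second delivery of the same
// event from another relay is dropped.
const seen = new Set<string>();

function shouldProcess(eventId: string): boolean {
  if (seen.has(eventId)) return false;
  seen.add(eventId);
  return true;
}
```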
## Security

- Never commit private keys.
- Use environment variables for keys.
- Consider `allowlist` for production bots.

## Limitations (MVP)

- Direct messages only (no group chats).
- No media attachments.
- NIP-04 only (NIP-17 gift-wrap planned).
@@ -286,11 +286,6 @@ WhatsApp can automatically send emoji reactions to incoming messages immediately
- CLI: `clawdbot message send --media <mp4> --gif-playback`
- Gateway: `send` params include `gifPlayback: true`

## Voice notes (PTT audio)
WhatsApp sends audio as **voice notes** (PTT bubble).
- Best results: OGG/Opus. Clawdbot rewrites `audio/ogg` to `audio/ogg; codecs=opus`.
- `[[audio_as_voice]]` is ignored for WhatsApp (audio already ships as voice note).

## Media limits + optimization
- Default outbound cap: 5 MB (per media item).
- Override: `agents.defaults.mediaMaxMb`.

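For example, the cap could be raised in config like this (the `16` is a hypothetical value, not a recommendation):

```json
{
  "agents": {
    "defaults": {
      "mediaMaxMb": 16
    }
  }
}
```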
@@ -14,16 +14,3 @@ Related:

Tip: run `clawdbot cron --help` for the full command surface.

## Common edits

Update delivery settings without changing the message:

```bash
clawdbot cron edit <job-id> --deliver --channel telegram --to "123456789"
```

Disable delivery for an isolated job:

```bash
clawdbot cron edit <job-id> --no-deliver
```

@@ -825,9 +825,9 @@ Common options:
- `--url`, `--token`, `--timeout`, `--json`

Subcommands:
- `nodes status [--connected] [--last-connected <duration>]`
- `nodes status`
- `nodes describe --node <id|name|ip>`
- `nodes list [--connected] [--last-connected <duration>]`
- `nodes list`
- `nodes pending`
- `nodes approve <requestId>`
- `nodes reject <requestId>`

@@ -18,22 +18,15 @@ Related:

```bash
clawdbot nodes list
clawdbot nodes list --connected
clawdbot nodes list --last-connected 24h
clawdbot nodes pending
clawdbot nodes approve <requestId>
clawdbot nodes status
clawdbot nodes status --connected
clawdbot nodes status --last-connected 24h
```

`nodes list` prints pending/paired tables. Paired rows include the most recent connect age (Last Connect).
Use `--connected` to only show currently-connected nodes. Use `--last-connected <duration>` to
filter to nodes that connected within a duration (e.g. `24h`, `7d`).

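A duration argument like `24h` or `7d` can be parsed along these lines (a hypothetical sketch; the real CLI may accept more units or spellings):

```typescript
// Milliseconds per supported unit: minutes, hours, days.
const UNIT_MS: Record<string, number> = { m: 60_000, h: 3_600_000, d: 86_400_000 };

function parseDurationMs(spec: string): number {
  const match = /^(\d+)([mhd])$/.exec(spec);
  if (!match) throw new Error(`invalid duration: ${spec}`);
  return Number(match[1]) * UNIT_MS[match[2]];
}
```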
## Invoke / run

```bash
clawdbot nodes invoke --node <id|name|ip> --command <command> --params <json>
clawdbot nodes run --node <id|name|ip> <command...>
```

@@ -21,4 +21,3 @@ clawdbot security audit --fix
```

The audit warns when multiple DM senders share the main session and recommends `session.dmScope="per-channel-peer"` for shared inboxes.
It also warns when small models (`<=300B`) are used without sandboxing and with web/browser tools enabled.

@@ -77,21 +77,6 @@ Client Gateway
safely retry; the server keeps a short‑lived dedupe cache.
- Nodes must include `role: "node"` plus caps/commands/permissions in `connect`.

## Pairing + local trust

- All WS clients (operators + nodes) include a **device identity** on `connect`.
- New device IDs require pairing approval; the Gateway issues a **device token**
  for subsequent connects.
- **Local** connects (loopback or the gateway host’s own tailnet address) can be
  auto‑approved to keep same‑host UX smooth.
- **Non‑local** connects must sign the `connect.challenge` nonce and require
  explicit approval.
- Gateway auth (`gateway.auth.*`) still applies to **all** connections, local or
  remote.

Details: [Gateway protocol](/gateway/protocol), [Pairing](/start/pairing),
[Security](/gateway/security).

## Protocol typing and codegen

- TypeBox schemas define the protocol.

@@ -1774,7 +1774,6 @@ Note: `applyPatch` is only under `tools.exec`.
- `tools.web.fetch.maxChars` (default 50000)
- `tools.web.fetch.timeoutSeconds` (default 30)
- `tools.web.fetch.cacheTtlMinutes` (default 15)
- `tools.web.fetch.maxRedirects` (default 3)
- `tools.web.fetch.userAgent` (optional override)
- `tools.web.fetch.readability` (default true; disable to use basic HTML cleanup only)
- `tools.web.fetch.firecrawl.enabled` (default true when an API key is set)
@@ -2615,13 +2614,10 @@ Defaults:
// noSandbox: false,
// executablePath: "/Applications/Brave Browser.app/Contents/MacOS/Brave Browser",
// attachOnly: false, // set true when tunneling a remote CDP to localhost
// snapshotDefaults: { mode: "efficient" }, // tool/CLI default snapshot preset
}
}
```

Note: `browser.snapshotDefaults` only affects Clawdbot's browser tool + CLI. Direct HTTP clients must pass `mode` explicitly.

### `ui` (Appearance)

Optional accent color used by the native apps for UI chrome (e.g. Talk Mode bubble tint).

@@ -195,8 +195,6 @@ The Gateway treats these as **claims** and enforces server-side allowlists.
- Gateways issue tokens per device + role.
- Pairing approvals are required for new device IDs unless local auto-approval
  is enabled.
- **Local** connects include loopback and the gateway host’s own tailnet address
  (so same‑host tailnet binds can still auto‑approve).
- All WS clients must include `device` identity during `connect` (operator + node).
- Non-local connections must sign the server-provided `connect.challenge` nonce.

@@ -177,7 +177,6 @@ Recommendations:
- **Use the latest generation, best-tier model** for any bot that can run tools or touch files/networks.
- **Avoid weaker tiers** (for example, Sonnet or Haiku) for tool-enabled agents or untrusted inboxes.
- If you must use a smaller model, **reduce blast radius** (read-only tools, strong sandboxing, minimal filesystem access, strict allowlists).
- When running small models, **enable sandboxing for all sessions** and **disable web_search/web_fetch/browser** unless inputs are tightly controlled.

## Reasoning & verbose output in groups

@@ -270,12 +269,6 @@ Note: `gateway.remote.token` is **only** for remote CLI calls; it does not
protect local WS access.
Optional: pin remote TLS with `gateway.remote.tlsFingerprint` when using `wss://`.

Local device pairing:
- Device pairing is auto‑approved for **local** connects (loopback or the
  gateway host’s own tailnet address) to keep same‑host clients smooth.
- Other tailnet peers are **not** treated as local; they still need pairing
  approval.

Auth modes:
- `gateway.auth.mode: "token"`: shared bearer token (recommended for most setups).
- `gateway.auth.mode: "password"`: password auth (prefer setting via env: `CLAWDBOT_GATEWAY_PASSWORD`).

131
docs/logging.md
@@ -138,45 +138,17 @@ Redaction affects **console output only** and does not alter file logs.

## Diagnostics + OpenTelemetry

Diagnostics are structured, machine-readable events for model runs **and**
message-flow telemetry (webhooks, queueing, session state). They do **not**
replace logs; they exist to feed metrics, traces, and other exporters.
Diagnostics are **opt-in** structured events for model runs (usage + cost +
context + duration). They do **not** replace logs; they exist to feed metrics,
traces, and other exporters.

Diagnostics events are emitted in-process, but exporters only attach when
diagnostics + the exporter plugin are enabled.
Clawdbot currently emits a `model.usage` event after each agent run with:

### OpenTelemetry vs OTLP

- **OpenTelemetry (OTel)**: the data model + SDKs for traces, metrics, and logs.
- **OTLP**: the wire protocol used to export OTel data to a collector/backend.
- Clawdbot exports via **OTLP/HTTP (protobuf)** today.

### Signals exported

- **Metrics**: counters + histograms (token usage, message flow, queueing).
- **Traces**: spans for model usage + webhook/message processing.
- **Logs**: exported over OTLP when `diagnostics.otel.logs` is enabled. Log
  volume can be high; keep `logging.level` and exporter filters in mind.

### Diagnostic event catalog

Model usage:
- `model.usage`: tokens, cost, duration, context, provider/model/channel, session ids.

Message flow:
- `webhook.received`: webhook ingress per channel.
- `webhook.processed`: webhook handled + duration.
- `webhook.error`: webhook handler errors.
- `message.queued`: message enqueued for processing.
- `message.processed`: outcome + duration + optional error.

Queue + session:
- `queue.lane.enqueue`: command queue lane enqueue + depth.
- `queue.lane.dequeue`: command queue lane dequeue + wait time.
- `session.state`: session state transition + reason.
- `session.stuck`: session stuck warning + age.
- `run.attempt`: run retry/attempt metadata.
- `diagnostic.heartbeat`: aggregate counters (webhooks/queue/session).
- Token counts (input/output/cache/prompt/total)
- Estimated cost (USD)
- Context window used/limit
- Duration (ms)
- Provider/channel/model + session identifiers

### Enable diagnostics (no exporter)

@@ -214,7 +186,6 @@ works with any OpenTelemetry collector/backend that accepts OTLP/HTTP.
|
||||
"serviceName": "clawdbot-gateway",
|
||||
"traces": true,
|
||||
"metrics": true,
|
||||
"logs": true,
|
||||
"sampleRate": 0.2,
|
||||
"flushIntervalMs": 60000
|
||||
}
|
||||
@@ -224,91 +195,13 @@ works with any OpenTelemetry collector/backend that accepts OTLP/HTTP.
|
||||
|
||||
Notes:
|
||||
- You can also enable the plugin with `clawdbot plugins enable diagnostics-otel`.
|
||||
- `protocol` currently supports `http/protobuf` only. `grpc` is ignored.
|
||||
- Metrics include token usage, cost, context size, run duration, and message-flow
|
||||
counters/histograms (webhooks, queueing, session state, queue depth/wait).
|
||||
- Traces/metrics can be toggled with `traces` / `metrics` (default: on). Traces
|
||||
include model usage spans plus webhook/message processing spans when enabled.
|
||||
- `protocol` currently supports `http/protobuf`.
|
||||
- Metrics include token usage, cost, context size, and run duration.
|
||||
- Traces/metrics can be toggled with `traces` / `metrics` (default: on).
|
||||
- Set `headers` when your collector requires auth.
|
||||
- Environment variables supported: `OTEL_EXPORTER_OTLP_ENDPOINT`,
|
||||
`OTEL_SERVICE_NAME`, `OTEL_EXPORTER_OTLP_PROTOCOL`.
|
||||
|
||||
### Exported metrics (names + types)

Model usage:
- `clawdbot.tokens` (counter, attrs: `clawdbot.token`, `clawdbot.channel`, `clawdbot.provider`, `clawdbot.model`)
- `clawdbot.cost.usd` (counter, attrs: `clawdbot.channel`, `clawdbot.provider`, `clawdbot.model`)
- `clawdbot.run.duration_ms` (histogram, attrs: `clawdbot.channel`, `clawdbot.provider`, `clawdbot.model`)
- `clawdbot.context.tokens` (histogram, attrs: `clawdbot.context`, `clawdbot.channel`, `clawdbot.provider`, `clawdbot.model`)

Message flow:
- `clawdbot.webhook.received` (counter, attrs: `clawdbot.channel`, `clawdbot.webhook`)
- `clawdbot.webhook.error` (counter, attrs: `clawdbot.channel`, `clawdbot.webhook`)
- `clawdbot.webhook.duration_ms` (histogram, attrs: `clawdbot.channel`, `clawdbot.webhook`)
- `clawdbot.message.queued` (counter, attrs: `clawdbot.channel`, `clawdbot.source`)
- `clawdbot.message.processed` (counter, attrs: `clawdbot.channel`, `clawdbot.outcome`)
- `clawdbot.message.duration_ms` (histogram, attrs: `clawdbot.channel`, `clawdbot.outcome`)

Queues + sessions:
- `clawdbot.queue.lane.enqueue` (counter, attrs: `clawdbot.lane`)
- `clawdbot.queue.lane.dequeue` (counter, attrs: `clawdbot.lane`)
- `clawdbot.queue.depth` (histogram, attrs: `clawdbot.lane` or `clawdbot.channel=heartbeat`)
- `clawdbot.queue.wait_ms` (histogram, attrs: `clawdbot.lane`)
- `clawdbot.session.state` (counter, attrs: `clawdbot.state`, `clawdbot.reason`)
- `clawdbot.session.stuck` (counter, attrs: `clawdbot.state`)
- `clawdbot.session.stuck_age_ms` (histogram, attrs: `clawdbot.state`)
- `clawdbot.run.attempt` (counter, attrs: `clawdbot.attempt`)

### Exported spans (names + key attributes)

- `clawdbot.model.usage`
  - `clawdbot.channel`, `clawdbot.provider`, `clawdbot.model`
  - `clawdbot.sessionKey`, `clawdbot.sessionId`
  - `clawdbot.tokens.*` (input/output/cache_read/cache_write/total)
- `clawdbot.webhook.processed`
  - `clawdbot.channel`, `clawdbot.webhook`, `clawdbot.chatId`
- `clawdbot.webhook.error`
  - `clawdbot.channel`, `clawdbot.webhook`, `clawdbot.chatId`, `clawdbot.error`
- `clawdbot.message.processed`
  - `clawdbot.channel`, `clawdbot.outcome`, `clawdbot.chatId`, `clawdbot.messageId`, `clawdbot.sessionKey`, `clawdbot.sessionId`, `clawdbot.reason`
- `clawdbot.session.stuck`
  - `clawdbot.state`, `clawdbot.ageMs`, `clawdbot.queueDepth`, `clawdbot.sessionKey`, `clawdbot.sessionId`

### Sampling + flushing

- Trace sampling: `diagnostics.otel.sampleRate` (0.0–1.0, root spans only).
- Metric export interval: `diagnostics.otel.flushIntervalMs` (min 1000ms).

### Protocol notes

- OTLP/HTTP endpoints can be set via `diagnostics.otel.endpoint` or `OTEL_EXPORTER_OTLP_ENDPOINT`.
- If the endpoint already contains `/v1/traces` or `/v1/metrics`, it is used as-is.
- If the endpoint already contains `/v1/logs`, it is used as-is for logs.
- `diagnostics.otel.logs` enables OTLP log export for the main logger output.

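The same settings can come from the environment instead of config; a minimal sketch, with placeholder values you should point at your own collector:

```shell
# Placeholder endpoint and names; adjust for your collector.
export OTEL_EXPORTER_OTLP_ENDPOINT="http://otel-collector:4318"
export OTEL_SERVICE_NAME="clawdbot-gateway"
export OTEL_EXPORTER_OTLP_PROTOCOL="http/protobuf"
```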
### Log export behavior

- OTLP logs use the same structured records written to `logging.file`.
- OTLP logs respect `logging.level` (the file log level). Console redaction does **not** apply to OTLP logs.
- High-volume installs should prefer OTLP collector sampling/filtering.

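One way to do that collector-side filtering is with standard OpenTelemetry Collector processors. This is a sketch assuming the collector-contrib distribution (which provides `probabilistic_sampler` and `filter`); tune the values to your volume:

```yaml
processors:
  # Keep roughly 20% of traces.
  probabilistic_sampler:
    sampling_percentage: 20
  # Drop log records below INFO before export.
  filter/logs:
    logs:
      log_record:
        - 'severity_number < SEVERITY_NUMBER_INFO'
```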
## Troubleshooting tips

- **Gateway not reachable?** Run `clawdbot doctor` first.

---
summary: "Network hub: gateway surfaces, pairing, discovery, and security"
read_when:
  - You need the network architecture + security overview
  - You are debugging local vs tailnet access or pairing
  - You want the canonical list of networking docs
---

# Network hub

This hub links the core docs for how Clawdbot connects, pairs, and secures devices across localhost, LAN, and tailnet.

## Core model

- [Gateway architecture](/concepts/architecture)
- [Gateway protocol](/gateway/protocol)
- [Gateway runbook](/gateway)
- [Web surfaces + bind modes](/web)

## Pairing + identity

- [Pairing overview (DM + nodes)](/start/pairing)
- [Gateway-owned node pairing](/gateway/pairing)
- [Devices CLI (pairing + token rotation)](/cli/devices)
- [Pairing CLI (DM approvals)](/cli/pairing)

Local trust:
- Local connections (loopback or the gateway host’s own tailnet address) can be auto‑approved for pairing to keep same‑host UX smooth.
- Non‑local tailnet/LAN clients still require explicit pairing approval.

## Discovery + transports

- [Discovery & transports](/gateway/discovery)
- [Bonjour / mDNS](/gateway/bonjour)
- [Remote access (SSH)](/gateway/remote)
- [Tailscale](/gateway/tailscale)

## Nodes + bridge

- [Nodes overview](/nodes)
- [Bridge protocol (legacy nodes)](/gateway/bridge-protocol)
- [Node runbook: iOS](/platforms/ios)
- [Node runbook: Android](/platforms/android)

## Security

- [Security overview](/gateway/security)
- [Gateway config reference](/gateway/configuration)
- [Troubleshooting](/gateway/troubleshooting)
- [Doctor](/gateway/doctor)
Notes:
- `nodes rename` stores a display name override in the gateway pairing store.

## Remote node host (system.run)

Use a **node host** when your Gateway runs on one machine and you want commands to execute on another. The model still talks to the **gateway**; the gateway forwards `exec` calls to the **node host** when `host=node` is selected.

### What runs where

- **Gateway host**: receives messages, runs the model, routes tool calls.
- **Node host**: executes `system.run`/`system.which` on the node machine.
- **Approvals**: enforced on the node host via `~/.clawdbot/exec-approvals.json`.

### Start a node host (foreground)

On the node machine:

```bash
clawdbot node start --host <gateway-host> --port 18789 --display-name "Build Node"
```

### Start a node host (service)

```bash
clawdbot node service install --host <gateway-host> --port 18789 --display-name "Build Node"
clawdbot node service start
```

### Pair + name

On the gateway host:

```bash
clawdbot nodes pending
clawdbot nodes approve <requestId>
clawdbot nodes list
```

Naming options:
- `--display-name` on `clawdbot node start/service install` (persists in `~/.clawdbot/node.json` on the node).
- `clawdbot nodes rename --node <id|name|ip> --name "Build Node"` (gateway override).

### Allowlist the commands

Exec approvals are **per node host**. Add allowlist entries from the gateway:

```bash
clawdbot approvals allowlist add --node <id|name|ip> "/usr/bin/uname"
clawdbot approvals allowlist add --node <id|name|ip> "/usr/bin/sw_vers"
```

Approvals live on the node host at `~/.clawdbot/exec-approvals.json`.

### Point exec at the node

Configure defaults (gateway config):

```bash
clawdbot config set tools.exec.host node
clawdbot config set tools.exec.security allowlist
clawdbot config set tools.exec.node "<id-or-name>"
```

Or per session:

```
/exec host=node security=allowlist node=<id-or-name>
```

Once set, any `exec` call with `host=node` runs on the node host (subject to the node allowlist/approvals).

Related:
- [Node host CLI](/cli/node)
- [Exec tool](/tools/exec)
- [Exec approvals](/tools/exec-approvals)

## Invoking commands

Low-level (raw RPC):
Notes:
- The node host stores its node id + pairing token in `~/.clawdbot/node.json`.
- Exec approvals are enforced locally via `~/.clawdbot/exec-approvals.json` (see [Exec approvals](/tools/exec-approvals)).
- On macOS, the headless node host prefers the companion app exec host when reachable and falls back to local execution if the app is unavailable. Set `CLAWDBOT_NODE_EXEC_HOST=app` to require the app, or `CLAWDBOT_NODE_EXEC_FALLBACK=0` to disable fallback.
- Add `--tls` / `--tls-fingerprint` when the bridge requires TLS.

## Mac node mode

Repair/migrate:

```
clawdbot doctor
```

## Advanced: expose WSL services over LAN (portproxy)

WSL has its own virtual network. If another machine needs to reach a service running **inside WSL** (SSH, a local TTS server, or the Gateway), you must forward a Windows port to the current WSL IP. The WSL IP changes after restarts, so you may need to refresh the forwarding rule.

Example (PowerShell **as Administrator**):

```powershell
$Distro = "Ubuntu-24.04"
$ListenPort = 2222
$TargetPort = 22

$WslIp = (wsl -d $Distro -- hostname -I).Trim().Split(" ")[0]
if (-not $WslIp) { throw "WSL IP not found." }

netsh interface portproxy add v4tov4 listenaddress=0.0.0.0 listenport=$ListenPort `
  connectaddress=$WslIp connectport=$TargetPort
```

Allow the port through Windows Firewall (one-time):

```powershell
New-NetFirewallRule -DisplayName "WSL SSH $ListenPort" -Direction Inbound `
  -Protocol TCP -LocalPort $ListenPort -Action Allow
```

Refresh the portproxy after WSL restarts:

```powershell
netsh interface portproxy delete v4tov4 listenport=$ListenPort listenaddress=0.0.0.0 | Out-Null
netsh interface portproxy add v4tov4 listenport=$ListenPort listenaddress=0.0.0.0 `
  connectaddress=$WslIp connectport=$TargetPort | Out-Null
```

Notes:
- SSH from another machine targets the **Windows host IP** (example: `ssh user@windows-host -p 2222`).
- Remote nodes must point at a **reachable** Gateway URL (not `127.0.0.1`); use `clawdbot status --all` to confirm.
- Use `listenaddress=0.0.0.0` for LAN access; `127.0.0.1` keeps it local only.
- If you want this automatic, register a Scheduled Task to run the refresh step at login.

## Step-by-step WSL2 install

### 1) Install WSL2 + Ubuntu

See [Voice Call](/plugins/voice-call) for a concrete example plugin.

- [Voice Call](/plugins/voice-call) — `@clawdbot/voice-call`
- [Zalo Personal](/plugins/zalouser) — `@clawdbot/zalouser`
- [Matrix](/channels/matrix) — `@clawdbot/matrix`
- [Nostr](/channels/nostr) — `@clawdbot/nostr`
- [Zalo](/channels/zalo) — `@clawdbot/zalo`
- [Microsoft Teams](/channels/msteams) — `@clawdbot/msteams`
- Google Antigravity OAuth (provider auth) — bundled as `google-antigravity-auth` (disabled by default)

## Core concepts

- [Architecture](/concepts/architecture)
- [Network hub](/network)
- [Agent runtime](/concepts/agent)
- [Agent workspace](/concepts/agent-workspace)
- [Memory](/concepts/memory)

Notes:
- `--format ai` (default when Playwright is installed): returns an AI snapshot with numeric refs (`aria-ref="<n>"`).
- `--format aria`: returns the accessibility tree (no refs; inspection only).
- `--efficient` (or `--mode efficient`): compact role snapshot preset (interactive + compact + depth + lower maxChars).
- Config default (tool/CLI only): set `browser.snapshotDefaults.mode: "efficient"` to use efficient snapshots when the caller does not pass a mode (see [Gateway configuration](/gateway/configuration#browser-clawd-managed-browser)).
- Role snapshot options (`--interactive`, `--compact`, `--depth`, `--selector`) force a role-based snapshot with refs like `ref=e12`.
- `--frame "<iframe selector>"` scopes role snapshots to an iframe (pairs with role refs like `e12`).
- `--interactive` outputs a flat, easy-to-pick list of interactive elements (best for driving actions).

Notes:

- `tools.exec.notifyOnExit` (default: true): when true, backgrounded exec sessions enqueue a system event and request a heartbeat on exit.
- `tools.exec.host` (default: `sandbox`)
- `tools.exec.security` (default: `deny` for sandbox, `allowlist` for gateway + node when unset)
- `tools.exec.ask` (default: `on-miss`)
- `tools.exec.node` (default: unset)
- `tools.exec.pathPrepend`: list of directories to prepend to `PATH` for exec runs.

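In config-file form, the same defaults might look like this. This is a sketch assuming the keys live under `tools.exec` in `clawdbot.json`; the path value is illustrative:

```json
{
  "tools": {
    "exec": {
      "host": "sandbox",
      "security": "allowlist",
      "ask": "on-miss",
      "notifyOnExit": true,
      "pathPrepend": ["/opt/homebrew/bin"]
    }
  }
}
```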
Fields under `metadata.clawdbot`:

- `requires.env` — list; each env var must exist **or** be provided in config.
- `requires.config` — list of `clawdbot.json` paths that must be truthy.
- `primaryEnv` — env var name associated with `skills.entries.<name>.apiKey`.
- `install` — optional array of installer specs used by the macOS Skills UI (brew/node/go/uv/download).

Note on sandboxing:
- `requires.bins` is checked on the **host** at skill load time.

Notes:
- If multiple installers are listed, the gateway picks a **single** preferred option (brew when available, otherwise node).
- If all installers are `download`, Clawdbot lists each entry so you can see the available artifacts.
- Installer specs can include `os: ["darwin"|"linux"|"win32"]` to filter options by platform.
- Node installs honor `skills.install.nodeManager` in `clawdbot.json` (default: npm; options: npm/pnpm/yarn/bun). This only affects **skill installs**; the Gateway runtime should still be Node (Bun is not recommended for WhatsApp/Telegram).
- Go installs: if `go` is missing and `brew` is available, the gateway installs Go via Homebrew first and sets `GOBIN` to Homebrew’s `bin` when possible.
- Download installs: `url` (required), `archive` (`tar.gz` | `tar.bz2` | `zip`), `extract` (default: auto when archive detected), `stripComponents`, `targetDir` (default: `~/.clawdbot/tools/<skillKey>`).

If no `metadata.clawdbot` is present, the skill is always eligible (unless disabled in config or blocked by `skills.allowBundled` for bundled skills).

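Putting the download-installer fields together, a skill metadata block might look like the sketch below. The tool name, URL, and overall spec shape are hypothetical illustrations built from the documented fields (`url`, `archive`, `stripComponents`, `targetDir`, `os`), not a verbatim schema:

```json
{
  "clawdbot": {
    "requires": { "bins": ["mytool"] },
    "install": [
      {
        "os": ["linux"],
        "url": "https://example.com/mytool-linux-amd64.tar.gz",
        "archive": "tar.gz",
        "stripComponents": 1,
        "targetDir": "~/.clawdbot/tools/mytool"
      }
    ]
  }
}
```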
Fetch a URL and extract readable content.

  maxChars: 50000,
  timeoutSeconds: 30,
  cacheTtlMinutes: 15,
  maxRedirects: 3,
  userAgent: "Mozilla/5.0 (Macintosh; Intel Mac OS X 14_7_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36",
  readability: true,
  firecrawl: {

Notes:
- `web_fetch` uses Readability (main-content extraction) first, then Firecrawl (if configured). If both fail, the tool returns an error.
- Firecrawl requests use bot-circumvention mode and cache results by default.
- `web_fetch` sends a Chrome-like User-Agent and `Accept-Language` by default; override `userAgent` if needed.
- `web_fetch` blocks private/internal hostnames and re-checks redirects (limit with `maxRedirects`).
- `web_fetch` is best-effort extraction; some sites will need the browser tool.
- See [Firecrawl](/tools/firecrawl) for key setup and service details.
- Responses are cached (default 15 minutes) to reduce repeated fetches.

  },
  "dependencies": {
    "@opentelemetry/api": "^1.9.0",
    "@opentelemetry/api-logs": "^0.210.0",
    "@opentelemetry/exporter-logs-otlp-http": "^0.210.0",
    "@opentelemetry/exporter-metrics-otlp-http": "^0.210.0",
    "@opentelemetry/exporter-trace-otlp-http": "^0.210.0",
    "@opentelemetry/resources": "^2.4.0",
    "@opentelemetry/sdk-logs": "^0.210.0",
    "@opentelemetry/sdk-metrics": "^2.4.0",
    "@opentelemetry/sdk-node": "^0.210.0",
    "@opentelemetry/sdk-trace-base": "^2.4.0",

import { beforeEach, describe, expect, test, vi } from "vitest";

const registerLogTransportMock = vi.hoisted(() => vi.fn());

const telemetryState = vi.hoisted(() => {
  const counters = new Map<string, { add: ReturnType<typeof vi.fn> }>();
  const histograms = new Map<string, { record: ReturnType<typeof vi.fn> }>();
  const tracer = {
    startSpan: vi.fn((_name: string, _opts?: unknown) => ({
      end: vi.fn(),
      setStatus: vi.fn(),
    })),
  };
  const meter = {
    createCounter: vi.fn((name: string) => {
      const counter = { add: vi.fn() };
      counters.set(name, counter);
      return counter;
    }),
    createHistogram: vi.fn((name: string) => {
      const histogram = { record: vi.fn() };
      histograms.set(name, histogram);
      return histogram;
    }),
  };
  return { counters, histograms, tracer, meter };
});

const sdkStart = vi.hoisted(() => vi.fn().mockResolvedValue(undefined));
const sdkShutdown = vi.hoisted(() => vi.fn().mockResolvedValue(undefined));
const logEmit = vi.hoisted(() => vi.fn());
const logShutdown = vi.hoisted(() => vi.fn().mockResolvedValue(undefined));

vi.mock("@opentelemetry/api", () => ({
  metrics: {
    getMeter: () => telemetryState.meter,
  },
  trace: {
    getTracer: () => telemetryState.tracer,
  },
  SpanStatusCode: {
    ERROR: 2,
  },
}));

vi.mock("@opentelemetry/sdk-node", () => ({
  NodeSDK: class {
    start = sdkStart;
    shutdown = sdkShutdown;
  },
}));

vi.mock("@opentelemetry/exporter-metrics-otlp-http", () => ({
  OTLPMetricExporter: class {},
}));

vi.mock("@opentelemetry/exporter-trace-otlp-http", () => ({
  OTLPTraceExporter: class {},
}));

vi.mock("@opentelemetry/exporter-logs-otlp-http", () => ({
  OTLPLogExporter: class {},
}));

vi.mock("@opentelemetry/sdk-logs", () => ({
  BatchLogRecordProcessor: class {},
  LoggerProvider: class {
    addLogRecordProcessor = vi.fn();
    getLogger = vi.fn(() => ({
      emit: logEmit,
    }));
    shutdown = logShutdown;
  },
}));

vi.mock("@opentelemetry/sdk-metrics", () => ({
  PeriodicExportingMetricReader: class {},
}));

vi.mock("@opentelemetry/sdk-trace-base", () => ({
  ParentBasedSampler: class {},
  TraceIdRatioBasedSampler: class {},
}));

vi.mock("@opentelemetry/resources", () => ({
  Resource: class {
    // eslint-disable-next-line @typescript-eslint/no-useless-constructor
    constructor(_value?: unknown) {}
  },
}));

vi.mock("@opentelemetry/semantic-conventions", () => ({
  SemanticResourceAttributes: {
    SERVICE_NAME: "service.name",
  },
}));

vi.mock("clawdbot/plugin-sdk", async () => {
  const actual = await vi.importActual<typeof import("clawdbot/plugin-sdk")>("clawdbot/plugin-sdk");
  return {
    ...actual,
    registerLogTransport: registerLogTransportMock,
  };
});

import { createDiagnosticsOtelService } from "./service.js";
import { emitDiagnosticEvent } from "clawdbot/plugin-sdk";

describe("diagnostics-otel service", () => {
  beforeEach(() => {
    telemetryState.counters.clear();
    telemetryState.histograms.clear();
    telemetryState.tracer.startSpan.mockClear();
    telemetryState.meter.createCounter.mockClear();
    telemetryState.meter.createHistogram.mockClear();
    sdkStart.mockClear();
    sdkShutdown.mockClear();
    logEmit.mockClear();
    logShutdown.mockClear();
    registerLogTransportMock.mockReset();
  });

  test("records message-flow metrics and spans", async () => {
    const registeredTransports: Array<(logObj: Record<string, unknown>) => void> = [];
    const stopTransport = vi.fn();
    registerLogTransportMock.mockImplementation((transport) => {
      registeredTransports.push(transport);
      return stopTransport;
    });

    const service = createDiagnosticsOtelService();
    await service.start({
      config: {
        diagnostics: {
          enabled: true,
          otel: {
            enabled: true,
            endpoint: "http://otel-collector:4318",
            protocol: "http/protobuf",
            traces: true,
            metrics: true,
            logs: true,
          },
        },
      },
      logger: {
        info: vi.fn(),
        warn: vi.fn(),
        error: vi.fn(),
        debug: vi.fn(),
      },
    });

    emitDiagnosticEvent({
      type: "webhook.received",
      channel: "telegram",
      updateType: "telegram-post",
    });
    emitDiagnosticEvent({
      type: "webhook.processed",
      channel: "telegram",
      updateType: "telegram-post",
      durationMs: 120,
    });
    emitDiagnosticEvent({
      type: "message.queued",
      channel: "telegram",
      source: "telegram",
      queueDepth: 2,
    });
    emitDiagnosticEvent({
      type: "message.processed",
      channel: "telegram",
      outcome: "completed",
      durationMs: 55,
    });
    emitDiagnosticEvent({
      type: "queue.lane.dequeue",
      lane: "main",
      queueSize: 3,
      waitMs: 10,
    });
    emitDiagnosticEvent({
      type: "session.stuck",
      state: "processing",
      ageMs: 125_000,
    });
    emitDiagnosticEvent({
      type: "run.attempt",
      runId: "run-1",
      attempt: 2,
    });

    expect(telemetryState.counters.get("clawdbot.webhook.received")?.add).toHaveBeenCalled();
    expect(telemetryState.histograms.get("clawdbot.webhook.duration_ms")?.record).toHaveBeenCalled();
    expect(telemetryState.counters.get("clawdbot.message.queued")?.add).toHaveBeenCalled();
    expect(telemetryState.counters.get("clawdbot.message.processed")?.add).toHaveBeenCalled();
    expect(telemetryState.histograms.get("clawdbot.message.duration_ms")?.record).toHaveBeenCalled();
    expect(telemetryState.histograms.get("clawdbot.queue.wait_ms")?.record).toHaveBeenCalled();
    expect(telemetryState.counters.get("clawdbot.session.stuck")?.add).toHaveBeenCalled();
    expect(telemetryState.histograms.get("clawdbot.session.stuck_age_ms")?.record).toHaveBeenCalled();
    expect(telemetryState.counters.get("clawdbot.run.attempt")?.add).toHaveBeenCalled();

    const spanNames = telemetryState.tracer.startSpan.mock.calls.map((call) => call[0]);
    expect(spanNames).toContain("clawdbot.webhook.processed");
    expect(spanNames).toContain("clawdbot.message.processed");
    expect(spanNames).toContain("clawdbot.session.stuck");

    expect(registerLogTransportMock).toHaveBeenCalledTimes(1);
    expect(registeredTransports).toHaveLength(1);
    registeredTransports[0]?.({
      0: "{\"subsystem\":\"diagnostic\"}",
      1: "hello",
      _meta: { logLevelName: "INFO", date: new Date() },
    });
    expect(logEmit).toHaveBeenCalled();

    await service.stop?.();
  });
});
import { metrics, trace, SpanStatusCode } from "@opentelemetry/api";
import type { SeverityNumber } from "@opentelemetry/api-logs";
import { OTLPLogExporter } from "@opentelemetry/exporter-logs-otlp-http";
import { OTLPMetricExporter } from "@opentelemetry/exporter-metrics-otlp-http";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";
import { Resource } from "@opentelemetry/resources";
import { BatchLogRecordProcessor, LoggerProvider } from "@opentelemetry/sdk-logs";
import { PeriodicExportingMetricReader } from "@opentelemetry/sdk-metrics";
import { NodeSDK } from "@opentelemetry/sdk-node";
import { ParentBasedSampler, TraceIdRatioBasedSampler } from "@opentelemetry/sdk-trace-base";
import { SemanticResourceAttributes } from "@opentelemetry/semantic-conventions";

import type { ClawdbotPluginService, DiagnosticEventPayload } from "clawdbot/plugin-sdk";
import { onDiagnosticEvent, registerLogTransport } from "clawdbot/plugin-sdk";

const DEFAULT_SERVICE_NAME = "clawdbot";

export function createDiagnosticsOtelService(): ClawdbotPluginService {
  let sdk: NodeSDK | null = null;
  let logProvider: LoggerProvider | null = null;
  let stopLogTransport: (() => void) | null = null;
  let unsubscribe: (() => void) | null = null;

  return {

      const tracesEnabled = otel.traces !== false;
      const metricsEnabled = otel.metrics !== false;
      const logsEnabled = otel.logs === true;
      if (!tracesEnabled && !metricsEnabled && !logsEnabled) return;

      const resource = new Resource({
        [SemanticResourceAttributes.SERVICE_NAME]: serviceName,

      const traceUrl = resolveOtelUrl(endpoint, "v1/traces");
      const metricUrl = resolveOtelUrl(endpoint, "v1/metrics");
      const logUrl = resolveOtelUrl(endpoint, "v1/logs");
      const traceExporter = tracesEnabled
        ? new OTLPTraceExporter({
            ...(traceUrl ? { url: traceUrl } : {}),

          })
        : undefined;

      if (tracesEnabled || metricsEnabled) {
        sdk = new NodeSDK({
          resource,
          ...(traceExporter ? { traceExporter } : {}),
          ...(metricReader ? { metricReader } : {}),
          ...(sampleRate !== undefined
            ? {
                sampler: new ParentBasedSampler({
                  root: new TraceIdRatioBasedSampler(sampleRate),
                }),
              }
            : {}),
        });

        await sdk.start();
      }

      const logSeverityMap: Record<string, SeverityNumber> = {
        TRACE: 1 as SeverityNumber,
        DEBUG: 5 as SeverityNumber,
        INFO: 9 as SeverityNumber,
        WARN: 13 as SeverityNumber,
        ERROR: 17 as SeverityNumber,
        FATAL: 21 as SeverityNumber,
      };

      const meter = metrics.getMeter("clawdbot");
      const tracer = trace.getTracer("clawdbot");

        unit: "1",
        description: "Context window size and usage",
      });
      const webhookReceivedCounter = meter.createCounter("clawdbot.webhook.received", {
        unit: "1",
        description: "Webhook requests received",
      });
      const webhookErrorCounter = meter.createCounter("clawdbot.webhook.error", {
        unit: "1",
        description: "Webhook processing errors",
      });
      const webhookDurationHistogram = meter.createHistogram("clawdbot.webhook.duration_ms", {
        unit: "ms",
        description: "Webhook processing duration",
      });
      const messageQueuedCounter = meter.createCounter("clawdbot.message.queued", {
        unit: "1",
        description: "Messages queued for processing",
      });
      const messageProcessedCounter = meter.createCounter("clawdbot.message.processed", {
        unit: "1",
        description: "Messages processed by outcome",
      });
      const messageDurationHistogram = meter.createHistogram("clawdbot.message.duration_ms", {
        unit: "ms",
        description: "Message processing duration",
      });
      const queueDepthHistogram = meter.createHistogram("clawdbot.queue.depth", {
        unit: "1",
        description: "Queue depth on enqueue/dequeue",
      });
      const queueWaitHistogram = meter.createHistogram("clawdbot.queue.wait_ms", {
        unit: "ms",
        description: "Queue wait time before execution",
      });
      const laneEnqueueCounter = meter.createCounter("clawdbot.queue.lane.enqueue", {
        unit: "1",
        description: "Command queue lane enqueue events",
      });
      const laneDequeueCounter = meter.createCounter("clawdbot.queue.lane.dequeue", {
        unit: "1",
        description: "Command queue lane dequeue events",
      });
      const sessionStateCounter = meter.createCounter("clawdbot.session.state", {
        unit: "1",
        description: "Session state transitions",
      });
      const sessionStuckCounter = meter.createCounter("clawdbot.session.stuck", {
        unit: "1",
        description: "Sessions stuck in processing",
      });
      const sessionStuckAgeHistogram = meter.createHistogram("clawdbot.session.stuck_age_ms", {
        unit: "ms",
        description: "Age of stuck sessions",
      });
      const runAttemptCounter = meter.createCounter("clawdbot.run.attempt", {
        unit: "1",
        description: "Run attempts",
      });

      if (logsEnabled) {
        const logExporter = new OTLPLogExporter({
          ...(logUrl ? { url: logUrl } : {}),
          ...(headers ? { headers } : {}),
        });
        logProvider = new LoggerProvider({ resource });
        logProvider.addLogRecordProcessor(
          new BatchLogRecordProcessor(logExporter, {
            ...(typeof otel.flushIntervalMs === "number"
              ? { scheduledDelayMillis: Math.max(1000, otel.flushIntervalMs) }
              : {}),
          }),
        );
        const otelLogger = logProvider.getLogger("clawdbot");

        stopLogTransport = registerLogTransport((logObj) => {
          const safeStringify = (value: unknown) => {
            try {
              return JSON.stringify(value);
            } catch {
              return String(value);
            }
          };
          const meta = (logObj as Record<string, unknown>)._meta as
            | {
                logLevelName?: string;
                date?: Date;
                name?: string;
                parentNames?: string[];
                path?: {
                  filePath?: string;
                  fileLine?: string;
                  fileColumn?: string;
                  filePathWithLine?: string;
                  method?: string;
                };
              }
            | undefined;
          const logLevelName = meta?.logLevelName ?? "INFO";
          const severityNumber = logSeverityMap[logLevelName] ?? (9 as SeverityNumber);

          const numericArgs = Object.entries(logObj)
            .filter(([key]) => /^\d+$/.test(key))
            .sort((a, b) => Number(a[0]) - Number(b[0]))
            .map(([, value]) => value);

          let bindings: Record<string, unknown> | undefined;
          if (typeof numericArgs[0] === "string" && numericArgs[0].trim().startsWith("{")) {
            try {
              const parsed = JSON.parse(numericArgs[0]);
              if (parsed && typeof parsed === "object" && !Array.isArray(parsed)) {
                bindings = parsed as Record<string, unknown>;
                numericArgs.shift();
              }
            } catch {
              // ignore malformed json bindings
            }
          }

          let message = "";
          if (numericArgs.length > 0 && typeof numericArgs[numericArgs.length - 1] === "string") {
            message = String(numericArgs.pop());
          } else if (numericArgs.length === 1) {
            message = safeStringify(numericArgs[0]);
            numericArgs.length = 0;
          }
          if (!message) {
            message = "log";
          }

          const attributes: Record<string, string | number | boolean> = {
            "clawdbot.log.level": logLevelName,
          };
          if (meta?.name) attributes["clawdbot.logger"] = meta.name;
          if (meta?.parentNames?.length) {
            attributes["clawdbot.logger.parents"] = meta.parentNames.join(".");
          }
          if (bindings) {
            for (const [key, value] of Object.entries(bindings)) {
              if (typeof value === "string" || typeof value === "number" || typeof value === "boolean") {
                attributes[`clawdbot.${key}`] = value;
              } else if (value != null) {
                attributes[`clawdbot.${key}`] = safeStringify(value);
              }
            }
          }
          if (numericArgs.length > 0) {
            attributes["clawdbot.log.args"] = safeStringify(numericArgs);
          }
          if (meta?.path?.filePath) attributes["code.filepath"] = meta.path.filePath;
          if (meta?.path?.fileLine) attributes["code.lineno"] = Number(meta.path.fileLine);
          if (meta?.path?.method) attributes["code.function"] = meta.path.method;
          if (meta?.path?.filePathWithLine) {
            attributes["clawdbot.code.location"] = meta.path.filePathWithLine;
          }
otelLogger.emit({
|
||||
body: message,
|
||||
severityText: logLevelName,
|
||||
severityNumber,
|
||||
attributes,
|
||||
timestamp: meta?.date ?? new Date(),
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
const spanWithDuration = (
|
||||
name: string,
|
||||
attributes: Record<string, string | number>,
|
||||
durationMs?: number,
|
||||
) => {
|
||||
const startTime =
|
||||
typeof durationMs === "number" ? Date.now() - Math.max(0, durationMs) : undefined;
|
||||
const span = tracer.startSpan(name, {
|
||||
attributes,
|
||||
...(startTime ? { startTime } : {}),
|
||||
});
|
||||
return span;
|
||||
};
|
||||
|
||||
    const recordModelUsage = (evt: Extract<DiagnosticEventPayload, { type: "model.usage" }>) => {
      const attrs = {
        "clawdbot.channel": evt.channel ?? "unknown",
        "clawdbot.provider": evt.provider ?? "unknown",
        "clawdbot.model": evt.model ?? "unknown",
      };

      const usage = evt.usage;
      if (usage.input) tokensCounter.add(usage.input, { ...attrs, "clawdbot.token": "input" });
      if (usage.output) tokensCounter.add(usage.output, { ...attrs, "clawdbot.token": "output" });
      if (usage.cacheRead)
        tokensCounter.add(usage.cacheRead, { ...attrs, "clawdbot.token": "cache_read" });
      if (usage.cacheWrite)
        tokensCounter.add(usage.cacheWrite, { ...attrs, "clawdbot.token": "cache_write" });
      if (usage.promptTokens)
        tokensCounter.add(usage.promptTokens, { ...attrs, "clawdbot.token": "prompt" });
      if (usage.total) tokensCounter.add(usage.total, { ...attrs, "clawdbot.token": "total" });

      if (evt.costUsd) costCounter.add(evt.costUsd, attrs);
      if (evt.durationMs) durationHistogram.record(evt.durationMs, attrs);
      if (evt.context?.limit)
        contextHistogram.record(evt.context.limit, {
          ...attrs,
          "clawdbot.context": "limit",
        });
      if (evt.context?.used)
        contextHistogram.record(evt.context.used, {
          ...attrs,
          "clawdbot.context": "used",
        });
@@ -348,8 +161,8 @@ export function createDiagnosticsOtelService(): ClawdbotPluginService {
|
||||
      if (!tracesEnabled) return;
      const spanAttrs: Record<string, string | number> = {
        ...attrs,
        "clawdbot.sessionKey": evt.sessionKey ?? "",
        "clawdbot.sessionId": evt.sessionId ?? "",
        "clawdbot.tokens.input": usage.input ?? 0,
        "clawdbot.tokens.output": usage.output ?? 0,
        "clawdbot.tokens.cache_read": usage.cacheRead ?? 0,
@@ -357,206 +170,23 @@ export function createDiagnosticsOtelService(): ClawdbotPluginService {
        "clawdbot.tokens.total": usage.total ?? 0,
      };

      const span = spanWithDuration("clawdbot.model.usage", spanAttrs, evt.durationMs);
      span.end();
    };

    const recordWebhookReceived = (
      evt: Extract<DiagnosticEventPayload, { type: "webhook.received" }>,
    ) => {
      const attrs = {
        "clawdbot.channel": evt.channel ?? "unknown",
        "clawdbot.webhook": evt.updateType ?? "unknown",
      };
      webhookReceivedCounter.add(1, attrs);
    };

    const recordWebhookProcessed = (
      evt: Extract<DiagnosticEventPayload, { type: "webhook.processed" }>,
    ) => {
      const attrs = {
        "clawdbot.channel": evt.channel ?? "unknown",
        "clawdbot.webhook": evt.updateType ?? "unknown",
      };
      if (typeof evt.durationMs === "number") {
        webhookDurationHistogram.record(evt.durationMs, attrs);
      }
      if (!tracesEnabled) return;
      const spanAttrs: Record<string, string | number> = { ...attrs };
      if (evt.chatId !== undefined) spanAttrs["clawdbot.chatId"] = String(evt.chatId);
      const span = spanWithDuration("clawdbot.webhook.processed", spanAttrs, evt.durationMs);
      span.end();
    };

    const recordWebhookError = (
      evt: Extract<DiagnosticEventPayload, { type: "webhook.error" }>,
    ) => {
      const attrs = {
        "clawdbot.channel": evt.channel ?? "unknown",
        "clawdbot.webhook": evt.updateType ?? "unknown",
      };
      webhookErrorCounter.add(1, attrs);
      if (!tracesEnabled) return;
      const spanAttrs: Record<string, string | number> = {
        ...attrs,
        "clawdbot.error": evt.error,
      };
      if (evt.chatId !== undefined) spanAttrs["clawdbot.chatId"] = String(evt.chatId);
      const span = tracer.startSpan("clawdbot.webhook.error", {
        attributes: spanAttrs,
      });
      span.setStatus({ code: SpanStatusCode.ERROR, message: evt.error });
      span.end();
    };

    const recordMessageQueued = (
      evt: Extract<DiagnosticEventPayload, { type: "message.queued" }>,
    ) => {
      const attrs = {
        "clawdbot.channel": evt.channel ?? "unknown",
        "clawdbot.source": evt.source ?? "unknown",
      };
      messageQueuedCounter.add(1, attrs);
      if (typeof evt.queueDepth === "number") {
        queueDepthHistogram.record(evt.queueDepth, attrs);
      }
    };

    const recordMessageProcessed = (
      evt: Extract<DiagnosticEventPayload, { type: "message.processed" }>,
    ) => {
      const attrs = {
        "clawdbot.channel": evt.channel ?? "unknown",
        "clawdbot.outcome": evt.outcome ?? "unknown",
      };
      messageProcessedCounter.add(1, attrs);
      if (typeof evt.durationMs === "number") {
        messageDurationHistogram.record(evt.durationMs, attrs);
      }
      if (!tracesEnabled) return;
      const spanAttrs: Record<string, string | number> = { ...attrs };
      if (evt.sessionKey) spanAttrs["clawdbot.sessionKey"] = evt.sessionKey;
      if (evt.sessionId) spanAttrs["clawdbot.sessionId"] = evt.sessionId;
      if (evt.chatId !== undefined) spanAttrs["clawdbot.chatId"] = String(evt.chatId);
      if (evt.messageId !== undefined) spanAttrs["clawdbot.messageId"] = String(evt.messageId);
      if (evt.reason) spanAttrs["clawdbot.reason"] = evt.reason;
      const span = spanWithDuration("clawdbot.message.processed", spanAttrs, evt.durationMs);
      if (evt.outcome === "error") {
        span.setStatus({ code: SpanStatusCode.ERROR, message: evt.error });
      }
      span.end();
    };

    const recordLaneEnqueue = (
      evt: Extract<DiagnosticEventPayload, { type: "queue.lane.enqueue" }>,
    ) => {
      const attrs = { "clawdbot.lane": evt.lane };
      laneEnqueueCounter.add(1, attrs);
      queueDepthHistogram.record(evt.queueSize, attrs);
    };

    const recordLaneDequeue = (
      evt: Extract<DiagnosticEventPayload, { type: "queue.lane.dequeue" }>,
    ) => {
      const attrs = { "clawdbot.lane": evt.lane };
      laneDequeueCounter.add(1, attrs);
      queueDepthHistogram.record(evt.queueSize, attrs);
      if (typeof evt.waitMs === "number") {
        queueWaitHistogram.record(evt.waitMs, attrs);
      }
    };

    const recordSessionState = (
      evt: Extract<DiagnosticEventPayload, { type: "session.state" }>,
    ) => {
      const attrs: Record<string, string> = { "clawdbot.state": evt.state };
      if (evt.reason) attrs["clawdbot.reason"] = evt.reason;
      sessionStateCounter.add(1, attrs);
    };

    const recordSessionStuck = (
      evt: Extract<DiagnosticEventPayload, { type: "session.stuck" }>,
    ) => {
      const attrs: Record<string, string> = { "clawdbot.state": evt.state };
      sessionStuckCounter.add(1, attrs);
      if (typeof evt.ageMs === "number") {
        sessionStuckAgeHistogram.record(evt.ageMs, attrs);
      }
      if (!tracesEnabled) return;
      const spanAttrs: Record<string, string | number> = { ...attrs };
      if (evt.sessionKey) spanAttrs["clawdbot.sessionKey"] = evt.sessionKey;
      if (evt.sessionId) spanAttrs["clawdbot.sessionId"] = evt.sessionId;
      spanAttrs["clawdbot.queueDepth"] = evt.queueDepth ?? 0;
      spanAttrs["clawdbot.ageMs"] = evt.ageMs;
      const span = tracer.startSpan("clawdbot.session.stuck", { attributes: spanAttrs });
      span.setStatus({ code: SpanStatusCode.ERROR, message: "session stuck" });
      span.end();
    };

    const recordRunAttempt = (evt: Extract<DiagnosticEventPayload, { type: "run.attempt" }>) => {
      runAttemptCounter.add(1, { "clawdbot.attempt": evt.attempt });
    };

    const recordHeartbeat = (
      evt: Extract<DiagnosticEventPayload, { type: "diagnostic.heartbeat" }>,
    ) => {
      queueDepthHistogram.record(evt.queued, { "clawdbot.channel": "heartbeat" });
    };

    unsubscribe = onDiagnosticEvent((evt: DiagnosticEventPayload) => {
      switch (evt.type) {
        case "model.usage":
          recordModelUsage(evt);
          return;
        case "webhook.received":
          recordWebhookReceived(evt);
          return;
        case "webhook.processed":
          recordWebhookProcessed(evt);
          return;
        case "webhook.error":
          recordWebhookError(evt);
          return;
        case "message.queued":
          recordMessageQueued(evt);
          return;
        case "message.processed":
          recordMessageProcessed(evt);
          return;
        case "queue.lane.enqueue":
          recordLaneEnqueue(evt);
          return;
        case "queue.lane.dequeue":
          recordLaneDequeue(evt);
          return;
        case "session.state":
          recordSessionState(evt);
          return;
        case "session.stuck":
          recordSessionStuck(evt);
          return;
        case "run.attempt":
          recordRunAttempt(evt);
          return;
        case "diagnostic.heartbeat":
          recordHeartbeat(evt);
          return;
      }
    });

    if (logsEnabled) {
      ctx.logger.info("diagnostics-otel: logs exporter enabled (OTLP/HTTP)");
    }
  },
  async stop() {
    unsubscribe?.();
    unsubscribe = null;
    stopLogTransport?.();
    stopLogTransport = null;
    if (logProvider) {
      await logProvider.shutdown().catch(() => undefined);
      logProvider = null;
    }
    if (sdk) {
      await sdk.shutdown().catch(() => undefined);
      sdk = null;

@@ -1,26 +0,0 @@
# Changelog

## 2026.1.19-1

Initial release.

### Features

- NIP-04 encrypted DM support (kind:4 events)
- Key validation (hex and nsec formats)
- Multi-relay support with sequential fallback
- Event signature verification
- TTL-based deduplication (24h)
- Access control via dmPolicy (pairing, allowlist, open, disabled)
- Pubkey normalization (hex/npub)

### Protocol Support

- NIP-01: Basic event structure
- NIP-04: Encrypted direct messages

### Planned for v2

- NIP-17: Gift-wrapped DMs
- NIP-44: Versioned encryption
- Media attachments
@@ -1,136 +0,0 @@
# @clawdbot/nostr

Nostr DM channel plugin for Clawdbot using NIP-04 encrypted direct messages.

## Overview

This extension adds Nostr as a messaging channel to Clawdbot. It enables your bot to:

- Receive encrypted DMs from Nostr users
- Send encrypted responses back
- Work with any NIP-04 compatible Nostr client (Damus, Amethyst, etc.)

## Installation

```bash
clawdbot plugins install @clawdbot/nostr
```

## Quick Setup

1. Generate a Nostr keypair (if you don't have one):

   ```bash
   # Using nak CLI
   nak key generate

   # Or use any Nostr key generator
   ```

2. Add to your config:

   ```json
   {
     "channels": {
       "nostr": {
         "privateKey": "${NOSTR_PRIVATE_KEY}",
         "relays": ["wss://relay.damus.io", "wss://nos.lol"]
       }
     }
   }
   ```

3. Set the environment variable:

   ```bash
   export NOSTR_PRIVATE_KEY="nsec1..." # or hex format
   ```

4. Restart the gateway

## Configuration

| Key | Type | Default | Description |
|-----|------|---------|-------------|
| `privateKey` | string | required | Bot's private key (nsec or hex format) |
| `relays` | string[] | `["wss://relay.damus.io", "wss://nos.lol"]` | WebSocket relay URLs |
| `dmPolicy` | string | `"pairing"` | Access control: `pairing`, `allowlist`, `open`, `disabled` |
| `allowFrom` | string[] | `[]` | Allowed sender pubkeys (npub or hex) |
| `enabled` | boolean | `true` | Enable/disable the channel |
| `name` | string | - | Display name for the account |

## Access Control

### DM Policies

- **pairing** (default): Unknown senders receive a pairing code to request access
- **allowlist**: Only pubkeys in `allowFrom` can message the bot
- **open**: Anyone can message the bot (use with caution)
- **disabled**: DMs are disabled

### Example: Allowlist Mode

```json
{
  "channels": {
    "nostr": {
      "privateKey": "${NOSTR_PRIVATE_KEY}",
      "dmPolicy": "allowlist",
      "allowFrom": [
        "npub1abc...",
        "0123456789abcdef..."
      ]
    }
  }
}
```

## Testing

### Local Relay (Recommended)

```bash
# Using strfry
docker run -p 7777:7777 ghcr.io/hoytech/strfry

# Configure clawdbot to use local relay
"relays": ["ws://localhost:7777"]
```

### Manual Test

1. Start the gateway with Nostr configured
2. Open Damus, Amethyst, or another Nostr client
3. Send a DM to your bot's npub
4. Verify the bot responds

## Protocol Support

| NIP | Status | Notes |
|-----|--------|-------|
| NIP-01 | Supported | Basic event structure |
| NIP-04 | Supported | Encrypted DMs (kind:4) |
| NIP-17 | Planned | Gift-wrapped DMs (v2) |
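
For orientation, a NIP-04 DM travels as a kind:4 event: the `content` field carries the base64 ciphertext with the initialization vector appended after `?iv=`, and a `p` tag names the recipient. An illustrative shape (all values are placeholders, not real keys or signatures):

```json
{
  "kind": 4,
  "pubkey": "<sender hex pubkey>",
  "created_at": 1700000000,
  "tags": [["p", "<recipient hex pubkey>"]],
  "content": "<base64 ciphertext>?iv=<base64 iv>",
  "id": "<sha256 event id>",
  "sig": "<schnorr signature>"
}
```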

## Security Notes

- Private keys are never logged
- Event signatures are verified before processing
- Use environment variables for keys, never commit to config files
- Consider using `allowlist` mode in production

## Troubleshooting

### Bot not receiving messages

1. Verify private key is correctly configured
2. Check relay connectivity
3. Ensure `enabled` is not set to `false`
4. Check the bot's public key matches what you're sending to
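
Item 4 can be sanity-checked mechanically. A minimal sketch of the format check the plugin's target resolver performs (the helper name here is illustrative; it mirrors `looksLikeId` in `channel.ts`):

```typescript
// Illustrative helper: npub keys are bech32 strings starting with "npub1";
// raw public keys are exactly 64 hex characters.
function looksLikePubkey(input: string): boolean {
  const trimmed = input.trim();
  return trimmed.startsWith("npub1") || /^[0-9a-fA-F]{64}$/.test(trimmed);
}

console.log(looksLikePubkey("npub1xyz123")); // true
console.log(looksLikePubkey("not-a-pubkey")); // false
```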

### Messages not being delivered

1. Check relay URLs are correct (must use `wss://`)
2. Verify relays are online and accepting connections
3. Check for rate limiting (reduce message frequency)
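
Item 1 is easy to get wrong (for example, pasting an `https://` URL). A minimal sketch of a relay-URL sanity check, assuming the standard WHATWG `URL` global; the helper is illustrative and not part of the plugin API:

```typescript
// Illustrative check: relays must be WebSocket endpoints.
// Production relays should use wss://; ws:// is fine for a local test relay.
function isValidRelayUrl(url: string): boolean {
  try {
    const parsed = new URL(url);
    return parsed.protocol === "wss:" || parsed.protocol === "ws:";
  } catch {
    return false; // not a parseable URL at all
  }
}

console.log(isValidRelayUrl("wss://relay.damus.io")); // true
console.log(isValidRelayUrl("https://relay.damus.io")); // false
```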

## License

MIT
@@ -1,11 +0,0 @@
{
  "id": "nostr",
  "channels": [
    "nostr"
  ],
  "configSchema": {
    "type": "object",
    "additionalProperties": false,
    "properties": {}
  }
}
@@ -1,69 +0,0 @@
import type { ClawdbotPluginApi, ClawdbotConfig } from "clawdbot/plugin-sdk";
import { emptyPluginConfigSchema } from "clawdbot/plugin-sdk";

import { nostrPlugin } from "./src/channel.js";
import { setNostrRuntime, getNostrRuntime } from "./src/runtime.js";
import { createNostrProfileHttpHandler } from "./src/nostr-profile-http.js";
import { resolveNostrAccount } from "./src/types.js";
import type { NostrProfile } from "./src/config-schema.js";

const plugin = {
  id: "nostr",
  name: "Nostr",
  description: "Nostr DM channel plugin via NIP-04",
  configSchema: emptyPluginConfigSchema(),
  register(api: ClawdbotPluginApi) {
    setNostrRuntime(api.runtime);
    api.registerChannel({ plugin: nostrPlugin });

    // Register HTTP handler for profile management
    const httpHandler = createNostrProfileHttpHandler({
      getConfigProfile: (accountId: string) => {
        const runtime = getNostrRuntime();
        const cfg = runtime.config.loadConfig() as ClawdbotConfig;
        const account = resolveNostrAccount({ cfg, accountId });
        return account.profile;
      },
      updateConfigProfile: async (accountId: string, profile: NostrProfile) => {
        const runtime = getNostrRuntime();
        const cfg = runtime.config.loadConfig() as ClawdbotConfig;

        // Build the config patch for channels.nostr.profile
        const channels = (cfg.channels ?? {}) as Record<string, unknown>;
        const nostrConfig = (channels.nostr ?? {}) as Record<string, unknown>;

        const updatedNostrConfig = {
          ...nostrConfig,
          profile,
        };

        const updatedChannels = {
          ...channels,
          nostr: updatedNostrConfig,
        };

        await runtime.config.writeConfigFile({
          ...cfg,
          channels: updatedChannels,
        });
      },
      getAccountInfo: (accountId: string) => {
        const runtime = getNostrRuntime();
        const cfg = runtime.config.loadConfig() as ClawdbotConfig;
        const account = resolveNostrAccount({ cfg, accountId });
        if (!account.configured || !account.publicKey) {
          return null;
        }
        return {
          pubkey: account.publicKey,
          relays: account.relays,
        };
      },
      log: api.logger,
    });

    api.registerHttpHandler(httpHandler);
  },
};

export default plugin;
@@ -1,29 +0,0 @@
{
  "name": "@clawdbot/nostr",
  "version": "2026.1.19-1",
  "type": "module",
  "description": "Clawdbot Nostr channel plugin for NIP-04 encrypted DMs",
  "clawdbot": {
    "extensions": ["./index.ts"],
    "channel": {
      "id": "nostr",
      "label": "Nostr",
      "selectionLabel": "Nostr (NIP-04 DMs)",
      "docsPath": "/channels/nostr",
      "docsLabel": "nostr",
      "blurb": "Decentralized protocol; encrypted DMs via NIP-04.",
      "order": 55,
      "quickstartAllowFrom": true
    },
    "install": {
      "npmSpec": "@clawdbot/nostr",
      "localPath": "extensions/nostr",
      "defaultChoice": "npm"
    }
  },
  "dependencies": {
    "clawdbot": "workspace:*",
    "nostr-tools": "^2.10.4",
    "zod": "^4.3.5"
  }
}
@@ -1,141 +0,0 @@
import { describe, expect, it } from "vitest";
import { nostrPlugin } from "./channel.js";

describe("nostrPlugin", () => {
  describe("meta", () => {
    it("has correct id", () => {
      expect(nostrPlugin.id).toBe("nostr");
    });

    it("has required meta fields", () => {
      expect(nostrPlugin.meta.label).toBe("Nostr");
      expect(nostrPlugin.meta.docsPath).toBe("/channels/nostr");
      expect(nostrPlugin.meta.blurb).toContain("NIP-04");
    });
  });

  describe("capabilities", () => {
    it("supports direct messages", () => {
      expect(nostrPlugin.capabilities.chatTypes).toContain("direct");
    });

    it("does not support groups (MVP)", () => {
      expect(nostrPlugin.capabilities.chatTypes).not.toContain("group");
    });

    it("does not support media (MVP)", () => {
      expect(nostrPlugin.capabilities.media).toBe(false);
    });
  });

  describe("config adapter", () => {
    it("has required config functions", () => {
      expect(nostrPlugin.config.listAccountIds).toBeTypeOf("function");
      expect(nostrPlugin.config.resolveAccount).toBeTypeOf("function");
      expect(nostrPlugin.config.isConfigured).toBeTypeOf("function");
    });

    it("listAccountIds returns empty array for unconfigured", () => {
      const cfg = { channels: {} };
      const ids = nostrPlugin.config.listAccountIds(cfg);
      expect(ids).toEqual([]);
    });

    it("listAccountIds returns default for configured", () => {
      const cfg = {
        channels: {
          nostr: {
            privateKey: "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef",
          },
        },
      };
      const ids = nostrPlugin.config.listAccountIds(cfg);
      expect(ids).toContain("default");
    });
  });

  describe("messaging", () => {
    it("has target resolver", () => {
      expect(nostrPlugin.messaging?.targetResolver?.looksLikeId).toBeTypeOf("function");
    });

    it("recognizes npub as valid target", () => {
      const looksLikeId = nostrPlugin.messaging?.targetResolver?.looksLikeId;
      if (!looksLikeId) return;

      expect(looksLikeId("npub1xyz123")).toBe(true);
    });

    it("recognizes hex pubkey as valid target", () => {
      const looksLikeId = nostrPlugin.messaging?.targetResolver?.looksLikeId;
      if (!looksLikeId) return;

      const hexPubkey = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";
      expect(looksLikeId(hexPubkey)).toBe(true);
    });

    it("rejects invalid input", () => {
      const looksLikeId = nostrPlugin.messaging?.targetResolver?.looksLikeId;
      if (!looksLikeId) return;

      expect(looksLikeId("not-a-pubkey")).toBe(false);
      expect(looksLikeId("")).toBe(false);
    });

    it("normalizeTarget strips nostr: prefix", () => {
      const normalize = nostrPlugin.messaging?.normalizeTarget;
      if (!normalize) return;

      const hexPubkey = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";
      expect(normalize(`nostr:${hexPubkey}`)).toBe(hexPubkey);
    });
  });

  describe("outbound", () => {
    it("has correct delivery mode", () => {
      expect(nostrPlugin.outbound?.deliveryMode).toBe("direct");
    });

    it("has reasonable text chunk limit", () => {
      expect(nostrPlugin.outbound?.textChunkLimit).toBe(4000);
    });
  });

  describe("pairing", () => {
    it("has id label for pairing", () => {
      expect(nostrPlugin.pairing?.idLabel).toBe("nostrPubkey");
    });

    it("normalizes nostr: prefix in allow entries", () => {
      const normalize = nostrPlugin.pairing?.normalizeAllowEntry;
      if (!normalize) return;

      const hexPubkey = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";
      expect(normalize(`nostr:${hexPubkey}`)).toBe(hexPubkey);
    });
  });

  describe("security", () => {
    it("has resolveDmPolicy function", () => {
      expect(nostrPlugin.security?.resolveDmPolicy).toBeTypeOf("function");
    });
  });

  describe("gateway", () => {
    it("has startAccount function", () => {
      expect(nostrPlugin.gateway?.startAccount).toBeTypeOf("function");
    });
  });

  describe("status", () => {
    it("has default runtime", () => {
      expect(nostrPlugin.status?.defaultRuntime).toBeDefined();
      expect(nostrPlugin.status?.defaultRuntime?.accountId).toBe("default");
      expect(nostrPlugin.status?.defaultRuntime?.running).toBe(false);
    });

    it("has buildAccountSnapshot function", () => {
      expect(nostrPlugin.status?.buildAccountSnapshot).toBeTypeOf("function");
    });
  });
});
@@ -1,335 +0,0 @@
import {
  buildChannelConfigSchema,
  DEFAULT_ACCOUNT_ID,
  formatPairingApproveHint,
  type ChannelPlugin,
} from "clawdbot/plugin-sdk";

import { NostrConfigSchema } from "./config-schema.js";
import { getNostrRuntime } from "./runtime.js";
import {
  listNostrAccountIds,
  resolveDefaultNostrAccountId,
  resolveNostrAccount,
  type ResolvedNostrAccount,
} from "./types.js";
import { normalizePubkey, startNostrBus, type NostrBusHandle } from "./nostr-bus.js";
import type { MetricEvent, MetricsSnapshot } from "./metrics.js";
import type { NostrProfile } from "./config-schema.js";
import type { ProfilePublishResult } from "./nostr-profile.js";

// Store active bus handles per account
const activeBuses = new Map<string, NostrBusHandle>();

// Store metrics snapshots per account (for status reporting)
const metricsSnapshots = new Map<string, MetricsSnapshot>();

export const nostrPlugin: ChannelPlugin<ResolvedNostrAccount> = {
  id: "nostr",
  meta: {
    id: "nostr",
    label: "Nostr",
    selectionLabel: "Nostr",
    docsPath: "/channels/nostr",
    docsLabel: "nostr",
    blurb: "Decentralized DMs via Nostr relays (NIP-04)",
    order: 100,
  },
  capabilities: {
    chatTypes: ["direct"], // DMs only for MVP
    media: false, // No media for MVP
  },
  reload: { configPrefixes: ["channels.nostr"] },
  configSchema: buildChannelConfigSchema(NostrConfigSchema),

  config: {
    listAccountIds: (cfg) => listNostrAccountIds(cfg),
    resolveAccount: (cfg, accountId) => resolveNostrAccount({ cfg, accountId }),
    defaultAccountId: (cfg) => resolveDefaultNostrAccountId(cfg),
    isConfigured: (account) => account.configured,
    describeAccount: (account) => ({
      accountId: account.accountId,
      name: account.name,
      enabled: account.enabled,
      configured: account.configured,
      publicKey: account.publicKey,
    }),
    resolveAllowFrom: ({ cfg, accountId }) =>
      (resolveNostrAccount({ cfg, accountId }).config.allowFrom ?? []).map((entry) =>
        String(entry)
      ),
    formatAllowFrom: ({ allowFrom }) =>
      allowFrom
        .map((entry) => String(entry).trim())
        .filter(Boolean)
        .map((entry) => {
          if (entry === "*") return "*";
          try {
            return normalizePubkey(entry);
          } catch {
            return entry; // Keep as-is if normalization fails
          }
        })
        .filter(Boolean),
  },

  pairing: {
    idLabel: "nostrPubkey",
    normalizeAllowEntry: (entry) => {
      try {
        return normalizePubkey(entry.replace(/^nostr:/i, ""));
      } catch {
        return entry;
      }
    },
    notifyApproval: async ({ id }) => {
      // Get the default account's bus and send approval message
      const bus = activeBuses.get(DEFAULT_ACCOUNT_ID);
      if (bus) {
        await bus.sendDm(id, "Your pairing request has been approved!");
      }
    },
  },

  security: {
    resolveDmPolicy: ({ account }) => {
      return {
        policy: account.config.dmPolicy ?? "pairing",
        allowFrom: account.config.allowFrom ?? [],
        policyPath: "channels.nostr.dmPolicy",
        allowFromPath: "channels.nostr.allowFrom",
        approveHint: formatPairingApproveHint("nostr"),
        normalizeEntry: (raw) => {
          try {
            return normalizePubkey(raw.replace(/^nostr:/i, "").trim());
          } catch {
            return raw.trim();
          }
        },
      };
    },
  },

  messaging: {
    normalizeTarget: (target) => {
      // Strip nostr: prefix if present
      const cleaned = target.replace(/^nostr:/i, "").trim();
      try {
        return normalizePubkey(cleaned);
      } catch {
        return cleaned;
      }
    },
    targetResolver: {
      looksLikeId: (input) => {
        const trimmed = input.trim();
        return trimmed.startsWith("npub1") || /^[0-9a-fA-F]{64}$/.test(trimmed);
      },
      hint: "<npub|hex pubkey|nostr:npub...>",
    },
  },

  outbound: {
    deliveryMode: "direct",
    textChunkLimit: 4000,
    sendText: async ({ to, text, accountId }) => {
      const aid = accountId ?? DEFAULT_ACCOUNT_ID;
      const bus = activeBuses.get(aid);
      if (!bus) {
        throw new Error(`Nostr bus not running for account ${aid}`);
      }
      const normalizedTo = normalizePubkey(to);
      await bus.sendDm(normalizedTo, text);
      return { channel: "nostr", to: normalizedTo };
    },
  },

  status: {
    defaultRuntime: {
      accountId: DEFAULT_ACCOUNT_ID,
      running: false,
      lastStartAt: null,
      lastStopAt: null,
      lastError: null,
    },
    collectStatusIssues: (accounts) =>
      accounts.flatMap((account) => {
        const lastError = typeof account.lastError === "string" ? account.lastError.trim() : "";
        if (!lastError) return [];
        return [
          {
            channel: "nostr",
            accountId: account.accountId,
            kind: "runtime" as const,
            message: `Channel error: ${lastError}`,
          },
        ];
      }),
    buildChannelSummary: ({ snapshot }) => ({
      configured: snapshot.configured ?? false,
      publicKey: snapshot.publicKey ?? null,
      running: snapshot.running ?? false,
      lastStartAt: snapshot.lastStartAt ?? null,
      lastStopAt: snapshot.lastStopAt ?? null,
      lastError: snapshot.lastError ?? null,
    }),
    buildAccountSnapshot: ({ account, runtime }) => ({
      accountId: account.accountId,
      name: account.name,
      enabled: account.enabled,
      configured: account.configured,
      publicKey: account.publicKey,
      profile: account.profile,
      running: runtime?.running ?? false,
      lastStartAt: runtime?.lastStartAt ?? null,
      lastStopAt: runtime?.lastStopAt ?? null,
      lastError: runtime?.lastError ?? null,
      lastInboundAt: runtime?.lastInboundAt ?? null,
      lastOutboundAt: runtime?.lastOutboundAt ?? null,
    }),
  },

  gateway: {
    startAccount: async (ctx) => {
      const account = ctx.account;
      ctx.setStatus({
        accountId: account.accountId,
        publicKey: account.publicKey,
      });
      ctx.log?.info(`[${account.accountId}] starting Nostr provider (pubkey: ${account.publicKey})`);

      if (!account.configured) {
        throw new Error("Nostr private key not configured");
      }

      const runtime = getNostrRuntime();

      // Track bus handle for metrics callback
      let busHandle: NostrBusHandle | null = null;

      const bus = await startNostrBus({
        accountId: account.accountId,
        privateKey: account.privateKey,
        relays: account.relays,
        onMessage: async (senderPubkey, text, reply) => {
          ctx.log?.debug(`[${account.accountId}] DM from ${senderPubkey}: ${text.slice(0, 50)}...`);
|
||||
|
||||
// Forward to clawdbot's message pipeline
|
||||
await runtime.channel.reply.handleInboundMessage({
|
||||
channel: "nostr",
|
||||
accountId: account.accountId,
|
||||
senderId: senderPubkey,
|
||||
chatType: "direct",
|
||||
chatId: senderPubkey, // For DMs, chatId is the sender's pubkey
|
||||
text,
|
||||
reply: async (responseText: string) => {
|
||||
await reply(responseText);
|
||||
},
|
||||
});
|
||||
},
|
||||
onError: (error, context) => {
|
||||
ctx.log?.error(`[${account.accountId}] Nostr error (${context}): ${error.message}`);
|
||||
},
|
||||
onConnect: (relay) => {
|
||||
ctx.log?.debug(`[${account.accountId}] Connected to relay: ${relay}`);
|
||||
},
|
||||
onDisconnect: (relay) => {
|
||||
ctx.log?.debug(`[${account.accountId}] Disconnected from relay: ${relay}`);
|
||||
},
|
||||
onEose: (relays) => {
|
||||
ctx.log?.debug(`[${account.accountId}] EOSE received from relays: ${relays}`);
|
||||
},
|
||||
onMetric: (event: MetricEvent) => {
|
||||
// Log significant metrics at appropriate levels
|
||||
if (event.name.startsWith("event.rejected.")) {
|
||||
ctx.log?.debug(`[${account.accountId}] Metric: ${event.name}`, event.labels);
|
||||
} else if (event.name === "relay.circuit_breaker.open") {
|
||||
ctx.log?.warn(`[${account.accountId}] Circuit breaker opened for relay: ${event.labels?.relay}`);
|
||||
} else if (event.name === "relay.circuit_breaker.close") {
|
||||
ctx.log?.info(`[${account.accountId}] Circuit breaker closed for relay: ${event.labels?.relay}`);
|
||||
} else if (event.name === "relay.error") {
|
||||
ctx.log?.debug(`[${account.accountId}] Relay error: ${event.labels?.relay}`);
|
||||
}
|
||||
// Update cached metrics snapshot
|
||||
if (busHandle) {
|
||||
metricsSnapshots.set(account.accountId, busHandle.getMetrics());
|
||||
}
|
||||
},
|
||||
});
|
||||
|
||||
busHandle = bus;
|
||||
|
||||
// Store the bus handle
|
||||
activeBuses.set(account.accountId, bus);
|
||||
|
||||
ctx.log?.info(`[${account.accountId}] Nostr provider started, connected to ${account.relays.length} relay(s)`);
|
||||
|
||||
// Return cleanup function
|
||||
return {
|
||||
stop: () => {
|
||||
bus.close();
|
||||
activeBuses.delete(account.accountId);
|
||||
metricsSnapshots.delete(account.accountId);
|
||||
ctx.log?.info(`[${account.accountId}] Nostr provider stopped`);
|
||||
},
|
||||
};
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
/**
|
||||
* Get metrics snapshot for a Nostr account.
|
||||
* Returns undefined if account is not running.
|
||||
*/
|
||||
export function getNostrMetrics(accountId: string = DEFAULT_ACCOUNT_ID): MetricsSnapshot | undefined {
|
||||
const bus = activeBuses.get(accountId);
|
||||
if (bus) {
|
||||
return bus.getMetrics();
|
||||
}
|
||||
return metricsSnapshots.get(accountId);
|
||||
}
|
||||
|
||||
/**
|
||||
* Get all active Nostr bus handles.
|
||||
* Useful for debugging and status reporting.
|
||||
*/
|
||||
export function getActiveNostrBuses(): Map<string, NostrBusHandle> {
|
||||
return new Map(activeBuses);
|
||||
}
|
||||
|
||||
/**
|
||||
* Publish a profile (kind:0) for a Nostr account.
|
||||
* @param accountId - Account ID (defaults to "default")
|
||||
* @param profile - Profile data to publish
|
||||
* @returns Publish results with successes and failures
|
||||
* @throws Error if account is not running
|
||||
*/
|
||||
export async function publishNostrProfile(
|
||||
accountId: string = DEFAULT_ACCOUNT_ID,
|
||||
profile: NostrProfile
|
||||
): Promise<ProfilePublishResult> {
|
||||
const bus = activeBuses.get(accountId);
|
||||
if (!bus) {
|
||||
throw new Error(`Nostr bus not running for account ${accountId}`);
|
||||
}
|
||||
return bus.publishProfile(profile);
|
||||
}
|
||||
|
||||
/**
|
||||
* Get profile publish state for a Nostr account.
|
||||
* @param accountId - Account ID (defaults to "default")
|
||||
* @returns Profile publish state or null if account not running
|
||||
*/
|
||||
export async function getNostrProfileState(
|
||||
accountId: string = DEFAULT_ACCOUNT_ID
|
||||
): Promise<{
|
||||
lastPublishedAt: number | null;
|
||||
lastPublishedEventId: string | null;
|
||||
lastPublishResults: Record<string, "ok" | "failed" | "timeout"> | null;
|
||||
} | null> {
|
||||
const bus = activeBuses.get(accountId);
|
||||
if (!bus) {
|
||||
return null;
|
||||
}
|
||||
return bus.getProfileState();
|
||||
}
|
||||
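The exported helpers above all share one lookup order: prefer a live bus handle from `activeBuses`, and for metrics fall back to the last cached snapshot in `metricsSnapshots`. A minimal, self-contained sketch of that pattern (types simplified; `Snapshot`, `BusHandle`, and `getMetrics` here are illustrative stand-ins, not the module's real exports):

```typescript
// Illustrative sketch of the getNostrMetrics lookup order:
// a live bus handle wins; after stop(), the cached snapshot remains.
type Snapshot = { eventsReceived: number };
type BusHandle = { getMetrics: () => Snapshot };

const activeBuses = new Map<string, BusHandle>();
const metricsSnapshots = new Map<string, Snapshot>();

function getMetrics(accountId = "default"): Snapshot | undefined {
  const bus = activeBuses.get(accountId);
  if (bus) return bus.getMetrics(); // live handle wins
  return metricsSnapshots.get(accountId); // cached snapshot after stop()
}

// A stopped account keeps its last snapshot until it is deleted.
metricsSnapshots.set("default", { eventsReceived: 7 });
console.log(getMetrics()?.eventsReceived);
```

This mirrors why the gateway's `stop()` deletes from both maps: otherwise a stopped account would keep reporting stale metrics forever.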
@@ -1,87 +0,0 @@
import { z } from "zod";
import { buildChannelConfigSchema } from "clawdbot/plugin-sdk";

const allowFromEntry = z.union([z.string(), z.number()]);

/**
 * Validates https:// URLs only (no javascript:, data:, file:, etc.)
 */
const safeUrlSchema = z
  .string()
  .url()
  .refine(
    (url) => {
      try {
        const parsed = new URL(url);
        return parsed.protocol === "https:";
      } catch {
        return false;
      }
    },
    { message: "URL must use https:// protocol" }
  );

/**
 * NIP-01 profile metadata schema
 * https://github.com/nostr-protocol/nips/blob/master/01.md
 */
export const NostrProfileSchema = z.object({
  /** Username (NIP-01: name) - max 256 chars */
  name: z.string().max(256).optional(),

  /** Display name (NIP-01: display_name) - max 256 chars */
  displayName: z.string().max(256).optional(),

  /** Bio/description (NIP-01: about) - max 2000 chars */
  about: z.string().max(2000).optional(),

  /** Profile picture URL (must be https) */
  picture: safeUrlSchema.optional(),

  /** Banner image URL (must be https) */
  banner: safeUrlSchema.optional(),

  /** Website URL (must be https) */
  website: safeUrlSchema.optional(),

  /** NIP-05 identifier (e.g., "user@example.com") */
  nip05: z.string().optional(),

  /** Lightning address (LUD-16) */
  lud16: z.string().optional(),
});

export type NostrProfile = z.infer<typeof NostrProfileSchema>;

/**
 * Zod schema for channels.nostr.* configuration
 */
export const NostrConfigSchema = z.object({
  /** Account name (optional display name) */
  name: z.string().optional(),

  /** Whether this channel is enabled */
  enabled: z.boolean().optional(),

  /** Private key in hex or nsec bech32 format */
  privateKey: z.string().optional(),

  /** WebSocket relay URLs to connect to */
  relays: z.array(z.string()).optional(),

  /** DM access policy: pairing, allowlist, open, or disabled */
  dmPolicy: z.enum(["pairing", "allowlist", "open", "disabled"]).optional(),

  /** Allowed sender pubkeys (npub or hex format) */
  allowFrom: z.array(allowFromEntry).optional(),

  /** Profile metadata (NIP-01 kind:0 content) */
  profile: NostrProfileSchema.optional(),
});

export type NostrConfig = z.infer<typeof NostrConfigSchema>;

/**
 * JSON Schema for Control UI (converted from Zod)
 */
export const nostrChannelConfigSchema = buildChannelConfigSchema(NostrConfigSchema);
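For reference, a config object of the shape `NostrConfigSchema` accepts might look like the following. This is a standalone sketch with illustrative placeholder values (`"nsec1..."`, `"npub1..."`, and the relay URL are not real); the checks below mirror the schema's constraints rather than importing zod:

```typescript
// Illustrative channels.nostr config matching the shape of NostrConfigSchema.
// All values are placeholders; validation here mirrors the schema without zod.
const nostrConfig = {
  name: "main",                         // optional display name
  enabled: true,
  privateKey: "nsec1...",               // hex or nsec bech32 (placeholder)
  relays: ["wss://relay.example.com"],  // placeholder relay URL
  dmPolicy: "pairing",                  // pairing | allowlist | open | disabled
  allowFrom: ["npub1..."],              // npub or hex entries (placeholders)
  profile: {
    name: "clawdbot",
    website: "https://docs.clawd.bot",  // https:// only, per safeUrlSchema
  },
};

const dmPolicies = ["pairing", "allowlist", "open", "disabled"];
console.log(dmPolicies.includes(nostrConfig.dmPolicy));
```

Every field is optional in the schema, so partial configs also parse; the `dmPolicy` enum and the https-only URL rule are the main places a hand-written config can go wrong.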
@@ -1,464 +0,0 @@
/**
 * Comprehensive metrics system for Nostr bus observability.
 * Provides clear insight into what's happening with events, relays, and operations.
 */

// ============================================================================
// Metric Types
// ============================================================================

export type EventMetricName =
  | "event.received"
  | "event.processed"
  | "event.duplicate"
  | "event.rejected.invalid_shape"
  | "event.rejected.wrong_kind"
  | "event.rejected.stale"
  | "event.rejected.future"
  | "event.rejected.rate_limited"
  | "event.rejected.invalid_signature"
  | "event.rejected.oversized_ciphertext"
  | "event.rejected.oversized_plaintext"
  | "event.rejected.decrypt_failed"
  | "event.rejected.self_message";

export type RelayMetricName =
  | "relay.connect"
  | "relay.disconnect"
  | "relay.reconnect"
  | "relay.error"
  | "relay.message.event"
  | "relay.message.eose"
  | "relay.message.closed"
  | "relay.message.notice"
  | "relay.message.ok"
  | "relay.message.auth"
  | "relay.circuit_breaker.open"
  | "relay.circuit_breaker.close"
  | "relay.circuit_breaker.half_open";

export type RateLimitMetricName = "rate_limit.per_sender" | "rate_limit.global";

export type DecryptMetricName = "decrypt.success" | "decrypt.failure";

export type MemoryMetricName =
  | "memory.seen_tracker_size"
  | "memory.rate_limiter_entries";

export type MetricName =
  | EventMetricName
  | RelayMetricName
  | RateLimitMetricName
  | DecryptMetricName
  | MemoryMetricName;

// ============================================================================
// Metric Event
// ============================================================================

export interface MetricEvent {
  /** Metric name (e.g., "event.received", "relay.connect") */
  name: MetricName;
  /** Metric value (usually 1 for counters, or a measured value) */
  value: number;
  /** Unix timestamp in milliseconds */
  timestamp: number;
  /** Optional labels for additional context */
  labels?: Record<string, string | number>;
}

export type OnMetricCallback = (event: MetricEvent) => void;

// ============================================================================
// Metrics Snapshot (for getMetrics())
// ============================================================================

export interface MetricsSnapshot {
  /** Total events received (before any filtering) */
  eventsReceived: number;
  /** Events successfully processed */
  eventsProcessed: number;
  /** Duplicate events skipped */
  eventsDuplicate: number;
  /** Events rejected by reason */
  eventsRejected: {
    invalidShape: number;
    wrongKind: number;
    stale: number;
    future: number;
    rateLimited: number;
    invalidSignature: number;
    oversizedCiphertext: number;
    oversizedPlaintext: number;
    decryptFailed: number;
    selfMessage: number;
  };

  /** Relay stats by URL */
  relays: Record<
    string,
    {
      connects: number;
      disconnects: number;
      reconnects: number;
      errors: number;
      messagesReceived: {
        event: number;
        eose: number;
        closed: number;
        notice: number;
        ok: number;
        auth: number;
      };
      circuitBreakerState: "closed" | "open" | "half_open";
      circuitBreakerOpens: number;
      circuitBreakerCloses: number;
    }
  >;

  /** Rate limiting stats */
  rateLimiting: {
    perSenderHits: number;
    globalHits: number;
  };

  /** Decrypt stats */
  decrypt: {
    success: number;
    failure: number;
  };

  /** Memory/capacity stats */
  memory: {
    seenTrackerSize: number;
    rateLimiterEntries: number;
  };

  /** Snapshot timestamp */
  snapshotAt: number;
}

// ============================================================================
// Metrics Collector
// ============================================================================

export interface NostrMetrics {
  /** Emit a metric event */
  emit: (
    name: MetricName,
    value?: number,
    labels?: Record<string, string | number>
  ) => void;

  /** Get current metrics snapshot */
  getSnapshot: () => MetricsSnapshot;

  /** Reset all metrics to zero */
  reset: () => void;
}

/**
 * Create a metrics collector instance.
 * Optionally pass an onMetric callback to receive real-time metric events.
 */
export function createMetrics(onMetric?: OnMetricCallback): NostrMetrics {
  // Counters
  let eventsReceived = 0;
  let eventsProcessed = 0;
  let eventsDuplicate = 0;
  const eventsRejected = {
    invalidShape: 0,
    wrongKind: 0,
    stale: 0,
    future: 0,
    rateLimited: 0,
    invalidSignature: 0,
    oversizedCiphertext: 0,
    oversizedPlaintext: 0,
    decryptFailed: 0,
    selfMessage: 0,
  };

  // Per-relay stats
  const relays = new Map<
    string,
    {
      connects: number;
      disconnects: number;
      reconnects: number;
      errors: number;
      messagesReceived: {
        event: number;
        eose: number;
        closed: number;
        notice: number;
        ok: number;
        auth: number;
      };
      circuitBreakerState: "closed" | "open" | "half_open";
      circuitBreakerOpens: number;
      circuitBreakerCloses: number;
    }
  >();

  // Rate limiting stats
  const rateLimiting = {
    perSenderHits: 0,
    globalHits: 0,
  };

  // Decrypt stats
  const decrypt = {
    success: 0,
    failure: 0,
  };

  // Memory stats (updated via gauge-style metrics)
  const memory = {
    seenTrackerSize: 0,
    rateLimiterEntries: 0,
  };

  function getOrCreateRelay(url: string) {
    let relay = relays.get(url);
    if (!relay) {
      relay = {
        connects: 0,
        disconnects: 0,
        reconnects: 0,
        errors: 0,
        messagesReceived: {
          event: 0,
          eose: 0,
          closed: 0,
          notice: 0,
          ok: 0,
          auth: 0,
        },
        circuitBreakerState: "closed",
        circuitBreakerOpens: 0,
        circuitBreakerCloses: 0,
      };
      relays.set(url, relay);
    }
    return relay;
  }

  function emit(
    name: MetricName,
    value: number = 1,
    labels?: Record<string, string | number>
  ): void {
    // Fire callback if provided
    if (onMetric) {
      onMetric({
        name,
        value,
        timestamp: Date.now(),
        labels,
      });
    }

    // Update internal counters
    const relayUrl = labels?.relay as string | undefined;

    switch (name) {
      // Event metrics
      case "event.received":
        eventsReceived += value;
        break;
      case "event.processed":
        eventsProcessed += value;
        break;
      case "event.duplicate":
        eventsDuplicate += value;
        break;
      case "event.rejected.invalid_shape":
        eventsRejected.invalidShape += value;
        break;
      case "event.rejected.wrong_kind":
        eventsRejected.wrongKind += value;
        break;
      case "event.rejected.stale":
        eventsRejected.stale += value;
        break;
      case "event.rejected.future":
        eventsRejected.future += value;
        break;
      case "event.rejected.rate_limited":
        eventsRejected.rateLimited += value;
        break;
      case "event.rejected.invalid_signature":
        eventsRejected.invalidSignature += value;
        break;
      case "event.rejected.oversized_ciphertext":
        eventsRejected.oversizedCiphertext += value;
        break;
      case "event.rejected.oversized_plaintext":
        eventsRejected.oversizedPlaintext += value;
        break;
      case "event.rejected.decrypt_failed":
        eventsRejected.decryptFailed += value;
        break;
      case "event.rejected.self_message":
        eventsRejected.selfMessage += value;
        break;

      // Relay metrics
      case "relay.connect":
        if (relayUrl) getOrCreateRelay(relayUrl).connects += value;
        break;
      case "relay.disconnect":
        if (relayUrl) getOrCreateRelay(relayUrl).disconnects += value;
        break;
      case "relay.reconnect":
        if (relayUrl) getOrCreateRelay(relayUrl).reconnects += value;
        break;
      case "relay.error":
        if (relayUrl) getOrCreateRelay(relayUrl).errors += value;
        break;
      case "relay.message.event":
        if (relayUrl) getOrCreateRelay(relayUrl).messagesReceived.event += value;
        break;
      case "relay.message.eose":
        if (relayUrl) getOrCreateRelay(relayUrl).messagesReceived.eose += value;
        break;
      case "relay.message.closed":
        if (relayUrl) getOrCreateRelay(relayUrl).messagesReceived.closed += value;
        break;
      case "relay.message.notice":
        if (relayUrl) getOrCreateRelay(relayUrl).messagesReceived.notice += value;
        break;
      case "relay.message.ok":
        if (relayUrl) getOrCreateRelay(relayUrl).messagesReceived.ok += value;
        break;
      case "relay.message.auth":
        if (relayUrl) getOrCreateRelay(relayUrl).messagesReceived.auth += value;
        break;
      case "relay.circuit_breaker.open":
        if (relayUrl) {
          const r = getOrCreateRelay(relayUrl);
          r.circuitBreakerState = "open";
          r.circuitBreakerOpens += value;
        }
        break;
      case "relay.circuit_breaker.close":
        if (relayUrl) {
          const r = getOrCreateRelay(relayUrl);
          r.circuitBreakerState = "closed";
          r.circuitBreakerCloses += value;
        }
        break;
      case "relay.circuit_breaker.half_open":
        if (relayUrl) {
          getOrCreateRelay(relayUrl).circuitBreakerState = "half_open";
        }
        break;

      // Rate limiting
      case "rate_limit.per_sender":
        rateLimiting.perSenderHits += value;
        break;
      case "rate_limit.global":
        rateLimiting.globalHits += value;
        break;

      // Decrypt
      case "decrypt.success":
        decrypt.success += value;
        break;
      case "decrypt.failure":
        decrypt.failure += value;
        break;

      // Memory (gauge-style - value replaces, not adds)
      case "memory.seen_tracker_size":
        memory.seenTrackerSize = value;
        break;
      case "memory.rate_limiter_entries":
        memory.rateLimiterEntries = value;
        break;
    }
  }

  function getSnapshot(): MetricsSnapshot {
    // Convert relay map to object
    const relaysObj: MetricsSnapshot["relays"] = {};
    for (const [url, stats] of relays) {
      relaysObj[url] = { ...stats, messagesReceived: { ...stats.messagesReceived } };
    }

    return {
      eventsReceived,
      eventsProcessed,
      eventsDuplicate,
      eventsRejected: { ...eventsRejected },
      relays: relaysObj,
      rateLimiting: { ...rateLimiting },
      decrypt: { ...decrypt },
      memory: { ...memory },
      snapshotAt: Date.now(),
    };
  }

  function reset(): void {
    eventsReceived = 0;
    eventsProcessed = 0;
    eventsDuplicate = 0;
    Object.assign(eventsRejected, {
      invalidShape: 0,
      wrongKind: 0,
      stale: 0,
      future: 0,
      rateLimited: 0,
      invalidSignature: 0,
      oversizedCiphertext: 0,
      oversizedPlaintext: 0,
      decryptFailed: 0,
      selfMessage: 0,
    });
    relays.clear();
    rateLimiting.perSenderHits = 0;
    rateLimiting.globalHits = 0;
    decrypt.success = 0;
    decrypt.failure = 0;
    memory.seenTrackerSize = 0;
    memory.rateLimiterEntries = 0;
  }

  return { emit, getSnapshot, reset };
}

/**
 * Create a no-op metrics instance (for when metrics are disabled).
 */
export function createNoopMetrics(): NostrMetrics {
  const emptySnapshot: MetricsSnapshot = {
    eventsReceived: 0,
    eventsProcessed: 0,
    eventsDuplicate: 0,
    eventsRejected: {
      invalidShape: 0,
      wrongKind: 0,
      stale: 0,
      future: 0,
      rateLimited: 0,
      invalidSignature: 0,
      oversizedCiphertext: 0,
      oversizedPlaintext: 0,
      decryptFailed: 0,
      selfMessage: 0,
    },
    relays: {},
    rateLimiting: { perSenderHits: 0, globalHits: 0 },
    decrypt: { success: 0, failure: 0 },
    memory: { seenTrackerSize: 0, rateLimiterEntries: 0 },
    snapshotAt: 0,
  };

  return {
    emit: () => {},
    getSnapshot: () => ({ ...emptySnapshot, snapshotAt: Date.now() }),
    reset: () => {},
  };
}
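The `emit()` switch above treats almost every metric as a counter (`+= value`) but the `memory.*` names as gauges (`= value`, latest reading wins). A stripped-down, self-contained sketch of that distinction (the two-map `emit` here is illustrative, not the module's implementation):

```typescript
// Minimal counter-vs-gauge accumulator mirroring the emit() semantics above:
// counter metrics add their value; gauge ("memory.*") metrics replace it.
const counters = new Map<string, number>();
const gauges = new Map<string, number>();

function emit(name: string, value = 1): void {
  if (name.startsWith("memory.")) {
    gauges.set(name, value); // gauge: latest reading wins
  } else {
    counters.set(name, (counters.get(name) ?? 0) + value); // counter: accumulate
  }
}

emit("event.received");
emit("event.received");
emit("memory.seen_tracker_size", 10);
emit("memory.seen_tracker_size", 4);
console.log(counters.get("event.received"), gauges.get("memory.seen_tracker_size"));
```

This is why `getMetrics()` can report `seenTrackerSize` shrinking over time while every counter is monotonically non-decreasing until `reset()`.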
@@ -1,544 +0,0 @@
|
||||
import { describe, expect, it } from "vitest";
|
||||
import { validatePrivateKey, isValidPubkey, normalizePubkey } from "./nostr-bus.js";
|
||||
import { createSeenTracker } from "./seen-tracker.js";
|
||||
import { createMetrics, type MetricName } from "./metrics.js";
|
||||
|
||||
// ============================================================================
|
||||
// Fuzz Tests for validatePrivateKey
|
||||
// ============================================================================
|
||||
|
||||
describe("validatePrivateKey fuzz", () => {
|
||||
describe("type confusion", () => {
|
||||
it("rejects null input", () => {
|
||||
expect(() => validatePrivateKey(null as unknown as string)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects undefined input", () => {
|
||||
expect(() => validatePrivateKey(undefined as unknown as string)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects number input", () => {
|
||||
expect(() => validatePrivateKey(123 as unknown as string)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects boolean input", () => {
|
||||
expect(() => validatePrivateKey(true as unknown as string)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects object input", () => {
|
||||
expect(() => validatePrivateKey({} as unknown as string)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects array input", () => {
|
||||
expect(() => validatePrivateKey([] as unknown as string)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects function input", () => {
|
||||
expect(() => validatePrivateKey((() => {}) as unknown as string)).toThrow();
|
||||
});
|
||||
});
|
||||
|
||||
describe("unicode attacks", () => {
|
||||
it("rejects unicode lookalike characters", () => {
|
||||
// Using zero-width characters
|
||||
const withZeroWidth =
|
||||
"0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcde\u200Bf";
|
||||
expect(() => validatePrivateKey(withZeroWidth)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects RTL override", () => {
|
||||
const withRtl =
|
||||
"\u202E0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";
|
||||
expect(() => validatePrivateKey(withRtl)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects homoglyph 'a' (Cyrillic а)", () => {
|
||||
// Using Cyrillic 'а' (U+0430) instead of Latin 'a'
|
||||
const withCyrillicA =
|
||||
"0123456789\u0430bcdef0123456789abcdef0123456789abcdef0123456789abcdef";
|
||||
expect(() => validatePrivateKey(withCyrillicA)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects emoji", () => {
|
||||
const withEmoji =
|
||||
"0123456789abcdef0123456789abcdef0123456789abcdef0123456789ab😀";
|
||||
expect(() => validatePrivateKey(withEmoji)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects combining characters", () => {
|
||||
// 'a' followed by combining acute accent
|
||||
const withCombining =
|
||||
"0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcde\u0301";
|
||||
expect(() => validatePrivateKey(withCombining)).toThrow();
|
||||
});
|
||||
});
|
||||
|
||||
describe("injection attempts", () => {
|
||||
it("rejects null byte injection", () => {
|
||||
const withNullByte =
|
||||
"0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcde\x00f";
|
||||
expect(() => validatePrivateKey(withNullByte)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects newline injection", () => {
|
||||
const withNewline =
|
||||
"0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcde\nf";
|
||||
expect(() => validatePrivateKey(withNewline)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects carriage return injection", () => {
|
||||
const withCR =
|
||||
"0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcde\rf";
|
||||
expect(() => validatePrivateKey(withCR)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects tab injection", () => {
|
||||
const withTab =
|
||||
"0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcde\tf";
|
||||
expect(() => validatePrivateKey(withTab)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects form feed injection", () => {
|
||||
const withFormFeed =
|
||||
"0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcde\ff";
|
||||
expect(() => validatePrivateKey(withFormFeed)).toThrow();
|
||||
});
|
||||
});
|
||||
|
||||
describe("edge cases", () => {
|
||||
it("rejects very long string", () => {
|
||||
const veryLong = "a".repeat(10000);
|
||||
expect(() => validatePrivateKey(veryLong)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects string of spaces matching length", () => {
|
||||
const spaces = " ".repeat(64);
|
||||
expect(() => validatePrivateKey(spaces)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects hex with spaces between characters", () => {
|
||||
const withSpaces =
|
||||
"01 23 45 67 89 ab cd ef 01 23 45 67 89 ab cd ef 01 23 45 67 89 ab cd ef 01 23 45 67 89 ab cd ef";
|
||||
expect(() => validatePrivateKey(withSpaces)).toThrow();
|
||||
});
|
||||
});
|
||||
|
||||
describe("nsec format edge cases", () => {
|
||||
it("rejects nsec with invalid bech32 characters", () => {
|
||||
// 'b', 'i', 'o' are not valid bech32 characters
|
||||
const invalidBech32 = "nsec1qypqxpq9qtpqscx7peytbfwtdjmcv0mrz5rjpej8vjppfkqfqy8skqfv3l";
|
||||
expect(() => validatePrivateKey(invalidBech32)).toThrow();
|
||||
});
|
||||
|
||||
it("rejects nsec with wrong prefix", () => {
|
||||
expect(() => validatePrivateKey("nsec0aaaa")).toThrow();
|
||||
});
|
||||
|
||||
it("rejects partial nsec", () => {
|
||||
expect(() => validatePrivateKey("nsec1")).toThrow();
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
// ============================================================================
|
||||
// Fuzz Tests for isValidPubkey
|
||||
// ============================================================================
|
||||
|
||||
describe("isValidPubkey fuzz", () => {
|
||||
describe("type confusion", () => {
|
||||
it("handles null gracefully", () => {
|
||||
expect(isValidPubkey(null as unknown as string)).toBe(false);
|
||||
});
|
||||
|
||||
it("handles undefined gracefully", () => {
|
||||
expect(isValidPubkey(undefined as unknown as string)).toBe(false);
|
||||
});
|
||||
|
||||
it("handles number gracefully", () => {
|
||||
expect(isValidPubkey(123 as unknown as string)).toBe(false);
|
||||
});
|
||||
|
||||
it("handles object gracefully", () => {
|
||||
expect(isValidPubkey({} as unknown as string)).toBe(false);
|
||||
});
|
||||
});
|
||||
|
||||
describe("malicious inputs", () => {
|
||||
it("rejects __proto__ key", () => {
|
||||
expect(isValidPubkey("__proto__")).toBe(false);
|
||||
});
|
||||
|
||||
it("rejects constructor key", () => {
|
||||
expect(isValidPubkey("constructor")).toBe(false);
|
||||
});
|
||||
|
||||
it("rejects toString key", () => {
|
||||
expect(isValidPubkey("toString")).toBe(false);
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
// ============================================================================
|
||||
// Fuzz Tests for normalizePubkey
|
||||
// ============================================================================
|
||||
|
||||
describe("normalizePubkey fuzz", () => {
|
||||
describe("prototype pollution attempts", () => {
|
||||
it("throws for __proto__", () => {
|
||||
expect(() => normalizePubkey("__proto__")).toThrow();
|
||||
});
|
||||
|
||||
it("throws for constructor", () => {
|
||||
expect(() => normalizePubkey("constructor")).toThrow();
|
||||
});
|
||||
|
||||
it("throws for prototype", () => {
|
||||
expect(() => normalizePubkey("prototype")).toThrow();
|
||||
});
|
||||
});
|
||||
|
||||
describe("case sensitivity", () => {
|
||||
it("normalizes uppercase to lowercase", () => {
|
||||
const upper = "0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF";
|
||||
const lower = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";
|
||||
expect(normalizePubkey(upper)).toBe(lower);
|
||||
});
|
||||
|
||||
it("normalizes mixed case to lowercase", () => {
|
||||
const mixed = "0123456789AbCdEf0123456789AbCdEf0123456789AbCdEf0123456789AbCdEf";
|
||||
const lower = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";
|
||||
expect(normalizePubkey(mixed)).toBe(lower);
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
// ============================================================================
// Fuzz Tests for SeenTracker
// ============================================================================

describe("SeenTracker fuzz", () => {
  describe("malformed IDs", () => {
    it("handles empty string IDs", () => {
      const tracker = createSeenTracker({ maxEntries: 100 });
      expect(() => tracker.add("")).not.toThrow();
      expect(tracker.peek("")).toBe(true);
      tracker.stop();
    });

    it("handles very long IDs", () => {
      const tracker = createSeenTracker({ maxEntries: 100 });
      const longId = "a".repeat(100000);
      expect(() => tracker.add(longId)).not.toThrow();
      expect(tracker.peek(longId)).toBe(true);
      tracker.stop();
    });

    it("handles unicode IDs", () => {
      const tracker = createSeenTracker({ maxEntries: 100 });
      const unicodeId = "事件ID_🎉_тест";
      expect(() => tracker.add(unicodeId)).not.toThrow();
      expect(tracker.peek(unicodeId)).toBe(true);
      tracker.stop();
    });

    it("handles IDs with null bytes", () => {
      const tracker = createSeenTracker({ maxEntries: 100 });
      const idWithNull = "event\x00id";
      expect(() => tracker.add(idWithNull)).not.toThrow();
      expect(tracker.peek(idWithNull)).toBe(true);
      tracker.stop();
    });

    it("handles prototype property names as IDs", () => {
      const tracker = createSeenTracker({ maxEntries: 100 });

      // These should not affect the tracker's internal operation
      expect(() => tracker.add("__proto__")).not.toThrow();
      expect(() => tracker.add("constructor")).not.toThrow();
      expect(() => tracker.add("toString")).not.toThrow();
      expect(() => tracker.add("hasOwnProperty")).not.toThrow();

      expect(tracker.peek("__proto__")).toBe(true);
      expect(tracker.peek("constructor")).toBe(true);
      expect(tracker.peek("toString")).toBe(true);
      expect(tracker.peek("hasOwnProperty")).toBe(true);

      tracker.stop();
    });
  });

  describe("rapid operations", () => {
    it("handles rapid add/check cycles", () => {
      const tracker = createSeenTracker({ maxEntries: 1000 });

      for (let i = 0; i < 10000; i++) {
        const id = `event-${i}`;
        tracker.add(id);
        // Recently added should be findable
        if (i < 1000) {
          tracker.peek(id);
        }
      }

      // Size should be capped at maxEntries
      expect(tracker.size()).toBeLessThanOrEqual(1000);
      tracker.stop();
    });

    it("handles concurrent-style operations", () => {
      const tracker = createSeenTracker({ maxEntries: 100 });

      // Simulate interleaved operations
      for (let i = 0; i < 100; i++) {
        tracker.add(`add-${i}`);
        tracker.peek(`peek-${i}`);
        tracker.has(`has-${i}`);
        if (i % 10 === 0) {
          tracker.delete(`add-${i - 5}`);
        }
      }

      expect(() => tracker.size()).not.toThrow();
      tracker.stop();
    });
  });

  describe("seed edge cases", () => {
    it("handles empty seed array", () => {
      const tracker = createSeenTracker({ maxEntries: 100 });
      expect(() => tracker.seed([])).not.toThrow();
      expect(tracker.size()).toBe(0);
      tracker.stop();
    });

    it("handles seed with duplicate IDs", () => {
      const tracker = createSeenTracker({ maxEntries: 100 });
      tracker.seed(["id1", "id1", "id1", "id2", "id2"]);
      expect(tracker.size()).toBe(2);
      tracker.stop();
    });

    it("handles seed larger than maxEntries", () => {
      const tracker = createSeenTracker({ maxEntries: 5 });
      const ids = Array.from({ length: 100 }, (_, i) => `id-${i}`);
      tracker.seed(ids);
      expect(tracker.size()).toBeLessThanOrEqual(5);
      tracker.stop();
    });
  });
});

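The SeenTracker fuzz tests above exercise an LRU cache keyed by event ID. A minimal Map-based sketch of that eviction behavior follows; the shape is reconstructed from the assertions only, `createSeenTrackerSketch` is a hypothetical name, and the real `createSeenTracker` additionally handles TTLs and timer-based pruning:

```typescript
// Minimal LRU sketch of the seen-tracker behavior exercised above.
// Assumption: a Map's insertion order doubles as recency order.
function createSeenTrackerSketch(opts: { maxEntries: number }) {
  const seen = new Map<string, true>();
  const touch = (id: string): void => {
    seen.delete(id); // re-inserting moves the id to the most-recent position
    seen.set(id, true);
    if (seen.size > opts.maxEntries) {
      // Evict the least recently used id (first key in insertion order)
      seen.delete(seen.keys().next().value as string);
    }
  };
  return {
    add: touch,
    peek: (id: string): boolean => seen.has(id), // check without refreshing
    has: (id: string): boolean => {
      const wasSeen = seen.has(id);
      touch(id); // check-and-add, refreshing recency
      return wasSeen;
    },
    delete: (id: string): boolean => seen.delete(id),
    size: (): number => seen.size,
    stop: (): void => {}, // no timers in this sketch
  };
}
```

Using a plain `Map` this way sidesteps the prototype-pollution concerns the fuzz tests probe: keys like `"__proto__"` are ordinary map entries, never object properties.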
// ============================================================================
// Fuzz Tests for Metrics
// ============================================================================

describe("Metrics fuzz", () => {
  describe("invalid metric names", () => {
    it("handles unknown metric names gracefully", () => {
      const metrics = createMetrics();

      // Cast to bypass type checking - testing runtime behavior
      expect(() => {
        metrics.emit("invalid.metric.name" as MetricName);
      }).not.toThrow();
    });
  });

  describe("invalid label values", () => {
    it("handles null relay label", () => {
      const metrics = createMetrics();
      expect(() => {
        metrics.emit("relay.connect", 1, { relay: null as unknown as string });
      }).not.toThrow();
    });

    it("handles undefined relay label", () => {
      const metrics = createMetrics();
      expect(() => {
        metrics.emit("relay.connect", 1, { relay: undefined as unknown as string });
      }).not.toThrow();
    });

    it("handles very long relay URL", () => {
      const metrics = createMetrics();
      const longUrl = "wss://" + "a".repeat(10000) + ".com";
      expect(() => {
        metrics.emit("relay.connect", 1, { relay: longUrl });
      }).not.toThrow();

      const snapshot = metrics.getSnapshot();
      expect(snapshot.relays[longUrl]).toBeDefined();
    });
  });

  describe("extreme values", () => {
    it("handles NaN value", () => {
      const metrics = createMetrics();
      expect(() => metrics.emit("event.received", NaN)).not.toThrow();

      const snapshot = metrics.getSnapshot();
      expect(isNaN(snapshot.eventsReceived)).toBe(true);
    });

    it("handles Infinity value", () => {
      const metrics = createMetrics();
      expect(() => metrics.emit("event.received", Infinity)).not.toThrow();

      const snapshot = metrics.getSnapshot();
      expect(snapshot.eventsReceived).toBe(Infinity);
    });

    it("handles negative value", () => {
      const metrics = createMetrics();
      metrics.emit("event.received", -1);

      const snapshot = metrics.getSnapshot();
      expect(snapshot.eventsReceived).toBe(-1);
    });

    it("handles very large value", () => {
      const metrics = createMetrics();
      metrics.emit("event.received", Number.MAX_SAFE_INTEGER);

      const snapshot = metrics.getSnapshot();
      expect(snapshot.eventsReceived).toBe(Number.MAX_SAFE_INTEGER);
    });
  });

  describe("rapid emissions", () => {
    it("handles many rapid emissions", () => {
      const events: unknown[] = [];
      const metrics = createMetrics((e) => events.push(e));

      for (let i = 0; i < 10000; i++) {
        metrics.emit("event.received");
      }

      expect(events).toHaveLength(10000);
      const snapshot = metrics.getSnapshot();
      expect(snapshot.eventsReceived).toBe(10000);
    });
  });

  describe("reset during operation", () => {
    it("handles reset mid-operation safely", () => {
      const metrics = createMetrics();

      metrics.emit("event.received");
      metrics.emit("event.received");
      metrics.reset();
      metrics.emit("event.received");

      const snapshot = metrics.getSnapshot();
      expect(snapshot.eventsReceived).toBe(1);
    });
  });
});

// ============================================================================
// Event Shape Validation (simulating malformed events)
// ============================================================================

describe("Event shape validation", () => {
  describe("malformed event structures", () => {
    // These test what happens if malformed data somehow gets through

    it("identifies missing required fields", () => {
      const malformedEvents = [
        {}, // empty
        { id: "abc" }, // missing pubkey, created_at, etc.
        { id: null, pubkey: null }, // null values
        { id: 123, pubkey: 456 }, // wrong types
        { tags: "not-an-array" }, // wrong type for tags
        { tags: [[1, 2, 3]] }, // wrong type for tag elements
      ];

      for (const event of malformedEvents) {
        // These should be caught by shape validation before processing
        const hasId = typeof (event as { id?: unknown })?.id === "string";
        const hasPubkey = typeof (event as { pubkey?: unknown })?.pubkey === "string";
        const hasTags = Array.isArray((event as { tags?: unknown })?.tags);

        // At least one should be invalid
        expect(hasId && hasPubkey && hasTags).toBe(false);
      }
    });
  });

  describe("timestamp edge cases", () => {
    const testTimestamps = [
      { value: NaN, desc: "NaN" },
      { value: Infinity, desc: "Infinity" },
      { value: -Infinity, desc: "-Infinity" },
      { value: -1, desc: "negative" },
      { value: 0, desc: "zero" },
      { value: 253402300800, desc: "year 10000" }, // Far future
      { value: -62135596800, desc: "year 0001" }, // Far past
      { value: 1.5, desc: "float" },
    ];

    for (const { value, desc } of testTimestamps) {
      it(`handles ${desc} timestamp`, () => {
        const isValidTimestamp =
          typeof value === "number" &&
          !isNaN(value) &&
          isFinite(value) &&
          value >= 0 &&
          Number.isInteger(value);

        // Timestamps should be validated as positive integers
        if (["NaN", "Infinity", "-Infinity", "negative", "float"].includes(desc)) {
          expect(isValidTimestamp).toBe(false);
        }
      });
    }
  });
});

// ============================================================================
// JSON parsing edge cases (simulating relay responses)
// ============================================================================

describe("JSON parsing edge cases", () => {
  const malformedJsonCases = [
    { input: "", desc: "empty string" },
    { input: "null", desc: "null literal" },
    { input: "undefined", desc: "undefined literal" },
    { input: "{", desc: "incomplete object" },
    { input: "[", desc: "incomplete array" },
    { input: '{"key": undefined}', desc: "undefined value" },
    { input: "{'key': 'value'}", desc: "single quotes" },
    { input: '{"key": NaN}', desc: "NaN value" },
    { input: '{"key": Infinity}', desc: "Infinity value" },
    { input: "\x00", desc: "null byte" },
    { input: "abc", desc: "plain string" },
    { input: "123", desc: "plain number" },
  ];

  for (const { input, desc } of malformedJsonCases) {
    it(`handles malformed JSON: ${desc}`, () => {
      let parsed: unknown;
      let parseError = false;

      try {
        parsed = JSON.parse(input);
      } catch {
        parseError = true;
      }

      // Either it throws or produces something that needs validation
      if (!parseError) {
        // If it parsed, we need to validate the structure
        const isValidRelayMessage =
          Array.isArray(parsed) &&
          parsed.length >= 2 &&
          typeof parsed[0] === "string";

        // Most malformed cases won't produce valid relay messages
        if (["null literal", "plain number", "plain string"].includes(desc)) {
          expect(isValidRelayMessage).toBe(false);
        }
      }
    });
  }
});
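The relay-message check these JSON cases probe can be sketched as a small type guard plus a parse wrapper. Both names here are hypothetical, reconstructed from the `isValidRelayMessage` logic in the tests above rather than taken from the plugin's actual source:

```typescript
// Type guard for the relay-message shape the tests above check: an array of
// at least two elements whose first element is a string, e.g.
// ["EVENT", subscriptionId, event] or ["EOSE", subscriptionId].
function isRelayMessage(value: unknown): value is [string, ...unknown[]] {
  return Array.isArray(value) && value.length >= 2 && typeof value[0] === "string";
}

function parseRelayFrame(raw: string): [string, ...unknown[]] | null {
  try {
    const parsed: unknown = JSON.parse(raw);
    return isRelayMessage(parsed) ? parsed : null;
  } catch {
    return null; // malformed JSON is treated the same as a bad shape
  }
}
```

Collapsing parse failures and shape failures into one `null` keeps the caller's handling uniform, which is the behavior the malformed-JSON cases above rely on.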
@@ -1,452 +0,0 @@
import { describe, expect, it, vi, beforeEach, afterEach } from "vitest";
import { createSeenTracker } from "./seen-tracker.js";
import {
  createMetrics,
  createNoopMetrics,
  type MetricEvent,
} from "./metrics.js";

// ============================================================================
// Seen Tracker Integration Tests
// ============================================================================

describe("SeenTracker", () => {
  describe("basic operations", () => {
    it("tracks seen IDs", () => {
      const tracker = createSeenTracker({ maxEntries: 100, ttlMs: 60000 });

      // First check returns false and adds
      expect(tracker.has("id1")).toBe(false);
      // Second check returns true (already seen)
      expect(tracker.has("id1")).toBe(true);

      tracker.stop();
    });

    it("peek does not add", () => {
      const tracker = createSeenTracker({ maxEntries: 100, ttlMs: 60000 });

      expect(tracker.peek("id1")).toBe(false);
      expect(tracker.peek("id1")).toBe(false); // Still false

      tracker.add("id1");
      expect(tracker.peek("id1")).toBe(true);

      tracker.stop();
    });

    it("delete removes entries", () => {
      const tracker = createSeenTracker({ maxEntries: 100, ttlMs: 60000 });

      tracker.add("id1");
      expect(tracker.peek("id1")).toBe(true);

      tracker.delete("id1");
      expect(tracker.peek("id1")).toBe(false);

      tracker.stop();
    });

    it("clear removes all entries", () => {
      const tracker = createSeenTracker({ maxEntries: 100, ttlMs: 60000 });

      tracker.add("id1");
      tracker.add("id2");
      tracker.add("id3");
      expect(tracker.size()).toBe(3);

      tracker.clear();
      expect(tracker.size()).toBe(0);
      expect(tracker.peek("id1")).toBe(false);

      tracker.stop();
    });

    it("seed pre-populates entries", () => {
      const tracker = createSeenTracker({ maxEntries: 100, ttlMs: 60000 });

      tracker.seed(["id1", "id2", "id3"]);
      expect(tracker.size()).toBe(3);
      expect(tracker.peek("id1")).toBe(true);
      expect(tracker.peek("id2")).toBe(true);
      expect(tracker.peek("id3")).toBe(true);

      tracker.stop();
    });
  });

  describe("LRU eviction", () => {
    it("evicts least recently used when at capacity", () => {
      const tracker = createSeenTracker({ maxEntries: 3, ttlMs: 60000 });

      tracker.add("id1");
      tracker.add("id2");
      tracker.add("id3");
      expect(tracker.size()).toBe(3);

      // Adding fourth should evict oldest (id1)
      tracker.add("id4");
      expect(tracker.size()).toBe(3);
      expect(tracker.peek("id1")).toBe(false); // Evicted
      expect(tracker.peek("id2")).toBe(true);
      expect(tracker.peek("id3")).toBe(true);
      expect(tracker.peek("id4")).toBe(true);

      tracker.stop();
    });

    it("accessing an entry moves it to front (prevents eviction)", () => {
      const tracker = createSeenTracker({ maxEntries: 3, ttlMs: 60000 });

      tracker.add("id1");
      tracker.add("id2");
      tracker.add("id3");

      // Access id1, moving it to front
      tracker.has("id1");

      // Add id4 - should evict id2 (now oldest)
      tracker.add("id4");
      expect(tracker.peek("id1")).toBe(true); // Not evicted, was accessed
      expect(tracker.peek("id2")).toBe(false); // Evicted
      expect(tracker.peek("id3")).toBe(true);
      expect(tracker.peek("id4")).toBe(true);

      tracker.stop();
    });

    it("handles capacity of 1", () => {
      const tracker = createSeenTracker({ maxEntries: 1, ttlMs: 60000 });

      tracker.add("id1");
      expect(tracker.peek("id1")).toBe(true);

      tracker.add("id2");
      expect(tracker.peek("id1")).toBe(false);
      expect(tracker.peek("id2")).toBe(true);

      tracker.stop();
    });

    it("seed respects maxEntries", () => {
      const tracker = createSeenTracker({ maxEntries: 2, ttlMs: 60000 });

      tracker.seed(["id1", "id2", "id3", "id4"]);
      expect(tracker.size()).toBe(2);
      // Seed stops when maxEntries reached, processing from end to start
      // So id4 and id3 get added first, then we're at capacity
      expect(tracker.peek("id3")).toBe(true);
      expect(tracker.peek("id4")).toBe(true);

      tracker.stop();
    });
  });

  describe("TTL expiration", () => {
    it("expires entries after TTL", async () => {
      vi.useFakeTimers();

      const tracker = createSeenTracker({
        maxEntries: 100,
        ttlMs: 100,
        pruneIntervalMs: 50,
      });

      tracker.add("id1");
      expect(tracker.peek("id1")).toBe(true);

      // Advance past TTL
      vi.advanceTimersByTime(150);

      // Entry should be expired
      expect(tracker.peek("id1")).toBe(false);

      tracker.stop();
      vi.useRealTimers();
    });

    it("has() refreshes TTL", async () => {
      vi.useFakeTimers();

      const tracker = createSeenTracker({
        maxEntries: 100,
        ttlMs: 100,
        pruneIntervalMs: 50,
      });

      tracker.add("id1");

      // Advance halfway
      vi.advanceTimersByTime(50);

      // Access to refresh
      expect(tracker.has("id1")).toBe(true);

      // Advance another 75ms (total 125ms from add, but only 75ms from last access)
      vi.advanceTimersByTime(75);

      // Should still be valid (refreshed at 50ms)
      expect(tracker.peek("id1")).toBe(true);

      tracker.stop();
      vi.useRealTimers();
    });
  });
});

// ============================================================================
// Metrics Integration Tests
// ============================================================================

describe("Metrics", () => {
  describe("createMetrics", () => {
    it("emits metric events to callback", () => {
      const events: MetricEvent[] = [];
      const metrics = createMetrics((event) => events.push(event));

      metrics.emit("event.received");
      metrics.emit("event.processed");
      metrics.emit("event.duplicate");

      expect(events).toHaveLength(3);
      expect(events[0].name).toBe("event.received");
      expect(events[1].name).toBe("event.processed");
      expect(events[2].name).toBe("event.duplicate");
    });

    it("includes labels in metric events", () => {
      const events: MetricEvent[] = [];
      const metrics = createMetrics((event) => events.push(event));

      metrics.emit("relay.connect", 1, { relay: "wss://relay.example.com" });

      expect(events[0].labels).toEqual({ relay: "wss://relay.example.com" });
    });

    it("accumulates counters in snapshot", () => {
      const metrics = createMetrics();

      metrics.emit("event.received");
      metrics.emit("event.received");
      metrics.emit("event.processed");
      metrics.emit("event.duplicate");
      metrics.emit("event.duplicate");
      metrics.emit("event.duplicate");

      const snapshot = metrics.getSnapshot();
      expect(snapshot.eventsReceived).toBe(2);
      expect(snapshot.eventsProcessed).toBe(1);
      expect(snapshot.eventsDuplicate).toBe(3);
    });

    it("tracks per-relay stats", () => {
      const metrics = createMetrics();

      metrics.emit("relay.connect", 1, { relay: "wss://relay1.com" });
      metrics.emit("relay.connect", 1, { relay: "wss://relay2.com" });
      metrics.emit("relay.error", 1, { relay: "wss://relay1.com" });
      metrics.emit("relay.error", 1, { relay: "wss://relay1.com" });

      const snapshot = metrics.getSnapshot();
      expect(snapshot.relays["wss://relay1.com"]).toBeDefined();
      expect(snapshot.relays["wss://relay1.com"].connects).toBe(1);
      expect(snapshot.relays["wss://relay1.com"].errors).toBe(2);
      expect(snapshot.relays["wss://relay2.com"].connects).toBe(1);
      expect(snapshot.relays["wss://relay2.com"].errors).toBe(0);
    });

    it("tracks circuit breaker state changes", () => {
      const metrics = createMetrics();

      metrics.emit("relay.circuit_breaker.open", 1, { relay: "wss://relay.com" });

      let snapshot = metrics.getSnapshot();
      expect(snapshot.relays["wss://relay.com"].circuitBreakerState).toBe("open");
      expect(snapshot.relays["wss://relay.com"].circuitBreakerOpens).toBe(1);

      metrics.emit("relay.circuit_breaker.close", 1, { relay: "wss://relay.com" });

      snapshot = metrics.getSnapshot();
      expect(snapshot.relays["wss://relay.com"].circuitBreakerState).toBe("closed");
      expect(snapshot.relays["wss://relay.com"].circuitBreakerCloses).toBe(1);
    });

    it("tracks all rejection reasons", () => {
      const metrics = createMetrics();

      metrics.emit("event.rejected.invalid_shape");
      metrics.emit("event.rejected.wrong_kind");
      metrics.emit("event.rejected.stale");
      metrics.emit("event.rejected.future");
      metrics.emit("event.rejected.rate_limited");
      metrics.emit("event.rejected.invalid_signature");
      metrics.emit("event.rejected.oversized_ciphertext");
      metrics.emit("event.rejected.oversized_plaintext");
      metrics.emit("event.rejected.decrypt_failed");
      metrics.emit("event.rejected.self_message");

      const snapshot = metrics.getSnapshot();
      expect(snapshot.eventsRejected.invalidShape).toBe(1);
      expect(snapshot.eventsRejected.wrongKind).toBe(1);
      expect(snapshot.eventsRejected.stale).toBe(1);
      expect(snapshot.eventsRejected.future).toBe(1);
      expect(snapshot.eventsRejected.rateLimited).toBe(1);
      expect(snapshot.eventsRejected.invalidSignature).toBe(1);
      expect(snapshot.eventsRejected.oversizedCiphertext).toBe(1);
      expect(snapshot.eventsRejected.oversizedPlaintext).toBe(1);
      expect(snapshot.eventsRejected.decryptFailed).toBe(1);
      expect(snapshot.eventsRejected.selfMessage).toBe(1);
    });

    it("tracks relay message types", () => {
      const metrics = createMetrics();

      metrics.emit("relay.message.event", 1, { relay: "wss://relay.com" });
      metrics.emit("relay.message.eose", 1, { relay: "wss://relay.com" });
      metrics.emit("relay.message.closed", 1, { relay: "wss://relay.com" });
      metrics.emit("relay.message.notice", 1, { relay: "wss://relay.com" });
      metrics.emit("relay.message.ok", 1, { relay: "wss://relay.com" });
      metrics.emit("relay.message.auth", 1, { relay: "wss://relay.com" });

      const snapshot = metrics.getSnapshot();
      const relay = snapshot.relays["wss://relay.com"];
      expect(relay.messagesReceived.event).toBe(1);
      expect(relay.messagesReceived.eose).toBe(1);
      expect(relay.messagesReceived.closed).toBe(1);
      expect(relay.messagesReceived.notice).toBe(1);
      expect(relay.messagesReceived.ok).toBe(1);
      expect(relay.messagesReceived.auth).toBe(1);
    });

    it("tracks decrypt success/failure", () => {
      const metrics = createMetrics();

      metrics.emit("decrypt.success");
      metrics.emit("decrypt.success");
      metrics.emit("decrypt.failure");

      const snapshot = metrics.getSnapshot();
      expect(snapshot.decrypt.success).toBe(2);
      expect(snapshot.decrypt.failure).toBe(1);
    });

    it("tracks memory gauges (replaces rather than accumulates)", () => {
      const metrics = createMetrics();

      metrics.emit("memory.seen_tracker_size", 100);
      metrics.emit("memory.seen_tracker_size", 150);
      metrics.emit("memory.seen_tracker_size", 125);

      const snapshot = metrics.getSnapshot();
      expect(snapshot.memory.seenTrackerSize).toBe(125); // Last value, not sum
    });

    it("reset clears all counters", () => {
      const metrics = createMetrics();

      metrics.emit("event.received");
      metrics.emit("event.processed");
      metrics.emit("relay.connect", 1, { relay: "wss://relay.com" });

      metrics.reset();

      const snapshot = metrics.getSnapshot();
      expect(snapshot.eventsReceived).toBe(0);
      expect(snapshot.eventsProcessed).toBe(0);
      expect(Object.keys(snapshot.relays)).toHaveLength(0);
    });
  });

  describe("createNoopMetrics", () => {
    it("does not throw on emit", () => {
      const metrics = createNoopMetrics();

      expect(() => {
        metrics.emit("event.received");
        metrics.emit("relay.connect", 1, { relay: "wss://relay.com" });
      }).not.toThrow();
    });

    it("returns empty snapshot", () => {
      const metrics = createNoopMetrics();

      const snapshot = metrics.getSnapshot();
      expect(snapshot.eventsReceived).toBe(0);
      expect(snapshot.eventsProcessed).toBe(0);
    });
  });
});

// ============================================================================
// Circuit Breaker Behavior Tests
// ============================================================================

describe("Circuit Breaker Behavior", () => {
  // Test the circuit breaker logic through metrics emissions
  it("emits circuit breaker metrics in correct sequence", () => {
    const events: MetricEvent[] = [];
    const metrics = createMetrics((event) => events.push(event));

    // Simulate 5 failures -> open
    for (let i = 0; i < 5; i++) {
      metrics.emit("relay.error", 1, { relay: "wss://relay.com" });
    }
    metrics.emit("relay.circuit_breaker.open", 1, { relay: "wss://relay.com" });

    // Simulate recovery
    metrics.emit("relay.circuit_breaker.half_open", 1, { relay: "wss://relay.com" });
    metrics.emit("relay.circuit_breaker.close", 1, { relay: "wss://relay.com" });

    const cbEvents = events.filter((e) => e.name.startsWith("relay.circuit_breaker"));
    expect(cbEvents).toHaveLength(3);
    expect(cbEvents[0].name).toBe("relay.circuit_breaker.open");
    expect(cbEvents[1].name).toBe("relay.circuit_breaker.half_open");
    expect(cbEvents[2].name).toBe("relay.circuit_breaker.close");
  });
});

// ============================================================================
// Health Scoring Behavior Tests
// ============================================================================

describe("Health Scoring", () => {
  it("metrics track relay errors for health scoring", () => {
    const metrics = createMetrics();

    // Simulate mixed success/failure pattern
    metrics.emit("relay.connect", 1, { relay: "wss://good-relay.com" });
    metrics.emit("relay.connect", 1, { relay: "wss://bad-relay.com" });

    metrics.emit("relay.error", 1, { relay: "wss://bad-relay.com" });
    metrics.emit("relay.error", 1, { relay: "wss://bad-relay.com" });
    metrics.emit("relay.error", 1, { relay: "wss://bad-relay.com" });

    const snapshot = metrics.getSnapshot();
    expect(snapshot.relays["wss://good-relay.com"].errors).toBe(0);
    expect(snapshot.relays["wss://bad-relay.com"].errors).toBe(3);
  });
});

// ============================================================================
// Reconnect Backoff Tests
// ============================================================================

describe("Reconnect Backoff", () => {
  it("computes delays within expected bounds", () => {
    // Compute expected delays (1s, 2s, 4s, 8s, 16s, 32s, 60s cap)
    const BASE = 1000;
    const MAX = 60000;
    const JITTER = 0.3;

    for (let attempt = 0; attempt < 10; attempt++) {
      const exponential = BASE * Math.pow(2, attempt);
      const capped = Math.min(exponential, MAX);
      const minDelay = capped * (1 - JITTER);
      const maxDelay = capped * (1 + JITTER);

      // These are the expected bounds
      expect(minDelay).toBeGreaterThanOrEqual(BASE * 0.7);
      expect(maxDelay).toBeLessThanOrEqual(MAX * 1.3);
    }
  });
});
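The jittered exponential backoff those bounds describe can be sketched as a single helper. The function name is a hypothetical stand-in for illustration; the constants mirror the test's `BASE`/`MAX`/`JITTER`, not necessarily the plugin's actual reconnect code:

```typescript
// Exponential backoff with +/-30% jitter, capped at 60s (sketch only).
function computeBackoffDelay(attempt: number): number {
  const BASE = 1000; // 1s initial delay
  const MAX = 60000; // 60s cap
  const JITTER = 0.3; // +/-30% randomization
  const capped = Math.min(BASE * Math.pow(2, attempt), MAX);
  // Uniform sample in [capped * (1 - JITTER), capped * (1 + JITTER)]
  return capped * (1 - JITTER + Math.random() * 2 * JITTER);
}
```

The jitter spreads reconnect attempts out in time, so a fleet of clients disconnected by the same relay outage does not retry in lockstep.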
@@ -1,199 +0,0 @@
import { describe, expect, it } from "vitest";
import {
  validatePrivateKey,
  getPublicKeyFromPrivate,
  isValidPubkey,
  normalizePubkey,
  pubkeyToNpub,
} from "./nostr-bus.js";

// Test private key (DO NOT use in production - this is a known test key)
const TEST_HEX_KEY = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";
const TEST_NSEC = "nsec1qypqxpq9qtpqscx7peytzfwtdjmcv0mrz5rjpej8vjppfkqfqy8skqfv3l";

describe("validatePrivateKey", () => {
  describe("hex format", () => {
    it("accepts valid 64-char hex key", () => {
      const result = validatePrivateKey(TEST_HEX_KEY);
      expect(result).toBeInstanceOf(Uint8Array);
      expect(result.length).toBe(32);
    });

    it("accepts lowercase hex", () => {
      const result = validatePrivateKey(TEST_HEX_KEY.toLowerCase());
      expect(result).toBeInstanceOf(Uint8Array);
    });

    it("accepts uppercase hex", () => {
      const result = validatePrivateKey(TEST_HEX_KEY.toUpperCase());
      expect(result).toBeInstanceOf(Uint8Array);
    });

    it("accepts mixed case hex", () => {
      const mixed = "0123456789ABCdef0123456789abcDEF0123456789abcdef0123456789ABCDEF";
      const result = validatePrivateKey(mixed);
      expect(result).toBeInstanceOf(Uint8Array);
    });

    it("trims whitespace", () => {
      const result = validatePrivateKey(` ${TEST_HEX_KEY} `);
      expect(result).toBeInstanceOf(Uint8Array);
    });

    it("trims newlines", () => {
      const result = validatePrivateKey(`${TEST_HEX_KEY}\n`);
      expect(result).toBeInstanceOf(Uint8Array);
    });

    it("rejects 63-char hex (too short)", () => {
      expect(() => validatePrivateKey(TEST_HEX_KEY.slice(0, 63))).toThrow(
        "Private key must be 64 hex characters"
      );
    });

    it("rejects 65-char hex (too long)", () => {
      expect(() => validatePrivateKey(TEST_HEX_KEY + "0")).toThrow(
        "Private key must be 64 hex characters"
      );
    });

    it("rejects non-hex characters", () => {
      const invalid = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdeg"; // 'g' at end
      expect(() => validatePrivateKey(invalid)).toThrow("Private key must be 64 hex characters");
    });

    it("rejects empty string", () => {
      expect(() => validatePrivateKey("")).toThrow("Private key must be 64 hex characters");
    });

    it("rejects whitespace-only string", () => {
      expect(() => validatePrivateKey("  ")).toThrow("Private key must be 64 hex characters");
    });

    it("rejects key with 0x prefix", () => {
      expect(() => validatePrivateKey("0x" + TEST_HEX_KEY)).toThrow(
        "Private key must be 64 hex characters"
      );
    });
  });

  describe("nsec format", () => {
    it("rejects invalid nsec (wrong checksum)", () => {
      const badNsec = "nsec1invalidinvalidinvalidinvalidinvalidinvalidinvalidinvalid";
      expect(() => validatePrivateKey(badNsec)).toThrow();
    });

    it("rejects npub (wrong type)", () => {
      const npub = "npub1qypqxpq9qtpqscx7peytzfwtdjmcv0mrz5rjpej8vjppfkqfqy8s5epk55";
      expect(() => validatePrivateKey(npub)).toThrow();
    });
  });
});

describe("isValidPubkey", () => {
  describe("hex format", () => {
    it("accepts valid 64-char hex pubkey", () => {
      const validHex = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";
      expect(isValidPubkey(validHex)).toBe(true);
    });

    it("accepts uppercase hex", () => {
      const validHex = "0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF";
      expect(isValidPubkey(validHex)).toBe(true);
    });

    it("rejects 63-char hex", () => {
      const shortHex = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcde";
      expect(isValidPubkey(shortHex)).toBe(false);
    });

    it("rejects 65-char hex", () => {
      const longHex = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef0";
      expect(isValidPubkey(longHex)).toBe(false);
    });

    it("rejects non-hex characters", () => {
      const invalid = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdeg";
      expect(isValidPubkey(invalid)).toBe(false);
    });
  });

  describe("npub format", () => {
    it("rejects invalid npub", () => {
      expect(isValidPubkey("npub1invalid")).toBe(false);
    });

    it("rejects nsec (wrong type)", () => {
      expect(isValidPubkey(TEST_NSEC)).toBe(false);
    });
  });

  describe("edge cases", () => {
    it("rejects empty string", () => {
      expect(isValidPubkey("")).toBe(false);
    });

    it("handles whitespace-padded input", () => {
      const validHex = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";
      expect(isValidPubkey(` ${validHex} `)).toBe(true);
    });
  });
});

describe("normalizePubkey", () => {
|
||||
describe("hex format", () => {
|
||||
it("lowercases hex pubkey", () => {
|
||||
const upper = "0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF";
|
||||
const result = normalizePubkey(upper);
|
||||
expect(result).toBe(upper.toLowerCase());
|
||||
});
|
||||
|
||||
it("trims whitespace", () => {
|
||||
const hex = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";
|
||||
expect(normalizePubkey(` ${hex} `)).toBe(hex);
|
||||
});
|
||||
|
||||
it("rejects invalid hex", () => {
|
||||
expect(() => normalizePubkey("invalid")).toThrow("Pubkey must be 64 hex characters");
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe("getPublicKeyFromPrivate", () => {
|
||||
it("derives public key from hex private key", () => {
|
||||
const pubkey = getPublicKeyFromPrivate(TEST_HEX_KEY);
|
||||
expect(pubkey).toMatch(/^[0-9a-f]{64}$/);
|
||||
expect(pubkey.length).toBe(64);
|
||||
});
|
||||
|
||||
it("derives consistent public key", () => {
|
||||
const pubkey1 = getPublicKeyFromPrivate(TEST_HEX_KEY);
|
||||
const pubkey2 = getPublicKeyFromPrivate(TEST_HEX_KEY);
|
||||
expect(pubkey1).toBe(pubkey2);
|
||||
});
|
||||
|
||||
it("throws for invalid private key", () => {
|
||||
expect(() => getPublicKeyFromPrivate("invalid")).toThrow();
|
||||
});
|
||||
});
|
||||
|
||||
describe("pubkeyToNpub", () => {
|
||||
it("converts hex pubkey to npub format", () => {
|
||||
const hex = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";
|
||||
const npub = pubkeyToNpub(hex);
|
||||
expect(npub).toMatch(/^npub1[a-z0-9]+$/);
|
||||
});
|
||||
|
||||
it("produces consistent output", () => {
|
||||
const hex = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";
|
||||
const npub1 = pubkeyToNpub(hex);
|
||||
const npub2 = pubkeyToNpub(hex);
|
||||
expect(npub1).toBe(npub2);
|
||||
});
|
||||
|
||||
it("normalizes uppercase hex first", () => {
|
||||
const lower = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";
|
||||
const upper = lower.toUpperCase();
|
||||
expect(pubkeyToNpub(lower)).toBe(pubkeyToNpub(upper));
|
||||
});
|
||||
});
|
||||
@@ -1,741 +0,0 @@
import {
  SimplePool,
  finalizeEvent,
  getPublicKey,
  verifyEvent,
  nip19,
  type Event,
} from "nostr-tools";
import { decrypt, encrypt } from "nostr-tools/nip04";

import {
  readNostrBusState,
  writeNostrBusState,
  computeSinceTimestamp,
  readNostrProfileState,
  writeNostrProfileState,
} from "./nostr-state-store.js";
import {
  publishProfile as publishProfileFn,
  type ProfilePublishResult,
} from "./nostr-profile.js";
import type { NostrProfile } from "./config-schema.js";
import { createSeenTracker, type SeenTracker } from "./seen-tracker.js";
import {
  createMetrics,
  createNoopMetrics,
  type NostrMetrics,
  type MetricsSnapshot,
  type MetricEvent,
} from "./metrics.js";

export const DEFAULT_RELAYS = ["wss://relay.damus.io", "wss://nos.lol"];

// ============================================================================
// Constants
// ============================================================================

const STARTUP_LOOKBACK_SEC = 120; // tolerate relay lag / clock skew
const MAX_PERSISTED_EVENT_IDS = 5000;
const STATE_PERSIST_DEBOUNCE_MS = 5000; // Debounce state writes

// Reconnect configuration (exponential backoff with jitter)
const RECONNECT_BASE_MS = 1000; // 1 second base
const RECONNECT_MAX_MS = 60000; // 60 seconds max
const RECONNECT_JITTER = 0.3; // ±30% jitter

// Circuit breaker configuration
const CIRCUIT_BREAKER_THRESHOLD = 5; // failures before opening
const CIRCUIT_BREAKER_RESET_MS = 30000; // 30 seconds before half-open

// Health tracker configuration
const HEALTH_WINDOW_MS = 60000; // 1 minute window for health stats

// ============================================================================
// Types
// ============================================================================

export interface NostrBusOptions {
  /** Private key in hex or nsec format */
  privateKey: string;
  /** WebSocket relay URLs (defaults to damus + nos.lol) */
  relays?: string[];
  /** Account ID for state persistence (optional, defaults to pubkey prefix) */
  accountId?: string;
  /** Called when a DM is received */
  onMessage: (
    pubkey: string,
    text: string,
    reply: (text: string) => Promise<void>
  ) => Promise<void>;
  /** Called on errors (optional) */
  onError?: (error: Error, context: string) => void;
  /** Called on connection status changes (optional) */
  onConnect?: (relay: string) => void;
  /** Called on disconnection (optional) */
  onDisconnect?: (relay: string) => void;
  /** Called on EOSE (end of stored events) for initial sync (optional) */
  onEose?: (relay: string) => void;
  /** Called on each metric event (optional) */
  onMetric?: (event: MetricEvent) => void;
  /** Maximum entries in seen tracker (default: 100,000) */
  maxSeenEntries?: number;
  /** Seen tracker TTL in ms (default: 1 hour) */
  seenTtlMs?: number;
}

export interface NostrBusHandle {
  /** Stop the bus and close connections */
  close: () => void;
  /** Get the bot's public key */
  publicKey: string;
  /** Send a DM to a pubkey */
  sendDm: (toPubkey: string, text: string) => Promise<void>;
  /** Get current metrics snapshot */
  getMetrics: () => MetricsSnapshot;
  /** Publish a profile (kind:0) to all relays */
  publishProfile: (profile: NostrProfile) => Promise<ProfilePublishResult>;
  /** Get the last profile publish state */
  getProfileState: () => Promise<{
    lastPublishedAt: number | null;
    lastPublishedEventId: string | null;
    lastPublishResults: Record<string, "ok" | "failed" | "timeout"> | null;
  }>;
}

// ============================================================================
// Circuit Breaker
// ============================================================================

interface CircuitBreakerState {
  state: "closed" | "open" | "half_open";
  failures: number;
  lastFailure: number;
  lastSuccess: number;
}

interface CircuitBreaker {
  /** Check if requests should be allowed */
  canAttempt: () => boolean;
  /** Record a success */
  recordSuccess: () => void;
  /** Record a failure */
  recordFailure: () => void;
  /** Get current state */
  getState: () => CircuitBreakerState["state"];
}

function createCircuitBreaker(
  relay: string,
  metrics: NostrMetrics,
  threshold: number = CIRCUIT_BREAKER_THRESHOLD,
  resetMs: number = CIRCUIT_BREAKER_RESET_MS
): CircuitBreaker {
  const state: CircuitBreakerState = {
    state: "closed",
    failures: 0,
    lastFailure: 0,
    lastSuccess: Date.now(),
  };

  return {
    canAttempt(): boolean {
      if (state.state === "closed") return true;

      if (state.state === "open") {
        // Check if enough time has passed to try half-open
        if (Date.now() - state.lastFailure >= resetMs) {
          state.state = "half_open";
          metrics.emit("relay.circuit_breaker.half_open", 1, { relay });
          return true;
        }
        return false;
      }

      // half_open: allow one attempt
      return true;
    },

    recordSuccess(): void {
      if (state.state === "half_open") {
        state.state = "closed";
        state.failures = 0;
        metrics.emit("relay.circuit_breaker.close", 1, { relay });
      } else if (state.state === "closed") {
        state.failures = 0;
      }
      state.lastSuccess = Date.now();
    },

    recordFailure(): void {
      state.failures++;
      state.lastFailure = Date.now();

      if (state.state === "half_open") {
        state.state = "open";
        metrics.emit("relay.circuit_breaker.open", 1, { relay });
      } else if (state.state === "closed" && state.failures >= threshold) {
        state.state = "open";
        metrics.emit("relay.circuit_breaker.open", 1, { relay });
      }
    },

    getState(): CircuitBreakerState["state"] {
      return state.state;
    },
  };
}

// ============================================================================
// Relay Health Tracker
// ============================================================================

interface RelayHealthStats {
  successCount: number;
  failureCount: number;
  latencySum: number;
  latencyCount: number;
  lastSuccess: number;
  lastFailure: number;
}

interface RelayHealthTracker {
  /** Record a successful operation */
  recordSuccess: (relay: string, latencyMs: number) => void;
  /** Record a failed operation */
  recordFailure: (relay: string) => void;
  /** Get health score (0-1, higher is better) */
  getScore: (relay: string) => number;
  /** Get relays sorted by health (best first) */
  getSortedRelays: (relays: string[]) => string[];
}

function createRelayHealthTracker(): RelayHealthTracker {
  const stats = new Map<string, RelayHealthStats>();

  function getOrCreate(relay: string): RelayHealthStats {
    let s = stats.get(relay);
    if (!s) {
      s = {
        successCount: 0,
        failureCount: 0,
        latencySum: 0,
        latencyCount: 0,
        lastSuccess: 0,
        lastFailure: 0,
      };
      stats.set(relay, s);
    }
    return s;
  }

  return {
    recordSuccess(relay: string, latencyMs: number): void {
      const s = getOrCreate(relay);
      s.successCount++;
      s.latencySum += latencyMs;
      s.latencyCount++;
      s.lastSuccess = Date.now();
    },

    recordFailure(relay: string): void {
      const s = getOrCreate(relay);
      s.failureCount++;
      s.lastFailure = Date.now();
    },

    getScore(relay: string): number {
      const s = stats.get(relay);
      if (!s) return 0.5; // Unknown relay gets neutral score

      const total = s.successCount + s.failureCount;
      if (total === 0) return 0.5;

      // Success rate (0-1)
      const successRate = s.successCount / total;

      // Recency bonus (prefer recently successful relays)
      const now = Date.now();
      const recencyBonus =
        s.lastSuccess > s.lastFailure
          ? Math.max(0, 1 - (now - s.lastSuccess) / HEALTH_WINDOW_MS) * 0.2
          : 0;

      // Latency penalty (lower is better)
      const avgLatency =
        s.latencyCount > 0 ? s.latencySum / s.latencyCount : 1000;
      const latencyPenalty = Math.min(0.2, avgLatency / 10000);

      return Math.max(0, Math.min(1, successRate + recencyBonus - latencyPenalty));
    },

    getSortedRelays(relays: string[]): string[] {
      return [...relays].sort((a, b) => this.getScore(b) - this.getScore(a));
    },
  };
}

// ============================================================================
// Reconnect with Exponential Backoff + Jitter
// ============================================================================

function computeReconnectDelay(attempt: number): number {
  // Exponential backoff: base * 2^attempt
  const exponential = RECONNECT_BASE_MS * Math.pow(2, attempt);
  const capped = Math.min(exponential, RECONNECT_MAX_MS);

  // Add jitter: ±JITTER%
  const jitter = capped * RECONNECT_JITTER * (Math.random() * 2 - 1);
  return Math.max(RECONNECT_BASE_MS, capped + jitter);
}

// ============================================================================
// Key Validation
// ============================================================================

/**
 * Validate and normalize a private key (accepts hex or nsec format)
 */
export function validatePrivateKey(key: string): Uint8Array {
  const trimmed = key.trim();

  // Handle nsec (bech32) format
  if (trimmed.startsWith("nsec1")) {
    const decoded = nip19.decode(trimmed);
    if (decoded.type !== "nsec") {
      throw new Error("Invalid nsec key: wrong type");
    }
    return decoded.data;
  }

  // Handle hex format
  if (!/^[0-9a-fA-F]{64}$/.test(trimmed)) {
    throw new Error(
      "Private key must be 64 hex characters or nsec bech32 format"
    );
  }

  // Convert hex string to Uint8Array
  const bytes = new Uint8Array(32);
  for (let i = 0; i < 32; i++) {
    bytes[i] = parseInt(trimmed.slice(i * 2, i * 2 + 2), 16);
  }
  return bytes;
}

/**
 * Get public key from private key (hex or nsec format)
 */
export function getPublicKeyFromPrivate(privateKey: string): string {
  const sk = validatePrivateKey(privateKey);
  return getPublicKey(sk);
}

// ============================================================================
// Main Bus
// ============================================================================

/**
 * Start the Nostr DM bus - subscribes to NIP-04 encrypted DMs
 */
export async function startNostrBus(
  options: NostrBusOptions
): Promise<NostrBusHandle> {
  const {
    privateKey,
    relays = DEFAULT_RELAYS,
    onMessage,
    onError,
    onEose,
    onMetric,
    maxSeenEntries = 100_000,
    seenTtlMs = 60 * 60 * 1000,
  } = options;

  const sk = validatePrivateKey(privateKey);
  const pk = getPublicKey(sk);
  const pool = new SimplePool();
  const accountId = options.accountId ?? pk.slice(0, 16);
  const gatewayStartedAt = Math.floor(Date.now() / 1000);

  // Initialize metrics
  const metrics = onMetric ? createMetrics(onMetric) : createNoopMetrics();

  // Initialize seen tracker with LRU
  const seen: SeenTracker = createSeenTracker({
    maxEntries: maxSeenEntries,
    ttlMs: seenTtlMs,
  });

  // Initialize circuit breakers and health tracker
  const circuitBreakers = new Map<string, CircuitBreaker>();
  const healthTracker = createRelayHealthTracker();

  for (const relay of relays) {
    circuitBreakers.set(relay, createCircuitBreaker(relay, metrics));
  }

  // Read persisted state and compute `since` timestamp (with small overlap)
  const state = await readNostrBusState({ accountId });
  const baseSince = computeSinceTimestamp(state, gatewayStartedAt);
  const since = Math.max(0, baseSince - STARTUP_LOOKBACK_SEC);

  // Seed in-memory dedupe with recent IDs from disk (prevents restart replay)
  if (state?.recentEventIds?.length) {
    seen.seed(state.recentEventIds);
  }

  // Persist startup timestamp
  await writeNostrBusState({
    accountId,
    lastProcessedAt: state?.lastProcessedAt ?? gatewayStartedAt,
    gatewayStartedAt,
    recentEventIds: state?.recentEventIds ?? [],
  });

  // Debounced state persistence
  let pendingWrite: ReturnType<typeof setTimeout> | undefined;
  let lastProcessedAt = state?.lastProcessedAt ?? gatewayStartedAt;
  let recentEventIds = (state?.recentEventIds ?? []).slice(
    -MAX_PERSISTED_EVENT_IDS
  );

  function scheduleStatePersist(eventCreatedAt: number, eventId: string): void {
    lastProcessedAt = Math.max(lastProcessedAt, eventCreatedAt);
    recentEventIds.push(eventId);
    if (recentEventIds.length > MAX_PERSISTED_EVENT_IDS) {
      recentEventIds = recentEventIds.slice(-MAX_PERSISTED_EVENT_IDS);
    }

    if (pendingWrite) clearTimeout(pendingWrite);
    pendingWrite = setTimeout(() => {
      writeNostrBusState({
        accountId,
        lastProcessedAt,
        gatewayStartedAt,
        recentEventIds,
      }).catch((err) => onError?.(err as Error, "persist state"));
    }, STATE_PERSIST_DEBOUNCE_MS);
  }

  const inflight = new Set<string>();

  // Event handler
  async function handleEvent(event: Event): Promise<void> {
    try {
      metrics.emit("event.received");

      // Fast dedupe check (handles relay reconnections)
      if (seen.peek(event.id) || inflight.has(event.id)) {
        metrics.emit("event.duplicate");
        return;
      }
      inflight.add(event.id);

      // Self-message loop prevention: skip our own messages
      if (event.pubkey === pk) {
        metrics.emit("event.rejected.self_message");
        return;
      }

      // Skip events older than our `since` (relay may ignore filter)
      if (event.created_at < since) {
        metrics.emit("event.rejected.stale");
        return;
      }

      // Fast p-tag check BEFORE crypto (no allocation, cheaper)
      let targetsUs = false;
      for (const t of event.tags) {
        if (t[0] === "p" && t[1] === pk) {
          targetsUs = true;
          break;
        }
      }
      if (!targetsUs) {
        metrics.emit("event.rejected.wrong_kind");
        return;
      }

      // Verify signature (must pass before we trust the event)
      if (!verifyEvent(event)) {
        metrics.emit("event.rejected.invalid_signature");
        onError?.(new Error("Invalid signature"), `event ${event.id}`);
        return;
      }

      // Mark seen AFTER verify (don't cache invalid IDs)
      seen.add(event.id);
      metrics.emit("memory.seen_tracker_size", seen.size());

      // Decrypt the message
      let plaintext: string;
      try {
        plaintext = await decrypt(sk, event.pubkey, event.content);
        metrics.emit("decrypt.success");
      } catch (err) {
        metrics.emit("decrypt.failure");
        metrics.emit("event.rejected.decrypt_failed");
        onError?.(err as Error, `decrypt from ${event.pubkey}`);
        return;
      }

      // Create reply function (try relays by health score)
      const replyTo = async (text: string): Promise<void> => {
        await sendEncryptedDm(
          pool,
          sk,
          event.pubkey,
          text,
          relays,
          metrics,
          circuitBreakers,
          healthTracker,
          onError
        );
      };

      // Call the message handler
      await onMessage(event.pubkey, plaintext, replyTo);

      // Mark as processed
      metrics.emit("event.processed");

      // Persist progress (debounced)
      scheduleStatePersist(event.created_at, event.id);
    } catch (err) {
      onError?.(err as Error, `event ${event.id}`);
    } finally {
      inflight.delete(event.id);
    }
  }

  const sub = pool.subscribeMany(
    relays,
    [{ kinds: [4], "#p": [pk], since }],
    {
      onevent: handleEvent,
      oneose: () => {
        // EOSE handler - called when all stored events have been received
        for (const relay of relays) {
          metrics.emit("relay.message.eose", 1, { relay });
        }
        onEose?.(relays.join(", "));
      },
      onclose: (reason) => {
        // Handle subscription close
        for (const relay of relays) {
          metrics.emit("relay.message.closed", 1, { relay });
          options.onDisconnect?.(relay);
        }
        onError?.(
          new Error(`Subscription closed: ${reason}`),
          "subscription"
        );
      },
    }
  );

  // Public sendDm function
  const sendDm = async (toPubkey: string, text: string): Promise<void> => {
    await sendEncryptedDm(
      pool,
      sk,
      toPubkey,
      text,
      relays,
      metrics,
      circuitBreakers,
      healthTracker,
      onError
    );
  };

  // Profile publishing function
  const publishProfile = async (profile: NostrProfile): Promise<ProfilePublishResult> => {
    // Read last published timestamp for monotonic ordering
    const profileState = await readNostrProfileState({ accountId });
    const lastPublishedAt = profileState?.lastPublishedAt ?? undefined;

    // Publish the profile
    const result = await publishProfileFn(pool, sk, relays, profile, lastPublishedAt);

    // Convert results to state format
    const publishResults: Record<string, "ok" | "failed" | "timeout"> = {};
    for (const relay of result.successes) {
      publishResults[relay] = "ok";
    }
    for (const { relay, error } of result.failures) {
      publishResults[relay] = error === "timeout" ? "timeout" : "failed";
    }

    // Persist the publish state
    await writeNostrProfileState({
      accountId,
      lastPublishedAt: result.createdAt,
      lastPublishedEventId: result.eventId,
      lastPublishResults: publishResults,
    });

    return result;
  };

  // Get profile state function
  const getProfileState = async () => {
    const state = await readNostrProfileState({ accountId });
    return {
      lastPublishedAt: state?.lastPublishedAt ?? null,
      lastPublishedEventId: state?.lastPublishedEventId ?? null,
      lastPublishResults: state?.lastPublishResults ?? null,
    };
  };

  return {
    close: () => {
      sub.close();
      seen.stop();
      // Flush pending state write synchronously on close
      if (pendingWrite) {
        clearTimeout(pendingWrite);
        writeNostrBusState({
          accountId,
          lastProcessedAt,
          gatewayStartedAt,
          recentEventIds,
        }).catch((err) => onError?.(err as Error, "persist state on close"));
      }
    },
    publicKey: pk,
    sendDm,
    getMetrics: () => metrics.getSnapshot(),
    publishProfile,
    getProfileState,
  };
}

// ============================================================================
// Send DM with Circuit Breaker + Health Scoring
// ============================================================================

/**
 * Send an encrypted DM to a pubkey
 */
async function sendEncryptedDm(
  pool: SimplePool,
  sk: Uint8Array,
  toPubkey: string,
  text: string,
  relays: string[],
  metrics: NostrMetrics,
  circuitBreakers: Map<string, CircuitBreaker>,
  healthTracker: RelayHealthTracker,
  onError?: (error: Error, context: string) => void
): Promise<void> {
  const ciphertext = await encrypt(sk, toPubkey, text);
  const reply = finalizeEvent(
    {
      kind: 4,
      content: ciphertext,
      tags: [["p", toPubkey]],
      created_at: Math.floor(Date.now() / 1000),
    },
    sk
  );

  // Sort relays by health score (best first)
  const sortedRelays = healthTracker.getSortedRelays(relays);

  // Try relays in order of health, respecting circuit breakers
  let lastError: Error | undefined;
  for (const relay of sortedRelays) {
    const cb = circuitBreakers.get(relay);

    // Skip if circuit breaker is open
    if (cb && !cb.canAttempt()) {
      continue;
    }

    const startTime = Date.now();
    try {
      // SimplePool.publish returns one promise per relay; await them so
      // failures actually reject here instead of resolving immediately.
      await Promise.all(pool.publish([relay], reply));
      const latency = Date.now() - startTime;

      // Record success
      cb?.recordSuccess();
      healthTracker.recordSuccess(relay, latency);

      return; // Success - exit early
    } catch (err) {
      lastError = err as Error;
      const latency = Date.now() - startTime;

      // Record failure
      cb?.recordFailure();
      healthTracker.recordFailure(relay);
      metrics.emit("relay.error", 1, { relay, latency });

      onError?.(lastError, `publish to ${relay}`);
    }
  }

  throw new Error(`Failed to publish to any relay: ${lastError?.message}`);
}

// ============================================================================
// Pubkey Utilities
// ============================================================================

/**
 * Check if a string looks like a valid Nostr pubkey (hex or npub)
 */
export function isValidPubkey(input: string): boolean {
  if (typeof input !== "string") return false;
  const trimmed = input.trim();

  // npub format
  if (trimmed.startsWith("npub1")) {
    try {
      const decoded = nip19.decode(trimmed);
      return decoded.type === "npub";
    } catch {
      return false;
    }
  }

  // Hex format
  return /^[0-9a-fA-F]{64}$/.test(trimmed);
}

/**
 * Normalize a pubkey to hex format (accepts npub or hex)
 */
export function normalizePubkey(input: string): string {
  const trimmed = input.trim();

  // npub format - decode to hex
  if (trimmed.startsWith("npub1")) {
    const decoded = nip19.decode(trimmed);
    if (decoded.type !== "npub") {
      throw new Error("Invalid npub key");
    }
    // nip19.decode already yields the pubkey as a lowercase hex string for npub
    return decoded.data;
  }

  // Already hex - validate and return lowercase
  if (!/^[0-9a-fA-F]{64}$/.test(trimmed)) {
    throw new Error("Pubkey must be 64 hex characters or npub format");
  }
  return trimmed.toLowerCase();
}

/**
 * Convert a hex pubkey to npub format
 */
export function pubkeyToNpub(hexPubkey: string): string {
  const normalized = normalizePubkey(hexPubkey);
  // npubEncode expects a hex string, not Uint8Array
  return nip19.npubEncode(normalized);
}
@@ -1,378 +0,0 @@
/**
 * Tests for Nostr Profile HTTP Handler
 */

import { describe, it, expect, vi, beforeEach, afterEach } from "vitest";
import { IncomingMessage, ServerResponse } from "node:http";
import { Socket } from "node:net";

import {
  createNostrProfileHttpHandler,
  type NostrProfileHttpContext,
} from "./nostr-profile-http.js";

// Mock the channel exports
vi.mock("./channel.js", () => ({
  publishNostrProfile: vi.fn(),
  getNostrProfileState: vi.fn(),
}));

// Mock the import module
vi.mock("./nostr-profile-import.js", () => ({
  importProfileFromRelays: vi.fn(),
  mergeProfiles: vi.fn((local, imported) => ({ ...imported, ...local })),
}));

import { publishNostrProfile, getNostrProfileState } from "./channel.js";
import { importProfileFromRelays } from "./nostr-profile-import.js";

// ============================================================================
// Test Helpers
// ============================================================================

function createMockRequest(
  method: string,
  url: string,
  body?: unknown
): IncomingMessage {
  const socket = new Socket();
  const req = new IncomingMessage(socket);
  req.method = method;
  req.url = url;
  req.headers = { host: "localhost:3000" };

  if (body) {
    const bodyStr = JSON.stringify(body);
    process.nextTick(() => {
      req.emit("data", Buffer.from(bodyStr));
      req.emit("end");
    });
  } else {
    process.nextTick(() => {
      req.emit("end");
    });
  }

  return req;
}

function createMockResponse(): ServerResponse & { _getData: () => string; _getStatusCode: () => number } {
  const res = new ServerResponse({} as IncomingMessage);

  let data = "";
  let statusCode = 200;

  res.write = function (chunk: unknown) {
    data += String(chunk);
    return true;
  };

  res.end = function (chunk?: unknown) {
    if (chunk) data += String(chunk);
    return this;
  };

  Object.defineProperty(res, "statusCode", {
    get: () => statusCode,
    set: (code: number) => {
      statusCode = code;
    },
  });

  (res as unknown as { _getData: () => string })._getData = () => data;
  (res as unknown as { _getStatusCode: () => number })._getStatusCode = () => statusCode;

  return res as ServerResponse & { _getData: () => string; _getStatusCode: () => number };
}

function createMockContext(overrides?: Partial<NostrProfileHttpContext>): NostrProfileHttpContext {
  return {
    getConfigProfile: vi.fn().mockReturnValue(undefined),
    updateConfigProfile: vi.fn().mockResolvedValue(undefined),
    getAccountInfo: vi.fn().mockReturnValue({
      pubkey: "abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234",
      relays: ["wss://relay.damus.io"],
    }),
    log: {
      info: vi.fn(),
      warn: vi.fn(),
      error: vi.fn(),
    },
    ...overrides,
  };
}

// ============================================================================
// Tests
// ============================================================================

describe("nostr-profile-http", () => {
  beforeEach(() => {
    vi.clearAllMocks();
  });

  describe("route matching", () => {
    it("returns false for non-nostr paths", async () => {
      const ctx = createMockContext();
      const handler = createNostrProfileHttpHandler(ctx);
      const req = createMockRequest("GET", "/api/channels/telegram/profile");
      const res = createMockResponse();

      const result = await handler(req, res);

      expect(result).toBe(false);
    });

    it("returns false for paths without accountId", async () => {
      const ctx = createMockContext();
      const handler = createNostrProfileHttpHandler(ctx);
      const req = createMockRequest("GET", "/api/channels/nostr/");
      const res = createMockResponse();

      const result = await handler(req, res);

      expect(result).toBe(false);
    });

    it("handles /api/channels/nostr/:accountId/profile", async () => {
      const ctx = createMockContext();
      const handler = createNostrProfileHttpHandler(ctx);
      const req = createMockRequest("GET", "/api/channels/nostr/default/profile");
      const res = createMockResponse();

      vi.mocked(getNostrProfileState).mockResolvedValue(null);

      const result = await handler(req, res);

      expect(result).toBe(true);
    });
  });

  describe("GET /api/channels/nostr/:accountId/profile", () => {
    it("returns profile and publish state", async () => {
      const ctx = createMockContext({
        getConfigProfile: vi.fn().mockReturnValue({
          name: "testuser",
          displayName: "Test User",
        }),
      });
      const handler = createNostrProfileHttpHandler(ctx);
      const req = createMockRequest("GET", "/api/channels/nostr/default/profile");
      const res = createMockResponse();

      vi.mocked(getNostrProfileState).mockResolvedValue({
        lastPublishedAt: 1234567890,
        lastPublishedEventId: "abc123",
        lastPublishResults: { "wss://relay.damus.io": "ok" },
      });

      await handler(req, res);

      expect(res._getStatusCode()).toBe(200);
      const data = JSON.parse(res._getData());
      expect(data.ok).toBe(true);
      expect(data.profile.name).toBe("testuser");
      expect(data.publishState.lastPublishedAt).toBe(1234567890);
    });
  });

  describe("PUT /api/channels/nostr/:accountId/profile", () => {
    it("validates profile and publishes", async () => {
      const ctx = createMockContext();
      const handler = createNostrProfileHttpHandler(ctx);
      const req = createMockRequest("PUT", "/api/channels/nostr/default/profile", {
        name: "satoshi",
        displayName: "Satoshi Nakamoto",
        about: "Creator of Bitcoin",
      });
      const res = createMockResponse();

      vi.mocked(publishNostrProfile).mockResolvedValue({
        eventId: "event123",
||||
createdAt: 1234567890,
|
||||
successes: ["wss://relay.damus.io"],
|
||||
failures: [],
|
||||
});
|
||||
|
||||
await handler(req, res);
|
||||
|
||||
expect(res._getStatusCode()).toBe(200);
|
||||
const data = JSON.parse(res._getData());
|
||||
expect(data.ok).toBe(true);
|
||||
expect(data.eventId).toBe("event123");
|
||||
expect(data.successes).toContain("wss://relay.damus.io");
|
||||
expect(data.persisted).toBe(true);
|
||||
expect(ctx.updateConfigProfile).toHaveBeenCalled();
|
||||
});
|
||||
|
||||
it("rejects private IP in picture URL (SSRF protection)", async () => {
|
||||
const ctx = createMockContext();
|
||||
const handler = createNostrProfileHttpHandler(ctx);
|
||||
const req = createMockRequest("PUT", "/api/channels/nostr/default/profile", {
|
||||
name: "hacker",
|
||||
picture: "https://127.0.0.1/evil.jpg",
|
||||
});
|
||||
const res = createMockResponse();
|
||||
|
||||
await handler(req, res);
|
||||
|
||||
expect(res._getStatusCode()).toBe(400);
|
||||
const data = JSON.parse(res._getData());
|
||||
expect(data.ok).toBe(false);
|
||||
expect(data.error).toContain("private");
|
||||
});
|
||||
|
||||
it("rejects non-https URLs", async () => {
|
||||
const ctx = createMockContext();
|
||||
const handler = createNostrProfileHttpHandler(ctx);
|
||||
const req = createMockRequest("PUT", "/api/channels/nostr/default/profile", {
|
||||
name: "test",
|
||||
picture: "http://example.com/pic.jpg",
|
||||
});
|
||||
const res = createMockResponse();
|
||||
|
||||
await handler(req, res);
|
||||
|
||||
expect(res._getStatusCode()).toBe(400);
|
||||
const data = JSON.parse(res._getData());
|
||||
expect(data.ok).toBe(false);
|
||||
// The schema validation catches non-https URLs before SSRF check
|
||||
expect(data.error).toBe("Validation failed");
|
||||
expect(data.details).toBeDefined();
|
||||
expect(data.details.some((d: string) => d.includes("https"))).toBe(true);
|
||||
});
|
||||
|
||||
it("does not persist if all relays fail", async () => {
|
||||
const ctx = createMockContext();
|
||||
const handler = createNostrProfileHttpHandler(ctx);
|
||||
const req = createMockRequest("PUT", "/api/channels/nostr/default/profile", {
|
||||
name: "test",
|
||||
});
|
||||
const res = createMockResponse();
|
||||
|
||||
vi.mocked(publishNostrProfile).mockResolvedValue({
|
||||
eventId: "event123",
|
||||
createdAt: 1234567890,
|
||||
successes: [],
|
||||
failures: [{ relay: "wss://relay.damus.io", error: "timeout" }],
|
||||
});
|
||||
|
||||
await handler(req, res);
|
||||
|
||||
expect(res._getStatusCode()).toBe(200);
|
||||
const data = JSON.parse(res._getData());
|
||||
expect(data.persisted).toBe(false);
|
||||
expect(ctx.updateConfigProfile).not.toHaveBeenCalled();
|
||||
});
|
||||
|
||||
it("enforces rate limiting", async () => {
|
||||
const ctx = createMockContext();
|
||||
const handler = createNostrProfileHttpHandler(ctx);
|
||||
|
||||
vi.mocked(publishNostrProfile).mockResolvedValue({
|
||||
eventId: "event123",
|
||||
createdAt: 1234567890,
|
||||
successes: ["wss://relay.damus.io"],
|
||||
failures: [],
|
||||
});
|
||||
|
||||
// Make 6 requests (limit is 5/min)
|
||||
for (let i = 0; i < 6; i++) {
|
||||
const req = createMockRequest("PUT", "/api/channels/nostr/rate-test/profile", {
|
||||
name: `user${i}`,
|
||||
});
|
||||
const res = createMockResponse();
|
||||
await handler(req, res);
|
||||
|
||||
if (i < 5) {
|
||||
expect(res._getStatusCode()).toBe(200);
|
||||
} else {
|
||||
expect(res._getStatusCode()).toBe(429);
|
||||
const data = JSON.parse(res._getData());
|
||||
expect(data.error).toContain("Rate limit");
|
||||
}
|
||||
}
|
||||
});
|
||||
});
|
||||
|
||||
describe("POST /api/channels/nostr/:accountId/profile/import", () => {
|
||||
it("imports profile from relays", async () => {
|
||||
const ctx = createMockContext();
|
||||
const handler = createNostrProfileHttpHandler(ctx);
|
||||
const req = createMockRequest("POST", "/api/channels/nostr/default/profile/import", {});
|
||||
const res = createMockResponse();
|
||||
|
||||
vi.mocked(importProfileFromRelays).mockResolvedValue({
|
||||
ok: true,
|
||||
profile: {
|
||||
name: "imported",
|
||||
displayName: "Imported User",
|
||||
},
|
||||
event: {
|
||||
id: "evt123",
|
||||
pubkey: "abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234",
|
||||
created_at: 1234567890,
|
||||
},
|
||||
relaysQueried: ["wss://relay.damus.io"],
|
||||
sourceRelay: "wss://relay.damus.io",
|
||||
});
|
||||
|
||||
await handler(req, res);
|
||||
|
||||
expect(res._getStatusCode()).toBe(200);
|
||||
const data = JSON.parse(res._getData());
|
||||
expect(data.ok).toBe(true);
|
||||
expect(data.imported.name).toBe("imported");
|
||||
expect(data.saved).toBe(false); // autoMerge not requested
|
||||
});
|
||||
|
||||
it("auto-merges when requested", async () => {
|
||||
const ctx = createMockContext({
|
||||
getConfigProfile: vi.fn().mockReturnValue({ about: "local bio" }),
|
||||
});
|
||||
const handler = createNostrProfileHttpHandler(ctx);
|
||||
const req = createMockRequest("POST", "/api/channels/nostr/default/profile/import", {
|
||||
autoMerge: true,
|
||||
});
|
||||
const res = createMockResponse();
|
||||
|
||||
vi.mocked(importProfileFromRelays).mockResolvedValue({
|
||||
ok: true,
|
||||
profile: {
|
||||
name: "imported",
|
||||
displayName: "Imported User",
|
||||
},
|
||||
event: {
|
||||
id: "evt123",
|
||||
pubkey: "abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234",
|
||||
created_at: 1234567890,
|
||||
},
|
||||
relaysQueried: ["wss://relay.damus.io"],
|
||||
sourceRelay: "wss://relay.damus.io",
|
||||
});
|
||||
|
||||
await handler(req, res);
|
||||
|
||||
expect(res._getStatusCode()).toBe(200);
|
||||
const data = JSON.parse(res._getData());
|
||||
expect(data.saved).toBe(true);
|
||||
expect(ctx.updateConfigProfile).toHaveBeenCalled();
|
||||
});
|
||||
|
||||
it("returns error when account not found", async () => {
|
||||
const ctx = createMockContext({
|
||||
getAccountInfo: vi.fn().mockReturnValue(null),
|
||||
});
|
||||
const handler = createNostrProfileHttpHandler(ctx);
|
||||
const req = createMockRequest("POST", "/api/channels/nostr/unknown/profile/import", {});
|
||||
const res = createMockResponse();
|
||||
|
||||
await handler(req, res);
|
||||
|
||||
expect(res._getStatusCode()).toBe(404);
|
||||
const data = JSON.parse(res._getData());
|
||||
expect(data.error).toContain("not found");
|
||||
});
|
||||
});
|
||||
});
|
||||
@@ -1,500 +0,0 @@

/**
 * Nostr Profile HTTP Handler
 *
 * Handles HTTP requests for profile management:
 * - PUT /api/channels/nostr/:accountId/profile - Update and publish profile
 * - POST /api/channels/nostr/:accountId/profile/import - Import from relays
 * - GET /api/channels/nostr/:accountId/profile - Get current profile state
 */

import type { IncomingMessage, ServerResponse } from "node:http";
import { z } from "zod";

import { NostrProfileSchema, type NostrProfile } from "./config-schema.js";
import { publishNostrProfile, getNostrProfileState } from "./channel.js";
import { importProfileFromRelays, mergeProfiles } from "./nostr-profile-import.js";

// ============================================================================
// Types
// ============================================================================

export interface NostrProfileHttpContext {
  /** Get current profile from config */
  getConfigProfile: (accountId: string) => NostrProfile | undefined;
  /** Update profile in config (after successful publish) */
  updateConfigProfile: (accountId: string, profile: NostrProfile) => Promise<void>;
  /** Get account's public key and relays */
  getAccountInfo: (accountId: string) => { pubkey: string; relays: string[] } | null;
  /** Logger */
  log?: {
    info: (msg: string) => void;
    warn: (msg: string) => void;
    error: (msg: string) => void;
  };
}

// ============================================================================
// Rate Limiting
// ============================================================================

interface RateLimitEntry {
  count: number;
  windowStart: number;
}

const rateLimitMap = new Map<string, RateLimitEntry>();
const RATE_LIMIT_WINDOW_MS = 60_000; // 1 minute
const RATE_LIMIT_MAX_REQUESTS = 5; // 5 requests per minute

function checkRateLimit(accountId: string): boolean {
  const now = Date.now();
  const entry = rateLimitMap.get(accountId);

  if (!entry || now - entry.windowStart > RATE_LIMIT_WINDOW_MS) {
    rateLimitMap.set(accountId, { count: 1, windowStart: now });
    return true;
  }

  if (entry.count >= RATE_LIMIT_MAX_REQUESTS) {
    return false;
  }

  entry.count++;
  return true;
}

// ============================================================================
// Mutex for Concurrent Publish Prevention
// ============================================================================

const publishLocks = new Map<string, Promise<void>>();

async function withPublishLock<T>(accountId: string, fn: () => Promise<T>): Promise<T> {
  // Atomic mutex using promise chaining - prevents TOCTOU race condition
  const prev = publishLocks.get(accountId) ?? Promise.resolve();
  let resolve: () => void;
  const next = new Promise<void>((r) => {
    resolve = r;
  });
  // Atomically replace the lock before awaiting - any concurrent request
  // will now wait on our `next` promise
  publishLocks.set(accountId, next);

  // Wait for previous operation to complete
  await prev.catch(() => {});

  try {
    return await fn();
  } finally {
    resolve!();
    // Clean up if we're the last in chain
    if (publishLocks.get(accountId) === next) {
      publishLocks.delete(accountId);
    }
  }
}

// ============================================================================
// SSRF Protection
// ============================================================================

// Block common private/internal hostnames (quick string check)
const BLOCKED_HOSTNAMES = new Set([
  "localhost",
  "localhost.localdomain",
  "127.0.0.1",
  "::1",
  "[::1]",
  "0.0.0.0",
]);

// Check if an IP address (resolved) is in a private range
function isPrivateIp(ip: string): boolean {
  // Handle IPv4
  const ipv4Match = ip.match(/^(\d+)\.(\d+)\.(\d+)\.(\d+)$/);
  if (ipv4Match) {
    const [, a, b] = ipv4Match.map(Number);
    // 127.0.0.0/8 (loopback)
    if (a === 127) return true;
    // 10.0.0.0/8 (private)
    if (a === 10) return true;
    // 172.16.0.0/12 (private)
    if (a === 172 && b >= 16 && b <= 31) return true;
    // 192.168.0.0/16 (private)
    if (a === 192 && b === 168) return true;
    // 169.254.0.0/16 (link-local)
    if (a === 169 && b === 254) return true;
    // 0.0.0.0/8
    if (a === 0) return true;
    return false;
  }

  // Handle IPv6
  const ipLower = ip.toLowerCase().replace(/^\[|\]$/g, "");
  // ::1 (loopback)
  if (ipLower === "::1") return true;
  // fe80::/10 (link-local)
  if (ipLower.startsWith("fe80:")) return true;
  // fc00::/7 (unique local)
  if (ipLower.startsWith("fc") || ipLower.startsWith("fd")) return true;
  // ::ffff:x.x.x.x (IPv4-mapped IPv6) - extract and check IPv4
  const v4Mapped = ipLower.match(/^::ffff:(\d+\.\d+\.\d+\.\d+)$/);
  if (v4Mapped) return isPrivateIp(v4Mapped[1]);

  return false;
}

function validateUrlSafety(urlStr: string): { ok: true } | { ok: false; error: string } {
  try {
    const url = new URL(urlStr);

    if (url.protocol !== "https:") {
      return { ok: false, error: "URL must use https:// protocol" };
    }

    const hostname = url.hostname.toLowerCase();

    // Quick hostname block check
    if (BLOCKED_HOSTNAMES.has(hostname)) {
      return { ok: false, error: "URL must not point to private/internal addresses" };
    }

    // Check if hostname is an IP address directly
    if (isPrivateIp(hostname)) {
      return { ok: false, error: "URL must not point to private/internal addresses" };
    }

    // Block suspicious TLDs that resolve to localhost
    if (hostname.endsWith(".localhost") || hostname.endsWith(".local")) {
      return { ok: false, error: "URL must not point to private/internal addresses" };
    }

    return { ok: true };
  } catch {
    return { ok: false, error: "Invalid URL format" };
  }
}

// Export for use in import validation
export { validateUrlSafety };

// ============================================================================
// Validation Schemas
// ============================================================================

// NIP-05 format: user@domain.com
const nip05FormatSchema = z
  .string()
  .regex(/^[a-z0-9._-]+@[a-z0-9.-]+\.[a-z]{2,}$/i, "Invalid NIP-05 format (user@domain.com)")
  .optional();

// LUD-16 Lightning address format: user@domain.com
const lud16FormatSchema = z
  .string()
  .regex(/^[a-z0-9._-]+@[a-z0-9.-]+\.[a-z]{2,}$/i, "Invalid Lightning address format")
  .optional();

// Extended profile schema with additional format validation
const ProfileUpdateSchema = NostrProfileSchema.extend({
  nip05: nip05FormatSchema,
  lud16: lud16FormatSchema,
});

// ============================================================================
// Request Helpers
// ============================================================================

function sendJson(res: ServerResponse, status: number, body: unknown): void {
  res.statusCode = status;
  res.setHeader("Content-Type", "application/json; charset=utf-8");
  res.end(JSON.stringify(body));
}

async function readJsonBody(req: IncomingMessage, maxBytes = 64 * 1024): Promise<unknown> {
  return new Promise((resolve, reject) => {
    const chunks: Buffer[] = [];
    let totalBytes = 0;

    req.on("data", (chunk: Buffer) => {
      totalBytes += chunk.length;
      if (totalBytes > maxBytes) {
        reject(new Error("Request body too large"));
        req.destroy();
        return;
      }
      chunks.push(chunk);
    });

    req.on("end", () => {
      try {
        const body = Buffer.concat(chunks).toString("utf-8");
        resolve(body ? JSON.parse(body) : {});
      } catch {
        reject(new Error("Invalid JSON"));
      }
    });

    req.on("error", reject);
  });
}

function parseAccountIdFromPath(pathname: string): string | null {
  // Match: /api/channels/nostr/:accountId/profile
  const match = pathname.match(/^\/api\/channels\/nostr\/([^/]+)\/profile/);
  return match?.[1] ?? null;
}
// ============================================================================
|
||||
// HTTP Handler
|
||||
// ============================================================================
|
||||
|
||||
export function createNostrProfileHttpHandler(
|
||||
ctx: NostrProfileHttpContext
|
||||
): (req: IncomingMessage, res: ServerResponse) => Promise<boolean> {
|
||||
return async (req, res) => {
|
||||
const url = new URL(req.url ?? "/", `http://${req.headers.host ?? "localhost"}`);
|
||||
|
||||
// Only handle /api/channels/nostr/:accountId/profile paths
|
||||
if (!url.pathname.startsWith("/api/channels/nostr/")) {
|
||||
return false;
|
||||
}
|
||||
|
||||
const accountId = parseAccountIdFromPath(url.pathname);
|
||||
if (!accountId) {
|
||||
return false;
|
||||
}
|
||||
|
||||
const isImport = url.pathname.endsWith("/profile/import");
|
||||
const isProfilePath = url.pathname.endsWith("/profile") || isImport;
|
||||
|
||||
if (!isProfilePath) {
|
||||
return false;
|
||||
}
|
||||
|
||||
// Handle different HTTP methods
|
||||
try {
|
||||
if (req.method === "GET" && !isImport) {
|
||||
return await handleGetProfile(accountId, ctx, res);
|
||||
}
|
||||
|
||||
if (req.method === "PUT" && !isImport) {
|
||||
return await handleUpdateProfile(accountId, ctx, req, res);
|
||||
}
|
||||
|
||||
if (req.method === "POST" && isImport) {
|
||||
return await handleImportProfile(accountId, ctx, req, res);
|
||||
}
|
||||
|
||||
// Method not allowed
|
||||
sendJson(res, 405, { ok: false, error: "Method not allowed" });
|
||||
return true;
|
||||
} catch (err) {
|
||||
ctx.log?.error(`Profile HTTP error: ${String(err)}`);
|
||||
sendJson(res, 500, { ok: false, error: "Internal server error" });
|
||||
return true;
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// GET /api/channels/nostr/:accountId/profile
|
||||
// ============================================================================
|
||||
|
||||
async function handleGetProfile(
|
||||
accountId: string,
|
||||
ctx: NostrProfileHttpContext,
|
||||
res: ServerResponse
|
||||
): Promise<true> {
|
||||
const configProfile = ctx.getConfigProfile(accountId);
|
||||
const publishState = await getNostrProfileState(accountId);
|
||||
|
||||
sendJson(res, 200, {
|
||||
ok: true,
|
||||
profile: configProfile ?? null,
|
||||
publishState: publishState ?? null,
|
||||
});
|
||||
return true;
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// PUT /api/channels/nostr/:accountId/profile
|
||||
// ============================================================================
|
||||
|
||||
async function handleUpdateProfile(
|
||||
accountId: string,
|
||||
ctx: NostrProfileHttpContext,
|
||||
req: IncomingMessage,
|
||||
res: ServerResponse
|
||||
): Promise<true> {
|
||||
// Rate limiting
|
||||
if (!checkRateLimit(accountId)) {
|
||||
sendJson(res, 429, { ok: false, error: "Rate limit exceeded (5 requests/minute)" });
|
||||
return true;
|
||||
}
|
||||
|
||||
// Parse body
|
||||
let body: unknown;
|
||||
try {
|
||||
body = await readJsonBody(req);
|
||||
} catch (err) {
|
||||
sendJson(res, 400, { ok: false, error: String(err) });
|
||||
return true;
|
||||
}
|
||||
|
||||
// Validate profile
|
||||
const parseResult = ProfileUpdateSchema.safeParse(body);
|
||||
if (!parseResult.success) {
|
||||
const errors = parseResult.error.issues.map((i) => `${i.path.join(".")}: ${i.message}`);
|
||||
sendJson(res, 400, { ok: false, error: "Validation failed", details: errors });
|
||||
return true;
|
||||
}
|
||||
|
||||
const profile = parseResult.data;
|
||||
|
||||
// SSRF check for picture URL
|
||||
if (profile.picture) {
|
||||
const pictureCheck = validateUrlSafety(profile.picture);
|
||||
if (!pictureCheck.ok) {
|
||||
sendJson(res, 400, { ok: false, error: `picture: ${pictureCheck.error}` });
|
||||
return true;
|
||||
}
|
||||
}
|
||||
|
||||
// SSRF check for banner URL
|
||||
if (profile.banner) {
|
||||
const bannerCheck = validateUrlSafety(profile.banner);
|
||||
if (!bannerCheck.ok) {
|
||||
sendJson(res, 400, { ok: false, error: `banner: ${bannerCheck.error}` });
|
||||
return true;
|
||||
}
|
||||
}
|
||||
|
||||
// SSRF check for website URL
|
||||
if (profile.website) {
|
||||
const websiteCheck = validateUrlSafety(profile.website);
|
||||
if (!websiteCheck.ok) {
|
||||
sendJson(res, 400, { ok: false, error: `website: ${websiteCheck.error}` });
|
||||
return true;
|
||||
}
|
||||
}
|
||||
|
||||
// Merge with existing profile to preserve unknown fields
|
||||
const existingProfile = ctx.getConfigProfile(accountId) ?? {};
|
||||
const mergedProfile: NostrProfile = {
|
||||
...existingProfile,
|
||||
...profile,
|
||||
};
|
||||
|
||||
// Publish with mutex to prevent concurrent publishes
|
||||
try {
|
||||
const result = await withPublishLock(accountId, async () => {
|
||||
return await publishNostrProfile(accountId, mergedProfile);
|
||||
});
|
||||
|
||||
// Only persist if at least one relay succeeded
|
||||
if (result.successes.length > 0) {
|
||||
await ctx.updateConfigProfile(accountId, mergedProfile);
|
||||
ctx.log?.info(`[${accountId}] Profile published to ${result.successes.length} relay(s)`);
|
||||
} else {
|
||||
ctx.log?.warn(`[${accountId}] Profile publish failed on all relays`);
|
||||
}
|
||||
|
||||
sendJson(res, 200, {
|
||||
ok: true,
|
||||
eventId: result.eventId,
|
||||
createdAt: result.createdAt,
|
||||
successes: result.successes,
|
||||
failures: result.failures,
|
||||
persisted: result.successes.length > 0,
|
||||
});
|
||||
} catch (err) {
|
||||
ctx.log?.error(`[${accountId}] Profile publish error: ${String(err)}`);
|
||||
sendJson(res, 500, { ok: false, error: `Publish failed: ${String(err)}` });
|
||||
}
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// POST /api/channels/nostr/:accountId/profile/import
|
||||
// ============================================================================
|
||||
|
||||
async function handleImportProfile(
|
||||
accountId: string,
|
||||
ctx: NostrProfileHttpContext,
|
||||
req: IncomingMessage,
|
||||
res: ServerResponse
|
||||
): Promise<true> {
|
||||
// Get account info
|
||||
const accountInfo = ctx.getAccountInfo(accountId);
|
||||
if (!accountInfo) {
|
||||
sendJson(res, 404, { ok: false, error: `Account not found: ${accountId}` });
|
||||
return true;
|
||||
}
|
||||
|
||||
const { pubkey, relays } = accountInfo;
|
||||
|
||||
if (!pubkey) {
|
||||
sendJson(res, 400, { ok: false, error: "Account has no public key configured" });
|
||||
return true;
|
||||
}
|
||||
|
||||
// Parse options from body
|
||||
let autoMerge = false;
|
||||
try {
|
||||
const body = await readJsonBody(req);
|
||||
if (typeof body === "object" && body !== null) {
|
||||
autoMerge = (body as { autoMerge?: boolean }).autoMerge === true;
|
||||
}
|
||||
} catch {
|
||||
// Ignore body parse errors - use defaults
|
||||
}
|
||||
|
||||
ctx.log?.info(`[${accountId}] Importing profile for ${pubkey.slice(0, 8)}...`);
|
||||
|
||||
// Import from relays
|
||||
const result = await importProfileFromRelays({
|
||||
pubkey,
|
||||
relays,
|
||||
timeoutMs: 10_000, // 10 seconds for import
|
||||
});
|
||||
|
||||
if (!result.ok) {
|
||||
sendJson(res, 200, {
|
||||
ok: false,
|
||||
error: result.error,
|
||||
relaysQueried: result.relaysQueried,
|
||||
});
|
||||
return true;
|
||||
}
|
||||
|
||||
// If autoMerge is requested, merge and save
|
||||
if (autoMerge && result.profile) {
|
||||
const localProfile = ctx.getConfigProfile(accountId);
|
||||
const merged = mergeProfiles(localProfile, result.profile);
|
||||
await ctx.updateConfigProfile(accountId, merged);
|
||||
ctx.log?.info(`[${accountId}] Profile imported and merged`);
|
||||
|
||||
sendJson(res, 200, {
|
||||
ok: true,
|
||||
imported: result.profile,
|
||||
merged,
|
||||
saved: true,
|
||||
event: result.event,
|
||||
sourceRelay: result.sourceRelay,
|
||||
relaysQueried: result.relaysQueried,
|
||||
});
|
||||
return true;
|
||||
}
|
||||
|
||||
// Otherwise, just return the imported profile for review
|
||||
sendJson(res, 200, {
|
||||
ok: true,
|
||||
imported: result.profile,
|
||||
saved: false,
|
||||
event: result.event,
|
||||
sourceRelay: result.sourceRelay,
|
||||
relaysQueried: result.relaysQueried,
|
||||
});
|
||||
return true;
|
||||
}
|
||||
@@ -1,120 +0,0 @@

/**
 * Tests for Nostr Profile Import
 */

import { describe, it, expect, vi, beforeEach } from "vitest";

import { mergeProfiles, type ProfileImportOptions } from "./nostr-profile-import.js";
import type { NostrProfile } from "./config-schema.js";

// Note: importProfileFromRelays requires real network calls or complex mocking
// of nostr-tools SimplePool, so we focus on unit testing mergeProfiles

describe("nostr-profile-import", () => {
  describe("mergeProfiles", () => {
    it("returns empty object when both are undefined", () => {
      const result = mergeProfiles(undefined, undefined);
      expect(result).toEqual({});
    });

    it("returns imported when local is undefined", () => {
      const imported: NostrProfile = {
        name: "imported",
        displayName: "Imported User",
        about: "Bio from relay",
      };
      const result = mergeProfiles(undefined, imported);
      expect(result).toEqual(imported);
    });

    it("returns local when imported is undefined", () => {
      const local: NostrProfile = {
        name: "local",
        displayName: "Local User",
      };
      const result = mergeProfiles(local, undefined);
      expect(result).toEqual(local);
    });

    it("prefers local values over imported", () => {
      const local: NostrProfile = {
        name: "localname",
        about: "Local bio",
      };
      const imported: NostrProfile = {
        name: "importedname",
        displayName: "Imported Display",
        about: "Imported bio",
        picture: "https://example.com/pic.jpg",
      };

      const result = mergeProfiles(local, imported);

      expect(result.name).toBe("localname"); // local wins
      expect(result.displayName).toBe("Imported Display"); // imported fills gap
      expect(result.about).toBe("Local bio"); // local wins
      expect(result.picture).toBe("https://example.com/pic.jpg"); // imported fills gap
    });

    it("fills all missing fields from imported", () => {
      const local: NostrProfile = {
        name: "myname",
      };
      const imported: NostrProfile = {
        name: "theirname",
        displayName: "Their Name",
        about: "Their bio",
        picture: "https://example.com/pic.jpg",
        banner: "https://example.com/banner.jpg",
        website: "https://example.com",
        nip05: "user@example.com",
        lud16: "user@getalby.com",
      };

      const result = mergeProfiles(local, imported);

      expect(result.name).toBe("myname");
      expect(result.displayName).toBe("Their Name");
      expect(result.about).toBe("Their bio");
      expect(result.picture).toBe("https://example.com/pic.jpg");
      expect(result.banner).toBe("https://example.com/banner.jpg");
      expect(result.website).toBe("https://example.com");
      expect(result.nip05).toBe("user@example.com");
      expect(result.lud16).toBe("user@getalby.com");
    });

    it("keeps empty strings from local (not treated as missing)", () => {
      const local: NostrProfile = {
        name: "",
        displayName: "",
      };
      const imported: NostrProfile = {
        name: "imported",
        displayName: "Imported",
      };

      const result = mergeProfiles(local, imported);

      // Empty strings are still strings, so they "win" over imported:
      // nullish coalescing only falls through on null/undefined
      expect(result.name).toBe("");
      expect(result.displayName).toBe("");
    });

    it("handles undefined values in local (prefers imported)", () => {
      const local: NostrProfile = {
        name: undefined,
        displayName: undefined,
      };
      const imported: NostrProfile = {
        name: "imported",
        displayName: "Imported",
      };

      const result = mergeProfiles(local, imported);

      expect(result.name).toBe("imported");
      expect(result.displayName).toBe("Imported");
    });
  });
});
@@ -1,259 +0,0 @@
/**
 * Nostr Profile Import
 *
 * Fetches and verifies kind:0 profile events from relays.
 * Used to import existing profiles before editing.
 */

import { SimplePool, verifyEvent, type Event } from "nostr-tools";

import { contentToProfile, type ProfileContent } from "./nostr-profile.js";
import type { NostrProfile } from "./config-schema.js";
import { validateUrlSafety } from "./nostr-profile-http.js";

// ============================================================================
// Types
// ============================================================================

export interface ProfileImportResult {
  /** Whether the import was successful */
  ok: boolean;
  /** The imported profile (if found and valid) */
  profile?: NostrProfile;
  /** The raw event (for advanced users) */
  event?: {
    id: string;
    pubkey: string;
    created_at: number;
  };
  /** Error message if import failed */
  error?: string;
  /** Which relays responded */
  relaysQueried: string[];
  /** Which relay provided the winning event */
  sourceRelay?: string;
}

export interface ProfileImportOptions {
  /** The public key to fetch profile for */
  pubkey: string;
  /** Relay URLs to query */
  relays: string[];
  /** Timeout per relay in milliseconds (default: 5000) */
  timeoutMs?: number;
}

// ============================================================================
// Constants
// ============================================================================

const DEFAULT_TIMEOUT_MS = 5000;

// ============================================================================
// Profile Import
// ============================================================================

/**
 * Sanitize URLs in an imported profile to prevent SSRF attacks.
 * Removes any URLs that don't pass SSRF validation.
 */
function sanitizeProfileUrls(profile: NostrProfile): NostrProfile {
  const result = { ...profile };
  const urlFields = ["picture", "banner", "website"] as const;

  for (const field of urlFields) {
    const value = result[field];
    if (value && typeof value === "string") {
      const validation = validateUrlSafety(value);
      if (!validation.ok) {
        // Remove unsafe URL
        delete result[field];
      }
    }
  }

  return result;
}

/**
 * Fetch the latest kind:0 profile event for a pubkey from relays.
 *
 * - Queries all relays in parallel
 * - Takes the event with the highest created_at
 * - Verifies the event signature
 * - Parses and returns the profile
 */
export async function importProfileFromRelays(
  opts: ProfileImportOptions
): Promise<ProfileImportResult> {
  const { pubkey, relays, timeoutMs = DEFAULT_TIMEOUT_MS } = opts;

  if (!pubkey || !/^[0-9a-fA-F]{64}$/.test(pubkey)) {
    return {
      ok: false,
      error: "Invalid pubkey format (must be 64 hex characters)",
      relaysQueried: [],
    };
  }

  if (relays.length === 0) {
    return {
      ok: false,
      error: "No relays configured",
      relaysQueried: [],
    };
  }

  const pool = new SimplePool();
  const relaysQueried: string[] = [];

  try {
    // Query all relays for kind:0 events from this pubkey
    const events: Array<{ event: Event; relay: string }> = [];

    // Create timeout promise
    const timeoutPromise = new Promise<void>((resolve) => {
      setTimeout(resolve, timeoutMs);
    });

    // Create subscription promise
    const subscriptionPromise = new Promise<void>((resolve) => {
      let completed = 0;

      for (const relay of relays) {
        relaysQueried.push(relay);

        const sub = pool.subscribeMany(
          [relay],
          [
            {
              kinds: [0],
              authors: [pubkey],
              limit: 1,
            },
          ],
          {
            onevent(event) {
              events.push({ event, relay });
            },
            oneose() {
              completed++;
              if (completed >= relays.length) {
                resolve();
              }
            },
            onclose() {
              completed++;
              if (completed >= relays.length) {
                resolve();
              }
            },
          }
        );

        // Clean up subscription after timeout
        setTimeout(() => {
          sub.close();
        }, timeoutMs);
      }
    });

    // Wait for either all relays to respond or timeout
    await Promise.race([subscriptionPromise, timeoutPromise]);

    // No events found
    if (events.length === 0) {
      return {
        ok: false,
        error: "No profile found on any relay",
        relaysQueried,
      };
    }

    // Find the event with the highest created_at (newest wins for replaceable events)
    let bestEvent: { event: Event; relay: string } | null = null;
    for (const item of events) {
      if (!bestEvent || item.event.created_at > bestEvent.event.created_at) {
        bestEvent = item;
      }
    }

    if (!bestEvent) {
      return {
        ok: false,
        error: "No valid profile event found",
        relaysQueried,
      };
    }

    // Verify the event signature
    const isValid = verifyEvent(bestEvent.event);
    if (!isValid) {
      return {
        ok: false,
        error: "Profile event has invalid signature",
        relaysQueried,
        sourceRelay: bestEvent.relay,
      };
    }

    // Parse the profile content
    let content: ProfileContent;
    try {
      content = JSON.parse(bestEvent.event.content) as ProfileContent;
    } catch {
      return {
        ok: false,
        error: "Profile event has invalid JSON content",
        relaysQueried,
        sourceRelay: bestEvent.relay,
      };
    }

    // Convert to our profile format
    const profile = contentToProfile(content);

    // Sanitize URLs from imported profile to prevent SSRF when auto-merging
    const sanitizedProfile = sanitizeProfileUrls(profile);

    return {
      ok: true,
      profile: sanitizedProfile,
      event: {
        id: bestEvent.event.id,
        pubkey: bestEvent.event.pubkey,
        created_at: bestEvent.event.created_at,
      },
      relaysQueried,
      sourceRelay: bestEvent.relay,
    };
  } finally {
    pool.close(relays);
  }
}
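The "newest wins" selection for replaceable events used above can be sketched in isolation. This is an illustrative helper, not part of the plugin; the event shape is trimmed to just the fields the rule reads:

```typescript
// Minimal sketch of the replaceable-event rule: among all kind:0 events
// seen across relays, the one with the highest created_at wins.
interface SeenEvent {
  event: { created_at: number; content: string };
  relay: string;
}

function pickNewest(events: SeenEvent[]): SeenEvent | null {
  let best: SeenEvent | null = null;
  for (const item of events) {
    if (!best || item.event.created_at > best.event.created_at) {
      best = item;
    }
  }
  return best;
}
```

Note the strict `>` comparison: on a created_at tie, the event from the first relay to deliver it is kept, matching the loop above.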

/**
 * Merge imported profile with local profile.
 *
 * Strategy:
 * - For each field, prefer local if set, otherwise use imported
 * - This preserves user customizations while filling in missing data
 */
export function mergeProfiles(
  local: NostrProfile | undefined,
  imported: NostrProfile | undefined
): NostrProfile {
  if (!imported) return local ?? {};
  if (!local) return imported;

  return {
    name: local.name ?? imported.name,
    displayName: local.displayName ?? imported.displayName,
    about: local.about ?? imported.about,
    picture: local.picture ?? imported.picture,
    banner: local.banner ?? imported.banner,
    website: local.website ?? imported.website,
    nip05: local.nip05 ?? imported.nip05,
    lud16: local.lud16 ?? imported.lud16,
  };
}
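The merge strategy documented above can be demonstrated with a self-contained sketch. `mergeMini` below is a hypothetical two-field reduction of `mergeProfiles`, inlined so the example runs on its own:

```typescript
// Hypothetical two-field reduction of the merge strategy above:
// local values win, imported values only fill gaps (via ??).
interface MiniProfile {
  name?: string;
  about?: string;
}

function mergeMini(
  local: MiniProfile | undefined,
  imported: MiniProfile | undefined
): MiniProfile {
  if (!imported) return local ?? {};
  if (!local) return imported;
  return {
    name: local.name ?? imported.name,
    about: local.about ?? imported.about,
  };
}
```

Because `??` only falls through on `null`/`undefined`, a locally set empty string still wins over an imported value, which is what preserves deliberate user edits.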
@@ -1,479 +0,0 @@
import { describe, expect, it } from "vitest";
import { getPublicKey } from "nostr-tools";
import {
  createProfileEvent,
  profileToContent,
  validateProfile,
  sanitizeProfileForDisplay,
} from "./nostr-profile.js";
import type { NostrProfile } from "./config-schema.js";

// Test private key
const TEST_HEX_KEY = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";
const TEST_SK = new Uint8Array(
  TEST_HEX_KEY.match(/.{2}/g)!.map((byte) => parseInt(byte, 16))
);

// ============================================================================
// Unicode Attack Vectors
// ============================================================================

describe("profile unicode attacks", () => {
  describe("zero-width characters", () => {
    it("handles zero-width space in name", () => {
      const profile: NostrProfile = {
        name: "test\u200Buser", // Zero-width space
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(true);
      // The character should be preserved (not stripped)
      expect(result.profile?.name).toBe("test\u200Buser");
    });

    it("handles zero-width joiner in name", () => {
      const profile: NostrProfile = {
        name: "test\u200Duser", // Zero-width joiner
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(true);
    });

    it("handles zero-width non-joiner in about", () => {
      const profile: NostrProfile = {
        about: "test\u200Cabout", // Zero-width non-joiner
      };
      const content = profileToContent(profile);
      expect(content.about).toBe("test\u200Cabout");
    });
  });

  describe("RTL override attacks", () => {
    it("handles RTL override in name", () => {
      const profile: NostrProfile = {
        name: "\u202Eevil\u202C", // Right-to-left override + pop direction
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(true);

      // UI should escape or handle this
      const sanitized = sanitizeProfileForDisplay(result.profile!);
      expect(sanitized.name).toBeDefined();
    });

    it("handles bidi embedding in about", () => {
      const profile: NostrProfile = {
        about: "Normal \u202Breversed\u202C text", // RTL embedding
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(true);
    });
  });

  describe("homoglyph attacks", () => {
    it("handles Cyrillic homoglyphs", () => {
      const profile: NostrProfile = {
        // Cyrillic 'а' (U+0430) looks like Latin 'a'
        name: "\u0430dmin", // Fake "admin"
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(true);
      // Profile is accepted but apps should be aware
    });

    it("handles Greek homoglyphs", () => {
      const profile: NostrProfile = {
        // Greek 'ο' (U+03BF) looks like Latin 'o'
        name: "b\u03BFt", // Looks like "bot"
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(true);
    });
  });

  describe("combining characters", () => {
    it("handles combining diacritics", () => {
      const profile: NostrProfile = {
        name: "cafe\u0301", // 'e' + combining acute = 'é'
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(true);
      expect(result.profile?.name).toBe("cafe\u0301");
    });

    it("handles excessive combining characters (Zalgo text)", () => {
      const zalgo =
        "t̷̢̧̨̡̛̛̛͎̩̝̪̲̲̞̠̹̗̩͓̬̱̪̦͙̬̲̤͙̱̫̝̪̱̫̯̬̭̠̖̲̥̖̫̫̤͇̪̣̫̪̖̱̯̣͎̯̲̱̤̪̣̖̲̪̯͓̖̤̫̫̲̱̲̫̲̖̫̪̯̱̱̪̖̯e̶̡̧̨̧̛̛̛̖̪̯̱̪̯̖̪̱̪̯̖̪̯̖̪̱̪̯̖̪̯̖̪̱̪̯̖̪̯̖̪̱̪̯̖̪̯̖̪̱̪̯̖̪̯̖̪̱̪s̶̨̧̛̛̖̪̱̪̯̖̪̯̖̪̱̪̯̖̪̯̖̪̱̪̯̖̪̯̖̪̱̪̯̖̪̯̖̪̱̪̯̖̪̯̖̪̱̪̯̖̪̯̖̪̱̪̯̖̪̯̖̪̱̪̯t";
      const profile: NostrProfile = {
        name: zalgo.slice(0, 256), // Truncate to fit limit
      };
      const result = validateProfile(profile);
      // Should be valid but may look weird
      expect(result.valid).toBe(true);
    });
  });

  describe("CJK and other scripts", () => {
    it("handles Chinese characters", () => {
      const profile: NostrProfile = {
        name: "中文用户",
        about: "我是一个机器人",
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(true);
    });

    it("handles Japanese hiragana and katakana", () => {
      const profile: NostrProfile = {
        name: "ボット",
        about: "これはテストです",
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(true);
    });

    it("handles Korean characters", () => {
      const profile: NostrProfile = {
        name: "한국어사용자",
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(true);
    });

    it("handles Arabic text", () => {
      const profile: NostrProfile = {
        name: "مستخدم",
        about: "مرحبا بالعالم",
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(true);
    });

    it("handles Hebrew text", () => {
      const profile: NostrProfile = {
        name: "משתמש",
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(true);
    });

    it("handles Thai text", () => {
      const profile: NostrProfile = {
        name: "ผู้ใช้",
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(true);
    });
  });

  describe("emoji edge cases", () => {
    it("handles emoji sequences (ZWJ)", () => {
      const profile: NostrProfile = {
        name: "👨‍👩‍👧‍👦", // Family emoji using ZWJ
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(true);
    });

    it("handles flag emojis", () => {
      const profile: NostrProfile = {
        name: "🇺🇸🇯🇵🇬🇧",
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(true);
    });

    it("handles skin tone modifiers", () => {
      const profile: NostrProfile = {
        name: "👋🏻👋🏽👋🏿",
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(true);
    });
  });
});

// ============================================================================
// XSS Attack Vectors
// ============================================================================

describe("profile XSS attacks", () => {
  describe("script injection", () => {
    it("escapes script tags", () => {
      const profile: NostrProfile = {
        name: '<script>alert("xss")</script>',
      };
      const sanitized = sanitizeProfileForDisplay(profile);
      expect(sanitized.name).not.toContain("<script>");
      expect(sanitized.name).toContain("&lt;script&gt;");
    });

    it("escapes nested script tags", () => {
      const profile: NostrProfile = {
        about: '<<script>script>alert("xss")<</script>/script>',
      };
      const sanitized = sanitizeProfileForDisplay(profile);
      expect(sanitized.about).not.toContain("<script>");
    });
  });

  describe("event handler injection", () => {
    it("escapes img onerror", () => {
      const profile: NostrProfile = {
        about: '<img src="x" onerror="alert(1)">',
      };
      const sanitized = sanitizeProfileForDisplay(profile);
      expect(sanitized.about).toContain("&lt;img");
      expect(sanitized.about).not.toContain('onerror="alert');
    });

    it("escapes svg onload", () => {
      const profile: NostrProfile = {
        about: '<svg onload="alert(1)">',
      };
      const sanitized = sanitizeProfileForDisplay(profile);
      expect(sanitized.about).toContain("&lt;svg");
    });

    it("escapes body onload", () => {
      const profile: NostrProfile = {
        about: '<body onload="alert(1)">',
      };
      const sanitized = sanitizeProfileForDisplay(profile);
      expect(sanitized.about).toContain("&lt;body");
    });
  });

  describe("URL-based attacks", () => {
    it("rejects javascript: URL in picture", () => {
      const profile = {
        picture: "javascript:alert('xss')",
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(false);
    });

    it("rejects javascript: URL with encoding", () => {
      const profile = {
        picture: "javascript:alert('xss')",
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(false);
    });

    it("rejects data: URL", () => {
      const profile = {
        picture: "data:text/html,<script>alert('xss')</script>",
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(false);
    });

    it("rejects vbscript: URL", () => {
      const profile = {
        website: "vbscript:msgbox('xss')",
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(false);
    });

    it("rejects file: URL", () => {
      const profile = {
        picture: "file:///etc/passwd",
      };
      const result = validateProfile(profile);
      expect(result.valid).toBe(false);
    });
  });

  describe("HTML attribute injection", () => {
    it("escapes double quotes in fields", () => {
      const profile: NostrProfile = {
        name: '" onclick="alert(1)" data-x="',
      };
      const sanitized = sanitizeProfileForDisplay(profile);
      expect(sanitized.name).toContain("&quot;");
      expect(sanitized.name).not.toContain('onclick="alert');
    });

    it("escapes single quotes in fields", () => {
      const profile: NostrProfile = {
        name: "' onclick='alert(1)' data-x='",
      };
      const sanitized = sanitizeProfileForDisplay(profile);
      expect(sanitized.name).toContain("&#39;");
    });
  });

  describe("CSS injection", () => {
    it("escapes style tags", () => {
      const profile: NostrProfile = {
        about: '<style>body{background:url("javascript:alert(1)")}</style>',
      };
      const sanitized = sanitizeProfileForDisplay(profile);
      expect(sanitized.about).toContain("&lt;style&gt;");
    });
  });
});

// ============================================================================
// Length Boundary Tests
// ============================================================================

describe("profile length boundaries", () => {
  describe("name field (max 256)", () => {
    it("accepts exactly 256 characters", () => {
      const result = validateProfile({ name: "a".repeat(256) });
      expect(result.valid).toBe(true);
    });

    it("rejects 257 characters", () => {
      const result = validateProfile({ name: "a".repeat(257) });
      expect(result.valid).toBe(false);
    });

    it("accepts empty string", () => {
      const result = validateProfile({ name: "" });
      expect(result.valid).toBe(true);
    });
  });

  describe("displayName field (max 256)", () => {
    it("accepts exactly 256 characters", () => {
      const result = validateProfile({ displayName: "b".repeat(256) });
      expect(result.valid).toBe(true);
    });

    it("rejects 257 characters", () => {
      const result = validateProfile({ displayName: "b".repeat(257) });
      expect(result.valid).toBe(false);
    });
  });

  describe("about field (max 2000)", () => {
    it("accepts exactly 2000 characters", () => {
      const result = validateProfile({ about: "c".repeat(2000) });
      expect(result.valid).toBe(true);
    });

    it("rejects 2001 characters", () => {
      const result = validateProfile({ about: "c".repeat(2001) });
      expect(result.valid).toBe(false);
    });
  });

  describe("URL fields", () => {
    it("accepts long valid HTTPS URLs", () => {
      const longPath = "a".repeat(1000);
      const result = validateProfile({
        picture: `https://example.com/${longPath}.png`,
      });
      expect(result.valid).toBe(true);
    });

    it("rejects invalid URL format", () => {
      const result = validateProfile({
        picture: "not-a-url",
      });
      expect(result.valid).toBe(false);
    });

    it("rejects URL without protocol", () => {
      const result = validateProfile({
        picture: "example.com/pic.png",
      });
      expect(result.valid).toBe(false);
    });
  });
});

// ============================================================================
// Type Confusion Tests
// ============================================================================

describe("profile type confusion", () => {
  it("rejects number as name", () => {
    const result = validateProfile({ name: 123 as unknown as string });
    expect(result.valid).toBe(false);
  });

  it("rejects array as about", () => {
    const result = validateProfile({ about: ["hello"] as unknown as string });
    expect(result.valid).toBe(false);
  });

  it("rejects object as picture", () => {
    const result = validateProfile({ picture: { url: "https://example.com" } as unknown as string });
    expect(result.valid).toBe(false);
  });

  it("rejects null as name", () => {
    const result = validateProfile({ name: null as unknown as string });
    expect(result.valid).toBe(false);
  });

  it("rejects boolean as about", () => {
    const result = validateProfile({ about: true as unknown as string });
    expect(result.valid).toBe(false);
  });

  it("rejects function as name", () => {
    const result = validateProfile({ name: (() => "test") as unknown as string });
    expect(result.valid).toBe(false);
  });

  it("handles prototype pollution attempt", () => {
    const malicious = JSON.parse('{"__proto__": {"polluted": true}}') as unknown;
    const result = validateProfile(malicious);
    // Should not pollute Object.prototype
    expect(({} as Record<string, unknown>).polluted).toBeUndefined();
  });
});

// ============================================================================
// Event Creation Edge Cases
// ============================================================================

describe("event creation edge cases", () => {
  it("handles profile with all fields at max length", () => {
    const profile: NostrProfile = {
      name: "a".repeat(256),
      displayName: "b".repeat(256),
      about: "c".repeat(2000),
      nip05: "d".repeat(200) + "@example.com",
      lud16: "e".repeat(200) + "@example.com",
    };

    const event = createProfileEvent(TEST_SK, profile);

    expect(event.kind).toBe(0);

    // Content should be parseable JSON
    expect(() => JSON.parse(event.content)).not.toThrow();
  });

  it("handles rapid sequential events with monotonic timestamps", () => {
    const profile: NostrProfile = { name: "rapid" };

    // Create events in quick succession
    let lastTimestamp = 0;
    for (let i = 0; i < 100; i++) {
      const event = createProfileEvent(TEST_SK, profile, lastTimestamp);
      expect(event.created_at).toBeGreaterThan(lastTimestamp);
      lastTimestamp = event.created_at;
    }
  });

  it("handles JSON special characters in content", () => {
    const profile: NostrProfile = {
      name: 'test"user',
      about: "line1\nline2\ttab\\backslash",
    };

    const event = createProfileEvent(TEST_SK, profile);
    const parsed = JSON.parse(event.content) as { name: string; about: string };

    expect(parsed.name).toBe('test"user');
    expect(parsed.about).toContain("\n");
    expect(parsed.about).toContain("\t");
    expect(parsed.about).toContain("\\");
  });
});
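The escaping behavior the XSS tests above assert can be illustrated with a minimal sketch. The real `sanitizeProfileForDisplay` implementation is not part of this diff; `escapeHtml` below is a hypothetical helper consistent with the assertions (angle brackets and quotes become HTML entities, and `&` is escaped first so existing entities are not double-mangled):

```typescript
// Hypothetical escaping helper matching what the tests above expect.
// NOT the plugin's actual implementation — an illustration only.
function escapeHtml(value: string): string {
  return value
    .replace(/&/g, "&amp;") // must run first to avoid re-escaping entities
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}
```

With this shape, `<script>` becomes `&lt;script&gt;`, which renders as literal text instead of executing, and attribute-breaking quotes become inert entities.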
@@ -1,410 +0,0 @@
import { describe, expect, it, vi, beforeEach } from "vitest";
import { verifyEvent, getPublicKey } from "nostr-tools";
import {
  createProfileEvent,
  profileToContent,
  contentToProfile,
  validateProfile,
  sanitizeProfileForDisplay,
  type ProfileContent,
} from "./nostr-profile.js";
import type { NostrProfile } from "./config-schema.js";

// Test private key (DO NOT use in production - this is a known test key)
const TEST_HEX_KEY = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";
const TEST_SK = new Uint8Array(
  TEST_HEX_KEY.match(/.{2}/g)!.map((byte) => parseInt(byte, 16))
);
const TEST_PUBKEY = getPublicKey(TEST_SK);

// ============================================================================
// Profile Content Conversion Tests
// ============================================================================

describe("profileToContent", () => {
  it("converts full profile to NIP-01 content format", () => {
    const profile: NostrProfile = {
      name: "testuser",
      displayName: "Test User",
      about: "A test user for unit testing",
      picture: "https://example.com/avatar.png",
      banner: "https://example.com/banner.png",
      website: "https://example.com",
      nip05: "testuser@example.com",
      lud16: "testuser@walletofsatoshi.com",
    };

    const content = profileToContent(profile);

    expect(content.name).toBe("testuser");
    expect(content.display_name).toBe("Test User");
    expect(content.about).toBe("A test user for unit testing");
    expect(content.picture).toBe("https://example.com/avatar.png");
    expect(content.banner).toBe("https://example.com/banner.png");
    expect(content.website).toBe("https://example.com");
    expect(content.nip05).toBe("testuser@example.com");
    expect(content.lud16).toBe("testuser@walletofsatoshi.com");
  });

  it("omits undefined fields from content", () => {
    const profile: NostrProfile = {
      name: "minimaluser",
    };

    const content = profileToContent(profile);

    expect(content.name).toBe("minimaluser");
    expect("display_name" in content).toBe(false);
    expect("about" in content).toBe(false);
    expect("picture" in content).toBe(false);
  });

  it("handles empty profile", () => {
    const profile: NostrProfile = {};
    const content = profileToContent(profile);
    expect(Object.keys(content)).toHaveLength(0);
  });
});

describe("contentToProfile", () => {
  it("converts NIP-01 content to profile format", () => {
    const content: ProfileContent = {
      name: "testuser",
      display_name: "Test User",
      about: "A test user",
      picture: "https://example.com/avatar.png",
      nip05: "test@example.com",
    };

    const profile = contentToProfile(content);

    expect(profile.name).toBe("testuser");
    expect(profile.displayName).toBe("Test User");
    expect(profile.about).toBe("A test user");
    expect(profile.picture).toBe("https://example.com/avatar.png");
    expect(profile.nip05).toBe("test@example.com");
  });

  it("handles empty content", () => {
    const content: ProfileContent = {};
    const profile = contentToProfile(content);
    expect(Object.keys(profile).filter((k) => profile[k as keyof NostrProfile] !== undefined)).toHaveLength(0);
  });

  it("round-trips profile data", () => {
    const original: NostrProfile = {
      name: "roundtrip",
      displayName: "Round Trip Test",
      about: "Testing round-trip conversion",
    };

    const content = profileToContent(original);
    const restored = contentToProfile(content);

    expect(restored.name).toBe(original.name);
    expect(restored.displayName).toBe(original.displayName);
    expect(restored.about).toBe(original.about);
  });
});

// ============================================================================
// Event Creation Tests
// ============================================================================

describe("createProfileEvent", () => {
  beforeEach(() => {
    vi.useFakeTimers();
    vi.setSystemTime(new Date("2024-01-15T12:00:00Z"));
    // Restore real timers after each test so fake timers don't leak
    return () => vi.useRealTimers();
  });

  it("creates a valid kind:0 event", () => {
    const profile: NostrProfile = {
      name: "testbot",
      about: "A test bot",
    };

    const event = createProfileEvent(TEST_SK, profile);

    expect(event.kind).toBe(0);
    expect(event.pubkey).toBe(TEST_PUBKEY);
    expect(event.tags).toEqual([]);
    expect(event.id).toMatch(/^[0-9a-f]{64}$/);
    expect(event.sig).toMatch(/^[0-9a-f]{128}$/);
  });

  it("includes profile content as JSON in event content", () => {
    const profile: NostrProfile = {
      name: "jsontest",
      displayName: "JSON Test User",
      about: "Testing JSON serialization",
    };

    const event = createProfileEvent(TEST_SK, profile);
    const parsedContent = JSON.parse(event.content) as ProfileContent;

    expect(parsedContent.name).toBe("jsontest");
    expect(parsedContent.display_name).toBe("JSON Test User");
    expect(parsedContent.about).toBe("Testing JSON serialization");
  });

  it("produces a verifiable signature", () => {
    const profile: NostrProfile = { name: "signaturetest" };
    const event = createProfileEvent(TEST_SK, profile);

    expect(verifyEvent(event)).toBe(true);
  });

  it("uses current timestamp when no lastPublishedAt provided", () => {
    const profile: NostrProfile = { name: "timestamptest" };
    const event = createProfileEvent(TEST_SK, profile);

    const expectedTimestamp = Math.floor(Date.now() / 1000);
    expect(event.created_at).toBe(expectedTimestamp);
  });

  it("ensures monotonic timestamp when lastPublishedAt is in the future", () => {
    // Current time is 2024-01-15T12:00:00Z = 1705320000
    const futureTimestamp = 1705320000 + 3600; // 1 hour in the future
    const profile: NostrProfile = { name: "monotonictest" };

    const event = createProfileEvent(TEST_SK, profile, futureTimestamp);

    expect(event.created_at).toBe(futureTimestamp + 1);
  });

  it("uses current time when lastPublishedAt is in the past", () => {
    const pastTimestamp = 1705320000 - 3600; // 1 hour in the past
    const profile: NostrProfile = { name: "pasttest" };

    const event = createProfileEvent(TEST_SK, profile, pastTimestamp);

    const expectedTimestamp = Math.floor(Date.now() / 1000);
    expect(event.created_at).toBe(expectedTimestamp);
  });
});

// ============================================================================
// Profile Validation Tests
// ============================================================================

describe("validateProfile", () => {
  it("validates a correct profile", () => {
    const profile = {
      name: "validuser",
      about: "A valid user",
      picture: "https://example.com/pic.png",
    };

    const result = validateProfile(profile);

    expect(result.valid).toBe(true);
    expect(result.profile).toBeDefined();
    expect(result.errors).toBeUndefined();
  });

  it("rejects profile with invalid URL", () => {
    const profile = {
      name: "invalidurl",
      picture: "http://insecure.example.com/pic.png", // HTTP not HTTPS
    };

    const result = validateProfile(profile);

    expect(result.valid).toBe(false);
    expect(result.errors).toBeDefined();
    expect(result.errors!.some((e) => e.includes("https://"))).toBe(true);
  });

  it("rejects profile with javascript: URL", () => {
    const profile = {
      name: "xssattempt",
      picture: "javascript:alert('xss')",
    };

    const result = validateProfile(profile);

    expect(result.valid).toBe(false);
  });

  it("rejects profile with data: URL", () => {
    const profile = {
      name: "dataurl",
      picture: "data:image/png;base64,abc123",
    };

    const result = validateProfile(profile);

    expect(result.valid).toBe(false);
  });

  it("rejects name exceeding 256 characters", () => {
    const profile = {
      name: "a".repeat(257),
    };

    const result = validateProfile(profile);

    expect(result.valid).toBe(false);
    expect(result.errors!.some((e) => e.includes("256"))).toBe(true);
  });

  it("rejects about exceeding 2000 characters", () => {
    const profile = {
      about: "a".repeat(2001),
    };

    const result = validateProfile(profile);

    expect(result.valid).toBe(false);
    expect(result.errors!.some((e) => e.includes("2000"))).toBe(true);
  });

  it("accepts empty profile", () => {
    const result = validateProfile({});
    expect(result.valid).toBe(true);
  });

  it("rejects null input", () => {
    const result = validateProfile(null);
    expect(result.valid).toBe(false);
  });

  it("rejects non-object input", () => {
    const result = validateProfile("not an object");
    expect(result.valid).toBe(false);
  });
});
|
||||
|
||||
// ============================================================================
|
||||
// Sanitization Tests
|
||||
// ============================================================================
|
||||
|
||||
describe("sanitizeProfileForDisplay", () => {
|
||||
it("escapes HTML in name field", () => {
|
||||
const profile: NostrProfile = {
|
||||
name: "<script>alert('xss')</script>",
|
||||
};
|
||||
|
||||
const sanitized = sanitizeProfileForDisplay(profile);
|
||||
|
||||
expect(sanitized.name).toBe("<script>alert('xss')</script>");
|
||||
});
|
||||
|
||||
it("escapes HTML in about field", () => {
|
||||
const profile: NostrProfile = {
|
||||
about: 'Check out <img src="x" onerror="alert(1)">',
|
||||
};
|
||||
|
||||
const sanitized = sanitizeProfileForDisplay(profile);
|
||||
|
||||
expect(sanitized.about).toBe(
|
||||
'Check out <img src="x" onerror="alert(1)">'
|
||||
);
|
||||
});
|
||||
|
||||
it("preserves URLs without modification", () => {
|
||||
const profile: NostrProfile = {
|
||||
picture: "https://example.com/pic.png",
|
||||
website: "https://example.com",
|
||||
};
|
||||
|
||||
const sanitized = sanitizeProfileForDisplay(profile);
|
||||
|
||||
expect(sanitized.picture).toBe("https://example.com/pic.png");
|
||||
expect(sanitized.website).toBe("https://example.com");
|
||||
});
|
||||
|
||||
it("handles undefined fields", () => {
|
||||
const profile: NostrProfile = {
|
||||
name: "test",
|
||||
};
|
||||
|
||||
const sanitized = sanitizeProfileForDisplay(profile);
|
||||
|
||||
expect(sanitized.name).toBe("test");
|
||||
expect(sanitized.about).toBeUndefined();
|
||||
expect(sanitized.picture).toBeUndefined();
|
||||
});
|
||||
|
||||
it("escapes ampersands", () => {
|
||||
const profile: NostrProfile = {
|
||||
name: "Tom & Jerry",
|
||||
};
|
||||
|
||||
const sanitized = sanitizeProfileForDisplay(profile);
|
||||
|
||||
expect(sanitized.name).toBe("Tom & Jerry");
|
||||
});
|
||||
|
||||
it("escapes quotes", () => {
|
||||
const profile: NostrProfile = {
|
||||
about: 'Say "hello" to everyone',
|
||||
};
|
||||
|
||||
const sanitized = sanitizeProfileForDisplay(profile);
|
||||
|
||||
expect(sanitized.about).toBe("Say "hello" to everyone");
|
||||
});
|
||||
});
|
||||
|
||||
// ============================================================================
|
||||
// Edge Cases
|
||||
// ============================================================================
|
||||
|
||||
describe("edge cases", () => {
|
||||
it("handles emoji in profile fields", () => {
|
||||
const profile: NostrProfile = {
|
||||
name: "🤖 Bot",
|
||||
about: "I am a 🤖 robot! 🎉",
|
||||
};
|
||||
|
||||
const content = profileToContent(profile);
|
||||
expect(content.name).toBe("🤖 Bot");
|
||||
expect(content.about).toBe("I am a 🤖 robot! 🎉");
|
||||
|
||||
const event = createProfileEvent(TEST_SK, profile);
|
||||
const parsed = JSON.parse(event.content) as ProfileContent;
|
||||
expect(parsed.name).toBe("🤖 Bot");
|
||||
});
|
||||
|
||||
it("handles unicode in profile fields", () => {
|
||||
const profile: NostrProfile = {
|
||||
name: "日本語ユーザー",
|
||||
about: "Привет мир! 你好世界!",
|
||||
};
|
||||
|
||||
const content = profileToContent(profile);
|
||||
expect(content.name).toBe("日本語ユーザー");
|
||||
|
||||
const event = createProfileEvent(TEST_SK, profile);
|
||||
expect(verifyEvent(event)).toBe(true);
|
||||
});
|
||||
|
||||
it("handles newlines in about field", () => {
|
||||
const profile: NostrProfile = {
|
||||
about: "Line 1\nLine 2\nLine 3",
|
||||
};
|
||||
|
||||
const content = profileToContent(profile);
|
||||
expect(content.about).toBe("Line 1\nLine 2\nLine 3");
|
||||
|
||||
const event = createProfileEvent(TEST_SK, profile);
|
||||
const parsed = JSON.parse(event.content) as ProfileContent;
|
||||
expect(parsed.about).toBe("Line 1\nLine 2\nLine 3");
|
||||
});
|
||||
|
||||
it("handles maximum length fields", () => {
|
||||
const profile: NostrProfile = {
|
||||
name: "a".repeat(256),
|
||||
about: "b".repeat(2000),
|
||||
};
|
||||
|
||||
const result = validateProfile(profile);
|
||||
expect(result.valid).toBe(true);
|
||||
|
||||
const event = createProfileEvent(TEST_SK, profile);
|
||||
expect(verifyEvent(event)).toBe(true);
|
||||
});
|
||||
});
|
||||
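The sanitization tests above all hinge on HTML-entity escaping, where replacement order matters: `&` must be escaped before the other characters, otherwise the `&` inside already-produced entities such as `&lt;` would be escaped a second time. A minimal standalone sketch of that escaping logic (assuming the module's helper mirrors this shape):

```typescript
// Minimal sketch of HTML escaping as exercised by the tests above.
// "&" is replaced first so later entities are not double-escaped.
function escapeHtml(str: string): string {
  return str
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}

console.log(escapeHtml("<b>"));          // &lt;b&gt;
console.log(escapeHtml("Tom & Jerry"));  // Tom &amp; Jerry
console.log(escapeHtml("&lt;"));         // &amp;lt; (input already contained an entity)
```

If `&` were escaped last instead, `escapeHtml("<b>")` would produce `&amp;lt;b&amp;gt;`, which is why the replacement chain is ordered the way it is.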
@@ -1,242 +0,0 @@
/**
 * Nostr Profile Management (NIP-01 kind:0)
 *
 * Profile events are "replaceable" - the latest created_at wins.
 * This module handles profile event creation and publishing.
 */

import { finalizeEvent, SimplePool, type Event } from "nostr-tools";
import { type NostrProfile, NostrProfileSchema } from "./config-schema.js";

// ============================================================================
// Types
// ============================================================================

/** Result of a profile publish attempt */
export interface ProfilePublishResult {
  /** Event ID of the published profile */
  eventId: string;
  /** Relays that successfully received the event */
  successes: string[];
  /** Relays that failed with their error messages */
  failures: Array<{ relay: string; error: string }>;
  /** Unix timestamp when the event was created */
  createdAt: number;
}

/** NIP-01 profile content (JSON inside kind:0 event) */
export interface ProfileContent {
  name?: string;
  display_name?: string;
  about?: string;
  picture?: string;
  banner?: string;
  website?: string;
  nip05?: string;
  lud16?: string;
}

// ============================================================================
// Profile Content Conversion
// ============================================================================

/**
 * Convert our config profile schema to NIP-01 content format.
 * Strips undefined fields and validates URLs.
 */
export function profileToContent(profile: NostrProfile): ProfileContent {
  const validated = NostrProfileSchema.parse(profile);

  const content: ProfileContent = {};

  if (validated.name !== undefined) content.name = validated.name;
  if (validated.displayName !== undefined) content.display_name = validated.displayName;
  if (validated.about !== undefined) content.about = validated.about;
  if (validated.picture !== undefined) content.picture = validated.picture;
  if (validated.banner !== undefined) content.banner = validated.banner;
  if (validated.website !== undefined) content.website = validated.website;
  if (validated.nip05 !== undefined) content.nip05 = validated.nip05;
  if (validated.lud16 !== undefined) content.lud16 = validated.lud16;

  return content;
}

/**
 * Convert NIP-01 content format back to our config profile schema.
 * Useful for importing existing profiles from relays.
 */
export function contentToProfile(content: ProfileContent): NostrProfile {
  const profile: NostrProfile = {};

  if (content.name !== undefined) profile.name = content.name;
  if (content.display_name !== undefined) profile.displayName = content.display_name;
  if (content.about !== undefined) profile.about = content.about;
  if (content.picture !== undefined) profile.picture = content.picture;
  if (content.banner !== undefined) profile.banner = content.banner;
  if (content.website !== undefined) profile.website = content.website;
  if (content.nip05 !== undefined) profile.nip05 = content.nip05;
  if (content.lud16 !== undefined) profile.lud16 = content.lud16;

  return profile;
}

// ============================================================================
// Event Creation
// ============================================================================

/**
 * Create a signed kind:0 profile event.
 *
 * @param sk - Private key as Uint8Array (32 bytes)
 * @param profile - Profile data to include
 * @param lastPublishedAt - Previous profile timestamp (for monotonic guarantee)
 * @returns Signed Nostr event
 */
export function createProfileEvent(
  sk: Uint8Array,
  profile: NostrProfile,
  lastPublishedAt?: number
): Event {
  const content = profileToContent(profile);
  const contentJson = JSON.stringify(content);

  // Ensure monotonic timestamp (new event > previous)
  const now = Math.floor(Date.now() / 1000);
  const createdAt = lastPublishedAt !== undefined ? Math.max(now, lastPublishedAt + 1) : now;

  const event = finalizeEvent(
    {
      kind: 0,
      content: contentJson,
      tags: [],
      created_at: createdAt,
    },
    sk
  );

  return event;
}

// ============================================================================
// Profile Publishing
// ============================================================================

/** Per-relay publish timeout (ms) */
const RELAY_PUBLISH_TIMEOUT_MS = 5000;

/**
 * Publish a profile event to multiple relays.
 *
 * Best-effort: publishes to all relays in parallel, reports per-relay results.
 * Does NOT retry automatically - caller should handle retries if needed.
 *
 * @param pool - SimplePool instance for relay connections
 * @param relays - Array of relay WebSocket URLs
 * @param event - Signed profile event (kind:0)
 * @returns Publish results with successes and failures
 */
export async function publishProfileEvent(
  pool: SimplePool,
  relays: string[],
  event: Event
): Promise<ProfilePublishResult> {
  const successes: string[] = [];
  const failures: Array<{ relay: string; error: string }> = [];

  // Publish to each relay in parallel with timeout
  const publishPromises = relays.map(async (relay) => {
    try {
      const timeoutPromise = new Promise<never>((_, reject) => {
        setTimeout(() => reject(new Error("timeout")), RELAY_PUBLISH_TIMEOUT_MS);
      });

      await Promise.race([pool.publish([relay], event), timeoutPromise]);

      successes.push(relay);
    } catch (err) {
      const errorMessage = err instanceof Error ? err.message : String(err);
      failures.push({ relay, error: errorMessage });
    }
  });

  await Promise.all(publishPromises);

  return {
    eventId: event.id,
    successes,
    failures,
    createdAt: event.created_at,
  };
}

/**
 * Create and publish a profile event in one call.
 *
 * @param pool - SimplePool instance
 * @param sk - Private key as Uint8Array
 * @param relays - Array of relay URLs
 * @param profile - Profile data
 * @param lastPublishedAt - Previous timestamp for monotonic ordering
 * @returns Publish results
 */
export async function publishProfile(
  pool: SimplePool,
  sk: Uint8Array,
  relays: string[],
  profile: NostrProfile,
  lastPublishedAt?: number
): Promise<ProfilePublishResult> {
  const event = createProfileEvent(sk, profile, lastPublishedAt);
  return publishProfileEvent(pool, relays, event);
}

// ============================================================================
// Profile Validation Helpers
// ============================================================================

/**
 * Validate a profile without throwing (returns result object).
 */
export function validateProfile(profile: unknown): {
  valid: boolean;
  profile?: NostrProfile;
  errors?: string[];
} {
  const result = NostrProfileSchema.safeParse(profile);

  if (result.success) {
    return { valid: true, profile: result.data };
  }

  return {
    valid: false,
    errors: result.error.issues.map((e) => `${e.path.join(".")}: ${e.message}`),
  };
}

/**
 * Sanitize profile text fields to prevent XSS when displaying in UI.
 * Escapes HTML special characters.
 */
export function sanitizeProfileForDisplay(profile: NostrProfile): NostrProfile {
  const escapeHtml = (str: string | undefined): string | undefined => {
    if (str === undefined) return undefined;
    return str
      .replace(/&/g, "&amp;")
      .replace(/</g, "&lt;")
      .replace(/>/g, "&gt;")
      .replace(/"/g, "&quot;")
      .replace(/'/g, "&#39;");
  };

  return {
    name: escapeHtml(profile.name),
    displayName: escapeHtml(profile.displayName),
    about: escapeHtml(profile.about),
    picture: profile.picture, // URLs already validated by schema
    banner: profile.banner,
    website: profile.website,
    nip05: escapeHtml(profile.nip05),
    lud16: escapeHtml(profile.lud16),
  };
}
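Because kind:0 is replaceable, relays keep only the event with the highest `created_at`; a re-publish within the same second as the previous one would otherwise be silently dropped. A standalone sketch of the monotonic-timestamp rule used by `createProfileEvent` (the helper name here is illustrative, not part of the module):

```typescript
// Sketch: pick a created_at that is strictly greater than the previous
// publish, so the replaceable kind:0 event always supersedes it.
function nextCreatedAt(nowSec: number, lastPublishedAt?: number): number {
  return lastPublishedAt !== undefined ? Math.max(nowSec, lastPublishedAt + 1) : nowSec;
}

console.log(nextCreatedAt(1700000000));             // fresh publish: 1700000000
console.log(nextCreatedAt(1700000000, 1700000000)); // same-second republish: 1700000001
console.log(nextCreatedAt(1700000000, 1600000000)); // stale previous publish: 1700000000
```

The `+ 1` is what guarantees progress even when the clock has not advanced between publishes.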
@@ -1,128 +0,0 @@
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";

import { describe, expect, it } from "vitest";
import type { PluginRuntime } from "clawdbot/plugin-sdk";

import {
  readNostrBusState,
  writeNostrBusState,
  computeSinceTimestamp,
} from "./nostr-state-store.js";
import { setNostrRuntime } from "./runtime.js";

async function withTempStateDir<T>(fn: (dir: string) => Promise<T>) {
  const previous = process.env.CLAWDBOT_STATE_DIR;
  const dir = await fs.mkdtemp(path.join(os.tmpdir(), "clawdbot-nostr-"));
  process.env.CLAWDBOT_STATE_DIR = dir;
  setNostrRuntime({
    state: {
      resolveStateDir: (env, homedir) => {
        const override = env.CLAWDBOT_STATE_DIR?.trim();
        if (override) return override;
        return path.join(homedir(), ".clawdbot");
      },
    },
  } as PluginRuntime);
  try {
    return await fn(dir);
  } finally {
    if (previous === undefined) delete process.env.CLAWDBOT_STATE_DIR;
    else process.env.CLAWDBOT_STATE_DIR = previous;
    await fs.rm(dir, { recursive: true, force: true });
  }
}

describe("nostr bus state store", () => {
  it("persists and reloads state across restarts", async () => {
    await withTempStateDir(async () => {
      // Fresh start - no state
      expect(await readNostrBusState({ accountId: "test-bot" })).toBeNull();

      // Write state
      await writeNostrBusState({
        accountId: "test-bot",
        lastProcessedAt: 1700000000,
        gatewayStartedAt: 1700000100,
      });

      // Read it back
      const state = await readNostrBusState({ accountId: "test-bot" });
      expect(state).toEqual({
        version: 2,
        lastProcessedAt: 1700000000,
        gatewayStartedAt: 1700000100,
        recentEventIds: [],
      });
    });
  });

  it("isolates state by accountId", async () => {
    await withTempStateDir(async () => {
      await writeNostrBusState({
        accountId: "bot-a",
        lastProcessedAt: 1000,
        gatewayStartedAt: 1000,
      });
      await writeNostrBusState({
        accountId: "bot-b",
        lastProcessedAt: 2000,
        gatewayStartedAt: 2000,
      });

      const stateA = await readNostrBusState({ accountId: "bot-a" });
      const stateB = await readNostrBusState({ accountId: "bot-b" });

      expect(stateA?.lastProcessedAt).toBe(1000);
      expect(stateB?.lastProcessedAt).toBe(2000);
    });
  });
});

describe("computeSinceTimestamp", () => {
  it("returns now for null state (fresh start)", () => {
    const now = 1700000000;
    expect(computeSinceTimestamp(null, now)).toBe(now);
  });

  it("uses lastProcessedAt when available", () => {
    const state = {
      version: 2,
      lastProcessedAt: 1699999000,
      gatewayStartedAt: null,
      recentEventIds: [],
    };
    expect(computeSinceTimestamp(state, 1700000000)).toBe(1699999000);
  });

  it("uses gatewayStartedAt when lastProcessedAt is null", () => {
    const state = {
      version: 2,
      lastProcessedAt: null,
      gatewayStartedAt: 1699998000,
      recentEventIds: [],
    };
    expect(computeSinceTimestamp(state, 1700000000)).toBe(1699998000);
  });

  it("uses the max of both timestamps", () => {
    const state = {
      version: 2,
      lastProcessedAt: 1699999000,
      gatewayStartedAt: 1699998000,
      recentEventIds: [],
    };
    expect(computeSinceTimestamp(state, 1700000000)).toBe(1699999000);
  });

  it("falls back to now if both are null", () => {
    const state = {
      version: 2,
      lastProcessedAt: null,
      gatewayStartedAt: null,
      recentEventIds: [],
    };
    expect(computeSinceTimestamp(state, 1700000000)).toBe(1700000000);
  });
});
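The `computeSinceTimestamp` tests above pin down one rule: subscribe from the most recent of `lastProcessedAt` and `gatewayStartedAt`, or from "now" on a fresh start so old relay history is not replayed. A self-contained re-implementation of that rule for illustration (the type and function names here are local to this sketch):

```typescript
// Sketch of the "since" selection the tests above describe.
type BusState = { lastProcessedAt: number | null; gatewayStartedAt: number | null };

function sinceTimestamp(state: BusState | null, nowSec: number): number {
  if (!state) return nowSec; // fresh start: skip all historical events
  const candidates = [state.lastProcessedAt, state.gatewayStartedAt].filter(
    (t): t is number => t !== null && t > 0
  );
  return candidates.length === 0 ? nowSec : Math.max(...candidates);
}

console.log(sinceTimestamp(null, 1700000000)); // 1700000000
console.log(
  sinceTimestamp({ lastProcessedAt: 1699999000, gatewayStartedAt: 1699998000 }, 1700000000)
); // 1699999000 (the later of the two persisted timestamps)
```

Taking the max means a crash between "gateway started" and "first event processed" still resumes from startup time rather than from zero.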
@@ -1,226 +0,0 @@
import crypto from "node:crypto";
import fs from "node:fs/promises";
import os from "node:os";
import path from "node:path";

import { getNostrRuntime } from "./runtime.js";

const STORE_VERSION = 2;
const PROFILE_STATE_VERSION = 1;

type NostrBusStateV1 = {
  version: 1;
  /** Unix timestamp (seconds) of the last processed event */
  lastProcessedAt: number | null;
  /** Gateway startup timestamp (seconds) - events before this are old */
  gatewayStartedAt: number | null;
};

type NostrBusState = {
  version: 2;
  /** Unix timestamp (seconds) of the last processed event */
  lastProcessedAt: number | null;
  /** Gateway startup timestamp (seconds) - events before this are old */
  gatewayStartedAt: number | null;
  /** Recent processed event IDs for overlap dedupe across restarts */
  recentEventIds: string[];
};

/** Profile publish state (separate from bus state) */
export type NostrProfileState = {
  version: 1;
  /** Unix timestamp (seconds) of last successful profile publish */
  lastPublishedAt: number | null;
  /** Event ID of the last published profile */
  lastPublishedEventId: string | null;
  /** Per-relay publish results from last attempt */
  lastPublishResults: Record<string, "ok" | "failed" | "timeout"> | null;
};

function normalizeAccountId(accountId?: string): string {
  const trimmed = accountId?.trim();
  if (!trimmed) return "default";
  return trimmed.replace(/[^a-z0-9._-]+/gi, "_");
}

function resolveNostrStatePath(
  accountId?: string,
  env: NodeJS.ProcessEnv = process.env
): string {
  const stateDir = getNostrRuntime().state.resolveStateDir(env, os.homedir);
  const normalized = normalizeAccountId(accountId);
  return path.join(stateDir, "nostr", `bus-state-${normalized}.json`);
}

function resolveNostrProfileStatePath(
  accountId?: string,
  env: NodeJS.ProcessEnv = process.env
): string {
  const stateDir = getNostrRuntime().state.resolveStateDir(env, os.homedir);
  const normalized = normalizeAccountId(accountId);
  return path.join(stateDir, "nostr", `profile-state-${normalized}.json`);
}

function safeParseState(raw: string): NostrBusState | null {
  try {
    const parsed = JSON.parse(raw) as Partial<NostrBusState> & Partial<NostrBusStateV1>;

    if (parsed?.version === 2) {
      return {
        version: 2,
        lastProcessedAt: typeof parsed.lastProcessedAt === "number" ? parsed.lastProcessedAt : null,
        gatewayStartedAt: typeof parsed.gatewayStartedAt === "number" ? parsed.gatewayStartedAt : null,
        recentEventIds: Array.isArray(parsed.recentEventIds)
          ? parsed.recentEventIds.filter((x): x is string => typeof x === "string")
          : [],
      };
    }

    // Back-compat: v1 state files
    if (parsed?.version === 1) {
      return {
        version: 2,
        lastProcessedAt: typeof parsed.lastProcessedAt === "number" ? parsed.lastProcessedAt : null,
        gatewayStartedAt: typeof parsed.gatewayStartedAt === "number" ? parsed.gatewayStartedAt : null,
        recentEventIds: [],
      };
    }

    return null;
  } catch {
    return null;
  }
}

export async function readNostrBusState(params: {
  accountId?: string;
  env?: NodeJS.ProcessEnv;
}): Promise<NostrBusState | null> {
  const filePath = resolveNostrStatePath(params.accountId, params.env);
  try {
    const raw = await fs.readFile(filePath, "utf-8");
    return safeParseState(raw);
  } catch (err) {
    const code = (err as { code?: string }).code;
    if (code === "ENOENT") return null;
    return null;
  }
}

export async function writeNostrBusState(params: {
  accountId?: string;
  lastProcessedAt: number;
  gatewayStartedAt: number;
  recentEventIds?: string[];
  env?: NodeJS.ProcessEnv;
}): Promise<void> {
  const filePath = resolveNostrStatePath(params.accountId, params.env);
  const dir = path.dirname(filePath);
  await fs.mkdir(dir, { recursive: true, mode: 0o700 });
  const tmp = path.join(
    dir,
    `${path.basename(filePath)}.${crypto.randomUUID()}.tmp`
  );
  const payload: NostrBusState = {
    version: STORE_VERSION,
    lastProcessedAt: params.lastProcessedAt,
    gatewayStartedAt: params.gatewayStartedAt,
    recentEventIds: (params.recentEventIds ?? []).filter((x): x is string => typeof x === "string"),
  };
  await fs.writeFile(tmp, `${JSON.stringify(payload, null, 2)}\n`, {
    encoding: "utf-8",
  });
  await fs.chmod(tmp, 0o600);
  await fs.rename(tmp, filePath);
}

/**
 * Determine the `since` timestamp for subscription.
 * Returns the later of: lastProcessedAt or gatewayStartedAt (both from disk),
 * falling back to `now` for fresh starts.
 */
export function computeSinceTimestamp(
  state: NostrBusState | null,
  nowSec: number = Math.floor(Date.now() / 1000)
): number {
  if (!state) return nowSec;

  // Use the most recent timestamp we have
  const candidates = [
    state.lastProcessedAt,
    state.gatewayStartedAt,
  ].filter((t): t is number => t !== null && t > 0);

  if (candidates.length === 0) return nowSec;
  return Math.max(...candidates);
}

// ============================================================================
// Profile State Management
// ============================================================================

function safeParseProfileState(raw: string): NostrProfileState | null {
  try {
    const parsed = JSON.parse(raw) as Partial<NostrProfileState>;

    if (parsed?.version === 1) {
      return {
        version: 1,
        lastPublishedAt:
          typeof parsed.lastPublishedAt === "number" ? parsed.lastPublishedAt : null,
        lastPublishedEventId:
          typeof parsed.lastPublishedEventId === "string" ? parsed.lastPublishedEventId : null,
        lastPublishResults:
          parsed.lastPublishResults && typeof parsed.lastPublishResults === "object"
            ? (parsed.lastPublishResults as Record<string, "ok" | "failed" | "timeout">)
            : null,
      };
    }

    return null;
  } catch {
    return null;
  }
}

export async function readNostrProfileState(params: {
  accountId?: string;
  env?: NodeJS.ProcessEnv;
}): Promise<NostrProfileState | null> {
  const filePath = resolveNostrProfileStatePath(params.accountId, params.env);
  try {
    const raw = await fs.readFile(filePath, "utf-8");
    return safeParseProfileState(raw);
  } catch (err) {
    const code = (err as { code?: string }).code;
    if (code === "ENOENT") return null;
    return null;
  }
}

export async function writeNostrProfileState(params: {
  accountId?: string;
  lastPublishedAt: number;
  lastPublishedEventId: string;
  lastPublishResults: Record<string, "ok" | "failed" | "timeout">;
  env?: NodeJS.ProcessEnv;
}): Promise<void> {
  const filePath = resolveNostrProfileStatePath(params.accountId, params.env);
  const dir = path.dirname(filePath);
  await fs.mkdir(dir, { recursive: true, mode: 0o700 });
  const tmp = path.join(
    dir,
    `${path.basename(filePath)}.${crypto.randomUUID()}.tmp`
  );
  const payload: NostrProfileState = {
    version: PROFILE_STATE_VERSION,
    lastPublishedAt: params.lastPublishedAt,
    lastPublishedEventId: params.lastPublishedEventId,
    lastPublishResults: params.lastPublishResults,
  };
  await fs.writeFile(tmp, `${JSON.stringify(payload, null, 2)}\n`, {
    encoding: "utf-8",
  });
  await fs.chmod(tmp, 0o600);
  await fs.rename(tmp, filePath);
}
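Both write paths above use the same durability trick: write the full payload to a uniquely named temp file in the same directory, then rename it into place, so a reader can never observe a half-written JSON file. A minimal synchronous sketch of that pattern (the store itself uses the async `node:fs/promises` API; the helper name here is illustrative):

```typescript
import crypto from "node:crypto";
import fs from "node:fs";
import os from "node:os";
import path from "node:path";

// Sketch: write-temp-then-rename so readers see either the old file or the
// new one, never a partial write. rename() is an atomic replace on POSIX
// filesystems when source and target are on the same filesystem.
function writeJsonAtomic(filePath: string, value: unknown): void {
  const dir = path.dirname(filePath);
  fs.mkdirSync(dir, { recursive: true });
  const tmp = path.join(dir, `${path.basename(filePath)}.${crypto.randomUUID()}.tmp`);
  fs.writeFileSync(tmp, `${JSON.stringify(value, null, 2)}\n`, "utf-8");
  fs.renameSync(tmp, filePath);
}

const target = path.join(fs.mkdtempSync(path.join(os.tmpdir(), "nostr-demo-")), "state.json");
writeJsonAtomic(target, { version: 2, lastProcessedAt: 1700000000 });
console.log(JSON.parse(fs.readFileSync(target, "utf-8")).lastProcessedAt); // 1700000000
```

The unique temp-file suffix also keeps two concurrent writers from clobbering each other's in-progress temp file, though last rename still wins.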
@@ -1,14 +0,0 @@
import type { PluginRuntime } from "clawdbot/plugin-sdk";

let runtime: PluginRuntime | null = null;

export function setNostrRuntime(next: PluginRuntime): void {
  runtime = next;
}

export function getNostrRuntime(): PluginRuntime {
  if (!runtime) {
    throw new Error("Nostr runtime not initialized");
  }
  return runtime;
}
@@ -1,271 +0,0 @@
|
||||
/**
|
||||
* LRU-based seen event tracker with TTL support.
|
||||
* Prevents unbounded memory growth under high load or abuse.
|
||||
*/
|
||||
|
||||
export interface SeenTrackerOptions {
|
||||
/** Maximum number of entries to track (default: 100,000) */
|
||||
maxEntries?: number;
|
||||
/** TTL in milliseconds (default: 1 hour) */
|
||||
ttlMs?: number;
|
||||
/** Prune interval in milliseconds (default: 10 minutes) */
|
||||
pruneIntervalMs?: number;
|
||||
}
|
||||
|
||||
export interface SeenTracker {
|
||||
/** Check if an ID has been seen (also marks it as seen if not) */
|
||||
has: (id: string) => boolean;
|
||||
/** Mark an ID as seen */
|
||||
add: (id: string) => void;
|
||||
/** Check if ID exists without marking */
|
||||
peek: (id: string) => boolean;
|
||||
/** Delete an ID */
|
||||
delete: (id: string) => void;
|
||||
/** Clear all entries */
|
||||
clear: () => void;
|
||||
/** Get current size */
|
||||
size: () => number;
|
||||
/** Stop the pruning timer */
|
||||
stop: () => void;
|
||||
/** Pre-seed with IDs (useful for restart recovery) */
|
||||
seed: (ids: string[]) => void;
|
||||
}
|
||||
|
||||
interface Entry {
|
||||
seenAt: number;
|
||||
// For LRU: track order via doubly-linked list
|
||||
prev: string | null;
|
||||
next: string | null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Create a new seen tracker with LRU eviction and TTL expiration.
|
||||
*/
|
||||
export function createSeenTracker(options?: SeenTrackerOptions): SeenTracker {
|
||||
const maxEntries = options?.maxEntries ?? 100_000;
|
||||
const ttlMs = options?.ttlMs ?? 60 * 60 * 1000; // 1 hour
|
||||
const pruneIntervalMs = options?.pruneIntervalMs ?? 10 * 60 * 1000; // 10 minutes
|
||||
|
||||
// Main storage
|
||||
const entries = new Map<string, Entry>();
|
||||
|
||||
// LRU tracking: head = most recent, tail = least recent
|
||||
let head: string | null = null;
|
||||
let tail: string | null = null;
|
||||
|
||||
// Move an entry to the front (most recently used)
|
||||
function moveToFront(id: string): void {
|
||||
const entry = entries.get(id);
|
||||
if (!entry) return;
|
||||
|
||||
// Already at front
|
||||
if (head === id) return;
|
||||
|
||||
// Remove from current position
|
||||
if (entry.prev) {
|
||||
const prevEntry = entries.get(entry.prev);
|
||||
if (prevEntry) prevEntry.next = entry.next;
|
||||
}
|
||||
if (entry.next) {
|
||||
const nextEntry = entries.get(entry.next);
|
||||
if (nextEntry) nextEntry.prev = entry.prev;
|
||||
}
|
||||
|
||||
// Update tail if this was the tail
|
||||
if (tail === id) {
|
||||
tail = entry.prev;
|
||||
}
|
||||
|
||||
// Move to front
|
||||
entry.prev = null;
|
||||
entry.next = head;
|
||||
if (head) {
|
||||
const headEntry = entries.get(head);
|
||||
if (headEntry) headEntry.prev = id;
|
||||
}
|
||||
head = id;
|
||||
|
||||
// If no tail, this is also the tail
|
||||
if (!tail) tail = id;
|
||||
}
|
||||
|
||||
// Remove an entry from the linked list
|
||||
function removeFromList(id: string): void {
|
||||
const entry = entries.get(id);
|
||||
if (!entry) return;
|
||||
|
||||
if (entry.prev) {
|
||||
const prevEntry = entries.get(entry.prev);
|
||||
if (prevEntry) prevEntry.next = entry.next;
|
||||
} else {
|
||||
head = entry.next;
|
||||
}
|
||||
|
||||
if (entry.next) {
|
||||
const nextEntry = entries.get(entry.next);
|
||||
if (nextEntry) nextEntry.prev = entry.prev;
|
||||
    } else {
      tail = entry.prev;
    }
  }

  // Evict the least recently used entry
  function evictLRU(): void {
    if (!tail) return;
    const idToEvict = tail;
    removeFromList(idToEvict);
    entries.delete(idToEvict);
  }

  // Prune expired entries
  function pruneExpired(): void {
    const now = Date.now();
    const toDelete: string[] = [];

    for (const [id, entry] of entries) {
      if (now - entry.seenAt > ttlMs) {
        toDelete.push(id);
      }
    }

    for (const id of toDelete) {
      removeFromList(id);
      entries.delete(id);
    }
  }

  // Start pruning timer
  let pruneTimer: ReturnType<typeof setInterval> | undefined;
  if (pruneIntervalMs > 0) {
    pruneTimer = setInterval(pruneExpired, pruneIntervalMs);
    // Don't keep process alive just for pruning
    if (pruneTimer.unref) pruneTimer.unref();
  }

  function add(id: string): void {
    const now = Date.now();

    // If already exists, update and move to front
    const existing = entries.get(id);
    if (existing) {
      existing.seenAt = now;
      moveToFront(id);
      return;
    }

    // Evict if at capacity
    while (entries.size >= maxEntries) {
      evictLRU();
    }

    // Add new entry at front
    const newEntry: Entry = {
      seenAt: now,
      prev: null,
      next: head,
    };

    if (head) {
      const headEntry = entries.get(head);
      if (headEntry) headEntry.prev = id;
    }

    entries.set(id, newEntry);
    head = id;
    if (!tail) tail = id;
  }

  function has(id: string): boolean {
    const entry = entries.get(id);
    if (!entry) {
      add(id);
      return false;
    }

    // Check if expired
    if (Date.now() - entry.seenAt > ttlMs) {
      removeFromList(id);
      entries.delete(id);
      add(id);
      return false;
    }

    // Mark as recently used
    entry.seenAt = Date.now();
    moveToFront(id);
    return true;
  }

  function peek(id: string): boolean {
    const entry = entries.get(id);
    if (!entry) return false;

    // Check if expired
    if (Date.now() - entry.seenAt > ttlMs) {
      removeFromList(id);
      entries.delete(id);
      return false;
    }

    return true;
  }

  function deleteEntry(id: string): void {
    if (entries.has(id)) {
      removeFromList(id);
      entries.delete(id);
    }
  }

  function clear(): void {
    entries.clear();
    head = null;
    tail = null;
  }

  function size(): number {
    return entries.size;
  }

  function stop(): void {
    if (pruneTimer) {
      clearInterval(pruneTimer);
      pruneTimer = undefined;
    }
  }

  function seed(ids: string[]): void {
    const now = Date.now();
    // Seed in reverse order so first IDs end up at front
    for (let i = ids.length - 1; i >= 0; i--) {
      const id = ids[i];
      if (!entries.has(id) && entries.size < maxEntries) {
        const newEntry: Entry = {
          seenAt: now,
          prev: null,
          next: head,
        };

        if (head) {
          const headEntry = entries.get(head);
          if (headEntry) headEntry.prev = id;
        }

        entries.set(id, newEntry);
        head = id;
        if (!tail) tail = id;
      }
    }
  }

  return {
    has,
    add,
    peek,
    delete: deleteEntry,
    clear,
    size,
    stop,
    seed,
  };
}
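The cache above combines LRU eviction (the `head`/`tail` linked list) with TTL expiry, and `has` is deliberately check-and-mark: an unseen id is recorded as a side effect, while `peek` only inspects. A minimal self-contained sketch of the same behavior, using a `Map`'s insertion order in place of the linked list (names like `createSeenCache` are illustrative, not the repo's actual export):

```typescript
type SeenCache = {
  has(id: string): boolean; // check-and-mark: records the id if unseen/expired
  peek(id: string): boolean; // check without recording
  size(): number;
};

function createSeenCache(
  maxEntries: number,
  ttlMs: number,
  now: () => number = Date.now,
): SeenCache {
  // Map iteration order is insertion order, so re-inserting a key on each
  // touch makes the first key the least recently used one.
  const entries = new Map<string, number>();
  const touch = (id: string, seenAt: number) => {
    entries.delete(id);
    entries.set(id, seenAt);
  };
  return {
    has(id) {
      const seenAt = entries.get(id);
      const fresh = seenAt !== undefined && now() - seenAt <= ttlMs;
      // Evict the least recently used entry when adding a brand-new id at capacity
      if (!fresh && !entries.has(id) && entries.size >= maxEntries) {
        const oldest = entries.keys().next().value;
        if (oldest !== undefined) entries.delete(oldest);
      }
      touch(id, now());
      return fresh;
    },
    peek(id) {
      const seenAt = entries.get(id);
      return seenAt !== undefined && now() - seenAt <= ttlMs;
    },
    size: () => entries.size,
  };
}
```

With `maxEntries = 2`, calling `has("a")`, `has("a")`, `has("b")`, `has("c")` returns false, true, false, false, after which `peek("a")` is false because `"a"` was the LRU entry evicted to make room for `"c"`.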
@@ -1,161 +0,0 @@
import { describe, expect, it } from "vitest";
import {
  listNostrAccountIds,
  resolveDefaultNostrAccountId,
  resolveNostrAccount,
} from "./types.js";

const TEST_PRIVATE_KEY = "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef";

describe("listNostrAccountIds", () => {
  it("returns empty array when not configured", () => {
    const cfg = { channels: {} };
    expect(listNostrAccountIds(cfg)).toEqual([]);
  });

  it("returns empty array when nostr section exists but no privateKey", () => {
    const cfg = { channels: { nostr: { enabled: true } } };
    expect(listNostrAccountIds(cfg)).toEqual([]);
  });

  it("returns default when privateKey is configured", () => {
    const cfg = {
      channels: {
        nostr: { privateKey: TEST_PRIVATE_KEY },
      },
    };
    expect(listNostrAccountIds(cfg)).toEqual(["default"]);
  });
});

describe("resolveDefaultNostrAccountId", () => {
  it("returns default when configured", () => {
    const cfg = {
      channels: {
        nostr: { privateKey: TEST_PRIVATE_KEY },
      },
    };
    expect(resolveDefaultNostrAccountId(cfg)).toBe("default");
  });

  it("returns default when not configured", () => {
    const cfg = { channels: {} };
    expect(resolveDefaultNostrAccountId(cfg)).toBe("default");
  });
});

describe("resolveNostrAccount", () => {
  it("resolves configured account", () => {
    const cfg = {
      channels: {
        nostr: {
          privateKey: TEST_PRIVATE_KEY,
          name: "Test Bot",
          relays: ["wss://test.relay"],
          dmPolicy: "pairing" as const,
        },
      },
    };
    const account = resolveNostrAccount({ cfg });

    expect(account.accountId).toBe("default");
    expect(account.name).toBe("Test Bot");
    expect(account.enabled).toBe(true);
    expect(account.configured).toBe(true);
    expect(account.privateKey).toBe(TEST_PRIVATE_KEY);
    expect(account.publicKey).toMatch(/^[0-9a-f]{64}$/);
    expect(account.relays).toEqual(["wss://test.relay"]);
  });

  it("resolves unconfigured account with defaults", () => {
    const cfg = { channels: {} };
    const account = resolveNostrAccount({ cfg });

    expect(account.accountId).toBe("default");
    expect(account.enabled).toBe(true);
    expect(account.configured).toBe(false);
    expect(account.privateKey).toBe("");
    expect(account.publicKey).toBe("");
    expect(account.relays).toContain("wss://relay.damus.io");
    expect(account.relays).toContain("wss://nos.lol");
  });

  it("handles disabled channel", () => {
    const cfg = {
      channels: {
        nostr: {
          enabled: false,
          privateKey: TEST_PRIVATE_KEY,
        },
      },
    };
    const account = resolveNostrAccount({ cfg });

    expect(account.enabled).toBe(false);
    expect(account.configured).toBe(true);
  });

  it("handles custom accountId parameter", () => {
    const cfg = {
      channels: {
        nostr: { privateKey: TEST_PRIVATE_KEY },
      },
    };
    const account = resolveNostrAccount({ cfg, accountId: "custom" });

    expect(account.accountId).toBe("custom");
  });

  it("handles allowFrom config", () => {
    const cfg = {
      channels: {
        nostr: {
          privateKey: TEST_PRIVATE_KEY,
          allowFrom: ["npub1test", "0123456789abcdef"],
        },
      },
    };
    const account = resolveNostrAccount({ cfg });

    expect(account.config.allowFrom).toEqual(["npub1test", "0123456789abcdef"]);
  });

  it("handles invalid private key gracefully", () => {
    const cfg = {
      channels: {
        nostr: {
          privateKey: "invalid-key",
        },
      },
    };
    const account = resolveNostrAccount({ cfg });

    expect(account.configured).toBe(true); // key is present
    expect(account.publicKey).toBe(""); // but can't derive pubkey
  });

  it("preserves all config options", () => {
    const cfg = {
      channels: {
        nostr: {
          privateKey: TEST_PRIVATE_KEY,
          name: "Bot",
          enabled: true,
          relays: ["wss://relay1", "wss://relay2"],
          dmPolicy: "allowlist" as const,
          allowFrom: ["pubkey1", "pubkey2"],
        },
      },
    };
    const account = resolveNostrAccount({ cfg });

    expect(account.config).toEqual({
      privateKey: TEST_PRIVATE_KEY,
      name: "Bot",
      enabled: true,
      relays: ["wss://relay1", "wss://relay2"],
      dmPolicy: "allowlist",
      allowFrom: ["pubkey1", "pubkey2"],
    });
  });
});
@@ -1,99 +0,0 @@
import type { ClawdbotConfig } from "clawdbot/plugin-sdk";
import { getPublicKeyFromPrivate } from "./nostr-bus.js";
import { DEFAULT_RELAYS } from "./nostr-bus.js";
import type { NostrProfile } from "./config-schema.js";

export interface NostrAccountConfig {
  enabled?: boolean;
  name?: string;
  privateKey?: string;
  relays?: string[];
  dmPolicy?: "pairing" | "allowlist" | "open" | "disabled";
  allowFrom?: Array<string | number>;
  profile?: NostrProfile;
}

export interface ResolvedNostrAccount {
  accountId: string;
  name?: string;
  enabled: boolean;
  configured: boolean;
  privateKey: string;
  publicKey: string;
  relays: string[];
  profile?: NostrProfile;
  config: NostrAccountConfig;
}

const DEFAULT_ACCOUNT_ID = "default";

/**
 * List all configured Nostr account IDs
 */
export function listNostrAccountIds(cfg: ClawdbotConfig): string[] {
  const nostrCfg = (cfg.channels as Record<string, unknown> | undefined)?.nostr as
    | NostrAccountConfig
    | undefined;

  // If privateKey is configured at top level, we have a default account
  if (nostrCfg?.privateKey) {
    return [DEFAULT_ACCOUNT_ID];
  }

  return [];
}

/**
 * Get the default account ID
 */
export function resolveDefaultNostrAccountId(cfg: ClawdbotConfig): string {
  const ids = listNostrAccountIds(cfg);
  if (ids.includes(DEFAULT_ACCOUNT_ID)) return DEFAULT_ACCOUNT_ID;
  return ids[0] ?? DEFAULT_ACCOUNT_ID;
}

/**
 * Resolve a Nostr account from config
 */
export function resolveNostrAccount(opts: {
  cfg: ClawdbotConfig;
  accountId?: string | null;
}): ResolvedNostrAccount {
  const accountId = opts.accountId ?? DEFAULT_ACCOUNT_ID;
  const nostrCfg = (opts.cfg.channels as Record<string, unknown> | undefined)?.nostr as
    | NostrAccountConfig
    | undefined;

  const baseEnabled = nostrCfg?.enabled !== false;
  const privateKey = nostrCfg?.privateKey ?? "";
  const configured = Boolean(privateKey.trim());

  let publicKey = "";
  if (configured) {
    try {
      publicKey = getPublicKeyFromPrivate(privateKey);
    } catch {
      // Invalid key - leave publicKey empty, configured will indicate issues
    }
  }

  return {
    accountId,
    name: nostrCfg?.name?.trim() || undefined,
    enabled: baseEnabled,
    configured,
    privateKey,
    publicKey,
    relays: nostrCfg?.relays ?? DEFAULT_RELAYS,
    profile: nostrCfg?.profile,
    config: {
      enabled: nostrCfg?.enabled,
      name: nostrCfg?.name,
      privateKey: nostrCfg?.privateKey,
      relays: nostrCfg?.relays,
      dmPolicy: nostrCfg?.dmPolicy,
      allowFrom: nostrCfg?.allowFrom,
      profile: nostrCfg?.profile,
    },
  };
}
@@ -1,5 +0,0 @@
// Test setup file for nostr extension
import { vi } from "vitest";

// Mock console.error to suppress noise in tests
vi.spyOn(console, "error").mockImplementation(() => {});
@@ -210,7 +210,6 @@
    "@types/proper-lockfile": "^4.1.4",
    "@types/qrcode-terminal": "^0.12.2",
    "@types/ws": "^8.18.1",
    "@typescript/native-preview": "7.0.0-dev.20260120.1",
    "@vitest/coverage-v8": "^4.0.17",
    "docx-preview": "^0.3.7",
    "lit": "^3.3.2",
@@ -232,7 +231,7 @@
  "overrides": {
    "@sinclair/typebox": "0.34.47",
    "hono": "4.11.4",
    "tar": "7.5.4"
    "tar": "7.5.3"
  },
  "patchedDependencies": {
    "@mariozechner/pi-ai@0.49.2": "patches/@mariozechner__pi-ai@0.49.2.patch"
pnpm-lock.yaml (generated, 187 lines changed)
@@ -7,7 +7,7 @@ settings:
overrides:
  '@sinclair/typebox': 0.34.47
  hono: 4.11.4
  tar: 7.5.4
  tar: 7.5.3

patchedDependencies:
  '@mariozechner/pi-ai@0.49.2':
@@ -151,8 +151,8 @@ importers:
      specifier: 0.1.7-alpha.2
      version: 0.1.7-alpha.2
    tar:
      specifier: 7.5.4
      version: 7.5.4
      specifier: 7.5.3
      version: 7.5.3
    tslog:
      specifier: ^4.10.2
      version: 4.10.2
@@ -202,9 +202,6 @@ importers:
    '@types/ws':
      specifier: ^8.18.1
      version: 8.18.1
    '@typescript/native-preview':
      specifier: 7.0.0-dev.20260120.1
      version: 7.0.0-dev.20260120.1
    '@vitest/coverage-v8':
      specifier: ^4.0.17
      version: 4.0.17(@vitest/browser@4.0.17(vite@7.3.1(@types/node@25.0.9)(jiti@2.6.1)(lightningcss@1.30.2)(tsx@4.21.0)(yaml@2.8.2))(vitest@4.0.17))(vitest@4.0.17)
@@ -267,12 +264,6 @@ importers:
    '@opentelemetry/api':
      specifier: ^1.9.0
      version: 1.9.0
    '@opentelemetry/api-logs':
      specifier: ^0.210.0
      version: 0.210.0
    '@opentelemetry/exporter-logs-otlp-http':
      specifier: ^0.210.0
      version: 0.210.0(@opentelemetry/api@1.9.0)
    '@opentelemetry/exporter-metrics-otlp-http':
      specifier: ^0.210.0
      version: 0.210.0(@opentelemetry/api@1.9.0)
@@ -282,9 +273,6 @@ importers:
    '@opentelemetry/resources':
      specifier: ^2.4.0
      version: 2.4.0(@opentelemetry/api@1.9.0)
    '@opentelemetry/sdk-logs':
      specifier: ^0.210.0
      version: 0.210.0(@opentelemetry/api@1.9.0)
    '@opentelemetry/sdk-metrics':
      specifier: ^2.4.0
      version: 2.4.0(@opentelemetry/api@1.9.0)
@@ -365,18 +353,6 @@ importers:

  extensions/nextcloud-talk: {}

  extensions/nostr:
    dependencies:
      clawdbot:
        specifier: workspace:*
        version: link:../..
      nostr-tools:
        specifier: ^2.10.4
        version: 2.19.4(typescript@5.9.3)
      zod:
        specifier: ^4.3.5
        version: 4.3.5

  extensions/signal: {}

  extensions/slack: {}
@@ -1357,26 +1333,9 @@ packages:
  '@napi-rs/wasm-runtime@1.1.1':
    resolution: {integrity: sha512-p64ah1M1ld8xjWv3qbvFwHiFVWrq1yFvV4f7w+mzaqiR4IlSgkqhcRdHwsGgomwzBH51sRY4NEowLxnaBjcW/A==}

  '@noble/ciphers@0.5.3':
    resolution: {integrity: sha512-B0+6IIHiqEs3BPMT0hcRmHvEj2QHOLu+uwt+tqDDeVd0oyVzh7BPrDcPjRnV1PV/5LaknXJJQvOuRGR0zQJz+w==}

  '@noble/curves@1.1.0':
    resolution: {integrity: sha512-091oBExgENk/kGj3AZmtBDMpxQPDtxQABR2B9lb1JbVTs6ytdzZNwvhxQ4MWasRNEzlbEH8jCWFCwhF/Obj5AA==}

  '@noble/curves@1.2.0':
    resolution: {integrity: sha512-oYclrNgRaM9SsBUBVbb8M6DTV7ZHRTKugureoYEncY5c65HOmRzvSiTE3y5CYaPYJA/GVkrhXEoF0M3Ya9PMnw==}

  '@noble/ed25519@3.0.0':
    resolution: {integrity: sha512-QyteqMNm0GLqfa5SoYbSC3+Pvykwpn95Zgth4MFVSMKBB75ELl9tX1LAVsN4c3HXOrakHsF2gL4zWDAYCcsnzg==}

  '@noble/hashes@1.3.1':
    resolution: {integrity: sha512-EbqwksQwz9xDRGfDST86whPBgM65E0OH/pCgqW0GBVzO22bNE+NuIbeTb714+IfSjU3aRk47EUvXIb5bTsenKA==}
    engines: {node: '>= 16'}

  '@noble/hashes@1.3.2':
    resolution: {integrity: sha512-MVC8EAQp7MvEcm30KWENFjgR+Mkmf+D189XJTkFIlwohU5hcBbn1ZkKq7KVTi2Hme3PMGF390DaL52beVrIihQ==}
    engines: {node: '>= 16'}

  '@node-llama-cpp/linux-arm64@3.15.0':
    resolution: {integrity: sha512-IaHIllWlj6tGjhhCtyp1w6xA7AHaGJiVaXAZ+78hDs8X1SL9ySBN2Qceju8AQJALePtynbAfjgjTqjQ7Hyk+IQ==}
    engines: {node: '>=20.0.0'}
@@ -2155,15 +2114,6 @@ packages:
    cpu: [x64]
    os: [win32]

  '@scure/base@1.1.1':
    resolution: {integrity: sha512-ZxOhsSyxYwLJj3pLZCefNitxsj093tb2vq90mp2txoYeBqbcjDjqFhyM8eUjq/uFm6zJ+mUuqxlS2FkuSY1MTA==}

  '@scure/bip32@1.3.1':
    resolution: {integrity: sha512-osvveYtyzdEVbt3OfwwXFr4P2iVBL5u1Q3q4ONBfDY/UpOuXmOlbgwc1xECEboY8wIays8Yt6onaWMUdUbfl0A==}

  '@scure/bip39@1.2.1':
    resolution: {integrity: sha512-Z3/Fsz1yr904dduJD0NpiyRHhRYHdcnyh73FZWiV+/qhWi83wNJ3NWolYqCEN+ZWsUz2TWwajJggcRE9r1zUYg==}

  '@selderee/plugin-htmlparser2@0.11.0':
    resolution: {integrity: sha512-P33hHGdldxGabLFjPPpaTxVolMrzrcegejx+0GxjrIb9Zv48D8yAIA/QTDR2dFl7Uz7urX8aX6+5bCZslr+gWQ==}

@@ -2533,45 +2483,6 @@ packages:
  '@types/ws@8.18.1':
    resolution: {integrity: sha512-ThVF6DCVhA8kUGy+aazFQ4kXQ7E1Ty7A3ypFOe0IcJV8O/M511G99AW24irKrW56Wt44yG9+ij8FaqoBGkuBXg==}

  '@typescript/native-preview-darwin-arm64@7.0.0-dev.20260120.1':
    resolution: {integrity: sha512-r3pWFuR2H7mn6ScwpH5jJljKQqKto0npVuJSk6pRwFwexpTyxOGmJTZJ1V0AWiisaNxU2+CNAqWFJSJYIE/QTg==}
    cpu: [arm64]
    os: [darwin]

  '@typescript/native-preview-darwin-x64@7.0.0-dev.20260120.1':
    resolution: {integrity: sha512-cuC1+wLbUP+Ip2UT94G134fqRdp5w3b3dhcCO6/FQ4yXxvRNyv/WK+upHBUFDaeSOeHgDTyO9/QFYUWwC4If1A==}
    cpu: [x64]
    os: [darwin]

  '@typescript/native-preview-linux-arm64@7.0.0-dev.20260120.1':
    resolution: {integrity: sha512-zZGvEGY7wcHYefMZ87KNmvjN3NLIhsCMHEpHZiGCS3khKf+8z6ZsanrzCjOTodvL01VPyBzHxV1EtkSxAcLiQg==}
    cpu: [arm64]
    os: [linux]

  '@typescript/native-preview-linux-arm@7.0.0-dev.20260120.1':
    resolution: {integrity: sha512-vN6OYVySol/kQZjJGmAzd6L30SyVlCgmCXS8WjUYtE5clN0YrzQHop16RK29fYZHMxpkOniVBtRPxUYQANZBlQ==}
    cpu: [arm]
    os: [linux]

  '@typescript/native-preview-linux-x64@7.0.0-dev.20260120.1':
    resolution: {integrity: sha512-JBfNhWd/asd5MDeS3VgRvE24pGKBkmvLub6tsux6ypr+Yhy+o0WaAEzVpmlRYZUqss2ai5tvOu4dzPBXzZAtFw==}
    cpu: [x64]
    os: [linux]

  '@typescript/native-preview-win32-arm64@7.0.0-dev.20260120.1':
    resolution: {integrity: sha512-tTndRtYCq2xwgE0VkTi9ACNiJaV43+PqvBqCxk8ceYi3X36Ve+CCnwlZfZJ4k9NxZthtrAwF/kUmpC9iIYbq1w==}
    cpu: [arm64]
    os: [win32]

  '@typescript/native-preview-win32-x64@7.0.0-dev.20260120.1':
    resolution: {integrity: sha512-oZia7hFL6k9pVepfonuPI86Jmyz6WlJKR57tWCDwRNmpA7odxuTq1PbvcYgy1z4+wHF1nnKKJY0PMAiq6ac18w==}
    cpu: [x64]
    os: [win32]

  '@typescript/native-preview@7.0.0-dev.20260120.1':
    resolution: {integrity: sha512-nnEf37C9ue7OBRnF2zmV/OCBmV5Y7T/K4mCHa+nxgiXcF/1w8sA0cgdFl+gHQ0mysqUJ+Bu5btAMeWgpLyjrgg==}
    hasBin: true

  '@typespec/ts-http-runtime@0.3.2':
    resolution: {integrity: sha512-IlqQ/Gv22xUC1r/WQm4StLkYQmaaTsXAhUVsNE0+xiyf0yRFiH5++q78U3bw6bLKDCTmh0uqKB9eG9+Bt75Dkg==}
    engines: {node: '>=20.0.0'}
@@ -4164,17 +4075,6 @@ packages:
    resolution: {integrity: sha512-6eZs5Ls3WtCisHWp9S2GUy8dqkpGi4BVSz3GaqiE6ezub0512ESztXUwUB6C6IKbQkY2Pnb/mD4WYojCRwcwLA==}
    engines: {node: '>=0.10.0'}

  nostr-tools@2.19.4:
    resolution: {integrity: sha512-qVLfoTpZegNYRJo5j+Oi6RPu0AwLP6jcvzcB3ySMnIT5DrAGNXfs5HNBspB/2HiGfH3GY+v6yXkTtcKSBQZwSg==}
    peerDependencies:
      typescript: '>=5.0.0'
    peerDependenciesMeta:
      typescript:
        optional: true

  nostr-wasm@0.1.0:
    resolution: {integrity: sha512-78BTryCLcLYv96ONU8Ws3Q1JzjlAt+43pWQhIl86xZmWeegYCNLPml7yQ+gG3vR6V5h4XGj+TxO+SS5dsThQIA==}

  npmlog@6.0.2:
    resolution: {integrity: sha512-/vBvz5Jfr9dT/aFWd0FIRf+T/Q2WBsLENygUaFUqstqsycmZAP/t5BvFJTK0viFmSUxiUKTUplWy5vt+rvKIxg==}
    engines: {node: ^12.13.0 || ^14.15.0 || >=16.0.0}
@@ -4905,9 +4805,10 @@ packages:
  tailwindcss@4.1.17:
    resolution: {integrity: sha512-j9Ee2YjuQqYT9bbRTfTZht9W/ytp5H+jJpZKiYdP/bpnXARAuELt9ofP0lPnmHjbga7SNQIxdTAXCmtKVYjN+Q==}

  tar@7.5.4:
    resolution: {integrity: sha512-AN04xbWGrSTDmVwlI4/GTlIIwMFk/XEv7uL8aa57zuvRy6s4hdBed+lVq2fAZ89XDa7Us3ANXcE3Tvqvja1kTA==}
  tar@7.5.3:
    resolution: {integrity: sha512-ENg5JUHUm2rDD7IvKNFGzyElLXNjachNLp6RaGf4+JOgxXHkqA+gq81ZAMCUmtMtqBsoU62lcp6S27g1LCYGGQ==}
    engines: {node: '>=18'}
    deprecated: Old versions of tar are not supported, and contain widely publicized security vulnerabilities, which have been fixed in the current version. Please update. Support for old versions may be purchased (at exhorbitant rates) by contacting i@izs.me

  thenify-all@1.6.0:
    resolution: {integrity: sha512-RNxQH/qI8/t3thXJDwcstUO4zeqo64+Uy/+sNVRBx4Xn2OX+OZ9oP+iJnNFqplFra2ZUVeKCSa2oVWi3T4uVmA==}
@@ -6455,22 +6356,8 @@ snapshots:
      '@tybys/wasm-util': 0.10.1
    optional: true

  '@noble/ciphers@0.5.3': {}

  '@noble/curves@1.1.0':
    dependencies:
      '@noble/hashes': 1.3.1

  '@noble/curves@1.2.0':
    dependencies:
      '@noble/hashes': 1.3.2

  '@noble/ed25519@3.0.0': {}

  '@noble/hashes@1.3.1': {}

  '@noble/hashes@1.3.2': {}

  '@node-llama-cpp/linux-arm64@3.15.0':
    optional: true

@@ -7183,19 +7070,6 @@ snapshots:
  '@rollup/rollup-win32-x64-msvc@4.55.2':
    optional: true

  '@scure/base@1.1.1': {}

  '@scure/bip32@1.3.1':
    dependencies:
      '@noble/curves': 1.1.0
      '@noble/hashes': 1.3.1
      '@scure/base': 1.1.1

  '@scure/bip39@1.2.1':
    dependencies:
      '@noble/hashes': 1.3.1
      '@scure/base': 1.1.1

  '@selderee/plugin-htmlparser2@0.11.0':
    dependencies:
      domhandler: 5.0.3
@@ -7740,37 +7614,6 @@ snapshots:
    dependencies:
      '@types/node': 25.0.9

  '@typescript/native-preview-darwin-arm64@7.0.0-dev.20260120.1':
    optional: true

  '@typescript/native-preview-darwin-x64@7.0.0-dev.20260120.1':
    optional: true

  '@typescript/native-preview-linux-arm64@7.0.0-dev.20260120.1':
    optional: true

  '@typescript/native-preview-linux-arm@7.0.0-dev.20260120.1':
    optional: true

  '@typescript/native-preview-linux-x64@7.0.0-dev.20260120.1':
    optional: true

  '@typescript/native-preview-win32-arm64@7.0.0-dev.20260120.1':
    optional: true

  '@typescript/native-preview-win32-x64@7.0.0-dev.20260120.1':
    optional: true

  '@typescript/native-preview@7.0.0-dev.20260120.1':
    optionalDependencies:
      '@typescript/native-preview-darwin-arm64': 7.0.0-dev.20260120.1
      '@typescript/native-preview-darwin-x64': 7.0.0-dev.20260120.1
      '@typescript/native-preview-linux-arm': 7.0.0-dev.20260120.1
      '@typescript/native-preview-linux-arm64': 7.0.0-dev.20260120.1
      '@typescript/native-preview-linux-x64': 7.0.0-dev.20260120.1
      '@typescript/native-preview-win32-arm64': 7.0.0-dev.20260120.1
      '@typescript/native-preview-win32-x64': 7.0.0-dev.20260120.1

  '@typespec/ts-http-runtime@0.3.2':
    dependencies:
      http-proxy-agent: 7.0.2
@@ -8268,7 +8111,7 @@ snapshots:
      npmlog: 6.0.2
      rc: 1.2.8
      semver: 7.7.3
      tar: 7.5.4
      tar: 7.5.3
      url-join: 4.0.1
      which: 2.0.2
      yargs: 17.7.2
@@ -9586,20 +9429,6 @@ snapshots:

  normalize-path@3.0.0: {}

  nostr-tools@2.19.4(typescript@5.9.3):
    dependencies:
      '@noble/ciphers': 0.5.3
      '@noble/curves': 1.2.0
      '@noble/hashes': 1.3.1
      '@scure/base': 1.1.1
      '@scure/bip32': 1.3.1
      '@scure/bip39': 1.2.1
      nostr-wasm: 0.1.0
    optionalDependencies:
      typescript: 5.9.3

  nostr-wasm@0.1.0: {}

  npmlog@6.0.2:
    dependencies:
      are-we-there-yet: 3.0.1
@@ -10510,7 +10339,7 @@ snapshots:

  tailwindcss@4.1.17: {}

  tar@7.5.4:
  tar@7.5.3:
    dependencies:
      '@isaacs/fs-minipass': 4.0.1
      chownr: 3.0.0
@@ -7,8 +7,6 @@ import process from "node:process";
const args = process.argv.slice(2);
const env = { ...process.env };
const cwd = process.cwd();
const compiler = env.CLAWDBOT_TS_COMPILER === "tsc" ? "tsc" : "tsgo";
const projectArgs = ["--project", "tsconfig.json"];

const distRoot = path.join(cwd, "dist");
const distEntry = path.join(distRoot, "entry.js");
@@ -112,10 +110,11 @@ const writeBuildStamp = () => {
};

if (!shouldBuild()) {
  logRunner("Skipping build; dist is fresh.");
  runNode();
} else {
  logRunner("Building TypeScript (dist is stale).");
  const build = spawn("pnpm", ["exec", compiler, ...projectArgs], {
  const build = spawn("pnpm", ["exec", "tsc", "-p", "tsconfig.json"], {
    cwd,
    env,
    stdio: "inherit",
@@ -5,10 +5,8 @@ import process from "node:process";
const args = process.argv.slice(2);
const env = { ...process.env };
const cwd = process.cwd();
const compiler = env.CLAWDBOT_TS_COMPILER === "tsc" ? "tsc" : "tsgo";
const projectArgs = ["--project", "tsconfig.json"];

const initialBuild = spawnSync("pnpm", ["exec", compiler, ...projectArgs], {
const initialBuild = spawnSync("pnpm", ["exec", "tsc", "-p", "tsconfig.json"], {
  cwd,
  env,
  stdio: "inherit",
@@ -18,12 +16,7 @@ if (initialBuild.status !== 0) {
  process.exit(initialBuild.status ?? 1);
}

const watchArgs =
  compiler === "tsc"
    ? [...projectArgs, "--watch", "--preserveWatchOutput"]
    : [...projectArgs, "--watch"];

const compilerProcess = spawn("pnpm", ["exec", compiler, ...watchArgs], {
const tsc = spawn("pnpm", ["exec", "tsc", "--watch", "--preserveWatchOutput"], {
  cwd,
  env,
  stdio: "inherit",
@@ -41,14 +34,14 @@ function cleanup(code = 0) {
  if (exiting) return;
  exiting = true;
  nodeProcess.kill("SIGTERM");
  compilerProcess.kill("SIGTERM");
  tsc.kill("SIGTERM");
  process.exit(code);
}

process.on("SIGINT", () => cleanup(130));
process.on("SIGTERM", () => cleanup(143));

compilerProcess.on("exit", (code) => {
tsc.on("exit", (code) => {
  if (exiting) return;
  cleanup(code ?? 1);
});
@@ -1,49 +0,0 @@
---
name: sherpa-onnx-tts
description: Local text-to-speech via sherpa-onnx (offline, no cloud)
metadata: {"clawdbot":{"emoji":"🗣️","os":["darwin","linux","win32"],"requires":{"env":["SHERPA_ONNX_RUNTIME_DIR","SHERPA_ONNX_MODEL_DIR"]},"install":[{"id":"download-runtime-macos","kind":"download","os":["darwin"],"url":"https://github.com/k2-fsa/sherpa-onnx/releases/download/v1.12.23/sherpa-onnx-v1.12.23-osx-universal2-shared.tar.bz2","archive":"tar.bz2","extract":true,"stripComponents":1,"targetDir":"~/.clawdbot/tools/sherpa-onnx-tts/runtime","label":"Download sherpa-onnx runtime (macOS)"},{"id":"download-runtime-linux-x64","kind":"download","os":["linux"],"url":"https://github.com/k2-fsa/sherpa-onnx/releases/download/v1.12.23/sherpa-onnx-v1.12.23-linux-x64-shared.tar.bz2","archive":"tar.bz2","extract":true,"stripComponents":1,"targetDir":"~/.clawdbot/tools/sherpa-onnx-tts/runtime","label":"Download sherpa-onnx runtime (Linux x64)"},{"id":"download-runtime-win-x64","kind":"download","os":["win32"],"url":"https://github.com/k2-fsa/sherpa-onnx/releases/download/v1.12.23/sherpa-onnx-v1.12.23-win-x64-shared.tar.bz2","archive":"tar.bz2","extract":true,"stripComponents":1,"targetDir":"~/.clawdbot/tools/sherpa-onnx-tts/runtime","label":"Download sherpa-onnx runtime (Windows x64)"},{"id":"download-model-lessac","kind":"download","url":"https://github.com/k2-fsa/sherpa-onnx/releases/download/tts-models/vits-piper-en_US-lessac-high.tar.bz2","archive":"tar.bz2","extract":true,"targetDir":"~/.clawdbot/tools/sherpa-onnx-tts/models","label":"Download Piper en_US lessac (high)"}]}}
---

# sherpa-onnx-tts

Local TTS using the sherpa-onnx offline CLI.

## Install

1) Download the runtime for your OS (extracts into `~/.clawdbot/tools/sherpa-onnx-tts/runtime`)
2) Download a voice model (extracts into `~/.clawdbot/tools/sherpa-onnx-tts/models`)

Update `~/.clawdbot/clawdbot.json`:

```json5
{
  skills: {
    entries: {
      "sherpa-onnx-tts": {
        env: {
          SHERPA_ONNX_RUNTIME_DIR: "~/.clawdbot/tools/sherpa-onnx-tts/runtime",
          SHERPA_ONNX_MODEL_DIR: "~/.clawdbot/tools/sherpa-onnx-tts/models/vits-piper-en_US-lessac-high"
        }
      }
    }
  }
}
```

The wrapper lives in this skill folder. Run it directly, or add the wrapper to PATH:

```bash
export PATH="{baseDir}/bin:$PATH"
```

## Usage

```bash
{baseDir}/bin/sherpa-onnx-tts -o ./tts.wav "Hello from local TTS."
```

Notes:
- Pick a different model from the sherpa-onnx `tts-models` release if you want another voice.
- If the model dir has multiple `.onnx` files, set `SHERPA_ONNX_MODEL_FILE` or pass `--model-file`.
- You can also pass `--tokens-file` or `--data-dir` to override the defaults.
- Windows: run `node {baseDir}\\bin\\sherpa-onnx-tts -o tts.wav "Hello from local TTS."`
@@ -1,178 +0,0 @@
|
||||
#!/usr/bin/env node
|
||||
|
||||
const fs = require("node:fs");
|
||||
const path = require("node:path");
|
||||
const { spawnSync } = require("node:child_process");
|
||||
|
||||
function usage(message) {
|
||||
if (message) {
|
||||
console.error(message);
|
||||
}
|
||||
console.error(
|
||||
"\nUsage: sherpa-onnx-tts [--runtime-dir <dir>] [--model-dir <dir>] [--model-file <file>] [--tokens-file <file>] [--data-dir <dir>] [--output <file>] \"text\"",
|
||||
);
|
||||
console.error("\nRequired env (or flags):\n SHERPA_ONNX_RUNTIME_DIR\n SHERPA_ONNX_MODEL_DIR");
|
||||
process.exit(1);
|
||||
}
|
||||
|
||||
function resolveRuntimeDir(explicit) {
|
||||
const value = explicit || process.env.SHERPA_ONNX_RUNTIME_DIR || "";
|
||||
return value.trim();
|
||||
}
|
||||
|
||||
function resolveModelDir(explicit) {
|
||||
const value = explicit || process.env.SHERPA_ONNX_MODEL_DIR || "";
|
||||
return value.trim();
|
||||
}
|
||||
|
||||
function resolveModelFile(modelDir, explicitFlag) {
|
||||
const explicit = (explicitFlag || process.env.SHERPA_ONNX_MODEL_FILE || "").trim();
|
||||
if (explicit) return explicit;
|
||||
try {
|
||||
const candidates = fs
|
||||
.readdirSync(modelDir)
|
||||
.filter((entry) => entry.endsWith(".onnx"))
|
||||
.map((entry) => path.join(modelDir, entry));
|
||||
if (candidates.length === 1) return candidates[0];
|
||||
} catch {
|
||||
return "";
|
||||
}
|
||||
return "";
|
||||
}
|
||||
|
||||
function resolveTokensFile(modelDir, explicitFlag) {
|
||||
const explicit = (explicitFlag || process.env.SHERPA_ONNX_TOKENS_FILE || "").trim();
|
||||
if (explicit) return explicit;
|
||||
const candidate = path.join(modelDir, "tokens.txt");
|
||||
return fs.existsSync(candidate) ? candidate : "";
|
||||
}
|
||||
|
||||
function resolveDataDir(modelDir, explicitFlag) {
  const explicit = (explicitFlag || process.env.SHERPA_ONNX_DATA_DIR || "").trim();
  if (explicit) return explicit;
  const candidate = path.join(modelDir, "espeak-ng-data");
  return fs.existsSync(candidate) ? candidate : "";
}

function resolveBinary(runtimeDir) {
  const binName = process.platform === "win32" ? "sherpa-onnx-offline-tts.exe" : "sherpa-onnx-offline-tts";
  return path.join(runtimeDir, "bin", binName);
}

function prependEnvPath(current, next) {
  if (!next) return current;
  if (!current) return next;
  return `${next}${path.delimiter}${current}`;
}

const args = process.argv.slice(2);
let runtimeDir = "";
let modelDir = "";
let modelFile = "";
let tokensFile = "";
let dataDir = "";
let output = "tts.wav";
const textParts = [];

for (let i = 0; i < args.length; i += 1) {
  const arg = args[i];
  if (arg === "--runtime-dir") {
    runtimeDir = args[i + 1] || "";
    i += 1;
    continue;
  }
  if (arg === "--model-dir") {
    modelDir = args[i + 1] || "";
    i += 1;
    continue;
  }
  if (arg === "--model-file") {
    modelFile = args[i + 1] || "";
    i += 1;
    continue;
  }
  if (arg === "--tokens-file") {
    tokensFile = args[i + 1] || "";
    i += 1;
    continue;
  }
  if (arg === "--data-dir") {
    dataDir = args[i + 1] || "";
    i += 1;
    continue;
  }
  if (arg === "-o" || arg === "--output") {
    output = args[i + 1] || output;
    i += 1;
    continue;
  }
  if (arg === "--text") {
    textParts.push(args[i + 1] || "");
    i += 1;
    continue;
  }
  textParts.push(arg);
}

runtimeDir = resolveRuntimeDir(runtimeDir);
modelDir = resolveModelDir(modelDir);

if (!runtimeDir || !modelDir) {
  usage("Missing runtime/model directory.");
}

modelFile = resolveModelFile(modelDir, modelFile);
tokensFile = resolveTokensFile(modelDir, tokensFile);
dataDir = resolveDataDir(modelDir, dataDir);

if (!modelFile || !tokensFile || !dataDir) {
  usage(
    "Model directory is missing required files. Set SHERPA_ONNX_MODEL_FILE, SHERPA_ONNX_TOKENS_FILE, SHERPA_ONNX_DATA_DIR or pass --model-file/--tokens-file/--data-dir.",
  );
}

const text = textParts.join(" ").trim();
if (!text) {
  usage("Missing text.");
}

const bin = resolveBinary(runtimeDir);
if (!fs.existsSync(bin)) {
  usage(`TTS binary not found: ${bin}`);
}

const env = { ...process.env };
const libDir = path.join(runtimeDir, "lib");
if (process.platform === "darwin") {
  env.DYLD_LIBRARY_PATH = prependEnvPath(env.DYLD_LIBRARY_PATH || "", libDir);
} else if (process.platform === "win32") {
  env.PATH = prependEnvPath(env.PATH || "", [path.join(runtimeDir, "bin"), libDir].join(path.delimiter));
} else {
  env.LD_LIBRARY_PATH = prependEnvPath(env.LD_LIBRARY_PATH || "", libDir);
}

const outputPath = path.isAbsolute(output) ? output : path.join(process.cwd(), output);
fs.mkdirSync(path.dirname(outputPath), { recursive: true });

const child = spawnSync(
  bin,
  [
    `--vits-model=${modelFile}`,
    `--vits-tokens=${tokensFile}`,
    `--vits-data-dir=${dataDir}`,
    `--output-filename=${outputPath}`,
    text,
  ],
  {
    stdio: "inherit",
    env,
  },
);

if (typeof child.status === "number") {
  process.exit(child.status);
}
if (child.error) {
  console.error(child.error.message || String(child.error));
}
process.exit(1);
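The `prependEnvPath` helper above is what makes the bundled runtime's `lib/` directory win over any libraries already on the loader path. A minimal standalone sketch of its behavior, assuming a POSIX-style `:` delimiter (`path.delimiter` on Linux/macOS):

```javascript
// Standalone sketch of prependEnvPath, assuming ":" as the path delimiter.
const DELIMITER = ":";

function prependEnvPath(current, next) {
  if (!next) return current; // nothing to prepend
  if (!current) return next; // no existing value: use the new entry alone
  // New entries go first, so they take precedence in lookup order.
  return `${next}${DELIMITER}${current}`;
}

console.log(prependEnvPath("/usr/lib", "/opt/runtime/lib"));
// → "/opt/runtime/lib:/usr/lib"
```

On Windows the same helper is reused for `PATH`, where `path.delimiter` is `;` instead.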
@@ -400,7 +400,7 @@ export function createExecTool(
    host = "gateway";
  }

  const configuredSecurity = defaults?.security ?? (host === "sandbox" ? "deny" : "allowlist");
  const configuredSecurity = defaults?.security ?? "deny";
  const requestedSecurity = normalizeExecSecurity(params.security);
  let security = minSecurity(configuredSecurity, requestedSecurity ?? configuredSecurity);
  if (elevatedRequested) {
@@ -447,10 +447,7 @@ export function createExecTool(
  applyPathPrepend(env, defaultPathPrepend);

  if (host === "node") {
    const approvals = resolveExecApprovals(
      defaults?.agentId,
      host === "node" ? { security: "allowlist" } : undefined,
    );
    const approvals = resolveExecApprovals(defaults?.agentId);
    const hostSecurity = minSecurity(security, approvals.agent.security);
    const hostAsk = maxAsk(ask, approvals.agent.ask);
    const askFallback = approvals.agent.askFallback;
@@ -494,7 +491,12 @@ export function createExecTool(
    if (nodeEnv) {
      applyPathPrepend(nodeEnv, defaultPathPrepend, { requireExisting: true });
    }
    const requiresAsk = hostAsk === "always" || hostAsk === "on-miss";
    const resolution = resolveCommandResolution(params.command, workdir, env);
    const allowlistMatch =
      hostSecurity === "allowlist" ? matchAllowlist(approvals.allowlist, resolution) : null;
    const requiresAsk =
      hostAsk === "always" ||
      (hostAsk === "on-miss" && hostSecurity === "allowlist" && !allowlistMatch);

    let approvedByAsk = false;
    let approvalDecision: "allow-once" | "allow-always" | null = null;
@@ -509,7 +511,7 @@ export function createExecTool(
      security: hostSecurity,
      ask: hostAsk,
      agentId: defaults?.agentId,
      resolvedPath: null,
      resolvedPath: resolution?.resolvedPath ?? null,
      sessionKey: defaults?.sessionKey ?? null,
      timeoutMs: 120_000,
    },
@@ -527,7 +529,11 @@ export function createExecTool(
      approvedByAsk = true;
      approvalDecision = "allow-once";
    } else if (askFallback === "allowlist") {
      // Defer allowlist enforcement to the node host.
      if (!allowlistMatch) {
        throw new Error("exec denied: approval required (approval UI not available)");
      }
      approvedByAsk = true;
      approvalDecision = "allow-once";
    } else {
      throw new Error("exec denied: approval required (approval UI not available)");
    }
@@ -539,8 +545,32 @@ export function createExecTool(
    if (decision === "allow-always") {
      approvedByAsk = true;
      approvalDecision = "allow-always";
      if (hostSecurity === "allowlist") {
        const pattern =
          resolution?.resolvedPath ??
          resolution?.rawExecutable ??
          params.command.split(/\s+/).shift() ??
          "";
        if (pattern) {
          addAllowlistEntry(approvals.file, defaults?.agentId, pattern);
        }
      }
    }
  }

  if (hostSecurity === "allowlist" && !allowlistMatch && !approvedByAsk) {
    throw new Error("exec denied: allowlist miss");
  }

  if (allowlistMatch) {
    recordAllowlistUse(
      approvals.file,
      defaults?.agentId,
      allowlistMatch,
      params.command,
      resolution?.resolvedPath,
    );
  }
  const invokeParams: Record<string, unknown> = {
    nodeId,
    command: "system.run",
@@ -586,7 +616,7 @@ export function createExecTool(
  }

  if (host === "gateway") {
    const approvals = resolveExecApprovals(defaults?.agentId, { security: "allowlist" });
    const approvals = resolveExecApprovals(defaults?.agentId);
    const hostSecurity = minSecurity(security, approvals.agent.security);
    const hostAsk = maxAsk(ask, approvals.agent.ask);
    const askFallback = approvals.agent.askFallback;
@@ -1,82 +0,0 @@
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";

import type { ClawdbotConfig } from "../config/config.js";
import {
  __setModelCatalogImportForTest,
  loadModelCatalog,
  resetModelCatalogCacheForTest,
} from "./model-catalog.js";

type PiSdkModule = typeof import("@mariozechner/pi-coding-agent");

vi.mock("./models-config.js", () => ({
  ensureClawdbotModelsJson: vi.fn().mockResolvedValue({ agentDir: "/tmp", wrote: false }),
}));

vi.mock("./agent-paths.js", () => ({
  resolveClawdbotAgentDir: () => "/tmp/clawdbot",
}));

describe("loadModelCatalog", () => {
  beforeEach(() => {
    resetModelCatalogCacheForTest();
  });

  afterEach(() => {
    __setModelCatalogImportForTest();
    resetModelCatalogCacheForTest();
    vi.restoreAllMocks();
  });

  it("retries after import failure without poisoning the cache", async () => {
    const warnSpy = vi.spyOn(console, "warn").mockImplementation(() => {});
    let call = 0;

    __setModelCatalogImportForTest(async () => {
      call += 1;
      if (call === 1) {
        throw new Error("boom");
      }
      return {
        discoverAuthStorage: () => ({}),
        discoverModels: () => [{ id: "gpt-4.1", name: "GPT-4.1", provider: "openai" }],
      } as unknown as PiSdkModule;
    });

    const cfg = {} as ClawdbotConfig;
    const first = await loadModelCatalog({ config: cfg });
    expect(first).toEqual([]);

    const second = await loadModelCatalog({ config: cfg });
    expect(second).toEqual([{ id: "gpt-4.1", name: "GPT-4.1", provider: "openai" }]);
    expect(call).toBe(2);
    expect(warnSpy).toHaveBeenCalledTimes(1);
  });

  it("returns partial results on discovery errors", async () => {
    const warnSpy = vi.spyOn(console, "warn").mockImplementation(() => {});

    __setModelCatalogImportForTest(
      async () =>
        ({
          discoverAuthStorage: () => ({}),
          discoverModels: () => ({
            getAll: () => [
              { id: "gpt-4.1", name: "GPT-4.1", provider: "openai" },
              {
                get id() {
                  throw new Error("boom");
                },
                provider: "openai",
                name: "bad",
              },
            ],
          }),
        }) as unknown as PiSdkModule,
    );

    const result = await loadModelCatalog({ config: {} as ClawdbotConfig });
    expect(result).toEqual([{ id: "gpt-4.1", name: "GPT-4.1", provider: "openai" }]);
    expect(warnSpy).toHaveBeenCalledTimes(1);
  });
});
@@ -18,22 +18,10 @@ type DiscoveredModel = {
  reasoning?: boolean;
};

type PiSdkModule = typeof import("@mariozechner/pi-coding-agent");

let modelCatalogPromise: Promise<ModelCatalogEntry[]> | null = null;
let hasLoggedModelCatalogError = false;
const defaultImportPiSdk = () => import("@mariozechner/pi-coding-agent");
let importPiSdk = defaultImportPiSdk;

export function resetModelCatalogCacheForTest() {
  modelCatalogPromise = null;
  hasLoggedModelCatalogError = false;
  importPiSdk = defaultImportPiSdk;
}

// Test-only escape hatch: allow mocking the dynamic import to simulate transient failures.
export function __setModelCatalogImportForTest(loader?: () => Promise<PiSdkModule>) {
  importPiSdk = loader ?? defaultImportPiSdk;
}

export async function loadModelCatalog(params?: {
@@ -46,21 +34,12 @@ export async function loadModelCatalog(params?: {
  if (modelCatalogPromise) return modelCatalogPromise;

  modelCatalogPromise = (async () => {
    const piSdk = await import("@mariozechner/pi-coding-agent");

    const models: ModelCatalogEntry[] = [];
    const sortModels = (entries: ModelCatalogEntry[]) =>
      entries.sort((a, b) => {
        const p = a.provider.localeCompare(b.provider);
        if (p !== 0) return p;
        return a.name.localeCompare(b.name);
      });
    try {
      const cfg = params?.config ?? loadConfig();
      await ensureClawdbotModelsJson(cfg);
      // IMPORTANT: keep the dynamic import *inside* the try/catch.
      // If this fails once (e.g. during a pnpm install that temporarily swaps node_modules),
      // we must not poison the cache with a rejected promise (otherwise all channel handlers
      // will keep failing until restart).
      const piSdk = await importPiSdk();
      const agentDir = resolveClawdbotAgentDir();
      const authStorage = piSdk.discoverAuthStorage(agentDir);
      const registry = piSdk.discoverModels(authStorage, agentDir) as
@@ -87,20 +66,16 @@ export async function loadModelCatalog(params?: {
        // If we found nothing, don't cache this result so we can try again.
        modelCatalogPromise = null;
      }

      return sortModels(models);
    } catch (error) {
      if (!hasLoggedModelCatalogError) {
        hasLoggedModelCatalogError = true;
        console.warn(`[model-catalog] Failed to load model catalog: ${String(error)}`);
      }
      // Don't poison the cache on transient dependency/filesystem issues.
    } catch {
      // Leave models empty on discovery errors and don't cache.
      modelCatalogPromise = null;
      if (models.length > 0) {
        return sortModels(models);
      }
      return [];
    }

    return models.sort((a, b) => {
      const p = a.provider.localeCompare(b.provider);
      if (p !== 0) return p;
      return a.name.localeCompare(b.name);
    });
  })();

  return modelCatalogPromise;
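The hunk above fixes a classic memoized-async-loader pitfall: caching the promise before the first `await` means one transient import failure leaves a rejected promise in the cache forever. A hedged sketch of the pattern (names like `loadCatalog` and `fetchModels` are illustrative, not the project's API):

```javascript
// Memoized async loader that clears its cache on failure or empty results,
// so the next caller retries instead of seeing a stale rejection forever.
let cached = null;

function loadCatalog(fetchModels) {
  if (cached) return cached;
  cached = (async () => {
    try {
      const models = await fetchModels();
      if (models.length === 0) cached = null; // nothing found: allow a retry
      return models;
    } catch {
      cached = null; // transient failure: don't cache the rejection
      return [];
    }
  })();
  return cached;
}
```

The key design choice mirrors the diff: the fallible work happens inside the cached promise's try/catch, and the catch resets the cache slot rather than letting the rejected promise persist.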
@@ -1,9 +1,3 @@
import {
  diagnosticLogger as diag,
  logMessageQueued,
  logSessionStateChange,
} from "../../logging/diagnostic.js";

type EmbeddedPiQueueHandle = {
  queueMessage: (text: string) => Promise<void>;
  isStreaming: () => boolean;
@@ -20,40 +14,22 @@ const EMBEDDED_RUN_WAITERS = new Map<string, Set<EmbeddedRunWaiter>>();

export function queueEmbeddedPiMessage(sessionId: string, text: string): boolean {
  const handle = ACTIVE_EMBEDDED_RUNS.get(sessionId);
  if (!handle) {
    diag.debug(`queue message failed: sessionId=${sessionId} reason=no_active_run`);
    return false;
  }
  if (!handle.isStreaming()) {
    diag.debug(`queue message failed: sessionId=${sessionId} reason=not_streaming`);
    return false;
  }
  if (handle.isCompacting()) {
    diag.debug(`queue message failed: sessionId=${sessionId} reason=compacting`);
    return false;
  }
  logMessageQueued({ sessionId, source: "pi-embedded-runner" });
  if (!handle) return false;
  if (!handle.isStreaming()) return false;
  if (handle.isCompacting()) return false;
  void handle.queueMessage(text);
  return true;
}

export function abortEmbeddedPiRun(sessionId: string): boolean {
  const handle = ACTIVE_EMBEDDED_RUNS.get(sessionId);
  if (!handle) {
    diag.debug(`abort failed: sessionId=${sessionId} reason=no_active_run`);
    return false;
  }
  diag.info(`aborting run: sessionId=${sessionId}`);
  if (!handle) return false;
  handle.abort();
  return true;
}

export function isEmbeddedPiRunActive(sessionId: string): boolean {
  const active = ACTIVE_EMBEDDED_RUNS.has(sessionId);
  if (active) {
    diag.debug(`run active check: sessionId=${sessionId} active=true`);
  }
  return active;
  return ACTIVE_EMBEDDED_RUNS.has(sessionId);
}

export function isEmbeddedPiRunStreaming(sessionId: string): boolean {
@@ -64,7 +40,6 @@ export function isEmbeddedPiRunStreaming(sessionId: string): boolean {

export function waitForEmbeddedPiRunEnd(sessionId: string, timeoutMs = 15_000): Promise<boolean> {
  if (!sessionId || !ACTIVE_EMBEDDED_RUNS.has(sessionId)) return Promise.resolve(true);
  diag.debug(`waiting for run end: sessionId=${sessionId} timeoutMs=${timeoutMs}`);
  return new Promise((resolve) => {
    const waiters = EMBEDDED_RUN_WAITERS.get(sessionId) ?? new Set();
    const waiter: EmbeddedRunWaiter = {
@@ -73,7 +48,6 @@ export function waitForEmbeddedPiRunEnd(sessionId: string, timeoutMs = 15_000):
      () => {
        waiters.delete(waiter);
        if (waiters.size === 0) EMBEDDED_RUN_WAITERS.delete(sessionId);
        diag.warn(`wait timeout: sessionId=${sessionId} timeoutMs=${timeoutMs}`);
        resolve(false);
      },
      Math.max(100, timeoutMs),
@@ -94,7 +68,6 @@ function notifyEmbeddedRunEnded(sessionId: string) {
  const waiters = EMBEDDED_RUN_WAITERS.get(sessionId);
  if (!waiters || waiters.size === 0) return;
  EMBEDDED_RUN_WAITERS.delete(sessionId);
  diag.debug(`notifying waiters: sessionId=${sessionId} waiterCount=${waiters.size}`);
  for (const waiter of waiters) {
    clearTimeout(waiter.timer);
    waiter.resolve(true);
@@ -102,24 +75,13 @@ function notifyEmbeddedRunEnded(sessionId: string) {
}

export function setActiveEmbeddedRun(sessionId: string, handle: EmbeddedPiQueueHandle) {
  const wasActive = ACTIVE_EMBEDDED_RUNS.has(sessionId);
  ACTIVE_EMBEDDED_RUNS.set(sessionId, handle);
  logSessionStateChange({
    sessionId,
    state: "processing",
    reason: wasActive ? "run_replaced" : "run_started",
  });
  diag.info(`run registered: sessionId=${sessionId} totalActive=${ACTIVE_EMBEDDED_RUNS.size}`);
}

export function clearActiveEmbeddedRun(sessionId: string, handle: EmbeddedPiQueueHandle) {
  if (ACTIVE_EMBEDDED_RUNS.get(sessionId) === handle) {
    ACTIVE_EMBEDDED_RUNS.delete(sessionId);
    logSessionStateChange({ sessionId, state: "idle", reason: "run_completed" });
    diag.info(`run cleared: sessionId=${sessionId} totalActive=${ACTIVE_EMBEDDED_RUNS.size}`);
    notifyEmbeddedRunEnded(sessionId);
  } else {
    diag.debug(`run clear skipped: sessionId=${sessionId} reason=handle_mismatch`);
  }
}
@@ -1,13 +1,10 @@
import fs from "node:fs";
import path from "node:path";
import { Readable } from "node:stream";
import type { ReadableStream as NodeReadableStream } from "node:stream/web";
import { pipeline } from "node:stream/promises";

import type { ClawdbotConfig } from "../config/config.js";
import { resolveBrewExecutable } from "../infra/brew.js";
import { runCommandWithTimeout } from "../process/exec.js";
import { CONFIG_DIR, ensureDir, resolveUserPath } from "../utils.js";
import { resolveUserPath } from "../utils.js";
import {
  hasBinary,
  loadWorkspaceSkillEntries,
@@ -16,7 +13,6 @@ import {
  type SkillInstallSpec,
  type SkillsInstallPreferences,
} from "./skills.js";
import { resolveSkillKey } from "./skills/frontmatter.js";

export type SkillInstallRequest = {
  workspaceDir: string;
@@ -34,10 +30,6 @@ export type SkillInstallResult = {
  code: number | null;
};

function isNodeReadableStream(value: unknown): value is NodeJS.ReadableStream {
  return Boolean(value && typeof (value as NodeJS.ReadableStream).pipe === "function");
}

function summarizeInstallOutput(text: string): string | undefined {
  const raw = text.trim();
  if (!raw) return undefined;
@@ -120,162 +112,11 @@ function buildInstallCommand(
      if (!spec.package) return { argv: null, error: "missing uv package" };
      return { argv: ["uv", "tool", "install", spec.package] };
    }
    case "download": {
      return { argv: null, error: "download install handled separately" };
    }
    default:
      return { argv: null, error: "unsupported installer" };
  }
}

function resolveDownloadTargetDir(entry: SkillEntry, spec: SkillInstallSpec): string {
  if (spec.targetDir?.trim()) return resolveUserPath(spec.targetDir);
  const key = resolveSkillKey(entry.skill, entry);
  return path.join(CONFIG_DIR, "tools", key);
}

function resolveArchiveType(spec: SkillInstallSpec, filename: string): string | undefined {
  const explicit = spec.archive?.trim().toLowerCase();
  if (explicit) return explicit;
  const lower = filename.toLowerCase();
  if (lower.endsWith(".tar.gz") || lower.endsWith(".tgz")) return "tar.gz";
  if (lower.endsWith(".tar.bz2") || lower.endsWith(".tbz2")) return "tar.bz2";
  if (lower.endsWith(".zip")) return "zip";
  return undefined;
}

async function downloadFile(
  url: string,
  destPath: string,
  timeoutMs: number,
): Promise<{ bytes: number }> {
  const controller = new AbortController();
  const timeout = setTimeout(() => controller.abort(), Math.max(1_000, timeoutMs));
  try {
    const response = await fetch(url, { signal: controller.signal });
    if (!response.ok || !response.body) {
      throw new Error(`Download failed (${response.status} ${response.statusText})`);
    }
    await ensureDir(path.dirname(destPath));
    const file = fs.createWriteStream(destPath);
    const body = response.body as unknown;
    const readable = isNodeReadableStream(body)
      ? body
      : Readable.fromWeb(body as NodeReadableStream);
    await pipeline(readable, file);
    const stat = await fs.promises.stat(destPath);
    return { bytes: stat.size };
  } finally {
    clearTimeout(timeout);
  }
}

async function extractArchive(params: {
  archivePath: string;
  archiveType: string;
  targetDir: string;
  stripComponents?: number;
  timeoutMs: number;
}): Promise<{ stdout: string; stderr: string; code: number | null }> {
  const { archivePath, archiveType, targetDir, stripComponents, timeoutMs } = params;
  if (archiveType === "zip") {
    if (!hasBinary("unzip")) {
      return { stdout: "", stderr: "unzip not found on PATH", code: null };
    }
    const argv = ["unzip", "-q", archivePath, "-d", targetDir];
    return await runCommandWithTimeout(argv, { timeoutMs });
  }

  if (!hasBinary("tar")) {
    return { stdout: "", stderr: "tar not found on PATH", code: null };
  }
  const argv = ["tar", "xf", archivePath, "-C", targetDir];
  if (typeof stripComponents === "number" && Number.isFinite(stripComponents)) {
    argv.push("--strip-components", String(Math.max(0, Math.floor(stripComponents))));
  }
  return await runCommandWithTimeout(argv, { timeoutMs });
}

async function installDownloadSpec(params: {
  entry: SkillEntry;
  spec: SkillInstallSpec;
  timeoutMs: number;
}): Promise<SkillInstallResult> {
  const { entry, spec, timeoutMs } = params;
  const url = spec.url?.trim();
  if (!url) {
    return {
      ok: false,
      message: "missing download url",
      stdout: "",
      stderr: "",
      code: null,
    };
  }

  let filename = "";
  try {
    const parsed = new URL(url);
    filename = path.basename(parsed.pathname);
  } catch {
    filename = path.basename(url);
  }
  if (!filename) filename = "download";

  const targetDir = resolveDownloadTargetDir(entry, spec);
  await ensureDir(targetDir);

  const archivePath = path.join(targetDir, filename);
  let downloaded = 0;
  try {
    const result = await downloadFile(url, archivePath, timeoutMs);
    downloaded = result.bytes;
  } catch (err) {
    const message = err instanceof Error ? err.message : String(err);
    return { ok: false, message, stdout: "", stderr: message, code: null };
  }

  const archiveType = resolveArchiveType(spec, filename);
  const shouldExtract = spec.extract ?? Boolean(archiveType);
  if (!shouldExtract) {
    return {
      ok: true,
      message: `Downloaded to ${archivePath}`,
      stdout: `downloaded=${downloaded}`,
      stderr: "",
      code: 0,
    };
  }

  if (!archiveType) {
    return {
      ok: false,
      message: "extract requested but archive type could not be detected",
      stdout: "",
      stderr: "",
      code: null,
    };
  }

  const extractResult = await extractArchive({
    archivePath,
    archiveType,
    targetDir,
    stripComponents: spec.stripComponents,
    timeoutMs,
  });
  const success = extractResult.code === 0;
  return {
    ok: success,
    message: success
      ? `Downloaded and extracted to ${targetDir}`
      : formatInstallFailureMessage(extractResult),
    stdout: extractResult.stdout.trim(),
    stderr: extractResult.stderr.trim(),
    code: extractResult.code,
  };
}

async function resolveBrewBinDir(timeoutMs: number, brewExe?: string): Promise<string | undefined> {
  const exe = brewExe ?? (hasBinary("brew") ? "brew" : resolveBrewExecutable());
  if (!exe) return undefined;
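`resolveArchiveType` above decides how to extract purely from an explicit spec override or the filename suffix; everything else falls through to `undefined` and is treated as a plain download. A small sketch of the suffix-only half of that logic (plain JavaScript, without the spec override):

```javascript
// Extension-based archive detection, mirroring the suffix checks above.
// Unknown extensions return undefined, which callers treat as "not an archive".
function detectArchiveType(filename) {
  const lower = filename.toLowerCase();
  if (lower.endsWith(".tar.gz") || lower.endsWith(".tgz")) return "tar.gz";
  if (lower.endsWith(".tar.bz2") || lower.endsWith(".tbz2")) return "tar.bz2";
  if (lower.endsWith(".zip")) return "zip";
  return undefined;
}

console.log(detectArchiveType("tool-v1.2.tgz"));
// → "tar.gz"
```

Lowercasing first means the match is case-insensitive, so `Tool.ZIP` and `tool.zip` resolve the same way.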
@@ -326,9 +167,6 @@ export async function installSkill(params: SkillInstallRequest): Promise<SkillIn
      code: null,
    };
  }
  if (spec.kind === "download") {
    return await installDownloadSpec({ entry, spec, timeoutMs });
  }

  const prefs = resolveSkillsInstallPreferences(params.config);
  const command = buildInstallCommand(spec, prefs);
@@ -100,49 +100,36 @@ function normalizeInstallOptions(
): SkillInstallOption[] {
  const install = entry.clawdbot?.install ?? [];
  if (install.length === 0) return [];

  const platform = process.platform;
  const filtered = install.filter((spec) => {
    const osList = spec.os ?? [];
    return osList.length === 0 || osList.includes(platform);
  });
  if (filtered.length === 0) return [];

  const toOption = (spec: SkillInstallSpec, index: number): SkillInstallOption => {
    const id = (spec.id ?? `${spec.kind}-${index}`).trim();
    const bins = spec.bins ?? [];
    let label = (spec.label ?? "").trim();
    if (spec.kind === "node" && spec.package) {
      label = `Install ${spec.package} (${prefs.nodeManager})`;
    }
    if (!label) {
      if (spec.kind === "brew" && spec.formula) {
        label = `Install ${spec.formula} (brew)`;
      } else if (spec.kind === "node" && spec.package) {
        label = `Install ${spec.package} (${prefs.nodeManager})`;
      } else if (spec.kind === "go" && spec.module) {
        label = `Install ${spec.module} (go)`;
      } else if (spec.kind === "uv" && spec.package) {
        label = `Install ${spec.package} (uv)`;
      } else if (spec.kind === "download" && spec.url) {
        const url = spec.url.trim();
        const last = url.split("/").pop();
        label = `Download ${last && last.length > 0 ? last : url}`;
      } else {
        label = "Run installer";
      }
    }
    return { id, kind: spec.kind, label, bins };
  };

  const allDownloads = filtered.every((spec) => spec.kind === "download");
  if (allDownloads) {
    return filtered.map((spec, index) => toOption(spec, index));
  }

  const preferred = selectPreferredInstallSpec(filtered, prefs);
  const preferred = selectPreferredInstallSpec(install, prefs);
  if (!preferred) return [];
  return [toOption(preferred.spec, preferred.index)];
  const { spec, index } = preferred;
  const id = (spec.id ?? `${spec.kind}-${index}`).trim();
  const bins = spec.bins ?? [];
  let label = (spec.label ?? "").trim();
  if (spec.kind === "node" && spec.package) {
    label = `Install ${spec.package} (${prefs.nodeManager})`;
  }
  if (!label) {
    if (spec.kind === "brew" && spec.formula) {
      label = `Install ${spec.formula} (brew)`;
    } else if (spec.kind === "node" && spec.package) {
      label = `Install ${spec.package} (${prefs.nodeManager})`;
    } else if (spec.kind === "go" && spec.module) {
      label = `Install ${spec.module} (go)`;
    } else if (spec.kind === "uv" && spec.package) {
      label = `Install ${spec.package} (uv)`;
    } else {
      label = "Run installer";
    }
  }
  return [
    {
      id,
      kind: spec.kind,
      label,
      bins,
    },
  ];
}

function buildSkillStatus(
@@ -109,33 +109,4 @@ describe("buildWorkspaceSkillStatus", () => {
      }
    }
  });

  it("filters install options by OS", async () => {
    const workspaceDir = await fs.mkdtemp(path.join(os.tmpdir(), "clawdbot-"));
    const skillDir = path.join(workspaceDir, "skills", "install-skill");

    await writeSkill({
      dir: skillDir,
      name: "install-skill",
      description: "OS-specific installs",
      metadata:
        '{"clawdbot":{"requires":{"bins":["missing-bin"]},"install":[{"id":"mac","kind":"download","os":["darwin"],"url":"https://example.com/mac.tar.bz2"},{"id":"linux","kind":"download","os":["linux"],"url":"https://example.com/linux.tar.bz2"},{"id":"win","kind":"download","os":["win32"],"url":"https://example.com/win.tar.bz2"}]}}',
    });

    const report = buildWorkspaceSkillStatus(workspaceDir, {
      managedSkillsDir: path.join(workspaceDir, ".managed"),
    });
    const skill = report.skills.find((entry) => entry.name === "install-skill");

    expect(skill).toBeDefined();
    if (process.platform === "darwin") {
      expect(skill?.install.map((opt) => opt.id)).toEqual(["mac"]);
    } else if (process.platform === "linux") {
      expect(skill?.install.map((opt) => opt.id)).toEqual(["linux"]);
    } else if (process.platform === "win32") {
      expect(skill?.install.map((opt) => opt.id)).toEqual(["win"]);
    } else {
      expect(skill?.install).toEqual([]);
    }
  });
});
@@ -35,7 +35,7 @@ function parseInstallSpec(input: unknown): SkillInstallSpec | undefined {
  const kindRaw =
    typeof raw.kind === "string" ? raw.kind : typeof raw.type === "string" ? raw.type : "";
  const kind = kindRaw.trim().toLowerCase();
  if (kind !== "brew" && kind !== "node" && kind !== "go" && kind !== "uv" && kind !== "download") {
  if (kind !== "brew" && kind !== "node" && kind !== "go" && kind !== "uv") {
    return undefined;
  }

@@ -47,16 +47,9 @@ function parseInstallSpec(input: unknown): SkillInstallSpec | undefined {
  if (typeof raw.label === "string") spec.label = raw.label;
  const bins = normalizeStringList(raw.bins);
  if (bins.length > 0) spec.bins = bins;
  const osList = normalizeStringList(raw.os);
  if (osList.length > 0) spec.os = osList;
  if (typeof raw.formula === "string") spec.formula = raw.formula;
  if (typeof raw.package === "string") spec.package = raw.package;
  if (typeof raw.module === "string") spec.module = raw.module;
  if (typeof raw.url === "string") spec.url = raw.url;
  if (typeof raw.archive === "string") spec.archive = raw.archive;
  if (typeof raw.extract === "boolean") spec.extract = raw.extract;
  if (typeof raw.stripComponents === "number") spec.stripComponents = raw.stripComponents;
  if (typeof raw.targetDir === "string") spec.targetDir = raw.targetDir;

  return spec;
}
@@ -2,18 +2,12 @@ import type { Skill } from "@mariozechner/pi-coding-agent";

export type SkillInstallSpec = {
  id?: string;
  kind: "brew" | "node" | "go" | "uv" | "download";
  kind: "brew" | "node" | "go" | "uv";
  label?: string;
  bins?: string[];
  os?: string[];
  formula?: string;
  package?: string;
  module?: string;
  url?: string;
  archive?: string;
  extract?: boolean;
  stripComponents?: number;
  targetDir?: string;
};

export type ClawdbotSkillMetadata = {
@@ -49,10 +49,9 @@ const browserConfigMocks = vi.hoisted(() => ({
|
||||
}));
|
||||
vi.mock("../../browser/config.js", () => browserConfigMocks);
|
||||
|
||||
const configMocks = vi.hoisted(() => ({
|
||||
vi.mock("../../config/config.js", () => ({
|
||||
loadConfig: vi.fn(() => ({ browser: {} })),
|
||||
}));
|
||||
vi.mock("../../config/config.js", () => configMocks);
|
||||
|
||||
const toolCommonMocks = vi.hoisted(() => ({
|
||||
imageResultFromFile: vi.fn(),
|
||||
@@ -71,12 +70,11 @@ import { createBrowserTool } from "./browser-tool.js";
|
||||
describe("browser tool snapshot maxChars", () => {
|
||||
afterEach(() => {
|
||||
vi.clearAllMocks();
|
||||
configMocks.loadConfig.mockReturnValue({ browser: {} });
|
||||
});
|
||||
|
||||
it("applies the default ai snapshot limit", async () => {
|
||||
const tool = createBrowserTool();
|
||||
await tool.execute?.(null, { action: "snapshot", snapshotFormat: "ai" });
|
||||
await tool.execute?.(null, { action: "snapshot", format: "ai" });
|
||||
|
||||
expect(browserClientMocks.browserSnapshot).toHaveBeenCalledWith(
|
||||
"http://127.0.0.1:18791",
|
||||
@@ -92,7 +90,7 @@ describe("browser tool snapshot maxChars", () => {
     const override = 2_000;
     await tool.execute?.(null, {
       action: "snapshot",
-      snapshotFormat: "ai",
+      format: "ai",
       maxChars: override,
     });
@@ -108,7 +106,7 @@ describe("browser tool snapshot maxChars", () => {
     const tool = createBrowserTool();
     await tool.execute?.(null, {
       action: "snapshot",
-      snapshotFormat: "ai",
+      format: "ai",
       maxChars: 0,
     });
@@ -126,7 +124,7 @@ describe("browser tool snapshot maxChars", () => {

   it("passes refs mode through to browser snapshot", async () => {
     const tool = createBrowserTool();
-    await tool.execute?.(null, { action: "snapshot", snapshotFormat: "ai", refs: "aria" });
+    await tool.execute?.(null, { action: "snapshot", format: "ai", refs: "aria" });

     expect(browserClientMocks.browserSnapshot).toHaveBeenCalledWith(
       "http://127.0.0.1:18791",
@@ -137,36 +135,9 @@ describe("browser tool snapshot maxChars", () => {
     );
   });

-  it("uses config snapshot defaults when mode is not provided", async () => {
-    configMocks.loadConfig.mockReturnValue({
-      browser: { snapshotDefaults: { mode: "efficient" } },
-    });
-    const tool = createBrowserTool();
-    await tool.execute?.(null, { action: "snapshot", snapshotFormat: "ai" });
-
-    expect(browserClientMocks.browserSnapshot).toHaveBeenCalledWith(
-      "http://127.0.0.1:18791",
-      expect.objectContaining({
-        mode: "efficient",
-      }),
-    );
-  });
-
-  it("does not apply config snapshot defaults to aria snapshots", async () => {
-    configMocks.loadConfig.mockReturnValue({
-      browser: { snapshotDefaults: { mode: "efficient" } },
-    });
-    const tool = createBrowserTool();
-    await tool.execute?.(null, { action: "snapshot", snapshotFormat: "aria" });
-
-    expect(browserClientMocks.browserSnapshot).toHaveBeenCalled();
-    const [, opts] = browserClientMocks.browserSnapshot.mock.calls.at(-1) ?? [];
-    expect(opts?.mode).toBeUndefined();
-  });
-
   it("defaults to host when using profile=chrome (even in sandboxed sessions)", async () => {
     const tool = createBrowserTool({ defaultControlUrl: "http://127.0.0.1:9999" });
-    await tool.execute?.(null, { action: "snapshot", profile: "chrome", snapshotFormat: "ai" });
+    await tool.execute?.(null, { action: "snapshot", profile: "chrome", format: "ai" });

     expect(browserClientMocks.browserSnapshot).toHaveBeenCalledWith(
       "http://127.0.0.1:18791",
@@ -180,7 +151,6 @@ describe("browser tool snapshot maxChars", () => {
 describe("browser tool snapshot labels", () => {
   afterEach(() => {
     vi.clearAllMocks();
-    configMocks.loadConfig.mockReturnValue({ browser: {} });
   });

   it("returns image + text when labels are requested", async () => {
@@ -205,7 +175,7 @@ describe("browser tool snapshot labels", () => {

     const result = await tool.execute?.(null, {
       action: "snapshot",
-      snapshotFormat: "ai",
+      format: "ai",
       labels: true,
     });

@@ -190,17 +190,11 @@ export function createBrowserTool(opts?: {
         return jsonResult({ ok: true });
       }
       case "snapshot": {
-        const snapshotDefaults = loadConfig().browser?.snapshotDefaults;
         const format =
           params.snapshotFormat === "ai" || params.snapshotFormat === "aria"
             ? (params.snapshotFormat as "ai" | "aria")
             : "ai";
-        const mode =
-          params.mode === "efficient"
-            ? "efficient"
-            : format === "ai" && snapshotDefaults?.mode === "efficient"
-              ? "efficient"
-              : undefined;
+        const mode = params.mode === "efficient" ? "efficient" : undefined;
         const labels = typeof params.labels === "boolean" ? params.labels : undefined;
         const refs = params.refs === "aria" || params.refs === "role" ? params.refs : undefined;
         const hasMaxChars = Object.hasOwn(params, "maxChars");

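The hunk above simplifies snapshot-mode resolution: `mode` now depends only on the request parameter, with the config-defaults fallback removed. A standalone sketch of the resulting narrowing (the `SnapshotParams` shape and `narrowSnapshotParams` name are assumptions for illustration; the real tool validates params against its schema):

```typescript
// Standalone sketch of the snapshot parameter narrowing from the hunk above.
// Shapes and names here are illustrative assumptions, not the tool's API.
type SnapshotParams = {
  snapshotFormat?: unknown;
  mode?: unknown;
  labels?: unknown;
  refs?: unknown;
  maxChars?: unknown;
};

function narrowSnapshotParams(params: SnapshotParams) {
  const format =
    params.snapshotFormat === "ai" || params.snapshotFormat === "aria"
      ? (params.snapshotFormat as "ai" | "aria")
      : "ai"; // unknown values fall back to "ai"
  const mode = params.mode === "efficient" ? "efficient" : undefined;
  const labels = typeof params.labels === "boolean" ? params.labels : undefined;
  const refs = params.refs === "aria" || params.refs === "role" ? params.refs : undefined;
  // Mirrors the tool's Object.hasOwn check: maxChars: 0 still counts as provided.
  const hasMaxChars = Object.prototype.hasOwnProperty.call(params, "maxChars");
  return { format, mode, labels, refs, hasMaxChars };
}
```

Note the presence check rather than a truthiness check for `maxChars`, so an explicit `0` (meaning "no limit") is distinguishable from the parameter being absent.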
@@ -65,7 +65,7 @@ describe("cron tool", () => {
         data: {
           name: "wake-up",
           schedule: { atMs: 123 },
-          payload: { kind: "systemEvent", text: "hello" },
+          payload: { text: "hello" },
         },
       },
     });
@@ -105,7 +105,7 @@ describe("cron tool", () => {
       job: {
         name: "reminder",
         schedule: { atMs: 123 },
-        payload: { kind: "systemEvent", text: "Reminder: the thing." },
+        payload: { text: "Reminder: the thing." },
       },
     });

@@ -3,8 +3,6 @@ import crypto from "node:crypto";
 import { Type } from "@sinclair/typebox";

-import type { ClawdbotConfig } from "../../config/config.js";
 import { loadConfig } from "../../config/io.js";
-import { loadSessionStore, resolveStorePath } from "../../config/sessions.js";
 import { scheduleGatewaySigusr1Restart } from "../../infra/restart.js";
 import {
   formatDoctorNonInteractiveHint,
@@ -79,42 +77,11 @@ export function createGatewayTool(opts?: {
         : undefined;
     const note =
       typeof params.note === "string" && params.note.trim() ? params.note.trim() : undefined;
-    // Extract channel + threadId for routing after restart
-    let deliveryContext: { channel?: string; to?: string; accountId?: string } | undefined;
-    let threadId: string | undefined;
-    if (sessionKey) {
-      const threadMarker = ":thread:";
-      const threadIndex = sessionKey.lastIndexOf(threadMarker);
-      const baseSessionKey = threadIndex === -1 ? sessionKey : sessionKey.slice(0, threadIndex);
-      const threadIdRaw =
-        threadIndex === -1 ? undefined : sessionKey.slice(threadIndex + threadMarker.length);
-      threadId = threadIdRaw?.trim() || undefined;
-      try {
-        const cfg = loadConfig();
-        const storePath = resolveStorePath(cfg.session?.store);
-        const store = loadSessionStore(storePath);
-        let entry = store[sessionKey];
-        if (!entry?.deliveryContext && threadIndex !== -1 && baseSessionKey) {
-          entry = store[baseSessionKey];
-        }
-        if (entry?.deliveryContext) {
-          deliveryContext = {
-            channel: entry.deliveryContext.channel,
-            to: entry.deliveryContext.to,
-            accountId: entry.deliveryContext.accountId,
-          };
-        }
-      } catch {
-        // ignore: best-effort
-      }
-    }
     const payload: RestartSentinelPayload = {
       kind: "restart",
       status: "ok",
       ts: Date.now(),
       sessionKey,
-      deliveryContext,
-      threadId,
       message: note ?? reason ?? null,
       doctorHint: formatDoctorNonInteractiveHint(),
       stats: {
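The block removed above parsed a `:thread:` suffix out of the session key before consulting the session store. The string-splitting part can be sketched on its own (the `splitSessionKey` helper is illustrative; the removed code also looked up delivery context in the store, which is omitted here):

```typescript
// Sketch of the ':thread:' session-key parsing that the hunk above removes.
// Standalone illustration only; the real code also consulted the session store.
function splitSessionKey(sessionKey: string): { base: string; threadId?: string } {
  const threadMarker = ":thread:";
  // lastIndexOf so a thread suffix wins even if the base key contains colons.
  const threadIndex = sessionKey.lastIndexOf(threadMarker);
  if (threadIndex === -1) return { base: sessionKey };
  const threadId = sessionKey.slice(threadIndex + threadMarker.length).trim() || undefined;
  return { base: sessionKey.slice(0, threadIndex), threadId };
}

console.log(splitSessionKey("matrix:room:thread:42"));
// → { base: "matrix:room", threadId: "42" }
```

An empty or whitespace-only suffix collapses to `undefined`, matching the removed `threadIdRaw?.trim() || undefined` behavior.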
@@ -1,160 +0,0 @@
-import { afterEach, describe, expect, it, vi } from "vitest";
-
-const lookupMock = vi.fn();
-
-vi.mock("node:dns/promises", () => ({
-  lookup: lookupMock,
-}));
-
-function makeHeaders(map: Record<string, string>): { get: (key: string) => string | null } {
-  return {
-    get: (key) => map[key.toLowerCase()] ?? null,
-  };
-}
-
-function redirectResponse(location: string): Response {
-  return {
-    ok: false,
-    status: 302,
-    headers: makeHeaders({ location }),
-    body: { cancel: vi.fn() },
-  } as Response;
-}
-
-function textResponse(body: string): Response {
-  return {
-    ok: true,
-    status: 200,
-    headers: makeHeaders({ "content-type": "text/plain" }),
-    text: async () => body,
-  } as Response;
-}
-
-describe("web_fetch SSRF protection", () => {
-  const priorFetch = global.fetch;
-
-  afterEach(() => {
-    // @ts-expect-error restore
-    global.fetch = priorFetch;
-    lookupMock.mockReset();
-    vi.restoreAllMocks();
-  });
-
-  it("blocks localhost hostnames before fetch/firecrawl", async () => {
-    const fetchSpy = vi.fn();
-    // @ts-expect-error mock fetch
-    global.fetch = fetchSpy;
-
-    const { createWebFetchTool } = await import("./web-tools.js");
-    const tool = createWebFetchTool({
-      config: {
-        tools: {
-          web: {
-            fetch: {
-              cacheTtlMinutes: 0,
-              firecrawl: { apiKey: "firecrawl-test" },
-            },
-          },
-        },
-      },
-    });
-
-    await expect(tool?.execute?.("call", { url: "http://localhost/test" })).rejects.toThrow(
-      /Blocked hostname/i,
-    );
-    expect(fetchSpy).not.toHaveBeenCalled();
-    expect(lookupMock).not.toHaveBeenCalled();
-  });
-
-  it("blocks private IP literals without DNS", async () => {
-    const fetchSpy = vi.fn();
-    // @ts-expect-error mock fetch
-    global.fetch = fetchSpy;
-
-    const { createWebFetchTool } = await import("./web-tools.js");
-    const tool = createWebFetchTool({
-      config: {
-        tools: { web: { fetch: { cacheTtlMinutes: 0, firecrawl: { enabled: false } } } },
-      },
-    });
-
-    await expect(tool?.execute?.("call", { url: "http://127.0.0.1/test" })).rejects.toThrow(
-      /private|internal|blocked/i,
-    );
-    await expect(tool?.execute?.("call", { url: "http://[::ffff:127.0.0.1]/" })).rejects.toThrow(
-      /private|internal|blocked/i,
-    );
-    expect(fetchSpy).not.toHaveBeenCalled();
-    expect(lookupMock).not.toHaveBeenCalled();
-  });
-
-  it("blocks when DNS resolves to private addresses", async () => {
-    lookupMock.mockImplementation(async (hostname: string) => {
-      if (hostname === "public.test") {
-        return [{ address: "93.184.216.34", family: 4 }];
-      }
-      return [{ address: "10.0.0.5", family: 4 }];
-    });
-
-    const fetchSpy = vi.fn();
-    // @ts-expect-error mock fetch
-    global.fetch = fetchSpy;
-
-    const { createWebFetchTool } = await import("./web-tools.js");
-    const tool = createWebFetchTool({
-      config: {
-        tools: { web: { fetch: { cacheTtlMinutes: 0, firecrawl: { enabled: false } } } },
-      },
-    });
-
-    await expect(tool?.execute?.("call", { url: "https://private.test/resource" })).rejects.toThrow(
-      /private|internal|blocked/i,
-    );
-    expect(fetchSpy).not.toHaveBeenCalled();
-  });
-
-  it("blocks redirects to private hosts", async () => {
-    lookupMock.mockResolvedValue([{ address: "93.184.216.34", family: 4 }]);
-
-    const fetchSpy = vi.fn().mockResolvedValueOnce(redirectResponse("http://127.0.0.1/secret"));
-    // @ts-expect-error mock fetch
-    global.fetch = fetchSpy;
-
-    const { createWebFetchTool } = await import("./web-tools.js");
-    const tool = createWebFetchTool({
-      config: {
-        tools: {
-          web: {
-            fetch: { cacheTtlMinutes: 0, firecrawl: { apiKey: "firecrawl-test" } },
-          },
-        },
-      },
-    });
-
-    await expect(tool?.execute?.("call", { url: "https://example.com" })).rejects.toThrow(
-      /private|internal|blocked/i,
-    );
-    expect(fetchSpy).toHaveBeenCalledTimes(1);
-  });
-
-  it("allows public hosts", async () => {
-    lookupMock.mockResolvedValue([{ address: "93.184.216.34", family: 4 }]);
-
-    const fetchSpy = vi.fn().mockResolvedValue(textResponse("ok"));
-    // @ts-expect-error mock fetch
-    global.fetch = fetchSpy;
-
-    const { createWebFetchTool } = await import("./web-tools.js");
-    const tool = createWebFetchTool({
-      config: {
-        tools: { web: { fetch: { cacheTtlMinutes: 0, firecrawl: { enabled: false } } } },
-      },
-    });
-
-    const result = await tool?.execute?.("call", { url: "https://example.com" });
-    expect(result?.details).toMatchObject({
-      status: 200,
-      extractor: "raw",
-    });
-  });
-});
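The deleted file above tested SSRF protection: localhost names, private IP literals (including IPv4-mapped IPv6), DNS rebinding to private ranges, and redirects to private hosts. The core address classification those tests exercised can be sketched as follows; the `isPrivateIPv4` helper, its name, and the exact range list are assumptions for illustration, not the tool's actual API:

```typescript
// Minimal sketch of the private-address classification the deleted SSRF
// tests exercised. Helper name and range list are illustrative assumptions.
function isPrivateIPv4(address: string): boolean {
  const mapped = address.replace(/^::ffff:/i, ""); // unwrap IPv4-mapped IPv6
  const parts = mapped.split(".").map(Number);
  if (parts.length !== 4 || parts.some((p) => !Number.isInteger(p) || p < 0 || p > 255)) {
    return false; // not an IPv4 literal; hostnames need DNS resolution first
  }
  const [a, b] = parts;
  return (
    a === 10 || // 10.0.0.0/8
    a === 127 || // 127.0.0.0/8 loopback
    (a === 172 && b >= 16 && b <= 31) || // 172.16.0.0/12
    (a === 192 && b === 168) || // 192.168.0.0/16
    (a === 169 && b === 254) // 169.254.0.0/16 link-local
  );
}

console.log(isPrivateIPv4("10.0.0.5")); // → true
```

A real implementation must also apply this check to every address DNS returns and to every redirect target, since the deleted tests covered both rebinding and redirect bypasses.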
Some files were not shown because too many files have changed in this diff.