# daoye.dev — Sovereign Browser Platform

https://daoye.dev/llms.txt

> Built by hand. Served by machine. Owned by you.

## What This Is

A sovereign, browser-native platform where the Service Worker IS the server. One Caddy process serves static files. After the first HTML load, the SW owns all routing, caching, filesystem access, AI inference, and network policy. The browser is the computer.

## Architecture

Caddy (TLS + static + reverse proxies) → index.html → SW registers → SW controls everything.

Two categories of routes:

- **SW-handled**: Page routes (/meet, /bbs, /present, /code, /voice, /cameras, /sensors), API routes (/api/fs/*, /api/big-pickle, /api/network-audit)
- **Caddy-proxied**: /api/ollama/* → Ollama (local LLM), /api/oc/* → OpenCode (coding agent), /api/zen/* → opencode.ai (cloud fallback), /api/voice/* → voice server, /api/streams/* → MediaMTX

## LLM Access

- Default: Ollama on localhost (eigen-assistant, llama3.2, qwen3-coder, deepseek-r1). No API key, zero cost.
- Fallback: OpenCode Zen (opencode.ai, Bearer public, deepseek-v4-flash, zero cost).
- Optional: Anthropic or OpenAI with an API key.

The `/api/big-pickle` endpoint in the SW routes to the appropriate provider. The `/api/oc/*` proxy exposes the full OpenCode protocol (sessions, tools, PTY, diffs, todos).

## Filesystem

Virtual filesystem (VFS) backed by the Cache API. Routes:

- GET /api/fs/read?path=
- POST /api/fs/write
- GET /api/fs/list?prefix=
- DELETE /api/fs/delete?path=
- GET /api/fs/stat?path=

Files written to the cache are immediately live.

## Network Firewall

Provable single outstream. `ALLOWED_OUTSTREAM` in sw.js whitelists: api.anthropic.com, api.openai.com, opencode.ai, js.stripe.com, api.stripe.com. Everything else → 403. Audit log at /api/network-audit.
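A minimal sketch of what an allowlist gate like this could look like. The host list is taken from `ALLOWED_OUTSTREAM` above; the function name `isAllowedOutstream` is illustrative and not the actual sw.js implementation.

```javascript
// Hosts copied from the ALLOWED_OUTSTREAM list described above.
const ALLOWED_OUTSTREAM = new Set([
  "api.anthropic.com",
  "api.openai.com",
  "opencode.ai",
  "js.stripe.com",
  "api.stripe.com",
]);

// Same-origin requests are always allowed (the SW answers them itself);
// cross-origin requests must match the allowlist, otherwise the SW
// would respond with a 403 and log the attempt to the audit endpoint.
function isAllowedOutstream(url, origin) {
  const u = new URL(url, origin);
  if (u.origin === origin) return true;     // same-origin: SW-handled
  return ALLOWED_OUTSTREAM.has(u.hostname); // cross-origin: must be whitelisted
}
```

In a real `fetch` handler this check would sit at the top, so every outgoing request is either served locally, proxied to a whitelisted host, or rejected.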
## Key Files

- sw.js — THE server (routing, caching, firewall, VFS, Big Pickle proxy)
- vfs.js — Virtual filesystem (importScripts, exposes self.VFS)
- index.html — Portfolio + platform nav
- bbs.html + bbs-app.js — xterm.js BBS (message boards, AI SysOp, door games)
- present.html + present-app.js — Three.js 3D avatar + TTS lip sync
- meet.html — P2P WebRTC meeting room
- voice.html — Voice companion (af_heart)
- cameras.html — Camera feed dashboard
- sensors.html — IoT sensor dashboard
- Caddyfile — Reverse proxies + static serving (env vars: $SITE_ROOT, $OPENCODE_HOST, $OLLAMA_HOST)
- build/vscode-web/ — VS Code workbench source (npm install && npm run vendor)
- vessel/loro/web/ — Loro CRDT (vendored WASM)

## OpenCode Integration

OpenCode runs locally (opencode web --port 4097). Caddy proxies /api/oc/* to it. The x-opencode-directory header enables workspace routing. Sessions, prompts, tools, and PTY are all accessible from the browser.

Proven: `opencode run -m local/llama3.2 --dangerously-skip-permissions "create hello.txt"` creates files in the project. The AI edits its own host.

## For LLMs Working On This Codebase

- No npm, no build step (except the VS Code workbench in build/vscode-web/)
- All JS is vanilla ES modules or importScripts
- Vendor third-party libs to assets/lib/ (no CDNs)
- Add new assets to asset-manifest.json
- Bump VERSION in sw.js for breaking changes
- Don't search for API keys — use Ollama (local) or Zen (Bearer public)
- The user values sovereignty, privacy, and zero-infrastructure design
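The two-tier routing split described under Architecture can be sketched as a pure dispatch function. The route tables mirror the lists above; the name `classifyRoute` and its return values are hypothetical, not part of the sw.js API.

```javascript
// Prefixes the SW passes through to Caddy's reverse proxies.
const CADDY_PROXIED_PREFIXES = [
  "/api/ollama/", "/api/oc/", "/api/zen/", "/api/voice/", "/api/streams/",
];

// Page routes and API routes the SW answers itself.
const SW_PAGES = new Set([
  "/meet", "/bbs", "/present", "/code", "/voice", "/cameras", "/sensors",
]);
const SW_API_PREFIXES = ["/api/fs/", "/api/big-pickle", "/api/network-audit"];

function classifyRoute(pathname) {
  // Proxied prefixes are checked first so /api/voice/* goes to the voice
  // server even though /voice itself is an SW-handled page.
  if (CADDY_PROXIED_PREFIXES.some((p) => pathname.startsWith(p))) return "caddy";
  if (SW_PAGES.has(pathname)) return "sw";
  if (SW_API_PREFIXES.some((p) => pathname.startsWith(p))) return "sw";
  return "static"; // everything else: cached static assets
}
```

Checking the proxied prefixes before the SW routes keeps the split unambiguous: a path is either proxied, SW-handled, or served from the static cache.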