# slopweb

A local browser shell where AI generates every page as live HTML.
```shell
npm install -g slopweb
slopweb
```

Pick a local model or Codex in the launcher, then open the printed local URL.
## Local models and custom endpoints
Slopweb detects common local runtimes and OpenAI-compatible APIs, including Ollama, LM Studio, llama.cpp/llamafile, vLLM, SGLang, Jan, text-generation-webui, and KoboldCpp.
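Before launching, you can check which of these runtimes are reachable by probing their usual default ports with curl. This is a sketch, not part of slopweb itself, and the ports listed are each runtime's common default — an assumption about an unmodified install:

```shell
# Probe the default local ports of common OpenAI-compatible runtimes.
# Ports are each runtime's usual default (an assumption; adjust for your setup).
probe() {
  for entry in "Ollama:11434" "LM Studio:1234" "llama.cpp:8080" "vLLM:8000"; do
    name=${entry%:*}
    port=${entry##*:}
    if curl -sf --max-time 1 "http://localhost:$port/v1/models" >/dev/null 2>&1; then
      echo "$name: reachable on port $port"
    else
      echo "$name: not detected on port $port"
    fi
  done
}
probe
```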
```shell
slopweb models
slopweb --base-url http://localhost:11434/v1 --model llama3.2
```

Custom provider definitions can live in `~/.slopweb/models.json`.
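An entry for a custom provider might look like the sketch below. The field names (`name`, `baseUrl`, `model`, `apiKey`) are illustrative assumptions, not slopweb's documented schema — check the project's documentation for the actual format:

```json
[
  {
    "name": "my-vllm",
    "baseUrl": "http://localhost:8000/v1",
    "model": "Qwen/Qwen2.5-7B-Instruct",
    "apiKey": "not-needed"
  }
]
```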
## Server options
```shell
slopweb --port 9000
slopweb --strict-port
slopweb --lan
slopweb --no-picker
```

The HTTP API is localhost-only by default. Use `--lan` only when you intentionally want LAN access.
## Run from source
```shell
pnpm install
pnpm start
pnpm run check
```

## License

MIT


