# llmrouter
An LLM routing proxy that classifies each incoming request and forwards it to a provider such as `anthropic`, `openai`, or a local `ollama` model. To set it up:

- Clone the repository: `git clone https://github.com/alexrudloff/llmrouter.git`
- Pull a local model: `ollama pull qwen2.5:3b`
- Set `ANTHROPIC_API_KEY` so the Anthropic provider can be used
- Start the proxy: `python server.py`
- Point your client at the proxy by editing `~/.openclaw/openclaw.json`
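The classify-and-forward idea can be sketched as follows. This is a minimal illustration, not llmrouter's actual code: the heuristic, the route table, and the function names are all assumptions (only the provider names and the default ollama port come from common usage).

```python
# Hypothetical sketch of classification-based routing.
# The heuristic and endpoints below are illustrative assumptions,
# not llmrouter's real implementation.

ROUTES = {
    "anthropic": "https://api.anthropic.com",
    "openai": "https://api.openai.com",
    "ollama": "http://localhost:11434",  # ollama's default local port
}


def classify(prompt: str) -> str:
    """Pick a provider for a prompt with a toy heuristic:
    long or code-related prompts go to a hosted model,
    everything else stays on the local ollama model."""
    if len(prompt.split()) > 50 or "code" in prompt.lower():
        return "anthropic"
    return "ollama"


def route(prompt: str) -> str:
    """Return the base URL the request would be forwarded to."""
    return ROUTES[classify(prompt)]
```

A real router would then replay the request body against the chosen endpoint, attaching the matching credential (e.g. `ANTHROPIC_API_KEY`) for hosted providers.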