Three agents now. Ten agents architected. Run entirely on local Ollama, or burst to cloud when you choose. Open source, VS Code-native, no vendor lock-in.
Most AI coding tools force a trade-off: power vs. privacy. Kuma refuses that trade.
Run agents on your local Ollama or LM Studio. Your code, your prompts, your context: none of it leaves your network unless you opt in.
$ kuma --local-only

Planner, Coder, and Reviewer agents work simultaneously on your task, each picking the right model for its job. The architecture supports scaling to 10 agents.
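As a sketch of what per-agent model routing could look like: the agent roles come from the copy above, while the `AgentConfig` shape, the model tags, and the `kuma.agents.ts` filename are illustrative assumptions, not Kuma's actual API.

```typescript
// kuma.agents.ts -- hypothetical sketch of per-agent model routing.
// Agent roles are from the copy above; everything else is assumed.

type AgentRole = "planner" | "coder" | "reviewer";

interface AgentConfig {
  role: AgentRole;
  model: string;       // Ollama model tag this agent runs on
  temperature: number; // higher for open-ended planning, lower for review
}

// Each agent picks a model suited to its job: a general reasoning
// model for planning, a code-tuned model for coding and review.
const agents: AgentConfig[] = [
  { role: "planner",  model: "qwen2.5:7b",       temperature: 0.7 },
  { role: "coder",    model: "qwen2.5-coder:3b", temperature: 0.2 },
  { role: "reviewer", model: "qwen2.5-coder:3b", temperature: 0.0 },
];

export default agents;
```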
3 agents → 10 agents

1GB modular skill packs make a 3B local model perform like a 70B-class model on TypeScript, React, and Vite. Two-level lookup keeps inference fast.
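The two-level lookup might work along these lines: a minimal sketch in which the first level resolves a topic to a pack and the second resolves a skill within it. The pack/skill structure and function names are assumptions, not Kuma's implementation.

```typescript
// Hypothetical two-level skill lookup: resolve the pack by topic,
// then the skill entry within the pack. Two hash probes keep
// retrieval O(1), so it adds negligible latency to inference.

interface Skill {
  name: string;
  prompt: string; // snippet injected into the model's context
}

type SkillPack = Map<string, Skill>;        // level 2: skill name -> skill
const packs = new Map<string, SkillPack>(); // level 1: topic -> pack

// Register a skill under a topic pack, creating the pack if needed.
function addSkill(topic: string, skill: Skill): void {
  let pack = packs.get(topic);
  if (!pack) {
    pack = new Map();
    packs.set(topic, pack);
  }
  pack.set(skill.name, skill);
}

// Lookup never scans the whole 1GB pack: two map probes and done.
function lookup(topic: string, name: string): Skill | undefined {
  return packs.get(topic)?.get(name);
}

addSkill("react", { name: "useEffect-cleanup", prompt: "Return a cleanup function..." });
console.log(lookup("react", "useEffect-cleanup")?.name); // "useEffect-cleanup"
```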
3B model → 70B output

Your agents learn from your patterns whether you're running local Ollama or bursting to Ollama Cloud. Switch providers; the loop persists.
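One way a provider switch could leave the learning loop untouched: store learned patterns keyed by project rather than by provider, so swapping backends changes only the endpoint. This is a sketch under assumed names and URLs; Kuma's real interfaces may differ.

```typescript
// Hypothetical provider abstraction: the learning store lives
// outside the provider, so switching backends never resets it.

interface Provider {
  name: string;
  baseUrl: string; // where completion requests are sent
}

const localOllama: Provider = { name: "ollama",       baseUrl: "http://localhost:11434" };
const ollamaCloud: Provider = { name: "ollama-cloud", baseUrl: "https://ollama.com" };

// Learned patterns are keyed by project, not by provider.
const learnedPatterns = new Map<string, string[]>();

function recordPattern(project: string, pattern: string): void {
  const list = learnedPatterns.get(project) ?? [];
  list.push(pattern);
  learnedPatterns.set(project, list);
}

let active: Provider = localOllama;

// Bursting to cloud swaps only the endpoint; learnedPatterns persists.
function switchProvider(p: Provider): void {
  active = p;
}

recordPattern("my-app", "prefer named exports");
switchProvider(ollamaCloud);
console.log(active.name, learnedPatterns.get("my-app")); // "ollama-cloud" ["prefer named exports"]
```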
Local + Cloud

Kuma is designed for teams in regulated industries (finance, healthcare, defense, government) where source code can't leave the corporate network. A local-only configuration has zero required outbound connections.
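A local-only setup might look something like the config.ts below. This is a hedged sketch: every field name here is an assumption, chosen to illustrate that nothing in the configuration points outside your network.

```typescript
// config.ts -- hypothetical local-only Kuma configuration.
// Field names are illustrative; the point is that no outbound
// endpoint appears anywhere when localOnly is set.

interface KumaConfig {
  localOnly: boolean; // hard-disable all outbound connections
  provider: "ollama" | "lmstudio";
  endpoint: string;   // must resolve inside your network
  telemetry: false;   // no usage data leaves the machine
}

const config: KumaConfig = {
  localOnly: true,
  provider: "ollama",
  endpoint: "http://localhost:11434",
  telemetry: false,
};

export default config;
```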
We provide data-flow diagrams and control mappings on request. Reach out and tell us about your environment, and we'll work through the specifics with you.
Contact us → kuma@zerosec-ai.tech

Kuma Code stands on the shoulders of three open-source projects with 57K+ combined GitHub stars. Each ancestor contributed something we kept and built on. Read the full ATTRIBUTION for technical detail and credit.