Nanocoder CLI
by Mote Software
Local-first CLI coding agent for terminal-focused developers
See https://github.com/nicobytes/nanocoder | NPM: @motesoftware/nanocoder
Features
- Multi-Model Support: Ollama (local), OpenRouter, OpenAI-compatible APIs (LM Studio, vLLM)
- MCP Integration: Connect to file systems, GitHub, web search via Model Context Protocol
- Custom Commands: Reusable markdown templates in `.nanocoder/commands/` with parameters (see the sketch after this list)
- Project-Level Config: Per-project `agents.config.json` for models and API keys
- Auto-Accept Mode: Batch operations without user intervention
- Session State: Command history, token usage tracking, real-time thinking display
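A minimal sketch of how the custom-command and project-config features might look on disk. The `.nanocoder/commands/` directory and the `agents.config.json` filename come from the feature list above; the JSON keys, the model name, and the `{{diff}}` placeholder syntax are assumptions for illustration, not the documented schema, so check the project README for the real format.

```sh
# Hypothetical per-project setup. Directory and file names are from the
# feature list; JSON keys and the {{diff}} placeholder are assumptions.
mkdir -p .nanocoder/commands

# A reusable markdown command template with a parameter (placeholder syntax assumed).
cat > .nanocoder/commands/review.md <<'EOF'
Review the following change for bugs, missing error handling, and style issues:

{{diff}}
EOF

# A minimal project-level config pointing at a local Ollama model
# (key names are illustrative, not the documented schema).
cat > agents.config.json <<'EOF'
{
  "provider": "ollama",
  "model": "qwen2.5-coder:14b",
  "apiKeys": {
    "openrouter": "sk-or-..."
  }
}
EOF
```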
Commands
| Command | Purpose |
|---|---|
| /provider | Switch AI provider (Ollama, OpenRouter, etc.) |
| /model | Switch between available models |
| /clear | Start new chat session |
| /debug | Toggle detailed logging |
| /exit | Exit program |
Superpowers
Local-first architecture: run entirely offline with Ollama, keeping code and context on your machine. Ideal for confidential projects and terminal-focused developers who want Claude Code-like capabilities without cloud dependencies. Custom markdown commands enable standardized, repeatable AI assistance across teams.
Pricing
Free and open source. You pay only for cloud API usage (OpenRouter, OpenAI, etc.) if you choose those providers.
Installation
npm install -g @motesoftware/nanocoder
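A typical first run might look like the sketch below. The install command is from this listing; the `nanocoder` binary name is an assumption based on the package name, and running it inside a project directory assumes it picks up a local `agents.config.json` as described in the features above.

```sh
# Install the CLI globally from npm.
npm install -g @motesoftware/nanocoder

# Launch inside a project directory so any per-project agents.config.json
# is found; the `nanocoder` binary name is assumed here.
cd my-project
nanocoder
```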