1. Install ez-term

```bash
curl -sSL https://raw.githubusercontent.com/ezcorp-org/ez-term/main/scripts/install.sh | bash
```

2. Install Ollama and pull a model

```bash
curl -fsSL https://ollama.ai/install.sh | sh
ollama pull qwen2.5-coder:latest
```

3. Configure ez-term

```bash
ez --set-backend ollama
ez --set-model qwen2.5-coder:latest
```

4. Start using it!

```bash
ez "your question here"
```

AI-powered CLI command generator designed to run completely locally using Ollama.
No data leaves your machine. No telemetry. No cloud dependencies. Just you, your terminal, and AI-powered command generation that respects your privacy.
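The local-only claim rests on Ollama's HTTP API, which listens on the loopback interface by default. As a minimal sketch of what a request to that backend looks like (the endpoint and payload fields follow Ollama's documented API; how ez-term talks to it internally is an assumption here):

```python
import json

# Ollama's default API address -- loopback only, so prompts never
# leave the machine. (Per Ollama's docs; ez-term's internals may differ.)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "qwen2.5-coder:latest") -> dict:
    """Build the JSON payload a local backend would POST to Ollama."""
    return {
        "model": model,
        "prompt": f"Generate a single shell command for: {prompt}",
        "stream": False,  # ask for one complete response instead of a stream
    }

payload = build_request("find large files")
print(json.dumps(payload, indent=2))
# Actually sending it requires a running Ollama daemon, e.g.:
#   urllib.request.urlopen(OLLAMA_URL, json.dumps(payload).encode())
```

Because the URL is hard-pinned to `localhost`, there is no code path that could send a prompt to a remote service.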
Runs entirely on your machine with Ollama. No data leaves your computer. Zero telemetry. No cloud dependencies.
Understands your environment, tools, and projects automatically. Generates commands tailored to your actual setup.
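Environment detection of this kind typically comes down to a few cheap local probes. A hypothetical sketch using only the standard library (ez-term's actual detection logic is not documented here):

```python
import os
import platform
import shutil

def gather_context(path: str = ".") -> dict:
    """Collect hints a command generator could use to tailor output.
    Illustrative only -- not ez-term's real implementation."""
    # Marker files that identify a project type.
    markers = {
        "Cargo.toml": "rust",
        "package.json": "node",
        "pyproject.toml": "python",
        "go.mod": "go",
    }
    project_types = [lang for fname, lang in markers.items()
                     if os.path.exists(os.path.join(path, fname))]
    return {
        "os": platform.system(),                        # e.g. "Linux", "Darwin"
        "shell": os.environ.get("SHELL", "unknown"),
        "has_git": shutil.which("git") is not None,     # is git on PATH?
        "project_types": project_types,
    }

print(gather_context())
```

Feeding a summary like this into the prompt is what lets the model suggest `cargo build` in a Rust repo but `npm run build` in a Node one.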
Preview every command before it executes. Defaults to safe, read-only operations. Refuses dangerous commands like `rm -rf /`.
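A refusal policy like this usually reduces to a pattern check before anything runs. A minimal sketch assuming a simple deny-list (illustrative only, not ez-term's actual rules):

```python
import re

# Patterns a cautious generator might refuse outright.
# Hypothetical deny-list -- not ez-term's real rule set.
DANGEROUS = [
    r"\brm\s+(-[a-zA-Z]*r[a-zA-Z]*f|-[a-zA-Z]*f[a-zA-Z]*r)\s+/(\s|$)",  # rm -rf /
    r"\bmkfs(\.\w+)?\s",      # reformat a filesystem
    r"\bdd\s+.*of=/dev/sd",   # overwrite a raw disk
    r">\s*/dev/sd",           # redirect onto a raw disk
]

def is_dangerous(command: str) -> bool:
    """Return True if the command matches a known-destructive pattern."""
    return any(re.search(p, command) for p in DANGEROUS)

print(is_dangerous("rm -rf /"))   # → True
print(is_dangerous("ls -la"))     # → False
```

The preview step then shows the generated command and only runs it after explicit confirmation, so even a command that slips past the deny-list still requires a human yes.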
One-line installation. No Rust required.

```bash
curl -sSL https://raw.githubusercontent.com/ezcorp-org/ez-term/main/scripts/install.sh | bash
```

Set up local AI in minutes.

```bash
curl -fsSL https://ollama.ai/install.sh | sh
ollama pull qwen2.5-coder:latest
ez --set-backend ollama
```

Ask in natural language. Get perfect commands.

```bash
ez "find large files"
ez "show git branches"
ez "compress this directory"
```