100% Local. Private. Yours.

AI-powered CLI command generator designed to run completely locally using Ollama

No data leaves your machine. No telemetry. No cloud dependencies. Just you, your terminal, and AI-powered command generation that respects your privacy.

curl -sSL https://raw.githubusercontent.com/ezcorp-org/ez-term/main/scripts/install.sh | bash

Why Choose ez-term?

🔒 Privacy First

Runs entirely on your machine with Ollama. No data leaves your computer. Zero telemetry. No cloud dependencies.

🧠 Context Aware

Understands your environment, tools, and projects automatically. Generates commands tailored to your actual setup.
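
For example, the same request can yield different commands depending on where you run it. The following is a hypothetical illustration; the actual suggestion depends on your project, installed tools, and the model you use:

ez "show recent changes"
# in a Git repository, this might become: git log --oneline -10
# in a directory of log files, it might instead become: ls -lt *.log | head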

🛡️ Safe by Default

Preview commands before they execute. Defaults to safe, read-only operations and refuses dangerous commands like rm -rf /.
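
A hypothetical session showing the preview step (the exact prompt and output wording may differ from the real tool):

ez "clean up old temp files"
# Proposed command (shown for review, nothing has run yet):
#   find /tmp -type f -mtime +7 -delete
# You confirm before anything executes; destructive requests such as rm -rf / are refused outright.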

How It Works

1. Install

One-line installation. No Rust toolchain required.

curl -sSL https://raw.githubusercontent.com/ezcorp-org/ez-term/main/scripts/install.sh | bash

2. Configure Ollama

Set up local AI in minutes.

curl -fsSL https://ollama.ai/install.sh | sh
ollama pull qwen2.5-coder:latest
ez --set-backend ollama
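
To verify the model downloaded correctly, you can ask Ollama to list the models available locally:

ollama list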

3. Use

Ask in natural language. Get ready-to-run commands.

ez "find large files" ez "show git branches" ez "compress this directory"

See it in Action

Quick Start

1. Install ez-term

curl -sSL https://raw.githubusercontent.com/ezcorp-org/ez-term/main/scripts/install.sh | bash

2. Install Ollama

Visit ollama.ai for installation instructions, then pull a model:

ollama pull qwen2.5-coder:latest

3. Configure ez-term

ez --set-backend ollama
ez --set-model qwen2.5-coder:latest

4. Start using it!

ez "your question here"