
Local AI CLI using the local LLM gpt-oss:20b


easylocai

Fully autonomous agentic workflows running locally: no external APIs, no data leaks

(Demo animation: run_sample.gif)

Overview

Easylocai is an on-device autonomous agent designed for secure, offline task execution. Unlike cloud-dependent assistants, it uses the gpt-oss:20b model to perform complex reasoning and actions entirely on your local machine.

By implementing a sophisticated Plan-Execute-Replan orchestration, Easylocai can decompose ambiguous goals into actionable steps, execute them using Model Context Protocol (MCP) tools, and autonomously refine its strategy based on real-time feedback.
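The Plan → Execute → Replan loop described above can be sketched as follows. This is an illustrative outline, not Easylocai's actual internals: the callables `plan_fn`, `execute_fn`, and `replan_fn` are hypothetical placeholders for the model-backed planning, tool-execution, and replanning steps.

```python
# Minimal sketch of a Plan -> Execute -> Replan loop (hypothetical names,
# not Easylocai's real API).
def run_agent(goal, plan_fn, execute_fn, replan_fn, max_rounds=5):
    plan = plan_fn(goal)                       # decompose the goal into steps
    for _ in range(max_rounds):
        results = [execute_fn(step) for step in plan]
        if all(r["ok"] for r in results):      # every step succeeded
            return results
        plan = replan_fn(goal, plan, results)  # refine the plan from feedback
    raise RuntimeError("goal not achieved within max_rounds")
```

The key design point is the feedback edge: failed step results flow back into `replan_fn`, so the agent revises its strategy instead of blindly retrying the same plan.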

Features

  • Privacy-First Autonomy: 100% local execution using gpt-oss:20b via Ollama. Your code and data never leave your machine.

  • Agentic Orchestration: A robust multi-agent loop (Plan → Execute → Replan) that improves reliability on long-horizon tasks.

  • MCP Tool Integration: Seamlessly connects with Model Context Protocol (MCP) servers to interact with your local file system, terminal, and APIs.

Requirements

To ensure stable performance of the autonomous agent, your system must meet the following criteria:

System Requirements

  • Minimum 16GB RAM (32GB or more recommended for optimal performance)
  • Sufficient disk space for model storage and operation

OS

  • OS: macOS only (other platforms are not supported)

Software Requirements

  • Runtime: Python 3.12 (using pyenv to manage the version is recommended).
  • LLM Engine: Ollama must be installed and running.
    • Model: gpt-oss:20b (Make sure to run ollama pull gpt-oss:20b before starting).
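Before installing, you can sanity-check the environment against the requirements above. The helper below is an illustrative sketch, not part of Easylocai; it only reports problems and does not change anything on your system.

```python
import shutil
import sys

def check_prerequisites():
    """Return a list of human-readable problems with the local setup."""
    problems = []
    if sys.version_info[:2] != (3, 12):
        problems.append(f"Python 3.12 required, found {sys.version.split()[0]}")
    if shutil.which("ollama") is None:
        problems.append("ollama not on PATH; install it, then run: ollama pull gpt-oss:20b")
    return problems

for problem in check_prerequisites():
    print("WARNING:", problem)
```

Note that this cannot verify the gpt-oss:20b model is actually pulled; run `ollama list` yourself to confirm that.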

Installation & Execution

Installation

bash install.sh

Configuration

MCP server configuration

  • File: ~/.config/easylocai/config.json
  • Example:
    {
      "mcpServers": {
        "filesystem": {
          "command": "npx",
          "args": [
            "-y",
            "@modelcontextprotocol/server-filesystem",
            "."
          ]
        },
        "kubernetes": {
          "command": "python",
          "args": [
            "-m",
            "kubectl_mcp_tool.mcp_server"
          ],
          "cwd": "~/Programming/kubectl-mcp-server",
          "env": {
            "KUBECONFIG": "~/.kube/config",
            "KUBECTL_MCP_LOG_LEVEL": "ERROR",
            "PYTHONUNBUFFERED": "1"
          }
        },
        "notion_api": {
          "command": "docker",
          "args": [
            "run",
            "--rm",
            "-i",
            "-e", "NOTION_TOKEN",
            "mcp/notion"
          ],
          "env": {
            "NOTION_TOKEN": "<token>"
          }
        }
      }
    }
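Since a malformed config.json is an easy way to break startup, it can help to validate the file before running easylocai init. The loader below is a hedged sketch (not Easylocai's own code); it assumes only the file path and the mcpServers structure shown in the example above.

```python
import json
import os

def load_mcp_config(path="~/.config/easylocai/config.json"):
    """Load the MCP server config and run minimal sanity checks."""
    with open(os.path.expanduser(path)) as f:
        cfg = json.load(f)  # raises json.JSONDecodeError on invalid JSON
    servers = cfg.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        raise ValueError("config must define a non-empty 'mcpServers' object")
    for name, spec in servers.items():
        if "command" not in spec:
            raise ValueError(f"server {name!r} is missing 'command'")
    return servers
```

Each server entry needs at least a `command`; `args`, `cwd`, and `env` are optional, as in the kubernetes and notion_api examples above.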
    

Initialization

easylocai init

To force re-initialization, use the --force flag:

easylocai init --force

Run

easylocai

Run the beta workflow variant

easylocai --flag=beta

Development

Run without installing

python -m easylocai.run

# Beta variant
python -m easylocai.run --flag=beta

Download files

Download the file for your platform.

Source Distribution

easylocai-0.1.0.tar.gz (496.6 kB, source)

Built Distribution


easylocai-0.1.0-py3-none-any.whl (60.7 kB, Python 3)

File details

Details for the file easylocai-0.1.0.tar.gz.

File metadata

  • Download URL: easylocai-0.1.0.tar.gz
  • Upload date:
  • Size: 496.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.11

File hashes

Hashes for easylocai-0.1.0.tar.gz:

  • SHA256: d92750d0c4a0f3b68d1ba29c18c86b99e431f164cc291ebf606c32ca8075a702
  • MD5: 0f66e55d728a3bfb216512ce0e20a0fd
  • BLAKE2b-256: 1de8958b90256cf43825e9bd8465b8d3536d8e12029f99f7e1d7d3598319e25e


File details

Details for the file easylocai-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: easylocai-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 60.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.11

File hashes

Hashes for easylocai-0.1.0-py3-none-any.whl:

  • SHA256: a518d72a9d35d672d257ff4d00ade5049590c22b5c0d14db42a217091d5a443d
  • MD5: ad2e15c37bd07777d21464f4e9fb05aa
  • BLAKE2b-256: 0ff875ca6e21ebee64b77238af58b15b70647ce8f823230e198211960b871d85

