
LM Code

LM Code is a powerful AI coding assistant for your terminal, leveraging the OpenRouter API and supporting multiple LLM models like Qwen, DeepSeek, and Gemini. With LM Code, you can interactively work on coding tasks, automate file operations, and improve your workflow directly from the command line.


Features

  • Interactive CLI with AI Assistance:
    • Chat with AI models for coding advice, file management, and more.
    • Markdown rendering for improved readability.
  • Multi-Model Support:
    • Qwen, DeepSeek, Gemini, and more.
  • Automated Tool Usage:
    • File operations: view, edit, list, grep, glob.
    • Directory operations: ls, tree, create_directory.
    • System commands: bash.
    • Quality checks: linting, formatting.
    • Test running: pytest and similar tools.
  • Customizable Configurations:
    • Easily set default models and API keys.

Installation

Method 1: Install from PyPI (Recommended)

pip install lmcode

Method 2: Install from Source

# Clone the repository
git clone https://github.com/Panagiotis897/lm-code.git
cd lm-code

# Install the package
pip install -e .

Setup

Before using LM Code, you need to set up your OpenRouter API key.

Configure API Key

lmcode setup YOUR_OPENROUTER_API_KEY

This saves your API key in the configuration file located at ~/.config/gemini-code/config.yaml.
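As a rough illustration only (the exact keys are an assumption, not documented by the project), the configuration file might look like:

```yaml
# Hypothetical layout of ~/.config/gemini-code/config.yaml
api_key: YOUR_OPENROUTER_API_KEY
default_model: qwen/qwen-2.5-coder-32b-instruct:free
```

Running `lmcode setup` again overwrites the stored key, so you rarely need to edit this file by hand.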


Usage

Start an Interactive Session

# Start with the default model
lmcode

# Start with a specific model
lmcode --model qwen/qwen-2.5-coder-32b-instruct:free

Manage Models

# Set a default model
lmcode set-default-model deepseek/deepseek-r1:free

# List all available models
lmcode list-models

Supported Models

  • Qwen 2.5 Coder 32B: qwen/qwen-2.5-coder-32b-instruct:free
  • Qwen QWQ 32B: qwen/qwq-32b:free
  • DeepSeek R1: deepseek/deepseek-r1:free
  • Gemma 3 27B (instruction-tuned): google/gemma-3-27b-it:free
  • Gemini 2.5 Pro Experimental: google/gemini-2.5-pro-exp-03-25:free

Interactive Commands

During an interactive session, you can use these commands:

  • /exit: Exit the session.
  • /help: Display help information.

How It Works

LM Code uses native tools to enhance your coding experience. For instance:

  1. You ask: "What files are in the current directory?"
  2. LM Code uses the ls tool to fetch directory contents.
  3. The assistant formats and presents the response.

This seamless integration of tools and AI makes LM Code a powerful coding partner.
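The request–tool–response loop above can be sketched as follows. The names (`handle_tool_call`, the `TOOLS` registry, `ls_tool`) are illustrative assumptions for this sketch, not LM Code's actual internals:

```python
import os

def ls_tool(path="."):
    """Return a newline-separated listing of `path` for the model to read."""
    return "\n".join(sorted(os.listdir(path)))

# Registry mapping tool names the model may request to local handlers
# (hypothetical; LM Code's real dispatch may differ).
TOOLS = {"ls": ls_tool}

def handle_tool_call(name, **kwargs):
    """Run the requested tool and return its text output for the model."""
    tool = TOOLS.get(name)
    if tool is None:
        return f"unknown tool: {name}"
    return tool(**kwargs)
```

In this design the model never touches the filesystem directly; it only requests named tools, and the CLI decides what each name is allowed to do.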


Development

LM Code is under active development. Contributions, feature requests, and feedback are welcome!

Recent Changes

v0.1.0

  • Rebranded from Gemini to LM Code.
  • Integrated OpenRouter's Qwen model as the default.
  • Added multi-model support for Qwen, DeepSeek, and Gemini.
  • Overhauled CLI commands (gemini → lmcode).
  • Enhanced configuration to support multiple API providers.

Known Issues

  • If you used earlier versions, you might need to delete your old configuration:
    rm -rf ~/.config/gemini-code
    

License

MIT License
