A terminal-based interface for interacting with large language models (LLMs)

Project description

NoLlama

NoLlama is a terminal-based interface for Google's Gemini API. Inspired by Ollama, it provides a streamlined experience for chatting with Gemini models such as Gemini 2.0 Flash, Gemini 2.5 Flash Preview, and Gemini 2.5 Pro Preview. Support for Ollama, Groq, and OpenRouter will be added soon.

NoLlama offers a clean terminal interface for powerful language models that are impractical to run locally, complete with colorful Markdown rendering, multiple model choices, and low memory usage.

[NoLlama screenshot]

Features

  • Google Gemini Models: Access to powerful models like Gemini 2.0 Flash, Gemini 2.5 Flash Preview, and Gemini 2.5 Pro Preview.
  • Multi-turn Conversations: Maintain context between prompts for more coherent conversations.
  • Neat Terminal UI: Enjoy a clean and intuitive interface for your interactions.
  • Live Streaming Responses: Watch responses appear in real-time as they're generated.
  • Colorful Markdown Rendering: Rich text formatting and syntax highlighting in your terminal.
  • Low Memory Usage: Efficient memory management makes it lightweight compared to using a browser.
  • Easy Model Switching: Simply type model in the chat to switch between models.
  • Clear Chat History: Type clear to clear the chat history.
  • Exit Commands: Type q, quit, or exit to leave the chat, or use Ctrl+C or Ctrl+D.

Setup

  1. API Key Configuration:

    Create a .nollama file in your home directory with your Gemini API key:

    echo "GEMINI=your_api_key_here" > ~/.nollama
    

    You can get a free API key from Google AI Studio.

  2. Installation:

    a. Install from PyPI (recommended):

    pip install nollama
    

    b. Or clone and install from source:

    git clone https://github.com/spignelon/nollama.git
    cd nollama
    pip install -e .
    
  3. Run NoLlama:

    nollama
    
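If NoLlama reports a missing API key at startup, a quick way to sanity-check the configuration is to confirm that the GEMINI entry exists in the key file. The sketch below writes to a throwaway path purely for demonstration; for a real setup the file is ~/.nollama, created as shown in step 1.

```shell
# Demonstration with a throwaway path; for a real setup the file is ~/.nollama.
demo_file=/tmp/nollama_demo_key
echo "GEMINI=your_api_key_here" > "$demo_file"

# The same grep against ~/.nollama tells you whether the key entry is present.
if grep -q '^GEMINI=' "$demo_file"; then
    echo "key entry found"
else
    echo "key entry missing"
fi
```

Running the same grep against ~/.nollama (instead of the demo path) checks your actual configuration.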

Usage

  • Select a Model: At startup, choose from available Gemini models.
  • Chat Normally: Type your questions and see the responses with rich formatting.
  • Switch Models: Type model in the chat to choose a different model.
  • Clear Chat: Type clear to clear the chat history.
  • Exit: Type q, quit, or exit to leave the chat, or press Ctrl+C or Ctrl+D.

Todos

  • Add context window
  • Web interface
  • Add support for Groq
  • Add support for OpenRouter
  • Add support for Ollama API
  • Support for custom APIs

Contribution

Contributions are welcome! If you have suggestions for new features or improvements, feel free to open an issue or submit a pull request.

Disclaimer

NoLlama is not affiliated with Ollama. It is an independent project inspired by the concept of providing a neat terminal interface for interacting with language models.

License

This project is licensed under the GPL-3.0 License.


Download files

Download the file for your platform.

Source Distribution

nollama-0.4.tar.gz (18.2 kB)

Uploaded Source

Built Distribution


nollama-0.4-py3-none-any.whl (18.2 kB)

Uploaded Python 3

File details

Details for the file nollama-0.4.tar.gz.

File metadata

  • Download URL: nollama-0.4.tar.gz
  • Upload date:
  • Size: 18.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.10

File hashes

Hashes for nollama-0.4.tar.gz
  • SHA256: 68e9759fbac8caad0222b38886bf988c308569984bae776a47156801cdc1e8aa
  • MD5: 3a8f9aa7d689390ca508310f0a8df13e
  • BLAKE2b-256: 19ca1bc1bf5b9cae126acedfeac1bed51740d9d062a78eef2af64986bb165071

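To verify a downloaded archive against the digests above, compute its SHA256 locally and compare. The sketch below assumes sha256sum from GNU coreutils and, so that it is self-contained, hashes a small known string instead of the actual archive:

```shell
# For a real download you would run:
#   sha256sum nollama-0.4.tar.gz
# and compare the printed digest with the SHA256 value listed above.
# Demonstration of the same pipeline on a known input:
printf 'hello' | sha256sum | cut -d' ' -f1
# → 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```

pip performs an equivalent check automatically when hashes are pinned in a requirements file.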

File details

Details for the file nollama-0.4-py3-none-any.whl.

File metadata

  • Download URL: nollama-0.4-py3-none-any.whl
  • Upload date:
  • Size: 18.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.10

File hashes

Hashes for nollama-0.4-py3-none-any.whl
  • SHA256: 2eeee53e32d137e285e850e8c90a07457d7eb824e62680d69bb41fbe657e8735
  • MD5: b43d483c16e380afda989af22152c747
  • BLAKE2b-256: a0604a3e7bfe1956b47d6d5857bcc389d15d0257e8ce38741312610f306deed5

