
RunLlama

Simple CLI tool for Ollama (LLM) models.

Installation

Install via pip:

pip install runllama

Or with Poetry:

poetry add runllama

Run the CLI:

runllama --help
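
The CLI drives a local Ollama server. For reference, the same kind of request can be made directly against Ollama's REST API; a minimal sketch in Python (assuming the server is on the default port 11434 and a model such as llama3 has already been pulled -- the model name here is illustrative):

import json
import urllib.request

# Ollama's /api/generate endpoint; "llama3" is an assumption -- use any
# model you have pulled locally.
payload = {"model": "llama3", "prompt": "Why is the sky blue?", "stream": False}
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["response"])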

For Development

Build and run the Ollama server Docker container:

docker build -t runllama .
docker run -d -p 11434:11434 -v ollama:/app/.ollama --name runllama runllama 
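
Once the container is up, you can verify the server is reachable before moving on; a quick check in Python (Ollama's root endpoint answers with a short status message when healthy):

import urllib.request

# The Ollama server's root endpoint returns "Ollama is running" when up.
with urllib.request.urlopen("http://localhost:11434") as resp:
    print(resp.status, resp.read().decode())  # expect: 200 Ollama is running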

Install dependencies:

poetry install
poetry shell

Run the Python script:

python src/main.py --help

Testing

Before running pytest, ensure the Ollama server is running on port 11434.

pytest -v -s tests/test_*
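
If the server may not always be available, a conftest.py fixture can skip the suite cleanly instead of failing on connection errors. A sketch (the fixture name and structure are illustrative, not part of the project):

import urllib.error
import urllib.request

import pytest

@pytest.fixture(scope="session", autouse=True)
def require_ollama():
    # Skip the test session if nothing is listening on the default port.
    try:
        urllib.request.urlopen("http://localhost:11434", timeout=2)
    except urllib.error.URLError:
        pytest.skip("Ollama server is not running on port 11434")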

License

The source code is licensed under the MIT License (see LICENSE).
