Local Ollama CLI

Lolla

Description

Lolla is a simple, easy-to-use, lightweight command-line tool, written in Python, for working with Ollama's API. It is designed for terminal use and lets you run AI inference through an Ollama server.
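Under the hood, a CLI like this talks to Ollama's HTTP API. The sketch below shows the kind of request involved, using only the standard library; it assumes an Ollama server running on its default port (11434) and uses an illustrative model name, not anything specific to Lolla:

```python
import json
import urllib.request

# Ollama's default local endpoint for single-shot text generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    # stream=False asks the server for one complete JSON reply
    # instead of a stream of partial tokens.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to a running Ollama server and return its reply."""
    data = json.dumps(build_payload(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

For example, `generate("Why is the sky blue?")` would return the model's answer as a string, provided a server is running locally.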

Installation

To install Lolla, simply run the following command:

pip install --user lolla

or with an isolated environment:

python -m venv ~/bin/lolla_venv                # create a dedicated virtual environment
~/bin/lolla_venv/bin/pip install lolla         # install Lolla into it

ln -s ~/bin/lolla_venv/bin/lolla ~/bin/lolla   # expose the executable on your PATH

Usage

To use Lolla, run the following command:

lolla --help

Contributing

To contribute to Lolla, fork the repository and open a pull request. Please include a detailed description of your changes. Here is what I will check during the review:

  • CHANGELOG.md has been updated (required)
  • The lint score has not decreased (required)
  • The test coverage has not decreased (required)
  • The documentation has been updated (if applicable)
  • Tests have been added (optional)

Development

This repository uses Taskfile to manage the development tasks. To see the available tasks, run the following command:

task --list
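Tasks are declared in a Taskfile.yml at the repository root. A minimal sketch of what such a file looks like; the task names and commands here are illustrative, not Lolla's actual tasks:

```yaml
version: '3'

tasks:
  lint:
    desc: Run the linter behind the lint-score check
    cmds:
      - pylint lolla
  test:
    desc: Run the test suite with coverage
    cmds:
      - pytest --cov=lolla
```

With a file like this in place, `task lint` or `task test` runs the corresponding commands.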
