Scikit-Ollama: an extension of Scikit-LLM for Ollama served models.


Leverage the power of Scikit-LLM and the security of self-hosted LLMs.

Installation

pip install scikit-ollama

Support us

You can support the project in the following ways:

  • Support the original Scikit-LLM package. New features will be made available downstream slowly but surely.
  • Star this repository.
  • Provide feedback in the issues section.
  • Share this repository with others.

Quick start & documentation

Assuming you have installed and configured Ollama to run on your machine:

from skllm.datasets import get_classification_dataset
from skollama.models.ollama.classification.zero_shot import ZeroShotOllamaClassifier

# Load a small demo text-classification dataset
X, y = get_classification_dataset()

# Zero-shot classification with a locally served Ollama model
clf = ZeroShotOllamaClassifier(model="llama3:8b")
clf.fit(X, y)  # zero-shot: fit only registers the candidate labels
preds = clf.predict(X)

For more information please refer to the documentation.

Why Scikit-Ollama?

Scikit-Ollama lets you use locally run models for several text classification approaches. Running models locally is beneficial when data privacy and control are paramount; it also reduces your dependence on third-party APIs and gives you more control over when to adopt updates.

This project builds heavily on Scikit-LLM and has it as a core dependency. Scikit-LLM provides broad, high-quality support for querying a variety of backend families, e.g. OpenAI, Vertex, and GPT4All. It already lets you query locally run models through Ollama's OpenAI-compatible v1 API backend. However, Ollama does not support passing options, such as the context size, to that endpoint.

Scikit-Ollama therefore uses the Ollama Python SDK, which does allow that level of control.
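To illustrate the difference, here is a minimal sketch (not Scikit-Ollama's internals) of how the Ollama Python SDK accepts per-request options such as the context size, which the OpenAI-compatible v1 endpoint does not. The model name, prompt, and option values are illustrative assumptions; the SDK import is guarded so the sketch stays importable without a running server:

```python
# Sketch: passing Ollama options via the Python SDK (assumed example values).
try:
    import ollama  # pip install ollama
except ImportError:  # keep the sketch importable without the SDK installed
    ollama = None

# Options that cannot be passed through the OpenAI-compatible v1 endpoint,
# but can be set per request via the SDK:
OPTIONS = {"num_ctx": 8192, "temperature": 0.0}

def classify(text: str, model: str = "llama3:8b") -> str:
    """Send one classification prompt with explicit Ollama options set."""
    if ollama is None:
        raise RuntimeError("the 'ollama' package is not installed")
    response = ollama.chat(
        model=model,
        messages=[{"role": "user",
                   "content": f"Label the sentiment (positive/negative): {text}"}],
        options=OPTIONS,
    )
    return response["message"]["content"]
```

Calling `classify("I love this product!")` requires a local Ollama server with the model pulled; the point of the sketch is only that `options` travels with the request, which is what Scikit-Ollama relies on.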

Contributing

For a guide to contributing please follow the steps here.

Citation

@software{Scikit-Ollama,
    author = {Andreas Karasenko},
    year = {2024},
    title = {Scikit-Ollama: an extension of Scikit-LLM for Ollama served models},
    url = {https://github.com/AndreasKarasenko/scikit-ollama}
}

If you cite this repository, please also consider citing Scikit-LLM.
