
An integration between the Ollama LLM framework and Haystack

Project description

ollama-haystack



Table of Contents

Installation

pip install ollama-haystack
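Once installed, the package provides an Ollama generator component for Haystack pipelines. As a rough sketch of what such a component does under the hood, it posts the prompt to a local Ollama server's `/api/generate` endpoint. The endpoint and payload fields below follow the public Ollama REST API; the helper function is hypothetical, not part of this package's API:

```python
import json

# Hypothetical helper: build the HTTP request an Ollama generator sends to a
# local Ollama server. Endpoint and payload shape per the public Ollama REST
# API; the function name and defaults are illustrative assumptions.
def build_generate_request(model, prompt, base_url="http://localhost:11434"):
    url = f"{base_url}/api/generate"
    payload = {"model": model, "prompt": prompt, "stream": False}
    return url, json.dumps(payload).encode("utf-8")

url, body = build_generate_request("orca-mini", "Why is the sky blue?")
print(url)  # http://localhost:11434/api/generate
```

With `"stream": False` the server returns a single JSON object instead of a stream of partial responses, which is the simpler mode for a one-shot generator.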

License

ollama-haystack is distributed under the terms of the Apache-2.0 license.

Testing

To run the tests, first start a Docker container running Ollama and pull a model for integration testing. It's recommended to use the smallest model possible for testing purposes; see https://ollama.ai/library for the models Ollama supports.

docker run -d -p 11434:11434 --name ollama ollama/ollama:latest
docker exec ollama ollama pull <your model here>

Then run tests:

hatch run test

The default model used in the tests is orca-mini.
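The integration tests exercise the generator against that local server. For reference, a non-streaming `/api/generate` call returns a JSON object whose `response` field holds the generated text (field names per the public Ollama REST API). A minimal, hypothetical parser for such a body:

```python
import json

# Minimal sketch: extract the generated text from a non-streaming Ollama
# /api/generate response body. Field names follow the public Ollama REST
# API; the helper name is an illustrative assumption, not this package's API.
def parse_generate_response(raw: bytes) -> str:
    data = json.loads(raw)
    return data["response"]

# Example with a fabricated response body of the documented shape:
sample = json.dumps(
    {"model": "orca-mini", "response": "The sky appears blue.", "done": True}
).encode("utf-8")
print(parse_generate_response(sample))  # The sky appears blue.
```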

Download files

Download the file for your platform.

Source Distribution: ollama_haystack-0.0.6.tar.gz (16.6 kB)

Built Distribution: ollama_haystack-0.0.6-py3-none-any.whl (14.8 kB)
