An integration between the Ollama LLM framework and Haystack
Project description
ollama-haystack
Table of Contents
- Installation
- License
- Testing
Installation
pip install ollama-haystack
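Once installed, the package exposes Ollama components for Haystack pipelines. The snippet below is a minimal sketch, not the package's documented quickstart: it assumes the import path haystack_integrations.components.generators.ollama, a locally running Ollama server at the default http://localhost:11434, and that the orca-mini model has already been pulled.

# Minimal sketch: generate text with a local Ollama model through Haystack.
# Assumes Ollama is running locally and `orca-mini` has been pulled
# (`ollama pull orca-mini`). Parameter names may differ between versions.
from haystack_integrations.components.generators.ollama import OllamaGenerator

generator = OllamaGenerator(
    model="orca-mini",             # any model available to your Ollama server
    url="http://localhost:11434",  # default local Ollama endpoint
)

result = generator.run(prompt="In one sentence, what is Haystack?")
print(result["replies"][0])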
License
ollama-haystack is distributed under the terms of the Apache-2.0 license.
Testing
To run the tests, first start a Docker container running Ollama and pull a model for integration testing. It's recommended to use the smallest model possible for testing purposes - see https://ollama.ai/library for a list of models that Ollama supports:
docker run -d -p 11434:11434 --name ollama ollama/ollama:latest
docker exec ollama ollama pull <your model here>
Then run tests:
hatch run test
The default models used here are orca-mini for generation and nomic-embed-text for embeddings.
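Before invoking hatch run test, it can help to confirm that the Ollama container is reachable and that the required models have been pulled. The following is a small sketch against Ollama's /api/tags endpoint, which lists locally available models; the endpoint URL and model names are assumptions based on the defaults above.

# Sketch: verify the Ollama server is up and the test models are present.
# Assumes the container is exposed on the default port 11434.
import json
import urllib.request

OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"  # lists locally pulled models
REQUIRED = {"orca-mini", "nomic-embed-text"}         # defaults used by the tests

with urllib.request.urlopen(OLLAMA_TAGS_URL) as response:
    tags = json.load(response)

available = {m["name"].split(":")[0] for m in tags.get("models", [])}
missing = REQUIRED - available
if missing:
    raise SystemExit(f"Missing Ollama models: {', '.join(sorted(missing))}")
print("Ollama is running and the required models are available.")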
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
ollama_haystack-1.0.0.tar.gz (16.5 kB)

Built Distribution
ollama_haystack-1.0.0-py3-none-any.whl (14.5 kB)
File details
Details for the file ollama_haystack-1.0.0.tar.gz.
File metadata
- Download URL: ollama_haystack-1.0.0.tar.gz
- Upload date:
- Size: 16.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: python-httpx/0.27.2
File hashes
Algorithm | Hash digest
---|---
SHA256 | 3414fc57d2f5a193b7527a73dacc940e76d110dcf78603fa54dbd402152d74fb
MD5 | d933dc81d1bfa16cdedde4e0005df240
BLAKE2b-256 | a9f3d0691d2cae33a7ca1218702d1ae6cad62af64ceddad332ad07c2f827c73f
File details
Details for the file ollama_haystack-1.0.0-py3-none-any.whl.
File metadata
- Download URL: ollama_haystack-1.0.0-py3-none-any.whl
- Upload date:
- Size: 14.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: python-httpx/0.27.2
File hashes
Algorithm | Hash digest
---|---
SHA256 | bd539218c753c8d672fbaba3343ae606edf4631caf951e46fd02a53db6fb2515
MD5 | a08d3a112fcedf636a00180b28ca92bb
BLAKE2b-256 | b369f944f9b4c9601fbd51032e181e9324057d78f65668da71a824a8ce63d7d1
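To confirm that a downloaded distribution matches the hashes listed above, you can recompute the digest locally. This is a generic sketch using Python's standard hashlib; the file path is a placeholder for wherever you saved the download, and the expected value is the SHA256 published for the source distribution above.

# Sketch: recompute a file's SHA256 and compare it with the published digest.
import hashlib

EXPECTED_SHA256 = "3414fc57d2f5a193b7527a73dacc940e76d110dcf78603fa54dbd402152d74fb"

digest = hashlib.sha256()
with open("ollama_haystack-1.0.0.tar.gz", "rb") as f:  # path to your download
    for chunk in iter(lambda: f.read(8192), b""):
        digest.update(chunk)

assert digest.hexdigest() == EXPECTED_SHA256, "Hash mismatch - do not install this file"
print("SHA256 verified.")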