ollama-haystack

An integration between the Ollama LLM framework and Haystack
Installation
pip install ollama-haystack
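Once installed, the package's generator component can be used from a Haystack pipeline. A minimal sketch, assuming an Ollama server is already running on the default port; the import path and constructor parameters shown here follow the Haystack integration documentation and may differ between releases, so treat them as illustrative:

```python
# Minimal sketch, assuming an Ollama server on localhost:11434 with the
# orca-mini model pulled. Import path and parameters follow the Haystack
# integration docs and may differ for this particular release.
from haystack_integrations.components.generators.ollama import OllamaGenerator

generator = OllamaGenerator(model="orca-mini", url="http://localhost:11434")
result = generator.run(prompt="Why is the sky blue?")
print(result["replies"][0])
```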
License
ollama-haystack is distributed under the terms of the Apache-2.0 license.
Testing
To run the tests, first start a Docker container running Ollama and pull a model for integration testing. It's recommended to use the smallest model possible for testing purposes; see https://ollama.ai/library for a list of models that Ollama supports.
docker run -d -p 11434:11434 --name ollama ollama/ollama:latest
docker exec ollama ollama pull <your model here>
Then run tests:
hatch run test
The default model used by the tests is orca-mini.
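With the container running and a model pulled, the server can also be smoke-tested directly over Ollama's HTTP API, independently of the Haystack test suite. A minimal sketch using only the Python standard library; the `/api/generate` endpoint, the `stream` flag, and the `response` field are part of Ollama's documented REST API, and the port is the one published by the `docker run` command above:

```python
import json
import urllib.request

# Default port published by the `docker run -p 11434:11434` command above.
OLLAMA_URL = "http://localhost:11434"

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body expected by Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """POST a prompt to the local Ollama server and return the reply text."""
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires the running container and a pulled model):
# print(generate("orca-mini", "Why is the sky blue?"))
```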
Download files
Source Distribution

ollama_haystack-0.0.4.tar.gz (14.7 kB)
Built Distribution
Hashes for ollama_haystack-0.0.4-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 193f547e84c45c9aee9234825957edf9a0a6cb18ee6064706f1d92e43fc6ec63
MD5 | 36c015764b952b8ed0f5ffd186e0a8f4
BLAKE2b-256 | a80f233dac1d5059428151e4f6bfadaa753397288ac8f7d28683792d9385c289