
Genie Flow Invoker Ollama


Ollama Invokers


This package contains Genie Flow invokers for different Ollama invocations.

Installing the ollama invoker

pip install genie-flow-invoker-ollama

Installing Ollama

To run models locally, you'll need to install Ollama. We recommend using the official native installer for your platform:

Install Ollama

This will set up the Ollama runtime and make the ollama command available in your terminal.

Once installed, you can start a model like:

ollama run llama3

For advanced users, Ollama also provides a Docker image: Ollama for Docker which you can use in containerized environments.
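The invokers in this package handle the Ollama calls for you, but for a quick sanity check you can query Ollama's local HTTP API directly. A minimal sketch, assuming Ollama is running on its default port (11434) and the `llama3` model has been pulled:

```python
import json
from urllib import request

# Ollama's default local endpoint for one-shot text generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # stream=False asks Ollama to return one complete JSON response
    # instead of a stream of partial chunks
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Calling `generate("llama3", "Why is the sky blue?")` returns the model's text response, provided the Ollama server is reachable.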

Select model and query

To list all available Ollama models, run:

ollama list

To start using the selected model, create a meta.yaml and a prompt as described here: Create LLM templates
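As a rough illustration only, a meta.yaml might look like the sketch below. The field names here are hypothetical; consult the Create LLM templates guide linked above for the actual schema used by Genie Flow.

```yaml
# Hypothetical sketch -- field names are illustrative, not the real schema.
# See the "Create LLM templates" documentation for the authoritative layout.
invoker: ollama        # which invoker family to use (assumed key)
model: llama3          # the Ollama model selected via `ollama list` (assumed key)
```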

There are three different types of invokers available:

  • OllamaChatInvoker, includes dialogue history in your prompt
  • OllamaGenerateInvoker, includes base64-encoded images in your prompt, using the model defined in meta.yaml
  • OllamaEmbedInvoker, vectorizes text using the embedding model specified in meta.yaml

Include base64-encoded images in the prompt

For images to be included, the prompt template must be structured as follows; otherwise, the query will contain only plain text.

prompt: |
    Some prompt text
images:
    - {{ image_as_base64 }}


Download files

Source Distributions

No source distribution files are available for this release.

Built Distribution

genie_flow_invoker_ollama-0.8.4-py3-none-any.whl (5.3 kB, Python 3)

Hashes for genie_flow_invoker_ollama-0.8.4-py3-none-any.whl

Algorithm   Hash digest
SHA256      bad1b374e36ab6a28599157f4edb5e760bdfb0f6b06155feda482cfb5a3cf454
MD5         579ce0dcaaafc52647b8e275c68a5cd7
BLAKE2b-256 3bf7f55a4614dadacb243fd555f4702ec4b86fbe6366a5fda0ff1ca1e0493ccc
