Genie Flow Invoker Ollama

Ollama Invokers

This package contains Genie Flow invokers for different Ollama invocations.

Installing the ollama invoker

pip install genie-flow-invoker-ollama

Installing Ollama

To run models locally, you'll need to install Ollama. We recommend using the official native installer for your platform:

Install Ollama

This will set up the Ollama runtime and make the ollama command available in your terminal.

Once installed, you can start a model like:

ollama run llama3
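Once a model is running, Ollama serves a local REST API (by default on port 11434) that the invokers talk to. As a minimal sketch, assuming the default endpoint and Ollama's documented /api/generate request shape, you could query a model from Python like this:

```python
import json
from urllib import request

# Ollama's default local endpoint for one-shot generation
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> dict:
    # "stream": False asks Ollama to return a single JSON object
    # instead of a stream of partial responses
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        # The "response" field holds the generated text
        return json.loads(resp.read())["response"]
```

This is only illustrative; in a Genie Flow application the invokers make these calls for you.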

For advanced users, Ollama also provides a Docker image: Ollama for Docker which you can use in containerized environments.

Select model and query

To list all available Ollama models, run:

ollama list

To start using the selected model, create a meta.yaml and a prompt as described here: Create LLM templates

There are three different types of invokers available:

  • OllamaChatInvoker, includes dialogue history in your prompt
  • OllamaGenerateInvoker, includes base64-encoded images in your prompt, using the model defined in meta.yaml
  • OllamaEmbedInvoker, vectorizes text using the embedding model specified in meta.yaml
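The actual meta.yaml schema is defined by the Genie Flow template documentation linked above; purely as a hypothetical sketch (the field names below are assumptions, not the documented schema), a chat configuration might look like:

```yaml
# Hypothetical sketch — field names are assumptions; see "Create LLM templates"
# in the Genie Flow documentation for the real meta.yaml schema.
invoker: OllamaChatInvoker
model: llama3
```

Swap the invoker and model values to use generation with images or embeddings instead.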

Include base64 encoded images in prompt

For images to be included, the prompt template must be structured as follows; otherwise, the query will contain only plain text.

prompt: |
    Some prompt text
images:
    - {{ image_as_base64 }}
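To fill the {{ image_as_base64 }} placeholder, the image bytes must be base64-encoded first. A minimal Python sketch, using plain string substitution as a stand-in for the actual template engine:

```python
import base64

# Ollama expects images as base64-encoded strings
def image_to_base64(image_bytes: bytes) -> str:
    return base64.b64encode(image_bytes).decode("ascii")


# Stand-in template; the real one lives next to your meta.yaml
TEMPLATE = """\
prompt: |
    Describe this image.
images:
    - {image_as_base64}
"""


def render(image_bytes: bytes) -> str:
    # Substitute the encoded image into the template
    return TEMPLATE.format(image_as_base64=image_to_base64(image_bytes))
```

In practice you would read the bytes from an image file (for example with `open(path, "rb").read()`) and let the template engine do the substitution.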

Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distribution

genie_flow_invoker_ollama-0.8.3-py3-none-any.whl (5.1 kB)

Uploaded Python 3

File details

Details for the file genie_flow_invoker_ollama-0.8.3-py3-none-any.whl.

File metadata

File hashes

Hashes for genie_flow_invoker_ollama-0.8.3-py3-none-any.whl
Algorithm Hash digest
SHA256 c6d2d06b5c3157a682a37624572a05e12978409ee63a707c8bfbff9e4b3ca7f5
MD5 4c03607bc7d92900c34a1fd99bb08227
BLAKE2b-256 8324958ed3e5e60bcda0ca9caceb3bf9154008d5014308a99ca6f3da06564e2c
