Functions to parse Ollama options from string and get info on Ollama options.
dm-ollamalib
Python package: helper functions to parse Ollama options from a string. Also show available options.
Nothing stellar, but these functionalities are, somewhat surprisingly, missing from the Ollama Python package.
Installation
If you haven't done so already, please install uv: this Python package and project manager makes most headaches of Python package management go away in an instant.
Simply do uv add dm-ollamalib and you are good to go.
In case you want to work with the GitHub repository (e.g., to test out a branch or similar), do
uv add git+https://github.com/DrMicrobit/dm-ollamalib.
Usage
Three functions are provided:
- two helper functions that return, as a string, a list of Ollama options, their types, and, where available, a short description
- a parsing function that parses a string and returns a dictionary compatible for use with Ollama
Functions to describe Ollama options
Both functions return a string.
[!NOTE] For name and type of the options, dm-ollamalib uses information directly from the Ollama Python library, which must be installed, e.g. via
uv add ollama. That is, the strings returned are dynamically generated and adapted to the version of the Ollama Python library you have installed.
- help_overview(): returns a string showing name and type of supported Ollama options. String will look like this:
numa : bool
num_ctx : int
num_batch : int
...
- help_long(): returns a string showing name, type, and description of supported Ollama options. The string will look like this:
numa : bool
This parameter seems to be new, or not described in docs as of January 2025.
dm_ollamalib does not know it, sorry.
num_ctx : int
Sets the size of the context window used to generate the next token.
(Default: 2048)
...
[!IMPORTANT] As no description texts for options are present in the Ollama Python library, they were copied into dm-ollamalib by hand from descriptions either from the Ollama docs for model files on GitHub or the Ollama Python library package on PyPi. Note that some options have no description online at all, dm-ollamalib will tell you that.
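The dynamic generation mentioned above can be illustrated with a small, self-contained sketch. Note that `FakeOptions` below is a hypothetical stand-in: dm-ollamalib introspects the real `Options` TypedDict from the installed Ollama Python library instead.

```python
from typing import TypedDict, get_type_hints

# Hypothetical stand-in for ollama's Options TypedDict; the real
# library provides many more fields than these three.
class FakeOptions(TypedDict, total=False):
    numa: bool
    num_ctx: int
    temperature: float

def overview(options_cls: type) -> str:
    """Build a help_overview()-style string from a TypedDict's annotations."""
    hints = get_type_hints(options_cls)
    width = max(len(name) for name in hints)
    return "\n".join(
        f"{name:<{width}} : {tp.__name__}" for name, tp in hints.items()
    )

print(overview(FakeOptions))
```

Because the option names and types are read from the type annotations at runtime, the output automatically tracks whatever version of the TypedDict is installed.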
Function to parse strings representing Ollama options
to_ollama_options() transforms a string (or an Iterable of strings) with semicolon separated Ollama options to a dict compatible with Ollama.
The Ollama Python library expects Ollama options as correct Python types in a dict, i.e., one cannot simply pass strings. This function transforms any string with Ollama options into a dict with the correct types.
Ollama uses a TypedDict; the dict[str, Any] returned by this function is compatible with it.
Arguments:
- options : str | Iterable[str]
E.g. "num_ctx=8092;temperature=0.8" or ["num_ctx=8092","temperature=0.8"]
Exceptions raised:
- ValueError for
  - unknown or unrecognised Ollama options
  - conversion errors of a string to the required type (int, float, bool), e.g. "num_ctx=NotAnInt"
  - incomplete options (e.g., "num_ctx=" or "=8092")
- RuntimeError
- if the Ollama Python library has unexpected parameter types not handled by this function (should not happen, unless the Ollama devs implemented something new)
Usage examples
from dm_ollamalib.parse_options import help_long, help_overview, to_ollama_options
print(help_overview())
print(help_long())
print(to_ollama_options("top_p=0.9;temperature=0.8"))
print(to_ollama_options(["top_p=0.9", "temperature=0.8"]))
The first two lines print the generated help texts for the options present in the Ollama Python package. The two subsequent lines show how to parse options from one or several strings coming, e.g., from the command line.
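One way to collect such option strings from the command line is a repeatable argparse flag; the list it accumulates has exactly the shape to_ollama_options() accepts. This is an illustrative sketch, not part of dm-ollamalib, and the flag name `--ollama-option` is an arbitrary choice.

```python
import argparse

parser = argparse.ArgumentParser()
# Repeatable flag: each -o/--ollama-option appends one "name=value[;...]" string.
parser.add_argument(
    "-o", "--ollama-option",
    action="append", default=[], dest="ollama_options", metavar="NAME=VALUE",
)

# Simulate: myprog -o top_p=0.9 -o temperature=0.8
args = parser.parse_args(["-o", "top_p=0.9", "-o", "temperature=0.8"])
print(args.ollama_options)  # ['top_p=0.9', 'temperature=0.8']
# This list could then be handed directly to to_ollama_options().
```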
The dictionary returned by to_ollama_options() can be used directly in calls to Ollama. E.g.
import ollama
from dm_ollamalib.optionhelper import to_ollama_options
op = to_ollama_options("top_p=0.9;temperature=0.8")
ostream = ollama.chat(
model="llama3.1",
options=op, # the options parsed from string
...
)
[!IMPORTANT] For the code above to work, you need (1) Ollama installed and running, (2) the llama3.1 model installed in Ollama (ollama pull llama3.1), and (3) the Ollama Python module installed in your Python project via, e.g., uv add ollama.
Notes
The GitHub repository comes with all files I currently use for Python development across multiple platforms. Notably:
- configuration of the Python environment via uv: pyproject.toml and uv.lock
- configuration for the linter and code formatter ruff: ruff.toml
- configuration for pylint: .pylintrc
- git ignore files: .gitignore
- configuration for pre-commit: .pre-commit-config.yaml. The script used to check git commit summary messages is in devsupport/check_commitsummary.py
- configuration for the VSCode editor: .vscode directory
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file dm_ollamalib-0.1.0.tar.gz.
File metadata
- Download URL: dm_ollamalib-0.1.0.tar.gz
- Upload date:
- Size: 29.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.7.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 2fefb44d0a5c83b5b422935393d122d497871cb99ba9cef142a81bb21f257a01 |
| MD5 | 2ef17f361cab573cd02f947f07a65451 |
| BLAKE2b-256 | d80bf26852e154cae4bf88227c838472427290dee70e4025fcebcf07d4f55c75 |
File details
Details for the file dm_ollamalib-0.1.0-py3-none-any.whl.
File metadata
- Download URL: dm_ollamalib-0.1.0-py3-none-any.whl
- Upload date:
- Size: 10.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.7.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 14f8953453441cbc6e46abe40d0a53c1a0750e50755abd41e6cdbf70268e8a76 |
| MD5 | 7a3d913aef102068ba73fe1a3e981460 |
| BLAKE2b-256 | cee309d913fafb4214238fdda1862cc9930208c90f2950be6bc035e52c91220c |