A fork of the official Ollama Python client, adapted for compatibility with Home Assistant.

Project description

NOTE: This is a fork of the official Ollama Python library with loosened dependencies in order to make it compatible with Home Assistant.

Ollama Python Library

The Ollama Python library provides the easiest way to integrate Python 3.8+ projects with Ollama.

Install

pip install ollama-hass

The fork is published as ollama-hass but is still imported as ollama, as in the examples below.

Usage

import ollama
response = ollama.chat(model='llama2', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
print(response['message']['content'])

Streaming responses

Response streaming can be enabled by setting stream=True. This modifies the function call to return a Python generator, where each part of the response is yielded as an object in the stream.

import ollama

stream = ollama.chat(
    model='llama2',
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
    stream=True,
)

for chunk in stream:
  print(chunk['message']['content'], end='', flush=True)

API

The Ollama Python library's API is designed around the Ollama REST API.

Chat

ollama.chat(model='llama2', messages=[{'role': 'user', 'content': 'Why is the sky blue?'}])
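
Since each method maps onto a REST endpoint, the call above is equivalent to a POST to /api/chat. A minimal sketch of the same request over raw HTTP with the standard library, assuming a local server on the default port 11434:

import json
import urllib.request

payload = {
    'model': 'llama2',
    'messages': [{'role': 'user', 'content': 'Why is the sky blue?'}],
    'stream': False,
}
req = urllib.request.Request(
    'http://localhost:11434/api/chat',
    data=json.dumps(payload).encode(),
    headers={'Content-Type': 'application/json'},
)
with urllib.request.urlopen(req) as resp:
  print(json.load(resp)['message']['content'])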

Generate

ollama.generate(model='llama2', prompt='Why is the sky blue?')
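
The return value carries the generated text. A hedged sketch of reading it, assuming the response mirrors the REST /api/generate payload with a response field:

result = ollama.generate(model='llama2', prompt='Why is the sky blue?')
print(result['response'])  # the generated completion text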

List

ollama.list()
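
A hedged sketch of iterating the result, assuming it mirrors the REST /api/tags payload with a models list:

for model in ollama.list()['models']:
  print(model['name'])  # e.g. 'llama2:latest'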

Show

ollama.show('llama2')
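
A hedged sketch of reading the result, assuming it mirrors the REST /api/show payload, which includes the model's Modelfile and template:

info = ollama.show('llama2')
print(info['modelfile'])  # the Modelfile the model was built from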

Create

modelfile='''
FROM llama2
SYSTEM You are Mario from Super Mario Bros.
'''

ollama.create(model='example', modelfile=modelfile)
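
A hedged sketch of watching build progress, assuming create accepts stream=True here as chat does and yields status objects:

for progress in ollama.create(model='example', modelfile=modelfile, stream=True):
  print(progress.get('status'))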

Copy

ollama.copy('llama2', 'user/llama2')

Delete

ollama.delete('llama2')

Pull

ollama.pull('llama2')
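
Pulling a model can take a while; a hedged sketch of reporting progress, assuming pull accepts stream=True and yields objects with a status field:

for progress in ollama.pull('llama2', stream=True):
  print(progress.get('status'))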

Push

ollama.push('user/llama2')

Embeddings

ollama.embeddings(model='llama2', prompt='The sky is blue because of Rayleigh scattering')
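
A hedged sketch of using the result, assuming it mirrors the REST /api/embeddings payload with an embedding list of floats:

result = ollama.embeddings(model='llama2', prompt='The sky is blue because of Rayleigh scattering')
vector = result['embedding']
print(len(vector))  # dimensionality of the embedding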

Custom client

A custom client can be created with the following fields:

  • host: The Ollama host to connect to
  • timeout: The timeout for requests

from ollama import Client
client = Client(host='http://localhost:11434')
response = client.chat(model='llama2', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
print(response['message']['content'])
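
The timeout field can be passed the same way; a hedged sketch, assuming this version forwards extra keyword arguments to the underlying HTTP client:

from ollama import Client

# timeout in seconds; assumption: forwarded to the underlying httpx client
client = Client(host='http://localhost:11434', timeout=10)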

Async client

import asyncio
from ollama import AsyncClient

async def chat():
  message = {'role': 'user', 'content': 'Why is the sky blue?'}
  response = await AsyncClient().chat(model='llama2', messages=[message])
  print(response['message']['content'])

asyncio.run(chat())

Setting stream=True modifies functions to return a Python asynchronous generator:

import asyncio
from ollama import AsyncClient

async def chat():
  message = {'role': 'user', 'content': 'Why is the sky blue?'}
  async for part in await AsyncClient().chat(model='llama2', messages=[message], stream=True):
    print(part['message']['content'], end='', flush=True)

asyncio.run(chat())

Errors

Errors are raised if requests return an error status or if an error is detected while streaming.

model = 'does-not-yet-exist'

try:
  ollama.chat(model)
except ollama.ResponseError as e:
  print('Error:', e.error)
  if e.status_code == 404:
    ollama.pull(model)

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ollama_hass-0.1.7.tar.gz (9.7 kB)

Built Distribution

ollama_hass-0.1.7-py3-none-any.whl (9.5 kB)

File details

Details for the file ollama_hass-0.1.7.tar.gz.

File metadata

  • Download URL: ollama_hass-0.1.7.tar.gz
  • Upload date:
  • Size: 9.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.2

File hashes

Hashes for ollama_hass-0.1.7.tar.gz

  • SHA256: ac0ac9e68d97e2b74dfe8278671c2c67c3ed4b796df1b195c82e440350918684
  • MD5: e6c000652e3f84feb80ba277963486d5
  • BLAKE2b-256: eedcc45d42f94fd05a94d00cc1ea02ca7e4553dac19540c87b169b6cfeb5e210
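
To verify a downloaded file against the SHA256 digest above, a minimal sketch using hashlib from the standard library (assuming the tarball sits in the current directory):

import hashlib

with open('ollama_hass-0.1.7.tar.gz', 'rb') as f:
  digest = hashlib.sha256(f.read()).hexdigest()
print(digest == 'ac0ac9e68d97e2b74dfe8278671c2c67c3ed4b796df1b195c82e440350918684')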

File details

Details for the file ollama_hass-0.1.7-py3-none-any.whl.

File metadata

  • Download URL: ollama_hass-0.1.7-py3-none-any.whl
  • Upload date:
  • Size: 9.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.2

File hashes

Hashes for ollama_hass-0.1.7-py3-none-any.whl

  • SHA256: 130fdf6cdd2bf86be0cce3e5328676c5de5c2fb4d34f9478c2890bec4fbcb7e2
  • MD5: 8034b3c2779a50151892c2973fe11b86
  • BLAKE2b-256: 1625afb47ee6b27911de140bf4b53b41bea2b128f7f8c2aca59d5648f7a2f30c
