The official Python client for Ollama.

Ollama Python Library

The Ollama Python library provides the easiest way to integrate Python 3.8+ projects with Ollama.

Prerequisites

  • Ollama should be installed and running
  • Pull a model to use with the library: ollama pull <model>, e.g. ollama pull gemma3
    • See Ollama.com for more information on the models available.

Install

pip install ollama

Usage

from ollama import chat
from ollama import ChatResponse

response: ChatResponse = chat(model='gemma3', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
print(response['message']['content'])
# or access fields directly from the response object
print(response.message.content)

See _types.py for more information on the response types.

Streaming responses

Response streaming can be enabled by setting stream=True.

from ollama import chat

stream = chat(
    model='gemma3',
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
    stream=True,
)

for chunk in stream:
  print(chunk['message']['content'], end='', flush=True)

Custom client

A custom client can be created by instantiating Client or AsyncClient from ollama.

All extra keyword arguments are passed into the httpx.Client.

from ollama import Client
client = Client(
  host='http://localhost:11434',
  headers={'x-some-header': 'some-value'}
)
response = client.chat(model='gemma3', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
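
Because extra keyword arguments are forwarded to httpx.Client, transport settings such as timeouts can be configured the same way. A minimal sketch (timeout is an httpx.Client argument, not something specific to this library):

from ollama import Client

# timeout (in seconds) is passed through to the underlying httpx.Client
client = Client(host='http://localhost:11434', timeout=30.0)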

Async client

The AsyncClient class is used to make asynchronous requests. It can be configured with the same fields as the Client class.

import asyncio
from ollama import AsyncClient

async def chat():
  message = {'role': 'user', 'content': 'Why is the sky blue?'}
  response = await AsyncClient().chat(model='gemma3', messages=[message])
  print(response.message.content)

asyncio.run(chat())

Setting stream=True makes these calls return a Python asynchronous generator:

import asyncio
from ollama import AsyncClient

async def chat():
  message = {'role': 'user', 'content': 'Why is the sky blue?'}
  async for part in await AsyncClient().chat(model='gemma3', messages=[message], stream=True):
    print(part['message']['content'], end='', flush=True)

asyncio.run(chat())
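
Because AsyncClient calls return awaitables, several requests can run concurrently with standard asyncio tooling. A minimal sketch using asyncio.gather (the prompts are placeholders):

import asyncio
from ollama import AsyncClient

async def main():
  client = AsyncClient()
  prompts = ['Why is the sky blue?', 'Why is grass green?']
  # issue both chat requests concurrently and wait for both results
  responses = await asyncio.gather(
    *[client.chat(model='gemma3', messages=[{'role': 'user', 'content': p}]) for p in prompts]
  )
  for response in responses:
    print(response.message.content)

asyncio.run(main())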

API

The Ollama Python library's API is designed around the Ollama REST API.

Chat

ollama.chat(model='gemma3', messages=[{'role': 'user', 'content': 'Why is the sky blue?'}])

Generate

ollama.generate(model='gemma3', prompt='Why is the sky blue?')
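
As with chat, the return value supports both dict-style and attribute access; the generated text is in the response field. A minimal sketch:

result = ollama.generate(model='gemma3', prompt='Why is the sky blue?')
print(result['response'])  # or result.response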

List

ollama.list()
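
list returns the locally installed models. A minimal sketch iterating them (the models, model, and size field names are taken from _types.py):

for m in ollama.list().models:
  # model name and size in bytes
  print(m.model, m.size)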

Show

ollama.show('gemma3')
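
show returns the model's metadata. A minimal sketch printing some of it (the details field name is taken from _types.py):

info = ollama.show('gemma3')
# family, parameter size, quantization level, etc.
print(info.details)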

Create

ollama.create(model='example', from_='gemma3', system="You are Mario from Super Mario Bros.")
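
The created model can then be used like any other model name; for example, chatting with the example model defined above:

response = ollama.chat(model='example', messages=[{'role': 'user', 'content': 'Hello!'}])
print(response.message.content)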

Copy

ollama.copy('gemma3', 'user/gemma3')

Delete

ollama.delete('gemma3')

Pull

ollama.pull('gemma3')
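
Like chat, pull accepts stream=True, in which case it yields progress updates instead of blocking until the download completes. A minimal sketch (the status field name is taken from _types.py):

for progress in ollama.pull('gemma3', stream=True):
  print(progress.status)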

Push

ollama.push('user/gemma3')

Embed

ollama.embed(model='gemma3', input='The sky is blue because of Rayleigh scattering')

Embed (batch)

ollama.embed(model='gemma3', input=['The sky is blue because of Rayleigh scattering', 'Grass is green because of chlorophyll'])
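
embed returns one vector per input string, in order, under the embeddings field (name per _types.py). A minimal sketch:

result = ollama.embed(model='gemma3', input=['The sky is blue because of Rayleigh scattering', 'Grass is green because of chlorophyll'])
for vector in result.embeddings:
  print(len(vector))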

Ps

ollama.ps()
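
ps reports the models currently loaded into memory. A minimal sketch (the models and model field names are taken from _types.py):

for m in ollama.ps().models:
  print(m.model)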

Errors

Errors are raised if requests return an error status or if an error is detected while streaming.

import ollama

model = 'does-not-yet-exist'

try:
  ollama.chat(model)
except ollama.ResponseError as e:
  print('Error:', e.error)
  if e.status_code == 404:
    ollama.pull(model)

Download files

Download the file for your platform.

Source Distribution

ollama-0.6.0.tar.gz (50.8 kB)

Built Distribution

ollama-0.6.0-py3-none-any.whl (14.1 kB)

File details

Details for the file ollama-0.6.0.tar.gz.

File metadata

  • Download URL: ollama-0.6.0.tar.gz
  • Size: 50.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for ollama-0.6.0.tar.gz

  • SHA256: da2b2d846b5944cfbcee1ca1e6ee0585f6c9d45a2fe9467cbcd096a37383da2f
  • MD5: cce6b0336e5f3489cae856e6d19ab3dd
  • BLAKE2b-256: d647f9ee32467fe92744474a8c72e138113f3b529fc266eea76abfdec9a33f3b

Provenance

The following attestation bundles were made for ollama-0.6.0.tar.gz:

Publisher: publish.yaml on ollama/ollama-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file ollama-0.6.0-py3-none-any.whl.

File metadata

  • Download URL: ollama-0.6.0-py3-none-any.whl
  • Size: 14.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for ollama-0.6.0-py3-none-any.whl

  • SHA256: 534511b3ccea2dff419ae06c3b58d7f217c55be7897c8ce5868dfb6b219cf7a0
  • MD5: 79740cfb7a76be41ef7666af4d7637da
  • BLAKE2b-256: b5c1edc9f41b425ca40b26b7c104c5f6841a4537bb2552bfa6ca66e81405bb95

Provenance

The following attestation bundles were made for ollama-0.6.0-py3-none-any.whl:

Publisher: publish.yaml on ollama/ollama-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
