
Python inference handler for agents


org.slashlib.py.inference.ollama

The official Ollama inference adapter for the org.slashlib.py.agent framework.



1. Overview & Architecture

This project provides a specialized implementation of the InferenceAdapter interface from the core Agent Framework. By decoupling the Ollama logic from the main framework, we ensure a lightweight core and allow for independent updates to the inference logic.

Architectural Role

  • Adapter Pattern: Bridges the standardized framework calls to the Ollama-specific API.
  • Provider Agnostic: The framework consumes this adapter through a unified interface.
  • Plugin-Based: Leverages Python entry points for seamless, zero-config integration.
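The adapter pattern described above can be sketched in a few lines. Note that the real InferenceAdapter interface lives in org.slashlib.py.agent and is not shown here; the method names and signatures below are illustrative assumptions, not the framework's actual contract:

```python
from abc import ABC, abstractmethod

# Hypothetical sketch of the InferenceAdapter contract; the real interface
# in org.slashlib.py.agent may use different method names and signatures.
class InferenceAdapter(ABC):
    @abstractmethod
    async def generate(self, prompt: str) -> str:
        """Run one inference call against the backing provider."""

class OllamaInferenceAdapter(InferenceAdapter):
    """Bridges standardized framework calls to the Ollama API (illustrative only)."""

    def __init__(self, base_url: str = "http://localhost:11434") -> None:
        self.base_url = base_url

    async def generate(self, prompt: str) -> str:
        # A real implementation would call the Ollama HTTP API here;
        # this stub only shows where the provider-specific logic lives.
        return f"[ollama@{self.base_url}] {prompt}"
```

Because the framework only depends on the abstract interface, swapping providers is a matter of registering a different subclass.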

2. Prerequisites

  • Ollama Service: Must be installed and reachable (Default: http://localhost:11434).
  • Python: Version 3.10 or higher.
  • Base Framework: org.slashlib.py.agent must be installed in the same environment.
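To verify the first prerequisite, you can probe the default endpoint before starting an agent. This helper is a small sketch using only the standard library; it is not part of the adapter's API:

```python
import urllib.request
import urllib.error

def is_ollama_reachable(base_url: str = "http://localhost:11434",
                        timeout: float = 2.0) -> bool:
    """Return True if a server answers at base_url within the timeout."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            # A running Ollama service answers its root endpoint with HTTP 200.
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

If this returns False, start the service (e.g. `ollama serve`) before using the adapter.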

3. Installation

Production / Standard

pip install org.slashlib.py.inference.ollama

Development (Editable Mode)

If you are developing both the framework and this adapter, install both in editable mode to ensure the Entry Points are correctly registered in your current Python environment:

# 1. Install the framework
cd path/to/org.slashlib.py.agent
pip install -e .

# 2. Install this adapter
cd path/to/org.slashlib.py.inference.ollama
pip install -e .

4. Integration Logic (Entry Points)

This adapter is designed to be "invisible" to the end user. It registers itself via the pyproject.toml entry points:

[project.entry-points."org.slashlib.py.agent.inference"]
ollama = "org.slashlib.py.inference.ollama.adapter:OllamaInferenceAdapter"

The framework automatically scans this group and maps the name ollama to the OllamaInferenceAdapter class.


5. Configuration

The adapter's behavior is controlled via the framework's configuration loader (usually pyproject.json).

Key      Type     Default     Description
model    string   (required)  Name of the Ollama model (e.g. "llama3", "gemma").
think    boolean  true        Enables or disables visibility of the model's thinking process, if the model supports it.
timeout  float    600.0       Connection timeout in seconds.

Example pyproject.json:

{
  "inference": {
    "ollama": {
      "model": "gemma4",
      "think": true,
      "timeout": 300.0
    }
  }
}
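Merging user-supplied settings with the defaults from the table could look like the following. This helper is hypothetical; the framework's own configuration loader handles this internally:

```python
def load_ollama_config(raw: dict) -> dict:
    """Merge user settings with the adapter defaults; 'model' has no default."""
    defaults = {"think": True, "timeout": 600.0}
    cfg = {**defaults, **raw.get("inference", {}).get("ollama", {})}
    if "model" not in cfg:
        raise ValueError("'model' is required for the ollama adapter")
    return cfg
```

Given the example configuration above, only "think" would be filled in from the defaults, since "model" and "timeout" are set explicitly.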

6. Usage Example

You do not need to import any classes from this package directly. Use the Framework's factory method:

import asyncio
from org.slashlib.py.agent.agent import Agent

async def main():
    # Load the agent - the framework resolves 'ollama' via Entry Points
    my_agent = Agent.from_plugin(
        identifier="my-local-assistant",
        plugin_name="ollama"
    )

    # All standard Agent methods are now available
    response = await my_agent.chat("How does the plugin system work?")
    print(f"Assistant: {response.content}")

if __name__ == "__main__":
    asyncio.run(main())

7. Development & Testing

Testing with Pytest

pytest tests/

Logging

The adapter logs under the namespace org.slashlib.py.inference.ollama.
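To surface the adapter's log output during debugging, raise the level on that namespace with the standard logging module:

```python
import logging

# Emit records to the root handler, then turn on DEBUG for the adapter only.
logging.basicConfig(level=logging.INFO)
logging.getLogger("org.slashlib.py.inference.ollama").setLevel(logging.DEBUG)
```

Because loggers are hierarchical, this also covers any child loggers the adapter creates under that namespace.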


8. Documentation & Obsidian

The project root is fully prepared as an Obsidian Vault.


License

Distributed under the MIT License. See LICENSE.md for more information.


© 2026 org.slashlib
