


Divyam LLM Interop

A minimal, provider‑agnostic library for interoperable AI model requests and responses. Divyam LLM Interop provides a unified interface for interacting with models across providers while maintaining consistent request and response semantics.
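The idea can be illustrated with a small, self-contained sketch: one neutral request object translated into provider-specific payloads. The class and function names below are hypothetical and are not the library's actual API.

```python
from dataclasses import dataclass

# Illustrative sketch only; not the library's real interface.
@dataclass
class ChatRequest:
    model: str
    prompt: str
    max_tokens: int = 256

def to_openai(req: ChatRequest) -> dict:
    # OpenAI-style chat-completions payload
    return {
        "model": req.model,
        "messages": [{"role": "user", "content": req.prompt}],
        "max_tokens": req.max_tokens,
    }

def to_anthropic(req: ChatRequest) -> dict:
    # Anthropic-style messages payload
    return {
        "model": req.model,
        "max_tokens": req.max_tokens,
        "messages": [{"role": "user", "content": req.prompt}],
    }

req = ChatRequest(model="gpt-4o", prompt="Hello")
print(to_openai(req)["messages"][0]["content"])  # prints: Hello
```

The point is that callers build one request shape and the translation layer owns the per-provider differences.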

Development Environment Setup

Create a virtual environment

With Python virtualenv:

python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

With conda:

conda create -n .venv python=3.10 -y
conda activate .venv

Note: Make sure to activate the virtual environment before running any commands.
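One quick way to confirm the environment is active is to print the interpreter's prefix; after activation it should point inside the environment rather than at the system Python:

```shell
# After activation, this should print a path inside .venv (or your conda env)
python -c "import sys; print(sys.prefix)"
```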

Install poetry

pip install poetry
pip install --upgrade poetry  # upgrade later as needed; "poetry self update" only works for installs made with the official installer

Install dependencies

When installing for the first time, or whenever the dependencies in pyproject.toml change, regenerate the Poetry lock file and install:

poetry lock
poetry install

Contributing

We welcome contributions to improve the library!

How to contribute

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/my-improvement
  3. Make your changes
  4. Run tests and linters (see below)
  5. Submit a pull request

Contribution guidelines

  • Follow existing code style
  • Write clear commit messages
  • Include tests when adding features or fixing bugs
  • Ensure documentation reflects changes

If you're unsure about a change, feel free to open a discussion or draft PR.

Code Quality Checks

Before submitting your PR, make sure the code passes all checks:

Format code

poetry run ruff format .

Check formatting (without modifying files)

poetry run ruff format --check .

Lint code

poetry run ruff check .

Auto-fix linting issues (where possible)

poetry run ruff check --fix .

Type check

poetry run pyright .
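Pyright verifies the type annotations in the code. As a purely illustrative example (these names are not from the library), a fully annotated function like the following passes, while a call with the wrong argument type is reported as an error:

```python
# Illustrative only: the kind of annotated code pyright checks.
def token_estimate(text: str) -> int:
    """Crude whitespace-based token estimate."""
    return len(text.split())

count = token_estimate("hello world")   # OK: str in, int out
# token_estimate(42)                    # pyright error: int is not assignable to str
```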

Run all checks at once

poetry run ruff format . && poetry run ruff check . && poetry run pyright .

Running Tests

poetry run pytest

With coverage report:

poetry run pytest --cov=. --cov-report=term-missing
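As a sketch of the guideline that changes ship with tests, a contribution might add a small pytest-style test alongside the code it covers. The helper function here is hypothetical, not part of the library:

```python
# Hypothetical helper and an accompanying test; names are illustrative only.
def normalize_role(role: str) -> str:
    """Map common provider role aliases onto canonical role names."""
    aliases = {"ai": "assistant", "human": "user"}
    return aliases.get(role.lower(), role.lower())

def test_normalize_role() -> None:
    assert normalize_role("AI") == "assistant"
    assert normalize_role("human") == "user"
    assert normalize_role("system") == "system"
```

Pytest discovers any function whose name starts with test_, so a file like this runs automatically under poetry run pytest.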

License

This project is licensed under the Apache License, Version 2.0. You may obtain a copy of the License at:

https://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the LICENSE file for the full license text.


Copyright © 2025 DivyamAI Technologies Private Limited. All rights reserved.
