
Adapters as API Gateways to Different LLM Models


Adapters Package Documentation

Overview

The Adapters package provides a unified interface to the APIs of different language model providers, making it straightforward to integrate and switch between models from multiple vendors.

Getting Started

Prerequisites

  • Python version: 3.11.6
  • Poetry

Installation

  1. Install Python: Ensure Python 3.11.6 is installed on your system.

  2. Install Poetry: Follow the installation guide on the official Poetry website.

  3. Install Dependencies:

    poetry install
    
  4. Install Pre-commit Hooks:

    poetry run pre-commit install
    
  5. Run Commands via Poetry:

    poetry run pytest
    

Setting Up Pre-commit

Pre-commit hooks help maintain code quality and standards. Install them with the following command:

poetry run pre-commit install

To run pre-commit manually:

poetry run pre-commit run --all-files

Semantic Versioning

For versioning, we follow Semantic Versioning.

Environment Configuration

The package requires certain environment variables to be set by the user:

  • Copy .env-example to .env and populate it with appropriate values.
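
For example, in a POSIX shell:

cp .env-example .env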

Running Tests

Ensure Python 3.11 is used:

poetry run pytest

Usage

Creating and Using Adapters

  1. Instantiate an Adapter:

    adapter1 = AdapterFactory.get_adapter_by_path("adapter_path")
    

    Here, "adapter_path" should follow the format provider/vendor/model_name. Use AdapterFactory.get_supported_models() to retrieve all supported models. In order to get path to the model use model.get_path()

  2. Convert Input:

    adapter1.convert_to_input(prompt)
    
  3. Execute Adapter:

    adapter1.execute_async(input_data)
    
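Putting these steps together, here is a minimal end-to-end sketch. The top-level import path, the model path string, and the assumption that execute_async returns an awaitable are illustrative, not confirmed by this README:

    import asyncio

    from adapters import AdapterFactory  # top-level import path is an assumption


    async def main() -> None:
        # Hypothetical model path; list real ones via AdapterFactory.get_supported_models()
        adapter = AdapterFactory.get_adapter_by_path("provider/vendor/model_name")
        input_data = adapter.convert_to_input("Summarize the plot of Hamlet in one sentence.")
        response = await adapter.execute_async(input_data)  # assuming execute_async is awaitable
        print(response)


    asyncio.run(main())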

Disabling Specific Models

To disable models, set the ADAPTER_DISABLED_MODELS environment variable:

export ADAPTER_DISABLED_MODELS="model1,model2"

Disabled models will not appear in the supported models list.

Retrieving Supported Models

AdapterFactory.get_supported_models()
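
To see which model paths are available (for example, before calling get_adapter_by_path), you can iterate over the result. Only get_supported_models() and get_path() are taken from this README; the import path is an assumption:

    from adapters import AdapterFactory  # top-level import path is an assumption

    for model in AdapterFactory.get_supported_models():
        print(model.get_path())  # e.g. "provider/vendor/model_name"

Models listed in ADAPTER_DISABLED_MODELS will not appear in this output.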

Contributing

Adding New Models

  1. Existing Providers: Add new models to the SUPPORTED_MODELS array if the provider is already supported; a hypothetical sketch follows this list.

  2. New Providers:

    • If the provider follows the OpenAI format, model integration is straightforward. See the "Together" provider class as an example.
    • For providers with different schemas, see the "Anthropic" provider class for guidance.
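
As a purely hypothetical illustration of the first case (the Model constructor, its fields, and the exact SUPPORTED_MODELS layout below are assumptions, not the package's actual API; mirror the neighbouring entries in the real file):

    # In the existing provider's module (illustrative only)
    SUPPORTED_MODELS = [
        # ... existing entries ...
        Model(
            name="new-model-name",    # hypothetical field names
            vendor_name="vendor",
            provider_name="provider",
        ),
    ]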

Development Steps

  1. Add the Provider and Model: Update provider_adapters/__init__.py and test files accordingly.

  2. Write Tests: Add tests in the relevant directories. Use @pytest.mark.vcr for tests that make network requests (see the sketch after this list).

  3. Run Tests:

    poetry run pytest
    
  4. Check In Cassette Files: Include any new cassette YAML files in your commit.

  5. Send a Pull Request: Ensure all tests pass before requesting a review.
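
A minimal sketch of a recorded test, assuming pytest-recording (which supplies @pytest.mark.vcr) and pytest-asyncio are configured for the project; the import path and model path are illustrative:

    import pytest

    from adapters import AdapterFactory  # top-level import path is an assumption


    @pytest.mark.vcr  # records HTTP traffic to a cassette YAML on first run, replays it afterwards
    @pytest.mark.asyncio
    async def test_adapter_roundtrip():
        adapter = AdapterFactory.get_adapter_by_path("provider/vendor/model_name")  # hypothetical path
        input_data = adapter.convert_to_input("Say hello in one word.")
        response = await adapter.execute_async(input_data)  # assuming execute_async is awaitable
        assert response is not None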

Re-creating Cassette Files

Use the --record-mode=rewrite option with pytest to update cassette files.
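
For example:

poetry run pytest --record-mode=rewrite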

Additional Notes

Some models may only be accessible from specific locations (e.g., the U.S.). In such cases, running tests might require access to a U.S.-based server.

This documentation provides a streamlined approach to using and contributing to the Adapters package, emphasizing practical steps and clear examples.
