
The Unify tool package provides a single sign-on client for multiple LLM endpoints and their metadata.


Unify Integration for Prompt Flow

Introduction

This tool package provides access to numerous endpoints and custom routers, with the option to employ dynamic routing to obtain responses from the best-suited model@provider for your task.

Requirements

PyPI package: unify-integration.

Unify-specific inputs (optional)

Name                 Type              Description                                                               Required
cost                 string            Cost per token for the endpoint.                                          No
quality              string            Quality of the model, based on dataset evaluations by the oracle model.  No
inter_token_latency  string            Delay between successive output tokens.                                   No
time_to_first_token  string            Delay before the first token is generated.                                No
connection           CustomConnection  UnifyConnection using the Unify client.                                   No
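These inputs act as routing constraints. The sketch below illustrates how such constraints could filter and rank candidate endpoints; the endpoint names, field layout, and scoring rule are hypothetical and are not Unify's actual routing algorithm.

```python
# Hypothetical sketch: pick the highest-quality endpoint that satisfies
# user-supplied cost and latency constraints. Not the real Unify router.

def route(endpoints, max_cost=None, min_quality=None, max_ttft=None):
    """Return the highest-quality endpoint satisfying all given constraints."""
    candidates = [
        e for e in endpoints
        if (max_cost is None or e["cost"] <= max_cost)
        and (min_quality is None or e["quality"] >= min_quality)
        and (max_ttft is None or e["time_to_first_token"] <= max_ttft)
    ]
    if not candidates:
        raise ValueError("no endpoint satisfies the constraints")
    return max(candidates, key=lambda e: e["quality"])

# Illustrative endpoint metadata (made-up numbers):
endpoints = [
    {"name": "gpt-4o@openai", "cost": 5e-03, "quality": 0.95,
     "time_to_first_token": 3e-05},
    {"name": "llama-3-8b@together-ai", "cost": 2e-04, "quality": 0.71,
     "time_to_first_token": 1e-05},
]

best = route(endpoints, max_cost=4.65e-03)
print(best["name"])  # llama-3-8b@together-ai (the pricier model is filtered out)
```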

Overview

This repository provides an integration between Unify and Promptflow, allowing seamless optimization of large language models (LLMs) using Unify's capabilities. With this integration, users can dynamically select the optimal model based on quality, cost, and latency constraints, as well as benchmark models for specific tasks.

Project Structure

.
├── dist/                          # Distribution files for installation
│   ├── unify_integration-0.0.14-py3-none-any.whl
│   └── unify_integration-0.0.14.tar.gz
├── tests/                         # Test files for the Unify integration
│   ├── __init__.py
│   ├── quick_test.py              # Quick tests for package tools
│   ├── test_unify_llm_tool.py     # Unit tests for Unify LLM tool functionality
│   └── test_unify_llm.py          # Additional unit tests for Unify LLM
├── unify_llm_tool/                # Unify tool package and connection settings
│   ├── __init__.py
│   ├── connections/
│   │   └── unify_connection.yml   # Configuration for the Unify connection
│   ├── examples/                  # Example workflows for Unify integration
│   └── tools/                     # Tools available in the Unify integration
│       └── yamls/
│           ├── benchmark_llm_tool.yaml  # YAML for the benchmark LLM tool
│           ├── chat_tool.yaml           # YAML for the chat tool
│           ├── optimize_llm_tool.yaml   # YAML for the LLM optimization tool
│           └── single_sign_on_tool.yaml # YAML for the Single Sign-On tool
├── .gitignore                     # Git ignore file
├── .pre-commit-config.yaml        # Pre-commit hooks configuration
├── generate_icon_data_uri.py      # Script to generate base64 icons for the project
├── LICENSE                        # License file
├── MANIFEST.in                    # Manifest for including package data
├── README.md                      # Project README file
├── requirements.txt               # Required dependencies
├── setup.cfg                      # Configuration for flake8, isort, etc.
├── setup.py                       # Setup script for the project
└── unify_icon.png                 # Icon for the project

Installation

To install the project, you can install the downloaded wheel from the dist/ directory:

pip install unify_integration-0.0.14-py3-none-any.whl

Alternatively, install directly from PyPI:

pip install unify-integration

Tools and Features

1. Optimize LLM Tool

Optimizes LLM selection based on task constraints such as quality, cost, and time. The tool's YAML configuration lets you customize these parameters.

unify_llm_tool.tools.optimize_llm_tool.optimize_llm:
  function: optimize_llm
  inputs:
    unify_api_key: '{{env: UNIFY_API_KEY}}'
    quality: "1"
    cost: "4.65e-03"
    time_to_first_token: "2.08e-05"
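The constraint values in this YAML are stored as strings. A small sketch (using PyYAML's `safe_load`; the `unify_api_key` entry is omitted since it is not a numeric constraint) of reading them programmatically:

```python
import yaml  # PyYAML

config_text = """
unify_llm_tool.tools.optimize_llm_tool.optimize_llm:
  function: optimize_llm
  inputs:
    quality: "1"
    cost: "4.65e-03"
    time_to_first_token: "2.08e-05"
"""

config = yaml.safe_load(config_text)
inputs = config["unify_llm_tool.tools.optimize_llm_tool.optimize_llm"]["inputs"]

# The YAML quotes every value, so convert the strings to floats before use.
constraints = {k: float(v) for k, v in inputs.items()}
print(constraints["cost"])  # 0.00465
```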

2. Benchmark LLM Tool

Benchmark multiple LLMs against a set of inputs to determine the best-performing model for a given task.
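The core idea can be sketched as timing each model over the same prompt set; the stand-in callables below simulate endpoints, and the function name and shape are illustrative rather than the tool's actual API.

```python
import time

def benchmark(models, prompts):
    """Return {model_name: mean seconds per prompt} for each model callable."""
    results = {}
    for name, fn in models.items():
        start = time.perf_counter()
        for prompt in prompts:
            fn(prompt)
        results[name] = (time.perf_counter() - start) / len(prompts)
    return results

# Stand-ins for real model@provider endpoints:
models = {
    "fast-model": lambda p: p.upper(),                 # near-instant response
    "slow-model": lambda p: (time.sleep(0.01), p)[1],  # simulated 10 ms latency
}

timings = benchmark(models, ["hello", "world"])
print(min(timings, key=timings.get))  # fast-model
```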

3. Chat Tool

Allows you to interact with custom endpoints using predefined or dynamic prompts.

4. Single Sign-On Tool

Provides single sign-on across multiple endpoints, streamlining authentication for various services.

Testing

The tests/ directory contains unit tests for each tool. You can run the tests using:

pytest tests/

License

This project is licensed under the MIT License.
