A library to transform messages from different LLMs (OpenAI, Mistral, LangChain, Gemini, DBRX, Cohere, Claude, and AWS Bedrock) into the Pangea messages format

Project description

pangea-llm-translators

Description

pangea-llm-translator is a Python library designed to transform messages from various LLM (Large Language Model) providers into the Pangea messages format. It supports multiple AI models and platforms, including:

  • OpenAI
  • Mistral
  • LangChain
  • Gemini (Google AI)
  • DBRX (Databricks)
  • Cohere
  • Claude (Anthropic)
  • AWS Bedrock
  • Pangea

Features

  • Seamlessly convert various LLM message formats to the Pangea Messages format.
  • Preserve message structure while transforming the original input using responses from Pangea AI Guard APIs.
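To illustrate the kind of normalization this involves, here is a minimal, self-contained sketch (not the library's internals, and independent of its API): mapping an OpenAI-style message, where `content` may be a list of typed parts and the role may be `developer`, onto the flat `{'role', 'content'}` shape shown in the usage example. The role mapping and content flattening here are assumptions for illustration.

```python
def to_pangea_message(msg: dict) -> dict:
    """Map an OpenAI-style message onto a flat {'role', 'content'} dict."""
    # OpenAI's "developer" role corresponds to the classic "system" role.
    role = "system" if msg["role"] == "developer" else msg["role"]
    content = msg["content"]
    if isinstance(content, list):
        # Flatten a list of content parts into a single text string.
        content = "".join(p["text"] for p in content if p.get("type") == "text")
    return {"role": role, "content": content}

print(to_pangea_message({"role": "developer", "content": "you are a joker"}))
# {'role': 'system', 'content': 'you are a joker'}
```

The actual library handles many provider formats and preserves enough structure to reverse the transformation; this sketch only shows the one-way mapping for a single message.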

Installation

pip install pangea-llm-translator

Usage Example

from pangea_translator import get_translator

# Define an OpenAI-style LLM input
openai_message = {
    "model": "gpt-4o",
    "messages": [
        {"role": "developer", "content": "you are a joker"},
        {"role": "user", "content": [{"type": "text", "text": "knock knock"}]},
        {"role": "assistant", "content": [{"type": "text", "text": "Who's there?"}]},
    ],
}

# Initialize the translator
translator = get_translator(openai_message, llm_hint="openai")

# Convert to Pangea format
pangea_messages = translator.get_pangea_messages()

# Print transformed messages
print(pangea_messages.get_messages_list())
# [{'role': 'system', 'content': 'you are a joker'}, {'role': 'user', 'content': 'knock knock'}, {'role': 'assistant', 'content': "Who's there?"}]

# Mimic API behavior by modifying content (example: censor "joker")
for message in pangea_messages.messages:
   message.content = message.content.replace("joker", "*****")

# Convert back to the OpenAI input format
original_output = translator.transformed_original_input(messages=pangea_messages.get_messages_list())

print(original_output)
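The round trip above (translate, modify, translate back) can be mimicked in plain Python to show what `transformed_original_input` conceptually does: redacted text is written back into the original provider-specific structure. This is an illustrative sketch only; the helper names and the content shapes are assumptions, not the library's API.

```python
import copy

def redact(messages: list[dict], word: str) -> list[dict]:
    """Replace `word` with asterisks in every flat message's content."""
    return [
        {"role": m["role"], "content": m["content"].replace(word, "*" * len(word))}
        for m in messages
    ]

def write_back(original: dict, flat: list[dict]) -> dict:
    """Copy redacted text back into an OpenAI-style request body,
    preserving each message's original content structure."""
    result = copy.deepcopy(original)
    for orig_msg, new_msg in zip(result["messages"], flat):
        if isinstance(orig_msg["content"], list):
            # Restore the content-parts structure around the new text.
            orig_msg["content"] = [{"type": "text", "text": new_msg["content"]}]
        else:
            orig_msg["content"] = new_msg["content"]
    return result

request = {
    "model": "gpt-4o",
    "messages": [
        {"role": "developer", "content": "you are a joker"},
        {"role": "user", "content": [{"type": "text", "text": "knock knock"}]},
    ],
}
flat = [
    {"role": "system", "content": "you are a joker"},
    {"role": "user", "content": "knock knock"},
]
print(write_back(request, redact(flat, "joker")))
```

The key design point the library's round trip relies on is shown here in miniature: the translator keeps the original structure so the modified text can be reinserted without losing provider-specific fields.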

Development

Prerequisites

Ensure you have the following installed:

  • Python 3.9+
  • Poetry

Installation

  1. Clone the repository:
    git clone https://github.com/pangeacyber/pangea-llm-translators.git
    cd pangea-llm-translators
    
  2. Install dependencies using Poetry:
    poetry install
    

Virtual Environment

To activate the virtual environment:

poetry shell

Running Tests

To run the test suite, use:

poetry run pytest

Running the Application

To run the example script, use:

poetry run python examples/openai_translator.py

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contributing

Contributions are welcome! Please create an issue or submit a pull request.

Project details


Download files

Download the file for your platform.

Source Distribution

pangea_llm_translator-1.0.2.tar.gz (13.4 kB)

Uploaded Source

Built Distribution


pangea_llm_translator-1.0.2-py3-none-any.whl (24.8 kB)

Uploaded Python 3

File details

Details for the file pangea_llm_translator-1.0.2.tar.gz.

File metadata

  • Download URL: pangea_llm_translator-1.0.2.tar.gz
  • Upload date:
  • Size: 13.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.1 CPython/3.9.21 Linux/6.8.0-1021-azure

File hashes

Hashes for pangea_llm_translator-1.0.2.tar.gz

  • SHA256: 14fb5e427ee137309bfd6612e6a4af7f69ad91f246e156b15a93e62c46d23eea
  • MD5: 90f9b16e2e954a7428a77e1850a5a6cd
  • BLAKE2b-256: fc80bc7d1402bb1e03cea7fd8ef8e155d6631b40f2dd6932dd4c51db18e27482


File details

Details for the file pangea_llm_translator-1.0.2-py3-none-any.whl.

File metadata

File hashes

Hashes for pangea_llm_translator-1.0.2-py3-none-any.whl

  • SHA256: 7fd8603fa8918a877abc069da73c84b81513d7d2ac3433295d694367caca5675
  • MD5: 97c586d7a9e4bb90f24bc9c70738a278
  • BLAKE2b-256: bce462697aa6baf25506de384850ccfa1507eb893d823221e80eec8ac1aa4545

