

Project description


Fazah

Fazah is a Python library that enables seamless language translation for interactions with Large Language Models (LLMs). It allows users to communicate with LLMs in any language, ensuring accurate and comprehensive responses by leveraging the vast amount of information available in English on the internet.

Supported LLMs

Fazah seamlessly integrates with popular LLM APIs, including:

  • Anthropic
  • OpenAI
  • Google Gemini
  • And more!

Installation

To install Fazah, use pip:

pip install fazah

Usage

To use Fazah, start by importing the necessary module:

from fazah import Fazah
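
As the API-specific examples below show, the Fazah constructor takes any callable that accepts a prompt string and returns a response string. Here is a minimal sketch with a stand-in model; the echo function is purely illustrative and useful only for local testing:

from fazah import Fazah

# Any callable that maps a prompt string to a response string can serve as
# the LLM model; this echo stub stands in for a real API client.
def echo_model(prompt):
    return f"Echo: {prompt}"

fazah = Fazah(echo_model)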

Using Fazah with Anthropic API

  1. Import and initialize the Anthropic client with your API key:
from anthropic import Anthropic

client = Anthropic(api_key="YOUR_API_KEY")
  2. Create a function to generate responses using the Anthropic API:
def create_anthropic_llm_model():
    def generate(prompt):
        response = client.messages.create(
            model="claude-3-haiku-20240307",
            max_tokens=1024,
            system="You are a helpful assistant.",
            messages=[
                {"role": "user", "content": prompt}
            ]
        )
        # response.content is a list of content blocks; extract the plain text
        if isinstance(response.content, list):
            response.content = response.content[0].text
        elif hasattr(response.content, 'text'):
            response.content = response.content.text
        return response.content
    return generate
  3. Create an instance of the Fazah class with the Anthropic LLM model:
llm_model = create_anthropic_llm_model()
fazah = Fazah(llm_model)

Using Fazah with Google Gemini API

  1. Import the Google Generative AI library and configure the Gemini API key:
import google.generativeai as genai

API_KEY = "YOUR_GEMINI_API_KEY"
genai.configure(api_key=API_KEY)
  2. Create an instance of the Google Gemini model:
model = genai.GenerativeModel('gemini-pro')
  3. Define a function that generates responses using the Gemini model:
def create_llm_model():
    def generate(prompt):
        response = model.generate_content(prompt)
        return response.text
    return generate
  4. Create an instance of the Fazah class with the Google Gemini LLM model:
llm_model = create_llm_model()
fazah = Fazah(llm_model)

Using Fazah with OpenAI API

  1. Import the OpenAI client and set up the API key:
from openai import OpenAI

OPENAI_API_KEY = "YOUR_OPENAI_API_KEY"
client = OpenAI(api_key=OPENAI_API_KEY)
  2. Define a function that generates responses using the OpenAI Chat Completions API:
def create_chatgpt_llm_model():
    def generate(prompt):
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "system", "content": "You are a helpful assistant."},
                {"role": "user", "content": prompt}
            ]
        )
        return response.choices[0].message.content
    return generate
  3. Create an instance of the Fazah class with the OpenAI Chat model:
llm_model = create_chatgpt_llm_model()
fazah = Fazah(llm_model)

Now you can use the fazah object to process text in any language. Fazah will automatically translate the prompt to English, pass it to the respective LLM API, and then translate the generated response back to the original language.
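
For example, a round trip might look like the sketch below. The method name process is hypothetical and stands in for whatever call the Fazah API exposes for submitting a prompt:

# Hypothetical usage sketch: 'process' stands in for the actual Fazah method
# that submits a prompt; the real method name may differ.
prompt = "¿Cuáles son los principales desafíos de la computación cuántica?"
respuesta = fazah.process(prompt)  # prompt is translated to English internally
print(respuesta)                   # response is returned in the original language (Spanish)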

Key Features

  • Automatic translation of user prompts from any language to English
  • Leverages the extensive English language resources available on the internet
  • Translates LLM responses back into the original language of the user prompt
  • Seamless integration with popular LLM APIs
  • Enhances the user experience by providing localized interactions
  • Enables users to ask complex questions and receive comprehensive responses in their preferred language

Support

If you encounter any issues or have questions about Fazah, please contact Ajlang5@wisc.edu or wjfoster2@wisc.edu.


With Fazah, you can unlock the full potential of LLMs for a global audience, breaking down language barriers and providing an inclusive and accessible experience for all users.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

fazah-3.30.tar.gz (3.7 kB)


Built Distribution

fazah-3.30-py3-none-any.whl (3.2 kB)


File details

Details for the file fazah-3.30.tar.gz.

File metadata

  • Download URL: fazah-3.30.tar.gz
  • Size: 3.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.5

File hashes

Hashes for fazah-3.30.tar.gz
  • SHA256: b9a2ab8c70261b0a06046cceb029a0b2f0f78f92019110b021126752066f2ea6
  • MD5: 7b368c2924ec15da4ab63503b4544159
  • BLAKE2b-256: 9954fcbc0b5733fcc118a0838a70e7a64e794f7d13396f22afd242f9ae0b4e0b


File details

Details for the file fazah-3.30-py3-none-any.whl.

File metadata

  • Download URL: fazah-3.30-py3-none-any.whl
  • Size: 3.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.5

File hashes

Hashes for fazah-3.30-py3-none-any.whl
  • SHA256: c531b86dc0f3e605ffad19701ad8960dbd472f47a10bab7ab3ca83f8bf00e540
  • MD5: 3e75059257fedeccbb2a075ca9679f2b
  • BLAKE2b-256: 18150635a76816f1173423ca542c08f0686e7bd0e9b339dd8d049640171f13d9

