

Project description

Fazah

Fazah is a Python library that enables seamless language translation for interactions with Large Language Models (LLMs). It allows users to communicate with LLMs in any language, ensuring accurate and comprehensive responses by leveraging the vast amount of information available in English on the internet.

Supported LLMs

Fazah seamlessly integrates with popular LLM APIs, including:

  • Anthropic
  • OpenAI
  • Google Gemini
  • And more!

Installation

To install Fazah, use pip:

pip install fazah

Usage

To use Fazah with the Anthropic API, follow these steps:

  1. Import the necessary modules:
from fazah import Fazah
from anthropic import Anthropic
  2. Initialize the Anthropic client with your API key:
client = Anthropic(api_key="YOUR_API_KEY")
  3. Create a function that generates responses using the Anthropic API:
def create_anthropic_llm_model():
    def generate(prompt):
        # Send the prompt to Claude and return the plain text of the reply
        response = client.messages.create(
            model="claude-3-haiku-20240307",
            max_tokens=1024,
            system="You are a helpful assistant.",
            messages=[
                {"role": "user", "content": prompt}
            ]
        )
        # response.content is a list of content blocks; return the text of
        # the first block instead of mutating the response object
        content = response.content
        if isinstance(content, list):
            return content[0].text
        if hasattr(content, "text"):
            return content.text
        return content
    return generate
  4. Create an instance of the Fazah class with the Anthropic LLM model:
llm_model = create_anthropic_llm_model()
fazah = Fazah(llm_model)

Using Fazah with Google Gemini API

  1. Import the Google Generative AI client and configure your API key:
import google.generativeai as genai

API_KEY = "YOUR_GEMINI_API_KEY"
genai.configure(api_key=API_KEY)
  2. Create an instance of the Google Gemini model:
model = genai.GenerativeModel('gemini-pro')
  3. Create a function that generates responses using the Gemini model:
def create_llm_model():
    def generate(prompt):
        # Send the prompt to Gemini and return the plain text of the reply
        response = model.generate_content(prompt)
        return response.text
    return generate
  4. Create an instance of the Fazah class with the Google Gemini LLM model:
llm_model = create_llm_model()
fazah = Fazah(llm_model)

Using Fazah with OpenAI API

  1. Import the OpenAI client and set up your API key:
from openai import OpenAI

OPENAI_API_KEY = "YOUR_OPENAI_API_KEY"
client = OpenAI(api_key=OPENAI_API_KEY)
  2. Create a function that generates responses using the OpenAI Chat Completions API:
def create_chatgpt_llm_model():
    def generate(prompt):
        # Send the prompt to the chat model and return the reply text
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "system", "content": "You are a helpful assistant."},
                {"role": "user", "content": prompt}
            ]
        )
        return response.choices[0].message.content
    return generate
  3. Create an instance of the Fazah class with the OpenAI Chat model:
llm_model = create_chatgpt_llm_model()
fazah = Fazah(llm_model)

Now you can use the fazah object to process text in any language. Fazah automatically translates the prompt to English, passes it to the underlying LLM API (Anthropic, Google Gemini, or OpenAI), and then translates the generated response back into the original language.
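
For illustration, here is a rough usage sketch. The method name process below is only an assumption for this example; check the Fazah documentation for your installed version to confirm the actual call.

# Hypothetical usage sketch: "process" is an assumed method name, not
# confirmed by the Fazah documentation.
prompt_in_spanish = "¿Cuáles son los planetas del sistema solar?"

# Fazah translates the prompt to English, queries the wrapped LLM,
# and translates the reply back into Spanish before returning it.
respuesta = fazah.process(prompt_in_spanish)
print(respuesta)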

Key Features

  • Automatic translation of user prompts from any language to English (see the conceptual sketch after this list)
  • Leverages the extensive English language resources available on the internet
  • Translates LLM responses back into the original language of the user prompt
  • Seamless integration with popular LLM APIs
  • Enhances the user experience by providing localized interactions
  • Enables users to ask complex questions and receive comprehensive responses in their preferred language
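
To make that flow concrete, the following is a minimal conceptual sketch of a translate-then-query-then-translate-back pipeline. It is not Fazah's actual implementation; the helpers detect_language and translate are placeholders standing in for whatever translation backend is used.

# Conceptual sketch only -- NOT Fazah's internal implementation.
def detect_language(text):
    """Placeholder: return an ISO language code such as 'es' or 'fr'."""
    raise NotImplementedError

def translate(text, source, target):
    """Placeholder: translate text from the source to the target language."""
    raise NotImplementedError

def translated_llm_call(llm_model, prompt):
    # 1. Detect the language of the incoming prompt, e.g. 'es'
    original_lang = detect_language(prompt)
    # 2. Translate the prompt to English and query the wrapped LLM
    english_prompt = translate(prompt, original_lang, "en")
    english_reply = llm_model(english_prompt)
    # 3. Translate the English reply back into the original language
    return translate(english_reply, "en", original_lang)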

Support

If you encounter any issues or have questions about Fazah, please contact Ajlang5@wisc.edu or wjfoster2@wisc.edu.


With Fazah, you can unlock the full potential of LLMs for a global audience, breaking down language barriers and providing an inclusive and accessible experience for all users.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

fazah-3.29.tar.gz (3.7 kB)


Built Distribution

fazah-3.29-py3-none-any.whl (3.2 kB)


File details

Details for the file fazah-3.29.tar.gz.

File metadata

  • Download URL: fazah-3.29.tar.gz
  • Upload date:
  • Size: 3.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.5

File hashes

Hashes for fazah-3.29.tar.gz
  • SHA256: 075825b4f951d35e66cd7811ce512e8af2607bb29fb63c8f01df223fd957b968
  • MD5: c6d9c8c9c66f2aa88432332f24835e00
  • BLAKE2b-256: dfd59727f260a5d7a7bcbd6ba157c423120e907a052db00396aec8bbf08cf8f2
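
If you want to check a downloaded archive against the SHA256 digest listed above, a small standard-library snippet like the following can compute the digest locally; the file path assumes the tarball sits in the current directory.

import hashlib

# Compute the SHA256 digest of the downloaded sdist and compare it
# with the value published above.
expected = "075825b4f951d35e66cd7811ce512e8af2607bb29fb63c8f01df223fd957b968"
with open("fazah-3.29.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "Hash mismatch")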


File details

Details for the file fazah-3.29-py3-none-any.whl.

File metadata

  • Download URL: fazah-3.29-py3-none-any.whl
  • Upload date:
  • Size: 3.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.5

File hashes

Hashes for fazah-3.29-py3-none-any.whl
  • SHA256: 178ede5fc183bff8c01dfd4e7dc22da5673c5517cca1c0eae2b3dcc322ce8f37
  • MD5: 2b38c850d6cd9a13a9e15f1c92f7fecf
  • BLAKE2b-256: 87e39b4d6a028ce6da7b2980dc87c3d353fc937fe99761e337c902e7fb64a22e

