
A package for prompt processing and language model interaction


FLUTE

FLUTE (Flexible Language Unified Tuning Elixir) provides a factory and an abstract base class for interacting with LLMs, and possibly with BCIs (brain-computer interfaces) in the future.

Installation

You can install FLUTE using pip:

pip install FLUTE-LLM

Usage

Creating a Prompt Processor

To create a prompt processor, call the create_prompt_processor method of the PromptProcessorFactory class with the desired model name and API key:

import flute
from flute.Modules.PromptProcessorFactory import PromptProcessorFactory

model_name = "claude-3-haiku-20240307"
api_key = "your_api_key"

prompt_processor = PromptProcessorFactory.create_prompt_processor(model_name, api_key)

For Claude, GPT, and Gemini models, you may omit the API key; the processor then loads it from an environment variable (see load_api_key in the class diagram below):

import flute
from flute.Modules.PromptProcessorFactory import PromptProcessorFactory

model_name = "claude-3-haiku-20240307"

prompt_processor = PromptProcessorFactory.create_prompt_processor(model_name)

The create_prompt_processor method will return an instance of the appropriate prompt processor class based on the provided model name.
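To illustrate how such a factory can resolve a model name to a processor class, here is a minimal self-contained sketch. The class names match the class diagram below, but the prefix table and constructor bodies are assumptions for illustration, not FLUTE's actual implementation:

```python
from typing import Optional


class AbstractPromptProcessor:
    """Stub base class; the real one also defines generate_response."""

    def __init__(self, api_key: Optional[str] = None):
        self.api_key = api_key


class ClaudePromptProcessor(AbstractPromptProcessor):
    pass


class GPTPromptProcessor(AbstractPromptProcessor):
    pass


class GeminiPromptProcessor(AbstractPromptProcessor):
    pass


class PromptProcessorFactory:
    # Assumed mapping from model-name prefix to processor class.
    _PREFIXES = {
        "claude": ClaudePromptProcessor,
        "gpt": GPTPromptProcessor,
        "gemini": GeminiPromptProcessor,
    }

    @staticmethod
    def create_prompt_processor(
        model_name: str, api_key: Optional[str] = None
    ) -> AbstractPromptProcessor:
        # Dispatch on the model-name prefix; unknown names are rejected.
        for prefix, cls in PromptProcessorFactory._PREFIXES.items():
            if model_name.lower().startswith(prefix):
                return cls(api_key)
        raise ValueError(f"Unsupported model: {model_name}")
```

Prefix dispatch keeps the factory independent of exact version strings, so "claude-3-haiku-20240307" and "claude-3-opus-..." both resolve to ClaudePromptProcessor.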

Generating a Response

To generate a response using the prompt processor, call the generate_response method with the desired parameters:

prompt = "What is the capital of France?"
response = prompt_processor.generate_response(prompt, max_tokens=100, temperature=0.8)
print(response)

The generate_response method accepts various parameters that control the generation process, such as max_tokens, temperature, top_p, and system. Refer to the documentation of the specific prompt processor class for details on the available parameters.
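Per the signatures in the class diagram below, generate_response accepts either a single prompt or a list of prompts and returns a matching type. A minimal sketch of that dispatch (the echo body stands in for the real provider API call and is purely illustrative):

```python
from typing import List, Union


def generate_response(prompt: Union[str, List[str]], **kwargs) -> Union[str, List[str]]:
    """Return one response for a str prompt, a list of responses for a list."""

    def _call_model(p: str) -> str:
        # Placeholder for the provider call; a real processor would forward
        # kwargs such as max_tokens, temperature, and top_p here.
        return f"response to: {p}"

    if isinstance(prompt, str):
        return _call_model(prompt)
    return [_call_model(p) for p in prompt]
```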

Class Diagram

classDiagram
    AbstractPromptProcessor <|-- ClaudePromptProcessor
    AbstractPromptProcessor <|-- GPTPromptProcessor
    AbstractPromptProcessor <|-- GeminiPromptProcessor
    PromptProcessorFactory ..> AbstractPromptProcessor

    class AbstractPromptProcessor {
        +api_key: str
        +generate_response(prompt: Union[str, List[str]], **kwargs) Union[str, List[str]]
        +load_api_key(env_var: str)
        +remove_special_characters(text: str) str
    }

    class ClaudePromptProcessor {
        +model: str
        +generate_response(prompt: Union[str, List[str]], **kwargs) Union[str, List[str]]
    }

    class GPTPromptProcessor {
        +organization: str
        +generate_response(prompt: Union[str, List[str]], **kwargs) Union[str, List[str]]
    }

    class GeminiPromptProcessor {
        +model: GenerativeModel
        +generate_response(prompt: Union[str, List[str]], **kwargs) Union[str, List[str]]
    }

    class PromptProcessorFactory {
        +create_prompt_processor(model_name: str, api_key: Optional[str]) AbstractPromptProcessor
    }

The class diagram shows the inheritance relationship between the abstract base class AbstractPromptProcessor and its concrete implementations ClaudePromptProcessor, GPTPromptProcessor, and GeminiPromptProcessor. The PromptProcessorFactory class is responsible for creating instances of the appropriate prompt processor based on the provided model name.
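If you wanted to add a new backend, a subclass would implement generate_response and inherit the shared helpers. The following sketch re-creates the base class from the diagram above; the method names and signatures follow the diagram, but the bodies are illustrative assumptions, not FLUTE's actual code:

```python
import os
import re
from abc import ABC, abstractmethod
from typing import List, Optional, Union


class AbstractPromptProcessor(ABC):
    """Illustrative re-creation of the abstract base in the diagram."""

    def __init__(self, api_key: Optional[str] = None):
        self.api_key = api_key

    @abstractmethod
    def generate_response(
        self, prompt: Union[str, List[str]], **kwargs
    ) -> Union[str, List[str]]:
        ...

    def load_api_key(self, env_var: str) -> None:
        # Read the key from the named environment variable.
        self.api_key = os.environ.get(env_var)

    def remove_special_characters(self, text: str) -> str:
        # Assumed behavior: keep word characters and whitespace only.
        return re.sub(r"[^\w\s]", "", text)


class EchoPromptProcessor(AbstractPromptProcessor):
    """Hypothetical backend that echoes the prompt back."""

    def generate_response(self, prompt, **kwargs):
        if isinstance(prompt, str):
            return prompt
        return list(prompt)
```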

Supported models

The models listed in PromptProcessorFactory.py are currently supported.

In brief, they are:

  • GPT-4 models (GPT-4, GPT-4-turbo, GPT-4o, and most of their different versions)
  • Claude 3 and 3.5 models (Haiku, Sonnet, and Opus)
  • Gemini models (1.0 Pro, 1.5 Flash, and 1.5 Pro)

LICENSE

The repository is licensed under the latest version of Modular and Inclusive Software Advancement License Classic (MISA-CLASSIC License).

The license consists of four main policies:

  1. Disclaimer of Liability
  2. Naming Continuity Obligation
  3. Waiver of Other Copyrights
  4. Modular Extensibility (Defines how to modify the license)

See the license document for more details.
