
HandyLLM

A handy toolkit for using LLMs.

Install

pip3 install handyllm

Or, install from the GitHub repo to get the latest updates:

pip3 install git+https://github.com/atomiechen/handyllm.git

Examples

Example scripts are placed in the tests folder.

OpenAI API Request

Logs

You can pass a custom logger and log_marks (a string or a collection of strings) to chat/completions to get input and output logging.
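The exact logging hooks are defined by the library; as a rough sketch of the pattern (the wrapper and the stub chat function below are hypothetical illustrations, not part of HandyLLM):

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("llm")

def logged_chat(chat_func, messages, log_marks=("example-run",), **kwargs):
    # Hypothetical wrapper: tag both the request and the response with the marks.
    marks = " ".join(log_marks)
    logger.info("[%s] request: %s", marks, messages)
    response = chat_func(messages=messages, **kwargs)
    logger.info("[%s] response: %s", marks, response)
    return response

# Stub standing in for OpenAIAPI.chat, so the sketch runs offline.
def fake_chat(messages, **kwargs):
    return {"choices": [{"message": {"content": "a joke"}}]}

resp = logged_chat(fake_chat, [{"role": "user", "content": "please tell me a joke"}])
```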

Timeout control

This toolkit supports client-side timeout control:

from handyllm import OpenAIAPI
prompt = [{
    "role": "user",
    "content": "please tell me a joke"
}]
response = OpenAIAPI.chat(
    model="gpt-3.5-turbo",
    messages=prompt,
    timeout=10
    )
print(response['choices'][0]['message']['content'])

Authorization

The API key and organization will be loaded from the environment variables OPENAI_API_KEY and OPENAI_ORGANIZATION, or you can set them manually:

OpenAIAPI.api_key = 'sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
OpenAIAPI.organization = '......'  # default: None

Or, you can pass api_key and organization parameters in each API call.
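The precedence described above (per-call parameters over environment variables) can be sketched roughly like this; the helper function is hypothetical, not HandyLLM's actual internals:

```python
import os

def resolve_credentials(api_key=None, organization=None):
    # Hypothetical helper: per-call arguments win, env vars are the fallback.
    key = api_key if api_key is not None else os.environ.get("OPENAI_API_KEY")
    org = organization if organization is not None else os.environ.get("OPENAI_ORGANIZATION")
    return key, org

os.environ["OPENAI_API_KEY"] = "sk-from-env"
print(resolve_credentials())                       # falls back to the env var
print(resolve_credentials(api_key="sk-explicit"))  # explicit value wins
```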

Stream response

Stream responses of chat/completions/finetunes_list_events can be enabled using the stream parameter:

response = OpenAIAPI.chat(
    model="gpt-3.5-turbo",
    messages=prompt,
    timeout=10,
    stream=True
    )

# you can use this to stream the response text
for text in OpenAIAPI.stream_chat(response):
    print(text, end='')

# or you can process the raw response chunks yourself
# for chunk in response:
#     if 'content' in chunk['choices'][0]['delta']:
#         print(chunk['choices'][0]['delta']['content'], end='')
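What stream_chat does under the hood can be approximated by extracting each content delta from the streamed chunks; the function below is a hypothetical re-implementation for illustration, run against fake chunks shaped like the OpenAI streaming response:

```python
def collect_stream_text(chunks):
    # Yield each content delta from a stream of chat-completion chunks.
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# Fake chunks shaped like the OpenAI streaming response, so this runs offline.
fake_chunks = [
    {"choices": [{"delta": {"role": "assistant"}}]},
    {"choices": [{"delta": {"content": "Hello"}}]},
    {"choices": [{"delta": {"content": ", world"}}]},
]
text = "".join(collect_stream_text(fake_chunks))
```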

Supported APIs

  • chat
  • completions
  • edits
  • embeddings
  • models_list
  • models_retrieve
  • moderations
  • images_generations
  • images_edits
  • images_variations
  • audio_transcriptions
  • audio_translations
  • files_list
  • files_upload
  • files_delete
  • files_retrieve
  • files_retrieve_content
  • finetunes_create
  • finetunes_list
  • finetunes_retrieve
  • finetunes_cancel
  • finetunes_list_events
  • finetunes_delete_model

Please refer to the OpenAI official API reference for details.

Prompt

Prompt Conversion

PromptConverter can convert this text file prompt.txt into a structured prompt for chat API calls:

$system$
You are a helpful assistant.

$user$
Please help me merge the following two JSON documents into one.

$assistant$
Sure, please give me the two JSON documents.

$user$
{
    "item1": "It is really a good day."
}
{
    "item2": "Indeed."
}
%output_format%
%misc1%
%misc2%

from handyllm import PromptConverter
converter = PromptConverter()

# chat can be used as the message parameter for OpenAI API
chat = converter.rawfile2chat('prompt.txt')

# variables wrapped in %s can be replaced at runtime
new_chat = converter.chat_replace_variables(
    chat, 
    {
        r'%misc1%': 'Note1: do not use any bad word.',
        r'%misc2%': 'Note2: be optimistic.',
    }
)
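The $role$ format above maps each delimited section to one chat message. A minimal sketch of that parsing (a hypothetical stand-in for what rawfile2chat does, not HandyLLM's actual code):

```python
import re

def raw2chat(raw: str):
    # Split on lines like $system$ / $user$ / $assistant$; each section
    # becomes one {"role": ..., "content": ...} message.
    chat = []
    for match in re.finditer(r"^\$(\w+)\$\s*\n(.*?)(?=^\$\w+\$|\Z)", raw, re.M | re.S):
        chat.append({"role": match.group(1), "content": match.group(2).strip()})
    return chat

raw = "$system$\nYou are a helpful assistant.\n\n$user$\nPlease help me."
chat = raw2chat(raw)
```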

Substitute

PromptConverter can also substitute placeholder variables like %output_format% with content stored in text files, making multiple prompts modular. A substitute map substitute.txt looks like this:

%output_format%
Please output a SINGLE JSON object that contains all items from the two input JSON objects.

%variable1%
Placeholder text.

%variable2%
Placeholder text.

from handyllm import PromptConverter
converter = PromptConverter()
converter.read_substitute_content('substitute.txt')  # read substitute map
chat = converter.rawfile2chat('prompt.txt')  # variables are substituted already
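A substitute map like the one above can be parsed by treating each %name% line as a key and the lines that follow as its replacement text. The parser below is a hypothetical sketch of that idea, not HandyLLM's read_substitute_content:

```python
def read_substitute_map(text: str):
    # Each %name% header line starts an entry; following lines are its value.
    mapping, key, buf = {}, None, []
    for line in text.splitlines():
        stripped = line.strip()
        if stripped.startswith("%") and stripped.endswith("%") and len(stripped) > 2:
            if key is not None:
                mapping[key] = "\n".join(buf).strip()
            key, buf = stripped, []
        else:
            buf.append(line)
    if key is not None:
        mapping[key] = "\n".join(buf).strip()
    return mapping

text = "%output_format%\nPlease output a SINGLE JSON object.\n\n%variable1%\nPlaceholder text."
sub_map = read_substitute_map(text)
```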

Download files

Source Distribution

HandyLLM-0.3.1.tar.gz (9.1 kB)

Built Distribution

HandyLLM-0.3.1-py3-none-any.whl (8.5 kB)
