OpenAI Helper for Easy I/O

Project description

OpenAI-Helper

OpenAI Helper for Easy I/O

Github

https://github.com/craigtrim/openai-helper

Usage

Set the OpenAI credentials

import os
os.environ['OPENAI_KEY'] = "<encrypted key>"
os.environ['OPENAI_ORG'] = "<encrypted org id>"

Encrypt both values with CryptoBase.encrypt_str("...") from the baseblock package (https://pypi.org/project/baseblock/).
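
For example, a minimal sketch of preparing the credentials (this assumes baseblock's CryptoBase.encrypt_str as referenced above; the raw key and organization values are placeholders):

import os
from baseblock import CryptoBase

# encrypt the raw credentials once, then export the encrypted strings
os.environ['OPENAI_KEY'] = CryptoBase.encrypt_str('sk-<your-api-key>')
os.environ['OPENAI_ORG'] = CryptoBase.encrypt_str('org-<your-org-id>')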

Initialize the OpenAI Helper:

from openai_helper import OpenAITextCompletion  # assumes the class is exported at the package top level

run = OpenAITextCompletion().run

This connects to OpenAI and returns a reusable callable (run) for issuing completion requests.

Call OpenAI:

run(input_prompt="Generate one random number between 1 and 5000")

or

run(engine="text-ada-001",
    temperature=1.0,
    max_tokens=256,
    input_prompt="Rewrite the input in grammatical English:\n\nInput: You believe I can help you understand what trust yourself? don't you?\nOutput:\n\n")

The output will contain both the input and output values:

{
   "input":{
      "best_of":1,
      "engine":"text-davinci-003",
      "frequency_penalty":0.0,
      "input_prompt":"Rewrite the input in grammatical English:\n\nInput: You believe I can help you understand what trust yourself? don't you?\nOutput:\n\n",
      "max_tokens":256,
      "presence_penalty":2,
      "temperature":1.0,
      "timeout":5,
      "top_p":1.0
   },
   "output":{
      "choices":[
         {
            "finish_reason":"stop",
            "index":0,
            "logprobs":"None",
            "text":"Don't you believe that I can help you understand trust in yourself?"
         }
      ],
      "created":1659051242,
      "id":"cmpl-5Z7IwXM5bCwWj8IuHaGnOLn6bCvHz",
      "model":"text-ada-001",
      "object":"text_completion",
      "usage":{
         "completion_tokens":17,
         "prompt_tokens":32,
         "total_tokens":49
      }
   }
}
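
A sketch of pulling the generated text back out of the result, based on the dictionary structure shown above:

result = run(input_prompt="Generate one random number between 1 and 5000")

# the helper returns a dict with 'input' and 'output' keys;
# the generated text sits under output -> choices -> text
print(result['output']['choices'][0]['text'])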

Supported Parameters and Defaults

The method signature below documents the supported parameters and their defaults:

def process(self,
            input_prompt: str,
            engine: str = None,
            best_of: int = None,
            temperature: float = None,
            max_tokens: int = None,
            top_p: float = None,
            frequency_penalty: int = None,
            presence_penalty: int = None) -> dict:
    """ Run an OpenAI event

    Args:
        input_prompt (str): The Input Prompt to execute against OpenAI
        engine (str, optional): The OpenAI model (engine) to run against. Defaults to None.
            Options as of July 2022 are:
                'text-davinci-003',
                'text-curie-001',
                'text-babbage-001',
                'text-ada-001'
        best_of (int, optional): Generates multiple completions server-side and returns only the best one. Defaults to None.
            This can consume OpenAI tokens quickly, so use with caution!
        temperature (float, optional): Controls randomness. Defaults to None.
            Scale is 0.0 - 1.0
            Lower values produce more predictable, deterministic output
            Higher values are more creative but also less predictable
            Use high values cautiously
        max_tokens (int, optional): The Maximum Number of tokens to generate. Defaults to None.
            Requests can use up to 4,000 tokens (this takes the length of the input prompt into account)
            The higher this value, the more each request will cost.
        top_p (float, optional): Controls diversity via nucleus sampling. Defaults to None.
            The model samples only from the tokens that make up the top_p probability mass
            (e.g., 0.1 restricts sampling to the top 10% of the probability mass)
        frequency_penalty (int, optional): How much to penalize new tokens based on their frequency in the text so far. Defaults to None.
            Scale: 0.0 - 2.0.
        presence_penalty (int, optional): How much to penalize tokens that have already appeared in the text so far. Defaults to None.
            Unlike frequency_penalty, the penalty does not scale with how often a token appears,
            which encourages the model to move on to new topics.

    Returns:
        dict: an output dictionary with two keys:
            input: the input dictionary with validated parameters and default values where appropriate
            output: the output event from OpenAI
    """

Counting Tokens (tiktoken)
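
A minimal sketch of counting prompt tokens with tiktoken before sending a request; tiktoken is a separate package (pip install tiktoken), and the model name below is an assumption for illustration:

import tiktoken

prompt = "Generate one random number between 1 and 5000"

# look up the encoding used by the target model and count the prompt tokens
encoding = tiktoken.encoding_for_model("text-davinci-003")
print(len(encoding.encode(prompt)))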

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

openai-helper-0.2.3.tar.gz (20.1 kB)

Uploaded Source

Built Distribution

openai_helper-0.2.3-py3-none-any.whl (34.2 kB)

Uploaded Python 3

File details

Details for the file openai-helper-0.2.3.tar.gz.

File metadata

  • Download URL: openai-helper-0.2.3.tar.gz
  • Upload date:
  • Size: 20.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.13 CPython/3.8.5 Windows/10

File hashes

Hashes for openai-helper-0.2.3.tar.gz
Algorithm Hash digest
SHA256 b6b6e7e47fe8c83679c0cbbffbe64984543b1474f825a9539d5b1f030e8ac632
MD5 1a2160b9f3188a39c2f85eb6993306e4
BLAKE2b-256 4105fc018429540cd04295853782bcc3b95ad13e0cdc6748331e9ca2f77cb4a3

See more details on using hashes here.
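
To check a downloaded file against the digests listed on this page, a quick standard-library sketch (the filename and SHA256 value are taken from the table above):

import hashlib

# compute the SHA256 of the downloaded sdist and compare it to the published digest
with open('openai-helper-0.2.3.tar.gz', 'rb') as fh:
    digest = hashlib.sha256(fh.read()).hexdigest()

assert digest == 'b6b6e7e47fe8c83679c0cbbffbe64984543b1474f825a9539d5b1f030e8ac632'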

Provenance

File details

Details for the file openai_helper-0.2.3-py3-none-any.whl.

File metadata

  • Download URL: openai_helper-0.2.3-py3-none-any.whl
  • Upload date:
  • Size: 34.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.13 CPython/3.8.5 Windows/10

File hashes

Hashes for openai_helper-0.2.3-py3-none-any.whl
Algorithm Hash digest
SHA256 e8409630e62f1643fa86f209fd4127aef99a09a1f3d27d0611858087a4f77221
MD5 ebc2ba787ff34c7696c171828a7aa8a6
BLAKE2b-256 4c22aaaf6f8baaedf1c2b301fa9108ff4c7d5212d542e1e6330a16b91f74bbbe

See more details on using hashes here.

Provenance
