
Run gptscripts from Python apps

Project description

GPTScript Python Module

Introduction

The GPTScript Python module is a library that provides a simple interface to create and run gptscripts within Python applications and Jupyter notebooks. It allows you to define tools, execute them, and process the responses.

Installation

You can install the GPTScript Python module using pip.

pip install gptscript

On MacOS, Windows (x86-64), and Linux, the platform-specific wheels bundle the gptscript binary, so no extra install step is needed.

SDIST and none-any wheel installations

When installing from the sdist or the none-any wheel, the binary is not packaged by default. You must run the install_gptscript command to install the binary.

install_gptscript

The script is added to the same bin directory as the Python executable, so it should already be on your PATH.

Alternatively, you can install the gptscript CLI from your code by running:

from gptscript.install import install
install()

Using an existing gptscript CLI

If you already have the gptscript CLI installed, you can point the module at it by setting the GPTSCRIPT_BIN environment variable:

export GPTSCRIPT_BIN="/path/to/gptscript"

Using the Module

The module requires the OPENAI_API_KEY environment variable to be set to your OpenAI API key. You can set it in your shell or in your code.

export OPENAI_API_KEY="your-key"
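
Or, to set it from Python code (a minimal sketch; replace the placeholder with your real key):

import os

# Equivalent to the shell export above; set the variable before calling any gptscript functions.
os.environ["OPENAI_API_KEY"] = "your-key"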

Tools

The Tool class represents a gptscript tool. The fields align with what you can define in a normal gptscript .gpt file; a short construction sketch follows the field list below.

Fields

  • name: The name of the tool.
  • description: A description of the tool.
  • tools: Additional tools associated with the main tool.
  • max_tokens: The maximum number of tokens to generate.
  • model: The GPT model to use.
  • cache: Whether to use caching for responses.
  • temperature: The temperature parameter for response generation.
  • args: Additional arguments for the tool.
  • internal_prompt: Boolean; defaults to False.
  • instructions: Instructions or additional information about the tool.
  • json_response: Whether the response should be in JSON format. (If you set this to True, you must also say 'json' in the instructions.)
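
For example, a minimal construction sketch that sets a few of these fields (all names and values below are illustrative):

from gptscript.tool import Tool

# Illustrative tool definition; adjust the fields to your use case.
summarize = Tool(
    name="summarize",
    description="Summarizes a block of text",
    max_tokens=500,
    cache=False,
    args={"text": "the text to summarize"},
    instructions="Summarize ${text} in three sentences.",
)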

Primary Functions

Aside from the list methods, there are exec and exec_file functions that let you execute a tool and get its responses. Both also have streaming versions if you want to process the output streams in your code while the tool is running.

Opts

You can pass the following options to the exec and exec_file functions:

opts = {
    "cache": True,      # defaults to True; set False to disable caching
    "cache-dir": "",    # optional directory to use for the cache
}

cache can be set to True or False to enable or disable caching globally, or it can be set at the individual tool level. cache-dir can be set to a directory to use for caching; if not set, the default cache directory is used.
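
For example, a sketch of passing these options (assuming opts is supplied as the second argument, as in the exec signature below; the cache directory is illustrative):

from gptscript.command import exec
from gptscript.tool import Tool

# Disable caching globally and point the cache at a custom directory.
opts = {"cache": False, "cache-dir": "/tmp/gptscript-cache"}

tool = Tool(instructions="Say hello.")
response = exec(tool, opts)
print(response)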

list_models()

This function lists the available GPT models.

from gptscript.command import list_models

models = list_models()
print(models)

list_tools()

This function lists the available tools.

from gptscript.command import list_tools

tools = list_tools()
print(tools)

exec(tool, opts)

This function executes a tool and returns the response.

from gptscript.command import exec
from gptscript.tool import Tool

tool = Tool(
    json_response=True,
    instructions="""
Create three short graphic artist descriptions and their muses. 
These should be descriptive and explain their point of view.
Also come up with a made up name, they each should be from different
backgrounds and approach art differently.
the response should be in JSON and match the format:
{
   artists: [{
      name: "name"
      description: "description"
   }]
}
""",
    )


response = exec(tool)
print(response)

exec_file(tool_path, input="", opts)

This function executes a tool from a file and returns the response. The input values are passed to the tool as args.

from gptscript.command import exec_file

response = exec_file("./example.gpt")
print(response)
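
To pass input to the script, use the input parameter. The flag name below is hypothetical; match it to the args your .gpt file declares:

from gptscript.command import exec_file

# Hypothetical arg; use whatever args your example.gpt actually defines.
response = exec_file("./example.gpt", input="--topic 'graphic artists'")
print(response)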

stream_exec(tool, opts)

This function streams the execution of a tool and returns the output, error, and process wait function. The streams must be read from.

from gptscript.command import stream_exec
from gptscript.tool import Tool

tool = Tool(
    json_response=True,
    instructions="""
Create three short graphic artist descriptions and their muses. 
These should be descriptive and explain their point of view.
Also come up with a made up name, they each should be from different
backgrounds and approach art differently.
the response should be in JSON and match the format:
{
   artists: [{
      name: "name"
      description: "description"
   }]
}
""",
    )

def print_output(out, err):
    # Error stream has the debug info that is useful to see
    for line in err:
        print(line)

    for line in out:
        print(line)

out, err, wait = stream_exec(tool)
print_output(out, err)
wait()

stream_exec_file(tool_path, input="", opts)

This function streams the execution of a tool from a file and returns the output, error, and process wait function. The input values are passed to the tool as args.

from gptscript.command import stream_exec_file

def print_output(out, err):
    # Error stream has the debug info that is useful to see
    for line in err:
        print(line)

    for line in out:
        print(line)

out, err, wait = stream_exec_file("./init.gpt")
print_output(out, err)
wait()

Example Usage

from gptscript.command import exec
from gptscript.tool import FreeForm, Tool

# Define a simple tool
simple_tool = FreeForm(
    content="""
What is the capital of the United States?
"""
)

# Define a complex tool
complex_tool = Tool(
    tools=["sys.write"],
    json_response=True,
    cache=False,
    instructions="""
    Create three short graphic artist descriptions and their muses.
    These should be descriptive and explain their point of view.
    Also come up with a made-up name, they each should be from different
    backgrounds and approach art differently.
    the JSON response format should be:
    {
        artists: [{
            name: "name"
            description: "description"
        }]
    }
    """
)

# Execute the complex tool
response, err = exec(complex_tool)
print(err)
print(response)

# Execute the simple tool
resp, err = exec(simple_tool)
print(err)
print(resp)

Example 2: multiple tools

In this example, multiple tools are provided to the exec function. The first tool is the only one that may omit the name field. The tools are joined and passed to gptscript as a single script.

from gptscript.command import exec
from gptscript.tool import Tool

tools = [
    Tool(tools=["echo"], instructions="echo hello times"),
    Tool(
        name="echo",
        tools=["sys.exec"],
        description="Echo's the input",
        args={"input": "the string input to echo"},
        instructions="""
        #!/bin/bash
        echo ${input}
        """,
    ),
]

resp, err = exec(tools)
print(err)
print(resp)



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

gptscript-0.4.2.tar.gz (17.8 kB)

Uploaded Source

Built Distributions

gptscript-0.4.2-py3-none-win_amd64.whl (7.1 MB)

Uploaded Python 3 Windows x86-64

gptscript-0.4.2-py3-none-macosx_10_9_universal2.whl (13.8 MB)

Uploaded Python 3 macOS 10.9+ universal2 (ARM64, x86-64)

gptscript-0.4.2-py3-none-any.whl (15.0 kB)

Uploaded Python 3

File details

Details for the file gptscript-0.4.2.tar.gz.

File metadata

  • Download URL: gptscript-0.4.2.tar.gz
  • Upload date:
  • Size: 17.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.2

File hashes

Hashes for gptscript-0.4.2.tar.gz
  • SHA256: b038d25aadd6320aedcf180de00ed7567b43e9900cb23c83174dfbc468fbe014
  • MD5: 16243c95cfffb39da39a62ccf122e09e
  • BLAKE2b-256: ab374092e86ad1c7ace9da88c3ce45112e793607ce09ba2465b30ecb44a17508


File details

Details for the file gptscript-0.4.2-py3-none-win_amd64.whl.

File metadata

  • Download URL: gptscript-0.4.2-py3-none-win_amd64.whl
  • Upload date:
  • Size: 7.1 MB
  • Tags: Python 3, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.2

File hashes

Hashes for gptscript-0.4.2-py3-none-win_amd64.whl
  • SHA256: 900d094f470ff5243638d9287cc7320b0c24632aee63cdca27e42d731461a68a
  • MD5: 33fb0debf18075fa729bbda65c3fc9ac
  • BLAKE2b-256: b864751bb348ba1dd3b7784f8a07a69b2bf886cbfacaf330a9ac23ce5a25bf86


File details

Details for the file gptscript-0.4.2-py3-none-manylinux2014_x86_64.whl.

File metadata

File hashes

Hashes for gptscript-0.4.2-py3-none-manylinux2014_x86_64.whl
  • SHA256: 4762d05b6e524eca52d4322d2b948060dc5dfa719e6709b6231afbc5f938c0c7
  • MD5: 59440b5779434a8eb0a63e945d452862
  • BLAKE2b-256: 73db4cc9a916a1fd307fdaa53e55c785cdd9995874ea14f51e45aee9e3136d16


File details

Details for the file gptscript-0.4.2-py3-none-manylinux2014_aarch64.whl.

File metadata

File hashes

Hashes for gptscript-0.4.2-py3-none-manylinux2014_aarch64.whl
  • SHA256: dc887597d2ed9702e0ad87c8741eacba69b06c206c93c667f5477fe708d49006
  • MD5: c1af74e43586ca93b3952b16c1ebc445
  • BLAKE2b-256: ac240d757ff995ce001c548c70f63047584eb3eae28dc47847c07920dcd5a3a8


File details

Details for the file gptscript-0.4.2-py3-none-macosx_10_9_universal2.whl.

File metadata

File hashes

Hashes for gptscript-0.4.2-py3-none-macosx_10_9_universal2.whl
  • SHA256: 6ddfa890dd737bdb4b96f4ab49d2b9b56d628cc89e54399f5712686d9c7f28f4
  • MD5: 04f20021c974836d7eb54d79f05ff141
  • BLAKE2b-256: a947171b861563bf1a84a045a0445534f62269930611565fbd40b8fea2a3cbcb


File details

Details for the file gptscript-0.4.2-py3-none-any.whl.

File metadata

  • Download URL: gptscript-0.4.2-py3-none-any.whl
  • Upload date:
  • Size: 15.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.2

File hashes

Hashes for gptscript-0.4.2-py3-none-any.whl
  • SHA256: bd8a3cf60d084dd83c097c8b9c4455a41e5075fdaa8b61b0c015743343a6f8b8
  • MD5: 16b12b90c72c67967636fd59fb443941
  • BLAKE2b-256: 983c7164d31a3f157ec43560d82538a9854f899577a16b9cad12481321f08b56

