AutoPlugin
AutoPlugin is a Python package that makes it easy to convert Python functions into ChatGPT plugins. With just a couple of lines of code, you can:
- Automatically create an OpenAPI spec with custom endpoints for your registered Python functions, telling ChatGPT how to use them. Pull endpoint descriptions from the function docstring or generate them automatically with the OpenAI API.
- Generate the `ai-plugin.json` file to register your plugin with ChatGPT.
- Launch a local server that ChatGPT can use during development.
Installation
To install AutoPlugin, simply run the following command:
```
pip install autoplugin
```
To also enable automatic generation of endpoint descriptions for the OpenAPI specification from source code, install with:
```
pip install 'autoplugin[gen]'
```
Basic Usage
To get started with AutoPlugin, follow these steps:
- Import the necessary functions from AutoPlugin:

```python
from autoplugin import register, generate, launch, get_app
```

- Create an app instance, backed by FastAPI:

```python
app = get_app()
```
- Use the `register` decorator to register your functions as API endpoints. AutoPlugin will automatically generate descriptions if needed.

```python
@register(app, methods=["GET"])
async def get_order(name: str) -> str:
    order = await get_order_from_db(name)
    return f"Order for {name}: {order}"
# Generated description: "Retrieves an order from the database for a given name."
```
- Generate the necessary files (`openapi.yaml` and `ai-plugin.json`) for your ChatGPT plugin. Optionally, specify `out_dir` to change where they're saved, or set `overwrite_openapi_spec=False` or `overwrite_plugin_spec=False` to avoid overwriting the respective files.
```python
# generated files saved to the `.well-known/` directory
generate(app, name="Example", description="Plugin to add numbers or greet users")
```
- Launch the server. Optionally, specify `host` and `port`:

```python
launch(app)  # API hosted at localhost:8000
```
- Follow these steps to run a custom plugin:
  - On ChatGPT, start a new chat.
  - Under "Models", select "Plugins".
  - In the Plugins dropdown, select "Plugin store".
  - Click "Develop your own plugin".
  - Enter the URL the server is running at ("localhost:8000" by default) and hit enter.
  - Click "Install localhost plugin".
Example
Here's a complete example that demonstrates how to use AutoPlugin to create API endpoints for two functions, `hello` and `add`. It also generates the `openapi.yaml` and `ai-plugin.json` files, by default in the `.well-known` directory:
```python
from autoplugin.autoplugin import register, generate, launch, get_app

app = get_app()

@register(app, methods=["GET", "POST"])
async def hello(name: str, age: int = 5) -> str:
    return f"Hello, {name}! Age {age}."

@register(app, methods=["GET"])
async def add(a: int, b: int) -> int:
    """ Adds two numbers """
    return a + b

# Generate the necessary files
generate(app, name="Example", description="Plugin to add numbers or greet users")

# Launch the server
launch(app)
```
This example creates a FastAPI server with two endpoints: `/hello`, which accepts GET and POST requests, and `/add`, which accepts GET requests.
AutoPlugin will use the docstring for the OpenAPI description of `/add` and generate a description for `/hello` automatically by passing the function's source code to OpenAI's API.
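With the server from this example running, you can exercise the endpoints directly. Here's a quick sketch using the `requests` library (it assumes the default host and port; the expected responses follow the `{"result": ...}` format shown in the Testing section below):

```python
import requests

# GET /hello with a query parameter; `age` falls back to its default of 5
response = requests.get("http://localhost:8000/hello", params={"name": "Alice"})
print(response.json())  # {"result": "Hello, Alice! Age 5."}

# GET /add with two query parameters
response = requests.get("http://localhost:8000/add", params={"a": 2, "b": 3})
print(response.json())  # {"result": 5}
```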
Docs
The `@register` Decorator
The `@register` decorator is used as follows:
```python
@register(app: FastAPI,
          methods: List[str],                    # which HTTP methods to support
          description: Optional[str],            # if provided, used as-is
          generate_description: Optional[bool])  # whether to autogenerate a description
def my_func(...):
    ...
```
AutoPlugin generates function descriptions in the OpenAPI spec so that ChatGPT knows how to use your endpoints. There are a few arguments to customize the behavior of this generation.
- `app`: Your FastAPI application. AutoPlugin provides a `get_app` function that includes CORSMiddleware for testing convenience (allows all origins by default).
- `methods`: A list of HTTP methods to be supported (e.g. "GET", "POST").
- `description`: If provided, overrides everything else and is used directly as the endpoint description for the OpenAPI spec.
- `generate_description`: If set to `True`, AutoPlugin will generate one automatically from OpenAI's API (requires the LangChain package and setting the `OPENAI_API_KEY` environment variable).
By default (if neither `description` nor `generate_description` is provided), the description is fetched from the docstring. If there's no docstring, AutoPlugin falls back to generating one automatically.
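To make the options concrete, here's a brief sketch of per-endpoint control (the endpoints and their bodies are placeholders, not part of AutoPlugin):

```python
from autoplugin import register, get_app

app = get_app()

# Explicit description: used verbatim in the OpenAPI spec.
@register(app, methods=["GET"], description="Returns the current server status.")
async def status() -> str:
    return "ok"

# Ask AutoPlugin to generate the description from the source code.
# Requires `pip install 'autoplugin[gen]'` and the OPENAI_API_KEY environment variable.
@register(app, methods=["GET"], generate_description=True)
async def ping() -> str:
    return "pong"
```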
The `generate` Function
The `generate` function has the following signature:
```python
def generate(app: FastAPI, version="v1", out_dir=".well-known",
             overwrite_plugin_spec=True, overwrite_openapi_spec=True,
             name="", description="",
             **kwargs)
```
- `app`: Your FastAPI application again.
- `version="v1"`: The version number to pass to both the plugin and OpenAPI specs.
- `out_dir=".well-known"`: The directory to save both files to.
- `overwrite_plugin_spec=True`: If set to `False`, does not overwrite `ai-plugin.json` if it already exists.
- `overwrite_openapi_spec=True`: If set to `False`, does not overwrite `openapi.yaml` if it already exists.
- `name=""`: If specified, used for both `name_for_human` and `name_for_model`.
- `description=""`: If specified, used for both `description_for_human` and `description_for_model`. Keep in mind the best practices for descriptions.
- `**kwargs`: All other keyword arguments are passed on to `ai-plugin.json` directly. See the full list of possible options here.
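For instance, extra manifest fields can be passed through `**kwargs` (a sketch with placeholder values; `logo_url` and `contact_email` are standard `ai-plugin.json` fields):

```python
generate(
    app,
    name="Example",
    description="Plugin to add numbers or greet users",
    overwrite_plugin_spec=False,  # keep an existing ai-plugin.json
    # Passed through to ai-plugin.json directly:
    logo_url="http://localhost:8000/logo.png",
    contact_email="dev@example.com",
)
```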
The `launch` Function
The `launch` function has the following signature:
```python
def launch(app: FastAPI, host="127.0.0.1", port=8000):
```
- `app`: Still your FastAPI application.
- `host="127.0.0.1"`: The host to launch the server on.
- `port=8000`: The port to launch the server on.
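For example, to serve on a different port and listen on all interfaces (the values here are arbitrary):

```python
launch(app, host="0.0.0.0", port=3000)  # reachable from other machines, on port 3000
```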
Testing
AutoPlugin also provides a `testing_server` utility (courtesy of florimondmanca) for testing your endpoints. Here's an example of how you can use it to test the `/hello` and `/add` endpoints from the example above:
```python
from autoplugin.testing import testing_server
from os.path import join

import requests

def test_api():
    host = "127.0.0.1"
    port = 8000
    server, base_url = testing_server(host=host, port=port,
                                      app_file="path/to/example.py", app_var="app")
    with server.run_in_thread():
        # Server is started. Do your tests here.
        response = requests.post(join(base_url, "hello"), json={"name": "John Doe", "age": 31})
        assert response.json() == {"result": "Hello, John Doe! Age 31."}
        response = requests.get(join(base_url, "hello"), params={"name": "Jane Smith"})
        assert response.json() == {"result": "Hello, Jane Smith! Age 5."}
        response = requests.get(join(base_url, "add"), params={"a": 6, "b": 8})
        assert response.json() == {"result": 14}
    # Server will be stopped.

test_api()
```
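Since the function name starts with `test_`, this also works under pytest: remove the explicit `test_api()` call and let pytest discover and run the test.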