# Eliza-GPT

Eliza, the classic chatbot from the 1960s, running on OpenAI's Chat Completions API.
## Why?

The main reason is that this seemed like a fun project to help me gain a deeper understanding of OpenAI's Chat Completions API, which has become a de facto standard for turn-based conversation between a human and a machine.

But it is also great to have a fast, free service with negligible resource consumption that can be used as a stand-in for a real LLM when the quality of the responses does not matter, such as during development or testing.
## How To Use

Eliza-GPT is written in Python and can be installed with `pip`:

```bash
$ pip install eliza-gpt
```

Once installed, run it to start it as a local service on your computer:

```bash
$ eliza-gpt
Eliza-GPT is running!
Set base_url="http://127.0.0.1:5005/v1" in your OpenAI client to connect.
```
### Configuration

Run with `--help` to learn about the configuration options, which include:

- Setting the listening IP address and port.
- Adding an API key for authentication.
- Changing the (simulated) response times. Set to 0 for near-immediate responses.
### Connecting With a Chat Client

If you are using the official Python client from OpenAI, add the `base_url` option to it as follows:

```python
from openai import OpenAI

openai_client = OpenAI(base_url='http://127.0.0.1:5005/v1')
```

If you use Langchain, then use the following to connect to Eliza-GPT:

```python
from langchain_openai import ChatOpenAI

chat = ChatOpenAI(base_url='http://127.0.0.1:5005/v1')
```

If you have a custom client that talks directly to the chat completions endpoint, configure `http://127.0.0.1:5005/v1` as your endpoint.

Eliza-GPT supports both the direct and streaming interfaces.
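In the streaming case the server sends Server-Sent Events rather than one JSON object. As a rough sketch of what a client does with such a stream — the sample events below follow the OpenAI chunk schema but are illustrative, not captured from a real Eliza-GPT response:

```python
import json

# Each Server-Sent Event carries a JSON chunk with a content delta,
# and the stream ends with the sentinel "data: [DONE]".
sample_events = [
    'data: {"choices": [{"delta": {"content": "How do "}}]}',
    'data: {"choices": [{"delta": {"content": "you do?"}}]}',
    'data: [DONE]',
]

def collect_stream(events):
    """Join the content deltas from a sequence of SSE data lines."""
    parts = []
    for line in events:
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        parts.append(delta.get("content", ""))
    return "".join(parts)

print(collect_stream(sample_events))  # How do you do?
```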
## Examples

The examples directory includes demonstration applications implemented with the OpenAI client and Langchain.
## Implementation Details

Eliza-GPT implements a portion of the OpenAI Chat Completions API, ignoring anything that isn't useful. In particular, at this time only the `/v1/chat/completions` endpoint is implemented. Any other endpoint from the OpenAI API will return a 404 error to the caller.
The following aspects of the Chat Completions endpoint are currently supported:

- The chat history given in the `messages` argument. Only messages with role `user` are used to "prime" Eliza so that it provides a reasonable response.
- The `stream` argument, which determines if the response should be given in a single JSON object or as a stream of Server-Sent Events.
- The `seed` option, which makes it possible to have deterministic responses.
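Put together, a request that exercises all three supported arguments looks like the following. This is a minimal sketch of the JSON body only, with the model name and message text as placeholder values:

```python
import json

# Request body using only the fields Eliza-GPT reads: "messages"
# (role "user" entries prime Eliza), "stream", and "seed".
request_body = {
    "model": "eliza",  # echoed back in the response, not interpreted
    "messages": [
        {"role": "system", "content": "You are helpful."},  # ignored
        {"role": "user", "content": "I feel anxious."},     # used by Eliza
    ],
    "stream": False,  # single JSON object instead of an SSE stream
    "seed": 42,       # makes the response deterministic
}
print(json.dumps(request_body, indent=2))
```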
These are not used or not implemented:

- Any messages in the chat history that have a role other than `user`.
- Model names. Eliza-GPT returns the requested model name in the response.
- The `n` argument, which controls how many responses are returned. Eliza-GPT always returns a single response.
- The `max_tokens` argument.
- Temperature and other LLM-specific tuning parameters.
- The `stop` argument.
- Anything related to tools or functions that the model may call.
- Anything related to log probabilities.
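One consequence of the list above is that a non-streaming response always carries exactly one choice. A sketch of the response shape, following the standard chat completion schema — the field values here are illustrative, not captured from a real server:

```python
# Illustrative non-streaming response, shaped like an OpenAI chat
# completion object. "model" echoes whatever name the request asked
# for, and "choices" holds a single entry because "n" is not honored.
response = {
    "id": "chatcmpl-1",  # placeholder identifier
    "object": "chat.completion",
    "model": "eliza",    # echoed from the request
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "How do you do?"},
            "finish_reason": "stop",
        }
    ],
}

reply = response["choices"][0]["message"]["content"]
print(reply)  # How do you do?
```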
## Credits

The Eliza chatbot used in this project is called Eliza.py and was created by Riccardo Di Maio.
## File details

Details for the file `eliza-gpt-0.2.0.tar.gz`.

### File metadata

- Download URL: eliza-gpt-0.2.0.tar.gz
- Upload date:
- Size: 46.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.12.0

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | b4961c6212232c9d6354807fe906606ea9fee6101e75e5f7d8289b8eba671e16 |
| MD5 | 542199ba02fc82ddb6a09970ce3b75b1 |
| BLAKE2b-256 | 4a7e4d34a0fb98480e9b89dd936e877548e24bc062db6626105eb379f9f5c363 |
## File details

Details for the file `eliza_gpt-0.2.0-py3-none-any.whl`.

### File metadata

- Download URL: eliza_gpt-0.2.0-py3-none-any.whl
- Upload date:
- Size: 46.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.12.0

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 94637574a6ae67842b89c575730c15b20075f5f9eb9d63b0f1179cddd0811f16 |
| MD5 | 56334e1f1e5287f85034932f507b7911 |
| BLAKE2b-256 | 7d59ebd291c71114e3efd101d8a80783dc84ace7a0dbce0a6b4dac53ee2ceeb6 |