Interaction of multiple language models
Project description
Symposium
Interaction with multiple language models requires at least a minimally 'unified' interface. The 'symposium' package is an attempt to provide one. It is a work in progress and will change without notice.
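If the package is published on PyPI under the same name as the distribution files listed below, it can presumably be installed with pip:

pip install symposium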
Anthropic
Import:
from symposium.connectors import anthropic_rest as ant
Messages
messages = [
    {"role": "user", "content": "Can we change human nature?"}
]
kwargs = {
    "model": "claude-3-sonnet-20240229",
    "system": "answer concisely",
    # "messages": [],
    "max_tokens": 5,
    "stop_sequences": ["stop", ant.HUMAN_PREFIX],
    "stream": False,
    "temperature": 0.5,
    "top_k": 250,
    "top_p": 0.5
}
response = ant.claud_message(messages, **kwargs)
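The return format of the connector is not documented above. Assuming it passes through the raw Anthropic Messages payload, where the generated text sits under response["content"][0]["text"] (an assumption, not a documented contract), a follow-up turn could be sketched like this:

# Assumption: the raw Anthropic Messages payload is returned,
# with the generated text under response["content"][0]["text"].
answer_text = response["content"][0]["text"]
messages.append({"role": "assistant", "content": answer_text})
messages.append({"role": "user", "content": "Could you elaborate a little?"})
follow_up = ant.claud_message(messages, **kwargs)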
Completion
prompt = "Can we change human nature?"
kwargs = {
    "model": "claude-instant-1.2",
    "max_tokens": 5,
    # "prompt": prompt,
    "stop_sequences": [ant.HUMAN_PREFIX],
    "temperature": 0.5,
    "top_k": 250,
    "top_p": 0.5
}
response = ant.claud_complete(prompt, **kwargs)
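If claud_complete likewise returns the raw Text Completions payload (again an assumption), the generated text and stop reason would be available roughly like this:

# Assumption: the raw Text Completions payload is returned.
print(response.get("completion"))
print(response.get("stop_reason"))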
OpenAI
Import:
from symposium.connectors import openai_rest as oai
Messages
messages = [
    {"role": "user", "content": "Can we change human nature?"}
]
kwargs = {
    "model": "gpt-3.5-turbo",
    # "messages": [],
    "max_tokens": 5,
    "n": 1,
    "stop_sequences": ["stop"],
    "seed": None,
    "frequency_penalty": None,
    "presence_penalty": None,
    "logit_bias": None,
    "logprobs": None,
    "top_logprobs": None,
    "temperature": 0.5,
    "top_p": 0.5,
    "user": None
}
responses = oai.gpt_message(messages, **kwargs)
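The point of the unified interface is that models can answer each other. Below is a minimal sketch of handing an OpenAI reply to the Anthropic connector, assuming gpt_message returns the raw chat-completions payload (so the text sits under ["choices"][0]["message"]["content"]); both that path and the reduced cross_kwargs are assumptions, not documented behaviour:

# Assumption: the raw chat-completions payload is returned.
gpt_text = responses["choices"][0]["message"]["content"]

# Pass the OpenAI answer to Claude for a second opinion.
cross_messages = [
    {"role": "user", "content": "Another model said: " + gpt_text + " Do you agree?"}
]
cross_kwargs = {
    "model": "claude-3-sonnet-20240229",
    "system": "answer concisely",
    "max_tokens": 100,
    "temperature": 0.5
}
cross_response = ant.claud_message(cross_messages, **cross_kwargs)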
Completion
prompt = "Can we change human nature?"
kwargs = {
    "model": "gpt-3.5-turbo-instruct",
    # "prompt": prompt,
    "suffix": None,
    "max_tokens": 5,
    "n": 1,
    "best_of": None,
    "stop_sequences": ["stop"],
    "seed": None,
    "frequency_penalty": None,
    "presence_penalty": None,
    "logit_bias": None,
    "logprobs": None,
    "top_logprobs": None,
    "temperature": 0.5,
    "top_p": 0.5,
    "user": None
}
responses = oai.gpt_complete(prompt, **kwargs)
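Assuming the raw completions payload is returned (an assumption), each candidate's text would be under choices[i]["text"]:

# Assumption: the raw completions payload is returned.
for choice in responses.get("choices", []):
    print(choice["text"])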
Gemini
Import:
from symposium.connectors import gemini_rest as gem
Messages
messages = [
    {
        "role": "user",
        "parts": [
            {"text": "Human nature can not be changed, because..."},
            {"text": "...and that is why human nature can not be changed."}
        ]
    },
    {
        "role": "model",
        "parts": [
            {"text": "Should I synthesize a text that will be placed between these two statements and follow the previous instruction while doing that?"}
        ]
    },
    {
        "role": "user",
        "parts": [
            {"text": "Yes, please do."},
            {"text": "Create the most concise text possible, preferably just one sentence."}
        ]
    }
]
kwargs = {
    "model": "gemini-1.0-pro",
    # "messages": [],
    "stop_sequences": ["STOP", "Title"],
    "temperature": 0.5,
    "max_tokens": 5,
    "n": 1,
    "top_p": 0.9,
    "top_k": None
}
response = gem.gemini_content(messages, **kwargs)
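Gemini turns use "parts" lists rather than a single "content" string, so continuing the exchange means appending another parts-style turn. The extraction path below assumes the raw generateContent payload is returned, which is not stated above:

# Assumption: the raw generateContent payload is returned.
model_text = response["candidates"][0]["content"]["parts"][0]["text"]
messages.append({"role": "model", "parts": [{"text": model_text}]})
messages.append({"role": "user", "parts": [{"text": "Now make it even shorter."}]})
next_response = gem.gemini_content(messages, **kwargs)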
PaLM
Import:
from symposium.connectors import palm_rest as palm
Completion
prompt = "Can we change human nature?"
kwargs = {
    "model": "text-bison-001",
    # "prompt": prompt,
    "temperature": 0.5,
    "n": 1,
    "max_tokens": 10,
    "top_p": 0.5,
    "top_k": None
}
responses = palm.palm_complete(prompt, **kwargs)
Messages
context = "This conversation will be happening between Albert and Niels"
examples = [
    {
        "input": {"author": "Albert", "content": "We didn't talk about quantum mechanics lately..."},
        "output": {"author": "Niels", "content": "Yes, indeed."}
    }
]
messages = [
    {
        "author": "Albert",
        "content": "Can we change human nature?"
    },
    {
        "author": "Niels",
        "content": "Not clear..."
    },
    {
        "author": "Albert",
        "content": "Seriously, can we?"
    }
]
kwargs = {
    "model": "chat-bison-001",
    # "context": str,
    # "examples": [],
    # "messages": [],
    "temperature": 0.5,
    # no 'max_tokens' here; beware the effects of that!
    "n": 1,
    "top_p": 0.5,
    "top_k": None
}
responses = palm.palm_content(context, examples, messages, **kwargs)
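Since the PaLM chat format tracks speakers through the "author" field, extending the dialogue is just a matter of appending further turns and calling the connector again, for example:

messages.append({"author": "Niels", "content": "Perhaps, but only over a very long time."})
messages.append({"author": "Albert", "content": "How long, roughly?"})
next_responses = palm.palm_content(context, examples, messages, **kwargs)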
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
symposium-0.1.3.tar.gz (18.5 kB)
Built Distribution
symposium-0.1.3-py3-none-any.whl (31.1 kB)
File details
Details for the file symposium-0.1.3.tar.gz.
File metadata
- Download URL: symposium-0.1.3.tar.gz
- Upload date:
- Size: 18.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | b4dcb0232c97d1aedb45d0e3bf63e9b121f6f2ebd09f3cbbe5fa48ede276c578
MD5 | ee8754726e4fcaafa4c92337951669eb
BLAKE2b-256 | ffd9a5851ab1a7d44d31186744d3813ce827343c8274716cb26cd8281a6a7b1b
File details
Details for the file symposium-0.1.3-py3-none-any.whl.
File metadata
- Download URL: symposium-0.1.3-py3-none-any.whl
- Upload date:
- Size: 31.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | 0df67ddf230ef831d378d2fc69897854f90f4b8526f66bd96d79f0060dc8601b
MD5 | 15e3ce94498d2d4e3045046780033dbd
BLAKE2b-256 | f0f0804bf4acebecb8930b8d252f0cb569e43f500cc586209bdcd9382e62dc22