Interaction of multiple language models
Project description
Symposium
Interactions with multiple language models require at least a minimally 'unified' interface. The 'symposium' package is an attempt to provide one. It is a work in progress and will change without notice.
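The package is distributed on PyPI and can be installed with pip (pip install symposium).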
Anthropic
Import:
from symposium.connectors import anthropic_rest as ant
Messages
messages = [
    {"role": "user", "content": "Can we change human nature?"}
]
kwargs = {
    "model": "claude-3-sonnet-20240229",
    "system": "answer concisely",
    # "messages": [],
    "max_tokens": 5,
    "stop_sequences": ["stop", ant.HUMAN_PREFIX],
    "stream": False,
    "temperature": 0.5,
    "top_k": 250,
    "top_p": 0.5
}
response = ant.claud_message(messages, **kwargs)
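If claud_message hands back the parsed JSON body of Anthropic's Messages API (an assumption about the connector, not something documented here), the reply text could be read roughly like this:
# Sketch, assuming response is the raw Messages API payload as a dict.
if response is not None:
    reply_text = "".join(
        block["text"]
        for block in response.get("content", [])
        if block.get("type") == "text"
    )
    print(reply_text)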
Completion
prompt = "Can we change human nature?"
kwargs = {
    "model": "claude-instant-1.2",
    "max_tokens": 5,
    # "prompt": prompt,
    "stop_sequences": [ant.HUMAN_PREFIX],
    "temperature": 0.5,
    "top_k": 250,
    "top_p": 0.5
}
response = ant.claud_complete(prompt, **kwargs)
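Assuming claud_complete similarly returns the raw Text Completions response, the generated text would sit under the completion key:
# Sketch, assuming response is the raw Text Completions payload as a dict.
if response is not None:
    print(response.get("completion", ""))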
OpenAI
Import:
from symposium.connectors import openai_rest as oai
Messages
messages = [
    {"role": "user", "content": "Can we change human nature?"}
]
kwargs = {
    "model": "gpt-3.5-turbo",
    # "messages": [],
    "max_tokens": 5,
    "n": 1,
    "stop_sequences": ["stop"],
    "seed": None,
    "frequency_penalty": None,
    "presence_penalty": None,
    "logit_bias": None,
    "logprobs": None,
    "top_logprobs": None,
    "temperature": 0.5,
    "top_p": 0.5,
    "user": None
}
responses = oai.gpt_message(messages, **kwargs)
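The connector's return shape is not documented here; if gpt_message passes through the raw Chat Completions response, the first choice could be extracted like this (an assumption):
# Sketch, assuming responses is the raw Chat Completions payload (a dict).
if isinstance(responses, dict):
    first_choice = responses["choices"][0]
    print(first_choice["message"]["content"])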
Completion
prompt = "Can we change human nature?"
kwargs = {
    "model": "gpt-3.5-turbo-instruct",
    # "prompt": str,
    "suffix": None,
    "max_tokens": 5,
    "n": 1,
    "best_of": None,
    "stop_sequences": ["stop"],
    "seed": None,
    "frequency_penalty": None,
    "presence_penalty": None,
    "logit_bias": None,
    "logprobs": None,
    "top_logprobs": None,
    "temperature": 0.5,
    "top_p": 0.5,
    "user": None
}
responses = oai.gpt_complete(prompt, **kwargs)
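If gpt_complete likewise returns the raw Completions response, the generated text is in the text field of the first choice (again, an assumption about the connector):
# Sketch, assuming responses is the raw Completions payload (a dict).
if isinstance(responses, dict):
    print(responses["choices"][0]["text"])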
Gemini
Import:
from symposium.connectors import gemini_rest as gem
Messages
messages = [
    {
        "role": "user",
        "parts": [
            {"text": "Human nature can not be changed, because..."},
            {"text": "...and that is why human nature can not be changed."}
        ]
    }, {
        "role": "model",
        "parts": [
            {"text": "Should I synthesize a text that will be placed between these two statements and follow the previous instruction while doing that?"}
        ]
    }, {
        "role": "user",
        "parts": [
            {"text": "Yes, please do."},
            {"text": "Create the most concise text possible, preferably just one sentence."}
        ]
    }
]
kwargs = {
    "model": "gemini-1.0-pro",
    # "messages": [],
    "stop_sequences": ["STOP", "Title"],
    "temperature": 0.5,
    "max_tokens": 5,
    "n": 1,
    "top_p": 0.9,
    "top_k": None
}
response = gem.gemini_content(messages, **kwargs)
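Gemini's generateContent responses nest the text inside candidates; if gemini_content returns that payload unchanged (an assumption), the text could be pulled out like this:
# Sketch, assuming response is the raw generateContent payload (a dict).
if isinstance(response, dict):
    parts = response["candidates"][0]["content"]["parts"]
    print("".join(part.get("text", "") for part in parts))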
PaLM
Import:
from symposium.connectors import palm_rest as path
Completion
prompt = "Can we change human nature?"
kwargs = {
    "model": "text-bison-001",
    # "prompt": str,
    "temperature": 0.5,
    "n": 1,
    "max_tokens": 10,
    "top_p": 0.5,
    "top_k": None
}
responses = path.palm_complete(prompt, **kwargs)
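The PaLM text endpoint reports its generations under candidates, each with an output field; assuming palm_complete returns that structure unchanged:
# Sketch, assuming responses is the raw generateText payload (a dict).
if isinstance(responses, dict):
    for candidate in responses.get("candidates", []):
        print(candidate.get("output", ""))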
Messages
context = "This conversation will be happening between Albert and Niels"
examples = [
    {
        "input": {"author": "Albert", "content": "We didn't talk about quantum mechanics lately..."},
        "output": {"author": "Niels", "content": "Yes, indeed."}
    }
]
messages = [
    {
        "author": "Albert",
        "content": "Can we change human nature?"
    }, {
        "author": "Niels",
        "content": "Not clear..."
    }, {
        "author": "Albert",
        "content": "Seriously, can we?"
    }
]
kwargs = {
    "model": "chat-bison-001",
    # "context": str,
    # "examples": [],
    # "messages": [],
    "temperature": 0.5,
    # no 'max_tokens' parameter here; the length of the reply is not capped
    "n": 1,
    "top_p": 0.5,
    "top_k": None
}
responses = path.palm_content(context, examples, messages, **kwargs)
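The chat endpoint returns candidates whose content field holds the next reply; assuming palm_content passes that payload through unchanged:
# Sketch, assuming responses is the raw generateMessage payload (a dict).
if isinstance(responses, dict):
    for candidate in responses.get("candidates", []):
        print(candidate.get("author"), ":", candidate.get("content", ""))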
Project details
Download files
Source Distribution: symposium-0.1.5.tar.gz (18.6 kB)
Built Distribution: symposium-0.1.5-py3-none-any.whl (31.3 kB)
File details
Details for the file symposium-0.1.5.tar.gz.
File metadata
- Download URL: symposium-0.1.5.tar.gz
- Upload date:
- Size: 18.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | 6a3fe0843946bdb52871a646397db131db152537225069ac2b4108745e81e8ed
MD5 | b607fb199a167a5f34d55182f2fbaee2
BLAKE2b-256 | 19f4a47d91b18ff3ad0a07f52c25672d5d0eb3e47531b67bcf67c8454938e308
File details
Details for the file symposium-0.1.5-py3-none-any.whl.
File metadata
- Download URL: symposium-0.1.5-py3-none-any.whl
- Upload date:
- Size: 31.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.12
File hashes
Algorithm | Hash digest
---|---
SHA256 | f93adba0db9ba0e4addbcb6ae7ced3c43139c7a448477069aec0fbc88533a8e5
MD5 | 846afee09d2e6a6ececa77fafc2966a1
BLAKE2b-256 | 6a3ec63576a7c176bab2bca9c900e1f938c214f2e4ffb4e55654a307529a0da6