Interaction of multiple language models
Symposium
Interacting with multiple language models requires at least a minimally 'unified' interface. The 'symposium' package is an attempt to provide one. It is a work in progress and will change without notice.
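Across the chat-style connectors below, a conversation is passed around as a list of role/content dictionaries. The snippet is only an illustration of that common shape; the exact keys each connector expects may differ per backend:

```python
# Illustrative only: the common role/content message shape used by the
# chat-style endpoints below. Roles alternate between "user" and "assistant".
messages = [
    {"role": "user", "content": "Answer in one word: what is a symposium?"},
    {"role": "assistant", "content": "Drinking-party."},
    {"role": "user", "content": "More literally?"},
]

# Every entry carries exactly the keys "role" and "content".
assert all(set(m) == {"role", "content"} for m in messages)
```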
Anthropic
Import:
from symposium.connectors import anthropic_rest as ant
Messages
```python
messages = [
    {"role": "user", "content": "Hello"}  # example conversation
]
kwargs = {
    "model": "claude-3-sonnet-20240229",
    "system": "answer concisely",
    "messages": [],
    "max_tokens": 5,
    "stop_sequences": ["stop", ant.HUMAN_PREFIX],
    "stream": False,
    "temperature": 0.5,
    "top_k": 250,
    "top_p": 0.5
}
response = ant.claud_message(messages, **kwargs)
```
Completion
```python
prompt = "Hello"  # example prompt text
kwargs = {
    "model": "claude-instant-1.2",
    "max_tokens": 5,
    "prompt": f"{ant.HUMAN_PREFIX}{prompt}{ant.MACHINE_PREFIX}",
    "stop_sequences": [ant.HUMAN_PREFIX],
    "temperature": 0.5,
    "top_k": 250,
    "top_p": 0.5
}
response = ant.claud_complete(prompt, **kwargs)
```
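For the completion endpoint, the prompt has to be wrapped in the human/machine turn prefixes, as in the f-string above. A rough sketch of that formatting with assumed prefix values; the package's `ant.HUMAN_PREFIX` and `ant.MACHINE_PREFIX` constants are the authoritative source:

```python
# Assumed values for illustration; use ant.HUMAN_PREFIX / ant.MACHINE_PREFIX
# from the package in real code.
HUMAN_PREFIX = "\n\nHuman:"
MACHINE_PREFIX = "\n\nAssistant:"

prompt = "What is a symposium?"
formatted = f"{HUMAN_PREFIX}{prompt}{MACHINE_PREFIX}"

# The wrapped prompt opens with the human turn and ends ready for the
# model's reply.
assert formatted.endswith("Assistant:")
```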
OpenAI
Import:
from symposium.connectors import openai_rest as oai
Messages
```python
messages = [
    {"role": "user", "content": "Hello"}  # example conversation
]
kwargs = {
    "model": "gpt-3.5-turbo",
    "messages": [],
    "max_tokens": 5,
    "n": 1,
    "stop_sequences": ["stop"],
    "seed": None,
    "frequency_penalty": None,
    "presence_penalty": None,
    "logit_bias": None,
    "logprobs": None,
    "top_logprobs": None,
    "temperature": 0.5,
    "top_p": 0.5,
    "user": None
}
responses = oai.gpt_message(messages, **kwargs)
```
Completion
```python
prompt = "Say hello"  # example prompt text
kwargs = {
    "model": "gpt-3.5-turbo-instruct",
    "prompt": prompt,
    "suffix": None,  # optional text appended after the completion
    "max_tokens": 5,
    "n": 1,
    "best_of": None,
    "stop_sequences": ["stop"],
    "seed": None,
    "frequency_penalty": None,
    "presence_penalty": None,
    "logit_bias": None,
    "logprobs": None,
    "top_logprobs": None,
    "temperature": 0.5,
    "top_p": 0.5,
    "user": None
}
responses = oai.gpt_complete(prompt, **kwargs)
```
Gemini
Import:
from symposium.connectors import gemini_rest as gem
Messages
```python
messages = []  # conversation turns to send
kwargs = {
    "model": "gemini-1.0-pro",
    "messages": [],
    "stop_sequences": ["STOP", "Title"],
    "temperature": 0.5,
    "max_tokens": 5,
    "n": 1,
    "top_p": 0.9,
    "top_k": None
}
response = gem.gemini_content(messages, **kwargs)
```
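Gemini's underlying REST API represents each turn with a list of `parts` rather than a flat content string; if the connector forwards messages unchanged, a single-turn request would look roughly like this (an assumption about the wrapper, not a documented contract):

```python
# Assumed shape, mirroring the Gemini REST "contents" entries; the
# symposium wrapper may well accept a simpler format and translate it.
messages = [
    {"role": "user", "parts": [{"text": "Suggest a title for a short story."}]}
]

# Each turn carries a role plus a non-empty list of text parts.
assert messages[0]["parts"][0]["text"]
```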
PaLM
Import:
from symposium.connectors import palm_rest as path
Completion
```python
prompt = "Write a haiku about spring."  # example prompt text
kwargs = {
    "model": "text-bison-001",
    "prompt": prompt,
    "temperature": 0.5,
    "n": 1,
    "max_tokens": 10,
    "top_p": 0.5,
    "top_k": None
}
responses = path.palm_complete(prompt, **kwargs)
```
Messages
```python
messages = []  # conversation turns to send
kwargs = {
    "model": "chat-bison-001",
    "context": "Answer concisely.",  # example system-style context
    "examples": [],
    "messages": [],
    "temperature": 0.5,
    # no 'max_tokens' parameter here; beware of arbitrarily long replies
    "n": 1,
    "top_p": 0.5,
    "top_k": None
}
responses = path.palm_content(messages, **kwargs)
```