Interaction of multiple language models
Symposium
Interacting with multiple language models requires at least a minimally 'unified' interface. The 'symposium' package is an attempt to provide one. It is a work in progress and will change without notice.
Anthropic
Import:
from symposium.connectors import anthropic_rest as ant
Messages
messages = [
    {"role": "user", "content": "Hi Claude."}  # example message
]
kwargs = {
    "model": "claude-3-sonnet-20240229",
    "system": "answer concisely",
    "max_tokens": 5,
    "stop_sequences": ["stop", ant.HUMAN_PREFIX],
    "stream": False,
    "temperature": 0.5,
    "top_k": 250,
    "top_p": 0.5
}
response = ant.claud_message(messages, **kwargs)
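If the connector hands back the provider's raw JSON (an assumption; check the package source), the reply text sits in the `content` list of Anthropic's Messages response. A minimal sketch of pulling it out, using a sample dict that mirrors Anthropic's documented response shape:

```python
# Sample payload in Anthropic's documented Messages response shape;
# whether ant.claud_message returns it verbatim is an assumption.
sample_response = {
    "content": [{"type": "text", "text": "Concise answer."}],
    "stop_reason": "end_turn",
}

def anthropic_text(response: dict) -> str:
    """Concatenate the text blocks of an Anthropic Messages response."""
    return "".join(
        block["text"] for block in response["content"] if block["type"] == "text"
    )

print(anthropic_text(sample_response))
```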
Completion
prompt = "Can we discuss this?"  # example prompt
kwargs = {
    "model": "claude-instant-1.2",
    "max_tokens": 5,
    "stop_sequences": [ant.HUMAN_PREFIX],
    "temperature": 0.5,
    "top_k": 250,
    "top_p": 0.5
}
response = ant.claud_complete(
    f"{ant.HUMAN_PREFIX}{prompt}{ant.MACHINE_PREFIX}", **kwargs
)
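The `HUMAN_PREFIX` and `MACHINE_PREFIX` constants presumably carry Anthropic's legacy Text Completions turn markers; their exact values here are an assumption, so local stand-ins are defined below to show what the f-string wrapping produces:

```python
# Assumed values of the prefix constants (Anthropic's legacy completion
# format); symposium's actual constants may differ.
HUMAN_PREFIX = "\n\nHuman:"
MACHINE_PREFIX = "\n\nAssistant:"

prompt = "Can we discuss this?"
# Same wrapping as the f-string passed to claud_complete above.
wrapped = f"{HUMAN_PREFIX}{prompt}{MACHINE_PREFIX}"
print(repr(wrapped))
```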
OpenAI
Import:
from symposium.connectors import openai_rest as oai
Messages
messages = [
    {"role": "user", "content": "Hi, how are you?"}  # example message
]
kwargs = {
    "model": "gpt-3.5-turbo",
    "max_tokens": 5,
    "n": 1,
    "stop_sequences": ["stop"],
    "seed": None,
    "frequency_penalty": None,
    "presence_penalty": None,
    "logit_bias": None,
    "logprobs": None,
    "top_logprobs": None,
    "temperature": 0.5,
    "top_p": 0.5,
    "user": None
}
responses = oai.gpt_message(messages, **kwargs)
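Assuming the connector returns OpenAI's raw chat completion payload (not confirmed by this page), each reply lives under `choices[n]["message"]["content"]`. A minimal sketch:

```python
# Sample payload in OpenAI's documented chat completion shape;
# whether oai.gpt_message returns this dict unchanged is an assumption.
sample_response = {
    "choices": [
        {"index": 0, "message": {"role": "assistant", "content": "Hello!"}}
    ]
}

def openai_chat_text(response: dict, n: int = 0) -> str:
    """Return the content of the n-th choice of a chat completion."""
    return response["choices"][n]["message"]["content"]

print(openai_chat_text(sample_response))
```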
Completion
prompt = "The weather today is"  # example prompt
kwargs = {
    "model": "gpt-3.5-turbo-instruct",
    "suffix": None,  # optional text appended after the completion
    "max_tokens": 5,
    "n": 1,
    "best_of": None,
    "stop_sequences": ["stop"],
    "seed": None,
    "frequency_penalty": None,
    "presence_penalty": None,
    "logit_bias": None,
    "logprobs": None,
    "top_logprobs": None,
    "temperature": 0.5,
    "top_p": 0.5,
    "user": None
}
responses = oai.gpt_complete(prompt, **kwargs)
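The legacy completions endpoint differs from chat: the text comes back directly under `choices[n]["text"]`. A sketch, again assuming the connector passes OpenAI's payload through:

```python
# Sample payload in OpenAI's documented (legacy) completion shape;
# whether oai.gpt_complete returns it verbatim is an assumption.
sample_response = {"choices": [{"index": 0, "text": " sunny"}]}

def openai_completion_text(response: dict, n: int = 0) -> str:
    """Return the text of the n-th completion choice."""
    return response["choices"][n]["text"]

print(openai_completion_text(sample_response))
```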
Gemini
Import:
from symposium.connectors import gemini_rest as gem
Messages
messages = [
    {"role": "user", "parts": [{"text": "Hi, how are you?"}]}  # example message
]
kwargs = {
    "model": "gemini-1.0-pro",
    "stop_sequences": ["STOP", "Title"],
    "temperature": 0.5,
    "max_tokens": 5,
    "n": 1,
    "top_p": 0.9,
    "top_k": None
}
response = gem.gemini_content(messages, **kwargs)
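Gemini's REST `generateContent` response nests the reply under `candidates[0]["content"]["parts"]`. A minimal sketch of reading it, assuming (unconfirmed) that the connector returns the raw payload:

```python
# Sample payload in Gemini's documented generateContent response shape;
# whether gem.gemini_content returns it unchanged is an assumption.
sample_response = {
    "candidates": [
        {"content": {"role": "model", "parts": [{"text": "Fine, thanks."}]}}
    ]
}

def gemini_text(response: dict) -> str:
    """Join the text parts of the first Gemini candidate."""
    parts = response["candidates"][0]["content"]["parts"]
    return "".join(part["text"] for part in parts)

print(gemini_text(sample_response))
```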
PaLM
Import:
from symposium.connectors import palm_rest as path
Completion
prompt = "Hi, how are you?"  # example prompt
kwargs = {
    "model": "text-bison-001",
    "temperature": 0.5,
    "n": 1,
    "max_tokens": 10,
    "top_p": 0.5,
    "top_k": None
}
responses = path.palm_complete(prompt, **kwargs)
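In PaLM's text REST API the generated text is returned under `candidates[n]["output"]`. A sketch of reading it, assuming (unconfirmed) the connector returns that raw payload:

```python
# Sample payload in the PaLM text API's documented response shape;
# whether path.palm_complete returns it verbatim is an assumption.
sample_response = {"candidates": [{"output": "I'm doing well."}]}

def palm_text(response: dict, n: int = 0) -> str:
    """Return the output of the n-th PaLM text candidate."""
    return response["candidates"][n]["output"]

print(palm_text(sample_response))
```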
Messages
context = "Answer concisely."  # example context
examples = []
messages = [
    {"content": "Hi, how are you?"}  # example message
]
kwargs = {
    "model": "chat-bison-001",
    "context": context,
    "examples": examples,
    "temperature": 0.5,
    # no 'max_tokens'; beware the effects of that!
    "n": 1,
    "top_p": 0.5,
    "top_k": None
}
responses = path.palm_content(messages, **kwargs)
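The `examples` list supplies few-shot pairs. In PaLM's chat API each entry pairs a hypothetical user turn (`input`) with the desired model reply (`output`); whether symposium passes this structure through unchanged is an assumption. A sketch:

```python
# Few-shot examples in the PaLM chat API's documented structure;
# the contents here are illustrative placeholders.
examples = [
    {
        "input": {"content": "Roses are red."},
        "output": {"content": "Violets are blue."},
    }
]

# Each entry must carry both an 'input' and an 'output' message.
for example in examples:
    print(example["input"]["content"], "->", example["output"]["content"])
```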