LlamaChat SDK
A Python SDK for interacting with Llama language models through a chat interface. This SDK provides a simple way to integrate Llama's powerful language models into your applications, with support for function calling and various model sizes.
Features
- 🤖 Easy integration with Llama language models
- 🔄 Support for multiple model sizes (1B, 3B, 8B, and 70B parameters)
- 🛠️ Function registration and calling capabilities
- 🐞 Debug mode for troubleshooting
- 💬 Conversation history management
- 🔌 Simple API interface
Installation
pip install llama-chat-sdk
Quick Start
from run import LlamaChat, LlamaModel

# Initialize the chat client
chat = LlamaChat(
    api_key="your_api_key_here",
    model=LlamaModel.LLAMA_8B,
    debug=False
)

# Send a message and get a response
response = chat.chat("Hello, how are you?")
print(response)
Available Models
The SDK supports the following Llama models:
- LLAMA_1B: 1 billion parameter model
- LLAMA_3B: 3 billion parameter model
- LLAMA_8B: 8 billion parameter model (default)
- LLAMA_70B: 70 billion parameter model
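For example, to trade some response quality for speed and cost, you might select the smallest model. This is an illustrative snippet that uses only the constructor shown in Quick Start:

from run import LlamaChat, LlamaModel

# Smaller models respond faster and cost less; larger models are more capable
chat = LlamaChat(
    api_key="your_api_key_here",
    model=LlamaModel.LLAMA_1B
)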
Function Registration
You can register custom functions that the AI can call:
def get_weather(city: str) -> str:
    return f"The weather in {city} is sunny!"

# Register a single function
chat.register_function(
    func=get_weather,
    description="Get the current weather for a given city"
)

# Register multiple functions
functions = [
    {
        "function": get_weather,
        "description": "Get the current weather for a given city"
    },
    # Add more functions as needed
]
chat.register_functions(functions)
Advanced Usage
Debug Mode
Enable debug mode to log API responses:
chat = LlamaChat(
    api_key="your_api_key_here",
    model=LlamaModel.LLAMA_8B,
    debug=True
)
Custom Model Selection
Use a custom model string if needed:
chat = LlamaChat(
    api_key="your_api_key_here",
    model="custom-model-identifier"
)
Function Calling Example
# Register a function
def calculate_sum(a: int, b: int) -> int:
    return a + b

chat.register_function(
    calculate_sum,
    "Calculate the sum of two numbers"
)

# The AI can now use this function
response = chat.chat("What is 5 plus 3?")
# The AI might respond with a function call to calculate_sum(5, 3)
API Reference
LlamaChat Class
class LlamaChat:
    def __init__(
        self,
        api_key: str,
        model: Union[LlamaModel, str] = LlamaModel.LLAMA_8B,
        debug: bool = False
    )
Parameters:
- api_key (str): Your API authentication key
- model (Union[LlamaModel, str]): The Llama model to use
- debug (bool): Enable debug logging
Methods:
- chat(message: str) -> str: Send a message and get a response
- register_function(func: Callable, description: str): Register a single function
- register_functions(functions: List[Dict[str, Any]]): Register multiple functions
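A minimal end-to-end sketch tying these methods together; the get_time helper below is hypothetical and used only for illustration:

from run import LlamaChat, LlamaModel

def get_time(timezone: str) -> str:
    # Hypothetical helper, shown only to illustrate function registration
    return f"It is currently 12:00 in {timezone}."

chat = LlamaChat(api_key="your_api_key_here", model=LlamaModel.LLAMA_8B)
chat.register_function(
    func=get_time,
    description="Get the current time in a given timezone"
)

# The model may call get_time when answering
print(chat.chat("What time is it in Tokyo?"))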
Error Handling
The SDK includes built-in error handling for:
- API connection issues
- Invalid function calls
- Response parsing errors
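The specific exception classes are not documented here, so a defensive pattern is to wrap calls in a broad try/except. A minimal sketch, assuming errors surface as ordinary Python exceptions:

try:
    response = chat.chat("Summarize today's weather report.")
    print(response)
except Exception as exc:  # the SDK's exact exception types are not documented
    # Covers connection failures, invalid function calls, and parsing errors
    print(f"LlamaChat request failed: {exc}")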
Development
The source code is available in the run.py file. To contribute:
- Fork the repository
- Create a feature branch
- Submit a pull request
License
MIT License
Support
For support, please open an issue in the GitHub repository or contact the maintainers.
File details
Details for the file llama_chat_sdk-1.0.0.tar.gz.
File metadata
- Download URL: llama_chat_sdk-1.0.0.tar.gz
- Upload date:
- Size: 2.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | 5f69d47deb154891271efed7b06eef12de1233f74aeb86a2e5f53a5fcbb30604
MD5 | e0c749cbd7664abc79950dab483d2524
BLAKE2b-256 | bf86b67f7fd1e8d7e76c7da1f502035e64ec3cbdc9f6afa7ca04426ca3b785ca
File details
Details for the file llama_chat_sdk-1.0.0-py3-none-any.whl.
File metadata
- Download URL: llama_chat_sdk-1.0.0-py3-none-any.whl
- Upload date:
- Size: 2.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | fe9bb90d54aa2eec87e9a22dedf572d37177eea225b065d5b078382f1a382fe8
MD5 | 81e6d480b59e150e06cc723907348343
BLAKE2b-256 | 98e2e6c08002ce809159c3f142938bf6a8a9d1906bae3bcf756c4df8e6e7e50d