# TinyChat: A lightweight platform for realistic social simulations

## Introduction
TinyChat is a lightweight, user-friendly, and highly extensible multi-agent simulation framework designed for social simulation research.
## Core Features
- 🤖 **Multi-Agent Interactions**: support for multiple AI agents with distinct personalities and goals
- 🌍 **Flexible Environments**: configurable conversation scenarios and relationship dynamics
- 📊 **Built-in and Plugin Evaluation**: multi-dimensional conversation evaluation using LLM-based and rule-based methods, plus extensible evaluator plugins for custom assessment strategies
- 🚀 **Server Architecture**: high-performance server with an HTTP API and configuration management
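As a mental model for the multi-agent interaction loop, here is a conceptual sketch in plain Python. This is not TinyChat's internal implementation; the agent and turn-taking names are illustrative only.

```python
from dataclasses import dataclass


@dataclass
class ScriptedAgent:
    """Toy agent that replays canned lines; stands in for an LLM-backed agent."""
    name: str
    lines: list[str]

    def act(self, history: list[str]) -> str:
        # A real agent would condition on the scenario, its goal, and the history.
        return self.lines[len(history) // 2 % len(self.lines)]


def run_round_robin(agents: list[ScriptedAgent], max_turns: int) -> list[str]:
    """Alternate turns between agents until the turn budget is spent."""
    history: list[str] = []
    for turn in range(max_turns):
        speaker = agents[turn % len(agents)]
        history.append(f"{speaker.name}: {speaker.act(history)}")
    return history


alice = ScriptedAgent("Alice", ["Want to join my project?", "It ships next month."])
bob = ScriptedAgent("Bob", ["Tell me more.", "Sounds interesting."])
for line in run_round_robin([alice, bob], max_turns=4):
    print(line)
```

In the real framework, each agent's `act` step would be an LLM call conditioned on its goal and the conversation so far, and the environment would decide who speaks next.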
## Installation

### Install from source
```bash
# Clone the repository
git clone git@github.com:ulab-uiuc/tiny-chat.git
cd tiny-chat

# Create a conda environment
conda create -n tiny-chat python=3.10
conda activate tiny-chat

# Install Poetry
curl -sSL https://install.python-poetry.org | python3 -
export PATH="$HOME/.local/bin:$PATH"

# Install dependencies
poetry install
```
## Get Started
Before running any code, set your API key. You can also manage API keys in the YAML configuration file (see below).

```bash
export OPENAI_API_KEY=your-key-here
# or use DEEPSEEK_API_KEY, ANTHROPIC_API_KEY, or OPENROUTER_API_KEY
```
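Before starting a run, it can help to confirm that at least one provider key is actually visible to the Python process. A small stand-alone check (the key names below are the ones mentioned in this README; this helper is illustrative, not part of TinyChat):

```python
import os

# Provider keys mentioned in this README.
SUPPORTED_KEYS = [
    "OPENAI_API_KEY",
    "DEEPSEEK_API_KEY",
    "ANTHROPIC_API_KEY",
    "OPENROUTER_API_KEY",
]


def available_providers() -> list[str]:
    """Return the provider key names currently set in the environment."""
    return [key for key in SUPPORTED_KEYS if os.environ.get(key)]


if __name__ == "__main__":
    found = available_providers()
    print("Found keys:", ", ".join(found) or "none")
```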
### Basic Usage
```python
import asyncio

from tiny_chat.server import TinyChatServer, ServerConfig, ModelProviderConfig


async def basic_conversation():
    config = ServerConfig(
        models={
            'gpt-4o-mini': ModelProviderConfig(
                name='gpt-4o-mini',
                type='openai',
                temperature=0.7,
            )
        },
        default_model='gpt-4o-mini',
    )
    async with TinyChatServer(config) as server:
        episode_log = await server.run_conversation(
            agent_configs=[
                {"name": "Alice", "type": "llm", "goal": "Convince Bob to join the project"},
                {"name": "Bob", "type": "llm", "goal": "Learn more about the project"},
            ],
            scenario="Two colleagues discussing a new project collaboration",
            max_turns=10,
        )
        print(f"Conversation completed: {episode_log.reasoning}")


asyncio.run(basic_conversation())
```
### Configuration in YAML

Create `config/tiny_chat.yaml`:
```yaml
models:
  gpt-4o-mini:
    name: gpt-4o-mini
    type: openai
    temperature: 0.7

default_model: gpt-4o-mini
max_turns: 20

evaluators:
  - type: rule_based
    enabled: true
    config:
      max_turn_number: 20
  - type: llm
    enabled: true
    model: gpt-4o-mini

api:
  host: 0.0.0.0
  port: 8000
```
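One property worth checking in a config like this is that `default_model` actually names an entry under `models`. A minimal stand-alone sketch of such a check, using plain dictionaries that mirror the YAML above (this is illustrative and not TinyChat's own config loader):

```python
# A plain-dict mirror of the YAML config above, used only to illustrate
# a simple consistency check; not TinyChat's own loading code.
config = {
    "models": {
        "gpt-4o-mini": {"name": "gpt-4o-mini", "type": "openai", "temperature": 0.7},
    },
    "default_model": "gpt-4o-mini",
    "max_turns": 20,
}


def validate_config(cfg: dict) -> list[str]:
    """Return a list of human-readable problems found in the config dict."""
    problems = []
    if cfg.get("default_model") not in cfg.get("models", {}):
        problems.append(
            f"default_model {cfg.get('default_model')!r} is not defined under models"
        )
    for name, model in cfg.get("models", {}).items():
        temp = model.get("temperature", 1.0)
        if not 0.0 <= temp <= 2.0:
            problems.append(f"model {name!r}: temperature {temp} outside [0, 2]")
    return problems


print(validate_config(config))  # an empty list means no problems found
```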
Start the server:
```bash
python -m tiny_chat.server.cli serve
```
## Examples
For detailed examples and use cases, check the examples/ directory:
- `examples/human_agent_chat_demo.py` - Human-in-the-loop conversations
- `examples/multi_agent_chat_demo.py` - Multi-agent scenarios
- `examples/multi_agent_chat_obs_control.py` - Observation control examples
## Advanced Usage

### CLI Tools
```bash
# Generate an environment profile
python -m tiny_chat.utils.cli env-profile "asking my boyfriend to stop being friends with his ex"

# Generate an agent action
python -m tiny_chat.utils.cli action --agent Alice --goal "Convince Bob" --history "Previous conversation..."

# Generate a script
python -m tiny_chat.utils.cli script --background-file background.json --agent-name Alice --agent-name Bob
```
### Plugin System
```python
from tiny_chat.server.plugins import PluginManager, EvaluatorPlugin

# `Evaluator` is the base evaluator interface and `CustomEvaluator` is your own
# implementation of it; both are assumed to be defined or imported elsewhere.


# Create a custom evaluator plugin
class CustomEvaluatorPlugin(EvaluatorPlugin):
    @property
    def plugin_type(self) -> str:
        return 'custom'

    def _create_evaluator(self) -> Evaluator:
        return CustomEvaluator()


# Register the plugin with the manager
manager = PluginManager()
manager.register_plugin('custom', CustomEvaluatorPlugin)
```
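To make the idea concrete, here is a self-contained sketch of the kind of logic a custom rule-based evaluator might implement. It is plain Python, independent of TinyChat's actual `Evaluator` interface, and all names are illustrative:

```python
from dataclasses import dataclass


@dataclass
class TurnLimitEvaluator:
    """Illustrative evaluator: flags conversations that exceed a turn budget.

    Mirrors the spirit of the rule-based evaluator configured earlier,
    but is a stand-alone sketch, not TinyChat's real Evaluator interface.
    """
    max_turn_number: int = 20

    def evaluate(self, turns: list[str]) -> dict:
        over_budget = len(turns) > self.max_turn_number
        return {
            "turn_count": len(turns),
            "within_budget": not over_budget,
            "score": 0.0 if over_budget else 1.0,
        }


evaluator = TurnLimitEvaluator(max_turn_number=3)
print(evaluator.evaluate(["hi", "hello", "bye"]))
```

A real evaluator would typically combine several such rules (and possibly LLM judgments) into the multi-dimensional scores described in Core Features.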