TasksPromptsChain
A mini Python library for creating and executing chains of prompts using OpenAI's API, with streaming support and output template formatting.
Features
- Sequential prompt chain execution
- Streaming responses
- Template-based output formatting
- System prompt support
- Placeholder replacement between prompts
- Multiple output formats (JSON, Markdown, CSV, Text)
- Async/await support
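The feature list above mentions placeholder replacement between prompts. The library's internals are not shown on this page, but the idea can be sketched in plain Python; `run_llm` and `execute_chain_sketch` below are illustrative stand-ins, not the library's actual API:

```python
import re

def run_llm(prompt: str) -> str:
    # Stand-in for a real model call; echoes the prompt for demonstration.
    return f"<response to: {prompt}>"

def execute_chain_sketch(prompts: list) -> dict:
    """Run prompts in order, substituting {{placeholder}} with earlier results."""
    results = {}

    def substitute(text: str) -> str:
        # Replace each {{name}} with the stored result for that placeholder.
        return re.sub(r"\{\{(\w+)\}\}",
                      lambda m: results.get(m.group(1), m.group(0)), text)

    for spec in prompts:
        results[spec["output_placeholder"]] = run_llm(substitute(spec["prompt"]))
    return results

results = execute_chain_sketch([
    {"prompt": "Name a color", "output_placeholder": "color"},
    {"prompt": "Describe {{color}}", "output_placeholder": "description"},
])
```

Each prompt sees the outputs of earlier prompts through its `{{...}}` placeholders, which is what lets a chain build on its own intermediate results.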
Dependencies
Please install the typing-extensions and openai Python packages:
pip install typing-extensions openai
Installation from source code
For users installing from the GitHub source repository:
pip install -r requirements/requirements.txt
For developers working from the GitHub source repository:
pip install -r requirements/requirements.txt
pip install -r requirements/requirements-dev.txt
Quick Start
import asyncio

from tasks_prompts_chain import TasksPromptsChain

async def main():
    # Initialize the chain
    chain = TasksPromptsChain(
        model="gpt-3.5-turbo",
        api_key="your-api-key",
        final_result_placeholder="design_result"
    )

    # Define your prompts
    prompts = [
        {
            "prompt": "Create a design concept for a luxury chocolate bar",
            "output_format": "TEXT",
            "output_placeholder": "design_concept"
        },
        {
            "prompt": "Based on this concept: {{design_concept}}, suggest a color palette",
            "output_format": "JSON",
            "output_placeholder": "color_palette"
        }
    ]

    # Stream the responses
    async for chunk in chain.execute_chain(prompts):
        print(chunk, end="", flush=True)

    # Get specific results
    design = chain.get_result("design_concept")
    colors = chain.get_result("color_palette")

asyncio.run(main())
Advanced Usage
Using Templates
# Set output template before execution
chain.template_output("""
<result>
<design>
### Design Concept:
{{design_concept}}
</design>
<colors>
### Color Palette:
{{color_palette}}
</colors>
</result>
""")
Using System Prompts
chain = TasksPromptsChain(
    model="gpt-3.5-turbo",
    api_key="your-api-key",
    final_result_placeholder="result",
    system_prompt="You are a professional design expert specialized in luxury products",
    system_apply_to_all_prompts=True
)
Custom API Endpoint
chain = TasksPromptsChain(
    model="gpt-3.5-turbo",
    api_key="your-api-key",
    final_result_placeholder="result",
    base_url="https://your-custom-endpoint.com/v1"
)
API Reference
TasksPromptsChain Class
Constructor Parameters
- model (str): The model identifier (e.g., 'gpt-3.5-turbo')
- api_key (str): Your OpenAI API key
- final_result_placeholder (str): Name for the final result placeholder
- system_prompt (Optional[str]): System prompt for context
- system_apply_to_all_prompts (Optional[bool]): Apply the system prompt to all prompts
- base_url (Optional[str]): Custom API endpoint URL
Methods
- execute_chain(prompts: List[Dict], temperature: float = 0.7) -> AsyncGenerator[str, None]: Executes the prompt chain and streams responses
- template_output(template: str) -> None: Sets the output template format
- get_result(placeholder: str) -> Optional[str]: Retrieves a specific result by placeholder
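Since execute_chain returns an async generator, it has to be consumed with `async for` inside a coroutine. A minimal sketch of that consumption pattern, using a stub generator (`fake_chain`) in place of a real chain:

```python
import asyncio
from typing import AsyncGenerator

async def fake_chain() -> AsyncGenerator[str, None]:
    # Stub standing in for chain.execute_chain(prompts); yields text chunks.
    for chunk in ["Hello, ", "world", "!"]:
        yield chunk

async def main() -> str:
    collected = []
    async for chunk in fake_chain():
        print(chunk, end="", flush=True)  # stream to the terminal as chunks arrive
        collected.append(chunk)
    return "".join(collected)

full_text = asyncio.run(main())
```

Collecting the chunks alongside printing them, as above, gives you both live streaming output and the complete response at the end.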
Prompt Format
Each prompt in the chain can be defined as a dictionary:
{
    "prompt": str,              # The actual prompt text
    "output_format": str,       # "JSON", "MARKDOWN", "CSV", or "TEXT"
    "output_placeholder": str   # Identifier for accessing this result
}
Error Handling
The library includes comprehensive error handling:
- Template validation
- API error handling
- Placeholder validation
Errors are raised with descriptive messages indicating the specific issue and prompt number where the error occurred.
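As an illustration of the kind of validation described above, here is a small checker for the prompt-dictionary shape that reports the prompt number in its message; `validate_prompt` is a hypothetical helper, not part of the library:

```python
ALLOWED_FORMATS = {"JSON", "MARKDOWN", "CSV", "TEXT"}

def validate_prompt(spec: dict, index: int) -> None:
    """Raise ValueError naming the prompt number if a spec is malformed."""
    for key in ("prompt", "output_format", "output_placeholder"):
        if key not in spec:
            raise ValueError(f"Prompt #{index}: missing required key '{key}'")
    if spec["output_format"] not in ALLOWED_FORMATS:
        raise ValueError(
            f"Prompt #{index}: output_format must be one of {sorted(ALLOWED_FORMATS)}"
        )

# A well-formed spec passes silently:
validate_prompt(
    {"prompt": "Hi", "output_format": "TEXT", "output_placeholder": "greeting"},
    index=1,
)
```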
Best Practices
- Always set templates before executing the chain
- Use meaningful placeholder names
- Handle streaming responses appropriately
- Consider temperature settings based on your use case
- Use system prompts for consistent context
License
MIT License
File details
Details for the file tasks_prompts_chain-0.0.2.tar.gz.
File metadata
- Download URL: tasks_prompts_chain-0.0.2.tar.gz
- Upload date:
- Size: 10.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.8
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 11260c7b5c0e36ab4b7c93c2dacc1b1f64a717f3f714392ed3ac8bced3c4a0bb |
| MD5 | 8ecfc27f2c510f1090f1232674ab6552 |
| BLAKE2b-256 | e81a3f0ecd04aac86f81ab2594e9ad7d51288c375e786749304f190f52faf593 |
Provenance
The following attestation bundles were made for tasks_prompts_chain-0.0.2.tar.gz:
Publisher: publish.yml on smirfolio/tasks_prompts_chain
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: tasks_prompts_chain-0.0.2.tar.gz
- Subject digest: 11260c7b5c0e36ab4b7c93c2dacc1b1f64a717f3f714392ed3ac8bced3c4a0bb
- Sigstore transparency entry: 170820235
- Sigstore integration time:
- Permalink: smirfolio/tasks_prompts_chain@5123f66263fe698f44474b7b6f85fbb46a517283
- Branch / Tag: refs/tags/0.0.2
- Owner: https://github.com/smirfolio
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@5123f66263fe698f44474b7b6f85fbb46a517283
- Trigger Event: release
File details
Details for the file tasks_prompts_chain-0.0.2-py3-none-any.whl.
File metadata
- Download URL: tasks_prompts_chain-0.0.2-py3-none-any.whl
- Upload date:
- Size: 10.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.8
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 3bda7a7f9bb9c1ac978421baa84a7591113a61279d7b32b15fd4af607d975147 |
| MD5 | 5ba2be47d8892c24bb9f1c5a36598539 |
| BLAKE2b-256 | 6edf15fc423f5bfb013e8d884120df28cb63986564e561e65d0d7c90798800e8 |
Provenance
The following attestation bundles were made for tasks_prompts_chain-0.0.2-py3-none-any.whl:
Publisher: publish.yml on smirfolio/tasks_prompts_chain
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: tasks_prompts_chain-0.0.2-py3-none-any.whl
- Subject digest: 3bda7a7f9bb9c1ac978421baa84a7591113a61279d7b32b15fd4af607d975147
- Sigstore transparency entry: 170820236
- Sigstore integration time:
- Permalink: smirfolio/tasks_prompts_chain@5123f66263fe698f44474b7b6f85fbb46a517283
- Branch / Tag: refs/tags/0.0.2
- Owner: https://github.com/smirfolio
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@5123f66263fe698f44474b7b6f85fbb46a517283
- Trigger Event: release