# hjimi_openai

A demo package for OpenAI integration with conversation management and multiple output formats.

An AI conversation management tool based on LangChain, supporting multi-session management, streaming output, and automatic saving and backup. Focused on providing a flexible and powerful conversation management solution.
## Key Features

- **Multi-Session Management**
  - Manage multiple independent conversation sessions
  - Automatic session saving and recovery
  - Flexible session configuration system
  - Session history management
- **Multiple Output Formats**
  - Markdown output
  - JSON output
  - Plain text (TXT) output
  - HTML output
- **Automation**
  - Automatic conversation history saving
  - Periodic backups
  - Automatic history length management
- **Input/Output Processing**
  - Streaming output
  - Batch loading of questions from JSON or TXT files
  - Flexible output formatting
- **System Features**
  - Complete logging system
  - Error handling and recovery
  - Environment variable configuration
  - Session state management
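The package's internals are not shown here, but the interplay of interval-based auto-saving and history-length trimming can be illustrated with a minimal standalone sketch. The `History` class and its fields below are illustrative stand-ins, not part of the hjimi_openai API:

```python
# Minimal sketch of interval-based auto-save and history trimming.
# NOTE: `History` is a hypothetical stand-in, not a hjimi_openai class.

class History:
    def __init__(self, save_interval=3, max_history_length=5):
        self.save_interval = save_interval            # save every N turns
        self.max_history_length = max_history_length  # keep at most N turns
        self.turns = []
        self.total_turns = 0
        self.saves = 0

    def add_turn(self, question, answer):
        self.turns.append((question, answer))
        self.total_turns += 1
        # Trim the oldest turns once the cap is exceeded
        if len(self.turns) > self.max_history_length:
            self.turns = self.turns[-self.max_history_length:]
        # Persist periodically rather than on every turn
        if self.total_turns % self.save_interval == 0:
            self.saves += 1  # a real implementation would write to disk

h = History(save_interval=2, max_history_length=3)
for i in range(5):
    h.add_turn(f"q{i}", f"a{i}")
print(len(h.turns), h.saves)  # -> 3 2
```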
Installation Requirements
- Python 3.8 or higher
- Required dependencies:
- langchain-openai>=0.0.3
- langchain-core>=0.1.4
- langchain-community>=0.0.6
- langchain>=0.1.0
## Quick Start

1. Install the package:

```bash
pip install hjimi-openai
```

2. Set environment variables:

```bash
# Set the API key
export DASHSCOPE_API_KEY="your_api_key_here"
```
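If exporting the variable in the shell is inconvenient, the key can also be set from Python before the manager is created. This uses only the standard `os` module; `DASHSCOPE_API_KEY` is the variable name used in the shell example above:

```python
import os

# Equivalent to `export DASHSCOPE_API_KEY=...`, but only for this process
os.environ["DASHSCOPE_API_KEY"] = "your_api_key_here"

print(os.environ.get("DASHSCOPE_API_KEY"))
```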
3. Basic usage:

```python
from hjimi_openai import AIConversationManager, ConversationConfig

# Create configuration
config = ConversationConfig(
    output_dir="output",
    model_name="qwen-plus",
    temperature=0.7,
)

# Initialize manager
manager = AIConversationManager(config)

# Process a single question
response = manager.process_conversation("What is artificial intelligence?")
print(response)
```
## Configuration Options

### Basic Configuration

- `model_name`: Model name to use (default: "qwen-plus")
- `temperature`: Temperature parameter (default: 0.0)
- `max_tokens`: Maximum tokens (default: 1024)
- `streaming`: Enable streaming output (default: True)
- `output_dir`: Output directory (default: "output")
- `output_format`: Output format (supports markdown/json/txt/html)
### Advanced Configuration

- `system_prompt`: System prompt
- `api_base`: API base URL
- `api_key_env`: Name of the environment variable holding the API key
- `save_interval`: Auto-save interval
- `max_history_length`: Maximum history length
- `backup_enabled`: Enable backups
- `session_id_prefix`: Session ID prefix
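How the package consumes `api_key_env` is not documented here, but the usual pattern is to resolve the key at startup from the named environment variable. A minimal sketch of that pattern; the function `resolve_api_key` is illustrative, not part of the package:

```python
import os

def resolve_api_key(api_key_env="DASHSCOPE_API_KEY"):
    """Read the API key from the environment variable named by `api_key_env`."""
    key = os.environ.get(api_key_env)
    if not key:
        raise RuntimeError(f"Environment variable {api_key_env} is not set")
    return key

# Point the lookup at a differently named variable, e.g. for another provider
os.environ["MY_PROVIDER_KEY"] = "sk-demo"
print(resolve_api_key("MY_PROVIDER_KEY"))  # -> sk-demo
```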
## Advanced Usage

### Multi-Session Management

```python
# Create multiple sessions, each with its own list of questions
sessions = {
    "ai_basics": [
        "What is machine learning?",
        "What's the difference between deep learning and machine learning?",
    ],
    "python_tips": [
        "What are Python decorators?",
        "How to use generators to improve performance?",
    ],
}

manager.process_questions(sessions)
### Loading Questions from a File

```python
import json

# JSON file format example
questions_json = {
    "session1": {
        "title": "AI Basics",
        "questions": [
            "What are neural networks?",
            "What is backpropagation?",
        ],
    }
}

# Write the questions to a file, then load them by path
with open("questions.json", "w", encoding="utf-8") as f:
    json.dump(questions_json, f, ensure_ascii=False, indent=2)

manager.process_questions("questions.json")
```
### Custom Output Format

```python
from hjimi_openai import ConversationConfig, OutputFormat

config = ConversationConfig(
    output_format=OutputFormat.MARKDOWN,
    markdown_template="""
# {title}

## Q&A Records

{content}
""",
)

manager = AIConversationManager(config)
```
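The template above presumably uses Python `str.format`-style placeholders; how the package actually renders it is an assumption, but the substitution of `{title}` and `{content}` can be checked in isolation:

```python
# Illustrative only: fill the markdown template the way str.format would
markdown_template = "# {title}\n\n## Q&A Records\n\n{content}\n"

rendered = markdown_template.format(
    title="AI Basics",
    content="**Q:** What are neural networks?\n**A:** ...",
)
print(rendered)
```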
### Basic Conversation Example

```python
# Create a manager instance with default configuration
manager = AIConversationManager()

# Simple conversation loop
questions = [
    "What's the difference between lists and tuples in Python?",
    "How to handle exceptions in Python?",
]

for question in questions:
    response = manager.process_conversation(question)
    print(f"Question: {question}")
    print(f"Answer: {response}\n")
```
### Batch Processing Example

```python
# Prepare batch questions: one entry per session, each with a title
batch_questions = {
    "programming": {
        "title": "Programming Basics",
        "questions": [
            "What is object-oriented programming?",
            "What are design patterns?",
            "What is functional programming?",
        ],
    },
    "database": {
        "title": "Database Basics",
        "questions": [
            "What is database indexing?",
            "What's the difference between SQL and NoSQL?",
        ],
    },
}

# Process all sessions in one call
manager.process_questions(batch_questions)
```
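Independent of the package, the batch structure above is easy to validate or flatten before submitting it. A small helper illustrating the shape of the data; the name `flatten_sessions` is illustrative, not part of the API:

```python
def flatten_sessions(batch):
    """Flatten {session_id: {"title": ..., "questions": [...]}} into
    (session_id, title, question) tuples, preserving order."""
    rows = []
    for session_id, session in batch.items():
        for question in session["questions"]:
            rows.append((session_id, session["title"], question))
    return rows

batch = {
    "programming": {"title": "Programming Basics",
                    "questions": ["What is OOP?", "What are design patterns?"]},
    "database": {"title": "Database Basics",
                 "questions": ["What is indexing?"]},
}
rows = flatten_sessions(batch)
print(len(rows))  # -> 3
```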
### Custom Configuration Example

```python
config = ConversationConfig(
    model_name="qwen-plus",
    temperature=0.7,
    max_tokens=2048,
    streaming=True,
    output_dir="custom_output",
    output_format=OutputFormat.MARKDOWN,
    system_prompt="You are a professional programming assistant. Please answer questions concisely.",
    save_interval=3,
    max_history_length=100,
    backup_enabled=True,
)

manager = AIConversationManager(config)
```
## Version History

### 0.1.0 (Current)

- Initial release
- Multi-session management
- Multiple output formats
- Streaming output
- Automatic saving and backup
- File-based question loading
## Contributing

Issues and pull requests are welcome to help improve the project.

## License

This project is licensed under the MIT License. See the LICENSE file for details.

## Author

wenquanshan (wenquanshan@sximi.com)

## Contact

- GitHub: https://github.com/zidanewenqsh/openai_demo
- Email: wenquanshan@sximi.com
- Bug Reports: https://github.com/zidanewenqsh/openai_demo/issues
## File details

### hjimi_openai-0.1.0.tar.gz (Source Distribution)

- Size: 11.5 kB
- Uploaded via: twine/6.1.0 CPython/3.9.21
- Uploaded using Trusted Publishing: No

| Algorithm | Hash digest |
|---|---|
| SHA256 | b8b51e4f0d2d89be1e8ad05ed73dbbcde063a871c186f49885c7af410fb9574a |
| MD5 | d9848682c57ee3e9b57613652cf85da3 |
| BLAKE2b-256 | 84e6dc972932f59da929bc6e4cd70c02f82855462cc6d0b4e67083bcf0d7e315 |
### hjimi_openai-0.1.0-py3-none-any.whl (Built Distribution)

- Size: 9.8 kB
- Tags: Python 3
- Uploaded via: twine/6.1.0 CPython/3.9.21
- Uploaded using Trusted Publishing: No

| Algorithm | Hash digest |
|---|---|
| SHA256 | 96d275666e45a74580b5e82dbe561fdfe92e9639a6921c21093833edb539c492 |
| MD5 | f24088f0cf0ab753ef289fbbc65a4ddd |
| BLAKE2b-256 | 21ec97bdfb1dd4e619aa9a4931a1c8c7098ef32d415fac0f84f07a0af234e837 |