Flotorch Python
A modular Python framework for AI agents and LLM interactions with support for multiple AI frameworks.
Features
- Modular Design: Install only what you need
- SDK Core: Foundation for all AI interactions with LLM, Memory, Session, and Vector Store support
- Multi-Framework Support: Seamless integration with popular AI frameworks
- ADK: Google Agent Development Kit integration
- CrewAI: Full CrewAI framework support with agents, tasks, and memory
- AutoGen: Microsoft AutoGen integration with agents and memory
- LangChain: LangChain-compatible LLM, agents, memory, and sessions
- LangGraph: LangGraph agent support with checkpointing and memory stores
- Strands: Strands Agents framework integration
- Flexible Dependencies: Choose your installation level
- Built-in Logging: Configurable logging system with console and file providers
- Session Management: Persistent session state management
- Memory & Vector Stores: Long-term memory storage with vector search capabilities
Installation
Option 1: Install everything
# Install all modules and dependencies
pip install flotorch[all]
Option 2: Install specific modules only
# Install only SDK (core functionality)
pip install flotorch[sdk]
# Install SDK + ADK (Google Agent Development Kit) (Recommended)
pip install flotorch[adk]
# Install SDK + CrewAI (Recommended)
pip install flotorch[crewai]
# Install SDK + AutoGen
pip install flotorch[autogen]
# Install SDK + LangChain
pip install flotorch[langchain]
# Install SDK + LangGraph (includes LangChain)
pip install flotorch[langgraph]
# Install SDK + Strands
pip install flotorch[strands]
# Install multiple modules
pip install flotorch[adk,crewai,langchain]
Option 3: Development installation
# Install in development (editable) mode
pip install -e .
# Install with development tools
pip install -e .[dev]
Option 4: Beta/Pre-release installation
# Install latest beta version
pip install --pre flotorch
# Install specific beta version
pip install --pre flotorch==2.6.1b1
# Install beta with specific modules
pip install --pre flotorch[adk]
pip install --pre flotorch[crewai]
pip install --pre flotorch[autogen]
pip install --pre flotorch[langchain]
pip install --pre flotorch[langgraph]
pip install --pre flotorch[strands]
# Install specific beta version with modules
pip install --pre flotorch[adk]==2.6.1b1
pip install --pre flotorch[all]==2.6.1b1
Note: The --pre flag is required to install beta/pre-release versions. Without it, pip will only install stable releases.
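The reason pip skips these versions is that they carry a PEP 440 pre-release segment (such as b1 for beta). A minimal sketch of that suffix check, simplified and not a full PEP 440 parser:

```python
import re

def is_prerelease(version: str) -> bool:
    """Return True if the version string ends with a PEP 440
    pre-release segment (aN, bN, or rcN). Simplified check only."""
    return re.search(r"(?:a|b|rc)\d+$", version) is not None

print(is_prerelease("2.6.1b1"))  # True: beta, requires --pre
print(is_prerelease("2.6.1"))    # False: stable release
```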
Configuration
Environment Variables
Flotorch uses environment variables for configuration. You can set these in your .env file or as environment variables:
# Required for most operations
FLOTORCH_API_KEY=your-api-key-here
FLOTORCH_BASE_URL=https://api.flotorch.com
# Optional: Enable debug logging
FLOTORCH_DEBUG=true
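In Python code, these variables can be read with the standard library. The helper below is a hypothetical sketch (the variable names mirror those above, but the loader itself is not part of the Flotorch API):

```python
import os

def load_flotorch_config() -> dict:
    """Read Flotorch settings from the environment.

    Raises if the required API key is missing; the base URL and
    debug flag fall back to sensible defaults.
    """
    api_key = os.environ.get("FLOTORCH_API_KEY")
    if not api_key:
        raise RuntimeError("FLOTORCH_API_KEY is not set")
    return {
        "api_key": api_key,
        "base_url": os.environ.get("FLOTORCH_BASE_URL", "https://api.flotorch.com"),
        "debug": os.environ.get("FLOTORCH_DEBUG", "false").lower() == "true",
    }
```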
SDK Components
The core SDK provides the following components:
- FlotorchLLM: LLM client for making API calls to various language models
- FlotorchSession: Session management for maintaining conversation state
- FlotorchMemory: Long-term memory storage with vector search support
- FlotorchVectorStore: Vector store operations for semantic search
- Logging System: Configurable logging with console and file providers
Module Dependencies
SDK (Core) - Always included
- httpx>=0.24 - HTTP client
- pydantic>=1.10 - Data validation
ADK Module
- Requires: SDK dependencies
- Adds:
- google-adk>=1.5.0
- python-dotenv>=1.0.0
- opentelemetry-exporter-otlp-proto-grpc>=1.34.0
CrewAI Module
- Requires: SDK dependencies
- Adds:
- crewai==0.193.2
- crewai-tools==0.73.1
- crewai-tools[mcp]==0.73.1
- python-dotenv>=1.0.0
AutoGen Module
- Requires: SDK dependencies
- Adds:
- python-dotenv>=1.0.0
- autogen-core>=0.4.0
- autogen-agentchat>=0.4.0
- autogen-ext[openai]>=0.4.0
- openai>=1.0.0
- mcp>=1.2.0
LangChain Module
- Requires: SDK dependencies
- Adds:
- langchain>=0.3.27
- langchain-openai>=0.2.14
- langchain-experimental>=0.3.4
- langchain-community>=0.3.29
- langgraph>=0.6.6
- langchain-mcp-tools>=0.2.13
- langchain-mcp-adapters>=0.1.9
- langchain-mcp>=0.2.1
LangGraph Module
- Requires: SDK and LangChain dependencies
- Adds:
- langgraph>=0.6.6
- langgraph-checkpoint>=2.1.1
- langgraph-prebuilt>=0.6.4
- langgraph-sdk>=0.2.3
- langchain-core>=0.2.14
- langchain-community>=0.3.27
- langchain-openai>=0.3.30
- langchain-mcp-adapters>=0.1.9
- mcp>=1.13.1
Strands Module
- Requires: SDK dependencies
- Adds:
strands-agents==1.9.0
Development Tools
- build>=0.10.0 - Package building
- twine>=4.0.0 - PyPI upload
- pytest>=7.0.0 - Testing
- pytest-cov>=4.0.0 - Test coverage
- black>=23.0.0 - Code formatting
- flake8>=6.0.0 - Linting
- mypy>=1.0.0 - Type checking
Quick Start
SDK (Core)
from flotorch.sdk.llm import FlotorchLLM
from flotorch.sdk.session import FlotorchSession
from flotorch.sdk.memory import FlotorchMemory
# Initialize LLM
llm = FlotorchLLM(
model_id="gpt-4",
api_key="your-api-key",
base_url="https://api.flotorch.com"
)
# Use LLM
response = llm.invoke([{"role": "user", "content": "Hello!"}])
# Initialize Session
session = FlotorchSession(
api_key="your-api-key",
base_url="https://api.flotorch.com"
)
# Create a session
session_data = session.create(
app_name="my-app",
user_id="user-123"
)
# Initialize Memory
memory = FlotorchMemory(
api_key="your-api-key",
base_url="https://api.flotorch.com",
provider_name="my-provider"
)
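Calls such as llm.invoke go over the network and can fail transiently, so wrapping them in a small retry helper is a common pattern. The sketch below is generic; fake_invoke is a hypothetical stand-in for the real client call so the example is self-contained:

```python
import time

def with_retries(fn, attempts=3, backoff=0.5):
    """Call fn(); on exception, retry with exponential backoff."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(backoff * (2 ** i))

# Hypothetical stand-in for llm.invoke(...): fails once, then succeeds
calls = {"n": 0}
def fake_invoke():
    calls["n"] += 1
    if calls["n"] < 2:
        raise ConnectionError("transient failure")
    return "Hello!"

print(with_retries(fake_invoke))  # succeeds on the second attempt
```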
ADK Integration
from flotorch.adk.agent import FlotorchADKAgent
agent = FlotorchADKAgent(
agent_name="my-agent",
enable_memory=True,
base_url="https://api.flotorch.com",
api_key="your-api-key"
)
adk_agent = agent.get_agent()
CrewAI Integration
from flotorch.crewai.agent import FlotorchCrewAIAgent
agent_manager = FlotorchCrewAIAgent(
agent_name="my-crewai-agent",
base_url="https://api.flotorch.com",
api_key="your-api-key"
)
agent = agent_manager.get_agent()
task = agent_manager.get_task()
AutoGen Integration
from flotorch.autogen.agent import FlotorchAutogenAgent
agent = FlotorchAutogenAgent(
agent_name="my-autogen-agent",
base_url="https://api.flotorch.com",
api_key="your-api-key"
)
autogen_agent = agent.get_agent()
LangChain Integration
from flotorch.langchain.agent import FlotorchLangChainAgent
agent = FlotorchLangChainAgent(
agent_name="my-langchain-agent",
enable_memory=True,
base_url="https://api.flotorch.com",
api_key="your-api-key"
)
langchain_agent = agent.get_agent()
LangGraph Integration
from flotorch.langgraph.agent import FlotorchLangGraphAgent
agent = FlotorchLangGraphAgent(
agent_name="my-langgraph-agent",
base_url="https://api.flotorch.com",
api_key="your-api-key"
)
langgraph_agent = agent.get_agent()
Strands Integration
from flotorch.strands.llm import FlotorchStrandsModel
model = FlotorchStrandsModel(
model_id="gpt-4",
api_key="your-api-key",
base_url="https://api.flotorch.com"
)
Module Overview
SDK (Core)
The foundation of Flotorch, providing:
- FlotorchLLM: Unified LLM interface supporting multiple providers
- FlotorchSession: Persistent session state management
- FlotorchMemory: Long-term memory storage with metadata support
- FlotorchVectorStore: Vector store for semantic search and RAG applications
- Logging System: Structured logging with multiple providers
ADK Module
Google Agent Development Kit integration:
- FlotorchADKAgent: ADK-compatible agent with dynamic configuration
- FlotorchADKLLM: LLM wrapper for ADK framework
- FlotorchADKSession: Session service implementation
- FlotorchADKVectorMemoryService: Memory service with vector search
CrewAI Module
CrewAI framework integration:
- FlotorchCrewAIAgent: Agent and task management from configuration
- FlotorchCrewAILLM: LLM integration for CrewAI agents
- FlotorchCrewAISession: Short-term storage using Flotorch Sessions
- FlotorchMemoryStorage: Long-term memory storage for CrewAI
AutoGen Module
Microsoft AutoGen integration:
- FlotorchAutogenAgent: AutoGen agent with configuration management
- FlotorchAutogenLLM: Chat completion client for AutoGen
- FlotorchAutogenSession: Model context for conversation state
- FlotorchAutogenMemory: Memory integration with AutoGen framework
LangChain Module
LangChain framework integration:
- FlotorchLangChainAgent: LangChain agent with MCP tool support
- FlotorchLangChainLLM: BaseChatModel implementation
- FlotorchLangChainSession: BaseMemory implementation for sessions
- FlotorchLangChainMemory: BaseMemory implementation for long-term storage
LangGraph Module
LangGraph framework integration:
- FlotorchLangGraphAgent: LangGraph agent with checkpointing support
- FlotorchLanggraphSession: BaseCheckpointSaver implementation
- FlotorchStore: BaseStore implementation for memory operations
Strands Module
Strands Agents framework integration:
- FlotorchStrandsModel: Model class with stream and structured output support
Project Structure
flotorch/
├── __init__.py
├── sdk/ # Core SDK functionality
│ ├── __init__.py
│ ├── llm.py # FlotorchLLM - LLM client
│ ├── session.py # FlotorchSession - Session management
│ ├── memory.py # FlotorchMemory, FlotorchVectorStore
│ ├── logger/ # Logging system
│ │ ├── logger.py
│ │ ├── logger_provider.py
│ │ ├── console_logger_provider.py
│ │ ├── file_logger_provider.py
│ │ └── global_logger.py
│ └── utils/ # Shared utilities
│ ├── http_utils.py
│ ├── llm_utils.py
│ ├── memory_utils.py
│ ├── session_utils.py
│ └── validation_utils.py
├── adk/ # Google Agent Development Kit
│ ├── __init__.py
│ ├── agent.py # FlotorchADKAgent
│ ├── llm.py # FlotorchADKLLM
│ ├── memory.py # FlotorchADKVectorMemoryService
│ ├── sessions.py # FlotorchADKSession
│ └── utils/
├── crewai/ # CrewAI framework integration
│ ├── __init__.py
│ ├── agent.py # FlotorchCrewAIAgent
│ ├── llm.py # FlotorchCrewAILLM
│ ├── memory.py # FlotorchMemoryStorage
│ └── sessions.py # FlotorchCrewAISession
├── autogen/ # Microsoft AutoGen integration
│ ├── __init__.py
│ ├── agent.py # FlotorchAutogenAgent
│ ├── llm.py # FlotorchAutogenLLM
│ ├── memory.py # FlotorchAutogenMemory
│ └── sessions.py # FlotorchAutogenSession
├── langchain/ # LangChain framework integration
│ ├── __init__.py
│ ├── agent.py # FlotorchLangChainAgent
│ ├── llm.py # FlotorchLangChainLLM
│ ├── memory.py # FlotorchLangChainMemory
│ └── session.py # FlotorchLangChainSession
├── langgraph/ # LangGraph framework integration
│ ├── __init__.py
│ ├── agent.py # FlotorchLangGraphAgent
│ ├── memory.py # FlotorchStore
│ └── sessions.py # FlotorchLanggraphSession
└── strands/ # Strands Agents integration
├── __init__.py
├── agent.py
├── llm.py # FlotorchStrandsModel
├── memory.py
└── session.py
Development
Easy Building with Makefile
The easiest way to build and manage your package is using the provided Makefile:
Direct Version Specification (Recommended)
# Build with specific version
make build VERSION=2.6.1
make build-beta VERSION=2.6.1b1
make build-prod VERSION=2.6.1
# Test with specific version
make test VERSION=2.6.1
# Publish with specific version
make publish-test VERSION=2.6.1b1
make publish-prod VERSION=2.6.1
# Full workflow with specific version
make all VERSION=2.6.1b1
Interactive Commands (prompts for version)
# Interactive builds (if no VERSION specified)
make build # Prompts: "Enter version (e.g., 2.6.1):"
make build-beta # Prompts: "Enter beta version (e.g., 2.6.1b1):"
make build-prod # Prompts: "Enter production version (e.g., 2.6.1):"
# Interactive testing and publishing
make test # Prompts: "Enter version to test (e.g., 2.6.1):"
make publish-test # Prompts: "Enter version to publish (e.g., 2.6.1b1):"
make publish-prod # Prompts: "Enter version to publish (e.g., 2.6.1):"
Quick Commands (pre-defined versions)
# Quick development builds
make quick-build # Builds version 2.6.1
make quick-test # Tests version 2.6.1
make quick-beta # Builds version 2.6.1b1
# Quick publishing
make quick-publish-test # Publishes 2.6.1b1 to TestPyPI
make quick-publish # Publishes 2.6.1 to PyPI
Development Setup
# Set up development environment
make install # Install in development mode
make install-dev # Install with development dependencies
make dev-setup # Complete development setup and test
# Other useful commands
make help # Show all available commands
make clean # Clean build artifacts
Testing
# Test specific module installations
python -c "from flotorch.sdk.llm import FlotorchLLM; print('SDK works!')"
python -c "from flotorch.adk.agent import FlotorchADKAgent; print('ADK works!')"
python -c "from flotorch.crewai.agent import FlotorchCrewAIAgent; print('CrewAI works!')"
python -c "from flotorch.autogen.agent import FlotorchAutogenAgent; print('AutoGen works!')"
python -c "from flotorch.langchain.agent import FlotorchLangChainAgent; print('LangChain works!')"
python -c "from flotorch.langgraph.agent import FlotorchLangGraphAgent; print('LangGraph works!')"
python -c "from flotorch.strands.llm import FlotorchStrandsModel; print('Strands works!')"
Publishing
# Update version in pyproject.toml, then:
python -m build
twine upload dist/* # For PyPI
twine upload --repository testpypi dist/* # For TestPyPI
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests
- Submit a pull request
License
MIT License - see LICENSE file for details.
Support
- Issues: GitHub Issues
- Documentation: docs.flotorch.com
- Discussions: GitHub Discussions