Provides the underlying framework to extend LangChain and load configurations
language-model-common
A shared Python framework for building LLM-powered agent applications with LangChain and LangGraph. It provides reusable infrastructure for configuration management, protocol conversion, MCP tool integration, file management, and more.
Features
- Multi-source configuration loading — Read LLM model configs from local filesystem, AWS S3, or GitHub repositories with TTL-based caching and client-specific overrides
- LangGraph-to-OpenAI protocol conversion — Stream LangGraph agent output as OpenAI-compatible Server-Sent Events (SSE) for chat completion APIs
- MCP (Model Context Protocol) integration — Tool discovery with BM25 search ranking, OAuth 2.1/OIDC support, dynamic client registration, and MCP Apps UI rendering via ui:// resources
- Prompt template library — Load and manage prompt templates from organized directory structures with GitHub auto-download support
- File management abstraction — Unified interface for local and AWS S3 storage with factory-based backend selection
- Token and cost management — Token reduction for long conversations and usage metadata tracking via tiktoken
- Authentication and authorization — OAuth with PKCE, JWT token validation, OIDC provider discovery
- Image generation — Provider abstraction over OpenAI and AWS Bedrock image generators
- OCR extraction — AWS Textract integration behind a factory interface
- Dependency injection container — Pre-wired service registry using simple-container with singleton lifecycle management
Installation
pip install language-model-common
Requirements: Python >= 3.10
Quick Start
Reading Model Configurations
from languagemodelcommon.configs.config_reader import ConfigReader
config_reader = ConfigReader(
config_paths=["./configs"], # local path, s3:// URI, or github:// path
)
# Load base configs with optional client-specific overrides
models = await config_reader.read_model_configs_async(client_id="my-client")
Configuration sources are selected by path prefix:
- Local: ./configs or /absolute/path
- S3: s3://bucket-name/path
- GitHub: managed via GithubConfigRepoManager with automatic background refresh
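Client-specific overrides are layered on top of the base model configs. A minimal sketch of that merge, assuming configs are plain dicts keyed by model name (the library's actual config schema may differ):

```python
# Sketch of base-plus-override merging for model configs.
# The dict shape here is illustrative, not the library's schema.
def merge_model_configs(base: dict, client_overrides: dict) -> dict:
    """Overlay client-specific settings on top of the base model configs."""
    merged = {name: dict(cfg) for name, cfg in base.items()}
    for name, overrides in client_overrides.items():
        merged.setdefault(name, {}).update(overrides)
    return merged

base = {"gpt-4o": {"temperature": 0.2, "max_tokens": 1024}}
overrides = {"gpt-4o": {"temperature": 0.0}}
print(merge_model_configs(base, overrides))
# → {'gpt-4o': {'temperature': 0.0, 'max_tokens': 1024}}
```

Settings absent from the override keep their base values, so a client config only needs to list what it changes.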
Loading Prompts
from languagemodelcommon.configs.prompt_library import PromptLibraryManager
prompt_manager = PromptLibraryManager(config_paths=["./configs"])
prompt_text = prompt_manager.get_prompt("system-prompt")
Prompts are loaded from .md or .txt files in a prompts/ subdirectory of your config paths.
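The lookup amounts to mapping file stems to file contents. A self-contained sketch of that convention (PromptLibraryManager's real resolution rules may differ):

```python
# Minimal sketch of loading prompts from a prompts/ subdirectory,
# mirroring the .md/.txt convention described above.
import tempfile
from pathlib import Path

def load_prompts(config_path: str) -> dict[str, str]:
    """Map each .md/.txt file's stem under <config_path>/prompts to its text."""
    prompts = {}
    for pattern in ("*.md", "*.txt"):
        for path in Path(config_path, "prompts").glob(pattern):
            prompts[path.stem] = path.read_text(encoding="utf-8")
    return prompts

with tempfile.TemporaryDirectory() as tmp:
    prompt_dir = Path(tmp, "prompts")
    prompt_dir.mkdir()
    (prompt_dir / "system-prompt.md").write_text("You are a helpful assistant.")
    prompts = load_prompts(tmp)
    print(prompts["system-prompt"])  # → You are a helpful assistant.
```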
Streaming LangGraph Output as OpenAI SSE
from languagemodelcommon.converters import LangGraphToOpenAIConverter
converter = LangGraphToOpenAIConverter(
streaming_manager=streaming_manager,
token_reducer=token_reducer,
)
response = await converter.stream_response(
graph=my_langgraph,
messages=messages,
config=config,
)
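On the wire, each streamed delta is framed as an OpenAI-style chat.completion.chunk SSE event. A hedged sketch of that framing (field names follow the OpenAI streaming format; the converter's internals are not shown, and the id/model values are placeholders):

```python
# Sketch of wrapping text deltas as OpenAI-compatible SSE events.
import json

def to_sse_chunk(delta_text: str, model: str = "my-model",
                 chunk_id: str = "chatcmpl-demo") -> str:
    """Frame one text delta as a chat.completion.chunk SSE event."""
    chunk = {
        "id": chunk_id,
        "object": "chat.completion.chunk",
        "model": model,
        "choices": [{"index": 0,
                     "delta": {"content": delta_text},
                     "finish_reason": None}],
    }
    return f"data: {json.dumps(chunk)}\n\n"

# A stream ends with the literal [DONE] sentinel.
events = [to_sse_chunk(t) for t in ("Hel", "lo")] + ["data: [DONE]\n\n"]
print("".join(events))
```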
File Management
from languagemodelcommon.file_managers import FileManagerFactory
factory = FileManagerFactory(aws_client_factory=aws_factory)
# Automatically selects local or S3 backend based on the folder path
manager = factory.create("s3://my-bucket")
await manager.save_file_async(file_data, "s3://my-bucket", "output.json", "application/json")
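The selection logic boils down to dispatching on the path prefix. A sketch of that pattern, with illustrative class names rather than the library's own:

```python
# Sketch of prefix-based backend selection: s3:// paths get the S3
# backend, everything else gets the local one. Names are illustrative.
class LocalFileManager:
    backend = "local"

class S3FileManager:
    backend = "s3"

def create_file_manager(folder: str):
    """Pick a file-manager backend from the folder path's prefix."""
    return S3FileManager() if folder.startswith("s3://") else LocalFileManager()

print(create_file_manager("s3://my-bucket").backend)  # → s3
print(create_file_manager("./outputs").backend)       # → local
```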
MCP Tool Discovery
from languagemodelcommon.mcp import ToolCatalog
catalog = ToolCatalog()
catalog.register_server("my-server", url="http://localhost:8080")
# BM25-ranked search for relevant tools
results = catalog.search("search patient records")
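The BM25 ranking behind the search can be sketched from scratch: score each tool description against the query by term frequency, inverse document frequency, and length normalization. The catalog's actual tokenizer and weights may differ.

```python
# Self-contained BM25 sketch for ranking tool descriptions by query.
import math
from collections import Counter

def bm25_rank(query: str, docs: dict[str, str],
              k1: float = 1.5, b: float = 0.75) -> list[str]:
    """Return doc names ordered by BM25 relevance to the query."""
    tokenized = {name: text.lower().split() for name, text in docs.items()}
    avg_len = sum(len(t) for t in tokenized.values()) / len(tokenized)
    n = len(tokenized)
    scores = {}
    for name, tokens in tokenized.items():
        tf = Counter(tokens)
        score = 0.0
        for term in query.lower().split():
            df = sum(term in t for t in tokenized.values())
            if df == 0:
                continue
            idf = math.log((n - df + 0.5) / (df + 0.5) + 1)
            freq = tf[term]
            score += idf * freq * (k1 + 1) / (
                freq + k1 * (1 - b + b * len(tokens) / avg_len))
        scores[name] = score
    return sorted(scores, key=scores.get, reverse=True)

tools = {
    "find_patient": "search patient records by name or id",
    "send_email": "send an email message to a recipient",
}
print(bm25_rank("search patient records", tools))  # find_patient ranks first
```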
Using the Dependency Injection Container
from languagemodelcommon.container import LanguageModelCommonContainerFactory
from simple_container import SimpleContainer
container = SimpleContainer()
LanguageModelCommonContainerFactory.register_services_in_container(container)
config_reader = container.resolve(ConfigReader)
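Singleton lifecycle means each service is built once on first resolve and reused afterwards. A minimal sketch of that registry pattern (simple-container's real API may differ):

```python
# Minimal singleton-lifecycle service registry sketch.
class Container:
    def __init__(self):
        self._factories = {}
        self._instances = {}

    def register(self, key, factory):
        self._factories[key] = factory

    def resolve(self, key):
        # Lazily build each service once, then reuse the same instance.
        if key not in self._instances:
            self._instances[key] = self._factories[key]()
        return self._instances[key]

class ConfigReader:  # stand-in for a registered service
    pass

container = Container()
container.register(ConfigReader, ConfigReader)
# Repeated resolves return the identical instance.
print(container.resolve(ConfigReader) is container.resolve(ConfigReader))  # → True
```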
Environment Variables
| Variable | Description | Default |
|---|---|---|
| GITHUB_CONFIG_REPO_URL | GitHub zipball API URL for config repo | — |
| GITHUB_TOKEN | GitHub token for authenticated requests | — |
| GITHUB_CACHE_FOLDER | Local cache directory for GitHub configs | /tmp/github_config_cache |
| CONFIG_CACHE_TIMEOUT_SECONDS | Config refresh interval (seconds) | 120 |
| GITHUB_TIMEOUT | HTTP timeout for GitHub requests (seconds) | 300 |
Development
Prerequisites
- Docker and Docker Compose
Setup
make init
This builds the development Docker image, locks dependencies, and sets up pre-commit hooks.
Running Tests
make tests
Tests run inside Docker using pytest with async support (asyncio_mode = auto).
Other Commands
make shell # Open a shell in the dev container
make update # Update dependencies and rebuild
make build # Build distribution package
make package # Publish to PyPI
make testpackage # Publish to TestPyPI
Project Structure
languagemodelcommon/
├── auth/ # Token storage and auth managers
├── aws/ # AWS client factory
├── configs/ # Config reader, prompt library, schemas
├── container/ # DI container factory
├── converters/ # LangGraph-to-OpenAI conversion, streaming
├── exceptions/ # Custom exception types
├── file_managers/ # Local and S3 file management
├── graph/ # LangGraph utilities
├── history/ # Conversation history management
├── http/ # HTTP client factory
├── image_generation/ # Image generation providers
├── markdown/ # HTML/CSV to Markdown converters
├── mcp/ # MCP client, tool catalog, OAuth
├── models/ # LLM model definitions
├── mocks/ # Test mocks and fakes
├── ocr/ # OCR extraction (AWS Textract)
├── persistence/ # LangGraph checkpoint/store backends
├── schema/ # OpenAI-compatible schema definitions
├── state/ # LangGraph state definitions
├── structures/ # Request/response wrappers
├── tools/ # Resilient tool base class, MCP tools
└── utilities/ # Logging, caching, token reduction, security
License
Apache License 2.0
File details
Details for the file language_model_common-2.0.15.tar.gz.
File metadata
- Download URL: language_model_common-2.0.15.tar.gz
- Size: 184.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 9ee5fd4a3bf63e77ed2a6f43a08714ae6d7cb4695f3e81d5086e0976df221844 |
| MD5 | b4adbffb53e561175ad6bc15e4dc9bad |
| BLAKE2b-256 | dd1ccc7e3ef9d5b53194ba48bba7b558d9f61e0547f7c202a1907446a9ec1124 |
Provenance
The following attestation bundles were made for language_model_common-2.0.15.tar.gz:
Publisher: python-publish.yml on icanbwell/language-model-common
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: language_model_common-2.0.15.tar.gz
- Subject digest: 9ee5fd4a3bf63e77ed2a6f43a08714ae6d7cb4695f3e81d5086e0976df221844
- Sigstore transparency entry: 1292543445
- Permalink: icanbwell/language-model-common@e1e7058d21325221f7883e83a4f417a67547511e
- Branch / Tag: refs/tags/2.0.15
- Owner: https://github.com/icanbwell
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@e1e7058d21325221f7883e83a4f417a67547511e
- Trigger Event: release
File details
Details for the file language_model_common-2.0.15-py3-none-any.whl.
File metadata
- Download URL: language_model_common-2.0.15-py3-none-any.whl
- Size: 253.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e9345d27e4a60ca1caf44b73f2fabc673126600100f638621cfe2b01c8b20efe |
| MD5 | 453c97ffb618b361a08ec90e1fe69abd |
| BLAKE2b-256 | 756695f2b3636992d95e89fd3cdeb6e14da6db5bea9a788ed193315d2af20d0d |
Provenance
The following attestation bundles were made for language_model_common-2.0.15-py3-none-any.whl:
Publisher: python-publish.yml on icanbwell/language-model-common
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: language_model_common-2.0.15-py3-none-any.whl
- Subject digest: e9345d27e4a60ca1caf44b73f2fabc673126600100f638621cfe2b01c8b20efe
- Sigstore transparency entry: 1292543497
- Permalink: icanbwell/language-model-common@e1e7058d21325221f7883e83a4f417a67547511e
- Branch / Tag: refs/tags/2.0.15
- Owner: https://github.com/icanbwell
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@e1e7058d21325221f7883e83a4f417a67547511e
- Trigger Event: release