
language-model-common

A shared Python framework for building LLM-powered agent applications with LangChain and LangGraph. It provides reusable infrastructure for configuration management, protocol conversion, MCP tool integration, file management, and more.

Features

  • Multi-source configuration loading — Read LLM model configs from local filesystem, AWS S3, or GitHub repositories with TTL-based caching and client-specific overrides
  • LangGraph-to-OpenAI protocol conversion — Stream LangGraph agent output as OpenAI-compatible Server-Sent Events (SSE) for chat completion APIs
  • MCP (Model Context Protocol) integration — Tool discovery with BM25 search ranking, OAuth 2.1/OIDC support, dynamic client registration, and MCP Apps UI rendering via ui:// resources
  • Prompt template library — Load and manage prompt templates from organized directory structures with GitHub auto-download support
  • File management abstraction — Unified interface for local and AWS S3 storage with factory-based backend selection
  • Token and cost management — Token reduction for long conversations and usage metadata tracking via tiktoken
  • Authentication and authorization — OAuth with PKCE, JWT token validation, OIDC provider discovery
  • Image generation — Provider abstraction over OpenAI and AWS Bedrock image generators
  • OCR extraction — AWS Textract integration behind a factory interface
  • Dependency injection container — Pre-wired service registry using simple-container with singleton lifecycle management

Installation

pip install language-model-common

Requirements: Python >= 3.10

Quick Start

Reading Model Configurations

from languagemodelcommon.configs.config_reader import ConfigReader

config_reader = ConfigReader(
    config_paths=["./configs"],  # local path, s3:// URI, or github:// path
)

# Load base configs with optional client-specific overrides
models = await config_reader.read_model_configs_async(client_id="my-client")
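The client-specific override behavior can be pictured as a per-key merge in which client values win over base values. A minimal sketch, assuming simple shallow-merge semantics (the library's actual merge logic may differ, and `merge_model_config` is an illustrative name, not part of the package API):

```python
# Illustrative sketch only: merge a base model config with a
# client-specific override, letting the client's keys win.
def merge_model_config(base: dict, client_override: dict) -> dict:
    """Return a new config dict where client-specific keys override base keys."""
    merged = dict(base)
    merged.update(client_override)
    return merged

base = {"model": "gpt-4o", "temperature": 0.2, "max_tokens": 1024}
override = {"temperature": 0.0}  # hypothetical per-client tweak

print(merge_model_config(base, override))
```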

Configuration sources are selected by path prefix:

  • Local: ./configs or /absolute/path
  • S3: s3://bucket-name/path
  • GitHub: Managed via GithubConfigRepoManager with automatic background refresh
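Prefix-based dispatch of this kind can be sketched as follows; this is a stand-in for the real `ConfigReader` routing logic, whose internals may differ:

```python
# Hedged sketch of selecting a config source by path prefix.
def classify_config_path(path: str) -> str:
    """Return which backend a config path would route to."""
    if path.startswith("s3://"):
        return "s3"
    if path.startswith("github://"):
        return "github"
    return "local"  # relative or absolute filesystem paths

print(classify_config_path("s3://bucket-name/path"))  # s3
print(classify_config_path("./configs"))              # local
```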

Loading Prompts

from languagemodelcommon.configs.prompt_library import PromptLibraryManager

prompt_manager = PromptLibraryManager(config_paths=["./configs"])
prompt_text = prompt_manager.get_prompt("system-prompt")

Prompts are loaded from .md or .txt files in a prompts/ subdirectory of your config paths.
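The lookup described above can be approximated with a small loader; note that `load_prompt` is an illustrative helper, not the `PromptLibraryManager` API, and the real suffix/priority rules may differ:

```python
# Sketch: resolve a prompt name to a .md or .txt file under prompts/.
from pathlib import Path

def load_prompt(config_path: str, name: str) -> str:
    """Read a prompt by name from the prompts/ subdirectory of a config path."""
    prompts_dir = Path(config_path) / "prompts"
    for suffix in (".md", ".txt"):
        candidate = prompts_dir / f"{name}{suffix}"
        if candidate.exists():
            return candidate.read_text(encoding="utf-8")
    raise FileNotFoundError(f"no prompt named {name!r} under {prompts_dir}")
```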

Streaming LangGraph Output as OpenAI SSE

from languagemodelcommon.converters import LangGraphToOpenAIConverter

converter = LangGraphToOpenAIConverter(
    streaming_manager=streaming_manager,
    token_reducer=token_reducer,
)

response = await converter.stream_response(
    graph=my_langgraph,
    messages=messages,
    config=config,
)

File Management

from languagemodelcommon.file_managers import FileManagerFactory

factory = FileManagerFactory(aws_client_factory=aws_factory)

# Automatically selects local or S3 backend based on the folder path
manager = factory.create("s3://my-bucket")
await manager.save_file_async(file_data, "s3://my-bucket", "output.json", "application/json")
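The factory pattern here amounts to choosing a backend class from the folder path's scheme. A generic sketch (class names are placeholders, not the package's real classes, and the real factory also wires in AWS clients):

```python
# Hedged sketch of factory-based backend selection by path prefix.
class LocalFileManagerSketch:
    backend = "local"

class S3FileManagerSketch:
    backend = "s3"

class FileManagerFactorySketch:
    """Pick a storage backend based on the folder path's scheme."""

    def create(self, folder: str):
        if folder.startswith("s3://"):
            return S3FileManagerSketch()
        return LocalFileManagerSketch()

factory = FileManagerFactorySketch()
print(factory.create("s3://my-bucket").backend)  # s3
print(factory.create("./data").backend)          # local
```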

MCP Tool Discovery

from languagemodelcommon.mcp import ToolCatalog

catalog = ToolCatalog()
catalog.register_server("my-server", url="http://localhost:8080")

# BM25-ranked search for relevant tools
results = catalog.search("search patient records")
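BM25 ranks tool descriptions by weighting query-term frequency against term rarity and document length. A compact, self-contained scoring sketch to show the idea (the catalog's tokenization and parameters may differ):

```python
# Compact BM25 ranking sketch over tool descriptions.
import math
from collections import Counter

def bm25_rank(query: str, docs: list[str], k1: float = 1.5, b: float = 0.75) -> list[int]:
    """Return document indices ordered best-match-first for the query."""
    tokenized = [d.lower().split() for d in docs]
    avgdl = sum(len(t) for t in tokenized) / len(tokenized)
    n = len(tokenized)
    df = Counter()                      # document frequency per term
    for tokens in tokenized:
        df.update(set(tokens))
    scores = []
    for tokens in tokenized:
        tf = Counter(tokens)
        score = 0.0
        for term in query.lower().split():
            if term not in tf:
                continue
            idf = math.log(1 + (n - df[term] + 0.5) / (df[term] + 0.5))
            norm = tf[term] + k1 * (1 - b + b * len(tokens) / avgdl)
            score += idf * tf[term] * (k1 + 1) / norm
        scores.append(score)
    return sorted(range(n), key=lambda i: scores[i], reverse=True)

tools = [
    "search patient records by name or id",
    "generate an image from a prompt",
    "convert html to markdown",
]
print(bm25_rank("search patient records", tools))  # best match first
```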

Using the Dependency Injection Container

from languagemodelcommon.configs.config_reader import ConfigReader
from languagemodelcommon.container import LanguageModelCommonContainerFactory
from simple_container import SimpleContainer

container = SimpleContainer()
LanguageModelCommonContainerFactory.register_services_in_container(container)

config_reader = container.resolve(ConfigReader)
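Singleton lifecycle management means each registered service is constructed once and the same instance is returned on every resolve. A generic sketch of that pattern (`TinyContainer` is illustrative; simple-container's real API differs):

```python
# Sketch of singleton-lifecycle resolution in a DI container.
class TinyContainer:
    def __init__(self):
        self._factories = {}   # key -> zero-arg factory
        self._singletons = {}  # key -> cached instance

    def register(self, key, factory):
        """Register a factory; the instance is built lazily on first resolve."""
        self._factories[key] = factory

    def resolve(self, key):
        """Build the instance once, then return the same object every time."""
        if key not in self._singletons:
            self._singletons[key] = self._factories[key]()
        return self._singletons[key]
```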

Environment Variables

| Variable | Description | Default |
| --- | --- | --- |
| GITHUB_CONFIG_REPO_URL | GitHub zipball API URL for the config repo | (none) |
| GITHUB_TOKEN | GitHub token for authenticated requests | (none) |
| GITHUB_CACHE_FOLDER | Local cache directory for GitHub configs | /tmp/github_config_cache |
| CONFIG_CACHE_TIMEOUT_SECONDS | Config refresh interval (seconds) | 120 |
| GITHUB_TIMEOUT | HTTP timeout for GitHub requests (seconds) | 300 |
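For example, a typical setup might export these before starting the service; the repo URL and token below are placeholders, not real values:

```shell
# Example values only; substitute your own org, repo, and token.
export GITHUB_CONFIG_REPO_URL="https://api.github.com/repos/<org>/<repo>/zipball/main"
export GITHUB_TOKEN="<your-token>"
export GITHUB_CACHE_FOLDER="/tmp/github_config_cache"
export CONFIG_CACHE_TIMEOUT_SECONDS=120
export GITHUB_TIMEOUT=300
```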

Development

Prerequisites

  • Docker and Docker Compose

Setup

make init

This builds the development Docker image, locks dependencies, and sets up pre-commit hooks.

Running Tests

make tests

Tests run inside Docker using pytest with async support (asyncio_mode = auto).
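With `asyncio_mode = auto`, async test functions need no `@pytest.mark.asyncio` decorator. A hypothetical example (the fake coroutine stands in for real library calls):

```python
# With asyncio_mode = auto, pytest collects and awaits async tests directly.
async def fake_read_configs():
    """Stand-in for an async library call such as read_model_configs_async."""
    return [{"model": "gpt-4o"}]

async def test_read_configs_returns_list():
    configs = await fake_read_configs()
    assert isinstance(configs, list)
    assert configs[0]["model"] == "gpt-4o"
```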

Other Commands

make shell       # Open a shell in the dev container
make update      # Update dependencies and rebuild
make build       # Build distribution package
make package     # Publish to PyPI
make testpackage # Publish to TestPyPI

Project Structure

languagemodelcommon/
├── auth/              # Token storage and auth managers
├── aws/               # AWS client factory
├── configs/           # Config reader, prompt library, schemas
├── container/         # DI container factory
├── converters/        # LangGraph-to-OpenAI conversion, streaming
├── exceptions/        # Custom exception types
├── file_managers/     # Local and S3 file management
├── graph/             # LangGraph utilities
├── history/           # Conversation history management
├── http/              # HTTP client factory
├── image_generation/  # Image generation providers
├── markdown/          # HTML/CSV to Markdown converters
├── mcp/               # MCP client, tool catalog, OAuth
├── models/            # LLM model definitions
├── mocks/             # Test mocks and fakes
├── ocr/               # OCR extraction (AWS Textract)
├── persistence/       # LangGraph checkpoint/store backends
├── schema/            # OpenAI-compatible schema definitions
├── state/             # LangGraph state definitions
├── structures/        # Request/response wrappers
├── tools/             # Resilient tool base class, MCP tools
└── utilities/         # Logging, caching, token reduction, security

License

Apache License 2.0
