A new package that processes user-provided text inputs (such as headlines, descriptions, or extracted content from media) and returns structured, pattern-validated summaries using an LLM.
textsummarizer-llm
Overview
textsummarizer_llm provides a simple, pattern‑validated summarization utility powered by a language model (LLM).
Given a raw text input (e.g., a headline, article excerpt, or description), the package:
- Sends the text to a chat LLM using LangChain messages.
- Enforces a predefined output format via a regular‑expression pattern (llmatch).
- Returns a list of extracted, structured summaries.
Typical use‑cases include content moderation, topic tagging, and automated summarization where a consistent response format is required.
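The pattern‑validation idea can be sketched as follows. This is a minimal illustration only: the actual prompt and regex used by the package are internal, and the `<summary>` tag pattern and `extract_summaries` helper below are hypothetical, not part of the package API.

```python
import re

# Hypothetical contract: the LLM is asked to wrap each summary in <summary> tags,
# and the regex keeps only text that conforms to that required format.
PATTERN = re.compile(r"<summary>(.*?)</summary>", re.DOTALL)

def extract_summaries(llm_response: str) -> list[str]:
    """Return every fragment of the response that matches the pattern."""
    return [m.strip() for m in PATTERN.findall(llm_response)]

# A raw model response containing two well-formed summaries and some noise.
response = (
    "Here you go: <summary>GPT-4 Turbo offers faster inference.</summary> "
    "<summary>It also lowers cost per token.</summary>"
)
print(extract_summaries(response))
# → ['GPT-4 Turbo offers faster inference.', 'It also lowers cost per token.']
```

Anything in the response that does not match the pattern is simply dropped, which is what makes the output format consistent regardless of how chatty the model is.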
Installation
pip install textsummarizer_llm
Quick Start
from textsummarizer_llm import textsummarizer_llm
# Simple call – uses the default ChatLLM7 internally
summary = textsummarizer_llm(
user_input="OpenAI just released GPT‑4 Turbo, offering faster inference and lower cost."
)
print(summary) # → ['...structured summary according to the defined pattern...']
Advanced Usage – Plugging Your Own LLM
You can pass any LangChain‑compatible chat model (e.g., OpenAI, Anthropic, Google) to the function.
OpenAI
from langchain_openai import ChatOpenAI
from textsummarizer_llm import textsummarizer_llm
my_llm = ChatOpenAI(model="gpt-4o-mini")
result = textsummarizer_llm(
user_input="A new study shows that daily meditation improves mental health.",
llm=my_llm
)
print(result)
Anthropic
from langchain_anthropic import ChatAnthropic
from textsummarizer_llm import textsummarizer_llm
anthropic_llm = ChatAnthropic(model="claude-3-sonnet-20240229")
result = textsummarizer_llm(
user_input="The city council approved a new bike‑lane network.",
llm=anthropic_llm
)
Google Generative AI
from langchain_google_genai import ChatGoogleGenerativeAI
from textsummarizer_llm import textsummarizer_llm
google_llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")
result = textsummarizer_llm(
user_input="Tesla announced a new battery technology with higher energy density.",
llm=google_llm
)
API Reference
def textsummarizer_llm(
user_input: str,
api_key: Optional[str] = None,
llm: Optional[BaseChatModel] = None
) -> List[str]:
"""
Summarize `user_input` while ensuring the output matches a predefined regex pattern.
Parameters
----------
user_input: str
The raw text that needs to be processed and summarized.
api_key: Optional[str]
API key for the default `ChatLLM7`. If omitted, the function first looks for the
`LLM7_API_KEY` environment variable, then falls back to a placeholder key.
llm: Optional[BaseChatModel]
A LangChain chat model instance. If not provided, `ChatLLM7` from
`langchain_llm7` is instantiated automatically.
Returns
-------
List[str]
A list of extracted summary strings that conform to the regex pattern.
"""
Authentication & Rate Limits
The default LLM is ChatLLM7 from the langchain_llm7 package.
Free‑tier limits are sufficient for typical development and small‑scale usage.
If you require higher limits, provide your own API key:
export LLM7_API_KEY="your-llm7-api-key"
or directly:
summary = textsummarizer_llm(
user_input="...",
api_key="your-llm7-api-key"
)
You can obtain a free key at https://token.llm7.io/.
Contributing
Contributions are welcome! Please open issues or pull requests on the GitHub repository.
License
This project is licensed under the MIT License.
Author
Eugene Evstafev – hi@eugene.plus
GitHub: chigwell
Issues
Report bugs or request features via the issue tracker:
https://github.com/chigwell/textsummarizer-llm/issues
Project details
Download files
File details
Details for the file textsummarizer_llm-2025.12.20201642.tar.gz.
File metadata
- Download URL: textsummarizer_llm-2025.12.20201642.tar.gz
- Upload date:
- Size: 5.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.11
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | c96596a126ee805d45be1504b12be198357fbd10abb0017b36625b662f6245ee |
| MD5 | 3561e66c6c7acccf87098fd7f35b22c2 |
| BLAKE2b-256 | 2074cc9785eca9b1af09ecdf9dbbc0619404d47b375787a04ff4530b0259e5bf |
File details
Details for the file textsummarizer_llm-2025.12.20201642-py3-none-any.whl.
File metadata
- Download URL: textsummarizer_llm-2025.12.20201642-py3-none-any.whl
- Upload date:
- Size: 5.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.11
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 60c94bba0caad01a1da32be35dd31af55770457dafcd4635279619c2ebed3076 |
| MD5 | 7e71e7405255d3400acd786d363a00a0 |
| BLAKE2b-256 | 518a867372222e7e416354fe62922b93134b7be66efd399201b6ee59932e80a7 |