Extracts structured summaries from security- and government-related texts using an LLM
Project description
sec-summary-llm
sec-summary-llm is a tiny Python package that extracts structured, concise summaries from security‑ and government‑related text (news headlines, reports, etc.). It drives a language model (LLM) with a system prompt that focuses on factual details such as roles, events, and implications, and then validates the LLM output against a strict regex pattern. The result is a clean, plain‑text summary ready for downstream analysis or reporting.
Features
- One-function API – call sec_summary_llm() with raw text and get back a list of extracted summary strings.
- Built-in LLM7 support – automatically uses ChatLLM7 from the langchain_llm7 package if no LLM is supplied.
- Pattern-based validation – output must match the predefined regex pattern, guaranteeing consistent formatting.
- Pluggable LLMs – works with any LangChain‑compatible chat model (OpenAI, Anthropic, Google, etc.).
- Zero‑markdown/HTML output – the function returns plain text, suitable for CSV, databases, or further NLP pipelines.
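The pattern-based validation step above can be sketched roughly as follows. Note that the package's actual regex is not published in this description, so SUMMARY_PATTERN and validate_output below are assumed placeholders, not the library's real internals:

```python
import re

# Assumed placeholder pattern: one "- " bullet per summary line.
# The real sec_summary_llm regex may differ.
SUMMARY_PATTERN = re.compile(r"^- .+$", re.MULTILINE)

def validate_output(raw_llm_output: str) -> list[str]:
    """Keep only the lines of the LLM output that match the expected format."""
    return SUMMARY_PATTERN.findall(raw_llm_output)
```

In this sketch, any model chatter that does not match the pattern is simply dropped, which is one way such validation can guarantee consistent formatting.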
Installation
pip install sec_summary_llm
Quick Start
from sec_summary_llm import sec_summary_llm
# Simple call using the default ChatLLM7 (requires an API key in the environment)
summary = sec_summary_llm(
    user_input="The Treasury Department announced new sanctions against..."
)
print(summary) # -> ['...'] (list of formatted summary strings)
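Because the return value is a list of plain-text strings, feeding it into the CSV or database pipelines mentioned above is straightforward. A minimal sketch (summaries_to_csv is a hypothetical helper, not part of the package):

```python
import csv
import io

def summaries_to_csv(summaries: list[str]) -> str:
    """Serialize one summary per CSV row, with a header."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["summary"])  # header row
    for s in summaries:
        writer.writerow([s])
    return buf.getvalue()
```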
Parameters
| Name | Type | Description |
|---|---|---|
| user_input | str | The raw text (e.g., headline, report excerpt) to be summarized. |
| llm | Optional[BaseChatModel] | A LangChain chat model instance. If omitted, the function creates a ChatLLM7 instance automatically. |
| api_key | Optional[str] | API key for LLM7. If not supplied, the function reads LLM7_API_KEY from the environment or falls back to "None" (which will raise an error from the provider). |
Using a Custom LLM
You can pass any LangChain‑compatible chat model. Below are examples for the most common providers.
OpenAI
from langchain_openai import ChatOpenAI
from sec_summary_llm import sec_summary_llm
llm = ChatOpenAI(model="gpt-4o-mini")
response = sec_summary_llm(
    user_input="Recent congressional hearings revealed...",
    llm=llm
)
print(response)
Anthropic
from langchain_anthropic import ChatAnthropic
from sec_summary_llm import sec_summary_llm
llm = ChatAnthropic(model="claude-3-haiku-20240307")
response = sec_summary_llm(
    user_input="A new cyber-espionage campaign has been traced...",
    llm=llm
)
print(response)
Google Generative AI
from langchain_google_genai import ChatGoogleGenerativeAI
from sec_summary_llm import sec_summary_llm
llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")
response = sec_summary_llm(
    user_input="The Ministry of Defense released a statement about...",
    llm=llm
)
print(response)
LLM7 (Default)
- Package: langchain_llm7 – https://pypi.org/project/langchain-llm7/
- Free-tier rate limits are sufficient for typical usage (a few requests per minute).
- To increase limits, provide a paid API key via the LLM7_API_KEY environment variable or pass it directly:
response = sec_summary_llm(
    user_input="...",
    api_key="your_paid_llm7_key"
)
- Obtain a free key by registering at https://token.llm7.io/
Contributing & Issues
If you encounter bugs or have feature requests, please open an issue:
https://github... (replace with actual repository URL)
License
This project is licensed under the MIT License.
Author
Eugene Evstafev
Email: hi@eugene.plus
GitHub: chigwell
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file sec_summary_llm-2025.12.21193952.tar.gz.
File metadata
- Download URL: sec_summary_llm-2025.12.21193952.tar.gz
- Upload date:
- Size: 5.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.1
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | edf06444b757f37ae4abb37d05eff5f1785175426a914410920bb788907a858f |
| MD5 | 273193841b4b2ac9f6aa1959dd64b2c1 |
| BLAKE2b-256 | 8fc14dc3ec3488494a3bd6374e4bcabdebccb555f9176e8115a5bdd56be60a71 |
File details
Details for the file sec_summary_llm-2025.12.21193952-py3-none-any.whl.
File metadata
- Download URL: sec_summary_llm-2025.12.21193952-py3-none-any.whl
- Upload date:
- Size: 5.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.1
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 1eec4e3c559f2da655fe4d6b08c24bce792ceaa84e4f7e5f32396d7655023925 |
| MD5 | 764889843ebe779530fdac58cf1f4dd6 |
| BLAKE2b-256 | 29c9eed5901916c7fbd92755b2105a9229c98ecfad7312f3c597c0030cf93a08 |