Project description
ModerateFocus
Analyzing Community Moderation and Platform Policies
Overview
ModerateFocus is a Python package that helps analyze user-submitted text queries about community moderation or platform policies. It uses a Large Language Model (LLM) to generate reasoned explanations and extracts key points from them with pattern matching. This keeps the output consistent and non-opinionated, helping users understand common moderation pitfalls without delving into sensitive or subjective areas.
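The package's internal extraction logic is not shown on this page, but a pattern-matching step of the kind described above might look like the following sketch. The function name and regex are illustrative assumptions, not the library's actual implementation:

```python
import re

def extract_key_points(llm_text: str) -> list[str]:
    # Assumption: key points arrive as numbered ("1. ...") or
    # bulleted ("- ...", "* ...") lines in the LLM's response.
    pattern = re.compile(r"^\s*(?:\d+\.|[-*])\s+(.*\S)", re.MULTILINE)
    return pattern.findall(llm_text)

sample = """Here are the main points:
1. Small projects are often flagged by volume-based heuristics.
- Appeals usually require context the automated filter lacks.
"""
print(extract_key_points(sample))
```

Pattern matching on the LLM's text (rather than trusting free-form prose) is what makes the returned list of key points structured and consistent across runs.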
Installation
pip install moderatefocus
Usage
from moderatefocus import moderatefocus
user_input = "Why do small voting projects get flagged?"
response = moderatefocus(user_input, api_key="your_api_key_here")
print(response)  # list of extracted key points
Parameters
user_input: str - the user input text to process
api_key: Optional[str] - the API key for LLM7; if not provided, the default ChatLLM7 will be used
llm: Optional[BaseChatModel] - the langchain LLM instance to use; if not provided, the default ChatLLM7 will be used
Using Custom LLM Instances
You can pass your own langchain-based LLM instance if you want to use another LLM. For example:
from langchain_openai import ChatOpenAI
from moderatefocus import moderatefocus
llm = ChatOpenAI()
response = moderatefocus(user_input, llm=llm)
Using Another LLM
You can use another provider's LLM, such as Anthropic or Google. For example:
from langchain_anthropic import ChatAnthropic
from moderatefocus import moderatefocus
llm = ChatAnthropic()
response = moderatefocus(user_input, llm=llm)
or google:
from langchain_google_genai import ChatGoogleGenerativeAI
from moderatefocus import moderatefocus
llm = ChatGoogleGenerativeAI()
response = moderatefocus(user_input, llm=llm)
API Key Rate Limits
The default rate limits of the LLM7 free tier are sufficient for most uses of this package. If you need higher limits, you can supply your own API key via the LLM7_API_KEY environment variable or pass it directly: moderatefocus(user_input, api_key="your_api_key_here"). You can get a free API key by registering at https://token.llm7.io/.
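The two ways of supplying the key described above can be sketched as follows; setting the variable in-process only affects the current Python interpreter, while `export LLM7_API_KEY=...` in a shell makes it available to child processes:

```python
import os

# Option 1: environment variable, read by the package at call time
# (the variable name LLM7_API_KEY comes from the documentation above).
os.environ["LLM7_API_KEY"] = "your_api_key_here"

# Option 2: pass the key directly as an argument (shown for comparison):
# response = moderatefocus(user_input, api_key="your_api_key_here")

print(os.environ["LLM7_API_KEY"])
```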
Author Eugene Evstafev (hi@eugene.plus)
GitHub https://github.com/chigwell
License MIT License
Project details
Release history
Download files
Source Distribution
Built Distribution
File details
Details for the file moderatefocus-2025.12.21103715.tar.gz.
File metadata
- Download URL: moderatefocus-2025.12.21103715.tar.gz
- Upload date:
- Size: 5.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.11
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | df51ae233a33af5d960dd3773926e1f09a4e5ccedd7e9978981f642f0de1a1b0 |
| MD5 | 7a3e2f30feaa1d3a7543d6eb77cd4225 |
| BLAKE2b-256 | 57d62a22ecf1d62c0bb17df8422012356002946ea15b7048a49f4df59b7d379a |
File details
Details for the file moderatefocus-2025.12.21103715-py3-none-any.whl.
File metadata
- Download URL: moderatefocus-2025.12.21103715-py3-none-any.whl
- Upload date:
- Size: 6.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.11.11
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 02cf46527da4757d059633d4344191c9b6161aab50863315f9c77e475e561402 |
| MD5 | a47f62cf088a99f65215de280719d9fd |
| BLAKE2b-256 | 6af49e5a3917084dc724e0c2c91a25759ed600f80f4565335d665bbbd9e1b6cf |