Text input processor for corporate actions, extracting key entities, actions, and context.
bizact-insights
bizact-insights is a Python package designed to process text inputs related to corporate actions, such as trademark filings or product launches, and extract structured insights. It identifies key entities, actions, and contextual information from news snippets or official statements, enabling users to quickly grasp business-related developments.
Installation
Install the package via pip:
pip install bizact_insights
Usage
from bizact_insights import bizact_insights
# Example usage with default LLM7
results = bizact_insights(
    user_input="Apple announced the launch of a new product line in Europe.",
)
print(results)
Parameters
- user_input (str): The input text to analyze.
- llm (Optional[BaseChatModel]): An instance of a language model (e.g., from LangChain). If not provided, the default ChatLLM7 will be used.
- api_key (Optional[str]): API key for LLM7. If not provided, it can be set via the environment variable LLM7_API_KEY.
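The precedence between api_key and the environment variable can be sketched as follows. Note that resolve_api_key is a hypothetical helper written here for illustration, not part of the package's API; it only mirrors the documented fallback behavior.

```python
import os

# Hypothetical helper (not part of bizact_insights): an explicit
# api_key argument wins; otherwise the LLM7_API_KEY environment
# variable is consulted.
def resolve_api_key(api_key=None):
    if api_key is not None:
        return api_key
    return os.environ.get("LLM7_API_KEY")

os.environ["LLM7_API_KEY"] = "key-from-env"
print(resolve_api_key())            # falls back to the environment
print(resolve_api_key("explicit"))  # explicit argument takes precedence
```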
Details
This package relies on the ChatLLM7 class from langchain_llm7, which offers a straightforward interface to the LLM7 API. It can also accept custom language model instances, allowing integration with other providers such as OpenAI, Anthropic, or Google Generative AI.
Example with custom LLMs
# OpenAI
from langchain_openai import ChatOpenAI
from bizact_insights import bizact_insights

llm = ChatOpenAI()
response = bizact_insights(user_input="Tesla reported record deliveries.", llm=llm)
print(response)

# Anthropic
from langchain_anthropic import ChatAnthropic
from bizact_insights import bizact_insights

llm = ChatAnthropic(model="claude-3-5-haiku-latest")  # a model name is required; use any available Claude model
response = bizact_insights(user_input="Microsoft announced a new cloud service.", llm=llm)
print(response)

# Google Generative AI
from langchain_google_genai import ChatGoogleGenerativeAI
from bizact_insights import bizact_insights

llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")  # a model name is required
response = bizact_insights(user_input="Google is expanding its workspace features.", llm=llm)
print(response)
Rate Limits
By default, requests go through LLM7's free tier, whose rate limits are sufficient for most use cases. For higher throughput, you can obtain a free API key by registering at https://token.llm7.io/ and set it via the LLM7_API_KEY environment variable or pass it directly:
results = bizact_insights(user_input="Sample text", api_key="your_api_key_here")
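Alternatively, a minimal POSIX-shell sketch of the environment-variable route; the package then picks up the key without any code changes:

```shell
# Export the key once; bizact_insights reads LLM7_API_KEY from the
# environment when no api_key argument is passed.
export LLM7_API_KEY="your_api_key_here"
echo "LLM7_API_KEY is set to: $LLM7_API_KEY"
```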
Support & Issues
If you encounter issues or have questions, please open an issue on the GitHub repository.
Author
Eugene Evstafev
Email: hi@euegne.plus
GitHub: chigwell
File details
Details for the file bizact_insights-2025.12.21201957.tar.gz.
File metadata
- Download URL: bizact_insights-2025.12.21201957.tar.gz
- Upload date:
- Size: 6.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.1
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 714b1b84ea8b9ad8aa0599f2925be63f57ebeebc00962875698640c7012e5e67 |
| MD5 | 121da595e42135f621eacb57caf5191e |
| BLAKE2b-256 | 33a3c5fe782b423b630686fbcce81a120aa7fbf971a0bc6b26bd0152f5adcf05 |
File details
Details for the file bizact_insights-2025.12.21201957-py3-none-any.whl.
File metadata
- Download URL: bizact_insights-2025.12.21201957-py3-none-any.whl
- Upload date:
- Size: 6.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.1
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | a3e8a9c980ac01410ba0174a4b1be16b7190a8342d9f0e6d1627f3323e78bad7 |
| MD5 | 9e26cc6627fcd346354c7c3fcad9308d |
| BLAKE2b-256 | 1f1196477738814ddc292d2dbc447db81bb243a921ac296463550e8337ef8763 |