
Project description

techspec-extract

License: MIT

A Python package that processes user-supplied descriptions of technical products or components and generates structured summaries or specifications. It uses LLMs to interpret the input text and extract key details such as features, return status, and related media references, enabling consistent data extraction for product management, customer support, and inventory tracking without handling the actual media files.

Installation

pip install techspec-extract

Usage

from techspec_extract import techspec_extract

user_input = "Your user input text here"
response = techspec_extract(user_input)
print(response)

Parameters

  • user_input (str): The user input text to process.
  • llm (Optional[BaseChatModel]): The LangChain LLM instance to use. If not provided, the default ChatLLM7 will be used.
  • api_key (Optional[str]): The API key for LLM7. If not provided, the environment variable LLM7_API_KEY will be used.
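The api_key fallback described above follows a common precedence pattern. A minimal stdlib sketch of that behavior (resolve_api_key is a hypothetical helper for illustration, not part of the package):

```python
import os

def resolve_api_key(api_key=None):
    # An explicitly passed key takes precedence; otherwise fall back to the
    # LLM7_API_KEY environment variable, mirroring the parameter docs above.
    return api_key if api_key is not None else os.environ.get("LLM7_API_KEY")
```

With this precedence, a key passed as an argument always wins over the environment, and the function returns None when neither is set.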

Using Different LLMs

You can pass your own LangChain chat-model instance if you want to use a different LLM. Here are examples with several providers:

Using OpenAI

from langchain_openai import ChatOpenAI
from techspec_extract import techspec_extract

llm = ChatOpenAI()  # uses the default model; requires OPENAI_API_KEY to be set
response = techspec_extract(user_input, llm=llm)

Using Anthropic

from langchain_anthropic import ChatAnthropic
from techspec_extract import techspec_extract

llm = ChatAnthropic(model="claude-3-5-sonnet-20241022")  # the model argument is required
response = techspec_extract(user_input, llm=llm)

Using Google

from langchain_google_genai import ChatGoogleGenerativeAI
from techspec_extract import techspec_extract

llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")  # the model argument is required
response = techspec_extract(user_input, llm=llm)

Rate Limits

The default rate limits of the LLM7 free tier are sufficient for most uses of this package. If you need higher rate limits, pass your own API key, either through the environment variable LLM7_API_KEY or directly via the api_key parameter:

from techspec_extract import techspec_extract

user_input = "Your user input text here"
response = techspec_extract(user_input, api_key="your_api_key")

You can get a free API key by registering at LLM7.
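If you prefer the environment-variable route, set the key before running your script; in a POSIX shell, for example:

```shell
# Set the key for the current shell session; techspec_extract reads the
# LLM7_API_KEY environment variable when no api_key argument is passed.
export LLM7_API_KEY="your_api_key"
```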

Issues

If you encounter any issues, please report them on the GitHub issues page.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

techspec_extract-2025.12.21115447.tar.gz (5.6 kB, uploaded as Source)

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

techspec_extract-2025.12.21115447-py3-none-any.whl (6.3 kB, uploaded as Python 3)

File details

Details for the file techspec_extract-2025.12.21115447.tar.gz.

File hashes

Hashes for techspec_extract-2025.12.21115447.tar.gz:

  • SHA256: e9fdfc62518a95547ed5ff0429c377431c0e01e60f336c4e7200205da6d25f46
  • MD5: edd468d4b3ddef910902db1e52455ada
  • BLAKE2b-256: c22b7945a8ce3c12cfef6391bef92c09295e5c9e6c6af47f57f49b2118926b4b

See more details on using hashes here.
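To check a downloaded distribution against the published digest, you can compute the SHA-256 locally with Python's standard library. A small sketch (the filename in the comment is this release's sdist; adjust the path to wherever you downloaded it):

```python
import hashlib

def sha256_of_file(path, chunk_size=1 << 16):
    """Stream a file through SHA-256 so large downloads aren't read into memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result against the published digest, e.g. for the sdist:
# expected = "e9fdfc62518a95547ed5ff0429c377431c0e01e60f336c4e7200205da6d25f46"
# assert sha256_of_file("techspec_extract-2025.12.21115447.tar.gz") == expected
```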

File details

Details for the file techspec_extract-2025.12.21115447-py3-none-any.whl.

File hashes

Hashes for techspec_extract-2025.12.21115447-py3-none-any.whl:

  • SHA256: b4c9fb7b28cb65c15563498b8aae07e64e5a4fa05045cfdba038ef4999b77760
  • MD5: fe7638a5c7684caea0fd13ce43508915
  • BLAKE2b-256: aeca48003f07b3bd876095268512be46791a3bb0f0db5dd0f38faff7070b0e15

See more details on using hashes here.
