termdeployr-llm7
Overview
termdeployr-llm7 is a lightweight Python package that helps developers deploy applications directly from the terminal.
It guides you step‑by‑step with clear, actionable terminal commands, parsing your deployment goals or issues and returning structured responses that match a strict regex pattern.
The core of the package uses LLM7 via the ChatLLM7 class from the langchain_llm7 integration, but you can plug in any other LangChain‑compatible LLM if you prefer.
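To illustrate what "structured responses that match a strict regex pattern" could look like in practice, here is a minimal sketch of regex-based command extraction. The fence style and pattern are assumptions for illustration; the actual pattern used internally by termdeployr-llm7 may differ.

```python
import re
from typing import List

# Hypothetical pattern: assume the LLM is prompted to wrap each terminal
# command in a ```shell fence, and a strict regex pulls the commands out.
COMMAND_PATTERN = re.compile(r"```shell\n(.*?)\n```", re.DOTALL)

def extract_commands(llm_response: str) -> List[str]:
    """Return every fenced shell command found in the raw LLM text."""
    return COMMAND_PATTERN.findall(llm_response)

sample = "First install the CLI:\n```shell\npip install awsebcli\n```\nThen initialize:\n```shell\neb init\n```"
print(extract_commands(sample))  # ['pip install awsebcli', 'eb init']
```

Constraining the output this way is what lets the function return a clean list of commands instead of free-form prose.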
Installation
pip install termdeployr-llm7
Quick Start
from termdeployr_llm7 import termdeployr_llm7
# Minimal call – uses default ChatLLM7 and the LLM7_API_KEY env variable
response = termdeployr_llm7(
    user_input="I want to deploy a Django app to AWS Elastic Beanstalk"
)
print(response) # => list of terminal commands / instructions
Function Signature
def termdeployr_llm7(
    user_input: str,
    api_key: Optional[str] = None,
    llm: Optional[BaseChatModel] = None,
) -> List[str]:
    """
    Returns a list of terminal commands that fulfill the deployment request.

    Parameters
    ----------
    user_input : str
        The natural-language description of the deployment goal or problem.
    api_key : Optional[str]
        LLM7 API key. If omitted, the function looks for the `LLM7_API_KEY`
        environment variable, falling back to a placeholder key.
    llm : Optional[BaseChatModel]
        Any LangChain `BaseChatModel` instance. If not provided, the default
        `ChatLLM7` client is instantiated.
    """
Using a Custom LLM
You can safely replace the default ChatLLM7 with any LangChain‑compatible chat model.
OpenAI
from langchain_openai import ChatOpenAI
from termdeployr_llm7 import termdeployr_llm7
llm = ChatOpenAI(model="gpt-4o-mini")
response = termdeployr_llm7(
    user_input="Deploy a Flask app to Railway",
    llm=llm,
)
Anthropic
from langchain_anthropic import ChatAnthropic
from termdeployr_llm7 import termdeployr_llm7
llm = ChatAnthropic(model="claude-3-opus-20240229")
response = termdeployr_llm7(
    user_input="Set up CI/CD for a Node.js project on GitHub Actions",
    llm=llm,
)
Google Generative AI
from langchain_google_genai import ChatGoogleGenerativeAI
from termdeployr_llm7 import termdeployr_llm7
llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")
response = termdeployr_llm7(
    user_input="Create Docker images for a Go microservice",
    llm=llm,
)
API Key & Rate Limits
- The free tier of LLM7 provides sufficient rate limits for typical deployment assistance.
- To obtain a free API key, register at: https://token.llm7.io/
- You can set the key via the LLM7_API_KEY environment variable or pass it directly:
response = termdeployr_llm7(
    user_input="Deploy a static site to Netlify",
    api_key="sk_XXXXXXXXXXXXXXXX"
)
If higher usage limits are required, upgrade your LLM7 plan and use the new key.
Contributing
Issues, bug reports, and feature requests are welcome. Please open a new issue on GitHub:
https://github.com/chigwell/termdeployr-llm7/issues
Author
Eugene Evstafev – hi@euegne.plus
GitHub: chigwell
Happy deploying! 🚀