LLM fine-tuning hardware planner with CLI and API.
Project description
Train in Silence
Stop comparing GPU prices. Start training.
You want to fine-tune an LLM. You open Vast.ai, RunPod, AWS -- three tabs, three pricing models, three different ways to describe a GPU. An hour later you're still in a spreadsheet and haven't written a single line of training code.
Train in Silence does that homework for you. Describe your workload once, and it returns the cheapest, fastest, and most balanced hardware options across cloud providers -- in seconds.
Quickstart
Option A: Ask Claude (recommended)
Install the library and register it as a tool in Claude Code:
pip install train-in-silence
claude mcp add tis --scope user -- tis-mcp
Then just ask in natural language:
> I want to QLoRA fine-tune Llama-13B on 100M tokens, budget under $20.
Find me the best GPU options across Vast.ai, RunPod, and AWS.
Claude calls TIS behind the scenes and returns a structured recommendation -- no YAML, no config files, no manual comparison.
Option B: CLI
pip install train-in-silence
tis recommend examples/request.yaml
$ tis recommend examples/request.yaml
Found 5 viable configurations
Lowest cost: $4.32 | Fastest runtime: 2.1 hours
#1 [cheapest] RunPod 1x A6000 (48 GB) $4.32 / 6.8 h
#2 [fastest] Vast.ai 2x A100 (80 GB) $9.10 / 2.1 h
#3 [balanced] RunPod 1x A100 (80 GB) $6.40 / 3.2 h
...
Note: Output above is illustrative. Actual results depend on live market data.
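The request file describes your workload: which model, which fine-tuning method, how many tokens, and what you are willing to spend. A minimal sketch of what such a file might contain -- the field names below are hypothetical, not the actual schema; see examples/request.yaml in the repository for the real format:

```yaml
# Hypothetical request file -- field names are illustrative, not the real schema.
model: llama-13b            # base model to fine-tune
method: qlora               # fine-tuning method
tokens: 100000000           # training corpus size (100M tokens)
max_budget_usd: 20          # hard cost ceiling
providers: [vastai, runpod, aws]
```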
Use It Your Way
| Channel | Command | Docs |
|---|---|---|
| CLI | tis recommend request.yaml | CLI Guide |
| REST API | uvicorn tis.api.server:app (client sketch below) | API Reference |
| Claude Code | claude mcp add tis --scope user -- tis-mcp | MCP Guide |
| Claude Desktop | Add tis-mcp to claude_desktop_config.json | MCP Guide |
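To use the REST API, start the server with uvicorn and POST your workload as JSON. Below is a minimal client sketch; the endpoint path, port, and payload fields are assumptions for illustration only, not the documented API -- consult the API Reference for the real schema.

```python
# Hypothetical client sketch -- the /recommend path and payload fields are
# assumptions, not the documented TIS API; check the API Reference.
import requests

payload = {
    "model": "llama-13b",
    "method": "qlora",
    "tokens": 100_000_000,
    "max_budget_usd": 20,
}
resp = requests.post("http://localhost:8000/recommend", json=payload, timeout=60)
resp.raise_for_status()
for option in resp.json().get("recommendations", []):
    print(option)
```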
Market Providers
Live pricing from three providers -- no manual data entry:
| Provider | Source | Auth Required |
|---|---|---|
| Vast.ai | REST API | VAST_API_KEY |
| RunPod | GraphQL API | RUNPOD_API_KEY |
| AWS | Public EC2 Price List | None |
If a provider is unreachable, TIS gracefully falls back to bundled sample data and marks the result accordingly. -> Provider details
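The fallback behaves roughly like the pattern sketched below -- a standalone illustration of the idea, not TIS's internal code:

```python
# Illustration of the live-then-sample fallback described above; not TIS internals.
def fetch_offers(provider, load_live, load_sample):
    """Try live pricing; on any failure, fall back to bundled samples and say so."""
    try:
        return {"provider": provider, "source": "live", "offers": load_live(provider)}
    except Exception:  # network error, missing API key, upstream schema change, ...
        return {"provider": provider, "source": "sample", "offers": load_sample(provider)}
```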
Architecture at a Glance
YAML request -> Estimator (VRAM/FLOPs) -> Market Aggregator (Vast.ai + RunPod + AWS) -> Optimizer (cost vs. time) -> Pareto Frontier -> Ranked Output
Each recommendation shows where the data came from (live or sample) and flags any estimated fields -- no silent guesswork. -> Architecture deep-dive
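To make the Pareto Frontier step concrete: a configuration survives only if no other option is both cheaper and faster. The sketch below is a standalone illustration of that filter (not the library's optimizer code), reusing the sample numbers from the quickstart output:

```python
# Standalone illustration of a cost/time Pareto filter -- not TIS's optimizer code.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    cost_usd: float
    hours: float

def pareto_frontier(candidates: list[Candidate]) -> list[Candidate]:
    """Keep every candidate that no other candidate beats on both cost and time."""
    frontier = []
    for c in candidates:
        dominated = any(
            o is not c
            and o.cost_usd <= c.cost_usd
            and o.hours <= c.hours
            and (o.cost_usd < c.cost_usd or o.hours < c.hours)
            for o in candidates
        )
        if not dominated:
            frontier.append(c)
    return sorted(frontier, key=lambda cand: cand.cost_usd)

options = [
    Candidate("RunPod 1x A6000", 4.32, 6.8),
    Candidate("Vast.ai 2x A100", 9.10, 2.1),
    Candidate("RunPod 1x A100", 6.40, 3.2),
    Candidate("Hypothetical 1x V100", 11.00, 7.5),  # dominated: costlier and slower
]
for c in pareto_frontier(options):
    print(c)  # the dominated option never appears
```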
Known Limitations
- The estimation model is fixed, with no built-in calibration; future versions will calibrate it against real training runtimes.
- AWS availability is approximated because there is no real-time instance availability API; approximated values are flagged transparently.
- Changes to upstream provider API schemas will require matching updates to TIS's field mappings.
Download files
Source Distribution: train_in_silence-0.1.0.tar.gz (23.2 kB)
Built Distribution: train_in_silence-0.1.0-py3-none-any.whl (14.5 kB)
File details
Details for the file train_in_silence-0.1.0.tar.gz.
File metadata
- Download URL: train_in_silence-0.1.0.tar.gz
- Upload date:
- Size: 23.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 069b48ac74ac32ee14b32d1466d2b96179aa6d90cf036c0412ad239f46f65437 |
| MD5 | 0c68e2e96a872073d1d18df42fd4bb99 |
| BLAKE2b-256 | 1e5f9cbda0dfdddd865d21b211dba22af868aeec3f05f37ee8a54c93a94d3755 |
File details
Details for the file train_in_silence-0.1.0-py3-none-any.whl.
File metadata
- Download URL: train_in_silence-0.1.0-py3-none-any.whl
- Upload date:
- Size: 14.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e26d26a4d3c89a6e63cc2d79a4eec205629034a15477e2fef15bf1be88b23785 |
| MD5 | d4ab9b0a96af5be758aa40b9a1de0ecd |
| BLAKE2b-256 | 24c27ae63125ed4964994d49d61ce142d7fb9da51432e1a74153ea4aa2ad8541 |