Spec-driven, build-passing code.
👋 Welcome to Boot!
Your AI-powered code generator. Write a 5-line spec. Get a working codebase that passes tests and builds.
Boot uses AI to generate production-ready PySpark, Python, and SQL pipelines from simple specifications. No more boilerplate. No more setup hassle. Just describe what you want.
Install (30 seconds)
Step 1: Prerequisites
First, ensure you have the required tools installed on your system:
Python: You'll need Python 3.11 or newer. You can check your version with python --version.
Poetry: This project uses Poetry for dependency management. If you don't have it, install it using the officially recommended method:
# This command downloads and runs the official installer
curl -sSL https://install.python-poetry.org | python3 -
After installation, close and reopen your terminal, or make sure "$HOME/.local/bin" is on your PATH so the poetry command is available. You can find more details in the official Poetry documentation.
Step 2: Project Setup
Once the prerequisites are met, you can set up the project:
First, ensure you have an API key from OpenAI and/or Google Gemini.
Next, set up your environment:
# 1. Clone the repository
git clone git@github.com:renbytes/boot-code.git
cd boot-code
# 2. Install project dependencies
poetry install
# 3. Activate the virtual environment
poetry env activate
# 4. Set up your API keys
cp .env.example .env
# 5. Edit the .env file to add your keys
# OPENAI_API_KEY="sk-..."
# GEMINI_API_KEY="AI..."
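Once the keys are in place, you can sanity-check that they are visible to your environment. This is a minimal, illustrative snippet using only the standard library; it assumes the keys were exported into your shell (or loaded from `.env` by your tooling) and deliberately avoids printing their values:

```python
import os

# Report which provider keys are present, without leaking their values
for var in ("OPENAI_API_KEY", "GEMINI_API_KEY"):
    status = "set" if os.environ.get(var) else "missing"
    print(f"{var}: {status}")
```

If both report "missing", double-check that your shell actually loads the `.env` file, or export the variables manually.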
Use (2 minutes)
Run the main CLI module and follow the prompts:
# Activate the environment (on Poetry 2.x, this requires the poetry-plugin-shell plugin)
poetry shell
# View available commands
boot --help
# Example: Validate a specification file
boot validate path/to/your/spec.toml
# Example: Generate a new pipeline
boot generate path/to/your/spec.toml
That's it. Answer a few questions, and watch your pipeline appear.
What You Get
- Complete project structure with all the files you need.
- Working code that's ready to run on your data.
- Unit tests to ensure quality and reliability.
- Visualizations to help you see your results.
- Documentation so your team understands the pipeline.
Why Boot?
Instead of spending hours writing boilerplate, Boot generates:
✅ Production-ready code following best practices
✅ Complete test suites with 90%+ coverage
✅ Interactive visualizations for immediate insights
✅ Professional documentation your team will love
✅ Modern tooling (Streamlit, Black, pytest)
Performance Comparison
| Metric | Manual Coding | With Boot |
|---|---|---|
| Time to MVP | 4-8 hours | 2 minutes |
| Lines of code | 200-500 | Generated |
| Test coverage | ~60% | 90%+ |
| Documentation | Minimal | Complete |
Advanced Usage
Below is a more advanced example using several of the available flags:
boot generate examples/consumer_tech/spec.toml \
--provider gemini \
--model gemini-2.5-pro \
--api-key "AIzaSy..." \
--two-pass \
--temperature 0.1 \
--timeout 180 \
--output-dir Desktop/ecommerce_gemini_pro
Boot can also be invoked through poetry run without activating the shell, for example with the --build-pass flag:
poetry run boot generate examples/rust/my_rust_spec.toml --build-pass
Where:
- --provider: Explicitly selects the LLM provider, overriding any default or .env setting.
- --model: Specifies a particular model to use for the generation, rather than the default.
- --api-key: Provides the API key directly on the command line, which takes precedence over any key in a .env file or other environment settings.
- --two-pass: Enables the secondary review pass, where the initial code is sent back to the LLM for refinement and improvement.
- --temperature: Sets the generation temperature; a very low value makes the output more deterministic and less random.
- --timeout: Sets the API request timeout, which is useful for complex specifications that may take the model longer to process.
- --output-dir: Specifies a custom directory for the generated project files, overriding the default generated_jobs location.
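The key-resolution order described for --api-key (an explicit command-line value wins over anything in the environment or a .env file) can be sketched as follows. This is an illustration of the documented precedence, not Boot's actual implementation; the function name is hypothetical:

```python
import os

def resolve_api_key(cli_key=None, env_var="OPENAI_API_KEY"):
    """Return the API key to use: an explicit CLI flag wins over the environment."""
    if cli_key:
        return cli_key
    # Fall back to the environment (which tools typically populate from .env)
    return os.environ.get(env_var)
```

For example, `resolve_api_key("sk-cli")` ignores any exported `OPENAI_API_KEY`, while `resolve_api_key()` falls back to it.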
Examples
Explore real-world use cases in the examples/ directory:
- E-commerce - Top selling products analysis (SQL)
- Healthcare - Patient length of stay analysis (SQL)
- Finance - Stock volatility calculation (Python)
- Energy - Renewable energy production analysis (Python)
- Consumer Tech - Ad attribution pipeline (PySpark)
Supported Languages
| Language | Framework | Use Case |
|---|---|---|
| Python | pandas | Data analysis, reporting |
| PySpark | Spark | Big data, distributed computing |
| SQL | dbt-style | Data warehousing, analytics |
Development
See DEVELOPER_GUIDE.md.
Support
- 📖 Documentation: Check the docs/ directory.
- 🐛 Issues: Report bugs on GitHub Issues.
Download files
Download the file for your platform.
Source Distribution: boot_code-0.1.0.tar.gz
Built Distribution: boot_code-0.1.0-py3-none-any.whl
File details
Details for the file boot_code-0.1.0.tar.gz.
File metadata
- Download URL: boot_code-0.1.0.tar.gz
- Upload date:
- Size: 30.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.3 CPython/3.12.2 Darwin/24.2.0
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 9a3e45ce8580304e2f7357a29415adba3b191e78522c523ba1bb7e8a4896aa12 |
| MD5 | 4607626c746dad1729835e8e4d0b7305 |
| BLAKE2b-256 | c7d5e2088b3aed14c1abd9d600ab3a5bb962a94454554cf6e2586d6e2c57bd2f |
File details
Details for the file boot_code-0.1.0-py3-none-any.whl.
File metadata
- Download URL: boot_code-0.1.0-py3-none-any.whl
- Upload date:
- Size: 39.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.3 CPython/3.12.2 Darwin/24.2.0
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 918ac2a4f1d1d5b538cd1ccfc163f5e503d26d2f0e450c82b1d30178a11d04a2 |
| MD5 | 642a8f6eb116ae35c82582c0e7d18328 |
| BLAKE2b-256 | c583047dd280d59735c7c53ced821fc13696a8b0146fb5f611f1a2023bb2ade8 |
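To verify a downloaded file against the digests listed above, a small standard-library check is enough. The helper below is a sketch (the function name is ours); substitute the filename you actually downloaded:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the digest listed for the sdist above:
expected = "9a3e45ce8580304e2f7357a29415adba3b191e78522c523ba1bb7e8a4896aa12"
# sha256_of("boot_code-0.1.0.tar.gz") == expected
```

Reading in chunks keeps memory flat regardless of file size; for these small archives a single `read()` would also work.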