# local-AI-infra-generation

A tool to generate infrastructure files (Dockerfile, etc.) for codebases using local LLMs.
local-AI-infra-generation leverages local Large Language Models (LLMs) to analyze code repositories and automatically generate infrastructure files such as Dockerfiles and docker-compose.yml. All processing is performed locally, ensuring privacy and control over your codebase.
## Features
- Codebase Embedding: Index and embed your codebase for semantic search and retrieval.
- Natural Language Q&A: Ask questions about your codebase and receive context-aware answers.
- Automated Infrastructure Generation: Generate Dockerfiles and docker-compose.yml files tailored to your project.
- Multi-language Support: Works with Python, JavaScript, TypeScript, and Go projects.
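The idea behind embedding-based Q&A can be illustrated with a minimal retrieval sketch. This is not the tool's actual implementation (real embeddings come from an LLM and are stored in ChromaDB); it only shows how cosine similarity ranks code chunks against a query vector, using toy 3-dimensional embeddings:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query_vec: list[float], indexed: list[tuple[str, list[float]]], top_k: int = 2) -> list[str]:
    """Return the ids of the top_k chunks most similar to the query vector."""
    ranked = sorted(indexed, key=lambda item: cosine_similarity(query_vec, item[1]), reverse=True)
    return [chunk_id for chunk_id, _ in ranked[:top_k]]

# Toy index: chunk id -> embedding (real embeddings have hundreds of dimensions)
index = [
    ("auth.py:login", [0.9, 0.1, 0.0]),
    ("db.py:connect", [0.1, 0.8, 0.1]),
    ("main.py:cli", [0.2, 0.2, 0.9]),
]
print(retrieve([1.0, 0.0, 0.1], index, top_k=1))  # -> ['auth.py:login']
```

The retrieved chunks are then passed to the local LLM as context, which is how context-aware answers are produced without the code ever leaving your machine.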
## Getting Started
### 1. Prerequisites

- **Python 3.11+**: Ensure you have Python 3.11 or higher installed. Check with `python --version`.
- **Ollama**: Download and install Ollama for local LLM inference.
- **C/C++ Build Tools**: Required for building `tree-sitter-languages`.
### 2. Installation

Create and activate a virtual environment:

```bash
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
```

Install the package from this repository:

```bash
pip install .
```

This installs the `infra-gen` command-line tool and all necessary dependencies.
## Usage

### 1. Start Ollama

Make sure the Ollama server is running in the background:

```bash
ollama serve &
```

### 2. Run the CLI

Once installed, you can use the `infra-gen` command:

```bash
infra-gen --help
```
### Common Commands

- **Embed a Project:**

  ```bash
  infra-gen embed /path/to/your/project
  ```

- **Ask a Question:**

  ```bash
  infra-gen ask "How does authentication work?" --project your_project_name
  ```

- **List Embedded Projects:**

  ```bash
  infra-gen list
  ```

- **Generate Full Infrastructure (Dockerfile, Compose, etc.):**

  ```bash
  infra-gen generate-infra /path/to/your/project --output ./infra
  ```

- **Generate Only a Dockerfile:**

  ```bash
  infra-gen generate-docker --project your_project_name
  ```

- **Generate Only a docker-compose.yml:**

  ```bash
  infra-gen generate-compose --project your_project_name
  ```
## Configuration

The tool uses a `config.yaml` file for settings. The configuration is loaded in the following order of priority:

1. **`--config` flag**: Provide a direct path to a `.yaml` file:

   ```bash
   infra-gen --config /path/to/my-config.yaml embed /path/to/project
   ```

2. **User-level config**: Place a file at `~/.config/infra-generator/config.yaml`.
3. **Default package config**: If no other config is found, a default version bundled with the package is used.
You can customize model names, ChromaDB storage directories, Ollama URLs, and more in your custom config file.
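For example, a custom config might look like this. The key names below are guesses for illustration only; consult the bundled default config for the exact schema:

```yaml
# Hypothetical config.yaml -- key names may differ from the bundled default
llm_model: "llama3"                   # Ollama model used for generation
embedding_model: "nomic-embed-text"   # Ollama model used for embeddings
ollama_url: "http://localhost:11434"  # local Ollama server
chroma_dir: "~/.local/share/infra-generator/chroma"  # ChromaDB storage directory
```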
## Development

If you want to contribute to the development of this tool, you can install it in editable mode.

1. Clone the repository:

   ```bash
   git clone https://github.com/yourusername/local-AI-infra-generation.git
   cd local-AI-infra-generation
   ```

2. Create and activate a virtual environment:

   ```bash
   python -m venv .venv
   source .venv/bin/activate
   ```

3. Install in editable mode:

   ```bash
   pip install -e .
   ```

This allows you to make changes to the source code and have them reflected immediately when you run the `infra-gen` command.
### Running Tests
TODO: Add unit tests and instructions for running them.
## Troubleshooting

- **Ollama not found**: Ensure Ollama is installed and available in your `PATH`.
- **tree-sitter language `.so` files missing**: If you encounter errors about missing `.so` files, ensure `tree-sitter-languages` is installed and built correctly.
- **Model download issues**: The first run will download required models. Ensure you have a stable internet connection.
## TODO
- Add comprehensive unit and integration tests.
- Improve error handling and user feedback.
- Add support for more programming languages (e.g., Java, Rust).
- Enhance prompt templates for better infrastructure generation.
- Add web or GUI interface.
- Document API for programmatic usage.
- Support for private model registries and custom LLMs.
- Optimize embedding and retrieval for large codebases.
- Add CI/CD pipeline for automated testing and deployment.
## License
This project is licensed under the MIT License.
## File details

Details for the file `local_ai_infra_generation-1.1.1.tar.gz`.

### File metadata

- Download URL: local_ai_infra_generation-1.1.1.tar.gz
- Size: 25.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.13

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `c04020a831cdfef5e44a4e009e7fa217b22c4661f98843ca4304ee74fc688a6a` |
| MD5 | `446266d1773074bf8a9692ed7d2e8f63` |
| BLAKE2b-256 | `3add4fcb2cfac5522a24dae4a0488b4fba8f614eb16b787fb717e32c8e1d0d18` |
## File details

Details for the file `local_ai_infra_generation-1.1.1-py3-none-any.whl`.

### File metadata

- Download URL: local_ai_infra_generation-1.1.1-py3-none-any.whl
- Size: 28.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.13

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `49fa668484491419f1bcf57a60012b5e655adcf5f4927be0465bfc88b52b1f37` |
| MD5 | `d7034d1760be83d88cbd8500da86979f` |
| BLAKE2b-256 | `fead8156b4b4dbdf32eb88e02b9d1c453d8522071e99478ebed1a3e7cc43a458` |