llama-distributed

Distributed task execution and coordination for LlamaAI workloads.

Llama Distributed (llama-distributed) is a toolkit within the LlamaSearch AI ecosystem for distributing packaged applications and tasks across multiple nodes or environments. It aids in packaging applications and managing their deployment and execution in a distributed setting.
Key Features
- Package Distribution: Core logic for packaging and distributing Python applications or tasks (`package.py`).
- Deployment Management (Potential): May include tools for deploying packages to target environments.
- Task Execution (Potential): Could support running distributed tasks or parallel processing.
- Core Module: Manages the distribution process (`core.py`).
- Configurable: Allows specifying target environments, package details, and distribution methods (`config.py`).
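As a rough illustration of what the packaging step involves, the sketch below bundles an application directory into a tarball using only the standard library. The `package_app` helper is hypothetical and not part of llama-distributed's API:

```python
# Hypothetical sketch of a packaging step (illustrative, not the actual
# package.py implementation): bundle an app directory into a .tar.gz archive.
import shutil
from pathlib import Path

def package_app(app_dir: str, out_dir: str) -> Path:
    """Create a gzip-compressed tarball of app_dir inside out_dir."""
    app = Path(app_dir)
    archive = shutil.make_archive(
        base_name=str(Path(out_dir) / app.name),  # archive path minus extension
        format="gztar",                           # produces <name>.tar.gz
        root_dir=app.parent,                      # paths in the archive are
        base_dir=app.name,                        # relative to the app directory
    )
    return Path(archive)
```

The resulting archive is a self-contained artifact that can then be copied to each target node.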
Installation
```bash
pip install llama-distributed

# Or install directly from GitHub for the latest version:
pip install git+https://github.com/llamasearchai/llama-distributed.git
```
Usage
(Full usage documentation for packaging and distributing applications or tasks will be added here. The example below sketches the intended client API and is illustrative only.)

```python
# Illustrative sketch of the intended client API; names may change.
from llama_distributed import Distributor, PackageConfig

# Load distribution settings (targets, credentials, methods) from YAML
config = PackageConfig.load("config.yaml")
distributor = Distributor(config)

# Define the package or task to distribute
package_path = "/path/to/my_app"
target_nodes = ["node1.example.com", "node2.example.com"]

# Distribute the package and start execution on each node
distribution_job = distributor.distribute(
    package_path=package_path,
    targets=target_nodes,
    options={"run_command": "python main.py"},
)
print(f"Distribution job started: {distribution_job.id}")
```
Architecture Overview
```mermaid
graph TD
    A[User / Build System] --> B{"Core Distributor (core.py)"};
    B --> C{"Packaging Logic (package.py)"};
    C --> D[Packaged Application / Task];
    B -- Uses --> E{"Deployment / Execution Interface"};
    E -- Deploys/Runs on --> F[Target Node 1];
    E -- Deploys/Runs on --> G[Target Node 2];
    E -- Deploys/Runs on --> H[...];
    I["Configuration (config.py)"] -- Configures --> B;
    I -- Configures --> C;
    I -- Configures --> E;
    style B fill:#f9f,stroke:#333,stroke-width:2px
    style F fill:#ccf,stroke:#333,stroke-width:1px
    style G fill:#ccf,stroke:#333,stroke-width:1px
    style H fill:#ccf,stroke:#333,stroke-width:1px
```
- Input: User or a build system triggers the distribution process.
- Core Distributor: Manages the workflow based on configuration.
- Packaging: The application or task is packaged for distribution.
- Deployment/Execution: The packaged artifact is sent to target nodes and potentially executed.
- Targets: The remote machines or environments where the package is distributed.
- Configuration: Defines the package source, target nodes, deployment methods, execution commands, etc.
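The fan-out step described above can be sketched as follows. This is an illustrative sketch, not llama-distributed's actual internals; `distribute` and the injected `deploy` callable are hypothetical names:

```python
# Hypothetical sketch of the Core Distributor's fan-out step: send an
# artifact to each target in parallel and record per-target success.
from concurrent.futures import ThreadPoolExecutor

def distribute(artifact: str, targets: list[str], deploy) -> dict[str, bool]:
    """Call deploy(artifact, target) for each target concurrently.

    Returns a map of target -> True (deployed) or False (deploy raised).
    """
    results: dict[str, bool] = {}
    with ThreadPoolExecutor(max_workers=max(1, len(targets))) as pool:
        futures = {pool.submit(deploy, artifact, t): t for t in targets}
        for fut, target in futures.items():
            try:
                fut.result()  # re-raises any exception from deploy()
                results[target] = True
            except Exception:
                results[target] = False
    return results
```

A real implementation would also handle retries, credentials, and the post-deployment `run_command`, all driven by the configuration.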
Configuration
(Details on configuring source packages, target node addresses and credentials, distribution protocols (e.g., SSH), post-deployment commands, and related options will be added here.)
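A configuration file covering those areas might look like the following. This is a hypothetical sketch: the field names are illustrative, not the actual schema defined in `config.py`.

```yaml
# Hypothetical config.yaml layout (illustrative field names only)
package:
  source: /path/to/my_app
  format: tar.gz
targets:
  - host: node1.example.com
    user: deploy
  - host: node2.example.com
    user: deploy
distribution:
  protocol: ssh
  post_deploy: python main.py
```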
Development
Setup
```bash
# Clone the repository
git clone https://github.com/llamasearchai/llama-distributed.git
cd llama-distributed

# Install in editable mode with development dependencies
pip install -e ".[dev]"
```
Testing
```bash
pytest tests/
```
Contributing
Contributions are welcome! Please refer to CONTRIBUTING.md and submit a Pull Request.
License
This project is licensed under the MIT License - see the LICENSE file for details.
File details
Details for the file llama_distributed-0.1.0.tar.gz.
File metadata
- Download URL: llama_distributed-0.1.0.tar.gz
- Upload date:
- Size: 25.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `69c7282da65341aacd6223b683239eb8aea33e6b06025fbaa4e51f02deb51d4d` |
| MD5 | `9c277153486ce9252504e9f9b2f2cd2a` |
| BLAKE2b-256 | `017fcfe7628ed20f8d6b9f000349592b35d01f9feb11a83e9d6d88fe2b86db6e` |
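To confirm the integrity of a downloaded archive, you can compare its SHA256 digest against the published value using only the Python standard library (the `sha256_of` helper below is an illustrative name, not part of any package):

```python
# Compute the SHA256 digest of a file in chunks, suitable for verifying
# a downloaded sdist or wheel against its published hash.
import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "69c7282da65341aacd6223b683239eb8aea33e6b06025fbaa4e51f02deb51d4d"
# assert sha256_of("llama_distributed-0.1.0.tar.gz") == expected
```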
File details
Details for the file llama_distributed-0.1.0-py3-none-any.whl.
File metadata
- Download URL: llama_distributed-0.1.0-py3-none-any.whl
- Upload date:
- Size: 15.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `4e83fbcf0f8ffcee1b5e7f8abad5dc53ed3994137d8cfab177850e793731c84e` |
| MD5 | `5fd8a3be1b51cc550f0086843f6611c8` |
| BLAKE2b-256 | `e51478c0877b257ccc981e5e21842e2870904bbb7852d9f76430239468c47f0d` |