Simple task manager and job queue for distributed rendering. Built on Celery and Redis.
A super simple way to distribute rendering tasks across multiple machines.
Description
Distributaur uses Celery to create a task queue, with Redis as the message broker. Each task wraps a registered function and is dispatched to a worker; once the task completes, the worker uploads the result to a Hugging Face dataset.
Installation
pip install distributaur
Development
Setup
Clone the repository and navigate to the project directory:
git clone https://github.com/RaccoonResearch/distributaur.git
cd distributaur
Install the required packages:
pip install -r requirements.txt
Install the distributaur package:
python setup.py install
Configuration
Create a .env file in the root directory of your project, or set environment variables to match your setup:
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_USER=user
REDIS_PASSWORD=password
VAST_API_KEY=your_vast_api_key
HF_TOKEN=hf_token
HF_REPO_ID=YourHFRepo/test_dataset
BROKER_POOL_LIMIT=your_broker_pool_limit
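If you use a .env file, the settings above need to be read into the environment at startup. The sketch below is a minimal stand-in loader so the file format is concrete; a real project would more likely use python-dotenv or Distributaur's own configuration loading.

```python
import os

def load_dotenv(path=".env"):
    """Minimal .env loader (illustration only): read KEY=VALUE lines
    into os.environ, skipping blanks and comments, without overriding
    variables that are already set."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```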
Getting Started
Running an Example Task
To run an example task and see Distributaur in action, you can execute the example script provided in the project:
# To run the example task locally using either a Docker container or a celery worker
python -m distributaur.example.local
# To run the example task on VAST.ai ("kitchen sink" example)
python -m distributaur.example.distributed
This script configures the environment, registers a sample function, dispatches a task, and monitors its execution.
API Reference
Core Functionality
- register_function(func: callable) -> callable: Decorator to register a function so that it can be invoked as a task.
- execute_function(func_name: str, args: dict) -> Celery.AsyncResult: Execute a registered function as a Celery task with provided arguments.
Configuration Management
- get_env(key: str, default: any = None) -> any: Retrieve a value from the configuration settings, with an optional default if the key is not found.
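A plausible implementation of get_env in terms of os.environ is shown below; the actual function may also consult Distributaur's own settings store, so treat this as a sketch of the contract, not the implementation.

```python
import os

def get_env(key, default=None):
    """Return a configuration value from the environment, falling back
    to the provided default when the key is unset."""
    return os.environ.get(key, default)

# Example: read a setting from the .env configuration, with a fallback.
pool_limit = int(get_env("BROKER_POOL_LIMIT", 10))
```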
Task Management
- update_function_status(task_id: str, status: str) -> None: Update the status of a function task in Redis.
Hugging Face Dataset Management
- initialize_dataset(**kwargs) -> None: Initialize a Hugging Face repository if it doesn't exist.
- upload_file(file_path: str) -> None: Upload a file to a Hugging Face repository.
- upload_directory(output_dir: str, repo_dir: str) -> None: Upload the rendered outputs to a Hugging Face repository.
- delete_file(repo_id: str, path_in_repo: str) -> None: Delete a file from a Hugging Face repository.
- file_exists(repo_id: str, path_in_repo: str) -> bool: Check if a file exists in a Hugging Face repository.
- list_files(repo_id: str) -> list: Get a list of files from a Hugging Face repository.
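A worker's upload step might combine these calls roughly as below. This is a guess at typical usage, with an in-memory set standing in for the Hugging Face repository so the control flow (skip files that already exist) is visible without network access; the real file_exists/upload_file call the HF Hub.

```python
# In-memory stand-in for the remote repository contents.
_repo = set()

def file_exists(repo_id, path_in_repo):
    return path_in_repo in _repo

def upload_file(file_path):
    # Real version uploads to the Hugging Face repository.
    _repo.add(file_path)

def sync_outputs(repo_id, output_files):
    """Upload each rendered output unless it already exists in the repo."""
    uploaded = []
    for path in output_files:
        if not file_exists(repo_id, path):
            upload_file(path)
            uploaded.append(path)
    return uploaded
```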
VAST.ai Integration
- search_offers(max_price: float) -> List[Dict]: Search for available offers on the Vast.ai platform.
- create_instance(offer_id: str, image: str, module_name: str) -> Dict: Create an instance on the Vast.ai platform.
- destroy_instance(instance_id: str) -> Dict: Destroy an instance on the Vast.ai platform.
- rent_nodes(max_price: float, max_nodes: int, image: str, module_name: str) -> List[Dict]: Rent nodes on the Vast.ai platform.
- terminate_nodes(nodes: List[Dict]) -> None: Terminate the rented nodes.
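One way to picture rent_nodes in terms of the lower-level calls: filter offers by price, then create up to max_nodes instances from the cheapest ones. The sketch below takes the offer list as an explicit argument (the real function fetches it via search_offers), and the offer fields are assumptions, not the actual Vast.ai schema.

```python
def rent_nodes(offers, max_price, max_nodes, image, module_name):
    """Pick the cheapest offers at or under max_price and describe the
    instances to create (stand-in for create_instance calls)."""
    affordable = sorted(
        (o for o in offers if o["price"] <= max_price),
        key=lambda o: o["price"],
    )
    nodes = []
    for offer in affordable[:max_nodes]:
        nodes.append({"offer_id": offer["id"],
                      "image": image,
                      "module": module_name})
    return nodes
```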
Contributing
Contributions are welcome! For major changes, please open an issue first to discuss what you would like to change.
License
This project is licensed under the MIT License - see the LICENSE file for details.