No more excuses for bad models. >70 Optimizers. All explained. All pre-configured.

🔥 pyrodigy 🔥

ATTENTION - WORK IN PROGRESS - TEST DEPLOYMENT - NOT COMPLETE

pyrodigy is a Python wrapper around more than 70 optimizers from pytorch_optimizer, along with some additional custom optimizers. Designed for flexibility, pyrodigy offers easy configuration management, history tracking, and a CLI for convenience.

Features

  • Access to 70+ Optimizers: Use a variety of optimizers, from well-known ones to niche algorithms.
  • Config Management: View, add, set, or remove optimizer configurations directly from the CLI.
  • History Tracking: Track optimizer instantiations with detailed history, including timestamps, parameters, and caller information.
  • Customizable TTL: Automatically clear history entries older than a specified time-to-live (TTL).
  • Rich CLI Interface: Manage configurations, view documentation, and explore history—all from the command line.

Installation

With pip

pip install pyrodigy

or

Clone the repo and install pyrodigy using Poetry:

poetry install

Dependencies

Note: pyrodigy requires PyTorch to be installed separately. Install the build matching your environment (CPU or GPU) and operating system; see the PyTorch installation guide for instructions.

Usage

CLI Commands

The CLI commands allow you to list optimizers, manage configurations, and handle history entries.

List Available Optimizers

Displays the list of optimizers for which a wrapper exists:

pyrodigy list

Show Optimizer Documentation

Prints the Markdown documentation for the specified optimizer:

pyrodigy show <optimizer_name>

Configuration Management

Manage optimizer configurations using the config command with get, set, add, and rm actions.

  • View Configuration:

    pyrodigy config <optimizer_name> get
    
  • Set Configuration: Update an existing configuration with new values (JSON format).

    pyrodigy config <optimizer_name> set '{"default": {"lr": 0.01, "beta": 0.9}}'
    
  • Add New Configuration: Add a new named configuration (JSON format).

    pyrodigy config <optimizer_name> add <config_name> '{"lr": 0.02, "beta": 0.95}'
    
  • Remove Configuration: Remove a named configuration.

    pyrodigy config <optimizer_name> rm <config_name>
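
Taken together, these commands imply a per-optimizer configuration store keyed by configuration name. An illustrative layout, built from the values in the examples above (the `my_config` name and the on-disk format are assumptions, not pyrodigy's actual file):

```json
{
  "default": { "lr": 0.01, "beta": 0.9 },
  "my_config": { "lr": 0.02, "beta": 0.95 }
}
```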
    

History Management

Each time an optimizer is instantiated, an entry is created in its history. You can review or clear history and apply a TTL to automatically remove old entries.

  • Show History: View the history for an optimizer. Specify a TTL to filter entries within a certain timeframe.

    pyrodigy history <optimizer_name> show --TTL 30d
    

    Example with TTL:

    pyrodigy history a2grad show --TTL 60d
    
  • Clear History: Remove all history entries for the optimizer.

    pyrodigy history <optimizer_name> clear
    

Example: Using pyrodigy in Code

General usage

Instantiate an optimizer with pyrodigy’s OptimizerWrapper, which logs the creation details to the optimizer's history.

import torch.nn as nn

from pyrodigy import OptimizerWrapper

# Any torch module works here; a small linear layer keeps the example minimal
model = nn.Linear(10, 2)

# Define model parameters and optimizer configuration
params = model.parameters()
optimizer_name = "AdamP"
config_name = "default"
lr = 0.001

# Initialize the optimizer
optimizer = OptimizerWrapper(params, optimizer_name=optimizer_name, config_name=config_name, lr=lr)

Integrating with Kohya's training framework

pyrodigy can also be dropped into Kohya's training scripts by adding a branch to the optimizer selection logic:

# get optimizer in train_utils.py

if optimizer_type.lower().startswith("pyro-wrapper"):
    try:
        from pyrodigy import OptimizerWrapper

        # Pop the wrapper-specific keys so they are not forwarded again below
        optimizer_name = optimizer_kwargs.pop("id", "adabelief")
        optimizer_default_config = optimizer_kwargs.pop("cfg", "low_memory")

        # Initialize the optimizer through the wrapper
        optimizer = OptimizerWrapper(
            trainable_params,
            optimizer_name=optimizer_name,
            config_name=optimizer_default_config,
            lr=lr,
            **optimizer_kwargs,
        )

    except ImportError:
        raise ImportError(
            "pyrodigy is not installed; install it with 'pip install pyrodigy'"
        )
    except Exception as e:
        logger.error(
            "An error occurred while loading the optimizer:", exc_info=True
        )
        raise RuntimeError(
            f"Failed to initialize optimizer '{optimizer_name}' with type '{optimizer_type}'. "
            "Please check the optimizer name, configuration, and installation."
        ) from e

# ...

Then start Kohya's trainer with these optimizer parameters (shown with PowerShell backtick line continuations):

--optimizer_type=pyro-wrapper `
--optimizer_args "id=adabelief" "cfg=low_memory" `
--learning_rate 1e-4 `
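
Each `key=value` token passed to `--optimizer_args` ends up as an entry in `optimizer_kwargs`, which is how the `id` and `cfg` values above reach the wrapper branch. A minimal sketch of that parsing step (illustrative, not Kohya's exact code):

```python
def parse_optimizer_args(args):
    """Turn ['id=adabelief', 'cfg=low_memory'] into a keyword dict."""
    kwargs = {}
    for arg in args:
        key, _, value = arg.partition("=")  # split on the first '=' only
        kwargs[key.strip()] = value.strip()
    return kwargs
```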

History Entries

Every time you create an optimizer instance, the following details are saved:

  • Optimizer Name: The name of the optimizer.
  • Config Name: The configuration used, if provided.
  • Parameters: Any additional parameters such as learning rate.
  • Caller Information: File, line number, and function name where the optimizer was instantiated.
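
The caller information in such an entry can be captured with Python's standard `inspect` module. A minimal sketch of the idea (`caller_info` and `record_instantiation` are illustrative names, not pyrodigy's API):

```python
import inspect

def caller_info():
    """Return file, line number, and function name of the calling user code."""
    frame = inspect.stack()[2]  # [0] = this function, [1] = wrapper, [2] = user code
    return {"file": frame.filename, "line": frame.lineno, "function": frame.function}

def record_instantiation(optimizer_name, config_name=None, **params):
    """Build a history entry with the fields described above."""
    return {
        "optimizer": optimizer_name,
        "config": config_name,
        "parameters": params,
        "caller": caller_info(),
    }
```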

License

Licensed under the Apache License 2.0. See the LICENSE file for details.

Contributing

Contributions are welcome! If you find any issues or have suggestions, feel free to open an issue or submit a pull request.

Support

For questions or support, please open an issue on GitHub.
