
Optimizers for, and bundled with, sklearn-compatible Machine Learning models


OptiML


OptiML is a sklearn-compatible implementation of Support Vector Machines and Deep Neural Networks, both equipped with some of the most successful features from the state of the art.

This work was motivated by the possibility of solving the optimization problems that arise from the mathematical formulation of these models with a wide range of optimization algorithms, studied and developed for the Numerical Methods and Optimization course of the Department of Computer Science at the University of Pisa, under the supervision of prof. Antonio Frangioni.
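To make that idea concrete, the minimal numpy sketch below (illustrative only, not OptiML's own code; every name in it is made up for the example) trains a linear Support Vector Classifier by running plain steepest (sub)gradient descent on the L2-regularized hinge-loss primal, i.e. the simplest pairing of model and optimizer listed in the contents that follow.

    import numpy as np

    def hinge_loss_and_grad(w, X, y, C=1.0):
        # Primal objective: 1/2 ||w||^2 + C * sum_i max(0, 1 - y_i * (x_i . w))
        margins = 1.0 - y * (X @ w)
        active = margins > 0
        loss = 0.5 * (w @ w) + C * np.sum(margins[active])
        # Subgradient: the hinge term contributes -y_i * x_i for every violating sample
        grad = w - C * (X[active].T @ y[active])
        return loss, grad

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.sign(X[:, 0] + X[:, 1])            # labels in {-1, +1}, linearly separable
    X = np.hstack([X, np.ones((200, 1))])     # absorb the bias term into the weights

    w = np.zeros(X.shape[1])
    for _ in range(500):                      # fixed-step steepest (sub)gradient descent
        _, grad = hinge_loss_and_grad(w, X, y)
        w -= 0.01 * grad

    loss, _ = hinge_loss_and_grad(w, X, y)
    print("final loss:", loss, "training accuracy:", np.mean(np.sign(X @ w) == y))

OptiML replaces this hard-coded loop with interchangeable optimizer objects, covering all the methods listed below.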

Contents

  • Numerical Optimization

    • Unconstrained Optimization
      • Line Search Methods
        • 1st Order Methods
          • Steepest Gradient Descent
          • Conjugate Gradient
            • Fletcher–Reeves formula
            • Polak–Ribière formula
            • Hestenes–Stiefel formula
            • Dai–Yuan formula
        • 2nd Order Methods
          • Newton
          • Quasi-Newton
            • BFGS
            • L-BFGS
      • Stochastic Methods
        • Stochastic Gradient Descent
          • Momentum
            • Polyak
            • Nesterov
        • Adam
          • Momentum
            • Polyak
            • Nesterov
        • AMSGrad
          • Momentum
            • Polyak
            • Nesterov
        • AdaMax
          • Momentum
            • Polyak
            • Nesterov
        • AdaGrad
        • AdaDelta
        • RMSProp
          • Momentum
            • Polyak
            • Nesterov
        • Schedules
          • Step size
            • Decaying
            • Linear Annealing
            • Repeater
          • Momentum
            • Sutskever Blend
      • Proximal Bundle with cvxpy interface to ecos, osqp, scs, etc.
    • Constrained Quadratic Optimization
      • Box-Constrained Quadratic Methods
        • Projected Gradient (see the numpy sketch after this contents list)
        • Frank-Wolfe or Conditional Gradient
        • Active Set
        • Interior Point
      • Lagrangian Dual
      • Augmented Lagrangian Dual
  • Machine Learning

    • Support Vector Machines
      • Formulations
        • Primal
        • Wolfe Dual
        • Lagrangian Dual
      • Support Vector Classifier
        • Losses
          • Hinge (L1 Loss): l1_svc_loss
          • Squared Hinge (L2 Loss): l2_svc_loss
      • Support Vector Regression
        • Losses
          • Epsilon-insensitive (L1 Loss): l1_svr_loss
          • Squared Epsilon-insensitive (L2 Loss): l2_svr_loss
      • Kernels
        • Linear (SVC and SVR hyperplane plots)
        • Polynomial (SVC and SVR hyperplane plots)
        • Gaussian (SVC and SVR hyperplane plots)
        • Laplacian (SVC and SVR hyperplane plots)
        • Sigmoid

      • Optimizers (ad hoc)
    • Neural Networks
      • Neural Network Classifier
      • Neural Network Regressor
      • Losses
        • Mean Absolute Error (L1 Loss)
        • Mean Squared Error (L2 Loss)
        • Binary Cross Entropy
        • Categorical Cross Entropy
        • Sparse Categorical Cross Entropy
      • Regularizers
        • L1 or Lasso
        • L2 or Ridge or Tikhonov
      • Activations
        • Linear
        • Sigmoid
        • Tanh
        • ReLU
        • SoftMax
      • Layers
        • Fully Connected
      • Initializers
        • Xavier or Glorot (normal and uniform)
        • He (normal and uniform)
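
As a concrete example of the constrained branch above, the sketch below sets up the Wolfe dual of an L1-loss SVC as a box-constrained quadratic program (assuming the bias is absorbed into the kernel, so the usual equality constraint disappears) and solves it with a fixed-step Projected Gradient method. It is plain numpy written for illustration only, not the package's own solver.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))
    y = np.sign(X[:, 0] - X[:, 1])            # labels in {-1, +1}

    C = 1.0
    K = X @ X.T + 1.0                         # linear kernel with the bias folded in
    Q = (y[:, None] * y[None, :]) * K         # Q_ij = y_i * y_j * K(x_i, x_j)

    alpha = np.zeros(len(y))
    step = 1.0 / np.linalg.norm(Q, 2)         # 1/L, with L the largest eigenvalue of the PSD matrix Q
    for _ in range(1000):
        grad = Q @ alpha - 1.0                # gradient of 1/2 a^T Q a - 1^T a
        alpha = np.clip(alpha - step * grad, 0.0, C)   # project back onto the box [0, C]^n

    decision = (alpha * y) @ K                # f(x_j) = sum_i alpha_i y_i K(x_i, x_j)
    print("training accuracy:", np.mean(np.sign(decision) == y))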

Install

pip install optiml
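
A minimal quickstart sketch follows. The optiml import path and default constructor are assumptions made for illustration (the actual module and class names may differ; see the project README on GitHub), but because the estimators are sklearn compatible they can be passed directly to standard scikit-learn utilities such as cross_val_score.

    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score

    from optiml.ml.svm import SVC   # assumed import path: check the project README

    X, y = make_classification(n_samples=200, random_state=0)
    scores = cross_val_score(SVC(), X, y, cv=5)   # sklearn tooling accepts the estimator as-is
    print(scores.mean())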

License

This software is released under the MIT License. See the LICENSE file for details.

