
Moon, Make MOO great again

Project description

Moon: A Standardized/Flexible Framework for MultiObjective OptimizatioN


Moon: A Multiobjective Optimization Framework

Introduction

Moon is a multiobjective optimization (MOO) framework that covers the spectrum from single-objective to multiobjective optimization. It aims to deepen the understanding of optimization problems and to enable fair comparisons between MOO algorithms.

"I raise my cup to invite the moon.
With my shadow we become three from one."
-- Li Bai

Main Contributors

  • Xiaoyuan Zhang (Project Leader)
  • Ji Cheng
  • Liao Zhao
  • Weiduo Liao
  • Zhe Zhao
  • Xi Lin
  • Cheng Gong
  • Longcan Chen
  • YingYing Yu

Advisory Board

  • Prof. Yifan Chen (Hong Kong Baptist University)
  • Prof. Zhichao Lu (City University of Hong Kong)
  • Prof. Ke Shang (Shenzhen University)
  • Prof. Tao Qin (Microsoft Research)
  • Prof. Han Zhao (University of Illinois at Urbana-Champaign)

Correspondence

For any inquiries, please contact Prof. Qingfu Zhang (City University of Hong Kong) at the corresponding address.

Resources

For more information on methodologies, please visit our GitHub repository. Contributions and stars are welcome!

(1) A standardized gradient-based framework.

Optimization Problem Classes

Problem Class Details

For more information on problem specifics, please refer to the Readme_problem.md file.

Synthetic Problems

Here's a list of synthetic problems along with relevant research papers and project/code links:

Problem | Paper | Project/Code
ZDT     | Paper | Project
DTLZ    | Paper | Project
MAF     | Paper | Project
WFG     | Paper | Code
Fi's    | Paper | Code
RE      | Paper | Code
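
As a concrete illustration of one of these benchmarks, below is a minimal NumPy sketch of ZDT1 written from its standard definition; it does not use the libmoon problem classes, and the random candidates are only for demonstration.

import numpy as np

def zdt1(x):
    # ZDT1 (minimization), x in [0, 1]^n; the Pareto front satisfies f2 = 1 - sqrt(f1).
    f1 = x[0]
    g = 1.0 + 9.0 * np.sum(x[1:]) / (len(x) - 1)
    f2 = g * (1.0 - np.sqrt(f1 / g))
    return np.array([f1, f2])

# Evaluate a few random candidates in [0, 1]^30.
rng = np.random.default_rng(0)
for x in rng.random((3, 30)):
    print(zdt1(x))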

Multitask Learning Problems

This section details problems related to multitask learning, along with their corresponding papers and project/code references:

Problem                    | Paper       | Project/Code
MO-MNISTs                  | PMTL        | COSMOS
Fairness Classification   | COSMOS      | COSMOS
Federated Learning         | Federal MTL | COSMOS
Synthetic (DST, FTS, ...)  | Envelop     | Project
Robotics (MO-MuJoCo, ...)  | PGMORL      | Code
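
To make concrete how such multitask problems are handed to a gradient-based MOO solver, here is a minimal, generic PyTorch sketch of a two-task network (shared encoder, one head per task) whose per-task losses form the objective vector; the architecture, dimensions, and data are made up for the illustration and are not libmoon's MTL code.

import torch
import torch.nn as nn

# Hypothetical two-task network: a shared encoder with one head per task.
# The vector of per-task losses is what a gradient-based MOO solver consumes.
class TwoTaskNet(nn.Module):
    def __init__(self, in_dim=784, hidden=64, n_classes=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.head1 = nn.Linear(hidden, n_classes)   # task 1 (e.g., first digit)
        self.head2 = nn.Linear(hidden, n_classes)   # task 2 (e.g., second digit)

    def forward(self, x):
        z = self.encoder(x)
        return self.head1(z), self.head2(z)

if __name__ == '__main__':
    net = TwoTaskNet()
    x = torch.randn(32, 784)                 # dummy batch standing in for image data
    y1 = torch.randint(0, 10, (32,))         # dummy labels for task 1
    y2 = torch.randint(0, 10, (32,))         # dummy labels for task 2
    out1, out2 = net(x)
    loss_vec = torch.stack([nn.functional.cross_entropy(out1, y1),
                            nn.functional.cross_entropy(out2, y2)])
    print(loss_vec)                          # the m = 2 objective vector passed to an MOO solver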
  • Gradient-based Solvers.

    Method | Property | #Obj | Support | Published | Complexity
    EPO (code) | Exact solution. | Any | Y | ICML 2020 | $O(m^2 n K)$
    COSMOS (code) | Approximated exact solution. | Any | Y | ICDM 2021 | $O(m n K)$
    MOO-SVGD (code) | A set of diverse Pareto solutions. | Any | Y | NeurIPS 2021 | $O(m^2 n K^2)$
    MGDA (code) | Arbitrary Pareto solutions; location strongly affected by initialization. | Any | Y | NeurIPS 2018 | $O(m^2 n K)$
    PMTL (code) | Pareto solutions in sectors. | 2 (3 is difficult) | Y | NeurIPS 2019 | $O(m^2 n K^2)$
    PMGDA | Pareto solutions satisfying any preference. | Any | Y | Under review | $O(m^2 n K)$
    GradientHV (WangHao, code) | Gradient-based hypervolume (HV) method. | 2/3 | Y | CEC 2023 | $O(m^2 n K^2)$
    Aggregation-function based (e.g., Tche, mTche, LS, PBI, ...) | Pareto solutions via aggregation functions. | Any | Y | |

    Here, $m$ is the number of objectives, $K$ is the number of solutions (samples), and $n$ is the number of decision variables. For neural-network-based methods, $n$ is the number of parameters and is therefore very large (>10,000); $K$ is also fairly large (e.g., 20-50), while $m$ is small (e.g., 2-4).

    As a result, a factor of $m^2$ is harmless, while any factor of $n^2$ or $K^2$ dominates the cost. For instance, with $m = 3$, $n = 10^6$, and $K = 20$, $O(m^2 n K)$ is on the order of $1.8 \times 10^8$ operations per iteration, whereas $O(m^2 n K^2)$ is on the order of $3.6 \times 10^9$.

    The time complexity of the gradient-based methods falls into four tiers (in increasing order of cost); a minimal sketch of the Tier-1 aggregation update appears at the end of this subsection.

      • Tier 1: GradAggSolver
      • Tier 2: MGDASolver, EPOSolver, PMTLSolver
      • Tier 3: GradHVSolver
      • Tier 4: MOOSVGDSolver

    Current support: GradAggSolver, MGDASolver, EPOSolver, MOOSVGDSolver, GradHVSolver, PMTLSolver.

    Important note: the original MOO-SVGD code does not provide an MTL implementation. Our code is the first open-source implementation of MOO-SVGD for MTL.
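
    As a concrete illustration of the cheapest tier (the aggregation-based row, e.g. Tche), here is a minimal PyTorch sketch of a Tchebycheff-aggregation update, written independently of the libmoon API; the toy problem, preference vector, and ideal point are made up for the example.

import torch

# Toy bi-objective problem (not a libmoon problem); its Pareto set lies between x = 1 and x = -1.
def objectives(x):
    f1 = torch.sum((x - 1.0) ** 2)
    f2 = torch.sum((x + 1.0) ** 2)
    return torch.stack([f1, f2])

pref = torch.tensor([0.3, 0.7])      # preference weights (lambda), assumed normalized
z_ideal = torch.zeros(2)             # ideal point z*, assumed known for the toy problem
x = torch.zeros(5, requires_grad=True)
opt = torch.optim.SGD([x], lr=1e-2)

for _ in range(1000):
    opt.zero_grad()
    f = objectives(x)
    # Tchebycheff scalarization: g(x | lambda, z*) = max_i lambda_i * (f_i(x) - z*_i)
    tche = torch.max(pref * (f - z_ideal))
    tche.backward()                  # one backward pass per solution: the cheap aggregation tier
    opt.step()

print(objectives(x).detach())        # a Pareto-optimal trade-off steered by the preference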

Supported Solvers

Current Support

Libmoon includes a variety of solvers tailored for different needs:

  • GradAggSolver
  • MGDASolver
  • EPOSolver
  • MOOSVGDSolver (*)
  • GradHVSolver
  • PMTLSolver

(*) The original MOO-SVGD code does not include an implementation for Multitask Learning (MTL). Our release of MOO-SVGD is the first open-source code that supports MTL.
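
For intuition about what MGDASolver computes at each step, the sketch below implements the classic two-objective min-norm combination of gradients (the closed-form special case of MGDA); it is a generic illustration, not libmoon's internal implementation.

import torch

def mgda_two_obj_direction(g1, g2):
    # Solve min_alpha || alpha * g1 + (1 - alpha) * g2 ||^2 over alpha in [0, 1].
    # Closed form: alpha* = ((g2 - g1) . g2) / ||g1 - g2||^2, clipped to [0, 1].
    diff = g1 - g2
    denom = torch.dot(diff, diff).clamp_min(1e-12)
    alpha = torch.clamp(torch.dot(g2 - g1, g2) / denom, 0.0, 1.0)
    return alpha * g1 + (1.0 - alpha) * g2   # a common descent direction for both objectives

# Example with random vectors standing in for the two per-objective gradients.
g1, g2 = torch.randn(1000), torch.randn(1000)
d = mgda_two_obj_direction(g1, g2)
print(torch.dot(d, g1) >= -1e-6, torch.dot(d, g2) >= -1e-6)  # non-negative inner products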

PSL (Pareto set learning) Solvers

Libmoon supports various models of PSL solvers, categorized as follows:

  • EPO-based PSL model
  • Agg-based PSL model
  • Hypernetwork-based PSL model
  • ConditionalNet-based PSL model
  • Simple PSL model
  • Generative PSL model
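
To make the Pareto-set-learning idea concrete, here is a minimal, generic sketch (independent of libmoon's PSL model classes, roughly in the spirit of the "Simple PSL model"): an MLP maps a sampled preference vector to a decision vector and is trained with a Tchebycheff loss, so a single network represents the whole Pareto set. The toy problem and network sizes are made up for the illustration.

import torch
import torch.nn as nn

# Hypothetical "simple PSL model": preference in, decision variables out.
class SimplePSLModel(nn.Module):
    def __init__(self, n_obj=2, n_var=5, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_obj, hidden), nn.ReLU(),
                                 nn.Linear(hidden, n_var))

    def forward(self, pref):
        return self.net(pref)

def objectives(x):
    # Toy bi-objective problem used only for this illustration.
    f1 = torch.sum((x - 1.0) ** 2, dim=-1)
    f2 = torch.sum((x + 1.0) ** 2, dim=-1)
    return torch.stack([f1, f2], dim=-1)

model = SimplePSLModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for _ in range(2000):
    opt.zero_grad()
    prefs = torch.rand(32, 2)
    prefs = prefs / prefs.sum(dim=1, keepdim=True)    # sample preferences on the simplex
    x = model(prefs)
    f = objectives(x)
    loss = torch.max(prefs * f, dim=1).values.mean()  # Tchebycheff loss, averaged over preferences
    loss.backward()
    opt.step()

# After training, any preference maps to (approximately) its Pareto-optimal solution.
print(model(torch.tensor([[0.5, 0.5]])))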

MOEA/D Framework

Currently Supported

Upcoming Releases

ML Pretrained Methods

  • HV Net, a pretrained model for hypervolume (HV) approximation, available here.
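
For reference, the quantity that HV Net approximates can be computed exactly in the two-objective case by sorting the non-dominated points and summing rectangles; the sketch below (independent of HV Net and libmoon) shows that exact baseline for minimization problems.

import numpy as np

def hypervolume_2d(points, ref):
    # Exact 2-objective hypervolume (minimization): area dominated by `points`
    # and bounded by the reference point `ref`.
    pts = np.asarray([p for p in points if p[0] < ref[0] and p[1] < ref[1]], dtype=float)
    if len(pts) == 0:
        return 0.0
    pts = pts[np.argsort(pts[:, 0])]           # sort by f1 ascending
    hv, best_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < best_f2:                       # skip dominated points
            hv += (ref[0] - f1) * (best_f2 - f2)
            best_f2 = f2
    return hv

print(hypervolume_2d([[1.0, 3.0], [2.0, 2.0], [3.0, 1.0]], ref=[4.0, 4.0]))  # -> 6.0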

Installation

Libmoon is available on PyPI. You can install it using pip:

pip install libmoon==0.1.11


Example code for a synthetic problem:

from libmoon.solver.gradient import GradAggSolver
from libmoon.util_global.constant import problem_dict
from libmoon.util_global.weight_factor.funs import uniform_pref
from libmoon.visulization.view_res import vedio_res

import torch
import numpy as np
from matplotlib import pyplot as plt
import argparse

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='example')
    parser.add_argument('--n-partition', type=int, default=10)
    # If the solver is 'agg', choose a specific aggregation function.
    parser.add_argument('--agg', type=str, default='tche')
    parser.add_argument('--solver', type=str, default='agg')
    parser.add_argument('--problem-name', type=str, default='VLMOP2')
    parser.add_argument('--iter', type=int, default=1000)
    parser.add_argument('--step-size', type=float, default=1e-2)
    parser.add_argument('--tol', type=float, default=1e-6)
    args = parser.parse_args()

    # Initialize the solver, the problem, and the preference vectors.
    solver = GradAggSolver(args.step_size, args.iter, args.tol)
    problem = problem_dict[args.problem_name]
    prefs = uniform_pref(args.n_partition, problem.n_obj, clip_eps=1e-2)
    args.n_prob = len(prefs)

    # Initialize the starting solutions (one per preference), respecting the variable bounds.
    if 'lbound' in dir(problem):
        if args.problem_name == 'VLMOP1':
            x0 = torch.rand(args.n_prob, problem.n_var) * 2 / np.sqrt(problem.n_var) - 1 / np.sqrt(problem.n_var)
        else:
            x0 = torch.rand(args.n_prob, problem.n_var)
    else:
        x0 = torch.rand(args.n_prob, problem.n_var) * 20 - 10

    # Solve.
    res = solver.solve(problem, x=x0, prefs=prefs, args=args)

    # Visualize the resulting objective vectors.
    y_arr = res['y']
    plt.scatter(y_arr[:, 0], y_arr[:, 1], s=50)
    plt.xlabel('$f_1$', fontsize=20)
    plt.ylabel('$f_2$', fontsize=20)
    plt.show()

    # Optionally render a video of the optimization process.
    use_vedio = True
    if use_vedio:
        vedio_res(res, problem, prefs, args)
Example of MTL

    

