
Add MOEA/D sampler #152

Merged
merged 8 commits into optuna:main from feature/moea_d on Sep 5, 2024

Conversation

hrntsm
Contributor

@hrntsm hrntsm commented Sep 5, 2024

Contributor Agreements

Please read the contributor agreements and if you agree, please click the checkbox below.

  • I agree to the contributor agreements.

Tip

Please follow the Quick TODO list to smoothly merge your PR.

Motivation

Samplers for optimization with more than three objectives, such as NSGA-III, already exist, but I would like to add another one.

Description of the changes

Add MOEA/D sampler

TODO List towards PR Merge

Please remove this section if this PR is not an addition of a new package.
Otherwise, please check the following TODO list:

  • Copy ./template/ to create your package
  • Replace <COPYRIGHT HOLDER> in LICENSE of your package with your name
  • Fill out README.md in your package
  • Add import statements of your function or class names to be used in __init__.py
  • Apply the formatter based on the tips in README.md
  • Check whether your module works as intended based on the tips in README.md

@toshihikoyanase toshihikoyanase self-assigned this Sep 5, 2024
Member

@toshihikoyanase toshihikoyanase left a comment


We sincerely appreciate your contribution to adding one of the most popular EMO algorithms, MOEA/D.

I confirmed that the examples worked as expected. I added some minor comments, but they are just suggestions. So, let me merge this PR.

author: Hiroaki Natsume
title: MOEA/D sampler
description: Sampler using the MOEA/D algorithm. MOEA/D stands for "Multi-Objective Evolutionary Algorithm based on Decomposition."
tags: [sampler, multiobjective]
Member


A nit, but other samplers seem to use multi-objective optimization as a tag.

Suggested change
tags: [sampler, multiobjective]
tags: [sampler, multi-objective optimization]

## Installation

```
pip install scipy
```
Member


This is a nit, but it might be kind to users if we also provided a common way to install the package's dependencies.

```
pip install -r https://hub.optuna.org/samplers/moead/requirements.txt
```

Comment on lines +41 to +52
```python
if __name__ == "__main__":
    population_size = 100
    n_trials = 1000

    mod = optunahub.load_module("samplers/moead")
    sampler = mod.MOEADSampler(
        population_size=population_size,
        scalar_aggregation_func="tchebycheff",
        n_neighbors=population_size // 10,
    )
    study = optuna.create_study(sampler=sampler)
    study.optimize(objective, n_trials=n_trials)
```
Member


Not a strong opinion, but removing if __name__ == "__main__": might be simple as an example.

Suggested change
```python
if __name__ == "__main__":
    population_size = 100
    n_trials = 1000
    mod = optunahub.load_module("samplers/moead")
    sampler = mod.MOEADSampler(
        population_size=population_size,
        scalar_aggregation_func="tchebycheff",
        n_neighbors=population_size // 10,
    )
    study = optuna.create_study(sampler=sampler)
    study.optimize(objective, n_trials=n_trials)
```

```python
population_size = 100
n_trials = 1000
mod = optunahub.load_module("samplers/moead")
sampler = mod.MOEADSampler(
    population_size=population_size,
    scalar_aggregation_func="tchebycheff",
    n_neighbors=population_size // 10,
)
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=n_trials)
```


Q. Zhang and H. Li,
"MOEA/D: A Multiobjective Evolutionary Algorithm Based on Decomposition," in IEEE Transactions on Evolutionary Computation, vol. 11, no. 6, pp. 712-731, Dec. 2007,
doi: 10.1109/TEVC.2007.892759.
Member


Since you provide the doi, how about using a link to the paper?

Suggested change
doi: 10.1109/TEVC.2007.892759.
[doi: 10.1109/TEVC.2007.892759](https://doi.org/10.1109/TEVC.2007.892759).
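As background for the scalar_aggregation_func="tchebycheff" option used in the example above: the cited paper decomposes a multi-objective problem into scalar subproblems, and the Tchebycheff approach minimizes g(x | w, z*) = max_i w_i |f_i(x) - z*_i| for a weight vector w and ideal point z*. A minimal illustrative sketch (not the package's actual implementation):

```python
def tchebycheff(objectives, weights, ideal_point):
    """Tchebycheff scalarization: max_i w_i * |f_i(x) - z*_i|.

    objectives, weights, and ideal_point are equal-length sequences of
    floats. Smaller values are better when all objectives are minimized.
    """
    return max(
        w * abs(f - z)
        for f, w, z in zip(objectives, weights, ideal_point)
    )
```

For example, with objectives (1.0, 2.0), equal weights (0.5, 0.5), and ideal point (0.0, 0.0), the scalarized value is max(0.5, 1.0) = 1.0.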

```python
    parent_population: list[FrozenTrial],
    neighbors: dict[int, list[int]],
) -> dict[str, Any]:
    """Generate a child parameter from the given parent population by NSGA-II algorithm.
```
Member


A nit.

Suggested change
"""Generate a child parameter from the given parent population by NSGA-II algorithm.
"""Generate a child parameter from the given parent population by MOEA/D algorithm.

Comment on lines +95 to +97
```python
subproblem_parent_population = [
    parent_population[i] for i in neighbors[self._subproblem_id]
]
```
Member


I'm not sure, but this implementation might have unexpected behavior in the case of multi-threading.
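The concern here is that self._subproblem_id is shared mutable state: if Study.optimize runs with n_jobs > 1, two threads could read and advance it concurrently. One common mitigation is to guard the counter with a threading.Lock; the sketch below is illustrative only (class and method names are hypothetical, not from this PR):

```python
import threading


class SubproblemCounter:
    """Illustrative thread-safe round-robin counter over subproblems."""

    def __init__(self, population_size: int) -> None:
        self._population_size = population_size
        self._subproblem_id = 0
        self._lock = threading.Lock()

    def next_id(self) -> int:
        # Atomically read and advance the shared counter, so concurrent
        # callers never observe a torn read-then-increment sequence.
        with self._lock:
            current = self._subproblem_id
            self._subproblem_id = (current + 1) % self._population_size
            return current
```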

```python
    *,
    population_size: int = 100,
    n_neighbors: int | None = None,
    scalar_aggregation_func: str = "tchebycheff",
```
Member


Since scalar_aggregation_func seems to be either tchebycheff or weighted_sum, we might use Literal as the type hint.

Suggested change
scalar_aggregation_func: str = "tchebycheff",
scalar_aggregation_func: Literal['weighted_sum', 'tchebycheff'] = "tchebycheff",

@toshihikoyanase toshihikoyanase merged commit 5b36cdd into optuna:main Sep 5, 2024
4 checks passed
@hrntsm hrntsm deleted the feature/moea_d branch September 5, 2024 23:14
@hrntsm
Contributor Author

hrntsm commented Sep 5, 2024

@toshihikoyanase, thank you for your comments.
I will create a PR reflecting them.

@hrntsm hrntsm mentioned this pull request Sep 5, 2024
1 task
@y0z y0z added the new-package New packages label Sep 6, 2024