
Add the implementation of Differential Evolution #199

Merged: 20 commits merged into optuna:main on Dec 13, 2024
Conversation

@JLX0 (Contributor) commented on Dec 6, 2024

Contributor Agreements

Please read the contributor agreements and if you agree, please click the checkbox below.

  • I agree to the contributor agreements.

Motivation

This PR introduces the DESampler, which implements a hybrid Differential Evolution (DE) and Random Sampling algorithm for hyperparameter optimization.

Differential Evolution is a well-established optimization algorithm known for its robustness and performance across numerical optimization tasks. By integrating Random Sampling for categorical parameters, this implementation bridges the gap between numerical and categorical hyperparameter spaces, making it suitable for real-world machine learning and AI applications where mixed-type parameters are common.
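For readers less familiar with DE, here is a minimal, generic sketch of one DE/rand/1/bin generation over real-valued parameters. It only illustrates the mutation/crossover/selection loop of the classic algorithm and is not the code in de.py:

```python
import numpy as np


def de_rand_1_bin_step(population, scores, objective, f=0.8, cr=0.9, rng=None):
    """One generation of classic DE/rand/1/bin (generic illustration, not the code in de.py)."""
    rng = np.random.default_rng() if rng is None else rng
    n, dim = population.shape
    assert n >= 4, "DE/rand/1 needs at least 4 individuals"
    donors = population.copy()  # donors come from the previous generation
    for i in range(n):
        # Mutation: v = x_r1 + F * (x_r2 - x_r3), with r1, r2, r3 distinct and != i.
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i], size=3, replace=False)
        mutant = donors[r1] + f * (donors[r2] - donors[r3])
        # Binomial crossover: take each coordinate from the mutant with probability cr,
        # forcing at least one coordinate so the trial differs from its parent.
        mask = rng.random(dim) < cr
        mask[rng.integers(dim)] = True
        trial = np.where(mask, mutant, donors[i])
        # Greedy selection: the trial replaces its parent only if it is at least as good.
        trial_score = objective(trial)
        if trial_score <= scores[i]:
            population[i], scores[i] = trial, trial_score
    return population, scores
```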

The implementation is particularly motivated by the need to handle dynamic search spaces efficiently, enabling flexible adaptation to changing parameter dimensions during optimization. This makes the DESampler well-suited for tasks involving complex search spaces, such as neural architecture search or expensive optimization tasks with evolving requirements.

For simplicity, this implementation provides a default setup of Differential Evolution, with automatic population-size determination and seamless integration with Optuna’s framework for tracking trials and results. The sampler aims to stay computationally efficient while maintaining high population diversity, enabling faster convergence toward good solutions.
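As a usage sketch: once the package is in the registry, it should be loadable with optunahub.load_module and plugged into a regular Optuna study. The constructor arguments shown for DESampler (e.g. seed) are assumptions for illustration, not the confirmed signature; see the package README for the actual API.

```python
import optuna
import optunahub

# Load the registry package added by this PR.
module = optunahub.load_module(package="samplers/differential_evolution")


def objective(trial: optuna.Trial) -> float:
    # Mixed numerical/categorical space: DE drives the numerical part,
    # random sampling handles the categorical part.
    x = trial.suggest_float("x", -10.0, 10.0)
    activation = trial.suggest_categorical("activation", ["relu", "tanh"])
    return x**2 + (0.0 if activation == "relu" else 1.0)


sampler = module.DESampler(seed=42)  # "seed" is an assumed argument for illustration
study = optuna.create_study(sampler=sampler, direction="minimize")
study.optimize(objective, n_trials=100)
```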

Description of the changes

  • Added DESampler in de.py
  • Added comprehensive examples and benchmarking for using DESampler in example.py

TODO List towards PR Merge

  • Copy ./template/ to create your package
  • Replace <COPYRIGHT HOLDER> in LICENSE of your package with your name
  • Fill out README.md in your package
  • Add import statements of your function or class names to be used in __init__.py (see the sketch after this list)
  • (Optional) Add from __future__ import annotations at the head of any Python files that include typing to support older Python versions
  • Apply the formatter based on the tips in README.md
  • Check whether your module works as intended based on the tips in README.md
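For the __init__.py item above, a minimal sketch of what the export could look like, given that the sampler class is DESampler in de.py (whether the actual package exports anything else is unknown):

```python
# package/samplers/differential_evolution/__init__.py
from .de import DESampler

__all__ = ["DESampler"]
```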

@JLX0 (Contributor, Author) commented on Dec 6, 2024

Regarding the failing Checks/checks run:

package/samplers/differential_evolution/example.py:182:9: F841 Local variable z is assigned to but never used
package/samplers/differential_evolution/example.py:199:9: F841 Local variable z is assigned to but never used
package/samplers/differential_evolution/example.py:220:9: F841 Local variable z is assigned to but never used

The variable z is just a dummy variable used to test whether the sampler works for dynamic search spaces.
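For context, here is a minimal sketch of the kind of dynamic-search-space objective that produces such an unused variable. It is illustrative only, not the exact code in example.py; adding # noqa: F841 is one way to silence the warning:

```python
import optuna


def dynamic_objective(trial: optuna.Trial) -> float:
    x = trial.suggest_float("x", -10.0, 10.0)
    y = trial.suggest_float("y", -10.0, 10.0)
    # The search space changes across trials: "z" is only suggested for
    # even-numbered trials, so the sampler must cope with a varying set of
    # parameters. "z" never affects the returned value, which is exactly the
    # dead assignment that flake8 reports as F841.
    if trial.number % 2 == 0:
        z = trial.suggest_float("z", -10.0, 10.0)  # noqa: F841
    return x**2 + y**2
```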

@JLX0 changed the title from "De" to "Add the implementation of Differential Evolution" on Dec 6, 2024
@y0z self-assigned this on Dec 6, 2024
@y0z added the new-package (New packages) label on Dec 6, 2024
@y0z (Member) left a comment:

LGTM!

@y0z merged commit c9c5ec4 into optuna:main on Dec 13, 2024
4 checks passed
@y0z removed their assignment on Dec 16, 2024