CORNN is an easy-to-use benchmark suite focused on the task of optimizing neural network weights with population-based approaches across a large array of regression tasks.
The CORNN benchmark suite is constructed from 54 classic real-valued functions from the field of continuous optimization (see CORNN_RegressionFunctions.pdf), from which the regression data sets have been generated by uniformly sampling the active domain of each classic function. For each function a training set of 3750 samples and a test set of 1250 samples are provided. A CORNN problem instance is generated by pairing a regression data set with a specific neural network architecture. CORNN provides 6 architectures, so pairing each of the 54 data sets with each architecture results in 324 unique problem instances within the benchmark suite.
The goal of each problem instance is to improve the model fit by optimizing the NN weights for the given data set. A full motivation for this benchmark suite, along with a demonstration of using it to compare a number of optimization algorithms, is given in the article: arxiv.org/abs/2109.05606.
- Pure Python 3 implementation
- Windows, Linux, and Mac OS supported
- Rapid candidate solution evaluation due to the PyTorch backend
- 324 unique problem instances
- Simple to use interface, with both the neural network computation and dataset processing abstracted away
- Easy addition of other neural network architectures
- Generating functions of all 54 regression tasks provided
The CORNN benchmark suite is designed using Python 3 and relies on PyTorch for highly efficient candidate solution evaluation. In order to construct one of the 324 problem instances, an underlying regression task and a model architecture must be selected from the library. Once this pair has been selected, the library constructs a problem instance object that exposes a callable function to which the user can simply pass a candidate solution for evaluation.
The following code snippet is also present in 'Demo.py' and should serve as a straightforward guide to using the benchmark suite.
import numpy as np
# CORNN itself is imported as in 'Demo.py'; the exact import path depends on where the lib folder sits

function_dictionary=CORNN.get_benchmark_functions()
# Returns a dictionary of all the regression functions within CORNN
# The key is the function name, the value is a 3-element tuple
# containing the raw objective function (for example the Ackley function),
# the x variable's domain, and the y variable's domain
print([*function_dictionary.keys()])
# list all the available objective functions.
training_data, test_data= CORNN.get_scaled_function_data(function_dictionary["Ackley"])
# Both the training data and the test data are pairs.
# The first element of training_data is a PyTorch tensor of data patterns
# The second element of training_data is a PyTorch tensor of the corresponding labels
# The same is true for test_data
neural_network_dictionary=CORNN.get_NN_models()
# Returns a dictionary of all neural network architectures from CORNN
# The key is the class name, the value is the NN class built on PyTorch.
print([*neural_network_dictionary.keys()])
# list all the available neural network architectures.
neural_network_architecture=neural_network_dictionary["Net_5_relu_layers"]() # the () to instantiate
# Selects the 3 hidden layer NN model that uses the ReLU activation function within
# the hidden layers.
# The combination of training_data, test_data, and the
# selected neural network architecture makes a problem instance of CORNN
CORNN_benchmark_instance=CORNN.NN_Benchmark(training_data,test_data,neural_network_architecture)
instance_dimension= CORNN_benchmark_instance.get_weight_count()
# The number of weights in the selected architecture, i.e. the dimensionality of the search space
example_candidate_solution=np.random.rand(instance_dimension)
# A random candidate solution (weight vector), used here purely to demonstrate evaluation
# In order to evaluate a candidate solution on the training set simply use:
training_loss=CORNN_benchmark_instance.training_set_evaluation(example_candidate_solution)
print("Training set loss:",training_loss)
# In order to evaluate a candidate solution on the test set simply use:
testing_loss=CORNN_benchmark_instance.testing_set_evaluation(example_candidate_solution)
print("Testing set loss:",testing_loss)
While the primary CORNN benchmark suite problem instances are pre-built as described in the arXiv article, it is possible to use a custom neural architecture or dataset.
Using a custom neural architecture can be achieved by passing a subclass of PyTorch's torch.nn.Module class. For example, the line from the previous demo
neural_network_architecture=neural_network_dictionary["Net_5_relu_layers"]()
can be replaced with
import torch.nn as nn
import torch.nn.functional as F
class Net_Custom(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(2, 10)  # input->H1
        self.fc2 = nn.Linear(10, 50)  # H1->H2
        self.fc3 = nn.Linear(50, 10)  # H2->H3
        self.fc4 = nn.Linear(10, 1)  # H3->output
    def forward(self, x):
        x = self.fc1(x)
        x = F.relu(x)
        x = self.fc2(x)
        x = F.relu(x)
        x = self.fc3(x)
        x = F.relu(x)
        x = self.fc4(x)
        return x
neural_network_architecture=Net_Custom()
where Net_Custom could be any architecture you choose.
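The custom architecture is then used with CORNN.NN_Benchmark exactly as in the demo above; as a quick check, reusing the training_data and test_data from earlier, the search-space dimensionality will now reflect the custom network's weight count:

CORNN_benchmark_instance=CORNN.NN_Benchmark(training_data,test_data,neural_network_architecture)
print("Search-space dimensionality:", CORNN_benchmark_instance.get_weight_count())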
In order to use a custom data set, all that is required is to generate PyTorch tensors structured in line with the demo code line
training_data, test_data= CORNN.get_scaled_function_data(function_dictionary["Ackley"])
where the first element of training_data is a PyTorch tensor of data patterns and the second element of training_data is a PyTorch tensor of the corresponding labels. The same must hold true for test_data. If your dataset has data patterns or labels of a different dimensionality than what is used in the base CORNN suite datasets, an appropriate custom neural architecture will be required.
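For illustration only, the sketch below builds a synthetic data set in that format using random tensors; the shapes assume the same 2-D input / 1-D label layout as the base CORNN tasks (your own data may differ), and each pair is represented here as a tuple.

import torch

n_train, n_test = 3750, 1250  # sizes used by the base CORNN data sets
training_data = (torch.rand(n_train, 2), torch.rand(n_train, 1))  # (data patterns, labels)
test_data = (torch.rand(n_test, 2), torch.rand(n_test, 1))

# The custom data set is then used exactly as before
CORNN_benchmark_instance=CORNN.NN_Benchmark(training_data,test_data,neural_network_architecture)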
The CORNN suite provides data sets (both test and train) for 54 functions, as detailed in [arXiv]. To provide a sense of the regression tasks, four of them are plotted in 3D below: Alpine N.1, Bird, Egg Holder, and Himmelblau.
The benchmark suite makes use of numpy, pandas, and PyTorch. The minimum version numbers are stored in the requirements.txt file and are:
numpy==1.20.1
pandas==1.2.3
torch==1.9.0
You can simply download the lib folder and use it as indicated in the 'Demo.py' file, or, if you wish to install CORNN as a package, you can enter the commands below:
git clone https://github.com/CWCleghornAI/CORNN
cd CORNN
python3 setup.py install
The CORNN suite is provided under GNU General Public License v3.0: details can be found here
If you encounter any bugs or see possible improvements, please open an issue or feel free to make a pull request.