# 3. Automatic Hyperparameter Tuning

*kausmik edited this page Aug 25, 2020 · 1 revision*
DashAI uses Ax, an accessible, general-purpose platform for understanding, managing, deploying, and automating adaptive experiments. For full documentation and tutorials, see the Ax website.
The `"metric"` entry specifies what to optimize (here, `error`), how to optimize it (here, `minimize` is set to `true`), and how many trials to run (here, 20).
"metric": {
"name": "error",
"minimize": true,
"num_trials": 20
}
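As a sketch of how these settings might be consumed, assuming the config has been loaded as a Python dict (the helper below is hypothetical, not DashAI's actual code):

```python
def metric_to_ax_kwargs(config):
    """Hypothetical helper: translate the "metric" block into the keyword
    arguments that Ax's managed optimization loop expects
    (objective_name, minimize, total_trials). A sketch, not DashAI's code."""
    metric = config["metric"]
    return {
        "objective_name": metric["name"],
        "minimize": metric["minimize"],
        "total_trials": metric["num_trials"],
    }

config = {"metric": {"name": "error", "minimize": True, "num_trials": 20}}
print(metric_to_ax_kwargs(config))
# {'objective_name': 'error', 'minimize': True, 'total_trials': 20}
```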
Below we specify which hyper-parameters we want DashVerum to optimize by setting each one's `"flag"` to true or false. If the flag is true, DashVerum uses the specification given in `"param"`: the parameter's name, type, bounds, and so on. More about these parameter specifications can be found in the facebook/Ax repo. If the flag is false, the value given in `"default"` is used instead.
"learning_rate": {
"flag": true,
"param": {
"name": "learning_rate",
"type": "range",
"bounds": [1e-5, 0.5],
"log_scale": true
},
"default": "slice(None, 0.003, None)"
}
"num_epochs": {
"flag": true,
"param": {
"name": "num_epochs",
"type": "range",
"bounds": [2, 50],
"digits": 0
},
"default": 5
}
"momentum0": {
"flag": true,
"param": {
"name": "momentum0",
"type": "range",
"bounds": [0.9, 0.99]
},
"default": 0.95
}
"momentum1": {
"flag": true,
"param": {
"name": "momentum1",
"type": "range",
"bounds": [0.8, 0.89]
},
"default": 0.85
}
"dropout_ps": {
"flag": true,
"param": {
"name": "dropout_ps",
"type": "range",
"bounds": [0.0, 1.0]
},
"default": null
}
"weight_decay": {
"flag": true,
"param": {
"name": "weight_decay",
"type": "range",
"bounds": [1e-6, 1.0],
"log_scale": true
},
"default": null
}
"use_bn": {
"flag": true,
"param": {
"name": "use_bn",
"type": "choice",
"values": [true, false]
},
"default": true
}