MLJFlux v0.5.0

Released 11 Jun 01:17 by @github-actions (commit ec59410)

Diff since v0.4.0

  • (new model) Add NeuralNetworkBinaryClassifier, an optimised form of NeuralNetworkClassifier for the special case of two target classes, which uses Flux.σ instead of softmax for the default finaliser (#248). See the usage sketch after this list.
  • (internals) Switch from implicit to explicit differentiation (#251). A brief illustration of the two styles appears after this list.
  • (breaking) Use optimisers from Optimisers.jl instead of Flux.jl (#251). Note that the new optimisers are immutable; see the construction sketch after this list.
  • (RNG changes) Change the default value of the model field rng from Random.GLOBAL_RNG to Random.default_rng(), and change the seeded RNG, obtained by specifying an integer value for rng, from MersenneTwister to Xoshiro (#251).
  • (RNG changes) Update the Short builder so that the rng argument of build(::Short, rng, ...) is passed on to the Dropout layer, as these layers now support this on a GPU, at least for rng=Random.default_rng() (#251). See the builder sketch after this list.
  • (weakly breaking) Change the implementation of L1/L2 regularization from explicit loss penalization to weight/sign decay, internally chained with the user-specified optimiser (#251). The only breakage for users is that the losses reported in the history are no longer penalized, because the penalty is not explicitly computed. A sketch of the chaining idea appears after this list.
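
A minimal usage sketch for the new NeuralNetworkBinaryClassifier, assuming the standard MLJ model-loading workflow; the synthetic data and hyperparameter values are illustrative only:

```julia
using MLJ

# Load the new model from MLJFlux (MLJFlux must be in the active environment):
NeuralNetworkBinaryClassifier = @load NeuralNetworkBinaryClassifier pkg=MLJFlux

# Illustrative two-class data; `coerce` marks the target as categorical:
X = MLJ.table(rand(Float32, 100, 4))
y = coerce(rand(["yes", "no"], 100), Multiclass)

# With only two target classes, Flux.σ (not softmax) is the default finaliser:
clf = NeuralNetworkBinaryClassifier(epochs=10)
mach = machine(clf, X, y) |> fit!
predict(mach, X)
```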
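
To illustrate the implicit-to-explicit switch, a bare Flux sketch (not MLJFlux internals); the model and loss are illustrative:

```julia
using Flux

model = Chain(Dense(4 => 8, relu), Dense(8 => 1))
x, y = rand(Float32, 4, 16), rand(Float32, 1, 16)

# Old implicit style (now abandoned): gradients keyed on Flux.params(model):
# gs = Flux.gradient(() -> Flux.mse(model(x), y), Flux.params(model))

# New explicit style: differentiate with respect to the model itself:
grads = Flux.gradient(m -> Flux.mse(m(x), y), model)[1]
```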
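
A sketch of how the optimiser and RNG changes surface in model construction; the hyperparameter values are illustrative:

```julia
using MLJFlux, Optimisers, Random

# Optimisers now come from Optimisers.jl and are immutable: to change a
# setting, construct a new optimiser rather than mutating the old one.
clf = MLJFlux.NeuralNetworkClassifier(
    optimiser = Optimisers.Adam(0.001), # formerly a Flux.jl optimiser
    rng = Xoshiro(123),                 # an integer `rng=123` now seeds a Xoshiro, not a MersenneTwister
)
```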
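
A sketch of the updated Short builder, assuming the documented build(builder, rng, n_in, n_out) signature; the layer sizes are illustrative:

```julia
using MLJFlux, Random

builder = MLJFlux.Short(n_hidden=8, dropout=0.5)

# The rng is now passed on to the Dropout layer inside the returned chain:
chain = MLJFlux.build(builder, Random.default_rng(), 4, 2)
```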
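
A rough sketch of the kind of chaining now done internally; the actual MLJFlux code may differ, but Optimisers.jl provides WeightDecay (the L2 part) and SignDecay (the L1 part) for exactly this purpose. Here lambda and alpha stand in for a regularization strength and an L1/L2 mix:

```julia
using Optimisers

lambda, alpha = 0.01, 0.4   # illustrative strength and L1/L2 mix
base = Optimisers.Adam()    # stands in for the user-specified optimiser

# L1 penalization becomes sign decay and L2 becomes weight decay, chained
# in front of the base optimiser; the loss itself is never penalized:
opt = Optimisers.OptimiserChain(
    Optimisers.SignDecay(alpha * lambda),
    Optimisers.WeightDecay((1 - alpha) * lambda),
    base,
)
```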

Closed issues:

  • Stop using implicit style differentiating (#221)
  • Update examples/MNIST (#234)
  • MultitargetNeuralNetworkRegressor has surprising target_scitype (#246)
  • Error fit GPU (#247)
  • Metalhead has broken MLJFlux (#249)