Releases: FluxML/MLJFlux.jl

v0.6.4

13 Feb 00:08
9124068

MLJFlux v0.6.4

Diff since v0.6.3

Merged pull requests:

v0.6.3

12 Feb 04:38
54bbf26

MLJFlux v0.6.3

Diff since v0.6.2

  • Extend compatibility: Flux = "0.14, 0.15"

Merged pull requests:

  • CompatHelper: bump compat for Flux to 0.16, (keep existing compat) (#290) (@github-actions[bot])
  • CompatHelper: bump compat for Optimisers to 0.4, (keep existing compat) (#291) (@github-actions[bot])
  • CompatHelper: bump compat for ColorTypes to 0.12, (keep existing compat) (#292) (@github-actions[bot])
  • For a 0.6.3 release (extension of compat bounds only) (#293) (@ablaom)
  • Revert [compat] Flux = "0.14" (#294) (@ablaom)
  • Extend Flux [compat] to Flux = "0.14, 0.15" (#296) (@ablaom)

v0.6.2

24 Jan 00:08
595bc9b

MLJFlux v0.6.2

Diff since v0.6.1

Merged pull requests:

v0.6.1

20 Jan 04:28
637e67a

MLJFlux v0.6.1

Diff since v0.6.0

  • Add model wrapper EntityEmbedder(model) to transform supervised MLJFlux models into entity embedding transformers (#286)
  • Make some performance improvements around unwrapping of CategoricalArrays (#281)
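The new wrapper can be used along the following lines. This is a minimal, untested sketch: `X` (a table whose categorical columns are `CategoricalVector`s) and `y` (the training target) are assumed to be defined, and the classifier choice is illustrative.

```julia
using MLJ, MLJFlux

# Wrap any supervised MLJFlux model to obtain a transformer that maps
# categorical features to learned continuous embeddings:
clf = NeuralNetworkClassifier()
embedder = EntityEmbedder(clf)

# Training the wrapped model learns the embeddings from (X, y):
mach = machine(embedder, X, y) |> fit!

# `transform` replaces each categorical column with its learned
# embedding, ready for consumption by downstream (non-MLJFlux) models:
Xcontinuous = transform(mach, X)
```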

Merged pull requests:

Closed issues:

  • Julia 1.11 fails tests (#280)
  • use column names instead of indices (#282)

v0.6.0

29 Sep 22:33
de2b3c6

MLJFlux v0.6.0

Diff since v0.5.1

All models, except ImageClassifier, now support categorical features (presented as table columns of CategoricalVector type). Rather than being one-hot encoded, such features are embedded into a continuous space that is learned during training (i.e., by adding an embedding layer), and the dimension of these spaces can be specified by the user using a new dictionary-valued hyperparameter, embedding_dims. The learned embeddings are exposed by a new implementation of transform, which means they can be used with other models (transfer learning), as described in Cheng Guo and Felix Berkhahn (2016): Entity Embeddings of Categorical Variables.

Also, all continuous input presented to these models is now coerced to Float32; this is the only breaking change.
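A sketch of how the new hyperparameter might be used, assuming `X` and `y` are predefined training data; the column names (`:country`, `:occupation`) and the use of Symbol keys in the dictionary are illustrative assumptions, not confirmed API details.

```julia
using MLJ, MLJFlux

# Request specific embedding dimensions for selected categorical
# features; features not listed fall back to a default dimension:
clf = NeuralNetworkClassifier(
    embedding_dims = Dict(:country => 4, :occupation => 6),
)

mach = machine(clf, X, y) |> fit!

# The new `transform` implementation exposes the learned embeddings:
# continuous features pass through (as Float32), while categorical
# features are replaced by their embedding coordinates:
Xembedded = transform(mach, X)
```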

Merged pull requests:

Closed issues:

  • deprecated warning in documentation (#236)
  • Some minor doc issues (#258)
  • Fix code snippet in Readme (#266)
  • MultitargetNeuralNetworkRegressor doc example doesn't work as intended (#268)

v0.5.1

12 Jun 06:05
3766e44

MLJFlux v0.5.1

Diff since v0.5.0

Merged pull requests:

v0.5.0

11 Jun 01:17
ec59410

MLJFlux v0.5.0

Diff since v0.4.0

  • (new model) Add NeuralNetworkBinaryClassifier, an optimised form of NeuralNetworkClassifier for the special case of two target classes. Uses Flux.σ instead of softmax as the default finaliser (#248)
  • (internals) Switch from implicit to explicit differentiation (#251)
  • (breaking) Use optimisers from Optimisers.jl instead of Flux.jl (#251). Note that the new optimisers are immutable.
  • (RNG changes.) Change the default value of the model field rng from Random.GLOBAL_RNG to Random.default_rng(). Change the seeded RNG, obtained by specifying an integer value for rng, from MersenneTwister to Xoshiro (#251)
  • (RNG changes.) Update the Short builder so that the rng argument of build(::Short, rng, ...) is passed on to the Dropout layer, as these layers now support this on a GPU, at least for rng=Random.default_rng() (#251)
  • (weakly breaking) Change the implementation of L1/L2 regularization from explicit loss penalization to weight/sign decay (internally chained with the user-specified optimiser). The only breakage for users is that the losses reported in the history will no longer be penalized, because the penalty is not explicitly computed (#251)
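In user code, the optimiser and RNG changes above might look as follows. This is a hedged sketch assuming the standard MLJFlux keyword constructors; the learning-rate values are arbitrary.

```julia
using MLJFlux, Optimisers

# Optimisers now come from Optimisers.jl rather than Flux.jl:
model = NeuralNetworkBinaryClassifier(
    optimiser = Optimisers.Adam(0.001),  # not Flux.Adam
    rng = 123,  # an integer now seeds a Xoshiro RNG, not MersenneTwister
)

# The new optimisers are immutable, so to change the learning rate,
# assign a fresh optimiser rather than mutating the old one in place:
model.optimiser = Optimisers.Adam(0.0001)
```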

Merged pull requests:

Closed issues:

  • Stop using implicit style differentiating (#221)
  • Update examples/MNIST (#234)
  • MultitargetNeuralNetworkRegressor has surprising target_scitype (#246)
  • Error fit GPU (#247)
  • Metalhead has broken MLJFlux (#249)

v0.4.0

17 Oct 06:46
8eaed13

MLJFlux v0.4.0

Diff since v0.3.1

Merged pull requests:

v0.3.1

11 Sep 23:28
ab630f5

MLJFlux v0.3.1

Diff since v0.3.0

  • Improve the error message for faulty builders (#238)

Merged pull requests:

v0.3.0

25 Aug 01:48
37b5f31

MLJFlux v0.3.0

Diff since v0.2.10

Merged pull requests:

  • Off-by-one error in clean! method (#227) (@MarkArdman)
  • CompatHelper: bump compat for Flux to 0.14, (keep existing compat) (#228) (@github-actions[bot])
  • Actions node 12 => node 16 (#229) (@vnegi10)
  • Bump compat for Metalhead (#232) (@ablaom)
  • For a 0.3 release (#235) (@ablaom)