0.8.1
Elegy is now based on Treex
Changes
- Removes the `module`, `nn`, `metrics`, and `losses` modules from Elegy; instead, Elegy re-exports them from Treex. `GeneralizedModule` and friends are gone; to use Flax Modules, use the `elegy.nn.FlaxModule` wrapper (see the first sketch below).
- The low-level API is massively simplified:
  - `States` is removed; since `Model` is a pytree, all parameters are tracked automatically thanks to Treex / Treeo.
  - All static state arguments (`training`, `initializing`) are removed. `Module`s can simply use `self.training` to read their training state and `self.initializing()` to check whether they are initializing (see the second sketch below).
  - The signatures of `pred_step`, `test_step`, and `train_step` now simply consist of `inputs` and `labels`, where `labels` is a `dict` that can contain additional keys like `sample_weight` or `class_weight` as required by the losses and metrics.
- Adds the `DistributedStrategy` class, which currently has 3 instances:
  - `Eager`: runs the model on a single device in eager mode (no `jit`).
  - `JIT`: runs the model on a single device with `jit`.
  - `DataParallel`: runs the model on multiple devices using `pmap`.
- Adds methods to change the model's distributed strategy (see the third sketch below):
  - `.distributed(strategy=DataParallel)`: changes the distributed strategy; `DataParallel` is used by default.
  - `.local()`: changes the distributed strategy to `JIT`.
  - `.eager()`: changes the distributed strategy to `Eager`.
- Removes the `.eager` field in favor of the `.eager()` method.
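
Below is a minimal sketch of wrapping a Flax Module with the `elegy.nn.FlaxModule` wrapper mentioned above. The wrapper name and the re-exported `losses` / `metrics` modules come from these notes; the specific loss, metric, and optimizer names, as well as the `elegy.Model` constructor arguments, are assumptions for illustration.

```python
import elegy
import flax.linen as nn
import optax


class MLP(nn.Module):
    """A plain Flax module."""

    @nn.compact
    def __call__(self, x):
        x = nn.relu(nn.Dense(64)(x))
        return nn.Dense(10)(x)


# Wrap the Flax module so Elegy can use it like any other module.
# The loss, metric, and optimizer below are illustrative assumptions.
model = elegy.Model(
    module=elegy.nn.FlaxModule(MLP()),
    loss=elegy.losses.Crossentropy(),
    metrics=elegy.metrics.Accuracy(),
    optimizer=optax.adam(1e-3),
)
```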
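A second sketch shows how a custom `Module` can read `self.training` and call `self.initializing()` instead of receiving static `training` / `initializing` arguments. Those two names come from these notes; the `tx.Parameter.node()` annotations and the `tx.next_key()` call are assumptions about the underlying Treex / Treeo API.

```python
import jax
import jax.numpy as jnp
import treex as tx  # Elegy re-exports these modules from Treex; used directly here


class Linear(tx.Module):
    # Assumed Treex-style annotations marking the fields as trainable parameters.
    w: jnp.ndarray = tx.Parameter.node()
    b: jnp.ndarray = tx.Parameter.node()

    def __init__(self, features_out: int):
        super().__init__()
        self.features_out = features_out
        self.w = None
        self.b = None

    def __call__(self, x):
        if self.initializing():
            # Parameters are created lazily during the initializing call.
            key = tx.next_key()
            self.w = jax.random.uniform(key, [x.shape[-1], self.features_out])
            self.b = jnp.zeros([self.features_out])

        y = jnp.dot(x, self.w) + self.b

        if self.training:
            # Train-only behavior (dropout, noise, running-stat updates, ...)
            # would go here; no static `training` argument is threaded through.
            pass

        return y
```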
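Finally, a sketch of switching the distributed strategy with the new methods. The method names and the `DataParallel` default come from these notes; where the strategy classes are exposed (`elegy.DataParallel` below) and the assumption that each method returns the reconfigured model (plausible since `Model` is a pytree, but not stated here) are guesses.

```python
# Reusing the `model` built in the first sketch above.

model = model.distributed()                             # DataParallel (pmap) is used by default
model = model.distributed(strategy=elegy.DataParallel)  # same, passing the strategy explicitly (assumed location)
model = model.local()                                   # back to a single device with jit
model = model.eager()                                   # single device, no jit (replaces the old `.eager` field)
```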