From 197dda87aae53660de83e716c32b19122b241514 Mon Sep 17 00:00:00 2001
From: Essam
Date: Fri, 24 May 2024 11:01:20 +0300
Subject: [PATCH] Update docs/src/index.md

Co-authored-by: Anthony Blaom, PhD
---
 docs/src/index.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/src/index.md b/docs/src/index.md
index be875679..950aca17 100644
--- a/docs/src/index.md
+++ b/docs/src/index.md
@@ -52,7 +52,7 @@ evaluate!(mach, resampling=cv, measure=accuracy)
 ```
 As you can see we were able to use MLJ functionality (i.e., cross validation) with a Flux deep learning model. All arguments provided also have defaults.
 
-Notice that we were also able to define the neural network in a high-level fashion by only specifying the number of neurons per each hidden layer and the activation function. Meanwhile, `MLJFlux` was able to infer the input and output layer as well as use a suitable default for the loss function and output activation given the classification task. Notice as well that we did not need to implement a training or prediction loop as in `Flux`.
+Notice that we were also able to define the neural network in a high-level fashion by only specifying the number of neurons in each hidden layer and the activation function. Meanwhile, `MLJFlux` was able to infer the input and output layers as well as use suitable defaults for the loss function and output activation given the classification task. Notice as well that we did not need to implement a training or prediction loop as in `Flux`.
 
 ## Basic idea
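
For reviewers reading this hunk out of context: the docs example it amends is not fully visible here (only the closing `evaluate!` call and code fence appear as context lines). A minimal sketch of what the surrounding example plausibly looks like, assuming the `NeuralNetworkClassifier` model, the `MLJFlux.MLP` builder, and the iris dataset; none of these specifics are taken from this diff:

```julia
using MLJ
import MLJFlux
import Flux

# Load the MLJFlux model type through MLJ's model registry.
NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux

X, y = @load_iris  # hypothetical stand-in dataset

# High-level spec: only the hidden-layer widths and the activation are
# given; MLJFlux infers the input/output layers and picks defaults for
# the loss and output activation appropriate to classification.
clf = NeuralNetworkClassifier(
    builder = MLJFlux.MLP(hidden = (5, 4), σ = Flux.relu),
    epochs = 10,
)

# No hand-written training or prediction loop, as there would be in
# plain Flux: MLJ's machine/evaluate! machinery handles both.
mach = machine(clf, X, y)
cv = CV(nfolds = 5)
evaluate!(mach, resampling = cv, measure = accuracy)  # matches the hunk's context line
```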