Commit 197dda8
Update docs/src/index.md
Co-authored-by: Anthony Blaom, PhD <[email protected]>
EssamWisam and ablaom authored May 24, 2024
1 parent 3df8eb5 commit 197dda8
Showing 1 changed file with 1 addition and 1 deletion.
docs/src/index.md (1 addition, 1 deletion)
@@ -52,7 +52,7 @@ evaluate!(mach, resampling=cv, measure=accuracy)
 ```
 As you can see we were able to use MLJ functionality (i.e., cross validation) with a Flux deep learning model. All arguments provided also have defaults.
 
-Notice that we were also able to define the neural network in a high-level fashion by only specifying the number of neurons per each hidden layer and the activation function. Meanwhile, `MLJFlux` was able to infer the input and output layer as well as use a suitable default for the loss function and output activation given the classification task. Notice as well that we did not need to implement a training or prediction loop as in `Flux`.
+Notice that we were also able to define the neural network in a high-level fashion by only specifying the number of neurons in each hidden layer and the activation function. Meanwhile, `MLJFlux` was able to infer the input and output layer as well as use a suitable default for the loss function and output activation given the classification task. Notice as well that we did not need to implement a training or prediction loop as in `Flux`.
 
 ## Basic idea
 
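For orientation, here is a minimal sketch of the quick-start workflow the edited paragraph describes, assuming the usual MLJ/MLJFlux setup; the iris data, hidden-layer widths, and epoch count below are illustrative choices, not taken from the commit:

```julia
using MLJ
import MLJFlux
import Flux

# Load the classifier model type registered by MLJFlux.
NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux

# Only the hidden-layer widths and activation are specified; MLJFlux infers
# the input and output layers from the data and supplies task-appropriate
# defaults for the loss function and output activation.
clf = NeuralNetworkClassifier(
    builder = MLJFlux.MLP(hidden=(5, 4), σ=Flux.relu),
    epochs = 10,
)

X, y = @load_iris                 # toy dataset, a stand-in for real features/labels
mach = machine(clf, X, y)
evaluate!(mach, resampling=CV(nfolds=5), measure=accuracy)
```

Here `MLJFlux.MLP` is the builder realizing the "hidden layers only" specification; the input and output dimensions are determined from `X` and `y` when the machine is fit, and no explicit training or prediction loop is written, as the paragraph notes.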
