📝 Add minor improvements
EssamWisam committed May 22, 2024
1 parent ee3e0b8 commit fd8ab78
Showing 2 changed files with 8 additions and 9 deletions.
4 changes: 0 additions & 4 deletions README.md
@@ -50,10 +50,6 @@ and this will require familiarity with the [Flux
 API](https://fluxml.ai/Flux.jl/stable/) for defining a neural network
 chain.
 
-In the future MLJFlux may provide a larger assortment of canned
-builders. Pull requests introducing new ones are most welcome.
-
-
 ### Installation
 
 ```julia
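The sentence above about the Flux API refers to supplying a builder that constructs a `Flux.Chain`. Here is a minimal sketch of such a builder, assuming the documented pattern of subtyping `MLJFlux.Builder` and extending `MLJFlux.build`; the builder name, its field, and the layer sizes are invented for illustration:

```julia
using MLJFlux, Flux

# A custom builder: a struct of architecture hyper-parameters. MLJFlux calls
# `build` with the input/output dimensions inferred from the data and expects
# a `Flux.Chain` in return.
mutable struct MyBuilder <: MLJFlux.Builder
    n_hidden::Int   # width of the single hidden layer
end

function MLJFlux.build(b::MyBuilder, rng, n_in, n_out)
    init = Flux.glorot_uniform(rng)   # reproducible weight initialization
    return Chain(
        Dense(n_in => b.n_hidden, relu; init=init),
        Dense(b.n_hidden => n_out; init=init),
    )
end
```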
13 changes: 8 additions & 5 deletions docs/src/interface/Summary.md
@@ -22,9 +22,11 @@ In MLJ a *model* is a mutable struct storing hyper-parameters for some
 learning algorithm indicated by the model name, and that's all. In
 particular, an MLJ model does not store learned parameters.
 
-*Warning:* In Flux the term "model" has another meaning. However, as all
-Flux "models" used in MLJFlux are `Flux.Chain` objects, we call them
-*chains*, and restrict use of "model" to models in the MLJ sense.
+!!! warning "Difference in Definition"
+    In Flux the term "model" has another meaning. However, as all
+    Flux "models" used in MLJFlux are `Flux.Chain` objects, we call them
+    *chains*, and restrict use of "model" to models in the MLJ sense.
+
 ```@raw html
 </details>
 ```
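To make the model/chain distinction concrete, here is a minimal sketch contrasting the two notions; the particular hyper-parameter values and layer sizes are arbitrary:

```julia
using MLJFlux, Flux

# An MLJ model: a mutable struct of hyper-parameters only, with no learned parameters.
model = NeuralNetworkClassifier(epochs=10, batch_size=32)

# A Flux "model", here called a chain: holds the layers and their learnable weights.
chain = Chain(Dense(4 => 16, relu), Dense(16 => 3))
```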
@@ -104,8 +106,9 @@ length as its input.
 Currently, the loss function specified by `loss=...` is applied
 internally by Flux and needs to conform to the Flux API. You cannot,
 for example, supply one of MLJ's probabilistic loss functions, such as
-`MLJ.cross_entropy`, to one of the classifier constructors, although
-you *should* use MLJ loss functions in MLJ meta-algorithms.
+`MLJ.cross_entropy`, to one of the classifier constructors.
+
+That said, MLJ loss functions and metrics can still be used, but only in evaluation meta-algorithms such as cross-validation, and they work even when the underlying model comes from MLJFlux.
 
 ```@raw html
 <details closed><summary><b>More on accelerated training with GPUs</b></summary>
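A minimal sketch of the division of labour just described, with a Flux loss passed to the constructor and an MLJ measure used only at evaluation time; the dataset, hyper-parameters, and choice of measure are illustrative, not prescribed:

```julia
using MLJ, MLJFlux, Flux

X, y = @load_iris   # small classification dataset bundled with MLJ

# The training loss given to the constructor must be a Flux loss:
clf = NeuralNetworkClassifier(loss=Flux.crossentropy, epochs=20)

# MLJ measures belong in evaluation meta-algorithms such as cross-validation:
evaluate(clf, X, y, resampling=CV(nfolds=5), measure=cross_entropy)
```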
