✏️ Attempt for a better link for Quick Start
EssamWisam committed Jun 10, 2024
1 parent 801bc18 commit e182bd0
Showing 18 changed files with 21 additions and 21 deletions.
@@ -94,7 +94,7 @@
{
"cell_type": "markdown",
"source": [
"Now let's construct our model. This follows a similar setup the one followed in the [Quick Start](../../index.md)."
"Now let's construct our model. This follows a similar setup the one followed in the [Quick Start](../../index.md#Quick-Start)."
],
"metadata": {}
},
@@ -28,7 +28,7 @@ first(X, 5)

# ### Instantiating the model

# Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md).
# Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md#Quick-Start).
NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg = "MLJFlux"
clf = NeuralNetworkClassifier(
builder = MLJFlux.MLP(; hidden = (1, 1, 1), σ = Flux.relu),
@@ -30,7 +30,7 @@ first(X, 5)

### Instantiating the model

Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md).
Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md#Quick-Start).

````@example tuning
NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg = "MLJFlux"
2 changes: 1 addition & 1 deletion docs/src/workflow examples/Comparison/comparison.ipynb
@@ -61,7 +61,7 @@
"cell_type": "markdown",
"source": [
"### Instantiating the models\n",
"Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md)."
"Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md#Quick-Start)."
],
"metadata": {}
},
2 changes: 1 addition & 1 deletion docs/src/workflow examples/Comparison/comparison.jl
@@ -22,7 +22,7 @@ y, X = unpack(iris, ==(:Species), colname -> true, rng=123);


# ### Instantiating the models
# Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md).
# Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md#Quick-Start).

NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux

2 changes: 1 addition & 1 deletion docs/src/workflow examples/Comparison/comparison.md
@@ -25,7 +25,7 @@ y, X = unpack(iris, ==(:Species), colname -> true, rng=123);
````

### Instantiating the models
Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md).
Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md#Quick-Start).

````julia
NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux
2 changes: 1 addition & 1 deletion docs/src/workflow examples/Early Stopping/iteration.ipynb
@@ -62,7 +62,7 @@
"cell_type": "markdown",
"source": [
"### Instantiating the model\n",
"Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md)."
"Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md#Quick-Start)."
],
"metadata": {}
},
2 changes: 1 addition & 1 deletion docs/src/workflow examples/Early Stopping/iteration.jl
@@ -23,7 +23,7 @@ X = Float32.(X); # To be compatible with type of network network parameters


# ### Instantiating the model
# Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md).
# Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md#Quick-Start).

NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux

2 changes: 1 addition & 1 deletion docs/src/workflow examples/Early Stopping/iteration.md
@@ -27,7 +27,7 @@ nothing #hide
````

### Instantiating the model
Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md).
Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md#Quick-Start).

````@example iteration
NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux
@@ -62,7 +62,7 @@
"cell_type": "markdown",
"source": [
"### Instantiating the model\n",
"Now let's construct our model. This follows a similar setup the one followed in the [Quick Start](../../index.md)."
"Now let's construct our model. This follows a similar setup the one followed in the [Quick Start](../../index.md#Quick-Start)."
],
"metadata": {}
},
2 changes: 1 addition & 1 deletion docs/src/workflow examples/Hyperparameter Tuning/tuning.jl
@@ -24,7 +24,7 @@ X = Float32.(X); # To be compatible with type of network network parameters


# ### Instantiating the model
# Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md).
# Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md#Quick-Start).

NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux
clf = NeuralNetworkClassifier(
2 changes: 1 addition & 1 deletion docs/src/workflow examples/Hyperparameter Tuning/tuning.md
@@ -27,7 +27,7 @@ nothing #hide
````

### Instantiating the model
Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md).
Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md#Quick-Start).

````@example Tuning
NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux
@@ -59,7 +59,7 @@
"cell_type": "markdown",
"source": [
"### Instantiating the model\n",
"Now let's construct our model. This follows a similar setup the one followed in the [Quick Start](../../index.md)."
"Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md#quick-start)."
],
"metadata": {}
},
@@ -76,7 +76,7 @@
{
"output_type": "execute_result",
"data": {
"text/plain": "NeuralNetworkClassifier(\n builder = MLP(\n hidden = (5, 4), \n σ = NNlib.relu), \n finaliser = NNlib.softmax, \n optimiser = Adam(0.01, (0.9, 0.999), 1.0e-8, IdDict{Any, Any}()), \n loss = Flux.Losses.crossentropy, \n epochs = 10, \n batch_size = 8, \n lambda = 0.0, \n alpha = 0.0, \n rng = 42, \n optimiser_changes_trigger_retraining = false, \n acceleration = CPU1{Nothing}(nothing))"
"text/plain": "NeuralNetworkClassifier(\n builder = MLP(\n hidden = (5, 4), \n σ = NNlib.relu), \n finaliser = NNlib.softmax, \n optimiser = Flux.Optimise.Adam(0.01, (0.9, 0.999), 1.0e-8, IdDict{Any, Any}()), \n loss = Flux.Losses.crossentropy, \n epochs = 10, \n batch_size = 8, \n lambda = 0.0, \n alpha = 0.0, \n rng = 42, \n optimiser_changes_trigger_retraining = false, \n acceleration = ComputationalResources.CPU1{Nothing}(nothing))"
},
"metadata": {},
"execution_count": 3
@@ -111,13 +111,13 @@
"output_type": "stream",
"text": [
"[ Info: Training machine(NeuralNetworkClassifier(builder = MLP(hidden = (5, 4), …), …), …).\n",
"\rOptimising neural net: 18%[====> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 27%[======> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 36%[=========> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 45%[===========> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 55%[=============> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 64%[===============> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 73%[==================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 82%[====================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 91%[======================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 100%[=========================] Time: 0:00:00\u001b[K\n"
"\rOptimising neural net: 18%[====> ] ETA: 0:00:21\u001b[K\rOptimising neural net: 100%[=========================] Time: 0:00:05\u001b[K\n"
]
},
{
"output_type": "execute_result",
"data": {
"text/plain": "trained Machine; caches model-specific representations of data\n model: NeuralNetworkClassifier(builder = MLP(hidden = (5, 4), …), …)\n args: \n 1:\tSource @609 ⏎ Table{AbstractVector{Continuous}}\n 2:\tSource @135 ⏎ AbstractVector{Multiclass{3}}\n"
"text/plain": "trained Machine; caches model-specific representations of data\n model: NeuralNetworkClassifier(builder = MLP(hidden = (5, 4), …), …)\n args: \n 1:\tSource @655ScientificTypesBase.Table{AbstractVector{ScientificTypesBase.Continuous}}\n 2:\tSource @902 ⏎ AbstractVector{ScientificTypesBase.Multiclass{3}}\n"
},
"metadata": {},
"execution_count": 4
@@ -25,7 +25,7 @@ X = Float32.(X) # To be compatible with type of network network parameters


# ### Instantiating the model
# Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md).
# Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md#Quick-Start).

NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux
clf = NeuralNetworkClassifier(
@@ -29,7 +29,7 @@ nothing #hide
````

### Instantiating the model
Now let's construct our model. This follows a similar setup the one followed in the [Quick Start](../../index.md).
Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md#quick-start).

````@example incremental
NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux
@@ -55,7 +55,7 @@
"cell_type": "markdown",
"source": [
"### Instantiating the model\n",
"Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md)."
"Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md#Quick-Start)."
],
"metadata": {}
},
2 changes: 1 addition & 1 deletion docs/src/workflow examples/Live Training/live-training.jl
@@ -21,7 +21,7 @@ X = Float32.(X); # To be compatible with type of network network parameters


# ### Instantiating the model
# Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md).
# Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md#Quick-Start).

NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux

2 changes: 1 addition & 1 deletion docs/src/workflow examples/Live Training/live-training.md
@@ -25,7 +25,7 @@ nothing #hide
````

### Instantiating the model
Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md).
Now let's construct our model. This follows a similar setup to the one followed in the [Quick Start](../../index.md#Quick-Start).

````@example live-training
NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux
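For context, the "Instantiating the model" step that every edited page links back to follows one shared pattern. The sketch below is assembled from the snippets visible in this diff; it is not a definitive listing, the hyperparameter values shown (taken from the Model Composition notebook output) vary between examples, and it assumes MLJ, MLJFlux, and Flux are installed.

```julia
using MLJ          # provides @load; assumes MLJ is installed
import MLJFlux, Flux

# Load the model type from MLJFlux, as in the edited examples.
NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux

# Hyperparameters mirror those printed in the notebook output above.
clf = NeuralNetworkClassifier(
    builder = MLJFlux.MLP(; hidden = (5, 4), σ = Flux.relu),
    batch_size = 8,
    epochs = 10,
    rng = 42,
)
```

The commit itself only changes where the "Quick Start" link points, so the surrounding code in each example is unaffected.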
