
Commit

Update docs/src/common_workflows/entity_embeddings/notebook.jl
Co-authored-by: Essam <[email protected]>
ablaom and EssamWisam authored Nov 20, 2024
1 parent d35258e commit 71eeafd
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion docs/src/common_workflows/entity_embeddings/notebook.jl
@@ -7,7 +7,7 @@
# It employs a set of embedding layers to map each categorical feature into a dense continuous vector in a similar fashion to how they are employed in NLP architectures.

# In MLJFlux, the `NeuralNetworkClassifier`, `NeuralNetworkRegressor`, and the `MultitargetNeuralNetworkRegressor` can be trained and evaluated with heterogeneous data (i.e., containing categorical features) because they have a built-in entity embedding layer.
-# Moreover, they now offer a `transform` method which encodes the categorical features with the learned embeddings to be used by an upstream machine learning model.
+# Moreover, they offer a `transform` method which encodes the categorical features with the learned embeddings. Such embeddings can then be used as features in downstream machine learning models.

# In this notebook, we will explore how to use entity embeddings in MLJFlux models.
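
For readers skimming this commit, below is a minimal sketch of the workflow the revised sentence describes. It uses standard MLJ interface calls (`@load`, `machine`, `fit!`, `transform`); the toy data, column names, and target are invented for illustration, and default values are assumed for all MLJFlux hyperparameters.

using MLJ          # provides @load, machine, fit!, transform, coerce, Multiclass
import MLJFlux

# Load the classifier documented in the notebook (it has a built-in entity embedding layer)
NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux

# Toy heterogeneous data: one continuous and one categorical feature
X = (age    = 50 .* rand(100),
     sector = coerce(rand(["tech", "finance", "health"], 100), Multiclass))
y = coerce(rand(["yes", "no"], 100), Multiclass)

clf  = NeuralNetworkClassifier()
mach = machine(clf, X, y)
fit!(mach)

# `transform` replaces each categorical column with its learned embedding
# coordinates, ready for use as features in a downstream model:
Xembedded = transform(mach, X)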
