
[Question] Does XLM-R follow RoBERTa or XLM for MLM? #351

Open
mani-rai opened this issue Jul 3, 2022 · 0 comments

mani-rai commented Jul 3, 2022

Hugging Face states:

It is based on Facebook’s RoBERTa model released in 2019. It is a large multi-lingual language model, trained on 2.5TB of filtered CommonCrawl data.

While the XLM-R paper states:

We follow the XLM approach as closely as possible, only introducing changes that improve performance at scale.

The confusion is that RoBERTa uses dynamic masking whereas XLM uses a static one. Can someone explain what exactly XLM-R does for MLM?
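To make the difference concrete, here is a minimal PyTorch sketch of the two schemes as I understand them. The mask id, the vocabulary range, and the masking function itself are simplified placeholders (real BERT/RoBERTa/XLM masking also applies the 80/10/10 mask/random/keep split), not XLM-R's actual preprocessing:

```python
import torch

MASK_ID = 250001  # hypothetical <mask> token id, for illustration only

def mask_tokens(input_ids, mask_prob=0.15):
    # Simplified MLM masking: replace ~15% of tokens with <mask>.
    # Omits the 80/10/10 mask/random/keep refinement for brevity.
    ids = input_ids.clone()
    mask = torch.rand(ids.shape) < mask_prob
    ids[mask] = MASK_ID
    return ids

# Toy "dataset" of token-id sequences.
dataset = [torch.randint(5, 100, (12,)) for _ in range(3)]

# Static masking (BERT / XLM style): mask once during preprocessing,
# so every epoch trains on the same masked positions.
static_dataset = [mask_tokens(seq) for seq in dataset]

# Dynamic masking (RoBERTa style): mask on the fly in the training loop,
# so each epoch draws fresh masked positions for the same sentence.
for epoch in range(2):
    for seq in dataset:
        masked = mask_tokens(seq)  # new random mask every time
        # ... model forward/backward would go here ...
```

So the question boils down to: when XLM-R follows the XLM approach "as closely as possible", does it keep XLM's precomputed static masks, or does the RoBERTa-style training setup bring dynamic masking with it?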
