# RoBERTa base model for Marathi Language (मराठी भाषा)
Pretrained model on Marathi language using a masked language modeling (MLM) objective. RoBERTa was introduced in
[this paper](https://arxiv.org/abs/1907.11692) and first released in
[this repository](https://github.com/pytorch/fairseq). We trained this RoBERTa model for Marathi during the Hugging Face 🤗 community week on JAX/Flax for NLP & CV.
## Model description
Marathi RoBERTa is a transformers model pretrained on a large corpus of Marathi data in a self-supervised fashion.
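As a minimal sketch, the checkpoint can be loaded with the standard `transformers` auto classes; the Hub id `flax-community/roberta-base-mr` is an assumption here and may need adjusting to your checkpoint's actual path:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumed Hub id for this checkpoint; replace it if your copy lives elsewhere.
tokenizer = AutoTokenizer.from_pretrained("flax-community/roberta-base-mr")
model = AutoModelForMaskedLM.from_pretrained("flax-community/roberta-base-mr")
```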
## Intended uses & limitations ❗️
You can use the raw model for masked language modeling, but it’s mostly intended to be fine-tuned on a downstream task.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification, or question answering.
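For the raw masked language modeling use case, a minimal sketch with the `fill-mask` pipeline (again assuming the `flax-community/roberta-base-mr` Hub id; the Marathi prompt, "Pune is a \<mask\> city in Maharashtra," is only an illustration):

```python
from transformers import pipeline

# Assumed Hub id; see the note above. RoBERTa tokenizers use "<mask>"
# as the mask token.
fill_mask = pipeline("fill-mask", model="flax-community/roberta-base-mr")

# The pipeline returns the top predictions for the masked token,
# each with its score and the filled-in sentence.
print(fill_mask("पुणे हे महाराष्ट्रातील एक <mask> शहर आहे."))
```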