Machine Translation Weekly 50: Language-Agnostic Multilingual Representations
Pre-trained multilingual representations promise to make the current best NLP models available even for low-resource languages. With a truly language-neutral pre-trained multilingual representation, we could train a task-specific model on English (or another language with available training data), and that model would then work for all languages covered by the representation model. (Except that by doing so, the models might transfer Western values into low-resource language applications.) There are several multilingual contextual embedding models (such as multilingual BERT or […]
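
To make the zero-shot transfer recipe above concrete, here is a minimal sketch using the Hugging Face transformers API with a multilingual BERT checkpoint. The checkpoint name, the two-label setup, and the Czech example sentence are illustrative assumptions on my part, and the English fine-tuning step is only indicated in a comment, not spelled out.

```python
# Minimal sketch of zero-shot cross-lingual transfer with a multilingual encoder.
# Checkpoint, label count, and example input are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "bert-base-multilingual-cased"  # multilingual BERT checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# 1) Fine-tune `model` on an English task (e.g., sentiment classification)
#    with a standard training loop or the Trainer API -- omitted here.

# 2) At inference time, feed text in another language through the same model.
#    If the representations were truly language-neutral, the English-trained
#    classifier head would transfer without any target-language training data.
batch = tokenizer(["Tento film byl vynikající."], return_tensors="pt")  # Czech input
with torch.no_grad():
    logits = model(**batch).logits
print(logits.softmax(dim=-1))
```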