Machine Translation and Multilinguality in May and June 2022
After a while, here is a dump of what I found most interesting on arXiv about
machine translation and multilinguality, covering May and June of this year.
Google Research published a pre-print of their NAACL
paper: SCONES (Single-label Contrastive
Objective for Non-Exclusive Sequences). The paper is about a simple trick:
they replace the softmax with a set of binary classifiers with sigmoid outputs and use the
sum of binary cross-entropies as the loss function. It gets slightly better
BLEU and BLEURT scores on WMT19, it does not suffer from the beam search curse
as much, and it is slightly faster because it does not have to normalize the
logits over the output vocabulary at every time step.
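For illustration, here is a minimal PyTorch sketch of what such a loss might look like, assuming a batch of next-token logits; the actual paper may weight the positive and negative terms differently, which I leave out here:

```python
import torch
import torch.nn.functional as F

def scones_style_loss(logits, target_ids):
    """Sum of binary cross-entropies over the vocabulary:
    the gold token is the positive label, all other tokens are negatives.
    logits: (batch, vocab_size), target_ids: (batch,)
    NB: a simplified sketch, not the exact formulation from the paper."""
    targets = F.one_hot(target_ids, num_classes=logits.size(-1)).float()
    # A sigmoid is applied to each logit independently, so there is no
    # normalization over the vocabulary as in the softmax.
    return F.binary_cross_entropy_with_logits(
        logits, targets, reduction="sum"
    ) / logits.size(0)

def softmax_loss(logits, target_ids):
    """Standard softmax cross-entropy for comparison."""
    return F.cross_entropy(logits, target_ids)
```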
Folks from UNC Chapel Hill, Meta AI, and Microsoft made an empirical study