Issue #6 – Zero-Shot Neural MT
22 Aug 2018
Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic
As we covered in last week’s post, training a neural MT engine requires a lot of data: typically millions of sentences in both languages, aligned at the sentence level, i.e. every sentence in the source language (e.g. Spanish) has a corresponding sentence in the target language (e.g. English). During training, the system looks at these bilingual sentence pairs and learns from them. The learning procedure makes use of the fact that we can generate a new translation, compare it with the available reference translation to see how the model is performing, and update the parameters accordingly.
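To make this concrete, here is a minimal sketch of that training loop in PyTorch, using a toy encoder-decoder with made-up vocabulary sizes and random token ids standing in for real bilingual data. None of the names or dimensions below come from a production system; the point is simply to show where the reference translation enters the picture.

```python
import torch
import torch.nn as nn

# Illustrative sizes only; real systems use much larger vocabularies and models
SRC_VOCAB, TGT_VOCAB, EMB, HID = 1000, 1000, 64, 128

class ToySeq2Seq(nn.Module):
    """A minimal encoder-decoder, just to show how the reference target is used."""
    def __init__(self):
        super().__init__()
        self.src_emb = nn.Embedding(SRC_VOCAB, EMB)
        self.tgt_emb = nn.Embedding(TGT_VOCAB, EMB)
        self.encoder = nn.GRU(EMB, HID, batch_first=True)
        self.decoder = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, TGT_VOCAB)

    def forward(self, src_ids, tgt_in_ids):
        _, h = self.encoder(self.src_emb(src_ids))               # encode the source sentence
        dec_out, _ = self.decoder(self.tgt_emb(tgt_in_ids), h)   # decode, conditioned on the source
        return self.out(dec_out)                                 # scores over the target vocabulary

model = ToySeq2Seq()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One fake bilingual sentence pair (random ids stand in for real aligned data)
src = torch.randint(0, SRC_VOCAB, (1, 12))   # e.g. a Spanish sentence
tgt = torch.randint(0, TGT_VOCAB, (1, 10))   # its English reference translation

for step in range(3):
    logits = model(src, tgt[:, :-1])          # predict the next target word at each position
    # Compare the model's predictions with the available reference...
    loss = loss_fn(logits.reshape(-1, TGT_VOCAB), tgt[:, 1:].reshape(-1))
    optimizer.zero_grad()
    loss.backward()                           # ...and update the parameters accordingly
    optimizer.step()
    print(f"step {step}: loss {loss.item():.3f}")
```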
In the absence of such data, this training procedure cannot be followed, as we don't have any reference translations to compare against. What now?
Pivoting
To deal with a scenario where bilingual data is not available, we often use pivoting. We first translate into an intermediate language, and from there into the target language. For example, if we need to translate from Spanish to French and we do not have Spanish-French bilingual data, we can translate from Spanish to English first, and then from English to French.
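As a rough sketch of what pivoting looks like in practice, the snippet below chains two independently trained single-direction models. It assumes the publicly available Helsinki-NLP MarianMT models on Hugging Face (not something used in the original post), but any pair of Spanish-English and English-French engines would do.

```python
from transformers import MarianMTModel, MarianTokenizer

def load(model_name):
    """Load a pretrained single-direction MT model and its tokenizer."""
    return MarianTokenizer.from_pretrained(model_name), MarianMTModel.from_pretrained(model_name)

def translate(text, tokenizer, model):
    """Translate one sentence with a single-direction model."""
    batch = tokenizer([text], return_tensors="pt")
    generated = model.generate(**batch)
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]

# Two independently trained directions: Spanish->English and English->French
es_en_tok, es_en_model = load("Helsinki-NLP/opus-mt-es-en")
en_fr_tok, en_fr_model = load("Helsinki-NLP/opus-mt-en-fr")

spanish = "El tiempo es muy agradable hoy."
english = translate(spanish, es_en_tok, es_en_model)   # step 1: Spanish -> English (the pivot)
french = translate(english, en_fr_tok, en_fr_model)    # step 2: English -> French
print(english)
print(french)
```

The obvious cost of this design is that errors made in the first step are carried into the second, which is one reason zero-shot approaches are attractive.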