Issue #136 – Neural Machine Translation without Embeddings
28 Jun 2021

Author: Dr. Jingyi Han, Machine Translation Scientist @ Language Weaver

Introduction

Nowadays, Byte Pair Encoding (BPE) has become one of the most widely used tokenization strategies due to its universality and its effectiveness in handling rare words. Although much previous work shows that subword models with embedding layers generally achieve more stable and competitive results in neural machine translation (NMT), character-based (see issue #60) and byte-based subword (see issue #64) […]
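To make the contrast concrete, here is a minimal sketch in plain Python; the toy vocabulary and example sentence are illustrative assumptions, not the paper's setup. A subword tokenizer such as BPE maps text onto a learned vocabulary whose entries each require a learned embedding, while a byte-level model reads the raw UTF-8 bytes of the input, a fixed alphabet of at most 256 symbols.

# Sketch: subword-style vocabulary vs. byte-level input representation.
# Pure Python, no external libraries; the vocabulary below is a toy
# stand-in for a trained BPE model, not an actual one.

text = "Tokenization handles rare words."

# Subword tokenizers like BPE map text to IDs in a learned vocabulary,
# and each ID indexes a row of a learned embedding table.
toy_vocab = {"Token": 0, "ization": 1, "handles": 2, "rare": 3, "words": 4, ".": 5}

# Byte-level models instead consume the raw UTF-8 bytes: the "vocabulary"
# is fixed at 256 symbols, so there is no open-ended subword inventory
# to learn embeddings for.
byte_ids = list(text.encode("utf-8"))
print(byte_ids[:10])   # [84, 111, 107, 101, 110, 105, 122, 97, 116, 105]
print(len(byte_ids))   # sequences are longer than their subword counterparts

The trade-off the sketch exposes is the one at stake in byte-based modeling: a tiny fixed vocabulary at the cost of longer input sequences.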