Issue #130 – Shared-Private Bilingual Word Embeddings for NMT

13 May 2021


Author: Akshai Ramesh, Machine Translation Scientist @ Iconic

Introduction

In recent years, there has been a significant amount of research on improving representation learning for neural machine translation (NMT). In today's blog post, we look at the work of Liu et al. (2019), who propose a novel approach called Shared-Private Bilingual Word Embeddings to improve the word representations used in NMT.

Word Embeddings

A word representation is a mathematical object associated with each word, most often a vector. NMT models use word embeddings to capture the semantic and syntactic properties of words. An NMT model typically relies on three sets of word embeddings: