Issue #103 – LEGAL-BERT: The Muppets straight out of Law School

16 Oct 2020
Author: Akshai Ramesh, Machine Translation Scientist @ Iconic

Introduction

BERT (Bidirectional Encoder Representations from Transformers) is a large-scale pre-trained autoencoding language model that has made a substantial contribution to natural language processing (NLP) and has been studied as a potentially promising way to further improve neural machine translation (NMT). “Given that BERT is based on a similar approach to neural MT in Transformers, there’s considerable interest and […]
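As a quick illustration of what an autoencoding language model such as LEGAL-BERT does, the sketch below runs masked-token prediction with the Hugging Face Transformers library. The checkpoint name (nlpaueb/legal-bert-base-uncased) and the example sentence are assumptions for illustration, not details taken from the article.

# Minimal sketch: masked-token prediction with a LEGAL-BERT checkpoint.
# Assumes the Hugging Face Transformers library and the publicly released
# nlpaueb/legal-bert-base-uncased checkpoint (an assumption, not from the article).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="nlpaueb/legal-bert-base-uncased")

# BERT-style models are trained to reconstruct masked tokens from context.
predictions = fill_mask(
    "The court held that the contract was [MASK] and unenforceable."
)

for p in predictions:
    # Each prediction carries the filled-in token and a confidence score.
    print(f"{p['token_str']:>12}  {p['score']:.3f}")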
