Issue #103 – LEGAL-BERT: The Muppets straight out of Law School

16 Oct 2020

Author: Akshai Ramesh, Machine Translation Scientist @ Iconic

Introduction

BERT (Bidirectional Encoder Representations from Transformers) is a large-scale pre-trained autoencoding language model that has made a substantial contribution to natural language processing (NLP) and has been studied as a potentially promising way to further improve neural machine translation (NMT).
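
As a quick illustration of the autoencoding (masked language model) objective mentioned above, the sketch below asks a pre-trained BERT checkpoint to fill in a masked token. It assumes the Hugging Face transformers library and the publicly hosted bert-base-uncased checkpoint; neither is prescribed by this post, and the example sentence is invented.

```python
# A minimal sketch of BERT's masked-LM objective via Hugging Face
# `transformers` (an assumption; this post does not prescribe a toolkit).
# BERT is trained to reconstruct masked tokens from bidirectional context,
# which is exactly what the fill-mask pipeline exposes.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The model scores candidate tokens for the [MASK] position using
# context from both the left and the right of the gap.
for prediction in fill_mask("The court granted the [MASK] for summary judgment."):
    print(f"{prediction['token_str']:>12}  score={prediction['score']:.3f}")
```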

"Given that BERT is based on a similar approach to neural MT in Transformers, there's considerable interest and research into how the two can be combined." – Dr. John Tinsley, Co-founder and CEO, Iconic Translation Machines

However, there has been limited investigation into how best to adapt it to specialised domains. In this post, we discuss a systematic investigation of the available strategies for applying BERT in specialised domains, taking the legal domain (LEGAL-BERT) as the case study.
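
One of the adaptation strategies typically examined in this line of work is further ("continued") pre-training of a general-purpose BERT checkpoint on in-domain text. The sketch below shows how that might look with the Hugging Face transformers and datasets libraries; the corpus file legal_corpus.txt, the starting checkpoint, and the hyperparameters are illustrative assumptions, not settings taken from the LEGAL-BERT paper.

```python
# A hedged sketch of one adaptation strategy: further pre-training a
# general-purpose BERT checkpoint on in-domain text with the masked-LM
# objective. The corpus path and hyperparameters are placeholders.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical in-domain corpus: one legal document per line.
corpus = load_dataset("text", data_files={"train": "legal_corpus.txt"})["train"]
corpus = corpus.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

# Dynamically mask 15% of tokens, as in the original BERT objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="legal-bert-fp", num_train_epochs=1),
    train_dataset=corpus,
    data_collator=collator,
)
trainer.train()
```

For context, the LEGAL-BERT paper also compares this against using BERT out of the box and pre-training from scratch on legal corpora; the authors' released checkpoints are hosted on the Hugging Face hub (e.g. nlpaueb/legal-bert-base-uncased), so the off-the-shelf route reduces to a single from_pretrained call.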

To finish reading, please visit the source site.