Intro to Pre-Trained Models in NLP

As a scientist working in the Natural Language Processing/Understanding (NLP/NLU) domain, I use Pre-Trained Models (PTMs) on a daily basis. In fact, we all use PTMs every day without even knowing it. For example, when you talk to your favorite virtual assistant (e.g. Alexa, Siri, Google), or when you use Google Translate, these engines rely on PTMs in one way or another. I was having a hard time finding a brief primer on what PTMs are in […]
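For a concrete feel of what "using a PTM" looks like in code, here is a minimal sketch that pulls a pre-trained sentiment model through the Hugging Face `transformers` pipeline; the task and the default checkpoint it downloads are illustrative assumptions, not anything this post prescribes.

```python
# Minimal sketch: using a pre-trained model (PTM) off the shelf.
# Assumes the Hugging Face `transformers` package is installed; the
# task and the default checkpoint it pulls are illustrative choices.
from transformers import pipeline

# Download (or reuse a cached) pre-trained sentiment-analysis model.
classifier = pipeline("sentiment-analysis")

# The PTM already "knows" English from its pre-training corpus,
# so no task-specific training is needed for a quick prediction.
print(classifier("I love how well this assistant understands me!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```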

Read more

Minun and Explainable Entity Matching

Given two collections of entities, such as product listings, the entity matching (EM) problem aims to identify all pairs that refer to the same real-world object, such as a product, publication, or business. Recently, deep learning (DL) techniques have been widely applied to the EM problem and have achieved promising results. Unfortunately, the performance gain brought by DL techniques comes at the cost of reduced transparency and interpretability. The reason is that DL-based approaches are more like black-box […]
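To make the EM problem concrete, here is a toy sketch that matches two small lists of product listings using a simple token-overlap (Jaccard) score; the data, the similarity measure, and the threshold are illustrative assumptions, not Minun or a DL-based matcher.

```python
# Toy sketch of the entity matching (EM) problem: given two collections
# of product listings, flag pairs that likely refer to the same
# real-world product. The data, the Jaccard similarity, and the
# threshold are illustrative choices; this is not Minun or a DL matcher.
import re

def tokens(record):
    """Lowercase, strip punctuation, and split a listing into a token set."""
    return set(re.sub(r"[^\w\s]", " ", record.lower()).split())

def jaccard(a, b):
    """Token-overlap similarity between two token sets (0.0 to 1.0)."""
    return len(a & b) / len(a | b) if a | b else 0.0

listings_a = [
    "Apple iPhone 13 128GB Blue",
    "Sony WH-1000XM4 Wireless Headphones",
]
listings_b = [
    "iPhone 13 (128GB, Blue) by Apple",
    "Bose QuietComfort 45 Headphones",
]

THRESHOLD = 0.4  # hand-picked cutoff, purely for illustration

for left in listings_a:
    for right in listings_b:
        score = jaccard(tokens(left), tokens(right))
        if score >= THRESHOLD:
            print(f"MATCH ({score:.2f}): {left!r} <-> {right!r}")
```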

Read more

N-gram Language Model Based Next Word Suggestion — Part 1

Have you ever wondered how your phone is able to suggest the next word while you type a message? Or how it can correct your spelling on the fly, turning your angry WhatsApp message to your boss into a laughable matter because you end up telling them to go and duck themselves? One of the many ways to achieve next-word suggestion is using an n-gram language model. This article gives a brief overview of what an n-gram language model is and how […]
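As a quick taste of the idea, here is a minimal bigram sketch of next-word suggestion; the tiny training snippet and the `suggest` helper are illustrative assumptions rather than code from the article.

```python
# Minimal sketch of next-word suggestion with a bigram language model.
# The corpus and the suggest() helper are illustrative assumptions.
from collections import Counter, defaultdict

corpus = (
    "i am on my way home . "
    "i am running late . "
    "i am on the bus ."
)

# Count how often each word follows each preceding word (bigrams).
bigram_counts = defaultdict(Counter)
words = corpus.split()
for prev, nxt in zip(words, words[1:]):
    bigram_counts[prev][nxt] += 1

def suggest(prev_word, k=3):
    """Return the k most frequent words seen after `prev_word`."""
    return [word for word, _ in bigram_counts[prev_word].most_common(k)]

print(suggest("am"))  # e.g. ['on', 'running']
print(suggest("on"))  # e.g. ['my', 'the']
```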

Read more