Simulated Annealing From Scratch in Python

Simulated annealing is a stochastic global search optimization algorithm, meaning it uses randomness as part of the search process. This makes it well suited to nonlinear objective functions where other local search algorithms struggle. Like the stochastic hill climbing local search algorithm, it modifies a single solution and searches the relatively local area of the search space until a local optimum is located. Unlike hill climbing, it may accept worse solutions […]
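A minimal sketch of the core acceptance logic, using a hypothetical one-dimensional objective (f(x) = x², not the tutorial's own example):

```python
import math
import random

def objective(x):
    # Simple convex objective for illustration; minimum at x = 0
    return x ** 2

def simulated_annealing(n_iter=1000, step=0.1, temp0=10.0):
    # Start from a random point in [-5, 5]
    best = curr = random.uniform(-5.0, 5.0)
    best_eval = curr_eval = objective(curr)
    for i in range(n_iter):
        # Take a random step from the current point
        cand = curr + random.gauss(0.0, step)
        cand_eval = objective(cand)
        # Track the best solution seen so far
        if cand_eval < best_eval:
            best, best_eval = cand, cand_eval
        # Temperature decays over time, so worse moves become less likely
        t = temp0 / (i + 1)
        # Accept if better, or probabilistically if worse (the key difference
        # from plain hill climbing)
        diff = cand_eval - curr_eval
        if diff < 0 or random.random() < math.exp(-diff / t):
            curr, curr_eval = cand, cand_eval
    return best, best_eval

print(simulated_annealing())
```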

Read more

A Review of the Neural History of Natural Language Processing

This post discusses major recent advances in NLP, focusing on neural network-based methods. It originally appeared on the AYLIEN blog and is the first in a two-part series. The series expands on the Frontiers of Natural Language Processing session organized by Herman Kamper and me at the Deep Learning Indaba 2018. Slides of the entire session can be found here. The second post discusses […]

Read more

Issue #118 – EDITOR: a Repositioning Transformer with Soft Lexical Constraints

18 Feb 2021. Author: Dr. Karin Sim, Machine Translation Scientist @ Iconic. EDITOR: an Edit-Based Transformer with Repositioning for Neural MT with Soft Lexical Constraints. Introduction: on our blog a couple of weeks ago (issue 116), Patrik explored fully non-autoregressive machine translation, highlighting tricks such as dependency reduction that enabled quality to be maintained while retaining the speed-up gains over autoregressive MT. Today we revisit non-autoregressive translation (NAT), examining […]

Read more

No Free Lunch Theorem for Machine Learning

The No Free Lunch Theorem is often invoked in optimization and machine learning, frequently with little understanding of what it means or implies. The theorem states that, when performance is averaged across all possible problems, all optimization algorithms perform equally well. It therefore implies that there is no single best optimization algorithm. Because of the close relationship between optimization, search, and machine learning, it also implies that there is no single best machine learning algorithm for […]
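One common formal statement, sketched here in Wolpert and Macready's 1997 notation, where d_m^y denotes the sequence of m objective values an algorithm a has visited on objective function f:

```latex
% For any pair of algorithms a_1 and a_2, performance summed over
% all possible objective functions f is identical:
\sum_{f} P(d_m^y \mid f, m, a_1) = \sum_{f} P(d_m^y \mid f, m, a_2)
```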

Read more

Dialogue Summarization: A Deep Learning Approach

This article was published as part of the Data Science Blogathon. Dialogue summarization: its types and methodology (image credit: Aseem Srivastava). Summarizing long pieces of text is a challenging problem. Summarization is done primarily in two ways: the extractive approach and the abstractive approach. In this work, we break down the problem of meeting summarization into extractive and abstractive components, which together generate a summary of the conversation. What is dialogue summarization? Humans are social animals; we exchange […]
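For a flavor of the abstractive approach, here is a minimal sketch assuming the Hugging Face transformers library and its default summarization model; the dialogue and model choice are illustrative, not the setup used in the article:

```python
from transformers import pipeline

# Abstractive approach: a pretrained seq2seq model rewrites the dialogue
# in its own words (default model is illustrative, not the article's)
summarizer = pipeline("summarization")

dialogue = (
    "Alice: Can we move the meeting to Friday? "
    "Bob: Friday works, same time? "
    "Alice: Yes, 10am. Bob: Great, see you then."
)
print(summarizer(dialogue, max_length=30, min_length=5)[0]["summary_text"])
```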

Read more

Python: Get Number of Days Between Dates

Introduction: in this tutorial, we'll take a look at how to get the number of days between two dates in Python. We'll be using the built-in datetime module, which lets you easily work with datetime objects in Python. Creating a datetime object: as datetime is a built-in module, you can access it right away by importing it at the top of your Python file. You can construct datetime objects in a few different ways: from datetime import datetime […]
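As a quick illustration (a minimal sketch with hypothetical dates, not the tutorial's own example):

```python
from datetime import datetime

# Construct two datetime objects (year, month, day)
start = datetime(2021, 2, 1)
end = datetime(2021, 3, 15)

# Subtracting datetimes yields a timedelta; .days gives whole days
delta = end - start
print(delta.days)  # 42
```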

Read more

Python: Check if Variable is a Dictionary

Introduction: variables act as containers for storing data. A developer can use type hints when creating variables or passing arguments; however, type hints are optional in Python, and many codebases, old and new, don't yet have them. It is more common for a variable in Python to carry no information about the type it stores. If we had code that needed a dictionary but lacked type hints, how can we avoid errors if the variable used is not a […]
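A minimal sketch of the usual checks (the `config` variable is hypothetical):

```python
# isinstance() is the idiomatic check and also accepts dict subclasses
config = {"host": "localhost", "port": 8080}
if isinstance(config, dict):
    print("config is a dictionary")

# type() compares the exact type, excluding subclasses such as OrderedDict
print(type(config) is dict)  # True
```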

Read more

A Gentle Introduction to Stochastic Optimization Algorithms

Stochastic optimization refers to the use of randomness in the objective function or in the optimization algorithm. Challenging optimization problems, such as high-dimensional nonlinear objectives, may contain multiple local optima in which deterministic optimization algorithms can get stuck. Stochastic optimization algorithms provide an alternative approach: by permitting less optimal local decisions within the search procedure, they can increase the probability of locating the global optimum of the objective function. In this tutorial, you will […]
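As a tiny illustration of the idea, here is a sketch of one of the simplest stochastic algorithms, random search, on a hypothetical multimodal objective (not an example from the tutorial itself):

```python
import math
import random

def objective(x):
    # Multimodal objective: many local optima, global optimum at x = 0
    return x ** 2 + 5.0 * math.sin(5.0 * x) ** 2

def random_search(n_samples=10_000, bounds=(-5.0, 5.0)):
    # Draw candidates uniformly at random; randomness lets the search
    # land in any basin, so it cannot get stuck in one local optimum
    best, best_eval = None, float("inf")
    for _ in range(n_samples):
        x = random.uniform(*bounds)
        fx = objective(x)
        if fx < best_eval:
            best, best_eval = x, fx
    return best, best_eval

print(random_search())
```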

Read more

Machine Translation Weekly 68: Pre-editing of MT inputs

Today, I am going to comment on a paper that systematically explores something that many MT users probably do: pre-editing, i.e., editing the source sentence to get better output from an MT system treated as a black box. The paper, Understanding Pre-Editing for Black-Box Neural Machine Translation, is by authors from Nagoya University and NICT in Japan and will appear at this year's EACL. Pre-editing is something I often do when I use automatic […]

Read more