Research Collection: The Unseen History of Audio and Acoustics Research at Microsoft

Audio and Acoustics Research at Microsoft

Getting the sound right is a crucial ingredient in natural user interfaces, immersive gaming, realistic virtual and mixed reality, and ubiquitous computing. Audio also plays an important role in assistive technologies for people who are blind or have low vision, and speech recognition and processing can help support those who are deaf or hard of hearing. Although computers have been capable of playing and processing high-fidelity audio for many decades, there are many frontiers […]

Adversarial robustness as a prior for better transfer learning

Editor’s note: This post and its research are the collaborative efforts of our team, which includes Andrew Ilyas (PhD student, MIT), Logan Engstrom (PhD student, MIT), Aleksander Mądry (Professor, MIT), and Ashish Kapoor (Partner Research Manager). In practical machine learning, it is desirable to transfer learned knowledge from some “source” task to downstream “target” tasks. This is known as transfer learning: a simple and efficient way to obtain performant machine learning models, especially when there is little training […]
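The transfer-learning setup described above can be sketched as a linear probe: freeze a feature extractor learned on the source task and fit only a new head on the target task. This is a minimal illustration, not the paper's method (which studies adversarially robust source models); the random-projection "source model" and all names here are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical frozen "source" model: a fixed projection standing in
# for features pretrained on a source task (never updated below).
W_src = rng.normal(size=(32, 8))

def extract_features(x):
    """Frozen source-model features."""
    return np.tanh(x @ W_src)

# Tiny synthetic "target" task: labels are linear in the feature space.
X = rng.normal(size=(200, 32))
true_w = rng.normal(size=8)
y = (extract_features(X) @ true_w > 0).astype(float)

# Transfer learning as a linear probe: fit only a new head on frozen features.
F = extract_features(X)
head, *_ = np.linalg.lstsq(F, 2 * y - 1, rcond=None)
pred = (F @ head > 0).astype(float)
accuracy = (pred == y).mean()
print(f"linear-probe accuracy on target task: {accuracy:.2f}")
```

Because only the small head is trained, this is cheap even when target data is scarce, which is the regime the post highlights.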

Issue #93 – Semantic Neural Machine Translation using AMR

06 Aug 2020
Author: Dr. Karin Sim, Machine Translation Scientist @ Iconic

Semantic representations were part of the very early Machine Translation (MT) systems, yet have played little role in recent Neural MT (NMT) systems. Given that a good translation should reflect the meaning of the source text, this seems an important area to focus on, particularly since the abstraction could potentially help handle data sparsity. In today’s blog post, we […]

ICML 2020 highlights: A Transformer-based RL agent, causal ML for increased privacy, and more

With over 50 papers from Microsoft accepted at this year’s International Conference on Machine Learning (ICML 2020), a number of which were presented in virtual workshops, Microsoft researchers are in full summer swing when it comes to advancing machine learning in accessibility, privacy, healthcare, and other areas. As Microsoft Partner Research Manager and ICML President John Langford puts it, “ICML is a very broad conference, so its specialty is in some sense ‘all of the above.’” But Langford goes on […]

Three new reinforcement learning methods aim to improve AI in gaming and beyond

Reinforcement learning (RL) provides exciting opportunities for game development, as highlighted in our recently announced Project Paidia—a research collaboration between our Game Intelligence group at Microsoft Research Cambridge and game developer Ninja Theory. In Project Paidia, we push the state of the art in reinforcement learning to enable new game experiences. In particular, we focus on developing game agents that learn to genuinely collaborate in teams with human players. In this blog post we showcase three of our recent research […]

Issue #90 – Tangled up in BLEU: Reevaluating how we evaluate automatic metrics in Machine Translation

16 Jul 2020
Author: Dr. Karin Sim, Machine Translation Scientist @ Iconic

Automatic metrics play a crucial role in Machine Translation (MT). They are used to tune MT systems during the development phase, to decide which model is best, and subsequently to assess the accuracy of the final translations. Currently, the performance of these automatic metrics is judged by seeing how well they […]
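BLEU, the metric this issue reexamines, combines clipped n-gram precision with a brevity penalty. The sketch below is a simplified sentence-level version for illustration only; real evaluations use corpus-level BLEU with standardized tokenization and smoothing (e.g. via sacreBLEU), which this deliberately omits.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def sentence_bleu(candidate, reference, max_n=4):
    """Simplified sentence-level BLEU: geometric mean of clipped
    n-gram precisions (n = 1..max_n) times a brevity penalty."""
    precisions = []
    for n in range(1, max_n + 1):
        cand = Counter(ngrams(candidate, n))
        ref = Counter(ngrams(reference, n))
        overlap = sum(min(c, ref[g]) for g, c in cand.items())
        precisions.append(overlap / max(sum(cand.values()), 1))
    if min(precisions) == 0:
        return 0.0  # unsmoothed: any empty precision zeroes the score
    log_avg = sum(math.log(p) for p in precisions) / max_n
    bp = 1.0 if len(candidate) > len(reference) else \
        math.exp(1 - len(reference) / max(len(candidate), 1))
    return bp * math.exp(log_avg)

ref = "the cat sat on the mat".split()
print(sentence_bleu(ref, ref))                    # → 1.0
print(sentence_bleu("the cat sat".split(), ref))  # → 0.0 (no 4-grams match)
```

The string-matching nature visible here, with no notion of meaning, is exactly what motivates reevaluating how such metrics are judged.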

Issue #88 – Multilingual Denoising Pre-training for Neural Machine Translation

02 Jul 2020
Author: Dr. Chao-Hong Liu, Machine Translation Scientist @ Iconic

Pre-training has been used in many natural language processing (NLP) tasks with significant improvements in performance. In neural machine translation (NMT), pre-training is mostly applied to building blocks of the whole system, e.g. the encoder or decoder. In a previous post (#70), we compared several approaches using pre-training with masked language models. In this post, we take a closer […]
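The denoising objective behind this line of work corrupts a sentence and trains the model to reconstruct the original. As a rough sketch of the text-infilling noise used in multilingual denoising pre-training, a contiguous span is collapsed into a single mask token; the actual method samples span lengths from a Poisson distribution and masks a fraction of all tokens, whereas this toy version masks one fixed-length span.

```python
import random

MASK = "<mask>"

def infill_noise(tokens, span_len=3, seed=0):
    """Toy text-infilling noise: replace one contiguous span of
    `span_len` tokens with a single mask token. A denoising model
    is then trained to reconstruct the original sequence."""
    rng = random.Random(seed)
    if len(tokens) <= span_len:
        return [MASK]
    start = rng.randrange(len(tokens) - span_len + 1)
    return tokens[:start] + [MASK] + tokens[start + span_len:]

src = "the quick brown fox jumps over the lazy dog".split()
noised = infill_noise(src)
print(noised)  # one 3-token span collapsed into a single <mask>
```

Because the mask hides the span's length as well as its content, reconstruction forces the model to predict how much text is missing, not just which tokens.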

Issue #87 – YiSi – A Unified Semantic MT Quality Evaluation and Estimation Metric

25 Jun 2020
Author: Dr. Karin Sim, Machine Translation Scientist @ Iconic

Automatic evaluation is an issue that has long troubled machine translation (MT): how do we evaluate how good the MT output is? Traditionally, BLEU has been the go-to metric, as it is simple to use across language pairs. However, it is overly simplistic, evaluating string matches against a single reference translation. More sophisticated metrics […]

Issue #86 – Neural MT with Levenshtein Transformer

18 Jun 2020
Author: Dr. Patrik Lambert, Senior Machine Translation Scientist @ Iconic

The standard Transformer model is autoregressive, meaning that the prediction of each target word is conditioned on the previously predicted words. The output is generated from left to right, with no chance to revise a past decision and no way to take into account the words to the right of the current position. In a recent post (#82), […]

Issue #85 – Applying Terminology Constraints in Neural MT

11 Jun 2020
Author: Dr. Chao-Hong Liu, Machine Translation Scientist @ Iconic

Maintaining consistent terminology translation in Neural Machine Translation (NMT) is a more challenging task than in Statistical MT (SMT). In this post, we review a method proposed by Dinu et al. (2019) to train NMT systems to use custom terminology. Applying terminology constraints to translation may appear to be an easy task. It is a […]
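One idea in Dinu et al. (2019) is to inject the required target terms into the source sentence at training time, so the model learns to copy constrained terms into its output. The sketch below shows only that inline-annotation step in its simplest "append" form; the terminology dictionary is a made-up example, and the paper additionally uses source factors to mark which tokens are original versus injected.

```python
# Hypothetical EN→FR terminology dictionary (illustrative only).
TERMS = {"software": "logiciel"}

def annotate(tokens, terms):
    """Append the required target term after each matched source
    term, producing the annotated source used for training."""
    out = []
    for tok in tokens:
        out.append(tok)
        if tok in terms:
            out.append(terms[tok])
    return out

print(annotate("the software is fast".split(), TERMS))
# → ['the', 'software', 'logiciel', 'is', 'fast']
```

At inference time the same annotation is applied to the input, and the trained model tends to reproduce the injected term, without any hard constraint on the decoder.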
