An Imitation Game for Learning Semantic Parsers from User Interaction

November 16, 2020 By: Ziyu Yao, Yiqi Tang, Wen-tau Yih, Huan Sun, Yu Su Abstract Despite widely successful applications, building a semantic parser remains a tedious process in practice, with challenges from costly data annotation and privacy risks. We suggest an alternative, human-in-the-loop methodology for learning semantic parsers directly from users. A semantic parser should be introspective of its uncertainties and prompt for user demonstrations when uncertain. In doing so, it also gets to imitate the user behavior […]
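The abstract's core loop (introspect uncertainty, prompt the user, imitate the demonstration) can be sketched as a toy interaction. This is a hedged illustration, not the paper's actual system: the `toy_parser`, `toy_oracle`, and `memory` cache are invented stand-ins for a real semantic parser, a real user, and a real learning update.

```python
memory = {}  # demonstrations the parser has "imitated" so far (toy stand-in for training)

def interactive_parse(utterance, parser, oracle, threshold=0.9):
    """Return a parse, deferring to a user demonstration when the parser is unsure."""
    parse, confidence = parser(utterance)
    if confidence < threshold:
        parse = oracle(utterance)   # the user demonstrates the correct parse
        memory[utterance] = parse   # imitate: remember the demonstration
    return parse

def toy_parser(utt):
    if utt in memory:
        return memory[utt], 1.0     # confident after seeing a demonstration
    return "UNKNOWN", 0.1           # low confidence on unseen input

toy_oracle = lambda utt: f"SELECT * FROM flights  -- parse of: {utt}"

first = interactive_parse("show all flights", toy_parser, toy_oracle)
again = interactive_parse("show all flights", toy_parser, toy_oracle)
# the second call reuses the demonstration without prompting the user
```

The design point the abstract makes is exactly this asymmetry: user effort is spent only where the parser's own confidence is low.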

Read more

Generating Fact Checking Briefs

Abstract Fact checking at scale is difficult—while the number of active fact checking websites is growing, it remains too small for the needs of the contemporary media ecosystem. However, despite good intentions, contributions from volunteers are often error-prone, and thus in practice restricted to claim detection. We investigate how to increase the accuracy and efficiency of fact checking by providing information about the claim before performing the check, in the form of natural language briefs. We investigate passage-based briefs, containing […]

Read more

Measuring Systematic Generalization in Neural Proof Generation with Transformers

November 27, 2020 By: Nicolas Gontier, Koustuv Sinha, Siva Reddy, Christopher Pal Abstract We are interested in understanding how well Transformer language models (TLMs) can perform reasoning tasks when trained on knowledge encoded in the form of natural language. We investigate systematic generalization abilities on an inductive logical reasoning task in natural language, which involves reasoning over relationships between entities grounded in first-order logical proofs. Specifically, we perform soft theorem-proving by leveraging TLMs to generate logical proofs represented in natural […]

Read more

Deep Transformers with Latent Depth

Abstract The Transformer model has achieved state-of-the-art performance in many sequence modeling tasks. However, how to leverage model capacity with large or variable depths is still an open challenge. We present a probabilistic framework to automatically learn which layer(s) to use by learning the posterior distributions of layer selection. As an extension of this framework, we propose a novel method to train one shared Transformer network for multilingual machine translation with different layer selection posteriors for each language pair. The […]
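The idea of learning which layers to use can be sketched with stochastic gates on residual branches. This is a minimal illustration of Gumbel-sigmoid layer gating, an assumption on my part, not the paper's actual parameterization; the toy `layers` and `select_logits` are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_sigmoid(logit, temperature=1.0):
    """Sample a soft Bernoulli gate in (0, 1) from a layer-selection logit."""
    u = rng.uniform(1e-6, 1.0 - 1e-6)
    noise = np.log(u) - np.log(1.0 - u)  # logistic noise
    return 1.0 / (1.0 + np.exp(-(logit + noise) / temperature))

def forward(x, layers, select_logits, temperature=1.0):
    """Apply each layer scaled by its sampled gate; z near 0 softly skips the layer."""
    for layer, logit in zip(layers, select_logits):
        z = gumbel_sigmoid(logit, temperature)
        x = x + z * layer(x)  # gated residual connection
    return x

# toy "layers": simple nonlinear maps standing in for Transformer blocks
layers = [lambda x, w=w: np.tanh(w * x) for w in (0.5, 1.0, 2.0)]
select_logits = np.array([2.0, -2.0, 0.0])  # the middle layer is usually skipped
out = forward(np.ones(4), layers, select_logits)
```

In the multilingual setting the abstract describes, one could hold a separate `select_logits` vector per language pair while sharing the `layers` themselves.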

Read more

Resource Constrained Dialog Policy Learning via Differentiable Inductive Logic Programming

Abstract Motivated by the needs of resource constrained dialog policy learning, we introduce dialog policy via differentiable inductive logic (DILOG). We explore the tasks of one-shot learning and zero-shot domain transfer with DILOG on SimDial and MultiWoZ. Using a single representative dialog from the restaurant domain, we train DILOG on the SimDial dataset and obtain 99+% in-domain test accuracy. We also show that the trained DILOG zero-shot transfers to all other domains with 99+% accuracy, proving the suitability of DILOG […]

Read more

A Review of 2020 and Trends in 2021 – A Technical Overview of Machine Learning and Deep Learning!

Introduction Data science is not a choice anymore. It is a necessity. 2020 is almost in the books now. What a crazy year from whichever standpoint you look at it. A pandemic raged around the world and yet it failed to dim the light on data science. The thirst to learn more continued unabated in our community and we saw some incredible developments and breakthroughs this year. From OpenAI’s mind-boggling GPT-3 framework to Facebook’s DETR model, this was a year […]

Read more

Issue #112 - Translating markup tags in Neural Machine Translation

17 Dec 2020 Author: Dr. Patrik Lambert, Senior Machine Translation Scientist @ Iconic Introduction Text to be translated is often encapsulated in structured documents containing inline tags in different formats, such as XML, HTML, Microsoft Word, PDF, XLIFF, etc. Transferring these inline tags into the target language is not a trivial task. However, it is a crucial component of the MT system, because correct tag placement ensures good readability of […]
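One common approach to the tag-transfer problem is to mask inline tags with placeholders before translation and restore them afterwards. The sketch below is a simplified illustration of that idea, not Iconic's method; the `__TAGn__` placeholder format is an assumption, and the string replacements stand in for a real MT system.

```python
import re

TAG = re.compile(r"</?[^>]+>")  # matches inline markup tags like <b> and </b>

def mask_tags(text):
    """Replace inline tags with numbered placeholders the MT system can copy through."""
    tags = []
    def repl(match):
        tags.append(match.group(0))
        return f"__TAG{len(tags) - 1}__"
    return TAG.sub(repl, text), tags

def unmask_tags(text, tags):
    """Restore the original tags from the placeholders in the translated output."""
    for i, tag in enumerate(tags):
        text = text.replace(f"__TAG{i}__", tag)
    return text

masked, tags = mask_tags("Click <b>here</b> to continue.")
# masked == "Click __TAG0__here__TAG1__ to continue."

# stand-in for machine translation (English -> French)
translated = (masked.replace("Click", "Cliquez sur")
                    .replace("here", "ici")
                    .replace("to continue", "pour continuer"))
restored = unmask_tags(translated, tags)
# restored == "Cliquez sur <b>ici</b> pour continuer."
```

The hard part in practice, which this sketch glosses over, is that word reordering during translation can move a placeholder away from the span it should wrap.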

Read more

Research at Microsoft 2020: Addressing the present while looking to the future

Microsoft researchers pursue the big questions about what the world will be like in the future and the role technology will play. Not only do they take on the responsibility of exploring the long-term vision of their research, but they must also be ready to react to the immediate needs of the present. This year in particular, they were asked to use their roles as futurists to address pressing societal challenges. In early 2020, as countries began responding to COVID-19 […]

Read more

Calculus Books for Machine Learning

Knowledge of calculus is not required to get results and solve problems in machine learning or deep learning. However, knowing some calculus will help you in a number of ways: reading mathematical notation in books and papers, understanding terms used to describe model fitting such as “gradient,” and understanding the learning dynamics of models fit via optimization, such as neural networks. Calculus is a challenging topic as taught at a university level, but you […]

Read more

Engineering More Reliable Transportation with Machine Learning and AI at Uber

In recent months, Uber Engineering has shared how we use machine learning (ML), artificial intelligence (AI), and advanced technologies to create more seamless and reliable experiences for our users. From introducing a Bayesian neural network architecture that more accurately estimates trip growth, to our real-time features prediction system, and even our own internal ML-as-a-service platform, Michelangelo, these two fields are critical to supporting Uber’s mission of developing reliable transportation solutions for everyone […]

Read more