Ideas: Designing AI for people with Abigail Sellen

[MUSIC FADES]  My guest on this episode is Abigail Sellen, known by her friends and colleagues as Abi. A social scientist by training and an expert in human-computer interaction, Abi has a long list of accomplishments and honors, and she’s a fellow of many technical academies and societies. But today I’m talking to her in her role as distinguished scientist and lab director of Microsoft Research Cambridge, UK, where she oversees a diverse portfolio of research, some of which supports […]

Read more

Abstracts: May 20, 2024

ANDREY KOLOBOV: Thank you for having me! HUIZINGA: So let’s start with a sort of abstract of your abstract. In just a few sentences, tell us about the problem your research addresses and more importantly, why we should care about it.  KOLOBOV: Right, so the overarching goal of this work—and I have to thank my collaborators from ETH Zürich, without whom this work would have been impossible—so the overarching goal of our work was to give drones the ability to […]

Read more

What’s Your Story: Jacki O’Neill

GEHRKE: We just had a discussion maybe a couple of years ago, right, when you were just in transition to Africa. So it’s really great to have you here and both learn a little bit what’s happening there, but also to learn a bit more about your story. Where did you grow up, and how did you end up here at Microsoft? O’NEILL: Yeah, thanks for asking that. I’ve had a very, well, it’s definitely not been a straight road […]

Read more

Research Focus: Week of May 13, 2024

Welcome to Research Focus, a series of blog posts that highlights notable publications, events, code/datasets, new hires and other milestones from across the research community at Microsoft. NEW RESEARCH: Injecting New Knowledge into Large Language Models via Supervised Fine-Tuning. Large language models (LLMs) have shown remarkable performance in generating text similar to that created by people, proving to be a valuable asset across various applications. However, adapting these models to incorporate new, out-of-domain knowledge remains a challenge […]
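The post above highlights supervised fine-tuning as a route for injecting new knowledge into an LLM. As a rough, minimal sketch of that workflow (not the paper's configuration), the example below fine-tunes a causal language model on a hypothetical text file of new facts using the Hugging Face transformers and datasets libraries; the model name, file path, and hyperparameters are placeholders.

```python
# Minimal sketch: supervised fine-tuning to expose a causal LM to new text.
# The model name, data file, and hyperparameters are placeholders,
# not the setup used in the paper.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "gpt2"  # placeholder base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# "new_facts.txt" is a hypothetical file with one out-of-domain passage per line.
dataset = load_dataset("text", data_files={"train": "new_facts.txt"})["train"]
tokenized = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="sft-out", num_train_epochs=3,
                           per_device_train_batch_size=4, learning_rate=2e-5),
    train_dataset=tokenized,
    # mlm=False gives the standard next-token (causal LM) training objective.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```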

Read more

MatterSim: A deep-learning model for materials under real-world conditions

In the quest for groundbreaking materials crucial to nanoelectronics, energy storage, and healthcare, a critical challenge looms: predicting a material’s properties before it is even created. This is no small feat, given the vast number of possible combinations of the 118 elements in the periodic table and the wide range of temperatures and pressures under which materials are synthesized and operated. These factors drastically affect atomic interactions within materials, making accurate property prediction and behavior simulation exceedingly demanding. Here at Microsoft […]

Read more

LoftQ: Reimagining LLM fine-tuning with smarter initialization

This research paper was presented at the 12th International Conference on Learning Representations (ICLR 2024), the premier conference dedicated to the advancement of deep learning. Large language models (LLMs) use extensive datasets and advanced algorithms to generate nuanced, context-sensitive content. However, their development requires substantial computational resources. To address this, we developed LoftQ, an innovative technique that streamlines the fine-tuning process—which is used to […]
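For readers who want to try the idea, a minimal sketch of LoftQ-style initialization as exposed by the Hugging Face PEFT library follows; the model name, adapter rank, and target modules are illustrative assumptions rather than the paper's exact setup.

```python
# Minimal sketch: LoftQ-initialized LoRA fine-tuning via the PEFT library.
# Model name, rank, and target modules are illustrative assumptions.
from peft import LoftQConfig, LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("gpt2")  # placeholder base model

loftq_config = LoftQConfig(loftq_bits=4)  # quantize backbone weights to 4 bits
lora_config = LoraConfig(
    init_lora_weights="loftq",   # use LoftQ to initialize the LoRA adapters
    loftq_config=loftq_config,
    r=16,                        # adapter rank (illustrative)
    lora_alpha=16,
    target_modules=["c_attn"],   # attention projection in GPT-2 (illustrative)
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()
# The returned model can then be fine-tuned with a standard Trainer loop.
```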

Read more

Abstracts: May 6, 2024

MICHEL GALLEY: Thank you for having me. HUIZINGA: So I like to start with a distillation or sort of an elevator pitch of your research. Tell us in just a couple sentences what problem or issue your paper addresses and why we should care about it. GALLEY: So this paper is about evaluating large foundation models. So it’s a very important part of researching large language models because it’s a good way to evaluate, kind of, the capabilities—what these models […]

Read more

Research Focus: Week of April 29, 2024

Welcome to Research Focus, a series of blog posts that highlights notable publications, events, code/datasets, new hires and other milestones from across the research community at Microsoft. NEW RESEARCH: Can Large Language Models Transform Natural Language Intent into Formal Method Postconditions? Informal natural language that describes code functionality, such as code comments or function documentation, may contain substantial information about a program’s intent. However, there is no guarantee that a program’s implementation aligns with its natural […]
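To make the idea concrete, here is a small hand-written illustration (not taken from the paper) of what a natural-language intent and a formal postcondition derived from it might look like in Python.

```python
# Hand-written illustration (not from the paper): a docstring states the intent,
# and a postcondition expresses that intent as a checkable property.
from typing import List

def unique_sorted(xs: List[int]) -> List[int]:
    """Return the distinct elements of xs in ascending order."""
    return sorted(set(xs))

def postcondition(xs: List[int], result: List[int]) -> bool:
    # A postcondition an LLM might derive from the docstring above:
    # the result is sorted, has no duplicates, and contains exactly the
    # elements that appear in the input.
    return (result == sorted(result)
            and len(result) == len(set(result))
            and set(result) == set(xs))

assert postcondition([3, 1, 3, 2], unique_sorted([3, 1, 3, 2]))
```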

Read more

SIGMA: An open-source mixed-reality system for research on physical task assistance

Imagine if every time you needed to complete a complex physical task, like building a bicycle, fixing a broken water heater, or cooking risotto for the first time, you had a world-class expert standing over your shoulder and guiding you through the process. In addition to telling you the steps to follow, this expert would also tune the instructions to your skill set, deliver them with the right timing […]

Read more