Ideas: Exploring AI frontiers with Rafah Hosn

[MUSIC FADES]  My guest today is Rafah Hosn. She’s a partner, group product manager for AI Frontiers at Microsoft Research. I’d call Rafah a sort of organizational conductor, working both with leaders to drive clarity around the mission and with program managers to make sure they have solid operational strategies to execute on it. Rafah has mad skills in bringing research ideas from lab to life, and I’m thrilled to talk to her today. Rafah Hosn, welcome to Ideas!  RAFAH HOSN: […]

Read more

SAMMO: A general-purpose framework for prompt optimization

Large language models (LLMs) have revolutionized a wide range of tasks and applications that were previously reliant on manually crafted machine learning (ML) solutions, streamlining them through automation. However, despite these advances, a notable challenge persists: the need for extensive prompt engineering to adapt these models to new tasks. New generations of language models, such as GPT-4 and Mixtral 8x7B, can process much longer input texts, enabling richer context and more detailed instructions […]
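The excerpt above motivates automated prompt optimization without showing how it works. As a rough illustration only, here is a minimal, generic Python sketch (not SAMMO’s actual API) that searches over candidate prompt templates and keeps the one that scores best on a small labeled set; `call_llm`, the candidate templates, and the scoring rule are all placeholders you would swap for your own model client and task.

```python
# Minimal, generic prompt-optimization sketch (illustrative only; not SAMMO's API).
from typing import Callable, List, Tuple

def call_llm(prompt: str) -> str:
    """Placeholder: send `prompt` to your LLM of choice and return its text output."""
    raise NotImplementedError

def accuracy(template: str, examples: List[Tuple[str, str]],
             llm: Callable[[str], str]) -> float:
    """Score a prompt template by exact-match accuracy on a small labeled set."""
    hits = 0
    for text, label in examples:
        answer = llm(template.format(input=text)).strip().lower()
        hits += int(answer == label.lower())
    return hits / len(examples)

def optimize_prompt(candidates: List[str],
                    examples: List[Tuple[str, str]],
                    llm: Callable[[str], str] = call_llm) -> str:
    """Return the candidate template that performs best on the labeled examples."""
    return max(candidates, key=lambda t: accuracy(t, examples, llm))

# Hypothetical usage:
# best = optimize_prompt(
#     candidates=[
#         "Classify the sentiment of: {input}\nAnswer with positive or negative.",
#         "Is the following review positive or negative?\n{input}",
#     ],
#     examples=[("Great movie!", "positive"), ("Terrible plot.", "negative")],
# )
```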

Read more

Ideas: Language technologies for everyone with Kalika Bali

[MUSIC FADES]  I’m excited to be live in the booth today with Kalika Bali, a principal researcher at Microsoft Research India. Kalika is working on language technologies that she hopes will bring the benefits of generative AI to under-resourced and underserved language communities around the world. Kalika, it’s a pleasure to speak with you today. Welcome to Ideas!  KALIKA BALI: Thank you. Thank you, Gretchen. Thank you for having me.  HUIZINGA: So before we dive in on the big ideas […]

Read more

Research Focus: Week of April 1, 2024

Welcome to Research Focus, a series of blog posts that highlights notable publications, events, code/datasets, new hires and other milestones from across the research community at Microsoft. NEW RESEARCH In the same way that tools can help people complete tasks beyond their innate abilities, tools are essential for large language models (LLMs) to acquire up-to-date information and take consequential actions in external environments. Existing work on tool-augmented LLMs primarily focuses on the broad coverage of tools and the flexibility of […]
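To make the idea of tool-augmented LLMs concrete, here is a minimal Python sketch of a dispatch loop: the model’s (hypothetical) structured output names a tool and its arguments, the harness executes it, and the result is fed back into the conversation. The tool registry and the `model_step` function are assumptions for illustration, not the API of any particular system described in the research.

```python
# Minimal sketch of a tool-augmented LLM loop (illustrative; not a specific system's API).
import json
from datetime import datetime, timezone

# A tiny registry of tools the model is allowed to call.
TOOLS = {
    "current_time": lambda: datetime.now(timezone.utc).isoformat(),
    "add": lambda a, b: a + b,
}

def model_step(conversation: list) -> str:
    """Placeholder for an LLM call that returns either a tool request as JSON,
    e.g. '{"tool": "add", "args": {"a": 2, "b": 3}}', or a plain-text answer."""
    raise NotImplementedError

def run(conversation: list, max_turns: int = 5) -> str:
    """Alternate between model output and tool execution until a final answer."""
    for _ in range(max_turns):
        output = model_step(conversation)
        try:
            request = json.loads(output)          # structured output -> tool request
        except json.JSONDecodeError:
            return output                         # plain text -> treat as final answer
        result = TOOLS[request["tool"]](**request.get("args", {}))
        conversation.append({"role": "tool", "content": str(result)})
    return "Stopped after reaching the turn limit."
```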

Read more

AI Frontiers: Rethinking intelligence with Ashley Llorens and Ida Momennejad

[MUSIC FADES] Let’s dive right in. We are undergoing a paradigm shift where AI models and systems are starting to exhibit characteristics that I and, of course, many others have described as more general intelligence. When I say general in this context, I think I mean systems with abilities like reasoning and problem-solving that can be applied to many different tasks, even tasks they were not explicitly trained to perform. Despite all of this, I think it’s also important to […]

Read more

Learning from interaction with Microsoft Copilot (web)

AI systems like Bing and Microsoft Copilot (web) are as good as they are because they continuously learn and improve from people’s interactions. Since the early 2000s, user clicks on search result pages have fueled the continuous improvement of search engines. Recently, reinforcement learning from human feedback (RLHF) brought step-function improvements to the response quality of generative AI models. Bing has a rich history of success in improving its AI offerings by learning from user interactions. For example, Bing pioneered the […]

Read more

Abstracts: March 21, 2024

CHANG LIU: Thank you. Thank you for this opportunity to share our work.  HUIZINGA: So in a few sentences, tell us about the issue or problem your paper addresses and why people should care about this research.  LIU: Sure. Since this is an AI4Science work, let’s start from this perspective. About science, people always want to understand the properties of matter, such as why some substances can cure disease and why some materials are heavy or conductive. For a very […]

Read more

Research Focus: Week of March 18, 2024

Welcome to Research Focus, a series of blog posts that highlights notable publications, events, code/datasets, new hires and other milestones from across the research community at Microsoft. NEW RESEARCH Fewer is More: Boosting LLM Reasoning with Reinforced Context Pruning Large language models (LLMs) have shown impressive capabilities, yet they still struggle with math reasoning. In a recent paper: Fewer is More: Boosting LLM Reasoning with Reinforced Context Pruning, researchers from Microsoft propose CoT-Influx, a […]
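The paper’s CoT-Influx approach uses reinforced context pruning; as a much simpler stand-in, the sketch below just greedily keeps the highest-scoring few-shot chain-of-thought examples that fit a token budget. The scoring function and token counter are placeholders, and this does not reflect the paper’s actual algorithm.

```python
# Simplified stand-in for context pruning (not the paper's CoT-Influx algorithm):
# greedily keep the highest-scoring few-shot CoT examples that fit a token budget.
from typing import Callable, List

def prune_examples(examples: List[str],
                   score: Callable[[str], float],
                   budget_tokens: int,
                   count_tokens: Callable[[str], int] = lambda s: len(s.split())) -> List[str]:
    """Select examples in descending score order until the token budget is spent."""
    kept, used = [], 0
    for ex in sorted(examples, key=score, reverse=True):
        cost = count_tokens(ex)
        if used + cost <= budget_tokens:
            kept.append(ex)
            used += cost
    return kept

# Hypothetical usage with a trivial length-based "usefulness" score:
# pruned = prune_examples(cot_examples, score=lambda ex: 1.0 / len(ex), budget_tokens=512)
```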

Read more

Intelligent monitoring: Towards AI-assisted monitoring for cloud services

In the evolving field of software development, professionals are increasingly adopting a modern approach known as service-oriented architecture to enhance the scalability and flexibility of their services and applications. Often utilizing a microservices approach, developers construct software as a collection of small, independently functioning services. This method is particularly advantageous for developing cloud-based software, as it offers numerous benefits over the traditional monolithic architectures, including the ability to separately develop, deploy, and scale individual components of an application. Nevertheless, this […]

Read more

Introducing Garnet – an open-source, next-generation, faster cache-store for accelerating applications and services

Researchers at Microsoft have been working for nearly a decade to address the increasing demand for data storage mechanisms to support the rapid advances in interactive web applications and services. Our new cache-store system called Garnet, which offers several advantages over legacy cache-stores, has been deployed in multiple use cases at Microsoft, such as those in the Windows & Web Experiences Platform, Azure Resource Manager, and Azure […]
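Because Garnet is compatible with the RESP wire protocol, existing Redis clients can talk to it. Here is a minimal Python sketch using `redis-py`, assuming a Garnet server is already running locally and listening on port 6379; the host, port, and keys are assumptions to adjust for your own deployment.

```python
# Minimal sketch: talking to a locally running Garnet server with redis-py,
# which works because Garnet speaks the RESP wire protocol.
# Host/port are assumptions; point them at your own Garnet deployment.
import redis

client = redis.Redis(host="localhost", port=6379, decode_responses=True)

client.set("greeting", "hello from Garnet")   # basic string set
print(client.get("greeting"))                 # -> "hello from Garnet"

client.hset("user:1", mapping={"name": "Ada", "role": "engineer"})  # hash operations
print(client.hgetall("user:1"))
client.expire("user:1", 3600)                 # set a one-hour TTL
```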

Read more