Research Focus: Week of November 22, 2023
Welcome to Research Focus, a series of blog posts that highlights notable publications, events, code/datasets, new hires, and other milestones from across the research community at Microsoft.

NEW RESEARCH
PIT: Optimization of Dynamic Sparse Deep Learning Models via Permutation Invariant Transformation
Dynamic sparsity is a technique used in machine learning to reduce computational and memory requirements while maintaining or improving performance. This can be particularly useful when computational resources are limited, such as