Abstracts: July 29, 2024
LI LYNA ZHANG: Thank you for having me.

HUIZINGA: So let's start with a brief overview of your paper. Tell us about the issue your research addresses and why it matters.

ZHANG: OK, so this paper is about how to effectively extend the context window of large language models beyond 2 million tokens. Why is this important? Because enabling longer input contexts can improve LLM capabilities. Right now, some LLMs can only handle a limited context window of 4K tokens, […]