Christopher Ré

Fast, memory-efficient attention

Co-authored FlashAttention, an IO-aware exact attention algorithm and one of the most impactful attention-kernel optimizations.
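The core idea behind FlashAttention can be illustrated in plain NumPy: process keys and values in blocks while keeping a running softmax maximum and normalizer per query, so the full N×N score matrix is never materialized. This is a conceptual sketch of the tiling/online-softmax idea only, not the fused CUDA kernel; the function names and block size here are illustrative assumptions.

```python
import numpy as np

def naive_attention(Q, K, V):
    # Standard attention: materializes the full N x N score matrix.
    S = Q @ K.T / np.sqrt(Q.shape[-1])
    P = np.exp(S - S.max(axis=-1, keepdims=True))
    P /= P.sum(axis=-1, keepdims=True)
    return P @ V

def tiled_attention(Q, K, V, block=16):
    # FlashAttention-style sketch: iterate over key/value blocks, keeping
    # a running row max `m` and denominator `l` (online softmax) so only
    # one block of scores exists at a time.
    n, d = Q.shape
    scale = 1.0 / np.sqrt(d)
    O = np.zeros_like(Q)
    m = np.full(n, -np.inf)   # running row max
    l = np.zeros(n)           # running softmax denominator
    for j in range(0, K.shape[0], block):
        Kb, Vb = K[j:j + block], V[j:j + block]
        S = Q @ Kb.T * scale                   # scores for this block only
        m_new = np.maximum(m, S.max(axis=-1))
        P = np.exp(S - m_new[:, None])
        alpha = np.exp(m - m_new)              # rescale old accumulators
        l = alpha * l + P.sum(axis=-1)
        O = alpha[:, None] * O + P @ Vb
        m = m_new
    return O / l[:, None]

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((64, 32)) for _ in range(3))
assert np.allclose(naive_attention(Q, K, V), tiled_attention(Q, K, V))
```

Because the rescaling by `alpha` folds earlier blocks into the new maximum, the tiled version matches the naive result exactly (up to floating point), while needing only O(block × N) score memory instead of O(N²).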

Research Areas

FlashAttention · Efficient attention · Systems
Christopher Ré - AI Researcher Profile | 500AI