Erez Schwartz

Hybrid Transformer–Mamba language models (Jamba)

Co-authored Jamba, a hybrid Transformer–Mamba architecture for efficient long-context language modeling.

Highlights

Focus: Hybrid Transformer–Mamba language models (Jamba)
Why it matters: Interleaving attention layers with Mamba state-space layers shrinks the KV-cache footprint and keeps throughput high on long contexts, making long-context modeling more efficient.
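The hybrid pattern above can be sketched as a layer schedule. The snippet below is an illustrative sketch only, not AI21's implementation: `hybrid_layer_pattern`, `block_size`, and `attn_index` are hypothetical names, and the 1-attention-per-8-layers ratio is chosen as an example of the Jamba-style interleaving, not a verbatim published hyperparameter.

```python
# Illustrative sketch (hypothetical helper, not AI21's code): build the
# layer-type schedule for a hybrid stack that places one attention layer
# within each block of otherwise Mamba (state-space) layers.

def hybrid_layer_pattern(n_layers: int, block_size: int = 8,
                         attn_index: int = 0) -> list[str]:
    """Return one layer type per position: 'attention' once per block_size
    layers (at offset attn_index), 'mamba' everywhere else."""
    return [
        "attention" if i % block_size == attn_index else "mamba"
        for i in range(n_layers)
    ]

# A 16-layer stack yields two attention layers and fourteen Mamba layers.
print(hybrid_layer_pattern(16))
```

Because attention layers are sparse in the stack, only those few layers carry a KV cache, which is the source of the memory savings at long context lengths.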

Research Areas

Jamba · AI21 · Hybrid models · State space models
Erez Schwartz - AI Researcher Profile | 500AI