Uriya Pumerantz
Hybrid Transformer–Mamba language models (Jamba)
Co-authored Jamba, a hybrid Transformer–Mamba architecture for efficient long-context modeling.
Highlights
Focus: Hybrid Transformer–Mamba language models (Jamba)
Why it matters: co-authored Jamba, a hybrid Transformer–Mamba architecture for efficient long-context modeling.
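The core idea behind such hybrid stacks is interleaving a small number of attention layers among state-space (Mamba-style) layers. A minimal sketch of that layer-plan idea follows; the ratio, function name, and layer labels here are illustrative assumptions, not the published Jamba configuration.

```python
# Toy sketch of a hybrid layer plan: interleave attention layers among
# state-space (SSM / Mamba-style) layers at a configurable ratio.
# The ratio and names are illustrative assumptions, not Jamba's actual config.

def build_hybrid_stack(n_layers: int, attn_every: int = 4) -> list[str]:
    """Return a layer plan where every `attn_every`-th layer is attention
    and the remaining layers are state-space (SSM) layers."""
    return [
        "attention" if (i % attn_every == attn_every - 1) else "ssm"
        for i in range(n_layers)
    ]

stack = build_hybrid_stack(8, attn_every=4)
print(stack)
# ['ssm', 'ssm', 'ssm', 'attention', 'ssm', 'ssm', 'ssm', 'attention']
```

Keeping attention layers sparse is what lets hybrids of this kind handle long contexts cheaply: most layers carry constant-size SSM state instead of a growing key-value cache.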
Research Areas
Jamba, AI21, Hybrid models, State space models