Hybrid Transformer–Mamba language models (Jamba)
Lab & Ecosystem
Researchers exploring long-context hybrid architectures and practical language model deployment.
Within 500AI, AI21 is most legible through researchers like Dor Muhlgay, Barak Lenz, and Daniel Gissin.
This cluster is especially tied to Systems & Infrastructure, Evaluation & Benchmarks, and Agents & Reasoning. Frequent institution signals include AI21 Labs, Bar-Ilan University, and Hebrew University of Jerusalem. Recurring entry points include Jamba-1.5: Hybrid Transformer-Mamba Models at Scale and Jamba: A Hybrid Transformer-Mamba Language Model.
Snapshot
Researchers: 61
Related topics: 8
Starting points: 8
Developed dossiers: 8
Repeatedly linked papers, projects, and repositories across this lab cluster.
Jamba-1.5: Hybrid Transformer-Mamba Models at Scale
Linked by 61 profiles in this cluster
Jamba: A Hybrid Transformer-Mamba Language Model
Linked by 61 profiles in this cluster
AI21 Jamba Large 1.5 model card
Linked by 42 profiles in this cluster
JAMBA: Hybrid Transformer-Mamba Language Models
Linked by 10 profiles in this cluster
MRKL Systems: A modular, neuro-symbolic architecture that combines large language models, external knowledge sources and discrete reasoning
Linked by 5 profiles in this cluster
Attention was never enough: Tracing the rise of hybrid LLMs
Linked by 2 profiles in this cluster
Jamba
Linked by 2 profiles in this cluster
AI Index
Linked by 1 profile in this cluster
A stronger first pass through AI21, ranked by profile depth, evidence, and editorial importance.
Hybrid Transformer–Mamba language models (Jamba)
A strong long-tail researcher page because his public profile explicitly points to factual knowledge and grounding, which are much more useful signals than another generic AI21/Jamba placeholder.
Hybrid Transformer–Mamba language models (Jamba)
One of the higher-signal people to know in the hybrid-LLM line because he sits at the point where AI21’s research architecture, long-context systems work, and real product deployment meet.
Hybrid Transformer–Mamba language models (Jamba)
A stronger page than the default Jamba byline because his work clearly predates Jamba: he published earlier papers on active learning and implicit bias in deep networks before appearing on Jamba-1.5.
Hybrid Transformer–Mamba language models (Jamba)
A useful page because his public trail is broader than the generic Jamba author stub: it runs from earlier language grounding and text-similarity work into Jamba-1.5 and later multimodal hallucination mitigation.
Hybrid Transformer–Mamba language models (Jamba)
Worth tracking on the architecture side of AI21 because his profile sits where infrastructure leadership, hybrid-model design, and the mechanics of shipping long-context systems overlap.
Hybrid Transformer–Mamba language models (Jamba)
A worthwhile profile because he is tied directly to the main public Jamba releases, which makes him one of the clearer names behind the hybrid Transformer–Mamba model line rather than just another name in a long author list.
Hybrid Transformer–Mamba language models (Jamba)
A useful page for the implementation layer of AI21 research because it captures the engineers who turn the company's hybrid-model ideas into trained systems and concrete releases.
Hybrid Transformer–Mamba language models (Jamba)
A valuable page in this cluster because his public role description is unusually specific: post-training, steerability, and AI-generated evaluation data are exactly the kinds of practical problems strong researcher pages should make discoverable.
Hybrid Transformer–Mamba language models (Jamba)
One of the clearer non-model pages in the AI21 cluster because he connects data leadership, infrastructure realities, and public explanation of enterprise AI rather than only pure modeling work.
61 linked profiles.