
Lab & Ecosystem

AI21

Researchers exploring long-context hybrid architectures and practical language model deployment.

Within 500AI, AI21 is best understood through researchers such as Dor Muhlgay, Barak Lenz, and Daniel Gissin.

This cluster is especially tied to Systems & Infrastructure, Evaluation & Benchmarks, and Agents & Reasoning. Frequent institution signals include AI21 Labs, Bar-Ilan University, and Hebrew University of Jerusalem. Recurring entry points include "Jamba-1.5: Hybrid Transformer-Mamba Models at Scale" and "Jamba: A Hybrid Transformer-Mamba Language Model".

Snapshot

Researchers: 61
Related topics: 8
Starting points: 8
Developed dossiers: 8

Institution Signals

Frequent institutions showing up across linked profiles in this ecosystem.

AI21 Labs (56) · Bar-Ilan University (2) · Hebrew University of Jerusalem (2) · Kempner Institute for the Study of Natural and Artificial Intelligence (1) · Mobileye (1) · Stanford University (1) · Technion (1)

Canonical Starting Points

Repeatedly linked papers, projects, and repositories across this lab cluster.

Frequently Linked Sources

Source clusters that repeatedly anchor researcher pages in this ecosystem.

Researchers To Start With

A stronger first pass through AI21, ranked by profile depth, evidence, and editorial importance.

Dor Muhlgay

Hybrid Transformer–Mamba language models (Jamba)

5 sources

A strong long-tail researcher page: his public profile explicitly points to factual knowledge and grounding, which are far more useful signals than a generic AI21/Jamba placeholder.

Barak Peleg

Hybrid Transformer–Mamba language models (Jamba)

3 sources

Worth tracking on the architecture side of AI21 because his profile sits at the intersection of infrastructure leadership, hybrid-model design, and the mechanics of shipping long-context systems.

Daniel Jannai

Hybrid Transformer–Mamba language models (Jamba)

3 sources

A worthwhile profile because he is tied directly to the main public Jamba releases, making him one of the clearer names behind the hybrid Transformer-Mamba model line rather than just another entry in a long author list.

Erez Schwartz

Hybrid Transformer–Mamba language models (Jamba)

3 sources

A useful page for the implementation layer of AI21 research because it captures the engineers who turn the company's hybrid-model ideas into trained systems and concrete releases.

Alan Arazi

Hybrid Transformer–Mamba language models (Jamba)

4 sources

A valuable page in this cluster because his public role description is unusually specific: post-training, steerability, and AI-generated evaluation data are exactly the kinds of practical problems strong researcher pages should make discoverable.


All Researchers In This Lab Cluster

61 linked profiles.