Researchers — page 11

Showing 1201-1320 of 3,615 researchers

1216
Ido Blass

Hybrid Transformer–Mamba language models (Jamba)

A helpful long-tail profile because it surfaces the data-engineering layer behind AI21 releases — a layer that is easy to overlook even though data pipelines and labeling workflows strongly shape model quality.

1225
Illia Polosukhin

Transformers

Important both as a transformer coauthor and as one of the clearest examples of a researcher who took core sequence-model work into a broader platform-building role.

1227
Ilya Sutskever

Deep learning, large-scale training

A defining figure of the deep-learning era whose influence comes from both landmark technical contributions and his role in setting the ambition level of frontier-model labs.

1230
Inbal Magar

Hybrid Transformer–Mamba language models (Jamba)

A useful profile for the model-algorithms side of AI21 because it points to the people iterating directly on the behavior and architecture of the system rather than only the surrounding platform.

1259
Jack Clark

AI policy, frontier-lab strategy, analysis

Useful not just for his own work but for how consistently he translates frontier research, deployment shifts, and policy implications into a coherent field-level picture.

1285
Jakob Uszkoreit

Transformers

A high-signal person to follow for the research arc from early transformer work into later sequence, vision, and multimodal model design.