Researchers — page 8

Showing 841-960 of 3,615 researchers

857
Edward J. Hu

Parameter-efficient finetuning

A high-signal person to study if you care about the practical mechanics of adapting large models, especially where scaling theory turns into techniques that actually spread across the industry.

869
Elad Dolev

Hybrid Transformer–Mamba language models (Jamba)

One of the clearer infrastructure pages in the AI21 cluster because it anchors the operational side of the stack: deployment, reliability, and the systems work needed to keep fast-moving model releases usable.

909
Eran Krakovsky

Hybrid Transformer–Mamba language models (Jamba)

A solid page for the engineering side of model development because it captures the people who turn hybrid-architecture research into actual trained and shipped systems rather than stopping at the abstract.

914
Erez Schwartz

Hybrid Transformer–Mamba language models (Jamba)

A useful page for the implementation layer of AI21 research because it captures the engineers who turn the company's hybrid-model ideas into trained systems and concrete releases.

917
Eric Alcaide

RWKV and efficient sequence modeling

A distinctive page because his work bridges open sequence-model experimentation with applied machine learning for molecules, proteins, and structural biology. He also appears on multiple RWKV-family papers, including the hybrid GoldFinch branch, not just the first release.