
Thomas Wang

Mixture-of-experts LLMs

Co-authored "Mixtral of Experts," a key mixture-of-experts (MoE) reference among open-weight frontier models.

Highlights

Focus: Mixture-of-experts LLMs
Why it matters: Co-authored "Mixtral of Experts," a key mixture-of-experts (MoE) reference among open-weight frontier models.

Research Areas

Mixtral · MoE · Mistral · Open models