Lab & Ecosystem
Builders behind Llama, Segment Anything, and large-scale open-weight model research.
Within 500AI, Meta is most legible through researchers like Kaiming He, Baptiste Rozière, and Armand Joulin.
This cluster is especially tied to Open Models, Code Models, and Systems & Infrastructure. Frequent institution signals include Meta, Menlo School, and Mistral AI. Recurring entry points include The Llama 3 Herd of Models and Llama (site).
Snapshot
Researchers: 583
Related topics: 7
Starting points: 8
Developed dossiers: 21
Useful lenses pulled from the strongest researcher profiles in this cluster.
Frequent institutions showing up across linked profiles in this ecosystem.
Repeatedly linked papers, projects, and repositories across this lab cluster.
The Llama 3 Herd of Models
Linked by 484 profiles in this cluster
Llama (site)
Linked by 482 profiles in this cluster
Llama 2: Open Foundation and Fine-Tuned Chat Models
Linked by 68 profiles in this cluster
Code Llama: Open Foundation Models for Code
Linked by 20 profiles in this cluster
LLaMA: Open and Efficient Foundation Language Models
Linked by 12 profiles in this cluster
Segment Anything
Linked by 9 profiles in this cluster
Segment Anything (project)
Linked by 9 profiles in this cluster
Bag of Tricks for Efficient Text Classification
Linked by 2 profiles in this cluster
Source clusters that repeatedly anchor researcher pages in this ecosystem.
Llama (site)
Used across 482 researcher pages in this lab cluster
The Llama 3 Herd of Models
Used across 482 researcher pages in this lab cluster
Llama 2: Open Foundation and Fine-Tuned Chat Models
Used across 61 researcher pages in this lab cluster
Code Llama: Open Foundation Models for Code
Used across 17 researcher pages in this lab cluster
LLaMA: Open and Efficient Foundation Language Models
Used across 12 researcher pages in this lab cluster
Segment Anything
Used across 9 researcher pages in this lab cluster
A stronger first pass through Meta, ranked by profile depth, evidence, and editorial importance.
Computer vision, representation learning
A foundational computer-vision researcher whose work on representations and architectures still shapes modern pretraining and perception systems.
Open-weight foundation models (LLaMA)
Important for the code-model side of the open-weight ecosystem, especially where general-purpose LLaMA work turns into stronger coding systems.
Open-weight foundation models (LLaMA)
A strong bridge figure between the older fastText and self-supervision era and the newer open-weight LLaMA wave at Meta.
Open-weight foundation models (LLaMA)
Useful to follow for the scaling and productization layer of the LLaMA line, especially as it moved from the first paper into the broader Llama 3 release wave.
Open-weight foundation models (LLaMA)
Important for the practical representation-learning line behind fastText, multilingual embeddings, and later open-weight model work at Meta.
Open-weight foundation models (LLaMA)
Interesting because his work spans two fairly different but important threads: open-ended reinforcement-learning environments and the later open-weight model push around LLaMA.
Open-weight foundation models (LLaMA)
A strong person to follow if you care about open-weight language models and retrieval-heavy NLP systems, especially the line from RoBERTa and RAG into LLaMA-era model development.
Representation learning, AI systems
A foundational deep-learning figure whose influence spans convolutional networks, representation learning, and long-running arguments about what capable AI systems should optimize for next.
Open-weight foundation models (LLaMA)
A strong page to keep because he sits on both sides of a major shift in open models: he appears on Meta's LLaMA 2 paper and then on Mistral 7B and Mixtral, making him part of the early handoff from the first LLaMA wave into Mistral's open-weight model line.
583 linked profiles.