
Jacob Devlin

Pretraining and representation learning for NLP

BERT defined a generation of language-model pretraining and evaluation practices.


Research Areas

NLP · Transformers · Pretraining