Ming-Wei Chang
Bidirectional transformer pretraining (BERT)
Co-authored BERT, a turning point for transfer learning in NLP.
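A minimal sketch of what BERT's bidirectional pretraining looks like in use: masked-token prediction, where the model fills in a [MASK] token using context from both the left and the right. The example assumes the Hugging Face transformers library and the publicly released bert-base-uncased checkpoint; neither is specified in this profile.

```python
# A minimal sketch: masked-token prediction with a pretrained BERT model.
# Assumes the Hugging Face `transformers` library; the model choice is illustrative.
from transformers import pipeline

# The fill-mask pipeline uses BERT's bidirectional encoder: the prediction
# for [MASK] conditions on context from both sides of the gap.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for candidate in fill_mask("The capital of France is [MASK]."):
    print(f"{candidate['token_str']}: {candidate['score']:.3f}")
```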
Research Areas
BERT · NLP · Transformers