Large Language Models: Text Classification for NLP using BERT

Transformers are taking the natural language processing (NLP) world by storm. In this course, instructor Jonathan Fernandes teaches you all about this go-to architecture for NLP and computer vision tasks, a must-have skill in your artificial intelligence toolkit. Jonathan takes a hands-on approach to show you the basics of working with transformers in NLP and in production. He covers BERT model sizes, bias in BERT, and how BERT was trained. Jonathan explores transfer learning, shows you how to use the BERT model and tokenization, and walks through text classification. After thoroughly explaining the transformer model architecture, he finishes with some additional training runs.