Build custom NLP models in record time by adapting pre-trained machine learning models to solve specialized problems.
In Transfer Learning for Natural Language Processing you will learn:
- Fine-tuning pretrained models with new domain data
- Picking the right model to reduce resource usage
- Transfer learning for neural network architectures
- Generating text with generative pretrained transformers
- Cross-lingual transfer learning with BERT
- Foundations for exploring NLP academic literature
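The fine-tuning workflow the first bullet describes can be sketched in miniature: keep a pretrained encoder frozen and train only a small task-specific head on your labeled data. The sketch below is a toy illustration in NumPy, with randomly initialized stand-in embeddings playing the role of the pretrained model and a made-up two-sentence sentiment dataset; in practice you would load real pretrained weights (e.g., from BERT) instead.

```python
import numpy as np

# Stand-in "pretrained" word embeddings (hypothetical: in practice these
# would be loaded from a real pretrained model such as BERT or GloVe).
rng = np.random.default_rng(0)
vocab = {w: i for i, w in enumerate(
    ["good", "great", "fun", "bad", "awful", "boring", "movie", "the"])}
pretrained_emb = rng.normal(size=(len(vocab), 16))  # frozen: never updated

def featurize(doc):
    """Average the frozen pretrained embeddings of a document's words."""
    return pretrained_emb[[vocab[w] for w in doc.split()]].mean(axis=0)

# Tiny labeled dataset for the downstream task (toy sentiment labels).
docs = ["good movie", "great fun", "bad movie", "awful boring"]
labels = np.array([1, 1, 0, 0])
X = np.stack([featurize(d) for d in docs])

# Train only a small logistic-regression "head" on top of the frozen
# features; the pretrained embeddings themselves receive no gradient.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid predictions
    grad = p - labels                       # gradient of the log loss
    w -= 0.5 * X.T @ grad / len(labels)
    b -= 0.5 * grad.mean()

accuracy = ((1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5) == labels).mean()
```

Because only the small head is updated, training touches a few dozen parameters instead of the millions inside the encoder, which is the source of the time and compute savings the book emphasizes.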
Training deep learning NLP models from scratch is costly, time-consuming, and requires massive amounts of data. In Transfer Learning for Natural Language Processing, DARPA researcher Paul Azunre reveals cutting-edge transfer learning techniques that apply customizable pretrained models to your own NLP architectures. You'll learn how to use transfer learning to deliver state-of-the-art results for language comprehension, even when working with limited labeled data. Best of all, you'll save on training time and computational costs.