Great Book!
October 4, 2021 by Abdurrahman S. (Paris, France)
“It provides a comprehensive and up-to-date introduction to the techniques employed for transfer learning in NLP. From basic models like logistic regression to more complex architectures like the Transformer, the author takes us on an adventure filled with concepts, models, and techniques, such as multitask learning, domain adaptation, knowledge distillation, embeddings, ELMo, RNNs, BERT, mBERT, SIMOn, GPT, ULMFiT, and ALBERT, to name a few, that become crystal clear by the end of the journey. Moreover, the heavy use of the transformers library allows the reader to grasp the concepts and adapt the code to their own use cases without unnecessary overhead. Definitely a must-read if you want innovative insights when facing the challenges inherent to Natural Language Processing.”