In this series of liveProjects, you’ll use variants of the BERT Transformer to solve real-world natural language processing problems. Pretrained Transformer models are rapidly becoming the go-to architecture for almost any NLP use case. As you work through the hands-on challenges in this liveProject series, you’ll get real experience implementing state-of-the-art Transformer architectures with the Hugging Face library for use cases such as detecting hate speech, spotting fake news, and blocking spam. Each project is standalone, letting you pick and choose the application most relevant to your career. In each, you’ll create an entire NLP pipeline, from preprocessing data all the way to validating model performance.
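To give a flavor of the kind of pipeline you’ll build, here is a minimal sketch of BERT-based text classification using Hugging Face’s transformers library. The bert-base-uncased checkpoint, the two-label setup, and the sample texts are illustrative assumptions, not the project’s actual configuration, and the classification head would still need fine-tuning on the project’s data.

```python
# A minimal sketch of a BERT text-classification pipeline with Hugging Face.
# Model name, label count, and example texts are placeholders.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load a pretrained BERT checkpoint with a 2-label classification head
# (e.g. spam vs. not-spam). The head is randomly initialized until fine-tuned.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Preprocess: tokenize raw text into input IDs and attention masks.
texts = ["Congratulations, you won a free prize!", "See you at the meeting tomorrow."]
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# Inference: run the model and turn logits into predicted class indices.
with torch.no_grad():
    logits = model(**inputs).logits
predictions = logits.argmax(dim=-1)
print(predictions)  # one predicted label index per input text
```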
This liveProject is for intermediate Python and NLP practitioners who are interested in implementing pretrained BERT architectures and customizing them to solve real-world NLP problems. To begin this liveProject, you will need to be familiar with the following: