One of the most powerful recent advances in natural language processing is transfer learning. Transfer learning refers to a machine learning methodology in which a sophisticated model trained on a generic task is “transferred” to another task. In this workshop, we will introduce deep transformers, looking specifically at the BERT architecture. We will then carry out some practical transfer learning, leveraging a pretrained BERT model from TensorFlow Hub to build our own text classifiers. At the end of the workshop, attendees will not only be able to create text classifiers that leverage the power of BERT, but will also have the skills to adapt other pretrained models available on TensorFlow Hub to their own needs, transfer-learning style.
- Oduwa Edo-Osagie, Data Scientist
Text Classification by Transfer Learning with Deep Transformers
Online webinar
Saturday 14th November 2020
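As an illustration of the kind of workflow the workshop covers, here is a minimal sketch of fine-tuning a pretrained BERT encoder from TensorFlow Hub as a text classifier. It assumes TensorFlow 2 with the tensorflow_hub and tensorflow_text packages installed; the Hub model handles and hyperparameters shown are examples chosen for illustration, not necessarily the ones used in the session.

```python
import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # noqa: F401 - registers ops required by the BERT preprocessor

# Example TF Hub handles (assumptions, not the workshop's exact choices).
PREPROCESS_URL = "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3"
ENCODER_URL = "https://tfhub.dev/tensorflow/small_bert/bert_en_uncased_L-4_H-512_A-8/2"


def build_classifier(num_classes: int) -> tf.keras.Model:
    # Raw text strings go in; the preprocessing layer tokenises them
    # into the input ids, mask, and segment ids that BERT expects.
    text_input = tf.keras.layers.Input(shape=(), dtype=tf.string, name="text")
    preprocess = hub.KerasLayer(PREPROCESS_URL, name="preprocessing")
    encoder_inputs = preprocess(text_input)

    # Pretrained BERT encoder; trainable=True fine-tunes its weights on our task.
    encoder = hub.KerasLayer(ENCODER_URL, trainable=True, name="bert_encoder")
    outputs = encoder(encoder_inputs)
    pooled = outputs["pooled_output"]  # [batch, hidden] sentence-level embedding

    # Small classification head on top of the pretrained encoder.
    x = tf.keras.layers.Dropout(0.1)(pooled)
    logits = tf.keras.layers.Dense(num_classes, name="classifier")(x)
    return tf.keras.Model(text_input, logits)


model = build_classifier(num_classes=2)
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=3e-5),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
# model.fit(train_dataset, validation_data=val_dataset, epochs=3)
```

The same pattern applies to other pretrained models on TensorFlow Hub: swap the encoder handle, keep a small task-specific head, and fine-tune on your own labelled data.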