Fine Tune BERT for Text Classification with TensorFlow
This is a guided project on fine-tuning a Bidirectional Encoder Representations from Transformers (BERT) model for text classification with TensorFlow. In this 2.5-hour-long project, you will learn to preprocess and tokenize data for BERT classification, build TensorFlow input pipelines for text data with the tf.data API, and train and evaluate a fine-tuned BERT model for text classification with TensorFlow 2 and TensorFlow Hub.

Prerequisites: In order to successfully complete this project, you should be competent in the Python programming language, be familiar with deep learning for Natural Language Processing (NLP), and have trained models with TensorFlow and its Keras API.

Note: This course works best for learners who are based in the North America region. We’re currently working on providing the same experience in other regions.
It is assumed that you are competent in Python programming and have prior experience building deep learning NLP models with TensorFlow or Keras. A minimal code sketch of the workflow appears after the task list below.
Introduction to the Project
Set up your TensorFlow and Colab Runtime
Download and Import the Quora Insincere Questions Dataset
Create tf.data.Datasets for Training and Evaluation
Download a Pre-trained BERT Model from TensorFlow Hub
Tokenize and Preprocess Text for BERT
Wrap a Python Function into a TensorFlow op for Eager Execution
Create a TensorFlow Input Pipeline with tf.data
Add a Classification Head to the BERT hub.KerasLayer
Fine-Tune and Evaluate BERT for Text Classification
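To make the task list above concrete, here is a minimal sketch of how these steps might fit together in TensorFlow 2. It is an illustration under stated assumptions rather than the project's actual notebook: the Hub model version, the tokenization module from the TensorFlow Model Garden (tf-models-official), the CSV path, the column names, the train/validation split, and all hyperparameters (sequence length, batch size, dropout rate, learning rate, epochs) are placeholders chosen for the example.

import tensorflow as tf
import tensorflow_hub as hub
import pandas as pd
from official.nlp.bert import tokenization  # assumed: tokenizer from tf-models-official

MAX_SEQ_LEN = 128
BERT_URL = "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/2"  # assumed model version

# Download a pre-trained BERT model from TensorFlow Hub.
bert_layer = hub.KerasLayer(BERT_URL, trainable=True)

# Build a WordPiece tokenizer from the vocabulary shipped with the Hub model.
vocab_file = bert_layer.resolved_object.vocab_file.asset_path.numpy()
do_lower_case = bert_layer.resolved_object.do_lower_case.numpy()
tokenizer = tokenization.FullTokenizer(vocab_file, do_lower_case)

# Tokenize and preprocess a single example for BERT (pure Python, runs eagerly).
def to_features(text, label):
    tokens = tokenizer.tokenize(text.numpy().decode("utf-8"))
    tokens = ["[CLS]"] + tokens[:MAX_SEQ_LEN - 2] + ["[SEP]"]
    ids = tokenizer.convert_tokens_to_ids(tokens)
    pad = [0] * (MAX_SEQ_LEN - len(ids))
    input_mask = [1] * len(ids) + pad
    segment_ids = [0] * MAX_SEQ_LEN  # single sentence, so a single segment
    return ids + pad, input_mask, segment_ids, int(label)

# Wrap the Python function into a TensorFlow op so tf.data can call it.
def to_feature_map(text, label):
    ids, mask, seg, label_id = tf.py_function(
        to_features, inp=[text, label],
        Tout=[tf.int32, tf.int32, tf.int32, tf.int32])
    for t in (ids, mask, seg):
        t.set_shape([MAX_SEQ_LEN])
    label_id.set_shape([])
    return ({"input_word_ids": ids,
             "input_mask": mask,
             "input_type_ids": seg}, label_id)

# Create tf.data.Datasets for training and evaluation.
df = pd.read_csv("train.csv")                        # hypothetical path to the Quora data
train_df, valid_df = df[:100000], df[100000:110000]  # illustrative split

def make_dataset(frame, training=False):
    ds = tf.data.Dataset.from_tensor_slices(
        (frame["question_text"].values, frame["target"].values))
    if training:
        ds = ds.shuffle(10000)
    return (ds.map(to_feature_map,
                   num_parallel_calls=tf.data.experimental.AUTOTUNE)
              .batch(32)
              .prefetch(tf.data.experimental.AUTOTUNE))

train_ds = make_dataset(train_df, training=True)
valid_ds = make_dataset(valid_df)

# Add a classification head on top of the BERT hub.KerasLayer.
input_word_ids = tf.keras.Input((MAX_SEQ_LEN,), dtype=tf.int32, name="input_word_ids")
input_mask = tf.keras.Input((MAX_SEQ_LEN,), dtype=tf.int32, name="input_mask")
input_type_ids = tf.keras.Input((MAX_SEQ_LEN,), dtype=tf.int32, name="input_type_ids")

pooled_output, _ = bert_layer([input_word_ids, input_mask, input_type_ids])
x = tf.keras.layers.Dropout(0.2)(pooled_output)
output = tf.keras.layers.Dense(1, activation="sigmoid")(x)  # insincere vs. sincere

model = tf.keras.Model(
    inputs=[input_word_ids, input_mask, input_type_ids], outputs=output)

# Fine-tune and evaluate BERT for text classification.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, validation_data=valid_ds, epochs=2)
model.evaluate(valid_ds)

Because the Quora Insincere Questions task is binary, the head is a single sigmoid unit trained with binary cross-entropy; a multi-class problem would instead use a softmax output with one unit per class. Note also that tensors returned by tf.py_function lose their static shapes, which is why set_shape is called before batching.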
By SS · Apr 15, 2021
Really nice glue to connect all the dots. Thanks so much.
By JH · Dec 24, 2021
I have some experience in computer vision and needed to take on an NLP project; this course gave me a head start on the project.
By VS · Apr 12, 2021
The project was well explained and provided a good understanding of BERT for text classification. The quizzes were also good.
By AA · Dec 12, 2021
Excellent and very helpful course. The instructor's language is clear, concise, and to the point. I would love to learn more from the same instructor.