Introduction
What Will You Gain From This Course?
After completing this course, students will:
Skills Acquired:
Who Can Benefit From This Course?
This course is designed for:
Course Content
3 Modules – 41 Videos – 24 Readings – 3 Quizzes – 3 Programming Assignments – 1 App Item – 9 Ungraded Labs – Certificate of Completion
The first module focuses on translating text from one language to another. Students will learn what machine translation is, how it is performed, and which models are used to carry it out.
This section also covers the basics of the traditional seq2seq model and the limitations that hinder it in translation tasks. Students then learn how attention mechanisms help overcome these limitations. By the end of this module, participants will build a Neural Machine Translation model with attention that translates text from English to German.
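To make the attention idea concrete, here is a minimal NumPy sketch of a single decoder step attending over the encoder states. The function and variable names are our own simplification, not the course's implementation (the course builds its model in a deep learning framework):

```python
import numpy as np

def attention(query, keys, values):
    # query:  (d,)   current decoder hidden state
    # keys:   (T, d) encoder hidden states
    # values: (T, d) encoder hidden states (often identical to keys)
    scores = keys @ query / np.sqrt(query.shape[0])  # scaled similarity scores
    weights = np.exp(scores - scores.max())          # numerically stable softmax
    weights /= weights.sum()
    return weights @ values                          # weighted context vector

rng = np.random.default_rng(0)
enc_states = rng.normal(size=(5, 8))   # 5 source tokens, hidden size 8
dec_state = rng.normal(size=8)
context = attention(dec_state, enc_states, enc_states)
print(context.shape)                   # (8,) -- the context fed to the decoder
```

The weighted context vector lets the decoder focus on the most relevant source tokens at each step instead of compressing the whole sentence into a single fixed vector, which is exactly the seq2seq limitation attention addresses.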
The second module of this course will take nearly eight hours to complete. It covers the building blocks of Transformer-based summarization, including the Transformer architecture, RNNs, Scaled Dot-Product Attention, Masked Self-Attention, Multi-Head Attention, and more.
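As an illustration of masked self-attention, the NumPy sketch below applies a causal (look-ahead) mask so that each position attends only to itself and earlier positions. This is our own simplification (single head, no learned projections), not the course's code:

```python
import numpy as np

def masked_self_attention(x):
    # x: (T, d) sequence of token representations; one head, no learned weights
    T, d = x.shape
    scores = x @ x.T / np.sqrt(d)                     # (T, T) attention scores
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)  # True above the diagonal
    scores = np.where(mask, -1e9, scores)             # block future positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ x                                # (T, d) contextualized output

x = np.random.default_rng(1).normal(size=(4, 8))
print(masked_self_attention(x).shape)  # (4, 8); row t uses only positions 0..t
```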
Participants will compare Transformers with RNNs and other sequential models, and see how the modern Transformer architecture improves the text summarization process. By the end of this module, students will put these skills to use by creating a tool that generates text summaries.
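For a sense of the end result, the snippet below runs a pretrained summarization Transformer through the Hugging Face pipeline API (the library itself is introduced in module 3). In the course, students build and train the summarizer rather than loading a pretrained one:

```python
from transformers import pipeline

summarizer = pipeline("summarization")  # downloads a default pretrained model
article = (
    "The Transformer replaced recurrent networks in many NLP tasks. "
    "Because it attends to all positions in parallel, it trains faster than "
    "an RNN and captures long-range dependencies more easily."
)
print(summarizer(article, max_length=30, min_length=10)[0]["summary_text"])
```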
The third and last module of this course will require almost eleven hours to complete. It covers some of the most important concepts in transfer learning, such as ELMo, GPT, BERT (Bidirectional Encoder Representations from Transformers), T5, the Multi-Task Training Strategy, the GLUE Benchmark, and Hugging Face.
All of these concepts are central to adding conversational features to NLP applications. After learning these models and their uses, students will build a question-answering feature for an NLP app: the user asks a question, and the app answers it. This section also explores transfer learning with state-of-the-art models like T5 and BERT.
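As a rough sketch of what such a question-answering feature looks like, the Hugging Face pipeline API can load a pretrained extractive QA model in a few lines. The course has students fine-tune T5/BERT themselves; the question and context below are made up for illustration:

```python
from transformers import pipeline

qa = pipeline("question-answering")  # default pretrained extractive QA model
result = qa(
    question="What does BERT stand for?",
    context="BERT stands for Bidirectional Encoder Representations from Transformers.",
)
print(result["answer"], round(result["score"], 3))
```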
Description
In this course, students will learn how to translate English sentences into German with the help of an encoder-decoder attention model. Participants will also build a Transformer model for generating text summaries, use T5 and BERT models to answer questions, and build a chatbot using a Reformer model.
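For a rough idea of working with a Reformer, the sketch below samples text from a pretrained Reformer language model via the Hugging Face transformers library. The checkpoint named here is a general-purpose language model, not a chatbot; the course trains its own chatbot model, so this serves only to show the model class:

```python
from transformers import ReformerModelWithLMHead, ReformerTokenizer

name = "google/reformer-crime-and-punishment"  # pretrained Reformer LM checkpoint
tokenizer = ReformerTokenizer.from_pretrained(name)
model = ReformerModelWithLMHead.from_pretrained(name)

inputs = tokenizer("A few months later", return_tensors="pt")
outputs = model.generate(inputs.input_ids, do_sample=True, max_length=60)
print(tokenizer.decode(outputs[0]))  # continuation sampled from the model
```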
To get the most out of this course, participants are expected to have a working knowledge of machine learning and Python, including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics.
Meet the Instructor
This course is the final part of a specialization by deeplearning.ai. Younes Bensouda Mourri is an instructor of AI at Stanford University. His academic and research interests lie in machine learning, deep learning, and AI. He also helped build the Deep Learning Specialization. Mourri earned his Bachelor's in Applied Mathematics and Computer Science and his Master's in Statistics from Stanford University.
Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of the Tensor2Tensor, Trax, and TensorFlow libraries. His work has greatly influenced the AI community.