
Natural Language Processing with Attention Models Free Course



Introduction

The goal of this specialization course is to help students build an NLP application with features such as basic conversation, text translation, text summarization, sentiment analysis, and chatbot building. The Natural Language Processing with Attention Models course focuses on three specific capabilities to ensure the NLP application runs smoothly: text summarization, language translation, and question answering. By the end of this course, students will be able to deploy an NLP application that supports all of these features.

What Will You Gain From This Course?

Following the completion of this course, the students will:

  1. Be able to understand the use of T5 and BERT models.
  2. Know how to use attention models.
  3. Learn about the limitations of the traditional seq2seq model.
  4. Understand text generation and the translation of text from English to German.
  5. Develop an understanding of how computers understand human language.
  6. Put T5 and BERT models to use by adding advanced AI features.

Skills Acquired:

  1. T5+BERT Models
  2. Chatterbot
  3. Reformer Models
  4. Neural Machine Translation
  5. Attention Models

Who Can Benefit From This Course?

This course is designed for:

  1. Intermediate-level learners with at least some experience in NLP or related fields such as AI, machine learning, and deep learning.
  2. Individuals with a keen interest in Natural Language Processing.
  3. People interested in prompt engineering and in exploring how a computer understands human language.
  4. Social media experts interested in the use of AI on online platforms and its impact on their algorithms.
  5. Search engine marketers who want to know how search engines predict, autocorrect, and complete queries.
  6. Machine learning and AI experts interested in the role of attention mechanisms in communication.

Course Content

3 Modules - 41 Videos - 24 Readings - 3 Quizzes - 3 Programming Assignments - 1 App Item - 9 Ungraded Labs - Certificate of Completion

Neural Machine Translation

This module focuses on translating text from one language to another. Students will learn what translation involves, how it is carried out, and which models are used to make it work well.

This section also explores the basics of the traditional seq2seq model and the limitations that can hinder translation, then introduces attention mechanisms that help overcome those limitations. By the end of this module, participants will build a Neural Machine Translation model with attention that translates text from English to German.
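The core idea behind attention is easy to see in a few lines of code. Below is a minimal NumPy sketch (an illustration only, not the course's actual implementation): instead of compressing the whole source sentence into one fixed vector, the decoder scores every encoder hidden state against its current state and takes a weighted sum.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention_context(encoder_states, decoder_state):
    """Dot-product attention over encoder states.

    encoder_states: (T, d) array, one hidden state per source token.
    decoder_state:  (d,) current decoder hidden state.
    Returns the context vector (d,) and the attention weights (T,).
    """
    scores = encoder_states @ decoder_state   # (T,) alignment scores
    weights = softmax(scores)                 # higher weight = more relevant token
    context = weights @ encoder_states        # (d,) weighted sum of encoder states
    return context, weights

# Toy example: 5 source tokens, hidden size 4 (shapes are illustrative).
H = np.random.randn(5, 4)
s = np.random.randn(4)
context, weights = attention_context(H, s)
print(weights.round(3), weights.sum())  # weights form a distribution summing to 1
```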

  1. 15 Videos
  2. 4 Readings
  3. 1 Quiz
  4. 1 Programming Assignment
  5. 1 App Item
  6. 3 Ungraded Labs

Text Summarization

The second module of this course takes nearly eight hours to complete. It explores the basic concepts behind summarization, such as the Transformer architecture, RNNs, Scaled Dot-Product Attention, Masked Self-Attention, Multi-Head Attention, and more.

Participants will compare Transformers with RNNs and other sequential models and see how the modern Transformer architecture improves the overall summarization process. By the end of this module, students will put these skills to use by creating a tool that generates text summaries.
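As a rough illustration of the module's central formula, here is a minimal NumPy sketch of Scaled Dot-Product Attention, softmax(QK^T / sqrt(d_k))V, with an optional causal mask of the kind used in Masked Self-Attention. It is a simplified stand-in, not the course's assignment code.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v).
    mask: optional (n_q, n_k) boolean array; True marks positions to hide,
    as in the masked self-attention a decoder uses when generating summaries.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # scaling keeps softmax gradients healthy
    if mask is not None:
        scores = np.where(mask, -1e9, scores)  # masked positions get ~zero weight
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Causal (masked) self-attention over a 4-token sequence, model size 8.
x = np.random.randn(4, 8)
causal_mask = np.triu(np.ones((4, 4), dtype=bool), k=1)  # hide future tokens
out = scaled_dot_product_attention(x, x, x, mask=causal_mask)
print(out.shape)  # (4, 8)
```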

  1. 10 Videos
  2. 6 Readings
  3. 1 Quiz
  4. 1 Programming Assignment
  5. 3 Ungraded Labs

Question Answering

The third and last module of this course requires almost eleven hours to complete. It covers key concepts such as ELMo, GPT, T5, Bidirectional Encoder Representations from Transformers (BERT), the Multi-Task Training Strategy, the GLUE Benchmark, and Hugging Face.

All of these concepts are important for adding conversational features to NLP applications. After learning these models and their uses, students will build the question-answering feature for the NLP app: the user can ask a question, and the app answers it. This section also explores transfer learning with state-of-the-art models like T5 and BERT.
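Since the module introduces Hugging Face, the sketch below shows what an extractive question-answering call looks like with the transformers library's pipeline API. It is a minimal illustration, not the course's assignment code, and the pipeline's default pretrained model is an assumption that gets downloaded on first use.

```python
# Minimal extractive QA sketch with Hugging Face (pip install transformers).
from transformers import pipeline

# Loads a default pretrained question-answering model on first use.
qa = pipeline("question-answering")

context = (
    "The Transformer architecture replaced recurrence with attention, "
    "and models such as BERT and T5 build on it for transfer learning."
)
result = qa(question="What did the Transformer replace recurrence with?",
            context=context)
print(result["answer"], round(result["score"], 3))  # answer span + confidence
```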

  1. 16 Videos
  2. 15 Readings
  3. 1 Quiz
  4. 1 Programming Assignment
  5. 3 Ungraded Labs

Description

In this course, students will learn how to translate English sentences into German with an encoder-decoder attention model. Participants will also build a Transformer model to generate text summaries, use T5 and BERT models to answer questions, and build a chatbot using a Reformer model.

To get the most from this course, participants are expected to have a working knowledge of machine learning and Python, including experience with a deep learning framework (e.g., TensorFlow or Keras), as well as proficiency in calculus, linear algebra, and statistics.

Meet the Instructor

Younes Bensouda Mourri – Instructor - DeepLearning.AI

This course is the final part of a specialization powered by DeepLearning.AI. Mourri is an instructor at Stanford University whose academic and research interests lie in machine learning, deep learning, and AI. He also helped build the Deep Learning Specialization. Mourri earned his Bachelor's in Applied Mathematics and Computer Science and his Master's in Statistics from Stanford University.

Łukasz Kaiser – Instructor – DeepLearning.AI

Kaiser is a Staff Research Scientist at Google Brain and a co-author of the Tensor2Tensor, Trax, and TensorFlow libraries. His work has greatly influenced the AI community.

