
Natural Language Processing with Sequence Models Free Course



Introduction

The Natural Language Processing with Sequence Models course is ideal for anyone who wants to explore how computers comprehend human language. In a world of AI where prompts hold so much power, this course can be a game changer, especially for communication experts, marketers, and search engine optimization specialists. Students will explore concepts such as GloVe word embeddings and gain a better understanding of LSTM models. The course has three modules, takes about 21 hours to complete, and includes quizzes and programming assignments for hands-on experience.

What Will You Gain From This Course?

By the end of this course, the student will:

  1. Be able to understand the use of recurrent neural networks.
  2. Know the use of LSTMs in natural language processing (NLP).
  3. Be able to use GRUs and Siamese networks in Trax for sentiment analysis.
  4. Have an understanding of text generation.
  5. Be able to execute the named entity recognition feature.
  6. Develop an understanding of how the computer understands human language.

Skills Acquired:

  1. Word Embedding
  2. Siamese Networks
  3. Sentiment with Neural Nets
  4. Natural Language Generation
  5. Named-Entity Recognition

Who Can Benefit From This Course?

This course is designed for:

  1. Individuals with a keen interest in NLP.
  2. Social media experts looking for ways to analyze sentiments used in language.
  3. Search engine marketers who want to understand the algorithms used by social media sites and tech companies such as Google and IBM.
  4. Machine learning and AI experts interested in the role of language and words in creating effective prompts for better output.

Course Content

3 Modules - 33 Videos - 34 Readings - 3 Quizzes - 4 Programming Assignments - 1 App Item - 8 Ungraded Labs - Certificate of Completion

Recurrent Neural Networks for Language Modeling

This is the first module of the course and spans nearly ten hours. Students will learn about traditional language models and their limitations, and see how RNNs and GRUs use sequential data for text prediction. By the end of this module, students will have built their own next-word generator using a simple RNN on Shakespeare text data.

The goal of this module is to build a solid understanding of basic concepts such as neural networks for sentiment analysis, dense layers, ReLU, embedding and mean layers, traditional language models, recurrent neural networks, applications of RNNs, cost functions for RNNs, and implementation notes.
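The simple RNN at the heart of this module can be illustrated outside of any framework. The sketch below is a minimal NumPy forward pass with arbitrary toy dimensions and random, untrained weights; the course itself works in Trax.

```python
import numpy as np

def rnn_step(x_t, h_prev, Wx, Wh, b):
    """One step of a vanilla RNN: mix the current input with the
    previous hidden state through a tanh nonlinearity."""
    return np.tanh(x_t @ Wx + h_prev @ Wh + b)

# Toy dimensions: 4-dim inputs (think tiny word embeddings), 3-dim hidden state.
rng = np.random.default_rng(0)
Wx = rng.normal(size=(4, 3))
Wh = rng.normal(size=(3, 3))
b = np.zeros(3)

h = np.zeros(3)
for x_t in rng.normal(size=(5, 4)):  # a sequence of five embedded "words"
    h = rnn_step(x_t, h, Wx, Wh, b)
# h now summarizes the whole sequence; a dense + softmax layer over the
# vocabulary would turn it into next-word probabilities.
```

In a trained next-word generator, `Wx`, `Wh`, and `b` would be learned by backpropagation through time over the Shakespeare corpus.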

  1. 15 Videos
  2. 15 Readings
  3. 1 Quiz
  4. 2 Programming Assignments
  5. 1 App Item
  6. 4 Ungraded Labs

LSTMs and Named Entity Recognition

This is the second module of the course and spans four hours. It explores some of the most important concepts in NLP, especially long short-term memory units (LSTMs), entity recognition systems, and the use of named entity recognition systems with an LSTM.

By the end of this module, students will have a much better understanding of how long short-term memory units (LSTMs) solve the vanishing gradient problem. Participants will also see how named entity recognition systems quickly extract important information from text. Finally, they will build their own named entity recognition system using an LSTM and data from Kaggle.
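The key mechanism here, a gated and largely additive cell-state update, can be sketched in a few lines of NumPy. This is a hypothetical minimal cell with random toy weights, not the course's Trax implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM step. The forget/input/output gates let the cell state c
    carry information forward almost unchanged, which is how LSTMs
    mitigate the vanishing-gradient problem of plain RNNs."""
    z = np.concatenate([x_t, h_prev]) @ W + b   # compute all gates at once
    f, i, o, g = np.split(z, 4)
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)
    c = f * c_prev + i * np.tanh(g)             # additive cell-state update
    h = o * np.tanh(c)                          # gated hidden state
    return h, c

# Toy run: 4-dim inputs, 3-dim hidden/cell state, random untrained weights.
rng = np.random.default_rng(0)
n_in, n_h = 4, 3
W = rng.normal(size=(n_in + n_h, 4 * n_h))
b = np.zeros(4 * n_h)

h, c = np.zeros(n_h), np.zeros(n_h)
for x_t in rng.normal(size=(5, n_in)):
    h, c = lstm_step(x_t, h, c, W, b)
```

For NER, one such step runs at every token, and a dense layer over `h` predicts the entity tag (person, location, and so on) at that position.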

  1. 15 Videos
  2. 15 Readings
  3. 1 Quiz
  4. 2 Programming Assignments
  5. 4 Ungraded Labs
  6. 1 App Item

Siamese Networks

This is the third and last module of the course and will take nearly six hours to complete. It will explore concepts such as Siamese networks, architecture, cost function, triplets, cost computing, and one-shot learning.

Students will learn how Siamese networks work: a special type of neural network made of two identical subnetworks whose outputs are eventually merged. By the end of this module, learners are expected to build a Siamese network that can recognize duplicate questions in a dataset from Quora.
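The defining trait of a Siamese network, one set of weights applied to both inputs, can be shown in a short NumPy sketch. The encoder and weights below are hypothetical stand-ins; the course builds LSTM encoders in Trax and trains them with a triplet loss.

```python
import numpy as np

def encode(x, W):
    """Shared encoder: both inputs pass through the SAME weights,
    which is what makes the network 'Siamese'."""
    return np.tanh(x @ W)

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(1)
W = rng.normal(size=(6, 4))   # one weight matrix, used for both branches

q1 = rng.normal(size=6)       # e.g. an embedded question
q2 = rng.normal(size=6)       # e.g. a candidate duplicate

v1, v2 = encode(q1, W), encode(q2, W)
score = cosine_similarity(v1, v2)  # a score near 1.0 would suggest duplicates
```

Because the two branches share weights, similar inputs land near each other in the output space, which is exactly what the duplicate-question task needs.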

  1. 10 Videos
  2. 10 Readings
  3. 1 Quiz
  4. 1 Programming Assignment
  5. 3 Ungraded Labs

Description

This course is part of a four-course specialization that offers a much more in-depth understanding of natural language processing. After completing this 21-hour course, students will be able to train a neural network with GloVe word embeddings to perform sentiment analysis of tweets.
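A common baseline for embedding-based sentiment analysis is to average word vectors into one feature vector per tweet. The tiny 3-dimensional "GloVe" table below is made up for illustration; real GloVe vectors have 25 to 300 dimensions and are loaded from pretrained files.

```python
import numpy as np

# Hypothetical miniature embedding table (real GloVe files map
# hundreds of thousands of words to pretrained vectors).
glove = {
    "great": np.array([0.9, 0.2, 0.1]),
    "awful": np.array([-0.8, 0.1, 0.3]),
    "movie": np.array([0.0, 0.5, 0.2]),
}

def embed_tweet(tokens):
    """Average the vectors of known words: the simplest way to turn a
    tweet into a fixed-size input for a sentiment classifier."""
    vecs = [glove[t] for t in tokens if t in glove]
    return np.mean(vecs, axis=0) if vecs else np.zeros(3)

features = embed_tweet(["great", "movie"])  # feed this to a classifier
```

A logistic-regression or small neural classifier trained on such averaged vectors is the kind of baseline this course builds up from before moving to sequence models.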

The participants will also generate synthetic Shakespeare text using a gated recurrent unit (GRU) language model and train a recurrent neural network to perform named entity recognition (NER) using LSTMs with linear layers.

Finally, students will have an opportunity to use Siamese LSTM models to compare questions in a corpus and recognize questions with the same intent but different wording.

The specialization is designed by two NLP experts, Younes Bensouda Mourri and Łukasz Kaiser. After completing all four courses, learners will be able to develop NLP applications with features like question answering, sentiment analysis, translation, text summarization, and chatbots.

Meet the Instructor

Younes Bensouda Mourri – Instructor - DeepLearning.AI

Younes Bensouda Mourri, an expert in NLP, machine learning, and deep learning, teaches this course. Mourri earned his Bachelor's in Applied Mathematics and Computer Science and his Master's in Statistics from Stanford University. Together with fellow machine learning and AI experts, he designed this four-course series, which offers an advanced understanding of natural language processing. His academic career has focused primarily on AI, and he has also worked on multiple deep learning specializations with DeepLearning.AI. His courses are taught at Stanford and are available on Coursera to the general public.



© 2024 cryptojobs.com. All rights reserved.