Booking options
£24.99
On-Demand course
23 hours 31 minutes
All levels
Take your first step toward Natural Language Processing with this beginner-to-pro course. Gain an in-depth understanding of deep learning models for NLP with the help of examples. Learn the essential concepts from the absolute beginning, unpacked step by step with examples in Python.
Natural Language Processing (NLP), a subdivision of Artificial Intelligence (AI), is the ability of a computer to understand human language the way it's spoken and written. Human language is typically referred to as natural language. Humans perceive language through different senses: ears for hearing and eyes for reading. Similarly, computers use programs to read text and microphones to capture audio, and just as the human brain processes an input, a computer program processes its input, converting it into code the computer understands. This course, Natural Language Processing (NLP), Theory and Practice in Python, introduces you to the concepts, tools, and techniques of machine learning for text data. You will learn the elementary concepts as well as emerging trends in the field of NLP. You will also learn about the implementation and evaluation of different NLP applications using deep learning methods. Code bundles are available here: https://github.com/PacktPublishing/NLP-Natural-Language-Processing-in-Python-for-Beginners
Learn the fundamentals of NLP using datasets
Explore language models and their uses in speech recognition
Learn to use software tools such as spaCy, NLTK, Gensim, and PyTorch
Learn the concepts of deep learning theory
Explore linear subspaces for word embeddings
Understand the architecture of neural networks
This course is for complete beginners who are new to NLP, people who want to upgrade their Python programming skills for NLP, and individuals who are passionate about numbers and programming such as data scientists, data analysts, and machine learning practitioners.
This course provides an interactive and practical learning experience. At the end of each module, you will get an opportunity to revise everything you have learned through homework, tasks, and activities. These have been designed to evaluate and further build on the concepts and methods you have learned. Most of these assignments are coding-based, and they will be useful for getting you up and running with implementations.
Apply the concepts to any language to build customized NLP models
Learn machine learning concepts in a more practical way
Build your own applications for automatic text generation and language translators
AI Sciences is a team of experts, PhDs, and artificial intelligence practitioners from fields including computer science, machine learning, and statistics. Some of them work in big companies such as Amazon, Google, Facebook, Microsoft, KPMG, BCG, and IBM. AI Sciences produces a series of courses dedicated to beginners and newcomers on the techniques and methods of machine learning, statistics, artificial intelligence, and data science. They aim to help those who wish to understand these techniques more easily and start with less theory and less extended reading. Today, they publish more comprehensive courses on specific topics for wider audiences. Their courses have successfully helped more than 100,000 students master AI and data science.
1. Introduction
1. Introduction to the Course This video provides an overview of the entire course. |
2. Introduction to the Instructor This video provides an introduction to the instructor. |
3. Introduction to the Co-Instructor In this video, you will be introduced to the co-instructor. |
4. Course Introduction This video introduces you to the course. |
2. Introduction (Regular Expressions)
1. What is Regular Expression This video introduces the concept of regular expressions. |
2. Why Regular Expression In this session, we will understand why regular expressions are used. |
3. ELIZA Chatbot In this lesson, we will discuss the ELIZA chatbot. |
4. Python Regular Expression Package Let's understand the Python regular expression package in detail. |
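For readers who want to try things out as they follow along, here is a minimal sketch of Python's built-in re package, which these lessons rely on; the text and patterns are toy examples, not the course's own exercises:

```python
import re

# search() scans the string and returns a Match object for the first hit, or None.
match = re.search(r"\d+", "Order 66 was executed")
if match:
    print(match.group())  # '66'

# findall() returns every non-overlapping match as a list of strings.
print(re.findall(r"[aeiou]", "regular expression"))
# ['e', 'u', 'a', 'e', 'e', 'i', 'o']
```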
3. Metacharacters (Regular Expressions)
1. Metacharacters Let's get introduced to metacharacters in this lesson. |
2. Metacharacters Bigbrackets Exercise In this lesson, we will learn about the square brackets metacharacter. |
3. Meta Characters Bigbrackets Exercise Solution In the previous lesson, we saw a problem; in this lesson, we will work on the solution together. |
4. Metacharacters Bigbrackets Exercise 2 Let's take a look at another problem and find out the solution. |
5. Metacharacters Bigbrackets Exercise 2 - Solution Let's find out the solution to the problem we saw in the previous session. |
6. Metacharacters Cap In this lesson, we will learn about the Cap (^) metacharacter. |
7. Metacharacters Cap Exercise 3 Now that we are familiar with the square brackets and cap metacharacters, it's time to solve a problem. |
8. Metacharacters Cap Exercise 3 - Solution Let's look at the solution of the problem we discussed in the previous lesson. |
9. Backslash In this session, we will learn about the backslash metacharacter. |
10. Backslash Continued Let's explore more about the backslash metacharacter. |
11. Backslash Continued - 01 We will continue our discussion on the backslash metacharacter in this session as well. |
12. Backslash Squared Brackets Exercise It's exercise time; let's take a look at a problem. |
13. Backslash Squared Brackets Exercise Solution Let's find out the solution to the problem we discussed in the previous lesson. |
14. Backslash Squared Brackets Exercise - Another Solution Let's look at another solution to the same problem. |
15. Backslash Exercise In this session, we will see another practice problem and understand the problem statement first. |
16. Backslash Exercise Solution and Special Sequences Exercise In this session, we will solve the problem we discussed in the previous video. |
17. Solution and Special Sequences Exercise Solution In this session, we will look at an exercise on special sequences for pattern matching. |
18. Metacharacter Asterisk In this session, we will learn about the asterisk metacharacter. |
19. Metacharacter Asterisk Exercise Let's look at an exercise on the asterisk metacharacter. |
20. Metacharacter Asterisk Exercise Solution Let's find the solution to the exercise we discussed in the previous video session. |
21. Metacharacter Asterisk Homework Here is a problem you can solve by yourself (homework). |
22. Metacharacter Asterisk Greedy Matching In this session, we will discuss an important concept called greedy matching. |
23. Metacharacter Plus and Question Mark In this session, we will learn about the Plus and Question mark metacharacters. |
24. Metacharacter Curly Brackets Exercise Let's discuss the curly brackets metacharacter in this session. |
25. Metacharacter Curly Brackets Exercise Solution Let's look at the solution for the problem we discussed in the previous video. |
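As a companion to this section, here is a small, self-contained sketch of the metacharacters covered above; the sample text is invented for illustration:

```python
import re

text = "cat bat rat carrot"

# [] matches any single character from the set.
print(re.findall(r"[cb]at", text))        # ['cat', 'bat']

# Inside [], ^ negates the set (any char that is not c, b, or a space).
print(re.findall(r"[^cb ]at", text))      # ['rat']

# * is greedy by default: it grabs as much as it can.
print(re.search(r"c.*t", text).group())   # 'cat bat rat carrot'

# Adding ? makes the quantifier non-greedy.
print(re.search(r"c.*?t", text).group())  # 'cat'

# {m,n} bounds how many repetitions are allowed.
print(re.findall(r"r{1,2}o", "carrot"))   # ['rro']
```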
4. Pattern Objects (Regular Expressions)
1. Pattern Objects In this session, we will learn about pattern objects. |
2. Pattern Objects Match Method Exercise In this session, we will discuss two specific methods of pattern objects, match() and search(). |
3. Pattern Objects Match Method Exercise Solution Let's find the solution to the problem we discussed in the previous video. |
4. Pattern Objects Match Method Versus Search Method Let's understand the difference between match() and search() methods in this session. |
5. Pattern Objects Finditer Method In this lesson, we will look at the finditer() method. |
6. Pattern Objects Finditer Method Exercise Solution Let's look at the solution to the problem we discussed in the previous video lesson. |
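The difference between these pattern-object methods is easy to see in a short sketch (the string here is illustrative):

```python
import re

pattern = re.compile(r"\d+")
text = "room 101, floor 7"

# match() only succeeds at the very beginning of the string.
print(pattern.match(text))           # None ('room' is not a digit)

# search() scans the whole string for the first occurrence.
print(pattern.search(text).group())  # '101'

# finditer() yields a Match object for every occurrence, with positions.
for m in pattern.finditer(text):
    print(m.group(), m.span())       # '101' (5, 8), then '7' (16, 17)
```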
5. More Metacharacters (Regular Expressions)
1. Metacharacters Logical Or In this lesson, we will learn about the logical OR metacharacter (|). |
2. Metacharacters Beginning and End Patterns In this lesson, we will learn how to form beginning and end patterns using the cap (^) and dollar ($) metacharacters, respectively. |
3. Metacharacters Parentheses In this lesson, we will look at another important metacharacter, the parentheses (). |
6. String Modification (Regular Expressions)
1. String Modification This video introduces you to the concept of string modification. |
2. Word Tokenizer Using Split Method In this lesson, we will learn how to build a simple word tokenizer using the split function. |
3. Sub Method Exercise Let's take a look at an example of string replacement or pattern replacement in this session. |
4. Sub Method Exercise Solution In this session, we will discuss the solution to the problem we discussed in the previous lesson. |
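A minimal sketch of the two string-modification tools this section uses, re.split() and re.sub(), with made-up inputs:

```python
import re

# A naive word tokenizer: split on any run of non-word characters.
print(re.split(r"\W+", "Hello, world! NLP is fun."))
# ['Hello', 'world', 'NLP', 'is', 'fun', ''] (note the trailing empty string)

# sub() replaces every match of the pattern; here we mask digits.
print(re.sub(r"\d", "#", "Call 555-0199"))  # 'Call ###-####'
```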
7. Words and Tokens (Text Preprocessing)
1. What is a Word Let's understand what a word is in this session. |
2. Definition of Word is Task Dependent Let's continue the discussion on 'word' in this session and look at an example. |
3. Vocabulary and Corpus Let's understand the concept of vocabulary and corpus in text preprocessing. |
4. Tokens In this session, we will explore tokens. |
5. Tokenization in Spacy In this session, we will take a look at a powerful Python package called spaCy. |
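If you want a quick taste of spaCy tokenization before the lesson, this sketch (assuming spaCy is installed via pip install spacy) shows how a blank English pipeline splits clitics like "n't":

```python
import spacy

# A blank English pipeline is enough for tokenization alone.
nlp = spacy.blank("en")

doc = nlp("Let's tokenize: don't split this badly!")
print([token.text for token in doc])
# ['Let', "'s", 'tokenize', ':', 'do', "n't", 'split', 'this', 'badly', '!']
```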
8. Sentiment Classification (Text Preprocessing)
1. Yelp Reviews Classification Mini Project Introduction In this lesson, we will work with real data and perform data cleaning using Python. |
2. Yelp Reviews Classification Mini Project Vocabulary Initialization In this lesson, we will start data preparation for the Yelp Review Project. |
3. Yelp Reviews Classification Mini Project Adding Tokens to Vocabulary In this lesson, we will go ahead and add tokens to the vocabulary. |
4. Yelp Reviews Classification Mini Project Look Up Functions in Vocabulary In this lesson, we will add lookup functions to the vocabulary. |
5. Yelp Reviews Classification Mini Project Building Vocabulary from Data In this lesson, we will add another function that will take a data frame as an input and build the vocabulary. |
6. Yelp Reviews Classification Mini Project One-Hot Encoding In this video, you will learn the process of one-hot encoding that enables us to change the data into vector form. |
7. Yelp Reviews Classification Mini Project One-Hot Encoding Implementation By now, you should be familiar with one-hot encoding. Let's implement this concept in this session. |
8. Yelp Reviews Classification Mini Project Encoding Documents In this lesson, you will learn the process of encoding documents. |
9. Yelp Reviews Classification Mini Project Encoding Documents Implementation Now that you are familiar with encoding documents, let's go ahead and implement it in this lesson. |
10. Yelp Reviews Classification Mini Project Train Test Splits In this video, you will learn to use train test splits. |
11. Yelp Reviews Classification Mini Project Feature Computation In this session, we will discuss feature computation. |
12. Yelp Reviews Classification Mini Project Classification In this session, we will discuss classification. |
9. Language Independent Tokenization (Text Preprocessing)
1. Tokenization in Detail Introduction In this lesson, we will explore the concept of text normalization. |
2. Tokenization is Hard In this session, we will explore space-based tokenization. |
3. Tokenization Byte Pair Encoding Let's understand the data-driven approach using byte pair encoding in this session. |
4. Tokenization Byte Pair Encoding Example In this session, we will take a look at an example of byte pair encoding for better understanding. |
5. Tokenization Byte Pair Encoding on Test Data Let's apply byte pair encoding on test data in this session. |
6. Tokenization Byte Pair Encoding Implementation Get Pair Counts In this video, we will start implementation of byte pair encoding to get pair counts. |
7. Tokenization Byte Pair Encoding Implementation Merge in Corpus Let's continue the implementation of byte pair encoding in this session. We will merge the best pair in the corpus. |
8. Tokenization Byte Pair Encoding Implementation BPE Training In this session, we will create the entire training setup that will generate the byte pair encoding statistics. |
9. Tokenization Byte Pair Encoding Implementation BPE Encoding In this session, we will take an example word and find out its tokenization. |
10. Tokenization Byte Pair Encoding Implementation BPE Encoding One Pair In this session, we will add a function that merges a pair based on the byte pair encoding statistics obtained during training. |
11. Tokenization Byte Pair Encoding Implementation BPE Encoding One Pair 1 In this lesson, we will write a function that will encode any new word using byte pair encoding. |
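The core of byte pair encoding is the two functions these lessons build: one that counts adjacent symbol pairs and one that merges the most frequent pair. Here is a compact sketch, with function names and a toy corpus of our own choosing rather than the course's code:

```python
from collections import Counter

def get_pair_counts(corpus):
    """Count adjacent symbol pairs over a corpus of {word-as-tuple: frequency}."""
    pairs = Counter()
    for symbols, freq in corpus.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(corpus, pair):
    """Replace every occurrence of the given pair with one merged symbol."""
    merged = {}
    for symbols, freq in corpus.items():
        out, i = [], 0
        while i < len(symbols):
            if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

# Toy corpus: each word is a tuple of characters plus an end-of-word marker.
corpus = {("l", "o", "w", "</w>"): 5, ("l", "o", "w", "e", "r", "</w>"): 2}
best = get_pair_counts(corpus).most_common(1)[0][0]
corpus = merge_pair(corpus, best)
print(best, list(corpus))
# ('l', 'o') [('lo', 'w', '</w>'), ('lo', 'w', 'e', 'r', '</w>')]
```

Repeating these two steps for a fixed number of merges yields the BPE statistics the training lessons produce.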
10. Text Normalization(Text Preprocessing)
1. Word Normalization Case Folding In this session, let's take a look at a few preprocessing issues that one might encounter and how we can use normalization techniques to overcome them. |
2. Word Normalization Lemmatization In this session, we will discuss the normalization technique of lemmatization. |
3. Word Normalization Stemming In this session, you will learn about stemming, a simpler, rule-based alternative to lemmatization. |
4. Word Normalization Sentence Segmentation In this session, let's look at an algorithm to understand the concept of sentence segmentation. |
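A brief sketch of the normalization steps named above, using NLTK's stemmer and lemmatizer (one common tool choice; the course may use different packages):

```python
import nltk
from nltk.stem import PorterStemmer, WordNetLemmatizer

nltk.download("wordnet", quiet=True)   # lookup data needed by the lemmatizer

stemmer = PorterStemmer()
lemmatizer = WordNetLemmatizer()

for word in ["Studies", "Running", "Better"]:
    folded = word.lower()                           # case folding
    print(folded,
          stemmer.stem(folded),                     # crude suffix stripping
          lemmatizer.lemmatize(folded, pos="v"))    # dictionary lookup with a POS
# studies -> studi / study, running -> run / run, better -> better / better
```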
11. String Matching and Spelling Correction (Text Preprocessing)
1. Spelling Correction Minimum Edit Distance Introduction Learn how to use the minimum edit distance algorithm to achieve spelling correction. |
2. Spelling Correction Minimum Edit Distance Example Learn to differentiate between two words using minimum edit distance. Here is an example of minimum edit distance calculation. |
3. Spelling Correction Minimum Edit Distance Table Filling Learn to calculate minimum edit distance in this session. |
4. Spelling Correction Minimum Edit Distance Dynamic Programming Let's understand the concept of spelling correction using minimum edit distance in a little more detail before we start writing the code. |
5. Spelling Correction Minimum Edit Distance Pseudocode Before we start writing our code in Python, let's first write the pseudocode to calculate the minimum edit distance. |
6. Spelling Correction Minimum Edit Distance Implementation Learn to write the minimum edit distance algorithm in Python in this lesson. |
7. Spelling Correction Minimum Edit Distance Implementation Bug fixing We will continue writing our Python code for minimum edit distance and fix bugs in this lesson. |
8. Spelling Correction Implementation We will use the edit distance package in this session to build a spelling correction application. |
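For reference, the classic dynamic-programming formulation these lessons implement can be sketched as follows (unit costs assumed; the course may weight substitutions differently):

```python
def min_edit_distance(source, target):
    """Levenshtein distance with unit costs, filled bottom-up."""
    n, m = len(source), len(target)
    D = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = i                            # cost of i deletions
    for j in range(1, m + 1):
        D[0][j] = j                            # cost of j insertions
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = 0 if source[i - 1] == target[j - 1] else 1
            D[i][j] = min(D[i - 1][j] + 1,         # delete
                          D[i][j - 1] + 1,         # insert
                          D[i - 1][j - 1] + sub)   # substitute or match
    return D[n][m]

print(min_edit_distance("intention", "execution"))  # 5 with unit costs
```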
12. Language Modeling
1. What is a Language Model This video introduces you to the concept of language modeling. |
2. Language Model Formal Definition This video session provides the language model definition. |
3. Language Model Curse of Dimensionality In this lesson, you will learn how dimensionality is an important factor in language modeling. |
4. Language Model Markov Assumption and N-Grams In this session, we will learn about the Markov assumption and n-grams. |
5. Language Model Implementation Setup In this session, we will implement the concepts about language modeling that we have learned so far. |
6. Language Model Implementation N-grams Function We will start writing our code to build the language model in this session. |
7. Language Model Implementation Update Counts Function In this session, we will implement the function that updates the n-gram counts. |
8. Language Model Implementation Probability Model Function In this session, we will convert our model to probability values rather than counts. |
9. Language Model Implementation Reading Corpus In this session, we will be importing some real data. |
10. Language Model Implementation Sampling Text So far, we have successfully created our model. It's time to check whether it works. In this session, we will be sampling text. |
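To make the pipeline concrete, here is a compact bigram version of the model this section builds, with a toy corpus; names like train_bigram_model are ours, not the course's:

```python
import random
from collections import defaultdict, Counter

def train_bigram_model(tokens):
    """Count bigrams, then normalize the counts into probabilities."""
    counts = defaultdict(Counter)
    for w1, w2 in zip(tokens, tokens[1:]):
        counts[w1][w2] += 1
    return {w1: {w2: c / sum(nxt.values()) for w2, c in nxt.items()}
            for w1, nxt in counts.items()}

def sample_text(model, word, length=5):
    """Generate text by repeatedly sampling the next word from the model."""
    out = [word]
    for _ in range(length):
        nxt = model.get(out[-1])
        if not nxt:
            break
        words, probs = zip(*nxt.items())
        out.append(random.choices(words, weights=probs)[0])
    return " ".join(out)

tokens = "the cat sat on the mat the cat ran".split()
model = train_bigram_model(tokens)
print(model["the"])             # {'cat': 0.666..., 'mat': 0.333...}
print(sample_text(model, "the"))
```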
13. Topic Modelling with Word and Document Representations
1. One-Hot Vectors In this video, we will learn about word representations in vector space. |
2. One-Hot Vectors Implementation Learn how to implement one-hot vector encoding. |
3. One-Hot Vectors Limitations This video provides a detailed explanation about the limitations of one-hot vectors. |
4. One-Hot Vectors Used as Target Labeling In this lesson, we will learn about target labeling. |
5. Term Frequency for Document Representations This video introduces you to the concept of document representation. |
6. Term Frequency for Document Representations Implementations In this session, we will write code to compute the term frequency of a document. |
7. Term Frequency for Word Representations In this session, we will talk about term frequency for word representation. |
8. TFIDF for Document Representations In this lesson, you will learn about term frequency-inverse document frequency (TFIDF). |
9. TFIDF for Document Representations Implementation Reading Corpus In this session, we will implement the TFIDF model to represent different documents from scratch. |
10. TFIDF for Document Representations Implementation Computing Document Frequency In this session, we will compute the document frequency for each term. |
11. TFIDF for Document Representations Implementation Computing TFIDF In this video, we will compute the complete TFIDF vectors. |
12. Topic Modeling with TFIDF 1 In this session, we will implement topic modeling with TFIDF. |
13. Topic Modeling with TFIDF 2 In this session, we're going to add a function that will build a model for the TFIDF transformations. |
14. Topic Modeling with TFIDF 3 In this session, we're going to write a function that applies the TFIDF transformation using the models we have built. |
15. Topic Modeling with TFIDF 4 In this session, we will train our classifier. |
16. Topic Modeling with Gensim In this session, we will import corpora models and similarities from Gensim for our corpus. |
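Before reaching for Gensim, it helps to see TFIDF computed by hand. A from-scratch sketch over a toy corpus (document frequency first, then tf times idf):

```python
import math
from collections import Counter

# Toy corpus: three already-tokenized documents.
docs = [["nlp", "is", "fun"], ["nlp", "is", "hard"], ["dogs", "are", "fun"]]
N = len(docs)

# Document frequency: in how many documents does each term appear?
df = Counter(term for doc in docs for term in set(doc))

def tfidf(doc):
    """Term frequency times inverse document frequency for one document."""
    tf = Counter(doc)
    return {t: (c / len(doc)) * math.log(N / df[t]) for t, c in tf.items()}

print(tfidf(docs[0]))
# Every term here appears in 2 of 3 docs, so each scores (1/3) * log(3/2);
# a term appearing in all documents would score log(1) = 0.
```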
14. Word Embeddings LSI
1. Word Co-Occurrence Matrix This video explains the concept of the word co-occurrence matrix, also called the term-term or context-term matrix. |
2. Word Co-Occurrence Matrix Versus Document-Term Matrix In this session, we will look at the difference between the word co-occurrence matrix and the document-term matrix. |
3. Word Co-Occurrence Matrix Implementation Preparing Data In this session, we will prepare our dataset for computing the word co-occurrence matrix. |
4. Word Co-Occurrence Matrix Implementation Preparing Data 2 In this session, we will focus on more frequent words. |
5. Word Co-Occurrence Matrix Implementation Preparing Data Getting Vocabulary In this session, you will learn how to compute the vocabulary for the processed list. |
6. Word Co-Occurrence Matrix Implementation Final Function In this session, let's add a function that computes the word co-occurrence matrix. |
7. Word Co-Occurrence Matrix Implementation Handling Memory Issues on Large Corpora In this lesson, you will learn how to handle memory issues on large corpora. |
8. Word Co-Occurrence Matrix Sparsity This video explains the concept of sparsity. |
9. Word Co-Occurrence Matrix Positive Point Wise Mutual Information PPMI This video explains positive pointwise mutual information (PPMI) with examples. |
10. PCA for Dense Embeddings In this lesson, you will learn about PCA for dense embedding. |
11. Latent Semantic Analysis This lesson gives a detailed explanation about latent semantic analysis. |
12. Latent Semantic Analysis Implementation In this session, we will implement latent semantic analysis using truncated SVD. |
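A minimal sketch of LSA via truncated SVD, here using scikit-learn's TruncatedSVD on a toy term matrix (just for orientation; the lessons build this step by step):

```python
import numpy as np
from sklearn.decomposition import TruncatedSVD

# Tiny count matrix: rows are documents, columns are terms.
X = np.array([[2, 1, 0, 0],
              [1, 2, 0, 0],
              [0, 0, 1, 2],
              [0, 0, 2, 1]], dtype=float)

# Keep only the top k=2 latent dimensions: dense document vectors
# in the latent semantic space.
svd = TruncatedSVD(n_components=2, random_state=0)
doc_vectors = svd.fit_transform(X)
print(doc_vectors.shape)               # (4, 2)
print(svd.explained_variance_ratio_)   # variance captured by each dimension
```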
15. Word Semantics
1. Cosine Similarity This video explains the concept of cosine similarity. |
2. Cosine Similarity Getting Norms of Vectors In this session, we will learn how to compute vector norms, the first step in calculating cosine similarity. |
3. Cosine Similarity Normalizing Vectors In this session, we will write a function for normalizing a vector. |
4. Cosine Similarity with More than One Vector In this session, we will learn to write a function for normalizing more than one vector. |
5. Cosine Similarity Getting Most Similar Words in the Vocabulary In this session, you will learn how to get the most similar words in the vocabulary. |
6. Cosine Similarity Getting Most Similar Words in the Vocabulary Fixing bug In this session, let's fix a few bugs. |
7. Cosine Similarity Word2Vec Embeddings In this video, we will see how we can compute the similarity of one word with another word using cosine similarity. |
8. Word Analogies Let's look at word analogies in this session. |
9. Word Analogies Implementation 1 In this session, you will learn how to write a function to get the analogy. |
10. Word Analogies Implementation 2 Let's continue the discussion about the functions that we have added to get the analogy and fix a few bugs. |
11. Word Visualizations This video explains word visualizations and important concepts related to it. |
12. Word Visualizations Implementation In this video, we will learn how to apply PCA or any other kind of dimensionality reduction technique. |
13. Word Visualizations Implementation 2 In this session, we will implement word visualization. |
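The two building blocks of this section, cosine similarity and nearest-neighbor lookup, fit in a short sketch; the three 2-dimensional "embeddings" are invented for illustration:

```python
import numpy as np

def cosine_similarity(u, v):
    """Dot product of two vectors divided by the product of their norms."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

def most_similar(word, embeddings, topn=3):
    """Rank every other word in the vocabulary by cosine similarity."""
    target = embeddings[word]
    scores = {w: cosine_similarity(target, vec)
              for w, vec in embeddings.items() if w != word}
    return sorted(scores.items(), key=lambda kv: -kv[1])[:topn]

embeddings = {"king": np.array([0.9, 0.8]),
              "queen": np.array([0.88, 0.82]),
              "apple": np.array([0.1, 0.95])}
print(most_similar("king", embeddings))   # 'queen' ranks above 'apple'
```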
16. Word2vec(Optional)
1. Static and Dynamic Embeddings This video introduces you to the Word2Vec embeddings. |
2. Self Supervision In this session, we will discuss some basic concepts of Word2Vec embeddings. |
3. Word2Vec Algorithm Abstract This session provides a step-by-step explanation of the Word2Vec algorithm. |
4. Word2Vec: Why Negative Sampling In this session, we will learn about negative sampling and why it is required. |
5. Word2Vec: What is Skip Gram Learn about skip-gram in this session. |
6. Word2Vec: How to Define Probability Law In this session, we will learn about the probability law and how we can define it. |
7. Word2Vec Sigmoid This session provides a detailed description of the sigmoid function and how it can be used to model probabilities. |
8. Word2Vec Formalizing Loss Function In this session, we will learn how to formalize the loss function. |
9. Word2Vec Loss Function Maximizing the similarity between the target and the context is equivalent to minimizing the similarity with the negative samples. Let's continue our discussion on the loss function in this session. |
10. Word2Vec Gradient Descent Step This session explains the gradient descent step. |
11. Word2Vec Implementation Preparing Data Let's go ahead and implement Word2Vec. The first step is to prepare the data. |
12. Word2Vec Implementation Gradient Step In this session, we will continue implementing Word2Vec and perform the gradient descent step. |
13. Word2Vec Implementation Driver Function In this session, we will go ahead and add the driver function. |
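For readers who like to see the math as code: a single skip-gram-with-negative-sampling gradient step can be sketched in NumPy as below. The function name and random toy vectors are ours; real training loops over many (target, context) pairs:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(target, c_pos, c_negs, lr=0.1):
    """One gradient step for a single (target, context) pair.

    Pulls the target vector toward the true context vector and pushes it
    away from the negative (noise) context vectors."""
    grad = (sigmoid(target @ c_pos) - 1.0) * c_pos       # positive pair
    for c_neg in c_negs:
        grad += sigmoid(target @ c_neg) * c_neg          # negative samples
    return target - lr * grad

rng = np.random.default_rng(0)
t = rng.normal(size=4)
pos, negs = rng.normal(size=4), [rng.normal(size=4) for _ in range(2)]
t_new = sgns_step(t, pos, negs)
print("before:", sigmoid(t @ pos), "after:", sigmoid(t_new @ pos))
# With a small learning rate, similarity to the true context typically rises.
```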
17. Need of Deep Learning for NLP (NLP with Deep Learning DNN)
1. Why RNNs for NLP In this section, let's get introduced to the concept of deep learning. |
2. PyTorch Installation and Tensors Introduction In this session, we will install PyTorch and get introduced to tensors. |
3. Automatic Differentiation PyTorch In this session, we will understand automatic differentiation in detail. |
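A two-line taste of what automatic differentiation in PyTorch looks like:

```python
import torch

# requires_grad=True tells autograd to track operations on this tensor.
x = torch.tensor([2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()   # y = x1^2 + x2^2

y.backward()         # automatic differentiation fills in x.grad
print(x.grad)        # tensor([4., 6.]), since dy/dxi = 2*xi
```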
18. Introduction (NLP with Deep Learning DNN)
1. Why DNNs in Machine Learning In this lesson, we will learn about deep neural networks and their importance in machine learning. |
2. Representational Power and Data Utilization Capacity of DNN Let's understand why deep neural networks are preferred with the help of universal approximation theorem. |
3. Perceptron In this video, we will learn about perceptron/neuron. |
4. Perceptron Implementation In this session, we will implement a simple perceptron. |
5. DNN Architecture In this session, we will dive deeper into the deep neural network architecture. |
6. DNN Forwardstep Implementation In this session, you will learn how to build a neural network with two computational layers and one output unit. The layers will contain different numbers of neurons. |
7. DNN Why Activation Function is Required In this video, you will get a detailed explanation about the activation function and why it is required. |
8. DNN Properties of Activation Function In this lesson, let's look at the properties of the activation function. |
9. DNN Activation Functions in PyTorch In this lesson, we will define an activation function in PyTorch. |
10. DNN What is Loss Function In this session, let's learn in depth about the loss function, which gradient descent minimizes. |
11. DNN Loss Function in PyTorch In this session, we will look at an example of a loss function in PyTorch. |
19. Training (NLP with Deep Learning DNN)
1. DNN Gradient Descent In this lesson, we will dive deeper into gradient descent. |
2. DNN Gradient Descent Implementation In this lesson, we will understand the concept of gradient descent with the help of an example. |
3. DNN Gradient Descent Stochastic Batch Minibatch In this session, we will look at the different ways to implement gradient descent such as stochastic, minibatch, and batch. |
4. DNN Gradient Descent Summary Let's summarize everything we have learned about gradient descent in this session. |
5. DNN Implementation Gradient Step In this session, we will add the sigmoid activation function to our code. |
6. DNN Implementation Stochastic Gradient Descent In this session, we will add a training function for stochastic gradient descent. |
7. DNN Implementation Batch Gradient Descent In this session, you will learn how to implement batch gradient descent. |
8. DNN Implementation Minibatch Gradient Descent In this session, we will implement minibatch gradient descent. |
9. DNN Implementation in PyTorch In this lesson, we will perform DNN implementation in PyTorch. |
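The stochastic, minibatch, and batch variants covered in this section differ only in how much data each gradient step sees. A minimal PyTorch minibatch loop on toy regression data (the architecture and learning rate are illustrative):

```python
import torch
import torch.nn as nn

# Toy data: learn y = 3x + 1 from noisy samples.
X = torch.rand(256, 1)
y = 3 * X + 1 + 0.05 * torch.randn(256, 1)

model = nn.Sequential(nn.Linear(1, 8), nn.ReLU(), nn.Linear(8, 1))
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

batch_size = 32
for epoch in range(100):
    perm = torch.randperm(len(X))           # reshuffle every epoch
    for i in range(0, len(X), batch_size):  # one gradient step per minibatch
        idx = perm[i:i + batch_size]
        optimizer.zero_grad()               # clear gradients from the last step
        loss = loss_fn(model(X[idx]), y[idx])
        loss.backward()                     # backpropagate
        optimizer.step()                    # apply the gradient step

print(loss.item())  # should approach the noise floor
```

Setting batch_size to 1 gives stochastic gradient descent; setting it to len(X) gives full-batch gradient descent.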
20. Hyperparameters (NLP with Deep Learning DNN)
1. DNN Weights Initializations In this session, we will learn about weight initialization. |
2. DNN Learning Rate In this session, we will understand the step size, also known as the learning rate. |
3. DNN Batch Normalization This video provides information about batch normalization and its importance. |
4. DNN Batch Normalization Implementation In this video, we will implement batch normalization using the PyTorch framework. |
5. DNN Optimizations In this session, we will learn about the different optimization techniques. |
6. DNN Dropout Learn all about dropouts in this video lesson. |
7. DNN Dropout in PyTorch In this session, let's implement a dropout layer in our model. |
8. DNN Early Stopping In this lesson, you will learn about early stopping. |
9. DNN Hyperparameters Let's take a look at the list of all the hyperparameters we have learned so far. |
10. DNN PyTorch CIFAR10 Example In this session, we will implement a deep neural network in PyTorch using a real dataset, CIFAR-10. |
21. Introduction (NLP with Deep Learning RNN)
1. What is RNN Let's understand RNN with the help of an example in this session. |
2. Understanding RNN with a Simple Example In this lesson, we will walk through a simple example to understand how an RNN works. |
3. RNN Applications Human Activity Recognition In this session, we will discuss the application of RNNs to human activity recognition. |
4. RNN Applications Image Captioning Let's understand the application of RNN in image captioning. |
5. RNN Applications Machine Translation In this video, we will learn about the application of RNN in machine translation. |
6. RNN Applications Speech Recognition Stock Price Prediction In this lesson, we will take a look at the application of RNN in speech recognition and stock price prediction. |
7. RNN Models In this video, we will cover the different RNN models in detail. |
22. Mini-Project Language Modelling (NLP with Deep Learning RNN)
1. Language Modeling Next Word Prediction This video provides an overview of the language modeling mini-project. |
2. Language Modeling Next Word Prediction Vocabulary Index In this lesson, let's understand the vocabulary index for this particular project. |
3. Language Modeling Next Word Prediction Vocabulary Index Embeddings In this session, you will learn the vocabulary index embedding that will be used in the mini project. |
4. Language Modeling Next Word Prediction RNN Architecture This section provides a detailed explanation of the RNN architecture for this particular project. |
5. Language Modeling Next Word Prediction Python 1 Now that we have discussed the project in detail, it's time to start coding. |
6. Language Modeling Next Word Prediction Python 2 Let's continue with our coding in this session as well. We will define the weight matrices and set the gradients to be true. |
7. Language Modeling Next Word Prediction Python 3 Let's continue with our coding and define the forward step in this session. |
8. Language Modeling Next Word Prediction Python 4 In this session, we will continue coding for the language model. |
9. Language Modeling Next Word Prediction Python 5 In this session, we will implement the loss function in our model. |
10. Language Modeling Next Word Prediction Python 6 In this lesson, we will define the train function. |
23. Mini-Project Sentiment Classification (NLP with Deep Learning RNN)
1. Vocabulary Implementation In this session, we will start working on our project with vocabulary implementation. |
2. Vocabulary Implementation Helpers In this session, we will add some tokens to our code. |
3. Vocabulary Implementation From File In this session, we will add a function that will build a vocabulary from the data frame. |
4. Vectorizer In this lesson, we will add a function to vectorize the reviews followed by building the RNN architecture. |
5. RNN Setup We will continue setting up RNN in this session. |
6. RNN Setup 1 In this session, we will continue with our coding and add the sigmoid function. |
24. RNN in PyTorch (NLP with Deep Learning RNN)
1. RNN In PyTorch Introduction This video provides a short recap followed by an overview of the entire section. |
2. RNN in PyTorch Embedding Layer In this session, we will first import a few modules followed by embedding layers. |
3. RNN in PyTorch nn.RNN In this session, we will define the vocabulary size followed by the index vector. |
4. RNN in PyTorch Output Shapes In this session, we will apply the RNN to the inputs. |
5. RNN in PyTorch Gated Units In this session, let's learn about the different gated models. |
6. RNN in PyTorch Gated Units GRU LSTM In this session, you will learn how to replace a simple RNN with gated units GRU and LSTM. |
7. RNN in PyTorch Bidirectional RNN In this session, we will learn about bidirectional RNN. |
8. RNN in PyTorch Bidirectional RNN Output Shapes In this lesson, you will learn how to implement a bidirectional RNN. |
9. RNN in PyTorch Bidirectional RNN Output Shapes Separation In this session, we will learn how to separate the output shapes. |
10. RNN in PyTorch Example In this lesson, we will build simple RNN model using the PyTorch interface. |
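A compact sketch of the pieces this section assembles: an embedding layer feeding a bidirectional gated RNN (a GRU here), with the output shapes the lessons walk through. Dimensions are arbitrary toy values:

```python
import torch
import torch.nn as nn

vocab_size, embed_dim, hidden_dim = 100, 16, 32

embedding = nn.Embedding(vocab_size, embed_dim)
rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True, bidirectional=True)

# A batch of 4 sequences, each 7 token indices long.
tokens = torch.randint(0, vocab_size, (4, 7))
embedded = embedding(tokens)   # shape (4, 7, 16)
output, hidden = rnn(embedded)

print(output.shape)  # (4, 7, 64): per-step states, both directions concatenated
print(hidden.shape)  # (2, 4, 32): final state for each direction
```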
25. Advanced RNN Models (NLP with Deep Learning RNN)
1. RNN Encoder Decoder In this session, we will discuss the encoder-decoder model. |
2. RNN Attention In this lesson, we will understand the attention mechanism. |
26. Neural Machine Translation
1. Introduction to Dataset and Packages In this lesson, we will look at the dataset and packages that we're going to use to build the encoder-decoder model. |
2. Implementing Language Class In this session, we will build a class for each language. |
3. Testing Language Class and Implementing Normalization In this lesson, we will test the class we created in our previous lesson. |
4. Reading Datafile In this session, we will define a function to read the data from the file. |
5. Reading Building Vocabulary In this lesson, we will be working on building the vocabulary. |
6. EncoderRNN Now that we have prepared all the data and the helper functions are in place, let's go ahead and define the encoder RNN class in this session. |
7. DecoderRNN In this session, we will add the code for the decoder RNN. |
8. DecoderRNN Forward Step In this session, we will be defining the forward function. |
9. DecoderRNN Helper Functions In this session, we will define some auxiliary functions. |
10. Training Module In this session, we will define the training routine. |
11. Stochastic Gradient Descent In this session, we will implement stochastic gradient descent. |
12. NMT Training In this session, we will cover NMT training. |
13. NMT Evaluation In this session, we will cover NMT evaluation. |