Fundamentals of Neural Networks

  • 30 Day Money Back Guarantee
  • Completion Certificate
  • 24/7 Technical Support

Highlights

  • On-Demand course

  • 6 hours 37 minutes

  • All levels

Description

Get started with neural networks and understand the underlying concepts of Neural Networks, Convolutional Neural Networks, and Recurrent Neural Networks. This hands-on course will help you understand deep learning in detail, with no prior coding or programming experience required.

Deep learning can be supervised, semi-supervised, or unsupervised. Deep-learning architectures such as deep neural networks, deep belief networks, deep reinforcement learning, recurrent neural networks, and convolutional neural networks have been applied to fields including computer vision, speech recognition, natural language processing, machine translation, bioinformatics, drug design, medical image analysis, material inspection, and board game programs, where they have produced results comparable to, and in some cases surpassing, human expert performance.

This course covers three sections: (1) Neural Networks, (2) Convolutional Neural Networks (CNN), and (3) Recurrent Neural Networks (RNN). You will learn about logistic regression and linear regression and understand the purpose of neural networks. You will also understand forward and backward propagation as well as the cross-entropy function. Furthermore, you will explore image data, the convolution operation, and residual networks. In the final section of the course, you will understand the use of RNNs, the Gated Recurrent Unit (GRU), and Long Short-Term Memory (LSTM). Code blocks and notebooks are provided to help you understand the topics covered in the course.

By the end of this course, you will have a detailed, hands-on understanding of neural networks. All resources and code files are placed here: https://github.com/PacktPublishing/Fundamentals-in-Neural-Networks

What You Will Learn

  • Learn about linear and logistic regression in ANNs
  • Learn about cross-entropy between two probability distributions
  • Understand the convolution operation, which scans inputs with respect to their dimensions
  • Understand VGG16, a convolutional neural network model
  • Understand why recurrent neural networks are used
  • Understand Long Short-Term Memory (LSTM)

Audience

This course is aimed at a beginner-level audience that wants an in-depth overview of Artificial Intelligence, Deep Learning, and three major types of neural networks: Artificial Neural Networks, Convolutional Neural Networks, and Recurrent Neural Networks. No prior coding or programming experience is required. The course assumes you have your own laptop, and all code is run in Google Colab.

Approach

This course contains detailed discussions of various Deep Learning topics, mathematical descriptions, and code walkthroughs of the three common families of neural networks.

Key Features

  • Understand the intuition behind Artificial Neural Networks, Convolutional Neural Networks, and Recurrent Neural Networks
  • Understand backward and forward propagation in ANNs
  • Understand Bidirectional Recurrent Neural Networks (BRNN)

GitHub Repo

https://github.com/PacktPublishing/Fundamentals-in-Neural-Networks

About the Author
Yiqiao Yin

Yiqiao Yin was a PhD student in statistics at Columbia University. He has a BA in mathematics and an MS in finance from the University of Rochester, and a wide range of research interests in representation learning: feature learning, deep learning, computer vision, and NLP. He is a senior data scientist at LabCorp, an S&P 500 company, developing AI-driven solutions for drug diagnostics and development. He has held professional positions as an enterprise-level data scientist at Bayer, a EURO STOXX 50 company; a quantitative researcher at AQR, working on alternative quantitative strategies for portfolio management and factor-based trading; and an equity trader at T3 Trading on Wall Street.

Course Outline

1. Welcome

1. Welcome Message

This video explains the need for taking up the course and introduces you to the author.

2. Course Outline

This video explains the course outline and what the course has to offer.

2. Artificial Neural Networks

1. Linear Regression

This video explains statistical machine learning, where you will start with the linear regression model.
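As an illustrative sketch (not part of the course materials), a linear regression line can be fit with ordinary least squares in a few lines of NumPy:

```python
import numpy as np

# Synthetic data: y = 2x + 1 plus a little noise
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, size=x.shape)

# Design matrix with a bias column, solved by least squares
X = np.column_stack([x, np.ones_like(x)])
slope, intercept = np.linalg.lstsq(X, y, rcond=None)[0]
```

With so little noise, the recovered slope and intercept land very close to the true values 2 and 1.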

2. Logistic Regression

This video explains logistic regression, which is used when the target is discrete or binary.

3. Purpose of Neural Networks

This video explains the purpose of neural networks.

4. Forward Propagation

This video explains forward propagation and will dive deeper into the architecture of neural networks.
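To illustrate the idea (this sketch is not taken from the course notebooks), a forward pass through one hidden layer is just matrix multiplications followed by activations:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer: 3 input features -> 4 hidden units -> 1 output unit
rng = np.random.default_rng(42)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

def forward(x):
    a1 = sigmoid(W1 @ x + b1)     # hidden-layer activations
    return sigmoid(W2 @ a1 + b2)  # output squashed into (0, 1)

y_hat = forward(np.array([0.5, -1.0, 2.0]))
```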

5. Backward Propagation

This video explains backward propagation, which solves the underlying optimization problem using the gradient descent algorithm.

6. Activation Function

This video explains the role of the activation function, which is an interesting phenomenon in the design of neural networks.
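For reference, three commonly discussed activation functions (ReLU, sigmoid, and tanh) can be written directly in NumPy; this is an illustrative sketch, not course code:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)   # zero for negatives, identity for positives

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))  # squashes into (0, 1)

def tanh(z):
    return np.tanh(z)           # squashes into (-1, 1)

z = np.array([-2.0, 0.0, 2.0])
```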

7. Cross-Entropy Loss Function

This video explains the cross-entropy function, which is designed under the assumption that the variable you are trying to predict is binary.
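As an illustrative sketch (not from the course notebooks), the binary cross-entropy loss over a batch of predictions can be computed as follows:

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Clip predictions away from 0 and 1 to avoid log(0)
    p = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

loss = binary_cross_entropy(np.array([1, 0, 1]), np.array([0.9, 0.1, 0.8]))
```

The loss is small when confident predictions match the labels and grows without bound as a confident prediction is wrong.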

8. Gradient Descent

This video explains the optimization problem using the gradient descent algorithm.
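A minimal sketch of the idea, assuming a fixed learning rate and a one-dimensional quadratic objective (illustrative, not course code):

```python
# Minimise f(w) = (w - 3)^2 by stepping against the gradient f'(w) = 2(w - 3)
def grad(w):
    return 2.0 * (w - 3.0)

w, lr = 0.0, 0.1
for _ in range(100):
    w -= lr * grad(w)  # each step shrinks the distance to the minimum at w = 3
```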

9. Lab 1 - Introduction to Python

This video demonstrates some of the basic commands in Python, specifically the print statement, data structures, variables, and how to define a function.

10. Lab 2 - Introduction to TensorFlow

This video demonstrates some basic operations in TensorFlow, such as creating Tensor objects and applying mathematical operations to them.

11. Lab 3 - Introduction to Neural Network

This video demonstrates how to use the Keras API in TensorFlow to design and craft a neural network architecture.

12. Lab 4 - Functional API

This video demonstrates functional API versus sequential API.

13. Lab 5 - Building Deeper and Wider Model

This video demonstrates how to build a deeper and wider neural network model.

3. Convolutional Neural Networks

1. Image Data

This video explains image data in CNN (Convolutional Neural Network).

2. Tensor and Matrix

This video explains what we mean by Tensor and Matrix.

3. Convolutional Operation

The convolution layer (CONV) uses filters that perform convolution operations as they scan the input with respect to its dimensions. Its hyperparameters include the filter size and stride. The resulting output is called a feature map or activation map.
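The scanning operation can be sketched in plain NumPy (an illustration, not the course's implementation), here with no padding and stride 1:

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D convolution: slide the filter over the input, stride 1."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Elementwise product of the window and the filter, then sum
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16, dtype=float).reshape(4, 4)
kernel = np.ones((2, 2)) / 4.0  # a 2x2 averaging filter
feature_map = conv2d(image, kernel)  # the resulting activation map
```

A 4x4 input scanned with a 2x2 filter yields a 3x3 feature map.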

4. Padding

This video explains padding in convolutional neural networks.

5. Stride

For a convolutional or pooling operation, the stride denotes the number of pixels by which the window moves after each operation.
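The standard output-size formula, floor((n + 2p - f) / s) + 1, can be checked with a small helper (illustrative, not from the course):

```python
def conv_output_size(n, f, padding=0, stride=1):
    """Output size of a conv/pool window: floor((n + 2p - f) / s) + 1."""
    return (n + 2 * padding - f) // stride + 1

# 32x32 input, 5x5 filter, no padding, stride 1 -> 28x28 output
size_a = conv_output_size(32, 5)
# Same filter with padding 2 and stride 2 -> 16x16 output
size_b = conv_output_size(32, 5, padding=2, stride=2)
```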

6. Convolution in 2D and 3D

This video explains Convolution in 2D and 3D.

7. VGG16

This video explains VGG16 which is a convolutional neural network model proposed by K. Simonyan and A. Zisserman from the University of Oxford in the paper "Very Deep Convolutional Networks for Large-Scale Image Recognition".

8. Residual Network

This video explains residual networks. Deeper neural networks are more difficult to train, and residual learning is a framework that eases the training of networks substantially deeper than those used previously.

9. Lab 1 - Introduction to 1-Dimensional Convolution

This video demonstrates convolutional operations in 1-dimension.

10. Lab 2 - Introduction to CNN

This video demonstrates the architecture and how to write the code in TensorFlow on Colab to build a convolutional neural network.

11. Lab 3 - Deep CNN

This video demonstrates a deeper CNN, where you will build a model with a much larger number of trainable parameters.

12. Lab 4 - Transfer Learning

This video demonstrates transfer learning.

4. Recurrent Neural Networks

1. Welcome to RNN

This video explains recurrent neural networks and why we want to use RNN.

2. Why Use RNN

A recurrent neural network is a type of artificial neural network commonly used in speech recognition and natural language processing. Recurrent neural networks recognize the sequential characteristics of data and use patterns to predict the next likely scenario.

3. Language Processing

NLP is a tool that deals with language, structuring data in a way that AI systems can process. NLP uses AI to 'read' through a document and extract key information.

4. Forward Propagation in RNN

The forward propagation in an RNN makes a few assumptions: 1) We assume the hyperbolic tangent activation function for the hidden layer. 2) We assume that the output is discrete, as when the RNN is used to predict words or characters.
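Under those two assumptions, a single RNN step can be sketched in NumPy (illustrative code with made-up dimensions, not the course's notebook):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))  # shift for numerical stability
    return e / e.sum()

# Dimensions: 3 input features, 5 hidden units, 4 output classes
rng = np.random.default_rng(0)
Wxh = rng.normal(scale=0.1, size=(5, 3))  # input -> hidden
Whh = rng.normal(scale=0.1, size=(5, 5))  # hidden -> hidden (recurrence)
Why = rng.normal(scale=0.1, size=(4, 5))  # hidden -> output
bh, by = np.zeros(5), np.zeros(4)

def rnn_step(x_t, h_prev):
    h_t = np.tanh(Wxh @ x_t + Whh @ h_prev + bh)  # tanh hidden activation
    y_t = softmax(Why @ h_t + by)                 # discrete output distribution
    return h_t, y_t

h = np.zeros(5)
for x_t in np.eye(3):  # a toy length-3 one-hot input sequence
    h, y = rnn_step(x_t, h)
```

The hidden state h carries information forward across time steps, which is the defining feature of the architecture.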

5. Backward Propagation Through Time

Backpropagation through time (BPTT) is a gradient-based technique for training certain types of recurrent neural networks. It can be used to train Elman networks. The algorithm was independently derived by numerous researchers.

6. Gated Recurrent Unit (GRU)

Gated recurrent units (GRUs) are a gating mechanism in recurrent neural networks. GRUs have been shown to exhibit better performance on certain smaller and less frequent datasets.

7. Long Short-Term Memory (LSTM)

Long short-term memory (LSTM) is an artificial recurrent neural network (RNN) architecture used in the field of deep learning. LSTM networks are well-suited to classifying, processing, and making predictions based on time series data since there can be lags of unknown duration between important events in a time series.
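A compact sketch of one LSTM step in NumPy; the weight packing and dimensions here are illustrative assumptions, not the course's TensorFlow code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step; W packs the four gate weight matrices row-wise."""
    z = W @ np.concatenate([h_prev, x]) + b
    n = h_prev.size
    f = sigmoid(z[:n])           # forget gate: what to keep from c_prev
    i = sigmoid(z[n:2 * n])      # input gate: what to write
    o = sigmoid(z[2 * n:3 * n])  # output gate: what to expose
    g = np.tanh(z[3 * n:])       # candidate cell state
    c = f * c_prev + i * g       # updated long-term cell state
    h = o * np.tanh(c)           # updated hidden state
    return h, c

rng = np.random.default_rng(1)
hidden, inputs = 4, 3
W = rng.normal(scale=0.1, size=(4 * hidden, hidden + inputs))
b = np.zeros(4 * hidden)
h, c = lstm_step(rng.normal(size=inputs), np.zeros(hidden), np.zeros(hidden), W, b)
```

The separate cell state c is what lets LSTMs bridge lags of unknown duration between important events.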

8. Bi-Directional RNN

Bidirectional recurrent neural networks (BRNN) connect two hidden layers of opposite directions to the same output. BRNNs are especially useful when the context of the input is needed. For example, in handwriting recognition, the performance can be enhanced by knowledge of the letters located before and after the current letter.

9. Lab 1 - RNN in Text Classification

This video demonstrates how to design a recurrent neural network or RNN.

10. Lab 2 - Sequence to Sequence Stock Candlestick Forecast

This video demonstrates sequence-to-sequence stock candlestick forecast.

About The Provider

Packt
Birmingham
Founded in 2004 in Birmingham, UK, Packt’s mission is to help the world put software to work in new ways, through the delivery of effective learning and i...
