Deep Neural Networks using Python Online Course
This course is a step-by-step guide to becoming a deep learning expert. Starting with the theory, you'll implement each concept in Python, the leading language for machine learning. The course covers essential topics such as data preprocessing, machine learning fundamentals, and Deep Neural Networks (DNNs), progressing from basic to advanced levels. By the end, you'll have the skills to train models and make predictions on real-world datasets.
Who is this Course for?
This course is ideal for those interested in data science or looking to advance their skills. It’s perfect for students wanting to master DNNs with real datasets or implement them in practical projects. A background in deep learning is recommended for the best learning experience.
What you will learn
- Learn machine learning and neural network basics (a minimal perceptron sketch follows this list)
- Understand neural network architecture
- Train a DNN using the Gradient Descent algorithm
- Implement a full DNN using NumPy
- Build a DNN structure from scratch with Python
- Work on a deep learning project with the Iris dataset
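
As a taste of the neural-network basics listed above, here is a minimal sketch of a single perceptron trained with the classic update rule on the logical AND gate. The learning rate, epoch count, and threshold convention are illustrative choices, not the course's exact code.

```python
# A minimal perceptron learning the logical AND gate (illustrative values only).
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])   # inputs
y = np.array([0, 0, 0, 1])                        # AND labels

w, b, lr = np.zeros(2), 0.0, 1.0                  # weights, bias, learning rate

def predict(x):
    # hard-threshold activation: fire if the weighted sum is non-negative
    return 1 if np.dot(w, x) + b >= 0 else 0

for epoch in range(10):                           # enough passes for AND to converge
    for xi, target in zip(X, y):
        error = target - predict(xi)              # perceptron update rule
        w += lr * error * xi
        b += lr * error

print([predict(xi) for xi in X])                  # expected: [0, 0, 0, 1]
```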
Course Table of Contents
Introduction
- Introduction to Instructor
- Introduction to Course
Basics of Deep Learning
- Problem to Solve Part 1
- Problem to Solve Part 2
- Problem to Solve Part 3
- Linear Equation
- Linear Equation Vectorized
- 3D Feature Space
- N-Dimensional Space
- Theory of Perceptron
- Implementing Basic Perceptron
- Logical Gates for Perceptrons
- Perceptron Training Part 1
- Perceptron Training Part 2
- Learning Rate
- Perceptron Training Part 3
- Perceptron Algorithm
- Coding Perceptron Algo (Data Reading and Visualization)
- Coding Perceptron Algo (Perceptron Step)
- Coding Perceptron Algo (Training Perceptron)
- Coding Perceptron Algo (Visualizing the Results)
- Problem with Linear Solutions
- Solution to Problem
- Error Functions
- Discrete Versus Continuous Error Function
- Sigmoid Function
- Multi-Class Problem
- Problem of Negative Scores
- The Need for SoftMax
- Coding SoftMax
- One-Hot Encoding
- Maximum Likelihood Part 1
- Maximum Likelihood Part 2
- Cross Entropy
- Cross Entropy Formulation
- Multi-Class Cross Entropy
- Cross Entropy Implementation
- Sigmoid Function Implementation
- Output Function Implementation
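
To ground the section above, here is a minimal sketch of the sigmoid, softmax, and cross-entropy functions it builds toward. The array shapes (one row of scores per example, one-hot labels) and the small numerical epsilon are illustrative assumptions, not the course's exact signatures.

```python
import numpy as np

def sigmoid(z):
    # squashes any real score into the (0, 1) range
    return 1.0 / (1.0 + np.exp(-z))

def softmax(scores):
    # turns a row of (possibly negative) class scores into probabilities
    e = np.exp(scores - np.max(scores, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(y_onehot, probs):
    # multi-class cross entropy: -sum(y * log(p)), averaged over examples
    return -np.mean(np.sum(y_onehot * np.log(probs + 1e-12), axis=-1))

probs = softmax(np.array([[2.0, 1.0, -1.0]]))
print(probs, cross_entropy(np.array([[1, 0, 0]]), probs))
```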
Deep Learning
- Introduction to Gradient Descent
- Convex Functions
- Use of Derivatives
- How Gradient Descent Works
- Gradient Step
- Logistic Regression Algorithm
- Data Visualization and Reading
- Updating Weights in Python
- Implementing Logistic Regression
- Visualization and Results
- Gradient Descent Versus Perceptron
- Linear to Non-Linear Boundaries
- Combining Probabilities
- Weighted Sums
- Neural Network Architecture
- Layers and Deep Networks
- Multi-Class Classification
- Basics of Feed Forward
- Feed Forward for a Deep Net
- Deep Learning Algo Overview
- Basics of Backpropagation
- Updating Weights
- Chain Rule for Backpropagation
- Sigma Prime
- Data Analysis NN (Neural Networks) Implementation
- One-Hot Encoding (NN Implementation)
- Scaling the Data (NN Implementation)
- Splitting the Data (NN Implementation)
- Helper Functions (NN Implementation)
- Training (NN Implementation)
- Testing (NN Implementation)
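
The section above culminates in a NumPy implementation of a small network: scaling, one-hot encoding, splitting, feed forward, backpropagation, training, and testing. The sketch below follows that outline on the Iris data; the layer sizes, learning rate, and the use of scikit-learn purely for loading and splitting the data are illustrative assumptions, not the course's exact code.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X = (X - X.mean(axis=0)) / X.std(axis=0)                    # scale the data
Y = np.eye(3)[y]                                            # one-hot encoding
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2, random_state=0)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(0, 0.1, (4, 8)), np.zeros(8)            # input -> hidden
W2, b2 = rng.normal(0, 0.1, (8, 3)), np.zeros(3)            # hidden -> output
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(2000):
    # feed forward
    h = sigmoid(X_tr @ W1 + b1)
    z = h @ W2 + b2
    p = np.exp(z - z.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)                       # softmax output
    # backpropagation of the cross-entropy error
    d_out = (p - Y_tr) / len(X_tr)
    d_hid = (d_out @ W2.T) * h * (1 - h)                    # sigmoid derivative term
    # gradient descent step
    W2 -= lr * (h.T @ d_out);    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X_tr.T @ d_hid); b1 -= lr * d_hid.sum(axis=0)

# testing: softmax is monotone, so the argmax of the raw scores is enough
test_pred = (sigmoid(X_te @ W1 + b1) @ W2 + b2).argmax(axis=1)
print("test accuracy:", (test_pred == Y_te.argmax(axis=1)).mean())
```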
Optimizations
- Underfitting vs Overfitting
- Early Stopping
- Quiz
- Solution and Regularization
- L1 and L2 Regularization
- Dropout
- Local Minima Problem
- Random Restart Solution
- Vanishing Gradient Problem
- Other Activation Functions
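
As a rough illustration of two remedies from the section above, here is a sketch of an L2 penalty folded into each gradient step together with early stopping on a validation loss. `compute_gradients` and `validation_loss` are hypothetical placeholders for the model-specific pieces, and the hyperparameters are illustrative.

```python
import numpy as np

def train(W, compute_gradients, validation_loss,
          lr=0.1, lam=0.01, patience=5, max_epochs=1000):
    best, wait = np.inf, 0
    for epoch in range(max_epochs):
        grad = compute_gradients(W)
        W = W - lr * (grad + lam * W)         # L2 regularization: penalize large weights
        loss = validation_loss(W)
        if loss < best - 1e-6:                # a meaningful improvement resets the counter
            best, wait = loss, 0
        else:
            wait += 1
        if wait >= patience:                  # early stopping: halt before overfitting
            break
    return W

# Toy usage: minimize (W - 3)^2 with the same routine.
W = train(0.0,
          compute_gradients=lambda W: 2 * (W - 3),
          validation_loss=lambda W: (W - 3) ** 2)
print(W)
```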
Final Project
- Final Project Part 1
- Final Project Part 2
- Final Project Part 3
- Final Project Part 4
- Final Project Part 5