Machine Learning

Rs. 10,000.00

18% GST extra.

Day 1. Introduction to Machine Learning
    - What is Machine Learning?
    - Traditional Programming vs. Machine Learning
    - Types of Machine Learning
    - Supervised Learning
    - Unsupervised Learning
    - Reinforcement Learning
Day 2. Data Collection and Cleaning
    - Introduction to Data Collection
    - Data Collection - Sources & Methods
    - Data Collection - Challenges
    - Introduction to Data Cleaning
    - Handling Missing Values
    - Dealing with Duplicate Data
    - Correcting Inconsistent Data
    - Outlier Detection & Treatment
    - Data Type Conversion & Formatting
    - Data Validation & Verification
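
To preview the Day 2 material, a minimal pandas sketch of common cleaning steps (the names and values are made up for illustration):

    import pandas as pd

    # Made-up messy data: a missing age, a duplicated row, inconsistent casing
    df = pd.DataFrame({"name": ["Asha", "Ravi", "Ravi", "MEENA"],
                       "age": [25.0, None, None, 30.0]})

    df = df.drop_duplicates()                        # remove exact duplicate rows
    df["age"] = df["age"].fillna(df["age"].mean())   # impute missing ages with the mean
    df["name"] = df["name"].str.title()              # fix inconsistent capitalisation
    print(df)
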
Day 3. Exploratory Data Analysis (EDA)
    - What is EDA?
    - Importance of EDA
    - EDA Techniques
    - Introduction to Descriptive Statistics
    - Common Metrics in Descriptive Statistics
    - Measures of Central Tendency
    - Measures of Dispersion
    - Example of Descriptive Statistics in Python (sketched below)
    - Common Visualization Techniques
    - Correlation
    - Introduction to Outlier Detection
    - Common Methods for Outlier Detection
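
A minimal sketch of the descriptive-statistics example referenced above (toy numbers, invented purely for illustration):

    import pandas as pd

    # Toy data (made-up values for illustration)
    df = pd.DataFrame({"age": [22, 25, 25, 29, 31, 35, 62],
                       "income": [28000, 32000, 30500, 41000, 39000, 52000, 150000]})

    print(df.describe())                  # count, mean, std, min, quartiles, max
    print("median age:", df["age"].median())
    print("age mode:", df["age"].mode()[0])
    print("income IQR:", df["income"].quantile(0.75) - df["income"].quantile(0.25))
    print(df.corr())                      # Pearson correlation matrix
    # Values far outside Q1 - 1.5*IQR or Q3 + 1.5*IQR (like 150000) flag as outliers
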
Day 4. Feature Engineering
    - What is Feature Engineering?
    - Importance of Feature Engineering
    - Types of Features
    - Techniques for Feature Engineering
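
One technique from this day, sketched with an invented dataset: one-hot encoding a categorical feature with pandas:

    import pandas as pd

    # Made-up housing data: one categorical feature, one numeric feature
    df = pd.DataFrame({"city": ["Delhi", "Mumbai", "Delhi"],
                       "sqft": [800, 1200, 950]})

    # One-hot encoding turns 'city' into separate 0/1 indicator columns
    encoded = pd.get_dummies(df, columns=["city"])
    print(encoded)
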
Day 5. Model Selection and Training
    - Introduction
    - Model Selection
    - Model Training
Day 6. Model Evaluation
    - Introduction to Model Evaluation & Tuning
    - Evaluation Metrics for Classification
    - Evaluation Metrics for Regression
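
A small sketch of the classification metrics, computed on made-up labels:

    from sklearn.metrics import (accuracy_score, confusion_matrix,
                                 precision_score, recall_score)

    y_true = [1, 0, 1, 1, 0, 1, 0, 0]   # made-up ground truth
    y_pred = [1, 0, 0, 1, 0, 1, 1, 0]   # made-up model predictions

    print(confusion_matrix(y_true, y_pred))           # rows: actual, cols: predicted
    print("accuracy :", accuracy_score(y_true, y_pred))
    print("precision:", precision_score(y_true, y_pred))
    print("recall   :", recall_score(y_true, y_pred))
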
Day 7. Model Hyperparameter Tuning
    - What are Hyperparameters?
    - Why is Hyperparameter Tuning Crucial?
    - Hyperparameter Tuning Methods Overview
    - Grid Search: Exhaustive Hyperparameter Search
    - Randomized Search: Efficient Hyperparameter Exploration
    - Bayesian Optimization: Intelligent Hyperparameter Search
    - Grid vs. Randomized vs. Bayesian Search: When to Choose?
    - Cross-Validation
    - Why Cross-Validation? (Limitations of Single Train/Test Split)
    - How K-Fold Cross-Validation Works
    - Cross-Validation: Pros & Cons
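
A minimal sketch of grid search with 5-fold cross-validation; the Iris dataset, the SVM model, and the parameter grid are illustrative stand-ins:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)

    # Exhaustively try every C/gamma combination, scoring each with 5-fold CV
    param_grid = {"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]}
    search = GridSearchCV(SVC(), param_grid, cv=5)
    search.fit(X, y)
    print("best params:", search.best_params_)
    print("best CV accuracy:", search.best_score_)

Swapping GridSearchCV for RandomizedSearchCV (with an n_iter budget) trades exhaustiveness for speed.
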
Day 8. Linear Regression
    - Introduction to Linear Regression
    - Assumptions of Linear Regression
    - Example: Linear Regression
    - Linear Regression Model using sklearn
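
A minimal sklearn sketch on synthetic data; the "true" line y = 3x + 4 is invented so the learned coefficients can be sanity-checked:

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Synthetic data: y = 3x + 4 plus Gaussian noise
    rng = np.random.default_rng(0)
    X = rng.uniform(0, 10, size=(50, 1))
    y = 3 * X.ravel() + 4 + rng.normal(0, 1, size=50)

    model = LinearRegression().fit(X, y)
    print("slope:", model.coef_[0])        # should be close to 3
    print("intercept:", model.intercept_)  # should be close to 4
    print("R^2:", model.score(X, y))
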
Day 9. Polynomial Regression
    - Introduction to Non-Linear Relationships
    - What is Polynomial Regression?
    - Polynomial Regression Equation
    - Example: Quadratic Polynomial Regression
    - Quadratic Polynomial Regression Model Using sklearn (sketched below)
    - Why Use Polynomial Regression?
    - Limitations of Polynomial Regression
    - When to Use Polynomial Regression
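
The quadratic sketch referenced above, on invented data generated from y = 2x^2 - 3x + 1:

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    # Synthetic quadratic data: y = 2x^2 - 3x + 1 plus noise
    rng = np.random.default_rng(0)
    x = rng.uniform(-3, 3, size=(60, 1))
    y = 2 * x.ravel() ** 2 - 3 * x.ravel() + 1 + rng.normal(0, 1, size=60)

    # PolynomialFeatures expands x into [1, x, x^2]; LinearRegression fits the weights
    model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
    model.fit(x, y)
    print("R^2:", model.score(x, y))
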
Day 10. Ridge, Lasso, and ElasticNet Regularization
    - Introduction to Regularization
    - Bias-Variance Trade-off
    - Types of Regularization
    - A Practical Demonstration of L1 (Lasso), L2 (Ridge), and Mixed Regularization
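
A small demonstration contrasting the three penalties on a synthetic problem (the alpha values are illustrative, not tuned):

    from sklearn.datasets import make_regression
    from sklearn.linear_model import ElasticNet, Lasso, Ridge
    from sklearn.model_selection import train_test_split

    X, y = make_regression(n_samples=200, n_features=30, noise=10.0, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    for model in (Ridge(alpha=1.0), Lasso(alpha=1.0), ElasticNet(alpha=1.0, l1_ratio=0.5)):
        model.fit(X_tr, y_tr)
        zeroed = (model.coef_ == 0).sum()   # L1 penalties drive some weights to exactly 0
        print(type(model).__name__, "test R^2:", round(model.score(X_te, y_te), 3),
              "| zeroed weights:", zeroed)
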
Day 11. Logistic Regression
    - Introduction - What is Logistic Regression?
    - Why Not Use Linear Regression for Classification?
    - The Sigmoid (Logistic) Function - The Core
    - From Probability to Class Prediction
    - Cost Function (Loss Function)
    - Optimization: Gradient Descent
    - Numerical Example
    - Advantages & Disadvantages
    - Common Applications
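
A minimal sketch: the sigmoid by hand, then scikit-learn's LogisticRegression on its bundled breast-cancer dataset (max_iter is raised only to ensure convergence):

    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    def sigmoid(z):
        # Maps any real number into (0, 1), read as a probability
        return 1.0 / (1.0 + np.exp(-z))

    print(sigmoid(0.0), sigmoid(2.0))   # 0.5 and roughly 0.88

    X, y = load_breast_cancer(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = LogisticRegression(max_iter=5000).fit(X_tr, y_tr)
    print("test accuracy:", clf.score(X_te, y_te))
    print("P(class 1) for first test sample:", clf.predict_proba(X_te[:1])[0, 1])
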
Day 12. Decision Tree
    - Introduction
    - Key Components of a Decision Tree
    - How Do Decision Trees Learn? (The Algorithm's Core Idea)
    - Splitting Criteria - Measuring Impurity
    - Building the Tree: The Algorithm Steps
    - Numerical Example on Decision Tree
    - Overfitting and Pruning in Decision Trees
    - Advantages of Decision Trees
    - Disadvantages of Decision Trees
    - Extensions & Ensemble Methods
    - Decision Tree for Classification (Implementation using Iris Dataset)
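
A minimal version of the Iris implementation; max_depth=3 is an illustrative pre-pruning choice:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier, export_text

    data = load_iris()
    X_tr, X_te, y_tr, y_te = train_test_split(data.data, data.target, random_state=0)

    # max_depth=3 is simple pre-pruning to limit overfitting
    clf = DecisionTreeClassifier(criterion="gini", max_depth=3, random_state=0)
    clf.fit(X_tr, y_tr)
    print("test accuracy:", clf.score(X_te, y_te))
    print(export_text(clf, feature_names=data.feature_names))
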
Day 13. Ensemble Learning
    - What is Ensemble Learning?
    - Why Ensemble Learning? The Motivation
    - Core Principles of Ensemble Learning
    - Types of Ensemble Methods
    - Bagging (Bootstrap Aggregating)
    - Boosting
    - Stacking / Voting / Blending
    - Advantages of Ensemble Learning
    - Disadvantages & Challenges
    - Choosing the Right Ensemble Method
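
A small sketch of one ensemble flavour, hard voting (the three base models are arbitrary choices):

    from sklearn.datasets import load_iris
    from sklearn.ensemble import VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)

    # Hard voting: each base model votes, the majority class wins
    ensemble = VotingClassifier([("lr", LogisticRegression(max_iter=1000)),
                                 ("dt", DecisionTreeClassifier(random_state=0)),
                                 ("nb", GaussianNB())])
    print("mean CV accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())
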
Day 14. Random Forest
    - What is Random Forest?
    - How It Works: The Core Process
    - Out-of-Bag (OOB) Evaluation
    - Strengths of Random Forest
    - Weaknesses of Random Forest
    - Key Hyperparameters
    - Applications
    - Predicting Customer Churn: A Random Forest Example
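
A sketch of the churn example using synthetic data as a stand-in (no real customer data), with OOB evaluation switched on:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for a churn table
    X, y = make_classification(n_samples=1000, n_features=10, n_informative=5,
                               random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # oob_score=True scores each tree on the bootstrap samples it never saw
    clf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
    clf.fit(X_tr, y_tr)
    print("OOB score:", round(clf.oob_score_, 3))
    print("test accuracy:", round(clf.score(X_te, y_te), 3))
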
Day 15. Support Vector Machines (SVM)
    - What is a Support Vector Machine (SVM)?
    - Linear SVM
    - Mathematical Formulation (Hard and Soft Margin)
    - Non-Linear SVM
    - Common Kernel Functions
    - Hyperparameter Tuning
    - SVM for Regression (SVR)
    - Advantages of SVMs
    - Disadvantages of SVMs
    - SVM Implementation
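
A minimal SVM sketch on the Iris dataset; the RBF kernel and C value are illustrative defaults:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # Standardize first (SVMs are scale-sensitive), then an RBF-kernel SVM
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    clf.fit(X_tr, y_tr)
    print("test accuracy:", clf.score(X_te, y_te))
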
Day 16. K-Nearest Neighbors (KNN)
    - Introduction to K-Nearest Neighbors (KNN)
    - How KNN Works: The Core Algorithm
    - Key Concept
    - Advantages of KNN
    - Disadvantages of KNN
    - The Curse of Dimensionality
    - Feature Scaling: Why it's Crucial for KNN
    - Numerical Example
    - KNN for Regression
    - Hyperparameter Tuning: Finding the Optimal K
    - Weighted KNN (An Extension)
    - Use Cases of KNN
    - K-Nearest Neighbors (KNN) Implementation
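
A minimal sketch tying together feature scaling and the search for K (Iris is a stand-in dataset):

    from sklearn.datasets import load_iris
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = load_iris(return_X_y=True)

    # Scale first: distance-based KNN is distorted by unequal feature scales
    for k in (1, 3, 5, 7):
        pipe = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=k))
        print(f"k={k}: mean CV accuracy = {cross_val_score(pipe, X, y, cv=5).mean():.3f}")
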
Day 17. Naive Bayes
    - What is Naive Bayes?
    - Why "Naive"? The Core Assumption
    - Naive Bayes Algorithm
    - Types of Naive Bayes Classifiers
    - Naive Bayes Algorithm Advantages
    - Naive Bayes Algorithm Disadvantages
    - Numerical Example: Spam Detection
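
A toy version of the spam-detection example; the six messages are invented, so treat the output as illustrative only:

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    # Tiny made-up corpus; real spam filters train on thousands of messages
    texts = ["win cash prize now", "free lottery winner", "meeting at noon",
             "lunch tomorrow?", "claim your free prize", "project status update"]
    labels = [1, 1, 0, 0, 1, 0]   # 1 = spam, 0 = ham

    clf = make_pipeline(CountVectorizer(), MultinomialNB())
    clf.fit(texts, labels)
    print(clf.predict(["free cash now", "status of the meeting"]))   # expect [1, 0]
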
Day 18. Gradient Boosting
    - Introduction to Ensemble Learning
    - Understanding Gradient Boosting Machines (GBM)
    - Advantages of Gradient Boosting
    - XGBoost (eXtreme Gradient Boosting)
    - LightGBM (Light Gradient Boosting Machine)
    - XGBoost vs. LightGBM: A Comparison
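
A sketch using scikit-learn's built-in GradientBoostingClassifier; XGBoost and LightGBM expose very similar fit/predict APIs but are separate installs:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=500, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # Each new shallow tree fits the residual errors of the ensemble so far
    clf = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1,
                                     max_depth=3, random_state=0)
    clf.fit(X_tr, y_tr)
    print("test accuracy:", clf.score(X_te, y_te))
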
Day 19. K-Means Clustering
    - What is Clustering?
    - Introduction to K-Means Clustering
    - Key Concepts in K-Means
    - The K-Means Algorithm
    - Choosing the Right 'K'
    - Numerical Example
    - Advantages of K-Means
    - Limitations of K-Means
    - Real-World Applications
    - The K-Means Clustering Algorithm using Scikit-Learn
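
A minimal Scikit-Learn sketch on synthetic blobs (three invented clusters):

    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs

    # Three synthetic blobs stand in for real data
    X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
    print("cluster sizes:", [list(km.labels_).count(c) for c in range(3)])
    print("inertia (within-cluster SSE):", round(km.inertia_, 1))
    # Elbow method: refit with k = 1..8 and look for the bend in inertia vs. k
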
Day 20. DBSCAN
    - What is Clustering?
    - DBSCAN Core Concepts
    - The DBSCAN Algorithm
    - Numerical Example
    - Advantages of DBSCAN
    - Disadvantages of DBSCAN
    - The DBSCAN Clustering Algorithm using Scikit-Learn
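
A minimal Scikit-Learn sketch on the classic two-moons shape; eps and min_samples are illustrative values:

    import numpy as np
    from sklearn.cluster import DBSCAN
    from sklearn.datasets import make_moons

    # Two interleaving half-moons: a shape K-Means handles poorly
    X, _ = make_moons(n_samples=300, noise=0.05, random_state=0)

    db = DBSCAN(eps=0.2, min_samples=5).fit(X)
    labels = db.labels_                     # label -1 marks noise points
    n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
    print("clusters found:", n_clusters)               # expect 2
    print("noise points:", int(np.sum(labels == -1)))
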
Day 21. Hierarchical Clustering
    - What is Clustering?
    - Introduction to Hierarchical Clustering
    - Types of Hierarchical Clustering
    - Agglomerative Hierarchical Clustering Algorithm
    - Key Concept: Linkage Criteria
    - The Dendrogram
    - Divisive Hierarchical Clustering Algorithm
    - Advantages of Hierarchical Clustering
    - Disadvantages of Hierarchical Clustering
    - Applications of Hierarchical Clustering
    - Numerical Example
    - Hierarchical Clustering Implementation
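
A minimal implementation sketch (synthetic blobs; ward linkage is one of several linkage choices):

    from scipy.cluster.hierarchy import linkage
    from sklearn.cluster import AgglomerativeClustering
    from sklearn.datasets import make_blobs

    X, _ = make_blobs(n_samples=50, centers=3, random_state=0)

    # Agglomerative: every point starts as its own cluster, closest pairs merge upward
    agg = AgglomerativeClustering(n_clusters=3, linkage="ward").fit(X)
    print("cluster sizes:", [list(agg.labels_).count(c) for c in range(3)])

    Z = linkage(X, method="ward")   # linkage matrix; scipy's dendrogram(Z) draws the tree
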
Day 22. Principal Component Analysis (PCA)
    - What is PCA?
    - Why PCA?
    - Core Concepts of PCA
    - How the PCA Algorithm Works
    - Numerical Example
    - Advantages of PCA
    - Limitations of PCA
    - Applications of PCA
    - Implementation of PCA using Scikit-Learn
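
A minimal Scikit-Learn sketch reducing the four Iris features to two components:

    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    X, _ = load_iris(return_X_y=True)
    X_scaled = StandardScaler().fit_transform(X)   # PCA is sensitive to feature scale

    pca = PCA(n_components=2)
    X_2d = pca.fit_transform(X_scaled)
    print("shape after PCA:", X_2d.shape)                        # (150, 2)
    print("variance explained:", pca.explained_variance_ratio_)  # share per component
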
Day 23. Perceptrons & Multi-Layer Perceptrons (MLPs)
    - Introduction to Neural Networks
    - The Perceptron: The First Artificial Neuron
    - Multi-Layer Perceptrons (MLPs)
    - Perceptrons vs. Multi-Layer Perceptrons (MLPs)
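
A minimal MLP sketch using scikit-learn's MLPClassifier on its bundled digits dataset (the 64-neuron hidden layer is an arbitrary illustrative size):

    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X, y = load_digits(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # An MLP with one hidden layer of 64 ReLU neurons
    mlp = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
    mlp.fit(X_tr, y_tr)
    print("test accuracy:", mlp.score(X_te, y_te))
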
Day 24. Activation Functions
    - What are Activation Functions?
    - Why Do We Need Them? (The Non-Linearity Imperative)
    - Sigmoid Function (Logistic Function)
    - Tanh Activation Function
    - ReLU (Rectified Linear Unit) Activation Function
    - Leaky ReLU Activation Function
    - Exponential Linear Unit (ELU) Activation Function
    - Softmax Activation Function
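
Most of these functions in a few lines of NumPy, evaluated on the same toy inputs:

    import numpy as np

    z = np.array([-2.0, 0.0, 2.0])

    print("sigmoid:", 1 / (1 + np.exp(-z)))          # squashes into (0, 1)
    print("tanh   :", np.tanh(z))                    # squashes into (-1, 1), zero-centred
    print("ReLU   :", np.maximum(0, z))              # 0 for negatives, identity otherwise
    print("leaky  :", np.where(z > 0, z, 0.01 * z))  # small slope instead of a hard zero
    print("softmax:", np.exp(z) / np.exp(z).sum())   # non-negative, sums to 1
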
Day 25. Forward and Backward Propagation
    - Forward Propagation
    - Feed Forward Neural Network Structure
    - Backward Propagation
    - Flow of Backward Pass
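
A minimal PyTorch sketch of one forward pass and the backward pass it enables; the numbers are chosen so the gradient can be checked by hand:

    import torch

    # Forward pass: compute a scalar loss from an input and a weight
    w = torch.tensor(2.0, requires_grad=True)
    x = torch.tensor(3.0)
    loss = (w * x - 1.0) ** 2      # (2*3 - 1)^2 = 25

    # Backward pass: autograd applies the chain rule back to w
    loss.backward()
    print(w.grad)                  # d(loss)/dw = 2*(wx - 1)*x = 2*5*3 = 30
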
Day 26. Optimizers
    - Introduction to Optimizers
    - Gradient Descent (GD)
    - Types of Gradient Descent
    - Advantages of Gradient Descent
    - Disadvantages and Challenges in Gradient Descent
    - Momentum
    - Nesterov Accelerated Gradient (NAG)
    - Adaptive Learning Rate Methods
    - Adagrad (Adaptive Gradient Algorithm)
    - RMSprop (Root Mean Square Propagation)
    - Adam (Adaptive Moment Estimation)
    - Adadelta
    - Learning Rate Schedules and Advanced Techniques
    - Choosing an Optimizer
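
The update rule underlying all of these optimizers, as a hand-checkable sketch of plain gradient descent on f(w) = w^2:

    # Gradient descent on f(w) = w^2, whose minimum is at w = 0
    w, lr = 4.0, 0.1
    for step in range(3):
        grad = 2 * w           # f'(w) = 2w
        w = w - lr * grad      # move against the gradient
        print(step, round(w, 4))
    # w shrinks 4.0 -> 3.2 -> 2.56 -> 2.048, approaching the minimum
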
Day 27. Convolutional Neural Networks (CNNs)
    - What are Convolutional Neural Networks (CNNs)?
    - The Challenge of Traditional MLPs
    - Why CNNs Excel for Image Data
    - Core Building Blocks of a CNN
    - CNN Architecture
    - Training a CNN - The Learning Process
    - Real-World Examples & Applications
    - Advantages of CNNs
    - Limitations of CNNs
    - Advanced Concepts & Future Trends
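
A minimal sketch of one convolution-plus-pooling stage in PyTorch (input sizes are invented):

    import torch
    import torch.nn as nn

    # One convolution + pooling stage: a 3-channel 32x32 image -> 16 feature maps
    conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)
    pool = nn.MaxPool2d(2)

    x = torch.randn(1, 3, 32, 32)      # a fake one-image batch
    out = pool(torch.relu(conv(x)))
    print(out.shape)                   # torch.Size([1, 16, 16, 16])
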
Day 28. Recurrent Neural Networks (RNNs)
    - Introduction
    - Key Features of RNNs
    - RNN Architecture
    - How RNNs "Remember" - The Hidden State (Example)
    - Applications of RNNs
    - Challenges with Basic RNNs
    - Overcoming Challenges - Gated RNNs
    - LSTM vs. GRU
    - Implementation of RNN Using PyTorch
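
A minimal PyTorch sketch of a single RNN layer processing a batch of invented sequences:

    import torch
    import torch.nn as nn

    # A single RNN layer: batch of 4 sequences, 10 time steps, 8 features per step
    rnn = nn.RNN(input_size=8, hidden_size=16, batch_first=True)
    x = torch.randn(4, 10, 8)

    output, h_n = rnn(x)
    print(output.shape)   # torch.Size([4, 10, 16]) -- hidden state at every time step
    print(h_n.shape)      # torch.Size([1, 4, 16])  -- final hidden state only
    # nn.GRU is a drop-in swap; nn.LSTM additionally returns a cell state
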
Benefits:
  • Time-saving and cost-effective
  • Training by industry experts (corporate trainers with 10+ years of experience in the field)
  • Hands-on practical exposure for better understanding
  • Adds solid value to your professional career
  • Weekend doubt-clearing sessions

For inquiries, call: 8953463074

Online Live Training Program 2025