Machine Learning

Rs.3,178.00

18% GST Extra

About the course

This course provides a concise introduction to the fundamental concepts in machine learning and popular machine learning algorithms. The course is accompanied by hands-on problem-solving exercises in Python.

This course covers the standard and most popular supervised learning algorithms: linear regression, logistic regression, decision trees, k-nearest neighbour, an introduction to Bayesian learning and the naïve Bayes algorithm, support vector machines and kernels, and neural networks with an introduction to Deep Learning. It also covers basic clustering algorithms, feature reduction methods, and the basics of computational learning theory. Finally, it addresses practical issues in applying machine learning algorithms: hypothesis space, overfitting, bias and variance, trade-offs between representational power and learnability, evaluation strategies, and cross-validation.

Learning Outcomes

After completing this course, you will be able to:

  • Understand the fundamental concepts in machine learning and popular machine learning algorithms.
  • Understand the fundamental issues and challenges of machine learning such as data, model selection, model complexity, etc.
  • Understand the strengths and weaknesses of many popular machine learning approaches.
  • Understand the underlying mathematical relationships within and across Machine Learning algorithms and the paradigms of supervised and unsupervised learning.
  • Design and implement various machine learning algorithms in a range of real-world applications.
  • Boost your hireability through innovative and independent learning.
Target Audience

The course can be taken by:

Students: All students who are pursuing professional graduate/post-graduate courses related to computer science and engineering or data science.

Teachers/Faculties: All computer science and engineering teachers/faculties.

Professionals: All working professionals from the computer science / IT / Data Science domain.

Why learn Machine Learning?

Machine Learning lays the foundation for Artificial Intelligence (AI), a field that is advancing rapidly. Self-driving cars, Siri on your iPhone, and YouTube's video recommendations are all AI applications. Companies are competing to hire competent engineers, as ML is increasingly becoming the brain behind business intelligence. Just as humans learn from experience, ML systems learn from data, so learning ML deepens your knowledge of data science and makes you more attractive in the job market. Demand for ML engineers also continues to grow. This course is therefore a worthwhile starting point if you want to become a sought-after ML professional.

Course Features
  • 24X7 Access: You can view lectures at your own convenience.
  • Online lectures: 22 hours of high-quality video lectures.
  • Updated quality content: Content is kept up to date and revised regularly to meet current industry demands.
Test & Evaluation

1. During the program, participants must complete all the assignments given to them for better learning.

2. At the end of the program, a final assessment will be conducted.

Certification

1. All successful participants will be provided with a certificate of completion.

2. Students who do not complete the course / leave it midway will not be awarded any certificate.

Topics to be covered
  1. Introduction
    • What is the history of Machine Learning?
    • What is the difference between a Machine Learning solution and a programmatic solution?
    • What is a formal definition of Machine Learning?
    • What are some domains and examples of Machine Learning?
    • How can we create a (machine) learner?
  2. Different types of Machine Learning
    • What are the broad types of Machine Learning?
    • What are Unsupervised / Supervised / Semi-supervised and Reinforcement learning?
    • What is supervised learning? (In detail)
    • What are some examples of Classification and Regression problems?
    • What are features, what do sample training examples look like, and what schematic diagrams can we draw (for supervised learning)?
    • What is classification learning, and what are some of its tasks and performance metrics?
    • How do we get data for the learning problems? How are representations of functions used in Machine Learning? What is the hypothesis space?
  3. Hypothesis Space and Inductive Bias
    • What is inductive learning?
    • What are the features and feature vectors?
    • How do we set up the classification problem? What are the feature space and hypothesis space for classification problems?
    • 5 types of representations of a function
    • Hypothesis space
    • Terminology (example, training data, instance space, concept, target function)
    • What is the size of the hypothesis space (for n Boolean features), and what is a hypothesis language?
    • What is inductive learning hypothesis?
    • What are inductive learning and a consistent hypothesis? Why is inductive learning an ill-posed problem?
    • What are various types of bias? (Occam's Razor, MDL, MM) and what are the important issues in Machine Learning? What is generalization? (Bias and Variance)
  4. Evaluation and Cross-Validation
    • What is an experimental evaluation of learning algorithms?
    • How do we evaluate predictions, and what is absolute error?
    • What are the sum-of-squares error and the number of misclassifications?
    • What is the confusion matrix?
    • What are accuracy, precision, and recall?
    • What is sample error and true error?
    • What are the sources of errors?
    • What are the difficulties in evaluating the hypothesis with limited data and possible solutions?
    • How can we evaluate with limited training data?
    • What is K-fold cross-validation, and what trade-offs does it involve?
  5. Tutorial I
    • Introduction to Tutorial I
    • Types of learning: supervised vs unsupervised learning
    • Example of supervised vs unsupervised learning
    • Types of features: categorical vs continuous features
    • Types of supervised learning: regression vs classification
    • Bias vs Variance
    • Generalization performance of a learning algorithm
  6. Linear Regression
    • What is regression (linear and other functions), and what are the various types of regression models?
    • What is linear regression?
    • Looking at an example of a training set for regression
    • What is multiple linear regression?
    • What assumptions are we making about the errors?
    • The least squares regression line
    • How do we learn the parameters (for simple and for multiple linear regression)?
    • What is the delta or LMS method, and how do we use gradient descent? (see the sketch after this list)
    • What are the LMS update (delta) rule, batch gradient descent, and stochastic gradient descent?
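
To make the LMS idea concrete, here is a minimal NumPy sketch of batch gradient descent with the LMS (delta) rule for simple linear regression. The toy data, learning rate, and iteration count are illustrative assumptions, not taken from the lectures.

```python
import numpy as np

# Toy data: y is roughly 3x + 1 plus noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 100)
y = 3 * x + 1 + rng.normal(0, 0.1, 100)

w, b = 0.0, 0.0   # parameters to learn
lr = 0.1          # learning rate (illustrative choice)

for _ in range(1000):
    error = w * x + b - y
    # LMS (delta) rule: step opposite the gradient of the mean squared error
    w -= lr * (error * x).mean()
    b -= lr * error.mean()

print(f"w = {w:.2f}, b = {b:.2f}")  # should approach 3 and 1
```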
  7. Introduction to Decision Trees
    • What is a decision tree?
    • How do we draw a sample decision tree for discrete data?
    • How do we draw a sample decision tree for continuous data?
    • Generate a decision tree from training examples
    • Decision tree for playing tennis
    • Introduction to ID3 (searching for a good tree)
  8. Learning Decision Tree
    • How do we select attributes for the decision tree? (information gain, entropy)
    • Example of creating a decision tree (using the ID3 algorithm)
    • What is the GINI Index?
    • How do we split continuous attributes, and what are the practical issues in classification?
  9. Overfitting
    • What is overfitting?
    • An example of underfitting and overfitting
    • Overfitting due to noise or insufficient examples
    • How to avoid overfitting?
    • What is MDL?
    • What are the conditions for pre-pruning?
    • How do we use reduced error pruning for post pruning?
    • What are the triple tradeoffs in model selection and generalization?
    • What is regularization?
  10. Python exercise on decision tree and linear regression
    • Python exercise on linear regression
    • Python exercise on logistic regression
    • Python exercise on decision tree regression (a minimal sketch follows this list)
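
For a flavour of these exercises, here is a minimal scikit-learn sketch fitting linear regression and a depth-limited decision tree regressor on toy data; the dataset and hyperparameters are illustrative assumptions, not the course's own notebooks.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split

# Toy 1-D regression data: a noisy sine wave
rng = np.random.default_rng(0)
X = rng.uniform(0, 5, (200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for model in (LinearRegression(), DecisionTreeRegressor(max_depth=3)):
    model.fit(X_train, y_train)
    print(type(model).__name__, "R^2:", round(model.score(X_test, y_test), 3))
```

The tree should fit the nonlinear curve far better than the straight line, which previews the bias discussion earlier in the course.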
  11. Tutorial II
    • How to solve a sample problem in linear regression?
    • How to solve problems related to decision trees?
    • How do we find the entropy of a set and use it in decision trees? (a worked sketch follows this list)
    • What is the information gain?
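
To make the entropy and information-gain questions concrete, here is a small Python sketch that computes both for a toy split; the label counts are invented for illustration.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy H(S) = -sum_i p_i * log2(p_i) of a label multiset."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, groups):
    """Parent entropy minus the weighted entropy of the child groups."""
    n = len(labels)
    return entropy(labels) - sum(len(g) / n * entropy(g) for g in groups)

# Toy split: 9 positive / 5 negative examples divided by one attribute
parent = ["+"] * 9 + ["-"] * 5
left, right = ["+"] * 6 + ["-"] * 2, ["+"] * 3 + ["-"] * 3

print(round(entropy(parent), 3))                      # about 0.940
print(round(information_gain(parent, [left, right]), 3))
```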
  12. K-Nearest Neighbour
    • What are instance-based learning and K-Nearest Neighbour algorithm?
    • What is the standard distance function (Euclidean distance), and what are the three issues related to it?
    • What are some examples of K-Nearest Neighbour and what is the impact of k?
    • How can we use weighted distance functions?
    • Why do we need to remove extra features?
    • What are the various approaches to giving weights?
  13. Feature Selection
    • Why do we need feature reduction?
    • What is the curse of dimensionality?
    • How can we do feature reduction? (selection and extraction)
    • How can we evaluate a feature subset? (wrapper / supervised and filter / unsupervised)
    • How can we use feature selection algorithms? (forward and backward selection)
    • What are univariate feature selection methods?
    • What are multivariate feature selection methods?
  14. Feature Extraction
    • What is feature extraction and what kind of features do we want?
    • What are the principal components (PCs) and how do we choose features?
    • How do we choose the direction of the principal components (PCs) and how do we use PCA?
    • How do we choose a feature (axis) for classification and how is Linear Discriminant Analysis useful?
  15. Collaborative Filtering
    • What is a recommender system?
    • How can we formally define the recommendation problem?
    • What are the two types of recommendation systems? (content, collaborative filtering)
    • What are the two types of collaborative filtering? (user-based nearest neighbour, item-based nearest neighbour)
    • What are the two phases of algorithms for collaborative filtering? (neighbourhood formation, recommendation)
    • What are the issues with user-based KNN CF?
    • What is item-based collaborative filtering?
  16. Python Exercise on KNN and PCA
    • What will we cover?
    • How do we use the KNeighborsClassifier in Python?
    • How do we use randomized PCA in Python?
    • How can we do face recognition using PCA and KNN? (a minimal sketch follows this list)
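
As a taste of this exercise, here is a minimal scikit-learn sketch chaining PCA with a KNeighborsClassifier; the built-in digits dataset stands in for the face data used in the course, and the component and neighbour counts are illustrative assumptions.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Project the 64 pixel features onto 20 principal components, then classify
model = make_pipeline(PCA(n_components=20), KNeighborsClassifier(n_neighbors=5))
model.fit(X_train, y_train)
print("test accuracy:", round(model.score(X_test, y_test), 3))
```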
  17. Tutorial III
    • What is the curse of dimensionality?
    • What is feature selection?
    • What is feature reduction and PCA? (principal component analysis)
    • How do you calculate the eigenvalues and eigenvectors of a matrix?
    • What is K-NN (K Nearest Neighbour) classification?
  18. Bayesian Learning
    • How is probability used for modeling concepts?
    • What is the Bayes theorem?
    • Can we look at an example of Bayes theorem?
    • How can the Bayes theorem be applied to find the hypothesis in Machine Learning? (MAP hypothesis)
    • What is the Bayes optimal classifier?
    • Gibbs sampling
  19. Naive Bayes
    • Naive Bayes algorithm
    • Naive Bayes algorithm for discrete x
    • What is smoothing and why is it required?
    • Can we look at an example of a Naive Bayes algorithm for discrete x?
    • How do we use smoothing when estimating parameters?
    • What is the assumption that we made in Naive Bayes and what happens if it is invalid?
    • What is Gaussian Naive Bayes? (for continuous X, but discrete Y)
    • What are Bayesian networks?
  20. Bayesian Network
    • Why do we need a Bayes network?
    • Can we look at an example of Bayes network?
    • What does a Bayesian network represent?
    • What can we do with a Bayesian network (Inference)?
    • Where can we apply Bayesian networks?
    • How do we define a Bayesian network?
    • What is the graphical representation of the Naive Bayes model?
    • What is the hidden Markov model?
    • How is learning helped by Bayesian belief networks?
  21. Python Exercise on Naive Bayes
    • How to use the Naive Bayes classifier?
    • What is the Naive Bayes classifier?
    • How is the Naive Bayes classifier relevant in the context of email spam classification? (see the sketch after this list)
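
Here is a minimal scikit-learn sketch in the spirit of the spam example: bag-of-words counts feeding a multinomial Naive Bayes classifier. The tiny inline corpus is invented purely for illustration.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny invented corpus: 1 = spam, 0 = ham
texts = [
    "win a free prize now", "cheap meds limited offer",
    "meeting at noon tomorrow", "lunch with the project team",
]
labels = [1, 1, 0, 0]

# Word counts feed a multinomial Naive Bayes model
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(texts, labels)
print(model.predict(["free offer win now", "team meeting tomorrow"]))  # [1 0]
```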
  22. Tutorial IV
    • How do we estimate probabilities from frequency distributions?
    • How do we use Bayes rule?
    • What is MAP inference?
    • What is the Naive Bayes assumption?
    • What are Bayesian networks (their structure), inference, and marginalization?
  23. Logistic Regression
    • What are logistic regression (for classification problems) and the sigmoid function?
    • What are some of the interesting properties of the sigmoid function?
    • How can we use stochastic gradient descent with logistic regression?
  24. Introduction to Support Vector Machines
    • Support vector machine
    • Functional margin
    • The functional margin of a set of points
    • Solving the optimization problem
  25. SVM The Dual Formulation
    • Lagrangian duality in brief
    • The KKT conditions
    • Implication of Lagrangian
    • The dual problem
  26. SVM Maximum Margin with Noise
    • Linear SVM formulation
    • Limitation of previous SVM formulation
    • What objective is to be minimized?
    • Lagrangian
    • Dual formulation
  27. Nonlinear SVM and Kernel Function
    • Non-linear SVM, feature space, and kernel function
    • Kernel trick
    • Commonly used kernel functions
    • Performance
  28. SVM Solution to the Dual Problem
    • SMO algorithm (sequential minimal optimization)
    • Coordinate ascent
    • SMO (for the dual problem)
  29. Python Exercise on SVM
    • Support vector classification (a minimal sketch follows this list)
    • Visualize the decision boundaries
    • Load data
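
For a flavour of this exercise, here is a minimal scikit-learn sketch that loads (here: generates) data and fits a linear support vector classifier; the dataset, kernel, and C value are illustrative assumptions.

```python
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Load (here: generate) two well-separated 2-D classes
X, y = make_blobs(n_samples=100, centers=2, random_state=0)

clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

# The sign of the decision function gives the class; its zero level set
# is the decision boundary one would plot to visualize the margin.
print("number of support vectors:", clf.support_vectors_.shape[0])
print("training accuracy:", clf.score(X, y))
```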
  30. Introduction to NN
    • Neural network and neuron
    • Perceptron - basic unit in NN
    • Gradient descent
    • Stochastic gradient descent
    • Multi-layer networks built by stacking many neurons
  31. Multilayer Neural Network
    • Limitation of perceptrons
    • Multi-layer NN
    • Power / expressiveness of multilayer networks
    • Two-layer back-propagation neural network
    • Learning for BP nets
    • Derivation
  32. Neural Network and Backpropagation Algorithm
    • Single-layer perceptron and Boolean functions (OR, XOR)
    • Representation capability of NNs
    • Learning in multilayer NNs using backpropagation
    • Derivation
    • Backpropagation algorithm
    • Training practices: batch vs stochastic updates, and learning over epochs
    • Overfitting in ANNs and local minima
  33. Deep Neural Network
    • Deep learning
    • Hierarchical representation & unsupervised pre-training
    • Architecture & Training
    • Pooling
    • CNN properties
  34. Python Exercise on Neural Network
    • How can we create an artificial neural network using TensorFlow and TFLearn to recognize handwritten digits? (a minimal sketch follows this list)
    • How do we load dependencies?
    • How do we load the data?
    • How do we make the model?
    • How do we train the model?
    • What is our takeaway from this exercise?
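
Here is a minimal sketch in the spirit of this exercise, using the classic TFLearn API (which runs on top of TensorFlow 1.x); the layer sizes, optimizer, and epoch count are illustrative assumptions, not necessarily those used in the lecture.

```python
import tflearn
from tflearn.datasets import mnist

# Load MNIST handwritten digits (flattened 28x28 images, one-hot labels)
X, Y, testX, testY = mnist.load_data(one_hot=True)

# A small fully connected network: 784 inputs -> 128 hidden -> 10 classes
net = tflearn.input_data(shape=[None, 784])
net = tflearn.fully_connected(net, 128, activation="relu")
net = tflearn.fully_connected(net, 10, activation="softmax")
net = tflearn.regression(net, optimizer="adam", loss="categorical_crossentropy")

model = tflearn.DNN(net)
model.fit(X, Y, validation_set=(testX, testY), n_epoch=5, show_metric=True)
```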
  35. Tutorial VI
    • What is the perceptron?
    • What is the perceptron learning rule?
    • How do we represent a boolean function using a perceptron?
    • What is the forward and backward pass algorithm or backpropagation algorithm?
    • Stochastic gradient descent and batch gradient descent
    • A quick overview of some deep learning algorithms
  36. Introduction to Computational Learning Theory
    • The goal of learning theory & core aspects of Machine Learning
    • PAC
    • Prototypical concept learning task
  37. Sample Complexity: Finite Hypothesis Space
    • What is Sample Complexity?
    • Can we look at an example of the consistent case?
    • What is the Find-S algorithm, and what can it do?
  38. VC Dimension
    • What kinds of theorems do we have when the hypothesis space is infinite?
    • What is shattering?
    • What is the definition of VC dimension?
    • What are the upper and lower bounds on sample complexity in terms of the VC dimension?
  39. Introduction to Ensembles
    • What is ensemble learning?
    • How can we use weak learners?
    • How can we combine learners in Bayesian classifiers?
    • Why are ensembles successful and what are the main challenges with them?
  40. Bagging and Boosting
    • What is Bagging?
    • What is Boosting and what is AdaBoost?
    • Why does ensembling work?
  41. Introduction to Clustering
    • What is unsupervised learning and clustering?
    • What are some applications of clustering, and what are various aspects of clustering?
    • Major clustering approaches
    • How can we measure the quality of clustering?
  42. K-means Clustering
    • What is the K-means algorithm?
    • How can we describe the K-means Algorithm, and can we look at an illustration of it?
    • What are the similarity and distance measures?
    • What is the proof of convergence of K-means, time complexity, advantages, and disadvantages?
    • What is model-based clustering?
    • How can we apply K-means on an RGB image?
    • What is the EM algorithm?
  43. Agglomerative Hierarchical Clustering
    • What are hierarchical clustering, bottom-up, and top-down clustering?
    • What is a Dendrogram?
    • What is the algorithm for Agglomerative Hierarchical Clustering?
    • What is the complete link method?
    • What is average link clustering?
  44. Python Exercise on K-means clustering
    • Can we look at Python code for the K-means algorithm? (a minimal sketch follows this list)
    • Can we look at Python code for a Gaussian mixture model?
    • Hierarchical agglomerative clustering
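
Here is a minimal scikit-learn sketch running K-means, a Gaussian mixture model, and hierarchical agglomerative clustering on the same toy blobs; all parameter choices are illustrative assumptions.

```python
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.mixture import GaussianMixture

# Toy data with three Gaussian blobs
X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
agg = AgglomerativeClustering(n_clusters=3).fit(X)

print("K-means centers:\n", kmeans.cluster_centers_.round(2))
print("GMM means:\n", gmm.means_.round(2))
print("first 10 agglomerative labels:", agg.labels_[:10])
```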
  45. Tutorial VIII
    • What is K-means clustering?
    • Solving a sample problem in K-means clustering
    • What is agglomerative hierarchical clustering?
    • What is Gaussian mixture model?
    • Machine Learning Final Quiz
