Applied Machine Learning Online Course

Price: ₹25,000.00 (regular price ₹32,500.00)
  • How to utilise Appliedaicourse 0/1

    • Lecture1.1
      How to Learn from Appliedaicourse 35 min
  • Python for Data Science Introduction 0/0

  • Python for Data Science: Data Structures 0/6

    • Lecture3.1
      Lists 38 min
    • Lecture3.2
      Tuples part 1 10 min
    • Lecture3.3
      Tuples part 2 04 min
    • Lecture3.4
      Sets 16 min
    • Lecture3.5
      Dictionary 21 min
    • Lecture3.6
      Strings 16 min
  • Python for Data Science: Functions 0/10

    • Lecture4.1
      Introduction 13 min
    • Lecture4.2
      Types of functions 25 min
    • Lecture4.3
      Function arguments 10 min
    • Lecture4.4
      Recursive functions 16 min
    • Lecture4.5
      Lambda functions 08 min
    • Lecture4.6
      Modules 07 min
    • Lecture4.7
      Packages 06 min
    • Lecture4.8
      File Handling 23 min
    • Lecture4.9
      Exception Handling 15 min
    • Lecture4.10
      Debugging Python 15 min
  • Python for Data Science: Numpy 0/2

    • Lecture5.1
      Numpy Introduction 41 min
    • Lecture5.2
      Numerical operations on Numpy arrays 41 min
  • Python for Data Science: Matplotlib 0/1

    • Lecture6.1
      Getting started with Matplotlib 20 min
  • Python for Data Science: Pandas 0/3

    • Lecture7.1
      Getting started with pandas 08 min
    • Lecture7.2
      Data Frame Basics 09 min
    • Lecture7.3
      Key Operations on Data Frames 31 min
  • Python for Data Science: Computational Complexity 0/4

    • Lecture8.1
      Space and Time Complexity: Find largest number in a list 20 min
    • Lecture8.2
      Binary search 17 min
    • Lecture8.3
      Find elements common in two lists 06 min
    • Lecture8.4
      Find elements common in two lists using a Hashtable/Dict 12 min
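    The contrast drawn in the last two lectures above (finding common elements with a plain list scan versus a hashtable/dict) can be sketched as follows; the function names and data are illustrative, not from the course:

```python
def common_naive(a, b):
    # O(len(a) * len(b)): membership test on a list rescans b for every element of a
    return [x for x in a if x in b]

def common_hashed(a, b):
    # O(len(a) + len(b)): build a hash set once, then each probe is O(1) on average
    seen = set(b)
    return [x for x in a if x in seen]

print(common_naive([1, 2, 3, 4], [3, 4, 5]))   # [3, 4]
print(common_hashed([1, 2, 3, 4], [3, 4, 5]))  # [3, 4]
```

    Both return the same answer; only the time complexity differs, which is the point of the two lectures.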
  • Plotting for exploratory data analysis (EDA) 0/0

    Exploratory data analysis (EDA) is an approach to analyzing data sets to summarize their main characteristics, often with visual methods. A statistical model can be used or not, but primarily EDA is for seeing what the data can tell us beyond the formal modeling or hypothesis-testing task.
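    As a minimal illustration of this idea (on a made-up toy table, not course data), a numerical and a visual summary in pandas might look like:

```python
import pandas as pd

# Toy tabular data standing in for any dataset under study
df = pd.DataFrame({
    "species": ["a", "a", "b", "b", "b"],
    "petal_len": [1.4, 1.3, 4.7, 4.5, 4.9],
})

# Numerical summary: per-group statistics already hint at class separation
print(df.groupby("species")["petal_len"].describe())

# Visual summary (needs matplotlib):
# df["petal_len"].plot.hist()
```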

  • Linear Algebra 0/1

    It will give you the tools to help you with the other areas of mathematics required to understand and build better intuitions for machine learning algorithms.

  • Interview Questions on Linear Algebra 0/0

  • Probability and Statistics 0/26

    • Lecture12.1
      Introduction to Probability and Statistics 17 min
    • Lecture12.2
      Population and Sample 07 min
    • Lecture12.3
      Gaussian/Normal Distribution and its PDF (Probability Density Function) 27 min
    • Lecture12.4
      CDF (Cumulative Distribution Function) of Gaussian/Normal distribution 11 min
    • Lecture12.5
      Symmetric distribution, Skewness and Kurtosis 05 min
    • Lecture12.6
      Standard normal standardization 05 min
    • Lecture12.7
      Kernel density estimation 07 min
    • Lecture12.8
      Sampling distribution & Central Limit theorem 19 min
    • Lecture12.9
      Q-Q plot: How to test if a random variable is normally distributed or not? 23 min
    • Lecture12.10
      Computing confidence interval given the underlying distribution 13 min
    • Lecture12.11
      How to randomly sample data points (Uniform Distribution) 10 min
    • Lecture12.12
      Bernoulli and Binomial Distribution 11 min
    • Lecture12.13
      Log Normal Distribution 12 min
    • Lecture12.14
      Power law distribution 12 min
    • Lecture12.15
      Box-Cox transform 12 min
    • Lecture12.16
      Covariance 14 min
    • Lecture12.17
      Pearson Correlation Coefficient 13 min
    • Lecture12.18
      Spearman Rank Correlation Coefficient 07 min
    • Lecture12.19
      Correlation vs Causation 03 min
    • Lecture12.20
      Confidence interval (C.I) Introduction 08 min
    • Lecture12.21
      Discrete and Continuous Uniform distributions 11 min
    • Lecture12.22
      C.I for mean of a normal random variable 14 min
    • Lecture12.23
      Confidence interval using bootstrapping 17 min
    • Lecture12.24
      Hypothesis testing methodology, Null-hypothesis, p-value 16 min
    • Lecture12.25
      Resampling and permutation test 15 min
    • Lecture12.26
      K-S Test for similarity of distributions 15 min
    • Lecture12.27
      K-S Test for similarity of two distributions 06 min
    • Lecture12.28
      Hypothesis Testing Intuition with coin toss example 27 min
    • Lecture12.29
      Hypothesis testing: Mean differences example 18 min
    • Lecture12.30
      Resampling 19 min
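    For instance, the bootstrap confidence interval of Lecture 12.23 can be sketched with nothing but the standard library (synthetic data; the percentile indices assume a 95% interval from 2000 resamples):

```python
import random
import statistics

random.seed(0)
# Synthetic sample standing in for observed data
sample = [random.gauss(5.0, 2.0) for _ in range(200)]

# Resample with replacement; the spread of resampled means approximates
# the sampling distribution of the mean
boot_means = sorted(
    statistics.fmean(random.choices(sample, k=len(sample)))
    for _ in range(2000)
)
lo, hi = boot_means[49], boot_means[1949]  # ~2.5th / 97.5th percentiles
print(f"95% bootstrap CI for the mean: [{lo:.2f}, {hi:.2f}]")
```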
  • Interview Questions on Probability and statistics 0/0

  • Dimensionality reduction and Visualization 0/0

    In machine learning and statistics, dimensionality reduction or dimension reduction is the process of reducing the number of random variables under consideration, via obtaining a set of principal variables. It can be divided into feature selection and feature extraction.
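    A minimal sketch of the feature-extraction route (PCA via SVD), on synthetic 3-D data that lies close to a 2-D plane; all names and data here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# 100 points in 3-D generated from a random 2-D plane, plus tiny noise
plane = rng.normal(size=(100, 2))
X = plane @ rng.normal(size=(2, 3)) + 0.01 * rng.normal(size=(100, 3))

# PCA via SVD of the centered data matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

X_low = Xc @ Vt[:2].T                      # project onto top-2 principal axes
explained = (S[:2] ** 2).sum() / (S ** 2).sum()
print(X_low.shape, float(explained))
```

    Because the data is nearly planar, two principal variables retain almost all the variance, which is exactly what dimensionality reduction exploits.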

  • Interview Questions on Dimensionality Reduction 0/0

  • PCA (Principal Component Analysis) 0/0

  • t-SNE (T-distributed Stochastic Neighbourhood Embedding) 0/0

  • Real world problem: Predict rating given product reviews on Amazon 0/18

    • Lecture18.1
      Dataset overview: Amazon Fine Food reviews(EDA) 23 min
    • Lecture18.2
      Data Cleaning: Deduplication 15 min
    • Lecture18.3
      Why convert text to a vector? 14 min
    • Lecture18.4
      Bag of Words (BoW) 18 min
    • Lecture18.5
      Text Preprocessing: Stemming, Stop-word removal, Tokenization, Lemmatization. 15 min
    • Lecture18.6
      uni-gram, bi-gram, n-grams. 09 min
    • Lecture18.7
      tf-idf (term frequency- inverse document frequency) 22 min
    • Lecture18.8
      Why use log in IDF? 14 min
    • Lecture18.9
      Word2Vec. 16 min
    • Lecture18.10
      Avg-Word2Vec, tf-idf weighted Word2Vec 09 min
    • Lecture18.11
      Bag of Words (Code Sample) 19 min
    • Lecture18.12
      Text Preprocessing (Code Sample) 11 min
    • Lecture18.13
      Bi-Grams and n-grams (Code Sample) 05 min
    • Lecture18.14
      TF-IDF (Code Sample) 06 min
    • Lecture18.15
      Word2Vec (Code Sample) 12 min
    • Lecture18.16
      Avg-Word2Vec and TFIDF-Word2Vec (Code Sample) 02 min
    • Lecture18.17
      Exercise: t-SNE visualization of Amazon reviews with polarity based color-coding 06 min
    • Lecture18.18
      Assignment 30 min
  • Classification And Regression Models: K-Nearest Neighbors 0/32

    • Lecture19.1
      How “Classification” works 10 min
    • Lecture19.2
      Data matrix notation 07 min
    • Lecture19.3
      Classification vs Regression (examples) 06 min
    • Lecture19.4
      K-Nearest Neighbours Geometric intuition with a toy example 11 min
    • Lecture19.5
      Failure cases of KNN 07 min
    • Lecture19.6
      Distance measures: Euclidean (L2), Manhattan (L1), Minkowski, Hamming 20 min
    • Lecture19.7
      Cosine Distance & Cosine Similarity 19 min
    • Lecture19.8
      How to measure the effectiveness of k-NN? 16 min
    • Lecture19.9
      Test/Evaluation time and space complexity 12 min
    • Lecture19.10
      KNN Limitations 09 min
    • Lecture19.11
      Decision surface for K-NN as K changes 23 min
    • Lecture19.12
      Overfitting and Underfitting 12 min
    • Lecture19.13
      Need for Cross validation 22 min
    • Lecture19.14
      K-fold cross validation 17 min
    • Lecture19.15
      Visualizing train, validation and test datasets 13 min
    • Lecture19.16
      How to determine overfitting and underfitting? 19 min
    • Lecture19.17
      Time based splitting 19 min
    • Lecture19.18
      k-NN for regression 05 min
    • Lecture19.19
      Weighted k-NN 08 min
    • Lecture19.20
      Voronoi diagram 04 min
    • Lecture19.21
      Binary search tree 16 min
    • Lecture19.22
      How to build a kd-tree 17 min
    • Lecture19.23
      Find nearest neighbours using kd-tree 13 min
    • Lecture19.24
      Limitations of kd-tree 09 min
    • Lecture19.25
      Extensions 03 min
    • Lecture19.26
      Hashing vs LSH 10 min
    • Lecture19.27
      LSH for cosine similarity 40 min
    • Lecture19.28
      LSH for euclidean distance 13 min
    • Lecture19.29
      Probabilistic class label 08 min
    • Lecture19.30
      Code Sample: Decision boundary 23 min
    • Lecture19.31
      Code Sample: Cross Validation 13 min
    • Lecture19.32
      Exercise: Apply k-NN on Amazon reviews dataset 05 min
  • Interview Questions on K-NN(K Nearest Neighbour) 0/0

  • Classification algorithms in various situations 0/20

    • Lecture21.1
      Introduction 05 min
    • Lecture21.2
      Imbalanced vs balanced dataset 23 min
    • Lecture21.3
      Multi-class classification 12 min
    • Lecture21.4
      k-NN, given a distance or similarity matrix 09 min
    • Lecture21.5
      Train and test set differences 22 min
    • Lecture21.6
      Impact of outliers 07 min
    • Lecture21.7
      Local Outlier Factor (Simple solution: Mean distance to k-NN) 13 min
    • Lecture21.8
      k-distance 04 min
    • Lecture21.9
      Reachability-Distance(A,B) 08 min
    • Lecture21.10
      Local reachability-density(A) 09 min
    • Lecture21.11
      Local Outlier Factor (A) 21 min
    • Lecture21.12
      Impact of Scale & Column standardization 12 min
    • Lecture21.13
      Interpretability 12 min
    • Lecture21.14
      Feature Importance and Forward Feature selection 22 min
    • Lecture21.15
      Handling categorical and numerical features 24 min
    • Lecture21.16
      Handling missing values by imputation 21 min
    • Lecture21.17
      Curse of dimensionality 27 min
    • Lecture21.18
      Bias-Variance tradeoff 24 min
    • Lecture21.19
      Intuitive understanding of bias-variance. 06 min
    • Lecture21.20
      Best and worst cases of the algorithm 06 min
  • Interview Questions on Classification algorithms in various situations 0/0

  • Performance measurement of models 0/8

    • Lecture23.1
      Accuracy 15 min
    • Lecture23.2
      Confusion matrix, TPR, FPR, FNR, TNR 25 min
    • Lecture23.3
      Precision and recall, F1-score 10 min
    • Lecture23.4
      Receiver Operating Characteristic Curve (ROC) curve and AUC 19 min
    • Lecture23.5
      Log-loss 12 min
    • Lecture23.6
      R-Squared/Coefficient of determination 14 min
    • Lecture23.7
      Median absolute deviation (MAD) 05 min
    • Lecture23.8
      Distribution of errors 07 min
  • Interview Questions on Performance Measurement of Models 0/0

  • Naive Bayes 0/21

    • Lecture25.1
      Conditional probability 13 min
    • Lecture25.2
      Independent vs Mutually exclusive events 06 min
    • Lecture25.3
      Bayes Theorem with examples 18 min
    • Lecture25.4
      Exercise problems on Bayes Theorem 30 min
    • Lecture25.5
      Naive Bayes algorithm 26 min
    • Lecture25.6
      Toy example: Train and test stages 26 min
    • Lecture25.7
      Naive Bayes on Text data 16 min
    • Lecture25.8
      Laplace/Additive Smoothing 24 min
    • Lecture25.9
      Log-probabilities for numerical stability 11 min
    • Lecture25.10
      Bias and Variance tradeoff 14 min
    • Lecture25.11
      Feature importance and interpretability 10 min
    • Lecture25.12
      Imbalanced data 14 min
    • Lecture25.13
      Outliers 06 min
    • Lecture25.14
      Missing values 03 min
    • Lecture25.15
      Handling Numerical features (Gaussian NB) 13 min
    • Lecture25.16
      Multiclass classification 02 min
    • Lecture25.17
      Similarity or Distance matrix 03 min
    • Lecture25.18
      Large dimensionality 02 min
    • Lecture25.19
      Best and worst cases 08 min
    • Lecture25.20
      Code example 07 min
    • Lecture25.21
      Exercise: Apply Naive Bayes to Amazon reviews 06 min
  • Interview Questions on Naive Bayes Algorithm 0/0

  • Logistic Regression 0/18

    • Lecture27.1
      Geometric intuition of Logistic Regression 31 min
    • Lecture27.2
      Sigmoid function: Squashing 37 min
    • Lecture27.3
      Mathematical formulation of Objective function 24 min
    • Lecture27.4
      Weight vector 11 min
    • Lecture27.5
      L2 Regularization: Overfitting and Underfitting 26 min
    • Lecture27.6
      L1 regularization and sparsity 11 min
    • Lecture27.7
      Probabilistic Interpretation: Gaussian Naive Bayes 19 min
    • Lecture27.8
      Loss minimization interpretation 24 min
    • Lecture27.9
      Hyperparameters and random search 16 min
    • Lecture27.10
      Column Standardization 05 min
    • Lecture27.11
      Feature importance and Model interpretability 14 min
    • Lecture27.12
      Collinearity of features 14 min
    • Lecture27.13
      Test/Run-time space and time complexity 10 min
    • Lecture27.14
      Real world cases 11 min
    • Lecture27.15
      Non-linearly separable data & feature engineering 28 min
    • Lecture27.16
      Code sample: Logistic regression, GridSearchCV, RandomSearchCV 23 min
    • Lecture27.17
      Exercise: Apply Logistic regression to Amazon reviews dataset. 06 min
    • Lecture27.18
      Extensions to Generalized linear models 09 min
  • Linear Regression 0/4

    • Lecture28.1
      Geometric intuition of Linear Regression 13 min
    • Lecture28.2
      Mathematical formulation 14 min
    • Lecture28.3
      Real world Cases 08 min
    • Lecture28.4
      Code sample for Linear Regression 13 min
  • Solving Optimization Problems 0/12

    • Lecture29.1
      Differentiation 29 min
    • Lecture29.2
      Online differentiation tools 08 min
    • Lecture29.3
      Maxima and Minima 12 min
    • Lecture29.4
      Vector calculus: Grad 10 min
    • Lecture29.5
      Gradient descent: geometric intuition 19 min
    • Lecture29.6
      Learning rate 08 min
    • Lecture29.7
      Gradient descent for linear regression 08 min
    • Lecture29.8
      SGD algorithm 09 min
    • Lecture29.9
      Constrained Optimization & PCA 14 min
    • Lecture29.10
      Logistic regression formulation revisited 06 min
    • Lecture29.11
      Why does L1 regularization create sparsity? 17 min
    • Lecture29.12
      Exercise: Implement SGD for linear regression 06 min
    • Lecture29.13
      Interview Questions on Logistic Regression and Linear Regression 30 min
  • Interview Questions on Logistic Regression and Linear Regression 0/0

  • Support Vector Machines (SVM) 0/15

    • Lecture31.1
      Geometric Intuition 20 min
    • Lecture31.2
      Why we take values +1 and -1 for Support vector planes 09 min
    • Lecture31.3
      Mathematical derivation 32 min
    • Lecture31.4
      Loss function (Hinge Loss) based interpretation 18 min
    • Lecture31.5
      Dual form of SVM formulation 16 min
    • Lecture31.6
      kernel trick 10 min
    • Lecture31.7
      Polynomial kernel 11 min
    • Lecture31.8
      RBF-Kernel 21 min
    • Lecture31.9
      Domain specific Kernels 06 min
    • Lecture31.10
      Train and run time complexities 08 min
    • Lecture31.11
      nu-SVM: control errors and support vectors 06 min
    • Lecture31.12
      SVM Regression 08 min
    • Lecture31.13
      Cases 09 min
    • Lecture31.14
      Code Sample 14 min
    • Lecture31.15
      Exercise: Apply SVM to Amazon reviews dataset 04 min
  • Interview Questions on Support Vector Machine 0/0

  • Decision Trees 0/15

    • Lecture33.1
      Geometric Intuition of decision tree: Axis parallel hyperplanes 17 min
    • Lecture33.2
      Sample Decision tree 08 min
    • Lecture33.3
      Building a decision Tree: Entropy 19 min
    • Lecture33.4
      Building a decision Tree: Information Gain 10 min
    • Lecture33.5
      Building a decision Tree: Gini Impurity 07 min
    • Lecture33.6
      Building a decision Tree: Constructing a DT 21 min
    • Lecture33.7
      Building a decision Tree: Splitting numerical features 08 min
    • Lecture33.8
      Feature standardization 04 min
    • Lecture33.9
      Building a decision Tree: Categorical features with many possible values 07 min
    • Lecture33.10
      Overfitting and Underfitting 08 min
    • Lecture33.11
      Train and Run time complexity 07 min
    • Lecture33.12
      Regression using Decision Trees 09 min
    • Lecture33.13
      Cases 12 min
    • Lecture33.14
      Code Samples 09 min
    • Lecture33.15
      Exercise: Decision Trees on Amazon reviews dataset 03 min
  • Interview Questions on Decision Trees 0/0

  • Ensemble Models 0/19

    • Lecture35.1
      What are ensembles? 06 min
    • Lecture35.2
      Bootstrapped Aggregation (Bagging) Intuition 17 min
    • Lecture35.3
      Random Forests and their construction 15 min
    • Lecture35.4
      Bias-Variance tradeoff 07 min
    • Lecture35.5
      Train and run time complexity 09 min
    • Lecture35.6
      Bagging: Code Sample 04 min
    • Lecture35.7
      Extremely randomized trees 08 min
    • Lecture35.8
      Random Tree: Cases 06 min
    • Lecture35.9
      Boosting Intuition 17 min
    • Lecture35.10
      Residuals, Loss functions and gradients 13 min
    • Lecture35.11
      Gradient Boosting 10 min
    • Lecture35.12
      Regularization by Shrinkage 08 min
    • Lecture35.13
      Train and Run time complexity 06 min
    • Lecture35.14
      XGBoost: Boosting + Randomization 14 min
    • Lecture35.15
      AdaBoost: geometric intuition 07 min
    • Lecture35.16
      Stacking models 22 min
    • Lecture35.17
      Cascading classifiers 15 min
    • Lecture35.18
      Kaggle competitions vs Real world 09 min
    • Lecture35.19
      Exercise: Apply GBDT and RF to Amazon reviews dataset. 04 min
  • Interview Questions on Ensemble Models 0/0

  • Featurization and Feature Engineering 0/18

    • Lecture37.1
      Introduction 17 min
    • Lecture37.2
      Moving window for Time Series Data 25 min
    • Lecture37.3
      Fourier decomposition 22 min
    • Lecture37.4
      Deep learning features: LSTM 08 min
    • Lecture37.5
      Image histogram 23 min
    • Lecture37.6
      Keypoints: SIFT. 10 min
    • Lecture37.7
      Deep learning features: CNN 04 min
    • Lecture37.8
      Relational data 10 min
    • Lecture37.9
      Graph data 12 min
    • Lecture37.10
      Indicator variables 07 min
    • Lecture37.11
      Feature binning 14 min
    • Lecture37.12
      Interaction variables 08 min
    • Lecture37.13
      Mathematical transforms 04 min
    • Lecture37.14
      Model specific featurizations 09 min
    • Lecture37.15
      Feature orthogonality 11 min
    • Lecture37.16
      Domain specific featurizations 04 min
    • Lecture37.17
      Feature slicing 10 min
    • Lecture37.18
      Kaggle Winners solutions 07 min
  • Miscellaneous Topics 0/10

    • Lecture38.1
      Calibration of Models: Need for calibration 08 min
    • Lecture38.2
      Calibration Plots. 17 min
    • Lecture38.3
      Platt’s Calibration/Scaling. 08 min
    • Lecture38.4
      Isotonic Regression 11 min
    • Lecture38.5
      Code Samples 04 min
    • Lecture38.6
      Modeling in the presence of outliers: RANSAC 13 min
    • Lecture38.7
      Productionizing models 17 min
    • Lecture38.8
      Retraining models periodically. 08 min
    • Lecture38.9
      A/B testing. 22 min
    • Lecture38.10
      Data Science Life cycle 17 min
  • Unsupervised learning/Clustering 0/14

    • Lecture39.1
      What is Clustering? 10 min
    • Lecture39.2
      Unsupervised learning 04 min
    • Lecture39.3
      Applications 16 min
    • Lecture39.4
      Metrics for Clustering 13 min
    • Lecture39.5
      K-Means: Geometric intuition, Centroids 08 min
    • Lecture39.6
      K-Means: Mathematical formulation: Objective function 11 min
    • Lecture39.7
      K-Means Algorithm. 11 min
    • Lecture39.8
      How to initialize: K-Means++ 24 min
    • Lecture39.9
      Failure cases/Limitations 11 min
    • Lecture39.10
      K-Medoids 19 min
    • Lecture39.11
      Determining the right K 05 min
    • Lecture39.12
      Code Samples 07 min
    • Lecture39.13
      Time and space complexity 04 min
    • Lecture39.14
      Exercise: Clustering 05 min
  • Hierarchical clustering Technique 0/7

    • Lecture40.1
      Agglomerative & Divisive, Dendrograms 13 min
    • Lecture40.2
      Agglomerative Clustering 09 min
    • Lecture40.3
      Proximity methods: Advantages and Limitations. 24 min
    • Lecture40.4
      Time and Space Complexity 04 min
    • Lecture40.5
      Limitations of Hierarchical Clustering 05 min
    • Lecture40.6
      Code sample 03 min
    • Lecture40.7
      Exercise: Amazon food reviews 03 min
  • DBSCAN (Density based clustering) Technique 0/10

    • Lecture41.1
      Density based clustering 05 min
    • Lecture41.2
      MinPts and Eps: Density 06 min
    • Lecture41.3
      Core, Border and Noise points 07 min
    • Lecture41.4
      Density edge and Density connected points. 06 min
    • Lecture41.5
      DBSCAN Algorithm 11 min
    • Lecture41.6
      Hyper Parameters: MinPts and Eps 10 min
    • Lecture41.7
      Advantages and Limitations of DBSCAN 10 min
    • Lecture41.8
      Time and Space Complexity 03 min
    • Lecture41.9
      Code samples. 03 min
    • Lecture41.10
      Exercise: Amazon Food reviews 03 min
  • Interview Questions on Clustering 0/0

  • Recommender Systems and Matrix Factorization 0/15

    • Lecture43.1
      Problem formulation: IMDB Movie reviews 23 min
    • Lecture43.2
      Content based vs Collaborative Filtering 11 min
    • Lecture43.3
      Similarity based Algorithms 16 min
    • Lecture43.4
      Matrix Factorization: PCA, SVD 23 min
    • Lecture43.5
      Matrix Factorization: NMF 03 min
    • Lecture43.6
      Matrix Factorization for Collaborative filtering 23 min
    • Lecture43.7
      Matrix Factorization for feature engineering 09 min
    • Lecture43.8
      Clustering as MF 21 min
    • Lecture43.9
      Hyperparameter tuning 10 min
    • Lecture43.10
      Matrix Factorization for recommender systems: Netflix Prize Solution 30 min
    • Lecture43.11
      Cold Start problem 06 min
    • Lecture43.12
      Word vectors as MF 20 min
    • Lecture43.13
      Eigen-Faces 15 min
    • Lecture43.14
      Code example. 11 min
    • Lecture43.15
      Exercise: Word Vectors using Truncated SVD. 07 min
  • Interview Questions on Recommender Systems and Matrix Factorization 0/0

  • Case Study 2: Personalized Cancer Diagnosis 0/22

    • Lecture45.1
      Business/Real world problem: Overview 13 min
    • Lecture45.2
      Business objectives and constraints. 11 min
    • Lecture45.3
      ML problem formulation: Data 05 min
    • Lecture45.4
      ML problem formulation: Mapping real world to ML problem 19 min
    • Lecture45.5
      ML problem formulation: Train, CV and Test data construction 04 min
    • Lecture45.6
      Exploratory Data Analysis: Reading data & preprocessing 07 min
    • Lecture45.7
      Exploratory Data Analysis: Distribution of Class-labels 07 min
    • Lecture45.8
      Exploratory Data Analysis: “Random” Model 19 min
    • Lecture45.9
      Univariate Analysis: Gene feature 34 min
    • Lecture45.10
      Univariate Analysis: Variation Feature 19 min
    • Lecture45.11
      Univariate Analysis: Text feature 15 min
    • Lecture45.12
      Machine Learning Models: Data preparation 08 min
    • Lecture45.13
      Baseline Model: Naive Bayes 23 min
    • Lecture45.14
      K-Nearest Neighbors Classification 09 min
    • Lecture45.15
      Logistic Regression with class balancing 10 min
    • Lecture45.16
      Logistic Regression without class balancing 04 min
    • Lecture45.17
      Linear-SVM. 06 min
    • Lecture45.18
      Random-Forest with one-hot encoded features 07 min
    • Lecture45.19
      Random-Forest with response-coded features 06 min
    • Lecture45.20
      Stacking Classifier 08 min
    • Lecture45.21
      Majority Voting classifier 05 min
    • Lecture45.22
      Assignments. 05 min
  • Case study 3: Taxi demand prediction in New York City 0/28

    • Lecture46.1
      Business/Real world problem Overview 09 min
    • Lecture46.2
      Objectives and Constraints 11 min
    • Lecture46.3
      Mapping to ML problem: Data 08 min
    • Lecture46.4
      Mapping to ML problem: Dask dataframes 11 min
    • Lecture46.5
      Mapping to ML problem: Fields/Features 06 min
    • Lecture46.6
      Mapping to ML problem: Time series forecasting/Regression 08 min
    • Lecture46.7
      Mapping to ML problem: Performance metrics 06 min
    • Lecture46.8
      Data Cleaning: Latitude and Longitude data 04 min
    • Lecture46.9
      Data Cleaning: Trip Duration 07 min
    • Lecture46.10
      Data Cleaning: Speed 05 min
    • Lecture46.11
      Data Cleaning: Distance 02 min
    • Lecture46.12
      Data Cleaning: Fare 06 min
    • Lecture46.13
      Data Cleaning: Remove all outliers/erroneous points 03 min
    • Lecture46.14
      Data Preparation: Clustering/Segmentation 19 min
    • Lecture46.15
      Data Preparation: Time binning 05 min
    • Lecture46.16
      Data Preparation: Smoothing time-series data 05 min
    • Lecture46.17
      Data Preparation: Smoothing time-series data (contd.) 02 min
    • Lecture46.18
      Data Preparation: Time series and Fourier transforms. 13 min
    • Lecture46.19
      Ratios and previous-time-bin values 09 min
    • Lecture46.20
      Simple moving average 08 min
    • Lecture46.21
      Weighted Moving average. 05 min
    • Lecture46.22
      Exponential weighted moving average 06 min
    • Lecture46.23
      Results. 04 min
    • Lecture46.24
      Regression models :Train-Test split & Features 08 min
    • Lecture46.25
      Linear regression. 03 min
    • Lecture46.26
      Random Forest regression 04 min
    • Lecture46.27
      Xgboost Regression 02 min
    • Lecture46.28
      Model comparison 06 min
    • Lecture46.29
      Assignment. 06 min
  • Case Study 4: Microsoft Malware Detection 0/21

    • Lecture47.1
      Business/real world problem: Problem definition 06 min
    • Lecture47.2
      Business/real world problem: Objectives and constraints 07 min
    • Lecture47.3
      Machine Learning problem mapping: Data overview 13 min
    • Lecture47.4
      Machine Learning problem mapping: ML problem 12 min
    • Lecture47.5
      Machine Learning problem mapping: Train and test splitting 04 min
    • Lecture47.6
      Exploratory Data Analysis: Class distribution 03 min
    • Lecture47.7
      Exploratory Data Analysis: Feature extraction from byte files 08 min
    • Lecture47.8
      Exploratory Data Analysis: Multivariate analysis of features from byte files 03 min
    • Lecture47.9
      Exploratory Data Analysis: Train-Test class distribution 03 min
    • Lecture47.10
      ML models – using byte files only: Random Model 11 min
    • Lecture47.11
      k-NN 07 min
    • Lecture47.12
      Logistic regression 05 min
    • Lecture47.13
      Random Forest and Xgboost 07 min
    • Lecture47.14
      ASM Files: Feature extraction & Multi-threading 11 min
    • Lecture47.15
      File-size feature 02 min
    • Lecture47.16
      Univariate analysis 03 min
    • Lecture47.17
      t-SNE analysis. 02 min
    • Lecture47.18
      ML models on ASM file features 07 min
    • Lecture47.19
      Models on all features :t-SNE 02 min
    • Lecture47.20
      Models on all features :RandomForest and Xgboost 04 min
    • Lecture47.21
      Assignments. 04 min
  • Case study 5: Netflix Movie Recommendation System 0/28

    • Lecture48.1
      Business/Real world problem:Problem definition 06 min
    • Lecture48.2
      Objectives and constraints 07 min
    • Lecture48.3
      Mapping to an ML problem: Data overview 04 min
    • Lecture48.4
      Mapping to an ML problem: ML problem formulation 05 min
    • Lecture48.5
      Exploratory Data Analysis: Data preprocessing 07 min
    • Lecture48.6
      Exploratory Data Analysis: Temporal Train-Test split 06 min
    • Lecture48.7
      Exploratory Data Analysis: Preliminary data analysis 15 min
    • Lecture48.8
      Exploratory Data Analysis: Sparse matrix representation 08 min
    • Lecture48.9
      Exploratory Data Analysis: Average ratings for various slices 08 min
    • Lecture48.10
      Exploratory Data Analysis: Cold start problem 05 min
    • Lecture48.11
      Computing Similarity matrices: User-User similarity matrix 20 min
    • Lecture48.12
      Computing Similarity matrices: Movie-Movie similarity 06 min
    • Lecture48.13
      Computing Similarity matrices: Does movie-movie similarity work? 06 min
    • Lecture48.14
      ML Models: Surprise library 06 min
    • Lecture48.15
      Overview of the modelling strategy. 08 min
    • Lecture48.16
      Data Sampling. 05 min
    • Lecture48.17
      Google drive with intermediate files 02 min
    • Lecture48.18
      Featurizations for regression. 11 min
    • Lecture48.19
      Data transformation for Surprise. 02 min
    • Lecture48.20
      Xgboost with 13 features 06 min
    • Lecture48.21
      Surprise Baseline model. 09 min
    • Lecture48.22
      Xgboost + 13 features + Surprise baseline model 04 min
    • Lecture48.23
      Surprise KNN predictors 15 min
    • Lecture48.24
      Matrix Factorization models using Surprise 05 min
    • Lecture48.25
      SVD++ with implicit feedback 11 min
    • Lecture48.26
      Final models with all features and predictors. 04 min
    • Lecture48.27
      Comparison between various models. 04 min
    • Lecture48.28
      Assignments 04 min
  • Case study 6: Stackoverflow tag predictor 0/18

    • Lecture49.1
      Business/Real world problem 10 min
    • Lecture49.2
      Business objectives and constraints 05 min
    • Lecture49.3
      Mapping to an ML problem: Data overview 04 min
    • Lecture49.4
      Mapping to an ML problem: ML problem formulation 05 min
    • Lecture49.5
      Mapping to an ML problem: Performance metrics 21 min
    • Lecture49.6
      Hamming loss 07 min
    • Lecture49.7
      EDA: Data Loading 13 min
    • Lecture49.8
      EDA: Analysis of tags 11 min
    • Lecture49.9
      EDA: Data Preprocessing 11 min
    • Lecture49.10
      Data Modeling : Multi label Classification 18 min
    • Lecture49.11
      Data preparation. 08 min
    • Lecture49.12
      Train-Test Split 02 min
    • Lecture49.13
      Featurization 06 min
    • Lecture49.14
      Logistic regression: One vs Rest 07 min
    • Lecture49.15
      Sampling data and tags + Weighted models 04 min
    • Lecture49.16
      Logistic regression revisited 04 min
    • Lecture49.17
      Why not use advanced techniques? 03 min
    • Lecture49.18
      Assignments. 05 min
  • Case Study 7: Quora Question Pair Similarity Problem 0/17

    • Lecture50.1
      Business/Real world problem: Problem definition 06 min
    • Lecture50.2
      Business objectives and constraints 05 min
    • Lecture50.3
      Mapping to an ML problem: Data overview 05 min
    • Lecture50.4
      Mapping to an ML problem: ML problem and performance metric 04 min
    • Lecture50.5
      Mapping to an ML problem: Train-test split 05 min
    • Lecture50.6
      EDA: Basic Statistics. 07 min
    • Lecture50.7
      EDA: Basic Feature Extraction 06 min
    • Lecture50.8
      EDA: Text Preprocessing 10 min
    • Lecture50.9
      EDA: Advanced Feature Extraction 31 min
    • Lecture50.10
      EDA: Feature analysis. 09 min
    • Lecture50.11
      EDA: Data Visualization: T-SNE. 03 min
    • Lecture50.12
      EDA: TF-IDF weighted Word2Vec featurization. 06 min
    • Lecture50.13
      ML Models: Loading Data 06 min
    • Lecture50.14
      ML Models: Random Model 07 min
    • Lecture50.15
      ML Models: Logistic Regression and Linear SVM 11 min
    • Lecture50.16
      ML Models: XGBoost 06 min
    • Lecture50.17
      Assignments 04 min
  • Deep Learning: Neural Networks 0/14

    • Lecture51.1
      History of Neural networks and Deep Learning. 25 min
    • Lecture51.2
      How Biological Neurons work 10 min
    • Lecture51.3
      Growth of biological neural networks 16 min
    • Lecture51.4
      Diagrammatic representation: Logistic Regression and Perceptron 17 min
    • Lecture51.5
      Multi-Layered Perceptron (MLP). 23 min
    • Lecture51.6
      Notation 18 min
    • Lecture51.7
      Training a single-neuron model. 28 min
    • Lecture51.8
      Training an MLP: Chain Rule 40 min
    • Lecture51.9
      Training an MLP: Memoization 14 min
    • Lecture51.10
      Backpropagation. 26 min
    • Lecture51.11
      Activation functions 17 min
    • Lecture51.12
      Vanishing Gradient problem. 23 min
    • Lecture51.13
      Bias-Variance tradeoff. 10 min
    • Lecture51.14
      Decision surfaces: Playground 15 min
  • Deep Learning: Deep Multi-layer perceptrons 0/21

    • Lecture52.1
      Deep Multi-layer perceptrons: 1980s to 2010s 16 min
    • Lecture52.2
      Dropout layers & Regularization. 21 min
    • Lecture52.3
      Rectified Linear Units (ReLU). 28 min
    • Lecture52.4
      Weight initialization. 24 min
    • Lecture52.5
      Batch Normalization. 21 min
    • Lecture52.6
      Optimizers: Hill-descent analogy in 2D 19 min
    • Lecture52.7
      Optimizers: Hill descent in 3D and contours. 13 min
    • Lecture52.8
      SGD Recap 18 min
    • Lecture52.9
      Batch SGD with momentum. 25 min
    • Lecture52.10
      Nesterov Accelerated Gradient (NAG) 08 min
    • Lecture52.11
      Optimizers:AdaGrad 15 min
    • Lecture52.12
      Optimizers: AdaDelta and RMSProp 10 min
    • Lecture52.13
      Adam 11 min
    • Lecture52.14
      Which algorithm to choose when? 05 min
    • Lecture52.15
      Gradient Checking and clipping 10 min
    • Lecture52.16
      Softmax and Cross-entropy for multi-class classification. 25 min
    • Lecture52.17
      How to train a Deep MLP? 08 min
    • Lecture52.18
      Auto Encoders. 27 min
    • Lecture52.19
      Word2Vec: CBOW 19 min
    • Lecture52.20
      Word2Vec: Skip-gram 14 min
    • Lecture52.21
      Word2Vec: Algorithmic Optimizations. 12 min
  • Deep Learning: Tensorflow and Keras. 0/14

    • Lecture53.1
      Tensorflow and Keras overview 23 min
    • Lecture53.2
      GPU vs CPU for Deep Learning. 23 min
    • Lecture53.3
      Google Colaboratory. 05 min
    • Lecture53.4
      Install TensorFlow 06 min
    • Lecture53.5
      Online documentation and tutorials 06 min
    • Lecture53.6
      Softmax Classifier on MNIST dataset. 32 min
    • Lecture53.7
      MLP: Initialization 11 min
    • Lecture53.8
      Model 1: Sigmoid activation. 22 min
    • Lecture53.9
      Model 2: ReLU activation. 06 min
    • Lecture53.10
      Model 3: Batch Normalization. 08 min
    • Lecture53.11
      Model 4: Dropout. 05 min
    • Lecture53.12
      MNIST classification in Keras. 18 min
    • Lecture53.13
      Hyperparameter tuning in Keras. 11 min
    • Lecture53.14
      Exercise: Try different MLP architectures on MNIST dataset. 05 min
  • Deep Learning: Convolutional Neural Nets. 0/19

    • Lecture54.1
      Biological inspiration: Visual Cortex 17 min
    • Lecture54.2
      Convolution: Edge Detection on images. 28 min
    • Lecture54.3
      Convolution: Padding and strides 19 min
    • Lecture54.4
      Convolution over RGB images. 11 min
    • Lecture54.5
      Convolutional layer. 23 min
    • Lecture54.6
      Max-pooling. 12 min
    • Lecture54.7
      CNN Training: Optimization 09 min
    • Lecture54.8
      Example CNN: LeNet [1998] 11 min
    • Lecture54.9
      ImageNet dataset. 06 min
    • Lecture54.10
      Data Augmentation. 07 min
    • Lecture54.11
      Convolution Layers in Keras 17 min
    • Lecture54.12
      AlexNet 13 min
    • Lecture54.13
      VGGNet 11 min
    • Lecture54.14
      Residual Network. 22 min
    • Lecture54.15
      Inception Network. 19 min
    • Lecture54.16
      What is Transfer Learning? 23 min
    • Lecture54.17
      Code example: Cats vs Dogs. 15 min
    • Lecture54.18
      Code Example: MNIST dataset. 06 min
    • Lecture54.19
      Assignment: Try various CNN networks on MNIST dataset. 04 min
  • Deep Learning: Long Short-term memory (LSTMs) 0/11

    • Lecture55.1
      Why RNNs? 23 min
    • Lecture55.2
      Recurrent Neural Network. 29 min
    • Lecture55.3
      Training RNNs: Backprop. 16 min
    • Lecture55.4
      Types of RNNs. 14 min
    • Lecture55.5
      Need for LSTM/GRU. 10 min
    • Lecture55.6
      LSTM. 34 min
    • Lecture55.7
      GRUs. 07 min
    • Lecture55.8
      Deep RNN. 07 min
    • Lecture55.9
      Bidirectional RNN. 12 min
    • Lecture55.10
      Code example : IMDB Sentiment classification 33 min
    • Lecture55.11
      Exercise: Amazon Fine Food reviews LSTM model. 04 min
  • Case Study 8: Amazon fashion discovery engine 0/28

    • Lecture56.1
      Problem Statement: Recommend similar apparel products in e-commerce using product descriptions and Images 12 min
    • Lecture56.2
      Plan of action 07 min
    • Lecture56.3
      Amazon product advertising API 10 min
    • Lecture56.4
      Data folders and paths 06 min
    • Lecture56.5
      Overview of the data and Terminology 12 min
    • Lecture56.6
      Data cleaning and understanding: Missing data in various features 22 min
    • Lecture56.7
      Understand duplicate rows 09 min
    • Lecture56.8
      Remove duplicates : Part 1 12 min
    • Lecture56.9
      Remove duplicates: Part 2 15 min
    • Lecture56.10
      Text Pre-Processing: Tokenization and Stop-word removal 10 min
    • Lecture56.11
      Stemming 04 min
    • Lecture56.12
      Text based product similarity: Converting text to an n-D vector: bag of words 14 min
    • Lecture56.13
      Code for bag of words based product similarity 26 min
    • Lecture56.14
      TF-IDF: featurizing text based on word-importance 17 min
    • Lecture56.15
      Code for TF-IDF based product similarity 10 min
    • Lecture56.16
      Code for IDF based product similarity 09 min
    • Lecture56.17
      Text Semantics based product similarity: Word2Vec (featurizing text based on semantic similarity) 19 min
    • Lecture56.18
      Code for Average Word2Vec product similarity 15 min
    • Lecture56.19
      TF-IDF weighted Word2Vec 09 min
    • Lecture56.20
      Code for IDF weighted Word2Vec product similarity 06 min
    • Lecture56.21
      Weighted similarity using brand and color 09 min
    • Lecture56.22
      Code for weighted similarity 07 min
    • Lecture56.23
      Building a real world solution 05 min
    • Lecture56.24
      Deep learning based visual product similarity: ConvNets: How to featurize an image: edges, shapes, parts 11 min
    • Lecture56.25
      Using Keras + Tensorflow to extract features 08 min
    • Lecture56.26
      Visual similarity based product similarity 06 min
    • Lecture56.27
      Measuring goodness of our solution: A/B testing 07 min
    • Lecture56.28
      Exercise :Build a weighted Nearest neighbor model using Visual, Text, Brand and Color 09 min
  • Case Study 9: Self Driving Car 0/14

    • Lecture57.1
      Self Driving Car: Problem definition. 14 min
    • Lecture57.2
      Datasets. 09 min
    • Lecture57.3
      Data understanding & Analysis: Files and folders. 04 min
    • Lecture57.4
      Dash-cam images and steering angles. 05 min
    • Lecture57.5
      Split the dataset: Train vs Test 03 min
    • Lecture57.6
      EDA: Steering angles 06 min
    • Lecture57.7
      Mean Baseline model: simple 05 min
    • Lecture57.8
      Deep-learning model: Deep Learning for regression: CNN, CNN+RNN 10 min
    • Lecture57.9
      Batch load the dataset. 06 min
    • Lecture57.10
      NVIDIA’s end to end CNN model. 18 min
    • Lecture57.11
      Train the model. 13 min
    • Lecture57.12
      Test and visualize the output. 11 min
    • Lecture57.13
      Extensions. 05 min
    • Lecture57.14
      Assignment. 03 min
  • Case Studies 0/4

    • Lecture58.1
      Ad-Click Prediction
    • Lecture58.2
      Human Activity Recognition using smartphones
    • Lecture58.3
      Song similarity and genre classification
    • Lecture58.4
      Facebook Friend Recommendation using Graph Mining
  • Interview Questions 0/1

    • Lecture59.1
      External resources 30 min
  • Interview Questions on Deep Learning 0/1

    • Lecture60.1
      Questions and Answers 30 min

Download Our Syllabus ( click on CURRICULUM tab to view lessons)

Objective of the Applied AI / Machine Learning Online Course:

The AppliedAICourse attempts to teach students/course participants some of the core ideas in machine learning, data science and AI that will help them go from a real-world business problem to a first-cut, working and deployable AI solution. Our primary focus is to help participants build real-world AI solutions using the skills they learn in this course.

This course focuses on practical knowledge more than mathematical or theoretical rigor. That doesn’t mean we water down the content: we try to balance theory and practice while giving more weight to the practical and applied aspects of AI, as the course name suggests. Through the course, we work on 20+ case studies of real-world AI problems and datasets to help students grasp the practical details of building AI solutions. For each idea/algorithm in AI, we provide examples to build intuition and show how the idea is used in the real world.

Key Points:

  1. The validity of this course is 365 days (i.e., starting from the date of your registration).
  2. Expert guidance: we will try to answer your queries within at most 24 hours.
  3. 10+ real-world case studies, with 5 case studies given as assignments to build your portfolio. Please click here to view the sample portfolio.
  4. 30+ machine learning and deep learning algorithms will be taught in this course.
  5. No prerequisites: we will teach everything from the basics (we just expect you to know basic programming).
  6. Python for Data Science is part of the course curriculum.
  7. The content of this course will be dynamic (i.e., lessons will be added if an exceptional paper is published).

 

Target Audience:

We are building our course content and teaching methodology to cater to the needs of students at various levels of expertise and with varying background skills. This course can be taken by anyone with a working knowledge of a modern programming language like C/C++/Java/Python. We expect the average student to spend at least 5 hours a week over a 6-month period, amounting to 145+ hours of effort. The more the effort, the better the results. Here is a list of people who would benefit from our course:

  1. Undergrad (BS/BTech/BE) students in engineering and science.
  2. Grad (MS/MTech/ME/MCA) students in engineering and science.
  3. Working professionals: Software engineers, Business analysts, Product managers, Program managers, Managers, Startup teams building ML products/services.
  4. ML Scientists and ML engineers.

Click Here to Download Our Syllabus

Course Features

  • Lectures 654
  • Quizzes 0
  • Duration 140+ hours
  • Skill level All levels
  • Language English
  • Students 963
  • Assessments Yes
QUALIFICATION: Masters from IISc Bangalore PROFESSIONAL EXPERIENCE: 9+ years of experience (Yahoo Labs, Matherix Labs Co-founder and Amazon)
₹32,500.00 ₹25,000.00

    184 Comments

  1. November 1, 2017

    Hi,
    the Python videos will be completed by the end of this week. From next week, we will start uploading the ML techniques.
    Since you just registered, please start with the workshop; once you are done with it, please continue with the AI Course videos.

  2. SHARATH R
    November 1, 2017

    Sir, for the free workshop course, will you provide any certification? Does the workshop include any project?

  3. October 30, 2017

    To what extent is this course similar to an ML course at the IITs and IISc?

    • November 1, 2017

      Well, this course has the right balance between theory and a practical approach.
      Upon completion of the course, you will be able to solve any real-world challenge easily (i.e., at least a first-cut solution).

  4. October 27, 2017

    Does this course include all the projects which are being sold separately for Rs 15,000 each?

  5. October 25, 2017

    Are you going to teach complete Python, or just the portion of Python which is required for this course? By complete Python I simply mean the standard syllabus of Python which is covered in standard textbooks and websites. Teaching everything and claiming 100% coverage is practically impossible, I know.

    • January 20, 2018

      Hello! I’m currently taking the course and I’ve just completed the Python for Data Science module. It is not an exhaustive course by any means, but Srikanth Sir has covered almost all of the syntax and features of Python we might commonly need in the future. I’m a 3rd-year IT student and I primarily have to work with Java and C++ in my college courses; I’m confident that I can complete all my programming assignments with Python after covering this module. Rest assured, it’s a cool course!

  6. October 25, 2017

    Why “Airbnb First Travel Destination” case study is not included in AI course curriculum ?

  7. October 25, 2017

    Hello Sir,

    Considering your syllabus including all case studies, how soon can we complete the course?

  8. October 24, 2017

    Hello Sir,

    Considering your syllabus how soon can we complete the course?

  9. Anushka Sinha
    October 22, 2017

    The Python for Data Science: Data Structures page is not showing. Is it yet to be updated? If not, how do I see it?

  10.

    Sir after successfully completing AI course, you are going to provide a certificate. What would be the parameter to assess the performance for grading criteria on certificate ? How would you refer us to companies after the course ?

    • November 1, 2017

      The certificate we provide will have a grade on it. The grading will be based on your performance throughout the course, the results you get in the case studies, and the quality of the portfolio that you build during the entire course.
      We have 2 certificates: one is course completion and the other is course participation. The course completion certificate will be given to those who submit at least 5 case studies out of 10. For those who get an A or A+, we will personally take responsibility to forward their resumes and portfolios to top-tier companies.

  11. October 20, 2017

    Great. Thanks.

  12. October 17, 2017

    Is it possible to get this course for a group of people and can we get different login ID?

  13. October 17, 2017

    hello sir,
    If I enroll in the AI course, will I get all the case study video lectures mentioned in the syllabus?

  14. October 17, 2017

    Is programming knowledge mandatory to take this course?

    • October 17, 2017

      We expect you to have knowledge of C programming, and any basic OOP programming knowledge will help you.
      In any case, we will teach you the Python required for Data Science in this course.

  15. Sir it will be very helpful to many students including me if you can discuss about VC dimensions in your course. People all around India and all around the Internet give different explanations but are not clear about the concept of VC Dimensions

  16. October 12, 2017

    Thanks for making videos very simple and easy to understand.
    I will register to course soon.

  17. What is the validity of the course? If I buy now, can I watch it for a lifetime, like on Udemy?

  18. We are a group of 5 people and want to take this course. Is it possible to get the course for a group? Will we get 5 login IDs for this?
    And where will the paid videos be played, i.e., on YouTube or elsewhere?

  19. October 8, 2017

    I am a fresher. After completing the entire course, can I directly start my career in AI?

  20. Your demo videos are great. Please provide subtitles; that will help us a lot.

    • admin
      October 5, 2017

      Thank you for your suggestion.
      We are already working on it. Currently we are prioritizing video content creation; as soon as the video content is ready,
      we will work on subtitles/closed captions.

      Regards,
      Team

Leave A Reply