Python for Data Science: Introduction
- Lecture1.1
- Lecture1.2
- Lecture1.3
- Lecture1.4
- Lecture1.5
- Lecture1.6
- Lecture1.7
- Lecture1.8
- Lecture1.9
- Lecture1.10
- Lecture1.11
Python for Data Science: Data Structures
- Lecture2.1
- Lecture2.2
- Lecture2.3
- Lecture2.4
- Lecture2.5
- Lecture2.6
Python for Data Science: NumPy
- Lecture3.1
- Lecture3.2
Python for Data Science: Functions
- Lecture4.1
- Lecture4.2
- Lecture4.3
- Lecture4.4
- Lecture4.5
- Lecture4.6
- Lecture4.7
- Lecture4.8
- Lecture4.9
- Lecture4.10
Python for Data Science: Matplotlib
- Lecture5.1
Python for Data Science: Pandas
- Lecture6.1
- Lecture6.2
- Lecture6.3
Python for Data Science: Computational Complexity
- Lecture7.1
- Lecture7.2
- Lecture7.3
- Lecture7.4
Plotting for exploratory data analysis (EDA)
Exploratory data analysis (EDA) is an approach to analyzing data sets to summarize their main characteristics, often with visual methods. A statistical model may or may not be used, but primarily EDA is for seeing what the data can tell us beyond the formal modeling or hypothesis-testing task.
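As a quick illustration (not part of the course content; the seaborn sample dataset and the specific calls below are illustrative assumptions), a first EDA pass in Python might look like this:

```python
# Minimal EDA sketch: numeric summaries first, visual inspection second.
import seaborn as sns
import matplotlib.pyplot as plt

# "iris" is one of seaborn's bundled sample datasets (fetched on first use),
# used here purely for illustration.
df = sns.load_dataset("iris")

print(df.describe())                  # summary statistics per numeric column
print(df["species"].value_counts())  # class balance

# Pair plot: pairwise scatter plots plus per-feature distributions,
# colored by class, to see what the data can tell us before any modeling.
sns.pairplot(df, hue="species")
plt.show()
```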
- Lecture8.1
- Lecture8.2
- Lecture8.3
- Lecture8.4
- Lecture8.5
- Lecture8.6
- Lecture8.7
- Lecture8.8
- Lecture8.9
- Lecture8.10
- Lecture8.11
- Lecture8.12
- Lecture8.13
- Lecture8.14
- Lecture8.15
- Lecture8.16
Linear Algebra
Linear algebra gives you the tools to work with the other areas of mathematics required to understand machine learning algorithms, and to build better intuitions for them.
- Lecture9.1
- Lecture9.2
- Lecture9.3
- Lecture9.4
- Lecture9.5
- Lecture9.6
- Lecture9.7
- Lecture9.8
- Lecture9.9
- Lecture9.10
- Lecture9.11
Interview Questions on Linear Algebra
- Lecture10.1
Probability and Statistics
- Lecture11.1
- Lecture11.2
- Lecture11.3
- Lecture11.4
- Lecture11.5
- Lecture11.6
- Lecture11.7
- Lecture11.8
- Lecture11.9
- Lecture11.10
- Lecture11.11
- Lecture11.12
- Lecture11.13
- Lecture11.14
- Lecture11.15
- Lecture11.16
- Lecture11.17
- Lecture11.18
- Lecture11.19
- Lecture11.20
- Lecture11.21
- Lecture11.22
- Lecture11.23
- Lecture11.24
- Lecture11.25
- Lecture11.26
- Lecture11.27
- Lecture11.28
- Lecture11.29
- Lecture11.30
Interview Questions on Probability and Statistics
- Lecture12.1
Dimensionality Reduction and Visualization
In machine learning and statistics, dimensionality reduction (or dimension reduction) is the process of reducing the number of random variables under consideration by obtaining a set of principal variables. It can be divided into feature selection and feature extraction.
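To make the selection/extraction distinction concrete, here is a minimal sketch (not course code; it assumes scikit-learn is installed and uses its toy iris data): feature selection keeps a subset of the original variables, while feature extraction derives a new, smaller set of variables such as principal components.

```python
# Feature selection vs. feature extraction on a toy dataset (illustrative).
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True)  # 150 samples, 4 original features

# Feature selection: keep the k original features most associated with y.
X_selected = SelectKBest(score_func=f_classif, k=2).fit_transform(X, y)

# Feature extraction: derive k new features (principal components) from all 4.
X_extracted = PCA(n_components=2).fit_transform(X)

print(X.shape, X_selected.shape, X_extracted.shape)  # (150, 4) (150, 2) (150, 2)
```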
- Lecture13.1
- Lecture13.2
- Lecture13.3
- Lecture13.4
- Lecture13.5
- Lecture13.6
- Lecture13.7
- Lecture13.8
- Lecture13.9
- Lecture13.10
Interview Questions on Dimensionality Reduction
- Lecture14.1
PCA (Principal Component Analysis)
- Lecture15.1
- Lecture15.2
- Lecture15.3
- Lecture15.4
- Lecture15.5
- Lecture15.6
- Lecture15.7
- Lecture15.8
- Lecture15.9
- Lecture15.10
t-SNE (t-Distributed Stochastic Neighbour Embedding)
- Lecture16.1
- Lecture16.2
- Lecture16.3
- Lecture16.4
- Lecture16.5
- Lecture16.6
- Lecture16.7
- Lecture16.8
Real-world problem: Predict ratings from Amazon product reviews
- Lecture17.1
- Lecture17.2
- Lecture17.3
- Lecture17.4
- Lecture17.5
- Lecture17.6
- Lecture17.7
- Lecture17.8
- Lecture17.9
- Lecture17.10
- Lecture17.11
- Lecture17.12
- Lecture17.13
- Lecture17.14
- Lecture17.15
- Lecture17.16
- Lecture17.17
Classification and Regression Models: K-Nearest Neighbors
- Lecture18.1
- Lecture18.2
- Lecture18.3
- Lecture18.4
- Lecture18.5
- Lecture18.6
- Lecture18.7
- Lecture18.8
- Lecture18.9
- Lecture18.10
- Lecture18.11
- Lecture18.12
- Lecture18.13
- Lecture18.14
- Lecture18.15
- Lecture18.16
- Lecture18.17
- Lecture18.18
- Lecture18.19
- Lecture18.20
- Lecture18.21
- Lecture18.22
- Lecture18.23
- Lecture18.24
- Lecture18.25
- Lecture18.26
- Lecture18.27
- Lecture18.28
- Lecture18.29
- Lecture18.30
- Lecture18.31
- Lecture18.32
Interview Questions on K-NN (K-Nearest Neighbour)
- Lecture19.1
Classification algorithms in various situations
- Lecture20.1
- Lecture20.2
- Lecture20.3
- Lecture20.4
- Lecture20.5
- Lecture20.6
- Lecture20.7
- Lecture20.8
- Lecture20.9
- Lecture20.10
- Lecture20.11
- Lecture20.12
- Lecture20.13
- Lecture20.14
- Lecture20.15
- Lecture20.16
- Lecture20.17
- Lecture20.18
- Lecture20.19
- Lecture20.20
Interview Questions on Classification algorithms in various situations
- Lecture21.1
Performance measurement of models
- Lecture22.1
- Lecture22.2
- Lecture22.3
- Lecture22.4
- Lecture22.5
- Lecture22.6
- Lecture22.7
- Lecture22.8
Interview Questions on Performance Measurement of Models
- Lecture23.1
Naive Bayes
- Lecture24.1
- Lecture24.2
- Lecture24.3
- Lecture24.4
- Lecture24.5
- Lecture24.6
- Lecture24.7
- Lecture24.8
- Lecture24.9
- Lecture24.10
- Lecture24.11
- Lecture24.12
- Lecture24.13
- Lecture24.14
- Lecture24.15
- Lecture24.16
- Lecture24.17
- Lecture24.18
- Lecture24.19
- Lecture24.20
- Lecture24.21
Interview Questions on Naive Bayes Algorithm
- Lecture25.1
Logistic Regression
- Lecture26.1
- Lecture26.2
- Lecture26.3
- Lecture26.4
- Lecture26.5
- Lecture26.6
- Lecture26.7
- Lecture26.8
- Lecture26.9
- Lecture26.10
- Lecture26.11
- Lecture26.12
- Lecture26.13
- Lecture26.14
- Lecture26.15
- Lecture26.16
- Lecture26.17
- Lecture26.18
Linear Regression
- Lecture27.1
- Lecture27.2
- Lecture27.3
- Lecture27.4
Solving optimization problems
- Lecture28.1
- Lecture28.2
- Lecture28.3
- Lecture28.4
- Lecture28.5
- Lecture28.6
- Lecture28.7
- Lecture28.8
- Lecture28.9
- Lecture28.10
- Lecture28.11
- Lecture28.12
- Lecture28.13
Interview Questions on Logistic Regression and Linear Regression
- Lecture29.1
Support Vector Machines (SVM)
- Lecture30.1
- Lecture30.2
- Lecture30.3
- Lecture30.4
- Lecture30.5
- Lecture30.6
- Lecture30.7
- Lecture30.8
- Lecture30.9
- Lecture30.10
- Lecture30.11
- Lecture30.12
- Lecture30.13
- Lecture30.14
- Lecture30.15
Interview Questions on Support Vector Machines
- Lecture31.1
Decision Trees
- Lecture32.1
- Lecture32.2
- Lecture32.3
- Lecture32.4
- Lecture32.5
- Lecture32.6
- Lecture32.7
- Lecture32.8
- Lecture32.9
- Lecture32.10
- Lecture32.11
- Lecture32.12
- Lecture32.13
- Lecture32.14
- Lecture32.15
Interview Questions on Decision Trees
- Lecture33.1
Ensemble Models
- Lecture34.1
- Lecture34.2
- Lecture34.3
- Lecture34.4
- Lecture34.5
- Lecture34.6
- Lecture34.7
- Lecture34.8
- Lecture34.9
- Lecture34.10
- Lecture34.11
- Lecture34.12
- Lecture34.13
- Lecture34.14
- Lecture34.15
- Lecture34.16
- Lecture34.17
- Lecture34.18
- Lecture34.19
K-Means Clustering Technique
- Lecture35.1
- Lecture35.2
- Lecture35.3
- Lecture35.4
- Lecture35.5
- Lecture35.6
- Lecture35.7
- Lecture35.8
- Lecture35.9
- Lecture35.10
- Lecture35.11
Bias-Variance tradeoff
- Lecture36.1
- Lecture36.2
- Lecture36.3
Hierarchical Clustering Technique
- Lecture37.1
- Lecture37.2
- Lecture37.3
- Lecture37.4
DBSCAN (Density-Based Clustering) Technique
- Lecture38.1
- Lecture38.2
- Lecture38.3
- Lecture38.4
- Lecture38.5
- Lecture38.6
Recommender Systems and Matrix Factorization
- Lecture39.1
- Lecture39.2
- Lecture39.3
- Lecture39.4
Python for Data Science: OOP (Object-Oriented Programming)
- Lecture40.1
- Lecture40.2
Python for Data Science: Seaborn
- Lecture41.1
Python for Data Science: Scikit-Learn
- Lecture42.1
Interview Questions on Ensemble Models
- Lecture43.1
Facebook Friend Recommendation using Graph Mining
- Lecture44.1
- Lecture44.2
- Lecture44.3
- Lecture44.4
- Lecture44.5
- Lecture44.6
- Lecture44.7
- Lecture44.8
- Lecture44.9
- Lecture44.10
- Lecture44.11
- Lecture44.12
- Lecture44.13
- Lecture44.14
- Lecture44.15
- Lecture44.16
- Lecture44.17
- Lecture44.18
- Lecture44.19