#### How to utilise Appliedaicourse

- Lecture1.1

#### Python for Data Science Introduction

- Lecture2.1
- Lecture2.2
- Lecture2.3
- Lecture2.4
- Lecture2.5
- Lecture2.6
- Lecture2.7
- Lecture2.8
- Lecture2.9
- Lecture2.10
- Lecture2.11

#### Python for Data Science: Data Structures

- Lecture3.1
- Lecture3.2
- Lecture3.3
- Lecture3.4
- Lecture3.5
- Lecture3.6

#### Python for Data Science: Functions

- Lecture4.1
- Lecture4.2
- Lecture4.3
- Lecture4.4
- Lecture4.5
- Lecture4.6
- Lecture4.7
- Lecture4.8
- Lecture4.9
- Lecture4.10

#### Python for Data Science: NumPy

- Lecture5.1
- Lecture5.2

#### Python for Data Science: Matplotlib

- Lecture6.1

#### Python for Data Science: Pandas

- Lecture7.1
- Lecture7.2
- Lecture7.3

#### Python for Data Science: Computational Complexity

- Lecture8.1
- Lecture8.2
- Lecture8.3
- Lecture8.4

#### Plotting for exploratory data analysis (EDA)

Exploratory data analysis (EDA) is an approach to analyzing data sets in order to summarize their main characteristics, often with visual methods. A statistical model may or may not be used, but primarily EDA is for seeing what the data can tell us beyond the formal modeling or hypothesis-testing task.
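As a minimal sketch of what EDA looks like in practice (the tiny data frame below is made up purely for illustration):

```python
import pandas as pd

# Toy data standing in for a real dataset (hypothetical columns):
df = pd.DataFrame({"species": ["setosa", "setosa", "virginica"],
                   "sepal_length": [5.1, 4.9, 6.3]})

# Two of the most common first EDA steps:
print(df.describe())                  # summary statistics of numeric columns
print(df["species"].value_counts())   # how the classes are distributed
```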

- Lecture9.1
- Lecture9.2
- Lecture9.3
- Lecture9.4
- Lecture9.5
- Lecture9.6
- Lecture9.7
- Lecture9.8
- Lecture9.9
- Lecture9.10
- Lecture9.11
- Lecture9.12
- Lecture9.13
- Lecture9.14
- Lecture9.15
- Lecture9.16

#### Linear Algebra

Linear algebra gives you the tools to help you with the other areas of mathematics required to understand, and to build better intuitions for, machine learning algorithms.

- Lecture10.1
- Lecture10.2
- Lecture10.3
- Lecture10.4
- Lecture10.5
- Lecture10.6
- Lecture10.7
- Lecture10.8
- Lecture10.9
- Lecture10.10
- Lecture10.11

#### Interview Questions on Linear Algebra

- Lecture11.1

#### Probability and Statistics

- Lecture12.1
- Lecture12.2
- Lecture12.3
- Lecture12.4
- Lecture12.5
- Lecture12.6
- Lecture12.7
- Lecture12.8
- Lecture12.9
- Lecture12.10
- Lecture12.11
- Lecture12.12
- Lecture12.13
- Lecture12.14
- Lecture12.15
- Lecture12.16
- Lecture12.17
- Lecture12.18
- Lecture12.19
- Lecture12.20
- Lecture12.21
- Lecture12.22
- Lecture12.23
- Lecture12.24
- Lecture12.25
- Lecture12.26
- Lecture12.27
- Lecture12.28
- Lecture12.29
- Lecture12.30

#### Interview Questions on Probability and Statistics

- Lecture13.1

#### Dimensionality Reduction and Visualization

In machine learning and statistics, dimensionality reduction (or dimension reduction) is the process of reducing the number of random variables under consideration by obtaining a set of principal variables. It can be divided into feature selection and feature extraction.
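For instance, feature extraction can be sketched with PCA (scikit-learn is used here purely for illustration; PCA itself is introduced in a later chapter):

```python
import numpy as np
from sklearn.decomposition import PCA

# Four 3-D points that lie almost on a line, so two components suffice:
X = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.1],
              [3.0, 6.1, 9.0],
              [4.0, 8.0, 12.2]])

pca = PCA(n_components=2)   # extract 2 principal variables from 3 features
X2 = pca.fit_transform(X)
print(X2.shape)             # (4, 2): same rows, fewer dimensions
```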

- Lecture14.1
- Lecture14.2
- Lecture14.3
- Lecture14.4
- Lecture14.5
- Lecture14.6
- Lecture14.7
- Lecture14.8
- Lecture14.9
- Lecture14.10

#### Interview Questions on Dimensionality Reduction

- Lecture15.1

#### PCA (Principal Component Analysis)

- Lecture16.1
- Lecture16.2
- Lecture16.3
- Lecture16.4
- Lecture16.5
- Lecture16.6
- Lecture16.7
- Lecture16.8
- Lecture16.9
- Lecture16.10

#### t-SNE (t-Distributed Stochastic Neighbourhood Embedding)

- Lecture17.1
- Lecture17.2
- Lecture17.3
- Lecture17.4
- Lecture17.5
- Lecture17.6
- Lecture17.7

#### Real world problem: Predict rating given product reviews on Amazon

- Lecture18.1
- Lecture18.2
- Lecture18.3
- Lecture18.4
- Lecture18.5
- Lecture18.6
- Lecture18.7
- Lecture18.8
- Lecture18.9
- Lecture18.10
- Lecture18.11
- Lecture18.12
- Lecture18.13
- Lecture18.14
- Lecture18.15
- Lecture18.16
- Lecture18.17
- Lecture18.18

#### Classification and Regression Models: K-Nearest Neighbors

- Lecture19.1
- Lecture19.2
- Lecture19.3
- Lecture19.4
- Lecture19.5
- Lecture19.6
- Lecture19.7
- Lecture19.8
- Lecture19.9
- Lecture19.10
- Lecture19.11
- Lecture19.12
- Lecture19.13
- Lecture19.14
- Lecture19.15
- Lecture19.16
- Lecture19.17
- Lecture19.18
- Lecture19.19
- Lecture19.20
- Lecture19.21
- Lecture19.22
- Lecture19.23
- Lecture19.24
- Lecture19.25
- Lecture19.26
- Lecture19.27
- Lecture19.28
- Lecture19.29
- Lecture19.30
- Lecture19.31
- Lecture19.32

#### Interview Questions on K-NN (K-Nearest Neighbour)

- Lecture20.1

#### Classification algorithms in various situations

- Lecture21.1
- Lecture21.2
- Lecture21.3
- Lecture21.4
- Lecture21.5
- Lecture21.6
- Lecture21.7
- Lecture21.8
- Lecture21.9
- Lecture21.10
- Lecture21.11
- Lecture21.12
- Lecture21.13
- Lecture21.14
- Lecture21.15
- Lecture21.16
- Lecture21.17
- Lecture21.18
- Lecture21.19
- Lecture21.20

#### Interview Questions on Classification algorithms in various situations

- Lecture22.1

#### Performance measurement of models

- Lecture23.1
- Lecture23.2
- Lecture23.3
- Lecture23.4
- Lecture23.5
- Lecture23.6
- Lecture23.7
- Lecture23.8

#### Interview Questions on Performance Measurement of Models

- Lecture24.1

#### Naive Bayes

- Lecture25.1
- Lecture25.2
- Lecture25.3
- Lecture25.4
- Lecture25.5
- Lecture25.6
- Lecture25.7
- Lecture25.8
- Lecture25.9
- Lecture25.10
- Lecture25.11
- Lecture25.12
- Lecture25.13
- Lecture25.14
- Lecture25.15
- Lecture25.16
- Lecture25.17
- Lecture25.18
- Lecture25.19
- Lecture25.20
- Lecture25.21

#### Interview Questions on Naive Bayes Algorithm

- Lecture26.1

#### Logistic Regression

- Lecture27.1
- Lecture27.2
- Lecture27.3
- Lecture27.4
- Lecture27.5
- Lecture27.6
- Lecture27.7
- Lecture27.8
- Lecture27.9
- Lecture27.10
- Lecture27.11
- Lecture27.12
- Lecture27.13
- Lecture27.14
- Lecture27.15
- Lecture27.16
- Lecture27.17
- Lecture27.18

#### Linear Regression

- Lecture28.1
- Lecture28.2
- Lecture28.3
- Lecture28.4

#### Solving Optimization Problems

- Lecture29.1
- Lecture29.2
- Lecture29.3
- Lecture29.4
- Lecture29.5
- Lecture29.6
- Lecture29.7
- Lecture29.8
- Lecture29.9
- Lecture29.10
- Lecture29.11
- Lecture29.12
- Lecture29.13

#### Interview Questions on Logistic Regression and Linear Regression

- Lecture30.1

#### Support Vector Machines (SVM)

- Lecture31.1
- Lecture31.2
- Lecture31.3
- Lecture31.4
- Lecture31.5
- Lecture31.6
- Lecture31.7
- Lecture31.8
- Lecture31.9
- Lecture31.10
- Lecture31.11
- Lecture31.12
- Lecture31.13
- Lecture31.14
- Lecture31.15

#### Interview Questions on Support Vector Machines

- Lecture32.1

#### Decision Trees

- Lecture33.1
- Lecture33.2
- Lecture33.3
- Lecture33.4
- Lecture33.5
- Lecture33.6
- Lecture33.7
- Lecture33.8
- Lecture33.9
- Lecture33.10
- Lecture33.11
- Lecture33.12
- Lecture33.13
- Lecture33.14
- Lecture33.15

#### Interview Questions on Decision Trees

- Lecture34.1

#### Ensemble Models

- Lecture35.1
- Lecture35.2
- Lecture35.3
- Lecture35.4
- Lecture35.5
- Lecture35.6
- Lecture35.7
- Lecture35.8
- Lecture35.9
- Lecture35.10
- Lecture35.11
- Lecture35.12
- Lecture35.13
- Lecture35.14
- Lecture35.15
- Lecture35.16
- Lecture35.17
- Lecture35.18
- Lecture35.19

#### Interview Questions on Ensemble Models

- Lecture36.1

#### Featurization and Feature Engineering

- Lecture37.1
- Lecture37.2
- Lecture37.3
- Lecture37.4
- Lecture37.5
- Lecture37.6
- Lecture37.7
- Lecture37.8
- Lecture37.9
- Lecture37.10
- Lecture37.11
- Lecture37.12
- Lecture37.13
- Lecture37.14
- Lecture37.15
- Lecture37.16
- Lecture37.17
- Lecture37.18

#### Miscellaneous Topics

- Lecture38.1
- Lecture38.2
- Lecture38.3
- Lecture38.4
- Lecture38.5
- Lecture38.6
- Lecture38.7
- Lecture38.8
- Lecture38.9
- Lecture38.10

#### Unsupervised learning/Clustering

- Lecture39.1
- Lecture39.2
- Lecture39.3
- Lecture39.4
- Lecture39.5
- Lecture39.6
- Lecture39.7
- Lecture39.8
- Lecture39.9
- Lecture39.10
- Lecture39.11
- Lecture39.12
- Lecture39.13
- Lecture39.14

#### Hierarchical Clustering Technique

- Lecture40.1
- Lecture40.2
- Lecture40.3
- Lecture40.4
- Lecture40.5
- Lecture40.6
- Lecture40.7

#### DBSCAN (Density-Based Clustering) Technique

- Lecture41.1
- Lecture41.2
- Lecture41.3
- Lecture41.4
- Lecture41.5
- Lecture41.6
- Lecture41.7
- Lecture41.8
- Lecture41.9
- Lecture41.10

#### Interview Questions on Clustering

- Lecture42.1

#### Recommender Systems and Matrix Factorization

- Lecture43.1
- Lecture43.2
- Lecture43.3
- Lecture43.4
- Lecture43.5
- Lecture43.6
- Lecture43.7
- Lecture43.8
- Lecture43.9
- Lecture43.10
- Lecture43.11
- Lecture43.12
- Lecture43.13
- Lecture43.14
- Lecture43.15

#### Interview Questions on Recommender Systems and Matrix Factorization

- Lecture44.1

#### Case Study 2: Personalized Cancer Diagnosis

- Lecture45.1
- Lecture45.2
- Lecture45.3
- Lecture45.4
- Lecture45.5
- Lecture45.6
- Lecture45.7
- Lecture45.8
- Lecture45.9
- Lecture45.10
- Lecture45.11
- Lecture45.12
- Lecture45.13
- Lecture45.14
- Lecture45.15
- Lecture45.16
- Lecture45.17
- Lecture45.18
- Lecture45.19
- Lecture45.20
- Lecture45.21
- Lecture45.22

#### Case Study 3: Taxi Demand Prediction in New York City

- Lecture46.1
- Lecture46.2
- Lecture46.3
- Lecture46.4
- Lecture46.5
- Lecture46.6
- Lecture46.7
- Lecture46.8
- Lecture46.9
- Lecture46.10
- Lecture46.11
- Lecture46.12
- Lecture46.13
- Lecture46.14
- Lecture46.15
- Lecture46.16
- Lecture46.17
- Lecture46.18
- Lecture46.19
- Lecture46.20
- Lecture46.21
- Lecture46.22
- Lecture46.23
- Lecture46.24
- Lecture46.25
- Lecture46.26
- Lecture46.27
- Lecture46.28
- Lecture46.29

#### Case Study 4: Microsoft Malware Detection

- Lecture47.1
- Lecture47.2
- Lecture47.3
- Lecture47.4
- Lecture47.5
- Lecture47.6
- Lecture47.7
- Lecture47.8
- Lecture47.9
- Lecture47.10
- Lecture47.11
- Lecture47.12
- Lecture47.13
- Lecture47.14
- Lecture47.15
- Lecture47.16
- Lecture47.17
- Lecture47.18
- Lecture47.19
- Lecture47.20
- Lecture47.21

#### Case Study 5: Netflix Movie Recommendation System

- Lecture48.1
- Lecture48.2
- Lecture48.3
- Lecture48.4
- Lecture48.5
- Lecture48.6
- Lecture48.7
- Lecture48.8
- Lecture48.9
- Lecture48.10
- Lecture48.11
- Lecture48.12
- Lecture48.13
- Lecture48.14
- Lecture48.15
- Lecture48.16
- Lecture48.17
- Lecture48.18
- Lecture48.19
- Lecture48.20
- Lecture48.21
- Lecture48.22
- Lecture48.23
- Lecture48.24
- Lecture48.25
- Lecture48.26
- Lecture48.27
- Lecture48.28

#### Case Study 6: Stack Overflow Tag Predictor

- Lecture49.1
- Lecture49.2
- Lecture49.3
- Lecture49.4
- Lecture49.5
- Lecture49.6
- Lecture49.7
- Lecture49.8
- Lecture49.9
- Lecture49.10
- Lecture49.11
- Lecture49.12
- Lecture49.13
- Lecture49.14
- Lecture49.15
- Lecture49.16
- Lecture49.17
- Lecture49.18

#### Case Study 7: Quora Question Pair Similarity Problem

- Lecture50.1
- Lecture50.2
- Lecture50.3
- Lecture50.4
- Lecture50.5
- Lecture50.6
- Lecture50.7
- Lecture50.8
- Lecture50.9
- Lecture50.10
- Lecture50.11
- Lecture50.12
- Lecture50.13
- Lecture50.14
- Lecture50.15
- Lecture50.16
- Lecture50.17

#### Deep Learning: Neural Networks

- Lecture51.1
- Lecture51.2
- Lecture51.3
- Lecture51.4
- Lecture51.5
- Lecture51.6
- Lecture51.7
- Lecture51.8
- Lecture51.9
- Lecture51.10
- Lecture51.11
- Lecture51.12
- Lecture51.13
- Lecture51.14

#### Deep Learning: Deep Multi-Layer Perceptrons

- Lecture52.1
- Lecture52.2
- Lecture52.3
- Lecture52.4
- Lecture52.5
- Lecture52.6
- Lecture52.7
- Lecture52.8
- Lecture52.9
- Lecture52.10
- Lecture52.11
- Lecture52.12
- Lecture52.13
- Lecture52.14
- Lecture52.15
- Lecture52.16
- Lecture52.17
- Lecture52.18
- Lecture52.19
- Lecture52.20
- Lecture52.21

#### Deep Learning: TensorFlow and Keras

- Lecture53.1
- Lecture53.2
- Lecture53.3
- Lecture53.4
- Lecture53.5
- Lecture53.6
- Lecture53.7
- Lecture53.8
- Lecture53.9
- Lecture53.10
- Lecture53.11
- Lecture53.12
- Lecture53.13
- Lecture53.14

#### Deep Learning: Convolutional Neural Nets

- Lecture54.1
- Lecture54.2
- Lecture54.3
- Lecture54.4
- Lecture54.5
- Lecture54.6
- Lecture54.7
- Lecture54.8
- Lecture54.9
- Lecture54.10
- Lecture54.11
- Lecture54.12
- Lecture54.13
- Lecture54.14
- Lecture54.15
- Lecture54.16
- Lecture54.17
- Lecture54.18
- Lecture54.19

#### Deep Learning: Long Short-Term Memory (LSTMs)

- Lecture55.1
- Lecture55.2
- Lecture55.3
- Lecture55.4
- Lecture55.5
- Lecture55.6
- Lecture55.7
- Lecture55.8
- Lecture55.9
- Lecture55.10
- Lecture55.11

#### Case Study 8: Amazon fashion discovery engine

- Lecture56.1
- Lecture56.2
- Lecture56.3
- Lecture56.4
- Lecture56.5
- Lecture56.6
- Lecture56.7
- Lecture56.8
- Lecture56.9
- Lecture56.10
- Lecture56.11
- Lecture56.12
- Lecture56.13
- Lecture56.14
- Lecture56.15
- Lecture56.16
- Lecture56.17
- Lecture56.18
- Lecture56.19
- Lecture56.20
- Lecture56.21
- Lecture56.22
- Lecture56.23
- Lecture56.24
- Lecture56.25
- Lecture56.26
- Lecture56.27
- Lecture56.28

#### Case Study 9: Self-Driving Car

- Lecture57.1
- Lecture57.2
- Lecture57.3
- Lecture57.4
- Lecture57.5
- Lecture57.6
- Lecture57.7
- Lecture57.8
- Lecture57.9
- Lecture57.10
- Lecture57.11
- Lecture57.12
- Lecture57.13
- Lecture57.14

#### Case Studies

- Lecture58.1
- Lecture58.2
- Lecture58.3
- Lecture58.4

#### Interview Questions

- Lecture59.1

#### Interview Questions on Deep Learning

- Lecture60.1

### Introduction to IRIS dataset and 2D scatter plot


## 25 Comments

```python
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

iris = pd.read_csv("iris.csv")
iris.plot(kind="scatter", x="sepal_length", y="sepal_width")
plt.show()
```

Question:

Since `iris` is a pandas DataFrame, does the `plot` function used here belong to the pandas package or to matplotlib?

It belongs to pandas, which internally uses matplotlib.
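A quick way to see this for yourself (a minimal sketch with a made-up two-column frame instead of the real iris file):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the example runs anywhere
import pandas as pd

df = pd.DataFrame({"sepal_length": [5.1, 4.9, 6.3],
                   "sepal_width": [3.5, 3.0, 3.3]})

# .plot is a pandas method, but the object it hands back is a matplotlib Axes:
ax = df.plot(kind="scatter", x="sepal_length", y="sepal_width")
print(type(ax).__module__)  # a matplotlib module
```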

How do I install the seaborn package?

You can install it using pip (the Python package installer):

```shell
pip install seaborn
```

or you can install it using conda:

```shell
conda install seaborn
```

Hi Team,

```python
sns.set_style("whitegrid")
sns.FacetGrid(iris_df, hue="species", size=5) \
    .map(plt.scatter, "sepal_length", "sepal_width") \
    .add_legend()
plt.show()
```

In the above code snippet, are the functions `map()` and `add_legend()` part of the seaborn library? If not, can you please explain why the functions are invoked with a trailing `\` and only a `.` (dot) at the beginning, without prefixing any library?

Thanks

It's a single statement. `map()` and `add_legend()` are methods on the `FacetGrid` object returned by `sns.FacetGrid(...)`, so each call is invoked on the value returned by the previous one; that is why no library prefix is needed.

The `\` lets us split that one statement across three lines. As a Python developer you should follow the PEP 8 guidelines, which recommend keeping lines under 79 characters. Without the continuations it would be written as:

```python
sns.FacetGrid(iris_df, hue="species", size=5).map(plt.scatter, "sepal_length", "sepal_width").add_legend()
```
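The same effect can be had without backslashes: PEP 8 actually prefers implicit continuation inside parentheses, which works for method chains too. A small self-contained sketch (unrelated to seaborn):

```python
# Explicit continuation with a trailing backslash:
total = 1 + 2 + \
        3 + 4

# Implicit continuation: anything inside (), [] or {} may span lines,
# so a method chain can be wrapped in parentheses instead:
result = ("hello world"
          .upper()
          .replace("WORLD", "PEP 8"))

print(total)   # 10
print(result)  # HELLO PEP 8
```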

Hello Sir, can you please provide a link to seaborn tutorials? I see that you are using it in this video.

Seaborn tutorial: https://seaborn.pydata.org/tutorial.html

Hi Team,

Please explain the seaborn functions a bit more. In some videos you pass 'o' as a parameter but do not explain it; in others you pass the `size` parameter but its significance is not discussed. It would be great if these were covered in the videos.

We have partially covered seaborn in our course. For more details, please go through this link: https://seaborn.pydata.org/tutorial.html

We didn't want to overwhelm the students by going over every detail and making this video very long. We went through the core details that are most important to the concepts we were trying to cover. But you do have many videos above this one which explain Python concepts very clearly and in full detail.

As for Seaborn itself, we use only a few functions available in Seaborn. If you do not understand any parameter, we recommend you search for the function reference using Google. I am very sure you will be able to understand most functions after going through the Python videos. If you do not understand the function reference, please feel free to reach out to us. We are always here to help.

Additionally, we want all of our students to learn to pick up concepts on their own beyond this course, as no course can cover everything in a reasonable amount of time. That is why we use a lot of public web resources in this course: to teach students to learn on their own. Of course, we are here to help if you are stuck.

While scrolling down the page, the screen flickers, and it does not show the complete options either. Please rectify this issue.

Sorry, we did not understand that. What exactly is the problem?

When I download the IPython notebook of this lecture, I am not able to view it in Jupyter Notebook. How can I open it in Jupyter Notebook? Please let me know the procedure.

Open your terminal/command prompt and type `jupyter notebook`. It will open a file explorer in the browser, from which you can open the `.ipynb` file.

We actually have a video explaining the same in our Python chapter. Please check this video (https://www.appliedaicourse.com/course/applied-ai-course-online/lessons/python-anaconda-and-relevant-packages-installations-2/) from 17:35 minutes onwards.

```python
sns.FacetGrid(iris, hue="species", size=4)
```

What does `size` represent here?

Height (in inches) of each facet

What is the value of R20000 in Naira? Or rather, what currency is "R"? I am having difficulty with the conversion.

The currency is INR (Indian Rupees); the "R"-like symbol is the symbol for INR. If you want to convert it to USD, please google "convert 20000 INR into USD". You can also convert it to your own currency by replacing USD with your currency code.

Hi, do you have the IPython notebook of this lecture on GitHub, i.e. just like https://github.com/dataquestio/solutions/blob/master/Mission201Solution.ipynb?

I want to finish the exercise (8.16), but before that I want to go through the IPython notebook of this lecture. However, I am having trouble installing Anaconda/Jupyter on my Linux machine, so although I downloaded the notebook I am unable to view it properly. I was hoping it is available on GitHub; if there is a GitHub link, could you please provide it?

Better to install Jupyter first; without it you cannot work with the notebooks.

My machine runs Windows and I execute programs using the Jupyter IPython notebook. Where do I have to place my CSV file so that the program can read it from the notebook?

Keep the CSV file in the same folder from which you are launching the Jupyter notebook. For example, if you run `jupyter notebook` from `C:/user/abc/Desktop/workshop`, keep the CSV file in the `C:/user/abc/Desktop/workshop` folder.
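To check where the notebook is actually looking, and to confirm that a bare filename resolves against that folder (the file name below is made up for the demo):

```python
import os
import pandas as pd

# The folder Jupyter was launched from is the current working directory:
print(os.getcwd())

# Write a tiny CSV there, then read it back by bare filename alone:
pd.DataFrame({"a": [1, 2]}).to_csv("demo.csv", index=False)
df = pd.read_csv("demo.csv")
print(df.shape)  # (2, 1)

os.remove("demo.csv")  # clean up the demo file
```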

Alternatively, paste the GitHub resource link into the `pd.read_csv()` function as an argument, enclosed in quotes (`" "`).

https://gist.github.com/curran/a08a1080b88344b0c8a7