Applied Machine Learning Online Course
Cosine Distance & Cosine Similarity
Instructor: Applied AI Course
Duration: 19 mins
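The lesson video itself is part of the course material, but the two quantities it covers are standard. As a rough plain-Python sketch (illustrative only, not the course's own code): cosine similarity is the cosine of the angle between two vectors, and cosine distance is one minus that.

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (||a|| * ||b||)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def cosine_distance(a, b):
    # Cosine distance is defined as 1 minus cosine similarity.
    return 1.0 - cosine_similarity(a, b)

# Parallel vectors have similarity 1 (distance 0);
# orthogonal vectors have similarity 0 (distance 1).
print(cosine_similarity([1, 2, 3], [2, 4, 6]))  # ~1.0
print(cosine_distance([1, 0], [0, 1]))          # 1.0
```

Note that cosine similarity ignores vector magnitude, which is why it is often preferred over Euclidean distance for text vectors of very different lengths.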
Real world problem: Predict rating given product reviews on Amazon
1.1
Dataset overview: Amazon Fine Food reviews(EDA)
23 min
1.2
Data Cleaning: Deduplication
15 min
1.3
Why convert text to a vector?
14 min
1.4
Bag of Words (BoW)
18 min
1.5
Text Preprocessing: Stemming, Stop-word removal, Tokenization, Lemmatization
15 min
1.6
Uni-gram, bi-gram, n-grams
9 min
1.7
tf-idf (term frequency-inverse document frequency)
22 min
1.8
Why use log in IDF?
14 min
1.9
Word2Vec
16 min
1.10
Avg-Word2Vec, tf-idf weighted Word2Vec
9 min
1.11
Bag of Words (Code Sample)
19 min
1.12
Text Preprocessing (Code Sample)
11 min
1.13
Bi-Grams and n-grams (Code Sample)
5 min
1.14
TF-IDF (Code Sample)
6 min
1.15
Word2Vec (Code Sample)
12 min
1.16
Avg-Word2Vec and TFIDF-Word2Vec (Code Sample)
2 min
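As a taste of the featurizations this module lists, here is a minimal uni-gram Bag-of-Words sketch in plain Python (illustrative only; the course's own code samples work on the Amazon Fine Food reviews dataset with their own tooling):

```python
from collections import Counter

def bag_of_words(docs):
    # Build a sorted vocabulary over all documents, then represent each
    # document as a vector of term counts (uni-gram Bag of Words).
    vocab = sorted({w for d in docs for w in d.lower().split()})
    vectors = []
    for d in docs:
        counts = Counter(d.lower().split())
        vectors.append([counts.get(w, 0) for w in vocab])
    return vocab, vectors

vocab, vecs = bag_of_words(["good food good taste", "bad food"])
print(vocab)  # ['bad', 'food', 'good', 'taste']
print(vecs)   # [[0, 1, 2, 1], [1, 1, 0, 0]]
```

tf-idf then reweights these raw counts so that words appearing in many documents contribute less, which is where the log in IDF (lesson 1.8) comes in.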
Classification And Regression Models: K-Nearest Neighbors
2.1
How “Classification” works?
10 min
2.2
Data matrix notation
7 min
2.3
Classification vs Regression (examples)
6 min
2.4
K-Nearest Neighbours Geometric intuition with a toy example
12 min
2.5
Failure cases of KNN
7 min
2.6
Distance measures: Euclidean (L2), Manhattan (L1), Minkowski, Hamming
20 min
2.7
Cosine Distance & Cosine Similarity
19 min
2.8
How to measure the effectiveness of k-NN?
16 min
2.9
Test/Evaluation time and space complexity
12 min
2.10
KNN Limitations
9 min
2.11
Decision surface for K-NN as K changes
23 min
2.12
Overfitting and Underfitting
12 min
2.13
Need for Cross validation
22 min
2.14
K-fold cross validation
18 min
2.15
Visualizing train, validation and test datasets
13 min
2.16
How to determine overfitting and underfitting?
19 min
2.17
Time based splitting
19 min
2.18
k-NN for regression
5 min
2.19
Weighted k-NN
8 min
2.20
Voronoi diagram
4 min
2.21
Binary search tree
16 min
2.22
How to build a kd-tree
17 min
2.23
Find nearest neighbours using kd-tree
13 min
2.24
Limitations of kd-tree
9 min
2.25
Extensions
3 min
2.26
Hashing vs LSH
10 min
2.27
LSH for cosine similarity
40 min
2.28
LSH for euclidean distance
13 min
2.29
Probabilistic class label
8 min
2.30
Code Sample: Decision boundary
23 min
2.31
Code Sample: Cross Validation
13 min
2.32
Revision Questions
30 min
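The classifier at the heart of this module can be sketched in a few lines of plain Python under the Euclidean-distance setting of lesson 2.6 (a hypothetical brute-force illustration, not the course's own code; kd-trees and LSH, covered later in the module, exist to avoid exactly this linear scan):

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, query, k=3):
    # Sort training points by Euclidean (L2) distance to the query,
    # then take a majority vote among the k nearest labels.
    neighbours = sorted(zip(train_X, train_y),
                        key=lambda xy: math.dist(xy[0], query))
    votes = Counter(y for _, y in neighbours[:k])
    return votes.most_common(1)[0][0]

X = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
y = ["a", "a", "a", "b", "b", "b"]
print(knn_predict(X, y, (1, 1)))  # "a" -- the three nearest points are all class "a"
```

The sort makes prediction O(n log n) per query with no training cost at all, which matches the test-time complexity discussion in lesson 2.9.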
Interview Questions on K-NN(K Nearest Neighbour)
3.1
Questions & Answers
30 min
Classification algorithms in various situations
4.1
Introduction
5 min
4.2
Imbalanced vs balanced dataset
23 min
4.3
Multi-class classification
12 min
4.4
k-NN, given a distance or similarity matrix
9 min
4.5
Train and test set differences
22 min
4.6
Impact of outliers
7 min
4.7
Local Outlier Factor (Simple solution: Mean distance to k-NN)
13 min
4.8
k distance
4 min
4.9
Reachability-Distance(A,B)
8 min
4.10
Local reachability-density(A)
9 min
4.11
Local outlier Factor(A)
21 min
4.12
Impact of Scale & Column standardization
13 min
4.13
Interpretability
12 min
4.14
Feature Importance and Forward Feature selection
22 min
4.15
Handling categorical and numerical features
24 min
4.16
Handling missing values by imputation
21 min
4.17
Curse of dimensionality
27 min
4.18
Bias-Variance tradeoff
24 min
4.19
Intuitive understanding of bias-variance
7 min
4.20
Revision Questions
30 min
4.21
Best and worst cases of the algorithm
6 min
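The "simple solution" outlier score from lesson 4.7 (mean distance to a point's k nearest neighbours) is easy to sketch in plain Python; this is a hypothetical illustration, not the full Local Outlier Factor construction the later lessons build up:

```python
import math

def mean_knn_distance(points, query, k=2):
    # Outlier score for `query`: mean distance to its k nearest
    # neighbours among `points` (larger => more outlier-like).
    dists = sorted(math.dist(p, query) for p in points if p != query)
    return sum(dists[:k]) / k

cluster = [(0, 0), (0, 1), (1, 0), (1, 1)]
points = cluster + [(10, 10)]
# A point inside the dense cluster gets a small score; the far-away
# point gets a large one.
print(mean_knn_distance(points, (0, 0)))    # 1.0
print(mean_knn_distance(points, (10, 10)))  # much larger
```

LOF (lessons 4.8-4.11) refines this idea by comparing each point's local density to that of its neighbours, so scores stay meaningful when clusters have different densities.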
Performance measurement of models
5.1
Accuracy
15 min
5.2
Confusion matrix, TPR, FPR, FNR, TNR
25 min
5.3
Precision and recall, F1-score
10 min
5.4
Receiver Operating Characteristic Curve (ROC) curve and AUC
19 min
5.5
Log-loss
12 min
5.6
R-Squared/Coefficient of determination
14 min
5.7
Median absolute deviation (MAD)
5 min
5.8
Distribution of errors
7 min
5.9
Revision Questions
30 min
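The binary-classification metrics this module covers follow directly from the four confusion-matrix counts; a minimal plain-Python sketch (illustrative only, with hypothetical helper names):

```python
def confusion_counts(y_true, y_pred, positive=1):
    # TP / FP / FN / TN for a binary problem, with `positive` as the positive class.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    return tp, fp, fn, tn

def precision_recall_f1(y_true, y_pred):
    tp, fp, fn, _ = confusion_counts(y_true, y_pred)
    precision = tp / (tp + fp)  # of predicted positives, how many are correct
    recall = tp / (tp + fn)     # of actual positives, how many we found
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return precision, recall, f1

print(precision_recall_f1([1, 1, 1, 0, 0], [1, 1, 0, 1, 0]))
# (0.666..., 0.666..., 0.666...)
```

TPR, FPR, FNR, and TNR from lesson 5.2 come from the same four counts (e.g. TPR = TP / (TP + FN)), which is why accuracy alone can be misleading on imbalanced data.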
Interview Questions on Performance Measurement Models
6.1
Questions & Answers
30 min
Naive Bayes
7.1
Conditional probability
13 min
7.2
Independent vs Mutually exclusive events
7 min
7.3
Bayes Theorem with examples
18 min
7.4
Exercise problems on Bayes Theorem
30 min
7.5
Naive Bayes algorithm
26 min
7.6
Toy example: Train and test stages
26 min
7.7
Naive Bayes on Text data
16 min
7.8
Laplace/Additive Smoothing
24 min
7.9
Log-probabilities for numerical stability
11 min
7.10
Bias and Variance tradeoff
14 min
7.11
Feature importance and interpretability
10 min
7.12
Imbalanced data
14 min
7.13
Outliers
6 min
7.14
Missing values
3 min
7.15
Handling Numerical features (Gaussian NB)
13 min
7.16
Multiclass classification
2 min
7.17
Similarity or Distance matrix
3 min
7.18
Large dimensionality
3 min
7.19
Best and worst cases
8 min
7.20
Code example
8 min
7.21
Revision Questions
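Two of this module's ideas, Laplace/additive smoothing (7.8) and log-probabilities for numerical stability (7.9), show up directly in a toy multinomial Naive Bayes text classifier. A minimal plain-Python sketch with hypothetical function names, not the course's own code sample:

```python
import math
from collections import Counter, defaultdict

def train_nb(docs, labels, alpha=1.0):
    # Multinomial Naive Bayes with Laplace/additive smoothing (alpha),
    # storing log-probabilities to avoid floating-point underflow.
    vocab = {w for d in docs for w in d.split()}
    words_per_class = defaultdict(list)
    for d, y in zip(docs, labels):
        words_per_class[y].extend(d.split())
    class_counts = Counter(labels)
    model = {}
    for y, words in words_per_class.items():
        counts = Counter(words)
        denom = len(words) + alpha * len(vocab)  # smoothed denominator
        model[y] = {
            "log_prior": math.log(class_counts[y] / len(labels)),
            "log_like": {w: math.log((counts[w] + alpha) / denom) for w in vocab},
            "log_unseen": math.log(alpha / denom),  # words outside the vocabulary
        }
    return model

def predict_nb(model, doc):
    # Pick the class maximizing log prior + sum of word log-likelihoods.
    def score(m):
        return m["log_prior"] + sum(
            m["log_like"].get(w, m["log_unseen"]) for w in doc.split())
    return max(model, key=lambda y: score(model[y]))

docs = ["good great good", "bad awful", "great nice", "terrible bad"]
labels = ["pos", "neg", "pos", "neg"]
model = train_nb(docs, labels)
print(predict_nb(model, "good nice"))  # "pos"
```

Without smoothing, a single unseen word would zero out an entire class's probability; alpha > 0 keeps every likelihood strictly positive.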
Logistic Regression
8.1
Geometric intuition of Logistic Regression
31 min
8.2
Sigmoid function: Squashing
37 min
8.3
Mathematical formulation of Objective function
24 min
8.4
Weight vector
11 min
8.5
L2 Regularization: Overfitting and Underfitting
26 min
8.6
L1 regularization and sparsity
11 min
8.7
Probabilistic Interpretation: Gaussian Naive Bayes
20 min
8.8
Loss minimization interpretation
19 min
8.9
Hyperparameter Search: Grid search and random search
16 min
8.10
Column Standardization
5 min
8.11
Feature importance and Model interpretability
14 min
8.12
Collinearity of features
14 min
8.13
Train- and run-time space & time complexity
11 min
8.14
Real world cases
11 min
8.15
Non-linearly separable data & feature engineering
28 min
8.16
Code sample: Logistic regression, GridSearchCV, RandomSearchCV
23 min
8.17
Extensions to Logistic Regression: Generalized linear models(GLM)
8 min
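The sigmoid "squashing" from lesson 8.2 is the core of how logistic regression turns a linear score w·x + b into a probability. A minimal plain-Python sketch (illustrative only; the weights here are made up, not learned):

```python
import math

def sigmoid(z):
    # Squashes any real number into (0, 1): sigma(z) = 1 / (1 + e^(-z)).
    return 1.0 / (1.0 + math.exp(-z))

def predict_proba(w, b, x):
    # P(y = 1 | x) = sigmoid(w . x + b) for weight vector w and bias b.
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return sigmoid(z)

print(sigmoid(0))  # 0.5 -- points exactly on the decision boundary w.x + b = 0
print(predict_proba([2.0, -1.0], 0.5, [1.0, 1.0]))  # sigmoid(1.5), about 0.82
```

Points far from the separating hyperplane get probabilities near 0 or 1, while points near it stay close to 0.5, which is the geometric intuition of lesson 8.1.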
Linear Regression
9.1
Geometric intuition of Linear Regression
13 min
9.2
Mathematical formulation
14 min
9.3
Real world Cases
8 min
9.4
Code sample for Linear Regression
13 min
Solving Optimization Problems
10.1
Differentiation
29 min
10.2
Online differentiation tools
8 min
10.3
Maxima and Minima
12 min
10.4
Vector calculus: Grad
10 min
10.5
Gradient descent: geometric intuition
19 min
10.6
Learning rate
8 min
10.7
Gradient descent for linear regression
8 min
10.8
SGD algorithm
9 min
10.9
Constrained Optimization & PCA
14 min
10.10
Logistic regression formulation revisited
6 min
10.11
Why L1 regularization creates sparsity?
17 min
10.12
Revision questions
30 min
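Gradient descent for linear regression (lessons 10.5-10.7) fits nicely in a short plain-Python sketch: compute the gradient of the mean squared error, step against it, repeat. Illustrative only, with made-up data and hyperparameters:

```python
def gradient_descent_linreg(xs, ys, lr=0.01, steps=5000):
    # Fit y = w*x + b by gradient descent on mean squared error.
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Partial derivatives of (1/n) * sum((w*x + b - y)^2) w.r.t. w and b.
        grad_w = (2 / n) * sum((w * x + b - y) * x for x, y in zip(xs, ys))
        grad_b = (2 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
        w -= lr * grad_w  # step against the gradient, scaled by the learning rate
        b -= lr * grad_b
    return w, b

# Points generated from y = 2x + 1: gradient descent should recover w ~ 2, b ~ 1.
w, b = gradient_descent_linreg([0, 1, 2, 3, 4], [1, 3, 5, 7, 9])
print(round(w, 3), round(b, 3))
```

SGD (lesson 10.8) replaces the full-dataset sums with an estimate from one sample or a small batch, trading noisier steps for much cheaper iterations.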
Interview Questions on Logistic Regression and Linear Regression
11.1
Questions & Answers
30 min
Module 3: Live Sessions
12.1
Code Walkthrough: Text Encodings for ML/AI
12.2
Dive deep into K-NN
12.3
Performance metrics Deep Dive
12.4
Interactive LIVE session: Logistic regression deep dive
12.5
Code Walkthrough: Optimization Methods for ML/AI
12.6
Code Walkthrough: Hyper-param Optimisation
12.7
Logistic Regression with Imbalanced data: A Geometric View
12.8
Linear Regression using Probability & Stats