Cancer Diagnosis using Medical Records
Loss function (Hinge Loss) based interpretation
Instructor: Applied AI Course
Duration: 18 mins
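For reference, the hinge-loss (soft-margin) formulation of the SVM objective that this lesson interprets is the standard one below; this is general background, not an excerpt from the restricted video.

```latex
\min_{w,\,b}\;\; \frac{1}{2}\lVert w \rVert^{2} \;+\; C \sum_{i=1}^{n} \max\!\left(0,\; 1 - y_i \left(w^{\top} x_i + b\right)\right)
```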
Support Vector Machines (SVM)
1.1 Geometric Intuition (20 min)
1.2 Mathematical derivation (32 min)
1.3 Loss function (Hinge Loss) based interpretation (18 min)
1.4 RBF-Kernel (21 min)
1.5 Polynomial kernel (11 min)
1.6 Domain specific Kernels (6 min)
1.7 Train and run time complexities (8 min)
1.8 nu-SVM: control errors and support vectors (6 min)
1.9 SVM Regression (8 min)
1.10 Code Samples (14 min)
1.11 Exercise: Apply SVM to Amazon reviews dataset (4 min) (a starter sketch follows this outline)
1.12 Why we take values +1 and -1 for support vector planes (9 min)
1.13 Kernel trick (10 min)
1.14 Dual form of SVM formulation (16 min)
1.15 Real-time cases (9 min)
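As a starting point for exercise 1.11 in the outline above, here is a minimal sketch, assuming scikit-learn, of a hinge-loss linear SVM trained on TF-IDF features. The inline reviews and labels are toy stand-ins for the actual Amazon reviews dataset, and this is not the course's reference solution.

```python
# Minimal sketch for exercise 1.11 (not the course's solution): a hinge-loss
# linear SVM on TF-IDF features. The inline reviews/labels are toy stand-ins
# for the Amazon reviews dataset.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import SGDClassifier  # loss="hinge" trains a linear SVM

reviews = [
    "great product, works perfectly",
    "terrible quality, broke in a day",
    "loved the taste, will buy again",
    "waste of money, very disappointed",
    "excellent value and fast shipping",
    "awful, do not recommend",
]
labels = [1, 0, 1, 0, 1, 0]  # 1 = positive, 0 = negative

# TF-IDF featurization of the raw text (unigrams and bigrams).
vectorizer = TfidfVectorizer(ngram_range=(1, 2))
X = vectorizer.fit_transform(reviews)

# SGDClassifier with hinge loss minimizes the objective shown earlier;
# alpha is the regularization strength (roughly 1 / (n * C)).
svm = SGDClassifier(loss="hinge", alpha=1e-3, random_state=42)
svm.fit(X, labels)

print(svm.predict(vectorizer.transform(["really great, highly recommend"])))
```

SVC(kernel="rbf") from the same library covers the RBF-kernel lessons, though linear models are the usual choice for high-dimensional sparse text features.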
Ensemble Models
2.1 Introduction to Bootstrapped Aggregation (Bagging) (17 min)
2.2 Random Forests and their construction (15 min)
2.3 Bias-Variance tradeoff (Random Forest) (7 min)
2.4 Intuition to Boosting (17 min)
2.5 Gradient Boosting (10 min)
2.6 AdaBoost: geometric intuition (7 min)
2.7 Stacking models (22 min)
2.8 Exercise: Apply GBDT and RF to Amazon reviews dataset (4 min) (a starter sketch follows this outline)
2.9 What are ensembles? (6 min)
2.10 Bagging: Train and Run-time Complexity (9 min)
2.11 Bagging: Code Sample (4 min)
2.12 Extremely randomized trees (8 min)
2.13 Random Tree: Cases (6 min)
2.14 Residuals, Loss functions and gradients (13 min)
2.15 Regularization by Shrinkage (8 min)
2.16 Train and Run-time complexity (6 min)
2.17 XGBoost: Boosting + Randomization (14 min)
2.18 Cascading classifiers (15 min)
2.19 Kaggle competitions vs Real world (9 min)
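Similarly, as a starting point for exercise 2.8, here is a minimal sketch, again assuming scikit-learn, that fits a Random Forest (bagging) and a gradient-boosted tree model (boosting with shrinkage, lesson 2.15) on the same toy data. It is not the course's reference solution, and XGBoost (lesson 2.17) could be swapped in for the gradient-boosted model.

```python
# Minimal sketch for exercise 2.8 (not the course's solution): a Random Forest
# (bagging) and a gradient-boosted tree model (boosting) on toy TF-IDF features
# standing in for the Amazon reviews dataset.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

reviews = [
    "great product, works perfectly",
    "terrible quality, broke in a day",
    "loved the taste, will buy again",
    "waste of money, very disappointed",
    "excellent value and fast shipping",
    "awful, do not recommend",
]
labels = [1, 0, 1, 0, 1, 0]  # 1 = positive, 0 = negative

# Dense TF-IDF features keep this toy example simple.
X = TfidfVectorizer().fit_transform(reviews).toarray()

# Bagging-style ensemble: many trees trained on bootstrap samples, then averaged.
rf = RandomForestClassifier(n_estimators=100, random_state=42).fit(X, labels)

# Boosting-style ensemble: shallow trees fit sequentially to residuals;
# learning_rate is the shrinkage discussed in lesson 2.15.
gbdt = GradientBoostingClassifier(
    n_estimators=100, learning_rate=0.1, max_depth=3, random_state=42
).fit(X, labels)

print("RF train accuracy:  ", rf.score(X, labels))
print("GBDT train accuracy:", gbdt.score(X, labels))
```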