Building a Decision Tree: Entropy
Instructor: Applied AI Course
Duration: 19 mins
Support Vector Machines (SVM)
1.1 Geometric Intuition (20 min)
1.2 Mathematical derivation (17 min)
1.3 Loss function (Hinge Loss) based interpretation (18 min)
1.4 Dual form of SVM formulation (16 min)
1.5 Kernel trick (10 min)
1.6 Polynomial kernel (11 min)
1.7 RBF kernel (21 min)
1.8 Domain-specific kernels (6 min)
1.9 Train and run-time complexities (8 min)
1.10 nu-SVM: control errors and support vectors (6 min)
1.11 SVM Regression (8 min)
1.12 Cases (9 min)
1.13 Code Sample (14 min)
1.14 Exercise: Apply SVM to the Amazon reviews dataset (4 min)
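The hinge-loss interpretation (1.3) and the kernel lessons (1.5–1.7) above reduce to short formulas. As an illustrative sketch in plain Python (not code from the course), hinge loss penalizes points inside the margin or on the wrong side, and the RBF kernel measures similarity via squared Euclidean distance:

```python
import math

def hinge_loss(y, score):
    """Hinge loss max(0, 1 - y * f(x)) for a label y in {+1, -1}
    and a raw classifier score f(x) = w.x + b."""
    return max(0.0, 1.0 - y * score)

def rbf_kernel(x, z, gamma=1.0):
    """RBF kernel exp(-gamma * ||x - z||^2) on plain Python vectors."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * sq_dist)

# A correctly classified point beyond the margin incurs zero loss:
print(hinge_loss(+1, 2.0))   # 0.0
# A point inside the margin is penalized linearly:
print(hinge_loss(+1, 0.5))   # 0.5
# A misclassified point is penalized even more:
print(hinge_loss(-1, 1.0))   # 2.0
```

Note how `rbf_kernel(x, x)` is always 1 and decays toward 0 as points move apart, which is why `gamma` acts like an inverse length-scale in lesson 1.7.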
Decision Trees
2.1 Geometric Intuition: Axis-parallel hyperplanes (17 min)
2.2 Sample Decision Tree (8 min)
2.3 Building a Decision Tree: Entropy (19 min)
2.4 Building a Decision Tree: Information Gain (10 min)
2.5 Building a Decision Tree: Gini Impurity (7 min)
2.6 Building a Decision Tree: Constructing a DT (21 min)
2.7 Building a Decision Tree: Splitting numerical features (8 min)
2.8 Feature standardization (4 min)
2.9 Building a Decision Tree: Categorical features with many possible values (7 min)
2.10 Overfitting and Underfitting (8 min)
2.11 Train and run-time complexity (7 min)
2.12 Regression using Decision Trees (9 min)
2.13 Cases (12 min)
2.14 Code Samples (9 min)
2.15 Exercise: Decision Trees on the Amazon reviews dataset (3 min)
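The splitting criteria in lessons 2.3–2.5 (entropy, information gain, Gini impurity) are simple enough to compute by hand. A minimal stdlib-only sketch, for illustration rather than as course code:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gini(labels):
    """Gini impurity 1 - sum(p_k^2) of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def information_gain(parent, left, right):
    """Entropy reduction from splitting `parent` into `left` and `right`."""
    n = len(parent)
    children = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - children

# A 50/50 mix has maximal entropy (1 bit) and Gini 0.5;
# a split into two pure children recovers the full bit of information:
print(entropy(["+", "+", "-", "-"]))                              # 1.0
print(gini(["+", "+", "-", "-"]))                                 # 0.5
print(information_gain(["+", "+", "-", "-"], ["+", "+"], ["-", "-"]))  # 1.0
```

Building a tree then amounts to greedily choosing, at each node, the split with the highest information gain (or lowest weighted Gini), which is what lesson 2.6 walks through.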
Ensemble Models
3.1 What are ensembles? (6 min)
3.2 Bootstrapped Aggregation (Bagging) intuition (17 min)
3.3 Random Forests and their construction (15 min)
3.4 Bias-variance tradeoff (7 min)
3.5 Bagging: Train and run-time complexity (9 min)
3.6 Bagging: Code sample (4 min)
3.7 Extremely randomized trees (8 min)
3.8 Random Trees: Cases (6 min)
3.9 Boosting intuition (17 min)
3.10 Residuals, loss functions, and gradients (13 min)
3.11 Gradient Boosting (10 min)
3.12 Regularization by shrinkage (8 min)
3.13 Train and run-time complexity (6 min)
3.14 XGBoost: Boosting + randomization (14 min)
3.15 AdaBoost: Geometric intuition (7 min)
3.16 Stacking models (22 min)
3.17 Cascading classifiers (15 min)
3.18 Kaggle competitions vs. the real world (9 min)
3.19 Assignment: Apply Random Forests & GBDT (4 min)
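The bagging idea from lessons 3.2–3.3 is just two steps: train each base model on a bootstrap resample, then aggregate predictions by majority vote. A hedged stdlib-only sketch of those two steps (base models here are arbitrary callables, not actual trees):

```python
import random
from collections import Counter

def bootstrap_sample(data, rng):
    """Draw len(data) points with replacement: one bagging round's
    training set, leaving roughly 1/e of the points out-of-bag."""
    return [rng.choice(data) for _ in data]

def bagged_predict(models, x):
    """Classification bagging: majority vote over base-model predictions."""
    votes = Counter(model(x) for model in models)
    return votes.most_common(1)[0][0]

# Three toy "base models" (stand-ins for trees trained on different
# bootstrap samples); the ensemble follows the 2-vs-1 majority:
models = [lambda x: "spam", lambda x: "spam", lambda x: "ham"]
print(bagged_predict(models, None))  # spam

rng = random.Random(0)
print(bootstrap_sample([1, 2, 3, 4], rng))  # 4 draws, repeats allowed
```

Random Forests (3.3) add one twist on top of this: each tree also samples a random subset of features at every split, which further decorrelates the base models and is what drives the variance reduction discussed in 3.4.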